Could you explain the memstat summary's "Leaked Memory" and "Total Leakage" values?
The statistical part works like this. Say the sampling rate setting is one in thirty: every 30th allocation is recorded in a table. Every free is looked up in that table; if it is there, it is recorded, and if it isn't, it is ignored. So the sampling applies only to the allocations, never to the frees.

In the table, the totals (including leaked memory) and the counts are multiplied by the sampling rate. Given enough samples, these scaled figures are statistically valid estimates of the true totals.

Note that we record what you pass to the O/S, not necessarily what the O/S actually allocates. This can underestimate the amount of memory in certain cases (e.g. if the memory manager always allocates in quad-word steps, it would allocate 16 bytes when you requested 4).

The statistics that identify certain allocation points as "Growth" are based on least-squares linear regression analysis.
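As a minimal sketch of that sampling scheme, the C++ below records every 30th allocation in a table, ignores frees of unsampled pointers, and scales the surviving totals back up by the sampling rate. The hook names (onAlloc, onFree) and the globals are hypothetical illustrations, not memstat's actual internals:

```cpp
#include <cstddef>
#include <unordered_map>

// Hypothetical sampling rate: "one in thirty" from the example above.
constexpr std::size_t kSamplingRate = 30;

struct AllocRecord {
    std::size_t bytes;  // size as requested by the caller
};

std::unordered_map<void*, AllocRecord> g_samples;  // sampled live allocations
std::size_t g_allocCounter     = 0;
std::size_t g_sampledLiveBytes = 0;  // sampled allocations not yet freed
std::size_t g_sampledLiveCount = 0;

// Called on every allocation; only every kSamplingRate-th one is recorded.
void onAlloc(void* ptr, std::size_t requestedBytes) {
    if (++g_allocCounter % kSamplingRate != 0)
        return;  // not a sampled allocation
    g_samples[ptr] = {requestedBytes};
    g_sampledLiveBytes += requestedBytes;
    ++g_sampledLiveCount;
}

// Called on every free; pointers absent from the table are ignored,
// so the sampling decision is made only on the allocation side.
void onFree(void* ptr) {
    auto it = g_samples.find(ptr);
    if (it == g_samples.end())
        return;  // allocation was never sampled: ignore
    g_sampledLiveBytes -= it->second.bytes;
    --g_sampledLiveCount;
    g_samples.erase(it);
}

// Reported totals are the sampled totals multiplied by the sampling rate.
std::size_t estimatedLeakedBytes() { return g_sampledLiveBytes * kSamplingRate; }
std::size_t estimatedLeakedCount() { return g_sampledLiveCount * kSamplingRate; }
```

Note the sketch stores the requested size, mirroring the caveat above: if the underlying allocator rounds requests up, the true memory use is larger than this estimate.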
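For the "Growth" classification, a least-squares fit over (time, outstanding bytes) samples for one allocation point yields a slope; a clearly positive slope suggests growth. The threshold and exact criterion memstat uses are not given above, so this is only an illustration of the regression itself:

```cpp
#include <cstddef>
#include <vector>

// Least-squares slope of y over t; requires n >= 2 distinct t values.
// slope = (n*Σty − Σt*Σy) / (n*Σt² − (Σt)²)
double leastSquaresSlope(const std::vector<double>& t,
                         const std::vector<double>& y) {
    const std::size_t n = t.size();
    double sumT = 0, sumY = 0, sumTT = 0, sumTY = 0;
    for (std::size_t i = 0; i < n; ++i) {
        sumT  += t[i];
        sumY  += y[i];
        sumTT += t[i] * t[i];
        sumTY += t[i] * y[i];
    }
    return (n * sumTY - sumT * sumY) / (n * sumTT - sumT * sumT);
}
```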