Quiz - Where am I?

The question is: in C++, how do you detect whether an object is allocated on the stack or on the heap?

On Windows, stack addresses lie in the range of 0x80000000. If the address of the variable falls in that range, you could say that the object is allocated on the stack; otherwise it is allocated on the heap. This detection technique is not preferable since it may not work on other operating systems (such as Linux), and it relies on platform-specific information, making it a non-portable solution.
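Purely for illustration, a minimal sketch of that address-range check could look like the following. The 0x80000000 threshold and the function name are just assumptions matching the idea above; this is exactly the kind of non-portable guesswork the rest of this post avoids.

//
// AddressRangeCheck.cpp (illustrative sketch only - not a reliable technique)
//

#include <cstdio>
#include <cstdint>

// Assumed threshold; not a guarantee on any platform or configuration.
static bool LooksLikeStackAddress(const void* p)
{
   return reinterpret_cast<std::uintptr_t>(p) >= 0x80000000u;
}

int main()
{
   int local = 0;              // lives on the stack
   int* heap = new int(0);     // lives on the heap

   std::printf("local looks like stack: %d, heap looks like stack: %d\n",
               LooksLikeStackAddress(&local),
               LooksLikeStackAddress(heap));

   delete heap;
   return 0;
}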

Let us try to solve the problem using standard C++. We want to know where an object is allocated, and in C++ the new operator is responsible for allocating an object.

It was very thoughtful of Stroustrup to keep object allocation and initialization/construction separate. The new operator is responsible only for allocation; the compiler is responsible for generating the code that performs the allocation and then calls the constructor. It was also thoughtful to allow specifying the location where the object needs to be constructed (placement new), which of course does not necessarily require overloading the new operator.
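As an aside, here is a minimal placement-new sketch (the Point type is illustrative, not part of this post's code) showing construction at a caller-chosen location, with no free-store allocation involved:

//
// PlacementNew.cpp (illustrative sketch)
//

#include <new>        // placement new
#include <iostream>

struct Point
{
   double x, y;
   Point(double px, double py) : x(px), y(py) {}
};

int main()
{
   // A raw buffer under our control; no heap allocation happens below.
   alignas(Point) unsigned char buffer[sizeof(Point)];

   // Placement new: allocation is already done, only the constructor runs.
   Point* p = new (buffer) Point(1.0, 2.0);
   std::cout << p->x << ", " << p->y << std::endl;

   // Objects constructed with placement new must be destroyed explicitly.
   p->~Point();
   return 0;
}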

C++ allows taking control of object allocation by overloading the new operator. By taking control, you can detect where an object is allocated and also keep a count of the objects allocated on the stack/heap. The following code snippet illustrates this:

//
// SomeClass.h
//

#include <iostream>
#include <string>     // std::string used by Location()
#include <stdlib.h>   // malloc/free used by the custom new/delete
#include <deque>
#include <algorithm>

using namespace std;

class SomeClass;
typedef std::deque<SomeClass*>      SomeClassDB;
typedef SomeClassDB::iterator    SomeClassDBIter;
typedef SomeClassDB::const_iterator SomeClassDBConstIter;

class SomeClass
{
private: static SomeClassDB heapObjects;
private: static SomeClassDB stackObjects;

private: double value;
private: bool isOnHeap;

public: SomeClass(double d) : value(d), isOnHeap(SomeClass::IsOnHeap(this))
        {
           // If operator new did not register this pointer, the object was
           // not heap-allocated, so record it as a stack object.
           if (!IsOnHeap())
           {
              SomeClass::stackObjects.push_back(this);
           }

           PrintLocationInfo();
        }

public: ~SomeClass()
        {
           // Unregister this object from whichever list recorded it, so
           // that the heap/stack counts stay accurate after destruction.
           SomeClassDB& db = IsOnHeap() ? SomeClass::heapObjects
                                        : SomeClass::stackObjects;

           SomeClassDBIter iter = std::find(db.begin(), db.end(), this);

           if (iter != db.end())
           {
              db.erase(iter);
           }
        }

public: double Value() const
        {
           return this->value;
        }

public: bool IsOnHeap() const
        {
           return this->isOnHeap;
        }

public: bool IsOnStack() const
        {
           return !IsOnHeap();
        }

public: std::string Location() const
        {
           return IsOnHeap() ? "Heap" : "Stack";
        }

public: void PrintLocationInfo() const
        {
           std::cout << Value() << " is on " << Location().c_str() << std::endl;
        }

private: static bool IsOnHeap(SomeClass* scPtr)
         {
            SomeClassDBConstIter iter = std::find(SomeClass::heapObjects.begin(), SomeClass::heapObjects.end(), scPtr);
            return iter != SomeClass::heapObjects.end();
         }

private: static bool IsOnStack(SomeClass* scPtr)
         {
            return !IsOnHeap(scPtr);
         }

public: static void* operator new(size_t size)
        {
           // Every heap instance is allocated here, so register the pointer
           // before the constructor runs and queries IsOnHeap(this).
           SomeClass* scPtr = (SomeClass*)malloc(size);
           SomeClass::heapObjects.push_back(scPtr);
           return scPtr;
        }

public: static void operator delete(void* ptr)
        {
           // Matching deallocation for the custom operator new above; the
           // destructor has already unregistered the pointer by this point.
           free(ptr);
        }

public: static size_t HeapCount()
        {
           return SomeClass::heapObjects.size();
        }

public: static size_t StackCount()
        {
           return SomeClass::stackObjects.size();
        }

public: static void PrintObjectCount()
        {
           std::cout << "OnHeap: " << HeapCount() << " .... OnStack: " << StackCount() << std::endl;
        }
};

//
// SomeClass.cpp
//

#include "SomeClass.h"

SomeClassDB SomeClass::heapObjects;
SomeClassDB SomeClass::stackObjects;

int main (int argc, char *argv[])
{
   SomeClass sc(0.123);
   SomeClass* scPtr = new SomeClass(1.234);

   SomeClass::PrintObjectCount();

   {
      SomeClass sc1(2.345);
      SomeClass::PrintObjectCount();
   }

   delete scPtr;

   SomeClass::PrintObjectCount();

   return 0;
}

Note that you should implement a custom operator delete whenever you provide a custom operator new. This is logical, because only the implementation that allocated the memory for the object can know how to deallocate it. The above technique of overloading new/delete is portable and safe since it is standard C++. As always, writing standard C++ is fun.

But why would one care where an object is allocated, or want to keep a count of objects? I don't think it is something you would require for production purposes. It could be useful for development/debugging purposes; maybe you want to detect memory leaks or get a general distribution of objects. You can take control of allocation by overloading new for a particular class, or for all classes by defining a global operator new, as sketched below.
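For instance, here is a minimal sketch of the global approach (not part of the code above; the name g_liveAllocations is illustrative) that counts live allocations across the whole program:

//
// GlobalNewDelete.cpp (illustrative sketch)
//

#include <cstdio>
#include <cstdlib>
#include <new>

// Every new/delete in the program passes through the replacements below.
static std::size_t g_liveAllocations = 0;

void* operator new(std::size_t size)
{
   void* p = std::malloc(size);
   if (p == NULL)
   {
      throw std::bad_alloc();
   }
   ++g_liveAllocations;
   return p;
}

void operator delete(void* p) noexcept
{
   if (p != NULL)
   {
      --g_liveAllocations;
      std::free(p);
   }
}

int main()
{
   int* i = new int(42);
   double* d = new double(3.14);
   delete d;

   // One allocation (i) is still live at this point.
   std::printf("Live allocations: %u\n", (unsigned)g_liveAllocations);

   delete i;
   return 0;
}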
