Core Image memory management problem
I'm working on the codebase of Apple's example application CIVideoDemoGL to build computer vision software capable of face detection. I'm currently extending the method renderCurrentFrame to grab the Core Video buffer and store it in a CIImage instance, which I later draw inside a CIContext. Here is the code fragment I'm working on:
- (void)renderCurrentFrame
{
    // bla bla bla
    if (currentFrame)
    {
        // bla bla bla
        inputImage = [CIImage imageWithCVImageBuffer:currentFrame];
        // bla bla bla
        // the call to the method that gives me ugly dreams at night
        [detectedFaces addObjectsFromArray:[faceFilter findFaces:inputImage]];
        // I'll say more about it later
        // PS: detectedFaces is an NSMutableArray; faceFilter is an instance of a class I wrote.
        // creation of outputImage as I need: bla bla bla
        // drawing of outputImage: bla bla bla
    }
    QTVisualContextTask(qtVisualContext);
}
As mentioned, within this method I pass the CIImage instance to the method findFaces, which belongs to a class I wrote. That method is expected to return an NSMutableArray filled with NSDictionary instances; the array should contain the exact positions of the faces my app finds in every frame. This is the code I used:
- (NSMutableArray *)findFaces:(CIImage *)image
{
    [image retain]; // here retainCount is 2...
    // bla bla bla
    VJIntegralImage *tempImage = [[VJIntegralImage alloc] init];
    [tempImage doTheMagicWithImage:image];
    [tempImage release];
    // bla bla bla
    [image release]; // ...and here retainCount comes back to 2
    return faces;
}
VJIntegralImage is a class I wrote that takes a CIImage as input and builds an internal data structure modeling the integral image the algorithm needs (formally a float**). This is the method that does the trick:
- (BOOL)doTheMagicWithImage:(CIImage *)image
{
    [image retain]; // retainCount == 3
    // creating an NSImage from the CIImage through NSCIImageRep
    inputNSCIImageRep = [[NSCIImageRep alloc] initWithCIImage:localCIImage];
    localNSImage = [[NSImage alloc] initWithSize:NSMakeSize(w, h)];
    [localNSImage addRepresentation:inputNSCIImageRep];
    // extracting an NSBitmapImageRep from the NSImage
    inputImageRep = [[NSBitmapImageRep alloc] initWithData:[localNSImage TIFFRepresentation]];
    [image release];
    return YES;
}
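For context, the internal float** structure is a standard summed-area table. A minimal sketch of the computation it holds (the function name and the row-major grayscale buffer layout are illustrative, not my actual code):

```c
#include <stdlib.h>

/* Sketch: build an integral image (summed-area table) from a w*h
 * row-major grayscale buffer. ii[y][x] holds the sum of every pixel
 * in the rectangle from (0,0) to (x,y), inclusive. */
static float **buildIntegralImage(const float *pixels, int w, int h)
{
    float **ii = malloc(h * sizeof(float *));
    for (int y = 0; y < h; y++) {
        ii[y] = malloc(w * sizeof(float));
        float rowSum = 0.0f;
        for (int x = 0; x < w; x++) {
            rowSum += pixels[y * w + x];
            /* cumulative row sum plus the integral of the rows above */
            ii[y][x] = rowSum + (y > 0 ? ii[y - 1][x] : 0.0f);
        }
    }
    return ii;
}
```

With this table, the sum of any axis-aligned rectangle can be read off in four lookups, which is the whole point of the Viola-Jones style detection I'm implementing.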
By releasing objects properly in -(void)dealloc (I'm talking about localNSImage, localCIImage and inputImageRep), I made sure that the input CIImage's retain count is 2 at the end of the method findFaces.
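For completeness, my dealloc in VJIntegralImage is shaped roughly like this (sketched, with the same ivar names as above):

```objc
// VJIntegralImage cleanup (sketch): balance each alloc/init done in
// doTheMagicWithImage: with exactly one release.
- (void)dealloc
{
    [localNSImage release];
    [localCIImage release];
    [inputImageRep release];
    // freeing of the internal float** structure: bla bla bla
    [super dealloc];
}
```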
Now, the question.
Using MallocDebug I found a substantial memory leak at the call to TIFFRepresentation. Apparently the software leaks 10 to 15 MB of memory every few seconds (as Activity Monitor's auto-refresh shows). According to my analysis, I'm experiencing memory anomalies around the deallocation of the CIImage instance: if I end the method renderCurrentFrame with a retain count of 1 on that object, my app crashes and GDB reports the caught error in objc_msgSend, while if I skip just one of the releases on the CIImage I get that huge memory leak.
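For what it's worth, my current understanding of the ownership rules involved (which may be exactly where I'm going wrong): both imageWithCVImageBuffer: and TIFFRepresentation return autoreleased objects, so per-frame allocations pile up until the surrounding autorelease pool drains. A sketch of the pattern I believe should bound that growth — the pool placement is my assumption, not something from Apple's sample:

```objc
// Sketch: drain the autoreleased per-frame objects (the CIImage and
// the NSData returned by TIFFRepresentation) once per frame, instead
// of letting them accumulate in the thread's outer autorelease pool.
- (void)renderCurrentFrame
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if (currentFrame)
    {
        CIImage *inputImage = [CIImage imageWithCVImageBuffer:currentFrame]; // autoreleased
        [detectedFaces addObjectsFromArray:[faceFilter findFaces:inputImage]];
        // creation and drawing of outputImage: bla bla bla
    }
    QTVisualContextTask(qtVisualContext);
    [pool release]; // releases every object autoreleased above
}
```

If this reading of the rules is right, then my explicit retain/release dance on the CIImage shouldn't be necessary at all, which is part of what confuses me.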
It appears that the problem is located in the code I wrote (yeah, figures...). I'm just a newbie to Objective-C programming and I'm beginning to think that I misunderstood something really basic about memory management. Can somebody help?
I hope my English and my code were plain enough. If not, I'd be happy to give more information.
MacBook Pro, Mac OS X (10.4.7), cool!