Hello all,
I'm visualizing a series of 3D volumes stored in VTK files. First I load all the VTK files into a vector of vtkStructuredPoints, using this simple code inside a loop:

    vtkSmartPointer<vtkStructuredPointsReader> reader =
        vtkSmartPointer<vtkStructuredPointsReader>::New();
    reader->SetFileName(filename);
    reader->Update();
    ptsCopy = vtkSmartPointer<vtkStructuredPoints>::New();
    ptsCopy->DeepCopy(reader->GetOutput());
    loadedPoints.push_back(ptsCopy);
    reader->CloseVTKFile();

When I want to change the dataset, I just call loadedPoints.clear() and start loading other VTK files.

On Windows everything works as expected: by calling .clear() the memory gets freed. This is not the case on Linux, where the memory stays occupied, and since I'm working with very large datasets (>12 GB) this is a serious problem. Another interesting thing is that with some datasets the memory does get released even on Linux, but with others it doesn't...

I'm using VTK 7.0 compiled from source, with VS2015 on Windows and GCC 6.1 on Linux.

I made a small project that only loads the VTK files into a vector and then clears it, displaying the amount of used memory in between. With this code I can reproduce the memory leak on Linux; on Windows the memory is released:

http://tesla.math.sk/vtkbug.7z

Here you can download two datasets; zebra causes the leak, phallusia does not:

http://tesla.math.sk/zebra.7z
http://tesla.math.sk/phallusia.7z

Thanks for any help,

Robert Spir PhD
Department of Mathematics
Slovak University of Technology
Radlinskeho 11
813 68 Bratislava
Slovakia
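[For reference, a minimal sketch of how the "amount of used memory" can be read on Linux by parsing the VmRSS line of /proc/self/status. This is not code from the linked project; parseVmRSS and currentRSSkB are illustrative names.]

```cpp
#include <fstream>
#include <sstream>
#include <string>

// Scan lines like "VmRSS:   360448 kB" and return the value in kB,
// or -1 if no VmRSS line is present.
long parseVmRSS(std::istream& status)
{
    std::string line;
    while (std::getline(status, line)) {
        if (line.rfind("VmRSS:", 0) == 0) {   // line starts with "VmRSS:"
            std::istringstream fields(line.substr(6));
            long kb = -1;
            fields >> kb;                     // skips whitespace, reads the number
            return kb;
        }
    }
    return -1;
}

// Linux-only: resident set size of the current process in kB; -1 elsewhere,
// since /proc/self/status will not open.
long currentRSSkB()
{
    std::ifstream status("/proc/self/status");
    return parseVmRSS(status);
}
```

Note that VmRSS is resident memory as the kernel sees it, so it includes heap pages the allocator has freed but not yet returned to the OS.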
Hi Robert,

I took a look at your test case (thanks, btw -- these help a lot!) and while I can reproduce the results, it appears that Linux is just caching the memory for the process, not leaking it. I modified your script to do 50 cycles of load/clear, with this result:

    ./vtkbug
    Press 1 for zebra, 2 for phallucia
    1
    Data loaded: 352 MB in use.
    Data cleared: 248 MB in use.
    Data loaded: 378 MB in use.
    Data cleared: 378 MB in use.
    Data loaded: 352 MB in use.
    Data cleared: 352 MB in use.
    Data loaded: 352 MB in use.
    Data cleared: 352 MB in use.
    [snipped 45 load/clears]
    Data loaded: 378 MB in use.
    Data cleared: 378 MB in use.

While the memory is still marked as reserved by the process, it doesn't grow, and the 'leaked' memory is reused by later allocations. I also verified that the dataset destructors are called at the appropriate times, and valgrind confirms that no memory is leaked at exit.

A quick Google search for clearing Linux memory caches suggests it may be possible to instruct Linux to drop the cache if this is causing problems, but I suspect the operating system will reclaim cached memory when it starts running out. Unless you start seeing memory allocations fail or excessive swapping, this can likely be ignored.

HTH,
Dave

On Tue, Jun 21, 2016 at 2:36 AM, RobertS <[hidden email]> wrote: Hello all, ...
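[A follow-up sketch, not from the thread: on glibc-based Linux, heap memory that the allocator retains after a free can often be handed back to the OS explicitly with malloc_trim. Dataset and releaseAll below are hypothetical stand-ins for the VTK types, kept portable with a __GLIBC__ guard.]

```cpp
#include <vector>

#if defined(__GLIBC__)
#include <malloc.h>   // malloc_trim (glibc extension, not standard C++)
#endif

// Hypothetical stand-in for vtkSmartPointer<vtkStructuredPoints>.
struct Dataset { std::vector<char> voxels; };

void releaseAll(std::vector<Dataset>& loaded)
{
    loaded.clear();          // runs the dataset destructors, freeing their buffers
    loaded.shrink_to_fit();  // non-binding request to drop the vector's capacity too
#if defined(__GLIBC__)
    malloc_trim(0);          // ask glibc to return freed heap pages to the OS
#endif
}
```

Whether malloc_trim actually shrinks RSS depends on heap fragmentation, which may explain why one dataset appears to "leak" while another does not: the allocation pattern of one leaves the top of the heap free, the other does not.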
Thank you for the information, David.
I also noticed that when I'm loading new files, the memory usage does not increase (unless I load more files than before). I just haven't seen this behavior on Linux before, and the fact that with some datasets it actually frees the memory is still odd.

Robert