20 Jun 2013, 5:44 a.m.
I'm training multiple networks on a single database. To speed things up and reduce disk reads, I use the shared_memory_object class provided by Boost.Interprocess. Since the lab workstation is currently unavailable, I migrated my code to my personal computer. On the lab workstation, the host program successfully reads all the data into memory. But on my PC, it strangely creates a file on the system drive rather than keeping the data in memory. The whole database is about 3.7 GB; the lab workstation has 32 GB of memory and runs Windows Server 2008 R2; my PC has 8 GB of memory and runs Windows 7. There should be enough memory to hold the data, so why does this happen? Is there a way to force the program to keep all the data in memory?
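
For reference, the relevant part of the host program looks roughly like this (a minimal sketch, not my actual code; the object name "TrainingData" and the hard-coded size are placeholders):

    // Sketch of loading the database into a Boost.Interprocess shared_memory_object.
    #include <boost/interprocess/shared_memory_object.hpp>
    #include <boost/interprocess/mapped_region.hpp>
    #include <cstring>

    namespace bip = boost::interprocess;

    int main()
    {
        // Create (or open) a named shared-memory object and size it to hold the database.
        bip::shared_memory_object shm(bip::open_or_create, "TrainingData", bip::read_write);
        shm.truncate(3700ull * 1024 * 1024);   // ~3.7 GB

        // Map the whole object into this process's address space.
        bip::mapped_region region(shm, bip::read_write);

        // The host program reads the database from disk into region.get_address();
        // each training process opens the same name with bip::open_only and reads from it.
        std::memset(region.get_address(), 0, region.get_size());

        return 0;
    }
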