Hi,
What version of PyFerret are you running?
Memory allocation does not affect processing speed; it only limits the
size of the grids that can be loaded into memory. PyFerret and Ferret
automatically perform some operations, particularly transformations, in
"chunks", so you do not need to allocate larger amounts of memory for
those tasks. Ferret also automatically caches data in memory for
efficiency, so that it does not have to re-read data all the time, but
it will re-use that memory space as needed. The next release includes
some updates to memory management and extends this chunking.
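To illustrate the idea behind chunking (this is only a sketch of the general technique, not PyFerret's internal code; the function name and chunk size here are my own):

```python
import itertools

def chunked_mean(values, chunk_size=4):
    """Compute a mean by accumulating fixed-size chunks, so that only
    chunk_size items need to be held in memory at any one time."""
    total = 0.0
    count = 0
    it = iter(values)
    while True:
        # Pull at most chunk_size items from the stream.
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        total += sum(chunk)
        count += len(chunk)
    return total / count

print(chunked_mean(range(1, 101)))  # 50.5
```

The same result is obtained as from loading everything at once, but peak memory use is bounded by the chunk size, which is why a larger SET MEMORY allocation is not needed for these operations.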
What response do you see when you try SET MEMORY/SIZE=nnn? If it
is successful, Ferret will report, for instance,
yes? set mem/siz=200
Cached data cleared from memory
yes? show memory
Current size of FERRET memory cache: 200 MegaWords  (1 word = 8 bytes)
If it is unable to allocate the requested amount of memory, it will say so:
yes? set mem/siz=200000
Cached data cleared from memory
Unable to allocate 200000.0 Mwords of memory.
Restoring previous memory of 200.0 Mwords.
You can also issue the CANCEL MEMORY command, which removes data
stored in the memory cache but does not change the amount of memory
allocated.
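For reference, a typical sequence might look like this (a sketch, not output from your session):

```
yes? cancel memory    ! clear cached variables; allocation is unchanged
yes? show memory      ! confirm the allocated cache size afterwards
```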
Ansley
On 6/6/2017 3:12 AM, saurabh rathore wrote:
Dear ferreters,

I am facing a problem: in PyFerret, SET MEMORY/SIZE=nnn is not working
for me. It is unable to clear the memory, so the processing speed is
very low. It suddenly stopped working for me in the latest version of
PyFerret. I also reinstalled it, but it is still not working.

Any help with this?

Regards,
Saurabh
--
REGARDS
Saurabh Rathore
Research Scholar (Ph.D.)
Centre For Oceans, Rivers,
Atmosphere & Land Science Technology
Indian Institute Of Technology,
Kharagpur
contact: +91-8345984434