From: "Sam Liao"
Subject: [NMusers] max no. of observation per subject
Date: Fri, January 25, 2002 11:15 am


Hi,

I am interested in finding out the maximum number of observations per subject
that we can run in NONMEM.  I modified 'NO' from 50 to 5000 in the NONMEM source
code to run a simulation with over 3000 observations and over 300 multiple
doses in one subject.  NONMEM seems to become very slow, I have to interrupt
it after 12 hours run.

In my past experience, 1000 observations in one subject worked fine.
Any comments on this issue will be greatly appreciated.

Best regards,



Sam Liao, Ph.D.
PharMax Research
270 Kerry Lane,
Blue Bell, PA 19422
phone: 215-6541151
efax: 1-720-2946783

 


*******

From: Steve_Charnick@vpharm.com
Subject: Re: [NMusers] max no. of observation per subject
Date: Fri, 25 Jan 2002 1:04 pm


On what platform and with what specs are you running NONMEM?


*******


From: "Sam Liao" 
Subject: RE: [NMusers] max no. of observation per subject
Date: Fri, 25 Jan 2002 1:42 pm


Hi Steve:
My system for running NONMEM is Win2K with a 1.5 GHz Pentium 4 CPU and 800 MB of RAM.
 
If your system is different, I would still be interested to know.  Thanks!


*******

From: Nick Holford
Subject: Re: [NMusers] max no. of observation per subject
Date: Fri, January 25, 2002 2:20 pm

Sam,

There is no maximum in theory. Increasing NO changes the dimensions of fixed-size
arrays used by the NONMEM executable. The effect is to increase the memory
requirements; it does not directly affect the execution speed for each subject.
Execution does slow down, however, when the virtual memory required exceeds the
physical memory allocated to the NONMEM executable, because more swapping must then
take place between the pagefile on the hard disk and physical RAM. You can check the
Memory Usage, Virtual Memory Size and Page Fault rate using the Task Manager (WinXP,
Win2K, WinNT). You will need to use the View > Select Columns menu to see VM Size and
PF Delta. This will give you an idea of how much memory your NONMEM run is using and
especially whether it is causing a lot of page faults (i.e. swapping memory between
RAM and the hard disk).
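
The proportional effect described above can be sketched with some back-of-envelope
arithmetic. The array count and element size below are purely illustrative
assumptions, not values taken from the NONMEM source:

```python
# Rough illustration of why raising NO from 50 to 5000 inflates NONMEM's
# memory footprint: arrays dimensioned by NO grow in direct proportion.
# Both constants below are made-up illustrative values.

BYTES_PER_ELEMENT = 8     # one double-precision value (assumption)
ARRAYS_SIZED_BY_NO = 200  # hypothetical count of arrays dimensioned by NO

def fixed_array_bytes(no):
    """Approximate memory claimed by arrays whose dimension is NO."""
    return no * ARRAYS_SIZED_BY_NO * BYTES_PER_ELEMENT

# Raising NO by a factor of 100 raises this portion of the memory
# footprint by the same factor, which is what pushes the working set
# past physical RAM and triggers page-file swapping.
print(f"NO=50:   {fixed_array_bytes(50) / 2**20:.2f} MB")
print(f"NO=5000: {fixed_array_bytes(5000) / 2**20:.2f} MB")
```

Whatever the true per-observation cost is, the scaling is linear in NO, so a 100-fold
increase in NO means a 100-fold increase in the memory those arrays claim.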

The memory-swapping overhead may also depend on the compiler you use and how it
requests memory from the OS, e.g. the Watcom compiler uses a third-party memory
manager while the Compaq compiler seems more aware of the native Windows memory
management.

The default settings in NSIZES seem to be a pretty good optimization. Increasing the
number of parameters and/or the number of observations will slow down execution. If
run time is important to you then I do not recommend routinely using larger settings
for NONMEM unless your particular model/data combination calls for it.

You say "NONMEM seems to become very slow, I have to interrupt it after 12 hours
run.". Does this mean that each iteration takes longer and longer until NONMEM seems
to have stopped making any progress, so you kill the task? I cannot explain this
behaviour unless somehow there is a memory leak. I have not noticed that NONMEM has
memory leaks, although the Pharsight Trial Simulator 2.1.1 has a leak that causes
this kind of problem when it tries to simulate large data sets. If you check the Task
Manager Performance page you can watch whether memory usage is increasing while
NONMEM runs. Are you running other programs, e.g. Visual NM, that might be monitoring
the NONMEM execution and perhaps using up memory? [I know you sometimes use WFN, but
that does not run while NONMEM runs, so I cannot see that this would cause a
problem.]

Nick

-- 
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/


*******

From: Steve_Charnick@vpharm.com
Subject: RE: [NMusers] max no. of observation per subject
Date: Fri, 25 Jan 2002 3:14 pm


Hi Sam, 

That system is more than enough to run NONMEM under 'usual' or 'typical' conditions,
so I would say that the number of observations per subject is the likely culprit.
If the same control stream works with a much lower number,
then I think you've got your answer.

Steven


*******


From: Steve_Charnick@vpharm.com
Subject: Re: [NMusers] max no. of observation per subject
Date: Fri, 25 Jan 2002 3:17 pm



I think Nick's question regarding the cessation
of the run after 12 hours is an interesting one.  I still think the
culprit is the large number of samples taxing the system;
however, if it turns out that NONMEM simply
ceases to make progress after 12 hours,
I'd be interested to know why.



*******


From: "Sam Liao"
Subject: RE: [NMusers] max no. of observation per subject
Date: Sat, 26 Jan 2002 10:15 pm



Hi Niclas:

Thanks a lot for your example case; it is just what I asked for.

The model I used for this simulation is ADVAN2.  I ran just one subject in
order to project the run time for over 250 subjects.  Since finding the
maximum would be a very time-consuming task, I tried to learn from other
NONMEM users' past experience.

Concerning the memory-leak question, I did not find any memory leak in the
NONMEM run.  I am not aware of any memory-leak problems in DOS applications.


Best regards,



Sam Liao, Ph.D.
PharMax Research
270 Kerry Lane,
Blue Bell, PA 19422
phone: 215-6541151
efax: 1-720-2946783