From: "Pravin Jadhav" pravinj@gmail.com
Subject: [NMusers] addition of dummy points 
Date:  Wed, April 20, 2005 11:33 am 

Dear all,
 
I ran into a problem, and it looks like it has been discussed before:
 http://www.cognigencorp.com/nonmem/nm/99apr202004.html

However, I still need a solution to it.
 
While using ADVAN9 or ADVAN6, the addition of a few dummy points (DV=0 and MDV=1)
affects the estimation results. In my model, adding 50 dummy points drastically changes
the parameter estimates. I attribute this to the complexity of the model: when I ran a
simple one-compartment PK model using ADVAN9/6 with and without the additional records,
the results were only slightly different, although one would expect them to be identical.
Increasing the complexity of the model results in a larger deviation from that expectation.
 
I need to include dummy points for simulation purposes because the data are very sparse,
and thus the sampled data alone do not give an idea of the "true" trajectory.
 
It is important to note that the estimation results depend on the number of dummy points
added. I would like to run the estimation step using the raw dataset and perform simulations
in a subsequent step. Can I do both operations in the same run (one control stream)?
 
Basically, I plan to have two datasets (with and without dummy points). I would like to
run the estimation using the dataset without dummy points and pass the estimated parameters
on to a simulation that uses the rich dataset. I guess an MSFO file will help me store the
results of the estimation, but how do I make the simulation step read the parameters from
that file? The other option is manual editing (a two-step approach: estimation followed by
simulation); however, I am fitting each subject one by one, so I just want to know if there
is any automatic option.
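
For concreteness, here is a sketch of the kind of two-run setup I have in mind (file names,
model records, and initial estimates are just placeholders, not my actual model):

   ; Run 1: estimation on the raw (sparse) dataset, saving the results in an MSF
   $PROBLEM ESTIMATION ON SPARSE DATA
   $DATA sparse.dat
   $INPUT ID TIME AMT DV MDV EVID
   $SUBROUTINES ADVAN1 TRANS2
   $PK
     CL = THETA(1)
     V  = THETA(2)
     S1 = V
   $ERROR
     Y = F + ERR(1)
   $THETA (0,0.2) (0,1)
   $SIGMA 4
   $ESTIMATION METHOD=0 MAXEVAL=9999 PRINT=5 MSFO=run1.msf

   ; Run 2: simulation on the rich dataset (with the dummy points);
   ; the model and final estimates are read from the MSF, so no $PK, $THETA, etc.
   $PROBLEM SIMULATION ON RICH DATA
   $DATA rich.dat
   $INPUT ID TIME AMT DV MDV EVID
   $MSFI run1.msf
   $SIMULATION (20050420) ONLYSIM
   $TABLE ID TIME DV NOPRINT ONEHEADER FILE=sim.tab

If I understand the MSF mechanism correctly, the two problems could even be stacked in a
single control stream, with the second $PROBLEM reading the MSF written by the first, so
nothing would need to be edited by hand in between.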
 
Thank you in anticipation,
 
Pravin 
_______________________________________________________

From: "Leonid Gibiansky" leonidg@metrumrg.com
Subject: Re: [NMusers] addition of dummy points 
Date: Wed, April 20, 2005 12:30 pm

Pravin

I would try to increase the precision of estimation (SIGDIGITS=5 or 6) in order to get
similar results with and without the dummy points. If this does not fix the problem, I
would prefer to use the same data set for simulation and estimation.
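
For example (a sketch of only the relevant records; the method and tolerance values are
just examples, not taken from your control stream):

   $SUBROUTINES ADVAN6 TOL=6                               ; tighter integration tolerance
   $ESTIMATION METHOD=1 MAXEVAL=9999 SIGDIGITS=6 PRINT=5   ; more significant digits requested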

One useful test would be to do two estimations, with and without the dummy points, and
plot the population and individual predictions of the two models against each other (at
the observation time points). If you see the points scattered around the unit line, then
it should not matter which parameter estimates you use: the model is over-parametrized,
and two different sets of parameters give the same predictions. If the predictions are
different, you would need to decide (using diagnostic plots, OF, etc.) which of them
describes the data better, and then use that one for simulation.
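
For example, assuming an individual prediction IPRE is defined in $ERROR (IPRE=F), a
table request like the following in each of the two runs would give you the predictions
to plot (the file name is a placeholder):

   $ERROR
     IPRE = F              ; individual prediction
     Y    = F + ERR(1)
   $TABLE ID TIME MDV PRED IPRE NOPRINT ONEHEADER FILE=with_dummy.tab

Then plot PRED (and IPRE) from the run with dummy points against PRED (and IPRE) from the
run without, keeping only the records with MDV=0.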

I would not recommend moving to simulations before you resolve the problem of different
parameter estimates obtained from the same data: dummy points should not significantly
alter the parameter estimates.

Leonid
_______________________________________________________

From: "Pravin Jadhav" pravinj@gmail.com
Subject: Re: [NMusers] addition of dummy points
Date:  Thu, April 21, 2005 4:52 pm 

Hi Leonid,
 
Thanks for the reply. I looked at the estimates and the predictions from the two runs
(with and without the dummy points). As I said earlier, the estimates are different.
However, the predictions are identical (I mean, they fall exactly on the 45-degree line).
So I thought your following comment applied to this situation:

> If you see the points scattered around the unit line, then it should not matter which
> parameter estimates you use: the model is over-parametrized, and two different sets of
> parameters give the same predictions.

I was having trouble convincing myself that the model is over-parametrized or that there
is any identifiability issue, because all I am doing is adding a few dummy points that,
conceptually, DO NOT even count during estimation. Could it be a system problem?
 
So I ran a small experiment. All the code, datasets and results are available at
http://www.geocities.com/pravin1851/nmtest/
 
I simulated data for one subject after a 100 mg i.v. bolus dose
(clearance = 0.2 L/hr, V = 1 L, additive residual variability = 2 mg/L).
A total of 9 PK samples were obtained at 1, 2, 4, 6, 8, 10, 12, 14, and 18 hr.
(http://www.geocities.com/pravin1851/nmtest/onecomp.xls)

Then I appended 50 dummy points (MDV=1) to the above dataset.
(http://www.geocities.com/pravin1851/nmtest/onecomp_app.xls)

Both datasets were analyzed using ADVAN1, ADVAN6 and ADVAN9 with identical initial
conditions. The CTL files are also available at that link
(onecomp_adv*.ctl and onecomp_adv*_app.ctl for the respective datasets).
 
And the estimates are surprisingly different! 
(http://www.geocities.com/pravin1851/nmtest/nmmbt.xls)
 

#Run             Obj    CL    V    SIG1
onecomp_adv1     10.888 0.206 1    1.10905
onecomp_adv1_app 10.888 0.206 1    1.10905
onecomp_adv6     10.889 0.546 2.65 1.10905
onecomp_adv6_app 10.882 0.522 2.54 1.10905
onecomp_adv9     10.913 0.55  2.68 1.11355
onecomp_adv9_app 10.882 0.522 2.54 1.10905

 
However, the predictions are identical. Here is a plot of the PREDs for each dataset,
with ADVAN1 as the reference. (EXACTLY MY PROBLEM: different estimates, same predictions.)
(http://www.geocities.com/pravin1851/nmtest/pred_comparison.wmf)
 
Does that mean this model is over-parametrized? I don't think so. If this dataset were
blinded, any of those estimates and fits would look okay to me; I would have no idea which
one to believe in real life.
 
So it is something else. As Nick mentioned in the previous post, it may be something about
the step size. Please note that ADVAN1, which implements the closed-form solution, yields
the TRUE estimates. But why should it make such a large difference? Please correct me if
there is any conceptual mistake in implementing these simulations.
 
I also noticed, in this problem as well as in my dataset, that the ratio of the KEY
parameters remains the same.
 
I look forward to hearing from you.
 
Pravin 
-- 
Pravin Jadhav
Graduate Student
Department of Pharmaceutics
MCV/Virginia Commonwealth University 
DPE1/OCPB/CDER/Food and Drug Administration
Phone: (301) 594-5652
Fax: (301) 480-3212

_______________________________________________________

From: "Leonid Gibiansky" leonidg@metrumrg.com
Subject: Re: [NMusers] addition of dummy points 
Date: Thu, April 21, 2005 5:07 pm

Pravin,
In your ADVAN1 code you use CENT=F, while in the ADVAN6/9 code you use CENT=A(1). It
should be CENT=A(1)/V.
Try it
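
A minimal sketch of the intended scaling under ADVAN6 (placeholder initial estimates,
not your exact control stream):

   $SUBROUTINES ADVAN6 TOL=5
   $MODEL COMP=(CENTRAL,DEFDOSE,DEFOBS)
   $PK
     CL = THETA(1)
     V  = THETA(2)
     K  = CL/V
   $DES
     DADT(1) = -K*A(1)
   $ERROR
     CENT = A(1)/V      ; amount/volume = concentration, matching F under ADVAN1 TRANS2 with S1=V
     Y    = CENT + ERR(1)
   $THETA (0,0.2) (0,1)
   $SIGMA 4
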
Thanks
Leonid
_______________________________________________________

From: "Pravin Jadhav" pravinj@gmail.com
Subject: Re: [NMusers] addition of dummy points 
Date:  Thu, April 21, 2005 5:27 pm

Sorry about that. Although not as dramatic as it looked before, here are the updated results:
 
#Run             Obj    CL    V     Res
onecomp_adv6     10.888 0.206 1     1.10905
onecomp_adv6_app 10.896 0.205 0.999 1.10905
onecomp_adv9_app 10.896 0.205 0.999 1.10905
onecomp_adv9     10.912 0.205 1     1.11355

 
Still, the Obj and parameters are not identical. Is it reasonable to say that? Should we
even expect them to be?

Or does the previous comment still stand, i.e., 'the model I am working with is
over-parametrized'? I agree that the model I am working on is complex. Would the small
difference we see here for the simplest case get worse with the complexity of the model?
 
Thanks,
 
Pravin 
_______________________________________________________

From: "Ekaterina Gibiansky" gibianskye@guilfordpharm.com
Subject: Re: [NMusers] addition of dummy points
Date: Thu, April 21, 2005 5:38 pm 

Pravin,

Another thing to check, just in case... You do not have EVID in your data file. The EVID
data item is required if PREDPP is used, but NONMEM is smart enough to insert it when only
dosing and observation records are present. Generally, your dummy observations should have
EVID=2. You may want to check that NONMEM works correctly when EVID is not specified.
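
For example (values purely illustrative), the three record types would look like:

   ID TIME AMT  DV   MDV EVID
   1  0    100  0    1   1     ; dose record
   1  1    0    4.1  0   0     ; observation record
   1  1.5  0    0    1   2     ; dummy record: prediction only, never an observation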

Katya
_______________________________________________________

From: "Pravin Jadhav" pravinj@gmail.com
Subject: Re: [NMusers] addition of dummy points
Date: Thu, April 21, 2005 5:57 pm 

Hi Katya,
 
I do have EVID data item (item #5) but certainly did not have
EVID=2 for missing points. Addition of EVID=2 did not make any
difference to the results.
 
#Run             Obj    CL    V     Res
onecomp_adv6     10.888 0.206 1     1.10905
onecomp_adv6_app 10.896 0.205 0.999 1.10905
onecomp_adv9_app 10.896 0.205 0.999 1.10905
onecomp_adv9     10.912 0.205 1     1.11355

Thanks.
Pravin
_______________________________________________________

From: "Leonid Gibiansky" leonidg@metrumrg.com
Subject: Re: [NMusers] addition of dummy points
Date: Thu, April 21, 2005 6:19 pm

Pravin,
I would regard these results as identical. Try increasing SIGDIG to 5 or 6 to get the
last digits right.

The reason for the difference is that NONMEM selects the integration steps (when the
system of differential equations is integrated) using a rule that depends on the distance
to the next observation or prediction point. When you insert extra points, the steps
change. If your system is well-defined, this is irrelevant (as in your last example, where
only the last digits differ). If the system is unstable or over-parametrized, the addition
of extra points (as well as different initial values, different SIGDIG settings, etc.) may
lead to changes in the parameter estimates. I would repeat my guess that your model is
most likely over-parametrized and unstable. You may try starting the run of the model
without extra points at the initial parameters obtained by fitting the data set with extra
points, and compare the parameter estimates.
Thanks
Leonid
_______________________________________________________

From: "Pravin Jadhav" pravinj@gmail.com
Subject: Re: [NMusers] addition of dummy points
Date: Thu, April 21, 2005 8:58 pm 

Leonid,
 
Thanks a lot for your help. The problem is much clearer now. As I understand it, I should
first try to get both systems to behave alike, irrespective of the number of dummy points,
by taking a closer look at the model and the estimates. I will try changing the model
and/or fixing the estimates before proceeding.
 
Thanks.
 
Pravin
_______________________________________________________

From: "Nick Holford" n.holford@auckland.ac.nz
Subject: Re: [NMusers] addition of dummy points 
Date: Fri, April 22, 2005 1:08 am 

Leonid,

I agree with you that the results are equivalent for each method, given that 3 significant
digits were requested. But I don't understand why you say the model is 1) over-parametrized
and 2) unstable. Can you explain why these negligible differences in the parameter estimates
and OBJ lead you to assert these two conclusions?

Nick
_______________________________________________________

From: "Leonid Gibiansky" leonidg@metrumrg.com
Subject: Re: [NMusers] addition of dummy points
Date:  Fri, April 22, 2005 6:56 am 

Nick,
This was the tail of a long exchange of messages. It started with the question of what it
means when the addition of dummy points alters the parameter estimates. Somewhere in the
middle there was an observation that the model predictions are identical regardless of
whether those points are inserted, and in spite of the difference in parameter estimates,
plus an example of a simulated data set with the same property. There was an error in the
example. After the error was fixed, the discrepancy in the parameter estimates was reduced
to the last digits (see below).

My first statement (results are equivalent...) referred to the example, while the second
one (the system is unstable...) related to the original model, where the discrepancy was
significant.
Sorry for the confusion
Leonid
_______________________________________________________