Adaptive Gaussian quadrature (AGQ) is a generalization of the Laplacian method (or, when FOCE Hess is selected, of the FOCE ELS method) that uses numerical integration techniques to improve the likelihood approximation. It is applicable to both Gaussian and user-supplied log likelihood cases.
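For context, all of these methods approximate the same per-subject marginal likelihood; the notation below is generic rather than taken from this documentation:

L_i(\theta) = \int p(y_i \mid \eta_i, \theta)\, p(\eta_i \mid \Omega)\, d\eta_i

where y_i are the observations for individual i, \eta_i is that individual's vector of random effects, and \Omega is their covariance matrix. Laplacian, FOCE ELS, and AGQ differ only in how this integral is approximated.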
When Laplacian with FOCE Hess is selected and NAGQ=1, the resulting method is the same as FOCE ELS and very similar to NONMEM's FOCE engine with interaction. When FOCE Hess is not selected and NAGQ=1, the method is similar to NONMEM's Laplacian engine. When NAGQ is set to a value greater than 1, the method has no NONMEM equivalent, and the quality of the likelihood approximation is improved over simple Laplacian or FOCE ELS.
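As a sketch of why a single node reproduces the Laplacian result, consider the one-dimensional adaptive Gauss-Hermite rule in generic notation, with \hat\eta the optimal ETA and \sigma^2 the inverse of the negative second derivative of the joint log likelihood \log f at \hat\eta:

\int f(\eta)\, d\eta \;\approx\; \sqrt{2}\,\sigma \sum_{j=1}^{N} w_j\, e^{z_j^2}\, f\!\left(\hat\eta + \sqrt{2}\,\sigma z_j\right)

For N=1 the single Gauss-Hermite node is z_1 = 0 with weight w_1 = \sqrt{\pi}, so the sum collapses to \sqrt{2\pi}\,\sigma\, f(\hat\eta), which is exactly the Laplace approximation.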
The main difference is that the Laplacian approximation is replaced by a numerical integration step that samples the joint log likelihood function at a grid of ETA values in addition to the ETAs at the maximum. The initial steps are identical: a conditional step to obtain the optimal ETAs, followed by numerical second-derivative evaluation at those ETAs to compute a Gaussian kernel. In the simpler Laplacian case, the approximation is computed using only the joint likelihood value at the optimal ETAs together with the second-derivative values, whereas AGQ continues with additional evaluations at special Gaussian quadrature nodes to improve the accuracy of the approximation.
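As an illustration of these steps only (not the product's implementation; the function agq_marginal_1d and its arguments are hypothetical), a minimal one-dimensional adaptive Gauss-Hermite sketch in Python could look like this:

import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar

def agq_marginal_1d(log_joint, n_nodes=5, h=1e-4):
    """Approximate the integral of exp(log_joint(eta)) over eta by one-dimensional
    adaptive Gauss-Hermite quadrature; log_joint must accept numpy arrays."""
    # Conditional step: locate the optimal ETA (mode of the joint log likelihood).
    eta_hat = minimize_scalar(lambda e: -log_joint(e)).x
    # Numerical second derivative at the mode defines the Gaussian kernel.
    curv = -(log_joint(eta_hat + h) - 2.0 * log_joint(eta_hat) + log_joint(eta_hat - h)) / h**2
    sigma = 1.0 / np.sqrt(curv)                      # kernel standard deviation
    # Gauss-Hermite nodes and weights for the weight function exp(-z^2).
    z, w = hermgauss(n_nodes)
    eta_nodes = eta_hat + np.sqrt(2.0) * sigma * z   # re-centered, re-scaled nodes
    vals = np.exp(log_joint(eta_nodes) + z**2)       # cancel the exp(-z^2) weight
    return np.sqrt(2.0) * sigma * np.sum(w * vals)

# Sanity check: the N(0,1) density should integrate to about 1.
print(agq_marginal_1d(lambda e: -0.5 * e**2 - 0.5 * np.log(2.0 * np.pi)))

With n_nodes=1 this sketch reduces to the Laplace approximation, matching the NAGQ=1 behavior described above.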
The general AGQ method allows the user to specify the number N of nodes along each ETA axis; both the accuracy of the approximation and the computational cost increase with N^d, where d is the number of random effects (ETAs) for each individual. Because of this, AGQ is often most useful for improving estimation accuracy in models with a small number d of random effects, as is often the case with user-supplied log likelihood functions, particularly for count data.
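To make the N^d scaling concrete, the short sketch below (generic Python, not product code) counts the joint-likelihood evaluations required by a full tensor-product grid and shows one way to build such a grid from the one-dimensional Gauss-Hermite nodes:

import numpy as np
from numpy.polynomial.hermite import hermgauss

# Evaluations per subject grow as N**d for a full tensor-product grid.
for d in (1, 2, 3, 5):
    for n in (1, 3, 5):
        print(f"d={d} random effects, N={n} nodes per axis -> {n**d} evaluations")

# Building the d-dimensional grid of nodes and product weights for N=3, d=2.
n, d = 3, 2
z, w = hermgauss(n)
nodes = np.array(np.meshgrid(*([z] * d))).reshape(d, -1).T             # shape (n**d, d)
weights = np.prod(np.array(np.meshgrid(*([w] * d))).reshape(d, -1), axis=0)
print(nodes.shape, weights.shape)                                       # (9, 2) (9,)

For example, N=5 nodes with d=3 random effects already requires 125 joint-likelihood evaluations per subject each time the marginal likelihood is computed, which is why AGQ with NAGQ>1 is most attractive when d is small.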
In the special case of Gaussian data, the user can optionally specify the use of a FOCE approximation to compute the Gaussian kernel covariance matrix instead of numerical second derivatives. This is more stable than a numerical second derivative computation.
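A sketch of the idea in generic notation, for a Gaussian residual model y = f(\eta) + \varepsilon with \varepsilon \sim N(0, R) and \eta \sim N(0, \Omega); the documentation does not give the exact expression used, so the form below is the standard Gauss-Newton-type (FOCE-style) approximation obtained by dropping terms involving second derivatives of f:

H \;\approx\; J^{T} R^{-1} J + \Omega^{-1}, \qquad J = \left.\frac{\partial f}{\partial \eta}\right|_{\hat\eta}

Because this matrix is positive definite by construction, it avoids the instabilities that a noisy numerical Hessian can introduce into the kernel covariance.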