Laplacian

Developed in the 1990s [7], the Laplacian method is applicable both to Gaussian data and to data modeled with general, user-supplied likelihoods, such as count, categorical, and time-to-event data. The Laplacian method has the same basic structure as FOCE ELS, in that a top-level minimization of an approximation to the marginal negative log likelihood is performed, but the details of the approximation are different and more computationally complex than those of FOCE ELS.
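The shared two-level structure can be sketched as follows. This is a minimal illustration, not the actual engine: the function names, the quadratic stand-in for a per-subject contribution, and the use of `scipy.optimize.minimize` are all assumptions made for the example; in the real method the per-subject term would be the ELS or Laplace approximation.

```python
import numpy as np
from scipy.optimize import minimize

def approx_marginal_nll(theta, subjects, per_subject_nll):
    """Top-level objective: sum the per-subject approximations to the
    marginal negative log likelihood (the piece that differs between
    FOCE ELS and Laplacian)."""
    return sum(per_subject_nll(theta, d) for d in subjects)

# Hypothetical stand-in for a per-subject approximate -log likelihood:
# a simple quadratic in the fixed-effect parameter.
quad_nll = lambda theta, d: float((theta[0] - d) ** 2)

# Outer minimization over the fixed effects, as in both methods.
result = minimize(approx_marginal_nll, np.array([0.0]),
                  args=([1.0, 3.0], quad_nll))
```

With the quadratic stand-in, the outer minimizer settles at the mean of the per-subject optima, which is all the sketch is meant to show: the estimation engine is an ordinary minimization whose objective is assembled subject by subject.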

The Laplacian algorithm is based on approximating the marginal log likelihood with a Laplacian approximation to the integral of the joint log likelihood. This replaces the joint log likelihood with a "nearby" Gaussian likelihood, called the Gaussian kernel, that can be integrated analytically. Determining the approximating Gaussian function requires both a conditional step and a numerical evaluation of the second derivatives of the joint log likelihood function at each point where the top-level objective function is evaluated. This is usually more computationally intensive and less numerically reliable than a FOCE ELS objective function evaluation. Consequently, in Gaussian data cases where FOCE is applicable, Laplacian is often regarded as slightly more accurate, but slower and numerically less reliable, than FOCE algorithms.
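The two steps described above (the conditional step plus the numerical second derivatives) can be illustrated with a small sketch of the generic Laplace approximation, log ∫ exp(ℓ(η)) dη ≈ ℓ(η̂) + (d/2)·log(2π) − ½·log|−H(η̂)|, where η̂ is the mode of the joint log likelihood ℓ and H its Hessian at the mode. The function name and the central-difference Hessian are illustrative assumptions, not the product's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def laplace_log_integral(joint_loglik, eta0):
    """Laplace approximation to log of the integral of exp(joint_loglik)
    over eta, via a matched Gaussian kernel at the mode."""
    # Conditional step: locate the mode of the joint log likelihood.
    res = minimize(lambda e: -joint_loglik(e), eta0)
    eta_hat = res.x
    d = eta_hat.size

    # Numerical second derivatives at the mode (central differences).
    # This extra evaluation at every top-level objective call is what
    # makes Laplacian costlier than a FOCE ELS evaluation.
    h = 1e-5
    H = np.zeros((d, d))
    I = np.eye(d)
    for i in range(d):
        for j in range(d):
            ei, ej = I[i] * h, I[j] * h
            H[i, j] = (joint_loglik(eta_hat + ei + ej)
                       - joint_loglik(eta_hat + ei - ej)
                       - joint_loglik(eta_hat - ei + ej)
                       + joint_loglik(eta_hat - ei - ej)) / (4 * h * h)

    # Integrate the matched Gaussian kernel analytically.
    sign, logdet = np.linalg.slogdet(-H)
    return joint_loglik(eta_hat) + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet
```

For a joint log likelihood that is exactly quadratic in η (i.e., an exactly Gaussian kernel), the approximation is exact; for the general user-supplied likelihoods the method supports, it is only as good as the Gaussian kernel's fit near the mode.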

For a more mathematical look at the Laplacian algorithm, refer to “Explore the Formulation of Laplacian”.

Note: The Laplacian method with FOCE Hess selected is the same as NONMEM's FOCE engine when NAGQ=1. When FOCE Hess is not selected, it is similar to NONMEM's Laplacian engine.

© Certara USA, Inc. All rights reserved.