Validation Suite

The Phoenix Validation Suite provides automated, functionality-based testing of Phoenix’s analytic capabilities to help customers validate their Phoenix installation. It supplies a set of test cases that exercise the corresponding version of Phoenix on the user’s system, in the user’s environment.

Note: The Validation Suite tests do not provide the appropriate technical controls to ensure a 21 CFR Part 11 compliant implementation.

This section contains the following topics:

Validation overview
Select and run the validation tests
Access validation reports
Validation templates
Phoenix WinNonlin validation tests
Phoenix NLME validation tests

Validation overview

The Phoenix Validation Suite focuses on numerical testing of Phoenix’s analytical tools to verify the accuracy of software computations on the processors in the user’s environment. The test cases demonstrate that the Phoenix computational engines perform as intended in that environment.

Each test case uses Phoenix to execute the operational objects from an input project in the user’s environment. The output worksheets from the executed project are exported from Phoenix as CSV files with column headers and, where used, column units. These exported CSV files are referred to as the ‘run’ files. The run files are then compared against a set of CSV files containing verified results, referred to as the ‘reference’ files. The comparison computes the differences in numeric results and tests for differences in text results.

The reference files have been verified by comparison either with computations in other products or with published examples: NONMEM, Microsoft Excel, SAS code, SAS procedures, S-PLUS code, results given in textbook examples or journal references, and examples and results supplied by the National Institute of Standards and Technology (NIST). The methods for verification of each test case are explained in the document included with the Validation Suite titled “Computational Engines Verification Report for Phoenix WinNonlin x.x.pdf,” where “x.x” is the version number.

The test case will have a status of Passed when all of the following conditions are true (a minimal sketch of this comparison follows the list):

The ‘run’ and ‘reference’ files contain the same number of lines,

Text values all match exactly between the ‘run’ and ‘reference’ files, and

Numerical differences between the ‘run’ and ‘reference’ files are within the acceptable tolerance (described below).
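The logic of these three conditions can be illustrated with a short sketch. The following Python is not the Validation Suite’s actual implementation: the function names, the CSV handling, and the fallback when a reference value is zero are illustrative assumptions; only the three pass conditions and the 1e–6 relative tolerance for WinNonlin (described below) come from this section.

import csv

TOLERANCE = 1e-6  # acceptable relative tolerance for WinNonlin (see below)

def values_match(run_val, ref_val):
    # Text values must match exactly; numeric values must agree within tolerance.
    try:
        run_num, ref_num = float(run_val), float(ref_val)
    except ValueError:
        return run_val == ref_val
    if ref_num == 0.0:
        # Assumption: fall back to an absolute comparison when the
        # reference value is 0; the text does not specify this case.
        return abs(run_num) <= TOLERANCE
    return abs(run_num - ref_num) / abs(ref_num) <= TOLERANCE

def compare_files(run_path, ref_path):
    # Returns True (Passed) only when all three conditions hold.
    with open(run_path, newline="") as f_run, open(ref_path, newline="") as f_ref:
        run_rows = list(csv.reader(f_run))
        ref_rows = list(csv.reader(f_ref))
    if len(run_rows) != len(ref_rows):  # condition 1: equal number of lines
        return False
    return all(                         # conditions 2 and 3, cell by cell
        len(run_row) == len(ref_row)
        and all(values_match(r, s) for r, s in zip(run_row, ref_row))
        for run_row, ref_row in zip(run_rows, ref_rows)
    )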

Due to limitations in computer accuracy (double-precision arithmetic carries 14 to 15 significant digits), the ‘run’ and ‘reference’ numbers are each rounded to 14 significant digits before the difference is computed. This ensures that the ‘difference’ file does not display differences smaller than 1e–14, which are beyond computer accuracy and therefore meaningless.

In addition, when comparing numerical results in ‘run’ files to ‘reference’ files, the Validation Suite applies an acceptable numerical tolerance when determining whether a test case passes or fails, so that test cases do not fail for insignificant differences. The acceptable tolerance is relative to the magnitude of the result being validated, not absolute. For validation of WinNonlin, the acceptable tolerance is 1e–6, and the computation is:

|run − ref| / |ref| <= 1e–6

where 'run' is the value from the ‘run’ file and 'ref' is the value from the ‘reference’ file. Therefore, the ‘difference’ file may display differences <= 1e–6 for a test case with a Passed status.
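As a worked illustration of the rounding and tolerance rules above, the sketch below rounds two values to 14 significant digits before differencing and then applies the relative tolerance. The round_sig helper is hypothetical; only the 14-significant-digit rounding and the 1e–6 tolerance come from this section.

import math

def round_sig(x, digits=14):
    # Hypothetical helper: round x to the given number of significant digits.
    if x == 0.0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, digits - 1 - exponent)

run, ref = 2.00000000000000044, 2.0  # values differ only beyond 14 significant digits

# Round both values to 14 significant digits before differencing, so the
# 'difference' file shows no noise smaller than 1e-14.
diff = round_sig(run) - round_sig(ref)
print(diff)  # 0.0

# The test case passes because the relative difference is within 1e-6:
print(abs(run - ref) / abs(ref) <= 1e-6)  # True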

Select and run the validation tests

When started, the validation tests run immediately, and a Validation Status dialog displays their progress.

[Figure: Validation Status dialog showing tests in progress.]

Use the Expand All and Collapse All buttons in the upper right corner of the dialog to expand or collapse the tree of tests with a single click.

Access validation reports

After executing all test cases in the Validation Suite, a Validation Report PDF file is generated and saved in <username>\Documents\Certara\Validation Reports. The file contains the user’s system information and, for each test case, the following:

test case name

test case objective

test case status (Passed, Failed, or N/A)

error messages, if any

links to three files: reference (reference result), run (user’s result from the Validation Suite run), and difference (differences between the reference and run results)

The report is also accessible through the Validation > Validation Reports menu option.

Note: Only Adobe® Acrobat® is supported for opening the Validation Report.

For more information on testing methodology and test cases, refer to “Phoenix WinNonlin validation tests” or “Phoenix NLME validation tests”.

Validation templates

The following template documents are provided with each Validation Suite product to assist in documenting a validation plan for the corresponding Phoenix product and the execution of that plan.

Computational Engines Verification Report for Phoenix <product><version>.pdf: This document is a report showing the means used to verify the reference results.

Requirements Specification template.docx

Test Plan template.docx

Traceability Document template.docx

Validation Plan template.docx

Validation Summary Report template.docx

To access the template documents

