General NAVLE® Information
Updated June 27, 2012
In January 1998, the NBVME selected the National Board of Medical Examiners (NBME) to develop a new computer-based licensing examination, the North American Veterinary Licensing Examination (NAVLE®). The NAVLE replaced both the National Board Examination (NBE) and the Clinical Competency Test (CCT) beginning in November 2000.
Located in Philadelphia, the NBME was founded in 1915, and has as its primary mission the development of examinations for physicians, including the United States Medical Licensing Examination (USMLE®). The NBME's Division of Client Programs provides examination development services for a variety of outside clients in medical fields.
The NAVLE is a requirement for licensure to practice veterinary medicine in all licensing jurisdictions in North America. The NAVLE consists of 360 clinically relevant multiple-choice questions.
The NAVLE is offered throughout North America and at certain overseas sites at computer testing centers operated by Prometric. The NAVLE is available during a four-week testing window in November-December, and a two-week window in April.
All candidates taking the NAVLE must agree to a confidentiality statement before they are able to take the examination. See the Irregular Behavior page for more information.
Additional information on the NAVLE can be found on the NAVLE Information and Application page on this web site.
NAVLE Test Specifications
Forms of the NAVLE administered from November-December 2005 through April 2011 were constructed according to a NAVLE Job Analysis approved by the NBVME in 2003. Forms administered beginning in November-December of 2011 were constructed according to the NAVLE Job Analysis approved in 2010. For more information, see the NAVLE Job Analysis page.
NAVLE Scoring and Standard Setting
The passing standard for the NAVLE is established using a content-based, or criterion-referenced, standard setting procedure. This means that the passing standard depends only on the content and difficulty of the items on the examination. Each candidate's performance is measured against a fixed passing standard, and the examination is not graded on a curve. In theory, all candidates could pass or all could fail. The procedure, a modified Angoff method, is commonly used to set passing standards on licensing and certification examinations.
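As an illustration of the general Angoff approach, each panelist estimates, for every scored item, the probability that a minimally competent candidate would answer it correctly; those estimates are averaged across panelists and summed across items to give a recommended raw passing score. The sketch below is a simplified, hypothetical illustration of that arithmetic, not the NBVME's actual procedure; the panel size, ratings, and function name are invented for the example.

```python
def angoff_passing_score(ratings):
    """Recommended raw passing score from Angoff-style item ratings.

    ratings: one list per item, containing each panelist's estimate
    (0.0-1.0) of the probability that a minimally competent candidate
    answers that item correctly.
    """
    # Average the panelists' estimates for each item...
    item_means = [sum(r) / len(r) for r in ratings]
    # ...then sum across items to get the recommended raw cutoff.
    return sum(item_means)

# Hypothetical panel of three judges rating a four-item examination:
ratings = [
    [0.60, 0.70, 0.65],
    [0.55, 0.50, 0.60],
    [0.80, 0.75, 0.70],
    [0.40, 0.45, 0.50],
]
cutoff = angoff_passing_score(ratings)  # recommended score out of 4 items
```

Because the cutoff is built item by item from judgments about a minimally competent candidate, it depends only on the examination's content and difficulty, which is why all candidates could in principle pass or all could fail.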
The NBVME convened a standard setting workshop in January 2001 to establish a recommended passing standard for the examination. The passing standard approved following that workshop was applied to examinations given from the fall of 2000 through April 2004. The passing rate for criterion candidates for the examinations given in the fall of 2000, 2001, 2002, and 2003 was 90.4%, 92.5%, 93.3%, and 94.9%, respectively.
A new standard setting workshop was held in December 2004, and the passing standard developed at that workshop was approved by the NBVME and applied beginning with the fall 2004 NAVLE administration. The passing rate for criterion candidates for the fall 2004, 2005, 2006, and 2007 administrations was 89%, 88%, 90.1%, and 92.1%, respectively.
The next NAVLE standard setting workshop was conducted in July 2008. The passing standard derived from that workshop was approved by the NBVME in January 2009, and was applied beginning with the November-December 2008 NAVLE. The passing rate for criterion candidates for the fall 2008, 2009, and 2010 administrations was 93.4%, 96.1%, and 95.1%, respectively.
A new standard setting workshop was held in December 2011. The passing standard derived from that workshop was approved by the NBVME in December, and was applied beginning with the November-December 2011 NAVLE. The passing rate for criterion candidates for the November-December 2011 administration was 93.0%.
Each form of the NAVLE includes 300 scored items and 60 pretest items. See the NAVLE Bulletin of Information for Candidates for more information.
The percentage of questions that a candidate must answer correctly in order to pass ranges from 55% to 65%. The actual passing score is adjusted for each form of the NAVLE, and for each administration, to account for slight differences in difficulty between the forms. This ensures that each candidate's score is valid and reliable, no matter which form of the examination they take.
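The effect of this adjustment can be sketched in round numbers: the same fixed standard translates into slightly different raw cutoffs on forms of slightly different difficulty. The snippet below is purely illustrative; the percentages and the difficulty adjustment are hypothetical, and the real equating procedure is statistical rather than a simple percentage shift.

```python
import math

SCORED_ITEMS = 300  # each NAVLE form includes 300 scored items

def raw_passing_score(passing_pct, difficulty_shift=0.0):
    """Raw number of correct answers needed to pass a given form.

    passing_pct: hypothetical passing percentage on a reference form
    (somewhere in the 55-65% range).
    difficulty_shift: illustrative adjustment in percentage points,
    positive for a form that is easier than the reference form (more
    correct answers are needed to represent the same fixed standard).
    """
    return math.ceil(SCORED_ITEMS * (passing_pct + difficulty_shift) / 100)

# On a reference form with a hypothetical 60% cutoff:
base = raw_passing_score(60.0)          # 180 of 300 scored items
# On a slightly easier form, the raw cutoff rises:
easier = raw_passing_score(60.0, 1.5)   # 185 of 300 scored items
```

The point of the adjustment is that `base` and `easier` represent the same underlying standard, so a candidate is neither advantaged nor penalized by the particular form taken.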
Additional information can be found in the annual NAVLE Technical Reports and NAVLE Candidate Performance Reports available on this site.