Understanding Job Analysis and Competency Modeling in the Certification Context
It is a truth universally acknowledged, that a certification must be supported by a job analysis. Indeed, this truth has been codified in NCCA Standard 14: Job Analysis, which states that a certification program "must have a study that defines and analyzes descriptions of job-related elements linked to the purpose of the credential." Job analysis is the means by which the program defines the scope of the credential and the universe of potential content for the examination.
In the credentialing world, job analysis has been referred to by a variety of terms, including "job task analysis," "practice analysis" and "role delineation." Different credentialing organizations may prefer one of these terms over the others due to perceived nuances in their meaning. Essentially, however, all these terms refer to a structured method and process for identifying and documenting the elements of a job, role or profession. In a job analysis, the job or professional role is broken down into components, which include the job tasks and responsibilities that describe the work performed (i.e., specific work behaviors), and may also include the knowledge, skills and abilities (KSAs) an individual needs to perform those tasks and responsibilities. Job analysis studies may also investigate other key parameters of a profession or role; in health-related professions, for example, these may include patient ages or presenting conditions. Job analysis studies typically involve up-front qualitative data gathering to create a new delineation of tasks, KSAs or other variables, or to update an existing one. This is followed by the administration of a survey to job incumbents to gather validation evidence gauging empirical support for each job element. For certification programs, a crucial final output of a job analysis is a set of test specifications defining the content covered on the credentialing examination and the distribution of items across that content.
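To make the survey-to-specification step more concrete, the minimal sketch below shows one way mean incumbent ratings might be rolled up into blueprint weights. The domains, tasks, rating scales and frequency-times-criticality weighting rule are hypothetical illustrations of a single possible approach, not a method prescribed by the NCCA standards.

```python
# Hypothetical sketch: rolling up incumbent survey ratings into blueprint weights.
# The domains, tasks, rating scales and weighting rule are illustrative assumptions,
# not a standard prescribed by NCCA or I.C.E.

# Mean survey ratings per task (e.g., frequency and criticality on 1-5 scales),
# grouped under content domains.
survey_results = {
    "Assessment": [
        {"task": "Collect client history", "frequency": 4.6, "criticality": 4.2},
        {"task": "Interpret screening results", "frequency": 3.8, "criticality": 4.7},
    ],
    "Intervention": [
        {"task": "Develop care plan", "frequency": 4.1, "criticality": 4.5},
        {"task": "Monitor response to treatment", "frequency": 4.4, "criticality": 4.0},
    ],
}

TOTAL_ITEMS = 100  # planned length of the examination form


def task_weight(task):
    """One simple weighting rule: frequency multiplied by criticality."""
    return task["frequency"] * task["criticality"]


# Sum task weights within each domain, then convert to proportions and item counts.
domain_weights = {
    domain: sum(task_weight(t) for t in tasks)
    for domain, tasks in survey_results.items()
}
grand_total = sum(domain_weights.values())

print(f"{'Domain':<15}{'% of exam':>12}{'Items':>8}")
for domain, weight in domain_weights.items():
    proportion = weight / grand_total
    print(f"{domain:<15}{proportion:>11.1%}{round(proportion * TOTAL_ITEMS):>8}")
```

In practice, quantitative roll-ups of this kind typically inform, rather than replace, subject matter expert judgment when the final test specifications are set.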
Is Competency Modeling the Same as Job Analysis?
There are different views in our industry as to whether, and how, competency modeling differs from job analysis. The language of NCCA Standard 14 implies that competencies are one possible output of a job analysis, stating that "the job analysis must lead to clearly delineated job-related elements (e.g., domains; tasks; competencies; and knowledge, skills and abilities) that characterize proficient performance." In addition, the most recent I.C.E. terminology guide cross-references the two terms (I.C.E. Basic Guide to Credentialing Terminology, 2020).
It is true that the competency modeling process is quite similar to that of job analysis. Both typically involve a mixed-methods approach: up-front qualitative data gathering to create a delineation of tasks and/or KSAs, sometimes also including other variables (knowledge, skills, abilities and other characteristics, or KSAOs), followed by a survey of job incumbents to gather validation evidence gauging empirical support for each job element. Both processes yield outputs that can be used to develop blueprints for credentialing examinations, although deriving a blueprint from a competency model may be more complex than deriving one from a job analysis (Raymond, 2018).
It would be helpful if there were a universally acknowledged definition of competencies for credentialing purposes. There is not, however, and this makes it all the more challenging to explain how competency modeling differs from a traditional job analysis.
As we see it, professional competencies represent observable and measurable demonstrations of underlying characteristics of the individual. They are aspects of the job holder, rather than of the job itself. Competencies represent the integration and application of KSAOs, and they often go beyond job-specific KSAs to include the non-technical, essential skills required for successful job performance. Competencies frequently include a performance benchmark that is either explicitly stated or "baked in" through the wording of the competency statement itself. In addition, the "A" in competency modeling may also refer to attitudes or attributes, individual qualities that are quite distinct from the abilities more typically articulated in job analyses.
When/Why Would a Credentialing Organization Wish to Delineate Competencies?
Competencies may provide a fuller picture of a role or profession, and a competency model may therefore be useful to multiple end users in addition to credentialing organizations. A credentialing organization may wish to partner with interested parties such as educators, employers and/or accreditors to develop a joint description of the requirements to practice in a role or profession. The resulting competency model can then be used by different stakeholders for different purposes. The delineation may include competencies that exceed, in breadth or depth, what the organization assesses on its entry-to-practice certification examination. This approach is used quite commonly in the Canadian regulatory community.
For example, a credentialing organization wanting to focus on the continuing-competence aspects of its program might develop a competency framework that articulates how a professional grows and develops post-certification; its competency model may therefore include various career stages and proficiency levels. The developmental progressions created in this type of study can inform self-assessment, professional development and career laddering. Organizations can even develop mechanisms that help certificants target their continuing education within the delineated competency progressions. Advanced competencies may also inform educators or employers in developing training opportunities that serve both individuals and organizations.
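As a purely illustrative sketch, a delineated progression of this kind could be represented as structured data and used to point a certificant toward the next proficiency levels. The competency, level descriptions and helper function below are invented for illustration and do not reflect any particular organization's framework.

```python
# Hypothetical sketch of a post-certification competency progression.
# The competency, proficiency levels and behavioral indicators are invented
# examples; a real framework would be derived from a competency modeling study.

progression = {
    "competency": "Collaborates across the care team",
    "levels": {
        1: "Shares relevant information when asked",
        2: "Proactively coordinates with other disciplines",
        3: "Leads interdisciplinary planning and mentors colleagues",
    },
}


def suggest_development(current_level, target_level, progression):
    """List the level descriptions between a certificant's self-assessed level
    and a target level, as a simple continuing-education focus list."""
    return [
        progression["levels"][level]
        for level in range(current_level + 1, target_level + 1)
    ]


# Example: a certificant self-assessed at level 1 who is aiming for level 3.
for focus in suggest_development(1, 3, progression):
    print("Development focus:", focus)
```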
A full treatment of competencies is beyond the scope of this article. We encourage the reader to consult the I.C.E. Research and Development Committee's 2019 report on competency modeling (Risk, Fabrey & Muenzen, 2019) for a deeper dive into the subject. Another excellent resource is Mark Raymond's article in the Winter 2017/18 issue of CLEAR Exam Review (Raymond, 2018). While the authors do not expect this article to provide definitive answers on this evolving topic, we hope it has provided food for thought on the meaning and uses of job analysis and competency modeling.
References
Institute for Credentialing Excellence (2021). National Commission for Certifying Agencies Standards for the Accreditation of Certification Programs. Washington, DC: Institute for Credentialing Excellence.
Institute for Credentialing Excellence (2020). I.C.E. Basic Guide to Credentialing Terminology (2nd Edition). Washington, DC: Institute for Credentialing Excellence.
Raymond, M. (2018). Integrating Competency Modeling with Traditional Job and Practice Analysis. CLEAR Exam Review, Volume XXVII, No. 2, Winter 2017/18. https://www.nbme.org/sites/default/files/2020-01/Raymond2018_CER_Competency.pdf
Risk, N., Fabrey, L., & Muenzen, P. (2019). Competency Modeling. Washington, DC: Institute for Credentialing Excellence.