
The MOST Assessment: How Empirical Validation Is Reshaping Organization Development Practice and Professionalization

Abstract: Organization Development has long struggled with establishing empirically validated competency frameworks that balance theoretical rigor with practical application. The recent publication of the MOST (Mastering Organizational & Societal Transformation) competency model represents a significant step toward professionalizing OD practice. Grounded in socio-technical systems theory and validated through psychometric testing with over 1,100 participants, the MOST Assessment provides a research-based framework for defining and developing OD capabilities. This article examines the professional landscape that necessitated such validation, analyzes the consequences of competency ambiguity in OD, and presents evidence-based strategies for leveraging validated competency models to enhance professional credibility, inform workforce planning, and support the field's evolution toward mainstream recognition.

For decades, Organization Development practitioners have navigated a paradox: while the field espouses evidence-based practice and systematic inquiry, it has historically lacked empirically validated frameworks for defining professional competence itself. This gap creates practical challenges for educators designing curricula, employers seeking qualified practitioners, and professionals attempting to benchmark their capabilities against recognized standards. The recent validation and publication of the MOST competency model in the Journal of Applied Behavioral Science addresses this longstanding need by providing a psychometrically tested framework anchored in socio-technical systems theory (Brendel & Chang, 2025, as announced in professional communications).


As organizations face accelerating technological disruption, demographic shifts, and evolving employee expectations, demand for change leadership expertise has intensified. Yet without validated competency models, stakeholders struggle to distinguish qualified OD practitioners from those with adjacent but fundamentally different skill sets. Organizations investing in transformation initiatives need frameworks for assessing whether their OD resources possess capabilities aligned with strategic objectives. Academic programs require structures that connect learning outcomes with employability. The profession itself needs coherent identity markers that support collective advancement while honoring contextual diversity.


The Organization Development Professional Landscape

Defining Competency Validation in the OD Context


Competency validation in professional contexts involves systematic testing to ensure that identified skills, knowledge, and behaviors can be reliably measured and meaningfully distinguished from one another. The validation process typically includes reliability analysis to ensure measurement consistency, exploratory factor analysis to identify underlying competency structures, and confirmatory factor analysis to test whether proposed models fit observed data (Schippmann et al., 2000).
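
To make these steps concrete, the sketch below illustrates two of them in Python: computing Cronbach's alpha for internal-consistency reliability and fitting a three-factor exploratory model. This is a minimal illustration, not the published MOST analysis; the sample data, item names, and factor labels are assumptions.

import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

# Hypothetical Likert-scale responses: rows = participants, columns = survey items.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(1100, 12)),
                         columns=[f"item_{i}" for i in range(1, 13)])

# Purely random data, so alpha will be near zero; items on a real scale should correlate.
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")

# Exploratory factor analysis with three factors, mirroring the hypothesized
# Social / Technical / Influence domains (the column labels are assumptions).
fa = FactorAnalysis(n_components=3, rotation="varimax").fit(responses)
loadings = pd.DataFrame(fa.components_.T, index=responses.columns,
                        columns=["Social", "Technical", "Influence"])
print(loadings.round(2))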


The MOST model addresses validation through its socio-technical systems foundation, which organizes competencies around three domains: Social (interpersonal and relational capabilities), Technical (methodological and analytical skills), and Influence (strategic impact and ethical leadership). According to the publication announcement, psychometric validation involved reliability testing, exploratory factor analysis, and confirmatory factor analysis with over 1,100 participants. This methodological approach distinguishes the MOST Assessment from earlier frameworks that, while conceptually valuable, lacked empirical testing to confirm their structural validity or measurement reliability.


State of Practice: The Need for Validated Frameworks


Within many established professions—medicine, engineering, accounting—competency frameworks play central roles in credentialing, professional development, and quality assurance. These frameworks typically evolve through cycles of expert consensus, empirical validation, field testing, and refinement (Campion et al., 2011).


OD's developmental trajectory has differed. While several competency frameworks have emerged over recent decades, most have relied primarily on expert consensus and qualitative synthesis rather than psychometric validation. Such approaches honor practitioner wisdom and theoretical coherence but leave critical questions unanswered: Do proposed competencies genuinely cluster as frameworks suggest? Can they be reliably measured? Do assessment results remain stable across different evaluators or measurement occasions?


Research on competency modeling emphasizes that effective frameworks require both conceptual clarity and empirical validation (Campion et al., 2011). Conceptual clarity ensures that competencies are theoretically grounded and practically meaningful. Empirical validation confirms that competencies can be reliably assessed and that the proposed framework structure reflects actual patterns in how capabilities cluster. The MOST framework's grounding in socio-technical systems theory provides conceptual clarity, while its psychometric validation provides empirical confirmation—a combination representing an important methodological advance for the OD field.


Organizational and Individual Consequences of Competency Ambiguity

Organizational Performance Impacts


When organizations lack validated frameworks for assessing OD competence, several challenges emerge. First, hiring and selection uncertainty increases when decision-makers cannot reliably evaluate practitioner capabilities. Meta-analytic research demonstrates that structured assessment approaches significantly outperform unstructured interviews and credential screening alone (Schmidt & Hunter, 1998). Without empirically grounded competency models, organizations default to less structured selection methods that may be less predictive of subsequent performance.


Second, professional development targeting becomes more challenging when organizations cannot accurately diagnose capability gaps or prioritize learning investments. Research on training effectiveness indicates that development interventions aligned with clear competency frameworks tend to produce stronger performance improvements than generic programs (Salas et al., 2012). When organizations understand which specific capabilities need development and can measure baseline competency levels, they can design more focused learning experiences and track progress more systematically.


Third, strategic alignment conversations may become more difficult when organizational leaders and OD practitioners lack shared language for defining value creation. Competency frameworks that explicitly connect capabilities to outcomes can facilitate dialogue about how specific expertise translates into organizational impact.


The publication announcement for the MOST Assessment describes its application in the Ontario Healthcare System, where leaders used competency data to identify complementary strengths across OD teams and target capability gaps affecting organizational goals. This approach reportedly supported workforce planning by revealing which competency combinations contributed to successful transformation initiatives and where development investments might yield strategic return.


Individual Wellbeing and Professional Development Impacts


Competency ambiguity also affects individual practitioners. Career navigation challenges arise when individuals cannot reliably assess their developmental needs or progress toward mastery. Research on professional identity formation emphasizes that clear competency frameworks can support self-directed learning and reflective practice by providing concrete benchmarks for self-assessment (Ibarra, 1999).


Professional legitimacy represents another individual-level concern. Research on professionalization processes indicates that fields with well-established competency frameworks and credentialing systems tend to achieve stronger external recognition of practitioner expertise (Abbott, 1988). For OD specifically, historical reliance on non-validated frameworks may have contributed to perceptions of the field as less professionalized than adjacent disciplines with more established assessment systems.


Validated competency frameworks may support several positive individual outcomes: more transparent feedback, by establishing shared criteria for evaluating performance and growth; stronger mentoring relationships, by giving developmental conversations a structured frame; and a firmer sense of professional community, by establishing common language and standards across diverse contexts.

Evidence-Based Organizational Responses


Competency-Based Hiring and Selection Systems


Organizations seeking to strengthen capabilities in professional domains increasingly adopt structured, competency-based selection approaches. The logic is well-established: selection methods anchored in clear competency models tend to predict job performance more accurately than traditional approaches emphasizing credentials and unstructured interviews (Schmidt & Hunter, 1998).


Schmidt and Hunter's (1998) meta-analysis examined 85 years of findings on selection method validity, finding that general mental ability tests combined with structured interviews or work sample tests provided the highest validity for predicting job performance. Unstructured interviews showed considerably lower validity coefficients. While this research examined selection broadly rather than OD roles specifically, the principles apply: structured, competency-based assessment approaches tend to predict performance more accurately than informal evaluation methods.
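
To clarify what a "validity coefficient" means in this literature, the short sketch below correlates hypothetical structured-interview scores with later performance ratings; the Pearson r it produces is the validity coefficient in Schmidt and Hunter's sense. The data are simulated under an assumed true correlation of .50.

import numpy as np

# Hypothetical data: interview scores at hire and performance ratings one year
# later for 200 hires, simulated with a built-in correlation of about .50.
rng = np.random.default_rng(1)
interview = rng.normal(size=200)
performance = 0.5 * interview + rng.normal(scale=(1 - 0.5**2) ** 0.5, size=200)

# The validity coefficient is the Pearson correlation between the predictor
# (interview score) and the later criterion (performance rating).
validity = np.corrcoef(interview, performance)[0, 1]
print(f"Estimated validity coefficient r = {validity:.2f}")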


Approaches to competency-based selection that have shown effectiveness in professional contexts include:


  • Structured behavioral interviews organized around specific competency domains, asking candidates to describe past situations demonstrating relevant capabilities

  • Work sample exercises requiring candidates to perform tasks representative of actual role responsibilities

  • Multiple assessment methods combining interviews, exercises, and other evaluation approaches to gather convergent evidence

  • Standardized evaluation criteria using competency frameworks to create consistent rating dimensions across all candidates

  • Diverse assessment perspectives including multiple evaluators to reduce individual bias


Organizations implementing competency-based OD selection would benefit from tracking outcomes to build field-specific evidence about which selection approaches best predict subsequent performance.


Strategic Professional Development and Capability Building


Validated competency frameworks enable organizations to design professional development systems that build capabilities systematically. Research on adult learning and expertise development emphasizes the importance of deliberate practice targeting specific competencies with appropriate challenge and feedback (Ericsson et al., 2007). Competency models provide structure for such targeted development by clarifying which capabilities need strengthening and establishing benchmarks for measuring progress.


Evidence-based approaches to capability building informed by adult learning research include:


  • Competency gap analysis using validated assessments to identify individual and collective development priorities (a minimal sketch follows this list)

  • Focused learning objectives designing development experiences targeting specific competencies rather than addressing broad, undefined skill areas

  • Practice-based learning providing opportunities to apply developing competencies in realistic contexts with appropriate support and feedback

  • Progress monitoring using periodic reassessment to track competency development and adjust learning strategies

  • Self-directed learning support helping practitioners identify their own competency gaps and create personalized development plans
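
The gap-analysis step referenced above can be sketched simply; the competency names, current scores, and target levels below are hypothetical, and a real application would draw scores from a validated instrument and targets from role profiles.

import pandas as pd

# Hypothetical current scores (e.g., from a validated assessment) and
# role-based target levels on a 1-5 scale; all names are illustrative.
current = pd.Series({"process consultation": 3.2, "data analytics": 2.1,
                     "change architecture": 3.8, "ethical influence": 4.1})
target = pd.Series({"process consultation": 4.0, "data analytics": 4.0,
                    "change architecture": 4.0, "ethical influence": 4.0})

# Gap = target - current; sorting descending surfaces development priorities.
gaps = (target - current).sort_values(ascending=False)
print(gaps[gaps > 0])  # competencies needing investment, largest gap first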


Research on training effectiveness supports these approaches. Salas and colleagues' (2012) comprehensive review identified several factors consistently associated with learning transfer: needs analysis to identify specific development targets, learning objectives clearly linked to desired competencies, practice opportunities in realistic contexts, feedback on performance, and post-training support for application. Competency frameworks facilitate each of these elements by providing clear specifications of target capabilities.


Curriculum Design and Academic Program Development


Academic institutions preparing future practitioners face the challenge of translating broad field knowledge into specific learning outcomes that employers value and students can develop. Validated competency frameworks provide potential grounding for these translation efforts, connecting theoretical concepts with observable capabilities.


Approaches to competency-informed curriculum design that have been explored in professional education include:


  • Learning outcome mapping aligning course objectives with competency frameworks to ensure systematic coverage

  • Practice-based pedagogy designing assignments and projects that require competency application in realistic contexts

  • Competency scaffolding sequencing learning experiences to build foundational capabilities before progressing to advanced integration

  • Student self-assessment using competency frameworks to support learner reflection and developmental planning

  • Program evaluation measuring aggregate student competency development to assess curriculum effectiveness


The MOST Assessment potentially provides OD educators with empirically validated tools for supporting some of these curriculum functions. Faculty could use the assessment to measure student competency development over program duration, identify which courses or experiences contribute most to specific competency growth, and gather data for continuous curriculum improvement.


Building Long-Term Competency-Based Practice Infrastructure

Continuous Competency Framework Evolution


Professional competency frameworks require ongoing refinement as practice contexts, methodological approaches, and theoretical understanding develop. The MOST model's design reportedly anticipates this evolution through its socio-technical systems structure, which can accommodate new practices while maintaining conceptual coherence (as described in the publication announcement).


This adaptive approach aligns with established principles of professional knowledge development. Abbott's (1988) analysis emphasizes that professional knowledge systems must balance abstraction—which enables broad application—with enough specificity to guide practice.


Mechanisms that could support framework evolution include:


  • Ongoing validation studies periodically retesting framework structure and measurement properties

  • Cross-context testing examining framework applicability in diverse settings

  • Stakeholder feedback systems gathering input about framework utility and gaps

  • Integration of emerging practices systematically reviewing new methods to identify potential competency implications


The MOST model's developers indicate commitment to keeping the assessment freely available to support widespread use and continuous refinement. This approach aligns with open science principles increasingly recognized as valuable for cumulative knowledge development (Vicente-Saez & Martinez-Fuentes, 2018). Open access enables broader testing across diverse contexts, potentially generating evidence more rapidly than proprietary models.


Competency-Based Research and Evidence Generation


Validated competency frameworks enable research examining which capabilities predict outcomes in various contexts, how competencies develop over time, and which interventions effectively build specific capabilities. Such research could strengthen evidence for effective practice by moving beyond broad assertions about OD value toward more specific understanding of which competencies matter most in which situations.


Research directions enabled by validated competency frameworks include:


  • Competency-outcome relationships investigating which competencies or combinations predict intervention success across organizational contexts (a simple regression illustration follows this list)

  • Developmental trajectories tracking how competencies develop throughout careers

  • Learning intervention effectiveness testing which pedagogical approaches or experiences most effectively build particular competencies

  • Assessment validity studying whether competency assessment results predict subsequent performance
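
As one illustration of the first direction, the sketch below fits a logistic regression predicting a binary intervention-success indicator from competency-domain scores. The data and variable names are hypothetical; actual studies would require validated outcome measures and far more careful designs.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical dataset: three competency-domain scores per practitioner and a
# binary flag for whether their transformation initiative met its goals.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 3))  # columns: Social, Technical, Influence (assumed)
logits = 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.6 * X[:, 2]
y = (logits + rng.normal(size=300) > 0).astype(int)

model = LogisticRegression().fit(X, y)
for name, coef in zip(["Social", "Technical", "Influence"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")  # coefficient signs/sizes hint at predictive weight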


This research agenda aligns with broader calls for evidence-based management and practice (Rousseau, 2006). Rousseau's framework emphasizes using best available scientific evidence to inform decisions, combined with practitioner expertise and stakeholder values. Competency research could contribute to the scientific evidence base while acknowledging that competency application always involves professional judgment adapted to specific contexts.


Conclusion

The validation and publication of the MOST competency framework represents an important development in OD's evolution as a profession. By subjecting a theoretically grounded competency model to rigorous psychometric testing, the framework's developers have provided infrastructure that could support more systematic professional practice, development, and research.


The organizational and educational applications examined here—competency-based hiring, structured development systems, and curriculum design—demonstrate potential uses for validated frameworks. While most specific OD applications lack published outcome research, these approaches build on established principles from personnel psychology, adult learning, and professional education that have shown effectiveness in other contexts.


Several themes emerge from this analysis. First, empirical validation matters for professional frameworks that aspire to support high-stakes decisions about hiring, development, and credentialing. Second, professional infrastructure requires ongoing stewardship rather than one-time development. Effective frameworks must evolve as practice contexts and theoretical understanding develop. Third, application generates evidence that strengthens frameworks over time. As organizations implement competency-based practices and measure outcomes, understanding deepens about which approaches work in which contexts.


Looking forward, the field's opportunity lies in building on this infrastructure through systematic application and research. Organizations implementing competency-based practices could document their approaches and outcomes. Researchers could examine competency-outcome relationships and developmental pathways. Educators could track how competency-based curriculum designs affect student learning. Professional associations could explore competency-based credentialing while studying its effects on practice quality.


The MOST framework provides a foundation for such efforts by offering an empirically validated, freely accessible competency model. By demonstrating that rigorous competency validation in OD is feasible and valuable, this work invites the field to embrace higher standards for professional infrastructure, ultimately strengthening OD's capacity to contribute to organizational and societal wellbeing.


References

  1. Abbott, A. (1988). The system of professions: An essay on the division of expert labor. University of Chicago Press.

  2. Campion, M. A., Fink, A. A., Ruggeberg, B. J., Carr, L., Phillips, G. M., & Odman, R. B. (2011). Doing competencies well: Best practices in competency modeling. Personnel Psychology, 64(1), 225–262.

  3. Ericsson, K. A., Prietula, M. J., & Cokely, E. T. (2007). The making of an expert. Harvard Business Review, 85(7), 114–121.

  4. Ibarra, H. (1999). Provisional selves: Experimenting with image and identity in professional adaptation. Administrative Science Quarterly, 44(4), 764–791.

  5. Rousseau, D. M. (2006). Is there such a thing as "evidence-based management"? Academy of Management Review, 31(2), 256–269.

  6. Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74–101.

  7. Schippmann, J. S., Ash, R. A., Battista, M., Carr, L., Eyde, L. D., Hesketh, B., Kehoe, J., Pearlman, K., Prien, E. P., & Sanchez, J. I. (2000). The practice of competency modeling. Personnel Psychology, 53(3), 703–740.

  8. Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274.

  9. Vicente-Saez, R., & Martinez-Fuentes, C. (2018). Open science now: A systematic literature review for an integrated definition. Journal of Business Research, 88, 428–436.


Jonathan H. Westover, PhD is Chief Academic & Learning Officer (HCI Academy); Associate Dean and Director of HR Programs (WGU); Professor, Organizational Leadership (UVU); OD/HR/Leadership Consultant (Human Capital Innovations).

Suggested Citation: Westover, J. H. (2025). The MOST Assessment: How Empirical Validation Is Reshaping Organization Development Practice and Professionalization. Human Capital Leadership Review, 27(3). doi.org/10.70175/hclreview.2020.27.3.1

Human Capital Leadership Review

eISSN 2693-9452 (online)
