
BUILDING AN EVALUATION FRAMEWORK FOR A COMPETENCY-BASED GRADUATE PROGRAM AT THE UNIVERSITY OF SOUTHERN MISSISSIPPI

Cyndi H. Gaudet Heather M. Annulis John J. Kmiec Jr.

This article describes an ongoing project to build an evaluation framework for a competency-based graduate program at The University of Southern Mississippi. Many traditional methods of evaluating performance at academic institutions provide only a partial assessment of individual program performance. In an increasingly competitive global economy, program evaluations should also consider the perspectives of students, graduates, and employers in order to develop curricula that address the critical skill sets needed for strategic and value-added performance improvement work in the 21st century.

THIS ARTICLE DESCRIBES an ongoing project to build a comprehensive evaluation framework for the competency-based Master of Science in Workforce Training and Development (MSWTD) program at The University of Southern Mississippi (USM). First, it discusses some trends and issues in evaluating the performance of higher education programs in the United States. Second, it looks at the approach to evaluating the MSWTD program as an example of how to use a proven evaluation framework for ensuring that graduates are equipped with the necessary competencies for dynamic careers in workplace learning and performance (WLP) improvement. Third, it shares some lessons learned during the first phase of the evaluation project and some plans for the future application of the evaluation strategy.

TRENDS AND ISSUES IN EVALUATING HIGHER EDUCATION


Performance Improvement, vol. 47, no. 1, January 2008. © 2008 International Society for Performance Improvement. Published online in Wiley InterScience (www.interscience.wiley.com).

Many traditional methods of evaluating the performance of academic institutions (tracking enrollments, retentions, grades, publications, grants, accreditations, and so forth), although important and useful to the institutions themselves, provide an incomplete picture of the value of higher education in the intensely competitive global economy of today. The Southern Association of Colleges and Schools Commission on Colleges (SACS-COC) administers the traditional accreditation process. SACS accreditation is a form of self-regulation that is strictly voluntary and based on standards developed by member institutions that develop, amend, and approve accreditation requirements (SACS-COC, 2006, p. 2). A more comprehensive evaluation framework must go beyond institutional self-regulation and consider the perspectives of many other stakeholders, including students, faculty, staff, graduates, and the employers of graduates. From the students' and graduates' points of view, for example, earnings matter. After all, an education represents a major investment in time, effort, and money. Indeed, the promise of increased earnings over one's lifetime is a commonly accepted motivation for pursuing a college education in the first place. From the employer's perspective, the motivation to hire, retain, and advance employees is directly related to the employees' ability to produce meaningful business results for the organization.

DOI: 10.1002/pfi.176

From this perspective, what matters most is what the graduates do for their organizations as a result of their education. That is, the evaluation of an education's effectiveness should extend beyond the college or university to the performance of the graduates for the organizations that hire them. The evaluation should extend beyond tracking enrollments, retentions, grades, publications, grants, and accreditations to examining the value or results achieved. Such a results-based approach is consistent with SACS accreditation standards, which state that the onus is on each member institution to demonstrate institutional commitment to the concept of quality enhancement through continuous assessment and improvement (SACS-COC, 2006, p. 2). Evaluating the impact of higher learning and engaging in continuous improvement matters. According to a report from the National Center for Public Policy and Higher Education, higher education is now seen as a crucial component of cross-border economic competitiveness (Wagner, 2006, p. 2). Although evaluating higher education through the perspectives of colleges, universities, students, and graduates is important, evaluating higher education's impact on and contributions to employers and the nation has become a higher priority in recent years. Unfortunately, measuring the impact of a higher education on graduates' competencies and contributions in the workplace has not always been a priority. Trends in globalization, increased competition, and the national skills gap all make a strong case for building competency-based learning, development, and evaluation into workforce development models, especially those that pertain to institutions of higher learning. In this era of rapidly expanding, intense global competition, there are more high-demand, high-growth occupations in the United States than skilled workers to fill them. U.S. Secretary of Labor Elaine Chao has said of this situation, which is commonly referred to as the skills gap:

The biotechnology, geospatial technology, health care, financial services, and the skilled trades are just a few of the areas that have been defined as high growth, emerging areas. Leaders in these industries are telling us the same thing: they can't find enough workers with the right skills for these high-skilled, good-paying jobs. (U.S. Department of Labor, 2006)

In fact, the literature shows widespread agreement across many groups that the skills gap is a real concern in the United States. What is being done to overcome this problem? One recent and increasingly popular solution to the skills shortage involves the adoption of competency models. The President's High Growth Job Training Initiative identified industry competency models as one of several key workforce solutions that support the development of talent with the right skills for the jobs of the future (CareerOneStop, 2006). According to the U.S. Department of Health and Human Services (HHS):

The competency model is future-oriented, describing an ideal workforce. The competencies that make up the model serve as the basis for employee management, since they play a key role in decisions on recruiting, employee development, personal development, and performance management. A competency model helps to bridge the gap between where an organization is now and where it wants to be in the future. This occurs in two ways. First, the competency model serves as a guide for management in making decisions, since it is based on the competencies that support the mission, vision, and goals of the organization. Second, the competency model serves as a map to guide employees toward achieving the mission of their organization and their functional area. This provides management and staff with a common understanding of the set of competencies and behaviors that are important to the organization. (U.S. Department of Health and Human Services, 1999)

Competency models organize the competencies (the knowledge, skills, abilities, and attributes) needed for exemplary performance in a particular role, work setting, job, occupation, or industry. How does this affect instructional design, delivery, and measurement, especially in higher education in the United States? A top priority for colleges and universities across the nation should be building the competencies needed by industry to keep the U.S. workforce competitive. But how will academic institutions know whether they are teaching the right things right? They will know by asking employers what they need in terms of the competencies required of graduates, by balancing those needs against national priorities for filling high-demand and high-growth jobs, by evaluating the effectiveness of educational programs in meeting the demands of industry, and by using the knowledge gained from evaluation to continuously improve the performance of educational programs and, in turn, the performance of graduates in the workplace. Examination of USM's MSWTD program provides an example of a competency-based instructional framework designed to ensure that graduates are prepared with the right knowledge, skills, and competencies for a dynamic career in workplace learning and performance improvement.



THE MSWTD PROGRAM

About the Program

Administered by the Department of Economic and Workforce Development at The University of Southern Mississippi (USM-DEWD), the main focus of the MSWTD program is to prepare workplace learning and performance professionals to

define and design training and non-instructional interventions that can improve performance at the worker, the work process, and the organizational level. Human performance improvement in organizations requires more than an understanding of training design and delivery. The . . . program provides students with the tools they need to better understand factors that affect job performance such as job expectations, task design, incentive systems, feedback systems, performance strategies and tools, job aids, and resources. Students learn to think strategically and design interventions that will positively impact workplace learning and performance. (USM-DEWD, 2006)

Students in the MSWTD program come from a wide variety of backgrounds and organizations, including business, industry, government, education, and the military. Students join the program to prepare for or advance careers in (USM-DEWD, 2006):

Human performance consulting
Human resource development
Instructional design
Organizational development
Performance improvement
Training and development
Training management
Workforce development

Finally, the MSWTD program uses an innovative executive delivery format, including distance learning, project-based courses, and weekend class sessions, specifically designed to expand instructional opportunities beyond the traditional boundaries of on-campus class delivery and to meet the demanding lifestyles of working professionals (USM-DEWD, 2006).

ASTD Competency Model: The MSWTD Program Foundation

The instructional design of the MSWTD was founded on the American Society for Training and Development (ASTD) competency model for workplace learning and performance. As the industry standard, the ASTD competency model details the roles, competencies, and areas of expertise needed for exemplary performance in the profession. In this model:

Roles are groupings of targeted competencies. They are not job titles. An individual's job may encompass one or more roles, similar to different hats that one might have to wear.
Areas of expertise are the specialized knowledge/skills an individual needs over the foundational competencies. An individual may need expertise in one or more areas.
Foundational competencies define relevant behaviors for learning and performance professionals to varying degrees. (American Society for Training and Development, 2004)

By aligning MSWTD instructional objectives with the roles, areas of expertise, and competencies identified by ASTD, MSWTD program designers created a rigorous, research-based curriculum capable of meeting the demands of workplace learning and performance improvement professionals and their employers.

Individual Development Planning Process


Another important component of the program is a competency-based individual development planning (IDP) process for MSWTD students and graduates. Based on the ASTD competency model, the IDP process is designed to provide students with a practical tool for focusing on needed workplace knowledge, skills, and abilities.


While enrolled in the MSWTD program, each student will, under the guidance of an adviser, coach, or instructor:

1. Self-assess his or her individual level of expertise for each competency.
2. Select competencies for personal improvement.
3. Develop a personalized plan to improve his or her knowledge, skills, and abilities used to perform the selected competencies.
4. Implement the plan, and monitor and track progress toward the achievement of individual development goals.
5. Reassess the level of expertise, and plan further actions.

Because professional development is a career-long pursuit that each individual should continue throughout the program and long after graduation, the IDP process is designed to transition with the graduate into the workplace.
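Viewed as a procedure, the five-step IDP cycle is essentially a gap-analysis loop. The sketch below is purely illustrative: the competency names and the 5-point expertise scale come from the article, but the function, data structure, and target level are hypothetical, not part of any actual MSWTD toolset.

```python
# Hypothetical sketch of the IDP self-assessment cycle (not an actual MSWTD tool).
# Competency names and the 5-point scale follow the article; the target level is assumed.

TARGET_LEVEL = 3  # assumed target: self-assessed expertise of 3 or above on a 5-point scale

def select_for_improvement(self_assessment, target=TARGET_LEVEL):
    """Step 2: pick the competencies whose self-assessed level falls below the target."""
    return sorted(
        (name for name, level in self_assessment.items() if level < target),
        key=lambda name: self_assessment[name],  # weakest competencies first
    )

# Step 1: self-assess each competency (1 = novice ... 5 = expert)
assessment = {
    "Analyzing needs and proposing solutions": 2,
    "Building trust": 4,
    "Communicating effectively": 3,
    "Demonstrating adaptability": 2,
    "Leveraging diversity": 5,
    "Planning and implementing assignments": 1,
}

plan = select_for_improvement(assessment)
print(plan)  # weakest competencies listed first
# Steps 3-5 (plan, implement, reassess) would repeat this loop each term and after graduation.
```

Reassessing (step 5) simply updates the dictionary and reruns the selection, which is what makes the process portable into the workplace after graduation.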

Evaluation Strategy Evolution

The MSWTD program initially evaluated program effectiveness through alumni surveys and anecdotal data; however, this approach provided only a partial account of program success. Subsequently, it was determined that a more systematic evaluation strategy was needed to identify (a) the extent to which the students applied what they learned in the program to their jobs, (b) how effectively students applied the learning in their work centers, and (c) the actual business impact, value added, or results of applied learning in the workplace. The first phase of strategy renewal began in August 2006, with the identification of MSWTD stakeholders and the purposes of program evaluation. The various stakeholders now included students and graduates, the Department of Economic and Workforce Development faculty and staff, the MSWTD program coordinator, and the graduates' employers. Further, evaluation purposes ranged from determining success in achieving program objectives, to testing the clarity and validity of tests, cases, and action learning projects, to ensuring leadership in the workplace learning and performance profession. The evaluation strategy used the Phillips five-level, return-on-investment (ROI) framework, summarized in Table 1. The evaluation project now planned to collect a variety of data, ranging from student satisfaction with the MSWTD program to the impact of students' applied learning on specific employer business outcomes. Other key objectives targeted for evaluation included the extent to which the MSWTD program:

Prepared workplace learning and performance improvement professionals to define and design training and non-instructional interventions that can improve performance at the worker, the work process, and the organizational level (USM-DEWD, 2006)
Enabled individual development planning through provision of a practical development tool

Data collection without action is meaningless. The project team decided that the evaluation data collected could best be used with a continuous improvement process that employed the analyze-design-develop-implement-evaluate (ADDIE) framework. Subsequently, the data collection, analysis, and measurement process was designed to:

Analyze MSWTD goals, the competency model, and evaluation data to determine needs, gaps, and opportunities; set priorities; isolate causes; and list possible solutions, remedies, or interventions.
Design a strategy to close the gaps (with action plans, definition documents, an infrastructure checklist, a work breakdown structure, a PERT chart, or a Gantt chart as needed).
Develop the capacity to execute the plan (get approval, build a team, assign roles and responsibilities, find a champion, train and equip, and so forth).
Implement the action plan (actively manage the evaluation project; test or pilot the plan on a small scale first when possible).
Evaluate at each stage of the data collection process to ensure continuous improvement of the MSWTD program and its competency model (make adjustments as needed).

LESSONS LEARNED AND PLANNED ACTIONS


By the time the first phase of the evaluation strategy concluded in December 2006, the team had learned the following three lessons:

1. Gathering accurate application and impact data from both graduates and employers would be difficult using only a postgraduate survey. Organizational culture, interpersonal relations, and fear of consequences, for example, might cause graduates to deny evaluators permission to query immediate managers for application and impact information, or might skew the data provided by the graduates. Therefore, data would be collected from graduates only.


TABLE 1: PHILLIPS EVALUATION FRAMEWORK

Level 1
  Types of data: Reaction, satisfaction, and planned action. Target: 100% of programs offered (typical).
  Definition: Measures participant satisfaction with the program and captures planned actions, if appropriate.
  Focus: Response and planned actions related to the learning intervention or solution.

Level 2
  Types of data: Learning. Target: 50% of programs offered (typical).
  Definition: Measures changes in knowledge, skills, and abilities.
  Focus: The participant's learning gains.

Level 3
  Types of data: Application and implementation. Target: 30% of programs offered (typical).
  Definition: Measures changes in on-the-job behavior or performance.
  Focus: The participant's behavior in the work setting, and enablers of and barriers to application of learning.

Level 4
  Types of data: Business impact. Target: 10% of programs offered (typical).
  Definition: Measures changes in business impact variables.
  Focus: The impact of applied learning and on-the-job performance on specific organizational outcomes.

Level 5
  Types of data: Return on investment (ROI). Target: 5% of programs offered (typical).
  Definition: Compares program benefits to the cost of the program.
  Focus: Monetary benefits derived from the program or solution.

Intangible benefits
  Definition: Refers to measures not converted to monetary benefits.
  Focus: The added value and impact of learning and on-the-job performance in nonmonetary terms.

Note: Most learning interventions or performance improvement solutions are not evaluated at all five levels. The level of evaluation and the types of data reported are determined by the specific needs of key stakeholders. Most programs are evaluated at level 1 (reaction) or level 2 (learning), or both. A lesser number of programs are evaluated at level 3 (application). Only the most critical programs are evaluated at level 4 (business impact) or level 5 (ROI), or both. Computing the ROI of training at level 5 involves four steps: (a) isolating the effects of training, (b) converting these effects (benefits) into monetary values, (c) calculating the training costs, and (d) comparing the value of the effects to the costs. The ROI is calculated with this equation: ROI (%) = (Net Program Benefits / Total Incurred Costs) × 100.

Note: From Certification pre-work guide: Measuring the return on investment in human resources and training and development programs, by the ROI Institute, 2006, pp. 1, 21. Copyright 2006 by the ROI Institute, Inc. Adapted with permission.
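The level 5 computation is straightforward arithmetic once net benefits and costs are known. A minimal worked example follows; the dollar figures are invented for illustration only, and just the formula comes from the ROI Institute note.

```python
def roi_percent(net_program_benefits, total_incurred_costs):
    """ROI (%) = (Net Program Benefits / Total Incurred Costs) x 100."""
    return net_program_benefits / total_incurred_costs * 100

# Illustrative figures only -- not MSWTD data.
benefits = 150_000  # monetary value of program effects, after isolation and conversion
costs = 100_000     # fully loaded program costs
net_benefits = benefits - costs

print(roi_percent(net_benefits, costs))  # 50.0, i.e., a 50% return
```

A program whose benefits exactly equal its costs yields 0%; the 20% ROI target set later in the article implies net benefits of at least one-fifth of total incurred costs.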

2. Not every graduate would apply every competency addressed in the program in the work center. For evaluation consistency and manageability, a few critical competencies or roles would be identified.

3. The MSWTD program evaluation and continuous improvement model, like the rest of the project components, should remain open to refinement and continuous improvement so as to be adaptable to changing needs.

Subsequently, the second phase of the MSWTD evaluation strategy began in January 2007. Although taking a more focused approach, the overall purpose of the project remained the same: support the continuous improvement goals of the MSWTD program and measure its effectiveness in meeting key program objectives, including:

Preparing workplace learning and performance professionals
Enabling individual development planning
Ensuring satisfaction (by measuring participant and stakeholder reaction to, satisfaction with, and planned actions toward the MSWTD program)
Ensuring achievement of select learning outcomes (by assessing student learning gains in relation to key components of the WLP competency model)
Ensuring achievement of select application outcomes (by assessing the extent to which and frequency with which MSWTD students have applied, or used, learned knowledge or critical skills, relative to the WLP competency model, in their respective workplaces)

Planned deliverables associated with the updated evaluation project included a revised evaluation project plan; a data analysis strategy; and an MSWTD Program Balanced Scorecard reflecting program success across the multiple measures of effectiveness described above.

Evaluation Project Plan: Phase Two

The revised evaluation plan identifies key steps, milestones, and completion dates for accomplishing evaluation objectives, starting with the planning process and ending when the data are communicated to target audiences. Phase two program evaluation focuses on a comparatively narrow set of select, industry-standard competencies that have broad application to job success in each of the four key WLP roles defined in ASTD's competency study (ASTD, 2004): learning strategist, business partner, project manager, and professional specialist. These roles also have broad application to human performance consultants, professionals, and practitioners. These roles are associated with competency in the following activities:

Analyzing needs and proposing solutions: Identifying and understanding business issues and client needs, problems, and opportunities; comparing data from different sources to draw conclusions; using effective approaches for choosing a course of action or developing appropriate solutions; taking action that is consistent with available facts, constraints, and probable consequences
Building trust: Interacting with others in a way that gives them confidence in one's intentions and those of the organization
Communicating effectively: Expressing thoughts, feelings, and ideas in a clear, concise, and compelling manner in both individual and group situations; actively listening to others; adjusting one's style to capture the attention of the audience; developing and deploying targeted communication strategies that inform and build support
Demonstrating adaptability: Maintaining effectiveness when experiencing major changes in work tasks, the work environment, or conditions affecting the organization (for example, economic, political, cultural, or technological conditions); remaining open to new people, thoughts, and approaches; adjusting effectively to new work structures, processes, requirements, or cultures
Leveraging diversity: Appreciating and leveraging the capabilities, insights, and ideas of all individuals; working effectively with individuals having diverse styles, abilities, motivations, and backgrounds (including cultural differences)
Planning and implementing assignments: Developing action plans, obtaining resources, and completing assignments in a timely manner to ensure that performance goals are achieved on individual, process, and organizational levels

Table 2 illustrates how defined competency needs are linked to overall program objectives and how MSWTD program objectives link to the evaluation value chain. The business alignment focus led to a revised MSWTD data collection plan, shown in Table 3. This plan shows the


TABLE 2: MSWTD BUSINESS ALIGNMENT LINKAGE

Evaluation level 4
  Needs assessment (business needs): Improved business productivity through enhanced role effectiveness and WLP task efficiencies.
  Program objective (impact objectives): Increase graduate impact on defined business outcomes, including productivity and organizational efficiency.
  Evaluation (business impact): At 6 months, survey the extent of impact on defined business outcomes resulting from the graduates' application of the MSWTD core competencies.

Evaluation level 3
  Needs assessment (job performance needs): Enhanced effectiveness in WLP roles through application of MSWTD core competencies.
  Program objective (application objectives): Graduates consistently and effectively apply MSWTD core competencies on the job. Graduates report application of core competencies as critical to job success. Graduates report application of MSWTD IDP process elements toward their continued professional development. Graduates identify barriers to and enablers of application.
  Evaluation (application and implementation): At 6 months, survey the extent and frequency of applied knowledge and skills. Graduate self-assessment of knowledge and skill use and job relevance at 3 months. Individual development planning status at 3 months.

Evaluation level 2
  Needs assessment (skills and knowledge needs): Graduates acquire sufficient knowledge and skills needed to successfully apply the MSWTD program core competencies in the workplace. Graduates acquire sufficient knowledge and skills needed to successfully apply the MSWTD program core competencies toward continued professional development.
  Program objective (learning objectives): Graduates identify improved knowledge of select core competencies, involving analyzing needs and proposing solutions, building trust, communicating effectively, demonstrating adaptability, leveraging diversity, and planning and implementing WLP assignments. Graduates identify confidence in applying knowledge of designated core competencies.
  Evaluation (learning and confidence): At 6 months, survey the extent and frequency of applied knowledge and skills. Graduate self-assessment of knowledge and skill gains and perceived confidence with gained knowledge and skills at 3 months. Tests and evaluations during the program. Demonstrated proficiencies during the capstone project at the end of the program. Individual development planning during the program.

Evaluation level 1
  Needs assessment (preferences): Graduates perceive learning as relevant and important to successful job performance. Graduates report they would recommend the program to others. Graduates report intent to apply learning.
  Program objective (satisfaction objectives): Graduate rating of 4 or higher (on a 5-point scale) for relevance of instruction. 90% of graduates report they would recommend the program to others. 80% of graduates develop IDP plans to apply learning on the job.
  Evaluation (reaction and planned action): Reaction, satisfaction, and planned action questionnaire at the end of the program.

Note: From Certification pre-work guide: Measuring the return on investment in human resources and training and development, by the ROI Institute, 2006, pp. 1, 5, 6, 22. Copyright 2006 by the ROI Institute, Inc. Adapted with permission.


TABLE 3: MSWTD DATA COLLECTION PLAN

Level 1
  Broad program objectives: (1) Graduates perceive learning as relevant and important to successful job performance. (2) Graduates report they would recommend the program to others. (3) Graduates intend to apply learning.
  Measures/data: (1) Graduate rating of 4 or above for relevance and importance of learning. (2) 90% of graduates report they would recommend the program to others. (3) 80% of graduates develop IDP action plans to apply learning on the job.
  Data collection methods or instruments: Reaction, satisfaction, and planned action questionnaire.
  Data sources: Graduates.
  Timing: End of program.
  Responsibility: Faculty; MSWTD coordinator.

Level 2
  Broad program objectives: Graduates acquire knowledge and skill gains to successfully apply defined core competencies of the MSWTD program, including analyzing needs and proposing solutions, building trust, communicating effectively, demonstrating adaptability, leveraging diversity, and planning and implementing WLP assignments.
  Measures/data: (1) Minimum 3.0 cumulative grade point average for the program achieved. (2) 80% report knowledge gains in each designated core competency. (3) 80% report confidence in applying core competencies. (4) Capstone project passed, as determined by the graduate committee. (5) 80% self-assess level of expertise for each core competency as 3 or above on a 5-point scale.
  Data collection methods or instruments: (1) Tests and evaluations. (2) Capstone project book and presentation. (3) Individual development plans. (4) MSWTD program impact questionnaire.
  Data sources: Students; graduates; SOAR database.
  Timing: During program; end of program.
  Responsibility: MSWTD coordinator; graduate committee; faculty.

Level 3
  Broad program objectives: Graduates consistently and effectively apply MSWTD core competencies on the job. Graduates report use of core competencies as critical to job success.
  Measures/data: (1) 80% report frequent use of core competencies on the job. (2) 80% report effective use of core competencies on the job. (3) 80% agree that core competencies are critical to job success.
  Data collection methods or instruments: (1) Self-assessment questionnaire. (2) Individual development plans. (3) MSWTD program impact questionnaire.
  Data sources: Graduates.
  Timing: 3 months postprogram.
  Responsibility: MSWTD coordinator.

Level 4
  Broad program objectives: Increase graduate impact on defined business outcomes, including productivity and efficiency.
  Measures/data: Extent of impact on defined business outcomes resulting from graduates' application of the MSWTD core competencies.
  Data collection methods or instruments: MSWTD program impact questionnaire.
  Data sources: Graduates.
  Timing: 6 months postprogram.
  Responsibility: MSWTD coordinator.

Level 5
  Target ROI: 20%.
  Comment: To measure perceived cost benefit of applied core competencies.

TABLE 4: MSWTD ROI ANALYSIS PLAN

Data items: Extent of impact on defined business outcomes (productivity, efficiency).

Methods for isolating program or process effects: Participant estimates.

Method for converting data to monetary values: Assigned value(s) for efficiency or productivity increases, per expert or participant input.

Cost categories: Salaries and benefits of faculty and staff, overhead for office and classroom space, and administrative costs.

Intangible benefits: (1) Increased credibility for the university. (2) Opportunities given by alumni. (3) Relationships with business and industry. (4) Opportunities for the community. (5) Program enhancement from grants. (6) High-profile internships. (7) Student projects that give back to the university without added cost. (8) Attracting experts to do presentations, workshops, and classes.

Audience for final report: Director, Department of Economic and Workforce Development; Director, Jack and Patti Phillips Workplace Learning and Performance Institute; MSWTD program coordinator; faculty.

Other influences and issues during application: Organizational enablers and barriers; participants' motivation; nature of job role and function; development support network(s); opportunity to apply competencies.

Note: From ROI analysis plan [ROI Institute template], by the ROI Institute, 2004, http://www.roiinstitute.net/tools/. Copyright 2004 by the ROI Institute, Inc. Adapted with permission.

sources and methods of data collection that ensure the data collected are aligned with multiple evaluation objectives. For instance, business impact measures that may potentially be assigned and converted to monetary value for ROI analysis include:

Productivity measures in graduates' work unit, department, or agency (may reflect enhanced efficiency in workplace learning and performance)
Efficiency measures of business processes (including WLP work processes) in graduates' work unit, department, or agency

In general, projected productivity or efficiency gains related to graduates' applied competencies may include:

Improved time savings through increased role effectiveness
Less rework or duplication of effort
Reduced error rates
Improved quality of WLP services
Improved collaboration with business partners
Improved decision making due to better planning and analysis
Improved reporting capabilities

Finally, a targeted ROI analysis plan, shown in Table 4, was developed to ensure that MSWTD resource allocation decisions are based on relevant and supplemental cost-benefit data.
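Because Table 4 relies on participant estimates, a common practice in the Phillips methodology is to discount each estimate by the participant's own confidence in it before it enters the ROI calculation. The following is a hedged sketch of how a single time-savings estimate might be valued; all figures are invented, not drawn from MSWTD survey data, and the working-weeks assumption is illustrative.

```python
# Hedged sketch: valuing a participant-estimated productivity gain, discounted by
# the estimator's confidence. All figures are invented, not MSWTD data.

def annual_monetary_benefit(hours_saved_per_week, hourly_value, confidence, weeks_per_year=48):
    """Annualize a time-savings estimate and discount it by the estimator's
    stated confidence (0-1), per common ROI Institute practice."""
    return hours_saved_per_week * weeks_per_year * hourly_value * confidence

# A graduate estimates 3 hours/week saved through increased role effectiveness,
# values the time at $40/hour, and is 75% confident in the estimate.
print(annual_monetary_benefit(3, 40.0, 0.75))  # 4320.0
```

Summing such adjusted benefits across respondents, then subtracting the cost categories in Table 4, yields the net benefits used in the level 5 ROI formula.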

Balanced Scorecard Approach

To ensure the continued effectiveness of the MSWTD program, the project team will use a balanced scorecard approach to inform key stakeholders about program success. The scorecard will reflect success with data items across the multiple measures of targeted results listed in the data collection plans, drawing on both traditional sources (enrollments, retentions, grades, publications, grants, accreditations, and so forth) and nontraditional sources (graduate feedback on individual performance and business impact). The team will use progress reports to drive continuous improvement efforts and priorities, including the implementation of new postgraduate surveys and recommended learning management system options.

CONCLUSION

This article describes an ongoing evaluation strategy for the competency-based Master of Science in Workforce Training and Development program at The University of Southern Mississippi. Many of the traditional methods of evaluating the performance of academic institutions provide an incomplete picture of the value of higher education in today's intensely competitive global economy. A more complete evaluation strategy also considers the perspectives of students, graduates, and employers. This article examines the MSWTD program's use of a proven, research-based evaluation framework to meet that requirement, specifically a framework that measures how successfully competencies are gained, from diverse stakeholder perspectives. This evaluation and process improvement strategy is designed to advance the needs of students, graduates, and employers by better quantifying and increasing the value of an education in workplace learning and performance improvement. Finally, this article discusses lessons learned in the continual evaluation of the program and the actions planned. Future phases of evaluation will focus on incorporating additional postgraduate surveys and recommended learning management systems to capture and report data related to desired outcomes. Insights gained from this comprehensive evaluation effort have broad implications for other institutions and educators preparing students with the competencies needed for careers in any aspect of human performance improvement.

References

American Society for Training and Development. (2004). Competency study. Retrieved August 14, 2007, from http://www.astd.org/content/research/competencyStudy.htm

CareerOneStop. (2006). Competency Model Clearinghouse. Retrieved September 18, 2006, from http://www.careeronestop.org/competencymodel/learnCM.aspx

Southern Association of Colleges and Schools Commission on Colleges. (2006, December). Principles of accreditation: Foundations for quality enhancement (interim ed.). Retrieved August 15, 2007, from http://www.sacscoc.org/pdf/2007%20Interim%20Principles%20complete.pdf

U.S. Department of Health and Human Services. (1999, November). Building successful organizations: Workforce planning in HHS. Retrieved September 20, 2006, from http://www.hhs.gov/ohr/workforce/wfpguide.html

U.S. Department of Labor, Employment and Training Administration. (2006, May 24). The president's high growth job training initiative. Retrieved September 18, 2006, from http://www.doleta.gov/BRG/JobTrainInitiative


University of Southern Mississippi, Department of Economic and Workforce Development. (2006, August 11). Workforce training and development: An executive format program. Retrieved September 11, 2006, from http://www.usm.edu/wtd

Wagner, A. (2006, September). Measuring up internationally: Developing skills and knowledge for the global knowledge economy (National Center for Public Policy and Higher Education Report No. 06-7). Retrieved November 4, 2006, from http://www.highereducation.org/reports/muint/exec_summary.shtml

CYNDI H. GAUDET is director of the Jack and Patti Phillips Workplace Learning and Performance Institute (WLPI) and associate professor and coordinator for the HCD degree program at The University of Southern Mississippi (USM). Her cutting-edge workforce development research has received awards from NASA, the Southern Growth Policies Board, and the New Orleans Chapter of the American Society for Training and Development. One of the high-growth, high-technology research initiatives under her direction was a top-five finalist for the U.S. Department of Labor's 2005 Recognition of Excellence Award, Educating America's 21st-Century Workforce. She holds a BS degree and an MS degree in education from The University of Southern Mississippi and a PhD degree in human resource education and workforce development from Louisiana State University. She may be reached at cyndi.gaudet@usm.edu.

HEATHER M. ANNULIS is assistant director of the WLPI, assistant professor of WTD at USM, and co-principal investigator for the U.S. Department of Labor's Geospatial Technology Apprenticeship Program (GTAP) grant to develop the geospatial workforce. She directs the WLPI training and development certificate program offered for training and human resource development professionals as a systematic method for developing competencies. Her research has garnered numerous awards, and she was recently named one of Mississippi's Top 40 Under 40 by the Mississippi Business Journal. She holds a BS degree and a master's degree from The University of Louisiana at Lafayette and a doctorate in international development from USM. She may be reached at heather.annulis@usm.edu.

JOHN J. KMIEC JR. is a recent graduate of the USM Master of Science in Workforce Training and Development (MSWTD) program. Since joining the program in 2005, he has developed a conflict management course aimed at enhancing employee collaboration at Health First's three not-for-profit hospitals on Florida's Space Coast. He has helped to write a talent acquisition course for MasterCard International, and he has managed a successful performance improvement project for RPM Pizza to benefit line personnel at over 130 Domino's Pizza franchises in five states. He has also developed a competency model for the U.S. Air Force's Robins Noncommissioned Officers Academy and has built a strategic training evaluation plan for Health First, using the Phillips ROI model. He is an Air Force veteran with 27 years of service and is currently enrolled in USM's PhD in human capital development program. He may be reached at john.kmiec@usm.edu.
