Details about the metrics included in the Academic Portfolio Review (APR) are provided below. For updates on the implementation of the Academic Portfolio Review or to submit a question, please visit the OIRE Projects & Initiatives webpage.

As an additional reference, please see the APR task force's report, Recommendations for a Comprehensive Portfolio Review of Academic Programs at the University of Illinois Springfield, and the APR Frequently Asked Questions webpage.

Calculation of the Higher-Level Criteria

Three higher-level criteria were identified by the APR task force: Demand and Viability, Institutional and Community Impact, and Versatility and Effectiveness. The APR task force recommended that demand and viability represent 60% of the overall score, institutional and community impact represent 20% of the overall score, and versatility and effectiveness represent 20% of the overall score. A program's overall score will be displayed on the dashboard reports on a 10-point scale.

The following sections identify which metrics comprise each criterion and how each criterion score is calculated. Additional information about each metric can be found in the accordion menu at the bottom of this page.

Demand and Viability

Demand metrics include 5-year enrollment headcount, 5-year enrollment growth, Lightcast completion growth rate, and Lightcast occupation growth rate. The demand metric scores will be averaged to provide an overall score for demand. One metric, the program cost margin, is used for viability. Each metric will be displayed on the dashboard reports on a 2-point scale.

The overall demand and overall viability scores will be averaged for an overall demand and viability criterion score. A multiplier of 3 will be applied to have the demand and viability criterion represent 60% of the overall score. Each program's overall demand and viability score will be displayed on the dashboard reports on a 6-point scale.
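
As a rough illustration of this weighting (not part of the official APR methodology; the metric scores below are hypothetical), the demand and viability criterion score can be computed as follows. Treating NA metrics as simply omitted from the average is an assumption made for illustration.

```python
# Illustrative sketch of the demand and viability criterion score.
# Each metric score is on the 0-2 scale described above; None stands in for NA.

def average(scores):
    valid = [s for s in scores if s is not None]
    return sum(valid) / len(valid)

demand_scores = [2, 1, 1, 0]  # headcount, growth, completion growth, occupation growth
viability_scores = [2]        # program cost margin

demand = average(demand_scores)           # 1.0
viability = average(viability_scores)     # 2.0
criterion = (demand + viability) / 2 * 3  # multiplier of 3 yields the 6-point scale
print(criterion)                          # 4.5 out of 6
```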

Institutional and Community Impact

Institutional and community impact metrics include contribution to UIS service courses, external sponsorship, alumni engagement (originally proposed as advancement contributions), sponsoring of registered student organizations, community learning placements, community engagement (local), community engagement (broad-reaching), and community visibility. Each metric will be displayed on the dashboard reports on a 2-point scale.

The institutional and community impact metric scores will be averaged for an overall institutional and community impact criterion score. Each program's overall institutional and community impact score will be displayed on the dashboard reports on a 2-point scale.

Versatility and Effectiveness

Versatility metrics include accreditation, program review, curriculum revision, catalog maintenance, assessment plan, and ongoing program assessments. The versatility metric scores will be averaged to provide an overall score for versatility. Effectiveness metrics include 5-year average graduation rates, 5-year average program completion rates, time-to-degree completion, and external exam pass rate. The effectiveness metric scores will be averaged to provide an overall score for effectiveness.

The overall versatility and overall effectiveness scores will be averaged for an overall versatility and effectiveness criterion score. Each program's overall versatility and effectiveness score will be displayed on the dashboard reports on a 2-point scale.
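
Putting the three criteria together, a minimal sketch of the 10-point overall score (again using hypothetical criterion scores) looks like this:

```python
# Illustrative aggregation into the 10-point overall score: 6 + 2 + 2.
demand_and_viability = 4.5        # 0-6 scale, after the 3x multiplier
impact = 1.5                      # 0-2 scale
versatility_effectiveness = 1.75  # 0-2 scale

overall = demand_and_viability + impact + versatility_effectiveness
print(overall)  # 7.75 out of 10
```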

Categorization of Programs

The APR task force recommended that programs be categorized into one of three categories based on the ranking of their overall score. Programs with scores in the overall lower third should be earmarked for sunset, consolidation, or reform, with specific determinations being made by the Provost based on the following considerations: where programs fall in the distribution of lower-third scores; whether programs are in the lower third for both distribution rankings (total programs and by program type); whether there are gaps between market demand and enrollment, or between enrollment and graduation rates; specific scores in the institutional and community impact and versatility and effectiveness criteria and metrics that might indicate areas for growth or reform; and responses from programs following the review.

The APR task force recommended that programs with scores in the overall middle and upper thirds be granted a maintain or enrich status. They recommended that specific determinations regarding which programs to enrich be made by the Provost but encouraged consideration of the following: programs that fall on the cusp between the middle and top levels are strong candidates for enrichment or support; gaps between enrollment, market demand, and graduation scores are likely places for enrichment or reform; and plans for reform or change in the responses from programs following the review.

Metrics Definitions and Explanations

Last updated on May 27, 2025

5-year enrollment headcount

Determination: This was calculated by adding the headcount of students majoring in the program or the stand-alone minor at fall census over the past five years. These figures include students who have the program as their primary major or as their secondary major.

Data Source & Validation: Two OIRE staff members gathered data independently from Banner and compared results.

Value Range: Values range from 0 to 1,151.

Dashboard Example: 118 (0+2+29+45+42)

Interpretation: There were 0 students majoring in the program at census for Fall 2020, 2 students majoring in the program at census for Fall 2021, 29 students majoring in the program at census for Fall 2022, 45 students majoring in the program at census for Fall 2023, and 42 students majoring in the program at census for Fall 2024. The total of that program's fall census enrollment headcounts over the five years is 118 (0+2+29+45+42). 

Scoring: Values were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).
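
Because this top-third/middle-third/bottom-third scoring rule recurs across most APR metrics, here is a minimal sketch of it, assuming a simple index-based split (how the APR actually handles ties and uneven splits is not documented here):

```python
# Hypothetical tercile scoring: sort programs from highest to lowest value,
# then award 2 / 1 / 0 points by top, middle, and bottom third.

def tercile_scores(values_by_program):
    ranked = sorted(values_by_program.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    scores = {}
    for i, (program, _) in enumerate(ranked):
        if i < n / 3:
            scores[program] = 2   # top third
        elif i < 2 * n / 3:
            scores[program] = 1   # middle third
        else:
            scores[program] = 0   # bottom third
    return scores

print(tercile_scores({"A": 118, "B": 57, "C": 12}))  # {'A': 2, 'B': 1, 'C': 0}
```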


5-year enrollment growth

Determination: This was calculated by determining the percentage change in enrollment over the past five years based on the headcount at fall census.
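
For example, a sketch of the percentage-change calculation (the census headcounts below are hypothetical):

```python
# Hypothetical five-year enrollment growth from fall census headcounts.
fall_2020 = 40
fall_2024 = 48
growth = (fall_2024 - fall_2020) / fall_2020 * 100
print(f"{growth:.0f}%")  # 20%
```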

Data Source & Validation: Two OIRE staff members gathered data independently from Banner and compared results.

Value Range: Values range from -100% to +33,500%.

Dashboard Example 1: 20%

Dashboard Example 2: -15%

Interpretation of Example 1: The change in the number of students majoring in the program from census of Fall 2020 to census of Fall 2024 was an increase of 20%. The program saw a growth in enrollment between those two periods.

Interpretation of Example 2: The change in the number of students majoring in the program from census of Fall 2020 to census of Fall 2024 was a decrease of 15%. The program saw a decline in enrollment between those two periods.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points). Programs that were established within the past five years received a score of NA (not applicable).

Lightcast completion growth rate

Determination: Lightcast was used to determine the growth in degree completions over the past four years (2019-2023) for all programs within the Midwest region based on Classification of Instructional Programs (CIP) code and degree level. (Limitation: Lightcast is unable to differentiate growth rates at a concentration level. All concentrations within the same program CIP code and degree level will share the same completion growth rate.)

Data Source & Validation: Two OIRE staff members gathered data independently from Lightcast and compared results.

Value Range: Values range from -65% to +140%.

Dashboard Example 1: 17.1%

Dashboard Example 2: -9%

Interpretation of Example 1: Lightcast shows that there has been a 17.1% increase in the completion of degrees for that program at that degree level in the Midwest region over the past four years.

Interpretation of Example 2: Lightcast shows that there has been a 9% decrease in the completion of degrees for that program at that degree level in the Midwest region over the past four years.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points). If Lightcast data were not available, the program received a score of NA (not available).

Lightcast occupation growth rate

Determination: Lightcast was used to determine the five-year growth rate for target occupations for each program CIP code and degree level within the Midwest region. (Limitation: Lightcast is unable to differentiate growth rate at a concentration level. All concentrations within the same program CIP code and degree level will share the same occupation growth rate.)

Data Source & Validation: Two OIRE staff members gathered data independently from Lightcast and compared results.

Value Range: Values range from -20% to +37%.

Dashboard Example 1: 9.7%

Dashboard Example 2: -4.7%

Interpretation of Example 1: Lightcast shows that there has been a 9.7% increase in the number of jobs in target occupations for individuals with a degree in that program at that degree level in the Midwest region over the past five years.

Interpretation of Example 2: Lightcast shows that there has been a 4.7% decrease in the number of jobs in target occupations for individuals with a degree in that program at that degree level in the Midwest region over the past five years.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Program cost margin

Determination: The UI-System Program Cost Study report was used to determine the program cost per credit hour at each degree level for each program. Line 14 - Total All Costs from the Program Cost Study reports was subtracted from the average net total revenue per student credit hour for all programs at that degree level. Tuition differential figures were added to eligible programs. See pages 12-13 of the Academic Portfolio Review Task Force's report, Recommendations for a Comprehensive Portfolio Review of Academic Programs at the University of Illinois Springfield, for additional details on how the total costs are calculated for undergraduate programs, the net total revenue figures for different degree levels, and the tuition differential figures used for eligible programs.
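
A minimal sketch of that margin calculation, using hypothetical per-credit-hour figures:

```python
# Hypothetical program cost margin per student credit hour (SCH):
# (net revenue + tuition differential) minus total cost.
net_revenue_per_sch = 350.00   # average net total revenue per SCH at that degree level
tuition_differential = 64.01   # added for eligible programs; 0.00 otherwise
total_cost_per_sch = 201.66    # Line 14 - Total All Costs, per SCH

margin = (net_revenue_per_sch + tuition_differential) - total_cost_per_sch
print(f"${margin:.2f}")  # $212.35
```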

Data Source & Validation: An OIRE staff member conducted the calculations on two separate occasions from UI-System Cost Study reports. The VC for Finance and Administration reviewed calculations and revenue/differential numbers. 

Value Range: Values range from -$305.97 per student credit hour (SCH) to +$232.82 per SCH.

Dashboard Example 1: $212.35 

Dashboard Example 2: -$201.66

Interpretation of Example 1: The difference (the margin) between the program's average net total revenue per student credit hour and the program's cost per student credit hour is $212.35. The program's net total revenue received is $212.35 per student credit hour greater than the program's expenses per student credit hour.

Interpretation of Example 2: The difference (the margin) between the program's average net total revenue per student credit hour and the program's cost per student credit hour is -$201.66. The program's expenses are $201.66 more per student credit hour than the net total revenue received per student credit hour.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points). If Cost Study data were not available, the program received a score of NA (not available).

Contribution to UIS service courses

Determination: This was determined by the number of credit hours generated by the academic unit and its faculty for UIS service courses from Summer 2023 through Spring 2025. Service courses included (a) general education and university requirement (ECCE) courses offered as part of the general education and university curriculum, (b) honors courses with a CAP prefix taught by faculty members within the academic unit, (c) courses with a UNI prefix taught by faculty members within the academic unit, and (d) first-year seminar courses. This metric was applicable only to undergraduate programs and stand-alone minors; graduate and doctoral programs did not receive a score for this metric.

Data Source & Validation: An OIRE staff member pulled the list of courses and student credit hours from Banner. Courses and instructors were paired with their respective programs. UEOs were asked to confirm if the list of courses and the instructors identified aligned with the appropriate programs.

Value Range: Values range from 0 student credit hours (SCH) to 4,847 SCH.

Dashboard Example: 2,188 SCH (UIS300-Patel, UIS300-Smith, UIS305-Garcia, UIS305-Ali, UIS316-Kim, UIS316-Jones, UIS316-Chen)

Interpretation: Courses from the UIS program that were identified in Banner as meeting one or more of the attributes listed were UIS 300, UIS 305, and UIS 316. Professors from the UIS program who taught those courses were Patel, Smith, Garcia, Ali, Kim, Jones, and Chen. The dashboard example identifies which course-instructor combinations are recognized as service courses (UIS300-Patel, UIS300-Smith, UIS305-Garcia, UIS305-Ali, UIS316-Kim, UIS316-Jones, UIS316-Chen). The credit hours of those course-instructor combinations are compiled for every semester from Summer 2023 through Spring 2025, and those totals are added together to get the program's total of service course hours. The UIS program contributed 2,188 student credit hours of service courses over the two-year period.
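
A sketch of that aggregation, with hypothetical per-term SCH figures:

```python
# Hypothetical service-course SCH totals per course-instructor combination,
# one entry per term from Summer 2023 through Spring 2025.
sch_by_combo = {
    ("UIS300", "Patel"): [45, 48, 42, 51],
    ("UIS305", "Garcia"): [30, 33, 36],
    # ... remaining course-instructor combinations
}

total_sch = sum(sum(term_sch) for term_sch in sch_by_combo.values())
print(total_sch)  # the program's two-year service-course SCH total
```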

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

External sponsorship

Determination: This was determined by the amount of external sponsorship received through the Office of Research and Sponsored Programs (ORSP) that is associated with the program. External sponsorship may include contracts and grants received from an external organization. A five-year total of the external sponsorships was used.

Data Source & Validation: Data obtained from the Office of Research and Sponsored Programs.

Value Range: Values range from $0 to $6,137,821.

Dashboard Example: $228,087 (Johnson-3, Hernandez-1)

Interpretation: There were four grants listed by ORSP that were associated with faculty teaching in this program. Dr. Johnson was the principal investigator for three grants and Dr. Hernandez was the principal investigator for one grant. The total of these four grants was $228,087.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Alumni engagement

Determination: A change was made to this metric from the proposed metric in the APRTF report. Per recommendations of the VC for Advancement, this metric was adjusted from Advancement Contributions to Alumni Engagement, as alumni engagement is a better reflection of the influence of programs on alumni's connections with UIS. Alumni engagement figures were broken down by college and degree level. Annual alumni engagement figures since 2022 (post-reorganization) were used.

Data Source & Validation: Data obtained from the Alumni Engagement Dashboard from the U of I Foundation.

Value Range: Values range from 16.9% to 24.8%.

Dashboard Example 1: 21.7% of CUIS UG alumni

Dashboard Example 2: 16.9% of CUIS Grad alumni

Interpretation of Example 1: 21.7% of the undergraduate alumni from the academic college (CUIS) were engaged with the university since 2022 per the Alumni Engagement Dashboard.

Interpretation of Example 2: 16.9% of the graduate alumni from the academic college (CUIS) were engaged with the university since 2022 per the Alumni Engagement Dashboard.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Sponsoring of RSOs

Determination: UEOs were asked to identify the registered student organizations sponsored or advised by their program(s) or by faculty within the program.

Data Source & Validation: Student organizations identified on a UEO survey were cross-referenced with Student Life to ensure they were active, registered student organizations. If they were not, organizations listed by the UEO were counted in the community engagement (local) metric.

Value Range: Values range from 0 to 5.

Dashboard Example 1: 3-Project Insight, Sigma Solutions, Campus Collective

Dashboard Example 2: 0-see com eng metric for Prism Circle

Interpretation of Example 1: There are three registered student organizations associated with the program and/or its faculty members. Those RSOs are Project Insight, Sigma Solutions, and Campus Collective.

Interpretation of Example 2: Although the UEO reported that Prism Circle is a student organization for the program, the Student Life office indicated that the organization had not been officially registered in the past two years. Rather than listing Prism Circle for this metric, it was counted in the community engagement (local) metric instead.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Community learning placements

Determination: UEOs were asked whether their unit's programs require community learning placements, such as internships, clinicals, or practicums, as part of their requirements for graduation.

Data Source & Validation: UEO survey 

Value Range: Required, not required

Dashboard Example 1: Yes, IPL 300

Dashboard Example 2: Not required

Interpretation of Example 1: The program requires a community learning placement as part of the graduation requirements. The course is IPL 300.

Interpretation of Example 2: The program does not require a community learning placement as part of the graduation requirements.

Scoring: Programs received 2 points if the answer was yes and 0 points if the answer was no.

Community engagement (local)

Determination: UEOs were asked to identify the local community events associated with the program in the past two years. These events included demonstrations of research, performances or displays, panel discussions, workshops, non-registered student organizations, etc., that were specifically open to students or members of the broader Springfield and central Illinois community in a non-virtual context. They could also include faculty members' involvement in local professional boards and organizations where their involvement as a member of the UIS community was represented.

Data Source & Validation: An OIRE staff member reviewed the activities listed on the completed UEO survey to ensure they adhered to the metric definition.

Value Range: Values range from 0 to 126.

Dashboard Example: 6 (Advisory Board, Springfield Society, Panels-3, Alumni Event)

Interpretation: There were six activities/events that were submitted by the program UEO that met the requirements. Those activities/events included an advisory board, Springfield Society, three panels, and an alumni event.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Community engagement (broad-reaching)

Determination: UEOs were asked to identify broad-reaching community events associated with the program(s) in the past two years. These events included demonstrations of research, performances or displays, panel discussions, workshops, non-registered student organizations, etc., that were delivered virtually or occurred in a non-virtual setting in a region outside of the broader Springfield and central Illinois community and were specifically open to students or members of the public. They could also include faculty members' involvement in professional boards and organizations where their involvement as a member of the UIS community was represented.

Data Source & Validation: An OIRE staff member reviewed the activities listed on the completed UEO survey to ensure they adhered to the metric definition.

Value Range: Values range from 0 to 55.

Dashboard Example: 18 (ARQ, grant reviews-3, presentation-10, symposium-2, board, NYS)

Interpretation: There were 18 activities/events submitted by the program UEO that met the requirements. Those activities/events included ARQ, three grant reviews, 10 public presentations, two symposiums, serving on a board, and NYS.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Community visibility

Determination: This was determined by the number of times a program and its faculty were recognized in the UIS in the News daily e-mail announcement distributed by the UIS Media Strategy department. UIS in the News captures external media mentions of UIS. A two-year count of the mentions was used (5/1/2023-5/1/2025).

Data Source & Validation: UIS in the News submissions were reviewed by Public Relations and by OIRE.

Value Range: Values range from 0 to 41.

Dashboard Example: 4 mentions (UIS program, Carter, Abebe)

Interpretation: The UIS program and/or its faculty were mentioned four times in the UIS in the News e-mail announcements of external media mentions. Dr. Carter, Dr. Abebe, and the UIS program were the subjects of those mentions.

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points).

Accreditation

Determination: UEOs were asked to select one of the following options to describe their program's external accreditation status (including conditional accreditation): (a) Yes, and we must be, (b) Yes, and we choose to be, (c) No, and we can't be, and (d) No, but we could be.

Data Source & Validation: UEO survey 

Value Range: Accredited, not accredited, not applicable

Dashboard Example 1: Externally accredited (either by choice or required)

Dashboard Example 2: Could be externally accredited

Dashboard Example 3: No external accreditation available

Interpretation of Example 1: The UEO reported that the program is externally accredited.

Interpretation of Example 2: The UEO reported that the program could be externally accredited, but the program currently is not externally accredited.

Interpretation of Example 3: The UEO reported that there is no external accreditation available for the program.

Scoring: Programs that answered with option a or b received 2 points, programs that answered with option d received 0 points, and programs that answered with option c received a score of NA (not applicable).

Program review

Determination: This metric assessed whether a program’s program review is current according to the timeline found on the OIRE Program Review Schedule.

Data Source & Validation: OIRE's IBHE program review status list

Value Range: Current, Past-Due

Dashboard Example 1: 2024-25 Self-Study Year (submitted)

Dashboard Example 2: 2024-25 Self-Study Year (not submitted)

Dashboard Example 3: 2025-26 Self-Study Year (current)

Interpretation of Example 1: The program's self-study year is 2024-25 and the program has submitted its self-study report.

Interpretation of Example 2: The program's self-study year is 2024-25 and the program has not yet submitted its self-study report.

Interpretation of Example 3: The program's self-study year is 2025-26. The program is presently current on its review.

Scoring: Programs that were current received 2 points. Programs that were past-due received 0 points.

Curriculum revision

Determination: This metric captured whether a program has conducted any program curriculum revision, including those currently in the governance process, in the last five years.

Data Source & Validation: Campus Senate Resolutions

Value Range: Revised, Not Revised

Dashboard Example 1: Fall 2021 (established)

Dashboard Example 2: Fall 2024

Dashboard Example 3: None in past 5 years

Interpretation of Example 1: The program's last curriculum revision occurred in Fall 2021 when the program was established.

Interpretation of Example 2: The program's last curriculum revision occurred in Fall 2024.

Interpretation of Example 3: The program has not had any curriculum revisions in the past five years.

Scoring: If a program had revisions, it received 2 points. If it had not, it received 0 points.

Catalog maintenance

Determination: The number of courses listed in the UIS catalog that had not been taught in the past three years was counted by course prefix. Some course types were excluded: continuing enrollment courses, newly created courses, honors seminars, independent studies, internship courses, ION credit courses, practicums, project courses, research courses, thesis courses, topics courses, tutorial courses, and cross-listed courses when the program was not the controlling department of the course.
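
A minimal sketch of that count, assuming per-course catalog records; the field names and the abbreviated exclusion set below are illustrative, not the actual data layout:

```python
# Hypothetical catalog-maintenance count: catalog courses not taught in the
# past three years, minus the excluded course types.
EXCLUDED_TYPES = {"continuing enrollment", "independent study", "internship",
                  "practicum", "thesis", "topics", "tutorial"}  # abbreviated list

def untaught_course_count(catalog_courses):
    return sum(
        1 for c in catalog_courses
        if not c["taught_past_3_years"] and c["type"] not in EXCLUDED_TYPES
    )

courses = [
    {"number": "UIS 509", "type": "lecture", "taught_past_3_years": False},
    {"number": "UIS 510", "type": "thesis", "taught_past_3_years": False},
]
print(untaught_course_count(courses))  # 1 (UIS 510 is an excluded type)
```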

Data Source & Validation: Enrollment & Retention Management provided a list of courses that had not been taught. After consulting with ERM and Provost Team Leadership, OIRE identified and removed the excluded course types.

Value Range: Values range from 0 to 26 courses.

Dashboard Example 1: 0 courses

Dashboard Example 2: 3 courses - UIS 509, 528, 537

Interpretation of Example 1: The program has no courses listed in the course catalog that had not been taught in the past three years beyond those course types that were excluded.

Interpretation of Example 2: The program has 3 courses listed in the course catalog that had not been taught in the past three years beyond those course types that were excluded. Those 3 courses are UIS 509, UIS 528, and UIS 537.

Scoring: Results were ranked from the lowest to the highest number of courses. Programs in the lowest third (fewest untaught courses) received a score of 2 points, the middle third received a score of 1 point, and the highest third received a score of 0 points.

Assessment plan

Determination: UEOs were asked to submit their program’s current assessment plan. 

Data Source & Validation: OIRE staff scored the plans submitted by the UEOs.

Value Range: Well-developed and current; current but not well-developed; well-developed but not current; no plan.

Dashboard Example 1: Well-developed and current

Dashboard Example 2: Current; could benefit from further development

Dashboard Example 3: Not submitted

Interpretation of Example 1: The program submitted an assessment plan that was well-developed and current.

Interpretation of Example 2: The program submitted an assessment plan that was current, but it could benefit from additional development.

Interpretation of Example 3: The program did not submit an assessment plan.

Scoring: Assessment plans that were current (completed or updated within the last 5 years) and well-developed received a score of 2. Assessment plans that were out-of-date, or current but not well-developed, received a score of 1. Programs that did not submit or have an assessment plan received a score of 0.

Ongoing program assessments

Determination: UEOs were asked to list any changes to the curriculum (particularly, although not necessarily exclusive to, changes made in response to assessment) that did not go through governance in the past five years. 

Data Source & Validation: UEO survey

Value Range: Changes, No Changes 

Dashboard Example 1: Change to pre-reqs

Dashboard Example 2: None

Interpretation of Example 1: The UEO identified changes to the program. For this program, the changes were to prerequisite requirements.

Interpretation of Example 2: The UEO did not identify any changes to the program.

Scoring: Programs that identified curriculum changes received 2 points. Programs that did not identify curriculum changes received 0 points.

5-year average graduation rates

Determination:

Data Source & Validation: 

Value Range: 

Dashboard Example:

Interpretation:

Scoring: 

5-year average program completion rates

Determination: This metric represents the number of program graduates for a given academic year divided by the number of students enrolled in the program during that same academic year. A program's number of students was determined by calculating an unduplicated headcount across academic terms from the years 2020-2024. A five-year average of program completion rates is used.

Data Source & Validation: Two OIRE staff members calculated the metric from Banner data and compared results.

Value Range: Values range from programs having no graduates during the five-year period to a program having a 100% completion rate.

Dashboard Example: 12% (no students, no students, 0%, 13%, 24%)

Interpretation: The program had no students enrolled in the program in 2020 and 2021. In 2022, the program had students but 0% of them graduated. In 2023, 13% of the students graduated. In 2024, 24% of the students graduated. The average percentage across the years with students enrolled was 12%.
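
A sketch of that averaging rule, which includes only years in which the program had students enrolled (the yearly rates below mirror the example above):

```python
# Hypothetical five-year average completion rate; None = no students that year.
yearly_rates = [None, None, 0.00, 0.13, 0.24]

valid = [r for r in yearly_rates if r is not None]
average_rate = sum(valid) / len(valid)
print(f"{average_rate:.0%}")  # 12%
```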

Scoring: Results were sorted from the highest value to the lowest value. Each program received a point score based on whether it fell in the top-third of programs (2 points), middle-third of programs (1 point), or bottom-third of programs (0 points). Minors were excluded from this metric.

Time-to-degree completion

Determination: This metric calculates the amount of time, in years, a program's graduates took to earn their degree. In a given academic year, program graduates' entering term cohort codes were used to determine the amount of time each student required to complete the program. We calculated the average time to complete for all graduates of a program for each academic year from 2019-2024, then calculated the average of those five academic years. Two groups of students were excluded from the analysis: students missing an entering term cohort code, and students whose entering term cohort code preceded the university's time limitation policy for completing a degree program (6 years for masters, 7 years for bachelors, and 8 years for doctoral).

Data Source & Validation: Two OIRE staff members calculated the metric and compared results.

Value Range: Doctoral: values range from having no graduates to 6.0 years; Masters: values range from having no graduates to 3.6 years; Bachelors: values range from having no graduates to 3.8 years; Minors: excluded

Dashboard Example: 2.3 years (2.4, 2.1, 2.3, no grads, 2.6)

Interpretation of Example: The average time to degree completion was 2.4 years for students graduating in AY20, 2.1 years for AY21, 2.3 years for AY22, and 2.6 years for AY24; there were no graduates in AY23. The average across these five years, excluding the year with no graduates, was 2.3 years.
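
A sketch of that calculation, skipping academic years with no graduates (the yearly averages mirror the dashboard example above):

```python
# Hypothetical five-year average time-to-degree; None = no graduates that year.
yearly_averages = [2.4, 2.1, 2.3, None, 2.6]  # AY20-AY24

valid = [y for y in yearly_averages if y is not None]
print(sum(valid) / len(valid))  # 2.35, displayed on the dashboard as 2.3
```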

Scoring: Results were sorted from the lowest value to the highest value within the same degree level (bachelors, masters, doctoral). Each program received a point score based on whether it fell in the third of programs with the shortest times (2 points), the middle third (1 point), or the third with the longest times (0 points) within its degree level.

External exam pass rate

Determination: If the UIS pass rate is at or above national/state pass rates, the program will get a score of 2. If the pass rate is below national/state pass rates, the program will get a score of 0. Programs that do not require an external exam will not receive a score for this metric. A score of 0 will be assessed if testing results are not received from the unit executive officer.
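
A trivial sketch of that rule (the function name and parameters are illustrative only):

```python
# Hypothetical pass-rate scoring per the determination above.
def exam_score(uis_rate, benchmark_rate, exam_required=True, results_received=True):
    if not exam_required:
        return None  # metric not scored
    if not results_received:
        return 0     # no results submitted by the UEO
    return 2 if uis_rate >= benchmark_rate else 0

print(exam_score(0.73, 0.76))  # 0 (matches Dashboard Example 1)
```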

Data Source & Validation: UEOs were asked to provide exam pass rates and national/state pass rates.

Value Range: Rates above national/state average, Rates below national/state average, not applicable

Dashboard Example 1: UIS Pass Rate: 73%; National Pass Rate: 76%

Dashboard Example 2: Exam not required

Interpretation of Example 1: The program had an external exam pass rate of 73%. The national pass rate is 76%.

Interpretation of Example 2: The program does not require an external exam.

Scoring: Programs with rates at or above national/state averages received a 2, programs with rates below national/state averages received a 0, and programs not requiring an exam were excluded.