Assessment Team Resources
- 6/30: 18-19 Data Collection Ends (18-19 Results Report)
- 7/1: New data collection cycle begins (19-20)
- 12/1: 18-19 Results Reports Due
- TBD: 19-20 Plans Due
Each department coordinator will present to the Division Review Committee about their process, findings, and Success Story.
- What is your departmental assessment process?
- What is new in your process this year? What went well? What will you change in the next cycle?
- How does your department use assessment data?
- Who reviews the data? Is the data used to support reports, audits, accreditations, certifications, or other official uses?
- How did you prepare for a successful assessment cycle?
- Talk about how you planned to meet the deadline, how you worked with your Advisor, whether you utilized the DRC Chair and/or OEAS consultations, whether you attended training, etc.
- Feedback and Suggestions
- Division Process
- Division resources & tools
- Rubrics / Expectations
- Online System
- Your department's rubric scores over time
- Trend line for at least the past five years
- Success Stories
- An example of Closing the Loop
- An example of Closing the Loop with Documented Gains
- What changes to your operations, services, or business practices were made in this cycle as a result of assessment data?
- What changes are you making to the 2018-19 Assessment Plan?
Administration and Finance DRC A
Division Review Committee: Tracy Slavik (chair), Cynthia Pugsley, Mike Shumack, Brian Sargent
- Josh Emambakhsh, Human Resources
- Cynthia Pugsley, Security and Emergency Management
- Meghan McCollum, Finance and Accounting
- Debbie Frankenbach, Human Resources
- Louann Huynh, Parking and Transportation
- Monica Mayer and Nellie Nido, Procurement Services
- Cynthia Pugsley, UCF Police
Administration and Finance DRC B (Facilities)
For information on DRC B, contact Brian Wormwood at email@example.com.
Assessment is an ongoing process that uses the results from measured outcomes to improve programs and operations. In 1994, UCF established a goal that all academic and administrative units would develop mission statements, objectives, and at least three outcome measures to assess and improve programs, operations, and services. In 1996, President Hitt established the University Assessment Committee; the Office of Operational Excellence and Assessment Support (OEAS) had been instituted in 1991 to support quality improvement efforts across our campuses. Today, every academic program and administrative unit at UCF is actively engaged in an assessment process. Assessment is conducted in response to many external drivers (e.g., Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) Criteria for Accreditation, CAEP, AACSB, and ABET) and, most importantly, the internal drive to improve.
Assessment Glossary for Administrative Units
Use ctrl + f to search for key words.
Accreditation— process that involves a regular, comprehensive review of all institutional functions. No US institution is required to seek accreditation; however, institutions seek accreditation because of the benefits. See SACSCOC.
Advisor— a DRC member who is assigned to a department as a reviewer. The A&F DRC members and coordinators voted in 2016 to update our reviewer titles from Mentor to Advisor. The Advisors assist the Assessment Coordinator (and the department assessment team) with achieving successful, meaningful assessment and meeting the rubric requirements. In other university DRCs Advisors may be referred to as mentors or reviewers.
Assessment—UCF’s Institutional Effectiveness initiative. Assessment can be defined as the systematic and ongoing method of gathering, analyzing, and using information from various sources to demonstrate that we are making evidence-based business decisions that impact our operations with positive change that moves our university forward in the intended strategic direction. In 1994, UCF established a goal that all academic and administrative units would develop mission statements and outcomes with at least two measures to assess and improve administrative units, operations, and services. Assessment is conducted in response to many external drivers (e.g., Southern Association of Colleges and Schools [SACS], International Association of Counseling Services [IACS], and Council for Advancement of Standards in Higher Education [CAS]) and most importantly the internal drive for continuous improvement.
Assessment Coordinator— Serves as the assessment leader within a department. They are responsible for coordinating communication among the department assessment team, presenting documents and findings to department leadership, collaborating with their advisor (DRC member/mentor/reviewer) to meet rubric expectations, entering documents into the OEAS assessment portal, and providing any additional documentation as requested by their advisor, DRC Chair, or OEAS regarding Assessment.
Assessment Cycle— this term has two meanings:
- (1) the period of time reflected in the documents (e.g., fiscal year, academic year, or calendar year). In Administration and Finance, our assessment cycle is based on the fiscal year (July 1 – June 30). Your plan sets out specific measures for that time frame, and during the cycle you collect the data. After the cycle you report on the data and adjust your plan for the next cycle.
- (2) the period of time when the report and plan documents are being actively edited. In Administration and Finance, that is a 12-week period following the beginning of the fiscal year.
Assessment Instruments— surveys, logs, etc. that are used to collect the data that is reported. In the plan module, you are asked to attach a draft to demonstrate that you are forward-thinking about how you are going to collect your data. In the Results module you are asked to attach a final draft of an assessment instrument to show how you did collect the data.
Assessment Team— The department’s assessment team includes the Assessment Coordinator (the leader for the department’s assessment initiative), their colleagues who participate in the assessment process, the director/chair of the department, the DRC Advisor, the DRC chair, and the OEAS staff. All members of the team should seek collaboration with each other; however, the primary responsibility for initiating collaboration is assigned to the Assessment Coordinator.
Baseline Data—an initial determination of performance (the starting point), used in a measure that does not already have data to measure success against. Baseline data can be used to set improvement goals and provide the basis for assessing future progress; future progress will be assessed or comparisons made against this initial data.
Benchmark—model effort, best practices, recognized standard of excellence, a reference point, or historical data against which other data are compared. It is used to help establish performance and project targets.
Closing the Loop— you used a measure to assess a business practice. In the results report, you documented that you found something that needed to be improved (a gap) and documented that you were making a change to operations to address the gap you found. In the next plan year, you added or changed a measure in order to evaluate the success of the change you made as documented in the previous results report. In a measure that is closing the loop (“yes” radio button), you must describe how you determined the gap (quote from previous results report) and your strategy to bring about the change.
Collaborative Model—we ask that all units create a collaborative team model for assessment, involving some or all members of the assessment team as well as the director/chair of the department. The assessment team works together to create meaningful, formative assessment plans, collect and analyze data, and translate that data into meaningful change to departmental operations. Those officially included in the process should be listed in the participants section of the plan, and their roles described in the Assessment Process section. Other members of the department should be included in brainstorming sessions and open discussions.
Direct Measure—data captured from a source that shows exact quantity and is a sample of something actual (log, staff time, cost, audit data, observation, demonstration, demonstrative learning questions for workshops/trainings, rubrics, expert observer report, etc.). This category includes methods that assess demand, quality, efficiency, and effectiveness. For example, efficiency may address completion of service, productivity of service, and efficiency of individual points of service. Direct measures also include pre- and post-instruments that demonstrate workshop and other training success. Note that a survey, while normally indirect (opinion), can be used as a direct measure if it is the only method of reporting success toward a target (e.g., a measure regarding customer satisfaction would be directly measured by a report of customer opinions on their experience).
Division Review Committee (DRC)— a team of Advisors, each of whom is paired with one or two departments. They assist the Assessment Coordinator and their department assessment team with achieving successful and meaningful assessment, as well as meeting the rubric expectations.
Documentation—any change or improvement or result must be included in the assessment report. If it isn’t in the report, it didn’t happen!
Documented Improvement — a change to operations that yields positive, measurable results. Also called a Gain.
Formative Assessment—measures that clearly promote evaluating success over time and creating quality improvements to business practices. (In contrast, see Summative Assessment). In general, when a target for a measure is 100% or when a measure is written to “maintain” or simply document a level of performance, it is unlikely that the measure has formative potential.
Gain— a change to operations that yields positive, measurable results. Also called Documented Improvement.
Gap— you may hear this term in reference to a “gap in service” – something you found that needs to be changed in order to improve services. Discovering gaps is the first element of successful assessment! Discover a need for change, make a change, evaluate the change—that is Assessment Nirvana!
Goal— goals are broad statements that describe the long-term directions of development. Note: Measurable outcomes help in achieving the goals, and those measures have quantifiable targets.
Gratitude—THANK YOU for your commitment to excellence and for taking the initiative to make this process meaningful! You ROCK!
Implemented Changes— at the time of the writing (wherever you are entering information), this change has already been put in place.
Inclusiveness Language— use language that minimizes bias and maximizes inclusion, such as staff or personnel hours rather than "man hours" and alphabetizing pronoun pairings and lists (he or she; her or him). University standards can be accessed here: UCF Brand Site
Indirect Measure— data captured that are not first-hand or factual; they reflect an individual's perceptions or opinions. Examples include customer satisfaction surveys, workshop/training surveys, case studies, focus groups, and interviews. Note that a post-test may contain questions that are considered direct measures. For example, if you ask "did this workshop expand your knowledge?" the response is the opinion of the attendee. If you instead ask, at the beginning and at the end of the workshop, a factual question that the participant would learn to answer during the workshop, the question is direct because the participant is demonstrating knowledge gained rather than giving an opinion about whether the knowledge was gained.
Indicator—the rubric requirements for your plans and reports.
Institutional Effectiveness— “Institutional effectiveness is the systematic, explicit, and documented process of measuring performance against mission in all aspects of an institution.” (SACS/COC Resource Manual) See Assessment.
Meaningful Assessment—assessment plan designed by the department assessment team to promote and measure positive change to department operations and business practices. It is the responsibility of the department assessment coordinator to work collaboratively with leadership and other staff members to ensure a meaningful, useful assessment process. See Useful Assessment.
Measure—a specific way to document progress toward an outcome; each outcome requires at least two separate and distinct measures. Providing different ways to measure success toward the outcome provides triangulation of data.
Mission Statement—the department's primary purpose, functions, and stakeholders. The mission statement listed in the plan should be the official mission statement of the department. If the official mission of the department doesn't have all of the rubric (required) elements, or if there are additional elements you would like to include in this section, use a subheading so that the Mission Statement section begins with the official mission statement of the department.
Objective—a term from Strategic Planning that means the same thing as Outcome.
OEAS—Operational Excellence and Assessment Support office. This is the "hub" and heart of assessment for UCF; our expert team who have made UCF a national leader in Assessment and Institutional Effectiveness. They serve as a resource to every assessment coordinator, mentor/reviewer, and DRC chair at UCF. They also program and support the online assessment system, and can be contacted for technical assistance.
Outcome—specific statements that describe desired quality (timeliness, accuracy, responsiveness, etc.) of key functions and services that support the department's mission. There is NO QUANTITATIVE TARGET in an outcome; outcomes are the desired achievement; measures have a quantifiable performance target.
Plan— the set of outcomes and measures that you will collect data on throughout the cycle and provide a detailed and thorough report on when the cycle concludes.
Planned Changes— at the time of the writing (when you are entering information), this change has not yet been put into place.
Quantitative Target—a specific number or percent in your measure that will clearly and decisively show whether your target for this measure was met. If your target was three and you achieved four, your target was met. You must clearly indicate “target met” or “target not met” when you report on each measure.
Reflective Statement— analyze and discuss your results. Why do you think you got the results that you did? Include what you learned from the data, how that data compares to last year (if a continued measure), and what changes will be made to either your business practices or your assessment plan or process in the coming cycle based on this data. If you are making a change to your business practices, how will you measure the success of that change in the next cycle? Will you change this measure, add another measure, add an outcome? If you are making changes, include this in your Planned and Implemented Changes section of the report, and refer back to this Outcome’s reflective statement in the description.
Results — should include accurate and thorough data reporting. This means to look at your measure and report what you set out to do (all elements/concepts mentioned in the measure). What is your target? What specific things did you say you were trying to accomplish with this measure? Include response rates for survey data—how many people were given the survey and how many responded? If you are reporting on cost or personnel hours savings, you must include the numbers that support the percentages. Include the strategies used to collect the data, and how that data is meaningful to you.
Results Report—the final report you do that includes your results for your measures, reflections, and the documentation regarding changes you have made and plan to make based on your assessment findings.
Rubric— our “guidepost” – the list of expectations and requirements that we need to meet. The rubric tool provides guidance so we know what the reviewers are looking for.
SACSCOC—Southern Association of Colleges and Schools Commission on Colleges (accreditation agency). SACSCOC is the regional body for the accreditation of degree-granting higher education institutions in the southern states. It serves as the common denominator of shared values and practices among the diverse institutions in Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, Virginia, and Latin America, as well as other international sites approved by the Commission on Colleges that award associate, baccalaureate, master's, or doctoral degrees. The Commission also accepts applications from other international institutions of higher education. The mission of SACSCOC is to assure the educational quality and improve the effectiveness of its member institutions.
Strategic Planning (section of the Assessment Plan)— demonstrate that this plan's outcomes and measures document the department's support of the university's strategic direction. UCF's strategic plan elements can be viewed here: https://www.ucf.edu/strategic-planning/strategic-plan-key-elements/. In order to meet the rubric expectation, you must give an example of at least one outcome/measure and describe how investing time to measure that item supports the university's strategic direction.
Strategy— how we determine the target in this measure. The strategy is described in the measure, and the success of the strategy is discussed in the results or reflective statement.
Stretch Targets— when a measure is kept the same, but the target quantity is increased over the previous cycle. This does not achieve closing of the loop unless the stretched target is evaluating the success of a change that was made based on previous assessment data.
Student Learning Outcome—desired achievement for persons in a learning setting such as a workshop, staff development training, or presentation.
Success Stories—examples of closing of the loop achievements and documented improvements. These are reported in the assessment system as well as being highlighted in the OEAS Annual Assessment Report to the President.
Summative Assessment— assessment plan that documents business practices, such as cumulative evaluations over multiple cycles to determine effectiveness and measure success toward a long-term goal, but does not focus on promoting immediate change to operations.
Target—specific, quantifiable achievements that document and demonstrate the success of this measure. The target is clearly stated in the measure. (Note: the target in a measure is not a goal. Please see Goal).
Triangulation—multiple lines of evidence pointing to the same conclusion (more than one measure supporting an outcome).
University Assessment Committee (UAC)— composed of the chairs of each of the approximately 20 DRCs at the university. This entity is charged by the president to steer the assessment process for UCF, review the assessment reports of all departments, ensure quality assessment is being conducted at all levels, and assist OEAS in guiding the department-level assessments to support the university's accreditation requirements.
Vision— narrative description in the Strategic Plan that helps to keep the team focused. It is a conceptual image of the desired future. Some departments have their own strategic plan and goals and vision statements; these should support UCF’s strategic plan and vision. UCF’s strategic plan elements can be viewed here: https://www.ucf.edu/strategic-planning/strategic-plan-key-elements/
UNIVERSITY OF CENTRAL FLORIDA
DIVISION OF ADMINISTRATION AND FINANCE
4365 Andromeda Loop N. Orlando, Florida 32816 | 407.823.2351
Web: admfin.ucf.edu | email: firstname.lastname@example.org
© University of Central Florida