Frequently Asked Questions

Why can't I log into Xitracs?

Why do we need to assess student learning outcomes? / Now that SACS is over, why should we continue to assess?

What's the difference between a student learning outcome (SLO) and an operational outcome?

What types of supporting documents should I upload into Xitracs?

How many examples do I need?

What happens if you keep meeting your criteria?

How do you "close the loop?"

What are some examples of "closing the loop?"

Where can I find examples of closing the loop for my discipline?

What should our assessment committee do?

What's the role of the college assessment committee?

What's the role of the University Academic Assessment Council?

Why can't I log into Xitracs?

First, make sure you are trying to log into the Xitracs Portal (rather than the Xitracs Accreditation Management System). Second, be sure to use your ASU username and password. You may also want to try a different browser to see if that makes a difference. If you are still having problems, please contact IRAP to make sure the correct permissions have been assigned to you.

Why do we need to assess Student Learning Outcomes? / Now that SACS is over, why should we continue to assess?

Accountability

Appalachian's assessment efforts go beyond meeting the expectations of SACS and various programmatic accreditors. Sound assessment practices allow programs to speak authoritatively to institutional stakeholders about the impact of their efforts and to present evidence that a reasonable person would accept as proof that their claims are accurate. The call for accountability in higher education keeps getting louder. In response, the University of North Carolina system has made publishing expected learning outcomes for each degree program on each campus part of UNC's Strategic Directions.

Program Improvement

Most importantly, the assessment process at Appalachian is not about keeping score; it is about getting better. The assumption is that programs will craft assessment plans to address that which is most important to student learning in their disciplines. It is also expected that during this process programs will find areas that need improvement, address them, and reassess until the program is functioning up to faculty expectations.

What's the difference between a Student Learning Outcome (SLO) and an operational outcome?

There are two kinds of outcomes: student learning outcomes, which focus on what students will be able to do as a result of participating in a program, and operational outcomes (sometimes referred to as program outcomes), which are changes in the program that faculty or staff want to occur as a result of their efforts.

Student learning outcomes typically start with the phrase, "Students will (insert specific action verb such as explain, apply, describe, create, etc.) . . ."

Operational outcomes for academic programs often focus on faculty (related to teaching, research, or service), student or alumni satisfaction, recruitment, retention and graduation rates, fundraising, or facilities. Basically, if the outcome is not specifically focused on student learning, then it is an operational outcome for the program.

What types of supporting documents should I upload into Xitracs?

A program should attach documents related to its assessment methods such as surveys, instruments (tests/questionnaires), rubrics, etc. Supporting documents should also be attached to assessment results, including full reports of assessment results and examples of student artifacts where appropriate. Finally, a program should attach action plans for any criteria that have not been met, which should include specific steps the program plans to take to ensure that a set criterion will be met in the future.

How many examples do I need?

In the assessment process, artifacts are the samples of student work produced in response to faculty-developed assignments. There is no set rule for the number of artifacts that a program should attach to assessment reports. At the very least, there should be an example representing each level of student performance (e.g., Inadequate, Adequate, and Excellent levels of competency). The goal is to allow any potential outside reviewer of a program's assessment report to get a basic understanding of how faculty distinguished between levels of proficiency.

What happens if you keep meeting your criteria?

If a criterion is met for an outcome, then a program has a couple of choices to make. The program may decide that the criterion was set too low and raise it (set the bar higher). Alternatively, the program may be satisfied with meeting the set criterion and move on to measure another outcome in its assessment plan. Ideally, a program should cycle through measuring all of the outcomes in its assessment plan within a 5-7 year period. Once the program has cycled through all of its outcomes, it will be time to return to the beginning and start again to see whether the program is continuing to meet its set criteria.

How do you "close the loop?"

Closing the loop in assessment happens when a program makes changes in response to a criterion that has not been met and then measures the outcome again to see if the changes had any impact on meeting the criterion.

What are some examples of "closing the loop?"

In the last report to SACS, Appalachian provided several examples of closing the loop from each college. The table of examples can be found here (PDF, 86 KB).

Where can I find examples of closing the loop for my discipline?

A simple internet search will likely provide many examples of closing the loop in any discipline. You can also contact colleagues in your discipline at other institutions and ask them to share their assessment reports. There are also numerous conferences on assessment where people present examples of successful assessment efforts.

What should our assessment committee do?

An assessment committee can take on a range of responsibilities. First, assessment committee meetings are the primary venue for communication about assessment. Committee members should relay all assessment-related information back to their programs, such as changes in expectations, feedback on assessment plans and reports, and deadlines. Committee meetings are also an opportunity to ask questions and get advice related to assessment.

Second, assessment committees should review and provide feedback on program assessment plans and reports at least annually. As part of the review process, a committee may make suggestions on whether goals and outcomes are clear, whether criterion levels are set high enough, whether assessment methods fit the outcomes, whether all necessary documentation is available for reference, and whether programs have established action plans for any unmet criteria. Most importantly, assessment committees need to make sure that programs are closing the loop when criteria are not met.

Each assessment committee also has some flexibility in deciding what it will do. For example, a committee may decide to recognize programs that are doing an exceptional job with assessment, or to organize professional development opportunities on assessment.

What's the role of the college assessment committee?

According to the University Academic Assessment Council:

College Assessment Committees (for every college / school / library) will oversee assessment plans of the programs in their college. Their charge is to (a.) ensure that every program offered in the college is on track with assessment; (b.) offer feedback to program coordinators and faculty when needed; and (c.) give progress reports to deans and administrators.

What's the role of the University Academic Assessment Council?

The University Academic Assessment Council was charged to "communicate and improve assessment efforts on campus." (See http://irap.appstate.edu/assessment/university-academic-assessment-council)