Justify

= VI. Evaluation - Analyze - Justify - Resources =

[[media type="youtube" key="BZ3USs16J3Y" height="505" width="640" align="center"]]

In this video, Mr. Bill Moore explains why assessment is necessary for effective instruction and challenges teachers to think of "assessment as learning" (Center for Instructional Innovation, 2010).


* **Justify evaluations to clients**

**Assessment vs. Testing**
Faculty often assume that merely delivering quizzes, tests, and other assessments means they have "assessed" their students. While tests are a tool used for "collecting information about student learning, not all tests should be counted as assessment" (Educational Assessment, 2010). Often a test is given merely to fill the need to assign grades to the students. Such tests rarely offer any information that helps to identify areas of weakness that need to be improved.

If we are going to justify our educational decisions as instructors, we need not only to assess what was learned, but also how it was learned and what resulted from the learning. To make an assessment that will lead to improved student learning, we can use a variety of instruments such as "observations, surveys, interviews, performance tasks, and portfolios. Thus, assessment is a comprehensive concept, centering its endeavors on student learning, and serving the purpose of student improvement and development through a variety of ways" (Educational Assessment, 2010).

Assessments, if used properly, should benefit both students and the faculty. The assessment data "generates information to support and justify faculty's educational decisions, and it helps document evidence of teaching effectiveness" (Stony Brook University, 2010). If schools and faculty are to gain the most from their assessments, they need to "acquire ownership of their assessment" (Stony Brook University, 2010). The only way that an assessment can lead to change is if the "faculty initiate, implement, sustain, and use their own assessments to address questions that really matter to them" (Educational Assessment, 2010).

"Assessment is a means to the end, not the end itself. Assessment is of little value unless the assessment loop is closed by making use of the assessment results" (Educational Assessment, 2010).


**Overview of Methods to Collect Information**

The following table provides an overview of the major methods used for collecting data during evaluations.

||= **Method** ||= **Overall Purpose** ||= **Advantages** ||= **Challenges** ||
||= questionnaires, surveys, checklists ||= Need to quickly and/or easily get lots of information from people in a non threatening way ||< * can complete anonymously
* inexpensive to administer
* easy to compare and analyze
* administer to many people
* can get lots of data
* many sample questionnaires already exist ||< * might not get careful feedback
* wording can bias client's responses
* are impersonal
* in surveys, may need sampling expert
* doesn't get full story ||
||= interviews ||= Want to fully understand someone's impressions or experiences, or learn more about their answers to questionnaires ||< * get full range and depth of information
* develops relationship with client
* can be flexible with client ||< * can take much time
* can be hard to analyze and compare
* can be costly
* interviewer can bias client's responses ||
||= documentation review ||= Want impression of how program operates without interrupting the program; is from review of applications, finances, memos, minutes, etc. ||< * get comprehensive and historical information
* doesn't interrupt program or client's routine in program
* information already exists
* few biases about information ||< * often takes much time
* info may be incomplete
* need to be quite clear about what looking for
* not flexible means to get data; data restricted to what already exists ||
||= observation ||= To gather accurate information about how a program actually operates, particularly about processes ||< * view operations of a program as they are actually occurring
* can adapt to events as they occur ||< * can be difficult to interpret seen behaviors
* can be complex to categorize observations
* can influence behaviors of program participants
* can be expensive ||
||= focus groups ||= To explore a topic in depth through group discussion, e.g., about reactions to an experience or suggestion, understanding common complaints, etc.; useful in evaluation and marketing ||< * quickly and reliably get common impressions
* can be efficient way to get much range and depth of information in short time
* can convey key information about programs ||< * can be hard to analyze responses
* need good facilitator for safety and closure
* difficult to schedule 6-8 people together ||
||= case studies ||= To fully understand or depict client's experiences in a program, and conduct comprehensive examination through cross comparison of cases ||< * fully depicts client's experience in program input, process and results
* powerful means to portray program to outsiders ||< * usually quite time consuming to collect, organize and describe
* represents depth of information, rather than breadth ||
