Monday, October 24, 2011

Ch 10 - Evaluation in Instructional Design

formative evaluation: designed and intended to support the process of improvement; done by someone who can make the improvements
summative evaluation: the rest of evaluation; done by any observer or decision maker who needs evaluative conclusions for any reason

2 key features:
1- testing focuses on the objectives (criterion-referenced testing): not to sort the learners with grades, but to determine the extent to which each objective is mastered (see the sketch after this list)
2- focus on the learners as the primary source of data for making decisions about the instruction
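A minimal sketch of the criterion-referenced idea in Python (the objective names, item results, and the 80% mastery cutoff are all hypothetical): each objective is reported as mastered or not against a fixed criterion, instead of learners being ranked against each other.

MASTERY_CUTOFF = 0.8  # assumed criterion: 80% of items answered correctly

def mastery_by_objective(item_results):
    # map each objective to a mastered/not-mastered verdict, not a grade
    return {objective: sum(correct) / len(correct) >= MASTERY_CUTOFF
            for objective, correct in item_results.items()}

learner = {
    "define formative evaluation": [True, True, True, False, True],
    "apply the CIPP model": [True, False, False, True, False],
}
print(mastery_by_objective(learner))
# {'define formative evaluation': True, 'apply the CIPP model': False}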


Evaluation: the process of determining the merit (the intrinsic value of the evaluation object, or evaluand), worth (the market value of the evaluand), and value (which involves making value judgments) of things; evaluations are the products of that process. (Scriven)
Logic of evaluation:
1- select the criteria of merit or worth
2- set specific performance standards
3- collect performance data and compare the level of observed performance with the level of required performance
4- make the evaluative judgments
in short: identify criteria of merit and worth, set standards, collect data, and make value judgments (a compare-and-judge sketch follows below)
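The four steps can be read as a simple compare-and-judge loop. A hedged sketch (the criteria names and standard values are invented for illustration):

standards = {"completion rate": 0.90, "post-test mean": 75.0}  # steps 1-2
observed = {"completion rate": 0.93, "post-test mean": 71.5}   # step 3 data

for criterion, required in standards.items():
    met = observed[criterion] >= required                  # step 3: compare
    verdict = "meets standard" if met else "falls short"   # step 4: judge
    print(f"{criterion}: observed {observed[criterion]}, required {required} -> {verdict}")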

CIPP evaluation model - Stufflebeam
CIPP = context, input, process, product
context eval: assessment of the environment to determine the needs and objectives and to identify the environmental factors that will impact the program's success; a needs assessment, used in program planning decisions
input eval: evaluation questions raised about the resources that will be used to develop and conduct the innovation/program; used in program structuring decisions
process eval: (formative eval) examines the ways in which an innovation is being developed, its initial effectiveness, and its effectiveness after revisions; used to make implementation decisions
product eval: focuses on the success of the innovation in producing the desired outcomes; used in summative evaluation decisions

Five-domain evaluation model - Rossi
needs assessment: is there a need for this type of program in this context?
need: a gap between the actual and desired state of affairs
theory assessment: is the program conceptualized in a way that should work?
if it is not based on sound social, psychological, and educational theory, it will not work - theory failure
implementation assessment: was the program implemented properly and according to the program plan? if not operated properly - implementation failure
impact assessment: did this program have an impact on its intended targets?
efficiency assessment: is the program cost-effective? (a cost-per-outcome sketch follows below)
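One common way to operationalize the efficiency question is cost per unit of outcome. A sketch with invented figures (the program names, costs, and outcome counts are hypothetical):

programs = {
    "program A": {"total_cost": 50_000, "learners_at_mastery": 200},
    "program B": {"total_cost": 80_000, "learners_at_mastery": 250},
}
for name, p in programs.items():
    # lower cost per learner reaching mastery = more efficient, all else equal
    print(f"{name}: ${p['total_cost'] / p['learners_at_mastery']:.2f} per learner at mastery")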

Training evaluation model - Kirkpatrick
Level 1: Reaction: assessment of learners' reactions and attitudes toward the learning experience
Level 2: Learning: determining what participants in the learning program learned
what is measured depends on what the training targets:
  knowledge: measured with an achievement test
  skills: measured with a performance test
  attitudes: measured with a questionnaire
Level 3: Behavior (Transfer of Training): determining whether the training program participants change their on-the-job behavior (OJB) as a result of having participated in the training program.
Level 4: Results: finding out whether the training leads to final results

Success Case Method - Brinkerhoff
the way to determine what works is to examine successful cases and compare them to unsuccessful ones
Steps
1- focus and plan the case
2- construct a visual impact model
3- conduct a survey research study to identify the best cases and the worst cases (see the sketch after this list)
4- schedule and conduct in-depth interviews with multiple success cases
5- write-up and communicate the evaluation findings
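Step 3 amounts to picking the extremes from survey scores. A minimal sketch (the respondent IDs, scores, and the sample size n are all hypothetical):

survey = {"resp01": 4.8, "resp02": 1.2, "resp03": 3.9,
          "resp04": 4.5, "resp05": 2.1, "resp06": 1.5}

def extreme_cases(scores, n=2):
    # rank respondents by score, then take the top and bottom n for interviews
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:n], ranked[-n:]

best, worst = extreme_cases(survey)
print("success cases to interview:", best)       # ['resp01', 'resp04']
print("non-success cases to interview:", worst)  # ['resp06', 'resp02']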

Utilization-Focused Evaluation (U-FE) - Patton
evaluation done for, and with, specific intended users for specific intended uses
- rule: an evaluation should be judged by the degree to which it is used
Process use: occurs when clients learn the logic of evaluation and come to appreciate its use in the organization
Steps
1- conduct a readiness assessment
2- identify the primary intended users and develop a working relationship with them
3- conduct a situational analysis
4- identify the primary intended uses
5- focus the evaluation
6- design the evaluation
7- collect, analyze and interpret the evaluation data
8- continually facilitate evaluation use
9- conduct a meta-evaluation - an evaluation of the evaluation
