History of Instructional Media
Instructional media: physical means via which instruction is presented to learners.
- Prior to the twentieth century, the three primary means of instruction were the teacher, the chalkboard, and the textbook.
School museums: portable exhibits, stereographs, slides, films, and charts; not intended to supplant the teacher.
The visual instruction movement and instructional films: lantern slide projectors and stereograph viewers. Thomas Edison predicted that motion pictures would replace textbooks. The movement grew from 1914 to 1923.
The audiovisual instruction movement and instructional radio: radio broadcasting, sound motion pictures, and sound recordings - 1920s and 1930s.
Radio attracted great attention.
World War II: the instructional media movement in schools slowed, but large numbers of training films and filmstrips were used in the military, and overhead projectors were produced.
Instructional television: the most important factor affecting the audiovisual movement was the growing interest in television. Interest increased in the 1950s, funded largely by the Ford Foundation; in the 1960s, interest declined.
It was realized that instructional television had little practical impact. Reasons include:
- teacher resistance to change: changes were imposed top-down by school administrators with little or no input from teachers
- the quality of the TV programs
- the expense of installing and maintaining TV systems in schools, and the failure to provide teachers with adequate guidance
Computers for instructional purposes: in the 1950s, work on computer-assisted instruction (CAI) began at IBM; it had little impact on education.
In 1984, Papert predicted that the computer would be a catalyst of very deep and radical change in the educational system.
Recent developments: much training in companies is delivered via instructional media.
Social media are used in higher education for instructional purposes:
- students read and create blogs and wikis, view online videos, and listen to podcasts
- growing use of distance learning in higher education
The availability of technology increased during the first decade of the 21st century.
Whereas technology in the 1990s was used mainly for drill and practice, in the 21st century technology is used to:
- solve problems, analyze data, perform calculations, develop multimedia presentations, and create art, music, movies, webcasts, graphics, and visual displays
- instructional media also play a major role in US military training
Reasons for the increased use of instructional technology in education:
- low cost
- easy accessibility of computers
- increased interactive capabilities - Moore (1989) describes three types of interaction:
1- between learners and instructional content
2- between learners and the instructor
3- among learners themselves
Today, via chat rooms, email, and bulletin boards, learners can interact with the instructor and with one another, as well as with the content.
- use of social media lets learners share information and acquire new skills and knowledge
History of Instructional Design
Origins: World War II
The Programmed Instruction Movement: mid-1950s through the 1960s. Based on Skinner's ideas about the requirements for improving human learning and the desired characteristics of effective instructional materials: programmed instructional materials should present instruction in small steps, require active responses to frequent questions, provide immediate feedback, and allow for learner self-pacing.
Popularization of Behavioral Objectives: identifying specific objectives learners would be expected to attain.
Bloom's Taxonomy of Educational Objectives (1956): within the cognitive domain there are various types of learning outcomes, and there is a hierarchical relationship among the various types of outcomes.
The Criterion-Referenced Testing Movement: could be used
- to assess students' entry-level behavior
- to determine the extent to which students had acquired the behaviors an instructional program was designed to teach.
Robert M. Gagne: Domains of Learning, Events of Instruction, and Hierarchical Analysis:
1965 - The Conditions of Learning, by Gagne: describes five domains of learning outcomes: verbal information, intellectual skills, psychomotor skills, attitudes, and cognitive strategies.
Sputnik: The Indirect Launching of Formative Evaluation: the Soviet space satellite launched in 1957 had indirect effects on instructional design.
formative evaluation: trying out drafts of instructional materials with learners before the materials are put in final form, and revising them accordingly
summative evaluation: testing the instructional materials after they are in the final form.
Early Instructional Design Models:
concepts developed: task analysis, objective specification, criterion-referenced testing
terms: instructional design, instructional development, systematic instruction, instructional system.
1970s: Burgeoning of Interest in the Systems Approach:
Interest in personal computers in the 1980s had a major effect on instructional design.
1990s: Recognizing the Importance of Performance:
Interest in constructivist views of teaching and learning had a major influence on instructional design:
authentic learning tasks that reflect the complexity of the real-world environment in which learners will use the skills they are learning
21st Century: e-Learning and Informal Learning: the Internet as a means of presenting instruction to learners
Informal learning methods as a means of improving learning and performance in the workplace.
Social media used to share knowledge and skills is an example of such informal methods.
Tuesday, December 27, 2011
Ch 2 - Characteristics of Instructional Design Models
A system of procedures for developing education and training programs.
1- Early ID models were based on behaviorism.
2- General systems theory (GST) became another fundamental tenet of ID.
Nine characteristics of GST: Systematic: adopting rules and procedures as a way to move through a process.
Systemic: application of creative problem-solving methods.
Responsive: accepting whatever goals are established
Interdependence: all elements within a system are connected and depend on each other.
Redundancy: duplicate processes and procedures to protect against failure.
Dynamic: system can adjust to changing conditions.
Cybernetic: elements communicate among themselves
Synergistic: together, the elements achieve more than any one element can alone.
Creativity: use of special human talents
ADDIE: based on a systematic product-development process.
Analyze: needs assessment, identifying a problem and stating a goal.
Design: writing objectives, specifying learning activities and media.
Develop: preparing student and instructor materials.
Implement: delivering the instruction in the setting for which it was designed.
Evaluate: both formative evaluation (collecting data to identify needed revisions) and summative evaluation (collecting data to assess the overall effectiveness of the instruction), plus revision (making necessary changes based on the formative evaluation data).
In ADDIE, one can move back and forth among the phases; it does not have to proceed step by step.
ADDIE is iterative and self-correcting (see the sketch below).
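As an illustration only (not from the book): a minimal Python sketch that lists the ADDIE phases with their key activities and loops through them, with an invented pass count and stop rule standing in for "revise until formative data say no more changes are needed."

```python
# Minimal sketch of ADDIE as an iterative, self-correcting loop.
# Illustration only: phase activities paraphrase the notes above;
# the pass count and stop rule are hypothetical.

ADDIE = {
    "Analyze":   "needs assessment; identify the problem; state a goal",
    "Design":    "write objectives; specify learning activities and media",
    "Develop":   "prepare student and instructor materials",
    "Implement": "deliver the instruction in the setting for which it was designed",
    "Evaluate":  "formative evaluation -> revise; summative evaluation at the end",
}

def run_addie(max_passes: int = 2) -> None:
    """Walk the phases repeatedly; each pass revises the previous one."""
    for pass_number in range(1, max_passes + 1):
        for phase, activities in ADDIE.items():
            print(f"Pass {pass_number} - {phase}: {activities}")
        if pass_number < max_passes:
            print("Formative data suggest revisions; looping back.\n")

run_addie()
```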
Characteristics of ID:
1- student centered: learners can be given the opportunity to select their own learning objectives
2- goal oriented
3- focuses on meaningful performance: rather than having learners simply recall information, ID prepares learners to perform complex behaviors and solve authentic problems
4- assumes outcomes can be measured in a valid way: in contrast to a paper-and-pencil test, for example, an observer might rate a learner's performance against a checklist
5- empirical, iterative, and self-correcting: data are at the heart of ID; the process is not strictly linear or sequential
6- team effort
Pebble-in-the-Pond Approach (Merrill)
For whole problems or tasks.
An alternative to traditional ID.
Traditional ID starts with some abstract representation of the content, and work with the actual content is delayed until the development phase of the ID process.
1- casting in the pebble: identify the problem
2- identify the progression of such problems
3- identify the component knowledge and skills required
4- determine the instructional strategy
5- interface design
4C/ID model (van Merriënboer and Kirschner):
a ten-step approach:
- specifying a series of learning tasks
- learners first perform a simpler version of the whole skill and gradually move on to more complex versions
Wednesday, December 21, 2011
Ch 1- What field did you say you were in?
Instructional Technology definitions:
- 1963 definition: "the design and use of messages which control the learning process"
- 1970 definition: 1- the media born of the communications revolution which can be used for instructional purposes...
2- a systematic way of designing, carrying out, and evaluating the whole process of learning and teaching in terms of specific objectives...
1977 definition: educational technology is a complex, integrated process involving people, procedures, ideas, devices and organization, for analyzing problems and devising, implementing, evaluating and managing solutions to those problems involving all aspects of human learning.
1994 definition (moving beyond viewing instructional technology as a process): the theory and practice of design, development, utilization, management, and evaluation of processes and resources for learning.
- Early definitions focused on the instructional media produced by professionals.
Definition by AECT: educational technology is the study and ethical practice of facilitating learning and improving performance by creating, using, and managing appropriate technological processes and resources.
- "ethical practice" emphasizes that those in the field should maintain a high level of professional conduct.
Book's definition:
"instructional design and technology encompasses analysis of the learning and performance problems, and the design, development, implementation, evaluation and management of instructional and non-instructional processes and resources intended to improve learning and performance in a variety of settings."
- the goal of the field has changed over the years; the most recent goal is to improve learning and performance
Monday, October 24, 2011
Ch 11 - An Introduction to Return on Investment
Levels of Data
- Level 0: input
- Level 1: reaction and perceived value
- Level 2: learning and confidence
- Level 3: Application and Implementation
- Level 4: Impact and Consequences
- Level 5: Return on Investment (ROI): compares the monetary benefits of the impact measures with the project costs; usually expressed as a benefit-cost ratio (BCR), an ROI percentage, or a payback period
ROI Process Model
Data collection:
during project implementation: surveys, questionnaires, tests, observations
after project implementation: surveys, questionnaires, observations, interviews, focus groups, action plans, performance contracts, business performance monitoring
Data Analysis:
isolation of project effects: control groups, trend line analysis, forecasting models, participant estimates, manager estimates, senior management estimates, expert input, customer input
data conversion: use of standard values, output data, cost of quality, time savings converted to wage and employee benefits, analysis of historical costs, use of internal and external experts, search of external databases, use of participant estimates, soft measures mathematically linked to hard measures
project costs: initial analysis costs; costs to design and develop; cost of all project materials; costs for the project team; cost of the facilities for the project; travel, lodging, and meal costs; participant salaries; admin and overhead costs; evaluation costs
return on investment calculation (a worked example follows below):
benefit-cost ratio: BCR = project benefits / project costs
ROI (%) = (net project benefits / project costs) * 100, where net project benefits = project benefits - project costs
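To make the formulas concrete, here is a minimal Python sketch; the cost and benefit figures are invented for illustration and assume the benefits have already been isolated and converted to money as described under data analysis.

```python
# Minimal sketch of the Ch 11 ROI formulas.
# The dollar amounts are hypothetical, not data from the chapter.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = project benefits / project costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%) = (net project benefits / project costs) * 100."""
    return (benefits - costs) / costs * 100

# Hypothetical fully loaded project costs (analysis, design/development,
# materials, project team, facilities, travel, salaries, admin, evaluation).
costs = 80_000.0
# Hypothetical monetary benefits after isolating the project's effects.
benefits = 240_000.0

print(f"BCR = {benefit_cost_ratio(benefits, costs):.2f}")  # 3.00
print(f"ROI = {roi_percent(benefits, costs):.0f}%")        # 200%
```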
Ch 10 - Evaluation in Instructional Design
formative evaluation: designed and intended to support the process of improvement; done by someone who can make improvements
summative evaluation: all other evaluation; done for any observer or decision maker who needs evaluative conclusions for any reason
2 key features:
1- testing focused on objectives (criterion-referenced testing): not to sort learners by grades, but to determine the extent to which each objective has been mastered
2- focus on the learners as the primary source of data for making decisions about instruction
Evaluation: the process of determining the merit (the intrinsic value of the evaluation object, or evaluand), worth (the market value of the evaluand), and value (which involves making value judgments) of things; evaluations are the products of that process. (Scriven)
Logic of evaluation: 1 - select the criteria of merit or worth
2- set specific performance standards
3- collect performance data and compare the level of observed performance with the level of required performance
4- make the evaluative judgments
-- in short: identifying criteria of merit and worth, setting standards, collecting data, and making value judgments (a small sketch of this logic follows below)
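A minimal Python sketch of the four steps; the criteria, standards, and observed values are invented for the example, not taken from the chapter.

```python
# Minimal sketch of the four-step logic of evaluation.
# All criteria, standards, and observed values below are hypothetical.

# 1- Criteria of merit/worth paired with 2- specific performance standards.
standards = {
    "learner satisfaction (1-5)": 4.0,
    "post-test score (%)": 80.0,
    "on-the-job application (%)": 60.0,
}

# 3- Collected performance data (observed values).
observed = {
    "learner satisfaction (1-5)": 4.3,
    "post-test score (%)": 74.0,
    "on-the-job application (%)": 65.0,
}

# 3/4- Compare observed with required performance and make a judgment.
for criterion, required in standards.items():
    actual = observed[criterion]
    verdict = "meets standard" if actual >= required else "below standard"
    print(f"{criterion}: observed {actual} vs required {required} -> {verdict}")
```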
CIPP evaluation model - Stufflebeam
CIPP = context, input, process, product
context eval: assessment of the environment to determine needs and objectives and to identify the environmental factors that will affect success; essentially a needs assessment, used in program planning decisions
input eval: questions about the resources that will be used to develop and conduct the innovation/program; used in program structuring decisions
process eval (formative eval): examines the ways in which the innovation is being developed, its initial effectiveness, and its effectiveness after revisions; used to make implementation decisions
product eval: focuses on the success of the innovation in producing the desired outcomes; used in summative evaluation decisions
Five-domain evaluation model - Rossi
needs assessment: is there a need for this type of program in this context?
need: a gap between the actual and desired state of affairs
theory assessment: is the program conceptualized in a way that should work?
if the program is not based on sound social, psychological, and educational theory, it will not work - theory failure
implementation assessment: was this program implemented properly and according to the program plan? - if not operated properly, implementation failure
impact assessment: did this program have an impact on its intended targets?
efficiency assessment: is the program cost effective?
Training evaluation model - Kirkpatrick
Level 1: Reaction: assessment of learners' reactions and attitudes toward the learning experience
Level 2: Learning: determining what participants in the learning program learned
- knowledge-based training events: measured with an achievement test
- skills: measured with a performance test
- attitudes: measured with a questionnaire
Level 3: Behavior (Transfer of Training): determining whether the training program participants change their on-the-job behavior (OJB) as a result of having participated in the training program.
Level 4: Results: determining whether the training leads to final results.
Success Case Method - Brinkerhoff
to determine what works, examine successful cases and compare them with unsuccessful cases
Steps
1- focus and plan the case
2- construct a visual impact model
3- conduct a survey research study to identify the best cases and the worst cases
4- schedule and conduct in-depth interviews with multiple success cases
5- write up and communicate the evaluation findings
Utilization-Focused Evaluation (U-FE) - Patton
evaluation done for and with specific intended users, for specific intended uses
- rule: an evaluation should be judged by the degree to which it is used
Process use: occurs when clients learn the logic of evaluation and come to appreciate its use in the organization
Steps
1- conduct a readiness assessment
2- identify the primary intended users and develop a working relationship with them
3- conduct a situational analysis
4- identify the primary intended uses
5- focus the evaluation
6- design the evaluation
7- collect, analyze and interpret the evaluation data
8- continually facilitate evaluation use
9- conduct a meta-evaluation (an evaluation of the evaluation)