IDE 656 – Computers as Critical Thinking Tools

Lesson Plan Format - Part One

Project Number: 3
Author: Erin Cunia
Lesson Title: Preparing for the Comprehensive Examination
Tool: Expert System, Dynamic Modeling Mindtool

Overview of Lesson:
The purpose of the Comprehensive Examination is to test the student's ability to synthesize and apply knowledge of the content from the core courses. The general approach is to present a common context and problem to be addressed, along with a set of criteria that reflect integration of most, if not all, of the core course skills and knowledge in an applied exercise. During the three-hour examination, students are expected to show mastery of material covered in the core courses through application to new contexts or content, through integration across areas, or through summarization and critique of alternative positions.

Developing an expert system in preparation for the exam provides a framework for studying for and taking the exam.
Learning Outcomes: (What will student learn during this lesson?)
The Gagné and Briggs format: Situation, Learned Capability, Object, Action, Tools and Other Constraints.

  • Given a test scenario containing a complex instructional problem, design a response plan that illustrates knowledge of ISD principles and practices, as well as the ability to apply those to the given problem.
  • Build an expert system / knowledge base that describes your response process; the expert system should provide advice that reaches different conclusions from different premises (making it applicable to other scenarios).
  • Construct a rule base that represents a decision process or processes with
    • At least three different solutions (conclusions)
    • At least fifteen (15) "IF THEN" rules
    • Relevant factors (questions) identified
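A rule base of this shape can be sketched as data plus a simple matching loop. The following Python sketch is only an illustration; the factors (audience, content), rules, and conclusions are hypothetical examples, not part of the lesson plan, and a real submission would need at least fifteen rules and three conclusions.

```python
# Hypothetical IF-THEN rule base for an instructional-strategy scenario.
# Each rule: IF all premises match the collected facts THEN give the advice.
RULES = [
    {"if": {"audience": "novice", "content": "procedural"},
     "then": "Use worked examples with guided practice."},
    {"if": {"audience": "expert", "content": "procedural"},
     "then": "Use problem-based practice with minimal guidance."},
    {"if": {"content": "conceptual"},
     "then": "Use concept maps and comparative examples."},
]

def conclude(facts, rules=RULES):
    """Return the advice of the first rule whose premises all hold."""
    for rule in rules:
        if all(facts.get(key) == value for key, value in rule["if"].items()):
            return rule["then"]
    return "No applicable rule; refine the rule base."

# Different premises lead to different conclusions, as the outcome requires.
print(conclude({"audience": "novice", "content": "procedural"}))
print(conclude({"content": "conceptual"}))
```

In a student's actual project the same structure would be built in whatever expert system shell or presentation tool is used; the point is that each factor (question) narrows the facts until one conclusion fires.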

 

Intended Audience:
Graduate-level students with basic computer skills and a working knowledge of HTML (or HTML editing software), presentation software (e.g., PowerPoint), or multimedia software (e.g., Flash).
Key content concepts:

  • Demonstrate mastery of the core materials.
  • Use the expert system as a study tool for the comprehensive exam.
 

Rationale for computer tool chosen:
An Expert System is a Dynamic Modeling Mindtool. Preparing for the comprehensive exam involves solving the problem posed by a particular scenario and finding alternatives that encompass the core instructional design elements learned throughout the graduate program. Having graduate students design an expert system for a given scenario evokes critical thinking and demonstrates mastery of the instructional design material.

Jonassen (Computers as Mindtools for Schools, 2000) reviews Wideman and Owston (1991), who found that expert system development is most useful for learners with higher abstract reasoning ability. This ability is essential to completing the comprehensive exam. Using an expert system as a preparation tool primes the learner for this level of critical thinking and processing.

Grabinger, Wilson, and Jonassen (Building Expert Systems in Training and Education, 1990) reviewed research by Renate Lippert (1988) that found when students develop their own small-scale expert systems, "they learn the content in a deeper, more comprehensive way."
Materials and Technology required for lesson:
Necessary materials and technology include a computer; a presentation application in which to visually create the expert system; the supplemental materials (sample scenario and response plan outline); and any course materials needed as references for the relevant concepts.
Evaluation Criteria: (Linked to learning objectives)

Grading Rubric (each criterion scored 1 = Beginning, 2 = Developing, 3 = Accomplished, 4 = Exemplary)

Criterion: Knowledge base simulates meaningful thinking
  1 Beginning: Simulates implausible or inaccurate thinking; poorly models activity; represents associative thinking
  2 Developing: Inconsistently represents thinking
  3 Accomplished: Consistently simulates thinking; anomalies occur
  4 Exemplary: Simulates coherent thinking; models meaningful activity

Criterion: Sensitivity of factors (questions); do they ask the right questions?
  1 Beginning: Factors (questions) don't discriminate solutions at all
  2 Developing: Factors not clear; collect overlapping or redundant information
  3 Accomplished: Factors elicit most of the important information; some overlap in information
  4 Exemplary: Factors ask important questions; each factor elicits different information

Criterion: Decisions represent meaningful advice or conclusions
  1 Beginning: Provides vague, implausible advice with no explanations or reasoning; does not enhance understanding
  2 Developing: Advice is inconsistent; misses important elements or conclusions
  3 Accomplished: Advice is useful but missing a few important elements; misses some reasoning
  4 Exemplary: Advice is always plausible; explains reasoning; enhances user understanding

Criterion: Completeness and logical structure of rule base
  1 Beginning: Frequently provides no advice; often fails to provide appropriate advice
  2 Developing: Occasionally provides no advice but usually predicts or infers appropriate advice
  3 Accomplished: Links describe general nature of relationship of related site
  4 Exemplary: Accurately predicts/infers/explains correct values always

Total Score (of a possible 16)
Lesson Plan Format - Part Two  
   
Event of Instruction — Lesson — Rationale

1. Gaining Attention
   Lesson: Distribute the lesson outline.
   Rationale: Giving the scope of the lesson provides background and an introduction.

2. Informing the Learner of the Objectives
   Lesson: Discuss the learning outcomes.
   Rationale: Makes learners aware of expectations.

3. Recall of Prior Learning
   Lesson: Discuss the core courses and how they fit into the analysis, design, development, implementation, and evaluation (ADDIE) model.
   Rationale: Prior knowledge assists the learner in focusing on the new information.

4. Presenting the Stimulus
   Lesson: Distribute the sample scenario and response plan outline.
   Rationale: Presents new information for the learner to build on.

5. Providing Learner Guidance
   Lesson: Present the demo expert system.
   Rationale: Gives an example of what is expected from the learner.

6. Eliciting Performance
   Lesson: (From Jonassen, p. 125) Instruct the learner to (1) make a plan, (2) identify the purpose for building the expert system, (3) specify the problem solutions or decisions, (4) isolate problem attributes, factors, or variables, and (5) generate rules and examples.
   Rationale: Discovery-based, independent learning prompts critical thinking.

7. Giving Feedback
   Lesson: Provide the rubric for the completed expert system knowledge base. Have the learner (6) refine the logic and efficiency of decision making, (7) test the system, and (8) reflect on the activity.
   Rationale: Clear, regular feedback enhances learning.

8. Assessing Performance
   Lesson: Instruct the learner to create the expert system. Use the grading rubric to assess performance.
   Rationale: A means of testing learner outcomes.

9. Enhancing Retention and Transfer
   Lesson: Encourage continued use of the expert system to prepare for the comprehensive exam.
   Rationale: Aids retention and assists the learner in organizing content throughout the graduate program.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of Instructional Design. Belmont, CA: Wadsworth/Thomson Learning.