Thursday, February 28, 2013

Reflection



This week’s presentations and readings clearly articulate the benefits of incorporating reflection into instruction.  The age-old act of reflecting on what we have learned, are learning, have completed, or have yet to complete allows for better transfer of knowledge.  And, as Davis (2003) highlights, certain reflective techniques yield enhanced learning.  Specifically, generic prompts allow learners to make more meaningful connections as individuals, compared to directed prompts that may not fit a learner’s existing schema.  Directed prompts may result in less efficient and effective transfer, or possibly even confusion and a lack of learning.

There is a wide array of recommended methods for implementing reflection prompts in instruction, as discussed in the readings as well as Dr. Glazewski’s webcast.  Similar to visual design principles, I understood this week’s lesson to indicate that while there are best practices and preferred methods, there may not be any single, 100% correct method for instituting reflective prompts.  Rather, it is situation-based.

This is where I get philosophical, and perhaps a touch melancholy about the state of today’s education (and world in general):

The world today is one of constantly competing resources, looming deadlines and non-stop rushing.  Reflection seems to be a skill that is slowly eroding in our social fabric.  As information is available at increasingly blinding speed and we are called to produce results all the time, pausing to reflect seems to be happening less and less.  At least I know I am guilty of this, and my observations seem to indicate I am not alone. 

The same seems to be true in our educational systems.  Schools must complete delivery of curriculum in a set amount of time.  Class sizes are growing.  So, I must ask: is reflection being sacrificed in our schools too?  I am not a professional educator.  I feel I do a decent job of monitoring my child’s education, but I am largely unaware of the actual classroom environment and instructional techniques.  Is reflection mandated?  Is it used?  Davis (2003) describes how “one size fits all” techniques for reflection prompts are not as effective as ones tailored to the individual.  Are class sizes and competing priorities resulting in ineffective or suboptimal reflective prompts?  Whether for metacognitive, observational, or other purposes, the benefits of reflection are clear, and I hope they are not being brushed aside in schools today.  To reiterate, I am not an educator, and I am not asking this with an agenda.  I am genuinely curious, and hopeful that some of the individuals in the class with a more formal educational background can shed some light on this for me.

As a side note, one of the most tangible learning outcomes for me as an IST Master’s student has been the introduction to the amazing array of tools available for designing instruction.  It is nice to learn these tactical tools along with the theory behind them.  Dr. Leftwich’s presentation on the tools available online, including Screencast-O-Matic, Wordle, etc. (and that is just this week’s menu of recommended tools), is extremely helpful!  Dr. Bonk provided a variety of tools in P540 last semester as well.  Having an understanding of these capabilities will be extremely useful both in the IST program and in professional settings after the degree.  The availability of tools such as these in the hands of educators can maximize the benefits of reflection, and may help to overcome the adverse conditions I bemoan above.

Friday, February 22, 2013

Impact of Instructor Experience in OLEs


Open Learning Environments (OLEs) are dynamic, effective methods of learning outside of traditional educational approaches.  Hannafin, Land, and Oliver (1999) clearly delineate many of the benefits of implementing OLEs in certain situations.  The authors also allude to situations where OLEs may not be feasible, mostly due to time constraints or standardization needs.  While the article discussed many of the strengths and weaknesses of these two approaches, as well as the techniques to be used by the instructor, it left me wondering if there are any prescriptions or recommendations as to the ideal background of the instructor.

Does an OLE require a more experienced teacher/instructor?  OLEs, especially those within the externally induced or individually generated Enabling Contexts (Hannafin et al., 1999), can meander down an infinite number of potential solution paths as the learner pursues non-prescribed solutions.  This is due to the often realistic/authentic context of the learning, which can delve into topics that may not have been foreseen.  Does this require an instructor with more real-world experience (compared to an academic background)?  As the learner makes meaning and comes to solutions, it is possible that no list of correct answers exists.  Does this mean that an OLE instructor needs to be more adaptable within the subject matter, and more aware of minute details that may be outside the scope of standardized instruction?

Additionally, an instructor/teacher/trainer with significant experience may be necessary to evaluate to what degree learning has occurred, both formatively and summatively.  A student may not have learned strict theory, but may have developed an acute ability to apply theory to solving real-world problems.  But, because OLEs often lack the static evaluative resources available to teachers in traditional classrooms, such as quizzes and tests, I wonder if the burden of proving knowledge transfer requires a SME who is intimately familiar with what might make an individual proficient in a given field.

I thought the authors’ description of the challenges of OLEs was interesting, and I wonder if instructor background might be another limitation to this otherwise excellent instructional method.

Thursday, February 14, 2013

Sea-Stories


This week in R541, we are learning about the use of Storytelling/Humor/Cases in the context of Instructional Design.  As I was going through this week’s readings, a recurring thought came to mind: sea-stories.  Being in the Coast Guard- which is the nation’s oldest maritime service- I have heard (and probably told) my fair share of sea-stories.  Looking back on these, they really connect each of the three topics headlining this week’s readings.  A good sea-story captures and uses elements of storytelling, humor, and a case/event, and can often be found in many USCG training environments- whether by design or instructor whim.  As Dr. Glazewski pointed out in the screencast, a case is essentially a story with an instructional purpose.  A sea-story can be just a story, or it can be angled to convey a lesson learned in a formal instructional or other learning setting.

Here is perhaps a good (and short) example of a sea-story from my own background:

As a Boarding Officer on a Coast Guard Cutter in the Bering Sea, I was tasked with leading a boarding/inspection of a fishing boat.  Seas were about 6-8 feet, and it was freezing out.  The small-boat carried us (the Boarding Team of 4) from the cutter over to the fishing boat.  As our small-boat and the fishing boat pitched in the seas, we each precariously jumped over the gunnel (the raised lip around the fishing boat – like a half-wall surrounding the deck).  One of the guys on the team mis-timed the jump, and the small-boat dropped out from under him.  He barely grabbed the edge of the gunnel, and was left dangling dangerously over the freezing water.  He was able to pull himself up, with the help of the Boarding Team members who had already made it over.  Close call.

There are many similarities between this example and the technician examples discussed in Jonassen and Hernandez-Serrano (2002).  It relays problem solving, develops self-efficacy through vicarious experience, enhances memory through connecting to existing schemata, and builds organizational culture.  Not every member of the crew saw this happen, and the story was shared around the cutter over the next few hours.  As it was told, it relayed a tale of caution, but with the eventual positive ending in that he made it aboard the fishing boat safely.  It was told in a certain “salty” manner that may not be appropriate for an academic forum, conveying some humor.  And it was an excellent case/example for both the technique of timing the jump and why physical fitness is vital for the Boarding Team.

Stories have real-world context that pertains to the situations in which learners will apply their new knowledge.  But in rigorous academic learning, I believe that stories may not be the most efficient or effective way to understand the full scope of details of complex subject matter; rather, they may be better suited as attention-getters or as examples to clarify and contextualize the information.  As Jonassen and Hernandez-Serrano (2002) stated: “Rather than generating scientific and rigorously rational explanations of phenomena, the new concern of these disciplines now seems to be with ‘meaning-making’ (Bruner, 1990).”

Ertmer, Quinn and Glazewski outline a wide array of benefits to case-based learning, with the crux of the article positing that budding designers have much to learn from the experiences of seasoned professionals in the IST field.  This is true in almost every profession; we can all benefit from learning from our “elders” to some degree.  It is also true that learning is a life-long process, and even seasoned professionals can learn from a unique experience encountered by a younger or less-experienced individual, and can certainly benefit from their own new experiences, adding to their own Case Based Reasoning ability. 

The “reflective mind-set” that Ertmer, Quinn and Glazewski proffer is a skill that can be used in almost any situation, by any individual.  I view this point as critical.  Much like evaluation, which is present in almost everything we do (ex. smell the milk to evaluate if it went bad, or evaluate the success of a Fortune 500 company’s new product), the reflective mindset allows us to learn from the success and failure of others  (ex. Timmy didn’t turn off the power before changing a light fixture, and got badly shocked, so maybe I should turn it off when I do the same task).   Open-mindedness, adaptability, and the ability to question your own views allow for an increased likelihood of success in facing the complexities of the real world.

In addition to enhancing learner comprehension of a complex topic, storytelling and case learning both allow for sharing vital safety-related information and best practices.  In dangerous or other high-risk (financial, relationship, etc.) tasks, learning from examples provides vital decision-making resources to novices.  As Jonassen and Hernandez-Serrano (2002) state:

Given the lack of previous experiences by novices, experiences available through a case library are expected to augment their repertoire of experiences by connecting with those they have experienced. Their prior experiences serve as a basis for interpreting current and future stories, forewarning us of potential problems, realizing what to avoid, and foreseeing the consequences of our decisions or actions.

In summary, this week’s readings were enlightening in that they portrayed common storytelling, which we as humans engage in on a daily or even hourly basis, as a tremendous resource which can be woven into the fabric of instruction to enhance the richness of the learning experience.  These examples can be simple stories or more focused cases, and can be used in a wide array of ways.  Personally, I am a learner who finds examples very helpful.  I think that I had previously understood the role of storytelling and cases, but perhaps not in the formal way I do now.  Each of the three recommended uses of stories (examples, problems to be solved, advice) provided by Jonassen and Hernandez-Serrano (2002) is significantly beneficial in fulfilling Merrill’s First Principles of Instruction, especially in achieving the level of Integration.

Problem solving through cases and examples brings context to the abstract.

Friday, February 8, 2013

Evaluation in Instructional Design


This week’s R541 topic, evaluation, ties together two of the core courses of the IST program, as well as a few of the principles of the IST field, according to the ADDIE model.  It reinforces the interplay of Analyze, Design, Develop, Implement, and Evaluate; these steps can occur both independently and in conjunction with one another.

In R561, the course text, Fitzpatrick (2011), provided many of the insights below on evaluation as a field, process, and product.  Evaluation is the step in which the entire instructional process and product- as well as the individual steps- are reviewed to ensure they are effective and efficient, and it provides decision makers with the information necessary to make decisions.  Evaluation is quite a complex field, and according to the professionals within it, its principles and approaches are ubiquitous in all aspects of human life.  When we smell milk close to its expiration date, we are evaluating whether it is safe to drink.  Evaluation exists on a spectrum from formative to summative.  Formative evaluation helps provide correction and validation to the process, while summative evaluation focuses on making more ultimate decisions concerning continuing or ending a program.

It is interesting that evaluations built into IST’s instructional products not only provide learners with feedback on their success or failure in learning or developing a new behavior; they also provide feedback on whether the instruction itself is successful in meeting its intended goals or delivering the intended information.

One core concept of evaluation, at its very roots, is the concurrence of the stakeholders, clients, evaluators, etc. on the commonly accepted values against which the evaluated item will be compared.  This concept is present in this week’s Thorndike (1997) reading.  While it may be easier to determine a commonly accepted performance standard for achievement tests, other tests or subjects may be more nebulous.  Thorndike specifically mentions aptitude tests and personality or interest tests as examples of topics for which it is more difficult to create a high level of concurrence and validity.  This is because these topics are extremely broad in nature, with an infinite number of indicators of behavior that can be classified as intelligent (aptitude) or as indicative of a personality (do all Intuitive people display behavior X?).  But on some level, both the evaluator and the eventual user of the evaluation must agree on the validity of the evaluative standards, or the product of the evaluation will be generally useless, or perhaps worse, misused.

I recall from R511 last spring that one example of the way in which the ADDIE steps can be re-ordered is to conduct the analysis (A) to determine the instructional need, then directly develop the evaluation (E) to ensure the standards to which students will be held/tested both directly meet the results of the analysis, and are formulated into the instructional goals which serve as the basis for the design and development (D, D).  Implementation (I) follows, with recurring evaluation to ensure that the instructional product meets the need, and to ensure that the need itself is still accurate.

Friday, February 1, 2013

Motivation in Instructional Settings


As I read this week’s readings and watched the video lesson, one thought kept running through my head.  Learners (and people in general) seem to live by an “Economy of Effort” generalization, applying their time and energy to tasks or other functions which they believe will benefit them, interest them, or be necessary to their survival.  This is where motivation comes into play in designing instruction: designers ultimately want their product to be used and to lead to the successful education of the learners.  As such, designers must keep in mind the following concepts in developing training (these were derived from a combination of Keller and Burkman (1993) and Dr. Glazewski’s online presentation):
  • The selective attention of the learner
  • What information will they use, and what will they use it for?
  • What are they interested in?
  • What drives the learner?
  • What are the learning goals?
  • What does the learner need to know?
  • What are our resources?
  • What are our constraints?
  • Who are the learners?


Clearly, the designer must take a keen interest in knowing the intended audience.  But a designer’s sphere of influence is much broader.  A key takeaway from this week’s lesson is that motivation does not rest solely with the learner; instructional designers can play a significant role in inspiring motivation through design and content decisions.  As Keller and Burkman (1993) stated in the very introduction to their article, motivational factors fall into two categories:

  • Motivating the student: understanding motivational factors internal to the student
  • Design elements which can stimulate attention, and increase learner awareness of how the information fits their motivations


There is some overlap between these two factors, especially in using learning theories such as Vygotsky’s Zone of Proximal Development, self-efficacy, learner control, etc.  (See Epilogue below)  
As a novice designer (novice may actually be giving myself too much credit), I am borderline overwhelmed by the actual design elements, and the prescriptions and principles that exist for effective design.  These can be leveraged to help engage an otherwise unmotivated learner, both overtly (conveying why the material is important) and subtly (avoiding overwhelming designs that deter learners).

The four factors which Dr. Glazewski discussed (Interaction, Engagement, Structure, Complexity) each exist on a sliding scale, adjusted independently for the specific scenario.  Those factors dictate the content, which in turn dictates the format.  As the format comes together, tools such as font, color, layout, graphics, photos, text guidelines (10-12 words per line, etc.), and writing style each vary on their own sliding scales.  This week’s and last week’s lessons even had some overlap, along with variations.

I suspect that this is only the tip of the iceberg, and am wondering if there is some sort of design checklist or flowchart that can help budding designers with quick references as to commonly accepted or recommended design principles for various situations.  I might try to put one together on my own, but if anyone has any useful links, I would be grateful!

-Bryan


Epilogue

We live in a revolutionary time in which many learner-centered design elements, such as learner control, pacing, and difficulty level, no longer must be designed to fit the masses, though that approach is still prevalent in many educational and professional training sessions.  Web access, online training, and other digital resources have given learners more control over their learning experience.  This relies heavily on the learner’s own motivation.  In my experience with these tools, because there is no requirement for the audience to engage in these forums, the design must be effective in maintaining the learner’s attention.  A few free examples that come to mind are:

  • Khan Academy: http://www.khanacademy.org (Wide array of subjects)
  • LiveMocha: http://www.livemocha.com (Language learning)
  • 43 Things: http://www.43things.com (Goal setting)

(These are just a few; there are hundreds more, and even more if you are willing to pay.)