Erik Soell and Jon Basden, both instructional technologists at the Federal Reserve Bank of St. Louis, describe two hybrid training/reference tools that they developed for the Fed. One tool, "Examining Bank Operations," came out in December 2002; the second tool, "STaRT," came out in December 2003.
The Federal Reserve's role is three-pronged: monetary policy (think: Greenspan), financial services (think: cash and check processing), and bank regulation. To carry out bank regulation, the Fed employs a staff of highly skilled bank examiners who monitor and regulate Fed-member banks throughout the United States.
Bank examiners come to the Fed with a wide range of education, experience, and knowledge. There are about 2,000 examiners throughout the Fed's 12 Reserve Banks, and they must all complete a curriculum that includes approximately 15 classroom courses, as well as a host of prework, postwork, self-study, and on-the-job training. At the end of training, examiners complete a comprehensive proficiency exam to become commissioned examiners.
In early 2002, the Fed's management expressed interest in developing training on examining a bank's internal controls to prevent or identify fraud. The target date was December 2002, which gave the team approximately six months to complete the project once development began.
Starting the project
As with most new projects, there was an initial analysis and design phase. To get started, we asked subject matter experts (SMEs) to assist with content development. We brought 10 of the System's internal controls experts from throughout the country to St. Louis for a two-day kickoff meeting. We discussed goals, materials that already existed, and what the finished course or tool might look like.
With the help of the SMEs, we identified the content, which derived from a three-day traditional classroom course. We organized that content into modules and lessons (some modules had one or two lessons; others had six). From our interviews with past students, we also found that participants' favorite part of the learning experience was hearing stories from the experienced instructors. For example, regulations and rules come to life when examiners hear how a senior staff member handled a situation in Boston or Phoenix. To capture this, we built a mini case study element into each lesson.
Finally, we set up a conference call schedule to review progress and receive development input from SMEs.
Designing the learning experience
After the analysis and information-gathering phases, it was time to start designing the tool. While we knew there was specific information we needed to cover, we also knew we wanted to design a tool that was easily accessible from the lowest common denominator workstation without sacrificing instructional value. For example, many bank exams take place in small towns, some of which have only one phone line in a conference room. We therefore quickly concluded that requiring examiners to pull content over a 56k dial-up connection was not an option. Although many banks have very advanced technical capabilities, we had to consider how examiners would access the tool on the majority of jobs. For instance, Fed examiners have laptops that they use on the job to reference materials and to track comments and assessments.
So, we knew we wanted something online, but we also needed to keep the content of the tool in context. Think: modular. Unfortunately, that notion was in direct contradiction to the idea of offering a downloadable option. What to do? We settled on creating individual static Web pages, built from a single Macromedia Dreamweaver template with a series of carefully structured library items. Static pages could be downloaded and browsed offline, while the shared template and library items let us update common elements in one place rather than editing dozens of pages by hand.
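As a rough sketch of that approach (not the actual files; the page title, region names, and nav library item here are hypothetical), a lesson page built this way might look like the following. Dreamweaver marks template regions and library items with special HTML comments, so editing a shared library item once lets Dreamweaver rewrite that fragment in every page that uses it. The exact comment syntax varies by Dreamweaver version; Dreamweaver-4-era syntax is shown.

<!-- Hypothetical lesson page generated from a single Dreamweaver template. -->
<html>
<head>
<!-- #BeginEditable "doctitle" -->
<title>Lesson: Reviewing Internal Controls</title>
<!-- #EndEditable -->
</head>
<body>
<!-- Shared navigation stored as a library item; editing nav.lbi once
     updates the locked copy embedded in every static lesson page. -->
<!-- #BeginLibraryItem "/Library/nav.lbi" -->
<a href="index.htm">Contents</a> | <a href="glossary.htm">Glossary</a>
<!-- #EndLibraryItem -->

<!-- #BeginEditable "lesson_body" -->
<h1>Reviewing Internal Controls</h1>
<p>Lesson content, mini case study, and review questions go here.</p>
<!-- #EndEditable -->
</body>
</html>

Because each page is a plain .htm file with the shared pieces baked in, the whole lesson set can be copied to a laptop and browsed offline, which is what made the downloadable option workable over slow connections.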
The instructional technology group was tasked with managing all of the technical development, as well as some of the graphics and interaction development. First, we developed a storyboard template for moving the lessons into a new, yet-to-be-designed tool.
Next, our group spent a lot of time talking about the tool: what it should do, how it should work, how users would get content, and so on. We said, "It's more than just learning, and it's more than just reference." We started referring to it as a hybrid tool. Finally, after plenty of analysis and brainstorming, we sat down at a PC, launched Fireworks and Dreamweaver, and started putting ideas on screen. The collaboration was great, and we soon had a tool we could send to the graphic designers, who polished the layout and concept into a tool that was easy to use, visually appealing, and professional. (Of course, solid instructional design is key, but a nice-looking tool or course can make a major difference in the use and adoption of learning.)
Putting it all together
With the shell ready, we continued to work with the 10 SMEs on 38 lessons, each of which would take a user about 15 minutes to complete. Some lessons were delivered early, some right on time, and others late. Once the SMEs sent in their content, we still had a lengthy process of editorial review and IT review. Then we sent each lesson back to its SMEs to review (to ensure the content was still accurate and appropriate). Finally, each lesson was returned to our group to build.
Once the content was built and online, the SMEs went out again and reviewed their content in place, in the context of the hybrid reference/training tool that we had all built together.