Interactive Media and Distance Education for Saskatchewan Schools
by R. Schwier, B. Brown, E. Misanchuk, L. Proctor (1992).

SSTA Research Report #92-06: 100 pages, $17.

What are Interactive Media?
Why Interactive Media?
Ways to Use Interactive Media in Classrooms
Traditional Instructional Support
Independent Study
Cybernetic Learning
Educators as Interactive Media Producers
Authoring Programs
Video Media
VideoDisc
CD-ROM Media
Interactive Communication Devices: Interfaces
Microcomputers
Epilogue
Selected Sources of Interactive Media
The Technology and Role of Distance Learning in Saskatchewan
Education at a Distance
Overview
Delivery Continuum
Implications of Distance Education for Saskatchewan Schools
References
Glossary of Terms: Multimedia and Distance Education
Interactive Media and Distance Education for Saskatchewan Schools describes several new learning technologies influencing education. Widespread adoption of educational technology in our province is occurring rapidly, if not systematically. Most educators recognize that our society is strongly influenced by information technologies, and they have attempted to respond to the change. During the 1980s, schools across the province invested heavily in microcomputer technology and learning resources, and in many locations, administrators are now struggling with the transition to a second generation of technology; old equipment, materials and instructional strategies must be replaced, and the demand for additional resources continues to grow. New instructional systems, characterized by an interactive multimedia approach to instruction, are emerging at the same time that the Saskatchewan curriculum emphasizes resource-based learning. Meanwhile, distance education options for K-12 students have mushroomed in the province with the introduction of SCN, the "schools of the future" and expanded Correspondence School initiatives. All of this has led to a dizzying array of opportunities, terminology, pratfalls and pitfalls.


The SSTA Research Centre grants permission to reproduce up to three copies of each report for personal use. Each copy must acknowledge the author and the SSTA Research Centre as the source. A complete and authorized copy of each report is available from the SSTA Research Centre.
The opinions and recommendations expressed in this report are those of the author and may not be in agreement with SSTA officers or trustees, but are offered as being worthy of consideration by those responsible for making decisions.


Interactive Media for Saskatchewan Education

What Are Interactive Media?

Interactive media, for this document, refer to those media which users can control to some extent. Most of the media we have come to know and use in classrooms are linear, not interactive. Films and videos are typically shown from beginning to end, with only the occasional interruption by an inspired teacher. Filmstrips and slides are shown in sequence. Overhead transparencies are shown one at a time at the discretion of the individual making a presentation.

Interactive media, by contrast, require the user or viewer to make choices. Some amount of control is turned over to the learner to construct the amount, sequence and shape of content. Interactive media usually require, but are not necessarily limited to, computers, videodisc and compact disc read-only memory (CD-ROM). Each medium has unique features, advantages and disadvantages for the classroom teacher. This document will present some of the technical and instructional features of these newer media, describe how they can be used in the classroom, and discuss some of the advantages and disadvantages of each medium.

We will examine interactive media as independent components. These media can also be combined, in whole or part, into powerful instructional systems, commonly called interactive multimedia instruction. Interactive multimedia instruction (IMI) can be constructed from an array of media, each of which has particular strengths and limitations. IMI systems can be quite modest, restricted to some computer assisted instruction and print materials, or extremely elaborate, including videodisc, CD-ROM, hypermedia, and virtual reality interfacing. The power of these instructional systems lies in the way instruction is constructed and delivered, not in the technological components which comprise a hardware system. Our only fascination with technology lies in its potential to assist learning. Learning is our focus, not equipment or toys. A host of studies over the last four decades compared various new media to more traditional methods or other media in order to find which facilitated learning more effectively. For the most part, the studies landed with a resounding thud, indicating that no medium was inherently superior to another for instruction.

One of the most obvious lessons from this comparative research is that learning gains come from adequate instructional design theory and practice, not from the medium used to deliver instruction. (Clark, 1984, p. 5)

Our reasons for advocating the interactive technologies discussed in this document rest elsewhere. While specific media do not necessarily out-perform other media, each medium has specific characteristics which allow us to incorporate it into a rich, diverse instructional landscape. Simply put, there is no single right way to approach instructional problems, and instructional environments should be built to engage learners in a variety of exciting, provoking and instructionally sound ways. Our consideration of various media allows us to construct an instructional matrix from the features and advantages of its components. The media we consider in this section are characterized by their capability for interactivity; that is, they can be used to actively engage the learner while learning.

In this document, interactive media must meet several criteria to be included in our definition.

• Commonly, interactive media are broken into segments, rather than presented in a linear fashion.

• Segments may be made up of motion sequences, still frames, questions, menus, audio or combinations of these, and they define an array of paths the viewer may follow through the presentation.

• To facilitate "navigation," interactive media incorporate periodic and structured input from the user. Interactivity relinquishes partial or complete control of the instructional presentation, and places it in the hands of the learner or teacher.

• Interactive media are the resources, not the equipment on which the resources run. They usually require, however, that the educator or learner use computers, videodisc players, CD-ROM players or other related technology.


Why Interactive Media?

It makes little sense to adopt newer technologies if the old ones can accomplish everything we want. We will briefly review some of the reasons why interactive media are beneficial, and also list a few disadvantages associated with them.

Advantages of Interactive Media

Interactive media, and by extension multimedia, offer several advantages to educators.

• Learner-appropriate content. Different paths can be taken through interactive media, potentially accommodating individual learner needs and preferences.

• Instructional design. Careful development leads to scrutiny of the content and delivery. Interactive media products are subjected to greater scrutiny than traditional, day-in and day-out lessons. We believe that many of the performance improvements often associated with interactive media are a natural by-product of the process of development.

• Increased overt activity. In well designed interactive media, the learner is constantly active, and this activity can result in increased cognitive investment in the content to be learned.

• Organizational flexibility. When used to support traditional instruction, interactive media provide more support in a more compact package. When interactive media are used for independent study, they allow flexible scheduling and locations.

• Motivation. Students suggest that interactive media used for independent study provide a low-threat environment. Learners can determine how much time is spent on various lessons, and can review material privately. At present, there is also a novelty factor which increases motivation, but we believe that it will dissipate as multimedia resources are used more often.

• Immediacy and dialogue. Interactive media can be designed to provide immediate and relevant feedback to learners. This can go beyond traditional question-answer formats. "Learner advisement" is the term used to describe systems which serve as wise companions, advising students about choices they make during instruction.

• Record keeping. Computers can automate the process of record keeping within an instructional program. Everything from performance data to time-on-task and complete audit trails can be invisibly tracked as the learner works on instruction (a minimal sketch of the idea follows this list).

• Cost. Generally speaking, interactive media formats are inexpensive to reproduce and distribute. Of course, some producers charge whatever they perceive the market will bear, but most courseware is less expensive than a comparable film production.
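
To make the record-keeping point above concrete, here is a minimal sketch of an audit trail in Python. It is purely illustrative; the class, learner identifier and event names are our own invention and do not come from any particular courseware or authoring system.

    import time

    class AuditTrail:
        """Collect a time-stamped record of a learner's actions during a lesson."""
        def __init__(self, learner_id):
            self.learner_id = learner_id
            self.events = []                      # the complete audit trail
            self.start = time.time()

        def log(self, event, detail=None):
            self.events.append({
                "elapsed_seconds": round(time.time() - self.start, 1),
                "event": event,
                "detail": detail,
            })

        def time_on_task(self):
            return time.time() - self.start

    trail = AuditTrail("student-042")
    trail.log("lesson_started", "capital cities drill")
    trail.log("question_answered", {"item": 3, "correct": True})
    trail.log("lesson_finished")
    print(len(trail.events), "events;", round(trail.time_on_task(), 1), "seconds on task")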

Disadvantages of Interactive Media

• Design expense. The same process which leads to higher integrity of the product, also leads to higher design costs. While most interactive media are inexpensive to reproduce and distribute, most must amortize high front-end costs. This means that they are most cost-effective when distributed in large quantities.

• Hardware expense. Computers, videodisc players and CD-ROM readers are not inexpensive. On the other hand, they are not more expensive than many of their traditional media counterparts. For example, an industrial quality videodisc player is approximately the same cost as an industrial quality VHS videocassette player.

• Compatibility and portability. Instruction requiring specific equipment configurations cannot be generalized easily. Apples still aren't oranges, and computer-based instruction sometimes carries specific requirements. This is not so true of videodisc or CD-ROM, and even commercial distributors of computer-based instruction often offer programs in multiple formats.

Some Conclusions from Research

As with any new area of study, the research into interactive media is still relatively lean, and some of the studies are fraught with methodological problems. Still, it is possible to derive some tentative and cautious conclusions.

• Computer-based instruction and interactive video may be effective means of achieving educational objectives, both as the principal means of instruction and as a supplement to other forms of instruction (Bailey, 1990; Franchi, 1992);

• When computer-based instruction is compared to media that do not account for individual differences, computer-based instruction (1) can produce more learning in a given amount of time, or (2) can produce a given amount of learning in a shorter period of time (Franchi, 1992);

• Retention following computer-based instruction is at least as good as retention following more traditional methods of instruction, and students favor well-designed computer-assisted instruction programs but reject poor programs (Franchi, 1992).

• Several studies, despite methodological concerns, suggest that interactive video is effective, can improve student attitudes and can result in increased participation (Cushall, Harvey and Brovey, 1989; Schaffer and Hannafin, 1986; Smith, 1987).

• Commercially available videodiscs grew in number from approximately 100 in 1986 to more than 560 in 1989 (Pollack, 1989).


Ways to Use Interactive Media in the Classroom

Interactive media can contribute to creating a richer, more robust learning environment in a variety of ways. In this section, we will describe some of the ways interactive media can serve Saskatchewan teachers and learners. We will also peek at the future, and briefly consider developments not yet widely available.


Traditional Instructional Support

We offer a rather bold suggestion. Until multimedia resources are readily adopted by teachers to support what they already do, multimedia resources will not be used in the classroom for other, more innovative, purposes. Further, once teachers adopt multimedia resources to support traditional teaching approaches, they will progress to more innovative instructional approaches quickly. We have little research to substantiate this position, but our reasoning is simple. Educators in our province (and most other places we have been) have already developed methods, preferences and styles which are successful. Education in Saskatchewan, despite many concerns, is generally healthy and of high quality. Teachers in Saskatchewan do a good job in the classroom, and Saskatchewan students learn well. There is no desperate plea to fix a substandard system. In places where the situation is more urgent, such as in several urban areas in the United States, the call for reform is accompanied by widespread application of educational technologies.

Also, educators have witnessed failed promises from educational technology. Film was touted as revolutionary, educational television was to be a panacea for many of our classroom ills, and programmed instruction was going to create a generation of responsible, successful learners. These technologies, while still used, did not create the wholesale reconstruction of education many pundits predicted. These, and most other technologies, are most often used to support traditional teaching methods.

Teachers are therefore cautious about adopting radical new approaches. Changes, and especially technologically-bound changes, require time and effort. If multimedia resources are to be adopted widely across the province, they must first be seen as easy to use, and also seen as making the teacher's job easier or more enjoyable to perform. Second, multimedia resources must be perceived by teachers as worth the effort to adopt them. In other words, teachers must consider multimedia resources as approachable and accessible. We believe that these issues are as important as learning issues are for the adoption of innovative classroom resources. Fortunately, multimedia resources are indeed becoming more approachable, easier to use and powerful.

At the same time, our new provincial curriculum calls for educators to respond in new ways to our traditional challenges. After looking at some of the ways multimedia resources can support traditional instructional approaches, we will examine some of the more aggressive strategies available to educators.

Illustrate Presentations

Multimedia can be used to support lectures by providing visual and aural illustrations. Collections, almanacs and databases of materials are available on videodisc, CD-ROM and computer software. These collections are often quite comprehensive and inexpensive. For example, Figure 1 contains the chapter headings for motion and still picture segments contained on the Chemistry at Work Videodisc. This array of topics could support several Chemistry courses, not merely a few lectures, and the current cost is approximately $600. Given that this is a lower cost than most films, the relative economy of this material to support traditional instruction is impressive.

Beyond economic arguments, this videodisc (and by extension other well-designed materials) consolidates a large number of resources for the teacher. With most support materials located on a single videodisc, the teacher does not need to spend as much time searching for, ordering and acquiring resources for a given course. Media distribution catalogues are listing an increasing number of database-type support materials. A disclaimer is required, however. At the time of this writing, many courses would require a number of support materials; comprehensive single sources are not available for many subjects. On the other hand, multimedia support materials which are more specialized (e.g., Van Gogh, Macbeth, Volcanoes) are also typically less expensive, and can be acquired for as little as $30 Cdn.

Supplement Lessons

This type of educational resource assumes primary instruction exists elsewhere. Supplementary instruction either reinforces what has already been taught, or it attaches new instruction to what has been taught. We suspect there is an important role to be played by multimedia in providing supplementary instruction. Students exhibit widely varying interests and abilities in the typical classroom. In order to address specific needs, provide remedial help for subgroups of individuals or even shore up specific deficiencies in a teacher's background, well-designed multimedia can provide supplementary instruction.

Provide Learning Resources

Given the prominence of resource-based learning in the new provincial curriculum, a wide array of learning resources will be required to carry out the instructional mandate. Currently, many people think of learning resources as books, and only books. Our vision is different; learning resources include any collections of materials which can be used to promote learning. They may be used by the teacher or students in a number of productive ways.

For the purpose of this discussion, we consider learning resources to be organized databases of multimedia materials. For example, resources might include a videodisc collection of sounds and still images for teaching biology. This means that organizational structuring is left largely in the hands of the user, whether that is the teacher or the student. If an individual, say a learner in a laboratory, wants to review several of the images in a particular sequence, then the learner or teacher must search the material for appropriate segments, sequence the material, then access the chosen segments. Perhaps a remote control unit is used or barcodes are printed and pasted in a desired sequence, but nevertheless, manipulation must occur, and this places a burden on the user.

The Visual Almanac: An Interactive Multimedia Kit produced by the Apple Multimedia Lab provides an excellent example of a learning resource (Figure 2). It is a database of many unrelated images, motion segments and sounds on a videodisc. An accompanying computer program breaks the components of the program into Collections, Compositions and Activities. These sections provide access to groups of material by topic, sample lessons constructed from some of the materials and authoring tools needed to construct unique instructional configurations. Learners or teachers can explore the database, search for images and segments by using keywords, and they can build their own presentations from materials found.

Independent Study

A popular breakdown of independent study approaches includes drill and practice, tutorials, games and some simulations (e.g., Alessi and Trollip, 1985; Hannafin and Peck, 1988; Heinich, Molenda and Russell, 1989; Romiszowski, 1986). These approaches are not restricted to multimedia instruction, of course, but multimedia can often be used to provide these types of instruction.

Drill and Practice

Drill and practice usually takes the form of a string of question-answer-feedback sequences. The purpose is usually to review previously learned material in a test-like environment rather than learn new material. In traditional media, you might think of drill and practice as a series of mathematics problems or a geography quiz in which the learner names the capital cities of countries.

For drill-and-practice, multimedia instruction can be used to increase the types, amounts, and layers of stimuli and feedback presented. For example, instead of using a map for the capital cities exercise mentioned above, the learner could be presented with an aerial photo from satellite, zooming into the target city. If the learner requires a clue, the national anthem of the country could be played, or additional data presented. A correct response could be greeted with a video or audio clip containing hearty congratulations from the actual head-of-state.

Tutorials

Tutorial instruction is what we most often associate with routine computer assisted instruction or classroom instruction. Tutorials are used to teach new information. Information is usually presented, learners are given opportunities to practice using the information, and learning is reinforced. For example, a typical tutorial environment will conventionally have an introduction (including motivational set), organizing material (e.g., advance organizers, objectives, topics), novel content, embedded practice and interaction (e.g., adjunct questions, exercises, activities), feedback, review, and evaluation. The role of the delivery system is to mimic the best characteristics of a personal tutor. A well designed tutorial will motivate the learner to enter the instruction enthusiastically, guide or coax the learner to complete the instruction, provide ample opportunities for meaningful interaction, correct errors or misinterpretations, and applaud successes. In mediated tutorial instruction, our challenge is often to provide a rich vicarious experience which approximates genuine human interaction. We have found that the majority of instructional computer programs available on the market today fall into this category of instruction.
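
As a rough illustration of the tutorial pattern just described, here is a toy sketch in Python of a present-practice-feedback-review loop. The questions, answers and scripted replies are invented for this example; no particular authoring system works exactly this way.

    # A toy tutorial: present information, pose a practice question,
    # give immediate feedback, and review missed points at the end.
    lesson = [
        {"present": "A videodisc holds 54,000 tracks per side.",
         "ask": "How many tracks are on one side of a videodisc?",
         "answer": "54000"},
        {"present": "A CAV disc stores one video frame per track.",
         "ask": "How many still frames fit on one CAV side?",
         "answer": "54000"},
    ]

    def run_tutorial(lesson, replies):
        """Walk through the lesson, checking one scripted reply per frame."""
        missed = []
        for frame, reply in zip(lesson, replies):
            print(frame["present"])
            print(frame["ask"], "->", reply)
            if reply == frame["answer"]:
                print("Correct, well done.")               # applaud success
            else:
                print("Not quite; the answer is", frame["answer"])
                missed.append(frame)
        if missed:                                          # end-of-lesson review
            print("Points to review:", [f["present"] for f in missed])

    run_tutorial(lesson, ["54000", "27000"])    # second reply is deliberately wrong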

A tutorial approach has an inherent instructional delimitation. Tutorials restrict learner control over instruction; the branching possibilities and much of the sequence are predetermined by the program. This is desirable in many instructional settings, but highly undesirable in others.

Games/Simulations

Games are usually directed at a specific goal and involve some measure of competition. Simulations provide an abstraction or simplification of reality, some level of mimicry, in which the learner encounters circumstances and tries to respond to them. In multimedia instruction, features of games and simulations are often combined, as both approaches offer highly motivational, and potentially relevant environments.

Multimedia systems are ideally suited for gaming and simulation. In order to provide an interesting, robust environment, huge amounts of information must be available to the learner in realistic representations of reality. For example, if you want to simulate life in a small German village during the Reformation, the learner must be able to encounter a range of social, political, and interpersonal variables in order to establish even a modest fidelity with actual life during those times. Computers and CD-ROM are capable of housing massive amounts of information in a number of useful formats (from print to compressed video), and videodisc offers the realism and immediacy of full motion video.

Collaborative and Generative Study

Collaborative and generative study shifts some measure of control over instruction to the user. Collaborative and generative approaches permit the learner to influence what is learned, or how it is learned, or at least the order in which it is learned. That is, the fundamental difference between the approaches mentioned previously and these is the level of prescription, and therefore, learner control.

For multimedia resources used for collaborative and generative study, the emphasis shifts from constructing and controlling instructional events to providing easy access to instructional support. These types of multimedia learning resources emphasize navigation, motivation and access, and typically downplay objectives and evaluation. They take the form of resources which are easy to use by the learner.

Hypermedia

Hypermedia refers to a programming approach which allows the user to link pieces of information together in a theoretically unlimited number of ways, meaning the user can move in an unrestricted fashion through a forest of information. The movement through the forest would not be restricted to moving from one tree to another in any direction; it would also allow you to be transported to another tree in another part of the forest, or indeed to an entirely different forest. Theoretical limits have not even been approached in practice, as most hypermedia merely expand the number of directions an individual can move through data. In fact, most hypermedia examples at the time of this writing are restricted to hypertext (computer-based text, graphics, images and animation), but some of them are quite innovative.

In hypertext, special indicators (e.g., boldface type, boxes, arrows, asterisks) are used to indicate the existence of related additional information. The learner activates the indicators (say, by clicking on them with a mouse) and is presented with the additional information. The learner can then continue to explore the additional information (perhaps traversing the database yet again by clicking on another indicator), or return to the point of departure to continue the original path.
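
To make the idea of linked information concrete, here is a small illustrative sketch in Python of how hypertext nodes, their links ("indicators") and a return-to-departure history might be represented. The node names and structure are our own, not those of any actual hypermedia product.

    # Each node holds some content plus named links to other nodes.
    nodes = {
        "forest":     {"text": "An overview of the forest.",
                       "links": {"a nearby oak": "oak_tree",
                                 "another forest": "rainforest"}},
        "oak_tree":   {"text": "Details about one oak tree.",
                       "links": {"back to the forest": "forest"}},
        "rainforest": {"text": "An entirely different forest.", "links": {}},
    }

    history = []                          # lets the reader retrace the path taken

    def follow(current, indicator):
        """Activate an indicator (as if clicking it) and move to the linked node."""
        history.append(current)
        return nodes[current]["links"][indicator]

    def go_back():
        """Return to the point of departure to continue the original path."""
        return history.pop()

    here = "forest"
    here = follow(here, "a nearby oak")   # jump to related information
    here = go_back()                      # and return to where we left off
    print(here)                           # forest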


Cybernetic Learning

Before leaving this area of discussion, we should take a moment to look into the future. Some writers and producers question the limited approaches employed in most computer-based instruction today (e.g., see Merrill, 1985; Dear, 1986). They suggest we should be developing learning environments which interact intelligently and mutually with the learner.

The ultimate goal of artificial intelligence is to create systems that "think". By simulating human thought processes, computers will be able to respond to novel situations, create and implement strategies for solving problems, and learn from experience and introspection. The challenge is, we do not really know how people think and learn, so the metaphor is speculative at present.

"Intelligent" interactive multimedia would use a computer to provide the intelligence. Using the "human thought" metaphor, we might think of a videodisc or CD-ROM as housing fixed thought, such as vivid memories, conclusions drawn from experience, or stable components of knowledge. As the storage capacity of archival media increases dramatically, so does the potential warehouse of thoughts. The computer (uneasily at present) adopts the functions of flexible thought, including mental activities such as comparison, inference, deduction, analysis, and insight. In "intelligent" systems, the program is often also capable of expanding its knowledge base from the types of user responses encountered, and adapt its performance ("learn from experience").

For example, consider a sophisticated flight-training simulator. A series of minor navigational errors could mount to serious difficulties if left to compound. The program driving the simulator could detect a pattern in which navigational errors were consistently introduced when a strong tail wind was present. The videodisc, providing visual images and data, could be directed by the program to test the assumption by subjecting the simulator navigator to a variety of wind conditions. The computer could "interview" the pilot about decisions made. The pilot could consult with navigational experts on the multimedia system. A final judgment, based upon the test, could be to provide a series of tail wind experiences to the user.

To extend the discussion, we should consider the application of virtual reality. The learner could enter, and indeed become a participant in, a constructed environment which is visually and aurally saturated and dynamic. For example, the learner could wear an interface which allows the learning environment to be projected within the learner's changing field of vision. Projected elements would change to adapt to the learner's perspective, and respond to the actions of the learner. In fact, the learner could physically (virtually) manipulate items in the constructed environment; items could be picked up, carried and moved to another location or massaged into a new shape. This is not a description of the future. Versions employing sophisticated computer graphics are currently available commercially. But multimedia can serve to enrich the virtual environments, and ultimately may serve to alter the nature of this type of learning experience completely.

There are many other types of applications which may become possible through the development of intelligent interactive multimedia. The above discussion is meant only as a sample of possibilities which may soon appear on the horizon. Cybernetic applications are just gaining a foothold for microcomputer systems, but it is likely that when cybernetic applications are more commonplace, interactive multimedia will be at the forefront of training applications. Multimedia will serve to offer more dynamic and realistic artificial environments for learning.


Educators as Interactive Media Producers

Educators have always produced a significant number of materials to support their instructional initiatives. They have made truckloads of overhead transparencies, slides and mounted materials. Teachers, given some of the tools currently available, can produce their own multimedia resources with less difficulty than making overhead transparencies. For example, one school division trained 240 teachers to use Apple's HyperCard to develop instruction in only 12 hours of inservice training. The immediate result was that teachers produced more than 100 stacks of instruction, and as they became familiar with the program, their instructional approaches became more and more creative (Rude-Parkins, 1992). As with any type of local media production, the main issue is time. It takes a significant amount of time to design materials of any type, and teachers have been consistently implored to:

• first, select available resources to meet your needs;

• second, if appropriate materials are not available, adapt whatever materials are available to meet your needs; and

• finally, only if you cannot acquire or adapt materials to meet your needs, produce your own materials.

In order to produce multimedia resources, one requires specialized tools. Given the scope of this document, we will not deal with the full range of development an educator might pursue. We will not deal with video production resulting in videodisc or compressed video. We will only consider authoring programs for computers which can be used to produce materials to support instruction, create computer-assisted instruction for independent study, or interface with other media to shape instruction from existing media such as videodiscs.


Authoring Programs

An authoring program is any computer program which can be used to develop instruction. For IMI development, authoring programs also link various components of media into a system. Several authoring programs are available for educators. They range from highly prescriptive and easy-to-use programs with a limited number of options, to fairly sophisticated programs which are only limited by the programming skills and equipment configurations of the user. In every case, authoring programs offer protocols for creating computer-based instruction, and most can be used to create interactive multimedia instruction. Authoring programs vary considerably in price, and most are specific to one or two types of computers. A fairly extensive list of authoring programs is identified at the end of this section, along with compatible computers and acquisition sources.

Don't rely completely on what you read or hear about a particular authoring program to guide a purchase decision. We have our own favorites and villains, and so does everyone else. Before adopting a particular system or language, spend time grappling with it on the hardware configuration you will use. Only then can you be reasonably certain it will do what you want, and even then you may encounter surprises. Changing authoring programs after a treatment has been completed can be a very time consuming task.

Programming Languages

Programming languages include programs such as Basic, Pascal, and "C." Our best advice is to avoid them unless you are already a skilled programmer. They allow the programmer to do a multitude of tasks by writing step-by-step instructions which can be interpreted by the computer. These tasks can include, but are not limited to, creating interactive multimedia instruction. A programming language requires a high degree of sophistication; languages are not very "user friendly." Also, authoring with a programming language can require much more time than programming with a specific authoring tool, because of the time required to construct the syntax for simple commands (see Figure 3). Few teachers would be interested in this level of programming.

Authoring Programs

Authoring programs are specifically designed to allow individuals to create instruction, and most are designed to simplify the task. Authoring programs usually require much less skill in programming to use productively. Although they vary considerably in sophistication and price, authoring systems can be thought of as templates into which you place your instructions and text. Many use menus to prompt the user for information, and most use a frame (a complete television screen) as the basic unit of development. In essence, you are "filling in the blanks" with the information and commands you desire. This requires an understanding of the command structure used by the authoring system, but in most cases, the commands are iconic or mnemonic and easy to execute, and "help" directories are available if you forget how to do something. You design text screens, graphics, question frames, and branches the way you want them to appear to the user. So, as you design a frame, you get a reasonably accurate idea almost immediately of what the learner will see. This is known as the principle of WYSIWYG (what you see is what you get).

Selecting an Authoring Program

If an educator wishes to create multimedia instruction, the first task is to select an appropriate authoring program. This is not an easy job. If we could offer one piece of advice, it would be to look for a program that is easy to learn and use. Most educators we have met have little interest in computer programming, and are primarily interested in finding a program they can learn quickly. Here are some other things you might consider.

• Portability is an important issue for most educators. Potential users may not have the same configurations of equipment you do. Portability can be enhanced by choosing authoring programs which exist in competing environments, say, MS-DOS and Macintosh. Another portability issue is whether or not the authoring program can create stand-alone run-time modules. Some programs (e.g., HyperCard) require that the ultimate user (i.e., the learner) have a copy of the same program used to create the instruction, just in order to run it. Others (e.g., Authorware Professional) can "package" the instruction into a stand-alone unit, effectively making the instruction into an executable file.

• WYSIWYG (What you see is what you get). As mentioned earlier, this means that as you develop material, it will look the same way to you as it will when used by a student.

• Documentation. Printed material with a program should be easy to use and understand.

• Integration of peripherals. An important feature of authoring programs for multimedia development is the number of peripheral devices which can be tied together by the system, and the ease with which it is accomplished. Some such devices could include videodisc players, video cassette players, CD-ROM players, audio cassette players, and full-motion video boards.

• Text composition and editing features can be significant, and you may want to find an authoring system which approaches a full range of word processing features.

• Graphics are fundamental to screen design, so an authoring program should have the ability to create, edit, and/or import graphics easily.

• Animation features can be very important to some productions, and not as important to others. You may decide this is a feature you can live without, especially if you favour simplicity over sophistication.

• User control is a critical feature of any program. What range of user control options are supported by the program?

• Networkability can be a key feature for some schools and divisions. We have encountered significant problems networking some versions and types of software, ranging from slow access time to program crashes. If networking is anticipated, we recommend trying sample programs in the exact network environment for which the product is anticipated.

Selected Examples of Authoring Programs

There are several authoring programs available today. We have not attempted to provide a comprehensive list; more are being developed constantly, and we may have overlooked some important ones. We did, however, attempt to list some for each of the three major multimedia systems (IBM-PC, Macintosh and Amiga) in use today. In order to review any of these programs, contact a local or regional dealer in your area. They often have demonstration versions you can borrow.


Video Media

When we talk about interactive video media, we are primarily discussing videodisc. Videotape is still useful for instruction, but is not as useful for interactive instruction. Compressed video is an interesting newer development, but we know of no schools in the province using it at this time, so we will limit our discussion to videodisc.


Videodisc

Videodisc systems exhibit characteristics which make the medium highly attractive to educators.

A videodisc looks like a silver LP (Long-Playing) record, covered by a clear coating of durable, mar-resistant plastic. Information is embedded within the disc and protected by plastic, so the product is extremely rugged. Unlike videotape, which can be torn, stretched, or crumpled by inept users, a videodisc can suffer considerable abuse and still reproduce an unblemished image.

Each frame on a videodisc can be thought of as a single frame of a motion picture which can be retrieved by videodisc players either individually to produce still pictures or in sequence to produce motion pictures. Videodiscs have sufficient storage capacity to produce approximately one hour of continuous replay time in the motion format, or 108,000 still pictures.

Technically, the optical reflective videodisc acts as a storage medium for encoded information (video, two tracks of audio, control and program data) which can be processed by a videodisc player. The disc, approximately 30 centimeters (12") in diameter, has two sides available for storing information; 54,000 tracks per side are arranged in a continuous spiral which begins at the center and winds to the outer edge of the disc (Figure 4).

Within the tracks, data exist as microscopic pits or oval depressions in the inner surface of the disc.

The result of this is a recording which can be interpreted by reflecting a focused, laser-generated beam of light (actually split into three beams to detect the audio and video multiplex FM signal, focusing error and tracking error). Because the actual data are covered by plastic, and focusing actually occurs beneath the surface of the disc, the recording is extremely rugged. The "frame" of video never comes in contact with a reading mechanism, only light, so the image should never degrade.

There are actually two different types of videodiscs, and both can be "read" by standard videodisc players. They are called CLV (Constant Linear Velocity) and CAV (Constant Angular Velocity) and they exhibit specific strengths and limitations. In any catalogue you might choose, videodiscs will be clearly labelled as one or the other type.

Constant Angular Velocity (CAV) Videodiscs

With a CAV videodisc, each video frame occupies a full track (a 360-degree segment of the track spiral). Two pie-shaped video blanking intervals are etched across the disc, clearly marking the location of the vertical blanking intervals (VBI). Since there are 54,000 tracks on each side of a CAV disc, the same number of individual frames can be placed on a single side. Obviously, frame lengths are much shorter near the center of the disc than near the outer edge (Figure 5).

The CAV disc is designed to rotate at a constant speed of 1800 rotations per minute. At 30 frames per second, this configuration will allow up to 30 minutes of continuous motion playback per side.

What is important about the CAV videodisc for multimedia instruction is how information can be accessed, manipulated, and used, coupled with the impressive quality of stored images.

The concentric alignment of fields allows a number of functions which would otherwise be difficult. Each of the frames is identified by a unique frame number and other hidden code which resides in the VBI (including such things as picture stops and chapter numbers). Individual frames can be accessed by a program or user. Each time the laser completes a video field, it enters a VBI, and can be commanded to jump inward or outward to other tracks, permitted to continue to the next field or continuously repeat the same track. This provides a level of flexibility unavailable with videotape.

Constant Linear Velocity (CLV)

Like a CAV disc, a CLV-formatted disc has 54,000 tracks per side. Unlike the CAV disc, however, the CLV format uses a constant length for embedded frames, rather than always having one frame occupy an entire rotation of the disc. Thus, a single track on a CLV disc may contain more than one frame, and the result is an increase in playing time from 30 to 60 minutes per side (108,000 frames reside in the 54,000 tracks) (Figure 6).

A CLV disc is designed to be scanned at a constant linear rate which does not vary over the entire disc. To achieve this, the disc varies its speed of rotation from 1800 rpm at the center of the disc (scanning one frame per track) to 600 rpm at the outside circumference of the disc (scanning three frames per track).
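
As a quick check on these figures, the playing times follow directly from the frame counts and the NTSC playback rate of 30 frames per second. A minimal sketch of the arithmetic, in Python for illustration only:

    fps = 30                                # NTSC playback rate (frames per second)

    # CAV: one frame per track, 54,000 tracks per side
    cav_frames = 54000
    cav_minutes = cav_frames / fps / 60     # 54,000 / 30 / 60 = 30 minutes per side

    # CLV: constant-length frames let outer tracks hold up to three frames,
    # so the same 54,000 tracks carry 108,000 frames
    clv_frames = 108000
    clv_minutes = clv_frames / fps / 60     # 108,000 / 30 / 60 = 60 minutes per side

    # Rotation check: 1800 rpm / 60 = 30 tracks passing the laser each second,
    # which matches the 30 frames-per-second playback rate for CAV.
    print(cav_minutes, clv_minutes)         # 30.0 60.0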

Typical Videodisc Functions

Specifically, some of the functions which are possible with videodiscs include:

• Still Frame. Repeating the same track is possible with CAV playback, and with CLV using some players. Since each track contains only one frame of video, repeating it results in a still frame or freeze effect. Only a beam of light contacts the disc surface, so unlimited repetition is possible without degrading the image. One caution must be mentioned. While continuously scanning a single image does not harm the videodisc, leaving a still image on a television monitor for a long period of time can damage a monitor screen. Ultimately, images can "burn" into the picture tube, and leave a ghostly image of the still picture as a permanent scar. We learned this the hard way; hope you don't.

• Step Frame. Step frame combines several still frames in sequence. A single still frame is repeated until the user commands the player to STEP, at which point the player moves to the next track in sequence and repeats it as a still frame. This can occur in forward or reverse mode, but only in sequence.

• Fast Motion. Fast motion play is accomplished by reading only one field of each frame, and then moving to the next track. Not all players have this feature.

• Slow Motion. Normal playback rate can be slowed by repeating individual frames within a sequence more than once. A variable rate of slow motion is accomplished by increasing or decreasing the number of times each frame is scanned before moving to the next frame and repeating the process.

• Scan. Scanning a disc results in extremely rapid playback of visual information, allowing the viewer to "fly through" the contents of a disc quickly. It is accomplished by having the player skip over tracks, only displaying occasional frames as it traverses the disc.

• Search. Any resident frame can be accessed by entering the frame number and a search command from the keypad in less than one second. With most players, the screen goes blank during the search, and once the requested frame number is located, the player scans that frame in a still frame mode. In some models of videodisc players, the last image is captured and displayed while the search is executed for the next location. When the new frame or sequence is initiated, it replaces the previous image. This results in "seamless" searching (the screen is never blank).

These features are critical for interactive applications. They provide an educator with a variety of tools to use for constructing exciting, varied instruction.

Levels of Interactivity

There are at least three different levels of interactivity available with videodiscs (Katz, 1992; Schwartz, 1987; Schwier, 1987).

• Level I: At level I, the user has access to any parts of the videodisc by using a remote control unit (RCU). Sequences and still pictures can be called randomly. Any of the typical functions of videodisc players mentioned previously, such as search, step and scan are available to the user. With some players, the user can also enter a sequence of commands from the remote control unit, store them as a program (as long as the videodisc player is powered on), and replay the sequence without re-entering the string of commands.

• Level II: At level II, the videodisc has a control program permanently recorded and hidden in audio track 2 of the disc. This control program, with some players, will communicate with the player and determine the sequence and options available to users. For instance, it may offer the user a menu from which to select material, or ask a question and branch to different sequences depending upon the response of the user. Think of it as a permanently embedded computer program which runs the disc. It can be overridden by turning off audio channel 2 on the player, thus defeating the program and giving Level I control back to the user.

• Level III: At Level III, the videodisc is controlled by an external program. Typically, this means a computer is connected to the videodisc player, and a program on the computer manipulates the videodisc. The advantage this gives the user is the ability to use several different programs with the same disc, and therefore derive a variety of constructions from the same material. Also, programs are relatively easy to write with newer authoring software on the market.
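
As a rough sketch of what a Level III arrangement involves, the short Python example below drives an imaginary player object. The method names (search, play, still) and the frame numbers are placeholders we have invented; a real player is driven through an authoring program or a serial interface with its own command set.

    class ImaginaryVideodiscPlayer:
        """Stand-in for a real player interface, for illustration only."""
        def search(self, frame):
            print("searching to frame", frame)
        def play(self, start, stop):
            print("playing frames", start, "to", stop)
        def still(self):
            print("holding a still frame")

    def show_segment(player, choice):
        # A Level III program decides what the disc does based on learner input.
        segments = {
            "eruption overview": (5000, 5100),   # invented frame numbers
            "lava close-up":     (5101, 5101),
        }
        start, stop = segments[choice]
        player.search(start)
        if start == stop:
            player.still()                       # a single frame becomes a still
        else:
            player.play(start, stop)

    show_segment(ImaginaryVideodiscPlayer(), "eruption overview")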

Software Cost and Availability

Although videodisc recording units are available, units intended for school use only allow playback of pre-recorded material. How much does educational software cost?

Manufacturing costs for large quantities of videodiscs are low, but manufacturing figures alone are deceptive. As with any type of media development, front-end and production costs are extremely high, with manufacturing and distribution of the final product accounting for only a small portion of the overall costs. Commercial film producers have already amortized front-end and production costs, and are able to mass-market discs at a much lower unit cost than educational producers, who must also recover production costs from a much smaller market base. So it probably depends on what you are buying, and from whom. More than likely, producers will attempt to market videodiscs for whatever the market will bear. Our experience so far indicates that commercial media are relatively inexpensive, whereas educational products are more expensive.


CD-ROM Media

Compact discs (CDs) represent a recent quantum leap in the technology of audio recording, a fact which is appreciated by aficionado and expert alike. Their compact size, relative robustness, long recording time, and, above all, their superior, digitally-encoded sound quality have made them the undisputed sound medium of choice for the home entertainment market within the space of a few years.

CD-ROM stands for Compact Disc-Read Only Memory. (A read-only memory device is one from which a user can access and read or copy information, but cannot add to, edit, or delete data. Books and phonograph records are read-only media; videotapes and audio cassettes are not, although both can be made virtually read-only by removing the little plastic tabs which permit recording.) A CD-ROM is a digital recording, essentially the same technology as an audio CD. However, rather than being intended to be converted to analog audio for interpretation by the human ear, it is designed to remain in digital form, and therefore understandable by computers. CD-ROMs are physically identical to CDs. The only difference between them is in terms of what is encoded on the headers to their tracks.

Both CDs and CD-ROMs are 12-cm plastic discs, with a silvery finish and a spiral CLV track, just like videodiscs. Indeed, except for the difference in size and the fact that CDs and CD-ROMs are encoded on one side only, while videodiscs may be encoded on both sides, the two media are virtually identical.

This compatibility means that multi-mode players can be designed to read any of these optical media. Indeed, consumer models of players which will read either videodiscs or audio CDs already exist, and while they can also read CD-ROMs, we know of no multi-mode players which can be interfaced to a computer at present.

At the most basic level, it is convenient and intuitive to use a metaphor to describe the way in which digital data are encoded on CDs and CD-ROMs: Think of two different levels of surface on the disc (called pits and land) as representing ONs and OFFs, or 0s and 1s, that are interpretable by a computer. (In reality, things are slightly more complicated than that. Transitions from pit to land or from land to pit are encoded as 1s, and constant land or constant pits are encoded as 0s.)
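
To make the pit-and-land rule concrete, here is a toy Python sketch of the transition idea described above. It is only an illustration of the rule as stated; real discs layer further channel coding on top of it.

    def decode_transitions(surface):
        """Recover bits from a run of surface readings: a transition (pit-to-land
        or land-to-pit) is a 1; staying at the same level is a 0."""
        bits = []
        for previous, current in zip(surface, surface[1:]):
            bits.append(1 if current != previous else 0)
        return bits

    surface = ["land", "land", "pit", "pit", "pit", "land", "pit"]
    print(decode_transitions(surface))    # [0, 1, 0, 0, 1, 1]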

Of course, at the working end, the difference between CDs and CD-ROMs is plain: The former sounds intelligible, usually even pleasant when played on a stereo system, while the latter produces no sound; a CD-ROM contains information that a computer can understand, while a CD contains no information usable by a computer (unless a special program "tricks" the computer into behaving like a CD player, in which case the CD plays just as it would on a CD player). The recorded information on CDs is arranged in such a way that individual segments (equivalent to 1/75 of a second each) can be addressed individually. This means that beginning or ending play on a single note of music is possible.

Both media share important attributes:

• They can store very large amounts of information (up to 74 minutes of audio per CD; up to 660 MB of data per CD-ROM);

• They can be produced at a relatively low unit cost;

• They are robust, inasmuch as they are constructed like videodiscs;

• They are a read-only medium, which means that, once pressed, they cannot be altered;

• They offer random access, albeit at a modest seek time (they are faster than floppy disks, but slower than hard disks);

• They are portable, both in terms of size and weight, and of compatibility across all brands of players.

Characteristics of CD-ROMs

CD-ROMs have a number of advantages, identified below. CD-ROMs:

• can hold huge amounts of data: up to 660 MB per disc when only digital data are encoded. That is equivalent to 825 double-sided Macintosh floppies! (The arithmetic is sketched after this list.) When such large quantities of data are placed on a single disc, careful planning is required with respect to its arrangement and to the relationships among the various files.

• are an economical format for schools to purchase software but, like videodisc, not as attractive as a production medium for schools to produce their own software.

• are relatively inexpensive to produce, costing only a few dollars each when manufactured in quantity. Since, in quantity, a CD-ROM is roughly equivalent in cost to a floppy disk but can hold so much more information, CD-ROM represents a very economical means of distributing courseware. An expensive one-time mastering charge makes CD-ROM cost-effective only when you have more than about 10 MB of data and you need at least a couple of hundred copies.

• can hold a variety of kinds of information. Because the encoding is digital, CD-ROMs can contain digital data (i.e., computer code and data), digitized music, and, when adequate compression techniques become available, digitized video. Thus CD-ROMs are inherently multimedia. Designers can minimize the number of bits and pieces required for instruction, both in terms of hardware and software, by capitalizing on its "all-in-one" nature.

• provide a robust, stable storage medium; they can't be overwritten, and they don't deteriorate significantly with age. They don't require as much care in handling as magnetic media. Their toughness makes them a viable alternative where more delicate media might suffer (e.g., field conditions). Although their read-only nature can be an advantage in some situations, it can be a serious limitation in others.

• provide a standardized format. A file format designated High Sierra (or ISO 9660), which works with a variety of computer operating systems, makes it possible for a CD-ROM to work properly no matter who manufactures the computer or the CD-ROM drive.

• provide random access to information stored on them. Although the seek times on CD-ROMs are not as fast as those on a hard disk, they are faster than those on a floppy, and are generally adequate for most purposes. Users will have a tendency to want to move files from the CD-ROM onto a hard disk, to increase speed of access. Many times, however, the files are too large to do that conveniently. It may be necessary to implement a system of installing and removing files from the hard disk. All things considered, CD-ROM is better as a distribution medium than as a real-time access medium. However, where speed of random access isn't crucial (e.g., large text databases), they can form a viable real-time access medium, as well. Audio and video will require better real-time compression techniques than are available at the time of writing before CD-ROM is viable for their real-time access.
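
Referring back to the first point in the list above, the floppy-disk comparison is simple arithmetic, assuming the 800 KB capacity of a double-sided Macintosh floppy of the day and decimal units (1 MB = 1000 KB), which is what the 825 figure implies:

    cd_rom_kb = 660 * 1000        # 660 MB of CD-ROM capacity, in KB
    floppy_kb = 800               # one double-sided Macintosh floppy (assumed)
    print(cd_rom_kb / floppy_kb)  # 825.0 floppies' worth of data per disc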

Caring for Optical Media

Videodiscs and CDs are durable and resilient. Still, they can be damaged if mistreated. For longest life and quality replay, you should do a few simple things, such as:

• Handle discs like a good L.P. Avoid getting fingerprints on the surface of the disc. For one-sided discs, the player "reads" the bottom side, opposite the label.

• Store discs vertically, not in a stack or in the disc player.

• Store in a temperate climate (40°F-95°F or 4°C-35°C is a safe temperature range).

• Discs may be washed in warm water and mild detergent. Use a soft, lint-free cloth to dry.

• Skipping during disc playback may be caused by dirty player optics. The focusing lens can be reached on most players. Cleaning the focusing lens and the disc will cure most problems.

• Discard badly scratched discs. They don't really hurt anything, but aren't curable either.


Interactive Communication Devices: Interfaces

An interface, simply stated, is any device which allows a user to communicate with a computer or other hardware. In the early days of computing, the main computer interface was a keypunch and card reader.

As computers grew in sophistication and were joined by other media in multimedia systems, corresponding developments occurred with interfaces. In general, the trend in interfaces used in education has been toward user-friendliness; that is, making systems easier to use by requiring little or no technical knowledge. Ironically, as interfaces require less knowledge to use, the interfaces must become much more sophisticated. In this section, we will briefly tour a range of interfaces, from the most mundane to extravagant. You may notice we have not included any discussion of graphics tablets, joysticks and light pens. In our judgment these have waning significance for interactive multimedia systems, and thus have little importance for this discussion.

Remote Control Unit

Many devices have remote control unit (RCU) interfaces: video cassette recorders, videodisc players, compact disc players, television receivers, sound systems and garage door openers, to name but a few. A remote control unit is a hand-held command center which is capable of communicating specific instructions from a user to the equipment. The available command set is typically specific to a single piece of equipment, although universal RCUs are available which can control an array of equipment from a single device. These interfaces communicate with equipment via infra-red transmission or over wires. In interactive instruction, RCUs are most frequently used to make selections in pre-recorded videodisc programs or to develop user-designed control programs which can be stored and executed by a microprocessor in some players. For example, the following sequence of commands and numbers pressed on a Pioneer RCU will produce a small program which will execute a simple routine.

An important disadvantage of using an RCU to produce interactive programs is that the program is stored only as long as the videodisc player is turned on. If power is lost, so is the program. The present system is also intolerant of errors. If an entry error is made, it cannot be erased. The entire program must be re-entered, so it is not likely that large or complex programs will be introduced this way. Both of these limitations may be improved in future generations of equipment, but at the time of this writing, they impose serious limitations on the use of an RCU as an interface for interactive multimedia instruction. But as a user interface for browsing through videodiscs and CDs, it has lasting utility.

Keyboard

The keyboard is still the most common interface to microcomputer systems. This interface is basically a typewriter-like platform of keys which permits the user to type individual characters and commands. Of course, a typical microcomputer keyboard also includes a number of additional features, such as control keys, function keys and numeric keypads. These permit the user to carry out specialized functions which vary slightly from manufacturer to manufacturer, but the principle is still the same. On a microcomputer keyboard, the user must be able to type instructions, and this may require specialized knowledge of the program being used. Nevertheless, as an interface for interactive multimedia systems, the keyboard is very flexible. It allows a wide range of responses from the user, as long as the user and the system share the same vocabulary.

Mouse

A mouse is a push button mounted in a housing atop an omni-directional roller (trackball). The "button on a ball" interface is moved around freely on the surface of a desk (or anything with a surface for that matter). As the mouse is moved, a cursor, marker or pointer on a screen makes corresponding moves. So, for instance, if you move the mouse until the pointer on the screen enters a target area and then you push the button, the program can execute whatever command is associated with that target.

The mouse is a very easy device to use, and it is not as intimidating as a keyboard or RCU for many users. Although it requires a modicum of physical dexterity to use well, most people acquire the skills quickly. As an interface for interactive instruction, a mouse is somewhat limited. It is primarily capable of three functions: pointing, clicking and dragging.

Pointing is the act of moving the cursor on screen to a desired position. Clicking is the act of pressing the button. For example, you might use the mouse to point at an illustration of a kitten on the screen. Pointing at the kitten and then clicking the button on the mouse registers your cursor position with the instructional program. If the program has a command associated with the kitten's position (e.g., play the videodisc from frame 5000 to frame 5100 showing a kitten eating a mouse) then the command will be executed.

Dragging is the act of depressing and holding the button while you move the mouse. Dragging can be used for moving items on the screen. For example, if the program permits, you could point at the kitten, click and hold the button, and then drag the kitten to another position on the screen by moving the mouse. In similar fashion, the dragging function can be used to draw on the screen. The first click identifies the starting point and then moving the mouse will drag the line, box or circle to its end point or identify a second coordinate position.
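
To make the point-and-click idea concrete, the short sketch below shows how a program might test whether a click landed inside a target area and then run the command tied to that target. The screen coordinates, the kitten target and the simple "play segment" command are illustrative assumptions following the example above, not the workings of any particular authoring system.

    # Minimal sketch: map a mouse click to the command associated with an
    # on-screen target. Coordinates and commands are hypothetical examples.
    TARGETS = [
        # (left, top, right, bottom) rectangle   command triggered by a click
        ((100, 100, 180, 160),                   ("PLAY", 5000, 5100)),  # the kitten
        ((300, 100, 380, 160),                   ("PLAY", 6200, 6350)),  # another picture
    ]

    def hit_test(x, y):
        """Return the command for the first target containing the click, or None."""
        for (left, top, right, bottom), command in TARGETS:
            if left <= x <= right and top <= y <= bottom:
                return command
        return None

    def on_click(x, y):
        """Called with the cursor position each time the mouse button is clicked."""
        command = hit_test(x, y)
        if command and command[0] == "PLAY":
            _, start_frame, end_frame = command
            print(f"Player: play frames {start_frame} to {end_frame}")

    on_click(120, 130)   # a click inside the kitten's rectangle plays frames 5000-5100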

Trackball

Functionally, the trackball is exactly the same as a mouse. Furthermore, a trackball looks and behaves something like a mouse - an upside-down mouse. That is, the unit remains stationary on the table, and the user manipulates a ball on its top surface to control the position of the cursor on the screen. Buttons near the ball provide for clicking and dragging. Since subtly different hand and finger motions are involved in using the two devices, user preference can play a major role.

Barcode

We have all become used to having a can of beans scanned at the supermarket checkout counter. It is routine technology. The same technology has been adapted to laser video systems, transforming a routine supermarket technology into a very useful instructional interface. Simply stated, a barcode interface for video allows a developer to write command sets in barcode hieroglyphics, and a user to sweep over the sets with a special scanning pen to execute the commands.

So, what's the big deal? To reiterate an earlier point, anyone who has used a remote control unit (RCU) to execute strings of commands realizes how cumbersome and prone to errors that can be. In order to do something simple, say find frame 2000, play from there until frame 3200 and stop, ten separate RCU keypresses are needed:

2-0-0-0-Search-3-2-0-0-Autostop

If any entry errors are made, the entries must be cleared, and the user starts over from the beginning. Multiply this by several command sets for a typical lesson, and the problem is apparent. Instructors using the system for group work, or students attempting to illustrate instruction independently, can easily make mistakes, and either embarrassment or frustration can result.

A laser barcode system reduces the amount of information a user must manually introduce to the system. A single sweep of a barcode strip (followed by an audible 'beep' to announce the successful reception of information) and the commands are executed. A barcode system is made up of these parts:

• a videodisc player and monitor;

• a videodisc (of course);

• a barcode scanning pen (barcode reader);

• authoring software to generate barcode command sets.

At the time of this writing, not all videodisc players support barcode systems; barcodes are still relatively new. Most older hardware, however, can be adapted, and we anticipate the transition time until all players are supported will be brief.

The barcode scanning pen (a.k.a. barcode reader or barcode wand) is an infra-red or wired device which, when swept over a barcode strip such as the ones illustrated above and below, transports serial commands to the videodisc player. The shape of the pen is familiar to users, and it has the distinct advantage of not requiring special knowledge to use.

Much of the instructional videodisc software now being produced contains barcode indexes and common command strips (such as "step forward," "play," "scan"). Still, to take full advantage of barcodes, it is necessary to print your own command sets on barcode strips. Authoring software is available for both Macintosh and IBM platforms to encode, rehearse and print barcode command sets. A standard command set was established by the LaserBarcode Association in Japan, which includes companies such as NEC, Sony, Pioneer, Toshiba, and Sanyo. The standard command set ensures that materials developed in conformance to the standard set will work with all LaserBarcode compatible players.
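
To illustrate what authoring software does behind the scenes, the sketch below assembles the same kind of search-and-autostop instruction as the RCU example earlier, as a single command record that the authoring package would then render as a printable barcode strip. The function name and the textual command format are our own assumptions; real packages encode commands according to the LaserBarcode standard, which is not reproduced here.

    # Minimal sketch: build a "play this segment" command record for a barcode
    # strip. The textual format is hypothetical; actual LaserBarcode encoding
    # is handled by the authoring software itself.
    def play_segment_command(start_frame, end_frame, label=""):
        """Return one command record equivalent to: start-Search-end-Autostop."""
        if not (0 < start_frame < end_frame):
            raise ValueError("frame numbers must be positive and in order")
        return {
            "label": label,   # caption printed beside the strip
            "command": f"SEARCH {start_frame} AUTOSTOP {end_frame}",
        }

    # One sweep of the printed strip replaces the ten RCU keypresses
    # 2-0-0-0-Search-3-2-0-0-Autostop from the earlier example.
    strip = play_segment_command(2000, 3200, label="Lecture illustration 1")
    print(strip["label"], "->", strip["command"])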

Barcodes offer some very useful options for instructors and students. For group instruction, barcode strips can be placed in the margins of lecture notes or pasted onto a reference index. During a class, an instructor can select from a planned set of video illustrations without having to look up frame numbers and enter them with an RCU. Our experience is that this is a marvelous convenience, and it frees one to attend to the class instead of the equipment. We do, however, suggest you keep an RCU and comprehensive disc indexes available, as students have the uncanny ability to ask for illustrations you have not planned. For these, and similar, instructional situations, the flexibility of the RCU is welcome. Barcodes, by their very nature, are more useful for planned (or at least anticipated) instruction, rather than for flights of serendipity.

A highly touted self-instructional version of the above suggestion is the illustrated textbook for readers (a modest print and video multimedia system). In the margins of the textbook, or in the body of text, barcodes can offer illustrated sojourns for students through related video pictures and segments. This is indeed a useful application, and it couples a newer instructional technology with one of the oldest ones, resulting in a familiar, yet potentially highly interactive multimedia resource for learners. This also challenges instructional designers to learn more about text design and how to merge external media with text to create integrated learning systems.

Yet another instructional application of barcode systems is for multimedia classroom displays. A display of realia, say a human skeleton, could have barcode strips placed on or adjacent to specific features. When scanned, the student might receive some video instruction about the "part" or see the system in its living state. Barcodes can be attached to most anything (well, all right, fish are difficult), and realia can extend the learning offered vicariously through video. Using realia as part of a multimedia system can introduce immediate, concrete experiences for learners which are not necessarily subjected to the treacheries of generalization.

For independent study and use with children, the barcode reader offers easy access to a wide range of material without dedicating a computer to a particular learning station, without spending a great deal of time training students how to use it, and without exposing more vulnerable systems to the tyranny of a third-grader's onslaught.

Touch Screen

Touch screens are the finger food equivalent of the mouse. They are characterized by a frame mounted to the face of a display screen. Although the technologies vary significantly, all touch screens perform similar functions: they identify coordinate positions on the screen and report to a program when one of the positions has been trespassed. Individuals interact (make selections) in a program by directly touching the screen. Touch screens require little or no knowledge of special commands to use them effectively, and they free the user from looking back and forth from the screen to the keyboard, or from coordinating actions with a mouse. They do, however, usually require specialized knowledge of a programming language to develop instruction. Because most applications are system specific, touch screen based programs are not very portable and are usually limited to more expensive systems. Touch screens also retain the same disadvantages as a mouse. User responses are limited to selecting items from the screen or tracing positions from one position to another on the screen. Still, touch screens are used in a wide variety of settings, but are particularly beneficial in training or public display programs, where the users are not typists, may be techno-phobic or may span a wide range of ages and abilities.

We suggest touch screens are not as useful or flexible as other interfaces for classroom settings.

Voice Recognition

Voice recognition interfaces permit the user to give verbal instructions, commands, and responses to programs. Obviously, this is a desirable (almost transparent) interface; the user does not need to learn how to operate any external device or even make the effort to point. This type of interface would find tremendous application in an interactive multimedia environment in order to request information from a variety of sources.

Developments in voice recognition are just starting to make their way into mainstream instruction, most noticeably for individuals with physical disabilities which hamper fine motor activity necessary to manipulate a keyboard or mouse efficiently. Others of us dream of the day we can speak in natural language with a computer, and have it interpret our instructions conversationally.

Voice recognition systems require a user to speak into a microphone. Vocal input is represented as voice print patterns, and those patterns are compared to resident patterns. For example, if you said, "Save the file I am using," a numerical representation of that speech would be constructed. A programmer, anticipating this request, would have entered the phrase or key elements of the phrase for comparison. The input phrase would be compared to all resident elements, and if a match occurred, the command would be carried out.

One difficulty with voice recognition is in generalizing commands to several individuals because of differences in vocalization and pronunciation. With most programs in current use, one first records a series of commands or key words which become the resident voice print patterns for later comparison. In this way, you are matching your own voice when you input a command, but you are limited to phrases you entered or constructions from the phrases you entered. Voice print patterns also require a substantial amount of memory, a problem which file compression, sampling and larger computers will reduce in the future. For now, the vocabulary of voice recognition remains small, but development is brisk in this area, and it is not unreasonable to expect free-form natural language interfaces to be available at some time in the not-distant future.
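
A heavily simplified sketch of the matching step described above: the spoken input is reduced to a numeric pattern and compared against the resident patterns recorded earlier, and the closest match within a tolerance triggers the associated command. The short vectors and the distance threshold are stand-ins for real voice-print analysis, which is far more elaborate.

    # Minimal sketch: match a spoken command against resident voice print patterns.
    # The "patterns" here are stand-in numeric vectors; real systems derive them
    # from spectral analysis of the recorded speech.
    import math

    RESIDENT_PATTERNS = {
        "save the file": [0.62, 0.10, 0.81, 0.44],
        "play the disc": [0.15, 0.90, 0.33, 0.72],
    }
    THRESHOLD = 0.25   # largest distance still accepted as a match (assumed)

    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def recognize(input_pattern):
        """Return the resident command closest to the input, or None if nothing is near."""
        best_command, best_distance = None, float("inf")
        for command, pattern in RESIDENT_PATTERNS.items():
            d = distance(input_pattern, pattern)
            if d < best_distance:
                best_command, best_distance = command, d
        return best_command if best_distance <= THRESHOLD else None

    print(recognize([0.60, 0.12, 0.80, 0.47]))   # close enough to "save the file"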

Virtual Reality Interfaces

A virtual reality interface is a complete environment, one in which the user physically enters and interacts with the program. For example, if you were learning about landscape design in a virtual reality environment, you might walk around an area to be designed. You could pick up shrubs and plant them in various locations, lay a paving stone walkway, install lights, construct a water treatment and then walk through your creation to view it from a variety of perspectives. Absurd? Not a bit. Virtual reality programs and interfaces already exist, and their sophistication is growing rapidly.

The interface to accomplish this type of interaction is specialized and usually specific to a particular treatment, although most interfaces include some combination of goggles and gloves or data suits. A great deal is written in popular, electronic, and computer publications about prototypes and visions of virtual reality, but commercial applications are just beginning to emerge.

A virtual reality system requires a presentation system and an interface for interacting with the system, no different from other multimedia systems in that regard. The trick is to make the presentation system so complete, so absorbing, that the user can treat the simulated "world" as if it were real. Typically, this is accomplished by wearing a pair of goggles which places a small television in front of each eye, thus giving the wearer binocular, three-dimensional vision. The goggles sense the position of the head and communicate with a computer, which changes the perspective on the screen as you change the position of your head. When you want to see what is behind you, you turn and look. As part of the presentation system, the user also wears earphones of some sort, in order to introduce stereo sound into the system.
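
In outline, the head-tracking loop just described reduces to a few steps: read the head position and orientation from the goggles, work out a slightly offset viewpoint for each eye, and redraw both views. The sketch below is only a schematic of that loop; read_head_orientation and render_view stand in for whatever a particular goggle and graphics system actually provide.

    # Minimal sketch of one pass of a head-tracking loop. The sensor and renderer
    # functions are placeholders, not a real virtual reality API.
    import math

    EYE_SEPARATION = 0.065   # approximate distance in metres between the two eyes

    def eye_positions(head_position, yaw_radians):
        """Return left and right eye positions, offset sideways from the head centre."""
        # Offset perpendicular to the direction the head is facing.
        dx = -math.sin(yaw_radians) * EYE_SEPARATION / 2
        dy = math.cos(yaw_radians) * EYE_SEPARATION / 2
        x, y, z = head_position
        return (x - dx, y - dy, z), (x + dx, y + dy, z)

    def update_display(read_head_orientation, render_view):
        """Sense the head, then redraw the scene once for each eye."""
        position, yaw = read_head_orientation()
        left_eye, right_eye = eye_positions(position, yaw)
        render_view(eye="left", viewpoint=left_eye, yaw=yaw)
        render_view(eye="right", viewpoint=right_eye, yaw=yaw)

    # Tiny demonstration with a stub sensor and a stub renderer:
    update_display(lambda: ((0.0, 0.0, 1.6), math.radians(90)),
                   lambda **view: print(view))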

This area of development, as exciting as it is, is also very highly specialized, and is unlikely to significantly influence classrooms for the foreseeable future. Nevertheless, we mention it because of its potential and vitality. To quote Jaron Lanier, founder of VPL Research, from an interview in Omni magazine:

Sometimes I think we've uncovered a new planet, but one that we're inventing instead of discovering. We're just starting to sight the shore of one of its continents. Virtual reality is an adventure worth centuries. (Stewart, 1991, p. 117)



Microcomputers

A microcomputer lies at the heart of any interactive multimedia system. Any of the components we have discussed previously can be used without a computer, but of course, a computer extends the capabilities of other media. The trend we see emerging in interactive multimedia is that "all roads lead to computers." By that we mean other media are being compressed and incorporated into computer systems. Computers are already capable of delivering full-motion video, digital audio and animation (along with the more routine information) without the use of peripheral devices such as videodisc players and CD-ROM readers. These are referred to as "multimedia computers" and they do indeed manage to house the various media within a single package. The drawback, at present, is that in order to create some of the more exotic treatments, such as full-motion video, computers require massive files. Compression technology is improving, and computing power is increasing, so it is very possible that in the not-distant future, we will be working with computers as the predominant, integrated medium for educational use.

Computers have already become part of the instructional fabric in education. For that reason, we will not go into an elaborate discussion of what a microcomputer is and how it works. Many excellent references are available if you have not yet had a close encounter with a computer of some type. We are often asked by teachers and school systems to recommend a particular make of computer for purchase. We have our own favourites and villains, but we cannot give firm answers to enquiries. Basically, every purchase decision boils down to how you intend to use the equipment and what your budget can support. All other things being equal, we often advise educators to purchase equipment which is compatible with equipment in the school division. But then, all other things are seldom equal.

This section will examine some of the trends in the use of computers in Saskatchewan and elsewhere. The direction on which Saskatchewan has embarked calls for all students to become functionally computer literate. This will be achieved through the introduction of computers as personal productivity tools across the curriculum, supporting all subject areas, rather than through the introduction of discrete computer literacy courses (Butler, 1991). We would add that this requires a reasonably high level of computer literacy by teachers in the province.

Uses of Computers in Education

Computers have been alternately viewed as objects of instruction, as aids to instruction, and as productivity tools for teachers and students. Each of these represents a valid perspective on the use of computers, but they carry very different implications for the classroom teacher and student.

Computer science is the model imposed on computers as objects of instruction. From this perspective, students take courses about computers. These may include topics such as the history of computing, programming, keyboarding or systems design. Certainly this has its place, but most students (and teachers) have little interest in learning the intricacies of programming in Pascal. This approach to computing in education will survive as a subject area for those interested in pursuing careers in related fields.

Computers are also viewed as aids to instruction, providing the curricular support mentioned earlier. Computers may be at the heart of presentation systems for teachers, deliver independent study materials to learners, or provide rich resources for resource-based approaches. Certainly, this is a significant and growing role for computers to play in K-12 education.

A dominant role for computers to play in education is as personal productivity tools for teachers and students. For example, from this perspective, users learn how to use the computer to facilitate research, design and layout reports, perform word processing, design spreadsheets and search electronic bulletin boards. This is a very important role for the computer to play in our schools, especially given the pressures on our schools to prepare computer literate individuals.

Microcomputer Usage in Saskatchewan Schools

A survey was mailed to the principal of each school in Saskatchewan in 1989 in order to determine the current and projected status of computer usage in Saskatchewan schools (Proctor, 1990). Fifty-two percent (52%) of the surveys were returned, and results were either derived or projected from these data. Respondents not only reported on current computer usage, they also projected the level of usage for 1994. While several highlights are presented below, we recommend you refer to the full document for details.

• Approximately 10,000 computers were added to school inventories between 1984-1989, bringing the total complement to more than 13,000.

• The Apple II series of computers accounts for more than half of the inventory, and is particularly evident in elementary and multi-division schools.

• It was projected that Apple will maintain its current market share, while IBM and Macintosh computers will increase their shares of the market. Commodore and other models will decline.

• Approximately 50% of computers in the schools are located in labs, with no change predicted in the future.

• School board budgets and EDF funds were reported as the most significant sources of funds to purchase computer hardware.

• The need for school and division level inservice training workshops is increasing. The expressed need is primarily for computer applications and integration into the curriculum.

• The role of the computer as an object of instruction is declining.

• The use of the computer to deliver and manage instruction was seen to be increasing, including tutorial instruction, simulation, problem solving and discovery learning.

• Preparation in computing was viewed as an important component in the training of pre-service teachers.

• Approximately 1950 teachers identified themselves as extensive users of computers in their classrooms, representing 16% of the teaching force.

These data support the notion that computers are well ingrained into the educational fabric of our province. They are the objects of considerable attention, and will likely increase in prominence in the future. We suggest there are also some areas for concern couched in the results.

An overwhelming share of the current inventory in our schools consists of outdated equipment. Apple II and early models of Commodore computers will soon be obsolete, and they account for more than half of the computers in the schools. Through the EDF fund and board support during the last decade, schools responded aggressively to the challenge of introducing computers into classrooms. How will the educational system respond to the challenge of replacing aging and outmoded equipment? Schools must create immediate plans to replace existing equipment, and we recommend school divisions develop plans for continual replacement of educational technology in the future. This is a problem which will continue, and which can be generalized to other media as well as computers.

Another difficulty is how to prepare teachers to respond to the challenges mentioned above. While the respondents recognized the need for preservice and inservice training for educators, little has been done to systematically respond to this need. Currently, computer application courses are available only as electives in the teacher education programs at the University of Saskatchewan. Secondary teachers receive one half-course in educational technology. Inservice training is hit-and-miss, largely in response to occasional requests from the field. We suggest every preservice teacher should take at least one computer applications (not programming) course. Further, in cooperation with the Professional Development Unit of the Saskatchewan Teachers Federation, the Saskatchewan Educational Leadership Unit at the University of Saskatchewan, and other professional development agencies, the College of Education should provide systematic inservice training on computer applications for every practicing teacher in the province who requests it. Perhaps given the relatively high percentage of teachers who identified themselves as extensive users of computers in their classrooms, a system of peer tutoring could be developed.



Epilogue

Most of the implications of interactive media for Saskatchewan schools are implicit in the previous discussion. Interactive media can:

• provide support for traditional instruction;

• provide materials necessary for independent, collaborative and generative study;

• enhance learning environments;

• offer diverse, cost-effective alternatives;

• make resource-based learning viable in our province.

Our province has traditionally supported progressive educational measures. As we move firmly into the information age, we have little choice but to provide our students with the necessary resources to compete in local, provincial, national and global markets.



Selected Sources of Interactive Media

If you are interested in acquiring multimedia resources, we offer the following list of distributors. You can contact them for current catalogues and prices of materials.

A-J Tech, R.R. 2, Morinville, Alberta, Canada T0G 1P0

Activision, 3885 Bohannon Drive, Menlo Park, CA 94025

American Chemical Society, 1155 16th St. N.W., Washington, DC 20036

Brøderbund Software, Inc., 17 Paul Drive, San Rafael, CA 94903-2101

Britannica, Encyclopedia Britannica Educational Corporation, 310 South Michigan Avenue, Chicago, Illinois 60604

Bureau of Electronic Publishing, 141 New Road, Parsippany, NJ 07054

Compact Publishing, Inc., P.O. Box 40310, Washington, DC 20016

Discis Knowledge Research, Inc., NYCC P.O. Box #45099, 5150 Yonge Street, North York, Ontario M2N 6N2

EDUCORP Computer Services, 7434 Trade Street, San Diego, CA 92121-2410

Grolier Electronic Publishing, Inc., Sherman Turnpike, Danbury, CT 06816

Heartbeat Software Solutions, P.O. Box 4497, Cerritos, CA 90703-4497

Highlighted Data, Inc., 4350 N. Fairfax Dr., Suite 450, Arlington, VA 22203-1620

Hyperglot, 505 Forest Hills Blvd., Knoxville, TN 37919

Image Entertainment, 9333 Oso Avenue, Chatsworth, CA 91311

Jostens Learning Corp., 6170 Cornerstone Court E., San Diego, CA 92121

Lessoncard, P.O. Box 2778, 220 Cypress Street, Abilene, TX 79704

Lumivision, 1490 Lafayette Street, Suite 305, Denver, CO 80218

National Geographic Society, Education Media Division, 17th and M Sts. NW, Washington, DC 20036

Sierra On-Line, Inc., P.O. Box 485, Coarsegold, CA 93614

The Software Toolworks, 60 Leveroni Ct., Novato, CA 94909

The Videodisc Compendium for Education and Training, Emerging Technology Consultants, Inc., P.O. Box 120444, St. Paul, MN 55112

Videodisc Publishing, 381 Park Avenue South, Suite 621, New York, NY 10016

Videodiscovery, Inc., College Division, McGraw-Hill, Inc., 1221 Avenue of the Americas, New York, NY 10020

The Voyager Company, 1352 Pacific Coast Highway, Santa Monica, CA 90401

VTAE, Inc., 2564 Branch Street, Middleton, WI 53562

Wings for Learning, P.O. Box 3240, Station F, Scarborough, Ontario, Canada M1W 9Z9

Xiphias, Helms Hall, 8758 Venice Blvd., Los Angeles, CA 90034



The Technology and Role of Distance Learning for Saskatchewan

Technologies for the delivery of learning are not novel, although one is left with that impression considering the pace at which telecommunications technology is developing. An unprecedented number of strategies for delivery of learning to the distance student are available to today's educators. The scope, diversity, and rate of change make selection of appropriate delivery mechanisms difficult and risky. Yet these decisions must be made, and a stake driven, if education and training are to keep pace with the need.

A significant segment of change is occurring in video technology. The "video revolution", now of landslide proportions and gaining momentum, is due largely to advances in delivery technology. Television has been the mass medium of choice for over four decades, providing powerful life-shaping messages in areas of entertainment, news, and lifestyle. Video coupled with the computer and delivered via new digital technologies is putting the learner and the instructional developer in the "digital driver's seat". The union of computer and television with the new technologies of digital storage and delivery produces a technology called "multimedia". This technology gives added substance to the concept of television as a "universal medium" which has prevailed for forty years. Compressed digital video (CDV) is the glue which ties together diverse technologies by providing exceptionally cost-effective communication when bandwidth-hungry analog transmission is reduced to narrow bandwidth by digital transmission. A dramatic illustration is the compression of roughly 4,700 to 1 which occurs when video for a freeze-frame videophone is reduced from a full-motion transmission rate of 90 megabits per second (Mbps) to 19.2 kilobits per second, allowing transmission on standard telephone lines (90,000,000 bps to 19,200 bps).
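
As a quick check, the ratio falls directly out of the two transmission rates cited above; a one-line calculation (shown here in Python, though any calculator will do):

    # Ratio of the full-motion rate to the compressed videophone rate.
    print(90_000_000 / 19_200)   # 4687.5, i.e. roughly 4,700 to 1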

Saskatchewan is uniquely situated to take advantage of new and emerging distance education technologies because of its strong provincial heritage of pedagogical and technological development designed to bring education to the remote learner. Saskatchewan's technological roots extend to the mid-1870's, when the first rural and urban telephone networks linked citizens of this sparsely populated province. Since that time Saskatchewan has kept abreast of, or led, technological developments related to communications media. Currently this province boasts a communications infrastructure second to none and the envy of many. Development of distance-oriented programs and materials designed to exploit the technological infrastructure has consistently lagged behind creation of the pathways to deliver learning at a distance. "Open learning" has also been slow to develop in Saskatchewan. The existence of very good traditional educational organizations, a relatively homogeneous rural/urban population, and a lack of will among educational providers to cooperate and collaborate slowed development of programs designed to extend learning outside the institution. Changes in provincial demographics, reduced provincial economic circumstances, enrollment caps at traditional institutions, and a growing number of new learners requiring training, career re-tracking, literacy, and lifelong learning experiences at a time and place of their choosing have spurred educators to consider alternative strategies for delivery of learning to the people of Saskatchewan.



Education at a Distance

It is important to specify precisely the meaning of terms that have numerous variations in current common use. Distance education, distance learning, and open learning have crept into our daily jargon and tend to be used interchangeably. Although related, these three terms have quite different connotations for those who work with learners located at a distance from the institution. The following definitions will clarify these three terms in respect to the discussion in this paper.

Distance education is defined as a means by which instruction is offered to learners who are geographically separated from the provider; it is often described in terms of the technological devices used. Distance education tends to be institution centred rather than learner centred.

Distance learning is defined as a process of instruction characterized by separation of the learner from the source of instruction. In this process learning is facilitated by providing the learner with links to the originating site for transfer of information and interaction with the teacher. Distance learning, closely related to "open learning", describes, from the learner standpoint, a concept of how learning will be developed and arranged to best suit the needs of the student.

Open learning is a conceptualization that extends beyond the traditional meanings attributed to distance education and distance learning. Open learning embraces all modes for delivery of learning materials to learners who are at a distance. Conceptually the student or learner is at the centre of open learning, and educational providers endeavour to deliver, on a cooperative/collaborative basis, learning materials configured to the needs of the learners. Open learning removes barriers to learning that face students who must, for whatever reason, pursue their educational interests distant from a traditional institution. Open learning is learner centred and attempts to extend the learner's existing knowledge, experience, and skills. Open learning is premised on the philosophy that learning is life-long. Open learning is characterized by provision of learning opportunities which:

• permit students to learn at a time and place of their choice;

• provide a choice of learning strategies;

• permit transfer of skills and knowledge;

• foster collaborative/cooperative relationships among educational providers;

• permit credit for previous training and experience;

• provide broad educational opportunities including credit, non-credit, informal, training, and public interest programming;

• provide access to learning resources, and interaction with teachers.



Overview

In its time the book was as novel and revolutionary as today's advanced multimedia delivery systems and held no less promise. Increased communication and interaction between the teacher and the student has been a driving force underlying development of every technology supporting the goal of learning at a distance. Existing delivery technologies have enjoyed varying levels of success in addressing the needs of the distance learner. Emerging technologies, particularly those involving a telecommunications component, have the potential to catapult distance learning to the forefront of the educational process. The client pool for non-traditional delivery of learning far exceeds the number of students now enroled in traditional institution-based programs. These new technologies hold much promise as a means to attack the enormous task of education, training, and retraining facing the world as we hurtle through the few years remaining before being thrust into the twenty-first century.

The see-saw of the information and technology explosions throughout the nineteenth century has alternately created the need for and solution to the scope and level of education and training required to support the society of the time. To establish a perspective on telecommunications strategies for delivering distance learning, one must have a feeling for that which has gone before. The history of bringing learning to the student is a fascinating and integral part of present day advances in telecommunications delivery. The goal of this paper is to trace the evolution of distance learning delivery modes from the earliest person-to-person, local educational extension programming to present day telecommunications developments with global capability.

Distance learning, as defined earlier, is facilitated by providing the learner with links to the originating site for transfer of information and for interaction with the teacher. The quality and capacity of links to the learner have enjoyed a constant state of growth, from the first correspondence courses in the late 1800's, dependent on the mail service for interaction, to present day full-motion interactive television instruction via satellite. Interaction with the remote learner has moved from sporadic, impersonal, delayed communication to real-time personalized dialogue with the instructor. More than a century of development has improved the communication modes linking the learner with the originating site. New modes have not directly replaced the old but have consistently added another dimension. The importance of print and mail service to distance learning remains high, and new telecommunications technologies have improved access and interaction.

Strategies for delivery of distance learning by telecommunications cannot be considered in isolation from the preceding modes of delivery. Present day distance learning practice is a continuum from print to multimedia and from mail service to satellite service. New strategies for delivery and interaction will not instantly replace serviceable existing modes. Rather, the old and the new must form an amalgam providing the best possible situation for the distance learner. Therefore, in the zeal to adopt the new, one must never lose sight of past and present methods. The real gain comes when the new is integrated with the old to take full advantage of the added telecommunications dimension. Cost effectiveness is a factor of distance learning that is best served by evolution rather than by revolution.

Until the mid-1800's word of mouth was the predominant mode for distance learning, and the instructor physically attended the instructional site. This mode remains predominant in many parts of the world today. Once print became established in the form of books, manuals, and newspapers, instruction at a distance was enhanced by learner access to a multitude of print-based information. Paper-based print has formed the backbone of instruction at a distance for the past century and will remain a major information carrier well into the next century. "Digital fusion", the ability of digital processes to convert all forms of media to a common denominator, digital "bits", for transformation, storage, and delivery, fuses the old with the new, giving the educator a fantastic arsenal of material, process, and delivery of learning with which to carry out education at a distance.



Delivery Continuum

A history of distance learning delivery modes and devices of delivery in roughly chronological order would include: face-to-face, print, mail service, telephone, radio, audio tape, broadcast television, video tape, audio teleconferencing, cable television, videotex, telewriter, computer assisted learning, audiographics, computer conferencing, satellite delivery, facsimile, video conferencing, video disc, CD-ROM, business television, compressed video and multimedia. Figure 10 provides a graphic representation of the delivery continuum.

Media devices used to deliver the information component for distance learning fall into two basic configurations: passive devices (one-way), which do not provide interaction with the instructor, and interactive devices (two-way) that permit direct interaction. Interaction and feedback are judged to be essential components of the learning process for all except the literate, mature, and motivated learner. Telecommunications devices used in "teleconferencing" mode are part of an "interactive technology" linking learners, often in multiple locations, by two-way communication. Whether as an inherent component of an "interactive" device or as an adjunct to a "passive" device, telecommunications can make the difference for the independent learner. Figures 11 and 12 illustrate some of the passive and interactive media.
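
Since the figures are not reproduced here, the short listing below groups a few of the modes from the continuum into the two configurations just described, purely as an illustration; the assignments follow the mode-by-mode descriptions later in this section.

    # Illustrative grouping of some delivery modes into passive (one-way) and
    # interactive (two-way) configurations, following this section's descriptions.
    DELIVERY_MODES = {
        "print and mail service": "passive (feedback by mail)",
        "radio": "passive (feedback by mail or telephone)",
        "broadcast television": "passive (feedback by mail or telephone)",
        "telephone / audio teleconferencing": "interactive",
        "audiographic teleconferencing": "interactive",
        "computer conferencing": "interactive",
        "video conferencing": "interactive",
    }

    for mode, configuration in DELIVERY_MODES.items():
        print(f"{mode:36} {configuration}")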

Research points out that learning supported by the emerging interactive technologies is as effective as learning delivered by traditional methods. A beneficial aspect of new technologies for delivery of learning is the capacity of the new digital technologies to eliminate the "batch" aspect of traditional and distance learning by allowing the learner to interact on an "asynchronous" basis (not in real-time but at a convenient time) with the teacher and/or with the content base. Telecommunications strategies for reaching and interacting with the learner demonstrate potential for establishing the 90's as a "decade of the independent learner".

Face-to-Face:

This is the longest standing distance learning mode, with instructor and instructional materials travelling from the institution to the learner. Attempts are made to re-create institutional settings at the learning site. This mode, usually not cost-effective, is characterized by a high degree of interaction and low availability of resources. Instructors travel varying distances, in varying conditions, and by almost every conveyance imaginable, as illustrated by Figure 13.

Print:

Gutenberg's invention of the printed page truly enabled the distance learner. Correspondence courses, books, manuals, periodicals, and newspapers remain today a major mode for distance learning. The book allows the learner true independence, with access any time, anywhere, and in any sequence. Print is cost-effective and rich in digital and iconic information. A vast organized body of print resources exists. To be successful, print, as a distance learning mode, relies upon a motivated learner with a basic level of literacy. It is cost-effective for small or large client pools, but instructional effectiveness may be reduced due to low or delayed feedback to the learner. Figure 14 outlines the various configurations associated with print and the usual forms of interaction. Audio and video cassette tapes have successfully supplemented print as a method to increase the richness of communication between teacher and learner. In general, cassettes have been used to carry prepared material to the learner. Use of audio tape cassettes as a means for the teacher and student to interact has proven to be a very successful way of providing feedback. However, tapes are subject to postal delays in the same manner as written correspondence when used as a means of feedback.

Print will continue to form the backbone of distance learning strategies for many more years as we move toward an increasingly "paperless" society. New and emerging media will gradually replace paper print as computer memory and storage decrease in price and in the power they require.

Postal Service:

A developed and reliable mail service continues to provide feedback as an interactive link between the learner and the instructional site. Interaction is delayed (asynchronous) and often cumbersome and inadequate. In situations where the remote learner does not have access to telecommunications the postal service is often the only interactive link with the instructor. One-way instruction delivered by broadcast or satellite television often relies on the mail to provide interaction with the student. Mail service may provide both delivery of instructional materials and interaction with the student when learning is based on audio tape, video tape, or computer disc. Postal service is one of the most cost effective methods to deliver resources to the learner and receive feedback from the learner. Quality, consistency, and cost of interaction by mail depend upon resources available at the originating site.

Telephone:

Bell's invention of the telephone made possible the first real-time (synchronous) interaction between the teacher and the distance learner. The telephone is truly a ubiquitous telecommunications device, a universal network in which one has a reasonable expectation that anywhere one calls in the world the telephone will be answered. Initially the telephone network was used to make it possible for remote learners to call the instructor and receive individual help with instructional problems. Audio teleconferencing, using a conference bridge, enabled many individuals to be networked in real time for instruction and business meetings. Some forms of video teleconferencing make use of the telephone network to transmit interactive video and audio for instruction and business applications. Recent innovations in video compression assure that the telephone network will remain a viable option for delivery of interactive learning. Digital technology and fibre optic transmission inject new potential into the telephone network as a component of future distance learning and communications systems. The telephone network provides high interaction at low cost, and is effective for sparse, widely separated learner populations.

Radio:

Marconi's invention of wireless communication enabled educators to provide learning to remote learners not reached by telephone and the postal service. For a number of years, schools of the air and school radio broadcasts provided learners with direct instruction and instructional enrichment. One-way audio instruction was cost-effective in that large numbers were reached, receiver cost was within reach of everyone, and cost of production was relatively low. It required an infrastructure to provide feedback and print related support materials and was instructionally effective when all components were in place because it provided motivation for the independent learner and did not require the same degree of literacy as print modes.

Audio Tape:

Audio tape, particularly in cassette format, has enjoyed a long association with the distance learner as a medium of instruction and as an integral component of distance education courses. It is an effective learning medium due to portability, ease of review, potential for personalization, large body of readily available material, rapid low cost production of new material, inexpensive replay/record machines, and low cost of cassette tapes.

Broadcast Television:

Broadcast television, in off-air mode, was the first medium to bring cost-effective, information rich, aural and visual information into the home. A visual channel greatly expanded the quality of communication beyond that which had been associated with telephone and radio as delivery modes for distance learning applications. Television earned the reputation of being a "universal medium" incorporating beneficial aspects of other media and is often described as "reality thirty times a second". The major drawback to broadcast television as an ideal medium for instruction was its passive one-way transmission, which relied on other media such as telephone or mail for interaction. Second, third, and fourth generation developments in television (cable, satellite, and multimedia) have overcome this limitation. Television has maintained its position as a "universal" medium by being incorporated into the multitude of modes now delivering programming for distance learning. The information rich audiovisual nature of television enhances its ability to transfer knowledge and abstractions to learners who possess minimum levels of visual and print literacy. Cost of production, inflexible schedules, lead time, technical expertise, security, and production expertise limit use of broadcast mode television as a distance learning medium. Later developments in television have eliminated some impediments, but cost and the requirement for quality of production continue to limit its widespread use in distance learning. Figure 15 shows the possibilities for broadcast, cable, and satellite distribution of distance learning materials.

Video Tape:

This medium possesses the beneficial qualities of broadcast television and eliminates some of the disadvantages. Video tape is a flexible medium: the learner can use it at any time, it integrates easily with other media, it has a large program capacity, the recording medium is low in cost, and replay equipment is inexpensive.

Instructional Television (ITFS):

ITFS, Instructional Television Fixed Service, is a low-power, omnidirectional, line-of-sight microwave television broadcast technique used primarily by educational services to deliver programming to buildings and geographic areas located within a 30-50 kilometer broadcast area. ITFS uses microwaves in the 2,500 MHz band which must be up-converted for transmission and down-converted at the receiving site. This was a popular method of educational transmission from the mid-60's to mid-70's for densely populated urban centers. ITFS is still used in areas where mountains and water do not permit subscribers to receive signals from regular broadcast, or where cable is too difficult to lay.

Audio Teleconferencing:

This refers to use of the telephone on the Public Switched Telephone Network (PSTN) to enable three or more people in two or more locations to share a common discussion. The simplest teleconference uses a speaker-phone at each of two locations, enabling individuals at both locations to carry on a discussion with each other. When a telephone bridge is used a discussion between three or more telephones can be carried out simultaneously. The modern electronic telephone conference bridge permits automatic dialing of all conference participants, allows individuals to call into the conference, announces conference participants, and monitors and adjusts volume and quality of the telephone links. New digital telephone exchanges permit subscribers to link up or self-convene from two to six participants in a conference mode. Standard telephone bridges accommodate up to thirty conference connections, and in special cases thousands have been accommodated in one conference. Figure 16 illustrates two configurations for audio teleconferencing.

Cable Television:

Transmission of video is through coaxial cable or optical fibre directly from the source to the subscriber's television set. Several signals (channels) may be transmitted at the same time and selected by the tuner on the television receiver. It is possible to transmit signals in both directions to permit subscriber interaction. Cable requires very little power in comparison to off-air broadcast television and is cost-effective in densely populated areas. Most public cable operators are required by legislation to provide a dedicated education channel. Creation of programming is subject to the same cost/quality constraints as broadcast television. Channels available in VHF (2-13) and UHF (14-83) provide broad possibilities for scheduling, but cable operators tend to fill low channels with prime-time revenue-producing programming and relegate educational programming to unsocial times or to less accessible high UHF channels. In populated areas cable television has great potential for distance learning because it constitutes the Second Universal Network, appearing in a large percentage of homes. Integration with the telephone network for feedback and interaction further enhances cable as a mode to deliver effective distance learning programming to a broad spectrum of the population. See Figure 15.

Visual Electronic Remote Blackboard (VERB):

This involves transmission of voice and textual material from instructor to student using regular telephone lines. The instructor's voice, as well as material created by the instructor with a stylus and drawing pad, is transmitted to the instruction site. Voice is reproduced at the distant site on a loudspeaker, and the textual material is reproduced by a stylus on the stage of an overhead projector. Second and third generation VERB systems capitalize on advanced technology to reproduce the image on white boards, computer screens, and in color modes. Hard copy of black or white board images is an option on newer systems. VERB has been superseded by audiographic conferencing which involves interface of voice and computer images over regular telephone lines.

Teletext:

This is the generic name for a set of systems transmitting alphanumeric and simple graphic information via the extra line capacity in the vertical blanking interval of the television picture. Broadcast and cable television systems transmit textual and graphic information during regular programming. Teletext information is decoded and displayed on a modified television receiver. At one time teletext held great potential for education, but technological advances in transfer of data curtailed much of the development work in this medium.

Telewriter:

Microcomputers are linked through the telephone network to send and receive textual and graphic material. Data may be input from the keyboard of the microcomputer or by writing or drawing on a digitizing tablet. Information is viewed on all computer screens at the same time and may be converted to hard copy at any site. This mode is a synchronous mode of computer conferencing. Telewriter technology, as used to enhance audio teleconferencing, has developed into the mode known as audiographic teleconferencing.

Computer Assisted Instruction (CAI):

Students interact directly with instructional material programmed into the computer system. With Computer Managed Instruction (CMI), the computer system is used to manage information on student performance, progress, and resources in order to prescribe, guide, and administer the instructional process and maximize learning efficiency. CAI and CMI may be carried out via telecommunication links by use of a modem on switched networks. CAI may also function effectively by using the mail service to deliver discs to the learner. CAI emulates student/teacher interaction, provides immediate feedback, and allows the student to control the pace and time of learning. CAI is an asynchronous mode of computer conferencing in which the student interacts with the host computer rather than with other computer users in the network. Instructor/student interaction on an asynchronous basis with the host computer establishes a feedback link and enhances the effectiveness of this form of instruction.
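
To give a flavour of the immediate feedback described above, here is a deliberately tiny drill-and-feedback loop of the sort a CAI lesson might use. The questions and the two-try rule are illustrative assumptions, not a description of any particular package.

    # Minimal sketch of a CAI drill: present an item, check the response
    # immediately, and let the student set the pace (the loop waits for each answer).
    QUESTIONS = [
        ("What is the capital of Saskatchewan?", "regina"),
        ("How many provinces does Canada have?", "10"),
    ]

    def run_drill():
        score = 0
        for prompt, answer in QUESTIONS:
            for attempt in range(2):                 # allow one retry per item
                response = input(prompt + " ").strip().lower()
                if response == answer:
                    print("Correct!")
                    score += 1
                    break
                print("Not quite -- try again." if attempt == 0
                      else f"The answer is {answer}.")
        print(f"You answered {score} of {len(QUESTIONS)} items correctly.")

    if __name__ == "__main__":
        run_drill()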

Audiographic Teleconferencing:

Teachers and learners are connected via a combination of voice and computer, using narrow band telecommunications channels to transmit alphanumeric data, graphics, and freeze-frame video images. A telephone bridge connects students and teacher together in audio teleconference mode, and by using special software, students' computers are connected by modem to the teacher's computer. Information created on the teacher's computer, graphics tablet, frame store, or digitizer appears on each of the students' computer screens. The computer screen provides each participant the equivalent of an electronic blackboard. In older systems two telephone lines were required for carrying voice and computer data. In newer audiographic teleconference systems both voice and computer data can be transmitted on one line, resulting in much reduced carrier cost. See Figure 17.

Computer Conferencing:

Two or more computers are connected using modems and narrow band telecommunication channels. Communication may be in synchronous mode to connect instructor and student in real time and emulate face-to-face instruction, or it may be in asynchronous mode, which uses a host computer to store information for later access, such as electronic mail or a bulletin board. When more than two computers are linked a data bridge is required (Figure 18).

Satellite Delivery:

A satellite located in geostationary orbit appears to earth stations as fixed in space and provides a broad "footprint" from which it can receive transmissions from an "uplink" and re-transmit to any number of "downlinks" within the area of the footprint. This type of signal distribution is called point to multipoint. Each satellite has up to 40 receive/transmit transponders capable of distributing audio, video, and data signals. A number of satellites spaced in orbit can form a network capable of transmitting and receiving information to and from any point on earth. Satellite distribution can serve remote areas not linked by other telecommunication services. Satellites are most effective when used over a large geographic area with a large number of participants in local, national and international networks. Although satellites are capable of bi-directional communication, satellite transmission of distance learning material is more often one-way, with learner interaction provided by telephone link. Cost, programming constraints, terminal equipment, and lead time have restrained the small user. However, data compression, spectrum division, and advances in digital technology are reversing this trend to some extent. VSAT (very small aperture terminal) systems provide cost-effective bi-directional satellite delivery of narrow band programming of voice and data and are used extensively by corporations to link widespread offices. VSATs are an economical way to create a private network for business television (BTV). Educators make use of satellites operating at two different frequency allocations called C-band and Ku-band. C-band satellites were the first commercial carriers providing service to television and cable networks. They operate in the 6/4 GHz range and require relatively large receive dishes. The Ku-band satellites are more powerful, operate in the 14/12 GHz range, and therefore require smaller receiving dishes. See Figure 15.

Facsimile (FAX):

This is a device and process consisting of a scanner to convert hard copy to an electronic signal which is then sent by modem on a standard telephone line to the receiving site, where the signal is decoded, reassembled, and printed. Facsimile transmission has been possible since the 1920's. It was not until the mid 80's that the high-tech digital FAX made it possible to send facsimile images in seconds rather than minutes, and the cost of machine and carrier dropped to a point where FAX is now a universal component of every office. Although facsimile can be used as the sole mode for distance learning, it is most commonly used to provide hard copy interaction between learner and teacher as an enhancement to other distance learning modes. Developments in FAX transmission permit the FAX signal to be embedded in video for simultaneous delivery, during a videoconference, of hard copy such as evaluation and course materials.

Analog and Digital Data:

Analog representations bear some physical relationship to the original quantity, usually a continuous representation (an electromagnetic wave) in which information is encoded in direct relationship to the power of the original light or sound source. Digital representations, by contrast, present information as discrete numbers, steps, or time intervals. The analog television signal contains the equivalent of approximately 90,000,000 bits of information which, when transmitted as an analog signal, is impressed or modulated on a carrier in such a manner that the pattern of light and dark can be visually ascertained on a television waveform monitor. Digital information, on the other hand, is reduced to a code that contains only discrete 0's and 1's: off or on, go or no go.

Digital is a method of processing, transmitting, and storing data which operates in discrete electronic or optical steps, as contrasted to a continuous or analog method. Digital computers manipulate numbers encoded into binary digits (bits), or on-off pulses, while analog computers sum continuously varying forms. Digital communications/switching is the transmission of information using discontinuous, discrete sequences of electrical or electromagnetic signals which change in frequency, polarity, or amplitude to represent or encode information. Analog information such as audio/video signals may be encoded for transmission on digital communication systems.
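
The following short sketch illustrates digitizing in its simplest form: a continuously varying (analog) value is sampled at regular intervals and each sample is rounded to one of a fixed set of discrete steps, which can then be sent as binary digits. The sample rate and number of steps are arbitrary choices for the illustration, not figures from this report.

    # Illustrative sampling and quantizing of an analog waveform.
    import math

    SAMPLES_PER_CYCLE = 8      # arbitrary sampling rate for the illustration
    LEVELS = 16                # 16 discrete steps = 4 bits per sample

    codes = []
    for n in range(SAMPLES_PER_CYCLE):
        analog = math.sin(2 * math.pi * n / SAMPLES_PER_CYCLE)   # continuous value
        step = round((analog + 1) / 2 * (LEVELS - 1))            # nearest step, 0..15
        codes.append(format(step, "04b"))                        # 4-bit binary code

    print(codes)    # e.g. ['1000', '1101', '1111', '1101', ...]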

Bandwidth is the width of an electronic transmission path or circuit, in terms of the range of frequencies it can pass without distortion. The wider or greater the bandwidth, the more information can be carried by the medium of transmission. Electronic signals can be transmitted in digital or analog form. Bandwidth is typically measured in hertz, but may be expressed in bits per second. A voice channel typically has a bandwidth of 3000 cycles per second; a TV channel requires about 6 megahertz. Conversion of analog signals to digital form permits manipulation of the signal for transmission on a signal path with a narrower bandwidth than would be possible in analog format. One could compare the bandwidth required to transmit a television picture to a hose filling a barrel with water. An analog signal requires a large-diameter hose to fill the barrel in a given period of time, whereas with a digital signal the screen information can be manipulated to allow a garden hose to fill the barrel over a longer period, or even in the same period if the information is compressed or coded for transmission.
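
The arithmetic below, using only the figures quoted in this discussion (a 3000 Hz voice channel, a 6 MHz television channel, and a roughly 90 Mbps uncompressed television signal), puts the hose analogy in numbers.

    # Illustrative bandwidth arithmetic using figures from the text above.
    VOICE_CHANNEL_HZ = 3_000          # one telephone voice channel
    TV_CHANNEL_HZ = 6_000_000         # one analog television channel

    print(TV_CHANNEL_HZ // VOICE_CHANNEL_HZ)    # 2000 voice channels per TV channel

    UNCOMPRESSED_TV_BPS = 90_000_000  # uncompressed digital television signal
    NARROW_PATH_BPS = 1_500_000       # a much narrower digital path (1.5 Mbps)

    # Like the garden hose, the narrow path needs 60 times longer unless the
    # picture information is compressed or coded before transmission.
    print(UNCOMPRESSED_TV_BPS / NARROW_PATH_BPS)    # 60.0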

Video Conferencing:

Sites are linked by a video picture and voice using a telecommunications channel suited to the type of video required. Video conferencing can be carried out in three basic modes: freeze frame, compressed, and full-motion video, spanning bandwidths from narrow to wide. Point-to-point or multipoint conferences are possible and employ two-way video/two-way audio or one-way video/two-way audio interaction between sites. Any graphics or other material that can be picked up by a television camera can be displayed at all sites, and textual or graphic material may be annotated by presenters. Video conferencing emulates a face-to-face meeting and has the advantage of social presence, which enhances the richness of information communicated. Digital technology using compressed or compensated video on the public switched telephone network (PSTN) has revolutionized the ease with which organizations can participate in cost-effective, fully interactive, real-time conferencing on a network basis. Business, industry, government, and education have adopted this format in a massive move to establish full-function, interactive communication for training and distance learning.

Compressed video teleconferencing has become very popular because of significant savings in transmission costs and improved access, since the standard telephone network is used. Figure 19 shows the configuration for point-to-point and multipoint video conferencing using compressed video. Figure 20 illustrates the bandwidth choices available to conference users by comparing the number of standard lines equivalent to various compression rates.

Videodisc:

This is a plastic disc approximately 30 cm in diameter which is encoded by digital laser technology with up to 54,000 still images or 30 minutes of full motion video per side. Motion and still images may be combined on the same disc. Random access to images may be controlled by: instructions recorded on the disc itself, instructions programmed into the disc player, input to the player by bar code, or computer programs. Videodisc is becoming one of the main sources of program material for multimedia applications. Coupled with a computer, the videodisc becomes a powerful learning device with good image handling, interactivity, and image and audio quality. The major drawbacks are the cost of original videodisc production and the cost of interface equipment. Once produced, videodisc duplication cost is minimal and images are not degraded by use.
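
The two capacity figures quoted above are two views of the same disc, because television plays at 30 frames per second; the short calculation below (illustrative arithmetic only) makes the connection explicit.

    # 54,000 frames per side shown at 30 frames per second = 30 minutes of motion video.
    FRAMES_PER_SIDE = 54_000
    FRAMES_PER_SECOND = 30

    minutes = FRAMES_PER_SIDE / FRAMES_PER_SECOND / 60
    print(minutes)    # 30.0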

CD-ROM: (Compact disc-read only memory):

Plastic discs 12 cm in size are encoded by digital laser technology, providing on-line computer access to a vast amount of audio and video information. A single CD-ROM disc can provide the learner with on-line access to huge data bases such as encyclopedias and research documents. This device is a key element in multimedia learning systems. It is cost effective for large, stable data bases, since this mass storage device is read only and requires complete re-manufacture to include new or updated information.
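
To give a rough sense of "a vast amount of information", the arithmetic below assumes a disc capacity of about 650 megabytes and about 2,000 characters per printed page; both figures are assumptions for the sake of the illustration, not values taken from this report.

    # Rough, assumed figures: how many pages of text one CD-ROM might hold.
    DISC_CAPACITY_BYTES = 650 * 1_000_000    # assumed capacity of one CD-ROM
    BYTES_PER_PAGE = 2_000                   # assumed characters per printed page

    print(DISC_CAPACITY_BYTES // BYTES_PER_PAGE)    # about 325,000 pages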

Business Television (BTV):

This includes training, informational, and educational television production and programming delivered to private or closed group audiences. In the past it was generally carried by satellite with telephone interaction. Recent developments in compressed or compensated video have opened up this mode of broadcast for BTV via digital lines on the public switched telephone network. BTV provides business and industry with a cost-effective way to disseminate information and training to multiple, distant sites without incurring the high personnel, travel and time costs associated with on-site meetings. A major advantage is that live, interactive communication can be carried out, ensuring that all employees at all levels receive the same information at the same time. Business television users often establish their own network or participate in the increasing number of narrowcast networks established to provide service to specific users with common goals, such as automotive dealers, financial consultants, or workers in penal institutions. These new networks tend to be industry-wide, cutting across corporate and product boundaries.

Compressed Video:

The analog television signal as it is produced requires a tremendous amount of bandwidth to transmit (about 90 Mbps) when the waveform is combined with synchronization data and modulated onto a carrier signal for transmission. Digital technology permits encoding of the analog television signal in a much more efficient manner for transmission. Compressed digital video (CDV) reduces the amount of redundant information transmitted, and motion compensation results in only the changes between frames being transmitted. Compression ratios range from about 60 to 1 for motion video at 1.5 Mbps to as high as 6000 to 1 for freeze frame video at 19.2 Kbps. Compression is accomplished by a codec (coder-decoder) which processes and encodes the analog audio and video television signal for transmission; once transmitted, a similar unit reverses the process by decoding the digital signal and returning it to analog form, which can be viewed on a standard receiver. See Figures 19 and 20.
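
The arithmetic behind these figures is straightforward; the short calculation below uses only the numbers quoted above (a 90 Mbps television signal, a 1.5 Mbps circuit, a 19.2 Kbps line, and 30 frames per second) to show the 60 to 1 motion-video ratio and why a still frame sent without compression over a voice-grade line would take minutes.

    # Illustrative arithmetic using the figures quoted in the text above.
    UNCOMPRESSED_BPS = 90_000_000     # analog television signal, digitized
    MOTION_CIRCUIT_BPS = 1_500_000    # 1.5 Mbps digital circuit
    FREEZE_FRAME_LINE_BPS = 19_200    # 19.2 Kbps line
    FRAMES_PER_SECOND = 30

    print(UNCOMPRESSED_BPS // MOTION_CIRCUIT_BPS)       # 60 -> the 60 to 1 ratio

    bits_per_frame = UNCOMPRESSED_BPS // FRAMES_PER_SECOND
    print(bits_per_frame / FREEZE_FRAME_LINE_BPS)       # ~156 s for one uncompressed frame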

Converting analog signals to digital permits processing, transmitting and storing the signals in a computer-like manner. Analog video signals can be transformed and transmitted by digital techniques using much narrower bandwidths. It is possible to compress video to the extent that still compressed digital video (CDV) can be transmitted over a single telephone voice circuit, and motion video can be transmitted over a bandwidth as narrow as 6 standard telephone voice circuits.

Compressed video is expanding video teleconferencing in two major areas -- training and distance education. In satellite broadcasting, compression permits up to 20 channels to share one satellite transponder; formerly, one transponder was required to transmit a wide band analog television signal. Compressed video reduces the bandwidth of television signals to the extent that terrestrial lines become practical. Digital transmission can be carried on standard public switched telephone network lines on a bandwidth-on-demand basis, using multi-band inverse multiplexers to provide the bandwidth needed for the quality of transmission required. The user pays only for the bandwidth selected and the time connected.
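
The following illustrative arithmetic, assuming the 64 Kbps switched digital channels mentioned elsewhere in this report, shows how an inverse multiplexer assembles bandwidth on demand: the user dials up only as many channels as the application needs and pays only for those.

    # Bandwidth on demand: combining dial-up 64 Kbps channels (illustrative only).
    CHANNEL_BPS = 64_000

    for channels in (2, 6, 24):
        print(channels, "channels ->", channels * CHANNEL_BPS // 1000, "Kbps")
    # 2 channels -> 128 Kbps, 6 -> 384 Kbps, 24 -> 1536 Kbps (about a full T1)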

Manufacturers are introducing new lines of videoconferencing equipment of multi-mode design which have ushered in a new era of "application-driven" video conferencing. New codecs, inverse multiplexers, and vision processors coupled with improved compression algorithms and digital dial-up data lines are changing the face of teleconferencing. The user can realize enormous cost savings by replacing high-cost dedicated data lines of fixed capacity with equipment that accesses the required bandwidth on demand from the switched telephone network. Savings are further compounded when the user can select the bandwidth and compression for a specific application. In general, the larger and more formal the group activity, the greater the requirement for picture quality.

Videoconferencing has aspired to provide "in-the-same-room" quality of picture and interaction between remote sites. Full motion video, CD quality audio, and full two-way audio/video transmitted over fiber optic channels have indeed reached this plateau. Cost and installation lag prohibit wide-scale application of this vision for videoconferencing to all but selected geographic sectors. New compression technology and the effectiveness of compressed and compensated video for specific applications are helping to establish links with independent, geographically separated learners.

Multimedia:

Each technology of communication, as it has evolved and matured, has exhibited unique qualities and capabilities useful in support of learning-oriented opportunities. Multimedia is characterized by the convergence of these separate technologies into a whole that is far superior to the individual components. The cement of this union is the multimedia software which brings together computers, television, print, telecommunication, and mass information storage devices into a unified system. Multimedia systems not only exploit the beneficial qualities of each assimilated technology but provide a high degree of interactivity between the learner, developer, teacher, and learning materials. Multimedia will have tremendous impact on the way educators do business in the future. Educators and learners will have a high degree of decision-making power in the design, development, and use of the learning experiences provided. One might say that both the instructor and the learner will be in the "digital driver's seat" in all aspects of the development of learning experiences and their later use. Multimedia has been referred to as "digital fusion", representing the bringing together of digital technologies by the unifying power of the computer.

Multimedia is in its infancy; yet, as we approach the millennium, it is having a significant impact on education and training. The operating paradigm today makes use of the "windowing" capability of computer software to inset and overlay information in any order and from a multiplicity of sources, including computer disc, CD-ROM, videotape, audio tape, and laser videodisc. Multimedia software enables the instructor to organize information, using the capability of the computer to program and manage information and provide a tailored learning experience for learners of all types, levels, and capacities. On the learner side, multimedia enables the learner to plan, execute, and manage his or her learning experience at the rate, place, and time of the learner's choice. New digital storage, compression, and delivery technologies will, as they come on stream, provide ever increasing access to information bases and to live teacher/peer interaction. See Figure 22.

The next advance in multimedia will involve a "wireless workstation" which will access information via cellular networks in much the same manner as subscribers now use mobile cellular telephones. "Smart cards" will allow learners to track their learning experiences, materials, and information sources in much the same manner as bank cards now track our financial resources. Learners will be able to access their learning accounts by placing their smart cards in a workstation wherever they happen to be.



Implications of Distance Education for Saskatchewan Schools

Strategies for delivery of distance learning are in a state of flux and will remain so for most of the remaining years of this century. Educators cannot afford to wait for a stable platform from which to safely take the initiative to provide learning where it is needed. The best any of us can hope for is to make the necessary decisions from as solid a knowledge base as possible. This "primer" discussion has, on an introductory basis, placed past, present, and emerging technologies in perspective. The overview took a cursory glance at the various technologies available to the user. Change is happening at a pace which finds new developments in use before periodical literature and even manufacturers' data sheets are updated. This pace is illustrated by the continuum of devices and processes arranged in approximately chronological order in Figure 10. It is of some comfort to note that most of the processes for delivering distance learning are still in viable use today. We must also consider how the interval between innovations is shrinking. Multimedia, which assimilates most former communications modes using the power of organization found in computer software, allows us to view the future with a great deal of optimism. Using this new learning system configuration offers some assurance that little of what we have done before will be lost. Multimedia and new telecommunications technology ensure that today's distant students will have a place and a time to learn.






Glossary of Terms: Multimedia and Distance Education

Access Channel A television channel dedicated to public use often with provisions for the general public to originate its own programs, but sometimes only for governmental or educational purposes. Access channels are often prescribed by cable company legislation.

Access Gain entrance to a network, program, or service.

Access Time The space of time between a request for access and the time the information or service is delivered.

Accounting Rate A basis for cost determination between international telecommunications carriers for joint or shared provision of international service. The wholesale rate agreed to by international carriers for international calls.

Acoustic coupler A modem device capable of transmitting and receiving specified sound tones along telephone lines. It allows a computer and terminal to be connected using a modem and a standard telephone handset.

Algorithm Arrangement of mathematical functions that enable a computer to perform problem solving operations. Video compression techniques rely on an algorithm which converts analog video signal to a digital signal for transmission and then re-assembles the video signal for viewing.

Alphamosaic A method for generating videotex images on a screen. Displays are constructed using a mosaic of dots.

Amplitude Modulation (AM) One of three ways of modifying a sine wave signal in order to make it "carry" information. The sine wave, or "carrier," has its amplitude modified in accordance with the information to be transmitted.

Analog Representations which bear some physical relationship to the original quantity, usually a continuous representation (an electromagnetic wave) in which information is encoded in direct relationship to the power of the original light or sound source, as compared to digital representations where information is presented as discrete numbers, steps, or time intervals. (see digital)

Analog Transmission Transmission of a continuously variable signal as opposed to a discretely variable signal. Physical quantities, such as temperature, are continuously variable and so are described as "analog."

ANI Automatic Number Identification which provides the subscriber with the number of the caller. Also used in billing and routing calls and services.

Antenna (E) A device used to transmit and/or receive radio signals. In satellite communications an antenna often has one or more reflectors which direct incoming radiowaves onto a single point. Conversely, radiowaves are propagated in the direction of transmission by that focal point.

Apogee An orbiting satellite is at apogee when at its greatest distance from the earth.

Area Code A three-digit code designating a "toll" center not in the numbering plan area of the calling party. The first digit is any number from 2 through 9. The second digit is always a "1" or "0".

Artificial Intelligence Computer programs which perform functions, often by imitation, normally associated with human reasoning and learning.

ASCII American Standard Code for Information Interchange. A standardized code for representing characters which is recognized by most computer systems.

Asynchronous Transmission (1) Transmission in which each information character, or sometimes each word or small block, is individually synchronized, usually by the use of start and stop elements. Also called start-stop transmission. (2) Communication that takes place between individuals in different time-frames. Computer conference members address a host computer rather than each other and access the host at their convenience. The computer conference is not conducted in "real-time". Synchronous transmission takes place at the same time, such as a teleconference.

Attenuation The reduction of current, voltage, or power of a signal due to transmission loss through a cable path or equipment. (usually expressed in dB)

Audio Frequencies Frequencies that can be heard by the human ear (usually 30 to 20,000 hertz).

Audiographic A teleconference system which makes use of narrow band communications channels such as telephone lines to transmit audio, graphics, and computer text files.

Authoring Program A computer program which is designed for computer assisted instruction development. Procedures are predefined, and require minimal programming knowledge on the part of the user.

Azimuth The horizontal angle of an antenna measured clockwise in degrees from true north. The azimuth of an antenna which points due west is 270 degrees.

Backbone A main communications path, usually a multi-conductor wire cable or multi-strand optic cable from which other communication paths branch to customers.

Backup Preserving computer data by copying to magnetic or optical medium such as disk or tape. Making a second copy for use should the original be damaged while in use.

Band A range of frequencies between defined upper and lower limits. For example, the Medium Frequency (MF) band, as designated by the International Telecommunication Union (ITU), is 300-3,000 kHz.

Bandwidth The width of an electronic transmission path or circuit, in terms of the range of frequencies it can pass without distortion. The wider or greater the bandwidth the more information can be carried by the medium of transmission. Typically measured in Hertz, but may be expressed in bits per second. A voice channel typically has a bandwidth of 3000 cycles per second; a TV channel requires about 6 megahertz.

Barcode Reader An infra-red scanning device which interprets bar coded commands for a videodisc player.

Barcode A type of code used on labels to be read by a wand or bar-code scanner. The main application is in labelling retail products and documents in libraries. Also used to input programming code to devices such as laser disc players and CD-ROM players.

Baseband An information or message signal whose content extends from a specific frequency near dc to some finite value. For voice, baseband extends from 300 hertz (Hz) to 3400 Hz. Video baseband is from 50 Hz to 4.2 MHz.

Baud Bits per second (bps) in a binary (two-state) telecommunications transmission. After Emile Baudot, the inventor of the asynchronous telegraph printer.

Binary Code A base two system of notation, typically utilizing the numbers 0 and 1.

Bird Slang or nickname for a communications satellite.

Bit Contraction of "binary digit." The smallest part of digital information with equally likely values or states, "0" or "1", "off" or "on", or "yes" or "no". In electronic communication systems, a bit can be represented by the presence or absence of a pulse.

Bit Rate The speed at which bits are transmitted, usually expressed in bits per second.

Bit-Mapped Graphics Graphics composed of pixels which are individually addressed and accessed.

Boot or Bootstrap A method of inputting data prior to the loading of a computer program, so causing the program to be loaded.

Branching Moving the user from one sequence in a program to another, according to instructions in the program.

Bridge A device for interconnecting communications devices such as telephones and computers or two or more local area networks (LANs). A telephone bridge is an electronic device that connects three or more telephone lines together so that individuals can hold a teleconference. Advanced bridges automatically connect callers, announce those who join and those who leave a conference, and provide a constant volume for all conference participants.

Broadband A communication system with a bandwidth greater than voice band. Broadbands are capable of high-speed data transmission and usually utilize coaxial, microwave, or optical transmission. Used to describe high capacity transmission systems to carry large blocks of telephone channels, high speed data channels, or one or more video channels. Used to describe digital technologies which provide integrated voice, data, video and interactive communications services to businesses and households.

Broadcasting A radio wave communication service in which the transmissions are intended for direct reception by a wide spectrum of receivers such as the general public. Broadcast service may include voice, television or data transmissions.

Buffer A storage device used between communication devices to compensate for a difference in rate of data transfer, or time of occurrence when transmitting data from one device to another.

Bug An error in a computer program, glitch in a transmission, or flaw in equipment.

Bus A device that permits common connection of other devices, usually in parallel, to transfer information, power, or data from any one of several sources to anyone of several destinations.

Byte A group of binary digits which operate together as the smallest unit of usable information in a computer memory. A byte is often an 8-bit group, but can be a 7-, 9-, 16-, or 32-bit group. A byte is generally equivalent to the information found in one typed character.

C-Band A band of radio frequencies allocated to transmit satellite television or telephone signals. Signals on the C-band are transmitted at 6 Gigahertz and received at 4 Gigahertz.

Cable Television (CATV) Broadband radio-frequency transmission of video signals over coaxial cable or optical fiber directly to television sets in the home, as opposed to broadcast television. Video signals may be transmitted in one or two directions, thereby enabling viewers to input data. Cable television also makes possible pay services and video conferencing.

CAI Computer assisted instruction.

Capacitive Disc An obsolete videodisc system that used capacitance signals embedded on the disc and a stylus which touched the surface of the disc to read encoded information.

Carrier (1) The frequency within a given bandwidth upon which an information-carrying signal can be impressed or modulated with another information carrying signal. (2) An organization, company, or business (vendor) authorized by a government regulatory agency to provide a specific communications service. (3) A carrier system using one of many modulation processes in order to derive more than one channel from a single path.

Cathode Ray Tube (CRT) A video display vacuum tube used in television receivers and monitors and in computer display terminals.

CATV Community Antenna Television System, see Cable TV.

CAV Constant angular velocity. A CAV videodisc revolves continuously at 1800 rpm, one revolution per frame, making each frame of a CAV disc addressable, a basic requirement for interactive videodiscs.

CBI Computer-based instruction.

CBT Computer-based training

CCIR International Consultative Committee for Radiocommunications (Comite Consultatif International des Radiocommunications)

CCITT International Consultative Committee for Telephone and Telegraphs (Comite Consultatif International Telegraphique et Telephonique)

CD Compact disc. A format which records digital data on 12 cm. optical discs.

CD-I Compact disc-interactive. A compact disc format which includes audio, video and program data.

CD-ROM Compact disc - read only memory. A format for recording data on compact discs, permitting virtual storage of a large amount of information in a small format.

CD-ROM XA Compact disc-read only memory extended architecture. A format for interleaving audio and data within a basic CD-ROM format.

Cellular Provision of mobile telephone service via a system of interconnected low power radio transmitter/receivers, each of which provides service for a cell (a small geographic area). The mobile subscriber's call is automatically handed off to the next cell in the system as the subscriber travels through the cellular system.

Central Office (CO) The local switch or interchange interface for a telephone system or common carrier where local/long distance calls are switched.

Central Processing Unit (CPU) The component in a computer system which contains the circuitry to perform arithmetic, logic, and control functions that interpret and execute instructions contained in computer programs.

Channel The segment of a bandwidth which provides a pathway or a communications link between sending and receiving points.

Chapter A consecutive sequence of frames on a videodisc, usually identified as a coherent portion of a treatment.

Chapter Stop A code embedded in the vertical blanking interval of a videodisc that enables certain videodisc players to locate the beginning of chapters.

Chip Often termed microchips because of the microscopic transistors, resistors, capacitors, and connection paths which form complete electronic integrated circuits. Integrated circuits (IC) and large scale integrated circuits (LSI) may perform single or multiple electronic functions. Entire computer central processing units can be manufactured on one LSI chip.

Chip Sets Application Specific Integrated Circuits (ASIC). Developed for use in communication products such as desktop video, home satellite entertainment video, and codecs, these chips pave the way for broader use of digital audio/visual and video compression techniques according to the CCITT H.261 compression standard.

CLV Constant linear velocity. A CLV or extended-play videodisc maintains a consistent length for each frame, thus enabling longer playing time per side, but sacrificing individual frame access in most players. Reference to locations on CLV discs is limited to time in minutes and seconds.

Coaxial Cable (COAX) A cable with a central metallic inner conductor surrounded by an insulating spacer which is wrapped by a second metallic conductor or shield which, in turn, is protected by an insulating covering. Coaxial cables are specifically suited to transmission of broadband services.

Coder-Decoder (CODEC) A coder-decoder (analog-to-digital and digital-to-analog converter) is used to convert analog signals, such as television, to digital form for transmission and back again to the original analog form for viewing.

Communications Satellite An earth satellite designed to act as a telecommunications radio frequency relay that is positioned in geosynchronous orbit 22,300 miles above the equator so that it appears from earth to be stationary in space.

Compressed Video Reduces the bandwidth necessary to transmit video images over communications channels. Often only the changes between successive video frames are transmitted. Also termed data compression, bandwidth compression, and bit-rate reduction.

Conference Call A call or circuit established among three or more stations in such a manner that each of the stations is able to carry on two way communication with each of the others.

Courseware Instructional materials in a complete mediated format. May refer to a single instructional component, such as a computer assisted instruction program, or a multiple instructional entity, such as guidebooks, videodiscs and computer assisted instruction.

CRTC The Canadian Radio-television and Telecommunications Commission, an independent agency of the Canadian government charged with the regulation of telecommunications originating in Canada (i.e. radio, television, telephone, and satellite).

Cybernetics (1) The science of communications and control in animals and machines. (2) A theory of communications and control which accounts for the operation of systems in terms of feedback effects.

Data Basic elements of information, i.e. numbers, letters, symbols, which are processed or produced by humans, computers, or machines.

Data Communications Transfer of encoded information by means of electromagnetic or electro-optical transmission systems.

Data Compression Reduces the amount of computer memory space or transmission capacity required to store or transmit a given amount of data.

Database An information storage file organized according to specific rules to enable required information to be retrieved for use or modification.

Decibel (dB, db) (1) A standard unit of measure of signal power expressing a ratio of input intensity, power or voltage relative to the output, commonly a ratio of change of sound intensity. One decibel equals one tenth of a bel. (2) Amount of signal power dissipated during transmission.

Declination Angle between antenna beam and equatorial plane (measured in meridian plane). The offset angle of an antenna from its polar mount axis.

Delay The time period for the signal to travel from the transmitter to the satellite and back to the receive station.

Demodulation The process of recovering information from a modulated signal or carrier.

Deregulation Elimination of regulatory constraints which govern telecommunication service providers. The provider is subject to the dictates of the market place with no monopoly guarantee of client base or profits.

Descrambler An electronic device that decodes encrypted satellite/telephone/facsimile signals.

Digital A method of processing, transmitting, and storing data which operates in discrete electronic or optical steps as contrasted to a continuous or analog method. Digital computers manipulate numbers encoded into binary digits (bits), or on-off pulses, while analog computers sum continuously varying forms. Digital communications/switching is the transmission of information using discontinuous, discrete sequences of electrical or electromagnetic signals which change in frequency, polarity, or amplitude to represent or encode information. Analog information such as audio/video signals may be encoded for transmission on digital communication systems.

Digital Speech Interpolation (DSI) A method by which speech can be digitized (converted to bits) so that no bits are transmitted when a speaker pauses; as soon as speech begins, bits flow again. Bandwidth for transmission is reduced using digital versus analog transmission.

Digital Switching System A digital telephone switching system which provides special services such as call identification, speed dialing, call transfer and three-way dialing.

Digital Video Video signals which have been encoded as a series of binary digits. In this format they can be accessed and manipulated in a computer program.

Direct Broadcast Satellite (DBS) A satellite system designed with sufficient power to transmit signals directly from orbit to small, inexpensive earth stations for direct residential or community reception. This eliminates the need for a local cable loop by allowing use of receiving dishes with a diameter of a metre or less mounted directly on buildings.

Direct Distance Dialing (DDD) Capability of the public switched telephone network (PSTN) to automatically route long distance calls. A telephone service which enables a subscriber to call outside his or her local area code without operator assistance.

Dish A bowl-shaped parabolic receive/transmit antenna used to transmit/receive satellite signals.

Distance Education Defined as a means by which instruction is offered to learners who are often geographically separated from the provider and often is described in terms of the technological devices used. (see distance learning and open learning)

Distance Learning Closely related to "open learning", describes, from the learner standpoint, a concept of how learning will be developed and arranged to best suit the needs of the student.

Down-Link A television receive only (TVRO) earth station. All of the components of a communication satellite receive system or earth station used to receive and process information delivered by satellite.

Downconverter A device which detects, selects and reduces the high radio frequency (RF) of a received satellite/microwave signal and converts it to a lower intermediate frequency (IF) for tuning and demodulation.

Driver The portion of a computer program that controls peripherals, such as videodisc players, CD-ROM players and disc drives.

Drop The wire or optic cable connecting telephone loops or coaxial cable systems to a subscriber's premises.

Dual Band Used to denote equipment and antenna capable of using both C-band and Ku-band signals.

Dumb Terminal A computer access terminal with no independent processing capability of its own. It contrasts with a smart or intelligent terminal which can carry out some operations independent of a computer.

Duplex Used to describe simultaneous transmission in both directions (full-duplex), as opposed to half-duplex, which is alternate transmission in each direction. Simplex transmission refers to transmission in only one direction.

DVI Digital video interactive. A format for placing digital video on a compact disc. Compressed files can provide up to 72 minutes of full motion video.

Earth Station Electronic equipment on the ground that is used to transmit and/or receive radio signals to/from an orbiting satellite.

Echo Suppression A process used in long distance telephone and audio teleconference systems to help prevent reflection of wave energy (echo) back to the transmitter.

Electromagnetic Radiation A form of energy including AC electric power, radio and light, which propagates through space in the form of oscillating electric and magnetic fields or "waves" at the speed of approximately 186,000 miles per second.

Electronic Mail (E-mail) A general term referring to the electronic transmission, distribution, and delivery of messages. E-mail is characterized by storage of a message at an "electronic address" which can be retrieved by the recipient via a personal computer equipped with a modem. E-mail is distinguished from most areas of telecommunications by its capability for asynchronous (non-real time) transmission and reception of messages. Facsimile (FAX) transmission of messages operates in a similar manner, but messages are received directly rather than stored in a "host" computer until requested.

Electronic Office (Cottage) A term used to describe an office/study area, often located in the home, which makes full use of information communication technology for telework or telelearning.

Elevation Angle The vertical angle measured from the horizon up to a targeted satellite. Used to aim satellite dishes, designating the angle above the horizontal plane to which an antenna must be raised in order to direct it toward the satellite.

Encryption Conversion of a telecommunications signal to an encoded form in order to make transmission more secure. Receiver must have a decoder to receive transmissions. In pay television or video conferencing networks decoders may be addressed from the transmission site to permit reception by the subscriber.

Ethernet A local area network (LAN) technology which provides relatively high speed data communication between computers and computer terminals over coaxial or shielded telephone cables.

Facsimile (FAX) A system used to transmit text and graphics over telecommunications channels. The original image is scanned at the transmitter, reconstructed at the receiving end, and duplicated on paper or stored on a personal computer. Facsimile transmission rates vary from the earlier Group I and II machines, requiring five minutes and two minutes respectively to transmit a page, to digital Group III machines at 20-40 seconds per page and Group IV machines, which transmit a page in three to six seconds.

FAX see Facsimile

Federal Communication Commission (FCC) An independent agency of the United States government charged with the regulation of telecommunications that originates in the US (i.e. radio, television, telephone, and satellite).

Fiber Optic Cable Thin filaments of glass or other transparent materials through which coded light pulses representing data, image and sound can be transmitted for long distances by means of multiple internal reflections. Fiber optic transmission is characterized by extremely high transmission speeds and bandwidth.

Field A scan of 262 lines on the screen at 1/60 second, constituting half of a complete video frame (see Frame).

Flowchart A diagram which illustrates the paths a user can follow through an instructional treatment.

Foot Print The geographic area on the earth's surface which is effectively served by a particular communications satellite signal or beam. Satellites and their transponders can be articulated from the ground to provide specific area footprints.

Four-Wire Transmission Permits two-way transmission of data using separate two wire pathways, one pair for transmit and one pair for receive. Early teleconference and audiographic conferencing used the four wire system. Advances in technology allow these forms of conferencing to be accomplished on two conductors with the same two-way effect.

Fractional T1 A telephone network service that permits users to purchase 56 Kbps or 64 Kbps increments of a T1 (1.544 Mbps) bandwidth.

Frame Number The number associated with each frame on a CAV format videodisc recorded in the vertical blanking interval.

Frame Two complete scans of the video screen at 1/30 second. A frame is composed of two fields (each 262 lines). A single frame is a standard CAV videodisc reference point. There can be as many as 54,000 addressable frames on one side of a CAV videodisc.

Frame-Grabber A "freeze frame" device that can seize and record a single frame of video information out of a sequence of many frames. Used in video, graphic, and video-phone applications to manipulate, transmit, and store single television frames. Digital technology makes possible a variety of visual manipulations of the "grabbed" frame.

Freeze Frame A single frame from a motion sequence that is stopped.

Frequency Modulation (FM) One of three ways of modifying a sine wave signal (carrier) to enable it to carry information. The carrier frequency is modified in accordance with the information to be transmitted.

Frequency Spectrum Describes a range of frequencies within the electromagnetic wave spectrum. In terms of radio frequencies the useful range extends from about 10 kilohertz to 3000 gigahertz.

Frequency The number of complete cycles of any given electromagnetic wave per unit of time. Electrical frequency is expressed in hertz, which is equivalent to cycles per second.

Full Duplex (FDX) Refers to a communications system or equipment capable of transmission simultaneously in two directions.

Full Frame Time Code Also known as non-drop frame time code. A SMPTE standard for addressing the time code for a videotape which preserves accurate frame counts instead of matching frames to real time. Edit masters for videodisc production must use this format.

Gain The signal amplification capability of a device expressed as a ratio of output power to input power, usually measured in decibels (dB).

Genlock Synchronization generator lock. A method of matching the timing of a video system and outside signals, so as to overlay or combine one signal with another.

Geostationary Satellite (Geo-Synchronous Satellite) A satellite which orbits the earth 22,300 miles above the equator at the same speed as the earth's rotation about its axis. The satellite appears stationary when viewed from the surface of the earth.

GHz see Gigahertz

Giga A prefix signifying one billion.

Gigabyte Unit of measurement of memory storage capacity in a computer system. One gigabyte consists of 1,024 megabytes, or roughly a billion printed characters.

Gigahertz (GHz) Unit of frequency equal to a billion cycles per second (1000 MHz).

Half-Duplex Circuit (HD OR HDX) A circuit designed for transmission in either direction but not both directions simultaneously.

Handshaking Exchange of alerting signals for control when a connection is established between two modems or other devices.

Hard Disc A magnetic or optical disc used for bulk storage of computer data. Hard discs have a far greater storage capacity than floppy discs and may be internal or external to the computer.

Hardware The physical equipment components used in telecommunications and computer systems. (contrasts with software)

Head End The source or originating area of signals for a cable television system. Signals which emanate from the head end may come from broadcast transmissions, videotape, radio, other cable systems, or be originated in studios distant from or at the head end.

Hertz (Hz) The frequency of an electric or electromagnetic wave in cycles per second. After Heinrich Hertz, who first demonstrated such waves in the 1880s.

High Frequency (HF) Short wave radio bands between 3 and 30 MHz used in long distance communication.

High Sierra A name for a popular data format for CD-ROM.

Hybrid Combination of two or more processes, technologies, or devices to accomplish a task, such as using cable and satellite transmission to reach the entire population of an area.

Hypermedia (1) An approach to information storage and retrieval which provides multiple linkages among elements. In IMI, it allows the learner to navigate easily from one piece of information to another. (2) Software designed to permit interactive manipulation of all aspects of communication media such as video, audio, text, graphics, animation, digital effects etc.

Hz see Hertz

Icon A graphic which identifies a function to be performed by a computer program. For example, an icon of a garbage can is used for disposing of unwanted files on a Macintosh computer.

Informatics A term that describes the study of information and its handling, especially by means of new information technologies.

Integrated Circuit A single substrate (chip) containing a complete electronic circuit consisting of transistors, diodes, capacitors, resistors, connectors, etc.

Integrated Services Digital Network (ISDN) A telecommunications network which is capable of accepting all types of information (i.e. voice, data, facsimile, full motion video, videotext) in a common digital code and transmitting it as if it were one signal. Provides end-to-end digital connectivity for simultaneous transmission of all types of information according to accepted international standards. Often referred to as a "universal network" able to support any device for transfer of information.

Intelligent Terminal A terminal that is programmable and has the capability to process messages to and from the host computer.

Interaction A reciprocal dialogue between the user and the system.

Interactive Communications The situation where the user inputs data and waits for a response from the far end before making the next input. In voice or video, interactive implies a conversational mode of communications between users at either end.

Interface A boundary or connection between two pieces of equipment across which all the signals which pass are carefully defined. The definition includes the connector, signal levels, impedance, timing, sequence of operation, and the meaning of signals.

Intermediate Frequency (IF) A frequency used in superheterodyne receiver/transmitters to facilitate filtering, distribution, modulation and demodulation of the signal.

International Telecommunications Union (ITU) A United Nations treaty organization for the purpose of accrediting international telecommunications standards.

ISDN see Integrated Services Digital Network

ISO-9660 The most commonly used format for recording data on CD-ROM discs.

ITFS Instructional Television Fixed Service. Microwave-based very high frequency television over-the-air broadcast method used primarily by education. Receive sites must have a converter to change signals to those used by the television receiver. Capable of full motion video and one way audio over distances up to 35 kilometers.

ITU see International Telecommunications Union

KHz see Kilohertz

Kilo A prefix signifying one thousand.

Kilohertz A thousand cycles per second.

Ku-band The band of microwave uplink frequencies from 12 to 18 GHz. Band of satellite communications frequencies from 11.7 to 12.2 GHz. Ku-band transmission requires a one-metre satellite receiving dish, whereas a C-band dish spans a minimum of three metres.

LAN see Local Area Network

Landline A circuit path, wire or cable, which connects two locations by land.

Laser Light amplification by stimulated emission of radiation. Lasers produce focussed beams of light which are used to read optical data on videodiscs and CD-ROM, and are used for a wide range of communications activities, including as a light source for fibre optic transmission.

Leased Line A permanent and exclusive communication path for a telecommunications subscriber. A leased line is separate from the public portion of the switched telephone network.

Level I A level of interactivity in which the user can control a videodisc player with the keypad, but has no other method of influencing the order of presentation.

Level II A videodisc presentation controlled by a digital program permanently recorded on the disc.

Level III A videodisc presentation controlled externally, often by a computer. The computer controls the presentation, and the videodisc player acts as a peripheral device.

Level of Interactivity The potential for interaction prescribed by the capabilities of videodisc hardware and external intervention. Conventionally, Levels I, II, and III are labels used.

Local Area Network (LAN) Private data communications network formed by linking together computers, word processors and telecommunications equipment in a specific building or geographic area. A LAN is then often linked to external networks such as public telephone and data transmission networks.

Loop 1) The wire pair which extends from a telephone central office to a subscriber's telephone. 2) the coaxial/optical cable in a broadband or cable TV system which passes by each building or residence on a street and connects with the trunk cable at a neighborhood node.

Low Earth Orbiting Satellite (LEO) A satellite placed in orbit a few hundred to roughly two thousand kilometres above the earth. Unlike a geostationary satellite, it does not remain fixed over one point on the surface, so a number of LEO satellites are required to provide continuous coverage.

Low Noise Amplifier (LNA) A device which amplifies the received signal at an earth station. The LNA is usually mounted on the antenna.

Mega- A prefix signifying one million.

Megahertz (MHz) One million cycles per second.

Memory A computer storage device into which information can be introduced and stored for later access by a computer.

Menu A sequence or list of choices presented to the user in a program.

Micro- A prefix denoting one millionth; sometimes used simply to denote smallness.

Microchip Electronic circuit with multiple solid-state devices engraved through photolithographic or microbeam processes on one substrate. (see microcomputer and microprocessor)

Microprocessor A microchip which performs the logic functions of a digital computer.

Microwave Very short electromagnetic waves having wavelengths from approximately thirty centimetres down to approximately one centimetre, corresponding to frequencies from 1 GHz to 30 GHz.

MODEM see Modulator-demodulator

Modulation A process of modifying the characteristics of a propagating signal (the carrier) so that it represents the instantaneous changes of another signal. The carrier wave can change its amplitude (see AM), its frequency (see FM), its phase (see phase modulation), or its duration (see pulse code modulation), or combinations of these.
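
As a concrete illustration of one of these options (amplitude modulation only; the symbols are generic and not taken from this report), a carrier of amplitude A and frequency f_c can be made to carry a message signal m(t) as

    s(t) = A\,[1 + m(t)]\cos(2\pi f_c t), \qquad |m(t)| \le 1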

Modulator A device which converts a signal (voice or other) into a form that can be transmitted.

Modulator-Demodulator (MODEM) A device which converts digital pulses to analog tones, and vice versa, to allow transmission of digital computer data over analog telephone circuits. Any device which modulates signals for transmission over a telecommunications network.

Mount The supporting structure of an earth station antenna. Usually a polar mount, an azimuth/elevation (AZ/EL) mount, or a wheel and track mount.

Mouse A computer interface with one or more buttons mounted on a track ball. It is rolled to position a cursor on the screen, and the button is "clicked" to initiate an action.

Multimedia An instructional program which includes a variety of integrated sources in the instruction. The program is intentionally designed in segments, and viewer responses to structured opportunities (e.g., menus, problems, simulated crises, questions, virtual environments) influence the sequence, size, content, and shape of the program.

Multiplexer A device which enables more than one signal to be sent simultaneously over one physical circuit.

Multiplexing An electronic or optical process of combining two or more signals from separate sources into a single signal for sending on a transmission system from which the original signals may be recovered exactly.

MUX Abbreviation of the term multiplexer.

Nano- A prefix denoting one thousand millionth.

Narrowband Communication A communication system capable of carrying only voice, slow speed computer signals, slow scan video, or facsimile at speeds less than 1.544 Mbps.

Network 1) A series of points connected by communication channels in different geographic locations. 2) The switched telephone network is the network of telephone lines normally used for dialed telephone calls. 3) A private network is a network of communications channels confined to the use of one customer.

Node In a topological description of a network, a node is a point of junction or interconnection of the links; typically a switching center where communications equipment attaches to the network.

Noise Any random interference in a communication system which degrades the clarity of the signal.

Non-Drop Frame Time Code (See full-frame time code)

NTSC National Television Standards Committee. The North American television standard adopted by the United States, Canada, Mexico, and Japan, which uses a 525-line, 30 frame-per-second arrangement to display television images.

Open Learning Open learning embraces all modes for delivery of learning materials to learners who are at a distance. Conceptually the student or learner is at the center of open learning, and educational providers endeavour to deliver, on a cooperative/collaborative basis, learning materials configured to the needs of the learners. Open learning removes barriers to learning that face students who must, for whatever reason, pursue their educational interests distant from a traditional institution. Open learning is learner centered and attempts to extend the learner's existing knowledge, experience, and skills. Open learning is premised on the philosophy that learning is life-long, and it is characterized by the flexible provision of learning opportunities.

Operating System The programmed functions of a computer which enable it to run various programs and control scheduling, printers, terminals, memories etc.

Optical Character Recognition (OCR) A technique in which information recorded on hard copy is examined by an optical scanner that converts the scanned information into digital form.

Optical Disc A videodisc that uses a light beam to read information from the surface of the disc.

Optical Fiber A thin, flexible glass fiber the size of a human hair which will transmit light waves capable of carrying vast amounts of information.

Optical Reflective Videodisc Method by which the laser beam reads data encoded on an optical videodisc. In the case of a reflective disc, the beam is focused on information just below the surface of the disc, and reflected onto a photosensitive pick-up device.

Optical Transmission The use of the visible part of the electromagnetic spectrum for communication. The two major methods are non-coherent transmission, which is normally used over short distances such as a few hundred meters and includes LED transmission, and coherent transmission, in which a laser provides transmission of information over a larger bandwidth and over long distances.

Orbit The path of an earth satellite as it circles the earth.

Packet A standardized group of binary digits including data and call control signals that can be switched as a composite whole. The data, call control signals, and error control information are arranged in a specific format.

Packet Switching A technique of switching digital signals with computers whereby the signal stream is broken into small packets and reassembled in the correct sequence at the destination. There are many variations used in data networks, in satellite communications and for secure voice communications.
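
The following is a minimal sketch, in Python, of the idea described above: a message is broken into numbered packets which may arrive out of order and are reassembled in the correct sequence at the destination. The function names and packet size are illustrative assumptions, not part of any particular protocol.

    import random

    def packetize(message, size=4):
        # Break the message into (sequence number, data) packets of 'size' characters.
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # Restore the original order using the sequence numbers.
        return "".join(data for _, data in sorted(packets))

    packets = packetize("Interactive media for Saskatchewan schools")
    random.shuffle(packets)   # packets may take different routes and arrive out of order
    assert reassemble(packets) == "Interactive media for Saskatchewan schools"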

PAL Phase Alternation by Line. Television standard used in Western Europe, Australia, New Zealand, China, India, Argentina, Africa and Brazil. Uses a 625-line, 25 frame-per-second configuration to display television images.

Phase Modulation One of three ways of modifying a sine wave signal to make it carry information. The sine wave or "carrier" has its phase changed in accordance with the information to be transmitted.

Picture Stop An instruction encoded in the vertical interval on the videodisc to stop the videodisc player on a predetermined frame.

Pixel Picture elements that combine to construct images and characters on a video display unit.

Point-to-Point A connection permanently established between two specific stations. Point-to-Multi-point establishes connection from one originating site to many receive sites. Multi-point establishes signals between all connected sites so that each site can originate and receive.

Polarization The property by which an electromagnetic wave exhibits a direction (or rotation sense) of vibration, giving the opportunity for frequency re-use by orthogonal polarizations.

POTS Plain old telephone service. An acronym used by the telephone industry for conventional public switched telephone service.

Protocol Strict procedure required to initiate and maintain communication. Protocols can exist at many levels in one network such as link-by-link, end-to-end and subscriber-to-switch.

Pulse A brief change of current or voltage produced in a circuit to operate a switch or relay or which can be directed by a logic circuit.

Pulse-Amplitude Modulation (PAM) Amplitude modulation of a pulse carrier.

Pulse Code Dialing Signal emanating from a rotary telephone dial to designate a digit of a telephone number. Some push-button phones are designed to create a pulse code which will access older rotary networks as well as the computer-like code of a push-button touchtone telephone. Rotary pulse code telephones cannot access the services of digital touch tone networks.

Pulse-Code Modulation (PCM) A digital telecommunications technique whereby a signal is sampled periodically, encoded into numbers, and then transmitted as discrete binary pulses.

Queue A collection of items, such as telephone calls or printer instructions, which can be thought of as arranged in sequence, the two ends being the head and the tail. New items are added at the tail; items can be removed from either the head or the tail.
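
As a minimal illustration of the head and tail behaviour described above, the sketch below uses Python's standard collections.deque; the job names are hypothetical.

    from collections import deque

    # A queue of printer instructions: new items join the tail,
    # and items are normally removed from the head (first in, first out).
    print_queue = deque()
    print_queue.append("job-1")    # added at the tail
    print_queue.append("job-2")
    print_queue.append("job-3")

    first = print_queue.popleft()  # removed from the head -> "job-1"
    last = print_queue.pop()       # a deque also allows removal from the tail -> "job-3"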

Radio Communication over a distance by converting sounds into electromagnetic waves and radiating them through space. In Webster's dictionary: the wireless transmission and reception of electrical impulses or signals by means of electromagnetic waves.

Radio Interference Unwanted disturbance of radio reception. Interference may be caused by electrical devices, competing radio transmitters, or by natural phenomena such as lightning.

Radio Spectrum The total band of frequencies suitable for radio communication. The FCC classifies the spectrum into seven bands.

Read Only Memory (ROM) A memory which enables the user to read the information stored but not to change it. Videodiscs are read-only devices.

Real Time 1) Pertaining to actual time during which a physical process transpires. 2) Pertaining to an application in which response to input is fast enough to affect subsequent input, as when conducting the dialogues that take place at terminals in interactive systems.

Redundant Refers to a complement of "stand-by" equipment which can be switched on either manually or automatically when a unit in use fails.

Reflector The antenna dish which collects and focuses electromagnetic energy onto either a secondary reflector (Cassegrain) or a feed (Prime Focus).

Relay An electronically operated switch.

Repeater 1) A device that receives signals over one circuit and transmits them to another circuit, usually in an amplified or modulated form. 2) A device used to restore weak or distorted signals to their original intensity or shape.

RF Radio Frequency

RF Modulator A device which modulates a radio frequency carrier signal. It is commonly used to convert the output signal from a microcomputer into a form which can be displayed on a television screen.

RGB Video signals which use separate red, green and blue signals to compose the picture.

Ring Network A network which permits terminals to communicate with one another without having to go via a central computer. (See local area network.)

Robotics The use of artificial intelligence and cybernetic techniques, as programmed on microprocessors and microcomputers, to operate mechanical sensing and guidance mechanisms - robots - in manufacturing and assembly processes.

Routing The assignment of the communications path by which a message or telephone call will reach its destination.

RS-232C A serial interface used to connect computers to peripheral devices.

S/N see Signal-to-noise Ratio

Scan A player option which allows the user to quickly cover the surface of the disc with the video displayed.

Scramblers The devices and methods used to alter signals in order to achieve a degree of transmission privacy or security.

Search To rapidly access a single frame or a sequence of frames on a disc without video displayed.

SECAM Systeme Electronique pour Couleur Avec Memoire. Television standard used in France, Eastern Europe and USSR. Uses a 625 line, 25 frames-per-second configuration to display video images.

Sideband A frequency band on each side of a carrier frequency of an amplitude modulated wave. Each sideband carries all the data in the modulating wave.

Signal An electromagnetic wave used to convey intelligence over a communication system. The event of data conveyance.

Signal-To-Noise Ratio The relative strength of a picture or audio signal to its residual background information. The higher the signal-to-noise ratio, the better the picture or sound.
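
The ratio is commonly expressed in decibels; as a worked example (the figures are illustrative and not taken from this report), a signal 1,000 times stronger than the background noise gives

    \mathrm{SNR}_{\text{dB}} = 10\log_{10}\!\left(\frac{P_{\text{signal}}}{P_{\text{noise}}}\right), \qquad 10\log_{10}(1000) = 30\ \text{dB}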

Simplex A communication facility with transmission capability in only one direction.

Single Sideband (SSB) A type of amplitude modulation in which only one of the two radio sidebands is transmitted. An SSB signal requires less power for transmission and has a better signal-to-noise ratio than an AM signal.

Slow Motion In videodisc technology, the controlled movement of the laser from frame to frame at an apparent rate of less than 30 frames per second. Achieved by rescanning each frame X number of times before moving to the next frame.
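
As a worked example (the value of X is illustrative), rescanning each frame X times reduces the apparent rate to 30/X frames per second:

    \text{apparent rate} = \frac{30}{X}\ \text{frames per second}; \qquad X = 3 \Rightarrow 10\ \text{frames per second}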

Slow Scan A technique for transmission of video signals or still pictures on narrowband circuits such as telephone lines. At the transmit end an image is selected as one frame of video, or scanned to create one video frame, which is then transmitted, line-by-line, at a rate compatible with the bandwidth capability of the transmission line. The image is stored at the receive end until it has been completely transmitted and is then displayed, replacing the previously sent image. The result is a still video picture which changes every few seconds.

SMPTE Code Also known as nondrop or full frame time code, this is a standardized method of addressing a video tape. It was developed by the Society of Motion Picture and Television Engineers, and gives an accurate location of each video frame on a video tape.

Software (1) The written instructions which direct a computer program. (2) Any written material or script for use on a communications system, or the program produced from the script. (3) Resources and materials such as audio tapes, print materials, transparencies, videodiscs, CD-ROM, and compact discs (see hardware).

Sound Synthesis The artificial production of speech and sound effects.

Speakerphone Telephone instrument which has a speaker-microphone unit that allows for hands-free conversation.

Spectrum A continuous range of frequencies (waves) within the electromagnetic spectrum that have some specific common characteristics such as radio waves and light waves.

Step To move one frame forward or reverse on a videodisc.

Still Frame Still material, including photographs, line drawings, pages and graphics, designed and presented as a single videodisc frame.

Store and Forward A process in communication systems in which information is received at intermediate routing points and recorded (stored) and then retransmitted to a further routing point or to the ultimate recipient.

Switching Centre (Office) A location which terminates multiple circuits and is capable of interconnecting circuits or transferring traffic between circuits.

Symmetric Digital Video System A video system which can store and play back compressed digital pictures.

Synchronous (1) Having a constant time interval between successive bits, characters, or events. (2) Processes carried out in real time such as computer conferencing which connects users and their computers directly rather than connection to a host computer which, when queried, relays messages at a later time period (asynchronous).

T-Carrier A hierarchy of digital systems designed to carry speech and other signals in digital form; designated T1, T2, T3, and T4.

Tariff The published rate for a service, equipment, or facility, as established by a communications common carrier.

Tele- A Greek prefix meaning "distant."

Telecommunications The art and science of communicating at a distance. (Plural noun)

Teleconferencing Bringing together by electronic means (audio, audiographics, video, computer) three or more people in two or more locations to share a common discussion. Audio teleconferencing permits different individuals in the conference to speak to one another. Video teleconferencing can be one-way video with two-way audio or fully interactive with two-way video and two-way audio. Computer teleconferencing connects individual computers to a host computer for asynchronous conferencing (not in real time) or synchronous conferencing which connects computers and users to each other in real time.

Telephony The art and science of sound transmission over a distance by changing sounds into electrical signals for transmission through communications equipment.

Teleprocessing A method used by data processing systems to transfer intelligence through communications facilities.

Teletext The generic name for a set of systems which transmits alphanumeric and simple graphic information over the broadcast (or one-way cable) signal, using spare line capacity in the video signal (usually the vertical interval) for display on a suitably modified TV receiver.

Telewriter Used with a microcomputer over telephone lines to send and receive signals and produce hard copy from them. Used to enhance audio teleconferencing as a desk top annotation device. As each participant writes on the tablet or types on the computer keyboard, the results are displayed on all of the screens in the network simultaneously and instantaneously.

Terminal 1) A point at which information can enter or leave a communication network. 2) Any device capable of sending and/or receiving information over a communication channel.

Time Division Multiple Access (TDMA) A satellite communications system wherein access to the transmission capacity of a satellite is shared between a number of cooperating earth stations on the basis of time. In turn, each earth station is given access to the satellite for a short period to pass its information to the other earth stations.

Time Division Multiplex (TDM) A means of transmitting a number of channels over a single circuit path by dividing the circuit into a number of time slots and assigning each channel its own intermittently repeated time slot. At the receiving end, each time-separated channel is reassembled.
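
The following is a minimal sketch, in Python, of the time-slot idea described above: three channels share one line by taking turns, and the receiving end reassembles each channel from its repeated slot. The channel contents are illustrative assumptions.

    # Three channels, each a stream of samples of equal length.
    channels = [["a1", "a2", "a3"],
                ["b1", "b2", "b3"],
                ["c1", "c2", "c3"]]

    # Multiplex: interleave one sample per channel into repeating time slots.
    line = [sample for frame in zip(*channels) for sample in frame]
    # line == ["a1", "b1", "c1", "a2", "b2", "c2", "a3", "b3", "c3"]

    # Demultiplex: each channel reclaims every third time slot at the receiving end.
    received = [line[i::len(channels)] for i in range(len(channels))]
    assert received == channels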

Time Sharing A function enabled by computer software which permits the multiple use of its CPU and peripheral devices by several users or multiple programs. E-mail or printing in the background while using the computer are two examples of time sharing.

Toll Call A call outside the local exchange area which is charged at toll rates.

Toll Centre A basic toll switching facility; a central office where channels and toll message circuits terminate.

Touch Screen An interface usually attached to the front of a display screen which can interpret the intrusion or pressure of an object introduced to the screen.

Touchtone AT&T term for push button dialing.

Transceiver A device which combines the capability of both a transmitter and a receiver.

Transmission Sending electronic/optical signals through the air or over wire/optic cable.

Transparent If a signal passes through equipment, a network or a facility unchanged, that network or facility is said to be transparent to it.

Transponder A microwave repeater (receiver and transmitter) on the satellite used to receive and amplify the signal and translate to a different frequency for retransmission to earth. Domestic satellites usually employ 12 or 24 transponders, which typically have a 36 megahertz (MHz) bandwidth.

Trunk A main cable running from the head end to a local node which then connects to the drop running to a home in a cable television system; a main circuit connecting local central offices with regional or intercity switches in telephone systems.

Turnkey System A complete communications system, with hardware and software, assembled, installed, and tested by a vendor and sold as a total package.

Tutor/Proctor Teachers, educational supervisors, or subject specialists remote from the distance learning transmission site. Their role includes supervision, imparting knowledge, provision of feedback on student progress, and site facilitation. Tutors generally possess specific educational qualifications in the subject area and carry out site instructional, marking, and grading duties. Proctors often fill a facilitative role, carrying out supervisory or administrative duties at the distant site.

Two Wire Circuit A circuit consisting of two conductors insulated from each other, providing a "go" and "return" channel in the same frequency.

Ultra High Frequency (UHF) The Ultra High Frequency part of the radio spectrum (300MHz-3GHz).

Upconverter A modulator which increases intermediate frequency (IF) to radio frequency (RF).

Uplink The communications link from the transmitting earth station to the satellite. Consists of a large directional antenna and high power RF transmitters.

Value Added Common Carrier A company which sells the services of a value-added network. Such a network is built using the communications offerings of traditional common carriers, connected to computers which permit new types of telecommunication tariffs to be offered.

VDT Visual (video) display terminal or video display unit. A device which visually displays information on a cathode-ray tube.

Vertical Blanking Interval Twenty-one blanked lines during fields one and two on the videodisc, where frame numbers, picture stops, chapter stops, closed captions, etc. are encoded.

Very High Frequency (VHF) The Very High Frequency part of the radio frequency spectrum (30-300MHz).

Very Large Scale Integration (VLSI) Single integrated circuits which contain more than 100,000 logic gates on one microchip.

Video Signal A signal comprised of frequencies normally required to transmit pictorial information (1 to 6 MHz)

Videodisc A plastic disc that contains video and audio information and is designed for playback on a television screen. Optical video discs are based on a system in which the tracks on the disc are monitored by an optical laser.

Videotex The generic name for a set of systems which transmits alphanumeric and simple graphical information over the ordinary telephone line for display on a suitably modified TV set at the request of a user equipped with a numeric keypad.

Voice Mail Spoken messages are recorded in digital code and stored in computer memory until they can be retrieved by the intended recipient.

Voice-Frequency, Telephone Frequency Any frequency within that part of the audio-frequency range essential for the transmission of speech of commercial quality, i.e. 300-3000Hz.

Voice-Grade Channel A communications channel capable of transmitting voice, digital, or analog data with generally a bandwidth of 3,000 Hz.

VSAT Very Small Aperture Terminal. Small, low-cost satellite earth station using a dish of 1.2-1.8 meters in diameter. Used in data communications networking and wide area networks.

WARC see World Administration Radio Conference

Wavelength The distance between successive peaks of a sinusoidal electromagnetic wave; it is equal to the speed of light divided by the frequency in hertz (Hz). For example, a wavelength of approximately 41 meters corresponds to a frequency of 7.3 megahertz.
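
The figure quoted above follows directly from the definition:

    \lambda = \frac{c}{f} = \frac{3 \times 10^{8}\ \text{m/s}}{7.3 \times 10^{6}\ \text{Hz}} \approx 41\ \text{m}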

Wide Area Telephone Service (WATS) A telephone service which enables subscribers to call a wide area for a flat monthly service rate.

Wide Band A high speed transmission channel with a bandwidth wider than that of a voice-grade channel.

Window A portion of a display devoted to a single source of material on a multimedia display screen. A window may occupy a full screen, or it may share the screen display with other windows of information.

Workstation A work area accommodating an array of electronic equipment. A workstation will usually include a personal computer, monitor, keyboard, peripherals, and network connections.

World Administration Radio Conference (WARC) International meetings held regularly which are concerned with the allocation of frequencies. The ITU is the enacting body for any allocations agreed to at these meetings.

Wraparound Activities carried out before, during, or after a video or teleconference program.

Zenith Pointing an antenna at an elevation of 90 degrees, the straight-up position.
