questions. Some of these questions may require sophisticated statistics to answer, but many will not. Begin by listing as many factors as possible that may influence the effective use of the technology. Then, in everyday English, write a question (or several questions) for each that, when answered, will provide some insight into that particular piece of the puzzle. Once the questions are written, the steps necessary to answer them are often intuitive. Many times a sufficiently detailed and useful answer to a question can be found by simply asking it of the right person. Other times, simple data collection techniques, such as student surveys or automated time-on-task tracking, can be built into the class activities. Typically, some of the most useful information for decision makers will not require sophisticated analysis. Taken as a whole, however, even an informal set of evaluation questions can provide an objective perspective on what is working well and what is not for a particular CAI application. Some samples of possible evaluation questions are listed in the following sections.

Evaluating the Implementation

* Were the training sessions beneficial for faculty? Was there sufficient, or too much, detail in the orientation?
* Were the teachers given enough preparation to adequately handle computer-related problems?
* Were the computers adequate for the software?
* What kinds of unanticipated problems did the classes encounter that hindered their effectiveness?
* Were the labs located conveniently for students?
* Was there sufficient space around the terminals for students to work efficiently?
* Was the noise level in the labs a problem?
* Were there sufficient terminals/printers for student use?

Evaluating Teaching and Learning

* Could the students and teachers using the software be considered computer literate when they began using the product?
* Did students have a computer at home? Did the faculty?
* Were entry-level computer skills a factor in the time it took a student to begin achieving course objectives?
* What kinds of training did the students require to become self-sufficient on the software? How much time did it take for students to become comfortable with the system?
* Were computer skills a factor in the amount of preparation time required of faculty?
* Did faculty using the lab feel it required more or less preparation time than traditional instructional methods?
* Were some computer-based activities weaker or less appropriate than others?
* Did the computer-aided classes require any special supplemental materials or activities in order to meet the learning objectives of the courses?
* Did student attitudes about the method of instruction, their teacher, or their own preparedness change as a result of using the program?
* How did student performance in classes using the software compare with that of students taught with traditional methods?
* Do some levels of students benefit more from using the software than others?
* Did the software benefit the students most in need?
* Was computer-aided instruction worthwhile from the student point of view?
* Did study time outside of class appear to increase in classes using the software?
* Were the students in the computer-aided classes comparable to students in traditional classes in terms of entry-level basic skills? In terms of demographic characteristics?
* Was the system's reporting and tracking sufficient to meet faculty needs? Student needs?
* Did the teacher interact with students as they used the software? Individually, or with the class as a whole?
A broad-based CAI evaluation of the kind described here implicitly recognizes that CAI is both evolving and complex. The outcomes for students may be affected by a variety of constraining factors. Often, the simple act of posing questions like these can stimulate insight leading to creative improvements in a CAI application. And, as in many evaluation processes, the answers obtained to some of these questions will raise additional questions.

The continued application of technology to instruction may change in form, even in substance, but it is not going to go away. Colleges must embrace technology and make it relevant and useful in teaching. The first step in that process is to begin asking the right questions.

Larry Johnson, Associate Director, League for Innovation in the Community College

For further information, contact the author at 26522 La Alameda, Suite 370, Mission Viejo, CA 92691.

Suanne D. Roueche, Editor
September 9, 1994, Vol. XVI, No. 18
© The University of Texas at Austin, 1994
Further duplication is permitted by MEMBER institutions for their own personnel.

INNOVATION ABSTRACTS is a publication of the National Institute for Staff and Organizational Development (NISOD), Department of Educational Administration, College of Education, EDB 348, The University of Texas at Austin, Austin, Texas 78712, (512) 471-7545. Funding in part by the W. K. Kellogg Foundation and the Sid W. Richardson Foundation. Issued weekly when classes are in session during fall and spring terms.