This paper has been submitted for review and consideration for publication. Please do not quote or cite without the permission of the authors until the editing process is complete.


The Nature of Learning in Interactive Technological Environments
A Proposal for a Research Agenda Based on Grounded Theory

Jack Bookman

David Malone

Duke University

This research was funded by National Science Foundation grant # DUE-9752421. The authors would also like to acknowledge the support for this project from the Carnegie Academy for the Scholarship of Teaching and Learning, a program of The Carnegie Foundation for the Advancement of Teaching.


The purpose of this study is to develop, based on observations of students' work, a set of research questions that will help us understand how students learn in a particular technology-rich environment, one using a computer algebra system with lessons delivered via the internet. These questions were not derived a priori from a theoretical perspective but were derived from the data. From the data we identified four categories of research questions: (1) What is the role of the instructor in this environment? (2) What types of behavior and thinking processes are students engaged in as they work together in front of the computer? (3) What is the importance of self-monitoring and metacognition in computer-based instruction? and (4) What opportunities and obstacles are raised by the technology itself? Research in each of these areas has important implications for curriculum developers, math instructors, and students.



The Nature of Learning in Interactive Technological Environments:
A Proposal for a Research Agenda Based on Grounded Theory


It is clear that technology is fundamentally changing the way we live, work, and learn, but it is not clear exactly how. In particular, it is not clear how the internet and sophisticated computer algebra systems are changing, and will change, how we teach and learn mathematics. Smith [1] has succinctly summed up the situation:

Technology is a fact of life for our students -- before, during, and after college. Most students entering college now have experience with a graphing calculator, because calculators are permitted or required on major standardized tests. A large and growing percentage of students have computer experience as well -- at home, in the classroom, or in a school or public library. "Surfin' the 'Net" is a way of life -- whether for good reasons or bad. Many colleges require computer purchase or incorporate it into their tuition structure. Where the computer itself is not required, the student is likely to find use of technology required in a variety of courses. After graduation, it is virtually certain that, whatever the job is, there will be a computer close at hand. And there is no sign that increase in power or decrease in cost will slow down any time in the near future. We know these tools can be used stupidly or intelligently, and intelligent choices often involve knowledge of mathematics, so this technological environment is our business. Since most of our traditional curriculum was assembled in a pre-computer age, we have a responsibility to rethink whether this curriculum still addresses the right issue in the right ways -- and that is exactly what has motivated some reformers.

Nonetheless, the move to integrate technology into teaching has not been without its detractors. Krantz [2] raises some important concerns when he states that:

(1) distance education and products promoted by publishers for profit "describe a dangerous trend"; (2) "Provosts and deans have dollar signs in their eyes. They envision teaching more students with fewer faculty;" and (3) "The important question is whether students are internalizing and retaining the material." But he presents an extreme either/or view of technology, lumping together all uses of technology in the classroom. He asserts, providing no evidence, that "Traditional education ... enables students to master the ideas and retain them for future use," that "traditional methods ... have had -- and continue to have -- great success," and that traditional classrooms produce "interaction of first rate minds." He then claims (again with no evidence) that there is no measurable benefit to employing technology in the classroom. He is not the only mathematician with these concerns. The issue of how, or if, to introduce technology into the classroom is one of the most divisive and emotionally charged issues in education.

In this paper, we propose an agenda for research that will move the discussion of the use of technology in mathematics classes from the coffee lounge and soap box to the seminar room. We will discuss some preliminary results of careful observations of student learning using computer algebra systems with lessons delivered via the internet. Based on these observations, we propose a set of research questions whose answers will help us to understand how best to use these new technologies to improve the teaching and learning of mathematics.


In recent years, consensus among policy boards, such as the National Council of Teachers of Mathematics and National Research Council, may have emerged regarding the essential steps in reforming mathematics and science education [3, 4, 5, 6, 7]. Bailey and Chambers [3] summarized several of these reform reports and concluded that six overarching recommendations have emerged: (1) integrate the teaching of science and math; (2) emphasize cooperative learning; (3) focus on application and relevant problem solving; (4) teach primarily through discovery learning as opposed to lecture; (5) attend to the motivation of learners; and (6) use technology in meaningful ways.

The Connected Curriculum Project (The Coordinated Curriculum Library, Duke University, NSF DUE-9752421, 1998-2001) is an innovative instructional effort that addresses each of these six recommendations. The Connected Curriculum Project (CCP) has developed a collection of learning materials designed to create interactive learning environments for students in the first two years of college mathematics courses [8, 9]. The materials combine the interactivity, accessibility, and connectivity of the Web with the power of computer algebra systems. These materials may be used by groups of students as an integrated part of a course, or by individuals as independent projects or as supplements to classroom discussions. Lawrence Moore and David Smith, who began their collaboration in 1988 at the beginning of the calculus reform movement, lead this project.

The CCP is a direct extension of the experience gained from the calculus reform movement in general and, in particular, Project CALC: Calculus As a Laboratory Course, supported by the NSF Calculus Reform Initiative. The key features of that course are real-world problems, hands-on activities, discovery learning, writing and revision of writing, teamwork, and intelligent use of available tools. The stated goals for the course are that students should: (1)  be able to use mathematics to structure their understanding of and investigate questions in the world around them; (2)  be able to use calculus to formulate problems, to solve problems, and to communicate the solution of problems to others; (3)  be able to use technology as an integral part of this process of formulation, solution, and communication; and (4)  learn to work cooperatively. [10, 11, 12].

A follow-on NSF grant in 1993 (Interactive Modules for Courses Following Calculus, Duke University, NSF DUE-9352889, 1993-97) supported development of modular lab activities for courses beyond calculus: linear algebra, differential equations, and engineering mathematics. These modules were created as interactive texts in specific computer algebra systems. CCP was devised to extend the usefulness of these modules by capitalizing on the interactivity and availability provided by the internet. The CCP modules include hypertext links, Java applets, sophisticated graphics, a computer algebra system, realistic scenarios, and questions that require written answers. The materials used for this study were single-topic units that can be completed in one to two hours with students working in two-person teams in a computer lab environment.

The CCP is based in part on recent research in cognitive psychology on the ways students take in, organize, and represent knowledge internally. An underlying principle from this research is that students cannot simply be given knowledge; they must construct knowledge in their own minds. This emerging theory of learning, known as constructivism, is rooted in the earlier work of cognitive psychologists such as [13, 14, 15].

Learning from the constructivist perspective is seen as a "self-regulated process of resolving inner cognitive conflicts that often become apparent through concrete experience, collaborative discourse, and reflection" [16]. That is, constructivist theorists maintain that the active use of manipulative instructional materials in a socially interactive learning environment is a necessary condition for meaningful learning. The primary role of the teacher is to structure learning situations in which students experience a sense of cognitive disequilibrium. Students take on more responsibility for monitoring and regulating their own thinking and learning. The job of the teacher is to create learning environments that place students in a position of constructing or building meaning and understanding for themselves. Thus, from this viewpoint, learning is viewed as a transformative process involving conceptual change, not merely a process in which students recite back information that they have passively accumulated.

In the traditional math class, the instructor might explain a concept, demonstrate several examples of how to solve problems involving that particular concept, and then ask students individually to work through problem sets. The emphasis is on learning computational procedures. In a cognitively oriented, or constructivist math class, after an introductory discussion led by the teacher, students are given complex and engaging problems which are situated and embedded in a meaningful context. These contextualized problems require students to engage not only in computational procedures, but also in sustained mathematical reasoning. Students might be asked to write about the problem, create graphs and drawings, manipulate objects, and in other ways actively make sense of the problem. Working collaboratively with peers, they pose hypotheses, justify ideas, formulate solutions, and explain their personal understandings in their own words. An emphasis in a constructivist math class is on making sense of mathematical ideas.

Since constructivist approaches to learning attempt to engage students in solving meaningful problems using real world data, the use of technology in math instruction is seen as holding enormous promise. Portela [17] noted that, "Although not new, constructivism has more relevance in education today because the dawn of the Information Age has rapidly increased the amount of, and accessibility to, information." He stated that there is a scarcity of studies of how students learn in this environment and described the results of his case study of a mathematical communication and technology course for math graduate students. He reported that the focus of teaching and learning shifted from knowledge transmission to knowledge building, and he credits the internet with aiding this shift. He also indicated that being connected to the internet in the classroom provided opportunities for more active learning by encouraging students to learn by doing, concentrate on the subject matter rather than simply copying notes from the board, participate in class discussions, work at their own pace, receive individual help from the instructor without holding back the rest of the class, and access related sites "right on the spot."

Cooper [18] stated that, "In its use as an educational medium in a carefully structured learning environment based on the principles of cognitive research, the computer may serve as a strong mechanism for reorganizing mental processes, aiding students in developing the hierarchical structure for their new knowledge." She also noted that although original uses of the computer focused on drill, practice, and individualized learning, use of the computer as an instructional tool is most effective in collaborative learning environments.

Heid et al. [19] reviewed the empirical research on mathematics learning using computer algebra systems (CAS). Reporting only on studies that involved systematic data collection and analysis, they identified 64 studies from journal articles, conference proceedings, and dissertations that addressed CAS. They examined five sets of outcomes -- achievement, affect, behavior, strategies, and understanding -- and concluded that the research justifies incorporating CAS into the established mathematics curriculum. In particular, the researchers noted that, "The majority of studies examined indicate that there is no loss in proficiency in computational skills and these results are obtained in the absence of a CAS on the research instrument. Cumulatively, these studies suggest that use of a CAS in the learning of mathematics ... can result in higher overall achievement in terms of both procedural and conceptual items." They concluded, "CAS research is now ready to enter a new phase. Researchers must no longer focus their efforts on corroborating the 'no-harm-done' conclusion. They must no longer be satisfied to establish that conceptual understanding is better. They must, like some of the more recent pioneers, investigate the very nature of learning with CAS."

Methodological Issues

The purpose of this study is to develop, based on observations of students' work, a set of research questions that will help us understand the nature of learning in these more interactive technological environments. Particularly since little research has been done in this area, this phase of the research must be exploratory in nature. We feel the most appropriate methodology for this type of research is Glaser's notion of grounded theory, which he described as "the discovery of theory from data systematically obtained from social research" and contrasted with "theory generated by logical deduction from a priori assumptions" [20]. Our data gathering methods can be described using Romberg's [21] method of clinical observations, in which "the details of what one observes shift from predetermined categories to new categories, depending upon initial observations."

The subjects studied were college students taking a mathematics course (at a level beyond calculus) at a major research university. The students had been using CCP materials for at least several weeks and were somewhat familiar with MAPLE (the computer algebra system) and the format of the modules. The subjects volunteered to participate in the study (and were each paid $25). Their participation consisted of working through one of the CCP modules with a partner. The students working together were videotaped and, simultaneously, their computer output was captured on a separate videotape. Each session was 1-2 hours in length, and data were collected from a total of 10 pairs of students.

The data were collected in a quiet office where the pair of students could work comfortably. One of the investigators was also in the room (though not always for the entire time). On the table was a computer running MAPLE; students were also given paper and pencil. A video camera recorded their work, and a scan converter connected to a VCR and television recorded their computer output. When the students arrived, the investigator explained the general purpose of the research and asked the subjects to sign the consent and payment forms. He then helped them find the URL for the module and asked them to begin their work.

A principle of grounded theory is that one generates conceptual categories from evidence [20]. The authors of this paper (one mathematics educator and one educational psychologist) watched each of the tapes several times and noted issues that appeared to facilitate or inhibit learning or that appeared to be important factors in understanding the process of learning taking place. Another principle of grounded theory is that the categories that "emerged from the data are constantly being selectively reformulated by them. The categories, therefore, will fit the data, be understood both to sociologists and to laymen who are knowledgeable in the area, and make the theory usable for theoretical advance as well as for practical application" [20]. The following four categories emerged: (1) the role of the teacher; (2) how students work together; (3) self-regulation and time management; and (4) issues raised directly by the technology. We have selected several pithy excerpts that illustrate these concepts. For each of the vignettes, we discuss aspects of the four categories that are reflected in the data. We then discuss researchable questions raised by our analysis.

In order to help the reader place these vignettes in the context in which these students would normally be working, we present the following description of the beginning of a typical class in which students work on these CCP modules:

On a day when CCP computer-based modules are being used, the learning environment looks significantly different from a traditional math class. Typically, after gaining the attention of students and taking care of administrative announcements, the instructor gives a brief overview of and introduction to the lesson. The purpose of this initial teacher-directed overview is to activate students' prior knowledge, introduce new terminology and procedures, and provide students with a conceptual anchor. Students are then assigned to pairs (or get with a previously assigned partner). For most CCP modules, students work cooperatively with a single partner, but each pair of students belongs to a larger support team made up of four students. Each pair works collaboratively on a single computer. Roles are not assigned, so students must decide between themselves who will control the keyboard, point the mouse, read the problem, and take primary responsibility for the variety of tasks required by the learning activity. Once settled in, each of the two students typically silently reads the introduction on the computer screen, and then the pair collaboratively engages in problem solving.

Vignette 1

In this vignette we examine the work of Mary and Jim, who were working on a module called "The Equiangular Spiral" (http://www.math.duke.edu/education/ccp/materials/mvcalc/equiang/index.html) with Jim at the keyboard and Mary to his left. In this module, they used properties of exponential growth and polar coordinates to understand why these spirals, which occur so often in nature (such as in the shell of the chambered nautilus), are called equiangular. About 11 minutes into their work, they had finished measuring the outer spiral of a cross-section of a nautilus shell superimposed on a polar graph. They had worked together making steady progress and had encountered no difficulties. Meanwhile the investigator was in the office with them but was not closely observing their work, because they seemed to be progressing well and did not seem to need or want any help. At the point at which this vignette begins, the subjects had just read the following question: "Follow the instructions in your worksheet to plot the sequence of radial measurements, rn, as a function of the counter n. What sort of growth does this look like?"

The subjects noted that the graph of the data seemed exponential. They read the next instruction, "Experiment with logarithmic plotting of the data to determine the type of growth," but did not act on it (or perhaps did not understand what it was asking). They then went on to the next instruction, "Find a formula for a continuous function r = R(t) such that R(n) reasonably approximates the n-th measured radius, rn." They tried to guess what the formula would be by noting that for each increase of one in n, rn seemed to increase by a factor of 1.5. At this point, they could not figure out how to get MAPLE to plot the function 1.5^x. Mary immediately suggested that they check the help menu (?plot in MAPLE). This helped, and they were able to make a plot. Mary noticed, "we have a problem with the range;" Jim concurred, and they attempted to fix the problem. At this point (about 8 minutes after they began grappling with the problem of plotting a function to match the data), the investigator asked, "What are you trying to do?" After looking at what they were doing, he explained how to use MAPLE more easily to graph a semi-log plot of the data (a data analysis tool that helps find formulas for data that are exponential). The subjects then worked for another 3 1/2 minutes, each concentrating and contributing, while the investigator stepped back. At that time, about 24 minutes into the exercise, the investigator asked them, "Is that working?" Jim answered, "No. Here's the problem. Hold on. We have to do something first." Jim fixed a MAPLE command and then asked the investigator a few questions to clarify which MAPLE commands would create the plot he needed. Two minutes later they succeeded in producing a semi-log plot that was approximately straight. They continued to struggle, largely with the software, for another 7 minutes before they succeeded in superimposing the graph of the exponential function on the data plot in a way that produced a good fit. They were clear on the mathematical principles, correcting each other when mistakes were made. They then went on to the next part of the module.
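The radii Mary and Jim measured are not reproduced in this paper, so the sketch below uses hypothetical data (r_n = 2 · 1.5^n) to illustrate the semi-log technique the investigator showed them: if r_n = r0 · b^n, then log(r_n) is linear in n, and the slope of the fitted line recovers the growth factor b. (We use Python here rather than MAPLE; the computation, not the tool, is the point.)

```python
import math

# Hypothetical radial measurements: the actual shell data are not
# reproduced in the paper, so we synthesize r_n = 2 * 1.5^n.
n_vals = list(range(8))
radii = [2 * 1.5 ** n for n in n_vals]

# If r_n = r0 * b^n, then log(r_n) = log(r0) + n*log(b) -- a straight
# line on a semi-log plot. Fit that line by ordinary least squares.
logs = [math.log(r) for r in radii]
n_mean = sum(n_vals) / len(n_vals)
log_mean = sum(logs) / len(logs)
slope = sum((n - n_mean) * (y - log_mean) for n, y in zip(n_vals, logs)) \
        / sum((n - n_mean) ** 2 for n in n_vals)
intercept = log_mean - slope * n_mean

growth_factor = math.exp(slope)       # recovers b = 1.5
initial_radius = math.exp(intercept)  # recovers r0 = 2
print(round(growth_factor, 3), round(initial_radius, 3))
```

Once the data plot approximately straight on semi-log axes, the exponential model and its parameters fall out of the fitted line, which is exactly the shortcut the investigator was demonstrating.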


In this vignette, the students struggled with the software but were not frustrated, and they did succeed in accomplishing what was asked. Much of the subjects' effort in this vignette involved getting MAPLE to do the computations that they wanted done. Because Mary and Jim were hesitant to ask for help, they proceeded through the module at a slow pace. This raises certain questions: Was this an efficient use of the students' time? If this had been a classroom and the investigator the instructor, should he have intervened more often to help the students move along more quickly? How does the instructor know when and in what situations to intervene?

We also noticed in this vignette, as in others, that students are willing to persist for a long time trying different things on the computer. In particular, trial and error can be much quicker on a computer than with pencil and paper, and because of this, self-regulation and time management can become problems.

This vignette illustrates three issues: (1) as in any active classroom, there are the questions of when and how the teacher should intervene; (2) the computer learning environment seems to affect students' ability to manage their time efficiently; and (3) whereas computers solve certain pedagogical problems (such as time-consuming calculations), they create others (such as learning the nuances of the software).

Vignette 2

This vignette also involved Jim and Mary and began several minutes after Vignette 1. They were asked to "construct a function r = r (theta) that describes the shell radius as a function of polar angle." They typed in:

x := theta -> R * cos(theta); y := theta -> R * sin(theta)

Because R was defined earlier as a function (not as a constant), MAPLE would not plot the parametric equations. After the subjects struggled with this for about 4 minutes, the following conversation took place:

Investigator: Did you run into a problem?
Jim: When we graphed it, we didn't get anything.
Investigator: What is R? You have x := theta -> R * cos(theta). R is the radius, right?
Mary: Yeah.
Investigator: Isnít that changing as a function of theta?
Jim: Oh. So we have our R formula.
Investigator: You can just say R (theta). Does that make sense?
Jim and Mary: Yeah.

They tried that and it worked. They were then able to proceed through the next parts of the module.
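The students' error has a direct analogue in most programming languages: because R had been bound to a function, the expression R * cos(theta) tries to multiply the function object itself rather than its value at theta. The Python sketch below reproduces the mistake and the investigator's fix (the shell parameters r0 and k are hypothetical; the module's actual values are not given in the paper).

```python
import math

r0, k = 1.0, 0.18  # hypothetical parameters; the module's values are not given

# R was defined earlier in the module as a function of theta, not a constant:
def R(theta):
    return r0 * math.exp(k * theta)

# Analogue of the students' first attempt, x := theta -> R * cos(theta):
# this multiplies the function object itself by a number, which fails
# here just as it left MAPLE unable to plot anything.
try:
    bad = R * math.cos(1.0)
except TypeError:
    bad = None  # a function object cannot be multiplied by a float

# The investigator's fix: apply R to theta first, then multiply.
def x(theta):
    return R(theta) * math.cos(theta)

def y(theta):
    return R(theta) * math.sin(theta)

print(bad is None, round(x(0.0), 3))
```

The distinction between a function and its value is exactly the mathematical point the investigator's question ("Isn't that changing as a function of theta?") was driving at.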


In this vignette, the subjects are confronted with a mathematical problem that they may not have been able to solve on their own, and without solving it, they could not proceed through the module. Whereas in the first vignette the investigator/instructor's help moved things along but may not have been critical, in this vignette the intervention of the instructor seems to have been essential. This was not an isolated instance; we noticed such problems often as we reviewed the data. This raises the question of what would happen here in the absence of an instructor, i.e., in a distance learning situation. It also raises the issue of whether and how such problems can be anticipated by curriculum developers and what possible technological solutions (such as links to hints) can be built into the module.

Vignette 3

Again we look at Jim and Mary working on the Equiangular Spiral module, this time toward the end of the module. They read the instructions: "Find derivatives of x and y with respect to theta, and then combine the results to find dy/dx in terms of theta. You may want to use your helper application for this." They could not remember how to get MAPLE to compute derivatives and began trying to find the derivative of x = r0*e^(k*theta)*cos(theta) with respect to theta. After a few minutes of not being able to do this, Mary became frustrated and said, "We can just do it by hand." She began to do the calculation on paper, but Jim said, "I'm trying to remember how MAPLE works." After about a minute, Mary finished, and they had the following conversation, which was conducted in a friendly and jocular manner:

Mary: O.K.
Jim: Shut up.
Mary: (laughs) Here, it's just the product rule.
Jim: Yeah. It would be nice if MAPLE would do it for us.
Mary: It will.
Jim: Yeah. I want it to do it.

A minute later, working together they got MAPLE to do the calculation.

Mary: You see, it's exactly what I just did.
Jim: Yeah but your way is stupid.
Mary: But it was quicker.

The instructions asked them to divide dy/dtheta by dx/dtheta to get a formula for dy/dx. This would have been quite tedious with pencil and paper, but they were now able to use MAPLE to do it in a couple of seconds. The instructions then directed them to evaluate an even more complicated expression, which reduced to 1/k. When they got this result with MAPLE (by now, they were using MAPLE correctly), Jim said, "Wow. I want to work this out on paper. I don't believe that."
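Neither the module's intermediate expressions nor its parameter values are reproduced in the paper, so the numeric check below is a sketch under assumed values (r0 = 1, k = 0.2). For the equiangular spiral x = r0*e^(k*theta)*cos(theta), y = r0*e^(k*theta)*sin(theta), finite differences confirm the symbolic form of dy/dx, and one natural candidate for the expression that reduces to 1/k is r/(dr/dtheta), whose constancy in theta is the "equiangular" property:

```python
import math

r0, k = 1.0, 0.2  # assumed parameters; the module's values are not given

def x(t): return r0 * math.exp(k * t) * math.cos(t)
def y(t): return r0 * math.exp(k * t) * math.sin(t)
def r(t): return r0 * math.exp(k * t)

def deriv(f, t, h=1e-6):
    # central finite difference, standing in for MAPLE's symbolic diff
    return (f(t + h) - f(t - h)) / (2 * h)

theta = 0.8
dydx = deriv(y, theta) / deriv(x, theta)

# Symbolic result of dividing dy/dtheta by dx/dtheta:
expected = (k * math.sin(theta) + math.cos(theta)) / \
           (k * math.cos(theta) - math.sin(theta))

# r/(dr/dtheta) reduces to 1/k for every theta -- the property
# that gives the spiral its name.
tan_psi = r(theta) / deriv(r, theta)

print(abs(dydx - expected) < 1e-5, abs(tan_psi - 1 / k) < 1e-4)
```

A check of this kind, whether on paper or with the CAS, is exactly the kind of verification Jim wanted before believing the surprising 1/k result.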


Note that Mary first suggested using pencil and paper but then wanted to use the CAS, whereas Jim first insisted on using the CAS but in the end did not believe the results without checking them with pencil and paper. The main issue raised by this vignette is how and why students choose one tool or another. As we've seen in other videotaped sessions, the students used the CAS and hand-held calculators, as well as pencil and paper. Some students (like Mary and Jim above) seem to trust their pencil and paper calculations more than CAS computations. This tendency may have been exaggerated by the surprising result of the complicated expression reducing to 1/k. The issue of what tools to use to solve a particular problem raises several questions: Does familiarity and comfort with a tool affect how readily one accepts the results produced with that tool? Do some students (particularly those who have been successful in school) receive some ritualistic satisfaction from doing pencil and paper calculations? Given multiple technological tools, do students check their work, and if so, how?

Vignette 4

In this vignette, we examine the work of Andy and Larry working on a module called "Correlation and Linear Regression" (http://www.math.duke.edu/education/modules2/materials/test/test/) with Andy at the keyboard and Larry to his left. In this module, they learn about correlation and use that notion to develop an understanding of linear regression and least squares estimates for linear data. At the beginning of the module, the students are given the scores of 15 students on two tests and are asked to plot the scores on test 1 vs. the scores on test 2. The first thing they did was follow a link to another file that showed them how to use MAPLE to make a scatter plot. Andy correctly cut and pasted the zip command that tells MAPLE to create coordinates from two sets of variables. They then needed to decide what the x and y should be, the correct answer being test1 and test2. The following conversation then occurred. During this time they were in the room by themselves; the instructor had stepped out for a few minutes.

Andy: (types and says out loud): ourdata = zip (x, y) -> [x, y]
Larry: What does the zip do? Is that just something in the definition?
Andy: I have no idea. It just says it right there.
Larry: You just copied it?
Andy: Yeah. Why not.
Larry: I thought it was a cool function, it sounded interesting. I guess just test 1 test 2
Andy: No. No. We're not planning on plotting them against each other.
Larry: Yeah we were. Werenít we?
Andy: No.
Larry: I thought we were plotting test 1 vs. test 2
Andy: No. We're plotting test 1 and test 2 so we want to do these against, just like, the one, so each number represents a 1.
Larry: Oh wait, we're just putting the plots on the same graph and not plotting them against each other?
Andy: Yeah.
Larry: OK.
Andy: I don't know if this is going to work.

Since what Andy said made no sense, they made very little progress. At first, they typed "plot (test 1)," which produced nothing. Andy then spent a couple of minutes poking around in the help menu, thinking that he was having syntax problems rather than a problem understanding the mathematics. After about 4 minutes, Andy said, "You know what we can do?" and plotted the test1 data vs. the set {1, 1, 1, ..., 1}. Larry said, "That can't be the normal way to do it. Interesting though." Periodically throughout this process, Larry politely and without being assertive asked whether Andy was sure that they were not supposed to be plotting test 1 vs. test 2. We should note that the module has a link to the glossary for the word "versus," yet neither Larry nor Andy suggested following that link. Finally, they looked further down on the worksheet and realized that they should have been plotting test 1 vs. test 2. They typed this in and got the correct scatter plot. At this point the investigator entered and verified for them that they were on the right track. Interestingly, Andy didn't seem embarrassed by his refusal to take Larry's advice, and Larry didn't seem to blame him.
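Larry's reading was the right one: "test 1 vs. test 2" means pairing the scores into coordinates, which is exactly what the zip command Andy cut and pasted constructs. The Python sketch below illustrates the distinction with hypothetical scores (the module's 15 actual scores are not reproduced in the paper):

```python
# Hypothetical scores; the module's actual data are not reproduced here.
test1 = [72, 85, 90, 64, 78]
test2 = [70, 88, 93, 60, 80]

# Python's built-in zip plays the role of the MAPLE command Andy pasted,
# ourdata := zip((x, y) -> [x, y], test1, test2): it pairs the i-th
# score on test 1 with the i-th score on test 2.
ourdata = [[a, b] for a, b in zip(test1, test2)]

# Andy's plot of test1 against the constant sequence {1, 1, ..., 1}
# discards the relationship between the two tests entirely.
andys_plot = [[a, 1] for a in test1]

print(ourdata[0], andys_plot[0])  # the first point of each plot
```

Plotting ourdata gives the scatter plot of test 1 versus test 2 that the module asked for; plotting andys_plot collapses all the test 2 information into a constant.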


Several issues surface in this vignette. Perhaps foremost among them is the ongoing tension between one student's desire to understand the conceptual ideas embedded in the math problem and the other student's push to solve the problem and get through the assignment as quickly as possible. For example, Larry asks, "What does the zip do?" Andy indicates that he has no idea, seemingly implying that the primary goal is to finish the problem, not necessarily to understand the underlying ideas and processes. This raises the question: How can computer learning environments be designed to foster learning for understanding while also using students' time efficiently, avoiding student frustration, and preventing students who want to finish with a minimum of time and effort from missing the point of the lesson?

A second issue raised by this vignette is similar to one in vignette one. Students struggle with the tools (MAPLE software commands) as much as they do with the math concepts. Andy spends much of his time in this vignette trying to correct what he perceives to be a MAPLE syntax problem, when his real problem is his lack of understanding of the concept of "versus." Perhaps with greater metacognitive awareness, Andy might have been able to ask himself whether he was having a problem with the tool or with the math concept. This raises the following questions: In a computer learning environment, is it possible to build into lessons ways to help students develop metacognitive skills? What can instructors do to strengthen students' abilities to recognize and differentiate between tool problems and conceptual misunderstandings? Can the learning of metacognitive processes be embedded and situated in computer-based modules without significant costs in terms of instructional time?

Another issue that surfaces in this vignette concerns the role of the teacher. In this particular scenario, the investigator/instructor permitted the two students to struggle independently for quite some time before interacting with them. This did allow Andy and Larry eventually, after a significant expenditure of time, to discover on their own how to plot the data. However, was this an efficient use of instructional time? Would more learning have occurred if the instructor had intervened sooner? What is the cost-benefit ratio of letting students discover solutions versus intervening and guiding them more directly through the computer based modules? Again, as was discussed in vignette one, these are questions in any active classroom.

A final aspect of vignette four concerns the role of student-to-student dialogue in math computer learning environments. In this particular vignette, Andy and Larry appear to be sharing ideas, but Andy is not seriously considering Larry's questions and Larry is not asserting himself. Andy continues to plod down the wrong path, despite Larry's early suggestion that they should plot test one versus test two. In vignettes one, two, and three, genuine dialogue between the pair of students seemed to exist. The students shared their provisional hypotheses and provided one another useful feedback. In the case of Andy and Larry, Andy emerges as an assertive but conceptually mistaken leader. Larry, who correctly understands the math problem, remains for the most part a passive follower. This raises several questions: In an interactive computer learning environment where meaningful student-to-student dialogue is essential for developing understanding, what steps can the instructor take to facilitate dialogue? Can mechanisms be built into the computer modules to get students to reflect on the quality of their interactions? How does the instructor structure the lesson to minimize the problem of one student taking over the learning situation? Can interdependence and shared responsibility, as well as other aspects of cooperative learning, be more effectively built into computer modules?

Vignette 5

In this vignette, Carl and Kevin are working on the module called "Correlation and Linear Regression" and, after examining formulas for the correlation coefficient, are asked to compute the correlation coefficient for a data set consisting of four points: {(1,2), (2,3), (3,4), (4,3)}. They plunged into the exercise using pencil and paper, but when (3½ minutes later) they were faced with trying to calculate the square root of 2, Carl asked Kevin, "Do you have a calculator?" After Kevin looked around and couldn't find one, the investigator said, "What are you looking at?" referring to the computer. The investigator expected that the subjects would bring up MAPLE, but instead Carl brought up the computer's scientific calculator (a standard accessory in MS Windows). Five minutes later they finished the calculation. The following conversation occurred (note: computing the standard deviation is a step in using the given formula for the correlation coefficient):

Investigator: Would it have been easier or harder to figure out how to get MAPLE to do the standard deviation?
Carl: If we had a nice little equation like that it would be fairly easy I think.
Investigator: Actually, didn't we do that before?
Carl: Up here? Oh the coefficient thing. Yeah.
Kevin: Oh yeah, that would have been easier.


This vignette illustrates an issue that recurred in almost every lab session the researchers videotaped. The issue concerns three aspects of students' use of tools in computer based learning environments: (1) the perceptions students have of the array of "tools" available to them to solve a problem; (2) students' notions of when and where it is appropriate to use a particular tool; and (3) the degree to which students believe or "trust" that a certain tool is a reliable means of producing the correct solution to a problem. For example, after beginning their work using pencil and paper, Carl and Kevin then used the computer's calculator. By the end of the session, with the guidance of the instructor, Carl and Kevin began to reflect on their choices of tools and seemed to realize that MAPLE is a powerful tool they also have at their disposal.

This vignette raises several potential research questions. For instance, even though Carl and Kevin had been using MAPLE for two or three months, the idea of using the computer algebra system to calculate a standard deviation did not occur to them. This raises the questions: Prior to putting students into a computer based learning environment, can we design ways to introduce students to computer algebra systems that help them feel comfortable using them? Can we design computer modules in ways that get students to think more reflectively about their choice of tools (pencil/paper, calculators, computer calculator, CAS)? What factors underlie students' perceptions of the accuracy, reliability, and efficacy of a particular tool?
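For context, the standard-deviation step that stalled Carl and Kevin is a one-line computation in most mathematical software. The sketch below is an illustration only, using Python's statistics module as a stand-in (the students were working in MAPLE); the data are the four points from the module exercise.

```python
import statistics

# Data from the module exercise: {(1,2), (2,3), (3,4), (4,3)}
x = [1, 2, 3, 4]
y = [2, 3, 4, 3]

# Sample standard deviations -- the step where Carl and Kevin reached
# for pencil and paper and then the desktop scientific calculator.
sx = statistics.stdev(x)  # sqrt(5/3), about 1.291
sy = statistics.stdev(y)  # sqrt(2/3), about 0.816

print(sx, sy)
```

Whether students recognize that such a step is already available in the tool in front of them is precisely the tool-perception question raised above.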

Vignette 6

Again, we describe students working on the module "Correlation and Linear Regression." The students, Alex and Neil, are roommates, and both are taking linear algebra. After a couple of minutes of work, the following conversation occurs:

Alex: Why don't you type?
Neil: Are you sure?
Alex: Yeah.
Neil: Why don't you want to type?
Alex: You're more familiar with the commands.
Investigator: Who usually types?
Alex: He did before because he knew MAPLE and then I did the last couple.
Neil: We take turns.
Alex: Yeah, it's his turn anyway.

After this, they returned to work on the module. During this period, they sometimes thought aloud and sometimes talked to each other (but they looked at the screen rather than each other) and they often pointed at objects on the screen as they talked.

Soon after they switched seats, Neil asked the investigator, who was still overseeing their work, "How do we turn radicals into decimals?" (MAPLE output is exact unless you ask for the decimal approximation.) The instructor/investigator responded, "You go evalf" (evalf is the command to compute a decimal approximation). Several minutes later, after the instructor/investigator had left and was sitting in the adjacent room, they acknowledged in their discussion that they did not know how to interpret the word "versus" (as in "test 1 versus test 2"). Though the word "versus" was highlighted as a hot link, they did not click on the link. Eventually, they figured out what versus meant from the context of questions that came up later.

About 10 minutes into the tape, when they executed the command to plot the scatter plot of test 1 versus test 2, the plot was displayed incorrectly. This was not the result of any error on the students' part but was due to technical problems concerning the memory of the machine and MAPLE's interaction with the hardware. The students' first reaction was to assume that the output was correct, and they tried to construct some meaning out of the incorrect output (obviously a challenge). After realizing that the output was incorrect, Neil tried all sorts of things, such as changing the format and display options, hoping to get the graphs to come out right.


One issue that appeared in many of the lab sessions concerned which of the lab partners would assume certain roles: for instance, who will take responsibility for typing, using the mouse, and offering initial ideas. While in most cases the roles are not explicitly discussed, in this case Neil and Alex directly addressed the issue of who would keyboard. This raises the questions of whether roles should be assigned, how students work out roles in the absence of assignments, and what impact, if any, the roles have on learning.

This ties in again to the question of the role of the instructor: Should the instructor assign roles to students? Although assignment of roles has been discussed in the literature on cooperative learning [22], do different issues arise in the interactive computer environment?

Another issue in this vignette concerns the different ways that students seek help. Neil and Alex encountered difficulty in interpreting "versus" yet they either failed to recognize that versus was a hot link in the HTML document, or they were reluctant to use it. Why do students sometimes use links and other times ignore them? What cognitive conditions prompt students to use hot links? This vignette also raises issues seen in earlier vignettes: What are the ways students seek and receive help in the computer learning environment? What is the role of the instructor in providing support and guidance?

A related issue that appears in this vignette has to do with the intellectual dialogue between the students during the session. One of the strongest features of the interactive computer lab approach seems to be the way that the labs foster collaborative discourse. This type of dialogue is seen in this vignette when Neil and Alex think aloud, offer hypotheses, and make predictions. However, the question arises: Is this type of dialogue typical of computer learning environments? Does the computer lab environment foster meaningful collaborative discourse in ways that would not occur without the computer, and why?

A final issue that surfaced in this vignette was the technical problem with the interaction of the software and hardware. The hardware memory problem was very common and frustrating to many students (and the investigator!). It was not consciously built into the study, but it points out the serious difficulties that can occur when technical problems arise. Typically, students think they did something wrong, as in Carl's reaction in a different tape: "Oh! What did I change? The computer hates me." Often, even the instructor (as in this case) cannot solve the problem, and this can seriously upset the flow of the lesson, especially if it depends on the output. This raises the same question we discussed in the fifth vignette about the impact that these technical problems have on the sense of credibility and trust that students have in the computer as a tool. Another question concerns a metacognitive issue: How can we help students learn how to check the reasonableness of an answer and determine whether discrepancies are due to mathematical or technical errors?

Vignette 7

In this last vignette, we describe the work of Dan and Aaron on the module "Correlation and Linear Regression". Dan and Aaron are among the best students in a linear algebra class. Dan is an electrical engineering major and Aaron is a math/economics major. Dan was working comfortably at the keyboard while Aaron was thinking critically and actively about the questions being asked. They were at the point where they had just examined the formula for the correlation coefficient and were asked to compute the correlation coefficient for a data set consisting of the points: {(1,2), (2,3), (3,4), (4,3)}.

Aaron: Could you go [scroll] up so I can see the formula?
Dan: There's a way to do this in MAPLE. I don't remember how.
Aaron: To find r?
Dan: To sum a list. I don't remember how right now.
Aaron: You can do the standard deviation, can't you? (meaning on MAPLE)
Dan: Yeah.
Dan enters "X1:=[1,2,3,4]; Y1:=[2,3,4,7];" and tries to remember the syntax for standard deviation.
Aaron: It's at the top if you can't remember.
Dan: (mutters) I can't remember.

Dan checked the syntax and computed the means and standard deviations of X1 and Y1 using MAPLE. Aaron began to input that information, using pencil and paper, into the formula for the correlation coefficient. While Aaron got involved in the pencil and paper computation, Dan, intently viewing the computer screen tried to remember how to sum a list. He used trial and error but did not use the help menu.

Aaron: I can do these by hand (referring to summing the products of xi and yi).
Aaron completed the computation, making some errors, and got 3/80*sqrt(10). Dan had MAPLE evaluate this and got approximately .12.
Aaron: That's really low.
Dan: Yeah, these two are correlated really well. It's got to be higher than that.

Dan tried to create a scatter plot, while Aaron and the investigator tried to find the arithmetic error. Dan ran into trouble getting MAPLE to plot the scatter plot.

Dan: It doesn't seem to show for some reason.
Investigator: Thatís strange.
Dan: The graph shows 3 points, not 4.
Investigator: Where would the fourth point be?
Dan: There should be one right here but it [the correlation] still should be better than .1.

Aaron continued looking for the computational error during this interchange, and the investigator returned to helping him, suggesting that, instead of looking for the error, he simply redo the calculation. Meanwhile Dan, quietly and persistently, tried to get MAPLE to compute the correlation coefficient by using the sum command but still couldn't get it to work.

Dan: I do this all the time in my other class. (He uses MAPLE in his electrical engineering class.)
Investigator: We'll check it with MAPLE later.
Aaron: I think it's 2/sqrt(10). I think that's right. I made two algebra mistakes and I found both of them.
Dan: (evaluating the expression on MAPLE) Yeah, that's reasonable.

In writing up the answer and explanation, Aaron made suggestions. Dan listened to Aaron's suggestions and, without saying anything, incorporated both Aaron's input and his own ideas. After reading what Dan typed, Aaron said, "I think that explains it."


In this vignette we see several of the issues we observed in other vignettes. For example, the two students struggle with the computer software. In this case they can't remember the MAPLE syntax needed to sum a list. This raises the question: How do we provide students with adequate training in computer algebra software without taking away from instructional time? A second issue raised in this vignette is the problem students appear to encounter with self-regulation and time management. Dan inefficiently used trial and error to discover how to sum a list. They devoted too much of the lab time to trying to create a scatter plot. This raises several questions: Are students more likely to have difficulty managing their time and regulating their thought processes in a computer based math lab? As anyone who uses a computer knows, a task that at first appears to take a few minutes on a computer can end up taking much longer, perhaps because it is so easy to explore options on the computer.

This vignette also raises the question of the role of the instructor in an interactive computer math environment. How directive should the instructor be? In this case, the instructor finally intervened and directed Aaron to redo the calculation, instead of inefficiently looking for his error. What responsibility did the instructor have in this vignette for making sure the lesson progressed at a reasonable pace?

Aaron and Dan engaged in productive collaborative discourse and cooperative problem solving. There are several good examples of the in-depth cognitive processing that conceptually based approaches to math are designed to elicit. For example, when Aaron and Dan compute a correlation coefficient of .12, they experience a sense of cognitive disequilibrium. "It's got to be higher than that," Dan remarked. The two students appear to have made a prediction or estimate of a higher correlation coefficient based on their initial analysis of the data. When they computed a lower number, they began to question its reasonableness. Is questioning the reasonableness of an answer more or less likely to occur in a computer-based program? Will some students tend to accept as reasonable whatever the computer produces, because they view the computer as infallible? How can we design lab experiences that encourage students to make predictions, estimate reasonable answers, and then compare their estimates to computer generated products? How can we build into computer lesson modules tasks that elicit the high level mathematical cognitive processing we want students to engage in?
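The arithmetic in this episode is small enough to check directly. As an illustration only (the students worked in MAPLE), a few lines of Python compute the correlation coefficient for the exercise's four points and confirm Aaron's corrected answer of 2/sqrt(10):

```python
import math

# Data set from the module exercise: {(1,2), (2,3), (3,4), (4,3)}
xs = [1, 2, 3, 4]
ys = [2, 3, 4, 3]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n

# Pearson correlation: r = sum((x-mx)(y-my)) / sqrt(sum((x-mx)^2) * sum((y-my)^2))
num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
r = num / den

print(round(r, 3))  # 0.632, i.e. 2/sqrt(10)
```

A moderately strong positive value such as this matches the students' visual impression of the data, which is exactly the kind of reasonableness check the questions above ask how to encourage.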

Summary and Conclusions:

The research presented here is the first stage of a process whose purpose is to develop an understanding of how students learn in technology-rich environments for the mathematics classroom. By carefully observing student work in a particular technology-rich environment (i.e., the Connected Curriculum Project), we generated a set of questions for further study. These questions were not derived a priori from a theoretical perspective but were derived from the data. Using grounded theory as a methodological approach, the data served as a first step in constructing a theory that will explain learning in this environment.

The analysis of the data led to the formulation of four categories of research questions: (1) What is the role of the instructor in this environment? (2) What types of behavior and thinking processes are students engaged in as they work together in front of the computer? (3) What is the importance of self-monitoring and metacognition in computer based instruction? and (4) What opportunities and obstacles are raised by the technology itself? Research in each of these areas has important implications for curriculum developers, math instructors, and students.

For each of these categories, we will summarize some of the issues and questions that arose from our observations:

(1) The role of the instructor: As in any active classroom, the instructor must confront questions of when and how to provide support and guidance to students who are engaged in complex problem solving. We observed incidents where the teacher's intervention was critical for the students' progress. In other cases, we observed students floundering and making little progress because of the lack of help. We also saw students struggle, perhaps inefficiently, with a problem, but eventually solve it on their own. The questions raised here include: What should be the role of the instructor in a computer based math class? How does the instructor make decisions about intervening? In an interactive computer math lab, how does the instructor ensure that instructional time is used efficiently? Is learning more likely to occur if the instructor intervenes whenever students encounter difficulties? Should the instructor allow students ample time to discover solutions on their own?

(2) What types of behavior and thinking processes are students engaged in as they work together in front of the computer? Much of the focus of our observations was on the ways students chose to work with one another as they engaged in collaborative problem solving. For example, we observed students making decisions about which tools to use to solve the problems presented to them on the computer. One of the recurring themes concerned how, when, and why students choose one tool or another (e.g., pencil and paper, a calculator, and/or CAS). Analysis of the data raised many questions concerning the perceptions students have of the array of tools available to them. We also observed students struggling with what they thought were tool problems, when in fact they had conceptual misunderstandings. How can students recognize and differentiate between tool problems and conceptual misunderstanding, and how can instructors help them?

We also observed how, as students worked in different situations, lab partners would assume certain roles such as hypothesizer, verifier, and recorder. We also saw students deciding who would take responsibility for typing, using the mouse, and offering initial ideas. At times, decisions of this sort were consciously made; at other times, the students seemed to choose roles without discussion. These observations raise the questions: Would students benefit from assigned roles? Should roles be structured to minimize the problem of one student taking over the learning situation?

We saw some students trying to understand the underlying concepts as well as some students trying to get through the lab as quickly as possible. In an interactive computer learning environment where meaningful student-to-student dialogue is essential for developing understanding, can mechanisms be built into the computer modules to get students to reflect on the quality of their interactions? Can interdependence and shared responsibility, as well as other aspects of cooperative learning, be built into computer modules?

We also noticed students using (or not using) links in many different ways. Some clicked on the hints immediately, some waited and clicked if necessary, and some didn't link to the hint even when they were stuck. What cognitive conditions prompt students to use hot links? What are the ways students seek and receive help in the computer learning environment? These questions about links are questions of both cognition (how students are thinking about the problems at hand) and metacognition (their thoughts about their thinking).

We saw instances of productive dialogue as students tried to solve the problems presented to them. Is this type of dialogue typical of computer learning environments or a function of the content presented? Does the computer lab environment foster meaningful collaborative discourse in ways that would not occur without the computer, and, if so, why? What steps can be taken to facilitate dialogue in computer-based math classes?

(3) The importance of self-monitoring and metacognition: The computer learning environment seemed to affect the students' ability to manage their time efficiently. Is there a positive side to students becoming so involved in problem solving that they lose track of time? Where should the balance lie between open-ended discovery and a focus on getting the correct answer in a set amount of time? Because of the speed of computers, we noticed that students would often not reflect before doing a calculation. Is a computer-based classroom different from a traditional pencil and paper environment in this regard? Can the learning of time management, self-regulation, and metacognitive processes be embedded and situated in computer based modules without significant costs in terms of instructional time? And (relating to problems with the technology), how can we help students learn how to check the reasonableness of an answer and determine whether discrepancies are due to mathematical or technical errors?

(4) The technology itself: Whereas computers solve certain pedagogical problems (such as time consuming calculations), they create others (such as learning the nuances of the software). Students struggled with the tools (MAPLE software commands) as much as they did with the math concepts. And the technical problems that occurred with the interaction of the software and hardware raise questions of how students and teachers react to such problems.

Clearly, not all these questions fall neatly into a single category. For example, the question of whether the instructor should assign roles to students depends on understanding what roles students assume, how they assume those roles, and the effect those roles have on learning. This is not an exhaustive list of issues. We can imagine many other questions, but, consistent with the principles of grounded theory, we are limiting the discussion to the evidence presented in the data.

In interpreting this data, it is important to realize that these students were talented students doing mathematics at a level beyond calculus and using specific software in a laboratory setting. It is not our purpose here to generalize these results to a larger population but to use these observations to suggest areas for future study. It is also important to note that each entering class of students brings more familiarity, more comfort and more sophistication with using educational technology. It is not clear which problems faced by the subjects in this study will likely be problems for students several years from now.

There were many issues not addressed in this study, such as the method of instruction and assessment, the physical environment of the classroom, and affective issues. These are certainly areas for future study. And as the categories above are reformulated and refined as we collect more data and develop a theory, the need to triangulate our observations (that is, to verify our observations by interviewing subjects and collecting other sources of data) will be crucial.

Above, we have raised many questions that are suitable for immediate and more focused study. In concluding, we suggest several follow-up studies based on the following questions:

(1) What are the implications of these observations for distance education? What would be different when students work alone as opposed to working with a partner? Will they learn the "right" lessons? How (or can) problems such as the ones identified in this study be anticipated by curriculum developers of distance learning materials and what possible technological solutions can be built into these materials?

(2) Which of these issues arise in a real classroom setting versus an experimental setting? Which don't? What other issues arise?

(3) Initially, we thought the person who had control of the keyboard might have been the more active learner, but we have seen instances where the person who was not burdened with keyboarding was free to think more about the mathematical content. We would suggest follow-up studies that focus on particular aspects of the way students work together. These studies should include clinical observations along with interviews and other data collection.

(4) How does the speed, allure and stimulation of computers affect the ways students solve problems? We would suggest focusing in great detail (perhaps with think aloud protocols) on a comparison of students working on a task with pencil and paper as opposed to computers. Among other things, such a study could document how computers affect student time management.

(5) What can we learn from existing research in other areas (e.g., cooperative learning, problem solving approaches to instruction)?

(6) How are the questions raised here different in an active learning environment without computer technology?

Our long-term research program involves the creation and validation of a model of learning and teaching of mathematics in a technology rich environment. The model will examine the nature and importance of the relationships among the following components:



content and context

materials (software, text)

method of instruction and assessment

physical environment of classroom

affective environment (e.g., classroom atmosphere)

We plan to identify sub-components of these factors. For example, students' issues might include students' prerequisite knowledge, attitudes, motivation, and learning style. We hypothesize that there will be significant interactions among the sub-components, both within and between larger components of the model. We realize that this is a very bold and ambitious agenda. We plan, working with others over a period of years, to make progress toward an understanding of the relationships among these components.


  1. Smith, D.A. (2000) Renewal in Collegiate Mathematics Education: Learning from Research. In Ganter, S. L. (Ed.), Calculus Renewal: Issues for Undergraduate Mathematics Education in the Next Decade (pp. 23- 40). New York, NY: Kluwer Academic/Plenum Publishers.
  2. Krantz, S. (2000). Imminent Danger: From a Distance, Notices of the AMS 47(5), 533.
  3. Chambers, J. & Bailey, C. (1996). Interactive Learning and Technology in the US Science and Mathematics Reform Movement. British Journal of Educational Technology (27), 123-133.
  4. Battista, M.T. (1999). The Mathematical Miseducation of America's Youth. Phi Delta Kappan, 80 (6), 425-433.
  5. National Council of Teachers of Mathematics. (1991). Professional Standards for Teaching Mathematics. Reston, Va: Author.
  6. National Research Council. (1991). Moving beyond myths: Revitalizing undergraduate mathematics. Washington, DC: National Academy Press.
  7. National Science Foundation. (1996). Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology. Washington, DC: Author.
  8. Colvin, M.R., Moore, L., Mueller, W., Smith, D. and Wattenberg, F. (1999). Design, development, and use of web-based interactive instructional materials. In G. Goodell (Ed.), Proceedings of the Tenth Annual International Conference on Technology in Collegiate Mathematics. Reading, PA: Addison-Wesley.
  9. Coyle, L., Moore, L., Mueller, W., and Smith, D. (1998). Web-Based Learning Materials: Design, Usage, and Resources. Proceedings of the International Conference on the Teaching of Mathematics (pp. 71-73). Somerset, NJ: Wiley.
  10. Bookman, J. & Blake, L.D. (1996). Seven Years of Project CALC at Duke University- Approaching a Steady State? PRIMUS, 6 (3), 221-234.
  11. Smith, D.A. & Moore, L.C. (1990). Project CALC, In T. W. Tucker, (Ed.), Priming the Calculus Pump: Innovations and Resources (pp. 51-74). Washington, DC: Mathematical Association of America.
  12. Smith, D.A. & Moore, L.C. (1991). Project CALC: An Integrated Lab Course. In Leinbach, C. et al (Eds.), The Laboratory Approach to Teaching Calculus (pp. 81-92). Washington, DC: Mathematical Association of America.
  13. Piaget, J. (1952). The Origins of Intelligence in Children. New York: International Universities Press.
  14. Vygotsky, L.S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.
  15. Bruner, J.S. (1966). Toward a Theory of Instruction. New York: Norton.
  16. Fosnot, C.T. (1993). Rethinking Science Education: A Defense of Piagetian Constructivism. Journal for Research in Science Education.
  17. Portela, J. (1999). Communicating Mathematics through the Internet-A Case Study. Educational Media International, 36 (1), 48-67.
  18. Cooper, Marie A. (1999). Cautions and Considerations: Thoughts on the Implementation and Evaluation of Innovation in Science Education. In Kelly, A. and Lesh, R. (Eds.) The Handbook of Research Design in Mathematics and Science Education. (pp. 859-876.) Mahwah, NJ: Lawrence Erlbaum.
  19. Heid, M.K., Blume, G., Flanagan, K., Iseri, L.,Deckert, W., Piez, C. (1998). Research on Mathematics Learning in CAS Environments. In G. Goodell (Ed.), Proceedings of the Eleventh Annual International Conference on Technology in Collegiate Mathematics (pp. 156-160). Reading: Addison-Wesley.
  20. Glaser, B.G. & Strauss, A.L. (1967). The Discovery of Grounded Theory. Chicago: Aldine Publishing.
  21. Romberg, Thomas A. (1992). Perspectives on Scholarship and Research Methods. In D.A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 49-64). New York: Macmillan.
  22. Slavin, R.E. (1995). Cooperative Learning: Theory, Research, and Practice. Boston: Allyn & Bacon.