ETIPS

Educational Theory into Practice Software




Technical Report 1: Characteristics of Relevancy Index and Number of Steps

Eric Riedel, Ph.D.

Center for Applied Research and Educational Improvement (CAREI)

University of Minnesota

David Gibson, Ph.D.

The Vermont Institutes

Abstract

The work of 449 preservice teachers using the ETIP cases in 2002-2003 is examined with a focus on how their information searches within the cases resemble those of experts (relevancy index scores) and how this is related to the extensiveness of their search (number of steps). In general, the relevancy index scores increased over multiple cases and were negatively related to number of steps taken in a case. Users made an average of 31 separate steps to access information in a case, an average which decreased over multiple cases. Individual search strategies for achieving high relevancy index scores included broad, sweeping searches of available information and shorter, focused searches of relevant information only.

Report prepared for the ETIP Cases Project on January 13, 2004. Correspondence regarding this paper can be directed to the first author at the Center for Applied Research and Educational Improvement (CAREI), University of Minnesota, 275 Peik Hall, 159 Pillsbury Avenue SE, Minneapolis, MN 55455, riedel@umn.edu.

Table of Contents

  1. Executive Summary
  2. Introduction
  3. Sample
  4. Method
  5. Distribution of Relevancy Index
  6. Number of Steps Taken
  7. Relation between Number of Steps and Relevancy Index
  8. Examination of Selected Classes
  9. Discussion

List of Tables

  Table 1. eTIP 2 Problem Space with Relevant Information Highlighted
  Table 2. Relevancy Index Scores by Semester
  Table 3. Fall 2002 Relevancy Index Scores by Class
  Table 4. Spring 2003 Relevancy Index Scores by Class
  Table 5. Individual Case 3 Sequence by Relevancy Score (Instructor P1's Course)
  Table 6. Individual Case 3 Sequence by Relevancy Score (Instructor J's Course)

List of Figures

  Figure 1. Number of Steps Taken by Course and Case (Fall 2002)
  Figure 2. Number of Steps Taken by Course and Case (Spring 2003)
  Figure 3. Case 1 Number of Steps by Relevancy Index (Fall 2002)
  Figure 4. Case 2 Number of Steps by Relevancy Index (Fall 2002)
  Figure 5. Case 1 Number of Steps by Relevancy Index (Spring 2003)
  Figure 6. Case 2 Number of Steps by Relevancy Index (Spring 2003)
  Figure 7. Case 3 Number of Steps by Relevancy Index (Spring 2003)

Executive Summary

The following paper examines specific aspects of the use of ETIP Cases, an educational simulation designed to allow preservice teachers to practice making technology integration decisions. These aspects include how skillfully a user searches for information in a simulated school setting relevant to the case question (relevancy index scores), how extensive their search is (number of steps), and the relationship between the two. The sample includes 449 preservice teacher users embedded in 34 different teacher education classes. These were taught by 16 different instructors (test-bed faculty) at 10 colleges and universities over the 2002-2003 academic year.

The relevancy index had a range of 0 to 2. It had an approximately normal distribution with the mean between .86 and .93 and a standard deviation between .29 and .35 depending on the semester and case. Relevancy index scores generally increased over time. Gains were statistically significant when classes were grouped by semester but only for some individual classes. The number of steps had a range of 4 to 120 (cases with less than 4 steps were excluded).

The mean number of steps ranged between 31 and 32 with a standard deviation between 20 and 23 depending on the semester and case. The distribution of the number of steps was generally skewed to the right (positively skewed) for both individual classes and whole semester samples. There was considerable variation within and between classes in the number of steps taken. There was a slight tendency for the number of steps to decrease over time, a pattern that was not statistically significant.

The relationship between the number of steps taken in a case and the relevancy index was generally negative: as the number of steps taken increased, the relevancy index score decreased. This relationship was consistent across semesters and cases. Although the pattern was generally linear, the relationship did appear to weaken at high numbers of steps.

Two test-bed classes from spring 2003 were selected for examination of individual users: Instructor J's introductory educational technology course; and a section of Instructor P's social studies methods for elementary teachers course. Instructor J undertook a minimal implementation strategy by introducing the cases in class but having students do each case on their own with little feedback. Instructor P implemented more substantially by doing cases in class, discussing them, and providing other forms of formative assessment. Individual search patterns on the third case were presented in relation to the individual's relevancy index scores. In Instructor P1's class, a high relevancy index score was achieved by hitting almost exclusively on the highly relevant items and not accessing any other information. In Instructor J's class, a high relevancy index score was achieved by accessing clusters of highly relevant items and returning to them later in the case.

Introduction

The Educational Theory into Practice Software (ETIPS) originated with a grant in 2001 from the U.S. Department of Education's Preparing Tomorrow's Teachers to Use Technology (PT3) program. Since its inception, these online cases have been designed to provide a simulated school setting in which beginning teachers can practice decision-making regarding classroom and school technology integration, guided by the Educational Technology Integration and Implementation Principles (eTIPs). In each case, users are given a case challenge based on one of these six principles about how they would use educational technology in the specific scenario[1]. They can then search for information about the school staff, students, curriculum, physical setting, technology infrastructure, community, and professional development opportunities. After responding to the case challenge in the form of a short essay, users are given feedback about their essay and case search. (Readers can view cases at http://www.etips.info/.)

The present paper draws on research and evaluation data gathered on the actual use of the cases during part of the 2002-2003 test phase. It is part of a series of technical papers aimed at informing project staff, users of these cases, and researchers of educational technology more generally. This paper focuses on two measures of user performance within the cases: how expertly the user searches for information in the case (relevancy scores) and how extensively the user searches (number of steps).

Relevancy scores were developed by the ETIP Cases project as a way of summarizing student searches within the cases. Specifically, they were intended to show to what degree the student accessed case information necessary for answering the questions in the case challenge, and thus to serve as one measure of technology integration expertise. Each case challenge contained questions related to one of the educational technology integration and implementation principles (eTIPs) selected by the instructor for the assigned case. Thus relevancy is a concept defined by the specific questions asked.

Relevancy scores were previously assigned by ETIP Cases project staff acting as technology integration experts. Each piece of case information was rated as "0 Not Relevant", "1 Somewhat Relevant", or "2 Relevant" to answering the question posed in the case prologue. A relevancy index was subsequently calculated as the sum of relevancy scores for the items accessed by a user in a case divided by the number of steps taken in the case (excluding the final step of writing the essay). A "step" is defined as accessing an individual piece of information in the case. Returning to the same item later in a search counts as an additional step. The relevancy index is thus a measure, ranging from 0 to 2, of the efficiency of the search.

The following analyses sought to provide a clear understanding of the nature of the relevancy index through answering the following questions:

  • What are the characteristics of the relevancy index scores?
  • What is the distribution of the relevancy index?
  • How does the index vary over time?
  • How does the relevancy index relate to number of steps taken in a case?

Example

The following example illustrates how relevancy is applied in one of the ETIP cases. It is taken from a case set in an urban middle school called Cold Springs, in which the instructor assigned questions pertaining to eTIP 2 ("added value"). The case challenge reads as follows:

This case will help you practice your instructional decision making about technology integration. As you complete this case, keep in mind eTIP 2: technology provides added value to teaching and learning. Imagine that you are midway through your first year as a seventh grade teacher at Cold Springs Middle School, in an urban location. A responsibility of all teachers is to differentiate their lessons and instruction in order to accommodate for the varying learning styles, abilities, and needs of students in their classrooms and to foster students' critical and creative thinking skills. As a new teacher at Cold Springs Middle School, you will be observed periodically throughout the first few years of your career. One of the focuses of these observations is to analyze how well your instructional approaches are accommodating students' needs. The principal, Dr. Kranz, was pleased with your first observation. For your next observation, she challenged you to consider how technology can add value to your ability to meet the diverse needs of your learners, in the context of both your curriculum and the school's overall improvement efforts. She will look for your technology integration efforts during your next observation.

On the case's answer page, you will be asked to address this challenge by making three responses:

1. Confirm the challenge: What is the central technology integration challenge in regard to student characteristics and needs present within your classroom?

2. Identify evidence to consider: What case information must be considered in making a decision about using technology to meet your learners' diverse needs?

3. State your justified recommendation: What recommendation can you make for implementing a viable classroom option to address this challenge?

Examine the school web pages to find the information you need about both the context of the school and your classroom in order to address the challenge presented above. When you are ready to respond to the challenge, click "submit answer".

After reading the challenge, the user would then search for information relevant to the questions posed. Table 1 below lists all the information categories and individual items in those categories available for searching in all cases. The information items relevant to this particular case (eTIP 2) are highlighted. Relevant information is in bold and semi-relevant information is in bold and italics. Note that this table serves as a key for examination of individuals in two selected classes presented later in the paper.

Table 1. eTIP 2 Problem Space with Relevant Information Highlighted

CATEGORY (item numbers): INDIVIDUAL INFORMATION ITEMS

Prologue (1): Prologue=1

About the School (2-11): Mission Statement=2; School Improvement Plan=3; Facilities=4; School Map=5; Student Demographics=6; Student Demographics Clipping=7; Performance=8; Schedule=9; Student Leadership=10; Student Leadership Artifact=11

Staff (12-22): Staff Demographics=12; Staff Demographics Talk=13; Mentoring=14; Staff Leadership=15; Staff Leadership Talk=16; Faculty Schedule=17; Faculty Meetings=18; Faculty Talk=19; Faculty Meetings Artifact=20; Faculty Contract=21; Faculty Contract Talk=22

Curriculum and Assessment (23-30): Standards=23; Instructional Sequence=24; Computer Curriculum=25; Classroom Pedagogy and Assessment=26; Teachers=27; Talk=28; Talk 2=29; Clipping=30

Technology Infrastructure (31-42): School Wide Facilities=31; Library / Media Center=32; Classroom-Based Facilities=33; Classroom-Based Software Setup=34; Community Facilities=35; Technology Support Staff=36; Policies and Rules=37; Policies Clipping=38; Technology Committee=39; Technology Committee Talk=40; Technology Survey Results=41; Technology Plan and Budget=42

School Community Connections (43-48): Family Involvement=43; Family Involvement Clipping=44; Business Involvement=45; Business Involvement Clipping=46; Higher Education Involvement=47; Community Resources=48

Professional Development (49-68): Professional Development Content=49; Professional Development Content Area=50; Resources=51; Professional Development Leadership=52; Professional Development Leadership Talk=53; Learning Community=54; Learning Community Talk=55; Professional Development Process Goals=56; Professional Development Data=57; Professional Development Data Artifact=58; Professional Development Evaluation=59; Professional Development Evaluation Talk=60; Professional Development Research=61; Professional Development Research Artifact=62; Professional Development Design=63; Professional Development Design Talk=64; Professional Development Learning=65; Professional Development Learning Artifact=66; Professional Development Collaboration=67; Professional Development Collaboration Artifact=68

Epilogue (69): Epilogue=69

Essay (70): Essay=70

Bold items have high relevance. Bold, italicized items have medium relevance.

The path one student (from Instructor J's class, presented later in this paper) took to search through the case is as follows:

Step 1: Prologue

Step 2: Student Demographics

Step 3: Faculty Contract

Step 4: Faculty Schedule

Step 5: Staff Leadership

Step 6: Standards

Step 7: Classroom Pedagogy and Assessment

Step 8: Teachers

Step 9: School Wide Facilities

Step 10: Library / Media Center

Step 11: Classroom-Based Facilities

Step 12: Classroom-Based Software Setup

Step 13: Policies and Rules

Step 14: Technology Survey Results

Step 15: School Improvement Plan

Step 16: (Clicks again on School Improvement Plan – Not Counted)

Step 17: School Wide Facilities

Step 18: Library / Media Center

Step 19: Essay

The student took 18 actual steps (the repeated click at step 16 is not counted). The relevancy index is calculated by summing the relevancy points for the items accessed (nine high-relevance items at 2 points each, or 18 points, plus one medium-relevance item at 1 point, for a total of 19 points) and dividing by the number of actual steps minus one, so that the final step of writing the essay is not counted. The formula is:

Relevancy Index = (Sum of Relevancy Points) / (Number of Actual Steps - 1)

1.12 = 19 / 17
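For readers who prefer to see the calculation spelled out, the sketch below shows one way the index could be computed from a logged search path. It is illustrative only: the item-to-rating mapping is an assumption chosen so that the worked example above is reproduced (the actual expert ratings are conveyed by the highlighting in Table 1), and the function is not the project's actual scoring code.

```python
# Illustrative sketch of the relevancy index calculation.
# Ratings: 0 = not relevant (default), 1 = somewhat relevant, 2 = relevant.
# The dictionary below is an assumption for this example, not the full expert key.
RELEVANCY = {3: 1, 6: 2, 8: 2, 23: 2, 24: 2, 26: 2, 31: 2, 32: 2, 33: 2, 34: 2}
ESSAY_ITEM = 70  # the closing "write the essay" step is not scored


def relevancy_index(path):
    """Sum of relevancy points divided by the number of scored steps.

    `path` is the ordered list of item numbers a user accessed; repeated
    consecutive clicks are assumed to have been removed already, and the
    final essay step is dropped from both numerator and denominator.
    """
    scored = [item for item in path if item != ESSAY_ITEM]
    points = sum(RELEVANCY.get(item, 0) for item in scored)
    return points / len(scored)


# The 18-step search path from the example above (repeat click omitted).
path = [1, 6, 21, 17, 15, 23, 26, 27, 31, 32, 33, 34, 37, 41, 3, 31, 32, 70]
print(round(relevancy_index(path), 2))  # -> 1.12
```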

Sample

The sample consists of students enrolled in a teacher education class at one of the ten ETIP Cases test-bed institutions during the 2002-2003 academic year. This included 449 students in 34 foundations, methods, or educational technology classes taught by 16 different faculty and instructors. Faculty and instructors assigned one, two, three, or four cases depending on the needs of the course and their approach to implementing the cases in their course. Faculty and instructors also selected whether the cases involved elementary students (K-6), intermediate and secondary students (7-12), or both. The sample was allowed to vary by these conditions, with the exception that when a faculty member or instructor allowed students to use either elementary or middle/secondary cases, only part of the class was included in the case analysis over time to ensure consistency within the class.

Data for the following analyses were collected automatically by the software although additional information (used in other technical papers) was collected through the use of a pre-semester survey. The software collected information on what information the user searched, in what order they searched, and the essay written at the end of the case in response to a general question posed about technology integration. Information from a user was included if that user returned a pre-semester survey, completed each of the cases assigned in the correct order, and made use of at least four separate steps in each case. These criteria assured that the data utilized met human subjects' protection requirements, the user made a reasonable attempt to follow course instructions, and that the user did not encounter insurmountable technical problems. Additional background data on case use was collected through the use of telephone interviews with each faculty or instructor using the cases following each semester.
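As a concrete illustration of these inclusion criteria, the sketch below filters a list of hypothetical user records. The field names (returned_survey, cases_in_order, steps_per_case) are invented for the example and are not the project's actual data schema.

```python
# Minimal sketch of the inclusion criteria described above: keep a user only if
# they returned the pre-semester survey, completed the assigned cases in order,
# and took at least four steps in every case.
MIN_STEPS = 4


def include_user(user):
    """`user` is a dict with hypothetical keys: 'returned_survey' (bool),
    'cases_in_order' (bool), and 'steps_per_case' (list of step counts)."""
    return (
        user["returned_survey"]
        and user["cases_in_order"]
        and all(n >= MIN_STEPS for n in user["steps_per_case"])
    )


users = [
    {"returned_survey": True, "cases_in_order": True, "steps_per_case": [31, 25]},
    {"returned_survey": True, "cases_in_order": True, "steps_per_case": [3, 40]},
    {"returned_survey": False, "cases_in_order": True, "steps_per_case": [20, 22]},
]
sample = [u for u in users if include_user(u)]
print(len(sample))  # -> 1 (only the first record meets all three criteria)
```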

Method

This analysis was conducted both at the level of all test-bed courses during either the fall 2002 or spring 2003 semester and at the level of the individual course. Because the case curriculum underwent minor revisions between the two semesters, the semesters are treated separately in the analyses. These revisions included weakening links among multiple cases, scaffolding case questions, and simplifying an essay grading rubric used by faculty and instructors. The revisions did not, however, affect the internal structure of the actual cases.

Descriptive characteristics are provided for both relevancy index scores and the number of steps taken. These include measures of the average score (mean, median) and of how much individual scores vary from the mean (standard deviation). Change in these characteristics over multiple cases is assessed primarily through paired t-tests. Aspects of individual student searches are also examined by comparing the behavior of individuals in two classes in which the instructors took different approaches to using the cases in their courses.
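The sketch below illustrates, with hypothetical scores, how such a paired t-test can be run with SciPy; it is not the analysis code used for this report.

```python
# Paired t-test comparing the same students' relevancy index scores on two
# successive cases (hypothetical data for illustration only).
from scipy.stats import ttest_rel

case1 = [0.78, 0.92, 0.85, 1.01, 0.66, 0.88, 0.95, 0.73, 0.81, 0.90]
case2 = [0.85, 0.99, 0.91, 1.10, 0.72, 0.93, 1.05, 0.80, 0.88, 0.97]

t_stat, p_value = ttest_rel(case2, case1)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # gain is significant if p < .05
```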

Distribution of Relevancy Index

The tables below provide mean relevancy index scores by semester (Table 2) and by test-bed course and case for the fall 2002 and spring 2003 semesters (Tables 3 and 4). The distribution of relevancy index scores approximated a normal distribution for each semester. The mean relevancy index scores for each semester ranged from .86 to .93 (see Table 2).

The general pattern was for relevancy index scores to increase over time, judging by the averages in each class. When the classes are combined by semester, the gains were sometimes statistically significant (see Table 2). When examining individual classes, only a few show statistically significant gains, as shown in Tables 3 and 4 below. There was less individual stability in relevancy index scores than might be expected. For the fall 2002 semester, the correlation between relevancy index scores on the first and second cases was only .26. For the spring 2003 semester, the correlation was .51 between the first and second cases and .27 between the second and third cases.
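The stability figures reported here are simple Pearson correlations between the same students' scores on successive cases; the sketch below shows that computation on hypothetical data.

```python
# Pearson correlation between scores on two successive cases for the same
# students (hypothetical values for illustration only).
from scipy.stats import pearsonr

case1 = [0.78, 0.92, 0.85, 1.01, 0.66, 0.88, 0.95, 0.73]
case2 = [0.70, 1.05, 0.80, 0.95, 0.90, 0.84, 1.10, 0.68]

r, p = pearsonr(case1, case2)
print(f"r = {r:.2f}")  # a low r indicates little individual stability across cases
```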


Table 2. Relevancy Index Scores by Semester

              Case 1           Case 2           Case 3
Semester      Mean     S.D.    Mean     S.D.    Mean     S.D.
Fall 2002     .86      .30     *.93     .35     --       --
Spring 2003   .90      .29     .93      .33     *1.07    .33

* Statistically significant difference from previous time period based on paired t-test (p < .05).

Table 3. Fall 2002 Relevancy Index Scores by Class

Instructor  Course       Case Grade Level  eTIP  N     # Cases  Case 1 Mean/Med.  Case 2 Mean/Med.  Case 3 Mean/Med.  Case 4 Mean/Med.
A           Technology   Secondary         2     12    3        .78 / .76         .82 / .73         .77 / .84         --
B1          Foundations  Elementary        2     11    2        1.09 / 1.00       .99 / 1.00        *.42 / .33        --
B2          Foundations  Both              2     ‡12   1        1.09 / 1.02       --                --                --
C           Methods      Elementary        --    5     2        .98 / 1.00        .89 / .91         .99 / 1.00        1.17 / 1.08
D           Methods      Secondary         2     5     4        .98 / 1.00        .92 / .93         1.02 / .97        .89 / .88
E           Assessment   Elementary        3     23    4        .88 / .89         .85 / .85         *1.04 / 1.00      1.02 / 1.00
F           Methods      Secondary         1     5     2        .76 / .83         .79 / .67         --                --
G           Foundations  Elementary        6     12    3        .67 / .58         .69 / .62         .81 / .88         --
H1          Methods      Elementary        1     17    4        .91 / .79         *1.06 / 1.04      1.00 / 1.00       1.09 / 1.04
H2          Methods      Elementary        1     11    4        .85 / .81         .92 / .96         1.04 / 1.25       .88 / .90
I           Foundations  Both              2     ‡5    4        1.16 / 1.18       1.06 / 1.06       1.03 / 1.20       1.10 / 1.03
J           Technology   Both              2     ‡8    4        1.14 / 1.18       .99 / .96         *.38 / .42        *1.05 / 1.09
L           Foundations  Elementary        2     14    2        .79 / .75         *1.44 / 1.58      *.35 / .50        *1.05 / .92
M1          Technology   Secondary         2     25    2        .89 / .87         .92 / .94         --                --
M2          Technology   Elementary        2     40    2        .77 / .78         .81 / .80         --                --
M3          Technology   Secondary         2     22    2        .80 / .69         .88 / .86         --                --
M4          Technology   Secondary         2     20    2        .76 / .75         *.99 / .92        --                --
N           Technology   Both              --    ‡6    2        .69 / .71         .75 / .74         --                --

‡ Course allowed use of both elementary and secondary grade school cases. Part of the sample from this course was excluded for consistency across classes.

* Statistically significant difference from previous time period based on paired t-test (p < .05).

Table 4. Spring 2003 Relevancy Index Scores by Class

Instructor  Course       Case Grade Level  eTIP  N     # Cases  Case 1 Mean/Med.  Case 2 Mean/Med.  Case 3 Mean/Med.
M1          Technology   Elementary        2     15    2        .85 / .79         .82 / .83         --
M2          Technology   Elementary        2     4     2        .64 / .64         .66 / .66         --
M3          Technology   Elementary        2     18    1        .75 / .73         --                --
M4          Technology   Elementary        2     21    1        .87 / .82         --                --
M5          Technology   Elementary        2     9     1        .89 / .88         .91 / .93         --
I           Foundations  Elementary        1     28    3        1.01 / 1.02       .93 / .85         *1.13 / 1.06
O           Technology   Elementary        2     19    3        .87 / .86         .92 / 1.00        .81 / .78
J           Technology   Both              2     ‡11   3        1.06 / 1.09       *1.15 / 1.17      1.16 / 1.12
L           Foundations  Elementary        2     12    1        .77 / .81         --                --
B           Methods      Secondary         2     6     2        1.17 / 1.12       1.04 / .89        --
K           Technology   Both              2     ‡11   3        .75 / .77         .87 / .82         1.03 / .88
C1          Foundations  Elementary        6     3     2        .72 / .60         .87 / 1.00        1.44 / 1.44
C2          Methods      Elementary        2     7     1        .93 / 1.04        --                --
E           Assessment   Elementary        3     12    3        1.05 / 1.09       1.09 / 1.13       1.10 / 1.12
P1          Methods      Elementary        2     6     3        1.14 / 1.22       1.08 / .94        1.32 / 1.31
P2          Methods      Elementary        2     14    3        .74 / .68         .80 / .67         *1.10 / 1.17

‡ Course allowed use of both elementary and secondary grade schools. Part of the sample from this course was excluded for consistency across classes.

* Statistically significant difference from previous time period based on paired t-test (p < .05).


Number of Steps Taken

For each semester sample, the distribution of the number of steps taken in a given case was skewed to the right (positively skewed). This also characterized the distribution of the number of steps at the level of individual classes. In the fall 2002 sample, the mean number of steps taken in the first case was 31, the median was 25, and the standard deviation was 22. For the second case in the fall 2002 sample, the mean number of steps was 30, the median was 25, and the standard deviation was 22. There was no statistically significant difference between the mean number of steps taken in the first and second cases for fall 2002.

For the first case in the spring 2003 sample, the mean number of steps was 30, the median was 26, and the standard deviation was 21. For the second case, the mean number of steps was 32, the median was 25, and the standard deviation was 24. For the third case, the mean number of steps was 25, the median was 18, and the standard deviation was 20. There was a statistically significant difference in the mean number of steps between the second and third cases but not between the first and second cases.

The box plots below (Figures 1 and 2) show the considerable variation in the number of steps taken that existed both between and within classes. For each box plot, the range of scores is defined by the outer edges of the vertical lines for each class. The actual top and bottom of each box defines the 75th and 25th percentile of scores for that class. The middle line in each box defines the median (or middle) score for that class.
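For readers who want to reproduce this kind of display, the sketch below draws an analogous box plot from hypothetical step counts using matplotlib; the class names and values are invented for illustration and are not the project data behind Figures 1 and 2.

```python
# Box plot of number of steps taken, one box per class (illustrative data).
import matplotlib.pyplot as plt

steps_by_class = {
    "Class A": [12, 18, 25, 31, 40, 55, 22, 28],
    "Class B": [8, 15, 20, 26, 33, 45, 60, 24],
    "Class C": [30, 35, 42, 50, 65, 28, 38, 47],
}

fig, ax = plt.subplots()
ax.boxplot(list(steps_by_class.values()), labels=list(steps_by_class.keys()))
ax.set_ylabel("Number of steps taken")
ax.set_title("Number of Steps Taken by Course (illustrative)")
plt.show()
```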

Figure 1. Number of Steps Taken by Course and Case (Fall 2002)

Figure 2. Number of Steps Taken by Course and Case (Spring 2003)

Relation between Number of Steps and Relevancy Index

The basic relationship between the number of steps taken in a case and the relevancy index is negative: the more steps taken in a case, the lower the relevancy index score tends to be. This is true whether looking at an individual class or at all classes taken together. This basic finding is demonstrated in the five scatter plots (Figures 3-7) shown below for the fall 2002 and spring 2003 semesters. It is worth noting that in some cases a curvilinear relationship actually fits the data better than a straight, negatively sloped line.

In this curvilinear pattern, the negative relationship between the number of steps and the relevancy index becomes weaker as the number of steps taken increases. In further analysis, outliers were excluded from each of the five scatter plots below by dropping cases in which the number of steps exceeded 60, and three forms of bivariate regression analysis (linear, quadratic, cubic) were run for each, predicting the relevancy index from the number of steps. The exclusion of such outliers helps avoid problems of heterogeneity that would otherwise violate assumptions of regression analysis. In three of the five samples, a curvilinear relationship predicted more of the variance in the relevancy index than a linear relationship (case 1 fall 2002; case 1 spring 2003; case 3 spring 2003).
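The sketch below illustrates this model comparison on hypothetical data: linear, quadratic, and cubic polynomials are fit to (steps, relevancy index) pairs with step counts over 60 excluded, and their explained variance is compared. It is an illustration of the approach, not the analysis code or data used for this report.

```python
# Compare linear, quadratic, and cubic fits of relevancy index on number of steps.
import numpy as np

rng = np.random.default_rng(0)
steps = rng.integers(4, 61, size=200)                        # outliers (>60 steps) already excluded
relevancy = 1.3 - 0.012 * steps + rng.normal(0, 0.15, 200)   # hypothetical negative relationship


def r_squared(x, y, degree):
    """Proportion of variance in y explained by a polynomial of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    predicted = np.polyval(coeffs, x)
    ss_res = np.sum((y - predicted) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot


for degree, label in [(1, "linear"), (2, "quadratic"), (3, "cubic")]:
    print(f"{label}: R^2 = {r_squared(steps, relevancy, degree):.3f}")
```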

Figure 3. Case 1 Number of Steps by Relevancy Index (Fall 2002)

Figure 4. Case 2 Number of Steps by Relevancy Index (Fall 2002)

Figure 5. Case 1 Number of Steps by Relevancy Index (Spring 2003)

Figure 6. Case 2 Number of Steps by Relevancy Index (Spring 2003)

Figure 7. Case 3 Number of Steps by Relevancy Index (Spring 2003)

Examination of Selected Classes

The two classes examined below from spring 2003 contrast two degrees of implementation. Instructor J used a minimal level of classroom time and instructor guidance with the cases, while Instructor P1 utilized formative feedback and asked students to do the cases during class time. The implementation for each class was carried out as the instructor intended. For each class, a table showing the search paths of individuals working through the third case is provided below. Semi-relevant items are shaded light gray. Relevant items are shaded dark gray. Both classes worked with eTIP 2. Instructor J's data is from secondary grades while Instructor P1's data is from elementary grades. Each table arranges the students from left to right in order of lowest to highest relevancy index score for that case. Each column in the table shows which items in the case the student accessed during each of the first 30 steps in the case. The key to each table is provided in Table 1, which also describes the full eTIP 2 problem space.

Each table suggests interesting patterns about what led to high relevancy index scores in each class. For Instructor P1's class (Table 5), the student with the lowest relevancy index score (.89) essentially adopted a pattern of going systematically through all available information from left to right, a pattern expected of students on the first case. The two students with the highest relevancy index scores (1.46 and 1.65) hit the highly relevant items in the categories About the School, Curriculum and Assessment, and Technology Infrastructure, but did not seek out much else. The students who achieved middle relevancy index scores (1.28, 1.29, 1.33) also hit highly relevant items but hit not-relevant items as well. It is interesting to note that items rated "semi-relevant" played almost no part in any of the students' searches.

The patterns in Instructor J's class are somewhat different (Table 6). While hitting the three categories of About the School, Curriculum and Assessment, and Technology Infrastructure played an important role in relevancy index scores, other factors seemed to distinguish those with the highest relevancy index scores from those with the lowest. In particular, those scoring in the highest ranges appeared to hit clusters of highly relevant items (e.g., #6 and #8; #24 and #26; #31 and #33) and, even more importantly, to return to hit those clusters again. This is particularly true for the students scoring 1.17, 1.25, and 1.34 on the relevancy index.

Instructor P1: Instructor P asked his students to do the ETIP cases as one of four assignments that dealt with the use of technology in teaching. He asked students to do three cases with eTIP 2, which figured into their class participation grade. He introduced the cases by talking about case studies and the importance of reading critically. He then started the first case in a computer lab. Instructor P also did the first case and asked students to compare his exploration with their own. Most then went on to finish the case in class. Students were asked to do the second case on their own. In the next session, most students completed the second and third cases in class. Instructor P discussed the case and gave feedback on student work.

Table 5. Individual Case 3 Sequence by Relevancy Score (Instructor P1's Course)

        RELEVANCY INDEX SCORE
STEP    .89    1.28   1.29   1.33   1.46   1.65
1       1      1      1      1      1      1
2       2      2      6      2      6      6
3       1      6      7      6      8      7
4       .      8      6      8      9      6
5       2      9      8      9      23     8
6       3      10     26     10     24     25
7       1      23     30     3      25     26
8       3      25     6      23     26     27
9       4      .      .      .      27     24
10      6      26     26     24     31     31
11      7      27     30     33     32     32
12      6      31     25     34     33     31
13      8      32     23     26     34     32
14      9      26     25     27     .      31
15      10     24     26     35     70     35
16      11     9      30     31     .      8
17      14     8      26     32     .      23
18      15     31     27     31     .      .
19      17     32     2      33     .      1
20      18     70     8      31     .      .
21      20     .      9      32     .      .
22      18     .      10     36     .      .
23      23     .      25     70     .      .
24      24     .      26     .      .      .
25      25     .      23     .      .      .
26      .      .      24     .      .      .
27      26     .      25     .      .      .
28      27     .      26     .      .      .
29      31     .      27     .      .      .
30      32     .      33     .      .      .

Semi-relevant items are shaded light gray. Relevant items are shaded dark gray.

Instructor J: Instructor J repeated the course that she taught the previous semester and used the cases in a similar manner. She introduced the cases with a PowerPoint presentation that addressed teacher education standards and a school's technology infrastructure. She gave students copies of the prologue, rubric, and student implementation manual. Students were told they would not be penalized for searching extensively in the case. Students were given the rest of the class period to work on the first case and a week to complete two more cases outside of class. The essays figured into the course grade. Instructor J did not utilize the search path map but did refer to the rubric in grading the essays.

Table 6. Individual Case 3 Sequence by Relevancy Score (Instructor J's Course)

        RELEVANCY INDEX SCORE
STEP    .79   .79   .93   1.06  1.09  1.12  1.17  1.25  1.34  1.55  1.67
1       1     1     1     1     1     1     1     1     1     1     1
2       .     3     2     .     2     6     2     6     6     6     33
3       3     4     3     6     4     21    4     31    8     23    34
4       2     6     6     8     6     17    6     32    9     24    6
5       .     7     12    9     8     15    8     31    12    25    31
6       4     8     23    23    31    23    12    32    24    26    32
7       33    12    24    24    32    26    23    31    25    27    70
8       34    25    25    25    33    27    24    33    26    31    .
9       36    31    33    31    35    31    25    34    31    32    .
10      23    32    34    32    48    32    26    35    33    33    .
11      24    33    31    33    25    33    27    25    34    34    .
12      9     36    32    34    70    34    31    6     35    .     .
13      8     41    41    35    .     37    32    9     43    .     .
14      6     35    9     51    .     41    35    43    45    70    .
15      15    47    70    15    .     3     37    47    6     .     .
16      42    51    69    36    .     .     33    .     8     .     .
17      70    1     .     23    .     31    34    70    10    .     .
18      .     .     .     24    .     32    43    69    23    .     .
19      .     51    .     .     .     70    51    .     24    .     .
20      .     42    .     70    .     .     49    .     25    .     .
21      .     70    .     69    .     .     50    .     26    .     .
22      .     .     .     .     .     .     .     .     27    .     .
23      .     .     .     .     .     .     1     .     31    .     .
24      .     .     .     .     .     .     33    .     32    .     .
25      .     .     .     .     .     .     34    .     33    .     .
26      .     .     .     .     .     .     31    .     34    .     .
27      .     .     .     .     .     .     24    .     6     .     .
28      .     .     .     .     .     .     25    .     24    .     .
29      .     .     .     .     .     .     26    .     70    .     .
30      .     .     .     .     .     .     27    .     69    .     .

Semi-relevant items are shaded light gray. Relevant items are shaded dark gray.

Discussion

When analyzed by semester, the relevancy index had an approximately normal distribution. There were, however, significant differences in the means and variances of the index between classes within each semester. The number of steps taken in a case had a positively skewed distribution: the bulk of individuals took between 4 and 40 steps in a given case, with the rest divided over the entire range, which was cut off at 120 steps in these analyses. To an even greater degree than for the relevancy index, there was considerable variation in the mean and variance of the number of steps taken between classes each semester.

Examination of the relevancy index over time and in relation to the number of steps taken in a case revealed that the relevancy index appeared to function as intended. There were statistically significant gains in the mean value of the relevancy index over time by semester and in some classes. The relevancy index was inversely related to the number of steps taken: those who scored high on the index generally took fewer steps in the case than those who scored low. Given these patterns, it appears that at least some students narrow their searches within the cases to information more pertinent to the case question. High scores are related to choices about what information not to access rather than to extensive searches in hopes of finding relevant information.

Whether the relevancy index represents learning about making decisions regarding high-quality technology integration or learning only how to manage the ETIP cases remains an open question. Subsequent analysis with additional data internal and external to the cases will provide further evidence on this. A preliminary in-depth examination of two courses, however, suggests the former. Specifically, those students with high relevancy index scores exhibited one pattern characteristic of expert thinking: the clustering of pieces of relevant information rather than accessing related pieces of information in different steps during their search.


[1] These six principles state the conditions under which technology use in schools has been demonstrated to be most effective. eTIP 1: Learning outcomes drive the selection of technology. eTIP 2: Technology provides added value to teaching and learning. eTIP 3: Technology assists in the assessment of learning outcomes. eTIP 4: Ready access to supported, managed technology is provided. eTIP 5: Professional development targets successful technology integration. eTIP 6: Professional community enhances technology integration and implementation. See Dexter, S. (2002). eTIPS-Educational technology integration and implementation principles. In P. Rodgers (Ed.), Designing instruction for technology-enhanced learning (pp. 56-70). New York: Idea Group Publishing.