Running head: LEARNING IN ENGINEERING

THE IMPACT OF NEW LEARNING ENVIRONMENTS IN AN ENGINEERING DESIGN COURSE

Daniel L. Dinsmore
Patricia A. Alexander
Sandra M. Loughlin
University of Maryland

DRAFT: Paper to be presented at the 2008 annual meeting of the American Educational Research Association, New York. Do not cite without author permission.

Abstract

In this study, we investigated the effects of students' participation in a collaborative, project-based engineering design course on their domain knowledge, interests, and strategic processing. Participants were 70 college seniors working in teams on a design project of their choosing. Their declarative, procedural, and principled knowledge, along with their domain interest and their interest in select roles within that domain, were tested at the outset of the semester and at its conclusion. Findings indicated that this course contributed to a rise in students' declarative knowledge, but not in their procedural or principled knowledge of engineering design. Further, there was no significant change in students' personal interest in the domain over the semester, and their role interests were not associated with their demonstrated knowledge in the field at posttest. Implications for the perceived effectiveness of collaborative, problem-based learning environments on students' academic development are considered.

THE IMPACT OF NEW LEARNING ENVIRONMENTS IN AN ENGINEERING DESIGN COURSE

Historically, educational researchers have recognized the multidimensional nature of learning and the influence that learner characteristics, teacher knowledge, pedagogical practices, and classroom resources and climate play in the processes and products of learning (Alexander, Schallert, & Reynolds, 2008; Jenkins, 1974). Recently, the multidimensional nature of learning has garnered increased attention in the research on "new learning environments" or NLEs—a phrase that evokes the infusion of technological advances within constructivist-oriented instructional settings (Brown, 2005; Gijbels et al., 2006). Although the specifics of NLEs vary from study to study, common features include a focus on "real-life" problems and peer collaboration (Gijbels et al., 2006). Within this literature, it is often assumed that these two pedagogical features will positively influence students' autonomy and achievement goals (Land & Hannafin, 1996; Roth, 1996), and their strategic processing, such as their self-regulatory behavior (Boekaerts, 2002). The purpose of the current investigation was to put such assumptions to the test within a particular domain and instructional context. Specifically, it was our intention to consider whether a learning environment that involved teams of students engaged in the solution of professionally-relevant problems of their own design would foster growth in knowledge, strategic processing, and personal interest indicative of expertise development (Alexander, 2003).

New Learning Environments and Learning Outcomes

As noted, the use of "real-life" (i.e., professionally- or personally-relevant) problems or examples has been a prominent pedagogical feature in the NLE literature. These problems are differentially defined in the literature, but can be broadly described as classroom problems intended to mimic problems encountered in field- or domain-related contexts.
But to what degree does the inclusion of these problems (hereafter referred to as relevant problems) translate into better learning for students? There is certainly some evidence that realistic and challenging problems matter. For example, Hardy, Jonen, Möller, and Stern (2006) found that an environment focusing on relevant examples in a third-grade science classroom yielded greater conceptual change compared to an environment that did not focus on relevant examples. However, such supportive evidence is not consistent within the literature. In fact, in a meta-analysis, Dochy, Segers, Van den Bossche, and Gijbels (2003) found that, in some cases, the focus on relevant problems had a negative effect on knowledge gains. Further, the relation between environments rich in such problems and knowledge gain was mediated by the relative expertise of the learner.

Another feature considered in the NLE research is peer collaboration, which has often been tied to relevant problem-solving tasks. Underlying peer collaboration efforts are the assumptions that cognition, along with motivation and engagement, will be enhanced when learners work together to solve challenging and meaningful problems. Evidence that these environmental manipulations increase satisfaction and interest in science courses has been offered (Nolen, 2003). However, the evidence for the effects of peer collaboration on student learning is far from consistent. For example, Beers, Boshuizen, Kirschner, and Gijselaers (2005) determined that the effects of collaboration were mitigated when student views were not made explicit. For instance, socially-inhibited students may not make their views explicit and, thus, may not benefit from peer collaboration. Thus, the characteristics of the learners appear to be implicated in peer collaboration within NLEs and should be carefully considered in analyses of learning outcomes.

One presumed consequence of environments that feature relevant problems and peer collaboration is that students may enjoy more autonomy in both the problem search and problem solution aspects of schooling (Lea, 2005). As with relevant problems, the evidence for increased learner autonomy and choice is mixed for NLEs. These differential outcomes may pertain to the fact that increased autonomy for learners in these classrooms comes with an increased requirement for students to set and regulate their own goals and instructional behaviors rather than relying on the teachers or the curriculum to provide "other" regulation (Boekaerts, 2002). Understandably, students may be differentially able or willing to assume that responsibility.

The data on students' strategic processing further illuminate the varied effectiveness of NLEs on learning. For instance, student-directed problem solving was found to yield higher self-reports of surface-level rather than deep-processing strategies (Nijhuis, Segers, & Gijselaers, 2005). One reason students gave for this increase in surface-level strategies was that the clarity of the goals for the more student-directed course decreased as student autonomy increased. By comparison, Wilson and Fowler (2005) found that the strategic performance of students who reported that they typically engage in deep processing remained stable across diverse learning environments, whereas students who reported relying on surface-level strategies reported a shift to deeper-processing strategies when relevant problems were involved.
Although the increase in deep-processing strategies among learners who relied on surface-level strategies could be regarded as positive evidence for NLEs, the stability in strategies for students typically reporting deep-processing strategies across academic settings suggests that the environment may not tell the whole story. Rather, it seems likely that learners with different cognitive and motivational characteristics benefit differentially from NLEs.

Further Consideration of the What and Who of Learning

The research on environments featuring relevant problems and peer collaboration has generally focused on the where of learning (i.e., features of the environment). However, within the NLE literature, significantly less attention has been paid to the what (i.e., the specific instructional content) and the who (i.e., particular student characteristics) in conjunction with the aforementioned contextual features. In most cases, the what of learning in reported studies has centered on more factual, declarative knowledge outcomes. Therefore, it is less apparent what the effects of NLEs will be on more complex procedures or more abstract concepts (Gijbels et al., 2006).

The who of learning in NLEs is also of paramount concern for learning outcomes. However, the particular cognitive and motivational characteristics of study participants have not been fully explored. For instance, the participants in many of these NLE studies are older and more cognitively mature learners, and many are enrolled in courses that are directly relevant to their professional or academic goals. Yet, the potential effects of these critical learner characteristics have not often been weighed in the interpretation of outcomes, even though it seems likely that environmental manipulations will be differentially utilized by learners with varying degrees of knowledge and motivation (Dochy et al., 2003). Indeed, Vermetten, Vermunt, and Lodewijks (2002) cautioned that individuals interact uniquely with their environments, indicating that learner characteristics may mediate what can be learned from classrooms that feature relevant problems or peer collaboration. Further, it is unclear whether effects for NLEs would remain stable across levels of expertise or for varied academic domains (e.g., mathematics or history; Boekaerts, 2002). In order to better understand learning in academic settings, the effectiveness of the learning environment must be examined in relation to what is being learned and by whom.

The Current Investigation

One framework with which to explore the multidimensional nature of learning within NLEs is the Model of Domain Learning (MDL; Alexander, 1997, 2003). The MDL examines the interplay of students' knowledge, interest, and strategic processing across three stages of expertise: acclimation, competency, and proficiency. There are certain advantages to using the MDL in the present investigation. First, as Dochy et al. (2003) ascertained, knowledge gains from pedagogical features of learning environments should be examined in relation to students' relative levels of knowledge and experience. The MDL allows for such a consideration by gauging the effects of a particular learning environment on students who manifest different levels of knowledge at the outset of the educational experience. A second valuable aspect of the MDL is the inclusion of interest as a motivational variable. Boekaerts (2002) suggests that motivational aspects may play a key role in NLEs.
Moreover, individuals' personal investment in a given topic or domain has been positively linked to the processes and products of learning (Alexander, Jetton, & Kulikowich, 1995; Hidi, 1990; Wade, Buxton, & Kelly, 1999). Thus, examining the interaction of not only cognitive aspects of learning (i.e., knowledge and strategies) but also motivational aspects (i.e., interest) may help clarify the mixed results for learning in NLEs. Further, the MDL expands the methodological features of the current literature, which relies heavily on static self-report, by examining changes in student learning over time through cognitive measures.

Finally, according to the MDL, the domains of exploration matter greatly, since learners' knowledge, interest, and strategic abilities differ from field to field. To capture the domain-specific aspect of NLEs, we nested the present study within the field of mechanical engineering, and more particularly engineering design. There were several reasons that engineering design was chosen as the focus for this study. For one, engineering design represents a novel venue for the consideration of NLEs. In addition, there are certain characteristics of engineering design that correspond nicely to the key features of NLEs, such as the focus on solutions to realistic engineering problems that reflect both creativity and practicality (Spector, 2006).

For our purposes, the specific course we chose for this investigation was a senior capstone course on design for mechanical engineering students. The capstone course was expressly devised in response to criticisms that graduates of engineering programs were not receiving the skills necessary to make the transition from academia to industry. In this course, engineering students work in teams to complete a design problem of their choice, while presumably acquiring a detailed understanding of the field of engineering and enhancing their individual interest in the domain and the professional roles associated with that domain. In many ways, the features and goals of this capstone design course correspond to the features and goals of NLEs in general, including relevant problems, collaborative learning, and enhanced student autonomy and choice.

In order to address the potential effects of participation in this unique context, where collaborative problem solving is a natural rather than contrived feature of the learning environment, we posed four research questions. First, we investigated whether there was a significant change from pretest to posttest in participants' domain-specific declarative, procedural, and principled knowledge, strategies, and personal interest. Based on the MDL, we hypothesized an increase in students' knowledge, strategies, and interest as a result of an effective instructional experience. Yet, our decision to measure personal interest over one semester by means of an activity-based approach (Schiefele & Csikszentmihalyi, 1995) made it less likely that a change in personal interest would be evidenced. Second, we sought to examine the magnitude of such changes. Given the hands-on, problem-based nature of the design curriculum, we felt that this instructional experience would translate into larger effects on students' procedural knowledge than on their declarative or principled knowledge about engineering design.
We also looked specifically at students' self-reported interest in roles associated with engineering design and perceived as central to team collaboration. Our decision to focus on this specific learner characteristic was informed by the research on engineering teams within undergraduate education. For instance, Schmidt (2006) expressed concern that the assignment of students to standard roles (e.g., prototype developer or computer-aided designer) could negatively affect students' interest in experiencing alternative roles (e.g., end product interviewer) and their knowledge gains. Consequently, we used the resulting data to answer two questions. First, does role interest remain stable from the beginning to the end of the course? Further, does role interest mediate participants' declarative, procedural, and principled knowledge, strategic processing, and their personal interest? It was expected that students—many of whom had prior team experience—would be differentially drawn to the standard roles associated with engineering design but that their level of interest would remain relatively stable over the semester. In addition, we hypothesized that role interests would not mediate effects for domain knowledge, strategic processing, or personal interest.

Method

Participants

Participants for this study were 110 undergraduate mechanical engineering students enrolled in a capstone engineering design course at a large university in the mid-Atlantic region. The capstone design course is required for graduation in mechanical engineering and is taken mainly by undergraduates in their last year of study. Of these participants, 70 completed both the pretest and posttest measures. The other 40 students, who took only the pretest or only the posttest, were therefore excluded from subsequent analyses. As is typical of engineering programs, the majority of students were male (n = 66; 94%) and Caucasian.

The Capstone Engineering Design Course

As noted, the capstone design course was structured around relevant problems and peer collaboration. There were two major components to the course: a lecture component that met twice weekly and a laboratory segment that occurred three times a week. The lecture component was the only time students were assembled as a whole class. Typically, each lecture period either introduced topics to the class that were relevant to the course (e.g., concept generation, axiomatic design, or patents) or served various administrative purposes (e.g., exams, peer evaluations, and assigning/collecting homework). Students attended lectures where topics were introduced 20 times during the semester.

More heavily weighted in this course were the procedural and hands-on facets of engineering design. Students attended lab sessions where they worked in teams approximately 36 times throughout the semester. They also worked together outside of class. The major product for the course was a team project undertaken by groups of about six students. This project was devised by the students, who continually worked toward its completion utilizing the product development process (PDP). The PDP expanded the problem-solving framework introduced in earlier courses and included the product development viewpoint (i.e., customer requirements and outcomes). At the conclusion of the course, teams created a prototype of their design based on the PDP, of which sketching is a major component, and presented that prototype as a final product.
This was a critical and intensive course for participating undergraduates. One of the defining features of the collaboration in this course was the specialized roles that students had to assume as part of the team problem-solving experience. Those roles involved CAD (computer-aided design) modeling, creating prototypes, interviewing end product users, technical writing, and mathematical analysis (Schmidt, 2006).

Measures

In order to explore the relations between knowledge, interest, and strategies in the domain of mechanical engineering, we developed measures using the lens of the MDL. Specifically, we developed six domain-specific measures: declarative knowledge, principled knowledge, procedural knowledge, strategic processing, personal interest, and role interest. Together these six measures formed the Engineering Design Instrument (EDI). Neither final course grade nor GPA was available as an outcome measure for participants. However, we did not regard this as a serious limitation, because our research questions investigated change over time relative to knowledge, strategies, and interest, which were measured through the EDI.

Declarative knowledge. The declarative knowledge measure consisted of six engineering design terms (e.g., "House of Quality" or "brainstorming"). Those terms were drawn from the mechanical engineering design curriculum by the course instructor, who is an expert in this domain. The terms were judged not only to be core concepts in this field but also to differ in familiarity. That is, it was assumed that students entering the capstone course would have had some exposure to several of the terms but would not be highly knowledgeable about any. Participants were directed to define the six terms and explain their importance in engineering design. The instructor provided a scoring template that included prototypical descriptions for each term. For instance, brainstorming was described by the instructor as "a group creativity technique designed to generate a large number of ideas for the solution to a problem." Items were scored by the first and third authors on a scale of zero to two. An incorrect or blank response received a score of zero, a response including an appropriate description or an explanation of importance received a score of one, and a response that included both an appropriate description and an explanation of importance was scored as a two. Responses that paraphrased the provided description were considered to be correct. Using the scoring scale, a description of "generating plausible solutions" for brainstorming received a score of one, because it addressed the idea generation aspect of brainstorming but did not address its importance to engineering design. A description of "deep thinking" was scored as a zero, because it neither included an appropriate description of brainstorming nor addressed the role of brainstorming in the engineering design process. Interrater agreement for the measure was over 88%, with differences resolved through discussion. Although the reliability for the declarative knowledge measure at pretest was low (α = .57), we felt that this was reflective of the relative unfamiliarity of certain terms. Further, we felt that this reliability was sufficient for our experimental purpose and was compensated for by the strong content validity.
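For readers unfamiliar with the reliability index reported above, the following is a minimal sketch of how Cronbach's alpha could be computed for an item-score matrix such as the six-term declarative knowledge measure. The data here are simulated stand-ins for 70 students' 0-2 item scores, not the study's data, and the function is our illustration rather than the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

# Hypothetical scores: 70 students x 6 terms, each scored 0, 1, or 2,
# generated from a common "ability" so the items correlate.
rng = np.random.default_rng(0)
ability = rng.normal(size=(70, 1))
scores = np.clip(np.round(1 + ability + rng.normal(scale=0.7, size=(70, 6))), 0, 2)
print(cronbach_alpha(scores))
```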
Principled knowledge. Principled knowledge was assessed with an open-ended item for which students were directed to list domain principles that define the field of engineering design. A sample principle from the field of physics was provided (e.g., "Energy is neither created nor destroyed."). For scoring purposes, the instructor provided the authors with six engineering design principles that were directly related to the goals and experiences of the capstone course (e.g., "Function is not equal to form" or "Quality cannot be inspected into a product."). In addition, it was decided that any statement of principle not on the instructor-provided list but considered appropriate to the field of mechanical engineering in general or engineering design in particular would be added to the scoring template. Two such statements appeared with some regularity in the data from participants (i.e., "Design is an iterative process" and "Satisfy customer needs") and were thus added to the template. Student responses were scored by the first and third authors as approximations or nonapproximations of the eight listed principles. Each suitable paraphrase received a score of one (maximum = 8). For instance, the response "Form follows function" was coded as a suitable approximation of the principle "Function is not equal to form" and received a score of one. Interrater agreement for the principled knowledge measure was over 89%.

Procedural knowledge. Procedural knowledge was measured through a diagram of the engineering design process. Participants were directed to create a visual representation of the sequence or steps in the engineering design process that was key to the capstone course. Responses were scored in comparison to an instructor-provided template in which the eight essential steps of the PDP were represented (Dieter & Schmidt, 2008). Diagrams were coded on a scale of zero to three. A score of zero indicated that no answer or no correct steps were provided, one represented one to three correct steps, two indicated four to seven correct steps, and a score of three was awarded for a complete, eight-step process. The first and third authors scored the diagrams with an interrater agreement of above 88%. Disagreements in coding were resolved through discussion.
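As a compact restatement of the rubric just described, the following sketch maps a count of correctly represented PDP steps onto the 0-3 score. The function name is ours; the thresholds come directly from the text.

```python
def score_pdp_diagram(correct_steps: int) -> int:
    """Hypothetical scorer mirroring the 0-3 rubric for diagrams of the
    eight-step product development process (PDP)."""
    if correct_steps <= 0:
        return 0  # no answer or no correct steps
    if correct_steps <= 3:
        return 1  # one to three correct steps
    if correct_steps <= 7:
        return 2  # four to seven correct steps
    return 3      # complete, eight-step process
```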
Strategic processing. We attempted to measure strategic processing through a student-generated sketch. Participants were directed to provide a preliminary sketch for a design concept of an all-purpose athletic bag. Based on sketches produced during piloting, we expected to see evidence of strategic processing, such as sketch revision, written questions, or notes describing changes made to the design. However, the participants' sketches on the pretest and posttest revealed no evidence of strategic processing. Therefore, we excluded strategic processing from any subsequent analyses.

Personal and role interest. Two types of interest were considered in this study: personal interest (i.e., enduring interest in the domain of mechanical engineering or engineering design) and role interest (i.e., interest in the specialized roles that students must assume as part of the team problem-solving experience).

Personal interest was measured with five Likert-type items that directed participants to indicate the frequency of their participation in domain-specific activities within the last three years (e.g., "Attended conferences, seminars, or workshops related to engineering design" or "Engaged in community-based projects related to engineering design."). Participants responded by marking the box that best described their level of participation on a five-point scale ranging from "very rarely" to "very often." Participants also had the opportunity to indicate if they were unsure of the meaning of the question. Scores from the five personal interest items were summed to create a total personal interest score. Cronbach's alpha for this measure was acceptable (α = .84). A sample item follows:

Volunteered for project competition courses related to engineering design (e.g., solar decathlon)?
[Very rarely] [Rarely] [Sometimes] [Often] [Very often] [Unsure of Question's Meaning]

A similar procedure was used to measure role interest. Participants were directed to rate their level of interest in engineering design activities (e.g., "CAD modeling" or "creating prototypes") on five Likert-type items. Again, participants responded by marking the appropriate box on a five-point scale ranging from "not interested at all" to "very interested." Participants were directed to mark the box labeled "Unsure of Question's Meaning" if they were unfamiliar with the designated role. The Cronbach's alpha for this scale was lower than that of personal interest (α = .69), but we did not consider this to be a serious concern, because students are often interested in multiple roles within the mechanical engineering domain (Schmidt, 2006). A sample item follows:

CAD modeling
[Not interested at all] [Not interested] [Neutral] [Interested] [Very interested] [Unsure of Question's Meaning]

Procedure

The participants in this study took the EDI twice during the semester. The class instructor administered the pretest during the second lecture class of the semester, which occurred during the second week of classes. The EDI was again administered by the instructor at posttest during the last lecture class of the semester, 15 weeks later. All students completed the EDI in less than one hour at both pretest and posttest.

Results and Discussion

Our purpose in this investigation was to examine the effects of participating in an engineering design course involving collaboration and relevant problems on students' knowledge and interest in the domain of mechanical engineering. Descriptive statistics for all measures involved in the analyses are included in Table 1, and Table 2 displays the correlations for key variables. These data meet the assumptions for repeated measures ANOVA. Although not all of the distributions were perfectly normal, ANOVA is generally robust to violations of normality. In addition, the assumption of sphericity was not relevant to these data, since the repeated measures ANOVA encompassed only two time points (i.e., pretest and posttest).

Change Over Time

Two of the questions we explored considered change over time. The first of these generally addressed the growth in students' knowledge and interest as a consequence of their participation in the engineering design course, whereas the second considered the magnitude of those changes relative to the instructional experiences central to the course.
To address the first question, we ran four repeated measures analyses of variance on declarative, procedural, and principled knowledge and personal interest, with time as the repeated measure. Since a multivariate analysis of variance (MANOVA) design would not allow us to tease apart which key variables were significant in the repeated measures, we chose to run separate ANOVAs on the four variables as they were identified in our a priori hypotheses (Hancock, Lawrence, & Nevitt, 2000). With regard to increases in declarative knowledge, the repeated measures ANOVA was significant from pretest (M = 4.37, SD = 1.13) to posttest (M = 5.33, SD = 1.66), F(1, 69) = 22.71, p < .001, Cohen's d = .58. The repeated measures ANOVA on procedural knowledge revealed no significant differences from pretest (M = 0.63, SD = 0.52) to posttest (M = 0.73, SD = 0.56), F(1, 69) = 2.38, p > .05. For principled knowledge, significant gains were not detected from pretest (M = 0.26, SD = 0.50) to posttest (M = 0.36, SD = 0.54), F(1, 69) = 1.33, p > .05. Finally, the repeated measures analysis for personal interest demonstrated no significant change from pretest (M = 11.06, SD = 4.01) to posttest (M = 11.44, SD = 3.86), F(1, 69) = 1.22, p > .05.

As hypothesized by the MDL, there was evidence that participation in the design course contributed to gains in students' declarative knowledge in the domain. However, the hypothesized effects for other forms of knowledge (i.e., principled knowledge and procedural knowledge) did not materialize. It was surprising that we did not find a significant change in procedural knowledge from pretest to posttest. One plausible explanation for this outcome may relate to students' prior exposure to the PDP, which was evident when we scored the pretest measure of procedural knowledge. It is possible that participants had sufficient procedural knowledge with which to complete the team project, so the need to integrate more procedural knowledge was not present. In addition, in line with our hypothesis that personal interest would not change due to the relatively brief elapsed time between pretest and posttest, we found no detectable change in personal interest as measured by domain-related activities.

One possible explanation for the gain in declarative knowledge is the nature of the team-based project. The knowledge gain could in fact be due to the different facets of knowledge that each team member brought to the project. In addition, the modeling of peers has been hypothesized to be an effective learning mechanism for older students and adults (Bandura, 1986). Although we found declarative knowledge gains in this case, these findings may be reflective of the characteristics and interaction of the team members, as Beers et al. (2005) suggested. Further, the differential pattern of gains in these data from pretest to posttest extends the findings of Dochy et al.'s (2003) meta-analysis: in this study, we pulled apart different kinds of knowledge (i.e., declarative, procedural, and principled) to test specific gains in each. In addition, we were able to extend the literature on NLEs beyond strictly cognitive variables.

Clearly, findings relevant to our initial question relate, as well, to our second question, which addressed the magnitude of instructional effects. In this case, the effect size for declarative knowledge was moderate (Cohen, 1988).
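Because each repeated measures ANOVA here involves only two time points, it is statistically equivalent to a paired t-test (F = t²). The sketch below, run on hypothetical score vectors rather than the study's data, shows one way such a contrast and an accompanying Cohen's d could be computed; the paper does not state which standardizer produced the reported d = .58, so the pooled-SD variant shown is our assumption.

```python
import numpy as np
from scipy import stats

def paired_change(pre, post):
    """Two-occasion repeated measures contrast: with only pretest and
    posttest, the RM-ANOVA F equals the squared paired t statistic."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    t, p = stats.ttest_rel(post, pre)
    # Cohen's d standardized by the pooled SD of the two occasions;
    # this choice of standardizer is our assumption, not the paper's.
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    d = (post.mean() - pre.mean()) / pooled_sd
    return t ** 2, p, d  # F(1, n-1), p value, effect size

# Hypothetical declarative knowledge scores for 70 students.
rng = np.random.default_rng(0)
pre = rng.normal(4.4, 1.1, size=70)
post = pre + rng.normal(1.0, 1.2, size=70)
print(paired_change(pre, post))
```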
We would caution, however, that these effects may vary from sample to sample based on the characteristics and interactions of the team members. With regard to personal interest, we were particularly interested in activities that may be amenable to change in the short term. Although we could not conduct individual comparisons of items, since the omnibus ANOVA was not significant, we were interested descriptively in which activities in a course such as this may build personal interest. The two activities that showed the greatest gains were attending workshops and engaging in community-based projects.

Role Interest

Two aspects of role interest were examined in this study. The first was the relative stability of the role interests over time. The second was the variability in the participants' declarative knowledge, procedural knowledge, and personal interest associated with role interest. (We did not test the effects on principled knowledge because of the small amount of variability in this measure.)

Stability of role interest. In order to examine role stability, we ran a principal components analysis on the items at pretest and at posttest. We then subjected those data to Varimax rotation to obtain simple structure for ease of interpretation. Table 3 contains the rotated factor loadings by item and time. At pretest, two components had eigenvalues greater than one and accounted for 57.06% of the variance in the original items. Specifically, two roles (i.e., CAD modeling and creating prototypes) loaded highly on the first component, while mathematical analysis loaded moderately well on this first component. Both interviewing end users and technical writing loaded highly on the second component. The first component appears to represent a more physical, hands-on role within the teams. The second of these components seems to reflect a more communicative role by the participants; that is, roles that presumably require these engineering students to share ideas orally or in writing.

At posttest, two components again had eigenvalues greater than one and accounted for 62.52% of the variance in the original items. The only loading that differed from the pretest findings was that of mathematical analysis. At posttest, this role loaded moderately on both the hands-on component and the communicative component. It could be that the shift in this particular role from pretest to posttest indicates an expanded awareness of the demands of mathematical analysis in the completion of challenging engineering projects. Yet, given the moderate loadings of this role at both testing points, it could also be argued that the specific character of this team duty remains difficult for students to ascertain.

Although these roles were relatively stable over time in the principal components analysis, an individual participant's role interest had the potential to change throughout the course. In order to examine participants' role interest over time, we subjected their factor scores on each component at pretest and posttest to a regression analysis. The beta weights for both the hands-on component [t(67) = 1.10, p > .05] and the communicative component [t(67) = -0.08, p > .05] were nonsignificant, indicating that participants' interest in these roles at pretest was not predictive of their interest at posttest.
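For readers who want to see the extraction-and-rotation step used throughout this subsection in concrete form, the following sketch runs a two-component principal components analysis followed by a varimax rotation on a hypothetical five-item response matrix. The data and variable names are ours; the study's actual loadings are those reported in Table 3.

```python
import numpy as np
from sklearn.decomposition import PCA

def varimax(loadings, gamma=1.0, max_iter=50, tol=1e-6):
    """Varimax rotation of an (items x components) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_sum = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Standard varimax update via SVD of the criterion gradient.
        grad = loadings.T @ (
            rotated ** 3 - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0))
        )
        u, s, vt = np.linalg.svd(grad)
        rotation = u @ vt
        new_var_sum = s.sum()
        if new_var_sum < var_sum * (1 + tol):
            break
        var_sum = new_var_sum
    return loadings @ rotation

# Hypothetical role-interest responses: 70 students x 5 roles, rated 1-5.
rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(70, 5)).astype(float)

pca = PCA(n_components=2).fit(responses)
# Convert eigenvectors to loadings, as in factor-analytic reporting.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
print(varimax(loadings))
```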
We can forward two possible explanations for these findings. First, they may be indicative of the team configuration of the course. Roles that students were interested in when they entered the course may not have been the roles they assumed, willingly or otherwise. For example, in the teams of six, it would be possible to have a majority of students initially interested in the CAD modeling aspect of the design process. Yet, the nature of the projects would permit only one of those students to actually assume that role. Still, the division of labor within teams would mean that some students would be required to take on less popular roles, such as technical writer or mathematical analyst. Experiencing a role that was previously uninteresting may increase interest in that particular role. Conversely, it is also possible that exposure to a new role decreased or merely reinforced relative interest in that particular role. The second explanation may be that the course (through its emphasis on the PDP) focused more on the communicative roles than participants' previous coursework had, particularly in its focus on interviewing end users. Descriptive evidence for this appears in the principled knowledge measure: although the principled knowledge of these participants was relatively low at both pretest and posttest, principles dealing with customer requirements were prevalent. Unfortunately, further statistical inference on the individual role interest items was not possible, since there was no significant omnibus test.

Interaction of role interest with key variables. Again, in line with the implications of Dochy et al.'s (2003) meta-analysis, we hypothesized that role interest might explain a significant portion of the variability among students' levels of declarative knowledge, procedural knowledge, and personal interest. In order to investigate these relations, we regressed these key variables at posttest on the role interest component scores at posttest. For knowledge, the communicative role component was a positive predictor of posttest declarative knowledge [β = 0.38, t(67) = 3.92, p < .01], while the hands-on component was a nonsignificant predictor [t(67) = -0.18, p > .05]. By comparison, neither the hands-on component [t(67) = -0.15, p > .05] nor the communicative component [t(67) = 1.24, p > .05] was a significant predictor of posttest procedural knowledge. However, both the hands-on component [β = 0.86, t(67) = 68.96, p < .001] and the communicative component [β = 0.50, t(67) = 39.65, p < .001] were significant predictors of personal interest at posttest.
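A minimal version of this regression step, with simulated stand-ins for the posttest component scores and the declarative knowledge outcome (all names and values are ours, not the study's), might look like the following.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical posttest data for 70 students: two varimax component
# scores and a declarative knowledge total.
rng = np.random.default_rng(2)
hands_on = rng.normal(size=70)
communicative = rng.normal(size=70)
dk_post = 5.3 + 0.6 * communicative + rng.normal(scale=1.4, size=70)

# Regress the posttest outcome on the two component scores.
predictors = sm.add_constant(np.column_stack([hands_on, communicative]))
fit = sm.OLS(dk_post, predictors).fit()
print(fit.params)    # intercept and unstandardized slopes
print(fit.tvalues)   # t statistics, df = n - 3 = 67
print(fit.pvalues)
```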
These findings again extend the implications of the Dochy et al. (2003) meta-analysis. In essence, gains in knowledge may interact not only with prior levels of cognitive characteristics but also with levels of role interest in these learning environments. Specifically, we can see that both the hands-on and communicative roles were positively related to personal interest, although the score on the hands-on component was much more highly predictive of personal interest than the communicative component. The opposite was true with domain procedures, where neither component was predictive. Interestingly, high scores on the communicative component significantly predicted posttest declarative knowledge, while scores on the hands-on component were nonsignificant. This is a particularly salient finding, since it suggests that role interest may help explain which students profited most from the declarative knowledge gains observed over the semester. It may indicate that giving students opportunities to experience varied roles helps them acquire domain knowledge within these collaborative, problem-based learning environments.

Conclusions and Implications

Understandably, there is a determined search within education to find more effective ways to structure the learning experience for students so as to stimulate their minds and enhance their motivation to learn. New learning environments, or NLEs, are one promising path toward effective learning that combines the technological innovations of our time with more constructivist theoretical views of human learning. Among the features of these environments are problem-based and more student-directed educational experiences that incorporate collaborative engagement around tasks that are not only perceived as challenging but are also relevant to students and to the domain under exploration. To date, however, the evidence that these NLEs fulfill their promise has been mixed (Dochy et al., 2003). One explanation posed for such mixed results is that the features of these learning contexts may prove differentially effective for learners who enter that environment with varied goals, interests, and background knowledge. In this study, our intention was to examine this differential hypothesis within the framework of the MDL (Alexander, 1997, 2003).

What the MDL afforded us as an explanatory framework was twofold. First, it allowed us to target variables that have been shown to be critical in individuals' development within academic domains. For that reason, we set out to consider the knowledge, interests, and strategic processing of our students as they entered and exited a particular learning environment. Second, one of the key premises of the MDL is that the interplay of these variables is essential, especially for those who are seeking to gain a foothold in competence within a given field of study. Thus, if a learning environment manifesting features of NLEs does not concurrently support the development of learners' knowledge, interests, and strategies, then its apparent effectiveness may be called into question.

We were also in a unique position in this investigation to test our hypotheses within an existing educational context that was particularly informative. Not only were challenging and professionally-relevant problems a centerpiece of the capstone design course, but these mechanical engineering students came up with the projects they wanted to tackle and worked in teams throughout the semester to complete those projects. In effect, as researchers, we did not have to impose the environmental features of interest on an instructional context; those features were natural components of the capstone learning environment. Moreover, it is important to reiterate that the undergraduates who participated in our study were senior mechanical engineering students who were generally committed to this domain and who had prior project-based team experiences earlier in their program. Consequently, we could expect some level of domain interest and familiarity with collaborative activities that might not be assumed within all NLE research.

Of course, this study was not without its limitations, in that we were not able to link our variables to project grades or overall academic performance. Further, we found that our approach to documenting strategic processing did not serve us well.
Consequently, we were not able to consider the students' strategic processing within our analysis. Even in light of those limitations, however, we discerned several intriguing patterns in student learning with implications for research on NLEs, as well as for instructional practice in classrooms where the features of NLEs are evident.

First, our findings echo the concerns of Dochy et al. (2003) regarding the effects of collaborative, problem-based environments on student knowledge. Moreover, our study extends that meta-analysis in that we were in the position to disentangle the construct of knowledge by considering three forms in this analysis: declarative, procedural, and principled. In fact, we determined that the declarative knowledge of students, represented in this study as knowledge of key concepts, developed through this experience. This was the good news. However, the same could not be said for students' procedural or principled knowledge. Specifically, even when students enter a learning environment with some level of background knowledge and interest in the domain, it cannot be assumed that their engagement in a relevant project will necessarily translate into the acquisition of fundamental principles deemed critical to the domain. It was evident from our reading of the course syllabus and, more importantly, our interactions with the course instructor that students were expected to acquire certain core principles about engineering design through their collaborative, project-based activities, supplemented with some instructor-led discussions. Yet, there was little evidence that students either understood that intent or were able to extract the target principles from the learning experience.

What was even more surprising to us, however, was the nonsignificant change in students' procedural knowledge, especially given the hands-on, problem-based nature of this course. We acknowledge that our focus in analysis was rather narrowly centered on a particular procedure, albeit one that was the major thrust of this course. Therefore, we cannot rule out that students' broader procedural knowledge related to mechanical engineering may have been positively influenced. Nonetheless, our findings do suggest that the acquisition of such knowledge cannot be taken for granted, even in a domain that is heavily procedural or in a course where a select procedure (i.e., the PDP) is a focal point of instruction.

But what of students' interest in engineering design or the roles that are commonly associated with teamwork in that domain? Did this capstone course contribute to such interests or not? This question is especially pertinent to the MDL, since personal interest has been hypothesized to be a driving force compelling individuals toward higher competence and, potentially, toward expertise in a domain. What we saw for participants in the capstone course was that their personal interest in engineering design, as indicated by their engagement in relevant domain activities in and out of school, did not manifest much change over the semester. We acknowledge that this activity-oriented approach to gauging personal interest is a more conservative measure, but we would argue that it is more informative than a simple self-report of one's interest, and we felt that there were sufficient opportunities for students to participate in domain-related activities over this four-month period.
Another objective of the capstone design course and its format was to increase students' awareness of and interest in various roles that are common to the domain of mechanical engineering. Yet, as in the case of personal interest, we did not witness the magnitude of change that would be hypothesized for this very hands-on and student-directed experience. We did document some shifts in students' interest over the semester, but by and large the strong interests in the more physical or hands-on roles in mechanical engineering that students had at the outset (i.e., computer-aided design and prototype development) were the same roles that interested them at the conclusion. In work now underway, we have had some opportunity to meet with engineering students to discuss their team experiences. What these focus-group discussions highlight is that students sometimes express frustration with the roles they assume or are given within teams and their inability to experience other roles during the semester. Perhaps such frustration helps to explain the relative stability in students' role interest over time.

We appreciate the exploratory nature of this look at the relation between the where of new learning environments and the what and the who of learning. There is unquestionably much more work that needs to be undertaken in the future. For one, we plan to revise the current Engineering Design Instrument and seek alternative ways to document students' strategic processing. We also will make adjustments in the metric by which we document students' personal and role interests over time. In conjunction with the revision of the instrument, we also intend to utilize more powerful analyses to tease apart these complex relations, such as structural equation modeling. For another, we would like to explore our hypotheses regarding the nature of learning environments and their interaction with learner characteristics and the content of instruction within other domains and with individuals at varying points in their academic development. How would the pattern of relations differ, for instance, when the content is more abstract and less physical and when the domain is less oriented toward collaborative, team-based projects? Whatever direction we take, it is evident to us that the effectiveness of learning environments, regardless of how innovative and engaging they appear, cannot be established without due consideration of the content under study and the students who are also part and parcel of those environments.

References

Alexander, P. A. (1997). Mapping the multidimensional nature of domain learning: The interplay of cognitive, motivational, and strategic forces. In M. L. Maehr & P. R. Pintrich (Eds.), Advances in motivation and achievement (Vol. 10, pp. 213-250). Greenwich, CT: JAI Press.

Alexander, P. A. (2003). The development of expertise: The journey from acclimation to proficiency. Educational Researcher, 32, 10-14.

Alexander, P. A., Jetton, T. L., & Kulikowich, J. M. (1995). Interrelationship of knowledge, interest, and recall: Assessing a model of domain learning. Journal of Educational Psychology, 87, 559-575.

Alexander, P. A., Schallert, D. L., & Reynolds, R. E. (2008, April). What is learning anyway? A topographical perspective considered. Paper presented at the annual meeting of the American Educational Research Association, New York.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.
Beers, P. J., Boshuizen, H. P. A., Kirschner, P. A., & Gijselaers, W. H. (2005). Computer support for knowledge construction in collaborative learning environments. Computers in Human Behavior, 21, 623-643.

Boekaerts, M. (2002). Bringing about change in the classroom: Strengths and weaknesses of the self-regulated learning approach–EARLI Presidential Address, 2001. Learning and Instruction, 12, 589-604.

Brown, J. S. (2005). New learning environments for the 21st century. Paper presented at the Forum for the Future of Higher Education Aspen Symposium. Retrieved March 1, 2008, from http://www.johnseelybrown.com/newlearning.pdf

Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Dieter, G. E., & Schmidt, L. C. (2008). Engineering design (4th ed.). Columbus, OH: McGraw-Hill.

Dochy, F., Segers, M., Van den Bossche, P., & Gijbels, D. (2003). Effects of problem-based learning: A meta-analysis. Learning and Instruction, 13, 533-568.

Gijbels, D., Van de Watering, G., Dochy, F., & Van den Bossche, P. (2006). New learning environments and constructivism: The students' perspective. Instructional Science, 34, 213-226.

Hancock, G. R., Lawrence, F. R., & Nevitt, J. (2000). Type I error and power of latent mean methods and MANOVA in factorially invariant and noninvariant latent variable systems. Structural Equation Modeling, 7, 534-556.

Hardy, I., Jonen, A., Möller, K., & Stern, E. (2006). Effects of instructional support within constructivist learning environments for elementary school students' understanding of "floating and sinking." Journal of Educational Psychology, 98, 307-326.

Hidi, S. (1990). Interest and its contribution as a mental resource for learning. Review of Educational Research, 60, 549-571.

Jenkins, J. J. (1974). Remember that old theory of memory? Well, forget it. American Psychologist, 29, 785-795.

Land, S. M., & Hannafin, M. J. (1996). Student-centered learning environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 1-23). Mahwah, NJ: Lawrence Erlbaum Associates.

Lea, M. R. (2005). 'Communities of practice' in higher education: Useful heuristic or educational model? In D. Barton & K. Tusting (Eds.), Beyond communities of practice: Language, power, and social context (pp. 180-197). New York, NY: Cambridge University Press.

Nijhuis, J. F. H., Segers, M. S. R., & Gijselaers, W. H. (2005). Influence of redesigning a learning environment on student perceptions and learning strategies. Learning Environments Research, 8, 67-93.

Nolen, S. B. (2003). Learning environment, motivation, and achievement in high school science. Journal of Research in Science Teaching, 40, 347-368.

Roth, W.-M. (1996). Experimenting in a constructivist high school physics laboratory. Journal of Research in Science Teaching, 31, 197-223.

Schiefele, U., & Csikszentmihalyi, M. (1995). Motivation and ability as factors in mathematics experience and achievement. Journal for Research in Mathematics Education, 26, 163-181.

Schmidt, L. C. (2006). Engineering teams: Individual or group sport? International Journal of Engineering Education, 22, 659-664.

Spector, J. M. (2006). A methodology for assessing learning in complex and ill-structured domains. Innovations in Education and Teaching International, 43, 109-120.

Vermetten, Y. J., Vermunt, J. D., & Lodewijks, H. G. (2002). Powerful learning environments? How university students differ in their response to instructional measures. Learning and Instruction, 12, 263-284.
Wade, S. E., Buxton, W. M., & Kelly, M. (1999). Using think-alouds to examine reader-text interest. Reading Research Quarterly, 34, 194-216.

Wilson, K., & Fowler, J. (2005). Assessing the impact of learning environments on students' approaches to learning: Comparing conventional and action learning designs. Assessment & Evaluation in Higher Education, 30, 87-101.

Table 1
Descriptive Statistics of Knowledge and Interest at Pretest and Posttest

                               Pretest                        Posttest
                        Min.   Max.   Mean (SD)       Min.   Max.   Mean (SD)
Declarative Knowledge   0.00   7.00   4.37 (1.13)     2.00  10.00   5.33 (1.66)
Procedural Knowledge    0.00   2.00   0.63 (0.52)     0.00   2.00   0.73 (0.56)
Principled Knowledge    0.00   2.00   0.26 (0.50)     0.00   2.00   0.36 (0.54)
Personal Interest       5.00  21.00  11.06 (4.01)     5.00  20.00  11.44 (3.86)

Table 2
Intercorrelations Between Knowledge and Interest at Pretest and Posttest

            1      2      3      4      5      6      7      8
1. DK1      —
2. PdK1    .04     —
3. PpK1    .14   -.02     —
4. PI1     .15    .14    .04     —
5. DK2     .32**  .28*   .04    .14     —
6. PdK2    .18    .50** -.01    .15    .38**   —
7. PpK2    .11    .12    .03    .21   -.10    .18     —
8. PI2     .06    .28*   .08    .57**  .17    .26*   .01     —

Note. DK1 = Pretest Declarative Knowledge; PdK1 = Pretest Procedural Knowledge; PpK1 = Pretest Principled Knowledge; PI1 = Pretest Personal Interest; DK2 = Posttest Declarative Knowledge; PdK2 = Posttest Procedural Knowledge; PpK2 = Posttest Principled Knowledge; PI2 = Posttest Personal Interest.
*p < .05. **p < .01.

Table 3
Rotated Factor Loadings of Role Interest Components at Pretest and Posttest

                                 Pretest                     Posttest
                         Hands-on  Communicative     Hands-on  Communicative
CAD modeling               .75        -.03             .83        -.14
Creating prototypes        .74         .45             .73         .16
Interviewing end users     .15         .80            -.13         .78
Technical writing         -.28         .72             .05         .84
Mathematical analysis      .47        -.03             .41         .60