
SIGNALS AND SYSTEMS ASSESSMENT: COMPARISON OF RESPONSES TO MULTIPLE CHOICE CONCEPTUAL QUESTIONS AND OPEN-ENDED FINAL EXAM PROBLEMS

Kathleen E. Wage, George Mason Univ., ECE Department, Fairfax, VA 22030 USA, k.e.wage@ieee.org
John R. Buck, Univ. of Massachusetts Dartmouth, ECE Department, N. Dartmouth, MA 02747 USA, johnbuck@ieee.org
Margret A. Hjalmarson, George Mason Univ., Graduate School of Education, Fairfax, VA 22030 USA, mhjalmar@gmu.edu
Jill K. Nelson, George Mason Univ., ECE Department, Fairfax, VA 22030 USA, jnelson@gmu.edu

ABSTRACT

The validity of the Signals and Systems Concept Inventory (SSCI) was evaluated by comparing students' performance on the SSCI to open-ended final exam problems. An assessment instrument is said to be valid to the extent that it measures what it was designed to measure. The SSCI was designed to measure students' understanding of core concepts in undergraduate signals and systems (S&S) courses through 25 multiple-choice questions. The SSCI scores and final exam scores for more than 150 students in four sections of S&S at two schools were found to have a statistically significant correlation. A more detailed analysis was conducted with a pool of over 60 students at both schools. This second analysis compared detailed coding of students' responses on the final exam problems to their answers for specific SSCI questions assessing the same topic. This analysis found statistically significant correlations between SSCI questions and final exam problems for some convolution and Fourier transform problems. Results were mixed for the problem on Bode plots.

Index Terms: Assessment, signals and systems, concept inventory

1. INTRODUCTION

Engineering problem solving requires both conceptual and procedural knowledge. For example, signals and systems students must understand both the fundamental concept of convolution and the procedure for computing the convolution integral. Student understanding of convolution and other important topics can be assessed in various ways. One way to test conceptual understanding is to administer a concept inventory, which is a standardized multiple-choice exam. A concept inventory (CI) is designed so that the incorrect choices (distractors) represent common misconceptions. Ideally, CIs emphasize conceptual understanding rather than rote calculation. Procedural knowledge, such as how to compute a convolution integral, is often measured using problem-solving exercises. Analysis of student responses to open-ended problems reveals whether they understand how to implement the convolution operation.

This study compares student responses to a concept inventory with their responses to open-ended final examination questions for a signals and systems course. A key motivation for this work is the need to validate the concept inventory. An exam is said to be valid if it measures what it was intended to measure [1]. There are a number of different aspects of validity; see the article by Moskal et al. for a summary [2]. This study investigates the content validity of
[Footnote: Work funded by NSF Grants DUE-0512430 and DUE-0512636.]

the questions by examining whether student responses to the inventory accurately reflect their understanding of the underlying concept. It also looks at criterion-related evidence for validity by correlating the inventory scores with other measures, such as final exam scores. Steif et al.'s analysis of the Statics Concept Inventory is an example of the type of validation study required [1, 3].

The focus of this study is the Signals and Systems Concept Inventory (SSCI) [4]. The SSCI is a 25-question exam designed to test knowledge of the core concepts in the undergraduate signals and systems curriculum taught in electrical and computer engineering. Development of the SSCI began in 2001, and as of 2010, 30 instructors have given the SSCI to more than 2600 students. The project website (http://signals-and-systems.org) provides a complete list of SSCI-related publications. Instructors can request a password to access the latest copies of the inventory.

This paper is a follow-on to an initial study presented at the Frontiers in Education Conference in 2007 [5]. The 2007 study compared the SSCI and final examination results for a single class of students at one university. The present study analyzes a larger data set obtained from four classes at two universities.

The rest of the paper is organized as follows. Section 2 describes the data set and provides relevant details about the courses where data were collected. Section 3 examines the correlation between students' SSCI scores and their scores on the final examination. Following that, Section 4 compares student responses to three open-ended final exam problems with their responses to related questions on the SSCI. Section 5 summarizes the paper.

2. DATA SET

This study focuses on data from four undergraduate signals and systems classes taught at George Mason University (GMU) and the University of Massachusetts Dartmouth (UMD) between 2006 and 2009. Table 1 summarizes the course information, number of students, instructor, and textbook for each of the classes in the data set. Three of the classes were sections of ECE 220, taught by the first author at GMU in three different semesters. The remaining class was ECE 321, taught by the second author at UMD. Both ECE 220 and ECE 321 focus on continuous-time linear signals and systems. ECE 220 is open to both sophomores and juniors, and is often taken concurrently with courses in circuits and differential equations. ECE 321 is taken primarily by second-term juniors who have already taken a discrete-time signals and systems course, two semesters of circuits courses, and a differential equations course. Both ECE 220 and ECE 321 were scheduled for two 75-minute lectures per week. ECE 220 also had one 50-minute recitation and one 100-minute laboratory session each week. The laboratory





Table 1. Summary of classes in the data set.
Class   Course name   N    Instructor   Textbook
1       GMU ECE 220   40   Wage         [8]
2       GMU ECE 220   39   Wage         [8]
3       GMU ECE 220   49   Wage         [9]
4       UMD ECE 321   26   Buck         [9]


Table 2. SSCI results for 4 classes.
Class   SSCI version   Pre-test mean (std)   Post-test mean (std)   Gain
1       v3.2           39.3 (11.0)           67.4 (12.9)            0.46
2       v3.2           41.5 (9.5)            67.4 (11.9)            0.44
3       v4.11          45.3 (10.6)           74.0 (11.7)            0.53
4       v4.11          53.3 (9.0)            75.4 (10.2)            0.47
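The Gain column is not defined in this excerpt; the values are consistent with the normalized gain commonly reported for concept inventories, so that interpretation is our reading rather than a statement from the paper. With class mean scores in percent:

```latex
g = \frac{\langle\mathrm{post}\rangle - \langle\mathrm{pre}\rangle}{100 - \langle\mathrm{pre}\rangle},
\qquad \text{e.g., Class 1: } \frac{67.4 - 39.3}{100 - 39.3} \approx 0.46 .
```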


Fig. 1. Comparison of post-test difficulty indexes (percent correct) for the four classes. Plot shows results for the 20 CT-SSCI questions shared by versions 3 and 4. The question numbers refer to version 3.

Table 3. Correlation: CT-SSCI vs. final exam
Class   Correlation   Significance
1       0.637         0.000
2       0.427         0.007
3       0.610         0.000
4       0.760         0.000
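The ANOVA comparison described in the following paragraph can be sketched as a one-way ANOVA across the four classes' common-question subtest scores. The arrays below are hypothetical stand-ins, since the study's per-student data are not published:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-student subtest scores (percent) for Classes 1-4;
# sample sizes match Table 1, the means are placeholders.
classes = [rng.normal(mu, 12.0, size=n) for mu, n in
           [(67.0, 40), (67.0, 39), (70.0, 49), (71.0, 26)]]

f_stat, p = stats.f_oneway(*classes)
# A large p-value (p > 0.05) fails to reject equal class means,
# supporting the pooling of data across semesters and schools.
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```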

assignments consisted of Matlab exercises. ECE 321 does not have a separate laboratory session, but students worked on Matlab projects in small groups outside of class. Both ECE 220 and ECE 321 were taught using active and collaborative learning (ACL) methods, such as those described in the article by Buck and Wage [6]. Specifically, class periods consisted of short lecture segments interspersed with in-class problems that students worked in small groups. See the ECE 220 course webpages on Kathleen Wage's website [7] for more information on how ACL methods were implemented and for sample in-class problems.

The SSCI was administered as a pre-test and post-test in all four classes. The pre-test took place during the first lecture, and the post-test was administered during the first hour of the final examination period. The second part of the final exam consisted of standard open-ended analytical problems. Final exams for all classes were closed book. During the SSCI portion of the exam period, students could not use notes or calculators. Once the SSCI was complete, students could use three sheets of notes and a calculator for the second part of the final. Both the SSCI post-test and the open-ended final problems counted towards the final grade. The SSCI was worth between 5% and 7% of the grade, and the final was worth between 13% and 19% of the grade.

Table 2 shows the SSCI results for the four classes. Note that the first two classes used version 3.2 of the CT-SSCI, while the last two classes used version 4.11. There were 20 common questions between versions 3.2 and 4.11. An ANOVA test indicates there is no statistically significant difference in the four class means for the subtest containing only the common questions. Figure 1 shows the post-test difficulty indexes (percentage correct) for the 20 SSCI questions that are common to versions 3 and 4. While there are some differences between the four classes, the general behavior of the difficulty index curves is quite similar. The ANOVA test and the difficulty indexes indicate that the student populations in these four classes are similar enough that it is reasonable to compare data from these different semesters and schools.

In addition to the SSCI data, Classes 1 and 4 shared three questions on the problem-solving section of their final exams. The students' responses to these questions were coded and linked to the SSCI results using anonymous study ID numbers. A five-level coding scheme was used: 4 = correct, 3 = minor errors, 2 = major errors, 1 = wrong, and 0 = no answer. Three of the authors coded the results independently. Differences were discussed and resolved, so that the final coding represents the group consensus. Section 4 compares the

coded results for these final exam problems to related questions on the SSCI.

3. CORRELATION BETWEEN SSCI SCORE AND FINAL EXAM SCORE

Figure 2 shows scatter plots of scores on the CT-SSCI versus scores on the final exam for each of the four classes. Based on these plots, the SSCI scores appear positively correlated with the final exam (problem-solving) scores. Table 3 contains the correlation coefficients and the associated significance values for the SSCI/final-exam comparison. The correlation varies between 0.42 and 0.76 for these four classes, and the analysis indicates these correlations are statistically significant at the 1% level (p < 0.007 for all classes). These results indicate that, for this population, students who have greater conceptual understanding (as measured by the SSCI) perform better on open-ended exam problems. The correlation is not equal to 1, nor would we expect it to be, since conceptual understanding and procedural understanding are not necessarily correlated. For example, Mazur found that student scores on a series of conceptual physics problems were uncorrelated with scores on a set of paired conventional problems [10]. The conventional problems could be solved using procedural knowledge (a standard recipe), but the conceptual problems could not be solved by rote.

4. COMPARISON OF OPEN-ENDED FINAL EXAM PROBLEMS TO SELECTED SSCI QUESTIONS

This section compares student responses to three final exam problems (F1, F2, and F3) to their performance on SSCI questions that assess related concepts. Classes 1 and 4 are used in this analysis.
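The entries in Table 3 correspond to a standard Pearson correlation with its two-sided significance test. A minimal sketch of that computation, using hypothetical score arrays since the raw per-student data are not published:

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores (percent) for one class; stand-ins for the real data.
ssci_scores = np.array([72.0, 56.0, 88.0, 64.0, 80.0, 48.0, 76.0, 60.0])
final_scores = np.array([78.0, 61.0, 90.0, 70.0, 74.0, 52.0, 81.0, 68.0])

r, p = stats.pearsonr(ssci_scores, final_scores)
# r is the correlation coefficient and p the two-sided significance value,
# matching the "Correlation" and "Significance" columns of Table 3.
print(f"correlation = {r:.3f}, significance = {p:.3f}")
```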


Table 4. Class 1, F1 versus CT-SSCI Question 8
SSCI Q8 \ Problem F1     correct   minor errors   major errors   wrong   zero
correct                     19          6              5            1      0
wrong (shape ok)             0          1              5            2      0
wrong (add sigs)             0          0              0            1      0
wrong (multiply sigs)        0          0              0            0      0

Table 5. Class 1, Contingency Table for F1 vs. Q8
SSCI Q8 \ Problem F1   correct   incorrect
correct                   25         6
incorrect                  1         8
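Tables 4 and 5 are analyzed in Section 4.1 below, which centers on convolving rectangular pulses. As a quick numerical illustration of the shapes discussed there (the pulse lengths here are hypothetical), convolving two rectangles of equal length yields a triangle, and of unequal lengths a trapezoid:

```python
import numpy as np

dt = 0.01  # step size for a discrete approximation of CT convolution
rect1 = np.ones(int(1.0 / dt))  # unit-amplitude pulse of length 1 (hypothetical)
rect2 = np.ones(int(3.0 / dt))  # unit-amplitude pulse of length 3 (hypothetical)

triangle = np.convolve(rect1, rect1) * dt   # equal lengths -> triangular pulse
trapezoid = np.convolve(rect1, rect2) * dt  # unequal lengths -> trapezoidal pulse
print(triangle.max(), trapezoid.max())      # both peak near 1.0 for these pulses
```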

Fig. 2. Scatter plots of final exam score versus SSCI score for the four classes in the study. Classes 1 and 2 took version 3.2 of the CT-SSCI, and Classes 3 and 4 took version 4.11 of the CT-SSCI.

4.1. Problem F1: Convolution

Figure 3 shows Final Exam Problem F1, which assesses students' understanding of the fundamental topic of convolution.

Problem F1: System 1 is a linear time-invariant (LTI) system with the impulse response h1(t) shown below [plot of h1(t): a short rectangular pulse; the axis labels show times -2, -1, and 0]. Determine and sketch the output y(t) of the system when the input is the signal x(t) shown below [plot of x(t): two non-overlapping rectangular pulses; the axis labels show amplitudes 2 and 1 and times 0, 1, 3, and 5].

Fig. 3. Final exam Problem F1 (one part of a longer problem).

This problem is linked with SSCI v3.2 Question 8 (Q8) and SSCI v4.11 Questions 13 (Q13) and 15 (Q15). Problem F1 asks students to compute the output of an LTI system when the input is two non-overlapping rectangles and the impulse response is a short square pulse. The correct answer is a tall triangular pulse followed by a shorter trapezoidal signal. SSCI question Q8 asks students to identify the output of an LTI system when the impulse response is a unit-amplitude rectangular pulse and the input signal is a square pulse. Students must first recognize that they need to convolve these signals, and then identify the correct result for that convolution. The distractors for Q8 vary substantially in the level of misconception. Two distractors probe whether students understand what convolution means, since

they give the results of adding and multiplying x(t) with h(t). The third distractor has the right trapezoidal shape, but the wrong extent. Students choosing this distractor would recognize that convolving the given signals should produce a trapezoid, but lack sufficient understanding to identify which trapezoid. Feedback from the SSCI Design Team and student interviews indicated that Q8 on version 3 was too inconsistent in its distractors, mixing gross misconceptions (e.g., adding instead of convolving) with subtle misconceptions (getting the right shape but the wrong time extent). Based on this feedback, version 4 of the SSCI replaced Q8 from version 3 with two new questions: Q13 and Q15. These questions were designed to probe different levels of misconceptions about convolution.

Q13 was designed to be the more basic of the two. The question statement for Q13 is identical to Q8 from version 3.2. However, in this case, the correct answer is the only trapezoid given among the choices. As with Q8, two of the distractors represent the addition and the multiplication of x(t) and h(t). The third distractor is now a rectangular pulse, but with the same extent and location as the correct output. This final distractor probes for students who remember how to find the starting and ending times for the convolution of two finite signals, but don't recall how to actually compute a convolution. The design goal for Q13 was to probe whether students have the basic understanding that they need to convolve the input and impulse response to find the output, and what shape that convolution will have.

Question Q15 was designed to probe for a deeper understanding of convolution than Q13. This problem also asks the students to specify the output of an LTI system given the input and impulse response. In this case, both the input and the impulse response are unit-amplitude rectangular signals, but with different lengths. Again, the correct answer is a trapezoid. However, all three distractors have the same region of support on the time axis as the correct trapezoid. The distractors are a trapezoid with the wrong extents for the linear and constant regions (i.e., the slope is wrong), a trapezoid with the wrong amplitude for the constant region, and a triangle. Obtaining the correct answer to Q15 requires a deeper level of understanding about convolution than Q13 requires.

Table 4 compares the coded responses of Class 1 students to Problem F1 with their answers to SSCI Q8. As the table shows, all students who were able to compute the correct answer to the open-ended problem selected the correct answer to Q8. The majority of students who made minor errors on the open-ended problem were


Table 6. Class 4, F1 versus CT-SSCI Question 13
SSCI Q13 \ Problem F1      correct   minor errors   major errors   wrong   zero
correct                       5           8              7            4      1
wrong (equal to h(t))         0           0              0            1      0
wrong (add sigs)              0           0              0            0      0
wrong shape/right length      0           0              0            0      0

Problem F2: A causal linear time-invariant system has the transfer function H(s) given below:

H(s) = (s + 10) / ((s + 1)(s + 100))

Sketch the Bode magnitude plot for this system.

Fig. 4. Final exam Problem F2 (one part of a longer problem).
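A numerical sketch of the Bode magnitude plot requested in Problem F2, assuming scipy is available (the exam itself required a hand sketch):

```python
import numpy as np
from scipy import signal

# H(s) = (s + 10) / ((s + 1)(s + 100)) from Problem F2
num = [1.0, 10.0]
den = np.polymul([1.0, 1.0], [1.0, 100.0])  # (s + 1)(s + 100)
w, mag_db, _ = signal.bode(signal.TransferFunction(num, den),
                           w=np.logspace(-2, 4, 500))
# The magnitude starts at the DC gain 10/100 = 0.1 (-20 dB), breaks downward
# at the pole at 1 rad/s, back up at the zero at 10 rad/s, and down again at
# the pole at 100 rad/s.
print(mag_db[0])  # approximately -20 dB
```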

Table 7. Class 4, F1 versus CT-SSCI Question 15
SSCI Q15 \ Problem F1        correct   minor errors   major errors   wrong   zero
correct                         5           6              4            3      1
wrong (slope is incorrect)      0           0              0            1      0
wrong (max height wrong)        0           1              3            1      0
wrong shape (triangle)          0           1              0            0      0

Table 8. Class 1, F2 versus CT-SSCI Question 22
SSCI Q22 \ Problem F2    correct   minor errors   major errors   wrong   zero
correct                     10         11              6            0      0
wrong (pole at 10)           1          2              1            2      1
wrong (zero at 100)          2          0              1            0      0
wrong (-40dB offset)         1          1              1            0      0

also able to choose the correct answer to the conceptual question. Some students who made major errors or were completely wrong on Problem F1 were able to answer Q8 correctly, but others chose the answer with the right shape but the wrong time extent. Note that the only student who selected the SSCI distractor indicating that the input and the impulse response should be added to obtain the output was also completely unable to solve the open-ended convolution problem.

Assuming that students who made minor errors had a correct understanding of convolution, whereas those who made major errors did not, Table 4 reduces to the 2 × 2 contingency table shown in Table 5. This contingency table can be analyzed using Fisher's exact test [11]. Fisher's test indicates that the results for Q8 and Problem F1 are statistically significantly correlated (p = 0.0003).

Tables 6 and 7 compare the coded responses of Class 4 to the open-ended Problem F1 with their responses to the two new convolution questions on v4.11 of the SSCI. As Table 6 shows, all but one student in Class 4 answered Q13 correctly. The student who answered Q13 incorrectly also produced a completely wrong answer to Problem F1. The 25 students who chose the correct answer to Q13 gave responses to F1 that ranged from correct to completely wrong. Given the distribution of responses in Table 6, it is not surprising that Fisher's exact test of the corresponding 2 × 2 table finds no statistically significant association between student responses to Q13 and F1. Table 7, comparing F1 and Q15, shows similar results. Most of the students were able to select the correct answer to Q15, but their responses to Problem F1 varied from correct to completely wrong. Fisher's exact test for the 2 × 2 table corresponding to Table 7 likewise finds no significant association between the results for Q15 and Problem F1.

There are several possible reasons why Q8 correlates well with Problem F1 while Q13 and Q15 do not. First, the data set for Class 4 is relatively small, and very few students got the conceptual questions wrong. A larger data set may be required to fully sample the distractor space. Second, SSCI questions 13 and 15 are more focused on specific aspects of convolution than Q8. While the general five-level coding scheme (correct, minor errors, major errors, wrong, and zero) worked well for the general Q8, it may not be appropriate for these more specific questions. For example, since all of the Q15 answers have the same time extent, a coding scheme that

ignores time-axis errors might work better. Third, as noted in the conclusions (Section 5 below), it is possible that the cognitive level of Problem F1 is significantly higher than that of questions 13 and 15. This mismatch in level could explain the lack of correlation in the results.

4.2. Problem F2: Bode Frequency Response Plots

Final exam Problem F2 and CT-SSCI Q22 (v3.2) and Q20 (v4.11) are also closely related. All of these questions require students to work with Bode frequency response plots. Problem F2, shown in Figure 4, asks students to sketch the Bode magnitude plot for a system function H(s) with two poles (at s = -1 and s = -100) and one zero (at s = -10). Q22 and Q20 are the same basic conceptual question, differing only in formatting details. The question is designed to assess whether students understand how the introduction of a new pole modifies the Bode plot. Q22/Q20 provides a Bode magnitude response for a system H(jω) and asks the student to identify the magnitude response of a new system obtained by multiplying H(jω) by an additional pole. The distractors include one that adds a zero instead of a pole, one that changes the DC value of the Bode plot, and one that puts the pole at the wrong frequency.

Tables 8 and 9 summarize the results for the comparison of the open-ended Bode plot problem and the corresponding SSCI question. The results for Class 1 shown in Table 8 indicate that most students can correctly answer the conceptual question, but the coded results indicate that these same students exhibit varying levels of ability when it comes to sketching a Bode magnitude plot given the system function. Fisher's exact test for the corresponding 2 × 2 table finds no statistically significant association between the results for Problem F2 and SSCI Q22. The results for Class 4 shown in Table 9 show a similar mixture of coded responses among the large majority of students who answered Q20 correctly. Again, Fisher's test finds no significant association between the open-ended Bode plot problem and the SSCI question. As noted in the previous section, there are several possible explanations for the lack of correlation between the conceptual question and the open-ended problem. It may be useful to consider a different coding scheme that is tailored to represent the distractors in Q22/Q20.
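The Fisher's exact tests in Sections 4.1 and 4.2 can be reproduced directly from the published 2 × 2 tables. A sketch using the Table 5 counts, with scipy standing in for the method of [11]:

```python
from scipy import stats

# Table 5 (Class 1, F1 vs. Q8): rows are SSCI Q8 correct/incorrect,
# columns are Problem F1 correct/incorrect.
table = [[25, 6], [1, 8]]
odds_ratio, p = stats.fisher_exact(table)  # two-sided test by default
print(f"p = {p:.4f}")  # the paper reports p = 0.0003 for this table
```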


Table 9. Class 4, F2 versus CT-SSCI Question 20
SSCI Q20 \ Problem F2    correct   minor errors   major errors   wrong   zero
correct                      4          5              4            5      1
wrong (pole at 10)           2          0              1            2      1
wrong (zero at 100)          0          0              0            0      0
wrong (-40dB offset)         0          0              0            1      0

4.3. Problem F3: Fourier Transform

Final exam Problem F3, shown in Figure 5, is related to a collection of five questions on the CT-SSCI dealing with Fourier transform properties. On CT-SSCI v3.2, these are Questions 9-11, 15, and 16, while on CT-SSCI v4.11, the relevant questions are 10-12, 21, and 22. The content and distractors for these five questions were essentially unchanged between the two versions of the exam (one distractor on Q15 was changed based on a review of the version 3 data and feedback from the SSCI Development Team), although the formatting and wording were adjusted in an effort to make the questions easier to read with a cleaner layout. Part (a) of Problem F3 asks students to do a rote calculation of the inverse Fourier transform of a square pulse, while part (b) requires students to apply Fourier transform properties to analyze a new system. For this study we compare the results of F3a and F3b with scores on an SSCI subtest consisting of five Fourier-transform-related conceptual questions. (For the earlier study [5] the subtest had six questions, but we removed one of those questions between versions 3 and 4 of the SSCI.)

Figures 6 and 7 show scatter plots of the Fourier subtest scores versus the coded responses for Problem F3 for Classes 1 and 4, respectively. The size of the dots on these plots indicates the number of students with that response. The correlation coefficient and the statistical significance are shown in the title of each plot. Figure 6 indicates that for Class 1, the results for F3b have a correlation of 0.497 with the Fourier subtest, significant at better than the 1% level (p = 0.001), while the correlation of F3a with the Fourier subtest is not significant. The results for Class 4 shown in Figure 7 show similar behavior: the correlation of F3b with the Fourier subtest is 0.46 with significance level p = 0.018. These results are consistent with the idea that F3a tests students' procedural knowledge, so its results should not necessarily be correlated with the conceptual question results. On the other hand, F3b requires conceptual understanding to deconstruct a rather complex problem into a set of subproblems. It is therefore reasonable that performance on F3b is statistically significantly correlated with the SSCI subtest on Fourier transform concepts.
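For reference, the rote calculation requested in part (a) of Problem F3 (see Figure 5 below) follows from the inverse-transform definition; with P(ω) equal to 1 for |ω| < B and 0 otherwise:

```latex
p(t) = \frac{1}{2\pi}\int_{-\infty}^{\infty} P(\omega)\, e^{j\omega t}\, d\omega
     = \frac{1}{2\pi}\int_{-B}^{B} e^{j\omega t}\, d\omega
     = \frac{e^{jBt}-e^{-jBt}}{2\pi j t}
     = \frac{\sin(Bt)}{\pi t}.
```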

Problem F3: The signal p(t) has the Fourier transform P(ω) shown below [plot of P(ω): a rectangular pulse of height 1 extending from -B to +B].

(a) Determine and sketch p(t). Use the definition of the inverse Fourier transform to determine this result. (You may use the Fourier transform table to check your result, but you will not receive full credit unless you prove your result using the definition.)

(b) The signal p(t) is the input to the following system. Note that the frequency response H(ω) is shown below the system. [Block diagram: a cascade producing the signals s(t), x(t), y(t), and r(t) from p(t); the diagram includes multiplication by cos(2Bt) and the filter H(ω), whose response is a rectangular pulse of height 1 extending from -2B to +2B.]

Determine and sketch the Fourier transform of the signals at each point in this system. In other words, sketch S(ω), X(ω), Y(ω), and R(ω). Show your sketches and any other work below.

Fig. 5. Final exam problem F3.

5. CONCLUSIONS

An assessment of validity is crucial for any concept inventory. This paper evaluated the validity of the SSCI by comparing student performance on the inventory questions to their performance on a set of related open-ended exam problems. Analysis of a population of more than 150 students in four classes at two universities indicated a statistically significant correlation between SSCI scores and associated final exam scores. A more detailed analysis of the final exam problems for two classes revealed significant correlations between the coded scores for open-ended problems and the relevant SSCI questions. Correlations for other open-ended problems, notably the Bode plot problem, were not statistically significant.

The results from Section 4 raise the possibility that the SSCI questions were easier for many students in this pool than the open-ended problem-solving questions. Specifically, in Tables 4, 6, and 7, all of the students who answered the open-ended problem correctly also got the SSCI question correct, but many of the students who correctly solved the SSCI question had serious errors on the open-ended problem. This suggests that it might be revealing to examine the SSCI and final exam questions from the perspective of Bloom's taxonomy to see whether some of the open-ended problems require the students to function at a higher cognitive level than the paired conceptual questions.

6. ACKNOWLEDGMENTS

We thank the National Science Foundation (NSF) for its support of the SSCI project through grants DUE-0512686 and DUE-0512430 under the Assessment of Student Achievement program. NSF also supported the initial development of the SSCI through grant EEC-9802942 to the Foundation Coalition. In addition we thank the members of the SSCI Development Team for their input on the design and development of the SSCI.


Fig. 6. Scatter plots of the results for Problems F3a and F3b for Class 1. (Panel titles: F3a, Corr = 0.201, Sig = 0.213; F3b, Corr = 0.497, Sig = 0.001.)

Fig. 7. Scatter plots of the results for Problems F3a and F3b for Class 4. (Panel titles: F3a, Corr = 0.382, Sig = 0.054; F3b, Corr = 0.460, Sig = 0.018.)

7. REFERENCES

[1] Paul S. Steif and John A. Dantzler, "A Statics Concept Inventory: Development and Psychometric Analysis," Journal of Engineering Education, pp. 363-371, October 2005.
[2] Barbara M. Moskal, Jon A. Leydens, and Michael J. Pavelich, "Validity, reliability, and the assessment of engineering education," Journal of Engineering Education, pp. 351-354, July 2002.
[3] Paul S. Steif and Mary Hansen, "Comparisons between performances in a statics concept inventory and course examinations," International Journal of Engineering Education, vol. 22, pp. 1070-1076, 2006.
[4] Kathleen E. Wage, John R. Buck, Cameron H. G. Wright, and Thad B. Welch, "The Signals and Systems Concept Inventory," IEEE Trans. on Educ., vol. 48, no. 3, pp. 448-461, August 2005.
[5] John R. Buck, Kathleen E. Wage, Margret A. Hjalmarson, and Jill K. Nelson, "Comparing student understanding of signals and systems using a concept inventory, a traditional exam and interviews," in Proceedings of the 37th ASEE/IEEE Frontiers in Education Conference, Milwaukee, WI, October 2007, pp. S1G-1-S1G-6.
[6] John R. Buck and Kathleen E. Wage, "Active and Cooperative Learning in Signal Processing Courses," IEEE Signal Processing Magazine, vol. 22, no. 2, pp. 76-81, March 2005.
[7] ECE 220 course materials, http://ece.gmu.edu/kwage/teaching.html.
[8] B. P. Lathi, Linear Systems and Signals, Oxford University Press, Oxford, England, 2005.
[9] A. V. Oppenheim and A. S. Willsky with H. Nawab, Signals and Systems, Prentice Hall, Englewood Cliffs, NJ, second edition, 1997.
[10] Eric Mazur, Peer Instruction: A User's Manual, Prentice Hall, Englewood Cliffs, NJ, 1997.
[11] Jerrold H. Zar, Biostatistical Analysis, Prentice Hall, Upper Saddle River, NJ, fourth edition, 1999.

