BS-CSE Program: Assessment and Feedback Mechanisms, Processes, and Program Improvements

Introduction

The BS-CSE program uses a number of assessment and feedback mechanisms to evaluate and improve the effectiveness of the program.

In Section A, we provide summaries of each of the assessment mechanisms we use. In Section B, we summarize the course group report (CGR) mechanism, an important feedback mechanism that allows us to make effective use of the results of the various assessment mechanisms. In Section C, we summarize the main processes used for evaluating the results of assessments and for identifying and implementing possible improvements in the program.

In Section D, we present summary results from the last several years obtained using one of our assessment mechanisms; similar results for the other assessment mechanisms that involve surveys of various constituents are available in Section D of the page at http://www.cse.ohio-state.edu/~neelam/abet and are not included with this report. In Section E, we consider each of our program outcomes and describe how it has been assessed using the various assessment mechanisms, how the assessment results are used to identify possible improvements, and document the resulting changes that have been implemented to further develop and improve the program. In Section F, we similarly consider each of the Criterion 3 outcomes, describe how it has been assessed, and present the resulting changes to improve the program. While most of the improvements documented in Sections E and F have been fully implemented, a few are currently in varying stages of implementation.


A. Assessment Mechanisms

A1. Exit surveys: Graduating seniors are required, as part of their application to graduate, to complete an on-line exit survey. The survey asks the student to rank, on a scale of 'Very Unimportant' through 'Very Important', the importance of each program objective and outcome; to rank, on a scale of 'Strongly Disagree' through 'Strongly Agree', how well the program met each outcome in his or her individual case; and to rank, on the same scale, how well the program prepared the student to achieve each objective in the years following graduation. Recently, after discussions in the department's Undergraduate Studies Committee (UGSC), two free-form questions were added to the survey: the first asks the respondent to comment on the single aspect of the program that he or she found most helpful; the second asks for the single change in the program that the respondent would most like to see. The results are summarized and discussed at a meeting of the UGSC, typically late in the Spring quarter.

A2. Alumni surveys: Alumni who graduated either two years ago or six years ago are asked to complete a survey. The first part of the survey asks the respondent to rank the importance of each of several general educational outcomes (mostly based on EC2000 Criterion 3 outcomes), as well as how well the program prepared them with respect to these outcomes. The second part of the survey asks the respondent to assess his or her educational experience in the program, by ranking, on a scale of 'Unsatisfactory' through 'Excellent', a range of items such as the quality of instruction by faculty in the major courses, the quality of the computer labs, etc. The third part of the survey asks the respondent to rank, on a scale of 'Strongly Disagree' through 'Strongly Agree', the degree to which each of our published program objectives was met in their particular case. The fourth part of the survey asks for general comments and suggestions for improvement. The results are summarized and discussed by the UGSC once a year.

During some years, the alumni survey has also included a one-time "targeted" component. The topic of this component has varied, from the importance of business knowledge and skills among our students to the importance of general education courses in engineering programs.

A3. Supervisor surveys: Alumni who graduated fifteen years ago are asked to complete a survey. The first part of the survey asks the respondent to rank the importance of each of several general educational outcomes (these are the same as in the first part of the alumni survey), as well as how well the program prepared the OSU graduates that they supervise with respect to these outcomes. The second part of the survey asks for general comments and suggestions for improvement. The results are summarized and discussed by the UGSC once a year.

A4. Student performance in individual courses: All CSE courses include extensive homework assignments, programming labs, and midterm and final examinations. Some heavily project-oriented courses, especially the capstone courses, replace the homeworks and exams with detailed project reports and oral presentations by the students; others that are oriented more toward theoretical issues often substitute additional homework assignments for the programming labs. The performance of students in all of these activities provides the single most important feedback to individual instructors about the course in question. Indeed, student performance in these tasks often allows instructors in particular courses to evaluate the skills that students should have acquired in earlier parts of the curriculum. For example, the quality of oral presentations, as well as of the design documentation, in capstone courses allows instructors to evaluate the communication skills that students are expected to have acquired in a range of earlier courses. Similarly, how well students are able to analyze the performance of various algorithms in CSE 680 gives a good indication of how well they have acquired discrete math skills (in Math 366 and 566).

A5. Student Evaluation of Instruction (SEIs): Each instructor of each course in the program receives feedback via SEIs. Near the end of the quarter (typically in the eighth or ninth week), students are asked to complete an evaluation of the teaching in the particular course. Students' names do not appear anywhere on the evaluation; the instructor is not present in the classroom when the evaluations are completed; and a volunteer student collects the evaluations and returns them to the department's main office, where they remain until the quarter is over and the instructor has assigned grades for the class. The department chair looks through the results at the end of the quarter and then passes them on to the instructor.

The results of the SEI are confidential and are available only to the instructor of the particular course, to the department chair, and, in the case of courses taught by GTAs, to the course coordinator. These results allow instructors to make needed adjustments in their courses, for example, in how they present certain topics. They also allow the chair and, for GTA-taught courses, the course coordinators to intervene where necessary; for example, individual instructors have been moved out of particular courses where they were ineffective. In the case of the CSE 459 courses, which are usually taught by GTAs, the results of this assessment were partly responsible for the idea of developing standard course notes for the courses.

A6. UG forum: An open undergraduate forum is held each year to discuss all aspects of our program, including its objectives. The forum is held in late winter or early spring and is attended by interested students, key faculty members, and the academic advisor. Announcements about the forum are made widely, especially on the electronic (student) newsgroups to ensure wide participation. Following the forum, a summary of the discussion is posted on the newsgroups by the chair of the UGSC; this often leads to further extended discussions where students (including those who could not attend the forum) further express their opinions and ideas on the program. Reports on the forums of the last few years are available from Section D.6 of the page at http://www.cse.ohio-state.edu/~neelam/abet and are not included with this report.

Direct vs. indirect assessments: Of the various assessment mechanisms listed above, A4 is our primary direct assessment mechanism since it involves evaluation by appropriate faculty of actual student work at the most appropriate points in the program. Naturally, as we will see in the tables below, A4 plays the most important role in assessing the various program outcomes and the various Criterion 3 outcomes.

A3 has a direct assessment component since it is partly based on observation of our graduates' performance in their post-graduate employment, by their supervisors. However, in our experience, as in that of many other programs, the return rates for the supervisor surveys have been relatively poor; this may in part be due to employers' concern about the possibility of legal problems in providing such evaluations. In any case, the poor return rate, combined with the fact that the employees in question may have graduated from the program over a number of years during which the program itself has gone through numerous changes, means that it is difficult to identify specific actions for improvement of the program based solely on the results of this survey. We recently instituted a co-op/internship survey which is sent to supervisors of our current students in a co-op or internship. This survey should be somewhat more useful than the supervisor survey since it will be an evaluation of current students. But this survey is very recent and has not yet been integrated into our program improvement process. It is also worth noting that since the evaluation reported in this survey will not be in the context of a particular point in the curriculum, the results of the survey can point to problems only in a very broad sense rather than identifying the need for specific changes at specific places in the curriculum; for that task, there is no better mechanism than A4.

The remaining assessment mechanisms are all indirect. The results from these mechanisms therefore usually provide general guidance rather than point to specific changes in the program, although, as we will see in Sections E and F, there have been some instances of specific changes suggested directly by the results of these mechanisms as well.


B. Course Group Reports (CGRs): A Key Feedback Mechanism

The CGR is a feedback mechanism, unique to our program, that allows us to integrate the results of the various assessment instruments and identify appropriate actions to improve the program. In more detail, the various courses in the program are organized into groups of related courses. For example, the Software Spine group consists of the year-long course sequence CSE 221, 222, and 321. The Software Engineering group consists of CSE 560, the systems programming course with the large team-based project; CSE 757, the survey course on software engineering; CSE 758, the capstone design course on software engineering; and CSE 601, the course on social, ethical, and professional aspects of computing. Another group, this one having to do with an application area, is Computer Graphics, which consists of the six graphics courses: CSE 581, 681, 682, 781, 782, and 784.

Once every two to three years, the coordinators of the courses in a given group are responsible for producing a Course Group Report (CGR) and presenting it to the Curriculum Committee. The CGR is expected to address such questions as whether the courses are meeting their objectives, whether prerequisites are appropriate, whether the current textbooks are suitable, how students react to the courses in the group, how the group relates to the rest of the program, and whether the courses in the group are helping meet the overall program objectives as intended. In arriving at answers to these questions, the course coordinators are expected to draw upon the feedback provided by the various assessment tools listed above, especially A4, student performance in the various courses in the group. The CGR also provides information about the group's ideas for possible changes in the courses. This mechanism ensures that changes in individual courses are not made without taking account of the likely impact on related courses. Full details of the CGR mechanism, as well as the actual reports, are available at http://www.cis.ohio-state.edu/~neelam/abet/CGRs.


C. Processes for Feedback and Improvement

The main processes we use for evaluating the results of the assessment mechanisms and for identifying and implementing program improvements involve two departmental committees, the Undergraduate Studies Committee (UGSC) and the Curriculum Committee (CC). Each of these committees consists of a number of faculty members, student representatives, and the undergraduate (staff) advisor. Further, the UGSC and CC have many members in common, which helps coordinate their actions. The results of the assessment mechanisms A1 (exit survey), A2 (alumni survey), and A3 (supervisor survey) are discussed in the meetings of the UGSC, typically in the Spring quarter. The UGSC is also responsible for organizing the undergraduate forum (A6), usually in the Winter quarter. The opinions and ideas expressed at the forum are discussed in a UGSC meeting following the forum, and a summary report is posted on the student and faculty electronic newsgroups. Ideas for improvements in the program, based on the results of the assessments and on the UGSC discussions, are initiated and developed at various times as appropriate; numerous examples appear in the tables in Sections E and F.

Changes involving individual courses or sequences of courses are discussed in detail in the CC (although ideas for such changes may be triggered by discussions in the UGSC based on the results of various assessments, as described above). Further, the Course Group Reports (CGRs) are presented by the appropriate faculty groups to the CC. Since CC members hear all the reports, they are able to help the faculty groups identify connections with courses in other groups that the faculty in question may not be aware of. The discussions accompanying the presentations of the CGRs in the CC often lead to important refinements in the reports, as well as to ideas for possible changes in the courses to address any problems.

In addition to the formal assessment mechanisms, both the UGSC and the CC often receive informal assessments of various aspects of the program from the student representatives on the committees. These informal assessments often play an important role in identifying and developing possible improvements in the program, and the feedback from them is documented in the minutes of the committees. Because their membership changes relatively slowly over the years, the committees also provide long-term memory of ideas for improvements that may have been considered in previous years but not implemented for various reasons. In addition, as the need arises, other faculty who are not members of either committee are often involved in discussions of particular items. All changes (except for relatively minor or routine ones) are discussed and approved by the full faculty before they are implemented. The posting of the meeting minutes of both committees on the faculty and student electronic newsgroups ensures that all constituents are well informed of ideas for changes and improvements in the program.

Three other committees that play important roles in assessment and improvement of the program are the Outcomes Committee, the Core Committee, and the Academic Standards and Progress Committee (ASAP). The Outcomes Committee is a College of Engineering committee and consists of faculty from the various engineering programs. This committee allows faculty from the different programs to share ideas about how to assess programs; it is also responsible for the assessment-related activities, such as designing alumni surveys and supervisor surveys, that are coordinated by the college. The Core Committee is also a college committee and consists of faculty from the various engineering programs. It is responsible for designing the college-level portion of the curriculum that is common to all the engineering programs and for interacting with the various non-engineering departments (such as English, Mathematics, Physics, etc.) that teach courses that all engineering students are required to take, to ensure that these courses are meeting the needs of the engineering programs. As in the case of the departmental UGSC and CC, these two committees also have many members in common with each other, helping coordinate their actions.

The ASAP committee is also a college committee and consists of faculty members as well as undergraduate (staff) advisors from the various programs in the college. The committee typically meets once a quarter to discuss the performance of students who may not be meeting minimum standards of academic performance. These discussions allow the committee to determine suitable follow-up actions for individual students, ranging from probation to dismissal from the program or the college; further, and perhaps more importantly, they allow the committee to identify possible intervention actions, such as advising the student to repeat an earlier course or to take a smaller load of CSE courses. Prior to the ASAP meeting, the CSE undergraduate staff advisor identifies students with potential problems; the advisor and the chair of the UGSC then arrive at possible courses of action, which are presented to the ASAP committee. The undergraduate advisor also checks that each student filing for graduation has, in fact, met all the program requirements.


D. Summary Results of Exit Surveys

The results of the CSE Exit Survey (A1) for the years 1998-2004 are summarized here. As noted earlier, the results of our other assessment mechanisms that involve surveys, i.e., the alumni surveys (A2) and supervisor surveys (A3), are available at the web site mentioned earlier.

For each program objective and outcome, the respondent was asked to rank its importance on a scale of "Very Unimportant" through "Very Important", and how strongly the respondent agreed with the statement "This program objective has been met for me personally" on a scale of "Strongly Disagree" through "Strongly Agree". In averaging the responses, the following weights were attached to the various possible responses:

                Importance                            Objective was Met
  Response                 Weight           Response                Weight
  Very Unimportant           0%             Strongly Disagree         0%
  Somewhat Unimportant      33%             Moderately Disagree      20%
  Somewhat Important        67%             Slightly Disagree        40%
  Very Important           100%             Slightly Agree           60%
                                            Moderately Agree         80%
                                            Strongly Agree          100%
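As a brief illustration of how these weights are applied, the following sketch computes the average percentage for a set of responses. This is not the program's actual survey-processing tool; the function and variable names are ours, chosen for clarity.

```python
# Weights from the tables above: each categorical response maps to a percentage.
IMPORTANCE_WEIGHTS = {
    "Very Unimportant": 0,
    "Somewhat Unimportant": 33,
    "Somewhat Important": 67,
    "Very Important": 100,
}

MET_WEIGHTS = {
    "Strongly Disagree": 0,
    "Moderately Disagree": 20,
    "Slightly Disagree": 40,
    "Slightly Agree": 60,
    "Moderately Agree": 80,
    "Strongly Agree": 100,
}

def average_score(responses, weights):
    """Average the weighted responses, rounded to the nearest whole percent."""
    scores = [weights[r] for r in responses]
    return round(sum(scores) / len(scores))

# Example: three respondents rate an outcome's importance.
responses = ["Very Important", "Somewhat Important", "Very Important"]
print(average_score(responses, IMPORTANCE_WEIGHTS))  # (100 + 67 + 100) / 3 = 89
```

Each percentage reported in the table below is the result of this kind of weighted averaging over all respondents for the given year.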

The Results: The survey results for each of the years 1998-99, '99-'00, '00-'01, '01-'02, '02-'03, and '03-'04 are shown in the table below. For each objective and outcome, the table lists the average importance (as a percentage) that respondents attached to that item in each year, and the average value for "objective was met" that respondents specified for that item in each year. If a given item did not appear in the survey for a particular year, no value is shown. The number of respondents for each year's survey is listed at the top of the table.

Also, the wording of some of the questions changed slightly in Autumn 2001 (following a similar change in the published objectives and outcomes of the program). In such cases, the numbers corresponding to the most closely related questions from the earlier surveys were used in preparing this table.

Objective/Outcome (Importance on scale 'Very Unimportant' - 'Very Important';
Objective met on scale 'Strongly Disagree' - 'Strongly Agree')

Year:                  98-99  99-00  00-01  01-02  02-03  03-04
Respondents:           N=65   N=67   N=70   N=55   N=95   N=109

1. To provide graduates with a thorough grounding in the key principles and practices of computing, and in the basic engineering, mathematical, and scientific principles that underpin them.
   Importance:          88     92     89     93     91     92
   Objective met:       85     85     82     83     83     83

1a. Students will demonstrate proficiency in the areas of software design and development, algorithms, operating systems, programming languages, and computer architecture.
   Importance:          90     96     92     91     93     94
   Objective met:       81     79     78     79     77     80

1b. Students will demonstrate proficiency in relevant aspects of mathematics, including discrete mathematics and probability, as well as electrical circuits and devices.
   Importance:          72     73     70     74     69     73
   Objective met:       83     79     79     81     79     77

1c. Students will successfully apply these principles and practices to a variety of problems.
   Importance:          81     87     87     85     89     86
   Objective met:       76     74     74     75     76     74

2. To provide graduates with an understanding of additional engineering principles, and the mathematical and scientific concepts that underlie them.
   Importance:          72     72     70     69     73     75
   Objective met:       81     82     82     82     79     79

2a. Students will demonstrate an understanding of differential and integral calculus, and of statistics.
   Importance:          70     69     68     70     67     67
   Objective met:       81     80     83     85     82     82

2b. Students will demonstrate an understanding of the basic principles of physics and at least one other laboratory-based science.
   Importance:          70     69     68     64     62     63
   Objective met:       81     80     83     84     81     81

2c. Students will demonstrate an understanding of the basic principles of at least one other engineering discipline in addition to computing and electrical engineering.
   Importance:          70     69     68     56     58     63
   Objective met:       81     80     83     71     75     74

3. To provide graduates with an understanding of human and social issues that will enable them to be informed and involved members of their communities, and responsible engineering and computing professionals.
   Importance:          74     79     78     72     69     72
   Objective met:       70     67     73     71     69     72

3a. Students will demonstrate familiarity with basic concepts and contemporary issues in the social sciences and the humanities.
   Importance:          59     61     59     56     58     63
   Objective met:       73     69     74     68     70     71

3b. Students will demonstrate an understanding of social, professional and ethical considerations related to engineering in general and to computing in particular.
   Importance:          72     72     72     77     74     74
   Objective met:       65     66     73     71     73     71

4. To provide students with appropriate social and organizational skills.
   Importance:          86     85     88     88     88     85
   Objective met:       74     75     79     64     72     73

4a. Students will demonstrate an ability to work effectively in teams.
   Importance:          86     85     88     94     92     93
   Objective met:       74     75     79     77     81     80

4b. Students will demonstrate an ability to communicate effectively.
   Importance:          91     93     88     93     91     92
   Objective met:       72     70     71     68     73     73

5. To prepare graduates for employment in the CSE profession upon graduation, as well as for successful careers in the profession, and for graduate study in computing.
   Importance:          95     97     96     99     94     96
   Objective met:       78     73     78     70     68     69

5a. Graduates will be heavily recruited for positions in high-technology companies that utilize their computing education.
   Importance:          97     98     96     92     93     95
   Objective met:       78     72     75     62     56     57

5b. Strong graduates will be prepared to enter good graduate programs in computing.
   Importance:          75     77     77     74     78     82
   Objective met:       83     78     79     77     72     83

5c. Graduates will demonstrate an ability to acquire new knowledge in the computing discipline and to engage in life-long learning.
   Importance:          --     --     --     95     95     92
   Objective met:       --     --     --     87     82     84

("--" indicates that the item did not appear in that year's survey.)


E. Improvements Related to Program Outcomes

In the table below, we first list each program outcome; then summarize which assessment mechanisms (A1 through A6, as described in Section A) are used to assess how well that particular outcome is achieved; specify which feedback mechanisms (Course Group Reports (CGRs), discussions in the UGSC, discussions in the Curriculum Committee (CC), discussions in the Outcomes Committee, etc.) are used to analyze the results of the assessments to arrive at possible changes to improve achievement of the outcome; and list some recent changes that have been implemented (or, in some cases, are ongoing) to improve achievement of the particular outcome.

CSE Outcome I.i: Students will demonstrate proficiency in the areas of software design and development, algorithms, operating systems, programming languages, information systems, and computer architecture.
   Assessment mechanisms: A4 applied to required CSE courses and capstone courses; A5 in individual courses; A1, A2, A3, A6.
   Feedback mechanisms: Course Group Reports; discussion in UGSC based on A1, A2, A3, A6; discussion in CC based on CGRs and on proposals for changes in individual courses based on A4, A5, A6.
   Recent improvements: Improvements in CSE 201 (pre-req for 221); new activity-based learning in 221; new course project in 321; new project in 655; change in pre-req for 459.22 to 321 (dropping 314 as alternate pre-req).

CSE Outcome I.ii: Students will demonstrate proficiency in relevant aspects of mathematics, including discrete mathematics and probability, as well as electrical circuits and devices.
   Assessment mechanisms: A4 (and, to a lesser extent, A5) in 625, 680; A6.
   Feedback mechanisms: Course Group Reports, and discussion in CC based on the reports; discussion in UGSC based on A6; interaction of appropriate CSE faculty with math and ECE faculty.
   Recent improvements: Addition of Math 566 as a required course (about three or four years ago); improvement in the relation between CSE 360 and ECE 567; improvements in 680 (ongoing).

CSE Outcome I.iii: Students will successfully apply these principles and practices to a variety of problems. (Note: All items under Outcomes I.i and I.ii also apply to I.iii and are not repeated here.)
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC.
   Recent improvements: Recent introduction of CSE 682, a new capstone course in animation; recent introduction of several elective courses (on visualization, security, data mining, etc.); (planned) changes in CSE 762, another capstone course.

CSE Outcome II.i: Students will demonstrate an understanding of differential and integral calculus, and of statistics.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC; discussions in Core Committee.
   Recent improvements: Replacement of Math 415 (diff eqns) with Math 566 (second course on discrete math).

CSE Outcome II.ii: Students will demonstrate an understanding of the basic principles of physics and at least one other laboratory-based science.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC; discussions in Core Committee.
   Recent improvements: Increased flexibility in science requirements.

CSE Outcome II.iii: Students will demonstrate an understanding of the basic principles of at least one other engineering discipline in addition to computing and electrical engineering.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in Core Committee.
   Recent improvements: No recent changes.

CSE Outcome III.i: Students will demonstrate familiarity with basic concepts and contemporary issues in the social sciences and the humanities.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in Core Committee.
   Recent improvements: No recent changes.

CSE Outcome III.ii: Students will demonstrate an understanding of social, professional and ethical considerations related to engineering in general and to computing in particular.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC; discussions in Core Committee.
   Recent improvements: A recent proposal for changes in the Engineering GEC (pending approval by the university) will require all engineering students to take a course on ethics.

CSE Outcome IV.i: Students will demonstrate an ability to work effectively in teams.
   Assessment mechanisms: A4 (in CSE 560, capstone courses); A1, A2, A3, A6.
   Feedback mechanisms: Appropriate CGRs; discussions in UGSC.
   Recent improvements: Recent changes in the criteria for capstone courses stress the importance of teamwork, and all capstone courses are being evaluated against the new criteria to ensure, in particular, that students' teamwork skills are well developed. One of the student computing labs (BO 111) was recently reorganized to facilitate teamwork; it provides wireless networking that team members with suitably equipped laptops can easily access, take minutes of meetings, etc.

CSE Outcome IV.ii: Students will demonstrate an ability to communicate effectively.
   Assessment mechanisms: A4 (in CSE 560, 601, capstone courses); A1, A2, A3, A6.
   Feedback mechanisms: Appropriate CGRs; discussions in UGSC.
   Recent improvements: The curriculum was recently revised to require all CSE majors to take a course in effective public speaking.

CSE Outcome V.i: Graduates will find suitable positions in industry and government that offer the prospect of challenging and rewarding careers in computing.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC; discussions in CC.
   Recent improvements: Recent introduction of CSE 682, a new capstone course in animation; recent introduction of several elective courses (on visualization, security, data mining, etc.); addition of a required public speaking course to the curriculum; addition of a required economics course to the curriculum; addition of an elective business course to the curriculum.

CSE Outcome V.ii: Graduates will demonstrate an ability to acquire new knowledge in the computing discipline and to engage in life-long learning.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC.
   Recent improvements: The revised capstone course criteria stress the importance of life-long learning; some of these courses (e.g., CSE 778) now require students to make an oral presentation on a VLSI tool or system that is not a standard part of the course. A new individualized technical elective option is in the early planning stages; this option will allow students to tailor their technical electives to their particular interests, and its availability should encourage students to explore new ideas and technologies on their own.

CSE Outcome V.iii: Graduates with an aptitude for, and interest in, graduate studies will apply to and be accepted for entry by strong graduate programs in computing.
   Assessment mechanisms: A1, A2, A3, A6.
   Feedback mechanisms: Discussions in UGSC.
   Recent improvements: The recent "BACK (Buckeye Alumni Creating Knowledge)" colloquium series invites our undergraduate alumni who have gone on to graduate school to present talks about their graduate work and experiences.


F. Improvements Related to Criterion 3 Outcomes

In the table below, we list each Criterion 3 outcome, summarize which assessment mechanisms (A1 through A6 as described in Section A) are used to assess how well that outcome is achieved, specify which feedback mechanisms are used to analyze the results of the assessments to arrive at possible changes in the programs to improve achievement of the outcome, and list some recent changes that have been implemented (or, in some cases, are ongoing) to improve achievement of the particular outcome.

3.a: Ability to apply knowledge of mathematics, science, and engineering.
    Assessment mechanisms: A4 applied to required CSE courses and the capstone courses; A5 in individual courses; A1, A2, A3, A6.
    Feedback mechanisms: Course Group Reports; discussions in UGSC and CC; discussions in Core Committee.
    Recent improvements: All of the changes listed under CSE outcome I.i; changes in Engineering 181 and 183.

3.b: Ability to design and conduct experiments, as well as to analyze and interpret data.
    Assessment mechanisms: A4 applied to required CSE and other (non-CSE) Engineering courses; A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC and CC; discussions in Core Committee; interaction of engineering faculty with math and science faculty.
    Recent improvements: Replacement of Math 415 (differential equations) with Math 566 (second course on discrete math).

3.c: Ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability.
    Assessment mechanisms: A4 applied to several of the required CSE courses, in particular CSE 560; A4 applied to the capstone courses.
    Feedback mechanisms: CGRs for the Software Spine group and the Software Engineering group; evaluation of capstone courses against the revised capstone course criteria in UGSC.
    Recent improvements: Greater emphasis in capstone courses on addressing, in the documentation, issues concerning realistic constraints as well as social and ethical issues where appropriate; addition of Econ 200/201 as a required course in the curriculum; addition of a business course as an elective in the curriculum.

3.d: Ability to function on multi-disciplinary teams.
    Assessment mechanisms: A4 applied to CSE 560 and the capstone courses.
    Feedback mechanisms: CGRs for the Software Engineering group; evaluation of capstone courses against the revised capstone course criteria in UGSC.
    Recent improvements: Recent revision of the capstone course criteria to stress the importance of teamwork in the capstone design project.
    Note: The "multi-disciplinary" aspect remains problematic. Some courses offer a natural opportunity for establishing multi-disciplinary teams; a good example is the recent computer animation course, in which CSE majors work with Art majors in suitable teams. Other courses present more of a challenge.

3.e: Ability to identify, formulate, and solve engineering problems.
    Assessment mechanisms: A4 applied to several of the required CSE courses, in particular CSE 560; A4 applied to the capstone courses.
    Feedback mechanisms: CGRs for the Software Spine group and the Software Engineering group; evaluation of capstone courses against the revised capstone course criteria in UGSC.
    Recent improvements: More interesting and more realistic programming projects in a variety of CSE courses (CSE 560, 655, 677, 679, etc.); more interesting and more challenging projects in the capstone courses.

3.f: Understanding of professional and ethical responsibility.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC; discussions in Core Committee.
    Recent improvements: A recent proposal for changes in the Engineering GEC (pending approval by the university) will require all engineering students to take a course on ethics.

3.g: Ability to communicate effectively.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC.
    Recent improvements: The curriculum was recently revised to require all CSE majors to take a course in effective public speaking.

3.h: The broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC; discussions in Core Committee.
    Recent improvements: A recent proposal for changes in the Engineering GEC (pending approval by the university) will require all engineering students to take a course on ethics; recent addition of Econ 200/201 as a required course in the curriculum.

3.i: A recognition of the need for, and an ability to engage in, life-long learning.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC.
    Recent improvements: The revised capstone course criteria stress the importance of life-long learning; some of these courses (e.g., CSE 778) now require students to make an oral presentation on a VLSI tool or system that is not a standard part of the course. In addition, a new individualized technical elective option is in the early planning stages; this option will allow students to tailor their technical electives to their particular interests, and its availability should encourage students to explore new ideas and technologies on their own.

3.j: A knowledge of contemporary issues.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in Core Committee.
    Recent improvements: No recent changes.

3.k: An ability to use the techniques, skills, and modern engineering tools necessary for modern engineering practice.
    Assessment mechanisms: A1, A2, A3, A6.
    Feedback mechanisms: Discussions in UGSC; discussions in CC.
    Recent improvements: Recent introduction of CSE 682, a new capstone course in animation; recent introduction of CSE 459.51 on Perl, a language now widely used in industry; recent introduction of several elective courses (on visualization, security, data mining, etc.).


To be added:

  1. Not clear what to do with this one: maybe say it is a potential assessment mechanism?
    Feedback from interviewers: (scale of 1 to 5; 5 is highest)
  2. Change in AP score for 201 credit; based on A4 (student performance) and A6 (forum comments).
  3. Change in advising (first step (already taken): suggest that students take 181/183 early. Next step: advising email; also in the email include suggestion that they see advisor to plan electives; maybe start this with a pilot before trying to automate it?) - based on UGSC discussion
  4. Changes in exit-survey (addition of free-form questions; pending addition of advising question - based on UGSC discussion).
  5. Introduce ECS workshop in capstone course based on exit survey responses (on-going; for outcome V.i)
  6. Change of the CGR name to Networking and Security: report this as an improvement - maybe not.
  7. Include dates of changes.
  8. Survey of advisory board?
  9. Changes in 778, 762 triggered by evaluation in UGSC against capstone course criteria.
  10. Change in instructor for 725 (based on A5); ask Stu if there are others?
  11. BO 111; also say it is related to the lounge discussions in UGSC; claim this is based on exit, alum and supervisor surveys (importance of team-working).
  12. Addition of Biology (following discussions in core comm.)? is that worth mentioning? It contributes to "broad education"
  13. Standardize notes for 459 courses, based on UG forum comments and SEI comments.
  14. Papers in 321/222/ etc.: For life long learning.
  15. W.r.t. science/math/engineering: Core committee discussion is an important feedback mechanism.
  16. In the tables, when using A4, maybe it should be "A4 applied in capstone courses" etc., rather than just A4.
  17. For communication,team-working, lifelong learning, the capstone evaluation discussion in UGSC is important feedback mechanism.
  18. Advising office survey and resulting changes (see 316, 317, 318 in ADMIN/UGC/ACCRED05/vmmsgs).
  19. Diversity program-related changes?
  20. Recent change in CGR structure (assuming CC approves it) makes the CGR have a definite direct assessment component to it. And at least some of the improvements will be based on the results of these assessments.

Last modified: Sat Apr 23 17:20:37 EDT 2005