CSE Program:
Objectives, Outcomes, Assessments, Program Improvements
Assessment Results
Some key points concerning the upcoming ABET evaluation
The next ABET evaluation of the BS-CSE program will include a
site visit in Fall 2017. We
have three major tasks:
- Collect materials from all required CSE courses (such as Software I and
II, all core-choice courses, both project courses, all capstone courses, and
the most important elective courses). "Materials" includes such things as
copies of syllabi, course notes/slides, other handouts, copies of all
homework/project
assignments, copies of midterms and finals, and representative (average, below
average, above average) samples of student work on all homeworks, projects,
and exams; all of this collected from a recent (Fall '16 or Spring '17) section of
the course. Neelam will contact various faculty members to work on this
starting in August and continuing through the end of Sp '17.
- Assessment/continuous improvement: A major set of requirements in the
accreditation criteria relates to regular assessment of the extent
to which the program outcomes are being achieved; careful evaluation
of the assessment results to identify possible changes that would make the
program more effective, in a process of continuous improvement; and
appropriate documentation of these activities. As many faculty
know, and as detailed below, we have a very good set of assessment and continuous
improvement processes, but the documentation component probably needs
some improvement/revision. We will be working on this during the summer
and on through fall and spring; it will be a key focus of the Undergraduate
Studies Comm. in '16-'17.
This page summarizes the ABET
requirements concerning outcomes, assessments, etc.
Follow-up Actions: This is a new section (Su '16)
that lists follow-up actions taken in response to the evaluation of
various assessment results.
- Self-study: We have to submit to ABET, in late spring '17, a detailed
self-study, providing complete details of our program and how it meets the
various accreditation criteria requirements. We will work on this, starting
in mid-fall and continuing through late spring; the Undergraduate Advising
Office will play an important role (especially in documenting how we meet the
criteria requirements with respect to supporting and advising students).
Neelam will be coordinating these activities. Jeremy Morris and
Al Cline (a part-time Senior Lecturer in the dept.
who is becoming a full-time Senior Lecturer) will
assist Neelam. They will need help from many other faculty in the dept.,
certainly
with respect to collecting materials from various courses but probably also
in completing the other two tasks.
Please send any suggestions or comments to
neelam@cse.ohio-state.edu
Reports
Some useful links
- Current ('16-'17) EAC & CAC
criteria. (Many of
the criteria are common to EAC and CAC; so they are listed only in the
former set. The complete criteria are available here:
official EAC criteria and
official CAC criteria.)
- Iowa rubrics (a good model of rubrics; but
the way they do assessments is *not* a good model for us because, given
the size of our program, it would not be sustainable.
Also, their assessments are part of courses throughout the
curriculum and so cannot
claim to measure outcomes at the time students graduate.
Our POCAT experience underscores the importance of this point: what
students know/are able to do when they are taking a course is not
necessarily a good measure of their knowledge/ability near graduation;
this may be worth stressing in the self-study.)
- ABET's presentation (general information about accreditation)
- Prev. version of this page
Major programs in the CSE Dept.
There are three major programs in the CSE Dept. at Ohio
State, the BS-CSE, the BS-CIS, and the BA-CIS. The BS-CSE program is
offered via the College of Engineering; the BS-CIS and BA-CIS are in
the College of Arts and Sciences. The Computer Science portions of the
BS-CSE and BS-CIS programs are essentially identical to each other; the CS
portion of the BA-CIS program is a subset of that of the BS-CIS and
BS-CSE programs. For details of the curricula of the three programs,
please see this page.
The current page focuses on the assessment and continuous improvement activities
of the BS-CSE program,
which is accredited by the Engineering and Computing
Accreditation Commissions (EAC, CAC) of
ABET.
However, because of the commonality described above, the improvements in our
courses based on the assessment results also apply to the BS-CIS program;
and improvements related to CS courses (such as Software I,
II or the junior project course) that are in the subset of courses taken by
BA-CIS majors apply to the BA-CIS program as well.
A. Introduction
This page is intended both to document our efforts
related to the assessment, evaluation, and continuous improvement of the
BS-CSE program and to help direct those efforts.
It will help various constituents including
current students, alumni, employers of
graduates of the program, and others, understand the growth and evolution
of the program and the rationale behind the evolution. It is
expected to provide the documentation needed to show that the
program meets the requirements of the
ABET criteria, especially the
requirements concerning
program educational objectives, student outcomes, assessment, evaluation, and
continuous improvement. It is also intended to serve as a resource
for other engineering, computing, and technology programs that are in the
process of developing their own assessment and evaluation processes to
meet the ABET criteria. (Information about our previous ABET evaluation (in 2011)
is available here.)
Please send questions or comments to
neelam@cse.ohio-state.edu.
B. ABET Terminology and Criteria Requirements
[Here are (local copies of) the official EAC
and CAC criteria for 2016-'17.]
In the accreditation criteria, ABET uses specific terms defined in particular ways. It
is important to keep these definitions in mind when interpreting the
criteria requirements. In particular, the following terms are important:
Program Educational Objectives (PEOs): PEOs are
broad statements that describe what graduates are expected to attain
within a few years of graduation. PEOs are
based on the needs of the program's constituencies.
An example PEO:
graduates will be employed successfully as computing professionals.
Student Outcomes (SOs): SOs describe what students are
expected to know and be able to do by the time of graduation. These
relate to the skills, knowledge, and behaviors that students acquire
as they progress through the program. An example SO:
students will acquire
an ability to communicate effectively. The intent is that achievement
of the SOs will prepare the graduates to attain the PEOs.
[SOs are also occasionally referred to as "program outcomes (POs)".]
Assessment: one or more processes that identify, collect, and
prepare data to evaluate the attainment of SOs. (At one time, ABET
required PEOs to be assessed as well; that is no longer a requirement, although
many programs still do so.)
Effective
assessment uses both direct and indirect, and quantitative and
qualitative measures as appropriate to the objective or outcome being
measured.
Direct assessment is assessment of actual student
work by someone qualified to assess it; indirect assessment
refers to things like student and alumni surveys. In general, direct
assessments are preferred.
Important: The purpose of assessment is, first,
to assess the program, not individual students; and, second,
to assess the extent to which students in the program
attain each of the student outcomes (SOs). Hence such items as
students' grades in required courses are not suitable for this purpose.
Evaluation: one or more processes for interpreting
assessment data to help identify any weaknesses in the program and potential
improvements in the program in a process of continuous improvement.
Rubrics can be of particular value in assessing SOs related to
such skills as effective communication and teamwork. A rubric
specifies a set of relevant dimensions for the particular skill and, for
each dimension, a set of levels of achievement; levels of achievement
are not bare terms such as "excellent" but rather clear descriptions of the
qualities that the student's work must display (along this particular
dimension) in order to be classified "excellent" etc.
(Some resources:
Writing effective rubrics; UFl assessment site;
more on creating rubrics).
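To make this concrete, here is a small, purely hypothetical fragment of such a
rubric for oral presentations (the dimension and the wording of the levels are
illustrative only, not taken from our actual rubrics). For the dimension
"organization of the presentation", the level descriptions might read:
"exemplary: the talk has a clear introduction, a logical progression of points,
and a conclusion that ties back to the main message"; "satisfactory: the main
points are identifiable but some transitions are abrupt or the conclusion does
not tie back to the introduction"; "unsatisfactory: the audience cannot tell
what the main points are or in what order they are being addressed". An
assessor then marks, for each dimension, the description that best matches the
student's work rather than assigning a bare adjective.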
Summary of ABET criteria requirements:
- PEOs must be determined based on the
needs of the program's constituencies. There must be a documented and
effective process, involving the constituencies, for the periodic
review and revision of the PEOs. PEOs must be published.
- SOs must include the (a-k) outcomes listed in EAC Criterion 3;
the (a-i) in CAC Criterion 3; and the (j-k) in CAC's Program Criteria for
CS programs.
- SOs must prepare graduates to attain PEOs;
there must be a documented and
effective process for the periodic review and revision of the SOs.
SOs must be documented.
- The program must use appropriate, documented processes for
assessing the extent to which SOs are being attained.
The results of the assessments must be evaluated; the assessment results
and the evaluations must be documented and must be
utilized as input for the continuous improvement of the program. Other
available information may also be used to assist in this.
C. Program Educational Objectives
C.1 Current PEOs
The PEOs of the BS-CSE program are:
- (I) Graduates of the program will be employed in the computing
profession, and will be engaged in learning, understanding, and
applying new ideas and technologies as the field evolves.
- (II) Graduates with an interest in, and aptitude for, advanced studies
in computing will have completed, or be actively pursuing, graduate
studies in computing.
- (III) Graduates will be informed and involved members of their
communities, and responsible engineering and computing professionals.
These objectives are published on the OSU
Undergraduate Majors site and
on the program's site.
C.2 Process for assessment and review of PEOs
The main constituents for the program are current students, alumni, and the
computing industry, represented by the Industrial Advisory Board (IAB). Input from current
students is obtained on all aspects of the program, including the
PEOs, at an open Undergraduate Forum that is held each year in the
Winter quarter and is attended by interested students,
key faculty members, and advisors from the Advising Office. It should be stressed, however, that discussion of
PEOs is, typically, only a very small component of any of the forums.
The IAB
convenes for a day-long meeting every year on campus, typically in
late May. The board gets a detailed update on various recent
developments in the department related to research, graduate programs,
and undergraduate programs. Once every three years or so, input is
sought from the board on the PEOs.
Alumni are especially important in the assessment of PEOs since they
have intimate knowledge of the program and, at the same time, also
have experience in industry. Input from alumni is obtained by means
of an alumni survey. The survey is sent to alumni who
graduated either two or three years prior to the survey date.
Thus the approach lets
us gather input from alumni who graduated relatively recently and
hence have more or less current knowledge of the program but also have
some experience in the job market and hence can comment on how well
the program prepared them for the profession. The key portion of the
survey that relates to PEOs asks the respondent to rate, on a scale of
very unimportant through extremely important, the
importance of each of the PEOs. Next, the respondent is asked to rate,
on a scale of strongly disagree through strongly
agree, the extent to which they agree with the statement, "the
BS-CSE program adequately prepared me to achieve the PEO".
C.3 Assessment results
Annual forum:
Announcements about the annual Forum are made widely, especially on
the electronic (student) newsgroups and listservs to ensure wide participation.
Following the forum, a summary of the discussion is posted on the
newsgroups by the chair of the Undergraduate Studies Committee (UGSC).
The report is then discussed in regularly scheduled meetings of the
UGSC to help identify possible improvements in the program.
Reports from forums of the last several years are available here:
Undergraduate Forum Reports
Feedback from Advisory Board: Each year, during the annual meeting
of the CSE Department's Industrial Advisory Board (typically in late April/
early May), a presentation is made
reporting on the state of the BS-CSE program, recent changes in courses,
etc. to obtain feedback from the Board. Also included is a discussion of the
program objectives to see if any changes in the objective might be called
for. Starting in Spring '14, a brief summary of the presentation and the
feedback from the Board has been created; these are available here:
Feedback from IAB.
Alumni Survey: The alumni survey
is administered every other year, the most recent ones being in
2006, 2008, and 2009-'10; the next one is planned to be in early
2012. Prior to 2006, the survey was administered as a paper survey. In
2006, the survey was moved on-line. As noted above, the survey is sent
to alumni who graduated either two or three years prior to the survey
date. Since the survey is conducted every other year, this ensures that
all graduates will receive the survey (exactly) once.
The survey results are discussed in regularly scheduled meetings of the
UGSC to help identify possible improvements in the program
(COE's Outcomes Comm. page is here; it has results for the
common portion; more useful is the
Qualtrics page which has the raw
results.)
Full details of
the survey and results from the three recent surveys are available here:
Alumni Survey Results.
D. Student Outcomes
D.1 Current SOs
Students in the BS-CSE program will attain:
- (a) an ability to apply knowledge of computing, mathematics including
discrete mathematics as well as probability and statistics, science,
and engineering;
- (b) an ability to design and conduct experiments, as well as to analyze
and interpret data;
- (c) an ability to design, implement, and evaluate a software or a
software/hardware system, component, or process to meet desired needs
within realistic constraints such as memory, runtime efficiency, as
well as appropriate constraints related to economic, environmental,
social, political, ethical, health and safety, manufacturability, and
sustainability considerations;
- (d) an ability to function on multi-disciplinary teams;
- (e) an ability to identify, formulate, and solve engineering problems;
- (f) an understanding of professional, ethical, legal, security and
social issues and responsibilities;
- (g) an ability to communicate effectively with a range of audiences;
- (h) an ability to analyze the local and global impact of computing on
individuals, organizations, and society;
- (i) a recognition of the need for, and an ability to engage in,
life-long learning and continuing professional development;
- (j) a knowledge of contemporary issues;
- (k) an ability to use the techniques, skills, and modern engineering tools
necessary for practice as a CSE professional;
- (l) an ability to analyze a problem, and identify and define the
computing requirements appropriate to its solution;
- (m) an ability to apply mathematical foundations, algorithmic
principles, and computer science theory in the modeling and design of
computer-based systems in a way that demonstrates comprehension of the
tradeoffs involved in design choices;
- (n) an ability to apply design and development principles in the
construction of software systems of varying complexity.
These SOs subsume outcomes (a) through (k) of EAC Criterion 3;
(a) through (i) of CAC Criterion 3; and (j), (k) of the CAC Program Criteria for CS programs.
These SOs are published on the
program's site.
Relation to PEOs:
Outcomes (a) through (c), (e), and (k) through (n) ensure that
graduates will be well prepared to succeed in challenging positions in
the computing profession thus contributing to achieving PEO (I).
Outcomes (d) and (g) will prepare graduates to work effectively as
part of teams of CSE professionals, further contributing to their
success as CSE professionals. Outcome (i), along with the solid
technical background ensured by outcomes (a), (b), (e), and (l)
through (n) will prepare graduates to achieve PEO (II), pursuit of
advanced/graduate studies in computing. Outcomes (f), (g), (h), and
(j) will prepare graduates to be informed and involved members of their
communities and to be responsible engineering and computing
professionals, thereby helping them achieve PEO (III).
We have classified the SOs into two groups. Outcomes (a), (b), (c),
(e), (f), (k), (l), (m), and (n) form the technical group of
SOs; (d), (f), (g), (h), (i), and (j) form the professional
group of SOs; note that outcome (f) is included in both
groups. Previously, we had split the professional group into two
subgroups, "societal issues" and "professional skills"; but the
outcomes are so closely related that we have merged these two into a
single group, the professional outcomes group. (But some of the
pages in this site may still refer to these as two distinct groups; if
you notice such references, please email neelam AT cse.ohio-state.edu).
POCAT, an exit-test, is used to assess the technical group of
SOs. A set of rubrics, used to evaluate certain key
activities in the capstone design courses and in CSE 601, the required
course on professional and ethical issues in computing, is used to
assess the professional group of SOs. An exit-survey is used to assess
both groups of outcomes. POCAT and the rubrics are direct
assessments; the exit survey is an indirect assessment. We
consider each in turn.
D.2.1 Assessment of the technical group of SOs using POCAT
POCAT (Program OutComes Achievement Test) is an exit-test
that all BS-CSE majors take prior to graduation. When a
BS-CSE major applies for graduation, generally three quarters before
the expected date of graduation, he or she is asked to sign up to take
POCAT. The test is offered once each quarter, typically in the third
or fourth week of the quarter. Although all CSE students are required
to take the test, the performance on the test does not affect
the grades of individual students in any courses, nor are records
retained on how individual students performed on the test. When a
group of students takes the POCAT, each student receives a unique code
that appears on that student's test but only the individual student
knows his or her code. Once the tests have been graded, summary
results, organized by this code, are posted on electronic bulletin
boards so an interested student can see how well he or she did and how
his or her performance compared with that of others who took the
test. Keeping the test low-stakes in this way was a deliberate decision, since
we did not want the students to spend a lot of time preparing for the test. The goal of
the test is to help assess the program by assessing the
extent to which the students have acquired and internalized the
knowledge and skills associated with the various SOs, not to assess
individual students. Initially, there was a concern that if the
individual students' performance on the test did not affect them in
any tangible way, they would not take the test seriously. Our
experience with the test has eliminated that concern; most students
enjoy taking the test and take it seriously. At the same time,
security of the test is not a concern since students do not
have to worry about how their performance will affect their records;
hence, for example, it is perfectly appropriate to reuse questions from
one POCAT to the next and we do so often.
The questions on POCAT are based on topics from a number of required
and the most popular elective high-level courses related to a variety
of key topics such as software engineering, formal languages and
automata theory, databases, programming languages, computer
architecture, algorithm analysis, AI, computer graphics, etc. Each
question is multiple-choice, with typically two or three questions
in each topic area. The ideal POCAT question not only has a specific
correct answer but also has distractors chosen so that
they correspond to common misconceptions that students tend to have
about the particular concept. It is for this reason that the summary
results of a POCAT include information not only about the percentage
of students who answered a question correctly but also the percentages
of students who chose each of the distractors, in other words how many
students harbored the particular misconceptions represented by the
various distractors about the underlying concept(s). The questions on
the test are chosen in such a way that there are one or more questions
related to each outcome in the technical group. Indeed, because of the nature of the
questions and the nature of the outcomes, many of the questions tend to
be related to more than one outcome in the group.
There is one unusual feature of the POCAT questions that is worth
noting. Each question on the test has, as one of the choices
(typically the last one), an answer along the lines of "I don't
know". The instructions for the test suggest that the student should
pick that answer if he or she has no idea what the correct answer
is. Since their performance on the test will have no impact on their
record, students who do not know the answer to the question and
know that they do not know, pick this answer. This means we do not
have to worry about students making wild guesses and confounding
our attempt to pin down misconceptions that they may have.
The grading of POCAT and production of summary results is
mechanical. The faculty members responsible for each question also
provide an estimate of the percentage of students who ought to be able
to answer the question correctly as well as the particular outcomes
that the question is related to. All of this information is included
in the summary results. This allows the UGSC to have a well-informed
discussion about the extent to which the particular group of students
had achieved these outcomes and identify potential problem spots in
particular courses, indeed in particular topics, and bring them to the
attention of the appropriate group of faculty. The result page for each POCAT
includes three key tables. The first one lists, for each question, the
particular answer (including, possibly, the "I don't know" one) each
student picked for that question. The second table lists, for each question
and each possible answer for the question, the number of students who
picked that answer; there is also a summary line that specifies what
percentage of students answered that question correctly. The third table
lists, for each of the SOs, the average level of achievement of that SO.
This is computed on the basis of the SOs that each question is related to
and the number of students who answered that question correctly. Of these,
the second table is perhaps the most valuable since it allows UGSC and
relevant faculty to identify which particular misconceptions students
most commonly hold concerning a given topic and, hence, arrive at possible
improvements to address the problem.
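To make the computation of the second and third tables concrete, here is a
minimal Java sketch; the question data, outcome labels, and layout of the
answer matrix are hypothetical, and this is only an illustration of the
calculation described above, not the actual script used to produce the POCAT
result pages.

```java
import java.util.*;

// Illustrative sketch of the per-question and per-SO summary computation.
// All data below is hypothetical; only the method of calculation follows the
// description in the text.
public class PocatSummarySketch {
    record Question(String id, char correctChoice, Set<String> relatedSOs) {}

    public static void main(String[] args) {
        List<Question> questions = List.of(
            new Question("Q1", 'B', Set.of("a", "e")),   // hypothetical questions
            new Question("Q2", 'D', Set.of("c", "k")));
        // answers[s][q] = choice picked by student s on question q (hypothetical)
        char[][] answers = { {'B', 'D'}, {'B', 'A'}, {'E', 'D'} };

        Map<String, List<Double>> perOutcome = new TreeMap<>();
        for (int q = 0; q < questions.size(); q++) {
            Question question = questions.get(q);
            int correct = 0;
            for (char[] oneStudent : answers)
                if (oneStudent[q] == question.correctChoice()) correct++;
            double pctCorrect = 100.0 * correct / answers.length;
            // second table: percentage of students answering each question correctly
            System.out.printf("%s: %.0f%% correct%n", question.id(), pctCorrect);
            // credit this question's result toward every SO it is related to
            for (String so : question.relatedSOs())
                perOutcome.computeIfAbsent(so, k -> new ArrayList<>()).add(pctCorrect);
        }
        // third table: average level of achievement for each SO, averaged over
        // the questions related to that SO
        perOutcome.forEach((so, pcts) -> System.out.printf("outcome (%s): %.0f%%%n",
            so, pcts.stream().mapToDouble(Double::doubleValue).average().orElse(0)));
    }
}
```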
Results from all the POCATs are available here:
POCAT Results (outcomes (a), (b), (c), (e), (f), (k), (l), (m), (n))
D.2.2 Assessment of the professional group of SOs using rubrics
By their nature, both the achievement of the professional SOs
and their assessment require different approaches from those used
for the technical SOs. In general, a variety of
courses from across the curriculum, including several non-CSE courses,
contribute to the achievement of a number of the professional SOs. At
the same time, both in order to ensure that students engage in
activities that help achieve these outcomes in a CSE context,
and in order to help with assessment of the extent to which they achieve
these outcomes, we have adopted the following approach.
CSE 601, the 1-credit required course on social, ethical and
professional issues in computing, and each of the capstone design
courses include a number of activities that are tailored to the
professional SOs while, at the same time, being well-integrated with
the courses. CSE 601 requires each student to explore a new or recent
product or practice or event etc. (outcome (i) (as well as outcome
(k))), consider the impact it may have in a "global, economic,
environmental, and societal context" (outcome (h)); consider as well
any relevant contemporary issues (outcome (j)) as well as ethical and
professional issues (outcome (f)) related to the product, practice, or
event; and present the findings in a 3-4 page paper. CSE 601 includes
this activity in order to further develop the degree of student
achievement of these outcomes. Naturally, this activity also
contributes to the development of written communication skills
(outcome (g)). In addition, the course requires students to make oral
presentations (outcome (g)) on topics related to social, ethical, and
professional issues in computing. Suitable rubrics, each with
appropriate dimensions corresponding to the component skills of these
outcomes, have been developed to assess the extent to which students
are achieving these outcomes as exhibited by their performance in
these activities.
Turning next to the capstone design courses, the central activity in
each of these courses is, of course, a quarter-long design and, in
most cases, implementation project. The courses require student teams
to make a number of oral presentations and produce suitable written
design documentation (including such things as storyboards) accessible
to clients, project managers, peers, etc., thereby contributing to
outcome (g). The courses also require students to engage in an
activity similar to that in 601, researching a product or practice,
typically one relevant to the team's project, and write a paper or
make presentations about it, thus contributing to (i). One of the
capstone design courses, CSE 786, has developed an especially
innovative way (dubbed "technology teams") for this component of the
course that not only helps students develop life-long learning skills
but also engages them in additional team activity.
Often the team
projects and/or the researched tools raise questions related to
ethical, legal, etc., issues, thus contributing to (f); as well as
issues related to impact of aspects of computing on individuals and
society and related contemporary issues, thereby contributing also to
(h) and (j). As in the case of CSE 601, suitable rubrics, each with
appropriate dimensions corresponding to the component skills of these
outcomes, have been developed to assess the extent to which students
are achieving these outcomes as exhibited by their performance in
these activities. More details of the capstone design courses are
available here.
Rubrics for assessment of professional outcomes:
- Rubric assessment of junior project courses (CSE 3901, 3902)
- Rubric assessment of capstone projects (for use by course instructor)
(pdf)
Results: Spring '17;
Autumn '16
- Rubric for evaluating capstone final (poster) presentation (for use by visitors to the poster session);
(that is the final version as of Aug. 8, 2016;
we went through a number of versions, starting with
this;
then these three versions:
1,
2,
3,
before arriving at the final one).
Results: Spring 2017;
- Rubric for assessing life-long learning paper (in CSE 2501) (outcomes (f), (g), (h), (i), and (j))
- Rubric for assessing individual presentations in CSE 2501, Phil 1338 (outcome (g))
- Summative Rubric for Assessment of Program Outcomes in CSE 2501, Phil 1338 (CAC outcomes (e), (f), (g))
Results: 1338 (Au '16);
2501 (Sp '17).
- Rubric for assessing team work (outcome (d))
- Rubric for assessing team presentations (outcomes (d), (g))
- Rubric for assessing technology research as part of a technology team (outcomes (d), (f), (g), (h), (i), and (j)).
Sample assessment results from some recent courses (in each case, S1, S2, ... etc. refer to individual students whose work was assessed):
Rubric for assessment of capstone course: Since the capstone design
course plays such an important role with respect to several of the SOs, and
since external reviewers are invited to examine the results of student
projects in (some of) these courses, starting
in Spring '13, we are experimenting with using
this rubric (based directly on one
developed by Steve Lavender of ISE) for assessing some of the important
student outcomes. Results will be collected and discussed in the Undergrad
Studies Comm.
D.2.3 Assessment of all SOs using the Exit-Survey
Prior to graduation, BS-CSE majors are required to complete an
anonymous exit survey; in fact, students complete the
exit-survey and take the POCAT in the same session. There are three
parts to the survey. The first one asks the respondent, for each
student outcome, to rank its importance on a scale of
very-unimportant/somewhat-unimportant/
somewhat-important/very-important;
and asks how strongly the respondent agreed with the statement "this
student outcome has been achieved for me personally" on a scale of
strongly-disagree/moderately-disagree/
slightly-disagree/slightly-agree/moderately-agree/strongly-agree.
In averaging the responses, we attached weights of 0%, 33%, 67%, and
100% to the four possible importance ratings; and weights of 0%, 20%,
40%, 60%, 80%, and 100% to the six possible achievement ratings.
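As a concrete illustration of this weighting, here is a minimal Java sketch;
the response counts are hypothetical, and only the weights come from the
description above.

```java
// Minimal sketch of the weighted-average computation described above.
// The weights are those given in the text; the response counts are made up.
public class ExitSurveyAverages {
    // weights for very-unimportant, somewhat-unimportant, somewhat-important, very-important
    static final double[] IMPORTANCE_WEIGHTS = {0.00, 0.33, 0.67, 1.00};
    // weights for strongly-disagree through strongly-agree
    static final double[] ACHIEVEMENT_WEIGHTS = {0.00, 0.20, 0.40, 0.60, 0.80, 1.00};

    static double weightedAverage(int[] counts, double[] weights) {
        double total = 0, weighted = 0;
        for (int i = 0; i < counts.length; i++) {
            total += counts[i];
            weighted += counts[i] * weights[i];
        }
        return 100.0 * weighted / total;   // percentage for one outcome
    }

    public static void main(String[] args) {
        // hypothetical responses for one SO:
        // 0 very-unimportant, 2 somewhat-unimportant, 10 somewhat-important, 28 very-important
        int[] importanceCounts  = {0, 2, 10, 28};
        // hypothetical counts across the six achievement levels
        int[] achievementCounts = {1, 1, 3, 5, 14, 16};
        System.out.printf("importance:  %.1f%%%n",
            weightedAverage(importanceCounts, IMPORTANCE_WEIGHTS));
        System.out.printf("achievement: %.1f%%%n",
            weightedAverage(achievementCounts, ACHIEVEMENT_WEIGHTS));
    }
}
```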
The
second part of the exit survey asks students to evaluate the quality of
faculty advising as well as the quality of staff advising. Students
are asked to consider four specific items:
- Faculty Advising with respect to course choices;
- Faculty Advising with respect to graduate school, career
options, etc;
- Staff Advising with respect to curricular issues;
- Staff Advising with respect to career options, graduate school,
university policies and procedures, referrals, etc.
Students are asked to rank the importance of each item; and the
quality of advising, as it relates to that specific item, that they
received. The third part of the survey asks students to briefly
respond to two questions. The first asks, "What single aspect of the
CSE program did you find most helpful? Explain briefly." The second
asks, "What single change in the CSE program would you most like to
see? Explain briefly." These two parts of the survey, although not
directly related to specific student outcomes, are naturally very
important to students and provide us a good lens through which to view
the program and help identify possible improvements.
Results from the last four years' surveys are available here:
Exit Survey Results
E. Continuous Improvement
The various assessment processes listed above and the evaluations of
their results in UGSC discussions, together with other assessments (such
as student performance in individual courses) and input from other ad-hoc
sources, have resulted in a number of
improvements in the program. Many of these improvements have been in
individual courses; others have been at the program level; and yet
others have been in the assessment/evaluation processes
themselves. Below we list a few of the recent improvements in each of
these categories, along with brief descriptions. In the case of the
improvements in the first two categories, we also
identify the related student outcome(s) or PEO. We also identify the
specific assessments (Undergrad Forum, Alumni Survey, Exit Survey,
POCAT, assessment using the rubrics listed above, IAB discussions,
or other
(such as student performance in specific activities in individual courses))
whose results and their evaluation led to the particular
improvement/change.
In some cases, some of the details of the
improvement are still being discussed or are yet to be implemented; such cases are marked with a "*".
In most cases, the evaluation of the assessment results and the possible
program improvements are discussed initially in UGSC meetings, followed by
actions by relevant faculty. In other cases, especially those based on
the results of assessment of student performance in specific activities in
individual courses, the evaluation and the program improvements are
initiated by individual faculty or faculty groups. In a few cases,
especially those involving the Alumni Survey, the College of Engineering's
Outcomes Committee is often part of the discussion since some of the items
on the survey apply to all programs in the college.
E.1 Improvements in Courses
- CSE 221, 222, 321: [Related SOs/PEOs: (b), (k); assessment instruments: UG Forum]
At one of the annual forums, some students
noted that they spent a
lot of time figuring out some errors they got from the C++ compiler;
given the simple nature of the underlying problem that turned out to be
the source of the errors --once they figured it out-- they felt it should not
have been so difficult to understand those errors. Most such
problems are, in fact, addressed in the on-line FAQ for the sequence.
However,
many students do not read the information in these pages. In order to
address this, the faculty responsible for these courses created
surveys to be completed by students in CSE 222 and 321 that asked the
students to respond to the following questions:
- Briefly describe one specific technical problem you encountered
and solved in CSE 221 (or, in the case of the survey for students in 321,
in CSE 222). For example, it could be a
problem you had while working on a lab assignment, or in the use of
the CSE computing environment, or any other issue that you struggled
with--yet managed to solve--and that you wish you could have known
about before you had to deal with it.
- Briefly explain the solution to the problem described above and how
you managed to find it.
- Is this problem and its solution listed on the Resolve/C++ FAQ pages
at:
http://www.cse.ohio-state.edu/sce/rcpp/FAQ/index.html?
Based on the student responses, a number of actions have
been taken:
- Improved the FAQ to provide detailed information about reading
compiler errors and the manner in which "build" and "make" work
when a project is recompiled.
- Added an "open lab" in 221 where a key goal of the lab is to
help students learn to use the information in the FAQ to understand
both compile-time and run-time errors.
- Developed a "closed lab" that requires students to
exercise their Unix/Emacs skills, including absolute and relative
paths for copying files and doing submissions; Emacs window buffers;
and Unix command line short cuts.
- Worked with the computing system staff to standardize remote
access utilities and offer a regular workshop/clinic to help students
set up their computers to enable remote access to departmental compute
servers.
- CSE 222: [Related SOs/PEOs: (c), (e), (f), (n); I; assessment instruments: other]
The first programming lab in CSE 222 addresses a number
of course learning outcomes, all of which are of a technical nature
(e.g., "Be competent with using the computing environment to complete
lab assignments" and "Be familiar with using [various software
components] to write application programs and/or component
implementations"). Student performance on this lab tends
to be all over the map. Analysis shows that, in a typical section of
CSE 222, fewer than a third of the students submit solutions that meet
the stated requirements, about half submit solutions that compile and
execute but fail to meet all the requirements, and the rest either
submit code that does not compile or do not submit a solution at all.
In order to confront the problems faced by the middle portion -- who
previously seemed to throw up their hands and simply move on -- we
have instituted a "revise and resubmit" policy for this lab.
This is similar to what might happen for an English
composition assignment, except that students are not told about the
opportunity to revise and resubmit until after their initial
submissions have been graded and returned. The graders for this first
lab now use a rather harsh grading rubric, so even a
seemingly minor deviation from a stated requirement results in a score
of 80% or less, leaving many students who thought they had done
"pretty well" believing they should have received more partial
credit. We therefore give them an opportunity (on this lab only) to
consider the grader's feedback and to resubmit a revised solution a
short time later. We use this occasion to discuss in class why "good
enough" is not good enough for software, how failure to understand and
meet all customer requirements may in some cases result in loss of
life, limb, money, or mission-critical opportunities for their
clients. This change in the course has been positive in the sense
that the number of submissions in the middle group on subsequent CSE
222 lab assignments has been noticeably smaller.
This experience also has resulted in changes to how we have written
the course learning outcomes for our first semester course in this
area, Software I (CSE 2221). Rather than being primarily technical in
scope, the first few outcomes of Software I deal with general
principles of software engineering that students can learn to
appreciate and apply early in the program (e.g., "Be familiar with the
reasons it is important that software be 'correct', i.e., why 'good
enough' is not good enough when it comes to software quality").
- CSE 321: [Related SOs/PEOs: (a), (e), (m); I, II; assessment instruments: POCAT]
One of the main goals of the 221-222-321 sequence
is to ensure that students are able to use simple formal logic
assertions involving mathematical set models to understand and reason
about an operation's behavior. Students typically take CSE
321 and Math 366, the first discrete math
course, during the same quarter. A major outcome of Math 366 is
developing facility with manipulating logical assertions.
Our expectation is that
the activities in Math 366 would help strengthen the lessons of the
221-sequence. We discovered, however, via a POCAT question that, in
fact, this was not happening. Students do not seem to transfer ideas
from one context to another even when they are closely related.
Here are some details. In the 221-sequence, in order to keep logic
assertions easily machine readable and writable, the notation used to write
them spells out operators as, for example,
"there exists" and "union"; whereas in Math 366,
traditional math symbols are used (a brief illustration appears at the end
of this item).
The faculty involved with the 221-sequence had designed a POCAT
question that tested students' ability to manipulate simple logic
assertions involving properties of small sets of numbers. When the
question was typeset for the POCAT, the person doing so, not being
regularly involved
with the 221-sequence, replaced the spelled out operators in the
assertions that appeared in the question with their
traditional math symbols. When UGSC discussed the test
results after the test had been administered, there was concern that
the performance of students on the question was much poorer than
expected; only about 35% of the students had answered the question
correctly as against the expected 70%. Various possible explanations
were discussed until the faculty member who had suggested the question
for inclusion in POCAT noticed the difference in notation used. He then
hypothesized that the poor performance was at least partly due to this
difference in notation; and that although students had seen and used
the traditional math notation in Math 366,
their not having used that notation in the context of
specifying/reasoning about program behavior likely had a major impact
on their performance.
And indeed, when the question was rewritten to use the 221-notation in
the next POCAT,
student performance was close to 70%.
To further clinch this issue, in the following POCAT, students
taking the test were split into two groups (as it turned out, over 60
students took the test that quarter, as against the
usual 25, making the split feasible).
One group was
given the question using the 221-notation, the other
using the traditional math notation. And, indeed, the results
were as predicted: the performance of the second group
was much poorer than that of the first group.
This suggested a possible improvement in the sequence. The point is
that students, by the time they graduate (which is close to when they
take the POCAT) and become computing professionals, need to be able to
go back and forth between the traditional math notation and other
notations such as that used in the 221-sequence. Otherwise, their
ability to identify, formulate, and solve engineering problems, and,
especially, their ability to understand others' formulations of
engineering problems, will be limited. Hence we have been
experimenting with a new activity in 321. In this activity, the
instructor engages students in translating specifications of
reasonably simple behaviors written in one notation to the other, in
both directions; and assigns exercises in which students perform the
translation as well. When students in a section of the course that
included this activity, with the idea discussed explicitly in lecture,
were tested on this type of problem, their performance did not depend
on the notation used; in another section of the course, in which the
relation between the two notations was only mentioned in passing in
class and students did not practice actual translation from one
notation to the other, the performance was poorer (while still being
better than in POCAT). We are currently considering how best to ensure
that not only are students in 321 able to go back and forth between these
notations but that they retain the abilities over the long term.
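For illustration, here is a hypothetical example of the same simple assertion
written in the two styles (the exact syntax in the course materials may differ
in detail):
221-sequence style: there exists x: integer such that (x is in s union t and x > 10)
Traditional math style: ∃x ∈ ℤ: (x ∈ s ∪ t) ∧ (x > 10)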
- Pointers in Systems I course*: [Related SOs/PEOs: (c), (e), (k), (n); I; assessment instrument: Exit Survey]
(Note: As mentioned earlier, items marked with "*"
are still being discussed or are in the process of being implemented.)
One observation that faculty teaching courses such as CSE 660 and
CSE 677 have made over the years is that students' facility with pointers
is somewhat lacking. A number of graduating students have also noted,
in comments made as part of their exit surveys, that while the program
does a good job of helping develop students' high-level/abstract skills,
many students are not comfortable working at low-/systems-level,
specifically when working with pointers.
Hence in designing CSE 2421, Systems I, the semester replacement for
CSE 360, faculty decided to include, as a key component, material that
would directly address this weakness. Indeed, the title of the new course
("Low-level programming and computer organization") reflects this. The
first part of this course will help develop students' abilities with C
programming, including pointer manipulation. Since the other part of the
course is computer organization, there will be a natural relation between
the two parts and students should be able to not only develop their
pointer-based programming skills but also their understanding of the
relation to machine-level considerations. As we transition to semesters,
we will try to track student performance to see if there is a noticeable
difference in such courses as CSE 3461 (the replacement for CSE 677)
depending on whether the student took CSE 360 or the new Systems I course.
- CSE 321: [Related SOs/PEOs: (a, c, e, k, l, n); I; assessment instruments: other]
A couple of years ago, the 221-222-321 sequence was reorganized to
free up the last 2-3 weeks of CSE 321 so that students could engage
in the design and implementation from scratch of a new abstract
component that they use in an interesting case study (to generate
tag-cloud web pages). The design of the new abstract
component is done through several homeworks and in-class activities to
provide incremental and timely feedback and allow students to complete
the design in a reasonable amount of time.
This change was prompted by feedback
from students and instructors in CSE 560. The feedback showed
that the gap between 321 and 560 was substantial
enough to create real issues for students in 560. One key concern was
the lack of experience in designing new components from scratch. The new
activity in 321 seems to have helped address this problem and also
enables students, near the end of the CSE 221-222-321 sequence, to look
back over the sequence and gain a summary understanding of the
component-based approach to software design that the sequence is focused on.
- CSE 459.24, 459.xx*:
[Related SOs/PEOs: (i, k); I; assessment instruments: Exit Survey]
[This item is still being discussed/in the process of being implemented.]
One of the points that comes up frequently in the
Exit Survey (as well as in the Alumni Survey) is that while the
program provides students with excellent grounding in foundational
ideas, it is somewhat lacking when it comes to helping students
acquire experience with the most recent software tools.
The CSE 459 courses are meant to address this concern, enabling
students to explore a programming language that they may not otherwise
encounter in the curriculum. One of the recent languages that has
acquired importance in industry is
C#. Hence CSE 459.24, which allows students to develop some skills in the
language, was developed about three years ago.
It has been offered regularly since then. (This
is related not just to outcome (k) but also (i)
since it encourages students to explore, on their own, the
various toolkits and libraries etc. that have developed around the
language.)
Another 459, this one on Python, is currently
being planned. While this language has been around for some time, its
importance in practical applications such as building GUIs has gone up
significantly recently, in part because of the numerous, powerful
Python libraries that have become available. In recent UG
forums and Exit Surveys, students have expressed
interest in learning Python. Recently, a senior graduate student (who
has used Python extensively in building systems for his research)
teamed up with a faculty member to propose developing a Python course
in the 459 series. After discussion in the Curriculum Committee, the
proposal has been approved and the course will be offered in Winter
'12; assuming that the pilot offering goes as well as it is expected
to, a semester version of the course will be developed and offered
starting in the 2012-'13 year.
- CSE 541:
[Related SOs/PEOs: (a, e); I; assessment instruments: other]
Many students struggle through this course on
numerical analysis. The problem is that they have not adequately
understood the necessary ideas from the prerequisite courses on
calculus, nor developed adequate skills for working with
typical real-valued functions, and this is reflected in their poor
performance in 541. To address this,
faculty made
the following changes: i) introduced additional lectures at the start of the course
reviewing relevant parts of calculus and introducing relevant material
from basic linear algebra; ii) made room for these topics by omitting
lower-priority topics
(such as Lagrange interpolation). Student performance in the course
has improved following these changes.
- CSE 551:
[Related SOs/PEOs: (k, n); I; assessment instruments: UG Forum]
In a
recent offering of 551, one of the main assignments required
the student to conduct a broad survey of the field and write a 3-5
page paper and make a brief in-class presentation reporting the
findings. Alternatively, students could build a
simple application related to information security for a mobile
handset. Microsoft Research had loaned us several Windows
Phone 7 phones; and students who chose this latter option implemented
their application on these phones.
At a recent UG Forum, a couple of students
suggested that focusing on one platform in this manner was not
appropriate. In response to this, the instructor allowed students in
the class to choose to develop the application on their own handset on
any platform they preferred. In addition, the instructor is exploring
the possibility of acquiring a few additional handsets on alternative
platforms so that students in future offerings of the course will be
able to develop on non-Windows platforms even if they do not own an
appropriate handset.
- CSE 601*:
[Related SOs/PEOs: (f); III; assessment instruments: POCAT]
[This item is still being discussed/in the process of being implemented.]
One of the questions on recent POCATs was the following:
Conflict of interest is a common ethical issue in business. Which of
the following has the most potential for being a conflict of interest?
- Recommending your company buy the same kind of computer you have at
home;
- Recommending your company buy computers from your brother-in-law;
- Buying a computer for home use from the same supplier your company
uses;
- Asking questions of the computer support people at work about
programs you're using at home.
Almost 90% of the students picked the correct answer, i.e., (2).
But in the UGSC discussion of the results,
the point was made that the question was too simple
and did not adequately address
outcome (f). For example, one common difficulty
students have is distinguishing between legal
questions and ethical questions, or between obligation to an employer
and obligation toward society. Following the
discussion, the faculty involved came up with a new
question:
Your brother/sister is very ill and needs medication you cannot
afford, so you steal it. Which of the following is the ethical
question that arises in this scenario?
- What kind of illness does your sibling have and how do we prevent
its spread among populations?
- Why is the medication not affordable and how do we make it more
affordable?
- Is it ever right to steal, even if you have a great need?
- What consequences might you face if you were caught stealing the
medication?
- All of the above are ethical questions.
- All of the above are important but none of them is an ethical
question.
- I cannot decide based on the given information.
Only about 70% of the students picked (3), the right answer.
In response, faculty are currently
considering adding a lecture at the start of the course that
explicitly considers the distinctions of the kind described above.
Faculty are also analyzing student answers more carefully to determine which
of the various distractors were most popular, and hence what items to
focus on in this lecture.
- CSE 655:
[Related SOs/PEOs: (c, e, k); I; assessment instruments: POCAT]
A common problem
that programs, especially large ones, exhibit has to do with
uninitialized variables. Different programming languages help
address the problem in different ways.
First, the
language could be so designed that when a variable is defined, it is
automatically assigned some appropriate default value. Or
the syntax of the language could prevent the programmer from
introducing a new variable without specifying an
initial value for it. Or the compiler could
analyze each program to check that each variable has been
initialized before it is used. A fourth approach would be to have the
compiler
insert, into the compiled code, additional checks that make sure (at
runtime) that
each variable that is used has a value that was actually assigned to
it. The last approach would be to do nothing and expect the
programmer not to make the mistake of using a variable without
initializing it; in this case, if the program does have an
uninitialized variable, the program will probably crash when the
compiled code is actually executed.
Each of these approaches has advantages and disadvantages. For
example, the first approach may mask a bug in the program because the
programmer may have meant to but forgot to assign a specific initial
value to a variable; since the system provides a default initial
value, the program will run but may compute a wrong result and the
programmer may not realize that. The third approach (the compiler
detecting the problem at compile time) is attractive, but it cannot be
implemented exactly because of conditional and loop structures that depend on
runtime values. In other words, the compiler cannot tell
exactly which parts of
the program will be executed before which other parts. But it can do
an approximate analysis and arrive at a conservative evaluation
that flags some uses of certain variables as questionable because
it is not able to conclusively establish that, in all cases during
program execution, the variable in question will be initialized
before being used. Java uses this approach (a small illustration
appears at the end of this item). C++ uses
the fifth approach (leave it to the programmer); Resolve-C++, a
local dialect of C++ that is used in the 221-sequence,
uses the first approach.
The topic is discussed in some depth in CSE 655 but for some
students the essential nature of the problem and its
possible solutions tend to remain unclear. Here is a related POCAT question:
One common problem in programs is that of uninitialized variables,
i.e., using a variable without having initialized it. This is
commonly a run-time error but Java flags this error at compile time.
How does it do this?
- Java uses a special technology that converts
run-time errors into compile-time errors;
- Java uses a "conservative"
approach, sometimes flagging situations which are not actually
erroneous;
- Java does automatic initialization of all variables so the
problem of uninitialized variables cannot arise in Java programs;
- Java is an interpreted language, so this question is meaningless;
- I have no idea.
When the question was tried a couple of years ago, faculty
expected 70% or so of the students to get the correct
answer. In fact, the number of students who picked the right answer
was substantially less. While some of the students seem to have chosen
an answer (such as (3)) that would indicate not having knowledge of
some Java details, many others chose answers (such as (1)) that
indicated failure to have a sufficiently good grasp of this important
concept. Indeed, someone with a good understanding of the concept
should, even if she had not heard of Java before, be able to choose
(2) as the most likely answer. Based on this, the faculty revised the
discussion in CSE 655 to include a more detailed discussion of the
topic. The performance of students in recent offerings of POCAT on
this (and similar) questions has been substantially better.
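For illustration, here is a small self-contained Java example of the
conservative compile-time analysis discussed above (our own toy example, not
taken from the course or the POCAT):

```java
// Demonstrates Java's conservative "definite assignment" analysis for
// possibly-uninitialized local variables.
public class DefiniteAssignmentDemo {
    // Compiles: x is assigned on every path before it is read.
    static int initializedOnAllPaths(boolean flag) {
        int x;
        if (flag) {
            x = 1;
        } else {
            x = 0;
        }
        return x;
    }

    // The commented-out method below is rejected by javac with
    // "variable y might not have been initialized", even though at run time
    // y would always be assigned before it is read (both ifs test the same
    // condition). The compiler's analysis is deliberately conservative: it
    // does not track the relationship between the two conditions.
    //
    // static int rejectedByJavac(boolean flag) {
    //     int y;
    //     if (flag) { y = 1; }
    //     if (flag) { return y; }   // flagged here
    //     return 0;
    // }

    public static void main(String[] args) {
        System.out.println(initializedOnAllPaths(true));   // prints 1
    }
}
```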
- CSE 670*:
[Related SOs/PEOs: (a, e, m); I; assessment instruments: POCAT]
[This item is still being discussed/in the process of being implemented.]
Relational schemas and, in particular, the notion of a
primary key for a given schema, are conceptually important
topics discussed in CSE 670, the required course on databases. In order
to see how well students are able to work with this notion at the time
of their graduation, faculty designed the following POCAT question:
Consider the relational schema R(A,B,C,D,E,F) with the functional
dependencies:
{BC --> ADEF, B --> DF, D --> EB}
Which of the
following could be the primary key for R:
- {A,B}
- {B}
- {C}
- {C, D}
- {B,C,D}
- None of the above
- I have no idea
(1) is incorrect because although using B we can get D and F and then
get E, there is no way to get to C. (2) is incorrect because we can't
get to either C or A. (3) is incorrect because using C we can't reach
any of the others. (4) is the right answer: using D, we can get E and B,
and then using B and C, we can get to the rest. (5) does let us get to
all the rest but is not minimal (since we can omit B).
Surprisingly few students (less than 10%) got the correct answer on the
POCAT. After some discussion in UGSC, it was decided that we should try
a simpler version of the question.
Consider the relational schema R(A,B,C,D) with the functional
dependencies:
{A --> BCD, BC --> AD, B --> D, D --> B}
Which of the following could be the primary key for R:
- {A,B}
- {B}
- {C}
- {C, D}
- {B,C,D}
- None of the above
- I have no idea
Again the right answer is (4); but again very few students got it.
Following further discussion in UGSC, it was decided to request one of the
670 instructors to try the question in his final exam. The Sp '11
instructor for the course did so, assigning the first version of the
question to half the students in the class and the second version to
the other half. To his surprise, the performance of the students was
as poor as in POCAT. Overall, less than 10% of the students (about the
same figure as in POCAT) got the correct answer for either form of the
question.
The preliminary conjecture is that students do not understand the
minimality requirement of primary keys, since several students
chose answer (5). We plan to look into this further and see how to
revise the course to address the problem. (A sketch of the standard
attribute-closure check, which makes this kind of reasoning mechanical,
appears below.)
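The reasoning in the two explanations above can be checked mechanically
using the standard attribute-closure algorithm: a set of attributes is a
key exactly when its closure under the given FDs is all of R, and it is a
candidate (primary) key only if no proper subset has that property. The
sketch below is ours, for illustration only; it is not part of the course
materials or the POCAT.

    import java.util.*;

    public class ClosureDemo {
        // A functional dependency: lhs -> rhs.
        record FD(Set<Character> lhs, Set<Character> rhs) {}

        // Repeatedly apply FDs whose left-hand side is already covered.
        static Set<Character> closure(Set<Character> attrs, List<FD> fds) {
            Set<Character> result = new HashSet<>(attrs);
            boolean changed = true;
            while (changed) {
                changed = false;
                for (FD fd : fds) {
                    if (result.containsAll(fd.lhs) && !result.containsAll(fd.rhs)) {
                        result.addAll(fd.rhs);
                        changed = true;
                    }
                }
            }
            return result;
        }

        public static void main(String[] args) {
            // FDs for R(A,B,C,D,E,F): BC -> ADEF, B -> DF, D -> EB
            List<FD> fds = List.of(
                new FD(Set.of('B', 'C'), Set.of('A', 'D', 'E', 'F')),
                new FD(Set.of('B'), Set.of('D', 'F')),
                new FD(Set.of('D'), Set.of('E', 'B')));
            System.out.println(closure(Set.of('C', 'D'), fds)); // all six attributes: {C,D} is a key
            System.out.println(closure(Set.of('D'), fds));      // {B,D,E,F}: no A or C, so {D} is not
            System.out.println(closure(Set.of('C'), fds));      // just {C}, so {C} is not
        }
    }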
- CSE 680*:
[Related SOs/PEOs: (a, b, e, l, m); II; assessment instruments: POCAT]
[This item is still being discussed/in the process of being implemented.]
One of the topics in CSE 680, the required course on algorithms,
is solving recurrence relations.
These relations can be used to express the running time of certain
algorithms: in effect, the running time of the algorithm for an input of a
certain size is expressed in terms of the running time for a smaller
input, which in turn is expressed in terms of the running time for a still
smaller input, and so on. But getting a good feel for the actual running time of such an
algorithm for large inputs requires us to "solve" the
relation to obtain the asymptotic behavior of the algorithm. In
general, this is a difficult task but if
certain conditions are satisfied, the Master
theorem can simplify it considerably. This can be important in
certain situations such as when dealing with algorithms
designed to search through very large volumes of data since the
difference in running time between different algorithms for the task
can be very substantial.
Hence the faculty involved with 680 designed a POCAT question
intended to see if students are able to solve (reasonably simple)
recurrence relations. The performance of the students who took the
test was unexpectedly poor. In the evaluation discussion analyzing the
test results, one explanation offered was that students were,
in fact, capable of using the Master theorem to solve the relation
--the relation in the POCAT question being one that satisfied the
conditions that allow the Master theorem to be applied-- but that,
because of the complex nature of those conditions, students could not
be expected to remember them when taking the POCAT. Indeed,
this seemed to be confirmed when the CSE 680 instructor asked a
similar question as part of his final examination for the course. A
large majority of the students in the course answered the final exam
question correctly.
But we decided to test this further. In the next offering
of POCAT, the question was revised to include a statement of the
theorem. With this, we felt students
should be able to check that the needed conditions were satisfied in
the given scenario and solve the specified recurrence relation. But,
in fact, student performance was no better. This is a puzzle
and one that has not yet been resolved. Why did the students in the
course final examination do so well when students taking the POCAT did
so poorly even when they were provided an explanation of the
Master theorem? Further fine-tuning of the question in future offerings
of the POCAT will, we hope, help address the question and tell us whether and
what changes in the course are needed.
It may also be worth noting that, in practice, one
would expect a CSE professional to look up, perhaps on-line, the
details of the theorem rather than necessarily remember them. Thus, if
the faculty's original explanation, that students taking the POCAT
simply did not remember the theorem, had turned out to be correct, we
would have concluded that no change in the course was called for. But
that explanation did not hold up. (For reference, the standard statement
of the theorem and a simple worked instance are given below.)
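For reference only, here is the standard textbook (CLRS-style) statement of
the Master theorem together with one worked instance; this is not the exact
wording used in the POCAT question or in CSE 680.

    % Standard statement; the exact form used in the course may differ.
    For $T(n) = a\,T(n/b) + f(n)$, with constants $a \ge 1$ and $b > 1$:
    \begin{enumerate}
      \item if $f(n) = O(n^{\log_b a - \epsilon})$ for some $\epsilon > 0$,
            then $T(n) = \Theta(n^{\log_b a})$;
      \item if $f(n) = \Theta(n^{\log_b a})$,
            then $T(n) = \Theta(n^{\log_b a} \log n)$;
      \item if $f(n) = \Omega(n^{\log_b a + \epsilon})$ for some $\epsilon > 0$, and
            $a\,f(n/b) \le c\,f(n)$ for some $c < 1$ and all sufficiently large $n$,
            then $T(n) = \Theta(f(n))$.
    \end{enumerate}
    A simple worked instance: $T(n) = 2\,T(n/2) + n$ (e.g., mergesort) has
    $a = b = 2$, so $n^{\log_b a} = n$; case 2 applies and $T(n) = \Theta(n \log n)$.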
- CSE 682:
[Related SOs/PEOs: (d); I; assessment instruments: teamwork rubric]
Based on the results of the peer-evaluations (using the teamwork rubric)
in a recent offering of CSE 682, it is clear that some students have
problems being good team members. The evaluations provided three example
sets of comments by others in the group about individuals who did not
function well on the team. In one case, comments made by the student
about himself show how out of touch he was with the rest of the group.
The instructor is planning to put these on slides and make a mini-lecture
out of them for use in future offerings of the course. In the past, he
has talked in class about good team participation, but having such
specific real examples (with the names of the individuals elided, of
course) should make the point much clearer.
- CSE 786:
[Related SOs/PEOs: (d, i, k); I; assessment instruments: teamwork rubric, life-long learning paper rubric]
Each capstone design course has required
students to explore a new tool, technology, or process and write a
three- or four-page paper on it. However, the coordinator for CSE 786 (the
capstone course on game design) felt
that this requirement was too distracting for students
since it took too much of their focus away from the capstone design
project. Hence he came up with an alternative approach to
achieving the outcome via an activity related to the design project,
one that, as an added bonus, also engages students in another team
activity. The approach is to organize students into
"technology teams". These teams are "orthogonal" to the
project teams; i.e., typically each technology team of, say, 5
students, will consist of one student from each of five different
project teams. Each technology team will focus on a particular
relevant (to the area of the particular course) topic,
for example,
"sound production" or "physics of games" (these being relevant to
the topic of CSE 786); each student in the team is
responsible for one aspect of the particular topic and is expected
to research that aspect. The team then puts together appropriate
resources and/or documentation that summarizes its findings and makes a
presentation to the entire class, with each student taking the lead
for the particular aspect that he/she was responsible for. The
document becomes a resource that is available to the entire class and
helps address (or provides pointers to resources that may address)
questions related to the particular topic that different project teams
may encounter in their projects.
Thus the approach helps students improve their lifelong
learning skills; and
since the student teams are required to document and present their
findings, it helps sharpen their communication
skills as well. Further, since the research and, especially, the creation
of the document and the presentation are team activities, and since the
technology team is distinct from the student's project team (thus
requiring the student to interact closely with another group of
students), the approach contributes to improving students' team
skills. Moreover, since the topic is directly relevant to the design
projects that the various project teams in the course are engaged in,
the interaction between the technology team and the class during the
presentation is likely to be more engaging and of greater depth.
Other capstone design courses (such as CSE 682) are adopting the
approach in their own courses.
- CSE 2221, 2231 (the semester replacements for CSE 221-222-321):
[Related SOs/PEOs: (m, k); I; assessment instruments: Alumni and Exit Surveys]
One comment that we have often seen in the
Exit and Alumni Surveys is that while students and graduates appreciate
the software
engineering discipline that the CSE 221 sequence teaches, many are
dissatisfied with the use of RESOLVE/C++ in the sequence. Many
would prefer to learn the concepts we teach in this sequence
with a different programming language as the delivery mechanism, a
language used more commonly in the industry positions they are
most likely to secure for internships and after graduation. Hence, in
the semester version of the sequence, the conceptual focus of the
introductory software courses will remain, but we will use Java
as the programming language and focus more attention on current
industry best practices, much as we do now in CSE 421.
- CSE 3901, 3902 (semester replacements for CSE 560):
[Related SOs/PEOs: (c, e, k, l, n); I; assessment instruments: Alumni and Exit Surveys]
One of the comments we consistently see
in the Exit and Alumni Surveys is that CSE 560, the current junior-level project
course, is extremely valuable since it helps students develop
teamwork as well as communication skills while, at the same time,
engaging them in a challenging design and development task. The one
negative comment that we see concerns the domain
of the task, i.e., system software (assembler, linker, loader,
simulator). While an understanding of such software is important,
especially for those who want to work, for example, in embedded
systems development and the like, most students feel that it should
not be the topic of the project in 560.
In designing the semester program, we have, therefore, developed two
courses, CSE 3901 and 3902, each of which can be used to meet the
junior-level project course requirement. The project in CSE 3901 is
concerned with development of a web application with the focus on both
client-side and server-side scripting. The project in CSE 3902 is
concerned with development of a 2-d interactive game with the focus on
both 2-d graphics and rendering as well as on event-based programming.
Each course will, as does the current 560, engage students in
intensive teamwork as well as documentation/communication.
E.2 Program-level Improvements
- NEWPATH:
[Related SOs/PEOs: (d, g, k); I; assessment instruments: Alumni and Exit Surveys]
Many BS-CSE majors have a strong interest in
entrepreneurship and dream of being successful entrepreneurs.
Indeed, many of
these students gravitated to CSE because of their interest in
entrepreneurship. The university does offer, in the Fisher College of
Business, a minor program in the topic and a number of students do
complete that program. However, several comments over the years in
alumni surveys and exit surveys have suggested that a more focused
program, one that concerns not entrepreneurship in general but
IT-entrepreneurship would be of value. Partly in response to this, and
partly because NSF funding made it possible, the NEWPATH program was
created four years ago with the main goal of educating,
training and nurturing highly motivated students to become IT
entrepreneurs. The most ambitious students in the program are
expected, by the time of graduation, to be running their own IT
startups. The program has a number of key components: internships in
local IT startups, arranged in collaboration with TechColumbus, a
state-supported non-profit that serves as an incubator for high-tech
startups; an on-going weekly seminar session that provides a forum in
which students can learn from each other and from NEWPATH faculty
members, can brainstorm ideas that may serve as the basis for
startups, can hear from other students about their internship and
e-practicum experiences, can engage in case-studies of successful and
unsuccessful IT startups to identify best (and not so good) practices,
and can hear from CEOs and other senior people from local IT startups;
and a two-quarter-long entrepreneurship practicum. This last component is
designed to provide an in-depth practical experience in IT
entrepreneurship to NEWPATH students, giving them the
opportunity to take an idea from concept to the brink of
commercialization. The program has proven popular with a select group
of motivated BS-CSE majors as well as majors from ECE, Business,
etc. The availability of this activity contributes to several of the
student outcomes including (d), ability to function on
multidisciplinary teams; (g), ability to communicate effectively with a
range of audiences; and (k), ability to use techniques and skills
(including, especially, entrepreneurial skills) to succeed as a CSE
professional and entrepreneur.
-
GET:
[Related SOs/PEOs: I; assessment instruments: Alumni and Exit Surveys, IAB meetings]
Another common refrain in the Alumni Survey over the years
has been the importance of providing opportunities for students to
gain, via suitable internships, relevant work experience while they
are still in the program. Comments at some IAB
meetings have echoed this. Many of our students, out of
necessity, work at least part-time but that work is often unrelated to
CSE and does not address this need. The Global Enterprise Technology
(GET) Immersion Experience is designed to do so. It is an "immersive
internship program" for BS-CSE (and BS-CIS) majors, designed in
cooperation with JP Morgan Chase and Syracuse University. Students
admitted to the program engage in a 2-quarter+summer long on-site
internship at JP Morgan Chase or other companies, while being fully
enrolled in classes that are tightly woven into this internship.
Students who go through the program gain valuable,
relevant work experience, and can also expect to receive permanent
job offers from JP Morgan Chase and associated companies when they
complete their degrees. The program is in its pilot stage and has
already attracted considerable student interest. This improvement
directly contributes to preparing students to achieve the first
part of program objective (I), i.e., graduates of the program will be
employed in the computing profession, ...
- Ethics Course:
[Related SOs/PEOs: (f); III; assessment instruments: Alumni Surveys]
The alumni survey, as well as informal
interactions with alumni and employers, has made clear the need for
students (engineering students in general, not just BS-CSE majors) to
receive training in topics related to engineering ethics in a
full-fledged course. In response to this input, the College of
Engineering's Core Committee worked to achieve two
things: replace one of the general education courses by a course
devoted to this topic; and have various departments around the
university develop suitable courses that engineering students could
take. The committee defined the set of learning outcomes that any
course in this category should achieve:
- An ability to explain the ways in which society regulates the use of
technology;
- An ability to identify stakeholders in an engineering solution;
- An ability to identify moral problems and dilemmas;
- An ability to analyze moral problems from different ethical perspectives;
- An ability to identify the personal values that an individual
holds and uses to resolve moral problems and dilemmas;
- An ability to describe the relation between personal values,
societal values, and professional values.
Four different courses have been developed and approved for this
category, with the majority of students taking Philosophy 131.01,
Introduction to Engineering Ethics. This course stresses the application
of professional ethical codes to specific cases. The focus is on the
National Society of Professional Engineers' (NSPE) ethical code, but the course
also looks at the professional codes of other professional engineering
organizations. In addition, the course briefly surveys some of the major
ethical theories that have been proposed and discusses the general
relationship between advancing technology and society's ethical
standards.
The addition of the engineering ethics course to the curricula of
BS-CSE majors who entered the university in the last few years is an
important improvement that contributes to outcome (f), in particular
students' understanding of professional, ethical, legal, and social
issues and responsibilities.
- Change in PEO II, III*:
[Related SOs/PEOs: II, III; assessment instruments: IAB input]
[This item is still being discussed/in the process of being implemented.]
During its May 2011 meeting, the Industrial Advisory Board, as part of its
periodic discussion of the BS-CSE program, suggested a couple of
(modest) revisions to the current PEOs. The first suggestion was to
revise PEO (III), which currently reads, "Graduates will be informed
and involved members of their communities, and responsible engineering
and computing professionals", to refer to professional
societies (such as ACM) rather than, or possibly in addition to, "communities". The second
suggestion was to consider dropping the word "computing" from PEO
(II) which currently reads, "Graduates with an interest in, and
aptitude for, advanced studies in computing will have completed, or be
actively pursuing, graduate studies in computing", to account for
graduates who may pursue, for example, an MBA. We will discuss these
suggestions in early Autumn 2011 in the UGSC
for possible action.
E.3 Improvements in Assessment/Evaluation Processes
- Quality of POCAT questions*:
[This item is still being discussed/in the process of being implemented.]
In the earlier discussion of CSE
601, we noted that the original POCAT question
was too simple and that this was the reason for the
relatively high scores that students received. But this raises a
more general question: is there a way to evaluate the quality of
POCAT questions? One obvious measure is
the difficulty of the question, i.e., the percentage of students
who choose the right answer.
But there is a more nuanced measure that we have been considering, one
that is based on a similar measure used elsewhere.
The idea is that since wrong answers in a POCAT question are not simply
arbitrary wrong answers but are supposed to represent common
misconceptions that students harbor, for each answer X of a
question, one can define its discrimination, disc, as
follows:
disc(X) = (Xt / Nt) - (Xb / Nb)
where Xt is the number of students from the top quartile who
selected the answer X, Nt is the number of students in the top quartile,
Xb is the number of students from the bottom quartile who selected X,
and Nb is the number of students in the bottom quartile.
One expects a high-quality question to have a
correct response with a large positive disc value.
Similarly, a good distractor (i.e., incorrect response)
for such a question would have a large negative disc value.
Thus disc provides a good way to evaluate the quality of a POCAT
question not just in terms of its difficulty but also with respect to
the quality of the distractors in it. This is important since good
distractors allow us to identify common misconceptions that students
harbor, which in turn allows us to come up with improvements in the
courses to help overcome those misconceptions. Hence we are revising
the automated tool that generates the summary results so that it
computes all the disc values and highlights those that are out of line.
(A minimal sketch of the computation appears at the end of this item.)
Having these values (computed by hand) available during the UGSC
discussion of the POCAT results has already proven very useful in
helping to guide the evaluation.
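The following is a minimal sketch of the computation the revised tool is
intended to perform; the response counts and the flagging threshold used
here are made up purely for illustration and are not actual POCAT data.

    public class DiscDemo {
        // disc(X) = Xt/Nt - Xb/Nb for a single answer X.
        static double disc(int xTop, int nTop, int xBottom, int nBottom) {
            return (double) xTop / nTop - (double) xBottom / nBottom;
        }

        public static void main(String[] args) {
            int nTop = 25, nBottom = 25;            // hypothetical quartile sizes
            int[] topCounts    = {18,  3, 2, 2};    // hypothetical counts for answers (1)-(4),
            int[] bottomCounts = { 5, 10, 7, 3};    // top and bottom quartiles respectively
            for (int i = 0; i < topCounts.length; i++) {
                double d = disc(topCounts[i], nTop, bottomCounts[i], nBottom);
                // Flag answers whose disc value is "out of line" (threshold arbitrary here).
                String flag = Math.abs(d) < 0.15 ? "  <-- weak discriminator" : "";
                System.out.printf("answer (%d): disc = %+.2f%s%n", i + 1, d, flag);
            }
        }
    }

With data of this kind, the correct answer should show a large positive
disc value, good distractors clearly negative ones, and answers with disc
values near zero would be candidates for replacement.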
- Evaluation and documentation of POCAT results:
Each quarter, one key activity for UGSC is arranging the POCAT and
creating the test that will be used. In the first or second week of
the quarter, committee members discuss this briefly, decide a date for
the test, and discuss the questions that should be used on the test,
based on the results from previous tests. The summary results of the
POCAT are compiled within a week of the test administration. At the
following UGSC meeting, the results are evaluated and the conclusions
summarized in the minutes of the meeting. While this approach has
worked,
the evaluation is buried in the UGSC minutes. Hence, when subsequent
POCAT results are discussed, faculty's recollection of the previous
evaluations tends to be somewhat hazy.
In order to address this, we have now adopted the following approach.
For each question for which the results were unexpected or otherwise
led to discussions in the Undergraduate Committee (and beyond),
summaries of the discussion are written up. The summaries are similar
to the ones above (for CSE 321, 601, 655, 670, etc.). The summaries are
maintained in a single web page, in reverse chronological order, with
the summaries for each POCAT
organized in their own section. Each of these sections also
contains a link to the actual question used in that particular POCAT. In effect,
over time, the page provides a historical view of the changes that
were made to the program and the rationale, in terms of the assessment
results that triggered them and the summary evaluations of the
results, behind the changes. Thus, for example, a new faculty member
in the department can read through this page and get an excellent view
of the evolution of the program and the reasons behind important
changes in the program. (This page is protected since it contains links
to banks of POCAT questions as well as the tests.)
- Rubric for interactive grading in CSE 560*:
[This item is still being discussed/in the process of being implemented.]
A major activity in CSE 560 is the quarter-long team project, which
requires student teams (typically of 4 or 5 students each) to design,
implement, test, and document a system project (consisting of an
assembler, loader, simulator, etc.). One key component of the
assessment of student work in the course is a set of interactive
grading sessions in which student teams have to demonstrate parts
of their system to the instructor and the grader and answer questions
about various aspects of the system and their design.
One of the difficulties of any team project of this kind, especially
at this stage in the students' curriculum, is ensuring that each
member of the team contributes appropriately to each important piece
of the effort. For example, it is not uncommon for one student in a
team, someone who has been programming for a long time, the proverbial
"hacker", to take over all implementation aspects of the project. The
interactive grading sessions often reveal such problems. But students
may not always know what to expect of these sessions or their
purpose. Our plan is to develop a grading rubric that will be used
during these sessions and make the rubric available to the
students at the start of the quarter so that they can see what
potential problems to look out for in their own teams. Thus the idea
is not just to make the grading task more clearly defined but to help
students recognize, as they are working on their projects, common
problems that teamwork engenders and take corrective action in time.
Proposed rubric for 560 (to be developed) (outcomes (c), (d), (e), (k), (n))
More recent improvements (added starting March 2014):
- Au '15: See http://web.cse.ohio-state.edu/~neelam/ugsc/minutes/1516index.html#oct2, item (2), for
an interesting example of how POCAT can point to all kinds of problems!
- See same minutes for another (non-POCAT-based) improvement related to
5xy9.
- Su '15: Suggestion from Jeff Jones about creating a repository of
course materials:
Jones, Jeffrey S. writes:
] Hi Neelam:
] Do you want us to send you the course presentation materials we develop
] on any sort of
] regular basis? Do we maintain any sort of department repository of
] this type of material?
Neelam's response:
What do you mean by "course presentation materials"? Do you mean things
like Powerpoint slides? Something else?
Anyway, we don't maintain anything like a departmental repository of
such materials but maybe we should. Sounds like a good idea - maybe
Curriculum Comm. should take it up in the fall. I am adding Paul
Sivilotti (who is taking over as chair of Curr. Comm in the fall) to
the cc: list; I am also adding Jeremy to the list since he might be
interested as well ...
- Spring 2015: Quality of capstone projects; msg. from sponsor; see 32 in
assessEvalmsgs
- Spring 2015: Given the increasing importance of security-related
issues, Dr. Babic has recently revised CSE 2431 to include coverage of
this topic. His message (also see: 33, 34, 35 in assessEvalmsgs):
From: "Babic, Gojko"
To: "Soundarajan, Neelam"
Subject: RE: Security in Cse2431
Date: Wed, 22 Apr 2015 10:33:06 -0400
Hi Neelam,
I can inform you I was able to incorporate the ppt presentation on
security in both of my Cse2431 sections this semester. It took me 1 week
to do that. The only relatively significant topic I had to exclude was
"disk arm scheduling", which can be normally done in 30-35 minute lecture,
and I think that time can be found with planning from the beginning of
semester.
Regards,
Gojko
- Spring 2015: The IAB, at its meeting of April 10, 2015, approved the
current PEOs; but one of the members pointed out that the last PEO, in
fact, directly ties into a key mission of OSU as recently articulated by
President Drake.
- Spring 2015: Possible changes in Phil 1337; see
http://web.cse.ohio-state.edu/~neelam/ugsc/minutes/1415index.html#feb16
- Spring 2015: Changes in ECE 2000, 2100; see
http://web.cse.ohio-state.edu/~neelam/ugsc/minutes/1415index.html#sep18.
- Dec 31, 2014: Message to capstone project sponsors
- Dec. 1, '14: New capstone course on cognitive computing
- Nov 3, 2014: ISE capstone poster rubric is a good tool for direct assessment of various outcomes. Need to use something like this.
- Aug 26, 2014: Suri Jayant's message to his 5911 students at the start of the semester.
- Aug 18, 2014: Raghu's request for use of 3d printing in 5542
(see this message).
- Aug 1, 2014: (Message to Eric and Rajiv):
Eric gave a summary of the discussion he and Rajiv
had with Bob Rhoads about CoE's multi-disciplinary capstone course.
Couple of things Eric brought up were the following: Would our students
be able to use this 2-semester long multi-disciplinary course as part
of their CSE program? Eric and I talked about this and it looks like
we should be able to do this. As I understand it, this is a total of 6
cr hrs so students should already be able to use it as part of their
tech electives since they can have up to 7 hrs of non-CSE courses as
part of tech electives. We do say that this has to be approved by the
faculty advisor; we could, instead, after discussion in UGSC, list this
as "pre-approved" so students are more likely to consider taking it.
The other (related) question was whether students could use this to
meet their capstone requirement. The answer to that seemed to be that
it would depend on the project in question. If it has a substantial CS
component, then, indeed, it would seem appropriate. So that could be
done on a case-by-case basis, the decision being made perhaps by a
small subcomm. of 3 people ... But, in any case, the student will have
the assurance that the sequence could at least be used as part of
his/her tech electives. Eric, did I summarize that appropriately?
- (Possible) Changes based on ACM/IEEE CS Curricula 2013 (sent message
to faculty on July 20, 2014).
- (Possible) changes in 2501 based on CACM, July 2014, article, "Toward a
pedagogy of ethical practice" (sent mail to Michelle Mallon, July 20, 2014).
- (Possible) changes in 390x based on CACM, July 2014, article, "Finding
more than one worm in the Apple" (sent mail to Paul and Paolo on 7/20).
- Based on UGSC forum of March 25, 2014: Paul Sivilotti is working on
revising 3901
to make it more demanding in terms of the programming activities
- Based on UGSC forum of March 25, 2014: Neelam attended a meeting of
the ECE Undergrad Comm. on Apr. 3, 2014, and explained our concerns.
The ECE faculty are planning to
look into possible changes that could be made to the courses or, possibly,
coming up with alternative organizations of the courses for CSE students.
- GitHub accounts, etc. (based on discussions in UGSC)
(copy of e-mail to faculty); (see also
this page which is
a forum for teachers that GitHub set up).
- Improvements in GTA performance, by finding ways to mentor GTAs
(Jeremy Morris's suggestion, based on (apparently invalid) comments
about a 1223 GTA).
- ECE 2000, 2100 based on discussion in UGSC
F. Follow-up Actions
This section (introduced in Su '16) lists some of the follow-up actions,
designed to improve the CSE courses and program, that were taken based
on the analysis/evaluation of POCAT results as well as the results of
other assessments.
Suggestions for changes to the way in which this section is organized
(or other aspects) are welcome.
- Su '16: Introduction of rubric for poster session: see
here.
- In discussions among Jeremy Morris, Al Cline, and Neelam Soundarajan
as this rubric was being developed, it was suggested that it would make
sense to combine the last two
dimensions of the rubric because, from the point of view of a visitor
to the poster session, "Effectiveness as a Project Team" will be seen
primarily via how well the members of the team interact with the visitors
and how they support each other in those interactions; so it may be appropriate
to have a single dimension, "Communication and Team Effectiveness". (By
contrast, the rubric(s) used by the course instructor should certainly
include distinct dimensions corresponding to team skills and communication
skills.)
- Fall '15/Sp '16: Based on student performance on the POCAT question
related to encoding of information in binary numbers, the discussion
of the results in UGSC concluded that one possible way to address
this problem would be to have the Systems I course (2421) spend a bit
more time on the topic. Gagan, the coordinator for CSE 2421, suggested that
a possible reason for the problem may be that some of the lecturers who
teach the course might be spending too much time on C programming, teaching
students all kinds of C programming tricks, at the possible expense of
conceptual topics such as this one. To address this, at the start of each
semester, we will have the following message sent to all instructors of
2421 for that semester (a small worked illustration of the bit-counting
point appears after the message):
This message is being sent to all CSE 2421 instructors. One of the
topics in the course, as specified in the official syllabus, is
(low-level) C programming, the idea being that students need to be
able to work reasonably comfortably with pointers and, more generally,
be able to think at the machine-level; according to the syllabus,
about 4-5 weeks of the 14-week semester is supposed to be devoted to
this topic. But some instructors seem to spend far more time on this
topic than that. While this may help students develop stronger
C-programming skills, it reduces the time available for the remaining
topics and negatively impacts the computer organization aspect of the
course which, in one sense, is intended to be the primary topic of the
course.
One of the problems that has been observed, over several semesters, in
POCAT (see:
http://web.cse.ohio-state.edu/~neelam/abet/DIRASSMNT/POCATRESULTS/index.html)
is that a surprisingly high percentage of our students do not seem to have a
good intuition for how to, in general, encode different (and relatively
simple) types of information as binary strings. (For details, see the above
web page.)
Given this, please make sure, in your section of CSE 2421, that you do not
spend much more time on C programming than specified in the official syllabus
and that you pay adequate attention to computer organization/architecture
aspects; further, that you try to ensure that students develop a good intuition for
how all kinds of information can, at the lowest level, be stored as binary
strings. Note that the point is not about such things as IEEE floating-point
standards, etc. Rather, it is about ensuring that students develop a good
feel for how many bits might, roughly speaking, be required to store a
particular piece of information and, possibly, about how security or
error-correction considerations might impact this.
If you have any questions or comments, please send mail to soundarajan.1
Thanks!
--Neelam Soundarajan, Chair, CSE Undergrad Studies Comm.
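As a simple, purely illustrative instance of the kind of intuition referred
to in the message (this example is ours; it is not part of the message or
of any POCAT question):

    % Illustrative only.
    Distinguishing among $N$ possibilities requires $\lceil \log_2 N \rceil$ bits.
    For example, one of the 26 letters needs $\lceil \log_2 26 \rceil = 5$ bits,
    and a day of the year (366 possibilities) needs $\lceil \log_2 366 \rceil = 9$ bits;
    redundancy added for error correction would increase these raw minimums.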
- Sp '16/Su '16: Proposed change in PEOs based on input from IAB.
- Au '14/Sp '15: Specialization options: ...
G. Sample materials
- Individual report (Au '14; Ramasamy's comments)
- 758/5911 projects
Please send comments, questions, and suggestions to Neelam at cse.ohio-state.edu
(Previous (now outdated) version of this page is here; and an even older version is
here.)