More details: In general, it is hard to predict what a particular program evaluator
("PEV" in ABET terminology) will want to talk about in individual meetings with
faculty. The PEV may ask about your background: experience in academia, experience
(if any) in industry, your specialization, etc. Such questions should be fairly
straightforward and easy to respond to. If your background or interests include
any focus on CS education, including attending conferences such as SIGCSE,
please highlight that.
More likely, the PEV will focus on the ABET criteria and how well, based on
your understanding, the program meets the criteria requirements. What follows
is a brief summary of the criteria requirements and how our program meets
them. You may want to model your answers on this summary; but please do NOT
print this page and take it with you to the meeting! That would be serious
academic misconduct! :-)
There are eight criteria:
Assessments: Our assessments consist mainly of the following:
i) POCAT: POCAT is a multiple-choice exit test that BS-CSE students take before graduation, typically
in their final semester. All BS-CSE students take the test, but individual
performance is not recorded and does not become part of students' OSU records.
The test includes questions from a number of required and
core-choice courses as well as some of the popular electives. The goal is to help
identify common weaknesses in students' conceptual understanding of important ideas
and, via discussion in UGSC, come up with ideas that might help improve the program.
The questions on POCAT are proposed mainly by the faculty members closely
involved with the particular courses.
ii) Rubrics: The POCAT questions relate primarily to the "technical" SOs. To
help assess the SOs in the other two groups, we have developed three rubrics that
include specific dimensions related to those SOs. The first rubric is for use
by instructors in
CSE 2501, the course on social, ethical and professional issues in computing
(and in Philosophy 1338 which students may take in place of 2501); the second
is for use by instructors in the capstone design courses; the third rubric is
for assessing
the work of student teams at the capstone poster session at the end of fall and
spring semesters.
iii) Exit survey: Before graduation, students also complete an opinion survey that
asks for the student's assessment of the importance of each outcome as well as
the extent to which the program helped the student achieve the outcome. The survey
also asks for the student's opinion on the one aspect of the program that they
were especially happy about, and the one aspect that most needs improvement.
If the PEV asks what the results of the survey show, here is a brief summary:
the program has helped students achieve most of the outcomes reasonably well
and most of the outcomes
are reasonably important (with the technical ones typically rating higher than the
outcomes in the other two groups). One aspect that shows up repeatedly as especially
satisfactory is the quality of advising provided by the Advising Office. One aspect
that often shows up as needing improvement is the limited emphasis on large
design/implementation projects: students would like more of them.
iv) Annual forum: The annual student forum, typically held in the middle of Spring
semester, is another important method of obtaining student feedback on various
aspects of the program and of answering student questions with respect to courses,
career options, etc. In addition to the students, a number of faculty, members
of the advising staff, and one or more staff members from computing services attend
the forum. Students provide useful feedback on individual courses as well as on the
program as a whole. A few days after the forum, a detailed report is posted on
a website maintained by the UGSC.
UGSC works closely with the Advising Office to coordinate the administration of the POCAT and the exit survey, and to schedule (and advertise) the Annual Forum. UGSC works with the instructors of the capstone courses and CSE 2501/Phil 1338 to administer the rubrics. Recent results from all four assessments are available on web pages maintained by the UGSC. UGSC discusses the assessment results at its regular meetings and comes up with ideas for improvements in the program. Occasionally, the improvements are to the assessment instruments themselves, such as changes to the rubrics or tweaks to particular POCAT questions.
Improvements: Below is a list of some recent changes/improvements based on the assessments.
Each item is flagged with the names of one or more of the people who are meeting with
a PEV and who are connected to the item in some way. Make sure you look
through at least the item(s) flagged with your name; that should help during your
discussion with the PEV. (In some cases, your name may be included because of your role in UGSC/Curr. Comm., your active involvement in discussing the item, etc.)
a) Chris, Paolo, Paul, Rafe, Neelam, Nikki: Based on feedback during a couple of annual forums, we worked with the ECE faculty
to revise the ECE courses that our students are required to take. The previously
required courses, ECE 2000 and 2100, which many of our students found rather difficult
and inaccessible, have been replaced by two courses, ECE 2060 (digital logic) and
ECE 2020 (analog circuits). These courses seem better received by the students, although
we are continuing to work with the ECE faculty to see if further changes are needed.
b) Jeremy, Neelam: An IT professional from the central Ohio area, on hearing about the POCAT,
liked the idea but felt that some of the questions were quite theoretical and
could usefully be recast in a practical setting. We are currently working on doing
that.
c) Doreen, Ken, Rafe: A recent (Sp '17) POCAT included a question about the asymptotic running time of a
simple (exponential time!) recursive program to compute the Fibonacci function. This is a topic that,
as the student reps noted at the UGSC meeting, is discussed at some length
in Foundations I, Foundations II, or both, with the point stressed that the exponential
running time makes it a very poor program indeed. But the performance of several
students was surprisingly weak; the reasons for this are not yet clear, and we are
working on pinning them down. We are tweaking the question to see if we can get a
better feel for the problem that students apparently have.
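For reference, here is a minimal sketch of the kind of program the question concerns; the actual POCAT question and its code are not reproduced here, so the details below are illustrative only:

    def fib(n):
        # Naive recursion: each call spawns two more, so the number of
        # calls grows roughly as phi**n -- exponential running time.
        if n <= 1:
            return n
        return fib(n - 1) + fib(n - 2)

    def fib_calls(n):
        # Count the calls the naive version makes, to make the blow-up visible.
        if n <= 1:
            return 1
        return 1 + fib_calls(n - 1) + fib_calls(n - 2)

    # fib_calls(10) == 177; fib_calls(20) == 21891; fib_calls(30) == 2692537

A memoized or iterative version performs the same computation in linear time, which is exactly the contrast the Foundations courses stress.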
d) Paolo, Jeremy, Neelam: Continuing the above item, one suggestion was that
perhaps the language of the question, or the choices presented, was confusing or
unclear. This is, of course, a potential problem for any question in the POCAT, not
just this one. Hence we are toying with the idea of adding an alternative
along the lines of "the question and/or
the presented options are too confusing!". If many students pick that alternative,
that will tell us the problem lies with the phrasing of the question rather than
with student understanding. We will consider this for future POCATs, at least the
first few times we try a question.
e) Mike, Doreen, Ken, Rafe, Chris, Paolo, Paul, Jeremy, ...: Another POCAT question on which performance
has been surprisingly weak has to do with basic ideas related to representing
information in a string of bits; the question asked, if n bits are required
to encode some information, how many bits would be required to represent twice as
much information. Part of the problem, as we discovered when we tweaked the question
over several semesters, was indeed related to some of the terminology used in the
question, but the performance is still too weak. We are continuing to try to pin
down the problem here; if it turns out there is a real problem in student
understanding of this basic idea, rather than in the formulation of the question,
we will work on revising CSE 2421 or 2321 or both, to
address this.
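For reference (the exact wording and intended answer of the POCAT question are not reproduced here), the underlying arithmetic under one common reading: if "twice as much information" means twice as many distinguishable values, only one extra bit is needed, since n bits distinguish 2^n values and 2 * 2^n = 2^(n+1). A quick sanity check:

    # n bits distinguish 2**n distinct values; twice as many values
    # therefore need n + 1 bits, not 2n bits.
    for n in range(1, 6):
        values = 2 ** n
        doubled = 2 * values
        bits_needed = doubled.bit_length() - 1  # exact, since doubled is a power of 2
        assert bits_needed == n + 1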
f) Ramasamy, Eric, Neelam: At the end of each fall and each spring semester, each student team from each of
the capstone courses from that semester is expected to present its work at
a public poster session, which also typically includes demos of the
team's (prototype) system. The session is attended by invitees including,
especially, professionals from the local IT industry. The teams are also
expected to be aware of any ethical and societal issues that their project may
raise and present them, as appropriate, in their posters or in answering
questions that the visitors to the poster session may have.
Following the Spring '17 poster session, the rubric assessments of the session
suggested that while the project teams, as a rule, had
done a reasonably satisfactory job in their projects, their posters could
be improved. Following discussion of this in UGSC, we put together a guidelines document that provides some
useful tips to student teams on how to create effective posters. This will be
given to all teams in each section of each capstone course and should help improve
students' communication skills.
g) Eric, Ramasamy, Paolo, Jeremy, Neelam: One other concern, based on the Spring '17 capstone poster session, was that
teams in one of the courses, 5914 (the course in which teams use various facilities
provided by the Watson system), fared poorly on the "other factors"
dimension. Part of the intent of that dimension is to assess how well the team
compared alternative approaches and tools for solving specific problems in its
project, and considered the relevant tradeoffs, before choosing one; the low scores
of the 5914 teams reflected the perceived lack of attention to this question. The
issue was discussed in a recent UGSC meeting. Following the meeting, several
members of UGSC engaged in an email discussion about this with the instructors, including the adjunct faculty member from IBM, who teach 5914.
It turns out that the problem was not that the teams in the course did not go through
this important task. Rather, it was that their posters focused on how they actually
used specific facilities provided by Watson in their project without much, if any,
mention of alternative facilities (both those provided by Watson as well as other
systems) that they considered before settling on the ones they did. In other
words, the poor scores in this dimension were mainly due to the teams' failure to
document their evaluation of alternative approaches and facilities, rather than a
failure to engage in such evaluation. The new poster guidelines (see above) stress
the importance of documenting this evaluation, since doing so is standard practice
in professional projects. If performance still does not improve, we will
strengthen the class discussion of this item.
h) Mike, Neelam: When discussing the rubric results from a recent section of CSE 2501, a number of
members of UGSC felt that the rubric was somewhat lacking with respect to the
assessment of the paper(s) that students write in that course. While the rubric
addressed the stylistic aspects adequately, it did not include any dimensions
related to the quality of the student's argumentation based on appeals to relevant
ethical principles or theories. We have since revised the rubric to address this;
the change is in the dimension "Presentation of ideas and organization of the paper".
i) Ken, Jeremy, Arnab, ...:
One of the POCAT questions, intended for students who had taken CSE 3241
(the Database I course), presented a simple relational schema and asked which
sets of attributes could serve as keys for the schema. The performance on this
question was surprisingly weak, with fewer than 20% of the students answering it
correctly. After considerable discussion in UGSC and among faculty involved with
the course, the question was tweaked, in one version of POCAT, to include a line
reminding students of the meaning of "key" in this context. The performance on
this version of the question was substantially
better. The conclusion was that
students, by the time of their graduation, do not always recall the definitions of
some important terms although they do seem to retain understanding of the
underlying concepts. Whether there is a way to improve retention of the definitions
of important terms is unclear ...
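For reference, a hypothetical illustration of the concept the question tests; this is not the actual POCAT schema, and the check below only verifies uniqueness on one sample instance (a true key must hold for every legal instance of the schema):

    # A made-up relation: each row is a student record.
    rows = [
        {"student_id": 1, "email": "a@osu.edu", "name": "Ann"},
        {"student_id": 2, "email": "b@osu.edu", "name": "Bob"},
        {"student_id": 3, "email": "c@osu.edu", "name": "Ann"},  # duplicate name
    ]

    def identifies_rows(attrs, rows):
        # An attribute set can be a key only if its values uniquely
        # identify every row (no two rows agree on all of attrs).
        seen = {tuple(r[a] for a in attrs) for r in rows}
        return len(seen) == len(rows)

    print(identifies_rows(["student_id"], rows))  # True: a plausible key
    print(identifies_rows(["name"], rows))        # False: "Ann" repeats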
j) Jeremy, Arnab, Rafe, Nikki, ...: We currently have two database courses, CSE 3241 and
5242. 3241 is a core-choice course, taken by a large percentage of our BS-CSE students,
and fairly well received by them. CSE 5242 is more challenging since it serves two
distinct purposes. On the one hand, it is meant as a more advanced course on databases
for undergraduate students and, on the other hand, it is intended as a forum for
presenting and talking about research problems and recent results in the topic for
research-oriented graduate students. While some undergrads who are, by that point,
seriously considering graduate school options do well with such an approach, it
doesn't suit others who are interested in exploring, in a standard undergrad setting,
deeper and newer topics in the field. This point has come up at recent Annual Forums.
The courses on data mining and machine learning, and the newly developed course on
cloud computing, help, but the number of sections of those courses that we can offer,
given our current faculty resources, is limited; indeed, the cloud computing course
has not generally been available to BS-CSE majors, since it was designed to serve
the needs of the Data Analytics major, which is distinct from BS-CSE, and those
students get first priority for the course.
We are working on addressing the problem.
k) Paul, Arnab, Chris, Jeremy, Paolo, Neelam, Mike, others: Some students expressed considerable
interest in a course (whose detailed contents may well vary from one offering to
the next) that presents a few important and powerful new or recent tools and
systems, with some essential coverage of the underlying concepts, followed by
discussion of the technical details of each tool or system and how it may be used
in practice, including, possibly, a detailed assignment or project involving it.
One risk in having such courses is that students may fill up their
tech-elective hours with several of them, with the result that their
overall technical foundations may be weak and/or lack coherence. Hence, for
now, we have shelved the idea; but if there is continuing interest (as is likely) and
if we can find satisfactory ways to mitigate this risk, we may pursue it.
Last, and by no means least, thank you very much for being willing to meet with the PEVs. Meetings with faculty are a critical part of ABET evaluations, and your engaged participation is essential to the success of our evaluation. Thank you!