Assessment Results and Evaluation
- POCAT Results:
- Rubric Results:
- Exit-survey results:
- Alumni-survey results (these surveys are conducted every other year):
- Annual forum reports:
2017;
2016;
2015
- Some recent improvements: Several recent improvements are listed in the
self-study. Here we consider some others (for the most part, more recent ones):
- Guidelines for capstone design course posters: Following the use of the capstone
poster session rubric during Au '16 and Sp '17, there was extensive email discussion
among several members of UGSC and key people involved with the capstone design courses.
An important conclusion was that while students were quite effective in their
interactions with the guests at the session, their posters did not address the items
they were expected to. The capstone course instructors then came up with a set of
guidelines that should help address this problem. The plan is for the instructor of
each section of each capstone course to distribute the guidelines to the students a
couple of weeks before the poster session and to hold a class discussion about the
expectations for their posters in particular, and about what makes for effective
posters in general.
A copy of the guidelines is
here.
- We also realized, during these discussions, that our current
rubric had some key flaws that made the feedback from the people who completed it
less useful than we had expected. For example, the term "sponsor" was not
always appropriate, since some of the projects did not come from outside sponsors
but rather were essentially generated from discussions in the particular
class. For instance, in the game design course, student teams came up with proposals for
particular games, presented them to the class, got feedback from the instructor,
revised the proposals based on that, and then proceeded to implement the games; in
the Watson-based class, again student teams had to propose suitable ideas for
projects that would exploit facilities provided by Watson (or by other similar
systems), and went through a similar process. For another example, the characterization
of the levels of achievement as "Average" etc. seemed inappropriate (being norm-based)
when assessing students' readiness to become computing professionals. Hence we
revised those levels to be criterion-based, although levels such as
"Exemplary" are not explicitly defined; if that proves problematic, we will revisit it.
The revised rubric is available here.
- We recently realized, during a discussion in a UGSC meeting of the assessment
results based on the use of the rubric in a section
of CSE 2501, that there were some weaknesses in that rubric. Specifically, it did
not assess how well the student based his/her arguments on appropriate
ethical issues and principles, including especially computing-related ones.
We have therefore come up with a (prototype) revision of the rubric, which will
be discussed at an upcoming UGSC meeting for possible adoption.
This revised version is available here.
- Minutes of some UGSC meetings: The minutes of some recent UGSC meetings where
assessment results, as well as other feedback from various constituents, were
discussed are available here.