Proposed Changes in Course Group Reports
Background
Course Group Reports (CGRs) are one of the most important feedback
mechanisms we use in the CSE program. Briefly, the mechanism works
as follows:
All regular CSE courses are organized into groups of related courses.
The coordinators of the courses in each group are expected to interact
regularly with each other and with the faculty (including part-time
faculty) who regularly teach those courses, in order to keep track of
any problems that arise, to identify any changes that might be
appropriate to make in the courses, etc. The coordinators of each
group are asked to present a status report, the course group report
(CGR), on the particular group of courses to the Curriculum Committee
on a regular basis, perhaps once every three years. For each course in
the group, the report is expected to address such questions as: Are
the course objectives appropriate? Are the current prerequisites
appropriate? Does the average student in the course seem prepared for
it? Do most students understand the main ideas and acquire the skills
that the course is meant to give them? Etc.
Recent CGRs have typically been organized as follows: Section 1
provides a broad summary of the group of courses. Section 2 provides
detailed analysis. Subsection 2.1 provides a one-to-two-paragraph
summary of each course in the group. Subsection 2.2 discusses the
group's relation to the rest of the program. Subsection 2.3 discusses
the contribution that each course in the group makes to the BS-CSE
program outcomes as well as to the ABET Criterion 3 outcomes.
Subsection 2.4 discusses the responses that have been made to the
concerns listed in the previous CGR for the group. Subsection 2.5
discusses additional changes that may have been made in the group.
Subsection 2.6 lists any continuing concerns. Section 3 presents a
brief conclusion summarizing how the group contributes to the BS-CSE
program and lists the names of the faculty recently involved with the
courses in the group. More complete details of the CGR mechanism, as
well as recent CGRs, are available.
Potential problem
One important ABET requirement is that program improvements be based
on the results of assessments against specific outcomes. In other
words, it is not enough to say something along the lines of, "we are
evaluating our courses and, as we find problems or the need for
improvements (possibly because of changes in the field), we will
identify suitable changes in the courses to solve the problems or
effect the improvements, and implement the changes". Instead, we have
to show that we have specific outcomes we are trying to achieve, that
we are assessing how well we achieve those outcomes, and that at least
some of the changes we make are based on the results of this
assessment. Further, improvements based on the results of direct
assessments, i.e., on evaluations of the actual performance of
students or graduates of the program (rather than, for example, the
results of self-assessments by students via such instruments as exit
surveys), are considered most important.
The changes that faculty make in individual courses, as well as the
ones faculty groups propose in their respective CGRs, are of course
heavily influenced by student performance in the courses and by the
faculty's evaluation, based on that performance, of how well the
students are understanding the main concepts and acquiring the skills
that the courses are intended to equip them with. But this influence
does not explicitly appear in the CGR, so it may not be clear to
someone reading a given CGR at some future date how well the intended
course objectives/learning outcomes were being achieved in each
course, or how any proposed changes were related to this achievement.
The proposal below addresses this problem and will help us better meet
the ABET requirement described in the last paragraph, as well as
provide better documentation of how we meet that requirement.
Terminology used in course objectives
In CSE course syllabi, the course objectives strive to capture intended
learning outcomes.
They use the following terminology to describe familiarity
level (most to least) with respect to various kinds of material and
procedures:
- To Master means the student will be able to exhibit knowledge of
the material and/or skill with the procedure, in a new but appropriate
context, even when not instructed to do so.
- To Be Familiar means the student will be able to answer questions
about the material and/or to use the procedure, in a new but appropriate
context, when instructed to do so.
- To Be Exposed means the student will have heard the term and/or
seen the procedure, but may not be able to discuss or use it effectively
without further instruction.

Some CSE course objectives use the following terminology
for skill level (most to least) to describe a student's facility
in dealing with various languages and notations:
- Writing means the student will be able to use the notation, and
will be able to create new instances of it to perform some task.
- Using means the student will be able to read the notation, and will
be able to apply that understanding to perform some task.
- Reading means the student will be able to recognize a syntactically
and semantically well-formed instance of the notation, to understand its
meaning.
Proposal
Faculty preparing a CGR will explicitly evaluate how well each course
in the group is meeting each of its objectives/learning outcomes based
on student performance in the course. In other words, for each
learning outcome, the faculty preparing the CGR will provide their
evaluation, based on recent offerings of the course, of actual student
achievement of the particular outcome. This evaluation will be in
terms of the same mastery/familiarity/exposure and
writing/using/reading scales as used in specifying the
outcomes. For convenience, these might be mapped to numerical values:
the mastery or writing levels map to a value of 3, familiarity or
using to 2, and exposure or reading to 1. These
values can then be compared with the expected levels of achievement,
as stated in the respective course syllabi, for each of the outcomes.
If the expected level differs from the actual level of achievement by
more than a certain threshold (perhaps 1 numerical point according to
the mapping just described?) for any particular outcome, that would
indicate a potential problem in that course or in related courses and
suggest a need for appropriate changes.
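The comparison described above is mechanical enough to sketch in code. The following is only an illustration, not part of the proposal itself; the outcome names and levels are made up, and the threshold of 1 point matches the tentative value suggested above.

```python
# Hypothetical sketch of the expected-vs-actual comparison described above.
# The numeric mapping follows the proposal: mastery/writing = 3,
# familiarity/using = 2, exposure/reading = 1. All course data is made up.

LEVEL = {"mastery": 3, "writing": 3,
         "familiarity": 2, "using": 2,
         "exposure": 1, "reading": 1}

def flag_gaps(outcomes, threshold=1):
    """Return outcomes whose actual achievement differs from the expected
    level by more than `threshold` points.

    `outcomes` maps an outcome description to a pair
    (expected_level, actual_value); actual_value may be fractional,
    e.g. 1.5 for "between exposure- and familiarity-level".
    """
    flagged = {}
    for outcome, (expected, actual) in outcomes.items():
        gap = LEVEL[expected] - actual
        if abs(gap) > threshold:
            flagged[outcome] = gap
    return flagged

# Hypothetical data in the spirit of the test run described below:
outcomes = {
    "outcome A": ("mastery", 3.0),      # achieved as expected
    "outcome B": ("mastery", 1.5),      # between exposure and familiarity
    "outcome C": ("familiarity", 2.0),  # achieved as expected
}

print(flag_gaps(outcomes))  # {'outcome B': 1.5}
```

Only outcome B is flagged, since its gap of 1.5 points exceeds the 1-point threshold; the others match their expected levels.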
The faculty preparing the CGR will be expected to address such
differences and discuss ideas for appropriate improvements in the
courses. In some cases, the improvement might be to rewrite the
outcome or the expected level of achievement of the outcome; in
others, the improvement might be a suitable modification in the course
content or how it is taught; in yet others, it might be a combination
of the two. The evaluation of actual student performance with respect
to each outcome of the course could be included as part of subsection
2.1 of the CGR immediately following the summary of the particular
course. The proposed improvements to address any substantial
differences with the expected levels of achievement could be included
as part of subsection 2.6. In any case, the proposed improvements will
be clearly tied to a direct assessment of the actual student
performance with respect to the intended learning outcomes of the
particular courses, and the CGR will document this relation.
Neelam went through
a "test run" of this process using CSE 655 and 755 as the test courses.
He reported the following results based on his recent experiences teaching
these two courses.
- The syllabus of 655 lists ten outcomes, six of which are listed as
mastery-level outcomes; the other four are familiarity-level outcomes.
For the six mastery-level outcomes listed, students in the class
achieved mastery-level for three; familiarity-level for one; and
between exposure- and familiarity-level for the remaining two. For the
four familiarity-level outcomes listed in the syllabus, students in
the class achieved familiarity-level for two; between familiarity- and
mastery-level for one; and between exposure- and familiarity-level for
one.
- The syllabus of 755 lists five outcomes, three of which are listed
as mastery-level, one as familiarity-level, and one as exposure-level.
For all of the outcomes, actual student achievement in the class
matched the levels listed in the syllabus.
Neelam suggested that, given these facts, he would want to downgrade
to familiarity-level the two mastery-level outcomes in CSE 655 for
which student achievement was between exposure- and
familiarity-levels, and that he would revise the course content so
that actual student achievement with respect to these outcomes
improves to reach the familiarity level. No changes seem to be
indicated in CSE 755.
In order to help faculty preparing the CGRs to arrive at reliable
estimates of students' achievements with respect to each of the
learning outcomes of the various courses in the group, it may also be
useful to extend the recently created syllabus database as
follows. During exam week of each quarter, each instructor who taught
a course during that quarter would receive an automated email message
that will request the instructor to submit an evaluation of the
achievement level with respect to each learning outcome for that
course for students in the section of the course taught by that
particular instructor. The message would include a URL that would
bring up a form listing the objectives included in the official
syllabus for the course. The instructor would simply have to enter a
rating, indicating actual level of achievement, for each of the
objectives and submit the form. There would also be room, on the
rating form, for any other comments that the instructor wished to make
about his/her section of the course. All of the ratings (as well as
the comments) from all sections of the course, appropriately
summarized, would be available to the faculty in the group when they
prepare the next CGR for the group. This would improve the accuracy of
the evaluations of actual student achievement of the various outcomes
for the course, and provide a more reliable basis for the faculty to
consider possible changes in the courses.
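As a rough illustration of the kind of summary the extended database might produce, the sketch below averages per-section instructor ratings for each outcome. The data layout and names are entirely hypothetical assumptions, not a description of the actual syllabus database.

```python
# Hypothetical sketch: summarizing per-section instructor ratings per
# outcome, in the spirit of the database extension described above. The
# schema here is an illustrative assumption, not the real database.

from statistics import mean

def summarize_ratings(section_ratings):
    """Average the ratings for each outcome across all sections.

    `section_ratings` is a list with one entry per section: a dict
    mapping an outcome description to that instructor's 1-3 rating of
    actual student achievement.
    """
    by_outcome = {}
    for ratings in section_ratings:
        for outcome, rating in ratings.items():
            by_outcome.setdefault(outcome, []).append(rating)
    return {outcome: round(mean(vals), 2)
            for outcome, vals in by_outcome.items()}

# Two made-up sections of the same course:
sections = [
    {"outcome A": 3, "outcome B": 2},
    {"outcome A": 2, "outcome B": 2},
]
print(summarize_ratings(sections))  # {'outcome A': 2.5, 'outcome B': 2.0}
```

The per-outcome averages from all sections, together with the free-form comments, would then be what the faculty group consults when preparing the next CGR.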
Process
This proposal was developed in email discussions among some CSE
faculty following general comments from Dean Baeslack of the College
of Engineering and members of the Outcomes Committee of the college,
about the importance of direct assessments and of the need to document
the relation between program improvements and the results of such
assessments. The proposal was briefly mentioned at the CSE Curriculum
Committee meeting of April 19 and is expected to be discussed at the
committee's meeting on April 26.