Tips for faculty meetings with ABET evaluators (schedule)

First, some really important guidelines:
a) Please do not get into any arguments (about the ABET criteria or anything else related to the ABET evaluation) with the program evaluators.
b) Please make sure that when talking about courses, including ones that you teach, you do not say or imply that each faculty member teaches whatever he or she pleases. The view we want to convey is that, as a general rule, students get more or less the same instruction in any given course, no matter which particular section of the course they enroll in; this is really critical.
c) Do not ask questions along the lines of, "How is the evaluation going so far?" Evaluators are not, in any case, supposed to answer such questions, and it may make them feel uncomfortable. Of course, if the evaluator volunteers something on his or her own, that is a different story.
d) Apart from meeting faculty, staff and students, the evaluators will be going through course materials and assessment materials (see item (4) below). The course materials will be available at a password protected site that Neelam will send mail about shortly. Even if you didn't provide materials for any course, you should look at the materials on that site for courses that you normally teach; that should help with item (b) above.
e) A general point: Please try to represent the BS-CSE program as a whole, not just your particular courses. Of course, you are mainly interested in your course but do not be unnecessarily critical of other courses. (For example: don't suggest that students coming into your course are poorly prepared because the pre-req course is a mess; etc. Remember, it is the program as a whole that is being evaluated, not just your course(s). It is okay to suggest that some particular topic in a pre-req course may need to be beefed up and that you are planning to work with the relevant course coordinator. That would fall under item (4) below.)

More details: In general, it is hard to predict what a particular program evaluator ("PEV" in ABET terminology) will want to talk about in individual meetings with faculty. The PEV may ask about your background (experience in academia, in industry (if any), your specialization, etc.). If there are such questions, they should be fairly straightforward and easy to respond to. If your background/interests include any focus on CS education, including attending conferences such as SIGCSE, please highlight that.
More likely, the PEV will focus on the ABET criteria and how well, based on your understanding, the program meets the criteria requirements. What follows is a brief summary of the criteria requirements and how our program meets them. You may want to model your answers on this summary; but please do NOT print this page and take it with you to the meeting! That would be serious academic misconduct!:-)

There are eight criteria:

  1. Students: This criterion deals with how students are advised with respect to curricular and career matters; and things such as ensuring that students meet all the program requirements when they graduate.
    Clearly, we depend on the Advising Office to take care of things like ensuring that students meet all requirements; and it is fine to say that. But with respect to advising students on curricular and career matters, faculty are expected to play an active role. So I would try to talk about some instances where you helped students in these matters. (Even if the students in question were not officially assigned to you as advisees -- but don't stress that.) Especially, if you were involved with encouraging them with respect to research or projects of any kind or Hackathon etc., do talk about those things.
    One other key point to talk about is the Annual Undergrad Forum. If you have attended them, even if you missed the most recent one, you should talk about the forum and how students get a lot of advice and information about courses, career options, internships, etc. at the forum; not only from faculty and advising staff but also each other. So do talk about this; if you need a reminder of what was discussed at recent forums, the reports are all available here.
    In addition, the PEV may ask how many students you advise. Most faculty advise 25 or so; and, in the case of full-time lecturers/senior lecturers, about 10 students; so that is what I would say.

  2. "Program Educational Objectives (PEOs)": All programs are expected to have specified PEOs that are "published"; and programs are supposed to use input from the dept.'s Advisory Board, alumni, etc. in formulating/revising the PEOs.
    We have three PEOs that are published on our website. The first PEO talks about our graduates being successful in their computing careers; the second is about some of our grads pursuing or having completed grad programs in computing or related fields; the third talks about our grads being engaged members of their communities and responsible computing professionals. We ask for feedback on PEOs from our alums (in the bi-annual alumni survey). We also present them to the Advisory Board to ask for its feedback. Most recently, based on input from the Advisory Board, the Undergrad Studies Committee (UGSC) recommended adding a reference to the ACM Code to our third PEO; and this change was approved by faculty (via email).

  3. "Student Outcomes (SOs)": All programs are expected to have specified SOs. SOs are supposed to be statements of the knowledge/skills the students of the program are expected to acquire by the time they finish the program. ABET itself specifies a set of SOs that all programs must include in their set of SOs; in fact, EAC and CAC (remember we are accredited by both EAC and CAC) each has its own set of SOs, but they have a lot in common with each other. Our SOs were obtained essentially as a *union* of the EAC and CAC outcomes, in some cases specializing an EAC outcome so it is specific to Computer Science and Engineering. Our fourteen SOs --published at the same website-- were developed some years ago by UGSC and approved by faculty. They were revised a few years ago, when CAC specified its own SOs, to include those outcomes; the change was again approved by faculty via email.
    The PEV might try to probe your understanding of the SOs. The SOs use very generic language but, broadly, they can be classified into three groups. The first group, which might be called the technical group, deals with the student's knowledge of computing, math, and engineering; and ability to apply this knowledge to solve problems and to build/analyze software and hardware/software systems. The second group, which might be called the professional group, deals with "soft" skills such as teamwork abilities, communication skills, etc. The third group, which might be called the societal group, deals with societal issues such as security issues related to computing. The PEV might ask you which outcome you think the program achieves well and which one it has trouble with ... as a general rule, our students do well with respect to the first group; moderately well with respect to the second; and there is much variation among the students with respect to the third. You will have to base your answers in part on your own experiences in your classes; but it would be problematic if the PEV got the impression that there were serious shortcomings with respect to any of the SOs. So even with respect to the third group, you should not give the impression that there are serious issues.

  4. Assessment and continuous improvement: This is a very important criterion. The idea is that the program will regularly assess its SOs to see how well the outcomes are being achieved and analyze the results to identify weaknesses in the program and to arrive at possible improvements. ABET is really focused on this idea of continuous improvement and expects programs to be serious about it. Please do NOT question the idea in your meeting with the PEV! Even if the PEV seems to suggest it, just nod sagely and let it go!

    Assessments: Our assessments consist mainly of the following:
    i) POCAT: POCAT is a multiple-choice exit test that BS-CSE students take before graduation, typically in their final semester. All BS-CSE students take the test but their individual performance neither becomes part of their OSU record nor is even recorded. The test includes questions from a number of required and core-choice courses as well as some of the popular electives. The goal is to help identify common weaknesses in students' conceptual understanding of important ideas and, via discussion in UGSC, come up with ideas that might help improve the program. The questions on POCAT are proposed mainly by the faculty members closely involved with the particular courses.
    ii) Rubrics: The POCAT questions are related, primarily, to the "technical" SOs. In order to help assess the SOs in the other two groups, we have developed three rubrics that include specific dimensions related to those SOs. The first rubric is for use by instructors in CSE 2501, the course on social, ethical and professional issues in computing (and in Philosophy 1338 which students may take in place of 2501); the second is for use by instructors in the capstone design courses; the third rubric is for assessing the work of student teams at the capstone poster session at the end of fall and spring semesters.
    iii) Exit survey: Before graduation, students also complete an opinion survey that asks for the student's assessment of the importance of each outcome as well as the extent to which the program helped the student achieve the outcome. The survey also asks the student's opinion concerning the one aspect of the program that he/she was especially happy about, and one aspect that most needs improvement.
    If the PEV asks what the results of the survey show, here is a brief summary: the program has helped students achieve most of the outcomes reasonably well and most of the outcomes are reasonably important (with the technical ones typically rating higher than the outcomes in the other two groups). One aspect that shows up repeatedly as especially satisfactory is the quality of advising provided by the Advising Office. One aspect that often shows up as needing improvement is a greater emphasis on largish design/implementation projects.
    iv) Annual forum: The annual student forum, typically held in the middle of Spring semester, is another important method of obtaining student feedback on various aspects of the program and of answering student questions with respect to courses, career options, etc. In addition to the students, a number of faculty, members of the advising staff, and one or more staff members from computing services, attend the forum. Students provide useful feedback on individual courses as well as on the program as a whole. A few days after the forum, a detailed report is posted on a website maintained by the UGSC.

    UGSC works closely with the Advising Office to coordinate the administration of the POCAT and the exit survey, and to schedule (and advertise) the Annual Forum. UGSC works with the instructors of the capstone courses and CSE 2501/Phil 1338 to administer the rubrics. Recent results from all four assessments are available at web pages maintained by the UGSC. UGSC discusses the assessment results at its regular meetings and comes up with ideas for improvements in the program. Sometimes the improvements are changes to the rubrics themselves, or tweaks to particular POCAT questions, etc.

    Improvements: Below is a list of some recent changes/improvements based on the assessments. Each item is flagged with the names of one or more of the people who are meeting with a PEV and who are in some way connected to the item. Make sure you look through at least the item(s) that are flagged with your name; that should help during your discussion with the PEV. (In some cases, your name may be included because of your role in UGSC/Curr. Comm; or active involvement in discussing the item; etc.)
    a) Chris, Paolo, Paul, Rafe, Neelam, Nikki: Based on feedback during a couple of annual forums, we worked with the ECE faculty to revise the ECE courses that our students are required to take. The previously required courses, ECE 2000 and 2100, which many of our students found rather difficult and inaccessible, have been replaced by two courses, ECE 2060 (digital logic) and ECE 2020 (analog circuits). These courses seem better received by the students, although we are continuing to work with the ECE faculty to see if further changes are needed.
    b) Jeremy, Neelam: An IT professional from the central Ohio area, when he heard about the POCAT, liked the idea but also felt that some of the questions were very theoretical and could, usefully, be recast in a practical setting. We are currently working on doing that.
    c) Doreen, Ken, Rafe: A recent (Sp '17) POCAT included a question about the asymptotic running time of a simple (exponential time!) recursive program to compute the Fibonacci function. This is a topic that, as was noted by the student reps at the UGSC meeting, is discussed at some length in Foundations I, Foundations II, or both, the point being stressed that the exponential running time makes it a very poor program indeed. But the performance of several students was surprisingly weak ... the reasons for this are not clear but it is something we are working on pinning down. We are tweaking the question to see if we can get a better feel for the problem that students apparently have.
    d) Paolo, Jeremy, Neelam: Continuing the above item, one suggestion was that perhaps the language in the question or the choices presented were confusing or unclear. Obviously, this is a problem for any of the questions in the POCAT, not just this one. Hence we are toying with the idea of adding an alternative along the lines of, "the question and/or the presented options are too confusing!". If many students pick that alternative, that will tell us the problem was with the phrasing of the question. We will consider this for future POCATs, at least when we try a question the first few times.
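    In case it helps to have the point of the Fibonacci question (item (c)) fresh in mind before your meeting, here is a minimal sketch of the contrast it tests. The function names and the memoization approach are just one illustration, not the actual POCAT question: the naive recursion recomputes the same subproblems and so takes exponential time, while caching each result makes it linear.

```python
from functools import lru_cache

def fib_naive(n):
    # Naive recursion: T(n) = T(n-1) + T(n-2) + O(1),
    # so the number of calls grows exponentially in n.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Memoized version: each value n is computed once, so O(n) time.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(10))  # 55
print(fib_memo(30))   # 832040 -- instant; fib_naive(30) would make ~2.7 million calls
```

    This is exactly the "very poor program" point stressed in Foundations I/II: the algorithm is correct but its running time, not its output, is what makes it unacceptable.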
    e) Mike, Doreen, Ken, Rafe, Chris, Paolo, Paul, Jeremy, ...: Another POCAT question on which performance has been surprisingly weak has to do with basic ideas related to representing information in a string of bits; the question asked: if n bits are required to encode some information, how many bits would be required to represent twice as much information? Here too the performance was much weaker than expected. Part of the problem was indeed related, as we discovered when we tweaked the question over several semesters, to some of the terminology used in the question, but the performance is still too weak ... we are continuing to try to pin down the problem here; if it turns out there is a real problem in student understanding of this basic idea rather than in the formulation of the question, we will work on revising CSE 2421 or 2321 or both, to address this.
    f) Ramasamy, Eric, Neelam: At the end of each fall and each spring semester, each student team from each of the capstone courses from that semester is expected to present its work at a public poster session, which also typically includes demos of the team's (prototype) system. The session is attended by invitees including, especially, professionals from the local IT industry. The teams are also expected to be aware of any ethical and societal issues that their project may raise and present them, as appropriate, in their posters or in answering questions that visitors to the poster session may have. Following the Spring '17 poster session, the rubric assessments of the session suggested that while the project teams, as a rule, had done a reasonably satisfactory job in their projects, their posters could be improved. Following discussion of this in UGSC, we put together a guidelines document that provides some useful tips to student teams on how to create effective posters. This will be given to all teams in each section of each capstone course and should help improve students' communication skills.
    g) Eric, Ramasamy, Paolo, Jeremy, Neelam: One other concern, based on the Spring '17 capstone poster session, was that teams in one of the courses, 5914 (the course in which teams use various facilities provided by the Watson system), fared poorly on the "other factors" dimension. Part of the intent of that dimension has to do with the team's comparing alternative approaches and tools for solving specific problems in its project and considering relevant tradeoffs before choosing one; and the low scores of the 5914 teams were related to the perceived inadequate attention paid by those teams to this question. This issue was discussed in a recent UGSC meeting. Following the meeting, several members of UGSC engaged in an email discussion about this with the instructors, including the adjunct from IBM, who teach 5914. It turns out that the problem was not that the teams in the course did not go through this important task. Rather, it was that their posters focused on how they actually used specific facilities provided by Watson in their project without much, if any, mention of alternative facilities (both those provided by Watson as well as other systems) that they considered before settling on the ones that they did. In other words, the poor scores in this dimension were mainly due to the failure of the teams to document their evaluation of alternative approaches and facilities to solve specific problems in their projects rather than the failure to engage in such evaluation. The new poster guidelines (see above) include some stress on the importance of documenting this evaluation since it is standard practice in professional projects. If the performance still does not improve, we will strengthen the class discussion on this item.
    h) Mike, Neelam: When discussing the rubric results from a recent section of CSE 2501, a number of members of UGSC felt that the rubric was somewhat lacking with respect to the assessment of the paper(s) that students write in that course. While the rubric addressed the stylistic aspects adequately, it did not include any dimensions related to the quality of the student's argumentation based on appeals to relevant ethical principles or theories. We have since revised the rubric to address this; the change is in the dimension, "Presentation of ideas and organization of the paper".
    i) Ken, Jeremy, Arnab, ...: One of the questions in the POCAT, intended for students who had taken CSE 3241, the Database I course, presented a simple relational schema and asked which sets of attributes could be keys for the schema. The performance on this question was surprisingly weak, with less than 20% of the students answering it correctly. After considerable discussion in UGSC and among faculty involved with the course, the question was tweaked, in one version of the POCAT, to include a line reminding students of the meaning of "key" in this context. The performance on this version of the question was substantially better. The conclusion was that students, by the time of their graduation, do not always recall the definitions of some important terms although they do seem to retain understanding of the underlying concepts. Whether there is a way to improve retention of the definitions of important terms is unclear ...
    j) Jeremy, Arnab, Rafe, Nikki, ...: We currently have two database courses, CSE 3241 and 5242. 3241 is a core-choice course, taken by a large percentage of our BS-CSE students, and fairly well received by them. CSE 5242 is more challenging since it serves two distinct purposes. On the one hand, it is meant as a more advanced course on databases for undergraduate students and, on the other hand, it is intended as a forum for presenting and talking about research problems and recent results in the topic for research-oriented graduate students. While some undergrads who are, by that point, seriously considering graduate school options do well with such an approach, it doesn't suit others who are interested in exploring, in a standard undergrad setting, deeper and newer topics in the field. At recent Annual Forums, this point has come up. The courses on data mining, machine learning, and the newly developed course on cloud computing help but the number of sections of those courses that we can offer, given our current faculty resources, is limited; indeed, the cloud computing course has not generally been available to BS-CSE majors since it was designed to serve the needs of the Data Analytics major which is distinct from BS-CSE and those students get first priority for that course. We are working on addressing the problem.
    k) Paul, Arnab, Chris, Jeremy, Paolo, Neelam, Mike, others: Some students expressed considerable interest in a course (whose detailed contents may well vary from one offering to the next) that presents a few important and powerful new/recent tools/systems etc., with some essential coverage of the underlying concepts followed by discussion of the technical details of the tool/system and how it may be used in practice, including, possibly, a detailed assignment/project involving the tool or system. One risk in having such courses is that students may fill up their tech elective hours with a number of such courses, with the result that their overall technical foundations may be weak and/or lack coherence. Hence, for now, we have shelved the idea; but if there is continuing interest (as is likely) and if we can find satisfactory ways to mitigate this risk, we may pursue it.

  5. Curriculum: This criterion requires the program to include specified minimum number of hours of math including discrete math and probability/statistics, science, engineering and computing topics, a suitable general education component, etc., all of which we satisfy.
    But the PEV may ask you how the program meets, say, the discrete math requirement; the answer is via CSE 2321 (Foundations I) and Math 3345 (Foundations of higher math). Or what general education requirements we have; the answer is that, in addition to English 1110 and one of the second writing courses, the student is required to take 18 additional hours, including one of a specified set of courses dealing with ethical issues. With respect to the computing requirements, students are required to take six core courses; CSE 2501 (or Phil 1338); four core-choice courses; ECE 2020 and 2060; and 17 credit hours of tech electives, of which at least 9 must be CSE courses and the other 8 may be some combination of CSE and appropriate non-CSE courses; the requirement of 8 hours of CSE+non-CSE courses may, instead, be satisfied by completing an appropriate minor such as Math or Linguistics or another field with interesting computing applications.

    The criterion also requires the curriculum to include "coverage of the fundamentals of algorithms, data structures, software design, concepts of programming languages and computer organization and architecture." One important point here is that we have to be able to claim that this coverage is in the *required* courses, not in the electives, since some students may not take particular electives; core-choice courses may be okay if we can claim that each course in a given core-choice pair covers the topic(s) in question. So it would be best to appeal only to the required courses; and we can (mostly) do so as follows. For each case, I list the courses in question and the names of the faculty who should (be able to) talk about this coverage:
    i) algorithms: 2331 (Ken, Rafe); 2321 (Ken, Doreen, Rafe); 2431 (mutual exclusion etc.: Doreen)
    ii) data structures: 2231 (Paolo, Wayne, Paul); 2331(?)(Rafe)
    iii) software design: 2221, 2231 (Paolo, Wayne, Paul);
    iv) concepts of prog. langs.: 2221, 2231 (Paolo, Wayne, Paul); 2421 (Mike); 2431 (concurrency primitives, etc.: Doreen)
    v) computer org./architecture: 2421 (Mike); 3421/3461 (Chris: if you can talk about both courses, arguing, in the case of 3461, that networking, broadly speaking, is part of computer org./architecture, that will work; if you cannot make such an argument reasonably convincingly, I would probably forget it).

    In addition, the curriculum is required to "culminat[e] in a major design experience based on the knowledge and skills acquired in earlier course work and incorporating appropriate engineering standards and multiple realistic constraints". That requirement is essentially a reference to a capstone design/implementation course that meets those requirements. So this is of special importance for Eric and Ramasamy because, of all the people on the ABET schedule, they are the ones most involved with the capstone courses.

  6. (Actually, criteria 6, 7, 8): The remaining three criteria deal with the adequacy of the number and qualifications of the faculty in the department to serve the needs of the students of the program and to ensure its growth and improvement over time; the adequacy of the (computing, library, etc. as well as classroom, lab, office space) facilities available to students and faculty; and the support provided by the university to both attract qualified faculty as well as to provide for their professional needs, such as support for attending professional conferences.
    Obviously, we are stretched very thin when it comes to having adequate faculty; and it is only by using a fair number of adjunct faculty that we have been able to meet the needs of the students. Fortunately, central Ohio has a good supply of qualified and interested adjuncts. And, of course, we are really short of space ... In any case, the PEV may ask for your opinions on these and similar items. Hopefully, you will be able to respond reasonably ...
Two final points: The main part of the self-study that we submitted to ABET is available here. It contains full details about our assessment and continuous improvement activities and all other aspects of the program. If you have any questions about any of that material, please let me know.

Last, and by no means least, thank you very much for being willing to meet with the PEVs. Meetings with faculty are a critical part of ABET evaluations, and your engaged participation is essential to the success of our evaluation. Thank you!