1. The Accreditation Policy and Procedure Manual says (in section II.E.3.10): "Sufficient examples of student work in technical, mathematics, and science courses must be available to the visiting team for the entire campus visit. The examples should show a range of grades for assignments, including homework, quizzes, examinations, drawings, laboratory reports, projects, and samples of computer usage in technical courses. Examples must also be presented to demonstrate compliance with the requirement for student competence in written and oral communications." What would we present as examples of competence in oral communications?

2. Criterion 1 says: "evaluate, advise, and monitor students to determine its success in meeting program objectives." How do we do this? Partly we can point to the exit survey and the alum surveys; at first glance there is nothing else. But that is not quite true: all course exams serve this purpose, to the extent that individual instructors (and groups of faculty, via the course group report mechanism) see how well students have met the objectives of their particular course.

3. Criterion 2b calls for "a process based on the needs of the program's various constituencies in which the objectives are determined and periodically evaluated." The exit survey (the `importance' part, not the `met' part), plus the discussion in UGSC of the survey results, clearly provides the periodic evaluation. Should we also introduce discussion and re-approval of the objectives by the faculty every three years? Any others?

4. The `Program self-study instructions' say (B. Accreditation Summary): "It is suggested that the information presented for each criterion be as complete as possible such that the program evaluator can determine if all of the requirements are being met without cross-referencing material provided under other criteria. This may require some duplication of material but it should aid the evaluator. Reference to material provided in appendixes I and II, and to other information provided by the institution should be made as needed." Also, `Lessons learned' says: "Organize evidence [in the Self-study] by outcomes for Criterion 3."

5. In showing industry involvement, play up outside instructors, and refer to the Perl/Tcl-Tk seminar (and other similar ones) organized by ACM/IEEE, especially those taught by industry folks.

========================
From old notes (in ADMIN/UGC/ACCRED/SELFSTUDY)

0. Pointers:
   a. ADMIN/UGC/ACCRED/coursestatus describes the idea of annual course evaluation and a suggested form for doing it. This led to the course group report idea. Should we abandon the individual course evaluation?
   b. ADMIN/UGC/ACCRED/cover contains the cover letter sent to alums for the alum survey. Quoting from it would be a good idea, since it validates the claim that we are assessing objectives and outcomes (both `met' and `importance').
   c. ACCRED/talkslides.tex contains the presentation to the advisory committee; this is where we presented our objectives and got their feedback. The final objectives were approved by the faculty in Dec. '97.

1. ABET guidelines suggest that a representative group be involved in completing the self-study. The following group would seem to be a good choice: Neelam S., Rick Parent, Tom Page, Bruce Weide, Stu Zweben, a GTA (Steve Fridella?), a senior undergrad, and a junior undergrad.
   Procedure: Neelam will write the first draft and run it by Rick to see if there are any problems. Next the draft will go to the whole group for comments -- sent to Neelam (not e-mail discussions, just comments to Neelam).
   After two weeks, Neelam will produce a new draft; the whole group will again send comments on this draft to Neelam. Following this round of comments, the final draft will be produced by Neelam and Rick (and Stu?).

2. See 0.b.

3. The criteria for capstone courses (see, for example, ACCRED/maintasks) seem to require some change; should we do that now?

4. Capstone course problem: we need to rewrite the descriptions of these courses using a different format than the one for other courses, since they are supposed to stress things other than learning new material. Thus the `mastery/familiarity/exposure' categorization is not appropriate. A uniform format would clearly be best, and it should justify the claim that the course constitutes a major design experience that incorporates important, appropriate considerations. Documentation must be an important part.

5. Play up the `useful' aspects of 294V (the multi-media course), the `relevance' aspect of the 601 discussion of the Microsoft case, and the `process' aspect of Sp'99 601.

6. Some of the ABET criteria will be met by courses like ISE 504.

7. Assessment mechanisms:
   a. Exit surveys.
   b. Alum surveys.
   c. Supervisor surveys (not yet available).
   d. UG forum.
   e. Course group reports.
   f. SETs.
   Create a web page that describes these at a high level and links to evalmech.html and assess.html (in WWW/abet). Do we need a mid-stream survey? Maybe at the end of 560?