ABET Assessment Results for AY 2018-19

Summary

The data for AY 2018-19 was used as part of our continuing improvement process in the Fall of 2020.

Outcome % Excelled % Mastered % Partially Mastered % Below Expectations #Student Assessments
(1) 29 31 31 9 144
(2) 54 46 0 0 215
(3) 47 53 0 0 129
(4) 68 18 12 1 141
(5) 71 29 0 0 215
(6) 31 22 28 19 177

Counting Excelled and Mastered together as having "Satisfied" each student outcome, we can examine historical trends.

Outcome %Satisfied 2014-15 %Satisfied 2015-16 %Satisfied 2016-17 %Satisfied 2018-19 Change from 2016-17
(1) 83 79 73 60 -13
(2) 79 80 93 100 7
(3) 89 96 76 100 24
(4) 100 100 98 86 -12
(5) 83 77 96 100 4
(6) 94 62 51 53 2

The following graphs show the history of the outcomes (1) through (6), using data from the old outcomes (a)-(k) for prior years.

More detail on student assessment can be seen by considering the assessment of individual KPIs.

Outcome KPI % Excelled % Mastered % Partially Mastered % Below Expectations #Student Assessments
(1) (1.1) 17 20 43 20 35
  (1.2) 33 34 28 6 109
(2) (2.1) 53 47 0 0 43
  (2.2) 0 100 0 0 43
  (2.3) 53 47 0 0 43
  (2.4) 100 0 0 0 43
  (2.5) 63 37 0 0 43
(3) (3.1) 0 100 0 0 43
  (3.2) 53 47 0 0 43
  (3.3) 88 12 0 0 43
(4) (4.1) 37 33 22 7 27
  (4.2) 93 7 0 0 27
  (4.3) 93 7 0 0 27
  (4.4) 67 15 19 0 27
  (4.5) 55 27 18 0 33
(5) (5.1) 88 12 0 0 43
  (5.2) 88 12 0 0 43
  (5.3) 88 12 0 0 43
  (5.4) 88 12 0 0 43
  (5.5) 0 100 0 0 43
(6) (6.1) 23 23 49 6 35
  (6.2) 33 31 33 3 36
  (6.3) 81 6 11 3 36
  (6.4) 0 23 11 66 35
  (6.5) 17 29 34 20 35

Assessment Data

COSC 3011: Software Design, Kim Buckner

Core Course Assessed
COSC 3011: Software Design Yearly
Performance Indicator (1.2): Analyze two or more proposed solutions to a given problem and select the best solution for the given problem.

This was assessed through the program design project: the teams were required to plan, design, and program a game. The project ran in five steps over the majority of the semester, with a sixth wrap-up step due as the final.

I combined the results of the first five steps of the programming project, as was done for the course grade; together these steps were worth 45% of the course grade.

73 students were assessed.

Excelled: 28 students 
Mastered: 26 students 
Partially Mastered: 19 students 
Below Expectations: 0 students

Performance Indicator (2.3): Design the selected solution for a given problem. 

COSC 3020: Algorithms & Data Structures, Lars Kotthoff

Core Course Assessed
COSC 3020: Algorithms & Data Structures Every other year, starting 2016-17

Performance Indicator (1.1): Identify key components and algorithms necessary for a solution

Question 6 in final exam: Devise an algorithm to determine whether two graphs
are isomorphic or not.
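
To make the question concrete, here is a minimal sketch of one possible answer: a brute-force checker that tries every vertex relabeling, with a degree-sequence test for quick rejection. The function name and representation (adjacency sets) are illustrative assumptions, not the course's reference solution; runtime is O(n! n^2), so this is viable only for small graphs.

```python
from itertools import permutations

def are_isomorphic(adj1, adj2):
    """Brute-force graph isomorphism check.

    adj1, adj2: undirected graphs as adjacency sets,
    e.g. {0: {1}, 1: {0}}. Tries every vertex permutation,
    so this is O(n! * n^2) and practical only for small n.
    """
    n1, n2 = sorted(adj1), sorted(adj2)
    if len(n1) != len(n2):
        return False
    # Quick rejection: degree sequences of isomorphic graphs match.
    if sorted(len(adj1[v]) for v in n1) != sorted(len(adj2[v]) for v in n2):
        return False
    for perm in permutations(n2):
        mapping = dict(zip(n1, perm))
        # The mapping is an isomorphism iff it preserves (non-)edges.
        if all((mapping[u] in adj2[mapping[v]]) == (u in adj1[v])
               for v in n1 for u in n1):
            return True
    return False
```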

35 students were assessed.

6 students excelled.
7 students mastered.
15 students partially mastered.
7 students were below expectations.

Performance Indicator (1.2): Analyze two or more proposed solutions to a given problem and select the best solution for the given problem

Assignment 3: Implement two algorithms for solving the traveling salesman
problem and compare them.
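
A pair of algorithms of the kind students might have compared, sketched under assumed names and a distance-matrix representation (the assignment's actual choices are not reproduced here): exhaustive search, which is optimal but O(n!), versus the nearest-neighbour heuristic, which is O(n^2) but carries no optimality guarantee.

```python
from itertools import permutations

def tsp_exact(dist):
    """Exhaustive search over all tours from city 0: optimal, O(n!)."""
    n = len(dist)
    best = float('inf')
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        best = min(best, sum(dist[tour[i]][tour[i + 1]] for i in range(n)))
    return best

def tsp_nearest_neighbour(dist):
    """Greedy heuristic: always visit the closest unvisited city. O(n^2)."""
    n = len(dist)
    unvisited = set(range(1, n))
    cur, cost = 0, 0
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist[cur][c])
        cost += dist[cur][nxt]
        unvisited.remove(nxt)
        cur = nxt
    return cost + dist[cur][0]  # close the tour
```

Comparing the two on the same instances exposes exactly the trade-off the indicator asks students to analyze: the heuristic's tour cost is never below the exact optimum.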

36 students were assessed.

8 students excelled.
11 students mastered.
11 students partially mastered.
6 students were below expectations.

Performance Indicator (6.1): Analyze the asymptotic cost of divide-and-conquer algorithms

Question 4 in assignment 1: Implement an iterative and in-place version of merge
sort and analyze its complexity.
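
A sketch of the iterative (bottom-up) half of the exercise, under the usual O(n) auxiliary buffer; a truly in-place merge is a much harder further refinement and is not attempted here. All names are illustrative, not the assignment's reference solution. The pass structure makes the cost analysis visible: O(log n) passes of O(n) work each, hence Theta(n log n).

```python
def merge_sort_bottom_up(a):
    """Iterative (bottom-up) merge sort on a list, in place at the
    list level but using an O(n)-sized merge buffer per run pair.

    Runs of width 1, 2, 4, ... are merged pairwise; the widths
    double, so there are O(log n) passes of O(n) work each.
    """
    n = len(a)
    width = 1
    while width < n:
        for lo in range(0, n, 2 * width):
            mid = min(lo + width, n)
            hi = min(lo + 2 * width, n)
            merged = []
            i, j = lo, mid
            while i < mid and j < hi:
                if a[i] <= a[j]:          # <= keeps the sort stable
                    merged.append(a[i]); i += 1
                else:
                    merged.append(a[j]); j += 1
            merged.extend(a[i:mid])
            merged.extend(a[j:hi])
            a[lo:hi] = merged
        width *= 2
    return a
```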

35 students were assessed.

8 students excelled.
8 students mastered.
17 students partially mastered.
2 students were below expectations.

Performance Indicator (6.2): Analyze the asymptotic cost of recursive algorithms

Question 3 in the midterm: Give Theta-bounds for three recurrence
relations.
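
The exam's actual recurrences are not reproduced here; worked examples of the kind of Theta-bound asked for might look like the following (the first two via the master theorem, the third by unrolling):

```latex
% Master theorem form: T(n) = a\,T(n/b) + f(n), compare f(n) with n^{\log_b a}.
T(n) = 2\,T(n/2) + n,\quad n^{\log_2 2} = n = \Theta(f(n))
  \;\Rightarrow\; T(n) = \Theta(n \log n)
T(n) = 4\,T(n/2) + n,\quad f(n) = O(n^{\log_2 4 - \varepsilon})
  \;\Rightarrow\; T(n) = \Theta(n^2)
T(n) = T(n-1) + 1 \;\Rightarrow\; T(n) = \Theta(n)
  \quad\text{(by unrolling; master theorem does not apply)}
```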

36 students were assessed.

12 students excelled.
11 students mastered.
12 students partially mastered.
1 student was below expectations.

Performance Indicator (6.3): Analyze the asymptotic cost of basic graph algorithms

Lab 8: Implement the Floyd-Warshall algorithm and analyze its complexity.
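
Floyd-Warshall itself is standard; a minimal version (function name and matrix representation assumed) shows where the complexity analysis comes from: three nested loops over the n vertices give Theta(n^3) time, and the n x n matrix gives Theta(n^2) space.

```python
def floyd_warshall(dist):
    """All-pairs shortest paths.

    dist: n x n matrix of edge weights, float('inf') where no edge.
    Relaxes every pair (i, j) through every intermediate vertex k:
    three nested loops over n vertices, hence Theta(n^3) time.
    """
    n = len(dist)
    d = [row[:] for row in dist]  # work on a copy; Theta(n^2) space
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```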

36 students were assessed.

29 students excelled.
2 students mastered.
4 students partially mastered.
1 student was below expectations.

Performance Indicator (6.4):  Describe the impact of techniques such as caching 
and dynamic programming on the performance of algorithms

Question 5 in final exam: Implement a dynamic programming solution to
compute the Liouville number.
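
The Liouville assignment itself is not reproduced here, so as a stand-in illustration of what PI (6.4) measures, the performance impact of caching on a recursive computation, consider the classic naive-versus-memoized recurrence (a hypothetical example, not the course's solution):

```python
from functools import lru_cache

def fib_naive(n):
    """Plain recursion: subproblems are recomputed repeatedly,
    giving exponentially many calls."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Identical recurrence with caching: each subproblem is solved
    once, so only O(n) calls are made."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)
```

The two functions compute the same values; only the call count changes, which is exactly the "impact on performance" the indicator asks students to describe.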

35 students were assessed.

0 students excelled.
8 students mastered.
4 students partially mastered.
23 students were below expectations.

Performance Indicator (6.5):  Understand the difference between polynomial 
and exponential complexity

Question 8 in the final exam: Design a local search algorithm for the n-Queens
problem and argue whether it will always find the optimal solution (exhaustive
search, exponential complexity) or not (stopping before all possibilities are
searched, polynomial complexity).
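
One local search of the kind the question invites is the min-conflicts heuristic, sketched below with assumed names (not a reference solution). It illustrates exactly the trade-off being examined: each step is polynomial and the search stops as soon as no queen is attacked, so it inspects far fewer than the exponential number of full boards, but because it stops before exhausting the space it is not guaranteed to succeed within its step budget.

```python
import random

def min_conflicts_queens(n, max_steps=10000, seed=0):
    """Min-conflicts local search for n-Queens.

    One queen per column; board[c] is that queen's row. Each step
    picks a random conflicted column and moves its queen to a row
    minimizing conflicts. Returns a solution, or None if the step
    budget runs out (local search gives no completeness guarantee).
    """
    rng = random.Random(seed)
    board = [rng.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        # Queens attack along rows and diagonals; columns are
        # distinct by construction.
        return sum(1 for c in range(n)
                   if c != col and (board[c] == row or
                                    abs(board[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, board[c]) > 0]
        if not conflicted:
            return board  # no queen attacks another: done
        col = rng.choice(conflicted)
        scores = [conflicts(col, r) for r in range(n)]
        best = min(scores)
        # Break ties randomly to avoid deterministic cycles.
        board[col] = rng.choice([r for r in range(n) if scores[r] == best])
    return None
```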

35 students were assessed.

6 students excelled.
10 students mastered.
12 students partially mastered.
7 students were below expectations.

COSC 3050: Ethics, Robin Hill

Core Course Assessed
COSC 3050: Ethics Every other year, starting 2016-17
Performance Indicator (4.1): Recognize ethical issues involved in a 
professional setting.

Exercise #4

1.  Would the ACM Code help resolve the Vulnerability Disclosure case? If so, cite the section and explain (as was done in the Dark UX analysis). If not, explain whether the Code should be enhanced.

2.  How much do you think you should get paid, as a computing professional, relative to other professionals, such as police, teachers, and lawyers? Is this an ethical issue? Do, or should, the professional Codes say anything about this?

There were 27 students assessed for the two sections of the course.

Excelled: 10 students
Mastered: 9 students
Partially Mastered: 6 students
Below Expectations: 2 students

Performance Indicator (4.2): Describe current issues in security. 

Q.  Voting machines exhibit security issues; see the Bibliography article by Nick Corasaniti if you need an example. Many electoral processes are administered by nontechnical local officials, who lack the expertise and resources to implement solid security measures. How would you explain the ethics of security to such officials (not a sales pitch, but an ethics pitch)? How would you appeal to their ideas of right and wrong in order to convince them to take security seriously?


There were 27 students assessed for the two sections of the course.

Excelled: 25 students
Mastered: 2 students
Partially Mastered: 0 students
Below Expectations: 0 students

Performance Indicator (4.3): Describe current issues in privacy.

Q.  We have often referred to the difference between older and newer expectations.  What about privacy?  Consider the case of motor vehicle records (Tavani, Case 5-12).  Was he doing anything that could not be done pre-Internet?  Does the easy access make a difference to the ethics?

There were 27 students assessed for the two sections of the course.

Excelled: 25 students
Mastered: 2 students
Partially Mastered: 0 students
Below Expectations: 0 students

Performance Indicator (4.4): Respect and honor ethics in writing assignments.

Mid-Term Exam, Question #2:   Consider an avatar, represented by a user-designed graphic, in Second Life or some other real-world simulation that is meant to be accurate. Is that avatar an ethical agent? An ethical patient? Explain, considering other factors that may be relevant, such as whether the avatar represents a player as opposed to a non-player character.

There were 27 students assessed for the two sections of the course.

Excelled: 18 students
Mastered: 4 students
Partially Mastered: 5 students
Below Expectations: 0 students

Performance Indicator (4.5): Acquire and practice skills that enable critical examination of ethical issues.

Question #2 (20 pts):  We have seen different normative theories (Consequentialism, Deontology, Virtue Ethics, Intuitionism, Supernaturalism, Social Contract).  Consider the case of the cooperative deployment of a disruptive worm to thwart bad Internet conduct, as described in the Rogue web hosting case. Apply two of the theories, briefly, explaining what should be done under that theory.

There were 27 students assessed for the two sections of the course.

Excelled: 12 students
Mastered: 9 students
Partially Mastered: 6 students
Below Expectations: 0 students

COSC 4950/5: Senior Design I & II, Mike Borowczak

Core Course Assessed
COSC 4950: Senior Design I Every other year, starting 2017-18
COSC 4955: Senior Design II Every other year, starting 2017-18

This is the capstone course. Students work in groups of three to five, select a project, and create a software system to address it. The students flesh out the project ideas, deciding on the important features and coming to terms with project risk. They decide what technology to use and proceed to implementation. They also document their decisions by writing use cases or other artifacts as appropriate (e.g., storyboards for teams that choose to write games).

Class time is devoted to group work and to presentations that update the instructor and the other teams on the status of the project. Teams also create a poster presentation of their project and give a final presentation to a wide audience. The class presentations, poster, and final presentation are the main tools used for assessment.

Performance Indicator (2.1): Identify constraints on the design problem

Does the capstone Project Design Document contain details on constraints (e.g., limitations and feasibility)?

43 students were assessed.

23 students excelled.
20 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (2.2): Establish acceptance criteria for a solution

Does the capstone Project Design Document contain acceptance criteria (e.g., gated design/acceptance criteria)?

43 students were assessed.

43 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (2.3): Design the selected solution for a given problem

Mid-capstone presentation and GitHub repositories contain evidence of design

43 students were assessed.

23 students excelled.
20 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (2.4): Implement the designed solution for a given problem

Capstone presentation and GitHub repositories contain evidence of deliverables

43 students were assessed.

43 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (2.5): Evaluate the implemented solution

Mid-capstone presentation: path forward; evaluation of other designs

43 students were assessed.

27 students excelled.
16 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (5.1): Listen to other team members

End of capstone team reflection.

43 students were assessed.

38 students excelled.
5 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (5.2): Actively discuss team projects, objectives, or challenges with other team members

Actively discuss team projects, objectives, or challenges with other team members

43 students were assessed.

38 students excelled.
5 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (5.3): Fulfill team duties on time

Fulfill team duties on time

43 students were assessed.

38 students excelled.
5 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (5.4): Share in the work of the team

Status updates; Git tracking (does the statement of responsibilities align with the actual work performed?)

43 students were assessed.

38 students excelled.
5 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (5.5): Research and gather information

Capstone Project Design Document

43 students were assessed.

0 students excelled.
43 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (3.1): Write technical reports

Is the Capstone Project Design Document written for a technical audience?

43 students were assessed.

0 students excelled.
43 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (3.2): Present technical material to technical peers

Capstone presentation survey tool: did peers understand the presentation?

43 students were assessed.

23 students excelled.
20 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (3.3): Present technical material to non-technical visitors

Capstone presentation survey tool: did non-technical visitors understand the presentation?

43 students were assessed.

38 students excelled.
5 students mastered.
0 students partially mastered.
0 students were below expectations.