ABET Assessment Results for AY 2015-16

Summary

The data for AY 2015-16 was used in the Department’s response to the ABET evaluation of our program following the ABET Team Visit in October 2015.

Outcome  % Excelled  % Mastered  % Partially Mastered  % Below Expectations  # Student Assessments
(a)          45          35              9                     11                    185
(b)          46          33             15                      6                     94
(c)          58          22              9                     11                    232
(d)          53          24             19                      4                    165
(e)          98           2              0                      0                     96
(f)          67          29              4                      0                     99
(g)         100           0              0                      0                     48
(h)          81          10              9                      0                     81
(i)          56          26             10                      7                     96
(j)          36          26             17                     21                    131
(k)          74           8              2                     16                    383

Counting a student outcome as “Satisfied” when the assessment was Excelled or Mastered, we can examine historical trends.

Outcome  % Satisfied 2014-15  % Satisfied 2015-16  Change
(a)               72                   80              8
(b)               83                   79             -4
(c)               79                   80              1
(d)               83                   77             -6
(e)              100                  100              0
(f)               89                   96              7
(g)              100                  100              0
(h)              100                   91             -9
(i)               91                   82             -9
(j)               94                   62            -32
(k)               64                   82             18

As the table shows, the percentage of students who satisfy each outcome is largely unchanged across the two years for which we have data. Outcomes (j) and (k) show the largest differences, and we attribute this to a better understanding of the assessment process among the faculty rather than to any real change in student performance. We will, of course, continue to monitor this closely in the coming years.

More detail on student assessment can be seen by considering the assessment of individual KPIs.

Outcome  KPI    % Excelled  % Mastered  % Partially Mastered  % Below Expectations  # Student Assessments
(a)      (a.1)      39          41               6                    14                     51
         (a.2)      25          17               8                    50                     24
         (a.3)      54           4              38                     4                     24
         (a.4)      92           0               4                     4                     24
         (a.5)      39          58               3                     0                     31
         (a.6)      35          65               0                     0                     31
(b)      (b.1)      16          35              29                    19                     31
         (b.2)      60          32               8                     0                     63
(c)      (c.1)      77          11               0                    11                     35
         (c.2)      69          20               0                    11                     35
         (c.3)      58          24               9                     8                     96
         (c.4)      64          12              15                     9                     33
         (c.5)      18          42              21                    18                     33
(d)      (d.1)      52           9              30                     9                     33
         (d.2)      91           0               0                     9                     33
         (d.3)      79           0              21                     0                     33
         (d.4)      18          58              24                     0                     33
         (d.5)      24          55              21                     0                     33
(e)      (e.1)     100           0               0                     0                     24
         (e.2)     100           0               0                     0                     24
         (e.3)     100           0               0                     0                     24
         (e.4)      92           8               0                     0                     24
(f)      (f.1)      24          64              12                     0                     33
         (f.2)     100           0               0                     0                     33
         (f.3)      76          24               0                     0                     33
(g)      (g.1)     100           0               0                     0                     24
         (g.2)     100           0               0                     0                     24
(h)      (h.1)     100           0               0                     0                     24
         (h.2)      55          24              21                     0                     33
         (h.3)     100           0               0                     0                     24
(i)      (i.1)      32          52              12                     4                     25
         (i.2)      50           0               0                    50                     12
         (i.3)      73           0              27                     0                     26
         (i.4)      64          36               0                     0                     33
(j)      (j.1)      53          34               6                     6                     32
         (j.2)       6          27              21                    45                     33
         (j.3)      66           0              16                    19                     32
         (j.4)      11          56              22                    11                     18
         (j.5)      31          25              25                    19                     16
(k)      (k.1)      52          27               9                    12                     33
         (k.2)      77           0               0                    23                    252
         (k.3)      60          32               8                     0                     63
         (k.4)     100           0               0                     0                     35

Assessment Data

COSC 1010: Intro to Programming, Allyson Anderson

Core Course KPIs Outcomes
COSC 1010: Intro to Programming (i.2) (i)
Performance Indicator (i.2): Program in Java, including use of the Java collections and other useful 
Java frameworks

Homework 8:  The salesReport project allows an employee to enter a simple daily sales report and 
prints out the results. The employee must enter a username and password. After verifying the 
username and password, the program allows the employee to enter the day's sales details. 
Sales details include the product name, product ID#, price, and quantity sold. An object should be 
created for each product and then added to the daily collection of product objects. When the 
salesperson is finished entering data, the program calculates the total cash value of the 
salesperson's sales for the day and prints out the report.

There were 12 Computer Science majors in the course; 6 of them did not turn in the assignment. 
6 students were assessed.
6 students excelled.

COSC 2030: Programming II, Tom Bailey

Core Course Assessed
COSC 2030: Programming II Yearly
Performance Indicator (i.1):  Program in C++, including use of the C++ Standard Library

Problem 3.2 from Exam 3

Give a diagram of the storage structure for a Standard Template Library vector. Explain your 
diagram.

Give a diagram of the storage structure for a Standard Template Library list. Explain your 
diagram.

The STL list class has a method that inserts a new entry at the front of this list object. 
The STL vector class does not have this method. Why not?

25 students were assessed.
8 students excelled.
13 students mastered.
3 students partially mastered.
1 student was below expectations.

Performance Indicator (i.3):  Use an IDE to edit, compile, and debug a program

Project 01:  Read and process a file of lists of numbers

Task 

Write a C++ program that reads lists from a file and reports the name, size, mean if 
appropriate, and smallest number if appropriate for each list; and reports the list 
name and sum of the list with the largest sum.

The file to be read consists of:
    the number of lists in the file,
then
    zero or more of:
        a list sentinel, a number not equal to any value in this list;
        the list name, readable as a C++ STL string;
        zero or more list values, readable as C++ doubles;
        the list sentinel

26 students were assessed.

19 students excelled.
0 students mastered.
7 students partially mastered.
0 students were below expectations.

COSC 2300: Discrete Mathematics, John Cowles

Core Course Assessed
COSC 2300: Discrete Mathematics Yearly
Performance Indicator (a.1): Use discrete mathematics techniques

 Quiz 1
   converse & contrapositive
   negations & De Morgan's laws
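For reference, the standard identities behind these quiz topics (stated here for context, not as the quiz itself):

```latex
% Converse and contrapositive of a conditional
\text{converse of } p \to q:\quad q \to p
\qquad
\text{contrapositive of } p \to q:\quad \neg q \to \neg p

% De Morgan's laws for negations
\neg(p \land q) \equiv \neg p \lor \neg q
\qquad
\neg(p \lor q) \equiv \neg p \land \neg q
```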

 27 students were assessed.

  6 students excelled.
 20 students mastered.
  0 students partially mastered.
  1 student was below expectations.
 -------------------------------------------------
 Quiz 2 
   direct proof

 24 students were assessed.

 14 students excelled.
  1 student mastered.
  3 students partially mastered.
  6 students were below expectations.

Performance Indicator (a.2): Estimate cardinality of relevant events
                             in computing applications

 Final Question 7
   count sets, bags, lists, & sorted lists

 24 students were assessed.

  6 students excelled.
  4 students mastered.
  2 students partially mastered.
 12 students were below expectations.

Performance Indicator (a.3): Use mathematical induction to prove 
                             mathematical formulas that arise
                             in computing applications

 Final Question 6 
   math. induction proof

 24 students were assessed.

 13 students excelled.
  1 student mastered.
  9 students partially mastered.
  1 student was below expectations.

COSC 3011: Software Design, Kim Buckner

Core Course Assessed
COSC 3011: Software Design Yearly
Performance Indicator (b.2): Analyze two or more proposed solutions 
to a given problem and select the best solution for that problem.

This was assessed through the program design project. The teams were required 
to plan, design, and program a game. The project ran in five steps over the majority 
of the semester, with a sixth wrap-up step due as the final. The programming 
assignments were Program 01 through Program 05; each built upon the previous 
solutions to culminate in a graphical game of Izzi.

I have combined the results of the first five steps of the programming project, 
as I did for the course grade.

63 students were assessed.

38 students excelled.
20 students mastered.
5 students partially mastered.
0 students were below expectations.

Performance Indicator (c.3): Design the selected solution for a given problem. 

This was assessed with the same set of projects as (b.2).

Performance Indicator (k.2): Describe commonly used design patterns.

This was assessed using several of the quizzes given on the student 
presentations: Visitor Pattern, Factory Pattern, Observer Pattern, Model-View-Controller.

Visitor pattern:
63 students were assessed.
Excelled: 50 students
Mastered: 0 students
Partially Mastered: 1 student
Below Expectations: 12 students (most did not take the quiz)

Factory pattern:
63 students were assessed.
Excelled: 42 students
Mastered: 0 students
Partially Mastered: 0 students
Below Expectations: 21 students (most did not take the quiz)

Observer pattern:
63 students were assessed.
Excelled: 51 students
Mastered: 0 students
Partially Mastered: 0 students
Below Expectations: 12 students (most did not take the quiz)

Model-View-Controller:
63 students were assessed.
Excelled: 50 students
Mastered: 0 students
Partially Mastered: 0 students
Below Expectations: 13 students (most did not take the quiz)

Performance Indicator (k.3): Design the selected solution for a given problem. 

This was assessed with the same set of projects as (b.2) and (c.3).

COSC 3020: Algorithms & Data Structures, Tom Bailey

Core Course Assessed
COSC 3020: Algorithms & Data Structures Every other year, starting 2016-17

Performance Indicator (a.4): Calculate the sum of arithmetic series 
                             that arise in computing applications

 Final Question 2
   Gauss' method for summing arithmetic series

 24 students were assessed.

 22 students excelled.
  0 students mastered.
  1 student partially mastered.
  1 student was below expectations.

Performance Indicator (a.5): Calculate the sum of geometric series 
                             that arise in computing applications

Problem 3.8 from Exam 3

true / false :   $2n \in O(n)$
true / false :   $2^c + 4^c + 8^c + \dots + 2^{nc} \in \Theta(2^{n(c+1)})$
true / false :   $(\log{n^2}) \in O(n/\log{n})$
true / false :   $9^{\log_3(n)} \in \Omega(n(n+1))$
true / false :   $1 + 8 + 27 + 64 + \dots + (n/3)^3 \in \Theta( n^4 )$
true / false :   Merge Sort time is in $\Theta(n \log n)$.
true / false :   Binary Search time is in $\Omega(\log n)$.

31 students were assessed.

12 students excelled.
18 students mastered.
1 student partially mastered.
0 students were below expectations.

Performance Indicator (a.6): Use calculus to find the asymptotic limit of functions

Problem 3.5 from Exam 3

    Order these Big Oh time complexity classes from smallest (slowest growing) to largest.

    * $O( \log(n^n) )$
    * $O( (\log(n))\cdot(\log(n^2)) )$    
    * $O( n \cdot (1 + n) )$
    * $O( n + 2 \sqrt{n} )$
    * $O( 8^{\log_2{n}} )$
    * $O( n! / 2^n )$   
    * $O( 24 )$
    * $O( 3^n )$
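For context, the standard simplifications used to order classes like these (reference identities, not the grading key):

```latex
\log(n^n) = n \log n
\qquad
(\log n) \cdot \log(n^2) = 2(\log n)^2
\qquad
8^{\log_2 n} = n^{\log_2 8} = n^3

n + 2\sqrt{n} \in \Theta(n)
\qquad
n \cdot (1 + n) \in \Theta(n^2)
\qquad
24 \in \Theta(1)
```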

31 students were assessed.

11 students excelled.
20 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (b.1): Identify key components and algorithms necessary for a solution

Problem 3.1 from Exam 3

Partition can be used to solve the Selection problem. 

What is the expected time complexity of this solution when the required value is the smallest value in the collection?

Explain your answer.


31 students were assessed.

5 students excelled.
11 students mastered.
9 students partially mastered.
6 students were below expectations.

Performance Indicator (j.2): Analyze the asymptotic cost of recursive algorithms

Problem 1.3 from Exam 1

    Give a recurrence relation for the time required by this C++ function.  The size of the problem is the value of jump.

int stumble( int jump ) 
{
    if( jump < 8 ) 
    {
        return jump;
    }
    
    int step = 3;
    while( step * step < jump )
    {
        ++step;
    }

    int distance = 0;
    for( int i=0; i<step; ++i )
    {
        distance += stumble( step );
    }
    return distance;
}
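For context, one plausible reading of the recurrence (an editorial sketch, not the grading key): after the while loop, step is roughly the square root of jump, and the function then makes step recursive calls on problems of size step, so

```latex
T(j) =
\begin{cases}
\Theta(1), & j < 8,\\[4pt]
\sqrt{j}\; T\!\bigl(\sqrt{j}\bigr) + \Theta\bigl(\sqrt{j}\bigr), & j \ge 8.
\end{cases}
```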


33 students were assessed.

2 students excelled.
9 students mastered.
7 students partially mastered.
15 students were below expectations.

Performance Indicator (j.3): Analyze the asymptotic cost of basic graph algorithms

Lab 07

Graph methods:  reverse, topological sort

32 students were assessed.

21 students excelled.
0 students mastered.
5 students partially mastered.
6 students were below expectations.

Performance Indicator (j.4):  Describe the impact of techniques such as caching 
and dynamic programming on the performance of algorithms

Problem 4.1 from Exam 4

Consider this recursive function:

int
test(int n)
{
    const int terms = 19;

    int sum = 1;
    if (n >= 1)
    {
        for (int i = 1; i <= terms; ++i)
            sum += test(n / (i + 1));
    }
    return sum;
}

When called with n = 100,000, the function required more than 25 seconds to 
make over 900,000,000 recursive calls.

Sketch the pseudocode for a memoized version of this function.  Explain why this 
version is expected to be much faster than the pure recursive version.

Why would memoization be better than dynamic programming for this problem?

18 students were assessed.

2 students excelled.
10 students mastered.
4 students partially mastered.
2 students were below expectations.

Performance Indicator (j.5):  Understand the difference between polynomial 
and exponential complexity

Problem 12 from the Final Exam

    Define the longest increasing subsequence problem.  

    Describe a naïve, brute force solution that solves this problem.

    What is the time complexity of this solution?

    Describe a dynamic programming solution that solves this problem in O(n^2) time.

16 students were assessed.

5 students excelled.
4 students mastered.
4 students partially mastered.
3 students were below expectations.

COSC 3050: Ethics, Kim Buckner

Core Course Assessed
COSC 3050: Ethics Every other year, starting 2016-17
Performance Indicator (e.1): Recognize ethical issues involved in a 
professional setting.

Case studies/readings/quiz topics: Performance, harassment, professional 
responsibility.

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (e.2): Describe current issues in security. 

Case studies/readings/quiz topic: Security

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (e.3): Describe current issues in privacy.

Case studies/readings/quiz topic: Privacy, free speech

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (e.4): Respect and honor ethics in writing assignments.

Final paper.

The final papers were all reasonable; the biggest problems were grammatical errors, 
especially among our international students.

24 students were assessed.

22 students excelled.
2 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (g.1): Understand the impact of computing solutions on 
society in a global economic context.

Case studies/readings/quiz: Visas, whistle blowing

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (g.2): Describe non-technical computing issues such as 
sustainability, entrepreneurship, and outsourcing.

Case studies/readings/quiz: Codes of conduct, social society, visas (same as for (g.1)) 

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (h.1): Read and report on papers in the technical literature.

Same as (e.1), (e.2), (e.3), (e.4), (g.1), (g.2) and the final paper. The assessment 
of this item is included in the first assessment.

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (h.3): Review articles, chapters, or presentations from the 
professional literature.

Same as (e.1), (e.2), (e.3), (e.4), (g.1), (g.2) and the final paper. The assessment 
of this item is included in the first assessment.

24 students were assessed.

24 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

COSC 4950: Senior Design I, Ruben Gamboa

Core Course Assessed
COSC 4950: Senior Design I Every other year, starting 2017-18

This is the first part of the capstone course. Students work in groups of 3 to 5, and they select a project. In this course, the students flesh out the project, deciding on the important features and coming to terms with project risk. They decide what technology to use, and they may start implementation. They also document their decisions by writing use cases or other artifacts as appropriate (e.g., storyboards for projects that choose to write games).

Class time was devoted to group presentations that updated the instructor and the other teams on the status of the project. These presentations are the main tool used for assessment.

Performance Indicator (c.1): Identify constraints on the design problem

35 students were assessed.

27 students excelled.
4 students mastered.
0 students partially mastered.
4 students were below expectations.

Performance Indicator (c.2): Establish acceptance criteria for a solution

35 students were assessed.

24 students excelled.
7 students mastered.
0 students partially mastered.
4 students were below expectations.

Performance Indicator (k.4): Describe agile methods of software development

35 students were assessed.

35 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

COSC 4955: Senior Design II, Ruben Gamboa

Core Course Assessed
COSC 4955: Senior Design II Every other year, starting 2017-18

This is the second part of the capstone course. Students continue to work in the same groups as in COSC 4950, and they finish the implementation of the project they started in that class. Students also prepare for the final presentations and give a presentation describing their project to a general audience. Finally, students develop their project using source code control through GitHub, and they turn in their final projects by inviting the instructor to the GitHub repository.

Class time was devoted to group presentations that updated the instructor and the other teams on the status of the project. These presentations are the main tool used for assessment. For the following performance indicators, the project source code or the final presentation is also used for assessment: (d.4), (d.5), (f.1), (f.2), (f.3), (i.4), (k.1).

Performance Indicator (c.3): Design the selected solution for a given problem

33 students were assessed.

18 students excelled.
3 students mastered.
4 students partially mastered.
8 students were below expectations.

Performance Indicator (c.4): Implement the designed solution for a given problem

33 students were assessed.

21 students excelled.
4 students mastered.
5 students partially mastered.
3 students were below expectations.

Performance Indicator (c.5): Evaluate the implemented solution

33 students were assessed.

6 students excelled.
14 students mastered.
7 students partially mastered.
6 students were below expectations.

Performance Indicator (d.1): Listen to other team members

33 students were assessed.

17 students excelled.
3 students mastered.
10 students partially mastered.
3 students were below expectations.

Performance Indicator (d.2): Actively discuss team projects, 
objectives, or challenges with other team members

33 students were assessed.

30 students excelled.
0 students mastered.
0 students partially mastered.
3 students were below expectations.

Performance Indicator (d.3): Fulfill team duties on time

33 students were assessed.

26 students excelled.
0 students mastered.
7 students partially mastered.
0 students were below expectations.

Performance Indicator (d.4): Share in the work of the team

33 students were assessed.

6 students excelled.
19 students mastered.
8 students partially mastered.
0 students were below expectations.

Performance Indicator (d.5): Research and gather information

33 students were assessed.

8 students excelled.
18 students mastered.
7 students partially mastered.
0 students were below expectations.

Performance Indicator (f.1): Write technical reports

Assessed using the documentation and project description.

33 students were assessed.

8 students excelled.
21 students mastered.
4 students partially mastered.
0 students were below expectations.

Performance Indicator (f.2): Present technical material to technical peers

33 students were assessed.

33 students excelled.
0 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (f.3): Present technical material to non-technical visitors

Assessed using the final presentation.

33 students were assessed.

25 students excelled.
8 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (h.2): Participate in professional activities
of a program

Assessed using ACM Student Chapter meetings, as well as references to
trade books and conferences during class discussion.

33 students were assessed.

18 students excelled.
8 students mastered.
7 students partially mastered.
0 students were below expectations.

Performance Indicator (i.4): Use source code control to manage different versions 
of a program

Assessed using the final project submission.

33 students were assessed.

21 students excelled.
12 students mastered.
0 students partially mastered.
0 students were below expectations.

Performance Indicator (k.1): Design the overall architecture of a software system

Assessed using the final project submission and class status reports.

33 students were assessed.

17 students excelled.
9 students mastered.
3 students partially mastered.
4 students were below expectations.