### Abstract

Small auto-graded coding exercises with immediate feedback are widely recognized as helping students in introductory programming courses. We analyzed usage of a particular coding exercise tool in 11 courses at different universities. Instructors awarded differing amounts of course points (including zero) for the exercises. We investigated how awarding points affected students' completion rates of the exercises. We found that without awarding points, completion rates were about 25%. Awarding even a small number of points, such as 2 course points out of 100, resulted in 62% completion, with little increase in completion rates for more course points (such as 5, 10, or even 25). Compared to participation-activity completion rates of 85%, one might conclude that the 62% falls short of 100% in part because some students simply do not do homework (15%), with the remaining 23% attributable to the greater difficulty of the exercises. We analyzed time spent and found that students spent about 3.3 minutes per exercise, matching the 2-4 minutes expected by the exercise authors. We analyzed the number of tries per exercise and found that students submitted 3.5 tries on average. For some harder exercises, the averages were higher, at 5-10 tries, suggesting that students are indeed putting forth good effort. We found very high numbers of tries by some students on a single exercise, sometimes 30, 50, or even 100, suggesting more work is needed to help such students learn the concepts rather than repeatedly submitting tries, and to reduce frustration and increase learning efficiency.

| Language | English (US) |
|---|---|
| Journal | ASEE Annual Conference and Exposition, Conference Proceedings |
| Volume | 2017-June |
| State | Published - Jun 24 2017 |

### ASJC Scopus subject areas

- Engineering (all)

### Cite this

Edgcomb, A. D., Vahid, F., Lysecky, R., & Lysecky, S. (2017). An analysis of incorporating small coding exercises as homework in introductory programming courses. *ASEE Annual Conference and Exposition, Conference Proceedings*, 2017-June.

Research output: Article › peer-review

TY - JOUR

T1 - An analysis of incorporating small coding exercises as homework in introductory programming courses

AU - Edgcomb, Alex Daniel

AU - Vahid, Frank

AU - Lysecky, Roman

AU - Lysecky, Susan

PY - 2017/6/24

Y1 - 2017/6/24


UR - http://www.scopus.com/inward/record.url?scp=85030537779&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85030537779&partnerID=8YFLogxK

M3 - Article

VL - 2017-June

JO - ASEE Annual Conference and Exposition, Conference Proceedings

T2 - ASEE Annual Conference and Exposition, Conference Proceedings

JF - ASEE Annual Conference and Exposition, Conference Proceedings

SN - 2153-5965

ER -