ACM SIGCSE

In the field of Computer Science Education, which has become my main professional focus since I was hired as an Assistant Professor of Teaching a few years ago, the SIGCSE conference is one of the best international venues for publications (rank A).

SIGCSE is, first and foremost, a special interest group (SIG) focused on computer science education (CSE) within the ACM (Association for Computing Machinery), one of the two main professional organizations in CS (the other being IEEE). Every year since 1970, SIGCSE has organized a large Technical Symposium of the same name, whose attendance now regularly exceeds 1,500.

I went to SIGCSE for the first time in 2019, when it was held in Minneapolis, MN. Last year, I flew to Portland, OR, only to discover upon arrival that the conference had been cancelled (it was the beginning of the restrictions linked to the emerging pandemic). This year, SIGCSE was held entirely online, so I was able to watch presentations from the (relative) comfort of my office on campus!

Talk reviews

I unfortunately did not have time to watch every single presentation (there are a lot of them!), but I did spend an hour or so every day last week watching presentations that looked interesting and taking notes, which you can find below.

If you are interested in learning more about a particular presentation, note that the entire proceedings of the conference can be found here: https://dl.acm.org/doi/proceedings/10.1145/3408877.

[Opening keynote] “Expanding Opportunities through Research for Societal Impacts”

By Juan E. Gilbert (University of Florida)

Great keynote by Prof. Gilbert. He first tackled the issue of recruiting more diverse PhD students. He showed statistics to highlight that, contrary to what many say, many minority students graduate with a Bachelor's degree every year, and we should do more to recruit them into PhD programs. In the second part of his talk, he presented various interesting projects he and his students have worked on over the past two decades: better voting machines, an application for increasing the safety of police stops, etc.

[Paper] “How Do Students Collaborate? Analyzing Group Choice in a Collaborative Learning Environment”

By Xinyue Lin, James Connors, Chang Lim, and John R. Hott (University of Virginia)

Super interesting presentation studying collaboration patterns between students. In a data structures and algorithms class (like ECS 36C), the instructors had an open collaboration policy: students were allowed to collaborate on certain assignments. Although each student still had to submit their own, original work, they could collaborate with up to 4 other students. Collaborations had to be duly reported, along with their direction: did you provide help to another student, receive help from another student, or both? Based on the collected data, the authors were able to represent the group structures as connected graphs. This was probably my favorite part of the talk. They found some correlation between group work and performance: students working in groups performed about 8% better overall. They also found that students tend to naturally favor group work when assignments are hard.
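To give an idea of the kind of analysis involved, here is a minimal sketch (my own illustration, not the authors' code) of how self-reported collaborations could be turned into a graph whose connected components are the groups. The data format and the use of networkx are assumptions.

    # Sketch: build a collaboration graph from self-reported help and extract
    # the groups as connected components (illustration only, not the paper's code).
    import networkx as nx

    # Hypothetical reports: (helper, helped) pairs collected with each submission.
    reports = [
        ("alice", "bob"),
        ("bob", "alice"),    # mutual help
        ("carol", "dave"),
        ("erin", "dave"),
    ]

    G = nx.DiGraph()
    G.add_edges_from(reports)  # edge direction = "provided help to"

    # Groups are the weakly connected components of the directed graph.
    groups = list(nx.weakly_connected_components(G))
    print(groups)  # e.g. [{'alice', 'bob'}, {'carol', 'dave', 'erin'}]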

[Paper] “Where is Computer Science Education Research Happening?”

By Stephanie Lunn (Florida International University), Maíra Marques Samary (Boston College), and Alan Peterfreund (SageFox Consulting Group)

In this study, the authors looked at papers published in three different venues for CS education research: ITiCSE, ICER, and TOCE. I was surprised they didn't consider SIGCSE as well. They found that most of the papers came from the US and Europe, and hypothesized that it could be because of the language used in these venues (English).

[Paper] “Gender and Engagement in CS Courses on Piazza”

By Adrian Thinnyun, Ryan Lenfant, Raymond Pettit, and John R. Hott (University of Virginia)

As you may know, studying the gender gap in CS courses is also one of my research topics, so I was particularly interested in this talk.

In this research, the authors looked at Piazza statistics from various undergraduate-level classes. They measured the engagement of male vs. female students and found that female students were more engaged: on average, they asked more questions on the forum, stayed active on Piazza longer throughout the class, and had about the same ratio of questions to answers. Next, they measured the use of Piazza's anonymity feature. Here they found that female students more often asked questions anonymously, and answered questions anonymously as well. Finally, they wanted to measure the influence of peer parity (whether the first question a student posts on the forum gets answered by someone of the same gender). They found much lower peer parity for female students (12%) than for male students (65%), but did not find that it had any measurable influence.
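As a rough illustration of how a metric like peer parity could be computed, here is a small sketch of my own (the data format is hypothetical and this is not the authors' methodology): for each student's first question, check whether the first answer came from someone of the same gender.

    # Sketch: peer parity rate per gender, i.e. the fraction of first questions
    # whose first answer came from a student of the same gender (hypothetical data).
    from collections import defaultdict

    # Each record: (asker_gender, first_answerer_gender)
    first_questions = [
        ("F", "M"), ("F", "F"), ("M", "M"),
        ("M", "M"), ("F", "M"), ("M", "F"),
    ]

    counts = defaultdict(lambda: {"same": 0, "total": 0})
    for asker, answerer in first_questions:
        counts[asker]["total"] += 1
        if asker == answerer:
            counts[asker]["same"] += 1

    for gender, c in counts.items():
        print(gender, c["same"] / c["total"])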

[Keynote] “Automated Feedback, the Next Generation: Designing Learning Experiences”

By Stephen H. Edwards (Virginia Tech)

This year, Stephen Edwards received the SIGCSE award for his contributions to the field of CSEd. It isn't surprising considering the number of influential papers he has contributed over the past 20 years. I remember that at SIGCSE '19 it felt like every other paper had his name on it (I'm obviously exaggerating, but he had a lot of papers!).

Edwards has built much of his career on designing autograders (he is famous for WebCAT), and his talk focused on this topic. He talked about designing autograders that generate constructive feedback rather than simple pass/fail test results. Very interesting ideas that I'm excited to explore in my own courses!
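To make the distinction concrete, here is a toy sketch of my own (not Edwards' WebCAT implementation) contrasting a bare pass/fail check with a check that returns a hint, for a hypothetical student-submitted function median(xs):

    # Toy sketch: pass/fail grading vs. constructive feedback for a hypothetical
    # student-submitted function `student_median` (illustration only).

    def grade_pass_fail(student_median):
        # Tells the student only whether they passed the test case.
        return student_median([3, 1, 2]) == 2

    def grade_with_feedback(student_median):
        # Tries to explain *why* the test case failed.
        result = student_median([3, 1, 2])
        if result == 2:
            return "Correct!"
        if result == 1:
            return ("Your function returns the middle element of the *unsorted* "
                    "list. Did you forget to sort the input first?")
        return (f"Expected 2 for input [3, 1, 2], but got {result}. "
                "Check how you handle odd-length lists.")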

[Paper] “Investigating the Impact of the COVID-19 Pandemic on Computing Students’ Sense of Belonging”

By Catherine Mooney and Brett A. Becker (University College Dublin)

For the past few years, the authors have been regularly giving a survey to their undergraduate CS population. This survey contains about 20 questions about the demographics of the students (gender identity, minority status) and their sense of belonging, using a set of both positively and negatively phrased questions (“I feel like I belong” vs. “I feel excluded”). The data was presented for four categories: non-minority males, minority males, non-minority females, and minority females.
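For reference, here is a small sketch of how a belonging score is typically computed when a survey mixes positively and negatively phrased items: the negatively phrased answers are reverse-coded before averaging. The items and the 5-point scale below are assumptions of mine, not the authors' actual instrument.

    # Sketch: combine positively and negatively phrased Likert items into one
    # belonging score (assumed 1-5 scale; not the authors' actual instrument).
    POSITIVE_ITEMS = ["I feel like I belong", "I feel welcome in my program"]
    NEGATIVE_ITEMS = ["I feel excluded", "I feel like an outsider"]

    def belonging_score(responses):
        """responses: dict mapping item text -> answer in 1..5."""
        scores = [responses[item] for item in POSITIVE_ITEMS]
        # Reverse-code negative items: 1 <-> 5, 2 <-> 4, 3 stays 3.
        scores += [6 - responses[item] for item in NEGATIVE_ITEMS]
        return sum(scores) / len(scores)

    print(belonging_score({
        "I feel like I belong": 4, "I feel welcome in my program": 5,
        "I feel excluded": 2, "I feel like an outsider": 1,
    }))  # -> 4.5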

The pre-COVID results are pretty much what anyone would expect: non-minority students (male and female) have the highest sense of belonging, whereas minority students (male and female) experience a much lower sense of belonging. What is already fascinating in these results, though, is that female students did not classify themselves consistently: many of them classified themselves as a minority solely because they were female. In other words, if a female student sees herself as a non-minority, she experiences a high sense of belonging (same as a non-minority male); but if a female student sees herself as a minority, because she is female, then she is likely to experience a poor sense of belonging.

The second fascinating observation of this work is the post-COVID results: the sense of belonging of non-minority students has plummeted, while it has increased for minority students!

Among the possible explanations for this shift are:

  • The sense of belonging is fostered by social interactions. In pandemic times, fewer social interactions may mean fewer occasions for minority students to feel like they don't belong.
  • Online learning is inherently more inclusive than in-person learning.

[Paper] “Finding Video-watching Behavior Patterns in a Flipped CS1 Course”

By Colin Moore, Lina Battestilli, and Ignacio X. Domínguez (North Carolina State University)

In this study, the authors looked at the video-watching statistics they gathered for their non-major CS1 class and tried to see whether they correlated with how students performed in the class.

Their non-major CS1 class (the equivalent of our ECS 32A) is a flipped course; they have a weekly cycle where students start by watching videos on their own (Sunday/Monday), then meet up in class with their instructor (Tuesday), and the rest of the week is filled with labs and homework until the cycle starts again the week after.

They measured three aspects of video watching: punctuality (did students watch the videos early, late, or not at all?), coverage (how much of each video did students watch?), and watch count (did students watch videos more than once?).

The punctuality aspect already shows a big divide in the class. A first cluster of students (C1) overwhelmingly watches the videos, even early, while the second cluster (C2) hardly watches any videos at all. The same divide exists for the other aspects: cluster C1 watches 75% of the videos, with an average watch count of 1.14, while cluster C2 watches 12% of the videos, with an average watch count of 0.23.

Interestingly enough, many students in cluster C2 are in fact not watching the videos because they have prior programming experience. However, as one might expect, there is a correlation between video watching and class performance: students in cluster C1 are, on average, more likely to get an A, while students in cluster C2 are, on average, more likely to get a D or an F.
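Clustering students on these three features is easy to reproduce in spirit. Here is a minimal sketch of my own (the feature values are made up, and the paper does not necessarily use k-means or scikit-learn):

    # Sketch: cluster students by punctuality, coverage, and watch count
    # (made-up feature values; the clustering method is an assumption).
    import numpy as np
    from sklearn.cluster import KMeans

    # One row per student: [fraction watched on time, coverage, avg watch count]
    features = np.array([
        [0.9, 0.80, 1.2],
        [0.8, 0.70, 1.1],
        [0.1, 0.15, 0.3],
        [0.0, 0.10, 0.2],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    print(labels)  # e.g. [0 0 1 1]: a high-engagement and a low-engagement cluster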

[Paper] “Code in Place: Online Section Leading for Scalable Human-Centered Learning”

By Chris Piech, Ali Malik, Kylie Jue, and Mehran Sahami (Stanford University)

Great work from Stanford, where the authors organized a free online CS1 class during the pandemic in Spring 2020. The class covers half of the CS1 course offered at Stanford. It drew 10,000 students with great gender diversity (50% female, 47% male, 3% non-binary).

The talk was mostly about the selection and training process for the section leaders. What's great about this work is that their online class was not your typical MOOC (“watch lecture videos and submit work completely by yourself”). Instead, they wanted students to have an hour of live discussion, learning in small groups (~10 students) with one section leader. To operate at such a scale, they had to recruit and train over 900 volunteer section leaders!

As a result, their engagement numbers were amazing: 99.7% of the section leaders who signed up stayed for the whole class, while 56% of students completed it (apparently, for typical MOOCs, the completion rate is more like ~5%…).

[Paper] “Investigating Item Bias in a CS1 Exam with Differential Item Functioning”

By Matt J. Davidson, Brett Wortzman, Amy J. Ko, and Min Li (University of Washington)

This work studies fairness and bias in CS exams. The authors used differential item functioning (DIF) methods to detect bias in typical CS1 exams. I have to admit that, although the talk was interesting and definitely made me want to assess my own exams, I did not understand much of how to apply DIF. I'll need to investigate this further sometime.

A long blog post showing how to perform DIF analysis using R was also published recently; I'll need to find the time to read it carefully.
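In the meantime, one common way to screen for DIF is the Mantel-Haenszel procedure: stratify students by total exam score, then compare the odds of answering a given item correctly between two groups within each stratum. Here is a bare-bones sketch of that idea (my own, with made-up counts; the paper's analysis may use a different DIF method):

    # Sketch: Mantel-Haenszel common odds ratio for one exam item (made-up counts).
    # For each ability stratum k we have a 2x2 table:
    #   rows = group (reference / focal), columns = item correct / incorrect.
    # OR_MH = sum(a_k * d_k / n_k) / sum(b_k * c_k / n_k)
    # A value far from 1 suggests the item functions differently across groups.

    strata = [
        # (ref_correct, ref_incorrect, focal_correct, focal_incorrect)
        (30, 10, 25, 15),   # low total-score stratum
        (40, 5, 35, 10),    # high total-score stratum
    ]

    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        num += a * d / n
        den += b * c / n

    print("MH common odds ratio:", num / den)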

[Specials]

There were two special sessions that I thought were interesting as well:

  • EngageCSEdu: A Collection of Engaging Assignments
    • This is a website containing class materials, mostly for the typical CS intro series. It seems like a great place to find inspiration when writing new homework assignments.
    • https://www.engage-csedu.org
  • Nifty assignments
    • Usually, one of the most popular sessions at SIGCSE. Instructors present some of their “niftiest” assignments.
    • I liked this year’s linked list labyrinth, but wished they had used GDB!
    • http://nifty.stanford.edu/

[Keynote] “Increasing Diversity in Computing Education: Lessons Learned”

By Valerie E. Taylor (CEO & President of CMD-IT)

Great and inspiring talk by Valerie Taylor, who is the director of the Mathematics and Computer Science Division at Argonne National Laboratory in Illinois, as well as the CEO and President of the Center for Minorities and People with Disabilities in IT (CMD-IT).

In order to maximize the impact of efforts to promote diversity in CS, she identified the 11 academic institutions that form the union of the top 10 producers of faculty and the top 10 ranked institutions, and gathered them into the LEAP Alliance. They are now working on developing a common agenda: for example, increasing the number of students from target groups in PhD programs, in order to increase the diversity among PhD graduates.

She highlighted the importance of having local advocates, and of building a strong community between these advocates and the academic fellows. She also talked about academic career workshops designed to demystify the academic career ladder. These workshops gather participants and panelists from specific target groups and, for example, teach how to write grant proposals.

Finally, she talked about the first Tapia conference, a conference designed to promote diversity and to connect undergraduate and graduate students, faculty, researchers, and professionals in computing from all backgrounds and ethnicities.