The Computer Science (CS) department welcomes prospective students interested in our graduate programs to our annual Prospective Student Visit Day on Friday, Nov. 8. We are looking for strong students with diverse backgrounds to join our MCS and PhD programs. Our world-class faculty represent a wide variety of research areas, including algorithms, computational epidemiology, distributed computing, human-computer interaction, machine learning, massive data algorithms and technology, mobile computing, networks, programming languages, text mining, security, and virtual reality. Write to our Graduate Program Administrator at cs-info@list.uiowa.edu if you'd like to visit.
An important part of the Prospective Student Visit Day is the 10th Iowa Computer Science Graduate Research Symposium (2024), in which senior PhD students will present talks showcasing their current research. The student talks will be followed by a keynote by a CS faculty member. The talks are intended for a wide audience with an interest in CS, including CS juniors and seniors, and they are excellent examples of the exciting CS research taking place here on the UI campus!
Schedule
All times are Central Time (CT).
Friday, Nov. 8, 2024

Morning Session (By invitation)

Morning: Overview of Graduate Programs by Professor and Director of Graduate Studies Steve Goddard; Short Faculty Research Presentations by UIowa CS Professors Lucas Silva, Tianyu Zhang, Sourya Roy, and Rahul Singh

Graduate Student Research Symposium Sessions

1:30-3:30 p.m.: Student Sessions
1:30 p.m.: Alex Hubers, PhD Student, "Abstracting Duality With Row Types"
2 p.m.: Hongyan Ji, PhD Student, "Enhancing Distributed Algorithms with Local Knowledge"
2:30 p.m.: Yafan Huang, PhD Student, "A Fast Error-Bounded Lossy Compressor for Distributed Systems"
3 p.m.: Jamil Gafur, PhD Student, "The Pruning Playbook: Secrets to Leaner, Meaner Neural Networks!"
3:30-4 p.m.: Break
4-5 p.m.: Keynote by Professor Rishab Nithyanand, "The Quest for Transparency and Accountability in the Online Data Ecosystem"
Speakers
Alex Hubers
Title: Abstracting Duality With Row Types
Abstract:
Records and variants pervade programming languages. Each is ubiquitous in orthogonal paradigms of programming: records give a foundational model to objects (à la object-oriented programming) and variants to algebraic data types (à la functional programming). This orthogonality can more abstractly be viewed as the order-theoretic notion of duality. Duality observes that theorems, constructions, and proofs in one ordered theory yield equivalent (but dual) theorems, constructions, and proofs in dual theories. Row types describe one approach to witnessing this duality. This talk will show how this process of reflecting and reifying the abstraction of duality can lead to powerful forms of meta- and generic-programming over records and variants and, by consequence, powerful connections between two seemingly orthogonal paradigms of programming.
PhD student | Advisors: Garrett Morris; Aaron Stump (now at BC) | Areas of research: Functional Programming Languages, Semantics, and Type Theory
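For readers who would like a concrete (if very simplified) feel for the duality the abstract mentions, here is a minimal Python sketch; it is illustrative only, not the row-typed calculus the talk develops, and the Point, Circle, and Square names are hypothetical. Building a record requires supplying every field, while building a variant requires choosing exactly one alternative; dually, consuming a record projects out one field, while consuming a variant must handle every alternative.

```python
from dataclasses import dataclass
from typing import Union

# A record (product type): a value must carry *every* field.
@dataclass
class Point:
    x: float
    y: float

# A variant (sum type): a value is exactly *one* of the alternatives.
@dataclass
class Circle:
    radius: float

@dataclass
class Square:
    side: float

Shape = Union[Circle, Square]

# Eliminating a record: project out a single field (choose one of all).
def get_x(p: Point) -> float:
    return p.x

# Eliminating a variant: case analysis must cover *every* alternative.
def area(s: Shape) -> float:
    if isinstance(s, Circle):
        return 3.14159 * s.radius ** 2
    else:
        return s.side ** 2

print(get_x(Point(1.0, 2.0)))    # 1.0
print(area(Circle(radius=2.0)))  # ~12.57
```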
Hongyan Ji
Title: Enhancing Distributed Algorithms with Local Knowledge
Abstract:
In distributed computing, the initial knowledge each processor has about its surroundings can greatly influence algorithmic efficiency, particularly the message complexity (the amount of information exchanged between processors). This talk focuses on message complexity within the KTρ Congest model, a variant of the classical Congest model in which each node starts with knowledge of its neighborhood within radius ρ. In the KT0 Congest model (no initial neighborhood knowledge), problems like Minimum Spanning Tree (MST) can require Ω(m) messages. In the KT1 Congest model (where nodes know their immediate neighbors' IDs), this message complexity is reduced to (roughly) O(n), but the reduction comes at the expense of rounds (time). Building on these insights, we demonstrate that with slightly more initial local knowledge, specifically in the KT2 Congest model, we can solve problems such as MST using relatively few messages while incurring a relatively small penalty in rounds. Overall, this talk will highlight the power of local knowledge in distributed settings.
4th-year PhD student | Advisor: Sriram Pemmaraju | Areas of research: Distributed Graph Algorithms
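As a rough illustration of the model the abstract describes (not of the talk's algorithms), the sketch below computes the radius-ρ neighborhood that each node is assumed to know initially in the KTρ Congest model; the toy path graph is hypothetical, and ρ = 0, 1, 2 correspond to KT0, KT1, and KT2.

```python
from collections import deque

def initial_knowledge(adj, v, rho):
    """Nodes within distance rho of v: the neighborhood a node is assumed
    to know at the start of a KT_rho Congest computation."""
    seen = {v}
    frontier = deque([(v, 0)])
    while frontier:
        u, dist = frontier.popleft()
        if dist == rho:
            continue
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                frontier.append((w, dist + 1))
    return seen

# Hypothetical toy graph: the path 0 - 1 - 2 - 3 - 4
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
for rho in (0, 1, 2):  # KT0, KT1, KT2
    print(f"KT{rho}: node 2 initially knows {sorted(initial_knowledge(adj, 2, rho))}")
# KT0: [2]   KT1: [1, 2, 3]   KT2: [0, 1, 2, 3, 4]
```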
Yafan Huang
Title: A Fast Error-Bounded Lossy Compressor for Distributed Systems
Abstract:
Today's high-performance computing (HPC) simulations and large language model (LLM) training increasingly depend on distributed systems, where large numbers of nodes equipped with high-end CPUs and GPUs frequently exchange enormous data volumes. This data movement creates bottlenecks in many real-world applications and significantly degrades runtime performance. To address this challenge, we introduce cuSZp, a fast lossy compressor optimized for inline compression tasks that require high-speed processing. On the one hand, cuSZp features a single-kernel design that operates entirely on the GPU, ensuring minimal compression and decompression overhead. On the other hand, cuSZp achieves substantial data size reduction within user-defined error thresholds, preserving data quality well.
4th-year PhD student | Advisor: Guanpeng Li | Areas of research: Error Resilience, Lossy Compression, and Parallel Computing
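To make the "error-bounded" guarantee concrete, here is a minimal NumPy sketch of the kind of uniform quantization step that error-bounded lossy compressors build on; it is illustrative only and is not cuSZp's actual GPU implementation or API. Every reconstructed value lies within the user-chosen absolute error bound of the original.

```python
import numpy as np

def quantize(data: np.ndarray, err_bound: float) -> np.ndarray:
    """Map each value to the index of a bin of width 2*err_bound."""
    return np.round(data / (2.0 * err_bound)).astype(np.int64)

def dequantize(codes: np.ndarray, err_bound: float) -> np.ndarray:
    """Reconstruct an approximation from the bin indices."""
    return codes * (2.0 * err_bound)

data = np.random.rand(1_000_000)        # stand-in for simulation or model data
eps = 1e-3                              # user-defined absolute error bound
codes = quantize(data, eps)             # small-range integers; a real compressor
recon = dequantize(codes, eps)          # would then predict/entropy-code them
print("max abs error:", np.abs(data - recon).max())  # stays at or below eps
```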
Jamil Gafur
Title: The Pruning Playbook: Secrets to Leaner, Meaner Neural Networks!
Abstract:
We present preliminary findings on optimizing neural networks using pruning techniques, focusing in particular on the Lottery Ticket Hypothesis. Our study applies Iterative Magnitude Pruning (IMP) to three established architectures (ResNet, LeNet, and VGG16) on the CIFAR-10 and MNIST datasets. We analyze redundant weights and neurons through cosine similarity, which offers insights for enhancing model efficiency. Initial results indicate that while IMP successfully removes less significant weights, challenges remain in fully addressing neuron redundancy. By incorporating the principles of the Lottery Ticket Hypothesis, we aim to establish a foundation for creating more efficient and compact neural networks in future research.
4th-year PhD student | Advisor: Steve Goddard | Areas of research: Resource Efficient Neural Networks, Explainable AI, and Adversarial Machine Learning
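For context, the sketch below shows the bare shape of iterative magnitude pruning on a single NumPy weight matrix; it is a toy stand-in for the study's actual setup (ResNet/LeNet/VGG16 on CIFAR-10 and MNIST), the rewind-and-retrain step of the Lottery Ticket procedure is only indicated in a comment, and all names are hypothetical.

```python
import numpy as np

def imp_step(weights, mask, prune_frac):
    """One round of iterative magnitude pruning: among still-active weights,
    zero out the prune_frac fraction with the smallest magnitudes."""
    active = np.flatnonzero(mask)                      # positions not yet pruned
    k = int(len(active) * prune_frac)                  # how many to prune this round
    smallest = active[np.argsort(np.abs(weights.flat[active]))[:k]]
    new_mask = mask.copy()
    new_mask.flat[smallest] = 0.0
    return weights * new_mask, new_mask

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))        # toy weight matrix
mask = np.ones_like(W)
for _ in range(5):                   # five IMP rounds at 20% each
    W, mask = imp_step(W, mask, prune_frac=0.2)
    # In the Lottery Ticket procedure, surviving weights would be rewound to
    # their original initialization and the network retrained at this point.
print("sparsity:", 1.0 - mask.mean())  # about 1 - 0.8**5, i.e. roughly 0.67
```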
Keynote Speaker: Rishab Nithyanand
Title: The Quest for Transparency and Accountability in the Online Data Ecosystem
Abstract:
Algorithmic and data-driven systems on social media platforms and within ad-tech wield immense power, shaping societal norms, perceptions of reality, and the flow of revenue online. Yet, their operations remain obscured from public scrutiny, leaving debates around their harms and regulation lacking nuance and depth. In this talk, I will describe my research journey aimed at injecting transparency and accountability into these opaque systems. I will discuss how we mitigated the unintended amplification of harmful behavior by platform governance mechanisms, uncovered hidden data-sharing relationships between online entities, and curtailed the funding of misinformation. Additionally, I will present our ongoing efforts to diagnose the “pathologies” of content recommendation systems used by major platforms. Finally, I will outline actionable strategies for technologists and regulators to mitigate the harms posed by algorithmic systems, ultimately paving the way for a healthier digital society.
Bio: Rishab Nithyanand is an Assistant Professor and Emeriti-Faculty Scholar in the Department of Computer Science at the University of Iowa where he leads the SPARTA lab. He also holds a courtesy appointment in the College of Law and co-directs the Center for Publics, Platforms, and Personalization (CP3). Professor Nithyanand’s research is broadly in the areas of privacy and social computing. His research aims to improve transparency and accountability in the online data ecosystem.