
Meet Madiha Zahrah Choksi



Interviewed by Shana Creaney (Cornell Tech)

Can we get a brief introduction about yourself and what you’re studying?


My name is Madiha. I am a third-year PhD student in Information Science. I work with Helen Nissenbaum and James Grimmelmann on topics related to information governance online, particularly how disparate groups online govern their information flows. So, I'll look at issues of privacy or various kinds of governance issues that may arise for both platform-based groups, like those on Reddit, Facebook, or Discord, and open communities, for example developers on GitHub or Wikipedia users. But primarily I'm looking at how they restrict or enable information flows and the issues that come about therein.


How did you become interested in that?


Oh my gosh. I don't even know where to begin. I come from a pretty interdisciplinary background. I became interested in freedom of information and access to information issues after I did a Master's in information science and started working within the Freedom of Information Office of the Federal Government of Canada.


I got a close look at the backend, the government side, and the breadth of information that is kept closed off from the public. I learned a lot about privacy issues that arise with releasing certain information, and how government officials are trained to deal with a wide scope of information requests. All of this just sparked something in my head. Like, where is sharing more information fair, relevant, or important, and where is restricting information important? I definitely didn't agree with how requests relevant to particular issues, especially contentious issues, were treated.


That being said, it was really interesting to be applying federal legislation and legal tools to these very serious documents that were requested by journalists or the general public. Commonly these requests encompass government spending, taxation information about government officials, or various kinds of environmental issues that come up in certain cities in Canada. For example, if there was a wildfire, various organizations or news outlets would look for the protected documents that the government had access to in order to uncover pollution harms that weren't widely disseminated.


So, that kind of led me to then do some of my own research and work. I did a Master's at Columbia, where I worked with a lawyer on my thesis, and I looked at surveillance and privacy in the context of policing, government, and the nature of public-private partnerships.


Why do you think that’s worth studying? Especially thinking about it from a layman's understanding.


So many reasons. I think that a lot of the time we do this work because we have certain expertise and are naturally curious. We think there are interesting research questions, or we investigate something that's timely. We often want to understand novel technologies, or some effect they have on society. But I think the work that I do is mostly motivated by practical implications, and in my work I am trying to blend contextual integrity and legal theory to address issues of governance. For example, how do students navigate or govern their privacy relative to EdTech platforms? How do students react or protest? If you think of something like exam proctoring tools, those research questions are really motivated by the fact that I was noticing how students were turning to Reddit or Discord (very decentralized platforms) to band together and create some kind of grassroots activist movement against their institutions. And that's super interesting to me. Why is that happening? What can we learn from how the students are responding, to maybe effect some change in the arcane bureaucratic system that decides what EdTech platforms we use?


This led to another research question about privacy for social groups online more broadly. How do groups on platforms express their privacy norms? What tools do platforms afford to let their communities decide? Where are the gaps? I guess to sum up, the short answer is that my line of work is meant to be practical, and that's where I find my motivation.


In an ideal world, where does this research lead?


In an ideal world I think this research leads to some… I think every PhD student will say something like "more regulation" or "more policy" and, for sure, I think we all have that goal. But I think: a better-informed audience.


I think that there’s a lot of assumptions being made about how systems work. The technology we study can be quite mystical… The layperson doesn’t always know what we mean when we discuss open or closed source tools, models, or generative AI. I want my work to clarify the issues. 


We can look at my current work on the open source ecosystem. Communities of developers are expressing very strong sentiments about tools such as ChatGPT or Copilot, generative AI tools that are totally getting in the way of long-standing developer norms: for example, how people learn to code, how they write code, and how they share and publish code. LLM-based tools that aim to automate these processes (and for some tasks, that makes sense) do not naturally contribute back to the broader open source ecosystem. How can we hear and remedy developers' concerns? What are the practical takeaways here?


This is the moment of change. So, if we can start to theorize about how we see this change manifesting, how it will affect things, and what the downstream harms are, then we can better articulate what's changing and clarify these things. Then we'll have a better way to address them moving forward through something like policy or regulation. But even just best practices. And maybe it's how we teach, and CS education changes a little bit.


How did you become part of DLI?


This is a good story. I was working at Columbia after my Master's for a while and thinking very slowly and deliberately about PhD programs. I knew about Cornell Tech. I was definitely well read in both Helen and James' work, particularly contextual integrity, and this was the place I wanted to do my PhD. To me, it was the only option. I would regularly attend the DLI Seminar on Zoom, well before I even applied to the PhD program. I would read everything coming out of DLI. I was a big fan and felt strongly that this was the intellectual community for me, where my work would thrive. I applied and was accepted. Considering other schools wasn't even an option. So, here I am and it's been go, go, go ever since.


DLI must have just come across my radar via a newsletter when it first started years ago. I also regularly read the Cornell Tech website, embarrassing.


What has the DLI experience been like for you?


Oh, extremely, extremely generative. I think that DLI is, like I said, this interdisciplinary group of brilliant scholars. We have postdocs, DLI doctoral fellows (I was one last year), and PhD students, and everyone comes together from a different disciplinary home. Some of them overlap: we have a bunch of lawyers, for example, or we have a strong group of HCI folks and privacy scholars and machine learning experts, but everyone comes together. Between the reading group and the seminar, we are exposed to so much incredible research, so many unique and insightful perspectives. It also helps that DLI is such an open and inviting space and we feel free to ask questions.


The experience has been very critical to my development as a scholar. I regularly recommend the DLI to anyone here in Ithaca; when the fellowships open up, I'm always sharing them out. I've made some of my closest friends here in the DLI. It's a beautiful community.


What’s next for you?


So, I have some conferences coming up. I published my first ACM conference paper, on NextDoor, at CHI, as well as another two at FAccT. There is another forthcoming with Helen at CSCW in November. So, conferences and presenting my work. At the same time, I am preparing for my qualifying exams so I can start working on my dissertation. So, that's what's next!


Madiha Zahrah Choksi

Digital Life Initiative

Cornell Tech





