Posts tagged danah boyd

Data, Fairness, Algorithms, Consequences

Medium, danah boyd, data, privacy, algorithms, bias, discrimination, transparency, responsibility

When we open up data, are we empowering people to come together? Or to come apart? Who defines the values that we should be working towards? Who checks to make sure that our data projects are moving us towards those values? If we aren’t clear about what we want and the trade-offs that are involved, simply opening up data can — and often does — reify existing inequities and structural problems in society. Is that really what we’re aiming to do?

via https://points.datasociety.net/toward-accountability-6096e38878f0

When Good Intentions Backfire

Medium, danah boyd, intent, purpose, good vs evil, malice, hacker mindset

I find it frustrating to bear witness to good intentions getting manipulated, but it’s even harder to watch how those who are wedded to good intentions are often unwilling to acknowledge this, let alone start imagining how to develop the appropriate antibodies. […] I have learned that people who view themselves through the lens of good intentions cannot imagine that they could be a pawn in someone else’s game. They cannot imagine that the values and frames that they’ve dedicated their lives towards — free speech, media literacy, truth — could be manipulated or repurposed by others in ways that undermine their good intentions.

via https://points.datasociety.net/when-good-intentions-backfire-786fb0dead03

Transparency ≠ Accountability

Medium, data, society, danah boyd, transparency, accountability, algorithms

In the next ten years we will see data-driven technologies reconfigure systems in many different sectors, from autonomous vehicles to personalized learning, predictive policing to precision medicine. While the changes that we will see will create new opportunities, they will also create new challenges — and new worries — and it behooves us to start grappling with these issues now so that we can build healthy sociotechnical systems.

via https://points.datasociety.net/transparency-accountability-3c04e4804504

Media: End Reporting on Polls

Medium, data, reporting, politics, polls, civic engagement, data and society, danah boyd

We now know that the polls were wrong. Over the last few months, I’ve told numerous reporters and people in the media industry this, but I was generally ignored and dismissed. I wasn’t alone — two computer scientists whom I deeply respect — Jenn Wortman Vaughan and Hanna Wallach — were trying to get an op-ed on prediction and uncertainty into major newspapers, but were repeatedly told that the data was solid. It was not. And it will be increasingly problematic.

via https://points.datasociety.net/media-end-reporting-on-polls-c9b5df705b7f

There was a bomb on my block.

Medium, danah boyd, terrorism, media, fear mongering, fear

After hearing the bomb go off on 23rd and getting flooded with texts on Saturday night, I decided to send a few notes that I was OK and turn off my phone. My partner is Israeli. We’ve been there for two wars and he’s been there through countless bombs. We both knew that getting riled up was of no help to anyone. So we went to sleep. I woke up on Sunday, opened my blinds, and was surprised to see an obscene number of men in black with identical body types, identical haircuts, and identical cars. It looked like the weirdest casting call I’ve ever seen. And no one else. No cars, no people. As always, Twitter had an explanation so we settled into our PJs and realized it was going to be a strange day.

via https://medium.com/@zephoria/there-was-a-bomb-on-my-block-6045e597ac2f

What does the Facebook experiment teach us?

danah boyd, facebook, research, ethics, IRB, peer review, psychology, sentiment manipulation, algorithms

For better or worse, people imagine Facebook is run by a benevolent dictator, that the site is there to enable people to better connect with others. In some senses, this is true. But Facebook is also a company […] it designs its algorithms not just to market to you directly but to convince you to keep coming back over and over again. People have an abstract notion of how that operates, but they don’t really know, or even want to know. They just want the hot dog to taste good. Whether it’s couched as research or operations, people don’t want to think they’re being manipulated. So when they find out what soylent green is made of, they’re outraged. This study isn’t really what’s at stake. What’s at stake is the underlying dynamic of how Facebook runs its business, operates its system, and makes decisions that have nothing to do with how its users want Facebook to operate. It’s not about research. It’s a question of power.

via https://medium.com/message/what-does-the-facebook-experiment-teach-us-c858c08e287f