If you haven’t watched the “HP computers are racist” video on YouTube yet, go ahead and do it now.
The video is funny, but it’s not an isolated incident. Maybe you saw this, too: “Racist Camera! No, I did not blink… I’m just Asian!”
At People’s Production House, one of the things we talk about when teaching media literacy – and why we’re working to include lessons on hardware and infrastructure – is that telecommunications and digital media companies make assumptions about their customers when developing their technology. Our devices and networks are no more value-neutral than any of our content. The video and photo above are good lessons in the kinds of assumptions companies sometimes make and what the results can be.
For PPH, we’re looking specifically at the bundles of cell phone devices, software and applications, service, and plans (minute, texting, and data packages). In that case, the assumptions don’t just determine a minor feature on your camera; they can shape how you communicate with your family and friends, even if it’s sometimes hard to see exactly how.
There’s another place where this same phenomenon is at play, even if it’s less apparent: search. If you base your ranking algorithm on the links that already exist on the Internet, you are designing your search engine to work for the people who got to the Internet early. The same image comes to mind here as with the HP webcam: a bunch of white guys in a lab saying, “It works for me.”
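To make that concrete, here is a minimal sketch of link-based ranking in the PageRank style, which is roughly how early search engines weighted pages. The graph, page names, and damping value below are all illustrative, not any real engine's data: the point is just that pages that accumulated inbound links early keep reinforcing each other, while a newcomer nobody links to stays near the floor.

```python
# PageRank-style sketch: a page's rank is fed by the ranks of the
# pages that link to it. Toy data; not any real search engine.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Assumes every page has at least one outgoing link (no dangling nodes)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the rank flowing in from every page that links to p,
            # split evenly across that page's outgoing links.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Three early sites link to each other; the newcomer links out
# but receives no links back.
graph = {
    "early_a": ["early_b", "early_c"],
    "early_b": ["early_a", "early_c"],
    "early_c": ["early_a"],
    "newcomer": ["early_a"],
}
ranks = pagerank(graph)
```

Running this, the newcomer ends up with the minimum possible score (the `(1 - damping) / n` floor), no matter how good its content is. That is the “it works for me” problem expressed in math: the ranking rewards whoever was already in the link network.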
I remember women of color bloggers discussing the number of men who found their blogs through pornographic search queries and wound up leaving hateful comments. That should cause the same kind of “there’s something wrong here” moment that Wanda and Dezzie (sp?) had with the HP laptop and Joz had with her Nikon S630.
What’s really interesting is that the inclination is often to blame the technology. “Hewlett-Packard computers are racist” or “racist camera.” Yet I assume that in each of these instances, you could just as easily design the technology to be biased in another direction – if you wanted to.
Ultimately, this is why bundling practices like handset exclusivity, and the push to unbundle them, are a civil rights issue. If you have to take the bundle as the company has made it and you can’t modify it, then you’re stuck with technology developed for the company’s idea of who it wants its customers to be and what it wants them to do. If you have open standards, you can mix, match, and develop your own technology using the bits and pieces that work for you.
Certainly, not everyone has the necessary skills to do this, but at least the potential is there, and the pool of people who can do so is much larger than in a walled garden or under closed, proprietary standards. Still, developers of open source technology are perfectly capable of building racist bias into their technology, too, so to get the best outcomes we want more people to have these skills. True openness requires both transparency and participation.