Who is training the machines?

Latanya Sweeney is a pioneer in the field of data privacy. During her keynote address at the ninth annual Women in Cybersecurity conference, she emphasized that technology rules many aspects of our lives and therefore impacts policy. That is one of the reasons I found myself at the conference and in a graduate program studying cybersecurity and the ways it continues to impact society.

Many privacy laws were written when computers were not portable – and neither were phones.

Now technology has evolved and there are societal benefits. We have smaller and faster computers, smart devices for our homes, more efficient cars that may be smarter than the driver, and phones that are mini-computers. 

But there are also costs – bias in algorithms and ads, unequal access to broadband, and outdated privacy laws. There is a level of exclusivity in who directly benefits and who is put further at risk.

In 1998, the Children’s Online Privacy Protection Act (COPPA) was enacted to impose requirements on the information that websites and online services may collect from children under the age of 13.

Needless to say, a law created when I was a toddler stands no chance against the clever legal teams behind Big Tech. Take, for example, Facebook, which “beat” COPPA by only allowing individuals aged 13 or older to create an account on its platforms. This is loosely enforced by a self-disclosed birthday; there is no formal age verification process.
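To make concrete how little that gate does, here is a minimal sketch (entirely hypothetical, not Facebook’s actual code) of what a self-reported birthday check amounts to. The only “verification” is whatever date the visitor chooses to type.

```python
from datetime import date

def self_reported_age_gate(birthdate: date, minimum_age: int = 13) -> bool:
    """Return True if the self-reported birthday clears the gate.

    Nothing checks this date against an ID; the only "proof" is
    whatever the visitor typed into the signup form.
    """
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age >= minimum_age

this_year = date.today().year
# A ten-year-old entering their real birthday is blocked...
print(self_reported_age_gate(date(this_year - 10, 1, 1)))  # False
# ...and admitted the moment they shave a few years off the year field.
print(self_reported_age_gate(date(this_year - 20, 1, 1)))  # True
```

Two identical clicks, two different answers, and the only difference is a year the child made up.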

In order to remain relevant and useful, COPPA needs major updates. For one, the age threshold for protected children must be raised above 13. In a time when children are studying, playing, and ordering dinner online, we must continue to talk about data privacy.

Who benefits from technological innovation?

We must note the role that creating equitable spaces and practices plays in our physical and digital realities. The rapidly moving digital revolution is not a blank slate. In fact, it has become the opposite: a new venue for discriminatory and unjust practices.

Machine learning algorithms that do not detect brown skin did not appear out of thin air. They were trained by engineers who had never had to worry about brown skin.

A lack of staff diversity shows itself in product outcomes.

An example is Google’s new AI skincare tool, which can recognize up to 288 skin, hair, and nail conditions. Whose skin, hair, and nails? The images used to train the model primarily showed how those conditions present on lighter skin tones; less than 2% of the images in the original dataset were of darker skin tones. As a result, many patients may be led astray by the diagnoses this model provides. How can technology developments make healthcare more accessible IF they don’t see everyone?

Bias in training data + AI technology = adverse impact!
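To see that equation play out, here is a toy simulation (synthetic data and an off-the-shelf classifier, not Google’s actual model or dataset) in which one group makes up roughly 2% of the training set and the condition happens to present differently in that group. The model looks excellent in aggregate while failing the people it rarely saw.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_samples(n, group):
    """Synthetic patients: the condition presents differently by group."""
    x = rng.normal(size=n)
    # Group 0 (well represented): condition shows when the marker is high.
    # Group 1 (underrepresented): the same condition shows when it is low.
    y = (x > 0).astype(int) if group == 0 else (x < 0).astype(int)
    features = np.column_stack([x, np.full(n, group)])
    return features, y

# Training data mirrors the ~2% figure: group 1 is barely present.
X0, y0 = make_samples(9800, group=0)
X1, y1 = make_samples(200, group=1)
model = LogisticRegression().fit(np.vstack([X0, X1]), np.concatenate([y0, y1]))

# Evaluate each group on equally sized, fresh test sets.
for group in (0, 1):
    X_test, y_test = make_samples(1000, group)
    print(f"group {group} accuracy: {model.score(X_test, y_test):.2f}")
# Typical result: near-perfect accuracy for group 0, far below chance for
# group 1 -- the model simply learned the majority's pattern.
```

The point is not the specific numbers but the mechanism: a model optimized on an unrepresentative sample can post impressive overall accuracy while being worse than useless for the people missing from its training data.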

In 2013, Latanya Sweeney found discrimination in the delivery of online ads suggesting criminal records. According to the study, “names identified as being given at birth to black or white babies were found to be predictive of race. For example, names such as DeShawn, Darnell, Jermaine generated ads suggestive of an arrest in 81 to 86% of name searches on one website and 92 to 95% on the other, while those assigned at birth primarily to whites, such as Geoffrey, Jill, Emma generated more neutral copy: the word ‘arrest’ appeared in 23 to 29% of searches on one site and 0 to 60% on the other.”
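For a sense of how lopsided those numbers are, here is a quick back-of-the-envelope check. The counts below are hypothetical stand-ins picked from the ranges the study reports (not Sweeney’s raw data), asking whether a gap that size could plausibly be chance.

```python
from scipy.stats import fisher_exact

# Hypothetical tallies: 100 searches per name group on one site, with
# arrest-ad rates picked from the ranges the study reports.
arrest_ads_black_names = 84   # study range: 81 to 86 per 100
arrest_ads_white_names = 26   # study range: 23 to 29 per 100
n_searches = 100

table = [
    [arrest_ads_black_names, n_searches - arrest_ads_black_names],
    [arrest_ads_white_names, n_searches - arrest_ads_white_names],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio: {odds_ratio:.1f}, p-value: {p_value:.1e}")
# The p-value is vanishingly small: a gap this wide across that many
# searches is not plausibly random noise.
```

The exact figures are stand-ins; the takeaway is that disparities of this magnitude sit far outside anything random variation can produce.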

Why does it matter? There are social, political, and professional implications to having an arrest record. According to a 2021 report by Working Chance, Black and brown women with criminal records face harsher challenges when finding jobs and progressing in their careers than their white counterparts. It feels wildly inappropriate that a company or website is allowed to suggest the existence of something as serious and life-altering as an arrest record for the sake of ad revenue.

Protecting and representing individuals in digital spaces, particularly those in marginalized communities who are most vulnerable, should be central to building effective privacy and technology legislation.

Be well,
Himaja