Black Lives Matter
June 20, 2020
Data | Diversity | Programming
Neil Chaudhuri (He/Him)
Black Lives Matter.
While we at Vidya have made it very clear that diversity in tech has always been a priority and that we have zero tolerance for racism, I am sorry it took so long for me to state those words in this space.
I am sorry it took the murder of Breonna Taylor in her sleep in her home when police executed a no-knock warrant without any announcement or any regard for the black lives inside.
I am sorry it took the murder of George Floyd by a police officer who smugly ignored the desperate pleas of onlookers as he cruelly drained the life out of Mr. Floyd over eight and a half agonizing minutes.
I am sorry it took the murder of Elijah McClain, who played violin for animals at the local shelter and who was on his way home with iced tea for his brother when police descended upon him. He begged them to listen as he pleaded "I’m an introvert. I’m just different. That’s all. I’m so sorry" only to be choked, injected with ketamine, and left for dead while police and first responders chatted about things.
I am sorry it took the lynching of Ahmaud Arbery, who was pursued, attacked, and murdered while jogging by white supremacists, who stood over his body and cursed him using the worst language imaginable.
I am sorry it took the breathtaking privilege of Amy Cooper, who was so angry at avid birder Christian Cooper for merely asking her to follow the rules and leash her dog that she called police and lied, "There's an African-American man recording me and threatening myself and my dog" with implicit knowledge and smug assurance that she had at her command an instrument for killing black people. In fact, Amy Cooper's disgusting behavior completely undermines the "bad apples" argument made so often in defense of police and an entire criminal justice apparatus that destroys black lives.
I am sorry I can't list the names of all the innocent black cis and trans lives going back centuries that we have lost to systemic, institutionalized racism and white supremacy sanctioning police brutality, corruption in the criminal justice system, terrorism, redlining, mass incarceration, climate change, political disenfranchisement, insufficient investment in health care and education and black entrepreneurship, and countless other insidious tactics.
I am sorry conventional media, social media, and elements of our politics are motivated to conflate activism for black lives with "politics." They make money and gain power by stoking division and outrage, and we need ways to undermine these perverse incentives.
It's far too late, but I am finally stating it here on the Vidya website: Black Lives Matter.
It's time to act. It was always time to act.
What will we do about it?
Not enough. It will never be enough. But we can try.
At Vidya we have long worked to promote black people, particularly black women, in STEM and technology through our courses, conference talks, and volunteer support for organizations that help black women build careers in technology, and we will expand these efforts. Still, this moment calls for much more. We will work in the following areas to build technology solutions that support black lives. Please contact us if you have ideas or would like to collaborate.
In her book Weapons of Math Destruction, mathematician and data scientist Cathy O’Neil describes why we are foolish to have so much faith in the "objectivity" of the data models we use to solve racial bias. Data models derived from unconscious bias produce biased outcomes.
Northpointe’s COMPAS algorithm for predicting the likelihood of recidivism among criminal defendants turned out to be a disaster: it underestimated the recidivism of white defendants and overestimated the recidivism of black defendants.
This is such a hard problem that even Amazon has bungled it. A test of Amazon's facial recognition algorithm Rekognition generated an unnerving number of false positives when matching members of Congress against a database of mugshots, and those false positives disproportionately affected people of color.
If only Northpointe and Amazon were as clever with the reliability of their algorithms as they were with the names of their algorithms.
These catastrophic failures aside, there will only be more data analytics, not less, and the predictions and decisions made by these models will have a profound impact on black lives. We will be working on products and services that leverage open-source tools like AI Fairness 360 to build data models that minimize bias and produce fair outcomes, which we hope will help black people overcome institutional racism across society.
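To make concrete the kind of measurement toolkits like AI Fairness 360 report, here is a minimal, standalone sketch of one well-known fairness metric, the disparate-impact ratio, computed by hand over invented data. (The function, groups, and numbers below are purely illustrative; the real toolkit offers far richer APIs and many more metrics.)

```python
# Minimal sketch of the disparate-impact ratio -- one of the fairness
# metrics toolkits like AI Fairness 360 compute. Data is invented.

def disparate_impact(outcomes, groups, favorable=1,
                     unprivileged="B", privileged="A"):
    """Ratio of favorable-outcome rates: unprivileged / privileged.

    A value near 1.0 suggests parity between groups; the common
    "80% rule" flags ratios below 0.8 as potential disparate impact.
    """
    def rate(group):
        got = [o for o, g in zip(outcomes, groups) if g == group]
        return sum(1 for o in got if o == favorable) / len(got)
    return rate(unprivileged) / rate(privileged)

# Hypothetical model decisions (1 = favorable) for two demographic groups.
outcomes = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1]
groups   = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ratio = disparate_impact(outcomes, groups)
print(f"disparate impact: {ratio:.2f}")  # prints: disparate impact: 0.67
```

Here group A receives the favorable outcome 60% of the time versus 40% for group B, so the ratio falls below the 0.8 threshold and the model's outcomes would warrant scrutiny. Measuring is only the first step; tools like AI Fairness 360 also provide mitigation algorithms that reweight or transform the data and model.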
Data architectures ingest, transform, store, and index data that ultimately become the basis for training models, and continuous delivery promotes these models to production systems where they can be queried and visualized to yield valuable insights and prompt further analysis.
Software drives this architecture.
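The stages above can be sketched in a few lines. This is not any particular product's pipeline; all names and the toy "model" are illustrative stand-ins for the real ingest, transform, train, and serve components a production architecture would use.

```python
# Illustrative sketch of the pipeline stages described above:
# ingest -> transform -> train -> serve. All names are hypothetical.

def ingest(raw_records):
    """Pull raw records into the pipeline (here, an in-memory list)."""
    return list(raw_records)

def transform(records):
    """Normalize raw records into feature rows suitable for training."""
    return [{"feature": r["value"] / 100.0, "label": r["label"]}
            for r in records]

def train(rows):
    """Toy 'model': always predicts the majority label seen in training."""
    labels = [r["label"] for r in rows]
    majority = max(set(labels), key=labels.count)
    return lambda features: majority

def serve(model, features):
    """Query the promoted model, as a production system would."""
    return model(features)

raw = [{"value": 42, "label": 1},
       {"value": 7,  "label": 0},
       {"value": 90, "label": 1}]
model = train(transform(ingest(raw)))
print(serve(model, {"feature": 0.5}))  # prints: 1
```

In a real architecture each function would be a separate service (a stream ingester, a feature store, a training job, a model server) promoted through continuous delivery, but the data flow between them is exactly this shape.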
There is a wealth of open-source and commercial cloud software dedicated to all of these concerns. In fact, Palantir offers a complete enterprise solution--but for evil.
The universe of software applications that can help save black lives is vast. From mobile applications that automate recording and notifications about encounters with police to those that help black people overcome voter suppression to make their voices heard. From APIs that enable data sharing to promote better health outcomes for black women who are disproportionately vulnerable to COVID and whose maternal mortality rates are far too high to APIs that convey which communities need the most investment to improve their schools. From simple websites that organize content, disseminate information, and collect donations for fledgling organizations to clever voice assistant technology that provides information on your rights when police demand entry into your home and also notifies loved ones of your situation.
And on and on.
The problems are of course vast; the solutions complex. Vidya will build software to make the incremental progress we need to save black lives.
It has always troubled us that the field of software engineering is rife with offensive terminology. Thankfully, we have not been alone in our concern.
In distributed computing, people often speak of a "master" node managing "slave" or "worker" nodes in a cluster. I mean...really? How about "primary" and "replica" instead? Or just about anything else?
In cybersecurity, people often refer to "blacklisting" users or IP addresses that are deemed threats and "whitelisting" those that are deemed safe. How about "block" or "deny" and "allow" or "accept"?
In distributed version control systems like Git, people often call the mainline branch of the code base "master." In fact, we are guilty of this in the name of following "standards" and "best practices," but part of being anti-racist is acknowledging where you have failed and fighting to rectify it. So how about "main" as GitHub has promoted?
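Making the switch takes one command. The sketch below builds a throwaway repository in a temporary directory purely for demonstration (the path and identity are illustrative) and renames whatever its default branch happens to be to "main":

```shell
# Throwaway demo: create a temporary repo and rename its default
# branch -- whatever Git called it -- to "main".
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"  # local identity so the commit succeeds
git config user.name  "Demo"
git commit -q --allow-empty -m "initial commit"
git branch -m main             # rename the current branch in place
git branch --show-current      # prints: main
```

For an existing repository with a remote, you would follow the rename with `git push -u origin main`, update the default branch in your host's settings (GitHub, GitLab, etc.), and then delete the old branch name on the remote with `git push origin --delete master`.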
While lazy, privileged engineers and executives in the tech space decry these moves as "political correctness," there is no reason to keep reminding black people of the pain of their past and present, and it is cruel to dismiss that pain so casually when no one inflicted it on us. We can express exactly what we want in a way that is inclusive, more accurate, and honors the resilience that defines the black experience.
Words matter. At Vidya our words will represent our values.
We are already avid donors, but we will certainly be more active in donating to organizations focused particularly on saving and improving black lives. If you are able, please consider donating as well. Here are just a few doing amazing work: