Big Tech Shouldn’t Check Its Ethics at the Door
Security is all the rage in IT services.
Many in the technology space still view security as the ability to string some security products together and call a customer protected. Security at that level, however, is only a technology view of the problem. Inherent in security is the concept of trust.
Big Tech is doing its very best to erode any inherent trust in technology. Facebook is a master class in what not to do. When you say you are going to do something, do it. Facebook's choices seem designed to squander whatever public trust it has left – from taking money for political advertising containing known falsehoods, to its failure to close its data collection APIs even after the company said it would. Google keeps getting caught with its hand in the cookie jar when it comes to health care data: it was instantly questioned when it purchased Fitbit, then called out a few days later when Project Nightingale was revealed to be collecting health care data on millions of patients.
This cavalier attitude toward public trust will prove costly for the entire technology industry. Regardless of where you think the ethical line sits, disregarding lines altogether will not position anyone well. Yet Big Tech is dismissive of these concerns, citing how difficult they are to address.
For those who claim this is “too hard” or “stifles innovation,” there are already moves that show neither to be true.
The Department of Defense recently announced a draft set of guidelines that smartly addresses the ethical use of AI. The paper focuses on building systems that are Responsible, Equitable, Traceable, Reliable and Governable. Think about what this means: one of the largest technology producers and consumers in the world has come out and established a set of guidelines for how it expects the technology to work and how it should be implemented. The document is meaty: it addresses the underlying philosophy, including authoritarian versus democratic norms, how to build trust, and how to be transparent with academia and the private sector. It's bold, it's big, and it's thoughtful.
Big Tech is struggling to come up with guidelines, but government has managed to do what it does best and set some societal boundaries. It's clearly not "too hard."
As for stifling innovation, Big Tech argues that oversight will make innovation impossible, yet the adult entertainment industry not only embraces regulation but innovates within its scope. The industry is careful and deliberate about safety and privacy, with very clear controls around adult content. Those controls exist to comply with rules set by government, billing agents and banks, with penalties that range from being barred from payment processing all the way up to prison terms.
Consider child abuse content. If it appeared on an adult site, the company hosting it would likely be put out of business. On Facebook, the same content amounts to bad PR. (And it's notable that this has, in fact, happened.)
This makes a good case for why a well-thought-out ethical approach to technology can be beneficial. Just because it seems hard doesn't mean it is, and here is an entire industry that has managed content far better than its peers. Porn has a long history of driving technology innovation; here's another example.
Legislation is coming. Congress recently proposed the "Online Privacy Act," which would create the Digital Privacy Agency, an entire agency devoted to managing privacy. At first review, this legislation is more aggressive than both GDPR in Europe and California's Consumer Privacy Act. With both sides of the aisle eyeing Big Tech, and consumer trust eroding because of Big Tech's moves, it's likely this …