The biggest barrier to humane, ethical AI: Capitalism itself


Katherine Schwab

October 5, 2020

 

https://www.fastcompany.com/90558020/ai-ethics-money-facial-recognition-fei-fei-li

 

"Ethical AI's biggest obstacle" 

First Impressions: This article seems to be about how capitalism is the biggest obstacle to ethical AI (artificial intelligence) development.

Quote: "For Timnit Gebru, the technical colead of the Ethical Artificial Intelligence Team at Google, one challenge is that she has to work against the incentive structures inherent to capitalism. For publicly traded companies such as Google, constantly increasing profit is the highest good. 'You can’t set up a system where the only incentive is to make more money and then just assume that people are going to magically be ethical,' she said."

Reflection: 

The advent of artificial intelligence systems and the preponderance of their obvious usefulness raise the question of whether ethical considerations should be layered on top of that utility. There is a clamor for the emergence of what is called Ethical AI because the temptation to abuse the technology is very high. The argument in this article, however, is that even if big companies adopt ethical frameworks to ensure that their use of AI is ethical, there will still be smaller companies and startups that use artificial intelligence with little or none of the ethical concern that big companies have. Another way of looking at it is that small companies have limited resources to spend on ethical considerations when they could devote those resources to more research and development. If a small startup has a corporate culture that leaves out ethical considerations from the very start, it is arguable that it will not be as stringent in adopting ethical restraint as a best practice compared with bigger companies that can afford to do so. Publicly traded companies also have public pressure to deal with, since there is an expectation that ethical AI is part of their corporate governance. In terms of experimentation, there will also be instances when AI can be weaponized as part of an experimental hypothesis without real-life targets, and the results at best become part of an academic paper. At worst, though, someone could be tempted to take such experimental results and turn them into an actual real-life application because of commercial considerations.

5 Things I have learned:

 

1.  If the only incentive for companies is to make money, the expectation that they will become magically ethical is naive.

2. It took public protest for companies like Amazon, IBM, and Microsoft to reconsider the deployment of facial recognition.

3. One way to make sure that Ethical AI emerges is the strategy of "pressure from all sides," in which there are external as well as internal advocates.

4. One of the least controversial actions you can take to make AI more ethical is to have better documentation.

5. There's a lot of easy money that can be made with AI.

5 Integrative Questions:


1. How can Ethical AI emerge from corporations that are under pressure to monetize AI without ethical considerations?

2. Why is getting better documentation the least controversial action to have more ethical AI?

3. Why is even getting better documentation for more ethical AI an action that is resisted within companies?

4. Why should Amazon, IBM and Microsoft reconsider the deployment of facial recognition?

5. Why is it naive to expect companies to magically become ethical when their main incentive in the first place is to make money?
