Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002

Has AI hacked the operating system of human civilization?

  • The New York Times: ‘The Godfather of A.I.’ Leaves Google and Warns of Danger Ahead – For half a century, Geoffrey Hinton nurtured the technology at the heart of chatbots like ChatGPT. Now he worries it will cause serious harm.
  • Gizmodo – AI Will Make Our Society Even More Unequal, Economists Warn – AI threatens to shift even more power from workers to the wealthy, shrinking tax bases and weakening governments, writes economist Yingying Lu.
  • Security Week – OpenAI CTO Mira Murati discusses AI safeguards and the company’s vision for the futuristic concept of artificial general intelligence, known as AGI.
  • The Economist: Yuval Noah Harari is a historian, philosopher and author of “Sapiens”, “Homo Deus” and the children’s series “Unstoppable Us”. He is a lecturer in the Hebrew University of Jerusalem’s history department and co-founder of Sapienship, a social-impact company. “Fears of artificial intelligence (AI) have haunted humanity since the very beginning of the computer age. Hitherto these fears focused on machines using physical means to kill, enslave or replace people. But over the past couple of years new AI tools have emerged that threaten the survival of human civilisation from an unexpected direction. AI has gained some remarkable abilities to manipulate and generate language, whether with words, sounds or images. AI has thereby hacked the operating system of our civilisation. Language is the stuff almost all human culture is made of. Human rights, for example, aren’t inscribed in our DNA. Rather, they are cultural artefacts we created by telling stories and writing laws. Gods aren’t physical realities. Rather, they are cultural artefacts we created by inventing myths and writing scriptures. Money, too, is a cultural artefact. Banknotes are just colourful pieces of paper, and at present more than 90% of money is not even banknotes—it is just digital information in computers. What gives money value is the stories that bankers, finance ministers and cryptocurrency gurus tell us about it. Sam Bankman-Fried, Elizabeth Holmes and Bernie Madoff were not particularly good at creating real value, but they were all extremely capable storytellers. What would happen once a non-human intelligence becomes better than the average human at telling stories, composing melodies, drawing images, and writing laws and scriptures? When people think about ChatGPT and other new AI tools, they are often drawn to examples like school children using AI to write their essays. What will happen to the school system when kids do that? But this kind of question misses the big picture.
Forget about school essays. Think of the next American presidential race in 2024, and try to imagine the impact of AI tools that can be made to mass-produce political content, fake-news stories and scriptures for new cults…Fear of AI has haunted humankind for only the past few decades. But for thousands of years humans have been haunted by a much deeper fear. We have always appreciated the power of stories and images to manipulate our minds and to create illusions. Consequently, since ancient times humans have feared being trapped in a world of illusions…We can still regulate the new AI tools, but we must act quickly. Whereas nukes cannot invent more powerful nukes, AI can make exponentially more powerful AI. The first crucial step is to demand rigorous safety checks before powerful AI tools are released into the public domain. Just as a pharmaceutical company cannot release new drugs before testing both their short-term and long-term side-effects, so tech companies shouldn’t release new AI tools before they are made safe. We need an equivalent of the Food and Drug Administration for new technology, and we need it yesterday…”
  • New Yorker – There Is No A.I. There are ways of controlling the new technology—but first we have to stop mythologizing it. By Jaron Lanier. April 20, 2023.
