
{"id":1533,"date":"2016-08-01T11:36:31","date_gmt":"2016-08-01T02:36:31","guid":{"rendered":"http:\/\/www.monomorphic.org\/wordpress\/?p=1533"},"modified":"2016-08-01T17:39:43","modified_gmt":"2016-08-01T08:39:43","slug":"ai-and-the-politics-of-perception","status":"publish","type":"post","link":"https:\/\/www.monomorphic.org\/wordpress\/ai-and-the-politics-of-perception\/","title":{"rendered":"AI and the politics of perception"},"content":{"rendered":"<p>Elon Musk, entrepreneur of some renown, believes that the sudden eruption of a very powerful artificial intelligence is one of the greatest threats facing mankind. &#8220;Control of a super powerful AI by a small number of humans is the most proximate concern&#8221;, he <a href=\"https:\/\/twitter.com\/elonmusk\/status\/738856423656808448\">tweets<\/a>. He&#8217;s not alone among Silicon Valley personalities in having this concern. To reduce the risks, he has funded the <a href=\"https:\/\/openai.com\/blog\/introducing-openai\/\">OpenAI<\/a> initiative, which aims to develop AI technologies in such a way that they can be distributed more evenly in society. Musk is very capable, but is he right in this case?<\/p>\n<p>The idea is closely related to the notion of a technological singularity, as promoted by, for example, Kurzweil. In some forms, the idea of a singularity resembles a God complex. In C. G. Jung&#8217;s view, as soon as the idea of God is expelled (for example by declaring that God is dead), God reappears as a projection somewhere. This is because the archetype or idea of God is a basic feature of the (Western, at least) psyche and is not so easily dispensed with. Jung directs this criticism at Nietzsche in his Zarathustra seminar. 
(Musk&#8217;s fear is somewhat more realistic and, yes, more proximate than Kurzweil&#8217;s idea, since what is feared is a constellation of humans and technology, something we already have.)<\/p>\n<p>But if Kurzweil&#8217;s singularity is a God complex, then the idea of the imminent dominance of uncontrollable AI, about to creep up on us out of some dark corner, more closely resembles a demon myth.<\/p>\n<p>Such a demon myth may not be useful in itself for understanding and solving social problems, but its existence may point to a real problem. Perhaps what it points to is the gradual embedding of algorithms deep into our culture, down to our basic forms of perception and interaction. <a href=\"http:\/\/www.huffingtonpost.com\/entry\/peter-sloterdijk-man-machine-interview_us_55e37927e4b0aec9f3539a06\">We have in effect already merged with machines.<\/a> Google and Facebook are becoming the standard tools for finding information, socialising, getting answers to questions, communicating and navigating. The super-AI is already here, and it has taken the form of human cognition filtered and modulated by algorithms.<\/p>\n<p>It seems fair to be somewhat suspicious &#8212; as many are &#8212; of fiat currency, on the grounds that a small number of people control the money supply and thus the value of everybody&#8217;s savings. On similar grounds, we need to debate the hidden algorithms, controlled by <a href=\"http:\/\/www.nytimes.com\/2016\/06\/26\/opinion\/sunday\/artificial-intelligences-white-guy-problem.html?_r=2\">a small number of people<\/a> (and generally not available for perusal, even on request, since they are trade secrets), and the pre-digested information through which we now interface with the world around us almost daily. 
Has it ever been so easy to change so many people&#8217;s perception at once?<\/p>\n<p>Here again, as is often the case, nothing is truly new. Maybe we are simply seeing a tendency that began with the printing press and the monotheistic church, taken to its ultimate conclusion. In any case, I would paraphrase Musk&#8217;s worry as follows: control of collective <em>perception<\/em> by a small number of humans is the most proximate concern. How we should address this concern is not immediately obvious.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Elon Musk, entrepreneur of some renown, believes that the sudden eruption of a very powerful artificial intelligence is one of the greatest threats facing mankind. &#8220;Control of a super powerful AI by a small number of humans is the most proximate concern&#8221;, he tweets. He&#8217;s not alone among Silicon Valley personalities in having this concern. To reduce the 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[20,49],"tags":[121,158,57,69],"class_list":["post-1533","post","type-post","status-publish","format-standard","hentry","category-computing","category-philosophy","tag-ai","tag-jung","tag-nietzsche","tag-politics"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/py2qT-oJ","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/posts\/1533","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/comments?post=1533"}],"version-history":[{"count":10,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/posts\/1533\/revisions"}],"predecessor-version":[{"id":1546,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/posts\/1533\/revisions\/1546"}],"wp:attachment":[{"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/media?parent=1533"}],"wp:term":[{"taxonomy":"category
","embeddable":true,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/categories?post=1533"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.monomorphic.org\/wordpress\/wp-json\/wp\/v2\/tags?post=1533"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}