
Saturn-Moon symbolism during Harari's 2018 WEF speech


CarolineL

While reading about the Saturn-Moon matrix in The Perception Deception, it struck me that this kind of symbolism was displayed during Yuval Harari's WEF speech in 2018.  (Attached still image extracted from the video at 00:16:35.)

 

The animated Saturn-Moon symbol is pretty obvious. I'm not as certain about the King imagery--my guess is that he represents Arthur Pendragon. Seems he's holding "Excalibur" and the (Archontic) torch in a way that encloses his body in a triangle/pyramid shape. If I'm right about Arthur, the image would also imply Arthur's "Knights of the Round Table" and supply an answer to Harari's rhetorical question, "Now how exactly will the future masters of the planet look like?"

 

Any other ideas about this symbolism?

 

Incidentally, there may be some subliminal messaging during the Q & A part of the presentation. Look on the screen behind Harari for a foot-high strip of rapidly blinking text.  The camera angle makes it difficult to decipher the letters, but maybe someone here will have better luck than I did. (I thought I saw partially formed words like "Die" and "Blood.") Q & A starts at 00:19:35.

 

 

2018-00.16.35.560.jpg


12 hours ago, CarolineL said:


 

I cannot make out what the letters say, but in the video at 15:57, where he asks who owns our data, DNA, etc., I can say without doubt that it belongs to God and nobody else.

It doesn't belong to Governments or corporations or even ourselves, it is God's.


Thanks for looking, Alexa, and for your take on who owns the data.  I think it's a very important question; definitely not one that should be decided by the likes of the WEF.  

 

Harari asserted in the Q & A that "the discussion has just begun." The partially decoded "Big Brother" and other dark symbolism suggest to me that he knows the elites do not intend to let us hacked and hackable humans have a say. I made an informal transcript of the whole presentation and will post it here in the public interest.

 

When pressed, Harari "predicted" that people would give up their privacy in exchange for healthcare.  Then came the Pandemic.  Problem-Reaction-Solution much?

 

 

Quote

ANNUAL MEETING 2018

World Economic Forum

 

[Stage background: Animated digital version of "Big Brother."]

 

MC: ...everybody. And welcome to a conversation with Professor Yuval Noah Harari. My name is Gillian Tett. I am the US managing editor of the Financial Times.

 

[Ref.: https://www.politico.com/media/story/2015/05/the-60-second-interview-gillian-tett-us-managing-editor-financial-times-003775/]

 

Now there are not many historians who would be put on the main stage of the Congress Center of the World Economic Forum sandwiched between Angela Merkel and Macron.


I think there are even fewer who could fill the room almost as much as Angela Merkel. And almost none who would have the experience as we were waiting in the Green Room  and Angela Merkel came through, Chancellor Merkel came through--she took care to stop, go up to Yuval and introduce herself and say, "I have read your book." Pretty amazing.

 

But Yuval Harari has written two very important books, which have really shaped the debate, not just inside governments, but inside many businesses and many non-governmental organizations too. One of them, I imagine most of you have read: Sapiens. Hands up who in the room has read Sapiens? (Show of hands).  Okay, well that is pretty impressive. His second book, Homo Deus, took those themes of Sapiens, looking at the history of mankind, threw it into the future and looked at the digital. He's got a third book coming out this summer. Twenty-one Lessons for the 21st Century. Which is going to look at the present.

 

But what he's going to be talking about today is something that actually Chancellor Merkel touched on in her own speech, which is the question of data. And what do we do about data today.

 

His ideas are very provocative, very alarming, and something that all of you should pay very close attention to now.

 

Professor Harari. Professor Yuval, the floor is yours.

 

[Yuval Harari walks on stage. Applause. ]

 

YH: (to Gillian Tett as their paths cross.) Thank you.

 

0:02:08.0

[Stage background changes to an image reminiscent of  "The Shroud of Turin." ]

 

YH: So, hello everybody. Let me just have one minute to get friends with this computer, and make sure everything is okay. And can I have a bit more light on the audience, so I can see the faces and not just speak to a darkness. Thank you.

 

So I want to talk to you today about the future of our species. And really the future of life.

 

We are probably one of the last generations of Homo Sapiens. Within a century or two, Earth will be dominated by entities that are more different from us than we are different from Neanderthals or from chimpanzees. Because in the coming generations, we will learn how to engineer bodies and brains and minds.

These will be the main products of the economy, of the 21st century economy. Not textiles and vehicles and weapons. But bodies and brains and minds.

 

[Background: "Jesus/Shroud of Turin." Text overlay: WHAT WILL FUTURE LOOK LIKE?]

 

0:03:24.7

 

Now how exactly will the future masters of the planet look like? This will be decided by the people who own the data. Those that control the data control the future, not just of humanity, but the future of life itself. Because today data is the most important asset in the world.

 

[Background: Image of a yellow field with a small stand of green trees in the middle. Text overlay: WHAT WILL FUTURE LOOK LIKE?]

 

In ancient times, land was the most important asset. And if too much land became concentrated in too few hands, humanity split into aristocrats and commoners.

Then in the modern age, in the last two centuries, machinery replaced land as the most important asset.

 

0:04:13.9

 

[Background image changes to a large industrial complex.]


And if too many of the machines became concentrated in too few hands, humanity split into classes, into capitalists and proletariats.

 

Now data is replacing machinery as the most important asset.

 

0:04:31.7

 

[Background changes to scrolling PHP code.]

 

And if too much of the data becomes concentrated in too few hands, humanity will split, not into classes. It will split into species. Into different species. Now why is data so important. It's important because we have reached the point when we can hack not just computers. We can hack human beings and other organisms.

 

There is a lot of talk these days about hacking computers, and e-mail accounts and bank accounts and mobile phones. But actually, we are gaining the ability to hack human beings.

 

Now what do you need in order to hack a human being? You need two things.

 

0:05:24.5

 

[Background changes to a cloud server operation with Chinese or Japanese inscription design on a row of servers; on the right side of the background is a thumbprint.]

 

You need a lot of computing power. And you need a lot of data, especially biometric data. Not data about what I buy or where I go. But data about what is happening inside my body and inside my brain.

 

Until today, nobody had the necessary computing power and the necessary data to hack humanity. Even if the Soviet KGB or the Spanish Inquisition followed you around everywhere, twenty-four hours a day, watching everything you do, listening to everything you say, still they didn't have the computing power and the biological knowledge necessary to make sense of what was happening inside your body and brain and to understand how you feel and what you think, and what you want.  

 

But this is now changing, because of two simultaneous revolutions. On the one hand, advances in computer science, and especially the rise of machine learning and AI are giving us the necessary computing power. And at the same time advances in biology, and especially in brain science are giving us the necessary understanding, biological understanding.  

 

0:06:51.1

 

[Background: Text overlay over computer bank: Infotech. Text overlay over thumbprint: Biotech]


You can really summarize a hundred and fifty years of biological research since Charles Darwin in three words: organisms are algorithms. This is the big insight of the modern life sciences. That organisms, whether viruses or bananas or humans, they are really just biochemical algorithms. And we are learning how to decipher these algorithms.

 

Now when the two revolutions merge, when the Infotech revolution merges with the Biotech revolution, what you get is the ability to hack human beings. And maybe the  most important invention for the merger of Infotech and Biotech is the biometric sensor that translates biochemical processes in the body and the brain into electronic signals that a computer can store and analyze. And once you have enough such biometric information, and enough computing power, you can create algorithms that know me better than I know myself.

 

And humans really don't know themselves very well. This is why algorithms have a real chance of getting to know ourselves better. We don't really know ourselves. To give an example, when I was 21, I finally realized that I was gay, after living for several years in denial. And this is not exceptional. A lot of gay men live in denial for many years. They don't know something very important about themselves.

 

Now imagine the situation in ten or twenty years when an algorithm can tell any teenager exactly where he or she is on the gay/straight spectrum.  And even how malleable this position is.  The algorithm tracks your eye movements, your blood pressure, your brain activity and tells you who you are. Now maybe you personally wouldn't like to make use of such an algorithm. But maybe you find yourself in some boring birthday party of somebody from your class at school. And one of your friends has this wonderful idea that "I've just heard about this cool new algorithm that tells you your sexual orientation. And wouldn't it be very a lot of fun if everybody just takes turns testing themselves on this algorithm as everybody else is watching and commenting?"

 

What would you do?  Would you just walk away? And even if you walk away, and even if you keep hiding from your classmates or from yourself, you will not be able to hide from Amazon and Alibaba and the secret police. As you surf the Internet, as you watch videos or check your social feed, the algorithms will be monitoring your eye movements, your blood pressure, your brain activity. And they will know. They could tell Coca-Cola that, "If you want to sell this person some fizzy, sugary drink, don't use the advertisement with the shirtless girl. Use the advertisement with the shirtless guy." You wouldn't even know that this was happening. But they will know. And this information will be worth billions.

 

Once we have algorithms that can understand me better than I understand myself, they could predict my desires, manipulate my emotions, and even take decisions on my behalf. And if we are not careful, the outcome might be the rise of digital dictatorships.

 

[Background change: Left panel: Temple ruin. Text: Democracy. Right panel: Black and white image of a large circular structure. Text: Digital dictatorship.]

 

0:10:55.8

 

In the twentieth century, democracy generally out-performed dictatorship, because democracy was better at processing data and making decisions. We are used to thinking about democracy and dictatorship in ethical or political terms. But actually these are two different methods to process information. Democracy processes information in a distributed way. It distributes the information and the power to make decisions between many institutions and individuals.  Dictatorship on the other hand, concentrates all the information and power in one place.

 

Now given the technological conditions of the twentieth century, distributed data processing worked better than centralized data processing. Which is one of the main reasons why democracy out-performed dictatorships. And why for example the US economy out-performed the Soviet economy.

 

But this is true only under the unique technological conditions of the twentieth century. In the twenty-first century new technological revolutions, especially AI and machine learning, might swing the pendulum in the opposite direction. They might make centralized data processing far more efficient than distributed data processing.

And if democracy cannot adapt to these new conditions, then humans will come to live under the rule of digital dictatorships.

 

And already at present we are seeing the formation of more and more sophisticated surveillance regimes throughout the world. Not just by authoritarian regimes, but also by democratic governments.  The US for example is building a global surveillance system while my home country of Israel is trying to build a total surveillance regime in the West Bank.

 

0:13:20.3

 

[Background changes to scrolling PHP code.]

 

But control of data might enable human elites to do something even more radical than just build digital dictatorships. By hacking organisms elites may gain the power to re-engineer the future of life itself. Because once you can hack something, you can usually also engineer it.  And if indeed we succeed in hacking and engineering life, this will be not just the greatest revolution in the history of humanity. This will be the greatest revolution in biology since the very beginning of life four billion years ago.

For four billion years nothing fundamental changed in the basic rules of the game of life. All of life, for four billion years: dinosaurs, amoebas, tomatoes, humans. All of life was subject to the laws of natural selection and to the laws of organic biochemistry.

 

But this is now about to change. Science is replacing evolution by natural selection with evolution by intelligent design. Not the intelligent design of some god above the clouds. But OUR intelligent design and the intelligent design of our clouds. The IBM cloud, the Microsoft cloud--these are the new driving forces of evolution.

And at the same time, science may enable life, after being confined for four billion years to the limited realm of organic compounds, science may enable life to break out into the inorganic realm.

 

So after four billion years of organic life shaped by natural selection, we are entering the era of inorganic life, shaped by intelligent design. This is why the ownership of data is so important.

 

0:15:37.4

 

[Background changes to Saturn-Moon symbolism. Text overlay: REGULATION]

 

If we don't regulate it, a tiny elite may come to control not just the future of human societies, but the shape of life forms in the future.

 

So how to regulate the data, the ownership of data?

 

We have had ten thousand years of experience regulating the ownership of land. We have had a few centuries of experience regulating the ownership of industrial machinery. But we don't have much experience in regulating the ownership of data, which is inherently far more difficult. Because unlike land and unlike machinery, data is everywhere and nowhere at the same time. And it can move at the speed of light and you can create as many copies of it as you want.

 

So does the data about my DNA, my brain, my body, my life, does it belong to me or to some corporation or to the government. Or perhaps to the human collective.

At present, big corporations are holding much of the data. And people are becoming worried about it. But mandating governments to nationalize the data may curb the power of the big corporations only in order to give rise to digital dictatorships. And politicians really, many politicians at least, are like musicians. And the instrument they play on is the human emotional and biochemical system. A politician gives a speech, and there is a wave of fear all over the country. A politician tweets, and there is an explosion of anger and hatred.

 

Now I don't think we should give these musicians more sophisticated instruments to play on. And I certainly don't think they are ready to be entrusted with the future of life in the universe, especially as many politicians and governments seem incapable of producing a meaningful vision for the future. And instead, what they sell the public are nostalgic fantasies about going back to the past.

 

And as an historian I can tell you two things about the past. First of all, it wasn't fun. You wouldn't like to really go back there. And secondly, it's not coming back.

So nostalgic fantasies really are not a solution. So who should own the data? I frankly don't know. I think the discussion has just begun. Most people, when they hear the talk about regulating data, they think about privacy, about ah, shopping, about companies, corporations that know where I go and what I buy. But this is just the tip of the iceberg. There are much more important things at stake. So the discussion has hardly begun, and we cannot expect instant answers. We had better call upon our scientists, our philosophers, our lawyers and even our poets. Or, especially our poets, to turn their attention to this big question: how do you regulate the ownership of data?

 

The future, not just of humanity, but the future of life itself may depend on the answer to this question.

 

Thank you.

 

[Applause]

 

0:19:28.8

 

[Background changes back to digital big brother.  Harari joins Tett for a Q&A at the right side of the stage.  Behind Harari's head is a strip of rapidly blinking text. ]

 

GT: Well thank you Professor Harari for an absolutely brilliant, thought-provoking and I must say a somewhat challenging and depressing talk.

 

I must say I'm quite star-struck sitting here, listening to that stream of ideas. And I'd like to start with a very simple question, which is this. You paint this picture of a future that is quite scary. Um, how soon do you expect that future to be here? Are we talking about two years? Twenty years? Two hundred years?  I mean, how soon could we be dealing with digital dictatorships?

 

YH: I think that the time scale is decades. I mean in two hundred years I guess there won't be any Sapiens left. It will be something completely different. Two years is far too soon. So we are talking about a few decades. Nobody knows exactly how many.

 

GT: Right.

 

Now you're unusual. Because you actually stood up on that stage and you said "I don't know what the answer is."

 

Okay, that's not something you hear a lot at the World Economic Forum.  Um, it's admirably humble. But I'm curious. You know, as you look around the world today, do you see any countries or any groups of people or any academic groups that seem to be having a sensible debate about this? Do you see any reason for encouragement at all?

 

YH: Well I think the world is divided into a very, very small group of people and institutions who understand what is happening and what is at stake. And the vast majority, not just of ordinary people but even of politicians and business people, who are not really-- "Ah yes, they hear about data, yeah, data protection, those cyber attacks, somebody might steal my identity or my bank account details." But as I said, it's just the tip of the iceberg. I think that, my guess, I'm not, I don't know. But I guess that ah, some of the big corporations like Google, like Facebook, the usual suspects--they understand what is at stake.

 

I also think that some governments, ah, especially the Chinese government. I think they understand what is at stake. I think most, certainly most humans have no idea.

 

GT: Right.

 

YH: Again, the thing is just to make it clear: the biometric data is the key. When people think about data, they mostly think about where I go, what I buy. When they think about hacking, they think about computers, they talk about AI, about machine learning, they forget the other side of the equation, which is the life sciences, the brain sciences. The brain sciences are giving us access to here (points to his head). This is the real, what we really try--. We. What somebody is really trying to hack is this. (Points to his head.) Not this. (Points to a table.)

 

GT: Right. I mean China is interesting, because I remember sitting at a table a few years ago in Davos with a senior Chinese official who-- arguing about democracy. And he said, "Well, you in the west have democracy. We have social media." And the point was that the Chinese government is using social media not just to monitor its citizens, but also to act as a weather vane to gather information about what's happening in terms of public sentiment, and to ensure that they stay one inch ahead of that to stop any explosions.

 

Do you see China as a place where this type of digital dictatorship is most likely to emerge?

 

YH: Well, I don't know. As I said, as I gave examples, you have cases in the west. And I know maybe best about my own country that ah, Israel is building a real total surveillance regime in the West Bank, which is something we haven't seen anywhere, almost anywhere in history before. Of really trying to follow every place, every individual. And we are, we still haven't crossed the critical watershed of the, of the biometric sensor.

 

Whether it's in the US, in Israel, in China, it's still social media. It's still my mobile phone. It's still where I go, what use I make of my credit card. We still don't really have the technology to go inside. But we are maybe five years, ten years away from having the technology. So maybe to give an extreme example, let's say you live in North Korea. And you have to wear this bracelet, which constantly monitors what is happening inside your body. And you walk into a room and you see a picture of the dear leader on the wall. And the bracelet can know what is happening to your brain, to your blood pressure as you see this picture.

So this, this is really what is meant by a digital dictatorship.

 

GT: I mean it makes 1984 sound positively--

 

YH: Child's play.

 

GT: --child's play. Exactly.

 

Um, you say you don't know what to do about this. But imagine for a moment that you were dictator, be that digital or not. What would you do right now to help humanity deal with this?  Would you like to just throw away all of those biometric devices?

 

YH: No, it's absolutely impossible to go back, especially in terms of technology and science. Even if one country or an entire continent is freaked out by the possibilities, and they say, "We stop all research in this field," you can't force other countries to do the same. And then you have your own, I mean you have a race. A race to the bottom; unless you have some global agreement on how to deal with this, then no country would like to stay behind in the race.

 

GT: So do you want the scientists to take control? Do you want the United Nations--do you think the United Nation--[cross talk]

 

YH: No, as I said, I mean--

 

GT: --World Economic Forum?  I mean could all the people here take control of this do you think?

 

YH: The discussion has just begun. I don't think we should panic. We should just first of all be aware that this is what we are facing. And there are many possibilities. Also technological possibilities. How, for example--. I mean when we talk about regulating ownership of land, we have a very clear picture what it means. Okay, you have a plot. You have a field. You build a fence around it. You have a gate. You stand at the gate and you say, "Okay, you can come in, you can't." That, "This is my field." Now what does it mean in terms of the data about my DNA or what's happening in my brain? I mean, what's the analogy of the fence and the gate? We just don't understand.

So I think we are in an analogous position to where we were with the Industrial Revolution two hundred years ago. And you just need time. I mean, when you start a discussion, I know this from class in university. You start a discussion and somebody raises a hand and says, "Okay, but what should I write in the test?"  And "No, no no! We are not there yet! We should stress to have a discussion about this." I don't have all the answers.

 

GT: Right.

 

I mean, one thing I find fascinating in your description of the digital economy is that it actually involves a picture of society which is not quite the picture that normal economists have.

 

Because most of the digital exchanges today don't actually involve money. People are giving up data in exchange for services. And that's something that no economic model can capture right now. And frankly, the legal models can't either, in terms of the anti-trust. So I'm curious. When you look at this problem, it's not quite economics, it's certainly not just computer science, it's not really any particular discipline. Do you think this means that universities need to rethink how they categorize academics? I mean who is going to take this over-arching view to try and talk about these issues?

 

I'm trained as an anthropologist, so I'd love to say the anthropologists. But I'm not waiting on them either.

 

YH: Hopefully everybody. I mean I think that today if you are a computer scientist, you also need to be to some extent a philosopher, and an anthropologist. It's now part of the business. And um, I think maybe again to try and focus it, you talked about different exchanges in the world.  Maybe the most important exchange in this respect will be in healthcare.  The big battle over what we today call "privacy" will be between privacy and health. Do you give access to what is happening inside your body and brain in exchange for far better healthcare?  And my guess is that health will win. Hands down.

 

People will give up their privacy in exchange for healthcare. And maybe in many places they won't have a choice.  I mean they won't even get insurance if they are unwilling to give access to what is happening inside their bodies.

 

GT: Right.

 

So that exchange would not involve money, but would still be very, very important.

 

Last quick question, then we must wrap, sadly.  When it was all about land control, the elites essentially had feudalism. We called it feudalism in history. When it was all about the industrial machines, we had capitalism and Marxism. Have you thought of a word to describe this new world of data-ism?

 

YH: Yeah, I tried data-ism, but I don't know. I mean words have their own ah, their own life. And what word catches, and what word doesn't is really a coincidence.

 

GT: Well maybe answers on a postcard, if anyone in the hall has an idea. Or tweet it out or send him an e-mail or whatever digital communication you like. But anyway, on behalf of everyone here, I can see from the faces, people are actually not looking at their mobile phones. They're actually looking at you. They are rapt. That is quite an achievement. And thank you very much indeed for both inspiring us and terrifying us.

 

Thank you.

 

YH: Thank you, Gillian. (To audience: Thank you.)

 

[Applause.]

 

End.

 

 

