Dr Paul Vallet: Welcome to the Geneva Centre for Security Policy weekly podcast. I am your host, Dr Paul Vallet, Associate Fellow with the GCSP’s Global Fellowship Initiative. For the next few weeks, I'll be talking with subject matter experts to explain issues of peace, security, and international cooperation. Thank you for tuning in. Cyber security is very much on our minds, as current events of different kinds frequently remind us of the challenges for individual and collective security posed by the increased digitisation of our lives and societies. Yet cyber security encompasses a large spectrum of problems. Today, our discussion will focus rather on how to protect our much-exposed minds as we navigate a digitalised world. To do so I'm joined by Dr Hani Dabbagh. Hani Dabbagh was an Executive in Residence at the GCSP in 2019, is now an Alumnus Fellow, and is a digital strategist. After his PhD in information engineering and electronics, he began a lengthy career with Hewlett Packard, rising from systems engineer to digital business development manager. He then became an independent consultant and senior advisor to companies. As an early adopter of web 2.0 for marketing campaigns, he has focused his attention on the impact of disruptive digital technology on customer behaviour, and how to harness it for business benefit. This comes with an increased attention to what he calls “cyber security of the mind”, an area he will develop as a guest speaker at the GCSP’s Leadership in International Security Course. Thank you for joining us on the podcast, Hani.
Dr Hani Dabbagh: Thanks very much for the invitation.
Dr Paul Vallet: Well, to begin our conversation, my first question to you is: what is the state of cyber security of the mind today for individuals, corporations, or organisations? Do you think the basic problems have been given solutions? Or are you concerned with a newer generation of issues?
Dr Hani Dabbagh: Thanks Paul. I mean, first of all, I think there is a serious security breach of our minds that I don't believe has really been fully grasped by the population. I use that term a little bit tongue in cheek; I like to use the term cyber security because it catches the attention of so many today. But I'm not really talking about cyber security of systems, of IT systems; I really am talking about a security breach of our minds. It's really a breach that's hiding in plain sight. And I think it has serious, far-reaching consequences for our democracy, and for our society as a whole. So, I don't really believe that this is fully under control today. The whole thing really started around 2007, when there was a kind of coming together, an alignment of the planets, as it were, with the introduction of the iPhone, Android, the Kindle, Twitter; Facebook started opening up. And in those days, we went from the dial-up connection to the “always connected”. Now for people a lot younger than me, this might sound a little bit strange, but there was a time when we were not always connected; we actually had to have a modem, plug it into the wall, and hear that noise connecting us to the internet. This we take for granted today, but we were not always connected. Once we are always connected, we then very easily become always trackable. And once we are always trackable, then we are always profiled. And we really then go from one stage to the next to become predictable: it is possible to predict very accurately when you will change banks, when you will stop your subscription. It's actually predictable, and Amazon can ship your product before you even order it. And from there, we became manipulable. All that really has come from the large amounts of data collection that has been going on, from which we get raw data.
And out of that raw data, they deduce a lot of behavioural knowledge and inferences about us. This is what we sometimes call the attention economy, or as I like to call it, the “no free lunch economy”, whereby we're getting these so-called free products, free for us to use, but they come at a very big cost, in what I call a kind of Faustian deal, where we're selling our souls for something that has a huge cost to us. And whereas a lot of people today are aware of the micro-targeting of advertisements, and think it is nice to have an ad for a product they enjoy or will enjoy, it really goes a lot further than that. There's a lot of behavioural analysis, and ways in which we get drawn back again and again, persuasive techniques to keep us on those applications, or on social media, or wherever. Just to give a little bit more context to it, think about a classic, let's say, tripartite agreement between three parties in the classical economy, where you've got a producer, you've got a customer, and you've got a product, right? In the classical sense, if you allow me to use the HP example, HP would be the producer, the customer would be maybe yourself, and the product could be a laptop. So, HP produces the product, you purchase it, and you pay HP. This is what we're used to. But this is turned all upside down in the attention economy. If you take that triangle and apply it to today, you put Facebook as the producer, the product is really your attention, and the customer is the advertiser. So in that agreement we are suddenly no longer a customer; we are a product. And the better the product is, the more the customer, the advertiser, is willing to pay the producer.
So in that agreement, unfortunately, we're not the customer, and wherever the customer is always right, it's not us. And that then becomes embroiled in all sorts of far-reaching consequences, in ways of manipulating us into different sets of actions.
Dr Paul Vallet: Well, your words are really quite enlightening, in part because they've anticipated what would be my next question, but perhaps we can elaborate a bit on that, which was precisely this issue that you raised about awareness. Obviously, what you've just underlined is the fact that since the dynamics of the relationship itself have been transformed, this may be one of the factors why we do hear about some of the issues you're talking about, but I think we've got a problem in terms of following and being up to date as to what exactly the problem confronting us is. So I wanted to have your feeling about whether this notion of awareness and the problem of influencing is in any way making progress in terms of awareness among the general public?
Dr Hani Dabbagh: I think not enough, Paul. I think, as I said earlier, people are maybe aware on the surface that we are being targeted with particular advertising, but they're not really aware of the invasion of privacy that is occurring, generally speaking. I hear often around me, and this is maybe anecdotal, but in terms of privacy: “I have nothing to hide, so what's the issue?” But actually, privacy really is a right. We behave differently when we're alone; we don't have to have anything to hide, we actually need private time and need that privacy. If I told you hypothetically that a government can read your mind, can actually know what you're thinking and what you're going to be doing, you'd be horrified. I mean, you might have images of the Stasi or the KGB or whatever; it's all movies. But I tell you that actually this is what's happening today, only it's private companies. And the amount of data and knowledge that they have about you is concentrated into so much power, by private enterprises that are unaccountable to anybody today, that it should raise alarm. I don't believe that enough is really being said about it, or maybe it is being said more in specialised media and so on, and people need to be really aware of the extent to which this has actually happened.
Dr Paul Vallet: Indeed. Moving along from the very problematic question of whether that awareness is sufficient or not, perhaps now we could see whether we have any ideas or solutions going a little bit beyond raising that public awareness, the way we're having this discussion here. So, I was wondering whether, in your feeling, there are certain types of organisations or entities, whether private or public, that are emerging perhaps as good models in handling this question of improving cyber security of the mind?
Dr Hani Dabbagh: Well, in terms of maybe improving the awareness of it, I can think of the Center for Humane Technology from Tristan Harris. They do amazing work in that respect, and I think this is something that I can only recommend people to follow and read about. In terms of actually doing something about it, I think it really boils down to a push and a pull, if you like. I mean, on the one hand, it's in our own hands to be able to recapture that critical thinking. It's the only safeguard, if you like, that we have against the breaching of our minds: to restore that critical thinking. And I think this is something that goes into all sorts of areas, from education and schooling to parenting, and so forth. Today, one of the consequences of this breach is the spread of misinformation and disinformation. And misinformation is spread by people we know, not necessarily with nefarious intentions behind it. We need to be able to question, to be critical about the source, and so forth. So there's a lot actually in our own hands to be able to do something about it. Maybe it is something about refusing that so-called free software, and being able to decide: well, you know what, this isn't free, I’m paying a lot for it, I'd rather pay from my wallet than with my own soul, as it were. To me, the real source of all of this is that business model that I've described. It's also the recognition that this data that is being taken and amassed is ours, and we need to recover that right and that ownership. So, there's a number of tools around, that people might not be fully aware of, that we're able to use to reclaim that right. There are also data rights organisations that are coming up from grassroots efforts; mydata.org would be one that focuses on the data itself and the ownership of that data, and on the idea that companies are not allowed to just take that data.
I would also hasten to mention a professor at Harvard Business School, Professor Shoshana Zuboff, who wrote a really seminal book on “surveillance capitalism”, and some articles, which talk really about taking back that ownership. From that push perspective, governments have a big role to play as well in this, and we can see that today in the US with the talk of breaking up these big five companies, such as Facebook and Google, but it goes beyond that as well. And she says something interesting about actually forbidding the trading of data. It sounds maybe too extreme, but when you think about it, governments forbid the trade of organs, the trade of people. It does make sense when you realise what data really is. So a lot of it can be in our power as users, and we need that awareness. And we also expect a lot from our governments, to take back that kind of control, because there's a lot at stake here.
Dr Paul Vallet: Well, I get that we each, of course, as adults and educated individuals, have our own behaviour to put in, and you’ve illustrated that very well. And obviously, as you also pointed out, governments can use their regulatory powers to shape the way this business is being conducted, what to allow or not. So, I was also wondering, in that respect, within the industry, which of course you have some knowledge of as well, and bearing in mind that the problem you're telling us about is already considerable: is there any thought given to modelling or predicting future kinds of threats around this issue? Or are we anticipating that if our regulation manages to put the trading of data in check, there will not be a next loophole that can be used for future business?
Dr Hani Dabbagh: That’s a very good question, and I wish I knew the full answer to it. The question is good because you touched on an important point, and that is the crux of digital technology today: it is a solution waiting for a problem. In past revolutions, we started with a problem and we tried to find a solution to it, from the agricultural to the industrial: how do we mechanise? How do we make things go faster? Et cetera. Today, we can, therefore we do, and then we try and find a problem for that solution. And that's what makes it ever more challenging. You know, when I joined Twitter, I think it was 2008 or something like that, it was tied to SMS, just a little nostalgia here. And when I joined, Twitter was designed and made to share what you had for breakfast with friends, right? I mean, ultimately, that's what it was. If you had told me back then that Twitter would be the main megaphone of communication for a US president, I would have laughed off my seat. And even Jack Dorsey would have said the same, and the same with Mark Zuckerberg. I mean, you get that sense that we're creating a Frankenstein that we're kind of losing control over. And we need to have tech-savvy people who surround our politicians and our policymakers, who are able and capable to look into the crystal ball at tomorrow's problem rather than today's, and at what solution is being built that's going to create that problem in the future. And that is something that I see as very nascent today. We've seen CTOs in various governments, but as a, you know, full-fledged, powerful department, that is, I think, yet to be seen. But that is a very important point: it's about tomorrow, rather than today.
The anecdote I always like to give, and I'm going to be running out of time here, is that when I did research for my PhD on speech recognition, I discovered in one of the research papers that researchers wanted to find a good, let's say, application for speech. Here we are talking about the 1970s and 80s. And they figured, well, a postal service for parcels would be perfect, because it's in a gigantic warehouse with conveyor belts, and the operators would punch in the zip code of a package and it would get sent off in the right direction. But because the warehouse is unheated in the winter and the operators wear gloves, the keypads generate a lot of errors. So they thought, wouldn't it be great, wouldn't that be fantastic: you put on a headset and a microphone, you say the zip code, and you're away. To cut a long story short, they discovered it would have been easier and cheaper to heat the warehouse, take off the gloves, and punch that keypad. That's an example I like to give about, you know, a solution waiting for a problem, and that is an important issue.
Dr Paul Vallet: Yeah, well, remarkable. Maybe we have time for one last question. I asked you the previous one because obviously we're also interested a lot in foresight at the GCSP. But our other activity, of course, is executive education. So my final question to you would be, you know, based on your experience: when we're conducting training courses for people who will be going into IT management or other situations of responsibility, are we integrating enough of this preparedness and awareness into our courses, and what can we do to improve that?
Dr Hani Dabbagh: I can't speak for all corporate courses, but I think there's room for that, definitely. We tend to talk more on the technical side, rather than on the human impact side, rather than on the rights side; there's a lot more that we can do there to teach also the responsibilities of IT management, as well as being able to understand what can be taken, in terms of data and so forth. So, I think there's room for that, definitely. And at the same time, executives themselves also need to understand, beyond the IT infrastructure, how far they can go, and what rights and obligations we have. I'm happy to give that little module at the LISC course as well; it’s always an interesting discussion and debate.
Dr Paul Vallet: I think the least we can say is that we will probably come out of this discussion a lot more aware of the issue. Thank you, Dr Dabbagh, for joining us today. That's all we have time for this episode. So, to our listeners, please listen to us again next week to hear the latest insights on international peace and security. Don't forget to subscribe to us on Anchor FM and Apple iTunes. You can also follow us on Spotify, on SoundCloud, and perhaps some of the other infamous platforms that we’ve just been mentioning. I’m Dr Paul Vallet with the Geneva Centre for Security Policy, and until next time, bye for now.
Disclaimer: The views, information and opinions expressed in this digital product are the authors’ own and do not necessarily reflect those shared by the Geneva Centre for Security Policy or its employees. The GCSP is not responsible for and may not always verify the accuracy of the information contained in the digital products.