Exit X? Curbing Social Media
The Why? Curve
September 12, 2024
119
43:58, 60.58 MB


X banned in Brazil. The boss of Telegram detained in France. Is state power finally moving to curb the big social media sites? There's been a lot of talk about reining in X, TikTok, Instagram, Snap and the rest, but have governments now decided to make the sites accountable for the harm they cause - misinformation, child abuse and societal division? Or are the Elon Musks still beyond control and regulation? Robin Mansell, Professor of New Media and the Internet at the London School of Economics, tells Roger and Phil that economic pressure from advertisers will probably be a more effective curb.

Hosted on Acast. See acast.com/privacy for more information.

[00:00:00] The Why Curve, with Phil Dobbie and Roger Hearing.

[00:00:30] Has the counterstrike by Western states begun now against over-mighty social media?

[00:00:35] The Why Curve. Yeah, I mean, this has been a big issue for a long time. The thing is,

[00:00:41] things are changing now, things are happening. But, you know, for decades, for centuries,

[00:00:46] people sitting in pubs have spoken rubbish - said stuff that just isn't true.

[00:00:53] Yes, they have, a few times.

[00:00:58] But of course, you know, it's only a handful of people who can hear it and agree or disagree.

[00:01:02] The difference is now you can spout this rubbish to

[00:01:04] a million people, and it gets picked up, and other people go, there's a fair point, and then it just gets perpetuated.

[00:01:10] It's not just random, that's the thing.

[00:01:12] You know, a lot of this is manipulated - there are bots doing stuff out there that have been put there;

[00:01:17] there are governments paying for bots to do that kind of stuff.

[00:01:19] It is, and it's the Wild West.

[00:01:22] There's no real control.

[00:01:24] It is. Because I popped back onto Twitter - as you know, I came off Twitter.

[00:01:27] You did, and I think Elon Musk noticed. He knew he was famous.

[00:01:31] Well, he knew me - you know, one of a few million, so he could take me or leave me. But yeah.

[00:01:35] Yeah, yeah.

[00:01:35] So, well, I took my - whatever I had. I didn't have many - 5,000, 10,000 followers.

[00:01:39] So anyway - probably 90% of them bots.

[00:01:42] Yeah, probably.

[00:01:43] But I just popped back in there.

[00:01:45] Mila Joy posted: isn't it curious that we never see war footage from Ukraine?

[00:01:53] Why is that?

[00:01:54] To which the first response, from one 'Sally K', was: why? More than likely there never was a war - followed by a long list of war denials.

[00:02:03] All these people who've convinced themselves that we're being told a lie.

[00:02:07] Yeah.

[00:02:07] About Ukraine?

[00:02:08] Yes.

[00:02:08] Because they haven't seen the footage.

[00:02:10] Maybe if they turned the TV on at all, they'd see it, wouldn't they?

[00:02:14] They might.

[00:02:15] You never know.

[00:02:16] Well, now you're siding with them.

[00:02:18] Maybe it's all a big lie.

[00:02:19] But this is how it starts. And probably, you know, all of those replies came from a Russian bot farm.

[00:02:24] But the thing is: first of all, why did Mila Joy post that?

[00:02:28] Secondly, why did Twitter, on my welcome-back feed - you know, 'we know you've just popped back in' -

[00:02:34] Why did they think that should be the first thing I see?

[00:02:36] Oh, no.

[00:02:36] You just have to take it.

[00:02:37] Well, nothing's changed there.

[00:02:39] I think I'll just go away again.

[00:02:40] Okay.

[00:02:40] Do you think, though, it's so bad now that it cannot be turned around?

[00:02:43] For the good, I mean.

[00:02:44] This is - I mean, all this crap -

[00:02:47] how is it being perpetrated?

[00:02:49] How can people be so stupid to believe it?

[00:02:52] Well, you know, I mean, it says a lot about society - not just that this is happening,

[00:02:57] but we allow it to happen.

[00:02:58] So you can say, well, okay, it's up to these big firms to be reined in.

[00:03:03] Well, indeed - but we're allowing them to do this.

[00:03:06] I mean, because of free speech.

[00:03:08] I mean, this is, you know, this is the whole thing that Elon Musk put out there.

[00:03:10] But there's free speech and there's being an idiot.

[00:03:12] And these people are being idiots.

[00:03:14] Yeah, if you could draw the line easily between those two, you'd be doing very good work.

[00:03:18] No, I mean, it's almost impossible.

[00:03:19] That is the fundamental problem, because, first of all, these organizations operate

[00:03:25] supranationally.

[00:03:26] I mean, how on earth do you tie down Twitter?

[00:03:27] Well, it's very interesting.

[00:03:29] Brazil actually has.

[00:03:31] And also, yeah, France has arrested the guy who's in charge of Telegram.

[00:03:35] I don't know.

[00:03:35] I know, you keep saying I can be very...

[00:03:37] I think he's just being Australian, as he is on most of these podcasts.

[00:03:39] So it's almost like a drinking game for people listening:

[00:03:41] every time you mention Australia, you have to have a drink.

[00:03:44] They'd be very drunk people.

[00:03:45] Well, so in Australia now, the Prime Minister there is pushing to actually introduce an age restriction on social media.

[00:03:51] How on earth would you monitor that?

[00:03:52] Well, I don't know.

[00:03:55] I mean, well, you'd have to have some sort of age verification to sign up to these platforms.

[00:04:00] But that can be got around, can't it?

[00:04:02] Well, it can because they've tried in a couple of states in the United States.

[00:04:06] And of course, everybody just gets onto VPNs, but it's probably...

[00:04:11] That's the Mila Joy source.

[00:04:13] Yeah, no.

[00:04:13] Exactly.

[00:04:14] Who probably isn't a 'she' at all.

[00:04:15] If she exists at all.

[00:04:17] So I don't know that that's a viable way forward.

[00:04:20] I mean, is it the sense, though, that if you get all these madcap opinions out there, in the end

[00:04:26] the truth will kind of rise above it, you know, that the...

[00:04:29] So far, it hasn't.

[00:04:30] I mean, we've given it a long go.

[00:04:33] That one, I don't know.

[00:04:33] I mean, that was the idea, wasn't it, when the internet started.

[00:04:36] I remember thinking, well, this is fantastic.

[00:04:38] It's like a global marketplace.

[00:04:40] What an exchange of ideas we'll have, and for the betterment of mankind.

[00:04:45] We'll all rise to this new high level.

[00:04:47] Well, but actually we all went to the...

[00:04:49] We all went down rapidly in the opposite direction.

[00:04:51] And now maybe the answer is there, because we are beginning to see governments do something about it.

[00:04:55] Let's find out if there's a prospect that it will actually work, because quite frankly I'm skeptical.

[00:04:59] Let's talk to Robin Mansell, who's Professor of New Media and the Internet at the London School of Economics and Political Science.

[00:05:05] And Robin joins us now.

[00:05:06] So I mean, the first obvious question Robin is...

[00:05:10] Do you think social media just generally is doing harm to society?

[00:05:14] I think it is in some ways.

[00:05:17] I think it's a double-edged sword. Of course it's doing lots of good for various activist groups who want to get their voices heard.

[00:05:26] But it is also harmful in other ways.

[00:05:30] Yeah, and I suppose the harm obviously is the big concern

[00:05:33] but for a growing number of authorities, including states and groups like the EU.

[00:05:39] Do you get a sense from, for example, what Brazil has done, what France has done,

[00:05:44] what the EU is considering doing that we are now moving to a point where

[00:05:49] some form of state regulation that actually makes a difference is beginning to come in?

[00:05:54] Yeah, I really think that enforcing the rules that you make is very important.

[00:05:58] I think it remains to be seen whether that enforcement is going to be so prolonged and

[00:06:08] contested in the courts that it actually doesn't have a big impact on the behavior of the big digital platforms.

[00:06:15] So you don't think what Brazil is doing or what France is doing will be effective? You think it will be challenged?

[00:06:20] It's not that I don't think it will be effective - I do think it will be challenged.

[00:06:23] And I think my concern is that if you look at some of the other areas of regulation which have been playing out,

[00:06:30] whether it's competition policy or whatever, when you go through a whole series of appeals it can take years.

[00:06:37] So that's not the same as saying it's not effective.

[00:06:39] It's saying that the time horizon is protracted and therefore it's not a case of saying just because they arrest somebody

[00:06:48] or just because Brazil says you haven't responded to our requirements and so we're going to block your application.

[00:06:56] Doesn't necessarily mean that it will have the consequences on the owners of the platforms that some people are terrified that it will have.

[00:07:06] And I also think it doesn't necessarily mean that freedom of speech is suppressed because there are so many alternative platforms.

[00:07:14] But the point is, it's very difficult to prove, isn't it? So, for example, the situation in Brazil - you know, the argument is that X was not blocking accounts that were allegedly spreading misinformation.

[00:07:26] That was part of the reason why they decided they were just going to ban it outright.

[00:07:30] But it's difficult to prove, isn't it? Because I'm sure X would just say, well, look, you know, we've got algorithms in place, we try and do our best.

[00:07:37] How can you stop somebody typing something which might be misinformation?

[00:07:40] I mean, you know, there's an element of freedom of speech, but also you can't predict what someone's going to type before they type it.

[00:07:47] Exactly - but typing it, whether it's an individual or a malicious group organized somewhere in some other country:

[00:07:58] All of that freedom of speech I believe should come with some responsibility.

[00:08:02] And so I don't see it as a case of having to prove that person X did thing Y, so much as a case of the people who own and control these platforms

[00:08:14] having a responsibility to make transparent what it is they are and are not doing.

[00:08:20] How do you mean, transparent? Because, say you have a site like Twitter or X, and they have many people who join it and say things.

[00:08:29] So the transparency should be in showing how they use those things, or how they try to monitor those things - is that what you mean?

[00:08:37] Yeah, exactly. I don't see why that has to be a commercial secret.

[00:08:41] I mean, if you look at the fact-checking organizations, which are NGOs, they are completely transparent about what tools they use and how they go about tracking information.

[00:08:50] I don't see that there's any argument that says that a platform which is used by millions if not a billion people cannot be transparent about what techniques it is using and how it's interpreting its own terms of service if you like.

[00:09:10] I don't see why that should be commercially sensitive. If that's what governments are asking for - that kind of transparency - they're not asking to have access to every tweet and every post that somebody makes on a platform.

[00:09:26] So you're saying they want access to the algorithms - they want to ask, what are the rules you're putting in place to, for example, amplify messages? Is that what we're talking about?

[00:09:35] And more transparency about how many actions are taken, and at whose request.

[00:09:42] So let's just say, for example, a platform gets a request from a rogue state to start amplifying content.

[00:09:52] What was their response? Yeah, so that's at that sort of grand level, almost. But supposing, let's say, someone - let's call them Mr D. Trump - signs up to X and starts writing:

[00:10:04] the last election in the US was stolen by a conspiracy of the deep state, including lots of Democrats, and the Democrats are clearly pedophiles - and puts that up.

[00:10:16] Now then, as far as X is concerned, are they responsible for that being or remaining on the platform? Or is it a question of how fast they take it down? What would be the accountability there?

[00:10:30] I think they are responsible, in the sense that if they are using an algorithm which amplifies that content, and in addition it is also contrary to their terms of service - which, in the case of

[00:10:42] X, possibly it wouldn't be - but if they were transparent about what they did with that content, then they have a responsibility.

[00:10:50] So they're not responsible for the originator of the content, obviously.

[00:10:54] Although keeping them on if they repeated the lie... Well, obviously Trump was suspended from Twitter for a while. Exactly. But that again is an issue where, I think, governments are intervening - at least liberal democratic governments are intervening.

[00:11:10] They are not - at least not yet - saying, we will arbitrate whether or not Trump should be on your platform.

[00:11:19] What they are saying is that you need to be transparent with us about what decisions you are making and how you are making them. And then, if we are not happy with what we see,

[00:11:33] then, at least according to legislation in the EU, the matter would be referred to a court, and then you get independent adjudication of whether or not

[00:11:44] some kind of illegal behavior has taken place.

[00:11:48] If I was them, though - and you know what I'd be saying, and I'm sure this is what they would say - we're basically making money from advertising, so we want as many eyeballs as possible

[00:11:58] to stay on our sites for as long as possible. So we do that by finding content that people like, and then we amplify similar content.

[00:12:06] And that, of course, is what could lead you down a rabbit hole. I mean, it could mean that you see lots of photographs of cats and all of a sudden - like, my wife's feed is completely cats.

[00:12:16] And, you know, the more she looks at cats, the more cats there are - there's a never-ending supply of cats on the internet, and she's seeing them all.

[00:12:24] And she's getting served ads at the same time. But if I was looking at content about Russia, or about the Ukraine war all being a lie - which we just talked about before we started talking to you -

[00:12:37] I'd start to see lots of content like that as well, because that's similar to the content that I've looked at. So they'd say, well, we don't look at the content, we just look at the behavior of people, and we send them stuff that other people who liked that tweet also liked.

[00:12:50] And that's the way the algorithm works - it works equally for good and bad. I'm sure that would be their defense.
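
A minimal sketch of the content-blind "people who liked this also liked" logic described in that defense - a toy co-occurrence recommender, not any platform's actual system; the user names and post IDs are hypothetical:

```python
# Toy sketch of engagement-based amplification: rank posts purely by
# co-occurrence of likes, with no reference to what the posts say.
from collections import Counter

likes = {  # hypothetical data: user -> set of post IDs they liked
    "alice": {"cat_pic_1", "cat_pic_2", "war_denial_1"},
    "bob":   {"cat_pic_1", "war_denial_1", "war_denial_2"},
    "carol": {"cat_pic_1", "cat_pic_2"},
}

def also_liked(seed_post, top_n=3):
    """Rank other posts by how often they share a liker with seed_post."""
    counts = Counter()
    for liked in likes.values():
        if seed_post in liked:
            counts.update(liked - {seed_post})
    return [post for post, _ in counts.most_common(top_n)]

# The same content-blind rule surfaces cat pictures and war denial alike.
print(also_liked("cat_pic_1"))  # e.g. ['cat_pic_2', 'war_denial_1', 'war_denial_2']
```

The point of the sketch is that nothing in the ranking ever inspects the content itself - which is exactly the defense described above, and exactly why it amplifies good and bad content alike.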

[00:12:56] I'm sure it would be, but we're not in a court of law, and also I'm not a lawyer. But what I'm saying is, I don't think it is accurate to say, oh, we don't look at content. Because for the majority of platforms which have made somewhat transparent what they do, much of the moderation is done by human beings - it's not all done by algorithms.

[00:13:21] And so they are looking at content, and they are making substantive judgments about what they believe to be consistent with, let me just say, good behavior and what is not.

[00:13:35] So if they were in a court and they were asked can you tell us what you do?

[00:13:41] If you're an Elon Musk, you probably would say: the only thing we are doing is using the algorithm, we don't look at the content.

[00:13:50] Yes, I agree with you.

[00:13:51] Yeah, because that's going to benefit him.

[00:13:52] I agree with you.

[00:13:53] But the other platforms are gradually coming around to the realization that their activities come with responsibilities, as I said earlier. And gradually, because the laws are changing in different regions of the world,

[00:14:06] they are saying: here, in more granular fashion, are our codes of practice, and here is how many people we employ,

[00:14:17] and here is a record of where we receive requests from to take stuff down.

[00:14:22] And that is a different picture in terms of how you manage this very concerning world.

[00:14:28] So you're saying that we are at a kind of turning point, because you're saying, apart from X,

[00:14:34] the other platforms are beginning, perhaps, to turn around the way they operate, their accountability, and the visibility of that.

[00:14:43] But let me pick you up on something like Telegram. We mentioned Pavel Durov, head and founder of Telegram, who has been arrested in France and then released, but can't leave the country while the French investigate what was carried on Telegram.

[00:15:00] They mentioned child pornography they mentioned various other things.

[00:15:04] Drug trafficking. But that is what's on the platform, not necessarily the way in which it was amplified - so that's a different thing, isn't it?

[00:15:12] On the face of it, it appears to be. But I've seen a number of articles saying that until we see the full detail of the charges - which I haven't seen; I'm not sure if they're available at the moment.

[00:15:25] But I have not seen anything there yet.

[00:15:28] The supposition that I have seen is that the reason for the arrest is less to do with

[00:15:37] what's actually on the platform and more to do with the consequences of what's on the platform. So I suppose they think they have evidence that links something that's on the platform to actual trafficking of children, or actual running of guns, or actual

[00:15:55] drug smuggling or whatever. The law is perfectly clear about those kinds of behaviors - they are illegal - and if they can

[00:16:07] link the two activities, then I suppose they think that they have a case to bring. So it's more to do with the real world than it is to do with what's on the platform.

[00:16:16] Yeah. And we've had, just this week, the head of Ukraine's military intelligence saying that he thinks it's also responsible for actions which have affected Ukraine's national security. It would make sense, wouldn't it - the Russians are using it

[00:16:32] to communicate among themselves. But I mean, if you go back to the age of the phone - you know, a lot was said on the phone, drug transactions took place over the phone.

[00:16:44] You know, we didn't hold the phone companies responsible, because how could they control what was being said? I'm just wondering whether we're being unfair to some of these companies - how can they control what happens on a platform?

[00:16:55] I actually think we did hold the telecommunications companies responsible, in the following way: if there was traffic on their networks where there was a suspicion that it was some kind of illegal content,

[00:17:13] we've had security laws in place that would allow the government, under a court order, to ask for that content.

[00:17:20] To intercept it, yeah. So, I mean, I'm not trying to say that that can't get out of hand, and I'm not trying to say that I think those courts should be proceeding, as I believe they do in this country, in private -

[00:17:36] so people don't know what they're being tracked for. But I do think that we have, in previous media and information ages, held

[00:17:46] many organizations and individuals responsible for communication. It hasn't just been a free-for-all. When the platforms came along, they were allowed a free rein - because they were innovators, because they were generating lots of money in the data economy; this was all seen to be good.

[00:18:06] And the hope was it would be good for democracy, and all sorts of other things.

[00:18:11] Yeah, well, in many ways it has been, but it seems to be turning a corner and

[00:18:16] doing more harm than good.

[00:18:17] But I don't think that the arrest of one man in France, for example, is an indication that we are suddenly marching to a new regime where everybody's freedom of speech is suddenly going to be censored and suppressed. I really don't think that.

[00:18:34] Let me pick you up there, because of something you said earlier - liberal democratic governments, and you talked about the defense of democracy. But there's a mirror image to what we're talking about: the way in which a lot of these sites,

[00:18:45] including Telegram, but also, I think, X and TikTok and others, are used in oppressive regimes as ways of being able to communicate and do things which people wouldn't be able to do

[00:18:58] in any other way. In fact, they can be, and have been, an instrument of attempts to make political change in places where we might think that was a very good idea. This move, if it is a move towards more regulation,

[00:19:10] if it's followed by governments in oppressive regimes, could very much backfire.

[00:19:17] I don't necessarily see why. I mean, why?

[00:19:21] Nobody has said yet that these platforms that are deemed to be problematic today are not going to modify their behavior. As you yourself said,

[00:19:31] part of the reason why they do the amplification of content that they do is to generate money.

[00:19:37] So if they continue to be motivated by the profit incentive, and governments or courts come along and say, you can do this but you cannot do that,

[00:19:45] then wouldn't you change your behavior, so that you would amplify content in the West when it wasn't deemed to be harmful?

[00:19:55] So the people who are using your platform, whether it be Telegram or anything else, in Russia or some other country where there's very little freedom of speech, would not be disadvantaged.

[00:20:08] But they would be, if it became the norm that companies that operate these platforms are expected to provide cooperation and assistance to governments as a result of this.

[00:20:22] As they're regulated, they're expected to comply with regulation in all these different places - and they might end up complying with regulation

[00:20:30] in somewhere like Myanmar, a relatively oppressive regime, or Iran, where that could be really harmful.

[00:20:37] Exactly - but I think you have to differentiate between the motivations of governments and courts in the West and liberal democratic states, at least those that call themselves that,

[00:20:47] and those which are clearly not democratic states.

[00:20:50] I don't think you can say we in the West should treat both the same way.

[00:20:55] So this question about maximizing advertising revenue and those responsibilities associated with that.

[00:21:03] I think it's key to all of this isn't it because I mean, I know for a fact from previous lives that getting a good argument going whether it's on social media or on the radio drives engagement.

[00:21:14] And if you say something scandalous online or on the radio, you get people disagreeing with you, but they're engaged - and there's value to the advertiser in that.

[00:21:23] That's the way appalling talkback radio in the States and in Australia works, and it's the way online works as well.

[00:21:29] So I'm sure a lot of these companies go, well yeah people say scandalous things and then there are loads of people engaged whether they agree or disagree.

[00:21:36] And that's why we get these very long threads, and that's why people who might have nipped in for 10 minutes actually spend 45 minutes.

[00:21:43] And we serve them heaps of ads in the meantime - they love that toxicity.

[00:21:48] And so, I think, reining them in through regulation means saying, well, you've got to have some responsibility for the truth here.

[00:21:55] And you can't stop what people are saying, but you've got to have checks in place and you've got to demonstrate to us that you have those checks in place.

[00:22:03] Because we can always go in and take slabs of content and see, you know, how that's being adhered to.

[00:22:09] I mean, that is all enforceable, isn't it - whether they're doing what they say they're doing?

[00:22:13] I guess I would agree with some of what you just said, but not all of it. I'm not sure that I'd say they can go in and take content and look at it, in the sense that

[00:22:26] there's a big distance between having enforceable codes of practice - where, to go back to what I said at the beginning, you require a certain amount of transparency without looking at specific content - or, if you do look at specific content,

[00:22:42] you do it only under a court order. And now I'm talking about liberal democratic states, where I guess we would have some faith in the courts.

[00:22:51] It's an entirely different thing if you are talking about countries where the courts are not independent and the governments of those states are

[00:23:04] very keen to suppress any kind of dissent. And the thing is, though, we already know that some of these platforms are sometimes responding to the requests of authoritarian states. So, for example, we know that they have been,

[00:23:21] rightly or wrongly - but maybe with some justification - held responsible for helping to mobilize all sorts of horrific behavior in countries from Myanmar to some other countries.

[00:23:37] So the companies in the West already are, arguably, responsible for creating chaos and havoc in other countries, whether it's hate speech or misinformation.

[00:23:49] So, in that sense, it does seem to me that if there are ways of calling

[00:23:59] on the carpet behavior that we here in the West do see as harmful or illegal,

[00:24:07] then something should be done about it. And what I would suggest is happening now, after several decades of basically taking a completely hands-off approach and saying, well, it's free speech, just let it rip,

[00:24:21] is that governments are gradually coming around to the idea that if you want to be responsible and have some care for others and their safety, something more needs to be done.

[00:24:32] And I would say that for the most part, at least in Europe, the something more that is being done is, you know, fairly softly-softly, in the sense that it is negotiating codes of practice with the more responsible platforms.

[00:24:45] It is asking for more detailed information about what they actually do with their algorithms and with their human beings.

[00:24:51] And then there are a few platforms who are holding to this line, which is free speech let it rip.

[00:24:59] That's where we have the problem.

[00:25:01] Yeah, and we know who that is - well, one of them, anyway.

[00:25:06] And it's a Mr Elon Musk, who could well find himself as an advisor to the US government.

[00:25:12] And I went back in there - I came off it, but I went back in there just before we started talking to you; I was talking to Roger about this.

[00:25:19] And the first tweet I saw was a denial that there's a war going on in Ukraine - and that was fed to me for whatever reason -

[00:25:28] and then a whole load of comments from people saying, yes, that's right, and agreeing with it. Now, whether they came from bots or not,

[00:25:34] I mean, that's open content, so you can dip in and have a look at these sites and see that sort of stuff. Should that be allowed? Or should a government be able to say,

[00:25:44] Well you are spreading untruths here.

[00:25:46] It's unchecked.

[00:25:48] We've done this repeatedly, and 20% of the time you are allowing these messages to get through - should they be taking legal action?

[00:25:55] But more to the point - rather than something like the Ukraine war, which is a bit of a political football -

[00:25:59] what about people who put out misinformation about COVID, for example, which could actually have real-world effects in terms of people's health?

[00:26:06] Well, that's what I was saying - I mean, I saw that as a sort of forerunner to Donald Trump saying,

[00:26:12] well, he's not going to fund the war, if anything.

[00:26:13] Well, if you're an ordinary American person and you've seen it on Twitter that it's all false, or whatever...

[00:26:18] It's free speech, I suppose.

[00:26:19] That's an issue that all societies have to grapple with.

[00:26:22] And I think the European view of what the limits to free speech are is different, historically and now, to what it is in the United States with the First Amendment.

[00:26:34] And so there's a cultural thing happening here, there's a political thing happening here, times change.

[00:26:40] And it seems to me that as times change and you know the political and cultural situation changes, then it's worth having a debate.

[00:26:51] hopefully democratically, about what should be done. And it seems to me - and I'm not arguing that the EU Commission, for example, is entirely transparent in what it has done -

[00:27:01] but that seems to me to be what is mobilizing these changes.

[00:27:05] And of course you could pick any instance of disinformation which is circulating around and say, oh, this is having a terrible impact.

[00:27:15] Everybody is going to believe it.

[00:27:17] Society will break down. But if you look at the evidence, the proportion of people who actually do fail to identify what is accurate and what is inaccurate is actually much lower.

[00:27:31] Which only speaks to the hysteria there seems to be around the idea that there's misinformation out there.

[00:27:36] Yeah, good point.

[00:27:38] So it's more, perhaps, things where people are doing things online that have a real-world, direct effect - as we talked about: obviously child pornography, or drug dealing, or criminal activities - and people being held responsible for that.

[00:27:50] But isn't there a problem with the multinational nature of these things? Okay, X is banned now in Brazil, but it's not banned in lots of other countries,

[00:27:58] And they can operate in all sorts of jurisdictions.

[00:28:01] Perhaps there's a need for, I don't know, almost a UN charter on this? Would that be a better way forward, to make sure the regulation is universal?

[00:28:09] Well, I wondered if you'd get to that. Well, as you know, the United Nations is taking steps at the highest levels to set up principles and ethical practices.

[00:28:19] And UNESCO, for example, has published guidelines on what should be done to regulate platforms. But these are at a very, very high level, and as with all statements of that nature that come out at the global level, they all have to be translated into regional or national law if they're going to have any effect.

[00:28:39] So I am not convinced that we will see anything more than the statement and restatement of global principles, which has been going on for some time.

[00:28:55] I don't think the UN has that power - we know the UN doesn't have the authority to initiate actual laws at the national level. So then the next question you probably want to ask is, well, do you think there'll be a treaty?

[00:29:07] Well, like we have for biological or chemical warfare, or things like that - will there be one for the information ecosystem?

[00:29:17] In time, I suppose it's possible, but then you have to ask yourself who will sign up to it. Exactly. And the United States has not been the first to sign up to these kinds of global conventions in the past.

[00:29:29] But all of these countries, including the United States, do have human rights commitments that they're obliged to adhere to. And then it comes back to the differences I was talking about earlier - countries have different ways of deciding whether or not they're compliant with these conventions.

[00:29:48] So I think we're a long way from some sort of top-down, dirigiste,

[00:29:53] dare I say statist, approach to the information world. But what I do see, whether it's the arrest of somebody in Paris or what will probably be a temporary shutdown of X in Brazil -

[00:30:12] I think these things are going to work themselves out and I don't have a problem with the working out.

[00:30:18] I might have a problem with the ultimate conclusion if it were to be, as you described, suddenly my speech is being censored.

[00:30:26] Yeah, well, my speech - it's funny, because I definitely post more than most, and, you know, I find that I've had Facebook posts taken down which have been completely innocuous.

[00:30:35] Well, you say that - I mean, a touch of irony, maybe? That's the problem: you're not allowed irony, because it's being policed by - yeah, yeah - people sitting in

[00:30:44] Cupertino or wherever it is. But what about, you know, the influence on elections? We've got a US election coming up.

[00:30:51] I mean, one in five Americans is on Twitter, and you can't help feeling that if Elon Musk can just tweak his algorithms just a little bit to help his mate Donald, he'll do that.

[00:31:01] And once again, I would take you back to the social science literature - having just reviewed, for a project that I'm involved in at the moment,

[00:31:08] Literally hundreds and hundreds of studies that look at what is the direct impact of.

[00:31:16] disinformation on election outcomes.

[00:31:19] The evidence is that it can have

[00:31:24] an effect on the attitudes and voting behavior of a relatively small minority of people who are already divided and partisan in their offline lives.

[00:31:39] And so, yes, it has an impact, but it isn't the only impact, and it isn't nearly as big as what is supposed. And that comes from empirical studies that have been done, a lot of them in the US around the

[00:31:54] 2020 election, but also in other countries, and the evidence of direct impacts is very, very ambiguous. What does matter

[00:32:05] is how people's lives are going - are they struggling, are they well educated or not well educated - all of these factors, which really don't have anything to do with the online world per se,

[00:32:18] are contributors to whether or not

[00:32:21] mis- and disinformation is going to impact an election. And all this is dynamic as well, because, you know, more people sign up to different platforms - you've got Truth Social out there if you really want to be

[00:32:32] full-on MAGA, and there are lots of varieties of these things, as well as others that are trying to be not like Twitter, not like X. I mean, where do you see the future of social media going? Is the power that it has unquestionably had in the last five to ten years going to remain? Will it get more powerful, do you think, in terms of influencing the way people think? Where do you think it's going?

[00:32:56] I wish I had a crystal ball - so it's a fine question to ask. What we know is that it's not going away.

[00:33:02] I think these companies are morphing, in the sense that, as we move into an environment where, first of all, they think they can make more money from generative AI applications, et cetera,

[00:33:18] the problem may become less what people are doing and much more what AI bots are doing, first of all.

[00:33:26] And then that raises yet more questions about how do you manage and govern it.

[00:33:31] I think one thing you didn't mention earlier, when we were talking about money, is that there is some evidence that the ad tech industry and some of the major advertisers will pull back from these platforms when they find their advertising being located alongside particular types of

[00:33:48] speech. And again, I would come back to the already deeply polarized condition of various societies. So if you take America, with its

[00:33:59] you know, very deep polarization, especially up to the fifth of November when the voting happens,

[00:34:09] And you compare that with some other Western European countries which are also gradually kind of drifting into a more polarized divided state.

[00:34:19] How much of that do you really want to attribute to these platforms?

[00:34:23] Or do you want to attribute it to the ways in which they've been governed up till now - to austerity measures, to all sorts of other things which have really very radically affected people's lives?

[00:34:34] Social media, you mean, is the symptom rather than the cause?

[00:34:37] I think it's a symptom and it can be an accelerator of problems in society.

[00:34:44] I wouldn't say that it is the singular cause - that would be the main point that I'm making.

[00:34:49] So the angry, uninformed mob have always been the angry, uninformed mob - they've just got a new place to meet.

[00:34:55] And no one's going to advertise around them - that's the other point you're making there.

[00:34:57] I think that's where X made a big mistake, because - they call it brand safety,

[00:35:02] don't they, in the advertising world? You've got to be a safe place for brands, and a lot of companies had a lot of people working in brand safety to try to assure advertisers that it is a safe place.

[00:35:11] Elon Musk got rid of them - I think all of them - at X, and consequently people aren't advertising on it, because they can't be assured that it's a safe place to advertise.

[00:35:18] But are they advertising on TikTok and Instagram, and the one we haven't mentioned, Snap? You know, that's a world which perhaps we're not looking into as much, because it's less political.

[00:35:26] But is the same accountability coming there as well?

[00:35:31] Well, there should be, mind you.

[00:35:32] It also gets to the stage, doesn't it, in that future you're talking about, where AI and other things get added on,

[00:35:39] that it then becomes a question about these companies just getting too big - taking too much money and having too much power, too much monopoly.

[00:35:46] Then we just get into - which is almost easier to govern -

[00:35:49] good old-fashioned antitrust legislation, where we say, okay, you've got to break up somehow, because you've got too much control.

[00:35:55] Which is gradually ramping up, at least in the United States - for how long, that remains to be seen. But I would also like to point out that,

[00:36:07] to use the sort of crazy academic term, this datafication process - trying to quantify everything in sight in order to do predictive analytics - is something which is really deeply embedded now across all industries and is driving much of the economy.

[00:36:26] I think that is not going to change anytime soon. And as long as it is the case that this seems to be the way in which

[00:36:36] countries and companies

[00:36:39] thrive in the future and contribute to economic growth, then we're going to have lots of twists and turns in how these big companies

[00:36:48] manage that environment, and how all of that sort of trickles down to all the intermediary actors -

[00:36:54] the sort of middle-sized companies and the small entrepreneurs who also get hold of some of these technologies,

[00:37:00] whether it's large language models or whatever it is - and they start to do things with them. And the question is, do we behave as we did

[00:37:08] 25 years ago and say, oh well, they're innovators, we'll just leave them alone and see what happens? Or do we say, we have learned from

[00:37:17] the past 20 years that completely,

[00:37:21] if you like, irresponsible innovation

[00:37:24] doesn't really help society very much,

[00:37:28] and therefore there should be some guardrails in place. One guardrail is to break the companies up. My question then would be: if you break them up,

[00:37:37] aren't the middle-sized and small companies just going to replicate them and do the same thing?

[00:37:42] And you lose some of the advantages, of course - you know, the integration aspects. You know, the larger companies perhaps can do more,

[00:37:50] you know, better for society. Like, if you broke up Amazon, for example, we'd have a load of shopping sites, all actually slightly more expensive than the one big one - which, you know, was a good thing.

[00:37:58] Although it might be better for the people who are actually making things - there might be a bit of competition for wages.

[00:38:04] But just remember that that's the argument that was made about why you needed one telecoms operator,

[00:38:11] Yeah, yeah.

[00:38:11] years ago - and that turned out to be wrong.

[00:38:13] Yeah, yeah.

[00:38:14] I wonder - just one final point then. One thing Elon Musk did - it sounds like I've got a personal vendetta against him,

[00:38:20] I do have to say - and he's not the only villain, is he? But one of the things he has done is sort of close off the opportunity for people to analyze what's going on on Twitter.

[00:38:31] And if there was a more open interface for all of these social media sites, so you could analyze at least, you know, what is publicly available -

[00:38:41] so that you could look at, for example, changing attitudes, the volume, the level of hate, or

[00:38:50] whatever it is you want to measure, to look at tweets and analyze behaviors -

[00:38:58] in a way that could almost become self-policing, and it could be useful for us all, because analytics companies could use it for the better,

[00:39:05] and sociologists and the like. But it would also help to police these sites, if we saw that there was abnormal behavior developing. So some open interface seems like it should be part of the regulation, shouldn't it?

[00:39:18] Well, it already is a requirement under the Digital Services Act in the European Union, and the question is whether or not they can enforce it.

[00:39:27] Also, even where data is made available, quite often the metadata, or the contextual information that surrounds that data, is not provided to researchers. So one step is to say

[00:39:44] that the likes of X should be making that data available; but a second step is to ask, are researchers able to independently examine that data? Are they given enough contextual information?

[00:39:56] So, for example, if you get data from Facebook under contract, quite often Facebook will be right there with you - not necessarily sitting beside you, but telling you what the parameters were and how you should,

[00:40:07] you know, clean the data and work with it. Which is good, but the question is: is that really independent research? So

[00:40:19] there's a long way to go before we actually get to an environment where we have that kind of transparency. But - I started by talking about transparency - it seems to me that as long as it's not

[00:40:30] undermining principles which I feel very strongly about, like freedom of speech, and it's not undermining privacy, if

[00:40:40] you ask companies to be more transparent about how they operate - we do that with every other industry I can think of, from airplanes to,

[00:40:50] I don't know, water companies, for example.

[00:40:56] And so why do we think that we should just

[00:41:00] be hands-off with these big companies because they deal with information? And the only reason why we think that is because someone has said

[00:41:10] that freedom of speech is all that matters, and it doesn't come with responsibility. Well, daylight and fresh air seem to be the

[00:41:17] main things to clean this up - that's pretty much the lesson there. Robin, thank you very much for doing that for us. And, well, we'll see where social media does go, and indeed what effect it does or doesn't have on November 5. Robin, thank you.

[00:41:29] You know, there's not that transparency in everything, because I instantly thought of Colonel Sanders with his, you know, secret recipe. Yes. But then, I've just looked it up on the internet.

[00:41:39] Yes, and we know what the ingredients are. Oh, you've got them? Yeah - so you could set up your own.

[00:41:44] Yeah, I mean, I reckon: basil, thyme, celery salt...

[00:41:46] Yeah - even that is in the open.

[00:41:51] There are no secrets - maybe that's the way forward. But speaking of that sort of food - yeah, we like to eat it because it tastes good, even though it does us no good.

[00:41:59] Yeah, right. And that's because many of us are not, possibly, what you might call adults - I mean, we don't do adult things in the way that perhaps we would have done 30, 40 years ago.

[00:42:10] We see something, we like it, we want it. I mean, on a Sunday morning, you're not putting a suit and tie on when you go out for your morning coffee. I always do, constantly - I dress for dinner, of course.

[00:42:22] Oh, very nice. But it's true, isn't it - as we've got older, we behave like young people.

[00:42:26] Well, a lot of older people - a lot of adults generally - are not, according to some social scientists, growing up in the way they used to.

[00:42:35] I mean, they're used to instant gratification, they're used to getting what they want when they want it, and they also rather cling to some of the things from their infantile state.

[00:42:47] But is a lot of it that they constantly want to be seen as being cool and on-trend and down with the kids? Or are they just terrified of modern grown-up life?

[00:42:55] Hmm which might equally be the case but there is definitely a bit of a problem in that regard.

[00:42:59] But we all face modern grown-up life - we've all got a mortgage to pay, we all have the fear of losing our job hanging over us, we're all in perpetual fear. No, a lot of young people don't have those things. Maybe that's why they go around in onesies and goodness knows what else, you know what I mean -

[00:43:14] that sense that perhaps the adult world is rather... You don't have a onesie? You do! I tell you, we know. No, no, I don't, I don't. Anyway, we thought we would discuss this slight trend in modern life: not growing up.

[00:43:24] Is this older people not growing up, or is it all of us not growing up, fundamentally? Is it easy to do? Are we just being dominated by youth culture - with youth culture basically being a refusal to grow up and do adult things?

[00:43:38] Maybe it's a sense of youth culture we hang on to, into our 70s.

[00:43:43] Yes, that's something, absolutely. Okay, all right - an interesting angle to approach it from. We'll look at that next week on The Why Curve.

[00:43:48] Thanks for joining us today - we'll see you next week. Bye. The Why Curve.