[00:00:00] The Why Curve with Phil Dobbie and Roger Herring
[00:00:04] The world for anyone under 30 is now dominated by TikTok
[00:00:08] Social media has been a revolution of connection and communication
[00:00:12] But it's also been a pit of sexual exploitation, bullying and horror
[00:00:16] And there seems to be no way to protect young people from the worst of it
[00:00:19] The big networks publicly wring their hands and make promises and do nothing
[00:00:24] Governments launch inquiries and promise action
[00:00:26] Is there any effective way to guard the most vulnerable from the worst the internet can throw at them?
[00:00:31] And is there any appetite to do it?
[00:00:33] The Why Curve
[00:00:36] Well I think there's quite a bit of appetite to do it
[00:00:38] But the question is does anyone actually know what we have to do
[00:00:42] Well there's a lot of huffing and puffing and thinking something must be done
[00:00:45] The classic thing
[00:00:46] And it seems to be everyone goes well it's a technological problem
[00:00:49] It's all to do with the internet, the internet's to blame for it
[00:00:51] All we need to do is to control the internet players
[00:00:53] But the question is how do you do that because we are dealing with governments around the world
[00:00:57] Dealing with massive corporations, three trillion dollar companies
[00:01:01] Which all have an interest, in terms of bringing them in money,
[00:01:05] In doing the things that actually are the worst of the internet
[00:01:08] The worst of social media
[00:01:09] Also they might say well it's a bit complicated
[00:01:11] What we're talking about is almost like social engineering
[00:01:13] And even though that's actually exactly what they do
[00:01:16] It is a bit too complicated for us to try and weed out the good from the bad
[00:01:21] And isn't it just one of these moral panics
[00:01:23] Anyway some people say well you know young people always access these things
[00:01:26] They've always bullied each other, they've always had access somehow to pornography
[00:01:30] They've always found the horrible things out there
[00:01:32] However much we might try and protect them
[00:01:34] It's actually no different
[00:01:35] Yeah and if we overprotect them, then is that mollycoddling?
[00:01:39] If it even works because young people are pretty good at finding their way around
[00:01:42] Anyway absolutely
[00:01:43] Including at different ages
[00:01:45] So I mean yeah I mean you do get told you know well as a parent you should know what your kids are doing
[00:01:48] What's on their devices
[00:01:49] I mean hell I'd have to be awake all night first of all to try and police that
[00:01:54] Secondly there's no way in the world I can out maneuver my son online
[00:02:00] Because he's far better than I am at understanding how to evade firewalls or get into
[00:02:05] He'll tell you what you need to hear and get on with what he's doing
[00:02:08] What he's doing exactly
[00:02:09] But there is clearly a big desire to do something about it
[00:02:13] So let's get some of the evidence about what this actually does, how it works
[00:02:16] Let's talk to Emily Setty, she's a lecturer in criminology at Surrey University researching young people's harmful sexual behaviour and online harms, and she's with us now
[00:02:26] Is there a necessity to control what people, what young people see online
[00:02:31] I know you've looked at some of the difficulties that can come from it
[00:02:35] But many people say well this is all part of a free media
[00:02:38] It's a part of people growing up
[00:02:40] People have always exposed themselves to things that are challenging
[00:02:44] It's all part of growing up and really we shouldn't even be trying to interfere with that
[00:02:48] What do you say?
[00:02:49] Yeah sure so is there a necessity
[00:02:52] I suppose it depends on what is it you're trying to control
[00:02:55] Who is it that they think is going to be doing the controlling and of whom
[00:03:01] I think sometimes these statements are made in quite a broad way without necessarily actually identifying and disentangling how this is going to work in practice
[00:03:13] Just because something is challenging to make workable in practice doesn't mean it isn't needed
[00:03:18] But I think there does need to be a bit more precision around what it actually is that we think the issues are
[00:03:24] Because I find when I'm engaging say with adults, parents, teachers, others who are interested in young people's safety and wellbeing online
[00:03:33] There can be a particular perspective on what's going on and why it's happening
[00:03:39] That sometimes isn't always hitting the mark actually in terms of the reality of young people's perspectives
[00:03:45] And certainly then the proposed solutions might come with their own counterproductive consequences that we need to anticipate
[00:03:54] So in the online space, I mean the two areas obviously are exposure to pornography and what it does in terms of mental health from bullying and the like
[00:04:04] Are they the two significant issues? Because in both cases, they could be a lot worse than they used to be
[00:04:10] But they're not new things which have been brought about by being online
[00:04:13] The nature of it might have changed but kids have always been exposed to those two elements for as long as we've been around
[00:04:19] Yeah absolutely I mean I wouldn't deny that those are two issues that are pertinent
[00:04:25] Obviously there are other issues but yeah you're definitely right to note those two and they're quite distinct in some ways
[00:04:32] And I yes completely agree that there's nothing new about those particular issues
[00:04:38] Bullying has been around for as long as human beings have been in existence and so actually has pornography
[00:04:44] The desire for human beings to create some kind of representation of sex and bodies is not new
[00:04:50] And to consume it and also historically to regulate who can consume it
[00:04:56] That also isn't new and the powers that be have always sought to manage the consumption of what we would call pornography among kind of the lesser orders as it were
[00:05:08] And so I think we do need to be a bit critical when we consider that
[00:05:13] I think what is so important to consider is let's if we say pornography as an example which is such a fraught issue
[00:05:23] What we're even talking about when we refer to exposure to pornography
[00:05:28] I work with a colleague who will instantly say there is a big difference between a seven eight year old who stumbles on something online
[00:05:37] Versus a 15 16 year old who purposefully and deliberately goes out to access that material
[00:05:45] And I think we therefore at that starting point need to distinguish what we're actually talking about here
[00:05:52] There's a good example though so do we want to stop it happening so that the seven year old doesn't inadvertently come across porn online
[00:06:02] I think when you get to 15 or 16 year olds it's a bit hard isn't it just as those magazines used to be on the top shelf people will find a way of getting them
[00:06:10] You know it's almost impossible to control online or otherwise but at the younger ages
[00:06:18] Is that the difference in fact, Emily, that being online, that being social media, means that it is accessible to seven or eight year olds in a way it wouldn't have been before?
[00:06:28] Yeah I mean why do we dislike the internet and social media so much as adults when it comes to young people it's because of that kind of volitional experiences that they're having away from us
[00:06:41] And away from our ability to oversee and regulate what it is that they're experiencing
[00:06:47] I think that's always been a struggle for adults in offline domains I don't think there's anything particularly new about that
[00:06:52] But yeah I do think we need to treat the younger children as a separate case in that regard
[00:06:58] Practically how you would do that I think is quite challenging I think as young people grow into adolescence we need to become very mindful of situating their experiences online within a wider context
[00:07:16] I think there are a lot of very reductionist and deterministic narratives going on at the moment that, you know, young people's sexual behaviour is becoming more harmful and more unhealthy because they are consuming pornography in ways that previous generations haven't been able to
[00:07:31] So is that true? Is it worse than it used to be? Is it new?
[00:07:37] I mean harmful sexual behavior and unhealthy sexual behavior isn't particularly any more or less problematic than it used to be we're just more aware of it
[00:07:47] And it might be changing in nature and scope because online tools and devices create new conditions for it to take place within
[00:07:56] But the actual underlying problems aren't actually all that different in much the same way as bullying is not caused by being online
[00:08:03] It's caused by a set of dispositions and social processes and contexts that need to be taken into account
[00:08:09] And yeah we need to update some of our understanding for the new challenges that are being posed by online domains
[00:08:15] But I think it's quite almost reassuringly simplistic to say okay if we could deal with the pornography thing then wouldn't we have young people having great sexual experiences and sexual development and relationships
[00:08:30] And I don't think that's necessarily true so we don't want to give too much power to pornography in that respect
[00:08:37] So what are you saying is it's less important than we think it is then?
[00:08:40] Of course it's less important than we think it is in my opinion I think that actually understanding the ecosystem and ecologies within which young people develop and understand relationships and sex holistically is so much more vital
[00:08:56] And actually so much more interesting when I sit down with young people and I say okay where do you learn about sex and relationships from?
[00:09:04] Pornography might be one part of that it might be a significant part of that for some of them some of them don't even mention pornography
[00:09:11] There is such a multifaceted set of influences upon them and I think we almost give them the linguistic tools to make sense of their own experiences
[00:09:24] So we have delivered in schools say a lot of interventions that tell them that pornography is bad, harmful, having all these negative effects on them
[00:09:34] So then when you sit down and talk to young people they repeat all of that and sometimes you need to disentangle well okay let's talk that through
[00:09:43] What evidence actually is there for cause and effect? Where else are you learning about sex and relationships from?
[00:09:50] What else is happening within your social and cultural context? And then you can start to develop their critical thinking about that
[00:09:57] Not just about pornography but about all sources of influence right?
[00:10:02] I was going to say we're talking about pornography a bit here but in a way maybe it is a false thing to separate that from bullying because often those things are mixed together
[00:10:10] Sexting, this kind of thing, of course, and there are wider issues to do with bullying, to do with people being pressured to lose weight for example, or people being pressured even into suicide in some cases
[00:10:21] And all kinds of things like that and is that different from the pornography area or not?
[00:10:27] No I mean that's the thing it's just recreating in an online environment the same social and cultural pressures that have always existed
[00:10:36] And there has been an intensification and an increase in the visibility of some of those processes
[00:10:41] Where before young people's sexual development bullying or so on could be hidden from view it's now there isn't it? It's become much more tangible and we can see it happening
[00:10:51] So is it volume as well is that part of it? So maybe you know kids have always been exposed to porn but now there's more of it
[00:10:58] They've always been subject to bullying but now it's happening more often is the volume component has that turned up a notch and is that problematic?
[00:11:06] And yeah, with volume you'd have to look at, okay, what does the data show about the intensification of some of this in terms of the amount of it per se?
[00:11:16] I think what you definitely see is a collapsing of boundaries between different spaces
[00:11:21] So what used to take place in a certain environment can now take place everywhere and anywhere because devices are in our pockets right?
[00:11:28] So it's made a lot of these processes quite visible and I wonder whether there's an opportunity in that because then we can utilize that to actually have more conversations
[00:11:39] My biggest problem actually with the monitoring and control and regulation and how can we get young people off of these spaces and devices is one I don't think it's necessarily workable
[00:11:50] I think what it will do is just encourage young people to be more secretive and to disengage and to you know young people are always one step ahead of the game aren't they?
[00:11:58] You're playing catch up with the ingenuity with which they evade adult control and that's always been the case and it will continue to be the case with devices
[00:12:09] But I think what it also does is it closes down conversations. What we can see here is an inroad into being able to talk to young people about things
[00:12:19] What I think we sometimes have as adults is a disinclination or discomfort with actually just sitting down and talking to young people about things
[00:12:28] That's because they're on their phones all the time in a way isn't it?
[00:12:31] Well yeah, that entrenches the us and them dynamic doesn't it? How do we actually create conditions in which we can have two-way conversations with young people about some of these challenges that we're talking about today?
[00:12:43] You're asking parents to talk to their kids about sex
[00:12:46] Yeah, yeah, that's not nice is it?
[00:12:49] No, I mean that's the most horrific part of this discussion so far, the thought that we've got to have that conversation
[00:12:54] If you could see Phil's eyebrows going up higher and higher and higher because he's kind of in that space
[00:12:59] That's what they've got all the devices for
[00:13:01] And do you know what I think though with that, it's so fascinating to me because yeah it's really hard to speak to our kids about sex
[00:13:08] Do you know what though, it's really hard to talk to anybody about sex
[00:13:11] Because married couples don't talk about sex openly and honestly with each other
[00:13:16] It's such a fraught, awkward, intensified aspect of the human condition right?
[00:13:22] So is that a human condition or is that just a bit about being British? Do you know it's the same in other countries?
[00:13:27] It definitely is the same in other countries to some extent as well
[00:13:31] Yeah of course the British stereotype of being prim and proper probably just intensifies some of that
[00:13:36] But think about it with bullying, we've known, forget the internet and social media for a while
[00:13:41] The reason why bullying is so hard to disclose and talk about is because of the shame and stigma of being bullied
[00:13:47] A lot of these problems that kind of create barriers between adults and young people in terms of really working through some of the challenges
[00:13:55] Are about the meaning that that conversation has and how difficult and uncomfortable and unpleasant it can be
[00:14:02] And I think sometimes therefore we go down this route of saying, oh look it's all social media and isn't this awful
[00:14:07] And that's why young people don't speak to us and this is where it's all coming from
[00:14:10] You know can we just get rid of that? And then we can go back to this bygone era where everything was wonderful
[00:14:15] And I think no not quite. Parents and kids have always found it hard to talk to each other
[00:14:19] There's always been intergenerational gaps
[00:14:22] Kids have always responded to efforts to be controlled and regulated with coming up with a way of getting around it
[00:14:29] And actually what we're really talking about here is that age-old problem of how do we bridge generational gaps?
[00:14:35] How do we guide the upcoming generation?
[00:14:39] And I don't think there is any quick fix. The challenges will be new, the conditions will be new, the devices will be new, whatever it is
[00:14:47] But in the end we're going to keep chasing our own tails if we don't find ways of connecting with young people over the world
[00:14:55] Let me inject a bit of scepticism here because you're talking about it's always been there, that's the way it is and it's not that much different now
[00:15:01] But there is something different in the sense we have big companies, TikTok, we have X, formerly Twitter
[00:15:08] Whose raison d'etre, whose way of working is almost designed in a way to accentuate those things and they make money out of it
[00:15:16] It's a commercial imperative
[00:15:17] So yeah they're the ones turning up the volume on all of this
[00:15:19] Oh I couldn't agree more on the extent to which platform responsibilities here are so vital
[00:15:27] And I think the kind of corporate commercial agenda is completely anathema to almost the best interests of their users
[00:15:36] And we know that, there is evidence for that, that you can monetise the very most toxic dynamics of what goes on say on social media
[00:15:45] And I think that is a problem
[00:15:47] And I don't personally know how to regulate our way out of that
[00:15:54] Because I think, like we've seen with the Online Safety Act and its fraught progression through Parliament
[00:16:01] It goes back to what I was saying at the beginning, what exactly would they be being asked to do differently
[00:16:09] And who is asking them to do that and with what effect, right?
[00:16:13] But isn't it putting pressure on the companies that actually answer this
[00:16:16] Because you're saying kids will find ways around things
[00:16:18] But if technology doesn't allow them to do that, if the platforms don't allow them to do that
[00:16:23] But then what do you say to these companies? Do you say turn off your algorithms?
[00:16:26] And they're going to say well yeah and reduce your advertising revenue as a response
[00:16:32] Well they make enough money anyway
[00:16:33] Well yeah but no there's never enough money is there right?
[00:16:36] No no
[00:16:37] They're not going to say we're going to earn less because we're going to be more benevolent to children
[00:16:42] They're going to say well okay look you know if we can turn up the algorithm and get more hits then we'll do that
[00:16:47] And unless there's a law and then the issue is well what would the law be?
[00:16:51] I think that's what you're saying isn't it?
[00:16:52] If you're going to introduce laws on this, what exactly are they going to be Emily?
[00:16:57] It's hard to say isn't it?
[00:16:58] Well and that's why, I mean the Online Safety Act has been a complete disaster
[00:17:02] Because it's got a bunch of vague requirements on platforms
[00:17:07] And now Ofcom is going to have to spend years coming up with some kind of regime to enforce that that creates the specifics
[00:17:14] And if the likes of Ofcom are struggling and going through rounds and rounds of consultation
[00:17:22] To try to understand how to make this workable, it really speaks to the heart of the issue
[00:17:27] I think in the absence of a clear and straightforward way of regulating platforms that we can all agree on
[00:17:37] And actually command some kind of public legitimacy, I think it does go back to that need to think well okay
[00:17:44] What can we do with young people in order to help them navigate that environment?
[00:17:50] Much the same way, I suppose, as drug use. Drugs are what they are, aren't they?
[00:17:57] They have particular effects on human beings that can be very difficult to navigate and they're almost designed
[00:18:04] And we try and ban them and it doesn't work
[00:18:06] But you're talking addiction there aren't you?
[00:18:07] Or even anything kind of underneath that you know, dysfunctional behaviours around drugs, food, whatever it might be right?
[00:18:17] How do we navigate these problems where the very essence of it is almost designed to work against us?
[00:18:22] And I think it is that idea of what is it that we need to be doing with young people to help them make sense of this?
[00:18:28] You know an example that I give say around cyberbullying is you know talking
[00:18:34] Some of the interventions and work I do with young people is why might a social media pile-on be kind of not pleasurable
[00:18:44] But like why might you be drawn to it? What are the actual kind of psychological and social dimensions at play there?
[00:18:50] Because yeah absolutely those things are being capitalised upon by the social media companies by definition
[00:18:55] They know that but what can you do almost to recognise how that's being played on within you right?
[00:19:04] And how maybe to respond differently to that, because actually a lot of these kids then say, well the real challenge isn't really about what's going on online
[00:19:12] It's about if I don't join in I'm going to get excluded from the peer group. Oh right then so that's not actually fully a technological issue then is it?
[00:19:20] That's partly a social issue. So what agency do you have within this domain?
[00:19:26] Where actually are the problems technological versus more social or psychological?
[00:19:32] And how can we disentangle that as adults rather than almost denying agency to these kids by saying it's all technological?
[00:19:41] Something I've done a lot on that is around like online sexual image sharing you know kids sharing nude images of themselves and each other and all that kind of stuff that we're really worried about
[00:19:52] And actually I would say that some of our messaging to kids that you know if you take a picture and you send it to somebody
[00:19:59] You've lost control over that image there's no privacy online and so on and so forth
[00:20:04] I'm thinking do you know what we've really denied the agency of the young people involved in those behaviors because actually yes of course your device enables you to leak that image and to share it with all your mates and to cause loads of problems for that person in the image
[00:20:21] But that's because the tool is offered by technology. You as an individual human being have to make a decision to use that tool
[00:20:30] The question is can you make that decision at 13 years old I guess or whatever age it is. Is there an age at which your mind doesn't develop to be able to make the rational decisions and you don't realize until it's too late and you're suffering the consequences of it?
[00:20:46] Absolutely right but how can we encourage young people to make more ethical and more responsible decisions?
[00:20:51] I don't think going into them and saying privacy is dead if you take an image of yourself it's going to get leaked. I think the message needs to be about encouraging how can we get them critically thinking about if an image is sent to you why would you share it further?
[00:21:06] Why would you make those sets of decisions? Why would you utilize the tools on that technological device to harm someone else? Because that's about your human agency.
[00:21:17] But we're talking teenage boys here aren't we?
[00:21:20] Well do you know what though? What's so fascinating about that is when you get into those conversations with kids not everybody's image is getting leaked. They make very clear judgment calls about whose image they'll share further.
[00:21:33] They'll share the picture of the unpopular person because it's all about bullying and shame. They'll share the picture of the somebody that looks stupid and they want to make fun of them. They are making conscious choices.
[00:21:45] Our role I think as adults is to make them aware of those choices and get them critically thinking about that in order to guide them in the direction of ethical behavior.
[00:21:56] I'm not saying that we'll solve it all when they're 13 but going in and telling them it's really dangerous risky don't do any of it to begin with. That hasn't stopped the behavior anyway. The unethical behavior is still happening. We haven't changed that.
[00:22:08] It's interesting you're saying all of this because I don't think my kids would be sharing those images. They are 16 and 17 by the way, or just coming up to 16.
[00:22:17] And what surprises me about both of them, a boy and a girl, the older one is a boy, is that they are far more ethical and more concerned about other people than I ever was at their age.
[00:22:30] So all of this I think you're right. They will be going into these, if they're faced with these situations they will be looking at them perhaps a little bit more rationally than my generation would have if they'd been given the same technology.
[00:22:44] But isn't the point that they're not the problem? If it's just about children on children, or young people on young people, what you're saying is right, you can educate them.
[00:22:55] But the problem with these machines as we know is they connect with adults who are pretending to be that 16 year old or whatever or are encouraging.
[00:23:03] I mean some of the most awful cases recently have been encouragement of suicide and what have you. But it's adults doing that and they aren't the ones you're reaching out to are they?
[00:23:13] No and do you know what, and that's what's so fascinating isn't it, our sort of inability to connect with young people and resolve any of this is actually storing up some of those problems right.
[00:23:24] I mean an example that I think keeps coming out at the moment is this idea of what we call sextortion. So somebody shared an image online often with an adult and then they are being threatened and coerced to carry on doing so.
[00:23:37] But what is the way of actually resolving that? It's exactly the same as what we would want to do with any kind of abuse that young people are experiencing at the hands of adults.
[00:23:47] We want them to come and talk to us about it and to disclose that abuse. The problem with the way that we are approaching the digital sphere is we are entrenching those barriers because we're telling kids
[00:23:58] you know social media is bad, being online is bad. If you take a picture of yourself that's bad and stupid and you shouldn't do any of that and it's risky. And then when they're then finding themselves in these situations they don't feel able to come and speak to the adults around them.
[00:24:11] A lot of the work that I've done with parents and police around how you respond to actions that young people are engaging with when they're underage is about are you increasing or decreasing the likelihood that this child will come and disclose harm to you.
[00:24:26] If you have delivered a lesson to kids on sexting let's say and you've told them all it's illegal and it's bad and they shouldn't do it. And then they do do it anyway because they're kids and why wouldn't they?
[00:24:36] And then they find themselves being threatened or coerced maybe by an adult, maybe by another young person. Are they going to come and talk to us about it? Maybe not.
[00:24:44] And it comes back to, well, not maybe not, they're definitely not. They are not disclosing because they're like, I've done something bad. I am going to get into trouble for what I've done wrong.
[00:24:54] Because you've already told me what your opinion is on that so. Exactly and I think it is it goes back to that thing of look digital is here to stay unless we kind of somehow manage to get rid of all the devices or somehow manage to regulate our way out of this.
[00:25:07] I don't think that's going to happen. It is not going to resolve like I say that age-old problem of adults having to find a way to connect with young people not just to guide their ethical development as citizens into adulthood but also in terms of embedding that dialogue early so that kids feel able to come and talk to their parents if they're struggling.
[00:25:29] Whether it be cyber bullying, sexual coercion online, whatever it might be that might be going on. How can we encourage them to understand that? Same deal goes with pornography.
[00:25:39] Do you know what's so striking, I mean, a few boys that I've spoken to have got into trouble for looking at underage pornography, and I'm not talking about real child-like ages, say, they've been looking at 14, 15, 16 year olds online.
[00:25:55] And they're also of that sort of age, and they've turned around to me and gone, I didn't think that was like a big deal because like they're my age. They're the same as the kids at school, like it'd be even creepier to want to look at like a 25 year old, a 30 year old.
[00:26:09] To them that's normal and I was like wow there's like anybody ever spoken to you about the different categories of this stuff and what's legal and what's illegal and they're like no you just get told it's all bad and you shouldn't watch any of it.
[00:26:22] And I'm like right okay we're really not equipping you with the knowledge and the tools that you need to navigate this stuff right. Our very heavy handed undifferentiated messaging is just not hitting the mark on the complexity.
[00:26:37] Just to be clear on what you're saying. I mean to sum up your position, you're saying that regulation, the kind of regulation they talk about with Ofcom and others, and putting pressure on these internet firms, the people who are making money, the algorithms, you're saying don't bother with any of that.
[00:26:51] Just concentrate on educating the children? Is that right? I'd say no do both in tandem. I think we should absolutely be putting pressure on the social media companies. We should make sure that pressure is evidence based and informed by what we are learning about from our engagement with the kids.
[00:27:06] Our engagement with young people about the challenges that they're facing and we can then present to these social media companies look this is what needs to change and what we're asking for and campaigning for 100%.
[00:27:18] But we cannot use that as the full 100% solution. That will still not prevent us from needing to have conversations with kids about some of these.
[00:27:28] But do we know what we're asking those tech companies? What do we want TikTok to do, for example, if they said okay we're listening. What do we want them to do?
[00:27:35] Yeah I mean I think that they, I mean I think they already know that what they do to monetize you know the clicks and the attention is counterproductive. I think the time for voluntary action in that sense is kind of over.
[00:27:50] And I think that idea of how can we devise something that at its very most practical prevents that from even being a kind of voluntary consideration for TikTok.
[00:28:02] I don't know necessarily technologically what that would look like and I think that's what the Online Safety Act is trying to do right.
[00:28:10] And I think we need to figure out what's actually workable and not workable on these platforms. My sort of technical prowess with that is probably quite limited in terms of understanding exactly what that would look like.
[00:28:22] Well I mean if you talk to the technologists they would say something else which is just unachievable is this idea that everyone should be identified.
[00:28:28] So because you know if you take the anonymity out of all of it then in theory it's very easy to track people. I mean you can sort of track people anyway so if you break the law online, you know you can do a lot with IP addresses.
[00:28:39] If you've really broken the law and you should go to prison for it, then the police will get warrants. They'll go to the internet companies, they'll track your IP address and they'll nab you.
[00:28:48] Yeah. But I mean so that already exists even without any changes in regulation and that goes on.
[00:28:54] That's how we do catch people and I guess that was one of the benefits of all of this happening online is that it is traceable.
[00:29:01] Well do you know what actually some teachers say that. They say you know bullying is actually easier to deal with now because we've got evidence of it rather than who says what.
[00:29:09] And I think yeah the anonymity thing, you know there's a bigger sort of theoretical debate about the way that anonymity impacts behaviour online and I definitely think there's a problem with that.
[00:29:18] I think there's a it's the trade off isn't it in terms of who collects that data, who are we you know identifiable to and with what impact on privacy rights and so on.
[00:29:34] And that's what I'm saying about this Online Safety Act. We still actually haven't resolved any of those tensions, and those tensions continue to unfold in all the political debates that we had about that legislation and the sets of laws contained within it.
[00:29:47] And I think we do wait to see what those technological solutions can look like. I think there have been some quick wins. I know Instagram is looking at say how to prevent adults from being able to contact minors who aren't in their friends list and the idea of adult men sending you know explicit imagery to children and that being kind of blocked at source.
[00:30:14] Because I think some of what these you know social media platforms say they would find very difficult to do. I don't believe it. I think they know exactly what is in every single image. I think they know exactly what the nature of communication is because that's how they make their money.
[00:30:30] And I think they need to be called upon to not just use that knowledge and that technical ability to make money. I think they need to be called upon to use it to keep their users safe.
[00:30:43] And should they know the age of each of their users as well? I mean there's a lot obviously there's a lot of people, there's a lower age limit for a lot of these platforms and kids are on it way below that because they lie about their age because they want to get on it.
[00:30:56] So it's almost like saying, well okay, there's no lower age limit, just be honest about your age, and then you can actually control the images that they see. I'm sure you're right. There are algorithms that can go through and say there's a lot of naked flesh in this, it's possibly a nude image, whereas this is a picture of a boat in the Lake District.
[00:31:11] And you know what actually that point you just made is quite interesting because it goes to show when you try and regulate something you end up with unintended consequences. When you put an age rating on something kids lie and then they lose their protective status as young people because they've lied about their age.
[00:31:27] Whereas if you didn't have an age rating then you'd have more honesty and then you'd be able to manage the interactions between people because you'd understand the differences in age.
[00:31:35] And I'm not saying therefore that we need to get rid of age ratings entirely. I'm saying that that's the balancing act all the time isn't it? It's the push and pull between what we think we need to implement to keep people safe and then the potential counterproductive effects of that.
[00:31:48] It's like when kids phones are monitored we've actually lost some of the evidence say of grooming from adults because kids are deleting it because they know their phone is going to get monitored.
[00:31:58] And I'm not saying that-
[00:31:59] Well they're always more savvy than the rest of us are anyway in terms of those kind of things.
[00:32:03] So should kids be able to maintain their anonymity online, or should we be saying for everyone, not just kids, if you want to get online you've got to prove who you are, just as you do, for example, with your bank?
[00:32:15] Yeah for sure. And that's the argument with pornography right? It's an explicit category. The idea is there's going to be some sort of age verification or age assurance to be able to access it.
[00:32:25] And the problem I think then that goes even like above what we've already discussed is what we're seeing happening with that.
[00:32:34] Well okay, most people are quite uncomfortable with the idea of uploading some kind of proof of ID to these websites, and who's going to keep that data, and all of that.
[00:32:44] And then there's the idea of AI being able to judge the age of the person, and how valid and reliable that data is going to be.
[00:32:54] And then you've got the problem that you need an international global regime because to get around all of this you just need a VPN. So you're still not fully solving the issue.
[00:33:03] What you might do though, and it goes back to what we were saying in the beginning, is you might be able to get some quick wins, say around the seven-year-old that stumbles on pornography through a pop-up.
[00:33:13] Yeah you might be able to tick some boxes on that and I think some of the pornography stuff is about maybe managing some of that.
[00:33:19] But what you're then going to have to do is, as kids get older and they become more autonomous and more agentic, their ability to do things beyond the purview of the adults around them increases more and more.
[00:33:29] I think you're going to have to keep having the conversation. So how you would monitor a seven-year-old's device would be very different to what you might be doing with your 15 year old.
[00:33:38] But that's always been parenting right?
[00:33:40] For sure and what does the I mean this the seven-year-old could have gone into his brother's bedroom of course.
[00:33:46] And so, but what does it do to the seven-year-old? Do they just click away, or does it materially change them in some way?
[00:33:55] Well yeah, I mean do you know what, most of the data on this shows they're fine.
[00:34:01] But I think the thing is that it's not just about the age of the child.
[00:34:07] They might be a bit shocked and upset. You know some of them in the first instance you know of course but most of them end up being fine.
[00:34:16] I don't want to overstate kids' resilience, because I think that can lead to putting too much burden on them to carry things and to display resilience, when actually the adults shouldn't just be throwing them to the wolves like that.
[00:34:27] But I do think most kids end up all right and I would say in terms of pornography there is no evidence that there is a direct cause and effect just like there hasn't been with any kind of media that young people have been exposed to if we want to use that language.
[00:34:39] You know we used to do it with the video games and the you know and the slasher films and all the rest of it didn't we?
[00:34:45] But if kids watch this they will go out and copy what they do and it will damage them. There is no evidence for that happening on a widespread level.
[00:34:54] It's just this regular sort of social horror that everyone has every day.
[00:34:57] Oh yeah we love the moral panic of it don't we?
[00:35:00] We do rather.
[00:35:01] I snuck in to see Poltergeist at the cinema.
[00:35:03] Or it's not too late to prosecute Phil I think.
[00:35:06] He's a complete wreck now.
[00:35:08] Absolutely, it changed me.
[00:35:10] It's a disaster.
[00:35:11] But there is one other core and final point and I think Roger will relate to this as well.
[00:35:16] How much of the time that kids are spending online are they looking for affirmation?
[00:35:20] I mentioned Roger because obviously we do this podcast just for the affirmation for the number of likes we get.
[00:35:24] I feel for them.
[00:35:25] And I think that's the biggest thing.
[00:35:28] I mean that's important to them and I wonder how dangerous that is.
[00:35:33] And there's an ONS survey of 10 to 15 year olds, and this isn't me just upset because my daughter is getting more likes for her posts than I do.
[00:35:53] I think it is.
[00:35:54] The ONS crime survey for 10 to 15 year olds says that over a third of children in that age group accepted a friend request from someone they did not know online.
[00:36:04] And to me that's not actually that disturbing a statistic.
[00:36:09] I mean for some people it will be, but for my daughter, who goes and sees bands and stuff and tweets about it, there'll be other followers of that band who will send her a friend request.
[00:36:18] And my response to that is so what?
[00:36:21] Yeah, yeah I think that's quite a fascinating example.
[00:36:24] I mean we saw that during like the COVID lockdowns.
[00:36:27] I heard from a lot of young people saying if we didn't make friends online we weren't making friends anywhere.
[00:36:32] Like this is where our social network builds it's how we make friends and how we do things right.
[00:36:37] It's so normalized to them.
[00:36:39] I think there are inclinations that human beings have towards social belonging that can cause dysfunctional behaviors for themselves and for each other.
[00:36:48] So you know like with the cyber bullying or leaking people's nudes or whatever a lot of young people's behaviors are driven by a desire to belong.
[00:36:57] And they will do things about which they are quite ambivalent actually but they will do it because they want to find a tribe they want to belong.
[00:37:06] That's evolutionarily baked into us human beings isn't it?
[00:37:09] We wouldn't have survived if we didn't have those inclinations within us.
[00:37:12] And I think therefore it goes back to that point.
[00:37:16] Yeah of course social media companies themselves capitalized on that desire in order to keep us on these platforms and to keep drawing us in and sometimes to encourage us almost to make decisions that aren't good for us or for each other.
[00:37:29] But also on the other hand I think having conversations with young people about that, about saying you know why do we do things sometimes that maybe are dangerous or maybe are dysfunctional or harmful.
[00:37:41] But like they feel good because we get a sense of belonging from it.
[00:37:44] And I think that's one of the most powerful ways that us as adults can bridge the gap with young people.
[00:37:49] The best quality conversations that I have with young people is when you sit down and level with them and you say look none of this really is like that deep in the sense of you personally are doing something wrong.
[00:38:01] You're just a human being trying to navigate all of this. Let's just talk about it and reflect on it.
[00:38:05] Maybe there are no right answers or quick or easy solutions to some of this.
[00:38:09] Like with porn. Yeah of course, maybe it's bad to watch porn, but people do. It's very natural and normal for human beings to break rules.
[00:38:18] So what you're saying, I think, is don't emphasize the rules either, because they're probably breaking them.
[00:38:22] They will break them, and also it entrenches shame. We know that with kids, the shame of what they're doing then causes them to shut off even more.
[00:38:31] I think it's actually just leveling with them and having that conversation.
[00:38:34] And you know what parents teachers whoever say to me they're like young people don't listen to us.
[00:38:40] We told them what to do. We told them what they're allowed to do and not allowed to do.
[00:38:44] Why aren't they agreeing with us and doing what we said?
[00:38:47] And I'm like because nobody would if someone turned around to me and just gave me like a full on lecture about what's good and what's bad and what I should be doing and not doing.
[00:38:57] We'd all switch off as well, I think.
[00:39:00] So Emily they need to talk to you.
[00:39:03] My kids aren't going to listen to me and then I'm going to listen to their teachers.
[00:39:06] That's the answer.
[00:39:08] Emily thank you so much for taking us through all that.
[00:39:10] Thank you.
[00:39:12] And hopefully there we are we've been talking and that's what it's all about.
[00:39:15] Exactly this is a model for what we need to do.
[00:39:18] It is and you know I think we can forget about Big Tech.
[00:39:21] They won't change.
[00:39:23] I mean we can introduce laws.
[00:39:26] Laws are the rules of course as well.
[00:39:28] Kids will find their way around the rules around the tech in the way they always do.
[00:39:32] But thanks for letting us know what you know the evidence you've got and what it says about the whole problem which is as you know a very vexed one.
[00:39:39] Thank you Emily.
[00:39:41] Cheers.
[00:39:42] Goodbye.
[00:39:43] So yes I need to like a lot of things.
[00:39:45] It's all a question of balance isn't it really.
[00:39:47] It is.
[00:39:48] I think we have a great need of that, because so much in the world seems to be just getting out of control.
[00:39:54] You think?
[00:39:55] It feels like that.
[00:39:56] Yeah and particularly in terms of geopolitics.
[00:39:59] Well at the moment, yeah, and it looks like we are moving to something pretty dramatic in the Ukraine War, which has been going on for two and a half years.
[00:40:06] Well it feels like that the front around Kharkiv.
[00:40:10] Are the American weapons going to get there in time can the Ukrainians hold on if they don't.
[00:40:15] Will they all roll all the way back potentially to Kiev.
[00:40:18] Does the whole thing end that way in any case with a potential change in the White House in six months.
[00:40:23] So what happens if Russia marches into Kiev.
[00:40:26] What's the what's the reaction to that.
[00:40:28] Do we all go oh that's a bit of a shame but are we moving swiftly on.
[00:40:31] Yeah.
[00:40:32] No well I mean everyone is scared senseless and it's very interesting the Americans are pushing in very hard at the moment.
[00:40:37] Although obviously the people in Washington may change in the say in six months time.
[00:40:43] It's very hard to say and will it all fall apart which is the way it feels like.
[00:40:46] And if that does happen and Putin triumphs what difference does it make whether Trump comes in or not.
[00:40:52] Yeah.
[00:40:53] No, and so what then, the West just goes oh well, we tried, and...
[00:40:58] Well most of them would be deeply unhappy about that, but perhaps not, certainly in terms of Germany and France, with enough force to make them actually do something about it.
[00:41:06] Anyway that we will discuss next week.
[00:41:08] Well that's going to be an uplifting episode isn't it.
[00:41:11] But anyway yeah well I mean we have covered it quite a bit over recent months but it's just one of those issues.
[00:41:17] Time for an update.
[00:41:18] Yeah for sure.
[00:41:19] That's what we'll do next week.
[00:41:20] We're just looking very serious.
[00:41:22] That's how bad it is.
[00:41:23] That's next week on The Why Curve.
[00:41:24] Thanks for listening today.
[00:41:25] We'll catch you next week.
[00:41:26] The Why Curve.

