Oct. 23, 2025

Has the internet ruined the Human Identity?

It promised a world of self-expression and connection. But instead, it’s encouraged us to create performative versions of ourselves, curated by algorithms and validated by likes.

In this episode, Colin Corby – a Digital Detox Coach, Technologist, and CEO of Technology Wellbeing – joins us to explore the complex relationship between technology and the human identity. We discuss how online personas are curated, the psychological effects of social media, and the role of influencers in shaping self-worth.

We also look at the implications of AI on identity, the psychological toll of a permanent online record, and the diminishing empathy in digital interactions – ultimately contemplating the future of human identity in a technology-driven world.

Check out the episode here.

 

Gareth King (00:24)

Colin, thank you so much for joining us and welcome to the show.

 

Colin Corby (00:27)

Thank you for inviting me, I'm very much looking forward to it.

 

Gareth King (00:30)

Yeah, me too, but before we get into it, can you tell us a bit about the work that you do and the journey that's led you to this point?

 

Colin Corby (00:37)

So I'll try and keep it short. When I was in my mid-30s, I had a fantastic job in technology, lots of world firsts, great people to work with, but I was suffering from stress. What was happening is my heart rate went up and up and up until I started to see stars, and I passed out a few times at work. So, loads of tests. At the time they said the results were inconclusive. So I thought, well, what am I going to do? Because I've got to do something.

 

So, I remembered being a good swimmer, because I live near the water. And so I thought, I'd better get myself fit then. So, I went swimming, a couple of lengths, absolutely wrecked. It was a tipping point, I had to do something. So, I persevered with the swimming. I got fitter. I took lessons to learn all the strokes. I started racing, started triathlon. To cut a long story short, I ended up doing four Ironman triathlons and winning county titles in swimming and stuff like that.

 

But it was in doing the Ironman that I realised how you talk to yourself can change the way that you think and change what you can do. That fascinated me, so the psychology aspect started to set in. And then, much later on, I got the opportunity to leave the corporate world. I was going for some interviews and one or two other things, and I was being guided by a coach. I said, you know what, I don't really want to do this. I said, I love technology, I love sport and the thing about the mind, and why am I on my iPhone so much? I researched that and I thought, yeah, I can just smash all three things together and work out what's going on, but actually be able to coach, give talks, help other people.

 

Gareth King (02:21)

You've previously spoken at TEDx about human identity, the effects of technology, and our online selves versus our true selves. Do you think the line between our online selves and our offline selves is so much more blurred these days? Do people really realise that they're even curating an online identity, or do they just see it as an extension of themselves?

 

Colin Corby (02:49)

I think that most people now must curate their online presence, because we've had it long enough. So I curate my online presence. I use it mostly for work activities, and very little for personal activity. It's curated. I think everybody knows that it's curated.

 

A lot of people now realise that actually it's self-censorship, because there's a record. Every website you've ever visited, everything you've ever purchased, every message you've ever sent, every post you've ever made is recorded as online memories. So we basically self-censor, so there's a whole host of things I would never say online that I would say in the real world with people. So, self-censorship.

 

Now we're quite lucky in the sense that we can observe what's happening in China, which has a very interesting form of self-censorship, whereas we're being influenced predominantly by the US.

 

In preparing for this talk, I said, we're going to talk about the internet and then we're going to talk about our identities. If we look at the internet first, well, what is it? Well, the internet today isn't what it was last year or the year before or the year before. The internet today has a large percentage of non-human content. The internet is an artificial construct. It's the most complex thing humans have ever created, but it's a facsimile of the real world.

 

Gareth King (04:18)

Yeah, for sure. And I think that was kind of where I was heading with the reference to the lines blurring. As you said, the internet being a facsimile of the real world, I can remember a time when there was no internet. And so it was a thing that came into my life. My identity was an offline identity that the internet got added to at the time.

 

And then I think over time, as it's kind of become so much more all-encompassing across everything, as you said there, depending on where you are online or what you're doing, people are self-censoring and curating like a different version of their identity.

 

So, I'd love for us to get into how maintaining all of those different versions of your offline identity in an online space can either A, blur together or B, become such a huge part of your identity that the offline part is either equal or less than the sum of those parts. So there's a lot to kind of get into there.

 

Colin Corby (05:17)

So the thing to say is that who you are, who I am, it's got two components. One is who other people believe we are. So, these multiple curated personas of ourselves, that's how other people are perceiving us in the online world. That's how governments perceive us, job search companies, algorithms.

 

The other thing to think about is that there's an infinite amount of information on the internet, more than we can possibly access, and we're being fed feeds, curated feeds from all the different apps and algorithms. So, they shape what we experience on the internet. Now, if we come down to ourselves, evolution thought it wise for biological animals to have this conscious experience of self: we experience the now and we predict the future based on all of our experiences.

 

So if most of our experiences are online, then those experiences are shaping who we are. And for a lot of people, if they spend most of their time online, then most of their experiences are online, and that shapes who they are, more so than when you switch off, go out for a bit with mates, have a good laugh, play a sport, or do something in what I call the real world, although we each have our own perception of the real world.

 

So what I'm interested in doing is getting that balance, getting people to have more human contact so they can hang on to who they are. But we know that in a non-internet world, we were a different person at work to who we were in a sports club, who we were down the pub, who we were to our families. So that part of it isn't new.

 

Gareth King (07:06)

No, absolutely. And I think you touched on something there around, you know, if you're predominantly experiencing the online or the digital world, those experiences are shaping who you are both online and then beyond. But one thing I'd love to kind of get into about that is, we know that people are curating their online selves. And, you know, there's a lot of discourse around mental health problems that might come out of social media, whether it's people getting body image issues and things like that, or jealousy because it seems like everyone else has a perfect life, et cetera.

 

And we know that that's because you're only seeing everyone else's highlight reel. So, if you're taking in that their identity is this picture-perfect life where everything seems flawless, that's obviously going to give you some feelings of inadequacy in the online space that are then going to translate into the offline space potentially. But how do you think likes and shares became such an important and valued form of digital currency for our self-worth?

 

Colin Corby (08:14)

So, at the root of it is that we're social animals. Part of the success of Homo sapiens is that we're social animals; it's a safety thing, a sense of safety. And within a group, everybody is always trying to pitch where they fit into that group, how high they are, what their standing is, those sorts of things. And those instincts, that social behaviour that we've all got, have migrated to the internet, and migrated to all the likes.

 

So particularly as you get older, you get less worried about it. When you're young, you're worried about everything. But if you're missing the physical contact with friends, the going out and the real-world activities, and if you're focused solely on this, then if people don't like you, that has serious consequences, because it makes you feel instinctively unsafe. Because we're social animals, we need to be liked.

 

We need to have a social standing. And so it’s just migrated to the internet and it has horrific implications for people that are very sensitive to that.

 

Gareth King (09:23)

Yeah, look, that's a perfect way you've summed it up: it's kind of a biological thing that, instead of playing out in that primal structure where you might've had, I don't know, two lions or something fighting and that's their competition, or, you know, animals doing mating rituals and things like that, it's now in the online space. And as you've called out, you don't need to actually be anything other than what you are in that one little moment, which I can imagine is just fragmenting people's identities beyond all sorts of repair.

 

Colin Corby (09:56)

I've heard that at schools, when you have a careers talk and they ask what do you want to be, the answer is: an influencer.

 

Gareth King (10:02)

Yeah, yeah, yeah. There's something interesting in there too, because I think that humans love influence, and no matter what we're doing, as you said, even in your peer group, there's always that leader of the group, and everyone kind of settles into their structure and the hierarchy offline. And I think that the influencer is just kind of a globally reaching, very easy way to sort and section yourself into different things on just a crazy scale that we've never seen before.

 

Colin Corby (10:31)

The other interesting thing is that in the real world, if you want to be a sportsperson, a musician, an academic, whatever it is, it's hard work. It's a lot of hard work. Everything to do with technology is about giving the appearance of everything being easy. So you see the influencers, and you see this perfect world that they have, and you think, I could be that. All I need to do is tweet my videos or curate myself a little bit more or find a niche or just be lucky.

 

But the sad thing is a lot of influencers suffer enormous mental health issues, because once they're on the treadmill, they then have to carry on finding content. I recently had a holiday in Italy and we went by the lakes, and there were so many young people who were there purely to take the Instagram shots and the TikTok clips.

 

Gareth King (11:29)

Yeah, look, I've been out for nice dinners with my fiancée, and we're so keen just to get this delicious food and taste it. And we've seen tables next to us, people younger than us, with kind of mini tripods, doing full shoots and things around the food, and I'm thinking, oh my God, this incredibly amazing-tasting stuff is just getting cold and you're not even eating it.

 

And like you said, it's just all about the pictures. And I think for me, being a little bit older than the people who grew up with that world as a natural thing, I don't see the appeal in it. And I would almost feel a little bit awkward giving myself an identity like that, a digital identity.

 

But I guess if you're doing that for a community or followers who are all into that kind of thing, you could see how that performance of a self in the digital space could influence our own core beliefs and sense of self. As you said, with the demand to keep doing that, you may feel that that becomes who you are, not just this very special curated version. And I don't want to say fake, because it's real people doing it, but it's not a real representation of the whole you.

 

How do you think so much emphasis on the influencer side of ourselves in the digital space is affecting people offline? Like, have you seen anything around how it affects those influencers, as you said, when they're on the treadmill trying to come up with content? How is it changing people?

 

Colin Corby (13:06)

So, what I've seen with influencers is the stress, the increasing stress of always having to perform. And quite often an influencer feels so stressed and burnt out that they go offline for a while and then they come back. So, there's a burnout associated with that, in the same way there's burnout associated with working all hours in our jobs and things.

 

But there's a big difference. You see, you and I, because we’re biological animals, our sense of self is a biological sense of self. It involves the body, it involves the mind, it involves all of our senses and emotions. Whereas on the internet, it's a facsimile. So, they're digitised images, digitised things, but it's a very flat, two-dimensional representation in many ways. So, in a sense, there's this mismatch between who we are as biological animals and this sort of two-dimensional internet version of ourselves.

 

Gareth King (14:05)

Yeah, so can you just explain a little bit about what forms that mismatch most commonly takes?

 

Colin Corby (14:12)

Well, I'm fascinated, because I love technology, and I'm fascinated by where it's going and the problems associated with it. I'm not in favour of the dead internet theory, the idea that once something has so much machine content, we stop using it; I don't think that will ever happen. But there's this arms race.

 

One of the biggest problems is that it's very, very difficult to prove that you're human online, because of the way technology has advanced: it's so easy, if you have the right resources, processing resources and training models, to create clones of people. Now, the TEDx talk that I did, a couple of years ago now, was ‘Are we losing our identity to technology?’, because we're outsourcing our knowledge and skills to technology.

 

Now, at the simple level, sat nav: turn left, turn right, we get somewhere. We don't know how we got there and we probably couldn't get there again without the sat nav. With AI, it can now write our emails, it can do our presentations, it can read reports, it can do lots of things. And there are lots of CEOs of US technology companies who are saying, well, okay, you know, the internet's so, so complex, what you need is an AI personal assistant who can be you online.

 

Now that's got lots of privacy problems associated with it at the moment. But if I was talking to you online in, let's say, 10 or 15 years' time, it might be your AI agent acting on your behalf and my agent acting on my behalf. And so that's going to change the picture of who we are again, because technology has moved on.

 

Gareth King (15:58)

That's a great tangent we've headed down. We started off talking about how the internet and the digital world have affected the biological human identity. And as we've discussed, over time people get more and more digital, and their identity demands more time online in that space. And as you've raised now, the sophistication of what we can fake and pretend and outsource is constantly increasing. So it seems like the next evolution we can discuss is what happens to the digital identity, or the digital human identity, now that these large language models and agentic AI and generative tools have come about. Where is this going to go, Colin, do you think?

 

Colin Corby (16:46)

So, there's lots of things that I can't control. So, I'm a firm believer in saying, well, okay, what is it I can control? And how can I communicate that to other people? So I come at it from a sustainability angle: you've got to be online, but you've got to be human as well. And the reason why I've sort of honed in on 'digital detox athlete' is because it gets this across: we all know an athlete has to do all the right things biologically to perform at their best. They have to get enough sleep, eat the right food, but there's also this idea that things take effort and take time.

 

If we want to create a new digital detox habit, some research suggests we're going to have to spend 66 days creating it. But it could be as long as 250 days perhaps, or shorter, depending on complexity. So that's why I've gone for this idea of effort. Everything we do in the real world takes effort, and human relationships and human connection are really, really important. Lots of studies have shown that although we might be the most connected people in the world online, it's not the same thing. It's still at a superficial level. It doesn't replace human connection.

 

So, hence, I'm trying to get people offline to balance out their time online, so that as biological animals they can survive.

 

Gareth King (18:09)

You said something there about almost relying too much on the online space. And we've seen with large language models what happens when people get so reliant on a false reality, where there's not even a person there, that they can't handle it not being there. Do you know what I mean? Like, how does somebody detox from that?

 

Colin Corby (18:29)

There was a terrible thing reported a while ago: one of the US tech companies had this idea of creating a best friend for children, an AI best friend for children. I think it was shot down very quickly.

 

Yeah, but it takes us away from our humanity. We're very, very complex animals. We need all of the things around us. If you take some of them away, then for some people, they might come unhinged. It might be really important. Whereas for other people, they'll be fine with it. We survive by being different from each other. So, we need all of those differences and we need that complexity, because at the end of the day, evolution has given us the task, if we want to survive, we have to survive in this real world that's changing all the time.

 

We can't possibly know everything about it. We just have to know enough about the world in order to be able to survive. The past for us isn't about remembering what happened in the past. Luckily for me, all the bad things I've done in the past, all the people I've upset, all the hateful things I've said are not online.

 

But the past is about helping us in the future, helping us survive in the future. Whereas online, it's just a straight record of what happened, as if you're the same person you were 20 years ago, 30 years ago. And it's very easy to be a prisoner to that.

 

Gareth King (20:00)

No, for sure. And you said something there about there not being a record of a lot of the stupid things that you've done, and I'm the same, you know. I've had so many conversations with people my age about the exact same thing, and everybody seems to come to the same conclusion, which is: I'm so glad smartphones and everything weren't around when we were that age. And it's not like, you know, committing heinous crimes and things like that, it's just that we've seen that there's so much risk.

 

And you know, when everything's being documented at every given moment, it could have substantial effects on your life that you, of course, never intended. What would you say is the psychological toll of knowing that? Like, if you're a young person now, you don't really know a life before the internet. It's just a thing, just part of life. What kind of psychological toll does it put on somebody knowing that there is that unerasable record of your past that, as you said, you can never escape from? It's always part of you. You can't kind of reinvent yourself somewhere else and do something else.

 

Colin Corby (21:09)

Well, I mean, there have been lots of cases where something a child, as defined in Australia or the UK, would have said has affected their job prospects. With children, the brain is growing and it grows in certain stages, but some of the cognitive skills of self-regulation and control are among the last things that children and young adults get. And that goes on into their 20s.

 

So, imagine you're in a family, and every bad thing you did at home, someone was spilling the beans to all your friends. Every word you said. I mean, that amount of pressure is incredible, isn't it? Children have got to be children, because we learn by mistakes, and the culture that I grew up in is certainly not acceptable today.

 

Gareth King (21:49)

No, for sure. And I think we could even say that about this point right now as we're speaking. But you said something a couple of minutes ago as well about potentially losing bits of ourselves, whether it's psychologically or emotionally, the more we're interacting with either a large language model or a stranger online.

 

We know that if you're encountering a stranger offline, you still recognise them as a person, you know. Whereas, as you said a couple of minutes ago, online they're just this kind of flat 2D representation of who knows what. I don't know, someone might have a picture of a tree instead of themselves. So you actually are just talking to a screen or interacting with a screen, which I can imagine doesn't require as much empathy or any of those human emotions that form part of your real-person identity.

 

Over the time that you've been, I guess, looking into and exploring this space, have you seen us becoming less empathetic as a society, now that we're so used to interacting with pixels rather than strangers?

 

Colin Corby (23:06)

Those skills around emotions, they're slower skills. So, if you read a book or if you talk to a person, you've got enough time for those feelings. We've got mirror neurons, so when I'm talking to you, to try and understand what it is you're saying in the fullest sense, I take in all the information: the way that you're saying something, the intonation, your facial expression.

 

If it's face-to-face, I get lots of other signals, the way you're standing, those sorts of things. In order for me to understand it, I have to almost mirror some of those things to understand what you're really feeling. Are you upset? Are you happy? Is there something you're not saying? So then you come online and it's very, very superficial. We make snap judgments. Whether we're talking to a bot, an AI bot, or a real person, we're projecting the first level of humanity onto them, but we're not going into the emotion.

 

That's why people can immediately react when someone says something about some bad thing that's happened. And we're outraged. Instantly, we're outraged, but that thing might have happened 20 years ago. We've got no sense of that. Our critical thinking hasn't engaged.

 

Gareth King (24:29)

Yeah, I mean, that is interesting, because I think, as we touched on a little while ago around that kind of social currency, there's this signalling that I'm outraged by this thing, whatever it is, whoever you are. You might've seen something and you could easily just move past it in three seconds, you know, but signalling that you're either outraged or, like, so in love with this thing, it almost goes back to formulating this identity as the angry person, or the super pleased person, which is again curating online something that's not even your real identity.

 

Colin Corby (25:08)

Yeah, or proving that we believe in this thing as opposed to that thing. Yeah. And having to do it.

 

Gareth King (25:12)

Yeah, look, it's so interesting to think about. As you said, those algorithms are designed to keep you embracing, let's say, the outrage or the anger part of your identity. And then maybe, if you spend enough time looking at that, you will become that person offline.

 

But you've mentioned now around AI, and I'd love for us to kind of dig around with that a little bit. You're very aware of it. And I guess you've spent quite a bit of time thinking about where it goes from here. What personal and social implications for the human identity, whether it's offline or online, do you imagine that we need to be aware of due to where AI is now and then also where it can potentially go?

 

Colin Corby (25:57)

So I've already started talking about an AI detox. And it's different from a digital detox. A digital detox is where you go offline. An AI detox is something you can do online. If I go on ChatGPT and I say, look, I've created this post, can you suggest some improvements? ChatGPT is going to say, brilliant, you've done exceptionally well on this, here are a few minor things. And I look at it and I think, that's interesting, but that sounds really American.

 

So AI is creating the new average. I experiment with it and I think, oh no, I can't do that as a British English speaker, that's not something I would do. And it basically creates the same sort of thing, but it's always that 'you've done great' type of thing.

 

I think that we have to be very, very careful with AI if we outsource too much to it. And let's face it, big companies are basically using AI as the next step of digital transformation, and digital transformations are ultimately about removing people, I mean, you know, making it less expensive. But if we personally rely too much on AI, then what are our skills as humans? So, if you think about the future of work and the future of humans, you know, it's a bit sad.

 

All technology makes life easier for us. I mean, who would do the washing in the local stream anymore? You know, washing machines are fantastic. So we have to learn how to be biological animals. We have to go down the gym because we don't move about enough, or we have to go running, or we have to go walking. We have to learn about cooking from real vegetables and food, because all of a sudden processed food seems to be a problem now. So for us to survive in the future, we have to focus more on being human and being humans together, because that's going to be what's left for us.

 

Gareth King (27:53)

Yeah. You mentioned something there that I'd love for us to look at now, around the role of AI in the workplace, and whether they're admitting to it or denying it or whether something else is going on. We know that everybody right now is seeing it and feeling it and hearing about it as this kind of replacement for lower-level tasks and people.

 

Now, of course, as we know, this kind of technology increases in sophistication exponentially. So it's only a matter of time until it's going up the ranks, and then who knows where it goes.

 

I guess the point I wanted to look at through this is that we know that for so many people, a huge part of their identity is their job. And so if their employment, their purpose, or even just their job or their title is removed thanks to technology, that feels like an even larger potential problem that technology is indirectly causing for somebody. Except the problem now is not just losing parts of yourself to the online world. You've lost a huge part of your offline self, and how does that recover if the avenues to replace it aren't there, thanks to AI?

 

Colin Corby (29:07)

It's interesting. Now, what we've got to say about AI is that within certain fields where it's heavily bounded, like the medical field, image recognition, those sorts of things, it's absolutely fantastic. It goes through proper studies, field trials. The benefits are enormous, but not all AI is the same.

 

A lot of people's identity is their job. Who are you? Your title is your badge in life. Particularly in the US, you are that person. Unfortunately, in the US, you can be sacked at a moment's notice. That's not true in Europe and the UK, and certainly not in Australia, but it's very much that badge of who we are. So, in a sense, there's a lot of hard work for people to actually be bigger than their job.

 

Now I've worked for a lot of technology companies and enjoyed working for a lot of technology companies, but my first company was British Telecom International. I spent 19 years there and I was institutionalised. It took me two years to actually reconnect with, you know, who I was to be able to make a move. And that's going to affect a lot of people because if you lose all those skills, then what have you got left?

 

And so an AI detox is about deciding what skills you need personally, as a professional, as a person, and then practising them. We might work hard on something, problem solving, and we don't have the answer. And then all of a sudden, the next day after a sleep, or when we go for a run, the answer pops into our mind. That's because our unconscious has worked on all the assumptions and come up with it.

 

But what if we don't have all of that knowledge and those skills for it to work on? Our ability to be as good at problem solving and critical thinking is diminished if we don't practise those things. But all of this is hard work. All of this is hard work and, you know, it won't just happen. It's not going to happen tomorrow. And we've got a tendency to take the easy option.

 

Gareth King (31:12)

That's such an interesting paradox that you've just highlighted there, around how doing stuff, let's say, the human way is hard work. And when you've got all these tools in front of you giving you the super easy option, obviously it's super, super tempting. But as you said a little while ago too, people are handing over more and more of themselves to technology and losing those parts of their identity. So what does happen when it's gone? When everyone's outsourced everything to technology, what do we have left? Like, how far can we go?

 

Colin Corby (31:45)

I'm a fan of science fiction because I think science fiction works on sort of really weird ideas and sees how they go. One version of humanity is that we're not really very good at looking after ourselves. I mean, there are, you know, there's inequalities, there's wars, there's all of these other problems. One outcome is that at some point in the future, and we don't know when, the AI sort of farms us. And we might be perfectly happy with whatever it does for us. It's a simpler life but it's not a human life.

 

Gareth King (32:20)

No, and I think that takes me right back to how certain biological beings have got that gift of sentience, being able to ponder what our station is, not only in life, but kind of in life itself. And then I think that, yeah, you have to look at the positive.

 

Colin Corby (32:37)

There's hope, because the current versions of AI, they calculate from the past. They don't really know the relationship of all of those things in the past, and certainly the past can't predict the future. They have no concept of self and therefore they have no real concept of the reality that they're in. That's not to say they won't at some point in the future, okay, but they don't now.

 

Gareth King (33:02)

I was just thinking, how far off do you reckon we are until they get sophisticated enough to be aware of their own shortcomings and kind of address that?

 

Colin Corby (33:14)

At the moment, the nuclear-powered data centres that are planned in order to get enough processing power are on a trajectory where climate change is going to be a real problem, more of a problem than it currently is. But every six months, approximately, there's something new happening, and you can't predict it. There's this thing called general intelligence; you can't predict when it might happen.

 

But I'm of the view that actually real intelligence is biological, and it's about testing in this world. And you mentioned it, the great thing about what humans can do: we can make a decision based on no information at all in this world. We'll go with something because we have to, that's the way we're designed. We can think about things that don't exist. We can think about future things. Now, it's all influenced by what we already know, but there are some marvellous thinkers who think that step beyond. And so as humans, the hope is that because we're in the real world and because that other world is a facsimile, then it's missing out. I know a lot of the AI companies are now thinking about getting AI down into the robotic level so it can experience the real world, because they've run out of internet data, human data, to work on. That data has been polluted by robotic data.

 

Gareth King (34:39)

Yes, of course.

 

Colin Corby (34:40)

So with future generations of large language models, as soon as you train yourself on yourself, or previous versions of yourself, it all ends up in a mess somewhere. So there's a shift towards robotics in order to try and experience this world. So there's a little bit of hope and time for us.

 

Gareth King (34:57)

We'll definitely keep that hopeful mindset, I think. And you kind of alluded to something there, which was what sets us apart is that ability to take previous data sets and information, recognise patterns, put it all together essentially, and imagine something that may not exist. Now, as you've also said, and we've seen around these kind of AI tools, LLMs, whatever they are, they're just aggregating what exists.

 

Potentially one day it gets to the point where it can imagine in, I guess, a biological way. I don't know. That will be a bit scary, because then where does that leave humans? But I guess the hopeful part of it as well is that we do eventually get some sort of guardrails that rein this in a little bit and corral it into certain avenues, as you said, around medical things and so on, that could have much better positive outcomes for humanity.

 

Colin Corby (35:48)

Yeah, I'm going to have to be an optimist because the alternative is not as good.

 

Gareth King (35:54)

No, it's definitely, definitely not as good. And if we could become, you know, literal cyborgs, with tools in our minds and in our bodies and things, what will a human identity look like in the future if that's the case?

 

Colin Corby (36:11)

So, let's look at the positives first. There are a lot of people who, through no fault of their own, have bodies that don't function in the way that they would like them to. And the ability to have an implant, and this is the way it's being proposed, gives them the ability to, let's say, walk again, to do lots of wonderful things. And those are really, really great.

 

This idea, though... that's relatively simple, because it uses the motor part of the brain. We have no idea what consciousness is or, to be fair, how the brain works properly. Now, we've got more ideas than we had 10 years ago and 20 years ago, and in 10 years' time we'll have more ideas than now. But it's incredibly complex. A human brain takes about a millionth of the power of a supercomputer with roughly the same processing power.

 

So a biological brain, and all animals' brains, are doing something slightly different, but they're all potentially conscious beings, even down to the tiniest animals. There isn't any way that that will happen in a sort of cyborg sense anytime soon. I always relate it to the fact that I'm still waiting for self-driving cars, and they've been about for what, 20 years? They're trying to do something really complicated.

 

Gareth King (37:28)

Yeah, I tell you what, it's funny. I remember being a kid, and I don't know if they had it in the UK or not, but we had this show here called Beyond 2000. And obviously it was before 2000 came, right? And it was hypothesising what would happen after the year 2000. I remember the intro, because I look it up on YouTube every now and again, and it has, like, a robot, I think it's pushing a kid on a swing, you know? So it's like the parents are now giving their playtime to a robot. But then there was, yeah, the flying car, and some of the technology is long surpassed by now. But yeah, I think flying cars have been kind of imagined forever, haven't they?

 

Colin Corby (38:09)

Flying taxis are just starting to get licenses now. MIT in the US has got robots to run, which is quite complex. And there's that dog.

 

Gareth King (38:12)

Are we talking, like, you know, the Boston Dynamics stuff?

 

Colin Corby (38:27)

Yeah, the Boston Dynamics dog.

 

Gareth King (38:29)

That's very scary.

 

Colin Corby (38:31)

As time goes on, the robotic things are happening. But the great thing about the future is that predicting time scales, as you say, from flying cars and all that sort of thing is incredibly difficult. But in the internet, because it's an artificial construct and a relatively simple one, then things can appear to move much faster because it's almost a controlled environment.

 

Whereas in the real world, who knows what's really happening? We haven't even got to the bottom of quantum theory yet and how it shapes the physical world that we know. I grew up learning about electricity. Now I have to comprehend the fact that electrons probably exist within the outer shells of atoms, but we're using it. So, we don't have to know everything and there's tons of stuff that we don't know.

 

Gareth King (39:20)

Yeah, look, that's a little bit above my mental capacity to understand, but it sounds quite serious. But yeah, look, there's a lot to think about there on how our identities started, how they've been changed and affected through technology, and where they could potentially go. And I think I'm going to be in the same camp as you. I want to be hopeful about this, because as you said, the alternative is too bad to think about.

 

Just to finish up then Colin, what's coming up for you and where can people follow what you're up to?

 

Colin Corby (39:54)

So, as a digital detox coach, I'm going to say the internet, because we all have to be online to live in this world. The website is www.thedigitaldetoxcoach.com. If people would like, have a look at the TEDx talk, because the introduction of the TEDx talk is about “imagine a possible future”. It was done a couple of years ago, but I was already familiar with AI before it sort of hit. And so you imagine the future, and where we end up is that actually it's about human sanctuaries at the end. So have a look at the TEDx talk, but all the other information is on the website.

 

Gareth King (40:31)

Awesome. Colin, you've given us quite a lot to think about. Thank you so much for joining us.

 

Colin Corby (40:37)

Thank you for having me.