Sept. 20, 2025

Has the internet ruined Journalism & News? – John DeDakis

The internet promised unlimited reach, direct engagement, and a more informed world. Instead, it devalued expertise, creating a ‘clicks and outrage’ economy where speed reigns over accuracy.

To help us get the full story, we’re joined by John DeDakis, an award-winning novelist, writing coach, public speaker, and former CNN Senior Copy Editor.

https://johndedakis.com/

https://www.amazon.com/stores/author/B002BM6WM0

https://www.instagram.com/dedakisjohn/

https://www.facebook.com/john.dedakis

https://bsky.app/profile/johndedakis.bsky.social


00:00 The Journey of a Journalist

02:16 The Impact of the Internet on Journalism

03:52 Skepticism and Trust in Digital News

06:25 Challenges for Editors in the Digital Age

08:48 The Shift Towards Clickbait and Investigative Journalism

11:35 The Rise of Citizen Journalism

13:55 The Role of AI in Journalism

16:37 The Future of Content Creation

19:14 The Ethics of AI and Journalism

21:19 Substack and the Future of Individual Journalism

24:08 The Evolution of Journalism in the Digital Age

30:15 The Importance of Media Literacy and Education

37:08 The Role of Public Broadcasting in Journalism

42:50 Hope for the Future of Journalism

45:18 Curiosity as a Key Skill for Writers


Let us know what else you’d like us to look into, or just say hello at https://www.ruinedbytheinternet.com/


journalism, internet, media literacy, investigative journalism, AI, misinformation, citizen journalism, public broadcasting, trust in news, writing

1
00:00:00,160 --> 00:00:01,640
Welcome to Ruined by the
Internet.

2
00:00:01,680 --> 00:00:03,960
I'm Gareth King.
Today we're asking, has the

3
00:00:03,960 --> 00:00:07,280
Internet ruined journalism?
It promised unlimited reach,

4
00:00:07,280 --> 00:00:10,640
direct engagement and a more
informed world, but instead

5
00:00:10,640 --> 00:00:13,960
devalued expertise, creating a
clicks and outrage economy where

6
00:00:13,960 --> 00:00:17,440
speed reigns over accuracy?
To help us get the full story,

7
00:00:17,440 --> 00:00:21,120
we're joined by John DeDakis, an
award winning novelist, writing

8
00:00:21,120 --> 00:00:23,800
coach and former CNN senior copy
editor.

9
00:00:28,640 --> 00:00:30,760
John, thank you so much for
joining us and welcome to the

10
00:00:30,760 --> 00:00:32,720
show.
Thanks Gareth, it's good to be

11
00:00:32,720 --> 00:00:34,520
here.
Before we get into it, can you

12
00:00:34,520 --> 00:00:37,720
tell us a bit about what you do
and the journey that led you to

13
00:00:37,720 --> 00:00:40,480
this point?
What I do now is I'm a writing

14
00:00:40,480 --> 00:00:45,120
coach, a manuscript editor, a
novelist, a public speaker, and

15
00:00:45,120 --> 00:00:49,240
I've been doing that since I
retired from CNN in 2013,

16
00:00:49,440 --> 00:00:52,680
although I was doing some of
those things while I was still

17
00:00:52,680 --> 00:00:57,760
at CNN. I was a journalist for 45
years, covered the White House

18
00:00:57,760 --> 00:01:02,800
when Reagan was president, went
to CNN in 1988 in Atlanta and

19
00:01:02,800 --> 00:01:07,600
was with the network for 25
years, the last 7 as an editor

20
00:01:07,600 --> 00:01:10,080
for Wolf Blitzer on the
Situation Room.

21
00:01:10,240 --> 00:01:12,240
Yeah, wow.
So it's quite a quite an

22
00:01:12,240 --> 00:01:15,040
extensive repertoire and
background that you've got

23
00:01:15,040 --> 00:01:16,680
there.
So obviously you've been in the

24
00:01:16,680 --> 00:01:20,280
world of journalism and and
writing long enough to have seen

25
00:01:20,280 --> 00:01:23,520
the arrival of the Internet and
also its effects play out.

26
00:01:23,720 --> 00:01:27,280
From your perspective, what's
been the most fundamental change

27
00:01:27,280 --> 00:01:30,600
across the industry?
I think that the I mean the most

28
00:01:30,600 --> 00:01:34,080
fundamental change, at least for
me, because I existed before the

29
00:01:34,080 --> 00:01:38,440
Internet did the the biggest
change is the connectivity of

30
00:01:38,440 --> 00:01:41,680
it.
I mean, you and I are half a

31
00:01:41,680 --> 00:01:45,080
world away and we are able to
talk in real time.

32
00:01:45,080 --> 00:01:49,000
Being able to see each other.
I mean, that was unheard of when

33
00:01:49,000 --> 00:01:52,080
I was growing up.
And so I think that's the

34
00:01:52,080 --> 00:01:57,200
biggest thing, the reach that we
have to be able to connect with

35
00:01:57,200 --> 00:02:00,720
people around the world.
I mean, that's just spectacular,

36
00:02:00,720 --> 00:02:02,360
I think.
Yeah, absolutely.

37
00:02:02,360 --> 00:02:06,280
And I think just on that point
of reach, you know, as you said,

38
00:02:06,280 --> 00:02:08,960
it would have been unheard of,
but but looking back to that

39
00:02:08,960 --> 00:02:11,920
time, what did the initial
impact of the Internet and

40
00:02:11,920 --> 00:02:15,320
technology look like on the
industry and and what was the

41
00:02:15,320 --> 00:02:18,520
reaction at the time?
Well, in in journalism there was

42
00:02:18,520 --> 00:02:22,520
certainly a fair degree of
scepticism about the Internet.

43
00:02:22,760 --> 00:02:27,320
I mean, we used it, but we had
to be very careful about knowing

44
00:02:27,320 --> 00:02:32,080
if the information was reliable.
And so for the longest time, you

45
00:02:32,080 --> 00:02:36,080
know, the, the, the Internet was
available as sort of a tip

46
00:02:36,080 --> 00:02:38,600
service.
We would get heads up about

47
00:02:38,600 --> 00:02:42,080
things, but we would still check
it out the old way where you

48
00:02:42,280 --> 00:02:45,480
pick up the phone and call
somebody you know in authority

49
00:02:45,480 --> 00:02:49,840
at a reputable organisation, a
government agency or one of your

50
00:02:49,840 --> 00:02:54,680
contacts or something like that.
So you know, it took a while for

51
00:02:54,680 --> 00:03:00,320
the Internet to become more
insinuated into daily life.

52
00:03:00,360 --> 00:03:04,240
Yeah, interesting to hear that.
It almost generated those leads

53
00:03:04,240 --> 00:03:07,400
easier for people, which they
then still had to investigate.

54
00:03:07,400 --> 00:03:10,200
And I think one of the things
we'll probably get into as well

55
00:03:10,200 --> 00:03:12,720
as we go through this
conversation is the speed that

56
00:03:12,720 --> 00:03:15,800
everything operates at these
days and how that plays out.

57
00:03:16,000 --> 00:03:19,920
Would you say that it was the
Internet as a as a tool for

58
00:03:19,920 --> 00:03:23,360
journalism was taken seriously
right off the bat, like right

59
00:03:23,360 --> 00:03:26,320
away, or do you think the
industry was a little bit slow

60
00:03:26,320 --> 00:03:29,360
to react?
It's been a while now, but I, I

61
00:03:29,360 --> 00:03:33,440
think that there was a fair
amount of scepticism and, and I

62
00:03:33,440 --> 00:03:36,320
think you know what, I think at
reputable news organisations,

63
00:03:36,360 --> 00:03:39,680
there's still a lot of
scepticism for good reason,

64
00:03:40,040 --> 00:03:46,120
because there is just so much BS
out there and there's so much,

65
00:03:46,120 --> 00:03:49,680
there are so many lies and
conspiracy theories.

66
00:03:49,680 --> 00:03:52,120
It's just gotten like it's the
Wild West.

67
00:03:52,520 --> 00:03:55,960
And so I think, I think
reputable news organisations are

68
00:03:55,960 --> 00:04:01,480
still cautious about it and yet
it's used quite a bit as well.

69
00:04:01,560 --> 00:04:03,920
Yeah, yeah.
Especially, I mean your point

70
00:04:03,920 --> 00:04:06,440
there around reputable news
organisations.

71
00:04:06,440 --> 00:04:10,200
And I think as we know, as it
expands and and the rise of say

72
00:04:10,200 --> 00:04:13,360
citizen journalism as well, the
lines are kind of getting

73
00:04:13,360 --> 00:04:15,320
blurred.
And as I guess we've seen play

74
00:04:15,320 --> 00:04:19,760
out over years now, that trust
even in these reputable

75
00:04:20,040 --> 00:04:22,840
organisations seems to be
breaking down quite a lot too.

76
00:04:22,840 --> 00:04:26,560
So obviously that's another
challenge to to try and address

77
00:04:26,560 --> 00:04:30,120
as everything keeps going.
But I guess on that point, what

78
00:04:30,120 --> 00:04:34,480
would you say beyond maintaining
that reputation and I guess that

79
00:04:34,480 --> 00:04:37,680
trust from the audience, what
would be the biggest challenge

80
00:04:37,680 --> 00:04:42,440
for an editor trying to manage
an outlet or publication in the

81
00:04:42,440 --> 00:04:44,680
digital age?
And how would that compare to

82
00:04:44,680 --> 00:04:48,120
previously when it was kind of
just implicitly trustworthy?

83
00:04:48,360 --> 00:04:52,120
I'm not sure I can add much to
that just because journalists

84
00:04:52,160 --> 00:04:58,080
are sceptical.
And so the the problem is the

85
00:04:58,360 --> 00:05:03,280
sources that are making things
up and it's just a matter of

86
00:05:03,280 --> 00:05:06,840
trial and error.
I mean, the, the sources that

87
00:05:06,840 --> 00:05:10,760
journalists tend to rely on are
sources they trust.

88
00:05:11,560 --> 00:05:15,400
And so a lot of people are
competing for attention.

89
00:05:15,400 --> 00:05:20,960
The, the problem now is that, I
mean, you even, look, you even

90
00:05:20,960 --> 00:05:25,440
have the President of the United
States, Donald Trump, who I'll

91
00:05:25,440 --> 00:05:30,320
be honest, he, he lies
constantly, reflexively, you

92
00:05:30,320 --> 00:05:34,000
know, and, and so if he says it,
you have to be sceptical.

93
00:05:34,640 --> 00:05:38,240
But of course, if you ask him
questions, then you're, you're

94
00:05:38,240 --> 00:05:42,000
considered treasonous.
You know, it's, it's gotten to

95
00:05:42,000 --> 00:05:46,800
the point where, you know, Trump
himself has done a tremendous

96
00:05:46,800 --> 00:05:51,920
disservice to the to the
Internet because of the

97
00:05:51,920 --> 00:05:57,200
falsehoods that he spews and
then claims that the mainstream

98
00:05:57,200 --> 00:06:02,080
media is fake news.
And what he's done, it's not.

99
00:06:02,080 --> 00:06:04,120
And he's not just undermining
journalism.

100
00:06:04,360 --> 00:06:06,680
He's undermining the
intelligence community, the

101
00:06:06,680 --> 00:06:09,280
judiciary, the scientific
community.

102
00:06:09,520 --> 00:06:12,280
And that's what dictators do.
They undermine trust in

103
00:06:12,280 --> 00:06:14,320
everything so that you believe
them.

104
00:06:14,320 --> 00:06:17,640
And I mean, he said when he
first ran for president, only I

105
00:06:17,640 --> 00:06:22,120
can solve it.
Yeah, look, that's, that's an

106
00:06:22,120 --> 00:06:26,360
entire, I guess world that we
could probably spend hours going

107
00:06:26,360 --> 00:06:28,480
down.
And what it appears to me, you

108
00:06:28,480 --> 00:06:31,120
know, I'm in Australia, I'm not
on the ground in in the US

109
00:06:31,120 --> 00:06:32,960
seeing what it's like over
there.

110
00:06:33,320 --> 00:06:37,000
But what I can kind of see
around all of that is the way

111
00:06:37,000 --> 00:06:41,800
that him or his team or whoever
it is is kind of using the

112
00:06:41,800 --> 00:06:46,520
Internet in, like, an Internet
cultural sense rather than an

113
00:06:46,520 --> 00:06:49,360
official sense.
And I think that not just him, I

114
00:06:49,360 --> 00:06:54,040
think all officials do it to to
varying degrees, where they've

115
00:06:54,040 --> 00:06:57,240
got a team that manages their
social media or something.

116
00:06:57,600 --> 00:07:00,240
And the team is not staffed by
people like them.

117
00:07:00,240 --> 00:07:04,240
It's staffed by young people
totally plugged into, you know,

118
00:07:04,240 --> 00:07:06,920
the way to communicate to people
online.

119
00:07:06,920 --> 00:07:11,400
So, so that communication is
formatted as Internet

120
00:07:11,400 --> 00:07:14,560
information, not kind of
official information.

121
00:07:14,560 --> 00:07:16,560
And then I think, you know,
wires get crossed somewhere

122
00:07:16,560 --> 00:07:19,120
along the line.
That's obviously one of the

123
00:07:19,800 --> 00:07:24,360
disservices that the
democratisation of information

124
00:07:24,360 --> 00:07:27,160
provides via the Internet.
One of the things that's

125
00:07:27,160 --> 00:07:30,360
happening right now in the White
House press corps is that they

126
00:07:30,360 --> 00:07:34,440
have made room for what they
call new media.

127
00:07:34,920 --> 00:07:39,760
And, and it's a rotating thing.
And they get the first question

128
00:07:39,760 --> 00:07:43,200
in the briefing.
They sit to Karoline Leavitt's

129
00:07:43,200 --> 00:07:48,800
immediate right along that that
Rose Garden wall and the New

130
00:07:48,800 --> 00:07:52,960
York Times just did a piece on
one of the, you know, the new

131
00:07:52,960 --> 00:07:55,920
media people, and the guy makes
stuff up.

132
00:07:55,920 --> 00:08:00,080
I mean, he he's he's he's
already got a reputation for not

133
00:08:00,080 --> 00:08:03,960
being reputable, but that's OK
for the White House because, you

134
00:08:03,960 --> 00:08:10,040
know, he says what they want him
to say and he slavishly asks

135
00:08:10,120 --> 00:08:15,560
softball questions.
It's, it's, it's so it's

136
00:08:15,560 --> 00:08:18,640
complicated because you're right
about, you know, young people,

137
00:08:18,840 --> 00:08:23,480
you know, using the technology
to get the, the message out as

138
00:08:23,480 --> 00:08:26,240
effectively as possible.
They're not necessarily dealing

139
00:08:26,240 --> 00:08:29,040
with the content.
They're dealing with the, the

140
00:08:29,040 --> 00:08:33,559
way to make sure that the reach
is as far as it needs to be.

141
00:08:33,799 --> 00:08:37,080
But you know, you've got people
in the White House, in the, in

142
00:08:37,080 --> 00:08:40,720
the in the briefing room who
aren't journalists, but they've

143
00:08:40,720 --> 00:08:44,360
got hundreds of thousands of
followers, but there's no editor

144
00:08:44,360 --> 00:08:46,560
on their shoulder going,
Where did you get that?

145
00:08:46,720 --> 00:08:49,400
How do you know that's true?
Yeah.

146
00:08:49,400 --> 00:08:52,600
I mean that's that's an
interesting point as well around

147
00:08:52,600 --> 00:08:55,680
kind of I guess the speed that
everything needs to operate.

148
00:08:55,960 --> 00:09:00,320
We know that headlines drive
clicks, which drive revenue and

149
00:09:01,280 --> 00:09:04,000
you know, the byproduct of that,
of course, is the need to create

150
00:09:04,000 --> 00:09:06,960
that kind of viral content that
you know, is going to get a lot

151
00:09:06,960 --> 00:09:10,080
of eyes, a lot of clicks and get
that revenue to keep the the

152
00:09:10,080 --> 00:09:13,640
network or your own small
publication, whatever is going.

153
00:09:13,640 --> 00:09:17,840
But we're seeing somewhat of the
model head down that path, you

154
00:09:17,840 --> 00:09:20,200
know, where everything is just,
you know, we've seen the rise of

155
00:09:20,200 --> 00:09:24,160
opinion pieces as as like, you
know, quite polarising by their

156
00:09:24,160 --> 00:09:26,760
nature for that, I guess for
that purpose.

157
00:09:27,120 --> 00:09:31,160
Does that kind of cheap form of
content, Does that mean that

158
00:09:31,280 --> 00:09:34,480
expensive investigative
journalism and writing just

159
00:09:34,480 --> 00:09:37,080
becomes too expensive to produce
for the most part?

160
00:09:37,680 --> 00:09:41,320
And and does that shift towards
clickbait and headlines, you

161
00:09:41,320 --> 00:09:43,080
know, a lot of people don't read
beyond the headline.

162
00:09:43,880 --> 00:09:46,680
Is driving things through
headlines rather than content

163
00:09:46,680 --> 00:09:49,440
simply a necessity of a business
model that might be struggling?

164
00:09:49,720 --> 00:09:52,000
Yes, that's that's a real
danger.

165
00:09:52,000 --> 00:09:56,160
And what's happening is, in
fact, the Internet is probably

166
00:09:56,360 --> 00:09:59,400
responsible for a lot of
newspapers going out of business

167
00:10:00,040 --> 00:10:03,680
because they had to monetize
what they were doing.

168
00:10:03,680 --> 00:10:07,880
But people wanted free news.
And, and so they were, you know,

169
00:10:07,880 --> 00:10:11,680
a lot of a lot of newspapers
were late to put up a paywall.

170
00:10:11,680 --> 00:10:16,200
And so they're, they were kind
of caught flat footed responding

171
00:10:16,200 --> 00:10:19,120
to the Internet.
And so you're absolutely right,

172
00:10:19,200 --> 00:10:23,360
it really puts a chill in
investigative reporting because

173
00:10:23,800 --> 00:10:27,960
in order to do effective
investigative reporting, you

174
00:10:27,960 --> 00:10:31,240
have to be backed up by a news
organisation that has deep

175
00:10:31,240 --> 00:10:34,800
pockets and skilled lawyers that
can protect you.

176
00:10:35,160 --> 00:10:38,480
You know, if you are branching
out on your own and you're doing

177
00:10:38,480 --> 00:10:43,360
investigative reporting, you are
very likely going to get sued.

178
00:10:43,640 --> 00:10:46,440
And even if you are in the
right, even if truth is an

179
00:10:46,440 --> 00:10:50,400
absolute defence, someone with
deep pockets and a lot of

180
00:10:50,400 --> 00:10:54,040
patience can run out the clock
and run out your bank account.

181
00:10:54,280 --> 00:10:57,400
Even though, I mean, look at
we're we're seeing Trump shake

182
00:10:57,400 --> 00:11:01,240
down, you know, the Wall Street
Journal, CBS News, The New York

183
00:11:01,240 --> 00:11:05,080
Times, you know, he's suing, you
know, some of these people for

184
00:11:05,080 --> 00:11:08,480
$20 billion and they're
settling.

185
00:11:08,680 --> 00:11:12,000
And these are big news
organisations, you know, they're

186
00:11:12,000 --> 00:11:15,160
not settling for 20 billion,
they're settling for $16 million.

187
00:11:15,400 --> 00:11:17,840
Just to kind of get it off their
plate now, of course.

188
00:11:17,840 --> 00:11:21,120
And look, it's, it's, I guess
just for context, it's very

189
00:11:21,120 --> 00:11:26,200
different over here in terms of
the litigious nature of, of

190
00:11:26,200 --> 00:11:27,960
things.
I think over here our legal

191
00:11:27,960 --> 00:11:32,360
system doesn't allow, well, not
from what I can see anyway, like

192
00:11:32,360 --> 00:11:35,920
that level of, of suing over
kind of everything.

193
00:11:36,360 --> 00:11:40,200
But you said you said something
there around orgs, you know, and

194
00:11:40,200 --> 00:11:43,280
publications being slow to put
up a paywall around their

195
00:11:43,280 --> 00:11:46,000
content.
Why did they originally just

196
00:11:46,000 --> 00:11:48,600
decide to start giving away the
content for free?

197
00:11:48,640 --> 00:11:51,640
What was the strategy behind
that and how did they imagine

198
00:11:51,640 --> 00:11:53,720
they could capitalise on that
economically?

199
00:11:54,000 --> 00:11:57,320
It's hard for me to, to know for
sure, but my hunch is they still

200
00:11:57,320 --> 00:12:01,960
had the, the physical newspaper
and they, I don't think realised

201
00:12:01,960 --> 00:12:05,520
that, you know, people were
moving to the Internet.

202
00:12:05,520 --> 00:12:09,360
And I think that they still
trusted in the history that

203
00:12:09,360 --> 00:12:13,120
people will still gravitate
towards the actual paper.

204
00:12:13,120 --> 00:12:17,000
And I think they just were slow
to realise the tectonic shift

205
00:12:17,240 --> 00:12:19,560
that took place in news
consumption.

206
00:12:19,800 --> 00:12:21,240
And that, that makes total
sense.

207
00:12:21,240 --> 00:12:24,720
And you know, on that point, how
the how do you think the rise of

208
00:12:24,720 --> 00:12:27,720
that citizen journalism that we
we mentioned a couple of minutes

209
00:12:27,720 --> 00:12:30,840
ago, which is obviously fueled
by the Internet and the ease it

210
00:12:30,840 --> 00:12:33,760
is to publish and and share your
own stuff.

211
00:12:34,280 --> 00:12:36,760
How do you think that changed
the way traditional media

212
00:12:36,760 --> 00:12:40,040
outlets do their jobs?
Like now, with the ability for

213
00:12:40,040 --> 00:12:45,400
anyone to curate their own
unique interest news feed, is

214
00:12:45,400 --> 00:12:49,920
the one size fits all, let's say
broadcast model a thing of the

215
00:12:49,920 --> 00:12:52,560
past and like how much future do
you reckon it has?

216
00:12:52,960 --> 00:12:56,680
That's an intriguing question.
I think that the search for

217
00:12:56,680 --> 00:13:00,960
truth, I don't think that the
principles involved with that

218
00:13:00,960 --> 00:13:04,600
have gone away or will go away.
You know, you still need to

219
00:13:04,640 --> 00:13:07,320
verify your sources.
If you're getting something

220
00:13:07,320 --> 00:13:11,240
anonymously, you need to be able
to confirm it with two other

221
00:13:11,240 --> 00:13:14,320
sources at least.
There are still reputable news

222
00:13:14,320 --> 00:13:17,720
organisations, The Associated
Press, Reuters, The New York

223
00:13:17,720 --> 00:13:21,680
Times, the Guardian, BBC, just
to name a few.

224
00:13:21,880 --> 00:13:27,760
So I, I still think that the
fundamentals of journalism still

225
00:13:27,840 --> 00:13:31,520
still exist.
But the problem, of course, is

226
00:13:31,520 --> 00:13:33,640
that even the Russian
government, I mean, the

227
00:13:33,640 --> 00:13:38,120
military, they are, they have,
they have, what are they, troll

228
00:13:38,120 --> 00:13:39,560
farms?
You know, they, they are

229
00:13:39,560 --> 00:13:43,800
planting false information.
You know, that that is something

230
00:13:43,800 --> 00:13:47,720
they're doing on an industrial
strength level, on a state

231
00:13:47,720 --> 00:13:50,920
level.
And, and it's, it's weaponizing

232
00:13:50,920 --> 00:13:53,840
falsehoods and that's hard to
fight against.

233
00:13:53,840 --> 00:13:56,120
It's hard to fight against.
No, absolutely.

234
00:13:56,120 --> 00:13:59,000
And I think that, you know,
that's something that I'm sure

235
00:13:59,000 --> 00:14:01,320
you you'd understand that comes
up here as well.

236
00:14:01,320 --> 00:14:05,240
You know, anytime there's an
election or anything to do with

237
00:14:05,240 --> 00:14:09,520
government going on, it's this
kind of it's whether whether

238
00:14:09,520 --> 00:14:12,520
it's it's real or not.
It's almost the go to now that

239
00:14:12,520 --> 00:14:16,200
they're they're these Russian
troll farms are doing these kind

240
00:14:16,200 --> 00:14:18,560
of interference jobs.
And I'm sure that they're not

241
00:14:18,560 --> 00:14:20,760
the only ones.
Like I'm sure basically every

242
00:14:20,760 --> 00:14:24,440
nation is probably.
China as well, you know, and one

243
00:14:24,440 --> 00:14:27,480
thing you haven't mentioned is
the is artificial intelligence,

244
00:14:27,480 --> 00:14:28,600
I mean.
Yes, yes, the.

245
00:14:28,640 --> 00:14:32,120
Deep fakes that are becoming
much more sophisticated.

246
00:14:32,280 --> 00:14:34,160
We can segue into that now
because that's a.

247
00:14:34,160 --> 00:14:36,120
It's a great, great thing to
talk about.

248
00:14:36,120 --> 00:14:39,720
I remember, you know, when that
must be a few years ago now,

249
00:14:39,720 --> 00:14:43,680
playing with the first kind of
image generators and they were

250
00:14:43,960 --> 00:14:46,280
terrible.
You know, they were laughable at

251
00:14:46,280 --> 00:14:47,600
what you would get back from
them.

252
00:14:48,160 --> 00:14:53,240
And then seeing the first
examples of, of when you could

253
00:14:53,240 --> 00:14:57,760
kind of make an image of
somebody doing something and

254
00:14:57,760 --> 00:15:01,120
then when the first video came
out where you could just take an

255
00:15:01,120 --> 00:15:03,480
existing video and put someone
else's face on it.

256
00:15:04,120 --> 00:15:06,920
Even then, you know, I remember
having conversations with people

257
00:15:06,920 --> 00:15:09,600
around this is going to be
really bad.

258
00:15:09,680 --> 00:15:14,240
Like once this kind of starts
exponentially increasing quality

259
00:15:14,240 --> 00:15:16,280
wise.
You know, it's so funny.

260
00:15:16,280 --> 00:15:19,000
We've, we've spent the last
decade.

261
00:15:19,000 --> 00:15:22,640
You, you mentioned the, the term
fake news just before and

262
00:15:22,960 --> 00:15:26,080
whether anyone likes it or not,
that has just become part of the

263
00:15:26,080 --> 00:15:28,040
vernacular for, for everybody
now.

264
00:15:28,320 --> 00:15:30,880
And, and, and it's kind of just
refers to everything that

265
00:15:30,880 --> 00:15:33,640
somebody doesn't like.
They can kind of call it that.

266
00:15:33,920 --> 00:15:36,600
And, and it's so it's so
interesting to me that we've

267
00:15:36,600 --> 00:15:42,680
spent so much of the last decade
fighting against fake news and,

268
00:15:42,680 --> 00:15:45,160
and things that are
misinformation, disinformation,

269
00:15:45,160 --> 00:15:50,320
etcetera, to now be barrelling
at super speed into a digital

270
00:15:50,320 --> 00:15:55,920
world full of stuff that's fake
and we'll have no way of

271
00:15:55,920 --> 00:15:58,560
telling.
And it's just quite an

272
00:15:58,600 --> 00:16:02,680
interesting irony that all of
that fight against fake stuff

273
00:16:03,160 --> 00:16:06,680
seems to have fallen by the
wayside as we now embrace, you

274
00:16:06,680 --> 00:16:09,240
know, generative AI and other
forms of AI.

275
00:16:10,120 --> 00:16:12,280
How do you think that's going to
happen and play out?

276
00:16:12,520 --> 00:16:15,920
Well, here's the thing.
I mean, technology is morally

277
00:16:15,920 --> 00:16:21,360
neutral and the, the, the
problem is the people and, and

278
00:16:21,360 --> 00:16:25,400
what people do with it.
And, you know, going forward, I

279
00:16:25,400 --> 00:16:28,760
mean, I think that one of the
points you made about AI being

280
00:16:28,760 --> 00:16:32,240
so sophisticated that we won't
be able to tell the real from

281
00:16:32,240 --> 00:16:35,080
the fake.
But I think that there are ways

282
00:16:35,080 --> 00:16:38,640
that they might be able to put
watermarks on things or, you

283
00:16:38,640 --> 00:16:41,720
know, a digital watermark for
authenticity's sake.

284
00:16:41,960 --> 00:16:45,240
I mean, this is way beyond my
understanding or ability to deal

285
00:16:45,240 --> 00:16:47,480
with.
And, you know, in, in this

286
00:16:47,480 --> 00:16:50,400
country, there's, you know, the
1st Amendment freedom of speech.

287
00:16:50,760 --> 00:16:54,000
And so, you know, you, you
really want to be very careful

288
00:16:54,000 --> 00:16:57,800
about fiddling with content
because that was that was always

289
00:16:57,800 --> 00:16:59,920
the case.
Even when we're talking about

290
00:16:59,920 --> 00:17:03,960
the printing press, you could
still lie, you know, and use a

291
00:17:03,960 --> 00:17:06,560
printing press to do it and
reach a lot of people.

292
00:17:06,560 --> 00:17:09,160
So, you know, lying is nothing
new.

293
00:17:09,560 --> 00:17:13,520
It's just that the technology in
the pipeline is more

294
00:17:13,520 --> 00:17:17,800
sophisticated, but the
responsibility is still on us to

295
00:17:17,800 --> 00:17:20,280
be discerning.
There's no surprise that there's

296
00:17:20,280 --> 00:17:24,599
lying out there, so that means
we've just got to be alert to it

297
00:17:24,599 --> 00:17:26,440
and not just swallow everything
whole.

298
00:17:26,960 --> 00:17:29,680
Yeah, no, totally.
And I think that your your point

299
00:17:29,680 --> 00:17:32,320
you raised there about
watermarks or or something like

300
00:17:32,320 --> 00:17:34,280
that.
You know, I have seen certain

301
00:17:34,280 --> 00:17:37,240
pieces of, of content that might
be shared on various platforms

302
00:17:37,240 --> 00:17:40,240
that will have kind of a
watermark identifier built in.

303
00:17:40,240 --> 00:17:43,520
People do want that, you know,
because we've seen the amount of

304
00:17:43,560 --> 00:17:48,160
what they call slop out there in
terms of text and

305
00:17:48,160 --> 00:17:52,000
writing is insane.
And everybody, you know, they

306
00:17:52,000 --> 00:17:55,320
have these debates around how
you can tell formats.

307
00:17:55,320 --> 00:17:57,440
It's got a certain tone,
etcetera.

308
00:17:57,440 --> 00:18:00,360
Like the use of an em dash
is like a big one, but that's

309
00:18:00,360 --> 00:18:02,760
short form content.
Obviously you're you've got a

310
00:18:02,760 --> 00:18:06,480
background in journalism, but
you're now a longer form writer

311
00:18:06,480 --> 00:18:08,600
in the world of novels.
And I make it up.

312
00:18:09,120 --> 00:18:11,120
Yeah, yeah.
But I guess that's coming from

313
00:18:11,120 --> 00:18:14,840
your from your, your own mind.
How do these are these tools,

314
00:18:15,520 --> 00:18:20,160
are they starting to impact that
world at all or are they OK?

315
00:18:20,160 --> 00:18:23,120
In what ways?
I want to say, I want to say one

316
00:18:23,120 --> 00:18:25,640
thing, though, to go back to,
you know, the future.

317
00:18:25,840 --> 00:18:29,600
I think a lot can be said for
the education system.

318
00:18:29,960 --> 00:18:33,560
I think that there really needs
to be not just, you know, in

319
00:18:33,560 --> 00:18:37,680
education where people learn how
to use it, but I think that

320
00:18:37,680 --> 00:18:41,280
there needs to be some classes
in, you know, being more

321
00:18:41,280 --> 00:18:45,480
sophisticated in using it
responsibly and being able to

322
00:18:45,480 --> 00:18:47,320
discern what's real and what's
fake.

323
00:18:47,400 --> 00:18:51,200
So I think education in
elementary school on up, you

324
00:18:51,200 --> 00:18:54,920
know, I think is, is important,
but as far as AI is concerned,

325
00:18:54,960 --> 00:18:58,280
that's something I've been
playing with because I teach a

326
00:18:58,280 --> 00:19:00,120
lot.
I do a lot of writing classes.

327
00:19:00,560 --> 00:19:03,160
And so the, the question comes
up a lot.

328
00:19:03,160 --> 00:19:07,920
And so I've, you know, I've
fiddled with AI a little bit and

329
00:19:09,120 --> 00:19:11,640
it's here to stay.
And I found that it can be

330
00:19:11,640 --> 00:19:14,920
useful as a research tool as
long as you get, you know,

331
00:19:15,000 --> 00:19:19,600
sources that you can check out.
But you know, AI isn't really

332
00:19:19,600 --> 00:19:21,920
creative.
It just regurgitates

333
00:19:21,960 --> 00:19:24,160
information.
And I actually tried that.

334
00:19:24,440 --> 00:19:27,640
I had to write a short story,
which is not my specialty.

335
00:19:27,880 --> 00:19:29,880
And so I plugged in the
parameters.

336
00:19:29,880 --> 00:19:32,920
I need a short story, 5000
words, this style.

337
00:19:32,920 --> 00:19:35,040
This is the title.
These are the elements.

338
00:19:35,440 --> 00:19:39,760
And it spit out something in
nanoseconds and it sucked.

339
00:19:39,760 --> 00:19:43,760
It sucked.
And that then was a reminder

340
00:19:43,760 --> 00:19:47,160
that I hadn't lost my chops and
that, you know, there is

341
00:19:47,160 --> 00:19:50,520
something to say for the human
element here.

342
00:19:50,800 --> 00:19:54,240
And it is definitely, I mean,
there's a lawsuit in the US

343
00:19:54,240 --> 00:19:57,320
about copyright infringement
because, you know, people's

344
00:19:57,320 --> 00:20:00,480
books are being used to train
AI.

345
00:20:00,680 --> 00:20:04,080
But, you know, ChatGPT is not
buying these people's novels.

346
00:20:04,080 --> 00:20:07,480
They're just stealing them.
And and that and that's still

347
00:20:07,480 --> 00:20:10,640
going through the courts.
So copyright infringement is an

348
00:20:10,640 --> 00:20:12,480
issue.
But, you know, it's almost a

349
00:20:12,480 --> 00:20:15,680
moot point because the damage
has already been done, although

350
00:20:15,680 --> 00:20:18,800
it could protect, it could
protect people who are now

351
00:20:18,800 --> 00:20:21,760
writing, but their stuff hasn't,
you know, been published yet.

352
00:20:22,520 --> 00:20:24,000
Yeah.
And on the, on the kind of, I

353
00:20:24,000 --> 00:20:27,440
guess copyright infringement
stuff, as you've alluded to

354
00:20:27,440 --> 00:20:31,080
there, it is just pulling from
all information available to it

355
00:20:31,080 --> 00:20:34,680
and kind of summarising it,
which if you, if you play that

356
00:20:34,680 --> 00:20:37,520
out over a long enough time, the
more it does that, the more of

357
00:20:37,520 --> 00:20:40,360
itself it's putting out there
and then the more of itself it's

358
00:20:40,360 --> 00:20:43,160
kind of drawing from.
So it should theoretically just

359
00:20:43,160 --> 00:20:47,320
come to this ultimately beige
point that it's just nothing.

360
00:20:47,440 --> 00:20:49,600
You know what I mean?
Once it loses, as you said, all

361
00:20:49,600 --> 00:20:53,880
that humanness, that's, yeah,
not very exciting to think

362
00:20:53,880 --> 00:20:56,160
about.
I would bet though that AI is

363
00:20:56,160 --> 00:20:59,880
becoming is going to become much
more sophisticated to the point

364
00:20:59,880 --> 00:21:03,920
where it really can replicate
the human thought and the

365
00:21:03,920 --> 00:21:06,360
emotions.
I read or I heard a story

366
00:21:06,360 --> 00:21:12,720
recently that you know, unstable
people are using ChatGPT or or

367
00:21:12,960 --> 00:21:16,960
you know, generative AI to to be
their therapist and that it

368
00:21:16,960 --> 00:21:19,480
actually can talk them into
committing suicide.

369
00:21:19,600 --> 00:21:22,280
Yeah, I've seen a couple of the
same stories and that's very,

370
00:21:22,280 --> 00:21:28,480
very scary that not only are we
so insularised, I don't know if

371
00:21:28,480 --> 00:21:32,000
that's a word, but we're so
interested now, you know, in our

372
00:21:32,000 --> 00:21:35,040
devices and things and our
digital lives where we're not

373
00:21:35,040 --> 00:21:38,600
interacting with real people
offline, that turning to a

374
00:21:39,160 --> 00:21:45,760
screen for therapy, it seems
like a very dystopian next step.

375
00:21:45,840 --> 00:21:48,960
And then seeing as you said,
these stories of these these

376
00:21:48,960 --> 00:21:54,120
young people ending up kind of
committing suicide because of

377
00:21:54,120 --> 00:21:56,920
what it said or whether it's
because of what it said directly

378
00:21:56,920 --> 00:21:59,200
or not.
It's just so tragic.

379
00:21:59,200 --> 00:22:02,920
And hopefully there gets to be
some guardrails and, you know,

380
00:22:02,920 --> 00:22:05,760
regulations around that because
that's awful.

381
00:22:05,760 --> 00:22:09,080
If you were a nefarious actor,
you could easily sabotage that

382
00:22:09,080 --> 00:22:13,640
and, and commit some kind of op.
But that's, that's a very dark

383
00:22:13,640 --> 00:22:17,040
place to go down.
But what I did want to talk to

384
00:22:17,040 --> 00:22:20,880
you around, as you said, those
large language models and, and

385
00:22:20,880 --> 00:22:23,480
generative AI to, to make
pieces.

386
00:22:23,480 --> 00:22:28,240
If we rewind back to the context
of, say a, let's say, a book

387
00:22:28,240 --> 00:22:32,600
publisher or even a newsroom or,
or anything that deals in words,

388
00:22:34,280 --> 00:22:36,880
theoretically, could they get to
a point where you could be

389
00:22:36,880 --> 00:22:42,000
running just agentic AI systems
producing all the content

390
00:22:42,000 --> 00:22:44,080
drawing from what's happening
out there in the world?

391
00:22:44,400 --> 00:22:48,680
And so you could be running an
entirely fake newsroom or, or

392
00:22:48,680 --> 00:22:52,800
book publisher.
Yeah, awesome.

393
00:22:54,200 --> 00:22:58,280
Yeah, have a nice day.
Yeah, God, that sounds awful.

394
00:22:58,760 --> 00:23:02,120
OK, another, another thing, if
we go back to that citizen

395
00:23:02,120 --> 00:23:04,560
journalism, I'm sure you've
heard of platforms like

396
00:23:04,560 --> 00:23:07,960
Substack, right?
Would you say that on, on an

397
00:23:07,960 --> 00:23:11,120
individual level that the
Substack model is potentially a

398
00:23:11,120 --> 00:23:15,600
viable future for journalism?
Or is it just a niche where I

399
00:23:15,600 --> 00:23:18,360
guess if you've got a big enough
name and platform already, you

400
00:23:18,360 --> 00:23:21,280
can capitalise on that?
Oh, well, I think you you can

401
00:23:21,280 --> 00:23:23,040
definitely do that.
I mean, there are a couple of

402
00:23:23,040 --> 00:23:26,240
people who left the Washington
Post and, you know, started

403
00:23:26,240 --> 00:23:29,040
their own Substack.
I've always seen Substack as

404
00:23:29,040 --> 00:23:34,600
more of a newsletter kind of
thing, but you know, technology

405
00:23:34,600 --> 00:23:36,440
evolves.
I don't know if I don't see it

406
00:23:36,440 --> 00:23:40,680
necessarily replacing, you know,
day to day journalism, but it

407
00:23:40,680 --> 00:23:44,520
certainly supplements it and it
can be a useful, valuable

408
00:23:44,520 --> 00:23:49,080
contribution to the search for
truth. And I mean, that's

409
00:23:49,080 --> 00:23:51,400
the thing.
It's, I think that anything is

410
00:23:51,400 --> 00:23:56,200
possible and that it can be used
for good or for ill.

411
00:23:56,720 --> 00:24:00,880
And it goes back to being
discerning and being able to use

412
00:24:00,880 --> 00:24:03,480
things responsibly with the
understanding that there are

413
00:24:03,480 --> 00:24:06,760
going to be a lot of people who
don't use it responsibly or use

414
00:24:06,760 --> 00:24:10,160
it nefariously or even in an
evil way, with evil intent.

415
00:24:10,800 --> 00:24:15,400
But that's been the human
condition since the beginning.

416
00:24:15,680 --> 00:24:18,280
It's it's just more
sophisticated now.

417
00:24:18,560 --> 00:24:21,080
Yeah, on that point, you said
something.

418
00:24:21,080 --> 00:24:23,880
there, obviously the search for
truth and you got to be a bit

419
00:24:23,880 --> 00:24:27,600
discerning around whether it's
your sources or what you're

420
00:24:27,600 --> 00:24:30,280
producing.
A little while ago we touched on

421
00:24:30,280 --> 00:24:34,320
kind of how the Internet
affected, say, newsrooms and

422
00:24:34,320 --> 00:24:38,200
journalism right at the get go.
How has that changed over time

423
00:24:38,200 --> 00:24:42,680
now that there's this 24/7 news
cycle and just constant demand

424
00:24:42,680 --> 00:24:45,800
for new stuff and the speed that
everything operates at?

425
00:24:45,960 --> 00:24:50,360
Well, in in some ways, it's it's
very useful because one of the

426
00:24:50,360 --> 00:24:53,920
things I noticed when I was at
CNN is that, you know, we could,

427
00:24:53,920 --> 00:24:57,520
it was hard for us to do
interviews with people who

428
00:24:57,520 --> 00:25:01,200
didn't live in a bureau city.
In other words, you had to get a

429
00:25:01,200 --> 00:25:04,640
camera in front of them.
And, you know, that took time,

430
00:25:04,640 --> 00:25:08,120
it took effort, travel, all that
kind of stuff.

431
00:25:08,120 --> 00:25:11,960
So, you know, the newsmakers
were in the big cities near

432
00:25:11,960 --> 00:25:15,520
where we had a camera.
Well, now with the Internet, we

433
00:25:15,520 --> 00:25:18,400
can hook you up to somebody in
Function Junction, Utah.

434
00:25:18,560 --> 00:25:24,040
And that's great because the
technology allows us to reach

435
00:25:24,040 --> 00:25:28,080
more people and draw from the
expertise and the experience of

436
00:25:28,080 --> 00:25:32,640
people who aren't just in the
elite gatekeeping kinds of

437
00:25:32,640 --> 00:25:35,600
places.
So I think that's a value you

438
00:25:35,600 --> 00:25:37,720
have now.
You know, everybody who's got a

439
00:25:37,720 --> 00:25:41,800
cell phone, cell phones are very
sophisticated in that they have

440
00:25:41,800 --> 00:25:45,360
that camera function and the
video function.

441
00:25:45,600 --> 00:25:50,200
So people can go live during a
traffic stop that goes south

442
00:25:50,200 --> 00:25:53,440
and, you know, people see it in
real time. Sadly,

443
00:25:53,440 --> 00:25:57,920
however, mass shooters have also
live streamed their crimes.

444
00:25:57,920 --> 00:26:01,720
It's sick.
But just because the ability

445
00:26:02,160 --> 00:26:05,360
that is there doesn't make the
Internet sick.

446
00:26:05,680 --> 00:26:10,040
It means that people misuse it.
But as far as journalism is

447
00:26:10,040 --> 00:26:14,840
concerned, it's been wonderful
in terms of getting video fast

448
00:26:15,080 --> 00:26:18,200
from people right on the scene.
You know, if you've got, if, you

449
00:26:18,200 --> 00:26:22,720
know, when Trump was shot in
Butler, PA, everybody had it on

450
00:26:22,720 --> 00:26:25,200
tape.
You know, I mean, you, you get

451
00:26:25,200 --> 00:26:27,720
so many different angles of the
same thing.

452
00:26:28,240 --> 00:26:31,520
And that is helpful to law
enforcement as well.

453
00:26:31,760 --> 00:26:35,360
Yeah, look, I think that that as
well, as you said, if

454
00:26:35,360 --> 00:26:38,360
theoretically any event that
happens, you've got somebody on

455
00:26:38,360 --> 00:26:41,360
the scene.
And and we know that people not

456
00:26:41,360 --> 00:26:44,440
even interested in producing
news, they're just interested in

457
00:26:44,440 --> 00:26:47,320
getting likes online, putting a
video out there.

458
00:26:47,320 --> 00:26:51,960
And, and one of the things that
we've all noticed is whether big

459
00:26:51,960 --> 00:26:56,480
or small doesn't really matter.
Sources of news do scour the

460
00:26:56,480 --> 00:27:00,280
Internet and and social media
platforms to find those stories

461
00:27:00,920 --> 00:27:03,920
that might originate on
someone's Instagram or TikTok

462
00:27:03,920 --> 00:27:06,480
account or as you said, a video
just shared somewhere.

463
00:27:06,600 --> 00:27:09,840
How how much of that goes on?
Would you say there's people

464
00:27:09,840 --> 00:27:13,880
that that's their job just to
like be trolling the Internet

465
00:27:13,880 --> 00:27:16,840
for things like that that can
turn into stories or or do you

466
00:27:16,880 --> 00:27:20,720
hear the story and then find it?
I don't think, I don't think

467
00:27:20,720 --> 00:27:22,680
news organisations are doing
that.

468
00:27:22,680 --> 00:27:25,280
Again, we're talking reputable
news organisations.

469
00:27:25,520 --> 00:27:27,600
I don't think that's where they
get their news.

470
00:27:27,840 --> 00:27:31,320
Now, there's a friend of mine in
Baltimore when the, you know,

471
00:27:31,320 --> 00:27:34,120
the Key Bridge was hit by a
barge and it collapsed.

472
00:27:34,120 --> 00:27:38,440
There was video, there was there
was surveillance video and they

473
00:27:38,440 --> 00:27:41,480
had to go through a rigorous
check to make sure they had to

474
00:27:41,480 --> 00:27:44,600
check with, you know, the
police, you know, is this is

475
00:27:44,600 --> 00:27:47,600
what we see is this did this
happen?

476
00:27:48,280 --> 00:27:52,680
And so that then means though,
that there are people out in the

477
00:27:52,680 --> 00:27:56,560
hinterlands who will, you know,
send you a video and say, isn't

478
00:27:56,560 --> 00:28:00,360
this amazing?
And it is amazing, but it's, is

479
00:28:00,360 --> 00:28:02,640
it true?
And so I think again, the

480
00:28:02,640 --> 00:28:06,000
reputable news organisations,
they're not searching for that

481
00:28:06,000 --> 00:28:08,320
kind of stuff because the stuff
comes at them.

482
00:28:09,040 --> 00:28:11,960
One of the things I'd like to
see, and maybe they actually

483
00:28:11,960 --> 00:28:17,960
exist, is look, if you've got a
video of some event that you

484
00:28:17,960 --> 00:28:21,520
know is stunning, but there's
no, you know, no one else

485
00:28:21,520 --> 00:28:25,120
happened to be there with a
camera, did it really happen or

486
00:28:25,120 --> 00:28:27,560
is it fake?
I would think there need to be

487
00:28:27,560 --> 00:28:33,720
people in newsrooms that are
able to tell if a news event is

488
00:28:33,720 --> 00:28:35,840
really a news event.
There have been a couple of

489
00:28:35,840 --> 00:28:39,080
cases where Fox News, which is
sort of a very conservative

490
00:28:39,080 --> 00:28:43,000
network in the US, sort of
Trump's personal, almost personal

491
00:28:43,000 --> 00:28:45,960
website.
They were giving stories about,

492
00:28:46,000 --> 00:28:50,240
you know, burning down cities,
but the video was from something

493
00:28:50,240 --> 00:28:53,480
entirely different.
But, you know, you got to be

494
00:28:53,480 --> 00:28:57,920
able to show where the original
came from, show how they're

495
00:28:57,920 --> 00:28:59,680
using it.
And I mean, that takes,

496
00:28:59,680 --> 00:29:01,280
that's investigative journalism.
Yeah.

497
00:29:01,280 --> 00:29:03,240
Look, that's, that's a great
point.

498
00:29:03,240 --> 00:29:07,160
And I think that that is one of
the problems with this.

499
00:29:07,160 --> 00:29:09,520
And and you know, as we, as we
mentioned a little while ago,

500
00:29:09,520 --> 00:29:13,520
like a lot of people are not
doing any deeper digging on

501
00:29:13,520 --> 00:29:16,080
anything.
You know, they see the video,

502
00:29:16,120 --> 00:29:18,560
that's the truth.
They see a claim, that's the

503
00:29:18,560 --> 00:29:20,280
truth.
You know, I've seen things like

504
00:29:20,280 --> 00:29:22,960
that, even those US ones that
you're talking about, like I

505
00:29:22,960 --> 00:29:25,680
might have seen it in my feed.
And I'm like, this just seems

506
00:29:25,680 --> 00:29:28,680
absurd, whether it's about fires
or something else.

507
00:29:29,200 --> 00:29:34,280
And within a minute of further,
you know, amateur investigation

508
00:29:34,280 --> 00:29:36,240
that I can do, I can find that
it's fake.

509
00:29:36,240 --> 00:29:40,000
But let's be honest, the vast
majority of people aren't doing

510
00:29:40,000 --> 00:29:41,120
that.
That's true.

511
00:29:41,120 --> 00:29:46,040
It's totally understandable how
this stuff spreads and I guess.

512
00:29:46,120 --> 00:29:49,720
Which which goes back to, which
goes back to my feeling that the

513
00:29:49,720 --> 00:29:52,760
education system needs to step
up as well.

514
00:29:52,960 --> 00:29:56,760
Because people are, you know,
they're sheep in many ways.

515
00:29:56,760 --> 00:29:58,600
They'll believe whatever they
want to believe.

516
00:29:59,000 --> 00:30:03,840
And that when you have an
informed electorate, when you

517
00:30:03,840 --> 00:30:07,120
know, civics is taught, when
science is taught, you know, the

518
00:30:07,120 --> 00:30:11,080
education system I think really
needs to step up to let people

519
00:30:11,080 --> 00:30:16,880
know that at a, at a young age,
that, you know, the dangers of

520
00:30:16,880 --> 00:30:20,040
falsehoods, the consequences of
that.

521
00:30:20,320 --> 00:30:23,480
I mean, I, I think we can turn
it around if we educate people.

522
00:30:23,480 --> 00:30:26,440
I mean, that's not a, that's not
meant to be a panacea, but I

523
00:30:26,440 --> 00:30:31,240
think it is at least one place
where we can start to make a

524
00:30:31,240 --> 00:30:33,440
difference.
Yeah, No, absolutely.

525
00:30:33,440 --> 00:30:36,320
I totally agree with you there.
That kind of teaching digital,

526
00:30:36,320 --> 00:30:40,480
digital literacy, I mean, look,
if you're in, if you're a 14

527
00:30:40,480 --> 00:30:43,680
year old kid now, you, you don't
know anything other than having

528
00:30:43,680 --> 00:30:46,640
a supercomputer in your hand
at all times.

529
00:30:46,640 --> 00:30:48,680
So you know how to use all of
this stuff.

530
00:30:48,680 --> 00:30:53,080
But it's like, as you said,
being able to discern the

531
00:30:53,080 --> 00:30:56,560
objective truth versus the
subjective truth is, is a skill

532
00:30:56,560 --> 00:31:00,800
that I know it'd be interesting
to see whether people want that

533
00:31:00,800 --> 00:31:04,240
or they want their own opinions,
whatever they are to to be

534
00:31:04,240 --> 00:31:06,920
nurtured and kind of coddled in
that regard.

535
00:31:07,040 --> 00:31:09,240
You know, you mentioned Fox News
there and, and you've got

536
00:31:09,240 --> 00:31:11,920
publications on the other end.
And look, everybody's got their

537
00:31:12,080 --> 00:31:14,720
bias and, and kind of angle to
different things.

538
00:31:14,720 --> 00:31:19,440
But would you say that
journalism is in better shape

539
00:31:19,440 --> 00:31:24,240
when when you've got a strong
public broadcaster that needs to

540
00:31:24,480 --> 00:31:27,760
appeal to a broad audience
representing the entire nation?

541
00:31:27,760 --> 00:31:31,000
Like in Australia we've got the
ABC and so obviously being

542
00:31:31,000 --> 00:31:34,680
publicly funded, they need to
appear right down the middle.

543
00:31:34,720 --> 00:31:37,680
Now, that said, I don't think
there's a single person in

544
00:31:37,680 --> 00:31:40,200
Australia that thinks the ABC is
that.

545
00:31:40,200 --> 00:31:43,560
Everybody thinks that it's
catering to the views that they

546
00:31:43,560 --> 00:31:46,160
don't like, which is probably an
indicator that they're actually

547
00:31:46,160 --> 00:31:50,080
doing quite a good balanced job.
But those publicly funded

548
00:31:50,080 --> 00:31:53,200
broadcasters that do need to
play right down the middle, how

549
00:31:53,200 --> 00:31:57,240
do you think that they keep the
truth in journalism alive?

550
00:31:57,760 --> 00:32:00,280
Well, the problem is that Trump
is defunding them.

551
00:32:01,280 --> 00:32:03,800
And so that, you know, that's
one of the problems we have.

552
00:32:03,800 --> 00:32:07,440
It's not that, you know, they
never were, well, I shouldn't

553
00:32:07,440 --> 00:32:11,240
say never, but you know, at this
point it's not publicly funded.

554
00:32:11,240 --> 00:32:15,280
There's, there's a public stream
of money, but it's, it's a small

555
00:32:15,280 --> 00:32:18,760
percentage, but it's, but it's
enough that when that money goes

556
00:32:18,760 --> 00:32:22,680
away, that does have an impact
on, you know, the quality of

557
00:32:22,680 --> 00:32:25,200
reporting.
And I, and I think though that

558
00:32:25,560 --> 00:32:30,160
they're really, I have a problem
with, with publicly funded news

559
00:32:30,160 --> 00:32:33,760
organisations because if you've
got a dictator, that then means

560
00:32:33,760 --> 00:32:38,360
the information is curated by
somebody who really has an axe

561
00:32:38,360 --> 00:32:40,000
to grind.
I don't.

562
00:32:40,000 --> 00:32:43,720
I don't see public as
necessarily the same as as

563
00:32:43,720 --> 00:32:45,720
unbiased.
I mean, yeah, that's that's a

564
00:32:45,720 --> 00:32:49,000
fair point.
I guess the flip side of that is

565
00:32:49,000 --> 00:32:53,360
that hypothetically, let's say
Donald Trump destroys it.

566
00:32:53,360 --> 00:32:56,480
Couldn't someone just reinstate
it in a few more years?

567
00:32:56,680 --> 00:33:00,160
Sure, absolutely.
And you know there was a time in

568
00:33:00,160 --> 00:33:03,440
the US where they had the
fairness doctrine and equal time

569
00:33:03,440 --> 00:33:06,160
provisions.
It got to be very difficult to

570
00:33:06,160 --> 00:33:10,520
administer and and a lot of
broadcasting organisations got

571
00:33:10,520 --> 00:33:12,320
their licences from the
government.

572
00:33:12,320 --> 00:33:16,160
So in order to get their licence
renewed they needed to play it

573
00:33:16,160 --> 00:33:18,320
straight and to be fair to all
sides.

574
00:33:18,600 --> 00:33:23,600
And when those particular rules
went away, it was the Wild West.

575
00:33:23,600 --> 00:33:29,000
It it gave rise to talk radio,
Rush Limbaugh, you know, people

576
00:33:29,000 --> 00:33:32,440
who you know, who didn't have
editors, they could spew

577
00:33:32,440 --> 00:33:34,680
whatever they wanted.
I think it's a pendulum.

578
00:33:34,680 --> 00:33:38,040
You know, we talked about truth.
We talked about objectivity.

579
00:33:38,360 --> 00:33:43,920
I think on a fundamental level,
people really do appreciate

580
00:33:43,920 --> 00:33:46,040
truth.
We don't like to be lied to,

581
00:33:46,560 --> 00:33:49,200
certainly don't like to be lied
to in relationships.

582
00:33:49,680 --> 00:33:52,840
And that's really what we're
talking about, a relationship

583
00:33:52,840 --> 00:33:56,560
between, you know, the people
and a president, the people and

584
00:33:56,560 --> 00:33:58,800
their government.
So I think that on some

585
00:33:58,800 --> 00:34:03,760
fundamental level there still is
a desire for reliable

586
00:34:03,920 --> 00:34:06,840
information.
Yeah, 100%.

587
00:34:06,840 --> 00:34:10,440
And I think that that's one of
the benefits that the access to

588
00:34:10,440 --> 00:34:14,080
finding that truth for anybody
that is interested is you can

589
00:34:14,080 --> 00:34:21,199
just see how much this stuff is
kind of curated and you know,

590
00:34:21,199 --> 00:34:24,320
it, it gives you a little bit
that doesn't give you the full

591
00:34:24,320 --> 00:34:26,920
story that you can find.
So you can obviously see what

592
00:34:26,920 --> 00:34:30,159
anyone's agenda and how much
anybody is is lying to you.

593
00:34:31,639 --> 00:34:35,040
Do you think people still value
journalism as a public good even

594
00:34:35,040 --> 00:34:36,560
if they're not willing to pay
for it?

595
00:34:36,560 --> 00:34:41,080
And does the amount they value
it, You know, like I, I value it

596
00:34:41,080 --> 00:34:44,159
quite a lot and I, I read a wide
mix of things.

597
00:34:44,159 --> 00:34:46,280
But do you think for a lot of
people that the amount they

598
00:34:46,280 --> 00:34:50,560
value it depends on how much it
aligns to their own, I guess

599
00:34:50,560 --> 00:34:54,239
mindset?
Yeah, because look, going back

600
00:34:54,239 --> 00:34:58,200
to when this country, the US,
was founded, you know, they

601
00:34:58,200 --> 00:35:01,000
didn't have anything called
objective journalism.

602
00:35:01,320 --> 00:35:04,040
You know, there were a lot of
newspapers, but each newspaper

603
00:35:04,040 --> 00:35:07,720
was spouting a particular
political position.

604
00:35:08,160 --> 00:35:12,120
And it wasn't until probably the
1920s, maybe 100 years ago,

605
00:35:12,360 --> 00:35:17,920
where the concept of objectivity
even entered journalism, entered

606
00:35:17,920 --> 00:35:21,640
the public sphere.
And, and so that was then

607
00:35:21,640 --> 00:35:25,080
curated and it became more
sophisticated.

608
00:35:25,360 --> 00:35:28,720
It was always controversial.
I think the Vietnam War was, was

609
00:35:28,720 --> 00:35:32,760
probably a perfect example of
that because, you know, Nixon

610
00:35:32,760 --> 00:35:36,360
was prosecuting a war and lying
about it.

611
00:35:36,480 --> 00:35:40,200
And, and it wasn't just Nixon,
it was every president up until

612
00:35:40,200 --> 00:35:43,160
Nixon and beyond, you know,
lying about it.

613
00:35:43,440 --> 00:35:47,560
And during the Vietnam War, the
reporters that were actually in

614
00:35:47,560 --> 00:35:50,880
Vietnam covering it, they go in
the field with the troops and

615
00:35:51,160 --> 00:35:54,480
then they'd come back and there
would be a briefing in Saigon

616
00:35:54,480 --> 00:35:57,640
from the military leaders.
And they called the briefing the

617
00:35:57,640 --> 00:36:02,120
five o'clock follies because there was
no, there was no coherence,

618
00:36:02,120 --> 00:36:05,800
because there was no, there was
no comparison to the lies that

619
00:36:05,800 --> 00:36:08,800
were told from the podium and
what these guys were seeing in

620
00:36:08,800 --> 00:36:11,640
the field.
But Nixon's vice President,

621
00:36:12,320 --> 00:36:15,640
Spiro Agnew was going around the
country calling journalists

622
00:36:15,880 --> 00:36:21,080
nattering nabobs of negativism
and an effete corps of impudent

623
00:36:21,080 --> 00:36:25,240
snobs, which is just another
sophisticated way of saying fake

624
00:36:25,240 --> 00:36:29,800
news.
So, you know, I think we are

625
00:36:30,000 --> 00:36:33,720
fooling ourselves if we really
think that this is going to be

626
00:36:33,720 --> 00:36:36,520
solved.
No matter what the

627
00:36:36,520 --> 00:36:41,920
technology is, as long as people
are involved, there is going

628
00:36:41,920 --> 00:36:45,520
to be funny business.
And I sound like a

629
00:36:45,520 --> 00:36:48,960
broken record, but it just means
that those of us who care about

630
00:36:48,960 --> 00:36:53,400
the truth need to be discerning
and need to do what we can to

631
00:36:53,400 --> 00:36:56,520
let others know that not
everything you see on the

632
00:36:56,520 --> 00:36:59,480
Internet is trustworthy.
I think we all know that.

633
00:36:59,720 --> 00:37:03,360
But I think that politics has
gotten to be a blood sport.

634
00:37:03,600 --> 00:37:07,000
It's all about power.
It's all about power and

635
00:37:07,000 --> 00:37:10,520
winning.
And truth doesn't matter anymore

636
00:37:10,520 --> 00:37:12,880
because the bottom line is being
in control.

637
00:37:13,200 --> 00:37:16,320
And that's so sad.
It's really sad.

638
00:37:16,440 --> 00:37:19,880
Yeah, look, again, 100%
agree with that.

639
00:37:19,960 --> 00:37:22,840
And I'm only speaking in an
Australian context.

640
00:37:22,840 --> 00:37:26,520
And from what I can see from the
US, it seems just kind of on

641
00:37:26,520 --> 00:37:32,360
another planet in that sense,
though even here it has kind

642
00:37:32,360 --> 00:37:36,680
of broken down to a team sport
to varying degrees.

643
00:37:36,680 --> 00:37:40,200
You know, it's just
that hypocrisy and almost

644
00:37:40,200 --> 00:37:44,880
not holding anybody to anything
that they say or promise.

645
00:37:44,880 --> 00:37:47,280
Yeah.
Do you see any spillover from

646
00:37:47,280 --> 00:37:50,560
the US where you know our
craziness is starting to seep

647
00:37:50,560 --> 00:37:51,840
into your politics?
Really.

648
00:37:51,840 --> 00:37:53,120
Of course.
Yeah, yeah, yeah.

649
00:37:53,120 --> 00:37:57,800
Look, I mean, rightly or wrongly,
good or bad, like obviously the

650
00:37:57,840 --> 00:38:01,480
US is probably one of the, if
not the most influential culture

651
00:38:01,600 --> 00:38:05,360
in the world, right.
And I think that being that

652
00:38:05,480 --> 00:38:09,480
highly visible, let's say leader
of the Western world, that

653
00:38:09,480 --> 00:38:12,680
your political, cultural,
whatever it is, issues

654
00:38:13,160 --> 00:38:15,760
definitely bleed over into other
Western nations.

655
00:38:15,760 --> 00:38:20,080
And I think one of the
biggest ones is it's almost like

656
00:38:20,080 --> 00:38:22,520
we have bootleg versions like
you've got over there, you've

657
00:38:22,520 --> 00:38:24,760
got MAGA, right?
And then someone will

658
00:38:24,760 --> 00:38:28,600
repurpose that here, and
it's Make Australia Great Again.

659
00:38:28,600 --> 00:38:32,600
And, you know, similar kind of
sentiments, obviously, like

660
00:38:32,680 --> 00:38:35,960
Australia put Australians first
and, you know, same kind of

661
00:38:35,960 --> 00:38:37,480
thing.
And I think that it's not a

662
00:38:38,600 --> 00:38:44,280
uniquely Trumpian thing.
I think that it's just very easy

663
00:38:44,280 --> 00:38:49,640
to spread around when there's so much
shared, I guess, culture and values in

664
00:38:49,640 --> 00:38:51,840
general.
Does that then mean that

665
00:38:52,320 --> 00:38:56,680
you don't really have any other
places to go for reliable

666
00:38:56,680 --> 00:39:02,200
information, because the Internet
has been so polluted entirely?

667
00:39:02,480 --> 00:39:07,120
Look, I would say
that, and I could be

668
00:39:07,120 --> 00:39:09,560
completely wrong here, I'll just
talk from my perspective.

669
00:39:09,560 --> 00:39:14,200
I feel that, as I said, the US
being on another level with all

670
00:39:14,200 --> 00:39:18,000
of this stuff, it again comes
down to that. Like, I think here

671
00:39:18,440 --> 00:39:23,800
people aren't, from what I can
tell, as supremely emotionally

672
00:39:23,800 --> 00:39:26,600
invested in politics as
they are in the US.

673
00:39:26,600 --> 00:39:28,400
Like, you know, don't get me
wrong, we've got kind of

674
00:39:28,400 --> 00:39:31,920
politics super fans here, but
they seem to be in far

675
00:39:31,920 --> 00:39:34,840
smaller numbers.
And I think that what it's

676
00:39:34,840 --> 00:39:39,240
like here is most people
generally have this perception

677
00:39:39,240 --> 00:39:42,840
that the political class is just
completely mediocre.

678
00:39:43,120 --> 00:39:47,600
And so it doesn't have this
weight for the normal person.

679
00:39:47,600 --> 00:39:51,320
Like normal people are kind of
somewhat detached from it.

680
00:39:51,320 --> 00:39:53,840
And I think that that's a
benefit that we can enjoy.

681
00:39:55,360 --> 00:39:56,640
How did it get that way?
How do we,

682
00:39:57,880 --> 00:40:00,000
just how can we,
How can we learn from you?

683
00:40:00,600 --> 00:40:04,240
Look, I think
find the least

684
00:40:04,240 --> 00:40:08,400
charismatic people you can find,
the least interesting people you

685
00:40:08,400 --> 00:40:13,720
can find, and, you know,
straight down the middle,

686
00:40:13,720 --> 00:40:17,160
inoffensive, never rocking the boat.
Like, they're careerists,

687
00:40:17,160 --> 00:40:18,800
right.
And look, again,

688
00:40:18,880 --> 00:40:21,120
we'll go off on a tangent, but
it is an interesting one.

689
00:40:21,560 --> 00:40:24,440
But it's like, you know, we know
that politicians are careerists,

690
00:40:24,440 --> 00:40:26,720
right?
Like, what a job. Get

691
00:40:26,720 --> 00:40:29,120
on that gravy train and you can
just ride it.

692
00:40:29,120 --> 00:40:32,800
And it's the same here, like,
you know, except here

693
00:40:33,960 --> 00:40:37,280
the salaries are probably far
less than what you would get in

694
00:40:37,280 --> 00:40:41,600
the US. Politics here has kind of
got that control lever, but it

695
00:40:41,600 --> 00:40:46,480
doesn't have the cultural cachet
that, I guess, you know, like

696
00:40:46,480 --> 00:40:50,920
during an election, someone might
put a corflute or a sign in

697
00:40:50,920 --> 00:40:54,040
their yard or on their fence for
their local candidate.

698
00:40:54,040 --> 00:40:57,720
But there are never
videos of someone driving

699
00:40:57,720 --> 00:41:00,480
their car through someone's yard
to smash them down.

700
00:41:00,480 --> 00:41:03,480
Or fights breaking out and
things like that.

701
00:41:03,480 --> 00:41:05,320
It's just not like
that.

702
00:41:05,320 --> 00:41:08,960
And I think that at the end of
the day for the vast majority

703
00:41:08,960 --> 00:41:12,160
here, we are able to kind of
interact with each other.

704
00:41:12,160 --> 00:41:16,400
It hasn't broken society
tribally, is what I can

705
00:41:16,400 --> 00:41:19,760
observe.
But I think, on that, while we

706
00:41:19,760 --> 00:41:23,920
were talking around, you know,
public broadcasters and the

707
00:41:23,920 --> 00:41:27,920
search for truth, like, I think
one healthy thing on that here

708
00:41:27,920 --> 00:41:32,440
is this inherent cynicism, like
any claim that you might

709
00:41:32,440 --> 00:41:34,600
see from the
government, so many people are

710
00:41:34,600 --> 00:41:37,200
just like, that's bullshit.
You know, I'm gonna try and

711
00:41:37,200 --> 00:41:39,240
look, I'm gonna find why that's
wrong.

712
00:41:40,280 --> 00:41:42,360
So that's, I guess, one
benefit.

713
00:41:42,360 --> 00:41:45,600
And like I said, there are, of
course, sycophants and

714
00:41:45,600 --> 00:41:48,440
mouthpieces, but I don't know,
it just doesn't have the

715
00:41:48,440 --> 00:41:51,320
cultural weight here
that I think it might in the US.

716
00:41:52,280 --> 00:41:55,440
I think, though, that
roughly the US is broken into

717
00:41:55,440 --> 00:41:57,680
thirds.
You know, there's the extreme

718
00:41:57,680 --> 00:42:00,680
right, the extreme left, and
then I think that there's the

719
00:42:00,680 --> 00:42:04,600
middle that really isn't as
tuned in anymore.

720
00:42:04,600 --> 00:42:06,880
There's a sort of fatigue about
it.

721
00:42:06,880 --> 00:42:11,080
And I think that's where
I'm hopeful that those people,

722
00:42:11,440 --> 00:42:14,080
you know, might actually
swing it one way or the

723
00:42:14,160 --> 00:42:18,960
other if they're not persuadable
by one extreme or the other.

724
00:42:19,640 --> 00:42:20,840
Yeah, absolutely.
And look,

725
00:42:20,840 --> 00:42:23,920
that pretty much sounds
like here: most people are in

726
00:42:23,920 --> 00:42:26,960
that kind of grey area.
And I think that the

727
00:42:26,960 --> 00:42:29,360
good thing about that is
obviously being able to find

728
00:42:29,360 --> 00:42:32,920
common cause with people that
aren't on your team as such.

729
00:42:32,920 --> 00:42:36,160
And then that's that shared
humanity, which obviously can

730
00:42:36,160 --> 00:42:38,880
help hopefully move
society along.

731
00:42:39,080 --> 00:42:41,080
I think you're
really on to something.

732
00:42:41,080 --> 00:42:44,520
I feel very strongly about that,
that common human ground.

733
00:42:44,760 --> 00:42:45,360
Yeah.
Look.

734
00:42:45,360 --> 00:42:47,840
Fingers crossed.
But while we're on a positive

735
00:42:47,840 --> 00:42:51,920
note, what would you say is one
thing that the introduction of

736
00:42:51,920 --> 00:42:55,080
the Internet and technology
has enabled in the world of

737
00:42:55,360 --> 00:42:58,040
journalism and writing that's
made you more hopeful about the

738
00:42:58,040 --> 00:42:59,800
future of it?
Being able to have this

739
00:42:59,800 --> 00:43:02,680
conversation, I
think, is a perfect example. I

740
00:43:02,680 --> 00:43:05,880
mean you're concerned about what
the Internet is doing to

741
00:43:06,120 --> 00:43:09,880
society, the world, and we're
talking about that.

742
00:43:09,880 --> 00:43:12,040
We are half a world away talking
about it.

743
00:43:12,360 --> 00:43:14,600
I think that is what gives me
hope.

744
00:43:14,840 --> 00:43:16,440
Awesome.
On that point then, we

745
00:43:16,680 --> 00:43:19,960
need those people to believe in
and deliver that hope moving

746
00:43:19,960 --> 00:43:22,160
forward from within the
industry.

747
00:43:22,480 --> 00:43:25,320
What would you say in your
opinion is the most important

748
00:43:25,320 --> 00:43:28,520
skill for a young writer,
whatever form of writing that

749
00:43:28,520 --> 00:43:30,880
they're, you know, looking to
pursue?

750
00:43:31,600 --> 00:43:34,640
The most important skill for
them to have today that was not

751
00:43:34,640 --> 00:43:39,040
needed, say, 20 years ago.
Boy, I hadn't

752
00:43:39,040 --> 00:43:41,800
thought about that one, because I
mean, you know, the obvious

753
00:43:41,800 --> 00:43:46,920
skill is just being able to have
a vocabulary and be able to put

754
00:43:46,920 --> 00:43:51,840
into words succinctly and
quickly whatever it is that

755
00:43:51,840 --> 00:43:56,560
you're trying to communicate.
I think here's one

756
00:43:56,560 --> 00:44:00,360
thought just off the top of my
head, and that is it's almost

757
00:44:00,360 --> 00:44:05,200
the antithesis of the blizzard
of information that we get.

758
00:44:05,400 --> 00:44:11,000
It would seem to me that the
ultimate is to be able to say

759
00:44:11,000 --> 00:44:15,760
something quickly and succinctly
and speak in sound bites, get

760
00:44:15,760 --> 00:44:18,840
the point across without
belabouring it.

761
00:44:19,520 --> 00:44:22,440
And now I'm going to stop.
No, that's an

762
00:44:22,440 --> 00:44:23,760
interesting
point there, especially

763
00:44:23,760 --> 00:44:27,560
since we've been talking about
people reading that headline or

764
00:44:27,560 --> 00:44:30,920
just the top level of it.
So I wonder if the skill

765
00:44:30,920 --> 00:44:36,680
to develop is figuring out a way
to deliver a real truthful

766
00:44:37,320 --> 00:44:40,080
headline or sound bite that
is actually going to deliver

767
00:44:40,480 --> 00:44:44,560
that truth rather than just that
click and that intrigue to then

768
00:44:44,560 --> 00:44:47,000
deliver,
I don't know, something else with

769
00:44:47,000 --> 00:44:50,760
a different agenda.
A lot to think about.

770
00:44:51,520 --> 00:44:53,440
I know, a lot to
think about.

771
00:44:54,640 --> 00:44:58,400
Just to finish up then, what
advice would you give to anybody

772
00:44:58,400 --> 00:45:02,760
either you know currently within
the industry or thinking of

773
00:45:02,760 --> 00:45:06,080
getting into it sometime in the
near future or down the line,

774
00:45:06,600 --> 00:45:10,400
how would you advise them
to not only navigate the current

775
00:45:10,960 --> 00:45:14,160
landscape and world of
journalism and writing, but also

776
00:45:14,160 --> 00:45:16,200
help future-proof themselves as
well?

777
00:45:16,320 --> 00:45:18,360
Be curious.
I think that's

778
00:45:18,360 --> 00:45:21,680
fundamental. I don't think
that will ever go away.

779
00:45:21,680 --> 00:45:27,480
And not only be curious, but
be assertive about your

780
00:45:27,480 --> 00:45:30,440
curiosity.
Don't be afraid to ask questions

781
00:45:30,560 --> 00:45:34,880
and 'why' is a wonderful
question. Five-year-olds, you know,

782
00:45:34,880 --> 00:45:36,720
get it. You know: it's time for
bed.

783
00:45:36,720 --> 00:45:39,840
Why?
Because I said so. Why? And so on.

784
00:45:41,320 --> 00:45:46,720
So I think curiosity will never
go out of fashion, and asking

785
00:45:46,720 --> 00:45:49,680
the who, what, where, when, why
and how questions.

786
00:45:50,760 --> 00:45:54,000
Yeah, I think 'stay curious'
is a great piece of advice not

787
00:45:54,000 --> 00:45:56,920
just for anybody in that world,
but obviously anybody in the

788
00:45:57,000 --> 00:46:00,560
world itself.
Hopefully people adopt that as

789
00:46:00,560 --> 00:46:03,080
kind of a mindset and we
can maybe get through this

790
00:46:03,080 --> 00:46:08,240
quagmire of mess using the
Internet to help find those

791
00:46:08,280 --> 00:46:11,280
pieces of truth.
I'm hopeful for no apparent

792
00:46:11,280 --> 00:46:12,880
reason.
That's good.

793
00:46:12,880 --> 00:46:13,880
Me too.
Me too.

794
00:46:14,720 --> 00:46:18,000
Thanks so much for that, John.
What have you got coming up on

795
00:46:18,000 --> 00:46:20,000
the horizon?
Where can people follow what

796
00:46:20,000 --> 00:46:22,640
you're up to?
Probably the best thing is my

797
00:46:22,640 --> 00:46:24,920
website,
which is my name dot com:

798
00:46:24,920 --> 00:46:34,240
johndedakis.com. J-O-H-N, D as in dog,
E, D as in dog, A-K-I, S as in Sam:

799
00:46:34,240 --> 00:46:37,760
johndedakis.com.
And I think that there are

800
00:46:37,760 --> 00:46:40,160
actually some DeDakises living
in Australia.

801
00:46:40,400 --> 00:46:41,600
Oh, really?
I have to look that up.

802
00:46:41,640 --> 00:46:44,320
Yeah, they
migrated from Lofka

803
00:46:44,520 --> 00:46:47,840
in Greece, on the Peloponnese.
I was just there, where my

804
00:46:47,840 --> 00:46:49,640
grandpa and great grandpa were
born.

805
00:46:50,040 --> 00:46:52,000
Yeah, right.
Yeah, we do have a decent

806
00:46:52,000 --> 00:46:54,480
Greek population here, so it
wouldn't surprise me.

807
00:46:54,480 --> 00:46:57,360
Yeah, right.
Pasiotis is another

808
00:46:57,680 --> 00:47:00,480
Greek family that moved to
Australia.

809
00:47:01,160 --> 00:47:04,000
So yeah, my website is probably
the best place to go.

810
00:47:04,240 --> 00:47:08,000
I'm doing more public speaking
now on helping people use

811
00:47:08,160 --> 00:47:10,080
writing as a way to heal from
grief.

812
00:47:10,080 --> 00:47:14,000
I'm working on my 7th novel.
I have a short story that's

813
00:47:14,160 --> 00:47:17,480
with an editor, and I've
written a memoir that's with a

814
00:47:17,480 --> 00:47:20,400
publisher and they're deciding
whether to publish it.

815
00:47:20,400 --> 00:47:24,240
So I've got a lot going on. Plus I'm
teaching classes online and

816
00:47:24,240 --> 00:47:27,480
because it's online on the
Internet, you could even take

817
00:47:27,480 --> 00:47:30,040
one of my classes even though
you're in Australia.

818
00:47:30,920 --> 00:47:33,360
And where can people find
those? It's just at your website?

819
00:47:33,360 --> 00:47:35,560
Go to my website, go to upcoming
events.

820
00:47:35,560 --> 00:47:38,440
And you'll find it.
Awesome, John, thank

821
00:47:38,440 --> 00:47:40,360
you so much.
Thank you, Gareth, it was

822
00:47:40,360 --> 00:47:41,360
wonderful talking
to you.

823
00:47:41,360 --> 00:47:44,560
Thank you. For more info on what
we've discussed today,

824
00:47:44,560 --> 00:47:46,640
check out the show notes.
If you enjoyed this one,

825
00:47:46,680 --> 00:47:49,120
you can subscribe to Ruined by
the Internet on your favourite

826
00:47:49,120 --> 00:47:52,240
podcast app and help spread the
word by sharing this episode or

827
00:47:52,240 --> 00:47:54,920
leaving a review.
I'm Gareth King, see you next

828
00:47:54,920 --> 00:47:55,160
time.


John DeDakis

Award-winning novelist John DeDakis (Pronounced: dee-DAY-kiss) is a former editor on CNN's "The Situation Room with Wolf Blitzer." DeDakis is the author of six novels in the Lark Chadwick mystery-suspense-thriller series. In his most recent book, Enemies Domestic, Lark is a White House press secretary forced to make her extremely personal abort-or-not-to-abort decision in a highly toxic and polarized political fishbowl—all while dealing with an attack on the presidency. DeDakis, a former White House correspondent, is a writing coach, manuscript editor, podcaster, and regularly teaches novel writing at literary centers and writers’ conferences nationwide. Website: www.johndedakis.com