Oct. 11, 2025

Has the internet ruined Human Resources? – Dr Justine Ferrer

It promised to make things more efficient, objective, and data-driven, but instead turned a human-centric discipline into an automated, impersonal system with new ethical and privacy concerns.

To help us explore how people and technology work together, we’re joined by Dr Justine Ferrer, a Senior Lecturer in Human Resource Management in the Deakin Business School.

http://www.linkedin.com/in/justineferrer

In this episode, Justine joins us to discuss the evolving role of human resources in the age of technology, exploring the impact of automation and AI on HR practices, the balance between efficiency and the human touch, and the ethical implications of data privacy and surveillance in the workplace.

We also look at the skills needed for future HR professionals, and the potential for AI to enhance rather than replace human roles in HR.

00:00 The Impact of Technology on Human Resources

02:57 The Shift Towards Strategic HR Management

06:00 Balancing Automation and Human Touch

08:47 Surveillance and Employee Trust

11:53 Data Privacy and Security Challenges

14:48 The Role of Small Businesses in HR Tech

17:57 AI, Bias, and the Future of Hiring

20:55 Skills for the Future of HR

23:43 The Ethical Framework for AI in HR

26:50 The Future of Human Resources

Let us know what else you’d like us to look into, or just say hello at https://www.ruinedbytheinternet.com/

1
00:00:00,200 --> 00:00:01,640
Welcome to Ruined by the
Internet.

2
00:00:01,720 --> 00:00:04,440
I'm Gareth King.
Today we're asking, has the

3
00:00:04,440 --> 00:00:07,720
Internet ruined human resources?
It promised to make things more

4
00:00:07,720 --> 00:00:11,240
efficient, objective and data-
driven, but instead turned a human

5
00:00:11,240 --> 00:00:14,320
centric discipline into an
automated, impersonal system

6
00:00:14,560 --> 00:00:16,560
with new ethical and privacy
concerns.

7
00:00:17,120 --> 00:00:19,800
To help us explore how people
and technology work together,

8
00:00:20,040 --> 00:00:23,200
we're joined by Doctor Justine
Ferrer, a senior lecturer in

9
00:00:23,200 --> 00:00:25,840
Human Resource Management in the
Deakin Business School.

10
00:00:30,320 --> 00:00:32,759
Justine, thank you so much for
joining us and welcome to the

11
00:00:32,759 --> 00:00:34,360
show.
Thank you for having me.

12
00:00:34,360 --> 00:00:37,040
I'm very excited about
being here and having this

13
00:00:37,040 --> 00:00:38,480
discussion with you today,
Gareth.

14
00:00:38,880 --> 00:00:41,560
Yeah, me too.
But before we begin, can you

15
00:00:41,560 --> 00:00:44,200
tell us a little bit about the
work that you do and the journey

16
00:00:44,200 --> 00:00:47,840
that's led you to this point?
Wow, the work that I do. So I'm

17
00:00:47,840 --> 00:00:50,720
an academic, I work at Deakin
University in the Deakin

18
00:00:50,720 --> 00:00:54,600
Business School.
I am a senior lecturer in human

19
00:00:54,600 --> 00:00:57,800
resource management.
I have been in this space for

20
00:00:57,800 --> 00:01:02,000
quite a while and I think my
passion is around HR and the

21
00:01:02,000 --> 00:01:04,239
profession, particularly the
dark side elements.

22
00:01:04,239 --> 00:01:08,480
And, as our discussion today
will entail, some of that

23
00:01:08,480 --> 00:01:11,560
considers the technology, and
that's got its own little dark

24
00:01:11,560 --> 00:01:15,520
side that has implications for
HR and for the workplace

25
00:01:15,520 --> 00:01:16,640
generally.
Yeah.

26
00:01:16,640 --> 00:01:21,080
Look, I'm sure we will go into
those implications for HR in the

27
00:01:21,080 --> 00:01:24,240
workplace. As we know, you know,
it's in the name.

28
00:01:24,240 --> 00:01:27,640
Human resources is obviously a
traditionally human centric

29
00:01:27,720 --> 00:01:29,560
field.
But as we look at the increased

30
00:01:29,560 --> 00:01:33,600
adoption of various forms of HR
tech, what would you say is

31
00:01:33,600 --> 00:01:36,560
proving to be the most difficult
part of maintaining that human

32
00:01:36,560 --> 00:01:38,720
touch?
Really great question and a

33
00:01:38,720 --> 00:01:41,960
really hard question to
answer because there is an argument

34
00:01:41,960 --> 00:01:45,360
that you know, HR is losing its
human side because of technology

35
00:01:45,360 --> 00:01:50,320
and because of its focus on
efficiencies and productivity as

36
00:01:50,320 --> 00:01:53,200
opposed to looking after the
wellbeing of the employees in

37
00:01:53,200 --> 00:01:55,400
the workplace.
And if we go back historically

38
00:01:55,440 --> 00:01:59,400
to traditional models of HR, it
was about looking after that

39
00:01:59,400 --> 00:02:02,200
wellbeing.
But HR has seen a massive shift

40
00:02:02,200 --> 00:02:05,160
and with that shift it's become
more strategic and with more

41
00:02:05,160 --> 00:02:08,479
strategic focus, it's focusing
on productivity and greater

42
00:02:08,479 --> 00:02:11,480
efficiencies.
However, when we think about HR,

43
00:02:11,600 --> 00:02:14,080
it has two notable parts to
it.

44
00:02:14,240 --> 00:02:17,760
There is the process part.
Now HR is all about looking

45
00:02:17,760 --> 00:02:20,160
after how the workplace
functions, how employees

46
00:02:20,160 --> 00:02:22,880
function, so making sure people
are coming to work, that

47
00:02:22,880 --> 00:02:25,320
they're getting paid, that they
have the right training and

48
00:02:25,320 --> 00:02:27,640
safety demands met and so on and
so forth.

49
00:02:27,760 --> 00:02:30,640
So there's a process part of
that, but there's also the human

50
00:02:30,640 --> 00:02:35,160
side of that, which, as much as
we talk about technology and

51
00:02:35,240 --> 00:02:39,280
dehumanising human resource
management, there is this

52
00:02:39,520 --> 00:02:42,960
massive part of HR that is
inherently human.

53
00:02:42,960 --> 00:02:46,560
And I'm not sure that that can
be lost just yet.

54
00:02:46,840 --> 00:02:51,360
It sounds like as well that the
function of HR has expanded and

55
00:02:51,360 --> 00:02:54,320
grown quite a lot over time and
you just mentioned there that

56
00:02:54,320 --> 00:02:57,360
there has been that shift
towards a more strategic

57
00:02:57,360 --> 00:03:00,000
approach overall.
Can you just give us a top line

58
00:03:00,000 --> 00:03:02,040
rundown of what that shift has
looked like?

59
00:03:02,360 --> 00:03:04,560
It depends who you ask to be
honest.

60
00:03:05,160 --> 00:03:08,560
So what we're seeing is, and
you look at AHRI, which is

61
00:03:08,640 --> 00:03:11,760
the Australian Human Resources
Institute, and even coming out of

62
00:03:11,760 --> 00:03:15,280
the US and the UK, they're
talking about the importance of

63
00:03:15,480 --> 00:03:19,920
HR having a strategic role on
boards and making a strategic

64
00:03:19,920 --> 00:03:22,880
contribution.
So we're thinking of CEOs:

65
00:03:22,880 --> 00:03:25,960
they're making decisions that
are impacting on the whole of

66
00:03:25,960 --> 00:03:29,880
the organisation, including the
human resource, so including the

67
00:03:29,880 --> 00:03:31,880
people.
So the argument is from a

68
00:03:31,880 --> 00:03:34,720
strategic perspective that HR
should be involved in that

69
00:03:34,720 --> 00:03:38,880
conversation because ultimately
it's the employees who are going

70
00:03:38,880 --> 00:03:42,080
to deliver on those
organisational goals and

71
00:03:42,160 --> 00:03:43,640
the objectives that they're
setting.

72
00:03:44,000 --> 00:03:47,880
So HR strategy inherently should
be linked to the business

73
00:03:47,880 --> 00:03:50,400
strategy.
So HR, when we're talking about

74
00:03:50,720 --> 00:03:53,640
HR being more strategic, it's HR
understanding what the business

75
00:03:53,640 --> 00:03:57,240
strategy is and then being able
to link their own strategy, the

76
00:03:57,240 --> 00:04:00,320
people strategy, to ensure that
the employees are delivering on

77
00:04:00,320 --> 00:04:02,840
what the organisation needs
them to deliver on.

78
00:04:03,000 --> 00:04:07,440
In that case, those various
forms of HR tech that have been

79
00:04:07,440 --> 00:04:11,080
implemented so far, are they
actually freeing up human

80
00:04:11,080 --> 00:04:14,560
resources staff to be more
strategic and more human?

81
00:04:14,560 --> 00:04:18,959
Or do they somewhat shift those
previous administrative burdens

82
00:04:18,959 --> 00:04:21,320
because obviously they would
need some kind of management and

83
00:04:21,320 --> 00:04:23,320
oversight from the person,
right?

84
00:04:23,600 --> 00:04:27,400
Yeah, absolutely.
So I was talking to a colleague

85
00:04:27,400 --> 00:04:31,360
recently and she was talking
about the impacts of AI on her

86
00:04:31,360 --> 00:04:34,360
current workforce.
And she was saying that people

87
00:04:34,360 --> 00:04:36,840
are scared that they're going to
be replaced.

88
00:04:37,320 --> 00:04:41,240
You know, that this move towards
automation in different

89
00:04:41,240 --> 00:04:43,760
ways, whether it's through AI or
through other types of

90
00:04:43,760 --> 00:04:46,200
automation, people are going to
lose their jobs.

91
00:04:46,240 --> 00:04:49,560
And she's saying, or
trying to encourage them to

92
00:04:49,560 --> 00:04:53,200
think about, well, how
should you use your time?

93
00:04:53,200 --> 00:04:56,520
How could you better use that in
order to do something that's

94
00:04:56,520 --> 00:04:59,080
more productive?
There is a case at Coles

95
00:04:59,200 --> 00:05:03,800
recently, a couple of years ago,
where they introduced SAP

96
00:05:03,800 --> 00:05:08,280
SuccessFactors, which is a whole
integrated workplace system with

97
00:05:08,280 --> 00:05:09,960
lots of data and things
like that.

98
00:05:10,000 --> 00:05:13,520
And it automated quite a
lot of their HR processes.

99
00:05:13,640 --> 00:05:18,760
But what they were able to do in
that was then reallocate people

100
00:05:18,760 --> 00:05:21,840
to do things that were more
meaningful to them.

101
00:05:22,560 --> 00:05:25,000
So it wasn't about, we've got
this, you're going to be

102
00:05:25,000 --> 00:05:27,360
replaced.
It was about giving them the

103
00:05:27,360 --> 00:05:29,920
opportunity to say, all right,
well, what can you do that's

104
00:05:29,920 --> 00:05:31,960
more meaningful to the
business?

105
00:05:32,400 --> 00:05:34,720
How can you better spend your
time?

106
00:05:35,040 --> 00:05:39,120
And I think that
sentiment, I guess that

107
00:05:39,120 --> 00:05:43,000
wider sentiment actually comes
up regarding AI in a lot of

108
00:05:43,000 --> 00:05:45,200
different things.
You know, there is obviously so

109
00:05:45,200 --> 00:05:47,440
much unknown and uncertainty
around it.

110
00:05:47,440 --> 00:05:49,440
Now, from one side, you've got people

111
00:05:49,440 --> 00:05:52,440
thinking this is my replacement.
And then the other side, you

112
00:05:52,440 --> 00:05:54,720
know, which I'm sure we'll get
to, is people thinking, this is

113
00:05:54,720 --> 00:05:57,080
my augmentation.
This is going to make me a super

114
00:05:57,080 --> 00:05:59,320
version of myself.
So there's, you know, it's going

115
00:05:59,320 --> 00:06:02,320
to be super interesting to see
how that all shakes out.

116
00:06:02,480 --> 00:06:06,280
But you mentioned something
there around, you know, the data

117
00:06:06,280 --> 00:06:10,320
and inputs from various
members of staff across various

118
00:06:10,320 --> 00:06:13,720
parts of their performance and I
guess their role within the

119
00:06:13,720 --> 00:06:16,920
business, etcetera.
Obviously, people are a little

120
00:06:16,920 --> 00:06:20,800
bit wary of being treated as
data.

121
00:06:21,000 --> 00:06:23,000
Is there a risk that the
employees will feel like they're

122
00:06:23,000 --> 00:06:26,360
just being reduced to a series
of data points and metrics

123
00:06:26,720 --> 00:06:28,960
rather than the human beings
that they are?

124
00:06:29,200 --> 00:06:33,000
And does something like that
potentially get worse by

125
00:06:33,000 --> 00:06:37,360
necessity, simply because the
larger an organisation gets, the

126
00:06:37,360 --> 00:06:39,680
harder the task of managing all
the staff is?

127
00:06:40,240 --> 00:06:43,960
Yeah, so absolutely, the
dehumanisation and the

128
00:06:43,960 --> 00:06:47,880
reductionism that comes with
people thinking they've just

129
00:06:47,880 --> 00:06:51,320
been reduced to a number, and
the consequential impacts of

130
00:06:51,320 --> 00:06:55,520
that for employee wellbeing are
substantial, you know.

131
00:06:55,520 --> 00:06:59,920
But I think it's more of
a cultural discussion as an

132
00:06:59,920 --> 00:07:03,280
organisation as to why we're
doing this, why we need to do

133
00:07:03,280 --> 00:07:07,640
this, why this is important and
where those touch points are for

134
00:07:07,640 --> 00:07:09,880
human engagement and human
interaction.

135
00:07:10,120 --> 00:07:13,520
And we see this a lot,
Gareth, now with working from

136
00:07:13,520 --> 00:07:17,800
home. Where are those
emotional touch

137
00:07:17,800 --> 00:07:20,240
points that you have with
someone, where we can just sort

138
00:07:20,240 --> 00:07:24,080
of gasbag and say
whatever? It's the same in this

139
00:07:24,080 --> 00:07:28,600
translation, or the use of
metrics to drive a lot of our

140
00:07:29,200 --> 00:07:32,440
decision making and a lot
of what happens in organisations.

141
00:07:32,440 --> 00:07:35,480
And you're right, data has
become a commodity now and

142
00:07:35,520 --> 00:07:38,080
we're seeing it substantially
more and more.

143
00:07:38,240 --> 00:07:42,080
I was reading a study recently
about applicants applying for

144
00:07:42,080 --> 00:07:46,120
jobs, and they were faced
with automation the entire way

145
00:07:46,160 --> 00:07:48,560
through.
They didn't actually engage with

146
00:07:48,640 --> 00:07:51,080
a person, a human until the
very, very end.

147
00:07:51,360 --> 00:07:53,760
And it's like, well, what is the
consequence for that, for

148
00:07:53,760 --> 00:07:57,200
building your employer brand,
for how you feel about the

149
00:07:57,200 --> 00:08:00,280
organisation?
So there is, there is a big

150
00:08:00,280 --> 00:08:02,720
story around that.
And I don't think we've really

151
00:08:03,440 --> 00:08:05,720
started to unpack what that
looks like.

152
00:08:05,720 --> 00:08:10,600
But there definitely is that
reductionism, that people are being

153
00:08:10,720 --> 00:08:16,000
reduced to a number, and also
the idea that humans are complex

154
00:08:16,280 --> 00:08:20,280
individuals. And can they be
reduced to a number, or should

155
00:08:20,280 --> 00:08:23,000
they be reduced to a number, is
also the big question there.

156
00:08:23,360 --> 00:08:26,440
Yeah, absolutely.
And I think that part of

157
00:08:26,440 --> 00:08:29,560
everything that you've
explained there in my mind, I

158
00:08:29,560 --> 00:08:32,559
can put a lot of it back to the
kind of unknown.

159
00:08:32,679 --> 00:08:35,679
And you said something there
around managing your employer

160
00:08:35,679 --> 00:08:37,360
brand.
If the impression that you're

161
00:08:37,360 --> 00:08:41,400
giving is, you've automated
everything and you've got all

162
00:08:41,400 --> 00:08:44,320
these applicants applying for a
role or an interest in your

163
00:08:44,320 --> 00:08:47,080
company being faced with these
automated systems that don't

164
00:08:47,080 --> 00:08:50,640
treat them like a human,
regardless if you're doing it

165
00:08:50,640 --> 00:08:53,760
out of necessity, like that
almost doesn't really matter to

166
00:08:53,760 --> 00:08:56,560
that person on the outside.
They only know what they're

167
00:08:56,560 --> 00:08:58,880
faced with.
So yeah, it'll be really

168
00:08:58,880 --> 00:09:02,520
interesting to see how that all
falls out around how people

169
00:09:02,520 --> 00:09:05,800
manage their employer brand.
But you, you also mentioned

170
00:09:05,800 --> 00:09:08,760
there around work from home.
One thing that I've seen and

171
00:09:08,760 --> 00:09:11,800
read a little bit about
recently was the implementation

172
00:09:11,800 --> 00:09:16,000
of surveillance tools from, you
know, organisations of various

173
00:09:16,000 --> 00:09:19,240
types, whether they're kind of
keystroke loggers or you know, I

174
00:09:19,240 --> 00:09:22,240
was reading something earlier
today around some company

175
00:09:22,280 --> 00:09:25,240
activated some software that
recorded through the microphones

176
00:09:25,240 --> 00:09:28,560
of their staff's computers
while they're working

177
00:09:28,560 --> 00:09:30,080
from home.
So there's obviously all these,

178
00:09:30,560 --> 00:09:33,600
you know, in my mind, quite
dodgy things going on and I can

179
00:09:33,600 --> 00:09:37,200
only guess that that adds to
that cynicism and that distrust

180
00:09:37,200 --> 00:09:40,840
amongst workers who feel like
they're being monitored due to

181
00:09:40,840 --> 00:09:45,160
these various tech solutions.
That said, if that perception's

182
00:09:45,160 --> 00:09:47,760
out there, what are the
implications of that for

183
00:09:47,760 --> 00:09:50,320
workplace culture?
How would that be addressed?

184
00:09:50,320 --> 00:09:54,880
You know, using more tools to try
and spend more time working on

185
00:09:54,880 --> 00:09:58,040
culture feels like what's
breaking that cultural

186
00:09:58,040 --> 00:10:01,240
perception in the first place.
Well, yeah, I think

187
00:10:01,240 --> 00:10:04,800
with surveillance and when we
talk about surveillance tools,

188
00:10:04,840 --> 00:10:08,520
we automatically go to the
negative surveillance, you know

189
00:10:08,520 --> 00:10:11,800
that I'm sneaking around as an
employer and I'm checking up on

190
00:10:11,800 --> 00:10:14,000
you.
Now we know through COVID there

191
00:10:14,000 --> 00:10:17,840
was a substantial increase in
organisations accessing those

192
00:10:18,000 --> 00:10:20,440
surveillance tools that they
were using online.

193
00:10:20,440 --> 00:10:24,600
But we also surveil people
with cameras in the workplace.

194
00:10:24,600 --> 00:10:28,320
We can also surveil them by
requiring them to do drug tests,

195
00:10:28,880 --> 00:10:31,840
you know, So there's a whole
range of different surveillance

196
00:10:32,240 --> 00:10:35,760
that we do in organisations.
And you know, it's often sold to

197
00:10:35,800 --> 00:10:40,840
us as surveillance as care, or is
it surveillance as coercion,

198
00:10:41,440 --> 00:10:43,960
right.
You know, so this is where

199
00:10:44,240 --> 00:10:47,680
it becomes quite interesting
to me, to be honest, because how

200
00:10:47,680 --> 00:10:51,920
it's sold to someone is
how it's implicated into the

201
00:10:51,920 --> 00:10:54,440
culture.
Yeah, that's, that's really

202
00:10:54,440 --> 00:10:56,480
interesting what you said.
There's surveillance as care.

203
00:10:56,480 --> 00:10:59,800
I can imagine you could sell
that a lot easier if it was,

204
00:10:59,800 --> 00:11:04,440
say, cameras looking over the
open plan office type thing, you

205
00:11:04,440 --> 00:11:07,200
know, versus telling someone
that you're going to be logging

206
00:11:07,200 --> 00:11:10,800
their keystrokes and microphone
in the privacy of their own

207
00:11:10,800 --> 00:11:13,960
home, you know?
And there's also surveillance

208
00:11:13,960 --> 00:11:17,520
with health monitoring where
they want you to wear a device

209
00:11:17,520 --> 00:11:19,800
to check, Oh, really?
So your blood

210
00:11:19,800 --> 00:11:21,240
pressure's not going up.
I know that.

211
00:11:21,240 --> 00:11:24,200
I think that there was, and I'm
not 100% sure on this example,

212
00:11:24,440 --> 00:11:28,200
there was a factory in China
where they were using some

213
00:11:28,280 --> 00:11:32,520
surveillance technology to sort
of check people, using facial

214
00:11:32,520 --> 00:11:35,840
recognition, to see if they were
becoming fatigued, and then being able

215
00:11:35,840 --> 00:11:38,240
to take them off the line and
replace them with

216
00:11:38,240 --> 00:11:40,960
someone else.
So they were using it for good.

217
00:11:41,160 --> 00:11:43,400
But were they though?
Because then you go on the other

218
00:11:43,400 --> 00:11:46,960
extreme for those people in a
call centre and they have to log

219
00:11:46,960 --> 00:11:49,880
every time they go to the
bathroom, and how many

220
00:11:49,880 --> 00:11:52,680
minutes. You've got like 3
1/2 minutes to go to the

221
00:11:52,680 --> 00:11:54,440
bathroom.
And you can only have so many

222
00:11:54,440 --> 00:11:57,920
bathroom breaks.
So it's problematic.

223
00:11:58,200 --> 00:12:00,840
Yeah, look, you know, we've all
heard stories around, you know,

224
00:12:00,960 --> 00:12:03,240
Amazon, how they're managing
their warehouse staff.

225
00:12:03,240 --> 00:12:05,280
That's just one, one example off
the top of my head.

226
00:12:05,280 --> 00:12:08,720
And that sounds absolutely not
surveillance as care.

227
00:12:08,720 --> 00:12:12,280
That's more surveillance as
crushing of the soul.

228
00:12:12,680 --> 00:12:18,000
But we've also looked at
the amount of data that is being

229
00:12:18,000 --> 00:12:21,680
collected, potentially, on
people within and outside the

230
00:12:21,680 --> 00:12:24,080
workplace.
What are the biggest challenges

231
00:12:24,080 --> 00:12:28,360
and risks that human resources
departments face when collecting

232
00:12:28,360 --> 00:12:30,280
and analysing so much of this
data?

233
00:12:31,000 --> 00:12:34,040
Probably the biggest one is
privacy, and you know, that's not

234
00:12:34,040 --> 00:12:37,400
just an HR problem, that's an
organisation-wide problem, and

235
00:12:37,400 --> 00:12:40,680
just ensuring that the
employee data, we're talking

236
00:12:40,680 --> 00:12:44,920
about data, is safe and
it's protected by the firewalls

237
00:12:44,920 --> 00:12:48,880
and whatever the
organisation has to protect it.

238
00:12:49,080 --> 00:12:52,960
However, we do know that there
can be breaches and those

239
00:12:52,960 --> 00:12:57,360
breaches may be unintentional or
they may be intentional or it

240
00:12:57,360 --> 00:13:00,920
might be a third party breach.
So it's then about the

241
00:13:00,920 --> 00:13:04,560
organisation being on top of it and
ensuring that they've got the

242
00:13:04,560 --> 00:13:07,080
systems in place to protect the
data.

243
00:13:07,560 --> 00:13:11,480
Me personally, I've been
in a couple of those breaches so

244
00:13:11,480 --> 00:13:14,160
far, like, you know, where
various things have been hacked

245
00:13:14,160 --> 00:13:15,760
and it's like, oh, your licence
is gone.

246
00:13:15,760 --> 00:13:18,280
I was like, oh, awesome.
But with those concerns

247
00:13:18,280 --> 00:13:22,680
around privacy, how do you, I
mean, how do organisations

248
00:13:22,680 --> 00:13:26,440
beyond saying that it's
stored well, and you know, you

249
00:13:26,440 --> 00:13:29,800
see that little padlock in the
browser thing when you're

250
00:13:29,800 --> 00:13:33,600
handing over your stuff, like
how do they manage it securely

251
00:13:33,600 --> 00:13:36,080
enough to build the trust from
employees and beyond?

252
00:13:36,120 --> 00:13:39,280
I think it comes back to
communication and that

253
00:13:39,280 --> 00:13:44,400
cultural piece to say that we
have invested X or we have

254
00:13:44,400 --> 00:13:48,080
invested in this company to do
this, and their reputation

255
00:13:48,080 --> 00:13:50,640
precedes them in this particular
space.

256
00:13:50,640 --> 00:13:55,120
Or we have the big banks; they
hire those hackers and they

257
00:13:55,120 --> 00:14:00,120
set up the, whoever hacks it
gets $20,000, you know, and

258
00:14:00,120 --> 00:14:03,600
it goes out to the hackers to
hack a particular system, you

259
00:14:03,600 --> 00:14:06,440
know, so they're
actively doing things.

260
00:14:06,440 --> 00:14:09,480
And I know some companies are
hiring hackers to have

261
00:14:09,800 --> 00:14:13,920
in their company to be hacking
everything so that they

262
00:14:13,920 --> 00:14:17,200
can identify where some of those
pitfalls are and where those

263
00:14:17,360 --> 00:14:21,640
sort of back entries might be, or
where they'd be able to sort of get

264
00:14:21,640 --> 00:14:22,720
in.
But it does come down to

265
00:14:22,720 --> 00:14:26,480
communication.
Let's go back to some of the

266
00:14:26,480 --> 00:14:30,840
reasons why HR is starting to
implement various forms of

267
00:14:31,360 --> 00:14:35,120
technology for efficiency and
the scale of what they're up

268
00:14:35,120 --> 00:14:38,000
against.
I often, you know, see this

269
00:14:38,000 --> 00:14:41,680
digital deluge that HR
departments currently face.

270
00:14:42,000 --> 00:14:45,800
Most often from what I've seen,
that refers to, say, you put a job

271
00:14:45,800 --> 00:14:48,960
advert out in the world
previously, I don't know, you

272
00:14:48,960 --> 00:14:52,240
might have got 50 applicants,
now you're getting 800.

273
00:14:52,680 --> 00:14:56,680
And obviously that's just a
ridiculous amount for somebody

274
00:14:56,680 --> 00:15:00,280
to try and sort through
themselves, which makes

275
00:15:00,280 --> 00:15:03,720
those tools such a necessity.
If you've got these gigantic

276
00:15:04,240 --> 00:15:07,600
corporates, right, let's say
they're probably most likely to

277
00:15:07,600 --> 00:15:11,280
be using these systems in the
name of efficiency and

278
00:15:11,280 --> 00:15:14,520
managing that scale.
But when it comes to smaller

279
00:15:14,680 --> 00:15:19,840
businesses, do they have an
advantage in potentially being

280
00:15:19,840 --> 00:15:22,800
behind the cutting edge when it
comes to these tools?

281
00:15:23,320 --> 00:15:25,440
And, you know, is there an
advantage there in

282
00:15:25,440 --> 00:15:28,760
maintaining some of that human
face and touch?

283
00:15:28,920 --> 00:15:32,240
I think it's almost a
double-edged sword, Gareth. For

284
00:15:32,240 --> 00:15:35,640
small businesses, they can't
afford to invest in the

285
00:15:35,640 --> 00:15:38,360
technologies that the big
companies are using.

286
00:15:38,360 --> 00:15:39,960
So that's one of the big
things.

287
00:15:39,960 --> 00:15:44,280
So their reliance will be on
different types of systems.

288
00:15:44,400 --> 00:15:46,840
So they still have some data and
they'll still be collecting it,

289
00:15:46,840 --> 00:15:50,200
but how they go about collecting
it might be quite

290
00:15:50,200 --> 00:15:53,920
different to a big organisation.
However, as you say, it does

291
00:15:53,920 --> 00:15:57,520
allow them to be more human, but
they have more hands-on work that

292
00:15:57,520 --> 00:15:59,400
they have to be more involved
in.

293
00:15:59,680 --> 00:16:02,080
Now, I'm not sure if you know

294
00:16:02,360 --> 00:16:07,560
any HR people that are working
in a small firm where there's

295
00:16:07,560 --> 00:16:11,160
like one or two of them. Now, I'm
sure they're wishing that they

296
00:16:11,160 --> 00:16:15,000
had more automation because they
don't have enough time even to

297
00:16:15,000 --> 00:16:17,840
be human because they're
jumping from one ship to the

298
00:16:17,840 --> 00:16:19,920
next.
They're going from a performance

299
00:16:19,920 --> 00:16:23,600
review to a workplace dispute to
some other, like they're jumping

300
00:16:23,600 --> 00:16:26,800
around.
So for them, I think using tools

301
00:16:26,800 --> 00:16:29,680
to streamline position
descriptions or something like

302
00:16:29,680 --> 00:16:33,480
that takes some of that menial
stuff out of it for them,

303
00:16:33,480 --> 00:16:36,400
so then they potentially have a
little bit more time.

304
00:16:36,640 --> 00:16:38,760
Yeah, yeah.
And I think you've kind

305
00:16:38,760 --> 00:16:42,320
of laid out a few things there,
which a lot of the time when

306
00:16:42,320 --> 00:16:47,240
people hear human resources,
their mind goes to: this

307
00:16:47,240 --> 00:16:49,280
is kind of the gatekeeper of a
job.

308
00:16:49,640 --> 00:16:53,520
Do you know what I mean?
But the role of HR departments

309
00:16:53,520 --> 00:16:56,840
is so much wider than that, as
you've kind of just touched on

310
00:16:56,840 --> 00:16:58,680
right now.
Can you just give us a bit of a

311
00:16:58,680 --> 00:17:01,840
rundown of the wider world of
HR, what it looks like

312
00:17:01,840 --> 00:17:03,880
currently?
Well, going back to an earlier

313
00:17:03,880 --> 00:17:06,960
point I made about that process
versus the human side.

314
00:17:06,960 --> 00:17:09,720
Now when we talk about the
process side that's around

315
00:17:09,720 --> 00:17:13,800
payroll, that's around things
like, you want holiday pay,

316
00:17:14,040 --> 00:17:17,000
it's in the system. Like
getting all those systematic

317
00:17:17,000 --> 00:17:20,040
things that can be automated,
automated. Like, there's a lot of

318
00:17:20,040 --> 00:17:22,640
processes that we have to follow
if it's about policy

319
00:17:22,640 --> 00:17:25,160
development.
But then the human element is

320
00:17:25,160 --> 00:17:27,040
the other part of it.
And that's where we're having

321
00:17:27,040 --> 00:17:30,960
conversations. It might be dealing
with a dispute or workplace

322
00:17:30,960 --> 00:17:33,560
investigation.
It might be performance management

323
00:17:33,560 --> 00:17:35,800
conversations.
Now performance management is a

324
00:17:35,800 --> 00:17:40,520
tricky one because some of it
may be automated or using tech:

325
00:17:40,800 --> 00:17:44,280
you go in the system, you
fill in your goals, that type of

326
00:17:44,280 --> 00:17:46,720
thing.
But then it's usually, I say

327
00:17:46,720 --> 00:17:49,160
usually loosely, I'd like to
think everyone does it.

328
00:17:49,360 --> 00:17:50,960
There is a human discussion with
that.

329
00:17:50,960 --> 00:17:53,920
There's human touch points where
we sort of say, how is it

330
00:17:53,920 --> 00:17:58,440
actually really going?
And that's critical to

331
00:17:58,480 --> 00:18:01,560
ensure that the employees are
finding meaning in what they're

332
00:18:01,560 --> 00:18:03,560
doing.
So I can go in and fill it in, but

333
00:18:03,560 --> 00:18:07,280
if no one's looking at it or no
one cares, then how is that

334
00:18:07,280 --> 00:18:10,040
meaningful for me?
And that, you know, leads onto

335
00:18:10,040 --> 00:18:13,040
all other things like low job
satisfaction, low commitment,

336
00:18:13,040 --> 00:18:14,880
engagement, productivity, and so
on.

337
00:18:15,600 --> 00:18:18,800
Yeah, look, it sounds like,
as you said earlier too, that

338
00:18:18,800 --> 00:18:23,120
that human aspect still needs to
be so strong in everything

339
00:18:23,120 --> 00:18:26,840
because humans, as we all
know, they're not machines;

340
00:18:26,840 --> 00:18:31,880
they're incredibly complicated
beings with absolutely unique

341
00:18:31,880 --> 00:18:36,040
sets of needs and wants.
But while we are talking around

342
00:18:36,280 --> 00:18:40,280
AI and these new tools, this
is like quite a big, huge

343
00:18:40,280 --> 00:18:42,840
discussion with a lot of
areas that this can go into.

344
00:18:43,760 --> 00:18:47,960
One of the things that I guess
is out in the discourse is

345
00:18:47,960 --> 00:18:53,600
around AI and bias in hiring and
management of teams and

346
00:18:53,600 --> 00:18:55,400
people.
Some people say it helps

347
00:18:55,400 --> 00:18:59,360
eliminate these biases, but
others say it introduces new, even

348
00:18:59,360 --> 00:19:02,200
harder-to-spot ones.
What's your take on this?

349
00:19:02,320 --> 00:19:05,240
Yeah.
Well, just taking a step back to

350
00:19:05,560 --> 00:19:09,800
algorithms and algorithmic bias
just in the data, I think that's

351
00:19:09,800 --> 00:19:15,280
a part of this discussion
before AI even was part of the

352
00:19:15,520 --> 00:19:19,480
bias discussion because it's not
operating in isolation.

353
00:19:19,480 --> 00:19:22,920
Whether it's an algorithm or
whether it's an AI, someone's

354
00:19:22,920 --> 00:19:25,960
inputting something.
Yes, it has the ability to

355
00:19:26,000 --> 00:19:28,840
address bias, but it depends
what's going into it at the

356
00:19:28,840 --> 00:19:31,120
start.
Now we can't make assumptions

357
00:19:31,400 --> 00:19:36,560
that the information and the
positioning that's put in at

358
00:19:36,560 --> 00:19:40,080
the start is completely without
bias because bias is going to

359
00:19:40,080 --> 00:19:43,120
be, whether it's unconscious
bias or not, it's going to be

360
00:19:43,120 --> 00:19:46,360
inherent in everything, as
well as errors.

361
00:19:46,760 --> 00:19:50,560
So it has the ability to address
so many things, including bias.

362
00:19:51,160 --> 00:19:54,280
However, from an HR perspective,
they just have to be wary and

363
00:19:54,280 --> 00:19:58,280
questioning and checking because
the biggest problem I think with

364
00:19:58,400 --> 00:20:01,920
a lot of HR systems and
particularly HR tech is a set

365
00:20:01,920 --> 00:20:06,200
and forget mentality, where it's
like, I set it, I do it, here is

366
00:20:06,200 --> 00:20:09,760
a process irrespective of what
it is, and then I forget about

367
00:20:09,760 --> 00:20:11,040
it.
Well, it's like, well, in

368
00:20:11,040 --> 00:20:13,360
actual fact there is an evaluative
part of that.

369
00:20:13,360 --> 00:20:16,640
We have to go back and check: is
what's coming out the right

370
00:20:16,760 --> 00:20:19,400
information? Do we need to go
back and check what's going in,

371
00:20:19,560 --> 00:20:22,680
or what we're saying, or what we're
asking the AI to do for us?

372
00:20:22,680 --> 00:20:26,960
So yes, are there new biases
coming out, potentially?

373
00:20:27,200 --> 00:20:29,240
I don't know, it's a scary
thought.
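
A quick aside for the show notes: the evaluative step Justine describes, checking whether what's coming out of a hiring system is right, is often operationalised in recruitment with the four-fifths (80%) rule used in adverse-impact analysis, where each group's selection rate should be at least 80% of the highest group's rate. Below is a minimal Python sketch of that check; the group names and counts are hypothetical, not figures from the episode.

# Adverse-impact check via the four-fifths (80%) rule:
# each group's selection rate should be at least 80% of
# the highest group's rate. All counts are made up.
hiring = {
    "group_a": (400, 60),  # (applicants, hires)
    "group_b": (300, 24),
}

# Selection rate per group, then compare against the best-performing group.
rates = {group: hires / apps for group, (apps, hires) in hiring.items()}
benchmark = max(rates.values())

for group, rate in rates.items():
    ratio = rate / benchmark
    verdict = "OK" if ratio >= 0.8 else "possible adverse impact"
    print(f"{group}: selection rate {rate:.1%}, impact ratio {ratio:.2f} -> {verdict}")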

374
00:20:29,600 --> 00:20:32,320
You said something there around,
you know, you've still got to

375
00:20:32,320 --> 00:20:35,600
have someone go in and kind of
mine through the data and find

376
00:20:35,600 --> 00:20:38,280
the right outputs and
summarise what it's presenting,

377
00:20:38,280 --> 00:20:41,720
what it's finding. Is that quite
a big skill for people to learn?

378
00:20:41,720 --> 00:20:46,040
And does that present a new
problem, if there is that

379
00:20:46,040 --> 00:20:48,280
learning curve to analysing that
stuff?

380
00:20:48,280 --> 00:20:50,040
I mean, look, I don't know how
complicated it is.

381
00:20:50,040 --> 00:20:52,160
I struggle to look in Google
Analytics.

382
00:20:52,160 --> 00:20:53,640
That's how bad I am at
it.

383
00:20:53,960 --> 00:20:56,880
But if people are using
these tools to process all this

384
00:20:56,880 --> 00:20:59,960
data and then they've got to
spend all this time looking at

385
00:20:59,960 --> 00:21:02,280
it and finding the
conclusions that they're reaching

386
00:21:02,280 --> 00:21:04,720
and outputting something.
Is there a risk that people

387
00:21:04,720 --> 00:21:08,680
could become too reliant on tech
to solve all of these problems,

388
00:21:09,400 --> 00:21:11,880
whether they're forced to or
whether they choose to?

389
00:21:12,480 --> 00:21:16,000
I think there is a potential
that people will become too

390
00:21:16,000 --> 00:21:20,160
reliant on it because if I don't
understand what's coming out of

391
00:21:20,160 --> 00:21:24,400
it, then how do I know if what's
coming out is wrong or right or

392
00:21:24,720 --> 00:21:28,720
correct or biased, or, you know?
So there is a certain level of

393
00:21:28,720 --> 00:21:31,920
skill required just to
understand the data.

394
00:21:32,240 --> 00:21:35,360
And we're seeing more
sophisticated HR systems now

395
00:21:35,360 --> 00:21:38,440
where all the data is in the
back end and you can go in and

396
00:21:38,440 --> 00:21:41,720
you can ask it:
tell me about the turnover

397
00:21:41,720 --> 00:21:45,440
trends for the next, and it
will spit out the data and give

398
00:21:45,440 --> 00:21:48,640
you an answer to the question.
But then the problem becomes

399
00:21:48,760 --> 00:21:53,400
a couple of problems. One: is the data
correct? One error in a line of

400
00:21:53,400 --> 00:21:57,200
code can break it all, and
we don't know unless

401
00:21:57,200 --> 00:22:00,360
someone's checking it.
And two, how do I know what

402
00:22:00,360 --> 00:22:03,920
questions to ask?
So we hear a lot about,

403
00:22:04,040 --> 00:22:07,680
you know, ChatGPT and all
those generative AIs and

404
00:22:07,760 --> 00:22:10,320
the importance of how we
write prompts.

405
00:22:10,360 --> 00:22:12,680
It's going to be the
same thing for this.

406
00:22:12,800 --> 00:22:16,320
It's about what type of
questions do we need to ask the

407
00:22:16,320 --> 00:22:20,040
AI in order to draw out what we
need from an organisational

408
00:22:20,040 --> 00:22:22,480
perspective.
But at the other end

409
00:22:22,480 --> 00:22:25,920
of that, or the other start of
that, is have we collected the

410
00:22:25,920 --> 00:22:28,280
right data in the first place?
Yeah.

411
00:22:28,720 --> 00:22:31,680
And then being able to get the
right information out of it.

412
00:22:31,680 --> 00:22:35,480
So yes, there is an opportunity
to become over-reliant.

413
00:22:35,760 --> 00:22:39,280
Secondly, the skills required
are analytical skills.

414
00:22:39,360 --> 00:22:43,600
And even when I talk to
different people in HR, HR

415
00:22:43,600 --> 00:22:46,480
managers, they're looking for
particular analytical skills for

416
00:22:46,480 --> 00:22:49,680
new people coming in, people who
can analyse some data, who can

417
00:22:49,680 --> 00:22:54,080
read it and can interpret it, and
then use it to make

418
00:22:54,080 --> 00:22:56,520
data-driven decisions.
That makes total sense.
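
For listeners curious about the turnover-trends example above: the headline figure such a system reports usually reduces to a standard formula, separations divided by average headcount for the period. A minimal Python sketch with made-up monthly figures, not data from the episode:

# Monthly turnover rate = separations / average headcount.
# All figures below are invented for illustration.
months = [
    # (month, headcount at start, headcount at end, separations)
    ("Jan", 210, 204, 9),
    ("Feb", 204, 200, 7),
    ("Mar", 200, 205, 4),
]

for month, start, end, separations in months:
    average_headcount = (start + end) / 2
    rate = separations / average_headcount
    print(f"{month}: turnover {rate:.1%}")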

419
00:22:56,760 --> 00:22:59,160
There's two things I want to
explore from that. What other new

420
00:22:59,160 --> 00:23:01,920
skills do people in the field
need to learn?

421
00:23:02,600 --> 00:23:06,320
And then also, does it still
come down to humans to predict

422
00:23:06,840 --> 00:23:09,000
where and how those trends might
continue?

423
00:23:09,200 --> 00:23:13,320
OK, the new skills: absolutely
analytical skills, that prompt

424
00:23:13,320 --> 00:23:16,480
engineering like that,
getting the right prompts,

425
00:23:16,480 --> 00:23:19,480
critical thinking skills, I
think it's always been on the

426
00:23:19,720 --> 00:23:23,120
cards pretty much for everyone,
but for HR particularly: all

427
00:23:23,120 --> 00:23:26,760
right, how can I look
at this in a different way, not

428
00:23:26,760 --> 00:23:29,400
just take for granted what's
being spit out.

429
00:23:30,040 --> 00:23:32,720
So just a little bit of those
broader type of skills.

430
00:23:32,840 --> 00:23:37,400
We are seeing increasingly more
sophisticated autonomous

431
00:23:37,440 --> 00:23:42,560
programmes that can actually
tell you, like, the trend or

432
00:23:42,560 --> 00:23:46,160
give you the actual answer.
So I don't know if I answered

433
00:23:46,160 --> 00:23:47,560
your second question quite
right.

434
00:23:47,560 --> 00:23:50,440
No, no, I think that
actually could lead us into

435
00:23:50,440 --> 00:23:53,800
another question, which is
around those tools as they get

436
00:23:53,800 --> 00:23:56,560
more autonomous.
When I've played this entire

437
00:23:56,560 --> 00:24:00,360
scenario out in my own mind, the
best case scenario I can imagine

438
00:24:00,360 --> 00:24:02,760
is everything gets so
autonomous that it's just kind

439
00:24:02,760 --> 00:24:05,720
of human to human conversations
again, and all the

440
00:24:05,720 --> 00:24:08,960
administrative data driven stuff
is just running in the

441
00:24:08,960 --> 00:24:12,040
background.
Is there that potential, or

442
00:24:12,040 --> 00:24:17,640
does that full autonomy over so
much of it kind of tap into

443
00:24:17,840 --> 00:24:20,840
everybody's worst fear around
AI, which is that it's a

444
00:24:20,840 --> 00:24:23,880
human replacer?
It depends who you're

445
00:24:23,880 --> 00:24:26,760
reading, because everyone's
got a real different view on

446
00:24:26,760 --> 00:24:29,040
this.
Like I know Elon Musk has

447
00:24:29,040 --> 00:24:31,320
come out and said AI is going to
be fully developed and

448
00:24:31,320 --> 00:24:33,520
autonomous in this
particular year.

449
00:24:33,680 --> 00:24:36,160
And then others are saying,
well, we've pretty much hit the

450
00:24:36,440 --> 00:24:38,680
top of where we'll hit and
we'll sort of stagnate for a

451
00:24:38,680 --> 00:24:41,160
little bit at where we are.
I get the point.

452
00:24:41,160 --> 00:24:42,920
Are we being replaced by the
machine?

453
00:24:43,720 --> 00:24:47,720
And this is the biggest
question, and I think for HR this

454
00:24:47,720 --> 00:24:51,000
is absolutely significant
because HR as a function in the

455
00:24:51,000 --> 00:24:54,040
organisation was probably one of the
first adopters to start

456
00:24:54,120 --> 00:24:56,160
implementing different types of
technology.

457
00:24:56,160 --> 00:25:00,040
And then seeing the value of
what AI can actually provide.

458
00:25:00,120 --> 00:25:03,080
It's like, well, is it going to
replace us?

459
00:25:03,120 --> 00:25:09,080
Are we going to lose our jobs?
It is scary to not know where it's

460
00:25:09,080 --> 00:25:12,280
going to go, where it can go.
But one of those topics that

461
00:25:12,280 --> 00:25:15,440
has just come up there is AI
and HR decisions just in

462
00:25:15,440 --> 00:25:19,600
general, as businesses face
increasing scrutiny and calls

463
00:25:19,600 --> 00:25:23,320
for regulation across this
stuff, what procedures and

464
00:25:23,320 --> 00:25:28,080
protocols are they implementing, or
might they potentially implement,

465
00:25:28,080 --> 00:25:30,160
to address this?
I think they just have to have a

466
00:25:30,160 --> 00:25:33,520
clear framework, maybe an
ethical framework, I'm not sure,

467
00:25:33,520 --> 00:25:36,680
just to say what, as an
organisation, we're going to

468
00:25:36,680 --> 00:25:40,640
tolerate and what we won't, how
we can use it and how we can't.

469
00:25:41,040 --> 00:25:44,400
Until we get some national
standards about how it's used or

470
00:25:44,400 --> 00:25:46,800
how we can use it,
it's really hard for

471
00:25:46,800 --> 00:25:51,640
organisations to know what the
general consensus is.

472
00:25:51,800 --> 00:25:54,960
So I think then if we are
talking regulation, something

473
00:25:54,960 --> 00:26:00,560
I've seen lately is these
current US legal cases around, I

474
00:26:00,560 --> 00:26:03,640
think it was one guy who has
launched some, and, what do

475
00:26:03,640 --> 00:26:06,520
they call it, like a suit or
something over there, because he

476
00:26:06,520 --> 00:26:09,480
got rejected by AI from, I don't
know, let's say a couple of 100

477
00:26:09,720 --> 00:26:13,000
roles, and he thinks it's shit.
But it might be, it might not

478
00:26:13,000 --> 00:26:16,640
be, but it doesn't matter.
Do cases like this kind of set a

479
00:26:16,640 --> 00:26:18,520
precedent?
Who gets the final word in

480
00:26:18,520 --> 00:26:20,840
something like that?
And depending on the outcome of

481
00:26:20,840 --> 00:26:24,160
that case, how could it
potentially change how these

482
00:26:24,160 --> 00:26:26,920
systems are used by all
companies moving forward?

483
00:26:27,360 --> 00:26:28,880
Well, that case is really
interesting.

484
00:26:28,880 --> 00:26:31,480
So he's suing Workday.
Workday.

485
00:26:31,520 --> 00:26:33,680
That's the one.
To say that Workday has

486
00:26:33,680 --> 00:26:36,560
discriminated against him
because of his age and has

487
00:26:36,560 --> 00:26:40,120
stopped him even getting through
to interview stage on any of

488
00:26:40,120 --> 00:26:43,920
it based on the algorithm or the
AI that they've used.

489
00:26:43,920 --> 00:26:48,960
And in reading the cases, it's
really quite interesting in who

490
00:26:48,960 --> 00:26:51,560
has that onus of
responsibility, the employing

491
00:26:51,560 --> 00:26:55,000
organisation, or is it the
company who has the technology,

492
00:26:55,160 --> 00:26:57,800
who's selling the technology?
And that's who he was.

493
00:26:57,800 --> 00:27:01,360
He was suing the technology
company to say that all the

494
00:27:01,360 --> 00:27:05,360
organisations I went to that
used your technology were

495
00:27:05,360 --> 00:27:06,840
discriminating against me, I
think.

496
00:27:06,880 --> 00:27:09,960
I think that's what I read.
No, that sounds about right

497
00:27:10,040 --> 00:27:11,520
from what I've taken from the
case.

498
00:27:11,520 --> 00:27:13,840
But that, you know, that brings
back an interesting parallel.

499
00:27:13,840 --> 00:27:17,120
Like, if someone commits a
crime through the Internet, you

500
00:27:17,120 --> 00:27:19,760
know, who's to blame?
Is it the person?

501
00:27:19,760 --> 00:27:21,760
Is it the Internet service
provider?

502
00:27:21,960 --> 00:27:24,480
It's just kind of like
where does the finger point at

503
00:27:24,480 --> 00:27:26,280
the end of the day for the
blame?

504
00:27:26,280 --> 00:27:29,840
But you know, a system like
that, surely the companies have

505
00:27:29,840 --> 00:27:33,920
some control over, you know, what
selections and inputs that they

506
00:27:34,200 --> 00:27:37,040
want it to filter, right?
But you would think so because

507
00:27:37,040 --> 00:27:40,600
that's what we have been
programmed to think in terms of.

508
00:27:40,880 --> 00:27:45,840
All right, I'm buying an off-
the-shelf or a bespoke programme

509
00:27:46,040 --> 00:27:48,960
to help me with my recruitment
as a company.

510
00:27:49,000 --> 00:27:52,840
I need to tell that programme
what it is I want as a company,

511
00:27:52,840 --> 00:27:55,760
what I'm looking for, what our
parameters are.

512
00:27:55,840 --> 00:27:59,320
I'm inputting that before the
system actually does the work it

513
00:27:59,320 --> 00:28:00,880
needs to do.
Yeah, no, totally.

514
00:28:00,880 --> 00:28:04,600
So I mean, it feels pretty open
and shut to me, but I don't

515
00:28:04,600 --> 00:28:07,600
know how the
US legal system will shake

516
00:28:07,600 --> 00:28:09,960
something like that out.
But you mentioned there, there's

517
00:28:09,960 --> 00:28:13,880
still a person providing the
inputs and I guess the framework

518
00:28:13,880 --> 00:28:16,240
that can be built into that
bespoke tool.

519
00:28:16,320 --> 00:28:18,920
This is a, I
don't know, it could be a

520
00:28:18,920 --> 00:28:21,720
bit of a heavy question, but do
you believe AI can eventually

521
00:28:21,720 --> 00:28:26,040
make the human in human
resources obsolete or does it

522
00:28:26,040 --> 00:28:29,400
force the field to become even
more human-centric in the years

523
00:28:29,400 --> 00:28:31,200
to come?
Well, that is a very heavy

524
00:28:31,200 --> 00:28:34,920
question.
But my answer to that is that I

525
00:28:34,920 --> 00:28:37,480
don't think the human will
become obsolete.

526
00:28:38,320 --> 00:28:42,840
I think there may be more
opportunity to automate and to

527
00:28:42,840 --> 00:28:45,360
use tech for some of the
processes.

528
00:28:45,680 --> 00:28:48,960
But as I mentioned before, there
is still a huge part of HR that

529
00:28:48,960 --> 00:28:53,600
is around people, and it is
around ensuring, you know,

530
00:28:53,760 --> 00:28:57,240
that the people have got the
capacity and the

531
00:28:57,240 --> 00:28:59,960
capabilities to do so.
It might be around, even though

532
00:28:59,960 --> 00:29:02,720
we can use tech for learning
and development, but it might be

533
00:29:02,720 --> 00:29:05,600
a career conversation or it
might be, as I said, a

534
00:29:05,600 --> 00:29:08,480
performance conversation, or it
might be I'm negotiating an

535
00:29:08,480 --> 00:29:11,840
enterprise agreement.
You know, I'm working with the

536
00:29:11,840 --> 00:29:14,040
unions.
You know, so there's still

537
00:29:14,240 --> 00:29:17,880
going to have to be a human part
of what HR does.

538
00:29:17,880 --> 00:29:21,640
I don't think that that's lost.
And that makes absolute

539
00:29:21,640 --> 00:29:23,840
sense.
And hopefully that efficiency

540
00:29:23,840 --> 00:29:29,040
gain will lead to that kind of
personal interaction, human

541
00:29:29,040 --> 00:29:34,120
gain, which I guess is what
everybody really wants, to avoid

542
00:29:34,120 --> 00:29:36,480
feeling like you're just one of
those data points that we

543
00:29:36,480 --> 00:29:38,760
mentioned earlier.
So, more humanisation.

544
00:29:39,360 --> 00:29:42,440
Yeah, yeah, more humanisation.
That's a, that's a great way to

545
00:29:42,440 --> 00:29:45,200
describe what we want from human
resources.

546
00:29:45,200 --> 00:29:48,640
But just to finish up, how do
you see the future for human

547
00:29:48,640 --> 00:29:50,680
resources professionals taking
shape?

548
00:29:51,480 --> 00:29:54,640
Do you think the field will
become more human, like we're

549
00:29:54,640 --> 00:29:57,920
hypothesising now, or will it
potentially end up more

550
00:29:57,920 --> 00:30:03,560
technical as we move forward?
Well, I would like to say it

551
00:30:03,560 --> 00:30:08,000
would become more human in the
sense that the thing with HR and

552
00:30:08,000 --> 00:30:10,600
tech and HR tech, it's a real
balancing act.

553
00:30:10,800 --> 00:30:14,160
It's a balancing act between
embracing technology and

554
00:30:14,160 --> 00:30:19,040
preserving our human values.
So HR has got this critical role

555
00:30:19,040 --> 00:30:23,240
in, in being able to do this.
So I think in an ideal world, if

556
00:30:23,440 --> 00:30:26,800
HR can do that, well, embrace
the tech, bring it along.

557
00:30:26,920 --> 00:30:30,480
We use it, we automate
processes, we get everything

558
00:30:30,480 --> 00:30:34,920
happening, then that will allow us to
free up our time as HR

559
00:30:34,920 --> 00:30:39,480
practitioners to then be more
human in the sense that I can be

560
00:30:39,480 --> 00:30:43,480
more interactive and engage with
the workforce and whatever that

561
00:30:43,480 --> 00:30:46,080
may mean.
Do I think it's going to happen

562
00:30:46,280 --> 00:30:49,680
in that ideal world?
Maybe for some, but I think that

563
00:30:49,840 --> 00:30:53,560
for some companies there may be an
over-reliance on technology, and

564
00:30:53,560 --> 00:30:57,040
then they'll say, well, we don't need you
or we can do something else for

565
00:30:57,040 --> 00:31:00,840
you or we're going to find
something else for you to do as

566
00:31:00,840 --> 00:31:05,880
opposed to just here is an
opportunity to rehumanise HR.

567
00:31:05,880 --> 00:31:09,080
Yeah, fingers crossed that it
does rehumanise.

568
00:31:09,080 --> 00:31:11,160
I really like that.
I like the way you've summed

569
00:31:11,160 --> 00:31:14,800
that up there.
Unfortunately for anybody in HR,

570
00:31:14,800 --> 00:31:18,320
it's like they're right in the
firing line now from everybody,

571
00:31:18,320 --> 00:31:20,240
whether that's external or
internal.

572
00:31:20,240 --> 00:31:23,840
So that was why I really was
keen to hear the thoughts for

573
00:31:23,840 --> 00:31:25,960
this conversation.
Can I just add one more point

574
00:31:25,960 --> 00:31:30,800
whilst I'm presenting
this sort of almost sceptical

575
00:31:30,800 --> 00:31:34,800
view of technology, I do
think it has its place.

576
00:31:35,080 --> 00:31:38,120
And the HR practitioners that
I've been talking to, they're

577
00:31:38,120 --> 00:31:41,160
excited by the opportunities
that the technology is

578
00:31:41,160 --> 00:31:43,800
providing.
They're not scared of what comes

579
00:31:43,800 --> 00:31:46,160
next.
They're excited and ready to

580
00:31:46,160 --> 00:31:50,560
embrace that next stage of
whatever it is, whatever the

581
00:31:50,560 --> 00:31:53,600
workforce is going to look like.
It's interesting to hear that they're

582
00:31:53,600 --> 00:31:56,000
almost at the coalface, and if
they're feeling positive,

583
00:31:56,000 --> 00:31:59,440
hopefully that will eventually
become the feeling for everybody

584
00:31:59,520 --> 00:32:02,520
else outside the world of human
resources.

585
00:32:02,920 --> 00:32:06,200
Thanks so much for that.
Justine, what's on the horizon

586
00:32:06,200 --> 00:32:08,680
for you and where can people
follow what you're up to?

587
00:32:09,040 --> 00:32:13,920
They can follow me on LinkedIn,
just Justine Ferrer, and find me

588
00:32:13,920 --> 00:32:16,120
there.
I'm doing lots of things there,

589
00:32:16,200 --> 00:32:18,920
as you'll see with some
publications and the like.

590
00:32:19,120 --> 00:32:21,080
Awesome.
Justine, thank you so much.

591
00:32:21,240 --> 00:32:24,640
Thanks Gareth, I appreciate it.
For more info on what we've

592
00:32:24,640 --> 00:32:26,400
discussed today, check out the
show notes.

593
00:32:26,480 --> 00:32:28,960
If you enjoyed this one, you can
subscribe to Ruined by the

594
00:32:28,960 --> 00:32:31,560
Internet on your favourite
podcast app and help spread the

595
00:32:31,560 --> 00:32:33,880
word by sharing this episode or
leaving a review.

596
00:32:34,240 --> 00:32:35,960
I'm Gareth King, see you next
time.

Justine Ferrer

Senior Lecturer in Human Resource Management

Dr. Justine Ferrer is a Senior Lecturer in Human Resource Management at Deakin Business School. She has held many leadership roles, including Director of Teaching and Discipline Lead for the Human Resource and Employment Relations group, and has played a key role in shaping HR/ER education and research. Justine’s research explores the complexities of the HR profession, including its ethical challenges, its evolving role in organizations, and its dark side; she recently led a national study on the state of the HR profession in Australia. Her broader research interests include workforce diversity, career development, and the experiences of minority groups in the workplace. Her most recent publications appear in the International Journal of Human Resource Management, Asia Pacific Journal of Human Resources, and the Industrial Relations Journal. Justine is also involved in many industry consulting projects around workforce analysis, flexible work, enterprise bargaining, and workplace diversity. Beyond academia, Justine is actively engaged in professional governance as a Board Member of the Australia and New Zealand Academy of Management (ANZAM), serving as Board Secretary and Chair of the Education and Outreach Committee. Justine also holds the role of Conference Chair for the 2025 ANZAM conference.