The AI Impact: Transforming Tech Roles and the Future of Work | The Pair Program Ep54
In this episode, we dive deep into the transformative power of AI in the workforce with two thought leaders shaping the future of tech talent: Dr. Diana Gehlhaus, Director for Economy at the Special Competitive Studies Project, and Angela Cough, Special Advisor to the Department of Defense’s Chief Digital and Artificial Intelligence Officer.
Together, they explore:
- The evolution of tech roles as AI continues to redefine industries.
- The ripple effects of AI on hiring practices and workforce education.
- Key skill sets and industries poised for growth as AI evolves.
- How organizations can prepare for critical shifts in education and upskilling.
Dr. Gehlhaus and Ms. Cough bring their rich backgrounds in policy, defense, and innovation to discuss how AI is reshaping what it means to be “tech talent” and the opportunities it presents for professionals across various sectors.
Whether you’re a tech leader, educator, or job seeker, this episode will inspire you to rethink how we approach the workforce in the age of AI.
About Diana Gehlhaus: Dr. Diana Gehlhaus is a Director for Economy at the Special Competitive Studies Project. She is also an adjunct policy researcher at the RAND Corporation. Diana was previously a senior advisor in the U.S. Department of Defense Chief Digital and Artificial Intelligence Office (DoD CDAO), as well as a research fellow at Georgetown University’s Center for Security and Emerging Technology (CSET).
About Angela Cough: Angela Cough serves as Special Advisor to the Department of Defense’s Chief Digital and Artificial Intelligence Officer (CDAO). She leads the Digital Talent Management Division, driving innovation and advancing the DoD’s data, analytics, and AI workforce. Previously, Angela spearheaded the AI and Data Accelerator Initiative (ADA), enhancing digital capabilities across the Department and promoting data literacy. Her experience includes serving as Deputy Director of the Defense Digital Service, managing counter-small unmanned aircraft systems (C-sUAS) product development, and contributing to NATO C-sUAS policy. With over 20 years of experience in startups and small businesses, Angela is also an entrepreneur, an investor, and a black belt in the martial art of Tang Soo Do.
Sign-Up for the Weekly hatchpad Newsletter: https://www.myhatchpad.com/newsletter/
Transcript
Tim Winkler: Welcome to The Pair Program from hatchpad, the podcast that gives you a front row seat to candid conversations with tech leaders from the startup world. I'm your host, Tim Winkler, the creator of hatchpad.

Mike Gruen: And I'm your other host, Mike Gruen.

Tim Winkler: Join us each episode as we bring together two guests to dissect topics at the intersection of technology, startups, and career growth.
Tim Winkler: Hello, everyone, and welcome back to The Pair Program. Your host, Tim Winkler, alongside my cohost, Mike Gruen. Mike, did you know, I think I've told you this before, my wife's really, really into the national holidays, the calendar days of each day. So the first Wednesday of November is National Stress Awareness Day. You never really experience stress, right? No? Me neither. So in honor of Stress Awareness Day, what's one thing that continues to stress you out, and how do you cope with it?
Mike Gruen: Oh, you're asking me two things, so I can do the coping one much easier, which is, uh, I do meditate. And I find even just five or ten minutes is enough to sort of bring me back to, like, ridiculous mode. Um, what's one thing that continues to stress me out? That's a good one, 'cause there are so many to spitball. I mean, right now it's definitely stage-of-life type stuff, 'cause my oldest is 18. He's in high school. We're talking about college. So when we start thinking about his future and what jobs are going to be available and what the world is going to look like, and the economy, then my brain gets on that hamster wheel and just starts going into some sort of downward spiral of, this is a terrible place, why did I bring him into a world like this? But anyway, um, you asked, man, you asked. So yeah, that kind of stresses me out. And then I think about it, I, you know, meditate, and then think about all the things I'm grateful for and stuff like that. So, uh, yeah, that type of stuff definitely stresses me out. What about you?
Tim Winkler: Yeah, I'd say just the running of a small business is super stressful, especially, you know, when we've had these up-and-down kind of years; since the pandemic, it's been a super stressful stretch of four years, um, paired with managing a little toddler, which comes with its own challenges. I think that's a little bit of a vague answer, but I'm going to go with it anyways. The way that I combat that, well, the healthier outlet, I was going to say, is I have to go to the gym, have to go exercise. And if I don't, you know, it certainly weighs on me; I can feel it compound. So, anyways, exercise and, uh, yeah, running a small business. So there you go. I usually start things a little bit more lighthearted and funny, but I know there's a lot of stress in the world these days, so I just figured I'd bring some awareness to it and, you know, recommend coping mechanisms. So I'm glad to hear Mike isn't just going to the bottle as well. So thank you, Mike.
Tim Winkler: All right, let's fill the listeners in on what today's episode is all about. Today we're going to be diving into a topic that's reshaping the landscape of tech, and that would be the evolution of tech talent in the age of generative AI. Joining us, we've got a couple of fantastic guests who have great insights into the intersection of tech talent and AI. We have Diana, uh, Dan, I just already blew it, uh, Gehlhaus.
Diana: That's okay. Gehlhaus.

Tim Winkler: Oh, you didn't... oh, I got it. Awesome. All right. Diana Gehlhaus. Diana serves as the Director for Economy at the Special Competitive Studies Project. She has nearly 20 years in tech and talent policy and has been instrumental in positioning the U.S. as a leader in AI-driven economies. Her work on AI workforce strategies for the DoD and research on talent pipelines will no doubt make for some very insightful additions to today's discussion. So thank you for joining us, Diana. And alongside Diana, we have Angela Cough. Angela serves as Special Advisor at the DoD's Chief Digital and Artificial Intelligence Office. She oversees digital talent management and has experience deploying AI capabilities and enhancing data literacy across the DoD. Angela offers us real-world insights on adapting workforces for an AI-driven future, which I'm excited to expand on in our chat today. So Diana and Angela, thank you both for joining us today.
Angela: Thank you. We're excited to be here.
Tim Winkler: Awesome. All right. Now, before we dive in, we do like to kick things off with a fun segment called Pair Me Up. Here's how it works: we'll all go around the room and spitball a complementary pairing of our choice. Mike, you lead us off. What do you got for us?
Mike Gruen: Uh, I'm going with a poker face and video calls. The last couple of weeks have been a lot of conference calls and leadership calls. It's the end of the year, so it's budgeting and lots of things. And some of my counterparts like to Slack and message me funny things, and I do the same. So being able to maintain a poker face while also having fun on the video, because it is a serious topic, but at the same time still having fun. So maintaining that sort of poker face while on the video call, that would be my pairing.
Tim Winkler: Can we see your poker face?

Mike Gruen: No, because now I'm smiling and I can't, I can't.

Tim Winkler: I thought you could just turn it on. You can't just turn it on?

Mike Gruen: I mean, I can. Okay, there we go.

Tim Winkler: There we go.

Mike Gruen: I'm serious.

Diana: You're not listening. You have to be. Oh. Yeah, no.
Mike Gruen: Well, on the plus side, I just saw this awesome AI video of a guy who figured out how to just get on a call, and it's just an AI that's doing him, acknowledging and whatever, while he's off playing video games and pretending that he's on the call.

Tim Winkler: Takes the sick day and mouse jiggler to a whole nother level.

Mike Gruen: That's ridiculous.

Diana: I can help you with that, by the way; it can transpose your face on a screen.

Mike Gruen: That's what I'm saying. That's what he has.

Tim Winkler: It's a whole thing of, like... yep. That's really what we're going to talk about here today: how to get away with it, how to supplement AI tools to not have to do anything. Uh, I dig it. Many a time I've had to put that poker face on during these podcast episodes; I appreciate you calling it out. I'm going to go with a, um, a top-of-mind seasonal struggle for many folks out there these days, and that is the end of daylight savings time and the age-old question of: how do I change the clock on my stove? Because this, in my opinion, has to be one of the most annoying trickle effects of dealing with the end of daylight savings time, or daylight savings time in general. Um, but I do think this is a pretty common hardship for folks across the country, you know, minus your Arizonas, your Hawaiis. But I feel like this is something that we all have to deal with each year in the face of daylight savings time ending. And so, um, you know, I feel everyone's pain with that. So that's going to be my pairing for today. Has anybody here had to change that clock yet?
Diana: I, uh, I empathize with my small device that I have, which is just a clock, but it's also a radio. And I always have to look up the instructions, because I never remember what combination on the remote is supposed to actually make a change.

Tim Winkler: That's right.

Diana: Oh, I can empathize there for sure.

Tim Winkler: Yeah, sometimes it's those little analog, you know, scenarios where you have to go and do some googling, like, how do I, what am I doing?
Diana: I was actually just going to also empathize. I recently moved, and I got into a very heated argument, no pun intended, with my microwave about this very question. And I didn't want to, 'cause I couldn't figure out how to work the new microwave. I literally could not; I must've spent like 15 minutes trying to, and I was like, I will not give up. I will not surrender. I hate quitting. Um, and so I gave it a timeout. I went away, came back, and I finally figured it out. But I know exactly what you mean. So that was really funny.

Mike Gruen: I feel so lucky that the only thing we have to change manually at this point is our microwave. And you push the clock button, and then it has the instructions, like, do you want to reset the time? And it just walks you right through it. So, yeah, very spoiled on that. I hated... I used to have a car...

Diana: It is Diana for you. Yeah. And replace it. Exactly.
Tim Winkler: I'm glad I've got some empathy points on my pairing. I'm going to pass it along to our guests now. Diana, how about a brief intro and your pairing?
Diana: Sure. Well, thank you for the great intro. I am an economist by training. I care a lot about innovation and growth, and when you think about what drives that, it's tech and talent. So I've spent many years thinking about different dimensions of these questions, and, um, in around... started to think about AI as it was first coming on the scene as an issue. Conversations were moving from cyber to AI in the workforce policy space, and so I've just really been following along on how transformative it's been in such a short time. So that's a little bit about me. I come by way of several think tanks and government, and what's next, I guess we'll get into that later. We're all on a journey.
Diana: My pairing, I, so I went with something kind of silly, and I think Angela will appreciate this. I'm going to go with yogurt and granola, because I love it.

Tim Winkler: Oh, that sounds so good. That's strong.

Angela: It's like one of the only things I've ever seen her eat aside from cookies, so it's a big part of her daily routine.

Tim Winkler: There you go.
Mike Gruen: Do you go with any fresh fruit, or is it just the granola and the yogurt? Because I love the granola, blueberries, yogurt. Blueberries.

Angela: That is the go-to for her that I have approved myself, so I appreciate that pairing.

Mike Gruen: I'm pretty sure I've even used that pairing because it is so good. I support your pairing.
Tim Winkler: I gotta follow up on Angela's point there. You said that's the only thing you eat aside from cookies. Um, what kind of cookies?

Diana: Well, it's definitely not true, but, uh, oh, I love cookies. But I do, I do love fruit and granola and yogurt. And I also do love chocolate chip cookies.

Tim Winkler: Chocolate chip, classic, yeah. That's the right answer right there. Yogurt and granola, you know, with a toddler in our family, we are constantly doing yogurt and granola for breakfast. It's a GOAT combo. Great pairing. Angela, quick intro and your pairing.
Angela: Yeah. Um, yeah. So, as you well introduced earlier, and I'm so proud of you for getting all of "Chief Digital and Artificial Intelligence Office" and all of the words that go into that title, 'cause it can get quite lengthy. Um, currently I'm overseeing the digital talent management activities, trying to enhance and expand the Department of Defense's access to data and AI talent. So it's a very interesting challenge. And my former colleague that I so desperately miss, Diana, I'm so excited that she's here to join me. Um, you know, it's been an interesting journey, and there's lots of stuff to come next for the department and lots of work to do. So I'm very excited to be here to help kind of promote how we can get after it. My favorite pairing right now, considering we were just talking about the wood walls behind me, is my delicious cup of coffee, as an entrepreneur as well, from my own coffee shop, which I like to pair with my Pacific Northwest vibe, wood walls, and, appropriately, a vest, you know, to keep myself nice and cozy. Because it was 39 degrees this morning, and the end of daylight savings is giving me one more hour of delicious sunlight to help myself throughout the day.
Tim Winkler: Nice. So I need to clarify: is it your coffee shop? You have a coffee shop out there?

Angela: Yes.

Tim Winkler: Want to give it a plug? What's it called?

Angela: Um, it is Hotwire Coffeehouse. It was established before Hotwire, the online, um, place. So it's a local coffee shop, a small little place that's in a historical building, actually only a few blocks from my house, and it's been serving this community for over 20 years. It really is sort of like a weird little community hub that keeps us connected with the neighborhood that I'm in. We've owned it from the original founder since...

Tim Winkler: Wow.
Tim Winkler: That's really cool. I love that. And I love coffee. And, you know, what you're seeing right now, for those that are watching this on YouTube, is really taking me into that Northwest cabin vibe. So I applaud you for bringing all of those pieces together. Um, awesome. Well, yeah, we're excited to have you both on. I know that, Angela, when we first had our discovery call, you were the one that really kind of took me down this journey of, I think this is what we really should be talking about here, on a larger scale. And I think it's an appropriate conversation to have. It's one that is impacting, for the most part, everyone in some fashion, and it's certainly highlighted in a lot of the tech roles that we talk to folks about on a daily basis. And so I'm excited to peel it back a little bit and, um, shed a little bit of light, from a couple of experts here, on where we think things are going. So I'm gonna go ahead and transition us into the heart of the discussion now. And, um, again, a quick recap: we're going to be talking about tech talent in the age of AI. Many of our listeners have probably experienced, in some way, shape, or form, AI having some level of impact on their role. And as it continues to redefine roles and skills, you know, we're going to start to see industries rethinking not just who they hire, but how they hire as well. So in this episode, we'll explore how AI is broadening what it means to be, you know, kind of tech talent beyond traditional STEM roles, and discuss the critical shifts that are going to be needed in education to prepare the workforce for this AI-driven future. So I want to start by talking about how AI is reshaping what it means to be tech talent. Diana, from your perspective, how do you see tech roles evolving as AI continues to play a bigger role in the industry at large?
Diana: I love this question. So I actually think we're already seeing early signs of these roles evolving, and of what it means to be a tech worker, and you're seeing it in a couple of different ways. And I think you'll continue to see it as this technology deploys at scale, given the rapid advancements that are already happening. Even if you stopped and froze, today, all of the advancement that's happened over the last two or three years, you'd still see the effect on the economy and on the workforce over the coming few years. So I think you're already starting to see some signs, and that is the following, right? I see it as: you've got different tiers of talent. You've got this exquisite talent that's still very high in demand; that's not going away anytime soon, right? You've got your PhD computer science researchers, your PhD machine learning engineers, um, you know, your PhD scientists of many different flavors, and that is still a core group of innovators doing critical R&D. Right? So we absolutely still need to cultivate that talent. Then you've got this gigantic layer of practitioner talent, and that's where you've got your devs, your software engineers, your data scientists, a lot of that practitioner talent. And this is where our taxonomies, our lexicon for how we're able to talk about this talent, start to get super squishy and outdated. Um, and it makes it hard to accommodate this type of discussion, but I really see this talent as, it's not just one hard technical skill anymore. It's really AI plus: your ability to leverage AI in whatever it is that you're doing starts to become really, really key here. And you're already seeing shifts in demand for people with certain types of technical skill sets, or certain combinations of technical skill sets. Right? So I think that's really a key thing. And then the other piece that people always forget about when it comes to technical talent is the skilled technical workforce. And that's also seeing a strong rise in demand, I think, as we start to think about careers in biotech and cyber and advanced manufacturing, um, and we've cultivated some of that with our own policies. Right? So there's this cadre of talent where you maybe don't need that traditional four-year degree. You think about other pathways, you think about skills-based hiring. But they are part of the technical workforce. Um, and so the definition starts to kind of expand and morph. And then you think about how all of us, just like we all use computers, we're on a, you know, a podcast right now, how we're going to be operating and interacting with AI-enabled capabilities, and how much knowledge we need to have about what these capabilities are and how they work. To some degree, we're all going to have to have some level of sophistication.
Tim Winkler: Yeah. I love how you just broke that down into a couple of different tiers, 'cause I think that's super helpful when folks are trying to figure out, you know, where do I kind of fit in, in this kind of transition space. Um, no doubt, I agree wholeheartedly on the, you know, the PhD folks. It's that next level you described where I think we get a lot of head-scratching coming into play. Um, you know, front-end development is a good example, or design, right? We see a lot of these types of individuals have a little bit of, um, you know, I don't want to say fear, but a little bit of uncertainty, right? Of, you know, where is my skill set going to be in demand five, ten years from now, if these tools are starting to have the ability to do X, Y, Z? Um, and so, you know, the way that you describe that, I think, is important to note. Um, the AI plus: we ran an episode about this previously, about, you know, generalists versus specialists. And I think this is where you're starting to see some of these specialists in verticals really becoming so key, whether it's healthcare, whether it's finance, what have you, or defense. Um, but combining these skill sets into, you know, one overarching kind of skill set is something I think you were kind of alluding to, which I think is really intriguing. Angela, I want to pass it to you, because you're obviously doing a lot of work within the Department of Defense, and, um, you know, you're seeing some of this transformation on the ground firsthand. What are you seeing on this topic?
Angela: I think some of it, I like the idea of referring to it as AI plus, because, in part, the analogy that I like to use is: this is sort of the next advent of when we introduced the PC. Right? It's another tool in folks' toolkit, and it will become something that we have to adapt to use, and it will become part of our daily lives. I think, I refer to them as soft skills, but really there's this idea of increasing people's awareness, from a product perspective, of what they're trying to do. You know, Diana mentioned you've got these hyper-specialists that can sort of do this R&D development, sort of see how the frontier is going to expand with respect to AI. Then we have the people who have to be able to implement, use, support, maintain, and test that AI. And then we have folks that are going to be hyper-focused on certain technical pathways. And we're seeing this trend happen across education too, where we've got concentrated areas for machine learning, concentrated areas for other thoughts and practices. But there's also this piece about, you know, uh, AI and the ability to develop models that truly reflect human behavior: that's going to require human behavioral sciences, people who understand how people think, how people work, how people do things day to day, um, and then be able to expand upon that into what would traditionally not necessarily be a technical role, but it's like product development. Do you understand the scope of the problem that you're trying to solve for? And then how do you translate that to an effective outcome? Um, you know, I was asked at a previous conference that I attended what skills these college folks should be looking to have. And I was like, figure out how to deconstruct problems that you're given, and then how you can actually put that into consumable products, um, sort of executable tasks, but then also be able to identify what parts of that could potentially benefit from some level of process automation, and how to differentiate that from AI enablement. Because there is a conflation, I think, that's occurred across the landscape right now, which is: sprinkle a little AI on it, and it'll solve your problem. But there are a lot of moving parts that have to go into actually making that accomplishable. And so it's important for us to have folks who can actually think and deconstruct what they're trying to accomplish, and then understand how to adapt that to some of the tools that are available right now, because it's just part of our toolkit, is what we're going to expand into, and then be able to adapt that to their day-to-day work environment, kind of regardless of what they actually do or have their skill set in. And I agree with Diana and what we've been talking about a lot in side conversations, you know, across this landscape, which is: the traditional four-year degree in computer science may become less relevant. It doesn't mean that it is not relevant, but there are certain barriers, I think, to access and contribution in this space that will be sort of equitably, um, leveled for some folks that would otherwise be able to contribute their thought and diversity of thought to a space of technical application. So I, as an individual, can actually share my thoughts about how to design a model. I may not know how to technically do it, but I can at least be part of the process of getting it done. And that's where I think we have an opportunity to really leverage more diversity of thought and experiences and application and perspective, in a way that wasn't quite, I think, as appreciated, and it goes beyond the traditional STEM scope.
Mike Gruen: I think one of the things you sort of talked about: so before AI really took off, this would be ten years ago or whatever, I was working at a company doing, um, inferential statistics and describing human behavior. We were looking for insider risk, insider threat. One of the biggest things we could do was identify someone who was thinking about leaving their job. Right. So flight risk. And we sort of talked about, and we're sort of getting back to that, like, decomposing and understanding people's behavior. And what was interesting at that time was we built these models, and they were fairly effective. Like, we ran them on data sets; we could show it. But every time we talked to anybody, they were always like, well, how do you know you didn't just get lucky? How do you know that this is really working? And I think that's one of the things: being able to explain what's going on, and not just trusting the AI, like, oh, that's the answer. There is something in having that experience to be able to recognize, I think, when maybe it isn't really describing human behavior, and there's some bias in the data set, and being able to question it and stuff like that. I think that's maybe a little bit of what you're talking about. And I love the, um, the plus. I mean, there we spent a lot of time: we had data scientists come in who didn't know how to do software engineering. We spent a lot of time teaching them how to do software engineering, um, so that they could build this stuff themselves. Again, it's that sort of bringing these things together. I like a lot of what both of you were saying about that. It's like, you have to bring these multiple disciplines together in order to get it to really work and function. Um, I think that's an important part, and I appreciate you guys bringing it up.
Diana: Can I just, I want to also emphasize here what you're saying without saying, which is the importance of technical teams.

Tim Winkler: Yes.

Diana: Yeah. It's really not just one part. We can't ask everything of everybody. It's like saying, you know, you need a partner who's everything to everyone. I mean, that's just not right. Not everyone can be a unicorn who does everything, all right? So it's also about the team, and making sure that you're thinking about technical teams and how you're deploying those teams, um, you know, whether it's within a business unit or more enterprise-wide.
522
:Mike Gruen: Absolutely.
523
:And I also think the whole notion of the
CS degree, which I think is funny at this
524
:point, because most of what I learned
in computer science when I took computer
525
:science was all about like machine
architect, like computer architecture and
526
:stuff that like, I think in the course
of my career has come up maybe, you know,
527
:thrice where I've been like, Oh, I really
am glad I understand how paging and memory
528
:works because by rewriting this loop in
this way, I've like, You know, actually,
529
:uh, solve the problem because it was a
caching issue inside the chip, you know,
530
:like some crazy like Edge case, but these
days I think that doesn't happen anymore.
531
:And I think a lot of what traditional,
like, I think most of what people
532
:are coming out of now is basically
just software engineering,
533
:which is really just a vocation.
534
:And I can, you know, if you're pretty
technical, it's, I don't think you
535
:necessarily need a four-year
degree to learn how to write software.
536
:Um,
537
:Diana: Well, I think you're, you're
walking around the edge too of another
538
:thing that we're seeing, which is
there's a lot of people who are very
539
:passionate about analyzing information.
540
:They didn't necessarily start out there.
541
:So like, for example, not to age myself
too much, I ran into somebody who was
542
:perhaps in their early thirties that I
used to babysit when they were a baby.
543
:And I found out from talking to them
that they actually got, I think they
544
:mentioned a history degree in college
because it was something that they were
545
:particularly passionate about, et cetera.
546
:But they also were just naturally
very good at math and then happened
547
:across the pathway of data science,
went and did a concentrated effort
548
:to become very skilled in that space.
549
:And now they work full
time as a data scientist.
550
:And as they put it, they are far more
effective at their job because they
551
:both have the passion for the work
that they're doing, the aptitude to
552
:do it well, but they also didn't come
from sort of like a, a limited scope
553
:background in just technology to apply it.
554
:And I think that's where we
can really see some shifts in
555
:how we're finding some of the folks who
are very effective in this space, they
556
:may not have started out there, but the
advent of the opportunity ahead of them
557
:is what's starting to shift their focus
and the passion of getting nerdy about
558
:the numbers is really what's driving their
interest, which is a little bit different
559
:than I think what we've seen, um, today,
560
:Mike Gruen: right, but I think that
parallels a lot of what I saw happen
561
:with software right back in the day.
563
:When I started my career in the
nineties, like it was very, the
564
:like writing software was harder.
565
:It was more esoteric, whatever.
566
:And as we've built more and more tools
and made it more accessible, we've
567
:seen a lot of creatives, like people,
like the best front end engineers I've
568
:worked with, like they went to art
school, they didn't get a CS degree.
569
:And like, so opening those doors
for these people to be able to
570
:build, like take their idea.
571
:And have that in software without
having to work through like a series of
572
:engineers and business analysts and the
rest of it and describing it all, but
573
:like they actually can bring it to life.
574
:And I think we're seeing the same thing.
575
:It's just in data.
576
:And I think
577
:this is that sort of enablement where
we just, you know, we build the tools
578
:and then it opens the doors for more
people to be able to do more things.
579
:Um, so hopefully it's not
too scary for folks, um, that
580
:they're not losing their job.
581
:It's creating more
opportunities to do more.
582
:Tim Winkler: I kind of want to, uh,
pull on the thread, Angela, of, uh,
583
:the, the term soft skills that you
referenced, because I, I want to kind
584
:of dissect a little bit more of like,
what are these skills that we think are
585
:going to become so essential in this?
586
:Yeah, next era of, you know, AI.
588
:Um, and, you know, for example, right?
589
:I think for me personally, I think
kids, students, you know, coming out
590
:of school, you know, it's not about
memorization and regurgitating, because
591
:you've got this PhD in your pocket, right?
592
:With ChatGPT, AI.
593
:Um, so what is it that's going to
become essential and, you know,
594
:learning how to interpret, I think
is, is becoming a real key skillset.
595
:I'd love to just hear your thoughts
on that, because, you know, where, where
596
:do you see, you know, from an education
perspective, where teachers are
597
:going to be placing a lot of emphasis
with students, um, down the line.
598
:Angela: So from my perspective, and
I'd love to hear Diana's, uh, input on
599
:this too, because she's studying this
also, and she and I have had lots of
600
:philosophical conversations about this.
601
:I would say that what I've observed
and what I think is going
602
:to become particularly important, and
I also think similar to Mike's comment
603
:about like how the evolution of like
computer science has changed over time.
604
:Nobody thinks about the computer that
they have in their hands every single
605
:day because it just kind of works.
606
:And I do think that that has allowed
our critical thinking
607
:skills to fall off a little bit because
we just take things for granted.
608
:We don't like, I've had this
conversation just recently.
609
:I'm like, how many teenagers
out there do you think actually
610
:know how a toilet flushes?
611
:Because they don't ever really think about
having to do anything with it and they're
612
:going to call somebody to come and fix it.
613
:So when we're thinking about, um,
the, what I'm referring to as soft
614
:skills, and I don't know if that's
technically the right term for it,
615
:but I'm just thinking about what are
the skills that are not taught to you,
616
:that are not those rote memorization,
algorithmic thought processes.
617
:That allow you to be able to adapt what
you are trying to do and then think
618
:critically about how to solve for or fix
or come up with solutions for that thing.
619
:Um, and our ability to revisit really
developing individual critical thinking
620
:skills and problem solving skills and kind
of going back to reinvigorating curiosity.
621
:And, and, um, valuing the
ability to deconstruct and
622
:then understand and then apply.
623
:That's very abstract, but I do think
that it is something that when we
624
:weren't so in our devices was something
that we were kind of forced to do
625
:a little bit more in the olden days
of doing stuff, like you had to go
626
:figure out how to entertain yourself.
627
:You couldn't just simply
stare at a phone and
628
:regularly flip through Instagram,
for example, which is pushing to you.
629
:Mike Gruen: Right?
630
:I mean, I think that's part of
like, there's this, I agree.
631
:And like the, the, how do I do this?
632
:And just being able to look
it up so quickly and always
633
:being able to find the answer.
634
:So you don't even have to
like really spend a lot of
635
:time trying to figure it out.
636
:You just, why would I waste my time
doing that when I can just Google it?
637
:Um,
638
:Angela: Right.
639
:Mike Gruen: Right.
640
:Angela: And I feel like part of really what we
can do to really maximize the adoption
642
:of some level of AI enablement is to
really get people to start thinking about,
643
:um, how can this help me, but also how
does it help me to actually move faster,
644
:do things better, be more efficient.
645
:And if you're not even thinking about it
as an art of the possible, then you're not
646
:even going to look for those solutions.
647
:And that's where I think people are
kind of on the edge of thinking about
648
:how does this really, I'm not going to,
well, maybe I will be outing myself.
649
:Like, for example, I noticed that
for my own business, there were
650
:these incredibly philosophical
responses to people's reviews.
651
:And I was like, where did those come from?
652
:Turns out we decided to go with using
generative AI to come up with responses
653
:to reviews that could not be argued with.
654
:Right.
655
:And that was something that helped us
to be able to respond in a way that, um,
656
:allowed us to, uh, have our
customers feel heard, but at the same
657
:time made it so that they couldn't argue.
658
:And I was like, wow, that's genius.
659
:Because not only does it take the burden
off of my husband for having to write
660
:the response, because he would get super
emotional sometimes, like if somebody
661
:didn't like their coffee or whatever.
662
:Um, and now he's just putting in
a prompt that says, Uh, here's
663
:the problem that was presented.
664
:Please come up with a philosophical
way of responding to that problem.
665
:And that becomes the review
response, which is hilarious, but
666
:also incredibly effective because
I read them and I'm like, that's an
667
:interesting way of thinking about
that particular piece of feedback.
668
:But again, it's a creative way of
solving for what is something that
669
:we have to manage every day, but
coming up with a way to make it
670
:better because the outcome is better.
671
:Um, but also adapting something that
also helps us to save time and focus
672
:our efforts on other activities.
673
:Tim Winkler: Yeah, Diana, uh, Angela
kind of referenced that you're,
674
:you know, you're spending a lot
of time in this area specifically.
675
:Um, I'd love to hear your, your
thoughts on, on that subject.
676
:Diana: Well, this is sort of
the age old question, right?
677
:It's not really a new question.
678
:Everyone's always asking what, you
know, what should I study in school?
679
:What are the jobs of the future?
680
:I started my career doing,
uh, employment projections.
681
:for the Bureau
of Labor Statistics, super wonky,
682
:but this is always something
that people care about:
683
:Like where's the future job demand and
where are the skills, uh, gaps going
684
:to be, and, and, you know, education
always lags a few years behind.
685
:So now we've caught up with demand
for data scientists and software
686
:developers, but, you know, you have to
687
:now continue to move.
688
:Look, I think there are some things
that never go out of style and
689
:Angela touched on some of them.
690
:Critical thinking will not
go out of style anytime soon.
691
:Social and emotional skills, being
able to communicate, uh, effectively
692
:will never go out of style.
693
:And something that I
call planful competence.
694
:It's a term in the literature.
695
:That's really about how you can
design a plan and follow through
696
:and execute on that with competence.
697
:So, you know, being able to think
through, and this ties kind of
698
:back to the conversation earlier.
699
:What am I trying to achieve?
700
:What is the best way
I'm going to get there?
701
:How am I, what are the
steps that I need to do?
702
:Things that only we can do, that are,
right, that are uniquely human.
703
:For us to use AI as a tool to help us
achieve certain things, but don't waste
704
:your time on summarizing the memo.
705
:Waste your time or use your
time to advance an agenda, to
706
:advance a conversation, to ask
the right question, to know which
707
:questions you should ask, right?
708
:Like even doing data analytics.
709
:Um, you know, I managed a team of
brilliant researchers in a previous
710
:role, and the question was, how do you
know what the right question is to ask?
711
:How do you design a research project?
712
:And when I have a data set, that's great.
713
:What am I trying to achieve?
714
:Like, what do I want to actually get at?
715
:How should I construct
the analytics
716
:in a way that creates
717
:results and actionable
recommendations, right?
718
:So you, you need to be able to,
um, it's a combination of skills,
719
:I think, and that's where the
soft skills come in, because it's
720
:part art and part science, right?
721
:Part being able to communicate, part
being able to think critically, part
722
:being able to, um, work on a team
and, you know, part being able to
723
:think through what it is that we need.
724
:And then how to get there and then
following through and achieving it.
725
:And, you know, and those
are not easy things, right?
726
:A lot of people, it's like you
started out with: life is stressful.
727
:We're stressed out.
728
:Um, and so it's really
easy to just say, screw it.
729
:I'm gonna, you know, just
keep my time on the microwave
730
:an hour ahead because I can't
deal with it. In a few months,
731
:it'll be
732
:Mike Gruen: right again.
733
:Diana: You know, but, but these are
like in the workplace, these skills
734
:are, are really, really invaluable.
735
:I think being, and the final thing is
really being able to adapt, flex, stretch.
736
:Like what, what do we actually need?
737
:Like I wear so many hats
and I've worn so many hats.
738
:Like Angela will tell you at CDAO, I
really wore the hat of an action officer.
739
:I was a researcher, but that's
not what they needed at that time.
740
:So that's that.
741
:So no, take off that hat and
put on the hat that you need to
742
:have on and be able to adapt.
743
:Mike Gruen: I think you mentioned
something, um, about being able
744
:to, like, question things. I
think a lot of people just are ready
745
:to sort of follow and want to be
746
:told what to do, take it for what it is, and
not take that step back and be like,
747
:yeah, we could solve that, but you know,
or we could do that, but if we actually
748
:did this other thing, it solves it in a
completely different, more effective way,
749
:like, and it does take that, like
stepping up, stepping out of the moment
750
:and having that sort of bigger picture,
critical thinking, like, viewpoint
751
:of like, does this even make sense?
752
:Or is there a better way
to go about doing this?
753
:And how do we do that?
754
:And I think that is, um, I mean, I
manage a lot of people, um, and it
755
:seems to be a diminishing skill.
756
:Um, it's something I interview for.
757
:Angela: Well, and that's, and
that's the thing, is it seems
758
:to be a diminishing skill.
759
:So you asked earlier, what should
educators in schools be focusing on?
760
:And I don't know what the magic
mix is from an educational
761
:perspective of how you
762
:reinvest and re-educate people to start
having that level of critical thought.
763
:Um, you know, it's like, it's almost
like I want to give my kid a hammer,
764
:a nail and, you know, a box of
screws, and tell them, now figure
765
:out how to make a table, right?
766
:That really is what we're asking people
to do, oftentimes, is here, here are
767
:some tools, now figure out how to
make this happen with those tools.
768
:And if you're not curious enough to
fill in the gaps in between, and if
769
:you need too much hand-holding along
the way, you're not
770
:an effective part of that mechanism,
and so I don't know right now at this
771
:exact moment what the list of skills
are or processes that we would have to
772
:then reinstitute into our educational
ecosystem to make that critical thought
773
:become more prevalent,
I think, in the development of
774
:folks going into the job space.
775
:But I do feel like that is one of those
where if you're not critical in the
776
:way that you think about the problems
and the things that are given to you
777
:to solve, you also are going to risk
not also being a critical consumer
778
:of what AI is generating for you.
779
:And that is just as important: to not
just take what's being regurgitated
780
:from a, you know, statistical, logic
based, um, you know, generative tool
781
:like ChatGPT or something like that.
782
:If you're also not going to be a critical
consumer of that, you're also going to
783
:risk just accepting what's given to you.
784
:And then translating that out as if it's
:fact. Because there's both the, how are
you going to use it for your intent and
785
:purpose, but also are you going
to not just accept it but also validate
786
:the tool you are using, or that is
aiding you, in a responsible way, so that
:aiding you in a responsible way so that
you can then implement what it's helping
788
:you to do and feel confident about it.
789
:And Diana, you referred to,
what did you refer to it as?
790
:Planful.
791
:Oh, planful competence.
792
:Planful competence.
793
:I think
794
:it's going to take me a moment to
make sure that I put that into my
795
:brain and hold on to it as a phrase.
796
:But that idea is the same thing I was,
I was referring to when I was talking
797
:about this critical deconstruction.
798
:And so that planful competence aspect
is just as important for our utilization
799
:of the outputs of AI, uh, products.
800
:And also what we're going to put
into it to then generate that output.
801
:There's that constant cycle of feedback
and then retesting validation, um, that
802
:I think is very important for us to own.
803
:As consumers of that technology,
804
:and being okay with failure too, by the way, in
that, like, this is a, it's a process
805
:and we don't learn if we don't fail.
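The generate / validate / feed-back cycle Angela describes can be sketched in a few lines. Both functions here are stand-ins (assumptions), not a real AI tool or checker; the point is only the loop structure, where nothing ships unvalidated:

```python
# Toy sketch of the feedback-and-validation cycle described above.

def generate(prompt):
    # Stand-in for a call to a generative tool such as an LLM.
    return f"Draft answer to: {prompt}"

def validate(output, must_mention):
    # The critical-consumer step: never accept output unchecked.
    return must_mention in output

def answer_with_validation(prompt, must_mention, retries=3):
    """Only return a draft that passes validation; otherwise escalate."""
    for _ in range(retries):
        draft = generate(prompt)
        if validate(draft, must_mention):
            return draft
    return None  # hand off to a human instead of shipping unvalidated text
```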
806
:Mike Gruen: Well, I would actually,
and somebody asked me, I get this all
807
:the time when I do reference checks for
people or whatever, like somebody, Oh,
808
:talk to me about a time when this
person failed, whatever. In my opinion,
809
:Failure is not learning from a mistake.
810
:So mistakes are mistakes.
811
:They're not failures.
812
:To me, a failure is when you
actually just accept like you haven't
813
:learned anything from that mistake.
814
:So, um, I'm always sensitive to the
whole concept of failure, right?
815
:Don't be afraid to fail.
816
:I'd like a psychologically
safe environment.
817
:There is no such thing as failure.
818
:It's just you made a mistake.
819
:We'll learn from it.
820
:Let's move on.
821
:Tim Winkler: Something I wanted to
just kind of, um, uh, ask Diana, you,
822
:cause you know, when we, we talked
about the AI plus, um, and I, I dropped
823
:one vertical like healthcare, but what
verticals do you think are, are kind of
824
:ripe for upskilling workers in AI plus?
825
:Diana: Oh, everything, you know,
no, really, uh, I think, right.
826
:We've got,
827
:I guess, if you consider
a vertical
828
:as an industry sector, it would be,
like, education and health care.
829
:And those are the two big nuts, by the way,
that are notoriously low-productivity
830
:sectors that have zero incentive
831
:Um, to move, uh, for many reasons, for
many, many, many reasons, uh, and we
832
:could talk about some of the education
stuff, but I mean, when you look at
833
:what's growing job wise, it's state,
it's education and healthcare right now.
834
:That's really what's driving job
growth and that's worth noting.
835
:Um, so.
836
:I think that you've got a lot of
different, um, opportunities for AI plus,
837
:I call it AI plus X and, uh, it's not my,
uh, generic term, by the way, like that's
838
:something that other countries have also
latched onto in their education system.
839
:So we're actually a little bit
behind, uh, our education systems a
840
:Mike Gruen: little behind.
841
:Diana: Hello, you know what?
842
:That's not fair.
843
:And so in some places for
some students. And you talked about,
844
:we were talking about the classroom.
845
:But also, students learn
differently, and we don't have a
846
:model that's set up
for all student success.
847
:Anyway, um, so, you know, I think,
uh, that you've got, you know, the AI
848
:plus finance, the AI plus healthcare,
the, um, AI, literally you could
849
:do like, I could do AI plus retail.
850
:Like you could literally go down
the entire industry taxonomy and
851
:say like, AI is going to transform.
852
:Some faster than others, some places
faster than others, and some places
853
:more disruptive than others, like the
information sector, the publishing,
854
:the media, the news sector, right?
855
:Like, that is, that is taking a hit more
quickly than in some of these other areas.
856
:And you'll see AI come out with the,
you know, the, I think the scope here
857
:is Gen AI, but you've got other AI,
like autonomous vehicles that are,
858
:that are also going to continue to
disrupt the transportation sector.
859
:So I think that,
860
:yeah,
861
:the list doesn't really end in terms
of where you're going to be able to
of where you're going to be able to
862
:apply AI and as a practitioner, uh,
where you'll be particularly valuable
863
:if you've got, um, some subject matter
expertise, some domain experience, and
864
:then also being able to understand how
to leverage these tools in that domain.
865
:Tim Winkler: All right.
866
:Um, obviously just scratching the surface
of the conversation here, uh, seems like
867
:a, uh, an episode prime for a followup
sequel at some point, so we'll, we'll
868
:We'll have to table it just because
we, we of course have to get the five
869
:seconds scramble segment in, or this
is not a complete podcast episode.
870
:So I'm going to wrap it on that note and
transition us into this final segment,
871
:Five Second Scramble: quick, rapid Q and
A, uh, try to keep it under five seconds.
872
:Otherwise we will air
horn you off the stage.
873
:Um, some business, some fun,
uh, personal, not too personal.
874
:Uh, Mike, why don't you lead us off with
Diana, uh, and then I'll get to Angela.
875
:Yep.
876
:Mike Gruen: Sounds great.
877
:And I apologize ahead of time because
I thought of most of these questions
878
:ahead of time, and we spent a lot of
time talking about them, but hopefully
879
:we'll be able to keep it a little concise.
880
:All right.
881
:Uh, Diana, uh, describe
the culture at, uh, SCSP.
882
:Diana: Innovative.
883
:Mike Gruen: Nice.
884
:Uh, any types of roles or people
that you're looking to hire there?
885
:Diana: Yes.
886
:Come to our website.
887
:Mike Gruen: Uh, what's an important
skill you look for in a new hire?
888
:Diana: Planful competence.
889
:You can see it on a resume.
890
:Mike Gruen: There you go.
891
:Oh, yes.
892
:Right.
893
:Um, have you ever seen it on a resume?
894
:Diana: Not that word, but you can
see what they follow through on,
895
:what they were able to execute.
898
:Um, what's the best advice
you've ever been given?
899
:Diana: Sounds terrible.
900
:No one's going to look
out for you, but you.
901
:Mike Gruen: That's a good one.
902
:I like that.
903
:Um, and then, uh, touched on it
a little bit throughout the pod,
904
:but, uh, what advice would you
give to my high school student?
905
:Diana: Uh, do what gives you
passion, back to Angela's point.
906
:Mike Gruen: Nice.
907
:Um, what's something you did as
a kid that you still enjoy doing?
908
:Diana: Eating cookies.
909
:Mike Gruen: What's something you
enjoy doing but are really bad at?
910
:Diana: Everything
911
:playing the clarinet.
912
:I'm not very good,
913
:Mike Gruen: but you enjoy it.
914
:That's great.
915
:Um, all right.
916
:My personal favorite.
917
:What's your, uh, what's the
largest land animal you think
918
:you could take in a street fight?
919
:Diana: A fuzzy dog.
921
:Mike Gruen: I'm assuming
a small fuzzy dog.
922
:Small fuzzy dog.
923
:What's a charity or corporate
philanthropy that's near and dear to you?
924
:Diana: Uh, St.
925
:Jude's Hospital.
926
:Mike Gruen: Nice.
927
:Um, if you could, uh, live
in any fictional universe,
928
:which one would you choose?
929
:Diana: I wanna defer.
930
:Come back to me on that.
931
:Mike Gruen: That's the last one.
933
:We'll come back to you.
934
:You can get some time.
935
:Tim Winkler: That's a deep one.
937
:You keep thinking, Diana.
938
:We're gonna come back to you.
939
:Angela, are you ready?
940
:Angela: Yeah, if it's on what
we just heard, let's go.
941
:Tim Winkler: There's a
few, few overlaps there.
942
:Um, explain why folks from
industry would want to come to CDAO?
944
:Angela: Mission-based work.
945
:You can make a big impact.
946
:Tim Winkler: How would you
describe the culture at CDAO?
947
:Angela: Ooh, it's complex.
948
:Uh, it is diverse and, uh,
you know, we're raring to go.
949
:It's a new PSA.
950
:Thank you.
951
:Tim Winkler: What kind of technologists
thrives in that environment?
952
:Angela: We have a lot of work
to do, so I think the planful
953
:confidence would be very important.
954
:Mike Gruen: I think we
have an episode title.
955
:I think you're right.
956
:I think you're right.
957
:Tim Winkler: Um,
958
:Diana: sociology, academic journal.
959
:Tim Winkler: What kind of,
uh, kind of tech roles are
960
:you hiring for at the moment?
961
:Angela: Ooh, there's a lot.
962
:I would encourage folks to go to ai.mil
963
:to find out what we're hiring
for exactly, but we've got lots of
964
:AI and actual technical positions
that are being hired for, including
965
:product management and others.
966
:Tim Winkler: What would you say
is the biggest challenge facing
967
:your agency heading into:
968
:Angela: Getting in its own way.
969
:Tim Winkler: Nice.
970
:Uh, describe your morning routine.
971
:Angela: I get up and I check my phone for
what my calendar is going to be for the
972
:day, and then I head out for a delicious
cup of coffee from my coffee shop.
973
:Tim Winkler: What's your
favorite app on your phone?
974
:Angela: Oh, favorite app on my phone?
975
:Probably, uh, that's a good question.
976
:I might get air horned for this one.
977
:Uh, we'll have to come back to it.
978
:Tim Winkler: Come back to it.
979
:Uh, what's a charity or corporate
philanthropy that's near and dear to you?
980
:Angela: Um, I'm actually big on, uh,
human rights and anything that has to
981
:do with preventing human trafficking.
982
:Thank you.
983
:Tim Winkler: If you could have
dinner with any celebrity past
984
:or present, who would it be with?
985
:Angela: Ooh, Maya Angelou.
986
:Tim Winkler: What was the first, oh
sorry, what was the worst fashion
987
:trend that you've ever followed?
988
:Angela: Oh, probably, you know, back
in the day when we had those great
989
:leotards that you had
to exercise in, that all the comedians
990
:now have created like really great,
991
:uh, spoofs on movement.
992
:That stuff is good.
993
:Tim Winkler: Yeah.
994
:Those are fun.
995
:Little commercials to look back on, right?
996
:Those old:
997
:Angela: And it's really not a good
look for the majority of human beings.
998
:Tim Winkler: Pays it out.
999
:Uh, and the last question, what
was your dream job as a kid?
::
Angela: I actually wanted to
be a, um, uh, heart surgeon.
::
Whoa.
::
Whoa.
::
Tim Winkler: Whoa.
::
Deep.
::
That was deep.
::
That was deep.
::
Um, all right.
::
I, I do want to come back to Diana,
the fictional universe question.
::
Diana: Yeah.
::
Oh, and I wish I got the app question.
::
Um, so I, I actually, so the first, I'm
going to go with the first thing that
::
came into my mind and it's a movie called
Defending Your Life and it's from the 90s.
::
I remember that movie.
::
I thought that was a
really cool place to be.
::
Tim Winkler: Oh, I'm going
to have to Google that.
::
Um, and then Angela, we had to come
back to the question for you too.
::
Angela: It was the app question.
::
And I actually, you know what it is?
::
It's the New York Times games app, where
Wordle and the other crossword things are.
::
Tim Winkler: Yeah.
::
There you go.
::
And connections.
::
Yeah.
::
Angela: One.
::
Tim Winkler: It's a
::
part of my
morning routine, is Connections.
::
Angela: And it's my night routine.
::
So
::
Tim Winkler: Diana.
::
All right.
::
Look, I know you're itching.
::
What's your, what's your favorite?
::
Diana: Oh my gosh.
::
So much.
::
So many questions now.
::
Um, all right.
::
That's, that's, that's, uh, that's a wrap.
::
I think you guys both nailed it.
::
Um, thank you for participating and
thank you for joining us on the podcast.
::
You've been great guests.
::
Uh, sharing your insights on
this evolution of tech talent
::
in the age of gen AI, uh, and
thanks for joining us on the pod.
::
Diana: Thank you.