Transcript
1
00:00:14,025 --> 00:00:17,860
Welcome to AI Goes to College, the podcast that helps higher education
2
00:00:17,920 --> 00:00:21,540
professionals navigate the changes brought on by generative AI.
3
00:00:22,000 --> 00:00:24,660
I'm your host, Doctor Craig Van Slyke.
4
00:00:25,685 --> 00:00:29,305
The podcast is a companion to the AI Goes to College newsletter.
5
00:00:29,845 --> 00:00:32,825
You can sign up for the newsletter at aigoestocollege.com/
6
00:00:38,010 --> 00:00:41,850
newsletter. Today, we've got a very special edition of AI Goes
7
00:00:41,850 --> 00:00:45,605
to College. Rob Crossler from Washington State University
8
00:00:45,665 --> 00:00:49,425
is joining me in our very first interview. Rob
9
00:00:49,425 --> 00:00:53,129
is an associate professor of information systems at the Carson College of
10
00:00:53,129 --> 00:00:56,730
Business at Washington State University, and I think he's soon to be
11
00:00:56,730 --> 00:01:00,245
professor. He serves as chair of the Department of Management,
12
00:01:00,305 --> 00:01:03,925
Information Systems, and Entrepreneurship and holds the Philip
13
00:01:03,985 --> 00:01:07,525
A. Kays Distinguished Professorship in Management
14
00:01:07,585 --> 00:01:11,170
Information Systems. Rob has served as the
15
00:01:11,170 --> 00:01:14,770
president of the AIS Special Interest Group on Information Security and
16
00:01:14,770 --> 00:01:18,594
Privacy, and he's done a lot of other stuff. He's won
17
00:01:18,594 --> 00:01:22,354
research awards. He's been funded by the NSF and the Department
18
00:01:22,354 --> 00:01:26,090
of Defense. His research has appeared in top journals like
19
00:01:26,090 --> 00:01:29,690
MIS Quarterly, Information Systems Research, and the list
20
00:01:29,690 --> 00:01:33,450
goes on and on and on. But, anyway, I'm really happy that Rob was able
21
00:01:33,450 --> 00:01:36,875
to join me for this wide ranging and quite
22
00:01:36,875 --> 00:01:40,314
interesting conversation about how generative AI is
23
00:01:40,314 --> 00:01:44,020
shaping higher education. Rob, thanks for being
24
00:01:44,020 --> 00:01:47,780
the very first guest on AI Goes to College. Great.
25
00:01:47,780 --> 00:01:50,945
I'm glad I could be here, and I I look forward to this conversation. Yep.
26
00:01:50,945 --> 00:01:53,985
That's what you say now. We'll see what you say at the end. Well, it's
27
00:01:53,985 --> 00:01:57,105
up to the questions you ask me, Craig, I guess. That's right. That's
28
00:01:57,105 --> 00:02:00,729
right. So I wanna start off with, how do you
29
00:02:00,729 --> 00:02:04,329
use AI? And here, we're talking specifically about
30
00:02:04,329 --> 00:02:08,030
generative AI. So if I say AI, I mean generative AI.
31
00:02:08,455 --> 00:02:12,235
Perfect. I use generative AI in in a couple of different ways. As
32
00:02:12,775 --> 00:02:16,295
a professor, in the classroom, I will use it to
33
00:02:16,295 --> 00:02:19,340
actually help me write questions for students. I
34
00:02:20,120 --> 00:02:23,640
take my general ideas that I think will make a really good question, and I
35
00:02:23,640 --> 00:02:27,475
feed it in through prompts. And it does a much quicker job, much more
36
00:02:27,475 --> 00:02:30,835
efficient job of getting to the point where I've got material to use in
37
00:02:30,835 --> 00:02:34,460
the classroom. I also encourage students. It's like,
38
00:02:34,460 --> 00:02:38,220
okay, let's open up generative AI and see how it can help us with our
39
00:02:38,220 --> 00:02:41,120
assignments. So I very easily
40
00:02:41,795 --> 00:02:45,474
feel comfortable saying, you know, this is a new technology. We're information
41
00:02:45,474 --> 00:02:49,235
systems students. Let's leverage it. And then they start asking the questions of,
42
00:02:49,235 --> 00:02:52,730
well, how can we use it responsibly? How can we not be accused of cheating?
43
00:02:52,790 --> 00:02:56,630
That's usually what they're concerned about. How is this not cheating? And my
44
00:02:56,630 --> 00:03:00,415
answer to that has always been transparency. The more you bring
45
00:03:00,415 --> 00:03:04,174
in recognition that you're using it, how and why, in my eyes, that's
46
00:03:04,174 --> 00:03:07,790
not much different than I use Google to help me figure out something that I
47
00:03:07,790 --> 00:03:11,310
baked into an answer on something. Interestingly, I had a student in my
48
00:03:11,310 --> 00:03:14,689
office this last week who was working on their resume,
49
00:03:15,555 --> 00:03:18,834
and they didn't have a lot of experience. Right? So the only job they've
50
00:03:18,834 --> 00:03:22,295
had was delivering sandwiches for a sandwich delivery company.
51
00:03:22,700 --> 00:03:26,300
And when they were describing kind of what that was, it said I
52
00:03:26,300 --> 00:03:29,500
delivered sandwiches to people's houses. I made sure they were on time and then they
53
00:03:29,500 --> 00:03:33,084
were, you know, they stayed fresh while I delivered them. Very generic things that had
54
00:03:33,084 --> 00:03:36,765
nothing to do with an information systems student seeking
55
00:03:36,765 --> 00:03:40,260
an information systems job. And so I said, well, let's
56
00:03:40,260 --> 00:03:43,959
ask ChatGPT how to make
57
00:03:44,580 --> 00:03:47,880
experience as a sandwich delivery driver sound appealing
58
00:03:48,885 --> 00:03:52,725
for an information systems student seeking an information systems job.
59
00:03:52,725 --> 00:03:55,625
And it changed some of the words for the student to
60
00:03:56,290 --> 00:03:59,349
utilize GPS to ensure a 15%
61
00:04:00,050 --> 00:04:03,510
increase in delivery time efficiency. Right?
62
00:04:03,995 --> 00:04:07,595
Saying kind of the same thing about how they're delivering sandwiches, but bringing
63
00:04:07,595 --> 00:04:11,355
technology into that equation and giving a student who is struggling with how
64
00:04:11,355 --> 00:04:15,159
do I capture information systems in this in a way that
65
00:04:15,159 --> 00:04:18,680
worked. Right? And so showing the student how to
66
00:04:18,680 --> 00:04:22,465
utilize this tool to help them say more creative things, I saw
67
00:04:22,465 --> 00:04:26,305
more creative product from them, which helped them to get past the stumbling block
68
00:04:26,305 --> 00:04:29,810
of, I don't know how to be creative with something like that. That's
69
00:04:29,810 --> 00:04:33,569
great. So how has it worked out to have students open
70
00:04:33,569 --> 00:04:37,409
up their AI tool of choice in the classroom to dig into
71
00:04:37,409 --> 00:04:41,165
something? Because I'm teaching the junior level principles of
72
00:04:41,165 --> 00:04:44,545
information systems class in the spring yet again.
73
00:04:45,005 --> 00:04:48,730
And I've toyed with the idea of really minimizing
74
00:04:48,870 --> 00:04:52,550
some of the lecturing and just say, okay. What we're gonna do, you're
75
00:04:52,550 --> 00:04:56,229
gonna go out and and let's say we're talking about what is an information
76
00:04:56,229 --> 00:04:59,995
system. Plug it into ChatGPT, Poe, Gemini,
77
00:05:00,055 --> 00:05:03,735
whatever you use, see what it comes up with. Now let's
78
00:05:03,735 --> 00:05:07,440
see what AI comes up with, and let's compare that to
79
00:05:07,440 --> 00:05:11,200
what's in the book or see where, you know, maybe
80
00:05:11,200 --> 00:05:14,875
that's not the best answer or that's pretty solid. Mhmm. But I'm
81
00:05:14,875 --> 00:05:18,495
I'm not sure how that would work, but it sounds like you're doing something
82
00:05:18,555 --> 00:05:22,340
similar to that. A little bit. So the class I teach is a
83
00:05:22,419 --> 00:05:25,960
junior level. I've got juniors and seniors in a cybersecurity class.
84
00:05:26,660 --> 00:05:30,180
And what I've found as
85
00:05:30,340 --> 00:05:34,185
I've done similar sorts of things with students is that some are afraid to
86
00:05:34,185 --> 00:05:37,945
use it. Others embrace using it and just, like, how creative can I
87
00:05:37,945 --> 00:05:41,770
get? And so what I do and
88
00:05:41,770 --> 00:05:45,370
what I've encouraged others to do is to try it,
89
00:05:45,370 --> 00:05:49,115
to experiment, and to be okay with failure. Because from
90
00:05:49,115 --> 00:05:52,955
failure, we learn. And the caveat I give in
91
00:05:52,955 --> 00:05:56,795
being accepting of failure is to also take the approach of do
92
00:05:56,795 --> 00:06:00,370
no harm. So just because we're experimenting with this new
93
00:06:00,370 --> 00:06:03,970
technology and how we go about utilizing it, I don't think the
94
00:06:03,970 --> 00:06:07,634
students should get a lower grade because I decided to do
95
00:06:07,634 --> 00:06:11,474
something creative and experimental. So with the mindset of the student will not
96
00:06:11,474 --> 00:06:15,060
be harmed by this, let's get creative. Let's see how the learning
97
00:06:15,060 --> 00:06:18,340
happens. And in a lot of ways, where I think the learning happens is, you
98
00:06:18,340 --> 00:06:21,700
know, in the student's mind, it happens because they used a new tool. They learned
99
00:06:21,700 --> 00:06:25,414
how to use a new tool. I actually think the learning happens in
100
00:06:25,414 --> 00:06:29,254
the reflection, which is what did I learn from this? How did this
101
00:06:29,254 --> 00:06:32,940
help me advance? And then how could I do something similar
102
00:06:33,640 --> 00:06:37,400
in the future of my own accord? Yeah. Yeah. I
103
00:06:37,400 --> 00:06:40,815
tried a little experiment this quarter where I gave a
104
00:06:40,815 --> 00:06:44,655
question that I've used every time I teach this class. It's
105
00:06:44,655 --> 00:06:48,470
really simple. It's compare and contrast customer relationship
106
00:06:48,530 --> 00:06:52,229
management systems and supply chain management systems. And
107
00:06:52,849 --> 00:06:56,389
I got to thinking, they're gonna use generative AI for this.
108
00:06:56,765 --> 00:07:00,085
And so what I did is I put that into generative AI, put it into
109
00:07:00,205 --> 00:07:03,185
I think I used ChatGPT, got the answer,
110
00:07:03,940 --> 00:07:07,699
and said in the assignment, okay, here's the question. I put it
111
00:07:07,699 --> 00:07:11,505
into ChatGPT. Here's the answer. Now what I want
112
00:07:11,505 --> 00:07:14,965
you to do is compare that to what's in the book.
113
00:07:15,505 --> 00:07:18,805
How are the two answers similar? How are the two answers different?
114
00:07:19,479 --> 00:07:22,940
What we talked about in the book and what ChatGPT came up
115
00:07:23,240 --> 00:07:26,759
with was pretty similar, but here are a couple of things in the Chat
116
00:07:26,759 --> 00:07:29,965
GPT answer I'm not so sure are right.
117
00:07:30,664 --> 00:07:33,224
And then it brought up some things that we didn't bring up. So I think
118
00:07:33,224 --> 00:07:36,745
it was a good exercise for them. The responses fell into 2
119
00:07:36,745 --> 00:07:40,479
categories: students who actually did what I asked them to do
120
00:07:41,099 --> 00:07:44,620
and students who just answered the original question and had no clue what I was
121
00:07:44,620 --> 00:07:48,384
really trying to get at. Mhmm. So but I think there's a
122
00:07:48,384 --> 00:07:51,585
larger point here, and I want to see what you think of this. I think
123
00:07:51,585 --> 00:07:55,264
it's a mistake to pretend that the tool doesn't exist and to pretend that they're
124
00:07:55,264 --> 00:07:58,950
not going to use it. By doing things like what,
125
00:07:59,010 --> 00:08:02,690
Rob, you're talking about and what I'm talking about, we teach
126
00:08:02,690 --> 00:08:05,590
them, to your earlier point, kind of about the tool
127
00:08:06,384 --> 00:08:09,985
and and how it can be used. But we also teach
128
00:08:09,985 --> 00:08:13,525
them to be critical users of the tool.
129
00:08:14,139 --> 00:08:17,840
Because as anybody who's used any of these tools knows,
130
00:08:18,220 --> 00:08:22,000
it'll make stuff up. And so they can't just take
131
00:08:22,525 --> 00:08:25,885
and say, okay, this is right because ChatGPT said so. So what do you
132
00:08:25,885 --> 00:08:29,645
think? Are you in agreement with that? Or Absolutely in agreement. As I think
133
00:08:29,645 --> 00:08:33,460
back about, you know, my career so far in academia, critical thinking has
134
00:08:33,460 --> 00:08:37,220
always been one of the things that industry has said
135
00:08:37,220 --> 00:08:40,174
our students need to be better at. Right? I don't know how you perfectly
136
00:08:40,955 --> 00:08:44,635
enable students to be critical thinkers, but I think generative AI gives a great
137
00:08:44,635 --> 00:08:48,230
place to come up with activities that focus on,
138
00:08:48,230 --> 00:08:51,910
okay, we're using the tools, but then, well, how does the critical thinking
139
00:08:51,910 --> 00:08:55,325
occur? How are we confident that we have correct
140
00:08:55,325 --> 00:08:59,085
information in what we use? And so I was talking with someone on the
141
00:08:59,085 --> 00:09:01,985
the board of advisers for us at WSU
142
00:09:02,830 --> 00:09:06,510
who shared that he thinks our students are in a unique position as they
143
00:09:06,510 --> 00:09:10,270
enter the marketplace to where people in industry
144
00:09:10,270 --> 00:09:13,895
are gonna be looking to the students coming out of college right now about how
145
00:09:13,895 --> 00:09:17,735
to use generative AI because they fully expect that in all the free
146
00:09:17,735 --> 00:09:21,415
time, and I put air quotes around free time, all the free time students
147
00:09:21,415 --> 00:09:24,840
have is they're playing with these technologies and they're figuring out how to use
148
00:09:24,840 --> 00:09:28,520
them in ways that industry is not yet. And so I think preparing
149
00:09:28,520 --> 00:09:32,245
students, a, to feel confident that they can enter into a field and bring
150
00:09:32,245 --> 00:09:36,085
something to the conversation is important for us as academics to
151
00:09:36,085 --> 00:09:39,860
be able to do, but also how am I critical of that thinking or critically
152
00:09:39,860 --> 00:09:43,700
thinking about that information I'm getting. And the example I like to give
153
00:09:43,700 --> 00:09:47,460
my students when we talk about this is let's assume that
154
00:09:47,460 --> 00:09:51,175
you created a document and you gave it to your boss because he
155
00:09:51,175 --> 00:09:53,995
gave you an assignment to go out and do things. And
156
00:09:55,630 --> 00:09:59,390
your boss is gonna use that to go and, you know, pitch to clients
157
00:09:59,390 --> 00:10:03,154
to do something that's gonna put your boss in the hot seat. You wanna
158
00:10:03,154 --> 00:10:06,834
know that what you're giving your boss is good and
159
00:10:06,834 --> 00:10:10,380
reliable. And if not, it's gonna come back and bite you. If you're
160
00:10:10,380 --> 00:10:13,900
using generative AI to come up with that, how are you
161
00:10:13,900 --> 00:10:17,580
confident in what you're giving your boss that it's not going to
162
00:10:17,580 --> 00:10:21,355
embarrass him in front of a client that's potentially gonna write a check
163
00:10:21,355 --> 00:10:25,195
for many millions of dollars for something your organization is building
164
00:10:25,195 --> 00:10:28,260
up. But it's no different than doing a Google search. Right? You're gonna go out
165
00:10:28,260 --> 00:10:31,720
and do a Google search and you're gonna make an assessment of, you know, that's somebody's
166
00:10:31,940 --> 00:10:35,400
blog that I don't know what makes them an expert on that topic
167
00:10:35,460 --> 00:10:39,214
versus that's a white paper from a leading
168
00:10:39,435 --> 00:10:43,195
industry consulting firm that has the legitimacy to be able to
169
00:10:43,195 --> 00:10:46,980
say something. And how do you then discern that in a generative
170
00:10:46,980 --> 00:10:50,580
AI world? And then that opens the door for a lot of really interesting
171
00:10:50,580 --> 00:10:54,315
conversations with students about how do they know that they believe what it
172
00:10:54,315 --> 00:10:58,075
is that they're getting out of these products. Yeah, absolutely. And
173
00:10:58,235 --> 00:11:01,960
that's so important for them because I think
174
00:11:01,960 --> 00:11:05,720
they use technology so much and they have for so much of their lives
175
00:11:05,720 --> 00:11:08,220
that they just uncritically accept
176
00:11:09,215 --> 00:11:13,055
what Google or, you know, whatever returns. I mean, it's just
177
00:11:13,055 --> 00:11:15,395
that that must be right. And
178
00:11:16,430 --> 00:11:19,950
AI is so confident in its bad
179
00:11:19,950 --> 00:11:23,710
answers. I mean, it's really kind of impressive. I wish I could be that
180
00:11:23,710 --> 00:11:27,375
confident even on my right answers, but it's just, you know,
181
00:11:27,375 --> 00:11:31,215
yep. The, you know, the Declaration of
182
00:11:31,215 --> 00:11:34,490
Independence was signed on July 4,
183
00:11:34,630 --> 00:11:38,310
1982, you know, or whatever. And it just this is the
184
00:11:38,310 --> 00:11:42,004
fact. They really do need to understand that they have to
185
00:11:42,004 --> 00:11:44,985
be critical consumers. I use the parallel of Wikipedia.
186
00:11:46,165 --> 00:11:49,840
So my wife, Tracy, and I are watching some movie, and we're trying to, like,
187
00:11:49,840 --> 00:11:53,280
you know, who else was in this, or what else was this
188
00:11:53,280 --> 00:11:56,660
person in? I pull up their Wikipedia page.
189
00:11:57,040 --> 00:12:00,705
Well, who cares if that's right? It puts in a movie
190
00:12:00,705 --> 00:12:04,225
that they weren't really in or it misses a television show they were
191
00:12:04,225 --> 00:12:07,825
in. Who cares? But if you're gonna embarrass your
192
00:12:07,825 --> 00:12:11,519
boss and get your rear end fired, you know,
193
00:12:11,519 --> 00:12:15,060
you better not be using Wikipedia, and you better not be
194
00:12:15,439 --> 00:12:19,115
uncritically using generative AI. So that brings me
195
00:12:19,115 --> 00:12:22,715
to another question, and this is something that came up
196
00:12:22,715 --> 00:12:26,475
recently in a couple of different conversations I was
197
00:12:26,475 --> 00:12:30,220
having. Some people seem to think that students are all using this
198
00:12:30,220 --> 00:12:33,660
like crazy. My experience has been
199
00:12:33,660 --> 00:12:37,205
that relatively few of them are actually using
200
00:12:37,205 --> 00:12:41,045
it at all, and just a subset of
201
00:12:41,045 --> 00:12:44,610
that group is using it very much. So we, you
202
00:12:44,610 --> 00:12:47,890
know, we, I'm using the big Mhmm.
203
00:12:48,210 --> 00:12:51,890
broad royal we, not royal we, but I'm using the big broad
204
00:12:51,890 --> 00:12:55,535
we, seem to think that students are all over the latest technology,
205
00:12:56,475 --> 00:12:58,955
and so they're all over this. But I don't think that's true. I don't think
206
00:12:58,955 --> 00:13:02,540
many students are using it. So what's your take? That's
207
00:13:02,540 --> 00:13:06,240
generally what I see. And
208
00:13:06,300 --> 00:13:09,980
the way I come to that conclusion is the conversations I have with
209
00:13:09,980 --> 00:13:12,745
students when we start using it is
210
00:13:13,444 --> 00:13:17,204
almost a fear to use it at least in the academic space because there
211
00:13:17,204 --> 00:13:20,940
has been so much fear built into them that it
212
00:13:20,940 --> 00:13:24,620
is cheating. And so those who have used
213
00:13:24,620 --> 00:13:28,140
it have done, you know, stupid party games where it's like, write a story
214
00:13:28,140 --> 00:13:31,565
about an adventure I went on or, you know, write this poem in
215
00:13:31,565 --> 00:13:35,165
limerick format or something that is just them
216
00:13:35,165 --> 00:13:38,939
playing with it, not using it purposefully as something that's helping
217
00:13:38,939 --> 00:13:42,699
them with their coursework or what they're doing academically. There's a lot of
218
00:13:42,699 --> 00:13:46,240
fear in students' minds about what's gonna get them right up the flagpole
219
00:13:46,375 --> 00:13:50,055
for an integrity violation. So, yeah, I see a lot of
220
00:13:50,055 --> 00:13:53,899
students that that's their first biggest question is they're like, what? You mean I
221
00:13:53,899 --> 00:13:57,360
can use this in your class and this is allowed? Because
222
00:13:57,899 --> 00:14:01,500
I've been told through so many channels that it's not. One of my
223
00:14:01,500 --> 00:14:05,225
colleagues, Tom Stafford, recently taught a class, and he
224
00:14:05,225 --> 00:14:08,764
required them to use it. I want you to use generative
225
00:14:08,824 --> 00:14:12,125
AI for whatever. I don't remember exactly what he was having them do,
226
00:14:12,700 --> 00:14:16,380
and then refine that. Mhmm. Use that to help you create your
227
00:14:16,380 --> 00:14:19,840
final product, which I thought was a really good way to do it.
228
00:14:20,035 --> 00:14:23,715
Yep. That was the final exam I gave last semester. I
229
00:14:23,715 --> 00:14:27,235
basically told them, choose any of the topics that we've talked about this
230
00:14:27,235 --> 00:14:30,740
semester and create a document related to that particular
231
00:14:30,800 --> 00:14:34,500
topic, your choice, but utilize generative
232
00:14:34,639 --> 00:14:38,445
AI to do this and give it at least,
233
00:14:38,445 --> 00:14:42,205
you know, 3 iterations of prompts. Share with me, a,
234
00:14:42,205 --> 00:14:45,990
what you created, and, b, what were your prompts? What was the
235
00:14:45,990 --> 00:14:49,830
output for each of your prompts? And then what was your reflection on what
236
00:14:49,830 --> 00:14:53,385
you received from each of those prompts and, you know, how you
237
00:14:53,385 --> 00:14:57,065
critically assessed it. And, really, the grading of that was on
238
00:14:57,065 --> 00:15:00,605
their ability to build more critical prompting.
239
00:15:00,665 --> 00:15:04,070
Right? Playing with that prompting to get them to what
240
00:15:04,070 --> 00:15:07,210
they want, but also to demonstrate that
241
00:15:07,830 --> 00:15:11,545
they validated in some way, shape, or form the information that they
242
00:15:11,545 --> 00:15:15,305
got. And I used generative AI to write the question. I'd let students
243
00:15:15,305 --> 00:15:18,790
know that. I was completely transparent. And I was afraid I was gonna
244
00:15:18,790 --> 00:15:22,329
have all the students receive 100% on that assignment.
245
00:15:22,550 --> 00:15:26,225
And I had variance in grades. It created the rubric for me. It is
246
00:15:26,225 --> 00:15:29,745
so good at creating rubrics that I will never create a rubric all by myself
247
00:15:29,745 --> 00:15:33,560
again. I had students that got perfect scores and I had students
248
00:15:33,560 --> 00:15:36,380
that got C's. I think that was kinda my range, C's
249
00:15:37,319 --> 00:15:40,765
to A's. And my overall class average was around
250
00:15:41,225 --> 00:15:44,265
an 85 to an 88 or something on that exam. So a little bit higher than
251
00:15:44,265 --> 00:15:47,705
I probably would have had on a final exam. But, again, going back to kind
252
00:15:47,705 --> 00:15:50,980
of that whole idea of do no harm, I didn't want students to be
253
00:15:50,980 --> 00:15:54,660
punished because I came up with a creative way of doing this exam. But,
254
00:15:54,660 --> 00:15:58,259
also, I think those that did well embraced the ability to be
255
00:15:58,259 --> 00:16:02,075
creative with these tools, and they played with it, and they they did something useful.
256
00:16:02,135 --> 00:16:05,895
And then others struggled a little bit, but I
257
00:16:05,895 --> 00:16:09,220
think I was okay with that, right, as an IS course and
258
00:16:09,220 --> 00:16:12,280
what I was asking students to do. So I think it turned out well,
259
00:16:12,420 --> 00:16:16,245
and I think you can require students to do it. And it's
260
00:16:16,245 --> 00:16:20,084
just ultimately what I'm grading on now is not can
261
00:16:20,084 --> 00:16:23,740
you create the final product, but more
262
00:16:23,740 --> 00:16:27,180
like the process of how do you get to create that final product. I can
263
00:16:27,180 --> 00:16:30,855
look at the process with the prompting and those sorts of things. But also, I
264
00:16:30,855 --> 00:16:34,535
get to see some insight into critical thinking that really I
265
00:16:34,535 --> 00:16:38,135
wasn't paying attention to before, when I was setting up questions in that sort of
266
00:16:38,135 --> 00:16:41,550
way. Yeah. I think that you're really on to
267
00:16:41,550 --> 00:16:44,450
something. Seems like that's the second time I've used that phrase.
268
00:16:45,310 --> 00:16:48,625
This is really ChatGPT. Nevertheless, nonetheless
269
00:16:49,404 --> 00:16:53,245
Craig GPT. That's right. Craig GPT. I'm going to see if
270
00:16:53,245 --> 00:16:56,720
I can form a coherent thought here. Technology
271
00:16:57,100 --> 00:17:00,940
helps free people from mundane tasks. At
272
00:17:00,940 --> 00:17:04,705
least it can. And so we in higher
273
00:17:04,705 --> 00:17:08,065
ed, higher education, have
274
00:17:08,065 --> 00:17:11,780
been criticized for a long time that we focus too much on
275
00:17:11,780 --> 00:17:15,480
memorization and that sort of thing and not enough on
276
00:17:15,780 --> 00:17:19,240
critical thinking and creativity, which I think is a fair criticism.
277
00:17:20,174 --> 00:17:23,634
Students need to know some things. They just need to know certain things,
278
00:17:23,934 --> 00:17:27,154
certain facts. But, you know, if I forget
279
00:17:28,000 --> 00:17:30,580
the cutoff for some statistic,
280
00:17:31,840 --> 00:17:35,360
I don't sit down and memorize every possible heuristic for every
281
00:17:35,360 --> 00:17:39,195
possible statistical technique I might use, I get on
282
00:17:39,195 --> 00:17:41,755
Google and look it up or I go on Google Scholar, find a paper and
283
00:17:41,755 --> 00:17:45,250
look it up. So I don't wanna waste my
284
00:17:45,250 --> 00:17:48,930
limited brainpower remembering a bunch of stuff that I can find in 30
285
00:17:48,930 --> 00:17:52,290
seconds or a minute. And so I think you're kind of doing the same
286
00:17:52,290 --> 00:17:56,054
thing. Yeah. I mean, they need to know the concepts, and they need
287
00:17:56,054 --> 00:17:59,815
to understand things at a certain level. But, you
288
00:17:59,815 --> 00:18:03,320
know, what are the stages in this
289
00:18:03,320 --> 00:18:07,080
process, or what are all the components of some
290
00:18:07,080 --> 00:18:10,845
framework? You know, do they really need to memorize all
291
00:18:10,845 --> 00:18:14,525
of that? Maybe some of them, but some of them, they don't. And so
292
00:18:14,525 --> 00:18:18,044
go out to Gemini or ChatGPT and look that up. And
293
00:18:18,044 --> 00:18:21,450
then the real goal is how you apply that.
294
00:18:21,990 --> 00:18:24,550
So I I think that's kind of what I was hearing as you were talking
295
00:18:24,550 --> 00:18:28,085
about what you did on the final. Yeah. Absolutely.
296
00:18:28,385 --> 00:18:31,925
And it's pushing students in the direction of
297
00:18:32,145 --> 00:18:35,745
how do you utilize tools to do
298
00:18:35,745 --> 00:18:39,150
better. Well, and I think that's a critical point for
299
00:18:39,150 --> 00:18:42,990
everybody to understand. And you might disagree, I don't think you do, but
300
00:18:42,990 --> 00:18:46,290
you might disagree with me on this. I think we have an ethical obligation,
301
00:18:46,735 --> 00:18:50,195
particularly in our field, Rob and I are both in information systems,
302
00:18:51,375 --> 00:18:54,835
to help students understand how to use these new tools effectively.
303
00:18:55,830 --> 00:18:59,270
I mean, it's not all that different than
304
00:18:59,270 --> 00:19:03,005
when spreadsheets started becoming popular. You
305
00:19:03,005 --> 00:19:06,685
know, why wouldn't you use that tool instead of pulling out your
306
00:19:06,685 --> 00:19:10,480
calculator or your slide rule or your abacus or whatever? Do you
307
00:19:10,480 --> 00:19:13,540
agree with that, that we really do have an obligation to help them learn?
308
00:19:14,080 --> 00:19:17,760
Absolutely. And I think that's what students want from us as
309
00:19:17,760 --> 00:19:21,525
professionals. There's a lot of fear for students of how they enter this
310
00:19:21,525 --> 00:19:24,885
marketplace. If you watch the news, doom and gloom sells
311
00:19:24,885 --> 00:19:28,440
news. And the doom and gloom that you hear is all the jobs are gonna
312
00:19:28,440 --> 00:19:32,200
be replaced. Everything we do, you know, ChatGPT is gonna do
313
00:19:32,200 --> 00:19:35,720
for us, which I don't think is true. I think it's gonna change the nature
314
00:19:35,720 --> 00:19:39,304
of how we do the things that we do. And with what we are as
315
00:19:39,304 --> 00:19:42,924
information systems professionals, I think we're in a unique place where
316
00:19:43,544 --> 00:19:47,040
as we embrace and utilize these tools, I
317
00:19:47,040 --> 00:19:50,480
think our job of helping people to use technology and be able to use it
318
00:19:50,480 --> 00:19:54,020
better to do their jobs is exactly what we've been doing for decades.
319
00:19:54,245 --> 00:19:57,925
Right. And now it's at the forefront. It's a place where we can be
320
00:19:57,925 --> 00:20:01,685
leaders in ways that may have been a little bit harder to earn that ability
321
00:20:01,685 --> 00:20:03,700
to step up to the plate and to be able to do that. So if
322
00:20:03,700 --> 00:20:06,919
we can prepare our students to have the confidence to step into that,
323
00:20:07,380 --> 00:20:10,899
I think we're gonna have some incredible success stories that
324
00:20:11,664 --> 00:20:14,784
we're gonna be able to talk about, you know, 2, 3 years from now. Well,
325
00:20:14,784 --> 00:20:17,904
there's a quote that's been floating around the
326
00:20:18,225 --> 00:20:21,330
interwebs. I can't remember who supposedly said it.
327
00:20:22,130 --> 00:20:25,029
It's not that AI is gonna replace jobs.
328
00:20:25,970 --> 00:20:29,809
It's that people using AI are going to replace people that don't
329
00:20:29,809 --> 00:20:33,485
use AI. Absolutely. And, yeah, I think that makes a lot of
330
00:20:33,485 --> 00:20:36,705
sense. So I wanna go down a little bit different path here.
331
00:20:37,565 --> 00:20:41,279
How are you seeing your colleagues react to
332
00:20:41,279 --> 00:20:44,580
this? I had a conversation. I'm not gonna disclose too much,
333
00:20:45,440 --> 00:20:48,500
but the impression this person had was that another
334
00:20:49,425 --> 00:20:52,245
entity, an administrative entity,
335
00:20:53,185 --> 00:20:56,325
was basically telling students don't use it. It's cheating to use it,
336
00:20:57,640 --> 00:21:01,340
which kind of surprised me a little bit. But what are you hearing? So
337
00:21:01,880 --> 00:21:05,480
I'm lucky in some ways in the Carson College of Business here at
338
00:21:05,480 --> 00:21:09,325
WSU. Our leadership has embraced generative
339
00:21:09,385 --> 00:21:12,985
AI from day 1. Right? We have an interim dean in place who is
340
00:21:12,985 --> 00:21:16,670
from an information systems background, and she
341
00:21:16,670 --> 00:21:20,510
saw the strategic vision of generative AI. So we've been talking about
342
00:21:20,510 --> 00:21:24,345
it as a college since it hit the scene. Initially, awareness. What is
343
00:21:24,345 --> 00:21:27,385
it? What can it, you know, do for us? There's a lot of, oh, no.
344
00:21:27,385 --> 00:21:31,065
This is scary. This is bad. But we had conversations that said,
345
00:21:31,065 --> 00:21:34,730
yes. It could be, but here's how we can embrace it and here's how we
346
00:21:34,730 --> 00:21:38,090
can use it. And we've moved from the awareness stage to actually
347
00:21:38,090 --> 00:21:41,230
doing intentional things as faculty to say, okay.
348
00:21:41,845 --> 00:21:44,645
Even those of you who aren't using it yet, let's put you with those who
349
00:21:44,645 --> 00:21:48,325
are and get you confident in your ability to adopt it and to begin using
350
00:21:48,325 --> 00:21:51,880
it. So how do we move people, you know, from awareness to adoption and
351
00:21:51,880 --> 00:21:55,559
use of the technology? And so within
352
00:21:55,559 --> 00:21:59,165
my college, we have seen very little
353
00:21:59,165 --> 00:22:03,005
resistance of people telling students that you can't use it, it's bad, because we've
354
00:22:03,005 --> 00:22:06,550
had a very strong openness of conversation
355
00:22:06,690 --> 00:22:10,450
as a college to that. Where we've run
356
00:22:10,450 --> 00:22:14,205
into issues is there are other people at Washington State University
357
00:22:14,505 --> 00:22:18,345
in other classes and other disciplines where they are telling students, do not use it
358
00:22:18,345 --> 00:22:22,150
it's cheating. It's bad. Initially, our academic
359
00:22:23,010 --> 00:22:26,770
integrity council who are looking for cheating had a policy that said, if
360
00:22:26,770 --> 00:22:30,475
2 different ChatGPT detectors report
361
00:22:30,475 --> 00:22:34,235
70% or higher likelihood that it was generated by Chat
362
00:22:34,235 --> 00:22:37,835
GPT, then we're going to go ahead and call that an academic integrity
363
00:22:37,835 --> 00:22:41,340
violation. Okay. They've moved away from that. Thank goodness. That was kind of the initial
364
00:22:41,340 --> 00:22:45,180
positioning, but it really did put a scare on people. I've seen
365
00:22:45,180 --> 00:22:48,965
research that suggests that the ChatGPT detectors are biased. They're
366
00:22:48,965 --> 00:22:52,565
biased against people for whom English is their second language because when you
367
00:22:52,565 --> 00:22:56,130
learn English later in life, it's usually based on rules. So you're very
368
00:22:56,130 --> 00:22:59,970
systematic in how you use it, and the breadth of your word
369
00:22:59,970 --> 00:23:03,810
choices is not as large as someone who initially learned the English language. And
370
00:23:03,810 --> 00:23:07,434
so if the way we're detecting it is biased against people for whom
371
00:23:07,434 --> 00:23:10,875
English was their second language, in many ways, that's wrong. Right? We should
372
00:23:10,875 --> 00:23:14,510
not be using it that way. And so I think as a higher ed
373
00:23:14,510 --> 00:23:18,350
institution, we're beginning to see that. But students are getting conflicting
374
00:23:18,350 --> 00:23:21,870
messages across the different courses at least at a university
375
00:23:21,870 --> 00:23:25,414
level. And that's why I'm an advocate for let's talk about this as much as
376
00:23:25,414 --> 00:23:29,095
we can in the places where we can talk about it and have influence
377
00:23:29,095 --> 00:23:32,720
and power to be able to do that to, you know, lead the way as
378
00:23:32,720 --> 00:23:36,480
opposed to putting our heads in the sand. Well, I'm glad to hear
379
00:23:36,480 --> 00:23:40,115
they're moving away from the detectors because the detectors
380
00:23:40,115 --> 00:23:43,655
are largely garbage. It occurs to me
381
00:23:44,515 --> 00:23:48,215
that it's tough for us to hold students completely responsible
382
00:23:49,399 --> 00:23:53,179
when we haven't helped them understand kind of what the guardrails
383
00:23:53,240 --> 00:23:57,000
are. You know, they don't even really understand plagiarism in a
384
00:23:57,000 --> 00:24:00,285
lot of ways. Well, I didn't directly copy and paste it,
385
00:24:00,905 --> 00:24:04,505
or I did and I cited it. You know, that's not plagiarism. And so
386
00:24:04,665 --> 00:24:08,160
and, you know, plagiarism is not as black and white as sometimes
387
00:24:08,220 --> 00:24:11,680
people might think it is. So they don't understand plagiarism
388
00:24:11,900 --> 00:24:15,725
that's been talked about forever. How can we
389
00:24:16,105 --> 00:24:19,565
be sure that they understand where the limits are on using generative
390
00:24:19,625 --> 00:24:23,409
AI? Now I think most of them would know if you put a question
391
00:24:23,409 --> 00:24:26,710
in, copy and paste it, that's probably crossing the line.
392
00:24:27,570 --> 00:24:30,929
But anything that's close to an edge case is really confusing for
393
00:24:30,929 --> 00:24:34,675
everybody, especially students. And back to an earlier point you
394
00:24:34,675 --> 00:24:38,375
made, if we make them terrified
395
00:24:39,315 --> 00:24:42,730
of trying to use some new tool that's
396
00:24:42,950 --> 00:24:46,710
changing the way the world works, you know, we're kinda doing them a
397
00:24:46,710 --> 00:24:50,550
disservice. But I'm glad to hear that WSU, at
398
00:24:50,550 --> 00:24:53,965
least in the Carson College of Business, has kind
399
00:24:53,965 --> 00:24:57,404
of adopted it. Our dean was also
400
00:24:57,404 --> 00:25:00,919
a big proponent of it early on. Pretty
401
00:25:00,919 --> 00:25:04,679
early on after ChatGPT was released, he
402
00:25:04,679 --> 00:25:08,365
had me put on a couple of brown bags, you know, for faculty,
403
00:25:08,365 --> 00:25:11,665
and we're working on a policy and
404
00:25:12,365 --> 00:25:16,125
those sorts of things. Actually, I think ChatGPT was, you
405
00:25:16,125 --> 00:25:19,490
know, the technology that seemed to change the world just over a year ago.
406
00:25:19,950 --> 00:25:23,730
But when I step back and I think about it, it's not
407
00:25:23,870 --> 00:25:26,945
necessarily generative AI. It's not necessarily ChatGPT.
408
00:25:27,565 --> 00:25:30,785
But what's more important, I think, is how
409
00:25:31,725 --> 00:25:35,265
do you embrace and start using a new disruptive
410
00:25:35,440 --> 00:25:39,280
technology in the world. Right? And so there's gonna be
411
00:25:39,280 --> 00:25:42,720
something else 3 years from now. And if our students aren't equipped for how do
412
00:25:42,720 --> 00:25:46,505
I adjust what I'm doing in the workplace and leading with technology
413
00:25:46,505 --> 00:25:49,945
in the workplace and figuring out how to think about these new technologies in the
414
00:25:49,945 --> 00:25:53,540
workplace, in many ways, they've lost that opportunity to
415
00:25:53,540 --> 00:25:57,220
be, you know, uniquely positioned to learn about it while they're, you know,
416
00:25:57,220 --> 00:26:01,054
sitting in a higher ed classroom right now. So it is, I
417
00:26:01,054 --> 00:26:04,515
I think, in many ways, as developmental of
418
00:26:05,190 --> 00:26:07,910
how do I do things 2 years from now when that next new thing comes
419
00:26:07,910 --> 00:26:11,750
out versus what are the rules about this one new thing that came out right
420
00:26:11,750 --> 00:26:15,015
now? Well and we tend to forget about how
421
00:26:15,795 --> 00:26:18,055
early users are often ridiculed.
422
00:26:20,320 --> 00:26:23,540
Yeah. I absolutely agree. And that's where I think
423
00:26:24,160 --> 00:26:27,920
that's what needs to be embraced and taught about is how do you
424
00:26:28,605 --> 00:26:31,325
how are you that early adopter? How are you, you know, when it comes
425
00:26:31,325 --> 00:26:35,025
to technology, the IS people are gonna be the ones that people look to.
426
00:26:35,085 --> 00:26:37,580
And so how do you figure out how to use it? Well, the first way
427
00:26:37,580 --> 00:26:41,180
is you just start using it. Right? And you're gonna use it in
428
00:26:41,180 --> 00:26:45,020
ways where, you know, am I gonna make that presentation for my boss with it
429
00:26:45,020 --> 00:26:48,164
the very first time I use it? Yeah. Probably not. I'm probably gonna do something
430
00:26:48,784 --> 00:26:51,745
exploratory and fun trying to figure out what it can do for me. And as
431
00:26:51,745 --> 00:26:55,240
you build confidence in it, you know, it becomes more
432
00:26:55,240 --> 00:26:59,000
useful. But I still go back to with any technology, with any decision
433
00:26:59,000 --> 00:27:02,840
making, it takes a level of expertise to have confidence in what's going
434
00:27:02,840 --> 00:27:06,145
on. So in the process of educating students,
435
00:27:06,525 --> 00:27:10,365
there's still the importance of developing that level of expertise for them so they
436
00:27:10,365 --> 00:27:13,910
gain that confidence. Right. But that process, you know, I would say
437
00:27:14,210 --> 00:27:17,010
the way I teach a class today is different than the way I taught it
438
00:27:17,010 --> 00:27:20,635
20 years ago. And that's because what
439
00:27:20,715 --> 00:27:23,355
needs to be learned has changed. You know, some of the theories are still the
440
00:27:23,355 --> 00:27:27,035
same, but how you apply them and how you use them is much, much
441
00:27:27,035 --> 00:27:30,840
different. And so, again, it's a different
442
00:27:30,840 --> 00:27:34,280
way of getting to some of the same outcomes, I think, is where we're
443
00:27:34,280 --> 00:27:37,980
going. Yep. Yep. I absolutely agree.
444
00:27:39,105 --> 00:27:42,465
So what do you see as the future of
445
00:27:42,465 --> 00:27:46,299
generative AI, especially as it relates to higher ed? How's that
446
00:27:46,299 --> 00:27:49,659
for a question? That's a great question. I think there's a couple of
447
00:27:49,659 --> 00:27:52,960
interesting challenges. One
448
00:27:53,019 --> 00:27:56,625
is the equity of
449
00:27:56,625 --> 00:28:00,385
making it available. And it's been an interesting conversation we've had
450
00:28:00,385 --> 00:28:03,025
is, you know, if you're gonna use it, you know, in your class, allow it
451
00:28:03,025 --> 00:28:06,850
in your class. And my experience has been ChatGPT-4, the one you
452
00:28:06,850 --> 00:28:10,550
pay for, is better than ChatGPT-3.5, the one you can get for free.
453
00:28:10,690 --> 00:28:13,430
What about the students who just can't afford to buy
454
00:28:14,435 --> 00:28:17,875
the better one? What disadvantage does that place them at? How do you
455
00:28:17,875 --> 00:28:21,395
conquer that? Right? Do universities just buy site
456
00:28:21,395 --> 00:28:25,220
licenses and make it available to all the students? And then how do you agree
457
00:28:25,220 --> 00:28:28,740
on which generative AI tool you're gonna use? Who's the winner? Who's the loser? Right?
458
00:28:28,740 --> 00:28:32,475
It sounds like the Dropbox versus Google Drive versus
459
00:28:32,635 --> 00:28:36,475
OneDrive argument that we've done with cloud computing. So
460
00:28:36,475 --> 00:28:39,294
I think getting that figured out, how do we
461
00:28:40,130 --> 00:28:43,570
build an equal playing field for everyone, is one of the first things. But what
462
00:28:43,570 --> 00:28:47,110
makes that challenging, and this is where I think universities
463
00:28:47,330 --> 00:28:50,995
have to get things figured out, is so
464
00:28:50,995 --> 00:28:54,115
much changes every month. Right? It seems like every time I turn around, there's something
465
00:28:54,115 --> 00:28:57,860
new with generative AI. So when, you know,
466
00:28:57,860 --> 00:29:01,700
bureaucracy gets involved, it takes forever to make a decision, and then that decision is
467
00:29:01,700 --> 00:29:05,515
stuck with for a long, long time. In a world of
468
00:29:05,515 --> 00:29:08,955
drastically changing new technologies, how
469
00:29:08,955 --> 00:29:12,649
does that get baked into everything
470
00:29:12,649 --> 00:29:16,409
as well? So I think we're at this explosive stage with a lot
471
00:29:16,409 --> 00:29:19,850
of new things, which really, you know, with things changing
472
00:29:19,850 --> 00:29:23,065
fast, the bureaucracy of universities is gonna have an interesting
473
00:29:23,524 --> 00:29:27,225
challenge in front of it, which is how do they
474
00:29:27,445 --> 00:29:31,110
make this a part of who they are in a way that doesn't
475
00:29:31,110 --> 00:29:34,950
limit the availability of creative tools and the use
476
00:29:34,950 --> 00:29:38,705
of those creative tools as they come onto the market? You're spot on.
477
00:29:38,705 --> 00:29:41,125
There are gonna be some serious equity issues,
478
00:29:42,145 --> 00:29:45,605
especially as generative AI tools
479
00:29:46,060 --> 00:29:49,440
are increasingly embedded in other tools.
480
00:29:49,740 --> 00:29:53,580
I think that's the future. Matter of fact, Rob and France
481
00:29:53,580 --> 00:29:57,395
Belanger and I were on a call with our publisher, and
482
00:29:57,395 --> 00:30:00,914
that came up. You know, in a couple of years, we're not really
483
00:30:00,914 --> 00:30:04,680
gonna be talking about ChatGPT all that much. AI is just gonna
484
00:30:04,680 --> 00:30:08,120
be in everything. I mean, when was the last time a normal person used the
485
00:30:08,120 --> 00:30:11,420
term ecommerce? It's just commerce.
486
00:30:12,275 --> 00:30:15,955
You know? And that's what AI is going towards. And so, you know,
487
00:30:15,955 --> 00:30:19,630
maybe you can scrape together the $20 for ChatGPT Pro
488
00:30:20,250 --> 00:30:24,030
or Gemini Advanced or, you know, Poe or whatever it is. But
489
00:30:25,130 --> 00:30:28,975
what if there are all these AI tools that are $10
490
00:30:28,975 --> 00:30:32,815
a month, $20 a month? You know, there's one that's an
491
00:30:32,815 --> 00:30:36,580
accounting tutor, and there's one that's like Grammarly
492
00:30:36,580 --> 00:30:40,179
and helps you write. And there's another one that helps you do research and
493
00:30:40,179 --> 00:30:43,355
on and on and on. And now all of a sudden, you know, you're at
494
00:30:43,355 --> 00:30:46,875
a couple hundred dollars a month, which is a lot of
495
00:30:46,875 --> 00:30:50,635
money to a lot of students. And so I'm pretty
496
00:30:50,635 --> 00:30:54,210
concerned about that. And I guess those inequities have been around forever,
497
00:30:55,309 --> 00:30:58,850
but that doesn't mean we shouldn't be trying to find ways to
498
00:30:58,910 --> 00:31:02,485
mitigate them. Yeah. No. I think that's something we absolutely have to be
499
00:31:02,485 --> 00:31:05,925
pushing and talking about. And I know organizations I've talked to are
500
00:31:05,925 --> 00:31:09,650
concerned about paying all these fees for use of
501
00:31:09,650 --> 00:31:13,410
things. And so it's gonna be interesting if it's a third-party product
502
00:31:13,410 --> 00:31:17,005
that's completely owned by, you know, the
503
00:31:17,005 --> 00:31:20,765
big companies or if our university is gonna say, you know
504
00:31:20,765 --> 00:31:24,480
what? We're gonna develop and host our own large language model or
505
00:31:24,559 --> 00:31:28,240
own language model where we control what are the inputs
506
00:31:28,240 --> 00:31:32,020
of information. Because right now, you know, when you look at OpenAI,
507
00:31:32,080 --> 00:31:35,664
ChatGPT, there's a very, very large amount of information that
508
00:31:35,664 --> 00:31:39,505
feeds into that. And is all of that relevant and important to what
509
00:31:39,505 --> 00:31:43,290
I am teaching in a cybersecurity class, or would students be better
510
00:31:43,290 --> 00:31:47,130
served having a language model that was purposely designed for
511
00:31:47,130 --> 00:31:50,830
what we're gonna be teaching here? I have
512
00:31:51,005 --> 00:31:54,685
a colleague that's experimenting with using an open
513
00:31:54,685 --> 00:31:58,525
source textbook that he used to feed into a language model.
514
00:31:58,525 --> 00:32:02,309
So it became the ChatGPT, if you will, and then students
515
00:32:02,309 --> 00:32:05,529
could utilize a chatbot to ask the textbook questions.
516
00:32:06,470 --> 00:32:10,025
We did that with our textbook. I was doing a little bit of consulting work
517
00:32:10,025 --> 00:32:13,865
with a firm that was working on an AI-based
518
00:32:13,865 --> 00:32:17,640
tool, and our publisher gave us the permission to take an earlier edition of
519
00:32:17,640 --> 00:32:21,419
the textbook, load it into this large language model,
520
00:32:22,200 --> 00:32:25,015
and then students could just go ask questions of it.
521
00:32:25,575 --> 00:32:29,415
Unfortunately, the product didn't survive. It went in a
522
00:32:29,415 --> 00:32:32,935
different direction. The company survived. The product didn't, which
523
00:32:32,935 --> 00:32:36,650
is not all that unusual. But I thought that this is
524
00:32:36,650 --> 00:32:40,170
really something because I don't know about your
525
00:32:40,170 --> 00:32:43,755
students, but the students that I've been
526
00:32:43,755 --> 00:32:47,595
around in the last 8 or 10 years, really, they don't
527
00:32:47,595 --> 00:32:50,815
wanna come to office hours. They don't wanna email you with questions
528
00:32:51,690 --> 00:32:55,450
unless they're grade grubbing at the end of the term. You know,
529
00:32:55,450 --> 00:32:59,049
they get stuck on something, and they don't know what to do, so they just
530
00:32:59,049 --> 00:33:02,745
kinda give up. Well, if there was a model, you know,
531
00:33:02,745 --> 00:33:06,585
textbook GPT or whatever it is where they can just say, I
532
00:33:06,585 --> 00:33:10,360
don't understand the difference between these two definitions, and they get
533
00:33:10,360 --> 00:33:13,900
an answer. Was that as good as talking to the professor?
534
00:33:14,440 --> 00:33:17,500
Maybe not, but it might be good enough.
535
00:33:18,095 --> 00:33:21,375
Well, and is it available at 2 o'clock in the morning when they decide they're
536
00:33:21,375 --> 00:33:24,514
gonna do their coursework? Right? I may be sleeping and unavailable,
537
00:33:25,280 --> 00:33:28,980
but if they can get help from the technology at the moment
538
00:33:29,200 --> 00:33:32,420
that they're struggling, you know, that's helpful.
539
00:33:32,935 --> 00:33:36,475
Yeah. The downside of some of what you were talking about is
540
00:33:37,175 --> 00:33:40,795
we may separate out the haves and have-nots in terms of institutions.
541
00:33:41,600 --> 00:33:45,360
Mhmm. Most schools aren't gonna have the resources to do those
542
00:33:45,360 --> 00:33:49,095
kinds of things, and very few professors will. So
543
00:33:49,095 --> 00:33:52,534
it's gonna be interesting to see, you know, do we respond at
544
00:33:52,534 --> 00:33:56,375
a system level? Do we get consortia of schools or,
545
00:33:56,375 --> 00:34:00,150
you know, how do we navigate that? But I think these
546
00:34:00,150 --> 00:34:03,990
localized large language models are really gonna
547
00:34:03,990 --> 00:34:07,365
be part of the future as well. Rob, we've been talking
548
00:34:07,505 --> 00:34:11,125
primarily about educational use, but there's a lot of potential
549
00:34:11,985 --> 00:34:15,580
to get rid of some of the administrative drudgery using
550
00:34:15,580 --> 00:34:19,120
generative AI. You're a department chair, so you unfortunately have the administrative
551
00:34:19,179 --> 00:34:22,860
drudgery more than us lucky faculty members, or is it we lucky
552
00:34:22,860 --> 00:34:26,474
faculty members do. So are you using it or
553
00:34:26,474 --> 00:34:29,775
thinking about using it for any of your administrative duties?
554
00:34:30,474 --> 00:34:33,960
Thanks, Craig. Yeah. I've been starting to a little bit. And one of the
555
00:34:33,960 --> 00:34:37,500
places where I've used it is sometimes you get asked
556
00:34:37,719 --> 00:34:41,135
questions that are hard to answer, that you know you
557
00:34:41,135 --> 00:34:44,415
want to answer, you know how you're going to answer it, but you need to
558
00:34:44,415 --> 00:34:47,955
write that email that is, you know, firm but sensitive
559
00:34:48,140 --> 00:34:51,820
and is, you know, to the point. And I spend a lot of time
560
00:34:51,820 --> 00:34:55,340
struggling when I write emails like that and probably way more time thinking about
561
00:34:55,340 --> 00:34:59,115
how to send that response than I should. And out of
562
00:34:59,115 --> 00:35:02,655
curiosity and frustration, I asked ChatGPT
563
00:35:02,875 --> 00:35:05,620
to write an email like that for me so I could respond and say, you
564
00:35:05,620 --> 00:35:08,820
know, I want an email that is going to be written to tell a person
565
00:35:08,820 --> 00:35:12,660
that, in this case, it was someone who was seeking a job that
566
00:35:12,660 --> 00:35:16,425
we were deciding, you know, not to hire them. How do I reply to this
567
00:35:16,425 --> 00:35:20,185
email? And it gave me an answer, and it gave me a 5
568
00:35:20,185 --> 00:35:23,705
paragraph answer that was so fluffy and full of
569
00:35:23,705 --> 00:35:27,359
just nice words. And I'm like, you know, I don't wanna say all that. Right?
570
00:35:27,359 --> 00:35:30,420
It was overly wordy. And so I said, could you make it shorter?
571
00:35:31,119 --> 00:35:34,494
And so it condensed it down. It went from 5 paragraphs to 4. I said,
572
00:35:34,494 --> 00:35:36,974
well, can you make it shorter? It gave me 3. I asked one more time.
573
00:35:36,974 --> 00:35:40,255
Can you make it shorter? I got 2 paragraphs, and it was really close to
574
00:35:40,255 --> 00:35:44,060
what I would want to say. And so I copy pasted it, made a
575
00:35:44,060 --> 00:35:47,900
few minor edits. And something that I would have struggled
576
00:35:47,900 --> 00:35:51,595
and second guessed myself about how to write that email for 20 minutes to
577
00:35:51,595 --> 00:35:55,355
a half an hour. I was done in 3 minutes, and I felt confident
578
00:35:55,355 --> 00:35:59,115
that I had done a good job of appropriately giving
579
00:35:59,115 --> 00:36:02,930
feedback. And so in some ways, that struggle with how do
580
00:36:02,930 --> 00:36:06,770
you write a message that you never thought you would be
581
00:36:06,770 --> 00:36:10,464
asked in an email question from somebody or whatever, which happens way too
582
00:36:10,464 --> 00:36:13,984
often as a department chair. In some ways, it can really help you to
583
00:36:13,984 --> 00:36:16,464
be able to get to the point of, you know, I know what I
584
00:36:16,464 --> 00:36:19,960
wanna say, but how do I put the words around it that show people
585
00:36:19,960 --> 00:36:23,560
dignity, show people respect, and do so in a way that has the
586
00:36:23,560 --> 00:36:27,005
kind of voice I'd like to have and how I communicate that. But I've
587
00:36:27,005 --> 00:36:30,684
also used it to have it write the structure of
588
00:36:30,684 --> 00:36:34,125
policy statements or whatever. We're working on, you know, we need to have a
589
00:36:34,125 --> 00:36:37,790
policy that's gonna address blah blah blah blah blah. You know, usually,
590
00:36:37,790 --> 00:36:41,550
they're pretty standard in kind of what the format looks like.
591
00:36:41,550 --> 00:36:45,330
And to get a professional starting point for how you're gonna put a document together,
592
00:36:45,765 --> 00:36:49,445
it just saves that 20 minutes of getting started, which is
593
00:36:49,445 --> 00:36:53,205
helpful. Yeah. And where we don't really add value. I mean, I think
594
00:36:53,205 --> 00:36:56,850
that's part of what I keep hearing that excites
595
00:36:56,850 --> 00:37:00,690
me is we can focus where we add value and get
596
00:37:00,690 --> 00:37:04,235
rid of some of the things where we don't add value. One of
597
00:37:04,235 --> 00:37:07,755
my very favorite prompts is please critique
598
00:37:07,755 --> 00:37:11,470
this. And it's really good at pointing out
599
00:37:11,630 --> 00:37:15,170
holes in logic or, you know, if you miss some
600
00:37:15,230 --> 00:37:18,990
element of a policy or you don't put something
601
00:37:18,990 --> 00:37:22,615
in an email message, it's pretty good at pointing out
602
00:37:22,615 --> 00:37:26,375
those sorts of holes, I think. So Yeah.
603
00:37:26,375 --> 00:37:30,055
And another place, as you're in, you know, administrative roles and dealing with
604
00:37:30,055 --> 00:37:33,840
bureaucracy, you're oftentimes told to make something for this and submit
605
00:37:33,840 --> 00:37:37,520
it, but it can only be 250 words. And you're like, alright. And
606
00:37:37,520 --> 00:37:41,155
then I'm like, oh, no. That's 300 words. And that process of trying to carve
607
00:37:41,155 --> 00:37:44,995
out 50 words can be tough. And I've learned I can actually
608
00:37:44,995 --> 00:37:48,760
take a document like that and say, please take this and reduce it to
609
00:37:48,760 --> 00:37:52,440
250 words or less. Yeah. And it does a really good job of saying
610
00:37:52,440 --> 00:37:56,280
the exact same thing I wanna say, but doing it in a slightly
611
00:37:56,280 --> 00:37:59,945
more efficient manner. Yeah. Those are both great examples of
612
00:37:59,945 --> 00:38:03,725
things that are just kinda tough to do that you can do pretty quickly and
613
00:38:04,185 --> 00:38:07,740
ethically. You know, how is that any different than,
614
00:38:08,360 --> 00:38:11,960
you know, giving a message to somebody and saying, hey. Can
615
00:38:11,960 --> 00:38:15,525
you help me figure out how to say this in a little bit kinder
616
00:38:15,525 --> 00:38:19,125
way? Mhmm. But at the end of the day, you're
617
00:38:19,125 --> 00:38:22,950
responsible for what's in that email. You know? So I
618
00:38:22,950 --> 00:38:26,490
think where I see people screwing up, like, I think Vanderbilt did
619
00:38:27,110 --> 00:38:30,950
early on, is they just take what generative AI puts
620
00:38:30,950 --> 00:38:34,555
out, and that's what it is. There's no human in the loop
621
00:38:35,095 --> 00:38:38,855
making sure that it's quality. So Yep. And
622
00:38:38,855 --> 00:38:41,930
in all cases, I read it, and I ask myself, is
623
00:38:41,930 --> 00:38:45,450
this what I would say and how I would say it? Because sometimes I get
624
00:38:45,450 --> 00:38:49,125
phraseology that's like, you know, I would never use that phrase in my life.
625
00:38:49,285 --> 00:38:52,245
So I changed it to the phrase I would use. You know? And so
626
00:38:52,245 --> 00:38:55,685
if it just saves me 15 minutes, that's
627
00:38:55,685 --> 00:38:59,260
15 I can spend on something else. So Yeah. I mean, how
628
00:38:59,319 --> 00:39:03,079
how wonderful is it to have a tool that can cut out
629
00:39:03,079 --> 00:39:06,680
three-quarters of something? Mhmm. You know, even if it doesn't take your effort to
630
00:39:06,680 --> 00:39:10,365
zero, it can still save a lot of time.
631
00:39:11,065 --> 00:39:14,845
Absolutely. Alright. Any last words? Anything you wanna plug? Any
632
00:39:14,985 --> 00:39:18,720
last bit of advice? I guess I would go back
633
00:39:18,720 --> 00:39:21,540
to give yourself the ability to be creative
634
00:39:22,400 --> 00:39:26,105
and to take your intellectual curiosity and put it to work because you
635
00:39:26,105 --> 00:39:29,865
may find some surprising, interesting ways to use things. And then once
636
00:39:29,865 --> 00:39:33,530
you've found them, talk about them, a, in the hallways of those that you're
637
00:39:33,530 --> 00:39:37,130
working with, you know, so that way you can spur conversations, and,
638
00:39:37,130 --> 00:39:40,970
b, as you have opportunities into greater academic
639
00:39:40,970 --> 00:39:44,625
conversations, whether it's through Craig's channel here
640
00:39:44,625 --> 00:39:48,385
or if it's through academic conferences you attend. But don't keep
641
00:39:48,385 --> 00:39:52,130
your mouth shut about this. You know? Great. Bring your experiences
642
00:39:52,190 --> 00:39:55,626
to the conversation so we can all learn from each other. Alright.
643
00:39:55,926 --> 00:39:59,686
Well, thanks again, and I think we're done.
644
00:39:59,686 --> 00:40:01,466
We're out. Thanks, Craig.