Podcast | 22 Jun, 2023

Episode 3: Generative AI over the Years: Security, Job Creation, and Fighting Writer’s Block

Alex Olesen

Host

VP Vertical Strategy & Product Marketing

Sarah Luger

Guest

Senior Director of AI, NLP, & ML at Orange Silicon Valley

Sarah Luger, PhD, sat down with host Alex Olesen, VP Vertical Strategy & Product Marketing at Persado, to share some key ways humans and AI work together and could collaborate in the future. This partnership between humans and AI offers an uplifting perspective. AI automates mundane tasks at scale, leaving humans with more time to complete more complex tasks. But what does this mean in the short term and in the long term? According to Sarah, right now, ChatGPT and GPT-based technologies are being used a lot in marketing, writing, and creativity-based tasks. Generative AI is great for writer’s block, and people can easily figure out how to engineer prompts thanks to decades of using Google. On the enterprise level, the relationship between AI and humans is based on trust. Not just humans trusting AI to complete tasks quickly at scale, but also using AI to add value to the customer experience. Forward-thinking companies use AI to learn customers’ preferences and to provide more personalized experiences and recommendations.

“At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience and we know that we’re sitting on a gold mine of data. Now we have a tool that allows us to personalize and create even better experiences,” said Sarah. 

Sarah hopes to see in the future how large language models will transform economic and global engagement, especially when it comes to underserved languages and communities.

Episode Transcript:

00:00:10.500 –> 00:00:19.100
Alex Olesen: Welcome back to Motivation AI Matters. Today, I am really excited to be joined by Sarah Luger.

5
00:00:33.850 –> 00:00:36.530
Alex Olesen: Sarah, thank you for joining us today.

6
00:00:38.190 –> 00:00:53.090
sarah: Thank you so much, Alex. It’s a real pleasure. As you know, I’ve been working in the space for some time, but also excited about Persado’s role in this space. So thank you again for having me. My name is Sarah Luger.

7
00:00:53.210 –> 00:01:00.069
I got my PhD at the University of Edinburgh many years ago in artificial intelligence, and…

8
00:01:00.480 –> 00:01:18.790
Sarah: …I’m being quite honest when I say that I’m surprised and excited about the developments in this space. In the past, I’ve worked at startups. I’ve worked at IBM, building a precursor to the IBM Watson Jeopardy challenge robot.

9
00:01:18.820 –> 00:01:28.430
Sarah: I’ve also been at Orange in Silicon Valley for five years. We’ve worked on numerous topics, including voice biometrics, chatbots, call center technology…

10
00:01:28.480 –> 00:01:30.540
…and of course…

11
00:01:31.100 –> 00:01:35.130

Sarah: …everything that has to do with large language models and Generative AI.

12
00:01:35.320 –> 00:01:39.469

I’m really excited, especially because, as you may or may not know….

13
00:01:39.690 –> 00:01:56.930

Sarah: …Orange is a large company with almost 140,000 employees, and we’re in 27 countries. Many of those are in North and West Africa, and many of our customers speak languages that are low resource, which means they don’t have a lot of training data.

14
00:01:57.030 –> 00:02:12.949

Sarah: They don’t have an online presence that supports the kind of data that’s used in conventional translation systems to create high-quality translations. So, I’ve been working a lot in that space. And I do see both potential…

15
00:02:13.080 –> 00:02:20.550

Sarah: …for large language models to support our customers, but also some peril. And I’m looking forward to chatting more about that today. Thank you.

16
00:02:20.720 –> 00:02:26.429
Alex Olesen: I think that’s fantastic. And you’ve got a great background. I know you’ve…

17
00:02:26.580 –> 00:02:38.099
Alex Olesen: …seen this industry evolve through numerous iterations, the most recent of which is what is now being referred to as Generative AI.

18
00:02:38.370 –> 00:02:50.530
Alex Olesen: I know you just touched on a lot of really interesting topics. We’ll dive into it over the course of this episode. But to give the listeners a good baseline…

19
00:02:50.730 –> 00:03:03.009
Alex Olesen: …in your own words, could you define what this new term Generative AI means, and then talk to me a little bit about who Generative AI is important for?

21
00:03:06.390 –> 00:03:09.710
Sarah: So, I think…

22
00:03:09.820 –> 00:03:15.070
Sarah: …for the average person out there. Generally, today…

23
00:03:15.360 –> 00:03:25.520
Sarah: …Generative AI is where the output resembles human content. It resembles language that is either…

24
00:03:25.540 –> 00:03:28.680
…that seems like it’s constructed by a human…

25
00:03:29.000 –> 00:03:54.640
Sarah: …or, technically, Generative AI systems are based on algorithms that learn from a vast amount of input data. In the most recent cases that we’ll dig into, that would be all of the digital data that’s on the web, as well as some knowledge bases, knowledge bases being things like Wikipedia that give structure and associate terms. And…

26
00:03:54.700 –> 00:04:02.799
Sarah: …some apparent meaning to this, this vast sea of language data. And so…

27
00:04:02.820 –> 00:04:05.690
Sarah: …what’s going on under the hood is that…

28
00:04:05.840 –> 00:04:14.769
Sarah: …this vast amount of data is being used to learn the patterns of how we as humans speak…

29
00:04:14.950 –> 00:04:26.620
Sarah: …and how we write, and with innovations, both from Google’s 2017 transformer paper, incredible compute innovations…

30
00:04:26.700 –> 00:04:31.960
…as well as just ongoing neural network developments.

31
00:04:33.070 –> 00:04:41.310
Sarah: There’s the possibility, as many of us have now tried since November 30th, 2022, when ChatGPT was launched…

32
00:04:41.340 –> 00:05:01.849
Sarah: …to engage with a Generative AI system in a way that most people had not engaged with an AI system, you know. Perhaps in the past, you had AI as a secondary character in a video game, you know, or there’d maybe been some predictive analytics in an enterprise…

33
00:05:02.070 –> 00:05:15.019
Sarah: …application you were using. But the core of Generative AI is using these patterns of words at a vast scale, which then for us makes it seem like…

34
00:05:15.220 –> 00:05:17.369
Sarah: …this computer is…

35
00:05:17.590 –> 00:05:25.859
Sarah: …outputting almost human-like content. And it’s really a powerful difference between…

36
00:05:25.870 –> 00:05:36.760
Sarah: …systems from even six months ago, right? Even seven months ago. We’ve had a sea change. And your second question is…

38
00:05:39.790 –> 00:05:43.560
Sarah: …Who is it most important for?

39
00:05:44.540 –> 00:05:49.109
Sarah: Right now…

40
00:05:49.190 –> 00:05:57.200
Sarah: …we’re in the hype cycle, and it’s important for everyone…

41
00:05:57.240 –> 00:05:59.539
…this is great for everything…

42
00:05:59.570 –> 00:06:11.029
Sarah: …and I respect the hype as someone who’s in Silicon Valley because I understand the role that it plays and the duality of…

43

00:06:11.410 –> 00:06:22.590
Sarah: …how we get investment and how we build products and how we have to compete with other hype cycles, most recently in blockchain.

44
00:06:22.670 –> 00:06:26.100
Sarah: But I think that this is really important for…

45

00:06:26.200 –> 00:06:29.799
Sarah: …creating customer-centric tools

46
00:06:29.970 –> 00:06:32.610

Sarah: …that support…

47
00:06:32.830 –> 00:06:46.540

Sarah: …voice bots, you know, textual marketing text. I think that marketing and customer support are the first areas that are going to see innovations in these really…

48

00:06:46.810 –> 00:06:49.360
Sarah: …human-seeming…

49
00:06:49.390 –> 00:06:54.050
Sarah: …engagements that can be created for their customers. So…

50
00:06:54.560 –> 00:07:02.539
…those are the two areas and the people I see being most affected. But then I want to also flip that and say…

51
00:07:02.620 –> 00:07:06.619
Sarah: …I think the Holy Grail of enterprise…

52
00:07:06.790 –> 00:07:15.259
Sarah: …innovation that isn’t as shiny and sparkly as some of the other, you know, key terms I’ve just mentioned…

53
00:07:15.280 –> 00:07:19.839

Sarah: …is enterprise intranet search.

54
00:07:20.020 –> 00:07:36.159
Sarah: So, the ability to search through a company’s resources to answer questions for employees, or answer questions that are then passed on to customers. I think that that is…

55
00:07:36.430 –> 00:07:45.519
Sarah: …really key because it will help you and I do our jobs better and reduce mundane tasks…

57
00:07:48.060 –> 00:07:55.780
Sarah: …it’s something that AI in general, you know, aims to do. As with any technology, we try to elevate our…

58
00:07:55.790 –> 00:07:58.139
Sarah: …work tasks up…

59
00:07:58.220 –> 00:08:10.129
…the difficulty chain. So, we want, as humans, to not do the same thing every day, but to understand patterns…

60
00:08:10.180 –> 00:08:13.969
Sarah: …and reduce repetition…

61
00:08:14.020 –> 00:08:29.960
Sarah: …and do more and more challenging tasks, and those more challenging tasks are very hard for computers, so don’t fret. Many of us will still have jobs. On the other hand, those lower-level tasks are…

62
00:08:30.000 –> 00:08:39.289
Sarah: …really great opportunities for computers to come in, Generative AI systems to come in, and support us.

63
00:08:41.820 –> 00:08:51.909
Alex Olesen: Yeah, you’ve raised some really interesting points, some of which, you know, we’ve discussed in prior conversations. But I’ll summarize those quickly for the listeners.

64
00:08:52.080 –> 00:09:08.989
Alex Olesen: You know, we’ve talked about, and I love the way that you put it, some hype cycles around the Metaverse, around blockchain. I think we’re definitely seeing a surge of a hype cycle around Generative AI…

65
00:09:09.050 –> 00:09:19.740
Alex Olesen: ..you know, to another one of your points. The barrier to entry for consuming this type of technology in my observation, has been lowered…

66
00:09:19.860 –> 00:09:23.610
Alex Olesen: ..as of November of last year.

68
00:09:26.660 –> 00:09:36.730
Alex Olesen: ChatGPT is so readily available for the average consumer. You know, broaching the consumer market…

69
00:09:36.860 –> 00:09:37.900
Alex Olesen: …and…

70
00:09:37.910 –> 00:09:44.730
Alex Olesen: …you know, as you’re alluding to, breaking into the enterprise space as well.

71

00:09:44.910 –> 00:09:47.030

Alex Olesen: I think, is going to have…

72

00:09:47.100 –> 00:09:59.210

Alex Olesen: …widespread benefit, both in terms of the way consumers interact with this type of technology, but also the way large enterprises similar to Orange…

73

00:09:59.340 –> 00:10:12.729

Alex Olesen: …derive value from this type of technology. But I do want to touch on another point that you’ve made, which is unique to the other conversations that I’ve had in the last couple of months…

74

00:10:12.840 –> 00:10:19.369

Alex Olesen: …which are the implications around education and career development.

75

00:10:19.430 –> 00:10:27.680

Alex Olesen: I love what you said because it goes against the grain, in my opinion…

76

00:10:27.870 –> 00:10:32.050

Alex Olesen: …of what we have seen in the news around job displacement…

77

00:10:32.230 –> 00:10:50.360

Alex Olesen: …and the automation of tasks. I think you delineate very well how artificial intelligence will help eliminate some mundane tasks. But you put a great perspective on the state of the market by saying it will free…

78

00:10:50.380 –> 00:10:59.559

Alex Olesen: …humans up to do more challenging tasks, and I find that to be a very pragmatic and uplifting perspective…

79

00:11:00.400 –> 00:11:04.460

Alex Olesen: …if you could expand on that a bit, Sarah.

80

00:11:05.040 –> 00:11:06.130

Alex Olesen: What…

81

00:11:06.240 –> 00:11:09.140

Alex Olesen: would you say…

82

00:11:09.430 –> 00:11:16.780

Alex Olesen: …are some interesting applications of humans and AI working together?

83

00:11:16.920 –> 00:11:21.819

Alex Olesen: …that we haven’t necessarily considered yet?

84

00:11:24.690 –> 00:11:25.560

sarah: …Great…

85

00:11:26.330 –> 00:11:30.300

…yeah. And I appreciate you…

86

00:11:30.720 –> 00:11:42.030

sarah: …your perspective as well, because I think that we’re at a point where, because of the hype, there are some…

87

00:11:42.930 –> 00:11:48.020

sarah: …primary narratives that seem to be…
88

00:11:48.380 –> 00:11:57.640

…They seem to be extremes. They’re either fear-based or wildly outlandish. And you know, the reality is somewhere in between.

89

00:11:57.730 –> 00:11:59.160

sarah: I’ve heard that…

90

00:11:59.230 –> 00:12:03.270

…with new innovations we grossly…

91

00:12:03.350 –> 00:12:10.729

sarah: …overestimate the near-term change, and then underestimate the long-term change…

92

00:12:11.010 –> 00:12:27.699

sarah: …and that’s not my quote, but I strongly agree with your point. What are the ways that this hasn’t been used yet? It hasn’t been clear. And with collaboration between humans and…

93

00:12:27.790 –> 00:12:30.179

…these tools, what will…

94

00:12:30.550 –> 00:12:39.399

sarah: …become clear in the near term? So, what is happening now? What do we see now?

95

00:12:39.910 –> 00:12:58.110

sarah: Right now, ChatGPT and GPT-based technologies are being used in a lot of marketing, writing, text ideation, and creativity-based tasks.

96

00:12:58.110 –> 00:13:21.119

sarah: So, they’re really good for this blank page problem. Maybe call it writer’s block… Also, from a prompt perspective, you know, we’ve all grown up to some extent feeling comfortable with Google searches. And so, we can figure out how to engineer prompts that give us an output that we’re looking for…

97

00:13:21.260 –> 00:13:32.460

…But humans and computers working together have a long-standing history. And in artificial intelligence, we’ve developed agents…

98

00:13:32.640 –> 00:13:39.060

sarah: …which are some little software

99

00:13:39.140 –> 00:13:57.239

sarah: …products that help humans complete tasks. And there’s been research in this multimodal space, you know, with sensors. There’s a lot of use in industry. But I think that the next wave of…

100

00:13:57.590 –> 00:14:02.659

sarah: …this technology will be using the

101

00:14:02.720 –> 00:14:15.749

sarah: …trust, which a lot of users have gained from this first iteration. And you know, later, I want to maybe push back on that trust. But if we think of this as kind of a consumer life cycle…

102

00:14:15.910 –> 00:14:24.009

sarah: …using the trust that the consumer has from the output of their large language model….

103

00:14:24.030 –> 00:14:26.460

sarah: …chat interface…

104

00:14:26.520 –> 00:14:30.389

sarah: …and turning that into a resource…

105

00:14:31.230 –> 00:14:44.590

sarah: …that maybe takes the place of a Google search. Right? So, I think that chat interfaces are going to transform the way that humans work…

106

00:14:44.690 –> 00:14:45.540

sarah: …with…

107

00:14:46.460 –> 00:15:01.670

sarah: …computers, in that we may no longer have Google PageRank be as important, in terms of Google being the fount of all knowledge. Maybe that verb itself, “I’m gonna Google it,” will…

108

00:15:01.780 –> 00:15:11.079

sarah: …cease to be as relevant. I think we’re going to have a split in our society where people who have become very…

109

00:15:11.090 –> 00:15:23.469

sarah: …adept at using chat technology may go to use this as their primary resource for new information, for information discovery.

110

00:15:23.760 –> 00:15:35.740

I think that that could have profound implications on the Enterprise search and our ecosystems, you know, if we think of trying to make a decision about, say, is…

111

00:15:35.970 –> 00:15:43.760

sarah: …Siri, or Google Home, or Apple HomePod? What is the best smart device…

112

00:15:43.840 –> 00:15:46.620

sarah: …it’s hard to evaluate, which is…

113

00:15:46.670 –> 00:15:51.900

sarah: …in a way you know, just a discrete test.

114

00:15:52.260 –> 00:16:03.200

sarah: But maybe, as many consumers do, people make decisions based on their ecosystem. So, I imagine in the future that…

115

00:16:03.290 –> 00:16:06.279

sarah: …OpenAI will leverage this…

116

00:16:06.660 –> 00:16:16.670

sarah: …collaboration with Microsoft to really pivot the enterprise solutions that the average person spends a lot of their time on…

117

00:16:16.830 –> 00:16:18.010

…and by…

118

00:16:18.240 –> 00:16:19.980

sarah: …changing that habit…

119

00:16:20.620 –> 00:16:25.100

sarah: …anytime you use an ecosystem, you’re providing them with a lot of data…

120

00:16:25.200 –> 00:16:38.960

sarah: …and when you provide an ecosystem with a lot of data, it allows them to personalize and learn your preferences. And this could change the way that we, online…

121

00:16:39.440 –> 00:16:45.780

sarah: …ask for and receive new information, and answer questions. And…

122

00:16:45.860 –> 00:17:00.449

sarah: …could that pivot reshuffle the companies that we get many of our goods and services from, or at least our initial point of contact for those goods and services, because I do think that…

123

00:17:00.540 –> 00:17:07.520

sarah: …especially younger consumers, will feel more comfortable using these tools…

124

00:17:08.140 –> 00:17:09.640

sarah: …as…

125

00:17:09.670 –> 00:17:13.109

sarah: …a co-author, as a co-pilot…

126

00:17:15.890 –> 00:17:28.250

Alex Olesen: That’s really interesting. And, Sarah, you’re touching on a topic that actually comes up with a lot of clients that I speak to. You know, we’re in an interesting…

127

00:17:28.680 –> 00:17:40.300

Alex Olesen: …environment from a macro perspective. At the moment there have been macroeconomic headwinds…

128

00:17:40.650 –> 00:17:53.389

Alex Olesen: …you know, that large, mid-market, and startup businesses have to navigate in terms of personnel decisions, R&D investment decisions…

129

00:17:53.660 –> 00:17:57.960

Alex Olesen: …specifically within tech, there have been a lot of reductions in force…

130

00:17:58.480 –> 00:18:02.189

Alex Olesen: …So, let me just paint that as the

131

00:18:02.480 –> 00:18:09.760

Alex Olesen: …personnel backdrop for humans interacting with machines simultaneously…

132

00:18:09.870 –> 00:18:15.489

Alex Olesen: …this new technology appeared seemingly out of nowhere last year…

133

00:18:16.820 –> 00:18:29.850

Alex Olesen: …to the untrained, it can come across as this, all-seeing oracle that can answer any question. So, if I’m sitting from the perspective of…

134

00:18:30.540 –> 00:18:34.430

Alex Olesen: …someone in the workforce, that spells a threat…

135

00:18:35.030 –> 00:18:44.960

Alex Olesen: I love the direction that you’re taking your point of view because of the way that you phrased it, and I will have to look up this quote. We…

136

00:18:45.490 –> 00:18:51.040

Alex Olesen: …overestimate the near-term impact and underestimate the long-term impact…

137

00:18:51.670 –> 00:18:53.399

Alex Olesen: …and what I’m finding…

138

00:18:54.200 –> 00:18:58.689

Alex Olesen: …in my personal experience with our clients is…

139

00:19:00.140 –> 00:19:02.220

Alex Olesen: …these…

140

00:19:03.150 –> 00:19:08.840

Alex Olesen: …content assistants or generation tools, they help alleviate…

141

00:19:08.990 –> 00:19:13.850

Alex Olesen: …writer’s block. They help get the first notes on the page…

142

00:19:14.250 –> 00:19:20.570

Alex Olesen: But that human expertise of motivating someone to act…

143

00:19:20.790 –> 00:19:24.670

Alex Olesen: …of understanding the nuance of…

144

00:19:24.850 –> 00:19:26.930

Alex Olesen: …what’s being discussed…

145

00:19:27.380 –> 00:19:32.249

Alex Olesen: …is, from what I’m gathering from what you’re saying, not quite…

146

00:19:33.290 –> 00:19:41.750

Alex Olesen: …and maybe this is two points, not quite the capability and not quite the near-term intent of this technology. And I think…

147

00:19:41.860 –> 00:19:43.210

Alex Olesen: …that is…

148

00:19:44.430 –> 00:19:52.050

Alex Olesen: …a very good dynamic to exist specifically in the environment that we find ourselves in now.

149

00:19:54.100 –> 00:19:58.589

sarah: Alex, can I? Let me ask for clarification? So…

150

00:19:58.740 –> 00:19:59.970

sarah: …are you…

151

00:20:01.450 –> 00:20:08.319

sarah: …I agree completely with what you said. But this is your question…

152

00:20:09.860 –> 00:20:17.150

sarah: …I think. Let me just… I kind of started thinking about one word you said: human intent…

153

00:20:17.480 –> 00:20:25.640

sarah: …and I think one of the challenges with the output of large language model systems…

154

00:20:26.220 –> 00:20:28.580

…is what we’re calling them…

155

00:20:29.170 –> 00:20:31.290

sarah: We’re calling that output information…

156

00:20:32.440 –> 00:20:37.259

sarah: …and I’m not sure if that’s a great word…

157

00:20:38.270 –> 00:20:42.870

There’s a researcher I spoke with last week…

158

00:20:43.480 –> 00:20:50.090

sarah: …who would like us to remember how large language models work on the inside

159

00:20:50.200 –> 00:20:59.430

sarah: …and view the output as synthetic media, because information is constructed by a human…

160

00:20:59.570 –> 00:21:00.540

sarah: And if…

161

00:21:01.090 –> 00:21:05.790

sarah: …words are not constructed by a human…

162

00:21:06.490 –> 00:21:08.999

…Is it information?

163

00:21:09.110 –> 00:21:26.929

sarah: And this is a very interesting discussion, and I think that this is nothing that we’re going to solve today. But it goes. It touches on another topic in this space, which is anthropomorphizing these systems, as you noted, you know, they’re…

164

00:21:27.480 –> 00:21:33.260

…impressive. And if you don’t know the inner workings, or if you’re new to this space…

165

00:21:33.310 –> 00:21:38.250

sarah: Or actually, that’s not fair. If you’re a human. you…
166

00:21:38.490 –> 00:21:40.560

…automatically…

167

00:21:40.710 –> 00:21:44.489

sarah: …give intention to text…

168

00:21:44.650 –> 00:21:56.639

sarah: …you derive meaning out of the kind of bag of words that you get back. And this bag of words is really good. And so, it’s our natural tendency…

169

00:21:56.830 –> 00:22:08.139

sarah: …to find meaning. I mean, think of conspiracy theories: we try to find meaning in disparate facts. And this goes back to…

170

00:22:08.390 –> 00:22:09.510

sarah: …our…

171

00:22:10.830 –> 00:22:15.670

sarah: …core human-animal…

172

00:22:15.690 –> 00:22:26.339

sarah: …background. You know, this is how we probably survived on the African veldt, right? We needed to understand the signal. So…

173

00:22:26.670 –> 00:22:31.510

sarah: …when I hear the word hallucination…

174

00:22:31.640 –> 00:22:37.190

sarah: …I also get a little bit concerned that we’re anthropomorphizing these systems.

175

00:22:39.220 –> 00:22:50.759

sarah: And I think maybe a better phrase would be an error in an answer. And in using the word hallucination, we’re evoking…

176

00:22:50.940 –> 00:22:55.560

sarah: …spirituality, creativity…

177

00:22:56.750 –> 00:23:05.190

sarah: …ideas and words that are very close to human souls. And again, I want to take a step back and say, this is a machine, people.

178

00:23:05.260 –> 00:23:07.920

…It’s an error.

179

00:23:07.990 –> 00:23:13.610

sarah: And so, I just kind of went on that tirade because of your mentioning intention. But…

180

00:23:13.790 –> 00:23:17.690

sarah: …I think, harnessing…

181

00:23:18.300 –> 00:23:26.820

sarah: …our human intention to complete tasks, to work together with each other. you know, to build companies and…

182

00:23:26.930 –> 00:23:29.760

…support customers…

183

00:23:30.000 –> 00:23:42.770

sarah: …we can use these tools to do that. But I want to push back on hallucinations and perhaps information as output…

184

00:23:42.990 –> 00:23:45.060

sarah: And then…

185

00:23:46.140 –> 00:24:06.549

sarah: Yeah, sorry, Alex, please.

Alex Olesen: No need to apologize. Quite the contrary, Sarah. I am always looking for a good reason to weave the word anthropomorphize into a conversation, and I’m glad that you took it.

186

00:24:06.730 –> 00:24:12.050

sarah: Well, it’s interesting, right? Like it’s really interesting how we do this. And…

187

00:24:12.880 –> 00:24:19.570

sarah: …it’s not a bad thing. It’s a natural thing. And this is why we…

188

00:24:19.790 –> 00:24:26.220

Sarah: …you know, this is how we relate to our pets. I mean, you know, this is like…

189

00:24:26.690 –> 00:24:36.510

sarah: …you know, I’ll fight you if you don’t think my cat totally understands… This is a totally human behavior, but…

190

00:24:37.190 –> 00:24:39.880

sarah: …it’s something that…

191

00:24:39.980 –> 00:24:43.049

sarah: …has a…

192

00:24:43.360 –> 00:24:52.900

sarah: …a core part of who we are, and there’s a reason that it’s in the fabric of our humanity, but…

193

00:24:53.190 –> 00:24:54.660

sarah: …it also…

194

00:24:54.820 –> 00:24:58.880

sarah: …in this case confers trust…

195

00:24:59.260 –> 00:25:03.180

…and trust, is something that should be earned…

196

00:25:03.580 –> 00:25:06.500

sarah: …and I like the fact…

197

00:25:06.790 –> 00:25:08.930

sarah: …that you know…

198

00:25:08.940 –> 00:25:23.180

…as you said, a lot of people are saying, Oh, ChatGPT! Is it bad for education? ChatGPT has gotten a lot of folks who had not previously played with AI playing with AI. It’s lowered the barrier to entry…

199

00:25:23.200 –> 00:25:24.740

sarah: …in this space.

200

00:25:24.900 –> 00:25:30.390

…There’s a ton of things that I had to learn in school that are now, you know, solved problems.

202

00:25:31.980 –> 00:25:50.090

sarah: You know the ensemble learning of the past is completely out the window. It’s great that I understand what’s going on. But if the tools are out there, there’s great documentation. There’s a great open-source community. Then we have a lot of people entering this workspace…

203

00:25:50.110 –> 00:26:04.230

sarah: …who are going to stick around, because some of these current iterations are so fun. They’re sticky. They’re good products, as you and I in the Bay Area would say.

204

00:26:04.420 –> 00:26:09.280

sarah: But as you also mentioned, that means…

205

00:26:09.420 –> 00:26:13.330

sarah: …that’s in conflict with the macroeconomic trend…

206

00:26:13.360 –> 00:26:20.859

sarah: …of letting people go in a lot of mid-size companies, even the larger companies. And I think that there’s…

207

00:26:21.120 –> 00:26:24.400

sarah: …a sense that…

208

00:26:24.570 –> 00:26:39.430

sarah: …a lot of mid-size companies that have AI teams are taking a step back and saying, we were going to build this internally, and now we can buy this. And so, we don’t need some of these internal people…

209

00:26:39.720 –> 00:26:40.560

…and…

210

00:26:41.180 –> 00:26:42.550

sarah: …I think…

211

00:26:42.850 –> 00:26:44.910

…that if you look at…

212

00:26:45.520 –> 00:26:50.770

sarah: …the current strategic ecosystem…

213

00:26:51.100 –> 00:26:59.410

sarah: …strategically, this is exactly what OpenAI and some of these companies want to do…

214

00:26:59.760 –> 00:27:03.920

sarah: …they’re looking for customers and…

215

00:27:04.140 –> 00:27:26.679

sarah: …to some degree the price point has been so low for people that it’s really lowered the barrier in that dimension as well. It’s an accessible price point to learn these technologies, and then, when you’re playing with them at home, you bring them into the workspace and say, Hey, I’ve got some great ways that we can automate…

216

00:27:26.900 –> 00:27:36.320

sarah: …some workflows. But more broadly, if you look at, you know, these macroeconomic trends…

217

00:27:37.710 –> 00:27:44.260

Sarah: …OpenAI lost 54 million dollars, and it has a hundred employees.

218

00:27:44.840 –> 00:27:45.640

sarah: The…

219

00:27:45.940 –> 00:27:53.719

sarah: …technologies that we’ve been talking about are extremely, extremely compute-intensive, which means…

220

00:27:53.760 –> 00:28:05.160

sarah: …there are probably two companies who are making money out of LLMs right now. One is Nvidia, and the other is whatever power company they were…

221

00:28:05.270 –> 00:28:10.600

sarah: …accessing to run these scrapes of the web.

223

00:28:27.260 –> 00:28:35.190

Alex Olesen: Sarah, I want to touch on a theme we have alluded to throughout the episode…

224

00:28:35.800 –> 00:28:38.150

Alex Olesen: …and that is the theme of trust…

226

00:28:42.290 –> 00:28:45.080

Alex Olesen: …as these companies…

227

00:28:45.650 –> 00:28:56.389

Alex Olesen: …be they large enterprises, start-ups, or even providers in this space. As these companies embark on their journey…

228

00:28:56.400 –> 00:29:01.410

Alex Olesen: …to either build LLMs or incorporate Generative AI…

229

00:29:01.540 –> 00:29:12.080

Alex Olesen: …into their go-to market, how can companies earn the trust of their consumer base…

230

00:29:12.140 –> 00:29:27.820

Alex Olesen: …and use this technology in a way that will not feel invasive or like it is infringing on the privacy or the rights of the consumer?

231

00:29:30.360 –> 00:29:34.259

sarah: Great question. I think that this is going to be…

232

00:29:35.930 –> 00:29:42.250

sarah: …you know, I do want to give respect to Open AI for creating a great product because…

233

00:29:43.190 –> 00:29:46.960

sarah: …LLMs have been around, vectorized…

234

00:29:46.990 –> 00:29:49.609

sarah: …data…

235

00:29:49.640 –> 00:29:54.439

…word embeddings have been around. But they created a front end, and that was it.

236

00:29:54.580 –> 00:29:57.870

sarah: It’s a great experience and…

237

00:29:58.210 –> 00:30:09.370

sarah: …I think that, in general, they’ve built trust in a way that they can parlay into building a…

238

00:30:10.670 –> 00:30:25.570

sarah: …partner and customer business that will have some really big, impressive names. But what does that mean, you know, when the rubber hits the road, for everyone else?

239

00:30:25.740 –> 00:30:27.780

sarah: When you come into…

240

00:30:28.060 –> 00:30:33.770

sarah: …a situation where you’re trying to win a sale from…

241

00:30:33.890 –> 00:30:38.129

sarah: …a customer, I think it’s really important to say…

242

00:30:38.930 –> 00:30:57.289

sarah: How can I define my company? And what do I have that makes my company unique? And in most cases this question, before November 30th, 2022 would be, I have a relationship…

243

00:30:57.400 –> 00:31:07.420

sarah: …with this customer. I have relationships with customers like this customer, and I have a long-standing…

244

00:31:07.740 –> 00:31:09.470

…data set…

245

00:31:09.580 –> 00:31:11.430

sarah: …that includes…

246

00:31:11.490 –> 00:31:29.060

sarah: …well documented metadata of these relationships. Okay? So, maybe that was a computer scientist describing an enterprise customer engagement. But I think that that fundamentally has not changed. If you came into a…

247

00:31:29.100 –> 00:31:34.260

sarah: …space where you are providing a service, and you have…

248

00:31:34.370 –> 00:31:45.980

sarah: …a great customer base. And because of that ongoing engagement over years, you have a ton of data that reflects your customers…

249

00:31:46.090 –> 00:31:49.849

sarah: Then you are going to be in a great situation…

250

00:31:50.160 –> 00:31:52.319

sarah: …in the large language model realm…

251

00:31:52.450 –> 00:32:05.930

Sarah Luger: Why? Because you’ve already gained the trust, and you’ve already created great results for your customers. And now you’re just laying a…

252

00:32:06.110 –> 00:32:13.129

Sarah Luger: …technology on top of that, a technical layer that says we can…

253

00:32:13.280 –> 00:32:29.280

Sarah Luger: …optimize this past data even more, we can surface new insights. We can respond to your questions in a more naturalistic way. And we can have deeper…

254

00:32:29.450 –> 00:32:53.260

Sarah Luger: …information discovery from our pre-existing information, using some of these vector-based approaches. Right? So, when you think about trust, I don’t think about someone who’s coming to the market tomorrow with, “I can solve X, Y, and Z with large language models or ChatGPT,” for…

255

00:32:53.680 –> 00:33:07.300

Sarah Luger: …you know, road cleaning, whatever it may be. But I do think that trust is about relationships, and from a technical perspective, it’s about data. And if you can leverage that data…

256

00:33:07.550 –> 00:33:20.730

Sarah Luger: …to create naturalistic experiences for your existing customers, you’re going to make them happy. And you’re going to gain more data. Data is key, and data is what…

257

00:33:20.990 –> 00:33:31.749

Sarah Luger: …has really amplified the large language model scene. So, I think a lot of companies need to understand that their data is gold.

258

00:33:31.790 –> 00:33:50.039

Sarah Luger: At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience, and we know that we’re sitting on a gold mine of data. We know past preferences…

259

00:33:50.280 –> 00:34:04.840

Sarah Luger: …and now we have a tool that allows us to personalize and create even better experiences, you know. And that’s why we’re amplifying that trust.

260

00:34:06.480 –> 00:34:20.589

Alex Olesen: Fantastic. Well, Sarah, I know we’ve covered a lot of ground today, and thank you for all of the insight. I have one final question now that we’re on the topic of Orange and the projects that you’re working on.

261

00:34:21.150 –> 00:34:28.960

Alex Olesen: What on your plate right now, or in your pipeline, excites you the most?

262

00:34:30.170 –> 00:34:33.969

Sarah Luger: Thank you. Because…

263

00:34:34.580 –> 00:34:39.539

Sarah Luger: …I’ve been talking a lot about generative AI, and…

264

00:34:39.770 –> 00:34:50.789

Sarah Luger: …I hope this is as articulate as needed. But the thing that excites me the most is working on…

265

00:34:50.969 –> 00:34:56.769

…low-resource language projects. We have a lot of customers…

266

00:34:56.810 –> 00:35:00.150

Sarah Luger: …in our West African. Well…

267

00:35:00.290 –> 00:35:11.150

Sarah Luger: …okay, let me start at the beginning. Hi, my name is Sarah, and I’m really interested in low-resource language projects. This is why I joined Orange five years ago.

268

00:35:11.780 –> 00:35:18.680

Orange is in a particularly well-suited position to support…

269

00:35:19.480 –> 00:35:34.140

Sarah Luger: …our customers in numerous areas. We have a bank, we have a video game company, we have entertainment streaming services, we lay Internet cable, we have satellites, you know. What don’t we do? Well…

270

00:35:34.210 –> 00:35:35.330

Sarah Luger: …we…

271

00:35:35.420 –> 00:35:41.349

Sarah Luger: …are focused on providing great customer experiences to everyone…

272

00:35:41.450 –> 00:35:49.150

Sarah Luger: …and many of our customers are in places that have underserved languages…

273

00:35:49.450 –> 00:36:01.050

Sarah Luger: …So, Orange is in Francophone Africa. But a lot of people in these regions no longer speak French; it’s not considered…

274

00:36:01.460 –> 00:36:06.989

…a necessary language as much as new trends in…

275

00:36:07.000 –> 00:36:11.779

Sarah Luger: …political and community…

276

00:36:12.170 –> 00:36:19.770

Sarah Luger: …economics spring up, you know, there’s the ability to buy and sell products locally…

277

00:36:19.890 –> 00:36:31.249

Sarah Luger: …and with that local focus, a lot of people have been returning to local languages that are spoken…

278

00:36:31.640 –> 00:36:48.650

Sarah Luger: …or comparatively underserved. And so what does that mean? It means that with the shifting dynamics of these areas as well as other, you know, macroeconomic forces, we have a lot of customers who do not speak French…

279

00:36:48.690 –> 00:36:52.640

Sarah Luger: …or would prefer not to speak French…

280

00:36:53.030 –> 00:36:53.910

Sarah Luger: …now.

281

00:36:54.180 –> 00:37:07.009

Sarah Luger: I describe low-resource languages to the uninitiated as languages where there’s no Reddit, although in the past week that analogy is less powerful. But…

282

00:37:07.040 –> 00:37:26.570

Sarah Luger: …it means that there’s not a lot of data online. So, I am very passionate about selling people products in whatever language they would like, and providing goods and services in whatever language they would like. So recently, I ran a machine translation competition between French and Bambara that was text-based. We had…

283

00:37:26.930 –> 00:37:34.860

Sarah Luger: …20 participants and cash prizes. The winners were from all over the globe, and, well…

284

00:37:34.960 –> 00:37:48.069

Sarah Luger: …only a couple of the participants actually spoke Bambara. Almost all of the participants in the competition spoke, or had family members who spoke, a different low-resource language. And…

285

00:37:48.140 –> 00:37:58.590

Sarah Luger: …this was a really impressive experience, because it reminded us that no matter how much the large language model hype cycle…

286

00:37:58.830 –> 00:38:02.260

Sarah Luger: …you know, brings us cool, sticky…

287

00:38:02.280 –> 00:38:04.259

Sarah Luger: …products and…

288

00:38:04.280 –> 00:38:16.720

Sarah Luger: …crazy new applications, and, you know, the Pope in a puffer coat, whatever it may be, I think it’s really important to remember that…

289

00:38:16.840 –> 00:38:31.330

Sarah Luger: …natural language processing is not a solved domain. And there are a lot of challenges out there for the global population that are not English-centric or not French-centric…

290

00:38:31.360 –> 00:38:38.210

Sarah Luger: …and I’m hoping that, you know, when we were working on this last iteration of the translation project.

291

00:38:38.240 –> 00:38:51.019

Sarah Luger: Everyone was saying, Okay, large language models have dropped. How relevant are they for this domain? And then our next step is to experiment with that and figure out how can you…

292

00:38:51.030 –> 00:39:02.640

Sarah Luger: …augment large language models for languages that are, basically, small language models and, you know, have their own orthographic challenges.

293

00:39:02.710 –> 00:39:12.449

Sarah Luger: And at the end of the day, it’s really about customer service and people. So, I think it’s been a really great experience. And I’ve met a lot of really interesting people…

294

00:39:12.570 –> 00:39:20.619

Sarah Luger: …and I trust that that will continue to be a great way to leverage new technologies…

295

00:39:20.640 –> 00:39:23.330

Sarah Luger: …with a global…

296

00:39:23.470 –> 00:39:29.129

Sarah Luger: …economy and global engagement. But I do want to reiterate that…

297

00:39:29.580 –> 00:39:39.040

Sarah Luger: …LLMs have not solved AI. There’s probably going to be a plateau right now, where…

298

00:39:39.240 –> 00:39:46.870

Sarah Luger: OpenAI has not been mentioned; they don’t have plans to release a GPT…

299

00:39:47.010 –> 00:39:51.170

Sarah Luger: …different than, you know, the LLMs.

301

00:39:52.660 –> 00:39:58.190

Sarah Luger: Models are out there, and there’s been a lot of optimization around…

302

00:39:58.210 –> 00:40:15.360

Sarah Luger: …how to compute and run these models on smaller and smaller memory devices. And so I would watch to see what Apple is up to, because I think they’re going to jump on the scene with a pretty interesting privacy…

303

00:40:15.440 –> 00:40:18.919

Sarah Luger: …oriented solution, as is their brand.

304

00:40:19.160 –> 00:40:30.080

Sarah Luger: But I’m really excited about the power of communication and the power of speaking to people in their own language. So, watch this space for more machine translation…

305

00:40:30.240 –> 00:40:34.400

Sarah Luger: …and low-resource language innovations…

306

00:40:35.490 –> 00:40:44.690

Alex Olesen: Well, thank you, Sarah. You know, this has been a fascinating discussion, one which I think the audience is really going to appreciate, not only for the…

307

00:40:44.800 –> 00:40:49.650

Alex Olesen: …information and insight that you’ve provided, but I think also a very…

308

00:40:49.830 –> 00:40:59.740

Alex Olesen: …actionable, optimistic, and pragmatic point of view. Again, for the listeners, this was Sarah Luger.

309

00:40:59.880 –> 00:41:12.280

Alex Olesen: Sarah, thank you very much for joining. I really enjoyed our conversation and I’m looking forward to the next one.

310

00:41:14.060 –> 00:41:16.500

Sarah Luger: Thank you, Alex. It was a pleasure.
