Podcast | 22 Jun, 2023

Episode 3: Generative AI over the Years: Security, Job Creation, and Fighting Writer’s Block

Alex Olesen

VP Vertical Strategy & Product Marketing

Sarah Luger

Senior Director of AI, NLP, & ML at Orange Silicon Valley

Sarah Luger, PhD, sat down with host Alex Olesen, VP Vertical Strategy & Product Marketing at Persado, to share some key ways humans and AI work together and could collaborate in the future. This partnership between humans and AI offers an uplifting perspective: AI automates mundane tasks at scale, leaving humans with more time for more complex tasks. But what does this mean in the short term and in the long term? According to Sarah, right now ChatGPT and GPT-based technologies are being used a lot in marketing, writing, and creativity-based tasks. Generative AI is great for writer’s block, and people can easily figure out how to engineer prompts thanks to decades of using Google. On the enterprise level, the relationship between AI and humans is based on trust: not just humans trusting AI to complete tasks quickly at scale, but also using AI to add value to the customer experience. Forward-thinking companies use AI to learn customers’ preferences and to provide more personalized experiences and recommendations.

“At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience and we know that we’re sitting on a gold mine of data. Now we have a tool that allows us to personalize and create even better experiences,” said Sarah. 

Sarah hopes to see in the future how large language models will transform economic and global engagement, especially when it comes to underserved languages and communities.

Episode Transcript:

00:00:01.150 –> 00:00:05.030
[email protected]: All right. We are now recording. So I’m going to get started in a couple of seconds.

00:00:08.500 –> 00:00:09.570
[email protected]: All right. Here we go.

00:00:10.500 –> 00:00:19.100
[email protected]: Welcome back to Motivation AI Matters. Today I am really excited to be joined by Sarah Luger.

00:00:33.850 –> 00:00:36.530
[email protected]: Sarah, thank you for joining us today.

00:00:38.190 –> 00:00:53.090
sarah: Thank you so much, Alex. It’s a real pleasure, as you know. I’ve been working in the space for some time, but also excited about Persado’s role in this space. So thank you again for having me. My name is Sarah Luger.

00:00:53.210 –> 00:01:00.069
I got my PhD at the University of Edinburgh many years ago in artificial intelligence, and I would be

00:01:00.480 –> 00:01:18.790
sarah: quite honest to say that I’m surprised and excited about the developments in this space. In the past I’ve worked at startups. I’ve worked at IBM, building a precursor to the IBM Watson Jeopardy challenge robot.

00:01:18.820 –> 00:01:28.430
sarah: I’ve also been at Orange Silicon Valley for 5 years. We’ve worked on numerous topics, including voice biometrics, chatbots, call center technology,

00:01:28.480 –> 00:01:30.540
and of course.

00:01:31.100 –> 00:01:35.130

sarah: everything that has to do with large language models and generative AI.

00:01:35.320 –> 00:01:39.469

I’m really excited, especially because, as you may or may not know,

00:01:39.690 –> 00:01:56.930

sarah: Orange is a large company with almost 140,000 employees, and we’re in 27 countries. Many of those are in North and West Africa, and many of our customers speak languages that are low-resource, which means they don’t have a lot of training data.

00:01:57.030 –> 00:02:12.949

sarah: They don’t have an online presence that supports the kind of data that’s used in conventional translation systems to create high-quality translations. So I’ve been working a lot in that space, and I do see both potential

00:02:13.080 –> 00:02:20.550

sarah: for large language models to support our customers, but also some peril. And I’m looking forward to chatting more about that today. Thank you.

00:02:20.720 –> 00:02:26.429
[email protected]: I think that’s fantastic. And you’ve got a great background. I know you’ve

00:02:26.580 –> 00:02:38.099
[email protected]: you’ve seen this industry evolve through numerous iterations, its most recent being what is now referred to as generative AI.

00:02:38.370 –> 00:02:50.530
[email protected]: I know you just touched on a lot of really interesting topics we’ll dive into over the course of this episode. But to give the listeners a good baseline,

00:02:50.730 –> 00:03:03.009
[email protected]: in your own words, could you define what this new term generative AI means, and then talk to me a little bit about who generative AI is important for?

00:03:04.990 –> 00:03:06.020
sarah: Okay.

00:03:06.390 –> 00:03:09.710
sarah: So I think

00:03:09.820 –> 00:03:15.070
sarah: for the average person out there, generative AI

00:03:15.360 –> 00:03:25.520
sarah: is AI where the output resembles human content. It resembles language

00:03:25.540 –> 00:03:28.680
that seems like it’s constructed by a human

00:03:29.000 –> 00:03:54.640
sarah: or, technically, generative AI systems are based on algorithms that learn from a vast amount of input data. In the most recent cases that we’ll dig into, that would be all of the digital data that’s on the web, as well as some knowledge bases, knowledge bases being things like Wikipedia that give structure and associate terms and

00:03:54.700 –> 00:04:02.799
sarah: some apparent meaning to this vast sea of language data. And so

00:04:02.820 –> 00:04:05.690
sarah: what’s going on under the hood is that

00:04:05.840 –> 00:04:14.769
sarah: this vast amount of data is being used to learn the patterns of how we as humans speak

00:04:14.950 –> 00:04:26.620
sarah: and how we write, and with innovations both from Google’s 2017 transformer paper, incredible compute innovations,

00:04:26.700 –> 00:04:31.960
as well as just ongoing neural network developments,

00:04:33.070 –> 00:04:41.310
sarah: There’s the possibility, as many of us have now tried since November 30th, 2022, when ChatGPT was launched,

00:04:41.340 –> 00:05:01.849
sarah: to engage with a generative AI system in a way that most people had not engaged with an AI system before. You know, perhaps in the past you had AI as secondary characters in a video game, or there’d maybe been some predictive analytics in an enterprise

00:05:02.070 –> 00:05:15.019
sarah: application you were using. But the core of generative AI is using these patterns of words at a vast scale that then for us makes it seem like

00:05:15.220 –> 00:05:17.369
sarah: this computer is

00:05:17.590 –> 00:05:25.859
sarah: almost human. It’s almost human-like content that’s being output. And it’s a really powerful difference between

00:05:25.870 –> 00:05:36.760
sarah: systems from even, gosh, even 6 or 7 months ago, right? We’ve had a sea change. And your second question is, who is it

00:05:37.180 –> 00:05:39.400
sarah: most?

00:05:39.790 –> 00:05:43.560
sarah: Who is it most important for?

00:05:44.540 –> 00:05:49.109
sarah: Well, right now,

00:05:49.190 –> 00:05:57.200
sarah: we’re in the hype cycle, and it’s a little bit of, it’s important for everyone,

00:05:57.240 –> 00:05:59.539
This is great for everything.

00:05:59.570 –> 00:06:11.029
sarah: And I respect the hype as someone who’s in Silicon Valley, because I understand the role that it plays and the duality


00:06:11.410 –> 00:06:22.590
sarah: of how we get investment and how we build products and how we have to compete with other hype cycles, be they the metaverse most recently, or blockchain.

00:06:22.670 –> 00:06:26.100
sarah: But I think that this is really important for


00:06:26.200 –> 00:06:29.799
sarah: creating customer-centric tools

00:06:29.970 –> 00:06:32.610

sarah: that support

00:06:32.830 –> 00:06:46.540

sarah: voice bots and textual, you know, marketing text. I think that marketing and customer support are the first areas that are going to see innovations in these really

00:06:46.810 –> 00:06:49.360
sarah: human-seeming

00:06:49.390 –> 00:06:54.050
sarah: engagements that can be created for their customers. So

00:06:54.560 –> 00:07:02.539
those are the two areas I see that it will most affect. But then I want to also flip that and say,

00:07:02.620 –> 00:07:06.619
sarah: I think the Holy Grail of enterprise

00:07:06.790 –> 00:07:15.259
sarah: innovation, which isn’t as shiny and sparkly as some of the other, you know, key terms I’ve just mentioned,

00:07:15.280 –> 00:07:19.839

sarah: is enterprise intranet search,

00:07:20.020 –> 00:07:36.159
sarah: so the ability to search through a company’s resources, to answer questions for employees, or answer questions for employees that are then passed on to customers. I think that that is

00:07:36.430 –> 00:07:45.519
sarah: really key because it will help you and I do our jobs better and reduce mundane tasks. And reducing

00:07:45.820 –> 00:07:47.410
mundane tasks

00:07:48.060 –> 00:07:55.780
sarah: is something that AI in general, you know, aims to do, as does any technology. We try to elevate our

00:07:55.790 –> 00:07:58.139
sarah: our work tasks up

00:07:58.220 –> 00:08:10.129
the difficulty chain. So we want, as humans, to not do the same thing every day, but to understand patterns

00:08:10.180 –> 00:08:13.969
sarah: and reduce repetition

00:08:14.020 –> 00:08:29.960
sarah: and do more and more challenging tasks, and those more challenging tasks are very hard for computers, so don’t fret, many of us will still have jobs. On the other hand, those lower-level tasks are

00:08:30.000 –> 00:08:39.289
sarah: really great opportunities for computers, for generative AI systems, to come in and support us.

00:08:41.820 –> 00:08:51.909
[email protected]: Yeah, you’ve raised some really interesting points, some of which, you know, we’ve discussed in prior conversations. But I’ll summarize those quickly for the listeners.

00:08:52.080 –> 00:09:08.989
[email protected]: You know, we’ve talked about, and I love the way that you put it, some hype cycles around the metaverse, around blockchain. I think we’re definitely seeing a surge of a hype cycle around generative AI.

00:09:09.050 –> 00:09:19.740
[email protected]: You know, to another one of your points, the barrier to entry for consuming this type of technology, in my observation, has been lowered

00:09:19.860 –> 00:09:23.610
[email protected]: as of November of last year.

00:09:23.630 –> 00:09:26.399
[email protected]: You know, ChatGPT is

00:09:26.660 –> 00:09:36.730
[email protected]: so readily available for the average consumer, you know, broaching the prosumer market.

00:09:36.860 –> 00:09:37.900
[email protected]: And

00:09:37.910 –> 00:09:44.730
[email protected]: you know, as you’re alluding to, breaking into the enterprise space as well,


00:09:44.910 –> 00:09:47.030

[email protected]: I think, is going to have


00:09:47.100 –> 00:09:59.210

[email protected]: widespread benefit, both in terms of the way consumers interact with this type of technology, but also the way large enterprises similar to Orange


00:09:59.340 –> 00:10:12.729

[email protected]: derive value from this type of technology. But I do want to touch on another point that you’ve made, which is unique among the conversations that I’ve had in the last couple of months,


00:10:12.840 –> 00:10:19.369

[email protected]: which is the implications around education and career development.


00:10:19.430 –> 00:10:27.680

[email protected]: I love what you said, because it goes against the grain, in my opinion,


00:10:27.870 –> 00:10:32.050

[email protected]: of what we have seen in the news around job displacement


00:10:32.230 –> 00:10:50.360

[email protected]: and the automation of tasks. I think you delineate very well that artificial intelligence will help eliminate some mundane tasks. But you put a great perspective on the state of the market by saying it will free


00:10:50.380 –> 00:10:59.559

[email protected]: humans up to do more challenging tasks, and I find that to be a very pragmatic and uplifting perspective.


00:11:00.400 –> 00:11:04.460

[email protected]: If you could expand on that a bit, Sarah,


00:11:05.040 –> 00:11:06.130

[email protected]: What


00:11:06.240 –> 00:11:09.140

[email protected]: would you say


00:11:09.430 –> 00:11:16.780

[email protected]: some interesting applications of humans and AI working together


00:11:16.920 –> 00:11:21.819

[email protected]: might be that we haven’t necessarily considered yet?


00:11:24.690 –> 00:11:25.560

sarah: Great.


00:11:26.330 –> 00:11:30.300

Yeah. And I appreciate you,


00:11:30.720 –> 00:11:42.030

sarah: your perspective as well, because I think that we’re at a point where, because of the hype, there are some


00:11:42.930 –> 00:11:48.020

sarah: primary narratives that seem to be


00:11:48.380 –> 00:11:57.640

extremes. They’re either fear-based or wildly outlandish. And, you know, the reality is somewhere in between.


00:11:57.730 –> 00:11:59.160

sarah: I’ve heard that


00:11:59.230 –> 00:12:03.270

with new innovations we grossly


00:12:03.350 –> 00:12:10.729

sarah: overestimate the near-term change, and then underestimate the long-term change,


00:12:11.010 –> 00:12:27.699

sarah: and that’s not my quote, but I strongly agree. So to your point, what are the ways that this hasn’t been used and hasn’t been clear? And with collaboration between humans and


00:12:27.790 –> 00:12:30.179

these tools. What will be


00:12:30.550 –> 00:12:39.399

sarah: clear in the near term? So what is happening now, what do we see now?


00:12:39.910 –> 00:12:58.110

sarah: Right now, ChatGPT and GPT-based technologies are being used in a lot of marketing, writing, and text and ideation and creativity-based tasks.


00:12:58.110 –> 00:13:21.119

sarah: So they’re really good for this blank-page problem. Maybe call it writer’s block. Also, from a prompt perspective, you know, we’ve all grown up to some extent feeling comfortable with Google searches, and so we can figure out how to engineer prompts that give us an output that we’re looking for.


00:13:21.260 –> 00:13:32.460

But there’s a long-standing history of humans and computers working together, and in artificial intelligence we’ve developed agents,


00:13:32.640 –> 00:13:39.060

sarah: which are little software


00:13:39.140 –> 00:13:57.239

sarah: products that help humans complete tasks. And there’s been research in this multimodal space, you know, with sensors. There’s a lot of use in industry. But I think that the next wave of


00:13:57.590 –> 00:14:02.659

sarah: this technology will be using the


00:14:02.720 –> 00:14:15.749

sarah: trust which a lot of users have gained from this first iteration. And, you know, later I want to maybe push back on that trust. But if we think of this as kind of a consumer life cycle,


00:14:15.910 –> 00:14:24.009

sarah: using the trust that the consumer has from the output of their large language model


00:14:24.030 –> 00:14:26.460

sarah: chat interface


00:14:26.520 –> 00:14:30.389

sarah: and turning that into a resource


00:14:31.230 –> 00:14:44.590

sarah: that maybe takes the place of a Google search. Right? So I think that chat interfaces are going to transform the way that humans work


00:14:44.690 –> 00:14:45.540

sarah: with


00:14:46.460 –> 00:15:01.670

sarah: computers, in the fact that we may no longer have Google PageRank be as important, in terms of Google being the font of all knowledge. Maybe that verb itself, “I’m gonna Google it,”


00:15:01.780 –> 00:15:11.079

sarah: will cease to be as relevant. I think we’re going to have a split in our society where people who have become very


00:15:11.090 –> 00:15:23.469

sarah: adept at using chat technology may use this as their primary resource for new information, for information discovery.


00:15:23.760 –> 00:15:35.740

I think that could have profound implications for enterprise search and our ecosystems. You know, if we think of trying to make a decision about, say,


00:15:35.970 –> 00:15:43.760

sarah: Siri, or Google Home, or Apple HomePod, what is the best smart device?


00:15:43.840 –> 00:15:46.620

sarah: It’s hard to evaluate which is best


00:15:46.670 –> 00:15:51.900

sarah: in a you know, just a discrete test.


00:15:52.260 –> 00:16:03.200

sarah: But maybe, as many consumers do, people make decisions based on their ecosystem. So I imagine in the future that


00:16:03.290 –> 00:16:06.279

sarah: OpenAI will leverage this


00:16:06.660 –> 00:16:16.670

sarah: collaboration with Microsoft to really pivot the enterprise solutions that the average person spends a lot of their time on.


00:16:16.830 –> 00:16:18.010

and by


00:16:18.240 –> 00:16:19.980

sarah: changing that habit.


00:16:20.620 –> 00:16:25.100

sarah: Anytime you use an ecosystem, you’re providing them with a lot of data.


00:16:25.200 –> 00:16:38.960

sarah: And when you provide an ecosystem with a lot of data, it allows them to personalize and learn your preferences. And this could change the way that we, online,


00:16:39.440 –> 00:16:45.780

sarah: ask for and receive new information, answer questions. And


00:16:45.860 –> 00:17:00.449

sarah: that pivot could reshuffle the companies that we get many of our goods and services from, or at least our initial point of contact for those goods and services. Because I do think that


00:17:00.540 –> 00:17:07.520

sarah: especially younger consumers, will feel more comfortable using these tools


00:17:08.140 –> 00:17:09.640

sarah: as


00:17:09.670 –> 00:17:13.109

sarah: a co-author, as a co-pilot.


00:17:15.890 –> 00:17:28.250

[email protected]: That’s really interesting. And, Sarah, you’re touching on a topic that actually comes up with a lot of clients that I speak to. You know, we’re in an interesting


00:17:28.680 –> 00:17:40.300

[email protected]: environment from a macro perspective. At the moment there have been macroeconomic headwinds that many,


00:17:40.650 –> 00:17:53.389

[email protected]: you know, large, mid-market, and startup businesses have to navigate in terms of personnel decisions, R&D investment decisions.


00:17:53.660 –> 00:17:57.960

[email protected]: Specifically within tech, there have been a lot of reductions in force.


00:17:58.480 –> 00:18:02.189

[email protected]: So let me just paint that as the


00:18:02.480 –> 00:18:09.760

[email protected]: personnel backdrop for humans interacting with machines. Simultaneously,


00:18:09.870 –> 00:18:15.489

[email protected]: this new technology appears seemingly out of nowhere last year


00:18:16.820 –> 00:18:29.850

[email protected]: To the untrained eye, it can come across as this all-seeing oracle that can answer any question. So if I’m sitting from the perspective of


00:18:30.540 –> 00:18:34.430

[email protected]: someone in the workforce, that spells a threat.


00:18:35.030 –> 00:18:44.960

[email protected]: I love the direction that you’re taking your point of view, because of the way that you phrased it, and I will have to look up this quote: we


00:18:45.490 –> 00:18:51.040

[email protected]: overestimate the near-term impact and underestimate the long-term impact.


00:18:51.670 –> 00:18:53.399

[email protected]: And what I’m finding


00:18:54.200 –> 00:18:58.689

[email protected]: in my personal experience with our clients is


00:19:00.140 –> 00:19:02.220

[email protected]: these LLMs,


00:19:03.150 –> 00:19:08.840

[email protected]: content assistance or generation tools, they help alleviate


00:19:08.990 –> 00:19:13.850

[email protected]: writer’s block. They help get the first notes on the page.


00:19:14.250 –> 00:19:20.570

[email protected]: But that human expertise of motivating someone to act,


00:19:20.790 –> 00:19:24.670

[email protected]: of understanding the nuance of


00:19:24.850 –> 00:19:26.930

[email protected]: what’s being discussed


00:19:27.380 –> 00:19:32.249

[email protected]: is, from what I’m gathering from what you’re saying, not quite there.


00:19:33.290 –> 00:19:41.750

[email protected]: And maybe this is two points: not quite the capability and not quite the near-term intent of this technology. And I think


00:19:41.860 –> 00:19:43.210

[email protected]: that that is.


00:19:44.430 –> 00:19:52.050

[email protected]: I think that that is a very good dynamic to exist, specifically in the environment that we find ourselves in now.


00:19:54.100 –> 00:19:58.589

sarah: Alex, can I? Let me ask a clarification. So


00:19:58.740 –> 00:19:59.970

sarah: are you.


00:20:01.450 –> 00:20:08.319

sarah: I agree completely with what you said. But is your question.


00:20:09.860 –> 00:20:17.150

sarah: I think, let me just. I kind of started thinking about one word you said: intent, human intent.


00:20:17.480 –> 00:20:25.640

sarah: And I think one of the challenges with the output of large language model systems


00:20:26.220 –> 00:20:28.580

is what we’re calling them.


00:20:29.170 –> 00:20:31.290

sarah: We’re calling that output information.


00:20:32.440 –> 00:20:37.259

sarah: And I’m not sure if that’s a great word.


00:20:38.270 –> 00:20:42.870

There’s a researcher I spoke with last week


00:20:43.480 –> 00:20:50.090

sarah: who would like us to remember how large language models work on the inside


00:20:50.200 –> 00:20:59.430

sarah: and view the output as synthetic media, because information is constructed by a human.


00:20:59.570 –> 00:21:00.540

sarah: And if


00:21:01.090 –> 00:21:05.790

sarah: if words are not constructed by a human,


00:21:06.490 –> 00:21:08.999

is it information?


00:21:09.110 –> 00:21:26.929

sarah: And this is a very interesting discussion, and I think this is nothing that we’re going to solve today. But it touches on another topic in this space, which is anthropomorphizing these systems. As you noted, you know,


00:21:27.480 –> 00:21:33.260

they’re impressive. And if you don’t know the inner workings, or if you’re new to this space,


00:21:33.310 –> 00:21:38.250

sarah: or actually, that’s not fair. If you’re a human, you



00:21:40.710 –> 00:21:44.489

sarah: give intention to text


00:21:44.650 –> 00:21:56.639

sarah: You make meaning out of the kind of bag of words that you get back. And this bag of words is really good. And so it’s our natural tendency


00:21:56.830 –> 00:22:08.139

sarah: to find meaning. I mean, think of conspiracy theories: we try to find meaning in disparate facts. And this goes back to


00:22:08.390 –> 00:22:09.510

sarah: our


00:22:10.830 –> 00:22:15.670

sarah: our core human animal


00:22:15.690 –> 00:22:26.339

sarah: backgrounds. You know, this is how we probably survived on the African veldt, right? We needed to understand signal. So


00:22:26.670 –> 00:22:31.510

sarah: when I hear the word hallucination,


00:22:31.640 –> 00:22:37.190

sarah: I also get a little bit concerned that we’re anthropomorphizing these systems.


00:22:39.220 –> 00:22:50.759

sarah: and I think maybe a better word would be error, and to not anthropomorphize these systems. In using the word hallucination, we’re evoking


00:22:50.940 –> 00:22:55.560

sarah: spirituality, creativity, very


00:22:56.750 –> 00:23:05.190

sarah: ideas and words that are very close to human souls. And again, I want to take a step back and say: this is a machine, people.


00:23:05.260 –> 00:23:07.920

It’s an error.


00:23:07.990 –> 00:23:13.610

sarah: And so I just kind of went on that tirade because you mentioned intention. But


00:23:13.790 –> 00:23:17.690

sarah: I think harnessing


00:23:18.300 –> 00:23:26.820

sarah: our human intention to complete tasks, to work together with each other, you know, to build companies


00:23:26.930 –> 00:23:29.760

and support customers.


00:23:30.000 –> 00:23:42.770

sarah: We can use these tools to do that. But I want to push back on hallucinations, and perhaps on information as output.


00:23:42.990 –> 00:23:45.060

sarah: And then


00:23:46.140 –> 00:24:06.549

sarah: Yeah, sorry, Alex, please.
[email protected]: No, no, no need to apologize. Quite the contrary, Sarah. I am always looking for a good reason to weave the word anthropomorphize into a conversation, and I’m glad that you took it.


00:24:06.730 –> 00:24:12.050

[email protected]: Well, it’s interesting, right? Like it’s really interesting how we do this. And


00:24:12.880 –> 00:24:19.570

sarah: It’s not a bad thing. It’s a natural thing. And this is why we


00:24:19.790 –> 00:24:26.220

sarah: you know, this is how we relate to our pets. I mean, you know, this is like


00:24:26.690 –> 00:24:36.510

sarah: You know, I’ll fight you if you don’t think my cat totally understands what I say. You know, this is a totally human behavior, but


00:24:37.190 –> 00:24:39.880

sarah: It is, and it’s something that


00:24:39.980 –> 00:24:43.049

sarah: that has a


00:24:43.360 –> 00:24:52.900

sarah: It’s a core part of who we are, and there’s a reason that it’s in the fabric of our humanity. But


00:24:53.190 –> 00:24:54.660

sarah: it also


00:24:54.820 –> 00:24:58.880

sarah: in this case confers trust


00:24:59.260 –> 00:25:03.180

and trust is something that should be earned.


00:25:03.580 –> 00:25:06.500

sarah: and I like the fact


00:25:06.790 –> 00:25:08.930

sarah: that, you know,


00:25:08.940 –> 00:25:23.180

as you said, a lot of people are saying, oh, ChatGPT is bad for education. Hey, ChatGPT has gotten a lot of folks who had not previously played with AI playing with AI. It’s lowered the barrier to entry


00:25:23.200 –> 00:25:24.740

sarah: in this space.


00:25:24.900 –> 00:25:30.390

There’s a ton of things that I had to learn in school that are now, you know, solved problems.




00:25:31.980 –> 00:25:50.090

sarah: You know, the ensemble learning of the past is completely out the window. It’s great that I understand what’s going on. But if the tools are out there, there’s great documentation, and there’s a great open source community, then we have a lot of people entering this workspace


00:25:50.110 –> 00:26:04.230

sarah: who are going to stick around, because some of these current iterations are so fun. They’re sticky. They’re good products, as you and I in the Bay Area would say.


00:26:04.420 –> 00:26:09.280

sarah: But as you also mentioned, that means that the


00:26:09.420 –> 00:26:13.330

sarah: that’s in conflict with the macroeconomic trend


00:26:13.360 –> 00:26:20.859

sarah: of letting people go in a lot of mid-size companies, even the larger companies. And I think that there’s


00:26:21.120 –> 00:26:24.400

sarah: a sense that


00:26:24.570 –> 00:26:39.430

sarah: a lot of mid-size companies that have AI teams are taking a step back and saying, we were going to build this internally, and now we can buy this, and so we don’t need some of these internal people.



00:26:41.180 –> 00:26:42.550

sarah: I think


00:26:42.850 –> 00:26:44.910

that if you look at


00:26:45.520 –> 00:26:50.770

sarah: the current strategic ecosystem,


00:26:51.100 –> 00:26:59.410

sarah: strategically, this is exactly what OpenAI and some of these companies want to do.


00:26:59.760 –> 00:27:03.920

sarah: They’re looking for customers, and


00:27:04.140 –> 00:27:26.679

sarah: to some degree the price point has been so low that it’s really lowered the barrier in that dimension as well. It’s an accessible price point to learn these technologies, and then, when you’re playing with them at home, you bring them into the workspace and say, hey, I’ve got some great ways that we can automate


00:27:26.900 –> 00:27:36.320

sarah: some workflows. But more broadly, if you look at, you know, these macroeconomic trends,


00:27:37.710 –> 00:27:44.260

sarah: OpenAI lost 54 million dollars in ’22, and it has a hundred employees.


00:27:44.840 –> 00:27:45.640

sarah: The


00:27:45.940 –> 00:27:53.719

sarah: technologies that we’ve been talking about are extremely, extremely compute-intensive, which means


00:27:53.760 –> 00:28:05.160

sarah: there are probably two companies who are making money out of LLMs right now. One is Nvidia, and the other is whatever power company they were


00:28:05.270 –> 00:28:10.600

sarah: accessing to run these scrapes of the web.


00:28:13.060 –> 00:28:27.159

sarah: I doubt it’s PG&E. And as a PG&E customer, you can cut that. I don’t want them coming after me.
[email protected]: That sounds good.


00:28:27.260 –> 00:28:35.190

[email protected]: Sarah, I want to touch on a theme we have alluded to throughout the episode,


00:28:35.800 –> 00:28:38.150

[email protected]: and that is the theme of trust.


00:28:39.090 –> 00:28:42.250

sarah: Ah, I think so, yeah.


00:28:42.290 –> 00:28:45.080

[email protected]: as these companies


00:28:45.650 –> 00:28:56.389

[email protected]: be they large enterprises, startups, even providers in this space, as these companies embark on their journey


00:28:56.400 –> 00:29:01.410

[email protected]: to either build LLMs or incorporate generative AI


00:29:01.540 –> 00:29:12.080

[email protected]: into their go-to-market, how can companies earn the trust of their consumer base


00:29:12.140 –> 00:29:27.820

[email protected]: and use this technology in a way that will not feel invasive, or not feel like it is infringing on the privacy or the rights of the consumer?


00:29:30.360 –> 00:29:34.259

Great question. I think that this is going to be.


00:29:35.930 –> 00:29:42.250

sarah: You know, I do want to give respect to OpenAI for creating a great product, because


00:29:43.190 –> 00:29:46.960

sarah: LLMs have been around, vectorized


00:29:46.990 –> 00:29:49.609

sarah: data


00:29:49.640 –> 00:29:54.439

word embeddings have been around, but they created a front end that was


00:29:54.580 –> 00:29:57.870

sarah: a great experience, and


00:29:58.210 –> 00:30:09.370

sarah: I think that engendered trust in a way that they can parlay into building a


00:30:10.670 –> 00:30:25.570

sarah: partner and customer business that will have some really big, impressive names. What does that mean, you know, when the rubber hits the road for everyone else? Now,


00:30:25.740 –> 00:30:27.780

sarah: when you come into


00:30:28.060 –> 00:30:33.770

sarah: a situation where you’re trying to win a sale from


00:30:33.890 –> 00:30:38.129

sarah: a customer, I think it’s really important to say:


00:30:38.930 –> 00:30:57.289

sarah: how can I define my company, and what do I have that makes my company unique? And in most cases the answer to this question, before November 30th, 2022, would be: I have a relationship


00:30:57.400 –> 00:31:07.420

sarah: with this customer, I have relationships with customers like this customer, and I have a long-standing


00:31:07.740 –> 00:31:09.470

data set


00:31:09.580 –> 00:31:11.430

sarah: that includes


00:31:11.490 –> 00:31:29.060

sarah: well-documented metadata of these relationships. Okay, so maybe that was a computer scientist describing an enterprise customer engagement, but I think that fundamentally has not changed. If you came into a


00:31:29.100 –> 00:31:34.260

sarah: space where you are providing a service, and you have


00:31:34.370 –> 00:31:45.980

sarah: a great customer base, and because of that ongoing engagement over years, you have a ton of data that reflects your customers.


00:31:46.090 –> 00:31:49.849

sarah: Then you are going to be in a great situation


00:31:50.160 –> 00:31:52.319

sarah: in the large language model realm.


00:31:52.450 –> 00:32:05.930

sarah: Why? Because you’ve already gained the trust, and you’ve already created great results for your customers. And now you’re just laying a


00:32:06.110 –> 00:32:13.129

sarah: technology on top of that, a technical layer that says we can


00:32:13.280 –> 00:32:29.280

sarah: optimize this past data even more, we can surface new insights. We can respond to your questions in a more naturalistic way. And we can have deeper


00:32:29.450 –> 00:32:53.260

sarah: information discovery from our pre-existing information, using some of these vector-based approaches. Right? So when you think about trust, I don’t think about someone who’s coming to the market tomorrow with, I can solve X, Y, and Z with large language models or ChatGPT, for


00:32:53.680 –> 00:33:07.300

sarah: you know, road cleaning, whatever it may be. But I do think that trust is about relationships, and from a technical perspective, it’s about data. And if you can leverage that data


00:33:07.550 –> 00:33:20.730

sarah: to create naturalistic experiences for your existing customers, you’re going to make them happy, and you’re going to gain more data. Data is key, and data is what


00:33:20.990 –> 00:33:31.749

sarah: has really amplified the large language model scene. So I think a lot of companies need to understand that their data is gold.


00:33:31.790 –> 00:33:50.039

sarah: At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience, and we know that we’re sitting on a gold mine of data. We know past preferences.


00:33:50.280 –> 00:34:04.840

sarah: And now we have a tool that allows us to personalize and create even better experiences, you know. And that’s how we’re amplifying that trust.


00:34:06.480 –> 00:34:20.589

[email protected]: fantastic. Well, Sarah, I know we’ve we’ve covered a lot of ground today. and thank you for all of the insight. I have one final question now that we’re on the topic of Orange and the projects that you’re working on.


00:34:21.150 –> 00:34:28.960

[email protected]: What What? That’s on your plate right now or in your pipeline excites you the most that you’re working on.


00:34:30.170 –> 00:34:33.969

sarah: Thank you. Because


00:34:34.580 –> 00:34:39.539

sarah: I’ve been talking a lot about generative AI, and


00:34:39.770 –> 00:34:50.789

sarah: I hope this is as articulate as needed. But the thing that excites me the most is working on


00:34:50.969 –> 00:34:56.769

sarah: low-resource language projects. We have a lot of customers


00:34:56.810 –> 00:35:00.150

sarah: in our West African… well,


00:35:00.290 –> 00:35:11.150

sarah: okay, let me start at the beginning. Hi, my name is Sarah, and I’m really interested in low-resource language projects. This is why I joined Orange five years ago.


00:35:11.780 –> 00:35:18.680

sarah: Orange is in a particularly well-suited position to support


00:35:19.480 –> 00:35:34.140

sarah: our customers in numerous areas. We have a bank, we have a video game company, we have entertainment streaming services, we lay Internet cable, we have satellites. You know, what don’t we do? Well,


00:35:34.210 –> 00:35:35.330

sarah: we


00:35:35.420 –> 00:35:41.349

sarah: are focused on providing great consumer experiences to everyone.


00:35:41.450 –> 00:35:49.150

sarah: and many of our customers are in places that have underserved languages.


00:35:49.450 –> 00:36:01.050

sarah: So Orange is in Francophone Africa, but for a lot of people in these regions, French is no longer considered


00:36:01.460 –> 00:36:06.989

sarah: a necessary language, as new trends in


00:36:07.000 –> 00:36:11.779

sarah: in political and community


00:36:12.170 –> 00:36:19.770

sarah: economics spring up. You know, there’s the ability to buy and sell products locally.


00:36:19.890 –> 00:36:31.249

sarah: And with that local focus, a lot of people have been returning to local languages that are spoken


00:36:31.640 –> 00:36:48.650

sarah: or comparatively underserved. And so what does that mean? It means that with the shifting dynamics of these areas as well as other, you know, macroeconomic forces, we have a lot of customers who do not speak French.


00:36:48.690 –> 00:36:52.640

sarah: or would prefer not to speak French.


00:36:53.030 –> 00:36:53.910

sarah: Now.


00:36:54.180 –> 00:37:07.009

sarah: I describe low-resource languages to the uninitiated as languages where there’s no Reddit, although in the past week that allusion is less powerful. But


00:37:07.040 –> 00:37:26.570

sarah: it means that there’s not a lot of data online. So I am very passionate about selling people products in whatever language they would like, and providing goods and services in whatever language they would like. So recently, I ran a machine translation competition between French and Bambara that was text-based. We had


00:37:26.930 –> 00:37:34.860

sarah: 20 participants and cash prizes. The winners are from all over the globe, and, well,


00:37:34.960 –> 00:37:48.069

sarah: only a couple of the participants actually spoke Bambara. Almost all of the participants in the competition spoke, or had family members who spoke, a different low-resource language. And


00:37:48.140 –> 00:37:58.590

sarah: this was a really impressive experience, because it reminded us that no matter how much the large language model hype cycle


00:37:58.830 –> 00:38:02.260

sarah: you know, brings us cool, sticky


00:38:02.280 –> 00:38:04.259

sarah: products and


00:38:04.280 –> 00:38:16.720

sarah: and crazy new applications, and, you know, the Pope in a puffer coat, whatever it may be. I think it’s really important to remember that


00:38:16.840 –> 00:38:31.330

sarah: natural language processing is not a solved domain. And there are a lot of challenges out there for the global population that are not English-centric or French-centric.


00:38:31.360 –> 00:38:38.210

sarah: And I’m hoping that, you know… when we were working on this last iteration of the translation project,


00:38:38.240 –> 00:38:51.019

sarah: everyone was saying, okay, large language models have dropped, how relevant are they for this domain? So our next step is to experiment with that and figure out, how can you


00:38:51.030 –> 00:39:02.640

sarah: augment large language models for languages that basically have small language models and, you know, have their own orthographic challenges.


00:39:02.710 –> 00:39:12.449

sarah: And at the end of the day it’s really about customer service and people. So I think it’s been a really great experience, and I’ve met a lot of really interesting people.


00:39:12.570 –> 00:39:20.619

sarah: and I trust that that will continue to be a great way to leverage new technologies


00:39:20.640 –> 00:39:23.330

sarah: with a global


00:39:23.470 –> 00:39:29.129

sarah: economy and global engagement. But I do want to reiterate that


00:39:29.580 –> 00:39:39.040

sarah: LLMs have not solved AI. There’s probably going to be a plateau right now, where


00:39:39.240 –> 00:39:46.870

sarah: OpenAI has mentioned they don’t have plans to release a GPT-5. Bard,


00:39:47.010 –> 00:39:51.170

sarah: different, you know, the LLaMA



00:39:52.660 –> 00:39:58.190

sarah: models are out there, and there’s been a lot of optimization around


00:39:58.210 –> 00:40:15.360

sarah: compute and running these models on smaller and smaller memory devices. And so I would watch to see what Apple is up to, because I think they’re going to jump on the scene with a pretty interesting


00:40:15.440 –> 00:40:18.919

sarah: privacy-oriented solution, as is their brand.


00:40:19.160 –> 00:40:30.080

sarah: But I’m really excited about the power of communication and the power of speaking to people in their own language. So watch this space for more machine translation


00:40:30.240 –> 00:40:34.400

sarah: and low resource language innovations.


00:40:35.490 –> 00:40:44.690

[email protected]: Well, thank you, Sarah. You know, this has been a fascinating discussion, one which I think the audience is really going to appreciate, not only for the


00:40:44.800 –> 00:40:49.650

[email protected]: information and insight that you’ve provided, but I think also a very


00:40:49.830 –> 00:40:59.740

[email protected]: actionable optimistic and pragmatic point of view. again for the listeners. This was Sarah Luger.


00:40:59.880 –> 00:41:12.280

[email protected]: Sarah, thank you very much for joining. I really enjoyed our conversation. and I’m looking forward to the next one.


00:41:14.060 –> 00:41:16.500

sarah: Thank you, Alex. It was a pleasure.

