Lalit Bhatt 20 min

Where is your Installed Base?


Is your Installed Base data a mess? How to get more visibility into your Installed Base Data. This session will cover: - Implementing Installed Base Visibility - Benefits of Installed Base Visibility - Enhancing Customer Relationships - Driving Revenue Growth



0:00

Let's get started. Let me quickly introduce our speaker for today.

0:03

Lalit. Lalit is the CTO at Entytle and has been at the forefront of

0:08

designing both SaaS and AI-based solutions in the industrial

0:12

space for more than a decade.

0:14

He's one of those industry leaders who loves to get down in the

0:18

trenches and has immense expertise in all things related to the

0:21

installed base. And normally, when he's not at his work desk,

0:24

you'll find him posting exciting stuff around data and AI on

0:28

LinkedIn. So with that, Lalit, I'm sure you're excited to get

0:31

started over to you.

0:33

Thanks, Susan. Thanks for the introduction, and welcome

0:36

everyone. Good morning or good evening, whichever part of the

0:39

world you are in. So the point of today's discussion is

0:45

how we can build visibility into the installed base

0:48

data. And we'll touch upon why that is important a

0:51

little bit as well. But I want to start with this picture:

0:56

when we talk to our customers, our prospects, and generally in

0:59

the B2B world, everyone agrees with the fact that the installed

1:04

base is a gold mine. But actually mining that gold mine

1:10

has become very, very difficult. And that's a

1:14

representation of the picture that we get when we start

1:17

talking to people about how varied their data is; it seems it's lying

1:20

all over the place. What kind of interactions do the data points

1:24

have between them? It seems everyone is talking to everyone

1:27

just to try to figure out something very simple. So what

1:31

it leads to is that the data is messy. You have different

1:35

kinds of data points. Even the same data is represented with

1:39

different notions, with different categorizations,

1:43

and it is sitting in different kinds of systems. And these

1:47

silos, when we want to make them talk to each

1:50

other, lead to a lot of non-scalable integrations. And fundamentally

1:54

what it leads to is this: if anything useful has to be done,

1:58

so, just for example, you might want to do some reporting, do

2:01

some ad hoc analysis, or even want to figure out your

2:03

wallet share, what it basically means is that, after

2:07

figuring out where the data is lying in different systems, then

2:10

you start pulling that data, harmonizing it, putting it

2:14

together, unifying it. And the problem is that the whole

2:17

exercise takes a lot of resources and a ton of time. And by

2:22

the time you are ready with the result, sometimes it's too

2:25

late to do anything with that kind of analysis. So with that

2:30

in mind, we'll just move ahead to why that happens. I would say

2:35

that's actually just how the industry

2:39

has evolved when we talk about the B2B

2:41

manufacturing world. Sometimes we talk about companies which

2:44

have been there for decades and even centuries. And what has

2:48

happened is that over time they go through multiple

2:51

technology transitions; there is still a part of things that

2:54

are manual in spite of all the digital transformation. There

2:58

are big divisions. Sometimes things become more centralized,

3:02

sometimes it becomes more distributed, but all of this leads to

3:06

different kinds of systems coming into play. Systems

3:09

come in, but never go away. So sometimes people

3:13

write internal tools, sometimes they acquire tools, but they

3:16

just remain there. And that leads to a lot of this messiness

3:21

and a lot of data silos sitting all across the

3:24

organization. And then we also know that what is also

3:28

happening is that there is now a lot of tribal knowledge

3:32

that starts accumulating in the system. There are people who know

3:35

about everything, but even they don't know what they know,

3:38

because you have to poke them

3:41

and ask, "do you know?", and then they will recollect:

3:43

"oh yeah, that is sitting in my spreadsheet or in

3:46

some notebook," and things like that. And that is what leads

3:52

to the whole messiness of the installed base. So just before we

3:56

move ahead, we can do a small poll for those who

4:01

are participating here today: if you are dealing with

4:04

installed base data, what state is your installed base data in?

4:11

And if you see, that's a continuum basically. It's

4:20

mostly like either it's very manual, in spreadsheets, PDFs,

4:24

and those kinds of things. There are organizations which are,

4:29

I would say, further along the curve in that they

4:34

have systems in place, but there are multiple in the picture. Then

4:37

there are organizations further up the curve who

4:40

usually have this master data management notion or a

4:43

single source of unified data. But then there is what

4:47

we call the Nirvana state, where people are really using it

4:50

for intelligence over unified data. So yeah, I think we

4:55

can move ahead on this one. Okay, so it looks like

5:01

multiple systems, that's the problem for most. And

5:06

that's not unusual; that's what we have seen with most

5:10

organizations, and that's where I would say the state of

5:13

the technology is right now. We have a lot of these

5:16

ERP systems, a lot of CRM systems, and especially with bigger

5:19

organizations, what we see is that these systems come

5:23

bundled because of different mergers and acquisitions.

5:25

And just to look into some of the data

5:31

points, this is a rough estimate of

5:36

what we have seen among our customer base: usually there

5:39

are 7 to 12 systems that contain customer, product, and asset

5:42

information. And then, because of the data quality,

5:47

there are usually productivity losses of 20%. One of the basic

5:51

reasons is that either the customer data is wrong or it is

5:55

incomplete, which leads to a lot of work to

5:58

build the picture completely, to solve the Rubik's Cube, so

6:01

that it can be actionable. And what it leads to is that

6:05

any simple workflow that has to be done requires a minimum of

6:08

4 to 5 people. You have to reach out to them, figure out those

6:11

data points, collate them together, and then build an

6:13

actionable insight from them. So how can we solve this

6:18

problem? It's like going to the

6:25

other end of the continuum and asking: what happens if a

6:28

single source of truth is available to all

6:31

stakeholders, and it can automate the insights, and people can

6:35

collaborate over it? Would that be a Nirvana

6:39

state, which would help in mining the

6:42

installed base, the intelligence around it, and driving

6:46

growth through it? We also talk a lot about data-driven

6:50

decisions, and that's where all these things start

6:53

coming into play. So we talk about AI, intelligence,

6:59

and everything. But essentially, at least from what I have seen,

7:02

and this is very personal, I would say 70 percent (I

7:07

arrived at that just by taking a number, but it's in that range):

7:09

the quality of analysis is a big function of the

7:13

datasets that go into those models, that go

7:17

into those analytics. Otherwise it just becomes

7:20

garbage in, garbage out. So what we'll do now is

7:23

look into how we can make that journey. What is the

7:28

way to do it? The first and foremost thing is to

7:32

make sure that you get control of your installed base data

7:35

quality. How does that happen? You would have your data

7:39

sitting in all these different systems. You need to take it

7:43

through a process. Again, the way I have put it: you

7:46

have to profile it, clean it, map it. And when I say

7:52

map, for example, in the installed base world it's

7:55

whether it's an asset or it's a part; those kinds of

7:58

classifications come into the picture. What are the BOM

8:01

structures around it? Then you have to enrich it,

8:04

fill out the missing data points, then dedupe

8:07

them, and do the stitching. And this process is usually not

8:10

one step after another. It's a lot of iterations.
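As a hedged sketch, a miniature version of that profile, clean, and dedupe loop might look like the following. All field names, placeholder values, and matching rules here are invented for illustration; they are not any vendor's actual pipeline.

```python
# Illustrative sketch only: a miniature profile -> clean -> dedupe
# pass over raw installed base records. All field names and rules
# are assumptions for the example.

def profile(records):
    """Count missing/placeholder values per field to gauge data quality."""
    missing = {}
    for rec in records:
        for field, value in rec.items():
            if value in (None, "", "N/A"):
                missing[field] = missing.get(field, 0) + 1
    return missing

def clean(records):
    """Trim whitespace and turn placeholder strings into real nulls."""
    cleaned = []
    for rec in records:
        out = {}
        for field, value in rec.items():
            if isinstance(value, str):
                value = value.strip()
                if value.upper() in ("", "N/A", "UNKNOWN"):
                    value = None
            out[field] = value
        cleaned.append(out)
    return cleaned

def dedupe(records, key_fields=("customer", "serial_number")):
    """Collapse records sharing the key fields, filling gaps from duplicates."""
    merged = {}
    for rec in records:
        key = tuple((rec.get(f) or "").lower() for f in key_fields)
        if key not in merged:
            merged[key] = dict(rec)
        else:
            for field, value in rec.items():
                if merged[key].get(field) is None and value is not None:
                    merged[key][field] = value
    return list(merged.values())

raw = [
    {"customer": "ACME Corp ", "serial_number": "SN-001", "model": "N/A"},
    {"customer": "acme corp",  "serial_number": "SN-001", "model": "X100"},
]
unified = dedupe(clean(raw))
```

In practice the loop is iterative, as the speaker notes: you profile again after cleaning, refine the rules, and repeat until the remaining gaps are acceptable.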

8:12

Things happen iteratively, and then, once

8:15

you're able to do it, you get your installed base data

8:17

model, which acts as the single source of truth. Believe me,

8:20

getting to this picture is itself a journey,

8:24

but a journey worth taking, because on the other side of

8:26

it you now have very clean data, and what that opens up is the

8:32

possibility of many different kinds of analytics over a

8:36

clean set of data; the quality of analytics automatically

8:39

becomes much better for the single reason that the data is

8:42

very, very clean now. So I'll just give another picture on

8:47

that one, and actually a small story here, because

8:51

Entytle, as you know, the name means entitlement. So

8:55

when we started almost 10 years back, we thought we

8:58

would just focus on the analytics and AI part of things. We

9:01

were expecting that from our customers, which

9:05

are these B2B equipment manufacturers, we would get good

9:10

quality data, and we would just focus on the AI models. And I

9:15

think that's where we wanted to focus. But then

9:18

we quickly realized that to reach that point we

9:22

would need to take care of this data story, because the data

9:25

we were getting was such that saying "messy" would be an

9:29

understatement. It comes from all over the

9:33

place, with no relationships between the pieces; you get a

9:36

bunch of Excel files, or a bunch of dumps basically, and start

9:40

making sense of that. And that's where we started doing work

9:45

on our data pipelines and making sure that, if we needed to

9:48

scale, we got a handle on this whole thing. And that's

9:52

where I'm trying to put a picture in place of how we

9:56

look at the whole thing. We get the data on the left;

9:59

it could be coming in any format. It

10:02

could be coming as CSVs, data could be served through APIs from

10:06

different kinds of ERP and CRM systems. And even,

10:10

right now, we see that a lot of people have started putting the

10:12

data in data lakes. But the problem with data lakes is that

10:15

what people are doing is just taking the data and dumping it

10:18

into the data lake. Now the silos are sitting

10:21

inside the data lake: okay, you have all the data, but doing

10:24

anything meaningful becomes very, very difficult because the

10:26

data is not harmonized. A part in one place has one

10:31

name, and in another data silo it has a different name.

10:34

How do you make sure you are talking about the same part?

10:37

All those things start becoming a big problem. So what

10:40

happens is that you then bring all this data in, basically

10:46

harmonize it, and then start pulling out a canonical

10:50

version of your installed base data, which would include

10:54

clean data: a deduplicated list of customers, their different

10:57

locations, what equipment they have got, what parts

11:00

they have got, which service contracts have

11:02

expired and which are in force, what the different

11:04

warranties are, the complete transaction history, the

11:08

complete service history. Once that is in place, the

11:11

right-hand side starts making sense. Otherwise they are

11:18

just models doing nothing. So how can we do that? I'm

11:18

just showing what we have typically seen;

11:22

customers do go down this path; even in our initial

11:25

days we had a similar stack in place. You basically start

11:29

writing a lot of custom scripts: take this data out,

11:32

mishmash it together, take it through different tools.

11:36

There are those ETL kinds of tools. Then you'll take it

11:39

through analytics and those kinds of things. The only problem

11:42

is that the stack becomes too complex very quickly. And the

11:46

bigger problem is that these are very generic tools,

11:50

so doing anything purpose-built with them takes

11:54

a lot of time, to make sure they are aligned with the

11:57

problem statements. And of course, one thing that always

12:00

comes into play is domain knowledge:

12:04

how easily can it be captured as part of the

12:08

tool setups, as part of the scripts, or as part of the

12:11

recipes that are put in? Usually that demands a lot

12:15

of time, resources, and various skill sets.

12:21

And yeah, I'm not saying this can't work; we have seen it happen. But

12:24

what we have also seen is that it usually turns out to be very

12:27

costly, and it becomes an effort in itself to set

12:33

this up. So what we did is that we actually brought out

12:38

this tool set as part of our solution, which is

12:43

very purpose-built. I would say this is

12:47

not a generic tool; we have just focused on building

12:50

this picture for our B2B customer base, building the picture

12:54

of the installed base data. And it again has the same set of

12:57

steps: data profiling, cleaning, deduping,

13:00

enriching, unifying. And what we have also got is that we

13:04

have put recipes in place. So whenever we

13:07

get the data, say the initial set, of course

13:11

we have interactions with our customer, because we have

13:13

to also understand the business domain, the way people work,

13:18

the terminology they use, the categorization they use.
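One way to picture such a recipe is as a declarative mapping from one customer's own terminology to canonical categories, captured once and then re-applied to every new batch. This is a purely hypothetical sketch; the terms, field names, and structure are invented, and real recipes would be far richer.

```python
# Hypothetical "recipe": customer-specific terminology captured once
# as a declarative mapping, then re-applied whenever new data arrives.
# Terms and field names are invented for the example.

RECIPE = {
    "category_map": {              # this customer's terms -> canonical terms
        "pump assy": "Pump",
        "pump assembly": "Pump",
        "motor unit": "Motor",
    },
    "default_category": "Unclassified",
}

def apply_recipe(record, recipe):
    """Harmonize one record's free-text description into a canonical category."""
    raw = (record.get("description") or "").strip().lower()
    record["category"] = recipe["category_map"].get(
        raw, recipe["default_category"])
    return record

new_batch = [{"description": "Pump Assy"}, {"description": "Gear Box"}]
harmonized = [apply_recipe(r, RECIPE) for r in new_batch]
```

The point of keeping the mapping declarative is that, once captured, the same recipe can be run unchanged against every incoming batch, which is the continuous harmonization the speaker describes next.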

13:21

There might be two companies in the same industry, but they

13:24

are still two very different companies in terms of a lot of

13:27

those smaller details. What happens is that we capture them

13:30

as part of our recipes. And once it is captured, then,

13:35

as a continuous process, whenever new data comes in

13:39

we just keep harmonizing it against the existing data

13:43

model that we have created. So I'll just put a small picture

13:48

in place, which gives an overall sense of what the whole

13:51

architecture looks like. So again, by now you

13:57

would be very comfortable with how the whole

13:59

process works. On the left side, you have

14:02

all the data silos from which the data is picked up.

14:07

Then it goes through this data quality

14:10

engine, which is called the Installed Base Data Studio. Then

14:13

we do all the analytics and AI, and it is served through

14:16

web and mobile. You now have your complete 360-degree

14:21

installed base visibility. Maybe today is not the right

14:24

day, we won't go into that one, but yes, at some point we'll

14:27

probably show our web and mobile interfaces in the coming

14:32

weeks. And this is a completely open architecture. We

14:36

can integrate it very cleanly. We can integrate

14:40

it natively with Salesforce, and with open APIs and

14:43

all, it's very easy to pull the data into any other system.
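As a sketch of what pulling data out through such open APIs could look like: a generic paginated pull loop. The endpoint shape, field names, and pagination scheme here are assumptions for illustration, not the actual API; a stub stands in for the real HTTP client.

```python
# Sketch of draining a paginated REST-style installed base endpoint.
# fetch_page is injected so the same loop works against any client;
# here a stub stands in for real HTTP calls to a hypothetical
# endpoint like GET /api/v1/installed-base?page=N&size=M.

def pull_all_records(fetch_page, page_size=100):
    """Drain a paginated endpoint into one list."""
    records, page = [], 0
    while True:
        batch = fetch_page(page=page, size=page_size)
        records.extend(batch)
        if len(batch) < page_size:   # short page means no more data
            break
        page += 1
    return records

# In-memory stand-in for the remote system (fields are invented).
_FAKE_DB = [{"equipment_id": i} for i in range(250)]

def fake_fetch_page(page, size):
    return _FAKE_DB[page * size : (page + 1) * size]

all_records = pull_all_records(fake_fetch_page, page_size=100)
```

Injecting the page-fetching function keeps the draining logic independent of any one system, so the same loop could feed a CRM sync, a data lake load, or a one-off export.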

14:47

Just a sneak peek of what the application looks like, and

14:52

what the web looks like. You would be seeing a map,

14:55

basically, in which you can now actually visualize

14:59

all your installed base. If you see a location here, it is

15:03

basically a customer location. And you can go to the equipment

15:07

view, and in fact what we do

15:11

is actually pull all the possible bits and bytes about

15:14

that equipment and create an equipment 360. So you know

15:19

on which order it was purchased, when it was

15:22

installed, those kinds of things, but also what servicing has been

15:25

done, what BOM is associated with it. And

15:29

many of these fields are editable; if someone

15:32

wants to track the hours it has run in the field, with the mobile

15:35

app and all, it is very easy on the go; when people are

15:38

visiting, they just update it. But what it gives is

15:41

that it gives very clean and very powerful data points that

15:46

now can be fed into the AI models. So why are we

15:51

doing all this? We are doing all these things so we can figure

15:53

out the entire revenue opportunity. And that's where the

15:56

story comes in; that's where the story starts

15:59

aligning with the revenue goals, basically. So we'll start

16:02

also figuring out which customers are at risk, those kinds of

16:05

things. But more importantly, looking at those patterns of

16:08

consumption of parts and services, the system can now

16:13

predict where we would see more opportunities which

16:18

can be tapped. And of course there are other things, like

16:20

propensity, and it will also help in prioritizing the

16:23

efforts. And we know that Salesforce is

16:28

probably one of the biggest CRMs right now. So

16:33

what we have done is bring this in, so that Salesforce

16:38

helps in managing the whole workflow. But

16:41

just imagine if the whole installed base data

16:45

of that particular customer is available as well, how powerful it can be

16:49

in terms of having the conversation with the customer. And

16:53

that's what we have got, where we serve this IB intelligence

16:58

data. So, for example, just showing you a quick example:

17:01

When you go here, there is this IB Intelligence tab,

17:04

which pulls the data from the installed base data stores.

17:08

You can now see all the overviews, all the different

17:10

equipment, parts, services, service contracts. All

17:14

the data points are available; a complete 360 view is available

17:17

to help the salespeople have a much better conversation

17:22

with the customer about their needs and predicting their

17:25

opportunities. And you will also see an opportunities

17:28

tab. We also have different ways of creating

17:32

opportunities, where there are even AI models that help

17:35

in creating them. So yeah, I think with that, I

17:42

hope I was able to complete the picture on installed

17:45

base visibility and how it helps in driving

17:50

revenue further. And with that, if

17:54

there are any questions, we can take them up.

17:57

Perfect. Thank you, Lalit. That was fantastic. Let me have a look.

18:01

I do see a couple of questions that have come in.

18:04

Let me take this one first. It's an interesting one that looks at

18:07

the broader scope of the industry: what does the future of

18:10

this industry look like?

18:11

That's very interesting. We see a couple of

18:16

trends out there. We have definitely seen a huge digital

18:19

transformation trend, which is still going on. But what we

18:22

see is that there are two parts of the puzzle where

18:26

a lot of things will need to happen. One is definitely

18:31

the data side of things: to have very clean,

18:36

very unified, harmonized data so that the models can run

18:41

very efficiently. And then, of course, the whole AI side of

18:44

things. These two things are very interrelated with each

18:48

other, and that's where we see a lot happening. Of course, there

18:52

are workflows on top of them, but those workflows

18:55

need a set of clean data and a set of very powerful

18:59

models so that they can run very efficiently, and

19:06

that has to be part of the playbook of any organization.

19:10

Right. That's true. As you always say, data is like the

19:15

new titanium. It's all about using it wisely.

19:17

Moving on to the second question, this one is a little

19:23

industry-specific. This company manufactures

19:27

electrical engines. They sell them to OEMs. They do not really know

19:31

who the OEMs sell those machines to. Is there any way to locate

19:35

that installed base? Okay. So yeah, that's the end customer

19:39

mapping problem, which we see many times. Again, we have

19:43

done work to achieve that; I'm not saying that it's

19:46

always straightforward, but there are ways to do it. We

19:50

have a couple of ways to handle those things. For

19:53

example, sometimes the service history has that information;

19:56

sometimes there are regulations which require

20:01

companies to maintain end customer records; and there are

20:03

various other databases which can be merged together to put

20:07

that picture in place. Yes, that's a problem that we deal

20:10

with, and there are ways to solve it. 100%? No. I

20:13

would probably put a disclaimer that even we

20:17

have not solved it 100%, but yes, with bits and pieces, we

20:20

have been able to do a fairly good job of piecing that picture

20:24

together. Perfect. Thanks for the question. Thank

20:28

you, Lalit. I think that was the last question we have. Okay,

20:29

then thank you very much for your time and a big thank you to

20:32

everyone who joined in. Folks, be sure to check out our next event

20:37

happening on July 24. You can scan the QR code to hop on to

20:39

our website, or type in www.entitle.com. But that's it for

20:45

today guys. So thank you for tuning in. Take care and have a

20:47

nice one. Yeah, thank you. Thanks everyone for joining.