How to get more visibility into your Installed Base Data

This session will cover:
- Implementing Installed Base Visibility
- Benefits of Installed Base Visibility
- Enhancing Customer Relationships
- Driving Revenue Growth
0:00
Mr. Lullet is the CTO at Entitle and has been at the forefront of designing
0:06
both SaaS and AI-based solutions in the industrial OEM space for decades now.
0:12
He's one of those industry leaders who loves to get down in the trenches and
0:16
has immense expertise in things related to the install base.
0:20
Lullet has a phenomenal grip on technology and its application for machinery
0:24
manufacturers and when he is not at his work desk, you will find him posting
0:29
some very interesting stuff around AI and data on LinkedIn.
0:33
So with that, Lullet, over to you.
0:35
Thanks, JW. Thanks for the introduction, and good morning, good evening, good
0:40
afternoon to whichever part of the world you are in.
0:43
So, yeah, today we are going to touch on this topic of install base
0:47
visibility. The install base as such is usually a very important topic for any
0:52
original equipment manufacturer, especially industrial OEMs who make
0:57
large equipment.
0:59
And if we take a step back, everyone understands that inside that install base,
1:04
if mined correctly, lies gold. What it means is that basically if we can
1:09
understand the customer behavior, the behavior of the equipment that is sitting in the
1:13
market, it can give us a lot more revenue opportunities.
1:18
Because once a piece of equipment is sold, the relationship doesn't end there; the
1:22
relationship continues in terms of the service needs, the parts needs, the
1:26
consumable needs, and it even goes further: when they want to replace the
1:30
equipment, whom should they go to? And that is where the quality of the
1:35
relationship makes a lot of difference.
1:38
So, with that in mind, we will look into the full install base notion and how
1:42
to make sense of it, what we can do and how we can actually drive growth out of
1:47
it.
1:48
So, I am sure you are seeing a diagram on the slide right now, and this is
1:53
a very common story from what we have seen; this is an endless uphill battle for
1:57
most of the OEMs.
1:59
There are so many systems, especially in large organizations with a
2:03
lot of mergers and those things going on. Among the companies that we deal
2:08
with, we have seen companies which are like 100 years old.
2:12
So they don't just have systems coming in as part of mergers, but they
2:16
also have a lot of legacy systems in place.
2:21
We have even seen data sitting in mainframes, and then what happens is that any
2:25
new thing that is needed, any new initiative, sometimes will end up creating new
2:28
systems or new data silos.
2:31
So, essentially, it leads to all kinds of integration
2:36
messes, with data flowing from one system to another system, and then each
2:43
of these systems has its own peculiarities; they do become silos with time.
2:47
And this all leads to a very, very non-scalable way of doing things, and it's not
2:52
unusual.
2:53
This story we have heard again and again: if they need to build even a
2:57
small part of the view of the customer, forget customer 360, just an aspect of the
3:02
customer, it basically means a lot of manual work in terms of pulling the data
3:07
from a couple of systems.
3:09
Actually, even before that, figuring out where the data points are residing,
3:13
then basically pulling it and making sense of it, because somewhere the taxonomy and
3:18
ontology are different, the categorization is different; then someone has to
3:22
make sense of it, put it together, and then some analysis can be done on that
3:27
one.
3:28
The only problem is that this sometimes takes so much time that by the time the
3:33
results are there, they are no longer valid. And why does this happen?
3:38
This happens because, as I have talked about, a lot of
3:41
these efforts go in. Of course, there are certain things standardized at the
3:45
enterprise level, like ERP choices, CRM choices, and then sometimes there are
3:49
those big divisions; everyone has their own way of doing things, in fact everyone has
3:53
their own take on which ERP or which CRM.
3:56
Sometimes these are not driven from the headquarters, basically. Again,
4:01
no one model is right or wrong, but what happens is that these all lead
4:05
to so many different kinds of decisions, which fundamentally lead to so many
4:10
different kinds of tools, and that's what leads us to this whole data mess.
4:15
So before we move ahead, let's just check the state of things within the
4:18
ecosystem, with the crowd out here, all the participants who are there. We
4:23
'll just run a small poll; we'll just try to figure out what the state
4:26
of the installed base data is right now in your organization.
4:30
So if you can help me with starting the poll, you might see the poll on your
4:36
screen.
4:37
So basically, the question is: what is the state of the installed base data? Are
4:41
you still mostly working on spreadsheets, or is the situation that it is in
4:45
multiple ERP/CRM systems, or do you have some sort of MDM, some sort of master
4:51
data management notion in place, so that you can go to one system and get the
4:54
complete 360 view?
4:56
And going further: there is an MDM, there is a single unified source, and there
5:01
is also an analytics layer on top of it so that more insights can be derived out
5:06
of it, which can help in revenue growth.
5:10
Okay, so I think this is actually very interesting and, I would say, not
5:15
different from what we have seen: we see 25% mostly on spreadsheets and 75% in
5:22
multiple systems.
5:24
And this is the state of things right now; it's what
5:28
we see day in and day out with our customer base as well.
5:32
So before we move ahead: this is what we have seen, and this is what some of
5:37
the analysts have also verified from what they see. We have seen
5:41
typically 7 to 12 systems; again, based on the size of the OEM it can vary
5:47
a lot.
5:48
But we have generally seen 7 to 12 systems kind of thing, and usually there is a
5:52
productivity loss, because one of the fundamental reasons is that someone has to
5:57
collect this data together, make sense of it, unify it before anything useful
6:02
can be done with that data.
6:05
And the other problem is that even if this unification happens, there are still
6:10
a lot of data quality issues that linger on, and then to do anything you
6:15
have to reach out to more than one person.
6:18
So, if you take a step back, what this all leads to is a
6:21
very, very highly unscalable system, and the guarantee of it being accurate,
6:25
of it being precise, is also very, very low.
6:28
What's the solution? That's what we will get into; we will try to figure out
6:31
how that situation can be solved.
6:34
So what do we want to do, what is the ideal situation? The ideal situation is
6:38
that we have a single source of truth. What it means is that there is a system of
6:44
record, an MDM; we can call it
6:46
different names, but basically there is a single place to go
6:51
which has all the data unified together, cleaned, deduplicated,
6:56
enriched, everything, and it is available to all stakeholders. And not just that:
7:02
there are insights, there is an analytics and AI layer on top of it which is helping
7:07
to create those predictive,
7:09
predictive insights. And what happens is that when this data is centralized, it
7:15
also helps in collaboration, and the end result is that we are
7:19
able to serve the customer in the best possible way, and going further, this is
7:24
where the growth lies.
7:26
So that's where we talk about this whole install base visibility and why
7:30
it is important. If you really look into it, it's not just one part
7:35
of the organization that gets help from it; it's not just the sales team, it's not
7:40
just the marketing team. In fact, if that data is clean, we have seen in our
7:45
customer base that people have used it to get insights in pricing, in inventory.
7:51
So all sorts of functions start getting help out of this clean data. So the
7:55
question comes up: how should we get to this picture? The most usual way, I think,
8:00
again here I would say there's nothing different from what is typical; if anyone
8:07
has gone a little bit through this data cleaning exercise, there is nothing unusual
8:11
about what someone has to do. And people do this; what we have seen
8:15
is that to get some limited insight, they do these things in a very, very manual
8:19
way.
8:20
But again, it cannot be done at scale when it is done in a manual way. But
8:24
essentially what you do is that, if you look on the left-hand side,
8:28
you have a set of systems; you take all that data together and profile it to
8:33
understand the nature of the data, and then you take it through a path of
8:37
cleaning: cleaning, mapping, enriching, deduplicating and
8:41
then unifying and stitching it together, so you can actually bring the whole
8:46
data into a unified data model.
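The profile, clean, deduplicate, unify flow described above can be sketched in miniature. The field names, the (name, city) matching key, and the "keep the most complete record" rule here are illustrative assumptions, not the speaker's actual implementation:

```python
from collections import Counter

def profile(records):
    """Count how often each field is populated, to understand the data first."""
    counts = Counter()
    for rec in records:
        counts.update(k for k, v in rec.items() if v not in (None, ""))
    return dict(counts)

def clean(rec):
    """Normalize whitespace and casing so the same customer matches across systems."""
    return {k: v.strip().upper() if isinstance(v, str) else v for k, v in rec.items()}

def completeness(rec):
    return sum(1 for v in rec.values() if v)

def unify(systems):
    """Merge records from several source systems, deduplicating on a naive
    (name, city) key and keeping the most complete record as the golden one."""
    merged = {}
    for records in systems:
        for rec in map(clean, records):
            key = (rec.get("name"), rec.get("city"))
            if key not in merged or completeness(rec) > completeness(merged[key]):
                merged[key] = rec
    return list(merged.values())

erp = [{"name": "Acme Corp ", "city": "Austin", "serial": "X-100"}]
crm = [{"name": "acme corp", "city": "Austin", "serial": ""}]
golden = unify([erp, crm])
print(len(golden))  # the two source records collapse into 1 golden record
```

Real MDM survivorship rules are far richer (fuzzy matching, source trust scores), but the shape of the exercise is the same.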
8:49
And then once it is there, the magic happens: you now have a single source of truth.
8:53
This is a technical view of it. If you look from the business perspective,
8:57
what essentially is happening is that you are bringing all the data into one place.
9:01
Here again, this is a typical thing; if you go through any data unification
9:07
exercise, the philosophy remains the same, but we are a bit more focused on the
9:12
install base data. But essentially what would happen is that you will start
9:15
getting a very, very clean and unified picture of all your customers.
9:18
And then you have all your customer addresses, equipment, parts, service contracts,
9:23
warranties, order history, service history, whatever makes sense to understand
9:27
that customer, whatever makes sense when we say that this is our customer 360.
9:32
And once that is in place, then whatever you do, like AI or all kinds of
9:40
analytics, starts becoming possible. And because the data that is coming in on
9:46
the left-hand side is very, very clean, the quality of the results
9:47
automatically becomes much better; they are accurate, they are precise. So that's
9:53
where they help in taking what you call data-driven decisions. And how
10:00
is this typically done? So again, we have seen many
10:05
companies trying it the DIY way, and what happens is that basically, on the
10:10
left, again going left to right if you see, it will
10:15
essentially go through the same exercise of acquiring
10:16
data, cleaning it, and then you run some analytics engine. But in the DIY approach, what
10:21
happens is that you now have to basically put together a team, different kinds of tools,
10:28
and then it takes a lot of resources and time, and that's where this usually
10:35
becomes a very, very costly exercise. And it's actually
10:40
very interesting what we have seen in our customer base.
10:44
Some customers have actually taken this approach; they did it for a
10:48
year, a year plus kind of thing, and then they realized that this is not a very
10:53
scalable way of doing things. Then we have worked with them and we were able
10:57
to do it at a very, very high scale. And why are we able to do it? Let me
11:02
tell a small story here. When Entitle started almost nine, ten years back,
11:07
initially we thought that we would just focus on the AI part, the analytics part of
11:11
things.
11:12
What we wanted to do is get the data from our customers and
11:15
have our whole focus on the algorithms to find the patterns in the buying
11:19
behavior, so that we can predict and prescribe what kind of parts a certain
11:23
customer would be needing, what the propensity to buy is, and all those things.
11:28
But what we realized is that the data we were getting was very, very bad.
11:32
So what we decided is that unless we take control of the data story, we
11:36
will never be able to have a very predictable story on the analytics side of
11:42
things. That's where we started investing a lot in this part of the equation
11:47
as well, and that's where we have our own installed base data studio.
11:52
What essentially it is, is again the same steps: you profile, you
11:57
clean, you deduplicate, you basically then unify it together, and then you get a
12:02
very clean set of installed base data. I am just putting some screenshots here
12:06
just to give an idea of things. This is basically a completely no-code
12:11
platform. What we do is that once we go through the initial
12:16
exercise, we capture it as part of recipes, and then whenever new data sets
12:20
come in.
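The recipe idea, cleaning steps captured once during the initial exercise and then replayed on every new data drop, might look roughly like this; the step names and normalization rules are assumed for illustration:

```python
def strip_fields(rec):
    """Trim stray whitespace from every text field."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}

def normalize_country(rec):
    """Map the country spellings seen during profiling onto one code."""
    rec = dict(rec)
    rec["country"] = {"USA": "US", "U.S.": "US"}.get(rec.get("country"), rec.get("country"))
    return rec

# A "recipe" is just an ordered list of transformations captured up front.
recipe = [strip_fields, normalize_country]

def run_pipeline(batch, recipe):
    """Replay every captured step on an incoming batch, with no manual rework."""
    for step in recipe:
        batch = [step(rec) for rec in batch]
    return batch

new_batch = [{"name": " QNX Communications ", "country": "U.S."}]
print(run_pipeline(new_batch, recipe))
# → [{'name': 'QNX Communications', 'country': 'US'}]
```

The point is that the cleaning knowledge lives in the recipe, not in a person, which is what makes the rerun automatic.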
12:21
The data pipeline automatically runs those recipes, and this really helps us in
12:26
doing the data cleaning at scale. Once that is done,
12:30
this is available in what we call our front end. So I'll give a
12:37
small view of that one: once
12:41
you are able to get through this exercise and you have a mechanism in place to
12:45
slice and dice this data, it can really help in doing a lot of powerful
12:50
data-driven customer research. Let me just switch here a little bit. I
12:55
'll go there. So this is how it looks once you have done that. If you see here, this is
13:00
now all the installed base. If you see here, there are equipment and
13:03
locations; it is one of our demo accounts, but you can now start going inside
13:08
and you can see where all your customers are. If I just click here, it
13:12
can even tell me, for this place, how many equipment, parts and all those things
13:17
are there.
13:18
And then what I can do now is, for example, go to a location
13:23
called QNX Communications. So we have a notion of account and location; what it
13:28
means is that, for example, if General Motors has 10 plants, then
13:35
General Motors would be an account and each plant would become a location. So I
13:38
can go to a plant's location out here, and here you can see that it is telling
13:43
me, at this location, what the behavior of that customer has been along the years.
13:47
And you see here it is telling me that there are 4 pieces of equipment, so I can see the
13:51
whole list of equipment; in fact I can go to every equipment and see the
13:56
BOMs, the service BOMs, sitting against this equipment, and then there are again
14:01
the details we talked about: we can see the parts list, the services, the
14:05
service contracts, what the status of the warranties and contracts is, and this
14:09
interesting thing: opportunities.
14:11
So what we also do is that we basically look into the whole purchasing
14:15
patterns of the past, the history, not just of this customer but across the
14:19
customer base, and then there are models that we
14:24
have built with which we can figure out the behavior of whole cohorts inside
14:28
the customer base; we can then benchmark against the best ones, and there is an
14:34
opportunity where we know that the laggards can be taken up to the level where
14:37
the benchmark customers sit.
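The cohort benchmarking described here can be caricatured in a few lines; the 50% laggard threshold, the spend figures, and the "gap to the cohort's best" sizing are invented for illustration and are not the actual models:

```python
def benchmark_opportunities(cohort, laggard_ratio=0.5):
    """cohort: {location: annual parts spend}. Flag locations spending less
    than laggard_ratio of the cohort's best, and size the gap to the best."""
    best = max(cohort.values())
    return {loc: best - spend
            for loc, spend in cohort.items()
            if spend < laggard_ratio * best}

cohort = {"Plant A": 120_000, "Plant B": 40_000, "Plant C": 110_000}
print(benchmark_opportunities(cohort))  # → {'Plant B': 80000}
```

Plant B buys far less than comparable plants, so the gap to the benchmark becomes a quantified sales opportunity.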
14:40
This is interesting: if you really see what is happening here,
14:45
the whole 360 view of that customer is available at your fingertips. And
14:51
as well, if I go back to my map, if you see here, there are a lot of filters; you
14:55
can really say, for example: show me the equipment which
15:00
is like 5 to 10 years old, and it will just filter and show me that
15:04
equipment. So that's the kind of slicing and dicing that is possible.
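That 5-to-10-years filter is easy to picture in code; the fleet records, field names, and pinned "today" date below are made up for the sketch:

```python
from datetime import date

fleet = [
    {"serial": "CH-001", "installed": date(2017, 6, 1)},
    {"serial": "CH-002", "installed": date(2022, 3, 15)},
    {"serial": "CH-003", "installed": date(2012, 1, 9)},
]

def age_years(installed, today=date(2025, 1, 1)):
    """Equipment age in fractional years, pinned to a fixed 'today'
    so the example is reproducible."""
    return (today - installed).days / 365.25

def filter_by_age(fleet, lo, hi):
    """Slice the installed base down to equipment aged between lo and hi years."""
    return [e for e in fleet if lo <= age_years(e["installed"]) <= hi]

print([e["serial"] for e in filter_by_age(fleet, 5, 10)])  # → ['CH-001']
```

The same pattern extends to any attribute on the unified record: product line, warranty status, last service date, and so on.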
15:07
So you can really define your pipelines in terms of that. For example, if you
15:12
know that certain equipment of a certain age, let's say older than 10 years, 10 to 15
15:18
years kind of thing, can be upgraded or something like that, those
15:23
things start becoming very, very easy. But having said that, I think we always
15:27
hear 'not another tool', and that's where what we are doing is that we are
15:32
very soon launching...
15:34
Well, okay, let me take a step back. This whole architecture is also a very
15:37
open architecture; we have open APIs and everything in place. Customers do
15:41
connect to our open APIs and take this data back into their systems. But
15:46
what we are also going to do is that we are going to launch a
15:50
Salesforce package. We know that for a lot of the industrial OEMs that we work with,
15:55
the majority of our customers are actually on Salesforce. What we have done is that,
16:01
if you remember, if I go to the same QNX location, we were able to
16:07
see basically the 360 view of that location. This 360 view
16:14
would now be available right inside Salesforce. How
16:19
does it help? Now imagine that a salesperson is researching an account
16:24
they have to compete for. They have the complete history of that account available at the click
16:28
of a button: they can see what all equipment that customer has, what all parts and
16:33
services. Again, the same thing that I just talked about; it's exactly a replica
16:37
of the same thing, but it is available right in the context of the workflow that
16:41
a salesperson is running when he is researching to find the
16:45
opportunities within that account.
16:46
So yeah, I guess that's what I wanted to cover today: the whole notion of
16:49
install base visibility and how we can achieve it, from the data cleaning part
16:54
of things, taking it through the whole data pipeline, bringing it to a point
16:58
where it is available to not just the sales team but across the
17:03
organization to take very, very rich data-driven decisions.
17:06
Yeah, I think we can probably stop here, and if there are
17:12
any questions we can take them. Yes, awesome, thank you, appreciate it. We did
17:18
get a couple of questions. The first one: you mentioned a data mess in the early
17:23
slides; what exactly are the services you offer?
17:27
Okay, so basically we have what we call an installed base intelligence
17:32
platform. What we do is that we take care of this whole entire pipeline;
17:38
our customers just have to give us their data, and then we take
17:44
care of everything; we will figure it out. Of course, we need your
17:47
help in terms of understanding the domain, because you people are the domain
17:52
experts; you will tell us about certain categorizations, or how the product
17:56
hierarchy should be made, what the classifications are.
18:00
But once that is in place, we just take it through everything: we run
18:05
it through our pipeline, and we build it in such a way that it just remains
18:09
scalable; the next time things come up, it just runs automatically. The data is
18:14
cleaned, it is now available in insights, it is available in CRM, it is available
18:17
via open API so you can integrate with your own systems, and then we have all
18:22
these analytics algorithms built on top of it that will start giving you
18:25
the opportunities, the predictive opportunities. We also tell you about
18:30
propensity, customer health and all those things. What it does is that it basically
18:35
arms you with all the relevant data points, so that when you are talking to
18:40
a customer, you know exactly what points you should leverage
18:45
when you are engaging.
18:47
Perfect, thank you. The next question: if the wrong data was ingested initially,
18:52
can we export that wrong information and update it with the correct information to
18:58
ingest? Absolutely; we do this on a day-to-day basis, and we know
19:02
that sometimes it takes time to get it right, so we have an iterative process;
19:07
it's not like it is done once and that's it. The data pipeline is
19:12
very flexible.
19:13
And we always constantly look for opportunities to fix data; we have
19:17
a lot of mechanisms in place. In fact, going further, we also work
19:22
with a notion of, I would say, a little bit of a high-touch kind of engagement:
19:27
we have experts whom we call customer success managers, who understand the
19:31
aftermarket very well, who understand the data part of the story very well, and
19:34
they work constantly on making sure that the data is clean and of
19:39
very high quality. Perfect.
19:41
A couple more. Next one: if we implement Entitle, what is the implementation time
19:46
and learning curve for our team? Okay, so a two-part question. The first part,
19:51
implementation time, depends on the size of things, but typically we have
19:57
done implementations within, I would say, 8 to 12 weeks kind of thing. That's
20:02
when people start seeing results; they are able to access
20:06
their data, they can even start building pipelines and slicing and dicing the data.
20:10
And those kinds of things. The second part of the question
20:15
is: what is the learning curve for our team? If you see, the UI is
20:19
pretty intuitive, and it's not like we just leave you; there is a very high-touch
20:26
engagement that happens, where our CSMs will make sure that your team
20:30
understands how to use the tool and how to make pipelines. There are constant touch
20:35
points. There are periodic meetings, not just in the initial days but throughout the
20:40
engagement period as well, so that we make sure that you are leveraging the
20:45
tool in the best possible way. But otherwise the tool is very, very intuitive;
20:49
I think any salesperson understands this whole customer 360
20:54
behavior, and if you just look from that perspective, the tool is very easy to learn.
20:58
Awesome, good. Okay, one more: is your solution compatible with Salesforce? And why
21:05
do I need it if I already have Salesforce?
21:09
Okay, so the thing with Salesforce is that it doesn't have the data; it has the data
21:15
model, but it doesn't have the data, and that's the piece that we bring: all the
21:22
data together. For our customers it's not unusual that sometimes we bring 10, 20
21:27
years of data. So, for example, one of our bigger customers has chillers, and a
21:31
chiller has a life of 25, maybe 30 years kind of thing. So what we have
21:36
done is actually taken the whole 30 years of data and cleaned it
21:41
together; we now know the 30-year history of every customer location, we know
21:44
what all has happened against those chillers, what all the different
21:48
chillers at that location are. This is where the power lies. So Salesforce
21:52
and these tools provide a good data model, but someone has to fill in
21:58
that data, and that's where Entitle comes into the picture: we will take all your
22:03
data, put it together, unify it,
22:05
and we'll give you access to that data so that you can make very high-quality
22:10
data-driven decisions. Hopefully, JW, that answers the question. I think it
22:15
definitely does.
22:17
There was another one that came in, though you kind of just touched on it: how does
22:23
Entitle give visibility to our installed base data? Is there a UI tool, and if
22:30
yes, is it only on top of Salesforce?
22:33
No; Entitle in essence is about the installed base data. The core of Entitle is
22:39
about building that data; you can call it MDM or something of that sort, but
22:46
essentially what we do is take all the data into our platform and
22:49
take it through that whole data cleaning exercise. We have built all
22:53
those tools that help us in doing that in a very, very scalable way. That's
22:57
why, if you see, our implementation cycles are usually like 8 to 12 weeks
23:01
kind of thing. Of course, it can take longer if you are talking about a very huge
23:05
install base; when I say huge, it's like 100,000, 150,000 locations kind of thing,
23:10
and bringing in their 30-year history. But otherwise, usually within 8 to 12 weeks
23:15
customers are up and running with us. We bring in this data, we put it together,
23:21
we give it the context, we enrich it, we deduplicate it, and then that
23:26
is available through our insights
23:27
tool, which I just showed; I can just go back to that one. So this is the tool
23:32
through which you can access it, and for this tool we don't have a per-user
23:38
license kind of thing; we just work on a subscription model where everyone
23:42
possible in the
23:43
organization can access it. That's the flexibility we give. But we also
23:47
know that many of the sales organizations want it in the context of Salesforce, and
23:51
there are other parts of the organization which need it in the context of their
23:54
tools.
23:55
So in those cases there are integrations available; the data is available through
23:59
open APIs so people can make it available there. But yes, through this tool you can
24:03
now access this whole data set; people can come here and research it, or they can
24:07
directly see it in the context of their Salesforce.
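Consuming an open API like the one described usually amounts to flattening a nested payload into rows for the target system. The JSON shape, field names, and account/location/equipment nesting below are hypothetical, not Entitle's actual API schema:

```python
import json

# Hypothetical response shape: account -> locations -> equipment.
sample = json.loads("""
{"account": "QNX Communications",
 "locations": [{"name": "Plant 1",
                "equipment": [{"serial": "CH-001"}, {"serial": "CH-002"}]}]}
""")

def flatten(payload):
    """Turn the nested tree into flat rows suitable for a CRM or BI import."""
    return [{"account": payload["account"],
             "location": loc["name"],
             "serial": eq["serial"]}
            for loc in payload["locations"]
            for eq in loc["equipment"]]

for row in flatten(sample):
    print(row["account"], row["location"], row["serial"])
```

One flat row per piece of equipment is the shape most CRMs and BI tools expect on import.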
24:11
So hopefully that answers the question.
24:13
I think so. There was a follow-up to that, which is: only master data, or along
24:18
with ERP transaction data? Once again: only master data, or along with ERP data?
24:25
Oh okay, you can give us the ERP data, give us CRM data, give us your spreadsheets,
24:29
give us your mainframe data, give us your service and FSM tools' data.
24:33
There are some implementations where we take data from 8 to 10 systems,
24:38
and we also take data from ticketing systems and all those kinds
24:42
of things. So give us all kinds of data; more data is better for us, because then the
24:47
algorithms run at scale. The richer the data set is, the better the quality of the
24:53
algorithms becomes. So we are not limited to any one system or anything;
24:58
give us data from wherever you have it, at whatever quality it is.
25:01
And we will make sense of it, we will unify it, we will bring it together, and
25:01
we will put it in a canonical data model, which is like a
25:02
purpose-built data model for the installed base, where we will host this data.
25:15
Perfect, perfect. All right, we are coming just up on time, folks. Well, thank you so
25:20
much for the content and for going through those questions from the attendees
25:25
here.
25:26
Any closing remarks? No, I just want to say thanks. Thanks, everyone, for spending
25:32
the last 30 minutes with us and seeing what we are doing, and we hope to
25:38
talk to some of you in the future as well.
25:42
Excellent, yeah, thank you for your time, folks. Have a nice day wherever you
25:46
are, and I will see you at the next event. Thank you.