DataFramed

Episode · 2 months ago

#113 Successful Frameworks for Scaling Data Maturity

ABOUT THIS EPISODE

Becoming a data-driven organization takes a major shift in mindset and culture, investments in technology and infrastructure, skills transformation, and clearly evangelizing the value of using data to drive better decision-making.

With all of these levers to scale, many organizations get stuck early in their data transformation journey, not knowing what to prioritize and how. In this episode, Ganes Kesari joins the show to share the frameworks and processes that organizations can follow to become data-driven, measure their data maturity, and win stakeholder support across the organization.

Ganes is Co-Founder and Chief Decision Scientist at Gramener, which helps companies make data-driven decisions through powerful data stories and analytics. He is an expert in data, analytics, organizational strategy, and hands-on execution. Over his 20-year career, Ganes has become an internationally renowned speaker, has been published in Forbes and Entrepreneur, and has become a thought leader in data science.

Throughout the episode, we talk about how organizations can scale their data maturity, how to build an effective data science roadmap, how to successfully navigate the skills and people components of data maturity, and much more.

You're listening to DataFramed, a podcast by DataCamp. In this show, you'll hear all the latest trends and insights in data science. Whether you're just getting started in your data career or you're a data leader looking to scale data-driven decisions in your organization, join us for in-depth discussions with data and analytics leaders at the forefront of the data revolution. Let's dive right in.

Hello everyone, this is Adel, data science educator and evangelist at DataCamp. If you've been listening to this podcast, you'll know that one of my favorite topics to talk about is analytics and data maturity and how organizations and data leaders can traverse the data maturity spectrum. There's arguably no better person to talk about this with than Ganes Kesari. For over twenty years, Ganes has been focused on solving organizational challenges using technology. If you follow his presence, he's always sharing knowledge and helping decision makers leverage data to drive business value. Ganes co-founded Gramener, a firm with a hundred-plus clients and a global presence. He launched Innovation Titans, a data analytics advisory and thought leadership firm, and he advises executives on organizational transformation using data science. Ganes specializes in building data science teams and helping firms adopt a data culture. Throughout the episode, we talk about his perspective on data maturity models, how organizations can scale their analytics maturity, how data leaders can build an effective data science roadmap, how to tackle the skills and people components of data maturity, and much more. If you enjoyed this episode, make sure to rate and subscribe to the show. And now, on to today's episode.

Ganes, it's great to have you on the show.

Thank you, Adel, thanks for having me.

I'm so excited to discuss with you all things data culture, how to build a data-driven organization, and how to operationalize a lot of the best practices that you discuss. But before that, can you share a bit of background about yourself and what you're most known for in the data space?

I'm a co-founder and the Chief Decision Scientist at Gramener. I have about twenty years of experience solving organizational challenges using technology. I co-founded Gramener primarily to help organizations make better decisions using data, analytics, and storytelling. We serve a hundred-plus clients, the likes of the World Bank, Microsoft, and Ernst & Young. In my role, I lead our client advisory and innovation practice. I've worked with CIOs and Chief Data Officers on their data strategy roadmaps, helping them realize business ROI from their data investments. I've worked with several organizations on analytics, data visualization, and tracking ROI, measuring it on a month-on-month basis and reporting it back to the executives to help them secure subsequent investments. I also lead our innovation practice: we do applied research at our AI and story labs and have done a lot of interesting work applying, say, spatial analytics and novel forms of storytelling like data comics. So there's some great, interesting stuff created by the team. Apart from all of this, I am very passionate about teaching and writing. If you ask me what I'm known for, I am quite prolific: I'm active on LinkedIn, where I share what I've learned, including the mistakes I've committed and what I've learned from them. I write for Forbes and Entrepreneur, I've spoken at several industry events and forums like TEDx, and I also run guest lectures at institutions like Princeton and Rutgers University. Overall, my world revolves around data; you could consider me obsessed with data.

That's really great, and I think you're in the right place.
So, you know, one of the things that I'm really excited to deep dive with you on is the topic of data culture and becoming data-driven. Given your experience leading Gramener and working with a lot of data leaders on the data culture in their own organizations, I think we can all agree that succeeding at building a data...

...culture is absolutely mission critical for any organization that wants to treat data as an asset. So what I want to first start off with is maybe demystifying some of the terms. I think our industry is drowning in buzzwords: data transformation, digital transformation, data literacy, data fluency, data culture transformation, and DataCamp is even guilty of using a lot of these terms. So maybe walk us through how you define a data-driven organization, and how it relates to some of these terms.

Yes, sure. There's no paucity of buzzwords in data science, which in itself is a buzzword. Answering the question of what a data-driven organization really is: in simple terms, it is an organization that is able to effectively and consistently utilize data for decision making across all levels. That is what a data-driven organization is. The organization treats data as a strategic asset, not just as a support function or an enabler, and is able to utilize data not just for strategic decisions but also for everyday decision making. Data is a way of life for everyone within the organization. That is a data-driven organization.

That's really good, and I completely agree with that. It's an organization that always has the habit of looking at data when making decisions as well. And I think there are many dimensions to becoming data-driven. We've seen this with DataCamp for Business customers, for example: there are cultural investments, people and skills investments, and technological and infrastructure investments. I'm very excited to deep dive with you on these different dimensions. From your experience working with data leaders on a lot of these transformation projects, what have you seen to be the most challenging aspect of building a data-driven organization?

Firstly, I completely agree that building a data-driven organization is very, very difficult.
There's a survey by NewVantage Partners that found that only about twenty-six percent of organizations have managed to become data-driven, and when they looked at data culture, which is a higher level where everyone really believes in data and breathes data day in, day out, that is sub-twenty percent. So why is that? There are four shifts I see as critical to becoming a data-driven organization: you have to shift the skill set, you have to shift the tool set, you have to shift the process set, and finally you have to shift the mindset. The skill set is where you build skills for the team that is creating the solutions, who need to be upskilled on data, analytics, and storytelling, and the people using these solutions also need a skill set upgrade. The second aspect, what we really mean by tool set, is all the technology solutions and architectures, the tech stack. You need to invest in and build that, and it has to follow a technology strategy. Third, the process set is where you need to integrate all of these back into your business workflow so that they fit squarely into the business process. It can't sit on the side, where people have to make a deviation to go and use these insights and then come back into the workflow; any organization that designs data solutions like that is bound to fail. So you have to rewire the processes. That's where the third aspect comes in. And finally, the most important and most difficult one is shifting the mindset. You can train the people, build the tools, and rewire the processes, but how do you overcome the cultural resistance and make people comfortable, so that they naturally turn to data when they're faced with a question? That's the mindset shift, which is also required.

How should data leaders approach this transformational project? Should it be investments in tandem, with all of these levers moving at the same time, or procedural, step-wise investments? How do you approach that as a data leader?
This is best done step by step, not with big bang investments. There are several organizations I've seen firsthand that take up a mega project of six months or a year and pour millions of dollars into it, and the risk of failure is high. There are some organizations that have...

...succeeded, but the chances of success are much higher when you take this step by step. For the four levers we talked about, skill set, tool set, process set, and mindset, it's best to start with the data that you have, the analytical capability and tools that you have, and the people that you have. So start with small, simple initiatives, and build a roadmap which helps you use your current capability to solve some simple, descriptive, smaller problems. Then you realize the benefits, project the benefits to the rest of the organization, and say: this is what we could achieve in, say, a few weeks' or a month's time, and then build upon that. That's how you build momentum for success. So it has to be done step by step, and you have to come up with a roadmap which helps you do that.

I'm very excited to have you expand on that notion and discuss how to build a roadmap and all of these things. But I think what's more important first is maybe to tie a lot of the conversation we're having here to the concept of data maturity. This is something I've seen you speak about extensively, and something we've also spoken about at DataCamp with our data maturity framework. Walk us through how a data maturity framework helps organizations prioritize their data journeys.

A lot of people ask me: do we really have to go through all of this? Can you just tell us some projects to start? We would like to get started tomorrow. So I see this as a self-assessment. You need to first find out where you are before you can chart out the path to where you need to get to. That self-discovery is what data maturity helps you understand. And the concept, what data maturity really is: it is a reflection of the organization's readiness and capability to embrace data for decision making. And there are five dimensions.
We have a data maturity framework at Gramener, and there are five dimensions we've thought through. We've seen this work very well across our clients. The first aspect is vision, the second is planning, the third is execution, the fourth dimension is value realization, and the fifth is culture. So these are the five dimensions, and you have to assess the organization on them. How strong is the vision for data? Do you have a plan for data, short term and long term, and are you able to translate that into a roadmap, a set of projects and capabilities to build? That's the planning aspect. Third, how well are you able to execute? What kinds of tools and processes do you have? Fourth, how well are you able to improve adoption and measure ROI? That's the value realization part. And fifth is culture: is there resistance to data within your organization? How ready are the people to embrace a technology like this? When you assess the organization on these five dimensions, you can find out where you stand. For instance, when we run these maturity assessments, we've seen there are some organizations that are great on vision and planning, where the executive management is completely aligned, but they suffer from poor execution or from poor value realization. Whereas there are others that have a very strong execution muscle, good tools, good people, a great data science team, but they don't have a vision, and hence they're not able to pick the right projects. So you need all five of these in some proportion; without that, any investment could lead to failure. A data maturity assessment tells you what level you are at on each of these dimensions and where you need to focus next. And overall, data maturity is about improving capability over time. It's a journey; you can't directly move from a low to a very high maturity. You have to do it step by step and work on all five dimensions over time.
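The five-dimension self-assessment Ganes describes can be sketched as a simple scoring exercise. The sketch below is purely illustrative, not Gramener's actual framework; the dimension names come from the conversation, while the 1-to-5 scale and the example scores are assumptions.

```python
# Hypothetical self-assessment across the five dimensions Ganes lists.
# Scores run from 1 (nascent) to 5 (leading); the numbers are made up.

DIMENSIONS = ["vision", "planning", "execution", "value_realization", "culture"]

def assess(scores):
    """Return (overall maturity, weakest dimension to focus on next)."""
    missing = set(DIMENSIONS) - set(scores)
    if missing:
        raise ValueError(f"unscored dimensions: {sorted(missing)}")
    overall = sum(scores.values()) / len(scores)
    weakest = min(scores, key=scores.get)  # first dimension with the lowest score
    return round(overall, 1), weakest

# An org that is strong on vision and planning but weak on execution,
# like the example Ganes gives:
overall, focus = assess({
    "vision": 4, "planning": 4, "execution": 2,
    "value_realization": 2, "culture": 3,
})
print(f"overall maturity {overall}, focus next on {focus}")
```

The point of the sketch is the same one Ganes makes: the average matters less than the weakest dimension, because that is where the next investment should go.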
And what's really great about the framework, I think, is that it allows organizations to really be honest with themselves about what is important right now. An easy pitfall an organization or an executive team may fall into, for example when they have poor vision, is to say: okay, we don't necessarily have a great data infrastructure, we don't necessarily have a lot of data talent here, but what we're going to do is hire ten PhDs...

...in machine learning and build out really complex machine learning projects, without necessarily having a good grasp of the tool stack or the cultural adoption of such tools within the organization. And this leads to, you know, regret about working with data science within the organization.

Absolutely, very true. There are several organizations we've seen that have this capability but are not able to get the ROI because they are missing some of these dimensions.

Given that, walk us through the different levels of data maturity that you've seen organizations operate in, and the kinds of investments they can make to progress along the maturity levels.

A good reference here is Gartner's five levels of data maturity, which Gartner has extensively published and talked about. I think that's a good framework. Organizations start at level one of these five levels, where they're using data opportunistically: when there is a big challenge and they're not able to rely on gut feel, that's when they turn to the data. It is very opportunistic usage. Level two is where a few teams have started believing in data, or maybe one business leader or one technology leader starts investing in it, so there are pockets within the organization that start using it. Level three is where the adoption is slightly broader, but they don't use it strategically. Level four is where they have gone beyond that: almost the entire organization is using data and realizing business ROI. The only thing that differentiates level four from level five is that at level five they view data as a strategic asset and as central to business strategy. So that's the fifth level.
You can see that across these five levels, for you to progress, you need to improve the four dimensions, the four shifts we talked about. You need to continuously build your skill set, training the data analytics team and the users, and improve the data literacy of the data consumers within the organization. That's the skill set aspect. When it comes to the tool set, first you tap into the tools that you already have access to, and then you progressively bring in more sophisticated tools as you mature. Importantly, you also tap into not just internal data but external data as well, and a variety of data, not just structured but unstructured. Similarly for the other two dimensions: the processes we talked about, the coverage of the organization, and finally the mindset. In fact, one characteristic of the higher levels, typically levels three and four, is that a data leader like a Chief Data Officer or a Chief Analytics Officer gets hired at those levels. So each of these levels is reflective of a certain kind of maturity, and organizations need to progress through them.

So maybe if I'm a data leader that's trying to start off on my journey and I don't necessarily have a good understanding of my organization's data maturity, how do I measure it? What are the things that I need to be aware of and consistently looking at to understand where my organization sits on the data maturity spectrum?

You should start with a self-assessment. Data leaders can run this internally if they are able to come up with the right frameworks and toolkits, or they could hire an external partner to do it. But what is important is looking at three aspects.
One, you reflect on the organizational strategy and the role of data within it, in the current state of the organization. Number two, talk to people. Talk to the technology team, talk to the business team, and understand their priorities, where they see the gaps in the data practices, and how well data is able to support their business priorities. That's the second aspect: talking to people, running surveys, and figuring it out. And the third aspect, which is often missed, is that you have to inspect the assets. You can't go just by what people say; each person has their own biases and blind...

...spots. When we run data maturity surveys, what we've often seen is quite telling: when you look at how different people view it, you understand the organization better. Typically, in a self-assessment, the data maturity scores from the technology team are naturally much higher compared to those from the business teams. The business teams tend to report that the data isn't of the right quality and availability is not great, whereas the technology teams say that they don't get to know about the business priorities, or that the business priorities change too often and they're not able to keep pace. So in the scores there's always a bit of tension and differing opinions. That's when you go and inspect the assets: look at what projects were run earlier, how much they were able to help, and what the gaps were. When you inspect the assets, you can find out for yourself. So when you combine these three aspects, the vision and priorities, the surveys and conversations with people, and third, inspecting the assets, that gives you a very good understanding of the organization's current data maturity level, what needs bridging immediately, and the path you need to take to reach your end goal.

That's really great, and I really love the way you laid it out. So we've definitely discussed how to diagnose where you are on the maturity spectrum and how to approach setting the path. But I think this is an excellent opportunity to discuss how to move along that path. So walk us through the different steps that you recommend to become a data-driven organization.

So there are five steps. One, begin with the business strategy. Reflect on what the organization wants to achieve: what are the business priorities for the current year, for the current quarter, and for the next four to five years?
What is the vision for the business? Based on that, come up with a vision for data and a data strategy. That's the first step: painting the vision for data. Number two is translating this vision into an execution map. If we have to achieve certain milestones in the coming quarter or coming year, what strategic initiatives should we take up? For instance, let's take the example of improving the top line. If that is a business priority for the current year, I want to grow my revenues, so I need to look at certain aspects of customer relationship management, or at expanding into newer markets. Based on these business initiatives, you can come up with the strategic data initiatives. For instance: how can I gain market intelligence through data to choose the markets I need to launch and expand into? And if you want to grow the top line by expanding the share of wallet of your customers: can I use data to identify the current share of wallet, what they are happy about, what they are not happy about, and how I can get them to spend more with me and keep them happy? So the strategic initiatives directly follow from the business priorities. That's the second part: you pick the relevant initiatives into your roadmap. And the third step: based on the projects you want to execute, you build the capability. This is deliberately the third step, because you want to build only the capabilities that the projects you've prioritized require. For instance, if over the next six months you're going to be doing a lot of descriptive and diagnostic projects, you may not have to onboard heavy AI talent; that can come a little later. This also helps you set the right priorities, because people typically tend to hire five to ten PhDs in data science and then try to figure out what project to execute, which is putting the cart before the horse.
Building capability should only follow the prioritization of strategic initiatives. That's step three. Step four is, once you decide on the capability you want to build, the tools and people, to ask what the processes are: which business areas, which business processes should you rewire, so that you can plan for the integration we talked about earlier. And what I call the fifth step, though some of it has to happen slightly earlier, is the people aspect, the mindset shift. So these are the five steps: coming up...

...with a vision and data strategy, a roadmap for execution, building the capability, the processes, and the people aspect. You have to do this iteratively, in cycles. It's not a linear process; I would visualize it as doing multiple cycles of these five steps. One cycle goes on for two to three months, and then you revisit the cycle and keep expanding it again and again. It's like circuit training, where you work out multiple muscles but do it over, say, nine months, and you keep adding more weight and making it more complex. It's that iterative way of developing these different capabilities.

I love that. So one thing you mentioned here, especially at the end, that is extremely important is the data science roadmap. Walk us through your best practices for building an effective data science roadmap. How long should a roadmap be? What are the inputs data leaders need to consider, especially as they evaluate where they are on the data maturity spectrum?

We talked about starting with the business priorities for the roadmap. That's the most important thing. It's not about what is urgent, or about the loudest voice on the floor and what they're asking for; instead, start with what is strategic for the business. And it's important to balance business impact and feasibility. Imagine this as a two-by-two grid: low impact to high impact, and low feasibility to high feasibility. Anything which is high on impact and high on feasibility, as in you have the data, tools, and people to execute it, is something you have to start immediately, because it's going to impact the business in a big way and you can do it starting today. So high impact, high feasibility should be the initiatives you prioritize. Whereas there are projects which are low on impact but very high on feasibility, and a lot of people get distracted by these. Yes, we have the data and tools available.
Even if people are not asking for this in a big way, we can finish it in, say, two to three weeks, so can we go ahead and do it? That's a question I often get from technology teams. I would say: even if it is two to three weeks of effort, if it's not going to really solve anyone's problem in a big way, why bother? So balancing impact and feasibility is very helpful, and don't let urgency get in the way. Yes, at times there is a really burning need, and in that case you'll have to treat it as an exception, but otherwise impact and feasibility are what you should really go after. And you asked how you should structure it and how long it should be. I would recommend a balance of a short-term and a long-term roadmap. There are clients for whom we have done a five-year roadmap, with most of the projects in the next twelve-to-eighteen-month horizon and the other, aspirational initiatives going into year two and year three onwards. A one-to-two-year roadmap is a very common ask, which I think also makes a lot of sense if you're looking at limited resources and a short term to focus on. Let's say you're building a twelve-to-eighteen-month roadmap. You should have a good mix of short-term initiatives, things you can deliver quickly to demonstrate quick business benefits, which I usually call quick wins, balanced with strategic initiatives which need more investment but will give much longer-term benefits. So to summarize: start with the business priorities, balance impact and feasibility, and balance short- and long-term initiatives.

I love the balance between quick wins and long-term strategic priorities.
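The impact-feasibility balance Ganes describes can be sketched as a simple scoring exercise. The project names, scores, and the 1-to-5 scale below are all hypothetical, purely to illustrate the two-by-two logic.

```python
# Hypothetical sketch of the impact/feasibility two-by-two from the interview.
# Projects are scored 1-5 on business impact and on feasibility.

projects = [
    {"name": "Churn dashboard",       "impact": 5, "feasibility": 4},
    {"name": "Log archive cleanup",   "impact": 2, "feasibility": 5},
    {"name": "Demand forecasting AI", "impact": 5, "feasibility": 2},
    {"name": "Report font refresh",   "impact": 1, "feasibility": 2},
]

def quadrant(project, threshold=3):
    """Place a project into one of the four quadrants Ganes describes."""
    hi_impact = project["impact"] >= threshold
    hi_feasible = project["feasibility"] >= threshold
    if hi_impact and hi_feasible:
        return "do now"        # quick wins: start immediately
    if hi_impact:
        return "plan/invest"   # strategic: high impact, needs capability first
    if hi_feasible:
        return "deprioritize"  # easy but low value: the common distraction
    return "drop"

for p in sorted(projects, key=lambda p: (p["impact"], p["feasibility"]), reverse=True):
    print(f'{p["name"]}: {quadrant(p)}')
```

The "deprioritize" quadrant is the one Ganes warns about: feasible two-to-three-week projects that don't solve anyone's problem in a big way.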
I think quick wins are also such a great way to impact that last component of the steps, the cultural adoption, because if you can showcase the power of a quick win, you'll be able to get people on your side and build more of a case for investment in data science. It's a virtuous cycle. Paint a picture for us of what those quick wins look like, especially common quick wins you see across industries, and how do you pair those with long-term projects? What do those strategic long-term projects look like as...

...well? When we talk about projects, I usually ask business leaders what they think the relative spend should be on the simpler analytical capabilities, like descriptive and diagnostic analytics, versus the predictive and AI capabilities, the really advanced analytics. And I also ask them where they think the biggest business benefits will come from. Perhaps based on the market buzz, a lot of people get carried away and say we need to put more money into predictive analytics, because that is where the biggest business benefits would come from. Agreed, predictive analytics needs relatively more investment, but across industries I've often seen that the majority of the quick wins come from the simpler data and analytics initiatives: descriptive or diagnostic ones. For example, we were working with a manufacturing firm to find out what was driving the failures across batches, so it's diagnostic analytics: what causes batch failures? That was very impactful. Or, for instance, what are the major drivers of manufacturing yield? What are the two or three factors I need to tweak and play around with to improve my yield? Very important for a manufacturing organization. These kinds of projects can give you a lot more benefit in the short term compared to, for instance, predicting the machine parameter settings. That's another initiative you could take up: with multiple parameters going in, coming up with a golden batch, as in the pharma industry, where the golden batch is the concept of the optimal yield and the parameters you need to tweak on the machine. That's a predictive analytics engagement, but even before that, looking at failures and the factors driving yield could be much more impactful and a quicker-win project.
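A diagnostic question like "which factors drive yield?" often starts with nothing fancier than a correlation scan over batch records. The sketch below is a minimal, purely illustrative example; the batch data, parameter names, and yield values are invented, and a real engagement would use far more data and more careful statistics.

```python
# Hypothetical batch records: process parameters and the resulting yield (%).
# The numbers are made up to illustrate a diagnostic correlation scan.
batches = [
    {"temp": 180, "pressure": 2.1, "mix_time": 30, "yield": 91},
    {"temp": 185, "pressure": 2.0, "mix_time": 32, "yield": 93},
    {"temp": 170, "pressure": 2.4, "mix_time": 28, "yield": 84},
    {"temp": 190, "pressure": 1.9, "mix_time": 35, "yield": 96},
    {"temp": 175, "pressure": 2.3, "mix_time": 29, "yield": 87},
]

def pearson(xs, ys):
    """Pearson correlation coefficient, hand-rolled to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def yield_drivers(records, target="yield"):
    """Rank process parameters by absolute correlation with the target."""
    y = [r[target] for r in records]
    factors = [k for k in records[0] if k != target]
    corr = {f: pearson([r[f] for r in records], y) for f in factors}
    return sorted(corr.items(), key=lambda kv: abs(kv[1]), reverse=True)

for factor, r in yield_drivers(batches):
    print(f"{factor}: r = {r:+.2f}")
```

Correlation is not causation, of course, but as a diagnostic first pass it surfaces the "two or three factors to tweak" that Ganes mentions, before any predictive modeling is attempted.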
So look at descriptive and diagnostic analytics before you get into predictive analytics and the forward-looking questions. There is a place for both, but start with the former; that is where a lot of industries can benefit immediately.

Completely agree. I think the examples of descriptive analytics are so useful because they showcase that just uncovering data insights can lead to improved decisions. And even within predictive analytics, there could be a category of use cases that are easier to pull off than others and that showcase clear value. One thing that comes to mind is a simple customer churn model. It doesn't need to be embedded in software, it doesn't need to be embedded in the business process; it just shows you which customers or categories of customers are more likely to churn, and then you can maybe take a quick action on it, like a marketing campaign, and see if that works out or not. So that's also a category of predictive analytics that could work.

That's right. Just to add onto that, and that's a great example: even within predictive analytics, you could do a lot with simpler statistical or machine learning techniques, as opposed to what you can do with, say, deep learning or the more advanced algorithms. The trade-off is always there, and you could start with the simpler ones. But I would make a slight correction there: whether it is a simpler or a more advanced one, integration into the business process, I would say, is critical. I'll give you an example of a telecom firm we were working with. They were trying to identify which customers are likely to churn, and we built a churn model.
We saw that decision trees and some of the simpler models were able to give a good improvement, whereas deep learning and the more advanced techniques were able to give almost a sixty percent improvement; you're seeing almost a twenty percent difference. So we went for the more advanced techniques, but that obviously has an engineering cost, and the bigger challenge we faced was explainability. With decision trees you can say that these are the three factors why you need to act on these customers and why they are likely to leave, as per the way the model sees it, whereas deep learning just spits out a set of...

...customer names and numbers, saying go and act on them, without giving much of an explanation. And we saw that the resistance from the marketing teams was the greatest. They were not ready to act on it, because they said: without an explanation, some of these customers are using our products very heavily, so why would the model think they're going to leave? So explainability also becomes a challenge when you go for advanced techniques, and it's always good to start with simpler techniques.

That's a great nuance, and I really appreciate it. Given that we're talking about roadmaps, and connecting back to the original question: how do you ensure that a roadmap is executed upon? As a data leader, I think it's often easy to fall into analysis paralysis, and it's often easy to fall into reprioritizing the roadmap, et cetera. What are the main challenges in execution that you find, and how do you keep the teams motivated to execute on the roadmap?

One common challenge I've seen when it comes to building roadmaps and executing on them: a lot of teams create a roadmap, but they face practical challenges in acting upon it, and the final projects they greenlight and execute are very different from the roadmap. Why does this happen? After the roadmap is created, day to day there are business priorities which keep changing; say there is some major crisis in the business, or the leaders' perceptions have changed and they're asking for a different initiative. That's one reason why people end up picking different projects compared to what they already prioritized in the roadmap. It's always good to revisit the roadmap, but bring in the same factors: business impact, feasibility, and the short- and long-term balance we talked about. Anytime you're revisiting the roadmap, bring these in and compare holistically. Otherwise you will plan for one thing and end up executing another without even revisiting what you planned.
So that's one challenge. Another common challenge I've seen is that, let us say there is capacity to run five projects, and there are four or five functions the chief data officer is in touch with. If they're trying to pick five projects, there is a tendency to allocate one project to each of these functions, so that we satisfy a little bit of everyone. But the challenge with this approach is that you need to cross a particular threshold for the business benefits to kick in. It's often the case that you need to do two to three projects in adjacent areas within one business function so that you can deliver a sizeable business benefit, one the teams would notice and that demonstrates value for the business. When you split it one per function and it's not able to cross that critical threshold, none of these functions will ultimately see an ROI from the effort. By trying to please everyone, you will not be able to demonstrate ROI for anyone. So it's important to take a portfolio approach when picking projects: am I building a minimum portfolio of two to three projects in adjacent areas with high synergy, so that I can deliver a strong ROI as a combination of all these projects? Don't try to average out and please all functions by distributing your projects.

That's really great, and I love that. So walk us through maybe how you cross some of these challenges, and some of the ways data leaders can pre-empt these challenges from the get-go.

In roadmap execution, one of the biggest factors you can leverage is executive involvement. We're talking about business priority and impact, and I've repeated this multiple times so far: what is the benefit for the organization, and who better to reflect and decide on that than an executive?
So you should have executive involvement throughout the engagement, not just for signing the check, but also for reviewing the initiatives and green-lighting them, saying yes, this initiative makes sense, or, if you're going to swap it out with another initiative, this is a good alternative. So executive involvement is critical. What is a way to operationalize that? I've seen that forming a steering committee, a data and analytics steering committee, is a good vehicle: it brings together technology leaders and business leaders, and ensures that there are at least one or two executives as part of the committee. The steering committee should be tasked with ensuring that the vision for data is delivered upon and that the ROI the organization has projected is achieved. The steering committee should own and ensure that the projects they pick, and how they review them, lead to that ROI. So the steering committee should meet periodically to green-light the initiatives and review whether they are being executed in spirit and whether the ROI is being delivered, and any changes in the roadmap, like the question you were asking earlier, or other difficulties, have to be tabled to the steering committee. That's one group that can reflect on and support the initiatives along the way.

Okay, that's really great, and I appreciate you making that distinction as well. I think the last thing we need to touch upon, which is obviously one of the most important aspects of building a data-driven organization, is the people and cultural components that we've discussed throughout. One thing you've mentioned repeatedly in the discussion so far is cultural resistance and the importance of creating the right mindset, and I think a big part of that is that the adoption of data science projects rides on there being cultural acceptance, rather than resistance, within the organization. So maybe as an introduction or prelude to the conversation here, walk us through why building or improving the mindset and shifting the culture is so critical for the adoption of data science.

We talked about the vision for data. When everyone within an organization understands how data and analytics is going to help, where the organization can go with data, and what the business benefits are, then there will be much less resistance.
So I often say that leaders have an important role to play here in explaining why data and analytics is required, how it will enable the business goals, who should get involved, and what is expected of each stakeholder. Communicating all four aspects, the why, how, who, and what, clearly is the leader's responsibility. When leaders do a great job of storytelling and communicating this, alignment is much easier.

I completely agree. Maybe touching upon the more practical side of things, what are some of the ways and tactics by which you can shift mindsets within the organization and increase the adoption of data science? And what is the role of education and data literacy in shifting the mindset?

When you talk about shifting the mindset, interestingly, the biggest hurdle is not technical or a capability gap, but semantics. The language people use to communicate about data is often the biggest inhibitor. When people talk about data, I've heard technology teams talking about data governance, or data science, and we throw a lot of jargon into this, and we assume that the other person knows it because this is our world. But the other person may not. I've posted about this on LinkedIn a couple of times; there are some friends who reached out asking for the expansions of some simple abbreviations. For instance, CDO: they asked, what does CDO mean? And then I realized that, depending on the function and the industry, CDO could mean a number of different things. I've come across this in my other engagements as well. For instance, in the marketing world, CDO means something else. So the language you use often turns into a roadblock.
So if we are able to simplify this, reduce the jargon, and communicate in direct terms, this is what we mean, this is what is expected, that actually can help win people over and align teams for a common purpose. So what do we really mean by data literacy? The ability to read, write, and communicate with data and with insights identified using data. That's data literacy in simple terms. If everyone within the organization is data literate, just as they are literate in the primary language the organization uses for communication, then communicating with data will be seamless.

Completely agree, and I love that phrase, a common data language, right? I think another side of the cultural resistance or the mindset shift, especially among frontline workers within the organization, is that you see a lot of people who are distrustful of data. There's a fear of automation, a fear of redundancy within their role, which I think is a super valid fear to have. Walk us through, as a data leader, how do you assuage these concerns?

I've come across three types of fear. One is the fear of the unknown, or the fear of new technology. People are worried: there's this new thing, AI or data science, which we have no clue about, and suddenly people have started talking about it. That's the fear of the unknown. Second is the fear of automation. Once they understand AI's capabilities and what data science is, there's the fear that it will take away their jobs. That's again very common. And the third fear is the fear of getting exposed. Even when people get over the first two, I've heard comments outside the boardroom, when you talk one on one to people, and they say, I'm comfortable doing it this way, because too much transparency and too much insight from data will actually expose some inefficiencies within the system. So we don't want that much clarity with data; we are comfortable with the way we're doing business. That's the fear of getting exposed, the third fear. How do you tackle these three as a leader? The first one, the fear of the unknown or of new technology, you'll have to tackle through data literacy and other communication mechanisms, explaining what the technology really is and how it can help the business. That's the education aspect. Then there's the fear of automation.
When leaders talk about the purpose for data and the future state of the organization, it's important to paint a picture of the future: once these data and analytics initiatives go live, this is where we expect efficiencies to kick in, and at the same time, this is where we expect people to stay engaged and move on to higher-level problems or a different set of challenges they're not tackling today. When you paint that picture of the future and show the role people will play, you can tackle the fear of automation. As for the fear of getting exposed, that's the toughest of the three. You'll have to talk about why the organization needs it, how some of these inefficiencies are hurting the business, and what kind of incremental gains you could get for the top line or the bottom line. And when you show some quick wins and build momentum, as we talked about earlier, people slowly get over it, and they see that yes, this will actually change the ways of working for the team, we may not get exposed, and we will be able to adapt. So leaders have to act on these three fears, the fear of the unknown, the fear of automation, and the fear of getting exposed, through some of these techniques, which are essential for ensuring adoption and a culture of data.

That's really great; I love how you categorize these three fears. I think the first two fears are closely related to each other. One thing that we've seen is really effective, at least with DataCamp for Business customers who roll out big upskilling and transformation programs, is that it's very important to create this messaging of what's in it for you.
From becoming a data-literate professional, from adopting these skills, from working with data, there are massive career benefits to be discussed, both at the organizational level and for individuals. But on the fear of exposure, I think there needs to be psychological safety for leaders, employees, and workforces: data is not going to be used to punish, it's going to be used to improve, and that really needs to be embedded in the messaging. Which leads to my last question on the cultural side: there's a lot of internal evangelism, excitement, and habit-changing communication that needs to go into a data culture transformation program. What have you seen are the most impactful tactics a chief data officer or data leader can use when communicating about the transformation agenda?

Well, talking about the business benefits and the impact of data. We talked about painting a picture of the future, quantifying some of the business benefits, and describing the new state where it's not just the business benefiting but people being able to do their jobs better. When you talk about these benefits at different levels, that's one way chief data officers and analytics leaders can communicate. Another important aspect they need to keep in mind is that with rising adoption and rising investment in data and analytics tools, I see a good focus on execution and adoption, but experimentation with data and analytics is reducing. I think it's an important responsibility of data and analytics leaders to also channel some effort and energy into experimentation with data, because the ecosystem is still evolving. There are a lot of tools, and I would expect that for the next couple of years there will be a lot of flux and new technologies coming in, and there is great research helping with advances in these technologies. So how can you run experiments to find out what is the latest and greatest, which of these advances can help your business solve some challenges, do some quick pilots to test them out, and, if something is ready for productionization, roll it out across the business and plan for adoption? These are some points that data leaders need to keep in mind.

That's really great. Given that you mentioned the next few years are going to be in flux, how do you see the data literacy and data culture conversation evolving?
What are some of your predictions about how organizations are going to become data-driven in the next three to five years or so?

I'm really happy with the adoption and progress in this space over the last few years. About five years back, we were talking about the promise of AI yet again, and the promise of deep learning. Three years back, the topmost challenge I noticed was data culture, whereas now it is still a challenge, but not the topmost one anymore. There are several organizations which have been able to achieve this, at least in pockets. So looking at the future, I think data literacy will continue to stay important, and a lot of organizations will have progressed quite a bit on these dimensions, with greater adoption. Eventually, I would consider data and analytics to have actually delivered on its promise when no one talks about data and analytics in the same excited way: it is business as usual, it becomes invisible, and it delivers the business benefits silently. That is the future state where I would consider the promise to have been realized.

I completely agree. I cannot wait for the day we don't have any more relevant topics to talk about on the podcast, because everything has been done.

We'll always have a new thing to talk about. As practitioners, we've talked about the fear of what comes next, the fear of relevance, right? I'm pretty sure we'll be discussing some really advanced topics on the podcast, and I'd be happy to come back and discuss those topics in a few years.

So Ganes, as we wrap up, is there any final call to action before we end today's episode?

I would just emphasize the focus on benefits and ROI; that needs a lot more attention. If we call it out as leaders, talk about it, and keep a focus on it throughout, that will make all the difference.

That's really great. Thank you so much, Ganes, for coming on DataFramed.

Thanks for having me, Adel.
It's been a pleasure.

You've been listening to DataFramed, a podcast by DataCamp. Keep connected with us by subscribing to the show in your favorite podcast player. Please give us a rating, leave a comment, and share episodes you love. That helps us keep delivering insights into all things data. Thanks for listening. Until next time.
