DataFramed

Episode 102 · 5 months ago

#101 How Real-Time Data Accelerates Business Outcomes

ABOUT THIS EPISODE

Most companies experience the same pain point when working with data: it takes too long to get the right data to the right people. This creates a huge opportunity for data scientists to find innovative solutions to accelerate that process.

One very effective method is to implement real-time data solutions that can increase business revenue and make it easier for anyone relying on the data to access the data they need, understand it, and make accurate decisions with it.

George Trujillo joins the show to share how he believes real-time data has the potential to completely transform the way companies work with data. George is the Principal Data Strategist at DataStax, a tech company that helps businesses scale by mobilizing real-time data on a single, unified stack. With a career spanning 30 years and companies like Charles Schwab, Fidelity Investments, and Overstock.com, George is an expert in data-driven executive decision-making and tying data initiatives to tangible business value outcomes.

In this episode, we talk about the real-world use cases of real-time analytics, why reducing data complexity is key to improving the customer experience, the common problems that slow data-driven decision-making, and how data practitioners can start implementing real-time data through small high-value analytical assets.

You're listening to DataFramed, a podcast by DataCamp. In this show you'll hear all the latest trends and insights in data science. Whether you're just getting started in your data career or you're a data leader looking to scale data-driven decisions in your organization, join us for in-depth discussions with data and analytics leaders at the forefront of the data revolution. Let's dive right in.

Hi everyone, this is Richie, your resident data evangelist and host for today. There's a universal problem with data analytics in that it takes time to go from asking a question to getting an answer, and, in the worst case, you may no longer care about the answer by the time you receive it. So essentially, all organizations have an ongoing quest to cut down the time it takes to get value out of data, and the end game for this is real-time analytics, where you get the results from data analyses in microseconds. Telling us how to achieve this holy grail of analytics is George Trujillo, the Principal Data Strategist at DataStax. He has a ton of experience helping C-suite executives figure out a data strategy for their organization and helping them deliver value from data quicker. I am very excited to hear his advice.

Hi there, George. Thank you for joining us today. We're talking about real-time analytics and how it can help your business and help your customers. Since you work for DataStax, and DataStax is primarily a tooling company, I'd like to talk a little bit about some of the tools you need for this. If you're trying to put together some sort of real-time data stack yourself, then where do you begin? What are the different components to that?

Yeah, well, Richie, thank you for having me here today. I'm really looking forward to joining you in discussion, as you can always get me to talk about data.
So one of the things that helps me is that when I look at a data ecosystem, it has to work together, and one of the things that helps me visualize it and architect and design it is to look at the data ecosystem as a data supply chain: data flows through that ecosystem. So you can have applications and IoT devices and databases as sources of data, and then that data will flow into an area which is your streaming, your messaging, your queuing data. Then from that live data flow, you move into your databases, where data has a lifespan by persisting, and then either from the streams or the databases, that data again flows, in raw or transformed form, into your analytical platforms, which are your data warehouses, your lakehouses, your cloud storage, etcetera. So with that data supply chain, the more it flows efficiently, the faster you can go from data discovery to realizing value from that data.

So it's really all about how quickly you can get answers to data problems, then?

Yeah, that's correct, because we get so focused on data that it can become easy to lose sight of the fact that if we don't generate value from the data, it doesn't matter how much data we have. The goal is always to generate value from the data and to generate revenue for the company. And so when you talk about tools, what helps me look at that ecosystem is understanding: what are the tools that make up the data ingestion for data flows in the ecosystem? What are the tools that make up the databases where you persist the data, or the memory caches where the data resides for very low latency, or how you want to format and transform that data into your analytical platforms? So breaking the data into those flow areas I think is a good approach, because it always...

...makes sure that you're looking at everything holistically and you don't look at one area myopically, because the flow impacts the whole ecosystem.

You mentioned this phrase, a data supply chain. That's really interesting. I haven't heard that term before. Can you tell me a bit about what you mean by the data supply chain?

Yeah, I've had different roles in my career, and that's allowed me to look at data and business from a lot of different perspectives. I've been a VP of data, where all the data in an organization reported to my office. I've been a VP of data architecture and data strategy. So in those roles I was always looking at a vertical: what's going on in the databases, what's going on in the data warehouses, what's going on in the ingestion platform. But I spent about four years working with Oracle, where I had a similar role to the one I have at DataStax, looking at enterprise customers who were asking, how do we solve this problem for our business? How can we execute on data faster? And when I had all the data of an organization reporting to me, I wasn't solving one problem, I was really solving a holistic problem. And I think once you can look at the data ecosystem and your tools holistically, it completely changes the way you go about solving problems. You start realizing that it's not just the tools you have. When you select a tool, it's often for a project or a use case or an initiative, and often enough thought is not put into the ramifications of how that's going to impact the entire ecosystem. So the reason I started looking at a data supply chain is that I really looked at how data flows into a system, it persists, then it gets into a form where people want to analyze it and run machine learning algorithms on it to get value from it, and I started seeing that it's really that flow of data that has to become efficient. It's that flow of data that people need to tap into easily to generate value from it.
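The data supply chain George describes, sources feeding streams, streams persisting into databases, and databases feeding analytical platforms, can be sketched as a toy pipeline. This is purely illustrative: the class, stage names, and event shapes are invented for the example and are not DataStax tooling.

```python
from collections import deque


class DataSupplyChain:
    """Toy model of the data supply chain:
    sources -> stream (queue) -> database (persistence) -> analytics."""

    def __init__(self):
        self.stream = deque()   # messaging / queuing layer (live data flow)
        self.database = []      # persisted events with a lifespan
        self.warehouse = []     # analytical platform (transformed data)

    def ingest(self, event):
        """An app or IoT device emits an event into the stream."""
        self.stream.append(event)

    def persist(self):
        """Drain the live stream into the database."""
        while self.stream:
            self.database.append(self.stream.popleft())

    def transform(self):
        """Move raw rows into the warehouse in an analysis-friendly shape."""
        self.warehouse = [{"source": e["source"], "value": e["value"]}
                          for e in self.database]


chain = DataSupplyChain()
chain.ingest({"source": "mobile_app", "value": 42})
chain.ingest({"source": "iot_sensor", "value": 7})
chain.persist()
chain.transform()
print(len(chain.warehouse))  # 2
```

The point of the sketch is the shape, not the tools: in a real stack each stage would be Pulsar or Kafka, Cassandra, and a warehouse, and efficiency comes from how smoothly events move from one stage to the next.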
So that data supply chain viewpoint really helps me make sure that I stay consistent in keeping the ecosystem and the data flows as efficient as possible, and that I don't look at things myopically and just see a vertical view of the technology.

So this is a really high-level overview of where data is being used throughout the business?

That's correct, because when I talk to lines-of-business leaders or talk to CMOs or presidents, they're never talking about technology. They're saying we need to get access to data faster. And Richie, what actually changed my perspective, and how I started looking at things holistically and from the data supply chain, is that I started spending a lot of time talking to business leaders and asking them: what are your challenges? What do you want me to do faster? What can you not do that you need to be able to do? What frustrates you in getting value from data? And almost without exception, you could boil all of their answers, concerns and frustrations down to one thing: it takes too long to get the right data to the right people. And that's an ecosystem problem, not an individual tool problem.

Ah, interesting. Okay, so if you're trying to build up this ecosystem, then where are the bottlenecks? What's the thing that's most commonly not there?

You know, it's really interesting. When data flows in and you look at your data ingestion platform, it's very consistent. When I look at organizations, they have specific software for queuing data, they have very specific software for messaging data, they have very specific software for pub/sub. So they have all these different data flows. And the thing is, when you want to innovate with data, it typically happens at the data integration points. So you have data flowing in from things like RabbitMQ, from Kafka, from Pulsar. When you're trying to build...

...up this ecosystem for yourself, where do you start? What are the common breakages in this ecosystem?

The common place it breaks is at the data integration points. And the reason that's so critical is that data integration points are usually where you have tremendous potential for innovation. Companies have built their technology one project and use case at a time. So, for the right reasons, they picked a great pub/sub tool, they picked a great queuing system, they picked a great messaging system, and within themselves they all worked really well. But at data integration points, data has to come together. So you might have a business leader saying, hey, we'd like a new look at this data, and they're going, why is it taking me two months to get my response? You have to go to the Kafka developers, you have to go to the RabbitMQ developers, you have to go to the Pulsar developers, you have to go to the product managers, and you have to get these teams together and understand: how are we going to change that data successfully? Who is responsible for it? So the more complex your tooling is at the data integration points, the more it's going to slow down your ability to get value from data.

If these data integration points are really important, then who tends to be responsible for data integration?

That's a very good point. It typically is part of data architecture and also part of data engineering. Those are two of the teams that are typically going to be involved, and one of the things that I think is very important is that you always have to have somebody who leads with data science characteristics as part of this, because the whole goal is: how are we modeling the data? How are we making sure that when we integrate this data and we update this structure, analysts and data scientists are going to be able to get value from it? Or, once we've got this data at this integration point the way we want, what data does it need to integrate with?
And you can see that if it's very difficult to do that, analysts and data scientists get very frustrated, because the data is hard to work with and you end up with tribal knowledge that's so complex only a few people really understand it. And if you only have one or two people who can fix a problem in an organization, it's going to absolutely slow down your ability to get the insights that you're looking for.

Absolutely. That's definitely a problem I recognize, where only a few people know how to do a certain technical task. Since we're talking about improving the time to getting value, and real-time analytics is a big part of this, I'd like to talk a bit about how you get there. So intuitively, real-time analytics seems like a good idea, because you want to get your answers quickly, but it also feels like a more difficult thing than just getting analytics done. At some point it can seem like an impossible challenge. So when do you actually need real-time analytics? When do you just need things to go fast?

Real-time analytics begins with the customer experience. Look at how customers work with a business. At one time we used to work with a banker, we'd work with an accountant, we'd work with our favorite maître d' at a restaurant. If you look at our relationships now, they're almost more with applications and with mobile apps than they are with people. So when somebody goes into an app, can they find the information they're looking for if they're trying to make a decision? Are they able to look at the different products quickly, or are they having to struggle to find them? Can they get accurate information on whether that product is in stock, or on how long it will take to be delivered? The more you can make that efficient and easy for the customer and create a great customer experience, the higher the probability is that you're going to get a transaction.
So we've spent so many years focusing on analytics on the back end in terms of your data warehouses, your cloud...

...storage, your analytical platforms. But if you don't do a great job with real-time data, you're probably not going to generate that transaction, so that data will never get to your data warehouse, because the customer is probably going to go somewhere else. So if you're a customer and you're trying to make a decision and you're on your mobile app or in the browser, are you waiting ten seconds? Are you waiting twenty seconds? In a mobile transaction that's a lifetime. So your technology stack has to have low latency and has to be able to handle the volume and the speed of the data that's going through it. And that's where, Richie, I've actually seen a big change in the industry. Customers that picked certain tooling three, five, seven years ago that worked great for them are now seeing that the tool is not scaling, or it's not able to handle the velocity of data, and so customers are really looking at making sure that they now have platforms that can handle the speed that is now demanded for real-time analytics.

It's interesting that even over three years the technology has changed so much. So what exactly has changed in terms of these platforms, then? What's the difference over the last three years?

I would say one of the top things is the speed at which you have to be able to generate a decision or deliver value. Three or four years ago it was seven minutes, it was five minutes. Now it's microseconds, or a couple of seconds. So the speed difference changes the whole customer experience. The other thing is that often we were able to work in isolation as business units, whether you're supply chain, whether you're marketing, whether you're sales. But the more you have data integration points, the more you have to be able to work with data from different sources very easily. So the way I look at it is, with those different technologies, the people speak different languages.
So how can I get agreement and alignment and get work done if everybody speaks different languages? Well, instead of trying to get everybody to speak multiple languages, what if we started reducing the number of languages that people spoke, or found more of a common language? So, for example, if you're an organization that has multiple pub/subs, multiple queuing systems, multiple messaging systems: it's very typical for me to go into an organization and see five to seven different tools around their ingestion platform. What if we could go from seven down to two? Now I'm only speaking two languages. Nine times out of ten, if I have one group speaking a couple of languages, or one language, and another group speaking seven languages, who do you think is going to be faster and more efficient and make fewer mistakes?

Yeah, that's interesting. I see that in the data analysis or data science world, things are standardizing on Python and SQL, and maybe R as well. But on the data engineering side there are so many tools around. Am I right in thinking that what you're saying is that people are working towards a smaller number of more standard tools, in that case, for data engineering?

I think that has to occur, because if you want to improve efficiency you have to be able to standardize, to be able to optimize, to create a compound effect across an organization. I've never seen a way that you can buy or hire your way out of that. You start getting faster the more you reduce complexity.

Absolutely, that makes sense. Do you see examples of what tools people are standardizing on, then?

Yes. Oversimplifying a little here, it's just: how can I speak fewer languages? So one of the things that led me to the tools that I recommend now is the fact that I went to the business and I started asking all the business leaders, what are your challenges,...

...what are your issues? What would help you be faster? And instead of looking at technology, I took all of their input and I actually reverse engineered it, and I came up with a bunch of checkboxes in terms of all the capabilities I need to basically improve things for the business. And I actually had an epiphany, because I had one or two ingestion platforms and databases that I had implemented for years and been very successful with, and I found that the way companies are looking to innovate today, they need higher scalability. They're looking for multi-cloud, they're looking at how they can move quickly from on-premise to hybrid or multi-cloud. Not that they have to be able to do that today, but they want the option, so that we're not making decisions that are going to paint us into a corner two or three years down the road. So they want that increased flexibility. And so I started seeing that instead of trying to find the best technology (and let me qualify that: we often look at speeds and feeds, we look at how much you can scale, we look at all the technical perspectives), what if we started looking at how that solution is going to help us drive business revenue? It really starts changing how you look at your tools. And with real time, it runs at such a faster pace. Let me give you an example. You can do analytics and ML in the data warehouse or a data lake, and if you don't like that report, you can try different data, you can try different algorithms, you can work with it and iterate to get where you want. With real-time data, you're making a decision that the customer sees. That impacts revenue. So speed here is absolutely essential, not only in terms of supporting the decisioning, but your tools also have to be scalable and handle the volume of data they have to deal with.

This is something you've mentioned a few times now.
Is the most important reason to care about your data flow really the customer experience? Can you give me an example of some things that are really going to impact the customer experience? What sort of specific things are the most important?

Yeah, I'll give you one example in financial services. You may call your bank or financial services company and you want to talk to them about something specific. It gets very frustrating if you get put into a call with someone and it's not who you need to speak to: let me transfer you. And then you have to do that two or three times. It's now created a bad experience, and you haven't even gotten started yet. Right? So in financial services, what we started looking at is doing analytics in real time when that customer calls, understanding the probability of the reason they're calling, and we were able to improve who we connected them to the first time. So that's how we did that seven years ago. But we're now going through that same experience with a mobile app. When a customer connects to that app, can they get to where they can make a decision? So your click-stream data shows: what's the number of clicks it takes a customer to get to a product where they make a decision? How many pages do they go through before they find the product they're looking for? Are you able to convert that view into a transaction and a sale? So all of those up-front dynamics around the customer in the mobile app or in the browser are what's defining the customer experience.

I can definitely appreciate that. I've used so many terrible apps where I was like, I'm just trying to do something simple. Even with banking apps: okay, I just want to check whether I bought something, and you go through twenty clicks to try and find some transaction. So I can definitely believe that's useful. Just on the flip side, are there any things where maybe people would often think...

...that they were important, but they actually turned out to be not important for the customer experience?

Yeah, one of the key things that I've seen is that sometimes, when applications are being developed, there's not a clear enough understanding of how the data generated from that app is going to create value in terms of a business outcome or in terms of generating revenue. And I think that's historical in our industry, because when big data first came out, it was about how can we get data into the data warehouse, how can we get it into cloud storage, and we'll let the data scientists figure it out later. You don't have that luxury with real time. With real time, if that data is complex, or it doesn't integrate with the right data that it needs to make a decision, or it's hard for the developers to draw from that data in a way they can use, then all of that is going to create a negative customer experience, and then it's hard to undo. So historically it's sometimes been acceptable to have technical debt and figure it out later, but with real-time data you don't have that luxury of absorbing technical debt. You need to create a great experience with the data interaction from the start.

Would you say there's a risk of jumping into this too quickly and trying to get to real time and then realizing that you've done it wrong somehow? Or is there an easy, sort of gradual way of getting from your slower processes to real time without that risk?

You know what, I don't think the fundamental best practices change when you get to real-time data. So one of the things that I think is very important is to get some quick wins: build confidence in the business that the approach is going to work, build confidence that you have the right tools, build confidence that we can trust the data and we can manipulate it easily.
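The click-stream questions George raised earlier (how many clicks to reach a product, how many pages before a decision, whether the view converted) could be computed along these lines. This is an illustrative sketch: the event types and session shape are invented for the example, not any production schema.

```python
def clickstream_metrics(events):
    """Summarize one session, given an ordered list of events like
    {"type": "page_view" | "product_view" | "purchase"}."""
    clicks_to_product = None
    pages_before_product = 0
    converted = False
    for i, ev in enumerate(events):
        if ev["type"] == "product_view" and clicks_to_product is None:
            clicks_to_product = i + 1  # clicks taken to reach a product
            pages_before_product = sum(
                1 for e in events[:i] if e["type"] == "page_view")
        if ev["type"] == "purchase":
            converted = True           # the view became a transaction
    return {"clicks_to_product": clicks_to_product,
            "pages_before_product": pages_before_product,
            "converted": converted}


session = [{"type": "page_view"}, {"type": "page_view"},
           {"type": "product_view"}, {"type": "purchase"}]
print(clickstream_metrics(session))
# {'clicks_to_product': 3, 'pages_before_product': 2, 'converted': True}
```

In a real-time stack the same aggregation would run over a live stream rather than a finished list, which is exactly where the latency requirements George describes come in.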
So I always look for high-value analytical assets that the business can drive revenue or outcomes with, and ask what type of changes we can make to accelerate that. So I like starting small with high-value analytical assets, improving them for the business, getting wins, building the teams together, growing their ability to interact with the tools and the data more efficiently, finding out where we have weaknesses that we have to fix. So I think quick wins with high-value analytical assets that will have impact are a great way to get started.

That seems very sensible. Start simple and then build up to something more complex. Can you give me an example of what you mean by a high-value analytical asset?

With marketing, when you put together a marketing campaign, or you put together coupons, or you put together discounts, you're investing revenue and capital in that being successful and you're expecting a certain outcome. So there has to be a very clear understanding of the analytics around those business activities: understanding, if we offer a five percent discount or a ten percent discount, can we generate a certain type of revenue from that? If we send out ten million coupons, what's the potential revenue we're going to get from that effort? So I think the analytics around those types of business activities have to be well understood, because that's how you start executing your business model to drive revenue for the organization.

Okay, and of course coupons can be a virtual thing as well. It's not necessarily a physical coupon. Right? It can be for online...

...businesses too. Yeah, absolutely. You can go into your mobile app and they might say, hey, Richie's here, and Richie goes to baseball stadiums and Richie likes baseball hats. So they might look for: are there some baseball hats or some baseball jerseys that Richie might like?

So it seems like quite a lot of the stuff we've been talking about has been not necessarily just data analytics, but data applications as well. Maybe you can talk me through what you see as an example of a good data application, and expand on that a little bit.

A good data application generates revenue. I think that's the key thing. So we have to understand, if we're building this application, we're building the data set. Somebody has to have strong skin in the game to understand how we are going to generate revenue and make predictions on that. The second key thing is the data that you're going to generate from the application: is it something that's going to be easy for the consumer to work with, easy to understand? And the third thing is, can we trust the data that comes from this application, especially when it integrates with other data? Because data scientists and analysts must have confidence in the data that's been generated. So applications that create a high level of trust are also important.

Okay, and I should probably have clarified before: by data application, I don't necessarily mean a mobile app or something. Sometimes a data application could be a dashboard or something like that, just some way of having output from data. So do you have any examples of data applications you've seen that have been successful?

Yeah, the perfect example is Home Depot. When COVID occurred, they realized that their whole business model was changing. Instead of having people come in their door, the front door to their organization was now their mobile app.
So how quickly could they get that mobile app up and running, so they could understand inventory, and they could understand delivery times, and they could understand how to apply coupons and discounts and how to reward customers? They got that application up in a very short time frame, and that had a very positive impact on the company through the whole COVID period we went through. So, as you're well aware, a lot of organizations were going through that process of changing the front door from a physical store to their mobile app.

Okay, and they really changed the whole business model, I guess due to external influences, but the data transition seems to have been a big part of that. So doing something huge like that and changing a business model seems pretty impressive. How do you go about identifying these opportunities where you need to make that change in terms of how you deal with data?

You know, one of the things that I think is missed here is that we often talk about open source, and sometimes I think open source isn't understood well enough. Open source is basically about a culture of innovation. If you look at developers who are trying to find a solution or take an application to the next level, often the first place they're going to look is open source. So that culture of innovation can drive an organization. I'll give you an example. I was working at a company and we were moving more to real time, and a lot of the solutions I needed in terms of data quality, data durability and data discovery, the things that empower all the technical data tool decisions you make, weren't available. I went to some of the biggest companies and they didn't have those products ready, or they had very small versions of them, and I found that when I was going to open source, I was starting to get the capabilities I was looking for, because that was the cusp of the next wave of technology. So, along...

...with building that culture of innovation in your organization, it really does help avoid vendor lock-in, and I think that becomes even more important in today's world, because things are moving faster. When you look at that data supply chain I referred to, there have to be individual components in that technical data stack that are malleable, that interface really well with the other tools in that stack. So being able to move from on-premise to the cloud, being able to move to multi-cloud, that avoidance of vendor lock-in really helps future-proof the decisions you're making. And the other thing that I think is a big difference is, if you look at the scale of applications and the scale of the velocity and volume of data, scalability becomes very important, not just for your technology but also for managing your budget. So unit cost economics become very important, and you need to make sure, whatever tool selection you make for your real-time data and your analytics, that you're going to be able to manage the unit costs as that environment scales. And I would highly recommend, for anybody who hasn't read it, Red Hat's report on The State of Enterprise Open Source. I think it would be an eye-opener for a lot of people in terms of how open source at the enterprise level is empowering organizations, and how executives are looking at open source very differently than they were even five years ago.

Okay, is it vendor lock-in, then, that you see as the big driver for switching to open source, or are there other bigger effects that make people choose open source over a proprietary solution?

Open source is often the tip of the spear.
When I need new capabilities, I often see them in open source first. Sometimes, when a business is trying to push the envelope in terms of opportunities, they need something they can start working with today and iterate on in the future, and they can't wait six months or a year for a larger enterprise vendor to get their first version of it out. What that also does in terms of creating innovation is that often you want to work with different products together and see how well they work together. So when someone can download something in open source and start working with it, and they can look at what they're doing from a data ingestion perspective, in data durability, in data profiling and data discovery, and they can play with it very easily, you can see how that can really drive the speed of getting something into production.

Do you find many organizations end up contributing to the open source platforms themselves once they start using them, or do you see most organizations as just being consumers of the technology?

I think it depends on the open source model being followed in a certain area, but I am seeing more and more of the enterprise companies contributing to open source. If you look at Linux, and if we look at things like Cassandra and Pulsar, you're seeing that it's a community-driven generation of innovation. So I do see more and more enterprise companies contributing to open source, because they've realized it's in their best interests.

Excellent. And have you seen any particular open source tools like these that you think have become important within a modern data stack?

One of the tools that I think is really important is Kubernetes. Someone referred to it as the future glue of your applications and your data and your streams as you move from hybrid to cloud and to multi-cloud.
And I think we talk about applications, and we talk about being data-centric and data-driven, as if they were two completely different things. But again, if you look at it from a data supply chain perspective, applications feed streams, they feed data. So if I want to move an application from on-premise to the cloud, it's not just that application that needs...

...to move; its environment needs to move. So Kubernetes really supports that, with containers and unit testing and CI/CD. All the work around testing and productionizing an application, Kubernetes facilitates well. If you're going to be moving your applications from on-premise to the cloud, and they feed data streams, you're going to have to move those streams. If they feed databases, those databases have to be able to move too. So aligning your applications around something like Kubernetes for quickly producing high-quality applications, while having a cloud-native environment like Apache Pulsar that's a first-class citizen with Kubernetes, helps those applications move faster. If you have databases like Cassandra that are hybrid, multi-cloud and open source, it allows your applications to align really well with your streams and your databases, and to move across different environments at the speed you need. And if you decide that you like something other than an Apache Pulsar or a Cassandra, I would highly recommend that you do your due diligence and make sure that whatever you're looking at choosing meets that same criteria.

So if you're going to go about choosing the rest of your data stack: so far we've got Cassandra, we've got Pulsar, we've got Kubernetes. Can you give me your ideal data stack? What are your top picks?

I really like the flexibility you have with Apache Pulsar and Cassandra and how well they align with applications. It allows me to accelerate the speed of deployments. I think a memory cache becomes very important for real-time data: you can cache some of the data from your database so there's very little latency for the data you hold in memory, and I would look at something that gives me a distributed memory cache that I can work with. And in terms of analytical platforms, I think Databricks and Snowflake are excellent solutions, and BigQuery.
I think there's a little bit more flexibility on the analytical platform side, but one of the things that is important: if somebody needs to run a query, it needs to be transparent where the data resides. The customer doesn't care if it's in your cloud storage or if it's in Databricks or Snowflake or BigQuery. All they care about is, can I access the data and does it return a result? So the more you can make data transparent to its consumers, no matter where the data resides, the better. That becomes very important for the organization getting business insights, and I believe it's very hard to have a successful data culture without a data catalog. The data catalog is basically how somebody discovers data and how someone understands data, and when you can make it easy for people to find data and to understand data, you're now empowering them with data. So I think that the data governance program and the data catalog are also a very important part of your stack for being successful with real time data, and that's an area of growth for the industry, because there's still a lot of work that has to be done to make real time data a first class citizen with data catalogs. I think that's crucial for success. We've talked quite a lot about tools now, so maybe we can talk a bit about people as well. Who needs to be involved in working with these tools and what skills do they need? Yeah, first of all, you have your data scientists, and if you look at what data scientists want to be able to do, they want to be able to work with different types of models and data so that they can test different types of algorithms. The more you make data accessible and...

...easier for them to work with, the faster they can go through their models, and you can see that they're going to innovate quickly. If you look at data analysts, they're also an important part of creating business value. We have to reduce the complexity. So one of the things I've found to be an important key is: how easy is it for someone to work with the data? Sometimes you have data sitting behind a seven-way join, and you pretty much have to be a brain scientist to understand it. So I see companies that are making data available to a wider audience in the organization; they're moving to wide, flat tables, so someone doesn't have to be a brain scientist to write queries that return the data they're looking for. Product managers are very important too. I think we're evolving toward a role of a product data manager, someone who understands data and data science characteristics and who helps define the product that's going to generate value. The other thing, Richie, that has been an internal debate for as long as I've been in the industry is: do we centralize or do we decentralize? Often your centralized teams are where you have your technology experts. That's where you have your experts in Cassandra and Pulsar, in Kafka and RabbitMQ, and then you have all the developers in the lines of business, and it can be very frustrating for lines of business to say, we can't get the help that we need on the technology side to be able to innovate with that data. So I think business developers play a very key role in this as well, because organizations that can empower lines of business developers downstream to innovate with data are going to be more successful than companies that can't.
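The move from a seven-way join to a wide, flat table can be sketched in miniature. The tables, column names, and values below are invented for illustration; the point is that related records are pre-joined once into a single denormalized row that an analyst can query directly:

```python
# Three normalized "tables" (dicts standing in for database tables).
# In production these would live in a database; here they are toy data.
customers = {1: {"name": "Ada", "segment": "gold"}}
products = {"p9": {"title": "Widget", "category": "tools"}}
orders = [{"order_id": 100, "customer_id": 1, "product_id": "p9", "total": 19.99}]


def build_wide_table(orders, customers, products):
    """Denormalize the joined tables into one wide, flat row per order."""
    wide = []
    for o in orders:
        c = customers[o["customer_id"]]
        p = products[o["product_id"]]
        wide.append({
            "order_id": o["order_id"],
            "total": o["total"],
            "customer_name": c["name"],
            "customer_segment": c["segment"],
            "product_title": p["title"],
            "product_category": p["category"],
        })
    return wide


for row in build_wide_table(orders, customers, products):
    print(row)
```

The trade-off is the usual one: the wide table duplicates customer and product attributes across rows, but a consumer of the data no longer needs to know the join paths to answer a question.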
So finding that balance of centralized expertise and decentralized business developers who need to innovate with that data is another important piece of the people you have to involve. And something that I think is not prioritized as much as it needs to be is data modeling and data architecture. If that's not done right, it impacts everything downstream. So also having your data architecture teams or your enterprise architects as part of that process is important as well. And maybe the most important person in this process is whoever has the vision that they can sell the business leadership on, that they can sell the consumers of that data on, so that everybody understands: yes, this is the right vision, we see the track that you're leading us down and we believe that's the way to go. You have to get buy-in to have a data culture. You have to have people believe in the tooling and in the approach that you're using. So the person driving that vision and leading that effort, I think, is key to the endeavor as well. That aligns a lot with what I've experienced, that there are so many different people and different roles that end up being involved with data. It also naturally leads to a problem that I've experienced almost everywhere, and that's how do you get these different people to talk to each other? How do you get the business people to talk to the data people and the data people to talk to the engineers, and so on? How do you get a common language around communicating with data between teams and different roles? I believe it comes back to getting everybody to speak a common language, and the common language is business. So I think it's very important, as you're evolving your data culture, to get your technology teams and your data teams to be able to speak business and understand the business objectives, the business challenges, and the business operating model. And once you start focusing on...

...driving everything from the business perspective, then whether we're in IT, or we're technologists, or we're data experts, we're speaking the business language. I think that's an absolute key, and when I look at companies that are really succeeding in their data culture and being data driven in their digital transformations, it's because they're speaking more of a common business language. That seems like really great advice. From my personal background, I started with doing the data things and the business stuff came later, and I know a lot of people worry, oh, doing data is hard, but I find that, actually, the data is the easy bit, and learning the business side of things, I think that's where the challenge is. But I definitely agree that's a really great strategy, just to get everyone to understand what your business objectives are. I mention that as if it's really easy, but it is really hard, and sometimes I think that the most successful companies with data are usually the ones that are the most tenacious and really stick to saying: we are a business organization and we're going to use data to help drive it. And you have to have the right technical and data leadership to get the technology and data teams to buy into: we have to speak the business language and we speak of value. We're not talking speeds and feeds and how big things can get. We're talking about business value to the customer. Again, just helping people try and get started with this: how do you get this sort of alignment around business value? Where's the place to start? I think it comes back to really picking out two or three areas of data that you believe have high analytical value, where we can transition and generate business outcomes or increase revenue.
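The kind of quick win George goes on to describe, moving a decision out of a periodic batch job and into the event stream, can be sketched like this. The event shape and the coupon rule are invented for illustration; the contrast is architectural, not the business logic:

```python
def decide(event):
    """Toy decision rule: offer a coupon on high-value carts."""
    return "coupon" if event["cart_total"] > 100 else "none"


def batch_pipeline(events):
    # Batch style: collect everything first, then decide once at the end.
    # Decision latency is the whole batch interval (minutes, in George's
    # example) plus the processing time.
    collected = list(events)
    return [decide(e) for e in collected]


def streaming_pipeline(events):
    # Streaming style: decide the moment each event arrives, so the
    # decision latency is per-event (seconds or less), not per-batch.
    for e in events:
        yield decide(e)


events = [{"cart_total": 150}, {"cart_total": 30}]
print(list(streaming_pipeline(events)))
```

Both pipelines produce the same decisions; what changes is when each decision becomes available, which is exactly the seven-minutes-to-two-seconds gap the conversation turns to next.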
Because if you can do that successfully, you're going to get the business buying into it, you're going to have the business being champions of your effort, and you're going to start getting your data and technology teams to understand that this is the whole goal of what we're trying to do. Just as in soccer, baseball, or basketball, you have to get some wins to start building confidence. So pick two or three high value analytical assets that maybe just need some tweaks or some changes, or need new data added to them, or that have data flowing from a stream to a database, to a data warehouse, and back into memory. If you can get that streaming into the real time decisioning process quicker, going from seven minutes to two seconds to make a decision on that data, that's how you start getting wins. It sounds really simple when you say you go from seven minutes to a few seconds or whatever, like you just have to shave seven minutes off, but I'm sure that's a big challenge. Wonderful. All right. So, just to wrap this up, we talked a lot about trying to improve business performance with data and having an impact on customers. Do you have any final advice for any business trying to get there? I think the number one thing that I see makes a very key difference is that you have to reduce complexity. Your applications, your data streams, and your databases have to be able to align together and move well together. And whatever you're doing from a data perspective, you have to address data quality and trust in that data. That is critical for real time data, that the data is trusted, so that when we're giving our customers all these coupons and all these discounts, we know that we're basing those decisions on accurate information. All right, super...

...thank you very much. This has been really informative. I'm sure a lot of people are going to be inspired to try and speed up their time to value with their data stack. So that's brilliant. Thank you for your time, George. Hey, Richie, thank you. I appreciated it as well. You've been listening to DataFramed, a podcast by DataCamp. Keep connected with us by subscribing to the show in your favorite podcast player. Please give us a rating, leave a comment, and share episodes you love. That helps us keep delivering insights into all things data. Thanks for listening. Until next time.
