Deep dive into Capital One's cloud and data strategy wins

As part of Data Week for VB Transform 2022, Patrick Barch, Senior Director of Product Management at Capital One Software, took the stage to explain why operationalizing the data mesh is critical to operating in the cloud. Then, on day two of Data Week, he sat down with Matt Marshall, CEO of VentureBeat, to dig into the governance element of cloud strategy, and why a holistic approach is key to managing the influx of data into a new environment.

About six years ago, Capital One went all in on the public cloud. The company shut down the data centers it owned and operated and embarked on modernizing its data ecosystem for machine learning.

"How do you handle something like that?" Barch asked rhetorically. “And by the way, you have to get it right, because – pick your phrase. Data is the new oil. Data is the new gold. At Capital One, we say data is the air we breathe. Businesses recognize that the key to success in today's technology landscape is the use of their data. So no pressure. »

Moving to the cloud means more data, from more sources, stored in more places, and an entire community of users demanding self-service access to all of that data in the tool, format, and consumption mode of their choice. All of this is happening against a backdrop of patchwork privacy legislation popping up all over the world.

"When you move to the cloud, you face a lot of challenges," Barch told Marshall. "There are challenges with publishing, sending data to the cloud in a well-managed way. There are consumption issues. How do you help your teams find all this data that is exploding in quantity, it's i.e. on all these different platforms like AWS and Google and Snowflake and such? How do you handle all of this data, especially against a patchwork of emerging privacy legislation popping up all over the world? Finally, this is a new paradigm for infrastructure management. You are no longer responsible for the servers. You pay as you go. How do you put the right controls in place around all of this?"

Early in the journey, the company invested in product management and user-centric design for its data ecosystem to address the specific challenges of all of its customers and users: how they use data and where they run into difficulties. That includes everyone from the people who publish high-quality data to a shared environment, to the analysts and scientists who leverage that data to make critical business decisions, to the data governance and risk management teams concerned with setting policies and enforcing them across the enterprise, and the teams responsible for managing the underlying infrastructure that powers all of these use cases.

Organizations often end up with an array of point solutions to meet some of these user needs. A single person may have to navigate six or seven different tools and processes to perform a simple task, such as sharing a new dataset or finding data. But it just doesn't work, Barch says. Scaling that kind of ecosystem becomes extremely complicated, both for the engineering teams who have to build and maintain all of those integrations and for the users who have to navigate them.

"For me, the heart of this thing is treating data like a product," Barch said. "Once your company makes that mindset shift – and it really is a mindset shift – the rest of those principles fall into place. You have to figure out how to organize all of those products, and you need to find the right capabilities to enable self-service for a variety of users."

This is where the data mesh comes in: an operating model that can help scale a well-managed cloud data ecosystem. Capital One approached its own ecosystem along two strands: centralized policy, built into a common platform, and federated responsibility for data management. The goal was to give more control to the teams closest to the data itself, because a data mesh only works when it is operationalized through self-service. The overarching aim is to keep data practices running at the speed of the business.
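To make that operating model concrete, here is a minimal, hypothetical sketch in Python. It is not Capital One's actual platform; the names `SharedPlatform`, `CentralPolicy` and `Dataset` are invented for illustration. It simply shows the pattern the article describes: governance rules live in one central place, every team publishes through the same self-service path, and ownership of each dataset stays with the domain team that produces it.

```python
# Hypothetical sketch of "centralized policy + common platform + federated ownership".
# All class and field names are illustrative assumptions, not a real product's API.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    name: str
    owner_team: str                 # federated ownership: the domain team closest to the data
    contains_pii: bool
    tags: set = field(default_factory=set)


class CentralPolicy:
    """One place where governance and risk teams define the rules."""

    def check(self, ds: Dataset) -> list:
        issues = []
        if ds.contains_pii and "tokenized" not in ds.tags:
            issues.append("PII must be tokenized before publishing")
        if not ds.owner_team:
            issues.append("every dataset needs an accountable owner")
        return issues


class SharedPlatform:
    """Common self-service entry point: any team can publish,
    but every publish passes the same centralized checks."""

    def __init__(self, policy: CentralPolicy):
        self.policy = policy
        self.catalog = {}           # discoverable by consumers across the company

    def publish(self, ds: Dataset) -> bool:
        issues = self.policy.check(ds)
        if issues:
            print(f"Rejected {ds.name}: {issues}")
            return False
        self.catalog[ds.name] = ds
        return True


platform = SharedPlatform(CentralPolicy())
platform.publish(Dataset("card_transactions", "payments-team",
                         contains_pii=True, tags={"tokenized"}))
```

The point of the pattern is that domain teams serve themselves without filing tickets, while the policy they are held to is defined and enforced in exactly one place.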

"When you combine common tools and centralized policy with federated ownership, you make the job easier for your practitioners," he said. "You turn data from a bottleneck into something that can boost and energize your business."

Capital One engineers built these tools and infrastructure in-house, but Barch acknowledges that not all companies have the luxury of building their own. Fortunately, there is a wide range of solutions available today that did not exist when the company began its journey.

"You just have to make sure that you're creating a user experience that works for your user base," he explained. "The days of one central data team and data being the work of the IT team - those days are over. Think...
