
A Modern Approach to Data Sharing

Issam Ouchen · Slalom Technology
7 min read · Mar 4, 2022


In an increasingly connected world, organizations need to realize the full potential of their partner ecosystem. Unlocking these new opportunities requires a more straightforward path to secure, fully controlled, real-time data sharing at scale.

Competing in the digital age requires an organization’s IT systems to have the flexibility and scalability to allow for experimentation and iteration at speed. As a result, many companies have launched programs to modernize their technology. Now more than ever, organizations need to rethink modernization paradigms.

A key ingredient of digital transformation is realizing the full potential of real-time data sharing across partners in an organization’s ecosystem. Relying on insights from data inside the enterprise alone is no longer enough. Making the right data available to those with the right to access it can enable effective collaboration, surface useful insights, and improve decision-making. Think of the potential impact this kind of collaboration can unlock in areas such as aviation safety, food sourcing, global supply chain efficiency, and retail customer experience.

“Data and analytics leaders who share data externally generate three times more measurable economic benefit than those who do not.” — Gartner, 2021

We’re not going to explore the challenges of data sharing in this article. Rather, we’ll focus on defining a new paradigm through which data sharing solutions should be understood.

Before we get there, however, let’s talk about APIs.

The API journey

The acronym “API” is probably the most used technical term when talking about IT systems. Application programming interfaces (APIs) have emerged as a key element of tech modernization at many businesses, no matter the industry. They provide the ability to link systems and data and play a crucial role in making IT systems more responsive and adaptable.

Yet many organizations fail to capture the value they envisioned from APIs. In many cases, API implementations quickly morph into a hard-to-manage mess of redundancies, poor maintenance practices, and limited transparency. Some organizations spend considerable time and resources ripping and replacing these systems, adding APIs in an ad hoc manner without making any real progress.

Examining these difficulties reveals some common themes. One is the disconnect between business goals and API program ownership. The latter usually falls within the realm of IT groups and is treated as a purely technical initiative. Because these efforts are not tied to specific modernization goals, they tend to miss the mark and fail to deliver on their perceived value.

To ultimately realize the full potential of APIs, organizations have to make significant progress along multiple dimensions.

The API journey starts with defining which APIs to build, or more precisely, which data needs to be accessed and by whom. APIs constitute the connective tissue between different systems and can give access to underlying data. Their flexibility means the range of possible APIs can be overwhelming, and small experiments can quickly take on a life of their own, leading to wasted energy and resources.

Making this decision starts with understanding which APIs can enable customer-facing solutions and which can build a sound technical foundation. Within this context, enterprises can start to prioritize API development based on business goals and modernization strategy. For example, a financial services client started with APIs that digitized a key customer journey, then went on to build APIs that simplified the architecture and introduced efficiencies.

Architecting modern API solutions requires a shift in mindset. Traditionally, APIs have been seen as middleware that integrates multiple systems and exchanges data between them. Because of this, most API initiatives were taken up by the IT group, and the resulting API taxonomy, the grouping that describes what these APIs can do, was usually technical and non-intuitive. Leading organizations today define their API taxonomy in a way that creates a common language understood by both business and technical units.

Architecting an API-led approach means layering in process and experience APIs on top of system APIs. While system APIs can access data from multiple systems and abstract that complexity, a process API can consume and orchestrate the data that is exposed and represent common business processes. Then, an experience API can transform data and services in a way that allows their intended audience to consume them; they abstract the underlying data from the complexity of downstream systems.
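To make this layering concrete, here is a minimal TypeScript sketch of the three tiers. The endpoints, types, and function names are hypothetical placeholders, not drawn from any particular platform.

```typescript
// Illustrative only: endpoints, types, and names are hypothetical.

// System APIs: each wraps one back-end system and hides its details.
async function getCustomerRecord(id: string): Promise<{ id: string; name: string }> {
  const res = await fetch(`https://internal.example.com/crm/customers/${id}`);
  return res.json();
}

async function getOrderHistory(id: string): Promise<{ orderId: string; total: number }[]> {
  const res = await fetch(`https://internal.example.com/erp/orders?customer=${id}`);
  return res.json();
}

// Process API: orchestrates system APIs into one business process.
async function getCustomer360(id: string) {
  const [customer, orders] = await Promise.all([
    getCustomerRecord(id),
    getOrderHistory(id),
  ]);
  return { customer, orders };
}

// Experience API: reshapes the result for one audience (here, a mobile app),
// so consumers never touch the complexity of the downstream systems.
export async function getMobileCustomerSummary(id: string) {
  const { customer, orders } = await getCustomer360(id);
  return {
    displayName: customer.name,
    recentOrders: orders.slice(0, 5), // the mobile view only needs the latest few
  };
}
```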

The need for “API Management”

When deploying a growing number of APIs on a large scale, an API management solution starts to make sense. API management is a centralized and unified way to deploy and reuse your APIs, share documentation, and keep your services secure. A typical API management solution will have the following components:

  • API gateway to manage deployment and security
  • API developer portal for discoverability
  • API analytics for monitoring

API management solutions help teams monitor different aspects of APIs: connection consistency, traffic, errors, and security. All monitoring and maintenance within an API’s lifecycle happen within the API management solution. However, there are some things to keep in mind when adopting these solutions and frameworks. An API management solution requires putting thought and resources into the following areas:

  • Security: APIs create a range of new and unique security challenges. There needs to be a security layer to ensure that attackers are unable to access or misuse the exposed systems.
  • Access Control: Strong access control needs to be implemented. The solution should balance back-end security with user experience by leveraging IAM standards such as OAuth (see the sketch after this list).
  • Maintenance: API lifecycle management should be a key component of the solution to ensure optimal system performance.
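To illustrate two of these concerns at the gateway layer, here is a minimal sketch of bearer-token validation and per-request analytics logging. It assumes Express and the jsonwebtoken library purely for illustration; a production gateway would be a dedicated product, not hand-rolled middleware.

```typescript
import express, { Request, Response, NextFunction } from "express";
import jwt from "jsonwebtoken";

const app = express();
const SIGNING_KEY = process.env.JWT_SIGNING_KEY ?? "replace-me"; // hypothetical config

// Access control: reject requests without a valid OAuth-style bearer token (JWT).
function requireAuth(req: Request, res: Response, next: NextFunction) {
  const token = req.headers.authorization?.replace("Bearer ", "");
  if (!token) return res.status(401).json({ error: "missing bearer token" });
  try {
    jwt.verify(token, SIGNING_KEY); // throws if the token is expired or tampered with
    next();
  } catch {
    res.status(401).json({ error: "invalid token" });
  }
}

// Analytics: record traffic and errors for every request passing through.
function logRequest(req: Request, res: Response, next: NextFunction) {
  res.on("finish", () =>
    console.log(`${req.method} ${req.path} -> ${res.statusCode}`)
  );
  next();
}

app.use(logRequest);
app.get("/orders", requireAuth, (_req, res) => {
  res.json([]); // a real gateway would proxy to a back-end system here
});

app.listen(3000);
```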

If it’s not clear by this point: the journey to realizing the full potential and promise of APIs is not an easy one. It is full of hurdles, from security to governance to orchestration and management. Onboarding new partners or integrating new systems into an organization’s ecosystem, which requires building and managing more APIs, takes all these challenges to new heights. And if you’re thinking about real-time data sharing with an ever-growing network of partners, you have a mountain of technical work ahead before you can start realizing value.

APIs have long been considered one of the best approaches to sharing data effectively. But the difficulties we’ve discussed are calling that paradigm into question: the cost and effort involved make moving data this way slow, expensive, and risky.

Modern, data-driven organizations need to rethink the API approach to data sharing. This approach needs to evolve if it’s going to ensure that the data being shared is always consistent, complete and up to date, regardless of the source.

The modern approach

When we talk about connecting distributed systems and providing a single source of truth for a group of participants within a network, one thing comes to mind: Blockchain technology.

While blockchains such as Hyperledger Fabric or Ethereum enable tamper-proof, transactional, multi-party solutions, giving participants in a network access to complete and identical data at all times, they lack the performance, scalability, and integration needed to address real-world, enterprise-grade problems.

This is mainly due to their deployment architecture. Because each participant in the network runs a node on a single machine, the network’s ability to scale storage, bandwidth, and compute power is forever limited by what those machines can provide. Blockchains have a great vision and potential, but their architecture needs to evolve if they’re going to solve enterprise-level challenges.

But what if we could connect blockchains to cloud infrastructure, giving them the computing power they need to be truly useful?

Serverless technology is the missing piece in the blockchain puzzle. Serverless functions, or functions-as-a-service (FaaS), abstract away the underlying infrastructure complexities and shift focus to business logic alone. These functions are usually used as the glue between services in a cloud environment and are a natural fit for event-driven computing. They provide access to effectively unlimited computing resources (if you can pay for them) without the need to manage a single piece of infrastructure. Because they run on demand, resources, and their cost, scale down to zero when not in use. Data can reside in a distributed ledger, and any change to it can automatically trigger the compute power needed to process it; if no requests are coming in, no compute resources are consumed.
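As a rough illustration of this pattern, here is a minimal sketch of a serverless function that reacts to ledger changes, written in the style of an AWS Lambda handler. The event shape is hypothetical; the actual trigger mechanism (a stream, webhook, or integration) depends on the platform.

```typescript
// Hypothetical shape of a ledger-change event delivered to the function.
interface LedgerChangeEvent {
  blockId: string;
  transactions: { key: string; value: unknown }[];
}

// Lambda-style handler: no servers to manage, billed only while running.
export const handler = async (event: LedgerChangeEvent): Promise<void> => {
  for (const tx of event.transactions) {
    // Business logic only: react to each new transaction, e.g. refresh a
    // read model, notify a partner, or kick off a downstream workflow.
    console.log(`block ${event.blockId}: ${tx.key} updated`);
  }
  // When no changes arrive, this function never runs and costs nothing.
};
```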

Tying blockchain to serverless capability is a powerful thing: scalable compute that runs in response to external events, such as changes in the blockchain’s state.

Next-generation blockchain solutions like Vendia provide a fully cloud-native solution with unlimited compute, storage, and networking capacity. This scalability and flexibility are a must when designing a data-sharing solution. They make it possible to store data from diverse sources and share that data among a wide pool of participants without contention or competition for computing resources.

Vendia’s Share platform is an innovative solution for data sharing across organizations and participants. It lets organizations focus only on what matters most: building value-driven applications that meet end users’ demands and expectations. Just imagine putting your energy and resources not towards building and managing the “plumbing” needed to connect disparate systems, but towards building solutions that create immediate value for your organization and end users.

The Vendia solution to data sharing is ideal when a use case has the following characteristics:

  • Disparate Data: The data being shared resides across multiple parties, either within a single organization or across multiple organizations.
  • Highly Transactional Data: There is a high volume of data across these parties requiring the ability to read and/or write in real-time, at scale.
  • Verified Data Lineage: All transactions across all participants require full lineage, auditability, and guaranteed immutability.
  • Data Security and Control: Fine-grained, role-based access control at the record and field level, plus built-in compliance and data governance (a conceptual sketch follows this list).
  • Speed to Outcome: Limited resources available to support development, combined with an accelerated timeline for a POC/MVP.
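To give a feel for what record- and field-level control means in practice, here is a purely conceptual TypeScript sketch. It does not represent Vendia’s actual API; the roles, records, and rules are hypothetical.

```typescript
// Conceptual only: not Vendia's API. Roles, records, and rules are hypothetical.
type Role = "supplier" | "carrier" | "auditor";

interface Shipment {
  id: string;
  origin: string;
  destination: string;
  unitCost: number; // commercially sensitive field
}

// Field-level rules: which fields each role may read.
const visibleFields: Record<Role, (keyof Shipment)[]> = {
  supplier: ["id", "origin", "destination", "unitCost"],
  carrier: ["id", "origin", "destination"], // carriers never see cost
  auditor: ["id", "origin", "destination", "unitCost"],
};

// Return a copy of the record containing only the fields the role may see.
function redactForRole(record: Shipment, role: Role): Partial<Shipment> {
  return Object.fromEntries(
    Object.entries(record).filter(([key]) =>
      visibleFields[role].includes(key as keyof Shipment)
    )
  ) as Partial<Shipment>;
}

// A carrier's view of the same shared record omits unitCost.
const shipment: Shipment = { id: "s-1", origin: "LAX", destination: "JFK", unitCost: 42 };
console.log(redactForRole(shipment, "carrier")); // { id: "s-1", origin: "LAX", destination: "JFK" }
```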

Participants in a business network can now share data externally, in real time and with fine-grained access control, without having to dedicate massive resources to solving and managing API and integration headaches.

In 2022, implementing data-sharing solutions should not be a multi-year endeavor. Technology modernization requires future-looking tools and solutions. Platforms like Vendia are helping our clients explore the full potential of their partner ecosystem and getting them there faster.

Issam Ouchen
Slalom Technology

Enterprise Technology Architect. Passionate about finding innovative solutions to complicated problems.