To better understand the challenges of data sharing, governance and reuse across multi-stakeholder data value chains, the REACH project conducted interviews with 15 of its Round 2 data startup pioneers. Our findings are summarised below.

Trends & Insights

Many sectors and industries are experiencing radical change in consumer and provider behaviour (for example, in the energy sector, consumers are becoming prosumers as a result of novel data-driven solutions and exploitation models). Furthermore, a growing number of programmes focus on educating people in data analytics and technology development, radically influencing a mindset shift towards data usage.

The startup ecosystem, to an extent, fears that the big players will impose their solutions, keeping the barrier to entry high and the room for innovation tight.

Edge computing, 5G and seamless, real-time access to data augment the capacity of legacy systems, while the accessibility and availability of data make data value chains an increasingly realistic possibility. Various sectors and industries are also experiencing an explosion of data sources and data lakes, and we are seeing unprecedented efforts put towards gathering data, which in turn continues to nurture the mindset around data-driven business intelligence.

On the hardware side, the availability of IoT sensors has increased tenfold over the last 10 years, while cloud computation is constantly becoming cheaper.

EU regulation on data sharing has established a fertile environment for sharing data (defined standards, processes and obligations for data sharing); however, it must be said that it enforces a particular mindset towards collecting and processing data that both limits and nurtures data innovation capacity. Finally, the development and implementation of novel technologies nowadays goes hand in hand with careful consideration of cybersecurity.

Macro and socio-political trends (the energy crisis, COVID-19, geopolitical events) also force companies to accelerate digital transformation and cloud integration, disrupt hardware supply chains, and influence the development and spread of particular innovations.

Signals of disruption

The traditional way of doing business in sectors such as energy and building management was for corporations to provide the hardware and small integrators to do the installation work. The result was siloed, fragmented and slowly deployed solutions. The change we see today is towards interoperable solutions built on open hardware. Applications are no longer necessarily developed by the same provider; instead, various providers operate interoperably in this open space. The data in the building is unlocked, becoming available for everyone to use, merge and create value from, so that anyone can develop applications on top of it. On top of that, changes in the exploitation business model and in stakeholder relationships enable the newly interconnected stakeholders, in this case the consumers, to become prosumers.

Other industries showing signs of disruption from leveraging AI and ML are healthcare and finance, where particular progress has been made in using data science and data value chains to establish fraud detection and cost estimation models.

On the hardware level, while ever-cheaper cloud infrastructure and edge computing currently run the game, curiosity about the potential of commercial quantum computing lingers in the air.

We have also noticed a particular interest among some startups in exploring and implementing federated learning within data spaces, to establish collective sharing of data, learning from it, and leveraging the resulting insights.
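The appeal of federated learning in this setting is that participants keep their raw data private and share only model parameters. As a minimal, purely illustrative sketch (not code from any REACH pioneer), federated averaging for a one-parameter linear model could look like this, where each "client" holds its own (x, y) measurements:

```python
def local_update(w, data, lr=0.05, steps=10):
    # data: list of (x, y) pairs held privately by one participant;
    # a few steps of gradient descent on the squared error of y = w * x
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(w, clients, rounds=20):
    # each round: every client trains locally on its own data,
    # then the coordinator averages the resulting weights
    # (weighted by dataset size) -- no raw data ever leaves a client
    for _ in range(rounds):
        local = [local_update(w, data) for data in clients]
        sizes = [len(data) for data in clients]
        w = sum(wi * n for wi, n in zip(local, sizes)) / sum(sizes)
    return w

# three participants whose private data roughly follows y = 3 * x
clients = [
    [(1.0, 3.1), (2.0, 6.0)],
    [(1.5, 4.4), (3.0, 9.2)],
    [(0.5, 1.4), (2.5, 7.6)],
]
w = fed_avg(0.0, clients)  # converges to roughly 3.0
```

In a real data space, the same pattern would apply to far larger models, with secure aggregation and governance rules layered on top.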


From the startups’ perspective, the main barriers they see to participating in Data Value Chains are:

Technical Aspects and Data

  • The lack of a common data model (standardization)
  • Lack of semantic data interoperability
  • The lack of the right, powerful infrastructure for big data deployment
  • Processing capability and performance (e.g. high-latency data)
  • Legacy systems
  • Poor data quality
  • Data management in the energy sector is old-fashioned. It is not easy to propose something totally new, as incumbents are also conservative, so changing their minds takes a lot of time.


Mindset and Trust

  • The lack of trust and transparency between stakeholders across the value chain and the wider ecosystem, and their unwillingness to share data (e.g. the belief that data is their IP / proprietary)
  • Some sectors still do not fully trust the technology. Currently some of the work is done manually and more often than not fails to collect all the data. What is missing is a mindset open to AI helping humans perform their work.
  • A Data Owner’s propensity to engage in data sharing agreements is blocked by internal legal and security concerns
  • Many managers are not familiar with novel technologies, are not aware of their data and its potential, and insecurity about implementation arises from this. Compounding this, most potential synergies are assessed on the manager’s gut feeling.
  • There is great fear of breaking what already works well


Business and Organisation

  • Lack of clear business models that enable joint multi-stakeholder collaboration
  • Corporates’ conservative internal data management regulations
  • Data is siloed or invisible within the company, so people working in these companies do not know they hold valuable data to share, and different departments are prevented from accessing it
  • Lobbies and big players steer the industry and the direction in which both innovation and regulation head
  • The cost of storing and processing data is still high for some companies
  • Real-time data analytics present high value to the customer, which in turn presents barriers to startups (when competing with Google, for example)
  • Early-stage investors who back ground-breaking, promising tech are afraid of future regulatory developments


Legal and Regulation

  • Occasionally, legal considerations prevent data from being shared, while reporting can be complicated in some sectors
  • Occasionally, published regulations are not fully in tune with technological development and progress, and the strict playground they impose tends to limit the ability of AI and ML to fully deliver impact
  • GDPR compliance is complex for some media companies generating data


Skills

  • Poor data analytics / processing capabilities of the Data Scientists and business partners involved

Opportunities for growth

  • Particularly in RegTech, we see a struggle in communication between regulators and regulatees, especially those focused on finance and energy markets. Aspects such as regulation provisioning and compliance checking are becoming extremely important, and the ability to implement these easily leaves room for new opportunities
  • The lack of multi-stakeholder data value chains is itself an opportunity for growth, as no one else is doing it
  • Data summarization provided by humans and machines, powered by AI and ML (with continual fine-tuning of the model), provides room for opportunity
  • Opportunities for early warning system and statistical data analytics solutions in water utilities
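The early-warning opportunity in water utilities hinges on fairly simple statistics applied to sensor streams. As an illustrative sketch (the function, data and threshold are assumptions, not from the interviews), flagging a reading that deviates strongly from its recent baseline could look like:

```python
import statistics

def early_warning(history, reading, threshold=3.0):
    """Flag `reading` if it lies more than `threshold` standard
    deviations from the mean of the recent `history` window."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return reading != mean
    return abs(reading - mean) / stdev > threshold

# hourly flow readings (m3/h) from one hypothetical sensor;
# a sudden spike like 120.0 could indicate a burst pipe
baseline = [50.2, 49.8, 50.5, 50.1, 49.9, 50.3, 50.0, 50.4]
normal = early_warning(baseline, 50.6)    # within normal fluctuation
anomaly = early_warning(baseline, 120.0)  # anomalous spike
```

Production systems would of course layer seasonality handling, multi-sensor correlation and alert routing on top of such a check, but the core statistical idea is this small.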