Technology

Combining the best of both worlds


Romcore combines two architectural paradigms: a data processing pipeline that handles real-time data streams (for example, sensor data from Internet of Things applications) and a set of supporting microservices that provide reusable software components. These two technology stacks are tightly integrated, allowing us to flexibly build state-of-the-art software solutions.

A harmonized deployment stack, built on top of Kubernetes, allows us to continuously deploy applications at scale and across a growing range of public and private cloud platforms.
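
As a minimal sketch of what such a deployment can look like in practice (not Romcore's actual setup), the Java snippet below uses the Fabric8 Kubernetes client to declare and apply a Deployment for a hypothetical view-builder service; the image, namespace and replica count are illustrative assumptions.

    // Requires the io.fabric8:kubernetes-client dependency (6.x).
    import io.fabric8.kubernetes.api.model.apps.Deployment;
    import io.fabric8.kubernetes.api.model.apps.DeploymentBuilder;
    import io.fabric8.kubernetes.client.KubernetesClient;
    import io.fabric8.kubernetes.client.KubernetesClientBuilder;

    public class DeployViewBuilder {

        public static void main(String[] args) {
            // Declarative description of the workload: three replicas of a
            // hypothetical view-builder container. All names are illustrative.
            Deployment deployment = new DeploymentBuilder()
                .withNewMetadata()
                    .withName("sensor-view-builder")
                .endMetadata()
                .withNewSpec()
                    .withReplicas(3)
                    .withNewSelector()
                        .addToMatchLabels("app", "sensor-view-builder")
                    .endSelector()
                    .withNewTemplate()
                        .withNewMetadata()
                            .addToLabels("app", "sensor-view-builder")
                        .endMetadata()
                        .withNewSpec()
                            .addNewContainer()
                                .withName("view-builder")
                                .withImage("registry.example.com/sensor-view-builder:1.0.0")
                            .endContainer()
                        .endSpec()
                    .endTemplate()
                .endSpec()
                .build();

            // Apply the Deployment to whatever cluster the local kubeconfig points to.
            try (KubernetesClient client = new KubernetesClientBuilder().build()) {
                client.apps().deployments()
                      .inNamespace("default")
                      .resource(deployment)
                      .create();
            }
        }
    }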

Schema 1

Streaming data processing

Romcore offers a big data platform that processes data from Internet of Things devices and other data sources in a reliable and scalable way. Our kappa architecture is based on Apache Kafka and its Streams and Connect components. It handles both real-time processing and continuous reprocessing of historical data with a single streaming engine.
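
As an illustrative sketch rather than Romcore's actual code, the Kafka Streams application below consumes a raw sensor topic, drops malformed readings and forwards the cleaned stream to a downstream topic; the topic names, serdes and broker address are assumptions.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class SensorCleaningStream {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sensor-cleaning");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Raw readings arrive keyed by device id, with the payload as a plain string.
            KStream<String, String> raw = builder.stream("sensor-readings-raw");

            // Drop empty or null payloads and forward the rest unchanged.
            raw.filter((deviceId, payload) -> payload != null && !payload.isBlank())
               .to("sensor-readings-clean");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();

            // Because Kafka retains the raw topic, the same topology can reprocess
            // history by resetting the application's consumer offsets.
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }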

The Romcore kappa architecture consists of a number of layers. Data is ingested from IoT devices and legacy systems into the Kafka distribution layer. Processing, by both app-specific microservices and view builders, generates processed data. The resulting views, together with the original raw data, are stored and made accessible, e.g., via Elasticsearch.
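
To make the view-builder idea concrete, here is a hedged sketch of a Kafka Streams view builder that maintains a continuously updated count of readings per device in a materialized state store and publishes it to a view topic; all topic and store names are hypothetical. In a setup like the one described, such a view topic could then be indexed into Elasticsearch, for instance via a Kafka Connect sink.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;

    public class ReadingCountViewBuilder {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "reading-count-view");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Build a continuously updated view: number of readings seen per device.
            KTable<String, Long> readingsPerDevice = builder
                .<String, String>stream("sensor-readings-clean")
                .groupByKey()
                .count(Materialized.as("readings-per-device-store"));

            // Publish view changes to an output topic; a downstream consumer such as
            // a Kafka Connect Elasticsearch sink can index this topic for querying.
            readingsPerDevice.toStream()
                .to("readings-per-device", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }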

Schema 2

Microservice architecture

Romcore incorporates a diverse toolkit of software components that allow us to represent and manage the real world, from user and asset management to big data processing and security functions. These established microservices and standard libraries allow us to flexibly and rapidly build applications across a wide diversity of application domains.

Schema 3 first

Basic components

Basic components provide a conceptual representation of the real-world setup, including users, assets and IoT devices, and enable the exchange of data between the platform and external systems.
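
As a purely illustrative sketch of such a conceptual representation (the types and fields are assumptions, not Romcore's actual data model), the Java records below model users, the assets they own and the IoT devices attached to those assets.

    import java.util.List;

    public class DomainModel {

        // Hypothetical domain model for illustration only: users own assets,
        // and each asset carries the IoT devices that report data about it.
        public record Device(String deviceId, String type, String firmwareVersion) {}

        public record Asset(String assetId, String name, List<Device> devices) {}

        public record User(String userId, String email, List<Asset> assets) {}

        public static void main(String[] args) {
            Device sensor = new Device("dev-001", "temperature-sensor", "1.4.2");
            Asset pump = new Asset("asset-042", "Cooling pump 42", List.of(sensor));
            User operator = new User("user-007", "operator@example.com", List.of(pump));
            System.out.println(operator);
        }
    }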

Schema 3 second

Supporting components

Supporting components extend the functionality of the basic components. They include workflow management, planning, geo services, invoicing, reporting, notification and conversation services.

Schema 3 third

Additional components

Additional components increase the added value of the system by integrating big data analysis tools (e.g., Apache Spark for large-scale data processing) and enablers for enhanced data security.
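
As a hedged sketch of the kind of large-scale analysis such an integration enables (the storage paths, column names and aggregation are assumptions), the Spark job below computes a daily average reading per device from archived raw sensor data.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.avg;
    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.to_date;

    public class DailyDeviceAverages {

        public static void main(String[] args) {
            // The cluster master/URL is supplied via spark-submit or cluster config.
            SparkSession spark = SparkSession.builder()
                .appName("daily-device-averages")
                .getOrCreate();

            // Archived raw readings, e.g. exported from Kafka to object storage.
            // The path and the columns (deviceId, timestamp, value) are assumptions.
            Dataset<Row> readings = spark.read().parquet("s3a://data-lake/sensor-readings/");

            Dataset<Row> dailyAverages = readings
                .withColumn("day", to_date(col("timestamp")))
                .groupBy(col("deviceId"), col("day"))
                .agg(avg(col("value")).alias("avgValue"));

            dailyAverages.write().mode("overwrite").parquet("s3a://data-lake/daily-device-averages/");

            spark.stop();
        }
    }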

Get in touch