
More data, more variation in data, greater and completely different resolutions of data, scheduled or event-triggered data, and the need to process large amounts of data in as near real-time as possible: these are the real-life data challenges that utilities across the globe are facing as they optimize their grids and fulfill regulatory requirements with all kinds of new data sources and decision support systems.

Greenbird’s Utilihive platform is designed to handle big data. And plenty of it!

Utilihive enables real-time data flows within Advanced Metering Infrastructure (AMI), Smart Grid, Distributed Energy Resource Management (DERM) and other areas such as our Utilihive Datalake.

Performance and scalability are key.

Here are some examples, both from test bench scenarios and real customer cases.

Example 1: Fast Processing of Smart Metering Data

Ideally, Advanced Metering Infrastructure moves data quickly and securely from multiple Headend Systems (HES) to a Meter Data Management (MDM) system. Sometimes, data critical for grid operations, such as outage events, needs to go into an ADMS or other operational solutions. Transporting and transforming the data into the required formats is a core task for our Utilihive platform.
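To make the transformation step concrete, here is a minimal, hypothetical sketch of normalizing a raw HES reading into an MDM-style record. The field names (`deviceId`, `obisCode`, `meterId`, and so on) are illustrative assumptions, not Utilihive’s actual schema:

```python
from datetime import datetime, timezone

def to_mdm_record(hes_payload: dict) -> dict:
    """Map a raw headend-system reading to a normalized meter-data record."""
    return {
        "meterId": hes_payload["deviceId"],
        "register": hes_payload["obisCode"],
        # Headend systems often ship epoch timestamps; normalize to ISO 8601 UTC.
        "timestamp": datetime.fromtimestamp(
            hes_payload["epochSeconds"], tz=timezone.utc
        ).isoformat(),
        "value": float(hes_payload["reading"]),
        "unit": hes_payload.get("unit", "kWh"),
    }

raw = {
    "deviceId": "MTR-001",
    "obisCode": "1.8.0",
    "epochSeconds": 1700000000,
    "reading": "1234.5",
}
print(to_mdm_record(raw))
```

In practice an integration platform applies a mapping like this to every message in the stream, so the downstream MDM or ADMS only ever sees one consistent format regardless of which headend vendor produced the reading.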

Here’s how this works in practice:

  • One of our customers is currently in the middle of a multi-million smart meter roll-out.
  • They asked us to perform a scalability test to demonstrate Utilihive’s data handling with messages from 9 million meters.
  • For this customer, Utilihive runs in a private data center and collects 15-minute values from six registers for every smart meter.
  • We used the same metrics in the scalability test, simulating a two-hour data batch with 432 million meter values.

Utilihive processed, transformed, and handed off all data in under 15 minutes.
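A quick back-of-envelope check shows where the 432 million figure comes from and what sustained throughput the result implies:

```python
# Back-of-envelope check of the test numbers quoted above.
meters = 9_000_000
registers = 6
intervals = 2 * 60 // 15     # number of 15-minute values in a two-hour batch

values = meters * registers * intervals
print(values)                # 432000000 -- matches the simulated batch size

# Finishing "in under 15 minutes" implies a sustained rate of at least:
throughput = values / (15 * 60)
print(int(throughput))       # 480000 meter values per second
```

In other words, the platform sustained roughly half a million meter values per second during the test.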

For full transparency:

  • The test for 9 million meters was carried out in an on-premise environment with 10 nodes (8 CPU cores and 96 GB RAM each).
  • The hardware proved to be oversized, as CPU utilization never went above 40%.
  • Memory utilization didn’t even go significantly above 10%.

Example 2: Processing Data From Multiple Power Grid Sources

  • Another customer uses Utilihive Datalake to store and analyze data from the grid.
  • In total, we integrated and provisioned data from different sources, amounting to roughly 70 billion readings from substations all the way down to household smart meters.
  • Our customer had previously developed a reporting system using queries on this data. Before they implemented Utilihive, this report would take them around three days to compute and finish.

When they implemented the same report using Utilihive Datalake, it was generated in just under 10 minutes.

If you find that hard to believe, so did our client. They told us they had re-run the report several times because they thought the results were unbelievable.


How Can Utilihive Achieve Results Like These?

Utilihive is different from other integration platforms or ESBs. How? It’s built as a network of reactive microservices (a service mesh), following the actor model for highly concurrent applications and the principles of event-driven architecture. This allows for high performance and dynamic scalability, and ensures high availability by design. In addition, we can typically compress data to below 5% of the original data size, further boosting data processing performance.
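Greenbird doesn’t disclose its actual codec, but the claim is plausible for meter data: cumulative register values at fixed intervals change slowly and regularly, so delta encoding followed by a general-purpose compressor shrinks them dramatically. Here is a toy illustration under that assumption (not Utilihive’s real compression scheme):

```python
# Toy illustration: delta-encode a regular time series of cumulative meter
# readings, then compress the deltas with zlib, and compare against the raw
# fixed-width encoding.
import struct
import zlib

# Synthetic cumulative readings at 15-minute intervals (watt-hours),
# growing by a steady 250 Wh per interval.
readings = [1_000_000 + i * 250 for i in range(10_000)]

raw = struct.pack(f"<{len(readings)}q", *readings)  # 8 bytes per value

# Delta encoding: store the first value, then only the differences.
deltas = [readings[0]] + [b - a for a, b in zip(readings, readings[1:])]
compressed = zlib.compress(struct.pack(f"<{len(deltas)}q", *deltas), level=9)

ratio = len(compressed) / len(raw)
print(f"{ratio:.1%}")  # well under 5% for this highly regular series
```

Real consumption data is noisier than this synthetic series, so real-world ratios vary, but the principle is the same: the more predictable the signal, the closer you get to storing only the surprises.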

In the future, utilities will need to handle growing quantities of data from sensors, meters, local producers and much more. We say: Bring it on!


