
Data Streaming

[Figure: The Lunar Eclipse, October 2014. 15-minute progression over the San Francisco Bay Area, California (Jeff M. Wang)]

- What is Streaming Data?

Streaming data is data that is continuously generated by many different sources, typically at high speed and in the context of big data. Such data should be processed incrementally, using stream processing techniques, without access to the complete data set. In addition, concept drift may occur, meaning that the statistical properties of the stream can change over time.
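
To make the incremental idea concrete, the sketch below (illustrative Python, not tied to any particular streaming framework) maintains one-pass running statistics with Welford's algorithm and flags a possible concept drift when a recent sliding window diverges from the long-run mean. The simulated source, window size and drift threshold are all assumptions for the example.

```python
# A minimal sketch of incremental (one-pass) stream processing.
# Names, thresholds and the simulated source are illustrative assumptions.
from collections import deque
import random

class RunningStats:
    """Welford's online algorithm: update mean/variance one element at a time,
    without ever holding the full stream in memory."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / self.n if self.n > 1 else 0.0

def simulated_stream():
    """Stand-in for a real source; the distribution shifts halfway through
    to mimic concept drift."""
    for i in range(2000):
        yield random.gauss(0.0, 1.0) if i < 1000 else random.gauss(3.0, 1.0)

stats = RunningStats()
recent = deque(maxlen=100)          # sliding window for a crude drift check

for value in simulated_stream():
    stats.update(value)             # incremental update, O(1) per element
    recent.append(value)
    window_mean = sum(recent) / len(recent)
    if len(recent) == recent.maxlen and abs(window_mean - stats.mean) > 1.0:
        print(f"possible concept drift: window mean {window_mean:.2f} "
              f"vs long-run mean {stats.mean:.2f}")
        break
```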

 

- Processing Stream Data in Real-Time for Business Success

Data streaming is the transfer of data at a steady, high-speed rate sufficient to support applications such as high-definition television (HDTV) or the continuous backup of a computer's data flow to a storage medium. It requires a combination of sufficient bandwidth and, for real-time human perception of the data, the ability to ensure that enough data is continuously received without any noticeable lag.

In today’s digitally driven world, processing streaming data in real time is a requirement for business success. To gain a competitive advantage, organisations must enable application developers to combine the power of complex event processing (CEP) with real-time analytics on streaming data. Real-time decisions powered by machine learning give organisations sharper insights and the tools to succeed and thrive in the real-time economy.

To deliver applications that meet ever-evolving user-experience demands, businesses are moving from post-event, reconciliatory processing to in-event and in-transaction decisioning on streaming data; a small sketch of this style of decisioning follows below. As real-time use cases become the norm in verticals such as telecommunications, financial services, IoT, gaming, media and eCommerce, developers need to adopt new approaches that let them make the low-latency, complex decisions that drive business actions without compromising the performance and scale critical to the modern enterprise.
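
The toy example below illustrates the in-event, CEP-style pattern: a decision is made on each event as it arrives rather than reconciled after the fact. The event fields, the "three failed logins in 60 seconds" rule and the block_account() action are hypothetical, chosen only to show the shape of such a rule.

```python
# A toy complex-event-processing sketch: decide on each event in-flight.
# Event schema, thresholds and block_account() are illustrative assumptions.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_FAILURES = 3
recent_failures = defaultdict(deque)   # user_id -> timestamps of failed logins

def block_account(user_id):
    # Placeholder for the business action a real system would take.
    print(f"blocking {user_id}: too many failed logins in {WINDOW_SECONDS}s")

def on_event(event):
    """Called once per incoming event; all state fits in memory, so the
    decision adds only microseconds of latency to the stream."""
    if event["type"] != "login_failed":
        return
    now = event["ts"]
    window = recent_failures[event["user"]]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                # drop events outside the time window
    if len(window) >= MAX_FAILURES:
        block_account(event["user"])
        window.clear()

# Simulated burst of events from a stream consumer.
start = time.time()
for offset in (0, 5, 12, 20):
    on_event({"type": "login_failed", "user": "alice", "ts": start + offset})
```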

 

- Smart Streaming in the Age of Wireless 5G

Wireless 5G networks will only increase the data volumes and speed requirements that are already putting pressure on traditional data architectures. Organisations need to ingest this unprecedented increase in data traffic while also driving actions by making intelligent, dynamic decisions across multiple data streams. Although current data-streaming architectures are usually sufficient to act as processing pipelines, they do not meet the needs of mission-critical applications that demand low latency and responsive, multi-step decisions. In addition, with a projected density of connected things of around one million per square kilometre and prescribed latencies in the single-digit milliseconds, data and processing will be decentralised across many edge data centres, rather than concentrated in a few central hub data centres.
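
One common way such a decentralised topology is described is edge-side pre-aggregation: each edge site summarises local readings and forwards only compact summaries to the central hub. The sketch below assumes that topology; the site names, window size and send_to_hub() call are hypothetical stand-ins.

```python
# A minimal sketch of edge-side pre-aggregation under an assumed topology of
# many edge sites and one central hub. All names here are illustrative.
import random

def send_to_hub(site, summary):
    # Stand-in for a network call (e.g. publishing to a central message broker).
    print(f"{site}: {summary}")

def edge_loop(site, readings, window_size=1000):
    """Aggregate raw readings locally in fixed-size windows; the central hub
    only ever receives one small summary per window, not individual events."""
    count, total, peak = 0, 0.0, float("-inf")
    for value in readings:
        count += 1
        total += value
        peak = max(peak, value)
        if count == window_size:
            send_to_hub(site, {"count": count, "mean": total / count, "max": peak})
            count, total, peak = 0, 0.0, float("-inf")

# Each edge site would run its own loop close to the devices it serves.
edge_loop("edge-site-1", (random.gauss(20.0, 2.0) for _ in range(5000)))
```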


 


[More to come ...]

 

 

 