By 2020, businesses harnessing the power of Big Data should see $430 billion in productivity benefits over competitors not using data, according to the International Institute for Analytics. And today, most companies have the opportunity to bring Big Data to the bottom line. It is therefore not surprising that conversations about Big Data, data analytics and, more recently, cognitive analytics are dominating corporate discussions. Data analytics finds useful information inside Big Data. Insights can be descriptive, as when analyzing the percentage of bank customers who use online banking exclusively. Insights can be predictive, as when forecasting the percentage of banking customers who will be using online banking exclusively 10 years from now. Or insights can be prescriptive, as when recommending whether a bank should expand its online presence or secure commercial space for additional branch offices. Cognitive analytics, a newcomer to the Big Data discussion, draws on diverse dynamic learning tools, such as artificial intelligence and artificial neural networks. The horizon for new opportunity through data monetization is vast, as companies transform how they collect and use information to customers' benefit and, ultimately, profitability. Although data analytics can tell us the “what,” the “whether” and sometimes the “how to,” redefining data monetization as “leveraging data to generate value” informs corporate conversations with a clearer realization of the monetary value within Big Data.
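The descriptive/predictive/prescriptive distinction above can be made concrete with a minimal sketch. The adoption figures below are invented purely for illustration, and the linear-trend forecast and decision rule are toy stand-ins for real analytics, not anything from the article:

```python
# Illustrative only: made-up yearly online-banking adoption rates.
years = [2013, 2014, 2015, 2016, 2017]
online_only_share = [0.18, 0.21, 0.25, 0.28, 0.32]  # fraction of customers

# Descriptive: what share of customers banks online-only today?
current_share = online_only_share[-1]

# Predictive: a naive least-squares linear trend, extrapolated 10 years out.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(online_only_share) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(years, online_only_share)) \
        / sum((x - mean_x) ** 2 for x in years)
forecast_2027 = mean_y + slope * (2027 - mean_x)

# Prescriptive: a simple decision rule layered on the forecast.
recommendation = ("expand online presence" if forecast_2027 > 0.5
                  else "add branch offices")
```

Each step answers a different question: what is happening now, what will likely happen, and what should be done about it.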
According to Forbes, in 2013 the Journal of Business Logistics published a white paper calling for “crucial” research into the possible applications of Big Data within supply chain management. Since then, significant steps have been taken, and many of the concepts now appear to be embraced wholeheartedly. Supply chain management is a field where Big Data and analytics have obvious applications. Until recently, however, businesses were slower to implement Big Data analytics in supply chain management than in other areas of operation such as marketing or manufacturing. Applications for the analysis of unstructured data have already been found in inventory management, forecasting, and transportation logistics. In warehouses, digital cameras are routinely used to monitor stock levels, and the messy, unstructured image data provides alerts when restocking is needed. Forecasting takes this a step further: the same camera data can be fed through machine learning algorithms to teach an intelligent stock management system to predict when a resupply will be needed. Eventually, the theory goes, warehouses and distribution centers will effectively run themselves with very little need for human interaction.
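The prediction step described above can be sketched in a few lines. The stock readings here are hypothetical, standing in for levels a camera system might extract from warehouse images, and the depletion model is a deliberately naive illustration rather than the actual algorithms warehouses use:

```python
# Hypothetical daily stock readings (units on hand), one per day.
stock_by_day = [500, 482, 461, 445, 424, 409]
reorder_threshold = 100  # resupply when stock falls to this level

# Estimate the average daily depletion rate from consecutive readings.
drops = [a - b for a, b in zip(stock_by_day, stock_by_day[1:])]
daily_rate = sum(drops) / len(drops)

# Naive forecast: days until stock reaches the reorder threshold,
# assuming depletion continues at the observed average rate.
days_until_reorder = (stock_by_day[-1] - reorder_threshold) / daily_rate
```

A real system would replace the constant-rate assumption with a learned model that accounts for seasonality and demand spikes, but the shape of the problem, observe levels, estimate consumption, predict the reorder date, is the same.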
Finding the hidden monetary value of Big Data – The Tennessean
The big news in Big Data for the last 12 months has all been about real-time and streaming analytics. Spark, which became a top-level Apache project in 2014, has gained endorsements from every major Big Data player. A new player that’s been stirring up excitement is Apache Flink, a true stream processing engine whose developers recently landed $6 million in financing for their Flink-focused startup, Data Artisans. By 2026, 59 percent of all Big Data spending will be tied to Spark or related streaming analytics as enterprises seek to deploy applications that can make decisions on behalf of individuals. The reasons relate both to speed and simplicity. Hadoop introduced important concepts like moving analytics engines close to the data and incorporating unstructured data into a data lake, but it has developed incrementally into an ecosystem of more than 30 discrete components that can be daunting to coordinate. Spark has added real-time-like features through the Spark Streaming project, but it’s still fundamentally a “micro-batch” architecture for now, meaning that it simulates real-time analytics by processing small volumes of data quickly in batch mode. For most applications Spark is good enough, but true stream processing will demand a combination of Flink and Kafka unless Spark is able to evolve beyond its micro-batch approach by adding per-event streaming.
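The micro-batch versus per-event distinction drawn above can be illustrated without either framework. The plain-Python sketch below is a conceptual stand-in, not Spark or Flink code: one function buffers events and handles them in small groups (Spark Streaming's approach), while the other handles each event individually on arrival (Flink's approach):

```python
events = list(range(10))  # stand-in for an incoming event stream

def micro_batch(stream, batch_size=4):
    """Buffer events and process them a batch at a time (Spark Streaming style)."""
    processed, batch = [], []
    for event in stream:
        batch.append(event)
        if len(batch) == batch_size:
            processed.append(list(batch))  # one small batch handled together
            batch.clear()
    if batch:
        processed.append(list(batch))  # flush the final partial batch
    return processed

def per_event(stream):
    """Handle every event individually as it arrives (Flink style)."""
    return [[event] for event in stream]

batches = micro_batch(events)  # three groups: two of four events, one of two
singles = per_event(events)    # ten single-event units, one per arrival
```

The micro-batch version trades per-event latency (an event waits until its batch fills) for batch-mode throughput, which is why it is "good enough" for most applications but falls short when every event must be acted on the moment it arrives.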
Gizmodo reports that the state of Victoria is working with IT and communications companies to test out Narrowband Internet of Things (NB-IoT) systems in its sewer and water systems. Huawei, Optus and Vodafone are partnering with the state agency South East Water for the trial in urban areas that include Melbourne. NB-IoT is a low-powered, low-cost radio technology that allows thousands of connected devices and infrastructure elements to gather and share data, regardless of location. The three-month trial seeks to give operators access to granular, real-time data to improve the safety, reliability and efficiency of Victoria’s sewer and water infrastructure. The trial will see NB-IoT sensors placed on sewer manhole covers to alert city workers to unauthorized access, reducing the risk of injury and damage to water assets. NB-IoT technology will also be fitted to rainwater tank management systems to monitor storage levels amid efforts to optimize stormwater runoff and rainwater harvesting.
Neil Chandler, CEO of financial services at UK firm Shop Direct, argues that augmented reality and tactile technology will be the next big change in online shopping. He states that virtual reality (VR) and augmented reality (AR) are just one or two years away from changing the way consumers shop online. As an example, he notes that augmented reality would enable consumers to see how a sofa would look in their living room or how they would look in a piece of clothing.