Your daily digest of “All Things Big Data” gathered, collected and researched by your very own 10Fold Big Data Practice team.
With the Star Wars craze of the last year, TechCrunch has taken a completely different approach to fan theories surrounding the series, asking: how much data would they need to manage in the Death Star? An estimated 1.7 million military personnel (stormtroopers and their commanders, trash compactor operators, etc.) and 400,000 droids were on board the Death Star, which is close to the population of Philadelphia, if you don't count the droids. The amount of data created each year by all the people here on earth is growing exponentially and is expected to double each year. In 2012 alone, 2.8 zettabytes of data were collected; by 2020, that number is expected to have increased to 40 zettabytes. That's 5,200 GB of data for every person on earth. Using that estimate, TechCrunch figures that the 1.7 million personnel on the Death Star would generate 8.84 exabytes of data per year. If you prefer to express data in factors of 1024 bytes, the number is 8.63 exabytes (or "exbibytes"). Of course, that assumes residents of the Death Star generate only as much data as a human living in the year 2020, and the galaxy far, far away is likely far more advanced than that.
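TechCrunch's headline number is straightforward back-of-the-envelope arithmetic; a minimal sketch, using only the figures quoted above (5,200 GB per person per year and 1.7 million personnel):

```python
# Back-of-the-envelope check of TechCrunch's Death Star estimate.
# Inputs come straight from the article: 1.7 million personnel and
# the projected 5,200 GB of data generated per person per year by 2020.
personnel = 1_700_000
gb_per_person_per_year = 5_200

total_gb = personnel * gb_per_person_per_year   # total GB generated per year
exabytes = total_gb / 1e9                       # decimal units: 1 EB = 1e9 GB

print(f"{exabytes:.2f} EB per year")            # prints "8.84 EB per year"
```

The binary-prefix figure in the article depends on where in the chain you switch from powers of 1000 to powers of 1024, which is why the decimal version is the cleaner check.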
Forbes reported on Forrester's TechRadar methodology, which evaluates 10 Big Data technologies the research firm believes are poised for "significant success." Each technology is placed in a specific maturity phase, from creation to decline, based on the level of development of its technology ecosystem. The first 8 technologies on the list are considered to be in the Growth stage and the last 2 in the Survival stage. Forrester also estimates how long each technology will take to reach the next stage: predictive analytics is the only one with a ">10 years" designation, expected to "deliver high business value in late Growth through Equilibrium phase for a long time." Technologies #2 to #8 are all expected to reach the next phase in 3 to 5 years, and the last 2 are expected to move from the Survival to the Growth phase in 1-3 years.
UK and EU policy-makers have urged governments to raise their data capture and analytics game to the level demanded by the digital economy. Charlie Bean, a former deputy governor of the Bank of England, announced the findings of his report into the state of UK economic statistics, under the auspices of the Cabinet Office and the Treasury, on March 11. Meanwhile, the European Parliament debated and passed a resolution on March 10 that urged the European Commission to boost a “data-driven economy” in the European Union. Bean’s report – Independent review of UK economic statistics – commissioned in 2015 by the chancellor George Osborne and the minister for the Cabinet Office Matt Hancock, found that British government statistics need to be recast to capture the economic data characteristics of a digital economy.
Big Data and the Death Star – TechCrunch
Top 10 Hot Big Data Technologies – Forbes
UK government and EU Parliament step up big data analytics policy push – ComputerWeekly
Market Research Store has released a new market research report, "Global Hadoop-as-a-Service (HaaS) Market Size, Share, Trends, Demand, Analysis, Research, Report, Segmentation and Forecast, 2013 – 2020," to add to its collection of research reports. The report covers data storage in the cloud and its analysis through Hadoop, without the need to install any infrastructure on premises. The HaaS market saw tremendous growth in 2013, doubling from its 2012 size, and it is expected to keep growing over the next 7 years, expanding into the end-user industries of conventional Hadoop.
Learn Details of the Global Hadoop-as-a-Service (HaaS) Market 2016 Analysis and Growth to 2020 – WhaTech
Forbes examines how key performance indicators, used in conjunction with real-time IoT data, create opportunities for companies to provide new or improved services that are transforming their industries. Examples include risk reduction, especially when trading commodities like energy, precious metals, and livestock, and in the mining and oil industries; new pricing models such as tiered services from basic to premium, which benefit both consumers and providers; and flexible pricing.
ZDNet examines a new crop of services aimed at helping hardware developers connect with experts and suppliers, including HWTrek, which bills itself as a one-stop shop for IoT hardware innovators working to manage their product development, connect with manufacturing and supply chain industry experts, and bring their connected device projects to market. The IoT market is expected to reach $1.7 trillion in 2020, up from $655.8 billion in 2014, according to IDC. Meanwhile, the electronics manufacturing services (EMS) industry, including original design manufacturer (ODM) services, should expect revenues of $505 billion in 2019. The source of innovation in these industries is shifting to small and medium-sized businesses, due to the obstacles to collaboration in the traditional supply chain model.
How IoT Changes Pricing Models – Forbes
Collaboration Tools for IoT devs – ZDNet
Amid the peak of SXSW, Austin, TX has been pinned as a "laboratory for [IIoT 5G] innovation." RCR Wireless has partnered with leading technologists and industrial Internet of Things and 5G leaders to create a 10-episode, immersive documentary series on the "technologies, vertical markets, policies and investments driving what Goldman Sachs and others have termed the next mega trend, a 4th Industrial Revolution that will make the steam-driven transformation of the 1800s look like a blip on the economic radar screen." In the series, Austin is highlighted for its long history of tech innovation and the established ecosystem required for IIoT 5G innovation: AT&T recently selected the city for its 5G trials and Spectrum Lab; the University of Texas at Austin hosts the world-renowned Wireless Networking and Communications group; and Google continues to deploy its Google Fiber service and test its self-driving vehicles there.
Austin, Texas: Where IIoT 5G technology sparks fly – RCR Wireless
According to a new study by IBM and Forbes Insights, cognitive computing will help companies that have experienced the effects of a natural disaster to get back up and running much faster than they were previously able to. IBM provides three ways that cognitive computing can help keep businesses afloat in the event of a natural disaster: predicting/avoiding damages from disasters, analyzing best practices of other companies previously affected by damage, and integrating the cognitive agent into technical support to help businesses get systems back online when there are technical issues.
How Cognitive Computing can get businesses up and running faster after disasters – Forbes
Self Service and Enterprise
Datawatch Corporation announced today that it has teamed with IBM to deliver better and faster data access and self-service data preparation to IBM Watson Analytics and IBM Cognos Analytics users. As part of this agreement, IBM will resell Datawatch Monarch, Datawatch’s market-leading self-service data preparation solution, which enables business analysts to rapidly access, manipulate and blend data from the widest variety of sources.
Datawatch Brings Powerful Self-Service Data Preparation Capabilities to IBM Watson Analytics and IBM Cognos Analytics Users – EconoTimes
Software Defined Networks
With the growing certainty that cybersecurity vulnerabilities will increase over the next couple of years, Alternative Global Networks (AGNs) is creating a new, more secure internet that will dramatically improve cyber resilience while reducing expenditures on cybersecurity. AGN benefits can include everything software-defined networking (SDN) aims to introduce, such as cost reduction, software-defined packet forwarding, and central management, but on a global scale. One of the most important benefits will be simplified virtual management: virtualization in networking, like virtualization in computing, will completely revolutionize the existing coupling between hardware and software. This will also simplify the implementation of security tools.
Building a Brand New Internet – TechCrunch
Network Function Virtualization
Ericsson and NEC both came out as vendor partners for NTT DoCoMo’s network functions virtualization plans, including the telecom giant’s recently announced multi-vendor NFV platform. Ericsson said its platform is based on the Open Platform for NFV network architecture and was used for NTT’s commercial service launch. “To maximize benefit of NFV, DoCoMo expects to virtualize many other key components of its mobile network, aiming to eventually establish a fully virtualized network,” explained Seizo Onoe, EVP and CTO at NTT. “I’m convinced that our multi-vendor NFV is the first step toward our goal.”
Ericsson and NEC tout their support for NTT DoCoMo’s multivendor NFV deployment – RCR Wireless