Your daily digest of “All Things Big Data” gathered, collected and researched by your very own 10Fold Big Data Practice team.
InfoWorld highlights big-data vendor Talend, which is following in the footsteps of two of the hottest open-source technologies in big data: Hadoop and Apache Spark. Talend provides integration technologies for big data, cloud, and applications based on the open-source software model, and as a result the company has placed a significant bet of its own on Hadoop, Spark, and open source in general.
With modern humanity continuing to generate large sets of data, machine learning has become a relevant tool for helping researchers interpret all of this information. Machine learning has proven useful for analyzing complex networks, yet some problems remain difficult even for supercomputers. Now, researchers at MIT, the University of Waterloo, and the University of Southern California have developed a new approach that would use quantum computers to streamline these problems. They believe this approach, which uses algebraic topology, will help reduce the impact of distortions that arise.
Why open source is the ‘new normal’ for big data – InfoWorld
A new quantum approach to big data – MIT News
Cybercom: OPM Hack Highlights China Big Data Spying – The Washington Free Beacon
Splice Machine has secured $9M in Series C funding to continue its efforts to connect Hadoop with relational database management system (RDBMS) technologies. Splice Machine aims to run an RDBMS on top of Hadoop and Spark, delivering better performance than traditional RDBMSes, such as Oracle and MySQL, at a lower cost.
Qubole, a maker of cloud-scale data processing software, raised $30 million to further its mission of simplifying Hadoop by allowing users to manipulate information in their cloud-based analytics clusters without writing any code. The additions aim to make Qubole's Hadoop distribution more viable for sensitive workloads, such as healthcare information and financial records.
Splice Machine bags $9m to fund RDBMS on Hadoop and Spark – The Register
What you missed in Big Data: Hadoop is the star of the show – SiliconAngle
Because IoT is all about connectivity, many alternatives have emerged for getting data from "here to there." One new option is a breed of low-power, long-range wireless networks, known as LPWANs, now being used by several companies. These networks are designed to work at distances measured in kilometers, with power consumption low enough to allow years of operation on battery power. Another option is LTE-M, which is designed to work with equipment already installed in LTE networks. Although neither technology is a fool-proof solution for IoT devices, both provide newer options for carriers and companies to choose from.
As most organizations now embrace the IoT, they still need to process and analyze the resulting large quantities of data in real time, which can increase security, capacity, and analytics challenges. One way to address these would be to put automated, intelligent analytics at the edge, near where the data is generated, to reduce the volume of data and the networking communications overhead. The questions of what data can be collected, what data should be collected, and how long the data should be retained still apply. The difference is the physical point at which the data should be analyzed and acted upon, which depends on the use case and on what an organization is trying to achieve.
Edge Analytics An Antidote To IoT Data Deluge – Informationweek
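The edge-analytics idea above, filtering and summarizing data near where it is generated so only a fraction of it crosses the network, can be illustrated with a minimal sketch. All names and thresholds here are hypothetical, not from any vendor's product:

```python
# Hypothetical edge-side pre-processing sketch: collapse normal sensor
# readings into one summary record and forward only anomalies in full,
# reducing the data volume sent upstream.

def summarize_readings(readings, threshold=75.0):
    """Split readings into anomalies (sent whole) and a compact summary."""
    anomalies = [r for r in readings if r["value"] > threshold]
    normal = [r["value"] for r in readings if r["value"] <= threshold]
    summary = {
        "count": len(normal),
        "avg": sum(normal) / len(normal) if normal else None,
    }
    return {"summary": summary, "anomalies": anomalies}

readings = [
    {"sensor": "temp-1", "value": 71.2},
    {"sensor": "temp-1", "value": 70.8},
    {"sensor": "temp-1", "value": 93.5},  # anomaly worth sending in full
]
payload = summarize_readings(readings)
# payload carries one summary record plus the single anomalous reading,
# instead of all three raw readings.
```

In this toy example, three raw readings become one aggregate plus one anomaly; at real sensor volumes, the same pattern is what cuts the networking overhead the article describes.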
Salesforce saw growth like never before in 2015, with an estimated 1 million jobs projected to be directly enabled by the Salesforce ecosystem by 2018. That being said, CloudCraze, Salesforce's only enterprise-class eCommerce partner, is poised to lead the forefront of cloud innovation in 2016. CEO Chris Dalton provides insight into how he believes eCommerce will be affected in the future through the use of CloudCraze.
According to a recent report from the OpenStack Foundation, "Accelerating NFV Delivery with OpenStack," NFV is changing the game for telcos: it helps them quickly develop and deploy new applications, reduces their reliance on proprietary hardware from traditional network suppliers, and eases the strain on their data centers. Although adoption of NFV remains in its infancy, it is projected to grow rapidly by the end of the decade.
More telcos turning to NFV to cut costs and provision services – Computer Weekly