Editor's Picks
Nvidia demonstrates self-driving car that learned entirely from watching humans
Google Cloud chief: Yes, we're really serious about the enterprise
Unicorn sighting: Apttus raises $88M at $1.3B valuation, launches AI called Max
New Apache project Spot taps machine learning to sniff out cyber threats
Tech giants form partnership to promote ethical AI development
Top Stories
New York minute: The blink-and-you'll-miss-it pace of change in data monetization | #BigDataNYC
Starting the ignition for cloud-powered car dealers | #splunkconf16
Collaborating to drive data cataloging | #BigDataNYC
Is your Big Data strategy a $15 million Excel download? | #BigDataNYC
Nvidia demonstrates self-driving car that learned entirely from watching humans
Creating new tools to bring together the data science community | #BigDataNYC
Cyberbit and ETA to develop cybersecurity training range
Shape Security raises $40M to fight cyber attacks with machine learning
Why should Google Analytics get all the credit? Tracing the path to purchase with martech | #BigDataNYC
Taking action on cluttered data lakes | #BigDataNYC
Premium Research
Industrial IoT, the largest segment of the Internet of Things (IoT) and the one with the highest potential value, will require deep integration between modern IT (information technology) and OT (operations technology). For modern IT technologies to be truly extensible to OT, a hybrid cloud approach is needed, with the large majority of data and processing residing at the so-called "edge." Architectures and software written by industrialists for industrialists, such as GE Predix, are showing how that can work.
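A minimal sketch of that edge-first pattern, in Python; the window size, alert threshold, and send_to_cloud callback are illustrative assumptions, not part of GE Predix or any specific OT stack:

```python
import statistics

# Hypothetical edge-gateway loop: process raw OT sensor readings locally
# and forward only compact summaries and alerts, so the bulk of the data
# and processing stays at the edge.

WINDOW = 60    # readings per aggregation window (assumed 1 Hz feed)
ALERT_Z = 3.0  # z-score beyond which a window is flagged for escalation

def summarize(window):
    """Collapse a window of raw readings into the aggregate the cloud needs."""
    mean = statistics.fmean(window)
    stdev = statistics.pstdev(window)
    worst = max(abs(v - mean) for v in window)
    return {
        "mean": mean,
        "stdev": stdev,
        "n": len(window),
        "alert": stdev > 0 and worst / stdev > ALERT_Z,
    }

def edge_loop(readings, send_to_cloud):
    """Buffer raw readings at the edge; ship one summary per WINDOW readings."""
    buffer = []
    for value in readings:
        buffer.append(value)
        if len(buffer) == WINDOW:
            send_to_cloud(summarize(buffer))  # 60 raw points -> 1 message
            buffer = []
```

Each window collapses roughly 60:1 before anything crosses the network, which is the economic core of an edge-heavy hybrid cloud design.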
The big data arena is at a crossroads. Use cases and tools are proliferating faster than most big data teams are gaining experience. In establishing the big data business capabilities required to cut through the complexity, CIOs must balance the integration accessibility of traditional SQL DBMSs against the speed of innovation in the mix-and-match open-source big data ecosystem.
In the big data domain, businesses are trying to solve complex problems with complex and novel technology -- and often failing. Simplifying the packaging of big data technologies will streamline pilots and accelerate time-to-value. CIOs looking to establish differentiating big data capabilities should consider Single Managed Entities to help solve the complexity problem.
Oracle M7 technology is meeting or exceeding its announced performance claims relative to the previous generation. Even under very conservative assumptions, the business case for migrating from T5 to T7 servers is strong. Wikibon concludes that for Oracle software and the servers it runs on, adopting M7 (and T7 server) technology is best practice for these high-value compute areas.
The premise tested in this research is that high-value applications and software should run on more capable, converged, performance-optimized infrastructure, even when they constitute a small proportion of the total workload. Cost-optimized infrastructure, by contrast, saves on short-term hardware costs but incurs much higher overall costs in the long term. The conclusion strongly recommends that IT executives adopt performance-optimized converged infrastructure as the default for all mixed workloads whenever even a small proportion includes high-cost software and/or high-value applications.
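A hedged illustration of the reasoning, with entirely hypothetical figures: when per-core software licensing dominates total cost, servers that do the same work on fewer cores win even if the hardware itself costs more.

```python
# Hypothetical TCO comparison (all figures invented for illustration):
# per-core software licensing dwarfs hardware prices, so the server that
# needs fewer cores for the same workload wins on total cost.

license_per_core = 25_000  # assumed annual per-core software license cost
years = 3

options = {
    "cost-optimized":        {"server": 50_000,  "cores_needed": 32},
    "performance-optimized": {"server": 120_000, "cores_needed": 16},
}

for name, cfg in options.items():
    total = cfg["server"] + cfg["cores_needed"] * license_per_core * years
    print(f"{name}: ${total:,} over {years} years")

# cost-optimized:        $2,450,000
# performance-optimized: $1,320,000 -- the pricier hardware is far cheaper overall
```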
Big data pros need to identify which data feedback loops in their machine learning applications can deliver sustainable differentiation through network effects. Starting early is critical because getting to scale is likely to create the "winner takes most" competitive dynamics that have become so common in tech industries. The biggest sin is to wait for the tooling to become automated enough for all competitors to jump in.
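A minimal sketch of such a feedback loop, assuming a scikit-learn-style online classifier and a hypothetical get_user_feedback hook; the point is that every prediction served yields a new labeled example, so the data asset compounds with usage:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Hypothetical feedback loop: serve a prediction, capture the user's
# reaction as a label, and fold it back into the model online. The
# training data grows with usage, which is where the network effect
# -- and the "winner takes most" dynamic -- comes from.

rng = np.random.default_rng(0)
X_seed = rng.normal(size=(100, 4))        # stand-in bootstrap features
y_seed = (X_seed[:, 0] > 0).astype(int)   # stand-in bootstrap labels

model = SGDClassifier()
model.partial_fit(X_seed, y_seed, classes=[0, 1])

def serve_and_learn(features, get_user_feedback):
    """Serve one prediction and learn from the user's reaction to it."""
    prediction = int(model.predict([features])[0])
    label = get_user_feedback(prediction)      # e.g. click / correction
    if label is not None:
        model.partial_fit([features], [label])  # online update per interaction
    return prediction
```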
The public cloud competitive environment remains turbulent, but sectors are showing signs of crystallizing into a longer-term market structure. SaaS remains turbulent, with SaaS vendors gaining share against incumbent licensed-software providers, who must migrate their core products to cloud-friendly offerings and/or acquire native SaaS applications to shore up their application leadership positions. Public cloud IaaS leadership is crystallizing around a handful of viable providers as a function of scale requirements. PaaS is still forming and finding its way, but it is likely to gravitate toward a wide variety of development models suited to different application types and public cloud platforms.
For enterprise executives trying to achieve aggressive RPO and RTO SLAs, Wikibon believes that purpose-built backup appliances (PBBAs) will give way to real-time, continuous data protection systems that fully support very large in-memory applications. Practitioners requiring near-zero RPO and aggressive RTO SLAs should plan for an integrated data protection approach that effectively eliminates the concept of storage-led backup and shifts thinking to an application-led, virtual point-in-time recovery model. Database and file system vendors will be the predominant suppliers of this technology, and understanding their roadmaps and commitment to supporting application-led recovery strategies is crucial to both database and data protection technology selection.
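A toy sketch of the application-led, virtual point-in-time idea, assuming a simple append-only journal rather than any database vendor's actual mechanism; recovery replays logged changes up to a requested timestamp instead of restoring a storage-level batch backup:

```python
# Hypothetical application-led journal: every write is logged with a
# timestamp, so any point in time can be reconstructed by replay.
# Continuous, write-time protection gives an effective RPO near zero.

class JournaledStore:
    def __init__(self):
        self.log = []  # append-only journal of (timestamp, key, value)

    def put(self, ts, key, value):
        self.log.append((ts, key, value))  # log every change as it happens

    def recover_to(self, ts):
        """Rebuild state as of time ts by replaying the journal."""
        state = {}
        for t, key, value in self.log:
            if t <= ts:
                state[key] = value
        return state

store = JournaledStore()
store.put(1, "balance", 100)
store.put(2, "balance", 250)
store.put(3, "balance", 90)   # suppose t=3 is a corrupting write
print(store.recover_to(2))    # {'balance': 250} -- state just before it
```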
Mainstream enterprises are beginning to deploy machine learning applications that add differentiation to traditional products and services. But the complexity of their big data infrastructure is becoming a problem. Big data pros need to deploy a digital business platform (DBP) to support modern applications and then extend the DBP to machine learning applications.
IT shops that want to achieve modern IT status will have to address the "IT iron triangle" constraining transformation.