The Internet of Things will never reach its full potential without sensor analytics ecosystems [SAE]. Today, Salesforce announces the IoT Cloud, powered by Thunder, "a massively scalable, real-time event processing engine" that "connects billions of events with Salesforce," and thus takes the first step toward becoming a backbone for SAEs.
“Salesforce is turning the Internet of Things into the Internet of Customers,” said Marc Benioff, chairman and chief executive officer, Salesforce. “The IoT Cloud will allow businesses to create real-time 1:1, proactive actions for sales, service, marketing or any other business process, delivering a new kind of customer success.”
With these words, the Internet of Connected Products from Dreamforce 2014 merges with the Internet of Customers from Dreamforce 2013. With more than seven billion potential customers, each streaming data from multiple sensors on their person, in their home, through their transportation, and at work, Salesforce Thunder and the IoT Cloud will need to listen to literally trillions of data streams, adding context and updating prior information with current data for ever-improving predictives, informing human decisions and improving automated actions.
We are especially excited by the list of partners coming together to enable Thunder and create the IoT Cloud.
Salesforce is accelerating the adoption of IoT Cloud with initial launch partners ARM, Etherios, Informatica, PTC ThingWorx and Xively LogMeIn, which provide connectivity between devices and the Internet. The company will continue to expand its ecosystem of IoT partners to enable businesses to connect with their customers in a whole new way.
This news was just released. As Clarise and I speak with Salesforce personnel, partners and customers at Dreamforce 2015, and based upon other briefings that come out of embargo throughout the week, we will be updating this post with more information and finer analysis.
Vibe Data Stream [Vibe] and the Virtual Data Machine [VDM] combine at the center of Informatica’s Internet of Things strategy. The strategy primarily targets Machine-to-Machine [M2M] data and, by connecting through PowerCenter, ultimately Machine-to-Human [M2H] data. The goal is to have VDMs residing in mobile devices, in sensor packages, or as part of sensor networks. At this point, however, VDMs require more processing power than is available in most components. Thus, Vibe and VDM are today primarily suited to data, network operations, and communication centers.
However, Informatica is seeing a broad range of use cases involving both large machines and sensor networks, from many different sectors, including:
- oil and gas,
- financial services,
- data center operations, and
- building services.
The Proof is Out There
One Proof of Concept [PoC] currently underway is with a Heating, Ventilation and Air Conditioning [HVAC] company. In the PoC, the HVAC company is looking at streaming data from all of their installations. Using Informatica products, they are bringing this data into their data center for both streaming and batch analytics. There are actually three use cases being examined in this PoC:
- Improving customer service
- Internal analytics on generic patterns of use for improved design, reliability and maintainability
- Predictive maintenance from the provider rather than from the building management team
Other field trials look at Vibe and VDM capabilities in regard to Pub/Sub models working with Informatica Ultra Messaging, as well as persisting data in all forms of data stores from traditional Enterprise Data Warehouses [EDW] to Hadoop [HDFS] and NoSQL databases such as Cassandra. These field trials involve solving the ongoing problems of the different areas mentioned above.
- In a financial services case, both application log data and Financial Information eXchange [FIX] log data are pulled in in real time for market, order-flow and trade data.
- For online retail, Vibe tracks website visitors' paths through the site using log data.
- In data center operations, efficiency is optimized for green IT, sustainability or the bottom line through log data from switches, servers, applications and call centers.
- For one governmental agency, Informatica Vibe and VDM maintain the Service Level Agreement [SLA] in real time, across 800 separate field organizations and more than a million devices, using the industry-standard Security Content Automation Protocol [SCAP] data formats.
Perhaps the most involved trials being done to date with Informatica Vibe and VDM are within the telecommunications space. As one might expect, the explosion of data and customer expectations as cellular goes from 2G to 3G to 4G/LTE requires real-time management of ever-increasing amounts of data. Wireline/fiber and cable use cases are also exploding as the traditional markets of voice, entertainment and connectivity intertwine.
Out to the Edge
Informatica is working aggressively with partners, such as chip, sensor and package manufacturers, to understand how best to implement Vibe: whether streaming collection runs on the device itself or at some point in the collection tier of the larger infrastructure. Currently, collecting sensor data can hit performance limits imposed by the sensor or by the underlying communication protocols. Thus, in the oil and gas industry for example, Informatica is working with both vertical-specific sensor manufacturers and large organizations in the industry to determine how Vibe can supplement or even replace the collection tier.
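To make the collection-tier idea concrete, here is a minimal, hypothetical sketch of edge-side streaming collection: readings are buffered on the device and shipped downstream in small batches. Everything here (the `EdgeCollector` class, its batch size, the JSON payload shape) is invented for illustration and does not reflect Informatica's Vibe APIs.

```python
import json
import queue

class EdgeCollector:
    """Hypothetical edge buffer that batches sensor readings for shipment."""

    def __init__(self, flush_size=3):
        self.buffer = queue.Queue()
        self.flush_size = flush_size
        self.shipped = []  # stand-in for a network send to the data center

    def ingest(self, sensor_id, value):
        # Buffer each reading; flush once the batch threshold is reached.
        self.buffer.put({"sensor": sensor_id, "value": value})
        if self.buffer.qsize() >= self.flush_size:
            self.flush()

    def flush(self):
        # Drain the buffer into one serialized batch.
        batch = []
        while not self.buffer.empty():
            batch.append(self.buffer.get())
        if batch:
            self.shipped.append(json.dumps(batch))

collector = EdgeCollector(flush_size=3)
for i, reading in enumerate([20.1, 20.3, 20.7]):
    collector.ingest(f"hvac-{i}", reading)
print(len(collector.shipped))  # one batch of three readings shipped
```

Batching like this is one way an edge collector avoids the per-message protocol overhead that the text identifies as a performance limit.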
What Informatica brings to evolving sensor analytics ecosystems [SAE] is not only its specific technologies of Vibe and VDM, but the combination of these with a complete package supporting streaming analytics, operational intelligence, complex event processing [CEP], batch analytics, predictives, reporting, data marts and EDWs, through its existing technology families such as Ultra Messaging, PowerCenter, Master Data Management, Data Quality and more, in both traditional and Cloud deployments. The result brings mature market features to the SAE in the form of:
- Guaranteed delivery
- Automated zero latency fail-over
- Centralized GUI administration
- No intermediary staging of data at source, broker, or target
- Fail-over does not require shared file systems
This blog post is based upon both the Informatica press release referenced below and a private briefing from the Informatica team, which allowed us to gather more information and get answers to our questions. Also referenced, for context, are several of our other blog posts on IoT and Big Data.
- Informatica Press Release from Strata + Hadoop World
- What does IoT All Mean
- The IoT and Change
- Big Data: It’s Not the Size, It’s How You Use It
- New Hope from Big Data
Back in July, I wrote
[An] excellent example of the importance of the Industrial Internet comes from Salesforce.com's use of The Social Machine by Digi International and its Etherios business unit, bringing sensor data into customer relationship management [CRM] by allowing sensors embedded in industrial refrigerators, hot tubs, and heavy and light equipment of all types to open SFDC Chatter sessions and to file cases.
At Dreamforce 2013, Salesforce.com is announcing Salesforce1, their new Internet of Customers ecosystem, bringing together Force.com, Heroku, and ExactTarget FUEL platforms under a united series of APIs controlled by the Salesforce1 App.
Today and tomorrow, Dreamforce is all about the Internet of Things, and I'll be providing my analyses of how SFDC is building out its massive existing ecosystem of partners, services and customers into Marc Benioff's evolving vision of the Internet of the Customer. The message here is that Salesforce1 is ready today to prepare its customers to leverage the opportunities presented by the Internet of Things. As Cisco states, over a trillion dollars in added value was left on the table this year by companies not taking advantage of IoT. For 2014, SFDC's customers won't have an excuse to leave this money behind.
One challenge for Salesforce1 is its dependence on partners for analytics. Are SFDC partners ready to help in bringing the Internet of Customers to full potential through connected analytics? How will IBM's MQTT, Smarter, and Cognitive Computing, Oracle's Device-to-Data-Center, Teradata's Hub for Monetizing the IoT, Infobright's M2M optimized ADBMS, and many other data management & analytics initiatives focused on M2M and M2H data fit in?
Will Salesforce1 create or be integrated into Sensor Analytics Ecosystems, with the necessary marketplaces for raw data, processed data and insights from M2M & M2H data? SFDC has never been up to the challenge of analytics in the past. While there are many general BI and Analytics partners, SFDC-specific analytics firms have come and gone. Salesforce1 is a broader concept and brings SFDC into a future beyond salesforce automation and customer relationship management.
The IoT Keynote at Dreamforce today, and the packed sessions on IoT will answer some of these questions. I'll be providing my analysis of how well these questions are answered in an Event Report blog post after the close of Dreamforce 2013.
The number of articles about the Internet of Things [IoT], Machine-to-Machine communication [M2M], the Industrial Internet, the Internet of Everything [IoE] and the like has been increasing since I wrote the post introducing my IoT mindmap almost a year ago. From some of them I learn, at some I nod sagely in agreement, and others cause me to scratch my head in confusion. One in particular this past week fell into that last category, claiming that all the terms listed here mean the same thing.
From my reading, briefings and research over the past year, I've come to a different conclusion. The following definitions are my opinion; I can't say that any authority has certified them. I believe them to be accurate, and if any vendor with an interest in any of these definitions strongly agrees or disagrees, I would be very much interested in talking with you.
The first thing to be considered is Machine-to-Machine communication. M2M is really just one of four types of interchanges that occur over the Internet, intranets and any command, control, communication, computing or intelligence network. The other types are Human-to-Machine [H2M], Human-to-Human [H2H] and Machine-to-Human [M2H]. H2M and H2H interchanges have been around since the beginning of ARPAnet, which evolved to become the Internet. From the many different protocols at the beginning, such as FTP and Gopher [among many more], two have come to dominate Internet traffic:
- simple mail transfer protocol [SMTP] at the heart of email, and
- hypertext transfer protocol [HTTP] at the heart of the world wide web [WWW or Web].
Every transaction made using a computer, whether online transaction processing [OLTP], electronic data interchange [EDI] or eCommerce, and every purchase you make at your favorite web store, is an example of H2M.
Of course, starting with email [still the dominant form of communication over the Internet for businesses and individuals] and expanding to Twitter, Facebook, Waze, Yelp, Foursquare, Yammer, all the various instant messaging networks, voice over Internet protocol [VoIP] and your favorite public or private social network, we have many examples of Internet-enabled H2H communication.
These two, H2M and H2H, have become so prevalent, and so important to business, governments and our personal lives, that the over-hyped phenomenon "Big Data" was born. But the importance and pervasiveness of M2M, and soon M2H, data will swamp the so-called data tsunami of the past decade. Predictive maintenance, building automation, elastic provisioning, machine logs, software "phoning home" and automated decision support systems are all good examples of direct M2M interchanges, where one sensor, device, embedded computer or system has a productive exchange with another such machine, without concurrent human intervention. Self-quantification, gamification, personalized medicine and augmented reality [AR] are all early examples of M2H interchanges, where sensors, devices, embedded computers or systems directly provide relevant information to an individual, allowing for better-informed decisions.
The Internet of Things
The term Internet of Things was coined in 1999 by Kevin Ashton. Since then, it has come to mean any device that is connected to the Internet. Most people don't consider computers, routers, edge equipment and other Internet infrastructure hardware to be "devices", and usually exclude such hardware from consideration as things that use that infrastructure. For many, the devices are only smart phones, feature phones and tablets. This has led Cisco and the GSMA to predict that there will be 30 to 50 billion devices connected to the Internet by 2020. However, even these organizations, and most people with whom I speak who have skin in the IoT game, feel that my own prediction of one trillion devices connected to the Internet by 2020 is more likely. These devices span from individual, but connected, sensors to heavy machinery. However, as companies come out with Tweeting diapers, glowing clothing and other such silliness, the Internet of Things is in danger of becoming a fad. So, what is the Internet of Things? To my mind, the Internet of Things comprises any sensor, embedded sensor, embedded computer, component, package, sub-system or system that is connected to the Internet and intended to have meaningful interchanges with other such items and with humans. The Internet of Things primarily uses M2M, and increasingly M2H, interchanges.
My first exposure to treating the IoT as a large, complex system came at a networking event in 2008, one of those events where IBM was introducing its new initiative for a Smarter Planet. The Smarter Planet brings complex systems such as the Smart Grid, building automation across facilities, water management, traffic management, Smarter Cities and Smarter Farms under one System: one approach and one initiative that raises the IoT to a new level of importance for world governments, global businesses and individuals, from the poorest village to the most cosmopolitan city. The Smarter Planet initiatives go beyond IoT, beyond the individual things, to treating all such things, the Internet, the protocols, processes and policies as one very large, complex, possibly cognitive system.
The Industrial Internet is a term coined by General Electric [GE] in 2011. At a very simple level, the Industrial Internet can be thought of as connected industrial control systems. But the impact is much more complex, and much more significant. The first thing to realize is that connected sensors and computing power will be embedded in everything: from robots and conveyor belts on the factory floor to tractors and irrigation on the farm, from heavy equipment to hand drills, from jet engines to bus fleets; every piece of equipment, everywhere. The Industrial Internet also primarily uses M2M and M2H. While this sounds much like the Internet of Things, the purpose is quite different. The Industrial Internet is about changing business processes and making data the new coin of the realm. GE is very serious about the Industrial Internet and, while it doesn't use the term yet, about Sensor Analytics Ecosystems. Data marketplaces are rapidly becoming core to GE's businesses, as proven by its recent 140-million-dollar investment in Pivotal, the new Big Data Platform as a Service [PaaS] from EMC. Another excellent example of the importance of the Industrial Internet comes from Salesforce.com's use of The Social Machine by Digi International and its Etherios business unit, bringing sensor data into customer relationship management [CRM] by allowing sensors embedded in industrial refrigerators, hot tubs, and heavy and light equipment of all types to open SFDC Chatter sessions and to file cases.
Internet of Everything
Cisco has recently started two initiatives related to the IoT: the Internet of Everything [IoE] and Fog Computing. IoE seeks to bring together H2H, H2M, M2M and M2H interchanges. On June 19th of this year, Cisco introduced its IoE Value Index [link to PDF]. By bringing together people, processes, data and things, and with some impressive research to back it up, Cisco feels that the IoE could bring $1.2 trillion in added value in 2013, and $14.4 trillion in added market value by 2022, to businesses around the world. Fog Computing tends more to the infrastructure of the IoE, bringing the concepts of Cloud Computing, such as distributed computing and elastic provisioning, to the edge of the network, with an emphasis on wireless connectivity, streaming data and heterogeneity.
While some of the above are corporate initiatives, they each represent important and distinct concepts. In addition to these from IBM, Cisco, GE, EMC and Salesforce.com, there are other initiatives and products in this sphere coming from HP, Oracle, SAP, MuleSoft, SnapLogic, Nuance, Splunk, Mocana, Evrythng, Electric Imp, Quirky, reelyActive, Ayla, SmartThings, Withings, Fitbit, Jawbone including BodyMedia, Nike, Basis, Cohda Wireless, AT&T, Verizon, Huawei, Orange, Belkin, DropCam, Gravity Jack, Alcatel-Lucent, and Siemens. Platforms, software, sensor packages and services are being developed by a wide variety of innovative companies:
- C3 Energy,
- Digi International,
- Noesis Energy,
- Space-Time Insights,
- Yellowfin Business Intelligence,
- Actian including Pervasive Software and Vectorwise ADBMS,
- Alpine Data Labs,
- Opera Solutions,
- Freescale Semiconductor
- Wind River
- August Smart Lock,
- Data Sensing Lab,
- Metaio.
These innovative companies, and others, are implementing one or more of these concepts in a variety of ways. As I stated at the beginning, I don't think that these concepts are the same. While the IoT was first named 14 years ago, it is still early days in its implementation. There are many ways that the Internet of Things might evolve, and many missteps that could lead the IoT to be a passing fancy, leaving some important changes in its wake, but never reaching its full potential. I think there is one way, and one way only, that all of the concepts and initiatives will come together and change everything that we do, how we make decisions, how we think about ourselves, how governments make policy, how businesses make money: The Sensor Analytics Ecosystem [SAE]. Here's a tease of a mindmap giving a hint of what I mean by the SAE. Look for my upcoming report "Sensor Analytics as an Ecosystem" and a series of research reports delving into each area introduced therein. The companies listed above are building out parts of the SAE, and will feature heavily in these reports.
A sensor is anything that can create data about its environs. A more formal definition is
a device that detects or measures a physical property and records, indicates, or otherwise responds to it (New Oxford American Dictionary)
A very simple example is a thermocouple.
Essentially, two dissimilar metals are bonded together such that when the environment around the junction becomes hotter or colder, the metals produce a voltage. Through this thermoelectric (Seebeck) effect, the temperature change translates into a voltage differential across the wire, producing an electrical signal. A simple voltmeter can read this signal, and one can calibrate that electrical signal to be read as degrees of temperature change.
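The calibration step above can be sketched in a few lines. This is a deliberately simplified linear model: real thermocouple instruments use NIST polynomial tables and cold-junction compensation, and the sensitivity constant below is only an approximate figure for a type-K thermocouple.

```python
# Approximate type-K thermocouple sensitivity, in microvolts per degree C.
# Real devices are nonlinear; this linear constant is for illustration only.
SEEBECK_UV_PER_C = 41.0

def voltage_to_temperature(measured_uv, reference_c=25.0):
    """Estimate the junction temperature in degrees C from a voltage reading
    in microvolts, given the reference (cold-junction) temperature."""
    return reference_c + measured_uv / SEEBECK_UV_PER_C

# A 1025 microvolt reading at a 25 C reference implies a junction near 50 C.
print(round(voltage_to_temperature(1025.0), 1))  # -> 50.0
```

The same pattern, a raw electrical signal mapped through a calibration curve to an engineering unit, applies to most of the sensors discussed in this post.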
You likely have one of these in your home thermostat. Perhaps you have a very simple thermostat that turns your home heater on and off.
Perhaps you have a more complex, programmable thermostat that can control the temperature and humidity of your home through a furnace, air conditioner, humidifier/dehumidifier and fans, with different settings for different times of the day and days of the week.
Perhaps you have something that looks very simple, but is now part of a complex system that includes not only your home HVAC system, but your computer and smartphone, and computers and analytic software at your utility company.
And this progression is why the Internet of Things is about to explode with Connected Data, with sensors being the new nerve endings of an increasingly intelligent world.
Imagine sensors streaming Connected Data from your home entertainment system, refrigerator & most of its contents, toaster, coffee maker, alarm clock, garden, irrigation, home security, parking on the street in front of your home, traffic flowing by your home to your destination, air quality, and so much more.
We will interact with the world around us in ways that will change our decision making processes in our personal lives, in business, and in the regulatory processes of governments.
If you want to learn more, join IBM and my fellow panelists on Thursday, Sept. 13, from 4 to 5 p.m. ET to chat about cloud and the connected home using hashtag #cloudchat.