
Our guest author is Felix Wortmann, assistant professor at the Institute of Technology Management, University of St. Gallen (HSG) and the scientific director of the Bosch Internet of Things Lab. His research interests include Big Data and the Internet of Things. Before that, he worked as an assistant to the executive board of SAP. Felix Wortmann received a BScIS and MScIS from the University of Muenster, Germany, and a PhD in Management from the University of St. Gallen.
I recently delivered a keynote at the ISC Big Data’13 conference. To prepare, I put together five key topics that shaped big data for me in 2013.
1. Database technology and data analytics have been around for decades. What is actually new about big data?
Indeed, over the last 20 years companies have invested heavily in data analytics infrastructures. The corresponding business intelligence (BI), data warehousing, and analytics initiatives have addressed the information needs of the whole enterprise, including production, sales, marketing, service, and finance. However, these data platforms focus mainly on internal, well-structured data, and they have inherent performance and latency constraints. Big data technology now offers new possibilities regarding data volume and speed of analysis. There is also a shift in focus toward external and unstructured data, and this shift goes beyond technology: companies now have the means to better understand and interact with their environment, e.g. customers, competitors, or partners.
2. How is big data technology different? Is there really a new generation of analytical software available?
Big data is often associated with technologies such as Hadoop and NoSQL. Moreover, main memory databases are heavily discussed. Overall, there is a very fragmented market of available big data solutions.

Picture: THE 451 GROUP
http://blogs.the451group.com/information_management/2013/06/10/updated-database-landscape-map-june-2013/
If you take a step back, these solutions have one common denominator: they all leverage the potential of today’s hardware. Computing has changed fundamentally over the last 20 years. The basic ingredients of computing (processing power, memory, and networking bandwidth) remain the same. However, multi-core processors, main memory instead of hard disk, and high-speed networks were not the basis of analytical software design 20 years ago.
3. What about disruptive innovation enabled by big data technologies? Is that myth or reality?
We have definitely seen disruptive innovation on the basis of big data; just think of the large internet companies. However, when it comes to bringing big data technologies into other domains, we can be optimistic but should be careful. More data and lower latency do not directly translate into additional business value. To facilitate disruptive innovation and new business models, you have to bring together business opportunities and IT capabilities. Therefore, business and IT have to collaborate and rethink how business will be done tomorrow.
4. The NSA PRISM program currently represents the “dark side” of big data. How do you think this affects the public’s perception of big data?
Privacy is a major concern that has to be taken seriously, and cases such as PRISM certainly shape the public’s perception. However, people are willing to share data if they see a benefit. Moreover, trust is becoming a major business asset, and companies are starting to realize this.

Picture: PBH3, original comic from Geek & Poke
http://geekandpoke.typepad.com/geekandpoke/2010/12/the-free-model.html
http://alligator-sunglasses.com/post/10500181092/the-truth-about-facebook
Sharing and exchanging information is a fundamental pillar of our information society. We definitely need to reconsider existing practices from different perspectives. Furthermore, it will not only be about regulation but also about education and responsibility. Not only enterprises and public agencies have to follow “good practices”; the same holds true for individuals. Just think about the privacy issues around Google Glass.
5. How do big data and the Internet of Things (IoT) go together?
Stefan Ferber provides an interesting perspective on big data in the context of the IoT. His bottom line: big data in the context of the Internet of Things is different. I fully agree with his viewpoint. Machine-generated data brings special technical challenges and data management principles with it. “Store everything, you might need it later” is a (questionable) big data mantra that is definitely not generally applicable in the context of the IoT. Identifying relevant data points, filtering data, and discarding the rest are core competencies in IoT environments.
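To make the filtering idea concrete, here is a minimal sketch (a hypothetical example, not from Ferber's work) of edge-side filtering for sensor data: instead of storing every reading, forward only those that differ from the last reported value by more than a configurable threshold, a simple "deadband" filter.

```python
# Hypothetical deadband filter: discard sensor readings that carry no new
# information relative to the last value we chose to keep.

def deadband_filter(readings, threshold):
    """Keep the first reading and any reading that differs from the
    last *kept* reading by more than `threshold`; discard the rest."""
    kept = []
    for value in readings:
        if not kept or abs(value - kept[-1]) > threshold:
            kept.append(value)
    return kept

# A temperature sensor reporting once per second; most samples are redundant.
samples = [20.0, 20.1, 20.0, 20.2, 21.5, 21.6, 21.5, 19.0]
print(deadband_filter(samples, threshold=1.0))  # -> [20.0, 21.5, 19.0]
```

Even this trivial rule drops most of a stable sensor's output at the edge, which is exactly the competency the IoT context demands: deciding what *not* to store.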
However, even more interesting than these technical aspects are the IoT business opportunities enabled by leveraging the full potential of machine-generated data. Data is a valuable resource for more intelligent products, and there are exciting use cases such as predictive maintenance, even in the consumer context, e.g. for residential heating systems. Furthermore, the ever-growing volume of plant data provides tremendous opportunities for a next generation of safe and profitable production.
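As an illustration of the predictive maintenance idea, here is a minimal sketch (an assumed scenario, not a description of any actual product) of a simple rule over machine-generated readings: flag a heating system for inspection when a reading drifts far outside the recent behavior of its own sensor.

```python
# Hypothetical drift detector: flag readings that deviate from the trailing
# window's mean by more than k standard deviations (possible early fault signs).
from statistics import mean, stdev

def drift_alerts(readings, window=5, k=3.0):
    """Return indices of readings that lie more than k standard deviations
    away from the mean of the preceding `window` readings."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            alerts.append(i)
    return alerts

# Steady boiler temperatures, then a sudden spike worth inspecting.
temps = [70, 71, 70, 69, 70, 71, 70, 95]
print(drift_alerts(temps))  # -> [7]
```

Real deployments would of course use far richer models, but the point stands: the business value comes from acting on the data (scheduling a service visit before failure), not from merely storing it.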
What do you think: How do big data and the Internet of Things go together? Where are the specific big data challenges in the context of the IoT?