Showing posts with label IoT. Show all posts

Thursday, August 16, 2018

Successful IoT Deployment Requires Continuous Monitoring


Growth of the IoT has created new challenges for business. The massive volume of IoT devices and the deluge of data they create become a challenge, particularly when IoT is a key part of business operations. These challenges can be mitigated with real-time monitoring tools tied into ITIL workflows for rapid diagnostics and remediation.

Failure to monitor IoT devices leads to a failed IoT deployment.

Wednesday, July 25, 2018

Why Edge Computing Is Critical for IoT Success



Edge computing is the practice of processing data near the edge of your network, where the data is being generated, instead of in a centralised data-processing warehouse.

Edge computing is a distributed, open IT architecture that features decentralised processing power, enabling mobile computing and Internet of Things (IoT) technologies. In edge computing, data is processed by the device itself or by a local computer or server, rather than being transmitted to a data centre.

Edge computing enables data-stream acceleration, including real-time data processing without latency. It allows smart applications and devices to respond to data almost instantaneously, as it is being created, eliminating lag time. This is critical for technologies such as self-driving cars, and it has equally important benefits for business.

Edge computing allows for efficient data processing, in that large amounts of data can be processed near the source, reducing Internet bandwidth usage. This both reduces costs and ensures that applications can be used effectively in remote locations. In addition, the ability to process data without ever putting it into a public cloud adds a useful layer of security for sensitive data.

Friday, May 04, 2018

Key Technologies for Next Gen Banking



Digital transformation is changing the way customers interact with banks. New digital technologies are fundamentally changing banks from branch-centric operations driven by human interfaces to digital-centric operations driven by technology interfaces.

In the next 10 years, I predict that more than 90% of existing branches will close and people will migrate to digital banks. In this article, I have listed six main technologies needed for next-gen banking, aka the Digital Bank.

1. Mobile

Mobile apps are changing how customers interact with banks. What started as digital payment wallets has grown into mobile banking that offers most banking services: investments, account management, lines of credit, international remittances, etc., providing banking services anywhere, anytime!

2. Cloud & API

Mobile banking is built on cloud services such as Open APIs & microservices. Open APIs allow banks to interact with customers and with other banks faster. For example, an Open API allows business ERP systems to directly access bank accounts and transfer funds from one bank to another as needed. In short, cloud technologies such as Open APIs and microservices are accelerating interactions between banks, and between banks and customers, thus increasing the velocity of business.
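As an illustration of the kind of integration Open APIs enable, here is a minimal sketch of a payment-initiation payload an ERP system might send to a bank. The field names are assumptions loosely modeled on open-banking-style APIs, not any specific bank's API.

```python
import json

def build_payment_request(debtor_iban, creditor_iban, amount, currency="EUR"):
    """Build a payment-initiation payload of the kind an Open API bank
    endpoint might accept. Field names are illustrative only."""
    return {
        "debtorAccount": {"iban": debtor_iban},
        "creditorAccount": {"iban": creditor_iban},
        "instructedAmount": {"currency": currency, "amount": f"{amount:.2f}"},
    }

# Example: an ERP system instructing a 250.00 EUR transfer
payload = build_payment_request(
    "DE89370400440532013000", "FR1420041010050500013M02606", 250.0)
print(json.dumps(payload, indent=2))
```

In a real deployment this JSON body would be POSTed to the bank's payment endpoint over an authenticated HTTPS session; the sketch only shows the payload shape.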

3. Big Data & Analytics

Big data and analytics are changing the way banks reach out to customers, offer new services, and create new opportunities. Today banks have tremendous access to data: streaming data from websites, cloud services, mobile data, and real-time transaction data. All this data can be analyzed to identify new business opportunities: micro-credit, algorithmic trading, etc.

4. AI & ML

Advanced analytical technologies such as AI & ML are increasingly being used to detect fraud, identify hidden customer needs, and create new business opportunities for banks. Though these technologies are still in their early stages, they will see faster adoption and become mainstream in the next 4-5 years.

Already, several banks are using AI tools for customer support activities such as chat, phone banking etc.

5. Biometrics & Security

As the velocity of transactions increases, security is becoming vital for financial services. Biometric authentication, stronger encryption, and continuous real-time security monitoring enhance security in a big way.

6. Blockchain & IoT

IoT has become mainstream. Banks were early adopters of IoT technologies: POS devices, CCTV, ATM machines, etc. Blockchain technology is used to validate IoT data from retail banking customers. This is helping banks better understand customers and tailor new offerings to create new business opportunities.


Friday, July 07, 2017

Effective workplace for Digital Startups

Earlier this week, I met the CEO of a startup in Bangalore who wanted to set up a smart office for his startup. We had a very interesting discussion on the subject, and this blog post is the gist of it.

==

Startups need a work environment that fosters collaboration, productivity, and innovation, so that they can attract and retain the best employees.

Office spaces are now turning into intelligent spaces, where the office itself can engage with employees to maximize effectiveness and let them connect seamlessly and securely anywhere, anytime.


How to build such a workplace?


Today, we have ultra-fast Wi-Fi, mobile "anywhere" communications, and Internet of Things (IoT) connectivity that connects physical workplaces to employees. For example, having a mobile app that shows available meeting rooms, gives directions to them, and offers an easy one-click interface to book a room can save 5-10 minutes of an employee's time in setting up a meeting. This alone translates into 42-85 man-hours of productivity per employee per year, freeing up time for innovation and collaboration.
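The arithmetic behind that estimate can be sketched as follows; the number of meetings per day and workdays per year are assumptions chosen to show how the 42-85 hour range arises:

```python
WORKDAYS_PER_YEAR = 250  # assumption
MEETINGS_PER_DAY = 2     # assumption

def hours_saved_per_year(minutes_saved_per_meeting):
    """Annual productivity gain from shaving minutes off meeting setup."""
    return minutes_saved_per_meeting * MEETINGS_PER_DAY * WORKDAYS_PER_YEAR / 60

print(f"{hours_saved_per_year(5):.0f}-{hours_saved_per_year(10):.0f} hours/year")
# → 42-83 hours/year
```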

In the world of intelligent workspaces, technology becomes a key enabler. Every employee has fast and complete access to the applications and data they need, and can use any mobile device to schedule space, operate electronic whiteboards or projectors, or set up video conference calls. Employees can be productive, seamlessly and securely, anywhere, anytime, whether in a quiet workspace, a conference room, a boardroom, or even an outdoor space such as a rooftop café.

In a startup office, basic Wi-Fi and video conferencing are now so common that they are taken for granted. Closed-door offices and cubicles have given way to open-space designs and casual meeting areas. Cafeterias and pantries are now seamlessly integrated with workspaces, to encourage open idea sharing and other collaborative exchanges among workers.


Understanding the requirements


Startups often share office spaces with other startups (typically non-competing, of course!).

A startup workplace must also be truly innovative. In most legacy workplaces, there is too much friction and inefficiency, which hampers office productivity. Talented and creative employees are still shackled to desks on which sit hardwired computers that act as their main, and sometimes only, access point to the applications, software, and data they need to do their work.

Startups do not have a hierarchical organization structure. Instead they tend to have a team-based organizational structure. Teams are formed and disbanded depending on the project at hand. Cross-functional teams are dynamically created when necessary. This means employees need the right tools to work in a fluid environment where they and their colleagues can collaborate whenever and wherever the need arises.

The good news is that today we have mobile-first, cloud-first, and IoT technologies that enable such intelligent spaces. Facility managers will have to don an IT hat and ensure:

  1. Secure, untethered, and consistent connectivity anywhere. Security is of paramount importance in a multi-tenant workplace. Employees are no longer tethered to a wired desk space; they need complete mobility within the workspace, yet safe and consistent high-bandwidth connectivity.
  2. Consistent workplace productivity solutions across all devices. Workplace productivity tools such as Slack, Skype, Google Meet, etc. are essential. These tools must work on a variety of devices: iPad, iPhone, Android phones, Windows laptops, Apple MacBooks, etc.
  3. Collaboration solutions built on the cloud
     
    For a consistent workplace, collaboration tools are all connected to the cloud. It's not just the productivity tools; even booking conference rooms or meeting rooms is handled via the cloud. In a multi-tenant workplace where conference rooms are shared, the solution must be able to generate pay-per-use billing for all shared resources.
  4. Location-based services
    Smart facilities are turned on or off based on the number of people per floor, zone, or area, i.e., on actual need. This means lighting, cooling/heating, and Wi-Fi connectivity all follow the headcount in that area, which implies intelligent sensors and smart analytics at the edge to minimize energy usage.
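The occupancy-driven control in point 4 can be sketched as a simple rule; the zone names and the on/off rule below are illustrative:

```python
def facilities_state(occupancy):
    """Turn a zone's lighting, HVAC, and Wi-Fi on only when its
    occupancy sensors report at least one person (simplified sketch)."""
    return {zone: "on" if count > 0 else "off"
            for zone, count in occupancy.items()}

print(facilities_state({"floor1-east": 12, "floor1-west": 0, "floor2": 3}))
# → {'floor1-east': 'on', 'floor1-west': 'off', 'floor2': 'on'}
```

A real system would add hysteresis so facilities are not toggled every time someone briefly leaves a zone.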

Build Analytics into the Workplace


Industrial IoT devices open up a whole new way of seeing how an existing facility is being used. Heat/motion sensors can track which areas of the office are heavily used and which are least used. Collected over a period of time, this data can be of immense value in optimizing the way office spaces are designed, as well as cooling, lighting, and HVAC system planning.

The use of smart building technologies (floor sensors, motion sensors, thermal scanners, CCTV, biometric scanners, etc.) generates vast amounts of data which can be used in many ways to build a better workplace.

Closing Thoughts 


When technology and facility design are done right, we can create a workspace that allows organizations, small or large, to orchestrate workflows for maximum efficiency and productivity. This will unleash the kind of innovation, creativity, and productivity needed to compete in the new digital economy.

Such a building would be a truly digital workplace, where technology becomes a strong hidden foundation for a truly user-centered workplace.

Investing in IT-enabled facilities makes employees happier. To attract and retain the most talented workers, a company must provide employees with a modern, digital environment where they can work efficiently and seamlessly.


Tuesday, June 27, 2017

Key Metrics to measure Cloud Services


As a business user, if you are planning to host your IT workloads on a public cloud and you want to know how to measure the performance of the cloud service, here are seven important metrics you should consider.

1. System Availability

A cloud service must be available 24x7x365. However, there can be downtime for various reasons. System availability is defined as the percentage of time that a service or system is available. For example, 99.9% availability allows roughly 8.8 hours of downtime per year. A downtime of even a few hours can potentially cause millions of dollars in losses.

Two nines (99%) is 3.65 days of downtime per year, which is typical for non-redundant hardware if you include the time to reload the operating system and restore backups (if you have them) after a failure. Three nines is about 8.8 hours of downtime, four nines is about 52 minutes, and the holy grail of five nines is about 5 minutes.
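A quick sketch that reproduces those downtime figures from the availability percentage:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours_per_year(availability_pct):
    """Allowed downtime per year at a given availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    hours = downtime_hours_per_year(pct)
    print(f"{pct}% -> {hours:.2f} hours ({hours * 60:.1f} minutes) per year")
```

This yields 3.65 days for two nines, 8.76 hours for three nines, about 52.6 minutes for four nines, and about 5.3 minutes for five nines.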

2. Reliability: Mean Time Between Failures (MTBF) and Mean Time To Repair (MTTR)

Reliability is a function of two components: Mean Time Between Failures (MTBF) and Mean Time To Repair (MTTR), i.e., the time taken to fix a problem. In the world of cloud services, MTTR is often defined as the average time required to bring a failed service back into production.

Hardware failure of IT equipment can lead to a degradation in performance for end users and can result in losses to the business. For example, a failure of a hard drive in a storage system can slow down the read speed - which in turn causes delays in customer response times.

Today, most cloud systems are built with high levels of hardware redundancy, but this increases the cost of the cloud service.
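The relationship between MTBF, MTTR, and availability can be sketched with the standard steady-state formula (the figures below are illustrative):

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A service that fails on average every 1,000 hours and takes
# 1 hour to restore:
print(f"{availability(1000, 1):.2%}")  # → 99.90%
```

The formula makes the trade-off explicit: halving MTTR improves availability just as effectively as doubling MTBF.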

3. Response Time

Response time is defined as the time between a workload placing a request on the cloud system and the cloud system completing that request. Response time is heavily dependent on network latencies.

Today, if the user and the data center are located in the same region, the average overall response time is 50.35 milliseconds. When the user base and the data center are located in different regions, the response time increases significantly, to an average of 401.72 milliseconds.

Response time gives a clear picture of the overall performance of the cloud. It is therefore very important to know response times in order to understand their impact on application performance and availability, which in turn impacts customer experience.

4. Throughput or Bandwidth

The performance of cloud services is also measured by throughput, i.e., the number of tasks completed by the cloud service over a specific period. For transaction processing systems, it is normally measured in transactions per second. For systems processing bulk data, such as audio or video servers, it is measured as a data rate (e.g., megabytes per second).

Web server throughput is often expressed as the number of supported users, though clearly this depends on the level of user activity, which is difficult to measure consistently. Alternatively, cloud service providers publish their throughput in terms of bandwidth, e.g., 300 MB/sec, 1 GB/sec, etc. These bandwidth numbers most often exceed the rate of data transfer required by the software application.

In the case of mobile apps or IoT, there can be a very large number of apps or devices streaming data to or from the cloud system. It is therefore important to ensure that there is sufficient bandwidth to support the current user base.
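A back-of-the-envelope bandwidth check for such a fleet can be sketched as follows; the device count and per-device data rate are illustrative assumptions:

```python
def aggregate_mbps(device_count, kbps_per_device):
    """Total bandwidth a fleet of streaming devices needs, in Mbps."""
    return device_count * kbps_per_device / 1000

needed = aggregate_mbps(50_000, 16)  # 50k devices at 16 kbps each
print(f"{needed:.0f} Mbps needed")   # → 800 Mbps needed
```

Comparing this number against the provider's published bandwidth shows whether the current user base fits within the provisioned capacity.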

5. Security

For cloud services, security is often defined as the set of control-based technologies and policies designed to adhere to regulatory compliance rules and protect the information, data, applications, and infrastructure associated with cloud computing. The processes will also likely include a business continuity and data backup plan in case of a cloud security breach.

Oftentimes, cloud security is categorized into multiple areas: security standards, access control, data protection (data unavailability and data loss prevention), and network protection against denial-of-service (DoS or DDoS) attacks.

6. Capacity

Capacity is the size of the workload compared to the available infrastructure for that workload in the cloud. For example, capacity requirements can be calculated by tracking average utilization of workloads with varying demand over time, and working from the mean to find the capacity that handles 95% of all workloads. If workloads increase beyond a point, one needs to add more capacity, which increases costs.
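The 95% sizing rule above can be sketched with a nearest-rank percentile over utilization samples; the sample values below are illustrative:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of utilization samples."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hourly CPU-utilization samples (%) for a workload with varying demand:
samples = [30, 35, 40, 42, 45, 50, 55, 60, 70, 95]
print(percentile(samples, 95))  # size capacity to cover 95% of samples
```

Sizing to the 95th percentile rather than the peak avoids paying for capacity that is needed only during rare spikes.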

7. Scalability

Scalability refers to the ability to service a theoretical number of users: the degree to which the service or system can support a defined growth scenario.

Cloud systems are often described as scalable up to tens of thousands, hundreds of thousands, millions, or even more simultaneous users. That means that at full capacity (usually marked as 80% utilization), the system can handle that many users without failure for any user and without crashing as a whole from resource exhaustion. The better an application's scalability, the more users the cloud system can handle simultaneously.

Closing Thoughts


Cloud service providers often publish their performance metrics, but one needs to dive in deeper and understand how these metrics impact the applications being run on that cloud.

Monday, June 12, 2017

Taking Analytics to the edge


In my previous article, I had written about HPE's EdgeLine servers for IoT analytics.

In 2017, we are seeing a steady wave of growth in data analytics happening at the edge, and HPE is at the forefront of this wave, leveraging its strengths in hardware, software, services, and partnerships to build powerful analytic capabilities.

With HPE EdgeLine, customers are able to move analytics from the data center to the edge, getting rapid insights from remote sensors to solve critical challenges in industries such as energy, manufacturing, telecom, and financial services.

Why do IoT projects fail?


Recently, Cisco reported that ~75% of IoT projects fail. This is largely because IoT data has been managed in centralized, cloud-based systems. In traditional settings, data is moved from a connected 'thing' to a central system over a combination of cellular, Wi-Fi, and enterprise IT networks, to be managed, secured, and analyzed.

But IoT devices generate huge volumes of data, at multiple sites, even in remote areas with intermittent connectivity. In that model, analysis could not be done in a meaningful way: data collection took so long that by the time the analysis was completed and the results were computed, they were already irrelevant.

Centralized cloud systems for IoT data analysis simply do not scale, nor can they perform at the speeds needed.

HPE Solution - EdgeLine servers for Analytics on the Edge


With HPE EdgeLine servers, we now have a  solution that optimizes data for immediate analysis and decision making at the edge of the network and beyond.

For the first time, customers get a holistic view of the connected condition of things (machines, networks, apps, devices, people, etc.) through the combined power of HPE EdgeLine servers and Aruba wireless networks.

Analysis on the edge is just picking up momentum and it's just the beginning of good things to come.

Today, the cloud is omnipresent, but for large-scale IoT deployments, a new model of computing needs to emerge in which constant cloud connectivity is not essential. Most data will be processed at or near its point of origin to provide a real-time response, handled on-site in the moment. Running analytics at the edge saves costs and refines machine learning on massive data sets that can be acted on at the edge.

In June 2017 at HPE Discover, customers were delighted to get an in-depth view of this solution.

HPE's continued investments in data management and analytics will deliver a steady stream of innovation. Customers can safely invest in HPE technologies and win.

HPE, along with Intel, is future-proofing investments in data and analytics for hyper-distributed environments. HPE has taken a new approach to analytics that provides the flexibility of processing and analyzing data everywhere: right at the edge where data is generated, for immediate action, and in the cloud at a central data center for future analysis.

Customers are using IoT data to gain insight through analytics, both at the center and at the edge of the network, to accelerate digital transformation. With HPE Edgeline, data can be processed and analyzed wherever it is needed, at the edge and in the cloud, so it can be leveraged in time and in context as the business needs to use it.

This technology was developed in direct response to requests from customers that were struggling with complexity in their distributed IoT environments. Customers, analysts and partners have embraced intelligent IoT edge and are using it in conjunction with powerful cloud-based analytics.

Analytics on the edge is a game-changing approach that solves major problems for businesses looking to transform their operations in the age of IoT. The HPE Vertica Analytics Platform now runs at the IoT edge on the Edgeline EL4000. This combination gives enterprises generating massive amounts of data at remote sites a practical solution for analyzing it and generating insights.

Customers like CERN and Flowserve are using edge analytics to expand their monitoring of equipment conditions such as engine temperature, engine speed, and run hours to reduce maintenance costs. Telecom services companies are pushing analytics to the edge to deliver 4G LTE connectivity throughout the country, regardless of the location of the business.


Closing Thoughts 


The benefits of centralized deep compute make sense for traditional data. But the volume and velocity of IoT data have challenged this status quo. IoT data is Big Data. And the more you move Big Data, the more risk, cost, and effort you have to assume in order to provide end-to-end care for that data.

Edge computing is rebalancing this equation, making it possible for organizations to get the best of all worlds: deep compute, rapid insights, lower risk, greater economy, and more trust and security.


Friday, June 02, 2017

Managing Big data with Intelligent Edge



The Internet of Things (IoT) is nothing short of a revolution. Suddenly, vast numbers of intelligent sensors and devices are generating vast amounts of data that contain potentially game-changing information.

In traditional data analytics, all the data is shipped to a central data warehouse for processing in order to extract strategic insights; like other Big Data projects, this means tossing large amounts of data of varying types into a data lake to be used later.

Today, most companies are collecting data at the edge of their network: PoS terminals, CCTV, RFID scanners, etc., with IoT data being churned out in bulk by sensors in factories, warehouses, and other facilities. The volume of data generated at the edge is huge, and transmitting it to a central data center and processing it there turns out to be very expensive.

The big challenge for IT leaders is to gather insights from this data rapidly, while keeping costs under control and maintaining all security & compliance mandates.

The best way to deal with this huge volume of data is to process it right at the edge, near the point where the data is generated.
 

Advantages of analyzing data at the edge  


To understand why, let's consider a factory. Sensors on a drilling machine that makes engine parts generate hundreds of readings each second. Over time, these readings form set patterns. Data showing unusual vibrations, for example, could be an early sign of a manufacturing defect about to happen.

Sending the data across a network to a central data warehouse to be analyzed there is costly and time consuming. By the time the analysis is completed and plant engineers are alerted, several defective engines may already have been manufactured.

In contrast, if the analysis is done right at the site, plant managers can take corrective action before the defect occurs. Processing the data locally at the edge thus lowers costs while increasing productivity.
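The drilling-machine scenario can be sketched as a simple edge-side check; the vibration threshold below is an assumed value for illustration, and real systems would compare readings against learned patterns rather than a fixed limit:

```python
VIBRATION_LIMIT_MM_S = 7.1  # assumed alert threshold, mm/s

def check_reading(machine_id, vibration_mm_s):
    """Run the defect check right at the edge, so only alerts (not the
    raw sensor stream) need to reach the central data warehouse."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        return f"ALERT {machine_id}: vibration {vibration_mm_s} mm/s"
    return None

print(check_reading("drill-07", 9.3))  # triggers an alert
print(check_reading("drill-07", 2.1))  # normal reading, nothing to send
```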

Keeping data local also improves security and compliance, as any IoT sensor could potentially be hacked and compromised. If data from a compromised sensor makes its way to the central data warehouse, the entire warehouse could be at risk. Keeping data from traveling across the network prevents malware from reaching the main data warehouse. And if all sensor data is analyzed locally, then only the key results need be stored in the central warehouse, which reduces the cost of data management and avoids storing useless data.

In the case of banks, data at the edge could include Personally Identifiable Information (PII), which is bound by several privacy and data compliance laws, particularly in Europe.

In short, analyzing data at the edge, near the point where it is generated, is beneficial in many ways:

  • Analysis results can be acted on instantly as needed.
  • Security & compliance is enhanced.
  • Costs of data analysis are lowered.


Apart from these obvious advantages, there are several others:

1. Manageability:

It is easy to manage IoT sensors when they are connected to an edge analysis system. The local server that runs the data analysis can also keep track of all the sensors, monitor sensor health, and alert administrators if any sensor fails. This helps in handling the wide variety of IoT devices used at the edge.
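A minimal sketch of the sensor-health check an edge server could run; the sensor IDs, timestamps, and timeout are illustrative:

```python
def stale_sensors(last_seen, now, timeout_s=300):
    """Return sensors that have not reported within timeout_s seconds,
    so administrators can be alerted to likely failures."""
    return [sid for sid, ts in last_seen.items() if now - ts > timeout_s]

# Unix-style timestamps of each sensor's last report:
last_seen = {"temp-01": 9_900, "cam-02": 9_000, "rfid-03": 9_950}
print(stale_sensors(last_seen, now=10_000))  # → ['cam-02']
```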

2. Data governance: 

It is important to know what data is collected, where it is stored, and where it is sent. Sensors also generate lots of useless data that can be discarded or compressed. Having an intelligent analytics system at the edge allows easy data management via data governance policies.
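One simple way to discard useless readings at the edge is a deadband filter, which keeps a reading only when it differs meaningfully from the last kept value; the tolerance below is an illustrative assumption:

```python
def deadband_filter(readings, tolerance=0.5):
    """Drop readings that differ from the last kept value by less than
    `tolerance`, so near-duplicate sensor data is never stored or sent."""
    kept = []
    for value in readings:
        if not kept or abs(value - kept[-1]) >= tolerance:
            kept.append(value)
    return kept

print(deadband_filter([20.0, 20.1, 20.2, 21.0, 21.1, 25.0]))
# → [20.0, 21.0, 25.0]
```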

3. Change management: 

IoT sensors and devices also need strong change management (firmware, software, configurations, etc.). Having an intelligent analytics system at the edge enables all change management functions to be offloaded to the edge servers. This frees up central IT systems for more valuable work.

Closing Thoughts


IoT presents a huge upside in terms of rapid data collection. Having an intelligent analytics system at the edge gives companies a huge advantage: the ability to process this data in real time and take meaningful action.

Particularly in the case of smart manufacturing, smart cities, security-sensitive installations, offices, branch offices, etc., there is huge value in investing in an intelligent analytics system at the edge.

Conventional business models are being disrupted. Change is spreading across nearly all industries, and organizations must move quickly or risk being left behind by their faster-moving peers. IT leaders should go into the new world of IoT with their eyes open, both to the inherent challenges they face and to the new horizons that are opening up.

It's no wonder that a large number of companies are already looking at data at the edge.

Hewlett Packard Enterprise makes specialized servers called Edgeline Systems, designed to analyze data at the edge.

Wednesday, July 13, 2016

Facebook Disrupts Telecom with OpenCellular


Facebook today announced that it will launch OpenCellular, a mobile infrastructure platform designed to lower barriers to entry for would-be providers of Internet service to the developing world.

After facing a sharp rebuke to its Internet Basics program, Facebook has taken on a more ambitious project to enable free Internet in areas that are currently underserved.

OpenCellular is a customizable base chassis for a wireless access point that connects devices using 2G, LTE, or even Wi-Fi. The chassis is designed to be modular, to keep down costs and to make it easier to deploy at a high point in an open area, like a tall tower or a tree.

As a wireless access point, OpenCellular will allow users to connect to the Internet via cell phones. Keeping in mind that it is targeted at areas that are currently underserved, I would guess that most users will connect with cheap Android phones.

According to a blog post by Facebook engineer Kashif Ali: "We designed an innovative mounting solution that can handle high winds, extreme temperatures, and rugged climates in all types of communities around the world. The device can be deployed by a single person and at a range of heights — from a pole only a few feet off the ground to a tall tower or tree."

Facebook said that the emphasis in the design process was on keeping the device as modular and inexpensive as possible, as well as making it easy to deploy. With rural and remote installation locations in mind, the chassis can be powered from multiple sources: PoE, solar panels, external batteries, etc.

Given Facebook's penchant for open source and its contributions to the Open Compute Project, the hardware design and the software running on OpenCellular will eventually be made open source.

At this point, it is not clear how these OpenCellular base stations will connect to the Internet; it could be via UAV drones or a direct satellite uplink.

Free wireless Internet access could be a game changer in rural areas of India and other parts of Asia, where the existing telcos make tonnes of money by charging for Internet access and basic telephone calls. With free Internet, users can make calls over Facebook's WhatsApp and use Facebook for local mass communication. This will effectively disrupt the business models of a whole lot of cell phone service providers all over the world, while Facebook benefits indirectly by gaining more customers and users, which leads to greater advertisement revenue.

2017 - The year of Artificial Intelligence

Every leader in the technology sector knows that success is very transient in nature. Even global giants such as Kodak, Nokia, Blackberry, Motorola, and IBM have become victims of rapid technological change.

Rapid innovation brings rapid obsolescence. Technology is ever changing, and companies need to invest continuously in R&D to develop new products and technologies just to stay relevant in this ever-changing world.

We are already seeing the early fruits of artificial intelligence. The Google self-driving car is clearly the flag bearer, and today several startups are working hard to make self-driving cars the standard. This new technology could make current auto giants such as GM, VW, Toyota, Honda, and Tata obsolete. Taxi-hailing companies such as Uber will be the first to capitalize on it. Self-driving cars together with on-demand taxi services will change the way people look at owning an automobile.

While this is good for technology companies, this shift in the marketplace will deal a death blow to the auto industry, and millions of workers will lose their jobs.

Another sector where AI will bring a huge change is the world of IT services.

Today, several million people are employed in IT/BPO service sectors doing basic first-line or second-line product support. Low-cost communication technology moved these low-skill jobs from the US/Europe to India. But now, the even lower cost of AI will eliminate many of these jobs altogether.

Apple's Siri and Facebook's chat bots are just the beginning; soon all e-commerce companies and product support organizations will move first-line support functions to computers with cognitive intelligence.

To win this AI arms race, companies such as Microsoft, Google, Facebook, Tesla, and Amazon are investing billions of dollars to develop new AI technologies. Companies that master new AI technologies will be able to lower their operational costs, execute better, and thus compete better and win new business. For example, AI-enabled drones can speed up delivery for Amazon, reducing distribution costs, and customer support bots can answer customer questions.

Just as the Internet drove business from the 1990s to 2010, artificial intelligence will be the driver of the new smart world of the 2020s.

Monday, December 07, 2015

IoT is about to change Human Behavior


The Internet of Things (IoT) is no longer futuristic science fiction. It is here, and it is slowly but surely changing our lives. As the technology matures and the prevalence of connected devices increases, it will change our lives in several profound ways. IoT could have an impact greater than that of the World Wide Web!

I would say IoT will be one of those things that will fundamentally change the way we live on this planet:

 - Just like our ability to harness fire put humans on path to civilization,
 - Just like our ability to read & write helped create complex civilizations,
 - Just like the newspaper and the telegraph created a global economy,
 - Just like Internet & WWW created a knowledge economy,

IoT will fundamentally change the way humans live, behave & interact in the society.

Let me illustrate this with an example.

Let's say there is construction on a major highway in the city. One hundred years ago, information about this construction would be published in the newspaper, which very few people would read; the local administration would put up big hoardings all around the construction site requesting commuters to take alternate routes; and a local policeman would manually redirect traffic.

Percentage of people who knew about the advisory: 5%
Percentage of people who heeded the advisory: 1%

The remaining 99% would still drive the same route and get redirected by the local police.

With the advent of radio and television, the situation did not change much. Better communication technology meant more people were now aware of the construction, but few knew what to do about it.

Percentage of people who knew about the advisory: 20-30%
Percentage of people who heeded the advisory: 1-5%

The remaining 95-99% would still drive the same route and get redirected by the local police.

The Internet and World Wide Web, along with social networking tools, improved the level of awareness, but people would still pile up at the construction point and choke traffic, and we still needed police to redirect it.

Percentage of people who knew about the advisory: 50-80%
Percentage of people who heeded the advisory: 5-10%

While we now have all the tools to send out information, we as humans still have limited capacity to absorb information and act on it. So even with instant communications, real-time traffic map services, and social networks, people's behavior hardly changed.
Studies on the aftermath of Hurricanes Sandy and Katrina show that people now have awareness - but still do not act on that information!

But IoT is about to change this.

With the advent of GPS navigation systems, users can now chart their driving path to avoid the construction area. So with IoT,

Percentage of people who knew about the advisory: 100%
Percentage of people who heeded the advisory: 50-80%

The next generation of IoT does even better: with the advent of self-driving cars, information about road construction will be consumed by the car itself, which will reroute to avoid the construction site - avoiding unnecessary traffic jams and wasted time and fuel. Thus, with the second generation of IoT,

Percentage of people who knew about the advisory: 100%
Percentage of people who heeded the advisory: 100%
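The rerouting behavior described above can be sketched as a shortest-path search that simply drops any road segment reported closed before planning. This is a minimal illustration; the road graph, place names, and closure feed are all hypothetical:

```python
import heapq

def shortest_path(graph, start, goal, closed=frozenset()):
    """Dijkstra over a road graph, skipping any segment reported closed."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist in graph.get(node, {}).items():
            if (node, nxt) not in closed and nxt not in seen:
                heapq.heappush(queue, (cost + dist, nxt, path + [nxt]))
    return None

# Hypothetical city road graph; edge weights are travel minutes.
roads = {
    "home":     {"highway": 5, "sideroad": 9},
    "highway":  {"office": 10},
    "sideroad": {"office": 12},
}

# Without the advisory, the highway route wins...
print(shortest_path(roads, "home", "office"))
# ...but once the construction closure is broadcast, the car reroutes itself.
print(shortest_path(roads, "home", "office", closed={("home", "highway")}))
```

The car never asks the human to heed anything: the closure is just another input to the planner, which is why the "heeded" percentage jumps to 100%.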

First Generation of IoT


While IoT is no longer a futuristic pipe dream, the technology is still maturing. It has evolved a lot since Simon Hackett and John Romkey operated a toaster via the Internet in 1990. Today we have smart thermostats such as Nest, and fridges that can display tweets!

Today, connected devices are far more prevalent - collecting lots of data from the surrounding environment. But these are essentially the first generation of IoT.

Collecting and presenting information is just a proof of concept of the role IoT can play in our daily lives. Devices such as smart watches or Fitbit bands can collect a wealth of information on a person's daily activity, but they do very little to change the person's behavior.

Very few people change their actions and behavior based on the information presented to them. I would say fewer than 1% of people changed their exercise pattern because of smart bands/watches.

The first generation of IoT was all about collecting data. But data was not an end in itself; consumers are already overwhelmed by data. Some systems were able to present data in a meaningful way - as actionable information people could act on. For example, smart baby monitors can alert parents when the baby's breathing pattern changes. But mothers are still very anxious and do not change their sleeping patterns - even knowing that their baby is being safely and actively monitored.

First-generation IoT did not make life better; rather, it added tremendous complexity with yet another gadget that consumes attention and feeds our obsessive-compulsive tendencies.

Smart products today still face big challenges in getting users to interact with them and take action.

The first generation IoT was still far away from creating meaningful experiences.

Second Generation of IoT


The second generation of IoT is just around the corner. A few systems are already being deployed and tried - but these are just the beginning. These smart systems react to all the information coming from connected devices in meaningful, purposeful ways.

For example, smart insulin pumps can alter the quantity of insulin pumped into the body based on real-time blood sugar levels.
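A closed-loop pump of this kind is, at its core, a feedback controller: measure, compare to a target, act. The sketch below uses a simple proportional rule; the target level, gain, and rate cap are illustrative values chosen for the example, not medical parameters:

```python
def insulin_rate(glucose_mgdl, target=110.0, gain=0.01, max_rate=2.0):
    """Return an insulin delivery rate (units/hour) proportional to how far
    the measured blood sugar is above target, never negative and capped at
    the pump's safety limit."""
    error = glucose_mgdl - target
    return round(max(0.0, min(max_rate, gain * error)), 2)

print(insulin_rate(110))  # 0.0 - at target, the pump idles
print(insulin_rate(180))  # 0.7 - above target, delivery scales up
print(insulin_rate(400))  # 2.0 - capped at the safety limit
```

The point of the example is the loop itself: the sensor reading drives the actuator directly, with no human in between.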

The second generation of IoT goes beyond collecting and presenting data. It has to simplify life through automation. Connected devices have to remove tedious chores from our daily lives, allowing people to concentrate on more important things.

Second-generation IoT has to provide the ability to control things automatically - which we otherwise would not. For example, changing a building's thermostat settings to match power-savings plans or real-time power supply data: when grid load increases beyond a point, the smart grid can tell all connected smart buildings to lower their power consumption. Smart buildings then react to this request by changing their temperature and lighting settings.
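A demand-response exchange like this can be modeled as a simple message handler inside each building's controller. The message fields, setpoints, and adjustment factors below are assumptions made up for illustration:

```python
def handle_grid_signal(building, signal):
    """Adjust one building's settings when the smart grid broadcasts a load event."""
    if signal["event"] == "reduce_load":
        # Raise the cooling setpoint and dim the lights in proportion to severity.
        building["thermostat_c"] += 2 * signal["severity"]
        building["lighting_pct"] = max(40, building["lighting_pct"] - 20 * signal["severity"])
    elif signal["event"] == "normal":
        # Grid load is back to normal: restore the building's defaults.
        building["thermostat_c"] = building["default_thermostat_c"]
        building["lighting_pct"] = 100
    return building

office = {"thermostat_c": 22, "default_thermostat_c": 22, "lighting_pct": 100}
handle_grid_signal(office, {"event": "reduce_load", "severity": 1})
print(office["thermostat_c"], office["lighting_pct"])  # 24 80
```

Each building reacts on its own; the grid only broadcasts the event and lets the local controllers decide how to shed load.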

The next big innovation in IoT will be creating user experiences that make lives richer by seamlessly eliminating mundane activities. One must create systems focused on meeting a conscious or unconscious need - for example, self-driving cars, which meet the need for transport on demand without the hassle of driving.

Today, airlines and aircraft use IoT data to change flight parameters for better fuel efficiency and passenger comfort - all without pilot action.

Closing Thoughts


Developing these second-generation IoT systems will certainly require a different way of thinking. It also presents a tremendous opportunity to enhance the human experience and improve life in ways that were never possible before.

Smart systems must be able to understand our intentions and desires and then act independently to create those experiences. When IoT and smart systems can take real-time, meaningful action - without human interaction - IoT will become mainstream.

The second generation of IoT will not be centered around the product. Instead, it will be centered around user experiences, while hiding all the complexity of the technology. Second-generation IoT does away with unnecessary interfaces and enables customers to concentrate on more important tasks. When IoT products (sensors, actuators), the network, and the apps all work seamlessly together, they will enhance the customer experience - without an "experience of the product".

Friday, December 04, 2015

How IoT will help Sustainability


Anyone who has a car today has experienced a traffic jam, and once in a while, waiting in one, the thought occurs: how can we eliminate this traffic jam - and help the environment - through better use of technology?

Several of today's startups are working on latest technology that may do just that.

In my previous post I mentioned how IoT is helping build sustainable buildings. But there is more technology being built on top of IoT that will profoundly change the way we live - and thus reduce energy consumption by increasing energy efficiency.

Cars are the BIGGEST users of energy today. It is no wonder that a lot of innovation and new technology is being developed to improve energy utilization in cars. Some innovations we have already seen:

1. Hybrid Cars that run on electric motors and fossil fuel engines
2. Electric Cars that can be charged from Solar panels
3. Uber's network of cars, where Internet connects the drivers to customers
4. Google's self driving cars

Now the stage is set for the next revolution: a network of connected cars, connected buildings, and smart traffic management systems.

Self-driving electric cars integrated with Uber-like technology will enable commuters to share rides - eliminating waste. One person driving alone in a car will soon be history!

Enabling a system where self-driving taxis can be hailed via smartphones will require many new IoT-driven technologies:

1. Smart/intelligent cars
2. Traffic monitoring systems on roads, such as cameras and vehicle speed detectors
3. Intelligent traffic monitoring and management via smart traffic lights
4. Smart parking spaces and recharge points, which broadcast empty slots to self-driving cars
5. Air quality and pollution monitoring and broadcasting systems, which will guide hybrid cars to switch to electric mode in polluted areas
6. Commute demand management systems, which will automatically determine the best possible commute options
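One of these pieces - parking spaces broadcasting empty slots to self-driving cars - can be sketched as a tiny publish/subscribe loop. The lot names, capacities, and selection rule here are made up purely for illustration:

```python
class ParkingLot:
    """A smart parking lot that broadcasts its free-slot count to subscribers."""
    def __init__(self, name, capacity):
        self.name, self.capacity, self.occupied = name, capacity, 0
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def update(self, occupied):
        # New occupancy reading from the lot's sensors: notify every listener.
        self.occupied = occupied
        free = self.capacity - occupied
        for notify in self.subscribers:
            notify(self.name, free)

# A self-driving car tracks the latest broadcast from each lot...
best = {}
def car_listener(lot, free):
    best[lot] = free

downtown = ParkingLot("downtown", 100)
airport = ParkingLot("airport", 50)
for lot in (downtown, airport):
    lot.subscribe(car_listener)

downtown.update(95)   # 5 slots free
airport.update(20)    # 30 slots free

# ...and heads for whichever lot has the most free slots.
print(max(best, key=best.get))  # airport
```

A production system would push these broadcasts over a protocol such as MQTT rather than in-process callbacks, but the pattern - sensors publish, vehicles subscribe and decide - is the same.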

I am sure several other technologies will emerge in the next decade.

On November 30th, amidst the backdrop of the COP21 Climate Change Conference in Paris, a group of high-powered tech executives announced the formation of the Breakthrough Energy Coalition and its intention to push investment into pioneering climate change technologies. With the technology available today, there can be some real change in energy efficiency in the enterprise.

IoT and the always-connected network form the backbone of these new technological breakthroughs that will build a sustainable world. IoT forms the eyes, ears, and senses of this new technology, while wireless networks form the interlinks that connect people, businesses, and cities together and provide the relevant information for insights on what needs to be done to reduce energy consumption in real time.

Linking vehicles, commuter traffic, and emissions to air quality gives traffic management the right data to manage for optimal efficiency: reduced operational cost, greenhouse gas emissions, and environmental footprint.

The key value will come from raising efficiency. Today, people who drive their cars and big SUVs typically use them for 5% of the time and at 20% of capacity - i.e., ~99% of all available transport capacity in cars and automobiles is wasted!
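The ~99% figure follows directly from multiplying the two utilization factors quoted above - a quick back-of-the-envelope check:

```python
time_in_use = 0.05    # car is driven ~5% of the day
seats_in_use = 0.20   # ~20% of seating capacity occupied while driving

# Effective utilization is the product of the two factors.
utilization = time_in_use * seats_in_use
print(f"Effective utilization: {utilization:.0%}")    # 1%
print(f"Wasted capacity: {1 - utilization:.0%}")      # 99%
```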

This will have to change, and the technology sector will drive this change to create sustainable cities.

Similarly, almost 100% of the sunlight falling on cities is wasted! Developing more effective and efficient solar photovoltaic panels could eliminate the need for fossil fuels in cars.
New technology can change the way we consume energy and lead us to a greener, more sustainable future.

Closing Thoughts

Today, cars are among the biggest polluters on the planet. With new technology based on IoT, a connected world, and renewable energy, we can change the way we commute - and that will go a long way toward reducing pollution and creating a sustainable world.