Wednesday, November 09, 2016

Effect of Demonetization - Indian Digital Wallets will see a surge in Usage

Indian payments companies like Paytm, Freecharge and Mobikwik will see a huge surge in the number of transactions in the coming days - as the government bans the use of Rs 500 & Rs 1000 currency notes.

People in cities - mainly metro cities and tier-1 & tier-2 cities - will be forced to use digital wallets to pay for low-value transactions. With bank ATMs shut down today - November 9th - and with withdrawals limited to Rs 2000/- till November 18th, I see a huge opportunity for digital wallets.

Today, I saw many Auto drivers accepting PayTM or Freecharge for payments. At lunch hour, I walked by a roadside push cart fruit vendor accepting PayTM!

Digital wallet companies will see 10 to 15 times increase in transactions in this month alone.

Indian Prime Minister Narendra Modi's fight against corruption and black money will be aided by digital wallets - which also ties into "Digital India", another major initiative of PM Narendra Modi.

As of November 1st, digital wallet companies processed 2-3 million transactions/day, but by November 10th, this number will be around 30-40 million!

This could be a game changing moment for digital payment companies! 

How the ban on Rs 500 and Rs 1000 notes will affect the common man

November 8th 2016 will go down in Indian history as a red letter day. It marks the day the Indian economy started moving from a black economy to a white one.

In what will be known as a ground-breaking, historic move, Prime Minister Narendra Modi announced the demonetization of Rs 500 and Rs 1000 currency notes on November 8th.

So what does this mean for the common man?

India has been a cash-based economy. Nearly 14 lakh crore rupees - about $220 billion - is held in currency notes of Rs 500 & Rs 1000. This implies that the impact on the Indian economy will be huge - very huge.

Impact on Common Man

Day 1-10: Near panic in local markets. The number of transactions drops by more than 50%. Today, November 9th, almost all businesses have reported a more than 50% drop in transactions. I chatted with an Uber driver and a small coffee shop owner; both reported the same. The Uber driver was willing to give me a 10% discount for cash payment in Rs 100 notes instead of PayTM!

Common people in cities will rush towards digital payments like PayTM.

Immediate impact: Deep Deflation. The amount of money in circulation will drop dramatically while supply of goods will remain stable - hence prices of goods will drop.

Gold prices, stock prices and commodity prices will drop. People will congratulate the government for making this bold move. BJP will win elections in UP and Punjab.

Day 10-50: People who have legally earned cash will start depositing it in banks. This will help improve banks' cash reserve ratios and increase bank deposits, which will lead to more lending. Increased lending activity will make it easier for legal businesses to raise capital, and the economy will grow.

People who have earned their money illegally - through bribes, smuggling, narcotics, etc. - will have a big problem on their hands. These people will be afraid to deposit it in a bank. Some of them will find ways to deposit this money into a bank, declare it as income and pay taxes on it.

Many of these people - who had easy money flowing in - will continue to stay out of the legal system, take their chances and sit on their stash of Rs 500 and Rs 1000 notes. This money will effectively be taken out of circulation, and that aids deflation.

Day 50-200: Deflation will ease off, and inflation will return. Inflation will come back slowly, because lending activity will not ramp up overnight. Lending will broaden the money supply, creating demand for raw materials and capital goods. This leads to steady growth of the Indian economy.

Real estate prices will crash. Builders and developers who were eager to sell for cash can no longer do so, and will be forced to lower prices by 10-20%. Already by 1 PM on November 9th, the share price of DLF was down 21%!

Real estate developers will have to wait for demand from the white economy to pick up. Once the economy picks up and with easy availability of bank loans, real estate prices will come back to pre-November 8th levels, and by the end of 2017, robust demand should push real estate prices up again.

Real estate developers will be forced to go with legal transactions and play in the white economy.

Big Losers

The biggest losers here are corrupt government officials and politicians who are sitting on tonnes of cash. They cannot convert the demonetized notes to newer ones without risking tax investigations, and many will be willing to simply lose their illegal money.

Real estate businessmen who cannot convert all their hoards of cash will also be hurt by low demand.

Other illegal business owners - money lenders, hawala finance operators - will find it difficult to conduct their business in the new system, particularly if, as rumoured, the government can track the newer Rs 2000 currency notes.

Closing Thoughts

This is just my opinion based on my knowledge of the economy. I may be wrong in some aspects, but overall I am sure the Indian economy will go through a cycle of deflation, followed by robust growth and then some creeping inflation.

Let's wait and see how things pan out!




Friday, October 28, 2016

How Digital Wallets can help Banks

In my previous article, I mentioned how digital wallets are disrupting banks. Profitable retail banking operations such as credit cards are at risk of being disrupted by digital wallets.

From a strategy perspective, the best way to counter this disruption is to offer a digital wallet service as part of basic banking services to both retail and commercial customers.

Customer Behavior Shifts

Millennials are changing the rules of engagement for banks. This generation was born with the Internet and is tech-savvy. They are more socially active online and need tools that reflect their digital lifestyle. Millennials prefer digital interactions on mobile over face-to-face banking interactions.

Millennials are willing to source multiple financial services from different banks - based on each bank's digital capability. This implies that banks will now have to build newer digital platforms to cater to the needs of this new generation.

Digital wallets are being embraced by this new generation faster than by any other demographic group. The financial crisis of 2007 eroded trust in traditional banking and assets. The younger generation is more willing to rent rather than own. This implies more dynamic and frequent transactions - where digital wallets have an advantage over traditional methods. Going forward, this trend is likely to continue as older customers drop off.

While this generational shift represents a big challenge, it also offers the biggest growth opportunity for banks.

Strategic Implications of Digital Wallet

Millennials are technology savvy and generate huge amounts of data using smart devices; banks need to make sense of this data. Since the 2007 crisis, millennials have been trading down, deleveraging and seeking long-term financial planning.

To win the loyalty of millennials, banks need to compete to offer customized services on time. A delayed offer is a wasted offer.

Using real-time analytics tools on data gathered from mobile devices, digital wallets, social networks, etc. can generate crucial insights into products and services that can be created and targeted at this generation. Banks can use this for instant feedback and look for spontaneous opportunities to offer personalized products or services in credit or investment services.

For example, if a customer receives a cash gift, the bank can instantly offer investment products.

Tracking financial transactions - by time, geographical location and social network - and then running real-time analytics on this data can also feed into marketing communications. Instead of using traditional print advertisements or Internet adwords, banks can create multi-channel, customized banking communication experiences. This will improve customer experience and deliver marketing information to customers at a much lower cost.

For example, when a bank knows the customer is traveling abroad, it can offer foreign exchange services and travel insurance via the mobile digital wallet (instead of waiting for the customer to call the bank for foreign exchange).
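
To make this concrete, here is a minimal sketch of a rule-based, real-time offer trigger (the rules, thresholds and transaction fields are my own illustrative assumptions, not any bank's actual system): each incoming wallet transaction is checked against simple rules and a personalized offer is pushed immediately.

# Minimal sketch (illustrative assumptions, not a real banking system):
# a rule-based next-best-offer trigger over a stream of wallet transactions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Transaction:
    customer_id: str
    amount: float          # positive = credit, negative = debit
    category: str          # e.g. "gift", "bonus", "travel", "grocery"
    country: str           # country where the transaction happened

def next_best_offer(txn: Transaction, home_country: str = "IN") -> Optional[str]:
    """Return an offer to push to the customer's wallet, or None."""
    # Large unexpected credit (e.g. a cash gift) -> propose an investment product.
    if txn.amount > 50_000 and txn.category in {"gift", "bonus"}:
        return "offer: short-term investment / fixed deposit"
    # Spending abroad -> propose forex top-up and travel insurance.
    if txn.country != home_country:
        return "offer: foreign-exchange top-up + travel insurance"
    return None

# Usage example
if __name__ == "__main__":
    stream = [
        Transaction("C001", 75_000, "gift", "IN"),
        Transaction("C002", -3_200, "travel", "SG"),
        Transaction("C003", -450, "grocery", "IN"),
    ]
    for txn in stream:
        offer = next_best_offer(txn)
        if offer:
            print(txn.customer_id, "->", offer)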

Real-time analytics of all financial transactions - by time, geographical location and social network - also aids in fraud detection. Banks get to know what is happening in real time.

Knowing the network of transactions occurring in real time, and the nature of those transactions, can prevent fraud from taking place. Fraud detection must go beyond historical data and combine it with real-time data to analyze possibly fraudulent transactions and predict scenarios that can help decision making.
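
As a small sketch of that idea (assumed thresholds and features, not a production fraud engine), historical statistics for a customer can be combined with the live transaction to produce a suspicion score:

# Assumed, simplified sketch: combine a customer's historical statistics with
# a real-time transaction to flag possible fraud. Amounts far outside the
# customer's normal range, or an unusual country, raise the score.

import statistics

def fraud_score(history_amounts, txn_amount, usual_countries, txn_country):
    """Return a score in [0, 1]; higher means more suspicious."""
    score = 0.0
    if len(history_amounts) >= 5:
        mean = statistics.mean(history_amounts)
        stdev = statistics.pstdev(history_amounts) or 1.0
        z = abs(txn_amount - mean) / stdev
        if z > 3:          # amount is far outside the customer's normal range
            score += 0.6
    if txn_country not in usual_countries:   # unusual location
        score += 0.4
    return min(score, 1.0)

# Usage example
history = [500, 800, 650, 700, 900, 750]
print(fraud_score(history, 90_000, {"IN"}, "RU"))   # high score: flag for review
print(fraud_score(history, 720, {"IN"}, "IN"))      # 0.0: looks normal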

Digital data trails from different transactions and different digital systems - such as location information, purchase history and the network of transactions - will help improve risk analytics across risk types and business units, and help implement predictive risk management systems.

Understanding risks in real time helps banks comply with new regulations and offer risk weighted services.

Conclusion

Digital wallets present the greatest threat to traditional banks - but they also provide a great opportunity for new growth, lower costs and increased profitability by offering personalized services to customers.



Thursday, October 27, 2016

Fintech Disrupts the Personal Cheque Book

When was the last time you wrote out a cheque? I asked a few of my friends and relatives. Almost all of them had a difficult time recalling when they last wrote one.

As digital wallets become mainstream, the first casualty is the personal cheque book. With about 75% of consumers now using smartphones, it will be only a short time before users stop using cheque books completely. The current trend lines point to the percentage of bank customers using cheques falling to nearly 0% by 2020.

If handled properly, this disruption is essentially a good one for the banks. When almost any kind of payment can be done electronically, Banks will save costs by eliminating cheques.


Figure-1: Digital wallets, along with online eBanking tools such as IMPS, NEFT and RTGS (in India), are clearly on their way to replacing personal cheques.

However, I don't think cheques will ever go entirely extinct - at least in India. While it is technically feasible to make all payments electronically, there will be a small and shrinking set of customers who still prefer to write out cheques - mostly out of pure habit, due to technological challenges, or due to fear of hacking. (My parents, for example, will never get on to eBanking.)

Eventually, digital wallets will replace all credit cards, debit cards and personal cheques.

Benefits for Banks

If banks can leverage this new technology, they can lower operational costs by eliminating branch operations for handling cheques and paying out cash.

Essentially, banks will have to partner with digital wallet service providers or technology providers and start their digital transformation into pure digital online banking operations. This will help banks lower operational costs and compete with newer financial service providers.

Wednesday, October 26, 2016

How Fintech will Disrupt Banking Industry


The banking industry enjoyed strong barriers to entry for a long, long time. It was difficult for new banks to start - licensing and regulation kept new entrants away. As a result, banks enjoyed low customer switching, which in turn allowed them to earn high returns on capital over extended periods. Banks could easily get 16-18% returns.

Fintech is now changing this industry. New fintech startups are launching discrete banking products that disrupt a particular segment of banking services. For example, digital wallets are disrupting credit/debit cards.

Let's take a look at how digital wallets are disrupting credit cards. In India, digital wallets such as PayTM, Mobikwik and others are at a stage where the number of transactions over digital wallets will exceed the number of payments made with credit and debit cards.


The rise of digital wallets is changing the industry's dynamics. By 2017, more transactions will be done over digital wallets than with the older credit or debit cards.



As digital wallets gain preeminence, they can morph into credit cards, and offer credit to customers and even to merchants who accept digital wallet payments.

Today most consumers who use digital wallets such as PayTM top up their wallets from their credit/debit cards; having a credit-card-like facility available directly on the digital wallet would make them stop using credit/debit cards altogether.

Digital wallets use cloud computing and capture all the transaction data needed to analyze customer or merchant usage. Based on this transactional information, a credit score can be developed, against which loans or business lines of credit can be issued.
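
As a rough illustration of the idea (a hypothetical scoring heuristic with made-up weights, not any wallet provider's actual model), a wallet operator could reduce a merchant's transaction history to a few features and map them to a simple score:

# Hypothetical sketch: turn a merchant's wallet-transaction history into a
# crude credit score. Real scoring models are statistical/ML models trained
# on repayment outcomes; this only illustrates the data flow.

def wallet_credit_score(monthly_volumes, chargeback_rate, months_active):
    """monthly_volumes: list of monthly transaction totals (in rupees)."""
    if not monthly_volumes:
        return 300                             # minimum score, no history
    avg_volume = sum(monthly_volumes) / len(monthly_volumes)
    growth = monthly_volumes[-1] / max(monthly_volumes[0], 1)

    score = 300
    score += min(avg_volume / 1_000, 250)      # steady volume builds score
    score += min(months_active * 5, 100)       # tenure on the platform
    score += 100 if growth > 1.1 else 0        # growing business
    score -= min(chargeback_rate * 1_000, 150) # disputes/chargebacks hurt
    return int(max(300, min(score, 900)))

# Usage example: a merchant doing ~Rs 80k/month for 18 months, 0.5% chargebacks
print(wallet_credit_score([60_000, 75_000, 82_000, 90_000], 0.005, 18))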

In short, digital wallets will completely disrupt and swallow the debit and credit card business of banks.

Friday, October 21, 2016

Hiring Analytics


Recently my wife asked me about hiring analytics. Though I am not into HR analytics, I had read enough about IBM Watson Analytics and Hadoop/Spark use cases, and I had studied neural networks during my MS course at Texas A&M - so I had a wide range of material to draw on for a discussion of hiring analytics.

So I could wing it in a discussion, and I decided to write up what I said in this blog (my wife's suggestion to blog this).

Why use Analytics for Hiring?

In today's fast-paced economy, companies tend to hire people with relevant skills from outside. For example, a bank is willing to hire a business analyst from the manufacturing sector rather than train a banker in analytics.

In short, hiring is critical to building capabilities quickly. Therefore it becomes important to hire employees who can meet the company's requirements and fit into its corporate culture. Therein lies the challenge: "How to hire someone from outside - who has the relevant knowledge needed in banking and who will fit in with the existing corporate culture."

This challenge can be solved by using data analytics during the selection process.

Today, every individual creates tonnes of digital data and leaves a wide digital trail behind. By looking at this digital trail, and other digital data, one can develop fairly sophisticated analytics tools for hiring.

The two most popular tools in Hiring Analytics are:

  1. LinkedIn Talent Solutions 
    Ebook on using LinkedIn Talent Solution can be seen here:
  2. IBM Watson for Hiring 

The benefits of hiring the right candidates are well known: they are more likely to perform better and stay longer.

What goes into this Analytics

Hiring analytics works best when we use standard, verifiable biographical data - things that can be checked:

  • Number of jobs the person has held to date
  • How long they stayed in those jobs
  • Number of promotions
  • Level of education
  • Photos of the candidate on the Internet
  • Industry-relevant skills
  • Public records
  • Frequency of continuous learning and development
  • Affiliations to various organizations - cultural, political, etc.
  • Cultural background

In addition to the basic resume, data from social media - Facebook, Twitter and LinkedIn - is used for this analysis. This analysis can provide insights into candidate sentiment on various organizational factors such as productivity, business growth, or other objectives.

For a particular role - say, a system architect - a company can set a basic set of requirements which defines the basic talent pool. Once these requirements are collected, automated tools from Monster, LinkedIn, etc. can be used to scour millions of profiles; for each matching profile, all the relevant data points can be collected and modeled or numerically analyzed with various analytics tools - sentiment analysis using MapReduce, skill-level analysis using recursive neural networks, cultural-fit analysis using RNTN, etc.

The results of this analysis can then be used to short-list a set of potential candidates from a much larger pool.
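
As a toy illustration of this shortlisting step (a hypothetical scoring function with made-up requirements, not LinkedIn's or Watson's actual method), profiles can be scored against the role's requirements and ranked:

# Hypothetical sketch: score candidate profiles against a role's requirements
# and keep the top matches. Real pipelines would add NLP-based skill
# extraction, sentiment analysis and cultural-fit models on top of this.

ROLE_REQUIREMENTS = {"python": 3, "distributed systems": 2, "kafka": 2, "aws": 1}

def score_profile(profile):
    """profile: dict with 'skills' (set of strings) and 'avg_tenure_years'."""
    skills = {s.lower() for s in profile["skills"]}
    skill_score = sum(w for skill, w in ROLE_REQUIREMENTS.items() if skill in skills)
    tenure_bonus = 1 if profile.get("avg_tenure_years", 0) >= 2 else 0  # verifiable data point
    return skill_score + tenure_bonus

def shortlist(profiles, top_n=2):
    return sorted(profiles, key=score_profile, reverse=True)[:top_n]

# Usage example with made-up profiles
candidates = [
    {"name": "A", "skills": {"Python", "Kafka", "AWS"}, "avg_tenure_years": 3.0},
    {"name": "B", "skills": {"Java"}, "avg_tenure_years": 1.0},
    {"name": "C", "skills": {"Python", "Distributed Systems"}, "avg_tenure_years": 2.5},
]
for c in shortlist(candidates):
    print(c["name"], score_profile(c))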

During the interview process, further information about the candidate can be captured using a basic survey, psychometric tests or other testing tools. Data from these tools can then be used to identify and rank candidates based on company-specific parameters such as:


  • Willingness to join
  • Time to join
  • Onboarding process
  • Skills & training development needs
  • Retention schemes
  • Cost to company (level of salary expected by the candidate)


The main objective of hiring analytics is to automate the hiring process as much as possible and provide hiring managers with the necessary information about each candidate - so that they can make the right hiring decision.

Apart from actual hiring, hiring analytics can also be used on existing or new staff - to understand retention ratios and plan for future hiring as well.

Closing Thoughts

Information technology is rapidly transforming the hiring process. The industry has come a long way from the days of placing recruitment advertisements in newspapers. Today companies can rapidly analyze millions of profiles and shortlist potential candidates - not just based on keyword search, but based on candidates' digital data trails - which give a bigger picture than the standard resume alone.

Hiring analytics is still in its infancy and companies are still taking baby steps, but the power of analytics is enormous. Technical advances in data-driven analytics are being applied to hiring: companies are adopting predictive analytics to make hiring decisions and shape HR strategies. Data analytics can also be used for other HR functions such as attrition risk management, employee sentiment analysis, employee skill training plan development, etc.

Someday in the future, analytics tools will themselves find and hire the right candidate - with no human involvement.

Recall the movie "Gattaca" - where companies use DNA analysis to hire candidates! While the technology for DNA analysis exists today, it is not used in standard hiring. However, I guess agencies like NASA, the NSA and the CIA may be using it today - but in secret!

Wednesday, October 19, 2016

Future of Business IT : APIs


There was a time when one had to walk into a secure facility to access a computer - it was called the mainframe era. Today all that computing power is available on a mobile device over the Internet, via APIs!

Mobile apps and web apps rely on APIs to connect and communicate programmatically over the Internet. Google's recent purchase of Apigee underscores the need for application programming interfaces in today's connected economy. In this new world of mobile apps, APIs are used to connect different systems, and the constant exchange of data between those systems powers the app economy.

For example, when you open the Flipkart mobile app and browse a particular product supplied by a vendor, the app on your phone interacts with the retailer's catalog via APIs, and the retailer's site in turn gathers pricing and shipping data from the vendor's side via other APIs to give you a complete picture.

In short, data is the new currency in the app world - and that data is made available over APIs.

The Importance of APIs

As apps proliferate, the life-cycle of an app can be as short as a few weeks!

With such a short life-cycle, it makes more sense to use APIs to collect, collate and present information to users, and to transport information back to the main IT systems. In short, APIs become more than just an interface; they are the central hub of the application.

APIs make business processes simpler and smoother by connecting firms' customer-facing apps to different back-end IT systems. For example, one can launch a web site that uses a facial recognition API from Clarifai to authenticate users, presents stock data from Bloomberg over another API, and enables trades with yet another API.
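
A minimal sketch of what such composition looks like in practice - the endpoint URLs, parameters and response fields below are purely illustrative placeholders, not the real Clarifai or Bloomberg APIs:

# Illustrative only: compose three hypothetical REST APIs behind one service
# call. The endpoints and response schemas are made up for this sketch; real
# providers have their own authentication and data formats.

import requests

def get_portfolio_view(user_photo_bytes, ticker):
    # 1. Authenticate the user via a (hypothetical) facial-recognition API.
    auth = requests.post("https://api.example-faceid.com/v1/verify",
                         files={"image": user_photo_bytes},
                         timeout=5).json()
    if not auth.get("verified"):
        raise PermissionError("face not recognized")

    # 2. Fetch a quote from a (hypothetical) market-data API.
    quote = requests.get(f"https://api.example-marketdata.com/v1/quote/{ticker}",
                         timeout=5).json()

    # 3. Return a combined view; a trade could be placed via yet another API.
    return {"user": auth["user_id"], "ticker": ticker, "price": quote["last_price"]}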

The ability to integrate multiple APIs to create a seamless, user-friendly service is the need of the day. APIs allow firms to expand into new markets that were not reachable before. For example, a bank in Vietnam can offer global investment services via APIs - which was not possible a few years ago.

APIs enable interactions/transactions between a business and its customers across multiple devices, apps, social networks, business networks, and cloud services.

A Growing Market for APIs

Today, it is estimated that there are 15K-20K APIs. This will grow into several million by 2020. The market for paid APIs could exceed $3 billion in 2020.

The size of the market implies that APIs are no longer a novelty. APIs are becoming the preferred way to interact with IT systems and exchange information/data, and thus to build valuable new products in the app economy.

It's no wonder that companies like IBM are offering open access to Watson, their AI platform, via APIs. This allows IBM to attract new customers globally and develop an unprecedented range of new services.

Closing Thoughts

APIs are not just a technical concept; they are the preferred way to offer new, valuable services. Soon APIs will be the primary way customers interact with businesses, and the way companies interconnect and interoperate.

Tuesday, October 18, 2016

Fintech Needs High-Performance Computing

Newer fintech companies are planning to disrupt financial markets. According to Accenture, these newer fintech companies are targeting ever-faster settlement times.



To compete, the current incumbents will have to match the turnaround time of the newer fintech companies. To get to such fast turnaround times with existing workloads, companies will need High Performance Computing (HPC).

Historically, financial companies have been first adopters of advanced computing technologies - mainframes in the 1960s, Unix servers in the 1990s. Today, activities such as high-frequency trading, complex simulations and real-time analytics are built on dedicated data centers filled with a diverse set of HPC systems.

These HPC systems are used to gather, parse, analyze and act on huge amounts of data - often several Petabytes/day. Having greater computing power increases competitive advantages in the market.

Let us see how HPC aids in building competitive advantages.


The only way for financial companies to address these challenges is to use HPC solutions.

Now, let's look at what constitutes an HPC system.

From a hardware perspective, an HPC system has four components:


Market Outlook

It is clear that HPC provides competitive advantages to financial companies. According to IDC, total global revenue for the HPC market (including servers, storage, software and services) will increase from $21 billion in 2014 to $31.3 billion by 2019!

Tuesday, August 16, 2016

Five core attributes of a streaming data platform

Currently, I am planning a system to handle streaming data. Today, in any data-driven organization, there are several new streaming data sources such as mobile apps, IoT devices, sensors, IP cameras, websites, point-of-sale devices, etc.

Before designing a system, first we need to understand the attributes that are necessary to implement an integrated streaming data platform.

As a data-driven organization, the first core consideration is to understand what it takes to acquire all the streaming data: the variety, velocity and volume of the data. Once these three parameters are known, it is time to plan and design an integrated streaming platform that allows for both the acquisition of streaming data and the analytics that make up streaming applications.

For the system design, there are five core attributes that need to be considered:

1. System Latency
2. System Scalability
3. System Diversity
4. System Durability
5. Centralized Data Management

System Latency

Streaming data platforms need to match the pace of the incoming data from the various sources that make up the stream. One of the keys to a streaming data platform is the ability to match the speed of data acquisition with the speed of data ingestion into the data lake.

This implies designing the data pipelines required to transfer data into the data lake and running the necessary processes to parse and prepare the data for analytics on Hadoop clusters. Data quality is a key parameter here: it takes a definite amount of time and compute resources to sanitize data - to avoid "garbage in, garbage out" situations.

Data security and authenticity have to be established before data gets ingested into the data lake. As data is collected from disparate sources, a basic level of data authentication and validation must be carried out so that the core data lake is not corrupted. For example, in the case of real-time traffic analysis for Bangalore, one needs to check that the data is indeed coming from sensors in Bangalore; otherwise the analysis will not be accurate.

Data security is one of the most critical components to consider when designing a data lake. The subject of security can be a lengthy one and has to be customized for each use case. There are many open source tools available that address data governance and data security: Apache Atlas (Data Governance Initiative by Hortonworks), Apache Falcon (automates data pipelines and data replication for recovery and other emergencies), Apache Knox Gateway (provides edge protection) and Apache Ranger (authentication and authorization platform for Hadoop) in the Hortonworks Data Platform, and Cloudera Navigator (Cloudera enterprise edition) in the Cloudera Data Hub.

Depending on the data producers, data ingestion can be "pushed" or "pulled", and the choice of push or pull defines the choice of tools and strategies for data ingestion. If there is a need for real-time analytics, the corresponding system requirements need to be taken into consideration: the streaming data has to be fed into the compute farm without delays, which implies planning out network capacities, compute memory sizing and compute capacity.

In the case of on-demand or near-real-time analytics, there will be additional latency for the data to land in the data warehouse or get ingested into the data lake, and then for the systems to feed the ingested data to a BI system or a Hadoop cluster for analysis. If there are location-based dependencies, one needs to build distributed Hadoop clusters.
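
As a minimal sketch of the ingestion-plus-validation step described above (assuming a Kafka topic named traffic-events and the kafka-python client; the event fields and the Bangalore bounding box are illustrative), events are pulled from the stream, sanity-checked, and only then forwarded towards the data lake:

# Illustrative sketch, not a production pipeline: consume traffic-sensor
# events from Kafka, validate that they really originate from Bangalore,
# and forward only clean records for ingestion into the data lake.

import json
from kafka import KafkaConsumer   # pip install kafka-python

# Rough bounding box for Bangalore (illustrative values).
LAT_RANGE = (12.80, 13.20)
LON_RANGE = (77.40, 77.80)

def is_valid(event):
    try:
        lat, lon = float(event["lat"]), float(event["lon"])
    except (KeyError, TypeError, ValueError):
        return False                       # malformed record: reject
    return LAT_RANGE[0] <= lat <= LAT_RANGE[1] and LON_RANGE[0] <= lon <= LON_RANGE[1]

consumer = KafkaConsumer(
    "traffic-events",                      # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    event = msg.value
    if is_valid(event):
        # In a real pipeline this would land in HDFS/S3 or a staging table.
        print("ingest:", event.get("sensor_id"), event["lat"], event["lon"])
    else:
        print("rejected:", event)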

System Scalability

The size of the data streaming from devices is not constant. As more data collection devices are added, or when the devices are upgraded, the size of the incoming data stream increases. For example, in the case of IP cameras, the data stream size will increase when the number of cameras increases, when the cameras are upgraded to capture higher-resolution images, or when more data is collected - in the form of infrared images, voice, etc.

Streaming data platforms need to be able to match the projected growth in data sizes. This means that streaming data platforms will need to be able to stream data from a larger number of sources and/or handle bigger data sizes. As the data size changes, all the connected infrastructure must be capable of scaling up (or scaling out) to meet the new demands.

System Diversity

The system must be designed to handle a diverse set of data sources. Streaming data platforms will need to support not just "new era" data sources from mobile devices, cloud sources, or the Internet of Things; they will also be required to support "legacy" platforms such as relational databases, data warehouses, and operational applications like ERP, CRM and SCM. These are the platforms with the information needed to place streaming-device, mobile-app and browser-click data into context and provide value-added insights.

System Durability 

Once the data is captured in the system and registered in the data lake, the value of the historical data depends on the value of historical analysis. Many of the data sets at the source could have changed, so data needs to be constantly refreshed (or purged) for meaningful analysis. This data refresh must be policy/rule based.

Centralized Data Management 

One of the core tenets of a streaming data platform is to make the entire streaming system easy to monitor and manage. This makes the system easier to maintain and sustain. Using a centralized architecture, streaming data platforms can not only reduce the number of potential connections between streaming data sources and streaming data destinations, but also provide a centralized repository of technical and business metadata to enable common data formats and transformations.

A data streaming system can easily contain hundreds of nodes. This makes it important to use tools that monitor, control, and process the data lake. Currently, there are two main options for this: Cloudera Manager and Hortonworks' open source tool Ambari. With such tools, you can easily decrease or increase the size of your data lake while finding and addressing issues such as bad hard drives, node failure, and stopped services. Such tools also make it easy to add new services, ensure compatible tool versions, and upgrade the data lake.

For a complete list of streaming data tools see: https://hadoopecosystemtable.github.io/ 

Closing Thoughts 

Designing a streaming data platform for data analysis is not easy. But with these five core attributes as the foundation of the design, one can ensure the streaming data platform is built on a robust footing and that a complete solution can be built on it to meet the needs of the data-driven organization.

Understanding the Business Benefits of Colocation

Digital transformation and the move towards private cloud are really shaking up the design and implementation of data centers. As companies start their journey to the cloud, they realize that having sets of dedicated servers for each application will not help them, and that they need to change their data centers.

Historically, companies started out with a small server room to host the few servers that ran their business applications. The server room was located in their office space and it was a small set-up. As business became more compute-centric, the small server room became unviable. This led to data centers - which were often still located in their offices.

Office buildings were not designed for hosting data centers and had to be modified to get more air-conditioning, networking and power into the data center. The limitations of existing buildings constrained efficient cooling and power management.

But now, as companies plan their move to private cloud, they are seeing huge benefits in having a dedicated data center - a purpose-built facility for hosting large numbers of computers, switches, storage and power systems. These purpose-built data centers have better power supply solutions and better, more efficient liquid cooling, and more importantly offer a wide range of network connectivity and network services.

As a result these dedicated data centers can save money on IT operations and also provide greater reliability & resilience.

But not all companies need a large data center that can benefit from economies of scale. Very few large enterprises really have a need for such large, dedicated, purpose-built data centers. So there is a new solution - data center colocation.

For CIOs, colocation provides the perfect win-win scenario, providing cost savings and delivering state-of-the-art infrastructure. When comparing the capabilities of a standard server room to a colocated data center solution, oftentimes the savings on power bills alone are enough to justify the project.

These dedicated data centers are built at large scale in industrial zones with dedicated power lines and backup power systems, so the power cost is much lower than before. Moreover, these dedicated data centers can employ newer, more efficient cooling systems that reduce the overall power consumption of the data center.

Business Benefits of Colocation

Apart from reductions in operational expenditure, there are several other benefits from colocation. Having a dedicated team where people are available 24/7/365 to monitor and manage the IT infrastructure is a huge benefit.


  1. Cost Savings on Power & Taxes

    Dedicated data centers are built in locations that offer cheap power. Companies can also negotiate tax breaks for building in remote or industrial areas. In addition to a lower price of power, the data centers are designed to include diverse power feeds and efficient distribution paths. These data centers have dual generator systems that can be refueled while in operation, on-site fuel reserves, and multiple UPS systems in place.

    In addition to power costs, dedicated data centers have engineers and technicians who monitor power and battery levels 24/7 so that the center maintains 100% uptime.

    Additionally, data centers have the time, resources and impetus to continually invest in and research green technologies. This means that businesses can reduce their carbon footprint at their office locations and benefit from continual efficiency saving research. Companies that move their servers from in-house server rooms typically save 90 percent on their own carbon emissions.

  2. Network Connected Globally, Securely and Quickly

    Today, high-speed network connectivity is the key to business. And it is a lot more difficult to get big, fat pipes of network connectivity into central office locations than into a centralized data center. A dedicated data center will have many network service providers offering connectivity, often at a lower price than at an office location.

    Dedicated data centers also provide resilient connectivity at a fairly low price – delivering 100 Mbps of bandwidth might be hard at an office location, and trying to create a redundant solution is often financially unviable. Data centers are connected to multiple transit providers and have large bandwidth pipes, meaning that businesses often benefit from a better service at a lower cost.

    Colocation enables organizations to benefit from faster networking and cheaper network connections.

  3. Monitoring IT Infrastructure

    Building a dedicated data center makes it easier to monitor the health of the IT infrastructure. The economies of scale that come with colocation help build a robust IT infrastructure monitoring solution that can monitor the entire IT estate and ensure SLAs are being met.

  4. Better Security

    A dedicated data center and colocation will have better physical security than a data center in an office location. The physical isolation of the data center enables the service provider to provide better security measures, including biometric scanners, closed-circuit cameras, on-site security, coded access, alarm systems, ISO 27001 accredited processes, onsite security teams and more. With colocation, all these service costs are shared - bringing down costs while improving the level of security.

  5. Scalability

    Platform 3 paradigms such as digital transformation, IoT, Big Data, etc. are driving up the scale of IT infrastructure. As the demand for computing shoots up over time, the data center must be able to cope with it. With colocation, scale-up requirements can be negotiated ahead of time, and with just one call to the colocation provider the scale of the IT infrastructure can be increased as needed.

    Data centers and colocation providers have the ability to have businesses up and running within hours, as well as provide the flexibility to grow alongside your organization. Colocation space, power, bandwidth and connection speeds can all be increased when required.

    The complexity of rack space management, power  management etc is outsourced to the colocation service provider.

  6. Environment Friendly & Green IT

    A large-scale data center has more incentive to run greener IT operations, as doing so results in lower energy costs. Oftentimes these data centers are located in industrial areas where better cooling technologies can be safely deployed - which makes it possible to improve overall operational efficiency. Typically, a colocated data center adheres to global green standards such as ASHRAE 90.1-2013, ASHRAE 62.1, LEED certifications, etc.

    Bigger data centers also enable better IT e-waste management and recycling. Old computers, UPS batteries and other equipment can be safely and securely disposed of.

  7. Additional Services from Colocation Partners

    Colocation service providers may also host other cloud services such as:

    a. Elastic scalability to ramp up IT resources when there is seasonal demand and scale down when demand falls.

    b. Data backups and data archiving to tapes, storing them in a secure location.

    c. Disaster recovery across multiple data centers and data redundancy to protect against data loss in case of natural disasters.

    d. Network security and monitoring against malicious network attacks - usually in the form of a "Critical Incident Center." Critical Incident Centers are like a Network Operations Center, but monitor data security and active network security.

Closing Thoughts

In conclusion, a dedicated data center offers tremendous cost advantages. But if a company's IT scale does not warrant a dedicated data center, then the best option is to move to a colocated data center. Colocation providers are able to meet business requirements at a lower cost than if the service were kept in-house.

A colocation solution provides companies with a variety of opportunities: exceptional SLAs, data secured off-site, added levels of risk management, and the chance to invest in better equipment and state-of-the-art servers.

Why Workload Automation is critical in Modern Data Centers


Today, all CIOs demand that their data centers be able to manage dynamic workloads and scale up or down dynamically based on those workloads. The demands of digital transformation across various business processes imply that the entire IT services stack must be built for "Everything-as-a-Service" - and this needs to be an intelligent, self-learning and self-healing IT system.

The always ON paradigm of the digital era brings in its own set of IT challenges. Today, IT systems are expected to:

  • Manage all incoming data. Data from a wide variety of sources and formats must be imported, normalized, sorted and managed for business applications. The volume of incoming data can vary greatly - but the systems must be able to handle it. IT systems must accommodate a wide variety of large-scale data sources and formats, including Big Data technologies and integration with legacy in-house and third-party applications.
  • Enable business apps to sequence data, run data analysis and generate reports on demand. On-demand workloads make it difficult to predict future loads on IT systems - but the systems must be able to handle them.
  • Ensure all business apps adhere to the published SLA.


These expectations place tremendous pressure on IT to automate the management of all IT resources. As a result, CIOs want:

  1. Workloads to be spread effectively across n-tier architectures, heterogeneous compute resources and global data center networks.
  2. Systems that predictively detect IT infrastructure failures and automatically remediate failed or disrupted processes and workflows in near real time.
  3. Systems that intelligently predict future workloads and automatically apply policies about when and where data can reside and how processes can be executed.


Traditional IT workload management solutions relied on time-based scheduling to move data and integrate workloads. This is no longer sustainable, as it takes too much time and delays responses to modern business needs.

As a result, we need an intelligent workload automation system which can not only automate workload management, but also make intelligent, policy-based decisions on how to manage business workloads.
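
To make the idea concrete, here is a minimal sketch of a policy-based decision (the job names, thresholds and policies are my own assumptions, not any product's policy engine): instead of running on a fixed schedule, the system looks at the predicted capacity and the SLA and decides where and when to run a job.

# Hypothetical sketch of policy-based workload placement: given predicted
# free capacity and an SLA deadline, decide whether to run a job now
# on-premises, burst it to a cloud pool, or defer it. Real workload
# automation products layer ML forecasts and far richer policies on this.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpu_cores_needed: int
    sla_minutes: int          # must finish within this many minutes

def place_job(job, predicted_onprem_free_cores, queue_wait_minutes):
    # Policy 1: tight SLA and no local capacity -> burst to cloud.
    if job.cpu_cores_needed > predicted_onprem_free_cores and job.sla_minutes <= 60:
        return "run now on cloud burst pool"
    # Policy 2: local capacity available -> run on-premises.
    if job.cpu_cores_needed <= predicted_onprem_free_cores:
        return "run now on-premises"
    # Policy 3: SLA is loose enough to wait for local capacity.
    if queue_wait_minutes < job.sla_minutes:
        return "defer until on-premises capacity frees up"
    return "run now on cloud burst pool"

# Usage example
print(place_job(Job("nightly-risk-report", 64, 240),
                predicted_onprem_free_cores=16,
                queue_wait_minutes=90))   # -> defer until on-premises capacity frees up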

Today, the IT industry has responded by developing a plethora of automation tools such as Puppet, Chef and Ansible. Initially these were designed to simplify and automate IT operations - mainly to support rapid application upgrades and deployments driven by the adoption of DevOps strategies.

However, these tools have to be integrated with deep learning or machine learning systems to:

  1. Respond dynamically to unpredictable, on-demand changes in human-to-machine and machine-to-machine interactions.
  2. Anticipate, predict, and accommodate support for fluctuating workload requirements while maintaining business service-level agreements (SLAs)
  3. Reduce overall cost of operations by enabling IT generalists to take full advantage of sophisticated workload management capabilities via easy-to-use self-service interfaces, templates, and design tools.


Over the next few years, we will see a large number of automation tools that can collectively address the needs of legacy IT systems (such as ERM, databases, business collaboration tools - email, file share, unified communications - eCommerce and web services) and 3rd-platform business apps - mainly centered around IoT, Big Data, media streaming and mobile platforms.

The current digital transformation of the global economy is driving all business operations and services to become more digitized, mobile and interactive. This leads to increasing complexity in everyday transactions - which translates into complex workflows and application architectures. For example, a single online taxi-booking transaction will involve multiple queries to GIS systems, plus transactions and data exchanges across several legacy and modern systems.

To succeed in this demanding environment, one needs an intelligent, scalable and flexible IT workload automation solution.
  

Tuesday, July 19, 2016

Why Project Managers need People Skills


Recently, I was having a discussion on why project managers need to excel in people skills. On the surface, most project managers do not have a people-management function and tend to discount people skills.

Project managers are often individual contributors who have no people-management responsibilities. Moreover, project managers are often overloaded with overseeing several projects, which places a heavy load on their time. I have seen a few project managers handling 6-8 projects simultaneously.

As a result, project managers tend to discount people skills and rely on technical skills alone. This may work in certain conditions - but in the long run, to be a really successful project manager one needs good people skills.

Benefits of having good people skills:

  1. We live in a global world where projects are executed by globally dispersed teams. In such an environment, it is rare - a luxury - to meet people face-to-face and build rapport with the various stakeholders. Many project managers may never see some of their stakeholders and customers face-to-face.
  2. In today's hyper-competitive world, there is constant pressure to complete projects faster than before. Reduction of project cycle times has become crucial for success.
  3. Projects and programs are becoming more complex. In the software world, new projects are being launched on unproven technologies, so it becomes very difficult for a project manager to identify project risks. The only way to correctly identify project risks is to connect with the people working on the project and get first-hand information about all the risks.
  4. Engineers often multitask. Today it is common for engineers to be working on more than one project. So unless project managers build a good personal rapport with the team, the project can suffer unexpected delays and defects.
  5. Project managers work in a matrix organization, often reporting to several stakeholders. Having good people skills makes it easier to interact with the various stakeholders and get more positive outcomes.
  6. Many organizations do not have clearly demarcated management roles. Engineering execution managers often step into the project manager's space and vice versa. In such cases, good people skills help smooth interactions and reduce friction.

Wednesday, July 13, 2016

IoT needs Artificial Intelligence



IoT - the Internet of Things - has been the biggest buzzword of 2016. Yet it is hard to derive value from IoT devices, and IoT has not yet hit the mainstream; it continues to sit at the periphery, with a lot of hype surrounding it.

Personally, I have tested several of these IoT devices - wearables such as Fitbit, Google Glass and smart helmets - but failed to derive value from them. The main reason: I had to do extra work to make sense of these devices. In short, I had to work to make IoT work for me!

There are two main problems plaguing IoT.

1. Lack of inbuilt intelligence to derive value out of IoT
2. Power supply for IoT

In this article, I will concentrate on the first problem plaguing IoT.

Essentially, IoT produces raw data - and lots of it. The type of data depends on the type of device: sensor data in cars, heartbeat information from pacemakers, etc. Collectively, all this sensor data usually falls under the Big Data category. The sheer volume of data being created will grow to a mind-boggling level. This data holds extremely valuable insight into what's working well and what's not - pointing out conflicts that arise and providing high-value insight into new business risks and opportunities as correlations and associations are made.

The problem is that it takes a huge amount of work to find ways to analyze this deluge of raw data and build valuable information out of it.

For example, with a health wearable I can get to know my heartbeat pattern, heart rate, how many steps I walked, how many calories I burnt, how much rest and sleep I got, and so on. But this information is useless to me unless I can analyze it and develop a plan to change my activities. The wearable does not tell me what I need to do to reach my health goals, nor can it handle any anomalies.
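
As a small illustration of the kind of built-in intelligence I mean (a toy sketch with made-up thresholds, not any wearable vendor's algorithm), even a simple rolling-baseline check could turn raw heart-rate readings into an actionable alert instead of leaving all the analysis to the user:

# Toy sketch: flag heart-rate readings that deviate sharply from the user's
# own rolling baseline. Real devices would use trained models and clinical
# thresholds; this only shows the raw-data -> actionable-signal step.

from collections import deque

class HeartRateMonitor:
    def __init__(self, window=60, factor=1.4):
        self.readings = deque(maxlen=window)   # last `window` readings
        self.factor = factor                   # how far above baseline is "anomalous"

    def add(self, bpm):
        alert = None
        if len(self.readings) >= 10:
            baseline = sum(self.readings) / len(self.readings)
            if bpm > baseline * self.factor:
                alert = f"heart rate {bpm} bpm is well above your baseline of {baseline:.0f} bpm"
        self.readings.append(bpm)
        return alert

# Usage example
monitor = HeartRateMonitor()
for bpm in [72, 70, 75, 74, 71, 73, 76, 72, 74, 75, 73, 118]:
    alert = monitor.add(bpm)
    if alert:
        print("ALERT:", alert)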

For corporates using Big Data analytics tools (which are expensive), really valuable insights are possible. But it takes a large team of experts to develop a big data analytics platform - which then produces the valuable insight. Organizational leaders must then understand the insights and ACT on them. All this means - LOTS OF WORK!

The only way to keep up with this IoT-generated data and make sense of it is with machine learning or Artificial Intelligence.

As the rapid expansion of devices and sensors connected to the IoT continues, it will produce a huge volume of data: Big Data that can be used to develop self-driving cars, save fuel in airplanes, improve public health, and more. This treasure trove of big data is valuable only when machine intelligence is built on top of it - intelligence that can take autonomous decisions.

The idea of AI sounds great, and its benefits are limitless - it will eliminate the need for humans to intervene in mundane daily tasks. However, the big problem will be improving the speed and accuracy of AI.

An IoT-based AI system must be able to regulate its actions without errors. If IoT and AI do not live up to their promise, the consequences should not be disastrous: a minor glitch, like home appliances that don't work together as advertised, is tolerable. But a life-threatening malfunction like the Tesla car crash would keep people from embracing IoT and AI.

AI is already in use in many ways, for example Netflix, Amazon, Pandora use it to recommend products/movies/songs that you may like.

Today, there are lots of opportunities in the IoT-AI solution space. For example, an intelligent insulin pump that can regulate insulin levels based on the person's activity and sugar levels; similarly, driverless vehicles - trains, planes, cars, etc. - so that better decisions can be made without human intervention.

Facebook Disrupts Telecom with OpenCellular


Facebook today announced that it will launch OpenCellular, a mobile infrastructure platform designed to lower barriers to entry for would-be providers of Internet service to the developing world.

After facing a sharp rebuke to its Internet Basics program, Facebook has taken on a more ambitious project to enable free Internet in areas that are currently underserved.

OpenCellular is a customizable base chassis for a wireless access point which will connect devices using 2G, LTE or even Wi-Fi. The base chassis is designed to be modular to keep down costs and to make it easier to deploy at a high point in an open area - like a tall tower or a tree.

OpenCellular is a wireless access point which will allow users to connect to the Internet via cell phone devices. Keeping in mind that this is targeted at areas that are currently underserved, I would guess that most users will connect with cheap Android phones.

According to a blog post by Facebook engineer Kashif Ali: "We designed an innovative mounting solution that can handle high winds, extreme temperatures, and rugged climates in all types of communities around the world. The device can be deployed by a single person and at a range of heights — from a pole only a few feet off the ground to a tall tower or tree."

Facebook said that the emphasis in the design process was on keeping the design as modular and inexpensive as possible, as well as making it easy to deploy. With rural or remote installation locations in mind, the chassis can be powered from multiple sources: PoE, solar panels, external batteries, etc.

Given Facebook's penchant for open source and its contributions to the Open Compute Project, the hardware design and the software running on OpenCellular will eventually be made open source.

At this point, it is not clear how these OpenCellular base stations will connect to the Internet. It could be via UAV drones or a direct satellite uplink.

Free wireless Internet access could be a game changer in rural areas of India and other parts of Asia, where the existing telcos make tonnes of money by charging for Internet access and basic telephone calls. With free Internet, users can use Facebook's WhatsApp calling feature and use Facebook for local mass communication. This will effectively disrupt a whole lot of cell phone service providers' business models all over the world, while Facebook benefits indirectly by gaining more customers and users - which leads to greater advertising revenues.

2017 - The year of Artificial Intelligence

Every leader in the technology sector knows that success is very transient in nature. Even global giants - Kodak, Nokia, BlackBerry, Motorola, IBM - have become victims of rapid technological change.

Rapid innovation brings rapid obsolescence. Technology is ever-changing, and companies need to invest continuously in R&D to develop new products and technologies - just to stay relevant in this ever-changing world.

We are already seeing the early fruits of artificial intelligence. The Google car - a self-driving car - is clearly the flag bearer, and today several startups are working hard to make self-driving cars the standard. This new technology will make the current auto giants - GM, VW, Toyota, Honda, Tata - obsolete. Taxi-hailing companies such as Uber will be the first to capitalize on this new technology. Together, self-driving cars and on-demand taxi services will change the way people look at owning an automobile.

While this is good for technology companies, this shift in the marketplace will deal a death blow to the auto industry, and millions of workers will lose their jobs.

Another sector where AI will bring a huge change is the world of IT services.

Today, several million people are employed in the IT/BPO services sector doing basic first-line or second-line product support. Low-cost communication technology moved these low-skill jobs from the US and Europe to India. But now, the lower cost of AI will eliminate many of these jobs altogether.

Apple's Siri and Facebook's chat bot are just the beginning; soon all e-commerce companies and product support organizations will move their first-line support functions to computers with cognitive intelligence.

To win this AI arms race, companies such as Microsoft, Google, Facebook, Tesla and Amazon are investing billions of dollars to develop new AI technologies. Companies that master new AI technologies will be able to lower their operational costs, execute better, and thus compete better and win new business. For example, AI-enabled drones can speed up delivery for Amazon, reducing distribution costs; customer support bots will answer customer questions; and so on.

Just like the Internet drove business from the 1990s to 2010, artificial intelligence will be the driver of the new smart world of the 2020s.

Wednesday, January 20, 2016

Should Cisco Buy NetApp?


Right now NetApp is worth only $6.9 Billion! With a recession likely in 2016, the market valuation of NetApp will only go lower, making it a good target for acquisition.

As a standalone data storage vendor, NetApp is at a crossroads. Revenue from its traditional data storage arrays is falling at double-digit rates. Its main competitor, EMC, is going private via acquisition by Dell. Hard-disk-based data storage devices - NetApp's mainstay - are in their twilight years.

In short, the future does not look bright for NetApp as an independent, publicly traded company. Any investor could buy out NetApp and milk the cash from the business. At this valuation, NetApp is primed for a takeover.

Across Silicon Valley is Cisco. 2015 was a very good year for Cisco. Sales of its UCS blade servers have propelled Cisco into becoming the 4th-largest server vendor. Though Cisco, along with EMC, created VCE - a market leader in converged infrastructure - Cisco faces severe headwinds in its core business. Adding to the challenge, Cisco has exited the VCE venture and EMC is being bought by Dell. The FlexPod product line is also seeing strong headwinds from new types of hyperconverged infrastructure - Nutanix, SimpliVity and others.

At this juncture, it makes business sense for Cisco to buy a storage vendor: it would add teeth to its UCS server business and let Cisco compete aggressively in converged infrastructure markets. NetApp's acquisition of SolidFire - an all-flash storage vendor - could also help Cisco position UCS better in the market.

Cisco is also at a crossroads

In 2016, Cisco will have to decide if it wants to remain an equipment provider, or become a cloud service provider and remain at the center of the IT world.

For a long time, Cisco has served as a bellwether for the IT industry. Cisco provided the key networking devices which enabled the Internet and the eCommerce world.

Now, with cloud service providers such as AWS, Microsoft Azure and Google, the need to build enterprise-scale data centers is becoming less important. Many companies are embracing cloud service providers for all their IT infrastructure and are not building data centers. To Cisco's discomfort, the leading cloud service providers are not buying Cisco gear!

The cloud service providers are fundamentally challenging Cisco's business model.

So the question is: Should Cisco buy NetApp?

Cisco can buy a storage vendor and integrate its UCS servers with data storage and its ACI-enabled Nexus range of switches. This would enable Cisco to offer an end-to-end solution for data centers.

In my opinion (this is my opinion and in no way reflects EMC's), Cisco must buy a storage vendor - but not a standalone pure-storage vendor like NetApp. Buying NetApp would be a retrograde move in terms of technology, as it does not give any technological benefit to Cisco. Just integrating UCS servers with NetApp storage systems does not add value for customers.

Instead, Cisco should look ahead and buy companies that can truly add technological advantage to its servers. Cisco did buy Whiptail - a server flash storage vendor, but the integration did not go well.

Ideally, Cisco should buy out new technology companies which can provide:

1. Ultra-fast server flash storage, which will make its servers blazing fast.
2. An OpenStack distribution for server virtualization and server-attached commodity storage.
3. A Hadoop & Big Data tools company whose stack can run on UCS + flash storage + OpenStack.
4. A cloud service provider.

NetApp does not bring any new technology to Cisco's armory. Cisco currently gets the same technology from NetApp via partnerships.

So, Cisco should look at acquiring new and emerging technology companies that can help redefine UCS servers as hyperconverged infrastructure - something like ScaleIO or Nutanix, but that can do more than just storage.

Historically, Cisco has been very good at buying technology companies which are developing new, promising technologies and nurturing them into major sellers. For example, UCS, Nexus, ACI and Meraki were all in the startup phase when Cisco acquired them.

Cisco needs to choose if it wants to remain as a major vendor of enterprise IT. And if Cisco wants to remain as a major vendor of enterprise IT, it must become a cloud service provider.

Finally, My Answer

Buying NetApp will be a good financial decision. NetApp can generate far more cash than its current net worth. So from a pure financial transaction point of view, buying NetApp will add financial value to Cisco's shareholders.

From a technology point of view, buying NetApp is like buying a 2000-model car in 2016! NetApp does not give Cisco any technological advantage and will not help Cisco transform its equipment business. Acquiring an older technology and spending time and energy integrating it with its business will not help Cisco build new-generation technology.

In short, Cisco can buy NetApp for the financial value - but not for the technology. Ideally, if Cisco buys NetApp, it should continue to run NetApp as a separate business entity and not merge it with its core server or network business.