
Thursday, August 16, 2018

Successful IoT Deployment Requires Continuous Monitoring


Growth of the IoT has created new challenges for business. The massive volume of IoT devices, and the deluge of data they create, becomes a challenge, particularly when IoT is a key part of business operations. These challenges can be mitigated with real-time monitoring tools tied into ITIL workflows for rapid diagnostics and remediation.
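
To make this concrete, here is a minimal sketch in Python of a heartbeat check that flags silent devices and hands them off to an ITIL-style ticketing step. The device registry, the timeout value and the open_incident hook are illustrative assumptions, not any specific monitoring product's API.

```python
import time

# Illustrative in-memory registry: device_id -> last heartbeat (epoch seconds).
# In a real deployment this would come from the monitoring platform's API.
last_heartbeat = {
    "sensor-001": time.time() - 30,
    "sensor-002": time.time() - 900,  # silent for 15 minutes
}

HEARTBEAT_TIMEOUT_SECONDS = 300  # flag devices silent for more than 5 minutes


def open_incident(device_id, silent_for):
    """Placeholder for an ITIL workflow integration (e.g., a ticketing API call)."""
    print(f"INCIDENT: {device_id} silent for {silent_for:.0f}s - ticket raised")


def check_devices():
    now = time.time()
    for device_id, seen in last_heartbeat.items():
        silent_for = now - seen
        if silent_for > HEARTBEAT_TIMEOUT_SECONDS:
            open_incident(device_id, silent_for)


check_devices()  # prints an incident for sensor-002 only
```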

Failure to monitor IoT devices leads to a failed IoT deployment.

Thursday, July 19, 2018

5 Pillars of Data Management for Data Analytics

Basic Data Management Principles for Data Analytics

Data is the lifeblood of Big Data analytics and of all the AI/ML solutions built on top of it.

Here are 5 basic data management principles that must never be broken.



1. Secure Data at Rest

  • Most data resides in storage systems, which must be secured.
  • All data at rest must be encrypted (see the sketch after this list).
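
A minimal sketch of encrypting data at rest, using the Python cryptography package's Fernet recipe (assuming that package is available). In production the key would be held in a KMS/HSM rather than generated next to the data; the payload here is illustrative.

```python
from cryptography.fernet import Fernet

# In production the key lives in a KMS/HSM, never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"customer_id,balance\n42,1000.00\n"
ciphertext = fernet.encrypt(plaintext)  # authenticated symmetric encryption
assert fernet.decrypt(ciphertext) == plaintext  # round-trips only with the key
```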


2. Fast & Secure Data Access 

  • Fast access to data from databases and storage systems. This implies using fast storage servers and FC SAN networks.
  • Strong access control and authentication are essential (a minimal sketch follows this list).
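
As an illustration of the access-control point, here is a minimal sketch of a role-based check performed before any data is served. The roles and permissions are hypothetical; in practice this maps onto your IAM system (LDAP, OAuth scopes, database grants and so on).

```python
# Hypothetical role-to-permission mapping; a real system would back this
# with an IAM platform (LDAP, OAuth scopes, database grants, etc.).
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "data_engineer": {"read", "write"},
}


def authorize(role, action):
    """Return True only if the role explicitly grants the action."""
    return action in ROLE_PERMISSIONS.get(role, set())


assert authorize("analyst", "read")
assert not authorize("analyst", "write")  # deny by default
```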


3. Manage Networks for Data in Transit

  • This involves building fast networks: 40Gb Ethernet for compute clusters and 100Gb FC SAN networks.
  • Fast SD-WAN technologies ensure that globally distributed data can be used for data analytics.


4. Secure IoT Data Stream

  • IoT endpoints are often in remote locations and have to be secured.
  • Corrupt data from IoT will break analytics.
  • An intelligent edge helps preprocess IoT data for data quality and security (see the sketch after this list).
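
Here is a minimal sketch of that edge-preprocessing idea: validating and range-checking sensor readings before forwarding them to the analytics pipeline. The sensor type and valid range are illustrative assumptions.

```python
VALID_RANGE = (-40.0, 85.0)  # illustrative operating range for a temperature sensor


def preprocess(readings):
    """Drop corrupt or out-of-range readings at the edge before forwarding."""
    clean = []
    for value in readings:
        if not isinstance(value, (int, float)):
            continue  # corrupt or malformed payload
        if VALID_RANGE[0] <= value <= VALID_RANGE[1]:
            clean.append(value)
    return clean


print(preprocess([21.5, "ERR", 999.0, 22.1]))  # -> [21.5, 22.1]
```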


5. Rock-Solid Data Backup and Recovery

  • Accidents & Disasters do happen. Protect from data loss & data unavailability with a rock solid data backup solutions.
  • Robust disaster recovery solutions can give zero RTO/RPO.


Monday, July 02, 2018

Benefits of Aadhaar Virtual ID




Use Aadhaar Virtual ID to Secure your Aadhaar Details

Considering the privacy of personal data, including the demographic and biometric information on the Aadhaar card, UIDAI has recently introduced a unique feature termed the Aadhaar Virtual ID.

The Aadhaar Virtual ID offers limited KYC access, providing only as much information as is required for verification rather than the complete details of an individual's Aadhaar card.

What is an Aadhaar Virtual ID?

The Aadhaar Virtual ID is a 16-digit random number mapped to an individual's Aadhaar card at the back end. An Aadhaar card holder using a Virtual ID need not submit his Aadhaar number every time for verification; instead, he can generate a Virtual ID and use it for verification purposes such as mobile connections, bank accounts and other financial documents.

The Aadhaar Virtual ID gives access to the biometric authentication of an Aadhaar card holder along with basic details like name, address and photograph, which are sufficient for e-KYC. Unlike in the past, the agency does not learn the 12-digit Aadhaar number or other personal details.
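
A purely illustrative Python model of the mapping idea follows: a random 16-digit Virtual ID is generated and resolved to the Aadhaar number only at the back end. This is a sketch of the concept, not UIDAI's actual scheme or API.

```python
import secrets

# Hypothetical back-end mapping: Virtual ID -> Aadhaar number.
# Illustrative only; this is not UIDAI's real implementation.
vid_to_aadhaar = {}


def generate_virtual_id(aadhaar_number):
    """Mint a random 16-digit Virtual ID that stands in for the Aadhaar number."""
    vid = "".join(secrets.choice("0123456789") for _ in range(16))
    vid_to_aadhaar[vid] = aadhaar_number  # agencies only ever see the VID
    return vid


vid = generate_virtual_id("123412341234")  # dummy 12-digit number
print(vid)                                 # 16-digit temporary ID for agencies
print(vid_to_aadhaar[vid])                 # only the back end can resolve it
```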

Benefits of Aadhaar Virtual ID


  1. Complete privacy of personal data
    eKYC can now be done without sharing the Aadhaar number
    All private information (biometrics, DOB, address) remains private
     
  2. User has complete control over sharing Aadhaar ID details
    Only the Aadhaar card holder can generate a Virtual ID
    Only the Aadhaar card holder can share a Virtual ID
    An Aadhaar Virtual ID expires after a pre-set time, preventing misuse
     
  3. Automates the entire eKYC verification process in the backend
    Simplifies agencies' task of individually verifying KYC data
    A web-based verification system is fast and reliable for real-time business applications

Friday, June 22, 2018

Blockchain for Secure Healthcare Records



Individual users' health records, both user-generated data (activity monitors such as fitness bands and mobile apps, home medical devices, etc.) and co-generated data from hospitals, testing laboratories, insurance companies, etc., are sensitive, and it is becoming important to protect this data and regulate access to it.

Patient health records are sensitive data, and there are regulations that must be followed. Globally, there has been a heightened need to increase privacy and limit access to patients' health records to avoid misuse and abuse.

Here is one use case for a Blockchain-based record vault, which enables users to secure their own health records and information.

At this point in time, no major solution offers this, so today it is merely an idea, but one that can become a product/solution in the near future.

There are several benefits of such a system:


  1. Secure, HIPAA-compliant data access.
  2. The user, i.e., the patient, owns all his data and can choose with whom to share it.
  3. Doctors and healthcare professionals get authenticated data on all patient records, when needed.
  4. Simplifies hospital data record management; the hospital holds data only when the patient has approved it, and only for an agreed duration of time.
  5. Drug research organizations get authentic data, which speeds up research.
  6. Patients' insurance claims are processed faster due to secure, authentic data, resulting in faster disbursal.
  7. Retail and pharma companies can buy user data directly from the patient/user, thus helping patients monetize their own data.

Blockchain's smart contract system allows users to regulate who has access and for how long. Users can thus monetize their own health records and also prevent their misuse.
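
As a toy illustration of that smart-contract idea, here is a plain-Python model of an access grant with a grantee and an expiry, checked before any read. A real system would implement this on-chain as an actual smart contract; everything here, names included, is hypothetical.

```python
import time

# Toy model of a smart-contract access grant: who may read a record, until when.
grants = {}  # (record_id, grantee) -> expiry time in epoch seconds


def grant_access(record_id, grantee, duration_s):
    """Patient grants a party read access for a limited duration."""
    grants[(record_id, grantee)] = time.time() + duration_s


def may_read(record_id, grantee):
    """Access is allowed only while an unexpired grant exists."""
    expiry = grants.get((record_id, grantee))
    return expiry is not None and time.time() < expiry


grant_access("lab-report-7", "dr_rao", duration_s=3600)  # one-hour grant
print(may_read("lab-report-7", "dr_rao"))     # True, within the hour
print(may_read("lab-report-7", "insurer_x"))  # False, never granted
```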

Tuesday, June 12, 2018

Aadhaar - A Secure Digital Identity Platform



A secure identity platform helps businesses such as fintech, banks, healthcare and rental services verify customers' real identities. With an Aadhaar number and a fingerprint scan, Aadhaar lets businesses accurately identify a customer for trusted transactions.

Digitization has created new business opportunities like peer-to-peer lending, robo-investing, online insurance, online gaming, digital wallets, etc. Digitization speeds up the pace of business and needs an equally fast, secure identification system.

Currently, the Aadhaar platform holds over one billion identities, and it can be used to create new business opportunities and to optimize existing processes. For example, companies can use the Aadhaar ID to:

1. Optimize Conversions
A fast & accurate customer verification helps mobile companies or Fintech companies speed up conversion inquiries into paying customers.

2. Deter & Reduce Fraud
Secure identification allows fintech companies to prevent account takeovers and online fraud, and also to detect and prevent new kinds of fraud.

3. Meet Compliance Mandates
Aadhaar provides the data necessary to comply with regulations and directives.

4. Enable new business opportunities
The Aadhaar ID system enables the new 'sharing' economy, allowing owners to share or rent their assets and earn money.

Thursday, May 24, 2018

Most Common Security Threats for Cloud Services


Cloud computing continues to transform the way organizations use, store and share data, applications and workloads. It has also introduced a host of new security threats and challenges. As more data and applications move to the cloud, the security threat also increases.

With so much data residing in the public cloud, these services have become natural targets for cyber security attacks.

The main responsibility for protecting corporate data in the public cloud lies not with the service provider but with the cloud customer. Enterprise customers are now learning about the risks and spending money to secure their data and applications.


Tuesday, May 22, 2018

5 Aspects of Cloud Management


If you have to migrate an application to a public cloud, there are five aspects you need to consider before migrating.



1. Cost Management
The cost of the public cloud service must be clearly understood, and chargeback to each application must be accurate. Look out for hidden costs and demand-based costs, as these can burn a serious hole in your budget.

2. Governance & Compliance
Compliance with regulatory standards is mandatory. In addition, you may have extra compliance requirements of your own. Service providers must proactively adhere to these standards.

3. Performance & Availability
Application performance is key. The availability/uptime of the underlying infrastructure and the performance of the IT infrastructure must be monitored continuously. In addition, application performance monitoring, both through direct methods and via synthetic transactions, is critical to knowing what customers are experiencing.

4. Data & Application Security
Data security is a must. Data must be protected against theft, loss and unavailability. Applications must also be secured against unauthorized access and DDoS attacks. An active security system is a must for apps running in the cloud.

5. Automation & Orchestration
Automation for rapid application deployment via DevOps, rapid configuration changes and new application rollout is a must. Offering IT infrastructure as code enables flexibility for automation and DevOps. Orchestration of various third-party cloud services, and the ability to use multiple cloud services together, is mandatory.

Friday, May 18, 2018

Software Defined Security for Secure DevOps



The core idea of DevOps is to build and deploy more applications, and to do so a whole lot faster. However, several security-related challenges need to be addressed before a new application is deployed.

Software Defined Security addresses the challenge of making applications more secure while keeping pace with business requirements in a DevOps deployment.

The fundamental concept of software defined security is to codify all security parameters and requirements into modules which can be snapped onto any application. For example, micro-segmentation, data security, encryption policies, activity monitoring, DMZ security posture, etc. are all coded into distinct modules and offered through a service catalog.
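
A minimal sketch of that catalog idea in Python: pre-reviewed security requirements are codified as named modules and snapped onto an application at deployment time. The module names and policy fields are illustrative assumptions, not any vendor's schema.

```python
# Illustrative service catalog: security requirements codified as reusable modules.
SECURITY_CATALOG = {
    "micro_segmentation": {"allow_inbound_from": ["app-tier"], "default": "deny"},
    "encryption_at_rest": {"algorithm": "AES-256", "key_source": "kms"},
    "activity_monitoring": {"log_level": "audit", "forward_to": "siem"},
}


def deploy(app_name, module_names):
    """Snap the selected, pre-reviewed security modules onto an application."""
    policy = {name: SECURITY_CATALOG[name] for name in module_names}
    return {"app": app_name, "security_policy": policy}


print(deploy("payments-api", ["micro_segmentation", "encryption_at_rest"]))
```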

A small team of security experts can develop this code, review and validate it, and make these security modules generally available to all application developers.

Application developers can select the required security modules at deployment time. This gives a tremendous time-to-deployment advantage, as it automates several security checks and audits that are otherwise done manually before deployment.

Security code review and security testing are done once, at the security module level, so the individual security review of each application can be automated. This saves a tremendous amount of time during application testing, leading to faster deployment.

Software security is ever changing, so when a standard or a security posture has to be modified, only the security modules are changed, and applications can pick up the new modules, thus automating security updates across a whole lot of individual applications. This leads to tremendous effort savings in the operations management of deployed apps.


Tuesday, May 08, 2018

Build Modern Data Center for Digital Banking



Building a digital bank needs a modern data center. The dynamic nature of fintech and digital banking calls for a new data center which is highly dynamic, scalable, agile and highly available, and which offers all compute, network, storage and security services as programmable objects with unified management.

A modern data center enables banks to respond quickly to the dynamic needs of the business.
Rapid IT responsiveness is architected into the design of a modern infrastructure that abstracts traditional infrastructure silos into a cohesive virtualized, software defined environment that supports both legacy and cloud-native applications and seamlessly extends across private and public clouds.

A modern data center can deliver infrastructure as code to application developers for even faster provisioning of both test and production deployments via rapid DevOps.

Modern IT infrastructure is built to deliver automation, to rapidly configure, provision, deploy, test, update and decommission infrastructure and applications (legacy, cloud-native and microservices).

Modern IT infrastructure is built with security as a solid foundation to help protect data, applications, and infrastructure in ways that meet all compliance requirements, and also offer flexibility to rapidly respond to new security threats.

Monday, May 07, 2018

Product Management - Managing SaaS Offerings



If you are a product manager of a SaaS product, there is an additional thing you need to do to ensure a successful customer experience: manage the cloud deployment.

Here are guidelines for choosing the best data center or cloud-based platform for a SaaS offering.

1. Run the latest software. 

Whether in your data center or in an IaaS cloud, run the latest versions of all supporting software: OS, hypervisors, security tools, core libraries, etc. Having the latest software stack will help build the most secure ecosystem for your SaaS offerings.

2. Run on the latest hardware. 

Assuming you're running in your own data center, run the SaaS application on the latest servers, such as HPE ProLiant Gen10 servers, to take advantage of the latest Intel Xeon processors. As of mid-2018, use servers running the Xeon E5 v3 or later, or E7 v4 or later. If you use anything older than that, you're not getting the most out of the applications or taking advantage of the hardware chipset.

3. Optimize your infrastructure for best performance.

Choose the VM sizing (vCPU and memory) for the best software performance. More memory almost always helps; memory is the lowest hanging of all the low-hanging fruit. You could start out with less memory and add more later with a mouse click. However, the maximum memory available to a virtual server is limited to whatever is in the physical server.

4. Build Application performance monitoring into your SaaS platform

In a cloud, application performance monitoring is vital to determining customer experience. Application performance monitoring has to be done from the customer's perspective, i.e., how customers experience the software.

This implies constant server, network and storage performance monitoring, VM monitoring, and application performance monitoring via synthetic transactions.
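
Here is a minimal sketch of a synthetic-transaction probe using the Python requests library (assumed available): it times a health-check request the way a customer would experience it. The URL and latency budget are placeholders.

```python
import time

import requests

PROBE_URL = "https://example.com/health"  # placeholder endpoint
LATENCY_BUDGET_S = 1.0                    # illustrative SLO threshold


def synthetic_check():
    """Issue one customer-like request and judge status and latency."""
    start = time.monotonic()
    try:
        response = requests.get(PROBE_URL, timeout=5)
    except requests.RequestException as exc:
        print(f"probe failed: {exc}")  # feed this into alerting
        return
    latency = time.monotonic() - start
    ok = response.status_code == 200 and latency <= LATENCY_BUDGET_S
    print(f"status={response.status_code} latency={latency:.3f}s ok={ok}")


synthetic_check()
```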

Application performance also determines the location of cloud services. If customers are on the East Coast, then servers/data centers should be on the East Coast. Identify where customers are using the software and locate the data centers closer to them, to maximize user experience.

5. Build for DR and Redundancy

SaaS operations must be available 24x7x365, so every component of the SaaS platform must be designed for high availability (multiple redundancy) and active DR. If the SaaS application is hosted on a big name-brand hosting service (AWS, Azure, Google Cloud, etc.), opt for multi-site resilience with automatic failover.

6. Cloud security

Regardless of your application, you'll need to decide if you'll use your cloud vendor's native security tools or leverage your own for deterrent, preventative, detective and corrective controls. Many, though not all, concerns about security in the cloud are overblown. At the infrastructure level, the cloud is often more secure than private data centers. And because managing security services is complex and error-prone, relying on pre-configured, tested security services available from your cloud vendor may make sense. That said, some applications and their associated data have security requirements that cannot be met exclusively in the cloud. Plus, for applications that need to remain portable between environments, it makes sense to build a portable security stack that provides consistent protection across environments.

Hybrid SaaS Offering

Not all parts of your SaaS application need to reside in one cloud. There may be cases where your SaaS app runs on one cloud but pulls data from another. This calls for interconnects between multiple cloud services from various cloud providers.

In such a hybrid environment, one needs to know how the apps communicate and how to optimize that data communication. Latency will be a critical concern, and in such cases one needs to build a cloud interconnect service into the solution.

Cloud Interconnect: Speed and security for critical apps

If the SaaS app needs to access multiple cloud locations, you might consider using a cloud interconnect service. This typically offers lower latency, and when security is a top priority, cloud interconnect services offer an additional security advantage.

Closing Thoughts

SaaS offerings have several unique requirements and need continuous improvement. Product managers need to make important decisions about how the application is hosted in the cloud environment and how customers experience it. Making the right decisions sets a SaaS offering up for success.

Finally, measure continuously. Measure real-time performance after deployment, examining all relevant factors, such as end-to-end performance, user response time and individual components. Be ready to make changes if performance drops unexpectedly or if things change. Operating system patches, updates to core applications, workload from other tenants and even malware infections can suddenly slow down server applications.

Thursday, January 18, 2018

Software Defined Security

 
In the virtual world, an organization might have thousands of virtual machines, and it cannot manage them manually. That's where Software Defined Security comes in handy, applying security policy uniformly and automatically across all environments.

With runtime virtualization, different containers/VMs can reside together on the same cloud infrastructure and have different security protections. Each application can have a different security profile. A software-based security solution helps automate secure deployments in clouds and allows protection to be customized per application.

In a hybrid cloud environment, applications can span multiple clouds and yet have uniform security settings and responses. Automate responses to security events to minimize damage, and increase vigilance with automated monitoring.

Tuesday, November 28, 2017

Big Data Warehouse Reference Architecture


Data needs to get into Hadoop in some way and needs to be securely accessed through different tools. There is a massive choice of tools for each component, depending on the Hadoop distribution selected; each distribution has its own versions of the tools, but they nonetheless provide the same functionality.

Just like core Hadoop, the tools that ingest and access data in Hadoop can scale independently, for example as a Spark cluster, Flume cluster or Kafka cluster.

Each specific tool has its own infrastructure requirements.

For example, Spark requires more memory and processor power but is less dependent on hard disk drives.

HBase doesn't require as many cores per processor but requires more servers and faster non-volatile storage such as SSD- and NVMe-based flash.

Thursday, October 26, 2017

Bitcoin is not Anonymous



One common misconception about Bitcoin is its supposed anonymity. In reality, it is not anonymous. All users of Bitcoin take up a pseudonym, their public key, and this public key does not reveal the name or identity of the user. Because of the pseudonym, users tend to think that Bitcoin is anonymous, but it is not. Let me explain below.

The pseudonym is used for all transactions, and all transactions are recorded in a shared distributed ledger: everyone can see every transaction. This implies that all transaction data is available for big data analytics.

By combining transaction data with data from a person's mobile (location data, social network data, internet access data, etc.), it is possible to triangulate from the pseudonym to the identity of a real person. With rapid advances in real-time big data analytics, it is now possible to link a Bitcoin address to a real-world identity.

Interacting with a Bitcoin business leaks identity, too. Online wallet services, exchanges and merchants usually want your real-world identity for transactions, for example credit card information or a shipping address. At a coffee shop or restaurant that accepts bitcoins, the buyer must be physically present in the store, and the store clerk learns something about the buyer, so the pseudonym becomes associated with a real-world identity. The same information can be gleaned from the digital trail: data from cell phones, CCTV grabs, etc. Information like geographic location, time of day and social network postings (Twitter, Facebook, etc.) can all be collected remotely and analyzed, and soon a pseudonym can be tied to a real-world identity.

Thursday, June 01, 2017

6 Key Tools and Techniques for Taming Big Data



Using Big Data across the enterprise doesn't require massive investments in new IT systems. Many Big Data tools can leverage existing and commodity infrastructures, and cloud-based platforms are also an option. Let's take a look at some of the most important tools and techniques in the Big Data ecosystem.

1) Data governance. 

Data governance includes the rules for managing and sharing data. Although it's not a technology per se, data governance rules are enforced by technologies such as data management platforms.
"There's a lack of standards and a lack of consistency," explains Doug Robinson, executive director of the National Association of State CIOs (NASCIO). "There's certain data quality issues: Some of the data is dirty and messy and it's non-standardized. And that increasingly has made data sharing very difficult because you have language and syntax differences, the taxonomy on how information is represented.

... All that is problematic because there's no overarching data governance model or discipline in most states. Data governance isn't very mature in state government nor local governments today, and certainly not the federal government."

Data governance is critical to gaining buy-in from participating agencies for enterprise-wide data management. Before data sharing can begin, representatives of all participating agencies must work together to:


  • Discuss what data needs to be shared
  • Determine how to standardize it for consistency
  • Develop a governance structure that aligns with organizational business & compliance needs


2) Enterprise data warehouse. 

With an enterprise data warehouse serving as a central repository, data is funneled in from existing departmental applications, systems and databases.

Individual organizations continue to retain ownership, management and maintenance of their data using their existing tools, but the enterprise data warehouse allows IT to develop a single Big Data infrastructure for all agencies and departments. The enterprise data warehouse is the starting point for integrating the data to provide a unified view of each citizen.

3) Master data management (MDM) platforms. 

With data aggregated into an enterprise data warehouse, it can be analyzed collectively. But first it has to be synthesized and integrated, regardless of format or source application, into a master data file (a minimal sketch of the matching step follows the list below). MDM is a set of advanced processes, algorithms and other tools that:

  • Inspect each departmental data source and confirm its rules and data structures
  • Identify and resolve identity problems, duplicate record issues, data quality problems and other anomalies
  • Ascertain relationships among data
  • Cleanse and standardize data 
  • Consolidate the data into a single master file that can be accessed by all participating organizations
  • Automatically apply and manage security protocols and data encryption to ensure accordance with privacy mandates
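
Here is a tiny Python sketch of one MDM step under simplified assumptions: records are standardized on a couple of matching fields and duplicates are collapsed into a single master entry. The field names and matching rules are illustrative; real MDM matching is far more sophisticated.

```python
def standardize(record):
    """Normalize the fields used for matching (one small MDM cleansing step)."""
    return {
        "name": " ".join(record["name"].lower().split()),
        "dob": record["dob"],
        "source": record["source"],
    }


def consolidate(records):
    """Collapse records that match on (name, dob) into one master file entry."""
    master = {}
    for rec in map(standardize, records):
        key = (rec["name"], rec["dob"])
        master.setdefault(key, {"sources": []})["sources"].append(rec["source"])
    return master


rows = [
    {"name": "Asha  Rao", "dob": "1980-01-02", "source": "tax_dept"},
    {"name": "asha rao", "dob": "1980-01-02", "source": "health_dept"},
]
print(consolidate(rows))  # one master entry, two contributing sources
```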


4) Advanced analytics and business intelligence.

High-performance analytics and business intelligence are the brains of the Big Data technology ecosystem, providing government centers of excellence with a comprehensive analytical tool set that leverages extensive statistical and data analysis capabilities. Through the use of complex algorithms, these platforms quickly process and deliver Big Data's insights. Functionality includes the ability to:

  • Mine data to derive accurate analysis and insights for timely decision-making
  • Create highly accurate predictive and descriptive analytical models
  • Model, forecast and simulate business processes
  • Apply advanced statistics to huge volumes of data 
  • Build models that simulate complex, real-life systems


5) Data visualization. 

Data visualization tools are easy to use — often with point-and-click wizard-based interfaces — and they produce dazzling results. With simple user interfaces and tool sets, users of advanced business intelligence and visualization tools can easily:


  • Develop queries, discover trends and insights
  • Create compelling and dynamic dashboards, charts and other data visualizations 
  • Visually explore all data, discover new patterns and publish reports to the Web and mobile devices 
  • Integrate their work into a familiar Microsoft Office environment
     
6) Specialty analytics applications. 

Multiple analytics techniques can be combined to deliver insight into specialized areas such as:
Fraud, waste and abuse. By detecting sophisticated fraud, enterprises can stop fraud before payments are made, uncover organized fraud rings and gain a consolidated view of fraud risk.

Regulatory compliance. Analytics tools can help agencies quickly identify and monitor compliance risk factors, test various scenarios and models, predict investigation results, and reduce compliance risk and costs.

HR analytics. Hiring is critical to building capabilities quickly, so it becomes important to hire employees who meet the organization's requirements and fit its corporate culture. Therein lies the challenge: how to hire someone from outside who has the relevant domain knowledge (in banking, say) and who will fit in with the existing corporate culture. This challenge can be addressed by using data analytics during the selection process.

Each BU will have several such tools and techniques that are important, but that cannot justify creating data silos. Breaking down data silos requires combining technology with analytics expertise, new organizational workflows and cultural changes that enable enterprise-wide data management.

Big Data as an Organizational Center of Excellence



Introduction - Managing Data Across Large Enterprise

Today, enterprises are looking to integrate data to support analytics-based decision making, and they still face massive challenges. The biggest challenge is that data is located in silos, and the large volumes of data being generated are still managed in silos.

Thanks to new technologies such as IoT, there is an abundance of data being generated. Enterprise information systems, networks, applications and devices churn out huge volumes of information; enterprises are awash in Big Data.

But enterprises are unable to make the best use of this data because their internal organization, the network of people and business processes, operates in isolation, and many analytics efforts take into account information from only a single silo, delivering results in a vacuum and preventing better business decisions.

The best way to lower the costs of managing big data and to leverage this data for actionable insights is to have a pan-enterprise Big Data strategy. (Also see Getting Your Big Data Strategy Right.)

The best way to achieve this is to create Big Data as an organizational center of excellence: a special group that can cut across silos, take ownership of all data and create new opportunities for operational excellence with Big Data.

Big Data as an Organizational Center of Excellence


Managing all data across the entire organization will improve efficiencies and services while lowering costs of mining this data.

Organizational BUs can then use this centralized data for actionable insights that help them make better business decisions.

Business leaders recognize that a pan-enterprise Big Data effort provides much more meaningful insights because it is based on an integrated view of the business. Yet they are faced with the challenge of siloed information systems: current IT implementations and business processes prevent data sharing and access to data.

In this article, I present a solution to address the challenges of data silos and data hoarding. As an enterprise-wide solution, I present a new CoE for Big Data, a solution for breaking down data silos, and discuss its key benefits.

The Big Data CoE takes ownership of all data and presents integrated data for analysis, improving performance across the global enterprise. The Big Data CoE can thus deliver long-term return on investment (ROI), enabling business leaders to build a solid data foundation for making better analytics-based decisions.


Understanding Data Silos and the Pitfalls of Data Hoarding 


Before I dive into the details of the Big Data CoE, let's take a quick look at data silos and the dangers of data hoarding.

In large, global enterprises, the autonomy of individual business units (BUs) is notorious and pervasive. Data silos are born in this environment, where BU-level budgets and procurement processes create siloed systems.

Group IT resources are often scarce and are often designated for specific BU functions. An individual BU has no financial incentive to share data and IT resources. As a result, the data collected by each BU becomes siloed: it is hoarded in each BU's IT systems and is not shared.

This results in organizational deficiencies, where the entire organization suffers from redundant systems and inefficient decision-making. Because enterprise information systems remain segregated, data is walled up in departmental databases and applications. With valuable assets trapped in silos, BUs are unable to leverage data to improve processes, workflows or service delivery.

While data silos are created by operational and technical challenges, data hoarding is the result of insular cultures that encourage autonomy and self-reliance, as well as stringent compliance mandates for securing data, especially in this era of data leaks/breaches and liability lawsuits.

In this environment, "business data" becomes "OUR DATA." Data hoarding trumps openness and sharing.

The impact of data silos and data hoarding is devastating. Without data sharing across BUs, each BU maintains its own view of the business, and there is no holistic, consistent view of the global business. Relevant data is never integrated, which leads to missed opportunities, wrongful expenses and wastage, and delays in discovering fraud, waste and misuse of money.

In addition, critical decisions are made with partial data, which leads to unproductive staff and duplicated effort, and hence to waste. Budgets are drained by the cost of managing and maintaining complex, redundant information systems, applications and system interfaces.

Finally, data silos and data hoarding weaken security and compliance efforts. It is harder to ensure the security and privacy of information as it moves among computer systems and databases, which can lead to noncompliance with critical regulations (PCI-DSS, SOX, etc.).

A Holistic Model: Big Data CoE 


Envision a pan-enterprise model for managing Big Data as an organizational center of excellence: a competency center whose core focus is to manage Big Data across the organization and provide the right set of tools and infrastructure for business analytics.

The Big Data CoE is created with a common focus: to manage data and develop new technologies and architectures to analyze big data for making better business decisions.

When the Big Data CoE model is applied to the enterprise, we can immediately see the following benefits:


  • Data is treated as an organizational asset.
    By treating data as an organizational asset, the CoE develops and fosters a collaborative environment for users across BUs to meet and exchange ideas, discuss new projects and share best practices.

     
  • Data is managed separately from IT in terms of strategy, organization, resources, purchasing and deployment. This frees enterprise IT from handling the challenges of Big Data. Data can reside on in-house systems or on public clouds.

     
  • Distinct processes are developed for collecting, consolidating, managing, linking, securing, sharing, analyzing, archiving, publishing and governing data.

     
  • Analytical expertise is shared among individual departments, which relieves them of the burden of independently recruiting their own talent and developing unique solutions.

     
  • Data is aggregated, shared and analyzed using a single, enterprise-wide data platform. A unified system of enterprise data management and analytics tools ensures seamless integration of data repositories with analytic tools, provides a common user experience with access to all types of enterprise data and builds end user engagement in data-driven decision-making.


Unlike siloed data, an enterprise-wide approach to data provides all BUs with a single version of the truth. With this integrated, holistic view, decision-making involves all relevant, consistent data, regardless of data ownership.

Creation of Big Data CoE is Transformative


The biggest benefit of Big Data CoE is that it transforms business operations.

Big Data CoE leads to business transformation:


  1. More Efficient Enterprise. When used at the enterprise level, Big Data can reduce fraud, waste and misuse of funds; enhance communication and coordination between BUs; improve the management of Big Data; and identify key business trends across the entire enterprise.
     
  2. Faster & Better Decision Making. With a complete view of each BU's data, the CoE can offer the most appropriate data management and analytical services, identifying patterns that might otherwise be missed. It can also eliminate data processing errors and duplicative data entry. Consistent procedures and processes eliminate waste of valuable time and speed up decision making.
     
  3. Stronger Compliance Efforts. When data management is integrated, it is easier to implement data compliance mandates across the organization. Enterprise data can be kept secure and compliant even when it is shared across BUs.
     
  4. Cost Reduction. More efficient data management, analytics and workflows, compliance, security and service delivery all lead to cost reductions. By consolidating data analytics efforts under a single CoE, additional savings are realized because departments don't have to procure and manage their own systems or hire department-specific data scientists and analysts.


To realize the benefits of an enterprise approach to Big Data, enterprises must adopt a comprehensive approach that leverages the appropriate tools and techniques.

Closing Thoughts  

Big Data can provide tremendous business advantages by improving business productivity, business decision making and service delivery. But Big Data can live up to its true potential only if analytics programs are implemented thoughtfully and skillfully.

The strategic use of Big Data and data analytics technologies and tools requires considerable innovation, creative thinking and leadership.

The "silo mentality" has to be broken up and data needs to be shared across the enterprise as a common asset. Having a CoE manage all of Big Data allows enterprises to holistically manage, share and leverage data for faster decision making and service delivery.

Big Data CoE helps enterprises and its BU's to rethink and retool the way they collect, manage, archive and use data. CoE will enable Bus to work together and share information - this leads to better decision-making, faster service delivery and develop an enterprise wide approach to managing and using Big Data.

Wednesday, December 23, 2015

Juniper's Software Security Problem and What We Can Learn from It



Recently, Juniper was in the news for the wrong reason: the FBI is investigating Juniper over a security hole in its NetScreen firewall products.

The FBI is investigating a security hole, i.e., someone had put "unauthorized code" inside NetScreen firewalls, security equipment sold by Juniper Networks.

While this investigation goes on, there is a major lesson for all product companies. Adding "unauthorized code" to a product is a regular technique used by various government agencies. A few years ago, the NSA did a similar thing, putting code on Cisco network equipment, which prompted John Chambers, then the CEO of Cisco, to write an open letter to President Obama asking him to stop the NSA from hacking into Cisco's equipment.

In February 2015, Kaspersky Lab found that the NSA had created spyware hidden in the hard drive firmware of more than a dozen of the largest manufacturer brands in the industry, including Samsung, Western Digital, Seagate, Maxtor, Toshiba and Hitachi. See: NSA Planted Stuxnet-Type Malware Deep Within Hard Drive Firmware.

Given the extent of the reach and power that the NSA and other government agencies have, it would be wise to assume that the majority of computer equipment could contain "unauthorized code"!


How does such "unauthorized code" get into the system?


According to information revealed by Edward Snowden, hacking into hardware is relatively easy for agencies like the NSA. The NSA has a special department to handle this, called TAO: the Tailored Access Operations unit. TAO intercepts servers, routers and other network gear being shipped to organizations targeted for surveillance and installs covert implant firmware onto them before they are delivered. These devices are then re-packaged and placed back into transit to the original destination. All of this happens with the support of Intelligence Community partners and the technical wizards in TAO.

What can be done about it?


Since the "unauthorized code" is added during the transit, the original manufacturer has no idea that the software inside the device has been compromised. The customer also has no idea of what's inside the box.

As the experts at TAO can do a good job, there will be no physical signs of tampering on the physical boxes.

There are several ways to avoid such hacks. (but there is no perfect plan) This calls for vendor to provide a greater level of transparency to the customer.

1. Ship Hardware only - without any firmware in the device.

All physically shipped products are shipped without any software or data storage drives/memory.
The equipment vendor then provides a copy of the firmware via a secure channel. In an extreme case, the customer physically walks into a vendor location and gets a copy of the firmware on a disc, which is then installed on the device.

2. Vendor provides CRC checksum details, code size, timestamps and other details to the customer

The equipment vendor also provides metadata about the compiled firmware. Details such as the CRC checksum, code size and code timestamps of the golden image are shared with the customer. Customers can then check their equipment and its embedded code to see if anything has changed.
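
As a minimal sketch of that verification step, the Python snippet below hashes a firmware image pulled from a device and compares it against a vendor-published golden value. SHA-256 is used here because a plain CRC detects accidental corruption but is easy to forge; the path and golden hash are illustrative placeholders.

```python
import hashlib

# Vendor-published digest of the golden firmware image (illustrative value).
GOLDEN_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"


def firmware_digest(path):
    """Stream the firmware image through SHA-256 without loading it whole."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha.update(chunk)
    return sha.hexdigest()


digest = firmware_digest("/tmp/firmware.bin")  # image dumped from the device
if digest != GOLDEN_SHA256:
    print("WARNING: firmware differs from the vendor's golden image")
```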

3. Customers can insist on open source software only

Several governments now insist on using open source software, which engineers on the customer's side can validate, compile and install on the hardware.

The other option is for the customer to insist that the vendor provide the source code. The customer can then validate, compile and install it on the hardware.

All of this calls for greater transparency from the vendor and changes to product shipment and deployment, which add to the final cost of the product.

How to detect and catch such "unauthorized code"?


The burden of detecting and catching such "unauthorized code" lies with the end customer. Customers of computer systems have much to lose, hence they must step up their internal software security practices.

Many customers have instituted a software security assurance program, but this program must also cover firmware and hardware.

EMC has a software product called Network Configuration Manager (NCM) to ensure that network devices are running authorized code.

A network configuration and compliance management tool can be used to regularly check all equipment for any changes to the underlying firmware and warn if any irregularities are found. NCM can also be used to remediate infected devices and restore the firmware to the known "golden image", a customer-validated version.

Such a tool can automatically check several thousand devices simultaneously, perform regular compliance audits, report any violations and even carry out automated remediation steps.

In addition to network device configuration, the customer will have to constantly monitor network traffic to detect any "unauthorized data movement". Here again there are tools such as RSA's enVision and Lancope's StealthWatch System.

Tools can help, but remember that organizations like the NSA have a long reach; even RSA's security code has been compromised by the NSA. See: Alleged NSA Dual_EC_DRBG backdoor.

Closing Thoughts


Information security is a major problem. When governments are hacking basic network devices and hardware such as hard drives, the burden of information security lies squarely on the customer's shoulders. Customers should be vigilant, demand greater transparency and implement tools and processes to catch any security violations.

Today, there are software tools to help with network security and compliance. Customers have to use these tools to the best possible extent, demand greater code-level transparency and implement a robust security assurance program. (And then keep your fingers crossed!)