
Data Analytics for Business Excellence


Abstract

Data analytics has been in widespread use since around 2004, with companies such as Apple, Amazon, and Google adopting the concept around 2006-2007. The demand was driven by the growth of the web, mobile apps, and the Internet of Things (IoT), which rapidly increased the number of users on all these mediums (a shift also referred to as digitization). This growth also created special requirements for databases and for handling the new kinds of data that emerged from these usages. IT already had a solution in the form of NoSQL (whose origins date to the 1960s), but it was largely neglected until 2004. NoSQL provided a simple, scalable, efficient, and fail-safe solution.

Examples include Bigtable (Google), Dynamo (Amazon), Cassandra (Facebook), and MongoDB (used in the CERN project).

The use of the above tools, combined with analytical methods normally grounded in statistical theory, is most useful for businesses in which all functions are evaluated to ensure the sustainability of the corporation. The EFQM Excellence Model (European Foundation for Quality Management) and the Baldrige Criteria for Performance Excellence are two such systems adopted worldwide; in India we follow the EFQM model with RADAR scoring. Tools such as Minitab, Stata, SAS, and the R language are apt for generating the mean, median, mode, and standard deviation, control charts (R charts, X-bar charts), gauge R&R analysis, box plots and other graph types, regression analysis, and design of experiments (DOE). The most powerful free tool is the R language (community-based free software), which makes new packages available almost weekly and requires very few lines of code to produce all graphical representations.

An important aspect is that Amazon, Google, and Apple use disruptive business models to create and recapture market share. Amazon has introduced the Fire TV Stick, with Indian and US TV and motion-picture content, to compete with Netflix, and has added web series and the Echo to its product range at very attractive price tags. The same phenomenon is replicated in India by most online e-commerce companies, and customers benefit in the process.

Keywords---Data Analytics, Business Excellence, Six Sigma, Descriptive Analytics, Prescriptive Analytics, Predictive Analytics, NoSQL, R Language

What is Data Analytics?

Data analytics is the scientific analysis of data collected by a business and generated from its business processes.

Data analytics is done with the help of computer programs, as the amount of data generated sometimes runs to terabytes or even petabytes.

The computer program mainly handles the data and applies statistical analytical methods to it, which are useful in arriving at practical decisions.

The computer program may be a package devoted solely to this kind of analysis, such as SPSS, Stata, Minitab, SAS, or the R language.

Need for Data Analytics

The need for data analytics arose from the fact that the quantum of data generated is huge, and it has to be distributed to all business process owners.

Also, different business processes need different data, and that data influences decision making in pursuit of business objectives.

Uniformity in the processing and interpretation of data analysis is also extremely important, and it is influenced by the culture of the organization.

As this analysis impacts the business bottom line, extreme care must be taken in collecting the data, cleansing it, and presenting it in the proper perspective.

What is Business Excellence?

Business Excellence is often described as outstanding practices in managing the organization and achieving results, all based on a set of fundamental concepts or values. These practices have evolved into models for how a world-class organization should operate.

Business Excellence, as described by the European Foundation for Quality Management (EFQM), refers to "outstanding practices in managing the organization and achieving results, all based on a set of eight fundamental concepts", such as results orientation, customer focus, and leadership and constancy of purpose. The EFQM model's criteria are Leadership; People; Policy and Strategy; Partnerships and Resources; Processes; People Results; Customer Results; Society Results; and Key Performance Results.

The Baldrige Criteria for Performance Excellence emphasize Leadership; Strategic Planning; Customer and Market Focus; Measurement, Analysis and Knowledge Management; Workforce Focus; Process Management; and Results.

Baldrige Criteria for Performance Excellence

Leadership: guide and sustain the organization; governance structure; ethical, legal, and community responsibilities.

Strategic Planning: develop strategic objectives and action plans; manage change and deployment; measure progress.

Customer and Market Focus: how customer requirements are determined; customer relationships; factors leading to customer acquisition, satisfaction, loyalty, and retention, and to business expansion and sustainability.

Measurement, Analysis and Knowledge Management: gather, analyze, manage, and improve data; how IT is managed; review and improve performance.

Workforce Focus: engage, manage, and develop the workforce; utilize its full potential to meet the mission, strategy, and action plans; assess capability and capacity needs and the workforce environment.

Process Management: core competencies; work systems geared to customer value, organizational success, and sustainability; emergency preparedness.

Results: measured in all of the above areas.

EFQM Excellence Model Criteria

Leadership: achieve the mission and vision; embody organizational values; build systems for sustainable success; implement actions; maintain constancy of purpose; inspire all to follow.

People: manage, develop, and release the potential of people (at the individual, team, and organizational level); fairness and equality; involve, empower, care for, communicate with, reward, and recognize; motivate and build commitment; use skills and knowledge for the organization's benefit.

Policy and Strategy: stakeholder-focused strategy; markets; policies, plans, objectives, and processes to deliver it.

Partnerships and Resources: manage external partnerships, suppliers, and internal resources to support policy and strategy; operate processes effectively; balance the current and future needs of the organization, the community, and the environment.

Processes: designed, managed, and improved to satisfy and add value for customers and other stakeholders.

People, Customer and Society Results: measure and achieve results.

Key Performance Results: measure and achieve results with respect to policy and strategy.

How Data Analytics is useful for Business Excellence?

Descriptive Analytics – Descriptive analytics is a preliminary stage of data processing that creates summaries of historical data to yield useful information and possibly prepare the data for further analysis. It uses data aggregation and data mining to provide insight into what has happened.

Descriptive statistics is a method of organizing, summarizing, and presenting data in a convenient and informative way, with the aim of understanding what has happened or what the current situation is; it underpins descriptive analytics. The actual method used depends on what information we would like to extract.

The tools and techniques covered in Six Sigma that are applied in descriptive analytics are measures of central tendency, such as the mean, median, mode, and quartiles, and measures of dispersion or variation, such as the standard deviation, variance, and range. This is reasonably well captured in Lean Six Sigma.
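As a minimal sketch of these descriptive measures (using Python's standard library rather than a dedicated package like Minitab or R, and with made-up sample data), the computation looks like this:

```python
import statistics

# Hypothetical sample: daily defect counts from a production line
defects = [4, 7, 5, 9, 4, 6, 8, 4, 5, 7]

# Measures of central tendency
mean = statistics.mean(defects)      # arithmetic average
median = statistics.median(defects)  # middle value of the sorted data
mode = statistics.mode(defects)      # most frequent value

# Measures of dispersion / variation
stdev = statistics.stdev(defects)          # sample standard deviation
data_range = max(defects) - min(defects)   # range

print(mean, median, mode, round(stdev, 3), data_range)
```

The same summaries would be one-liners in R (`mean()`, `median()`, `sd()`) or a single Descriptive Statistics command in Minitab; the point is only to show what descriptive analytics computes.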

Prescriptive Analytics – Advises on possible outcomes and how to influence them (provides a basis for decision making)

Prescriptive analytics is a relatively new field that allows users to "prescribe" a number of possible actions and guides them towards a solution. It helps quantify the effect of future decisions in order to advise on possible outcomes before the actual decision is made. It thus provides insight not just into what will happen but why it will happen, in terms of the actions required to ensure the prediction is realized and how best to maximize the benefits.

It uses a combination of techniques and tools, such as business rules, algorithms, machine learning, and computational modeling procedures, applied against input from various sources: transactional and historical data, real-time data feeds, and big data.

These techniques are relatively complex, and most companies are not applying them yet. The tools and techniques covered in Six Sigma that can be applied in prescriptive analytics are design of experiments and simulation. This is captured in Lean Six Sigma, but can be handled better with prescriptive analytics using big data.
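A toy illustration of the prescriptive idea (a hypothetical newsvendor-style stocking decision with invented figures, not an example from the source) combines a simple model of outcomes with a business rule that selects the recommended action:

```python
# Hypothetical demand distribution for a product: units -> probability
demand_dist = {80: 0.2, 100: 0.5, 120: 0.3}

UNIT_COST, UNIT_PRICE = 6.0, 10.0  # assumed cost and selling price per unit

def expected_profit(stock):
    """Expected profit for a given stocking level over the demand distribution."""
    total = 0.0
    for demand, prob in demand_dist.items():
        sold = min(stock, demand)  # cannot sell more than was stocked
        total += prob * (sold * UNIT_PRICE - stock * UNIT_COST)
    return total

# Business rule: prescribe the stocking level with the highest expected profit
candidates = [80, 100, 120]
best = max(candidates, key=expected_profit)
print(best, round(expected_profit(best), 2))
```

Real prescriptive systems replace this tiny model with machine learning, simulation, or optimization over big data, but the structure, "evaluate each possible action, then prescribe the best", is the same.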

Predictive Analytics – Understanding the future

Predictive analytics uses statistical models and forecasting techniques to understand the future and answer the question of what could happen. Its predictions are not 100% certain; this uncertainty is expressed as a probability. It uses the historical data available within organizations to identify patterns and applies statistical models to forecast customer behavior, purchase patterns, inventory, and sales. Another common application is computing credit scores. It helps fill in information that is not available, based on the information that is.

The tools and techniques covered in Six Sigma that can be applied in predictive analytics are hypothesis testing, correlation, and regression. This is also reasonably well captured in Lean Six Sigma.
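As a minimal sketch of the regression technique mentioned above (pure Python with invented figures; in practice this would be `lm()` in R or a regression command in Minitab), simple least-squares regression fits a line to historical data and then forecasts an unseen value:

```python
# Hypothetical history: monthly ad spend (x) vs. sales (y), both in lakhs
x = [2.0, 3.0, 4.0, 5.0, 6.0]
y = [5.1, 6.9, 9.2, 10.8, 13.1]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Ordinary least-squares estimates of slope and intercept
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x

# Forecast sales for a planned spend of 7 lakhs
forecast = intercept + slope * 7.0
print(round(slope, 3), round(forecast, 2))
```

The forecast is a point estimate; a real predictive model would also report a prediction interval to express the uncertainty discussed above.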

Descriptive Analytics – tools deployed: data aggregation and data mining methods that organize the data and make it possible to identify patterns and relationships.

Predictive Analytics – tools deployed: data mining, machine learning, AI.

Prescriptive Analytics – tools deployed: mathematical algorithms and business rules applied to the data set.

[1]

Top 10 Trends of 2018

1. Artificial Intelligence

2. Predictive and Prescriptive Tools

3. Natural Language Processing (NLP)

4. Data Quality Management

5. Multicloud Strategy

6. Data Governance

7. Security

8. Chief Data Officer (CDO)

9. Embedded Business Intelligence

10. Collaborative Business Intelligence

[2]

Features of R useful for Data Analytics

R is free.

R is a comprehensive statistical platform offering all types of data-analytic techniques.

R contains advanced statistical routines not yet available in other packages. New methods become available for download on a weekly basis.

R has state-of-the-art graphics capabilities.

R is a powerful platform for interactive data analysis and exploration.

R can import data from multiple sources, including text files, database management systems, statistical packages, and specialized data stores.

R can access data directly from web pages, social media sites, and a wide range of online data services.

CII has been promoting the Excellence Framework for over two decades towards building organizational excellence. The Excellence Model encompasses all aspects of managing the business, viz. People, Partnerships, Processes, and Performance, and provides an integrated management framework for self-assessment by top management.

The Excellence Model is based on universally accepted standards and practices found in the European Quality Award, the US Malcolm Baldrige National Quality Award, the Japan Quality Award, and the Australian Quality Award. Several organizations have successfully implemented the Business Excellence Model. Recognition is based on international benchmarks and follows a rigorous and transparent procedure, creating opportunities to benchmark against other organizations.

The assessment is done by a team of highly qualified and trained business professionals. CII and the Export-Import Bank of India (EXIM) jointly instituted the CII-EXIM Bank Award for Business Excellence to recognize organizations that have demonstrated outstanding performance supported by excellent practices, and to present them as role models for others to emulate. Read more: http://cii-iq.in

NoSQL vs. SQL – four reasons why NoSQL is better for big data applications.[3] Big data NoSQL databases were pioneered by top internet companies such as Amazon, Google, LinkedIn, and Facebook to overcome the drawbacks of the RDBMS. The RDBMS is not always the best solution for every situation, as it cannot cope with the increasing growth of unstructured data. As data-processing requirements grow exponentially, NoSQL offers a dynamic and cloud-friendly approach to processing unstructured data with ease. IT professionals often debate the merits of SQL vs. NoSQL, but with increasing business data management needs, NoSQL is becoming the new darling of the big data movement. What follows is a discussion of SQL vs. NoSQL and of why NoSQL has empowered many big data applications today.

NoSQL is a database technology driven by cloud computing, the web, big data, and big users. NoSQL now leads the way for popular internet companies such as LinkedIn, Google, Amazon, and Facebook in overcoming the drawbacks of the 40-year-old RDBMS. A NoSQL database, also known as "Not Only SQL", is an alternative to an SQL database that does not require any kind of fixed table schema. NoSQL generally scales horizontally and avoids major join operations on the data. A NoSQL database can be thought of as structured storage of which the relational database is a subset. NoSQL covers a multitude of databases, each with a different data storage model; the most popular types are graph, key-value, columnar, and document.[6]

Unstructured data in Big Data

Before the modern-day ubiquity of online and mobile applications, databases processed straightforward, structured data. Data models were relatively simple and described a set of relationships between the different data types in the database.

Unstructured data, in contrast, refers to data that doesn’t fit neatly into the traditional row and column structure of relational databases. Examples of unstructured data include: emails, videos, audio files, web pages, and social media messages. In today’s world of Big Data, most of the data that is created is unstructured with some estimates of it being more than 95% of all data generated.

As a result, enterprises are looking to this new generation of databases, known as NoSQL, to address unstructured data. MongoDB stands as a leader in this movement with over 10 million downloads and hundreds of thousands of deployments. As a document database with flexible schema, MongoDB was built specifically to handle unstructured data. MongoDB’s flexible data model allows for development without a predefined schema which resonates particularly when most of the data in your system is unstructured.
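To illustrate what "flexible schema" means in a document database (a plain-Python sketch of the document model with invented data, not actual MongoDB API calls), documents in the same collection can carry entirely different fields:

```python
# A "collection" modeled as a list of schema-free documents (dicts)
collection = [
    {"_id": 1, "type": "email", "from": "a@example.com", "subject": "Report"},
    {"_id": 2, "type": "post", "user": "pradeep", "text": "Hello", "tags": ["bigdata"]},
    {"_id": 3, "type": "image", "file": "chart.png", "width": 800, "height": 600},
]

def find_with_field(coll, field):
    """Return documents that contain a given field, regardless of their 'schema'."""
    return [doc for doc in coll if field in doc]

tagged = find_with_field(collection, "tags")
print([doc["_id"] for doc in tagged])
```

A relational table would force all three records into one fixed set of columns (mostly NULLs); the document model stores each item with exactly the fields it has, which is why it suits unstructured and semi-structured data.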

Unstructured data in organizations includes emails, word-processing files, PDF files, spreadsheets, digital images, video, audio, and social media posts.[4]

Techniques such as data mining, natural language processing (NLP), and text analytics provide different methods to find patterns in, or otherwise interpret, this information. Common techniques for structuring text usually involve manual tagging with metadata or part-of-speech tagging for further text mining-based structuring. The Unstructured Information Management Architecture (UIMA) standard provided a common framework for processing this information to extract meaning and create structured data about the information.[5]

Software that creates machine-processable structure can utilize the linguistic, auditory, and visual structure that exists in all forms of human communication.[6] Algorithms can infer this inherent structure from text, for instance, by examining word morphology, sentence syntax, and other small- and large-scale patterns. Unstructured information can then be enriched and tagged to address ambiguities, and relevancy-based techniques can then be used to facilitate search and discovery. Examples of "unstructured data" may include books, journals, documents, metadata, health records, audio, video, analog data, images, files, and unstructured text such as the body of an e-mail message, web page, or word-processor document. While the main content being conveyed does not have a defined structure, it generally comes packaged in objects (e.g. files or documents, …) that themselves have structure and are thus a mix of structured and unstructured data, but collectively this is still referred to as "unstructured data".[7] For example, an HTML web page is tagged, but HTML markup typically serves solely for rendering. It does not capture the meaning or function of tagged elements in ways that support automated processing of the information content of the page. XHTML tagging does allow machine processing of elements, although it typically does not capture or convey the semantic meaning of tagged terms.

Since unstructured data commonly occurs in electronic documents, the use of a content or document management system which can categorize entire documents is often preferred over data transfer and manipulation from within the documents. Document management thus provides the means to convey structure onto document collections.

Products are available for analyzing and understanding unstructured data for business applications. These include offerings from companies such as OpenText, SailPoint, Basis Technology Corp., NetOwl, LogRhythm, ZL Technologies, SAS, Provalis Research, Inxight, Mareana, Datagrav,[5] and IBM's SPSS and Watson, as well as more specialized offerings such as Attensity, Megaputer Intelligence, Clarabridge, Graphext, Stratifyd, Medallia, General Sentiment, and Sysomos, which focus on analyzing unstructured social media data. Other vendors, such as Smartlogic and IRI (CoSort), can find and structure data in unstructured sources, then integrate and transform it along with structured data for business intelligence and analytic purposes.[8] Object storage systems are an increasingly common way of storing and managing large volumes of unstructured data; examples include Scality, Dell EMC Elastic Cloud Storage, and Ceph.

10 Killer Applications of NoSQL DBMS [5]

[1] FACEBOOK MESSAGING PLATFORM. Apache Cassandra was created by Facebook to power its Inbox, which it did for a number of years. Cassandra worked by doing the following:

• Cassandra indexed users' messages and the terms (words, and so on) in the messages, and drove a search over all the content in those messages. The user ID was the primary key, each term became a super column, and the message IDs were the column names.

• Cassandra provided the ability to list all messages sent to and from a particular user. Here the user ID was the primary key, the recipient IDs were the super columns, and the message IDs were the column names.

• The original Facebook Cassandra paper, annotated with recent information, is maintained by DataStax, the commercial company promoting Cassandra today.
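The indexing scheme described above can be pictured with nested maps (a deliberately simplified Python illustration with invented IDs, not Cassandra's actual API): the user ID keys the row, each term acts as a super column, and message IDs play the role of column names:

```python
# Simplified wide-column layout: user_id -> term -> set of message IDs
inbox_index = {
    "user42": {
        "invoice": {"msg001", "msg007"},
        "meeting": {"msg003"},
    },
    "user99": {
        "invoice": {"msg010"},
    },
}

def search_inbox(index, user_id, term):
    """Return the IDs of a user's messages that contain the given term."""
    return index.get(user_id, {}).get(term, set())

print(sorted(search_inbox(inbox_index, "user42", "invoice")))
```

In the real system the outer map is partitioned across many nodes by the primary key, which is what lets the search scale to billions of messages.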

[2] AMAZON DYNAMODB. Amazon originally published the Dynamo paper, thereby launching the concept of NoSQL key-value stores. Since then, Amazon has created a separate database called DynamoDB, offered as a service on the Amazon Web Services marketplace. Although DynamoDB gets its name from the original Dynamo, it takes a different approach: DynamoDB provides worldwide synchronous replication in order to guarantee the consistency and durability essential in enterprise applications. With DynamoDB, you pay only for the hourly throughput capacity you use, as you use it, rather than for the amount of data you store, which is an interesting model that new application developers will find appealing. As of writing, you also get a "free tier" option that includes 25 GB of storage and a number of write and read capacity units.

GOOGLE MAIL. Google's Bigtable was created to provide wide-column storage for a range of Google's applications, including Orkut, Google Earth, web indexing, Google Maps, Google Books, YouTube, blogger.com, Google Code, and Google Mail. Bigtable clones provide index lookup tables for very large sets of information.

[3] LINKEDIN. LinkedIn has used Hadoop to churn through information about relationships overnight and push the latest graph data to the Voldemort key-value NoSQL store for querying the next day. In this way, LinkedIn maintained a rolling view of all data in the service.

Conclusion

The most important aspect is that Amazon, Google, and Apple use disruptive business models to create and recapture market share. Amazon introduced the Fire TV Stick, with Indian and US TV and motion-picture content, to compete with Netflix, and added web series and the Echo to its product range at very attractive prices. The same phenomenon is being replicated in India by most online e-commerce companies, and customers benefit in the process.

References

[1] Anita Upadhya, https://www.benchmarksixsigma.com/forum/topic/34901-business-analytics/

[2] https://www.datapine.com/blog/business-intelligence-trends/

[3] https://www.dezyre.com/article/nosql-vs-sql-4-reasons-why-nosql-is-better-for-big-data-applications/86

[4] https://sherpasoftware.com/blog/structured-and-unstructured-data-what-is-it/

[5] http://www.dummies.com/programming/big-data/nosql/10-killer-nosql-applications/

[6] https://www.mongodb.com/scale/unstructured-data-in-big-data

[7] R in Action: Data Analysis and Graphics with R, Robert Kabacoff
