Unit 3
Although there are instances of rigorous process thinking in manufacturing all the way back to the
Arsenal in Venice in the 1450s, the first person to truly integrate an entire production process was
Henry Ford. At Highland Park, MI, in 1913 he married consistently interchangeable parts with
standard work and moving conveyance to create what he called flow production. The public
grasped this in the dramatic form of the moving assembly line, but from the standpoint of the
manufacturing engineer the breakthroughs actually went much further.
Ford lined up fabrication steps in process sequence wherever possible using special-purpose
machines and go/no-go gauges to fabricate and assemble the components going into the vehicle
within a few minutes, and deliver perfectly fitting components directly to line-side. This was a
truly revolutionary break from the shop practices of the American System that consisted of
general-purpose machines grouped by process, which made parts that eventually found their way
into finished products after a good bit of tinkering (fitting) in subassembly and final assembly.
The problem with Ford’s system was not the flow: He was able to turn the inventories of the entire
company every few days. Rather it was his inability to provide variety. The Model T was not just
limited to one color. It was also limited to one specification so that all Model T chassis were
essentially identical up through the end of production in 1926. (The customer did have a choice of
four or five body styles, a drop-on feature from outside suppliers added at the very end of the
production line.) Indeed, it appears that practically every machine in the Ford Motor Company
worked on a single part number, and there were essentially no changeovers.
When the world wanted variety, including model cycles shorter than the 19 years for the Model
T, Ford seemed to lose his way. Other automakers responded to the need for many models, each
with many options, but with production systems whose design and fabrication steps regressed
toward process areas with much longer throughput times. Over time they populated their
fabrication shops with larger and larger machines that ran faster and faster, apparently lowering
costs per process step, but continually increasing throughput times and inventories except in the
rare case—like engine machining lines—where all of the process steps could be linked and
automated. Even worse, the time lags between process steps and the complex part routings
required ever more sophisticated information management systems culminating in computerized
Materials Requirements Planning (MRP) systems.
As Kiichiro Toyoda, Taiichi Ohno, and others at Toyota looked at this situation in the 1930s, and
more intensely just after World War II, it occurred to them that a series of simple innovations
might make it more possible to provide both continuity in process flow and a wide variety in
product offerings. They therefore revisited Ford’s original thinking, and invented the Toyota
Production System.
This system in essence shifted the focus of the manufacturing engineer from individual machines
and their utilization, to the flow of the product through the total process. Toyota concluded that by
right-sizing machines for the actual volume needed, introducing self-monitoring machines to
ensure quality, lining the machines up in process sequence, pioneering quick setups so each
machine could make small volumes of many part numbers, and having each process step notify
the previous step of its current needs for materials, it would be possible to obtain low cost, high
variety, high quality, and very rapid throughput times to respond to changing customer desires.
Also, information management could be made much simpler and more accurate.
The thought process of lean was thoroughly described in the book The Machine That Changed the
World (1990) by James P. Womack, Daniel Roos, and Daniel T. Jones. In a subsequent volume,
Lean Thinking (1996), James P. Womack and Daniel T. Jones distilled these lean principles even
further to five: value, the value stream, flow, pull, and perfection.
Lean Today
As these words are written, Toyota, the leading lean exemplar in the world, stands poised to
become the largest automaker in the world in terms of overall sales. Its dominant success in
everything from rising sales and market shares in every global market to a clear lead in hybrid
technology stands as the strongest proof of the power of lean enterprise.
This continued success has over the past two decades created an enormous demand for greater
knowledge about lean thinking. There are literally hundreds of books and papers, not to mention
thousands of media articles exploring the subject, and numerous other resources available to this
growing audience.
As lean thinking continues to spread to every country in the world, leaders are also adapting the
tools and principles beyond manufacturing, to logistics and distribution, services, retail, healthcare,
construction, maintenance, and even government. Indeed, lean consciousness and methods are
only beginning to take root among senior managers and leaders in all sectors today.
Lean manufacturing, also known as lean production or simply lean, is a practice that organizations
from numerous fields can adopt. Some well-known companies that use lean include Toyota, Intel, John
Deere and Nike. The approach is based on the Toyota Production System and is still used by that
company, as well as myriad others. Companies that use enterprise resource planning (ERP) can
also benefit from using a lean production system.
Lean manufacturing is based on a number of specific principles, such as Kaizen, or continuous
improvement. Lean manufacturing was introduced to the Western world via the 1990 publication
of The Machine That Changed the World, which was based on an MIT study into the future of the
automobile that detailed Toyota's lean production system. Since that time, lean principles have
profoundly influenced manufacturing concepts throughout the world, as well as industries outside
of manufacturing, including healthcare, software development and service industries.
A widely referenced book, Lean Thinking: Banish Waste and Create Wealth in Your Corporation,
which was published in 1996, laid out five principles of lean, which many in the field reference as
core principles. They are value, the value stream, flow, pull and perfection. These are now used as
the basis for lean implementation.
1. Identify value from the customer's perspective. Value is created by the producer, but it is
defined by the customer. Companies need to understand the value the customer places on their
products and services, which, in turn, can help them determine how much money the customer is
willing to pay. The company must strive to eliminate waste and cost from its business processes
so that the customer's optimal price can be achieved -- at the highest profit to the company.
2. Map the value stream. This principle involves recording and analyzing the flow of information
or materials required to produce a specific product or service with the intent of identifying waste
and methods of improvement. Value stream mapping encompasses the product's entire lifecycle,
from raw materials through to disposal. Companies must examine each stage of the cycle for waste.
Anything that does not add value must be eliminated. Lean thinking recommends supply chain
alignment as part of this effort.
3. Create flow. Eliminate functional barriers and identify ways to improve lead time. This aids in
ensuring the processes are smooth from the time an order is received through to delivery. Flow is
critical to the elimination of waste. Lean manufacturing relies on preventing interruptions in the
production process and enabling a harmonized and integrated set of processes in which activities
move in a constant stream.
4. Establish a pull system. This means you only start new work when there is demand for it. Lean
manufacturing uses a pull system instead of a push system. Push systems are used in manufacturing
resource planning (MRP) systems. With a push system, inventory needs are determined in
advance, and the product is manufactured to meet that forecast. However, forecasts are typically
inaccurate, which can result in swings between too much inventory and not enough, as well as
subsequent disrupted schedules and poor customer service. In contrast to MRP, lean manufacturing
is based on a pull system in which nothing is bought or made until there is demand. Pull relies on
flexibility and communication.
5. Pursue perfection. Lean is not a one-time effort. As value is specified, value streams are
mapped, flow is created and pull is established, the process is repeated, continually reducing
waste until a state of perfection is approached in which perfect value is created with no waste.
The Toyota Production System laid out seven wastes, or processes and resources, that don't add
value for the customer. These seven wastes are:
● unnecessary transportation;
● excess inventory;
● unnecessary motion of people, equipment or machinery;
● waiting, whether it is people waiting or idle equipment;
● over-production of a product;
● over-processing or putting more time into a product than a customer needs, such as
designs that require high-tech machinery for unnecessary features; and
● defects, which require effort and cost for corrections.
Although not originally included in the Toyota Production System, many lean practitioners point
to an eighth waste: waste of unused talent and ingenuity.
Lean manufacturing requires a relentless pursuit of reducing anything that does not add value to a
product, meaning waste. This makes continuous improvement, which lies at the heart of lean
manufacturing, a must.
Common lean manufacturing tools include the following:
● 5S: a system for creating clean, well-organized work areas for workers, which prevents
wasted effort and time. 5S emphasizes organization and cleanliness.
● Kanban: a signal used to streamline processes and create just-in-time delivery. Signals
can either be physical, such as a tag or empty bin, or electronically sent through a
system.
● Jidoka: A method that outlines how to detect an abnormality, stop work until it can be
corrected, solve the problem, and then investigate the root cause.
● Andon: A visual aid, such as a flashing light, that alerts workers to a problem.
● Poka-yoke: A mechanism that safeguards against human error, such as an indicator
light that turns on if a necessary step was missed, a sign given when a bolt was
tightened the correct number of times or a system that blocks a next step until all the
previous steps are completed.
In the simplest terms, where lean holds that waste is caused by additional steps, processes and
features that a customer doesn't believe add value and won't pay for, Six Sigma holds that waste
results from process variation. Still, the two approaches are complementary and have been
combined into a data-driven approach, called Lean Six Sigma.
5S
What is 5S?
Eliminates waste that results from a poorly organized work area (e.g., wasting time looking for a
tool).
Andon
What is Andon?
Visual feedback system for the plant floor that indicates production status, alerts when assistance
is needed, and empowers operators to stop the production process.
Acts as a real-time communication tool for the plant floor that brings immediate attention to
problems as they occur – so they can be instantly addressed.
Bottleneck Analysis
Identify which part of the manufacturing process limits the overall throughput and improve the
performance of that part of the process.
Continuous Flow
Manufacturing where work-in-process smoothly flows through production with minimal (or no)
buffers between steps of the manufacturing process.
Eliminates many forms of waste (e.g., inventory, waiting time, and transport).
Gemba (The Real Place)
What is Gemba?
A philosophy that reminds us to get out of our offices and spend time on the plant floor – the place
where real action occurs.
Heijunka (Level Scheduling)
What is Heijunka?
A form of production scheduling that purposely manufactures in much smaller batches by
sequencing (mixing) product variants within the same process.
Reduces lead times (since each product or variant is manufactured more frequently) and inventory
(since batches are smaller).
Hoshin Kanri (Policy Deployment)
What is Hoshin Kanri?
Align the goals of the company (Strategy) with the plans of middle management (Tactics) and the
work performed on the plant floor (Action).
Ensures that progress towards strategic goals is consistent and thorough – eliminating the waste
that comes from poor communication and inconsistent direction.
Jidoka (Autonomation)
What is Jidoka?
Design equipment to partially automate the manufacturing process (partial automation is typically
much less expensive than full automation) and to automatically stop when defects are detected.
After Jidoka, workers can frequently monitor multiple stations (reducing labor costs) and many
quality issues can be detected immediately (improving quality).
Just-In-Time (JIT)
What is Just-In-Time?
Pull parts through production based on customer demand instead of pushing parts through
production based on projected demand. Relies on many lean tools, such as Continuous Flow,
Heijunka, Kanban, Standardized Work, and Takt Time.
Highly effective in reducing inventory levels. Improves cash flow and reduces space requirements.
Kaizen (Continuous Improvement)
What is Kaizen?
Combines the collective talents of a company to create an engine for continually eliminating waste
from manufacturing processes.
Kanban (Pull System)
What is Kanban?
A method of regulating the flow of goods both within the factory and with outside suppliers and
customers. Based on automatic replenishment through signal cards that indicate when more goods
are needed.
Eliminates waste from inventory and overproduction. Can eliminate the need for physical
inventories, instead relying on signal cards to indicate when more goods need to be ordered.
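The signal-card mechanism can be sketched in code. Below is a minimal, hypothetical two-bin kanban model in plain Python (the class name and bin sizes are invented for illustration): when the line-side bin runs empty, a kanban "card" is issued upstream and the full reserve bin is swapped in.

```python
class TwoBinKanban:
    """Toy two-bin kanban: the empty bin itself acts as the replenishment signal."""

    def __init__(self, bin_size):
        self.bin_size = bin_size
        self.active = bin_size    # bin currently being consumed at line-side
        self.reserve = bin_size   # full standby bin
        self.cards_issued = 0     # replenishment signals sent upstream

    def consume(self, qty=1):
        for _ in range(qty):
            if self.active == 0:
                # Active bin empty: issue a kanban card upstream and
                # swap the full reserve bin into position.
                self.cards_issued += 1
                self.active, self.reserve = self.reserve, 0
            self.active -= 1

    def replenish(self):
        """Upstream refills the empty bin in response to the card."""
        if self.reserve == 0:
            self.reserve = self.bin_size
            return True
        return False
```

Consumption, not a forecast, triggers replenishment, which is the essence of the pull signal.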
KPIs (Key Performance Indicators)
What are KPIs?
Metrics designed to track and encourage progress towards critical goals of the organization.
Strongly promoted KPIs can be extremely powerful drivers of behavior – so it is important to
carefully select KPIs that will drive desired behavior. The best KPIs:
● Are aligned with top-level strategic goals (thus helping to achieve those goals)
● Are effective at exposing and quantifying waste (OEE is a good example)
● Are readily influenced by plant floor employees (so they can drive results)
Muda (Waste)
What is Muda?
Anything in the manufacturing process that does not add value from the customer’s perspective.
Overall Equipment Effectiveness (OEE)
What is OEE?
Framework for measuring productivity loss for a given manufacturing process. Three categories
of loss are tracked: Availability (e.g., down time), Performance (e.g., slow cycles), and Quality
(e.g., rejects).
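The framework above is Overall Equipment Effectiveness (OEE), conventionally computed as Availability × Performance × Quality. A rough sketch (the helper name and shift numbers are invented for illustration):

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """OEE = Availability x Performance x Quality, each a ratio in [0, 1]."""
    availability = run_time / planned_time                     # time losses
    performance = (ideal_cycle_time * total_count) / run_time  # speed losses
    quality = good_count / total_count                         # defect losses
    return availability * performance * quality

# Example shift: 480 min planned, 420 min actually running,
# 1.0 min ideal cycle time, 380 pieces produced, 370 of them good.
score = oee(480, 420, 1.0, 380, 370)  # roughly 0.77, i.e. 77% effective
```

Splitting the score into its three factors shows which loss category to attack first.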
PDCA (Plan, Do, Check, Act)
What is PDCA?
An iterative methodology for implementing improvements: plan (establish the plan and expected
results), do (implement the plan), check (verify expected results were achieved), act (review and
evaluate; do it again).
Poka-Yoke (Error Proofing)
What is Poka-Yoke?
Design error detection and prevention into production processes with the goal of achieving zero
defects.
It is difficult (and expensive) to find all defects through inspection, and correcting defects typically
gets significantly more expensive at each stage of production.
Root Cause Analysis
What is Root Cause Analysis?
A problem solving methodology that focuses on resolving the underlying problem instead of
applying quick fixes that only treat immediate symptoms of the problem. A common approach is
to ask why five times – each time moving a step closer to discovering the true underlying problem.
Helps to ensure that a problem is truly eliminated by applying corrective action to the “root cause”
of the problem.
SMED (Single-Minute Exchange of Dies)
What is SMED?
Reduce setup (changeover) time to less than 10 minutes.
Enables manufacturing in smaller lots, reduces inventory, and improves customer responsiveness.
Six Big Losses
What are the Six Big Losses?
Six categories of productivity loss that are almost universally experienced in manufacturing:
● Breakdowns
● Setup/Adjustments
● Small Stops
● Reduced Speed
● Startup Rejects
● Production Rejects
Provides a framework for attacking the most common causes of waste in manufacturing.
SMART Goals
What are SMART Goals?
Goals that are: Specific, Measurable, Attainable, Relevant, and Time-Specific.
Helps to ensure that goals are effective.
Standardized Work
Documented procedures for manufacturing that capture best practices (including the time to
complete each task). Must be “living” documentation that is easy to change.
Eliminates waste by consistently applying best practices. Forms a baseline for future improvement
activities.
Takt Time
The pace of production (e.g., manufacturing one piece every 34 seconds) that aligns production
with customer demand. Calculated as Planned Production Time / Customer Demand.
Provides a simple, consistent and intuitive method of pacing production. Is easily extended to
provide an efficiency goal for the plant floor (Actual Pieces / Target Pieces).
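The takt time formula above translates directly into a one-line calculation (function name and shift figures are illustrative):

```python
def takt_time(planned_production_seconds, customer_demand_units):
    """Takt Time = Planned Production Time / Customer Demand."""
    return planned_production_seconds / customer_demand_units

# One 8-hour shift minus two 15-minute breaks = 450 minutes = 27,000 seconds.
# A customer demand of 900 units per shift gives a takt of 30 seconds per piece.
takt = takt_time(27_000, 900)  # -> 30.0
```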
Total Productive Maintenance (TPM)
What is TPM?
A holistic approach to maintenance that focuses on proactive and preventative maintenance to
maximize the operational time of equipment.
Creates a shared responsibility for equipment that encourages greater involvement by plant floor
workers. In the right environment this can be very effective in improving productivity (increasing
up time, reducing cycle times, and eliminating defects).
Value Stream Mapping
What is Value Stream Mapping?
A tool used to visually map the flow of production. Shows the current and future state of processes
in a way that highlights opportunities for improvement.
Exposes waste in the current processes and provides a roadmap for improvement through the future
state.
Visual Factory
Visual indicators, displays and controls used throughout manufacturing plants to improve
communication of information.
Six Sigma is one of today’s foremost process improvement methodologies. It introduces a set of
standards for organizations to follow, with the ultimate goal of trimming operational waste and
redundancy and therefore eliminating errors, defects and waste.
How did the Six Sigma methodology evolve to become one of the manufacturing industry's most
prominent process improvements? And do the methodology's results match the hype?
The Six Sigma Methodology comprises five data-driven stages — Define, Measure, Analyze,
Improve and Control (DMAIC). When fully implemented, DMAIC standardizes an organization’s
problem-solving approach and shapes how it ideates new process solutions.
1. Define
The “Define” stage seeks to identify all the pertinent information necessary to break down a
project, problem or process into tangible, actionable terms. It emphasizes the concrete, grounding
process improvements in actual, quantifiable and qualifiable information rather than abstract goals.
2. Measure
In the “Measure” phase, organizations assess where current process capabilities are. While they
understand they need to make improvements and have listed those improvements concretely in the
Define phase, they cannot go about tweaking and tailoring changes until they have a data-backed
baseline.
3. Analyze
The “Analyze” phase examines the data amassed during the Measure stage to isolate the exact root
causes of process inefficiencies, defects and discrepancies. In short, it extracts meaning from your
data. Insights gleaned from this analysis begin scaffolding the tangible process improvements for
your team or organization to implement.
Organizations can move beyond the Analyze phase once they’ve conducted the following:
• Pareto charts and similar Six Sigma-approved data maps tracking the frequency of an issue
• Potential capability (Cp) and actual capability (Cpk) calculations
• A formal root cause analysis
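The Cp and Cpk calculations mentioned above follow standard formulas: Cp = (USL − LSL) / 6σ measures potential capability from spread alone, while Cpk = min(USL − μ, μ − LSL) / 3σ also penalizes an off-center process. A sketch using Python's standard library (the sample values and specification limits are invented):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Return (Cp, Cpk) for sample data against lower/upper spec limits."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                    # spread vs. tolerance
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)   # penalizes off-center mean
    return cp, cpk

# A centered process: Cp and Cpk agree.
cp, cpk = process_capability([10.1, 9.9, 10.0, 10.2, 9.8], lsl=9.0, usl=11.0)
```

When the process mean drifts toward one limit, Cpk drops below Cp, flagging the drift.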
4. Improve
The “Improve” phase initiates formal action plans meant to solve the root problems identified
during analysis. Organizations directly address what they’ve identified as problem root causes,
typically deploying a Design of Experiments plan to isolate different variables and co-factors until
the true obstacle is found.
5. Control
In the final phase, “Control,” Six Sigma teams create a control plan and deploy the new
standardized process. The control plan outlines improved daily workflows, which keep critical
business process variables within accepted quality control variances.
Each of these five phases creates a repeatable template to improve your business’ process
capabilities. When the five stages are fully implemented, organizations can measure both the
effectiveness and efficiency of critical manufacturing business processes. Measurements are
tracked in a control chart, lending you quantifiable, comparable process-control data that can be
leveraged into a competitive advantage.
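Control chart limits are conventionally placed three standard deviations either side of the baseline process mean; points outside those limits signal that the process is out of control. A minimal sketch (function names and measurement values are invented):

```python
import statistics

def control_limits(baseline):
    """Shewhart-style limits: process mean +/- 3 standard deviations."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(measurements, lcl, ucl):
    """Return the measurements that fall outside the control limits."""
    return [x for x in measurements if x < lcl or x > ucl]

# Limits derived from in-control baseline data.
lcl, ucl = control_limits([10.0, 10.1, 9.9, 10.2, 9.8])
```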
DMAIC: The DMAIC method is used primarily for improving existing business processes. The
letters stand for: Define, Measure, Analyze, Improve and Control.
DMADV: The DMADV method is typically used to create new processes and new products or
services. The letters stand for: Define, Measure, Analyze, Design and Verify.
The Six Sigma methodology in the manufacturing industry carries numerous benefits, each
quantifiable and backed by charted growth.
1. Bolstered Productivity
Most manufacturing environments use a few key variables to measure productivity. From input
figures and output units to times to market, cost-benefit ratios and more, there’s a long list of star
players on a manufacturer’s productivity benchmark roster. Nearly every industrial and
manufacturing operation these days searches for ways to improve these metrics, reducing overhead
or lead times without sacrificing product quality or brand reputation.
Six Sigma presents a quantifiable, actionable template to do so. Through its data measurements
and analysis, organizations target bottlenecks, redundancies and error-prone processes, ultimately
unlocking that Goldilocks balance between producing more in less time.
2. Increased Throughput
Six Sigma’s philosophy of root cause identification and elimination is the ideal complement to a
manufacturer looking to improve throughput rates.
Using precise data, you target exact production areas, from processing times or inspections to unit
movement, queuing and warehousing. Further domain-specific data analyses then allow you to
make additional micro tweaks and changes. No more enacting new or revised workflows via trial
and error. Raw materials and complete units alike move swiftly through the production cycle, with
the throughput times to prove it.
3. Improved Quality
Six Sigma doesn’t replace the operations and workflows you perform. It aims to improve what
you’re already doing, tweaking tactical adjustments here and there to reduce mistakes and
eliminate redundancies. These are inevitable in the manufacturing industry — but not irreversible.
The results are a throughput process that produces end-to-end higher quality goods more
consistently, without upending your infrastructure or operations.
Certified Six Sigma manufacturers have better visibility over their complete production cycle.
With that knowledge comes decreased “fire drills” – instances of major product or process
defects. Six Sigma peels back the curtain on some of today’s most notorious causes of product
damage in the manufacturing industry, from improperly calibrated equipment that mishandles
components to complete changeover errors resulting in expensive — and cumbersome —
production halts.
Managers and employees alike can focus on what they need to, with fewer ad-hoc emergencies
occupying their time.
4. Reduced Costs
An end-to-end efficient manufacturing operation is a less costly operation. End of story. This
leaves more money in the pockets of your manufacturing company, reducing overhead and direct
expenses alike and bolstering your overall bottom line.
The Challenges of Six Sigma
Like any process improvement, Six Sigma is not without its growing pains.
1. Quantification Challenges
The guiding philosophy behind Six Sigma is the five-stage DMAIC process, which assumes every
detail and activity in the manufacturing environment involves quantifiable inputs and outputs.
While this isn’t misguided, it presents problems with less immediately or obviously quantifiable
operations. Without proper statistical training, organizations are burdened with the task of
quantifying data points seemingly in the dark.
What’s more, DMAIC is not the only approach used in Six Sigma theory. Others have fine-tuned
similar but separate process improvement methodologies for manufacturers to adopt depending on
operational efficiency goals.
2. Highly Statistical
Six Sigma relies on a diverse set of statistical tools to identify and validate a process’ root problem.
Using the DMAIC model alone, teams may need to conduct a wide range of quantitative
calculation techniques, from Pareto charts to capability calculations and formal root cause analyses.
3. Leadership Buy-In
Leaders in the wider organization may not have the technical fluency needed for Six Sigma
buy-in and continued support. This leadership disconnect is also likely to spur misappropriated
resources, with Six Sigma efforts not receiving proper time commitments, budgets, dedicated
personnel and other resources needed to guide it to full process-control validation.
These challenges shouldn’t discourage your manufacturing company from researching Six Sigma
consulting and training opportunities. With the proper commitment and org-wide communication,
a Six Sigma methodology can be ingrained into your manufacturing organization’s end-to-end
workflows.
Six Sigma in the manufacturing industry has one guiding premise — the elimination of variation
within the product lifecycle, ultimately ensuring each run aligns with an ideal production outcome.
This goal has distinct relevancy for those in manufacturing, including many of the following:
Business stability: Six Sigma reduces business variability. By using statistical data analysis to
target and phase out previously problematic workflows, organizations objectively assure they are
maximizing outputs using only the best, most researched and properly quantified inputs. The result
is an organization achieving new levels of business fluidity and nimbleness, alongside tighter cost
controls, smoother delivery and the greatest possible product quality.
In this digital era of advanced information technology and communication, the ever-expanding
channels bring both opportunities and challenges. It is very easy today to acquire customer data
that can help you design effective marketing campaigns, personalized outreach, and fundraising
efforts. This increases the chances of the business succeeding if the data acquired is accurate.
However, there are many data entry points on both the company and customer end, which
increases the chances of inaccurate organizational databases. Without strategies to prevent
inaccurate data entry or to cleanse inaccurate data from databases, marketing campaigns,
awareness efforts and other outreach to users may not be effective.
Data quality is an assessment or a perception of data’s fitness to fulfill its purpose. Simply put,
data is said to be high quality if it satisfies the requirements of its intended purpose. The quality of
data can be measured by six dimensions:
Completeness: Data is complete if all required records and values are present.
Consistency: Data is said to be consistent if all the systems across the enterprise reflect the same
information.
Accuracy: Data accuracy is defined as the degree to which data correctly reflects the event in
question or the ‘real world’ object.
Validity: Data is valid if it conforms to the type, format and range of its definition.
Timeliness: Data is timely if it is available when it is expected and needed.
Uniqueness: Data is unique if each real-world entity is recorded only once, without duplicates.
Decision making
When the quality of data is high, its users have high confidence in the outputs. The old saying
‘garbage in, garbage out’ is true, as is its inverse. When quality data is recorded and used, the
outputs are reliable, which mitigates risk and guesswork in decision making.
Productivity
Good data quality enhances productivity. Workers spend more time working towards their primary
mission instead of spending time validating and fixing data errors.
Effective marketing
High quality data increases marketing effectiveness. Accurate data allows precise targeting and
communications, and companies are more likely to achieve desired results.
Compliance
Maintaining good quality data makes it easy for companies to ensure compliance and avoid
potentially huge fines for non-compliance. This is particularly so in industries where regulations
govern trade with customers, such as the finance industry.
Traditionally, data management experts have focused on refining data analysis and reporting
platforms while overlooking data quality. Traditional data quality control mechanisms are based
on user experience or predefined business rules. Apart from being time-consuming, this approach
also limits performance and has low accuracy.
A new and smarter way: using AI and AI-powered MDM platforms for data quality
Every organization values the importance of data and its contribution to its success. This is even
more true in this era of big data, cloud computing and AI. The relevance of data goes beyond its
volume or how it is used. If a company has terrible data quality, all the actionable analytics in the
world will make no difference. How Artificial Intelligence, Machine Learning and Master Data
Management can work together is a hot topic right now in the MDM realm. MDM platforms are
incorporating AI and Machine Learning capabilities to improve accuracy, consistency,
manageability among others. AI has managed to improve the quality of data through the following
ways.
Automate data capture
AI can capture data without manual intervention. If the most critical details are
automatically captured, workers can forget about admin work and put more emphasis on the
customer.
Eliminate duplicates
Duplicate entries of data can lead to outdated records that result in bad data quality. AI can be used
to eliminate duplicate records in an organisation’s database and keep precise golden keys in the
database. It is difficult to identify and remove recurring entries in a big company’s repository
without the implementation of sophisticated mechanisms. An organisation can combat this by
having intelligent systems that can detect and remove duplicate keys.
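One simple way to approximate duplicate detection, far cruder than the matching an AI-powered MDM platform would use, is fuzzy string similarity. A sketch using only Python's standard library (the threshold and the record values are invented):

```python
from difflib import SequenceMatcher

def is_probable_duplicate(a, b, threshold=0.85):
    """Treat two records as duplicates if their normalized similarity is high."""
    ratio = SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()
    return ratio >= threshold

def dedupe(records):
    """Keep the first of each group of near-duplicates as the 'golden' record."""
    golden = []
    for rec in records:
        if not any(is_probable_duplicate(rec, kept) for kept in golden):
            golden.append(rec)
    return golden
```

Real MDM systems add blocking, field-by-field matching, and learned similarity models, but the keep-one-golden-record logic is the same.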
Detect anomalies
A small human error can drastically affect the utility and the quality of data in a CRM. An AI-
enabled system can detect and remove such defects. Data quality can also be improved through the
implementation of machine learning-based anomaly detection.
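A minimal form of anomaly detection flags values that fall unusually far from the mean. This z-score sketch (the threshold and the sensor readings are invented, and real ML-based systems are far more sophisticated) illustrates the idea:

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / sigma > threshold]

# A run of normal readings with one bad entry: the 500 stands out.
flagged = zscore_anomalies([100] * 20 + [500])
```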
Fill data gaps
Apart from correcting and maintaining the integrity of data, AI can improve data quality by adding
to it. Third-party organisations and governmental units can significantly add value to the quality
of a management system and MDM platforms by presenting better and more complete data, which
contributes to precise decision making. AI makes suggestions on what to fetch from a particular
set of data and builds connections within the data. When a company has detailed and clean data
in one place, it has higher chances of making informed decisions.
It is imperative for companies to have the right algorithms and queries to operate on their big data.
Random forest
Random forest is a flexible machine learning algorithm that produces reliable results. It is one of
the most widely used algorithms due to its simplicity, and it can be used for both regression and
classification purposes.
How it works
As its name suggests, a random forest algorithm creates a “forest” of decision trees and makes it
random. It builds several decision trees and combines them to achieve a more stable and accurate
prediction.
Advantages
The main advantage of random forest is that it can be used in both classification and regression tasks.
Disadvantages
A large number of trees makes the algorithm slow and ineffective for real-time predictions.
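To make the build-several-trees-and-combine idea concrete, here is a toy random forest built from one-split decision "stumps" in plain Python. All function names and the tiny dataset are invented for illustration; real projects would use a full library implementation with deeper trees and random feature selection.

```python
import random
from collections import Counter

def majority(labels):
    """Most common label; defaults to 0 for an empty side of a split."""
    return Counter(labels).most_common(1)[0][0] if labels else 0

def train_stump(X, y):
    """One-split 'tree': pick the feature/threshold that classifies best."""
    best, best_correct = (0, 0.0, 0, 0), -1
    for f in range(len(X[0])):
        for threshold in {xi[f] for xi in X}:
            left = [yi for xi, yi in zip(X, y) if xi[f] <= threshold]
            right = [yi for xi, yi in zip(X, y) if xi[f] > threshold]
            ll, rl = majority(left), majority(right)
            correct = left.count(ll) + right.count(rl)
            if correct > best_correct:
                best, best_correct = (f, threshold, ll, rl), correct
    return best

def predict_stump(stump, x):
    f, threshold, ll, rl = stump
    return ll if x[f] <= threshold else rl

def train_forest(X, y, n_trees=25, seed=0):
    """Train each stump on a bootstrap resample (sampling with replacement)."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def predict_forest(forest, x):
    """Combine the trees' votes into one, more stable prediction."""
    return majority([predict_stump(s, x) for s in forest])
```

Because each tree sees a slightly different resample of the data, the combined vote is more stable than any single tree, which is the core random forest idea.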
The Random Forest algorithm for data quality is used in various institutions such as banks, e-
commerce, medicine, and the stock market. In a banking institution, Random Forest is used to
determine which account holders use the bank's services more frequently than others and pay back
their debt on time. In the same field, it is used to identify fraudulent customers who intend to
scam the bank.
In finance, the algorithm is used to determine stock behavior and influence decision making in the
future.
Random forest is used in the medical field to detect the most appropriate combination of medicine
components and analyze a patient’s medical history. The results from such predictions help in
determining the frequency of a disease occurring in a particular area and the best treatment.
In E-commerce, the algorithm is used to predict customer behavior in buying products. It helps in
presenting a customer with their most preferred products having analyzed their purchase behavior
from past experiences. It can also predict the probability of a customer buying a particular product
based on the behavior of other customers.
For application in the stock market, the algorithm can be used to determine stock behavior
and identify the expected loss or profit.
Support Vector Machine (SVM)
SVM is a supervised machine learning algorithm that can be used for both classification and
regression. The primary goal of SVM is to classify unseen data.
Applications of SVM
Text and hypertext classification: the algorithm allows for categorisation of text and hypertext for
transductive and inductive models. It uses training data to classify documents into different
categories; each document is assigned to the category with the highest generated score.
Handwriting recognition: SVMs are used to recognise widely used handwritten characters. This is commonly applied in validating signatures on vital documents.
Bioinformatics: This includes protein and cancer classification. The SVM algorithm is used to classify genes and other biological data in patients. In recent years, the SVM algorithm has also been used to detect remote protein homology.
Advantages of SVMs
Calculation simplification
SVM relies on well-understood algorithms that simplify prediction and calculation, and its decision boundary can be visualised graphically.
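As a hedged sketch of the handwriting-recognition use case above (scikit-learn and its bundled digits dataset are illustrative choices, not prescribed by the text):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# 8x8 images of handwritten digits, a small stand-in for real handwriting data
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# RBF-kernel support vector classifier
model = SVC(kernel="rbf", gamma="scale")
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```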
Use cases
AI in business is progressively advancing. What was once science fiction is now being implemented by many organisations around the globe. In today's business era, companies use machine learning algorithms to determine trends and insights. After cleansing and improving the quality of the data, the information obtained helps in decision making and increases the company's competitiveness.
SAP HANA
SAP HANA takes collected information from various access points across the business, such as desktop computers, mobile phones and sensors. If an organisation's sales staff use company devices to record purchase orders, HANA can analyse the transactions to identify trends and irregularities.
The intent of HANA, as with other machine learning solutions, is to arrive at data-driven decisions that are better informed. Walmart, a multinational retail corporation, uses HANA to process its high volume of transaction records within seconds.
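A minimal sketch of flagging irregular transactions in a batch of purchase orders (the z-score rule and the synthetic amounts are illustrative assumptions; HANA's actual methods are not documented here):

```python
import numpy as np

rng = np.random.default_rng(0)
amounts = rng.normal(100.0, 15.0, 500)   # typical purchase-order amounts
amounts = np.append(amounts, 900.0)      # one irregular transaction

# Flag any transaction more than three standard deviations from the mean
z_scores = (amounts - amounts.mean()) / amounts.std()
irregular = np.flatnonzero(np.abs(z_scores) > 3)
```

Only the appended outlier (index 500) exceeds the threshold in this toy batch.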
General Electric
Many firms in industrial sectors such as oil, gas and aviation use GE's Predix operating system, which powers industrial apps that process equipment's historical performance data. The acquired information can be used to derive operational insights, including when a machine is likely to fail. Beyond small-scale logistics, Predix can process large amounts of information collected over long periods to refine its predictions.
In the aviation sector, aircraft use applications such as Prognostics from GE, built on Predix. The app helps airline engineering crews determine how long the landing gear can remain in service before it must be maintained. This prediction can be used to create a maintenance schedule and avoid unexpected issues and flight delays.
Avanade
Avanade is a joint venture between Accenture and Microsoft that uses the Cortana Intelligent Digital Assistant and other solutions to deliver predictive, data-based insights and analytics. Pacific Specialty, an insurance company, used Avanade to establish an analytics platform that gives its staff more insight into and perspective on the insurance business. The goal of the exercise was to use policy and customer data to drive further growth. The firm sought to provide better products by understanding policyholder trends and behaviour through analytics.
Plagiarism checkers
Renowned plagiarism checker platforms such as Turnitin use ML at their core to detect plagiarised content. Traditional plagiarism detection applications rely on massive databases against which to compare the text in question.
Machine learning helps detect plagiarised content that is not in the database, and it can recognise text translated from foreign languages. The algorithmic key to plagiarism detection is the similarity function, which gives a numeric value representing how similar one document is to another. An effective similarity function improves both the accuracy of the results and the efficiency of computing them.
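The similarity function described above can be sketched, for example, as cosine similarity over word counts (one of many possible choices, not Turnitin's actual method):

```python
import math
from collections import Counter

def similarity(doc_a: str, doc_b: str) -> float:
    """Cosine similarity of word-count vectors: 0.0 (unrelated) to 1.0 (identical)."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[word] * b[word] for word in a)
    norm = (math.sqrt(sum(c * c for c in a.values()))
            * math.sqrt(sum(c * c for c in b.values())))
    return dot / norm if norm else 0.0
```

Identical documents score 1.0, while documents with no words in common score 0.0.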
Summary:
Artificial Intelligence and Machine Learning are expected to change the present and future business world. Businesses using AI are getting better at predictive tasks such as determining the preferences of different customers. The prediction results are based on the information fed to the system. It is clear that this development will affect many industrial sectors such as banking, the stock market, e-commerce, learning, health care and manufacturing. The overall effect of implementing AI in business is increased productivity, better customer experience, improved decision making and timely planning.
Data acquisition systems (DAS or DAQ) convert analog waveforms representing physical conditions into digital values for further storage, analysis, and processing.
In simple words, data acquisition is composed of two words: data and acquisition, where data refers to raw facts and figures, which can be structured or unstructured, and acquisition means acquiring data for the task at hand.
Data acquisition means collecting data from relevant sources before it can be stored, cleaned, preprocessed, and used in further processes. It is the process of retrieving relevant business information, transforming the data into the required business form, and loading it into the designated system.
A data scientist spends roughly 80 percent of their time searching for, cleaning, and processing data. With Machine Learning becoming more widely used, many applications do not have enough labeled data, and even the best Machine Learning algorithms cannot function properly without good, clean data. Deep learning techniques require especially vast amounts of data because, unlike classical Machine Learning, they generate features automatically. Otherwise, we would have garbage in and garbage out. Hence, data acquisition, or collection, is a very critical step.
Collection and integration of the data: The data is extracted from various sources and is usually available in different places, so the multiple datasets need to be combined before use. The data acquired is typically in a raw format and not suitable for immediate consumption and analysis. This calls for further processes such as:
Formatting: Prepare or organize the datasets as per the analysis requirements.
Labeling: After gathering data, it is often necessary to label it. For instance, in a factory application one would want to label images of components as defective or non-defective. In another case, when constructing a knowledge base by extracting information from the web, extracted facts need to be labeled as implicitly assumed to be true. At times, the data must be labeled manually.
This acquired data is what is ingested for the data preprocessing steps. Let's move on to the data acquisition approaches:
Data Discovery
Data Augmentation
Data Generation
Data Augmentation:
The next approach to data acquisition is data augmentation. To augment means to make something greater by adding to it; in the context of data acquisition, we are essentially enriching the existing data by adding more external data. In deep learning and machine learning, it is common to use pre-trained models and embeddings to increase the features to train on.
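A minimal augmentation sketch for numeric features (Gaussian jitter; the noise scale and the data itself are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))  # existing feature matrix

# Enrich the dataset with slightly perturbed copies of each sample,
# doubling the number of training examples
X_augmented = np.vstack([X, X + rng.normal(0.0, 0.01, X.shape)])
```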
Data Generation:
As the name suggests, the data is generated. If we do not have enough data and no external data is available, the option is to generate datasets manually or automatically. Crowdsourcing is the standard technique for manual construction, where people are assigned tasks to collect the required data. Automatic techniques are also available to generate synthetic datasets. The data generation method can also be seen as data augmentation when data is available but has missing values that need to be imputed.
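A sketch of automatic synthetic-dataset generation (two Gaussian clusters standing in for two classes of readings; the cluster parameters are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two classes drawn from different Gaussian distributions
class_a = rng.normal(loc=0.0, scale=1.0, size=(n, 3))
class_b = rng.normal(loc=2.0, scale=1.0, size=(n, 3))

X = np.vstack([class_a, class_b])   # generated feature matrix
y = np.array([0] * n + [1] * n)     # generated labels
```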
How predicting manufacturing downtime using machine learning can ensure business success
As manufacturing becomes more machine and AI-reliant, moving from trying to prevent
machine downtime to predicting it before it happens will become ever more important in the
future.
The Make-in-India initiative aims at increasing the contribution of the manufacturing sector to
India’s Gross Domestic Product (GDP) from the current levels of about 16 percent to 25
percent by 2022. As manufacturing looks to play a larger role in the future of our economy,
technology advancement and intervention in this sector will continue to be a great opportunity
for the entire IT industry.
One of the major challenges to achieving seamless manufacturing output is preventing and avoiding unfavourable machine performance. Working on the assumption that machines degrade over time, manufacturing companies, prior to advanced technology intervention, focused on preventive and reactive maintenance of their machines' health. The use of deep learning technology, however, is leading towards a new method of safeguarding the health of machines, coined in the industry as 'predictive maintenance'.
Predictive maintenance approaches can help the manufacturing sector find the optimal inflection point between costs and machine failures. But predictive maintenance is not a simple plug-and-play solution, as machine learning requires layers of historic data collected over time. The true impact of this strategy will only be seen in the mid-term, and like good wine, this technology intervention will only get better with time. The data collected over time will help proactively perform machine maintenance to reach optimum levels of production efficiency.
Consider the life-cycle of a Computer Numerical Control (CNC) machine. Today, most CNC manufacturers define maintenance cycles based on the type of work the CNC machine does for their customer, relying on individual experience and judgement. However, if we were to not only display real-time data but also store and analyse the historical usage data of the CNC machine, deep learning algorithms could find patterns of use and predict the maintenance needs and life of the CNC machine.
False positives would occur, i.e. situations where the algorithm predicts maintenance incorrectly based on the parameters it has to work with. With some human intervention, this pattern is corrected, learnt, and applied to the following data set to improve the result. Thus, the algorithm can learn from its mistakes and give more relevant and accurate results over time.
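As a toy sketch of the idea (the linear wear model, the sensor values, and the tolerance limit are all hypothetical): fit the machine's historical wear readings against operating hours and extrapolate when the wear limit will be crossed.

```python
import numpy as np

rng = np.random.default_rng(42)
hours = np.arange(0.0, 1000.0, 10.0)                     # operating hours logged so far
wear = 0.02 * hours + rng.normal(0.0, 0.5, hours.size)   # hypothetical wear-sensor readings

# Least-squares fit of wear as a linear function of operating hours
slope, intercept = np.polyfit(hours, wear, 1)

WEAR_LIMIT = 25.0  # hypothetical manufacturer tolerance
predicted_service_hour = (WEAR_LIMIT - intercept) / slope
```

With these synthetic readings the fit extrapolates a service point around 1,250 operating hours, giving the crew a concrete maintenance window instead of a judgement call.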