E022-01-0793/2015

DANIEL KIAMA MURIITHI


EMT 5205
MANUFACTURING TECHNOLOGY ASSIGNMENT
Globalization has forced manufacturing organizations into competitive arenas which include:
• quality,
• cost, and
• responsiveness.
Arora (2009) notes that the quality of goods is determined by customers. Customers have become a
key factor that creates competition among organizations, and this makes firms focus more on
quality.
Various quality control techniques have been applied to improve the quality of a process by
reducing its variability. Techniques available to control product or process quality include:
• the seven statistical process control (SPC) tools,
• acceptance sampling,
• quality function deployment (QFD),
• failure mode and effects analysis (FMEA),
• six sigma, and
• design of experiments (DoE).
Quality can be defined as fulfilling specifications or customer requirements, without any defect.
A product is said to be high in quality if it functions as expected and is reliable. Quality control
refers to the activities that ensure that produced items fulfil the highest possible quality.
Quality control techniques can be classified into
• basic,
• intermediate, and
• advanced levels,
but there is no consensus among researchers on this classification.

Excessive variability in process performance often results in waste and rework. For improvement
in quality and productivity, process variation needs to be reduced.

Statistical Process Control techniques

SPC uses statistics to detect variation in a process so that the process can be controlled. Poor quality is
usually the result of variation at some stage of production. The concept of variation states that
no two products will be perfectly identical, even if extreme care is taken to make them identical
in every respect. Statistical process control applies statistical methods to monitor and control a
process so that it operates at its full potential. It is a collection of tools that, when used together,
can result in process stability and variance reduction.

Statistical process control techniques include:


1. Check Sheet
2. Histogram
3. Pareto Chart
4. Cause and Effect Diagram
5. Flow Chart
6. Control Chart
7. Scatter Diagram
Check Sheet
Check sheets are simple forms with set formats that help the user record data in a firm
systematically. Data are collected and tabulated on the check sheet to record the frequency of
specific events during a data collection period. Check sheets provide a consistent, effective, and
economical approach that can be applied in quality assurance audits to verify that the steps of a
particular process are followed. They also help the user organize the data for later use. Check
sheets are easy to apply and understand, and they can give a clear picture of the situation and
condition of the organization. They are efficient and powerful tools for identifying frequent
problems, but they do not provide an effective means of analyzing the quality problems in the
workplace.
Types of check sheets include:
• defect-location check sheets,
• tally check sheets, and
• defect-cause check sheets.
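As a minimal sketch (the defect categories and observations below are hypothetical), a tally check sheet can be reproduced in a few lines of Python, counting how often each event occurs during the data collection period:

from collections import Counter

# Hypothetical defect observations logged by an inspector during one shift
observations = [
    "scratch", "dent", "scratch", "misalignment", "scratch",
    "dent", "porosity", "scratch", "misalignment", "dent",
]

# A tally check sheet is simply a frequency count per defect category
check_sheet = Counter(observations)

for defect, count in check_sheet.most_common():
    print(f"{defect:<14} {'|' * count}  ({count})")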

Histogram
Histograms are used to give a sense of the frequency distribution of observed values of a
variable. A histogram is a type of bar chart that visualizes both attribute and variable data of a
product or process, and it helps users show the distribution of the data and the amount of
variation within a process. It displays the different measures of central tendency (mean, median,
and mode). A histogram can also be applied to investigate and identify the underlying
distribution of the variable being explored.
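A histogram of process data can be produced with a few lines of Python; the sketch below assumes numpy and matplotlib are available and uses simulated, hypothetical shaft-diameter data:

import numpy as np
import matplotlib.pyplot as plt

# Simulated shaft-diameter measurements (hypothetical process data, in mm)
rng = np.random.default_rng(seed=1)
diameters = rng.normal(loc=25.0, scale=0.05, size=200)

# The histogram reveals the shape, centring and spread of the process output
plt.hist(diameters, bins=15, edgecolor="black")
plt.axvline(diameters.mean(), color="red", linestyle="--",
            label=f"mean = {diameters.mean():.3f}")
plt.xlabel("Shaft diameter (mm)")
plt.ylabel("Frequency")
plt.legend()
plt.show()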
Pareto Analysis
A Pareto chart is a special type of histogram that can easily be applied to find and prioritize
quality problems, conditions, or their causes in the organization. It is a type of bar chart that
shows the relative importance of variables, prioritized in descending order from left to right
across the chart. A Pareto chart is used to identify the different kinds of nonconformity from
data figures, maintenance data, repair data, parts scrap rates, or other sources. Pareto charts also
provide a means of investigating quality improvement and of improving efficiency, reducing
material waste, conserving energy, addressing safety issues, reducing costs, and so on.
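The sketch below (hypothetical defect counts; matplotlib assumed available) builds a basic Pareto chart by sorting the categories in descending order and overlaying the cumulative percentage line:

import matplotlib.pyplot as plt

# Hypothetical nonconformity counts collected from inspection records
defects = {"scratch": 42, "dent": 23, "porosity": 11, "misalignment": 7, "burr": 4}

# Sort categories in descending order and compute the cumulative percentage
items = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)
labels = [k for k, _ in items]
counts = [v for _, v in items]
total = sum(counts)
cumulative = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)
ax1.set_ylabel("Count")

ax2 = ax1.twinx()                      # secondary axis for the cumulative line
ax2.plot(labels, cumulative, marker="o", color="red")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)
plt.show()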

Fishbone Diagram
The cause and effect diagram is a problem-solving tool that systematically investigates and
analyzes all the potential or real causes that result in a single effect. It is an efficient tool that
equips the organization's management to explore the possible causes of a problem. This diagram
supports problem-solving efforts by gathering and organizing the possible causes, reaching a
common understanding of the problem, exposing gaps in existing knowledge, ranking the most
probable causes, and studying each cause. The generic categories of the cause and effect diagram
are usually six elements (causes): environment, materials, machine, measurement, man, and
method.

Scatter Diagram
A scatter diagram is a powerful tool for drawing the distribution of information in two dimensions,
which helps to detect and analyze a pattern of relationship between two quality-related variables
(an independent variable and a dependent variable), to understand whether there is a relationship
between them, and, if so, what kind of relationship it is (weak or strong, positive or negative).
The shape of the scatter diagram often shows the degree and direction of the relationship between
the two variables, and the correlation may reveal the causes of a problem. Scatter diagrams are
very useful in regression modeling. A scatter diagram can indicate which of the following
correlations exists between two variables:
a) positive correlation;
b) negative correlation, and
c) no correlation.
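A scatter diagram and its correlation coefficient can be obtained as follows; the sketch assumes numpy and matplotlib are available and uses simulated, hypothetical temperature and hardness data:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical paired data: oven temperature (independent) vs. part hardness (dependent)
rng = np.random.default_rng(seed=2)
temperature = rng.uniform(180, 220, size=50)
hardness = 0.4 * temperature + rng.normal(0, 3, size=50)

# Pearson correlation coefficient: sign gives the direction, magnitude the strength
r = np.corrcoef(temperature, hardness)[0, 1]

plt.scatter(temperature, hardness)
plt.xlabel("Oven temperature (deg C)")
plt.ylabel("Hardness (HRC)")
plt.title(f"Scatter diagram, r = {r:.2f}")
plt.show()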

Flowchart
A flowchart presents a diagrammatic picture that uses a series of symbols to describe the
sequence of steps in an operation or process. In other words, a flowchart visualizes the inputs,
activities, decision points, and outputs of a process so that the overall objective of the process
can be easily used and understood.

Control Chart
Control charts are a special form of run chart that illustrates the amount and nature of variation in a
process over time. A control chart can also describe what has been happening in the process. Control
charts are used to observe and monitor a process: when the process is in statistical control, the sample
points usually fall between the upper control limit (UCL) and the lower control limit (LCL). The main
aim of a control chart is to prevent defects in the process. Control charts are essential for different
businesses and industries because unsatisfactory products or services are more costly than the expense
of prevention through tools such as control charts.
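The sketch below (simulated, hypothetical data; numpy and matplotlib assumed available) draws a simplified individuals-style control chart with the centre line at the mean and limits at plus or minus three standard deviations; a formal individuals chart would estimate sigma from moving ranges rather than the overall sample standard deviation:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical sequence of individual measurements taken over time
rng = np.random.default_rng(seed=3)
x = rng.normal(loc=50.0, scale=2.0, size=40)

# Simplified chart: centre line at the mean, limits at mean +/- 3 sigma
centre = x.mean()
sigma = x.std(ddof=1)
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

plt.plot(x, marker="o")
plt.axhline(centre, color="green", label="centre line")
plt.axhline(ucl, color="red", linestyle="--", label="UCL")
plt.axhline(lcl, color="red", linestyle="--", label="LCL")
plt.xlabel("Sample number")
plt.ylabel("Measurement")
plt.legend()
plt.show()

# Flag any points falling outside the control limits
out_of_control = np.where((x > ucl) | (x < lcl))[0]
print("Out-of-control samples:", out_of_control)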

Acceptance Sampling

The Acceptance Sampling for Attributes procedure is used to determine the number of items to
be sampled from a lot in order to decide whether to accept or reject the lot. The number of items
in the sample depends upon a number of parameters, including the lot size, the acceptable quality
level (AQL), the desired producer's risk, the limiting quality level (LQL, sometimes called the
rejectable quality level or lot tolerance percent/proportion defective), and the desired consumer's
risk. The procedure permits the user to enter multiple values of any of these parameters to
determine the sensitivity of the sample size to that parameter. When multiple values are entered
for a parameter, a sample size curve is also produced. The cutoff value of acceptance, or
acceptance number, is also given as part of the output. In this procedure, the lot size can be
assumed to be infinite (or continuous), in which case the binomial distribution is used for the
calculations, or the lot can have a fixed size, whereupon the calculations are based on the
hypergeometric distribution.
Factors for classification of sampling plans
Sampling plans by attributes versus sampling plans by variables: If the item inspection leads to
a binary result (conforming or nonconforming), we are dealing with sampling by attributes,
detailed later on. If the item inspection leads to a continuous measurement X, we are sampling by
variables. In that case we generally use sampling plans based on the sample mean and standard
deviation, the so-called variable sampling plans. If X is normal, it is easy to compute the number
of items to be collected and the criterion that leads to the rejection of the batch, with the chosen
risks. Other variable sampling plans exist as well.
Incoming versus outgoing inspection: If the batches are inspected before the product is sent to
the consumer, it is called outgoing inspection. If the inspection is done by the consumer, after the
batches have been received from the supplier, it is called incoming inspection.
Rectifying versus non-rectifying sampling plans: The distinction depends on what is done with
the nonconforming items found during the inspection. When the cost of replacing faulty items
with new ones, or of reworking them, is accounted for, the sampling plan is rectifying.
Single, double, multiple and sequential sampling plans.
Single sampling: This is the most common sampling plan: we draw a random sample of n items
from the batch and count the number of nonconforming items (or the number of nonconformities,
if more than one nonconformity is possible on a single item). Such a plan is defined by n and by
an associated acceptance-rejection criterion, usually a value c, the so-called acceptance number,
which is the number of nonconforming items that cannot be exceeded. If the number of
nonconforming items is greater than c, the batch is rejected; otherwise, the batch is accepted.
The number r, defined as the minimum number of nonconforming items leading to the rejection
of the batch, is the so-called rejection number. In the simplest case, as above, r = c + 1, but
we can have r > c + 1.
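As a minimal sketch (scipy assumed available; the plan parameters n = 80 and c = 2 are hypothetical), the acceptance probability Pa(p) of a single sampling plan can be computed from the binomial distribution:

from scipy.stats import binom

# Hypothetical single sampling plan: sample n items, accept if defectives <= c
n, c = 80, 2

def prob_accept(p, n=n, c=c):
    """Probability of accepting the lot when the true nonconforming fraction is p."""
    return binom.cdf(c, n, p)

for p in (0.01, 0.02, 0.05, 0.10):
    print(f"p = {p:.2f}  ->  Pa = {prob_accept(p):.3f}")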
Double sampling: A double sampling plan is characterized by four parameters: n1 << n, the size
of the first sample, c1 the acceptance number for the first sample, n2 the size of the second
sample and c2 (> c1) the acceptance number for the joint sample. The main advantage of a
double sampling plan is the reduction of the total inspection and associated cost, particularly if
we proceed to a curtailment in the second sample, i.e. we stop the inspection whenever c2 is
exceeded. Another (psychological) advantage of these plans is the way they give a second
opportunity to the batch.
Multiple sampling: In multiple plans, a pre-determined number of samples are drawn before
making a decision.
Sequential sampling: Sequential plans are a generalization of multiple plans. The main
difference is that the number of samples is not pre-determined. If, at each step, we draw a sample
of size one, the plan, based on Wald's test, is called sequential item-by-item; otherwise, it is
sequential by groups. Full treatments of multiple and sequential plans are available in the literature.
Special sampling plans: Among the great variety of special plans, we distinguish:
Chain sampling: When the inspection procedures are destructive or very expensive, a small n
is recommended. We are then led to acceptance numbers equal to zero. This is dangerous for
the supplier and, if rectifying inspection is used, expensive for the consumer.
Continuous sampling plans (CSP): There are continuous production processes where the raw
material is not naturally provided in batches. For this type of production it is common to
alternate sequences of sampling inspection with 100% inspection; such plans are, in a certain
sense, rectifying plans. A CSP begins with 100% inspection. When a pre-specified number i of
consecutive conforming items has been observed, the plan changes to sampling inspection, in
which a fraction f of the items, randomly selected, is inspected along the continuous production.
If a nonconforming item is detected, 100% inspection resumes, and the nonconforming item is
replaced. For properties of this plan and its generalizations see Duncan (1986).
Characteristics of a sampling plan
OCC: The operating characteristic curve (OCC) is Pa(p) = P(acceptance of the batch | p),
where p is the probability of a nonconforming item in the batch.
AQL and LTPD (or RQL): Sampling plans are built taking into account the wishes of both the
supplier and the consumer, defining two quality levels for the judgment of the batches: the
acceptance quality level (AQL), the worst operating quality of the process that still leads to a high
probability of acceptance of the batch, usually 95%, for the protection of the supplier with regard
to high-quality batches; and the lot tolerance percent defective (LTPD) or rejectable quality level
(RQL), the quality level below which the product cannot be considered acceptable, which leads to
a small probability of acceptance of the batch, usually 10%, for the protection of the consumer
against low-quality batches.
There are two types of decision, acceptance or rejection of the batch, and two types of risk: to
reject a "good" (high-quality) batch, and to accept a "bad" (low-quality) batch. The probabilities
of occurrence of these risks are the so-called supplier risk and consumer risk, respectively. In a
single sampling plan, the supplier risk is α = 1 − Pa(AQL) and the consumer risk is β = Pa(LTPD).
The sampling plan should take into account the specifications AQL and LTPD, i.e. we are
supposed to find a single plan with an OCC that passes through the points (AQL, 1 − α) and
(LTPD, β). The construction of double plans that protect both the supplier and the consumer is
much more difficult, and it is no longer sufficient to provide an indication on two points of the
OCC. There exist the so-called Grubbs' tables (see Montgomery, 2009) providing (c1, c2, n1, n2)
for n2 = 2n1, as an example, for α = 0.05, β = 0.10 and several ratios RQL/AQL.
AOQ, AOQL and ATI: If there is a rectifying inspection program (a corrective program, based
on 100% inspection and replacement of nonconforming items by conforming ones after the
rejection of a batch by an acceptance sampling plan), the most relevant characteristics are the
average outgoing quality (AOQ), AOQ(p) = p (1 − n/N) Pa, which attains a maximum at the
so-called average outgoing quality limit (AOQL), the worst average quality of a product after a
rectifying inspection program, as well as the average total inspection (ATI), the number of items
subject to inspection, equal to n if there is no rectification, but given by
ATI(p) = n Pa + N (1 − Pa) otherwise.

Define N to be the lot size (possibly infinite), n as the (unknown) size of the sample to be drawn,
and c to be the acceptance number (the highest number of nonconforming units for which the lot
will still be accepted). Let X denote the number of nonconforming units in the sample. Let p0 be
the AQL, the highest proportion of nonconforming (defective) units for which the lot should still
be accepted. Let α be the producer’s risk, the probability of rejecting a lot with a proportion of
nonconforming (defective) units that is below the AQL. Let p1 be the LQL, the proportion of
nonconforming (defective) units above which the lot should be routinely rejected. Let β be the
probability of accepting a lot with a proportion of nonconforming (defective) units that is above
the LQL
For a given N, p0, α, p1, and β, we desire to obtain an n and c such that

Pr{X ≤ c | p0} ≥ 1 − α and

Pr{X ≤ c | p1} ≤ β.

If the lot size is finite, n and c should satisfy the hypergeometric distribution inequalities

H(c; N, M0, n) ≥ 1 − α and

H(c; N, M1, n) ≤ β,

where H denotes the cumulative hypergeometric distribution function, M0 = [Np0] and M1 = [Np1].

The hypergeometric probability of obtaining exactly x of n items with the characteristic of
interest is calculated using

P(X = x) = C(M, x) C(N − M, n − x) / C(N, n),

where M is the number of items in the lot with the characteristic of interest and C(a, b) denotes
the number of combinations of a items taken b at a time.
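As a hedged sketch of how such a plan could be searched for numerically (scipy assumed available; the AQL, LQL and risk values below are hypothetical), the following Python fragment finds the smallest single sampling plan (n, c) satisfying the two binomial inequalities above:

from scipy.stats import binom

# Hypothetical plan requirements
p0, alpha = 0.01, 0.05   # AQL and producer's risk
p1, beta = 0.05, 0.10    # LQL and consumer's risk

def design_single_plan(p0, alpha, p1, beta, n_max=1000):
    """Return the smallest (n, c) meeting both risk constraints, or None."""
    for n in range(1, n_max + 1):
        # smallest acceptance number protecting the producer: Pr{X <= c | p0} >= 1 - alpha
        c = 0
        while binom.cdf(c, n, p0) < 1 - alpha:
            c += 1
        # consumer's constraint: Pr{X <= c | p1} <= beta
        if binom.cdf(c, n, p1) <= beta:
            return n, c
    return None

print(design_single_plan(p0, alpha, p1, beta))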

Quality function deployment (QFD)


Quality function deployment (QFD) is a management tool that provides a visual connective
process to help teams focus on the needs of the customers throughout the total development
cycle of a product or process. It provides the means for translating customer needs into
appropriate technical requirements for each stage of a product/process-development life-cycle.
It helps to develop more customer-oriented, higher-quality products. While the structure
provided by QFD can be significantly beneficial, it is not a simple tool to use. Techniques such
as fuzzy logic, artificial neural networks, and the Taguchi method can be combined with QFD to
resolve some of its drawbacks, creating a synergy between QFD and these three methods.
Among QFD's drawbacks are the complexity of its charts, the vagueness of the data collected,
and the rather subjective basis on which the analysis is performed. Artificial intelligence
techniques such as fuzzy logic and artificial neural networks, together with management and
statistical tools such as the Taguchi method, have been proposed to resolve some of these
drawbacks.
The QFD process
The starting point of any QFD project is the customer requirements, often referred to as the
non-measurables, such as "how it looks", "how it feels", and durability. These requirements are then
converted into technical specifications such as oven temperature and mould diameter. This stage is
referred to as the engineering characteristics, or measurables. The QFD process involves four phases:
1. Product planning: house of quality.
2. Product design: parts deployment.
3. Process planning.
4. Process control (quality control charts).

A chart (matrix) represents each phase of the QFD process. The complete QFD process requires
at least four houses to be built, extending throughout the entire system's development life-cycle
(Figure 1), with each house representing a QFD phase. In the first phase, the most important
engineering characteristics, those that satisfy most of the customers' demands as defined by the
scoring at the bottom of the house, go on to form the input to the subsequent stage in the QFD process.

The house of quality


The first chart is normally known as the "house of quality", owing to its shape. The QFD charts
help the team to set targets on the issues that are most important to the customer and to determine
how these can be achieved technically. The ranking of competitors' products can also be
performed through technical and customer benchmarking. The QFD chart is a multifunctional
tool that can be used throughout the organization. For engineers, it is a way to summarise basic
data in a usable form. For marketing, it represents the customer's voice, and general managers
use it to discover new opportunities.
Customer needs or requirements are stated on the left side of the matrix as shown below. These
are organized by category based on affinity diagrams. Ensure that the customer needs or
requirements reflect the desired market segment(s), and address the unspoken needs (assumed and
excitement capabilities). If the number of needs or requirements exceeds twenty to thirty items,
decompose the matrix into smaller modules or subsystems to reduce the number of requirements
in a matrix. For each need or requirement, state the customer priority using a 1 to 5 rating. Use
ranking techniques and paired comparisons to develop the priorities.
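As an illustrative, hypothetical sketch (the needs, characteristics, 1-5 priority ratings and 9/3/1 relationship strengths below are invented for illustration; numpy assumed available), the priority-weighted scoring at the bottom of the house of quality can be computed as a matrix product:

import numpy as np

# Hypothetical house-of-quality fragment: customer needs with 1-5 priorities,
# and relationship strengths (9 = strong, 3 = moderate, 1 = weak, 0 = none)
# between each need (rows) and each engineering characteristic (columns).
needs = ["easy to grip", "long battery life", "lightweight"]
priorities = np.array([4, 5, 3])

characteristics = ["handle diameter", "battery capacity", "housing mass"]
relationships = np.array([
    [9, 0, 1],   # easy to grip
    [0, 9, 3],   # long battery life
    [1, 3, 9],   # lightweight
])

# Technical importance = priority-weighted column sums of the relationship matrix
scores = priorities @ relationships
for name, score in sorted(zip(characteristics, scores), key=lambda t: -t[1]):
    print(f"{name:<18} {score}")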
FMEA
FMEA is an efficient tool for the identification of potential failure modes and their effects in
order to increase the reliability and safety of complex systems. This technique is also useful for
gathering the data needed for decision making and risk control. The purposes of this technique
are: a. to identify failure modes and their effects; b. to specify corrective actions that eliminate
or reduce the probability of failure; and ultimately c. to develop an efficient maintenance system
that reduces the occurrence of potential failure scenarios.
For calculating the risk priority number (RPN) in the FMEA technique, three factors are used:
a. Incident Occurrence Probability (O), b. Incident Detection Probability (D), and c. Incident
Consequent Severity (S). RPN is calculated using formula 1.
RPN = O * D * S (formula 1)
Incident Consequent Severity (S) reflects the scope and extent of the damage, injury, and death
caused by the incident if it occurs. To assess the Incident Consequent Severity, pre-designed
tables are usually used as criteria. Table 1 shows the criteria for evaluating the Incident
Consequent Severity in the FMEA method.
Risk priority numbers range from 1 to 1000 and are used to prioritize the corrective measures
necessary to reduce or eliminate potential failure modes. The failure modes with the highest RPN
scores should be evaluated first. Consideration of the severity level is very important: if the
severity level is 9 or 10, regardless of the RPN, the cause should be investigated immediately.
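A minimal sketch of the RPN calculation and ranking described above (the failure modes and ratings are hypothetical):

# Hypothetical FMEA worksheet rows: (failure mode, severity S, occurrence O, detection D)
failure_modes = [
    ("seal leakage",    7, 4, 3),
    ("bearing seizure", 9, 2, 5),
    ("loose fastener",  4, 6, 2),
]

# RPN = O * D * S (formula 1); rank the modes from highest to lowest risk
ranked = sorted(
    ((name, s * o * d, s) for name, s, o, d in failure_modes),
    key=lambda row: row[1],
    reverse=True,
)

for name, rpn, severity in ranked:
    flag = "  <- investigate cause immediately" if severity >= 9 else ""
    print(f"{name:<18} RPN = {rpn}{flag}")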
Six Sigma

Six Sigma is a collection of process improvement tools used in a series of projects in a
systematic way to achieve high levels of stability. It is based on principles set up by quality
experts such as Deming, Juran, Shewhart and Ishikawa (Experts Archive Questions, 2007). The
term sigma comes from the Greek letter (σ) that is usually used to denote the standard deviation,
a measure of the variation or spread of a process output around its mean value (μ).
Quantitatively, Six Sigma quality means that only about two defects per billion opportunities fall
outside the upper and lower specification limits. This is almost a defect-free level.
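The figure quoted above can be checked with the normal distribution; the short sketch below assumes scipy is available, and the 1.5 sigma shift in the second calculation is the conventional long-term shift used in the Six Sigma literature rather than something stated in this assignment:

from scipy.stats import norm

# Fraction of a normally distributed output falling outside symmetric +/- 6 sigma limits
tail = 2 * norm.sf(6)
print(f"Defects per billion at +/- 6 sigma (centred): {tail * 1e9:.1f}")   # about 2 ppb

# With the conventional 1.5 sigma long-term shift, the widely quoted figure of
# roughly 3.4 defects per million opportunities is recovered
shifted = norm.sf(4.5)
print(f"Defects per million with 1.5 sigma shift: {shifted * 1e6:.2f}")    # about 3.4 DPMO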
The necessity of operating at such a low defect level may not be economic in all industries.
However, at high-yield companies such as Motorola, which produce electronic products with
thousands of opportunities for failure because of the numerous parts involved in every product,
achieving a very low defect level is necessary so that the combined opportunity for failure stays
as low as possible.
The Six Sigma methodology starts with the identification of the need for an improvement
project. When the project starts, a financial analysis is performed to quantify its expected
financial savings. These are estimated based on an improvement target for a certain measure of
the outcome of a process. The current performance of the process is measured and analyzed for
the critical causes to be improved, and solutions are implemented. The performance is monitored
and the achievement is demonstrated by the end of the project based on the data at hand.
Six Sigma principles include:
• aligning key processes and customer requirements with the strategic goals;
• identifying champions for each project, obtaining necessary resources and securing help to
overcome the resistance to change;
• instituting a standard measurement system and identifying appropriate metrics;
• training, deploying improvement teams and setting stretch improvement goals.
Design of Experiments (DoE)
Design of Experiments (DoE) is a methodology for systematically applying statistics to
experimentation. It consists of a series of tests in which purposeful changes are made to the input
variables (factors) of a product or process so that one may observe and identify the reasons for
changes in the output response. DoE provides a quick and cost-effective method to understand
and optimize products and processes. Although these techniques are commonly found in the
statistics and quality literature, they remain under-used in industry.
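As a brief illustration (the factors and levels are hypothetical), a two-level full factorial design simply enumerates every combination of factor levels, which can be done with plain Python:

from itertools import product

# Hypothetical two-level factors for an injection-moulding experiment
factors = {
    "melt_temperature": (200, 230),   # deg C
    "injection_pressure": (80, 120),  # bar
    "cooling_time": (10, 20),         # s
}

# Full factorial design: every combination of factor levels (2^3 = 8 runs)
names = list(factors)
runs = list(product(*factors.values()))

for i, run in enumerate(runs, start=1):
    settings = ", ".join(f"{n}={v}" for n, v in zip(names, run))
    print(f"Run {i}: {settings}")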
References
[1] Deming, W. Edwards, Some Theory of Sampling, New York: John Wiley, 1950.
[2] Besterfield, D. H., Quality Control, 8th Edition, Pearson-Prentice Hall, 2009.
[3] Besterfield, D. H., Michna, C. B., Besterfield, G. H., and Sacre, B. M., Total Quality
Management, 3rd Edition, Prentice Hall, 2003.
[4] DeVor, R. E., Chang, T., and Sutherland, J. W., Statistical Quality Design and Control:
Contemporary Concepts and Methods, 2nd Edition, Pearson Prentice Hall, 2007.
