ENGINEERING STATISTICS MONTGOMERY PDF


Applied Statistics and Probability for Engineers, by Douglas C. Montgomery and George C. Runger (both of Arizona State University), is an introduction to probability and statistics for students in engineering and the applied sciences, appropriate for a one-semester course.


This could involve computing a P-value or comparing the test statistic to a set of critical values.

Steps 1-4 should be completed prior to examination of the sample data. This sequence of steps will be illustrated in subsequent sections. In one example problem, the burning rate of a propellant is an important product characteristic.

What conclusions should the engineer draw? We may solve this problem by following the seven-step procedure outlined earlier. This results in the following decision rule: reject H0 if the P-value is smaller than the chosen significance level.
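
As a rough illustration of how such a decision rule might be applied in practice, here is a minimal Python sketch of a one-sample t-test. The burning-rate values, the target mean, and the 0.05 significance level are all placeholder assumptions, not the data from the book's example.

    import numpy as np
    from scipy import stats

    # Placeholder burning-rate measurements (cm/s); not the book's data.
    rates = np.array([51.2, 50.6, 49.8, 50.9, 51.5, 50.1, 49.7, 51.0])

    target = 50.0   # hypothesized mean burning rate under H0 (assumed)
    alpha = 0.05    # assumed significance level

    t_stat, p_value = stats.ttest_1samp(rates, popmean=target)
    print(f"t = {t_stat:.3f}, P-value = {p_value:.4f}")

    if p_value < alpha:
        print("Reject H0: the mean burning rate appears to differ from the target.")
    else:
        print("Fail to reject H0: the data are consistent with the target.")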

A set of end-of-section exercises appears at the end of each section. Examples include: Construct a cumulative frequency plot and histogram for the gene expression data from each group separately, and comment on any differences. Construct a cumulative frequency plot and histogram, (a) using 6 bins, (b) using 16 bins, and compare the two. Construct a cumulative frequency plot and histogram using the failure data from an earlier exercise. Given information on structural defects in automobile doors, construct a cumulative frequency plot and histogram, compare your results, and explain how you would proceed.

At the end of each chapter, a set of supplemental exercises covers the chapter topics and challenges students to apply chapter methods and concepts to problems requiring data collection. For example: The percentage mole conversion of naphthalene to maleic anhydride from an earlier exercise is given; is it close enough to the target value to accept the solution as correct? Explain your reasoning. Construct a data set for which the paired t-test statistic is very large, indicating that when this analysis is used the two population means appear different, but for which t0 for the two-sample t-test is very small, so that the incorrect analysis would indicate no difference. Investigate a claim such as whether Car Type A gets more miles per gallon in urban driving than Car Type B; the standard or claim may be expressed as a mean (average), variance, standard deviation, or proportion. Collect appropriate random samples of data and perform a hypothesis test.

Report on your results, and be sure to describe how the data were collected. Visit the student section of the book Web site at www. The Student Solutions Manual may be downloaded from the same Web site. An icon in the book shows which exercises are included in the accompanying Student Solutions Manual.

Visit the instructor section of the book website at www. Student versions of software often do not have all the functionality that full versions do. Consequently, student versions may not support all the concepts presented in this text.

If you would like to adopt this text packaged with the Student Version of Minitab for your course, please contact your local Wiley representative at www.

A Research-based Design. WileyPLUS provides an online environment that integrates relevant resources, including the entire digital textbook, in an easy-to-navigate framework that helps students study more effectively. One-on-one Engagement. Students engage with related examples in various media and sample practice items, including material presented by Craig Downing. Throughout each study session, students can assess their progress and gain immediate feedback.

WileyPLUS provides reliable, customizable resources that reinforce course goals inside and outside of the classroom, as well as visibility into individual student progress. Pre-created materials and activities, a customizable course plan, and pre-created activity types help instructors optimize their time. WileyPLUS also provides instant access to reports on trends in class performance, student use of course materials, and progress toward learning objectives, helping inform decisions and drive classroom discussions.

Learn More. Powered by proven technology and built on a foundation of cognitive research, WileyPLUS has enriched the education of millions of students in over 20 countries around the world.

We are grateful to Dr. Dale Kennedy and Dr. Mary Anderson-Rowland for their generous feedback and suggestions in teaching our course at Arizona State University. We also thank Dr. Lora Zimmer and Dr. Sharon Lewis for their work in the development of the course based on this text when they were graduate assistants.

We are very thankful to Ms. Busaba Laungrungrong, Dr. Nuttha Lurponglukana, Dr. Sarah Street, Dr. James C. Ford, Dr. Craig Downing, and Mr. Patrick Egbunonu for their assistance in checking the accuracy and completeness of the text, solutions manual, and WileyPLUS.

We appreciate the staff support and resources provided by the Industrial Engineering program at Arizona State University, and our director, Dr. Ronald Askin. Several reviewers provided many helpful suggestions, including Dr. Thomas Willemain (Rensselaer) and Dr. David Mathiason. We are also indebted to Dr. Smiley Cheng of the University of Manitoba for permission to adapt many of the statistical tables from his excellent book. This project was supported, in part, by the National Science Foundation; opinions expressed are those of the authors and not necessarily those of the Foundation.

Douglas C. Montgomery
George C. Runger
Norma Faris Hubele

The Ironto Wayside Footbridge is the oldest standing metal bridge in Virginia. Although it has now been restored as a footbridge, in its former life it routinely carried heavy wagonloads, three tons or more, of goods and materials.

Huffman conducted a historical survey of the bridge and found that a load-bearing analysis had never been done. After gathering the available structural data on the bridge, she created a computer stress-analysis model based on typical loads that it would have carried. After analyzing her results, she tested them on the bridge itself to verify her model.

She set up dial gauges under the center of each truss. Her results and conclusions will be helpful in maintaining the bridge and in helping others to restore and study historic bridges. Her adviser, Cris Moen, points out that her computer model can be used to create structural models to test other bridges. It is an excellent example of using sample data to verify an engineering model.

The steps in the engineering method are as follows:

1. Develop a clear and concise description of the problem.
2. Identify, at least tentatively, the important factors that affect the problem or that may play a role in its solution.
3. Propose a model for the problem, and state any limitations or assumptions of the model.
4. Conduct appropriate experiments and collect data to test or validate the tentative model or conclusions made in steps 2 and 3.
5. Manipulate the model to assist in developing a solution to the problem.
6. Draw conclusions or make recommendations based on the problem solution.

Figure: The engineering problem-solving method (develop a clear description of the problem, identify the important factors, propose or refine a model, collect data, manipulate the model, confirm the solution, and draw conclusions and make recommendations).

Many aspects of engineering practice involve collecting, working with, and using data in the solution of a problem, so knowledge of statistics is just as important to the engineer as knowledge of any of the other engineering sciences. Many of the engineering sciences, such as thermodynamics and heat transfer, are employed in the engineering problem-solving method.

Statistical methods are a powerful aid in problem solving. For example, consider the gasoline mileage performance of your car. Do you always get exactly the same mileage performance on every tank of fuel? Of course not; in fact, sometimes the mileage performance varies considerably. This variability depends on many factors, such as the type of driving most recently done (city versus highway), the condition of the car, and the brand or octane of the gasoline used.

Do you always get as thermodynamics and exactly the same mileage performance on every tank of fuel? Of course not—in fact, sometimes heat transfer the mileage performance varies considerably. These factors represent potential sources of variability in the system.

Statistics gives us a framework for describing this variability and for learning about which potential sources of variability are the most important or have the greatest impact on the gasoline mileage performance. We also encounter variability in most types of engineering problems.

For example, suppose that an engineer is developing a rubber compound for use in O-rings. The O-rings are to be employed as seals in plasma etching tools used in the semiconductor industry, so their resistance to acids and other corrosive substances is an important characteristic.

The tensile strengths in psi of the eight O-rings were measured. As we should have anticipated, not all the O-ring specimens exhibit the same tensile strength measurement; there is variability in the tensile strength measurements. Because the measurements exhibit variability, we say that tensile strength is a random variable. If every measurement were the same constant value, there would be little to analyze; however, this never happens in engineering practice, so the actual measurements we observe exhibit variability. A dot diagram of the O-ring tensile strength data is shown in the corresponding figure.

The dot diagram is a very useful plot for displaying a small body of data, say, up to about 20 observations. This plot allows us to easily see two important features of the data: the location, or middle, of the data, and their scatter or variability.

The need for statistical thinking arises often in the solution of engineering problems.
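
A dot diagram of this kind is easy to sketch with matplotlib. The eight tensile-strength values below are placeholders, since the actual measurements are not reproduced in this excerpt.

    import matplotlib.pyplot as plt

    # Placeholder tensile strengths (psi) for eight O-ring specimens.
    strength = [1030, 1035, 1020, 1049, 1028, 1026, 1039, 1010]

    plt.plot(strength, [0] * len(strength), "o", markersize=8)
    plt.yticks([])   # a dot diagram has no vertical scale
    plt.xlabel("Tensile strength (psi)")
    plt.title("Dot diagram of O-ring tensile strength")
    plt.show()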

Consider the engineer developing the rubber O-ring material. From testing the initial specimens, he knows the average tensile strength. Eight O-ring specimens are made from a modified rubber compound and subjected to the nitric acid immersion test described earlier.


The tensile test results for the modified specimens were then recorded. The tensile test data for both groups of O-rings are plotted as a dot diagram in a second figure. However, there are some obvious questions to ask.

For instance, how do we know that another set of O-ring specimens will not give different results? In other words, are these results due entirely to chance? Is a sample of eight O-rings adequate to give reliable results? What if the modification actually has no effect on tensile strength?

Statistical thinking and methodology can help answer these questions. This reasoning from a sample (such as the eight rubber O-rings) to a population (such as the O-rings that will be sold to customers) is referred to as statistical inference (see the corresponding figure). Clearly, reasoning based on measurements from some objects to measurements on all objects can result in errors, called sampling errors.
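
One standard way to begin addressing such questions is a two-sample t-test comparing the original and modified formulations. The sketch below uses placeholder data and the SciPy library; it is only an illustration, not the specific analysis carried out in the book.

    import numpy as np
    from scipy import stats

    # Placeholder tensile strengths (psi) for the two formulations.
    original = np.array([1030, 1035, 1020, 1049, 1028, 1026, 1039, 1010])
    modified = np.array([1060, 1065, 1048, 1072, 1055, 1051, 1070, 1042])

    t_stat, p_value = stats.ttest_ind(original, modified)
    print(f"t = {t_stat:.3f}, P-value = {p_value:.4f}")
    # A small P-value suggests the observed difference is unlikely to be due to chance alone.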

We can think of each sample of eight O-rings as a random and representative sample of all parts that will ultimately be manufactured. The order in which each O-ring was tested was also randomly determined. This is an example of a completely randomized designed experiment.

Sometimes the objects to be used in the comparison are not assigned at random to the treatments. For example, the September issue of Circulation (a medical journal published by the American Heart Association) reports a study linking high iron levels in the body with increased risk of heart attack.

The researchers just tracked the subjects over time. This type of study is called an observational study. Designed experiments and observational studies are discussed in more detail in the next section. Because this was not a designed experiment, cause and effect cannot be firmly established; for example, the difference in heart attack risk could be attributable to the difference in iron levels or to other underlying factors that form a reasonable explanation for the observed results, such as cholesterol levels or hypertension.


In the previous section we introduced some simple methods for summarizing and visualizing data. In the engineering environment, the data are almost always a sample that has been selected from some population.

A sample is a subset of the population containing the observed objects or the outcomes and the resulting data. Generally, engineering data are collected in one of three ways: a retrospective study using historical data, an observational study, or a designed experiment. When little thought is put into the data collection procedure, serious problems for both the statistical analysis and the practical interpretation of results can occur.

Montgomery, Peck, and Vining describe an acetone-butyl alcohol distillation column; a schematic of this binary column is shown in one of the figures. For this column, production personnel maintain and archive records such as the reboil temperature, the condensate temperature, the concentration of acetone in the output product stream, and the input feed rate; the production personnel very infrequently change this rate.


In most such studies, the engineer is interested in using the data to construct a model relating the variables of interest. These types of models are called empirical models, and they are illustrated in more detail later in the book.

A retrospective study takes advantage of previously collected, or historical, data. It has the advantage of minimizing the cost of collecting the data for the study. However, there are several potential problems. The historical data on the two temperatures and the acetone concentration do not correspond directly.

Constructing an approximate correspondence would probably require making several assumptions and a great deal of effort, and it might be impossible to do reliably. In addition, the two temperatures do not vary very much; within the narrow ranges over which they do vary, the condensate temperature tends to increase with the reboil temperature. Retrospective studies, although often the quickest and easiest way to collect engineering process data, often provide limited useful information for controlling and analyzing a process.

In general, their primary disadvantages are as follows:

1. Some of the important process data often are missing.
2. The reliability and validity of the process data are often questionable.
3. The nature of the process data often may not allow us to address the problem at hand.
4. The engineer often wants to use the process data in ways that they were never intended to be used.

Using historical data always involves the risk that, for whatever reason, some of the important data were not collected or were lost or were inaccurately transcribed or recorded.

Consequently, historical data often suffer from problems with data quality. These errors also make historical data prone to outliers. Just because data are convenient to collect does not mean that these data are useful; historical data cannot address a question if information on some important variables was never collected. For example, the ambient temperature may affect the heat losses from the distillation column. On cold days, the column loses more heat to the environment than during very warm days.

The production logs for this acetone-butyl alcohol column do not routinely record the ambient temperature. Also, the concentration of acetone in the input feed stream has an effect on the acetone concentration in the output product stream.


However, this variable is not easy to measure routinely, so it is not recorded either. Consequently, the historical data do not allow the engineer to include either of these factors in the analysis even though potentially they may be important. The purpose of many engineering data analysis efforts is to isolate the root causes underlying interesting phenomena. With historical data, these interesting phenomena may have occurred months, weeks, or even years earlier. Analyses based on historical data often identify interesting phenomena that go unexplained.

Finally, retrospective studies often involve very large (indeed, even massive) data sets.

As the name implies, an observational study simply observes the process or population during a period of routine operation. Usually, the engineer interacts with or disturbs the process only as much as is required to obtain data on the system, and often a special effort is made to collect data on variables that are not routinely recorded, if it is thought that such data might be useful.

With proper planning, observational studies can ensure accurate, complete, and reliable data. The data collection form should provide the ability to add comments to record any other interesting phenomena that may occur, such as changes in ambient temperature.

It may even be possible to arrange for the input feed stream acetone concentration to be measured along with the other variables during this relatively short-term study.

An observational study conducted in this manner would help ensure accurate and reliable data collection and would take care of problem 2 and possibly some aspects of problem 1 associated with the retrospective study. This approach also minimizes the chances of observing an outlier related to some error in the data. Unfortunately, an observational study cannot address problems 3 and 4. Observational studies can also involve very large data sets.

In a designed experiment, the engineer makes deliberate or purposeful changes in the controllable variables (called factors) of the system, observes the resulting system output, and then makes a decision or an inference about which variables are responsible for the changes that he or she observes in the output performance.

An important distinction between a designed experiment and either an observational or retrospective study is that the different combinations of the factors of interest are applied randomly to a set of experimental units. This allows cause-and-effect relationships to be established, something that cannot be done with observational or retrospective studies.

The O-ring example is a simple illustration of a designed experiment. That is, a deliberate change was introduced into the formulation of the rubber compound with the objective of discovering whether or not an increase in the tensile strength could be obtained.

This is an experiment with a single factor. We can view the two groups of O-rings as having the two formulations applied randomly to the individual O-rings in each group. This establishes the desired cause-and-effect relationship. These techniques are introduced and illustrated extensively in Chapters 4 and 5.

A designed experiment can also be used in the distillation column problem. Suppose that we have three factors: the condensate temperature, the reboil temperature, and the reflux rate. The experimental design must ensure that we can separate out the effects of these three factors on the response variable, the concentration of acetone in the output product stream. In a designed experiment, often only two or three levels of each factor are employed. The best experimental strategy to use when there are several factors of interest is to conduct a factorial experiment.

In a factorial experiment, the factors are varied together in an arrangement that tests all possible combinations of factor levels. One of the figures in the text illustrates a factorial experiment for the distillation column. Because all three factors have two levels, there are eight possible combinations of factor levels, shown geometrically as the eight corners of a cube.

The same figure also gives a tabular representation of the design. The actual experimental runs would be conducted in random order, thus establishing the random assignment of factor-level combinations to experimental units that is the key principle of a designed experiment.
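
A minimal sketch of how the eight factor-level combinations could be listed and then run in random order (the factor names follow the discussion above and are stated there as assumptions):

    import itertools
    import random

    factors = ["reflux rate", "condensate temp", "reboil temp"]  # assumed factor names
    levels = [-1, +1]                                             # coded low and high levels

    runs = list(itertools.product(levels, repeat=len(factors)))  # 2**3 = 8 combinations
    random.shuffle(runs)                                          # randomize the run order

    for i, run in enumerate(runs, start=1):
        settings = ", ".join(f"{name}={level:+d}" for name, level in zip(factors, run))
        print(f"Run {i}: {settings}")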

Two trials, or replicates, of the experiment have been performed in random order, resulting in 16 runs (also called observations). Some very interesting tentative conclusions can be drawn from this experiment.

First, compare the average acetone concentration for the eight runs with condenser temperature at the high level with the average concentration for the eight runs with condenser temperature at the low level (these are the averages of the eight runs on the left and right faces of the cube). Increasing the condenser temperature from the low to the high level increases the average concentration. The reboil temperature effect can be evaluated by comparing the average of the eight runs in the top of the cube with the average of the eight runs in the bottom; the effect of increasing the reboil temperature is likewise to increase the average concentration.
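
In code, a main effect is simply the average response at the high level of a factor minus the average response at its low level. The concentrations below are placeholders, not the results of the book's experiment.

    # Each row: (condenser level, reboil level, acetone concentration).
    # Placeholder data for two replicates of a two-factor slice of the experiment.
    data = [
        (-1, -1, 89.2), (-1, -1, 89.0),
        (+1, -1, 90.1), (+1, -1, 90.4),
        (-1, +1, 90.8), (-1, +1, 91.0),
        (+1, +1, 91.9), (+1, +1, 92.3),
    ]

    def main_effect(rows, index):
        """Average response at the high level minus average at the low level."""
        high = [y for *x, y in rows if x[index] == +1]
        low = [y for *x, y in rows if x[index] == -1]
        return sum(high) / len(high) - sum(low) / len(low)

    print("Condenser temperature effect:", round(main_effect(data, 0), 3))
    print("Reboil temperature effect:   ", round(main_effect(data, 1), 3))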

This graph was constructed by calculating the average concentration at each combination of levels of the two temperatures, and it shows an example of an interaction between two factors. Interactions occur often in physical and chemical systems, and factorial experiments are the only way to investigate their effects.

Figure: A four-factor factorial experiment for the distillation column.

In fact, if interactions are present and the factorial experimental strategy is not used, incorrect or misleading results may be obtained. We can easily extend the factorial strategy to more factors. Suppose that the engineer wants to consider a fourth factor, the concentration of acetone in the input feed stream. The figure just mentioned illustrates how all four factors could be investigated in a factorial design. Note that, as in any factorial design, all possible combinations of the four factors are tested.

The experiment requires 16 trials if each combination of factor levels is run only once. Generally, if there are k factors and they each have two levels, a factorial experimental design will require 2^k runs. Clearly, as the number of factors increases, the number of trials required in a factorial experiment increases rapidly; for instance, eight factors each at two levels would require 256 trials.
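
The growth in the number of runs is easy to tabulate:

    for k in range(2, 9):
        print(f"{k} factors at two levels require {2 ** k} runs")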

This amount of testing quickly becomes infeasible from the viewpoint of time and other resources. A fractional factorial experiment is a variation of the basic factorial arrangement in which only a subset of the factor combinations is actually tested.

One of the figures in the text shows a fractional factorial experimental design for the four-factor version of the distillation column experiment. This experimental design requires only 8 runs instead of the original 16; consequently, it would be called a one-half fraction. This is an excellent experimental design in which to study all four factors. It will provide good information about the individual effects of the four factors and some information about how these factors interact.
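
One standard way to construct such a one-half fraction is to keep only the factor-level combinations whose coded levels multiply to +1 (the defining relation I = ABCD). The sketch below uses that construction; it is not necessarily the specific fraction shown in the book's figure.

    import itertools

    levels = [-1, +1]
    full_factorial = itertools.product(levels, repeat=4)   # the 16 runs of the full design

    # Keep the runs whose coded levels multiply to +1 (defining relation I = ABCD).
    half_fraction = [run for run in full_factorial
                     if run[0] * run[1] * run[2] * run[3] == +1]

    print(len(half_fraction), "runs in the one-half fraction:")
    for run in half_fraction:
        print(run)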

Factorial and fractional factorial experiments are used extensively by engineers and scientists in industrial research and development, where new technology, products, and processes are designed and developed. (Figure: A fractional factorial experiment for the distillation column.) Chapter 7 focuses on these principles, concentrating on the factorial and fractional factorial designs that we have introduced here.

The objective is to use the sample data to make decisions or learn something about the population. Recall that the population is the complete collection of items or objects from which the sample is taken. A sample is just a subset of the items in the population.

For example, suppose that we are manufacturing semiconductor wafers, and we want to learn about the resistivity of the wafers in a particular lot. In this case, the lot is the population. Data are often collected as a result of an engineering experiment. For example, recall the O-ring experiment described earlier: initially eight O-rings were produced and subjected to a nitric acid bath, following which the tensile strength of each O-ring was determined.

In this case the eight O-ring tensile strengths are a sample from a population that consists of all the measurements on tensile strength that could possibly have been observed. This type of population is called a conceptual population.

Many engineering problems involve conceptual populations. The O-ring experiment is a simple but fairly typical example; the factorial experiment used to study the concentration in the distillation column is another.

The way that samples are selected is also important. For example, suppose that you wanted to learn about the mathematical skills of undergraduate students at Arizona State University (ASU). Now, this involves a physical population.

Suppose that we select the sample from all of the students who are currently taking an engineering statistics course. This is probably a bad idea, as this group of students will most likely have mathematical skills that are quite different from those found in the majority of the population. In general, samples that are taken because they are convenient, or that are selected through some process involving the judgment of the engineer, are unlikely to produce correct results.

Such a sample is likely to be biased; this usually happens with judgment or convenience samples. In order for statistical methods to work correctly and to produce valid results, random samples must be used.

The most basic method of random sampling is simple random sampling. To illustrate simple random sampling, consider the mathematical skills question discussed previously. Assign an integer number to every student in the population (all of the ASU undergraduates).

These numbers range from 1 to N. Suppose that we want to select a simple random sample of n students. We could use a computer to generate n random integers from 1 to N, where each integer has the same chance of being selected.
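
A minimal sketch of this selection step in Python, assuming the students have already been numbered 1 through N (both N and the sample size n below are placeholder values):

    import random

    N = 40_000   # assumed population size (number of ASU undergraduates)
    n = 100      # assumed sample size

    # Draw n distinct integers from 1..N; every student is equally likely to be chosen.
    selected = sorted(random.sample(range(1, N + 1), k=n))
    print(selected[:10], "...")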

Choosing the students who correspond to these numbers would produce the simple random sample. Notice that every student in the population has the same chance of being chosen for the sample.

As another example, suppose that an engineer makes several current measurements on an electrical circuit with an ammeter. Can we view these measurements as a simple random sample? What is the population? If the circuit is the same each time the measurement is made, and if the characteristics of the ammeter are unchanged, then we can view the current measurements as a simple random sample.

The population is conceptual: it consists of all of the current measurements that could be made on this circuit with this ammeter.

Now reconsider the distillation column discussed earlier. Suppose that the engineer runs this column for 24 consecutive hours and records the acetone concentration at the end of each hour. Is this a random sample?

Solution: This is also an example involving a conceptual population, consisting of all of the hourly concentration observations that will ever be made. Only if you are very sure that these consecutive readings are taken under identical and unchanging conditions and are unlikely to differ from future observations on the process would it be reasonable to think of these data as a random sample.

For example, consider the lot of semiconductor wafers. It is tempting to take the sample of three wafers off the top tray of the container. This is an example of a convenience sample, and it may not produce satisfactory results, because the wafers may have been packaged in time order of production and the three wafers on top may have been produced last, when something unusual may have been happening in the process.

Retrospective data collection may not always result in data that can be viewed as a random sample. Data from a designed experiment can usually be viewed as data from a random sample if the individual observations in the experiment are made in random order.

Completely randomizing the order of the runs in an experiment helps eliminate the effects of unknown forces that may be varying while the experiment is being run, and it provides assurance that the data can be viewed as a random sample. Collecting data retrospectively on a process or through an observational study, and even through a designed experiment, almost always involves sampling from a conceptual population.

Our objective in many of these data studies is to draw conclusions about how the system or process that we are studying will perform in the future. An analytic study is a study or experiment where the conclusions are to be drawn relative to a future population.

For example, in the distillation column experiment we want to make conclusions about the concentration of future production quantities of acetone that will be sold to customers. This is an analytic study involving a conceptual population that does not yet exist.

Clearly, in addition to random sampling there must be some additional assumption of stability of this process over time. For example, it might be assumed that the sources of variability currently being experienced in production are the same as will be experienced in future production.

In Chapter 8 we introduce control charts, an important statistical technique to evaluate the stability of a process or system. The problem involving sampling of wafers from a lot to determine lot resistivity is called an enumerative study.

The sample is used to make conclusions about the population from which the sample was drawn. The study to determine the mathematical abilities of ASU undergraduates is also an enumerative study.

Note that random samples are required in both enumerative and analytic studies, but the analytic study requires an additional assumption of stability. A figure in the text provides an illustration (Figure: Enumerative versus analytic study).

Sometimes engineers work with problems for which there is no simple or well-understood mechanistic model that explains the phenomenon.

For instance, suppose we are interested in the number-average molecular weight (Mn) of a polymer. Now we know that Mn is related to the viscosity of the material (V) and that it also depends on the amount of catalyst (C) and the temperature (T) in the polymerization reactor when the material is manufactured.

As an illustration, a table in the text contains data on three variables that were collected in an observational study in a semiconductor manufacturing plant. The variables reported are pull strength (a measure of the amount of force required to break the bond), the wire length, and the height of the die.

The first scatter diagram in the text plots the pull strength y against the wire length x1. We used the computer package Minitab to construct this plot. Minitab has an option that produces a dot diagram along the right and top edges of the scatter diagram, allowing us to easily see the distribution of each variable individually, so in a sense the scatter diagram is a two-dimensional version of a dot diagram. The scatter diagram suggests a strong relationship between pull strength and wire length; similar information is conveyed by a second scatter diagram of pull strength against die height.
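
An empirical model of this kind is often a simple least-squares fit of pull strength to wire length. The sketch below uses NumPy and placeholder data, since the table itself is not reproduced here.

    import numpy as np

    # Placeholder observations: wire length (x1) and pull strength (y).
    wire_length = np.array([2, 8, 11, 10, 8, 4, 2, 9, 6, 5])
    pull_strength = np.array([9.9, 24.4, 31.7, 29.0, 25.0, 16.9, 10.5, 26.3, 19.7, 17.1])

    # Fit the empirical model y = b0 + b1 * x1 by least squares.
    b1, b0 = np.polyfit(wire_length, pull_strength, deg=1)
    print(f"Fitted model: pull strength = {b0:.2f} + {b1:.2f} * wire length")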

A third figure is a three-dimensional scatter diagram of the observations on pull strength, wire length, and die height.

Margin notes throughout the text help to guide the reader; for example, one notes that if the variability in the residuals increases with the fitted value, a transformation of the response may be warranted. Highlights of the value of statistical methodologies are discussed using simple examples.