
The ISO standards state that the correct application of statistical methods is important for control actions in market analysis, in product design, for predicting durability and service life, in studying the means of process regulation, in determining quality levels in sampling inspection plans, in evaluating performance characteristics, in improving the quality of processes, and in safety assessment and risk analysis.

Using statistical methods, it is possible to identify quality problems in a timely manner (to detect a process violation before defective products are released). To a large extent, statistical methods also make it possible to establish the causes of such violations.

The need for statistical methods arises, first of all, from the need to minimize the variability of processes.

Variability means the deviation of various facts from the specified values. Variability that is not detected in time can be a fatal danger both for production and products and for the enterprise as a whole.

The systems approach to the decision-making procedure based on the theory of variability is called statistical thinking. As formulated by the American Society for Quality, statistical thinking is based on three fundamental principles:

1) any work is carried out in the system of interrelated processes;

2) in all processes there are variations;

3) understanding and reducing variation is the key to success.

Deming said: "If I had to express my message to management in just a few words, I would say that the whole essence is to reduce variation."

The causes of variation in any process can be divided into two groups.

The first group is common causes, associated with the production system (equipment, buildings, raw materials, personnel); the corresponding variability cannot be changed without changing the system itself. Any actions of ordinary employees, the performers, in this situation will most likely only worsen things. Intervening in the system almost always requires action by the leadership, the top management.

The second group is special causes, associated with operator errors, setup failures, violations of the operating regime, and so on. Eliminating these causes is the job of the personnel directly involved in the process. These are non-random causes: tool wear, loosening of fasteners, a change in coolant temperature, violation of the technological regime. Such causes must be studied; they can be eliminated when setting up the process, which ensures its stability.

The main functions of statistical methods in quality management:

Cognitive information function

Prognostic function

Evaluation function

Analytical function

False and missed alarms

In this case we are talking about statistical errors: as a result of their occurrence a false alarm may be raised and, conversely, failure to detect an actual deviation may lead to a missed alarm.

In general, observation errors are discrepancies between the values obtained by statistical observation and the true values of the quantities under study.

When conducting statistical observations, two types of errors are distinguished:

1) registration errors;

2) representativeness errors.

Registration errors arise from the incorrect establishment of facts in the process of observation, from their erroneous recording, or from both.

Registration errors can be random or systematic, intentional or unintentional.

Random errors are errors that arise under the action of random factors.

Such errors can be directed both toward overstatement and toward understatement, and with a sufficiently large number of observations they cancel each other out under the action of the law of large numbers.

Systematic errors arise from certain constant causes acting in the same direction, i.e. toward either overstatement or understatement of the data, which leads to serious distortion of the overall results of statistical observation.

Intentional errors are errors whose essence is a conscious distortion of the data.

Unintentional errors are errors of an accidental, unintended character, arising for example from a malfunction of measuring instruments.

Representativeness errors occur in non-exhaustive (sample-based) observation. Like registration errors, they can be random or systematic.

Random representativeness errors arise because the sample of observation units, selected on the basis of the principle of randomness, does not reflect the entire population; the size of this error can be estimated.

Systematic representativeness errors arise as a result of violating the principle of random selection of the units of the studied population that are to be observed.

The size of these errors usually cannot be quantified. The accuracy of statistical observation data can be verified through control checks.

Classification of deviations of product quality parameters and control methods

Depending on the source and method of obtaining information, quality assessment methods are classified as objective, heuristic, statistical, and combined (mixed). Objective methods are divided into measuring, registration, calculation, and experimental methods. Heuristic methods include organoleptic, expert, and sociological methods.

The use of statistical methods is one of the most effective ways of developing new technologies and controlling the quality of processes.

Question 2. Reliability of systems. Evaluation of the probability of failure and of the probability of failure-free operation of a system with various schemes of connection of its elements.

Reliability of Systems

The reliability of a system is the property of an object to preserve over time, within specified limits, the values of all parameters characterizing its ability to perform the required functions in the specified modes and conditions of application, maintenance, repair, storage, and transportation.

A reliability indicator quantitatively characterizes one or more of the properties that make up the reliability of an object.

A reliability indicator may have a dimension (for example, mean operating time to failure) or be dimensionless (for example, the probability of failure-free operation).

Reliability indicators may be single or complex. A single reliability indicator characterizes one of the properties, while a complex one characterizes several of the properties that make up the reliability of the object.

The following reliability indicators are distinguished:

Testability

Operability

Failure-free operation

Durability

Maintainability

Recoverability

Storability, etc.

Causes of the manufacture of unreliable products:

1) the lack of regular verification of compliance with standards;

2) errors in the use of materials and improper control of materials during production;

3) incorrect accounting and reporting on control, including information on technology improvements;

4) sampling control schemes that do not meet the standards;

5) the lack of tests of materials for compliance;

6) failure to fulfill the standards for acceptance tests;

7) the absence of instructional materials and guidance;

8) irregular use of control reports to improve the technological process.

The estimate of the probability of failure and of the probability of failure-free operation of any system depends on the scheme of connection of its elements.

Three connection schemes are distinguished:

1) series connection of elements


A system of series-connected elements is reliable only when all its elements are reliable; the greater the number of elements in the system, the lower its reliability.

The reliability of series-connected elements can be found by the formula:

P = p₁ · p₂ · … · pₙ = pⁿ (1)

where p is the reliability of one element and n is the number of elements (the last equality holds for identical elements).

The probability of failure of a system of series-connected elements is found by the formula:

Q = 1 − P = 1 − pⁿ

2) parallel connection of elements


The parallel connection of the elements increases the reliability of the system.

The reliability of a system with parallel connection of elements is determined by the formula:

P = 1 − qⁿ

where q = 1 − p is the unreliability of one element.

The probability of failure with parallel connection of elements is determined by the formula:

Q = q₁ · q₂ · … · qₙ = qⁿ

3) combined connections.

There are two schemes for combined connections of elements.

Scheme (1) reflects the reliability of a system with parallel connection of two subsystems, each consisting of two series-connected elements.

Scheme (2) reflects the reliability of a system with series connection of two subsystems, each consisting of two parallel-connected elements.


The reliability of a system with parallel connection of two subsystems, each consisting of two series-connected elements, is determined by the formula:

P = 1 − (1 − p²)²

The reliability of a system with series connection of two subsystems, each consisting of two parallel-connected elements, is determined by the formula:

P = (1 − q²)²
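These formulas are easy to check numerically. Below is a minimal sketch in Python; the element reliability p = 0.9 and the structures are illustrative assumptions, not values from the text.

```python
# Reliability of systems of identical, independent elements (sketch).
# p = 0.9 is an illustrative element reliability.

def series(p: float, n: int) -> float:
    """Series connection: the system works only if all n elements work."""
    return p ** n

def parallel(p: float, n: int) -> float:
    """Parallel connection: the system fails only if all n elements fail."""
    q = 1.0 - p  # unreliability of one element
    return 1.0 - q ** n

p = 0.9
print(f"series of 2:   {series(p, 2):.4f}")                 # 0.8100
print(f"parallel of 2: {parallel(p, 2):.4f}")               # 0.9900
# Scheme (1): two series pairs connected in parallel.
print(f"scheme (1):    {1 - (1 - series(p, 2)) ** 2:.4f}")  # 0.9639
# Scheme (2): two parallel pairs connected in series.
print(f"scheme (2):    {parallel(p, 2) ** 2:.4f}")          # 0.9801
```

As expected, parallel redundancy raises reliability above that of a single element, while adding elements in series lowers it.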

1. The role and importance of statistical methods in quality management. Reasons deterring the use of statistical methods in the practice of domestic enterprises

1.1 Introduction

The need to use statistical methods is substantiated by the variability observed in processes, which affects the results of production and commercial activity even under apparent stability. Such variability manifests itself in the measured characteristics of products and processes at various stages of their life cycle, from market research to the sale of finished products.

Statistical methods help to measure, describe, analyze, and model such variability, even with a limited amount of data. Statistical data analysis can help form a better understanding of the nature, timing, and causes of variability and, subsequently, help solve and even prevent problems related to it.

Thus, statistical methods make it possible to make the best use of the available data when making decisions and to improve the quality of products and processes at the design, development, production, supply, and maintenance stages.

At present, the use of applied statistics at domestic enterprises by engineering personnel, let alone by workers, is relatively rare. There are three main reasons for this.

First, the traditional understanding of technology leads most engineers to occupy themselves with the transformation of materials and energy. They do not appreciate the importance of transforming, understanding, and using information.

Second, traditional technical education is built on the principle of "accuracy." From student years onward, the accuracy of design calculations, of machining, and of measurement becomes the main factor in a specialist's consciousness. Deviations are considered undesirable, and since they are undesirable, the orthodox principle kicks in: there should be no deviations, therefore there must be none. This is all the more surprising given that production workers see and understand that defect-free technologies and production do not and cannot exist.

Uncertainty is always present in industrial processes: in the actions of people, in the operation of machines, machine tools, devices, and tools, in the quality of materials and components, and so on. Only statistics, applied correctly and thoughtfully, can "open up" this uncertainty and reveal its patterns. Statistics helps to distinguish random from systematic deviations and to reveal their causes. Only by learning to find and control deviations (defects, rejects) and to detect the eliminable causes of rejects can one approach the concept of accuracy. (A complete analogy with medicine can be traced here: in treating any ailment, its severity is determined precisely by the magnitude of the deviation from the norm, and the method of treatment is entirely determined by the cause of the ailment.)

Third, most specialists have no experience in processing empirical data and drawing general conclusions from specific observations. The stereotype of industrial thinking today has developed in such a way that the role of analysis is reduced practically to zero. Production problems are often solved with an orientation only toward momentary data, which leads to severe, sometimes irreparable, consequences. The traditions of such work need to be broken "from a position of power," and this, first of all, must be understood by the leadership.

Statistics significantly helps to solve traditional engineering and production problems. It facilitates the processing, analysis, and use of information. The seven methods of statistical analysis (the Ishikawa diagram, the Pareto diagram, the histogram, etc.; see the table in topic 6) help to present data in a form convenient for generalization and analysis. The use of these methods makes it possible to draw reliable and correct conclusions and to obtain greater certainty in finding the causes of problems, and hence greater concreteness and effectiveness of the measures developed to eliminate those causes.

An invaluable advantage of applying statistics in production practice is a rapid reduction in costs. For example, at Hewlett-Packard, statistical methods were used to establish the optimal characteristics of equipment under various conditions. The result of ten months of work based on analyzing the process with statistical methods was a sharp decline in rejects: from 9,000 defects per million products to 45 defects per million. At the same company, in another case, even more impressive results were achieved: after only seven weeks of statistical research and the implementation of corrective measures, rejects decreased from 36,000 defects per million products to 1,500. It is therefore quite natural, and no surprise, that statistical methods are widely disseminated in the activities of foreign firms (with E. Deming as ideologist) and widely used in the standards of the ISO 9000 series.

At present we need to rethink the proclaimed, routine working methods, which are often oriented toward a spontaneous solution of momentary problems. As an alternative, the widespread use of statistical methods should be fostered among all specialists, including workers, aimed at the professional identification and consistent elimination of bottlenecks. For this, at least three conditions must be met:


  • conduct training in the methods of applied statistics (the seven analysis methods and sampling control) for all workers;

  • create official arrangements, supported by the company's management, requiring the use of these methods;

  • morally and financially encourage employees who apply applied statistics to solve production problems, and express official approval of their activities.

The use of the seven analysis methods contributes to improving quality and reducing rejects, and thus to a sharp streamlining of production and a reduction of costs and expenses. The use of methods of statistical (sampling) control will also yield tangible economic and organizational benefits.

K. Ishikawa asserts that "95% of all the problems of a company can be solved with the help of these seven principles. They are simple, but without them it is impossible to master more complex methods. In Japan, the use of these methods is given great importance; even graduates of secondary schools use them without any difficulty." The American scientist A. Feigenbaum also considers it necessary to use statistical methods for analysis and sampling control in production.


1.2 Characteristics of statistical methods




Method: content, goal

Data collection sheet: systematic recording of the situation in the form of concrete data.

Histogram: ordering of data according to frequency of occurrence (for example, over time intervals).

Pareto analysis: ranking of facts by significance.

Stratification: separation into layers of data of different origin.

Cause-and-effect diagram: analysis of the sources of the main problems (man, machine, material, method, ...) with reference to their impact on the problem.

Correlation (scatter) diagram: derivation of patterns and relationships from the information material.

Quality control chart: continuous monitoring of whether the process operates within the specified tolerances.

Descriptive statistics: quantitative assessment of the characteristics of the data obtained; the method is based on analytical procedures for processing and presenting quantitative data.

Measurement analysis: a set of procedures for estimating the accuracy of a measuring system under its operating conditions.

Construction of confidence intervals: the procedure for determining tolerances based on the accuracy of the actions performed, using the statistical distribution of the measurements.

Process capability analysis: assessment of the variability of a process in a state of statistical stability (the estimates are capability indices).

Hypothesis testing: a statistical procedure for checking the validity of a hypothesis concerning the parameters of one or more samples at given confidence levels.

Regression analysis: relates the behavior of the studied characteristic to potential causes.

Reliability analysis: the application of engineering and analytical methods to reliability problems, including the assessment, prediction, and prevention of random failures over time.

Sampling control: a systematic statistical method for obtaining information about the characteristics of a population by studying a representative sample (statistical acceptance control, sample surveys).

Modeling: a set of procedures by which a theoretical or empirical system can be represented mathematically, for example as a computer program, in order to find solutions to problems.

Time series analysis: a set of methods for studying sequential groups of observations (analysis of temporal trends).

Design of experiments: intentional changes are introduced into the system under study, together with a statistical assessment of the effect of these changes on the system. As a result, it becomes possible to determine the main characteristics of the system or to investigate the influence of one or more factors on those characteristics.

1.3 Simple non-formalized methods of systems analysis and methods of Japanese quality groups

Quality groups naturally presuppose solutions to emerging problems that are based primarily on collective effort. In many firms, for example, the "brainstorming" method and its varieties are practiced.

1.3.1 "Brainstorming".

Goal: to obtain the maximum number of proposals.

Procedure:

Rules of the business game:


  1. Clearly state the goal.

  2. Speakers may take turns, or ideas may be expressed spontaneously.

  3. Offer one idea at a time.

  4. Do not discuss ideas.

  5. Build on the ideas of the others.

  6. Register all ideas ..... for group members.

1.3.2 Delphi method.

Goal: to choose the best of the alternatives.

Procedure:

Calculation table.


Participants in the discussion    Alternatives
                                  1             2             3             4             5
                                  R   B   P     R   B   P     R   B   P     R   B   P     R   B   P
A                                 4   7   28    3   4   12    1   1   1     2   3   6     5   10  50
B                                 5   2   10    3   6   18    2   7   14    1   10  10    4   4   16
V                                 2   8   16    1   1   1     4   3   12    3   4   12    5   2   10
G                                 5   10  50    4   5   20    3   4   12    2   3   6     1   1   1
Sum of products                   104           51            39            34            77

R is the rank assessment (from 1 to 5); B is the score in points (from 1 to 10); P is the product R × B.

According to the calculations, the fourth alternative, with a sum of 34, turned out to be the cause that should be eliminated first. The counting results are accepted unconditionally by the entire group.
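The counting in the table reduces to multiplying each rank by its score and summing the products per alternative. A minimal sketch in Python, with the data copied from the table above:

```python
# Delphi method tally (sketch): R = rank (1..5), B = score (1..10), P = R*B.
# Rows: participants A, B, V, G; columns: alternatives 1..5, as (R, B) pairs.
votes = {
    "A": [(4, 7), (3, 4), (1, 1), (2, 3), (5, 10)],
    "B": [(5, 2), (3, 6), (2, 7), (1, 10), (4, 4)],
    "V": [(2, 8), (1, 1), (4, 3), (3, 4), (5, 2)],
    "G": [(5, 10), (4, 5), (3, 4), (2, 3), (1, 1)],
}

# Transpose to iterate per alternative and sum the products R*B.
sums = [sum(r * b for r, b in col) for col in zip(*votes.values())]
print(sums)                       # [104, 51, 39, 34, 77]
best = sums.index(min(sums)) + 1  # the alternative with the smallest sum wins
print(f"alternative {best} is addressed first")  # alternative 4
```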

1.3.3 Quality Group Methods

The "black box" method. Solving problems based on this method is carried out by analyzing specific situations that are selected in such a way that, when they were analyzed, the discussion participants involuntarily affect the issues of the occurrence of defects. This participants encourage special, focused issues, for example: "What can this situation be given?" Or: "How stable in this case the work of the mechanisms?" etc. The essence of the "black box" method is that the causes of defects are detected as if indirectly. Here is unleashed by a creative initiative of people.

Synectics. The method is used both to identify problem situations and to solve emerging problems. The procedure consists of three stages. At the first stage, the problems formulated by the group leader are analyzed. Then each participant in the discussion puts forward his own problems, and these are also carefully discussed. On completion of these two stages, some general model of the solution is identified. At the third stage, all the generalizations, as well as the identified model, are subjected to intensive study. Not only group members defending their collective idea but also invited experts take part in the discussion. The experts' task is to help the quality group members make the right decision.

The diary method. Each member of the quality group receives a pocket notebook, in which all ideas arising in connection with the problem under discussion are written down. The records of all participants are then analyzed by the group leader, followed by a discussion of the prepared material at the next meeting. According to the Japanese, this method is valuable in that, first, an emerging idea or concrete rationalization proposal acquires a collective group coloring, and second, all inconsistencies and differing points of view are revealed before the group meeting, so that categorical points of view are smoothed out. The meeting is usually presented with an "averaged" opinion.

The 6-6 method. At least six members of the quality group try, for six minutes, to formulate concrete ideas that should contribute to solving the problem facing the group (hence the name of the method). Each participant writes down his thoughts on a separate sheet, in concise form, for example: sealing failure, material destruction, technology violation, and so on. After that, a group discussion of all the prepared lists is organized. In the course of the discussion, clearly erroneous opinions are sifted out, controversial ones are clarified, and all the remaining ones are grouped according to certain criteria. The task is to select a few of the most important alternatives, and their number must be less than the number of participants in the discussion.

What unites the listed problem-solving methods is a general orientation toward working out a single opinion. This orientation determines the very tone of the discussion of even the most acute issues by a quality group. A friendly style of discussion, in which mutual accusations, personal attacks, labeling, and the identification of the "right" and the "guilty" are impossible, is considered an important condition for quickly finding optimal solutions.

The orientation toward a single opinion undoubtedly manifests elements of the national cultural heritage of the Japanese. The well-known Japanese biophysicist prof. Satsuro Eckaci says that the Japanese have historically been accustomed to agreeing with others. In Japan it is considered good form, he emphasizes, when interlocutors do not impose their point of view on each other and do everything possible to avoid excessive tension in considering any controversial points. In the practice of quality groups, these behavioral attitudes can be traced with extreme clarity.

As a result of studying this chapter, the student must:

know

  • the provisions of the ISO standards on the role of statistical methods in the control and assurance of product quality;
  • methods for analyzing product quality and regulating technological processes;

be able to

own

skills in applying statistical methods when regulating product quality.

The concept of product quality control and statistical methods used. Seven quality control tools

Product quality control is an integral part of the production process. It is carried out at all stages of the technological cycle, starting with quality control of the raw materials and other materials used and ending with determining the conformity of the finished product to the technical specifications and parameters. Production quality control is conducted in two directions:

  • 1) in regulating the course of the product manufacturing process;
  • 2) in the acceptance of finished products.

Quality control at the enterprise is entrusted to special services, the technical control departments, whose tasks include developing quality assessment indicators for all types of products, methods of quality testing, the analysis of complaints, and finding out the causes of defects and rejects and the conditions for their elimination.

In accordance with the international ISO 9001 series standards, statistical methods are regarded as one of the highly effective means of ensuring product quality. The use of statistical methods makes it possible, with a given degree of accuracy and reliability, to judge the state of the production process and the need for its regulation at all stages of the product life cycle. Statistical methods are regarded as a path for developing new technology and controlling quality at various stages of the production process. Detailed instructions on the use of statistical methods for quality analysis and control are contained in ISO 9004-1, clause 4.20.

To obtain quality products, it is necessary to ensure the accuracy of the available equipment, determine whether the accuracy of the selected technological process corresponds to the specified product accuracy, and evaluate the stability of the technological process. These tasks are solved mainly by statistical processing of empirical data obtained by repeated measurements: either of the actual dimensions of products, or of processing errors, or of measurement errors.

To control technological processes at each stage, their accuracy and stability are estimated. Here the actual data on the controlled parameters are compared with the reference values specified in the technological documentation.

The variation of the data relating to the measured product parameters is studied by statistical methods. To monitor the scatter, a graph of the measurements is constructed, which makes it possible to understand the nature of the process. If the variation of the data is small, the average estimate of the parameter can be considered reliable and there is no need to change the production technology. If the scatter is large, the process needs to be regulated in order to stabilize it and ensure product quality.

To control product quality, it is necessary to have:

  • standards and technical parameters characterizing product quality;
  • methods and means of quality verification and control;
  • technical means for testing;
  • knowledge of the causes of defects and rejects and the conditions for their elimination;
  • the results of complaint analysis.

Statistical methods include:

1) methods used in developing technical control operations, planning industrial experiments, and calculating the accuracy and reliability of product parameters. They make it possible to identify quite simply any nonconformity of product parameters with the technical documentation. These are the so-called seven quality control tools, which in 1979 were brought together and proposed by the Union of Japanese Scientists and Engineers (JUSE) as the most commonly used visual methods for analyzing production processes;

2) methods of multivariate statistical analysis: correlation and regression analysis, analysis of variance, methods of factor and cluster analysis, adaptive robust statistics, etc.

The main purpose of statistical methods is to control the production process and to provide facts for adjusting and improving it.

Seven quality control tools. The seven quality control tools are a set of tools that facilitate the control of production processes, their adjustment, and the improvement of product quality. They include:

  • 1) check sheets, which improve the process of collecting data and automatically order it so as to facilitate further use;
  • 2) histograms, which reflect the state of the process over the period during which the data were obtained. Comparing the shape of the distribution with the control standards provides information for managing the process (convenient when compiling monthly reports on product quality and on the results of technical control, and when demonstrating changes in the quality level by month, etc.);
  • 3) Pareto charts, which make it possible to find out the causes of defects and to concentrate on eliminating those causes (used in analyzing the types of rejects, the losses from rejects, and the expenditure of time and material resources on correcting them);
  • 4) the stratification method (data separation), a tool that makes it possible to divide data into subgroups according to a specific feature;
  • 5) the Ishikawa diagram (the cause-and-effect diagram of quality variation), which shows the relationship between a quality characteristic and the factors affecting it (used in solving problems of ensuring product quality, the efficiency of equipment use, and compliance with the standards for technological operations);
  • 6) scatter diagrams, which make it possible to identify cause-and-effect relationships between quality indicators and influencing factors when analyzing the Ishikawa diagram (constructed as a correlation field for the relationship between two variables x and y);
  • 7) control charts, used to manage the quality of the technological process, since they make it possible to catch the moments when the manufactured products begin to deviate from the tolerances specified by the technical conditions.

The listed methods are simple and together form an effective system of methods for quality control and analysis. They can be used in any sequence and can be viewed both as a holistic system and as separate analysis tools.
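As a numeric illustration of the last tool, here is a minimal sketch of x̄-chart control limits in the Shewhart style; the subgroup data and the standard chart constant A2 = 0.577 for subgroups of five are assumptions made for the example, not data from the text.

```python
from statistics import fmean

# X-bar control chart limits for subgroups of size 5 (sketch).
# A2 = 0.577 is the standard Shewhart constant for subgroups of n = 5.
subgroups = [
    [10.1, 10.3, 9.9, 10.2, 10.0],
    [10.0, 10.2, 10.1, 9.8, 10.1],
    [10.2, 10.4, 10.0, 10.1, 10.3],
]

xbars = [fmean(g) for g in subgroups]           # subgroup means
ranges = [max(g) - min(g) for g in subgroups]   # subgroup ranges
center = fmean(xbars)                           # center line
A2 = 0.577
ucl = center + A2 * fmean(ranges)               # upper control limit
lcl = center - A2 * fmean(ranges)               # lower control limit
print(f"LCL = {lcl:.3f}, CL = {center:.3f}, UCL = {ucl:.3f}")
# A subgroup mean outside (LCL, UCL) signals a special cause worth investigating.
```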

Statistical methods

Statistical methods are methods for analyzing statistical data. A distinction is made between methods of applied statistics, which can be used in all areas of scientific research and in any sector of the national economy, and other statistical methods whose applicability is limited to a particular sphere. Of great importance are methods such as statistical acceptance control, statistical regulation of technological processes, reliability and testing, and design of experiments.

Classification of statistical methods

Statistical data analysis methods are applied in almost all areas of human activity. They are used whenever it is necessary to obtain and substantiate judgments about a group (of objects or subjects) with some internal heterogeneity.

It is advisable to distinguish three types of scientific and applied activity in the field of statistical data analysis methods (according to the degree of specificity of the methods, i.e. the degree of their immersion in specific problems):

a) the development and study of general-purpose methods, irrespective of the specifics of the field of application;

b) the development and study of statistical models of real phenomena and processes in accordance with the needs of a particular field of activity;

c) the application of statistical methods and models for the statistical analysis of specific data.

Applied statistics

A description of the type of data and of the mechanism of their generation is the beginning of any statistical study. Both deterministic and probabilistic methods are used to describe data. With deterministic methods one can analyze only the data available to the researcher. For example, such methods were used to produce the tables calculated by official state statistics bodies on the basis of the statistical reports submitted by enterprises and organizations. The results obtained can be transferred to a wider population, and used for prediction and control, only on the basis of probabilistic statistical modeling. This is why only methods based on probability theory are often included in mathematical statistics.

We do not consider it possible to oppose deterministic and probabilistic statistical methods to each other. We consider them consecutive stages of statistical analysis. At the first stage, the data must be analyzed and presented in a form convenient for perception, with tables and diagrams. Then it is advisable to analyze the statistical data on the basis of particular probabilistic statistical models. Note that the possibility of deeper penetration into the essence of a real phenomenon or process is ensured by the development of an adequate mathematical model.

In the simplest situation, statistical data are the values of some feature characterizing the objects under study. Values can be quantitative or can indicate a category to which the object may be assigned. In the second case, one speaks of a qualitative feature.

When several quantitative or qualitative features are measured, the statistical data for an object form a vector. This can be considered a new kind of data. In this case, the sample consists of a set of vectors. If some of the coordinates are numbers and some are qualitative (categorized) data, we speak of a vector of heterogeneous data.

A single element of the sample, that is, a single measurement, may also be a function as a whole, for example one describing the dynamics of an indicator, i.e. its change in time: a patient's electrocardiogram, the beat amplitude of an engine shaft, or a time series describing the dynamics of the indicators of a particular company. In that case the sample consists of a set of functions.

Sample elements can also be other mathematical objects, for example binary relations. Thus, in expert surveys, orderings (rankings) of the objects of expertise are often used: product samples, investment projects, variants of management decisions. Depending on the regulations of the expert study, the elements of the sample may be binary relations of various kinds (orderings, partitions, tolerances), sets, fuzzy sets, etc.

So the mathematical nature of the sample elements in the various tasks of applied statistics can differ widely. Nevertheless, two classes of statistical data can be distinguished: numeric and non-numeric. Accordingly, applied statistics divides into two parts: numerical statistics and non-numerical statistics.

Numerical statistical data are numbers, vectors, functions. They can be added and multiplied by coefficients, so sums of various kinds are of great importance in numerical statistics. The mathematical apparatus for analyzing the sums of random sample elements comprises the (classical) laws of large numbers and central limit theorems.

Non-numerical statistical data are categorized data, vectors of heterogeneous features, binary relations, sets, fuzzy sets, etc. They cannot be added or multiplied by coefficients, so it makes no sense to speak of sums of non-numerical statistical data. Such data are elements of non-numerical mathematical spaces (sets). The mathematical apparatus for analyzing non-numerical statistical data is based on the use of distances between elements (as well as proximity measures and difference indicators) in such spaces. With the help of distances, empirical and theoretical means are defined, laws of large numbers are proved, non-parametric estimates of probability density are constructed, and problems of diagnostics and cluster analysis are solved.

Applied studies use statistical data of various kinds. This is due, in particular, to the methods by which they are obtained. For example, if tests of some technical devices continue only up to a certain point in time, we obtain so-called censored data: a set of numbers, the operating times of a number of devices before failure, plus the information that the remaining devices were still working at the end of the test. Censored data are often used in evaluating and controlling the reliability of technical devices.

Usually the statistical methods for analyzing data of the first three types are considered separately. This separation is caused by the circumstance noted above: the mathematical apparatus for analyzing non-numerical data differs essentially from that for data in the form of numbers, vectors, and functions.

Probabilistic statistical modeling

When statistical methods are applied in specific fields of knowledge and sectors of the national economy, we obtain scientific and practical disciplines such as "statistical methods in industry," "statistical methods in medicine," etc. From this point of view, econometrics is "statistical methods in economics." These disciplines of group b) are usually based on probabilistic statistical models built in accordance with the features of the field of application. It is highly instructive to compare the probabilistic statistical models used in different fields and to discover their closeness as well as certain differences. Thus one can see the closeness of the problems, and of the statistical methods applied to solve them, in fields such as scientific medical research, applied sociological research, and marketing research; in short, in medicine, sociology, and marketing. They are often grouped together under the title "sample studies."

The difference between sample studies and expert studies manifests itself, first of all, in the number of objects or subjects surveyed: in sample studies we usually speak of hundreds, in expert studies of dozens. On the other hand, the technology of expert research is much more sophisticated. The specificity is even more pronounced in demographic or logistic models, in the processing of narrative (textual, chronicle) information, or in the study of the mutual influence of factors.

Questions of the reliability and safety of technical devices and technologies, and of queueing theory, are considered in detail in a large number of scientific works.

Statistical analysis of specific data

The application of statistical methods and models to the statistical analysis of specific data is closely tied to the problems of the corresponding field. The results of the third of the identified types of scientific and applied activity lie at the intersection of disciplines. They can be considered examples of the practical application of statistical methods, but there is no less reason to assign them to the corresponding field of human activity.

For example, the results of a survey of consumers of instant coffee are naturally assigned to marketing (and that is what is done when lecturing on marketing research). The study of price growth with the help of inflation indices calculated from independently collected information is of interest primarily from the point of view of economics and the management of the national economy (both at the macro level and at the level of individual organizations).

Development prospects

The theory of statistical methods is aimed at solving real problems. Therefore new formulations of mathematical problems of statistical data analysis constantly arise in it, and new methods are developed and substantiated. Substantiation is often carried out by mathematical means, that is, by proving theorems. A major role is played by the methodological component: exactly how to pose the problems and which assumptions to adopt for further mathematical study. The role of modern information technologies, in particular computer experiments, is also great.

The task of analyzing the history of statistical methods is relevant for identifying their development trends and applying the results to forecasting.


Erlan Askarov, Associate Professor, KazNTU named after K. Satpayev


Statistical methods play an important role in the objective assessment of the quantitative and qualitative characteristics of a process and are one of the most important elements of the product quality assurance system and of the entire quality management process. It is no coincidence that E. Deming, the founder of the modern theory of quality management, worked for many years at the Census Bureau and dealt with questions of statistical data processing. He attached great importance to statistical methods.

To obtain quality products, it is necessary to know the real accuracy of the available equipment, to determine whether the accuracy of the selected technological process matches the specified product accuracy, and to evaluate the stability of the technological process. Tasks of this kind are solved mainly by mathematical processing of empirical data obtained by repeated measurements, either of the actual dimensions of products, or of processing errors, or of measurement errors.

There are two categories of errors: systematic and random. As a result of direct observations, measurements, or the registration of facts, a mass of data is obtained; it forms a statistical aggregate and needs processing, which includes systematization and classification, the calculation of the parameters characterizing the aggregate, and the compilation of tables and graphs illustrating the process.

In practice, a limited number of numerical characteristics, called distribution parameters, are used.

Grouping center. One of the main characteristics of a statistical aggregate, giving an idea of the center around which all its values are grouped, is the arithmetic mean. It is determined from the expression:

x̄ = (x₁ + x₂ + … + xₙ) / n (1)

The simplest measure of scatter is the range:

R = xmax − xmin

where xmax and xmin are the maximum and minimum values of the statistical aggregate.

The range of variation is not always representative, since it takes into account only the extreme values, which may differ greatly from all the others. Scatter is determined more accurately by indicators that take into account the deviations of all values from the arithmetic mean. The main such indicator is the standard deviation of the observation results, which is determined by the formula:

s = √( Σ(xᵢ − x̄)² / (n − 1) ) (4)
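A minimal sketch of computing these distribution parameters in Python; the measurement values are illustrative:

```python
import statistics

x = [10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.3]  # illustrative measurements

mean = statistics.fmean(x)   # grouping center, formula (1)
rng = max(x) - min(x)        # range R = xmax - xmin
s = statistics.stdev(x)      # sample standard deviation, formula (4)
print(f"mean = {mean:.3f}, range = {rng:.3f}, s = {s:.3f}")
```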

The form of the probability distribution. To characterize the form of the distribution, a mathematical model is usually used that best approximates the probability distribution curve obtained from the analysis of the experimentally obtained data.

The normal distribution law. Most of the random phenomena occurring in life, in particular in production and scientific research, are characterized by the presence of a large number of random factors and are described by the normal distribution law, which is fundamental in many practical studies. However, the normal distribution is not the only possibility. Depending on the physical nature of the random variable, some variables in practice may have a distribution of a different kind, for example logarithmic, exponential, Weibull, Simpson, Rayleigh, or uniform.

The equation describing the probability density of the normal distribution is:

f(x) = 1 / (σ√(2π)) · exp(−(x − μ)² / (2σ²)) (5)

The normal distribution is characterized by two parameters, μ and σ², and its graph is the symmetric Gaussian curve (Figure 1), with a maximum at the point corresponding to x = μ (which equals the arithmetic mean x̄ and is called the grouping center); as x → −∞ and x → ∞ the curve asymptotically approaches the abscissa axis. The inflection points of the curve lie at a distance σ from the center μ. As σ decreases, the curve stretches along the ordinate axis and is compressed along the abscissa axis. Between the abscissas μ − σ and μ + σ lies 68.3% of the total area of the normal distribution curve. This means that, under a normal distribution, 68.3% of all measured units deviate from the mean by no more than σ; that is, they all lie within ±σ. The area enclosed between the ordinates drawn at a distance 2σ on either side of the center contains 95.4% of the total, and correspondingly that many units of the aggregate lie within μ ± 2σ. Finally, 99.73% of all units lie within μ ± 3σ. This is the so-called "three sigma" rule, characteristic of the normal distribution. According to this rule, no more than 0.27% of all values fall beyond a deviation of 3σ, that is, 27 occurrences per 10,000. In technical applications, when evaluating measurement results, it is customary to work with coefficients Z (multipliers of σ) corresponding to 90%, 95%, 99%, and 99.9% probability of falling within the tolerance zone.


Figure 1

Z90 = 1.65; Z95 = 1.96; Z99 = 2.576; Z99.9 = 3.291.

It should be noted that the same rule applies to deviations of the mean value x̄. It, too, fluctuates within a region of three values of the standard deviation of the mean, S, in both directions, and 99.73% of all mean values are contained in this region. The normal distribution manifests itself well when the number of members of the statistical aggregate is large, at least 30.
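The percentages and Z coefficients quoted above can be reproduced numerically; a minimal sketch, assuming scipy is available:

```python
from scipy.stats import norm

# Share of a normal population within mu +/- k*sigma (the three-sigma rule).
for k in (1, 2, 3):
    p = norm.cdf(k) - norm.cdf(-k)
    print(f"within +/-{k} sigma: {100 * p:.2f}%")  # 68.27%, 95.45%, 99.73%

# Two-sided Z coefficients for given coverage probabilities.
for cov in (0.90, 0.95, 0.99, 0.999):
    print(f"Z for {cov:.1%}: {norm.ppf(0.5 + cov / 2):.3f}")
# 1.645, 1.960, 2.576, 3.291 -- matching the values quoted in the text
```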

The Student distribution. Of great practical interest is the possibility of judging the distribution of random variables, and of determining the production errors in all manufactured products and the errors of scientific experiments, from the results of measuring the parameters of a statistical aggregate obtained from a small-volume batch. This technique was developed by W. S. Gosset in 1908 and published under the pseudonym Student.

The Student distribution is symmetric but flatter than the normal distribution curve, and is therefore stretched at the ends (Figure 2). For each value of n there is a corresponding t-function and distribution. In the Student distribution, the coefficient Z is replaced by the coefficient t, whose value depends on the chosen significance level (which determines what share of realizations may lie outside the selected region of the Student distribution curve) and on the number of products in the sample.


Figure 2.

For large n, the Student distribution asymptotically approaches the standard normal distribution. With accuracy acceptable in practice, one may assume that for n ≥ 30 the Student distribution, sometimes called the t-distribution, is approximated by the normal one.

The t-distribution has the same parameters as the normal one: the arithmetic mean x̄, the standard deviation s, and the standard deviation of the mean S. Here x̄ is determined by formula (1), s by formula (4), and S by the formula:

S = s / √n (6)

Accuracy control. If the distribution law of a random variable is known, one can obtain all the characteristics of a batch of products: determine the mean value, the variance, and so on. But the complete set of statistical data for a batch of industrial products, and hence the exact probability distribution law, can become known only after the entire batch has been manufactured. In practice, the distribution law of the whole population of products is almost always unknown; the only source of information is a sample, usually a small one. Each sample characteristic calculated from sample data, for example the arithmetic mean or the variance, is a realization of a random variable that can take different values from sample to sample. The control task is made easier by the fact that there is usually no need to know the exact magnitude of the deviations of the random values from a given value. It is enough to know whether the observed values differ by more than the permitted error, which is determined by the tolerance. The extension of estimates made from sample data to the general population can be carried out only with some probability P(t). Thus, a judgment about the properties of the general population is always probabilistic and contains an element of risk. Since the conclusion is made from sample data, that is, from a limited amount of information, errors of the first and second kind may occur.

The probability of committing an error of the first kind is called the significance level and is denoted α. The region corresponding to the probability α is called critical, and the complementary region, whose probability equals 1 − α, is called admissible.

The probability of an error of the second kind is denoted β, and the quantity 1 − β is called the power of the criterion.

The quantity α is sometimes called the manufacturer's risk, and the quantity β the consumer's risk.

With probability 1 − α, the unknown value x₀ of the general population lies in the interval:

(x̄ − Zσ) < x₀ < (x̄ + Zσ) for the normal distribution,

(x̄ − ts) < x₀ < (x̄ + ts) for the Student distribution.

The limiting extreme values of x₀ are called the confidence limits.

As the sample size decreases, the confidence limits under the Student distribution widen and the probability of error increases. Setting, for example, a 5% significance level (α = 0.05), one considers that with probability 95% (P = 0.95) the unknown value x₀ lies in the interval

(x̄ − ts, …, x̄ + ts)

In other words, the sought dimension will lie within x̄ ± ts, and the number of parts outside this tolerance will be no more than 5%.
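A minimal sketch of the confidence bounds for the mean of a small sample, contrasting the Student coefficient t with the normal coefficient Z; the sample values are illustrative:

```python
import math
from scipy.stats import t, norm

x = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2]  # illustrative small sample
n = len(x)
mean = sum(x) / n
s = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))  # formula (4)
se = s / math.sqrt(n)                # standard error of the mean, formula (6)

alpha = 0.05                         # 5% significance level
t_coef = t.ppf(1 - alpha / 2, df=n - 1)  # Student coefficient
z_coef = norm.ppf(1 - alpha / 2)         # normal coefficient, for comparison

print(f"Student interval: {mean - t_coef * se:.3f} .. {mean + t_coef * se:.3f}")
print(f"normal interval:  {mean - z_coef * se:.3f} .. {mean + z_coef * se:.3f}")
# The Student bounds are wider: the price paid for a small sample.
```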

Process stability control. In real production conditions, the actual values of the process parameters and of the characteristics of the manufactured products not only vary chaotically because of random errors but often also deviate gradually and monotonically from the specified values over time; that is, a systematic error appears. These errors must be eliminated by identifying and removing their causes. The problem is that in real conditions systematic errors are hard to distinguish from random ones. Small systematic errors can long remain unnoticed against the background of random errors unless special statistical analysis is applied.

The analysis is based on the fact that, when systematic errors are absent, the actual parameter values change randomly, while their mean values and basic errors remain unchanged over time. In this case the technological process is called stable. It is conventionally assumed that all products in such a batch are identical. In a stable process, the random errors obey the normal distribution law with center μ = x₀. The mean values of the parameters obtained in different batches should be approximately equal to x₀. Consequently, they are all approximately equal to one another, and the value of the current mean x̄ fluctuates within the confidence interval ±tS, that is:

(x̄ − tS) ≤ x̄t ≤ (x̄ + tS) (7)

The same data used for accuracy control can serve as material for stability analysis. But they are suitable only if they represent continuous observations covering a sufficient period of time, or if they are composed of samples taken at certain intervals. The intervals between the samples are established depending on the observed frequency of equipment drift.

At a given significance level, the mean x̄t of each current batch may differ from the base value x̄ obtained in the first measurement by no more than tS, that is:

|x̄ − x̄t| ≤ tS (8)

When this condition is fulfilled, we may assume that the process is stable and both batches were produced under identical conditions. If the difference between the means of two batches exceeds tS, it can no longer be assumed that this difference is caused only by random causes: a dominant constant factor has appeared in the process, which changes the values of the product parameters in the batch according to some definite law. The process is unstable, and products manufactured at different times will differ significantly from one another, the difference increasing with time.

Thus, a discrepancy between the means of different batches greater than tS indicates the presence of systematic errors and the need to take measures to detect them and to eliminate the causes that give rise to them. This principle was applied by W. Shewhart in the development of control charts.

Statistical methods of stability analysis can also be applied in situations opposite to those discussed above. If some changes are made in the design of a product or in the technological process of its manufacture, it is required to determine to what extent they will lead to the expected results.

Hence it is necessary to run several trials and to process the data statistically. If

|x̄old − x̄new| > tS, (9)

where x̄old and x̄new are the means before and after the change, then the change has had a statistically significant effect on the process.
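A minimal sketch of the stability check (8): here S is taken as the standard error of the difference of the two batch means, which is one reasonable reading of the condition; the batch data are illustrative.

```python
import math
from statistics import fmean, stdev
from scipy.stats import t

# Stability check between a base batch and a current batch (sketch).
# S is taken as the standard error of the difference of means, an
# assumption consistent with condition (8): |mean1 - mean2| <= t*S.
batch1 = [12.02, 11.98, 12.05, 12.00, 11.97, 12.03]
batch2 = [12.10, 12.08, 12.13, 12.09, 12.12, 12.07]

m1, m2 = fmean(batch1), fmean(batch2)
S = math.sqrt(stdev(batch1) ** 2 / len(batch1)
              + stdev(batch2) ** 2 / len(batch2))
t_coef = t.ppf(0.975, df=len(batch1) + len(batch2) - 2)  # alpha = 0.05

if abs(m1 - m2) <= t_coef * S:
    print("process stable: the difference is within random scatter")
else:
    print("systematic shift detected: find and eliminate its cause")
```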

Seven simplest methods of statistical process research

Modern statistical methods are rather difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. By 1979, the Union of Japanese Scientists and Engineers (JUSE) had brought together seven fairly easy-to-use visual methods for analyzing processes. For all their simplicity, they maintain the connection with statistics and give professionals the possibility of using their results and, when necessary, improving them.

The Ishikawa cause-and-effect diagram. This diagram is a very powerful tool for analyzing a situation and obtaining information about the influence of various factors on the main process. Here it is possible not only to identify the factors affecting the process but also to determine the priority of their influence.


Figure 3.

A 5M-type diagram considers such quality components as "people," "equipment," "material, raw materials," "technology," and "management," while in a 6M-type diagram the "environment" component is added to them (Figure 3).

In relation to the problem being analyzed (a minimal code sketch follows this list):
- for the "people" component, determine the factors associated with the convenience and safety of performing the operation;
- for the "equipment" component, the relationships between the design elements of the analyzed product involved in this operation;
- for the "technology" component, the factors associated with the productivity and accuracy of the operation;
- for the "material" component, the factors associated with the absence of changes in the properties of the product's materials while the operation is performed;
- for the "management" component, the factors associated with reliable recognition of errors in the operation process;
- for the "environment" component, the factors associated with the impact of the environment on the product and of the product on the environment.

Type of defect                   Tally (control data)           Total
Dents                            ///// ///// ////                 14
Cracks                           ///// ///// ///// //             17
Out of tolerance (minus)         ///// //                          7
Out of tolerance (plus)          ///// ///// ///// ///// ///      23
Burns during heat treatment      ///// ////                        9
Damage to datum surfaces         ///                               3
Casting cavities                 ///// /                           6
Roughness out of specification   ///// ///// ///// ///            18
Paint defects                    ////                              4
Others                           ///// //                          7
TOTAL                                                             108

Figure 4.

Check sheets. Check sheets can be used both in control by qualitative attributes and in control by quantitative characteristics. In this document, particular types of defects are recorded over a certain period of time. The check sheet is good statistical material for further analysis, for studying production problems, and for reducing the defect level (Figure 4).
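A check sheet is easy to emulate in code. The following minimal Python sketch tallies a hypothetical stream of defect observations; the defect names and counts are invented for the example.

    # A minimal check-sheet sketch: tallying defect types observed during a shift.
    from collections import Counter

    observations = [
        "cracks", "dents", "roughness", "cracks", "tolerance (+)",
        "paint", "cracks", "tolerance (+)", "dents", "roughness",
    ]

    sheet = Counter(observations)
    for defect, count in sheet.most_common():
        print(f"{defect:15s} {'/' * count:10s} {count}")
    print(f"{'TOTAL':15s} {'':10s} {sum(sheet.values())}")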

Pareto analysis. Pareto analysis is named after the Italian economist Vilfredo Pareto (1848-1923), who showed that most of the capital (80%) is in the hands of a small number of people (20%). Pareto developed logarithmic mathematical models describing this inhomogeneous distribution, and the mathematician M. O. Lorenz provided graphic illustrations, in particular the cumulative curve.

The Pareto rule is a "universal" principle applicable in a great variety of situations, and without doubt in solving quality problems. J. Juran noted the "universal" application of the Pareto principle to any group of causes producing a given consequence, with most of the consequences caused by a small number of causes. Pareto analysis ranks individual areas by significance or importance and calls for identifying and eliminating first those causes that produce the greatest number of problems (nonconformities).

Figure 5.

Pareto analysis is usually illustrated by a Pareto diagram (Figure 5), on which the causes of quality problems are plotted along the abscissa axis in decreasing order of the problems they cause, and the problems themselves are plotted along the ordinate axis, both in numerical terms and in accumulated (cumulative) percentage. We construct the diagram from the data taken from the previous example, the check sheet.

The diagram clearly shows the area for priority action, outlining those causes that produce the greatest number of errors. Thus, first of all, preventive measures should be aimed at solving precisely these problems. Identifying and eliminating the causes that produce the greatest number of defects allows us to spend the minimum amount of resources (money, time, people, material support) to obtain the maximum effect in the form of a significant reduction in the number of defects.
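Below is a hedged sketch of building such a diagram in Python with matplotlib, using the totals from the check sheet in Figure 4; the axis labels are assumptions for the example.

    # A sketch of a Pareto diagram built from the check-sheet data in Figure 4.
    import matplotlib.pyplot as plt

    defects = {
        "Out of tolerance (+)": 23, "Roughness": 18, "Cracks": 17, "Dents": 14,
        "Heat-treatment burns": 9, "Out of tolerance (-)": 7, "Others": 7,
        "Casting cavities": 6, "Paint defects": 4, "Datum surfaces": 3,
    }
    labels, counts = zip(*sorted(defects.items(), key=lambda kv: -kv[1]))
    total = sum(counts)
    cumulative = [sum(counts[:i + 1]) / total * 100 for i in range(len(counts))]

    fig, ax1 = plt.subplots()
    ax1.bar(labels, counts)                    # defect counts, descending
    ax1.set_ylabel("Number of defects")
    ax1.tick_params(axis="x", rotation=45)
    ax2 = ax1.twinx()
    ax2.plot(labels, cumulative, marker="o")   # cumulative percentage curve
    ax2.set_ylabel("Cumulative %")
    plt.tight_layout()
    plt.show()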

Stratification. Basically, stratification is the process of sorting data according to certain criteria or variables, the results of which are often shown in the form of charts and graphs. We can classify an array of data into different groups (or categories) with common characteristics, called the stratification variable. It is important to establish which variables will be used for sorting. Stratification is the basis for other tools, such as Pareto analysis or the scatter diagram; this combination of tools makes them more powerful.

Take the data from the check sheet (Figure 4). Figure 6 shows an example of analyzing the source of the defects. All 108 defects (100%) were classified into three categories: by shift, by worker and by operation. From the analysis of the presented data it is clear that the greatest contribution to the defects comes from shift 2 (54%) and from worker G (47%), who works in that shift.
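Stratification of this kind reduces to grouping records by a chosen variable. The sketch below groups a few hypothetical defect records by shift and by worker; the records are invented, only the idea mirrors Figure 6.

    # A stratification sketch: defect records classified by shift and by worker.
    from collections import defaultdict

    records = [  # (shift, worker, defect_type) - hypothetical data
        (2, "G", "cracks"), (2, "G", "dents"), (1, "A", "roughness"),
        (2, "G", "cracks"), (3, "B", "paint"), (2, "D", "dents"),
    ]

    by_shift = defaultdict(int)
    by_worker = defaultdict(int)
    for shift, worker, _ in records:
        by_shift[shift] += 1
        by_worker[worker] += 1

    total = len(records)
    for shift, n in sorted(by_shift.items()):
        print(f"shift {shift}: {n} defects ({n / total:.0%})")
    for worker, n in sorted(by_worker.items()):
        print(f"worker {worker}: {n} defects ({n / total:.0%})")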

Histograms. A histogram is a variant of a bar chart that displays how often the values of a product or process quality parameter fall into each of a set of value ranges.

Below is an example of building a histogram.

For convenience of calculation and construction we use the Excel spreadsheet package. Suppose we need to determine the spread of a geometric dimension, for example the diameter of a shaft whose nominal size is 10 mm. Twenty shafts were machined; the measurement data are listed in column A (Figure 7). In column B we sort the measurements in ascending order, then in cell D7 we determine the range as the difference between the largest and the smallest measured value. We choose 8 histogram intervals and determine the interval width D. We then determine the interval boundaries, that is, the smallest and largest values of the geometric parameter falling in each interval:

Xi,min = Xmin + (i − 1)·D,  Xi,max = Xmin + i·D,  where i is the interval number.

After that we count the number of parameter values falling in each of the 8 intervals, and finally we build the histogram.


Figure 7.
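The same construction is straightforward outside Excel. The following Python sketch repeats the steps described above on 20 hypothetical diameter values near the 10 mm nominal: compute the range, divide it into 8 intervals, and count the values in each.

    # A sketch of the histogram construction described above.
    measurements = [
        9.97, 9.98, 9.99, 9.99, 10.00, 10.00, 10.00, 10.01, 10.01, 10.01,
        10.02, 10.02, 10.02, 10.02, 10.03, 10.03, 10.04, 10.04, 10.05, 10.06,
    ]

    k = 8                                       # number of intervals
    r = max(measurements) - min(measurements)   # range of the data
    d = r / k                                   # interval width D
    lo = min(measurements)

    counts = [0] * k
    for x in measurements:
        i = min(int((x - lo) / d), k - 1)       # index of the interval containing x
        counts[i] += 1

    for i, c in enumerate(counts):
        left = lo + i * d
        print(f"[{left:.3f}; {left + d:.3f})  {'#' * c}  {c}")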

Scatter diagrams. Scatter diagrams are graphs that allow you to identify a correlation (statistical dependence) between various factors affecting quality indicators. The diagram is built on two coordinate axes: the value of the variable parameter is plotted along the abscissa axis, and the value of the studied parameter obtained at that value of the variable parameter is plotted along the ordinate axis; a point is placed at the intersection of these values. By collecting a large number of such points, we can carry out an analysis and draw conclusions.

Let us give an example. A company decided to conduct classes on the basics of quality management. Each month a certain number of workers were trained: in January 2 people, in February 3 people, and so on. Over the year the number of trained workers grew, reaching 40 people by the end of the year. Management instructed the quality service to track how the percentage of non-defective products accepted on first presentation, the number of customer complaints, and the electricity consumption of the workshop depend on the number of trained workers. Table 1 of monthly data was compiled and scatter diagrams were built (Figures 8, 9, 10). They clearly show that the percentage of non-defective products increases (a direct correlation) and the number of complaints decreases (an inverse correlation); the pronounced correlation is visible in the charts from the clustering of the points and their closeness to a clearly outlined trajectory, in our case a straight line. The amount of electricity consumed shows no dependence on the number of trained workers.
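A scatter diagram and a numeric correlation estimate can be produced in a few lines. The sketch below uses illustrative monthly figures (not the actual Table 1) and Python's statistics.correlation, available from Python 3.10.

    # A scatter-diagram sketch for the training example; data are illustrative.
    import matplotlib.pyplot as plt
    from statistics import correlation  # requires Python 3.10+

    trained = [2, 3, 5, 8, 12, 15, 20, 25, 28, 32, 36, 40]   # workers trained
    good_pct = [92, 92.5, 93, 94, 95, 95.5, 96, 97, 97.5, 98, 98.3, 98.7]

    plt.scatter(trained, good_pct)
    plt.xlabel("Number of trained workers")
    plt.ylabel("Non-defective products, %")
    print(f"correlation coefficient: {correlation(trained, good_pct):.2f}")
    plt.show()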

Control charts. Control charts are a special type of chart first proposed by W. Shewhart in 1924. They reflect the change of an indicator over time, for example the stability of a product dimension. In essence, control charts show the stability of the technological process, that is, whether the average parameter value stays within a corridor of permitted values bounded by the upper and lower tolerance limits. The chart data can signal that a parameter is approaching a tolerance limit, so that proactive action can be taken before the parameter enters the defect zone; this control method thus makes it possible to prevent the appearance of defects at the stage of their origin.

There are 7 main types of control charts:

    fluctuations of the average value and standard deviation, the X̄–S chart;

    fluctuations of the average value and range, the X̄–R chart;

    fluctuations of individual values, the X chart;

    fluctuations of the number of defects, the c chart;

    fluctuations of the number of defects per unit of product, the u chart;

    fluctuations of the number of defective units of product, the np chart;

    fluctuations of the proportion of defective products, the p chart.

All the charts can be divided into two groups. The first controls quantitative quality parameters, which are continuous random variables: dimensions, mass and so on. The second controls qualitative, alternative, discrete parameters (a defect is present or absent).

Table 2



Consider, for example, the X̄–S chart, which tracks fluctuations of the arithmetic mean. The tolerance corridor here is ±3S (for a normal distribution) or ±tS (for Student's distribution), where S is the standard deviation of the mean. The middle of the corridor is the arithmetic mean of the first measurement. The values on this chart are the most reliable and objective. The general form of a control chart is shown in Figure 11.
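As a rough sketch, the X̄ chart logic can be expressed as follows; the base sample and batch means are hypothetical, and the ±3S corridor follows the normal-distribution case described above.

    # A minimal X̄ control chart sketch: the centerline is the mean of the first
    # (base) sample, the corridor is ±3S; sample data are hypothetical.
    from statistics import mean, stdev

    base = [10.00, 10.02, 9.98, 10.01, 9.99]        # first (base) measurement
    center = mean(base)
    s = stdev(base) / len(base) ** 0.5              # standard deviation of the mean
    ucl, lcl = center + 3 * s, center - 3 * s       # upper / lower control limits

    batch_means = [10.00, 10.01, 9.99, 10.02, 10.04, 10.05]
    for t_idx, m in enumerate(batch_means, 1):
        flag = "ok" if lcl <= m <= ucl else "OUT OF CONTROL"
        print(f"batch {t_idx}: mean={m:.3f}  ({flag})")
    print(f"centerline={center:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}")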
