# Basic Probability and Statistics

Posted: October 17th, 2013


Probability can be defined simply as the likelihood that an event will occur (Gitlow, Oppenheim, Oppenheim & Levine, 2005). Its definition has two main branches: a classical definition and a relative frequency definition. The two differ in the method used to arrive at the probability of an event. The classical definition applies to mutually exclusive events, in which the occurrence of one event prevents the occurrence of another; the probability of an event is the number of outcomes relating to that event divided by the total number of possible outcomes. Under the relative frequency definition, probability is arrived at by dividing the number of times an event occurred by the number of times it could have occurred (Gitlow et al., 2005).
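The two definitions can be contrasted with a short sketch. The examples below are hypothetical (a fair die for the classical case, a made-up count of 7 defects in 50 inspected units for the relative frequency case) and are not taken from the text:

```python
from fractions import Fraction

# Classical definition: favourable outcomes / total equally likely outcomes.
# Hypothetical example: probability of rolling an even number on a fair die.
outcomes = [1, 2, 3, 4, 5, 6]
favourable = [o for o in outcomes if o % 2 == 0]
p_classical = Fraction(len(favourable), len(outcomes))  # 3/6 = 1/2

# Relative frequency definition: times the event occurred / times it
# could have occurred. Hypothetical example: 7 defectives in 50 units.
occurrences = 7
opportunities = 50
p_relative = occurrences / opportunities  # 0.14
```

Note that the classical figure is fixed by the structure of the experiment, while the relative frequency figure would change as more units are inspected.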

To trace the cause of the differences between the classical and relative frequency methods, an analysis is conducted to find the process characteristics underlying the relative frequency probability. Data can be classified as either attribute data or variable (measurement) data. Attribute data arise from the classification of items, or from counts of the number of items in a given unit; when the items in a group are classified, the aim is to identify the presence or absence of some quality. Variable data are obtained by measuring a characteristic of a subject, or by computation from two or more such measurements (Gitlow et al., 2005). Data can be characterized by calculating empirical statistics that summarize the information needed. Such empirical calculations are bound to contain errors, appearing as values higher or lower than the true ones. Because fully evaluating every item in a group is rarely practical, a sample is drawn from the group and the statistics are calculated within that sample.
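The distinction between variable and attribute data, and the use of a sample in place of the whole group, can be sketched as follows. The lot of 1,000 part lengths and the 9.6–10.4 specification limits are hypothetical values chosen for illustration:

```python
import random
import statistics

# Hypothetical lot of 1000 measured part lengths (variable data).
random.seed(0)
population = [random.gauss(10.0, 0.2) for _ in range(1000)]

# Studying the whole group is impractical, so draw a sample and study that.
sample = random.sample(population, 50)
sample_mean = statistics.mean(sample)

# Attribute data arise from classifying the same items, e.g. conforming
# vs nonconforming against hypothetical 9.6-10.4 specification limits.
nonconforming = sum(1 for x in sample if not 9.6 <= x <= 10.4)
fraction_nonconforming = nonconforming / len(sample)
```

The sample mean estimates the group mean with some error, which is the sense in which empirical calculations give values above or below the true ones.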

Data can be described visually by tabulation to show the probability of events, by displaying the information in frequency distributions (Gitlow et al., 2005). Frequency distributions fall into two groups: absolute frequency, the actual count for each event, and relative frequency, the percentage of the total number of observations falling in each group of data. Cumulative frequency fractions can also be used. Data can likewise be represented by graphical displays such as frequency polygons, bar charts, histograms, ogive curves and run charts.

Data can also be described numerically. Measures of central tendency include the mean, which is the total of the values divided by the number of values (Gitlow et al., 2005). Measures of variability or dispersion include the range and the standard deviation. The range is the difference between the largest and smallest values in a group. The standard deviation summarizes the distance of each value in a group from the mean, and can be used to characterize the distribution of the data as normal, skewed, or of unknown shape. Measures of shape can also be used: skewness describes unsymmetrical data, and kurtosis describes how peaked the data are. A normal probability plot is a tool for checking for approximate normality; normally distributed data should fall along a straight diagonal line. The empirical rule states that, for a stable process, the majority of the data, approximately 99.7%, should lie within three sigma units on either side of the mean (Gitlow et al., 2005).
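The descriptive measures above can be computed directly. The 20 measurements below are a hypothetical data set, used only to show the frequency distributions, mean, range, standard deviation, and a three-sigma check in one place:

```python
import statistics
from collections import Counter

# Hypothetical set of 20 measurements.
data = [4, 5, 5, 6, 6, 6, 7, 7, 7, 7, 8, 8, 8, 9, 9, 10, 10, 11, 12, 5]

# Absolute frequency: the actual count for each value.
absolute = Counter(data)
# Relative frequency: each count as a fraction of all observations.
relative = {value: count / len(data) for value, count in absolute.items()}

# Measures of central tendency and dispersion.
mean = statistics.mean(data)            # total of values / number of values
data_range = max(data) - min(data)      # largest minus smallest value
stdev = statistics.pstdev(data)         # population standard deviation

# Empirical rule: for a stable process, roughly 99.7% of values should
# fall within three sigma units on either side of the mean.
within_3sigma = sum(1 for x in data if abs(x - mean) <= 3 * stdev) / len(data)
```

For this small data set the mean is 7.5, the range is 8, and every value falls within three sigma of the mean, consistent with the empirical rule.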

References

Gitlow, H. S., Oppenheim, A., Oppenheim, R., & Levine, D. (2005). Quality management. Boston: McGraw-Hill/Irwin Publishers.
