Thursday 21 February 2013

Definition of Quality and Its Characteristics


Definition of Quality:-


1>Quality is a dynamic state associated with products, services, people, processes and environments that meets or exceeds the expectations of the customer. Quality can be expressed as the ratio of performance to expectations:-

Quality (Q) = Performance (P) / Expectations (E)

If

Q=1; customer is satisfied,

Q<1; customer is not satisfied,

Q>1; customer is delighted.
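The ratio Q = P/E and the three cases above can be sketched as a small Python helper (the function name and labels are illustrative, not from any standard):

```python
def quality_ratio(performance, expectations):
    """Return Q = performance / expectations and a satisfaction label."""
    if expectations <= 0:
        raise ValueError("expectations must be positive")
    q = performance / expectations
    if q > 1:
        return q, "delighted"
    if q == 1:
        return q, "satisfied"
    return q, "not satisfied"

print(quality_ratio(8, 10))   # (0.8, 'not satisfied')
print(quality_ratio(12, 10))  # (1.2, 'delighted')
```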

2>According to W. Edwards Deming, quality can be defined only in terms of an agent who is judging the quality.

3>According to Joseph M. Juran, quality means those features of products that meet customer needs and thereby provide customer satisfaction.

Greater customer satisfaction translates into higher income for the company.

Quality Characteristics:-


Quality characteristics may be one or more elements that define the intended quality level of a product or service.

1>Structural Characteristics:-Length of part, weight of car, etc.

2>Sensory Characteristics:-Taste of food, smell of a sweet fragrance, etc.

3>Time-oriented Characteristics:-warranty, reliability, etc.

4>Functional Characteristics:-Utility of purpose, guarantees and warranties, etc.

5>Ethical Characteristics:-Honesty, Friendliness,etc.

6>Non-Functional Characteristics:-Self-image of the user, style and appearance, etc.

 

Tuesday 19 February 2013

Linear Metrology, Vernier Calipers, Micrometer and Autocollimator




Linear metrology is defined as the science of linear measurement, for the determination of the
distance between two points in a straight line. Linear measurement is applicable to all external and
internal measurements such as distance, length and height difference, thickness, straightness,
squareness, taper, etc.
The instruments used in length metrology are generally classified into two types:
Non – precision measuring instruments, e.g., steel rule.
Precision measuring instruments, e.g., vernier calipers, micrometer.
A caliper is an end – standard measuring instrument to measure the distance between two points.
Calipers typically use a precise slide movement for inside, outside, depth or step measurements.
Types of Calipers
Inside Calipers

Outside Calipers

Spring Calipers

Centre Measuring Caliper

Vernier caliper is a measuring tool used for finding or transferring measurements (internal or
external). Internal calipers are used to check the inside diameters of pipes. External calipers
are used to determine the diameters of a round pipe or a turned spindle.
A vernier caliper is a combination of inside and outside calipers and has two sets of jaws;
one jaw (with a depth gauge) slides along a rule.
Vernier calipers are based on the principle that two scales whose divisions are near, but not alike, can be used together to read a small difference between them. This enhances the accuracy of a measurement.
The vernier caliper essentially consists of two steel rules.
A solid L-shaped beam is engraved with the main scale. This is also called the true scale, as each millimetre marking is exactly 1 millimetre apart.
On the movable measuring jaw, the vernier scale is engraved, which slides on the beam. The function of the vernier scale is to subdivide the minor divisions on the beam scale into the smallest increment that the vernier instrument is capable of measuring.
Instructions on use:
Close the jaws lightly on the object to be measured.

While measuring a round cross section make sure the axis of object is perpendicular to the
caliper.
Ignore the top scale, which is calibrated in inches and use the bottom scale, which is in
metric units.
The boldface numbers on the fixed scale are in centimeters, the tick marks on the fixed scale
between the boldface numbers are in millimeters.
Examine the vernier scale to determine which of its divisions coincide or are most coincident
with a division on the main scale. The number of these divisions is added to the main scale
reading.
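The reading rule above (main-scale value plus the coinciding vernier division times the least count) can be sketched in Python; a 0.02 mm least count is assumed for illustration:

```python
def vernier_reading(main_scale_mm, coinciding_division, least_count_mm=0.02):
    """Total reading = main-scale reading + coinciding vernier division x least count."""
    return main_scale_mm + coinciding_division * least_count_mm

# Main scale shows 24 mm and the 7th vernier division coincides:
print(round(vernier_reading(24, 7), 2))  # 24.14
```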
The vernier height gauge is one of the most useful and versatile instruments used in linear metrology for measuring, inspecting and transferring height dimensions over plane, stepped and curved surfaces.
It follows the principle of a vernier caliper and the same procedure for linear measurement.
It consists of the following parts:
Base: It is made quite robust to ensure rigidity and stability of the instrument.
Beam
Measuring jaw and scriber
Graduations
Slider

The vernier height gauge consists of a vertical beam on which main scale is engraved. The
vernier scale can move up and down over the beam.
The bracket carries vernier scale which slides vertically to match the main scale.
The whole arrangement is designed and assembled in such a way that when the tip of the
scriber blade rests on the surface plate, the zero of the main scale and vernier scale
coincides.
The scriber blade can be inverted with its face pointing upwards, which enables the determination of heights at inverted faces.
A vernier depth gauge is used to measure depth, distance from plane surface to a projection,
recess, slots and steps.
The basic parts of a vernier depth gauge are base or anvil on which the vernier scale is
calibrated along with the fine adjustment screw.
To make accurate measurements, the reference surface must be flat and free from dust and
burrs.
When the beam is brought in contact with the surface being measured, the base is held
firmly against the reference surface.
Micrometers have greater accuracy than vernier calipers and are used in most of the
engineering precision work.
Micrometers with an accuracy of 0.001 mm are also available.
Micrometers are used to measure small or fine measurements of length, width, thickness
and diameter of a job.
Principle of Micrometer:
 
A micrometer is based on the principle of screw and nut. When a screw is turned through
one revolution, the nut advances by one pitch distance, i.e., one rotation of the screw
corresponds to a linear movement of the distance equal to the pitch of the thread.
If the circumference of the screw is divided into n equal parts, then a rotation of one division will cause the nut to advance through pitch/n. The minimum length that can be measured in such a case is pitch/n.
If the screw has a pitch of 0.5 mm then after every rotation, the spindle travels axially by 0.5
mm and if the conical end of the thimble is divided by 50 divisions, the rotation of the
thimble of one division on the micrometer scale will cause the axial movement of screw
equal to 0.5/50 mm = 0.01 mm i.e. Least Count of the micrometer.
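The least-count arithmetic above can be sketched in Python; the function names are illustrative, not from any library:

```python
def micrometer_least_count(pitch_mm, thimble_divisions):
    """Least count = screw pitch / number of thimble divisions."""
    return pitch_mm / thimble_divisions

def micrometer_reading(main_scale_mm, thimble_division,
                       pitch_mm=0.5, thimble_divisions=50):
    """Reading = main-scale value + thimble division x least count."""
    lc = micrometer_least_count(pitch_mm, thimble_divisions)
    return main_scale_mm + thimble_division * lc

print(micrometer_least_count(0.5, 50))        # 0.01
print(round(micrometer_reading(7.5, 22), 2))  # 7.72
```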
The main parts of outside micrometers are the following:
1) U-shaped or C-shaped Frame: It holds all parts of the micrometer together. The gap of the
frame decides the maximum diameter or length of the job to be measured.
2) Carbide – Tipped measuring faces – Anvil and Spindle: Anvil is fixed and located at 3.5 mm
from left hand side of the frame. The carbide tipped anvil guarantees extreme precision and
long life of the instrument. The spindle is the movable measuring face, which faces the anvil, and is engaged with the nut. When the spindle face touches the anvil face, the zero of the micrometer must match the reference line on the main scale, with the thimble set at the zero division.
3) Locking Device: It is provided on a micrometer spindle to lock it in exact position. This
enables correct reading without altering the distance between two measuring faces.
4) Barrel: It has fixed engraved graduation marks on it. The graduations are above and below
the reference line. The upper graduation marks are at 1 mm intervals and are generally numbered in multiples of five, as 0, 5, 10, 15. The lower graduations are also at 1 mm intervals but are placed midway between two successive upper graduations to enable reading of 0.5 mm.
5) Thimble: When the thimble is rotated, the spindle moves in a forward or reverse axial direction, depending upon the direction of rotation. The conical edge of the thimble is divided into 50 equal parts. Multiples of 5 and 10 are engraved on it, and the thickness of the graduations is between 0.15 and 0.20 mm.
6) Ratchet: It is provided at the end of the thimble. It controls the pressure applied on the
workpiece. The ratchet gives a clicking sound when the workpiece is correctly held.

Errors

When the anvils are brought into contact without applying undue pressure, the zero of the circular thimble scale should coincide with the reference line on the barrel. If it does not, the micrometer has a zero error. It is determined by noting the number of circular thimble scale divisions by which the zero of the circular scale is left behind the reference line.

Measuring instruments operating on the principle of a screw work satisfactorily only if the screw moves in an accurately cut nut without any play. However, there is always a little play between the two, and this increases with use. It gives rise to shake and backlash, i.e., the screw does not move backward or forward for a little while even when the head is turned.

Dial gauge indicators perform mechanical amplification of length or displacement and translate it into rotational motion of a pointer over a circular scale. The amplification and translation are achieved by a rack and pinion arrangement. Dial gauges are adaptable to many linear measurements where easy readability and moderate precision are required.
Depth micrometers are used for measurements of depths, groove spacing and groove
widths.
The measurement is made between the end face of a measuring rod and a measuring face. Because the measurement increases as the measuring rod extends from the face, the readings on the barrel are reversed from the normal: they start at a maximum and finish at zero.
Dial gauges or dial test indicators are used for checking the flatness of surfaces and the parallelism of bars and rods. They can also be used for measurement of linear dimensions of jobs which require easy readability and moderate precision.
The dial is divided into 100 equal divisions; each division represents a spindle movement of 0.01 mm. For 1 mm of movement, the bigger arm turns through one complete revolution.
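The dial reading rule (100 divisions of 0.01 mm per revolution, each revolution equal to 1 mm) can be sketched in Python (the function name is illustrative):

```python
def dial_gauge_reading(revolutions, divisions):
    """Each full pointer revolution = 1 mm; each of the 100 dial divisions = 0.01 mm."""
    return revolutions * 1.0 + divisions * 0.01

# 2 full revolutions plus 35 divisions:
print(round(dial_gauge_reading(2, 35), 2))  # 2.35
```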
Interferometer: An interferometer is a tool which uses a source of monochromatic (one colour or wavelength) light. It is an optical instrument that either processes light waves to enhance an image for viewing, or analyses light waves (or photons) to determine one of a number of characteristic properties.

What is an autocollimator?
An autocollimator is an optical instrument that is used to measure small angles with very
high sensitivity. As such, the autocollimator has a wide variety of applications including
precision alignment, detection of angular movement, verification of angle standards, and
angular monitoring over long periods.
Principles of operation
the autocollimator projects a beam of collimated light. An external reflector reflects all or
part of the beam back into the instrument where the beam is focused and detected by a
photodetector. The autocollimator measures the deviation between the emitted beam
and the reflected beam. Because the autocollimator uses light to measure angles, it never
comes into contact with the test surface.
An autocollimator is used to detect and measure small angular tilts of a reflecting surface placed in front of the objective lens of the autocollimator.
An autocollimator is based on the principle that a collimating lens can project and receive a parallel beam of light and that the reflected beam of light will change its direction by changing the angle of the surface reflecting the light.
If a parallel beam of light is projected from the collimating lens and a plane reflector R is set up normal to the direction of the beam, the light will be reflected back along its own path and will be brought to a focus exactly at the position of the light source. If the reflector is tilted through a small angle θ, the parallel beam will be deflected through twice that angle, and will be brought to a focus in the same plane as the light source but to one side of it.
The image and the source will not coincide; there will be a distance x = 2fθ between them, where f is the focal length of the lens.
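The displacement relation can be inverted to recover the tilt angle from the measured image shift. A small Python sketch with illustrative values (angle in radians):

```python
import math

def tilt_angle_rad(displacement_mm, focal_length_mm):
    """theta = x / (2 f): reflector tilt recovered from the image displacement."""
    return displacement_mm / (2.0 * focal_length_mm)

# Illustrative values: 0.2 mm image shift, 500 mm focal-length collimating lens
theta = tilt_angle_rad(0.2, 500)
print(round(theta, 6))                       # 0.0002 rad
print(round(math.degrees(theta) * 3600, 1))  # 41.3 arc-seconds
```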


Measuring Instrument And Selection

A measuring instrument is any device that may be used to obtain a dimensional or angular
measurement. The important characteristics which govern the selection of instruments are measuring range, accuracy and precision.
Some instruments, such as a steel rule, may be used to read directly; others, like calipers, are used for transferring or comparing dimensions.
Transformation of a measurable quantity into the required information is a function of measuring
instruments.
Generally, measuring instruments are classified as follows:
i. On the basis of function
a. Length-measuring instruments
b. Angle measuring instruments
c. Surface-roughness measuring instruments
d. Geometrical-form-checking instruments
ii. On the basis of accuracy
a. Most accurate instruments
b. Moderate accurate instruments
c. Below moderate accurate instruments
iii. On the basis of precision
a. Precision measuring instruments
b. Non-precision measuring instruments


Methods of Measurements

 
Measurement is a set of operations carried out with the aim of determining the value of a quantity. It can be performed by various methods of measurement, depending upon the accuracy required and the amount of permissible error.
The various methods of measurement are:

Direct Method: This is the simplest method of measurement, in which the value of the quantity to be measured is obtained directly without any calculations, e.g., measurements by scales, vernier calipers, micrometers, etc. It involves contact or non-contact types of inspection. Human insensitiveness can affect the accuracy of measurement.
Indirect Method: The value of the quantity to be measured is obtained by measuring other quantities that are related to the required value, e.g., angle measurement by sine bar, or density calculation by measuring mass and dimensions to compute volume.
Absolute Method: This is also called fundamental method and is based on the measurement of the
base quantities used to define a particular quantity, e.g., measuring a quantity (length) directly in
accordance with the definition of that quantity.
Comparison Method: The value of a quantity to be measured is compared with a known value of the same quantity or another quantity related to it. In this method, only deviations from master gauges are noted, e.g., dial indicators or other comparators.
Substitution Method: The quantity is measured by direct comparison on an indicating device by replacing the measurable quantity with another which produces the same effect on the indicating device.
Coincidence Method: It is also called the differential method of measurement. In this, there is a very small difference between the value of the quantity to be measured and the reference. This difference is determined by the observation of the coincidence of certain lines or signals, e.g., measurement by vernier calipers (least count × vernier scale reading).
Transposition Method: It is the method of measurement by direct comparison in which the value of the quantity measured is first balanced by an initial known value P of the same quantity. Then the value of the quantity measured is put in place of that known value and is balanced again by another known value Q. If the position of the element indicating equilibrium is the same in both cases, the value of the quantity to be measured is √(PQ).
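The √(PQ) result can be illustrated numerically; a minimal Python sketch with illustrative known values P and Q:

```python
import math

def balanced_value(p, q):
    """If the balance position is the same for known values P and Q,
    the measured value is the geometric mean sqrt(P * Q)."""
    return math.sqrt(p * q)

# Balanced first against P = 4 units, then against Q = 9 units:
print(balanced_value(4, 9))  # 6.0
```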
Deflection Method: The value of the quantity to be measured is directly indicated by the deflection
of a pointer on a calibrated scale, e.g., dial indicator.
Method of Null Measurement: It is a method of differential measurement. In this method, the
difference between the value of the quantity to be measured and the known value of the same
quantity with which it is compared is brought to zero (null).



Metrology and Metrological Terminology

Metrology is derived from two Greek words: metron, meaning measurement, and logos, meaning science. Metrology is basically the science of measurement.
Metrology is the field of knowledge concerned with measurement and includes both theoretical and practical problems with reference to measurement.
Metrology is the name given to the science of pure measurement. Engineering metrology is restricted to measurements of length and angle.
Metrology is mainly concerned with:
Establishing the units of measurement, reproducing these units in the form of standards, and ensuring the uniformity of measurements.
Developing methods of measurement.
Analysing the accuracy of methods of measurement, researching into the causes of measuring errors and eliminating them.
Type of Metrology
Metrology is separated into following categories with different levels of complexity and accuracy:

1. Scientific Metrology

2. Industrial Metrology

3. Legal Metrology

4. Fundamental Metrology
Scientific Metrology deals with the organization and development of measurement standards and with their maintenance.
Industrial Metrology has to ensure the adequate functioning of measuring instruments used in industry as well as in production and testing processes.

Legal Metrology is concerned with the accuracy of measurements where these influence the transparency of economic transactions, and health and safety, e.g., the volume and quality of petrol purchased or the weight and quality of prepackaged flour. It seeks to protect the public against inaccuracy in trade.
Fundamental Metrology may be described as scientific metrology, supplemented by those
parts of legal and industrial metrology that require scientific competence. It signifies the
highest level of accuracy in the field of metrology.

METROLOGICAL TERMINOLOGIES
 
Accuracy is the closeness of agreement between a test result and the accepted reference value.
Precision is the closeness of agreement between independent test results obtained under stipulated conditions.
Repeatability conditions are where independent test results are obtained with the same method on
identical test items in the same laboratory by the same operator using the same equipment within
short intervals of time.
Reproducibility conditions are where test results are obtained with the same method on identical
test items in different laboratories with different operators using different equipment.
Correction is the value which, added algebraically to the uncorrected result of a measurement,
compensates for an assumed systematic error.
Drift is a slow change of a metrological characteristic of a measuring instrument.
Magnification: In order to measure small differences in dimensions, the movement of the measuring tip in contact with the work must be magnified, i.e., the output signal from a measuring instrument is magnified many times to make it more readable. In a measuring instrument, magnification may be mechanical, electrical, electronic, optical, pneumatic, or a combination of these.
Reference (accepted) value serves as an agreed-on reference for comparison, and is derived as a theoretical or established value based on scientific principles or experimental work.
Range is the capacity within which an instrument is capable of measuring.
Readability refers to the ease with which the readings of a measuring instrument can be read. If the graduation lines are very finely spaced, the scale will be more readable using a microscope, but readability with the naked eye will be poor.
Response time is the time which passes after a sudden change of the measured quantity until the instrument gives an indication differing from the true value by no more than a specified amount.
Resolution is the smallest change of the measured quantity which changes the indication of a
measuring instrument. Resolution describes the degree to which a change can be detected.
Sensitivity of the instrument denotes the smallest change in the value of the measured variable to which the instrument responds. Sensitivity describes the smallest absolute amount of change that can be detected by a measurement, often expressed in terms of millivolts or microhms.
Stability refers to the ability of a measuring instrument to constantly maintain its metrological
characteristics with time.
Standardization is a process of formulating and applying rules for an orderly approach to a specific activity, for the benefit and with the cooperation of all concerned. This is done for the promotion of overall economy, taking due account of functional conditions and safety requirements. Standardisation is thus the process of developing and implementing technical standards; by using common guidelines, groups can communicate easily and maintain focus, which facilitates processes and tasks.
Testing is a technical investigation, e.g., as to whether a product fulfils its specified performance.
Trueness is the closeness of agreement between the average value obtained from a large series of test results and an accepted reference value. The measure of trueness is usually expressed in terms of bias.
Verification is an investigation that shows that specified requirements are fulfilled.
Dead zone is the range within which a variable can vary without being detected.
Tolerance is the range of inaccuracy which can be tolerated in measurements.



INTRODUCTION TO ENVIRONMENTAL STUDIES



Definitions

The term environment is derived from the French word environner, which means 'surrounding'. It refers to an aggregate of all conditions that affect the existence, growth, and welfare of an organism or a group of organisms. The term may be defined in a number of ways.

Environment is the sum total of all social, economic, biological, physical, and chemical factors which constitute the surroundings of humans, who are both creators and moulders of the environment.
 
Environment is the sum total of influences which modify and determine the development of life and its associated characteristics.
Types of Environment
Natural Environment
It includes components such as air, water, soil, land, forest, wildlife, flora, fauna, etc.
The natural environment on Earth is divided into the following four realms:
 
Atmosphere
Biosphere
Lithosphere
Hydrosphere
Anthropogenic Environment
It includes components that have been introduced by human beings depending on their needs and requirements.
Components of Environment
The components of environment are broadly
classified as abiotic and biotic components.
 
Abiotic or non-living components of environment include all the physical and chemical factors that influence living organisms. Examples of abiotic components are air, water, soil, rocks, etc.
 
Biotic or living components are the living components of environment and include microbes, plants, animals, and human beings.
 
Environmental Studies
Environmental Studies refers to the study of the environment. It is not restricted to the point of view of one particular discipline but involves all disciplines that may affect the environment in any possible way. It involves the study and understanding of the fact that even a single phenomenon can affect the environment in a variety of ways with varying degrees of complexity, and each of these effects can be understood from different perspectives rooted in different disciplines.
Scope of Environmental Studies
Ecosystem structure and function
Natural resource conservation
Environmental pollution control
Environmental management
Environmental impact assessment
Research and development
Social development
Forest management
Environmental consulting firms
Environmental journalism
Environmentalists
 
Importance of Environmental Studies
Environmental Studies is useful in checking environmental pollution.
It helps in maintaining ecological balance.
It helps to gain skills to assess the environmental impact of human activities.
It gives us basic knowledge of the environment and associated problems.
It helps to achieve sustainable development.
It helps to educate people regarding their duties towards the protection of the environment.

Overview of Welding Technology