May 29, 2005
Process Monitoring
Chemical engineers go to great lengths to optimize industrial chemical processes, and the main reason is money: a well-optimized process consumes less energy and raw materials, produces fewer by-products to be disposed of, and yields more of the intended product at a better quality. And it allows the company's owners to make more money more rapidly.
But no industrial process can work properly without constant monitoring and control. If the process conditions go out of specification, the quality of the products will be degraded, and ultimately nasty things can happen with severe economic and safety implications. In a factory I visited, for example, a trivial error caused a whole batch of polyester resin to solidify in the mixing tank: it took weeks of work with jackhammers to clear the tank and return it to operation.
The main parameters to control in a chemical process are pressure (occasionally, degree of vacuum), temperature and the flow and composition of the feed - fluid level in tanks is another important one. There are other more subtle parameters, too: the concentration of tomato paste is measured through its refractive index, and in some cases products are monitored for density, viscosity etc. But often this is more like quality control than process control - if your process runs smoothly as planned, the product will be OK too. Modern science and engineering have invented a vast array of instruments to measure and control all of the parameters above in any range and circumstance. Indeed, it is not only the chemical industry that requires accurate process control, but also other fields: in aeronautics, a reliable measure of the fuel flow to the engines is important for both performance and safety; where my father used to work there were huge vacuum current rectifiers, and big vacuum systems aren't trivial to control. Mercifully, they have been replaced with solid-state power rectifiers - but those have a quite comprehensive control system as well.
A basic instrument in the temperature control field is the thermocouple: the junction between two different metals or alloys produces a potential difference across it that depends almost linearly on the temperature of the junction within a certain temperature range.
This potential difference is measured with instruments that are basically voltmeters calibrated to display a temperature in degrees instead of a voltage in millivolts. For practical applications, the two sensing wires are enclosed in a sheath - made of stainless steel for the thermocouples I currently use and insulated with magnesium oxide powder in order to work up to 1200 °C - and there are a few different types of thermocouples (K, N, T etc.) for different temperature ranges. Thermocouples are reliable, rugged and fairly cheap devices to measure temperature in almost any range of practical importance and in a variety of situations - ovens, furnaces, inside pipes and chemical reactors; in the presence of aggressive fluids, mechanical stress, vibrations etc.
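Just to give an idea of what the indicator electronics do, here is a rough sketch in Python of the conversion from millivolts to degrees for a type K thermocouple. The ~41 µV/°C sensitivity is only a linear approximation - real instruments use the full NIST polynomial tables and proper cold-junction compensation - and the numbers in the example are made up.

# Rough sketch: converting a type K thermocouple reading to temperature.
# Real instruments use the NIST polynomial tables and proper cold-junction
# compensation; the 41 uV/degC sensitivity below is only a linear
# approximation, and the example reading is made up.

SEEBECK_K_UV_PER_C = 41.0  # approximate type K sensitivity, microvolts per degC

def thermocouple_temperature_c(measured_uv: float, cold_junction_c: float = 25.0) -> float:
    """Estimate hot-junction temperature from the measured thermoelectric voltage.

    measured_uv     -- voltage across the thermocouple, in microvolts
    cold_junction_c -- temperature of the reference (cold) junction, in degC
    """
    return cold_junction_c + measured_uv / SEEBECK_K_UV_PER_C

# Example: a 12.2 mV reading with the cold junction at 25 degC
print(round(thermocouple_temperature_c(12200.0), 1))  # ~322 degC with this linear approximation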
Other temperature sensors are platinum resistance thermometers ("thermoresistances"), based on the property of platinum wires whose electrical resistance increases almost linearly with temperature. These can be very accurate, but are also more expensive than thermocouples - not really because they use small amounts of platinum, but because their whole construction is more complicated and they require more complex indicator electronics.
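For the curious, the standard way to turn a platinum sensor's resistance into a temperature is the Callendar-Van Dusen equation; a minimal sketch for a Pt100 above 0 °C, using the standard IEC 60751 coefficients, could look like this.

import math

# Sketch: temperature of a Pt100 from its measured resistance, using the
# Callendar-Van Dusen equation R(T) = R0 * (1 + A*T + B*T^2), valid from 0 to 850 degC.
# A and B are the standard IEC 60751 coefficients for industrial platinum RTDs.
R0 = 100.0            # resistance at 0 degC, ohms (Pt100)
A = 3.9083e-3
B = -5.775e-7

def pt100_temperature_c(resistance_ohm: float) -> float:
    """Invert the quadratic R(T) and take the physically meaningful root."""
    c = 1.0 - resistance_ohm / R0   # constant term of B*T^2 + A*T + c = 0
    return (-A + math.sqrt(A * A - 4.0 * B * c)) / (2.0 * B)

print(round(pt100_temperature_c(138.51), 2))  # 138.51 ohm corresponds to ~100 degC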
It's not that the good old dial or fluid thermometers do not work anymore, but they cannot be used in many situations and are terrible if you need to turn the temperature reading into an electrical signal to be transmitted somewhere else.
Pressure nowadays is most often measured using strain-gauge pressure transducers: at the core of these devices there is a small semiconductor (silicon) resistor whose resistance is proportional to the strain applied to it. A strain gauge can be used for different applications, but in a pressure transducer it is attached to a stainless steel diaphragm that is deformed by the pressure of the process fluid. When this resistor is excited with a fixed voltage (usually 10 V) and inserted in an electric circuit called a Wheatstone bridge, it gives an output in millivolts proportional to the fluid pressure. These transducers are compact, rugged and often all-welded stainless steel for corrosion resistance. Not exactly cheap (I have a very high-accuracy one that cost about £500), but an excellent way to measure pressure. As I said above, gauge and mercury manometers are still used occasionally, but they're limited to indication and cannot transmit data.
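The maths needed to turn the bridge output into a pressure reading is just a linear scaling against the transducer's calibration data. Here is a sketch with made-up specifications (2 mV/V full-scale output, 100 bar range, 10 V excitation); real values come from the calibration sheet.

# Sketch: scaling a strain-gauge pressure transducer's bridge output to pressure.
# The transducer specs below (2 mV/V full-scale output, 100 bar range, 10 V
# excitation) are hypothetical; real values come from the calibration sheet.

EXCITATION_V = 10.0          # bridge excitation voltage
FULL_SCALE_MV_PER_V = 2.0    # rated output at full-scale pressure, mV per volt of excitation
FULL_SCALE_BAR = 100.0       # pressure corresponding to full-scale output
ZERO_OFFSET_MV = 0.05        # measured output at zero pressure (zero offset)

def pressure_bar(output_mv: float) -> float:
    """Convert the measured bridge output (in millivolts) to pressure in bar."""
    full_scale_mv = FULL_SCALE_MV_PER_V * EXCITATION_V   # 20 mV at 100 bar here
    return (output_mv - ZERO_OFFSET_MV) / full_scale_mv * FULL_SCALE_BAR

print(round(pressure_bar(10.05), 2))  # 10.05 mV -> 50.0 bar with these example specs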
For the task of measuring flow, there are different types of flowmeters: one is the turbine meter, the same kind used in domestic water and gas meters, in which the process fluid spins a small turbine and the volume that has flowed is calculated by counting the rotations of the turbine. Cheap and simple, but not very accurate or suitable for certain situations.
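The electronics of such a meter essentially just count pulses and divide by a calibration constant (the K-factor). A sketch with made-up numbers:

# Sketch: totalizing flow from a turbine meter. Each rotation of the turbine
# produces a known number of pulses; the K-factor (pulses per litre) is a
# hypothetical calibration value, as are the pulse counts in the examples.

K_FACTOR_PULSES_PER_L = 450.0   # pulses per litre, from calibration (hypothetical)

def volume_litres(pulse_count: int) -> float:
    """Total volume that has passed through the meter."""
    return pulse_count / K_FACTOR_PULSES_PER_L

def flow_l_per_min(pulses_in_interval: int, interval_s: float) -> float:
    """Average volumetric flow over a counting interval."""
    return volume_litres(pulses_in_interval) / interval_s * 60.0

print(round(volume_litres(90000), 1))        # 90000 pulses -> 200.0 L
print(round(flow_l_per_min(750, 10.0), 2))   # 750 pulses in 10 s -> 10.0 L/min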
The most popular gas flow meters and controllers are based on the thermal conductivity and heat capacity of gases: simplifying, the instrument measures the electrical power required to keep a sensor immersed in the gas stream at a fixed temperature, and that power is proportional to the gas flow. The great advantage of this kind of meter is that it measures the true mass flow of the fluid, not its volume, which depends on temperature and pressure. Good thermal mass flow meters are expensive, but work really well.
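The underlying idea is a simple energy balance: the heat fed to the stream is carried away at a rate equal to mass flow times heat capacity times temperature rise. A simplified sketch follows; real meters rely on calibration curves, and the figures here are only illustrative.

# Sketch: the simplified energy balance behind a thermal mass flow meter.
# Heat added to the gas stream is carried away at a rate m_dot * cp * dT,
# so measuring the heater power needed to hold a fixed temperature rise gives
# the mass flow directly. Real meters use calibration curves rather than this
# idealized balance; the numbers below are illustrative only.

CP_NITROGEN_J_PER_G_K = 1.04   # specific heat of nitrogen at constant pressure

def mass_flow_g_per_s(heater_power_w: float, delta_t_k: float,
                      cp_j_per_g_k: float = CP_NITROGEN_J_PER_G_K) -> float:
    """Mass flow from the heater power and the temperature rise it maintains."""
    return heater_power_w / (cp_j_per_g_k * delta_t_k)

# 0.52 W maintaining a 10 K rise in a nitrogen stream -> 0.05 g/s
print(round(mass_flow_g_per_s(0.52, 10.0), 3))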
Other flow meters are based on the Coriolis force, the force exerted by a fluid in motion on the walls of an omega-shaped tube. From what I've seen, these are generally high-performance, very expensive instruments. Flow controllers are simply flowmeters with a needle valve downstream of the metering element - but more on this some other day.
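To first order, the Coriolis twist of the vibrating tube shows up as a tiny time lag between two pickup sensors, and that lag is proportional to the mass flow; the meter constant comes from calibration. A sketch with a made-up constant:

# Sketch: the first-order relation used in Coriolis meters. The twist of the
# vibrating tube appears as a small time lag between the inlet and outlet
# pickup coils, proportional to the mass flow. The meter constant below is a
# hypothetical calibration value.

METER_CONSTANT_KG_S_PER_US = 0.8   # kg/s of flow per microsecond of time lag (hypothetical)

def coriolis_mass_flow_kg_s(time_lag_us: float) -> float:
    """Mass flow from the measured pickup time lag, in microseconds."""
    return METER_CONSTANT_KG_S_PER_US * time_lag_us

print(round(coriolis_mass_flow_kg_s(2.5), 2))  # 2.5 us lag -> 2.0 kg/s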
There are even flowmeters based on magnetohydrodynamics that can measure the flow (of liquids only?) from the outside of a pipe, if I remember correctly.
Once you meter the feed to your chemical reactor properly, you don't really need to measure its composition. Anyway, there are also a lot of different sensors and instruments to measure in real time the concentration of a single chemical or a class of chemicals, or to perform an almost complete analysis of the products. This is the vast field of analytical chemistry: you can use IR spectrometers, mass spectrometers, membrane sensors, other kinds of sensors based on different chemical or physical properties...
The problem of measuring the liquid level inside a sealed tank may seem trivial, but it is not, especially if great accuracy is required. The classical sinker - or sight glass - simply cannot do the job. Thus, ultrasonic and radar level meters have been invented (maybe also laser ones, but a laser beam isn't really something you want to mix with flammable liquids): their operating principle is rather simple - send out some kind of beam that will be reflected, and measure how long it takes to go out and come back in order to calculate the distance from the instrument to the liquid surface.
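The calculation itself is elementary: half the round-trip time multiplied by the speed of the beam gives the distance to the surface. Here is a sketch for an ultrasonic meter; the speed of sound actually depends on the gas in the headspace and its temperature, which real instruments compensate for, and the tank dimensions are made up.

# Sketch: the time-of-flight calculation behind an ultrasonic level meter
# mounted at the top of a tank. The speed of sound (343 m/s in air at ~20 degC)
# depends on the gas in the headspace and its temperature, so real instruments
# compensate for it; here it is taken as a constant for illustration.

SPEED_OF_SOUND_M_S = 343.0

def liquid_level_m(echo_time_s: float, tank_height_m: float) -> float:
    """Liquid level from the echo round-trip time of the ultrasonic pulse."""
    distance_to_surface = SPEED_OF_SOUND_M_S * echo_time_s / 2.0  # one-way distance
    return tank_height_m - distance_to_surface

# A 5.8 ms round trip in a 3 m tall tank leaves the surface ~1 m below the sensor
print(round(liquid_level_m(0.0058, 3.0), 2))  # ~2.01 m of liquid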
Soon, I will explain how the values thus measured are used to effectively control chemical processes.
Update 30/05: I forgot to explain something about vacuum gauges. Pressure gauges and transducers work because fluids exert a pressure that in turn causes a slight deformation of some metallic part; this deformation is proportional to the applied pressure and can be measured. But under vacuum, the pressure exerted by the few remaining gas molecules is too low to deform anything, so other methods must be used. There are vacuum gauges based on thermal conductivity and others based on electrical conductivity (ionization gauges). Vacuum systems have quite a lot of applications in various fields: for example, mass spectrometers must operate in high vacuum, and crucial steps of semiconductor manufacturing need to be carried out in vacuum chambers.
It may seem easy to seal a vacuum system - after all, the pressure differential is just 1 bar. But in practice, it's rather difficult: many processes are very delicate, and even tiny infiltrations of air or other gases can totally cock them up.