Moisture measurement technologies for natural gas

G. McKeogh, Panametrics, a Baker Hughes Business

The measurement of moisture in natural gas is an important parameter for the processing, storage and transportation of natural gas globally. Natural gas is dehydrated prior to its introduction into the pipeline and distribution network. However, attempts to reduce dehydration result in a reduction in “gas quality” and an increase in maintenance and transportation costs, as well as potential safety issues. Consequently, to strike the right balance, it is important that the water component of natural gas is measured precisely and reliably. Moreover, in custody transfer of natural gas between existing and future owners, maximum allowable levels are set by tariffs, normally expressed in terms of absolute humidity (mg/m3 or lbs/MMsft3) or dewpoint temperature. Several technologies exist for online measurement and for spot sampling of moisture content. This article reviews the most commonly used moisture measuring instruments and provides a comparison of those technologies.

Water vapor in natural gas. Prior to transportation, water is separated from raw natural gas. However, some water remains present in the gaseous state as water vapor. If the gas cools or contacts any surface that is colder than the prevailing dewpoint temperature of the gas, water will condense in the form of liquid or ice. Under pressure, water also has the unique property of being able to form a lattice structure around hydrocarbons, such as methane, to form solid hydrates. Ice or solid hydrates can cause blockages in pipelines. In addition, water combines with gases such as hydrogen sulfide (H2S) and carbon dioxide (CO2) to form corrosive acids. Water in natural gas also increases the cost of transportation in pipelines by adding mass, and since water vapor has no calorific or heating value, it adds to the expense of compression and transportation. When natural gas is sold, there are contractual requirements to limit the concentration of water vapor. In the U.S., the limit or tariff is expressed as absolute humidity in units of pounds per million standard cubic feet (lbs/MMsft3). The maximum absolute humidity for interstate transfer is set at 7 lbs/MMsft3. In Europe, bodies such as the European Association for the Streamlining of Energy Exchange-gas (EASEE-gas) make recommendations on the maximum permissible amount of water vapor in gas. EASEE-gas has approved a limit of –8°C dewpoint, referenced to a gas pressure of 70 bar(a). This recommended limit is generally adhered to across the European gas industry.
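To put the two tariff conventions on a common basis, the limit can be converted between units with simple arithmetic. The minimal Python sketch below assumes both gas volumes are referenced to the same standard conditions (the factor shifts slightly if a different reference temperature is used) and converts the 7-lbs/MMsft3 U.S. limit to absolute humidity in mg/m3:

# Sketch only: assumes both gas volumes are referenced to the same standard
# conditions; the factor shifts slightly if, e.g., a "normal" cubic meter at
# 0 C is used instead of a 60 F standard cubic foot.
MG_PER_LB = 453_592.37     # milligrams per pound
M3_PER_FT3 = 0.0283168     # cubic meters per cubic foot

def lbs_per_mmscf_to_mg_per_m3(tariff_lbs: float) -> float:
    # Absolute humidity in mg of water per cubic meter of gas.
    mg_water = tariff_lbs * MG_PER_LB
    m3_gas = 1.0e6 * M3_PER_FT3
    return mg_water / m3_gas

print(round(lbs_per_mmscf_to_mg_per_m3(7.0), 1))   # the 7-lbs/MMsft3 limit is ~112 mg/m3

This works out to roughly 112 mg/m3, the figure commonly quoted alongside the 7-lbs/MMsft3 tariff.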

Instrument technologies for measuring water vapor in natural gas. Various viable technologies exist for measuring the amount of water vapor in natural gas. These tend to rely on sample conditioning systems, where a gas sample is extracted and filtered, the pressure regulated and the flow controlled. It is not advisable to install a sensor directly in a natural gas pipeline, as the gas can contain physical contaminants (e.g., rust, scale), additives (e.g., odorizers, antifreeze agents such as methanol) and liquid hydrocarbons. Another benefit of a sampling system is that it can be isolated from the main pipeline. However, the sample system must not alter the moisture concentration of the sample via leaks or desorption/adsorption from the wetted components. Currently, the most widely used measurement technologies are chilled mirror, impedance sensors, quartz microbalance, Fabry-Perot interferometer and tunable diode lasers. Each technology has its advantages and disadvantages.

Chilled mirrors. There are two basic categories of chilled mirror hygrometers: manually operated and automated. Automated chilled mirrors are further categorized into cycling chilled mirrors and equilibrium chilled mirrors. Chilled mirrors measure the dewpoint/frost point temperature directly by using a coolant or thermoelectric heat pump to cool a plane surface until condensation forms. When the mass of condensate on the mirror is in equilibrium with the surrounding gas sample, the temperature of the mirror is equal to the dewpoint or frost point temperature.  
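Because the equilibrium mirror temperature equals the dew/frost point, the corresponding water vapor pressure can be estimated with a standard saturation-vapor-pressure correlation. The short Python sketch below uses Magnus-type coefficients, one common set among several published variants; the specific correlation is an assumption for illustration, not taken from the instrument descriptions in this article:

import math

def saturation_vapor_pressure_hpa(dew_frost_point_c: float) -> float:
    # Magnus-type approximation for the saturation vapor pressure of water
    # (hPa): coefficients over water above 0 C, over ice below 0 C.
    # Coefficient sets vary slightly between published references.
    if dew_frost_point_c >= 0.0:
        a, b = 17.62, 243.12      # over water
    else:
        a, b = 22.46, 272.62      # over ice
    return 6.112 * math.exp(a * dew_frost_point_c / (b + dew_frost_point_c))

# Example: a -8 C frost point (the EASEE-gas limit) corresponds to a water
# vapor partial pressure of roughly 3.1 hPa, neglecting enhancement effects.
print(round(saturation_vapor_pressure_hpa(-8.0), 2))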

Chilled mirrors can also be used to determine the hydrocarbon dewpoint. In gas mixtures containing heavier hydrocarbons, the partial pressure of hydrocarbons is sufficiently high that cooling the gas will result in a phase change from gas to liquid. By the same principle, the temperature at which hydrocarbon condensate is in equilibrium with the sample gas is the “hydrocarbon dewpoint.”

Manual chilled mirrors typically use the expansion of high-pressure gas as the coolant. The manual chilled mirror apparatus (also referred to as the Bureau of Mines-type) is described in ASTM D1142. When high-pressure gases such as methane or CO2 are decompressed, cooling occurs due to the Joule-Thomson effect. The user observes the onset of condensation via a view port while the mirror’s surface is cooling. The rate of cooling is important: if the cooling rate is too rapid, condensation occurs prior to thermal stability. ASTM D1142 provides a procedure consisting of repeating the test several times and successively slowing the cooling rate at the observed onset of condensation. The user also must learn to identify the difference between water and hydrocarbon condensate. Water appears as fine droplets or fog, ice appears as opaque crystals, while hydrocarbon liquids appear as a shiny film. In some designs, a matte black or ablated surface is used for hydrocarbons, while a polished metal surface is used for water. The dew/frost reading is subjective, as each operator must decide when condensation occurs and then identify the condensate. Manual chilled mirrors are typically used for spot-checks and do not lend themselves to providing online continuous readings or telemetry.

Automatic chilled mirrors (FIG. 1) utilize a thermoelectric cooling module coupled to a mirror.

The cooling module consists of a multi-stage stack of arrays of P-N junctions arranged in a back-to-back orientation. When direct current is applied to the P-N junctions, electrons flow from the P-type material, leaving holes; the holes are filled by heat energy that flows from the mirror. The P-N junctions are additionally thermally coupled to a metal heat sink. If the polarity of the current is reversed, the mirror is heated. Visible or infrared light is emitted and aligned to reflect off the mirror. The reflected light is received by a photodetector. When the mirror is cooled sufficiently, water vapor condenses on the mirror and the light received by the photodetector decreases due to both absorption and scattering of the light. The signal from the photodetector is then utilized in a feedback control loop to maintain a constant mass of condensate on the mirror. A precision platinum resistance temperature detector (PRTD) measures the temperature of the mirror. The heat pump can also be augmented by refrigeration (evaporator core) or a liquid coolant block.
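The feedback loop can be pictured with a simplified control sketch. The Python code below is only illustrative: read_photodetector(), read_mirror_prtd() and set_tec_current() are hypothetical stand-ins for the instrument's hardware interface, and a basic proportional-integral law replaces whatever control algorithm a real analyzer uses.

import time

def control_mirror(read_photodetector, read_mirror_prtd, set_tec_current,
                   setpoint=0.85, kp=2.0, ki=0.1, dt=0.1):
    # Hold the reflected-light signal at 'setpoint' so that the condensate
    # layer on the mirror stays at constant mass; the PRTD reading then
    # equals the dew/frost point temperature of the sample gas.
    integral = 0.0
    while True:
        error = read_photodetector() - setpoint       # signal too high -> mirror too dry -> cool more
        integral += error * dt
        set_tec_current(kp * error + ki * integral)   # positive current cools the mirror
        print(f"dew/frost point: {read_mirror_prtd():.2f} C")
        time.sleep(dt)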

The overall measurement capability of a typical chilled mirror is –80°C to > 85°C dewpoint. The number of thermoelectric cooling stages, plus any auxiliary cooling, governs the full range. This system offers excellent precision and is widely used to provide laboratory reference standards for calibration and metrology applications. However, the footprint and design of most chilled mirror systems mean that they have been largely restricted to laboratory applications. Some units on the market can be used for applications in non-hazardous areas, thereby offering the precision and repeatability previously found only in laboratory standard instruments. As a result, these instruments can be used both to calibrate existing impedance-type sensors onsite and as high-accuracy, highly stable humidity sensors. The pros and cons of chilled mirror hygrometers are listed in TABLE 1.

Impedance sensors. The most widely used impedance-based moisture sensor technology for natural gas is the metal oxide sensor and, specifically, the aluminum oxide sensor (FIG. 2). While there are variations in design, the most widely used sensors consist of an aluminum base that has a thin layer of aluminum oxide deposited or grown on the surface by means of an anodization process.

A thin layer of porous gold is deposited over the oxide. On a microscopic level, the aluminum oxide appears as a matrix with many parallel pores. When exposed to even small amounts of water vapor, this pore structure enables water molecules to permeate into the matrix, where micro-condensation occurs. Since the dielectric constant of water is significantly higher than that of a dry gas (roughly an 80:1 ratio relative to nitrogen or standard air), each pore acts as a micro-capacitor. As the micro-capacitors are in a parallel arrangement, the total capacitance is additive. In essence, the sensor acts as a water molecule counter.

The sensor is excited with a low voltage alternating current at a fixed frequency. The impedance of the sensor relates to the water vapor pressure by the following relationship (Eq. 1): 

Each sensor is calibrated at multiple dew/frost points, the partial pressure of water being a function of the dew/frost point temperature. The impedance at each dew/frost point is recorded and entered into a digital table, either embedded in the memory of the sensor module or programmed into an analyzer. The analyzer utilizes a polynomial expansion equation, by reference to the look-up table, to convert the measured impedance into a direct readout of dew/frost point temperature. Typical accuracy is ±2°C Td from > 60°C down to –100°C Td, and ±3°C Td below –100°C Td.
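The conversion from impedance to dew/frost point can be sketched in a few lines of Python. The calibration values below are placeholders, and simple piecewise-linear interpolation in log-impedance stands in for the polynomial expansion described above, purely to keep the example short:

import numpy as np

# Hypothetical calibration table for one sensor: impedance recorded at known
# dew/frost points during calibration. All values here are placeholders.
cal_dewpoints_c = np.array([-100.0, -80.0, -60.0, -40.0, -20.0, 0.0, 20.0])
cal_impedance_ohm = np.array([2.0e6, 9.0e5, 3.5e5, 1.2e5, 4.0e4, 1.5e4, 6.0e3])

def impedance_to_dewpoint(z_measured_ohm: float) -> float:
    # Interpolate in log-impedance because the impedance spans several
    # orders of magnitude; np.interp requires ascending x values.
    log_z = np.log10(cal_impedance_ohm)
    order = np.argsort(log_z)
    return float(np.interp(np.log10(z_measured_ohm),
                           log_z[order], cal_dewpoints_c[order]))

print(round(impedance_to_dewpoint(7.0e4), 1))   # about -30 C with this placeholder table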

In general, impedance sensors provide an excellent response to moisture changes in the dry-to-wet direction. However, they have significantly longer response times in the wet-to-dry direction. Aluminum oxide sensors are subject to drift over time. The typical drift is around 2°C per year, and this can be managed by a regime of recalibration. Since aluminum oxide sensors are economical, users very often maintain additional sensors that are rotated in and out of service, thus always keeping the in-service sensors within their recommended recalibration interval (typically one year).

Aluminum oxide sensors have the capability to be installed at high pressures (up to 5,000 psig), and their footprint is quite compact. The sensors, however, are seldom installed directly in the pipeline. Instead, an extraction-type sampling system is utilized, allowing the gas to be filtered, the pressure to be regulated and the flowrate to be controlled. The pros and cons of impedance sensors are listed in TABLE 2.  

Quartz microbalance hygrometers. Quartz microbalance hygrometers (FIG. 3) consist of a quartz substrate that is coated with a hygroscopic polymer film. When a voltage is applied, the quartz oscillates at a resonant frequency. When the sensor is exposed to gas with water vapor, water is adsorbed by the hygroscopic coating, and the resonant frequency changes in accordance with the increased mass of the sensor. The amount of water adsorbed by the sensor’s coating is proportional to the partial pressure of the surrounding water vapor.
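The mass-to-frequency relationship is commonly described by the Sauerbrey relation, which is not quoted in the text above but is standard for quartz crystal microbalances. The Python sketch below uses illustrative crystal parameters to show the scale of the effect:

import math

RHO_Q = 2.648        # density of quartz, g/cm^3
MU_Q = 2.947e11      # shear modulus of AT-cut quartz, g/(cm*s^2)

def mass_loading_ug_per_cm2(f0_hz: float, delta_f_hz: float) -> float:
    # Sauerbrey relation: mass added per unit area for a resonant-frequency
    # shift delta_f of a crystal with fundamental frequency f0. A negative
    # frequency shift corresponds to added mass.
    sensitivity = 2.0 * f0_hz ** 2 / math.sqrt(RHO_Q * MU_Q)   # Hz*cm^2/g
    return -delta_f_hz / sensitivity * 1.0e6                   # micrograms/cm^2

# Illustrative numbers only: a 5-MHz crystal shifting down by 10 Hz carries
# roughly 0.18 micrograms of adsorbed water per cm^2.
print(round(mass_loading_ug_per_cm2(5.0e6, -10.0), 3))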

Quartz microbalance sensors have a certain degree of hysteresis and must be “re-zeroed” periodically. The measurement system therefore requires a zero gas, i.e., a high-purity gas used for calibrating instruments. While no gas supply can be absolutely free of moisture, a zero gas may be defined as a gas containing no significant amount of water relative to the measurement. Typical accuracy is ±10% of the reading from 1 ppmv–2,500 ppmv.

Some measurement modes employ a non-equilibrium technique, where the sensor alternates between exposure to the zero gas and the process gas. The offline time spent on the zero gas should be factored into response time requirements.

The sensing surface is also susceptible to contamination and must remain clean. A suitable sampling system must be employed. Quartz microbalance analyzers are characterized by having relatively fast response times. The pros and cons of quartz microbalance sensors are detailed in TABLE 3.  

Fabry-Perot hygrometers. The sensor head in Fabry-Perot-type hygrometers consists of a multi-layered structure comprising materials with high and low refractive indices (FIG. 4).

Typical materials used are silicon dioxide (SiO2) and zirconium dioxide (ZrO2). The sensor head is coated with a glass substrate with a maximum surface pore size of < 0.4 nm, making the structure specific to water molecules (molecular diameter of approximately 0.28 nm). A light beam is transmitted through the sensor via a fiber optic cable. The light source is generally a light-emitting diode (LED). As water molecules penetrate the sensor’s surface, they change the refractive index of the layered structure (water has a refractive index of 1.33, compared with approximately 1.0 for air), causing a change in the wavelength of the interference pattern. The wavelength change is proportional to the amount of water equilibrated on the sensor. The refracted light is detected by a polychromator, and the reading is calibrated in terms of dewpoint temperature vs. wavelength shift.
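The effect can be illustrated with the basic constructive-interference condition for a thin optical layer, 2nd = m*lambda. The Python sketch below uses purely illustrative thickness, order and refractive-index values (not vendor data) to show how a small effective-index change from adsorbed water shifts an interference peak:

def peak_wavelength_nm(n_eff: float, thickness_nm: float, order: int) -> float:
    # Constructive-interference condition for a simple etalon: 2*n*d = m*lambda.
    return 2.0 * n_eff * thickness_nm / order

# Illustrative values only (not vendor data): a 2-um sensing layer observed at
# the 10th interference order, with a small effective-index increase caused by
# water equilibrating in the pores.
dry_peak = peak_wavelength_nm(1.450, 2000.0, 10)
wet_peak = peak_wavelength_nm(1.452, 2000.0, 10)
print(round(dry_peak, 1), round(wet_peak, 1), round(wet_peak - dry_peak, 2))  # ~0.8-nm shift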

The sensor is mounted on the end of a stainless-steel probe and connected via a fiber optic cable to the control unit. The unit requires temperature compensation and pressure compensation if a ppmv readout is required. The pros and cons of Fabry-Perot hygrometers are detailed in TABLE 4.  

Tunable diode laser absorption spectrometer (TDLAS) hygrometers. TDLAS hygrometers (FIG. 5) offer a fully non-contact method of continuous moisture measurement in natural gas.

The measuring principle is based on the Beer-Lambert Law (Eq. 2):  
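In its general form, the law states that $I = I_0 \, e^{-\varepsilon c \ell}$, where $I_0$ is the incident light intensity, $I$ is the transmitted intensity, $\varepsilon$ is the absorption coefficient of water at the selected wavelength, $c$ is the water concentration and $\ell$ is the optical path length through the gas sample.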

The Beer-Lambert principle states that when light energy at certain wavelengths travels through gas, a certain amount of energy is absorbed by the water within the path. The amount of light energy lost is related to the concentration of water.  

A diode laser is very similar to an LED in that when a current is injected into a P-N junction, holes and electrons recombine and release photons.  

A diode laser stimulates the release of these photons and incorporates an optical cavity to create laser oscillation and the release of a beam of coherent light at a single wavelength or frequency.  

It is possible to change the frequency of the emitted light by changing the temperature of the laser diode or its injection current. The monochromatic frequency can also be modulated. Consequently, by passing light at a water absorption frequency through a sample chamber containing natural gas of a certain moisture content, it is possible to precisely establish the water content by measuring the amount of loss in the absorption spectrum.

In practice, a TDL system measures the water concentration in natural gas by sweeping a narrow-band laser diode, changing its wavelength by ramping the injection current while holding the temperature constant with a thermoelectric (Peltier) heat pump array. The laser is also modulated at a high frequency. At the center frequency, the second harmonic (known as 2F) peak height is measured. The 2F peak height is directly proportional to the partial pressure of water in the absorption cell. By simultaneously measuring the cell’s total pressure, the concentration in ppmv is determined. Boyle’s law, which relates the pressure of a gas to its volume, is then applied. The simultaneous measurement of the gas temperature and process pressure enables other humidity parameters, such as absolute humidity, dewpoint and process dewpoint, to be determined with a high degree of precision.
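The final step of that calculation can be sketched in a few lines. The calibration constant and readings below are hypothetical; the code simply shows how a 2F peak height that is proportional to the water partial pressure, combined with the measured cell pressure, yields a mole fraction in ppmv:

# Hypothetical calibration: 2F peak height (arbitrary units) produced per mbar
# of water partial pressure, established when the absorption cell is calibrated.
PEAK_PER_MBAR = 1250.0

def ppmv_from_2f(peak_height_au: float, cell_pressure_mbar: float) -> float:
    # Partial pressure of water from the 2F peak, then mole fraction (ppmv)
    # from the simultaneously measured total pressure of the cell.
    p_water_mbar = peak_height_au / PEAK_PER_MBAR
    return 1.0e6 * p_water_mbar / cell_pressure_mbar

print(round(ppmv_from_2f(150.0, 1000.0), 1))   # 150 a.u. at 1,000 mbar -> 120 ppmv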

The typical accuracy of a TDLAS hygrometer for natural gas is 1% of reading in terms of the mole fraction or ppmv. By simultaneously measuring the temperature and pressure, the absolute humidity and dew/frost point temperature are determined with high precision using psychrometric equations. Measurement of the process line pressure also enables these units to calculate the pressure dewpoint.
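As an illustration of such a calculation, the sketch below converts a measured mole fraction and line pressure into an approximate pressure dewpoint by inverting a Magnus-type correlation. Ideal mixing is assumed and enhancement factors are neglected, so this is a simplification of the rigorous psychrometric equations a real analyzer applies:

import math

def pressure_dewpoint_c(ppmv: float, line_pressure_bara: float) -> float:
    # Water partial pressure at line conditions (hPa), assuming ideal mixing,
    # then inversion of a Magnus-type correlation over water. Enhancement
    # factors and real-gas effects are neglected in this sketch.
    p_water_hpa = ppmv * 1.0e-6 * line_pressure_bara * 1000.0
    a, b = 17.62, 243.12
    x = math.log(p_water_hpa / 6.112)
    return b * x / (a - x)

# Example: 120 ppmv of water at 70 bar(a) corresponds to a pressure dewpoint
# of roughly 4 C-5 C under these simplifying assumptions.
print(round(pressure_dewpoint_c(120.0, 70.0), 1))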

TDLAS hygrometers are characterized by very fast response times. The optical response occurs in < 2 sec; however, it takes time to purge the absorption cell and sampling system. Typical system response times are < 5 min for a 90% step change. The pros and cons of TDLAS hygrometers are detailed in TABLE 5.

Takeaways. The technology with the widest measurement range is typically the impedance-type sensor, which can measure from –110°C to > 60°C dewpoint. The narrowest measurement range belongs to the automatic chilled mirror, which is constrained by the number of stages (cooling capacity) of the sensor installed with the device. Fabry-Perot-type analyzers have range capability similar to impedance-type sensors. The TDLAS technology's range of measurement is determined by the type of measurement cell used in the device. A standard measurement cell has a typical lower detectable limit of 2 ppmv, while the latest instruments extend this to sub-100 parts per billion by volume (ppbv) levels. The top of the range can be from 2,000 ppmv–5,000 ppmv. Quartz microbalance instruments range down to 0.1 ppmv, with upper ranges of 1,000 ppmv–2,000 ppmv.

In terms of accuracy, the automatic chilled mirror technology is the most precise, offering a typical accuracy of 0.1°C–0.5°C dewpoint. The TDLAS unit is the next most precise instrument, with a typical accuracy of ±1% of reading; the accuracy expressed in terms of dewpoint will vary due to the nonlinear relationship between concentration and dewpoint.

The most stable or drift-free technologies are TDLAS and chilled mirror.  

Lasers, by their nature, are inherently stable, and the remaining components in the device can essentially be considered drift-free. The non-contact nature of the measurement ensures that there is no process-related degradation of the measurement circuitry, laser light source or detectors. At the other extreme, the measurement layer in impedance-type sensors is in a continual state of drift, which must be corrected by regular recalibration.

In terms of response time, TDLAS comes out on top. The technologies that require equilibrium between the moisture in the gas sample and a sensing surface/layer suffer in this category due to the polar nature of the water molecule and its tendency to stick to surfaces. Significant contact time with the gas to be measured is required, particularly when going from a wet sample gas to a dry sample gas.

Maintenance is an important consideration when evaluating the lifetime costs of the different measurement technologies. As more customers outsource maintenance functions, they continually look to install low-maintenance equipment. Contact-based sensors will always require more maintenance than non-contact measurements, as their successful operation is much more dependent on a clean sample gas reaching the sensor. Corrosive components in the natural gas stream, such as sulfur compounds, will also add to the maintenance requirements for contact-based sensors. Maintenance requirements range from periodic inspection/replacement of sample system filters to annual or bi-annual recalibration of the sensors themselves. TDLAS technology does not have an annual recalibration requirement and is typically sold as maintenance-free technology, apart from any associated sample system filter maintenance. A planned factory calibration check every 3 yr–5 yr is typical with TDLAS technology.

All technologies require a clean gaseous-phase sample to reach the sensor; therefore, a sample handling system is always recommended, although some vendors promote direct inline measurement as an advantage. A mixed-phase sample, condensate or liquid glycol carryover can coat contact-based sensors, causing them to become unresponsive or read erroneously or, in extreme cases, requiring sensor replacement. Liquid contaminant can also deposit in the TDLAS measuring cell, causing dispersal of the light signal and resulting in an erroneous measurement. TDLAS technology has the capability to alert the user if contamination occurs by comparing the measuring photodetector, tuned to a non-absorbing wavelength, with a reference photodetector to determine if a shift has occurred (within specified limits).

Contact-based sensors may be partially contaminated and continue reading, although experienced users may be able to determine that contamination has taken place by observing sensor behavior in terms of response to step changes in moisture, or by an actual step change in process readings after the contamination event. In most cases, if sensors become contaminated, they can be cleaned, purged with a dry gas and returned to service. A scorecard of moisture measurement technologies for natural gas is detailed in TABLE 6.
