By Dave Joseph, Emerson

Conductivity measurement is one of the most ubiquitous in industry, so the question of how to correctly calibrate a toroidal conductivity sensor comes up for many plant personnel. Because the purpose of the measurement is to get information about the total concentration of ions in solution (i.e., the conductivity), the effect of sensor dimensions and windings must be accounted for. The correction factor is the cell constant: conductivity equals the measured conductance multiplied by the cell constant.
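The relationship is simple enough to sketch in a few lines. This is an illustrative snippet, not vendor code; the function name and units are my own choices (conductance in siemens, cell constant in 1/cm, conductivity in S/cm):

```python
def conductivity(conductance_S: float, cell_constant_per_cm: float) -> float:
    """Conductivity (S/cm) = measured conductance (S) x cell constant (1/cm)."""
    return conductance_S * cell_constant_per_cm

# A sensor with a cell constant of 2.5 /cm measuring 2.0e-3 S of conductance
# reports a conductivity of 5.0e-3 S/cm (5000 uS/cm).
reading = conductivity(2.0e-3, 2.5)
```

The same conductance reading on sensors with different cell constants yields different conductivities, which is why the cell constant must be calibrated rather than assumed.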

There are two basic ways to calibrate a toroidal sensor: against a standard solution, or against a referee meter and sensor. A referee meter and sensor is an instrument that has been previously calibrated and is known to be accurate and reliable. The referee instrument can be used to perform either an in-process or a grab sample calibration. In-process calibration involves connecting the process and referee sensors in series and measuring the conductivity of the process liquid simultaneously. Grab sample calibration involves taking a sample of the process liquid and measuring its conductivity in the laboratory or shop using the referee instrument. No matter which calibration method is used, the analyzer automatically calculates the cell constant once the known conductivity is entered.
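The analyzer's cell-constant calculation is just the inverse of the conductivity relationship: divide the known (entered) conductivity by the conductance the sensor actually measures. A minimal sketch, with illustrative names and an assumed KCl-type standard value:

```python
def cell_constant(standard_conductivity_uS_cm: float,
                  measured_conductance_uS: float) -> float:
    """Cell constant (1/cm) = known conductivity of the standard or referee
    reading (uS/cm) divided by the conductance the sensor measures (uS)."""
    return standard_conductivity_uS_cm / measured_conductance_uS

# Example: the sensor measures 565.2 uS of conductance in a standard known
# to be 1413 uS/cm, so the analyzer stores a cell constant of about 2.5 /cm.
k = cell_constant(1413.0, 565.2)
```

Whether the known value comes from a bottled standard or from a referee instrument reading the same process liquid, the arithmetic the analyzer performs is the same.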

The cell constant is also influenced by the nearness of the vessel walls to the sensor, the so-called wall effect. Conductive and non-conductive walls have opposite effects. Metal walls increase the amount of induced current, which leads to an apparent increase in conductance and a corresponding decrease in cell constant. Plastic walls have the opposite effect.

Calibration against a standard solution requires removing the sensor from the process piping. It is practical only if wall effects are absent or the sensor can be calibrated in a container identical to the process piping. The latter requirement ensures that wall effects during calibration, which are incorporated into the cell constant, will be exactly the same as the wall effects when the sensor is in service.

Calibration against a referee – in-process – involves connecting the process and referee sensors in series and allowing the process liquid to flow through both. The process sensor is calibrated by adjusting the process analyzer reading to match the conductivity measured by the referee instrument.

Calibration against a referee – grab sample – is useful when calibration against a standard is impractical or when in-process calibration is not feasible because the sample is hot, corrosive, or dirty, making handling the waste stream from the referee sensor difficult.

If you click HERE, you’ll find a very useful white paper that walks you through the issues related to calibration of a toroidal conductivity sensor and explores each method in some depth. I hope this is useful in simplifying your conductivity measurements.

Please let me know if I can answer any questions. Thanks for stopping by.


By Jim Cahill, Emerson

This post was originally published on Emerson Exchange 365 and we wanted to share it with our readers as well.

At the Emerson Exchange conference in Austin, Emerson’s Sean McLeskey presented Fixed Gas and Flame Detection Best Practices. His abstract:

Many industrial processes involve dangerous gases and vapors: flammable, toxic, or both. With the different sensing technologies available, and the wide range of industrial applications that exist, selecting the best sensor and locating them properly for the job at hand can be a challenge. This workshop will help you get a better understanding of application challenges, learn basic installation best practices, and understand the benefits of using flame and gas detection solutions.

Sean opened with a safety case study at a refinery where personnel heard a “pop” and saw what appeared to be steam. It was a gas release that led to people being injured and the refinery being shut down for three months, with losses in the tens of millions.

Fixed flame and gas systems detect releases of hazardous process materials and provide time to alert personnel and put the process into a safe state. These systems protect people and property, support regulatory compliance, and maintain good relations with the surrounding community.

Ultrasonic gas leak detection listens for the ultrasonic sound caused by escaping gas. Unlike some other detector technologies, it does not depend on wind carrying the gas to the sensor. It provides first detection but does not identify the composition of the gas.

Another technology is point gas detection, which requires the gas to reach the sensor to be detected. It is typically applied near known leak sources, such as gaskets. One example is catalytic bead combustible gas detection, a technology used to monitor several targeted gases, including hydrogen, across many applications.

Infrared sensors are another type of point gas detector. Their strengths complement those of catalytic bead technology: they are unaffected by high concentrations of hydrocarbons and work in the absence of oxygen, unlike catalytic bead sensors.

Multi-spectrum infrared flame detectors are the highest performing detectors and have excellent immunity to false alarms.

After performing a risk assessment, installation considerations start with the pressure of the gas source: is it high enough for ultrasonic detection? For the point gas family of technologies, the gas must be able to reach the sensor, so considerations include the properties of the gas, ambient conditions, and obstructions between the gas source and detector; for open path technology, the gas cloud must cross the beam path. Depending on the area where the detectors are located, open path beams, point gas detection, and ultrasonic listening each have advantages and disadvantages.

For flame detectors, the optical sensor is like an eye. Considerations include the size of the area to be monitored, detection technology, obstructions, nature of flame source, and potential blind spots.

No one detector is a silver bullet for all applications. Each has its advantages and disadvantages, and you will need to consider your application. You can connect and interact with other gas and flame detection experts in the Analytical group in the Emerson Exchange 365 community.


Hi. I’m Marc Mason, business development manager, and I’m happy to be your analytic expert today. You know the old saying, “You have to spend money to make money”? Well, in the water industry we’re finding that many water plants have to spend money to save money. Recently, Tom Johnson, water industry business development manager at Emerson, wrote an article for Water & Wastes Digest about advanced technologies such as radar leveling, reagent-free liquid analysis, ultrasonic control, wireless measurement devices, advanced predictive diagnostics, and SCADA control systems. Its case histories show the cost savings that water treatment plants can garner by investing in emerging analytical, diagnostic, and measurement technologies, as well as the control systems that manage them. The case history described in the article demonstrates this premise pretty clearly –

Taylorsville-Bennion Improvement District serves 70,000 people in approximately 14 square miles in the center of the Salt Lake Valley, Utah. The district has approximately 16,700 connections and 229 miles of water lines. For many years, it tried to keep its old chlorine and fluoride sensors and analyzers running by constantly rebuilding, recalibrating and replacing parts. While this seemed like the cost-effective thing to do, it was proving too much for the district’s small staff – a situation familiar to many managers. The units were laborious to rebuild and required replacement of two to three probes per year; plus, they used expensive membranes that were difficult to replace and often broke during installation. The district estimates that the cost to operate the old sensors and analyzers was approximately $9,000 per year at its three locations. The units required daily attention and annual rebuilds, adding labor costs to the equation.

When the district decided to replace the old sensors and analyzers with the latest technology, its situation changed drastically. The new systems were built to last three years, versus one year, and were known to be effective as long as 15 years. The new technologies were reagent-free, reducing costs and maintenance, and needed far less frequent calibration. Bottom line: the district now replaces the membranes and electrolyte of the chlorine systems for $150 per year, compared to more than $6,000 in maintenance costs for the old systems. While the new equipment was costlier to purchase, the dramatically lower cost of ownership is rapidly offsetting that differential – a situation that can apply to many technologies.
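The "costlier to buy, cheaper to own" arithmetic above can be made concrete with a simple payback calculation. The annual maintenance figures ($6,000 old vs. $150 new) come from the article; the purchase-price premium below is a hypothetical number of my own, since the article does not state one:

```python
def simple_payback_years(price_premium: float,
                         old_annual_cost: float,
                         new_annual_cost: float) -> float:
    """Years needed for annual savings to recover a purchase-price premium."""
    return price_premium / (old_annual_cost - new_annual_cost)

# With the article's $6,000 vs. $150 annual costs and an assumed $3,000
# premium for the new equipment, the premium is recovered in about half a year.
years = simple_payback_years(3000.0, 6000.0, 150.0)
```

Even a generous assumed premium is recovered quickly when annual maintenance drops by an order of magnitude.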

There are many other examples of cost savings quoted in the article. Click HERE to read it.

How about you? Have you invested in what seemed a costly technology, only to discover it saved money? We’d love to hear your story.


Hi. I’m Bonnie Crossland, Rosemount product manager for gas chromatograph (GC) technology and I’ll be your analytic expert today. I recently had an opportunity to write an article for InTech magazine and I’d like to share with you some of the ideas from that article.

You may be aware that glass production is one of the most energy-intensive industries, with energy costs topping 14% of total production costs. The bulk of the energy consumed comes from natural gas combustion for heating the furnaces that melt raw materials, which are then transformed into glass. Additionally, glass manufacturing is sensitive to the combustion process, which can affect the quality of the glass and shorten the lifespan of the melting tanks if not managed properly. Historically, the composition of natural gas has been relatively stable. However, dramatic changes in the supply of natural gas (including shale gas and liquefied natural gas imports) are causing end users to experience rapid and pronounced fluctuations in gas quality.

You may not have anything to do with glass manufacturing; however, this application is an excellent example of the impact of the combustion process on energy consumption in any industry – and the best ways to measure and control that process. Many companies in a wide range of industries faced with the problems of inconsistent natural gas quality may not have considered gas chromatography a viable solution for balancing the air/fuel ratio, due to the traditional complexities of the measurement. It’s time to look again. New developments in gas chromatography technology may make this approach the first choice for improving energy efficiency and, ultimately, process quality.

The efficiency of the furnace can be optimized for the air/fuel ratio when the composition of the incoming gas changes. This can significantly reduce energy consumption and provide substantial savings to the business in product quality and equipment life. Optimizing the furnace efficiency has traditionally been complex and costly. Next-generation gas chromatography, however, is changing that paradigm, providing a cost-effective, task-focused methodology that can be carried out by less technically proficient personnel than were traditionally required.

A major glass company in the southeastern U.S. is a heavy user of natural gas. However, the gas comes from multiple locations, causing a constant fluctuation of the BTU value. Because gas flow is adjusted based on the BTU value, knowing the precise measurement is essential. In addition, because gases with the same BTU operate differently through a burner, knowing the Wobbe Index is critical to quality.
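The Wobbe Index mentioned above is the standard way to compare the interchangeability of fuel gases at a burner: it is the heating value divided by the square root of the gas's specific gravity relative to air. A small sketch (the example heating value and specific gravity are typical pipeline-gas figures, not data from the article):

```python
import math

def wobbe_index(heating_value: float, specific_gravity: float) -> float:
    """Wobbe Index = heating value / sqrt(specific gravity relative to air).

    Uses the same units as the heating value passed in (e.g., BTU/scf)."""
    return heating_value / math.sqrt(specific_gravity)

# Example: gas with a heating value of 1050 BTU/scf and specific gravity 0.60
# has a Wobbe Index of roughly 1356 BTU/scf.
wi = wobbe_index(1050.0, 0.60)
```

Two gases with the same BTU value but different densities flow differently through the same burner orifice; the Wobbe Index captures that difference, which is why it, and not BTU alone, is the quality-critical number.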

When the company began employing a gas chromatograph to optimize its fuel quality, it found the traditional intricacies of gas chromatographs inappropriate for its application. Despite repeated training, its staff was unable to calibrate the instrument. New GC technologies designed specifically for natural gas optimization significantly reduce the complexity of operation. In new designs, all of the complex analytical functions of the gas chromatograph may be contained in a replaceable module, greatly simplifying maintenance. Features like auto-calibration make operation easier and more accurate, even for novice users. And unlike analyzers that adjust the air/fuel ratio based on feedback measured in the flue after combustion, the GC enables feed-forward control: the air/fuel ratio is adjusted based on the gas composition before combustion occurs. This helps the plant stay in emissions compliance while maintaining energy efficiency.
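To see what feed-forward control means in practice, consider how a composition report from the GC could set the air demand before the gas ever reaches the burner. This is a simplified sketch of the idea, not the GC's actual control logic; the component list, 10% excess-air figure, and function names are illustrative assumptions (the stoichiometric coefficients are standard combustion chemistry: CH4 needs 2 mol O2, C2H6 needs 3.5, C3H8 needs 5):

```python
# Moles of O2 required to completely burn one mole of each component.
O2_DEMAND = {"CH4": 2.0, "C2H6": 3.5, "C3H8": 5.0, "N2": 0.0, "CO2": 0.0}

def stoich_air_ratio(composition: dict, o2_in_air: float = 0.2095) -> float:
    """Moles of air per mole of fuel for complete combustion, given mole fractions."""
    o2_needed = sum(frac * O2_DEMAND[gas] for gas, frac in composition.items())
    return o2_needed / o2_in_air

def air_setpoint(fuel_flow: float, composition: dict,
                 excess_air: float = 0.10) -> float:
    """Feed-forward air flow setpoint: stoichiometric demand plus excess air."""
    return fuel_flow * stoich_air_ratio(composition) * (1.0 + excess_air)

# When the GC reports a leaner or richer composition, the air setpoint moves
# immediately, before the new gas reaches the burner.
gas = {"CH4": 0.95, "C2H6": 0.03, "N2": 0.02}
air = air_setpoint(100.0, gas)
```

A feedback-only scheme would wait for flue-gas oxygen to drift before correcting; the feed-forward calculation reacts as soon as the composition changes.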

The InTech article has a lot of other details on the need and process of optimizing combustion and the effectiveness of new generation GCs in meeting those needs. Gas chromatographs are used throughout the natural gas chain of custody (from wellhead to burner tip) to determine the gas composition for quality monitoring and energy content. For pipeline quality natural gas, the industry standard is the C6+ measurement method. If your company is a user of natural gas, you may already have GCs involved in your process. Using them to ensure the efficiency of your combustion and the ultimate quality of the product is just another vital addition. If you aren’t currently using GCs, let me know if you have any questions, or would like a demonstration, by leaving a comment HERE, or emailing me HERE.

By J. Patrick Tiongson, Product Manager, Emerson

At first glance, calibrating conductivity sensors may seem straightforward; however, many times, this is not the case. I’d like to share with you some of the various methods for calibrating contacting conductivity sensors and outline some of the potential issues that can accompany such procedures.

There are three main methods for calibrating contacting conductivity sensors: 1) with a standard solution, 2) directly in process against a calibrated referee instrument and sensor, and 3) by grab sample analysis. Understanding the fundamentals of each method, and the issues associated with each, can help you decide how best to calibrate a sensor.

Calibration with a standard solution consists of adjusting the transmitter reading to match the value of a solution of known conductivity at a specified temperature. This method is best when dealing with process conductivities greater than 100 µS/cm. Use of standard solutions less than 100 µS/cm can be problematic as you run into the issue of contaminating the standard with atmospheric carbon dioxide, thereby changing the actual conductivity value of the standard. For maximum accuracy, a calibrated thermometer should be used to measure the temperature of the standard solution.
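Temperature matters because conductivity readings are conventionally referenced to 25 °C, and conductivity rises with temperature (roughly 2% per °C is a common figure for many aqueous solutions). A minimal sketch of linear temperature compensation; the function name is mine, and the 2%/°C coefficient is a typical assumption rather than a universal constant:

```python
def conductivity_at_25C(measured_uS_cm: float, temp_C: float,
                        alpha: float = 0.02) -> float:
    """Linear compensation of a conductivity reading to the 25 C reference.

    alpha is the temperature coefficient (fraction per degree C, ~2%/C typical)."""
    return measured_uS_cm / (1.0 + alpha * (temp_C - 25.0))

# A standard that reads 1497.8 uS/cm at 28 C corresponds to about
# 1413 uS/cm at the 25 C reference temperature.
corrected = conductivity_at_25C(1497.8, 28.0)
```

This is why a calibrated thermometer matters: an error of even 1 °C in the measured standard temperature shifts the compensated value by about 2%.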

If dealing with low conductivity applications (less than 100 µS/cm), in-process calibration against a referee instrument is the preferred method. This method requires adjusting the transmitter reading to match the conductivity value read by a referee sensor and transmitter. The referee instrument should be installed close to the sensor being calibrated to ensure you are getting a representative sample. Best practices for attaining a representative sample across both the process and referee sensors include using short tubing runs between sensors and increasing sample flow. Though usually not as accurate as calibrating against a standard solution, this method eliminates the need to remove your sensor from the process and removes any risk of contamination from atmospheric carbon dioxide.

The least preferred method of calibrating a contacting conductivity sensor is by analyzing a grab sample. This method entails collecting a sample from process and measuring its conductivity in a lab or shop using a referee instrument. Calibration via grab sample analysis is only ever recommended if calibration with a standard solution is not possible or if there are issues with installing a referee instrument directly in process. The collected grab sample is subject to contamination from atmospheric carbon dioxide and is also subject to changes in temperature when being transported. In other words, there are many potential sources for error in the final calibration results.

More information regarding calibrating conductivity sensors can be found HERE.

Have you been utilizing the best method for calibrating your sensors? Are there any challenges that you run into?