Confusing spec sheets can lead to a bad purchase. An unsuitable gas chromatograph might miss a critical fault, risking a multi-million dollar transformer. Let’s focus on what truly matters. The most important specifications are the detector types, typically a TCD and an FID. Pay close attention to the minimum detection limits (MDLs), in ppm, for key fault gases, especially acetylene. Also verify the requirements for carrier gas purity and the machine’s calibration stability. To understand why these specs matter, we need to start with how the test actually works, so let’s look at the core principle first.
What is the principle of a DGA test?
The term ‘gas chromatography’ sounds complex and intimidating, and that makes it difficult to judge the quality of a machine. Think of it simply as a race for different gases. The principle is separation. First, dissolved gases are extracted from the transformer oil. Then the gas mixture is injected into a column inside the chromatograph. Different gases travel through the column at different speeds, allowing detectors to identify and quantify each one as it exits.

The Three Key Stages of DGA
The entire process can be broken down into three simple stages. First is Extraction, where we get the gases out of the oil sample. This is often done using a headspace method, which gently coaxes the gas out of the liquid. Next is Separation. The extracted gas mix is pushed by a carrier gas through a long, thin tube called a column. This column is the race track. Some gases, like hydrogen, are very light and fast. Heavier gases, like ethane, are slower. This difference in speed is what separates them. Finally, we have Detection. As each gas crosses the finish line at the end of the column, it passes through a detector. The detector identifies the gas and measures how much of it there is, creating a report that we can analyze.
| Stage | Purpose | Key Component(s) |
|---|---|---|
| Extraction | To remove the dissolved fault gases from the oil sample. | Headspace Autosampler, Gas Extraction Syringe |
| Separation | To separate the gas mixture into individual components. | Gas Chromatograph Column, Carrier Gas, Oven |
| Detection | To identify and measure the amount of each gas. | TCD (Thermal Conductivity Detector), FID (Flame Ionization Detector) |
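To make the race analogy concrete, here is a minimal Python sketch of how peaks leaving the column can be matched to gases by retention time. The retention-time windows below are hypothetical illustrations, not values from any real instrument; actual times depend on the column, oven program, and carrier gas flow.

```python
# Minimal sketch: identifying fault gases from chromatogram peaks by
# retention time. The windows below are hypothetical; real values are
# specific to each instrument's column and method.

# (start_min, end_min) windows in which each gas is expected to elute
RETENTION_WINDOWS = {
    "H2":   (0.8, 1.2),   # light and fast, crosses the finish line first
    "CH4":  (1.5, 1.9),
    "C2H2": (2.4, 2.8),
    "C2H4": (3.1, 3.5),
    "C2H6": (3.9, 4.3),   # heavier, elutes last
}

def identify_peaks(peaks):
    """Match (retention_time_min, peak_area) tuples to known gases."""
    report = {}
    for rt, area in peaks:
        for gas, (start, end) in RETENTION_WINDOWS.items():
            if start <= rt <= end:
                report[gas] = area
                break
    return report

# Example chromatogram output: three peaks detected at the column exit
print(identify_peaks([(1.0, 5400.0), (1.7, 320.0), (2.6, 45.0)]))
# -> {'H2': 5400.0, 'CH4': 320.0, 'C2H2': 45.0}
```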
What type of detector is used in gas chromatography for DGA?
The spec sheet mentions TCD and FID detectors, but the acronyms mean little on their own. Without knowing their purpose, you can’t be sure the machine will detect all fault gases. Each detector is a specialist for a specific group of gases. DGA machines typically use two main detectors: a Thermal Conductivity Detector (TCD) and a Flame Ionization Detector (FID). The TCD is used for inorganic gases like hydrogen and nitrogen. The FID is essential for its high sensitivity to hydrocarbon fault gases like acetylene and methane.
Why Two Detectors Are Better Than One
Think of the detectors as two different specialists. The Thermal Conductivity Detector (TCD) is a generalist. It works by measuring a simple physical property: how well a gas conducts heat compared to the pure carrier gas. It is robust and excellent for detecting gases that are usually present in larger amounts, like hydrogen, oxygen, and nitrogen. The Flame Ionization Detector (FID) is a highly sensitive specialist. It works by burning the gases that pass through it in a small, controlled hydrogen flame. When hydrocarbon gases (like methane and acetylene) burn, they produce ions. The FID measures these ions. This method is thousands of times more sensitive than a TCD for hydrocarbons. This sensitivity is not a luxury; it’s a necessity for detecting the tiny, trace amounts of fault gases that signal a serious developing problem inside the transformer.
| Feature | Thermal Conductivity Detector (TCD) | Flame Ionization Detector (FID) |
|---|---|---|
| Gases Detected | Hydrogen (H₂), Oxygen (O₂), Nitrogen (N₂) | Hydrocarbons (CH₄, C₂H₂, C₂H₄, C₂H₆) |
| Principle | Measures changes in thermal conductivity. | Measures ions produced when burning a sample. |
| Sensitivity | Lower | Extremely high |
| Role in DGA | Detects atmospheric and overheating gases. | Detects key fault gases indicating arcing/overheating. |
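As a rough illustration of why both specialists are needed, this hypothetical Python sketch routes each gas to the detector suited to it, following the table above, and checks which gases a given detector set would miss. The function name is invented for this example.

```python
# Minimal sketch of why DGA needs both detectors: hydrocarbons go to
# the highly sensitive FID, while H2, O2, and N2 (which an FID cannot
# detect by ionization) go to the TCD. Mapping follows the table above.

DETECTOR_FOR = {
    "H2": "TCD", "O2": "TCD", "N2": "TCD",    # inorganic gases
    "CH4": "FID", "C2H2": "FID",               # hydrocarbon fault gases
    "C2H4": "FID", "C2H6": "FID",
}

def coverage_check(detectors_on_machine):
    """Return the gases a machine with this detector set would miss."""
    return [g for g, d in DETECTOR_FOR.items() if d not in detectors_on_machine]

print(coverage_check({"FID"}))         # -> ['H2', 'O2', 'N2']
print(coverage_check({"TCD", "FID"}))  # -> [] (full coverage)
```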
What is the typical detection limit (in ppm) for fault gases?
You see lists of gases with numbers in “ppm” on a spec sheet. But without context, you don’t know if a detection limit of 5 ppm is good or bad for a particular gas. A good DGA machine must have low detection limits, measured in parts per million (ppm). For critical fault gases like acetylene (C₂H₂), the limit should be less than 1 ppm, ideally below 0.1 ppm. This ensures you can detect serious faults early.
Fault Gases and Their Desired Detection Limits
Parts per million (ppm) is a measure of concentration: 1 ppm means there is one part of a gas for every million parts of oil. Some gases signal danger even in tiny amounts, so the machine must be sensitive enough to see them. According to standards like IEEE C57.104, different gases point to different types of faults. For example, even a tiny amount of acetylene (C₂H₂) is a huge red flag for high-energy arcing, the most destructive type of fault. Therefore, the machine’s ability to see this gas at very low levels is its most important job. You want a machine with the lowest possible Minimum Detection Limit (MDL) for all key gases, but especially for acetylene. It is the difference between catching a problem early and discovering it after a catastrophic failure.
| Fault Gas | Chemical Formula | Indicates Fault Type(s) | Desired Detection Limit |
|---|---|---|---|
| Hydrogen | H₂ | Partial Discharge, Overheating | < 5 ppm |
| Methane | CH₄ | Low-Temperature Overheating | < 1 ppm |
| Ethane | C₂H₆ | Low-Temperature Overheating | < 1 ppm |
| Ethylene | C₂H₄ | High-Temperature Overheating | < 1 ppm |
| Acetylene | C₂H₂ | Arcing (High-Energy Discharge) | < 0.1 ppm |
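If you want to screen a spec sheet systematically, the following Python sketch compares a machine’s quoted MDLs against the desired limits in the table above. Only the desired limits come from the table; the quoted spec values in the example call are hypothetical.

```python
# Minimal sketch: screening a spec sheet's quoted Minimum Detection
# Limits (MDLs) against the desired limits from the table above.

DESIRED_MDL_PPM = {  # strict "<" thresholds, per the table
    "H2": 5.0, "CH4": 1.0, "C2H6": 1.0, "C2H4": 1.0, "C2H2": 0.1,
}

def screen_spec(quoted_mdl_ppm):
    """Return {gas: (quoted, required)} for every gas that fails."""
    failures = {}
    for gas, required in DESIRED_MDL_PPM.items():
        quoted = quoted_mdl_ppm.get(gas)
        if quoted is None or quoted >= required:
            failures[gas] = (quoted, required)
    return failures

# Hypothetical spec sheet: an acetylene MDL of 0.5 ppm is too coarse
# to catch early arcing, so only C2H2 is flagged.
print(screen_spec({"H2": 2.0, "CH4": 0.5, "C2H6": 0.5,
                   "C2H4": 0.5, "C2H2": 0.5}))
# -> {'C2H2': (0.5, 0.1)}
```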
What are the requirements for carrier gas and calibration?
You might think a gas chromatograph is a one-time purchase. But it requires specific supplies and regular maintenance to give you accurate data, which can create unexpected costs and downtime. A gas chromatograph requires a continuous supply of high-purity carrier gas, typically Argon or Helium (99.999% pure), to function. It also needs periodic calibration with a certified standard gas mixture to ensure its measurements remain accurate over time.
Operational Requirements for Accurate Results
These ongoing requirements are not optional; they are essential for the machine’s performance. The carrier gas is the lifeblood of the system: it pushes the gas sample extracted from your oil through the separation column. If the gas is impure, those impurities show up in your results as noise or false peaks, making your data useless. That’s why a purity of 99.999% (“five nines”) is the standard requirement. Calibration is the process of checking and adjusting the machine’s accuracy. You run a special gas bottle with a known concentration of all the fault gases through the machine, then adjust the machine’s software so its readings match the certified values on the bottle. This compensates for any minor drift in the detectors over time, ensuring your results are always reliable and legally defensible.
| Requirement | Description | Why It’s Critical |
|---|---|---|
| Carrier Gas | A high-purity (99.999%) inert gas like Argon or Helium. | Pushes the sample through the column. Purity prevents contamination of results. |
| Calibration Gas | A certified mixture of all key fault gases at known concentrations. | Used as a reference standard to check and adjust the machine’s accuracy. |
| Regular Calibration | The procedure of running the calibration gas and adjusting the machine. | Ensures the DGA results are consistently accurate and trustworthy over time. |
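To show what the calibration adjustment amounts to, here is a minimal single-point calibration sketch in Python. All numbers are hypothetical, and real instruments follow the vendor’s procedure, often with multi-point calibration curves rather than a single factor.

```python
# Minimal sketch of single-point calibration: derive a response factor
# from a certified standard gas, then apply it to correct a sample
# reading and compensate for detector drift.

def response_factor(certified_ppm, measured_ppm):
    """Ratio that maps raw instrument readings back to true values."""
    return certified_ppm / measured_ppm

def corrected_reading(raw_ppm, factor):
    return raw_ppm * factor

# The certified bottle contains 100 ppm CH4, but the drifted detector
# reads only 92 ppm, so all CH4 readings are scaled up accordingly.
f = response_factor(100.0, 92.0)      # ~1.087
print(corrected_reading(46.0, f))     # raw 46 ppm -> ~50 ppm corrected
```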