6th Annual Pennsylvania Statewide Conference
Abstract: Comparison of Three Methods to Measure Acidity of Coal-Mine Drainage
Brent Means, U.S. Office of Surface Mining; Tiff Hilton, WOPEC, Inc.
Although the Standard Methods 2310 hot peroxide acidity procedure is widely used for measuring the acidity of mine drainage, little work has been done to determine whether “hot acidity” data actually describe the base requirement for neutralization of mine drainage. This study compared three methods for estimating the acidity of net-acidic waters emanating from the Manor, Millerstein, Ike, and Morris coal mines in Pennsylvania: the Standard Methods 2310 hot acidity titration to a pH 8.2 endpoint, a cold acidity titration to a treatment-endpoint pH as high as 11.0, and calculated acidity. The results showed poor agreement between hot acidity and calculated acidity for three of the four waters. For two of the waters, Mg hydrolysis during the hot acidity titration indicated greater acidity than that computed from pH and dissolved Fe, Mn, and Al. The poor agreement for the third water resulted from incomplete hydrolysis of Mn during the hot acidity titration. The agreement between the acidity measured by the treatment acidity titration and the other two acidity methods was within 16 mg/L (as CaCO3) for the Manor and Millerstein waters, but greater than 200 mg/L (as CaCO3) for the Ike and Morris waters. The fair agreement among all methods for the Manor and Millerstein waters reflects the fact that pH, Al, Fe, and Mn are the main sources of acidity in those waters. The poor agreement among the acidity methods for the Ike and Morris waters results from the treatment acidity titration measuring a large amount of additional “acidity” at high pH from the hydrolysis of Mg and other constituents. While the exact sources of acidity measured by a treatment titration are unknown, results from PHREEQC aqueous speciation calculations showed that the formation of cation-hydroxyl complexes in the Morris water at pH 11.0 contributed 40 mg/L of acidity. The authors hypothesize that Mg hydrolysis and the formation of base-consuming complexes explain why acidity measured by treatment titrations at high pH is often greater than that measured by hot acidity titrations to pH 8.2 or 8.3. The authors also hypothesize that neutralization of carbonic acid explains why acidity measured by “cold” titrations at low to mid pH is often greater than that measured by hot acidity titrations. These results have practical importance because they show that hot acidity titrations should not be used universally to describe the acidity of mine drainage, especially when estimating the acidity produced when Mg-rich mine drainage is chemically treated to a high pH. This study also showed that overtreating Mg-rich mine drainage not only increases chemical costs but also increases sludge production.
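For reference, calculated acidity of the kind compared in this study is conventionally computed from pH and the dissolved Fe, Mn, and Al concentrations and expressed as a CaCO3 equivalent. The sketch below illustrates that conventional formula; the function name, the assumption that the Fe(II)/Fe(III) split is known, and the example concentrations are illustrative assumptions, not values reported for the Manor, Millerstein, Ike, or Morris waters.

```python
# Minimal sketch of the conventional "calculated acidity" formula for mine drainage,
# reported as mg/L CaCO3 equivalent. Assumes dissolved metals are given in mg/L and
# that the Fe(II)/Fe(III) split is known. Concentrations below are placeholders.

def calculated_acidity(pH, fe2_mgL=0.0, fe3_mgL=0.0, mn_mgL=0.0, al_mgL=0.0):
    """Calculated acidity (mg/L as CaCO3) from pH and dissolved Fe, Mn, and Al."""
    # Milliequivalents per liter contributed by each acid source
    meq_per_L = (
        1000 * 10 ** (-pH)       # free H+ (mol/L converted to mmol/L; 1 meq per mmol)
        + 2 * fe2_mgL / 55.85    # Fe2+ hydrolysis consumes 2 equivalents of base per mole
        + 3 * fe3_mgL / 55.85    # Fe3+ hydrolysis consumes 3 equivalents of base per mole
        + 2 * mn_mgL / 54.94     # Mn2+ hydrolysis consumes 2 equivalents of base per mole
        + 3 * al_mgL / 26.98     # Al3+ hydrolysis consumes 3 equivalents of base per mole
    )
    return 50 * meq_per_L        # 50 mg CaCO3 per milliequivalent

# Placeholder example: pH 3.5 water with 20 mg/L Fe(II), 5 mg/L Fe(III),
# 15 mg/L Mn, and 10 mg/L Al.
print(round(calculated_acidity(3.5, fe2_mgL=20, fe3_mgL=5, mn_mgL=15, al_mgL=10), 1))
```

Note that this formula accounts only for free H+ and the hydrolysis of Fe, Mn, and Al; it does not capture acidity from Mg hydrolysis or cation-hydroxyl complexation at high treatment pH, which is consistent with the discrepancies described above for the Ike and Morris waters.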