
Data Analysis

Data smoothing and normalization are the first transformations performed on the raw turbidity data.
The smoothing allows us to exclude outlying data points caused by really large flocs passing in front of the light sensor of FRETA, which create turbidity fluctuations. The normalization allows us to compare data sets with varying influent turbidities. The settling velocity (Vs) was calculated by dividing the distance between the ball valve and the zone illuminated by the infrared LED of FRETA by the time elapsed (Can put an equation on here in MathType to describe this?). The plot of normalized turbidity vs. Vs can be interpreted as a cumulative distribution function (CFD) of turbidity with respect to Vs. A CFD describes the probability that a variable is less than or equal to some value. To make the analysis more robust, the experimental data was fit to a gamma distribution. Then a derivative of the CFD of the gamma distribution gives a probability distribution of the particle population with respect to Vs. (Better description than in the other section. This really should be under "Materials and Methods" in a "Data Analysis" section.)
For a more detailed treatment of the data analysis, see Ian Tse's thesis (Chapter I, data analysis).
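The gamma-distribution fit and the CDF-to-PDF step can be sketched as below. Only the method (fit a gamma CDF to normalized turbidity vs. Vs, then differentiate to obtain a particle-population PDF) comes from the text; the function names, initial guesses, and synthetic data are assumptions for illustration.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

def fit_gamma_cdf(vs, normalized_turbidity):
    """Fit a gamma CDF to normalized turbidity as a function of settling
    velocity Vs; returns the fitted (shape, scale) parameters."""
    cdf = lambda v, shape, scale: stats.gamma.cdf(v, shape, scale=scale)
    (shape, scale), _ = curve_fit(cdf, vs, normalized_turbidity,
                                  p0=(2.0, float(np.mean(vs))))
    return shape, scale

# Synthetic example data drawn from a known gamma CDF (illustrative only)
vs = np.linspace(0.01, 3.0, 50)            # settling velocities
data = stats.gamma.cdf(vs, 2.5, scale=0.4)  # stand-in for measured data

shape, scale = fit_gamma_cdf(vs, data)

# The derivative of the fitted CDF is the probability density function
# of the particle population with respect to Vs.
pdf = stats.gamma.pdf(vs, shape, scale=scale)
```

Fitting a smooth parametric CDF before differentiating avoids amplifying measurement noise, which is why the text describes fitting the gamma distribution first rather than differentiating the raw data directly.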
