PICList
Thread
'Olympic Averaging with a PIC'
1998\07\24@122007
by
Thomas McGahee

There has been some discussion on the list of various
forms of eliminating bad data by averaging or using a
median filter.
One interesting method of improving an average is to
throw away the highest and the lowest values.
I call this the Olympic Method, because it is similar to
how scoring is done in the Olympics. It works quite
well.
If you want to get a reasonable average without having
to resort to using too many system resources, consider
the following technique, shown in pseudo code:
Set MINVAL to 0xFF (8 bit)
Clear MAXVAL to 0 (8 bit)
Clear TOTVAL to 0 (16 bit)
Set CNTVAL to 16+2 (8 bit)
While CNTVAL > 0
    Get a Value
    If Value > MAXVAL then Let MAXVAL = Value
    If Value < MINVAL then Let MINVAL = Value
    Add Value to TOTVAL (16 bit)
    Decrement CNTVAL
End While (Loop)
Subtract MINVAL from TOTVAL
Subtract MAXVAL from TOTVAL
;At this time TOTVAL contains the total of 16 values,
; since the largest and smallest values got removed
AVERAGE = TOTVAL/16
This AVERAGE is based on 89% of the data being retained,
and 11% of the data being discarded.
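The loop above translates almost line for line into C. Here is a sketch (my translation, not code from the original post; on a PIC this would typically be assembler, and the samples array stands in for successive ADC reads):

```c
#include <stdint.h>

/* Olympic average: sum 16+2 samples while tracking the single
   highest and lowest, subtract those two, then divide by 16
   (a power of two, so on a PIC it is just four right shifts). */
uint8_t olympic_avg16(const uint8_t *samples)
{
    uint8_t  minval = 0xFF;   /* MINVAL */
    uint8_t  maxval = 0x00;   /* MAXVAL */
    uint16_t total  = 0;      /* TOTVAL (16 bit) */

    for (uint8_t cnt = 16 + 2; cnt > 0; cnt--) {
        uint8_t v = *samples++;      /* "Get a Value" */
        if (v > maxval) maxval = v;
        if (v < minval) minval = v;
        total += v;
    }
    total -= minval;
    total -= maxval;
    return (uint8_t)(total >> 4);    /* AVERAGE = TOTVAL/16 */
}
```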
*****
Here's the same thing, but with 4 pieces out of 20
being discarded:
Set MINVAL1 to 0xFF (8 bit)
Set MINVAL2 to 0xFF (8 bit)
Clear MAXVAL1 to 0 (8 bit)
Clear MAXVAL2 to 0 (8 bit)
Clear TOTVAL to 0 (16 bit)
Set CNTVAL to 16+4 (8 bit)
While CNTVAL > 0
    Get a Value
    If Value > MAXVAL1 then Let MAXVAL2 = MAXVAL1, MAXVAL1 = Value
    else if Value > MAXVAL2 then Let MAXVAL2 = Value
    If Value < MINVAL1 then Let MINVAL2 = MINVAL1, MINVAL1 = Value
    else if Value < MINVAL2 then Let MINVAL2 = Value
    Add Value to TOTVAL (16 bit)
    Decrement CNTVAL
End While (Loop)
Subtract MINVAL1 from TOTVAL
Subtract MINVAL2 from TOTVAL
Subtract MAXVAL1 from TOTVAL
Subtract MAXVAL2 from TOTVAL
;At this time TOTVAL contains the total of 16 values,
; since the two largest and two smallest values got removed
AVERAGE = TOTVAL/16
This AVERAGE is based on 80% of the data being retained,
and 20% of the data being discarded.
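Here is a C sketch of this second variant (again my translation, not the original's code). One detail worth making explicit: when a new value displaces MAXVAL1, the old MAXVAL1 must be demoted into MAXVAL2, and likewise for the minimums, or the second-extreme tracking goes wrong:

```c
#include <stdint.h>

/* Olympic average over 20 samples: discard the two highest and the
   two lowest, then divide the remaining 16 by 16. */
uint8_t olympic_avg16of20(const uint8_t *samples)
{
    uint8_t  min1 = 0xFF, min2 = 0xFF;
    uint8_t  max1 = 0x00, max2 = 0x00;
    uint16_t total = 0;

    for (uint8_t cnt = 16 + 4; cnt > 0; cnt--) {
        uint8_t v = *samples++;
        if (v > max1)      { max2 = max1; max1 = v; }  /* demote old max1 */
        else if (v > max2) { max2 = v; }
        if (v < min1)      { min2 = min1; min1 = v; }  /* demote old min1 */
        else if (v < min2) { min2 = v; }
        total += v;
    }
    total -= (uint16_t)min1 + min2 + max1 + max2;
    return (uint8_t)(total >> 4);
}
```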
Hope this is useful to someone.
Fr. Tom McGahee
1998\07\27@133614
by
lilel

> There has been some discussion on the list of various
> forms of eliminating bad data by averaging or using a
> median filter.
>
> One interesting method of improving an average is to
> throw away the highest and the lowest values.
> I call this the Olympic Method,
Interesting!
I don't want to trash your idea, or your OlymPIC efforts at solving
this problem <groan>, but here are some points about averaging that I've
learned the hard way, by writing code that doesn't work:
Here's the other problem with averaging. Let's try a few randomly
selected data points and use the Olympic method (rumor has it
that's also how they do sealed bids on construction jobs in Great
Britain. Is that true, Brits?):
120, 150, 180, 100, 80, 250, 240, 120, 120, 110,
130, 120, 130, 140, 90, 80, 85, 111
We throw out the highest and lowest values, 250 and 80, then
average the remaining 16, getting 126.625.
PICs, though, don't usually have the luxury of floating point math.
The answer in a PIC is 126.
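A small C check of the arithmetic above (my reconstruction of the example, not code from the post):

```c
/* Olympic average of the 18-sample dataset quoted above: sum all
   samples, drop the single highest (250) and lowest (80), divide
   the remaining 16 by 16 in integer math, as a PIC would. */
int olympic_example(void)
{
    static const int data[18] = {120, 150, 180, 100, 80, 250, 240,
                                 120, 120, 110, 130, 120, 130, 140,
                                 90, 80, 85, 111};
    int total = 0, lo = data[0], hi = data[0];

    for (int i = 0; i < 18; i++) {
        total += data[i];
        if (data[i] < lo) lo = data[i];
        if (data[i] > hi) hi = data[i];
    }
    total -= lo + hi;   /* 2026 left across 16 retained samples */
    return total / 16;  /* truncating division: 126, not 126.625 */
}
```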
In a real application following a setpoint, this error can add up.
The PIC's answer is ALWAYS less than the real floating point answer,
so taking a moving average tends to "wind down". If you usually get
an error of .5 LSB then you'll wind down to zero after 512
measurements, if you are trying to take a moving average. There are
various methods to get moving averages, like weighting the old data
vs. new data, adding the old data in as another data point, etc.
etc. etc. The result is usually the same, after a long period of
time.
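One way to see this downward bias (my illustration, using a 1/16-weight integer moving average, which is one common PIC-friendly weighting, not necessarily the one Lawrence used): fed samples whose true mean is 120.5, the truncating filter settles at 120 and stays there, always at or below the floating-point result.

```c
/* Exponential moving average avg = (avg*15 + sample)/16 done in
   truncating integer math, fed samples alternating 120/121, whose
   true mean is 120.5. Every truncation error is downward, so the
   filter settles at 120 and never reaches the true mean. */
int truncating_ema_demo(void)
{
    int avg = 128;                       /* arbitrary start above the mean */
    for (int i = 0; i < 1000; i++) {
        int sample = (i & 1) ? 121 : 120;
        avg = (avg * 15 + sample) / 16;  /* truncates, biasing downward */
    }
    return avg;
}
```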
I wrote a fine application that would always float down to the
bottom of its setpoint range. Further attempts at writing rounding
code, or doing more digits of math, introduce extra complexity and
needless headaches. Meanwhile the median filter would pick out a
value of 120 from this dataset, and would continue to pick out values
near there. If we really see that many values at 120, that may be the
"real" number we are looking for. In my earlier extreme example, one
screwy number can skew an average so it does not look very much like
the data being measured. Throwing out the highest number helps a
little, and you can add more and more complexities to a "simple"
averaging scheme until you are using floating point math and a Cray
supercomputer, and you'll still get wrong answers. I maintain that 120
is a better picture of this data than 126.
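For comparison, here is a median filter over the same dataset (my sketch; a sort-based median is the simplest form, though real PIC implementations often use cheaper running-median tricks). For this 18-value dataset it picks out 120, the value quoted above.

```c
#include <stdlib.h>

static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

/* Median by sorting a copy of the data. For an even-length window we
   take the lower of the two middle elements, which keeps everything
   in integer math (no averaging of the middle pair). Assumes n <= 32. */
int median(const int *data, int n)
{
    int buf[32];
    for (int i = 0; i < n; i++) buf[i] = data[i];
    qsort(buf, (size_t)n, sizeof buf[0], cmp_int);
    return buf[(n - 1) / 2];
}
```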
 Lawrence Lile
"An Engineer is simply a machine for
turning coffee into assembler code."
Download AutoCad blocks for electrical drafting at:
http://home1.gte.net/llile/index.htm