Calculating RMS using fixed point math (C)
I am trying to calculate the RMS value of a waveform and am running into problems.
I take a sample every x microseconds, triggered by an interrupt. The samples are stored in an array; each time a sample is taken, every stored value is pushed along one slot and the new value is fed in at the front. I take the sample, square it, and divide by 20 (the number of samples per period; the waveform is assumed to be a fixed frequency) before putting it in the array. I also keep a running sum: each new value is added to the sum, and once I reach 20 samples the oldest value is subtracted as the newest is added, roughly like this:
    value20 = value19;   // int16 values
    value19 = value18;
    ...
    value1  = (sample * sample) / 20;
    sumvalue += value1;
    sumvalue -= value20;
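For reference, here is a minimal sketch of the same sliding window in plain C. It is only an illustration under the assumptions in the post (20 samples per period, int16 samples); the buffer and function names are made up. Two details worth comparing against the original: the square is widened to 32 bits (on a 16-bit target, sample * sample overflows an int16), and the square leaving the window is subtracted from the sum before it is overwritten.

    #include <stdint.h>

    #define N_SAMPLES 20  /* samples per period; frequency assumed fixed */

    /* Circular buffer of squared samples plus a running sum, so the
       oldest square is subtracted before it is overwritten. */
    static int32_t sq_buf[N_SAMPLES];
    static uint8_t head;
    static int32_t sumvalue;

    void push_sample(int16_t sample)
    {
        /* widen before squaring; the /20 truncation still costs precision */
        int32_t sq = ((int32_t)sample * sample) / N_SAMPLES;
        sumvalue -= sq_buf[head];   /* drop the oldest square */
        sumvalue += sq;             /* add the newest */
        sq_buf[head] = sq;
        head = (uint8_t)((head + 1) % N_SAMPLES);
    }

Dividing each square by 20 before summing throws away up to 19 counts per sample; deferring that division until the point where the RMS is computed keeps more precision in the sum.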
To get the RMS I call a function that takes sumvalue, divides it by the last calculated RMS value (or by 1 if no RMS has been calculated yet), adds the last RMS value, and divides the result by 2:
    calcrms(sumvalue)
    {
        int32 tempsum;
        if (rms)
            tempsum = (sumvalue / rms + rms) / 2;
        else
            tempsum = (sumvalue + 1) / 2;
        rms = tempsum;
    }
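A single averaging step like this is one round of Newton's method for the square root, so it only settles if its input holds still between calls. For comparison, here is a minimal sketch of an integer square root that iterates to convergence within one call; the function name is made up, and this is a reference point rather than the fix.

    #include <stdint.h>

    /* Integer square root by Newton's method, iterated until the
       estimate stops shrinking instead of taking one step per call. */
    int32_t isqrt32(int32_t x)
    {
        if (x <= 0)
            return 0;
        int32_t guess = x;
        int32_t next  = (x / guess + guess) / 2;
        while (next < guess) {      /* estimates shrink toward the root */
            guess = next;
            next  = (x / guess + guess) / 2;
        }
        return guess;
    }

Iterating inside one call costs a handful of divisions, but it removes the dependence on the previous reading; whether that fits the interrupt timing budget is a separate question.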
I then output the RMS value to a screen. The problem is that the RMS value keeps changing even though the waveform is constant. If I feed in a DC value the RMS stays steady, but if I shove in a sine wave it goes crazy.
I am hoping someone can point me in the right direction. I don't want the answer straight up, just nudges to get me on track.
Alright, that function doesn't compute the RMS.

Did you take a look at how sumvalue changes over time? sumvalue must be constant, in both the DC and the sine case. If it isn't, something is wrong with your summing routine. If sumvalue is constant, then something is wrong with the RMS procedure.
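One hypothetical way to act on that hint: record the spread of sumvalue over many periods and see whether it actually holds still. The names below are made up for illustration.

    #include <stdint.h>

    /* Track how far sumvalue wanders. With a steady sine input, a large
       spread points at the summing routine; a tiny spread points at the
       RMS step instead. */
    static int32_t sum_min = INT32_MAX;
    static int32_t sum_max = INT32_MIN;

    void watch_sumvalue(int32_t sumvalue)
    {
        if (sumvalue < sum_min) sum_min = sumvalue;
        if (sumvalue > sum_max) sum_max = sumvalue;
        /* print sum_min and sum_max to the screen, or watch them in a debugger */
    }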