
Signal Processing
#1
Posted 27 February 2008 - 11:18
I'm doing a lot more data logging stuff this year (I have a slip angle sensor plus gyro to play with in addition to the team's MoTeC kit) and I'm interested in understanding a bit more about signal processing, filtering, etc.
Can some of the more experienced members of the board suggest any practically oriented texts on this subject that I could use to teach myself?
Ben
#2
Posted 27 February 2008 - 12:30
#3
Posted 27 February 2008 - 12:54
#4
Posted 27 February 2008 - 20:59
My bookshelf is a bit out of date, but Horowitz & Hill's Art of Electronics is good, maybe get it out of the library before buying it. Be careful about texts on DSPs, that's more for people designing systems than analysing data (there's possibly a DSP in your datalogger).
The major challenge when dealing with any motorsport data acquisition system will be getting the specs from the manufacturer on the input filters / input signal processing they have already put in place. The phenomenon known as aliasing (http://en.wikipedia.org/wiki/Aliasing) means they should have something in place in the hardware already, plus hardware or software solutions to cope with reduced sampling rates if they are configurable that way.
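To see what aliasing actually does, here's a rough MATLAB sketch (made-up frequencies, nothing to do with any particular logger): a 180 Hz signal sampled at 200 Hz comes out looking like a perfectly plausible 20 Hz signal, and nothing downstream can tell you it isn't genuine.
% Aliasing illustration: 180 Hz sampled at 200 Hz (Nyquist = 100 Hz) shows up as 20 Hz
fs_ref = 10000;                        % dense time base standing in for the continuous signal
t_ref  = 0:1/fs_ref:0.5;
x_ref  = sin(2*pi*180*t_ref);          % the real 180 Hz input
fs_log = 200;                          % logger sample rate, no anti-alias filter
t_log  = 0:1/fs_log:0.5;
x_log  = sin(2*pi*180*t_log);          % what the logger records
plot(t_ref, x_ref, t_log, x_log, 'o-')
xlabel('Time (s)'), legend('true 180 Hz signal', 'logged samples (look like 20 Hz)')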
In my experience the only way to be sure of what you've got is to measure it yourself - easy enough to do on the bench with a signal generator and an oscilloscope. Once you've overcome that hurdle you'll have the confidence to apply signal processing techniques such as the FFT using something like MATLAB (assuming it isn't a feature of your data analysis package already - and if it is, you should probably benchmark it against MATLAB anyway...). The MATLAB signal processing toolbox online help (http://www.mathworks...p/helpdesk.html) is a good free resource, worth a read even if you don't plan to use MATLAB.
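For what it's worth, this is the sort of thing I mean by applying the FFT in MATLAB - a minimal sketch, with made-up data standing in for a logged channel:
% Single-sided amplitude spectrum of a channel sampled at fs
fs = 500;                               % assumed logging rate, Hz
t  = (0:fs*4-1)/fs;                     % 4 seconds of data
x  = 2*sin(2*pi*12*t) + 0.5*randn(size(t));   % stand-in for a logged channel: 12 Hz signal plus noise
N  = length(x);
X  = fft(x);
f  = (0:N/2-1)*fs/N;                    % frequency axis up to Nyquist
A  = 2*abs(X(1:N/2))/N;                 % single-sided amplitude
plot(f, A), xlabel('Frequency (Hz)'), ylabel('Amplitude')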
The most useful signal processing techniques will be low-, band- or high-pass phase-compensated filters, plus gating functions to selectively process data under certain conditions (see here for a MATLAB implementation). Much can be done with the math channels of the decent analysis packages - which one do you plan to use?
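To show what I mean by a phase-compensated filter and a gating function, a simplified MATLAB sketch (channel names, rates and thresholds are all made up):
% Phase-compensated low-pass filter plus a gating function (illustrative data)
fs = 200;                                     % assumed logging rate, Hz
t  = (0:fs*10-1)/fs;                          % 10 s of made-up data
damperPos = sin(2*pi*1.5*t) + 0.3*randn(size(t));   % stand-in for a damper position channel
speed     = 40 + 80*abs(sin(2*pi*0.05*t));          % stand-in for a speed channel, km/h
[b, a] = butter(4, 5/(fs/2));                 % 4th-order Butterworth low-pass, 5 Hz cut-off
damperSmooth = filtfilt(b, a, damperPos);     % filtfilt runs forwards then backwards: no phase lag
gate = speed > 60;                            % gating: keep only samples logged above 60 km/h
damperAtSpeed = damperSmooth(gate);           % selectively processed data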
But I'll repeat myself: accurate signal processing is built on the foundation of good initial sampling of the data. If you can't be confident in the solutions the datalogger manufacturer has put in place, consider adding hardware filters to your system - there's no way to recover the situation in software if the data has been corrupted by aliasing in the hardware.
Regards, Ian
#5
Posted 27 February 2008 - 21:58
For some free stuff:
Here is an excellent, reasonably practical introduction to using and understanding a spectrum analyser. If you don't know much about Fourier, FFTs, windows, and calibrating then this is a good start.
http://cp.literature.../5952-8898E.pdf
If that is a bit too dry then www.dspguide.com
This looks to me like signal processing with attitude, entertaining and a good intro. It might well tell you everything you need to know, and the approach is about as non-mathematical as is practical.
This has more details about FFTs
http://www.science.u...notes/an041.pdf
Here is an excellent discussion of the philosophical details of FFTs and wavelets.
http://engineering.r...WTtutorial.html
If you are only interested in time domain analysis, have a dig around here http://www.bksv.com/2148.htm for advice on getting good data. As you can tell I'm more of an FFT guy, but I'd reinforce Ian's words: getting good data in is far more important than fancy analysis techniques.
If you want to check the anti-aliasing settings for yourself (unlike Ian I don't get paranoid about them, signal analysers give you direct control over them), just run a swept sine (or your engine's timing pulse!) into the system, and look at the resulting trace and compare it with what you see on a scope. Interesting things happen as you approach or exceed the Nyquist limit if your AA is wrong.
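If you want to see what I mean before you wire anything up, you can fake the whole test in MATLAB (all the rates here are made up): decimate a swept sine without an anti-alias filter and watch it fold back on itself once the sweep passes Nyquist.
% Swept sine through a logger with no anti-alias filter (simulated)
fs_gen = 20000;                          % signal generator time base
t_gen  = 0:1/fs_gen:2;
sweep  = chirp(t_gen, 0, 2, 800);        % 0 to 800 Hz linear sweep over 2 s
fs_log = 500;                            % logger rate, Nyquist = 250 Hz
logged = sweep(1:fs_gen/fs_log:end);     % naive decimation, i.e. no anti-alias filter
spectrogram(logged, 128, 120, 128, fs_log, 'yaxis')
% above 250 Hz the sweep reflects back down the plot instead of carrying on upwards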
Talking of scopes, the Fluke LCD ones are brilliant: http://www.myflukest...scopemeters.php
#6
Posted 28 February 2008 - 13:38
Is it paranoia if they really are out to get you?

Originally posted by Greg Locock:
... As you can tell I'm more of an FFT guy, but I'd reinforce Ian's words, getting good data in is far more important than fancy analysis techniques.
If you want to check the anti-aliasing settings for yourself (unlike Ian I don't get paranoid about them, signal analysers give you direct control over them), just run a swept sine (or your engine's timing pulse!) into the system, and look at the resulting trace and compare it with what you see on a scope. Interesting things happen as you approach or exceed the Nyquist limit if your AA is wrong.
I have worked with one professional motorsport datalogger that had absolutely no anti-alias filtering on its inputs, not even a passive RC network...
In the '90s I worked extensively with combustion measurement data acquisition systems in a dyno environment. The system was crank-angle triggered every degree, so the sample rate increased as engine rpm increased. That presented quite a filtering challenge, as the anti-alias filter had to be rpm-dependent. Again, the manufacturer had not implemented proper configurable filters, although there was a fixed cut-off frequency filter on the inputs that was spec'd for the highest sample rate the system was capable of: 100 kHz, or 16,666 rpm...
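The arithmetic behind that, for anyone following along (one sample per degree means the sample rate is 6 x rpm in Hz; my numbers):
% Crank-angle triggered sampling: sample rate and Nyquist vs engine speed
rpm     = [1000 3000 6000 12000 16666];
fs      = rpm/60*360;                  % one sample per degree -> 6 * rpm samples per second
nyquist = fs/2;                        % highest frequency captured cleanly at each speed
disp([rpm' fs' nyquist'])
% A fixed filter spec'd for the 100 kHz case (Nyquist 50 kHz) is no protection at
% 1000 rpm, where Nyquist is only 3 kHz.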
I think that the reason most 'normal' applications get away with this sort of thing is that most 'normal' noise inputs turn out to be low amplitude and across a wide frequency spread. So you get a little bit of degradation everywhere but nothing systematic and identifiable. The exception would be something like suspension data logged at a very low rate where the tyre natural frequency is aliased back into the chassis movements, or maybe accelerometers / gyros subject to engine vibrations.
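To put a number on the suspension example (illustrative values only): a wheel-hop mode around 15 Hz logged at 20 Hz folds back to 5 Hz, right in amongst the genuine chassis motion.
% Where an under-sampled tyre/wheel-hop mode ends up in the logged data
f_hop   = 15;                                % assumed wheel-hop frequency, Hz
fs      = 20;                                % suspension channel logging rate, Hz
f_alias = abs(f_hop - fs*round(f_hop/fs));   % folds 15 Hz down to 5 Hz
fprintf('%g Hz content appears at %g Hz in the logged data\n', f_hop, f_alias)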
Regards, Ian