This is in relation to a MIDI knob box. All requirements should be similar
to a standard data acquisition system, but only 7-bit accuracy is required
(though I would like to be able to scale up to 14-bit at a later time); it
doesn't matter what you do to the input as long as values from 0-127 are
output. As it's MIDI, any noise is fine as long as the 0-127 range is
maintained.
What I am trying to do is create some sort of automatic voltage scaling
system. The device is intended to be usable by electronically illiterate
people, so no big words/concepts can be required of the end user.
Effectively, I would like a system whereby you can plug any voltage into
the box and it will detect the voltage range, allowing it to auto-calibrate
itself for that input.
The sort of process I had envisaged was as follows:
1. User inserts the voltage to calibrate for into a special calibration input.
2. This input has a fuse rated at the maximum voltage the system can handle
without blowing up. There would be an LED after the fuse, so if it doesn't
come on you know the voltage was really bad. Ideally the system would be
able to handle a fairly high voltage (e.g. 20V+) so they are very unlikely
to blow the fuse.
3. The system would then in some way get the input voltage down to the range
the ADC could handle.
I have no idea if this is really possible in any meaningful way but it would
be pretty useful so I will give it a go.
I had the following idea for achieving the actual scaling:
1. At this point we know the voltage is less than the fuse voltage (oh well,
say fuse voltage + 5V for safety).
2. The voltage is then fed through a voltage divider created with a couple
of digital pots (maybe only one is needed). At first this would be set to the
maximum possible division to get the voltage as low as possible. Then the
amount of division would gradually be decreased until the resulting voltage
was at the upper limit of the ADC. The digital pot values would then be
stored so that the standard inputs (which don't have any extra circuitry for
voltage detection) can be calibrated for that input.
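The step-down search in point 2 above can be sketched in C. Everything concrete here is an assumption for illustration: a 256-tap digital pot, a 10-bit ADC with a 5 V reference, and a `read_adc_with_pot()` stand-in that models the divider instead of talking to real hardware.

```c
#include <stdint.h>

#define ADC_MAX    1023u  /* assumed 10-bit converter */
#define ADC_TARGET 1000u  /* stop just below full scale for headroom */
#define POT_STEPS  256u   /* assumed 256-tap digital pot */
#define VREF_MV    5000u  /* assumed 5 V ADC reference */

/* Stand-in for the hardware: models the divider output as
 * input * tap / (POT_STEPS - 1), converted against VREF_MV,
 * clamping at the converter's full scale. */
static unsigned read_adc_with_pot(unsigned tap, unsigned input_mv)
{
    uint32_t out_mv = (uint32_t)input_mv * tap / (POT_STEPS - 1);
    uint32_t counts = out_mv * (ADC_MAX + 1) / VREF_MV;
    return counts > ADC_MAX ? ADC_MAX : (unsigned)counts;
}

/* Start at maximum division (tap 0) and decrease the division until
 * the next step would push the reading past the target; the tap we
 * stop on is the value to store for calibration. */
unsigned find_calibration_tap(unsigned input_mv)
{
    unsigned tap = 0;
    while (tap + 1 < POT_STEPS &&
           read_adc_with_pot(tap + 1, input_mv) <= ADC_TARGET)
        tap++;
    return tap;
}
```

With a 20 V input this settles on the largest tap whose reading is still at or below the target, which is exactly the "gradually decrease the division" behaviour described above.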
Is this actually possible in this way? Is there a better way to do it?
Ideally no extra components other than the digital pots would be needed
allowing the circuit to be fitted to all inputs (obviously some circuitry is
needed but the less the better). Even better would be to do it after MUXing
as then only one is required but then the MUX must be able to handle the
full input voltage.
On Fri, 24 Sep 1999 11:30:02 +1000 Thomas Brandon <tom@PSY.UNSW.EDU.AU> wrote:
> I would like a system whereby you can plug any voltage into the box
> and it
> will detect the voltage range allowing it to auto calibrate itself
> for that
If you use a "larger" ADC such as a 12-bit unit then you can scale the
converted result with software and still have 7 useful bits. For
example, set the converter up so an input of 1V results in a reading of
127. This decision sets the minimum full-scale voltage that can be used.
The user's source must be able to supply at least 1V full scale in order
to be able to convert it to 128 discrete readings. But without changing
any hardware, the converter can linearly accept voltages up to 32 V. An
input of 32V will convert to the full scale of a 12-bit converter, 4095.
This is the maximum voltage that can be used, more than that and the
digital output will stop increasing. It would be fairly easy to design
the input to tolerate hundreds of volts without burning anything out.
The software would be really simple. Have the user turn the external
voltage up to its maximum and press a "calibrate" button. The processor
takes a reading from the ADC and remembers it. Then for operational
readings, the processor divides all ADC readings by (the full scale
reading / 128). (It may be faster to convert the full scale reading into
a factor for multiplying and throw away a bunch of LSBs instead of
dividing.) This gives a 7-bit useful result over the full range.
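The multiply-and-shift variant suggested above might look like this in C. The Q8.8 factor format and 12-bit readings are assumptions for the sketch, not anything specified in the post.

```c
#include <stdint.h>

/* Computed once at calibration time: factor = 127 * 256 / full_scale,
 * rounded, so each sample then needs only a multiply and a shift. */
uint16_t calc_scale_factor(uint16_t full_scale_reading)
{
    return (uint16_t)(((uint32_t)127 * 256 + full_scale_reading / 2)
                      / full_scale_reading);
}

/* Per-sample scaling: 12-bit ADC reading -> 0..127 MIDI value,
 * clamped so rounding error can never exceed the MIDI range. */
uint8_t scale_to_midi(uint16_t adc_reading, uint16_t factor)
{
    uint32_t v = ((uint32_t)adc_reading * factor) >> 8;
    return v > 127 ? 127 : (uint8_t)v;
}
```

For example, if the user's full-scale voltage reads 1270 counts at calibration, the factor comes out to 26, and a subsequent full-scale reading maps to 127 after the clamp.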
For the input I'd use an op-amp inverting amplifier. The input voltage
goes through a large resistor to the (-) input of the op-amp. The
maximum input voltage this circuit can handle is limited only by the
input resistor. If you don't care too much about always having a
constant impedance at the inputs you can simply multiplex many inputs to
one amplifier. Each input needs its own resistor and a set of diodes to
keep the voltage after the resistor in range of the multiplexer while the
channel is not selected. (If you have SPDT switches at each input then
all the unused inputs could be connected to ground, and the used one
connected to the amplifier. Clamping diodes would still be a really good
idea though.) The input to the amplifier is a "virtual ground". When a
channel is selected, the voltage at the multiplexer is forced to zero so
the clamp diodes do not conduct. You will probably need to add a small
capacitor in parallel with the feedback resistor to keep the amplifier
from oscillating with all the capacitance of multiplexers, etc. at its
input. This forms a low-pass filter which can be useful, but it also
limits the maximum sampling speed.
You could also add more analog switches to switch in different feedback
resistors, changing the gain of the amplifier under software control and
extending the input range even further. The circuit should be safe from
damage even if a high voltage is applied inadvertently while the
amplifier is set for high gain. The amplifier would just saturate
(causing the ADC to read out of range) while the clamp diodes would keep
the amplifier input voltage from rising too much.
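The gain-stepping idea above can be sketched the same way. The four gain steps, the 12-bit converter, and `read_adc_at_gain()` are all assumptions here; the stand-in models the saturating amplifier rather than real hardware.

```c
#include <stdint.h>

#define NUM_GAINS 4
#define ADC_FULL  4095u  /* 12-bit full scale */

/* Hypothetical gains selected by switching feedback resistors. */
static const uint8_t gains[NUM_GAINS] = {1, 2, 4, 8};

/* Stand-in hardware model: the amplifier saturates, so readings
 * clamp at full scale instead of anything being damaged. */
static unsigned read_adc_at_gain(uint8_t idx, unsigned input_counts)
{
    uint32_t v = (uint32_t)input_counts * gains[idx];
    return v >= ADC_FULL ? ADC_FULL : (unsigned)v;
}

/* Start at the lowest (safest) gain and step up until the next gain
 * would drive the converter to full scale (out of range). */
uint8_t pick_gain(unsigned input_counts)
{
    uint8_t i = 0;
    while (i + 1 < NUM_GAINS &&
           read_adc_at_gain(i + 1, input_counts) < ADC_FULL)
        i++;
    return i;
}
```

Starting from the lowest gain mirrors the safety argument in the paragraph above: an unexpectedly large input just saturates at the current setting instead of overdriving a high-gain stage.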