Instrumentation

M1. Explain the importance of resolution, accuracy, sensitivity, bandwidth and input impedance on the performance of a piece of test equipment.

In measuring instruments, resolution refers to the smallest change in the measurand that produces a detectable change in the readout. In digital readouts it corresponds to one count of the least significant digit; in analog readouts it is the smallest functional increment on the display scale. The resolution of test equipment shows the smallest portion of the measurand that can be observed, indicating how fine the measurement is and to how many decimal places changes can be read. It also serves as an indication of the equipment's precision and its ability to discriminate between closely spaced input values.
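
As an illustration, the following minimal Python sketch computes the resolution of a hypothetical N-bit digital readout over an assumed full-scale range; the 12-bit, 10 V figures are example values, not taken from any particular instrument:

    def lsb_resolution(full_scale_volts: float, n_bits: int) -> float:
        """Smallest voltage step the readout can display (one LSB)."""
        return full_scale_volts / (2 ** n_bits)

    # A 12-bit readout on a 10 V range resolves steps of about 2.44 mV;
    # input changes smaller than this do not alter the displayed value.
    print(f"{lsb_resolution(10.0, 12) * 1e3:.2f} mV")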

The accuracy of test equipment refers to the difference between the true value of the measurand and the value indicated by the equipment. It is an indication of how close the measured value is to the true value of the measurand. In test equipment, accuracy specifications denote the degree of uncertainty inherent in a measurement made by the instrument under specified conditions.
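
Digital instruments often state accuracy as plus or minus a percentage of the reading plus a number of counts. The sketch below, using assumed example figures, turns such a specification into a worst-case uncertainty:

    def spec_uncertainty(reading: float, pct_of_reading: float,
                         counts: int, count_size: float) -> float:
        """Worst-case uncertainty implied by a '% of reading + counts' spec."""
        return reading * pct_of_reading / 100.0 + counts * count_size

    # Example: a 5.000 V reading with a +/-(0.5% + 2 counts) spec,
    # on a range where one count equals 1 mV.
    u = spec_uncertainty(5.000, 0.5, 2, 0.001)
    print(f"5.000 V +/- {u:.3f} V")   # +/- 0.027 V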

The sensitivity of test equipment is the ratio of the change in the equipment's indication to the corresponding change in the value of the measurand. Sensitivity determines the smallest input change the equipment can resolve. The sensitivity bandwidth denotes the frequency range over which the equipment's measurements are usable. Below the lower cutoff frequency and above the upper cutoff frequency, the equipment may not respond to changes in the input, so the output may not be meaningful.
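
The ratio definition of sensitivity can be stated directly in code; the thermocouple-style figures below are illustrative assumptions only:

    def sensitivity(delta_indication: float, delta_measurand: float) -> float:
        """Change in indication per unit change in measurand."""
        return delta_indication / delta_measurand

    # Example: a readout that moves 4.1 mV for a 100 degC input change
    # has a sensitivity of 41 uV/degC.
    print(f"{sensitivity(4.1e-3, 100.0) * 1e6:.0f} uV/degC")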

Input impedance is the impedance seen across the input terminals of the test equipment. It determines how much power the equipment draws from the source being measured; a high input impedance keeps this power small and therefore keeps loading errors low.
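
The loading effect can be modelled as a voltage divider formed by the source impedance and the instrument's input impedance; the 10 kΩ source and 10 MΩ input below are assumed example values:

    def loading_error_pct(z_src_ohms: float, z_in_ohms: float) -> float:
        """Percentage by which the indicated voltage reads low."""
        measured_fraction = z_in_ohms / (z_src_ohms + z_in_ohms)
        return (1.0 - measured_fraction) * 100.0

    # A 10 kohm source into a 10 Mohm input reads only about 0.1% low,
    # which is why a high input impedance keeps loading errors small.
    print(f"{loading_error_pct(10e3, 10e6):.2f}% low")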

M2. Use a manufacturer's recommended procedure together with laboratory instruments and standards to calibrate and configure an item of electronic equipment.

A multiplexer is the instrument chosen for calibration and configuration using the manufacturer's recommended procedures and standards. The following tools from the laboratory are needed for multiplexer calibration: a 5/16-inch wrench, two SMA test cables, a radio frequency (RF) power supply, a multiplexer control box, an RF signal generator, a spectrum analyzer, an SMA connector for the spectrum analyzer and a calibration data sheet. The procedure for calibrating and configuring the multiplexer comprises three stages: preliminary setup, measuring the voltage standing wave ratio (VSWR) and measuring the insertion loss.

Preliminary Setup

First connect the RF power supply to the multiplexer control box with the attached ribbon cable, and likewise connect the multiplexer control box to the multiplexer with its attached ribbon cable. Then power on the RF power supply, RF signal generator and spectrum analyzer; set the RF signal generator to 1497 MHz and turn the RF output on. Next, connect one SMA test cable between the RF signal generator and the spectrum analyzer, measure the loss in the cable and record the measurement in the calibration data sheet. Repeat the measurement with the second SMA cable. Lastly, connect one SMA test cable from the RF signal generator to one input channel on the RF switch and the other SMA test cable from the spectrum analyzer to the matching channel output port.
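
A minimal sketch of recording the cable-loss step is given below; the generator output level and the analyzer readings are assumed example figures, and the calibration data sheet is modelled as a plain dictionary:

    GEN_LEVEL_DBM = 0.0   # assumed RF signal generator output level at 1497 MHz

    def cable_loss_db(analyzer_reading_dbm: float) -> float:
        """Loss is the drop from the generator level to the analyzer reading."""
        return GEN_LEVEL_DBM - analyzer_reading_dbm

    calibration_sheet = {
        "cable_1_loss_db": cable_loss_db(-0.8),   # example analyzer reading
        "cable_2_loss_db": cable_loss_db(-0.9),
    }
    print(calibration_sheet)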

Measure the Voltage Standing Wave Ratio

After the preliminary setup, measure the voltage standing wave ratio (VSWR) for each input and output and record the values in the calibration data sheet. Each value must be at or below 1.8 to pass. If any value exceeds this limit, adjust the microprocessor parameters until the requirement is met.
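
If the VSWR is derived from a return-loss reading on the spectrum analyzer, the conversion and the 1.8 pass limit can be checked as in this sketch (the 11 dB reading is an assumed example):

    def vswr_from_return_loss(return_loss_db: float) -> float:
        gamma = 10 ** (-return_loss_db / 20.0)   # reflection coefficient magnitude
        return (1 + gamma) / (1 - gamma)

    v = vswr_from_return_loss(11.0)              # example: 11 dB return loss
    print(f"VSWR = {v:.2f} ->", "PASS" if v <= 1.8 else "ADJUST")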

Measure Insertion Loss

Measure the loss for each channel by using the multiplexer control box to switch between channels; to select a channel, enter the binary equivalent of the channel number on the multiplexer control box. Move the SMA cables accordingly for each measurement and record each reading in the calibration data sheet. Ensure that the measurements conform to the requirements; if not, adjustments such as changing the cable size or setting the RF generator to a different frequency can be made. After recording the measurements, calculate the difference in insertion loss between all channels and ensure that it is within the required range.
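
The bookkeeping for this step might look like the sketch below: each channel's raw loss is corrected for the two test-cable losses recorded earlier, and the channel-to-channel spread is then computed. All numbers are assumed examples:

    def insertion_loss_db(gen_dbm, analyzer_dbm, cable1_db, cable2_db):
        """Channel loss with both test-cable losses removed."""
        return (gen_dbm - analyzer_dbm) - cable1_db - cable2_db

    readings = {1: -3.1, 2: -3.3, 3: -3.0, 4: -3.4}   # analyzer readings (dBm)
    losses = {ch: insertion_loss_db(0.0, p, 0.8, 0.9) for ch, p in readings.items()}
    spread = max(losses.values()) - min(losses.values())
    print(losses, f"spread = {spread:.1f} dB")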

M1. Explain the benefits and limitations of programmable controllers for a specific application.

A programmable controller is a modular, solid-state computer with tailored instructions for implementing a particular task in an industrial control system: it constantly monitors the state of input devices and makes decisions based on a stored program to control the state of output devices. A simplified sketch of this scan cycle is given below.
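
The sketch below is an illustrative Python model of that scan cycle, not vendor code; the function names and the level-switch example are invented for clarity:

    import time

    def scan_cycle(read_inputs, program, write_outputs, scan_time_s=0.01):
        """Toy model of the PLC scan: read inputs, run program, write outputs."""
        while True:
            inputs = read_inputs()       # 1. sample all input devices
            outputs = program(inputs)    # 2. evaluate the control program
            write_outputs(outputs)       # 3. update all output devices
            time.sleep(scan_time_s)      # 4. wait, then start the next scan

    # Example program: run the pump only while the level switch is made.
    pump_logic = lambda i: {"pump": i["level_switch_high"]}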

Some benefits of using programmable controllers for a particular application include:

  • Simplicity: programmable controllers are simple to use and operate, and they make it easier to troubleshoot applications in industrial control systems. Programs and application processes can also be developed and simulated offline.
  • Modularity: programmable controllers are modular, providing scalability; modules can be added or removed depending on the application requirements.
  • Reduced operational costs due to reduced power losses and lower maintenance costs.
  • Faster operation due to short scan times, improved processor capability and faster troubleshooting of applications.
  • Rugged design that withstands vibration, temperature, humidity and electrical noise without degrading application performance.
  • Easier monitoring of processes through HMI devices and from PCs.
  • A cleaner, tidier control-system environment, since less panel space is required and the panels remain neat and presentable.

Some disadvantages of using programmable controllers for a particular application include:

  • High initial cost: the cost of obtaining and installing programmable controllers is high, so they are only justified where the returns from the application are correspondingly high.
  • Programming the controllers requires technical knowledge in related areas such as manufacturing, electrical instrumentation and control.
  • Programmable controllers require additional devices such as sensors and actuators before they can be used in an application; acquiring these devices can be expensive, raising the overall cost further.

D1. Evaluate the accuracy of own test measurements and relate them to limitations of the test equipment, test procedures or possible emerging fault conditions.

On evaluating the accuracy of voltage measurements taken on an oscilloscope at different frequencies, it was noticed that the accuracy fluctuated at some frequencies and remained constant for frequencies within the bandwidth. The measurements were accurate to 3% for frequencies within the bandwidth, and these were the most accurate measurements. The first effect on measurement accuracy was therefore the bandwidth limitation of the oscilloscope: the 3% accuracy held only within the bandwidth, while voltage signals at other frequencies were attenuated, lowering the accuracy. This implies that the bandwidth was not wide enough to accommodate all the measurement frequencies. The gain accuracy of the oscilloscope was also affected by temperature, so the operating temperature could have produced a lower accuracy than that indicated on the oscilloscope; the measured voltage amplitudes therefore carried additional percentage errors due to temperature effects.
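
To see how the bandwidth limit dominates the quoted 3% gain accuracy, the sketch below models the scope as a single-pole low-pass response; the 100 MHz bandwidth is an assumed figure, not the actual instrument specification:

    import math

    def amplitude_error_pct(f_signal_hz: float, bandwidth_hz: float) -> float:
        """How far a first-order response reads low at a given frequency."""
        gain = 1.0 / math.sqrt(1.0 + (f_signal_hz / bandwidth_hz) ** 2)
        return (1.0 - gain) * 100.0

    for f in (20e6, 60e6, 100e6):        # assumed 100 MHz oscilloscope
        print(f"{f/1e6:.0f} MHz: reads {amplitude_error_pct(f, 100e6):.1f}% low")
    # At the -3 dB bandwidth itself the signal reads about 29% low.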

Other limitations that could have affected the accuracy of the measurements include input impedance and resistance tolerance, A/D reference accuracy, A/D resolution and capacitive loading. The input impedance was finite and, because the effects of resistance and capacitance vary with frequency, the accuracy was affected: resistance affected the accuracy at in-band frequencies, while capacitance introduced capacitive-loading errors at higher frequencies, reducing the accuracy further. A/D reference accuracy limitations and resolution also affected the accuracy through quantization errors introduced at different resolution levels. The frequency and voltage limits of the oscilloscope also restricted the voltage levels that could be measured, above which the output remained constant.
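
The capacitive-loading point can be made concrete by computing the magnitude of a typical 1 MΩ in parallel with 15 pF scope input versus frequency; these component values are typical assumptions, not measurements from the instrument used:

    import math

    def input_impedance_ohms(f_hz: float, r_ohms=1e6, c_farads=15e-12) -> float:
        """|Z| of a parallel RC input: R / sqrt(1 + (2*pi*f*R*C)^2)."""
        w = 2 * math.pi * f_hz
        return r_ohms / math.sqrt(1.0 + (w * r_ohms * c_farads) ** 2)

    for f in (1e3, 1e6, 100e6):
        print(f"{f:>11,.0f} Hz: |Z_in| = {input_impedance_ohms(f):,.0f} ohm")
    # The 1 Mohm input collapses to ~10.6 kohm at 1 MHz and ~106 ohm at 100 MHz.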

Some of the operating procedures that could have contributed to these accuracies include incorrect triggering of the oscilloscope, which blurs the waveforms and widens the amplitude error bands, and improper grounding, which introduces inductance and ringing and reduces the displayed amplitude levels.

D2. Devise and demonstrate a calibration procedure for an item of electronic test equipment.

Calibration is a process of comparing the measurements of a piece of test equipment against a recognized standard of established accuracy and precision, in order to identify and adjust any discrepancy in the accuracy and precision of the test equipment so that it conforms to the standard to the desired degree. The main objectives of calibration are to check the accuracy of the test equipment and to establish the traceability of its measurements. Calibration ensures accurate readings, reliable test-equipment readings and consistency of readings between different pieces of test equipment. The calibration of electronic test equipment differs with the nature and complexity of the equipment; however, the basic procedures are much the same. Below is a calibration procedure for electronic test equipment:

  • Equipment requirements: establish the equipment inventory specifications, including item numbers, manufacturer and model, measurement range, serial number and other unique identification, equipment status and the date of previous calibration.
  • Reference standards: establish the reference standards for the calibration of the test equipment in question, including the reference SI units and the required measurement accuracy.
  • Detailed inspection: carry out a detailed inspection to ensure all parts of the equipment are in order and to standard. This includes a visual inspection and cleaning to remove any dirt, dust or corrosion; inspection of moving parts, with repair and fastening of loose ones; and inspection of the measuring parts to ensure they perform as required. In case an important part cannot be repaired, it can be replaced.
  • Detailed measurements and adjustments: carry out detailed measurements and record them to confirm they meet the required accuracy. The measurements are first made with the standard device and then with the equipment under calibration, and adjustments are then made until the readings agree with the standard to the required accuracy (a minimal record-keeping sketch follows this list). The adjustments differ from equipment to equipment depending on its nature and complexity.
  • Repeat the detailed measurements and adjustments under different environmental conditions in order to establish and specify the storage and operating conditions of the equipment, preventing damage and interference with measurements due to humidity, temperature or vibration.
  • Establish the effects of the different environmental conditions on the equipment's measurements so as to determine when the equipment should be recalibrated.
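
A minimal record-keeping sketch for the measurement-and-adjustment step is given below; the field names and tolerance figure are hypothetical, standing in for whatever the relevant standard specifies:

    from dataclasses import dataclass

    @dataclass
    class CalPoint:
        nominal: float        # value applied by the reference standard
        uut_reading: float    # value indicated by the equipment under calibration
        tolerance: float      # allowed deviation from nominal, same units

        def error(self) -> float:
            return self.uut_reading - self.nominal

        def in_tolerance(self) -> bool:
            return abs(self.error()) <= self.tolerance

    point = CalPoint(nominal=1.000, uut_reading=1.004, tolerance=0.005)
    print(f"{point.error():+.3f}", point.in_tolerance())   # +0.004 True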

The general procedure is demonstrated below by calibrating an electrometer. The following steps are undertaken during the calibration process:

  • Read through the inventory of the electrometer to be calibrated and determine the electrometer model, measurement range, label number, cable ID and the calibration instrument to be used.
  • Obtain the electrometer calibration standards manual to determine the required calibrations and other specifications. The required specifications are the voltage offset range and the response at different frequencies. Specifications given in the Model 5156 and 6517A electrometer calibration standards include nominal tolerance levels, maximum input voltages and stability levels at different frequencies.
  • Inspect the electrometer to be calibrated and remove any dirt or dust. Also ensure all terminals are firmly attached to the board and the body of the electrometer, and carry out any repair or replacement of terminals and other parts where necessary.
  • Connect the electrometer to the calibration instrument as per the guidelines provided in the calibration instrument manual. Begin by measuring the voltage offsets and monitor the voltage-versus-time curves appearing on the instrument; any value that is out of range will appear in the error message box. Note all these values in the calibration sheet, take the readings at different frequencies and apply the electrometer corrections for out-of-range values.
  • Take measurements at different temperatures and determine the effects of temperature on the performance of the electrometer. Note the ranges over which temperature has no effect on the measurements.
  • Record the various specifications for the electrometer after calibration, including accuracy, maximum input voltage, operating frequency ranges, tolerance levels and operating temperature levels.

References

Adrover, E. P. (2012). Introduction to PLCs: A beginner’s guide to programmable logic controllers. San Bernardino, CA: Elvin Perez Adrover.

Downton, B. (2010). Improving the accuracy of electrometer calibrations. Medical Physics, 37(7, Part 2), 3891. doi:10.1118/1.3476130

Keithley Instruments, Inc. (2001). Model 5156 Electrometer Calibration Standard Instruction Manual (2nd ed.). Cleveland, OH: Keithley Instruments, Inc.

Schmid, J. (2012). Oscilloscope Selection. Electronics Testing & Measurement. doi:10.1007/978-1-349-01191-9_18

Squier, D. (2010). Multiplexer calibration and test procedure. doi:10.2172/10148744

Webster, J. G., & Eren, H. (2017). Measurement, instrumentation, and sensors handbook: Electromagnetic, optical, radiation, chemical, and biomedical measurement. S.l.: CRC Press.

Westinghouse Electric Corporation. (207). Calibrating multiplexer circuits. Oak Ridge, TN: United States. Dept. of Energy. Office of Scientific and Technical Information.
