Mismatch reduction technique for transistors with minimum channel length

abstract

  • An analog calibration technique is presented to improve the parameter matching between transistors in the differential high-frequency signal path of analog CMOS circuits. It can be applied for mismatch reduction in differential broadband amplifiers and direct down-conversion mixers in which short-channel devices are utilized to minimize the bandwidth reduction caused by parasitic capacitances. In general, the proposed methodology is suitable for radio frequency (RF) applications in which direct matching of the transistors is undesirable because sophisticated layout practices would increase the coupling between the high-frequency paths. The approach involves auxiliary devices that sense the existing mismatch as part of a feedback loop for error minimization. The technique is demonstrated with a differential amplifier that has a loaded gain of 12.9 dB and a -3 dB frequency of 2.14 GHz, designed in 90 nm CMOS technology with a 1.2 V supply. Monte Carlo simulations indicate that the 4.06 mV standard deviation of the amplifier's anticipated input-referred offset voltage improves to 0.76-1.28 mV with the mismatch reduction loop, depending on the layout configuration of the calibration circuitry. The associated drain current mismatch of the transistor pair under calibration in the amplifier core is reduced from 3.1% to 0.6-1.0%. © 2011 Springer Science+Business Media, LLC.
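
  As a rough illustration of the kind of Monte Carlo mismatch analysis the abstract refers to, the sketch below estimates the input-referred offset spread of a differential pair using a Pelgrom-style mismatch model. This is not the paper's simulation setup: the matching coefficients (A_VT, A_BETA), device dimensions, overdrive voltage, and first-order offset formulas are hypothetical placeholders chosen only to show the mechanics.

```python
# Minimal Monte Carlo sketch of differential-pair mismatch, assuming a
# Pelgrom-style model. All parameter values are hypothetical, not from
# the paper or its 90 nm process data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical device and process parameters:
W, L   = 10e-6, 90e-9   # width/length in meters (minimum-length device)
A_VT   = 3.5e-9         # Vth matching coefficient, V*m (process-dependent)
A_BETA = 0.01e-6        # current-factor matching coefficient, m
V_OV   = 0.2            # overdrive voltage, V
N      = 10_000         # Monte Carlo trials

area = W * L

# Pelgrom's law: sigma of the pair mismatch scales as A / sqrt(W*L)
d_vth  = rng.normal(0.0, A_VT / np.sqrt(area), N)    # threshold mismatch, V
d_beta = rng.normal(0.0, A_BETA / np.sqrt(area), N)  # relative beta mismatch

# First-order input-referred offset of a differential pair:
#   Vos ~= dVth + (Vov/2) * (dBeta/Beta)
v_os = d_vth + (V_OV / 2.0) * d_beta

# Corresponding drain-current mismatch:
#   dI/I ~= (2/Vov) * dVth + dBeta/Beta
di_over_i = (2.0 / V_OV) * d_vth + d_beta

print(f"sigma(Vos)  = {np.std(v_os) * 1e3:.2f} mV")
print(f"sigma(dI/I) = {np.std(di_over_i) * 1e2:.2f} %")
```

  With these placeholder values the script prints an offset spread of a few millivolts, the same order of magnitude as the 4.06 mV quoted in the abstract; the actual figures depend entirely on the process matching data and on the calibration loop, which this sketch does not model.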

published proceedings

  • Analog Integrated Circuits and Signal Processing

author list (cited authors)

  • Onabajo, M., & Silva-Martinez, J.

citation count

  • 3

complete list of authors

  • Onabajo, Marvin; Silva-Martinez, Jose

publication date

  • March 2012