Giving Keener 'Electric Eyesight' to Autonomous Vehicles
February 15, 2019 | MIT
Estimated reading time: 5 minutes

Autonomous vehicles relying on light-based image sensors often struggle to see through blinding conditions, such as fog. But MIT researchers have developed a sub-terahertz-radiation receiving system that could help steer driverless cars when traditional methods fail.
Image Caption: MIT researchers have developed a chip that leverages sub-terahertz wavelengths for object recognition, which could be combined with light-based image sensors to help steer driverless cars through fog.
Sub-terahertz wavelengths, which fall between microwave and infrared radiation on the electromagnetic spectrum, can be detected through fog and dust clouds with ease, whereas the infrared-based LiDAR imaging systems used in autonomous vehicles struggle. To detect objects, a sub-terahertz imaging system sends an initial signal through a transmitter; a receiver then measures the absorption and reflection of the rebounding sub-terahertz wavelengths. The receiver sends a signal to a processor, which recreates an image of the object.
But implementing sub-terahertz sensors into driverless cars is challenging. Sensitive, accurate object-recognition requires a strong output baseband signal from receiver to processor. Traditional systems, made of discrete components that produce such signals, are large and expensive. Smaller, on-chip sensor arrays exist, but they produce weak signals.
In a paper published online on Feb. 8 by the IEEE Journal of Solid-State Circuits, the researchers describe a two-dimensional, sub-terahertz receiving array on a chip that’s orders of magnitude more sensitive, meaning it can better capture and interpret sub-terahertz wavelengths in the presence of a lot of signal noise.
To achieve this, they implemented a scheme of independent signal-mixing pixels — called “heterodyne detectors” — that are usually very difficult to densely integrate into chips. The researchers drastically shrank the size of the heterodyne detectors so that many of them can fit into a chip. The trick was to create a compact, multipurpose component that can simultaneously down-mix input signals, synchronize the pixel array, and produce strong output baseband signals.
The researchers built a prototype, which has a 32-pixel array integrated on a 1.2-square-millimeter device. The pixels are approximately 4,300 times more sensitive than the pixels in today’s best on-chip sub-terahertz array sensors. With a little more development, the chip could potentially be used in driverless cars and autonomous robots.
“A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones,” says co-author Ruonan Han, an associate professor of electrical engineering and computer science, and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL). “Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.”
Joining Han on the paper are first author Zhi Hu and co-author Cheng Wang, both PhD students in the Department of Electrical Engineering and Computer Science working in Han’s research group.
Decentralized Design
The key to the design is what the researchers call “decentralization.” In this design, a single pixel — called a “heterodyne” pixel — generates both the frequency beat (the frequency difference between two incoming sub-terahertz signals) and the “local oscillation,” an electrical signal that shifts the frequency of an incoming signal. This “down-mixing” process produces a signal in the megahertz range that can be easily interpreted by a baseband processor.
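To make the down-mixing step concrete, here is a minimal sketch with assumed frequencies (not the paper's actual values): multiplying the incoming sub-terahertz signal by a slightly offset local oscillation produces components at the sum and difference frequencies, and the megahertz-range difference is the beat handed to the baseband processor.

```python
import numpy as np

# Minimal sketch of heterodyne down-mixing (illustrative frequencies, not
# the paper's design): the product of the incoming tone and the local
# oscillation contains f_rf - f_lo and f_rf + f_lo; a low-pass stage keeps
# only the megahertz-range difference, the "beat."
f_rf = 240.00e9                  # assumed incoming sub-terahertz tone, 240 GHz
f_lo = 239.99e9                  # assumed local oscillation, offset by 10 MHz
fs = 1e12                        # simulation sample rate, above Nyquist for both tones
t = np.arange(0, 1e-7, 1 / fs)   # one full beat period at 10 MHz

rf = np.cos(2 * np.pi * f_rf * t)
lo = np.cos(2 * np.pi * f_lo * t)
mixed = rf * lo                  # contains the sum and difference components

beat = abs(f_rf - f_lo)
print(f"Beat frequency handed to the baseband processor: {beat / 1e6:.0f} MHz")
```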
The output signal can be used to calculate the distance of objects, similar to how LiDAR calculates the time it takes a laser to hit an object and rebound. In addition, combining the output signals of an array of pixels, and steering the pixels in a certain direction, can enable high-resolution images of a scene. This allows for not only the detection but also the recognition of objects, which is critical in autonomous vehicles and robots.
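The ranging idea the article compares to LiDAR can be sketched as simple time-of-flight arithmetic (illustrative only; the chip's exact ranging scheme is not detailed here):

```python
C = 299_792_458.0  # speed of light in meters per second

def round_trip_distance(delay_s: float) -> float:
    # The signal travels out to the object and back, so the one-way
    # distance is half the path covered during the measured delay.
    return C * delay_s / 2.0

# Example: a 0.2-microsecond round trip corresponds to roughly 30 meters.
print(round_trip_distance(0.2e-6))  # ~29.98
```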
Heterodyne pixel arrays work only when the local oscillation signals from all pixels are synchronized, meaning that a signal-synchronizing technique is needed. Centralized designs include a single hub that shares local oscillation signals to all pixels.
These designs are usually used by receivers of lower frequencies, and can cause issues at sub-terahertz frequency bands, where generating a high-power signal from a single hub is notoriously difficult. As the array scales up, the power shared by each pixel decreases, reducing the output baseband signal strength, which is highly dependent on the power of the local oscillation signal. As a result, a signal generated by each pixel can be very weak, leading to low sensitivity. Some on-chip sensors have started using this design, but are limited to eight pixels.
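A rough back-of-the-envelope sketch makes the trade-off concrete (assumed numbers, not measurements from the paper): if a single hub supplies a fixed amount of local oscillation power, each pixel's share shrinks as the array grows, and the output baseband signal weakens with it.

```python
# Rough illustration of the centralized scaling problem (assumed numbers,
# not values from the paper, and ignoring distribution losses): one hub's
# local oscillation power is split among all pixels.
hub_power_mw = 1.0  # assumed total LO power available from a central hub

for n_pixels in (8, 32, 128):
    per_pixel_mw = hub_power_mw / n_pixels
    print(f"{n_pixels:4d} pixels -> {per_pixel_mw * 1000:6.1f} microwatts per pixel")
```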
The researchers’ decentralized design tackles this scale-sensitivity trade-off. Each pixel generates its own local oscillation signal, used for receiving and down-mixing the incoming signal. In addition, an integrated coupler synchronizes its local oscillation signal with that of its neighbor. This gives each pixel more output power, since the local oscillation signal does not flow from a global hub.
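The neighbor-to-neighbor synchronization can be pictured with a generic phase-coupling model (a simplified stand-in, not the chip's actual coupler circuit): every pixel keeps its own oscillator, and each one repeatedly nudges its phase toward its neighbors' until the whole 32-pixel array settles on a common phase.

```python
import numpy as np

# Generic nearest-neighbor coupling model (illustrative only, not the
# chip's coupler circuit): each pixel runs its own oscillator and nudges
# its phase toward its immediate neighbors' phases on every step.
rng = np.random.default_rng(0)
phases = rng.uniform(-0.5, 0.5, 32)  # 32 pixels with scattered starting phases
gain = 0.2                           # assumed coupling strength

for _ in range(5000):
    padded = np.pad(phases, 1, mode="edge")  # chain of pixels, edges have one neighbor
    phases += gain * (padded[:-2] - 2 * phases + padded[2:])

print(np.ptp(phases))  # phase spread collapses toward zero: the array locks together
```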
A good analogy for the new decentralized design is an irrigation system, Han says. A traditional irrigation system has one pump that directs a powerful stream of water through a pipeline network that distributes water to many sprinkler sites. Each sprinkler spits out water that has a much weaker flow than the initial flow from the pump. If you want the sprinklers to pulse at the exact same rate, that would require another control system.
The researchers’ design, on the other hand, gives each site its own water pump, eliminating the need for connecting pipelines, and gives each sprinkler its own powerful water output. Each sprinkler also communicates with its neighbor to synchronize their pulse rates. “With our design, there’s essentially no boundary for scalability,” Han says. “You can have as many sites as you want, and each site still pumps out the same amount of water … and all pumps pulse together.”
The new architecture, however, potentially makes the footprint of each pixel much larger, which poses a challenge to large-scale, high-density integration of the array. In their design, the researchers combined the functions of four traditionally separate components — antenna, downmixer, oscillator, and coupler — into a single “multitasking” component given to each pixel. This allows for a decentralized design of 32 pixels.
“We designed a multifunctional component for a [decentralized] design on a chip and combined a few discrete structures to shrink the size of each pixel,” Hu says. “Even though each pixel performs complicated operations, it keeps its compactness, so we can still have a large-scale dense array.”
Guided by Frequencies
In order for the system to gauge an object’s distance, the frequency of the local oscillation signal must be stable.
To that end, the researchers incorporated into their chip a component called a phase-locked loop, which locks the sub-terahertz frequency of all 32 local oscillation signals to a stable, low-frequency reference. Because the pixels are coupled, their local oscillation signals all share identical, high-stability phase and frequency. This ensures that meaningful information can be extracted from the output baseband signals. This entire architecture minimizes signal loss and maximizes control.
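The role of the phase-locked loop can be summarized by the frequency relationship it enforces (assumed example numbers, not the paper's actual reference or division ratio): the loop ties the sub-terahertz local oscillation to an integer multiple of a stable low-frequency reference, so the high-frequency signal inherits the reference's stability.

```python
# Illustrative PLL frequency relationship (assumed values, not the paper's):
# the loop forces f_lo = N * f_ref, so the local oscillation inherits the
# stability of the low-frequency reference.
f_ref = 100e6          # assumed stable reference, 100 MHz
n_divider = 2400       # assumed integer division ratio

f_lo = n_divider * f_ref
print(f"Locked local oscillation: {f_lo / 1e9:.0f} GHz")  # 240 GHz

# Any drift in the reference maps proportionally onto the locked output:
ppm_drift = 0.5        # assumed 0.5 ppm reference drift
print(f"Output drift: {f_lo * ppm_drift * 1e-6 / 1e3:.0f} kHz")  # 120 kHz
```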
“In summary, we achieve a coherent array, at the same time with very high local oscillation power for each pixel, so each pixel achieves high sensitivity,” Hu says.