Tantalizing Discovery May Boost Memory Technology
August 11, 2015 | Rice University
Scientists at Rice University have created a solid-state memory technology that allows for high-density storage with a minimum incidence of computer errors.
The memories are based on tantalum oxide, a common insulator in electronics. Applying voltage to a 250-nanometer-thick sandwich of graphene, tantalum, nanoporous tantalum oxide and platinum creates addressable bits where the layers meet. Control voltages that shift oxygen ions and vacancies switch the bits between ones and zeroes.
The discovery by the Rice lab of chemist James Tour could allow for crossbar array memories that store up to 162 gigabits, a much higher capacity than other oxide-based memory systems under investigation. (Eight bits equal one byte; a 162-gigabit unit would store about 20 gigabytes of information.)
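As a back-of-the-envelope check of the capacity figure, using only the standard 8-bits-per-byte conversion (the function name below is illustrative):

```python
BITS_PER_BYTE = 8

def gigabits_to_gigabytes(gigabits: float) -> float:
    """Convert a capacity in gigabits to gigabytes."""
    return gigabits / BITS_PER_BYTE

# A 162-gigabit array works out to 20.25 gigabytes,
# i.e. the "about 20 gigabytes" quoted above.
print(gigabits_to_gigabytes(162))  # 20.25
```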
Details appear online in the American Chemical Society journal Nano Letters.
Like the Tour lab’s previous discovery of silicon oxide memories, the new devices require only two electrodes per circuit, making them simpler than present-day flash memories that use three. “But this is a new way to make ultradense, nonvolatile computer memory,” Tour said.
Nonvolatile memories hold their data even when the power is off, unlike volatile random-access computer memories that lose their contents when the machine is shut down.
Modern memory chips have many requirements: They have to read and write data at high speed and hold as much data as possible. They must also be durable and show good retention of that data while using minimal power.
Tour said Rice’s new design, which requires 100 times less energy than present devices, has the potential to hit all the marks.
“This tantalum memory is based on two-terminal systems, so it’s all set for 3-D memory stacks,” he said. “And it doesn’t even need diodes or selectors, making it one of the easiest ultradense memories to construct. This will be a real competitor for the growing memory demands in high-definition video storage and server arrays.”
The layered structure consists of tantalum, nanoporous tantalum oxide and multilayer graphene between two platinum electrodes. In making the material, the researchers found the tantalum oxide gradually loses oxygen ions, changing from an oxygen-rich, nanoporous semiconductor at the top to oxygen-poor at the bottom. Where the oxygen disappears completely, it becomes pure tantalum, a metal.
The researchers determined that three related factors give the memories their unique switching ability.
First, the control voltage mediates how electrons pass through a boundary that can flip from an ohmic (current flows in both directions) to a Schottky (current flows one way) contact and back.
Second, the boundary’s location can change based on oxygen vacancies. These are “holes” in atomic arrays where oxygen ions should exist, but don’t. The voltage-controlled movement of oxygen vacancies shifts the boundary from the tantalum/tantalum oxide interface to the tantalum oxide/graphene interface. “The exchange of contact barriers causes the bipolar switching,” said Gunuk Wang, lead author of the study and a former postdoctoral researcher at Rice.
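The vacancy-driven switching described above can be caricatured with a toy numerical model. This is my own illustrative sketch, not the Rice group's device physics: a single state variable w in [0, 1] stands in for the position of the vacancy-defined boundary, and the resistance values and drift rate are invented for illustration.

```python
R_ON, R_OFF = 1e3, 1e6   # illustrative low/high resistance states (ohms)
DRIFT = 0.2              # illustrative boundary shift per voltage pulse

def apply_pulse(w: float, voltage: float) -> float:
    """Drift the boundary position w by one pulse; clamp to [0, 1]."""
    w += DRIFT * (1.0 if voltage > 0 else -1.0)
    return min(1.0, max(0.0, w))

def resistance(w: float) -> float:
    """Interpolate between the ohmic (R_ON) and Schottky-limited (R_OFF) contacts."""
    return R_ON * w + R_OFF * (1.0 - w)

w = 0.0                   # start in the high-resistance "0" state
for _ in range(5):        # a train of positive write pulses...
    w = apply_pulse(w, +1.0)
print(resistance(w))      # ...drives the cell to low resistance: 1000.0
```

Reversing the pulse polarity walks w back toward 0 and restores the high-resistance state, which is the essence of bipolar switching.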
Third, the flow of current draws oxygen ions from the tantalum oxide nanopores and stabilizes them. These negatively charged ions produce an electric field that effectively serves as a diode to hinder error-causing crosstalk. While researchers already knew the potential value of tantalum oxide for memories, such arrays have been limited to about a kilobyte because denser memories suffer from crosstalk that allows bits to be misread.
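The crosstalk problem that the built-in diode-like field suppresses can be seen with a small worst-case calculation. This is a generic sneak-path sketch for a selector-less crossbar, not a calculation from the paper; the resistance values are the same illustrative ones as above.

```python
R_ON, R_OFF = 1e3, 1e6  # illustrative low/high resistance states (ohms)

def parallel(*rs):
    """Equivalent resistance of resistors in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

# Worst case: the target cell stores "0" (R_OFF), but the shortest sneak
# path through three neighbouring "1" cells (3 * R_ON in series) conducts
# in parallel with it, masking the high-resistance state.
apparent = parallel(R_OFF, 3 * R_ON)
print(apparent)  # ~2991 ohms: far closer to R_ON than R_OFF, so the "0" is misread
```

A rectifying element at each cell blocks the reverse-biased legs of such paths, which is why the self-forming diode-like field matters for scaling past kilobyte-sized arrays.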
The multilayer graphene, besides forming one of the switching interfaces, does double duty as a barrier that keeps platinum from migrating into the tantalum oxide and causing a short circuit.
Tour said tantalum oxide memories can be fabricated at room temperature. He noted the control voltage that writes and rewrites the bits is adjustable, which allows a wide range of switching characteristics.
Wang said the remaining hurdles to commercialization include the fabrication of a dense enough crossbar device to address individual bits and a way to control the size of the nanopores.
Wang is an assistant professor at the Korea University-Korea Institute of Science and Technology’s Graduate School of Converging Science and Technology. Co-authors are former Rice research scientist Jae-Hwang Lee, an assistant professor of mechanical and industrial engineering at the University of Massachusetts, Amherst; and Rice postdoctoral researchers Yang Yang, Gedeng Ruan, Nam Dong Kim and Yongsung Ji.
Tour is the T.T. and W.F. Chao Chair in Chemistry as well as a professor of materials science and nanoengineering and of computer science and a member of Rice’s Richard E. Smalley Institute for Nanoscale Science and Technology.