Scientists have increased the stability of microwave signals 100-fold. The work, by researchers at the US National Institute of Standards and Technology (NIST), was published in the journal Science.
In their work, the scientists used state-of-the-art ytterbium atomic clocks, advanced light detectors, and a measuring tool called a frequency comb, which acts like a set of gears, accurately converting high-frequency optical pulses into lower-frequency microwave signals.
Advanced photodiodes converted the light pulses into electric currents, which in turn generated a microwave signal with a frequency of 10 GHz. The signal tracked the ticking of the atomic clocks exactly, with an error of just one part in a quintillion (10^-18).
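The arithmetic behind this result can be sketched in a few lines. This is an illustrative toy calculation, not NIST's actual signal chain: the key idea is that frequency division preserves *fractional* stability, so dividing an ultra-stable optical frequency down to 10 GHz yields an equally stable (in fractional terms) microwave signal. The ytterbium transition frequency used below is the published value for the Yb optical lattice clock, about 518 THz.

```python
# Toy sketch (assumption: illustrative numbers, not the paper's analysis).
# Dividing an optical clock frequency down to microwaves preserves the
# fractional stability of the original optical signal.

YB_OPTICAL_HZ = 518_295_836_590_863.6  # Yb lattice clock transition, ~518 THz
MICROWAVE_HZ = 10e9                    # 10 GHz microwave output

# The frequency comb effectively divides the optical frequency by this factor:
division_factor = YB_OPTICAL_HZ / MICROWAVE_HZ
print(f"division factor: ~{division_factor:,.0f} optical cycles per microwave cycle")

# A fractional instability is unchanged by division: one part in a
# quintillion on the optical signal is the same fraction at 10 GHz.
fractional_instability = 1e-18
absolute_error_hz = MICROWAVE_HZ * fractional_instability
print(f"absolute error at 10 GHz: {absolute_error_hz:.1e} Hz")
```

The division factor of roughly 52,000 shows why the "gears" metaphor fits: tens of thousands of fast optical cycles are counted off for every single microwave cycle.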
This level of performance matches that of the optical clocks and is 100 times more stable than the best existing microwave sources. Optical waves have shorter, faster cycles than microwaves, so the two have different shapes. In converting stable optical waves into microwaves, the scientists tracked the phase — the exact timing of the waves — to make sure they stayed identical and did not drift out of step with each other.
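To make "tracking the phase" concrete, here is a minimal sketch, under my own assumed numbers rather than the experiment's: at 10 GHz one full cycle lasts only 100 picoseconds, so even a femtosecond-scale timing slip between two waves shows up as a measurable phase offset.

```python
import math

# Hedged toy example (assumption: illustrative, not the NIST measurement).
# A phase comparison converts a timing offset between two waves of the
# same frequency into a phase angle.

FREQ_HZ = 10e9                 # 10 GHz microwave, as in the experiment
PERIOD_S = 1.0 / FREQ_HZ       # one cycle = 100 picoseconds

def phase_error_rad(timing_offset_s: float) -> float:
    """Phase offset (radians) corresponding to a timing slip between two waves."""
    return 2 * math.pi * (timing_offset_s / PERIOD_S)

# A 1-femtosecond timing slip between the waves at 10 GHz corresponds to:
err = phase_error_rad(1e-15)
print(f"{err:.2e} rad")
```

Keeping this phase offset near zero over time is what it means for the microwave output to stay "identical and not offset" from the optical reference.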
Ultra-stable electronic signals have many potential uses, including calibrating electronics that run on oscillating quartz crystals. Such signals could also make wireless communication systems more reliable.
In addition, this work could help redefine the international standard of time, the SI second, which is currently based on a microwave frequency absorbed by cesium atoms in conventional atomic clocks. The international scientific community is expected to choose a new time standard in the coming years, based on optical frequencies absorbed by atoms such as ytterbium.