![](/__local/4/9C/C6/DEBDE2F6F9D655EE26608564FC6_529F25BD_71848.jpg)
Large-scale diffractive-interference Taichi photonic chiplets solve advanced AGI tasks with 160 T...
Rapid advances in artificial general intelligence (AGI) come with increased performance and energy-efficiency requirements for next-generation computing. Photonic computing has the potential to meet these goals, but despite attracting attention, current photonic integrated circuits have limited scale and computing capability, barely supporting modern AGI tasks. Xu et al. explored a distributed diffractive-interference hybrid photonic computing architecture to effectively increase the scale of an optical neural network (ONN) to the million-neuron level. They experimentally realized an on-chip 13.96-million-neuron ONN for complex, thousand-category-level classification and AI-generated-content tasks. This work is a promising step toward real-world photonic computing, supporting various applications in AI.
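The diffractive half of such a hybrid architecture can be sketched in simulation: one "layer" multiplies the optical field by a trainable phase mask and lets the result diffract to the next plane. The snippet below uses the standard angular-spectrum propagator as a textbook model, not Taichi's actual implementation; the grid size, wavelength, pixel pitch, and the random phase mask are all illustrative stand-ins.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex optical field by `distance` (angular-spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components clamped to zero phase
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * distance)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def diffractive_layer(field, phase_mask, wavelength, pixel_pitch, distance):
    """One diffractive 'layer': a phase mask followed by free-space diffraction."""
    return angular_spectrum_propagate(field * np.exp(1j * phase_mask),
                                      wavelength, pixel_pitch, distance)

rng = np.random.default_rng(0)
n = 64
field = np.ones((n, n), dtype=complex)      # plane-wave input encoding a flat image
phase = rng.uniform(0, 2 * np.pi, (n, n))   # stand-in for a trained phase profile
out = diffractive_layer(field, phase, wavelength=700e-9, pixel_pitch=8e-6, distance=5e-3)
intensity = np.abs(out) ** 2                # what a photodetector array would read
```

In a trained diffractive network, the phase masks are optimized so that the output intensity concentrates on detector regions assigned to each class.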
![](/__local/4/60/50/51454BC0EB63D29EE0134AFFB43_9A64E6E5_B6420.png)
ACCEL: 4.5 Peta-OPS all-analog photoelectronic chip for high-speed vision tasks
A single chip that integrates optical and electronic analog computing modules provides a strategy for all-analog computing processors whose speed and energy efficiency are several orders of magnitude higher than those of state-of-the-art digital processors. ACCEL classifies high-resolution images of various daily-life scenes in about 72 nanoseconds while requiring less than 5 nanojoules, which is more than 3,000 times faster and 4,000,000 times more energy efficient than a top-of-the-line graphics processing unit.
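As a rough intuition for the analog electronic stage, a signed dot product can be realized over nonnegative optical inputs by summing photocurrents on two rails and reading them out differentially. This is a toy model under simplifying assumptions (responsivity folded into the weights, ideal current summation), not ACCEL's actual circuit:

```python
import numpy as np

def analog_mac(intensities, weights):
    """Model a photocurrent multiply-accumulate: each photodiode's current is
    proportional to the light intensity it receives; currents routed to the '+'
    rail add, currents routed to the '-' rail subtract (differential readout)."""
    currents = intensities * np.abs(weights)     # photocurrent ∝ intensity × |weight|
    positive = currents[weights >= 0].sum()      # Kirchhoff current summation, '+' rail
    negative = currents[weights < 0].sum()       # '-' rail
    return positive - negative

rng = np.random.default_rng(1)
x = rng.random(16)                 # nonnegative optical intensities
w = rng.standard_normal(16)        # signed weights (sign = rail assignment)
assert np.isclose(analog_mac(x, w), x @ w)   # matches a digital dot product
```

The point of the sketch is that accumulation costs essentially nothing in the analog domain, which is where the speed and energy advantages come from.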
![](/__local/2/94/41/E4E4B5909D5FDD275D16478A209_29456D88_21E65.jpg)
Meta-imaging sensor with digital adaptive optics
The paper presents a pioneering approach to the challenge of rapidly declining imaging resolution and signal-to-noise ratio in dynamic, complex imaging environments. The authors develop an integrated meta-imaging sensor and introduce a digital adaptive optics architecture that enables high-speed, wide-range aberration correction, opening up new possibilities for solving the century-old problem of optical aberrations. It expands the effective field of view of traditional adaptive optics from 40 arcseconds to 1,000 arcseconds. This breakthrough has broad applications in astronomical observation, industrial inspection, and medical diagnosis.
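A core step in digital adaptive optics is estimating the local wavefront slope from the lateral shift between sub-aperture views of the same scene. The following is a minimal sketch of that shift measurement using FFT cross-correlation on synthetic data, not the sensor's actual pipeline:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer-pixel lateral shift of `img` relative to `ref`
    via cross-correlation; in adaptive optics this shift is proportional
    to the local wavefront slope across the corresponding sub-aperture."""
    cc = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref)))
    peak = np.unravel_index(np.argmax(np.abs(cc)), cc.shape)
    # Map circular FFT indices to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, cc.shape))

rng = np.random.default_rng(2)
ref = rng.random((64, 64))
img = np.roll(ref, shift=(3, -5), axis=(0, 1))   # simulate a tilted sub-aperture view
shift = estimate_shift(ref, img)                 # recovers (3, -5)
```

Once such shifts are measured across sub-apertures, the aberrated wavefront can be reconstructed and removed computationally rather than with deformable mirrors.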
![](/__local/7/3E/06/E48AE9AEF658CC9F57C22CA0BB6_CC8BD295_B0EFB.png)
DeepCAD: denoising fluorescence images without seeing any clean data
DeepCAD is a self-supervised deep-learning method for denoising fluorescence time-lapse images with fast processing speed. The original low-SNR data can be used directly to train convolutional networks, which is particularly advantageous for functional imaging, where the sample undergoes fast dynamics and capturing ground-truth data is hard or impossible. We demonstrate it in extensive experiments, including calcium imaging in mice, zebrafish, and flies, cell-migration observation, and imaging of a new neurotransmitter indicator, covering both 2D single-plane imaging and 3D volumetric imaging. Qualitative and quantitative evaluations show that DeepCAD can substantially enhance fluorescence time-lapse imaging data and permit high-sensitivity imaging of biological dynamics beyond the shot-noise limit.
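The reason no clean data is needed can be illustrated with the training-pair construction: shot noise is independent between consecutive frames while the underlying signal is highly correlated, so one temporally interleaved sub-stack can supervise the other, Noise2Noise-style. A minimal sketch with illustrative array sizes (DeepCAD itself trains 3D convolutional networks on such sub-stacks):

```python
import numpy as np

def interleaved_pairs(stack):
    """Split a noisy time-lapse stack (T, H, W) into two temporally interleaved
    sub-stacks. Noise is independent between the two, signal is shared, so one
    can serve as training input and the other as target without clean data."""
    return stack[0::2], stack[1::2]

rng = np.random.default_rng(3)
signal = rng.random((1, 32, 32)).repeat(100, axis=0)   # slowly varying "dynamics"
noisy = rng.poisson(signal * 20).astype(float)         # low-SNR shot-noise frames
inputs, targets = interleaved_pairs(noisy)             # 50 training pairs
```

Minimizing the expected loss between `inputs` and `targets` drives the network toward the noise-free signal, because the independent noise in the target averages out.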
![](/__local/A/42/87/B1BD981A29C3CAFACB838DB1C6D_DDD0E057_1BC66.jpg)
Two-photon synthetic aperture microscopy (2pSAM)
Harnessing the concept of synthetic aperture radar in two-photon microscopy, 2pSAM offers aberration-corrected 3D imaging of subcellular dynamics at millisecond timescales over large volumes in deep tissue, with reduced phototoxicity, enabling challenging continuous long-term in vivo imaging in mice.