Eye-Beam: A mmWave 5G-Compliant Platform for Integrated Communications and Sensing Enabling AI-Based Object Recognition
Abstract
We present Eye-Beam, a programmable platform for integrated communication and sensing. Eye-Beam leverages the hardware and processing required for standard millimeter-wave (mmWave) 5G directional communications to enable sensing functions. Specifically, our platform (1) receives and synchronizes to the data frame of broadcast 5G signals, (2) extracts directional communication features, creating a tensor of spatial information, and (3) uses this tensor as input to a deep neural network (DNN) that infers the presence of specific objects in the propagation environment. Eye-Beam comprises a programmable 28 GHz 64-element phased array, a software-defined radio (SDR), and custom FPGA-based firmware. Eye-Beam's key capabilities and metrics include (i) synchronization of I/Q data (at up to 200 MSPS) with beam steering (across 9,601 beams) to within 10 ns; (ii) a signal processing pipeline that extracts communication features, such as the SNR and channel response, from received 5G waveforms; and (iii) system orchestration that synchronizes the receiver (RX) to the 5G frame structure of the base station (gNodeB) and maintains synchronization within a worst-case OFDM cyclic prefix of 0.29 μs. Eye-Beam can also emulate gNodeB transmissions. We demonstrate Eye-Beam's communication capability (decoding up to 64-QAM) and its performance as a channel sounder (extracting detailed directional 5G features across 2,401 beam directions within just 20 ms). We then demonstrate, for the first time, AI-based object classification using only the directional communication features derived by Eye-Beam from ambient mmWave 5G signals transmitted by a gNodeB. Six object classes, including four distinct objects concealed in a backpack, are classified with 98% accuracy in an indoor environment.