Time and Frequency Synchronization in LTE Cat-0 Devices
Revision as of 17:10, 18 March 2015
Various estimates predict 20 to 30 billion embedded devices connected to the Internet by 2020 in what is called the Internet of Things (IoT) [1,2]. To realize this vision, cellular standards are being released to meet the low-power and low-cost requirements of IoT components, especially on the client side. In the latest release of the LTE standard, a new user-equipment category (Cat-0) for machine-to-machine (M2M) communications and the IoT was introduced. Since Cat-0 devices target the low-complexity M2M market, a 75 percent modem complexity reduction compared to Cat-1 UEs is expected. This complexity saving comes at the price of several modifications compared to conventional LTE UE devices. The bandwidth of such a receiver is 200 kHz, of which 12 · 15 kHz = 180 kHz are used by one resource block. Such 200 kHz LTE bands can be placed in between existing GSM bands. The goal of this project is to analyze, simulate, and implement in Matlab those parts of a receiver that are able to receive a single LTE resource block at a time.
The receive chain of an LTE device includes an analog front-end (AFE), containing analog filter stages and an analog-to-digital converter (ADC), followed by a digital front-end (DFE) and further components for, e.g., symbol detection and channel decoding. The DFE is the first digital building block after the ADC; its tasks include decimating the oversampled signal emitted by the ADC down to symbol rate and synchronizing it. In this project, a state-of-the-art analog front-end (or a Matlab model of it) containing a Σ∆-ADC for LTE-Advanced is used, which supports a bandwidth of up to 50 MHz and a sampling rate of up to 400 MS/s.
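The decimation task described above can be sketched as an anti-alias low-pass filter followed by downsampling. The following stdlib-only Python sketch (the project itself would use Matlab) illustrates the principle; the windowed-sinc filter design, the tap count, and the single-stage structure are illustrative assumptions, not the actual DFE design.

```python
import math

def fir_lowpass(num_taps, cutoff):
    """Windowed-sinc low-pass FIR taps; cutoff normalized to Nyquist (0..1).
    Illustrative design, not the project's actual anti-alias filter."""
    taps = []
    m = num_taps - 1
    for n in range(num_taps):
        x = n - m / 2
        # Ideal low-pass impulse response (sinc), with the x = 0 limit handled
        h = cutoff if x == 0 else math.sin(math.pi * cutoff * x) / (math.pi * x)
        # Hamming window to suppress side lobes
        w = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)
        taps.append(h * w)
    # Normalize to unit DC gain so decimation preserves signal level
    s = sum(taps)
    return [t / s for t in taps]

def decimate(signal, factor, taps):
    """Low-pass filter, then keep only every factor-th output sample."""
    out = []
    for i in range(0, len(signal) - len(taps) + 1, factor):
        acc = sum(taps[k] * signal[i + k] for k in range(len(taps)))
        out.append(acc)
    return out
```

In a real DFE the large ratio from 400 MS/s down to the narrowband symbol rate would typically be factored into several cascaded stages to keep filter lengths manageable; the single stage here only shows the filter-then-downsample idea.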
Besides decimation, the DFE contains a synchronization unit. A tutorial-like introduction to synchronization and cell search is given in [3], whereas [4] provides a more detailed explanation. Synchronization is a fundamental part of the LTE cell-search procedure and consists of two steps. In the first step, the symbol timing and the carrier frequency offset are estimated. This step takes place in the DFE, before the FFT. The estimation is performed by computing an autocorrelation of the received signal, which detects the recurring cyclic prefix. The method is explained in detail in [4].
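The cyclic-prefix autocorrelation can be sketched as follows: because the CP is a copy of the last samples of the OFDM symbol, correlating each sample with the sample one FFT length later peaks at the symbol start, and the phase of that peak reveals the fractional carrier frequency offset. This is a minimal stdlib-only Python sketch; the parameters and the brute-force search are illustrative, not the project's implementation.

```python
import cmath
import math

def cp_sync(r, fft_len, cp_len):
    """Estimate symbol start and fractional CFO from the cyclic prefix.

    For each candidate start d, correlate cp_len samples with the samples
    fft_len later; the CP makes these pairs identical up to a CFO rotation.
    Returns (timing estimate, CFO in units of the subcarrier spacing).
    """
    best_d, best_corr = 0, 0j
    for d in range(len(r) - fft_len - cp_len):
        corr = sum(r[d + k] * r[d + k + fft_len].conjugate()
                   for k in range(cp_len))
        if abs(corr) > abs(best_corr):
            best_d, best_corr = d, corr
    # A CFO of eps subcarrier spacings rotates each pair by -2*pi*eps,
    # so the correlation phase yields the CFO estimate directly.
    eps = -cmath.phase(best_corr) / (2 * math.pi)
    return best_d, eps
```

Note that this phase-based estimate is only unambiguous for CFOs below half a subcarrier spacing; the integer part of the CFO must be resolved later, e.g., during PSS detection.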
The second step is performed in the frequency domain using the so-called primary synchronization signal (PSS). These signals contain Zadoff-Chu sequences of length 62. By cross-correlating the received signal with the candidate Zadoff-Chu sequences, the sector with the highest signal level is identified.
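This step can be sketched in stdlib-only Python as follows. The three root indices 25, 29, and 34 and the sequence definition are taken from 3GPP TS 36.211; the brute-force correlation search is an illustrative sketch, not an efficient receiver implementation.

```python
import cmath
import math

PSS_ROOTS = (25, 29, 34)  # one Zadoff-Chu root per physical-layer identity N_ID^(2)

def pss_sequence(u):
    """Length-62 Zadoff-Chu PSS sequence with root u (3GPP TS 36.211)."""
    d = []
    for n in range(31):
        d.append(cmath.exp(-1j * math.pi * u * n * (n + 1) / 63))
    for n in range(31, 62):
        d.append(cmath.exp(-1j * math.pi * u * (n + 1) * (n + 2) / 63))
    return d

def detect_pss(rx):
    """Cross-correlate rx against the three candidate PSS sequences.

    Returns (detected N_ID^(2), sample offset, peak correlation magnitude);
    the strongest peak identifies the sector with the highest signal level.
    """
    best = (0, 0, 0.0)
    for nid, u in enumerate(PSS_ROOTS):
        ref = pss_sequence(u)
        for off in range(len(rx) - 62 + 1):
            c = abs(sum(rx[off + k] * ref[k].conjugate() for k in range(62)))
            if c > best[2]:
                best = (nid, off, c)
    return best
```

Zadoff-Chu sequences are attractive here because they have constant amplitude and very low cross-correlation between different roots, so the correct root and offset stand out clearly against the other candidates.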
In this project, the student shall come up with an innovative front-end design for the constraints explained above and develop a fixed-point model in Matlab.
[1] Cambridge Wireless. LTE Evolution for Cellular IoT, 2014.
[2] Rapeepat Ratasuk, Nitin Mangalvedhe, Amitava Ghosh, and Benny Vejlgaard. Narrowband LTE-M system for M2M communication. 2014.
[3] Fabian Schuh. Synchronization and Cell Search, 2010.
[4] Konstantinos Manolakis, D. M. Gutierrez Estevez, Volker Jungnickel, Wen Xu, and Christian Drewes. A closed concept for synchronization and cell search in 3GPP LTE systems. In IEEE Wireless Communications and Networking Conference (WCNC 2009), pages 1–6, 2009.