Abstract
Contemporary wireless networks dramatically improve data rates and reduce latency, making them a key enabler of massive communication among low-cost devices with limited computational power, standardized in particular by the downscaled Long-Term Evolution (LTE) derivatives LTE-M and narrowband Internet of Things (NB-IoT). Assessing physical-layer transmission performance is important for higher-layer protocols, as it determines the extent to which potential error recovery must escalate up the protocol stack. Endpoints with low processing capacity therefore need to estimate, as efficiently as possible, the residual bit error rate (BER) determined solely by the main orthogonal frequency-division multiplexing (OFDM) impairment, the carrier frequency offset (CFO); this applies specifically in small cells, where the signal-to-noise ratio is large enough and the OFDM symbol cyclic prefix prevents inter-symbol interference. In contrast to earlier analytical models, which estimate the BER from the phase deviation caused by the CFO at considerable computational cost, in this paper, after identifying the optimal sampling instant in the power delay profile, we abstract the CFO as an equivalent time dispersion (i.e., an additional spreading of the power delay profile that would produce the same BER degradation as the CFO). The proposed BER estimation is verified by means of an industry-standard LTE software simulator.
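To make the abstraction concrete, the following Python sketch illustrates the general idea under stated assumptions; it is not the paper's derivation. It uses the classical small-offset approximation for OFDM with a normalized CFO eps (offset as a fraction of the subcarrier spacing): the useful subcarrier power is attenuated by sinc(eps)^2 while the inter-carrier interference power grows roughly as (pi*eps)^2/3 and is treated as additional Gaussian noise; QPSK modulation and the helper names (qfunc, ber_qpsk_cfo, equivalent_dispersion_power) are assumptions made for illustration.

    import numpy as np
    from scipy.special import erfc

    def qfunc(x):
        """Gaussian tail probability Q(x) = 0.5*erfc(x/sqrt(2))."""
        return 0.5 * erfc(x / np.sqrt(2.0))

    def ber_qpsk_cfo(snr_db, eps):
        """Approximate residual QPSK BER for a normalized CFO eps.

        Small-offset model: the useful subcarrier is attenuated by
        sinc(eps)^2, and the inter-carrier interference (ICI) power,
        approximately (pi*eps)^2/3, is treated as extra Gaussian noise.
        """
        snr = 10.0 ** (snr_db / 10.0)
        signal = np.sinc(eps) ** 2        # np.sinc(x) = sin(pi*x)/(pi*x)
        ici = (np.pi * eps) ** 2 / 3.0
        sinr_eff = signal * snr / (1.0 + snr * ici)
        return qfunc(np.sqrt(sinr_eff))

    def equivalent_dispersion_power(snr_db, eps):
        """Noise-like power of an additional spreading of the power
        delay profile that would yield the same effective SINR as the
        CFO, mirroring the equivalent-time-dispersion abstraction."""
        snr = 10.0 ** (snr_db / 10.0)
        sinr_eff = np.sinc(eps) ** 2 * snr / (1.0 + snr * (np.pi * eps) ** 2 / 3.0)
        # Solve snr / (1 + p_eq * snr) = sinr_eff for p_eq.
        return (snr / sinr_eff - 1.0) / snr

    print(ber_qpsk_cfo(snr_db=20.0, eps=0.05))
    print(equivalent_dispersion_power(snr_db=20.0, eps=0.05))

The point of the second function is the mapping the abstract describes: instead of tracking the CFO-induced phase deviation directly, the same BER degradation is attributed to an equivalent dispersion term, which a low-complexity endpoint can fold into its existing power-delay-profile processing.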