Thesis (M.A.) from the year 2016 in the subject Engineering - Communication Technology, grade: 75%, Mekelle University, course: Communication engineering, language: English, abstract: In this thesis, channel estimation techniques for the LTE downlink, namely Least Squares, Minimum Mean Square Error and Maximum Likelihood estimation, are studied for pilot-symbol-based channel estimation. In addition, the performance of these three channel estimation techniques is studied with averaging, interpolation and hybrid methods. This work also investigates the complexity of the channel estimation techniques in terms of the number of complex multiplications, varying the FFT size and the cyclic prefix (CP) length. Furthermore, the effect of varying the number of antennas at the transmitter and receiver is examined, where 2 x 2 and 4 x 4 antenna arrangements are considered as case studies. The performance of these channel estimation techniques is also studied for the EVA standard channel model in LTE with a Doppler shift of 300 Hz. Simulation results in this thesis show that the ML channel estimation technique has the best performance, and in terms of the number of complex multiplications it is shown that ML has the lowest complexity. Among the interpolation techniques, the algorithm integrated with the hybrid technique performs best. In addition, it is shown that as the antenna configuration grows from 2 x 2 to 4 x 4, the performance of the estimator improves.
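The least-squares estimator compared above is simple enough to sketch in a few lines. The following toy example (pilot count, pilot values and noise level are illustrative assumptions, not taken from the thesis) estimates a frequency-domain channel by dividing the received pilots by the known transmitted pilots:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: 8 pilot subcarriers with known unit-power symbols.
n_pilots = 8
x_p = np.ones(n_pilots, dtype=complex)                    # transmitted pilots
h_true = (rng.normal(size=n_pilots)
          + 1j * rng.normal(size=n_pilots)) / np.sqrt(2)  # toy channel
noise = 0.05 * (rng.normal(size=n_pilots)
                + 1j * rng.normal(size=n_pilots))
y_p = h_true * x_p + noise                                # received pilots

# LS estimate: invert the known pilots (element-wise, since the channel
# is diagonal in the frequency domain for OFDM).
h_ls = y_p / x_p

mse = np.mean(np.abs(h_ls - h_true) ** 2)
```

MMSE estimation refines this by weighting the LS estimate with channel and noise statistics, which is where its extra complex multiplications, and hence the complexity comparison made in the thesis, come from.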
The focus of this book is to evaluate channel estimation and equalization for the OFDM-based LTE downlink according to 3GPP specifications. It presents a link-level system model used to simulate the LTE downlink physical layer, valid for bandwidths of 5, 10, 15 and 20 MHz. Furthermore, the ITU standard channel models for UMTS and LTE, including SISO and MIMO channel models, are described. Finally, channel estimation algorithms are presented for both SISO and MIMO systems. The concluding results are presented with the help of simulation. Performance is measured in terms of BER and SER, and the obtained results are compared with the theoretical values.
Wireless Communications Systems Design provides the basic knowledge and methodology for wireless communications design. The book mainly focuses on a broadband wireless communication system based on the OFDM/OFDMA system because it is widely used in modern wireless communication systems. It is divided into three parts: wireless communication theory (part I), wireless communication block design (part II), and wireless communication block integration (part III). It is written by an expert with wide-ranging experience in system design (standards, research and development).
This book is suitable for students, teachers and professionals who are working in the signal processing and telecommunication fields. It is a performance study of a telecommunication system with regard to the quantization noise produced by limited bit widths in hardware. The book analyses some important aspects of the quantization noise effect in LTE downlink channel estimation. An OFDM system model following the LTE downlink standard is developed and analysed with respect to BER performance for three channel estimation algorithms: LS, FIR and LMMSE. The simulation is first run with floating-point arithmetic and then implemented with fixed-point arithmetic using an "in house" developed fixed-point toolbox to analyse the quantization noise effect. The content focuses on quantifying the quantization noise in terms of BER, identifying the parameters that affect it, and finally suggesting and studying techniques to mitigate the quantization noise effect. The simulation results are presented along with MATLAB class code for the channel estimators and the implementation of the "in house" fixed-point toolbox.
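The fixed-point effect studied here follows the familiar rule of thumb of roughly 6 dB of quantization SNR per bit for a uniform quantizer. A minimal sketch (the quantizer below is a generic stand-in, not the book's "in house" toolbox; signal and bit widths are illustrative) shows how the signal-to-quantization-noise ratio grows with bit width:

```python
import numpy as np

def quantize(x, bits, full_scale=1.0):
    # Generic uniform quantizer; a stand-in for a fixed-point toolbox.
    step = 2.0 * full_scale / (2 ** bits)
    return np.clip(np.round(x / step) * step, -full_scale, full_scale - step)

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 100_000)        # toy full-scale signal

snr_db = {}
for bits in (8, 10, 12):
    q = quantize(x, bits)
    noise_power = np.mean((x - q) ** 2)
    snr_db[bits] = 10 * np.log10(np.mean(x ** 2) / noise_power)
```

Intuitively, once the quantization SNR sits well above the operating channel SNR, the fixed-point BER curve approaches the floating-point one; locating that crossover for each estimator is the kind of parameter study the book performs.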
This book introduces the Vienna Simulator Suite for 3rd-Generation Partnership Project (3GPP)-compatible Long Term Evolution-Advanced (LTE-A) simulators and presents applications to demonstrate their uses for describing, designing, and optimizing wireless cellular LTE-A networks. Part One addresses LTE and LTE-A link level techniques. As there has been high demand for the downlink (DL) simulator, it constitutes the central focus of the majority of the chapters. This part of the book reports on relevant highlights, including single-user (SU), multi-user (MU) and single-input-single-output (SISO) as well as multiple-input-multiple-output (MIMO) transmissions. Furthermore, it summarizes the optimal pilot pattern for high-speed communications as well as different synchronization issues. One chapter is devoted to experiments that show how the link level simulator can provide input to a testbed. This section also uses measurements to present and validate fundamental results on orthogonal frequency division multiplexing (OFDM) transmissions that are not limited to LTE-A. One chapter exclusively deals with the newest tool, the uplink (UL) link level simulator, and presents cutting-edge results. In turn, Part Two focuses on system-level simulations. From early on, system-level simulations have been in high demand, as people are naturally seeking answers when scenarios with numerous base stations and hundreds of users are investigated. This part not only explains how mathematical abstraction can be employed to speed up simulations by several hundred times without sacrificing precision, but also illustrates new theories on how to abstract large urban heterogeneous networks with indoor small cells. It also reports on advanced applications such as train and car transmissions to demonstrate the tools’ capabilities.
An Introduction to LTE explains the technology used by 3GPP Long Term Evolution. The book covers the whole of LTE, both the techniques used for radio communication between the base station and the mobile phone, and the techniques used for signalling communication and data transport in the evolved packet core. It avoids unnecessary detail, focussing instead on conveying a sound understanding of the entire system. The book is aimed at mobile telecommunication professionals who want to understand what LTE is and how it works. It is invaluable for engineers who are working on LTE, notably those who are transferring from other technologies such as UMTS and cdma2000, those who are experts in one part of LTE but who want to understand the system as a whole, and those who are new to mobile telecommunications altogether. It is also relevant to those working in non-technical roles, such as project managers, marketing executives and intellectual property consultants. On completing the book, the reader will have a clear understanding of LTE, and will be able to tackle the more specialised books and the 3GPP specifications with confidence. Key features:
- Covers the latest developments in release 10 of the 3GPP specifications, including the new capabilities of LTE-Advanced
- Includes references to individual sections of the 3GPP specifications, to help readers understand the principles of each topic before going to the specifications for more detailed information
- Requires no previous knowledge of mobile telecommunications, or of the mathematical techniques that LTE uses for radio transmission and reception
The Internet of Things (IoT) is a novel paradigm which is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, by anyone and anything. Such a vision is also becoming known under the name of Machine-to-Machine (M2M), where the absence of human interaction in the system dynamics is further emphasized. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace of marketing such a framework, the new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, many challenges face current wireless technology. In our research, we address a variety of technology candidates for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we survey the new features and the new user equipment categories added to the physical layer of LTE-A. In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion.
In particular, the cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve the spectral efficiency, all relays are allowed to transmit to the destination simultaneously over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix, and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, since the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system under random clustering and under the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in the data rate. In the second part of this thesis, we look into the current efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we present the new features and the new user equipment categories added to the physical layer of LTE-A. We study some of the challenges facing LTE-A when dealing with Machine Type Communications (MTC). Specifically, the MTC Physical Downlink Control Channel (MPDCCH) is among the newly introduced features in LTE-A that carries the downlink control information (DCI) for MTC devices.
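The eigen-solution step can be illustrated in the abstract: when the received SNR is a Rayleigh quotient in the relay-gain vector, it is maximized by the principal eigenvector of the channel-dependent matrix. The matrix below is randomly generated purely for illustration; the thesis' actual matrix additionally encodes the primary network's interference constraints.

```python
import numpy as np

rng = np.random.default_rng(2)
n_relays = 4

# Illustrative Hermitian positive semi-definite "channel-dependent" matrix.
A = rng.normal(size=(n_relays, n_relays)) + 1j * rng.normal(size=(n_relays, n_relays))
M = A.conj().T @ A

def snr(g):
    # Rayleigh quotient: received SNR as a function of the gain vector g.
    return float(np.real(g.conj() @ M @ g / (g.conj() @ g)))

# Optimal allocation: eigenvector of the largest eigenvalue of M
# (np.linalg.eigh returns eigenvalues in ascending order).
eigvals, eigvecs = np.linalg.eigh(M)
g_opt = eigvecs[:, -1]
```

Any other gain vector achieves at most the same Rayleigh quotient, which is the sense in which the eigen-solution is optimal.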
Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that relies essentially on Least Squares (LS) estimates of the pilot signals and linear interpolation for the low-Doppler channels associated with MTC applications.
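The proposed estimator is deliberately lightweight: LS at the pilot positions, then linear interpolation in between. A toy frequency-domain sketch (subcarrier count, pilot spacing and the smooth channel are illustrative assumptions, not the thesis' configuration):

```python
import numpy as np

n_sc = 64
pilot_idx = np.arange(0, n_sc, 8)             # comb pilots, every 8th subcarrier

# Smooth toy channel across subcarriers (low delay spread, low Doppler).
h_true = np.exp(1j * 2 * np.pi * np.arange(n_sc) / n_sc)

x_p = np.ones(pilot_idx.size, dtype=complex)  # known pilot symbols
y_p = h_true[pilot_idx] * x_p                 # received pilots (noiseless here)

h_ls = y_p / x_p                              # LS at the pilot positions

# Linear interpolation, real and imaginary parts separately.
k = np.arange(n_sc)
h_hat = (np.interp(k, pilot_idx, h_ls.real)
         + 1j * np.interp(k, pilot_idx, h_ls.imag))

max_err = np.max(np.abs(h_hat - h_true))
```

Note that np.interp holds the last pilot value constant beyond the final pilot, so the edge subcarriers carry the largest error; practical systems place edge pilots or extrapolate.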
The technological progress in multi-carrier (MC) modulation led orthogonal frequency division multiplexing (OFDM) to become an important part of beyond-3G cellular mobile communication standards, including LTE and WiMAX. In addition, the flexibility offered by the spread spectrum (SS) and time division multiplexing (TDM) techniques motivated many researchers to investigate several MC combined multiple access schemes, such as MC-CDMA, OFDMA and MC-TDMA. These schemes benefit from the advantages of each sub-system and offer high flexibility, high spectral efficiency, simple detection strategies and narrow-band interference rejection capability. Multi-Carrier and Spread Spectrum Systems is one of the first books to describe and analyze the basic concepts of multi-carrier OFDM transmission and its combination with spread spectrum (MC-CDMA). The different architectures and detection strategies as well as baseband-related transceiver components are explained. This includes topics like FEC channel coding and decoding, modulation and demodulation (IFFT/FFT), digital I/Q-generation, time and frequency synchronisation, channel estimation, frequency domain equalization and RF aspects such as phase noise and non-linearity issues. Concrete examples of its applications for cellular mobile communication systems (B3G/4G) are given. Further derivatives of MC-SS (such as OFDMA, SS-MC-MA and DFT-spread OFDM) and their corresponding applications in the LTE, WiMAX, WLAN and DVB-RCT standards are detailed. Capacity and flexibility enhancements of multi-carrier OFDM systems by different MIMO diversity techniques such as space time/frequency coding (STC, SFC) and software defined radio concepts are also described. Written in a highly accessible manner, this book provides a unique reference on the topics of multi-carrier and spread spectrum communications, assisting 4G engineers with their implementation.
- Fully updated new edition of a successful text, including two new chapters on LTE and WiMAX
- Describes in detail new applications of OFDM in mobile communication standards
- Examines all multi-carrier spread spectrum schemes, with in-depth analysis, from theory to practice
- Introduces the essentials of important wireless standards based on multi-carrier/spread spectrum techniques
In this thesis, I conduct the hardware prototyping of a two-way relay system using the National Instruments FlexRIO hardware platform. First, I develop several practical mechanisms to solve the critical synchronization issues of such systems, including Orthogonal Frequency-Division Multiplexing (OFDM) frame synchronization at the receiver, source-to-source node synchronization, and handshaking between the source and relay nodes. These synchronization methods control the behavior of the two source nodes and the relay node, which play critical roles in two-way relay systems. Second, I develop a pilot-based channel estimation scheme and validate it by demonstrating successful self-interference cancellation for the two-way relay system. In particular, I experiment with the self-interference cancellation technique using several channel estimation schemes to estimate both the source-to-relay and relay-to-source channels. Moreover, I implement the physical layer of a 5 MHz OFDM scheme for the two-way relay system. Both the transmitter and receiver are designed to mimic the Long Term Evolution (LTE) downlink scenario. The physical layer of the transmitter has been implemented in Field-Programmable Gate Arrays (FPGAs) and executed on the hardware board, which provides high throughput and the fundamental building blocks for the two-way relay system. The physical layer of the receiver is implemented in the real-time controller, which provides the flexibility to rapidly reconfigure the system. Finally, I demonstrate that the 5 MHz OFDM-based two-way relay system can achieve reliable communication when channel estimation and system synchronization are correctly executed.
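Frame synchronization of the kind developed here is commonly done by correlating the received stream against a known preamble; the sketch below (preamble length, offset and noise level are invented for illustration, and the thesis' actual mechanism may differ) locates the frame start as the correlation peak:

```python
import numpy as np

rng = np.random.default_rng(4)

preamble = rng.choice([-1.0, 1.0], size=64)   # known BPSK-like preamble

# Received stream: noise with the preamble embedded at an unknown offset.
true_offset = 137
stream = rng.normal(scale=0.1, size=1000)
stream[true_offset:true_offset + preamble.size] += preamble

# Cross-correlate and take the peak as the frame-start estimate.
corr = np.correlate(stream, preamble, mode="valid")
est_offset = int(np.argmax(corr))
```

The same peak-picking idea carries over to hardware: on the FPGA the correlation is typically computed with a matched-filter pipeline rather than a library call.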
In recent years, the orthogonal frequency division multiplexing (OFDM) scheme has received significant research interest due to its capability of supporting high data rates in hostile environments. Compared to conventional single-carrier modulation schemes, OFDM benefits from low-complexity equalization filters and high spectral efficiency. A multiple access implementation of OFDM, i.e., orthogonal frequency division multiple access (OFDMA), has been adopted as the multiple access (MA) scheme in the 3GPP LTE and LTE-Advanced downlinks. In cellular OFDMA, frequency hopping (FH) is widely used to exploit frequency diversity gain and improve system throughput, and pilot patterns with low cross-correlation are employed to improve the quality of channel estimation. However, there are numerous unsolved problems that need to be addressed in frequency-hopped and pilot-assisted OFDMA systems. Surveying the prior works in the literature, we find that limited research effort has focused on coping with the inherent disadvantages of OFDM in cellular OFDMA systems. In this thesis, we employ so-called residue number system (RNS) arithmetic, concentrating on (a) FH pattern design for minimizing/averaging intra/inter-cell interference, (b) pilot pattern design for improving the quality of channel estimation, and (c) pilot pattern design for facilitating time-frequency synchronization and device identification in multi-cell OFDMA. Regarding (a), RNS-based FH patterns not only preserve orthogonality within the same cell, but also have the minimum number of symbol collisions among adjacent cells. Additionally, the RNS-based method exhibits consistent system performance and more frequency diversity gain than previous efforts. With respect to (b), RNS-based pilot pattern design generates more unique pilot patterns than conventional methods.
This results in a low probability of pilot-to-pilot collisions, which, in turn, significantly improves the quality of channel estimation from the system-level perspective. For (c), as a special case of linear congruence sequences, RNS-based pilot patterns have good auto-correlation properties, which are extremely helpful in time-frequency synchronization and device identification.
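The collision properties claimed for linear-congruence-based patterns can be checked directly in a toy setting (a prime number of subbands and arbitrary slopes, chosen purely for illustration): users within a cell, which share a slope, never collide, while users in cells with different slopes collide exactly once per period.

```python
# Toy check of linear-congruence hopping-pattern collisions.
p = 7                                   # prime number of subbands (illustrative)

def pattern(user_offset, cell_slope, t):
    # Subband used by a given user of a given cell in time slot t.
    return (user_offset + cell_slope * t) % p

slots = range(p)
same_cell = sum(pattern(0, 3, t) == pattern(1, 3, t) for t in slots)
cross_cell = sum(pattern(0, 3, t) == pattern(1, 5, t) for t in slots)
```

Roughly speaking, RNS-based designs extend this idea using several co-prime moduli, which enlarges the family of distinct patterns while retaining the minimal-collision behaviour the thesis exploits.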
While 3G has been an outstanding success, the ever-growing demand for higher data rates and higher quality mobile communication services continues to fuel conflict between the rapidly growing number of users and limited bandwidth resources. In the future, a 100-fold increase in mobile data traffic is expected. That will necessitate further improvements to 3GPP LTE (Long-Term Evolution) and create limitless opportunities for engineers who understand the technology and how to apply it to deliver enhanced services. Long Term Evolution: 3GPP LTE Radio and Cellular Technology outlines the best way to position yourself now for future success. With coverage ranging from basic concepts to current research, this comprehensive reference contains technical information about all aspects of 3GPP LTE. It details low chip rate, high-speed downlink/uplink packet access (HSxPA)/TD-SCDMA EV 1x, LTE TDD, and 3G TDD. It introduces new technologies and covers methodologies to study the performance of frequency allocation schemes. The authors also discuss the proposed architecture of Mobile IPRR and distributed dynamic architecture in wireless communication, covering performance evaluation of the TD-SCDMA LTE system. With each passing day, more and more users are demanding mobile broadband data access everywhere, to facilitate synchronization of e-mails, Internet access, specific applications, and file downloads to mobile devices such as cell phones, smart phones, PDAs, and notebooks. LTE, successor to the 3G mobile radio network, is essential to creating radio coverage in the rollout phase and high capacity all over the radio cell in the long term. The 3GPP LTE will become increasingly crucial to supporting the high demand for data traffic generated by future mobile user terminals. Authored by international experts in the field, this practical book is an extremely valuable guide that addresses emerging current and future technologies associated with LTE and its future direction.
In order to meet the ever-increasing demand for wireless broadband services from fast-growing numbers of mobile users, the Long Term Evolution-Advanced (LTE-A) standard has been proposed to effectively improve the system capacity and spectral efficiency of fourth-generation (4G) wireless mobile communications. Many advanced techniques are incorporated in LTE-A systems to jointly improve system performance, among which Carrier Aggregation (CA) is considered one of the most promising, with profound significance even in the upcoming 5G era. Component carriers (CCs) from various portions of the spectrum are logically concatenated to form a much larger virtual band, resulting in remarkably boosted system capacity and user data throughput. However, the unique features of CA have posed many emerging challenges, as well as brand-new opportunities, for Radio Resource Management (RRM) in LTE-A systems. First, although multi-CC transmission can bring higher throughput, it may incur more intensive interference on each CC and more power consumption for users; thus the performance gain of CA under different conditions needs to be fully evaluated. Besides, as CA offers flexible CC selection and cross-CC load balancing and scheduling, enhanced RRM strategies should be designed to further optimize overall resource utilization. In addition, CA enables frequency reuse at CC resolution, adding another dimension to inter-cell interference management in heterogeneous networks (HetNets); new interference management mechanisms should be designed to take advantage of CA. Last but not least, CA empowers LTE-A systems to aggregate licensed spectrum with unlicensed spectrum, thus offering a capacity surge. Yet how to balance traffic between licensed and unlicensed spectrum, and how to achieve harmonious coexistence with other unlicensed systems, are still open issues.
To this end, the dissertation focuses on the new functionalities introduced by CA to optimize RRM performance in LTE-A systems. The main objectives are four-fold: 1) to fully evaluate the benefits of CA from different perspectives under different conditions via both theoretical analysis and simulations; 2) to design cross-layer CC selection, packet scheduling and power control strategies to optimize the target performance; 3) to analytically model the interference of HetNets with CA and propose dynamic interference mitigation strategies in a CA scenario; and 4) to investigate the impact of LTE transmissions on other unlicensed systems and develop enhanced RRM mechanisms for harmonious coexistence. To achieve these objectives, we first analyze the benefits of CA by investigating the user accommodation capabilities of the system in the downlink admission control process. LTE-A users with CA capabilities and legacy LTE users are considered. Analytical models are developed to derive the maximum number of users that can be admitted into the system given the user QoS requirements and traffic features. The results show that, with only a slightly higher spectrum utilization, the system can admit up to twice as many LTE-A users as LTE users when the user traffic is bursty. Second, we study RRM in the single-tier LTE-A system and propose a cross-layer dynamic CC selection and power control strategy for uplink CA. Specifically, the uplink power offset effects caused by multi-CC transmission are considered. An estimation method for user bandwidth allocation is developed, and a combinatorial optimization problem is formulated to improve user throughput by maximizing user power utilization. Third, we explore the interference management problem in multi-tier HetNets considering CC-resolution frequency reuse. An analytical model is devised to capture the random behavior of the femtocells using stochastic geometry theory.
The interaction between the base stations of different tiers is formulated as a two-level Stackelberg game, and a backward induction method is exploited to obtain the Nash equilibrium. Last, we focus on the mechanism design for licensed and unlicensed spectrum aggregation. An LTE MAC protocol for unlicensed spectrum is developed considering coexistence with Wi-Fi systems. The protocol captures the asynchronous nature of Wi-Fi transmissions in the time-slotted LTE frame structure and strikes a tunable tradeoff between LTE and Wi-Fi performance. An analysis is also presented to reveal the essential relations among the different parameters of the two systems. In summary, the dissertation aims at fully evaluating the benefits of CA in different scenarios and making full use of those benefits to develop efficient and effective RRM strategies for better LTE-Advanced system performance.
Multiple-Input Multiple-Output (MIMO) technology has seen prolific use in achieving higher data rates and an improved communication experience in cellular systems. However, one of the challenging problems in MIMO systems is interference, which limits system performance in terms of rate and reliability. In this thesis, we analyze methods that provide high performance over interference-limited wireless networks such as Long Term Evolution (LTE) and WiFi, tackling different sources of interference. One such source is neighbouring-cell interference; we propose methods that include an optimized solution modeling the interference as correlated noise and using its statistical information to jointly optimize the base station precoding and user receiver design of LTE systems. We study the benefits of exploiting interference in terms of both probability of error and signal-to-noise ratio (SNR). In addition, we compare the proposed method with conventional beamforming and maximum ratio combining (MRC). One of the key challenges in enabling high data rates in the LTE downlink is the precoding and receiver design. We focus primarily on the UE and base station (BS) processing, particularly on estimating and using the interference resulting from neighboring stations. We propose a receiver design that performs well in the presence of interference. Furthermore, we present a precoding scheme that the BS can use to maximize the signal-to-interference-plus-noise ratio (SINR). The proposed algorithm performs well under high-speed channels. The limitations of the Minimum Mean Square Error (MMSE) receiver are discussed, and it is used for comparison with the proposed approach. An interference-free scenario is used as a benchmark to evaluate the proposed system's performance. The performance of LTE is further optimized by tackling practical considerations that affect it.
We present a suboptimal, practical way of estimating the interference and utilizing this information in the processing techniques used at both the UE and the eNodeB sides. We focus on managing both MU-MIMO interference and other-cell interference. The proposed study improves system performance even under imperfect channel knowledge, enabling the throughput gains promised by MU-MIMO. Along the theme of enhancing spectral efficiency, we study In-Band Full-Duplex (IBFD) operation in conjunction with MU-MIMO. IBFD is very promising for enhancing wireless LANs, where full-duplex access points (APs) can support simultaneous uplink (UL) and downlink (DL) flows over the same frequency channel. One of the key challenges limiting IBFD benefits is interference. We propose a scheduling technique to manage interference in wireless LANs with full-duplex capability, focusing primarily on scheduling UL and DL stations (STAs) that can be efficiently served simultaneously. Finally, we take a holistic view of performance by considering practical issues related to system performance, namely a) interference resulting from the non-linearity of power amplifiers, and b) the trade-offs between system performance and power consumption. An important topic for practical communication systems is handling the interference due to power amplifier nonlinearities, especially in Orthogonal Frequency-Division Multiple Access (OFDMA)-based communication systems, due to the high peak-to-average power ratio. This problem becomes more compounded when a large number of PAs is required, as in Massive MIMO for example. In this thesis, we discuss the impact of PAs on cellular systems. We show the constraints that PAs introduce, and we take these constraints into consideration while searching for the optimum set of transmitter and receiver filters. Moreover, we highlight how Massive MIMO cellular networks can relax PA constraints, allowing low-cost PAs while maintaining high performance.
The performance is evaluated by showing probability-of-error and signal-to-noise-ratio curves for different transmit powers and different numbers of transmit antennas. In terms of power consumption, we investigate the use of emerging technologies (such as memristors) to enable highly efficient computation kernels for wireless communication systems. Specifically, we investigate the use of associative processors to perform in-memory computation in the context of an FFT processor. To reduce power and power density, we investigate approximate computing in memristor-based associative processors. A promising approach to saving energy is reducing the bit width; however, reducing the bit width introduces errors that may affect the performance. In this thesis, our goal is to adjust the bit width based on the channel SNR, aiming at achieving good performance at reduced energy consumption. A mathematical approach that analytically describes the system performance under the reduced-bit-width noise is presented. Based on this model, an adaptive bit-width adjustment algorithm is presented that utilizes the received SNR estimates to find the optimal bit width that achieves the performance goals at reduced energy consumption. Simulation results show that the proposed algorithms can achieve up to 45% energy savings compared to wireless communication systems with a conventional FFT.
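The SNR-driven bit-width idea can be sketched with the standard rule of thumb of roughly 6 dB of quantization SNR per bit (the margin and the rule itself are illustrative assumptions, not the thesis' analytical model): pick the smallest width whose quantization noise stays safely below the operating SNR.

```python
import math

def min_bits(channel_snr_db, margin_db=10.0):
    # Smallest bit width whose quantization SNR (~6.02 dB per bit)
    # exceeds the channel SNR by the chosen safety margin.
    return math.ceil((channel_snr_db + margin_db) / 6.02)

# If energy per operation scales with bit width, a poor channel lets
# the FFT run narrower and cheaper without hurting end performance.
bits_low = min_bits(10.0)     # poor channel
bits_high = min_bits(30.0)    # good channel
```

The adaptation loop then simply tracks the received SNR estimate and switches the datapath width whenever min_bits changes.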
This book explains how the performance of modern cellular wireless networks can be evaluated by measurements and simulations. With the roll-out of LTE, high data throughput is promised to cellular users. If you have ever wondered how high this throughput really is, this book is the right read for you. It first presents results from experimental research and simulations of the physical layer of HSDPA, WiMAX, and LTE. Next, it explains in detail how measurements on such systems need to be performed in order to achieve reproducible and repeatable results. The book further addresses how wireless links can be evaluated by means of standard-compliant link-level simulation. The major challenge in this context is the complexity involved when investigating complete wireless cellular networks. Consequently, it is shown how system-level simulators with a higher abstraction level can be designed such that their results still match link-level simulations. Finally, the book presents example optimizations of wireless systems over several cells. This book:
- Explains how the performance of modern cellular wireless networks can be evaluated by measurements and simulations
- Discusses the concept of testbeds, highlighting the challenges and expectations when building them
- Explains measurement techniques, including the evaluation of measurement quality by statistical inference techniques
- Presents throughput results for HSDPA, WiMAX, and LTE
- Demonstrates simulators at both link level and system level
- Provides system-level and link-level simulators (for WiMAX and LTE) on an accompanying website (https://www.nt.tuwien.ac.at/downloads/featured-downloads)
This book is an insightful guide for researchers and engineers working in the field of mobile radio communication as well as network planning. Advanced students studying related courses will also find the book interesting.
The possibilities for positioning in cellular networks have increased over time, pushed by growing needs for location-based products and services for a variety of purposes. It all started with rough position estimates based on timing measurements and sector information available in the Global System for Mobile Communications (GSM), and today there is an increased standardization effort to provide more position-relevant measurements in cellular communication systems to improve localization accuracy and availability. A first purpose of this thesis is to survey recent efforts in the area and their potential for localization. The rest of the thesis then investigates three particular aspects, with the focus on timing measurements: how can these be combined in the best way in Long Term Evolution (LTE), what is the potential of the new narrow-band communication links for localization, and can the timing measurement error be modeled more accurately? The first contribution concerns a narrow-band standard in LTE intended for Internet of Things (IoT) devices. This LTE standard includes a special positioning reference signal sent synchronously by all base stations (BSs) to all IoT devices. Each device can then compute several pair-wise time differences, each of which corresponds to a hyperbola. Using multilateration methods, the intersection of a set of such hyperbolas can be computed. An extensive performance study using a professional simulation environment with realistic user models is presented, indicating that a decent position accuracy can be achieved despite the narrow bandwidth of the channel. The second contribution is a study of how downlink measurements in LTE can be combined. Time of flight (ToF) to the serving BS and time difference of arrival (TDOA) to the neighboring BSs are used as measurements. From a geometrical perspective, the position estimation problem involves computing the intersection of a circle and hyperbolas, all with uncertain radii.
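The circle-and-hyperbola geometry described above can be posed as a nonlinear least-squares problem in the unknown position. Below is a minimal, hypothetical Python sketch (noiseless measurements, an invented three-BS layout, and a plain Gauss-Newton solver; this is not the thesis's fusion framework) of a ToF-plus-TDOA snapshot estimate:

```python
import numpy as np

# Assumed example geometry: serving BS at the origin plus two neighbors (meters).
BS = np.array([[0.0, 0.0], [1500.0, 0.0], [0.0, 1500.0]])
C = 299_792_458.0  # speed of light, m/s

def residuals(p, tof, tdoas):
    """ToF to the serving BS (circle) and TDOAs to neighbors (hyperbolas)."""
    d = np.linalg.norm(BS - p, axis=1)
    r = [d[0] - C * tof]
    r += [(d[i] - d[0]) - C * td for i, td in enumerate(tdoas, start=1)]
    return np.array(r)

def gauss_newton(tof, tdoas, p0, iters=20):
    """Snapshot position estimate via Gauss-Newton on the range residuals."""
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        r0 = residuals(p, tof, tdoas)
        # Numerical Jacobian of the residual vector w.r.t. the 2-D position.
        J = np.empty((len(r0), 2))
        for k in range(2):
            dp = np.zeros(2)
            dp[k] = 1e-3
            J[:, k] = (residuals(p + dp, tof, tdoas) - r0) / 1e-3
        p -= np.linalg.lstsq(J, r0, rcond=None)[0]
    return p

# Synthetic ground truth at (600, 400); noiseless measurements for illustration.
true_p = np.array([600.0, 400.0])
d = np.linalg.norm(BS - true_p, axis=1)
p_hat = gauss_newton(d[0] / C, [(d[1] - d[0]) / C, (d[2] - d[0]) / C],
                     p0=[500.0, 500.0])
```

With noisy radii, as in the thesis, the same residual structure is reused but weighted by the measurement uncertainties and fed into a filter over time.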
We propose a fusion framework for both snapshot estimation and filtering, and evaluate it with both simulated and experimental field-test data. The results indicate that the position accuracy is better than 40 meters 95% of the time. A third study in the thesis analyzes the statistical distribution of timing measurement errors in LTE systems. Three different machine learning methods are applied to the experimental data to fit Gaussian mixture distributions to the observed measurement errors. Since current positioning algorithms are mostly based on Gaussian distribution models, knowledge of a good model for the measurement errors can be used to improve the accuracy and robustness of the algorithms. The obtained results indicate that a single Gaussian distribution is not adequate to model the real time-of-arrival (ToA) measurement errors. One possible future study is to further develop standard algorithms with these models.
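Fitting a Gaussian mixture to timing errors, as in the third study, can be sketched with a plain expectation-maximization (EM) loop. The following illustrative Python example runs on synthetic data; the LOS/NLOS error model, sample sizes, and function names are assumptions for demonstration, not the thesis's experimental data or its three fitting methods:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_gmm_1d(x, k=2, iters=200):
    """Fit a k-component 1-D Gaussian mixture to samples x with plain EM."""
    w = np.full(k, 1.0 / k)                       # mixing weights
    mu = np.quantile(x, np.linspace(0.2, 0.8, k)) # spread initial means
    var = np.full(k, np.var(x))                   # start with the overall variance
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        pdf = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        n = resp.sum(axis=0)
        w = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var

# Synthetic ToA errors: mostly small LOS errors plus a biased NLOS tail (meters).
los = rng.normal(0.0, 5.0, 8000)
nlos = rng.normal(60.0, 25.0, 2000)
errors = np.concatenate([los, nlos])
w, mu, var = fit_gmm_1d(errors, k=2)
```

A single Gaussian fit to `errors` would average the two modes together, which is exactly why the thesis finds one Gaussian inadequate for real ToA errors.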
MIMO-OFDM is a key technology for next-generation cellular communications (3GPP-LTE, Mobile WiMAX, IMT-Advanced) as well as wireless LAN (IEEE 802.11a, IEEE 802.11n), wireless PAN (MB-OFDM), and broadcasting (DAB, DVB, DMB). In MIMO-OFDM Wireless Communications with MATLAB®, the authors provide a comprehensive introduction to the theory and practice of wireless channel modeling, OFDM, and MIMO, using MATLAB® programs to simulate the various techniques on MIMO-OFDM systems. This book:
- Is one of the only books in the area dedicated to explaining simulation aspects
- Covers implementation to help cement the key concepts
- Uses materials that have been classroom-tested in numerous universities
- Provides the analytic solutions and practical examples with downloadable MATLAB® codes
- Includes simulation examples based on actual industry and research projects
- Offers presentation slides with key equations and figures for instructor use

MIMO-OFDM Wireless Communications with MATLAB® is a key text for graduate students in wireless communications. Professionals and technicians in wireless communication fields, graduate students in signal processing, and senior undergraduates majoring in wireless communications will find this book a practical introduction to MIMO-OFDM techniques. Instructor materials and MATLAB® code examples are available for download at www.wiley.com/go/chomimo