
Dialnet


Analysis and performance improvement of consumer-grade millimeter wave wireless networks

  • Author: Guillermo Bielsa López
  • Thesis supervisor: Joerg Widmer
  • Defense: Universidad Carlos III de Madrid (Spain), 2019
  • Language: Spanish
  • Thesis committee: Andrés García Saavedra (chair), Matilde Pilar Sánchez Fernández (secretary), Ljiljana Simić (member)
  • Doctoral program: PhD Program in Multimedia and Communications, Universidad Carlos III de Madrid and Universidad Rey Juan Carlos
  • Abstract
    • Millimeter-wave (mmWave) networks are one of the key components of next-generation cellular networks and Wireless Local Area Networks (WLANs). mmWave networks are capable of providing multi-gigabit-per-second rates over highly directional, low-interference links with high spatial reuse. In 2013, the first 60 GHz wireless solution for WLANs appeared on the market: wireless docking stations based on the WiGig protocol. Today, in 2019, 60 GHz communications have gained importance with the IEEE 802.11ad amendment, with different products on the market including routers, laptops and wireless Ethernet solutions. More importantly, mmWave networks are going to be used in next-generation cellular networks, where smartphones will use the 28 GHz band. For backbone links, 60 GHz communications have been proposed due to their higher directionality and unlicensed use. This thesis fits into this frame of constant development of the mmWave bands to meet the latency and throughput needs of future communications. In this thesis, we first characterize the cost-effective design of Commercial Off-The-Shelf (COTS) 60 GHz devices and then improve their two main weaknesses: their low link distance and their non-ideal spatial reuse.

      To date, wireless technologies such as WiFi or Bluetooth, as well as cellular networks such as LTE, operate in frequency ranges that go up to 6 GHz. These sub-6 GHz bands have to cope with the increasing throughput demand while managing highly saturated wireless channels. Devices in these bands usually transmit omnidirectionally, that is, radiating power in every direction. This omnidirectional behavior, together with their large communication range and small penetration loss, can cause significant interference with neighbouring devices belonging to other networks. In order to avoid collisions due to interference, the network needs to schedule the transmissions of each device. If there is no scheduler in the network, devices need to sense the medium to verify whether there are ongoing transmissions. To sidestep these issues, the WiGig Alliance standardized the use of the unlicensed 60 GHz band, creating the WiGig protocol in 2009. In 2012 the WiGig Alliance merged with the WiFi Alliance, jointly producing the new IEEE 802.11ad amendment. IEEE 802.11ad is the result of merging the whole multi-user IEEE 802.11 stack with the 60 GHz physical layer of the WiGig protocol, enabling a mmWave WiFi network with links of up to 6.7 Gbps.

      An increase in frequency corresponds to a reduction in wavelength. This wavelength reduction makes it possible to shrink the antennas: 60 GHz devices are equipped with antennas on the order of millimeters in size, allowing many antenna elements to be stacked together into what is called an ‘antenna array’. Antenna arrays can modify the phase and amplitude of the individual antenna elements, creating constructive or destructive interference that synthesizes directional beam patterns. The directivity of these antenna arrays can be exploited both at the transmitter and at the receiver. This highly directional behavior provides low-interference scenarios with high spatial reuse.
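      The beam synthesis described above can be illustrated with the standard array-factor formula for a uniform linear array. The sketch below is illustrative only; the element count and spacing are hypothetical and not taken from the devices studied in the thesis:

```python
import cmath
import math

def array_factor(n, spacing_wl, steer_deg, theta_deg):
    """Normalized array factor of an n-element uniform linear array.

    spacing_wl: element spacing in wavelengths (at 60 GHz, half a
    wavelength is ~2.5 mm, which is why the array fits in millimeters).
    steer_deg: direction the progressive per-element phase shifts
    steer the main beam towards.
    theta_deg: direction in which the gain is evaluated.
    """
    k = 2 * math.pi * spacing_wl
    # Sum the per-element contributions; the phase term produces
    # constructive interference toward steer_deg and destructive
    # interference elsewhere.
    total = sum(
        cmath.exp(1j * i * k * (math.sin(math.radians(theta_deg))
                                - math.sin(math.radians(steer_deg))))
        for i in range(n)
    )
    return abs(total) / n  # 1.0 in the steered direction

# A hypothetical 16-element array steered to 30 degrees: full gain
# on-axis, strongly attenuated away from it.
on_axis = array_factor(16, 0.5, 30, 30)
off_axis = array_factor(16, 0.5, 30, -10)
```

Changing only the per-element phases moves the beam electronically, which is what lets these devices pick a pattern from a codebook instead of mechanically pointing an antenna.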

      Developing robust consumer-grade devices for mmWave wireless networks is challenging. In contrast to IEEE 802.11n/ac networks operating in the traditional 2.4 and 5 GHz bands, devices based on the recent IEEE 802.11ad standard operating in the unlicensed 60 GHz band must overcome significant hurdles. These include handling 2.16 GHz wide channels, using directional beamforming antennas to overcome the increased attenuation at these frequencies, and dealing with a highly dynamic radio environment. It is critical to take the cost-effective design of COTS devices into consideration when designing networking mechanisms. This is why in this thesis we perform a first-of-its-kind analysis of COTS 60 GHz devices, studying the D5000 WiGig docking station and the TP-Link Talon IEEE 802.11ad router. We include static measurements, such as the synthesized beam patterns of these devices, and an analysis of the area-wide coverage that they can provide. We perform a spatial reuse analysis and study the performance of these devices under user mobility, showing how robust the link is under user movement. We also study the feasibility of flying mmWave links: we mount a 60 GHz COTS device on a drone and perform different measurement campaigns. In this first analysis, we see that these 60 GHz devices show a large performance gap in terms of achievable communication range as well as very low spatial reuse. However, they are still suitable for low-density WLANs and for next-generation aerial micro cell stations.

      Due to the nature of the 60 GHz band, network deployments focus on short-range applications. The range of 60 GHz networks depends on a number of factors, such as the directivity of the antenna and the allowed transmission power. Our measurements reveal maximum link distances of 17 meters. In other works, the general consensus is that the range of 60 GHz networks in Line-of-Sight (LOS) conditions is between 10 and 15 meters. In isolated cases, ranges of up to 25 meters are feasible. Such outliers may be due to, e.g., dry environments in which water absorption is limited. Given the limited range in the standard case, extending the link length is highly interesting, as there will be cases where users need longer link distances and cannot otherwise maintain connectivity. For such cases, 60 GHz networks may need to fall back to legacy ISM IEEE 802.11 using Fast Session Transfer (FST), the process by which IEEE 802.11ad devices fall back from the 60 GHz band to the 2.4 or 5 GHz band due to poor channel characteristics. This has a disastrous impact on throughput, since such legacy networks are orders of magnitude slower than IEEE 802.11ad. Otherwise, 60 GHz networks must resort to relays as included in the IEEE 802.11ad amendment.

      In order to avoid the use of relays, we propose the application of different frequency selective techniques to extend the range of 60 GHz communications. To do so, we measure real-world indoor 60 GHz channels and study their behavior as distance increases. Since these COTS devices are not as directional as the literature suggests, the channels are not as stable in frequency as expected, due to the large number of reflected signals. Ideally, frequency selective techniques could be applied to these frequency selective channels in order to extend the range of these 60 GHz devices. Our channel measurements question common assumptions in the recent 60 GHz networking literature. Such related work often approaches the 60 GHz band from a networking perspective, thus often resorting to assumptions such as flat channels and free-space path loss models. However, early work on propagation characteristics that approaches the 60 GHz band from a physical layer perspective suggests that such assumptions are not realistic. In this work, we bring together both perspectives by considering typical indoor scenarios for 60 GHz network deployments based on IEEE 802.11ad. We show (a) that the free-space path loss model is not accurate in such deployments, and (b) that the impact of frequency selectivity becomes evident as soon as the distance between transmitter and receiver exceeds a few meters.
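      As a point of reference for (a), the free-space baseline that the measurements are compared against follows directly from the Friis equation. The short sketch below computes it; the distances are illustrative values, not measurement data from the thesis:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c) (Friis)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# At 60 GHz the free-space loss is already ~68 dB at 1 m, and every
# 10x increase in distance adds another 20 dB.
loss_1m = fspl_db(1, 60e9)
loss_10m = fspl_db(10, 60e9)
```

The measured indoor channels deviate from this model because reflections add multipath components on top of the direct path.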

      Typical communication systems use the same Modulation and Coding Scheme (MCS) for the whole channel, regardless of how channel conditions vary across the channel bandwidth. This means that when the link suffers from frequency selectivity, the global performance of the connection decreases. Even though only some fraction of the channel experiences low Signal-to-Noise Ratio (SNR) values, the rest of the channel has to adapt and lower its MCS so that all frequency carriers work under the same conditions, decreasing the global performance of the system. In the same way, existing 60 GHz wireless networks ignore frequency selectivity. Furthermore, most implementations use single-carrier schemes as defined in WiGig and the IEEE 802.11ad amendment. This means that each physical layer symbol spreads over the entire channel bandwidth, regardless of potential variations of the channel in the frequency domain. While the standard also allows for OFDM, all subcarriers are treated equally in terms of MCS, thus not taking frequency selectivity into account.
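      The penalty of a single channel-wide MCS can be seen with a toy Shannon-capacity calculation over subcarriers with unequal SNR. The SNR values below are hypothetical, chosen only to show one faded subcarrier dragging the whole channel down:

```python
import math

def aggregate_rate(snr_db_list, per_subcarrier=False):
    """Aggregate spectral efficiency summed over subcarriers
    (Shannon limit, bits per symbol per subcarrier).

    per_subcarrier=False models a single channel-wide MCS limited
    by the weakest subcarrier; True models ideal bit loading.
    """
    if per_subcarrier:
        return sum(math.log2(1 + 10 ** (s / 10)) for s in snr_db_list)
    worst = min(snr_db_list)  # whole channel adapts to the worst SNR
    return len(snr_db_list) * math.log2(1 + 10 ** (worst / 10))

snrs = [20, 18, 19, 3]  # one faded subcarrier among three good ones
uniform = aggregate_rate(snrs)
loaded = aggregate_rate(snrs, per_subcarrier=True)
```

With these values the uniform-MCS rate is roughly a third of the per-subcarrier rate, even though only one of the four subcarriers is in a deep fade.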

      To validate the use of frequency selective techniques in our system, we measure real-world 60 GHz indoor channels as defined in the IEEE 802.11ad standard and study their behavior with respect to techniques such as bit loading, subcarrier switch-off, and waterfilling. Subcarrier switch-off uses only the strong subcarriers in order to achieve a high communication rate, disabling the subcarriers that experience poor channel conditions. Systems implementing bit loading can use a different MCS value for each subcarrier: subcarriers with a high SNR are mapped to high MCS values, while those with low SNR are mapped to low MCS values. Waterfilling allows transmitters to distribute the transmit power unevenly among subcarriers according to the channel response, in order to maximize the spectral efficiency of the system.
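      To illustrate how waterfilling distributes power (and how it subsumes subcarrier switch-off), here is a minimal sketch that finds the water level by bisection. The subcarrier gains are hypothetical, not measured values from the thesis:

```python
def waterfill(gains, total_power, iters=200):
    """Split total_power across subcarriers with linear channel gains
    g_i to maximize sum(log2(1 + g_i * p_i)).

    The optimum has the 'water-pouring' form p_i = max(0, mu - 1/g_i),
    where the water level mu is chosen so the powers sum to the budget.
    """
    lo, hi = 0.0, total_power + max(1.0 / g for g in gains)
    for _ in range(iters):  # bisect on the water level mu
        mu = (lo + hi) / 2
        if sum(max(0.0, mu - 1.0 / g) for g in gains) > total_power:
            hi = mu
        else:
            lo = mu
    return [max(0.0, mu - 1.0 / g) for g in gains]

# Three subcarriers: strong, medium, very weak. The weak one receives
# (almost) no power -- waterfilling switches it off naturally.
powers = waterfill([1.0, 0.5, 0.1], total_power=3.0)
```

Note how strong subcarriers receive more power than weak ones, and a deeply faded subcarrier falls below the water level and is effectively switched off.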

      To this end, we consider an IEEE 802.11ad network using OFDM as defined in the standard. Nodes occupy a bandwidth of 2.640 GHz using 512 subcarriers in total. Of those, 336 subcarriers transport data, 16 are pilots, and the rest are nulled, occupying 1.88 GHz of the 2.640 GHz channel bandwidth. We observe that, while the throughput improvements achievable with frequency selective techniques may be limited, the gains in terms of range extension are significant at low SNR: we can extend the range of a 60 GHz link by up to 3×. In other words, frequency selective techniques can operate where the basic physical layer in IEEE 802.11ad cannot. Although we focus on OFDM systems, it is important to mention that the link outages we experience during this study will also occur in single-carrier devices, which likewise adapt their transmission characteristics to the weakest portion of their channel, limiting their transmission range. In this thesis we want to encourage vendors and developers to adapt their single-carrier 60 GHz systems to OFDM. Once these devices support OFDM, they should implement the frequency selective techniques presented here in order to extend their link range and save energy in their transmissions. The 10 dB energy savings that we demonstrate would be even more useful in battery-driven devices like smartphones.

      The highly directional beam patterns that mmWave devices achieve are supposed to enable very high spatial reuse, allowing many simultaneous links in the same area with almost interference-free communications. In contrast, the omnidirectional transmissions and rich multi-path environment in legacy ISM bands require a MAC layer that avoids concurrent transmissions in order to prevent collisions. In practice, however, the difference between mmWave and lower frequency bands is not that pronounced. For simplicity and to lower manufacturing costs, current mmWave COTS devices use antenna arrays with beam shapes that are much wider and have many more side lobes than theory suggests. These create enough interference to prevent spatial reuse in most practical scenarios. Furthermore, current IEEE 802.11ad devices use a set of predefined beam patterns from a fixed codebook and do not adapt the patterns to the current RF environment.

      Beamtraining is the process by which mmWave nodes select the most suitable beam pattern for communication, which in the case of IEEE 802.11ad is the one that achieves the highest SNR. In this thesis, we propose a novel approach to beamtraining that takes into account not only the SNR, but also the interference that a beam pattern causes. To this end, we design a centralized coordinated system that chooses the most efficient beam pattern for each station, improving throughput for all users. Our system only requires changes to the APs and works with unmodified clients. We implement the different algorithms on COTS devices using two alternative options for the beam patterns: 1) modifying the choice of Tx beam pattern while using the default omnidirectional Rx beam pattern of current COTS devices, and 2) additionally setting the Rx beam pattern equal to the chosen Tx beam pattern. We then analyze the performance of the different algorithms, first in a proof-of-concept scenario and then in a real-world environment, where we test how the different algorithms behave for different positions and link combinations in an open-space office. We evaluate both TCP and UDP as well as uplink, downlink and bidirectional traffic.

      The first mechanism that we develop is `Weighted SIR Fairness', which leads to the highest Signal-to-Interference Ratio (SIR). This mechanism yields the optimum channel capacity for a wireless communication system as given by the Shannon-Hartley capacity theorem. Reality is different for our system, however, as the IEEE 802.11ad standard performs carrier sensing. This implies that even if we achieve very high SIRs resulting in very high MCS, a node that receives interference above the carrier sense threshold needs to wait until the channel is idle before it can access the medium and transmit. This beamtraining mechanism can lead to rate losses, because the beam pattern that the algorithm chooses might still interfere too much for spatial reuse while at the same time reducing the link's SNR. This occurs in half of the cases, with bitrate losses below 25%; in the remaining cases, the mechanism achieves gains. The second mechanism is `Argmaxmin', which chooses the beam pattern combination that maximizes the SIR of the worst link in the whole scenario. This selection can lead to a scenario with poor connections on average: the different nodes might use lower MCS values and may still carrier sense with other nodes. In contrast, the default IEEE 802.11ad algorithm will carrier sense, but it will use stronger link budgets resulting in higher MCS and bitrates. Nevertheless, the gains of this mechanism are usually positive. The third mechanism is `Power Threshold', the most reliable of the mechanisms developed. Here, we select a group of beam patterns that provide a sufficiently high SNR, and thus MCS, and from those select the one that produces the least interference. This way, the MCS is often equal to or not much below the one selected by the default beamtraining mechanism, but with the advantage that the devices carrier sense less often. The last mechanism is `Interference Threshold', which can fail due to the low directionality of the beam patterns. While this mechanism should provide very efficient communications, the low directionality of the current beam pattern codebook makes it very difficult to find a beam pattern capable of providing good communication conditions while causing zero interference to neighboring nodes. It is also important to note that, as we designed our beamtraining mechanisms to work with unmodified clients, it is impossible to achieve 100% spatial reuse, since only one link direction is optimized. However, the `Power Threshold' mechanism achieves gains for more than 90% of the measured positions, making it worth implementing.
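      The `Power Threshold' selection described above can be sketched as follows. The per-pattern SNR and interference values are hypothetical placeholders for what the AP would gather during beamtraining, not measurements from the thesis:

```python
def power_threshold(patterns, snr_min_db):
    """Among beam patterns whose SNR clears snr_min_db, pick the one
    causing the least interference to neighboring links; fall back to
    the default highest-SNR choice when none qualifies.

    patterns: list of (snr_db, interference_db) per codebook entry.
    """
    candidates = [p for p in patterns if p[0] >= snr_min_db]
    if not candidates:
        # Default IEEE 802.11ad behavior: maximize SNR only.
        return max(patterns, key=lambda p: p[0])
    return min(candidates, key=lambda p: p[1])

# Three hypothetical patterns: the 22 dB pattern interferes far less
# than the 25 dB one, so it is chosen as long as it clears the
# threshold of 20 dB.
chosen = power_threshold([(25, -40), (22, -60), (10, -80)], snr_min_db=20)
```

The design trade-off is explicit here: a slightly lower MCS is accepted in exchange for neighbors carrier-sensing the transmission less often.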

      Overall, this thesis shows the behavior of current mmWave COTS devices together with two simple techniques to improve their main limitations: their short link range and their restricted spatial reuse. This first-generation mmWave COTS analysis is highly relevant, as next-generation mmWave devices will be based on them, so it is crucial to understand them well in order to exploit their advantages and improve their weak points. An important step forward for the 60 GHz mmWave industry will be the introduction of new WLAN mmWave devices under the IEEE 802.11ay amendment and the deployment of mmWave wireless backbone links (as AT&T has proposed with the AirGig project). IEEE 802.11ay is the evolution of the current IEEE 802.11ad. This new mmWave WLAN amendment will support hybrid beamforming, channel aggregation and MU-MIMO, resulting in up to four independent streams with up to 8.64 GHz of bandwidth, each capable of achieving 44 Gbps, for a maximum of 176 Gbps over all streams together. AT&T's AirGig combines LTE technology with mmWave communications to provide internet access in rural areas, achieving connections of hundreds of megabits per second with ‘self-install’ equipment. These soon-to-be-deployed technologies motivate the study that we perform in this thesis, as the new devices should take into account all the previous-generation COTS design problems and try to solve them. Implementing the techniques presented in this thesis to enhance the wireless communication range and spatial reuse would also help these next-generation mmWave wireless deployments. Other characteristics that these devices should improve are the handling of user mobility and the antenna array capabilities. The movement of a user may degrade the link, causing large throughput losses, which will worsen when these mmWave chipsets are integrated into Virtual Reality headsets or hand-held devices such as smartphones. Better antenna arrays, in turn, would create more directional links, solving some of the previously mentioned issues.

