LAMPIRIS Eleftherios

Person has left EURECOM

Thesis

Using predictions to reduce complexity and feedback in wireless communications

Wireless communications are currently exhibiting a giant leap in volume and societal impact, but they are also facing a massive environmental challenge in the form of a carbon footprint that already matches that of global aviation and is projected to triple by 2020. This challenge has spurred worldwide research to produce radically new power-efficient, high-performance, environmentally friendly communication technologies. These efforts, however, have encountered two seemingly insurmountable bottlenecks: the bottleneck of computational complexity, corresponding to the need for algorithms that require extreme computing resources, and the bottleneck of feedback, corresponding to the need for equally idealistic feedback mechanisms that must disseminate massive amounts of overhead information about the fluctuating state of every link in the network.

These bottlenecks drive our theoretical vision: We will provide a never-before-attempted exploration of the crucial interdependencies between computational complexity, feedback and performance in wireless communications. They also drive our technological vision: We will develop algorithms for a new class of mobile-user devices that can participate both in gathering and disseminating feedback at the right place and time, and in computing solutions to algorithmic tasks outsourced across the network, an effort that we term "outsourcing the surgical insertion of bidirectional bits and flops across the network" and that aims to reduce computational complexity and improve performance.

We will take a novel approach, and it is this approach that drives our vision. A recent result of ours revealed the surprising fact that, in a simple point-to-point setting, a single bit of feedback from the receiver back to the transmitter (placed properly in time, and properly representing the predicted flop count) can massively reduce the computational complexity of transceiver algorithms. This surprising reduction was traced back to the newly found ability of feedback to 'skew' the statistics of the accumulation of computational load, without negatively skewing the statistics that define performance.
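To convey the flavor of this effect, the minimal Monte Carlo sketch below (in Python) compares the statistics of accumulated computational load with and without one such bit. The fading model, the complexity curve of the full decoder, and the fixed cost of the fallback mode are illustrative assumptions made only for this sketch; they are not the model or the scheme of our publication.

    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 100_000

    # Toy fading model (assumption): per-realization SNR of a Rayleigh link.
    snr = rng.exponential(scale=1.0, size=n_trials)

    # Assumed complexity model: the full decoder's flop count blows up on bad
    # realizations, standing in for search-based transceiver algorithms whose
    # effort spikes when the channel degrades.
    full_decoder_flops = 1e3 / np.maximum(snr, 1e-3)

    # Cheap fallback mode with a fixed, modest flop count (assumption).
    fallback_flops = 2e3

    # One well-placed feedback bit: the receiver predicts whether this
    # realization would be computationally hard; if so, the link falls back.
    hard = full_decoder_flops > fallback_flops

    flops_no_feedback = full_decoder_flops
    flops_with_feedback = np.where(hard, fallback_flops, full_decoder_flops)

    for name, flops in [("no feedback", flops_no_feedback),
                        ("1-bit feedback", flops_with_feedback)]:
        print(f"{name:>15}: mean flops = {flops.mean():10.1f}, "
              f"99th percentile = {np.percentile(flops, 99):10.1f}")

In the sketch, the bit leaves the computational load on good realizations untouched and simply truncates the heavy tail of the load distribution, which is the kind of 'skewing' of complexity statistics referred to above; whether this can be done without hurting performance is exactly the question our result addresses.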

Key to our approach will be predictions, which will enable coded caching. Perhaps the strongest reason to jointly consider coded caching and feedback comes from the possibility that coded caching may be able to alleviate the constant need to gather and distribute channel state information at the transmitter (CSIT), which, given typical coherence durations, is an intensive task that may have to be repeated hundreds of times per second during the transmission of content. This suggests that predicting a library of files during the night (off-peak hours), and subsequently distributing parts of this library content, again during the night, may go beyond boosting performance: it may in fact offer the additional benefit of alleviating the need for prediction, estimation, and communication of CSIT during the day, whenever the requested files come from that library.

The idea of exploring the interplay between feedback and coded caching thus draws directly from the attractive promise that content prediction, performed once a day, can offer repeated and prolonged savings in CSIT.
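To make the mechanism concrete, the Python sketch below walks through a toy instance of coded caching in the spirit of the Maddah-Ali and Niesen scheme: subfiles are placed in user caches off-peak, and each daytime request is then served by XOR-coded multicast transmissions. The number of users, the library, the cache size and the byte-level encoding are toy assumptions chosen for readability; this is not the scheme to be developed in this work.

    from itertools import combinations

    K, N, M = 3, 3, 1          # users, library files, cache size (in files); toy values
    t = K * M // N             # each subfile gets cached by t users (here t = 1)

    # Overnight placement: each predicted file is split into one subfile per
    # t-subset of users, and user k caches every subfile whose index contains k.
    subsets = list(combinations(range(K), t))
    library = {n: {s: f"F{n}|{s}".encode() for s in subsets} for n in range(N)}
    cache = {k: {(n, s): library[n][s] for n in range(N) for s in subsets if k in s}
             for k in range(K)}

    def xor(blocks):
        """Bitwise XOR of byte blocks (zero-padded to the longest)."""
        length = max(len(b) for b in blocks)
        out = bytearray(length)
        for b in blocks:
            for i, byte in enumerate(b.ljust(length, b"\0")):
                out[i] ^= byte
        return bytes(out)

    # Daytime delivery: user k requests file demands[k]; one coded multicast per
    # (t+1)-subset of users serves a missing subfile to every user in that subset.
    demands = {0: 0, 1: 1, 2: 2}
    transmissions = 0
    for group in combinations(range(K), t + 1):
        blocks = [library[demands[k]][tuple(u for u in group if u != k)] for k in group]
        coded = xor(blocks)
        transmissions += 1
        for k in group:
            # User k cancels the blocks it already holds in its cache and is left
            # with the missing subfile of its own requested file.
            held = [cache[k][(demands[u], tuple(v for v in group if v != u))]
                    for u in group if u != k]
            assert xor([coded] + held) == library[demands[k]][tuple(u for u in group if u != k)]

    print(f"{transmissions} coded transmissions served all {K} users")

Each coded transmission is useful to every user in its group at once, which is the multicasting gain that caching buys, and the delivery phase as written never touches instantaneous channel state; this is precisely the CSIT saving alluded to above.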

General setting and context of the proposal

Wireless communications are currently exhibiting a giant leap in volume and societal impact. Very soon, the number of communicating devices will exceed a trillion, while in China alone the number of new base stations is expected to approach a million. France has a substantial claim to the lucrative endeavor of creating the future wireless networks, an endeavor that is also seen as a key element of economic recovery in Europe. At the same time, this endeavor faces an environmental challenge: due to the accelerating growth in traffic generated by mobile devices in cellular telecommunication networks, it is predicted that by the year 2020 these networks will place an unbearable burden on our environment through their power consumption, which currently can scale exponentially with the utility of the network. Additionally, the carbon footprint of mobile communications now matches that of global aviation and is expected to triple by 2020 [8]. This challenge has spurred worldwide research to produce new power-efficient, high-performance, environmentally friendly communication technologies.

However, these efforts have encountered two seemingly insurmountable bottlenecks [1]: the bottleneck of computational complexity, corresponding to the need for algorithmic processing that requires prohibitively high computing resources, and the bottleneck of feedback, corresponding to the need for equally idealistic feedback mechanisms that should, in theory, distribute massive amounts of overhead information simply to describe and predict the randomly fluctuating states of a complex network of intertwined communicating nodes. The lack of proper mathematical machinery has often resulted in rather heuristic treatments of the complexity and feedback bottlenecks, and in the case of complexity, such treatment is almost nonexistent.

Long-term scientific vision, objectives and challenges

The feedback and complexity bottlenecks drive our scientific objective, which is to offer a never-before-attempted exploration of the crucial and largely unexplored interdependencies between computational complexity, feedback and performance in wireless communications. These interdependencies are deep and plentiful. The same bottlenecks also feed our technological objective: to develop algorithms for a new class of mobile-user devices that can participate both in gathering and disseminating feedback at the right place and time, and in computing solutions to algorithmic tasks outsourced across the network.

Our scientific objective is tantamount to promoting the long-overdue marriage between information theory and the complexity theory of computation. A main challenge is to understand the common stochastic elements that govern performance (the domain of information theory) and the accumulation of computational load (the domain of the complexity theory of computation) in crucial communication tasks. This challenge is not insurmountable; we have already found links originating from the fact that, loosely speaking, the channel realizations that hurt performance by forcing errors also tend to cause spikes in algorithmic complexity. Here enters feedback, which, as we have shown in a recent publication for a simple case, has the seemingly magical property of attenuating these spikes when it is placed at the right place and the right time. The goal will be to extend this finding to more interesting networks. Towards this, we will seek to tackle a series of challenges.
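The following small Python experiment illustrates the kind of link we have in mind, under toy assumptions of our own choosing (a 2x2 real-valued channel, BPSK symbols, and the number of candidate vectors inside a fixed decoding sphere as a crude proxy for search effort); it is not the setting of the publication. Realizations on which the maximum-likelihood decision errs typically turn out to be the ones on which the effort proxy spikes.

    import numpy as np
    from itertools import product

    rng = np.random.default_rng(2)

    const = np.array([-1.0, 1.0])                       # BPSK alphabet
    cands = np.array(list(product(const, repeat=2)))    # all 4 transmit hypotheses
    sigma = 10 ** (-8 / 20)                             # noise level at 8 dB SNR (arbitrary)
    radius = 3 * sigma                                  # fixed sphere radius (arbitrary)

    effort, wrong = [], []
    for _ in range(20_000):
        H = rng.normal(size=(2, 2)) / np.sqrt(2)        # random channel realization
        x = rng.choice(const, size=2)                   # transmitted symbols
        y = H @ x + sigma * rng.normal(size=2)          # received vector
        d = np.linalg.norm(y - cands @ H.T, axis=1)     # distances to all hypotheses
        effort.append(int(np.sum(d <= radius)))         # candidates a search must examine
        wrong.append(bool(np.any(cands[np.argmin(d)] != x)))   # ML decision in error?

    effort, wrong = np.array(effort), np.array(wrong)
    print("mean search effort on correct decisions:", round(effort[~wrong].mean(), 2))
    print("mean search effort on error events    :",
          round(effort[wrong].mean(), 2) if wrong.any() else "no errors observed")

In this toy experiment the same bad realizations are responsible for both the errors and the extra work, which is the shared stochastic root that a well-placed feedback bit can then exploit.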

Challenge: Discovering the crucial complexity-feedback-performance interdependencies

As expected, computational complexity, feedback and performance are highly intertwined entities. Complexity naturally affects performance, because having more flops implies more powerful algorithms that better search for, and approach, theoretically optimal solutions. Feedback of course affects performance, as it reduces the (often problematic) randomness of the problem. Hence feedback indirectly affects complexity, via performance, as it opens the possibility of using potentially cheaper algorithms that still offer good performance, albeit less than the newly improved feedback-aided performance (in other words, feedback allows us to be cheaper and suboptimal, but still good enough). Feedback also affects complexity directly, by offering the side information that allows for algorithmic adaptation that 'cuts corners'. A further direct way in which feedback affects complexity has to do with the fact that there are often many 'isomorphic' algorithmic ways to achieve the same task, and for some instances of this task the algorithms reduce to simpler versions of themselves. Feedback tells us which instances these are, and thus allows us to pick the simpler version of the algorithm, as the sketch below illustrates.
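As a concrete, if simplistic, illustration of this corner-cutting, the Python sketch below uses one bit of side information derived from channel knowledge (whether a toy 2x2 channel is well conditioned, a bit that in the scenarios of interest would be fed back rather than computed locally) to choose between two ways of performing the same detection task: a cheap zero-forcing solve when the instance is easy, and an exhaustive maximum-likelihood search only when it is not. The detectors, the conditioning threshold and the SNR are illustrative assumptions, not part of the proposal.

    import numpy as np
    from itertools import product

    rng = np.random.default_rng(1)

    # Toy 2x2 real-valued MIMO detection with BPSK symbols (assumption).
    const = np.array([-1.0, 1.0])
    candidates = np.array(list(product(const, repeat=2)))   # 4 hypotheses

    def ml_detect(H, y):
        """Exhaustive maximum-likelihood search: costly but near-optimal."""
        return candidates[np.argmin(np.linalg.norm(y - candidates @ H.T, axis=1))]

    def zf_detect(H, y):
        """Cheap zero-forcing: one linear solve plus a sign decision."""
        return np.sign(np.linalg.solve(H, y))

    snr_db, n_trials, kappa_max = 10.0, 20_000, 5.0   # arbitrary toy parameters
    sigma = 10 ** (-snr_db / 20)
    errors = searches = 0
    for _ in range(n_trials):
        H = rng.normal(size=(2, 2)) / np.sqrt(2)
        x = rng.choice(const, size=2)
        y = H @ x + sigma * rng.normal(size=2)
        if np.linalg.cond(H) < kappa_max:   # the one-bit side information
            x_hat = zf_detect(H, y)         # easy instance: cut the corner
        else:
            x_hat = ml_detect(H, y)         # hard instance: pay for the full search
            searches += 1
        errors += int(np.any(x_hat != x))

    print(f"vector error rate: {errors / n_trials:.4f}; "
          f"full searches needed on {searches / n_trials:.1%} of realizations")

The bit does not change what is being computed, only which version of the algorithm computes it, so the expensive search is paid for only on the fraction of realizations that actually need it.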

Emphasis will be placed on a carefully selected class of networks whose complexity-feedback-performance properties strike at the core of the fog-versus-cloud dilemma: whether to have decentralized (fog) infrastructures that crowd-source network management and computation to the now powerful devices of the mobile users, or centralized (cloud) infrastructures where all processing is done by massive hubs.

Challenge: Combining elements from different communications paradigms

Our approach here is based on the fact that many architectural paradigms exhibit serious but complementary complexity-and-feedback bottlenecks and advantages, and are thus amenable to cross-combining. Emphasis will be placed on deriving the unexplored and crucial complexity-feedback-performance properties of a carefully selected class of networks (channels), and then on identifying good combinations of these properties that can be mixed and matched to yield guiding principles for fusing solutions across these channels.

Of special interest will be comparisons between Massive MIMO systems and smaller fog-type paradigms and solutions, as well as the comparison, and possible fusion, of Massive MIMO and cloud-RAN solutions. While designing and implementing a complete architecture is certainly beyond the scope of the proposed work, what is possible is to gain crucial insight into general principles for interchanging fog and cloud ideas towards a fused, hybrid class of environmentally friendly wireless networks.