According to a Gizmag article, it appears a possible step change in wireless transmission speeds could be on the cards. This is not too unexpected, as wireless transmission speeds have been growing in significant leaps roughly every 5 to 10 years – the limiter has been the cost of change for both consumer devices and the transmission towers. Although things might be about to change in an unexpected way…
Artemis Networks is a San Francisco-based firm that was founded in the early 2000s and has been “incubated” for over a decade. It claims its new mobile network base stations will be able to eliminate network congestion, dead zones and unreliable connections, instead providing mobile users with a “fiber-class broadband experience.”
The base stations, or “pWaves,” are used in place of mobile network towers. They do not need to be placed at regular intervals like mobile towers, and so can instead be placed with cost and convenience in mind. According to Artemis, they are cheaper to run and more efficient than conventional mobile towers, as well as being cheaper and quicker to roll out. Most importantly, however, they can provide high-performance service regardless of the number and geographical spread of users.
Traditional mobile towers create network “cells” of anywhere between five and 50 km (3 and 31 miles), whilst trying to avoid overlapping and interfering with other cells. All mobile devices in the area share the cell’s capacity for data transmission, meaning that the more users there are trying to connect via the same cell, the worse its performance becomes for each user. pCell technology, on the other hand, supposedly turns this model on its head.
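To put rough numbers on that sharing effect, here's a minimal Python sketch; the 100 Mbps cell capacity is an assumed figure for illustration only, and the perfectly even split ignores real-world scheduling:

```python
# Illustrative only: per-user throughput when one cell's capacity is shared,
# versus the pCell claim of full capacity per user.
CELL_CAPACITY_MBPS = 100.0  # assumed figure, not a real network number

def shared_throughput(users: int) -> float:
    """Ideal even split of one cell's capacity across its users."""
    return CELL_CAPACITY_MBPS / users

def pcell_throughput(users: int) -> float:
    """pCell's claim: every user keeps the full capacity, regardless of load."""
    return CELL_CAPACITY_MBPS

for n in (1, 10, 100):
    print(f"{n:>3} users: shared {shared_throughput(n):6.1f} Mbps, "
          f"pCell {pcell_throughput(n):6.1f} Mbps")
```

At 100 users the shared cell is down to 1 Mbps each, which is exactly the congestion experience the press release is promising to eliminate.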
“Instead of dodging interference, pCell exploits interference, combining transmitted radio signals from multiple pCell base stations to synthesize tiny “personal cells” – pCells – of wireless energy around each mobile device,” explains Artemis. “So, rather than hundreds of users taking turns sharing the capacity of one large cell, each user gets an unshared pCell, giving the full wireless capacity to each user at once.”
(I know – a massive quote – but it is a press release!)
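A toy sketch of the physics being exploited, using only the Python standard library: several transmitters radiate the same carrier, each phase-advanced to compensate for its path length to one chosen point, so the waves add coherently there and tend to smear out elsewhere. The positions, carrier frequency and lossless free-space model are my own invented assumptions – this is in no way Artemis's actual algorithm:

```python
import cmath
import math

# Toy model of "exploiting interference": four transmitters send the same
# unit-amplitude carrier, each pre-phased for its distance to one target
# point, so the signals sum coherently there (path loss is ignored).
C = 3.0e8        # speed of light, m/s
FREQ = 1.9e9     # assumed carrier frequency, Hz
WAVELEN = C / FREQ

transmitters = [(0.0, 0.0), (25.0, 0.0), (0.0, 25.0), (25.0, 25.0)]  # metres
target = (12.0, 7.0)  # where we want the "personal cell" of energy

def field_at(point, phases):
    """Magnitude of the summed carriers at `point`, with each transmitter
    advanced by its assigned phase."""
    total = 0j
    for (tx, phase) in zip(transmitters, phases):
        dist = math.hypot(point[0] - tx[0], point[1] - tx[1])
        total += cmath.exp(1j * (phase - 2 * math.pi * dist / WAVELEN))
    return abs(total)

# Pre-compensate each transmitter for its path length to the target.
phases = [2 * math.pi * math.hypot(target[0] - tx[0], target[1] - tx[1]) / WAVELEN
          for tx in transmitters]

print(field_at(target, phases))        # 4.0: all four waves arrive in phase
print(field_at((20.0, 20.0), phases))  # elsewhere: typically much lower
```

The full system would of course solve this in reverse for many users at once, over real multipath channels – but the core trick is the same: choose the transmitted phases so the interference pattern peaks exactly where each device sits.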
Now, this technology is very interesting to me; it’s a direct demonstration of how the march of more CPU power for your dollar is allowing the solving of previously ‘impossible’ (or not financially viable, which is almost equivalent these days) problems – in this case, processing out, or overcoming, the negative aspects of interference. This is only possible because the processing of the radio signals is quite likely done purely in the software domain rather than in hardware (a so-called Software Defined Radio); i.e. the functionality that a circuit would have provided in demodulating, modulating and mixing RF signals is instead replaced by an algorithm running on a chip – this creates three advantages:
- You can change or modify the algorithm to suit as required, on the fly if necessary,
- You can do things which would have been impossible (or expensive) to do in hardware, and;
- It’s repeatable: a circuit would ‘drift’ over time and temperature; a software algorithm won’t.
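As a concrete (if heavily simplified) illustration of that software-replaces-circuitry point, here's a standard-library-only Python sketch that demodulates an AM signal purely in software – multiplying by a local oscillator and then low-pass filtering, the jobs a hardware mixer and filter stage would otherwise do. All the numbers (sample rate, frequencies, modulation depth) are invented for illustration:

```python
import math

FS = 48_000      # sample rate, Hz (assumed)
CARRIER = 6_000  # "RF" carrier, Hz (assumed)
TONE = 400       # baseband message tone, Hz (assumed)
N = 4_800        # 0.1 s of samples

# AM signal: a carrier whose amplitude follows the 400 Hz message.
signal = [(1.0 + 0.5 * math.sin(2 * math.pi * TONE * n / FS))
          * math.cos(2 * math.pi * CARRIER * n / FS)
          for n in range(N)]

# Software "mixer": multiply by a local oscillator at the carrier frequency,
# shifting the message to baseband plus an unwanted image at 2x the carrier.
mixed = [2.0 * s * math.cos(2 * math.pi * CARRIER * n / FS)
         for n, s in enumerate(signal)]

# Software low-pass filter: a 4-sample moving average, whose window is
# exactly one period of the 12 kHz image, nulling it out.
baseband = [sum(mixed[i:i + 4]) / 4 for i in range(N - 4)]

# baseband now swings between roughly 0.5 and 1.5 – the 1 + 0.5*sin(...)
# message envelope, recovered without any analogue demodulation circuit.
```

Changing this receiver from AM to something else entirely is an edit to a few lines of code, not a new circuit board – which is exactly the flexibility argument in the bullet list above.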
Software running on a powerful processing fabric would enable extensive analysis of the transmission and reception environment in real time; in this case, that permits the precise cancelling out of interference as it occurs.
This, combined with the fact that existing devices do NOT need replacing, puts it on my probable list of technologies to, if not directly get to market, then at least lead to direct improvements in wireless networking services.
Also, this underlines my thinking that the Australian NBN’s unique value proposition to the end consumer is in real danger of being eroded by the march of improvements in wireless technology. Combine this with improvements in video compression technology (like VP8 from Google) and the ‘Need for Speed’ actually levels out as the processing power at either end of the wire improves (higher compression becomes more practical due to cheaper processing power).
As always, the future just keeps getting better…