Whispers & Screams
And Other Things

Contingency. The New Normal

Boris Johnson is the latest UK Prime Minister to make lavish promises about the state of the nation's broadband. By promising full fibre broadband by 2025, he joins the pantheon of modern UK figureheads who have made similarly grand and ultimately meaningless commitments.

When we consider, however, that the policy his government trumpeted so ostentatiously only a few days ago is already government policy, things take on a different hue. Current government planning targets full fibre by 2033, so this plan is really just a move of the goalposts, designed to grab headlines but standing on little to no solid ground. The timeframe committed to is conveniently positioned beyond the maximum five-year parliamentary term, and neither of these so-called commitments is backed by any lucid plan. The real devil in the detail will be understanding exactly how much subsidy the proposed changes will mean for areas left behind by their poor commercial viability.

The most recent government figures suggest that GB plc currently has 7.1% full fibre coverage. Those figures were valid in January, but in cities connectivity is accelerating rapidly: BT Openreach currently claims to be passing about 80,000 premises per month, yet to hit the ambitious target recently set, that rate will need to ramp up to around 400,000 a month.

If the government really got serious about full fibre, would 2025 even be possible? Definitely. Is it likely? No way. Would it be good for UK business if achieved? Undoubtedly.

As a nation we are actually doing OK on the absolute amount of fibre in the network. There's quite a lot of it, but it usually stops at cabinets on our streets. This is the typical FTTC (fibre to the cabinet) model, where fibres are terminated in the ubiquitous street cabinets and from there we go back to trusty copper for the "last mile". Fibre is faster and more reliable, but copper is cheaper and easier to engineer and manage. It is in this last mile that the real nub of the UK connectivity problem must be solved if we are to connect our homes, all 30 million of them, to the information national grid. A government report published last year set out the potential costs of meeting the less ambitious 2033 target, putting the final bill at approximately £33bn.
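As a rough sanity check on those figures, here is a back-of-envelope sketch that can be pasted into a browser console. The inputs are only the approximations quoted above (around 30 million premises, 7.1% already covered, a £33bn bill) plus an assumed 66 months between mid-2019 and the end of 2025:

// Back-of-envelope check on the full fibre figures quoted above (all inputs are approximations)
var totalPremises = 30e6;        // roughly 30 million UK premises
var currentCoverage = 0.071;     // about 7.1% full fibre today
var monthsTo2025 = 66;           // assumed: mid-2019 to the end of 2025
var costEstimate = 33e9;         // about £33bn from the government report

var remaining = totalPremises * (1 - currentCoverage);   // premises still to connect
var requiredPerMonth = remaining / monthsTo2025;         // pace needed to hit 2025
var costPerPremise = costEstimate / totalPremises;       // crude average cost

console.log("Premises remaining: " + Math.round(remaining));             // ~27.9 million
console.log("Required pace per month: " + Math.round(requiredPerMonth)); // ~420,000, against ~80,000 today
console.log("Average cost per premise: £" + Math.round(costPerPremise)); // ~£1,100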

Of course the reality is that there will always be a long tail of between 10% and 25% of UK homes for which the ROI (return on investment) figures simply don't stack up commercially. For these locations the intervention of central government is essential, and, through schemes such as the Gigabit Broadband Voucher Scheme, steps have been taken.

Indeed it is in these 3-8 million homes that the greater engineering challenges lie. Options exist, of course, but most are far from perfect and many are only barely acceptable.

Take satellite broadband, for example. This technology can ostensibly resolve the whole problem in a snap and yet, unfortunately, it can't. Satellite broadband is suboptimal in so many areas, be it latency (delay), contention (shared infrastructure), asymmetry (slow upload) or metering (data limits). The latency renders the technology almost useless for time-sensitive applications such as video, voice, gaming and even share trading. Contention is a real thorn in the side of the industry, as satellite transponders become filled and oversubscribed with users competing for the finite throughput available; Eutelsat, for example, has long been the subject of industry chatter regarding its oversubscribed transponders, amongst its other problems. The low-power nature of consumer satellite terminals means they have a horribly limited capacity to transmit back up to the satellite, and if users are lucky enough to get and maintain a stable, fast connection, they soon realise they have fallen foul of stringently monitored and enforced data limits, known in the industry as the FUP (Fair Use Policy).
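The latency problem, at least, comes straight from physics. A geostationary satellite sits at roughly 35,786 km, and a request has to cross that distance four times (user to satellite to gateway, then back again) before any terrestrial delay is added. A quick sketch of the best case, ignoring the slightly longer slant path to a real dish:

// Minimum possible round-trip latency via a geostationary satellite
var GEO_ALTITUDE_KM = 35786;       // altitude of a geostationary orbit
var SPEED_OF_LIGHT_KM_S = 299792;  // propagation speed in free space

// Four space legs per round trip: user -> satellite -> gateway, then gateway -> satellite -> user
var legs = 4;
var minRttMs = (legs * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S) * 1000;

console.log("Best-case round trip: ~" + Math.round(minRttMs) + " ms");  // ~477 ms before any network delay at all

Nearly half a second of unavoidable delay is why interactive applications struggle, however much bandwidth the transponder offers.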

3G/4G also seems at first glance to be a promising way of providing quick connectivity to poorly served premises, but the fact is that the premises for which fixed infrastructure is a problem are typically also poorly served by UMTS networks and their descendants. For those able to make use of these services, costs can be high, metering is always on, throughput is patchy and varies enormously through the day, and the infrastructure upstream of the masts is typically under-provisioned.

Another solution is point-to-point radio, whether in the microwave (SHF) or millimetre-wave (EHF) bands. These links provide near-fibre performance and, compared with the cost of installing fibre across or under terrain, are relatively economical. They do, however, carry a price tag in the thousands of pounds, even though pricing has dropped enormously in recent years thanks to the emergence of companies like Ubiquiti and Siklu. For this reason such projects are usually reserved for small community initiatives or for remote or inaccessible commercial sites. Even after the books have been balanced, the links created still need backhaul infrastructure, and this can often be the greatest hurdle to establishing a link to serve a site. Line of sight is essential, and planning it can be a dark art once one considers the vagaries of the Fresnel zone and its effects on propagation across free space.
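To give a feel for the planning involved, the radius of the first Fresnel zone at any point along a path follows directly from the link frequency and the distances to each end; a common rule of thumb is to keep at least 60% of that radius free of obstructions. A small illustrative sketch, using an assumed 5 km link at 5.8 GHz:

// First Fresnel zone radius at a point along a point-to-point link
// r = sqrt(lambda * d1 * d2 / (d1 + d2)), with all distances in metres
function fresnelRadius(frequencyHz, d1Metres, d2Metres) {
  var lambda = 3e8 / frequencyHz;   // wavelength in metres
  return Math.sqrt(lambda * d1Metres * d2Metres / (d1Metres + d2Metres));
}

// Example: the midpoint of an assumed 5 km link at 5.8 GHz (a typical unlicensed microwave band)
var radius = fresnelRadius(5.8e9, 2500, 2500);
console.log("First Fresnel zone radius: ~" + radius.toFixed(1) + " m");  // roughly 8 m of clearance needed at mid-path

Eight metres of clearance at mid-path, over rolling terrain or treetops, is why a link that looks fine on the map can fail in the field.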

In practical terms, therefore, a capable telecommunications solutions provider needs to understand, and be able to deploy, one, some or all of the above methods, sometimes even in parallel. This, as you might expect, opens a Pandora's box of further issues, as new challenges such as channel bonding and traffic sequencing rear their heads.

The fact is, data communications is the new water or electricity, and hurdles in delivering the service can no longer be accepted as reasons not to provide it. It simply must get through and, as we move into a world where almost everything travels to us over the network, the ability to overcome engineering challenges with whatever contingency communications are required is fast becoming a project worthy of a Thomas Telford or an Isambard Kingdom Brunel.


If You Can't Beat 'Em, Join 'Em

Yesterday evening (Pacific Time), while we in Europe were tucked up in our dreams, Elon Musk hosted a press conference for one of his most exciting ventures yet. The organisation is called Neuralink and its stated aim is to develop implantable brain-machine interfaces.

Those who are aware of Mr Musk's previous statements in this field will know that he has been a vocal Cassandra when it comes to the fate of mankind against the rise of the machines. Indeed, for the imaginative among us it doesn't take too much of a leap to envisage a future where hyper-capable, mechanised super-intelligences regard our flesh-and-blood existence as nothing more than a primitive curiosity, perhaps at the level of a pet.

So when I heard about the press conference and the hubbub its announcement had aroused in the cognoscenti press, my own curiosity was thoroughly piqued. You see, ever since I began to muse on this potentially existential threat to mankind, I have seen it as something of a distraction: I have always felt that human augmentation, whose origins can be traced as far back as Long John Silver, would ensure that any 'rise of the machines' carried us along with it. Indeed, if we are to seek out a dystopian slant on this discussion, the more likely horror future is one where augmented humanity (wealth) and vanilla humanity (poverty) are at odds with each other.

Notwithstanding the philosophical discussions, however, the announcement last night has, as is so often the case, proven to be a lot less than the aficionados predicted and a lot more than the sceptics expected. The company appears to be making solid progress, albeit not yet in human bodies. Indeed Mr Musk himself appeared to blurt out, to the chagrin of the scientists around him, that they had successfully tested their technology in a monkey. Putting aside my own misgivings about trialling these things on unsuspecting lab rats or monkeys, this would appear to be pretty significant news. If we take the claims at face value, the technology has now been proven in principle, and we should not underplay the significance of that revelation.

Science has been integrating technology with flesh and bone for decades, but it is the incursion into the last bastion of the unexplored, the human brain, that makes this so important. We need only look at the global attention given to The Human Brain Project to understand how this captures our imagination. Neuroscientists have spent years trying to understand the workings of the supercomputers we all carry around with us, and in connecting machines to our brains we would seem to be a whole lot closer to that day. Questions about the nature of consciousness, and about what we might call our identity or soul, fly around the perimeter of this discussion, but at its heart lies the notion that our bodies, and indeed our brains, are chemical machines; once we understand the systems in action, we can begin to harness them and make them work to our greater good (and bad).

Mr Musk has announced that he and his company of pioneering scientists intend to place their system into a human in 2020, and if that is accomplished it will indeed be a day that goes down in history. So we wait and we watch. A world now used to the headlong pace of progress will perhaps be wowed once again as science takes us to new heights. The future is ours to shape, and as with any new technology in the hands of us human apes, the measure of the science will not be what the technology CAN do but what we as a species CHOOSE to do with it. Let's hope we're up to the challenge.

Neuralink website here

Livestream of event here

Much more here

 


The Joy Of Driving

As anyone who has driven on the UK's congested motorways will attest, once the roads pass a critical threshold of overload, the sheer unpredictability of the drivers around you becomes the most important factor in your cognition. All it takes is for one driver to touch the brake pedal, lighting up the brake lights, and a chain reaction of terror ensues in their wake. If an accident is luckily avoided, it's almost a certainty that one of those frustratingly inexplicable, causeless traffic jams will follow.

I have always believed in the power of computer network traffic engineering techniques to come to our aid in situations like this. But, just as on a crowded pavement, the unpredictability of the individual has kept such a solution frustratingly out of reach.

But it seems that automation and machine learning have brought this notion a step closer. By abdicating control to our machines, network traffic theory can be put into practice ensuring that optimal flow continues.

Until recently we have lacked any way for vehicles to work together, and it is this collaborative effort, overseen and perhaps controlled by a meta-intelligence, that can bring about the seismic change that has so far eluded us.

For my own part I detest most driving. It's basically dead time in which my brain is given over to one mind-numbing task when I'd much rather be reading a book, getting some work done or even just sleeping. The day I can tell my car where I want to go and then switch off until I'm there will be a red-letter day for me. I was therefore pleased to hear the results of some recent research confirming that, in tests, a fleet of driverless cars collaborating with each other can improve overall traffic flow by at least 35%.

Michael He, one of the researchers, was quoted thus: "Autonomous cars could fix a lot of different problems associated with driving into, within and between cities but there has to be a way for them to work together."

The key will lie in the adoption of standards and, just as during the development of the standards which now dominate the internet, we are in a period of competition in which the standard that wins out may not be the best. (Think ATM vs Ethernet for transporting video, or VHS vs Betamax for watching it.)

Much of the current testing and development is done using scale models and single-board computers (SBCs) such as the Raspberry Pi or Orange Pi, which lets researchers avoid the prohibitive cost of building full-scale test environments. Using such swarm systems, where each node in the network can communicate at least with its neighbours, it becomes possible for the overarching 'intelligence' to manage priorities for optimal traffic flow, achieving something approaching harmony in a ballet of competing priorities and near misses that would send most human drivers to the hard shoulder. Cars can be packed more closely together and yet continue to make progress towards their destinations in conditions which would previously have been untenable with unpredictable humans at the wheel.

Interestingly, these tests simulated a mix of humans and automata, with the overall level of network collaboration set to either egocentric or cooperative. Improvements of 35% were observed with cooperative traffic, while in the egocentric scenario the improvement was as much as 45%.
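To make the idea concrete, here is a toy ring-road sketch, not the researchers' model, just an illustration of the mechanism. In "egocentric" mode each car reacts only to the gap in front of it; in "cooperative" mode it also listens to the speed reported by the car ahead and eases off early rather than braking hard when the gap finally closes. All the parameters (road length, car count, braking limits) are invented for illustration:

// Toy ring-road simulation of a single braking disturbance propagating through a fleet
function simulate(mode) {
  var roadLength = 1000;   // metres of circular track
  var carCount = 25;
  var maxSpeed = 30;       // m/s
  var headway = 1.5;       // desired time gap in seconds
  var dt = 0.5;            // timestep in seconds
  var steps = 600;

  var pos = [], speed = [];
  for (var i = 0; i < carCount; i++) {   // evenly spaced cars at a modest starting speed
    pos.push(i * roadLength / carCount);
    speed.push(15);
  }

  var totalSpeed = 0, samples = 0;
  for (var t = 0; t < steps; t++) {
    var newSpeed = [];
    for (var i = 0; i < carCount; i++) {
      var ahead = (i + 1) % carCount;
      var gap = (pos[ahead] - pos[i] + roadLength) % roadLength;

      // Speed that keeps a safe time gap to the car in front
      var target = Math.min(maxSpeed, gap / headway);

      // Cooperative cars also hear the leader's current speed and ease towards it early
      if (mode === "cooperative") target = Math.min(target, (target + speed[ahead]) / 2);

      // One car briefly brakes hard to inject a disturbance
      if (i === 0 && t > 50 && t < 60) target = 2;

      // Limited acceleration and braking per step
      var accel = Math.max(-6, Math.min(2, (target - speed[i]) / dt));
      newSpeed.push(Math.max(0, speed[i] + accel * dt));
    }
    for (var j = 0; j < carCount; j++) {
      speed[j] = newSpeed[j];
      pos[j] = (pos[j] + speed[j] * dt) % roadLength;
      if (t >= 60) { totalSpeed += speed[j]; samples++; }   // sample once the disturbance has been injected
    }
  }
  return totalSpeed / samples;   // average fleet speed, m/s
}

console.log("Egocentric average speed:  " + simulate("egocentric").toFixed(1) + " m/s");
console.log("Cooperative average speed: " + simulate("cooperative").toFixed(1) + " m/s");

The absolute numbers mean nothing; the point is to compare how the same disturbance ripples through a fleet that shares information versus one that does not.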

Machine learning and swarm software modelling are bringing this field of imagined utopia into reality with staggering speed, and for this driver the day when I can tell my car where I'm going and then put my feet up can't come a moment too soon.

 

Duplicate element IDs in the DOM

Short post today, folks, since it's Friday :)

I've been developing quite a lot lately using AngularJS and ReactJS, as I'm currently heavily involved with a startup preparing to release a new IoT product. The app is pretty complex, with heavy calls to the server-side database, but that's not what we're going to talk about today.

One of the things that seems very straightforward when you're building a small UI is keeping the element IDs in your DOM unique at any given time. As complexity increases, though, so does the difficulty of maintaining a mental map of the DOM you have at any given moment, especially if parts of your viewport are loading dynamically via AJAX calls or whatever.

The downside of having a duplicate element ID may not be immediately apparent, as it is not something that will always push an error to the console. It can result in confusing, erratic behaviour from your application, and hours can be lost trying to work out what on earth is going on.

By the time you eventually get to the bottom of the problem, you realise you've wasted hours searching for something that was ultimately a very basic error.

So anyway, this happened to me once or twice over recent weeks. Our application loads dynamic data via AJAX quite frequently, and the issue I faced was sometimes intermittent. The worst-case scenario.

After the dust had settled, I decided to create a tool for myself that I could use to quickly establish whether any duplicate IDs are present. This is what I came up with. Simply paste this code into your browser's inspector console and hey presto!

var DOMElements = document.getElementsByTagName("*"),  // every element in the document
    DOMIds = Object.create(null),                      // ids seen so far (no prototype keys to trip over)
    duplicateIDs = [];

for (var x = 0, len = DOMElements.length; x < len; ++x) {
  var element = DOMElements[x];
  if (element.id) {
    // Already seen this id? Record it as a duplicate
    if (DOMIds[element.id] !== undefined) duplicateIDs.push(element.id);
    DOMIds[element.id] = element.name || element.id;
  }
}

if (duplicateIDs.length) {
  console.error("Duplicate IDs:", duplicateIDs);
} else {
  console.log("No duplicates detected");
}

Feel free to use and abuse as you see fit. I hope this helps somebody out there save some hair :). Have a fab weekend.


The Web By Proxy

I've been working on networks for decades, and for as long as I can remember network proxies have existed. I first came across the idea when I worked for IBM as an SNA programmer back in the late 90s, but it's in more recent years that network proxies have taken on greater importance.
