
Putting edge computing centre stage in the UK’s fourth industrial revolution

By Lucy Cinder


The UK is facing its next wave of digital transformation, driven by the increased development, adoption and deployment of advanced, innovative digital technologies such as cloud computing, the internet of things (IoT), artificial intelligence (AI) and 5G.

The vision for the fourth industrial revolution is for every sector to become a data-driven digital sector. No single technology, such as AI, holds the answer on its own. The UK's digital economy and society will be supercharged by the increased convergence of a number of key innovative technologies that will create growing amounts of data, which will need to be collected, processed, analysed and stored at both the local and central level.

It is in this vision of the UK's fourth industrial revolution that edge computing could take centre stage.

Imagine a future where 5G and edge servers, working together to provide local, time-critical, low-latency data processing, make IoT devices a reality and transform traditional sectors such as manufacturing. We could see a future where the convergence of AI, cloud and edge computing provides the vehicle and roadside digital infrastructure needed to make driverless cars a reality.

But if this is the vision of our digital future, how do we get this right? What issues do we need to consider to ensure this vision becomes a reality? A key issue is that we must account for the energy impacts of an increase in data processing at the edge. 

During the last decade, we have encouraged the consolidation of IT functions, including data collection and processing, into larger, purpose-built facilities where electricity consumption is transparent, where energy stewardship is scrutinised and where there are strong incentives for efficiency. If we see increased demand for more data processing at the local level by edge computing, before data is sent to the cloud, how will we aggregate that new energy use and how can we ensure it is transparent and accountable? 

It is hard to predict a future based on emerging, innovative and disruptive technologies. We know datacentre energy use is growing, but not in line with data volumes. Thanks to factors such as Moore’s Law and virtualisation, the energy needed to process a given amount of data has decreased by over six orders of magnitude in 30 years.
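As a back-of-the-envelope check on that figure (a hypothetical illustration, not from the original article), a six-order-of-magnitude improvement over 30 years implies an annual efficiency gain of roughly 58%, with the energy needed per unit of data halving about every 18 months:

```python
import math

# Claim from the article: energy per unit of data processed fell by
# six orders of magnitude (a factor of 1,000,000) over 30 years.
improvement_factor = 10 ** 6
years = 30

# Implied constant annual improvement rate.
annual_factor = improvement_factor ** (1 / years)  # ~1.585, i.e. ~58% per year

# Implied time for energy-per-unit-of-data to halve.
halving_years = math.log(2) / math.log(annual_factor)  # ~1.5 years

print(f"Annual improvement: {annual_factor:.3f}x (~{annual_factor - 1:.0%}/year)")
print(f"Efficiency doubles roughly every {halving_years * 12:.0f} months")
```

That implied halving time of around 18 months is broadly consistent with the long-run trend in computing energy efficiency often referred to as Koomey's law.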

Some also argue that edge datacentres could be far more autonomous in terms of electricity supply, and that a much wider range of power sources will be available to them: because they are less monolithic, resilience can be built in through duplication and overlap rather than through continuity of supply to individual units.

Now is not the time to be complacent. The increased convergence of emerging technologies such as 5G and edge computing could represent significant growth in what is effectively a new form of distributed IT. If we are to realise the full potential of these technologies, it is important to do so in a way that builds public trust and confidence, by identifying and addressing potential issues and concerns about energy usage now.

Source: Computer Weekly

Industry: Cloud Computing