Fujitsu CTO: Don’t expect dedicated data centres for AI


Dr Joseph Reger, Chief Technology Officer EMEIA at Fujitsu, talks to João Marques Lima about the future of data centres powered by non-human intelligence.

The age of artificial intelligence (AI) is approaching the data centre at speed. Globally, investment in AI is predicted to top $36.8bn by 2025, up from $643.7m in 2016, according to intelligence firm Tractica.

Data centres are also set to jump on the multi-billion-dollar bandwagon over the coming years as non-human intelligence enters data halls across the world.

The use of machine learning, automation, special algorithms such as genetic algorithms, neural networks, deep learning and other AI technologies is believed to make data centres run better, both from an infrastructure and a workload perspective.

However, “there will be no dedicated data centres for AI, but all data centres will run AI”, warns Dr. Reger.

“Currently, cloud data centres are essentially Intel processors running Windows and, largely, Linux operating systems (OS).

“There will, however, be special hardware coming because the above is the standard hardware and the standard OS. However, there are applications today that are just becoming very important and need more compute power than what the standard architectures can do, and these are the applications of AI.”


Putting machine learning algorithms to work

AI applications will need compute power beyond what is available on the market today. A common technology already deployed in data centres is machine learning, which is now being advanced through the use of neural networks: computing systems that imitate biological nervous systems such as the human brain to absorb and process data.

Dr. Reger said: “The way most of the machine learning algorithms work, using neural networks, is that there is a phase in the beginning where you do the training of the system yourself.

“When that is done, you just run it [it takes a bit of human work to train the network in the beginning, in some cases they are totally automated systems as well].”

In a second phase, the technology is put into service, which some people refer to as inference “in…
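To make the two phases Dr. Reger describes concrete, below is a minimal sketch, not Fujitsu's implementation, of a neural network that is first trained and then simply run for inference. The toy XOR task, layer sizes, learning rate and iteration count are illustrative assumptions, and NumPy stands in for whatever specialised hardware and software a production data centre would use.

```python
# Sketch of the two phases: training (compute-heavy) then inference (just run it).
# Toy problem, network size and hyperparameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = XOR(a, b), a classic non-linear example.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units.
W1 = rng.normal(scale=1.0, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Phase 1: training, where the network's weights are fitted to data ---
lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent update
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

# --- Phase 2: inference, "when that is done, you just run it" ---
def predict(x):
    return sigmoid(sigmoid(x @ W1 + b1) @ W2 + b2)

print(predict(X).round(3))  # values near 0, 1, 1, 0
```

The point of the sketch is the asymmetry: the training loop dominates the compute cost, which is why AI workloads push beyond standard data-centre architectures, while the trained network itself is a cheap forward pass that any hall can serve.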
