A neural network has been taught to determine the strength and type of wine with 95% accuracy

Scientists from NIST's Hardware for AI group, together with colleagues from the University of Maryland, have developed a neural network that operates with improved energy efficiency.

As with traditional computer systems, AI has both physical hardware and software. The hardware typically consists of a large number of conventional silicon chips that consume a great deal of energy: for example, training one modern commercial processor takes about 190 megawatt-hours of electricity.

A less energy-intensive approach is to build neural networks from other types of hardware. One promising device is the magnetic tunnel junction (MTJ). MTJ-based devices consume several times less energy than their conventional counterparts, and they operate faster because they store data in the same location where the calculations are performed.

The team trained the network on 148 wines made from three types of grapes. Each virtual wine had 13 characteristics to take into account, including alcohol content, colour, alkalinity and magnesium. Each characteristic was scaled to a value between 0 and 1 so that the wines could be compared and distinguished from one another.
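The scaling step described above can be sketched in a few lines of Python. This is a minimal illustration of min-max normalization, not the researchers' actual code; the feature values below are invented for the example.

```python
def min_max_scale(values):
    """Linearly scale a list of numbers into the range [0, 1]."""
    lo, hi = min(values), max(values)
    span = hi - lo
    # If all values are equal, map everything to 0.0 to avoid dividing by zero
    return [(v - lo) / span if span else 0.0 for v in values]

# Illustrative raw measurements of one characteristic (e.g. alcohol %) across wines
alcohol = [12.8, 13.5, 14.2, 11.9, 13.0]
scaled = min_max_scale(alcohol)
print(scaled)  # every value now lies between 0 and 1
```

Applying the same scaling to each of the 13 characteristics puts all features on a common 0-to-1 footing, so no single measurement dominates the comparison between wines.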

The AI then went through a virtual test on a dataset of 30 unknown wines, and the system classified them with 95.3 per cent accuracy.

The main conclusion is that MTJ devices can be scaled up and used to build new AI systems.

The amount of energy a system consumes depends on its components, but using MTJs as synapses could cut energy consumption roughly in half.