Designing Future Chips with AI & ML

Evolution, Concept and Architecture of AI & ML

Artificial Intelligence (AI) refers to a subfield of Computer Science. Artificial Intelligence is enacted by machines, computers, and mainly software. Machines mimic (here we see why it is called artificial) some kind of cognitive function based on environment, observations, rewards, and a learning process.

During the Second World War, the noted British computer scientist Alan Turing worked to crack the ‘Enigma’ code, which was used by German forces to send messages securely. Turing and his team created the Bombe machine to decipher Enigma’s messages, and the Enigma and Bombe machines laid the foundations for machine learning. According to Turing, a machine that could converse with humans without the humans knowing it is a machine would win the “imitation game” and could be said to be “intelligent”.

In 1951, a machine known as the Ferranti Mark 1 successfully used an algorithm to master checkers. Subsequently, Newell and Simon developed the General Problem Solver algorithm to solve mathematical problems. Also in the 1950s, John McCarthy, often known as the father of AI, developed the LISP programming language, which became important in machine learning.

Machine Learning

Machine learning is a subset of AI in which we use algorithms, along with technologies like sensors and connected devices, to teach a system to learn autonomously. Instead of feeding it hundreds of instances for every decision, we give the machine the ability to filter out wrong decisions and choose the most appropriate ones (again, like humans). But that is not as easy as it sounds. Virtual voice assistants and the other technologies mentioned earlier are based on machine learning. To give the right response, Alexa has to understand whether your question is genuine or sarcastic, filter out replies that sound dumb, and respond in a way that matches the tone of your question.
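To make the "learn from examples instead of hand-coded rules" idea concrete, here is a minimal sketch, assuming scikit-learn is available; the two features and the tiny labeled dataset are made up purely for illustration:

```python
# Minimal sketch: fit a classifier from labeled examples instead of
# hand-coding decision rules. Features and labels below are fictitious.
from sklearn.tree import DecisionTreeClassifier

# Each example: [message length, exclamation marks]; label 1 = sarcastic
X = [[12, 0], [8, 3], [40, 0], [6, 4], [35, 1], [9, 5]]
y = [0, 1, 0, 1, 0, 1]

model = DecisionTreeClassifier().fit(X, y)  # the model "learns" a rule
print(model.predict([[7, 4]]))              # classify a new, unseen input
```

The point is the division of labor: we supply examples, and the algorithm induces the decision rule itself.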

Designing Future Chips

With PCs and mobile phones, the game-changing innovations that defined this era, the architecture and software layers of the technology stack enabled several important advances. In this environment, semiconductor companies were in a difficult position. Although their innovations in chip design and fabrication enabled next-generation devices, they received only a small share of the value coming from the technology stack—about 20 to 30 percent with PCs and 10 to 20 percent with mobile, reports McKinsey.

The report also suggested where new revenue and sales will come from for semiconductor companies. Their answer: AI & ML.

  • AI could allow semiconductor companies to capture 40 to 50 percent of total value from the technology stack, representing the best opportunity they’ve had in decades.
  • Storage will experience the highest growth, but semiconductor companies will capture most value in compute, memory, and networking.
  • To avoid mistakes that limited value capture in the past, semiconductor companies must undertake a new value-creation strategy that focuses on enabling customized, end-to-end solutions for specific industries, or “microverticals.”

Opportunities for Semiconductor Companies

AI has made significant advances since its emergence in the 1950s, but some of the most important developments have occurred recently as developers created sophisticated machine-learning (ML) algorithms that can process large data sets, “learn” from experience, and improve over time. The greatest leaps came in the 2010s because of advances in deep learning (DL), a type of ML that can process a wider range of data, requires less data preprocessing by human operators, and often produces more accurate results.

Intelligent applications

Most organizations’ preference for acquiring AI capabilities is shifting in favor of getting them through enterprise applications. Intelligent applications are enterprise applications with embedded or integrated AI technologies that support or replace human-based activities via intelligent automation, data-driven insights, and guided recommendations to improve productivity and decision making, writes Gartner.

Today, enterprise application providers are embedding AI technologies within their offerings as well as introducing AI platform capabilities — from enterprise resource planning to customer relationship management to human capital management to workforce productivity applications.

Quantum Computing

At the outset, let me dispel any myth that we will make quantum computing dramatically better in 2019. Instead, we will just push a little in the right direction toward building better quantum computing devices. Although the increments will be small, quantum computing will still be a huge focal point in the area of AI.

Quantum computers use quantum physics to perform certain calculations faster than any supercomputer available today. We are well aware of how computers use bits and bytes; unlike a regular computer, however, a quantum computer stores information in qubits (quantum bits), which can exist in superpositions of 0 and 1.
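A minimal sketch of the difference, using a plain NumPy state-vector simulation (no quantum hardware or quantum SDK assumed): a classical bit is either 0 or 1, while a qubit passed through a Hadamard gate yields both outcomes with equal probability.

```python
# Minimal sketch of what makes a qubit different from a bit, using a
# plain NumPy state-vector simulation.
import numpy as np

zero = np.array([1.0, 0.0])                   # the definite state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

qubit = H @ zero                 # superposition of |0> and |1>
probs = np.abs(qubit) ** 2       # measurement probabilities
print(probs)                     # ~[0.5, 0.5]: both outcomes possible
```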

We have a long way to go in dealing with the challenges of quantum computing, such as maintaining the coherence of the qubits or filtering out unnecessary, noisy computations.

Biased data

This topic is becoming increasingly important as machine learning models are used for decisions such as hiring, mortgage loans, parole, or the type of social service benefits a person receives. For example, consider the fictitious case of deciding whether to promote a woman. Historical employment data may show women being promoted less often than men, and a model trained on that data would produce discriminatory AI applications. Many such examples have led to an increased emphasis on dealing with biased data, and as the usage of AI applications grows in 2019, so does the effort to learn how to handle it.
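One simple way to surface this kind of bias before training is to compare outcome rates across groups. The sketch below is illustrative only; the promotion records are made up:

```python
# Minimal sketch: comparing promotion rates by group in historical data
# (a rough demographic-parity check). The records below are fictitious.
promotions = [
    {"group": "women", "promoted": 1}, {"group": "women", "promoted": 0},
    {"group": "women", "promoted": 0}, {"group": "men", "promoted": 1},
    {"group": "men", "promoted": 1},   {"group": "men", "promoted": 0},
]

def rate(group):
    rows = [r["promoted"] for r in promotions if r["group"] == group]
    return sum(rows) / len(rows)

# A large gap here suggests the training data encodes historical bias.
print(rate("women"), rate("men"))  # 0.33 vs 0.67
```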

Neural networks

To put it briefly, neural networks, or artificial neural networks, emulate the human brain. They store all data in a digital format (sensory, text, or time series) and use it to classify and group information. For example, reading someone’s handwriting comes easily and unconsciously to us, whereas to teach an algorithm to do the same, we feed it vast amounts of handwritten data so it can recognize the patterns in it.
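As a concrete illustration of learning patterns from examples, here is a minimal sketch of a two-layer neural network trained with plain NumPy; the XOR task, layer sizes, and learning rate are illustrative choices, not anything from the article:

```python
# Minimal sketch of an artificial neural network: a tiny two-layer
# network learning XOR via gradient descent, in plain NumPy.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)      # backpropagate the error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)

print(out.round(3))  # should approach [0, 1, 1, 0] after training
```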

There is huge demand for neural networks in robotics, in improving order fulfillment, in predicting the stock market, in diagnosing medical problems, and even in composing music!

What can machine learning do to automate these complex problems? Quite a lot, but the actual applications and techniques depend highly on the problem space. For example, while there technically is “a lot of data to mine,” in reality the amount of useful data for training a predictive model is quite limited. Unlike social media, where data can be harvested seemingly without limit, chip design data is available only in a fractured environment, and the technology is constantly changing. Imagine a self-driving car that had to recognize new rules or road signs every few months and was only allowed to train on portions of the road at a time. In chip design, the training of machine learning models is likely to happen in each customer environment independently at the design level, and for each foundry ecosystem at the technology-node level.
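When training data is scarce, as described above, cross-validation is one standard way to get an honest performance estimate from a small dataset. The sketch below is a generic illustration assuming scikit-learn; the synthetic dataset merely stands in for real chip-design data:

```python
# Minimal sketch: estimating model quality from limited data with
# k-fold cross-validation. The dataset here is synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))   # e.g., 60 designs, 5 features each
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=60)  # synthetic target

model = RandomForestRegressor(n_estimators=50, random_state=0)
scores = cross_val_score(model, X, y, cv=5)  # 5 train/validate splits
print(scores.mean())                         # averaged R^2 estimate
```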

Conclusion

But there is also a level of risk associated with AI, depending upon the application and the required level of precision. The design of electronic systems in the past was based upon the complete predictability of logic, much of which was hard-wired. AI replaces computational precision with distributions of acceptable behavior, and there is much discussion at conferences about what that means for design sign-off. It is not clear whether existing tools and methodologies will provide the same level of confidence. There are many promising technologies with the power to completely transform the way chips are designed. Although AI brings groundbreaking potential, it will empower designers rather than replace them.