Microsoft is witnessing a skills gap that prevents companies from adopting advanced Artificial Intelligence (AI) technologies and implementing them in real commercial applications. Speaking exclusively to TechRadar Middle East, Kevin Dallas, Corporate Vice-President for AI and Intelligent Cloud Business Development at Microsoft, said that from a technology point of view, AI is obviously the most “transformational technology for our generation”.
However, he said that there are three foundational elements needed to create powerful AI solutions: data, computing power, and advanced algorithms.
“The compute power may be sitting in the cloud, or it could be locally on your edge device. It really starts to get complex when you look deep into it. You need to be a data scientist to understand the data. You need to understand cloud computing, advanced computing, and advanced algorithms,” he said.
“It’s really important for us as an industry to make sure that we get this right in terms of delivering real benefits to our society. It starts from the top where the executives see the opportunities, and we have consumers who believe that the world is going to be a better place with AI. But in the middle, we have this capability gap in terms of delivering these new AI-infused applications and services,” he said.
The present and future of AI
AI is being used in many applications such as Amazon Alexa or, in Microsoft’s case, Cortana, Xbox, Bing, Dynamics 365 and Office 365.
“AI has become more visible to consumers through Alexa, Cortana and Siri, and now they’re starting to see the real power of artificial intelligence as a virtual assistant,” he said.
“We’re trying to build capabilities that allow Cortana to be a true personal assistant in your work environment. So, in a meeting, it will be able to listen to and record the conversation, and identify specific points that need to go out for research,” he said.
What you find with AI is that it is all about the data you can reason over, he said, adding that the more data you can reason over, the better the AI's capabilities become.
“The data is coming in and that’s educating Cortana over time. We are also using Cortana in other scenarios. We have Bing and Office feeding into it to make it more and more intelligent,” he said.
Dallas believes that AI is a broad category, and we are currently in the stage of narrow AI - AI for a specific function and a specific domain, such as Cortana and Alexa.
“What we’re talking about is the next phase [that’s] over a decade away, which is called general AI. This is where a single machine is able to operate like a human, working across different functions, tasks and domains.”
“The final layer of AI is super AI, where computers become more intelligent than humans - a point that we can’t comprehend. The industry is debating when, or even if, that would happen. We have to be careful with AI because a lot of people think of AI as being humans versus machines. The humans-versus-machines scenario that people think about is the super AI scenario,” he said.
“We have to be careful in terms of how we design, develop and use AI. We have to make sure that we build AI designed to be ethical, with privacy, transparency and security in mind, and zero bias. It is also the responsibility of governments to provide what we believe is a very light-weight regulatory influence and light-weight policy influence that will guide the general industry,” he said.
“Today, we are working on mixed reality. HoloLens and mixed reality capabilities will change the way we see the world. Today our view is to see the physical world around us, but in the future, our view will be the physical world superimposed with the virtual world,” he said.
HoloLens is available today in the form of a headset, but Dallas hopes to see the form factor in a smaller device or in a set of glasses in five to ten years’ time.