Speaking on Apple’s quarterly earnings call, CEO Tim Cook covered a variety of subjects, and one area he highlighted for investors was artificial intelligence. AI has fast become one of the top buzzwords for most big tech companies, including Facebook, Google, and Microsoft, and it seems Apple is no exception.
While AI is often viewed as an esoteric subject, its applications are found in everyday interactions. Cook highlighted this by pointing to HomeKit, CarPlay, and the Health app, explaining how these experiences become smarter and more intuitive through the use of advanced AI.
‘We have focused our AI efforts on the features that best enhance the customer experience,’ Cook explained. ‘For example, machine learning enables Siri to understand words as well as the intent behind them. That means Siri does a better job understanding and even predicting what you want, then delivering the right responses to requests.’
At WWDC 2016 Apple discussed ways to improve this further, one of which is opening up Siri to third-party developers, something Cook highlighted on the call.
‘To make Siri an even smarter assistant, we’re opening the service to developers, and this fall Siri will be available across our entire product line,’ he said.
‘We’re also using machine learning in many other ways across our products and services, including recommending songs, apps, and news,’ Cook added.
What does this mean? Machine learning is a way of building software that learns from examples rather than following explicitly programmed rules, so it can work out appropriate responses to new inputs much as people do, and this has applications in a wide range of situations.
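To make that idea concrete, here is a minimal, purely illustrative sketch of machine learning in Python: a simple perceptron that learns the logical AND rule from labeled examples instead of being told the rule directly. This is a textbook toy example, not a representation of how Siri or any Apple system actually works.

```python
# A minimal sketch of machine learning: a perceptron learns a rule
# from labeled examples by repeatedly predicting and correcting itself.
# Illustrative only; not how Siri or Apple's systems are implemented.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights from (inputs, label) pairs by trial and error."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            # Predict, then nudge the weights toward the correct answer.
            prediction = 1 if (weights[0] * x1 + weights[1] * x2 + bias) > 0 else 0
            error = label - prediction
            weights[0] += lr * error * x1
            weights[1] += lr * error * x2
            bias += lr * error
    return weights, bias

def predict(weights, bias, x1, x2):
    return 1 if (weights[0] * x1 + weights[1] * x2 + bias) > 0 else 0

# Training data for logical AND: output 1 only when both inputs are 1.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train_perceptron(examples)
print([predict(weights, bias, x1, x2) for (x1, x2), _ in examples])  # [0, 0, 0, 1]
```

The program is never told what AND means; it infers the rule from the four labeled examples. Real-world systems like the ones Cook describes apply the same principle at vastly larger scale, learning from millions of examples of speech, songs, or app usage.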