I typically see four or five papers per day at arxiv-sanity.com reporting advances in the state of the art in machine learning.
The Neural Engine in Apple's A12 Bionic chip performs 5 trillion machine learning operations per second, compared with 600 billion ML operations per second for last year's A11 Bionic. That is more than an eightfold increase year over year.
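The year-over-year ratio implied by those two figures is easy to check directly. A minimal sketch, using only the numbers quoted above:

```python
# Year-over-year speedup of Apple's Neural Engine, from the figures above.
a12_ops_per_sec = 5_000_000_000_000  # A12 Bionic: 5 trillion ML ops/s
a11_ops_per_sec = 600_000_000_000    # A11 Bionic: 600 billion ML ops/s

speedup = a12_ops_per_sec / a11_ops_per_sec
print(f"{speedup:.2f}x")  # roughly 8.33x, i.e. more than eightfold
```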
The ML chip announced by Arm Holdings will do 5 trillion ML operations per second in phones that may begin shipping by the middle of 2019, and almost certainly by the end of 2019.
If the advanced algorithms reported in papers published today are integrated into the digital agents in our phones a year from now, I expect a qualitatively higher level of intelligence by the end of 2019.
Probably still not human-level, but distinctly better at conversing in context than today's agents.
I am looking forward to getting a new phone sometime in the second half of 2019.