Linguistically Fluent – Kurzweil and Schank

Reaction to Kurzweil’s Linguistically Fluent posting…

Kurzweil Linguistically Fluent: http://www.kurzweilai.net/ray-kurzweil-reveals-plans-for-linguistically-fluent-google-software

Kurzweil’s plan is a good one from a Google Deep Learning perspective…

The plan is an extension of the capabilities of seq2seq – sequence-to-sequence – models, which are good for DL translation between languages, for example.
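
To make the building block concrete, here is a minimal sketch of the seq2seq encoder-decoder idea in Python, assuming PyTorch is available; the class name, vocabulary sizes, and dimensions are illustrative, not Google’s actual system:

import torch
import torch.nn as nn

# Minimal encoder-decoder in the seq2seq style: compress the source sequence
# into a state vector, then generate the target sequence conditioned on it.
class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=1000, tgt_vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        _, state = self.encoder(self.src_emb(src_ids))           # encode source
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)  # decode target
        return self.out(dec_out)  # per-step scores over the target vocabulary

model = Seq2Seq()
src = torch.randint(0, 1000, (2, 7))  # batch of 2 source sentences, 7 tokens each
tgt = torch.randint(0, 1000, (2, 5))  # corresponding 5-token target prefixes
print(model(src, tgt).shape)          # torch.Size([2, 5, 1000])

The same encode-then-decode pattern is what generalizes beyond translation – the encoder input need not be text.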

However, no one has solved Reasoning, which is required for fluent conversation.

Episodic Memory is required for solving reasoning.

(Anyone who doubts this should read Schank’s Dynamic Memory; Scripts, Plans, Goals, and Understanding; Inside Computer Understanding; and Conceptual Information Processing – find them and check them out from your local library.)
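
To make the Schank connection concrete, here is a toy sketch in Python – my own illustration, not code from any of those books – of episodic memory as cue-indexed episodes, where retrieval by shared cues plays the role of Schank’s “reminding”:

from dataclasses import dataclass, field

@dataclass
class Episode:
    description: str
    cues: set = field(default_factory=set)  # actors, actions, settings, outcomes

class EpisodicMemory:
    def __init__(self):
        self.episodes = []

    def store(self, episode):
        self.episodes.append(episode)

    def remind(self, cues):
        # Rank stored episodes by cue overlap with the current situation --
        # a crude stand-in for the reminding that drives Schankian reasoning.
        scored = [(len(e.cues & cues), e) for e in self.episodes]
        return [e for score, e in sorted(scored, key=lambda p: -p[0]) if score > 0]

memory = EpisodicMemory()
memory.store(Episode("waiter brought the wrong dish", {"restaurant", "waiter", "error"}))
memory.store(Episode("checked bag lost at the airport", {"airport", "luggage", "error"}))
for e in memory.remind({"restaurant", "error"}):
    print(e.description)  # the restaurant episode ranks first

Reasoning over a new situation then becomes retrieval plus adaptation of the best-matching past episodes.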

Video Understanding is required for solving episodic memory.

Video understanding can be solved with DL like seq2seq – since it decomposes into several sub-problems, including speech recognition, image recognition, language translation, and some of the capabilities in driverless cars (learning from watching – seeing and doing – a shallow type of simulation, or mirror-neuron capability).
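
A minimal sketch of that decomposition, with hypothetical stub functions standing in for trained models at each stage:

# Video understanding as a composition of sub-problems, per the paragraph above.
# Every function here is a hypothetical stub; a real system would plug in a
# trained DL model (speech, vision, action) at each stage.

def recognize_speech(audio):
    return "put the cup on the table"  # speech recognition sub-problem

def recognize_objects(frames):
    return ["hand", "cup", "table"]    # image recognition sub-problem

def describe_action(frames, objects):
    # The seeing-and-doing step: map observed motion onto a known action schema.
    return {"action": "put", "object": "cup", "target": "table"}

def understand_video(frames, audio):
    objects = recognize_objects(frames)
    event = describe_action(frames, objects)
    event["utterance"] = recognize_speech(audio)
    return event  # a structured event, ready to store as an episode

print(understand_video(frames=[], audio=b""))

The output of such a pipeline – structured events with actors, actions, and context – is exactly what an episodic memory would store.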

So we are getting close to what Kurzweil wants to do – certainly possible by 2025.

Possible timeline to solution of linguistically fluent AI for the enterprise = digital worker/cognitive collaborator…

2018-2020 video understanding solved for simple world-task model actions + context
2019-2021 video understanding solved for simple social model interactions + context
2020-2022 episodic memory solved
2021-2023 reasoning solved
2022-2024 partial fluent conversation solved
2023-2025 learning from watching (and doing) solved
2024-2026 learning from reading solved
2025-2027 full linguistically fluent = digital worker/cognitive collaborator solved

2035-2037 $1000 per year for a digital worker achieved for the 2017 top 1000 occupations

2055-2057 $10 per year for a digital worker achieved for the 2027 top 1000 occupations
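
Taken at face value, those two price points imply a steady cost decline; a quick back-of-the-envelope check in Python (my arithmetic, not a figure from any source):

# Implied annual cost decline if a digital worker falls from $1000/yr (~2036)
# to $10/yr (~2056): a 100x drop over 20 years.
annual_factor = (10 / 1000) ** (1 / 20)  # cost is multiplied by this each year
print(f"cost x{annual_factor:.3f} per year, ~{1 - annual_factor:.0%} annual decline")
# -> cost x0.794 per year, ~21% annual decline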

Of course, it will not actually evolve this way, but this is a linear-thinking projection to provoke alternative scenarios.

Each of the above solutions is likely to spawn an open-source code community on GitHub with an associated open data set – open AI code + data + models + stacks.

References:
Kurzweil Linguistically Fluent: http://www.kurzweilai.net/ray-kurzweil-reveals-plans-for-linguistically-fluent-google-software

Quoc V. Le (Google) Introduction to Seq2Seq: https://www.youtube.com/watch?v=G5RY_SUJih4

Schank Dynamic Memory: https://www.amazon.com/Dynamic-Memory-Reminding-Learning-Computers/dp/0521270294

Schank Scripts Plans Goals and Understanding: https://www.amazon.com/Scripts-Plans-Goals-Understanding-Intelligence/dp/0898591384/

Schank Inside Computer Understanding: https://www.amazon.com/Inside-Computer-Understanding-Miniatures-Intelligence/dp/B00ZT1PB5Y/

Schank Conceptual Information Processing: https://www.amazon.com/Conceptual-Information-Processing-Roger-Schank/dp/1483252981