It will come as no surprise, but as companies rely more and more on AI to provide new services and boost productivity, they are building better and better tools for making AIs. These tools are getting easier to use, which means engineers need to understand less and less about how to build AIs.
In a Medium post, Ryszard Szopa makes two related points: (1) your knowledge of how to build custom AIs is becoming less relevant; and (2) no one will care, because data matters more than algorithms and the leading AI companies already have all the data:
In early 2018 the task from above [breast cancer detection!] wasn’t suitable for an intern’s first project, due to lack of complexity. Thanks to Keras (a framework on top of TensorFlow) you could do it in just a few lines of Python code, and it required no deep understanding of what you were doing.
What was still a bit of a pain was hyperparameter tuning. If you have a Deep Learning model, you can manipulate multiple knobs like the number and size of layers, etc. How to get to the optimal configuration is not trivial, and some intuitive algorithms (like grid search) don’t perform well. You ended up running a lot of experiments, and it felt more like an art than a science.
As I am writing these words (beginning of 2019), Google and Amazon offer services for automatic model tuning (Cloud AutoML, SageMaker), and Microsoft is planning to do the same. I predict that manual tuning is going the way of the dodo, and good riddance. I hope that you see the pattern here.
Your AI skills are worth less than you think
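Szopa's complaint about grid search is easy to make concrete: the number of runs is the product of the options per knob, so it grows exponentially with the number of knobs. A stdlib-only sketch (the knob names and the toy `evaluate` score are invented for illustration; a real score would come from training a model per configuration):

```python
from itertools import product

# Hypothetical knobs for a small network: depth, layer size, learning rate.
grid = {
    "layers": [1, 2, 3],
    "units": [32, 64, 128],
    "learning_rate": [0.1, 0.01, 0.001],
}

def evaluate(config):
    # Stand-in for an expensive training run; in reality this would train a
    # model with `config` and return its validation score.
    return (-abs(config["learning_rate"] - 0.01)
            - abs(config["units"] - 64) / 100
            - abs(config["layers"] - 2))

# Grid search: try every combination of every knob.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
best = max(configs, key=evaluate)

print(len(configs))  # 27 runs for just three knobs with three values each
print(best)          # {'layers': 2, 'units': 64, 'learning_rate': 0.01}
```

Add a fourth knob with three values and you are at 81 runs; each run is a full training job, which is why automated tuning services are so welcome.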
Yes, the pattern is abstraction, and it is wonderful. It lets a programmer build a web server with a single line of code, or do many other amazing things, without understanding the underlying nuts and bolts. Abstraction has its problems, but mostly it is to be embraced. Building AIs is no different.
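The one-line web server is barely an exaggeration: Python's standard library ships one (`python -m http.server`). A sketch that creates one, runs it in a background thread, and checks that it actually responds (the port is chosen by the OS; nothing here is specific to any particular framework):

```python
import http.server
import threading
import urllib.request

# The "one line": a server for the current directory, no framework required.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)

# Run it in the background so we can issue a request against it.
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
status = urllib.request.urlopen(url).status
print(status)  # 200
server.shutdown()
```

The programmer never touches sockets, HTTP parsing, or threading internals; that is the abstraction doing its job.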
The issue of too few companies having too much data is another problem entirely, one that will perhaps be dealt with by competition law, because more data beats better design every time.