Wired has an interesting article about what happens when miniaturization pushes the boundaries of what is possible to achieve with digital computing.
Intel’s current state-of-the-art chipmaking process will soon shrink to 14 nanometers, aka 14 billionths of a meter. When transistors get that small, it becomes devilishly hard to keep them operating in the precise on-or-off states required for digital computing. That’s one of the reasons why today’s chips burn so hot.
The answer: go analog. The problem is split in two. Parts of the software that still require digital precision, such as calculating account balances, are handled by error-free binary hardware. Other parts, such as check-scanning software, are handed off to neural processing accelerators, which handle the more fault-tolerant aspects of the problem with greater efficiency than a conventional processor.
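The split described above can be sketched in a few lines. This is a hypothetical illustration, not anything from the Wired article: `scan_check` stands in for a neural accelerator (here just a noisy averaging estimator), while `post_deposit` represents the precision-critical path, using exact decimal arithmetic because an account balance tolerates no rounding drift.

```python
from decimal import Decimal

def scan_check(pixel_row):
    """Fault-tolerant step: an approximate amount is acceptable here,
    since a later exact check (or a human) validates it."""
    # Stand-in for a neural accelerator: average noisy sensor readings.
    return round(sum(pixel_row) / len(pixel_row), 2)

def post_deposit(balance, amount):
    """Precision-critical step: ledger math demands exact arithmetic,
    so it stays on the error-free 'digital' path."""
    return balance + Decimal(str(amount))

readings = [100.02, 99.97, 100.01]   # noisy sensor data for one check
amount = scan_check(readings)        # approximate result is fine here
balance = post_deposit(Decimal("250.00"), amount)  # exactness is mandatory here
```

The design point is the partition itself: each subsystem gets only the accuracy guarantees its part of the problem actually needs, which is what lets the fault-tolerant half run on cheaper, cooler analog-style hardware.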