Nvidia already produces a 'brain pod' in the form of the Jetson AGX Xavier, with 512 CUDA cores and an 8-core ARM CPU, for $1299: https://developer.nvidia.com/embedded/buy/jetson-agx-xavier-devkit But I wonder if you can get granular access to the GPU? Like dividing it into four discrete 128-CUDA-core partitions? Each sub-core so configured could run a simple but robust machine vision/hearing/speech system. An open-source intelligence and auditing pod, but also a pick-and-place brain. A kind of smart binoculars? A kind of robotic head that automates counting and reporting: hearing, speech, sight. Perform some kind of task when you are X distance from this QR code.
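
That last bit, acting when you're X distance from a QR code, is easy to prototype even without the Jetson. Here's a rough Python/OpenCV sketch; the tag size, focal length, trigger distance, and the perform_task stub are made-up values you'd calibrate and fill in for your own camera and use case:

```python
# Rough sketch: trigger an action when the camera is within some distance
# of a known QR code, using OpenCV's QR detector and a pinhole-camera
# distance estimate. Constants below are assumptions, not measured values.
import cv2
import numpy as np

QR_SIDE_M = 0.10       # physical side length of the printed QR code (assumed)
FOCAL_PX = 1000.0      # camera focal length in pixels (assumed; calibrate)
TRIGGER_DIST_M = 1.5   # the "X distance" at which to act (assumed)

def estimate_distance(points: np.ndarray) -> float:
    """Estimate distance from the apparent QR side length (pinhole model)."""
    corners = points.reshape(-1, 2)
    sides = [np.linalg.norm(corners[i] - corners[(i + 1) % 4]) for i in range(4)]
    side_px = float(np.mean(sides))
    return QR_SIDE_M * FOCAL_PX / side_px

def perform_task(payload: str, distance: float) -> None:
    # Placeholder for whatever the pod should actually do at that range.
    print(f"Within {TRIGGER_DIST_M} m of QR '{payload}' (~{distance:.2f} m), acting.")

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    payload, points, _ = detector.detectAndDecode(frame)
    if payload and points is not None:
        d = estimate_distance(points)
        if d <= TRIGGER_DIST_M:
            perform_task(payload, d)
cap.release()
```

On the Xavier you'd presumably hang the heavier vision/hearing/speech models off the same loop, however the GPU ends up being shared.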