They could, but Congress would have to get involved. An H-1B visa increase is likely to be heavily unpopular with voters. This American voter, for one, would not be thrilled.
you would think so, but the American government has a consistent track record of sacrificing things that make economic sense on the altar of racism, and with Trump in the White House they are having a "say the quiet parts out loud" extravaganza.
it's a nation whose current immigration policies are largely driven by racism. H-1B visas tend to go mostly to people from countries the current administration considers undesirable, so the chances of it increasing the allocated number are extremely low.
Sometimes that's exactly what one needs. As long as there's no forced schedule for the work and you can do it when you want, at the pace you want, it can feel good.
I have two beginner (and probably very dumb) questions: why do they have heavy C++/CUDA usage rather than using only PyTorch/TensorFlow? Are those too slow for training Leela? Second, why is there TensorFlow code?
That's Leela Zero (which plays Go instead of chess). It was good for its time (~2018), but it's quite outdated now. It also uses OpenCL instead of CUDA. I wrote a lot of that code, including the Winograd convolution routines.
Leela Chess Zero (https://github.com/LeelaChessZero/lc0) has a much more optimized CUDA backend targeting modern GPU architectures, and it's written by much more knowledgeable people than me. That would be a much better source to learn from.
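For anyone curious what "Winograd convolution" buys you: it trades multiplications for additions. Here is a minimal sketch of the 1-D F(2,3) case (the actual Leela Zero code uses the tiled 2-D F(2x2, 3x3) variant in OpenCL; the names below are illustrative, not from that codebase):

```python
def winograd_f23(d, g):
    """Compute 2 outputs of a 3-tap filter over 4 inputs
    using 4 multiplies instead of the naive 6."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # Filter transform -- in a real implementation this is
    # precomputed once per filter and reused for every tile.
    G0 = g0
    G1 = (g0 + g1 + g2) / 2
    G2 = (g0 - g1 + g2) / 2
    G3 = g2
    # Elementwise multiplies: the expensive part, only 4 of them.
    m1 = (d0 - d2) * G0
    m2 = (d1 + d2) * G1
    m3 = (d2 - d1) * G2
    m4 = (d1 - d3) * G3
    # Inverse transform back to the 2 output samples.
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct(d, g):
    """Naive sliding-window version for comparison (6 multiplies)."""
    return [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]

print(winograd_f23([1, 2, 3, 4], [1, 0, -1]))
print(direct([1, 2, 3, 4], [1, 0, -1]))  # same result
```

The multiply savings grow with tile size, which is why it was worth hand-writing for the 3x3-heavy ResNet blocks these engines use.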
As I remember, the CUDA code was about 3x faster than the TensorFlow code. The TensorFlow path is there for non-Nvidia GPUs. This was in the GTX 1080 / 2080 era; no idea about now.
I agree that it is doing a LOT of work, but I believe OP and many others will compare it to other languages and notice that the Rust compiler is a LOT slower.