It's all about the right mindset at the very top. At the beginning of the PC era, nobody would have bet on IBM losing. Same at the dawn of the internet: all the money was on MS. And so it went for Nokia and Ericsson.
Google is a giant without a direction. The ads money is so good that it just doesn't have the guts to leave it on the table.
She's done pretty important work, but since then she's been obsessed with the vague term `spatial intelligence`. What does it mean? There isn't a clear definition in the piece. It seems very intuitive & fundamental, but tbh it's not *rigorous*, nor insightful.
It's rare for one person to achieve many things. Her ImageNet was certainly HUGE. But she is a researcher, and I think the true power of researchers is to persist. I also often think that researchers are too absorbed in their topics. But that is just their purpose.
It could be a dead end for sure. I just hope that someone figures out the `spatial` part for AIs and brings us closer to better ones.
I have only one problem with S1: the creators just don't have the guts to kill Mother or Father. Don't get me wrong, I like them, but it could have made the whole series more distant and cold.
I was about to complain about the use of strings in both libraries, both for the lack of type safety and for the possible runtime allocation, but then I looked at the assembly for the sml example, and there are no strings in the binary other than the obvious "send" one.
What exactly happened there? It looks like make_transition_table() is doing some serious magic. Or are the state transitions evaluated at compile time, given that there is no input in the example, so the transition table gets compiled out?
Anyway, I think it would help OP's library to have some assembly output in the readme as well.
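For anyone curious, this is roughly the kind of sml example I mean (a minimal sketch from memory, not necessarily the exact snippet in the repo):

```cpp
// Minimal Boost.SML-style machine; a sketch, not the repo's exact example.
#include <boost/sml.hpp>

namespace sml = boost::sml;

struct send {};  // the event type

struct sender {
  auto operator()() const {
    using namespace sml;
    // "idle"_s and "sending"_s build states from string literals;
    // the whole transition table is assembled at compile time.
    return make_transition_table(
        *"idle"_s + event<send> = "sending"_s
    );
  }
};

int main() {
  sml::sm<sender> sm;
  sm.process_event(send{});
}
```

Compiling something like that with -O2 and inspecting the output is what I did; showing that output in the readme would make the zero-overhead claim concrete.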
Looks like SML is using user-defined literals [0, 1] to effectively pre-process the string literal into state/event objects. The string itself is turned into a template parameter in the process, and I believe those shouldn't show up in the compiled code (unless there's some mangling-related thing going on?).
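If it helps, the mechanism is roughly this (a sketch of my understanding, not SML's actual code; the string form of the literal operator template is a GNU extension supported by GCC and Clang):

```cpp
#include <type_traits>

// Each distinct string literal becomes a distinct *type*: the characters
// are non-type template parameters, so no string data has to end up
// in the compiled binary.
template <char... Cs>
struct state {};

// String literal operator template; a GNU extension (GCC/Clang),
// presumably what SML's ""_s is built on.
template <typename CharT, CharT... Cs>
constexpr auto operator""_s() {
  return state<Cs...>{};
}

// Same literal -> same type; different literal -> different type.
static_assert(std::is_same_v<decltype("idle"_s), decltype("idle"_s)>);
static_assert(!std::is_same_v<decltype("idle"_s), decltype("send"_s)>);

int main() {}
```

Even in the mangled symbol names the characters show up as numeric template arguments rather than as a literal string, which would explain why nothing appears in the binary.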
It feels like Google is in panic mode; the only thing it can think of is to put a chatbot everywhere, just because it can. I don't see a value proposition at all.
My understanding is that every manager at Google has had one of their quarterly goals be to integrate genAI into their team's product (regardless of whether it makes sense to) for the past several years already, so you're not wrong.
Interesting formulation! It captures the intuition of "smartness" in solving a problem. However, what about asking good questions or proposing conjectures?