

It would still need a capacitor or something, or it would flicker at 120Hz (in the US - full-wave rectification doubles the 60Hz mains frequency), but I’d hope that’s not much added cost.
Also, those four would flicker more than the rest of the string.
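For scale, a rough back-of-the-envelope on sizing that capacitor (the 20mA string current and 100µF value are assumptions for illustration, not numbers from this thread):

    # Ripple on a cap smoothing a full-wave rectified supply:
    # delta_V = I / (f * C), with f the 120Hz ripple frequency.
    I = 0.020    # assumed LED string current, amps
    f = 120      # full-wave rectified 60Hz US mains, Hz
    C = 100e-6   # assumed smoothing capacitor, farads
    print(I / (f * C))  # ~1.7V of ripple at these values

So a small cap only shrinks the flicker; it has to be sized against the string current to actually kill it. Still a cheap part either way.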


There’s [serious] - it started on Reddit (I think), but it doesn’t need to stay there.


These emails were released before the vote on ‘the rest of it’.


They are still used where they’re not needed, and they’re usually way too bright. More than once I have used precisely placed electrical tape to reduce them to a pinhole or slit.


When I saw the headline I thought it was hardly new, being from 2012, but they covered and explained that well. I’m glad progress is accelerating.


Where are the example output pictures?


But in this case it’s first-party, and they still had to make an exception.


Awesome. My only critique is that microwave ovens actually work really well in their niche. I can’t say the same for LLMs.


Some people call them Maurice.


Of course vulnerabilities exist. And creating a major one like this for an LLM would likely lead to it destroying things the way a toddler would (in fact this has already happened to a company run by idiots).
But what it didn’t do was copy-with-changes, as would be required to ‘evolve’ like a virus, because training these models requires intense resources and isn’t just a terminal command.


Why would someone direct the output of an LLM to a terminal on its own machine like that? That just sounds like an invitation to an ordinary disaster, given all the ‘rm -rf’ content on the Internet (aka training data). That still wouldn’t be access to a second machine, though, and even if it could make a copy, it would be either an exact copy or an incomplete (broken) one. There’s no reasonable way it could ‘mutate’ and still work using terminal commands.
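To make that concrete, here’s a sketch of the setup being criticized (the generate callable is hypothetical, standing in for whatever model is wired up - this is the anti-pattern, not a recommendation):

    import subprocess

    def act_on_model_output(generate):
        # 'generate' is any callable that returns LLM text (hypothetical).
        command = generate("free up some disk space")
        # Nothing here inspects or sandboxes the text before running it,
        # so any 'rm -rf' pattern absorbed from training data runs too:
        subprocess.run(command, shell=True)
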
And being a meme requires minds. There were no humans or other minds in my analogy, nor in your question.


If you know that it’s fancy autocomplete, then why do you think it could “copy itself”?
The output of an LLM is a different thing from the model itself. The output is a stream of tokens. It doesn’t have access to the file system it runs on, certainly not the LLM’s own compiled binaries (much less its source code), and it doesn’t have access to the LLM’s weights either. (Of course it would hallucinate that it does if asked.)
This is like worrying that the music coming from a player piano might copy itself to another piano.
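A toy “fancy autocomplete” makes the separation concrete (the dict below stands in for real weights; everything here is illustrative, not any real inference API):

    import random

    # The "model": a bigram lookup table standing in for weights.
    WEIGHTS = {
        "player": ["piano"],
        "piano": ["music", "rolls"],
        "music": ["plays"],
    }

    def generate(prompt, n=5):
        tokens = prompt.split()
        for _ in range(n):
            options = WEIGHTS.get(tokens[-1])
            if not options:
                break
            tokens.append(random.choice(options))
        return " ".join(tokens)  # output: only ever a string of tokens

    print(generate("player"))  # e.g. "player piano music plays"

The loop can only ever emit text; there’s no handle in it to the table’s storage, let alone the file system or another machine - same as the piano roll.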


The Dreaming


Were those other urban areas specifically parking lots/garages? (The places where charging stations tend to be.)


That does work (actually ‘non emergency city state’). But as another comment mentions, the public knowing it exists is more important than the number itself.


I think the non-emergency number should be heavily advertised. I have no idea what the local one for me is (if it even exists).


And an LLM determining that accurately would be a dice roll.


Summaries that look good are something LLMs can do, but not summaries that actually have a higher ratio of important to unimportant information than the source, nor ones that keep things accurate. That last one is super mandatory on something like an encyclopedia.


I will never buy a phone directly through a carrier instead of the OEM. They are offering me some nice discounts right now, but I have no interest in a phone where I can’t unlock the bootloader. (Or remove the carrier lock!)