a picture of his dick is being widely circulated across the internet.
Y’know what, shut it down. That is a Langford’s Basilisk situation.
Insisting that someone could figure it out does not mean anyone has.
Twenty gigabytes of linear algebra is a whole fucking lot of stuff going on. Creating it by letting the computer train is orders of magnitude easier than picking it apart to say how it works. Sure - you can track individual instructions, all umpteen billion of them. Sure - you can describe broad sections of observed behavior. But if any programmer today tried recreating that functionality, from scratch, they would fail.
Absolutely nobody has looked at an LLM, gone ‘ah-ha, so that’s it,’ and banged out a neat little C alternative. Lack of demand cannot be why.
Knowing it exists doesn’t mean you’ll ever find it.
Meanwhile: we can come pretty close, immediately, using data alone. Listing all the math a program performs doesn’t mean you know what it’s doing. Decompiling human-authored programs is hard enough. Putting words to the algorithms wrenched out by backpropagation is a research project unto itself.
… yes? This has been known since the beginning. Is it news because someone finally convinced Sam Altman?
Neural networks are universal estimators. “The estimate is wrong sometimes!” is… what estimates are. The chatbot is not an oracle. It’s still bizarrely flexible, for a next-word-guesser, and it’s right often enough for these fuckups to become a problem.
What bugs me are the people going ‘see, it’s not reasoning.’ As if reasoning means you’re never wrong. Humans never misremember, or confidently espouse total nonsense. And we definitely understand brain chemistry and neural networks well enough to say none of these bajillion recurrent operations constitute the process of thinking.
Consciousness can only be explained in terms of unconscious events. Nothing else would be an explanation. So there is some sequence of operations which constitutes a thought. Computer science lets people do math with marbles, or in trinary, or on paper, so it doesn’t matter how exactly that work gets done.
Though it’s probably not happening here. LLMs are the wrong approach.
My guy, Microsoft Encarta 97 doesn’t have senses either, and its recollection of the capital of Austria is neither coincidence nor hallucination.
While technically correct, there is a steep hand-wave gradient between “just” and “near-impossible.” Neural networks can presumably turn an accelerometer into a damn good position tracker. You can try filtering and double-integrating that data, using human code. Many humans have. Most wind up disappointed. None of our clever theories compete with beating the machine until it makes better guesses.
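Here’s a minimal sketch of why the clever-theory route disappoints, with made-up sensor numbers (the bias and noise figures are illustrative, not any real device’s spec): double integration turns even a tiny constant accelerometer bias into quadratic position drift.

```python
import numpy as np

# Dead-reckoning position from accelerometer samples, the "human code" way.
# The device is sitting perfectly still; only the sensor imperfections move.
rng = np.random.default_rng(0)

dt = 0.01                             # 100 Hz sample rate (assumed)
t = np.arange(0, 60, dt)              # one minute of samples
true_accel = np.zeros_like(t)         # actual motion: none

bias = 0.02                           # m/s^2, hypothetical uncalibrated offset
noise = rng.normal(0, 0.05, t.shape)  # hypothetical white sensor noise
measured = true_accel + bias + noise

velocity = np.cumsum(measured) * dt   # first integration: bias grows linearly
position = np.cumsum(velocity) * dt   # second integration: drift grows quadratically

print(f"drift after 60 s: {position[-1]:.1f} m")  # tens of meters, from a device going nowhere
```

Thirty-odd meters of drift in one minute, from hardware lying motionless on a table. Filtering and sensor fusion claw some of that back, but that is the hole the hand-written approach starts in.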
It’s like, ‘as soon as humans can photosynthesize, the food industry is cooked.’
If we knew what neural networks were doing, we wouldn’t need them.
Ooh, fair point. We don’t know that any of these options boot.
This defines conservatism. Ingroup loyalty is the only force in their moral universe.
It’s all they think you’re doing. It’s all they think there is.
I cannot fathom having my shit together to such a degree that my bootloader has a theme.
Laundered valor.
Shatter.