

But the company has already invested so much into the CEO, they can’t just let him go because he doesn’t understand the sunk cost fallacy! /s
They say money can’t buy happiness, but it can buy apples, and that cat looks pretty happy to me.
I don’t think having well-defined precision is a rare requirement, it’s more that most devs don’t understand (or care about) the pitfalls of inaccuracy, because they usually aren’t obvious. Also, languages like JavaScript/PHP make it hard to do things the right way. When I was working on an old PHP codebase, I ran into a popular currency library (Zend_Currency) that used floats for handling money, which I’m sure works fine right up until the accountants call asking why they can’t balance the books. The “right way” was to use the bcmath extension, which was a huge pain.
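(The pitfall is easy to reproduce in any language with binary floats; here’s a minimal Python sketch, with Python’s `decimal` standing in for the role bcmath plays in PHP.)

```python
from decimal import Decimal

# Binary floats can't represent most decimal fractions exactly,
# so tiny errors creep in and eventually the books don't balance.
print(0.10 + 0.20 == 0.30)   # False (0.30000000000000004)

# Exact decimal arithmetic (or integer cents) avoids the drift.
print(Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))   # True
```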
I work at big tech (not MS) and yes, the comp package really is that good, though not as good as it used to be. I immediately doubled my total comp when I came here from my last job, and now it’s ~5x. I could retire right now if I wanted, so I don’t care about layoffs anymore.
Cuelang: https://cuelang.org/docs/reference/spec/#numeric-values
Implementation restriction: although numeric values have arbitrary precision in the language, implementations may implement them using an internal representation with limited precision. That said, every implementation must:
- Represent integer values with at least 256 bits.
- Represent floating-point values with a mantissa of at least 256 bits and a signed binary exponent of at least 16 bits.
- Give an error if unable to represent an integer value precisely.
- Give an error if unable to represent a floating-point value due to overflow.
- Round to the nearest representable value if unable to represent a floating-point value due to limits on precision.

These requirements apply to the result of any expression except for builtin functions, for which an unusual loss of precision must be explicitly documented.
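To illustrate what the “give an error” requirement protects against, here’s a rough Python sketch of the silent rounding you get without it (Python ints happen to be arbitrary precision; a 64-bit double is not — this is just an analogy, not CUE itself):

```python
big = 2**64 + 1          # wider than a 64-bit double's 53-bit mantissa

print(big)               # 18446744073709551617 -- arbitrary-precision int keeps every digit
print(int(float(big)))   # 18446744073709551616 -- the double silently rounds the +1 away
```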
That works until you realize your calculations are all wrong due to floating point inaccuracies. YAML doesn’t require any level of precision for floats, so different parsers on a document may give you different results.
YAML doesn’t require any level of accuracy for floating point numbers, and that doc appears to have numbers large enough to run into problems for single-precision floats (maybe double too). That means different parsers could give you different results.
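You can see the single-precision case without any YAML at all; here’s a small Python sketch of what a parser that backs its floats with 32 bits would effectively do to a number like that:

```python
import struct

original = 16777217.0    # 2**24 + 1, more precision than a 32-bit float can hold

# Round-trip through IEEE 754 single precision, like a parser
# storing floats in 32 bits would.
as_single, = struct.unpack("f", struct.pack("f", original))

print(original)          # 16777217.0
print(as_single)         # 16777216.0 -- silently off by one
```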
In a sense, AI is already fucking with everyone’s brain when it comes to mass-produced ads and propaganda.
Depends on the monkey
Image is accurate, since without bugs, the food chain collapses and takes society with it, and the survivors will have to migrate to rural areas that can support a hunter-gatherer lifestyle.
You should come join the tautology club. Just remember these three rules:
My favorite thing about tautologies is how tautological they are.
Better than creating this culinary atrocity in real life.
I’ve been on the internet for over 25 years and I’ve never seen a meme community that didn’t beat memes to death. If moth memes are enough to annoy you, you’d have an aneurysm from “you’re the man now dog” memes in the early 00s.
What about AIHNPtPaNVaB (Assigned I have no plans to purchase a new vehicle at birth)?
I agree, but I’m not sure it matters when it comes to the big questions, like “what separates us from the LLMs?” Answering that basically amounts to answering “what does it mean to be human?”, which has been stumping philosophers for millennia.
It’s true that artificial neurons are significantly different from biological ones, but are biological neurons what make us human? I’d argue no. Animals have neurons, so are they human? Also, if we ever did create a brain simulation that perfectly replicated someone’s brain down to the cellular level, and that simulation behaved exactly like the original, I would characterize that as a human.
It’s also true that LLMs can’t learn, but there are plenty of people with anterograde amnesia who can’t either.
This feels similar to the debates about what separates us from other animal species. It used to be thought that humans were qualitatively different than other species by virtue of our use of tools, language, and culture. Then it was discovered that plenty of other animals use tools, have language, and something resembling a culture. These discoveries were ridiculed by many throughout the 20th century, even by scientists, because they wanted to keep believing humans are special in some qualitative way. I see the same thing happening with LLMs.
I don’t know how I work. I couldn’t tell you much about neuroscience beyond “neurons are linked together and somehow that creates thoughts”. And even when it comes to my own complex thoughts, I sometimes can’t explain how I arrived at them. At my job, I often lean on intuition I’ve developed over a decade. I can look at a system and get an immediate sense of whether it’s going to work well, but actually explaining why or why not takes a lot more time and energy. Am I an LLM?
What is that from?