I know I won't be secretly happy if they do this.
You can totally hack a plane using a buffer overflow. C airlines don't check how many tickets they sell on a single flight. Usually if you overbook a flight, they will simply reallocate some of their buffer into business class. However, if you buy a bunch of tickets to one flight at once, you can craft a scenario where you overwrite the pilot.
Yea a plane hijacking is totally like a buffer overflow.
Bleeding is also a bit like a buffer overflow, since blood goes in a place it's not supposed to. Hurricanes are another example of a buffer overflow. Accidentally wearing a shirt inside out? Buffer overflow. Unskippable ads are buffer overflow. War is buffer overflow. I had my buffer overflown by some guy claiming to be a wallet inspector. Aliens are a type of buffer overflow. I sometimes have buffer overflow with my girlfriend. Buffer overflow was an inside job. I put too much shine paste in my polishing machine and you better believe that was a buffer overflow.
When a train crashes into a station building, that's not a buffer overflow, though. That's a buffer overrun.
Standard ML the programming language or standard as in conventional and ML as in machine learning?
Fuck it, we're going back to bang paths. ficix!hetzner!awful!self please add support for this.
Now that is leaving your mark in history as an academic.
Percentages are cheating, especially percentages below 50.
I'll predict a 49% chance Mont Blanc erupts tomorrow, covering half of Europe in chocolate.
AWS is only tolerated because product managers ask for it, not because engineers like it; AWS is shit.
Yes, but the competition is hardly much better. Well, maybe Google is, I didn't touch it much back when I still did public cloud stuff. Azure leads with "look, our VPS offering is called 'Virtual Machines' instead of 'EC2', isn't that simple?" and then proceeds to make everything even clunkier and more complicated than AWS. And don't get me started on the difference in technical and customer support between the two.
There is no moat.
You keep reiterating this, but I still need you to explain the implications. Ok sure, you can run a model on a home computer. Notwithstanding that those models still amount to overhyped novelty toys, home computers are also capable of running servers, databases, APIs, office suites, you name it. Still, corporations and even consumers are renting these as SaaS and will continue to do so for the foreseeable future.
The AI fad is highly hype driven, so there's still incentive to be the one who trains the latest, biggest and shiniest model, and that still takes datacenters' worth of specialized compute and training data. LLM-based AI is an industry built on FOMO. How long until that shiny new LLM torrent you got from 4chan is so last season?
And the OP is correct. Llama is not open source. "The neighbors" only took it from Meta in the same sense warez sites have taken software forever. Only in this case the developer was the one committing the copyright infringement.
Creation Date: 2024-09-09T14:09:21Z
Nice one
The only thing I've seen from Lex Fridman was his interview of Brian Kernighan. For most of it I just thought it was very kind of BWK to patiently indulge this kid, who was clearly still new and unaccustomed to public speaking or researching his interview subjects, despite the weirdly professional gear setup and production.
Yea, I'm glad a nuclear plant is being restored but it sucks that it's because of fucking plagi-o-matic.