abruptly8951

joined 2 years ago
[–] abruptly8951@lemmy.world 2 points 9 hours ago

Yup, that's what I was alluding to. While it may no longer be the case for transistors, they did manage to take 50-odd years to get there - so push that trend line from the figure out 50 years, heh (not saying you should; 5 seems much more conservative).

Take a look at Nvidia's pace w.r.t. Moore's law (of FLOPS): https://netrouting.com/nvidia-surpassing-moores-law-gpu-innovation/
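
Just to put a number on what extrapolating a trend line that far out implies, here's a throwaway sanity check; the 18-month doubling period is the classic Moore's-law figure, an assumption here rather than something taken from the linked article:

```python
# Rough sanity check: what does pushing a doubling trend out 50 years imply?
# The 18-month doubling period is the classic Moore's-law figure (an assumption
# here, not a number from the linked article).
doubling_period_years = 1.5
years = 50

improvement = 2 ** (years / doubling_period_years)
print(f"~{improvement:.2e}x over {years} years")  # roughly 1e10x
```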

[–] abruptly8951@lemmy.world 1 points 9 hours ago (2 children)

Or like looking at the early days of semiconductors and extrapolating that CPU speed will double every 18 months... smh, these people.

[–] abruptly8951@lemmy.world 1 points 2 weeks ago

They were invented *by* 9000 BC :)

[–] abruptly8951@lemmy.world 1 points 3 weeks ago (2 children)

Can you go into a bit more detail on why you think these papers are such a home run for your point?

  1. Where do you get 95% from? These papers don't really go into much detail on human performance, and 95% isn't mentioned in either of them.

  2. These papers are for transformer architectures trained with a next-token loss. There are other architectures (spiking, Tsetlin, graph, etc.) and other losses (contrastive, RL, flow matching) to which these particular curves do not apply.

  3. These papers assume early stopping; have you heard of the grokking phenomenon? (Not to be confused with the Twitter bot.)

  4. These papers only consider finite-size datasets, and relatively small ones at that. For example, how many "tokens" would a 4-year-old have processed? I imagine that question is somewhat quantifiable (a rough back-of-envelope sketch follows this list).

  5. These papers do not consider multimodal systems.

  6. You talked about permanence; does a RAG solution not overcome this problem?
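
Here's the sort of rough back-of-envelope I mean for point 4; every rate in it is my own guess for illustration, not a figure from the papers:

```python
# Rough back-of-envelope: how many "tokens" might a 4-year-old have processed?
# All rates below are assumptions for illustration, not figures from the papers.
words_heard_per_day = 15_000   # assumed average spoken words heard per day
tokens_per_word = 1.3          # assumed BPE-style tokenizer ratio
days = 4 * 365

tokens = words_heard_per_day * tokens_per_word * days
print(f"~{tokens / 1e6:.0f}M language tokens")  # roughly 28M tokens
```

And that ignores the visual/audio stream entirely, which is kind of the point of item 5.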

I think there is a lot more we don't know about these things than we do know. To say we solved it all 2-5 years ago is, perhaps, optimistic.

[–] abruptly8951@lemmy.world 3 points 1 month ago

Unfortunately not; here is a little kitchen-sink-type demo though: https://myst-nb.readthedocs.io/en/latest/authoring/jupyter-notebooks.html

MyST-NB is probably the place to start looking, btw - forgot to mention it in the previous post.
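
If it helps, here's a minimal sketch of the conf.py side (option names from memory, so double-check them against the MyST-NB docs):

```python
# Minimal Sphinx conf.py sketch for MyST-NB; option names from memory,
# verify against the MyST-NB docs before relying on them.
project = "my-docs"         # placeholder project name
extensions = ["myst_nb"]    # render MyST notebooks / .ipynb through Sphinx
nb_execution_mode = "auto"  # execute notebooks that are missing outputs
```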

[–] abruptly8951@lemmy.world 3 points 1 month ago (2 children)

I use Sphinx with MyST Markdown for this, and usually Plotly Express to generate the JS visuals. Jupyter Book looks pretty good as well.
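
For a concrete flavour, something like this is all it takes to get an interactive JS figure you can embed in a MyST page (the bundled sample dataset and the filename are just placeholders):

```python
# Minimal Plotly Express sketch: writes a self-contained interactive HTML/JS figure.
# The gapminder sample data shipped with plotly and the output filename are placeholders.
import plotly.express as px

df = px.data.gapminder().query("year == 2007")
fig = px.scatter(df, x="gdpPercap", y="lifeExp", size="pop", color="continent",
                 log_x=True, hover_name="country")
fig.write_html("lifeexp_vs_gdp.html", include_plotlyjs="cdn")
```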

[–] abruptly8951@lemmy.world 3 points 3 months ago

Poetry or uv

Still haven't tried the latter, but I've heard good things.

[–] abruptly8951@lemmy.world 3 points 4 months ago

Feeding the troll 🤷‍♂️ "agenda driven" - what does that even mean? 😆

No one said other languages aren't allowed. Submit a patch and prepare yourself for years of painstaking effort.

[–] abruptly8951@lemmy.world 6 points 9 months ago (4 children)

Devil's advocate: splatting, DLSS, and neural codecs, to name a few things that will change the way we make games.

[–] abruptly8951@lemmy.world 6 points 11 months ago

Looks like the Barbican in London to me; it's apartments plus a public bar/drinking/working area - nice spot to hang out!

[–] abruptly8951@lemmy.world 1 points 11 months ago* (last edited 11 months ago) (2 children)

1 is just going to highlight, right?

2: how about 6 words, 10 words, 100 words?

3 and 4 I use all the time.

5: if your edit locations don't line up so that you can alt-drag a single column (this is what I mean by jagged), I would use a combination of find and repeat-action.

Start from scratch - skill issue :p
