It’s one thing to claim that the current machine learning approach won’t lead to AGI, which I can get behind. But this article claims AGI is impossible simply because there are not enough physical resources in the world? That’s a stretch.
It’s “funny”, because without that injection from Google, Mozilla would surely die. And the only reason Google hasn’t stopped doing that is because then Chrome (Blink) would be more likely to be treated as a monopoly.
Yay, mob justice!
Yeah, fair enough. The key part was “arcs that go nowhere”. I got so incredibly tired of TV shows that think the way to do mystery is drawing out the plot far too slowly, in hopes you’ll tune in next episode.
Then again, regarding new trek, I only watched season 1 of Discovery, and the first episode of Picard. I ain’t got no patience for this.
And most important (for me): self-contained episodes. No season long story arcs that go nowhere.
That’s a bit too absolute a way to look at it.
From their point of view the goal isn’t to abolish human involvement, but to minimise the cost. So if AI assistance lets them do the job at the same quality with a quarter of the personnel, obviously they’re gonna do that.
At the same time, just because humans having crappy jobs is the current way we solve the problem of people getting money, doesn’t mean we should keep on doing that. Basic income would be a much nicer solution for that, for example. Try to think a bit less conservatively.
I’m not sure how long ago that was, but LLM context sizes have grown exponentially in the past year, from 4k tokens to over 100k. That doesn’t necessarily improve the quality of the output, although you can’t expect a model to summarize what it can’t hold in memory.
Gotcha. That sucks.
So those calls are not for the benefit of US companies?
Eh. Gen-X here. I still have an hour-long phone call over Signal with my best friend two times a week or so.
In my teens I wasn’t too happy about making phone calls either, but working on a helpdesk for a while sure cured that.
On the other hand, I live in a country with consumer protection, so robocalls are not a thing. And I’d strike down with great vengeance and furious anger (and GDPR) upon those companies who attempt to poison and destroy my personal attention.
I wasn’t awake enough to appreciate the sarcasm in this comment when I initially read it. Nice one.
Oh. Ok.
Why?
Is he banning them for disagreeing, or for people calling him a pedo over an autistic semantics discussion?
Mind you, I agree that when you hold controversial opinions about certain topics people are likely to discuss, you shouldn’t be a mod. Then again, if you value your sanity and blood pressure, you shouldn’t want to be a mod anyway - it’s a thankless job.
Huh? The screenshot says they think generated pictures are not abusive. Not that it’s not CSAM.
That’s a very dangerous way of twisting someone’s words, especially on a subject this emotionally charged.
Edit: want to add that, contrary to the title here, they are not talking about editing pictures to be nudes, but generating them. Don’t get me wrong, I have no interest in seeing either, and the way they’re presenting themselves is not doing them any favours. But someone is clearly out for character assassination here.
Never mind, it was about altering photos of a 15-year-old girl into nudes. And while it technically/physically may not be child abuse, for the victim it might as well be. That’s indefensible - I’m out.
For me it just gives an error: post0 failed to load.
Edit: that’s probably my Lemmy client trying to approach this blog as though it were a Lemmy instance 🤦‍♂️
It’s unused, you can go ahead and kill it.
Truly the brightest timeline.
With these kinds of titles, I hope you will, and know you won’t.
Saw a great video about this (project is still ongoing).