
So I've been using Kagi as a paid search engine for a while now. I always thought its $25-a-month plan was a little steep for search, but a) I got work to pay for it, and b) Startpage (née Google) was getting less and less useful, and Bing and everything that uses it has... well, always been worse for me.

Anyway, I just got told that they've adjusted their pricing and added features to Ultimate, and I think that now adds a lot of value if you're into the more advanced LLM / AI models / chat. I've also been paying $20 a month through work for ChatGPT Plus. I might drop that, because Kagi now lets you chat with / use GPT-4 as well as Claude 2 and a Google LLM under the one $25-a-month plan, in addition to all the search and AI search (with sourcing).

I don't know how well paid search will ever do - it might be a short-term tool. But for now, having no ads in the search, a straightforward pay-for-service model that seems to fit their stated privacy goals, and access to multiple LLMs makes for pretty good "one-stop shopping", if you will. I also like supporting less ad-based models for internet services, since I can't see how ad-funded ones avoid turning into privacy invasions.

[–] Blizzard@lemmy.zip 5 points 1 year ago (2 children)

I see all your points and fully get it. I just naively wish something new would pop up by popular demand, some out-of-the-box breakthrough idea, like an open source search engine supported by a p2p network or federated instances where everyone contributes resources. If projects as demanding as an operating system or a social network can be open source, I don't see why a search engine couldn't be.

[–] HKayn@dormi.zone 2 points 1 year ago (1 children)

I have a feeling that you mainly wish for something that you can use for free.

Reality is, money needs to come from somewhere. It will either come from you, or from someone who might have different motivations than you do.

[–] Blizzard@lemmy.zip -5 points 1 year ago (1 children)

Reality is, money needs to come from somewhere.

That's not reality, that's a mindset promoted by corporations for the last decade. How much do you pay for using Lemmy, Linux, Mastodon and other FOSS?

[–] HKayn@dormi.zone 5 points 1 year ago (1 children)

How the fuck do you think Lemmy, Linux and Mastodon are sustaining themselves, if not with money?

I donate to open source projects that I feel need the support, because unlike you I don't take them for granted.

[–] Blizzard@lemmy.zip -1 points 1 year ago (1 children)

And what exactly would stop people from doing the same for a search engine project?

[–] HKayn@dormi.zone 3 points 1 year ago (1 children)

You tell me. What's stopping you from donating to SearXNG?

[–] MaggiWuerze@feddit.de 1 points 1 year ago

SearXNG is kind of a bad example, since it's only a meta search engine and doesn't do any of the heavy lifting itself.
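
(To illustrate what "only a meta search engine" means: the job is essentially fan-out-and-merge over other engines' results, roughly like the Python sketch below. The upstream URLs and response shape are made-up placeholders, not any real API; the crawling and indexing, the actual heavy lifting, happen entirely on the upstream engines.)

```python
# Minimal metasearch sketch (illustration only): fan a query out to a few
# upstream engines and merge the results. The endpoints and JSON format here
# are hypothetical placeholders -- the point is that a metasearch engine does
# no crawling or indexing of its own.
import concurrent.futures
import requests

UPSTREAMS = {
    "engine_a": "https://search-a.example/api?q={query}",     # hypothetical
    "engine_b": "https://search-b.example/search?q={query}",  # hypothetical
}

def query_upstream(name: str, url_template: str, query: str) -> list[dict]:
    resp = requests.get(url_template.format(query=query), timeout=5)
    resp.raise_for_status()
    # Assume each upstream returns a JSON list of {"url": ..., "title": ...}
    return [{"source": name, **hit} for hit in resp.json()]

def metasearch(query: str) -> list[dict]:
    results: list[dict] = []
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_upstream, n, u, query) for n, u in UPSTREAMS.items()]
        for fut in concurrent.futures.as_completed(futures):
            try:
                results.extend(fut.result())
            except requests.RequestException:
                pass  # a dead or slow upstream just means fewer results
    # Deduplicate by URL, keeping the first occurrence
    seen, merged = set(), []
    for hit in results:
        if hit["url"] not in seen:
            seen.add(hit["url"])
            merged.append(hit)
    return merged
```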

[–] GenderNeutralBro@lemmy.sdf.org 1 points 1 year ago (1 children)

I wonder how big the database needs to be for useful search, and how much new data is added every day. Is this something that could realistically be run locally or widely mirrored?

[–] hayalci@fstab.sh 2 points 1 year ago

Nope, not realistic for "mirroring". Federated could be possible, but I wouldn't have high hopes for (good) latency and coverage.
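
(A rough back-of-envelope shows why full mirroring is a stretch. Every number below is a deliberately made-up placeholder, not a measurement; the point is only that even a small per-document index cost multiplied by web-scale document counts lands in the terabytes before you account for daily churn.)

```python
# Back-of-envelope only: all figures are placeholder assumptions, not data.
pages_indexed = 1_000_000_000    # assume 1 billion documents (placeholder)
bytes_per_doc_index = 2_000      # assume ~2 KB of index data per document (placeholder)
new_pages_per_day = 10_000_000   # assume 10 million new/updated docs per day (placeholder)

index_size_tb = pages_indexed * bytes_per_doc_index / 1e12
daily_growth_gb = new_pages_per_day * bytes_per_doc_index / 1e9

print(f"index ≈ {index_size_tb:.0f} TB, growing ≈ {daily_growth_gb:.0f} GB/day")
# -> index ≈ 2 TB, growing ≈ 20 GB/day under these toy assumptions -- and real
#    engines index far more than a billion pages, so the true numbers would be
#    orders of magnitude higher.
```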