Jerboa is solid, but it’s not feature-rich. Not great for media browsing. It’s still my main client since I use Lemmy mostly for text, not images or videos.
Eternity and Voyager are worth looking at, too.
Interesting read, thanks! I’ll finish it later, but this bit already stood out:
Without access to gender, the ML algorithm over-predicts women to default compared to their true default rate, while the rate for men is accurate. Adding gender to the ML algorithm corrects for this and the gap in prediction accuracy for men and women who default diminishes.
We find that the MTEs are biased, significantly favoring White-associated names in 85.1% of cases and female-associated names in only 11.1% of cases
If you’re planning to use LLMs for anything along these lines, you should filter out irrelevant details like names before any evaluation step. Honestly, humans should do the same, but it’s impractical. This is, ironically, something LLMs are very well suited for.
Of course, that doesn’t mean off-the-shelf tools are actually doing that, and there are other potential issues as well, such as biases around cities, schools, or any non-personal info on a resume that might correlate with race/gender/etc.
I think there’s great potential for LLMs to reduce bias compared to humans, but half-assed implementations are currently the norm, so be careful.
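A minimal sketch of what that kind of pre-filtering could look like, purely as an illustration (the field names and redaction list here are hypothetical, not taken from any particular screening tool):

```python
import re

# Hypothetical identity fields the evaluating model should never see.
IDENTITY_FIELDS = {"name", "email", "phone", "address", "photo_url"}

def redact(resume: dict) -> dict:
    """Drop identity fields and mask email-like strings before any LLM scoring step."""
    cleaned = {k: v for k, v in resume.items() if k not in IDENTITY_FIELDS}
    if "summary" in cleaned:
        cleaned["summary"] = re.sub(r"\S+@\S+", "[redacted]", cleaned["summary"])
    return cleaned

candidate = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "summary": "Ten years of backend experience. Contact: jane@example.com",
    "skills": ["Python", "Kubernetes"],
}
print(redact(candidate))
# {'summary': 'Ten years of backend experience. Contact: [redacted]', 'skills': ['Python', 'Kubernetes']}
```

As noted above, this only strips direct identifiers; proxies like cities or schools are a harder problem.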
After all these years, I’m still a little confused about what Forbes is. It used to be a legitimate, even respected magazine. Now it’s a blog site full of self-important randos who escaped from their cages on LinkedIn.
There’s some sort of approval process, but it seems like its primary purpose is to inflate egos.
It was an SEO hellhole from the start, so this isn’t surprising.
Do Forbes next!
It says:
Available Architectures
aarch64, x86_64
And it uses Android Translation Layer. Interesting. I’ll give it a shot on my desktop later.
I claim ownership of the microorganisms in and on my body. I am not merely human; I am a glorious amalgamation of trillions of distinct beings, working in harmony to bring you shitposts!
As a begrudging Comcast customer myself, allow me to explain. They are the least shitty option because the only alternatives in my area are 5G and Verizon DSL. Verizon DSL has a max download speed of less than 1 Mbps.
So yeah, I use Comcast. And I hate it.
I also sometimes use the mbasic.facebook.com site from a private Firefox tab on my iPhone, but FB has just started telling me I need to use Chrome.
WTF.
But really, using a Chromium-based or Safari-based browser in private/incognito mode will not be much different as far as tracking goes.
You might also be able to install a user-agent switcher extension in Firefox. I thiiiiiink Firefox supports extensions on iOS now, right? If not, you can try an alternative browser like DuckDuckGo or Orion.
It’s a bit complex to set up, but Syncthing can do one-way syncing. So you can set it to upload photos from your phone to your PC (you may also need to enable the “Ignore Delete” advanced option on the PC-side folder so deletions on the phone don’t propagate). Then you can delete those photos from your phone and they’ll stay on the PC.
Then you just need a good backup strategy for your PC.
Who thought it was a good idea to let an internet ad company control our internet client?
It seemed a lot more reasonable 15 years ago. The default on Windows at the time of Chrome’s rise was Internet Explorer.
I am watching Ladybird with great interest. The world needs a new-from-the-ground-up browser.
simply logging out or using an alt account
It is increasingly difficult to use X without an account. Not sure what the signup process is like nowadays. IIRC it used to require phone number verification in the Twitter days, but perhaps Musk relaxed the requirements in order to better pad the usage stats with spambots?
Yeah, AMD is lagging behind Nvidia in machine learning performance by like a full generation, maybe more. Similar with raytracing.
If you want absolute top-tier performance, then the RTX 4090 is the best consumer card out there, period. Considering the price and power consumption, this is not surprising. It’s hardly fair to compare AMD’s top-end to Nvidia’s top-end when Nvidia’s is over twice the price in the real world.
If your budget for a GPU is <$1600, the 7900 XTX is probably your best bet if you don’t absolutely need CUDA. Any performance advantage Nvidia has goes right out the window if you can’t fit your whole model in VRAM. I’d take a 24GB AMD card over a 16GB Nvidia card any day.
You could also look at an RTX 3090 (which also has 24GB), but then you’d take a big hit to gaming/raster performance and it’d still probably cost you more than a 7900XTX. Not really sure how a 3090 compares to a 7900XTX in Blender. Anyway, that’s probably a more fair comparison if you care about VRAM and price.
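On the “fit it in VRAM” point, a rough back-of-the-envelope sketch (weights only; the KV cache and activations need headroom on top of this):

```python
def weight_vram_gib(params_billion: float, bits_per_param: float) -> float:
    """Approximate VRAM needed just to hold the model weights."""
    return params_billion * 1e9 * (bits_per_param / 8) / 1024**3

for size in (7, 13, 34, 70):
    print(f"{size:>2}B params: fp16 ≈ {weight_vram_gib(size, 16):5.1f} GiB, "
          f"4-bit ≈ {weight_vram_gib(size, 4):4.1f} GiB")
#  7B params: fp16 ≈  13.0 GiB, 4-bit ≈  3.3 GiB
# 13B params: fp16 ≈  24.2 GiB, 4-bit ≈  6.1 GiB
# 34B params: fp16 ≈  63.3 GiB, 4-bit ≈ 15.8 GiB
# 70B params: fp16 ≈ 130.4 GiB, 4-bit ≈ 32.6 GiB
```

So a 70B-class model won’t fit on a single 24GB card even at 4-bit, but a 30B-class one will, which is exactly where the extra 8GB over a 16GB card starts to matter.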
Basically the only thing that matters for LLM hosting is VRAM capacity
I’ll also add that some frameworks and backends still require CUDA. This is improving, but before you go and buy an AMD card, make sure the things you want to run will actually run on it.
For example, bitsandbytes support for non-CUDA backends is still in alpha: https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend
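A quick sanity check along those lines, assuming a PyTorch-based stack (just one example; other frameworks have their own equivalents):

```python
import torch

# PyTorch's ROCm builds reuse the torch.cuda API, so this check covers AMD and Nvidia alike.
if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"GPU visible via {backend}: {torch.cuda.get_device_name(0)}")
else:
    print("No GPU backend available - check drivers and the installed build variant.")
```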
Assuming a nominal voltage of 3.7 V, that’s about 60 Wh. For comparison, the 14″ MacBook Pro has a 70 Wh battery.
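(Working that backwards, for anyone checking the math, Wh = Ah × nominal volts:)

```python
wh, volts = 60, 3.7
print(f"{wh / volts * 1000:,.0f} mAh")   # ≈ 16,216 mAh at 3.7 V nominal
```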
That’s not great battery life, but it depends on what kind of usage they’re assuming for that 7-hour figure. I’m not sure a MacBook runs that long under high load. If it’s 7 hours under heavy load, that’s respectable.
Edit: not sure what class of device is the best comparison here. Laptop? Tablet? Phone? 🤷
I want it to be consistent, dammit!
YES.
In tech terms, “intelligent” or “smart” usually means inconsistent and unpredictable. It means I need to do extra work to verify that the computer didn’t “helpfully” do something I never told it to do.
I understand autocorrect on phones, because phone keyboards suck very hard. I am still shocked that both Apple and Microsoft have decided to enable it by default on desktops and laptops with full keyboards. No, Apple, believe it or not, the username field in web sites is not supposed to have a capitalized first letter. If I wanted that, I have three whole keys on my keyboard that I could have used to do that. STFU and let me do my own typing. (Why usernames are case-sensitive in certain places is a whole other matter, one that’s far outside my control.)
Borg via Vorta handles the hard parts: encryption, compression, deduplication, and archiving. You can mount backup snapshots like drives, without needing to expand them. It splits archives into small chunks so you can easily upload them to your cloud service of choice.
I don’t use this feature much so I can’t speak to the details, but yes, it does support OPDS.
Screenshot:
English Dictionary Offline: https://play.google.com/store/apps/details?id=livio.pack.lang.en_US . Still unbloated after all these years.
Librera (ebook reader): https://f-droid.org/en/packages/com.foobnix.pro.pdf.reader/ . The pro version is free on F-Droid, or you can buy it on Google Play if you want to support the dev.
Fossify Gallery: https://f-droid.org/en/packages/org.fossify.gallery/ . This is a fork of the old Simple Gallery from before it got bought out. Same deal with Fossify File Manager and the other Fossify apps. Just no-nonsense functional apps.
Wait, isn’t it the other way around? You should arrive in NY earlier than you left London, since NY is 5 hours behind London. So if you leave at 8:30 and arrive 1.5 hours later, it should only be 5AM when you arrive.
You might need a third breakfast before your elevenses in that case.
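For the pedants, the clock math checks out with Python’s zoneinfo (the date below is just for illustration; London and New York are five hours apart most of the year):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

depart = datetime(2024, 6, 15, 8, 30, tzinfo=ZoneInfo("Europe/London"))
arrive = depart + timedelta(hours=1, minutes=30)  # a 1.5-hour flight
print(arrive.astimezone(ZoneInfo("America/New_York")).strftime("%H:%M"))  # 05:00
```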