Shit costs money for any platform.
Even Lemmy converts images to .webp?
I’d love to be able to just upload my 4.6MB image and have it reduced down to sub-2MB, but I have to do that manually, because Jerboa & co. don’t do it, nor accept images bigger than 2MB.
Am I missing something?
I have my own Lemmy instance with pictrs, and I still can’t use bigger images. Maybe it’s a hard limit, or else every other instance would deny the “too big” image?
I’m okay with that limit; it’s just a hassle that you can’t share a bigger image and have it reduced in size automatically.
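For reference, the manual shrink I end up doing looks roughly like this. A minimal sketch assuming Pillow; the file names and the 2MB figure are just placeholders for whatever the instance limit actually is:

```python
# Shrink a photo until it fits under the instance's upload limit.
# Sketch only: assumes Pillow; file names and the 2 MB cap are placeholders.
import os
from PIL import Image

LIMIT = 2 * 1024 * 1024  # ~2 MB upload cap

img = Image.open("photo_4_6mb.jpg").convert("RGB")
quality = 90
img.save("upload.jpg", format="JPEG", quality=quality)

# First try lowering JPEG quality, then halve the resolution if that isn't enough.
while os.path.getsize("upload.jpg") > LIMIT and quality > 40:
    quality -= 10
    img.save("upload.jpg", format="JPEG", quality=quality)
while os.path.getsize("upload.jpg") > LIMIT and img.width > 32:
    img = img.resize((img.width // 2, img.height // 2))
    img.save("upload.jpg", format="JPEG", quality=quality)
```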
How large?
Lemy.lol has 10MB.
2MB it is.
It’s still compressed on Mastodon; I tried to post a 3072 × 4080 2MB JPEG, and when downloaded from the post it’s now a 2499 × 3319 500KB JPEG.
Depends on the server, and I pretty much understand why service providers do it, although it would be nice to be able to buy some high-quality slots from them as a way to support them.
If someone really wants a RAW image of my crusty ass dog, for some reason, you can ask me to send it over something else. It’s a waste of bandwidth for the majority of photos, which are viewed once per person and never again. Nobody can host that much data for free without some big catch.
Dog.
Can you send it over? I want to count your dog’s hairs.
I’ll send the whole dog if you want to do that
Yeah actually, it looks super cute and cuddly.
He’s probably too friendly for his own good, but yes, he’s the world’s chillest dog.
His breath is worse than a Malboro’s.
Omg, it’s the “let me do it for you” dog
deleted by creator
???
Yeah, but Mastodon is a right-wing cesspool
Mastodon is federated and has instances with varying politics
Is this ironic? Certainly my Mastodon feed is radically left…
Mine’s geeky old tech, queer librarians, dogs, and cross-stitch.
I tried it when it first launched, has it gotten better?
Mastodon doesn’t really have a recommendation engine. If you’re getting lots of right-wing lies and propaganda, then it’s because of who you chose to follow.
Lossy compression is antiquated. JPEG should no longer be used; it’s not 1999. I will die on this molehill.
JPEG XL (JXL) seems promising, being able to do a fair amount of compression while keeping images still high quality.
Lossless compression doesn’t really do well for pictures of real life. For screenshots it’s ideal, but for complex images, PNGs are just wayyyy too big for the virtually unnoticeable difference.
A high-quality JPG is going to look good. What doesn’t look good is when it gets resized, recompressed, screenshotted, and recompressed again 50 times.
PNG is the wrong approach for lossless web images. The correct answer is WebP: https://siipo.la/blog/whats-the-best-lossless-image-format-comparing-png-webp-avif-and-jpeg-xl
I found quite a lot of AVIF encoders lied about their lossless encoding modes, and instead used the normal lossy mode at a very high quality setting. I eventually found one that did true lossless and I don’t think it ever managed to produce a file smaller than the input.
Turns out, that’s a well known issue with the format. It’s just another case where Google’s marketing makes AVIF out to be fantastic, but in reality it’s actually quite mediocre.
They lied about the lossiness?! I can’t begin to exclaim loudly enough about how anxious this makes me.
The funny thing is, I knew something was off because Windows was generating correct thumbnails for the output files, and at that time the OS-provided thumbnailer was incapable of generating correct thumbnails for anything but the simplest baseline files.
(Might be better now, idk, not running Windows now)
That’s how I knew the last encoder was producing something different: even before checking the output file size, the thumbnail was bogus.
This story is a nightmare and I’m not sure if it’s better or worse now knowing that it was ancient ICO files that tipped you off.
Open question to you or the world: for every lossless compression I ever perform, is the only way to verify it to generate before and after bitmaps or XCFs, and conclude that unless the before-bitmap and after-bitmap are identical files, lossy compression has occurred?
Pretty much. You can use something like ImageMagick’s compare tool to quickly check if the round trip produced any differences.
It can be a bit muddled, because even if the encoding is lossless, the decoding might not be (e.g. subtle differences between non-SIMD and SIMD decoding), and it’s not like you can just check the file hashes, since e.g. PNG has about four different interchangeable ways to specify a colour space. So I’d say it’s lossless if the resulting images differ by no more than ±1 bit of error per pixel (e.g. 127 becoming 128 is probably fine; becoming 130 isn’t).
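Something like this round-trip check, if you want to script it. A minimal sketch assuming Pillow and NumPy are installed; the file names are placeholders:

```python
# Decode both images to raw RGB and report the largest per-channel difference.
# Sketch only: "original.png" and "roundtrip.png" stand in for your own files.
from PIL import Image
import numpy as np

def max_pixel_error(path_a: str, path_b: str) -> int:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Images differ in size; the encoder resized or cropped.")
    return int(np.abs(a - b).max())

err = max_pixel_error("original.png", "roundtrip.png")
# Per the rule of thumb above: 0 is truly lossless, 1 is close enough,
# anything larger means the "lossless" mode really wasn't.
print("max per-pixel error:", err)
```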
JXL is a much better format, for a multitude of reasons beyond the article, but it doesn’t have much adoption yet. On the Chromium team (the most important platform, unfortunately), someone seems to be actively power-tripping and blocking it.
Yeah, Google is trying to keep control of their image format, and they’re abusing their monopoly to do so.
Webp, yo!
.tif or nothing, yo.
A high-quality JPG looks good. The 100th compression into a JPG looks bad.
I know compression has a lot of upsides, but I’ve genuinely hated it ever since broadband was a thing. Quality over quantity all the way. My websites have always used dynamic resizing, with the resolution passed as a parameter, resulting in lightning-fast load times and full quality when you need it.
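Roughly along these lines, for the curious. A sketch only, assuming Flask and Pillow; the route, the parameter name, and the image directory are just illustrative, not my actual setup:

```python
# Serve an image scaled to the width requested in the query string,
# e.g. /img/dog.jpg?w=800 returns the image resized to 800px wide.
import io
from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)

@app.route("/img/<name>")
def resized_image(name: str):
    width = request.args.get("w", type=int)
    img = Image.open(f"images/{name}")  # no path sanitising here, sketch only
    if width and width < img.width:
        height = round(img.height * width / img.width)
        img = img.resize((width, height), Image.LANCZOS)
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=90)
    buf.seek(0)
    return send_file(buf, mimetype="image/jpeg")
```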
The way things are shared on the internet is with screenshots and social media, and it’s been like that for at least 15 years. JPG is just slowly deep-frying the internet.
I disagree, but I do agree that there are better options available than JPEG. Lossy compression is actually what allows much of the modern internet to function. 4K HDR content on Netflix wouldn’t be a thing without it. And lossy compression can be perceptually lossless for a broader range of use cases. Many film productions use high quality lossy formats in their production pipelines in order to be able to handle the vast amounts of data.
Of course it all depends on the use case. If someone shares some photos or videos with me to keep, I’d like them to send the originals, whatever format they might be in.
I understand the need for compression and re-encoding, but I stand by the claim that we should not use a container that eats itself alive a little bit every time it’s edited.
How often does a JPEG get edited in practice, though? Maybe 2-3 times at most?
Yeah, let’s all post RAW 40MB photos right from the phone on … The Internet!
What a good idea.
Is there a specific reason? And as a follow-up, do you only listen to 96-bit FLAC too? Should video not be compressed either?
I mean, I’m all in with you when it comes to storing my holiday photos, but sharing them? Not so much.
That said, I grew up with 35KB JPGs, so I’m kind of used to it; maybe I’m skewed.
Files should be at reasonable resolutions and sizes for their purpose, but not in file formats that slowly deteriorate in an internet of remixing ideas.
So not analog?
Who taught you jpgs deteriorate over time lol
https://uploadcare.com/blog/jpeg-quality-loss/
It happens when the image is edited and re-encoded on save. Who taught you they didn’t?
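If you want to see it for yourself, something like this shows the loss accumulating. A rough sketch assuming Pillow, where “photo.jpg” stands in for any real-world photo:

```python
# Simulate 50 generations of "edit and re-save as JPEG".
# The tiny resize stands in for an edit, so each save has to re-quantise
# fresh pixel data instead of converging on the same blocks.
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")
for generation in range(50):
    img = img.resize((img.width - 1, img.height - 1))
    img.save("generation.jpg", format="JPEG", quality=75)
    img = Image.open("generation.jpg")
img.save("after_50_edits.jpg", quality=75)
```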
Oh, so it happens … Shuffles papers … When someone degrades the quality intentionally.
That happens if you reduce the DPI of your RAW image too, btw.
Not “over time”!
it’s not 1999
Don’t tell the kids over on Dormi.zone that.
What is Bluesky all about any fucking way? “No, we need a corpo daddy, and we want the same corpo daddy that ruined the world with Twitter.”
Hopefully he just keeps making them and selling them off. Dilute it down to nothing.
The whole goal of Bluesky is to not need a corp; what are you talking about?