

The last time I did any html/css work was about 15 years ago. Now I’m curious what’s changed.


If Civilization II taught me one thing, it’s that ongoing payments are an absolute scam… Unless you’re planning to declare war anyway.


Yeah, I meant for AI stuff specifically. Their main products are…well I wouldn’t say “good” but they successfully choked out all competition in the 90s so…


Microsoft has nothing worth using. Microsoft hasn’t made anything that’s even worth talking about. Anyone with an OpenAI key and an afternoon to kill could make something every bit as good as what Microsoft has done. They put the absolute bare minimum of effort into everything they’ve done with AI.
The only advantage they have is customer lock-in. Historically, that’s usually enough for them. I hope it’s not this time.
Eventually Microsoft will probably buy a company with people who know what the fuck they’re doing. I think that’s their only way forward because it looks like the brain drain has finally caught up with them.


The problem is that they are naively inverting the colors, which doesn’t work for photos. Lazy, yeah.
In principle I think it makes sense (as much sense as the feature in general, anyway). Personally I do not understand the push in iOS and Android to make all icons look the same, but if that’s what you want, then excluding shortcuts would be an eyesore, right?
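For the curious, "naively inverting" means something like this minimal Python sketch (the function name and raw RGB tuples are just for illustration, not how iOS/Android actually implement theming):

```python
def invert_naive(pixel):
    """Invert an (R, G, B) pixel by flipping each channel."""
    r, g, b = pixel
    return (255 - r, 255 - g, 255 - b)

# Works fine for a flat glyph on a white background:
print(invert_naive((255, 255, 255)))  # white -> black

# But a photo turns into a film negative: every color maps to its
# complement instead of a "dark" version of the image.
print(invert_naive((200, 150, 120)))  # warm skin tone -> (55, 105, 135), a cold blue
```

A smarter approach would invert lightness while trying to preserve hue, which is presumably why flat glyph icons survive the treatment and photo-based shortcut icons don't.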


Jesus Christ what a dumb take. But at least they didn’t say that millennials are killing the cell phone industry. I guess that doesn’t make for good clickbait anymore.
Reminds me of the parable of the broken window, in which French economist Frédéric Bastiat explains the painfully obvious truth that breaking windows is generally a bad thing, even though it drums up business for the glass maker.
But if, on the other hand, you come to the conclusion, as is too often the case, that it is a good thing to break windows, that it causes money to circulate, and that the encouragement of industry in general will be the result of it, you will oblige me to call out, “Stop there! Your theory is confined to that which is seen; it takes no account of that which is not seen.”
It is not seen that as our shopkeeper has spent six francs upon one thing, he cannot spend them upon another. It is not seen that if he had not had a window to replace, he would, perhaps, have replaced his old shoes, or added another book to his library. In short, he would have employed his six francs in some way, which this accident has prevented.


I think Debian offers a very good compromise. The primary repos follow the Debian Free Software Guidelines (DFSG).
Then they have a separate “non-free” repo for “non-DFSG-compliant packages that are considered important enough to make available anyway”. If you want to be a hardline free software stalwart, you can do that, and Debian supports you. If you are comfortable making a few compromises for the sake of usability, like fonts and device drivers, Debian supports you on that as well.
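Concretely, opting in is just a matter of adding the extra components to your apt sources. Something along these lines for Debian 12 “bookworm” (adjust the release name and mirror to taste; since Debian 12, device firmware lives in its own non-free-firmware component):

```
deb http://deb.debian.org/debian bookworm main contrib non-free non-free-firmware
```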


Hmm, maybe I’m thinking more iPhone 3G era than original iPhone era? I recall a time when there weren’t many apps yet and you could put out anything marginally-functional for 99¢ on the app store and get some quick cash from it. I don’t remember $10-20 being the norm but maybe that was before I was onboard.
I’ve certainly been burned by apps either breaking with iOS updates or no longer being available to download on the App Store (so you could keep using them, but only on existing devices that already had them installed).


I blame Apple for setting the standard of $1-$3 for an app with lifetime updates. And also for making it so old apps stop working on newer OSes after just a few years. The business model was broken from the start. It was great at first but the bubble burst in record time.
That was nearly unheard of just 20 years ago.


I think it’s just that Mastodon posts in Lemmy are weird.
Toots don’t have titles, so Lemmy just duplicates the content into the title, and then you have a mess of @ and # tags that don’t make sense in Lemmy.
Cross-ecosystem federation is cool but also leaves a lot to be desired.


It makes sense to me IF it actually works.
Having extra capacity when a device is brand-new isn’t a huge boon, but having stable capacity over the long term would be. At least for me.
Of course this will depend on your habits. If you replace your phone every year, then it doesn’t matter. If you’re a light user and only go through a couple charge cycles per week, it’ll matter less than if you go through 1-2 cycles per day.
Personally I’m at around 1 cycle per day on my current phone, and after nearly 3 years (over 1000 charge cycles now) the battery life is shit — much worse than just 80% of its original battery life. Performance also suffers. With my last phone, I replaced the battery after 3 years and I was amazed at how much faster it was. I didn’t realize throttling was such a big problem.
I might replace my current battery, but it’s such a pain, and it costs more than my phone is realistically worth.


WTF is up with that title. Jesus.


I use Wayland now but there are still apps I run in X mode. Notably mpv and Firefox, because I cannot for the life of me configure them sensibly in Wayland, and I don’t want to write arcane KWin scripts just to get window sizing/positioning to stay the way I want them on launch. I tried; it was extremely frustrating and still not quite functional.
Perhaps there are other window managers that would make my life easier. I haven’t tried many, but in principle, there is no way for the window manager to know the correct size and location of new windows for arbitrary applications, so I doubt it. I consider this a user-hostile design choice in Wayland and I pray it will change in the future.


The majority of people will trail behind by 5-10 years, same as always. As long as a small minority at the cutting edge continue to use and develop better things, everyone will have access to them eventually.


They announced that they’re working with an OEM to support new non-Pixel phones (perhaps even shipped with GOS).
The Pixel 9 series will be supported for another 6 years, and GOS support for the Pixel 10 is probably coming after Google releases QPR1 source. Hopefully there will be viable replacements by then.
Google is obviously going to keep making this more difficult but the rest of the world isn’t going to just sit still.


The actual paper presents the findings differently. To quote:
Our results clearly indicate that the resolution limit of the eye is higher than broadly assumed in the industry
They go on to use the iPhone 15 (461ppi) as an example, saying that at 35cm (1.15 feet) it has an effective “pixels per degree” of 65, compared to “individual values as high as 120 ppd” in their human perception measurements. You’d need the equivalent of an iPhone 15 at 850ppi to hit that, which would be a tiny bit over 2160p/UHD.
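To sanity-check that last figure, a quick back-of-the-envelope in Python using only the numbers quoted above (at a fixed viewing distance, pixels per degree scales linearly with pixel density):

```python
ppi_iphone15 = 461   # pixels per inch, as quoted above
ppd_at_35cm = 65     # effective pixels per degree at 35 cm, per the paper
ppd_target = 120     # highest individual acuity the paper measured

# Density needed to hit the target ppd is just the linear ratio.
ppi_needed = ppi_iphone15 * ppd_target / ppd_at_35cm
print(round(ppi_needed))  # ~851 ppi
```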
Honestly, that seems reasonable to me. It matches my intuition and experience that for smartphones, 8K would be overkill, and 4K is a marginal but noticeable upgrade from 1440p.
If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish
Three paragraphs in and they’ve moved the goalposts from HD (1080p) to 1440p. :/ Anyway, I agree that 2.5 meters is generally too far from a 44" 4K TV. At that distance you should think about stepping up a size or two. Especially if you’re a gamer. You don’t want to deal with tiny UI text.
It’s also worth noting that for film, contrast is typically not that high, so the difference between resolutions will be less noticeable — if you are comparing videos with similar bitrates. If we’re talking about Netflix or YouTube or whatever, they compress the hell out of their streams, so you will definitely notice the difference if only by virtue of the different bitrates. You’d be much harder-pressed to spot the difference between a 1080p Blu-ray and a 4K Blu-ray, because 1080p Blu-rays already use a sufficiently high bitrate.


Does it do that even if you set it to “use device MAC” for the wi-fi network you’re on?
The exact location might depend on brand/OS, but in stock Android it’s in Settings > Network & Internet > Internet > gear icon next to active wi-fi network > Privacy.


The only thing I would use such a thing for is installing an ad blocker for the real world.


It’s been a while since I ran a full-fat VM. What’s the go-to these days?


Hard to say what the used market is like, but the cheapest cards that would be broadly similar in performance would probably be the Arc A580, RX 5700 or RX 6600. This page has some rankings that are reasonable for comparison: https://www.tomshardware.com/reviews/gpu-hierarchy,4388-2.html
But…surely there’s a way to just stick with the latest supported driver, right? Or is Arch truly an “upgrade or die” distro?