Here’s where we never see the cure, due to profits.
Actually, in this case it’s due to laziness and ignorance of the populace:
I found Tuta to be lacking.
Conversation view is incomplete https://github.com/tutao/tutanota/issues/6 - https://github.com/tutao/tutanota/issues/5051
“when you have multiple addresses and custom domains getting hundreds of emails… it takes forever for the emails to load” https://community.centminmod.com/threads/skiff-email.24363/
Search isn’t working in Firefox: “your browser doesn’t support data storage”. Since the search index needs to be stored in your browser, search does not work in private/incognito mode.
Free accounts get deleted if you do not log in for six months.
I can’t even access it on Edge.
We have to live in this world with all the brainrotted zombies so it is actually our problem too.
I agree and I think there’s a solution, but no one seems to care https://lemmy.world/post/14389655.
This is horrible news. Reddit is a horrible website and only getting worse. OpenAI promoting them and using their garbage content to train their AI systems is alarming. This is so dystopian.
And of course it always leads back to money:
Sam Altman is a shareholder in Reddit
I made accounts on Mastodon and Blue Sky but most people still use Twitter, so if there’s info you’re looking for, or if you want to share things, you’re forced to use what most people are using.
More info & discussion https://lemmy.world/post/15491742.
Doesn’t that mean that docker containers use up much more resources, since you’re installing numerous instances & versions of each program, like PHP?
It seems like docker would be heavy on resources since it installs & runs everything (mysql, nginx, etc.) numerous times (once for each container), instead of once globally. Is that wrong?
Instead of setting up one nginx for multiple sites you run one nginx per site and have the settings for that as part of the site repository.
Doesn’t that require a lot of resources since you’re running (mysql, nginx, etc.) numerous times (once for each container), instead of once globally?
Or, per your comment below:
Since the base image is static, and config is per container, one image can be used to run multiple containers. So if you have a postgres image, you can run many containers on that image. And specify different config for each instance.
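As a sketch of that point (service names and passwords here are made up), a docker-compose file can run two independent containers off the same postgres image, each with its own config:

```yaml
# docker-compose.yml — one image, two independent containers.
services:
  app1-db:
    image: postgres:16        # image layers stored once on disk
    environment:
      POSTGRES_PASSWORD: pw1  # per-container config
  app2-db:
    image: postgres:16        # same image, shared copy-on-write layers
    environment:
      POSTGRES_PASSWORD: pw2  # different config for this instance
# Each container still runs its own postgres process, so RAM/CPU
# scale with the number of containers; only the image layers on
# disk are shared between them.
```

So the disk cost of the image is paid once, but the runtime cost (one postgres process per container) is not shared.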
You’d only have two instances of postgres, for example, one for all docker containers and one global/server-wide? Still, that doubles the resources used no?
I covered that in the OP. It requires coding ability for anything other than a simple blog.
I covered that in the links in the OP. It’s extremely limited. I didn’t find it useful.
Redline the cheapest option until it catches fire.
Why are you worried about your site going down during traffic surge? Unless you’re running a critical service, there is no need to worry about this too much if it’s just your personal sites.
Because it’s an important business website that would have severe consequences if it went down during traffic spikes (which it does get).
With proper caching, your personal site can even tank traffic from the reddit frontpage on a $5/mo vps.
Yeah, I’m using Cloudflare, and I saw that Wordpress has a built-in caching option, but I couldn’t find any info on how well that protects sites from traffic surges.
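For reference, server-side page caching for a PHP site like WordPress might look roughly like this in nginx (the cache zone name, socket path, and domain are placeholders, and timings would need tuning):

```nginx
# Illustrative nginx FastCGI page cache for a WordPress site.
fastcgi_cache_path /var/cache/nginx levels=1:2
                   keys_zone=wpcache:10m max_size=1g inactive=60m;

server {
    listen 80;
    server_name example.com;

    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php-fpm.sock;
        include fastcgi_params;

        fastcgi_cache wpcache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 301 10m;                 # serve cached pages for 10 min
        fastcgi_cache_use_stale error timeout updating;  # keep serving stale pages if PHP is overloaded
    }
}
```

During a traffic spike, most visitors then hit the cached page instead of PHP/MySQL, which is what lets a small VPS absorb a surge.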
Consider hosting it on platforms with autoscaling support, such as Netlify.
Yeah but I need an SSG with the same capabilities as Squarespace to do that, and as mentioned in the OP, that doesn’t seem to exist.
I’d recommend Statamic.
I looked at the demo and it looks like a very simple text editor to make blogs.
Since you posted this into a self-hosting community…
I have two other websites hosted on a $5 Hetzner server (that counts as self-hosted, right?). I’ve been considering adding a Wordpress, Grav, or static site to it. But as mentioned in the OP, I have to worry about the site going down if it gets a traffic surge, so I’m thinking it would be safer, and similarly or more affordable, to host a Wordpress site with Hostinger or GreenGeeks. Am I wrong?
Grab a Raspberry Pi, slap nginx proxy manager and ddclient into it, and point your domain to your home IP.
I’m not likely to do that, for multiple reasons.
I ran into a similar problem with snapshots of a forum and email server – if there are scheduled emails pending when you take the snapshot, they get sent out again when you create a new test server from the snapshot. And similarly for the forum.
I’m not sure what the solution is either. The emails are sent via SMTP, so it’s not as simple as disabling email (ports, firewall, etc.) on the new test server.
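One partial mitigation I’d consider (a sketch, not a full fix – outbound mail might take other paths, and the ports below assume the standard relay/submission ports): firewall off outbound SMTP on the restored clone before its services come up, so the duplicated queue can’t flush to the real relay.

```
# Illustrative: on the restored test server, reject outbound SMTP
# before the mail queue starts flushing.
iptables -A OUTPUT -p tcp --dport 25  -j REJECT   # plain SMTP / relay
iptables -A OUTPUT -p tcp --dport 465 -j REJECT   # SMTPS
iptables -A OUTPUT -p tcp --dport 587 -j REJECT   # submission
```

The queued messages would then fail to send on the clone, where they can be inspected and purged, instead of reaching real recipients.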