Nice, manufactured almost two years after first usage 🤯
Yeah, we pay a lot. On the other hand, we also have one of the lowest electricity outage rates, on average roughly 10 minutes of downtime per year… so that’s kind of a (small) advantage you get for the premium price.
Average load of 800 W: 0.8 kW × 24 h × 30 d = 576 kWh/month
Which is over 172€ on a 30ct/kWh contract.
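The back-of-the-envelope math as a quick sketch (30 ct/kWh is the assumed rate from above):

```python
# Monthly energy cost for a constant 800 W average load.
avg_load_kw = 0.8         # 800 W expressed in kW
hours_per_month = 24 * 30
price_eur_per_kwh = 0.30  # the 30 ct/kWh contract mentioned above

energy_kwh = avg_load_kw * hours_per_month  # 576 kWh per month
cost_eur = energy_kwh * price_eur_per_kwh   # ~172.80 EUR

print(f"{energy_kwh:.0f} kWh/month -> {cost_eur:.2f} EUR")
```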
Just my 2 cents:
1. Proxmox. Flexibility for both new services via VM/LXC and backups (just install Proxmox Backup Server alongside it and you get incremental backups with nice retention settings, file-restore capabilities, as well as backup consistency checks).
2. If it’s in a VM/container you don’t need to worry about backups separately, see 1.
In this case, isn’t it sufficient to be able to access the data via a Windows network share?
Yes,
It does not work. Additionally, the Ubiquiti switch does not sync at 10 Gbit/s with one of the QNAP switches (I tested different cables and ports, but the LED on the QNAP stays orange, indicating a link speed below 10 Gbit/s).
As I may be returning the switch because of these problems, I hesitate to register it just to access the settings. Jumbo frame settings could actually be the solution, but with the problem mentioned in the first paragraph I’m not sure. A 300 € device should just work, IMHO…
Thanks for the suggestion with ping, I will test it.
Thanks for your input. All criteria you mentioned are met.
If I build the same connection with the second QNAP switch instead of the Ubiquiti, it works flawlessly.
As I’m not sure whether I should keep it or return it, I’ve hesitated to do anything other than use it as a dumb switch.
It’s just two switches.
Server 1 — 10GbE — Ubiquiti switch — 10GbE — QNAP switch — 10GbE — Server 2.
Another one… the Ell Donsaii series. It’s light but nice to read and quite interesting in a science-fiction kind of way.
Loved it, it’s much more “fantastic”, i.e. mind-inspiring. Also, there’s going to be a movie!
The Martian. Both the book and the movie
Are Google Pay and banking apps working?
I can’t help you identify the root cause, but I know there are fake-display HDMI adapters (dummy plugs) that imitate a connected display, which people use for headless gaming VMs.
Jitsi Meet is usually P2P for calls between two people. As soon as a third person joins, the meeting gets routed through the server. You can notice this as a slight delay when person 3 joins; it won’t happen again for each additional person joining.
Very interesting, thanks for sharing!
I know it’s just anecdotal evidence, but fail2ban on my one machine that does need SSH on port 22 open to the internet bans a lot of IPs every hour. All the other ones with SSH on a higher port do not, and their auth logs show no failed attempts.
The points I made should not be used instead of all the other security precautions like disabling password login, fail2ban and updates; I thought that was common knowledge. They are additional steps to increase security.
I disagree that changing the port is just security by obscurity. Scanning IPs on port 22 is a lot easier than probing thousands of ports for every IP.
The reason people do automated exploit attempts on port 22 is because it is fast, cheap and effective. By changing the port you avoid these automated scans. I agree with you, this does not help if someone knows your IP and is targeting you specifically. But if you’re such a valuable target you hopefully have specialized people protecting your IT infrastructure.
Edit: as soon as your sshd answers on port 22, a potential attacker knows the IP is currently in use and might try to attack it. As stated above, this information would most likely not be picked up by automated scans if you used some random high port.
I can’t help much regarding the service denial issue.
However, port 22 should never be open to the outside world. Limiting logins to key authentication is a really good first step.
To avoid automated scans you should also change the port to a higher number, maybe something above 10,000.
This saves both traffic and CPU, and if a security bug in sshd ever turns up, it helps there too.
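As a sketch, the relevant sshd_config lines could look like this (the port number is just an illustrative pick, use any free high port):

```
# /etc/ssh/sshd_config (excerpt, illustrative values)
Port 22222                   # anything above 10,000 that is otherwise unused
PasswordAuthentication no    # key authentication only
PermitRootLogin no
```

Remember to update any firewall rules and test a connection on the new port before closing your current session.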
Dedup, incremental backups, backup verification
Cronjob with rsync command to copy file contents
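For illustration, such a cronjob might look like this (host name and paths are placeholders, not from my actual setup):

```
# /etc/cron.d/backup-pull -- nightly at 03:00, illustrative only
0 3 * * * root rsync -a --delete node2:/path/to/backups/ /mnt/backup/node2/
```

-a preserves permissions and timestamps; --delete mirrors removals so the copy stays exact (be careful with it if you want independent retention on the target).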
On the Proxmox hosts alongside PVE. Then have them pull each other’s backup storage (via VPN/SSH tunnel).
The RPi could pull both storages as a second backup, e.g. via rsync. There is a PVE port for the RPi, maybe also one for PBS.
In my area there is a dog that has been fed a vegan diet for almost ten years now, so they definitely can survive on it. Said dog is healthy.