
  • Apparently you can save it to Google Drive, then download the Google Drive desktop program and make that folder available offline so it downloads everything to the computer.

    1. When you set up the Google Takeout export, choose “Save in a Google Drive folder”

    2. Install the Google Drive PC client (Drive for desktop)

    3. It will create a new drive (e.g. G:) in your file explorer. Right-click on the Takeout folder and select “Make available offline”. All files in that folder will be downloaded by Google Drive for desktop in the background, and you will be able to copy them to another location, since they will be local files.



  • I’m using a commercial desktop with an i5 Sandy Bridge. I maxed out at 32 GB of RAM only because I’m running TrueNAS, Debian with containers, and Home Assistant. Most RAM goes to TrueNAS, and TrueNAS doesn’t accurately report RAM usage. For CPU, I’m mostly just task limited, but I don’t really think that’s a Proxmox issue. Obviously it’s not going to support an enterprise or even a small business, but it works for what I need: fewer than 4 users on my budget.

    Proxmox doesn’t really ask for much, but I would probably recommend Docker for your ARM devices.
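
    As a rough sketch only (assuming a Raspberry Pi-class ARM board and that you want something like Home Assistant on it; the config path is a placeholder), a minimal Docker invocation could look like:

    ```
    # Minimal sketch: Home Assistant in Docker on an ARM board (e.g. a Pi).
    # Adjust the config path and timezone for your setup.
    docker run -d \
      --name homeassistant \
      --restart unless-stopped \
      --network host \
      -e TZ=Etc/UTC \
      -v /opt/homeassistant:/config \
      ghcr.io/home-assistant/home-assistant:stable
    ```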





  • I keep everything behind a VPN so I don’t have to worry much about opening things up to the Internet. It’s not so much about the fact that you’re probably fine, but more about what the risk to you is if that device is compromised (e.g. a NAS with important documents), and what that device can access if it is infected.

    You could expose your media server and not worry too much about that device, but having it in a “demilitarized zone”, ensuring all your firewall rules are correct, and keeping that service always updated is more difficult than running one VPN that is designed to be secure from the ground up.
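
    Purely as an illustration (assuming a Linux host running something like Jellyfin on its default port 8096, and that you use ufw; the port and subnet are placeholders), locking an exposed media box down might look roughly like:

    ```
    # Deny everything inbound by default, then open only what is needed.
    sudo ufw default deny incoming
    sudo ufw default allow outgoing

    # Media server port exposed to the internet (8096 is Jellyfin's default HTTP port).
    sudo ufw allow 8096/tcp

    # SSH reachable only from the LAN (placeholder subnet).
    sudo ufw allow from 192.168.1.0/24 to any port 22 proto tcp

    sudo ufw enable
    ```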



  • I steered away from replacing my router with a PC and got an ER-X, then virtualized everything else, including TrueNAS, on an old office PC. Having PCIe slots helps a ton with stability when virtualizing, and my setup has 64 GB of DDR3, which was cheap.

    Ubiquiti APs are typically the homelab standard and work great, especially with multiple APs. You can start by switching your existing router to AP mode and replace it with dedicated APs later.

    For stability, you can create a “test network” on the ER-X. This is an incredibly useful unofficial guide to setting up the ER-X with multiple LAN networks, APs, and more. Then create redundancy with Docker containers on a Pi (put a DNS server on the Proxmox system and a second one on the Pi, so that if one goes down, DNS still works).
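
    As a loose sketch of that DNS redundancy (assuming Pi-hole in Docker on both hosts; the password, timezone, and web port are placeholders, and the env var names follow older Pi-hole container docs, so check the image README for current ones), each instance could be started with:

    ```
    # Run one Pi-hole on the Proxmox box (VM/LXC) and an identical one on the Pi.
    docker run -d \
      --name pihole \
      --restart unless-stopped \
      -p 53:53/tcp -p 53:53/udp \
      -p 8080:80/tcp \
      -e TZ=Etc/UTC \
      -e WEBPASSWORD=changeme \
      -v pihole_etc:/etc/pihole \
      pihole/pihole:latest

    # Then hand out both servers' IPs as DNS in your DHCP settings so either one can fail.
    ```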

    For your Home Assistant question, do the built-in backups or copying the data folder not meet your needs?



  • On Android, and I believe iOS, it’s a single VPN connection at a time. I would start with the basic functionality (also, don’t create a Tailscale account with GitHub, because it does weird things with sharing if you ever want to have multiple users).

    Once you’ve got the VPN and storage working, I can think of two options to give you the functionality of 2 VPNs:

    1. Tasker is an Android app that can let you automate a lot. It might let you switch VPNs when opening, say, your storage app, and switch back, a bit more easily than toggling it in settings.
    2. Set up your lab server at home with an outgoing public VPN so traffic goes mobile device > Tailscale > public VPN, essentially acting like you’re at home using your public VPN. This may take some tinkering to work properly, especially when you’re at home on the same network. Plus, you would definitely see a network speed impact on your phone.
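
    For option 2, a minimal sketch (assuming the home server runs Linux with Tailscale installed, and that your public VPN provider gives you a WireGuard config; the mullvad.conf name is just a placeholder) might look like:

    ```
    # Enable forwarding so the server can route other devices' traffic.
    echo 'net.ipv4.ip_forward = 1' | sudo tee -a /etc/sysctl.d/99-tailscale.conf
    echo 'net.ipv6.conf.all.forwarding = 1' | sudo tee -a /etc/sysctl.d/99-tailscale.conf
    sudo sysctl -p /etc/sysctl.d/99-tailscale.conf

    # Bring up the outgoing public VPN (placeholder provider config at /etc/wireguard/mullvad.conf).
    sudo wg-quick up mullvad

    # Advertise this machine as an exit node on the tailnet, then approve it in the
    # Tailscale admin console and select it as the exit node on the phone.
    sudo tailscale up --advertise-exit-node
    ```

    Getting the exit-node traffic to actually leave via the provider tunnel may need extra routing rules, which is part of the tinkering mentioned above.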



  • I set up OpenVPN on my network originally, plus DuckDNS on a dynamic IP, in 2021/2022. It’s an “older” protocol, but I felt it was easier to set up since it’s been around longer and the tools just make it easy.
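
    For reference, keeping DuckDNS pointed at a dynamic IP is usually just a cron job hitting their update URL (the domain and token below are placeholders):

    ```
    # crontab -e entry: refresh the DuckDNS record every 5 minutes.
    */5 * * * * curl -fsS "https://www.duckdns.org/update?domains=YOURDOMAIN&token=YOURTOKEN&ip=" >/dev/null 2>&1
    ```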

    WireGuard has speed advantages but, being newer, takes more work to see those speed advantages. There’s a Docker container called wg-easy that I’ve heard mixed things about (speed in a Docker container vs. ease of setup).
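
    If you want to try it, a rough wg-easy sketch based on the project’s README (hostname and password are placeholders; check their docs, since the env var names have changed between versions) looks something like:

    ```
    docker run -d \
      --name wg-easy \
      -e WG_HOST=vpn.example.com \
      -e PASSWORD=changeme \
      -v ~/.wg-easy:/etc/wireguard \
      -p 51820:51820/udp \
      -p 51821:51821/tcp \
      --cap-add NET_ADMIN \
      --cap-add SYS_MODULE \
      --sysctl net.ipv4.ip_forward=1 \
      --sysctl net.ipv4.conf.all.src_valid_mark=1 \
      --restart unless-stopped \
      ghcr.io/wg-easy/wg-easy
    ```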

    I used Tailscale when I rebuilt my VPN server, because I was originally using Oracle Linux (I wanted to learn it more, but went back to Ubuntu).

    If you can get certificates working, WireGuard shouldn’t be too difficult. I prefer a VPN over exposing multiple ports/protocols for a family or small userbase. If you’re sharing libraries or other services with extended family, I’d probably expose those to the Internet and work on hardening, keeping that server in a demilitarized zone, plus certificate-based authentication and MFA on any public admin accounts.
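
    For what it’s worth, WireGuard’s “certificates” are really just key pairs; a bare-bones sketch of generating a peer key and a client config (all keys, addresses, and the endpoint below are placeholders) would be:

    ```
    # Generate a key pair for a new peer with the standard wg tooling.
    wg genkey | tee peer_private.key | wg pubkey > peer_public.key

    # Minimal client config sketch (placeholders throughout).
    cat > wg0.conf <<'EOF'
    [Interface]
    PrivateKey = <contents of peer_private.key>
    Address = 10.8.0.2/24
    DNS = 10.8.0.1

    [Peer]
    PublicKey = <server public key>
    Endpoint = vpn.example.com:51820
    AllowedIPs = 0.0.0.0/0
    EOF
    ```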