• 1 Post
  • 80 Comments
Joined 2 years ago
Cake day: July 14th, 2023

  • A skilled, paid engineer who knows how the system works, and who doesn’t think that sort of change is important, will know that if success is judged solely by “does it work?” then the effort is doomed to fail. Such an engineer will push to have the requirements written clearly and explicitly - “how does it function?” rather than “what are the results?” - which means that unless the person writing the requirements actually understands the solution, the requirements will end up written such that even if the solution is defeated instantly, it still counts as a success. It met the specifications, after all.


  • I primarily use Standard Notes. It’s a fantastic tool and I can use it anywhere, online or offline. It’s not great for collaboration, though, and it doesn’t have a canvas option. But I use it for scratch pads, todo lists, project tracking, ideas, plans, plotting for my tabletop (Monster of the Week) game, software design and architecture, drafting comments, etc…

    Standard Notes also has a ton of options for automated backups. I get a daily email with a backup of my notes; I can host my notes on my home server and the corporate one; I can also set up automated backups on any desktop.

    I don’t use it for saving links. I’m still using Raindrop.io for that, even though I’m self-hosting both Linkding and Linkwarden.

    For sharing and collaboration, I either publish to Listed with Standard Notes or use Hedgedoc, which is great for collaboration and does a great job presenting notes, too.

    For canvas notes, I use GoodNotes on a tablet or the Onyx Boox’s default Notes app. I’d love a better FOSS, self-hosted option, especially for the Boox, but my experiences thus far have been negative (especially on the Boox).

    I’ve been trying out SilverBullet lately, since I want to try out cross-note querying and all that, but I’m too stuck in my habits and keep going back to Standard Notes. I think I’ll have better luck if I choose one app and go with it.

    I also have a collection of Mnemosyne notebooks that I use with fountain pens (mostly the Lamy 2000, but also quite commonly a Platinum 3776 or a TWSBI). Side note: the Lamy 2000 was my first fountain pen, and after getting it I went deep into fountain pens. I explored a ton of different options and found a lot of nice pens across a number of brands… and yet I still haven’t found something that I consistently like more. The Pilot VP is great but deceptive; a fancy clicky pen that only holds 30 minutes of ink (in a converter, at least) is decidedly inconvenient.

    I’ve also been checking out Obsidian on my work computer. So far I haven’t seen anything that makes me prefer it over my existing set of tools.


  • Hedgedoc is fantastic. If you’re okay with your notes app being web-only (without an app or even a PWA) and you don’t need canvas notes or multi-note queries, you should check it out.

    First, every note is Markdown, but it supports a ton of things natively. It has native Vim, Emacs, and Sublime (the default) editors and it’s built to be great for collaboration (if you want).

    It also has

    • syntax highlighting for a ton of languages
    • Mermaid.js support
    • LaTeX support
    • easy drag and drop image uploads
    • a solid mobile interface (for a webapp in your browser, at least)
    • built-in revision history
    • support for other diagram tools, like Graphviz and flowchart.js
    • a bunch of other little Markdown enhancements that make using it feel oddly intuitive

    And best of all, they have a Hedgehog for the icon! (I may be biased.)
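
    To give a feel for it, here’s a rough sketch of a single note mixing several of those features (worth double-checking the exact math and diagram syntax against the Hedgedoc docs):

    ````markdown
    # Design notes

    Inline math like $O(n \log n)$ renders, and so do display blocks:

    $$E = mc^2$$

    ```mermaid
    graph LR
      Client --> Proxy --> App
    ```

    ```python
    def hello():
        print("rendered with syntax highlighting")
    ```
    ````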



  • Giphy has a documented API that you could use. There have been bulk downloaders, but I didn’t see any with recent activity. However, you might still be able to model your own script after one of them, like https://github.com/jcpsimmons/giphy-stacks
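
    If you do end up writing your own script, the basic loop against Giphy’s search endpoint is short. Here’s a minimal sketch, assuming the v1 search endpoint and response fields I remember from their docs (verify both, and mind the rate limits):

    ```python
    # Rough sketch of a bulk downloader against Giphy's documented search API.
    # Endpoint and response fields are from the v1 docs as I remember them.
    import os

    import requests

    API_KEY = os.environ["GIPHY_API_KEY"]  # your own key from the Giphy developer dashboard

    def download_search_results(query: str, limit: int = 25, out_dir: str = "gifs") -> None:
        os.makedirs(out_dir, exist_ok=True)
        resp = requests.get(
            "https://api.giphy.com/v1/gifs/search",
            params={"api_key": API_KEY, "q": query, "limit": limit},
            timeout=30,
        )
        resp.raise_for_status()
        for gif in resp.json()["data"]:
            url = gif["images"]["original"]["url"]
            with open(os.path.join(out_dir, f"{gif['id']}.gif"), "wb") as f:
                f.write(requests.get(url, timeout=60).content)

    download_search_results("hedgehog")
    ```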

    There were downloaders for Gfycat - gallery-dl supported it at one point - but the site is down now. However, you might be able to find collections that other people downloaded and are now hosting. You could also use the Internet Archive - they have documented tools and APIs.

    There’s a Tenor mass downloader that uses the Tenor API and an API key that you provide.

    Imgur hosts GIFs and is supported by gallery-dl, so that’s an option.

    Also, read over https://github.com/simon987/awesome-datahoarding - there may be something useful for you there.

    In terms of hosting, it would depend on my user base and whether I want users to be able to upload GIFs, too. If it were just my close friends, then Immich would probably be fine, but if people I didn’t know directly were using it, I’d want a more refined solution.

    There’s Gifable, which is pretty focused but looks like it has a fairly small following. I haven’t used it myself to see how suitable it is. If you self-host it (or something else that uses S3), note that you can use MinIO or LocalStack for the S3 container rather than using AWS directly. I’m using MinIO as part of my stack now, though for a completely different app.
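
    For reference, pointing boto3 at a local MinIO container instead of AWS is just a matter of overriding the endpoint. A minimal sketch - the port, bucket name, and credentials here are examples, use whatever you configured:

    ```python
    # Minimal sketch: use a local MinIO container as the S3 backend via boto3.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:9000",  # MinIO's default API port
        aws_access_key_id="minioadmin",        # example credentials only
        aws_secret_access_key="minioadmin",
    )

    s3.create_bucket(Bucket="gifs")
    s3.upload_file("hedgehog.gif", "gifs", "hedgehog.gif")
    print(s3.list_objects_v2(Bucket="gifs")["KeyCount"])
    ```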

    MediaCMS is another option. Less focused on GIFs but more actively developed, and intended to be used for this sort of purpose.





  • The rules text says it creates an area of darkness, and with your interpretation, it doesn’t, which means your interpretation is wrong. Yes, the ability could be written more clearly, but the logic for a reasonable way for it to function follows pretty cleanly. Your interpretation is not RAW or RAI.

    There’s a reply on RPG StackExchange that covers a similar line of logic to what I wrote above.

    Remember that Fifth Edition D&D is intentionally not written with the same exacting precision as games like M:tG. The game doesn’t have an explicit definition of magical darkness, but it’s pretty clear that the intent is for magical to trump mundane (when it comes to sources of light and darkness). Even the Specific Beats General section says that most of the exceptions to general rules are due to magic.


  • If you have normal darkness everywhere, there isn’t a reason to use it, but you don’t always have darkness everywhere. In fact, you generally don’t.

    Not all monsters with darkvision have access to light sources. Even if they do, they may need an action to use it or may be out of range. A torch or the light cantrip only has a 40’ range. If you collaborate on positioning with the caster, you can basically set yourself up to have advantage every turn thanks to the darkness, since as a ranged attacker you don’t have to stay within 40’ of your enemies.

    Also, Gloom Stalkers can’t see through Darkness like Warlocks can, so this effect is useful to them in a way that the Darkness spell isn’t.

    That all said, Tricksy wouldn’t do anything if it didn’t block nonmagical illumination, so it’s reasonable to run it as though it does. Sure, it still wouldn’t block even a cantrip, but it would block torches, lanterns, the sun, etc…

    And running it as though it doesn’t block nonmagical darkness results in nonsensical behavior. You’re in a torchlit chamber and use the ability - now there’s a cube of darkness, blocking the light of all four nonmagical torches. If you move one of those torches away and back, why would it suddenly pierce the magical darkness? If it wouldn’t, why would a new nonmagical light source?







  • I made a typo in my original question: I was afraid of taking the services offline, not online.

    Gotcha, that makes more sense.

    If you try to run the reverse proxy on the same server and port that an existing service is using (e.g., port 80), then you’ll run into issues. You could also run into conflicts with the ports the services themselves use, and likewise if you forward the same external port from your router to more than one service. But IME those issues will mostly just stop the new service from starting - you’d have to stop the existing services or restart your machine for the new service to have a chance to grab the ports while they were unused. Otherwise I can’t think of any issues.
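
    If you want to sanity-check for port conflicts before starting the proxy, a quick sketch like this works (the ports are just the typical ones a reverse proxy would bind):

    ```python
    # Check whether anything is already listening on the ports the proxy would use.
    import socket

    def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            return s.connect_ex((host, port)) == 0  # 0 means something answered

    for port in (80, 443):
        print(f"port {port}: {'in use' if port_in_use(port) else 'free'}")
    ```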


  • I’m afraid that when I install a reverse proxy, it’ll take my other stuff online and cause me various headaches that I’m not really in the headspace for at the moment.

    If you don’t configure your other services in the reverse proxy then you have nothing to worry about. I don’t know of any proxy that auto discovers services and routes to them by default. (Traefik does something like this with Docker services, but they need Docker labels and to be on the same Docker network as Traefik, and you’re the one configuring both of those things.)

    Are you running this on your local network? If so, then unless you forward a port on your router to the port your reverse proxy is serving from, it’ll only be accessible from the local network. That means you can either keep it that way (and VPN in to access it) or test it by connecting directly to your server on that port, confirming it works as expected before forwarding the port.


  • Paired with letting people who own the original upgrade for $10 (and I’m assuming something similar in the UK) while charging $50 for the remaster if you don’t have the original, that makes sense. They’re just closing a loophole.

    I’d much rather they double the existing game’s price than charge $25-$30 for the upgrade, or not offer an upgrade at all.

    It sucks for anyone who’d been planning to play the original and just hadn’t bought it yet, but used prices for discs should still be low, so only the subset of those people with disc-less machines is really impacted.


  • I don’t know that a newer drive cloner will necessarily be faster. Personally, if I’d successfully used the one I already have and wasn’t concerned about it having been damaged (mainly due to heat or moisture) then I would use it instead. If it might be damaged or had given me issues, I’d get a new one.

    After replacing all of the drives, you’ll need to tell the NAS to use their full capacity. From reading an answer to this post, it looks like you’ll need to select “Change RAID Mode,” keep RAID 1 selected, keep the same disks, and then on the next screen move the slider to use the drives’ full capacities.


  • upper capacity

    There may be an upper limit, but on Amazon there is a 72 TB version that would have to come with at least 18 TB drives. If 18 TB is fine, 20 TB is also probably fine, but I couldn’t find any reports by people saying they’d loaded 20 TB drives into theirs without issue.

    procedure

    You could also clone them yourself, but you’d want to put the NAS into read-only mode or take it offline first.

    I think cloning drives is generally faster than rebuilding them in RAID, as well as easier on the drives, but my personal experience with RAID is very limited.

    Basically, what I’d do is:

    1. Take the NAS offline or make it read-only.
    2. Pull drive 0 from the array
    3. Clone it
    4. Replace drive 0 with your clone
    5. Pull drive 2 (from the other mirrored pair) from the array
    6. Clone it
    7. Replace drive 2 with your clone
    8. Clone drive 0 again, then replace drive 1 with your clone
    9. Clone drive 2 again, then replace drive 3 with your clone
    10. Put the NAS back online or make it read-write again.

    In terms of timing… I have a Sabrent offline cloning hub (about $50 on Amazon), and it copies data at roughly 60 MB/s, meaning it’d take about 9 hours per clone. Startech makes a similar device ($96 on Amazon) that allegedly clones at 466 MB/s (28 GB per minute), meaning each clone would take a bit over an hour at the claimed rate… but people report it being just as slow as the Sabrent.
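
    If you want to estimate with your own numbers, the math is just capacity divided by sustained throughput. A quick sketch (the 2 TB capacity is only an example - plug in your actual drive size):

    ```python
    # Back-of-the-envelope clone time: capacity / sustained throughput.
    capacity_mb = 2_000_000   # ~2 TB source drive (example value)
    throughput_mb_s = 60      # roughly what my Sabrent dock manages
    hours = capacity_mb / throughput_mb_s / 3600
    print(f"~{hours:.1f} hours per clone")  # about 9.3 hours with these numbers
    ```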

    Also, if you bought two offline cloning devices, you could do steps 2-4 and 5-7 simultaneously, and likewise steps 8 and 9.

    I’m not sure how long it would take RAID to rebuild a pulled drive, but my understanding is that it’s going to be fastest with RAID 1. And if you don’t want to make the NAS read-only while you clone the drives, it’s probably your only option, anyway.