

This was happening a lot yesterday but I eventually got through to opt my machine in.
As a relatively new X1C owner and a noob to the hobby, thank you for sharing this. I installed it a few hours ago and it seems to be working basically just as well as bambu’s firmware as far as I can tell.
I’m going to be going with X1Plus and OrcaSlicer from here on out, because this shit from bambu lab ain’t cool.
I just got done reading about the benchy controversy and now I see that someone turned out an alternative model less than 24h ago with a master class troll of a name. What an absolute legend.
This is basically what I’ve been telling people for years. Prototype in Python to get the concepts down, then when you’re serious about the project, write it in a serious language.
Just wear hundreds of thousands of them glued together, problem solved.
On a more realistic note though, the applications of this will probably be industrial for a good while. I found it interesting how the article mentions that they were able to develop a diamond coating over their growth substrate. That probably has some cool applications in industrial settings where diamond-plated materials are used.
Again, in many instances, folks training models are using repositories of images that have been publicly shared. In many cases the person/people who assembled the image repositories are not the same person using them. I agree that reckless scraping is not responsible, but if you’re using a repository of images that’s presented as ok to use for AI training, I’d argue it’s even more ethical to strip out the Nightshaded images, because clearly the presence of Nightshade means you shouldn’t use that one. I guess we’re just going to have to agree to disagree here, because I see this as a helpful tool to specifically avoid training on images you shouldn’t be.
I don’t think most people are collecting images by hand and saying “ah yes I’m just gonna yoink this and use it in my model”. There are a plethora of sites for sharing repositories of training data, and therefore it’s pretty easy for someone training a model to unknowingly pull down some data they don’t actually have permission to use. It’s completely infeasible to check licensing by hand on what could be millions of images, so this tool makes it easy to simply not train on images that have gone through Nightshade. I fail to see how that’s unethical, as not training on the image is the whole reason the original image was put through Nightshade in the first place.
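The workflow described above (run a detector over a downloaded dataset, keep only the images that don’t appear to be Nightshaded) could be sketched roughly like this. Note this is just an illustration: `looks_nightshaded()` is a hypothetical stand-in for whatever detection the real tool performs, and the actual project’s API will differ.

```python
import shutil
from pathlib import Path

def looks_nightshaded(image_path: Path) -> bool:
    """Hypothetical detector stand-in. A real implementation would run
    a detection model on the image contents; here we just flag files
    whose name is tagged, so the sketch stays runnable."""
    return "nightshade" in image_path.stem.lower()

def filter_dataset(src: Path, dst: Path) -> int:
    """Copy only the images that do NOT appear to be Nightshaded,
    returning how many were kept."""
    dst.mkdir(parents=True, exist_ok=True)
    kept = 0
    for img in src.glob("*.png"):
        if not looks_nightshaded(img):
            shutil.copy(img, dst / img.name)
            kept += 1
    return kept
```

The point of structuring it as a filter is that the flagged images are simply excluded from training, which is the outcome the original artist was asking for.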
The tagline is really poorly written IMO. From reading the README, this doesn’t outwardly appear to be a tool for bypassing an artist’s choice to use something like Nightshade, but rather it seems to detect if such a tool has been used.
I’m assuming that the use case would be to avoid training on Nightshade-ed images, which would actually be respecting the original artist’s decision?
They moved too quickly and the backlash was too intense. They will 100% try to push this shit again as soon as they think the market/userbase might bear it.
I think they all look pretty cool and have their own distinctive aesthetic while being thematically consistent (e.g. they look like they’re part of the same collection). That being said, I do definitely get “logo” vibes from quite a few, like you could have told me they were sports team logos and I’d believe it. Also Akita straight up looks like a Nike swoosh lol.
Might’ve been this one? I just searched for “Precise electric screwdriver” and found one that closely matches your picture.