• 0 Posts
  • 35 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • They have a secondary motherboard that hosts the slot-mounted CPUs: four single-core Pentium III Xeons. I also have the equivalent Dell model, but it has a bum mainboard.

    With those 90’s systems, to get Windows NT to use more than one CPU, you had to install the appropriate Windows version that actually supported multiple processors.

    Now you can simply upgrade from a 1-core to a 32-core CPU and Windows or Linux will pick up the difference and run with it.

    In the NT 3.5 and 4 days, you actually had to either do a full reinstall or swap out several parts of the kernel (the uniprocessor and multiprocessor kernel and HAL were separate binaries) to get it to work.

    Downgrading took the same effort, as a multiprocessor Windows kernel ran really badly on a single-CPU system.

    As for the Sun Fires, the two models I mentioned tend to be readily available on eBay in the 100-200 range, and they are very different inside from an x86 system. You can go for the 400 series or higher to get even more difference, but getting a complete one of those can be a challenge.

    And yes, the software used on some of these older systems was a challenge in itself, but it isn’t really special; it’s much like having different vendors’ RGB-controller software on your system, a nuisance you should try to get past.

    For instance, the IBM ServeRAID 5000-series RAID cards were simply LSI cards with an IBM-branded firmware.

    The first thing most people do is crossflash them to the stock LSI firmware so they run decently.


  • Oh, I get it. But a baseline HP ProLiant from that era is just an x86 system, barely different from a desktop today but worse, slower and more power-hungry in every respect.

    For history and “how things changed”, go for something like a Sun Fire system from the mid-2000’s (the 280R or V240 are relatively easy and cheap to get, and are actually different) or a ProLiant from the mid-to-late 90’s (I have a functioning Compaq ProLiant 7000, which is HUGE and a puzzle box inside).

    x86 computers haven’t changed much at all in the past 20 years; you need to go into the rarer models (like blade systems) to see an actual deviation from the basic PC-like form factor we’ve been using all that time, with unique approaches to storage and performance.

    For self-hosting, just use something more recent that falls within your price class (5-6 year old hardware is usually highly affordable). Even a Pi is going to trounce a system that old, and it actually has a different form factor.






  • Even as far back as XP/Vista, Microsoft has wanted to run the file system as more of an adaptive database (the WinFS project) than a classical hierarchical file system.

    The leaked betas for Vista had this included, and it ran like absolute shit, mostly because hard drives were slow and RAM was at a premium, especially in Vista, as it was such a bloated piece of shit.

    NTFS has since evolved to include more and more of these “smart” file system components.

    Now they want to go full-on with this “smart” approach to the file system.

    It’ll still be slow and shit, just like it was 2 decades ago.





  • For now.

    Tech companies repeatedly float shit people don’t want, to see if the reaction is mild enough to actually go through with it.

    Then they either wait until it is, or mull over ways to sell this as a good idea to consumers.

    It was only 5 years ago that TotalBiscuit / John Bain was still railing against the initial spread of microtransactions and the DLC fragmentation of games.

    And now they are utterly and completely ubiquitous.


  • Because pirated versions will be running a volume license (VLK) while there is no volume-licensing agreement on file, or will run KMS-emulation software to fake license activation.

    Or in some cases they just run plain unlicensed, and Windows will tell you on the desktop itself that the copy is unlicensed.

    If inspected, you have to prove you have the correct licenses.

    In some cases you’ll be allowed to just buy the licenses there and then, but if you’ve been running dozens of unlicensed copies, or dozens of straight-up illegal copies (with faked/cracked/stolen licenses), they’ll put the hammer down and audit you in detail, to the point that they’ll end up billing AND fining you for every piece of software you’ve used in your entire history.


  • I was once hired at a company to get them ISO compliant (9001, 27001 and various other certifications specific to data storage and handling for banks and healthcare).

    First thing I did was run an inventory of all hardware and software, and it quickly became clear they were running fifty-something unlicensed Windows and Office copies, 3 unlicensed Windows Server copies, 2 unlicensed Exchange copies, a whole bunch of unlicensed WinZip copies, and on and on and on.

    Typical for small to mid-sized businesses.

    You absolutely need to get your licensing in order if you want to get those certifications, especially the banking and healthcare data ones.

    I made them a list of everything we’d have to acquire to be in order with that part.

    They refused. They refused to the point of telling me “it’s not working out and we’re letting you go”.

    So, yeah, that’s how you get Microsoft to hear about a company running a couple hundred unlicensed products :)

    They never got their ISO certs and downsized considerably a year or two later.



  • When Netflix and YouTube Premium came along, I switched from pirating literally everything to finding that I had access to everything I wanted to watch for 15-30€ a month combined. Cheaper than a TV subscription costs here (and a TV sub here didn’t have the shows I wanted to watch).

    Then the whole streaming market fragmented with every jackass on earth starting their own and removing a buttload of content from Netflix and YouTube.

    The result: if I wanted to watch everything I want to watch, I’d be paying north of 150€ a month.

    So I pulled my wooden leg and dead parrot out of my closet and resumed pirating.

    Yarrrrr!






  • Email providers of every size don’t just blanket-block unknown servers; that’s just asking for problems and loads of additional work.

    They block known problems and detect likely problems.

    Tools like ASSP (the spam filter I’ve used for a long-ass time and used to install anywhere corporate filters weren’t in the budget) use advanced heuristics in combination with every form of blacklist/whitelist/greylist filtering you can think of (at both the DNS and SMTP levels), looking at the contents of the mail in combination with how “normal” the DNS registration and the responses of the mail server are. Add to that the default check that an @microsoft.com email actually comes from a known Microsoft server.

    There are scores of public whitelists and blacklists, generated by spam filters: delivering mail correctly puts a source on whitelists, and getting caught sending spam puts it on blacklists. These lists have been around for decades by now and are constantly updated (mostly automatically).
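    The DNS-level blacklist lookups mentioned above are simple in mechanism: reverse the sending IP’s octets and query them as a hostname under the blacklist’s zone. A minimal sketch (using Spamhaus’s public zen zone as the example; ASSP’s real implementation checks many zones and combines the results with its heuristics):

    ```python
    import socket

    def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
        """Build the DNS name a DNSBL lookup queries: the IPv4 octets
        reversed, prepended to the blacklist zone."""
        return ".".join(reversed(ip.split("."))) + "." + zone

    def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
        """A listed IP resolves (typically to a 127.0.0.x result code);
        an unlisted one gets NXDOMAIN, surfaced here as socket.gaierror."""
        try:
            socket.gethostbyname(dnsbl_query_name(ip, zone))
            return True
        except socket.gaierror:
            return False

    print(dnsbl_query_name("203.0.113.7"))  # 7.113.0.203.zen.spamhaus.org
    ```

    A filter runs this for several zones per incoming connection and weighs the answers as one signal among many, rather than rejecting outright on a single hit.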

    You don’t do email security and spam filtering by being an ass to everyone you don’t explicitly know. You do it by looking for suspicious signs and acting on user feedback. Blocking by default is a far bigger headache than letting your tools do their work and then stepping in manually when they miss something.
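    The “does this @microsoft.com mail actually come from a Microsoft server” check is what SPF formalizes: the domain publishes a TXT record listing the servers allowed to send its mail, and receivers compare the connecting IP against it. A toy sketch of the matching step, assuming the TXT record has already been fetched (it only handles ip4: mechanisms; a real resolver also follows include:, a:, mx: and so on):

    ```python
    import ipaddress

    def spf_mechanisms(txt_record: str) -> list[str]:
        """Split an SPF TXT record into its mechanisms, dropping the
        v=spf1 version tag."""
        parts = txt_record.split()
        if not parts or parts[0].lower() != "v=spf1":
            raise ValueError("not an SPF record")
        return parts[1:]

    def spf_permits(sender_ip: str, txt_record: str) -> bool:
        """Check sender_ip against the record's ip4: mechanisms only."""
        ip = ipaddress.ip_address(sender_ip)
        for mech in spf_mechanisms(txt_record):
            if mech.startswith("ip4:"):
                if ip in ipaddress.ip_network(mech[4:], strict=False):
                    return True
        return False

    record = "v=spf1 ip4:192.0.2.0/24 include:_spf.example.com ~all"
    print(spf_permits("192.0.2.55", record))    # True
    print(spf_permits("198.51.100.1", record))  # False
    ```

    A failed match doesn’t have to mean rejection; like the blacklist results, it’s a score the heuristics engine folds in.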

    Google goes one step further and outright receives ALL mail, including spam, and just puts what is detected as spam in a spam folder.

    The first company I got to that had no spam filtering deployed at all went from 3 million emails received per day to just over 50K. Most people in that company ran a (pirated) Outlook plugin that did desktop-level spam filtering, still had to manually filter more than 90% of the mail they received, and then deleted their spam folder every week or so.

    After I installed ASSP there, as I said, it went down to receiving only 50K emails per day, of which about 30K were still spam. After 2 weeks it was down to 20K (a combination of me using the reporting tools on mail that landed in my own mailbox, and the spam filter’s heuristics engine getting smarter by learning from the spam it received), and then I held a meeting with the whole company to teach them how to report spam (and whitelist known senders and false positives).

    A month or two into the deployment, people were used to using the reporting button and were down to receiving maybe 1 or 2 spam emails per day (often still flagged as questionable, but not as definite spam), usually because those senders were completely new to the system.

    That’s because spam outfits are detected relatively quickly, so they constantly have to change IPs, domains and methods, and because of that they perpetually sit on greylists, which filters scrutinize more heavily.

    A domain like mine, which has been sending and receiving email for decades, mostly to completely official destinations like banks, corporate clients, governments and other established parties, without ever even hinting at sending spam, will rarely have any issue delivering its mail, as it is already known to the blacklist/whitelist generators as a good sender.