I'm currently in a very interdisciplinary field where the mathematicians have their own language, which has to be translated first into software, then into hardware. It's so confusing at first, until you spend 30 minutes on Wikipedia and realize they're just using an esoteric term for something you've used forever.
Yeah, this happens a lot. I studied math, and I often got the impression that when you read other researchers' work, they describe the exact same thing you've already heard about, but in vastly different language. I wonder how many re-inventions and re-namings of any given concept exist simply because people couldn't figure out that the thing had already been researched. It really happens a lot: 5 people discover something and give it 5 different names.
It's even worse: math uses arcane terms for things that in many other fields are just taken for granted.
Galois fields? In hardware and software, those are just normal binary unsigned integers of a given bit length.
I get that GFs came about first, but when they were later implemented in computers they weren't usually called Galois fields (they sometimes are, mostly for carry-less multiplication specifically, or when used for cryptography); the behavior was just accepted as the default for digital logic.
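To make the carry-less multiplication connection concrete, here's a minimal sketch (my own illustrative helper, not from any particular library): multiplying in GF(2)[x] is the same shift-and-add loop as ordinary binary multiplication, except the partial products are combined with XOR instead of addition, so no carries propagate. This is what instructions like x86's PCLMULQDQ compute in hardware.

```python
def clmul(a: int, b: int) -> int:
    """Carry-less multiply: XOR shifted copies of a instead of adding them.

    Equivalent to multiplying polynomials over GF(2), with each integer's
    bits read as polynomial coefficients.
    """
    result = 0
    while b:
        if b & 1:
            result ^= a  # XOR instead of +, so no carry propagation
        a <<= 1
        b >>= 1
    return result

# Ordinary 3 * 3 = 9, but carry-less: 0b11 clmul 0b11 = 0b101 = 5,
# i.e. (x + 1)^2 = x^2 + 1 over GF(2).
print(clmul(0b11, 0b11))  # 5
```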
The division operator of a Galois field (I prefer "finite field", because it's more descriptive) is nothing like what computers usually use for unsigned integers. For example, if you're working mod 5, then 3/2 = 4 (because 2 * 4 = 8 = 3 mod 5).
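The mod-5 example above can be checked with a tiny brute-force sketch (a hypothetical helper of my own, purely for illustration): division in GF(p) means finding the x with b * x ≡ a (mod p), which is very different from truncating integer division.

```python
def gf_div(a: int, b: int, p: int) -> int:
    """Division in GF(p): find x such that b * x ≡ a (mod p).

    Brute-force search, fine for small prime p; real implementations
    would use the extended Euclidean algorithm or Fermat's little theorem.
    """
    for x in range(p):
        if (b * x) % p == a % p:
            return x
    raise ZeroDivisionError("b has no inverse mod p")

print(gf_div(3, 2, 5))  # 4, since 2 * 4 = 8 ≡ 3 (mod 5)
print(3 // 2)           # 1, ordinary truncating integer division
```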