• 0 Posts
  • 219 Comments
Joined 1 year ago
Cake day: August 18th, 2023

  • You’re not wrong, but not everything needs to scale to 200+ servers (…arguably almost nothing does), and I’ve actually seen middle managers assume that a product needs that kind of scale when in fact the product was fundamentally not targeting a large enough market for that.

    Similarly, not everything needs certifications, but of course if you do need them there’s absolutely no getting around it.




  • “But how does the alternative solutions compare with regards to maintainability?”

    Which alternative solutions are you thinking of, and have you tried them?

    Rust has been mentioned several times in the thread already, but Go also prohibits “standard” OOP in the sense that structs don’t have inheritance. So have you used either Rust or Go on a large project?
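    For what it’s worth, the “no inheritance” point is easy to sketch. Here’s a minimal Rust example (the types are my own invention, not from any real codebase): there is no way to write “Car extends Engine”; you compose structs and share behavior through traits instead.

    ```rust
    struct Engine {
        horsepower: u32,
    }

    struct Car {
        engine: Engine, // composition, not inheritance
        wheels: u8,
    }

    // Shared behavior lives in a trait, which any struct can implement.
    trait Describe {
        fn describe(&self) -> String;
    }

    impl Describe for Car {
        fn describe(&self) -> String {
            format!("{} hp, {} wheels", self.engine.horsepower, self.wheels)
        }
    }

    fn main() {
        let car = Car {
            engine: Engine { horsepower: 120 },
            wheels: 4,
        };
        println!("{}", car.describe());
    }
    ```

    Go expresses the same idea with struct embedding plus interfaces; in both languages the hierarchy-free design pushes you toward composition.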








  • I haven’t told you to keep calm. I’m just confused about you repeating the same points, in the same words, over and over, even after being told that you don’t have your facts correct.

    I’m not saying you can’t learn or talk about other languages; I’m confused by the mismatch between your posts criticizing people for promoting newer tech stacks and the ones where you seem to be promoting newer tech stacks yourself.

    25 years of experience is certainly enough to have strong opinions, but until your last comment I had the impression that you had a year or less of experience in C, hence my question.







  • “The education system (universities, colleges, courses) uses the ‘modern’ development stack.”

    Hahahahahaha!

    “Only a very few colleges and courses specializing in a very narrow field, such as embedded devices, can teach you the C language.”

    *snort* BWAHAHAHAHA!

    “the ‘dying C’”

    [wheezing]

    “And by doing this they are trying to hide the C language.”

    [incredulous snort]

    “And the community is kind”

    [wistful sigh] I truly wonder what it would be like not to know anything about Linus Torvalds. I sometimes wish I didn’t know about Richard Stallman!

    “And that it is unlikely that C will be able to replace anything in the near future.”

    I’m sure you wrote this backwards.


  • Why do you keep posting this exact same rant? I see that some posts are in different Lemmy communities and you’ve posted it at least once on Hacker News, but you also posted it to this same community already (https://snac.bsd.cafe/modev/p/1727338529.193499) and, although I can’t find it now, I remember you posting it months ago, too.

    Several of your posts that aren’t about how C is being “suppressed” (which the responses to your post have repeatedly demonstrated isn’t true) are about how you, personally, are still learning C and want more resources to learn it. And now you’re also posting about Nelua and Nim. This is wild to me! Why do you have such strong opinions about a language that you’re still learning? If you’re that passionate about C and believe that people should use it instead of newer languages, why do you care about Nim or Nelua? If you’re just trolling, why do you engage relatively patiently in the comments? And whatever your goal is, why do you keep reposting the same rants, especially this one that’s now quite old?


  • On the one hand, you’re right, C is waaaay higher-level than many people realize, and the compiler and processor do wild things to make code go faster. On the other hand, the C abstract machine is close enough to how computers “really work” to give you a fairly useful mental model, in a way that no other mainstream high-level language can.

    Even so, if you want to know how low-level code works, you should probably just learn one or more actual assembly languages and write a few small programs that way.

    C has another advantage, though: firmware, OS kernels, and virtual machines (other than browser JS engines) are still almost entirely written in C. So while it doesn’t teach you accurately how processors work, it is relevant if you want to know about the system software that mediates between the hardware and high-level software.


  • [warning: “annoying Rust guy” comment incoming]

    I don’t think Rust is perfect, but arguably I do “idolize” it, because I genuinely think it’s notably better both in design and in practice than every other language I’ve used. This includes:

    • C
    • C++
    • Java
    • C#
    • Kotlin
    • Scala
    • Python
    • Ruby
    • JavaScript (…I’ve barely used this, but I doubt my opinion would change on this one)
    • Perl
    • Go
    • Bash (…look, I’ve had to write actual nontrivial scripts with loops and functions, so yes, Bash is a real language; it just sucks)
    • Tcl/Tk (if you don’t know, don’t ask)
    • Common Lisp (…again, I’ve barely used this, and I wish I had more experience with this and other Lisps)

    In a literal sense, I agree that all (practical) languages “are flawed.” And there are things I appreciate about all of the above languages (…except Tcl/Tk), even if I don’t “like” the language overall. But I sincerely believe that statements like “all languages are flawed” and “use the best tool for the job” tend to imply that all (modern, mainstream) languages are equally flawed, just in different ways, which is absolutely not true. And in particular, it used to be true that all languages made tradeoffs between a fairly static, global set of binary criteria:

    • safety/correctness versus “power” (i.e. low-level system control)
    • safety/correctness versus run-time efficiency (both parallelism and high single-thread performance)
    • ease-of-use/ease-of-learning versus “power” and runtime-efficiency
    • implementation simplicity versus feature-richness
    • build-time versus run-time efficiency
    • type-safety versus runtime flexibility

    Looking at these, it’s pretty easy to see where most of the languages in my list above fall on each side of each of these criteria. What’s special about Rust is that the core language design enables a relatively novel set of tradeoffs, allowing it to choose “both” for the first two criteria (though certainly not the latter three; the “ease-of-use” one is debatable) at the expense of higher implementation complexity and a steeper learning curve.

    The great thing about this isn’t that Rust has “solved” the problem of language tradeoffs. It’s that Rust has broadened the space of available tradeoffs. The assumption that safety necessarily comes at a runtime cost was so pervasive prior to Rust that some engineers still believe it. But now, Rust has proven, empirically, that this is not the case! And so my ultimate hope for Rust isn’t that it becomes ubiquitous; it’s that it inspires even better languages, or at least, more languages that use concepts Rust has brought to the mainstream (such as sum-types) as a means to explore new design tradeoff spaces. (The standard example here is a language with a lightweight garbage-collecting runtime that also has traits, sum-types, and correct-by-default parallelism.)
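    To make the sum-types point concrete, here’s a minimal Rust sketch (the `Shape` type is my own invented example): a value is exactly one of the listed variants, and the compiler rejects any `match` that forgets one.

    ```rust
    // A sum type: Shape is either a Circle or a Rect, never both, never neither.
    enum Shape {
        Circle { radius: f64 },
        Rect { width: f64, height: f64 },
    }

    fn area(shape: &Shape) -> f64 {
        // Omitting a variant here is a compile error, not a runtime surprise.
        match shape {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { width, height } => width * height,
        }
    }

    fn main() {
        let shapes = [
            Shape::Circle { radius: 1.0 },
            Shape::Rect { width: 2.0, height: 3.0 },
        ];
        let total: f64 = shapes.iter().map(area).sum();
        println!("total area = {total}");
    }
    ```

    Languages like OCaml and Haskell had sum types decades earlier, of course; Rust’s contribution was dragging them into a mainstream systems language.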

    There are other languages that, based on what I know about them, might inspire the same type of enthusiasm if I were to actually use them more:

    • Erlang
    • Gleam
    • OCaml
    • Swift

    …but, with the exception of Swift, these are all effectively “niche” languages. One notable thing about Rust is that its adoption has actually been rather astounding, by systems language standards. (Note that D and Ada never even got close to Rust’s popularity.)