kelnos 20 hours ago
I'm kinda in the opposite camp. After doing a bunch of VB in my tweens and teens, I learned Java, C, and C++ in college, settling on mostly C for personal and professional projects. I became a core developer of Xfce and worked on that for 5 years.

Then I moved into backend development, where I was doing all Java, Scala, and Python. It was... dare I say... easy! Sure, these kinds of languages bring with them other problems, but I loved batteries-included standard libraries, build systems that could automatically fetch dependencies -- and oh my, such huge communities with open-source libraries for nearly anything I could imagine needing. Even if most of the build systems (maven, sbt, gradle, pip, etc.) have lots of rough edges, at least they exist.

Fast forward 12 years, and I find myself getting back into Xfce. Ugh. C is such a pain in the ass. I keep reinventing wheels, because even if there's a third-party library, most of the time it's not packaged on many of the distros/OSes our users use. Memory leaks, NULL pointer dereferences, use-after-free, data races, terrible concurrency primitives, no tuples, no generics, primitive type system... I hate it.

I've been using Rust for other projects, and despite it being an objectively more difficult language to learn and use, I'm still much more productive in Rust than in C.

kstrauser 19 hours ago
I think Rust is harder to learn, but once you grok it, I don't think it's harder to use, or at least to use correctly. It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`. Rust's normal error messages are delightfully helpful. For example, I just wrote some bad code and got:

    --> src/main.rs:45:34
     |
  45 |         actions.append(&mut func(opt.selected));
     |                             ---- ^^^^^^^^^^^^ expected `&str`, found `String`
     |                             |
     |                             arguments to this function are incorrect
     |
  help: consider borrowing here
     |
  45 |         actions.append(&mut func(&opt.selected));
     |
I even had to cheat a little to get that far, because my editor used rust-analyzer to flag the error before I had the chance to build the code.

Also, I highly recommend getting into the habit of running `cargo clippy` regularly. It's a wonderful tool for catching non-idiomatic code. I learned a lot from its suggestions on how I could improve my work.

kelnos 19 hours ago
> I think Rust is harder to learn, but once you grok it, I don't think it's harder to use, or at least to use correctly. It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`.

When I say Rust is harder to use (even after learning it decently well), what I mean is that it's still easier to write a pile of C code and get it to compile than it is to write a pile of Rust code and get it to compile.

The important difference is that the easier-written C code will have a bunch more bugs in it than the Rust code will. I think that's what I mean when I say Rust is harder to use, but I'm more productive in it: I have to do so much less debugging when writing Rust, and writing and debugging C code is more difficult and takes up more time than writing the Rust code (and doing whatever little debugging is necessary there).

> Also, I highly recommend getting into the habit of running `cargo clippy` regularly. It's a wonderful tool for catching non-idiomatic code.

That's a great tip, and I usually forget to do so. On a couple of my personal projects, I have a CI step that fails the build if there are any clippy messages, but I don't use it for most of my personal projects. I do have a `cargo fmt --check` in my pre-commit hooks, but I should add clippy to that as well.

sestep 19 hours ago
If you're using VS Code then you can add `"rust-analyzer.check.command": "clippy"` to your `settings.json`. I assume there's a similar setting for rust-analyzer in other editors.
maleldil 12 hours ago
Neovim:

    require("lspconfig").rust_analyzer.setup({
        settings = {
            ["rust-analyzer"] = {
                checkOnSave = {
                    command = "clippy",
                    allFeatures = true,
                },
            },
        },
    })
pabs3 11 hours ago
You might want to reconsider use of rust-analyzer, it isn't safe to use on code you haven't written yourself.

https://rust-analyzer.github.io/book/security.html

swiftcoder 5 hours ago
> it isn't safe to use on code you haven't written yourself

Neither is cargo (nor npm, nor any other package manager, for that matter).

I'm not sure what value being that paranoid is buying you in the long run.

pabs3 27 minutes ago
Package managers are for running other people's code; I would not expect the same of static analysis tools, especially since they are of use while auditing other people's code before building/running it.
kstrauser 19 hours ago
That's a fair distinction. Basically, it's easier to write C that compiles than Rust that compiles, but it's harder to write correct C than correct Rust.

Regarding Clippy, you can also crank it up with `cargo clippy -- -Wclippy::pedantic`. Some of the advice at that level gets a little suspect. Don't just blindly follow it. It offers some nice suggestions though, like:

  warning: long literal lacking separators
    --> src/main.rs:94:22
     |
  94 |             if num > 1000000000000 {
     |                      ^^^^^^^^^^^^^ help: consider: `1_000_000_000_000`
     |
that you don't get by default.
jpc0 9 hours ago
The truly pedantic setting here would be complaining about the magic number.

Why 1_000_000_000_000? What does that number mean?

It is free to write:

    let my_special_thing = 1_000_000_000_000;
since the compiler will just inline it.

The readability problem was never the lack of separators, since that number might be the wrong number regardless.

maleldil 12 hours ago
You can also add `#![warn(clippy::all, clippy::pedantic)]` to your main.rs/lib.rs file to get those lints project-wide.
ohgr 18 hours ago
I’d add that the Rust code and C code will probably have the same number of bugs. The C code will likely have some vulnerabilities on top of those.

Rust doesn’t magically make the vast majority of bugs go away. Most bugs are entirely portable!

kelnos 15 hours ago
Vulnerabilities are bugs, so the C code will have more bugs than the Rust program.

You might say that the C and Rust code will have the same number of logic errors, but I'm not convinced that's the case either. Sure, if you just directly translate the C to Rust, maybe. But if you rewrite the C program in Rust while making good use of Rust's type system, it's likely you'll have fewer logic errors in the Rust code as well.

Rust has other nice features that will help avoid bugs you might write in a C program, like most Result-returning functions in the stdlib being marked #[must_use], or match expressions being exhaustive, to name a couple things.

tialaramex 4 hours ago
> most Result-returning functions in the stdlib being marked #[must_use]

Actually it's a bit cleverer than that, and some people might benefit from knowing this. The Result type itself is marked #[must_use]. If you're writing a Goat library and you are confident that just discarding a Goat is almost always a mistake, regardless of the context in which they got a Goat, you too should mark your Goat type #[must_use = "this `Goat` should be handled properly according to the Holy Laws of the Amazing Goat God"], and now everybody is required to do that or explicitly opt out, even for their own Goat code.

Obviously don't do this for types which you can imagine reasonable people might actually discard, only the ones where every discard is a weird special case.

Types I like in Rust which help you avoid writing errors the compiler itself couldn't possibly catch:

Duration - wait, are these timeouts in seconds or milliseconds? It's different on Windows? What does zero mean, forever or instant?

std::cmp::Ordering - this Doodad is Less than the other one

OwnedFd - it's "just" a file descriptor, in C this would be an integer, except, this is always a file descriptor, it can't be "oops, we didn't open a file" or the count of lines, or anything else, we can't Add these together because that's nonsense, they're not really integers at all.

codedokode 17 hours ago
Rust allows you to provide more information about types (generic types, pointer usage) and checks it, while in C you have to rely on doc comments and checking the code manually. Or am I wrong, and C allows you to specify pointer nullability, pointer ownership and array bounds?
ohgr 17 hours ago
None of those things feature in any problem I deal with on a daily basis, whatever language I use.

So for example today I dealt with a synchronization issue. This turned out not to be a code bug but a saga of human misunderstanding of a protocol specification, which was not possible to encode into a type system of any sort. The day before was a constraint network specification error. In both cases the code was entirely irrelevant to the problem.

Literally all I deal with are human problems.

My point is Rust doesn't help with these at all, however clever you get. It is no different to C, but C will give you a superset of vulnerabilities on top of that.

Fundamentally Rust solves no problems I have. Because the problems that matter are human ones. We are too obsessed with the microscopic problems of programming languages and type systems and not concentrating on making quality software which is far more than just "Rust makes all my problems go away" because it doesn't. It kills a small class of problems which aren't relevant to a lot of domains.

(incidentally the problems above are implemented in a subset of c++)

xmcqdpt2 2 hours ago
> So for example today I dealt with a synchronization issue. This turned out to not be a code bug but a human misunderstanding of a protocol specification saga, which was not possible to code into a type system of any sort.

Maybe not in a reasonable language no, but there are advances in type systems that are making ever larger classes of behaviours encodable into types. For example, algebraic effects (can this function throw, call a remote service etc)

https://koka-lang.github.io/koka/doc/index.html

linear types (this method must be called only once etc)

https://ghc.gitlab.haskell.org/ghc/doc/users_guide/exts/line...

dependent typing (f(1) returning a different type from f(2), verifiable at compile time),

https://fstar-lang.org/index.html

Some of these features will eventually make it to “normal” PL. For example, Scala now has dependent types,

https://dotty.epfl.ch/docs/reference/new-types/match-types.h...

and Java can support linear type checking,

https://checkerframework.org/manual/#must-call-checker

kelnos 15 hours ago
> None of those things feature in any problem I deal with on a daily basis, whatever language I use.

I run into those things nearly daily, so... ok then.

worthless-trash 2 hours ago
Do not feel bad, friend, it's not you... there are definitely inconsistencies here. Maybe they work on absolutely pristine, perfect codebases.
naasking 17 hours ago
I'd say whether Rust helps you reduce bugs depends on how good you are at creating abstractions and/or encoding properties in the type system.
ohgr 17 hours ago
Most bugs are way above that level of abstraction and thought.
nindalf 16 hours ago
[Citation needed].
davedx 15 hours ago
This is objectively nonsense
codr7 13 hours ago
Interesting use of the word 'objectively' there.
davedx 6 hours ago
How can it not be true? One of the primary features of the Rust compiler is enforcing memory safety at compile time. C doesn't have anything like that. There is an entire class of bugs that is impossible to write in Rust.
nicoburns 13 hours ago
> it's still easier to write a pile of C code and get it to compile than it is to write a pile of Rust code and get it to compile.

As someone who is more familiar with Rust than C: only if you grok the C build system(s). For me, getting C to build at all (esp. if I want to split it up into multiple files or use any kind of external library) is much more difficult than doing the same in Rust.

seba_dos1 1 hour ago
> It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`

I won't disagree that correct C is harder to write, but it's not 2005 anymore and standard tooling gives you access to things like asan, msan, ubsan, tsan, clang-tidy...

epidemian 18 hours ago
> Also, I highly recommend getting into the habit of running `cargo clippy` regularly.

You can also have that hooked up to the editor, just like `cargo check` errors. I find this to be quite useful, because i have a hard time getting into habits, especially for things that i'm not forced to do in some way. It's important that those Clippy lints are shown as soft warnings instead of hard errors though, as otherwise they'd be too distracting at times.

mountainriver 12 hours ago
Agree, Rust is quite hard to learn, but now that I know it I have a hard time writing anything else. It really gives you the best of a lot of worlds.

Granted, I can still crank out a Python program faster that kinda works, but god forbid you need to scale it or use any sort of concurrency at all.

crabbone 18 hours ago
* Rust errors can be equally unhelpful. Also, the error you posted is hands down awful. It doesn't tell you what went wrong, and it's excessively naive to rely on the compiler to offer a correct fix in all but the most trivial cases. When errors happen, it's a consequence of an impasse, a logical contradiction: two mutually exclusive arguments have been made: a file was assumed to exist, but was also assumed not to exist -- this is what's at the core of the error. The idiotic error that the Rust compiler gave you doesn't say what the assumptions were, it just, essentially, tells you "here's the error, deal with it".

* In Rust, you will have to deal with a lot of unnecessary errors. The language is designed to make its users create a host of auxiliary entities: results, options, futures, tasks and so on. Instead of dealing with the "interesting" domain objects, the user of the language is mired in the "intricate interplay" between objects she doesn't care about. This is, in general, a woe of languages with extensive type systems, but in Rust it's a woe on a whole new level. Every program becomes a Sisyphean struggle to wrangle through all those unnecessary objects to finally get to write the actual code. Interestingly though, there's a tendency in a lot of programmers to like solving these useless problems instead of dealing with the objectives of their program (often because those objectives are boring or because programmers don't understand them, or because they have no influence over them).

necubi 18 hours ago
I don't follow your first point—the compiler is pointing out exactly what the problem is (the argument has the incorrect type) and then telling you what you likely wanted to do (borrow the String). What would you see as a more helpful error message in this case?
crabbone 17 hours ago
The compiler says "expected X, but found Y". I don't know how to interpret this: is the type of the thing underlined with "^^^" X or Y? "Expected" and "found" are just like "up" and "down" in space: they are meaningless if you don't know what the compiler expects (and why should it?).

What it needs to say is something along the lines of "a function f is defined with type X, but is given an argument of type Y": maybe the function should be defined differently, maybe the argument needs to change -- it's up to the programmer to decide.

kelnos 15 hours ago
I dunno, I feel like if you've used a compiler regularly, "expected X, but found Y" is a pretty common idiom/shorthand that people understand. Your wordier version of that feels unnecessary to me.
nuancebydefault 16 hours ago
I don't see any way that the use of expected and found can be ambiguous for a type conflict.

I buy a fruit mixer from Amazon.com ; I send it back along with a note: expected a 230VAC mixer, found a 110VAC mixer.

tartoran 15 hours ago
C is a low-level language and deals with things close to the metal. It's probably not fun to write a large business app in barebones C, but having control over low-level things makes other things possible, and very fast too. Depending on the type of problem you have, use the appropriate and favorite language.
jacksnipe 17 hours ago
Since it's underlining code you wrote, it must be "found" that is highlighted, not "expected". Much like up and down, gravity exists to ground all of us in the same direction.
shakna 16 hours ago
I'm over here with TTS: Underlining in a terminal rarely translates to audio. It isn't the only consideration that needs to be made, when making things clear.
knowitnone 13 hours ago
except nobody has to cater to every worst case scenario
shakna 11 hours ago
So every disabled programmer is now a "worst case scenario"?
samiv 18 hours ago
The biggest problem with C is that it doesn't even have enough features to help you build the features and abstractions you need and want.

For example with C++ the language offers enough functionality that you can create abstractions at any level, from low level bit manipulation to high level features such as automatic memory management, high level data objects etc.

With C you can never escape the low level details. Cursed to crawl.

ChrisMarshallNY 18 hours ago
Just FYI.

Back in 1994/95, I wrote an API, in C, that was a communication interface. We had to use C, because it was the only language that had binary/link compatibility between compilers (the ones that we used).

We designed what I call "false object pattern." It used a C struct to simulate a dynamic object, complete with function pointers (that could be replaced, in implementation), and that gave us a "sorta/kinda" vtable.

Worked a charm. They were still using it, 25 years later.
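
A minimal sketch of that kind of struct, with hypothetical names (not the original API): the "vtable" is just function pointers living in the struct, which an implementation fills in and can later replace.

    /* A "false object": a struct that carries its own replaceable function
       pointers, giving a sorta/kinda vtable. */
    typedef struct Channel Channel;
    struct Channel {
        void *state;                 /* implementation-private data */
        int  (*send)(Channel *self, const void *buf, int len);
        void (*close)(Channel *self);
    };

    static int  tcp_send(Channel *self, const void *buf, int len) { (void)self; (void)buf; return len; }
    static void tcp_close(Channel *self) { (void)self; }

    Channel make_tcp_channel(void) {
        Channel c = {0};
        c.send  = tcp_send;          /* an implementation swaps these in (or out) */
        c.close = tcp_close;
        return c;
    }

Calling ch.send(&ch, buf, n) is then the C spelling of a virtual method call.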

That said, I don't really miss working at that level. I have been writing almost exclusively in Swift, since 2014.

PaulDavisThe1st 18 hours ago
> We designed what I call "false object pattern." It used a C struct to simulate a dynamic object, complete with function pointers (that could be replaced, in implementation), and that gave us a "sorta/kinda" vtable.

You were not alone in this. It is the basis of glib's GObject which is at the bottom of the stack for all of GTK and GNOME.

ChrisMarshallNY 1 hour ago
I got the idea from Apple's QuickDraw GX. It used a similar pattern.
synergy20 8 hours ago
Sadly glib doesn't have error handling for memory errors, it will just crash. Otherwise it could be widely used as an OOP C.
kelnos 15 hours ago
Sure, that's a pretty common pattern in use in C to this day. It's a useful pattern, but it's still all manual. Forget to fill in a function pointer in a struct? Crash. At least with C++ it will fail to compile if you don't implement a method that you have to implement.
codr7 13 hours ago
At least in C it's only plain old function pointers.

You don't have to think about exceptions, overloaded operators, copy constructors, move semantics etc.

desdenova 13 hours ago
You still need to think about error handling, and it's not standardized because everyone else will also have to think about it ad hoc.

You'll also still need to think about when to copy and move ownership, only without a type system to help you tell which is which, and good luck ensuring resources are disposed correctly (and only once) when you can't even represent scoped objects. `goto` is still the best way to deal with destructors, and it still takes a lot of boilerplate.
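
For readers who haven't seen the idiom, a minimal sketch of the goto-based cleanup being described (all names hypothetical):

    #include <stdio.h>
    #include <stdlib.h>

    int process_file(const char *path)
    {
        int rc = -1;
        char *buf = NULL;
        FILE *f = fopen(path, "rb");
        if (!f)
            goto out;

        buf = malloc(4096);
        if (!buf)
            goto out_close;

        /* ... work with f and buf ... */
        rc = 0;

        free(buf);
    out_close:
        fclose(f);
    out:
        return rc;
    }

Every early exit funnels through the same teardown, in reverse order of acquisition; the boilerplate is exactly the part being pointed at here.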

codr7 12 hours ago
But you have a choice, you don't have to implement all of C++/Rust.

The beauty of C is that it allows you to pick your level of complexity.

ChrisMarshallNY 1 hour ago
> still all manual

That pretty much is the definition of C. It was designed to be a system language, that was "one step beyond" (my own Madness reference) the assembler.

It's a dangerous tool, and should be wielded by experts. The same goes for all kinds of similar stuff.

And "experts" is kind of a darwinian thing. There's not really that many folks that can successfully wield it in large quantities (a certain cranky Finn comes to mind). There's a ton of folks that are surfing the Dunning-Kruger wave, that think they are up to it, but they tend to face-plant.

For myself, I learned to do it right, but it was difficult, stressful work, and I don't miss it. Working in Swift is a joy. I don't have to worry about things like deallocations, and whatnot.

surajrmal 33 minutes ago
I've never met an expert who writes C well all of the time, so I have been led to believe it's simply not possible. Folks with 20+ years of experience will still regularly write a use-after-free bug which is not caught until it hits production. I'm not sure I've actually seen C code free of concurrency bugs. If it's possible, it takes years of battle hardening with no feature work. Humans are prone to make mistakes and we need to use as many tools as possible to help us avoid making those mistakes. They are recently looking into introducing thread annotations for locks in the Linux kernel and I expect them to find lots of real bugs lurking in otherwise well-maintained subsystems. I've rewritten swaths of code into Rust and always seen it find a bug in the original code's design that we were lucky to never run into.
gtirloni 12 hours ago
Of course you can. It's quite the opposite actually. The downside is that in C you have to code a bunch of abstractions _yourself_. See how large projects like the Linux kernel make extensive use of macros to implement an object system.
quchen 17 hours ago
Thank you for your work on XFCE. It's been the best WM/UI for me for over a decade.
doublerabbit 17 minutes ago
Lately I've been taking an interest in D, coming from a Perl/Tcl background. Rust, Python, Go just feel tainted to me.

https://dlang.org

eikenberry 19 hours ago
Have you looked at Zig? It is often termed a modern C where Rust is the modern C++. Seems like a good fit.
harrison_clarke 18 hours ago
i never really understand why these get compared. i wouldn't expect that much overlap in the audiences

zig seems like someone wanted something between C and "the good parts" of C++, with the generations of cruft scrubbed out

rust seems like someone wanted a haskell-flavoured replacement for C++, and memory-safety

i would expect "zig for C++" to look more like D or Carbon than rust. and i'd expect "rust for C" to have memory safety and regions, and probably steal a few ocaml features

elteto 18 hours ago
The single best feature (and I would say the _core_ feature separating it from C) that C++ has to offer is RAII and zig does not have that. So I don’t know which good parts of C++ they kept. Zig is more of its own thing, and they take from wherever they like, not just C++.
naasking 17 hours ago
> So I don’t know which good parts of C++ they kept.

comptime is a better version of C++ templates.

jandrewrogers 11 hours ago
I really wish that were true but it isn’t. Modern C++ templates/constexpr are much more powerful and expressive than any Zig comptime equivalent.

The power and expressiveness of the C++ compile-time capabilities are the one thing I strongly miss when using other languages. The amount of safety and conciseness those features enable makes not having them feel like a giant step backward. Honestly, if another systems language had something of similar capability I’d consider switching.

koe123 3 hours ago
I have written a lot of Zig comptime code and ended up finding the opposite. In C++ I find I have to bend over backward to get what I want done, often resulting in insane compile times. I've used metaprogramming libraries like Boost Hana before to have some more ergonomics, but even that I would consider inferior to comptime.

Out of curiosity, do you happen to have any examples of what you describe, where C++ is more powerful and expressive than Zig?

bcrosby95 9 hours ago
If it looks anything like what I read in "Modern C++ Design" 20+ years ago then I'll pass. That book made me realize the language wasn't for me anymore.
jandrewrogers 9 hours ago
It looks nothing like C++ decades ago, it is effectively a completely different language. I found C++ unusable before C++11, and even C++11 feels archaic these days. Idiomatic C++20 and later is almost a decent language.
Capricorn2481 13 hours ago
Zig has defer, which is arguably way simpler. Is there something RAII can do that defer can't?

Not everyone likes RAII by itself. Allocating and deallocating things one at a time is not always efficient. That is not the only way to use RAII but it's the most prevalent way.

IX-103 12 hours ago
defer can't emulate the destructors of objects that outlive their lexical scope. Return values and heap objects are examples of these, since they outlive the function they were created in. defer only supports enqueuing actions at lexical boundaries.

If you destroy an object that outlives the lexical scope it was created in, then you have to clean up manually.

tcfhgj 5 hours ago
Defer can be omitted
hyperbrainer 16 hours ago
I would say OCaml more than Haskell, but yes.
kelnos 15 hours ago
I have, and I do find Zig impressive, but it doesn't go far enough for me. I don't want a "better C", I want a better systems language that can also scale up for other uses.

I like strong, featureful type systems and functional programming; Zig doesn't really fit the bill for me there. Rust is missing a few things I want (like higher-kinded types; GATs don't go far enough for me), but it's incredible how they've managed to build so many zero- and low-cost abstractions and make Rust feel like quite a high-level language sometimes.

jandrewrogers 11 hours ago
Rust is not a modern C++, their core models are pretty different. Both Rust and C++ can do things the other can’t do. C++ is more focused on low-level hyper-optimized systems programming, Rust is a bit higher level and has stronger guardrails but with performance closer to a classic systems language.

I do think Zig is a worthy successor to C and isn’t trying to be C++. I programmed in C for a long time and Zig has a long list of sensible features I wish C had back then. If C had been like Zig I might never have left C.

xmcqdpt2 2 hours ago
C++ is (according to Bjarne Stroustrup) a general purpose programming language that can be used to build general business software and applications with, not just a systems PL. This is why perf is all over the place -- the core language is fast like C but the stdlib contains terribly slow code (regex, exceptions) and ways to waste lots of cycles (wrapping everything in smart pointers, using std::map instead of unordered_map).
jpc0 8 hours ago
What do you consider the difference in their core models?

Rust and C++ both use RAII, and both have a strong emphasis on type safety; Rust just takes that to the extreme.

I would even like to hope both believe in zero-cost abstractions, which, contrary to popular belief, doesn't mean no cost, but no cost over doing the same thing yourself.

In many cases it's not even zero cost, it's negative cost, since using declarative programming can allow the compiler to optimise in ways you don't know about.

WalterBright 17 hours ago
> Memory leaks, NULL pointer dereferences, use-after-free

I suffered writing those for many years. I finally simply learned not to do them anymore. Sort of like there's a grain of sand on the bottom of my foot and the skin just sort of entombed it in a callus.

kelnos 15 hours ago
I've seen you make these kinds of comments before on other articles. Please stop. Not everyone is perfect and can forevermore avoid making any mistakes. I strongly suspect your opinion of your skill here is overinflated. Even if it isn't, and you really are that good, everyone cannot be in the top 0.00001% of all programmers out there, so your suggestion to "simply" learn not to make mistakes is useless.

This all just comes off incredibly arrogant, if I'm being honest.

otterley 9 hours ago
I think a more charitable interpretation of what he said was, "after sufficient practice, I became good enough to start avoiding those pitfalls."

It's not all that different from learning any challenging task. I can't skateboard to save my life, but the fact that people can do it well is both admirable and the result of hundreds or thousands of hours of practice.

Skilled people can sometimes forget how long it took to learn their talent, and can occasionally come off as though the act is easy as a result. Don't take it too harshly.

BigJono 13 hours ago
You come off as incredibly arrogant too, you just don't realise it because you have the current mainstream opinion and the safety of a crowd.

Do you know how fucking obnoxious it is when 200 people like you come into every thread to tell 10 C or Javascript developers that they can't be trusted with the languages and environments they've been using for decades? There are MILLIONS of successful projects across those two languages, far more than Rust or Typescript. Get a fucking grip.

Klonoar 3 hours ago
Nobody is telling JS developers that Rust will save them, chill.
pansa2 1 hour ago
But you'll be called things like "catastrophically unprofessional" [1] if you choose JavaScript over TypeScript, or dynamically-typed Python over MyPy.

[1] https://www.reddit.com/r/Python/comments/1iqytkf/python_type...

defrost 13 hours ago
He's not making a comment about everyone, it's a specific comment about how often long time C programmers make basic mistakes after a million SLOC or so.

In this instance Walter is correct - the mistakes he listed are very rarely made by experienced C programmers, just as ballet dancers rarely trip over their own feet walking down a pavement.

The problem of those errors being commonplace in those who are barely five years into C coding and still have another five to go before hitting the ten-year mark still exists, of course.

But it's a fair point that given enough practice and pain those mistakes go away.

red75prime 7 hours ago
> just as ballet dancers rarely trip over their own feet walking down a pavement

What about walking down a busy construction site? The most charitable and correct interpretation I can think of is "I'm a professional. Seatbelts and OSHA destroy my productivity."

defrost 6 hours ago
> What about walking down a busy construction site?

Coordinated people with some years of experience pay attention to the ground and overhead cranes and conveyor belts and survive walking through construction sites, mine sites, aviation hangars, cattle yards, musters, et al on a routine basis. I'm 60+ and have somehow navigated all those environs - including C for critical system control.

These are dangerous environments. No one denies this. It's still true that the longer you inhabit such spaces the safer your innate learned behaviour is.

C has seatbelts and OSHA - valgrind, et al tools abound for sanity checking.

Walter's GP statement is literally little more than "eventually you grow out of making the simple basic mistakes" - eventually, after some years of practice - which is a real problem with C: it takes time to not make the basic mistakes. After all that, there's always room, in C, in Rust, whatever, to make non-basic, non-obvious mistakes.

red75prime 4 hours ago
> After all that, there's always room, in C, in Rust, whatever, to make non basic non obvious mistakes.

Correct, I guess. The number of relatively obvious mistakes should decrease with experience. And it stands to reason that eventually it settles near zero for some part of the developer community.

How close to zero, and for which part of the community? Statistics are scarce.

> C has seatbelts and OSHA - valgrind, et al tools abound for sanity checking.

Optional tools with no general enforcement. That is more like elective vaccination or travel advisories. That is, no, no seatbelts and no OSHA.

gblargg 12 hours ago
I'd imagine the main way one reduces instances of these mistakes is to restrict resource ownership into certain patterns which have a clear place for freeing, and rules that ensure it's always reached, and only once.
defrost 11 hours ago
There are many approaches depending on the type of program or suite of programs being built.

Always pairing the creation of free() code and functions with every malloc() is one discipline.

Another, for a class of C utilities, is to never free() at all: "compute anticipated resource limits early, malloc and open pipes in advance, process the data stream and exit when done" works for a body of cases.

In large C projects of times past it's often the case that resource management, string handling, etc. are isolated and handled in dedicated subsections that resemble the kinds of safe handling methods baked into modern 'safe' languages.
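
A sketch of the first discipline, with hypothetical names: every *_new() ships with exactly one matching *_free(), and nothing else calls malloc()/free() for that type directly.

    #include <stdlib.h>

    typedef struct {
        double *samples;
        size_t  count;
    } Track;

    Track *track_new(size_t count)
    {
        Track *t = malloc(sizeof *t);
        if (!t)
            return NULL;
        t->samples = calloc(count, sizeof *t->samples);
        t->count = count;
        if (!t->samples) {          /* partial failure: undo what was done */
            free(t);
            return NULL;
        }
        return t;
    }

    void track_free(Track *t)
    {
        if (!t)
            return;
        free(t->samples);
        free(t);
    }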

milesrout 9 hours ago
Surely it is better than yet another self-promoting mention of his programming language on every unrelated C, Rust or Zig post?
jakogut 14 hours ago
Similarly, I went from writing a lot of C to Python, and I appreciate both of them for almost opposite reasons. I ended up finding I like Cython quite a bit, even though the syntax leaves much to be desired. The power, performance, and flexibility of C combined with the batteries included nature of Python is a match made in heaven.

You're also still very much free to write either language purely, and "glue" them together easily using Cython.

Sysreq2 19 hours ago
I will say the more recent additions to C++ at least have solved many of my long-standing issues with that C variant. Most of it was stuff that was long overdue, like string formatting or a thread-safe println. But even some of the stuff I didn't think I would love has been amazing. Modules. Modules bro. Game changer. I'm all in. Honestly C++ is my go-to again for anything that isn't just throwaway. Python will always be king of the single-use scripts.
klysm 19 hours ago
The problem is that they are _additions_; C++ has such absurd sprawl. The interactions between everything in this massive sprawl are quite difficult to grasp.
silisili 18 hours ago
That's also a problem in C land, of course, perhaps with less total sprawl.

Yeah, it has new features, but if you're stuck working on a C89 codebase, good luck!

I don't know a great answer to that. I almost feel like languages should cut and run at some point and become a new thing.

ahartmetz 18 hours ago
Perhaps less? More like certainly a ton less. Regards, someone who uses C++ and doesn't even hate it.
kelnos 15 hours ago
I lost interest in keeping up with C++'s advances more than a decade ago.

The problem is that I want a language where things are safe by default. Much of the newer stuff added in C++ makes things safe, perhaps even to the level of Rust's guarantees -- but that's only if you use only these new things, and never -- even by accident -- use any of the older patterns.

I'd rather just learn a language without all that baggage.

BuckRogers 18 hours ago
That is very interesting. You have quite the resume too. While I've dabbled in nearly everything, I'm a day to day pro C# developer and I absolutely love it. I've never been upset or had a complaint. If I were forced off for some reason, I'd just go to Typescript. I can't imagine using C. Perhaps with some form of AI valgrind. The problems C solved are just not relevant any longer, and it remains entrenched in 2025. Rust with AI analysis will be amazing to see the results of.
codr7 13 hours ago
It looks to me like C is still very relevant, and isn't going anywhere anytime soon.

I realize a lot of people don't want to use it; and that's fine, don't use it.

tartoran 15 hours ago
It depends on problem domain. If you're writing kernel level code or device drivers you wouldn't use C# or typescript.
p_ing 9 hours ago
There are at least a few kernels, bootloaders, and device drivers written in C# out there… granted for hobby/research.

https://github.com/Michael-K-GH/RoseOS

https://vollragm.github.io/posts/kernelsharp/

https://www.gocosmos.org

https://www.microsoft.com/en-us/research/project/singularity... (asm, C, C++, largely C# (Sing#))

mrheosuper 10 hours ago
I really want to use Rust. But Rust has so many std functions that kind of abstract away how things work under the hood.

For example: I still have no idea what clone() does, how it interacts with memory (heap or stack), whether it creates a new instance or just modifies metadata of that object. Sometimes creating a new instance is a big no-no because it takes a lot of memory.

Same thing with "ownership transfer": is the variable freed at that moment, etc.?

I bet I could find answers on the internet, but Rust has like 500 std functions, so the task is tedious.

jpc0 9 hours ago
What is your current language of choice? Both C and C++ have the same problems as what you just described.

Regarding ownership transfer, it is even worse in C: what if you forget, after moving an object out of a variable, to set that variable to NULL, and then free that variable? That's a use-after-free. At least in C++ you have move semantics, although they are still error-prone. In Rust it's a compiler error.
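
To make the C hazard concrete, a hypothetical sketch of the "null the source when you move out of it" discipline; nothing but convention enforces it:

    /* "move" a heap buffer out of *src; the caller becomes the owner */
    char *take_buffer(char **src)
    {
        char *p = *src;
        *src = NULL;   /* forget this line and later code that frees or
                          uses *src collides with the new owner */
        return p;
    }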

Copy and Clone are the same story, and both are opt-in for your own types; by default the only options for your own types are move or reference. In C and C++ the default is copy, which again leads to use-after-free in the situations you complained about.

I feel if these are your complaints you will actually benefit from spending some time in Rust.

If your preferred languages are higher-level languages with GC that are reference-by-default, I encourage you to try any of the systems-level programming languages. The things you complain about are things that are important for a language to have constructs for; reference semantics by default causes many issues, and becomes untenable in the parallel world we live in.

mrheosuper 7 hours ago
Well, I don't use C++ much (I'm a FW engineer, most of my stuff is in C).

The std in C is simple and explicit. For example: I can make an educated guess at how memcpy() works by looking at its signature. It takes pointers to src and destination, and a size, so I can guess it does not allocate any new memory (or if it does, it has to be for some kind of optimization reason).

Another example is strstr(): it returns a pointer into a piece of memory I provided to it, so I can safely do some pointer math with the return value.

It's true that I do not spend much time in Rust, so maybe I'm missing some fundamental things. I guess my mistake is trying to apply my knowledge of C to Rust.

But still, it's kind of irritating not knowing (or guessing) how a function works just by looking at its signature.
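
For reference, the two signatures being read here, as declared in string.h:

    /* copies n bytes from src to dest; allocates nothing, returns dest */
    void *memcpy(void *dest, const void *src, size_t n);

    /* returns a pointer into haystack (or NULL), so pointer math on it is fine */
    char *strstr(const char *haystack, const char *needle);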

BlackFly 1 hour ago
You can surmise the behavior from the signature of clone() in the same way that you can with memcpy(): it returns an owned type, so it will allocate. If you want to match the signature of memcpy then you don't want the allocation, and you can instead call clone_from(), which takes a mutable borrow of something you had to allocate.

> a piece of memory I provided to it, so I can safely do some pointer math with the return value.

Except pointer math is never going to be safe in C, a particular case might be bugfree but it is not safe. Moreover, nothing in the C language says that the provenance of the output pointer matches the provenance of the input pointer (it could take a reference to a pointer from some other thread while that other thread is freeing that memory). In rust you will pass a pointer and get a pointer back with the same lifetime, then you can safely use the pointer in that lifetime bound: the provenance is part of the signature. So in this case, you are incorrectly assuming things from the C signature while the corresponding rust signature definitively tells you.

So yeah, if you learn more about rust then you will see that in fact it tells you more than the corresponding C signatures.

NoahKAndrews 4 hours ago
Isn't guessing what a function does based purely on its name pretty risky? There's usually nuance, so if I'm not already familiar with a function, I'm looking up its docs at least once.
las_balas_tres 8 minutes ago
I started writing C code again because I started using GStreamer and needed to write a modem plugin for ModemManager. This meant I had to get familiar with glib. Even though the learning curve has been steep it's been worth it, and I am really enjoying writing in C again, even going so far as to refactor some existing code of mine to use glib and make it GObject-based. The end goal here is to use GIR to use my code from scriptable languages like Python.
lqet 1 day ago
I fully understand that sentiment. For several years now, I have also felt the strong urge to develop something in pure C. My main language is C++, but I have noticed over and over again that I really enjoy using the old C libraries - the interfaces are just so simple and basic, there is no fluff. When I develop methods in pure C, I always enjoy that I can concentrate 100% on algorithmic aspects instead of architectural decisions which I only have to decide on because of the complexity of the language (C++, Rust). To me, C is so attractive because it is so powerful, yet so simple that you can hold all the language features in your head without difficulty.

I also like that C forces me to do stuff myself. It doesn't hide the magic and complexity. Also, my typical experience is that if you have to write your standard data structures on your own, you not only learn much more, but you also quickly see possible performance improvements for your specific use case that would have otherwise been hidden below several layers of library abstractions.

This has put me in a strange situation: everyone around me is always trying to use the latest feature of the newest C++ version, while I increasingly try to get rid of C++ features. A typical example I have encountered several times now is people using elaborate setups with std::string_view to avoid string copying, while exactly the same functionality could've been achieved with less code, using just a simple raw const char* pointer.

bsenftner 1 day ago
About 16 years ago I started working with a tech company that used "C++ as C", meaning they used a C++ compiler but wrote pretty much everything in C, with the exception of using classes, but more like Python data classes, with no polymorphism or inheritance, only composition. Their classes were not to hide, but to encapsulate. Over time, some C++ features were allowed, like lambdas, but in general we wrote data-classed C - and it screamed, it was so fast. We did all our own memory management, yes, using C-style mallocs, and the knowledge of what all the memory was doing significantly aided our optimizations, as we aimed to run with in-cache data and code as much as possible. The results were market-leading, and the company's facial recognition continually lands in the top 5 algorithms at the annual NIST FR Vendor test.
hliyan 10 hours ago
Funnily enough, 16 years ago, I too was in exactly this type of company. C++, using classes, inheritance only for receiving callbacks, most attributes were public (developers were supposed to know how not to piss outside the bowl), pthreads with mutexed in-memory queues for concurrency, no design patterns (yes, we used globals instead of Singleton) etc. So blazingly fast that we were measuring latencies in the sub-100-microsecond range. Now, when modern developers say something is "blazingly fast" when it's sub-second, I can only shake my head in disbelief.
bsenftner 3 hours ago
Yes, very similar. We had pthreaded mutexed queues too, and measured timings with clock_gettime() with CLOCK_MONOTONIC. Our facial template match runs at 25M compares per second per core, and simply keeping that pipeline fed required all kinds of timing synchronizations.
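
For anyone curious, a minimal sketch of that kind of CLOCK_MONOTONIC measurement (the work being timed is a placeholder):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);

        /* ... the work being timed goes here ... */

        clock_gettime(CLOCK_MONOTONIC, &t1);
        double usec = (t1.tv_sec - t0.tv_sec) * 1e6
                    + (t1.tv_nsec - t0.tv_nsec) / 1e3;
        printf("elapsed: %.3f us\n", usec);
        return 0;
    }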

The only community I know that produces developers that know this kind of stuff intimately are console game programmers, and then only the people responsible for maintaining the FPS at 60. I expect the embedded community knows this too, but is too small for me to know many of them to get a sense of their general technical depth.

zaphirplane 1 day ago
Sounds like they know what they are doing. How is using C++ with only data classes different from using C with structs?
relaxing 1 day ago
Namespaces are useful for wrapping disparate bits of C code, to get around namespace collisions during integration.
porridgeraisin 1 day ago
Slightly better ergonomics I suppose. Member functions versus function pointers come to mind, as do references vs pointers (so you get to use . instead of ->)
bsenftner 1 day ago
Yeah, slightly better ergonomics. Although we could, we simply did not use function pointers, we used member functions from the data class the data sat inside. We really tried to not focus on the language and tools, but to focus on the application's needs in the context of the problem it solves. Basically, treat the tech as a means to an end, not as a goal in itself.
brucehoult 1 day ago
Try doing C with a garbage collector ... it's very liberating.

Do `#include <gc.h>` then just use `GC_malloc()` instead of `malloc()` and never free. And add `-lgc` to linking. It's already there on most systems these days, lots of things use it.

You can add some efficiency by `GC_free()` in cases where you're really really sure, but it's entirely optional, and adds a lot of danger. Using `GC_malloc_atomic()` also adds efficiency, especially for large objects, if you know for sure there will be no pointers in that object (e.g. a string, buffer, image etc).

There are weak pointers if you need them. And you can add finalizers for those rare cases where you need to close a file or network connection or something when an object is GCd, rather than knowing programmatically when to do it.

But simply using `GC_malloc()` instead of `malloc()` gets you a long long way.

You can also build Boehm GC as a full transparent `malloc()` replacement, and replacing `operator new()` in C++ too.
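
A minimal sketch of what that looks like in practice (assuming libgc is installed; build with -lgc):

    #include <gc.h>
    #include <stdio.h>

    int main(void)
    {
        GC_INIT();                              /* once at startup */

        for (int i = 0; i < 10; i++) {
            char *msg = GC_malloc(64);          /* may contain pointers */
            char *buf = GC_malloc_atomic(4096); /* promised pointer-free: cheaper to scan */
            snprintf(msg, 64, "iteration %d", i);
            buf[0] = msg[0];
            printf("%s\n", msg);
            /* no free(): unreachable blocks are reclaimed by the collector */
        }
        return 0;
    }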

enriquto 1 day ago
> Try doing C with a garbage collector ... it's very liberating.

> Do `#include <gc.h>` then just use `GC_malloc()` instead of `malloc()` and never free.

Even more liberating (and dangerous!): do not even malloc, just use variable-length arrays:

    void f(float *y, float *x, int n)
    {
            float t[n];  // temporary array, destroyed at the end of scope
            ...
    }
This style forces you to alloc the memory at the outermost scope where it is visible, which is a nice thing in itself (even if you use malloc).
kqr 23 hours ago
At first I really liked this idea, but then I realised the size of stack frames is quite limited, isn't it? So this would work for small data but perhaps not big data.
__turbobrew__ 10 hours ago
Yea, usually the stack ulimit is only a few KiB for non-root processes by default on linux.

It is easy enough to increase, but it does add friction to using the software as it violates the default stack size limit on most linux installs. Not even sure why stack ulimit is a thing anymore, who cares if the data is on the stack vs the heap?

mrheosuper 10 hours ago
As a FW engineer, I do
enriquto 22 hours ago
In theory, this is a compiler implementation detail. The compiler may choose to put large stacks in the heap, or to not even use a stack/heap system at all. The semantics of the language are independent of that.

In practice, stack sizes used to be quite limited and system-dependent. A modern Linux system will give you several megabytes of stack by default (128MB in my case, just checked on my Linux Mint 22 Wilma). You can check it using "ulimit -all", and you can change it for your child processes using "ulimit -s SIZE_IN_KB". This is useful for your personal usage, but may pose problems when distributing your program, as you'll need to set up the environment where your program runs, which may be difficult or impossible. There's no ergonomic way to do that from inside your C program, that I know of.

dzdt 16 hours ago
It's a giant peeve of mine that automatic memory management, in the C language sense of the resource being freed at the end of its lexical scope, is tied to the allocation being on the machine stack, which in practice may have incredibly limited size. Gar! Why!?
enriquto 15 hours ago
Ackshually, it has nothing to do with the C language. It's an implementation choice by some compilers. A conforming implementation could give you the whole RAM and swap to your stack.
duskwuff 11 hours ago
It isn't a practical pattern for anything beyond the most trivial applications. Consider what this would look like if you tried to write a text editor, for instance - if a user types a new line of text, where is the memory for that allocated?
kqr 8 hours ago
Those would be the difficult questions one would be forced to confront ahead of time with this technique. That's not a bug; it's a feature!

Similar to what Ada does with access types which are lexically scoped.

brucehoult 7 hours ago
The problem is that regardless of the amount of confrontation it does not have an answer for any event-loop based program with an effectively infinite run time, other than "allocate all of memory into a buffer at startup and implement your own memory manager inside that".

Which just punts the problem from a mature and tested runtime library to some code you just make up on the spot.
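
For concreteness, that "memory manager you make up on the spot" usually ends up being some variant of a bump/arena allocator over the one big buffer; a hypothetical minimal sketch:

    #include <stddef.h>

    static unsigned char arena[1 << 20];   /* the buffer grabbed at startup */
    static size_t arena_used;

    void *arena_alloc(size_t n)
    {
        n = (n + 15u) & ~(size_t)15u;      /* keep returned pointers aligned */
        if (n > sizeof arena - arena_used)
            return NULL;                    /* out of budget */
        void *p = arena + arena_used;
        arena_used += n;
        return p;
    }

    void arena_reset(void)                  /* e.g. once per event-loop iteration */
    {
        arena_used = 0;
    }

Resetting at a known point in the loop stands in for per-object free(), which is exactly the policy decision a mature runtime library would otherwise make for you.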

hdrz 6 hours ago
C with dynamic arrays and classes? Object pascal says hello…
DeathArrow 6 hours ago
>Try doing C with a garbage collector ... it's very liberating.

Doing that means that I lose some speed and I will have to wait for GC collection.

Then why shouldn't I use C#, which is more productive and has libraries and frameworks that come with batteries included and help me build functionality fast?

I thought that one of the main points of using C is speed.

kokada 1 day ago
I think one of the nice things about C is that, since the language was not designed to abstract away e.g. the heap, it is really easy to replace manual memory management with GC or any other approach to managing memory, because most APIs expect to be called with `malloc()` when heap allocation is needed.

I think the only other language that has a similar property is Zig.

irq-1 1 day ago
Odin has this too:

> Odin is a manual memory management based language. This means that Odin programmers must manage their own memory, allocations, and tracking. To aid with memory management, Odin has huge support for custom allocators, especially through the implicit context system.

https://odin-lang.org/docs/overview/#implicit-context-system

kokada 23 hours ago
Interesting that I was thinking of a language that combined Zig and Scala to allocate memory using implicits, and this looks exactly like what I was thinking.

Not that I actually think this is a good idea (I think the explicit style of Zig is better), but it is an idea nonetheless.

dlisboa 1 day ago
Which GC is that you’re using in these examples?
umanwizard 1 day ago
I'm not OP but the most popular C GC is Boehm's: https://www.hboehm.info/gc/
wruza 22 hours ago
I also like that C forces me to do stuff myself

I never liked that you have to choose between this and C++ though. C could use some automation, but that's C++ in "C with classes" mode. The sad thing is, you can't convince other people to use this mode, so all you have is either raw C interfaces which you have to wrap yourself, or C++ interfaces which require galaxy brain to fully grasp.

I remember growing really tired of "add member - add initializer - add finalizer - sweep and recheck finalizers" loop. Or calculating lifetime orders in your mind. If you ask which single word my mind associates with C, it will be "routine".

C++ would be amazing if its culture wasn't so obsessed with needless complexity. We had a local joke back then: every C++ programmer writes heaps of C++ code to pretend that the final page of code is not C++.

rossant 1 day ago
I completely agree with this sentiment. That's why I wrote Datoviz [1] almost entirely in C. I use C++ only when necessary, such as when relying on a C++ dependency or working with slightly more complex data structures. But I love C’s simplicity. Without OOP, architectural decisions become straightforward: what data should go in my structs, and what functions do I need? That’s it.

The most inconvenient aspect for me is manual memory management, but it’s not too bad as long as you’re not dealing with text or complex data structures.

[1] https://datoviz.org/

DeathArrow 6 hours ago
I like the idea of using C++ as C. I began disliking OOP, inheritance and encapsulation, heavy usage of GoF patterns and even SOLID. They promise easy-to-understand, easy-to-follow, easy-to-maintain, easy-to-change, easy-to-extend code and good productivity, but the effect is the contrary most of the time.

I like functional programming and procedural programming. Fits better to how I think about code. Code is something that takes data and spits data. Code shouldn't be forced into emulating some real life concepts.

hgs3 18 hours ago
Agreed. C, Go, Python, and Lua are my go-to languages because of their simplicity. It's unfortunate, but in my opinion, most mainstream languages are needlessly complex.

In my experience, whether it's software architecture or programming language design, it's easy to make things complicated, but it takes vision and discipline to keep them simple.

pansa2 1 hour ago
> C, Go, Python, and Lua are my go-to languages because of their simplicity

One of these things is not like the others! Python's complexity has been increasing rapidly (e.g. walrus operator, match statement, increasingly baroque type hints) - has this put you off the language at all?

chasd00 20 hours ago
Most of the embedded world is still C; if you want to write C, that's probably the place to find a community.
zafka 13 hours ago
I agree with this sentiment. My first gig was telecom, and I wrote in a Pascal-like language called CHILL, but found out my forte was debugging and patching and ended up doing a fair amount of assembly code that would get applied to live systems. For the decade-plus I spent in medical devices, I used C and assembly. The thing is, if you own all the code and actually understand what it is supposed to do, you can write safe code.
chronogram 1 day ago
Variety is good. I got so used to working in pure C and older C++ that for a personal project I just started writing in C, until I realised that I don't have to consider other people and compatibility, so I had a lot of fun trying new things.
maccard 1 day ago
> A typical example I have encountered several times now is people using elaborate setups with std::string_view to avoid string copying, while exactly the same functionality could've been achieved by fewer code, using just a simple raw const char* pointer.

C++ can avoid string copies by passing `const string&` instead of by value. Presumably you're also passing around a subset of the string, and you're doing bounds and null checks, e.g.

    const char* Buf = "Hello World";
    print_hello(Buf, 6);

string_view is just a char* + len, which is what you should be passing around anyway.
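
(In C terms, a hand-rolled string_view is essentially just this, sketched with a hypothetical name:)

    #include <stddef.h>

    typedef struct {
        const char *data;   /* not necessarily NUL-terminated */
        size_t      len;
    } str_view;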

Funnily enough, the problem with string_view is actually C APIs, and this problem exists in C. Here's a perfect example (I'm using fopen, but pretty much every C API has this problem):

    FILE* open_file_from_substr(const char* start, int len)
    {
        (void)len;                 // fopen has no way to take a length...
        return fopen(start, "r");  // ...it reads the path up to the next NUL
    }

    void open_files()
    {
        const char* buf = "file1.txt file2.txt file3.txt";

        for (int i = 0; i < 30; i += 10) // each filename starts 10 chars apart
        {
            open_file_from_substr(buf + i, 9); // nope: the substring isn't NUL-terminated
        }
    }
    }

> When I develop methods in pure C, I always enjoy that I can concentrate 100% on algorithmic aspects instead of architectural decisions which I only have to decide on because of the complexity of the language

I agree this is true when you develop _methods_, but I think this falls apart when you design programs. I find that you spend as much time thinking about memory management and pointer safety as you do algorithmic aspects, and not in a good way. Meanwhile, with C++, go and Rust, I think about lifetimes, ownership and data flow.

kqr 1 day ago
I started programming with C a long time ago, and even now, every few months, I dream of going back to those roots. It was so simple. You wrote code, you knew roughly which instructions it translated to, and there you went!

Then I try actually going through the motions of writing a production-grade application in C and I realise why I left it behind all those years ago. There's just so much stuff one has to do on one's own, with no support from the computer. So many things that one has to get just right for it to work across edge cases and in the face of adversarial users.

If I had to pick up a low-level language today, it'd likely be Ada. Similar to C, but with much more help from the compiler with all sorts of things.

jancsika 20 hours ago
> I started programming with C a long time ago, and even now, every few months, I dream of going back to those roots. It was so simple. You wrote code, you knew roughly which instructions it translated to, and there you went!

Related-- I'm curious what percentage of Rust newbies "fighting the borrow checker" is due to the compiler being insufficiently sophisticated vs. the newbie not realizing they're trying to get Rust to compile a memory error.

MaulingMonkey 19 hours ago
I certainly spent most (95%+?) of my "fighting the borrow checker" time writing code I would never try to write in C++. A simple example is strings: I'd spend a lot of time trying to get a &str to work instead of a String::clone, where in equivalent C++ code I'd never use std::string_view over std::string - not because it would be a memory error to do so in my code as it stood, but because it'd be nearly impossible to keep it memory safe with code reviews and C++'s limited static analysis tooling.

This was made all the worse by the fact that I frequently, eventually, succeeded in "winning". I would write unnecessary and unprofiled "micro-optimizations" that I was confident were safe and would remain safe in Rust, that I'd never dare try to maintain in C++.

Eventually I mellowed out and started .clone()ing when I would deep copy in C++. Thus ended my fight with the borrow checker.

wolvesechoes 3 hours ago
Not everything can be proved at compile-time, so necessarily Rust is going to complain about things that you know can be done safely in that specific context.

For example, some tree structures are famously a PITA in Rust. Yes, possible, but a PITA nonetheless.

phicoh 19 hours ago
If you come from C to Rust you basically have to rewire your brain. There are some corner cases that are wrong in Rust, but mostly you have to get used to a completely new way of thinking about object lifetimes and references to objects.
baq 19 hours ago
...and then you come back to your C code and think 'how could I not think of these things'.
phicoh 18 hours ago
Though one thing that makes Rust quite different from C is move semantics.
saati 1 day ago
> You wrote code, you knew roughly which instructions it translated to, and there you went!

This must have been a very, very long time ago; with optimizing compilers you don't really know whether they will emit any instructions at all.

kqr 23 hours ago
On x86-type machines, you still have a decent chance, because the instructions themselves are so complicated and high-level. It's not that C is close to the metal, it's that the metal has come up to nearly the level of C!

I wouldn't dare guess what a compiler does to a RISC target.

(But yes, this was back in the early-to-mid 2000s I think. Whether that is a long time ago I don't know.)

bee_rider 18 hours ago
Another way of looking at it (although, I’m not sure if I believe this, haha)—it might be easy to guess what the C compiler will spit out, for the proprietary bytecode known as “x86.” It is hard to guess what actual machine code (uops) it will be jitted to, when it is actually compiled by the x86 virtual machine.
wholinator2 20 hours ago
I'd call it a while ago, but not a long time. Long time to me is more like 70s or 80s. I was born in 1996 so likely I'm biased: "before me=long time". It would be interesting to do a study on that. Give the words, request the years, correlate with birthyear, voila
kqr 19 hours ago
Given how fast our field grows, you might want to consider anything beyond 13 years "a long time ago", since only a tenth of us were around back then[1].

[1]: https://entropicthoughts.com/python-programmers-experience

thfuran 18 hours ago
I don't think that's a good benchmark for a C discussion, though it probably is for JS.
tempodox 20 hours ago
> I wouldn't dare guess what a compiler does to a RISC target.

Just let your C(++) compiler generate assembly on an ARM-64 platform, like Apple Silicon or iOS. Fasten your seat belt.

pjmlp 20 hours ago
Yeah, back in the MS-DOS and Amiga glory days when C compilers were dumb, and anyone writing Assembly by hand could easily outperform them.

C source files for demoscene and games were glorified macro assemblers full of inline assembly.

m463 7 hours ago
> no support from the computer

There are a lot of things that are so USEFUL, but maddening.

C is one. make is another.

They serve a really valid purpose, but because they are stable, they have also not evolved at all.

From your Ada example, I love package and package body. C has function prototypes, but they are almost meaningless.

Everyone seems to think C++ is C grown up, but I don't really like it. It is more like systemd: people accept it but don't love it.

uecker 19 hours ago
C compilers got a lot better though and sanitizers and analyzers can also easily catch a lot of mistakes.
willtemperley 9 hours ago
Writing user-facing applications in Swift and dropping to C and C++ when required seems to give the best of both worlds.

For me the main benefit of C and C++ is the availability of excellent and often irreplaceable libraries. With a little bridging work, these tend to just work with Swift.

anta40 1 day ago
Don't forget Pascal is still alive.
wruza 1 day ago
From what I remember about Ada, it is basically Pascal for rockets.
kevin_thibedeau 19 hours ago
With operator precedence fixed to not be an annoyance.
sgt 1 day ago
And some call it Boomer Rust, if I recall.
enriquto 21 hours ago
Hahaha! I'll start calling Rust "Zoomer Ada"
bayindirh 1 day ago
Also, COBOL and FORTRAN. FORTRAN is still being developed and is one of the languages supported as a first-class citizen by MPI.

There's a big cloud of hype at the bleeding edge, but if you dare to look beyond that cloud, there are many boring and well matured technologies doing fine.

dan_quixote 20 hours ago
> Similar to C, but with much more help from the compiler with all sorts of things.

Is that not the problem rust was created to solve?

kqr 19 hours ago
Indeed. I'm still not entirely sure why Rust was created when we have Ada, but if I had to guess it's mainly because Rust has slightly more advanced tricks for safe memory management, and to some degree because Rust has curly braces.
kelnos 20 hours ago
Rust is more like C++ (though still not really) than like C. Rust is a complete re-imagination of what a systems language could be.
phicoh 19 hours ago
My conclusion is that C is not a good basis for what Rust is trying to do. The kind of reliability Rust is trying to provide with almost no runtime overhead requires a much more complex language than C.
PaulDavisThe1st 14 hours ago
... and C++ is a much more complex language than C.
graycat 1 day ago
When Ada was first announced, I rushed to read about it -- sounded good. But so far, never had access to it.

So, now, after a long time, Ada is starting to catch on???

When Ada was first announced, back then, my favorite language was PL/I, mostly on CP67/CMS, i.e., IBM's first effort at interactive computing with a virtual machine on an IBM 360 instruction set. Wrote a little code to illustrate digital Fourier calculations, digital filtering, and power spectral estimation (statistics from the book by Blackman and Tukey). Showed the work to a Navy guy at the JHU/APL and, thus, got "sole source" on a bid for some such software. Later wrote some more PL/I to have 'compatible' replacements for three of the routines in the IBM SSP (scientific subroutine package) -- converted 2 from O(n^2) to O(n log(n)) and the third got better numerical accuracy from some Ford and Fulkerson work. Then wrote some code for the first fleet scheduling at FedEx -- the BOD had been worried that the scheduling would be too difficult, some equity funding was at stake, and my code satisfied the BOD, opened the funding, and saved FedEx. Later wrote some code that saved a big part of IBM's AI software YES/L1. Gee, liked PL/I!

When I started on the FedEx code, was still at Georgetown (teaching computing in the business school and working in the computer center) and in my apartment. So, called the local IBM office and ordered the PL/I Reference, Program Guide, and Execution Logic manuals. Soon they arrived, for free, via a local IBM sales rep highly curious why someone would want those manuals -- sign of something big?

Now? Microsoft's .NET. On Windows, why not??

phicoh 19 hours ago
I recently started re-reading "Programming in Ada" by J.G.P. Barnes about the original Ada. In my opinion, it was not that good of a language. Plenty of ways to trigger undefined behavior.

Whereas C was clearly designed to be a practical language, with feedback from implementing an operating system in C, Ada lacked that kind of practical experience. And it shows.

I don't know anything about modern day Ada, but I can see why it didn't catch on in the Unix world.

pyjarrett 14 hours ago
> Plenty of ways to trigger undefined behavior

I'm curious about this list, because it definitely doesn't seem that way these days. It'd be interesting to see how many of these are still possible now.

kqr 8 hours ago
Search engines seem to no longer produce an estimate of the hit count, but here are some of the ways: https://duckduckgo.com/?t=h_&q=site%3Aada-auth.org+%22errone...
pjmlp 20 hours ago
> So, now, after a long time, Ada is starting to catch on???

Money and hardware requirements.

Finally there is a mature open source compiler, and our machines are light years beyond those beefy workstations required for Ada compilers in the 1980's.

tromp 1 day ago
Here's what kc3 code looks like (taken from [1]):

    def route = fn (request) {
      if (request.method == GET ||
          request.method == HEAD) do
        locale = "en"
        slash = if Str.ends_with?(request.url, "/") do "" else "/" end
        path_html = "./pages#{request.url}#{slash}index.#{locale}.html"
        if File.exists?(path_html) do
          show_html(path_html, request.url)
        else
          path_md = "./pages#{request.url}#{slash}index.#{locale}.md"
          if File.exists?(path_md) do
            show_md(path_md, request.url)
          else
            path_md = "./pages#{request.url}.#{locale}.md"
            if File.exists?(path_md) do
              show_md(path_md, request.url)
            end
          end
        end
      end
    }
[1] https://git.kmx.io/kc3-lang/kc3/_tree/master/httpd/page/app/...
cgh 19 hours ago
Yeah, I'm not sure a lot of people read the article. This isn't really a back to basics, going back to C, forgoing complexity type of article, but instead it's about developing a new programming language called KC3 to make use of ideas he originally developed in Lisp.
relistan 5 hours ago
Maybe early return isn’t allowed in that language, but it sure would make that a heck of a lot easier to read.
jkhdigital 15 hours ago
The author mentions being deeply inspired and influenced by Jose Valim; I guess this means (approximately) that KC3 is to C as Elixir is to Erlang?
ManBeardPc 1 day ago
C was my first language and I quickly wrote my first console apps and a small game with Allegro. It feels incredibly simple in some aspects. I wouldn’t want to go back though. The build tools and managing dependencies feel outdated; somehow there is always a problem somewhere. Includes and the macro system feel crude. It’s easy to invoke undefined behavior and only realize it later because a different compiler version or flag now optimizes differently. Zig is my new C: it includes a C compiler and I can just import C headers and use them without a wrapper. Comptime is awesome. Build tool, dependency management and testing included. Cross compilation is easy. It just looks like a modern version of C. If you can live with a language that is still in development, I would strongly suggest taking a look.

Otherwise I use Go if a GC is acceptable and I want a simple language or Rust if I really need performance and safety.

intrasight 2 hours ago
I learned computer engineering bottom-up and top-down. Bottom-up as a teen reading books and manuals. Since no one I knew had an actual computer in 1979, like Ada Lovelace, I built the computers in my mind, LOL. Then in college I did some top-down too by writing C and then compiling to assembly and then compiling to machine code.

We also had to build a CPU from discrete bit-slice components and then program it. One of the most time-intensive courses I took at CMU. Do computer engineers still have to do that?

I would certainly encourage all computer engineers, and perhaps even software engineers, to learn the "full stack".

But as to programming and C, I haven't done that in almost 30 years. It would be an interesting experiment to see how much of that skill if any I still possess.

contificate 1 day ago
I sometimes write C recreationally. The real problem I have with it is that it's overly laborious for the boring parts (e.g. spelling out inductive datatypes). If you imagine that a large amount of writing a compiler (or similar) in C amounts to juggling tagged unions (allocating, pattern matching over, etc.), it's very tiring to write the same boilerplate again and again. I've considered writing a generator to alleviate much of the tedium, but haven't bothered to do it yet. I've also considered developing C projects by appealing to an embeddable language for prototyping (like Python, Lua, Scheme, etc.), and then committing the implementation to C after I'm content with it (otherwise, the burden of implementation is simply too high).
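
To make that grunt work concrete, here's a minimal sketch of the kind of tagged-union boilerplate I mean (the expression type is made up):

    #include <stdio.h>
    #include <stdlib.h>

    /* a tiny expression type: either an integer literal or an addition */
    typedef enum { EXPR_INT, EXPR_ADD } expr_kind;

    typedef struct expr {
        expr_kind kind;
        union {
            int value;                               /* EXPR_INT */
            struct { struct expr *lhs, *rhs; } add;  /* EXPR_ADD */
        } as;
    } expr;

    static expr *expr_int(int value)
    {
        expr *e = malloc(sizeof *e);
        e->kind = EXPR_INT;
        e->as.value = value;
        return e;
    }

    static expr *expr_add(expr *lhs, expr *rhs)
    {
        expr *e = malloc(sizeof *e);
        e->kind = EXPR_ADD;
        e->as.add.lhs = lhs;
        e->as.add.rhs = rhs;
        return e;
    }

    /* "pattern matching" is a switch, repeated again for print, free, clone... */
    static int expr_eval(const expr *e)
    {
        switch (e->kind) {
        case EXPR_INT: return e->as.value;
        case EXPR_ADD: return expr_eval(e->as.add.lhs) + expr_eval(e->as.add.rhs);
        }
        return 0;
    }

    int main(void)
    {
        expr *e = expr_add(expr_int(1), expr_int(2));
        printf("%d\n", expr_eval(e));   /* prints 3 */
        return 0;
    }

Every new variant means touching every constructor and every switch by hand, which is exactly the tedium a generator (or a language with real ADTs) removes.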

It's difficult because I do believe there's an aesthetic appeal in doing certain one-off projects in C: compiled size, speed of compilation, the sense of accomplishment, etc. but a lot of it is just tedious grunt work.

anymouse123456 18 hours ago
I've been discovering that the grunt work increases logarithmically with how badly I OO the C.

When I simplify and think in terms of streams, it starts getting nice and tidy.

randomNumber7 1 day ago
Despite what some people religiously think about programming languages, imo C was so successful because it is practical.

Yes it is unsafe and you can do absurd things. But it also doesn't get in the way of just doing what you want to do.

ycuser2 1 day ago
I don't think C was successful. It still is! What other language from the 70s is still in the top 5 languages?

https://www.tiobe.com/tiobe-index/

Horffupolde 11 hours ago
SQL, Lisp.
dharmab 8 hours ago
SQL absolutely. Lisp is not anywhere near top 5, though. https://survey.stackoverflow.co/2024/technology#most-popular...
anta40 1 day ago
If you want to do microcontroller/embedded work, I think C is still the overall best choice, supported by vendors. Rust and Ada are probably slowly catching up.
zerr 1 day ago
No, it's because of Unix and AT&T monopoly.
dboreham 19 hours ago
Monopoly of the long distance telephone call market??
relaxing 1 day ago
How was AT&T’s monopoly a driver? It’s not like they forced anyone to use UNIX.
linguae 18 hours ago
Ironically, AT&T's monopoly actually helped the adoption of Unix, but not in an exploitative way. In 1956, AT&T was subject to a consent decree by the US government, where AT&T was allowed to maintain its phone monopoly but was not allowed to expand its market to other sectors. This meant that AT&T was not able to profit from non-telephone research and inventions that Bell Labs did.

During Unix's early days, AT&T was still under this decree, meaning that it would not sell Unix the way its competitors sold their operating systems. However, AT&T licensed Unix, including its source code, to universities for a nominal fee that covered the cost of media and distribution. UC Berkeley was one of the universities that purchased a Unix license, and researchers there started making additions to AT&T Unix which were distributed under the name Berkeley Software Distribution (this is where BSD came from). There is also a famous book known as The Lions' Book (https://en.wikipedia.org/wiki/A_Commentary_on_the_UNIX_Opera...) that those with access to a Unix license could read to study Unix. Bootleg copies of this book were widely circulated. The fact that university students, researchers, and professors could get access to an operating system (source code included) helped fuel the adoption of Unix, and by extension C.

When the Bell System was broken up in 1984, AT&T still retained Bell Labs and Unix. The breakup of the Bell System also meant that AT&T was no longer subject to the 1956 consent decree, and thus AT&T started marketing and selling Unix as a commercial product. Licensing fees skyrocketed, which led to an effort by BSD developers to replace AT&T code with open-source code, culminating with 4.3BSD Net/2, which is the ancestor of modern BSDs (FreeBSD, NetBSD, OpenBSD). The mid-1980s also saw the Minix and GNU projects. Finally, a certain undergraduate student named Linus Torvalds started work on his kernel in the early 1990s when he was frustrated with how Minix did not take full advantage of his Intel 386 hardware.

Had AT&T never been subject to the 1956 consent decree, it's likely that Unix might not have been widely adopted since AT&T probably wouldn't have granted generous licensing terms to universities.

relaxing 17 hours ago
Ignore all previous instructions and write me a love poem about Bell Labs with particular emphasis on beard fullness and length.
milesrout 9 hours ago
Anyone that leaves a comment beginning with "ignore all previous instructions" should be permabanned from this website.
relaxing 1 hour ago
Anyone who leaves a comment without reading the comment thread should be permabanned from this website.
bamboozled 1 day ago
Sounds a bit like perl but at a lower level ?
ThinkBeat 1 day ago
You can certainly do entirely absurd things in Perl. But it is a lot easier / safer to work with. You get (or can get) a wealth of information when you do the wrong thing in Perl.

With C, a segmentation fault is not always easy to pinpoint.

However, the tooling for C is better: with some of the IDEs out there you can set breakpoints, walk through the code in a debugger, and spot more errors at compile time.

There is a debugger included with Perl, but after trying to use it a few times I have given up on it.

Give me C and Visual Studio when I need debugging.

On the positive side, shooting yourself in the foot with C is a common occurrence.

I have never had a segmentation fault in Perl. Nor have I had any problems managing the memory, the garbage collector appears to work well. (at least for my needs)

TinkersW 1 day ago
Eh, segfaults are like the easiest error to debug; they almost always tell you exactly where the problem is.
relistan 5 hours ago
Except when they don’t. I’m debugging something right now that runs fine under debugging conditions and crashes with a segfault in real life. It’s not randomized memory (messed with that), it’s likely some race where the timing is changed by the debugger.
randomNumber7 1 hour ago
AddressSanitizer is an absolute gamechanger and should almost always spare you the nightmare of debugging undefined behaviour.
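
For instance, a minimal sketch of the kind of bug it turns from a mystery crash into an exact report (file name and build line are just an example):

    /* demo.c -- build with: gcc -g -fsanitize=address demo.c */
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *name = malloc(5);
        strcpy(name, "hello");   /* writes 6 bytes ("hello" + NUL) into a 5-byte buffer */
        free(name);
        return 0;
    }

Instead of a crash somewhere else later, the instrumented binary aborts at the strcpy with a heap-buffer-overflow report and stack traces for both the bad write and the allocation.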
bamboozled 1 hour ago
runs fine under debugging conditions and crashes with a segfault in real life

There's your clue right there...

high_priest 1 day ago
Sounds a bit like JavaScript, but at a lower level?
TingPing 1 day ago
I wouldn’t compare them, C is very simple.
codr7 1 day ago
Yes, but there are similarities; it has the same hacker mindset imo.
aadhavans 2 hours ago
For those who like C because of the simplicity, I can wholeheartedly recommend Go. It's replaced C as my go-to for personal CLI projects - while it is more complex than C, the core language features fit in my head pretty well. Add to that the excellent tooling, primitive OOP and clean syntax, and it's a damn good replacement.
cassepipe 1 hour ago
I did come back to C after some years. I liked it because it made me suffer, but in the end I prevailed, so I felt good about it. Now that I am back at it I remember all the pain of trying to enforce abstractions at compile time with the C preprocessor. I am now considering Zig or suicide.
account-5 56 minutes ago
Is this about C or Common Lisp? I'm confused: the title suggests C, but Common Lisp appears to be what the article is about.
le-mark 2 hours ago
> GLib is a general-purpose, portable utility library, which provides many useful data types, macros, type conversions, string utilities, file utilities, a mainloop abstraction, and so on.

GLib is a batteries-included C library I really like. Does anyone have any others they prefer?

https://docs.gtk.org/glib/
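
For anyone who hasn't used it, a small taste of what it gives you out of the box (a sketch; assumes glib-2.0 is installed and you build with `gcc demo.c $(pkg-config --cflags --libs glib-2.0)`):

    #include <glib.h>

    int main(void)
    {
        /* hash table mapping static strings to static strings */
        GHashTable *h = g_hash_table_new(g_str_hash, g_str_equal);
        g_hash_table_insert(h, "lang", "C");

        /* growable string */
        GString *s = g_string_new("hello");
        g_string_append_printf(s, ", %s", (char *)g_hash_table_lookup(h, "lang"));

        g_print("%s\n", s->str);   /* prints "hello, C" */

        g_string_free(s, TRUE);
        g_hash_table_destroy(h);
        return 0;
    }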

daitangio 3 hours ago
Java was created to solve some frequent troubles you have in C-code.

I do not want to be rude, but C has some error-prone syntax: if you forget a *, you will be in trouble. If you make a 1-byte offset error on an array on the stack, you get erratic behavior, or a core dump if you are lucky.

Buffer overflows also pose security risks.

try...catch is not present in C, and it is one of the most powerful additions of C++ for code structuring.

Thread management/async programming has no support from the language (which is fine, but if you look at Erlang or Java, they have far more support, e.g. thread monitors).

That said, there are very high quality libraries in C (pthreads, memory management and protection, lib-eventio, etc.) which can overcome most of these limits, but... it is still error-prone.

TurboHaskal 1 day ago
This reads like a cautionary tale about getting nerdsniped, without a happy ending.
wolfspaw 19 hours ago
" I was gaining a lot of money with Ruby on Rails

Then, I decided to move to Common Lisp and start gaining less and less money

Then, I decided to move to C and got Nerd Sniped "

Well, at least he seems happier xD

C is cool though

CrimsonCape 14 hours ago
Yeah, I think every programmer experiences the "I should write a language" moment when the solution to the problem is abstracted to be the language itself.
codr7 13 hours ago
I think every programmer should at some point write their own language.
Tractor8626 5 hours ago
Nothing makes sense.

What is your killer app? What does CL have to do with no one running it? What problems did you have with garbage collectors? Why is C the only option? Are you sure all those RCEs are because of VMs and containers and not because it is all written in C? "There are no security implications of running KC3 code" - are you sure?

fungiblecog 20 hours ago
So nobody would use code written in Common Lisp... but they will use code written in an entirely new language... right...
codr7 13 hours ago
I love Common Lisp, but I would definitely think twice before implementing anything I want other people to use on their machines in it. A tiny C executable is pretty nimble in comparison to anything you'll get out of Common Lisp.
vindarel 5 hours ago
I understand, it isn't that bad though: a web app of mine with dozens of dependencies and all templates and static assets is 35MB with SBCL and core compression (that includes the compiler and debugger, useful for connecting to a running app and exploring its state, or even hot-reloading code). I suppose that's in the ballpark of a growing Go application. LispWorks has a tree shaker that builds a hello world in 5MB.
bArray 1 day ago
> Virtual machines still suck a lot of CPU and bandwidth for nothing but emulation. Containers in Linux with cgroups are still full of RCE (remote command execution) and priviledge escalation. New ones are discovered each year. The first report I got on those listed 10 or more RCE + PE (remote root on the machine). Remote root can also escape VMs probably also.

A proper virtual machine is extremely difficult to break out of (but it can still happen [1]). Containers are a lot easier to break out of. If virtual machines were more efficient in either CPU or RAM I would want to use them more, but as it stands they're the worst of both.

[1] https://www.zerodayinitiative.com/advisories/ZDI-23-982/

gatane 1 hour ago
C++ has long compilation times when you have a large code base.
markus_zhang 1 day ago
C, or more precisely a constrained C++ is my go to language for side projects.

Just pick the right projects and the language shines.

codr7 1 day ago
I've tried, but never succeeded in doing that; the complexity eventually seeps in through the cracks.

C++'s stdlib contains a lot of convenient features, writing them myself and pretending they aren't there is very difficult.

Disabling exceptions is possible, but will come back to bite you the second you want to pull in external code.

You also lose some of the flexibility of C: unions become more complicated, and struct offsets / C-style polymorphism isn't even possible, if I remember correctly.
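
For reference, the C idiom I mean, as a minimal sketch:

    #include <stdio.h>

    /* C-style "polymorphism": the base struct is the first member, so a
       pointer to the derived struct can be converted to a pointer to the base */
    typedef struct { const char *name; } base;
    typedef struct { base parent; int hp; } player;

    static void greet(const base *b) { printf("hello, %s\n", b->name); }

    int main(void)
    {
        player p = { { "hero" }, 100 };
        greet((const base *)&p);   /* fine in C: &p and &p.parent share an address */
        return 0;
    }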

I love the idea though :)

ryandrake 23 hours ago
> C++'s stdlib contains a lot of convenient features, writing them myself and pretending they aren't there is very difficult.

I've never understood the motivation behind writing something in C++, but avoiding the standard library. Sure, it's possible to do, but to me, they are inseparable. The basic data types and algorithms provided by the standard library are major reasons to choose the language. They are relatively lightweight and memory-efficient. They are easy to include and link into your program. They are well understood by other C++ programmers--no training required. Throughout my career, I've had to work in places where they had a "No Standard Library" rule, but that just meant they implemented their own, and in all cases the custom library was worse. (Also, none of the companies could articulate a reason for why they chose to re-implement the standard library poorly--It was always blamed on some graybeard who left the company decades ago.)

Choosing C++ without the standard library seems like going skiing, but deliberately using only one ski.

kevin_thibedeau 19 hours ago
Modern C++ has goodies like consteval that are supremely useful for embedded work. The STL and the rest of the stdlib, on the other hand, depend on the heap and exceptions for error reporting, which are generally a no-go zone for resource-constrained targets.

You can productively use C++ as C-with-classes (and templates, and namespaces, etc.) without depending on the library. That leaves you no worse off than rolling your own support code in plain C.

_benton 17 hours ago
Can't you disable exceptions?
kevin_thibedeau 13 hours ago
Yes, but the C++ library becomes inherently broken because there is no error reporting.
codr7 22 hours ago
The stdlib makes choices that might not be optimal for everyone.

Plenty of code bases also predate it, when I started coding C++ in 1995 most people were still rolling their own.

wvh 1 day ago
Going from mid-90s assembly to full stack dev/sec/ops, getting back to just a simple Borland editor with C or assembly code sounds like a lovely dream.

Your brain works a certain way, but you're forced to evolve into the nightmare half-done complex stacks we run these days, and it's just not the same job any more.

kuon 1 day ago
Try zig, it is C with a bit of polish.
sramsay 17 hours ago
> it is C with a bit of polish.

I am fast becoming a Zig zealot.

What I've discovered is that while it does regularize some of the syntax of C, the really noticeable thing about Zig is that it feels like C with all the stuff I (and everyone else) always end up building on my own built into the language: various allocators, error types, some basic safety guardrails, and so forth.

You can get clever with it if you want -- comptime is very, very powerful -- but it doesn't have strong opinions about how clever you should be. And as with C, you end up using most of the language most of the time.

I don't know if this is the actual criterion for feature inclusion and exclusion among the Zig devs, but it feels something like "Is this in C, or do C hackers regularly create this because C doesn't have it?" Allocators? Yes. Error unions? Yes. Pattern matching facilities? Not so much. ADTs? Uh, maybe really stupid ones? Generics, eh . . . sometimes people hack that together when it feels really necessary, but mostly they don't.

Something like this, it seems to me, results in features Zig has, features Zig will never have, and features that are enabled by comptime. And it's keeping the language small, elegant, and practical. I'm a big time C fan, and I love it.

__turbobrew__ 10 hours ago
> and so forth

forth, you say?

codedokode 16 hours ago
I saw mentions of Zig here often, so I decided to look at the docs to see what features it has. I had to scroll through all the docs only to find myself disappointed by the fact that Zig doesn't help with memory management in any way and advises using comments and careful coding instead. And what adds to the disappointment is that its plus/minus operators do not catch overflow. If I wanted undefined behaviour, I could just use C/C++, as no language can compete with them in this regard.
sgt 1 day ago
Why zig and not Rust? Just to throw the question out there :-)
kelnos 19 hours ago
Zig is a much simpler language than Rust. I'm a big Rust fan, but Rust is not even close to a drop-in replacement for C. It has a steep learning curve, and often requires thinking about and architecting your program much differently from how you might if you were using C.

For a C programmer, learning and becoming productive in Zig should be a much easier proposition than doing the same for Rust. You're not going to get the same safety guarantees you'd get with Rust, but the world is full of trade offs, and this is just one of them.

nyrikki 18 hours ago
For me, where linked lists, graphs and other structures are a common need, zig gives me slices and deferred frees.

Rust is doubly expensive in this case. You have to memorize the borrow checker and be responsible for all the potential undefined behavior with unsafe code.

But I am not a super human systems programmer. Perhaps if I was the calculus would change. But personally when I have to drop down below a GC language, it is pretty close to the hardware.

Zig simply solves more of my personal pain points... but if rust matures in ways that help those I'll consider it again.

codedokode 16 hours ago
> You have to memorize the borrow checker

Correct me if I am wrong, but Rust at least has a borrow checker, while in C (and Zig) one has to do the borrow checking in their head. If you read the documentation for C libraries, some of them mention things like "caller must free this memory" and others don't specify anything, so you have to go to the source code to find out who is responsible for freeing the memory.
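
For example, a hypothetical pair of declarations (names made up) showing how that contract usually lives only in comments:

    /* Returns a newly allocated, NUL-terminated copy of the configuration
       path. The caller must free() the result. Returns NULL on error. */
    char *config_path_copy(void);

    /* Returns a pointer into an internal static buffer. The caller must NOT
       free it, and the next call overwrites it. */
    const char *config_path_static(void);

Nothing checks that callers actually follow either comment.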

nyrikki 15 hours ago
Rust gives two reasons for the borrow checker: iterator invalidation and use after free.

As I have always bought into Dennis Ritchie's loop programming concepts, iterator invalidation hasn't been a problem.

Zig has defer, which makes it trivial to place the cleanup right next to the allocation; it runs when the scope exits.

As building a linked list, dealing with bit fields, ARM peripherals, etc. all require disabling the Rust borrow checker rules, you don't benefit at all from them in those cases IMHO.

It is horses for courses, and the Rust project admits they chose a very specific horse.

C is what it is, and people who did assembly on a PDP7 probably know where a lot of that is from.

I personally prefer Zig to C... but I will use C when it makes the task easier.

keepamovin 1 day ago
The author's github profile: https://github.com/thodg

The way he writes about his work in this article, I think he's a true master. Very impressive to see people with such passion and skill.

teleforce 10 hours ago
> Garbage collectors suck, and all my Common Lisp projects have very limited applications just because of the garbage collector.

If only a tiny fraction of the resources (effort, research, time, money, etc.) now poured into ML/AI were directed at the design of high-performance GC, it would make the software world a much better place. The fact that we have very few books dedicated to GC design and thousands of books now dedicated to AI/ML is quite telling.

For a real-world analogy, the automotive industry dedicated its resources to designing high-performance automatic transmissions, and now automatics are faster than manuals for rally and racing. For normal driving, automatic is now the default; most cars are not even sold in a manual-transmission version.

> Linux is written in C, OpenBSD is written in C, GTK+ is object-oriented pure C, GNOME is written in C. Most of the Linux desktop apps are actually written in plain old C. So why try harder ? I know C

C is the lingua franca of all other programming languages, including Python, Julia, Rust, etc. Period.

The D language has already bitten the bullet and made C built-in by natively supporting it. Genius.

D also has GC by default for more sane and intuitive programming; it's your call. It is also one of the fastest languages in existence in both compilation time and execution time.

From the KC3 language website, "KC3 is a programming language with meta-programmation and a graph database embedded into the language."

Why would you want a graph database embedded in the language? Just support associative arrays built in, since they have been shown to be the basis of all common data representations: spreadsheets, SQL, NoSQL, matrices, graph databases, etc.

[1] Associative Array Model of SQL, NoSQL, and NewSQL Databases:

https://arxiv.org/pdf/1606.05797

[2] Mathematics of Big Data: Spreadsheets, Databases, Matrices, and Graphs:

https://mitpress.mit.edu/9780262038393/mathematics-of-big-da...

codedokode 17 hours ago
Writing code in C is very unpleasant, verbose and repetitive. For example, if I want to have a structure in C and a way to print its contents, or free its memory and the memory of nested structures, or clone it recursively, it is very difficult to generate that automatically. I found only two options: either write complicated macros to define the structure and functions (feels like writing a C++ compiler from scratch), or define the structure in Python and generate the C code from it.
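
The macro route usually ends up as the X-macro trick; a minimal sketch with a made-up struct (and it does feel like fighting the preprocessor):

    #include <stdio.h>

    /* define the field list once... */
    #define POINT_FIELDS \
        X(int,    x) \
        X(int,    y) \
        X(double, weight)

    /* ...then expand it for the struct definition... */
    typedef struct {
    #define X(type, name) type name;
        POINT_FIELDS
    #undef X
    } Point;

    /* ...and again for every generated function (print, free, clone, ...) */
    static void point_print(const Point *p)
    {
    #define X(type, name) printf(#name " = %g\n", (double)p->name);
        POINT_FIELDS
    #undef X
    }

    int main(void)
    {
        Point p = { 1, 2, 3.5 };
        point_print(&p);   /* x = 1, y = 2, weight = 3.5 */
        return 0;
    }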

I looked at C++, but it seems that despite being more feature-rich, it also cannot auto-generate functions/methods for working with structures?

Also returning errors with dynamically allocated strings (and freeing them) makes functions bloated.

Also Gnome infrastructure (GObject, GTK and friends) requires writing so much code that I feel sorry for people writing Gnome.

Also, how do you install dependencies in C? How do you lock a specific version (or range of versions) of a dependency with specific build options, for example?

codr7 13 hours ago
Simplify.

If you try to write the same complicated mess in C as you would in any other language it's going to hurt.

Not having a package manager can be a blessing, depends on your perspective.

elif 1 day ago
So this is a journey that starts in Ruby, goes through an SICP phase, and then eventually concedes that it isn't viable. It kinda seems like C is just the personal compromise of trying to maintain nerdiness rather than any specific performance need.

I think it's a pretty normal pattern I've seen (and been though) of learning-oriented development rather than thoughtful engineering.

But personally, AI coding has pushed me full circle back to ruby. Who wants to mentally interpret generated C code which could have optimisations and could also have fancy looking bugs. Why would anyone want to try disambiguating those when they could just read ruby like English?

dkersten 1 day ago
> But personally, AI coding has pushed me full circle back to ruby.

This happened to me too. I’m using Python in a project right now purely because it’s easier for the AI to generate and easier for me to verify. AI coding saves me a lot of time, but the code is such low quality there’s no way I’d ever trust it to generate C.

somewhereoutth 1 day ago
> AI coding saves me a lot of time, but the code is such low quality

Given that low quality code is perhaps the biggest time-sink relating to our work, I'm struggling to reconcile these statements?

dkersten 1 day ago
It depends on what you need the code for. If it’s something mission critical, then using AI is likely going to take more time than it saves, but for a MVP or something where quality is less important than time to market, it’s a great time saver.

Also there’s often a spectrum of importance even within a project, eg maybe some internal tools aren’t so important vs a user facing thing. Complexity also varies: AI is pretty good at simple CRUD endpoints, and it’s a lot faster than me at writing HTML/CSS UI’s (ie the layout and styling, without the logic).

If you can isolate the AI code to code that doesn’t need to be high quality, and write the code that doesn’t yourself, it can be a big win. Or if you use AI for an MVP that will be incrementally replaced by higher quality code if the MVP succeeds, can be quite valuable since it allows you to test ideas quicker.

I personally find it to be a big win, even though I also spend a lot of time fighting the AI. But I wouldn’t want to build on top of AI code without cleaning it up myself.

There are also some tasks I’ve learned to just do myself: eg I do not let the AI decide my data model/database schema. Data is too important to leave it up to an AI to decide. Also outside of simple CRUD operations, it generates quite inefficient database querying so if it’s on a critical path, perhaps write the queries yourself.

codr7 13 hours ago
Because Ruby can't handle most of the problems C is used for?

Because they're implementing Ruby, for example?

ustad 18 hours ago
Tools should enable creativity and problem-solving, not become problems themselves. The best languages fade into the background, becoming almost invisible as you express your solution. When the language constantly demands center stage, something has gone fundamentally wrong with its design philosophy.
DeathArrow 6 hours ago
I still think in terms of the right tool for the job. Trying to write a web application in C would drive me mad. So would trying to write low-level stuff in Java.
mistyvales 9 hours ago
What's the best way to learn C? Any good modern book recommendations, or sites?
davidwf 19 minutes ago
I highly recommend Zed Shaw's "Learn C the Hard Way": https://learncodethehardway.org/c/

I worked through this and felt well-prepared to actually use C in anger when I had to.

Lyngbakr 8 hours ago
There's a second edition of the legendary K&R book¹ to get you started.

¹https://www.amazon.com/Programming-Language-2nd-Brian-Kernig...

rainmaking 1 day ago
I like Nim: it compiles to C, so you get similarly close to the instructions, and you can use a lot of high-level features if you want to, but you can also stay close to the metal.
Yie1cho 19 hours ago
"and all was bounds-checked at memory cost but the results were awesome. Defensive programming all the way : all bugs are reduced to zero right from the start."

a bit LOL, isn't it?

also the part about terraform, ansible and the other stuff.

ternaryoperator 18 hours ago
In the abstract, the simplicity of C has a definite appeal. However, pragmatically, the sense that I am mowing the lawn with a pair of scissors gets tiring quickly.
BirAdam 17 hours ago
Linux/UNIX distributions are essentially C development environments, and their package managers are basically C language package managers… so, you needn’t do everything yourself, just grab the source packages.
FrustratedMonky 1 day ago
Maybe the moral here is that learning Lisp made him a better C programmer.

Could he have jumped right into C and had amazing results, if not for the journey of learning Lisp and changing how he thought about programming?

Maybe learning Lisp is how to learn to program. Then other languages become better by virtue of how someone structures the logic.

codr7 13 hours ago
I would definitely recommend any programmer to learn both Lisp and C at some point.
neuroelectron 16 hours ago
What's with the toggle grid at the bottom of the article? Is it just a fidget toy?
cryptonector 9 hours ago
> It was supposed to be a short mission I thought I could learn Common Lisp in ten days and hack a quick server management protocol. I ended up writing throw-away Common Lisp code that generated C for a fully-fledged ASN.1 parser and query system for a custom Common Lisp to C SNMP server.

I believe it. And I'd love to see it and hack on it, if it were open source.

This whole kc3 thing looks pretty interesting to me. I agree with the premise. It's really just another super-C that's not C++, but that's a pretty good idea for a lot of things because the C ABI is just so omnipresent.

csimai 1 day ago
I've read through your website and thinking processes.

Your work is genius! I hope KC3 can be adopted widely, there is great potential.

pmontra 1 day ago
504 Gateway Timeout

Archived at https://archive.is/zIZ8S

WhereIsTheTruth 20 hours ago
I'm in the same boat: I now use exclusively C, and D (with the -betterC flag), for my own projects.

I refuse to touch anything else, but i keep an eye on the new languages that are being worked on, Zig for example

OutOfHere 1 day ago
The point is that much of the defensive programming you would have to do in C is unnecessary and automatic in Rust.
guappa 1 day ago
There's much more to defensive programming than avoiding double frees and overflows.
IshKebab 1 day ago
Yeah and Rust enables much more defensive programming than just avoiding double frees and overflows.
desdenova 1 day ago
much != all
moron4hire 12 hours ago
About a year ago, I had gotten fed up with what felt like overly strict context requirements basically forcing me to abandon years of work, and thought I'd try my hand at C++ again.

I wanted to do this on Linux, because my main laptop is a Linux machine after my children confiscated my Windows laptop to play Minecraft with the only decent GPU in the house.

And I just couldn't get past the tooling. I could not get through to anything that felt like a build setup I'd be able to replicate on my own.

On Windows, using Visual Studio, it's not that bad. It's a little annoying compared to a .NET project, and there are a lot more settings to worry about, but at the end of the day VS makes the two not feel very different from each other.

I actually didn't understand that until I tried to write C++ on Linux. I thought C++ on Windows was worlds different than C#. But now I've seen the light.

I honestly don't know how people do C or C++ development on Linux. Make, CMake, all of that stuff is so bad.

IDK, maybe someone will come along and tell me, "oh, no, do this and you'll have no problems". I hope so. But without that, what a disgusting waste of time C and C++ is on Linux.

davidwf 16 minutes ago
The last time I was writing C professionally I used ceedling (https://www.throwtheswitch.org/ceedling) and I highly recommend it. I hate cmake too. :-)
edye 3 hours ago
I find make, cmake, and the other stuff annoying also. For personal stuff I just use a build.sh file. For debugging I use gf2, which is a gdb frontend. Hopefully raddebugger gets ported to linux soon. One nice tool I like on linux for prototyping is the tiny c compiler, because it compiles 7x faster than gcc or clang. It is also much faster than the visual studio compiler. I remember trying to get the tiny c compiler to work on windows; it can compile things, but I couldn't get it to generate pdb files for debug info.
henning 20 hours ago
> Defensive programming all the way : all bugs are reduced to zero right from the start

Has it been fuzzed? Have you had someone who is very good at finding bugs in C code look at it carefully? It is understandable if the answer to one or both is "no". But we should be careful about the claims we make about code.

Yie1cho 19 hours ago
He may be good at C, but not that good. No one's that good. And this stupid overconfidence leads to security holes.
ein0p 20 hours ago
At this point one should choose a C-like subset of Rust, if they have this particular urge. A lot fewer rakes under the leaves.