The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.
VHDL had this figured out as early as 1987. I spent many years writing Verilog test benches and chasing numerous race conditions; those types of bugs simply don't exist in VHDL.
The Verilog rules—using non-blocking assignments for sequential logic and blocking assignments for combinational logic—fail as soon as the scenario becomes slightly complex. Verilog is suitable when you already have the circuit in your head and just need to write it down quickly. In contrast, VHDL forces you to think about concurrent processes in the correct way. While the former is faster to write, the latter is the correct approach.
Even though SystemVerilog added some patches, the underlying execution model still has inherent race conditions.
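The register-swap race being described is easy to demonstrate outside any HDL. Below is an illustrative Python sketch (all names are mine, not from any simulator): with blocking-style assignments the result depends on which process the scheduler runs first, while a two-phase non-blocking-style update is order-independent.

```python
def swap_blocking(order):
    # Blocking-style ('='): each "process" reads and writes immediately,
    # so the final state depends on scheduler order -- a race.
    regs = {"a": 1, "b": 2}
    procs = {"p1": lambda r: r.update(a=r["b"]),
             "p2": lambda r: r.update(b=r["a"])}
    for name in order:
        procs[name](regs)
    return regs

def swap_nonblocking(order):
    # Non-blocking-style ('<='): every process reads the OLD values, and
    # writes are deferred to the end of the time step (two-phase update),
    # so scheduler order no longer matters.
    regs = {"a": 1, "b": 2}
    pending = {}
    procs = {"p1": lambda r: {"a": r["b"]},
             "p2": lambda r: {"b": r["a"]}}
    for name in order:
        pending.update(procs[name](regs))
    regs.update(pending)
    return regs

# Blocking: the two schedules disagree.
assert swap_blocking(["p1", "p2"]) != swap_blocking(["p2", "p1"])
# Non-blocking: both schedules swap cleanly.
assert swap_nonblocking(["p1", "p2"]) == swap_nonblocking(["p2", "p1"]) == {"a": 2, "b": 1}
```

This is the mechanism behind the "non-blocking for sequential logic" rule: the deferred write is what makes the code independent of process ordering.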
I used to be a huge VHDL proponent: talking about the delta cycle stuff, giving VHDL classes at work to new college grads, and such. And then I moved to the West Coast and was forced to start using Verilog.
And in the 21 years since, I’ve never once run into an actual simulation determinism issue.
It’s not bad to have a strict simulation model, but if some very basic coding style rules are followed (which everybody does), it’s just not a problem.
I don’t agree at all with the statement that Verilog fails when things become too complex. The world’s most complex chips are built with it; if chips couldn’t be designed reliably with it, that simply wouldn’t be the case.
On a practical level, you're right, most of my team's work is done in Verilog.
That being said, I still have a preference for the VHDL simulation model. A design that builds correctness directly into the language structure is inherently more elegant than one that relies on coding conventions to constrain behavior.
My memory is definitely rusty on this, but you can easily construct cases where the VHDL delta cycle model creates problems where it doesn’t for Verilog.
I remember inserting clock signal assignments in VHDL to get a balanced delta cycle clock tree. In Verilog, that all simply gets flattened.
I can describe the VHDL delta cycle model pretty well, and I can’t for Verilog, yet the Verilog model has given me fewer issues in practice.
As for elegance: I can’t stand the verboseness of VHDL anymore. :-)
The question for me is: where do I capture and describe the physical reality the model is meant to represent? A simulation model can be very elegant, but does it represent how physical things really behave? Can we even expect to do that at RTL, or only further down the design flow? As the name suggests, we are talking about transferring data between registers. At the RTL, that is what I can expect to describe.
At the end of the day, what I write will become an electrical circuit - in an FPGA or an ASIC (or both). Having exact modelling of wire delays, capacitance, cross talk, and cell behavior too early makes it impossible to simulate fast enough to iterate. So we need a more idealized world, while keeping in mind that (1) it is an idealized world and (2) sooner or later the rubber will meet the road.
To me, Verilog and SystemVerilog allow me to do this efficiently. Warts and all.
Oh, and also: where in my toolchain is my VHDL model translated/transformed into Verilog? How good is that translation? How much does the dual licensing cost?
What about things like mixed language simulation, formal verification between a Verilog netlist and the RTL, mapping to cell libraries in Verilog, or integration of IP cores written in SystemVerilog with your model?
Are the tools for VHDL as well tested as those for Verilog? How big is the VHDL team at the tool vendor, library vendor, IP vendor, and fab vendor compared to the Verilog/SV team? Can I expect the same support as a VHDL user as for Verilog? How much money does a vendor earn from VHDL customers compared to Verilog/SV? How easy is it to find employees with VHDL experience?
VHDL may be a very nice language for simulation. But the engineering and business side is messy, and dev time and money can't be ignored. Getting things done as fast and as cheaply as possible while still meeting a lot of functional and business requirements is what we as engineers are responsible for. Does VHDL make that easier or not?
> where in my toolchain is my VHDL model translated/transformed into Verilog?
It's not. Why would it be?
As much as I like Verilog, VHDL is a first class RTL language just like Verilog. I've done plenty of chips that contain both VHDL and Verilog. They both translate directly to gate level.
These days, most EDA tools use Verific parser and elaborator front-ends. The specific tool magic happens after that and that API is language agnostic.
> How easy is it to find employees with VHDL experience?
On the East Coast and in Europe: much easier than finding employees with Verilog experience. (At least that was the case 20 years ago, I have no clue how it is today.)
One thing that has changed a lot is that SystemVerilog is now the general language of choice for verification, which helps give (System)Verilog an edge for RTL design too.
I've been involved in FPGA and ASIC projects since 1997, predominantly in Europe, nowadays more in Asia and some in the US. Since ~2010 I have only seen VHDL in small shops targeting only FPGAs, and in government-heavy projects like defence and space - and nowadays these are also by and large SV. The ratio is something like one VHDL project for every 20 Verilog/SV projects. They teach VHDL at universities, and then people get to experience SV as soon as they enter the market.
Typical issues are still as given before. Many small IP vendors, especially in communication and networking, use, understand, and support only SV. I agree that SV for verification is a big driver.
The geographic constraint is probably the real answer to "which is better" for most people. You learn what your team uses and what your local jobs demand. Theoretical elegance matters less than "can I get hired next month".
Over the years I have run Altera, Lattice, and Xilinx tools... and almost all reasonably complex projects were done in Verilog. If I recall correctly, Xilinx fully integrated its Synopsys export workflow a few years back, but I'm not sure where that went after the mergers.
This actually sounds a bit like a C/C++ argument. Roughly: Yes, you can easily write incorrect code but when some basic coding conventions are followed, UAF/double free/buffer overflows/... are just not a problem. After all, some of the world's most complex software is built with C / C++. If you couldn't write software reliably with C / C++, that could never be the case.
I.e. just because teams manage to do something with a tool does not mean the tool didn't impede (or vice versa, enable) the result. It just says that it's possible. A qualitative comparison with other tools cannot be established on that basis.
While VHDL makes a fun academic toy language, it has always been Verilog in commercial settings. Both languages can develop hard-to-trace bugs when the optimizer decides to simply remove things it thinks are unused. =3
How does this compare to Chisel [1]? I never could get around the whole Scala tooling - it seemed a bit over the top.
Though I guess it is a bit more mature and probably more enterprisey.
[1] https://github.com/chipsalliance/chisel
> i never could get around the whole scala tooling
Scala is popular in places like Alphabet, which apparently allows Go & Scala projects in production.
However, I agree that while Scala is very powerful in some ways, it just doesn't have a fun aesthetic. If one has to go spelunking for scalable hardware accelerators, a vendor's Linux DMA LLVM C/C++ API is probably less fragile.
For my simple projects, one zynq 7020 per node is way more than we should ever need. =3
I disagree. We've produced numerous complex chips with VHDL over the last 30 years. Most of the vendor models we have to integrate with are Verilog, so perhaps it is more popular, but that's no problem for us. We've found plenty of bugs in both the VHDL and Verilog support of the commercial tooling we use; neither is particularly worse (provided you're happy to steer clear of the more recent VHDL language features).
> While VHDL makes a fun academic toy language, ...
I spent the first half of my career working at some of the largest companies of the time, on huge communication ASICs that were all written in VHDL; there was no Verilog in sight.
As much as I prefer to write Verilog now, VHDL is without question a more robust and better specified language, with features that Verilog only gained a decade later through SystemVerilog.
There's a reason why almost all major EDA tools support VHDL just as well as Verilog.
> The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.
This is what I use when I play with hardware simulation in Haskell:
type S a = [a]
register :: a -> S a -> S a
register a0 as = a0:as
-- combinational logic can be represented as typical pure
-- functions and then glued into "circuits" with registers
-- and map/zip/unzip functions.
This also separates the externally visible events, recorded in the (infinite) list of values, from the externally unobservable pure (combinational) logic. And one can test the combinational logic separately, with property-based testing, etc.
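A rough Python analog of that stream view, as a sketch of the same idea (function names are mine): a register is a one-cycle delay that prepends its reset value, and combinational logic is an ordinary pure function over value streams.

```python
def register(init, xs):
    # A register delays its input stream by one cycle, emitting the
    # reset value first -- the counterpart of Haskell's `a0:as`.
    return [init] + xs

def xor_gate(xs, ys):
    # Combinational logic: a pure function mapped over value streams,
    # testable on its own without any notion of time or scheduling.
    return [x ^ y for x, y in zip(xs, ys)]

# A toggle detector: output is 1 whenever the input differs from its
# previous (registered) value.
data = [1, 0, 1, 1]
changed = xor_gate(data, register(0, data))
assert changed == [1, 1, 1, 0]
```

Note that `zip` silently truncates to the shorter stream, which is exactly the behavior you want when gluing a delayed stream to an undelayed one.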
I don't understand this: isn't the thing in the article only relevant for software simulations? In hardware, ordering is arbitrary, like in Verilog, or at least dependent on wire lengths that are not specified in the HDL (unless you delay the effect to the next clock update, which it seems to me will work the same in all HDLs and targets).
And afaik HDLs are almost exclusively used for hardware synthesis; I've never seen any software written in those languages.
So it doesn't seem important at all. In fact, for software simulation of hardware you'd want the simulation to randomly choose anything possible in hardware, so the Verilog approach seems correct.
I'm a long time Verilog user (30+ years, a dozen or so tapeouts), and I've even written a couple of compilers, so I'm intimate with the gory details of event scheduling.
Back in the early days, some people depended on how the original Verilog interpreter ordered events. It was a silly thing (models would only run on one simulator, and caused lots of angst).
The '<=' (non-blocking) assignment fixed a lot of these problems: using it correctly means that you can model synchronous logic without caring about event ordering (at the cost of an extra copy and an extra event, which can be mostly optimised away by a compiler).
In combination, 'always @(*)' with '=', plus assign, give you reliable combinatorial logic.
In real world logic a lot of event ordering is non-deterministic - one signal can appear before/after another depending on temperature. All in all, it's best not to design depending on it if you possibly can. Do it right and you don't care about event ordering: let your combinatorial circuits waggle around as their inputs change, and catch the result in flops synchronously.
IMHO Verilog's main problems are that it: a) mixes flops and wires in a confusing way, and b) once you leave the synthesisable subset, lets you do things that do depend on event ordering and can get you into trouble (but you need that sometimes to build test benches).
I love that VHDL formalizes Verilog's pragmatic blundering, but emphasizing delta-cycle ordering is "inside baseball" and IMO bad marketing. VHDL's approach is conceptually clean, but from a practical perspective, this ordering doesn't (and shouldn't) matter.
Better to emphasize the type system, which makes a durable and meaningful difference to users (both experienced and new). My go-to example is fixed-point arithmetic: for VHDL, this was an extension to the IEEE libraries and didn't require a change to the underlying language (think of how C++'s std:: evolves somewhat separately from compilers). Verilog's type system is insufficiently expressive to add fixed-point types without changes to the language itself. This positions VHDL better for e.g. low-precision quantization for AI/ML.
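To make the fixed-point point concrete: the idea is that the binary point is part of the type and is tracked statically through arithmetic. VHDL's ieee fixed_pkg does this properly (with saturation, rounding, etc.); the following is only a toy Python sketch of the core idea, with class and field names of my own invention.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fixed:
    # Toy fixed-point value: a raw integer plus the number of
    # fractional bits (i.e. the binary point position).
    raw: int
    frac_bits: int

    def to_float(self):
        return self.raw / (1 << self.frac_bits)

    def __mul__(self, other):
        # Multiplying a value with b fractional bits by one with d
        # fractional bits yields b+d fractional bits: the result's
        # binary point is known statically, no runtime bookkeeping.
        return Fixed(self.raw * other.raw, self.frac_bits + other.frac_bits)

a = Fixed(raw=3, frac_bits=1)   # 1.5
b = Fixed(raw=5, frac_bits=2)   # 1.25
p = a * b                       # binary point moves to 3 fractional bits
assert p.to_float() == 1.875
```

In a language with a sufficiently expressive type system, `frac_bits` can live at the type level, so mismatched binary points are a compile-time error rather than a silent scaling bug.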
In any case, the VHDL/Verilog language wars are over, and while VHDL "lost", it's clear the victory was partly Pyrrhic - RTL probably has a polyglot future, and everyone's waiting (with mixtures of resignation and hope, but very little held breath) for something better to come along.
I remember having this debate back in the late 1990s when I was in college for my electrical and computer engineering (ECE) degree. At the time, as students, we didn't really know about nuances like delta cycles, so preferring Verilog or VHDL came down to a matter of personal taste.
Knowing what I know now, I'm glad that they taught us VHDL. Also that's one of the reasons that it's worth trying to get into the best college that you can, because as long as you're learning stuff, you might as well learn the most rigorous way of doing it.
---
It's these sorts of nuances that make me skeptical of casual languages like Ruby and even PHP (my favorite despite its countless warts). I wish that we had this level of insight back during the PHP 4 to 5 transition, because so many easily avoidable mistakes were made in a design-by-committee fashion.
For example, PHP classes don't use copy-on-write like arrays do, so we missed out on avoiding a whole host of footguns, as well as on being able to use [] or -> interchangeably like in JavaScript. While we're at it, the "." operator for string concatenation was a tragic choice (they should have used & or .. IMHO), because then "." could have been used as the object operator instead of -> (borrowed from C++), but I digress.
I often dream of writing a new language someday at the intersection of all of these lessons learned, so that we could write imperative-looking code that runs in a functional runtime. It would mostly encourage using higher-order methods strung together, but have a smart enough optimizer that it can handle loops and conditional logic by converting them to higher-order methods internally (since pure code has no side effects). Basically the intermediate code (i-code) would be a tree representation in the same form as Lisp or a spreadsheet, that could be transpiled to all of these other languages. But with special treatment of mutability (monadic behavior). The code would be pure-functional but suspend to read/write outside state in order to enforce the functional core, imperative shell pattern.
A language like that might let us write business logic that's automatically parallelized and could be synthesized in hardware unmodified. It would tend to execute many thousands of times faster than anything today on modifiable hardware like an FPGA. I'd actually prefer to run it on a transputer, but those fell out of fashion decades ago after monopoly forces took over.
Needs a [2010] tag. In almost all modern hardware development you'll have coding guidelines along the lines of "always use blocking assignments for comb logic, always use non-blocking for sequential logic", so you end up back at the same place as VHDL. By nature, SystemVerilog is much more weakly typed than VHDL, so you have to rely on conventions to regain some level of safety.
Naively, as a West Coast Verilog person: VHDL delta cycles seem like a nice idea, but not what actual circuits are doing by default. The beauty and the terror of Verilog is the complete, unconstrained parallel nature of its default - it all evaluates at t=0 until you add clocks and state via registers. VHDL seems to make it too easy to create latches and other abominations. (I am probably wrong, at least partially.)
Verilog gives you enough rope.
Once the design gets past toy size, you spend time chasing sim vs synthesis mismatches because the language leaves ordering loose in places where humans read intent into source order.
VHDL's delta cycles are weird, and there's edge cases there too, but the extra ceremony works more like a childproof cap than a crown jewel.
Do you consider 800+mm2 slabs of 3nm of silicon still toy size? Because there's a very high chance that those were written in Verilog, and I've never had to chase sim vs synthesis mismatches.
> Verilog gives you enough rope.
Yes. If you don't know what you're doing and don't follow the industry standard practises.
The real question is, why do we even need this? Why don't VHDL and Verilog just simulate what hardware does? Real hardware doesn't have any delta cycles or determinism issues due to scheduling. Same thing with sensitivity lists (yes, we have */all now, so that's basically solved) - but why design it so that it's easy to shoot yourself in the foot?
What do you mean by simulate? Do you want the language to be aware of the temperature of the silicon? Because I can build you circuits whose behaviour changes due to variation in the temperature of the silicon. Essentially all these languages are not timing aware. So you design your circuit with combinatorial logic and a clock, and then hope (pray) that your compiler makes it meet timing.
The fundamental problem is that we're trying to create a simulation model of real hardware that is (a) realistic enough to tell us something reasonable about how to expect the hardware to behave and (b) computationally efficient enough to tell us about a in a reasonable period of time.
The only way to simulate what real hardware does is to synthesise the design, get a net list and do a gate level simulation. This is incredibly slow, both to compile and to simulate.
You could, of course, simplify the timing model a lot. In the end you get down to "there is some time for the signal to get through this logic; we don't know how much, but we assume it's less than any clock period" - in which case we end up with delta cycles.
Real hardware has clock trees. Wouldn't all (most?) problems with delta cycles go away if the HDL understood the concept of clocks and clock balancing?
I’m not exactly sure what you’re getting at, but I think I’ve had a similar question: why don’t HDLs have language elements more representative of what digital circuits are constructed from, namely synchronous and asynchronous circuits, rather than imperative input triggered blocks (processes IIRC, it’s been a while)?
I always thought it was confusing to design a circuit mentally (or on paper) out of things like muxes, encoders, flip flops, etc. and not have language-level elements to represent these things (without defining your own components obviously).
I remember looking this up, and I believe it’s because the languages were originally designed for simulation and verification, and there are things you might want to do in a simulation/verification language for testing that are outside of what the hardware can do. Mixing the two is confusing IMO, but clearly demarcating the hardware-realizable subset of the language would be better than the current state.
You generally do not want to simulate or describe raw gate-level netlists, though both languages are capable of that. Old school Verilog (not SystemVerilog) is still the de facto netlist exchange format for many tools.
It's just aggravatingly slow to sim and needlessly verbose. Feeding high-level RTL to Verilator to do basic cycle-accurate sim has exceptionally fast iteration speed these days.
Is it really, if you restrict yourself to sensible design practices? You generally want to simulate simple clocked logic with a predefined clock; most of the time anything else is a mistake or bad design. So just: if rising edge clk, next_state <= fn(previous_state, input). It seems to me VHDL and Verilog are simply at the wrong abstraction level, and by that they make simulation needlessly complicated and design easy to get wrong. To me it seems that if they had the concept of clocks instead, none of this would be necessary and many bugs would be avoided (but I'm no expert on simulator design, so I might be missing something...).
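The "rising edge only" discipline described above reduces simulation to iterating a pure next-state function, one evaluation per clock edge, with no intra-cycle ordering to get wrong. A minimal Python sketch of that model (my own framing, not any tool's API):

```python
def simulate(next_state, state, inputs):
    # Pure synchronous model: one next-state evaluation per clock edge.
    # All event-ordering questions disappear; only the sequence of
    # states matters, so the result is trivially deterministic.
    trace = [state]
    for x in inputs:
        state = next_state(state, x)
        trace.append(state)
    return trace

# Example: an accumulator register, state' = state + input.
assert simulate(lambda s, x: s + x, 0, [1, 2, 3]) == [0, 1, 3, 6]
```

The catch, as the replies note, is that this abstraction covers clocked logic only; asynchronous structures (latches, cross-coupled gates) need something finer-grained underneath.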
I agree basically with everything you're saying, but that's not arguing for raw gate netlists. If anything it's arguing for even higher levels of abstraction where clock domains are implicit semantic contexts.
Many new school HDLs are working in this space and they couldn't be farther from the "representative of what digital circuits are constructed from" idea. Often they're high-level programmatic generators, very far from describing things in terms of actual PDK primitives.
In a way it's further away, but in another way it's actually closer to how real hardware works: clock (and reset) trees are real physical things which exist on all digital chips.
> Why don't VHDL and Verilog just simulate what hardware does?
Real hardware has hold violations. If you get your delta cycles wrong, that's exactly what you get in VHDL...
They're both modeling languages. They can model high-level RTL or gate level, and the two can behave very differently if you're not careful. "Just simulate what the hardware does" is itself an ambiguous statement: sometimes you want one model, sometimes the other.
Both SystemVerilog and VHDL have AMS extensions for simulating analog circuits. They work pretty well but you also pay a pretty penny for the simulator licenses for them.
Don't know if trolling. You can build an SR latch with 2 NANDs or NORs - there are plenty of *digital* circuits with that functionality, and yes, there are very rare cases where you construct this out of logic rather than use a library cell. A pulse circuit is AND(not(not(not(a))), a), also rarely used, but used nonetheless. To properly model/simulate these you need delta cycles.
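For the cross-coupled NAND case specifically, a small Python sketch (mine, heavily simplified) shows why iteration within a time step is needed: each gate's output is the other gate's input, so a single evaluation pass is not enough to reach the stable state.

```python
def nand(a, b):
    return 0 if (a and b) else 1

def sr_latch(s_n, r_n, q=0, qn=1, max_deltas=10):
    # Cross-coupled NANDs with active-low set/reset inputs. Both gates
    # are evaluated from the previous values each round (delta cycle)
    # until the outputs stop changing.
    for _ in range(max_deltas):
        new_q, new_qn = nand(s_n, qn), nand(r_n, q)
        if (new_q, new_qn) == (q, qn):
            return q, qn
        q, qn = new_q, new_qn
    raise RuntimeError("did not settle (oscillation)")

assert sr_latch(s_n=0, r_n=1) == (1, 0)               # set
assert sr_latch(s_n=1, r_n=0) == (0, 1)               # reset
assert sr_latch(s_n=1, r_n=1, q=1, qn=0) == (1, 0)    # hold previous state
```

Note the set case actually passes through an intermediate (1, 1) round before settling, which is exactly the kind of feedback behavior a purely clocked model cannot express.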
I'm not sure if you are trolling. 99.999% of digital design is "if rising edge clk new_state <= fn(old_state, input)", with an (a)sync reset. The language should make that the default and simple to do, and anything else out of the ordinary hard. Now it's more the other way around.
Reminds me a lot of "Logical Execution Time" and the work of Edward Lee ("The Problem With Threads") as a software equivalent. Determinism needs separation of computation from communication.
Please stop bickering about Verilog vs VHDL - if you use NBAs, the scheduler works exactly the same in modern day simulators. There is no crown jewel in VHDL anymore. Also, the type system is annoying: it's just in your way, not helping at all.
You're not wrong, but blocking assignments (and their VHDL equivalent, variables) are useful as local variables in a process/always block - for instance, to factor out common sub-expressions rather than repeat them. Using only non-blocking assignments everywhere would lead to uglier code.
Anyway, not really relevant, but this all reminds me of the famous Verilog vs VHDL contest of 1997: https://danluu.com/verilog-vs-vhdl/
The Amaranth HDL python project does look fun =3
https://github.com/amaranth-lang/amaranth
scala is popular in places like Alphabet, that apparently allow go & scala projects in production.
However, I agree while scala is very powerful in some ways, it just doesn't have a fun aesthetic. If one has to go spelunking for scalable hardware accelerators, a vendors linux DMA llvm C/C++ API is probably less fragile.
For my simple projects, one zynq 7020 per node is way more than we should ever need. =3
I spent the first half of my career working at some of the largest companies at the time on huge communication ASICs that were all written in VHDL, there was no Verilog in sight.
As much as I prefer to write Verilog now, VHDL is without question a more robust and better specified language, with features that Verilog only gained a decade later through SystemVerilog.
There's a reason why almost all major EDA tool support VHDL just as well as Verilog.
And afaik HDLs are almost exclusively used for hardware synthesis, never seen any software written in those languages.
So it doesn't seem important at all. In fact, for software simulation of hardware you'd want the simulation to randomly choose anything possible in hardware, so the Verilog approach seems correct.
Used to be in the early days that some people depended on how the original verilog interpreter ordered events, it was a silly thing (models would only run on one simulator, cause of lots of angst).
'<=' assignment fixed a lot of these problems, using it correctly means that you can model synchronous logic without caring about event ordering (at the cost of an extra copy and an extra event which can be mostly optimised away by a compiler).
In combination 'always @(*)' and '=', and assign give you reliable combinatorial logic.
In real world logic a lot of event ordering is non deterministic - one signal can appear before/after another depending on temperature all in all it's best not to design depending it if you possibly can, do it right and you don't care about event ordering, let your combinatorial circuits waggle around as their inputs change and catch the result in flops synchronously.
IMHO Verilog's main problems are that it: a) mixes flops and wires in a confusing way, and b) if you stay away from the synthesisable subset lets you do things that do depend on event ordering that can get you into trouble (but you need that sometimes to build test benches)
Better to emphasize the type system, which make a durable and meaningful difference to users (both experienced and new). My go-to example is fixed-point arithmetic: for VHDL, this was an extension to the IEEE libraries, and didn't require a change to the underlying language (think of how c++'s std:: evolves somewhat separately from compilers). Verilog's type system is insufficiently expressive to add fixed-point types without changes to the language itself. This positions VHDL better for e.g. low-precision quantization for AI/ML.
In any case, the VHDL/Verilog language wars are over, and while VHDL "lost", it's clear the victory was partly Pyrrhic - RTL probably has a polyglot future, and everyone's waiting (with mixtures of resignation and hope, but very little held breath) for something better to come along.
I remember having this debate back in the late 1990s when I was in college for my electrical and computer engineering (ECE) degree. At the time as students, we didn't really know about nuances like delta cycles, so preferring Verilog or VHDL came down to matter of personal taste.
Knowing what I know now, I'm glad that they taught us VHDL. Also that's one of the reasons that it's worth trying to get into the best college that you can, because as long as you're learning stuff, you might as well learn the most rigorous way of doing it.
---
It's these sorts of nuances that make me skeptical of casual languages like Ruby and even PHP (my favorite despite its countless warts). I wish that we had this level of insight back during the PHP 4 to 5 transition, because so many easily avoidable mistakes were made in a design-by-committee fashion.
For example, PHP classes don't use copy-on-write like arrays, so we missed out on avoiding a whole host of footguns, as well as being able to use [] or -> interchangeably like in JavaScript. While we're at it, the "." operator to join arrays was a tragic choice (they should have used & or .. IMHO) because then we could have used "." for the object operator instead of -> (borrowed from C++), but I digress.
I often dream of writing a new language someday at the intersection of all of these lessons learned, so that we could write imperative-looking code that runs in a functional runtime. It would mostly encourage using higher-order methods strung together, but have a smart enough optimizer that it can handle loops and conditional logic by converting them to higher-order methods internally (since pure code has no side effects). Basically the intermediate code (i-code) would be a tree representation in the same form as Lisp or a spreadsheet, that could be transpiled to all of these other languages. But with special treatment of mutability (monadic behavior). The code would be pure-functional but suspend to read/write outside state in order to enforce the functional core, imperative shell pattern.
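The loop-to-higher-order rewrite being described can be sketched in ordinary Python: the same pure computation written as an imperative loop and as the filter/map/reduce pipeline such an optimizer would produce internally. The function names and the business-logic example are invented for illustration.

```python
from functools import reduce

# Imperative-looking version: a pure loop with local mutation only.
def total_discounted_imperative(prices, threshold, rate):
    total = 0.0
    for p in prices:
        if p > threshold:
            total += p * (1 - rate)
    return total

# The equivalent the hypothetical optimizer would derive: the condition
# becomes a filter, the body a map, and the accumulation a reduce.
def total_discounted_functional(prices, threshold, rate):
    eligible = filter(lambda p: p > threshold, prices)
    discounted = map(lambda p: p * (1 - rate), eligible)
    return reduce(lambda acc, x: acc + x, discounted, 0.0)

prices = [5.0, 20.0, 50.0]
print(total_discounted_imperative(prices, 10.0, 0.1))  # 63.0
print(total_discounted_functional(prices, 10.0, 0.1))  # 63.0
```

Because neither form has side effects, the two are interchangeable, which is what would make the pipeline form safe to parallelize or, in principle, synthesize.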
A language like that might let us write business logic that's automatically parallelized and could be synthesized in hardware unmodified. It would tend to execute many thousands of times faster than anything today on modifiable hardware like an FPGA. I'd actually prefer to run it on a transputer, but those fell out of fashion decades ago after monopoly forces took over.
But with a solid design flow (which should include linting tools like Spyglass for both VHDL and Verilog), it’s not a major concern.
Every major implementation can warn, or even fail the flow, on accidental latch logic inside an always_comb.
VHDL's delta cycles are weird, and there's edge cases there too, but the extra ceremony works more like a childproof cap than a crown jewel.
Do you consider 800+ mm² slabs of 3 nm silicon still toy size? Because there's a very high chance that those were written in Verilog, and I've never had to chase sim vs. synthesis mismatches.
> Verilog gives you enough rope.
Yes. If you don't know what you're doing and don't follow the industry standard practices.
The fundamental problem is that we're trying to create a simulation model of real hardware that is (a) realistic enough to tell us something reasonable about how to expect the hardware to behave and (b) computationally efficient enough to tell us about (a) in a reasonable period of time.
You could, of course, simplify the timing model a lot. In the end you get down to "some time passes for the signal to get through this logic; we don't know how much, but we assume it's less than any clock period", in which case we end up with delta cycles.
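That "settle in zero time" model is small enough to sketch directly. Below is a minimal Python toy of the delta-cycle idea: every process computes against the same pre-delta signal values, all updates are committed at once, and deltas repeat until nothing changes, all before simulated time would advance. The `Signal`/`settle` names and structure are invented for this sketch, not any simulator's API.

```python
class Signal:
    def __init__(self, value):
        self.value = value  # value visible to processes this delta
        self.next = value   # value scheduled for the next delta

def settle(signals, processes, max_deltas=100):
    """Run delta cycles until no signal changes (zero simulated time)."""
    for _ in range(max_deltas):
        for proc in processes:
            proc()                   # all reads see pre-delta values
        changed = False
        for s in signals:            # commit every update atomically
            if s.next != s.value:
                s.value = s.next
                changed = True
        if not changed:
            return                   # quiescent; time may now advance
    raise RuntimeError("no convergence (combinational loop?)")

# Two chained "combinational" processes: b <= not a; c <= not b.
a, b, c = Signal(0), Signal(0), Signal(0)
procs = [lambda: setattr(b, "next", 1 - a.value),
         lambda: setattr(c, "next", 1 - b.value)]
settle([a, b, c], procs)
print(a.value, b.value, c.value)  # 0 1 0
```

Note that `c` needs a second delta to pick up `b`'s new value, yet the order in which the two processes run within a delta never matters; that ordering independence is exactly the determinism being discussed in this thread.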
I always thought it was confusing to design a circuit mentally (or on paper) out of things like muxes, encoders, flip flops, etc. and not have language-level elements to represent these things (without defining your own components obviously).
I remember looking this up, and I believe it’s because the languages were originally designed for simulation and verification, and there are things you might want to do in a simulation/verification language for testing that are outside of what the hardware can do. Mixing the two is confusing IMO, but clearly demarcating the hardware-realizable subset of the language would be better than the current state.
You generally do not want to simulate or describe raw gate-level netlists. Both languages are capable of that. Old school Verilog (not SystemVerilog) is still the de facto netlist exchange format for many tools.
It's just aggravatingly slow to sim and needlessly verbose. Feeding high-level RTL to Verilator to do basic cycle-accurate sim has exceptionally fast iteration speed these days.
Many new school HDLs are working in this space and they couldn't be farther from the "representative of what digital circuits are constructed from" idea. Often they're high-level programmatic generators, very far from describing things in terms of actual PDK primitives.
Real hardware has hold violations. If you get your delta cycles wrong, that's exactly what you get in VHDL...
They're both modeling languages. They can model high-level RTL or gate-level, and they can behave very differently if you're not careful. "Just simulate what the hardware does" is itself an ambiguous statement. Sometimes you want one model, sometimes the other.