I'm a long-time Verilog user (30+ years, a dozen or so tapeouts), and I've even written a couple of compilers, so I'm intimate with the gory details of event scheduling.
It used to be, in the early days, that some people depended on how the original Verilog interpreter ordered events. It was a silly thing: models would only run on one simulator, which caused lots of angst.
The '<=' (non-blocking) assignment fixed a lot of these problems: using it correctly means you can model synchronous logic without caring about event ordering (at the cost of an extra copy and an extra event, which can mostly be optimised away by a compiler).
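A minimal sketch (module and signal names hypothetical) of the race-free pattern '<=' enables: two registers exchanging values behave the same no matter which block the simulator evaluates first.

```verilog
// Swap two registers every clock. With non-blocking '<=' both
// right-hand sides are sampled before either register updates,
// so the result does not depend on which always block runs first.
module swap (input wire clk, output reg a, output reg b);
  initial begin a = 1'b0; b = 1'b1; end

  always @(posedge clk) a <= b;
  always @(posedge clk) b <= a;

  // With blocking '=' instead, whichever block ran second would
  // read the already-updated value, and a and b could end up equal.
endmodule
```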
In combination, 'always @(*)' with '=', plus continuous 'assign', gives you reliable combinatorial logic.
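A sketch of both combinatorial styles, using a hypothetical 2:1 mux:

```verilog
// Reliable combinational logic: 'always @(*)' with blocking '=',
// or a continuous 'assign'. Both re-evaluate whenever any input
// changes, so there is no hand-written sensitivity list to get wrong.
module mux2 (input wire sel, a, b, output reg y, output wire z);
  always @(*) begin
    if (sel) y = b;
    else     y = a;      // every path assigns y: no latch inferred
  end

  assign z = sel ? b : a;  // the same mux as a continuous assignment
endmodule
```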
In real-world logic a lot of event ordering is non-deterministic: one signal can appear before or after another depending on, say, temperature. All in all, it's best not to design depending on it if you possibly can. Do it right and you don't care about event ordering: let your combinatorial circuits waggle around as their inputs change and catch the result in flops synchronously.
IMHO Verilog's main problems are that it: a) mixes flops and wires in a confusing way, and b) if you stray outside the synthesisable subset, it lets you do things that do depend on event ordering and can get you into trouble (though you need that sometimes to build test benches).
I love that VHDL formalizes Verilog's pragmatic blundering, but emphasizing delta-cycle ordering is "inside baseball" and IMO bad marketing. VHDL's approach is conceptually clean, but from a practical perspective, this ordering doesn't (and shouldn't) matter.
Better to emphasize the type system, which makes a durable and meaningful difference to users (both experienced and new). My go-to example is fixed-point arithmetic: for VHDL, this was an extension to the IEEE libraries, and didn't require a change to the underlying language (think of how C++'s std:: evolves somewhat separately from compilers). Verilog's type system is insufficiently expressive to add fixed-point types without changes to the language itself. This positions VHDL better for e.g. low-precision quantization for AI/ML.
In any case, the VHDL/Verilog language wars are over, and while VHDL "lost", it's clear Verilog's victory was partly Pyrrhic - RTL probably has a polyglot future, and everyone's waiting (with mixtures of resignation and hope, but very little held breath) for something better to come along.
This is great!
I remember having this debate back in the late 1990s when I was in college for my electrical and computer engineering (ECE) degree. At the time, as students, we didn't really know about nuances like delta cycles, so preferring Verilog or VHDL came down to a matter of personal taste.
Knowing what I know now, I'm glad that they taught us VHDL. Also that's one of the reasons that it's worth trying to get into the best college that you can, because as long as you're learning stuff, you might as well learn the most rigorous way of doing it.
---
It's these sorts of nuances that make me skeptical of casual languages like Ruby and even PHP (my favorite despite its countless warts). I wish that we had this level of insight back during the PHP 4 to 5 transition, because so many easily avoidable mistakes were made in a design-by-committee fashion.
For example, PHP classes don't use copy-on-write like arrays, so we missed out on avoiding a whole host of footguns, as well as being able to use [] or -> interchangeably like in JavaScript. While we're at it, the "." operator for string concatenation was a tragic choice (they should have used & or .. IMHO), because then we could have used "." for the object operator instead of -> (borrowed from C++), but I digress.
I often dream of writing a new language someday at the intersection of all of these lessons learned, so that we could write imperative-looking code that runs in a functional runtime. It would mostly encourage using higher-order methods strung together, but have a smart enough optimizer that it can handle loops and conditional logic by converting them to higher-order methods internally (since pure code has no side effects). Basically the intermediate code (i-code) would be a tree representation in the same form as Lisp or a spreadsheet, that could be transpiled to all of these other languages. But with special treatment of mutability (monadic behavior). The code would be pure-functional but suspend to read/write outside state in order to enforce the functional core, imperative shell pattern.
A language like that might let us write business logic that's automatically parallelized and could be synthesized in hardware unmodified. It would tend to execute many thousands of times faster than anything today on modifiable hardware like an FPGA. I'd actually prefer to run it on a transputer, but those fell out of fashion decades ago after monopoly forces took over.
Needs a [2010] tag. In almost all modern hardware development you'll have coding guidelines along the lines of "Always use blocking assignments for comb logic, always use non-blocking for sequential logic". You end up back at the same place as VHDL, though by nature SystemVerilog is much more weakly typed than VHDL, so you have to rely on conventions to regain some level of safety.
Naively, as a West Coast Verilog person, VHDL delta cycles seem like a nice idea, but not what actual circuits are doing by default. The beauty and the terror of Verilog is the complete, unconstrained parallel nature of its defaults: it all evaluates at t=0 by default, until you add clocks and state via registers. VHDL seems to let you create latches and other abominations too easily. (I am probably wrong, at least partially.)
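For what it's worth, accidental latches come from the same mistake in either language: an incomplete assignment in combinational code. A hypothetical Verilog sketch of both the hazard and the fix:

```verilog
// An incomplete assignment in combinational code infers a latch:
// when 'en' is low, 'q' must hold its old value, which needs storage.
module latchy (input wire en, d, output reg q, output reg q_ok);
  always @(*) begin
    if (en) q = d;   // no else: synthesis infers a level-sensitive latch
  end

  always @(*) begin
    q_ok = 1'b0;     // default assignment covers every path: pure comb
    if (en) q_ok = d;
  end
endmodule
```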
((Shai-Hulud Desires the Verilog))
VHDL gets treated like a legacy language nobody wants to touch, but the people who actually use it tend to be very serious about why they still do.
The real question is, why do we even need this? Why don't VHDL and Verilog just simulate what hardware does? Real hardware doesn't have any delta cycles or determinism issues due to scheduling. Same thing with sensitivity lists (yes, we have */all now, so that's basically solved), but why design it so that it's easy to shoot yourself in the foot?
Reminds me a lot of "Logical Execution Time" and the work of Edward Lee ("The Problem With Threads") for a software equivalent. Determinism needs separation of computation from communication.
Sounds like the reachability problem in Petri nets to me?
Please stop bickering about Verilog vs VHDL: if you use NBAs, the scheduler works exactly the same in modern-day simulators. There is no crown jewel in VHDL anymore. Also, the type system is annoying; it's just in your way, not helping at all.
The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.
VHDL had this figured out as early as 1987. I spent many years writing Verilog test benches and chasing numerous race conditions; those types of bugs simply don't exist in VHDL.
The Verilog rules—using non-blocking assignments for sequential logic and blocking assignments for combinational logic—fail as soon as the scenario becomes slightly complex. Verilog is suitable when you already have the circuit in your head and just need to write it down quickly. In contrast, VHDL forces you to think about concurrent processes in the correct way. While the former is faster to write, the latter is the correct approach.
Even though SystemVerilog added some patches, the underlying execution model still has inherent race conditions.
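A minimal sketch (hypothetical names) of the kind of inherent race being described here: the language standard leaves the evaluation order of these two blocks undefined, which is exactly what the blocking/non-blocking convention tries to paper over.

```verilog
// The classic Verilog race: a blocking write in one clocked block,
// read by another clocked block at the same edge. The simulator may
// run these two always blocks in either order, so 'b' may capture
// either the old or the new value of 'a'. Different simulators (or
// the same one after a recompile) can legally disagree.
module race (input wire clk, d, output reg a, b);
  always @(posedge clk) a = d;   // blocking write: commits immediately
  always @(posedge clk) b = a;   // old or new 'a'? nondeterministic
endmodule
```

Rewriting both assignments as non-blocking ('a <= d; b <= a;') makes the behaviour deterministic, which is the rule the convention encodes; the point above is that the underlying model still permits the racy form.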