The Bq suggestion doesn’t actually fix anything. The becquerel is defined as one decay event per second and is dimensionally identical to Hz. Using Bq typically signals that a Poisson process is being measured, which is itself an assumption about the arrival statistics. That assumption is likely wrong for real web traffic, which tends to be bursty rather than memoryless.
More importantly, the claim that Hz is inappropriate for non-periodic phenomena is false. Many random processes have a well-defined Fourier transform, and reporting the intensity of random fluctuations in a frequency range is standard across signal processing, neuroscience, finance, and physics. The unit doesn’t imply periodicity of the process itself. It implies that we are working in the Fourier domain, which applies to stochastic processes just as well as to periodic signals.
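To make that concrete, here's a minimal sketch (all parameters are mine, picked for illustration): pure white noise has no periodic component at all, yet it has a perfectly well-defined power spectral density with units of power per Hz, and integrating that density over frequency recovers the signal's mean power (Parseval's theorem).

```python
import numpy as np

# Sketch: a non-periodic, purely random signal still has a well-defined
# spectral description, reported per Hz. Sampling rate and length are
# arbitrary choices for the illustration.
rng = np.random.default_rng(0)
fs = 1000.0                        # sampling rate in Hz (assumed)
n = 4096
x = rng.standard_normal(n)         # white noise: nothing repeats

# Two-sided periodogram estimate of the power spectral density
X = np.fft.fft(x)
psd = np.abs(X) ** 2 / (fs * n)    # units: power per Hz
df = fs / n                        # frequency-bin width in Hz

# Parseval: integrating the PSD over frequency recovers the signal's
# mean power, even though the signal has no periodic component.
total_power = psd.sum() * df
print(total_power, np.mean(x ** 2))   # these agree
```

The "Hz" here indexes the frequency axis of the analysis, not any repetition in the underlying signal.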
If you want to characterize web request traffic properly, the right question is what the arrival process actually looks like. A single scalar, whether in Hz or Bq, throws away almost all of that. In all cases, you have to think carefully about what your underlying assumptions are and what the reported number actually measures.
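A toy illustration of how much a single scalar hides (the numbers and the batch-arrival model are assumptions of mine, not a real traffic trace): two processes with the same mean rate, one Poisson and one bursty, look identical as a "requests per second" figure but differ wildly in their index of dispersion (variance-to-mean ratio of per-window counts, which is 1 for a Poisson process).

```python
import numpy as np

# Sketch: same mean rate, very different burstiness.
rng = np.random.default_rng(1)
rate = 100.0          # mean arrivals per second (assumed)
n_windows = 2000      # one-second observation windows

# Poisson arrivals: counts per second are Poisson(rate)
poisson_counts = rng.poisson(rate, size=n_windows)

# Bursty arrivals: requests come in batches of 10, with batch starts
# themselves Poisson. Same mean count per second, but clumped.
batch = 10
bursty_counts = batch * rng.poisson(rate / batch, size=n_windows)

def index_of_dispersion(counts):
    """Variance-to-mean ratio of per-window counts: ~1 for Poisson,
    substantially above 1 for bursty traffic."""
    return counts.var() / counts.mean()

print(poisson_counts.mean(), bursty_counts.mean())  # both ~100 "per second"
print(index_of_dispersion(poisson_counts))          # ~1
print(index_of_dispersion(bursty_counts))           # ~10
```

Both would be reported as "100 Hz" (or 100 Bq), yet they put completely different loads on a server.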
The becquerel (or counts per second) has the same problem in that it doesn't measure the "energy" of each request.
I do like the analogy though. Actual radiation has many forms and energy levels.
Decay chains are a nice analogy you could use too (i.e. a branching-out into subsequent processes and work that happen later but are a consequence of the initial request).