>If something is obscured, by definition it is less visible.
Obscurity is not the same thing as something being "obscured".
Obscurity means something is difficult to comprehend, not well known, or uncommon.
Obscured means something is hidden or concealed. When something is hidden, the thing is still there and there is a way to get to it. You can build automated tools around finding it.
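To make that concrete, here's a minimal sketch of the kind of automated tool I mean: a TCP port scan that finds a service someone "hid" on a nonstandard port. The host address and port range are hypothetical placeholders; this is an illustration of exhaustive search defeating concealment, not a pentesting tool.

```python
# Hypothetical sketch: a service "obscured" on a nonstandard port is still
# discoverable by exhaustively trying every port.
import socket

def find_open_ports(host: str, ports: range, timeout: float = 0.2) -> list[int]:
    """Return the ports on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# e.g. find_open_ports("192.0.2.10", range(1, 65536))  # address is a placeholder
```

The point is that "hidden" only raises the attacker's cost linearly at best; the scan above covers the entire port space in one loop.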
>Being 'less typical' is a form of security because most attacks rely on some form of pattern recognition, and obscurity literally dissolves patterns into noise.
This makes the leap-of-faith assumption that "obscurity" is equivalent to "impossible to understand". In security you have no control over the attacker, so you have to assume the attacker has more than enough knowledge and intelligence to perform the attack.
Since computer systems are static and unchanging without frequent patching, you can't assume a cat-and-mouse game where the mouse dynamically adapts its hiding strategies and manages to escape every single time.
That depends; some systems are dynamic. There is also a gray area where obscurity can make an attack computationally infeasible without resting on the traditional polynomial-hardness assumptions of cryptography.
As is always the case in these semantic discussions, the answer depends on your initial axioms and assumptions, which does kind of make most of these discussions pointless (but I did learn a lot from this one).