Hacker News

dreambuffer today at 12:29 AM · 3 replies

I'm having a hard time understanding what you mean here. If something is obscured, by definition it is less visible. Being 'less typical' is a form of security because most attacks rely on some form of pattern recognition, and obscurity literally dissolves patterns into noise.


Replies

m3047 today at 6:19 PM

Visibility is also a mental construct: it depends on what we expect to see, what we already know, and what we can map onto what we see. "Obscure" is doing a lot of work here. It doesn't necessarily mean hidden; it can mean that the object's true purpose or form is hidden from some particular vantage, and only that vantage.

staticassertion today at 1:01 AM

You're focusing too much on the term and not on its meaning. The term comes from people choosing tools like "foxit" or "Opera" and claiming those products are safer than their counterparts (Adobe/Firefox) because they are attacked less often.

This notion was termed "security through obscurity", i.e., "you use the less popular option, therefore that option is safer". It has nothing to do with "obscuring" in the sense of "hiding"; that's a linguistic quirk of a colloquial term. If you were actually taking action to reduce an attacker's ability to understand a system in a way you could meaningfully defend, it would no longer be "security through obscurity".

The argument has persisted because there are two different questions that sound the same (X is less typical than Y):

1. Is "X" safer than "Y"?

2. Is a user of "X" safer than a user of "Y"?

When looking at (1) in isolation, you can say things like "X lacks security features, therefore Y is safer" and "X is less often used, therefore X is safer", etc. This is a question about the posture of the project itself, in isolation.

(2) is about the context for users. The reality is that X, which is perhaps fundamentally less well-built software, may actually have users who are attacked far less frequently.

Both questions are likely to favor "rarity is a poor indicator of safety", since we generally reject mitigations that rely on attackers behaving in specific ways. What's important is that these are completely different questions, and neither has to do with being obscured, only with being rare.

None of this is about what is "obscured" or not. If something is obscured or obfuscated, that is a technique that can be evaluated separately on its own merits (i.e., how hard is deobfuscation, how easily can the technique adapt to deobfuscation, etc.). All of this is about whether you're evaluating (1) or (2); in the case of (1), which is what the criticism has always focused on, the answer is that "rarity" is not a mitigation.

imtringued today at 7:42 AM

>If something is obscured, by definition it is less visible.

Obscurity is not the same thing as something being "obscured".

Obscurity means something is difficult to comprehend, not well known, or uncommon.

Obscured means something is hidden or concealed. When something is hidden, the thing is still there and there is a way to get to it; you can build automated tools to find it.
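To make that concrete, here is a minimal sketch of such an automated tool. Everything in it is hypothetical: the "server" is simulated as a set of paths, and `probe` stands in for a real HTTP request. The point is that a path that is merely hidden, not access-controlled, falls to mechanical enumeration.

```python
# Hypothetical example: a "hidden" backup file on a simulated server.
# A real tool would issue HTTP requests; here probe() just checks a set.
HIDDEN_PATHS = {"/backup.tar.gz"}  # obscured, but not protected

def probe(path: str) -> bool:
    """Stand-in for an HTTP request; True means the path exists."""
    return path in HIDDEN_PATHS

def enumerate_hidden(wordlist: list[str]) -> list[str]:
    """Brute-force discovery: generate candidates from common words."""
    found = []
    for word in wordlist:
        for candidate in (f"/{word}", f"/{word}.tar.gz"):
            if probe(candidate):
                found.append(candidate)
    return found

print(enumerate_hidden(["admin", "backup", "login"]))  # finds /backup.tar.gz
```

Real-world equivalents of this loop (wordlist-driven path scanners) are exactly why "nobody knows the URL" is not a security boundary.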

>Being 'less typical' is a form of security because most attacks rely on some form of pattern recognition, and obscurity literally dissolves patterns into noise.

This makes the leap of assuming that "obscurity" is equivalent to "impossible to understand". In security you have no control over the attacker, so you have to assume the attacker has more than enough knowledge and intelligence to perform the attack.

Since computer systems are static and unchanging without frequent patching, you can't assume a cat-and-mouse game in which the mouse dynamically adapts its hiding strategies and manages to escape every single time.
