[Development] Proposal: Time to decide what security policy the Qt Project will use (not Trolltech/Nokia/Digia)
d3faultdotxbe at gmail.com
Fri Oct 26 04:42:12 CEST 2012
Thank you Thiago for actually presenting an argument instead of just
responding with noise (or just dismissively waving your hand as in the
case of Lars).
On 10/25/12, Thiago Macieira <thiago.macieira at intel.com> wrote:
>commercial entities have good people who make intelligent and logical decisions.
Intelligent and logical decisions... for their bottom line. In fact,
they have a legal obligation to do what is most profitable for their
shareholders (at least in the US). Full Disclosure will most likely
end up hurting their bottom line, so I'd almost go so far as to say
most companies couldn't even legally implement it. You're right
though, this isn't really relevant to the discussion. I was merely
explaining to the reader where our Responsible Disclosure policy comes from.
>While there are many zero-day exploits, assuming that all security issues are
>known to exploiters is disingenuous.
I never made such a claim. Fewer crackers know of an exploit before it
is disclosed, whether privately or publicly. More know of it once it
is privately disclosed (leaks in "trusted circles"). Even more (far
more) know once it is publicly disclosed.
The problem with private disclosure is that the analysts put too much
trust in both their peers and themselves. I do not trust myself. Given
sensitive information, I will surely get hacked. To claim otherwise is
to claim to be God: you would have to know of every exploit in
existence and have secured yourself against all of them as well. Impossible.
>What's more important in this is that the
>level of competence and resources in the exploit community varies a lot. I can
>agree that exploiters with vast resources may learn the security issues before
>the full disclosure happens, but I definitely do not agree that all exploiters
The amount of resources required to learn of a vulnerability's
existence is lowered significantly once the vulnerability is privately
disclosed to the "trusted circles", for the reason described above and
in many of my previous emails.
>Therefore, disclosing the details to everyone is irresponsible. This enables
>attackers with little resources to gain access to details that they may
>otherwise not have found out. This increases the attack surface and compounds
This is the same argument that the priests of the old days used, and
it's simply not true. Knowledge is power. Yes, it can be used for
harm... but it can also be used for good. In this case, the knowledge
lets you protect yourself from the bad guys. I don't
question your motives (or anyone else's on the security team (wait a
minute, I still don't have a list of who they are... so nvm)), but I
surely don't trust your ability to keep the information secure. The
chance that the information will leak out is ridiculously high. Most
of the time you don't even know when it happens.
>There's a waterfall where we lose people upon
> - most people will not be paying attention
Those who pay attention should not suffer because of that.
> - of those that are paying attention, we lose a great part because the
> details are too technical and they are not able to comprehend them,
> not even to determine whether they are affected by the issue
If you can't determine that, you're probably already so insecure on so
many other levels that the new vulnerability doesn't change anything.
Security is not for the faint of heart.
Those of us who actually practice security should not suffer because
of others' incompetence.
> - of those that did understand the details, we also lose a great part because
> they are unable to come up with a fix or solution for their affected systems,
> short of shutting them down completely
But that's exactly why you shut down. It is the safest measure during a
time of uncertainty. A vulnerability has been identified and a fix is
not yet available. Hiding the vulnerability from us does not change
that fact. You don't have to understand the fix (a WIP) to understand
that you're at risk.
The problem, and why we're both right:
Private disclosure both increases and decreases the exposure.
Increases it = leaks within the "trusted circles"
Decreases it = script kiddies are denied access
I fear the skilled cracker way more than the script kiddie. In the
case of the script kiddie, the playing field is level and I am able to
protect myself by shutting down. The increase in the amount of
crackers with access to the vulnerability is much worse, especially
since I'm oblivious to the fact that I'm even vulnerable. CRIME took
over a week to be dealt with. "Security information moves very fast in
cracker circles. On the other hand, our experience is that coding and
releasing of proper security fixes typically requires about an hour of
work -- very fast fix turnaround is possible. Thus we think that full
disclosure helps the people who really care about security" (
One hour of script kiddies vs. over a week of crackers... which is worse?
>Let's be generous and say that 3% of the community is able to act on the
>fully-disclosed security information before a fix or workaround is published.
>That means 97% is still vulnerable, and we've just enabled low-resource
>attackers to attack.
You're being too generous. Most people don't take security seriously.
It's probably more like 1%. The 1% who do take security seriously
should not suffer [greatly] because the other 99% are not practicing
it. I think all 100% of them know how to shut down. Just because they
choose not to doesn't mean we should hide the vulnerability so the 99%
can stay online (still vulnerable). Let them suffer. Don't make those
of us who care about security suffer.
By hiding the information, you aren't giving us (the 1%) the
opportunity to protect ourselves.
> - a high signal/noise ratio on the disclosures, which should cause people to
> pay more attention
I would rather it be noisy during the period where a fix is unknown.
Let the collaborative nature of the internet take hold: ideas for a
fix surface a lot faster and travel through a lot more brains,
which also decreases the chances of accidentally introducing another
vulnerability in the fix.
It's also an argument in favor of moving security discussion to its
own list. Things can get pretty hairy at times.
Let the _user_ choose what signal/noise ratio they want to receive by
what lists they subscribe to. Only want the fixes? Subscribe to the
main ANNOUNCE only. Ignorance is bliss (no it isn't).
This is why I think we need Security (discussion) and
Security-ANNOUNCE (vuln discovery, fix release (fix release is also on
main ANNOUNCE)). The Security-ANNOUNCE will have that high
signal/noise ratio you speak of, so it is the list that those who
can't comprehend a vulnerability should subscribe to. "Oh, I'm
vulnerable. Time to shut down. <variable amount of time passes>. Yay a
fix, I can bring my systems back online". 100% signal, no noise.
What is wrong with my qt-project.org/security/index.html proposal that
accommodates both Full and Responsible Disclosure? It gives the
analyst that discovers the vulnerability the choice of what to do with
it. I think the default should be Full Disclosure, and the analyst
should have to opt-in to Responsible Disclosure. We almost already
have that (this list is for public discussion), but Full Disclosure is
not the default/endorsed/recommended way to report vulnerabilities.
Not even the script-kiddie argument stands up against that. If the
analyst decides the information will do more harm than good (script
kiddies), he can choose to disclose it privately. (Note: he can also
sell it to crackers :-P. The point I'm making here is that the analyst
can do whatever he wants with it... ***but most of them are just going
to do whatever the project suggests***. This project currently
suggests private disclosure :-(. Do you remember what sparked this
discussion weeks ago? That one guy asked what to do with security
vulnerabilities, and you told him to send them to
security at qt-project.org. He would have done whatever you said.)