Discussed this in a SimpleX chat yesterday, but worth leaving some thoughts here too:

A software project that has received a fancy, formal security / privacy audit document shouldn't be considered a gold standard of trust on its own. An audit is one practice among many that should build up a larger picture of trust. There's a lot that goes into whether an application is trustworthy or not.

A PDF from a team or field expert saying a program is good can only go so far. Just because a project doesn't have a document like this doesn't mean it isn't held under heavy scrutiny or that it hasn't earned trust. It isn't always possible, nor is it always fitting, to review certain software in such a manner. In fact, audited projects may end up less scrutinised.

A project can be audited but still miss out on potentially important security / privacy features. Would you rather use a wallet like Bitcoin Core that had such a PDF you could read, or a wallet like Samourai (and its forks) or Wasabi that didn't, knowing it had privacy features?

Audits need to be continuous to be most effective. Software that is rapidly updating, adding new features, or significantly changing its architecture is not a good fit for one-time audits. The document becomes an advertising gimmick and nothing more, since it either covers code that no longer exists, or doesn't cover code that exists now.

Security reviews shouldn't be a one-time event. A far better merit is an application being targeted by security researchers frequently; vulnerability disclosures are a good sign of scrutinised, improving software.

For something like GrapheneOS or a Linux distribution, one-off audits don't work due to the sheer size of the projects and the different conditions of their users. Security researchers should routinely attempt to uncover vulnerabilities, and developers should be pushed to shift left.

These formal reviews do work better for single user-facing software projects, or for online services that want to prove technical claims about their service. But that doesn't mean the software is still the same as it was when the latest review was published.


Discussion


I really need a web site for this. I'm lazy.

When it comes to choosing software I want, there are three "No"s that make the reviewed software an immediate fail:

- No patches

- No assurance

- No trust

If your software is not regularly updated or responds inappropriately to #security disclosures, then you can assume it is not safe and may become even more unsafe in the future. Fork projects and projects with upstream dependencies or third-party libraries deserve especially heavy scrutiny here. If you are not able to take upstream patches or updated libraries in a timely manner, then your software should not be promoted as having a commitment to security.
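
To make "No patches" concrete, here's a minimal sketch of the kind of check a project could run to spot stale dependencies. It assumes a Python project managed with pip; the script name and the fail-the-build convention are just an illustration, not any particular project's tooling:

# check_outdated.py -- hypothetical helper that lists pip packages with newer releases
import json
import subprocess
import sys

def outdated_packages():
    # `pip list --outdated --format=json` prints a JSON array of stale packages.
    result = subprocess.run(
        [sys.executable, "-m", "pip", "list", "--outdated", "--format=json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    stale = outdated_packages()
    for pkg in stale:
        print(f"{pkg['name']}: {pkg['version']} -> {pkg['latest_version']}")
    # Non-zero exit so CI can refuse to ship while dependencies lag behind upstream.
    sys.exit(1 if stale else 0)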

Assurance is continuous assessment and review by security professionals to measure confidence that security controls are working as designed. Threat modelling, penetration testing / reverse engineering, security scanning and audits are methods to do this. Assurance helps discover vulnerabilities and potential room for improvement, which is a good thing since it leads to change and commitment to developing more secure software.
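
For the security-scanning part of that list, a minimal sketch of a CI gate, assuming the third-party tools pip-audit and bandit are installed and exit non-zero when they find issues (the tool choice and the "src" path are placeholders for the example, not a recommendation):

# security_gate.py -- hypothetical CI step: run scanners, fail the build on findings
import subprocess
import sys

CHECKS = [
    ["pip-audit"],            # checks installed dependencies against known vulnerability advisories
    ["bandit", "-r", "src"],  # static analysis over our own source tree
]

def main() -> int:
    failed = False
    for cmd in CHECKS:
        print("running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            failed = True
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())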

Assurance matters because the implementation is not always equal to the intended design. You can code something, read the code line by line and test / debug the feature, and it may still have a security vulnerability, it just isn't known yet. Therefore, you should only use software you know is committed to assurance or receives regular audits. The frequency is completely up to your tolerance.
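
A tiny, contrived example of that gap (hypothetical code, not taken from any real project): the function below satisfies the test its author wrote, yet the intended restriction was never actually enforced.

import posixpath

BASE_DIR = "/var/www/files"

def resolve_download(filename: str) -> str:
    # Intended design: only ever serve files that live under BASE_DIR.
    return posixpath.join(BASE_DIR, filename)

# The happy-path test passes, so the feature looks "done":
assert resolve_download("report.pdf") == "/var/www/files/report.pdf"

# But nothing enforces the restriction, so a crafted name escapes BASE_DIR:
print(posixpath.normpath(resolve_download("../../../etc/passwd")))
# prints /etc/passwd -- reviewed, tested, and still vulnerable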

Security assurance is heavy work and often can't be done by developers alone. Proprietary or corporate-sponsored products often have the benefit of assurance because they provide a financial incentive (bounties) for people to commit to discovering vulnerabilities and help secure the product. In open source, especially for smaller projects, this can often only be done through the goodwill of users, or worse, isn't done at all. The most popular example, xz, only had its backdoor discovered thanks to the goodwill of an eagle-eyed Microsoft employee.

This is where the controversial (for Nostr) take comes in: this would also mean Windows and macOS, Chrome and others are far more assured than esoteric software. Security professionals are far more likely to target popular software for security assurance, NOT your small Linux distro you spent weeks 'ricing' through baskets of additional, far more esoteric software.

This isn't all bad news though. Open software benefits from being derived from already highly assured software, such as GrapheneOS and the upstream Android Open Source Project. Sometimes, especially with cryptography, it can be better not to DIY.

No trust is a given. You shouldn't use software if you don't trust it, its upstream / third-party components, or its developers. I wouldn't concede on that, because it would be hypocritical.

There are a lot of ways I decide what makes software trustworthy beyond these three No's, but they'd probably be better in something more long form.

#privacy

nostr:nevent1qqs9mauz7vznmzrjgxsgxxy6t6x3pdsh5w9vd7wstpcwmyszkfgp3dspz4mhxue69uhhyetvv9ujuerpd46hxtnfduhsyg9e3hk5e6h2ypusm09ncv2qq6fqp8f5clueylpgdq66nxm5sxjuygpsgqqqqqqsvctalz