Note: If you are using the Firefox browser, clicking on the buttons
will only download the files, not allow you to view them directly.
Why We Shouldn't Teach to the Test
We can't test our way to good security. Why? Because we can't test—or prevent—what we have not envisioned. (Think 9/11.) Effective, imaginative vulnerability assessments are necessary. Here is how to do them: "Vulnerability Assessment: The Missing Manual for the Missing Link", available on Amazon.com.
Effective Vulnerability Assessments (VAs) involve imaginatively thinking like the bad guys to discover security weaknesses (i.e., “vulnerabilities”), attack scenarios, and potential countermeasures.
VAs are often confused with other security analysis techniques like threat assessments, risk assessments, security surveys, security audits, DBT, CARVER, pen testing, “red teaming”, certification, security assurance, etc.
These other techniques, including security testing, may well be worth doing, but they commonly suffer from a number of problems:
1. They aren’t as good as VAs at finding vulnerabilities, attack scenarios, and countermeasures, often because they are focused on other things.
2. They are rarely done in an imaginative manner by creative people using critical thinking skills and the proper mindset.
3. Unlike VAs, they don’t mimic the thought processes of the bad guys. If we want to predict what the bad guys may do, we need to think like them!
4. Some let the good guys define the problem when, in reality, the bad guys get to define it.
5. These (often formalistic) methods frequently suffer from the Fallacy of Precision and/or claims of exactness, objectivity, and reproducibility that—upon close examination—are merely sham rigor.
One of the problems is that the term “vulnerabilities” often gets hijacked so that it becomes confused in people’s minds with threats, risks, assets that we need to protect, features of our facility or security program, or attack scenarios. When this happens, it becomes difficult to think and talk about the problems with our security. Sloppy security terminology does have consequences!
Right Brain Sekurity: Vulnerability Assessments, Physical Security Consulting, Product & Design Evaluation, Cargo/Warehouse/Port Security, Tags & Seals, Intrusion Detection, Product Anti-Tampering & Anti-Counterfeiting, Election Security, Insider Threat Mitigation.
How can we help you?
Let's Quit Being Shocked!
The BBC recently reported that it could spoof voice recognition biometrics and gain unauthorized access to a bank account. See http://www.bbc.com/news/technology-39965545. The story includes the inevitable quote from a "security expert": "I was shocked."
Shocked? Really?!? Why does this quote constantly show up when security is shown to be fallible?
Let's finally get a few basic facts straight:
1. All security technologies, devices, and programs can be defeated.
2. They can usually be defeated fairly easily, especially if no serious vulnerability assessment was undertaken by competent, independent VAers and especially when the developer, promoter, manufacturer, vendor, or end-user is in the all-too-common Wishful Thinking Mode.
3. Low tech can usually defeat high tech.
4. Engineers don't get security. It requires a different mindset.
5. We need to move past the "Backwards Maxim", which states that most people will assume everything is secure until provided strong evidence to the contrary—exactly backwards from a reasonable approach.
Other Stuff That Might Be of Interest:
You can contact Right Brain Sekurity or Roger Johnston with the form below, on LinkedIn (http://www.linkedin.com/in/rogergjohnston), or by email to: