
There is a lot to be said for the old maxim of KISS: “Keep It Simple, Stupid”. The complexity of modern systems often masks flaws that the creators of those systems do not notice, especially when several separate systems interact in complex ways.
But someone may find those flaws eventually, and if you’re lucky it will be a white hat and you’ll get a chance to fix them before a write-up is published.
Hacking Kia
Here’s a case in point: Hacking Kia: Remotely Controlling Cars With Just a License Plate. It’s worth reading the detailed account of the (now fixed) flaws in the overall ecosystem of interconnecting systems involving owners, dealers, and the manufacturer: how it was examined, probed, dissected, understood, and exploited harmlessly, all without detailed inside knowledge of the system itself.
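The core failure in ecosystems like this is usually a classic one: broken access control, where an API authenticates the caller but never checks whether that caller is authorised for the specific vehicle. Purely as illustration (a hypothetical sketch – the endpoint, names, and data here are my assumptions, not Kia’s actual code), the pattern looks something like this:

```python
# Hypothetical sketch of the broken-access-control class of flaw.
# Every name and value here is an illustrative assumption, not Kia's real API.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Toy stand-in for the manufacturer's backend datastore.
OWNERS = {"VIN12345": {"owner": "A. Driver", "address": "1 Example Street"}}

def token_is_valid(token):
    # Authentication only: proves the caller holds *a* valid token,
    # e.g. one obtained through an open dealer-registration flow.
    return token is not None and token.startswith("dealer-")

@app.route("/api/owner/<vin>")
def get_owner(vin):
    token = request.headers.get("Authorization")
    if not token_is_valid(token):
        abort(401)
    record = OWNERS.get(vin)
    if record is None:
        abort(404)
    # BROKEN: nothing checks that THIS caller has any relationship to
    # THIS VIN, so any self-registered "dealer" can read any owner's details.
    return jsonify(record)
```

The fix is equally classic: deny by default, and release the record only once the caller’s relationship to that specific vehicle has been proven server-side.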
Hindsight is a wonderful thing
15 million cars were vulnerable to attacks that included stalking their owners and leaking home addresses and other personal details. That’s quite a lot of reputation on the line. Ask yourself: if Kia could NOW pay to avoid these vulnerabilities coming to light at all – by finding and resolving them first – how much do you think they would pay? I’d bet it would be considerably more than the cost of a decent code/design review & pentest, even though those are not cheap.
I was writing about Cyber Assurance just a few days ago. These flaws should never have made it into a live system. The failures in development processes which let this happen are legion, and Kia needs to take this on the chin, have a good long hard look at their development methods & pipeline, and introduce the necessary rigorous Assurance steps & processes which are clearly either missing or fatally flawed. Because that is the only way to minimise the likelihood of recurrence.
They could even bring the people who found these flaws in on a consultancy basis: they seem pretty good at what they do, and with access to the internal documentation they would find issues even faster.
Targeted Training
I once gave authority for a system to be operated on the condition that the supplier paid for an extra day of the pentesters’ time and had the pentesters come along to show the development team what they did wrong and how to avoid it in the future. I expected the developers to be somewhat resentful about this, but quite the opposite – they really appreciated it. It turns out most developers don’t have any specific training in secure coding – and that came as something of a surprise to me.
For organisations with a lot to lose, this is a simple idea to adopt: get the people you pay to find the flaws in your systems to spend a little extra time teaching your developers how to avoid the same mistakes in future. It’s an investment in your own developers, and it will be cost-effective in the medium to longer term.
The Future
There will be other systems at Kia, and future developments. They all do, and will, need assurance: before being deployed, regularly whilst deployed, and ideally each time they are significantly changed. The best way to get that is a rigorous system of Continuous Assurance. Gone are the bad old days of Annual Accreditation – the trouble is that too few organisations have replaced it with Continuous Assurance.
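At its simplest, Continuous Assurance can mean turning every pentest finding into a permanent automated test that runs on every change. A minimal sketch, assuming a pytest suite and the hypothetical endpoint sketched earlier (none of this is a real Kia system):

```python
# Hypothetical sketch: a pentest finding captured as a regression test
# that CI runs on every change. All names are illustrative assumptions.
import pytest

from owner_api import app  # the hypothetical Flask app sketched above

@pytest.fixture
def client():
    app.config["TESTING"] = True
    return app.test_client()

def test_unrelated_token_cannot_read_owner_data(client):
    # A valid but unrelated token must never yield another owner's PII.
    resp = client.get(
        "/api/owner/VIN12345",
        headers={"Authorization": "dealer-self-registered"},
    )
    assert resp.status_code in (403, 404)
```

Run against the broken sketch above, this test fails – which is exactly the point: the finding can never silently reappear, and the assurance is continuous and cheap compared with learning about it from a public write-up.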
How do YOUR organisation’s development & Assurance arrangements stand up to scrutiny?
