I presented on the 13-year history of the Errata project at RVAsec, giving a behind-the-scenes look at the nightmares and headaches involved, both from the project itself and from the security industry. This presentation was updated slightly and given a month later at the Black Hat Briefings 2012 in Las Vegas.
The attrition.org Errata project has documented the shortcomings, hypocrisy, and disgraces of the information technology and security industries. For 13 years, we have acted as a watchdog and a reminder that industries that sell integrity should have it as well. The public face of Errata is very different from the process that leads to it. This presentation gives a unique insight into the history, process, and blowback that are cornerstones of the project. This includes statistics, how Errata has fallen short, how it can be improved, and where the project is going. Most importantly, it covers how the industry can better help the project, both by staying off the pages of attrition.org and by contributing to it.
Videos of both talks are online, and the slides are available in PPT and PDF formats:
[This was originally published on the OSVDB blog.]
Today, we pushed OSVDB 82447, which covers a backdoor in the Multics operating system. For those not familiar with this old OS, there is an entire domain covering the fascinating history behind the development of Multics. OSVDB 82447 is titled “Multics Unspecified Third-party Backdoor” and gives an interesting insight into backdoors distributed by vendors. In this case, a third party planted it and told the vendor, and Honeywell distributed the operating system anyway. I encourage you to read the full paper by Lieutenant Colonel Roger R. Schell, a member of the tiger team that carried out the attack.
During a US Air Force-sanctioned penetration test of mainframe computers, sometime before 1979, the tiger team penetrated a Multics installation at Honeywell. In a later account, the paper says that the tiger team “modified the manufacturer’s master copy of the Multics operating system itself” and injected a backdoor. The backdoor code was described as small, “fewer than 10 instructions out of 100,000,” and required a password for use. The account continues, saying that even though Honeywell was told the backdoor was there and how it worked, their technicians could not find it. The backdoor was subsequently distributed in future installations of Multics.
It would be interesting to know why Honeywell didn’t ask for, or didn’t receive, the specific modified code from the Air Force tiger team, and why they opted to distribute it to customers. Perhaps they thought that if their own technicians couldn’t find the backdoor, no one else could. Even more interesting is why a tiger team was sanctioned to carry out a penetration test that not only gave them access to the “master copy” of Multics, but also allowed them to actually place the backdoor there. When they heard Honeywell couldn’t find it, why didn’t they insist on ensuring it was removed before installation at customer locations? This brings a new twist to the ethics of penetration testing, at least in a historical context.