Thursday 26 April 2012

A small step for AppSec, a large step for Knowledge sharing

Yesterday something rather unique and rare happened!

A security consultant published technical details about the steps it took to make a tool work! And that security consultant talked about a real-world app, didn't sugar-coat his comments, and wasn't working for the tool vendor!

The reason why this is newsworthy is because it doesn't happen very often!

I'm talking about the post by Dan from Denim Group: 'Automated Application Scanning: Handling Complicated Logins with AppScan and Burp Suite'.

This is a massive step for Denim, and I give them top marks for doing this! 

We really need many more posts like this, since only by publicly talking about what it really takes to make AppSec tools work can we evolve and make them better.

To see the sad state of affairs today, do a Google search for OZASMT (IBM AppScan Source's XML file format) or for FVDL (Fortify's file format): most of the search results are from me or the O2 Platform!

But the reason why blog entries like the one Dan posted (but written by William T) are SO important is that they allow for a public debate on how each tool vendor (or service) can handle that exact scenario.

In this specific case, things got interesting after the post was published on Denim's blog, when Jeremiah from WhiteHat tweeted: "Handling Complicated Logins w/ AppScan & Burp Suite" http://bit.ly/IaITDR < omg, how utterly painful. must chain scans to just login.


And Rafal Los from HP joined in on the sniper fire by saying: "@jeremiahg Sweet merciful crap, just use Webinspect and call it done... wow that's horrible!"

Now this is GREAT!!!!!!

Finally, a dialogue between the multiple tool vendors and the security consultants who are trying to use these tools.

Since they were by now committed, I replied to both Jeremiah and Rafal with the invitation: "Ok, so why don't you show how you can do it in your products?"

In the multiple tweets that followed, this is where we are at the moment:

  • Dan has posted on GitHub a PoC of the login page they wrote the article about: https://github.com/denimgroup/authexamples
  • Rafal asked HP's internal WebInspect gurus if they could do it, and they said yes! (let's see if they show us how, using Dan's PoC)
  • Rafal also confirmed that HP's forums are the only place to look for that type of content: "I don't have articles like that, but our forums are buzzing with lots of real users asking/solving real complex challenges"
  • Jeremiah also confirmed the lack of community posts on their engine, which of course is a bit harder since WhiteHat is the main (if not only) user of their scanner (I still think that is a big mistake on their part, but that is a topic for another post). That said, Jeremiah did point us to a number of interesting technical posts on how they think/operate:
  • So far, other scanners/tools have not joined the party, but it would be cool if they did (Netsparker, ZAP, Burp, Metasploit, Qualys, Seeker, O2, etc...)
    • It would also be good to see an example from an IBM AppScan Standard Expert (showing us how to achieve the same results without using Burp)
    • And we really need other security consulting companies to join the conversation and show us how their amazing internal skills/scripts can handle this type of situation
So what needs to happen next is that we need similar reports/workflows (one per tool/service/security team), providing detailed technical steps on how to perform the exact same test (in fact, it would probably be better if Dan (or William) redid their article using the PoC code provided in https://github.com/denimgroup/authexamples).
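
To make the scenario concrete, here is a minimal sketch of the kind of multi-step login flow that trips up a scanner's single-request login recorder. This is my own hypothetical code, not the code from the authexamples repo: the Flask routes, field names, and token handling are all assumptions made purely for illustration.

```python
# Hypothetical multi-step login sketch (NOT the code from denimgroup/authexamples):
# step 1 validates the username and issues a one-time token, step 2 requires that
# token plus the password, and only then is an authenticated session created.
# Scanners that record a single login POST never obtain the step-1 token.
import secrets

from flask import Flask, abort, jsonify, redirect, request, session

app = Flask(__name__)
app.secret_key = secrets.token_hex(16)

USERS = {"demo": "demo-password"}  # toy credential store for the example


@app.route("/login/step1", methods=["POST"])
def login_step1():
    username = request.form.get("username", "")
    if username not in USERS:
        abort(401)
    # Remember who is mid-login and hand back a token that step 2 must echo.
    session["pending_user"] = username
    session["login_token"] = secrets.token_urlsafe(16)
    return jsonify({"token": session["login_token"]})


@app.route("/login/step2", methods=["POST"])
def login_step2():
    username = session.get("pending_user")
    token = request.form.get("token", "")
    password = request.form.get("password", "")
    # Reject if step 1 was skipped, the token is stale, or the password is wrong.
    if not username or token != session.get("login_token"):
        abort(401)
    if USERS.get(username) != password:
        abort(401)
    session.clear()
    session["user"] = username  # the authenticated session starts only here
    return redirect("/protected")


@app.route("/protected")
def protected():
    if "user" not in session:
        abort(403)
    return f"hello, {session['user']}"


if __name__ == "__main__":
    app.run(debug=True)
```

The point of the sketch is that an authenticated session only exists after two dependent requests have succeeded, which is exactly the kind of flow where people end up chaining a proxy like Burp (or a custom script) in front of their scanner.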

And why we need this:
  • There are users out there today who have similar problems, have bought/downloaded one of these tools, and need better clues on how to solve their problem (sometimes reading how another tool solved a problem can be very useful)
  • By comparing the tools' performance and capabilities side by side (on the same real-world scenario), we will have a much better understanding of what works and what doesn't
  • The tools' developers (especially the ones whose tools perform worse than the others) will have a much better idea of what to do to improve their product.
Now the question is, will the tools/services/security-teams want to dance?

I hope so, since we really need this type of dialogue and debate in our industry, and the really interesting discussion will happen when we evolve from login pages into specific frameworks' scenarios :)