We have released two days of videos covering how to use IDA Pro to reverse the same CMU Binary Bomb lab that we cover in our Intro x86 assembly language class (where you have no tools more sophisticated than GDB). The class also covers topics such as how you can tell when an application is extracting data from its resources, how to infer structures and C++ class definitions, and generally how C++ constructs such as classes, constructors/destructors, and virtual function tables manifest themselves in assembly.
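To make that last point concrete, here is a minimal sketch (the class names and the assembly in the comments are illustrative, not taken from the class materials) of how a virtual function call shows up in disassembly: the compiler emits a double indirection through the object's vtable pointer, and recognizing that pattern is how you start reconstructing a class hierarchy.

```cpp
// Illustrative example (not from the class materials): a class with a
// virtual method, and the kind of x86 call sequence a compiler emits for it.
#include <cstdio>

class Shape {
public:
    virtual void draw() { std::printf("Shape::draw\n"); }
    virtual ~Shape() {}
};

class Circle : public Shape {
public:
    void draw() override { std::printf("Circle::draw\n"); }
};

void render(Shape *s) {
    // A virtual call compiles to a double indirection, roughly (e.g. with
    // 32-bit MSVC's thiscall convention, where 'this' is in ecx):
    //   mov eax, [ecx]        ; load the vtable pointer from the object
    //   call dword ptr [eax]  ; call through the first vtable slot
    // Spotting this pattern in a disassembler is how you recognize virtual
    // dispatch even when no symbols are present.
    s->draw();
}

int main() {
    Circle c;
    render(&c);  // prints "Circle::draw" via the vtable
    return 0;
}
```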
You can find the class page here:
http://www.OpenSecurityTraining.info/In ... ering.html
But I would like to get your opinions and feedback on another matter. Would you kindly direct your attention to the pic at the bottom of the above-linked page and click on it? ;)
This is a map of the material covered on day 1 of the class. Someday we would like to host zoomable maps, à la Khan Academy's knowledge map (http://www.khanacademy.org/exercisedashboard), but that's not going to happen until we get some volunteers who know JavaScript.
A couple of very interesting things come out of this map. The first has to do with understanding the least-effort paths to answering a given question. My discussion specifically references the "General Approach & Questions to Ask When Analyzing a Function" tree. Matt talked a little bit about the general things a reverse engineer should be on the lookout for when analyzing a function. Xeno then added a few more questions. But clearly there are many more "general" questions which can be asked that pertain to specific goals for RE. E.g., someone analyzing code for exploitable conditions can ask whether the node in the CFG where a crash occurs is manipulating user-controlled inputs. Or someone trying to bypass a license number input might ask, "Where is the license number input?" (see the sketch below).
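As a hedged illustration of that last question (the program, string, and function name here are hypothetical, not from the class), the prompt string often serves as a static-analysis anchor: you would find it in IDA's strings window, follow its cross-reference into the function that prints it, and land directly on the comparison logic you want to understand or bypass.

```cpp
// Hypothetical target (not from the class materials). A reverse engineer
// answering "where is the license number input?" would locate the literal
// "Enter license key: " and follow its xref into this code.
#include <iostream>
#include <string>

static bool check_license(const std::string &key) {
    // The comparison against a literal is exactly the kind of CFG node you
    // hunt for when the goal is to understand or bypass the check.
    return key == "AAAA-BBBB-CCCC";
}

int main() {
    std::string key;
    std::cout << "Enter license key: ";
    std::getline(std::cin, key);
    std::cout << (check_license(key) ? "Valid." : "Invalid.") << "\n";
    return 0;
}
```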
So the first question I would like to ask is: what other general questions do you think a reverse engineer should ask when approaching software? (Ideally these questions are not x86-specific.) Once we have this sort of feedback, we will make sure that the next time this class is taught, we explicitly call out and speak directly to methodologies for using static analysis to answer those questions on x86 Windows code. We will answer with an eye toward the questions that are not better answered with dynamic analysis, and we will provide some notional guidance on what can be inferred from the answers to these questions. This was always one of the vague goals of the class, but it didn't become crisp until we made this graph after the class.
The second interesting thing about that questions tree is how it teases out the relationship between static and dynamic analysis. Some questions can be answered by both, and some can be answered by only one. So with this map in hand, we're going to make sure our dynamic analysis class in the fall explicitly tries to answer as many of these questions as possible. All the more reason to submit questions for consideration now.
And the third interesting thing about making knowledge maps is how they allow for semi-objective comparison between classes. The first-order comparison metric is simple breadth: "Class A covers 80% of the knowledge map, and (free) class B covers 65%, so is it really worth taking class A?" This is part of the value of OST: getting instructors to up their game and keep branching out to new areas, rather than just teaching really basic intro material. But breadth is only the first metric of comparison; next you would like to be able to use depth: "Class A covers topic 1 in slides only, for 10 minutes across 10 slides, while Class B has 10 slides plus a 20-minute lab." While slide counts and lab time give only a fuzzy sense of depth, they certainly speak to the potential for student absorption. The last, and most subjective, metric we might like to layer on top of a knowledge graph is whether a given topic is paired with a "good" exercise to facilitate its absorption (which is the hallmark of more mature classes). As just mentioned, more lab time can help students absorb the material, because they are forced to reuse the commands they may have just learned about in the slides. Alternatively, training games (like the ones we're trying to develop for the r0x0r arcade, http://code.google.com/p/roxor-arcade/) can be used to reinforce specific topics.
So those are just some thoughts and questions for you, the community. Thanks for reading, and let us know your feedback on the knowledge map for day 1 of this class.
The OpenSecurityTraining Team