July 17th-21st, 2011, Toronto, ON, Canada


Ninth International Workshop on Dynamic Analysis (WODA 2011), July 18, 2011, Toronto, ON, Canada

WODA 2011 – Proceedings


Ninth International Workshop on Dynamic Analysis (WODA 2011)

Preface

Title Page

Foreword
International Workshop on Dynamic Analysis (WODA 2011)

Session I: Analyzing Logs and Traces

Detecting Algorithms using Dynamic Analysis
Kenneth Oksanen
(Aalto University, Finland)
Detecting a given algorithm in a program without access to its source code can be valuable in many tasks, ranging from intellectual property management to verifying the program's security properties. Unfortunately, approaches based on decompiling or reverse-engineering the program suffer from prohibitively high costs as well as theoretical limitations. Instead, we base our work on examining the program's internal dynamic behavior and searching it for tell-tale signs of the given algorithm using various pattern-matching and statistical analysis techniques.
A Method Facilitating Integration Testing of Embedded Software
Dominik Hura and Michał Dimmich
(Delphi Poland S.A., Poland; Silesian University of Technology, Poland; Wroclaw University of Technology, Poland)
This paper outlines a method of supporting integration testing based on logging the operation of an embedded system's software written in the C language. Its purpose is to facilitate the process of integration testing and to partially automate it. The method enables automatic verification of tests described with UML sequence diagrams by means of a log analyzer based on state machines running in parallel. The class of UML diagrams to which the method is applicable is also defined. A short overview of the proposed method's advantages and disadvantages is given. Finally, an example is provided of an embedded system for which the described method can be used.
Dynamic Invariant Detection for Relational Databases
Jake Cobb, Gregory M. Kapfhammer, James A. Jones, and Mary Jean Harrold
(Georgia Tech, USA; Allegheny College, USA; UC Irvine, USA)

Despite the many automated techniques that benefit from dynamic invariant detection, to date, none are able to capture and detect dynamic invariants at the interface of a program and its databases. This paper presents a dynamic invariant detection method for relational databases and for programs that use relational databases and an implementation of the approach that leverages the Daikon dynamic-invariant engine. The method defines a mapping between relational database elements and Daikon’s notion of program points and variable observations, thus enabling row-level and column-level invariant detection. The paper also presents the results of two empirical evaluations on four fixed data sets and three subject programs. The first study shows that dynamically detecting and inferring invariants in a relational database is feasible and 55% of the invariants produced for each subject are meaningful. The second study reveals that all of these meaningful invariants are schema-enforceable using standards-compliant databases and many can be checked by databases with only limited schema constructs.
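The row-level and column-level invariants described above can be pictured with a toy example (ours, not drawn from the paper): treat a table as a program point and each row as one observation of its column variables, then check a candidate column-level invariant across all observed rows.

```python
# Hedged sketch of column-level invariant detection over a relation
# (our own toy example; the actual approach feeds such observations
# to the Daikon engine rather than checking invariants by hand).

rows = [  # toy relation: (id, price, discounted_price)
    (1, 100, 90),
    (2, 250, 225),
    (3, 80, 72),
]

# Candidate invariant: discounted_price <= price. It holds for every
# observed row, so a dynamic detector would report it as a likely
# invariant of this table.
invariant_holds = all(discounted <= price for _, price, discounted in rows)
print(invariant_holds)  # True
```

Such an inferred invariant could then be checked for schema-enforceability, e.g. as a CHECK constraint in a standards-compliant database.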



Session II: Optimizing Dynamic Analysis

Custom-made Instrumentation Based on Static Analysis
Tobias Gutzmann and Welf Löwe
(Linnaeus University, Sweden)
Many dynamic analysis tools capture the occurrences of events at runtime. The longer programs are being monitored, the more accurate the data they provide to the user. At the same time, the runtime overhead must be kept as low as possible, because it decreases the user's productivity. Runtime performance overhead occurs due to identifying events and storing them in a result data structure. We address the latter issue by generating custom-made instrumentation code for each program. By using static analysis to gain a priori knowledge about which events of interest can occur and where they can occur, tailored code for storing those events can be generated for each program. We evaluate our idea by comparing the runtime overhead of a general-purpose dynamic analysis tool that captures points-to information for Java programs with approaches based on custom-made instrumentation code. Experiments suggest greatly reduced performance overhead for the latter.
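The payoff of a priori event knowledge can be illustrated with a small sketch (our own toy example, not the authors' tool): if static analysis enumerates exactly which (site, event) pairs can occur, the generic hash-map store of a general-purpose tool can be replaced by a preallocated array indexed by compact, precomputed ids.

```python
# Hedged sketch: tailored event storage generated from static knowledge.
# Suppose static analysis determined these are the only possible
# (site, event) pairs in the monitored program:
possible = [("site1", "alloc"), ("site1", "read"), ("site2", "alloc")]

# Built once, offline: a compact id per possible pair.
index = {pair: i for i, pair in enumerate(possible)}

# Custom-made result data structure: a fixed-size counter array
# instead of a general-purpose hash map.
counts = [0] * len(possible)

def record(event_id):
    # Generated instrumentation: a single array increment, no hashing.
    counts[event_id] += 1

# Generated call sites would embed the precomputed constant ids:
record(index[("site1", "alloc")])
record(index[("site2", "alloc")])
record(index[("site1", "alloc")])
print(counts)  # [2, 0, 1]
```

The design point is that all hashing and lookup work moves to generation time, leaving only constant-index increments on the monitored program's hot path.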
Continuation Equivalence: A Correctness Criterion for Static Optimizations of Dynamic Analyses
Eric Bodden
(TU Darmstadt, Germany)
Dynamic analyses reason about a program's concrete heap and control flow and hence can report on actual program behavior with high or even perfect accuracy. But many dynamic analyses require extensive program instrumentation, often slowing down the analyzed program considerably. In the past, researchers have hence developed specialized static optimizations that can prove instrumentation for a specific analysis unnecessary at many program locations: the analysis can safely omit monitoring these locations, as their monitoring would not change the analysis results. Arguing about the correctness of such optimizations is hard, however, and ad-hoc approaches have led to mistakes in the past. In this paper we present a correctness criterion called Continuation Equivalence, which allows researchers to prove static optimizations of dynamic analyses correct more easily. The criterion demands that an optimization may alter instrumentation at a program site only if the altered instrumentation produces a dynamic analysis configuration equivalent to the configuration of the un-altered program with respect to all possible continuations of the control flow. In previous work, we have used a notion of continuation-equivalent states to prove the correctness of static optimization for finite-state runtime monitors. With this work, we propose to generalize the idea to general dynamic analyses.

Session III: Programming and Dynamic Analysis

Retroactive Aspects: Programming in the Past
Robin Salkeld, Brendan Cully, Geoffrey Lefebvre, Wenhao Xu, Andrew Warfield, and Gregor Kiczales
(University of British Columbia, Canada)
We present a novel approach to the problem of dynamic program analysis: writing analysis code directly into the program source, but evaluating it against a recording of the original program’s execution. This approach allows developers to reason about their program in the familiar context of its actual source, and take full advantage of program semantics, data structures, and library functionality for understanding execution. It also gives them the advantage of hindsight, letting them easily analyze unexpected behavior after it has occurred. Our position is that writing offline analysis as retroactive aspects provides a unifying approach that developers will find natural and powerful.
Sloppy Python: Using Dynamic Analysis to Automatically Add Error Tolerance to Ad-Hoc Data Processing Scripts
Philip J. Guo
(Stanford University, USA)
Programmers and data analysts get frustrated when their long-running data processing scripts crash without producing results, due to either bugs in their code or inconsistencies in data sources. To alleviate this frustration, we developed a dynamic analysis technique that guarantees scripts will never crash: It converts all uncaught exceptions into special NA (Not Available) objects and continues executing rather than crashing. Thus, imperfect scripts will run to completion and produce partial results and an error log, which is more informative than simply crashing with no results. We implemented our technique as a "Sloppy" Python interpreter that automatically adds error tolerance to existing scripts without any programmer effort or run-time slowdown.
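The exception-to-NA conversion described above can be sketched in ordinary Python (a minimal illustration of the idea via a wrapper; the actual work modifies the interpreter itself, so no wrapper or programmer effort is needed):

```python
# Hedged sketch: convert uncaught exceptions into NA objects so a
# data processing script runs to completion with partial results.

class NA:
    """Sentinel standing in for a value that could not be computed."""
    def __init__(self, error):
        self.error = error  # keep the exception for the error log
    def __repr__(self):
        return "NA"

def sloppy(func):
    """Wrap a function so any uncaught exception yields NA."""
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as exc:
            return NA(exc)
    return wrapper

@sloppy
def parse_record(line):
    name, value = line.split(",")
    return (name.strip(), int(value))

rows = ["a, 1", "b, oops", "c, 3"]
results = [parse_record(r) for r in rows]
# Imperfect input yields partial results plus NA markers instead of
# a crash with no results at all.
print(results)  # [('a', 1), NA, ('c', 3)]
```

Each NA carries its originating exception, which is what makes an informative error log possible alongside the partial results.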
