
2014 International Conference on Software and Systems Process (ICSSP), May 26–28, 2014, Nanjing, China

ICSSP 2014 – Proceedings


Frontmatter

Title Page


Message from the Chairs
Welcome to the International Conference on Software and Systems Process (ICSSP) 2014, held in Nanjing, China, from May 26th to 28th, 2014. Continuing the success of the Software Process Workshop (SPW), the Workshop on Software Process Simulation Modeling (ProSim), and the International Conference on Software Process (ICSP) series, ICSSP has become an established premier event in the field of software and systems engineering processes and is held in cooperation with ACM SIGSOFT. It provides a leading forum for the exchange of research outcomes and industrial best practices in process development from the software and systems disciplines.

Committees
Organisation information

Keynotes

The Status and Prospect of Process Improvements in China
Bosheng Zhou
(Beihang University, China)
Process improvement in China started in 2000. Over the past decade, service-type appraisals have accounted for less than 1% of all appraisals; accordingly, we are now paying much more attention to them. A large enterprise's business may span development, services, and acquisition. Given some negative phenomena that have appeared recently in process improvement in China, such as unreasonably shortened appraisal periods and the pursuit of quick payback, a healthy environment for process improvement should be fostered. To develop software and systems process improvement effectively, not only must engineering issues be addressed, but psychology, management, ethics, and sociology must also be taken into account. In addition, the Chinese community needs to broaden PI activities, especially into the tertiary industry, to implement multi-model appraisals, and to improve social-science skills for PI. It is also important to establish a healthy PI ecosystem, to actively develop collaborative, communicating communities, and to build up shared repositories.

Principles for Successful Systems and Software Processes
Barry Boehm
(University of Southern California, USA)
This paper summarizes several iterations in developing a compact set of four key principles for successful systems engineering: 1) Stakeholder Value-Based Guidance, 2) Incremental Commitment and Accountability, 3) Concurrent Multidiscipline Engineering, and 4) Evidence- and Risk-Based Decisions. It provides a rationale for the principles, including short example case studies of failed projects that did not apply the principles and of successful projects that did. It compares the principles with other sets, such as the Lean Systems Engineering principles and the Hitchins principles for successful systems and systems engineering, and indicates how the principles can help projects and organizations cope with increasing needs for process diversity and change.

Software Processes for a Changing World
Kevin T. Ryan
(Lero, Ireland; University of Limerick, Ireland)
The most remarkable feature of the modern world is constant and rapid change. Our software systems must be able to reflect and facilitate this change, not just tolerate it, as has been the case up to now. Increasingly, software is the key enabler of innovation, service improvement, and constant evolution, but in many domains the level of quality that is needed can only be reached by freezing the software after exhaustive and exhausting testing. Society and industry demand better. Software development processes have reflected the same shortcomings, and ICSSP 2014 is a significant step towards recognizing and solving them. Lero, the Irish Software Engineering Research Centre, focuses its research on 'Evolving Critical Systems'. Working with leading software companies, Lero is developing methods and processes that support software evolution both at the design stage and during runtime. The challenge is to meet the stringent requirements of some domains while facilitating this evolution. Despite the skepticism of some developers, agile methods have been successfully adapted by Lero to meet the needs of highly regulated industries. Two examples illustrate how methods and processes themselves must evolve to meet the changing needs of process users.

Measurement and Analysis

A Model for Analyzing Estimation, Productivity, and Quality Performance in the Personal Software Process
Mushtaq Raza and João Pascoal Faria
(University of Porto, Portugal; INESC TEC, Portugal)
High-maturity software development processes that make intensive use of metrics and quantitative methods, such as the Team Software Process (TSP) and the accompanying Personal Software Process (PSP), can generate a significant amount of data that can be periodically analyzed to identify performance problems, determine their root causes, and devise improvement actions. However, tool support for automating the data analysis and for recommending improvement actions is lacking; such support would diminish the manual effort and expert knowledge required. In this paper we therefore propose a comprehensive performance model, addressing time estimation accuracy, quality, and productivity, to enable the automated (tool-based) analysis of performance data produced in the context of the PSP, namely, to identify performance problems and their root causes and subsequently recommend improvement actions. Performance ranges and dependencies in the model were calibrated and validated, respectively, using a large PSP data set covering more than 30,000 finished projects.

COCOMO II Parameters and IDPD: Bilateral Relevances
Ramin Moazeni, Daniel Link, and Barry Boehm
(University of Southern California, USA)
The phenomenon called Incremental Development Productivity Decline (IDPD) is presumed to be present in all incremental software projects to some extent. COCOMO II is a popular parametric cost estimation model that has not yet been adapted to account for the challenges that IDPD poses to cost estimation. Instead, its cost drivers and scale factors stay constant throughout the increments of a project. While a simple response could be to make these parameters variable per increment, this raises the question of whether the existing parameters suffice to predict the behavior of an incrementally developed project even in that case. Individual COCOMO II parameters are evaluated with regard to how they develop over the course of increments and how they influence IDPD, and the reverse is also done. In light of data collected in recent experimental projects, additional new variable parameters that either extend COCOMO II or could stand on their own are proposed.
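As a rough illustration (not taken from the paper), the COCOMO II post-architecture effort equation that these cost drivers and scale factors feed into can be sketched as follows. The constants A = 2.94 and B = 0.91 are the COCOMO II.2000 calibration values; the per-increment variant, which re-evaluates the equation with increment-specific effort multipliers, is purely our assumption about one way variability per increment could be modeled.

```python
def cocomo_ii_effort(ksloc, effort_multipliers, scale_factors, a=2.94, b=0.91):
    """COCOMO II post-architecture effort in person-months:
    effort = A * size^E * product(EM_i), with E = B + 0.01 * sum(SF_j).
    A and B default to the COCOMO II.2000 calibration values.
    """
    e = b + 0.01 * sum(scale_factors)
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return a * (ksloc ** e) * em_product

def incremental_effort(increment_sizes, em_per_increment, scale_factors):
    """Hypothetical per-increment variant: re-evaluate the equation for each
    increment with its own effort multipliers (one way IDPD might be modeled)."""
    return sum(
        cocomo_ii_effort(size, ems, scale_factors)
        for size, ems in zip(increment_sizes, em_per_increment)
    )
```

Letting the multipliers grow across increments would model a productivity decline; whether constant or per-increment parameters actually capture IDPD is exactly the question the paper investigates.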

Initial Evaluation of Data Quality in a TSP Software Engineering Project Data Repository
Yasutaka Shirai, William Nichols, and Mark Kasunic
(Toshiba, Japan; SEI, USA)
To meet critical business challenges, software development teams need data to effectively manage product quality, cost, and schedule. The Team Software Process (TSP) provides a framework that teams use to collect software process data in real time, using a defined, disciplined process. This data holds promise for use in software engineering research. We combined data from 109 industrial projects into a database to support performance benchmarking and model development. But is the data of sufficient quality to draw conclusions? We applied various tests and techniques to identify data anomalies that affect the quality of the data in several dimensions. In this paper, we report some initial results of our analysis, describing the amount and rates of identified anomalies and suspect data, including incorrectness, inconsistency, and credibility. To illustrate the types of data available for analysis, we provide three examples. The preliminary results of this empirical study suggest that some aspects of the data quality are good and the data are generally credible, but size data are often missing.

Software Processes I

Towards an Understanding of Enabling Process Knowing in Global Software Development: A Case Study
Mansooreh Zahedi and Muhammad Ali Babar
(IT University of Copenhagen, Denmark; University of Adelaide, Australia)
A shared understanding of Software Engineering (SE) processes, which we call process knowing, is required for effective communication and coordination within a team in order to improve team performance. SE process knowledge can include roles, responsibilities, and the flow of information over a project lifecycle. Developing and sustaining process knowledge can be more challenging in Global Software Development (GSD), where distances can limit a team's ability to develop a common understanding of processes. Anecdotes of the problems caused by a lack of common understanding of processes in GSD are pervasive, but there is no reported empirical effort aimed at exploring solutions that enable process knowing in GSD. We report a case study aimed at understanding an effort to enable process knowing for improving processes in GSD. The findings provide useful insights into the potential challenges of lacking process knowing and into how an organization can enable process knowing to achieve the desired results, which also helps increase social interactions and positive behavioral changes.

A Case Study on Software Ecosystem Characteristics in Industrial Automation Software
Daniela Lettner, Florian Angerer, Herbert Prähofer, and Paul Grünbacher
(JKU Linz, Austria)
In software ecosystems (SECOs), both internal and external developers build software solutions for specific market segments based on common technological platforms. Despite a significant body of research on SECOs, there is still a need to empirically investigate the characteristics of SECOs in specific industrial environments in order to understand and improve development processes. In particular, when defining software processes, understanding the roles of the participants in the SECO is crucial. This paper therefore reports results of an exploratory case study in the industrial automation domain. We explore two research questions on SECO characteristics and discuss research issues derived from our analyses. While our study confirms key SECO characteristics reported in the literature, we also identify additional properties relevant for development processes in the domain of industrial automation.

Software Process Simulation Modeling: Preliminary Results from an Updated Systematic Review
Chao Gao, Shu Jiang, and Guoping Rong
(Nanjing University, China)
Software Process Simulation Modeling (SPSM) has attracted research interest since the 1980s. However, the number of SPSM studies published in the ICSSP community appears to have dropped in recent years. The objective of this research is to update the status of this area. We conducted a Systematic Literature Review (SLR) using the QGS-based search strategy. The review identified 74 primary studies from the past five years (2008-2012). This paper presents preliminary results from this updated SLR by answering the first four research questions. Based on the findings of this updated review, we conclude that, in terms of the number of SPSM studies found in the overall software engineering community, there is no significant change (drop) compared to the former review period (1998-2007).

Agile and Refactoring

Refactoring Planning and Practice in Agile Software Development: An Empirical Study
Jie Chen, Junchao Xiao, Qing Wang, Leon J. Osterweil, and Mingshu Li
(Institute of Software at Chinese Academy of Sciences, China; University of Chinese Academy of Sciences, China; University of Massachusetts at Amherst, USA)
Agile software engineering increasingly seeks to incorporate design modification and continuous refactoring in order to maintain code quality even in highly dynamic environments. However, there does not currently appear to be an industry-wide consensus on how to do this, and research in this area expresses conflicting opinions. This paper presents an empirical study, based on an industry survey, aimed at understanding how refactoring is viewed by people in different roles in agile processes and how they weigh the importance of refactoring against other kinds of tasks. The study found good support for the importance of refactoring, and most respondents agreed that deferred refactoring impacts the agility of their process, yet there was no universally agreed-upon strategy for planning refactoring. The survey findings also indicated that different roles have different perspectives on the kinds of tasks in an agile process, although all seem to want to increase the priority given to refactoring when planning iterations in agile development. Analysis of the survey raised many interesting questions, suggesting the need for a considerable amount of future research.

Agility beyond Software Development
Dan X. Houston
(Aerospace Corporation, USA)
Agile software development grew out of a variety of alternative software development methods that shared a common set of values and principles. After two decades, agile software development remains loosely defined, but has been widely accepted. This acceptance has gained the attention of other fields with discussions of applying agile to their work, for example agile systems engineering and agile program management. However, agile was defined in terms of software development, both in practice and in principle. Therefore, translation into other fields has been challenging. This paper derives a set of agile characteristics and discusses two benefits of accepting such a set of characteristics for (a) application of agile to other fields beyond software development and (b) for measurement of agility.

Agile Development with Software Process Mining
Vladimir Rubin, Irina Lomazova, and Wil M. P. van der Aalst
(National Research University, Russia; Eindhoven University of Technology, Netherlands)
Modern companies continue investing more and more in the creation, maintenance and change of software systems, but the proper specification and design of such systems continues to be a challenge. The majority of current approaches either ignore real user and system runtime behavior or consider it only informally. This leads to a rather prescriptive top-down approach to software development.
In this paper, we propose a bottom-up approach, which takes event logs (e.g., trace data) of a software system for the analysis of the user and system runtime behavior and for improving the software. We use well-established methods from the area of process mining for this analysis. Moreover, we suggest embedding process mining into the agile development lifecycle.
The goal of this position paper is to motivate the need for foundational research in the area of software process mining (applying process mining to software analysis) by showing its relevance and listing open challenges. Our proposal is based on our experience with analyzing a large production tourism system. This system was developed using agile methods, and process mining could be effectively integrated into the development lifecycle.

Software Processes II

Software Domains in Incremental Development Productivity Decline
Ramin Moazeni, Daniel Link, Celia Chen, and Barry Boehm
(University of Southern California, USA)
This research paper expands on a previously introduced phenomenon called Incremental Development Productivity Decline (IDPD) that is presumed to be present in all incremental software projects to some extent. Incremental models are now being used by many organizations in order to reduce development risks. Incremental development has become the most common method of software development. Therefore its characteristics inevitably influence the productivity of projects. Based on their observed IDPD, incrementally developed projects are split into several major IDPD categories. Different ways of measuring productivity are presented and evaluated in order to come to a definition or set of definitions that is suitable to these categories of projects. Data has been collected and analyzed, indicating the degree of IDPD associated with each category. Several hypotheses have undergone preliminary evaluations regarding the existence, stability and category-dependence of IDPD with encouraging results. Further data collection and hypothesis testing is underway.

A Collaborative Method for Business Process Oriented Requirements Acquisition and Refining
Han Lai, Rong Peng, and Yuze Ni
(Wuhan University, China; Chongqing Technology and Business University, China)
Requirements Elicitation (RE) is a critical process in system/software engineering. Its goal is to capture the stakeholders' expectations, needs, and constraints, which can be elicited, analyzed, and specified as requirements. Gathering requirements correctly, clearly, and completely in a natural way is a typical challenge, because requirements analysts usually dominate the elicitation process while stakeholders participate only passively. In this paper, we propose a collective-intelligence-driven, business process oriented requirements acquisition and refining method (BPCRAR). Its aim is to reduce the requirements analysts' dominance and to promote stakeholders' self-expression and self-improvement so that requirements are elicited clearly and completely. It adopts the group storytelling method to promote collaboration and communication among stakeholders, utilizes a narrative network model to enhance the associations among story fragments, and introduces dialogue game theory to guide progressive refining. Activity theory is adopted as the description framework to present the method, and an application example is introduced. Finally, a pilot experiment is carried out to evaluate the method's perceived usefulness and perceived ease of use, as well as the actual quality of the elicited requirements, in terms of completeness and understandability, in comparison with JAD. The results show that the requirements elicited by BPCRAR are more complete and understandable than those elicited by JAD. In addition, BPCRAR was perceived as useful and easy to use in the experiment.

Processes for Embedded Systems Development: Preliminary Results from a Systematic Review
Guoping Rong, Tianyu Liu, Mingjuan Xie, Jieyu Chen, Cong Ma, and Dong Shao
(Nanjing University, China)
With the proliferation of embedded ubiquitous systems in all aspects of human life, the development of embedded systems has been facing more and more challenges (e.g., quality, time to market, etc.). Meanwhile, many software processes have reportedly been applied in Embedded Systems Development (ESD), with various advantages and disadvantages. It is therefore important to paint a big picture of the state of the practice in adopting software processes in ESD, which may benefit both practitioners and researchers in this area. This paper presents our investigation of this topic using a systematic review intended to: 1) identify typical challenging factors and how software processes and practices address them; and 2) discover improvement opportunities from both academic and industrial perspectives.

Software Process Models

Realizing Software Process Lines: Insights and Experiences
Marco Kuhrmann, Daniel Méndez Fernández, and Thomas Ternité
(TU München, Germany; TU Clausthal, Germany)
Software process lines provide a systematic approach to constructing and managing software processes. A process line defines a reference process containing general process assets, while a well-defined customization approach allows process engineers to create new process variants by, e.g., extending or altering process assets. Variability operations are a powerful instrument for realizing a process line. However, little is known about which variability operations are suitable in practice. In this paper, we present a study on the feasibility of variability operations to support process lines in the context of the German V-Modell XT. We analyze which variability operations were defined and to what extent they were used, and we provide a catalog of variability operations as an improvement proposal for other process models. Our findings show 69 variability operations defined across several metamodel versions, of which 25 remain unused. Furthermore, we find that variability operations can help process engineers compensate for process metamodel evolution.

Guiding the Adoption of Software Development Methods
Natalja Nikitina and Mira Kajko-Mattsson
(KTH, Sweden)
Literature shows that as many as 82% of the organizations that adopt agile methods experience problems in their agile adoptions. Despite this, very few reports have provided guidelines for how to conduct software method adoption. This paper suggests a process model of software method adoption and lists contextual factors for guiding the deployment of software development methods. The adoption model and the contextual factors have been evaluated in six industrial method adoption projects and they have proven to be useful for guiding organizations in their software method adoption efforts.

Artifact-Based Software Process Improvement and Management: A Method Proposal
Marco Kuhrmann and Sarah Beecham
(TU München, Germany; Lero, Ireland; University of Limerick, Ireland)
When it comes to software process improvement (SPI), process engineers look for SPI methods to support process analysis, design, realization, deployment, and management. Although a number of different SPI methods and models exist, process engineers tend to view these as too generic, too large, or a poor fit for the organization in which SPI is conducted. A strategy to overcome these shortcomings is to concentrate on the artifacts, which precisely define the desired outcomes, rather than on specific methods. In this paper, we present the Artifact-based Software Process Improvement & Management (ArSPI) model, which provides a unified perspective on SPI and company-wide software process management (SPM), the required key artifacts, and the life cycle models. ArSPI has proven to be of practical support to industry practitioners, who called for a practical way to define the interfaces between SPI projects. The paper concludes with an example of how ArSPI paved the way for several organizations by applying the model in real-world SPI projects.

Verification

Throughput Based Temporal Verification for Monitoring Large Batch of Parallel Processes
Xiao Liu, Dingxian Wang, Dong Yuan, Futian Wang, and Yun Yang
(East China Normal University, China; Swinburne University of Technology, Australia; Anhui University, China)
On-time completion is one of the most important QoS (Quality of Service) dimensions for business processes running in the cloud. While today's business systems often need to handle thousands of concurrent user requests, process monitoring is basically conducted one process at a time. Repeating the strategies for monitoring a single process a thousand times could monitor a thousand parallel processes, but the time overhead would also increase a thousand-fold, which poses a big challenge for process monitoring. In this paper, based on a novel runtime throughput consistency model, we propose a QoS-aware, throughput-based checkpoint selection strategy that can dynamically select a small number of checkpoints along the system timeline to facilitate the temporal verification of throughput constraints and achieve the target on-time completion rate. The experimental results demonstrate that our strategy achieves the best efficiency and effectiveness compared with the state-of-the-art and other representative response-time-based checkpoint selection strategies.

Monitoring Data-Aware Business Constraints with Finite State Automata
Riccardo De Masellis, Fabrizio M. Maggi, and Marco Montali
(Sapienza University of Rome, Italy; University of Tartu, Estonia; Free University of Bolzano, Italy)
Checking the compliance of a business process execution with respect to a set of regulations is an important issue in several settings. A common way of representing the expected behavior of a process is to describe it as a set of business constraints. Runtime verification and monitoring facilities allow us to continuously determine the state of constraints on the current process execution, and to promptly detect violations at runtime. A plethora of studies has demonstrated that in several settings business constraints can be formalized in terms of temporal logic rules. However, in virtually all existing works the process behavior is mainly modeled in terms of control-flow rules, neglecting the equally important data perspective. In this paper, we overcome this limitation by presenting a novel monitoring approach that tracks streams of process events (that possibly carry data) and verifies if the process execution is compliant with a set of data-aware business constraints, namely constraints not only referring to the temporal evolution of events, but also to the temporal evolution of data. The framework is based on the formal specification of business constraints in terms of first-order linear temporal logic rules. Operationally, these rules are translated into finite state automata for dynamically reasoning on partial, evolving execution traces. We show the versatility of our approach by formalizing (the data-aware extension of) Declare, a declarative, constraint-based process modeling language, and by demonstrating its application on a concrete case dealing with web security.
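To give a flavor of automaton-based monitoring of a data-aware constraint, the sketch below hand-encodes a single Declare-style "response" constraint with a data guard as a two-state monitor. This is purely illustrative: the event names ("order", "ship"), the amount guard, and the simplified state space are our assumptions, whereas the paper derives the automata automatically from first-order LTL specifications.

```python
# Hand-built monitor for one data-aware "response" constraint:
# whenever an "order" event with amount > 100 occurs, a "ship" must follow.
SAT, PENDING = "possibly satisfied", "possibly violated"

def step(state, event, data):
    """Transition function of the two-state monitoring automaton."""
    if event == "order" and data.get("amount", 0) > 100:
        return PENDING          # activation: now awaiting a "ship"
    if event == "ship":
        return SAT              # fulfilment resolves the pending activation
    return state                # other events leave the state unchanged

def monitor(trace):
    """Return the monitor verdict after each event of an evolving trace."""
    state = SAT
    verdicts = []
    for event, data in trace:
        state = step(state, event, data)
        verdicts.append(state)
    return verdicts
```

Ending a trace in the "possibly violated" state means the constraint is violated if the execution stops there; a full monitor would also track multiple simultaneous activations and permanent verdicts, which this sketch deliberately omits.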

Hierarchical Timed Automata Based Verification of Dynamic Evolution Process in Open Environments
Yu Zhou, Jidong Ge, and Pengcheng Zhang
(Nanjing University of Aeronautics and Astronautics, China; Nanjing University, China; Hohai University, China)
This paper proposes a novel approach based on hierarchical timed automata to verify the consistency of dynamic evolution processes. Unlike traditional approaches, it investigates the problem from the behavioral perspective and examines the procedures before, during, and after the evolution process. Furthermore, our approach supports the direct modeling of temporal aspects as well as hierarchical structures. A flattening algorithm is presented to facilitate automated verification using the mainstream timed-automata-based model checker UPPAAL. A motivating example is discussed to demonstrate the feasibility of our approach.

Test and Reusability

When to Automate Software Testing? Decision Support Based on System Dynamics: An Industrial Case Study
Zahra Sahaf, Vahid Garousi, Dietmar Pfahl, Rob Irving, and Yasaman Amannejad
(University of Calgary, Canada; Atilim University, Turkey; University of Tartu, Estonia; Pason Systems, Canada)
Software test processes are complex and costly. To reduce testing effort without compromising effectiveness and product quality, automation of test activities has become a popular approach in the software industry. However, since test automation usually requires substantial upfront investment, it is not always more cost-effective than manual testing. To support decision-makers in finding the optimal degree of test automation in a given project, we propose in this paper a simulation model using the System Dynamics (SD) modeling technique. With the help of the simulation model, we can evaluate the performance of test processes with varying degrees of automation of test activities and help testers choose the optimal one. As a case study, we describe how we used our simulation model in the context of an Action Research (AR) study conducted in collaboration with a software company in Calgary, Canada. The goal of the study was to investigate how the simulation model can help decision-makers decide whether, and to what degree, the company should automate its test processes. As a first step, we compared the performance of the current, fully manual testing with several partly automated alternatives anticipated for implementation in the partner company. The development of the simulation model and the analysis of the simulation results helped the partner company gain a deeper understanding of the strengths and weaknesses of its current test process and supported decision-makers in the cost-effective planning of improvements to selected test activities.

An Expert-Based Cost Estimation Model for System Test Execution
Benedikt Hauptmann, Maximilian Junker, Sebastian Eder, Christian Amann, and Rudolf Vaas
(TU München, Germany; Munich Re, Germany)
To execute system tests, two fundamentally different execution techniques exist: manual and automated execution. For each system test suite, one must decide how to employ these techniques (this choice is called the execution mode). Even under general conditions such as fixed testing strategies or development philosophies, almost all projects permit a wide range of possible execution modes to choose from. In industry, execution techniques are often chosen by experts based on rules of thumb, experience, and best practices. Although the results are mostly tolerable, they may not be cost-effective. In retrospect, it is often unclear on what basis those decisions were made, making it difficult to assess whether they are still valid. Finally, it is hard to predict the costs of test execution beforehand. We introduce a cost model to estimate the economic impact of execution modes. Our cost model is based on expert estimations and gives testing experts additional input for balancing the pros and cons of the execution modes at hand. Furthermore, it helps document and persist decisions during the lifetime of a test suite. Additionally, we report on a first case study applying our cost model in industry.

On the Need to Study the Impact of Model Driven Engineering on Software Processes
Regina Hebig and Reda Bendraou
(LIP6, France; UPMC, France)
There is an increasing use of model-driven engineering (MDE) in industry. Despite the existence of research proposals for MDE-specific processes, the question arises whether and how the processes already used within a company can be reused when MDE is introduced. In this position paper, we report on a systematic literature review on how standard processes, such as Scrum or the V-Modell XT, can be combined with MDE. We observe that, although it is in some cases possible to reuse standard processes, the combination with MDE can also result in heavyweight changes to a process. Our goal is to draw attention to two arising research needs: the need to collect systematic knowledge about the influence of MDE on software processes, and the need to provide guidance for tailoring processes based on the set of MDE techniques used.

Resources

A Business Process Simulation Method Supporting Resource Evolution
Jimin Ling, Qi Feng, and Li Zhang
(Beihang University, China)
Business process simulation is the procedure of planning, modeling, and simulating an enterprise process to analyze features that vary with time; it guides users in making decisions and is an important means of process improvement. Existing simulation methods pay little attention to dynamic resources, which may lead to large deviations. To solve this problem, we propose a business process simulation method supporting resource evolution that mainly focuses on human resources. Individual differences and personnel composition are analyzed to represent the dynamic features of resources. The evolution of human capability and personnel changes are described to meet the simulation requirements of the human resource model, and a process simulation mechanism and algorithm are then realized. An experiment is conducted by simulating the software development process of a real-world software project using the system prototype we developed; the results show that our method is closer to the practical project situation, demonstrating the method's effectiveness to a certain extent. The main contribution of our work is a novel process simulation approach based on a dynamic resource model with configurable evolution rules.

A Gaussian Fields Based Mining Method for Semi-automating Staff Assignment in Workflow Application
Rongbin Xu, Xiao Liu, Ying Xie, Dong Yuan, and Yun Yang
(Anhui University, China; East China Normal University, China; Swinburne University of Technology, Australia)
Staff assignment is an important task in workflow resource management. Many well-known workflow applications still rely on human assigners, such as the process initiator or process monitor, to perform staff assignment. In this paper, we propose a semi-automatic workflow staff assignment method, based on a novel semi-supervised machine learning framework, that decreases the workload of the staff assigner. Our method learns from the workflow event log which kinds of activities each actor is capable of; once the labeled data have been learned, it can suggest a suitable actor to undertake the specified activities when a new process is assigned. With the proposed method, we achieve average prediction accuracies of 97% and 91% on the data sets of two manufacturing enterprise applications, respectively.
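The Gaussian-fields idea behind such semi-supervised labeling can be sketched as harmonic-function label propagation on an actor-similarity graph. This is the generic textbook formulation, not the authors' system; the similarity matrix, labels, and graph below are toy assumptions:

```python
import numpy as np

# Sketch of Gaussian-fields label propagation: actors are graph nodes,
# edge weights encode similarity of their event-log behaviour, and labels
# are the activity types they are known to perform.
def propagate(W, labels):
    """W: (n, n) symmetric similarity matrix; labels: length-n array with a
    class index for labelled nodes and -1 for unlabelled ones."""
    labelled = labels >= 0
    classes = np.unique(labels[labelled])
    # Row-normalised transition matrix P = D^{-1} W
    P = W / W.sum(axis=1, keepdims=True)
    # One-hot label matrix for the labelled nodes
    Fl = (labels[labelled, None] == classes[None, :]).astype(float)
    # Harmonic solution: F_u = (I - P_uu)^{-1} P_ul F_l
    Puu = P[np.ix_(~labelled, ~labelled)]
    Pul = P[np.ix_(~labelled, labelled)]
    Fu = np.linalg.solve(np.eye(Puu.shape[0]) - Puu, Pul @ Fl)
    pred = labels.copy()
    pred[~labelled] = classes[Fu.argmax(axis=1)]
    return pred

# Toy graph: actors 0-1 share one activity class, actors 3-4 another;
# actor 2 is unlabelled but behaves more like actors 3-4.
W = np.array([[0.0, 1.0, 0.1, 0.0, 0.0],
              [1.0, 0.0, 0.1, 0.0, 0.0],
              [0.1, 0.1, 0.0, 0.3, 0.3],
              [0.0, 0.0, 0.3, 0.0, 1.0],
              [0.0, 0.0, 0.3, 1.0, 0.0]])
labels = np.array([0, 0, -1, 1, 1])
pred = propagate(W, labels)
```

The unlabelled actor is assigned the class of its most similar neighbours, which is the mechanism by which a new process instance can be routed to a suitable actor.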

Panel Papers

Towards Context-Specific Software Process Selection, Tailoring, and Composition
Guoping Rong, Barry Boehm, Marco Kuhrmann, Evelyn Tian, Shijun Lian, and Ita Richardson
(Nanjing University, China; University of Southern California, USA; TU München, Germany; Ericsson, China; Siemens Shanghai Medical Equipment, China; Lero, Ireland; University of Limerick, Ireland)
As an approach to developing suitable development processes for software projects, Software Process Selection, Tailoring, and Composition (SP-STC) has attracted considerable attention from both industry and academia. However, without effective guidelines, how to perform SP-STC often remains a mystery. This special panel aims to 1) initiate a discussion on the current research status of SP-STC, 2) identify the main challenges of SP-STC and possible solutions, and 3) work out a research agenda for future work.

Are We Ready for Software Process Selection, Tailoring, and Composition?
Guoping Rong
(Nanjing University, China)
Software projects are performed in different contexts and thus require a context-specific selection and adoption of adequate methods. Suitable selection and tailoring, however, still constitute a challenging task. In this paper, we therefore discuss several issues concerning process definition and adoption, and motivate further research on improving evidence-based method selection and adoption for the respective context.

An Initial Process Decision Table and a Process Evolution Process
Barry Boehm
(University of Southern California, USA)
The Incremental Commitment Spiral Model (ICSM) presented in an ICSSP 2014 tutorial includes several decision milestones at which evidence of the feasibility of the proposed process is evaluated, and at which the stakeholders decide whether to proceed with it or to change course, based on the risk of proceeding. This generates a large number of potential processes, but we have found risk patterns that provide selection criteria for a set of common cases, at least for the initial process.

You Can't Tailor What You Haven't Modeled
Marco Kuhrmann
(TU München, Germany)
It is widely accepted that a one-size-fits-all process does not exist. Software processes need to be tailored to the respective context of companies and projects. However, tailoring a software process often remains a mystery. What is the actual context? What are the parameters for adjusting a process? What are the implications of tailoring criteria? Systematic process tailoring requires the ability to anticipate the needed flexibility early in process design and to express it in a process modeling language. In this paper, we discuss the design of process tailoring models, which we consider crucial for the design and, eventually, the application of flexible software processes. We advocate a constructive metamodel-based approach to improve process tailoring.

Journey to Agility for a Large Scale Telecom System
Evelyn Tian
(Ericsson, China)
Since the birth of the Agile Manifesto, many companies have started their Agile transformation journey. This position paper highlights key learnings and experience from the Agile transformation journey of a large-scale legacy telecom product.

SW Process Tailoring Practice in Medical Device Industry
Shijun Lian
(Siemens Shanghai Medical Equipment, China)
In this paper, we share our experience of tailoring our V-Model process to integrate Agile practices so as to meet the regulatory requirements of medical device software development while reaping the benefits of being Agile.

Software Processes: How Important Is Your Domain?
Ita Richardson
(Lero, Ireland; University of Limerick, Ireland)
There was a time when researching software processes meant just that – we were interested in making sure that the process for software development was effective. We did not really have to worry about the domains in which our software was used – well, maybe that was up to the requirements engineers or even those who were interested in usability, but it did not really affect the software processes through which the software was developed. But, things have changed! Software has become more ubiquitous. Software is used in products that are governed by regulation. Software is being developed in organisations that heretofore did not consider themselves software companies – such as automotive and medical device companies. As the manner in which software is being used has changed, so too must the processes by which software is developed. This paper presents the position that software processes can no longer ignore the domain – they have to change to ensure that software can be used wherever it is needed.

Tutorials

Understanding the Dynamics of Software Projects: An Introduction to Software Process Simulation
Dan X. Houston and Raymond Madachy
(Aerospace Corporation, USA; Naval Postgraduate School, USA)
Static representations of development processes provide a basis for communicating, coordinating, and planning work. However, they do not provide any information about the actual behavior of a project, including the effects of staffing decisions, quality-inducing activities, delays, resource contention, and so forth. Software process simulation (SPS) has demonstrated the capability to provide insight into the dynamics of software projects and to support project management decisions. This tutorial is an introduction to SPS that emphasizes practical approaches to modeling and simulation for both researchers and practitioners. We will discuss modeling and simulation, types of simulation, a historical overview of SPS, the disciplines that contribute to successful SPS work, modeling constructs commonly used to represent software development dynamics, and methods for conducting an SPS project.

Workflow Temporal Verification: An Efficient and Effective Approach for Delivering On-Time Completion
Xiao Liu and Yun Yang
(East China Normal University, China; Swinburne University of Technology, Australia)
In the real world, most workflow applications are time constrained, i.e., they must be completed while satisfying a set of temporal constraints such as local milestones and global deadlines. Meanwhile, due to the distributed nature of business processes and scientific workflows, most workflow systems run in dynamic computing environments such as the cloud. Therefore, guaranteeing the on-time completion of workflow applications becomes a critical yet challenging issue for enhancing the overall performance and usability of workflow systems. In this tutorial, we present a detailed overview of workflow temporal verification, one of the most efficient and effective approaches for delivering on-time completion of workflow applications. A general temporal verification framework consists of three major components, viz. temporal constraint setting, temporal consistency monitoring, and temporal exception handling. Details of each component and comprehensive experimental results will be demonstrated. After the tutorial, the audience will have a complete view of what workflow temporal verification is, as well as the state of the art and open issues in this field.
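As an illustration of the temporal consistency monitoring component, the following sketch (our own simplified formulation, not the tutorial's framework) checks at a workflow checkpoint whether the deadline is still likely to be met, assuming the remaining activities' durations are independent and normally distributed; all function names, thresholds, and numbers are illustrative:

```python
import math

def consistency(elapsed, remaining_means, remaining_vars, deadline):
    """Probability that the workflow still finishes by the deadline, assuming
    the sum of remaining activity durations is normally distributed."""
    mu = elapsed + sum(remaining_means)
    sigma = math.sqrt(sum(remaining_vars))
    if sigma == 0:
        return 1.0 if mu <= deadline else 0.0
    z = (deadline - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # standard normal CDF

def verify(elapsed, means, vars_, deadline, threshold=0.9):
    """Flag a temporal inconsistency (triggering exception handling) when
    the on-time probability drops below the required confidence."""
    p = consistency(elapsed, means, vars_, deadline)
    return ("consistent" if p >= threshold else "inconsistent", p)

# At a milestone: 30 time units elapsed, three activities left, deadline 60.
state, p = verify(30.0, [8.0, 7.0, 5.0], [4.0, 4.0, 1.0], 60.0)
```

An "inconsistent" result at a checkpoint is what would hand control to the exception-handling component, e.g. to add resources or reschedule the remaining activities.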

The Incremental Commitment Spiral Model (ICSM): Principles and Practices for Successful Systems and Software
Barry Boehm and LiGuo Huang
(University of Southern California, USA; Southern Methodist University, USA)
This paper summarizes the Incremental Commitment Spiral Model (ICSM), a process model generator that enables organizations to determine which process model, or combination of models, best fits the needs of each system.
