99 results for Agile software development


Relevance: 90.00%

Abstract:

Software development settings provide a great opportunity for CSCW researchers to study collaborative work. In this paper, we explore a specific work practice called bug reproduction that is a part of the software bug-fixing process. Bug reproduction is a highly collaborative process by which software developers attempt to locally replicate the ‘environment’ within which a bug was originally encountered. Customers, who encounter bugs in their everyday use of systems, play an important role in bug reproduction as they provide useful information to developers, in the form of steps for reproduction, software screenshots, trace logs, and other ways to describe a problem. Bug reproduction, however, poses major hurdles in software maintenance as it is often challenging to replicate the contextual aspects that are at play at the customers’ end. To study the bug reproduction process from a human-centered perspective, we carried out an ethnographic study at a multinational engineering company, using semi-structured interviews, a questionnaire, and half-day observations of sixteen software developers working on different software maintenance projects. In this paper, we present a holistic view of bug reproduction practices from a real-world setting and discuss implications for designing tools to address the challenges developers face during bug reproduction.
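The paper itself proposes no tool, but the categories of reproduction context it lists (steps for reproduction, screenshots, trace logs, environment details) could be captured in a structure like the following Python sketch. All field names are hypothetical illustrations, not taken from the study.

```python
from dataclasses import dataclass, field

@dataclass
class BugReport:
    """One customer-submitted report; fields mirror the kinds of
    reproduction context the paper says customers supply."""
    summary: str
    steps_to_reproduce: list[str]                               # ordered customer actions
    environment: dict[str, str] = field(default_factory=dict)  # e.g. OS, app version, locale
    screenshots: list[str] = field(default_factory=list)       # file paths or URLs
    trace_logs: list[str] = field(default_factory=list)

    def missing_context(self) -> list[str]:
        """Flag environmental details a developer would still have to chase."""
        required = ("os", "app_version", "locale")
        return [k for k in required if k not in self.environment]

report = BugReport(
    summary="Export crashes on save",
    steps_to_reproduce=["Open project", "File > Export", "Choose PDF"],
    environment={"os": "Windows 10"},
)
print(report.missing_context())  # ['app_version', 'locale']
```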

Relevance: 90.00%

Abstract:

Social media tools are starting to become mainstream, and those working in the software development industry are often ahead of the game in adopting current technological innovations to improve their work. With the advent of outsourcing and distributed teams, the software industry is ideally placed to take advantage of social media technologies, tools and environments. This paper looks at how social media is being used by early adopters within the software development industry. Current tools and trends in social media tool use are described and critiqued: what works and what doesn't. We use industrial case studies from platform development, commercial application development and government contexts, which provide a clear picture of the emergent state of the art. These real-world experiences are then used to show how working collaboratively in geographically dispersed teams, enabled by social media, can enhance and improve the development experience.

Relevance: 90.00%

Abstract:

Social contexts are possible information sources that can foster connections between mobile application users, but they are also minefields of privacy concerns and have great potential for misinterpretation. This research establishes a framework for guiding the design of context-aware mobile social applications from a socio-technical perspective. Agile ridesharing was chosen as the test domain for the research because its success relies upon effectively connecting people through mobile technologies.

Relevance: 80.00%

Abstract:

Process modeling can be regarded as the currently most popular form of conceptual modeling. Research evidence illustrates how process modeling is applied across the different information system life cycle phases for a range of applications, such as configuration of Enterprise Systems, workflow management, or software development. However, a detailed discussion of the factors critical to the quality of process models is still missing. This paper proposes a framework consisting of six quality factors, derived from a comprehensive literature review. It then presents a case study of a utility provider that designed various business process models for the selection of an Enterprise System. The paper summarizes potential means of conducting a successful process modeling initiative and evaluates the described modeling approach within the Guidelines of Modeling (GoM) framework. An outlook presents the lessons learnt and concludes with insights into the next phases of this study.

Relevance: 80.00%

Abstract:

Alvin Toffler’s image of the prosumer (1970, 1980, 1990) continues to significantly influence our understanding of the user-led, collaborative processes of content creation which are today labelled “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly grounded in the mass media age: the prosumer is clearly not the self-motivated creative originator and developer of new content who can today be observed in projects ranging from open source software through Wikipedia to Second Life, but simply a particularly well-informed, and therefore both particularly critical and particularly active, consumer. The highly specialised, high-end consumers which exist in areas such as hi-fi or car culture are far more representative of the ideal prosumer than the participants in non-commercial (or as yet non-commercial) collaborative projects. To expect Toffler’s 1970s model of the prosumer to describe these 21st-century phenomena was, of course, always unrealistic. To describe the creative and collaborative participation which today characterises user-led projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are no longer particularly useful – even in laboured constructions such as ‘commons-based peer-production’ (Benkler 2006) or ‘p2p production’ (Bauwens 2005). In the user communities participating in such forms of content creation, roles as consumers and users have long begun to be inextricably interwoven with those of producer and creator: users are always already also able to be producers of the shared information collection, regardless of whether they are aware of this fact – they have taken on a new, hybrid role which may be best described as that of a produser (Bruns 2008). Projects which build on such produsage can be found in areas from open source software development through citizen journalism to Wikipedia, and beyond this also in multi-user online computer games, filesharing, and even in communities collaborating on the design of material goods. While addressing a range of different challenges, they nonetheless build on a small number of universal key principles. This paper documents these principles and indicates the possible implications of this transition from production and prosumption to produsage.

Relevance: 80.00%

Abstract:

In the quest for shorter time-to-market, higher quality and reduced cost, model-driven software development has emerged as a promising approach to software engineering. The central idea is to promote models to first-class citizens in the development process. Starting from a set of very abstract models in the early stages of development, these models are refined into more concrete models and finally, as a last step, into code. As early phases of development focus on different concepts compared to later stages, various modelling languages are employed to most accurately capture the concepts and relations under discussion. In light of this refinement process, translating between modelling languages becomes a time-consuming and error-prone necessity. This is remedied by model transformations, which provide support for reusing and automating recurring translation efforts. These transformations can typically only be used to translate a source model into a target model, but not vice versa. This poses a problem if the target model is subject to change. In this case the models get out of sync and no longer constitute a coherent description of the software system, leading to erroneous results in later stages. This is a serious threat to the promised benefits of quality, cost-saving, and time-to-market. Therefore, providing a means to restore synchronisation after changes to models is crucial if the model-driven vision is to be realised. This process of reflecting changes made to a target model back to the source model is commonly known as Round-Trip Engineering (RTE). While there are a number of approaches to this problem, they impose restrictions on the nature of the model transformation. Typically, in order for a transformation to be reversed, for every change to the target model there must be exactly one change to the source model. While this makes synchronisation relatively “easy”, it is ill-suited for many practically relevant transformations as they do not have this one-to-one character. To overcome these issues and to provide a more general approach to RTE, this thesis puts forward an approach in two stages. First, a formal understanding of model synchronisation on the basis of non-injective transformations (where a number of different source models can correspond to the same target model) is established. Second, detailed techniques are devised that allow the implementation of this understanding of synchronisation. A formal underpinning for these techniques is drawn from abductive logic reasoning, which allows the inference of explanations from an observation in the context of a background theory. As non-injective transformations are the subject of this research, there might be a number of changes to the source model that all equally reflect a certain target model change. To help guide the procedure in finding “good” source changes, model metrics and heuristics are investigated. Combining abductive reasoning with best-first search and a “suitable” heuristic enables efficient computation of a number of “good” source changes. With this procedure, Round-Trip Engineering of non-injective transformations can be supported.
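The abstract names the ingredients of the search (abduction, best-first ordering, a model-metric heuristic) without fixing an algorithm. A minimal sketch of that best-first strategy, under the assumption that candidate source edits can be enumerated and scored, could look like the following; all names are hypothetical.

```python
import heapq
from typing import Callable, Hashable, Iterable

def best_first_explanations(
    start: Hashable,
    explains: Callable[[Hashable], bool],     # does transforming this source reproduce the target change?
    neighbours: Callable[[Hashable], Iterable[Hashable]],  # apply one primitive source edit
    heuristic: Callable[[Hashable], float],   # model metric: lower means a "better" source change
    limit: int = 5,
) -> list:
    """Collect up to `limit` source models that explain the target change,
    visiting candidates in best-first (heuristic) order."""
    tie = 0                                   # tie-breaker so states themselves are never compared
    frontier = [(heuristic(start), tie, start)]
    seen, found = {start}, []
    while frontier and len(found) < limit:
        _, _, state = heapq.heappop(frontier)
        if explains(state):
            found.append(state)               # one abduced explanation; keep searching for more
            continue
        for nxt in neighbours(state):
            if nxt not in seen:
                seen.add(nxt)
                tie += 1
                heapq.heappush(frontier, (heuristic(nxt), tie, nxt))
    return found
```

Because the transformation is non-injective, several candidates may satisfy `explains`; the heuristic only decides the order in which they are reported.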

Relevance: 80.00%

Abstract:

This thesis maps the author's journey from a music composition practice to a composition and performance practice. The work involves the development of a software library for the purpose of encapsulating compositional ideas in software, and realising these ideas in performance through a live coding computer music practice. The thesis examines what artistic practice emerges through live coding and software development, and whether this permits a blurring between the activities of music composition and performance. The role that software design plays in affecting musical outcomes is considered to gain an insight into how software development contributes to artistic development. The relationship between music composition and performance is also examined to identify the means by which engaging in live coding and software development can bring these activities together. The thesis, situated within the discourse of practice-led research, documents a journey which uses the experience of software development and performance as a means to guide the direction of the research. The journey serves as an experiment for the author in engaging a hitherto unfamiliar musical practice, and as a roadmap for others seeking to modify or broaden their artistic practice.

Relevance: 80.00%

Abstract:

Schizophrenia is a mental disorder affecting 1-2% of the population, and it is estimated that 12-16% of hospital beds in Australia are occupied by patients with psychosis. The suicide rate for patients with this diagnosis is higher than that of the general population. Any technique which enhances training and treatment of this disorder will have a significant societal and economic impact. A significant research project using Virtual Reality (VR), in which both visual and auditory hallucinations are simulated, is currently being undertaken at the University of Queensland. The virtual environments created by the new software are expected to enhance the experiential learning outcomes of medical students by enabling them to experience the inner world of a patient with psychosis. In addition, the Virtual Environment has the potential to provide a technologically advanced therapeutic setting where behavioral exposure therapies can be conducted with precisely controlled exposure stimuli and an expected reduction in the risk of harm. This paper reports on the current work of the project, previous stages of software development, and future educational and clinical applications of the Virtual Environments.

Relevance: 80.00%

Abstract:

An electrified railway system includes complex interconnections and interactions of several subsystems. Computer simulation is the only viable means for system evaluation and analysis. This paper discusses the difficulties and requirements of effective simulation models for this specialized industrial application, and the development of a general-purpose multi-train simulator.

Relevance: 80.00%

Abstract:

With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system, and it is more appropriate to regard them as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems must be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only can the applicability of the simulators be greatly enhanced by advanced software design; maintainability and modularity for easy understanding and further development, and portability across hardware platforms, are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is given, in particular, to models for train movement, power supply systems and traction drives. These models have been successfully used to resolve various ‘what-if’ issues effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
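As a flavour of the train-movement models such a review covers, the following minimal Python sketch integrates a single point-mass train against a Davis-type resistance curve under a speed restriction. The coefficients and the constant tractive effort are invented for illustration and are not taken from the paper.

```python
def simulate_run(mass_kg=4.0e5, f_tract_n=3.0e5, grade_n=0.0,
                 v_limit_ms=25.0, dt_s=0.5, distance_m=2000.0):
    """Return (run time, final speed) for a simple acceleration run,
    using forward-Euler integration of a point-mass train model."""
    a0, a1, a2 = 6.0e3, 1.2e2, 8.0          # Davis resistance R(v) = a0 + a1*v + a2*v^2  [N]
    t, v, x = 0.0, 0.0, 0.0
    while x < distance_m:
        resistance = a0 + a1 * v + a2 * v * v
        accel = (f_tract_n - resistance - grade_n) / mass_kg
        v = min(v + accel * dt_s, v_limit_ms)  # enforce the speed restriction
        x += v * dt_s
        t += dt_s
    return t, v

print(simulate_run())  # run time [s] and final speed [m/s] over a 2 km section
```

Track geometry would enter through the grade force term, and a multi-train simulator would step many such models against a shared power-network solution.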

Relevance: 80.00%

Abstract:

Autonomous Underwater Vehicles (AUVs) are revolutionizing oceanography through their versatility, autonomy and endurance. However, they are still an underutilized technology. For coastal operations, the ability to track a certain feature is of interest to ocean scientists. Adaptive and predictive path planning requires frequent communication with significant data transfer. Currently, most AUVs rely on satellite phones as their primary communication; this protocol is expensive and slow. To reduce communication costs and provide adequate data transfer rates, we present a hardware modification along with a software system that provides an alternative robust disruption-tolerant communications framework enabling cost-effective glider operation in coastal regions. The framework is specifically designed to address multi-sensor deployments. We provide a system overview and present testing and coverage data for the network. Additionally, we include an application of ocean-model-driven trajectory design, which can benefit from the use of this network and communication system. Simulation and implementation results are presented for single and multiple vehicle deployments. The presented combination of infrastructure, software development and deployment experience brings us closer to the goal of providing a reliable and cost-effective data transfer framework to enable real-time, optimal trajectory design, based on ocean model predictions, to gather in situ measurements of interesting and evolving ocean features and phenomena.
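The core disruption-tolerant idea can be sketched independently of the paper's hardware: messages are queued locally and forwarded only while a link is up, so data survive the long communication blackouts typical of glider operations. The Python below is a toy store-and-forward node; all names are hypothetical and the actual framework is hardware- and deployment-specific.

```python
from collections import deque

class DTNNode:
    """Store-and-forward node: enqueue always, transmit only when linked."""
    def __init__(self, name):
        self.name = name
        self.outbox = deque()

    def send(self, payload):
        self.outbox.append(payload)          # store: never block on a down link

    def flush(self, link_up, transmit):
        """Forward: drain the queue only for as long as the link holds."""
        sent = 0
        while self.outbox and link_up():
            transmit(self.outbox.popleft())
            sent += 1
        return sent

glider = DTNNode("glider-1")
glider.send({"t": 0, "salinity": 35.1})      # sampled while out of contact
glider.send({"t": 1, "salinity": 35.3})
glider.flush(link_up=lambda: True, transmit=print)  # surfaced: link available
```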

Relevance: 80.00%

Abstract:

In keeping with the proliferation of free software development initiatives and the increased interest in the business process management domain, many open source workflow and business process management systems have appeared during the last few years and are now under active development. This upsurge gives rise to two important questions: what are the capabilities of these systems, and how do they compare to each other and to their closed source counterparts? In other words, what is the state of the art in the area? To gain an insight into these questions, we have conducted an in-depth analysis of three of the major open source workflow management systems – jBPM, OpenWFE, and Enhydra Shark – the results of which are reported here. This analysis is based on the workflow patterns framework and provides a continuation of the series of evaluations performed using the same framework on closed source systems, business process modelling languages, and web-service composition standards. The results from evaluations of the three open source systems are compared with each other and also with the results from evaluations of three representative closed source systems: Staffware, WebSphere MQ, and Oracle BPEL PM. The overall conclusion is that open source systems are targeted more toward developers than business analysts. They generally provide less support for the patterns than closed source systems, particularly with respect to the resource perspective, i.e. the various ways in which work is distributed amongst business users and managed through to completion.
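The workflow patterns framework is a catalogue of behaviours rather than code. As a flavour of one control-flow pattern an evaluated engine must support – Parallel Split followed by Synchronization – here is a Python sketch using futures; real engines such as jBPM express this declaratively in a process definition rather than programmatically.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_split_then_sync(branches, sync):
    """Run all branch activities concurrently (Parallel Split), then fire
    the sync task once every branch has completed (Synchronization)."""
    with ThreadPoolExecutor() as pool:
        results = [f.result() for f in [pool.submit(b) for b in branches]]
    return sync(results)  # reached only after all branches have joined

out = parallel_split_then_sync(
    branches=[lambda: "check stock", lambda: "check credit"],
    sync=lambda rs: f"dispatch after: {rs}",
)
print(out)
```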

Relevance: 80.00%

Abstract:

In 2009, QUT’s Office of Research and the Institute for Adult Learning Singapore funded a six-month pilot project that represented the first stage of a larger international comparative study. The study is the first of its kind to investigate to what extent and how digital content workers’ learning needs are being met by adult education and training in Australia and Singapore. The pilot project involved consolidating key theoretical literature, studies, policies, programs and statistical data relevant to the digital content industries in Australia and Singapore. This had not been done before, and represented new knowledge generation. Digital content workers include professionals within and beyond the creative industries as follows: Visual effects and animation (including virtual reality and 3D products); Interactive multimedia (e.g. websites, CD-ROMs) and software development; Computer and online games; and Digital film & TV production and film & TV post-production. In the last decade, the digital content industries have been recognised as an industry sector of strong and increasing significance. The project compared Australia and Singapore on aspects of the digital content industries’ labour market, skill requirements, human capital challenges, the role of adult education in building a workforce for the digital content industries, and innovation policies. The consolidated report generated from the project formed the basis of the proposal for an ARC Linkage Project application submitted in the May 2010 round.

Relevance: 80.00%

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or on issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system at an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
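The thesis's actual metric definitions are not given in the abstract. The Python sketch below shows the general shape of such a design-level measure – a hypothetical "classified attribute exposure" ratio computed from design artifacts, where a lower value indicates less potential flow from high-security data into low-security scope.

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    classified: bool   # carries high-security data
    public: bool       # accessible from outside its class

def classified_exposure(attrs):
    """Fraction of high-security attributes that are publicly reachable."""
    hi = [a for a in attrs if a.classified]
    if not hi:
        return 0.0
    return sum(a.public for a in hi) / len(hi)

design = [Attribute("pin", classified=True, public=False),
          Attribute("key", classified=True, public=True),
          Attribute("logo", classified=False, public=True)]
print(classified_exposure(design))  # 0.5 -- one of two classified attributes exposed
```

A metric of this form can be evaluated from UML class diagrams alone, which is what makes early-stage comparison of functionally equivalent designs possible.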

Relevance: 80.00%

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture defined with a parametric and generative evolutionary design system to support an integrated interdisciplinary building design approach. The research recognises the need to shift design efforts toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication on the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements, across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and the ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promotes a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This will produce design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that had not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate the system performance based on the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components – the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of designers' creativity in a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms. The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem allows the design requirements of each level to be dealt with separately; reassembling them in a bottom-up approach reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach, in exploring the range of design solutions through modification of the design schema as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity. By focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
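The HEAD system's actual encodings are not given in the abstract. As a paper-level illustration of the hierarchical idea, the Python sketch below evolves toy 'rooms' under one fitness function and then composes only the surviving rooms into 'layouts' under a second fitness function, so non-viable combinations never reach the upper level. All representations and targets are invented placeholders.

```python
import random

def evolve(population, fitness, mutate, generations=40, keep=10):
    """Minimal evolutionary loop: mutate, pool parents and children,
    rank by fitness (lower is better), truncate to the survivors."""
    for _ in range(generations):
        children = [mutate(p) for p in population]
        population = sorted(population + children, key=fitness)[:keep]
    return population

# Level 1 -- 'Room': evolve (width, height) pairs toward a target area.
mut_room = lambda r: tuple(max(1, g + random.choice((-1, 1))) for g in r)
rooms = evolve([(random.randint(2, 9), random.randint(2, 9)) for _ in range(10)],
               fitness=lambda r: abs(r[0] * r[1] - 20),
               mutate=mut_room)

# Level 2 -- 'Layout': compose only the viable rooms found below, with its own
# level-specific fitness, constraining the options available at this level.
mut_layout = lambda L: tuple(random.choice(rooms) for _ in L)
layouts = evolve([tuple(random.choice(rooms) for _ in range(3)) for _ in range(10)],
                 fitness=lambda L: abs(sum(w * h for w, h in L) - 55),
                 mutate=mut_layout)
print(layouts[0])  # best toy layout: three rooms drawn from the evolved pool
```

The same bottom-up hand-off would continue to the 'Building' and 'Optimisation' levels, each with its own embedded fitness functions.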