894 results for Model Driven Engineering
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven, component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
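Below is a minimal, purely hypothetical Python sketch of the general idea of a data-driven component (the class and field names are invented and are not the Fluid project's API): the component's runtime behaviour is configured from a self-describing content document rather than hard-coded.

    # Hypothetical sketch: behaviour driven by a content description document,
    # in the spirit of data-driven game engines. Names are illustrative only.
    import json

    class DataDrivenComponent:
        def __init__(self, description: dict):
            # the description document declares which behaviours to enable
            self.name = description["name"]
            self.behaviours = description.get("behaviours", [])

        def update(self, dt: float):
            for behaviour in self.behaviours:
                print(f"{self.name}: applying '{behaviour}' for {dt:.2f}s")

    # content data, e.g. loaded from a self-describing document shipped as game content
    content = json.loads('{"name": "particle_emitter", "behaviours": ["spawn", "fade"]}')
    DataDrivenComponent(content).update(0.016)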
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite the huge potential, the lack of reasoning support and of development environments for component modeling and verification may hinder its development. Methods and tools that can support component model analysis are highly appreciated by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale well, as it may need to handle the hundreds or even thousands of components that a modern software system may contain. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
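As a rough illustration of the flavour of this idea (not the paper's actual OWL/SWRL ontology or its reasoners), the following Python sketch uses rdflib to record component inter-relationships as triples under an invented vocabulary and checks one simple constraint with a SPARQL query.

    # Sketch only: component relationships as RDF triples plus a SPARQL check.
    # The paper uses OWL + SWRL with dedicated Semantic Web reasoners instead.
    from rdflib import Graph, Namespace, RDF

    CM = Namespace("http://example.org/component-model#")  # hypothetical vocabulary
    g = Graph()

    for comp in ("Logger", "Scheduler"):
        g.add((CM[comp], RDF.type, CM.Component))
    g.add((CM.Scheduler, CM.requiresInterface, CM.ILog))
    g.add((CM.Logger, CM.providesInterface, CM.ILog))

    # constraint: every required interface must be provided by some component
    unsatisfied = g.query("""
        PREFIX cm: <http://example.org/component-model#>
        SELECT ?c ?i WHERE {
            ?c cm:requiresInterface ?i .
            FILTER NOT EXISTS { ?p cm:providesInterface ?i . }
        }""")
    print("unsatisfied requirements:", list(unsatisfied))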
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis.
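A minimal sketch of this idea, with invented component and condition names, might look like the following: a variability model maps sensed environment conditions to configurations, and the reconfiguration actions are generated from that model rather than hand-coded.

    # Illustrative sketch: deriving configurations and reconfigurations from a
    # model of dynamic variability. All names are hypothetical.
    ARCHITECTURE_MODEL = {
        "components": ["VideoStreamer", "Cache", "Compressor"],
        "variation_points": {
            # sensed environment condition -> components that must be active
            "bandwidth:low":  ["VideoStreamer", "Compressor"],
            "bandwidth:high": ["VideoStreamer", "Cache"],
        },
    }

    def generate_configuration(environment: str) -> list[str]:
        """Generate the set of active components for the sensed environment."""
        return ARCHITECTURE_MODEL["variation_points"][environment]

    def reconfigure(current: list[str], environment: str) -> None:
        target = generate_configuration(environment)
        print("stop:", set(current) - set(target), "start:", set(target) - set(current))

    reconfigure(["VideoStreamer", "Cache"], "bandwidth:low")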
Abstract:
One of the reasons for using variability in the software product line (SPL) approach (see Apel et al., 2006; Figueiredo et al., 2008; Kastner et al., 2007; Mezini & Ostermann, 2004) is to delay a design decision (Svahnberg et al., 2005). Instead of deciding in advance what system to develop, with the SPL approach a set of components and a reference architecture are specified and implemented (during domain engineering, see Czarnecki & Eisenecker, 2000), out of which individual systems are composed at a later stage (during application engineering, see Czarnecki & Eisenecker, 2000). By postponing design decisions in this manner, it is possible to better fit the resultant system to its intended environment, for instance by allowing the system interaction mode to be selected after the customers have purchased particular hardware, such as a PDA vs. a laptop. Such variability is expressed through variation points, which are locations in a software-based system where choices are available for defining a specific instance of a system (Svahnberg et al., 2005). Until recently it had sufficed to postpone committing to a specific system instance until just before system runtime. In recent years, however, the use of and expectations placed on software systems in human society have undergone significant changes. Today's software systems need to be always available, highly interactive, and able to adapt continuously to varying environment conditions, user characteristics and the characteristics of other systems that interact with them. Such systems, called adaptive systems, are expected to be long-lived and able to undertake adaptations with little or no human intervention (Cheng et al., 2009). Therefore, the variability now needs to be present at system runtime as well, which leads to the emergence of a new type of system: adaptive systems with dynamic variability.
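The following toy Python sketch (the device and class names are invented) illustrates a variation point whose binding is postponed to runtime, after the target hardware is known, rather than being fixed during application engineering.

    # Sketch of a runtime-bound variation point, as in adaptive systems with
    # dynamic variability. Names are illustrative only.
    class TouchUI:
        def render(self): print("touch-optimised interface (e.g. PDA)")

    class DesktopUI:
        def render(self): print("keyboard/mouse interface (e.g. laptop)")

    VARIATION_POINT = {"pda": TouchUI, "laptop": DesktopUI}

    def interaction_mode(detected_device: str):
        # bind the variation point at runtime, once the hardware is known
        return VARIATION_POINT[detected_device]()

    interaction_mode("pda").render()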
Abstract:
Wireless Sensor and Actuator Networks (WSAN) are a key component of ubiquitous computing systems and have many applications in different knowledge domains. Programming for such networks is very hard and requires developers to know the specificities of the available sensor platforms, steepening the learning curve for developing WSAN applications. In this work, an MDA (Model-Driven Architecture) approach to WSAN application development, called ArchWiSeN, is proposed. The goal of this approach is to facilitate the development task by providing: (i) a WSAN domain-specific language; (ii) a methodology for WSAN application development; and (iii) an MDA infrastructure composed of several software artifacts (PIM, PSMs and transformations). ArchWiSeN allows domain experts to contribute directly to WSAN application development without needing specialized knowledge of WSAN platforms and, at the same time, allows network experts to manage the application requirements without needing specific knowledge of the application domain. Furthermore, this approach also aims to enable developers to express and validate functional and non-functional requirements of the application, incorporate services offered by WSAN middleware platforms, and promote reuse of the developed software artifacts. In this sense, this thesis proposes an approach that covers all WSAN development stages for current and emerging scenarios through the proposed MDA infrastructure. The proposal was evaluated by: (i) a proof of concept encompassing three different scenarios, carried out using the MDA infrastructure to describe the WSAN development process following the application engineering process; (ii) a controlled experiment to assess the use of the proposed approach compared to the traditional method of WSAN application development; (iii) an analysis of ArchWiSeN's support for middleware services, to ensure that WSAN applications using such services can achieve their requirements; and (iv) a systematic analysis of ArchWiSeN in terms of desired characteristics for an MDA tool, compared with other existing MDA tools for WSAN.
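As a rough, hypothetical illustration of the PIM-to-PSM step (the metamodel fields and target platform details below are invented and are not ArchWiSeN's actual artifacts), a platform-independent sensing task can be transformed into a platform-specific model for a particular sensor node platform.

    # Sketch of a model-to-model transformation from a platform-independent
    # model (PIM) to a platform-specific model (PSM). Names are illustrative.
    PIM = {"task": "monitor_temperature", "period_s": 30, "aggregation": "average"}

    def to_tinyos_psm(pim: dict) -> dict:
        """Example transformation targeting a TinyOS-like sensor platform."""
        return {
            "nesc_component": pim["task"].title().replace("_", "") + "C",
            "timer_ms": pim["period_s"] * 1000,
            "aggregator": {"average": "MeanFilterC"}.get(pim["aggregation"]),
        }

    print(to_tinyos_psm(PIM))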
Abstract:
Virtual-build-to-order (VBTO) is a form of order fulfilment system in which the producer has the ability to search across the entire pipeline of finished stock, products in production and those in the production plan, in order to find the best product for a customer. It is a system design that is attractive to Mass Customizers, such as those in the automotive sector, whose manufacturing lead time exceeds their customers' tolerable waiting times, and for whom the holding of partly-finished stocks at a fixed decoupling point is unattractive or unworkable. This paper describes and develops the operational concepts that underpin VBTO, in particular the concepts of reconfiguration flexibility and customer aversion to waiting. Reconfiguration is the process of changing a product's specification at any point along the order fulfilment pipeline. The extent to which an order fulfilment system is flexible or inflexible reveals itself in the reconfiguration cost curve, of which there are four basic types. The operational features of the generic VBTO system are described and simulation is used to study its behaviour and performance. The concepts of reconfiguration flexibility and floating decoupling point are introduced and discussed.
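A toy Python sketch of the core VBTO decision follows, with invented costs and a one-attribute product specification: the whole pipeline is searched for the best unit to allocate, weighing the cost of reconfiguring an existing unit against making the customer wait for a new build.

    # Illustrative only: VBTO pipeline search with a simple reconfiguration cost.
    pipeline = [  # (product_spec, stage); specs simplified to colour only
        ({"colour": "red"},  "finished_stock"),
        ({"colour": "blue"}, "in_production"),
        ({"colour": "red"},  "production_plan"),
    ]
    RECONFIG_COST = {"finished_stock": 500, "in_production": 150, "production_plan": 0}
    NEW_BUILD_COST = 200  # notional cost of the customer waiting for a fresh slot

    def fulfil(order: dict) -> str:
        candidates = []
        for spec, stage in pipeline:
            cost = 0 if spec == order else RECONFIG_COST[stage]
            candidates.append((cost, stage))
        best_cost, best_stage = min(candidates)
        return best_stage if best_cost <= NEW_BUILD_COST else "schedule_new_build"

    print(fulfil({"colour": "blue"}))   # exact match already in production
    print(fulfil({"colour": "green"}))  # no match; cheapest reconfiguration wins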
Abstract:
Estimating un-measurable states is an important component of onboard diagnostics (OBD) and control strategy development for diesel exhaust aftertreatment systems. This research focuses on the development of an Extended Kalman Filter (EKF) based state estimator for two of the main components in a diesel engine aftertreatment system: the Diesel Oxidation Catalyst (DOC) and the Selective Catalytic Reduction (SCR) catalyst. One of the key areas of interest is the performance of these estimators when the catalyzed particulate filter (CPF) is being actively regenerated. In this study, model reduction techniques were developed and used to derive reduced-order models from the 1D models used to simulate the DOC and SCR. As a result of order reduction, the number of states in the estimator is reduced from 12 to 1 per element for the DOC and from 12 to 2 per element for the SCR. The reduced-order models were simulated on the experimental data and compared to the high-fidelity model and the experimental data. The results show that the effect of eliminating the heat transfer and mass transfer coefficients is not significant for the performance of the reduced-order models; this is shown by an insignificant change in the kinetic parameters between the reduced-order and 1D models when simulating the experimental data. An EKF-based estimator for the internal states of the DOC and SCR was developed. The DOC and SCR estimators were simulated on the experimental data to show that the estimator provides improved estimation of states compared to a reduced-order model alone. The results showed that using the temperature measurement at the DOC outlet improved the estimates of the CO, NO, NO2 and HC concentrations from the DOC. The SCR estimator was used to evaluate the effect of NH3 and NOx sensors on state estimation quality. Three sensor combinations were evaluated: a NOx sensor only, an NH3 sensor only, and both NOx and NH3 sensors. The NOx-only configuration performed worst, the NH3-only configuration was intermediate, and the combination of NOx and NH3 sensors provided the best performance.
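For reference, a generic EKF predict/update cycle has the structure sketched below (plain numpy); the actual DOC and SCR process models, Jacobians and noise covariances used in the thesis are of course far more detailed than this placeholder.

    # Generic Extended Kalman Filter step, shown only to illustrate the
    # estimator structure; all models and values here are placeholders.
    import numpy as np

    def ekf_step(x, P, u, z, f, h, F, H, Q, R):
        """One EKF cycle: x, P are the state mean and covariance."""
        # predict through the (nonlinear) process model f with Jacobian F
        x_pred = f(x, u)
        P_pred = F @ P @ F.T + Q
        # update with measurement z through measurement model h with Jacobian H
        y = z - h(x_pred)                      # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # trivial scalar usage: estimating one temperature-like state
    x, P = ekf_step(np.array([300.0]), np.eye(1), None, np.array([305.0]),
                    f=lambda x, u: x, h=lambda x: x,
                    F=np.eye(1), H=np.eye(1), Q=np.eye(1) * 0.1, R=np.eye(1))
    print(x, P)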
Abstract:
Business process modeling has undoubtedly emerged as a popular and relevant practice in Information Systems. Despite being an actively researched field, anecdotal evidence and experiences suggest that the focus of the research community is not always well aligned with the needs of industry. The main aim of this paper is, accordingly, to explore the current issues and the future challenges in business process modeling, as perceived by three key stakeholder groups (academics, practitioners, and tool vendors). We present the results of a global Delphi study with these three groups of stakeholders, and discuss the findings and their implications for research and practice. Our findings suggest that the critical areas of concern are standardization of modeling approaches, identification of the value proposition of business process modeling, and model-driven process execution. These areas are also expected to persist as business process modeling roadblocks in the future.
Abstract:
Unmanned Aircraft Systems (UAS) describe a diverse range of aircraft that are operated without a human pilot on board. Unmanned aircraft range from small rotorcraft, which can fit in the palm of your hand, through to fixed-wing aircraft comparable in size to a commercial passenger jet. The absence of a pilot on board allows these aircraft to be developed with unique performance capabilities, facilitating a wide range of applications in surveillance, environmental management, agriculture, defence, and search and rescue. However, regulations relating to the safe design and operation of UAS first need to be developed before the many potential benefits of these applications can be realised. According to the International Civil Aviation Organization (ICAO), a Risk Management Process (RMP) should support all civil aviation policy and rulemaking activities (ICAO, 2009). The RMP is described in the international standard ISO 31000:2009 (ISO, 2009a). This standard is intentionally generic and high-level, providing limited guidance on how it can be effectively applied to complex socio-technical decision problems such as the development of regulations for UAS. Through the application of principles and tools drawn from systems philosophy and systems engineering, this thesis explores how the RMP can be effectively applied to support the development of safety regulations for UAS. A sound systems-theoretic foundation for the RMP is presented. Using the case-study scenario of a UAS operation over an inhabited area, and through the novel application of principles drawn from general systems modelling philosophy, a consolidated framework of the definitions of the concepts of safe, risk and hazard is developed. The framework is novel in that it facilitates the representation of broader subjective factors in an assessment of the safety of a system; describes the issues associated with the specification of a system boundary; makes explicit the hierarchical nature of the relationship between the concepts and the constraints that exist between them; and can be evaluated using a range of analytic or deliberative modelling techniques. Following the general sequence of the RMP, the thesis then explores the issues associated with the quantified specification of safety criteria for UAS. A novel risk analysis tool is presented. In contrast to existing risk tools, the analysis tool presented in this thesis quantifiably characterises both the societal and individual risk of UAS operations as a function of the flight path of the aircraft. A novel structuring of the risk evaluation and risk treatment decision processes is then proposed. The structuring is achieved through the application of the Decision Support Problem Technique, a modelling approach that has previously been used to model complex engineering design processes and to support decision-making in relation to airspace design. The final contribution made by this thesis is the development of an airworthiness regulatory framework for civil UAS. A novel "airworthiness certification matrix" is proposed as a basis for the definition of UAS "Part 21" regulations. The resulting airworthiness certification matrix provides a flexible, systematic and justifiable method for promulgating airworthiness regulations for UAS. In addition, an approach for deriving "Part 1309" regulations for UAS is presented.
In contrast to existing approaches, the approach presented in this thesis facilitates a traceable and objective tailoring of system-level reliability requirements across the diverse range of UAS operations. The significance of the research contained in this thesis is clearly demonstrated by its practical, real-world outcomes. Industry regulatory development groups and the Civil Aviation Safety Authority have endorsed the proposed airworthiness certification matrix. The risk models have also been used to support research undertaken by the Australian Department of Defence. Ultimately, it is hoped that the outcomes of this research will play a significant part in shaping regulations for civil UAS, both in Australia and around the world.
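Purely as an illustration of evaluating risk as a function of the flight path (the failure rate and lethal-area values below are assumed placeholders, not the thesis's models), a ground-risk estimate can be accumulated segment by segment along the path.

    # Toy ground-risk sketch: crash likelihood per path segment combined with
    # the exposure of people below. Values are assumptions, not thesis data.
    FAILURES_PER_FLIGHT_HOUR = 1e-4          # assumed system failure rate
    LETHAL_AREA_KM2 = 1e-5                   # assumed lethal area of a crash

    def expected_casualties(path_segments):
        """path_segments: list of (duration_hours, population_density_per_km2)."""
        risk = 0.0
        for duration_h, pop_density in path_segments:
            crash_prob = FAILURES_PER_FLIGHT_HOUR * duration_h
            risk += crash_prob * LETHAL_AREA_KM2 * pop_density
        return risk

    # a short flight: 0.2 h over farmland, then 0.1 h over a suburb
    print(expected_casualties([(0.2, 10.0), (0.1, 1500.0)]))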
Abstract:
Bone metastasis is a complication that occurs in 80 % of women with advanced breast cancer. Despite the prevalence of bone metastatic disease, the avenues for its clinical management are still restricted to palliative treatment options. In fact, the underlying mechanisms of breast cancer osteotropism have not yet been fully elucidated due to a lack of suitable in vivo models that are able to recapitulate the human disease. In this work, we review the current transplantation-based models to investigate breast cancer-induced bone metastasis and delineate the strengths and limitations of the use of different grafting techniques, tissue sources, and hosts. We further show that humanized xenograft models incorporating human cells or tissue grafts at the primary tumor site or the metastatic site mimic more closely the human disease. Tissue-engineered constructs are emerging as a reproducible alternative to recapitulate functional humanized tissues in these murine models. The development of advanced humanized animal models may provide better platforms to investigate the mutual interactions between human cancer cells and their microenvironment and ultimately improve the translation of preclinical drug trials to the clinic.
Abstract:
Pond apple invades riparian and coastal environments, with water acting as the main vector for dispersal. As the seeds float and can reach the ocean, a seed-tracking model driven by near-surface ocean currents was used to develop maps of potential seed dispersal. Seeds were ‘released’ in the model from sites near the mouths of major North Queensland rivers. Most seeds reach land within three months of release, settling predominantly at windward-facing locations. During calm and monsoonal conditions, seeds were generally swept in a southerly direction; however, movement turned northward during south-easterly trade winds. Seeds released in February from the Johnstone River could be moved anywhere from 100 km north to 150 km south, depending on prevailing conditions. Although wind-driven currents are the primary mechanism influencing seed dispersal, tidal currents, the East Australian Current, and other factors such as coastline orientation and release location and time also play an important role in determining dispersal patterns. In extreme events, such as tropical cyclone Justin in 1997, north-east coast rivers could potentially transport seeds over 1300 km to the Torres Strait, Papua New Guinea and beyond.
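A toy particle-tracking sketch in Python follows, using an assumed constant current instead of the gridded ocean-current and wind fields that drive the actual model; it only shows the advection idea behind seed tracking.

    # Toy advection of a floating seed by a (constant) near-surface current.
    # The real model uses time-varying gridded current and wind fields.
    def track_seed(lon, lat, u_deg_per_day, v_deg_per_day, days, step_days=0.25):
        t = 0.0
        while t < days:
            lon += u_deg_per_day * step_days   # eastward displacement
            lat += v_deg_per_day * step_days   # northward displacement
            t += step_days
        return lon, lat

    # seed 'released' near a river mouth and drifting south-east for a month
    print(track_seed(146.0, -17.5, u_deg_per_day=0.02, v_deg_per_day=-0.05, days=30))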
Abstract:
This paper presents the design and implementation of PolyMage, a domain-specific language and compiler for image processing pipelines. An image processing pipeline can be viewed as a graph of interconnected stages which process images successively. Each stage typically performs a point-wise, stencil, reduction or data-dependent operation on image pixels. Individual stages in a pipeline typically exhibit abundant data parallelism that can be exploited with relative ease. However, the stages also require high memory bandwidth, preventing effective utilization of the parallelism available on modern architectures. For applications that demand high performance, the traditional options are to use optimized libraries like OpenCV or to optimize manually. While using libraries precludes optimization across library routines, manual optimization accounting for both parallelism and locality is very tedious. The focus of our system, PolyMage, is on automatically generating high-performance implementations of image processing pipelines expressed in a high-level declarative language. Our optimization approach primarily relies on the transformation and code generation capabilities of the polyhedral compiler framework. To the best of our knowledge, this is the first model-driven compiler for image processing pipelines that performs complex fusion, tiling, and storage optimization automatically. Experimental results on a modern multicore system show that the performance achieved by our automatic approach is up to 1.81x better than that achieved through manual tuning in Halide, a state-of-the-art language and compiler for image processing pipelines. For a camera raw image processing pipeline, our performance is comparable to that of a hand-tuned implementation.
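To make the pipeline structure concrete, here is a plain-numpy sketch (not PolyMage's DSL) of a two-stage pipeline, a point-wise stage followed by a 3x3 stencil stage, i.e. the kind of stage graph the compiler fuses and tiles automatically.

    # Naive two-stage image pipeline in plain numpy, for illustration only.
    import numpy as np

    def brighten(img, gain=1.2):           # point-wise stage
        return np.clip(img * gain, 0, 255)

    def blur3x3(img):                      # stencil stage (3x3 box filter)
        out = np.zeros_like(img)
        out[1:-1, 1:-1] = sum(
            img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        ) / 9.0
        return out

    image = np.random.rand(64, 64) * 255
    print(blur3x3(brighten(image)).shape)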