923 results for Complex Engineering Systems
Abstract:
Admission controls, such as trunk reservation, are often used in loss networks to optimise their performance. Since the numerical evaluation of performance measures is complex, much attention has been given to finding approximation methods. The Erlang Fixed-Point (EFP) approximation, which is based on an independent blocking assumption, has been used for networks both with and without controls. Several more elaborate approximation methods that account for dependencies in blocking behaviour have been developed for the uncontrolled setting. This paper is an exploratory investigation of how these methods can be extended and synthesised for systems with controls, in particular trunk reservation. In order to isolate the dependency factor, we restrict our attention to a highly linear network. We compare the performance of the resulting approximations against the benchmark of the EFP approximation extended to the trunk reservation setting. In doing so, we seek to gain insight into the critical factors in constructing an effective approximation.
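As a rough illustration of the computation involved, the sketch below (with an invented two-link example network) iterates the classic Erlang Fixed-Point equations for an uncontrolled loss network: each link's blocking probability is recomputed from the Erlang B formula applied to its reduced offered load until the values converge. Extending it to trunk reservation would replace the Erlang B evaluation with a per-class birth-death solve that rejects low-priority calls above the reservation threshold; that step is not shown here.

```python
def erlang_b(load, capacity):
    """Erlang B blocking probability, via the standard stable recursion."""
    b = 1.0
    for n in range(1, capacity + 1):
        b = load * b / (n + load * b)
    return b

def erlang_fixed_point(routes, offered, capacities, tol=1e-10, max_iter=1000):
    """
    Classic EFP iteration for a loss network without controls.

    routes[r]     -- list of link indices used by route r
    offered[r]    -- offered traffic (Erlangs) on route r
    capacities[j] -- circuits on link j

    Returns per-link blocking probabilities B[j] under the usual
    independent-blocking assumption.
    """
    n_links = len(capacities)
    B = [0.0] * n_links
    for _ in range(max_iter):
        # Reduced load on each link: traffic thinned by blocking on the
        # other links of each route.
        reduced = [0.0] * n_links
        for r, links in enumerate(routes):
            for j in links:
                thin = 1.0
                for k in links:
                    if k != j:
                        thin *= (1.0 - B[k])
                reduced[j] += offered[r] * thin
        new_B = [erlang_b(reduced[j], capacities[j]) for j in range(n_links)]
        if max(abs(new_B[j] - B[j]) for j in range(n_links)) < tol:
            return new_B
        B = new_B
    return B

# Invented example: two single-link routes plus one two-link route.
print(erlang_fixed_point(routes=[[0], [1], [0, 1]],
                         offered=[8.0, 8.0, 2.0],
                         capacities=[10, 10]))
```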
Abstract:
A research program on atmospheric boundary layer processes and local wind regimes in complex terrain was conducted in the vicinity of Lake Tekapo in the southern Alps of New Zealand, during two 1-month field campaigns in 1997 and 1999. The effects of the interaction of thermal and dynamic forcing were of specific interest, with a particular focus on the interaction of thermal forcing of differing scales. The rationale and objectives of the field and modeling program are described, along with the methodology used to achieve them. Specific research aims include improved knowledge of the role of surface forcing associated with varying energy balances across heterogeneous terrain, thermal influences on boundary layer and local wind development, and dynamic influences of the terrain through channeling effects. Data were collected using a network of surface meteorological and energy balance stations, radiosonde and pilot balloon soundings, tethered balloon and kite-based systems, sodar, and an instrumented light aircraft. These data are being used to investigate the energetics of surface heat fluxes, the effects of localized heating/cooling and advective processes on atmospheric boundary layer development, and dynamic channeling. A complementary program of numerical modeling includes application of the Regional Atmospheric Modeling System (RAMS) to case studies characterizing typical boundary layer structures and airflow patterns observed around Lake Tekapo. Some initial results derived from the special observation periods are used to illustrate progress made to date. In spite of the difficulties involved in obtaining good data and undertaking modeling experiments in such complex terrain, initial results show that surface thermal heterogeneity has a significant influence on local atmospheric structure and wind fields in the vicinity of the lake. This influence occurs particularly in the morning. However, dynamic channeling effects and the larger-scale thermal effect of the mountain region frequently override these more local features later in the day.
Abstract:
Functional genomics is the systematic study of genome-wide effects of gene expression on organism growth and development, with the ultimate aim of understanding how networks of genes influence traits. Here, we use a dynamic biophysical cropping systems model (APSIM-Sorg) to generate a state space of genotype performance based on 15 genes controlling four adaptive traits, and then search this space using a quantitative genetics model of a plant breeding program (QU-GENE) to simulate recurrent selection. Complex epistatic and gene × environment effects were generated for yield even though gene action at the trait level had been defined as simple additive effects. Given alternative breeding strategies that restricted either the cultivar maturity type or the drought environment type, the positive (+) alleles for the 15 genes associated with the four adaptive traits were accumulated at different rates over cycles of selection. While early maturing genotypes were favored in the Severe-Terminal drought environment type, late genotypes were favored in the Mild-Terminal and Midseason drought environment types. In the Severe-Terminal environment, there was an interaction of the stay-green (SG) trait with other traits: selection for + alleles of the SG genes was delayed until + alleles for genes associated with the transpiration efficiency and osmotic adjustment traits had been fixed. Given limitations in our current understanding of trait interaction and genetic control, the results are not conclusive. However, they demonstrate how the complexity of gene × gene × environment interactions will challenge the application of genomics and marker-assisted selection in crop improvement for dryland adaptation.
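The following toy simulation is not APSIM-Sorg or QU-GENE; the gene effects, trait groupings and yield function are all invented for illustration. It only shows how purely additive gene action at the trait level can still produce interaction effects on yield once traits combine nonlinearly with an environment parameter, and how recurrent truncation selection then accumulates + alleles at different rates across loci.

```python
import random

N_GENES = 15  # binary loci; the '+' allele is coded as 1 (illustrative)

def trait_values(genotype):
    # Purely additive gene action at the trait level: each of the four
    # traits is controlled by a disjoint subset of the 15 genes.
    groups = [genotype[0:4], genotype[4:8], genotype[8:12], genotype[12:15]]
    return [sum(g) for g in groups]

def yield_in_environment(genotype, drought_severity):
    # Stand-in, nonlinear genotype-to-yield mapping: trait x trait x
    # environment interactions create epistatic-looking effects on yield.
    maturity, te, oa, sg = trait_values(genotype)
    base = 10.0 * te + 5.0 * oa
    # Stay-green only pays off once transpiration efficiency is high,
    # and late maturity is penalised under severe terminal drought.
    base += 8.0 * sg * (te / 4.0) * drought_severity
    base -= 6.0 * maturity * drought_severity
    return base

def recurrent_selection(cycles=10, pop_size=200, drought_severity=1.0):
    pop = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(pop_size)]
    for _ in range(cycles):
        pop.sort(key=lambda g: yield_in_environment(g, drought_severity), reverse=True)
        parents = pop[: pop_size // 5]          # truncation selection
        pop = []
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # free recombination
            pop.append(child)
    # Frequency of '+' alleles at each locus after selection.
    return [sum(g[i] for g in pop) / len(pop) for i in range(N_GENES)]

print(recurrent_selection())
```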
Abstract:
This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
Abstract:
This work demonstrates that the theoretical framework of complex networks typically used to study systems such as social networks or the World Wide Web can also be applied to materials science, allowing a deeper understanding of fundamental physical relationships. In particular, by applying network theory to carbon nanotube and vapour-grown carbon nanofibre composites, mapping fillers to vertices and the gaps between fillers to edges, the percolation threshold has been predicted and a formula relating the composite conductance to the network disorder has been obtained. The theoretical arguments are validated by experimental results from the literature.
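A minimal sketch of the filler-to-vertex mapping described above, under strong simplifying assumptions (fillers idealised as points in a 2-D box, an edge added whenever the gap between two fillers is within a fixed tunnelling range): a union-find structure tests whether a connected filler cluster spans the sample, and a sweep over filler counts gives a rough estimate of the percolation threshold.

```python
import random

def percolates(n_fillers, box=1.0, tunnel_range=0.05, seed=0):
    """
    Toy check of electrical percolation in a 2-D composite: fillers are
    graph vertices and an edge is added whenever the gap between two
    fillers is within tunnelling range. Returns True if a connected path
    joins the left and right electrodes.
    """
    rng = random.Random(seed)
    pts = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(n_fillers)]

    parent = list(range(n_fillers + 2))   # two extra nodes: the electrodes
    LEFT, RIGHT = n_fillers, n_fillers + 1

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for i, (xi, yi) in enumerate(pts):
        if xi < tunnel_range:
            union(i, LEFT)
        if xi > box - tunnel_range:
            union(i, RIGHT)
        for j in range(i + 1, n_fillers):
            xj, yj = pts[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 < tunnel_range ** 2:
                union(i, j)

    return find(LEFT) == find(RIGHT)

# Rough sweep of filler loading to locate the percolation threshold.
for n in (100, 300, 500, 700, 900):
    hits = sum(percolates(n, seed=s) for s in range(20))
    print(n, "fillers ->", hits / 20, "percolation fraction")
```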
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software, and developers are dedicating an increasingly large portion of their code to implementing them. Given this increased importance, the correctness of GUI code is becoming essential. This paper describes the latest results in the development of GUISurfer, a tool to reverse engineer the GUI layer of interactive computing systems. The ultimate goal of the tool is to enable analysis of interactive systems from their source code.
Abstract:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Therefore, the correctness of GUI code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their most important aspects. This paper describes our approach to reverse engineering an abstract model of a user interface directly from the GUI's legacy code. We also present results from a case study. These results are encouraging and give evidence that, with further work on this technique, the goal of reverse engineering user interfaces can be met.
Abstract:
Software architecture is currently recognized as one of the most critical design steps in software engineering. The specification of the overall system structure, on the one hand, and of the interaction patterns between its components, on the other, has become a major concern for the working developer. Although a number of formalisms are available to express behaviour and to supply the calculational power needed to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from behavioural specifications written in Cw, the corresponding architectural skeletons in the Microsoft .NET framework in the form of executable code.
Abstract:
The ability to foresee how the behaviour of a system arises from the interaction of its components over time, i.e. its dynamic complexity, is seen as an important ability for taking effective decisions in our turbulent world. Dynamic complexity frequently emerges from interrelated simple structures, such as stocks and flows, feedbacks and delays (Forrester, 1961). Common sense assumes an intuitive understanding of their dynamic behaviour. However, recent research has pointed to persistent and systematic errors in people's understanding of these building blocks of complex systems. This paper describes an empirical study concerning the native ability to understand systems thinking concepts. Two different groups, one academic and the other professional, undertook four tasks proposed by Sweeney and Sterman (2000) and Sterman (2002). The results confirm a poor intuitive understanding of basic systems concepts, even when subjects have a background in mathematics and science.
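For readers unfamiliar with such tasks, the sketch below reproduces the basic structure of a stock-and-flow exercise of the kind used by Sweeney and Sterman: a single stock is integrated from given inflow and outflow functions, which is exactly the accumulation subjects are asked to reason about intuitively. The specific flow profiles here are illustrative, not the published task.

```python
def stock_trajectory(initial_stock, inflow, outflow, dt=1.0, steps=8):
    """
    Integrate a single stock: stock(t+dt) = stock(t) + (inflow - outflow)*dt.
    inflow/outflow are functions of time, as in the graphical tasks where
    subjects read the flows off a chart and must sketch the stock.
    """
    stock = initial_stock
    trajectory = [stock]
    for k in range(steps):
        t = k * dt
        stock += (inflow(t) - outflow(t)) * dt
        trajectory.append(stock)
    return trajectory

# A step change in inflow against a constant outflow: the stock keeps rising
# whenever inflow exceeds outflow and only falls once the sign reverses,
# which is the point most subjects get wrong.
inflow = lambda t: 8 if t < 4 else 2
outflow = lambda t: 5
print(stock_trajectory(100, inflow, outflow))
```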
Abstract:
Nowadays, cooperative intelligent transport systems are part of a larger system. Transport operations are modal operations integrated into logistics, and logistics is the main process of supply chain management. Strategic supply chain management, as a simultaneously local and global value chain, is a collaborative/cooperative organization of stakeholders, often in co-opetition, that performs a service to customers while respecting time, place, price and quality levels. Transportation, like other logistics operations, must add value, which in this case is achieved by compressing lead times and fulfilling orders. The complex supplier network and the distribution channels must be efficient, and full visibility (monitoring and tracing) of the supply chain is a significant source of competitive advantage. Nowadays, competition takes place not between companies but among supply chains. This paper aims to highlight current and emerging manufacturing and logistics system challenges as a new field of opportunities for the automation and control systems research community. Furthermore, the paper forecasts the use of radio frequency identification (RFID) technologies integrated into an information and communication technologies (ICT) framework based on distributed artificial intelligence (DAI) supported by a multi-agent system (MAS), as the greatest source of value for supply chain management (SCM) in cooperative intelligent logistics systems. Logistics platforms (production or distribution), as value-adding nodes of supply and distribution networks, are proposed as critical points for inventory visibility, where these technological needs are most evident.
Abstract:
7th Mediterranean Conference on Information Systems (MCIS 2012), Guimaraes, Portugal, September 8-10, 2012, Proceedings. Series: Lecture Notes in Business Information Processing, Vol. 129.
Abstract:
Multilevel power converters have been introduced as the solution for high-power, high-voltage switching applications, where they have well-known advantages. Recently, full back-to-back connected multilevel neutral point diode clamped (NPC) converters have been used in high-voltage direct current (HVDC) transmission systems. Bipolar-connected back-to-back NPC converters have advantages over the full back-to-back connection in long-distance HVDC transmission systems, but greater difficulty in balancing the dc capacitor voltage divider on both the sending and receiving end NPC converters. This study shows that power flow control and dc capacitor voltage balancing are feasible using fast optimum-predictive-based controllers in HVDC systems with bipolar back-to-back-connected five-level NPC multilevel converters. For both converter sides, the control strategy takes into account active and reactive power, establishes the ac grid currents at both ends, and guarantees the balancing of the dc bus capacitor voltages in both NPC converters. Additionally, the semiconductor switching frequency is minimised to reduce switching losses. The performance and robustness of the new fast predictive control strategy, and its capability to solve the dc capacitor voltage balancing problem of bipolar-connected back-to-back NPC converters, are evaluated.
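The sketch below illustrates the general shape of a finite-control-set predictive controller of the kind discussed; it is not the paper's converter model or cost function. At each sampling instant every candidate switching state is evaluated against a deliberately crude one-step prediction, and the state minimising a weighted cost (current tracking error, capacitor-divider unbalance, number of commutations) is applied. All plant parameters and weights are invented for illustration.

```python
from itertools import product

def choose_switching_state(i_meas, i_ref, v_caps, prev_state,
                           dt=1e-4, L=5e-3, w_balance=0.5, w_switch=0.01):
    """
    One step of an illustrative finite-set predictive controller.

    Candidate states are per-phase output levels of a five-level converter.
    The prediction model is a stand-in:
      - each phase current follows di/dt ~ (level-dependent voltage)/L
      - the capacitor divider drifts with the neutral-point loading
    The chosen state minimises current-tracking error plus penalties for
    capacitor unbalance and for changing switch positions (switching losses).
    """
    levels = range(-2, 3)                      # five-level NPC phase levels
    best_cost, best_state = float("inf"), prev_state
    v_step = 100.0                             # volts per level (assumed)

    for state in product(levels, repeat=3):    # three phases
        cost = 0.0
        for ph in range(3):
            i_pred = i_meas[ph] + dt * (state[ph] * v_step) / L
            cost += (i_ref[ph] - i_pred) ** 2
        # Inner levels (+/-1) load the neutral point and shift the divider.
        np_loading = sum(1 for s in state if abs(s) == 1)
        unbalance = (v_caps[0] - v_caps[1]) + 0.1 * np_loading
        cost += w_balance * unbalance ** 2
        cost += w_switch * sum(abs(a - b) for a, b in zip(state, prev_state))
        if cost < best_cost:
            best_cost, best_state = cost, state
    return best_state

print(choose_switching_state(i_meas=[0.0, 0.0, 0.0],
                             i_ref=[10.0, -5.0, -5.0],
                             v_caps=[395.0, 405.0],
                             prev_state=(0, 0, 0)))
```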
Abstract:
In this paper we present VERITAS, a tool that focuses on verification, one of the most important processes in knowledge engineering during the development of knowledge-based systems (KBS). The verification and validation (V&V) process is part of a wider process called knowledge maintenance, in which an enterprise systematically gathers, organizes, shares, and analyzes knowledge to accomplish its goals and mission. The V&V process establishes whether the software requirements specification has been correctly and completely fulfilled. The methodologies proposed in software engineering have proved inadequate for KBS validation and verification, since KBS present some particular characteristics. VERITAS is an automatic verification tool for KBS which is able to detect a large number of knowledge anomalies. It addresses many relevant aspects found in real applications, such as the use of rule-triggering selection mechanisms and temporal reasoning.
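As a simple illustration of the kind of knowledge anomalies such a tool targets (this is not VERITAS's actual algorithm), the sketch below scans a small propositional rule base for two classic problems: redundant duplicate rules and circular inference chains.

```python
def find_anomalies(rules):
    """
    Detect two simple knowledge anomalies in a propositional rule base:
    duplicate (redundant) rules and circular inference chains.
    rules: list of (premises, conclusion) with premises a frozenset of atoms.
    """
    anomalies = []

    # Redundancy: two rules with identical premises and conclusion.
    seen = {}
    for idx, (prem, concl) in enumerate(rules):
        key = (prem, concl)
        if key in seen:
            anomalies.append(("redundant", seen[key], idx))
        else:
            seen[key] = idx

    # Circularity: a cycle in the atom dependency graph (premise -> conclusion).
    graph = {}
    for prem, concl in rules:
        for p in prem:
            graph.setdefault(p, set()).add(concl)

    def reaches(start, target, visited=None):
        visited = visited or set()
        for nxt in graph.get(start, ()):
            if nxt == target:
                return True
            if nxt not in visited:
                visited.add(nxt)
                if reaches(nxt, target, visited):
                    return True
        return False

    for prem, concl in rules:
        if reaches(concl, concl):
            anomalies.append(("circular", concl))
            break
    return anomalies

# Tiny example rule base (hypothetical atoms a, b).
rules = [(frozenset({"a"}), "b"),
         (frozenset({"b"}), "a"),   # closes a circular chain a -> b -> a
         (frozenset({"a"}), "b")]   # redundant duplicate of the first rule
print(find_anomalies(rules))
```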