943 results for Complex systems prediction


Relevance: 80.00%

Publisher:

Abstract:

Petri nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs result in compact system models that are easier to understand; they are therefore more useful in modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called a Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool supporting the formal modeling capabilities in this framework. For analysis, the framework combines three complementary techniques: simulation, explicit state model checking and bounded model checking (BMC). Simulation is straightforward and fast, but covers only some execution paths in an HLPN model. Explicit state model checking covers all execution paths but suffers from the state explosion problem.
BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool supporting the formal analysis capabilities in this framework. The SAMTools developed for this framework in this dissertation integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
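The firing machinery of a high level Petri net can be sketched in a few lines: structured tokens, a guard predicate per transition, and an output expression. The net below (jobs dispatched to CPUs) is invented for illustration and is not the PrT Net engine of PIPE+.

```python
# Minimal sketch of high level Petri net firing: tokens are structured tuples,
# transitions carry a guard predicate and an output expression.  The net and
# all names here are invented for illustration.
from collections import defaultdict
from itertools import product

class Net:
    def __init__(self):
        self.marking = defaultdict(list)           # place -> list of tokens

    def add_tokens(self, place, tokens):
        self.marking[place].extend(tokens)

    def fire(self, inputs, output, guard, produce):
        """Consume one token per input place for the first combination that
        satisfies the guard, and put the produced token in the output place."""
        for combo in product(*(self.marking[p] for p in inputs)):
            if guard(*combo):
                for place, token in zip(inputs, combo):
                    self.marking[place].remove(token)
                self.marking[output].append(produce(*combo))
                return True
        return False                               # transition not enabled

net = Net()
net.add_tokens("jobs", [("job1", 3), ("job2", 7)])
net.add_tokens("cpus", [("cpu1",)])
# guard: only jobs shorter than 5 time units may be dispatched
fired = net.fire(["jobs", "cpus"], "running",
                 guard=lambda job, cpu: job[1] < 5,
                 produce=lambda job, cpu: (job[0], cpu[0]))
```

Simulation, in this view, is repeated firing of enabled transitions; model checking instead explores every enabled choice at every marking.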

Relevance: 80.00%

Publisher:

Abstract:

Dissolution of non-aqueous phase liquids (NAPLs) or gases into groundwater is a key process, both for contamination problems originating from organic liquid sources and for dissolution trapping in geological storage of CO2. Dissolution in natural systems will typically involve both high and low NAPL saturations and a wide range of pore water flow velocities within the same source zone. To correctly predict dissolution in such complex systems, and as the NAPL saturations change over time, models must be capable of predicting dissolution under a range of saturations and flow conditions. To provide data to test and validate such models, an experiment was conducted in a two-dimensional sand tank, where the dissolution of a spatially variable, 5 × 5 cm² DNAPL tetrachloroethene source was carefully measured using x-ray attenuation techniques at a resolution of 0.2 × 0.2 cm². By continuously measuring the NAPL saturations, the temporal evolution of DNAPL mass loss by dissolution to groundwater could be measured at each pixel. Next, a general dissolution and solute transport code was written, and several published rate-limited (RL) dissolution models and a local equilibrium (LE) approach were tested against the experimental data. It was found that none of the models could adequately predict the observed dissolution pattern, particularly in the zones of higher NAPL saturation. Combining these models with a model for NAPL pool dissolution produced qualitatively better agreement with the experimental data, but the total matching error was not significantly improved. A sensitivity study of commonly used fitting parameters further showed that several combinations of these parameters could produce equally good fits to the experimental observations.
The results indicate that common empirical model formulations for RL dissolution may be inadequate in complex, variable saturation NAPL source zones, and that further model development and testing are desirable.
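The RL/LE contrast can be illustrated with the generic linear driving-force form dC/dt = k(Cs − C): a rate-limited cell approaches the solubility limit gradually, while local equilibrium assumes it is reached instantly. The parameter values below are invented and are not those of the published models tested in the study.

```python
# Sketch of rate-limited (RL) vs local equilibrium (LE) dissolution in a single
# well-mixed cell, using the generic linear driving-force model dC/dt = k*(Cs - C).
# k, Cs, t and dt are illustrative values only.
def rl_concentration(k, cs, t, dt=1e-3):
    """Euler-integrate dC/dt = k*(Cs - C) from C(0) = 0 up to time t."""
    c = 0.0
    for _ in range(int(t / dt)):
        c += dt * k * (cs - c)
    return c

cs = 200.0                                      # solubility limit (mg/L), invented
c_rl = rl_concentration(k=0.5, cs=cs, t=2.0)    # rate-limited: below solubility
c_le = cs                                       # local equilibrium: at solubility
```

Analytically C(t) = Cs·(1 − e^(−kt)), so the RL concentration here sits around 63% of the LE value; fitting k to data at one saturation is exactly what may fail at another.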

Relevance: 80.00%

Publisher:

Abstract:

Complexity science is the multidisciplinary study of complex systems. Its marked network orientation lends itself well to transport contexts. Key features of complexity science are introduced and defined, with a specific focus on the application to air traffic management. An overview of complex network theory is presented, with examples of its corresponding metrics and multiple scales. Complexity science is starting to make important contributions to performance assessment and system design: selected, applied air traffic management case studies are explored. The important contexts of uncertainty, resilience and emergent behaviour are discussed, with future research priorities summarised.

Relevance: 80.00%

Publisher:

Abstract:

A catalytic enantioselective electrocyclic cascade leads to the construction of topologically complex systems comprising multiple rings with up to three stereocentres. This phase-transfer catalysed process offers a new strategy for the rapid and enantioselective generation of complex products bearing all-carbon quaternary stereogenic centres. © 2012 The Royal Society of Chemistry.

Relevance: 80.00%

Publisher:

Abstract:

Humanity has emerged as a major force in the operation of the biosphere. The focus is shifting from the environment as an externality to the biosphere as a precondition for social justice, economic development, and sustainability. In this article, we exemplify the intertwined nature of social-ecological systems and emphasize that they operate within, and as embedded parts of, the biosphere, and as such coevolve with and depend on it. We regard social-ecological systems as complex adaptive systems and use a social-ecological resilience approach as a lens to address and understand their dynamics. We raise the challenge of stewardship of development in concert with the biosphere, for people in diverse contexts and places, as critical for long-term sustainability and dignity in human relations. Biosphere stewardship is essential, in a globalized world of interactions with the Earth system, to sustain and enhance our life-supporting environment for human well-being and future human development on Earth; hence the need to reconnect development to the biosphere foundation, and the need for a biosphere-based sustainability science.

Relevance: 80.00%

Publisher:

Abstract:

Solving linear systems is an important problem in scientific computing. Exploiting parallelism is essential for solving complex systems, and this traditionally involves writing parallel algorithms on top of a library such as MPI. The SPIKE family of algorithms is one well-known example of a parallel solver for linear systems. The Hierarchically Tiled Array (HTA) data type extends traditional data-parallel array operations with explicit tiling and allows programmers to directly manipulate tiles. The tiles of the HTA data type map naturally to the block nature of many numeric computations, including the SPIKE family of algorithms, and the higher level of abstraction of the HTA enables the same program to be portable across different platforms. Current implementations target both shared-memory and distributed-memory models. In this thesis we present a proof of concept for portable linear solvers. We implement two algorithms from the SPIKE family using the HTA library, and show that our implementations exploit the abstractions provided by the HTA to produce compact, clean code that can run on both shared-memory and distributed-memory models without modification. We discuss how we map the algorithms to HTA programs and examine their performance, comparing our HTA codes to comparable codes written in MPI as well as to current state-of-the-art linear algebra routines.
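The tile idea can be sketched without the HTA library: store the matrix as a grid of sub-blocks and express the computation over whole tiles. This is plain NumPy illustrating the abstraction, not the HTA API or the SPIKE algorithm itself.

```python
# Sketch of tile-level programming in the spirit of Hierarchically Tiled Arrays:
# the matrix is held as a grid of tiles that are manipulated as units.  A runtime
# could map each tile operation to a thread or a distributed-memory node.
import numpy as np

def tiled_matvec(tiles, x_tiles):
    """y_i = sum_j A_ij @ x_j, computed one tile at a time."""
    return [sum(tiles[i][j] @ x_tiles[j] for j in range(len(x_tiles)))
            for i in range(len(tiles))]

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
# split A into a 2x2 grid of 2x2 tiles, and x into two tiles of length 2
tiles = [[A[2*i:2*i+2, 2*j:2*j+2] for j in range(2)] for i in range(2)]
x_tiles = [x[0:2], x[2:4]]
y = np.concatenate(tiled_matvec(tiles, x_tiles))   # matches the untiled A @ x
```

The same tile-level source can run unchanged whether tiles live in one address space or many, which is the portability argument made above.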

Relevance: 80.00%

Publisher:

Abstract:

This study compares two simulation techniques: Discrete Event Simulation (DES) and Agent Based Simulation (ABS). DES is one of the best-known simulation techniques in Operational Research; more recently, ABS has emerged. One of the qualities of ABS is that it helps to gain a better understanding of complex systems that involve the interaction of people with their environment, as it allows the modelling of concepts like autonomy and pro-activeness, which are important attributes to consider. Although there is a lot of literature relating to DES and ABS, we have found none that focuses on exploring the capability of both in tackling human behaviour issues that relate to queuing time and customer satisfaction in the retail sector. Therefore, the objective of this study is to identify empirically the differences between these simulation techniques by simulating the potential economic benefits of introducing new policies in a department store. To apply the new strategy, the behaviour of consumers in a retail store is modelled using both the DES and the ABS approach, and the results are compared. We aim to understand which simulation technique is better suited to human behaviour modelling by investigating the capability of each in predicting the best solution for an organisation in its use of management practices. Our main concern is to maximise customer satisfaction, for example by minimising waiting times for the different services provided.
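A much-simplified sketch of the DES side: a single service point modelled with the waiting-time recurrence (a customer waits whenever the server is still busy at arrival). The arrival and service rates are invented, not the department-store data from the study.

```python
# Toy discrete-event-style queue: customers arrive at exponential intervals and
# wait if the single server is busy.  Rates are illustrative only.
import random

def simulate_queue(n_customers, arrival_rate, service_rate, seed=1):
    random.seed(seed)
    t, server_free_at, waits = 0.0, 0.0, []
    for _ in range(n_customers):
        t += random.expovariate(arrival_rate)      # next customer arrives
        start = max(t, server_free_at)             # wait if the server is busy
        waits.append(start - t)
        server_free_at = start + random.expovariate(service_rate)
    return sum(waits) / len(waits)

mean_wait = simulate_queue(10_000, arrival_rate=0.8, service_rate=1.0)
```

Comparing the mean wait before and after a policy change (e.g. a faster service rate) is the kind of economic question both the DES and the ABS models address; the ABS version would replace the fixed recurrence with per-customer decision rules.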

Relevance: 80.00%

Publisher:

Abstract:

This paper reports on an attempt to apply Genetic Algorithms to the problem of optimising a complex system through discrete event simulation (Simulation Optimisation), with a view to reducing the noise associated with such a procedure. We apply this proposed solution approach to our application test bed, a crossdocking distribution centre, because it provides a good representation of the random and unpredictable behaviour of complex systems, e.g. random automated machine failures and variability in manual order-picker skill. It is known that there is noise in the output of discrete event simulation modelling. However, our interest focuses on the effect of this noise on the evaluation of the fitness of candidate solutions within the search space, and on the development of techniques to handle it. The unique quality of our proposed solution approach is that we intend to embed a noise reduction technique in our Genetic Algorithm based optimisation procedure, in order for it to be robust enough to handle noise, efficiently estimate a suitable fitness function, and produce good quality solutions with minimal computational effort.
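One common noise-handling idea, sketched here under invented parameters, is to estimate each candidate's fitness by averaging several replications of the noisy evaluation before selection; the toy objective stands in for a crossdocking simulation run and is not the paper's model.

```python
# Genetic-algorithm sketch with fitness averaging as the noise reduction step.
# The noisy quadratic objective is a stand-in for a stochastic simulation.
import random

random.seed(42)

def noisy_cost(x):
    return (x - 3.0) ** 2 + random.gauss(0.0, 0.5)   # "simulation" noise

def fitness(x, replications=20):
    """Average several noisy evaluations to stabilise the fitness estimate."""
    return sum(noisy_cost(x) for _ in range(replications)) / replications

def evolve(generations=40, pop_size=30):
    pop = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                         # selection on averaged fitness
        parents = pop[:pop_size // 2]
        pop = parents + [p + random.gauss(0.0, 0.5) for p in parents]  # mutation
    return min(pop, key=fitness)

best = evolve()   # converges near the true optimum x = 3 despite the noise
```

The trade-off is explicit: more replications per candidate mean a less noisy fitness estimate but more simulation runs, which is why embedding the reduction efficiently matters.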

Relevance: 80.00%

Publisher:

Abstract:

Critical infrastructures are based on complex systems that provide vital services to the nation. The complexity of the interconnected networks, each managed by an individual organisation, could, if not properly secured, offer vulnerabilities that threaten other organisations' systems that depend on their services. This thesis argues that awareness of interdependencies among critical sectors needs to be increased. Managing and securing critical infrastructure is not the isolated responsibility of a government or an individual organisation; there is a need for strong collaboration among critical service providers of public and private organisations in protecting critical information infrastructure. Cyber exercises have been incorporated in national cyber security strategies as part of critical information infrastructure protection. However, organising a cyber exercise involving multiple sectors is challenging due to the diversity of participants' backgrounds, working environments and incident response policies. How well lessons learned from a cyber exercise transfer to the participating organisations is still an open question. In order to understand the implications of cyber exercises for what participants have learnt and how it benefits their organisations, a Cyber Exercise Post Assessment (CEPA) framework was proposed in this research. The CEPA framework consists of two parts. The first part investigates the lessons learnt by participants from a cyber exercise, using the four levels of the Kirkpatrick Training Model to identify their perceptions of the reaction, learning, behaviour and results of the exercise. The second part investigates the Organisation Cyber Resilience (OCR) of the participating sectors. The framework was used to study the impact of the cyber exercise X Maya in Malaysia.
Data collected through interviews with X Maya 5 participants were coded and categorised into four levels according to the Kirkpatrick Training Model, while online surveys were distributed to the ten Critical National Information Infrastructure (CNII) sectors that participated in the exercise. The survey used the C-Suite Executive Checklist developed by the World Economic Forum in 2012. To ensure the suitability of the tool used to investigate the OCR, a reliability test was conducted on the survey items and showed high internal consistency. Finally, the individual OCR scores were used to develop an OCR Maturity Model providing an organisation cyber resilience perspective on the ten CNII sectors.

Relevance: 80.00%

Publisher:

Abstract:

Sheep and goat production in Portugal's Northeast is based on the exploitation of landscape by-products such as spontaneous native vegetation and agricultural leftovers. Shepherds tend the flocks along daily grazing itineraries, crossing a mosaic of patches of varied land uses. During the journey, the shepherd acts together with the sheep and goats to select each patch, creating an ordered sequence of land uses. The focus of the research is the land-use composition of the grazing itineraries and determining how it depends on the patterns of the landscape mosaic. A data set of 26 monthly herd itineraries, 13 of sheep and 13 of goats, is used to investigate the relationship between the land uses crossed by the flocks and the land uses of the landscape, evaluating the land-use preferences and selectivity of the sheep and goats. The analysis uses the divergences between the time spent and distance travelled by the herds and the area of each land use in the landscape, a chi-square test to relate the preferred land uses to the season, and discriminant analysis to distinguish the preferences and selectivity of the herd of sheep from those of the herd of goats. The herds of sheep and goats presented different land-use preferences over the seasons, and the discriminant analysis shows that they have different landscape preferences. The herd of sheep has the highest selectivity indexes for the annual irrigated crops, the agricultural complex systems and the agroforestry land uses; the highest selectivity indexes for the herd of goats were found for the deciduous forest, the agriculture with natural and semi-natural spaces, and the shrubland land uses. It is concluded that landscape management for sheep and goat herding has to differ: the agricultural land uses are essential to the flocks of sheep, and the forest land uses are decisive for the flocks of goats.
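As a hedged illustration only (the thesis does not specify its index here), a selectivity index is commonly computed as use relative to availability, e.g. the share of grazing time spent in a land use divided by that land use's share of the landscape area; all figures below are invented.

```python
# Hypothetical selectivity calculation: use/availability ratio per land use.
# Values > 1 mean the land use is selected more than its availability suggests.
def selectivity(time_share, area_share):
    return {lu: time_share[lu] / area_share[lu] for lu in time_share}

# invented shares: fraction of landscape area vs fraction of itinerary time
area = {"irrigated crops": 0.05, "shrubland": 0.30, "deciduous forest": 0.10}
sheep_time = {"irrigated crops": 0.20, "shrubland": 0.10, "deciduous forest": 0.05}
sheep_sel = selectivity(sheep_time, area)
# irrigated crops: 0.20 / 0.05 = 4.0, i.e. strongly selected by the sheep herd
```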

Relevance: 80.00%

Publisher:

Abstract:

In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human centric complex systems. In this paper we focus on human reactive behaviour, as it is possible in both modelling approaches to implement it using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation through modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system; this experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we found that, for our case study example, both discrete event simulation and agent based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.

Relevance: 80.00%

Publisher:

Abstract:

This research investigated the behaviour of traditional discrete event simulation models and of combined discrete event and agent based simulation models when modelling human reactive and proactive behaviour in human centric complex systems. A department store was chosen as the human centric complex case study, in which the operation of a fitting room in the womenswear department was investigated. We looked at ways to determine the efficiency of new management policies for the fitting room operation by simulating the reactive and proactive behaviour of staff towards customers. Once the simulation models had been developed and verified, we carried out a validation experiment in the form of a sensitivity analysis. Subsequently, we executed a statistical analysis in which the mixed reactive and proactive behaviour experimental results were compared with reactive experimental results from previously published work. Generally, this case study found that simple proactive individual behaviour could be modelled in both kinds of simulation model. In addition, we found that the traditional discrete event model produced simulation output similar to that of the combined discrete event and agent based simulation when modelling similar human behaviour.
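The reactive/proactive distinction can be sketched with a minimal agent rule: a reactive assistant serves only customers who ask for help, while a proactive one also approaches customers who have waited long. The thresholds and attributes are invented, not the study's model.

```python
# Minimal agent sketch of reactive vs proactive staff behaviour.
class Assistant:
    def __init__(self, proactive):
        self.proactive = proactive
        self.helped = 0

    def step(self, customers):
        for c in customers:
            if c["asked_for_help"]:                   # reactive rule: respond
                c["served"] = True
                self.helped += 1
            elif self.proactive and c["wait"] > 5:    # proactive rule: approach
                c["served"] = True
                self.helped += 1

customers = [{"asked_for_help": False, "wait": 8, "served": False},
             {"asked_for_help": True,  "wait": 1, "served": False}]
reactive = Assistant(proactive=False)
reactive.step([dict(c) for c in customers])           # helps only the asker
proactive = Assistant(proactive=True)
proactive.step([dict(c) for c in customers])          # also helps the long waiter
```

In a combined model, rules like these run inside the agents while the discrete event layer drives arrivals and service completions.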

Relevance: 80.00%

Publisher:

Abstract:

Queueing theory provides models, structural insights, problem solutions and algorithms to many application areas. Due to its practical applicability to production, manufacturing, home automation, communications technology, and so on, more and more complex systems require ever more elaborate models, techniques and algorithms to be developed. Discrete-time models are very suitable in many situations, although the analysis of discrete-time systems is technically more involved than that of their continuous-time counterparts. In this paper we consider a discrete-time queueing system in which failures of the server can occur, as well as priority messages. The possibility of server failures with a general lifetime distribution is considered. We carry out an extensive study of the system by computing generating functions for the steady-state distribution of the number of messages in the queue and in the system. We also obtain generating functions for the stationary distribution of the busy period and of the sojourn times of a message in the server and in the system. Performance measures of the system are also provided.
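The model can be mimicked numerically: the slot-by-slot simulation below uses invented geometric arrival, service, failure and repair probabilities (the paper allows a general server lifetime distribution and derives generating functions analytically rather than simulating).

```python
# Discrete-time queue with server failures, one time slot per iteration.
# All probabilities are illustrative assumptions.
import random

def simulate(slots, p=0.2, s=0.5, f=0.01, r=0.2, seed=7):
    random.seed(seed)
    queue, up, total = 0, True, 0
    for _ in range(slots):
        if up and random.random() < f:
            up = False                       # server breaks down
        elif not up and random.random() < r:
            up = True                        # repair completes
        if random.random() < p:
            queue += 1                       # a message arrives this slot
        if up and queue and random.random() < s:
            queue -= 1                       # service completion
        total += queue
    return total / slots                     # time-averaged queue length

mean_len = simulate(200_000)
```

The generating-function approach delivers this same stationary mean (and the full distribution) exactly, without the sampling error of a simulation.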

Relevance: 80.00%

Publisher:

Abstract:

OBJECTIVES AND STUDY METHOD: There are two subjects in this thesis: “Lot production size for a parallel machine scheduling problem with auxiliary equipment” and “Bus holding for a simulated traffic network”. Although these two themes seem unrelated, the main idea is the optimization of complex systems. The first subject deals with a manufacturing setting where sets of pieces form finished products, and the aim is to maximize the profit of the finished products. Each piece may be processed in more than one mold, and molds must be mounted on machines with their corresponding installation setup times. The key point of our methodology is to solve the single period lot-sizing decisions for the finished products together with the piece-mold and mold-machine assignments, relaxing the constraint that a single mold may not be used on two machines at the same time. The second subject deals with one of the most annoying problems in urban bus operations: bus bunching, which happens when two or more buses arrive at a stop nose to tail. Bus bunching reflects an unreliable service that affects transit operations by increasing passenger waiting times. This work proposes a linear mathematical programming model that establishes bus holding times at certain stops along a transit corridor to avoid bus bunching. Our approach needs real-time input, so we simulate a transit corridor and apply our mathematical model to the data generated. Thus, the inherent variability of a transit system is considered by the simulation, while the optimization model takes into account the key variables and constraints of the bus operation.
CONTRIBUTIONS AND CONCLUSIONS: For the lot production size problem, the relaxation we propose is able to find solutions more efficiently; moreover, our experimental results show that in most solutions the molds are non-overlapping even when they are installed on several machines. We propose an exact integer linear programming model, a Relax&Fix heuristic, and a multistart greedy algorithm to solve this problem, and experimental results on instances based on real-world data show the efficiency of our approaches. The mathematical model and the algorithm for the lot production size problem presented in this research can be used by production planners to help in scheduling manufacturing. For the bus holding problem, most of the literature considers quadratic models that minimize passenger waiting times, but these are harder to solve and therefore difficult to operate in real-time systems. Our methodology, by contrast, reduces passenger waiting times efficiently with a linear programming model, applying control actions at intervals of just 5 minutes.
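As an illustration of the linear-programming idea only (not the thesis's actual formulation), the sketch below chooses holding times that keep departure headways near a target H, linearising the absolute headway deviations with auxiliary variables; the arrival data and the small holding penalty are invented.

```python
# Tiny LP for bus holding at one control stop: minimise sum |headway - H|
# (via auxiliary deviation variables e_i) plus a small penalty on holding.
import numpy as np
from scipy.optimize import linprog

def holding_times(arrivals, H, hold_penalty=1e-3):
    n = len(arrivals)
    m = n - 1                                   # one headway per bus pair
    # decision vector x = [h_1..h_n, e_1..e_m], all >= 0
    c = np.concatenate([np.full(n, hold_penalty), np.ones(m)])
    A, b = [], []
    for i in range(1, n):
        gap = arrivals[i] - arrivals[i - 1]
        # (h_i - h_{i-1}) + gap - H <= e_i
        row = np.zeros(n + m); row[i] = 1; row[i - 1] = -1; row[n + i - 1] = -1
        A.append(row); b.append(H - gap)
        # H - gap - (h_i - h_{i-1}) <= e_i
        row = np.zeros(n + m); row[i] = -1; row[i - 1] = 1; row[n + i - 1] = -1
        A.append(row); b.append(gap - H)
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * (n + m))
    return res.x[:n]

# bus 2 arrives bunched 2 min behind bus 1; target headway H = 5 min
h = holding_times([0.0, 2.0, 10.0], H=5.0)   # holds bus 2, leaves the rest
```

Because the objective and constraints are linear, such a model re-solves in milliseconds, which is what makes the 5-minute real-time control loop described above practical.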

Relevance: 80.00%

Publisher:

Abstract:

Due to their unpredictable behavior, stock markets are examples of complex systems. Yet, the dominant analysis of these markets assumes simple stochastic variations, eventually tainted by short-lived memory. This paper proposes an alternative strategy, based on a stochastic geometry defining a robust index of the structural dynamics of the markets and on notions of topology defining a new coefficient that identifies the structural changes occurring in the S&P500 set of stocks. The results demonstrate the consistency of the random hypothesis as applied to normal periods, but they also show its inadequacy for the analysis of periods of turbulence, for which the emergence of collective behavior of sectoral clusters of firms is measured. This behavior is identified as a meta-routine.
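A standard ingredient of such stochastic-geometry analyses is the mapping of return correlations to distances, d_ij = sqrt(2(1 − ρ_ij)), whose evolution over time can index structural change. The sketch below applies it to three invented return series; the paper's own index and topological coefficient are more elaborate.

```python
# Correlation-to-distance mapping for stock return series: highly correlated
# stocks sit close together, uncorrelated ones far apart.  The return data
# are synthetic (one common "market" factor plus noise).
import numpy as np

def correlation_distances(returns):
    """returns: (n_stocks, n_days) array -> matrix of d_ij = sqrt(2(1 - rho_ij))."""
    rho = np.corrcoef(returns)
    return np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))  # clip guards rounding

rng = np.random.default_rng(3)
market = rng.standard_normal(250)            # shared factor across all stocks
returns = np.vstack([market + 0.5 * rng.standard_normal(250) for _ in range(3)])
d = correlation_distances(returns)
# d[i, i] is zero; off-diagonal distances shrink as collective behavior emerges
```

In turbulent periods, correlations within sectoral clusters rise, the corresponding distances contract, and a geometry-based index registers the structural change.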