948 results for Computational Modelling
Abstract:
Nitrous oxide is a major greenhouse gas. The aim of this research was to develop and apply statistical models to characterize the complex spatial and temporal variation in nitrous oxide emissions from soils under different land-use conditions. Such characterization is critical when developing site-specific management plans to reduce nitrous oxide emissions. These studies can improve predictions and increase our understanding of the environmental factors that influence nitrous oxide emissions. They also help to identify areas for future research, which can further improve the prediction of nitrous oxide emissions in practice.
Abstract:
We present a tool for the automatic analysis of computational indistinguishability between two strings of information. It is designed as a generic tool for proving cryptographic security, based on a formalism that provides computational soundness preservation. The tool has been implemented and tested successfully on several cryptographic schemes.
Abstract:
While there are many similarities between the languages of the various workflow management systems, there are also significant differences. One particular area of difference arises because different systems impose different syntactic restrictions. In such cases, business analysts have to choose between either conforming to the language in their specifications or transforming these specifications afterwards. The latter option is preferable, as it allows for a separation of concerns. In this paper we investigate to what extent such transformations are possible in the context of various syntactic restrictions (the most restrictive of which will be referred to as structured workflows). We also provide insight into the consequences, particularly in terms of expressive power, of imposing such restrictions.
Abstract:
This paper deals with the failure of high-adhesive, low-compressive-strength, thin-layered polymer mortar joints in masonry through contact modelling in a finite element framework. Failure due to combined shear, tensile and compressive stresses is considered through a constitutive damaging contact model that incorporates traction–separation as a function of displacement discontinuity. The modelling method is verified using single and multiple contact analyses of thin mortar layered masonry specimens under shear, tensile and compressive stresses and their combinations. Using this verified method, the failure of thin mortar layered masonry under a range of shear-to-tension ratios and shear-to-compression ratios has been examined. Finally, this model is applied to thin bed masonry wallettes to study their behaviour under biaxial tension–tension and compression–tension loadings perpendicular and parallel to the bed joints.
Abstract:
The article focuses on how the information seeker makes decisions about relevance. It employs a novel decision theory based on quantum probabilities. This direction derives from mounting research within cognitive science showing that decision theory based on quantum probabilities is superior to standard probability models in modelling human judgements [2, 1]. By quantum probabilities, we mean that the decision event space is modelled as a vector space rather than the usual Boolean algebra of sets. In this way, incompatible perspectives around a decision can be modelled, leading to an interference term which modifies the law of total probability. The interference term is crucial in modifying the probability judgements made by current probabilistic systems so that they align better with human judgement. The goal of this article is thus to model the information seeker as a decision maker. For this purpose, signal detection models will be sketched which are in principle applicable in a wide variety of information seeking scenarios.
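The interference term mentioned in this abstract can be written out explicitly. The following is the standard presentation from the quantum-cognition literature, not a formula taken from the article itself: for a relevance judgement R decomposed over two incompatible perspectives A and B, the quantum-like model adds a cosine interference term to the classical law of total probability, with the phase θ fitted to data.

```latex
% Classical law of total probability
P(R) = P(R \mid A)\,P(A) + P(R \mid B)\,P(B)

% Quantum-like model: probability amplitudes are added before squaring,
% producing an additional interference term
P_q(R) = P(R \mid A)\,P(A) + P(R \mid B)\,P(B)
       + 2\sqrt{P(R \mid A)\,P(A)\,P(R \mid B)\,P(B)}\,\cos\theta
```

When θ = π/2 the interference term vanishes and the classical law is recovered; a nonzero cosine lets the model reproduce human judgements that violate the classical decomposition.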
Abstract:
Modelling business processes for analysis or redesign usually requires the collaboration of many stakeholders. These stakeholders may be spread across locations or even companies, making co-located collaboration costly and difficult to organize. Modern process modelling technologies support remote collaboration but lack support for visual cues used in co-located collaboration. Previously we presented a prototype 3D virtual world process modelling tool that supports a number of visual cues to facilitate remote collaborative process model creation and validation. However, the added complexity of having to navigate a virtual environment and using an avatar for communication made the tool difficult to use for novice users. We now present an evolved version of the technology that addresses these issues by providing natural user interfaces for non-verbal communication, navigation and model manipulation.
Abstract:
A demo video showing the BPMVM prototype using several natural user interfaces, such as multi-touch input, full-body tracking and virtual reality.
Abstract:
This paper describes a risk model for estimating the likelihood of collisions at low-exposure railway level crossings, demonstrating the effect that differences in safety integrity can have on the likelihood of a collision. The model facilitates the comparison of safety benefits between level crossings with passive controls (stop or give-way signs) and level crossings that have been hypothetically upgraded with conventional or low-cost warning devices. The scenario presented illustrates how treatment of a cross-section of level crossings with low cost devices can provide a greater safety benefit compared to treatment with conventional warning devices for the same budget.
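The budget comparison described in this abstract can be sketched numerically. All figures below (costs, base risk, risk reductions) are hypothetical placeholders for illustration, not values from the paper; the point is only the structure of the trade-off: cheaper devices treat more crossings for the same budget.

```python
def expected_collisions_prevented(n_crossings, base_risk, risk_reduction):
    """Collisions prevented per year when n_crossings passive crossings
    (each with base_risk collisions/year) are upgraded with a device
    removing the fraction risk_reduction of that risk."""
    return n_crossings * base_risk * risk_reduction

# Hypothetical figures for illustration only (not from the paper):
budget = 1_000_000.0
conventional = dict(cost=250_000.0, risk_reduction=0.90)
low_cost = dict(cost=50_000.0, risk_reduction=0.60)
base_risk = 0.01  # collisions per crossing per year under passive controls

benefit_conventional = expected_collisions_prevented(
    budget // conventional["cost"], base_risk, conventional["risk_reduction"])
benefit_low_cost = expected_collisions_prevented(
    budget // low_cost["cost"], base_risk, low_cost["risk_reduction"])
# 4 crossings at 90% reduction vs 20 crossings at 60% reduction:
# the low-cost treatment removes more network-wide risk here.
```

Under these assumed numbers the low-cost option prevents roughly three times as many expected collisions, mirroring the scenario the paper presents.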
Abstract:
Keeping exotic plant pests out of our country relies on good border control or quarantine. However, with increasing globalization and mobility, some things slip through. Then the back-up systems become important. These can include an expensive form of surveillance that purposively targets particular pests. A much wider net is provided by general surveillance, which is assimilated into everyday activities, like farmers checking the health of their crops. In fact, farmers and even home gardeners have provided a front-line warning system for some pests (e.g. European wasp) that could otherwise have wreaked havoc. Mathematics is used to model how surveillance works in various situations. Within this virtual world we can play with various surveillance and management strategies to "see" how they would work, or how to make them work better. One of our greatest challenges is estimating some of the input parameters: because the pest hasn't been here before, it's hard to predict how it might behave: establishing, spreading, and what types of symptoms it might express. So we rely on experts to help us with this. This talk will look at the mathematical, psychological and logical challenges of helping experts to quantify what they think. We show how the subjective Bayesian approach is useful for capturing expert uncertainty, ultimately providing a more complete picture of what they think... and what they don't!
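The subjective Bayesian approach the talk mentions can be sketched with a conjugate Beta model: an expert's best guess and confidence are encoded as a Beta prior over, say, the probability a pest establishes, and surveillance outcomes then update it. The prior values below are hypothetical, not elicited from any real expert.

```python
def elicit_beta(mean, effective_sample_size):
    """Convert an expert's best guess (mean) and confidence, expressed as
    an effective sample size, into Beta(a, b) parameters."""
    a = mean * effective_sample_size
    b = (1 - mean) * effective_sample_size
    return a, b

def update_beta(a, b, detections, clear_inspections):
    """Conjugate Bayesian update of a Beta prior with surveillance outcomes."""
    return a + detections, b + clear_inspections

# Hypothetical elicitation: the expert thinks establishment probability
# is about 0.2, with confidence worth roughly 10 prior observations.
a, b = elicit_beta(0.2, 10)                                  # Beta(2, 8)
a, b = update_beta(a, b, detections=1, clear_inspections=19)  # Beta(3, 27)
posterior_mean = a / (a + b)                                  # 3/30 = 0.1
```

The effective sample size is what captures "what they don't know": a less confident expert gets a flatter prior that the surveillance data overwhelms more quickly.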
Abstract:
Inspired by the wonderful properties of some biological composites in nature, we performed molecular dynamics simulations to investigate the mechanical behavior of bicontinuous nanocomposites. Three representative types of bicontinuous composites, which have regular network, random network, and nacre-inspired microstructures respectively, were studied, and the results were compared with those of a honeycomb nanocomposite with only one continuous phase. It was found that the mechanical strength of nanocomposites in a given direction strongly depends on the connectivity of the microstructure in that direction. Directional isotropy in mechanical strength and easy manufacturability favor the random network nanocomposites as a potentially great bioinspired composite with balanced performance. In addition, the tensile strength of random network nanocomposites is less sensitive to interfacial failure, owing to their very high interface-to-volume ratio and the random distribution of internal interfaces. The results provide a useful guideline for the design and optimization of advanced nanocomposites with superior mechanical properties.
Abstract:
This book provides a general framework for specifying, estimating, and testing time series econometric models. Special emphasis is given to estimation by maximum likelihood, but other methods are also discussed, including quasi-maximum likelihood estimation, generalized method of moments estimation, nonparametric estimation, and estimation by simulation. An important advantage of adopting the principle of maximum likelihood as the unifying framework for the book is that many of the estimators and test statistics proposed in econometrics can be derived within a likelihood framework, thereby providing a coherent vehicle for understanding their properties and interrelationships. In contrast to many existing econometric textbooks, which deal mainly with the theoretical properties of estimators and test statistics through a theorem-proof presentation, this book squarely addresses implementation to provide direct conduits between the theory and applied work.
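As a minimal illustration of the maximum-likelihood principle the book is organised around (a sketch of the general idea, not code from the book): for i.i.d. exponential data, setting the derivative of the log-likelihood to zero gives the closed-form estimator λ̂ = 1/x̄, and the log-likelihood is indeed largest there.

```python
import math

def exp_log_likelihood(lam, data):
    """Log-likelihood of i.i.d. Exponential(lam) data."""
    n = len(data)
    return n * math.log(lam) - lam * sum(data)

def exp_mle(data):
    """Closed-form maximum-likelihood estimate: lambda_hat = 1 / sample mean."""
    return len(data) / sum(data)

data = [0.5, 1.2, 0.8, 2.0, 1.5]
lam_hat = exp_mle(data)  # 5 / 6.0

# The MLE should dominate nearby candidate values of lambda.
assert exp_log_likelihood(lam_hat, data) >= exp_log_likelihood(0.9 * lam_hat, data)
assert exp_log_likelihood(lam_hat, data) >= exp_log_likelihood(1.1 * lam_hat, data)
```

The same recipe (write down the log-likelihood, maximise it, verify the maximum) carries over to the numerical-optimisation settings the book treats, where no closed form exists.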
Abstract:
This thesis presents a study using mechanical testing techniques combined with advanced computational methods to examine the mechanics of bone. It contributes novel observations and analysis of how bones fail at the microscopic level, which will be valuable in furthering our understanding and the treatment of bone damage in health and disease, including osteoporosis.
Abstract:
Materials used in engineering always contain imperfections or defects which significantly affect their performance. Based on large-scale molecular dynamics simulations and Euler–Bernoulli beam theory, the influence of different pre-existing surface defects on the bending properties of Ag nanowires (NWs) is studied in this paper. It is found that the nonlinear elastic deformation, as well as the flexural rigidity of the NW, is insensitive to the surface defects considered in this paper. On the contrary, an evident decrease of the yield strength is observed due to the existence of defects. In-depth inspection of the deformation process reveals that, at the onset of plastic deformation, dislocation embryos initiate from the locations of surface defects, and the plastic deformation is dominated by the nucleation and propagation of partial dislocations at the considered temperature. In particular, the generation of stair-rod partial dislocations and Lomer–Cottrell locks is normally observed for both perfect and defected NWs. The generation of these structures hinders early yielding of the NW, which explains why more defects do not necessarily mean a lower critical force.
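The Euler–Bernoulli step in such studies, extracting flexural rigidity from a force–deflection curve, can be sketched as follows. The simply supported, midspan-load boundary conditions here are an assumption for illustration and need not match those used in the paper.

```python
import math

def flexural_rigidity_simply_supported(force, deflection, length):
    """Back out flexural rigidity EI from a midspan load test on a simply
    supported Euler-Bernoulli beam: deflection = F L^3 / (48 EI).
    Assumed boundary conditions: a clamped-clamped wire would instead use
    deflection = F L^3 / (192 EI)."""
    return force * length**3 / (48.0 * deflection)

def youngs_modulus_circular(EI, diameter):
    """Young's modulus implied by EI for a circular cross-section,
    using the second moment of area I = pi d^4 / 64."""
    I = math.pi * diameter**4 / 64.0
    return EI / I
```

Because the MD simulation directly yields the force–deflection response, fitting its linear portion with a relation like this is how a constant flexural rigidity (and its insensitivity to surface defects) would be quantified.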
Abstract:
Travel time prediction has long been a topic of transportation research, but most prediction models in the literature are limited to motorways. Travel time prediction on arterial networks is challenging because of traffic signals and the significant variability of individual vehicle travel times. The limited availability of traffic data from arterial networks makes travel time prediction even more challenging. Recently, there has been significant interest in exploiting Bluetooth data for travel time estimation. This research analysed real travel time data collected by the Brisbane City Council using Bluetooth technology on arterials. Databases, including experienced average daily travel time, were created and classified for approximately 8 months. Thereafter, based on the data characteristics, Seasonal Auto Regressive Integrated Moving Average (SARIMA) modelling was applied to the database for short-term travel time prediction. The SARIMA model not only takes the previous continuous lags into account, but also uses the values from the same time of previous days for travel time prediction. This is carried out by defining a seasonality coefficient which improves the accuracy of travel time prediction in linear models. The accuracy, robustness and transferability of the model are evaluated by comparing the real and predicted values at three sites within the Brisbane network. The results contain detailed validation for different prediction horizons (5 to 90 minutes). The model performance is evaluated mainly on congested periods and compared to the naive technique of using the historical average.
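The core idea of combining recent lags with same-time-of-previous-day values can be sketched in a few lines. This is a deliberately simplified blend, not the paper's fitted SARIMA model; the weight `alpha` stands in for the seasonality coefficient, which in practice would be estimated from the data.

```python
def predict_travel_time(history, seasonal_period, alpha=0.5):
    """Illustrative seasonal predictor (not the paper's fitted SARIMA):
    blend the most recent observation with the observation from the same
    time slot one seasonal period earlier (e.g. the same 5-min slot
    yesterday). alpha is a hypothetical seasonality weight."""
    recent = history[-1]
    same_slot_previous_day = history[-seasonal_period]
    return alpha * recent + (1 - alpha) * same_slot_previous_day

# Hypothetical 5-minute average travel times (seconds); a real daily
# period has 288 five-minute slots, shortened to 4 here for illustration.
series = [100, 180, 150, 110,   # "day 1"
          105, 175, 155]        # "day 2"; the next slot should resemble 110
pred = predict_travel_time(series, seasonal_period=4, alpha=0.3)
```

With a low `alpha` the same-slot history dominates, which is why such seasonal terms help on arterials where travel times repeat strongly by time of day.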
Abstract:
Dwell time at a busway station has a significant effect on bus capacity and delay. Dwell time has conventionally been estimated using models developed on the basis of field survey data. However, field surveys are resource and cost intensive, so dwell time estimation based on limited observations can be somewhat inaccurate. Most public transport systems are now equipped with Automatic Passenger Count (APC) and/or Automatic Fare Collection (AFC) systems. AFC in particular reduces on-board ticketing time and the driver's workload, and ultimately reduces bus dwell time. AFC systems record all passenger transactions, providing transit agencies with access to vast quantities of data. AFC data provides transaction timestamps; however, this information differs from dwell time because passengers may tag on or tag off at times other than when doors open and close. This research contended that models could be developed to reliably estimate dwell time distributions when measured distributions of transaction times are known. Development of the models required calibration and validation using field survey data of actual dwell times, and an appreciation of another component of transaction time, namely bus time in queue. This research develops models for a peak period and an off-peak period at a busway station on the South East Busway (SEB) in Brisbane, Australia.
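The relationship between transaction times, queue time and dwell time can be sketched as a toy estimator. This is an illustrative simplification, not the paper's calibrated model: the buffer and queue-time parameters are hypothetical and would need calibration against field-surveyed dwell times, as the abstract describes.

```python
def estimate_dwell_time(transaction_times, boarding_buffer=2.0, queue_time=0.0):
    """Illustrative sketch (not the paper's calibrated model): approximate
    dwell time at one stop as the span of AFC tag timestamps, plus a
    hypothetical buffer for door opening/closing before the first and after
    the last transaction, minus time the bus spent queuing for the platform
    (during which passengers may already tag on)."""
    span = max(transaction_times) - min(transaction_times)
    return max(span + boarding_buffer - queue_time, 0.0)

# Tag timestamps (seconds from an arbitrary origin) for one bus at one stop.
taps = [3.0, 5.5, 6.0, 11.0]
dwell = estimate_dwell_time(taps, boarding_buffer=4.0, queue_time=1.5)  # 10.5
```

Applying such an estimator across many stops yields an estimated dwell time distribution from AFC data alone, which is the quantity the paper's models target.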