398 results for Simulation Experiment

at Queensland University of Technology - ePrints Archive


Relevance: 60.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions attached to each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe or unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in practice. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods to observations with low exposure represent the most defensible modeling approaches for datasets with a preponderance of zeros.
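The abstract's simulation setup is not reproduced here, but the mechanism it describes can be sketched in a few lines: crash counts generated as Poisson trials (independent Bernoulli events with unequal probabilities) under low exposure produce more zeros than a single Poisson fitted to the same data would predict. All parameter values below are illustrative assumptions, not the study's.

```python
# Minimal sketch: heterogeneous, low-exposure Poisson trials vs. a fitted Poisson.
import numpy as np

rng = np.random.default_rng(42)

n_sites = 2000           # entities (e.g., intersections)
n_trials = 200           # vehicle exposures per site (low exposure)

# Heterogeneous, small per-trial crash probabilities ("Poisson trials")
p = rng.gamma(shape=0.3, scale=0.02, size=n_sites)
p = np.clip(p, 0.0, 1.0)

# Crash counts: sums of independent Bernoulli trials with unequal probabilities
counts = rng.binomial(n_trials, p)

# Fit a single Poisson by the method of moments (pooled mean)
lam = counts.mean()
poisson_zeros = np.exp(-lam)            # P(count = 0) under the fitted Poisson
observed_zeros = np.mean(counts == 0)   # empirical proportion of zeros

print(f"fitted mean rate        : {lam:.2f}")
print(f"P(zero) under Poisson   : {poisson_zeros:.2f}")
print(f"observed share of zeros : {observed_zeros:.2f}")   # noticeably larger
```

With these illustrative settings the observed share of zeros clearly exceeds the Poisson prediction even though no "perfectly safe" state was ever simulated, which is the point the abstract makes about excess zeros arising from exposure and heterogeneity rather than a dual-state process.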

Relevance: 60.00%

Abstract:

One of the research focuses in the integer least squares problem is the decorrelation technique, which reduces the number of integer parameter search candidates and improves the efficiency of the integer parameter search method. It remains a challenging issue for determining carrier phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proved and demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be used directly as a performance measure, as it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always correspond to a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves the decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated based on the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate the performance of ambiguity validation. The performance of the different decorrelation methods was tested and compared using both simulation and real data. The simulation experiment scenarios employ the isotropic probabilistic model using a predetermined eigenvalue and without any geometry or weighting system constraints. The MIICD method outperformed the other three methods, with conditioning improvements over the LAMBDA method of 78.33% and 81.67% without and with the eigenvalue constraint, respectively. The real data experiment scenarios involve both the single-constellation and dual-constellation cases. Experimental results demonstrate that, compared with LAMBDA, the MIICD method can significantly improve the efficiency of reducing the condition number, by 78.65% and 97.78% for single and dual constellations respectively. It also shows improvements in the number of search candidate points of 98.92% and 100% in the single-constellation and dual-constellation cases.
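The condition-number criterion mentioned above is straightforward to state. The sketch below is an assumed illustration, not the MIICD implementation: it shows the criterion together with a simple two-dimensional integer Gaussian decorrelation (the first of the three techniques listed), applied to an invented ambiguity covariance matrix.

```python
# Condition-number criterion and a 2-D integer Gaussian decorrelation sketch.
import numpy as np

def condition_number(Q):
    """Ratio of largest to smallest eigenvalue of a symmetric matrix."""
    w = np.linalg.eigvalsh(Q)
    return w[-1] / w[0]

def gauss_decorrelate_2d(Q, max_iter=50):
    """Return (Qz, Z) with Qz = Z.T @ Q @ Z better conditioned.
    Z is unimodular (integer entries, determinant +/-1)."""
    Z = np.eye(2)
    for _ in range(max_iter):
        # Integer rounding of the off-diagonal term against the second variance
        mu = int(np.rint(Q[0, 1] / Q[1, 1]))
        if mu == 0:
            break
        T = np.array([[1.0, 0.0], [-mu, 1.0]])
        Q = T.T @ Q @ T
        Z = Z @ T
        # Swap so the smaller variance moves to the second position, then repeat
        if Q[0, 0] < Q[1, 1]:
            P = np.array([[0.0, 1.0], [1.0, 0.0]])
            Q = P.T @ Q @ P
            Z = Z @ P
    return Q, Z

Q = np.array([[4.25, 5.78],
              [5.78, 8.02]])          # strongly correlated float ambiguities
Qz, Z = gauss_decorrelate_2d(Q.copy())
print("condition number before:", condition_number(Q))    # large
print("condition number after :", condition_number(Qz))   # much smaller
```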

Relevance: 60.00%

Abstract:

We examine which capabilities technologies provide to support collaborative process modeling. We develop a model that explains how technology capabilities impact cognitive group processes, and how they lead to improved modeling outcomes and positive technology beliefs. We test this model through a free simulation experiment of collaborative process modelers structured around a set of modeling tasks. With our study, we provide an understanding of the process of collaborative process modeling, and detail implications for research and guidelines for the practical design of collaborative process modeling.

Relevance: 60.00%

Abstract:

Process models are used to convey semantics about business operations that are to be supported by an information system. A wide variety of professionals is expected to use such models, including people who have little modeling or domain expertise. We identify important user characteristics that influence the comprehension of process models. Through a free simulation experiment, we provide evidence that selected cognitive abilities, learning style, and learning strategy influence the development of process model comprehension. These insights draw attention to the importance of research that views process model comprehension as an emergent learning process rather than as an attribute of the models as objects. Based on our findings, we identify a set of organizational intervention strategies that can lead to more successful process modeling workshops.

Relevance: 60.00%

Abstract:

Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement to fulfil these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a pre-requisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.

Relevance: 60.00%

Abstract:

In this study, a bench-scale forward osmosis (FO) process was operated using two commonly available FO membranes in different orientations in order to examine the removal of foulants from coal seam gas (CSG) associated water; the water flux and fouling behaviour of the process were also investigated. After 48 h of the fouling simulation experiment, the water flux had declined by approximately 55% and 35% of its initial level in the TFC-PRO and CTA-PRO modes (support layer facing the feed), respectively, while the flux decline in the TFC-FO and CTA-FO modes (active layer facing the feed) was insignificant. The flux decline in the PRO modes was caused by the compounding effects of internal concentration polarisation and membrane fouling. However, the declined flux was completely recovered to its initial level following hydraulic cleaning with deionised water. Dissolved organic carbon (DOC), adenosine tri-phosphate (ATP) and the major inorganic scalants (Ca, Mg and silica) in the CSG feed were effectively removed by the FO process. The results of this study suggest that the FO process shows promising potential to be employed as an effective pre-treatment for membrane purification of CSG associated water.
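For reference, the flux-decline and flux-recovery figures quoted above are simple ratios of the measured water flux; a minimal sketch with assumed (not measured) values:

```python
# Flux-decline and flux-recovery ratios; numbers below are illustrative only.
def flux_decline(j0, jt):
    """Fractional decline of water flux from its initial level j0 to jt."""
    return (j0 - jt) / j0

def flux_recovery(j0, j_cleaned):
    """Fraction of the initial flux restored after hydraulic cleaning."""
    return j_cleaned / j0

j0, j48, j_cleaned = 20.0, 9.0, 20.0   # L/m^2/h, assumed values
print(f"decline after 48 h : {flux_decline(j0, j48):.0%}")        # 55%
print(f"recovery after wash: {flux_recovery(j0, j_cleaned):.0%}")  # 100%
```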

Relevance: 40.00%

Abstract:

Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. The use of complex computer models, or codes, in place of physical experiments has led to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This paper aims to discuss some practical issues in designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
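The abstract does not name a particular design method; a common choice for computer experiments is a Latin hypercube design, which places exactly one run in each equal-probability stratum of every input. A minimal sketch, with hypothetical input ranges for a manufacturing simulation:

```python
# Latin hypercube design sketch for a computer experiment (numpy only).
import numpy as np

def latin_hypercube(n_runs, bounds, seed=None):
    """Return an n_runs x d Latin hypercube sample scaled to the given bounds."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    sample = np.empty((n_runs, d))
    for j, (lo, hi) in enumerate(bounds):
        # One point per equal-probability stratum, strata taken in random order
        strata = (rng.permutation(n_runs) + rng.random(n_runs)) / n_runs
        sample[:, j] = lo + strata * (hi - lo)
    return sample

# Hypothetical inputs to a manufacturing simulation code
bounds = [(1.0, 10.0),   # machine cycle time (min)
          (0.01, 0.10),  # defect rate
          (1, 5)]        # number of operators (treated as continuous here)
design = latin_hypercube(n_runs=20, bounds=bounds, seed=1)
# Each row of `design` is one input setting at which the code would be run
```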

Relevance: 30.00%

Abstract:

SCAPE is an interactive simulation that allows teachers and students to experiment with sustainable urban design. The project is based on the Kelvin Grove Urban Village, Brisbane. Groups of students role-play as political, retail, elderly, student, council and builder characters to negotiate game decisions around land use, density, housing types and transport in order to design a sustainable urban community. As they do so, the 3D simulation reacts in real time to illustrate what the village would look like, as well as providing statistical information about the community they are creating. SCAPE brings together education, urban-professional and technology expertise, helping it achieve educational outcomes, reflect real-world scenarios, and include sophisticated logic, decision-making processes and effects.---------- The research methodology was primarily practice-led, underpinned by action research methods, resulting in innovative approaches and techniques for adapting digital games and simulation technologies to create dynamic and engaging experiences in pedagogical contexts. It also illustrates the possibilities for urban designers to engage a variety of communities in the processes, complexities and possibilities of urban development and sustainability.

Relevance: 30.00%

Abstract:

Farm It Right is an innovative creative work that simulates sustainable farming techniques using ecological models prepared by academics at Bradford University (School of Life Sciences). This interactive work simulates the farming conditions and options of our ancestors and demonstrates the direct impact their actions had on their environment and on the ’future of their cultures’ (Schmidt 2008). Specifically, the simulation allows users to explore and experiment with the complex relationships between environmental factors and human decision making within the harsh conditions of an early (9th century) Nordic farm. The simulation interface displays both statistical and graphical feedback in response to the user’s selections regarding animal reproduction rates, shelter provisions, food supplies, etc., as well as demonstrating the resulting impacts on soil erosion, water supply, animal population sizes, etc.---------- 'Farm It Right' is now used at Bradford University (School of Life Sciences) as a dynamic e-Learning resource for integrating environmental archaeology with sustainable development education, improving engagement with complex data and the appreciation of human impacts on the environment and the future of their cultures. 'Farm It Right' is also presented as an exemplar case study for interaction design students at Queensland University of Technology.

Relevance: 30.00%

Abstract:

This paper presents a material model to simulate load-induced cracking in Reinforced Concrete (RC) elements in the ABAQUS finite element package. Two numerical material models are used and combined to simulate the complete stress-strain behaviour of concrete under compression and tension, including damage properties. Both numerical techniques used in the present material model are capable of developing the stress-strain curves, including the strain-softening regimes, using only the ultimate compressive strength of concrete, which is easily and practically obtainable for many existing RC structures or those to be built. Therefore, the method proposed in this paper is valuable in assessing existing RC structures in the absence of more detailed test results. The numerical models are slightly modified from their original versions to be compatible with the damaged plasticity model used in ABAQUS. The model is validated using different experimental results for RC beam elements presented in the literature. The results show good agreement with the load vs. displacement curves and the observed crack patterns.
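The two numerical models themselves are not reproduced in the abstract. As an illustration of the general idea, the sketch below generates a complete compressive stress-strain curve, including a softening branch, from the ultimate compressive strength alone, using a widely cited Hognestad-type idealisation; this is an assumption for illustration, not necessarily one of the models the paper combines.

```python
# Compressive stress-strain curve from ultimate strength only (Hognestad-type).
import numpy as np

def compressive_curve(fc, n_points=100):
    """Compressive stress (MPa) vs. strain for concrete of strength fc (MPa)."""
    Ec = 4700.0 * np.sqrt(fc)          # secant modulus, ACI-type estimate
    eps0 = 2.0 * fc / Ec               # strain at peak stress
    eps_u = 0.0035                     # assumed ultimate strain
    eps = np.linspace(0.0, eps_u, n_points)
    stress = np.where(
        eps <= eps0,
        fc * (2 * eps / eps0 - (eps / eps0) ** 2),        # ascending parabola
        fc * (1 - 0.15 * (eps - eps0) / (eps_u - eps0)),  # linear softening
    )
    return eps, stress

eps, sigma = compressive_curve(fc=30.0)   # e.g. a 30 MPa concrete
```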

Relevance: 30.00%

Abstract:

There are large uncertainties in the aerothermodynamic modelling of super-orbital re-entry which impact the design of spacecraft thermal protection systems (TPS). Aspects of the thermal environment of super-orbital re-entry flows can be simulated in the laboratory using arc- and plasma-jet facilities, and these devices are regularly used for TPS certification work [5]. Another laboratory device which is capable of simulating certain critical features of both the aerodynamic and thermal environment of super-orbital re-entry is the expansion tube, and three such facilities have been operating at the University of Queensland in recent years [10]. Despite some success, wind tunnel tests do not achieve full simulation; however, a virtually complete physical simulation of particular re-entry conditions can be obtained from dedicated flight testing, and the Apollo-era FIRE II flight experiment [2] is the premier example, which still forms an important benchmark for modern simulations. Dedicated super-orbital flight testing is generally considered too expensive today, and there is a reluctance to incorporate substantial instrumentation for aerothermal diagnostics into existing missions since it may compromise primary mission objectives. An alternative approach to on-board flight measurements, with demonstrated success particularly in the ‘Stardust’ sample return mission, is remote observation of spectral emissions from the capsule and shock layer [8]. JAXA’s ‘Hayabusa’ sample return capsule provides a recent super-orbital re-entry example through which we illustrate contributions in three areas: (1) physical simulation of super-orbital re-entry conditions in the laboratory; (2) computational simulation of such flows; and (3) remote acquisition of optical emissions from a super-orbital re-entry event.

Relevance: 30.00%

Abstract:

An approach for modeling passenger flows in airport terminals using a set of devised advanced traits of passengers is proposed. Advanced traits take into account a passenger’s cognitive preferences, which are the underlying motivations of route-choice decisions. Basic traits are the status of passengers, such as travel class. Although the activities of passengers are normally regarded as stochastic and sometimes unpredictable, we propose that real passenger-flow scenarios can feasibly be compared with virtual simulations in terms of the tactical route-choice decisions of individual passengers. Inside airport terminals, passengers are goal-directed: they not only use standard processing check points but also engage in discretionary activities along the way. In this paper, we integrate discretionary activities into the study to capture the full range of passenger flows. In the model, passengers are built as intelligent agents that possess a set of initial basic traits and can then be categorized into ten distinct groups in terms of route-choice preferences by inferring their advanced traits. An experiment is executed to demonstrate the model's capability to facilitate the prediction of passenger flows.
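The trait definitions and inference rules are not given in the abstract; the sketch below is a hypothetical illustration of the modeling pattern, in which basic traits are used to infer an advanced trait that drives a tactical route choice between a discretionary activity and a standard processing point. The trait names and the inference rule are invented for illustration.

```python
# Hypothetical passenger agent: basic traits -> inferred advanced trait -> route choice.
from dataclasses import dataclass
import random

@dataclass
class Passenger:
    travel_class: str          # basic trait, e.g. "business" or "economy"
    time_to_boarding_min: int  # basic trait

    @property
    def time_pressure(self) -> float:
        """Advanced trait inferred from basic traits (assumed rule)."""
        pressure = max(0.0, 1.0 - self.time_to_boarding_min / 120.0)
        bump = 0.1 if self.travel_class == "business" else 0.0
        return min(1.0, pressure + bump)

    def choose_next_activity(self) -> str:
        """Tactical route choice: discretionary activity vs. processing point."""
        if random.random() > self.time_pressure:
            return "discretionary: retail / cafe"
        return "processing: security / boarding gate"

p = Passenger(travel_class="economy", time_to_boarding_min=90)
print(p.time_pressure, p.choose_next_activity())
```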