610 results for turbulence modelling theory


Relevance: 30.00%
Publisher:
Abstract:

Impinging flow occurs when a fluid impacts a comparatively solid boundary, upon which divergence occurs. A perfect example of an impinging flow is the impact and divergence of air at ground level during a thunderstorm outflow. The importance of modelling thunderstorm outflows, and in particular the downburst, is now well known to the wind engineering community, and research into many of its characteristics is underway throughout the world. The reader is directed to the text by Fujita [1] for an introduction to downburst concepts and theory.

Relevance: 30.00%
Publisher:
Abstract:

This is the third TAProViz workshop being run at BPM. The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. We note this year the continuing interest in the visualisation of process mining data and the resultant process models. More info at: http://wst.univie.ac.at/topics/taproviz14/

Relevance: 30.00%
Publisher:
Abstract:

A system is something that can be separated from its surrounds, but this definition leaves much scope for refinement. Starting with the notion of measurement, we explore increasingly contextual system behaviour and identify three major forms of contextuality that might be exhibited by a system: (a) between components; (b) between a system and the experimental method; and (c) between a system and its environment. Quantum Theory is shown to provide a highly useful formalism from which all three forms of contextuality can be analysed, offering numerous tests for contextual behaviour as well as modelling possibilities for systems that do indeed display it. I conclude with the introduction of a Contextualised General Systems Theory based upon an extension of this formalism.

Relevance: 30.00%
Publisher:
Abstract:

Quantum-like models can be fruitfully used to model attitude change in a social context. Next steps require data and higher-dimensional models. Here, we discuss an exploratory study that demonstrates an order effect when three question sets about Climate Beliefs, Political Affiliation and Attitudes Towards Science are presented in different orders within a larger study of n = 533 subjects. A quantum-like model seems possible, and we propose a new experiment which could be used to test between three possible models for this scenario.
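As a hedged illustration only (not the authors' model), quantum-like accounts typically attribute question order effects to non-commuting projectors: if $P_A$ and $P_B$ project onto the 'agree' answers of two question sets, the sequential response probabilities from a state $\psi$ satisfy

$$ p(A \rightarrow B) = \lVert P_B P_A \psi \rVert^2 \neq \lVert P_A P_B \psi \rVert^2 = p(B \rightarrow A) \quad \text{when } [P_A, P_B] \neq 0, $$

so a dependence of response statistics on presentation order is consistent with treating the Climate Beliefs, Political Affiliation and Attitudes Towards Science question sets as incompatible measurements on a shared state.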

Relevance: 30.00%
Publisher:
Abstract:

Building information models are increasingly being utilised for facility management of large facilities such as critical infrastructures. In such environments, it is valuable to utilise the vast amount of data contained within the building information models to improve access control administration. The use of building information models in access control scenarios can provide 3D visualisation of buildings as well as many other advantages such as automation of essential tasks including path finding, consistency detection, and accessibility verification. However, there is no mathematical model for building information models that can be used to describe and compute these functions. In this paper, we show how graph theory can be utilised as a representation language of building information models and the proposed security related functions. This graph-theoretic representation allows for mathematically representing building information models and performing computations using these functions.
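A minimal sketch of the graph-theoretic idea, assuming a hypothetical room and credential layout (the identifiers below do not come from the paper): rooms become vertices, doors become edges labelled with the credential required to open them, and path finding for access control reduces to a graph search over the edges a given credential set can traverse.

from collections import deque

# Hypothetical building graph: each edge is (room_a, room_b, required_credential).
edges = [
    ("lobby", "corridor", "staff"),
    ("corridor", "office", "staff"),
    ("corridor", "server_room", "admin"),
]

def build_graph(edges, credentials):
    """Adjacency list containing only the doors this credential set can open."""
    graph = {}
    for a, b, required in edges:
        if required in credentials:
            graph.setdefault(a, []).append(b)
            graph.setdefault(b, []).append(a)
    return graph

def find_path(graph, start, goal):
    """Breadth-first search returning one accessible path, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(find_path(build_graph(edges, {"staff"}), "lobby", "server_room"))            # None
print(find_path(build_graph(edges, {"staff", "admin"}), "lobby", "server_room"))   # ['lobby', 'corridor', 'server_room']

Accessibility verification and consistency detection can be phrased as reachability and constraint queries over the same structure.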

Relevance: 30.00%
Publisher:
Abstract:

Solid-extracellular fluid interaction is believed to play an important role in the strain-rate dependent mechanical behaviors of shoulder articular cartilage. The kangaroo shoulder joint is anatomically and biomechanically similar to the human shoulder joint and is readily available in Australia; kangaroo humeral head cartilage was therefore used as the tissue for this study. Indentation tests from quasi-static (10⁻⁴ s⁻¹) to moderately high strain-rate (10⁻² s⁻¹) loading were conducted on kangaroo humeral head cartilage tissues to investigate the strain-rate dependent behaviors. A finite element (FE) model was then developed, in which cartilage was conceptualized as a porous solid matrix filled with incompressible fluid. In this model, the solid matrix was modeled as an isotropic hyperelastic material and the percolating fluid follows Darcy's law. Using an inverse FE procedure, the constitutive parameters related to the stiffness and compressibility of the solid matrix and to the permeability were obtained from the experimental results. The effect of solid-extracellular fluid interaction and drag force (the resistance to fluid movement) on strain-rate dependent behavior was investigated by comparing the influence of constant, strain dependent and strain-rate dependent permeability on the FE model prediction. The newly developed porohyperelastic cartilage model with the inclusion of strain-rate dependent permeability was found to be able to predict the strain-rate dependent behaviors of cartilage.
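As a sketch of the constitutive ingredients (generic forms only; the exact expressions used in the paper are not reproduced in this abstract), the fluid flux through the solid matrix follows Darcy's law, with the permeability treated as a function of strain and strain rate:

$$ \mathbf{q} = -k \, \nabla p, \qquad k = k(\varepsilon, \dot{\varepsilon}), $$

where $\mathbf{q}$ is the fluid flux relative to the solid, $p$ the pore pressure and $k$ the hydraulic permeability. A commonly used strain-dependent choice in the cartilage literature is the exponential form $k = k_0 \exp(M\varepsilon)$; allowing $k$ to depend additionally on $\dot{\varepsilon}$ is what gives a poroelastic model its strain-rate sensitivity.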

Relevance: 30.00%
Publisher:
Abstract:

This paper presents a novel framework for the modelling of passenger facilitation in a complex environment. The research is motivated by the challenges in the airport complex system, where there are multiple stakeholders, differing operational objectives and complex interactions and interdependencies between different parts of the airport system. Traditional methods for airport terminal modelling do not explicitly address the need for understanding causal relationships in a dynamic environment. Additionally, existing Bayesian Network (BN) models, which provide a means for capturing causal relationships, only present a static snapshot of a system. A method to integrate a BN complex systems model with stochastic queuing theory is developed based on the properties of the Poisson and exponential distributions. The resultant Hybrid Queue-based Bayesian Network (HQBN) framework enables the simulation of arbitrary factors, their relationships, and their effects on passenger flow and vice versa. A case study implementation of the framework is demonstrated on the inbound passenger facilitation process at Brisbane International Airport. The predicted outputs of the model, in terms of cumulative passenger flow at intermediary and end points in the inbound process, are found to have an R² goodness of fit of 0.9994 and 0.9982 respectively over a 10 h test period. The utility of the framework is demonstrated on a number of usage scenarios including causal analysis and ‘what-if’ analysis. This framework provides the ability to analyse and simulate a dynamic complex system, and can be applied to other socio-technical systems such as hospitals.
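As a minimal, hedged sketch of the queuing building blocks mentioned above (Poisson arrivals, i.e. exponential inter-arrival times, and exponential service), and not of the HQBN framework itself; the arrival and service rates below are arbitrary assumptions:

import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 2.5, 10_000   # arrival rate, service rate (passengers/min), sample size

# A Poisson arrival process is equivalent to exponentially distributed inter-arrival times.
arrivals = np.cumsum(rng.exponential(1.0 / lam, n))
service = rng.exponential(1.0 / mu, n)

# Single-server FIFO queue: each departure waits for the later of its own arrival
# and the previous departure, then adds its own service time.
departures = np.empty(n)
departures[0] = arrivals[0] + service[0]
for i in range(1, n):
    departures[i] = max(arrivals[i], departures[i - 1]) + service[i]

print("simulated mean time in system:", (departures - arrivals).mean())
print("M/M/1 theory, 1/(mu - lam):   ", 1.0 / (mu - lam))

In the paper's framework, these distributional properties are the stated basis for coupling the queuing layer with the BN layer.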

Relevance: 30.00%
Publisher:
Abstract:

Process modeling – the design and use of graphical documentations of an organization’s business processes – is a key method to document and use information about the operations of businesses. Still, despite current interest in process modeling, this research area faces essential challenges. Key unanswered questions concern the impact of process modeling in organizational practice, and the mechanisms through which such impacts develop. To answer these questions and to provide a better understanding of process modeling impact, I turn to the concept of affordances. Affordances describe the possibilities for goal-oriented action that a technical object offers to a user. This notion has received growing attention from IS researchers. The purpose of my research is to further develop the IS discipline’s understanding of affordances and impacts from information objects, such as process models used by analysts for information systems analysis and design. Specifically, I seek to extend existing theory on the emergence, perception and actualization of affordances. I develop a research model that describes the process by which affordances emerge between an individual and an object, how affordances are perceived, and how they are actualized by the individual. The proposed model also explains the role of available information for the individual, and the influence of perceived actualization effort. I operationalize and test this research model empirically, using a full-cycle, mixed-methods study consisting of a case study and an experiment.

Relevance: 30.00%
Publisher:
Abstract:

Many nations are highlighting the need for a renaissance in the mathematical sciences as essential to the well-being of all citizens (e.g., Australian Academy of Science, 2006; 2010; The National Academies, 2009). Indeed, the first recommendation of The National Academies’ Rising Above the Gathering Storm (2007) was to vastly improve K–12 science and mathematics education. The subsequent report, Rising Above the Gathering Storm Two Years Later (2009), highlighted again the need to target mathematics and science from the earliest years of schooling: “It takes years or decades to build the capability to have a society that depends on science and technology . . . You need to generate the scientists and engineers, starting in elementary and middle school” (p. 9). Such pleas reflect the rapidly changing nature of problem solving and reasoning needed in today’s world, beyond the classroom. As The National Academies (2009) reported, “Today the problems are more complex than they were in the 1950s, and more global. They’ll require a new educated workforce, one that is more open, collaborative, and cross-disciplinary” (p. 19). The implications for the problem solving experiences we implement in schools are far-reaching. In this chapter, I consider problem solving and modelling in the primary school, beginning with the need to rethink the experiences we provide in the early years. I argue for a greater awareness of the learning potential of young children and the need to provide stimulating learning environments. I then focus on data modelling as a powerful means of advancing children’s statistical reasoning abilities, which they increasingly need as they navigate their data-drenched world.

Relevance: 30.00%
Publisher:
Abstract:

Organizational and technological systems analysis and design practices such as process modeling have received much attention in recent years. However, while knowledge about related artifacts such as models, tools, or grammars has substantially matured, little is known about the actual tasks and interaction activities that are conducted as part of analysis and design acts. In particular, the key role of the facilitator has not been researched extensively to date. In this paper, we propose a new conceptual framework that can be used to examine facilitation behaviors in process modeling projects. The framework distinguishes four behavioral styles in facilitation (the driving engineer, the driving artist, the catalyzing engineer, and the catalyzing artist) that a facilitator can adopt. To distinguish between the four styles, we provide a set of ten behavioral anchors that underpin facilitation behaviors. We also report on a preliminary empirical exploration of our framework through interviews with experienced analysts in six modeling cases. Our research provides a conceptual foundation for an emerging theory for describing and explaining different behaviors associated with process modeling facilitation, provides first preliminary empirical results about facilitation in modeling projects, and provides a fertile basis for examining facilitation in other conceptual modeling activities.

Relevance: 30.00%
Publisher:
Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, due to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADV) were sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS) tracked drifters were used to obtain lower frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence in the flow field. The time series of mean flow measurements for both the ADCP and the drifters were consistent with those of the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to the tidal inertial current. The channel exhibited a mixed type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The results of the statistical analysis suggested that the ebb phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant amount related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell between -0.5 and +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations of the high frequency data sampled within the duration of a few tidal cycles. The study provides a characterisation of the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
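As an illustrative sketch only (the filter type and window lengths below are assumptions, not the paper's TD implementation), a triple decomposition of a 50 Hz velocity record can be realised by successive low-pass filtering: a long window isolates the tidal component, a shorter window applied to the residual captures the slow (resonance) fluctuations, and the remainder is treated as turbulence.

import numpy as np
from scipy.ndimage import uniform_filter1d

def triple_decompose(u, fs=50.0, tidal_window_s=3600.0, slow_window_s=200.0):
    """Split a velocity record u into tidal, slow-fluctuation and turbulent parts."""
    tidal = uniform_filter1d(u, size=int(fs * tidal_window_s), mode="nearest")
    slow = uniform_filter1d(u - tidal, size=int(fs * slow_window_s), mode="nearest")
    turbulence = u - tidal - slow
    return tidal, slow, turbulence

# Usage: u is a 1D float array holding one ADV velocity component sampled at 50 Hz;
# turbulence statistics (ratios, skewness, kurtosis) are then computed on the third part.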

Relevance: 30.00%
Publisher:
Abstract:

Searching for efficient solid sorbents for CO2 adsorption and separation is important for developing emergent carbon reduction and natural gas purification technologies. This work, for the first time, investigates the adsorption of CO2 on the newly experimentally realized cage-like B40 fullerene (Zhai et al., 2014) based on density functional theory calculations. We find that the adsorption of CO2 on B40 fullerene involves a relatively large energy barrier (1.21 eV); however, this can be greatly decreased to 0.35 eV by introducing an extra electron. A practical way to realize negatively charged B40 fullerene is then proposed by encapsulating a Li atom into the B40 fullerene (Li@B40). Li@B40 is found to be highly stable and to significantly enhance both the thermodynamics and kinetics of CO2 adsorption, while the adsorption of N2, CH4 and H2 on the Li@B40 fullerene remains weak in comparison. Since B40 fullerene has been successfully synthesized in a recent experiment, our results highlight a new promising material for CO2 capture and separation for future experimental validation.
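To put the reported barrier reduction in perspective, a back-of-envelope Arrhenius estimate (assuming comparable pre-exponential factors; this is not a calculation from the paper) gives the room-temperature rate enhancement as

$$ \frac{k_{\text{charged}}}{k_{\text{neutral}}} \approx \exp\!\left(\frac{1.21\ \text{eV} - 0.35\ \text{eV}}{k_B T}\right) = \exp\!\left(\frac{0.86}{0.0259}\right) \sim 10^{14} \quad (T = 300\ \text{K}), $$

which indicates why the extra electron, supplied in practice by the encapsulated Li atom, so strongly enhances the adsorption kinetics.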

Relevance: 30.00%
Publisher:
Abstract:

Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model on the basis of the convective Exner equation that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to develop the Exner equation as two separate equations: the first one is the mean equation, which yields the mean sediment thickness, and the second one is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers resulting from fluvial processes, which allows incorporating the stochasticity of the paleoflow velocity.
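A generic sketch of the perturbation set-up (the specific convective form and closure used in the paper are not reproduced in this abstract): writing the Exner equation for bed elevation $\eta$ with porosity $\lambda_p$ and a sediment flux $\mathbf{q}_s$ driven by the river velocity $v$, and splitting velocity and elevation into mean and fluctuating parts,

$$ (1-\lambda_p)\,\frac{\partial \eta}{\partial t} + \nabla \cdot \mathbf{q}_s(v) = 0, \qquad v = \langle v \rangle + v', \quad \eta = \langle \eta \rangle + \eta', $$

ensemble averaging yields an equation for the mean sediment thickness $\langle \eta \rangle$, while subtracting it from the full equation yields a perturbation equation whose second moments give the variance of $\eta$; the velocity statistics $(\langle v \rangle, \sigma_v^2)$ enter as the input parameters.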

Relevance: 30.00%
Publisher:
Abstract:

In order to assess the structural reliability of bridges, an accurate and cost-effective Non-Destructive Evaluation (NDE) technology is required to ensure their safe and reliable operation. According to the Bureau of Transport and Communication Economics (1997), over 60% of the Australian National Highway System comprises prestressed concrete (PSC) bridges. Most of the in-service bridges are more than 30 years old and may experience heavier traffic loads than their original intended level. The use of ultrasonic waves for NDE and Structural Health Monitoring (SHM) is continuously increasing in civil, aerospace, electrical and mechanical applications. Ultrasonic Lamb waves are becoming more popular for NDE because they can propagate over long distances and reach hidden regions with little energy loss. The purpose of this study is to numerically quantify the prestress force (PSF) of a PSC beam using the fundamental theory of acoustoelasticity. A three-dimensional finite element modelling approach is set up to perform parametric studies in order to better understand how Lamb wave propagation in the PSC beam is affected by changes in the PSF level. Results from acoustoelastic measurements on the prestressed beam are presented, showing the feasibility of Lamb waves for PSF evaluation in PSC bridges.
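The acoustoelastic principle being exploited can be sketched in its generic first-order form (the mode-specific constants used in the 3D FE model are not given in this abstract): the phase velocity of a guided wave varies approximately linearly with the applied stress,

$$ v(\sigma) \approx v_0 \,(1 + A\,\sigma), $$

where $v_0$ is the velocity in the unstressed beam and $A$ is an acoustoelastic constant depending on the material's second- and third-order elastic constants and on the Lamb mode; measuring the relative change in velocity (or time of flight) over a known gauge length therefore allows the prestress force to be inferred.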

Relevance: 30.00%
Publisher:
Abstract:

A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are often recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These recorded data cannot be interpreted individually, since they typically do not have all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times using commonly-available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
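A minimal sketch of the keyword-plus-Naïve-Bayes step, assuming labelled work-order text and using scikit-learn (the example strings, keyword handling and labels below are hypothetical, not the paper's data set):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training data: work-order descriptions with known event classes.
texts = [
    "transformer tripped on overcurrent, unplanned outage",
    "bearing seized, emergency shutdown of feeder pump",
    "scheduled oil change and filter replacement",
    "routine inspection and planned recalibration of protection relay",
]
labels = ["failure", "failure", "preventive", "preventive"]

vectorizer = CountVectorizer(stop_words="english")    # keyword/term counts
X = vectorizer.fit_transform(texts)

classifier = MultinomialNB()                          # Naive Bayes over keyword counts
classifier.fit(X, labels)

# Attribute a new machine stoppage to one of the two classes.
new_record = ["unplanned trip of feeder pump after vibration alarm"]
print(classifier.predict(vectorizer.transform(new_record)))   # e.g. ['failure']

The classified stoppages then supply the failure and preventive maintenance times that feed the reliability model.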