969 results for STOCHASTIC PROCESSES
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such high-resolution data sets, the typical CT image sizes are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations is required. To resolve this large-memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We regard adequate visualization of the results as an important element of this first pioneering study.
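The Hoshen-Kopelman cluster labeling at the heart of the percolation analysis can be sketched in two dimensions with a union-find structure; the 3-D micro-CT version differs only in having an extra neighbour direction and far larger arrays. This is a minimal illustrative sketch, not the parallelized OpenMP implementation used in the study:

```python
# Minimal 2-D Hoshen-Kopelman sketch. Grid values: 1 = pore, 0 = solid.
# Labels are assigned row by row; conflicting labels are merged via union-find.

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving
        i = parent[i]
    return i

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def hoshen_kopelman(grid):
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    parent = [0]          # parent[0] unused; cluster labels start at 1
    next_label = 1
    for r in range(rows):
        for c in range(cols):
            if not grid[r][c]:
                continue
            up = labels[r - 1][c] if r > 0 else 0
            left = labels[r][c - 1] if c > 0 else 0
            if up == 0 and left == 0:
                parent.append(next_label)   # brand-new cluster
                labels[r][c] = next_label
                next_label += 1
            elif up and left:
                union(parent, up, left)     # two clusters meet: merge them
                labels[r][c] = find(parent, up)
            else:
                labels[r][c] = find(parent, up or left)
    # second pass: replace every label by its canonical root
    for r in range(rows):
        for c in range(cols):
            if labels[r][c]:
                labels[r][c] = find(parent, labels[r][c])
    return labels

def percolates(labels):
    """True if one pore cluster spans from the first row to the last row."""
    top = {l for l in labels[0] if l}
    bottom = {l for l in labels[-1] if l}
    return bool(top & bottom)
```

A percolation analysis then amounts to labeling the pore space and asking whether any cluster connects opposite faces of the sample.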
Abstract:
We examine which capabilities technologies provide to support collaborative process modeling. We develop a model that explains how technology capabilities impact cognitive group processes, and how they lead to improved modeling outcomes and positive technology beliefs. We test this model through a free simulation experiment of collaborative process modelers structured around a set of modeling tasks. With our study, we provide an understanding of the process of collaborative process modeling, and detail implications for research and guidelines for the practical design of collaborative process modeling.
Abstract:
Video presented as part of the ACIS 2009 conference in Melbourne, Australia. This movie demonstrates the use of 3D virtual environments to visualise 3D BPMN process models and, in particular, to highlight any issues with the process model that are spatial in nature. This work is part of a paper accepted for the Asia-Pacific Conference on Conceptual Modelling (APCCM 2010) to be held in Brisbane - http://2010.apccm.org/
Abstract:
Hosted on YouTube, and shown in various locations. In this video we show a 3D mock-up of a personal house-purchasing process. A path-traversal metaphor is used to give a sense of progression along the process stages. The intention is to be able to use console devices such as an Xbox to consume business processes, so that businesses can expose their internal processes to consumers through sophisticated user interfaces. The demonstrator was developed using Microsoft XNA, with assistance from Suncorp Bank and the Smart Services CRC. More information at: www.bpmve.org
Abstract:
A dual-scale model of the torrefaction of wood was developed and used to study industrial configurations. At the local scale, the computational code solves the coupled heat and mass transfer and the thermal degradation mechanisms of the wood components. At the global scale, the two-way coupling between the boards and the stack channels is treated as an integral component of the process. This model is used to investigate the effect of the stack configuration on the heat treatment of the boards. The simulations highlight that the heat released by the exothermic reactions occurring in each single board can accumulate along the stack. This phenomenon may result in dramatic heterogeneity of the process and poses a serious risk of thermal runaway, which is often observed in industrial plants. The model is used to explain how the risk of thermal runaway can be lowered by increasing the airflow velocity or the sticker thickness, or by reversing the gas flow.
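The accumulation of exothermic heat along a stack channel can be illustrated with a deliberately crude energy balance; the function name, board-level heat release and flow parameters below are hypothetical placeholders for illustration only, not the dual-scale model's actual equations:

```python
def gas_temperature_along_stack(t_in, n_boards, q_exo_per_board,
                                mass_flow, cp=1100.0):
    """Crude energy balance along one stack channel (illustrative sketch).

    Each board adds its exothermic heat release q_exo_per_board [W] to the
    airflow, so the gas warms as it travels downstream past successive
    boards: dT = Q / (m_dot * cp). A larger mass_flow [kg/s] dilutes the
    released heat, which is one way to see why increasing the airflow
    velocity mitigates thermal runaway.
    """
    temps = [t_in]  # gas temperature [deg C] entering each board position
    for _ in range(n_boards):
        temps.append(temps[-1] + q_exo_per_board / (mass_flow * cp))
    return temps
```

With a fixed per-board heat release, the downstream boards see hotter gas than the upstream ones, reproducing in caricature the along-stack heterogeneity the simulations highlight.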
Abstract:
When organizational scandals occur, the common refrain among commentators is: 'Where was the board in all this?' 'How could the directors not have known what was going on?' 'Why didn't the board intervene?' The scandals demonstrate that board monitoring of organizational performance is a matter of great importance. By monitoring, we mean the act of keeping the organization under review. In many English-speaking countries, directors have a legal duty of care, which includes duties to monitor the performance of their organizations (Hopt and von Hippel 2010). However, statutory law typically merely states the duty, while providing little guidance on how that duty can be met.
Abstract:
An estuary is formed at the mouth of a river where the tides meet a freshwater flow, and it may be classified as a function of the salinity distribution and density stratification. An overview of the broad characteristics of the estuaries of South-East Queensland (Australia) is presented herein, where the small peri-urban estuaries may provide a useful indicator of potential changes which might occur in larger systems with growing urbanisation. Small peri-urban estuaries exhibit many key hydrological features and associated ecosystem types of larger estuaries, albeit at smaller scales, often with a greater extent of urban development as a proportion of catchment area. We explore the potential for some smaller peri-urban estuaries to be used as natural laboratories to gain some much-needed information on estuarine processes, although any dynamic similarity is presently limited by the critical absence of in-depth physical investigation in larger estuarine systems. The absence of detailed turbulence and sedimentary data hampers the understanding and modelling of the estuarine zones. The interactions between the various stakeholders are likely to define the vision for the future of South-East Queensland's peri-urban estuaries. This will require a solid understanding of the bio-physical function and capacity of the peri-urban estuaries. Based upon this knowledge gap, it is recommended that an adaptive trial-and-error approach be adopted for future investigation and management strategies.
Abstract:
Groundwater flow models are usually characterized as being either transient flow models or steady state flow models. Given that steady state groundwater flow conditions arise as a long time asymptotic limit of a particular transient response, it is natural for us to seek a finite estimate of the amount of time required for a particular transient flow problem to effectively reach steady state. Here, we introduce the concept of mean action time (MAT) to address a fundamental question: How long does it take for a groundwater recharge or discharge process to effectively reach steady state? This concept relies on identifying a cumulative distribution function, $F(t;x)$, which varies from $F(0;x)=0$ to $F(t;x) \to 1$ as $t\to \infty$, thereby providing us with a measurement of the progress of the system towards steady state. The MAT corresponds to the mean of the associated probability density function $f(t;x) = \dfrac{dF}{dt}$, and we demonstrate that this framework provides useful analytical insight by explicitly showing how the MAT depends on the parameters in the model and the geometry of the problem. Additional theoretical results relating to the variance of $f(t;x)$, known as the variance of action time (VAT), are also presented. To test our theoretical predictions we include measurements from a laboratory-scale experiment describing flow through a homogeneous porous medium. The laboratory data confirm that the theoretical MAT predictions are in good agreement with measurements from the physical model.
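The MAT and VAT described above can be written compactly; integrating by parts turns the density form into an integral of the complementary CDF, which is often easier to evaluate analytically:

$$\mathrm{MAT}(x) = \int_0^{\infty} t\, f(t;x)\, dt = \int_0^{\infty} \bigl[1 - F(t;x)\bigr]\, dt, \qquad \mathrm{VAT}(x) = \int_0^{\infty} t^{2} f(t;x)\, dt - \mathrm{MAT}(x)^{2},$$

with $f(t;x) = \dfrac{dF}{dt}$ as defined above, so both moments can be computed without ever forming $f(t;x)$ explicitly when $1-F(t;x)$ has a convenient closed form.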
Abstract:
The chronic housing shortage in Jakarta, Indonesia, poses multiple challenges for contemporary policy-makers. While it may be in the city’s interest to increase the availability of housing, there is limited land to do so. Market pressures, in tandem with government’s desire for housing availability, demand consideration of even marginal lands, such as those within floodplains, for development. Increasingly, planning for a flood-resilient Jakarta is complicated by a number of factors: the city is highly urbanized and land-use data is limited; flood management is technically complex, creating potential barriers to engagement for both decision-makers and the public; inherent uncertainty pervades the modelling efforts that are central to management; and risk and liability for infrastructure investments are unclear. These obstacles require localized watershed-level participatory planning to address risks of flooding where possible and reduce the likelihood that informal settlements occur in areas of extreme risk. This paper presents a preliminary scoping study for determination of an effective participatory planning method to encourage more resilient development. First, the scoping study provides background relevant to the challenges faced in planning for contemporary Jakarta. Second, the study examines the current use of decision-support tools, such as Geographic Information Systems (GIS), in planning for Jakarta. Existing capacity in the use of GIS allows for consideration of an emerging method of community consultation - Multi-Criteria Decision-Making (MCDM) support systems infused with geospatial information - to aid in engagement with the public and improve decision-making outcomes. While these methods have been used in Australia to promote stakeholder engagement in urban intensification, the planned research will be an early introduction of the method to Indonesia. As a consequence of this intervention, it is expected that planning activities will result in a more resilient city, capable of engaging with disaster risk management in a more effective manner.
Abstract:
The importance of applying unsaturated soil mechanics to geotechnical engineering design has been well understood. However, the consumption of time and the necessity for a specific laboratory testing apparatus when measuring unsaturated soil properties have limited the application of unsaturated soil mechanics theories in practice. Although methods for predicting unsaturated soil properties have been developed, the verification of these methods for a wide range of soil types is required in order to increase the confidence of practicing engineers in using these methods. In this study, a new permeameter was developed to measure the hydraulic conductivity of unsaturated soils using the steady-state method and directly measured suction (negative pore-water pressure) values. The apparatus is instrumented with two tensiometers for the direct measurement of suction during the tests. The apparatus can be used to obtain the hydraulic conductivity function of sandy soil over a low suction range (0-10 kPa). Firstly, the repeatability of the unsaturated hydraulic conductivity measurement, using the new permeameter, was verified by conducting tests on two identical sandy soil specimens and obtaining similar results. The hydraulic conductivity functions of the two sandy soils were then measured during the drying and wetting processes of the soils. A significant hysteresis was observed when the hydraulic conductivity was plotted against the suction. However, the hysteresis effects were not apparent when the conductivity was plotted against the volumetric water content. Furthermore, the measured unsaturated hydraulic conductivity functions were compared with predictions using three different predictive methods that are widely incorporated into numerical software. The results suggest that these predictive methods are capable of capturing the measured behavior with reasonable agreement.
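As an illustration of the kind of closed-form predictive method referred to above (the abstract does not name the three methods tested), the widely used van Genuchten-Mualem model predicts the unsaturated conductivity function from fitted water-retention parameters; all parameter values below are illustrative assumptions:

```python
def van_genuchten_mualem(suction_kpa, k_sat, alpha, n, l=0.5):
    """Predict unsaturated hydraulic conductivity k(psi) from the saturated
    value k_sat using the van Genuchten-Mualem closed form.

    alpha [1/kPa] and n [-] are parameters fitted to the soil-water
    retention curve; l is Mualem's pore-connectivity parameter.
    """
    m = 1.0 - 1.0 / n
    # effective saturation from the van Genuchten retention curve
    se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)
    # Mualem relative conductivity (1 at saturation, decreasing with suction)
    kr = se ** l * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
    return k_sat * kr

# e.g. sketch a conductivity function over the 0-10 kPa range of the tests
curve = [(psi, van_genuchten_mualem(psi, k_sat=1e-5, alpha=0.5, n=2.0))
         for psi in range(0, 11)]
```

Because suction enters through the retention curve, predictions of this type inherit the hysteresis of the retention data, consistent with the observation that hysteresis largely disappears when conductivity is plotted against water content.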
Abstract:
Energy prices are highly volatile and often feature unexpected spikes. It is the aim of this paper to examine whether the occurrence of these extreme price events displays any regularities that can be captured using an econometric model. Here we treat these price events as point processes and apply Hawkes and Poisson autoregressive models to model the dynamics in the intensity of this process. We use load and meteorological information to model the time variation in the intensity of the process. The models are applied to data from the Australian wholesale electricity market, and a forecasting exercise illustrates both the usefulness of these models and their limitations when attempting to forecast the occurrence of extreme price events.
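The self-exciting point-process idea can be illustrated by simulating a Hawkes process with an exponential kernel via Ogata's thinning algorithm; this toy sketch has a constant baseline intensity, whereas the paper's models let the intensity vary with load and meteorological covariates:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate event times of a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    via Ogata's thinning algorithm. Stationarity requires alpha < beta.

    Each accepted event raises the intensity, so events cluster -- the
    qualitative signature of spike clustering in electricity prices.
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < t_max:
        # between events the intensity only decays, so its current value
        # is a valid upper bound for the thinning step
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)       # candidate inter-event time
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)                # accept: an extreme-price event
    return events
```

Fitting rather than simulating such a model maximizes the point-process likelihood over (mu, alpha, beta); the simulation above is useful for checking an estimated model's clustering behaviour against the data.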
Abstract:
Packaged software is pre-built with the intention of licensing it to users in domestic settings and work organisations. This thesis focuses upon the work organisation, where packaged software has been characterised as one of the latest ‘solutions’ to the problems of information systems. The study investigates the packaged software selection process that has, to date, been largely viewed as objective and rational. In contrast, this interpretive study is based on a 2½-year field study of organisational experiences with packaged software selection at T.Co, a consultancy organisation based in the United Kingdom. Emerging from the iterative process of case study and action research is an alternative theory of packaged software selection. The research argues that packaged software selection is far from the rationalistic and linear process that previous studies suggest. Instead, the study finds that aspects of the traditional process of selection incorporating the activities of gathering requirements, evaluation and selection based on ‘best fit’ may or may not take place. Furthermore, even where these aspects occur they may not have equal weight or impact upon implementation and usage as may be expected. This is due to the influence of those multiple realities which originate from the organisational and market environments within which packages are created, selected and used, the lack of homogeneity in organisational contexts and the variously interpreted characteristics of the package in question.
Abstract:
This paper aims to contribute to an understanding of what actually takes place during consulting engagements. It draws on data collected from a qualitative case study of eight engagements by a niche consultancy in Australia to describe how consultants actively engage boundary crossing processes to address knowledge boundaries encountered during formal interactions with clients. While consultants actively managed knowledge boundary processes during interactions, by applying techniques such as evoking an ‘ideal state’ for clients, the engagements also yielded many missed opportunities for knowledge transformation.
Abstract:
A large subsurface, elevated temperature anomaly is well documented in Central Australia. High Heat Producing Granites (HHPGs) intersected by drilling at Innamincka are often assumed to be the dominant cause of the elevated subsurface temperatures, although their presence in other parts of the temperature anomaly has not been confirmed. Geological controls on the temperature anomaly remain poorly understood. Additionally, methods previously used to predict temperature at 5 km depth in this area are simplistic and possibly do not give an accurate representation of the true distribution and magnitude of the temperature anomaly. Here we re-evaluate the geological controls on geothermal potential in the Queensland part of the temperature anomaly using a stochastic thermal model. The results illustrate that the temperature distribution is most sensitive to the thermal conductivity structure of the top 5 km. Furthermore, the results indicate the presence of silicic crust enriched in heat-producing elements between and 40 km.
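The stochastic-thermal-model idea can be sketched as Monte Carlo propagation of uncertain thermal properties through a steady-state conductive geotherm; the parameter ranges and distributions below are illustrative assumptions, not the study's calibrated priors:

```python
import random

def geotherm_5km(surface_t, q_surface, k, heat_prod, z=5000.0):
    """Steady-state conductive temperature [deg C] at depth z [m] for
    uniform thermal conductivity k [W/m/K], uniform radiogenic heat
    production A [W/m^3] and surface heat flow q_surface [W/m^2]:
        T(z) = T0 + (q0 / k) * z - (A / (2 k)) * z^2
    """
    return surface_t + (q_surface / k) * z - heat_prod / (2.0 * k) * z * z

def monte_carlo_t5(n=10000, seed=1):
    """Propagate uncertainty in k, A and q0 to temperature at 5 km depth,
    returning (median, ~5th percentile, ~95th percentile)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        k = rng.uniform(2.0, 4.0)      # W/m/K: assumed conductivity range
        a = rng.uniform(1e-6, 5e-6)    # W/m^3: spans high-heat-producing rocks
        q0 = rng.gauss(0.09, 0.01)     # W/m^2: assumed surface heat flow
        samples.append(geotherm_5km(20.0, q0, k, a))
    samples.sort()
    return samples[n // 2], samples[n // 20], samples[-(n // 20)]
```

Because k appears in the denominator of both terms, the spread of the resulting temperature ensemble is dominated by the conductivity uncertainty, mirroring the sensitivity finding in the abstract.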
Abstract:
Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. Another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), we show to be a fundamentally different approach to ABC II. We devise new theoretical results for pBIL to give extra insights into its behaviour and also its differences with ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and two substantive applications. The first of the substantive examples involves performing inference for complex quantile distributions based on simulated data, while the second is for estimating the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host based on real data. We create a novel framework called Bayesian indirect likelihood (BIL) which encompasses pBII as well as general ABC methods so that the connections between the methods can be established.
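The ABC II idea, using auxiliary-model parameter estimates as summary statistics inside rejection ABC, can be sketched on a toy problem; the exponential "generative" model and Gaussian auxiliary model below are illustrative choices, not the applications from the paper:

```python
import random
import statistics

def auxiliary_summary(data):
    """Auxiliary-model summaries: Gaussian MLEs (mean, std), in the spirit
    of forming the ABC summary statistic from the auxiliary model."""
    return statistics.fmean(data), statistics.pstdev(data)

def abc_ii_rejection(observed, simulate, prior_sample, n_draws=5000,
                     keep_frac=0.01, seed=0):
    """Rejection ABC where the distance is computed between auxiliary-model
    summaries of the observed and simulated data (a minimal ABC II sketch).
    Returns the accepted parameter draws (approximate posterior sample).
    """
    rng = random.Random(seed)
    s_obs = auxiliary_summary(observed)
    draws = []
    for _ in range(n_draws):
        theta = prior_sample(rng)                       # draw from prior
        s_sim = auxiliary_summary(simulate(theta, rng, len(observed)))
        dist = sum((a - b) ** 2 for a, b in zip(s_obs, s_sim)) ** 0.5
        draws.append((dist, theta))
    draws.sort(key=lambda p: p[0])                      # keep closest draws
    return [theta for _, theta in draws[: int(n_draws * keep_frac)]]
```

As a usage sketch, treating an exponential model with rate theta as the "intractable" generator, `abc_ii_rejection` recovers a posterior sample concentrated near the true rate; pBIL differs in that the auxiliary likelihood itself, not a distance between summaries, weights the draws.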