Abstract:
We construct a two-scale mathematical model for modern, high-rate LiFePO4 cathodes. We attempt to validate it against experimental data using two forms of the phase-field model developed recently to represent the concentration of Li+ in nano-sized LiFePO4 crystals, and we also compare it with the shrinking-core based model we developed previously. Validating against high-rate experimental data, in which electronic and electrolytic resistances have been reduced, is an excellent test of the validity of the crystal-scale model used to represent the phase change that may occur in LiFePO4 material. We obtain poor fits with the shrinking-core based model, even with fitting based on “effective” parameter values. Surprisingly, using the more sophisticated phase-field models on the crystal scale results in poorer fits, though a significant parameter regime could not be investigated due to numerical difficulties. Separately from the fits obtained, using phase-field based models embedded in a two-scale cathodic model results in “many-particle” effects consistent with those reported recently.
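To make the crystal-scale modelling concrete, the sketch below shows the kind of conserved phase-field (Cahn-Hilliard) dynamics commonly used to describe phase separation of Li within a single crystal. It is a minimal illustration only: the regular-solution free energy and all parameter values (omega, kappa, mobility, grid and step sizes) are assumptions for the example, not values fitted or used in the paper.

```python
# Minimal 1D Cahn-Hilliard sketch of Li phase separation in a single crystal.
# All parameters are illustrative assumptions, not values from the paper.
import numpy as np

N, dx, dt = 128, 1.0, 0.005
omega, kappa, M = 2.5, 1.0, 1.0  # regular-solution parameter, gradient penalty, mobility

rng = np.random.default_rng(0)
c = 0.5 + 0.05 * (rng.random(N) - 0.5)  # near-uniform initial Li filling fraction

def lap(f):
    # discrete Laplacian with periodic boundaries
    return (np.roll(f, 1) + np.roll(f, -1) - 2.0 * f) / dx**2

for _ in range(20000):
    # chemical potential mu = f'(c) - kappa * lap(c), regular-solution free energy
    mu = np.log(c / (1.0 - c)) + omega * (1.0 - 2.0 * c) - kappa * lap(c)
    c += dt * M * lap(mu)             # conserved (Cahn-Hilliard) dynamics
    c = np.clip(c, 1e-6, 1.0 - 1e-6)  # keep the log term well defined

print("fraction of Li-rich phase:", float((c > 0.5).mean()))
```

Starting from a nearly uniform filling fraction inside the spinodal region, the crystal separates into Li-rich and Li-poor domains, which is the qualitative behaviour the shrinking-core picture approximates more crudely.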
Abstract:
In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, in which we develop simulation-based optimal design methods to search over both continuous and discrete design spaces. Although Bayesian inference has commonly been performed on nonlinear mixed effects models, there is a lack of research into performing Bayesian optimal design for nonlinear mixed effects models that require searches to be performed over several design variables. This is likely due to the fact that it is much more computationally intensive to perform optimal experimental design for nonlinear mixed effects models than it is to perform inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated, subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to be substantially different between the examples considered in this work, which highlights the fact that the designs are rather problem-dependent and require optimisation using the methods presented in this paper.
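As a toy illustration of the simulation-based design search described above (not the paper's pharmacokinetic model or utility), the sketch below scores candidate (subjects, samples-per-subject) designs under a fixed total-sample budget by a crude Monte Carlo estimate of estimation precision. The exponential response model, lognormal prior, and precision-based utility are all assumptions made for the example.

```python
# Toy simulation-based Bayesian design: score each candidate design by a
# Monte Carlo estimate of expected utility. Model, prior, and utility are
# illustrative stand-ins, not those used in the paper.
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_subj, times, theta):
    # exponential-decay response with lognormal between-subject random effects
    k_i = theta * np.exp(0.2 * rng.standard_normal(n_subj))[:, None]
    return np.exp(-k_i * times) + 0.05 * rng.standard_normal((n_subj, len(times)))

def utility(n_subj, times, n_prior_draws=150):
    grid = np.linspace(0.1, 2.0, 60)
    errors = []
    for _ in range(n_prior_draws):
        theta = rng.lognormal(mean=0.0, sigma=0.3)
        y = simulate(n_subj, times, theta)
        # crude grid posterior for theta, ignoring the random effects
        # (a deliberate simplification for this toy example)
        loglik = np.array([-np.sum((y - np.exp(-g * times)) ** 2) / (2 * 0.05**2)
                           for g in grid])
        w = np.exp(loglik - loglik.max())
        w /= w.sum()
        errors.append((grid * w).sum() - theta)  # posterior-mean estimation error
    return -np.var(errors)  # higher utility = more precise estimation

budget = 40                                          # total samples (cost constraint)
designs = [(n, budget // n) for n in (4, 5, 8, 10)]  # (subjects, samples per subject)
scores = {d: utility(d[0], np.linspace(0.5, 8, d[1])) for d in designs}
print("best design:", max(scores, key=scores.get))
```

Because the utility is re-estimated per design, the same loop searches discrete variables (numbers of subjects and samples) and, with an outer optimiser over the sampling times, continuous ones as well.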
Abstract:
Most existing motorway traffic safety studies using disaggregate traffic flow data aim at developing models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. A serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: selected non-crash data might not be comparable with the pre-crash data, the non-crash/pre-crash ratio is arbitrarily decided and neglects the abundance of non-crash over pre-crash conditions, and so on. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Thereafter, pre-crash data are classified into regimes to match with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) with sufficient pre-crash data were developed. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can choose MyTRIM’s memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors in differentiating pre-crash and non-crash conditions are identified and can be used to develop preventive measures. MyTRIM can be used by practitioners in real time as an independent tool for online decision making, or integrated with existing traffic management systems.
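The memory mechanism can be pictured with a small sketch. The flag stream is synthetic and the all-flags-agree alarm rule is an assumption for illustration, not MyTRIM's actual decision logic: an alarm is raised only when the last `memory` regime-based risk flags all indicate risk, so shrinking the memory makes the system react faster but also pass more noise, moving both detection and false alarm rates in the same direction as reported above.

```python
# Hedged sketch of a memory-based alarm rule over a stream of binary risk
# flags emitted by a regime-based risk model. Synthetic data only.
from collections import deque
import random

def alarms(flags, memory):
    window = deque(maxlen=memory)
    out = []
    for f in flags:
        window.append(f)
        # alarm only if the window is full and every flag in it is risky
        out.append(len(window) == memory and all(window))
    return out

random.seed(1)
flags = [random.random() < 0.05 for _ in range(10000)]  # noisy synthetic risk flags
for m in (1, 3, 5):
    rate = sum(alarms(flags, m)) / len(flags)
    print(f"memory={m}: alarm rate {rate:.4%}")
```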
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
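As a hedged illustration of the log-splitting step (the attribute names and the simple split below are invented for the example, not the paper's algorithm): once an event attribute is found to identify subprocess instances, the flat log can be partitioned into a parent log and per-instance subprocess traces, and each part mined separately with a flat discovery technique.

```python
# Illustrative split of a flat event log into parent and subprocess logs,
# keyed on an attribute ("order_item", invented here) that identifies
# subprocess instances.
events = [
    {"case": "c1", "activity": "Receive order", "order_item": None},
    {"case": "c1", "activity": "Pick item",     "order_item": "i1"},
    {"case": "c1", "activity": "Pack item",     "order_item": "i1"},
    {"case": "c1", "activity": "Pick item",     "order_item": "i2"},
    {"case": "c1", "activity": "Ship order",    "order_item": None},
]

parent_log = [e for e in events if e["order_item"] is None]
sub_log = {}
for e in events:
    if e["order_item"] is not None:
        # subprocess instance id = (parent case, item); each gets its own trace
        sub_log.setdefault((e["case"], e["order_item"]), []).append(e["activity"])

print(len(parent_log), "parent events;", len(sub_log), "subprocess traces")
```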
Abstract:
Invasion of extracellular matrices is crucial to a number of physiological and pathophysiological states, including tumor cell metastasis, arthritis, embryo implantation, wound healing, and early development. To isolate invasion from the additional complexities of these scenarios, a number of in vitro invasion assays have been developed over the years. Early studies employed intact tissues, like denuded amniotic membrane (1) or embryonic chick heart fragments (2); more recently, however, purified matrix components or complex matrix extracts have been used to provide more uniform and often more rapid analyses (for examples, see the following integrin studies). Of course, the more holistic view of invasion offered by the earlier assays is valuable and cannot be fully reproduced in these more rapid assays, but advantages in reproducibility among replicates, ease of preparation and analysis, and overall high throughput favor the newer assays. In this chapter, we focus on providing detailed protocols for Matrigel-based assays (Matrigel = reconstituted basement membrane; reviewed in ref. (3)). Matrigel is an extract from the transplantable Engelbreth-Holm-Swarm murine sarcoma, which deposits a multilamellar basement membrane. Matrigel is available commercially (Becton Dickinson, Bedford, MA) and can be manipulated as a liquid at 4°C into a variety of different formats. Alternatively, cell culture inserts precoated with Matrigel can be purchased for even greater simplicity.
Abstract:
Water management is vital for mine sites, both for production and for sustainability-related issues. Effective water management is a complex task, since the role of water on mine sites is multifaceted. Computer models are tools that represent mine site water interactions and can be used by mine sites to inform or evaluate their water management strategies. Several types of models can be used to represent mine site water interactions. This paper presents three such models: an operational model, an aggregated systems model and a generic systems model. For each model the paper provides a description and example, followed by an analysis of its advantages and disadvantages. The paper hypothesises that, since no model is optimal for all situations, each model should be applied in the situations where it is most appropriate, based upon the scale of water interactions being investigated: unit (operational), inter-site (aggregated systems) or intra-site (generic systems).
Abstract:
There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and assume different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. Based thereon, we identify specific research questions that guide the design of a framework for model alignment.
Abstract:
This thesis investigates the use of building information models for access control and security applications in critical infrastructures and complex building environments. It examines current problems in security management for physical and logical access control and proposes novel solutions that exploit the detailed information available in building information models. The project was carried out as part of the Airports of the Future Project, and the research was grounded in real-world problems identified in collaboration with our industry partners in the project.
Abstract:
Western economies are highly dependent on service innovation for their growth and employment. An important driver of economic growth is, therefore, the development of new, innovative services such as electronic services, mobile end-user services, and new financial or personalized services. Service innovation brings together four trends that currently shape western economies: the growing importance of services, the need for innovation, changes in consumer and business markets, and advancements in information and communication technology (ICT).
Abstract:
This paper develops a semiparametric estimation approach for mixed count regression models based on series expansion for the unknown density of the unobserved heterogeneity. We use the generalized Laguerre series expansion around a gamma baseline density to model unobserved heterogeneity in a Poisson mixture model. We establish the consistency of the estimator and present a computational strategy to implement the proposed estimation techniques in the standard count model as well as in truncated, censored, and zero-inflated count regression models. Monte Carlo evidence shows that the finite sample behavior of the estimator is quite good. The paper applies the method to a model of individual shopping behavior.
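One common form of such a mixture and expansion, written here as a generic sketch (the paper's exact normalisation and parameterisation may differ), is

$$
P(Y_i = y \mid x_i) = \int_0^\infty \frac{(v\mu_i)^{y}\, e^{-v\mu_i}}{y!}\, h(v)\, dv, \qquad \mu_i = \exp(x_i^\top \beta),
$$

where the heterogeneity density $h(v)$ is approximated by a squared generalized Laguerre expansion of order $K$ around a gamma baseline $g(v)$:

$$
h_K(v) \;\propto\; g(v)\left(\sum_{k=0}^{K} a_k\, L_k^{(\alpha)}(v)\right)^{2}.
$$

Squaring the series keeps the approximating density non-negative, and letting $K$ grow gives the semiparametric flexibility while the gamma baseline ($K = 0$) recovers the familiar negative binomial mixture.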
Abstract:
Objective: To examine the effects of personal and community characteristics, specifically race and rurality, on lengths of state psychiatric hospital and community stays using maximum likelihood survival analysis, with a special emphasis on change over a ten-year period. Data Sources: We used the administrative data of the Virginia Department of Mental Health, Mental Retardation, and Substance Abuse Services (DMHMRSAS) from 1982-1991 and the Area Resource File (ARF). Given these two sources, we constructed a history file for each individual who entered the state psychiatric system over the ten-year period. Histories included demographic, treatment, and community characteristics. Study Design: We used a longitudinal, population-based design with maximum likelihood estimation of survival models. We present a random effects model with unobserved heterogeneity that is independent of the observed covariates. The key dependent variables were length of inpatient stay and subsequent length of community stay. Explanatory variables measured personal, diagnostic, and community characteristics, as well as controls for calendar time. Data Collection: This study used secondary, administrative, and health planning data. Principal Findings: African-American clients leave the community more quickly than whites. After controlling for other characteristics, however, race does not affect hospital length of stay. Rurality does not affect length of community stay once other personal and community characteristics are controlled for. However, people from rural areas have longer hospital stays, even after controlling for personal and community characteristics. The effects of time are significantly smaller than expected. Diagnostic composition effects and a decrease in the rate of first inpatient admissions explain part of this reduced impact of time. We also find strong evidence for the existence of unobserved heterogeneity in both types of stays and adjust for this in our final models. Conclusions: Our results show that information on client characteristics available from inpatient stay records is useful in predicting not only the length of inpatient stay but also the length of the subsequent community stay. This information can be used to target increased discharge planning for those at risk of more rapid readmission to inpatient care. Correlation across observed and unobserved factors affecting length of stay has significant effects on the measurement of relationships between individual factors and lengths of stay. Thus, it is important to control for both observed and unobserved factors in estimation.
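A generic random-effects (frailty) survival specification of the kind described, written as a sketch rather than the study's exact model, is

$$
\lambda_{ij}(t) = \lambda_0(t)\, \exp\!\left(x_{ij}^\top \beta + u_i\right), \qquad u_i \sim F_u \ \text{independent of } x_{ij},
$$

where $\lambda_{ij}(t)$ is the hazard of ending stay $j$ for individual $i$, $x_{ij}$ collects personal, diagnostic, community, and calendar-time covariates, and the unobserved heterogeneity $u_i$ is integrated out of the likelihood at estimation time.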
Abstract:
This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, together with the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple, tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
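A small sketch of the mapping step follows, with an invented pure death process and a deliberately simple auxiliary statistic, not the paper's models: simulate from the generative model across a range of parameter values, compute the auxiliary fit for each simulated data set, and regress to estimate the generative-to-auxiliary map.

```python
# Hedged sketch of the indirect-inference mapping step: pairs of
# (generative parameter, auxiliary statistic) are built by simulation and
# a smooth map is fitted by regression. Models are illustrative only.
import numpy as np

rng = np.random.default_rng(2)

def simulate_death_process(b, n0=50, t=1.0):
    # pure death process: each of n0 individuals survives time t w.p. exp(-b*t)
    return rng.binomial(n0, np.exp(-b * t))

# build (generative parameter, auxiliary statistic) pairs by simulation
bs = np.linspace(0.1, 2.0, 40)
aux = np.array([np.mean([simulate_death_process(b) for _ in range(200)]) for b in bs])

# fit a simple polynomial map from generative to auxiliary parameters
coef = np.polyfit(bs, aux, deg=3)
print("estimated auxiliary value at b=0.7:", np.polyval(coef, 0.7))
```

With such a map in hand, the tractable auxiliary likelihood can stand in for the intractable generative one when forming the II posterior used for utility estimation.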
Abstract:
During the early design stages of construction projects, accurate and timely cost feedback is critical to design decision making. This is particularly challenging for cost estimators, as they must quickly and accurately estimate the cost of the building while the design is still incomplete and evolving. State-of-the-art software tools typically use a rule-based approach to generate detailed quantities from the design details present in a building model and relate them to the cost items in a cost estimating database. In this paper, we propose a generic approach for creating and maintaining a cost estimate using flexible mappings between a building model and a cost estimate. The approach uses queries on the building design to populate views, and each view is then associated with one or more cost items. The benefit of this approach is that the flexibility of modern query languages allows the estimator to encode a broad variety of relationships between the design and the estimate. It also avoids the need for a common standard to which both designers and estimators must conform, giving the estimator added flexibility and functionality.
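A minimal sketch of the flexible mapping follows; the element fields, query predicates, and cost rates are all invented for illustration rather than drawn from the paper. Each named view is populated by a query over building-model elements, and each view feeds quantities into one or more cost items.

```python
# Illustrative view/cost-item mapping: a "view" is a query (predicate)
# over building-model elements; each view is linked to cost items whose
# quantities come from the view. All data here is invented.
elements = [
    {"type": "wall", "material": "concrete", "area_m2": 120.0},
    {"type": "wall", "material": "brick",    "area_m2": 80.0},
    {"type": "slab", "material": "concrete", "area_m2": 200.0},
]

views = {
    # view name -> query predicate over elements
    "concrete_walls": lambda e: e["type"] == "wall" and e["material"] == "concrete",
}

cost_items = {
    # view name -> (description, unit rate per m2)
    "concrete_walls": ("Formwork and pour, walls", 95.0),
}

for name, query in views.items():
    qty = sum(e["area_m2"] for e in elements if query(e))
    desc, rate = cost_items[name]
    print(f"{desc}: {qty} m2 x ${rate}/m2 = ${qty * rate:,.2f}")
```

Because the mapping lives in the queries rather than in a fixed classification, the estimator can re-run the estimate as the design evolves without forcing designers onto a shared standard.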
Abstract:
This paper investigates compressed sensing using hidden Markov models (HMMs) and hence extends recent single-frame, bounded-error sparse decoding problems into a class of sparse estimation problems containing both temporal evolution and stochastic aspects. The paper presents two optimal estimators for compressed HMMs. The impact of measurement compression on HMM filtering performance is experimentally examined in the context of an important image-based aircraft target tracking application. Surprisingly, tracking of dim, small-sized targets (as small as 5-10 pixels, with local detectability/SNR as low as −1.05 dB) was only mildly impacted by compressed sensing down to 15% of the original image size.
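A hedged sketch of HMM filtering with compressed measurements follows; the model sizes, Gaussian likelihood, and isotropic treatment of the projected noise are simplifying assumptions for illustration, not the paper's estimators. The standard forward recursion is simply applied to low-dimensional random projections of the raw measurement.

```python
# Toy HMM filter on compressed measurements y = Phi @ x. Sizes, noise level,
# and the isotropic Gaussian likelihood are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_states, n_meas, n_comp = 4, 32, 5   # hidden states, raw and compressed dims

A = np.full((n_states, n_states), 0.1 / (n_states - 1))
np.fill_diagonal(A, 0.9)              # sticky transition matrix, rows sum to 1
templates = rng.standard_normal((n_states, n_meas))  # mean measurement per state
Phi = rng.standard_normal((n_comp, n_meas)) / np.sqrt(n_comp)  # compression matrix
sigma = 0.3
sigma_c = sigma * np.sqrt(n_meas / n_comp)  # approximate noise scale after projection

def filter_step(belief, y):
    pred = A.T @ belief               # time update through the Markov chain
    # measurement update, treating projected noise as isotropic for simplicity
    lik = np.exp(-np.sum((y - templates @ Phi.T) ** 2, axis=1) / (2 * sigma_c**2))
    post = pred * lik
    return post / post.sum()

belief, state, hits = np.full(n_states, 1 / n_states), 0, 0
for _ in range(200):
    state = rng.choice(n_states, p=A[state])
    y = Phi @ (templates[state] + sigma * rng.standard_normal(n_meas))
    belief = filter_step(belief, y)
    hits += int(belief.argmax() == state)
print("filtered state accuracy:", hits / 200)
```

Even at a compression ratio of roughly 15% (here 5 of 32 dimensions), the filter usually tracks the hidden state well, which mirrors the direction of the experimental finding quoted above.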
Abstract:
Recent natural disasters, such as the Haiti earthquake in 2010, the South East Queensland floods in 2011, the Japanese earthquake and tsunami in 2011 and Hurricane Sandy in the United States of America in 2012, have seen social media platforms change the face of emergency management communications, not only in times of crisis but also during business-as-usual operations. With social media being such an important and powerful communication tool, especially for emergency management organisations, the question arises as to whether the use of social media in these organisations emerged by considered strategic design or more as a reactive response to a new and popular communication channel. This paper provides insight into how the social media function has been positioned, staffed and managed in organisations throughout the world, with a particular focus on how these factors influence the style of communication used on social media platforms. This study has identified that the social media function falls on a continuum between two polarised models, namely the authoritative one-way communication approach and the more interactive approach that seeks to network and engage with the community through multi-way communication. Factors such as the size of the organisation, dedicated resourcing of the social media function, organisational culture and mission statement, the presence of a social media champion within the organisation, and management style and knowledge about social media play a key role in determining where on the continuum organisations sit in relation to their social media capability. This review, together with a forthcoming survey of Australian emergency management organisations and local governments, will fill a gap in the current body of knowledge about the evolution, positioning and usage of social media in organisations working in the emergency management field in Australia. These findings will be fed back to industry for potential inclusion in future strategies and practices.