331 results for Markov Modeling
Abstract:
We have developed a bioreactor vessel design that has the advantages of simplicity and ease of assembly and disassembly and, with an appropriately determined flow rate, even allows a scaffold to be suspended freely regardless of its weight. This article reports our experimental and numerical investigations to evaluate the performance of a newly developed non-perfusion conical bioreactor by visualizing the flow through scaffolds with 45° and 90° fiber lay-down patterns. The experiments were conducted at Reynolds numbers (Re) of 121, 170, and 218, based on the local velocity and the width of the scaffolds. The flow fields were captured using short-time exposures of 60 µm particles suspended in the bioreactor and illuminated by a thin laser sheet. The effects of scaffold fiber lay-down pattern and Reynolds number were obtained and compared to results from a computational fluid dynamics (CFD) software package. The objectives of this article are twofold: first, to investigate the hypothesis that there may be an insufficient exchange of medium within the interior of the scaffold when using our non-perfusion bioreactor, and second, to compare the flows within and around scaffolds of 45° and 90° fiber lay-down patterns. Scaffold porosity was also found to influence flow patterns. It was thereby shown that fluidic transport could be achieved within scaffolds with our bioreactor design, despite its being a non-perfusion vessel. Fluid velocities were generally of the same order of magnitude as, or one order lower than, the inlet flow velocity. Additionally, the 90° fiber lay-down pattern scaffold was found to allow slightly higher fluid velocities within, compared to the 45° fiber lay-down pattern scaffold. This was due to the architecture and pore arrangement of the 90° fiber lay-down pattern scaffold, which allows fluid to flow directly through (channel-like flow).
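For orientation, the quoted Reynolds numbers follow from Re = U·w/ν. A minimal sketch, assuming a culture medium with roughly the kinematic viscosity of water at 37 °C and a hypothetical scaffold width (neither value is given in the abstract):

```python
# Reynolds number Re = U * w / nu for flow past a scaffold.
# Assumed values for illustration only; the abstract does not give the
# medium viscosity, scaffold width, or local velocities.
nu = 0.7e-6    # kinematic viscosity of water at ~37 C, m^2/s (assumed)
width = 5e-3   # scaffold width, m (hypothetical)

for re_target in (121, 170, 218):
    u = re_target * nu / width   # local velocity implied by the quoted Re
    print(f"Re = {re_target}: implied local velocity = {u*1000:.2f} mm/s")
```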
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts, that is, variation over and above that accounted for by the Poisson density. This extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, which is tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, exploring additional dispersion functions, using an independent data set, and presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major- and minor-road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e., the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
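The study's Gibbs-sampled models are not reproduced here, but the central idea, a negative binomial likelihood whose dispersion parameter is itself a log-linear function of covariates, can be sketched with a generic random-walk Metropolis sampler under flat priors. All data, variable names, and tuning constants below are illustrative stand-ins, not the paper's specification:

```python
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(0)

# Synthetic stand-in data: crash counts y with covariates X (e.g. log flows).
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8, -0.3])))

def loglik(beta, gamma):
    mu = np.exp(X @ beta)       # log-linear mean structure
    alpha = np.exp(X @ gamma)   # log-linear dispersion structure
    r = 1.0 / alpha             # NB "size": Var = mu + alpha * mu^2
    p = r / (r + mu)
    return nbinom.logpmf(y, r, p).sum()

# Random-walk Metropolis over (beta, gamma) with flat (non-informative) priors.
theta = np.zeros(2 * (k + 1))
cur = loglik(theta[:k + 1], theta[k + 1:])
draws = []
for it in range(20000):
    prop = theta + 0.02 * rng.normal(size=theta.size)
    cand = loglik(prop[:k + 1], prop[k + 1:])
    if np.log(rng.uniform()) < cand - cur:   # Metropolis accept/reject
        theta, cur = prop, cand
    if it >= 10000:                          # keep post-burn-in draws
        draws.append(theta.copy())

print("posterior means:", np.array(draws).mean(axis=0).round(3))
```

A significant covariate effect in the gamma (dispersion) block would correspond to the paper's finding that extra-variation depends on covariates when the mean is poorly specified.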
Abstract:
It is important to examine the nature of the relationships between roadway, environmental, and traffic factors and motor vehicle crashes, with the aim of improving the collective understanding of the causal mechanisms involved in crashes and better predicting their occurrence. Statistical models of motor vehicle crashes are one path of inquiry often used to gain these initial insights. Recent efforts have focused on the estimation of negative binomial and Poisson regression models (and related variants) due to their relatively good fit to crash data. Of course, analysts constantly seek methods that offer greater consistency with the data-generating mechanism (motor vehicle crashes in this case), provide better statistical fit, and provide insight into data structure that was previously unavailable. One such opportunity exists with some types of crash data, in particular crash-level data collected across roadway segments, intersections, etc. It is argued in this paper that some crash data possess hierarchical structure that has not routinely been exploited. This paper describes the application of binomial multilevel models of crash types using 548 motor vehicle crashes collected from 91 two-lane rural intersections in the state of Georgia. Crash prediction models are estimated for angle, rear-end, and sideswipe (both same-direction and opposite-direction) crashes. The contributions of the paper are the recognition of hierarchical data structure and the application of a theoretically appealing and suitable analysis approach for multilevel data, yielding insights into intersection-related crashes by crash type.
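As a hedged illustration of the multilevel idea (crash-level binary outcomes nested within intersections), the sketch below fits a random-intercept logistic model to synthetic stand-in data using statsmodels' Bayesian mixed GLM; the covariate, the outcome definition, and all numbers are hypothetical, not the paper's:

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)

# Synthetic stand-in: crashes nested within 91 intersections ("sites").
n_sites, crashes_per_site = 91, 6
site = np.repeat(np.arange(n_sites), crashes_per_site)
aadt = np.repeat(rng.normal(size=n_sites), crashes_per_site)       # site-level covariate
u = np.repeat(0.8 * rng.normal(size=n_sites), crashes_per_site)    # random intercepts
p = 1 / (1 + np.exp(-(-0.5 + 0.6 * aadt + u)))
df = pd.DataFrame({"angle": rng.binomial(1, p), "aadt": aadt, "site": site})

# Crash-level Bernoulli outcome (is this an angle crash?), with an
# intersection-level random intercept capturing the hierarchical structure.
model = BinomialBayesMixedGLM.from_formula(
    "angle ~ aadt", {"site": "0 + C(site)"}, df)
fit = model.fit_vb()   # variational Bayes fit
print(fit.summary())
```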
Abstract:
Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.
Abstract:
The Intermodal Surface Transportation Efficiency Act (ISTEA) of 1991 mandated the consideration of safety in the regional transportation planning process. As part of National Cooperative Highway Research Program Project 8-44, "Incorporating Safety into the Transportation Planning Process," we conducted a telephone survey to assess safety-related activities and expertise at Governors Highway Safety Associations (GHSAs), and GHSA relationships with metropolitan planning organizations (MPOs) and state departments of transportation (DOTs). The survey results were combined with statewide crash data to enable exploratory modeling of the relationship between GHSA policies and programs and statewide safety. The modeling objective was to illuminate current hurdles to ISTEA implementation, so that appropriate institutional, analytical, and personnel improvements can be made. The study revealed that coordination of transportation safety across DOTs, MPOs, GHSAs, and departments of public safety is generally beneficial to the implementation of safety. In addition, better coordination is characterized by more positive and constructive attitudes toward incorporating safety into planning.
Abstract:
Many studies focused on the development of crash prediction models have resulted in aggregate crash prediction models that quantify the safety effects of geometric, traffic, and environmental factors on the expected number of total, fatal, injury, and/or property damage crashes at specific locations. Crash prediction models focused on predicting different crash types, however, have rarely been developed. Crash type models are useful for at least three reasons. The first is motivated by the need to identify sites that are high risk with respect to specific crash types but that may not be revealed through crash totals. Second, countermeasures are likely to affect only a subset of all crashes, usually called target crashes, so examination of crash types will improve the ability to identify effective countermeasures. Finally, there is a priori reason to believe that different crash types (e.g., rear-end, angle, etc.) are associated with road geometry, the environment, and traffic variables in different ways, justifying the estimation of individual predictive models. The objectives of this paper are to (1) demonstrate that different crash types are associated with predictor variables in different ways (as theorized) and (2) show that estimation of crash type models may lead to greater insights regarding crash occurrence and countermeasure effectiveness. This paper first describes the estimation results of crash prediction models for angle, head-on, rear-end, sideswipe (same-direction and opposite-direction), and pedestrian-involved crash types. Serving as a basis for comparison, a crash prediction model is also estimated for total crashes. Based on 837 motor vehicle crashes collected at two-lane rural intersections in the state of Georgia, six prediction models are estimated, resulting in two Poisson models and four negative binomial (NB) models. The analysis reveals that factors such as annual average daily traffic, the presence of turning lanes, and the number of driveways have a positive association with each type of crash, whereas median widths and the presence of lighting are negatively associated. For the best-fitting models, covariates are related to crash types in different ways, suggesting that crash types are associated with different precrash conditions and that modeling total crash frequency may not be helpful for identifying specific countermeasures.
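A minimal sketch of the per-crash-type modeling strategy: separate count regressions for each crash type rather than one total-crash model. Synthetic data, hypothetical covariates, and plain Poisson GLMs stand in for the paper's Poisson/NB specifications (the NB analogue would swap in sm.families.NegativeBinomial()):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic stand-in: per-intersection counts for two crash types with
# hypothetical covariates (log AADT, number of driveways).
n = 150
X = pd.DataFrame({"log_aadt": rng.normal(8, 1, n),
                  "driveways": rng.poisson(2, n)})
Xd = sm.add_constant(X)
counts = {
    "angle":    rng.poisson(np.exp(-6 + 0.7 * X.log_aadt + 0.10 * X.driveways)),
    "rear_end": rng.poisson(np.exp(-7 + 0.8 * X.log_aadt + 0.05 * X.driveways)),
}

# One count model per crash type, so each type gets its own coefficients.
for crash_type, y in counts.items():
    fit = sm.GLM(y, Xd, family=sm.families.Poisson()).fit()
    print(crash_type, fit.params.round(3).to_dict())
```

Comparing the fitted coefficients across crash types is the operational version of the paper's claim that covariates relate to different crash types in different ways.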
Abstract:
An electrified railway system includes complex interconnections and interactions among several subsystems. Computer simulation is the only viable means for system evaluation and analysis. This paper discusses the difficulties and requirements of effective simulation models for this specialized industrial application, and the development of a general-purpose multi-train simulator.
Abstract:
The track allocation problem (TAP) at a multi-track, multi-platform mainline railway station is defined by the station track layout and the service timetable, which together imply combinations of spatial and temporal conflicts. Feasible solutions are available from either traditional planning or advanced intelligent searching methods, and their evaluation with respect to operational requirements is essential for operators. To facilitate thorough analysis, a timed Coloured Petri Net (CPN) model is presented here to encapsulate the inter-relationships of the spatial and temporal constraints in the TAP.
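The paper's timed CPN is not reproduced here; the following sketch implements only the plain (uncoloured, untimed) Petri net firing rule, with hypothetical place and transition names, to make the token/place/transition machinery concrete:

```python
# Minimal Petri net: places hold tokens, and a transition fires when every
# input place has a token, consuming inputs and producing outputs. The
# paper's model is a *timed, coloured* PN whose tokens carry train/track
# data, which is richer than this plain sketch.
marking = {"train_at_approach": 1, "platform_2_free": 1, "train_on_platform_2": 0}

transitions = {
    "route_to_platform_2": {
        "inputs": ["train_at_approach", "platform_2_free"],
        "outputs": ["train_on_platform_2"],
    },
}

def enabled(t):
    return all(marking[p] > 0 for p in transitions[t]["inputs"])

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    for p in transitions[t]["inputs"]:
        marking[p] -= 1
    for p in transitions[t]["outputs"]:
        marking[p] += 1

fire("route_to_platform_2")   # platform token consumed: spatial conflict excluded
print(marking)
```

Consuming the platform's token while the train occupies it is how such a net encodes a spatial conflict; the timed extension attaches timestamps to tokens to capture the temporal conflicts as well.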
Abstract:
Popular wireless network standards, such as IEEE 802.11/15/16, are increasingly adopted in real-time control systems. However, they are not designed for real-time applications. Therefore, the performance of such wireless networks needs to be carefully evaluated before the systems are implemented and deployed. While efforts have been made to model general wireless networks with completely random traffic generation, there is a lack of theoretical investigations into the modelling of wireless networks with periodic real-time traffic. Considering the widely used IEEE 802.11 standard, with the focus on its distributed coordination function (DCF), for soft-real-time control applications, this paper develops an analytical Markov model to quantitatively evaluate the network quality-of-service (QoS) performance in periodic real-time traffic environments. Performance indices to be evaluated include throughput capacity, transmission delay and packet loss ratio, which are crucial for real-time QoS guarantee in real-time control applications. They are derived under the critical real-time traffic condition, which is formally defined in this paper to characterize the marginal satisfaction of real-time performance constraints.
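The paper's periodic-traffic model is not reproduced here; as a baseline, the sketch below solves the fixed point of the classic Bianchi (2000) saturation Markov model of the DCF, the standard starting point for this style of analysis. The contention window and backoff-stage values are illustrative defaults:

```python
# Fixed-point solution of the classic Bianchi (2000) saturation Markov
# model for IEEE 802.11 DCF: per-slot transmission probability tau and
# conditional collision probability p. The paper above extends this style
# of analysis to periodic real-time traffic; this is only the textbook
# saturated-traffic baseline, not the paper's model.
def bianchi_fixed_point(n_stations, W=32, m=5, iters=10000, damping=0.5):
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n_stations - 1)
        tau_new = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m))
        tau += damping * (tau_new - tau)   # damped iteration for stability
    return tau, p

for n in (5, 10, 20):
    tau, p = bianchi_fixed_point(n)
    print(f"n={n:2d}: tau={tau:.4f}, collision prob p={p:.4f}")
```

Throughput, delay, and loss indices of the kind the paper derives are then computed from tau and p together with the slot-time and payload parameters of the PHY/MAC.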
Abstract:
A novel model for the potentiostatic discharge of primary alkaline battery cathodes is presented. The model is used to simulate discharges resulting from the stepped potential electrochemical spectroscopy (SPECS) of primary alkaline battery cathodes, and the results are validated with experimental data. We show that a model based on a single (or mean) reaction framework can be used to simulate multi-reaction discharge behaviour, and we develop a consistent functional modification to the kinetic equation of the model that allows this to occur. The model is used to investigate the effects that the initial exchange current density, i_00, and the diffusion coefficient for protons in electrolytic manganese dioxide (EMD), D_H+, have on SPECS discharge. The behaviour observed is consistent with the idea that individual reduction reactions, within the multi-reaction reduction behaviour of EMD, have distinct i_00 and D_H+ values.
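As a toy illustration of how D_H+ shapes a potentiostatic transient (and nothing more; this is not the paper's single-reaction SPECS model), the sketch below solves 1-D proton diffusion with an explicit finite-difference scheme after a surface concentration step, with assumed parameter values:

```python
import numpy as np

# Illustrative potentiostatic transient from 1-D proton diffusion in a slab
# of EMD, with the surface concentration stepped to zero at t = 0 (a
# Cottrell-type toy problem). This only shows how the proton diffusion
# coefficient D shapes the current decay.
D = 1e-13          # proton diffusion coefficient in EMD, m^2/s (assumed)
L = 1e-6           # particle length scale, m (assumed)
nx, nt = 100, 200000
dx = L / nx
dt = 0.4 * dx**2 / D                 # within the explicit-scheme stability limit
c = np.ones(nx)                      # normalized proton concentration
c[0] = 0.0                           # surface concentration step at t = 0

for step in range(1, nt + 1):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[-1] = c[-2]                    # no-flux condition at the far boundary
    if step % 50000 == 0:
        flux = D * (c[1] - c[0]) / dx    # current is proportional to this flux
        print(f"t = {step*dt:8.1f} s, surface flux ~ {flux:.3e}")
```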
Abstract:
This paper presents a novel method for remaining useful life prediction using an Elliptical Basis Function (EBF) network and a Markov chain. The EBF structure is trained by a modified Expectation-Maximization (EM) algorithm in order to take the missing covariate set into account. No explicit extrapolation is needed for the internal covariates, while a Markov chain is constructed to represent the evolution of the external covariates in the study. The estimated external and the unknown internal covariates constitute an incomplete covariate set, which is then used and analyzed by the EBF network to provide survival information for the asset. It is shown in the case study that the method slightly underestimates the remaining useful life of an asset, which is a desirable result for early maintenance decisions and resource planning.
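A minimal sketch of the external-covariate Markov chain idea: a discrete chain with a hypothetical transition matrix, simulated forward and solved for its stationary distribution. The states and probabilities are invented for illustration and are not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative discrete Markov chain for an external covariate (e.g. an
# operating-load state); the transition matrix is hypothetical.
states = ["light", "normal", "heavy"]
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.25, 0.70]])   # rows sum to 1

def simulate(start, n_steps):
    path, s = [start], start
    for _ in range(n_steps):
        s = rng.choice(len(states), p=P[s])   # sample next state
        path.append(s)
    return [states[i] for i in path]

print(simulate(start=1, n_steps=10))

# Long-run distribution: leading left eigenvector of P, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
print("stationary distribution:", np.round(pi / pi.sum(), 3))
```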
Abstract:
We estimate the parameters of a stochastic process model for a macroparasite population within a host using approximate Bayesian computation (ABC). The immunity of the host is an unobserved model variable, and only mature macroparasites at sacrifice of the host are counted. With very limited data, process rates are inferred reasonably precisely. Modeling involves a three-variable Markov process for which the observed data likelihood is computationally intractable. ABC methods are particularly useful when the likelihood is analytically or computationally intractable. The ABC algorithm we present is based on sequential Monte Carlo, is adaptive in nature, and overcomes some drawbacks of previous approaches to ABC. The algorithm is validated on a test example involving simulated data from an autologistic model before being used to infer parameters of the Markov process model for experimental data. The fitted model explains the observed extra-binomial variation in terms of a zero-one immunity variable, which has a short-lived presence in the host.
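The paper's adaptive ABC sequential Monte Carlo algorithm is not reproduced here; the sketch below is the simplest member of the ABC family, a rejection sampler on a toy Poisson problem, showing the core simulate-and-compare idea that makes likelihood evaluation unnecessary:

```python
import numpy as np

rng = np.random.default_rng(4)

# Minimal ABC rejection sampler on a toy problem (Poisson rate inference):
# simulate from the prior, keep parameters whose simulated summary lies
# within eps of the observed summary. The paper's algorithm refines this
# with adaptive sequential Monte Carlo over a shrinking tolerance.
observed = rng.poisson(4.0, size=20)     # stand-in "data"
s_obs = observed.mean()

accepted = []
for _ in range(200000):
    lam = rng.uniform(0.0, 10.0)         # draw from a flat prior
    s_sim = rng.poisson(lam, size=20).mean()
    if abs(s_sim - s_obs) < 0.2:         # tolerance eps
        accepted.append(lam)

post = np.array(accepted)
print(f"accepted {post.size} draws; posterior mean ~ {post.mean():.2f}")
```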
Abstract:
Quality and bitrate modeling is essential for effectively adapting the bitrate and quality of videos delivered to multiplatform devices over resource-constrained heterogeneous networks. The recent model proposed by Wang et al. estimates the bitrate and quality of videos in terms of the frame rate and quantization parameter. However, to build an effective video adaptation framework, it is crucial to incorporate the spatial resolution into the analytical model for bitrate and perceptual quality adaptation. Hence, this paper proposes an analytical model to estimate the bitrate of videos in terms of quantization parameter, frame rate, and spatial resolution. The model fits the measured data accurately, as is evident from the high Pearson correlation. The proposed model is based on the observation that the relative reduction in bitrate due to decreasing spatial resolution is independent of the quantization parameter and frame rate. This model can be used in a rate-constrained bit-stream adaptation scheme that selects the scalability parameters to optimize perceptual quality for a given bandwidth constraint.
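The paper's exact functional form is not reproduced here; the sketch below fits an assumed separable power-law surrogate, R(q, t, s) = R_max (q/q_min)^(-a) (t/t_max)^b (s/s_max)^c, to synthetic measurements with scipy's curve_fit. The separable form mirrors the stated observation that the relative bitrate drop from reduced spatial resolution is independent of q and frame rate:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Assumed separable power-law bitrate model (a stand-in, not the paper's):
# R(q, t, s) = R_max * (q/q_min)^-a * (t/t_max)^b * (s/s_max)^c
q_min, t_max, s_max = 16.0, 30.0, 1.0

def rate(X, R_max, a, b, c):
    q, t, s = X
    return R_max * (q / q_min) ** (-a) * (t / t_max) ** b * (s / s_max) ** c

# Synthetic measurements on a (QP, frame rate, resolution-fraction) grid.
q, t, s = np.meshgrid([16, 24, 32, 40], [7.5, 15, 30], [0.25, 0.5, 1.0])
X = np.vstack([q.ravel(), t.ravel(), s.ravel()])
y = rate(X, 8000.0, 1.2, 0.6, 0.9) * rng.normal(1.0, 0.02, X.shape[1])

params, _ = curve_fit(rate, X, y, p0=[5000.0, 1.0, 0.5, 0.5])
print("fitted [R_max, a, b, c]:", np.round(params, 3))
```

Given a bandwidth budget, an adaptation scheme of the kind described would invert such a fitted model to pick the (q, t, s) triple maximizing perceptual quality subject to R below the budget.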
Abstract:
Conceptual modeling grammars are a fundamental means for specifying information systems requirements. However, the actual usage of these grammars is only poorly understood. In particular, little is known about how properties of these grammars inform usage beliefs such as usefulness and ease of use. In this paper, we use an ontological theory to describe conceptual modeling grammars in terms of their ontological deficiencies, and we formulate two propositions regarding how these ontological deficiencies influence primary usage beliefs. Using BPMN as an example modeling grammar, we surveyed 528 modeling practitioners to test the theorized relationships. Our results show that users of conceptual modeling grammars perceive ontological deficiencies to exist, and that these deficiency perceptions are negatively associated with the usefulness and ease of use of these grammars. With our research we provide empirical evidence in support of the predictions of the ontological theory of modeling grammar expressiveness, and we identify previously unexplored links between conceptual modeling grammars and grammar usage beliefs. For practice, this work implies a much closer coupling of the act of (re)designing modeling grammars with usage-related success metrics.
Abstract:
Business processes have emerged as a well-respected variable in the design of successful corporations. However, unlike other key managerial variables, such as products and services, customers and employees, and physical or digital assets, the conceptualization and management of business processes are in many respects in their infancy. In this book, Jan Recker investigates the notion of quality in business process modeling grammars. His evaluation is based on an ontological, qualitative, and quantitative analysis, applied to BPMN, a widely used business process modeling grammar. His results reveal the ontological shortcomings of BPMN, how these manifest themselves in actual process modeling practice, and how they influence the usage behavior of modeling practitioners. More generally, his book constitutes a landmark for empirical technology assessment, analyzing the way in which design flaws in technology influence usage behavior.