997 results for Graphical modeling (Statistics)


Relevance: 30.00%

Abstract:

Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows but are not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, for example by using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs yield compact system models that are easier to understand, and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, while analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, the framework integrates two formal languages: a type of HLPN called Predicate Transition Nets (PrT nets) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool that supports the formal modeling capabilities of the framework. For analysis, the framework combines three complementary techniques: simulation, explicit state model checking, and bounded model checking (BMC). Simulation is straightforward and fast but covers only some execution paths in an HLPN model. Explicit state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit state model checking. The main contribution with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities of the framework. The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
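The token-game firing rule that all Petri net variants build on can be sketched in a few lines. The net below is a minimal hypothetical two-place example in Python, not one of the PrT net models or SAMTools components described above.

```python
# Minimal sketch of a low level Petri net "token game": a transition is
# enabled when every input place holds enough tokens, and firing it moves
# tokens from inputs to outputs. (Hypothetical net, for illustration only.)

class PetriNet:
    def __init__(self, marking, transitions):
        # marking: dict place -> token count
        # transitions: dict name -> (inputs, outputs), each a dict place -> weight
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= w for p, w in ins.items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        ins, outs = self.transitions[t]
        for p, w in ins.items():
            self.marking[p] -= w
        for p, w in outs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# A tiny producer/consumer net: 'produce' moves the token from idle to buffer.
net = PetriNet({"idle": 1, "buffer": 0},
               {"produce": ({"idle": 1}, {"buffer": 1}),
                "consume": ({"buffer": 1}, {"idle": 1})})
net.fire("produce")
print(net.marking)  # {'idle': 0, 'buffer': 1}
```

HLPNs such as PrT nets generalize this rule by carrying structured data in the tokens and guarding transitions with algebraic formulas, which is what makes the analysis techniques above necessary.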

Relevance: 30.00%

Abstract:

This paper presents the application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey. Research focuses on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets, including Worldview-2 bands, band difference ratios (BDR), and topographical derivatives. Principal components analysis is further used to test for and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested with sites identified through geological field survey. Testing demonstrates the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
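As a sketch of the classification step, the pipeline below chains PCA with an LDA classifier whose `predict_proba` output plays the role of a posterior probability surface. The features and training samples are synthetic stand-ins, not the paper's Worldview-2 bands, band difference ratios, or topographic derivatives.

```python
# Hedged sketch: LDA posterior probabilities on principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# 40 "site" and 40 "non-site" training pixels with 8 correlated features
sites = rng.normal(1.0, 0.5, size=(40, 8))
non_sites = rng.normal(0.0, 0.5, size=(40, 8))
X = np.vstack([sites, non_sites])
y = np.array([1] * 40 + [0] * 40)

# PCA reduces redundancy among correlated inputs before LDA is trained
model = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
model.fit(X, y)

# posterior probability of the "site" class for a new pixel
posterior = model.predict_proba(rng.normal(1.0, 0.5, size=(1, 8)))[0, 1]
```

Applied pixel-by-pixel over a raster, these posteriors form the probability model that the paper then tests against surveyed sites.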

Relevance: 30.00%

Abstract:

The structure of a turbulent non-premixed flame of a biogas fuel in a hot and diluted coflow mimicking moderate and intense low dilution (MILD) combustion is studied numerically. The biogas fuel is obtained by diluting Dutch natural gas (DNG) with CO2. The results of biogas combustion are compared with those of DNG combustion in the Delft Jet-in-Hot-Coflow (DJHC) burner. New experimental measurements of lift-off height and of velocity and temperature statistics have been made to provide a database for evaluating the capability of numerical methods in predicting the flame structure. Compared to the lift-off height of the DNG flame, addition of 30% carbon dioxide to the fuel increases the lift-off height by less than 15%. Numerical simulations are conducted by solving the RANS equations, using the Reynolds stress model (RSM) for turbulence in combination with the Eddy Dissipation Concept (EDC) and a transported probability density function (PDF) method as turbulence-chemistry interaction models. The DRM19 reduced mechanism is used for the chemical kinetics with the EDC model. A tabulated chemistry model based on the Flamelet Generated Manifold (FGM) is adopted in the PDF method. The table describes a non-adiabatic three-stream mixing problem between fuel, coflow, and ambient air based on igniting counterflow diffusion flamelets. The results show that both the EDC/DRM19 and PDF/FGM models predict the experimentally observed decreasing trend of lift-off height with increasing coflow temperature. Although more detailed chemistry is used with EDC, the temperature fluctuations at the coflow inlet (approximately 100 K) cannot be included, resulting in a significant overprediction of the flame temperature. Only the PDF modeling results with temperature fluctuations predict the correct mean temperature profiles of the biogas case and compare well with the experimental temperature distributions.

Relevance: 30.00%

Abstract:

Unplanned hospital readmissions increase health and medical care costs and indicate lower quality of healthcare services. Hence, predicting which patients are at risk of being readmitted is of interest. Using administrative data on patients treated in the medical centers and hospitals of Dalarna County, Sweden, during 2008-2016, two risk prediction models of hospital readmission are built. The first model, based on logistic regression (LR), correctly predicts 2,648 of the 3,392 observed readmissions in the test dataset, reaching a c-statistic of 0.69. The second model, built using the random forests (RF) algorithm, correctly predicts 2,183 readmissions (out of 3,366) and 13,198 non-readmission events (out of 18,982). The discriminating ability of the best performing RF model (c-statistic 0.60) is comparable to that of the logistic model. Although the discriminating ability of both the LR and RF risk prediction models is relatively modest, these models are still capable of identifying patients at high risk of hospital readmission. These patients can then be targeted with specific interventions, in order to prevent readmission, improve patients' quality of life, and reduce health and medical care costs.
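The c-statistic reported above is the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted patient (the area under the ROC curve). A direct pairwise computation, on synthetic scores rather than the Dalarna data, looks like this:

```python
# Hedged sketch of the c-statistic (AUC) via concordant-pair counting.
import numpy as np

def c_statistic(y_true, y_score):
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()   # concordant pairs
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count one half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500)                    # synthetic outcomes
score = 0.3 * y + rng.normal(0, 0.5, size=500)      # mildly informative risk score
auc = c_statistic(y, score)
```

A value of 0.5 means no discrimination and 1.0 perfect discrimination, which puts the reported 0.69 and 0.60 in context as modest but usable.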

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

We analyze a real data set of reindeer fecal pellet-group counts obtained from a survey conducted in a forest area in northern Sweden. In the data set, over 70% of the counts are zeros, and there is high spatial correlation. We use conditionally autoregressive random effects to model spatial correlation in a Poisson generalized linear mixed model (GLMM), a quasi-Poisson hierarchical generalized linear model (HGLM), a zero-inflated Poisson (ZIP) model, and hurdle models. The quasi-Poisson HGLM allows for both under- and overdispersion with excessive zeros, while the ZIP and hurdle models allow only for overdispersion. In analyzing the real data set, we see that the quasi-Poisson HGLMs can perform better than other commonly used models, such as ordinary Poisson HGLMs, spatial ZIP, and spatial hurdle models, and that the underdispersed Poisson HGLMs with spatial correlation fit the reindeer data best. We develop R code for fitting these models using a unified algorithm for the HGLMs. A spatial count response with an extremely high proportion of zeros and underdispersion can thus be successfully modeled using the quasi-Poisson HGLM with spatial random effects.
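The zero-inflation idea the ZIP model rests on is simple to state: with probability pi a count is a structural zero, otherwise it is drawn from a Poisson distribution. A minimal sketch of the ZIP mass function (in Python; the paper itself works with spatial HGLMs in R, which this simplification omits):

```python
# Sketch of the zero-inflated Poisson (ZIP) mass function for excess-zero counts.
from math import exp, factorial

def zip_pmf(k, pi, mu):
    poisson = exp(-mu) * mu**k / factorial(k)
    if k == 0:
        # zeros arise either structurally (pi) or from the Poisson part
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

# With 70% structural zeros, P(count = 0) far exceeds the plain Poisson value
p0_zip = zip_pmf(0, pi=0.7, mu=2.0)
p0_poisson = exp(-2.0)
```

This illustrates why a plain Poisson model badly underpredicts zeros in data like the pellet-group counts, while it cannot capture the underdispersion that favored the quasi-Poisson HGLM here.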

Relevance: 30.00%

Abstract:

The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low frequency, high consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, to flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuary, and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is further estimated from the parametric joint distribution model, as a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and of stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders of such critical infrastructure become aware of the flood risk, vulnerability, and the inherent uncertainty.
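The EAD concept can be illustrated by averaging a stage- and duration-dependent damage function over sampled annual events. Everything below is invented for illustration: the marginals are assumed independent (the study fits a dependent bivariate stage-duration model), and the step damage thresholds are hypothetical.

```python
# Illustrative Monte Carlo estimate of Expected Annual Damage (EAD) from a
# step damage function of flood stage and duration. All parameters are
# hypothetical; the study's parametric joint distribution is not reproduced.
import numpy as np

def step_damage(stage_ft, duration_hr):
    # hypothetical step model: damage begins once water reaches the equipment
    if stage_ft < 6.0:
        return 0.0
    base = 50_000.0 if stage_ft < 9.0 else 200_000.0
    return base * (1.5 if duration_hr > 24.0 else 1.0)  # prolonged submergence

rng = np.random.default_rng(2)
n = 100_000
stage = rng.gumbel(loc=4.0, scale=1.2, size=n)          # annual max stage (ft)
duration = rng.lognormal(mean=2.5, sigma=0.8, size=n)   # hours above threshold

ead = np.mean([step_damage(s, d) for s, d in zip(stage, duration)])
```

The study's point is visible even in this toy version: because damage depends on duration as well as stage, a stage-only frequency curve would misstate the expected loss.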

Relevance: 30.00%

Abstract:

The purpose of this project is to develop a three-dimensional block model for a garnet deposit at Alder Gulch, Madison County, Montana. Garnets occur in the Precambrian metamorphic Red Wash gneiss and similar rocks in the vicinity. The project seeks to model the percentage of garnet in a deposit called the Section 25 deposit using the Surpac software. Data available for this work are drillhole, trench, and grab sample data obtained from previous exploration of the deposit. Creating the block model involves validating the data, compositing the assayed garnet percentages, and conducting basic statistics on the composites using Surpac's statistical tools. Variogram analysis will be conducted on the composites to quantify the continuity of the garnet mineralization. A three-dimensional block model will then be created and filled with estimates of garnet percentage using different methods of reserve estimation, and the results compared.
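One of the simplest reserve-estimation methods that could be compared here is inverse distance weighting (IDW), which fills a block with a distance-weighted average of nearby composite grades. The sketch below uses hypothetical coordinates and grades, not Section 25 data, and stands in for what Surpac does internally.

```python
# Sketch of inverse distance weighting (IDW) of composite garnet percentages
# into a block centroid. Sample locations and grades are hypothetical.
import numpy as np

def idw_estimate(block_xyz, sample_xyz, grades, power=2.0):
    d = np.linalg.norm(sample_xyz - block_xyz, axis=1)
    if np.any(d == 0):                  # centroid coincides with a sample
        return grades[np.argmin(d)]
    w = 1.0 / d**power                  # closer samples weigh more
    return float(np.sum(w * grades) / np.sum(w))

samples = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [10., 10., 5.]])
garnet_pct = np.array([12.0, 8.0, 15.0, 10.0])
block = np.array([5.0, 5.0, 1.0])
est = idw_estimate(block, samples, garnet_pct)
```

Unlike kriging, IDW ignores the spatial continuity quantified by the variograms, which is precisely why comparing estimation methods, as the project plans, is informative.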

Relevance: 30.00%

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative, and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression analysis produced the most accurate result, accommodating non-stationary coefficient behavior and demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
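The non-stationary coefficient behavior mentioned above is exactly what geographically-weighted regression (GWR) captures: an ordinary least-squares fit is re-estimated at each location with distance-decaying weights, so coefficients vary over space. A minimal sketch on synthetic data (not the dissertation's French variables):

```python
# Minimal sketch of geographically-weighted regression (GWR): weighted least
# squares with a Gaussian spatial kernel, re-fit at each target location.
import numpy as np

def gwr_coeffs(coords, X, y, target, bandwidth):
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian kernel weights
    Xd = np.column_stack([np.ones(len(y)), X])     # add intercept column
    W = np.diag(w)
    # weighted least squares: (X'WX)^-1 X'Wy
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)

rng = np.random.default_rng(3)
coords = rng.uniform(0, 100, size=(200, 2))        # synthetic locations
x = rng.normal(size=200)                           # one explanatory variable
# true slope drifts from ~1 in the west to ~3 in the east (non-stationary)
slope = 1.0 + 2.0 * coords[:, 0] / 100.0
y = slope * x + rng.normal(0, 0.1, size=200)

west = gwr_coeffs(coords, x[:, None], y, np.array([5.0, 50.0]), bandwidth=15.0)
east = gwr_coeffs(coords, x[:, None], y, np.array([95.0, 50.0]), bandwidth=15.0)
```

A global regression would report a single averaged slope; the two local fits recover the spatial drift, which is the property that made GWR the best performer in this research.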

Relevance: 30.00%

Abstract:

In this thesis, a machine learning approach was used to develop a predictive model for residual methanol concentration in industrial formalin produced at the Akzo Nobel factory in Kristinehamn, Sweden. The MATLAB computational environment, supplemented with the Statistics and Machine Learning Toolbox from MathWorks, was used to test various machine learning algorithms on the formalin production data from Akzo Nobel. The Gaussian Process Regression algorithm was found to provide the best results and was used to create the predictive model. The model was compiled into a stand-alone application with a graphical user interface using the MATLAB Compiler.
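The thesis builds its model in MATLAB; an analogous Gaussian Process Regression sketch in Python with scikit-learn is shown below, on synthetic process data (the Akzo Nobel production data are not public, and the input variable here is an invented stand-in).

```python
# Hedged sketch of Gaussian Process Regression on synthetic process data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
X = rng.uniform(0, 10, size=(60, 1))                 # e.g. a process setpoint
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, size=60)   # e.g. residual methanol

# RBF kernel for the smooth trend plus WhiteKernel for measurement noise
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

# GPR predictions come with a standard deviation, i.e. a model uncertainty
mean, std = gpr.predict(np.array([[5.0]]), return_std=True)
```

The built-in uncertainty estimate is one practical reason GPR suits process-monitoring applications like the one described.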

Relevance: 20.00%

Abstract:

In this study, transmission-line modeling (TLM) applied to bio-thermal problems was improved by incorporating several novel computational techniques. These include graded meshes, which made the computation 9 times faster while using only a fraction (16%) of the computational resources required by regular meshes when analyzing heat flow through heterogeneous media. Graded meshes, unlike regular meshes, allow heat sources to be modeled in all segments of the mesh. A new boundary condition that considers thermal properties is introduced, resulting in more realistic modeling of complex problems, along with a new way of calculating an error parameter. The calculated temperatures between nodes were compared against results from the literature and agreed to within 1%. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential in heat transfer modeling of biological systems.

Relevance: 20.00%

Abstract:

American tegumentary leishmaniasis (ATL) is a disease transmitted to humans by female sandflies of the genus Lutzomyia. Several factors are involved in the disease transmission cycle; in this work, only rainfall and deforestation were considered to assess the variability in the incidence of ATL. To this end, monthly recorded data on the incidence of ATL in Orán, Salta, Argentina, for the period 1985-2007 were used. The square root of the relative incidence of ATL and the corresponding variance were formulated as time series, and these data were smoothed by moving averages of 12 and 24 months, respectively. The same procedure was applied to the rainfall data. Typical months, namely April, August, and December, were identified and allowed us to describe the dynamical behavior of ATL outbreaks. These results were tested at the 95% confidence level. We conclude that the variability of rainfall alone would not be enough to explain the epidemic outbreaks of ATL in the period 1997-2000, but it consistently explains the situation observed in the years 2002 and 2004. Deforestation activities that occurred in this region could explain the epidemic peaks observed in both years, and also during the entire observation period except 2005-2007.
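The smoothing step described above, a 12-month moving average over a monthly series, can be sketched in a few lines; the incidence values below are synthetic, not the Orán data.

```python
# Sketch of a 12-month moving average over a monthly incidence series.
import numpy as np

def moving_average(series, window):
    kernel = np.ones(window) / window
    # 'valid' keeps only the months with a full window of observations
    return np.convolve(series, kernel, mode="valid")

monthly = np.array([3., 1., 0., 2., 5., 8., 9., 6., 4., 2., 1., 0., 4., 2.])
smoothed = moving_average(monthly, 12)
```

A 12-month window removes the seasonal cycle, which is what lets the study compare interannual variability in incidence against rainfall and deforestation.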

Relevance: 20.00%

Abstract:

In this work, all publicly accessible published findings on Alicyclobacillus acidoterrestris heat resistance in fruit beverages, as affected by temperature and pH, were compiled. Study characteristics (protocols, fruit and variety, °Brix, pH, temperature, heating medium, culture medium, inactivation method, strains, etc.) were then extracted from the primary studies, and some of them were incorporated into a meta-analysis mixed-effects linear model based on the basic Bigelow equation describing the heat resistance parameters of this bacterium. The model estimated mean D* values (the time needed for one log reduction at a temperature of 95 °C and a pH of 3.5) of Alicyclobacillus in beverages of different fruits and two concentration types, with and without bacteriocins, and with and without clarification. The zT values (the temperature change needed to cause one log reduction in D-values) estimated by the meta-analysis model were compared to the 'observed' zT values reported in the primary studies, and in all cases the latter fell within the confidence intervals of the model. The model was capable of predicting the heat resistance parameters of Alicyclobacillus in fruit beverages beyond the types available in the meta-analytical data. It is expected that this compilation of the thermal resistance of Alicyclobacillus in fruit beverages will be of utility to food quality managers in determining or validating the lethality of their current heat treatment processes.
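The basic Bigelow relation underlying the model says that D falls by one log for every zT degrees above the reference temperature. A minimal sketch (temperature term only; the paper's model also includes a pH term, and the D* and zT values below are illustrative, not the paper's estimates):

```python
# Sketch of the basic Bigelow secondary model: log10 D(T) = log10 D* - (T - T_ref)/z_T
from math import log10

def bigelow_D(T, D_star, z_T, T_ref=95.0):
    # D* is the D-value at the reference temperature T_ref
    return 10.0 ** (log10(D_star) - (T - T_ref) / z_T)

# Illustrative values: D* = 5 min at 95 C, z_T = 10 C
D_at_105 = bigelow_D(105.0, D_star=5.0, z_T=10.0)   # one log faster: 0.5 min
```

Embedding this relation in a mixed-effects model is what lets the meta-analysis pool D and zT estimates across heterogeneous primary studies.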

Relevance: 20.00%

Abstract:

The solubility of caffeine in supercritical CO2 was studied by assessing the effects of pressure and temperature on the extraction of green coffee oil (GCO). The Peng-Robinson [1] equation of state was used to correlate the solubility of caffeine with a thermodynamic model, and two mixing rules were evaluated: the classical van der Waals mixing rule with two adjustable parameters (PR-VDW), and a density-dependent rule proposed by Mohamed and Holder [2], with two (PR-MH, two parameters adjusted to the attractive term) and three (PR-MH3, two parameters adjusted to the attractive term and one to the repulsive term) adjustable parameters. The best results were obtained with the three-parameter Mohamed and Holder mixing rule.
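The pure-component Peng-Robinson equation of state that the study extends with mixing rules can be sketched as follows: it computes the compressibility factor Z of CO2 from the standard PR parameters. The conditions chosen are typical supercritical-extraction conditions, not values reported by the study.

```python
# Sketch of the pure-component Peng-Robinson EOS (the study extends it to the
# CO2 + caffeine + oil mixture via van der Waals and Mohamed-Holder mixing rules).
import numpy as np

R = 8.314  # J/(mol K)

def pr_Z(T, P, Tc, Pc, omega):
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - (T / Tc) ** 0.5)) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha   # attractive parameter
    b = 0.07780 * R * Tc / Pc                  # repulsive (covolume) parameter
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    roots = np.roots([1.0, -(1 - B), A - 3 * B**2 - 2 * B, -(A * B - B**2 - B**3)])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return float(real.max())   # largest real root: the fluid/vapor-like phase

# CO2 at 60 C and 20 MPa, near typical supercritical extraction conditions
Z = pr_Z(333.15, 20e6, Tc=304.13, Pc=7.377e6, omega=0.225)
```

For a mixture, `a` and `b` would be replaced by composition-dependent expressions, which is exactly where the PR-VDW and PR-MH mixing rules with their adjustable parameters enter.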