870 results for ES-SAGD. Heavy oil. Recovery factor. Reservoir modeling and simulation
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency gives rise to an enormous number of possible system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations, considering only a pair of threads and accesses to a single shared variable at a time. However, predictive tools must weigh the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method that increases coverage while ensuring precision; 2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
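To illustrate the class of bugs targeted here: in the hypothetical two-thread sketch below, a check and an update of one shared variable are intended to execute atomically, but no lock spans both accesses, so an unserializable interleaving can break the invariant. This is a generic example, not code from the dissertation.

```python
import threading

balance = 100  # a single shared variable accessed by two threads

def withdraw(amount):
    global balance
    # The check and the update are intended to be one atomic region, but no
    # lock spans both accesses, so the other thread can interleave between
    # them: a classic check-then-act atomicity violation.
    if balance >= amount:            # access 1: read of the shared variable
        balance = balance - amount   # access 2: read-modify-write

t1 = threading.Thread(target=withdraw, args=(80,))
t2 = threading.Thread(target=withdraw, args=(80,))
t1.start(); t2.start()
t1.join(); t2.join()
print(balance)  # 20 usually, but may be -60 under an unserializable interleaving
```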
Abstract:
The current study examines the effects of “Food Dudes” peer-modeling videos and positive reinforcement on vegetable consumption using a delayed multiple-baseline design across subjects. Results suggest that peer modeling and positive reinforcement are effective means of increasing vegetable intake.
Abstract:
Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system remains a challenging task. First, the complexity of application workloads, together with interference among competing workloads, makes it difficult to understand the VMs' resource demands for meeting their Quality of Service (QoS) targets. Second, the dynamics of the applications and the system make it difficult to maintain the desired QoS target as the environment changes. Third, the transparency of virtualization presents a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate in improving application QoS and system efficiency. This dissertation proposes to address these challenges through fuzzy-modeling- and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources, automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation, the fuzzy modeling approach is integrated with a predictive controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach that can quickly track applications' QoS targets and optimize resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared with conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets on an oversubscribed system, and it manages dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when resources are contended by dynamic workloads.
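As a rough sketch of the flavor of fuzzy-model-based resource control (with invented membership functions, rule consequents, and gains; not the dissertation's FMPC implementation):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_cpu_demand(request_rate):
    """Fuzzy rules mapping an observed workload level to a predicted CPU
    demand (in cores); weights are the membership degrees of low/medium/high."""
    low  = tri(request_rate, 0, 0, 500)
    med  = tri(request_rate, 200, 600, 1000)
    high = tri(request_rate, 700, 1200, 1200)
    w = np.array([low, med, high])
    demand = np.array([0.5, 2.0, 4.0])  # invented per-rule consequents (cores)
    return float(w @ demand / (w.sum() + 1e-9))  # weighted-average defuzzification

def allocate(request_rate, current_alloc, target_util=0.8):
    """One receding-horizon step: predict demand, then move the allocation
    partway toward demand/target_util to damp oscillation."""
    desired = fuzzy_cpu_demand(request_rate) / target_util
    return current_alloc + 0.5 * (desired - current_alloc)

print(allocate(request_rate=800, current_alloc=2.0))
```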
Abstract:
The applications of micro-end-milling operations have increased in recent years. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural-network-based data mapping and forecasting processes, and genetic-algorithm-based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART has been tested on data from over 800 experimental cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to a micro-end-milling study at the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
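A minimal sketch of the genetic-algorithm idea, fitting the parameters of a deliberately simplified cutting-force model to made-up force measurements (MOGART's analytical model and data are far richer than this):

```python
import numpy as np

rng = np.random.default_rng(0)

feed = np.array([0.5, 1.0, 2.0, 4.0])       # feed per tooth (made-up units)
measured = np.array([3.1, 5.0, 8.2, 13.5])  # made-up force measurements

def fitness(pop):
    # Toy model F = K * feed**p; pop[:, 0] holds K, pop[:, 1] holds p.
    pred = pop[:, :1] * feed ** pop[:, 1:2]
    return -((pred - measured) ** 2).sum(axis=1)  # higher fitness = lower error

pop = rng.uniform([0.1, 0.1], [10.0, 2.0], size=(50, 2))
for _ in range(200):
    fit = fitness(pop)
    parents = pop[np.argsort(fit)[-25:]]                 # truncation selection
    children = parents[rng.integers(0, 25, 25)] \
        + rng.normal(0, 0.05, (25, 2))                   # mutation
    pop = np.vstack([parents, children])                 # elitism

print(pop[np.argmax(fitness(pop))])  # estimated (K, p)
```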
Abstract:
Primary processing on natural gas platforms such as the Mexilhão Field platform (PMXL-1) in the Santos Basin, where monoethylene glycol (MEG) has been used to inhibit hydrate formation, suffers operational problems caused by salt scaling in the MEG recovery unit. A literature search and analysis of salt solubility data in mixed solvents, namely water and MEG, indicate that experimental reports are available for only a relatively restricted set of the ionic species present in the produced water, such as NaCl and KCl. The aim of this study was to develop a method for calculating salt solubilities in mixed-solvent mixtures, in particular NaCl or KCl in aqueous mixtures of MEG. The calculation method extends the Pitzer model, following the Lorimer approach, to aqueous systems containing a salt and a second solvent (MEG). The Python language, within the Eclipse Integrated Development Environment (IDE), was used to build the computational applications. The results indicate the feasibility of the proposed calculation method for a systematic series of solubility data for NaCl or KCl in aqueous mixtures of MEG at various temperatures. Moreover, the tool developed in Python proved suitable for parameter estimation and simulation purposes.
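A minimal sketch of the kind of parameter estimation such a Python tool performs, using an invented empirical correlation and made-up data points in place of the actual Pitzer-Lorimer equations:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical correlation for illustration only; the study's model is a
# Pitzer extension with the Lorimer approach, not this simple linear form.
def solubility(X, a, b, c):
    w_meg, T = X                       # MEG mass fraction, temperature (K)
    return a + b * w_meg + c * (T - 298.15)

w = np.array([0.0, 0.2, 0.4, 0.6, 0.8])                 # MEG mass fraction
T = np.array([298.15, 298.15, 313.15, 313.15, 333.15])  # temperature (K)
s = np.array([36.0, 30.5, 26.0, 20.8, 16.1])            # made-up solubilities

params, cov = curve_fit(solubility, (w, T), s)
print("fitted parameters:", params)
print("prediction at w=0.5, T=323.15 K:", solubility((0.5, 323.15), *params))
```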
Abstract:
Osteoarthritis (OA) is the most common form of arthritis, carries a high socioeconomic burden, and has an incompletely understood etiology. Evidence suggests a role for the transforming growth factor beta (TGF-ß) signalling pathway and for epigenomics in OA. The aim of this thesis was to understand the involvement of the TGF-ß pathway in OA and to determine the DNA methylation patterns of OA-affected cartilage compared with OA-free cartilage. First, I found that a common SNP in the BMP2 gene, a ligand in the bone morphogenetic protein (BMP) branch of the TGF-ß pathway, was associated with OA in the Newfoundland population. I also showed a genetic association between SMAD3 - a signal transducer in the TGF-ß branch of the TGF-ß signalling pathway - and the total radiographic burden of OA. I further demonstrated that SMAD3 is overexpressed in OA cartilage, suggesting overactivation of TGF-ß signalling in OA. Next, I examined the connection of these genes in the regulation of matrix metallopeptidase 13 (MMP13) - an enzyme known to destroy the extracellular matrix in OA cartilage - in the context of TGF-ß signalling. The analyses showed that TGF-ß, MMP13, and SMAD3 were overexpressed in OA cartilage, whereas the expression of BMP2 was significantly reduced. The expression of TGF-ß was positively correlated with that of SMAD3 and MMP13, suggesting that TGF-ß signalling is involved in the up-regulation of MMP13. This regulation, however, appears not to be controlled by SMAD3 signals, possibly owing to collateral signalling, and appears to be suppressed by BMP regulation in healthy cartilage, whose levels are reduced in end-stage OA. In a genome-wide DNA methylation analysis, I reported CpG sites differentially methylated in OA and showed that the cartilage methylome has the potential to distinguish between OA-affected and non-OA cartilage. Functional clustering analysis of the genes harbouring differentially methylated loci revealed that they are enriched in the skeletal system morphogenesis pathway, a potential candidate for further OA studies. Overall, the findings of this thesis provide evidence that the TGF-ß signalling pathway is associated with the development of OA and that epigenomics might be involved as a potential mechanism in OA.
Abstract:
This paper presents a theoretical model for the vibration analysis of microscale fluid-loaded rectangular isotropic plates, based on Lamb's assumption for the fluid-structure interaction and the Rayleigh-Ritz energy method. An analytical solution to this model is proposed that can be applied to most boundary conditions. Dynamic experimental data for a series of microfabricated silicon plates are obtained using a base-excitation dynamic testing facility. The natural frequencies and mode shapes in the experimental results agree well with the theoretical simulations for the lower-order modes. The theoretical and experimental investigations presented here on the vibration characteristics of microscale plates are of particular interest for the design of microplate-based biosensing devices. Copyright © 2009 by ASME.
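In outline, the Rayleigh-Ritz procedure referred to above expands the plate deflection in admissible trial functions and reduces the energy balance to a generalized eigenvalue problem, with the fluid loading entering as an added ("virtual") mass under Lamb's assumption. A generic statement (notation ours, not the paper's):

$$ w(x, y, t) = \sum_{i,j} c_{ij}\, \phi_i(x)\, \psi_j(y)\, e^{\mathrm{i}\omega t}, \qquad \left(\mathbf{K} - \omega^2 \left[\mathbf{M}_p + \mathbf{M}_f\right]\right)\mathbf{c} = \mathbf{0} $$

Here K collects the plate strain energy, M_p the plate kinetic energy, and M_f the fluid's added-mass contribution; the natural frequencies are the values of ω at which the matrix becomes singular.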
Abstract:
A class of multi-process models is developed for collections of time-indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
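A sketch of how a moment-matched Gaussian stand-in for the Pólya-Gamma draw can be used when counts are large, assuming the standard Pólya-Gamma moment formulas; the construction below is illustrative, not necessarily the dissertation's:

```python
import numpy as np

def pg_gaussian_approx(b, c, rng):
    """Approximate a PG(b, c) draw with a moment-matched Gaussian.
    Standard Polya-Gamma moments:
      E[w]   = b/(2c) * tanh(c/2)
      Var[w] = b/(4c^3) * (sinh(c) - c) / cosh(c/2)^2
    A CLT argument makes the approximation accurate for large counts b."""
    mean = b / (2.0 * c) * np.tanh(c / 2.0)
    var = b / (4.0 * c**3) * (np.sinh(c) - c) / np.cosh(c / 2.0) ** 2
    return rng.normal(mean, np.sqrt(var))

rng = np.random.default_rng(1)
# e.g., a binomial observation with n = 500 trials and linear predictor 0.3
print(pg_gaussian_approx(b=500, c=0.3, rng=rng))
```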
Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.
The second analysis develops a dynamic mixed-membership model for Poisson counts. The model is applied to a collection of time series recording neuron-level firing patterns in rhesus monkeys. A monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron's firing pattern fluctuates between features associated with each sound in isolation.
The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age-specific latent natural-ability class and a performance-enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When a player's performance deviates significantly from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.
All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.
Abstract:
While molecular and cellular processes, such as Brownian motion, chemical reaction networks, and gene regulatory networks, are often modeled as stochastic processes, there have been few attempts to program a molecular-scale process to physically implement a stochastic process. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents, and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications has remained unknown.
In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
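The sampling mechanism can be mimicked in simulation: the sketch below draws phase-type samples by simulating a small CTMC until absorption, in analogy with a photon emission ending an exciton's walk through the network; all states and rates are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Transient-state transition rates of a toy 3-state CTMC (diagonal is zero).
# Q[i, j] is the hop rate from chromophore i to j; 'emit' holds each state's
# absorption rate (photon emission, in the RET analogy).
Q = np.array([[0.0, 2.0, 0.5],
              [1.0, 0.0, 3.0],
              [0.2, 1.5, 0.0]])
emit = np.array([0.3, 0.1, 2.0])

def phase_type_sample():
    """Simulate from state 0 until absorption; the absorption time is a
    draw from the corresponding phase-type distribution."""
    t, s = 0.0, 0
    while True:
        rates = np.append(Q[s], emit[s])   # competing exponential clocks
        total = rates.sum()
        t += rng.exponential(1.0 / total)  # holding time in state s
        nxt = rng.choice(len(rates), p=rates / total)
        if nxt == len(rates) - 1:          # absorbed: photon emitted
            return t
        s = nxt

samples = [phase_type_sample() for _ in range(10000)]
print(np.mean(samples))  # empirical mean of the phase-type distribution
```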
Using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures frees the taggant design from the constraint of resolvable dyes and yields a significantly larger coding capacity than spectrally or lifetime-coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and Maximum Likelihood Estimation (MLE) based taggant identification guarantees high accuracy even with only a few hundred detected photons.
In addition, RET-based sampling units (RSUs) can be constructed to accelerate probabilistic algorithms for wide application in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware of traditional computers, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor or GPU as specialized functional units, or organized as a discrete accelerator, to bring substantial speedups and power savings.
Abstract:
Purpose: The purpose of this study is to review recent studies, published from 2007 to 2015, on tourism and hotel demand modeling and forecasting, with a view to identifying the emerging topics and methods studied and to pointing out future research directions in the field. Design/methodology/approach: Articles on tourism and hotel demand modeling and forecasting published in Science Citation Index (SCI) and Social Science Citation Index (SSCI) journals were identified and analyzed. Findings: The review found that studies focused on hotel demand are considerably fewer than those on tourism demand. It also observed that more and more studies have moved away from aggregate tourism demand analysis, while disaggregate markets and niche products have attracted increasing attention. Some studies have gone beyond neoclassical economic theory to seek additional explanations of the dynamics of tourism and hotel demand, such as environmental factors, tourist online behavior, and consumer confidence indicators, among others. More sophisticated techniques, such as nonlinear smooth transition regression, mixed-frequency modeling, and nonparametric singular spectrum analysis, have also been introduced to this research area. Research limitations/implications: The main limitation of this review is that the articles included cover only the English-language literature; future reviews of this kind should also include articles published in other languages. The review provides a useful guide for researchers interested in future research on tourism and hotel demand modeling and forecasting. Practical implications: This review provides suggestions and recommendations for improving the efficiency of tourism and hospitality management practices. Originality/value: The value of this review is that it identifies the current trends in tourism and hotel demand modeling and forecasting research and points out future research directions.
Abstract:
This paper presents the application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey. Research focuses on the identification of Neolithic felsite stone-tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS, together with known non-sites, are used to train a linear discriminant analysis (LDA) classifier on a combination of datasets including Worldview-2 bands, band-difference ratios (BDR), and topographical derivatives. Principal components analysis is further used to test for and reduce the dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested against sites identified through geological field survey. Testing demonstrates the prospecting ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
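A schematic of the classification workflow described above, using scikit-learn with synthetic stand-ins for the Worldview-2 bands, band-difference ratios, and topographic derivatives (an assumption-laden sketch, not the study's code):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Synthetic stand-ins: rows are survey cells, columns mimic spectral bands,
# band-difference ratios, and topographic derivatives (values invented).
X_sites = rng.normal(loc=0.6, scale=0.2, size=(40, 12))     # known workshops
X_nonsites = rng.normal(loc=0.4, scale=0.2, size=(60, 12))  # known non-sites
X = np.vstack([X_sites, X_nonsites])
y = np.array([1] * 40 + [0] * 60)

# PCA first to reduce the dimensionality of correlated/redundant layers,
# then LDA; predict_proba yields the posterior probability surface.
model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
model.fit(X, y)

cells = rng.normal(loc=0.5, scale=0.25, size=(5, 12))  # unsurveyed cells
print(model.predict_proba(cells)[:, 1])  # P(workshop | data) per cell
```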
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This thesis describes the design principles of an industrial feeding machine. The system is to be installed between two industrial machines. The apparatus must pace the incoming products and synchronize them with the downstream machine. The machine orders the objects using a series of variable-speed conveyor belts. Development was carried out at the Liam Laboratory at the request of the company Sitma. Sitma already produced a system of the kind described in this thesis; its wish is therefore to modernize the previous application, since the device that performed product pacing was a Siemens PLC that is no longer on the market. The thesis covers the study of the application and its modeling in Matlab-Simulink, followed by an implementation, albeit not a conclusive one, in TwinCAT 3.
Abstract:
This work provides a holistic investigation into feature modeling within software product lines. It identifies limitations and challenges in current feature modeling approaches. These limitations include, but are not limited to, the lack of a satisfactory cognitive presentation, poor suitability for scalable systems, inflexibility in adapting to change, the absence of predictability of model behavior, and the lack of probabilistic quantification of a model's implications and of decision support for reasoning under uncertainty. This thesis addresses these challenges by proposing a series of solutions. The first solution is the construction of a Bayesian Belief Feature Model, a novel modeling approach capable of quantifying the uncertainty in model parameters by incorporating probabilistic modeling into a conventional modeling approach. The Bayesian Belief Feature Model presents a new, enhanced feature modeling approach in terms of truth quantification and visual expressiveness. The second solution addresses the unclear support for reasoning under uncertainty and the challenging constraint satisfaction problem in software product lines. This has been done through the development of a mathematical reasoner designed to satisfy the model constraints by considering probability weights for all involved parameters and quantifying the actual implications of the problem constraints. The resulting Uncertain Constraint Satisfaction Problem approach has been tested and validated through a set of designated experiments. Stated plainly, the main contributions of this thesis, illustrated by the sketch after this list, are the following:
• Develop a framework for probabilistic graphical modeling to build the proposed Bayesian Belief Feature Model.
• Extend the model to enhance visual expressiveness through the integration of colour-degree variation, in which the colour varies with the predefined probabilistic weights.
• Enhance constraint satisfaction by measuring the uncertainty of each parameter's truth assumption.
• Validate the developed approach against different experimental settings to determine its functionality and performance.
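As a toy illustration of the flavor of probabilistic feature modeling described above (not the thesis's actual framework), the sketch below attaches prior selection probabilities to three invented features, enforces one cross-tree constraint, and renormalizes over the valid configurations:

```python
from itertools import product

# Toy Bayesian-belief feature model: invented features with prior selection
# probabilities, plus one cross-tree constraint.
priors = {"encryption": 0.7, "compression": 0.4, "key_rotation": 0.5}
features = list(priors)

def satisfies(config):
    # Constraint: key_rotation requires encryption.
    return not (config["key_rotation"] and not config["encryption"])

# Enumerate configurations, weight each by its prior probability, and
# renormalize over the constraint-satisfying ones (uncertain-CSP flavor).
weighted = []
for bits in product([False, True], repeat=len(features)):
    config = dict(zip(features, bits))
    w = 1.0
    for f, on in config.items():
        w *= priors[f] if on else 1.0 - priors[f]
    if satisfies(config):
        weighted.append((config, w))

total = sum(w for _, w in weighted)
for config, w in sorted(weighted, key=lambda cw: -cw[1])[:3]:
    print(config, round(w / total, 3))  # posterior over valid configurations
```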
Abstract:
Thesis (Master's)--University of Washington, 2016-08