888 results for Two-stage stochastic model


Relevance: 100.00%

Abstract:

The article deals with the CFD modelling of fast pyrolysis of biomass in an Entrained Flow Reactor (EFR). The Lagrangian approach is adopted for the particle tracking, while the flow of the inert gas is treated with the standard Eulerian method for gases. The model includes the thermal degradation of biomass to char with simultaneous evolution of gases and tars from a discrete biomass particle. The chemical reactions are represented using a two-stage, semi-global model. The radial distribution of the pyrolysis products is predicted as well as their effect on the particle properties. The convective heat transfer to the surface of the particle is computed using the Ranz-Marshall correlation.
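The convective heating step above uses a standard, well-documented correlation. As a minimal illustration (not code from the article), the sketch below evaluates the Ranz-Marshall form Nu = 2 + 0.6 Re^(1/2) Pr^(1/3); the particle size and gas properties are placeholder assumptions.

```python
# Minimal sketch: Ranz-Marshall correlation for convective heat transfer to a
# spherical particle. Gas properties below are illustrative placeholders, not
# values from the article.

def ranz_marshall_htc(d_p, u_rel, rho_g, mu_g, cp_g, k_g):
    """Heat transfer coefficient h [W/m^2/K] for a sphere of diameter d_p [m]."""
    re = rho_g * u_rel * d_p / mu_g              # particle Reynolds number
    pr = cp_g * mu_g / k_g                       # gas Prandtl number
    nu = 2.0 + 0.6 * re**0.5 * pr**(1.0 / 3.0)   # Ranz-Marshall Nusselt number
    return nu * k_g / d_p

# Example: 0.5 mm biomass particle in hot inert gas (assumed properties)
h = ranz_marshall_htc(d_p=0.5e-3, u_rel=1.0,
                      rho_g=0.35, mu_g=3.5e-5, cp_g=1100.0, k_g=0.06)
print(f"h = {h:.1f} W/m2K")
```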

Relevance: 100.00%

Abstract:

Safety enforcement practitioners in Europe, as well as marketers, designers and manufacturers of consumer products, need to determine compliance with the legal test of "reasonable safety" for consumer goods and to reduce the "risks" of injury to a minimum. To enable free movement of products, a safety appraisal method is required that can be applied consistently throughout Europe and used as an "expert" system of hazard analysis by non-experts in the safety testing of consumer goods. Safety testing approaches and the concepts of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety that integrates the human factors contributions of risk assessment, hazard perception, and information processing. The model provides a system of hazard identification, hazard analysis and risk assessment that can be applied to a wide range of consumer products through a series of systematic checklists and matrices, and it applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select the participating Trading Standards Departments, to explore the contribution of potential subjective influences, and to establish views on the usability and reliability of the model and any preferences for the risk assessment scoring systems used. The outcomes of the two-stage hazard analysis and risk assessment process are examined to determine the consistency of the hazard analysis results and of the final decisions on the safety of the sample products, and to identify any correlation between decisions made with the model and with the alternative scoring methods of risk assessment. The research also identifies a number of opportunities for future work and indicates several areas where further work has already begun.
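As a rough illustration of the kind of numerical scoring such an appraisal model combines with hazard checklists and matrices, the sketch below multiplies assumed severity and likelihood scales; the scales, weights and aggregation are hypothetical and are not the scoring system developed in the thesis.

```python
# Illustrative sketch only: a generic severity-by-likelihood scoring matrix of
# the kind combined with hazard checklists. Scales and weights are assumptions.

SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "severe": 4, "fatal": 5}
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}

def hazard_score(severity, likelihood, exposure_weight=1.0):
    """Numerical risk score for one identified hazard."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood] * exposure_weight

def product_risk_score(hazards):
    """Aggregate score over all hazards identified for a product."""
    return sum(hazard_score(*h) for h in hazards)

# Example: two hazards found on a toy (sharp edge, small detachable part)
print(product_risk_score([("serious", "possible", 1.0),
                          ("severe", "unlikely", 1.5)]))
```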

Relevance: 100.00%

Abstract:

A model of the cognitive process of natural language processing has been developed using the formalism of generalized nets. Following this stage-simulating model, the treatment of information inevitably includes phases which require joint operations in two knowledge spaces: language and semantics. In order to examine and formalize the relations between the language and semantic levels of treatment, the language is presented as an information system conceived on the basis of human cognitive resources, semantic primitives, semantic operators, and language rules and data. This approach is applied to modeling a specific grammatical rule, the secondary predication in Russian. Grammatical rules of the language space are expressed as operators in the semantic space. Examples from the linguistics domain are treated and several conclusions about the semantics of the modeled rule are drawn. The results of applying the information-system approach to the language turn out to be consistent with the stages of treatment modeled with the generalized net.

Relevance: 100.00%

Abstract:

This paper analyzes difficulties with the introduction of object-oriented concepts in introductory computing education and then proposes a two-language, two-paradigm curriculum model that alleviates such difficulties. The curriculum model begins with teaching imperative programming using the Python programming language, continues with teaching object-oriented computing using Java, and concludes with teaching object-oriented data structures in Java.
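A minimal illustration of the paradigm shift the curriculum sequences is sketched below; for brevity both stages are shown in Python, whereas the curriculum itself moves to Java for the object-oriented stages.

```python
# Illustrative sketch of the two paradigms the curriculum sequences: the same
# task first in the imperative style taught with Python, then in an
# object-oriented style (taught in Java in the actual curriculum).

# Stage 1: imperative - data and the operations on it are kept separate.
grades = [72, 85, 90, 64]
total = 0
for g in grades:
    total += g
print(total / len(grades))

# Stage 2: object-oriented - data and behaviour are bundled in a class.
class GradeBook:
    def __init__(self):
        self.grades = []

    def add(self, grade):
        self.grades.append(grade)

    def average(self):
        return sum(self.grades) / len(self.grades)

book = GradeBook()
for g in [72, 85, 90, 64]:
    book.add(g)
print(book.average())
```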

Relevance: 100.00%

Abstract:

The cell:cell bond between an immune cell and an antigen-presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell surface molecules on the opposing cells form non-covalent bonds and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), which is responsible for antigen recognition through its binding with a major histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell and ultimately lead to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, due to the spatio-temporal scales (nanometers and picoseconds) that compare with experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, which has previously been used to successfully calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, which produces results suggesting that the average time persistence for the TCR:pMHC bond is on the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling. The study reveals two distinct scaling regimes in the time-persistence survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average time persistence calculation, which has an important biological implication when comparing the probabilistic models to experimental evidence: in cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting that recalibrating the experimental conditions to adhere to this scaling relationship will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Finally, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability that is independent of the bond length.
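A minimal sketch of a two-threshold (hysteresis) persistence-time calculation of the kind described above follows; the trajectory is synthetic and the threshold values are assumptions, so the numbers it produces are purely illustrative of the procedure, not of the thesis results.

```python
import numpy as np

def persistence_times_two_threshold(x, t, lower, upper):
    """Durations for which the signal x(t) stays 'bound': an interval starts
    when x drops below `lower` and ends only when x rises above `upper`
    (the hysteresis suppresses spurious re-crossings caused by noise)."""
    times, bound, t_start = [], False, 0.0
    for xi, ti in zip(x, t):
        if not bound and xi < lower:
            bound, t_start = True, ti
        elif bound and xi > upper:
            bound = False
            times.append(ti - t_start)
    return np.array(times)

# Synthetic example: noisy separation trajectory (arbitrary units)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 10_000)
x = np.sin(2 * np.pi * 0.3 * t) + 0.2 * rng.standard_normal(t.size)

tau = persistence_times_two_threshold(x, t, lower=-0.5, upper=0.5)
print("mean persistence time:", tau.mean() if tau.size else "no events")
```

Setting `lower` equal to `upper` recovers the single-threshold variant mentioned in the abstract.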

Relevance: 100.00%

Abstract:

The distribution and abundance of the American crocodile (Crocodylus acutus) in the Florida Everglades is dependent on the timing, amount, and location of freshwater flow. One of the goals of the Comprehensive Everglades Restoration Plan (CERP) is to restore historic freshwater flows to American crocodile habitat throughout the Everglades. To predict the impacts on the crocodile population from planned restoration activities, we created a stage-based spatially explicit crocodile population model that incorporated regional hydrology models and American crocodile research and monitoring data. Growth and survival were influenced by salinity, water depth, and density-dependent interactions. A stage-structured spatial model was used with discrete spatial convolution to direct crocodiles toward attractive sources where conditions were favorable. The model predicted that CERP would have both positive and negative impacts on American crocodile growth, survival, and distribution. Overall, crocodile populations across south Florida were predicted to decrease by approximately 3% with the implementation of CERP compared to future conditions without restoration, but local increases of up to 30% occurred in the Joe Bay area near Taylor Slough, and local decreases of up to 30% occurred in the vicinity of Buttonwood Canal due to changes in salinity and freshwater flows.
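The following sketch shows, in outline only, how a stage-based projection can be combined with a discrete spatial convolution; the stage classes, vital rates, dispersal kernel and habitat-suitability grid are placeholder assumptions rather than CERP or crocodile-monitoring parameters, and the attraction-weighted redistribution used in the study is simplified here to a symmetric kernel without density dependence.

```python
import numpy as np
from scipy.ndimage import convolve

# Minimal sketch of a stage-based, spatially explicit projection. All numbers
# below are illustrative assumptions, not parameters from the study.

ny, nx = 20, 20
pop = np.zeros((3, ny, nx))                   # stages: hatchling, juvenile, adult
pop[2, 10, 10] = 50.0                         # start with adults in one cell

s_hatch, s_juv, s_adult = 0.4, 0.7, 0.9       # annual stage survival (assumed)
g_hatch, g_juv = 0.3, 0.1                     # fraction advancing to next stage (assumed)
fecundity = 8.0                               # hatchlings per adult per year (assumed)

kernel = np.ones((3, 3)) / 9.0                # local dispersal kernel
suitability = np.random.default_rng(1).uniform(0.2, 1.0, (ny, nx))  # salinity/depth proxy

for year in range(25):
    hatch, juv, adult = pop
    new_hatch = fecundity * adult * suitability + s_hatch * (1 - g_hatch) * hatch
    new_juv = s_hatch * g_hatch * hatch + s_juv * (1 - g_juv) * juv
    new_adult = s_juv * g_juv * juv + s_adult * adult
    pop = np.stack([new_hatch, new_juv, new_adult])
    # discrete spatial convolution redistributes animals over neighbouring cells
    pop = np.stack([convolve(stage, kernel, mode="wrap") for stage in pop])

print("total population after 25 years:", round(float(pop.sum()), 1))
```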

Relevance: 100.00%

Abstract:

A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both the thermodynamic (equilibrium) and time-varying (non-steady-state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple approaches have been made to model landfill statistics, incorporating spatially dependent parameters on the one hand (data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables (solid mass density, hydrolysed mass density, acetogenic mass density and methanogenic mass density) that are themselves subject to stochastic fluctuations, coupled with diffusive relaxation of the individual densities, in ambient surroundings. Our results indicate that, close to the linearly stable limit, the large-time steady-state properties, arising from a series of complex coupled interactions between the stochastically driven variables, are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities only, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity in the landfill evolution dynamics is the greatly reduced production times of the plants, now approximately 20-30 years instead of the 50 years and above predicted by earlier deterministic models. The predictions from this stochastic model are in conformity with available experimental observations.
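A minimal one-dimensional sketch of such a stochastically forced reaction-diffusion system follows, stepping the four coupled densities with an Euler-Maruyama scheme; all rate constants, diffusivities and noise amplitudes are illustrative assumptions, not calibrated landfill parameters, and the coupling terms are generic first-order forms rather than the article's equations.

```python
import numpy as np

# 1-D sketch: four coupled densities (solid S, hydrolysed H, acetogenic A,
# methanogenic M) with diffusion and additive noise, stepped with Euler-Maruyama.

rng = np.random.default_rng(2)
n, dx, dt, steps = 100, 1.0, 0.01, 20_000
k_hyd, k_ac, k_me = 0.05, 0.3, 0.2            # hydrolysis / acetogenic / methanogenic rates (assumed)
D = np.array([0.0, 0.5, 0.1, 0.1])            # diffusivities (solid does not diffuse)
sigma = 0.01                                  # noise amplitude (assumed)

S = np.full(n, 1.0)                           # initial solid mass density
H = np.zeros(n); A = np.full(n, 0.05); M = np.full(n, 0.05)

def laplacian(f):
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

for _ in range(steps):
    noise = sigma * np.sqrt(dt) * rng.standard_normal((4, n))
    dS = -k_hyd * S                           # solid mass hydrolyses away
    dH = k_hyd * S - k_ac * H * A             # hydrolysed mass consumed by acetogens
    dA = k_ac * H * A - 0.1 * A               # acetogenic growth minus decay
    dM = k_me * A - 0.1 * M                   # methanogens fed by acetogenic products, minus decay
    S = np.clip(S + dt * (dS + D[0] * laplacian(S)) + noise[0], 0, None)
    H = np.clip(H + dt * (dH + D[1] * laplacian(H)) + noise[1], 0, None)
    A = np.clip(A + dt * (dA + D[2] * laplacian(A)) + noise[2], 0, None)
    M = np.clip(M + dt * (dM + D[3] * laplacian(M)) + noise[3], 0, None)

print("late-time mean densities:", S.mean(), H.mean(), A.mean(), M.mean())
```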

Relevance: 100.00%

Abstract:

We consider how three firms compete in a Salop location model and how cooperation in location choice by two of these firms affects the outcomes. We consider the classical case of linear transportation costs as a two-stage game in which the firms first select a location on a unit circle along which consumers are dispersed evenly, followed by the competitive selection of a price. Standard analysis restricts itself to purely competitive selection of location; instead, we focus on the situation in which two firms collectively decide about location, but price their products competitively after the location choice has been effectuated. We show that such partial coordination of location is beneficial to all firms, since it reduces the number of equilibria significantly and, thereby, the resulting coordination problem. Subsequently, we show that the case of quadratic transportation costs changes the main conclusions only marginally.
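To make the second-stage price competition concrete, the sketch below computes demands and profits on the unit circle for given locations and prices under linear transportation costs, assuming each firm competes only with its two adjacent neighbours and marginal cost is zero; the numbers are illustrative and the sketch does not solve for the equilibrium of the two-stage game.

```python
import numpy as np

def salop_demands(locations, prices, t=1.0):
    """Demands on a unit circle with linear transport cost t and unit consumer
    mass; each consumer buys from the cheaper of the two adjacent firms,
    price plus transport cost considered."""
    order = np.argsort(locations)
    loc, p = np.asarray(locations)[order], np.asarray(prices)[order]
    n = len(loc)
    demand = np.zeros(n)
    for i in range(n):
        j = (i + 1) % n                        # clockwise neighbour
        gap = (loc[j] - loc[i]) % 1.0          # arc length between firm i and firm j
        # indifferent consumer: p_i + t*x = p_j + t*(gap - x)
        x = np.clip((gap + (p[j] - p[i]) / t) / 2.0, 0.0, gap)
        demand[i] += x                         # consumers on this arc served by i
        demand[j] += gap - x                   # remainder served by j
    out = np.zeros(n)
    out[order] = demand
    return out

# Example: symmetric locations, the middle firm undercutting slightly
prices = np.array([0.50, 0.45, 0.50])
d = salop_demands([0.0, 1 / 3, 2 / 3], prices)
print("demands:", d.round(3), "profits:", (d * prices).round(3))
```

Wrapping this demand function in a numerical best-response search over prices would give the second-stage price outcome for fixed locations.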

Relevance: 100.00%

Abstract:

This dissertation focuses mainly on coordinated pricing and inventory management problems; the related background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2, 3, 4 and 5, respectively. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred if the current selling price is changed from the previous period. We develop exact algorithms for the problem under different conditions and find that the computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which the demand of a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial time algorithms are designed for the case with no fixed ordering cost, and a heuristic is proposed for the general case together with an error bound estimation. Moreover, we illustrate through numerical studies that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the model in Chapter 3 when customers are loss averse. It extends the associated results developed in the literature and proves that the reference-price-dependent base-stock policy is optimal under certain conditions. Instead of dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems. This property and its extensions include several existing results in the literature as special cases and provide powerful tools, as we illustrate through their applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
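A minimal sketch of the reference-price mechanism described for Chapters 3 and 4 follows; the demand form, the loss-aversion coefficients and the base-stock mapping are illustrative assumptions, not the dissertation's model.

```python
# Sketch: demand depends on the current price and on a reference price built
# from past prices; inventory follows a reference-price-dependent base-stock
# rule. All coefficients and functional forms below are assumptions.

alpha, a, b = 0.7, 100.0, 1.5                 # reference-price memory and linear demand coefficients
eta_gain, eta_loss = 0.8, 2.0                 # loss-averse customers react more when p > r

def demand(p, r):
    surplus = r - p
    ref_term = eta_gain * surplus if surplus >= 0 else eta_loss * surplus
    return max(a - b * p + ref_term, 0.0)

def base_stock(r):
    return 40.0 + 0.5 * r                     # assumed increasing in the reference price

r, inventory = 30.0, 0.0
for t, p in enumerate([30, 28, 26, 30, 34, 32]):
    inventory = max(inventory, base_stock(r))          # order up to the base-stock level
    d = demand(p, r)
    inventory = max(inventory - d, 0.0)
    r = alpha * r + (1 - alpha) * p                    # exponentially smoothed reference price
    print(f"t={t} price={p} demand={d:.1f} inventory={inventory:.1f} ref={r:.1f}")
```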

Relevance: 100.00%

Abstract:

The Li-ion rechargeable battery (LIB) is widely used as an energy storage device, but has significant limitations in battery cycle life and safety. During initial charging, decomposition of the ethylene carbonate (EC)-based electrolytes of the LIB leads to the formation of a passivating layer on the anode known as the solid electrolyte interphase (SEI). The formation of the SEI has a great impact on the cycle life and safety of the LIB, yet mechanistic aspects of SEI formation are not fully understood. In this dissertation, two surface science model systems have been created under ultra-high vacuum (UHV) to probe the very initial stage of SEI formation at the model carbon anode surfaces of the LIB. The first model system, Model System I, is a lithium-carbonate electrolyte/graphite C(0001) system. I have developed a temperature programmed desorption/temperature programmed reaction spectroscopy (TPD/TPRS) instrument as part of my dissertation to study Model System I in quantitative detail. The binding strengths and film growth mechanisms of key electrolyte molecules on model carbon anode surfaces with varying extents of lithiation were measured by TPD. TPRS was further used to track the gases evolved from different reduction products in early-stage SEI formation. The branching ratio of multiple reaction pathways was quantified for the first time and determined to be 70% organolithium products vs. 30% inorganic lithium products. The obtained branching ratio provides important information on the distribution of lithium salts that form at the very onset of SEI formation. One of the key reduction products formed from EC in early-stage SEI formation is lithium ethylene dicarbonate (LEDC). Despite intensive studies, the LEDC structure in either the bulk or thin-film (SEI) form is unknown. To enable structural study, pure LEDC was synthesized and subjected to synchrotron X-ray diffraction measurements (bulk material) and STM measurements (deposited films). To enable studies of LEDC thin films, Model System II, a lithium ethylene dicarbonate (LEDC)-dimethylformamide (DMF)/Ag(111) system, was created by a solution microaerosol deposition technique. The produced films were then imaged by ultra-high vacuum scanning tunneling microscopy (UHV-STM). As a control, the dimethylformamide (DMF)-Ag(111) system was first prepared and its complex 2D phase behavior was mapped out as a function of coverage. The evolution of three distinct monolayer phases of DMF was observed with increasing surface pressure: a 2D gas phase, an ordered DMF phase, and an ordered Ag(DMF)2 complex phase. The addition of LEDC to this mixture seeded the nucleation of the ordered DMF islands at lower surface pressures (DMF coverages), and was interpreted through nucleation theory. A structural model of the nucleation seed was proposed, and the implication of ionic SEI products, such as LEDC, in early-stage SEI formation was discussed.

Relevance: 100.00%

Abstract:

A classical approach for dealing with two- and multi-stage optimization problems under uncertainty is scenario analysis. To do so, the uncertainty in some of the problem data is modelled by random vectors with finite, stage-specific supports; each realization represents a scenario. Using scenarios, it is possible to study simpler versions (subproblems) of the original problem. As a scenario decomposition technique, the progressive hedging algorithm is one of the most popular methods for solving multi-stage stochastic programming problems. Despite the complete decomposition by scenario, the efficiency of the progressive hedging method is very sensitive to certain practical aspects, such as the choice of the penalty parameter and the handling of the quadratic term in the augmented Lagrangian objective function. For the choice of the penalty parameter, we examine some of the popular methods and propose a new adaptive strategy that aims to follow the progress of the algorithm more closely. Numerical experiments on multi-stage linear stochastic problems suggest that most of the existing techniques may exhibit premature convergence to a suboptimal solution or converge to the optimal solution but at a very slow rate. In contrast, the new strategy appears robust and efficient: it converged to optimality in all our experiments and was the fastest in most cases. For the handling of the quadratic term, we review the existing techniques and propose the idea of replacing the quadratic term with a linear term. Although our method remains to be tested, we expect that it will reduce some of the numerical and theoretical difficulties of the progressive hedging method.
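A toy sketch of the progressive hedging iteration described above follows; the quadratic scenario objectives are chosen so that each subproblem has a closed-form minimizer, and the penalty parameter is held fixed, whereas the thesis is precisely about adaptive choices of that parameter.

```python
import numpy as np

# Progressive hedging on a toy problem: each scenario s keeps its own copy x_s
# of the first-stage decision, solves a penalised subproblem, and multipliers
# w_s push the copies toward the nonanticipative average x_bar. The scenario
# objectives f_s(x) = (x - a_s)^2 are a toy choice, not a real recourse model.

a = np.array([1.0, 3.0, 8.0])                 # scenario data
prob = np.array([0.5, 0.3, 0.2])              # scenario probabilities
rho = 1.0                                     # fixed penalty parameter

x = a.copy()                                  # scenario copies of the decision
w = np.zeros_like(a)                          # nonanticipativity multipliers
x_bar = prob @ x

for k in range(50):
    # scenario subproblem: argmin_x (x - a_s)^2 + w_s*x + (rho/2)*(x - x_bar)^2
    x = (2 * a - w + rho * x_bar) / (2 + rho)
    x_bar = prob @ x                          # implementable (nonanticipative) solution
    w = w + rho * (x - x_bar)                 # multiplier update

print("PH solution:    ", round(float(x_bar), 4))
print("direct solution:", round(float(prob @ a), 4))   # both should be ~3.0
```

In a real two-stage stochastic program the per-scenario step would be an LP or MIP solve rather than a closed-form update, but the averaging and multiplier steps are unchanged.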


Relevance: 100.00%

Abstract:

Background: Population antimicrobial use may influence resistance emergence. Resistance is an ecological phenomenon due to its potential transmissibility. We investigated spatial and temporal patterns of ciprofloxacin (CIP) population consumption related to E. coli resistance emergence and dissemination in a major Brazilian city. A total of 4,372 urinary tract infection E. coli cases, 723 of them CIP resistant, were identified in 2002 from two outpatient centres. Cases were address-geocoded on a digital map. Raw CIP sales data were transformed into usage densities in DDDs by determining the influence zones of CIP selling points. A stochastic model coupled with a Geographical Information System was applied to relate resistance to usage density and to detect city areas of high/low resistance risk. Results: The emergence of an E. coli CIP-resistant cluster was detected and significantly related to a usage density of 5 to 9 CIP DDDs. There were clustered hot-spots and a significant global spatial variation in the residual resistance risk after allowing for usage density. Conclusions: A usage density of 5-9 CIP DDDs per 1,000 inhabitants within the same influence zone was the resistance-triggering level. This level led to E. coli resistance clustering, indicating that individual resistance emergence and dissemination were affected by antimicrobial population consumption.
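As a small illustration of the usage-density measure, the sketch below converts assumed sales figures into DDDs per 1,000 inhabitants for each influence zone; the DDD value of 1 g for oral ciprofloxacin follows the WHO ATC/DDD convention, and the zone figures are invented for illustration only.

```python
# Sketch: raw ciprofloxacin sales within a selling point's influence zone
# converted into defined daily doses (DDDs) per 1,000 inhabitants. The sample
# figures are illustrative assumptions, not the study data.

CIP_DDD_GRAMS = 1.0                           # assumed WHO DDD for oral ciprofloxacin

def usage_density(grams_sold, population):
    """DDDs per 1,000 inhabitants in one influence zone."""
    return (grams_sold / CIP_DDD_GRAMS) / population * 1000.0

zones = {"zone A": (350.0, 48_000), "zone B": (90.0, 21_000)}
for name, (grams, pop) in zones.items():
    dens = usage_density(grams, pop)
    flag = "at or above" if dens >= 5 else "below"
    print(f"{name}: {dens:.1f} DDD per 1,000 inhab. ({flag} the 5 DDD triggering level)")
```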

Relevance: 100.00%

Abstract:

Chaotic dynamical systems with two or more attractors lying on invariant subspaces may, provided certain mathematical conditions are fulfilled, exhibit intermingled basins of attraction: each basin is riddled with holes belonging to the basins of the other attractors. In order to investigate the occurrence of such a phenomenon in dynamical systems of ecological interest (two-species competition with extinction), we have characterized the intermingled basins quantitatively using periodic-orbit theory and scaling laws. The latter results agree with a theoretical prediction from a stochastic model, and also with an exact result for the scaling exponent that we derived for the specific class of models investigated. We discuss the consequences of the scaling laws in terms of the predictability of the final state (extinction of either species) in an ecological experiment.
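A minimal sketch of how a final-state scaling exponent can be estimated numerically is given below; the Ricker-type competition map, its parameters and the exponent it produces are illustrative stand-ins, not the model or the exact result derived in the paper.

```python
import math
import numpy as np

def winner(x, y, r=3.0, a=1.5, steps=300):
    """Iterate a Ricker-type two-species competition map and report which
    species survives (0 = first, 1 = second)."""
    for _ in range(steps):
        x, y = (x * math.exp(r * (1 - x - a * y)),
                y * math.exp(r * (1 - y - a * x)))
    return 0 if x > y else 1

# Final-state uncertainty: fraction of initial conditions whose outcome changes
# under a perturbation of size eps is expected to scale as eps**alpha.
rng = np.random.default_rng(3)
eps_values = [1e-2, 3e-3, 1e-3, 3e-4, 1e-4]
fractions = []
for eps in eps_values:
    trials, flips = 200, 0
    for _ in range(trials):
        x0, y0 = rng.uniform(0.05, 1.0, 2)
        base = winner(x0, y0)
        pert = winner(x0 + eps * rng.standard_normal(), y0 + eps * rng.standard_normal())
        flips += base != pert
    fractions.append(max(flips / trials, 1 / trials))   # floor to avoid log(0)

alpha = np.polyfit(np.log(eps_values), np.log(fractions), 1)[0]
print("estimated uncertainty exponent alpha ~", round(float(alpha), 2))
```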