884 results for Stochastic SIS logistic model
Abstract:
Metal-framed traps covered with polyethylene mesh used in the fishery for the South African Cape rock lobster (Jasus lalandii) incidentally capture large numbers of undersize (<75 mm CL) specimens. Air-exposure, handling, and release procedures affect captured rock lobsters and reduce the productivity of the stock, which is heavily fished. Optimally, traps should retain legal-size rock lobsters and allow sublegal animals to escape before traps are hauled. Escapement through meshes of 62 mm, 75 mm, and 100 mm was investigated theoretically, based on lobster morphometric measurements, under controlled conditions in an aquarium, and during field trials. SELECT models were used to model escapement wherever appropriate. Size-selectivity curves based on the logistic model fitted the aquarium and field data better than asymmetrical Richards curves. The lobster length at 50% retention (L50) on the escapement curve for 100-mm mesh in the aquarium (75.5 mm CL) approximated the minimum legal size (75 mm CL); however, estimates of L50 increased to 77.4 mm in field trials where trap entrances were sealed, and to 82.2 mm where trap entrances were open. Therefore, rock lobsters that cannot escape through the mesh of sealed field traps do so through the trap entrance of open traps. By contrast, the wider selection range and lower L25 of field trials compared with aquarium trials (SR = 8.2 mm vs. 2.6 mm; L25 = 73.4 mm vs. 74.1 mm) indicate that small lobsters that should be able to escape from 100-mm mesh traps do not always do so. Escapement from 62-mm mesh traps with open entrance funnels increased by 40–60% over sealed traps. The findings of this study, obtained with a known size distribution, are related to those of a recent indirect (comparative) study for the same species, and implications for trap surveys, commercial catch rates, and ghost fishing are discussed.
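As a minimal illustration of the logistic size-selectivity curve referred to above (a sketch, not the authors' SELECT fitting code), the retention probability can be written in terms of L50 and the selection range SR; the values below are the aquarium estimates for 100-mm mesh quoted in the abstract.

```python
import numpy as np

def logistic_retention(length_cl, l50, sr):
    """Logistic size-selectivity curve.

    length_cl : carapace length (mm)
    l50       : length at 50% retention (mm)
    sr        : selection range, L75 - L25 (mm)
    """
    # Slope chosen so that retention is 25% at l50 - sr/2 and 75% at l50 + sr/2.
    slope = 2.0 * np.log(3.0) / sr
    return 1.0 / (1.0 + np.exp(-slope * (length_cl - l50)))

# Aquarium estimates for 100-mm mesh quoted in the abstract: L50 = 75.5 mm, SR = 2.6 mm.
lengths = np.array([70.0, 74.1, 75.5, 80.0])
print(logistic_retention(lengths, l50=75.5, sr=2.6))
```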
Abstract:
A recent trend in spoken dialogue research is the use of reinforcement learning to train dialogue systems in a simulated environment. Past researchers have shown that the types of errors that are simulated can have a significant effect on simulated dialogue performance. Since modern systems typically receive an N-best list of possible user utterances, it is important to be able to simulate a full N-best list of hypotheses. This paper presents a new method for simulating such errors based on logistic regression, as well as a new method for simulating the structure of N-best lists of semantics and their probabilities, based on the Dirichlet distribution. Off-line evaluations show that the new Dirichlet model results in a much closer match to the receiver operating characteristics (ROC) of the live data. Experiments also show that the logistic model gives confusions that are closer to the type of confusions observed in live situations. The hope is that these new error models will be able to improve the resulting performance of trained dialogue systems. © 2012 IEEE.
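A rough sketch of how an N-best list's confidence scores might be drawn from a Dirichlet distribution, in the spirit of the simulation method described above; the concentration parameter is illustrative, not a value from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_nbest_confidences(n_hypotheses, alpha=0.5):
    """Draw a descending vector of confidence scores for an N-best list.

    A symmetric Dirichlet yields a probability vector over the hypotheses;
    sorting it in descending order mimics the ranked structure of an N-best
    list. The concentration parameter alpha is illustrative only.
    """
    scores = rng.dirichlet([alpha] * n_hypotheses)
    return np.sort(scores)[::-1]

print(simulate_nbest_confidences(5))
```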
Abstract:
This paper proposes a new and effective adaptive control scheme for robots, which overcomes a series of problems encountered by other methods owing to inaccurate models or heavy computational loads. The Lagrange equations of motion are first transformed into an ARMA model, and virtual noise is used to compensate for model errors (i.e., errors due to linearization, decoupling, inaccurate observation, and disturbances). An improved adaptive Kalman filtering algorithm is then used for online parameter identification and state estimation, and the identified parameters are used to design the adaptive controller of the robot control system. Finally, simulation results for the algorithm are presented and discussed.
Abstract:
This paper presents a simplified method for the online identification of state-model parameters of multivariable stochastic systems. Compared with the recursive least-squares adaptive algorithm, it not only reduces the number of parameters to be identified but also, for a class of systems whose model parameters vary slowly, allows the magnitude of parameter variation to be controlled by choosing different sequences of forgetting factors, which solves the aging problem of seasonal models in power system load forecasting. The method is based on a canonical form of the state model with random noise, greatly reducing the computational load and storage requirements, and is well suited to online application on microprocessors.
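The role of the forgetting-factor sequence mentioned above can be illustrated with a generic recursive least-squares update; this is a standard textbook form, not the paper's canonical-form algorithm.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with forgetting factor lam.

    theta : current parameter estimate, shape (n,)
    P     : current covariance matrix, shape (n, n)
    phi   : regressor vector for the new sample, shape (n,)
    y     : new scalar measurement
    lam   : forgetting factor in (0, 1]; smaller values track faster parameter changes
    """
    k = P @ phi / (lam + phi @ P @ phi)        # gain vector
    theta = theta + k * (y - phi @ theta)      # parameter update
    P = (P - np.outer(k, phi @ P)) / lam       # covariance update with forgetting
    return theta, P
```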
Abstract:
This paper applies state-model identification techniques for stochastic systems to power system load forecasting. A state-space model of the load is first built from a series of historical load data, and a filtering algorithm is then used to forecast the next day's load. Finally, forecasts are computed on a PDP-11/23 computer using actual data from the power grid, with satisfactory results.
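A bare-bones sketch of the filtering step behind next-day forecasting of this kind (a generic linear Kalman filter; the paper's specific state-space model for load is not reproduced here).

```python
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : current state estimate and covariance
    z    : new observation (e.g., today's load)
    A, C : state-transition and observation matrices
    Q, R : process- and observation-noise covariances
    Returns the updated state, its covariance, and the one-step-ahead
    forecast of the observation (e.g., tomorrow's load).
    """
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    forecast = C @ x_pred
    # Update with the new observation
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new, forecast
```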
Abstract:
Understanding the relationship between environmental protection and economic development is crucial for forming practical environmental policy. At the micro level, the implementation of environmental regulations often causes production mills to adjust their technology, which may lead to changes in productive efficiency and cost; these, in turn, determine the effort that mills, and even local governments, put into pollution control. Using a stochastic frontier production model and a set of survey data on 126 paper mills from six provinces of China, we measure changes in technical efficiency and analyze the determinants of efficiency. In particular, we examine the impact of environmental policy on paper mills' efficiency, using the levy rate of COD as an indicator of environmental policy. We also estimate a simultaneous-equation model in which the levy rate and emissions are jointly determined. The results indicate that there were efficiency improvements during 1999-2003, when enforcement of environmental regulations was tightened. The impacts, nevertheless, differ across types of mills. We also find that the levy rate, which is influenced both by local social and economic conditions and by the characteristics of paper mills, such as scale, has a strong impact on abatement of the pollutant COD. Additionally, paper mills' technical efficiency has a positive effect on reducing the emission intensity of COD. These results lead to a set of implications pertinent to policy improvement.
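For reference, a stochastic frontier production model of the kind mentioned here is commonly written as follows (a generic specification, not necessarily the exact one estimated in the study):

```latex
\ln y_i = x_i^{\prime}\beta + v_i - u_i,
\qquad v_i \sim N(0,\sigma_v^2), \quad u_i \ge 0,
\qquad \mathrm{TE}_i = \exp(-u_i),
```

where y_i is output, x_i the input vector, v_i symmetric random noise, u_i the non-negative inefficiency term, and TE_i the technical efficiency of mill i.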
Abstract:
To maintain a strict balance between demand and supply in the US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines the times of startup and shutdown, the amount of power production, and the provisioning of spinning and non-spinning power generation reserves. Such a deterministic optimization model takes as input the characteristics of all the generating units, such as their installed power generation capacity, ramp rates, minimum up and down time requirements, and marginal costs of production, as well as the forecast of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using fixed reserve targets as an input, stochastic market clearing models take different wind power scenarios into consideration and determine reserve schedules as output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of a stochastic and a deterministic model in market clearing. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building the models and running the tests indicates that a fair comparison is difficult to obtain because of the multi-dimensional performance metrics considered here and the difficulty of setting up the parameters of the models in a way that does not advantage or disadvantage one modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison of the performance of stochastic vs. deterministic market clearing models.
Abstract:
This work analyzes the relationship between large food webs describing potential feeding relations between species and smaller sub-webs thereof describing relations actually realized in local communities of various sizes. Special attention is given to the relationships between patterns of phylogenetic correlations encountered in large webs and sub-webs. Based on the current theory of food-web topology as implemented in the matching model, it is shown that food webs are scale invariant in the following sense: given a large web described by the model, a smaller, randomly sampled sub-web thereof is described by the model as well. A stochastic analysis of model steady states reveals that such a change in scale goes along with a re-normalization of model parameters. Explicit formulae for the renormalized parameters are derived. Thus, the topology of food webs at all scales follows the same patterns, and these can be revealed by data and models referring to the local scale alone. As a by-product of the theory, a fast algorithm is derived which yields sample food webs from the exact steady state of the matching model for a high-dimensional trophic niche space in finite time. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
One way to restore physiological blood flow to occluded arteries involves the deformation of plaque using an intravascular balloon and preventing elastic recoil using a stent. Angioplasty and stent implantation cause unphysiological loading of the arterial tissue, which may lead to tissue in-growth and reblockage, termed “restenosis.” In this paper, a computational methodology for predicting the time course of restenosis is presented. Stress-induced damage, computed using a remaining-life approach, stimulates inflammation (production of matrix-degrading factors and growth stimuli). This, in turn, induces a change in smooth muscle cell phenotype from contractile (as exists in the quiescent tissue) to synthetic (as exists in the growing tissue). In this paper, smooth muscle cell activity (migration, proliferation, and differentiation) is simulated in a lattice using a stochastic approach to model individual cell activity. The inflammation equations are examined under simplified loading cases. The mechanobiological parameters of the model were estimated by calibrating the model response to the results of a balloon angioplasty study in humans. The simulation method was then used to simulate restenosis in a two-dimensional model of a stented artery. Cell activity predictions were similar to those observed during neointimal hyperplasia, culminating in the growth of restenosis. As in experiments, the amount of neointima produced increased with the degree of expansion of the stent, and this relationship was found to be highly dependent on the prescribed inflammatory response. It was found that the duration of inflammation affected the amount of restenosis produced, and that this effect was most pronounced with large stent expansions. In conclusion, the paper shows that the arterial tissue response to mechanical stimulation can be predicted using a stochastic cell modeling approach, and that the simulation captures features of restenosis development observed with real stents. The modeling approach is proposed for application in three-dimensional models of cardiovascular stenting procedures.
Abstract:
OBJECTIVE To assess the association between circulating angiogenic and antiangiogenic factors in the second trimester and risk of preeclampsia in women with type 1 diabetes.
RESEARCH DESIGN AND METHODS Maternal plasma concentrations of placental growth factor (PlGF), soluble fms-like tyrosine kinase 1 (sFlt-1), and soluble endoglin (sEng) were available at 26 weeks of gestation in 540 women with type 1 diabetes enrolled in the Diabetes and Preeclampsia Intervention Trial.
RESULTS Preeclampsia developed in 17% of pregnancies (n = 94). At 26 weeks of gestation, women in whom preeclampsia developed later had significantly lower PlGF (median [interquartile range]: 231 pg/mL [120–423] vs. 365 pg/mL [237–582]; P < 0.001), higher sFlt-1 (1,522 pg/mL [1,108–3,393] vs. 1,193 pg/mL [844–1,630]; P < 0.001), and higher sEng (6.2 ng/mL [4.9–7.9] vs. 5.1 ng/mL [4.3–6.2]; P < 0.001) compared with women who did not have preeclampsia. In addition, the ratio of PlGF to sEng was significantly lower (40 [17–71] vs. 71 [44–114]; P < 0.001) and the ratio of sFlt-1 to PlGF was significantly higher (6.3 [3.4–15.7] vs. 3.1 [1.8–5.8]; P < 0.001) in women who later developed preeclampsia. The addition of the ratio of PlGF to sEng or the ratio of sFlt-1 to PlGF to a logistic model containing established risk factors (area under the curve [AUC], 0.813) significantly improved the predictive value (AUC, 0.850 and 0.846, respectively; P < 0.01) and significantly improved reclassification according to the integrated discrimination improvement index (IDI) (IDI scores 0.086 and 0.065, respectively; P < 0.001).
CONCLUSIONS These data suggest that angiogenic and antiangiogenic factors measured during the second trimester are predictive of preeclampsia in women with type 1 diabetes. The addition of the ratio of PlGF to sEng or the ratio of sFlt-1 to PlGF to established clinical risk factors significantly improves the prediction of preeclampsia in women with type 1 diabetes.
Preeclampsia is characterized by the development of hypertension and new-onset proteinuria during the second half of pregnancy (1,2), leading to increased maternal morbidity and mortality (3). Women with type 1 diabetes are at increased risk for development of preeclampsia during pregnancy, with rates two to four times higher than those of the background maternity population (4,5). Small advances have come from preventive measures, such as low-dose aspirin in women at high risk (6); however, delivery remains the only effective intervention, and preeclampsia is responsible for up to 15% of preterm births and a consequent increase in infant mortality and morbidity (7).
Although the etiology of preeclampsia remains unclear, abnormal placental vascular remodeling and placental ischemia, together with maternal endothelial dysfunction, hemodynamic changes, and renal pathology, contribute to its pathogenesis (8). In addition, over the past decade accumulating evidence has suggested that an imbalance between angiogenic factors, such as placental growth factor (PlGF), and antiangiogenic factors, such as soluble fms-like tyrosine kinase 1 (sFlt-1) and soluble endoglin (sEng), plays a key role in the pathogenesis of preeclampsia (8,9). In women at low risk (10–13) and women at high risk (14,15), concentrations of angiogenic and antiangiogenic factors are significantly different between women who later develop preeclampsia (lower PlGF, higher sFlt-1, and higher sEng levels) compared with women who do not.
Few studies have specifically focused on circulating angiogenic factors and risk of preeclampsia in women with diabetes, and the results have been conflicting. In a small study, higher sFlt-1 and lower PlGF were reported at the time of delivery in women with diabetes who developed preeclampsia (16). In a longitudinal prospective cohort of pregnant women with diabetes, Yu et al. (17) reported increased sFlt-1 and reduced PlGF in the early third trimester as potential predictors of preeclampsia in women with type 1 diabetes, but they did not show any difference in sEng levels in women with preeclampsia compared with women without preeclampsia. By contrast, Powers et al. (18) reported only increased sEng in the second trimester in women with pregestational diabetes who developed preeclampsia.
The aim of this study, which was significantly larger than the previous studies highlighted, was to assess the association between circulating angiogenic (PlGF) and antiangiogenic (sFlt-1 and sEng) factors and the risk of preeclampsia in women with type 1 diabetes. A further aim was to evaluate the added predictive ability and clinical usefulness of angiogenic factors and established risk factors for preeclampsia risk prediction in women with type 1 diabetes.
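A minimal sketch of the kind of analysis described above: a logistic model on established risk factors, with the sFlt-1 to PlGF ratio added, compared by AUC. The file name, column names, and risk factors below are hypothetical, and this is not the study's analysis code (which also reports the IDI).

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical data frame: binary 'preeclampsia' outcome, baseline clinical
# risk factors, and second-trimester biomarker concentrations.
df = pd.read_csv("dapit_biomarkers.csv")            # hypothetical file name
baseline = ["age", "bmi", "hba1c", "nulliparous"]   # hypothetical risk factors
df["sflt1_plgf_ratio"] = df["sflt1"] / df["plgf"]
full = baseline + ["sflt1_plgf_ratio"]

y = df["preeclampsia"]
base_model = LogisticRegression(max_iter=1000).fit(df[baseline], y)
full_model = LogisticRegression(max_iter=1000).fit(df[full], y)

auc_base = roc_auc_score(y, base_model.predict_proba(df[baseline])[:, 1])
auc_full = roc_auc_score(y, full_model.predict_proba(df[full])[:, 1])
print(f"AUC, baseline: {auc_base:.3f}; with sFlt-1/PlGF ratio: {auc_full:.3f}")
```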
Abstract:
Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes, since they allow the analysis of root causes even when the number of input variables is high. In this work, we investigate the performance of three techniques for classification within a Monte Carlo strategy with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity, namely, a LASSO-like implementation with L1-norm regularization and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, the situation where the number of variables is higher than the number of available data points, and the case of unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior performance in terms of classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to gain more robust insights into the problem under analysis when faced with challenging modelling conditions.
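As a rough illustration of the LASSO-like variant described above, the sketch below runs L1-penalized logistic regression inside a simple bootstrap Monte Carlo loop and counts how often each variable is selected; the data are synthetic, and the relevance vector machine and naive Bayes classifiers are not shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic, high-dimensional data: only the first two variables carry signal.
n, p = 200, 50
X = rng.normal(size=(n, p))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

selection_counts = np.zeros(p)
n_runs = 100
for _ in range(n_runs):
    idx = rng.choice(n, size=n, replace=True)        # bootstrap resample
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X[idx], y[idx])
    selection_counts += (clf.coef_.ravel() != 0)

# Variables selected in most runs are candidate root causes.
print(np.argsort(selection_counts)[::-1][:5])
print(selection_counts.max() / n_runs)
```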
Abstract:
Many cancer patients die in institutional settings despite their preference to die at home. A longitudinal, prospective cohort study was conducted to comprehensively assess the determinants of home death for patients receiving home-based palliative care. Data collected from biweekly telephone interviews with caregivers (n=302) and program databases were entered into a multivariate logistic model. Patients with high nursing costs (odds ratio [OR]: 4.3; confidence interval [CI]: 1.8-10.2) and patients with high personal support worker costs (OR: 2.3; CI: 1.1-4.5) were more likely to die at home than those with low costs. Patients who lived alone were less likely to die at home than those who cohabitated (OR: 0.4; CI: 0.2-0.8), and those with a high propensity for a home-death preference were more likely to die at home than those with a low propensity (OR: 5.8; CI: 1.1-31.3). An understanding of the predictors of place of death may contribute to the development of effective interventions that support home death.
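A minimal sketch of a multivariate logistic model reported as odds ratios with 95% confidence intervals, as in the abstract above; the file and column names are hypothetical, and this is not the study's analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame with one row per patient and the predictors
# named in the abstract coded as 0/1 indicators.
df = pd.read_csv("home_death_cohort.csv")            # hypothetical file name
X = sm.add_constant(df[["high_nursing_cost", "high_psw_cost",
                        "lives_alone", "home_death_preference"]])
model = sm.Logit(df["home_death"], X).fit()

# Exponentiated coefficients give odds ratios and 95% confidence intervals.
print(pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1))
```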
Abstract:
The cognitive reflection test (CRT) is a short measure of a person's ability to resist intuitive response tendencies and to produce a normatively correct response, which is based on effortful reasoning. Although the CRT is a very popular measure, its psychometric properties have not been extensively investigated. A major limitation of the CRT is the difficulty of the items, which can lead to floor effects in populations other than highly educated adults. The present study aimed at investigating the psychometric properties of the CRT applying item response theory analyses (a two-parameter logistic model) and at developing a new version of the scale (the CRT-long), which is appropriate for participants with both lower and higher levels of cognitive reflection. The results demonstrated the good psychometric properties of the original, as well as the new scale. The validity of the new scale was also assessed by measuring correlations with various indicators of intelligence, numeracy, reasoning and decision-making skills, and thinking dispositions. Moreover, we present evidence for the suitability of the new scale to be used with developmental samples. Finally, by comparing the performance of adolescents and young adults on the CRT and CRT-long, we report the first investigation into the development of cognitive reflection.
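For reference, the two-parameter logistic item response model mentioned above gives the probability of a correct response as a function of latent ability; the parameter values below are illustrative, not estimates from the study.

```python
import numpy as np

def two_pl(theta, a, b):
    """Two-parameter logistic (2PL) item response function.

    theta : latent ability of the respondent
    a     : item discrimination
    b     : item difficulty
    Returns the probability of a correct response.
    """
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Illustrative values only: a difficult, highly discriminating item.
print(two_pl(theta=np.array([-1.0, 0.0, 1.0, 2.0]), a=2.0, b=1.0))
```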
Abstract:
Thesis (Master's)--University of Washington, 2015
Abstract:
This paper constructs and estimates a sticky-price, Dynamic Stochastic General Equilibrium model with heterogeneous production sectors. Sectors differ in price stickiness, capital-adjustment costs and production technology, and use output from each other as material and investment inputs following an Input-Output Matrix and Capital Flow Table that represent the U.S. economy. By relaxing the standard assumption of symmetry, this model allows different sectoral dynamics in response to monetary policy shocks. The model is estimated by Simulated Method of Moments using sectoral and aggregate U.S. time series. Results indicate 1) substantial heterogeneity in price stickiness across sectors, with quantitatively larger differences between services and goods than previously found in micro studies that focus on final goods alone, 2) a strong sensitivity to monetary policy shocks on the part of construction and durable manufacturing, and 3) similar quantitative predictions at the aggregate level by the multi-sector model and a standard model that assumes symmetry across sectors.