936 results for Parametric sensitivity analysis
Abstract:
Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
Abstract:
Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The consultation on the United Nations post-2015 framework for disaster risk reduction highlights the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save lives and property and reduce the overall impact of severe events. Continental and global scale flood forecasting systems provide vital early flood warning information to national and international civil protection authorities, who can use this information to make decisions on how to prepare for upcoming floods. Here the potential monetary benefits of early flood warnings are estimated based on the forecasts of the continental-scale European Flood Awareness System (EFAS) using existing flood damage cost information and calculations of potential avoided flood damages. The benefits are of the order of 400 Euro for every 1 Euro invested. A sensitivity analysis is performed in order to test the uncertainty in the method and develop an envelope of potential monetary benefits of EFAS warnings. The results provide clear evidence that there is likely a substantial monetary benefit in this cross-border continental-scale flood early warning system. This supports the wider drive to implement early warning systems at the continental or global scale to improve our resilience to natural hazards.
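The benefit estimate above hinges on avoided flood damages per Euro of system cost, with a sensitivity analysis spanning an envelope of assumptions. A minimal sketch of that kind of benefit-cost envelope is below; all figures (system cost, addressable damages, avoidance fractions) are illustrative assumptions, not EFAS values.

```python
# Hypothetical benefit-cost envelope for an early flood warning system.
# All monetary figures are illustrative assumptions, not EFAS data.

def benefit_cost_ratio(avoided_damage, system_cost):
    """Ratio of avoided flood damages to the cost of running the system."""
    return avoided_damage / system_cost

system_cost = 1.0e6        # assumed annual system cost (EUR)
potential_damage = 1.0e9   # assumed damages addressable by warnings (EUR)

# Envelope: vary the assumed fraction of damages avoided by early action.
envelope = [benefit_cost_ratio(potential_damage * f, system_cost)
            for f in (0.2, 0.4, 0.6)]
print(envelope)  # one ratio per assumed avoidance fraction
```

Varying the avoidance fraction (and, analogously, the damage and cost inputs) yields the envelope of potential benefits rather than a single point estimate.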
Abstract:
Partial budgeting was used to estimate the net benefit of blending Jersey milk in Holstein-Friesian milk for Cheddar cheese production. Jersey milk increases Cheddar cheese yield. However, the cost of Jersey milk is also higher; thus, determining the balance of profitability is necessary, including consideration of seasonal effects. Input variables were based on a pilot plant experiment run from 2012 to 2013 and industry milk and cheese prices during this period. When Jersey milk was used at an increasing rate with Holstein-Friesian milk (25, 50, 75, and 100% Jersey milk), it resulted in an increase in average net profit of 3.41, 6.44, 8.57, and 11.18 pence per kilogram of milk, respectively, and this additional profit was constant throughout the year. Sensitivity analysis showed that the most influential input on additional profit was cheese yield, whereas cheese price and milk price had a small effect. The minimum increase in yield, which was necessary for the use of Jersey milk to be profitable, was 2.63, 7.28, 9.95, and 12.37% at 25, 50, 75, and 100% Jersey milk, respectively. Including Jersey milk did not affect the quantity of whey butter and powder produced. Although further research is needed to ascertain the amount of additional profit that would be found on a commercial scale, the results indicate that using Jersey milk for Cheddar cheese making would lead to an improvement in profit for the cheese makers, especially at higher inclusion rates.
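The partial-budget logic here is extra cheese revenue minus the extra cost of the dearer milk, with a break-even yield increase where the two balance. A hedged sketch of that calculation follows; the prices and yields used are illustrative assumptions, not the study's data.

```python
# Hypothetical partial-budget sketch for blending Jersey milk.
# Prices below are illustrative assumptions, not the study's figures.

def net_benefit_per_kg(extra_yield, cheese_price, extra_milk_cost):
    """Added profit per kg of milk: extra cheese revenue minus extra milk cost.

    extra_yield     -- additional kg of cheese per kg of milk
    cheese_price    -- pence per kg of cheese
    extra_milk_cost -- extra pence per kg of milk for the Jersey blend
    """
    return extra_yield * cheese_price - extra_milk_cost

def break_even_yield(cheese_price, extra_milk_cost):
    """Minimum extra cheese yield (kg cheese per kg milk) that just covers
    the higher milk cost."""
    return extra_milk_cost / cheese_price

cheese_price = 300.0    # assumed pence per kg cheese
extra_milk_cost = 3.0   # assumed extra pence per kg milk
print(break_even_yield(cheese_price, extra_milk_cost))
```

Any yield increase above the break-even value turns the blend profitable, which is why the sensitivity analysis found cheese yield to be the dominant input.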
Abstract:
A mathematical model for Banana Xanthomonas Wilt (BXW) spread by insects is presented. The model incorporates inflorescence infection and vertical transmission from the mother corm to attached suckers, but not tool-based transmission by humans. Expressions for the basic reproduction number R0 are obtained and it is verified that disease persists, at a unique endemic level, when R0 > 1. From sensitivity analysis, the inflorescence infection rate and roguing rate were the parameters with most influence on disease persistence and equilibrium level. Vertical transmission parameters had less effect on persistence threshold values. Parameters were approximately estimated from field data. The model indicates that single stem removal is a feasible approach to eradication if spread is mainly via inflorescence infection. This requires continuous surveillance and debudding, such that a 50% reduction in inflorescence infection and a surveillance interval of 2–3 weeks would eventually lead to full recovery of banana plantations and hence improved production.
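The sensitivity analysis behind such rankings is commonly done with normalized sensitivity indices of R0 with respect to each parameter. A sketch of that standard technique is below; the R0 expression and parameter values used are placeholders, not the BXW model's actual formula.

```python
# Hedged sketch: normalized sensitivity indices S_p = (p / R0) * dR0/dp,
# estimated by central differences. The r0 formula and parameter values
# are placeholders, not the BXW model's expression.

def r0(beta, rho, mu):
    """Placeholder R0: infection rate beta over total removal (roguing rho
    plus natural removal mu)."""
    return beta / (rho + mu)

def sensitivity_index(f, params, name, h=1e-6):
    """Normalized index (p / f) * df/dp via a relative central difference."""
    base = f(**params)
    hi = dict(params); hi[name] = params[name] * (1 + h)
    lo = dict(params); lo[name] = params[name] * (1 - h)
    deriv = (f(**hi) - f(**lo)) / (2 * h * params[name])
    return params[name] / base * deriv

params = {"beta": 0.05, "rho": 0.1, "mu": 0.02}  # assumed values
for name in params:
    print(name, round(sensitivity_index(r0, params, name), 3))
```

For this placeholder R0, the index for beta is exactly 1 (R0 scales linearly with it), while roguing carries a large negative index, mirroring the abstract's finding that these two rates dominate persistence.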
Abstract:
We have shown previously that participants “at risk” of depression have decreased neural processing of reward, suggesting this might be a neural biomarker for depression. However, how the neural signal related to subjective experiences of reward (wanting, liking, intensity) might differ as a trait marker for depression is as yet unknown. Using SPM8 parametric modulation analysis, the neural signal related to the subjective report of wanting, liking and intensity was compared between 25 young people with a biological parent with depression (FH) and 25 age/gender-matched controls. In a second study, the neural signal related to the subjective report of wanting, liking and intensity was compared between 13 unmedicated recovered depressed (RD) patients and 14 healthy age/gender-matched controls. The analysis revealed differences in the neural signal for wanting, liking and intensity ratings in the ventral striatum, dmPFC and caudate, respectively, in the RD group compared to controls. Despite no differences in the FH group's neural signal for wanting and liking, there was a difference in the neural signal for intensity ratings in the dACC and anterior insula compared to controls. These results suggest that the neural substrates tracking the intensity, but not the wanting or liking, of rewards and punishers might be a trait marker for depression.
Abstract:
A particle filter method is presented for the discrete-time filtering problem with nonlinear Itô stochastic ordinary differential equations (SODE) with additive noise, supposed to be analytically integrable as a function of the underlying vector Wiener process and time. The Diffusion Kernel Filter is arrived at by a parametrization of small noise-driven state fluctuations within branches of prediction and a local use of this parametrization in the Bootstrap Filter. The method applies for small noise and short prediction steps. With explicit numerical integrators, the operations count in the Diffusion Kernel Filter is shown to be smaller than in the Bootstrap Filter whenever the initial state for the prediction step has sufficiently few moments. The established parametrization is a dual formula for the analysis of sensitivity to Gaussian initial perturbations and the analysis of sensitivity to noise perturbations in deterministic models, showing in particular how the stability of a deterministic dynamics is modeled by noise on short times and how the diffusion matrix of an SODE should be modeled (i.e. defined) for a Gaussian-initial deterministic problem to be cast into an SODE problem. From it, a novel definition of prediction may be proposed that coincides with the deterministic path within the branch of prediction whose information entropy at the end of the prediction step is closest to the average information entropy over all branches. Tests are made with the Lorenz-63 equations, showing good results both for the filter and the definition of prediction.
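For context on the baseline against which the Diffusion Kernel Filter is compared, a minimal Bootstrap Filter can be sketched on a one-dimensional toy SODE integrated by Euler-Maruyama. Everything below (the model, noise levels, observation operator and observations) is an illustrative assumption; this is the standard bootstrap algorithm, not the paper's method.

```python
# Minimal bootstrap particle filter on a 1-D toy SODE (Euler-Maruyama step),
# shown only as the standard baseline; model and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def propagate(x, dt=0.05, sigma=0.1):
    """One Euler-Maruyama step of the toy SODE dX = -X dt + sigma dW."""
    return x - x * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)

def bootstrap_step(particles, obs, obs_std=0.2):
    """Predict each particle, weight by a Gaussian observation likelihood,
    then multinomial-resample according to the normalized weights."""
    pred = propagate(particles)
    logw = -0.5 * ((obs - pred) / obs_std) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(pred.size, size=pred.size, p=w)
    return pred[idx]

particles = rng.standard_normal(500)      # prior ensemble
for obs in (0.8, 0.6, 0.5):               # synthetic observations
    particles = bootstrap_step(particles, obs)
print(particles.mean())                   # posterior mean estimate
```

Each assimilation cycle is one predict-weight-resample pass; the Diffusion Kernel Filter replaces the brute-force propagation of every particle with the paper's parametrization of small noise-driven fluctuations.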
Abstract:
In this paper, the generalized log-gamma regression model is modified to allow the possibility that long-term survivors may be present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the timing acceleration/deceleration of a given event and the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
Abstract:
In the context of either Bayesian or classical sensitivity analyses of over-parametrized models for incomplete categorical data, it is well known that the prior-dependence of posterior inferences on nonidentifiable parameters, or the choice of too-parsimonious over-parametrized models, may lead to erroneous conclusions. Nevertheless, some authors either pay no attention to which parameters are nonidentifiable or do not appropriately account for possible prior-dependence. We review the literature on this topic and consider simple examples to emphasize that in both inferential frameworks, the subjective components can influence results in nontrivial ways, irrespective of the sample size. Specifically, we show that prior distributions commonly regarded as slightly informative or noninformative may actually be too informative for nonidentifiable parameters, and that the choice of over-parametrized models may drastically impact the results, suggesting that a careful examination of their effects should be considered before drawing conclusions.
Abstract:
In this paper we extend partial linear models with normal errors to Student-t errors. Penalized likelihood equations are applied to derive the maximum likelihood estimates, which appear to be robust against outlying observations in the sense of the Mahalanobis distance. In order to study the sensitivity of the penalized estimates under some usual perturbation schemes in the model or data, the local influence curvatures are derived and some diagnostic graphics are proposed. A motivating example, preliminarily analyzed under normal errors, is reanalyzed under Student-t errors. The local influence approach is used to compare the sensitivity of the model estimates. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
When missing data occur in studies designed to compare the accuracy of diagnostic tests, a common, though naive, practice is to base the comparison of sensitivity, specificity, as well as of positive and negative predictive values on some subset of the data that fits into methods implemented in standard statistical packages. Such methods are usually valid only under the strong missing completely at random (MCAR) assumption and may generate biased and less precise estimates. We review some models that use the dependence structure of the completely observed cases to incorporate the information of the partially categorized observations into the analysis and show how they may be fitted via a two-stage hybrid process involving maximum likelihood in the first stage and weighted least squares in the second. We indicate how computational subroutines written in R may be used to fit the proposed models and illustrate the different analysis strategies with observational data collected to compare the accuracy of three distinct non-invasive diagnostic methods for endometriosis. The results indicate that even when the MCAR assumption is plausible, the naive partial analyses should be avoided.
Abstract:
The ever-increasing robustness and reliability of flow-simulation methods have consolidated CFD as a major tool in virtually all branches of fluid mechanics. Traditionally, those methods have played a crucial role in the analysis of flow physics. In more recent years, though, the subject has broadened considerably, with the development of optimization and inverse design applications. Since then, the search for efficient ways to evaluate flow-sensitivity gradients has received the attention of numerous researchers. In this scenario, the adjoint method has emerged as, quite possibly, the most powerful tool for the job, which heightens the need for a clear understanding of its conceptual basis. Yet, some of its underlying aspects are still subject to debate in the literature, despite all the research that has been carried out on the method. Such is the case with the adjoint boundary and internal conditions, in particular. The present work aims to shed more light on that topic, with emphasis on the need for an internal shock condition. By following the path of previous authors, the quasi-1D Euler problem is used as a vehicle to explore those concepts. The results clearly indicate that the behavior of the adjoint solution through a shock wave ultimately depends upon the nature of the objective functional.
Abstract:
Capillary electrophoresis with capacitively coupled contactless conductivity detection was successfully used to quantify N-acetylglucosamine and five N-acetyl-chitooligosaccharides (C2-C6) produced after reaction with a purified chitinase (TmChi) from Tenebrio molitor (Coleoptera). No derivatization process was necessary. The separation was developed using 10 mM NaOH with 10% (v/v) acetonitrile as background electrolyte and homemade equipment with a system that avoids the harmful effect of electrolysis. The limit of detection for all oligosaccharides was ca. 3 mu M, and the results indicated that the larger the oligosaccharide, the higher the sensitivity. Analysis of the chitooligosaccharides produced revealed that TmChi has an endolytic cleavage pattern with C5 as the best substrate (higher catalytic efficiency k(cat)/K-M) releasing C2 and C3. (c) 2007 Elsevier Inc. All rights reserved.
Abstract:
Continuous casting is a casting process that produces steel slabs in a continuous manner, with steel being poured at the top of the caster and a steel strand emerging from the mould below. Molten steel is transferred from the AOD converter to the caster using a ladle. The ladle is designed to be strong and insulated, but complete insulation is never achieved: some of the heat is lost to the refractories by convection and conduction, and heat losses by radiation also occur. It is important to know the temperature of the melt during the process. For this reason, an online model was previously developed to simulate the steel and ladle wall temperatures during the ladle cycle. The model was developed as an ODE-based model using a grey-box modeling technique. The model's performance was acceptable, but it needed to be presented in a user-friendly way. The aim of this thesis work was to design a GUI that presents the steel and ladle wall temperatures calculated by the model and also allows the user to make adjustments to the model. This thesis work also discusses the sensitivity analysis of the different parameters involved and their effects on different temperature estimations.
Abstract:
Heat-driven white goods, or HWC machines as the manufacturer calls them, are heated with hot water from a circulating loop via a heat exchanger built into the machine, unlike conventional machines, which are heated with electricity. This technology should not be confused with machines that are connected to the hot-water line and filled with hot water, which limits them to dishwashing and laundry. The purpose of district-heating-driven white goods is thus to use district heating, which has lower quality and price than electrical energy, for heating and drying, thereby saving electricity and expanding the district-heating load base. A comparison of carbon dioxide emissions and primary energy use between conventional and district-heating-driven white goods shows that both are lower for the district-heating-driven machines if biofuel is regarded as carbon neutral and the electricity replaced is produced in coal-fired or combined-cycle gas power plants. This report describes the development and commercialization of heat-driven white goods (dishwashers, washing machines, tumble dryers and drying cabinets) and how they can be connected to district-heating systems in different system configurations. The energy and economic conditions for the technology have also been investigated. Experience from field testing is, however, very limited, since the construction projects where the field tests were to take place were delayed. In 2013, an apartment building with heat-driven white goods in 160 apartments is being completed in Västerås. The developed machines' heat use as a share of total energy use at a supply temperature of 60 degrees has been measured at about 50% for the dishwasher, 67% for the washing machine, 80% for the tumble dryer and 93% for the drying cabinet. In the studied apartment building of passive-house standard, the load from heat-driven white goods, comfort underfloor heating and towel dryers amounts to up to 30% of the building's total heat use. For single-family houses the corresponding figure is up to 20%.
Using district heating instead of electric heating for these installations, which are normally electrically heated, can thus reduce electricity demand considerably in low-energy buildings, which also reduces both carbon dioxide emissions and primary energy use. Economic analyses have been carried out for two different system concepts (a separate white-goods loop and the Västerås model) for newly built single-family housing areas and apartment buildings, where district heating is used not only for white goods but also for towel dryers and comfort underfloor heating. The analyses show that the Västerås model, with heat-driven white goods, towel dryers and comfort underfloor heating, is the most economically attractive system solution. In apartment buildings it can be competitive with the electrically heated alternatives (a conventional system with electric white goods, comfort underfloor heating and towel dryers) if the price difference between electricity and district heating is greater than 0.7 SEK/kWh. A parameter study shows that the capital cost is rather high compared with the energy cost, which means that a long service life and many cycles are important for improving the economics of heat-driven white goods. For passive single-family houses, the cost of the Västerås model with heat-driven white goods, towel dryers and comfort underfloor heating is on a par with the electrically heated alternatives at energy price differences of 0.7 SEK/kWh including VAT, whereas price differences of 0.9 SEK/kWh including VAT are required for normally insulated single-family housing areas. In summary, in municipalities with a competitive district-heating price there is some profitability for the whole concept according to the Västerås model with heat-driven white goods, comfort underfloor heating and towel dryers. Looking instead at the competitiveness of individual appliances, it is mainly the tumble dryer that is competitive in homes. The target price of an extra 1,000 SEK for heat operation could not be achieved within the project for dishwashers and washing machines; lower prices and low connection costs are required for the dishwasher and washing machine to pay off as individual components.
Heat-driven washing machines and tumble dryers are competitive in shared laundry rooms in apartment buildings. Especially where utilization is high and several machines share the connection cost to the district-heating substation, heat operation can be quite profitable. The competitiveness of the drying cabinets could not be evaluated, as their price has not yet been set. Using the hot-water circulation (VVC) system for heat distribution to heat-driven white goods could be very attractive, but requires that the Legionella problem be solved; at present there is no solution that satisfies the provisions of Boverket's building regulations. Another distribution approach that may be of interest, but was not examined in the study, is to use the hot-water circulation for domestic hot-water distribution together with a shared radiator and white-goods loop with constant supply temperature. The actor expected to have the greatest economic interest in implementing the technology is probably the district-heating companies, which would sell more heat, and it is therefore primarily their responsibility to market the technology to their customers.