830 results for Global sensitivity analysis


Relevance: 90.00%

Publisher:

Abstract:

Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The consultation on the United Nations post-2015 framework for disaster risk reduction highlights the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save lives and property and reduce the overall impact of severe events. Continental and global scale flood forecasting systems provide vital early flood warning information to national and international civil protection authorities, who can use this information to make decisions on how to prepare for upcoming floods. Here the potential monetary benefits of early flood warnings are estimated based on the forecasts of the continental-scale European Flood Awareness System (EFAS) using existing flood damage cost information and calculations of potential avoided flood damages. The benefits are of the order of 400 Euro for every 1 Euro invested. A sensitivity analysis is performed in order to test the uncertainty in the method and develop an envelope of potential monetary benefits of EFAS warnings. The results provide clear evidence that there is likely a substantial monetary benefit in this cross-border continental-scale flood early warning system. This supports the wider drive to implement early warning systems at the continental or global scale to improve our resilience to natural hazards.
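The abstract does not detail how the benefit envelope is computed; a minimal sketch of one way to derive a benefit-to-cost ratio with an uncertainty envelope via Monte Carlo perturbation of the inputs (all figures and names here are hypothetical illustrations, not EFAS values or the EFAS method):

```python
import random

def benefit_cost_envelope(avoided_damage, system_cost, n=10_000, spread=0.3, seed=42):
    """Monte Carlo envelope for the benefit/cost ratio.

    Each input is perturbed by a uniform relative error of +/- `spread`
    to mimic a simple sensitivity analysis of the monetary assumptions.
    Returns the 5th percentile, median and 95th percentile of the ratio.
    """
    rng = random.Random(seed)
    ratios = []
    for _ in range(n):
        b = avoided_damage * (1 + rng.uniform(-spread, spread))
        c = system_cost * (1 + rng.uniform(-spread, spread))
        ratios.append(b / c)
    ratios.sort()
    return ratios[int(0.05 * n)], ratios[n // 2], ratios[int(0.95 * n)]

# Hypothetical numbers: 400 units of avoided damage per 1 unit invested.
low, mid, high = benefit_cost_envelope(400.0, 1.0)
```

The envelope then brackets the headline ratio rather than reporting a single point estimate.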


A particle filter method is presented for the discrete-time filtering problem with nonlinear Itô stochastic ordinary differential equations (SODEs) with additive noise, assumed to be analytically integrable as a function of the underlying vector Wiener process and time. The Diffusion Kernel Filter is arrived at by a parametrization of small noise-driven state fluctuations within branches of prediction, and a local use of this parametrization in the Bootstrap Filter. The method applies for small noise and short prediction steps. With explicit numerical integrators, the operation count in the Diffusion Kernel Filter is shown to be smaller than in the Bootstrap Filter whenever the initial state for the prediction step has sufficiently few moments. The established parametrization is a dual formula for the analysis of sensitivity to Gaussian initial perturbations and to noise perturbations in deterministic models, showing in particular how the stability of a deterministic dynamics is modelled by noise on short times and how the diffusion matrix of an SODE should be modelled (i.e. defined) for a Gaussian-initial deterministic problem to be cast as an SODE problem. From it, a novel definition of prediction may be proposed that coincides with the deterministic path within the branch of prediction whose information entropy at the end of the prediction step is closest to the average information entropy over all branches. Tests are made with the Lorenz-63 equations, showing good results both for the filter and for the definition of prediction.
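For reference, the Bootstrap Filter that the Diffusion Kernel Filter builds on follows a predict-weight-resample cycle. A minimal sketch for a scalar SODE with additive noise, using an explicit Euler-Maruyama integrator (the model, parameters and observations here are illustrative assumptions, not the paper's test case):

```python
import math
import random

def bootstrap_filter(observations, n_particles=500, dt=0.1, q=0.2, r=0.5, seed=0):
    """Bootstrap particle filter for the scalar SODE
        dx = -x dt + q dW,   observed as y = x + N(0, r^2).

    Prediction: one Euler-Maruyama step per particle.
    Update: Gaussian observation likelihood as importance weight,
    followed by multinomial resampling.
    """
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # predict: explicit Euler-Maruyama integration of the SODE
        particles = [x - x * dt + q * math.sqrt(dt) * rng.gauss(0.0, 1.0)
                     for x in particles]
        # update: weight each particle by the observation likelihood
        w = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in particles]
        s = sum(w)
        w = [wi / s for wi in w]
        estimates.append(sum(wi * x for wi, x in zip(w, particles)))
        # resample to recover an equally weighted particle cloud
        particles = rng.choices(particles, weights=w, k=n_particles)
    return estimates

obs = [1.0, 0.9, 0.8, 0.7]
est = bootstrap_filter(obs)
```

The Diffusion Kernel Filter replaces the per-particle stochastic prediction step with a parametrization of the noise-driven fluctuations around deterministic branches, which is where its operation-count advantage arises.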


A conceptual problem that appears in different contexts of clustering analysis is that of measuring the degree of compatibility between two sequences of numbers. This problem is usually addressed by means of numerical indexes referred to as sequence correlation indexes. This paper elaborates on why some specific sequence correlation indexes may not be good choices depending on the application scenario at hand. A variant of the Product-Moment correlation coefficient and weighted formulations of the Goodman-Kruskal and Kendall's indexes are derived that may be more appropriate for some particular application scenarios. The proposed and existing indexes are analyzed from different perspectives, such as their sensitivity to the ranks and magnitudes of the sequences under evaluation, among other relevant aspects of the problem. The results help suggest scenarios within the context of clustering analysis that are possibly more appropriate for the application of each index.
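A weighted Kendall-style index can be sketched by attaching a weight to each pair of positions; with unit weights it reduces to the ordinary Kendall tau-a. This is one plausible weighting scheme for illustration, not necessarily the paper's exact formulation:

```python
def weighted_kendall(a, b, weight=None):
    """Kendall-style sequence correlation with per-pair weights.

    `weight(i, j)` scores how much the pair of positions (i, j) matters;
    the unweighted Kendall tau-a is recovered when every pair weighs 1.
    """
    if weight is None:
        weight = lambda i, j: 1.0
    num = den = 0.0
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            w = weight(i, j)
            da, db = a[i] - a[j], b[i] - b[j]
            if da * db > 0:
                num += w          # concordant pair
            elif da * db < 0:
                num -= w          # discordant pair
            den += w
    return num / den if den else 0.0

# Identical orderings give +1, reversed orderings give -1.
tau = weighted_kendall([1, 2, 3, 4], [10, 20, 30, 40])
```

Passing a magnitude-dependent `weight` makes the index sensitive not only to ranks but also to how far apart the sequence values are, which is the kind of behaviour the paper contrasts across indexes.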


The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times, based on a copula proposed to model the dependence of bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood, and gains in efficiency from bivariate models are also examined in the censored data setting. Various simulation studies are performed for different parameter settings, sample sizes and censoring percentages, and the results are compared with the performance of the bivariate regression model for matched paired survival data. Sensitivity analysis methods, such as local and total influence, are presented and derived under three perturbation schemes. The martingale marginal and deviance marginal residual measures are used to check the adequacy of the model, and we propose a new measure which we call the modified deviance component residual. The methodology is illustrated on a lifetime data set for kidney patients.


We review some issues related to the implications of different missing data mechanisms on statistical inference for contingency tables and consider simulation studies to compare the results obtained under such models to those where the units with missing data are disregarded. We confirm that although, in general, analyses under the correct missing at random and missing completely at random models are more efficient even for small sample sizes, there are exceptions where they may not improve the results obtained by ignoring the partially classified data. We show that under the missing not at random (MNAR) model, estimates on the boundary of the parameter space as well as lack of identifiability of the parameters of saturated models may be associated with undesirable asymptotic properties of maximum likelihood estimators and likelihood ratio tests; even in standard cases the bias of the estimators may be low only for very large samples. We also show that the probability of a boundary solution obtained under the correct MNAR model may be large even for large samples and that, consequently, we may not always conclude that a MNAR model is misspecified because the estimate is on the boundary of the parameter space.


This article presents a comprehensive and detailed overview of the international trade performance of the manufacturing industry in Brazil over the last decades, emphasizing its participation in Global Value Chains. It uses information from recently available global input-output tables such as WIOD (World Input-Output Database) and TIVA (Trade in Value Added, OECD), as well as complementary information from the GTAP 8 (Global Trade Analysis Project) database. The calculation of a broad set of value-added-type indicators allows a precise contextualization of the ongoing structural changes in the Brazilian industry, highlighting the relative isolation of its manufacturing sector from the most relevant international supply chains. The article also proposes a public policy discussion, presenting two case studies: the first related to trade facilitation and the second to preferential trade agreements. The main conclusions are twofold: first, the reduction of time delays at customs in Brazil may significantly improve the trade performance of its manufacturing industry, especially for the more capital-intensive sectors, which are generally the ones with greater potential for connecting to global value chains; second, the extension of the concept of a “preferential trade partner” to the context of the global unbundling of production may pave the way for future trade policy in Brazil, particularly in mapping those partners whose bilateral trade relations with Brazil should receive greater priority from policy makers.



This work proposes a computational methodology to solve structural design optimization problems. The application develops, implements and integrates methods for structural analysis, geometric modeling, design sensitivity analysis and optimization. The optimum design problem is particularized for the plane stress case, with the objective of minimizing the structural mass subject to a stress criterion. These constraints must be evaluated at a series of discrete points, whose distribution should be dense enough to minimize the chance of any significant constraint violation between the specified points. Therefore, the local stress constraints are transformed into a global stress measure, reducing the computational cost of deriving the optimal shape design. The problem is approximated by the Finite Element Method using six-node Lagrangian triangular elements, with automatic mesh generation guided by a geometric element-quality criterion. In the geometric modeling, the contour is defined by parametric B-spline curves, whose characteristics are well suited to the shape optimization method, which uses the curves' key points as design variables. A reliable tool for design sensitivity analysis is a prerequisite for performing interactive structural design, synthesis and optimization; general expressions for design sensitivity analysis are derived with respect to the key points of the B-splines, using the adjoint approach and the analytical method. The formulation of the optimization problem applies the Augmented Lagrangian Method, which converts a constrained optimization problem into an unconstrained one, and the solution of the Augmented Lagrangian function relies on the sensitivity analysis. The optimization problem thus reduces to the solution of a sequence of problems with bound constraints, which is solved by the Memoryless Quasi-Newton Method. Several examples demonstrate that this new approach of analytical design sensitivity analysis integrated with shape design optimization under a global stress criterion is computationally efficient.
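Aggregating many local stress constraints into one global measure is commonly done with a p-norm of the local stress ratios, which approaches the maximum local ratio as p grows. A minimal sketch under that assumption (the thesis's exact global measure may differ; all numbers here are illustrative):

```python
def global_stress_measure(stresses, allowable, p=8):
    """Aggregate local stress ratios into a single global constraint.

    Uses a p-norm:  g = (sum_i (sigma_i / allowable)^p)^(1/p) - 1 <= 0.
    A single smooth constraint replaces one constraint per stress point,
    which is what cuts the cost of the sensitivity analysis.
    """
    s = sum((abs(x) / allowable) ** p for x in stresses)
    return s ** (1.0 / p) - 1.0

# Three sample-point stresses, all below the allowable of 200: g < 0.
g = global_stress_measure([120.0, 180.0, 150.0], allowable=200.0)
```

A single differentiable constraint like this also yields one adjoint problem instead of one per stress evaluation point.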


This paper presents a technical and financial feasibility analysis of a solar water-heating system for a fictitious hotel located in the Northeast region of Brazil. To that end, solar collector sizing techniques and financial mathematics methods are used, such as Net Present Value (NPV), Internal Rate of Return (IRR) and Payback. A sensitivity analysis is also presented to verify which factors impact the viability of solar heating. A comparative analysis covers three cities in distinct regions of Brazil: Curitiba, Belém and João Pessoa. The viability of a solar heating system is demonstrated for all of Brazil, especially the Northeast region, which is the most favourable for such an application of solar power because of its high levels of solar radiation. Among the cities examined for a future installation of solar water-heating systems in the hotel chain, João Pessoa proved the most viable.
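The three financial criteria the paper applies can be sketched in a few lines; the cash flows below are invented placeholders, not the paper's data:

```python
def npv(rate, cashflows):
    """Net Present Value; index 0 is the initial outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal Rate of Return: the rate where NPV crosses zero,
    found by bisection (valid for a single sign change in cashflows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback(cashflows):
    """Periods until the cumulative cashflow turns non-negative
    (None if it never does)."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical solar heater: 10,000 upfront, 2,500/year energy savings.
flows = [-10_000] + [2_500] * 8
```

A sensitivity analysis in this setting amounts to recomputing these three figures while varying inputs such as energy tariff, investment cost and solar yield.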


INTRODUCTION: Infant mortality in Presidente Prudente, SP (Brazil) was studied over the period 1990-1992, applying methods for obtaining a collective diagnosis that would guide the identification and choice of strategies for controlling local problems. MATERIAL AND METHOD: Death certificates collected at the registry office were used; their original data were corrected through documentary research in the health services and household interviews. Data from the Live Birth Information System (SINASC) were used to study variables such as maternal age and birth weight. The quality of the original death certificate data was first assessed by the amount of information, sensitivity, specificity and Kappa value. RESULTS: The overall sensitivity for the underlying cause of death was 78.84%, with a Kappa of 71.32 for all causes. There were 189 deaths, 66.15% in the neonatal period (41.28% during the first day of life) and 33.85% in the late infant period. Birth weight was below 2,500 g in 58.28% of the deaths. The underlying causes of death were studied according to their preventability (the method developed by Erica Taucher), by the condensed cause groups used in the International Collaborative Effort (ICE), by multiple causes and by geographic distribution. Among deaths occurring up to 27 days of life, 22.23% could have been prevented by adequate care at delivery, 20.64% were reducible through early diagnosis and treatment, and 13.75% through good pregnancy control; only 7.94% were not preventable. Of the deaths in the late infant period, 12.17% were classified as otherwise preventable and 4.23% were considered not preventable. According to the ICE groups, 58.74% died from immaturity or asphyxia, 19.58% from infections and 12.17% from congenital anomalies.
CONCLUSION: The results suggest that priority should be given to obstetric care during labor and to pediatric care for low birth weight, among other measures. The multiple-cause analysis shows that 76.05% of the deaths have underlying causes related to perinatal causes and confirms the relationship between weight deficiencies and respiratory complications of the newborn; maternal complications were also related to low birth weight. Large differences in the infant mortality rate were identified between urban areas, not only in the rates themselves but also in the types of diseases responsible for the deaths. It is concluded that the combined use of the four complementary techniques is advantageous, both for study purposes and for planning actions aimed at preventing infant mortality.


In this paper, we propose a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows the Conway-Maxwell distribution and that the time to the event follows the generalized gamma distribution. This distribution can be used to model survival data when the hazard rate function is increasing, decreasing, bathtub-shaped or unimodal, and it includes some distributions commonly used in lifetime analysis as particular cases. Some appropriate matrices are derived to evaluate local influence on the parameter estimates under different perturbations, and some global influence measures are also investigated. Finally, a data set from the medical area is analysed.


This study evaluated the efficacy of probiotics as growth promoters in broiler chicken feeding, using a systematic literature review and meta-analysis. Thirty-five studies were recovered by the systematic review, 27 of which met the following criteria for inclusion in the meta-analysis: (1) Brazilian studies published between 1995 and 2005; (2) probiotics administered in the diet without growth promoters; (3) results including performance data with the respective coefficients of variation. The meta-analysis showed that probiotics promoted better weight gain and feed conversion than the negative control (no antimicrobial) in the initial phase (1 to 20-28 days); nevertheless, results were similar over the total period (1 to 35-48 days). Weight gain and feed conversion were similar between probiotics and the positive control (with antimicrobial) in both the initial and the total periods. Viability over the total period improved with the use of probiotics in comparison with the negative or positive controls. Sensitivity analysis showed that the results of the meta-analysis were coherent, and the funnel plots and the Egger regression method showed that the studies published in Brazil do not present biased results. It is possible to conclude that probiotics are a technically viable alternative to antimicrobial growth promoters in broiler feeding; nevertheless, further studies are necessary to identify eventual differences among the probiotics commercially available in Brazil.
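The pooling step of a meta-analysis like this one is typically an inverse-variance weighted mean of the study effects. A minimal fixed-effect sketch (the effect sizes and variances below are invented placeholders, not the study's data, and the study's actual model may differ):

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) pooled estimate with a 95% CI.

    Each study contributes weight 1/variance; the pooled variance is
    the reciprocal of the summed weights.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    se = math.sqrt(1.0 / total)
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Hypothetical weight-gain differences (g) and variances from 3 trials.
mean, (lo, hi) = pooled_effect([30.0, 45.0, 25.0], [100.0, 225.0, 64.0])
```

A sensitivity analysis then repeats the pooling while leaving out one study at a time and checks that the conclusion is stable.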


In this article, some considerations obtained while applying rotor response analysis techniques in hydraulic power plants are discussed. An applied research work was carried out on two hydraulic turbines, analysing the rotor response both theoretically and experimentally. A mathematical model was developed and used to simulate the rotordynamic behaviour of Francis and Kaplan turbines, and the main dynamical effects that appear during the operation of the machines are also discussed. A series of measurements was carried out on the turbines using impact hammers to determine the modal behaviour of the units, with tests performed both with the machine at standstill and in operation. Some results and the comparison with theory are presented. The improved theoretical model was used for a sensitivity analysis of the different bearings to the main excitations that take place during machine operation; from this analysis, the best measuring points for condition monitoring were determined.


We used a computational model of the biochemical pathways involved in the phosphorylation/dephosphorylation of the AMPA receptor to study the receptor's responses to calcium oscillations. In the model, the biochemical pathways are assumed to be located immediately under the postsynaptic membrane, and we included three states of the AMPA receptor: dephosphorylated, and phosphorylated at one or at two sites. To characterize the effects of calcium oscillations on the AMPA receptor, we exposed the model to stimuli with three varying parameters, namely frequency, number of pulses and calcium spike duration. Our model showed sensitivity to all three of these parameters.


This paper discusses a design approach for a high-Q, low-sensitivity OTA-C biquad bandpass section. An optimal relationship is established between the transconductances defining the difference β − γ in the Q-factor denominator, setting the Q-sensitivity to the tuning voltages to around unity. A 30-MHz filter was designed in a 0.35-μm CMOS process with VDD = 3.3 V. A range of circuit simulations supports the theoretical analysis: the Q-factor spans from 20.5 to 60 while ensuring filter stability along the tuning range. Although a mode-operating OTA is used, the procedure can be extended to other types of transconductor.