68 results for Parametric sensitivity analysis
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
This paper presents case studies in power systems using Sensitivity Analysis (SA) oriented to Optimal Power Flow (OPF) problems in different operation scenarios. The case studies start from a known optimal solution obtained by OPF. This optimal solution is called the base case, and from it new operating points can be evaluated by SA when perturbations occur in the system. The SA is based on Fiacco's Theorem and has the advantage of not being an iterative process. In order to show the good performance of the proposed technique, tests were carried out on the IEEE 14-, 118- and 300-bus systems. (C) 2010 Elsevier Ltd. All rights reserved.
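As a rough illustration of the non-iterative idea behind Fiacco-style sensitivity analysis, the sketch below differentiates the KKT conditions of a small equality-constrained quadratic programme with respect to a demand perturbation. The matrices, the single "power balance" constraint and the perturbation are hypothetical stand-ins for an OPF base case, not the paper's formulation.

```python
# A minimal sketch of Fiacco-style parametric sensitivity on an
# equality-constrained quadratic programme.  All data are hypothetical.
import numpy as np

Q = np.diag([2.0, 3.0, 4.0])          # assumed convex cost curvature
c = np.array([1.0, 1.0, 1.0])
A = np.array([[1.0, 1.0, 1.0]])       # single "power balance" constraint
b = np.array([10.0])                  # base-case demand

def solve_kkt(Q, c, A, b):
    """Solve the equality-constrained QP via its KKT system."""
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([c, b]))
    return sol[:n], sol[n:]            # primal x*, multipliers lambda*

x0, lam0 = solve_kkt(Q, c, A, b)       # the "base case" optimum

# Fiacco's theorem: differentiate the KKT conditions with respect to the
# perturbation parameter (here, the demand b).  No re-optimisation needed.
n, m = Q.shape[0], A.shape[0]
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
db = np.array([0.5])                   # hypothetical load increase
dsol = np.linalg.solve(K, np.concatenate([np.zeros(n), db]))
x_pred = x0 + dsol[:n]                 # predicted post-perturbation dispatch

x_exact, _ = solve_kkt(Q, c, A, b + db)
print(np.allclose(x_pred, x_exact))
```

Because the example is quadratic, the first-order prediction is exact; in the nonlinear OPF setting it is only a local approximation around the base case.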
Abstract:
This paper presents a new approach to the transmission loss allocation problem in a deregulated system. This approach belongs to the class of incremental methods and treats all the constraints of the network, i.e. control, state and functional constraints. The approach is based on the perturbation-of-the-optimum theorem. From a given optimal operating point obtained by the optimal power flow, the loads are perturbed and a new optimal operating point that satisfies the constraints is determined by sensitivity analysis. This solution is used to obtain the loss allocation coefficients for the generators and loads of the network. Numerical results compare the proposed approach with other methods on the well-known IEEE 14-bus transmission network. A further test emphasizes the importance of considering the operational constraints of the network. Finally, the approach is applied to an actual Brazilian equivalent network composed of 787 buses and compared with the technique currently used by the Brazilian Control Center. (c) 2007 Elsevier Ltd. All rights reserved.
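The loss-allocation idea can be caricatured in a few lines: given incremental transmission loss coefficients and the injections at a base case, shares are formed and rescaled so that they add up to the total losses. The coefficients, loads and loss figure below are made up, and the paper's method additionally allocates to generators and enforces all network constraints through the OPF sensitivities.

```python
import numpy as np

p_loss_total = 7.5                       # MW of losses in the base case (made up)
loads = np.array([150.0, 50.0])          # MW drawn at two load buses (made up)
itl = np.array([0.04, 0.02])             # hypothetical incremental loss coefficients

shares = itl * loads                     # first-order contribution of each load
alloc = shares / shares.sum() * p_loss_total   # rescale to recover total losses
for bus, a in zip(("L1", "L2"), alloc):
    print(f"load {bus}: allocated {a:.2f} MW of losses")
```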
Abstract:
In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, and the times are grouped into k intervals so that ties are eliminated. Thus, the data modeling is performed by considering discrete lifetime regression models. The model parameters are estimated by maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, referred to as global influence, and measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to these measures, the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.
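A minimal sketch of the kind of discrete-time (grouped) survival regression described: each subject contributes one record per interval at risk, the interval hazard follows a complementary log-log link, and the parameters are obtained by maximum likelihood. The simulated data, single covariate and five intervals are illustrative; the paper's other link functions, jackknife estimation and influence diagnostics are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, k = 400, 5                          # subjects, grouped intervals
x = rng.binomial(1, 0.5, n).astype(float)

# simulate grouped event times from a true cloglog hazard model
alpha_true, beta_true = np.full(k, -1.5), 0.7
rows = []                              # (interval, covariate, event) triples
for i in range(n):
    for j in range(k):
        h = 1.0 - np.exp(-np.exp(alpha_true[j] + beta_true * x[i]))
        event = rng.random() < h
        rows.append((j, x[i], int(event)))
        if event:
            break                      # no further intervals after the event
interval, xv, d = map(np.array, zip(*rows))

def negloglik(theta):
    alpha, beta = theta[:k], theta[k]
    eta = alpha[interval] + beta * xv
    h = np.clip(1.0 - np.exp(-np.exp(eta)), 1e-12, 1 - 1e-12)
    return -np.sum(d * np.log(h) + (1 - d) * np.log(1 - h))

fit = minimize(negloglik, x0=np.zeros(k + 1), method="BFGS")
print("interval effects:", np.round(fit.x[:k], 2), " beta:", round(fit.x[k], 2))
```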
Abstract:
Embedded sensitivity analysis has proven to be a useful tool for finding optimal positions of structural reinforcements. However, it was not clear how sensitivities obtained from the embedded sensitivity method were related to the normal mode, or operational mode, associated with the frequency of interest. In this work, this relationship is studied based on a finite element model of a slender sheet-metal part with predominantly bending modes. It is shown that higher sensitivities always occur at nodes or antinodes of the vibrating system. [DOI: 10.1115/1.4002127]
Abstract:
Inverse analysis is currently an important subject of study in several fields of science and engineering. The identification of physical and geometric parameters using experimental measurements is required in many applications. In this work a boundary element formulation to identify boundary and interface values as well as material properties is proposed. In particular, the proposed formulation is dedicated to identifying material parameters when a cohesive crack model is assumed for 2D problems. A computer code is developed and implemented using the BEM multi-region technique and regularisation methods to perform the inverse analysis. Several examples are shown to demonstrate the efficiency of the proposed model. (C) 2010 Elsevier Ltd. All rights reserved.
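The role of regularisation in an inverse analysis can be shown generically: with an ill-conditioned forward operator, plain least squares amplifies measurement noise, while a Tikhonov penalty stabilises the recovered parameters. The operator, data and penalty weights below are synthetic; in the paper the forward model comes from a multi-region BEM discretisation of the cohesive-crack problem.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 40, 10
# an ill-conditioned forward operator: columns scaled over several decades
G = rng.normal(size=(m, n)) @ np.diag(np.linspace(1.0, 1e-3, n))
p_true = rng.normal(size=n)
y = G @ p_true + 0.01 * rng.normal(size=m)     # noisy indirect measurements

def tikhonov(G, y, lam):
    # minimise ||G p - y||^2 + lam * ||p||^2 via the normal equations
    return np.linalg.solve(G.T @ G + lam * np.eye(G.shape[1]), G.T @ y)

for lam in (0.0, 1e-6, 1e-3, 1e-1):
    p_hat = tikhonov(G, y, lam)
    print(f"lambda={lam:g}  parameter error={np.linalg.norm(p_hat - p_true):.3f}")
```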
Abstract:
We introduce the log-beta Weibull regression model based on the beta Weibull distribution (Famoye et al., 2005; Lee et al., 2007). We derive expansions for the moment generating function which do not depend on complicated functions. The new regression model represents a parametric family of models that includes as sub-models several widely known regression models that can be applied to censored survival data. We employ a frequentist analysis, a jackknife estimator, and a parametric bootstrap for the parameters of the proposed model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Further, for different parameter settings, sample sizes, and censoring percentages, several simulations are performed. In addition, the empirical distribution of some modified residuals is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be extended to a modified deviance residual in the proposed regression model applied to censored data. We define martingale and deviance residuals to evaluate the model assumptions. The extended regression model is very useful for the analysis of real data and could give more realistic fits than other special regression models.
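For orientation, the beta Weibull density behind the model follows the beta-G construction: the baseline Weibull CDF G is plugged into a beta kernel. The sketch below writes its log-density and fits the four parameters to simulated, uncensored data by maximising the likelihood; the log-linear regression structure, censoring, jackknife and bootstrap procedures of the paper are not shown.

```python
import numpy as np
from scipy.special import betaln
from scipy.optimize import minimize
from scipy.stats import weibull_min

def beta_weibull_logpdf(t, a, b, c, lam):
    """log f(t) = log g(t) - log B(a,b) + (a-1) log G(t) + (b-1) log(1-G(t)),
    where g and G are the Weibull density and CDF (beta-G construction)."""
    G = weibull_min.cdf(t, c, scale=lam)
    logg = weibull_min.logpdf(t, c, scale=lam)
    return logg - betaln(a, b) + (a - 1) * np.log(G) + (b - 1) * np.log1p(-G)

# stand-in data: ordinary Weibull times, so the fit should land near a = b = 1
t = weibull_min.rvs(1.8, scale=2.0, size=500, random_state=2)

def nll(theta):
    a, b, c, lam = np.exp(theta)          # positivity via log-parametrisation
    return -np.sum(beta_weibull_logpdf(t, a, b, c, lam))

fit = minimize(nll, x0=np.zeros(4), method="Nelder-Mead")
print("a, b, c, lambda =", np.round(np.exp(fit.x), 3))
```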
Abstract:
With the aim of comparing the cost of rheumatoid arthritis therapy with disease-modifying antirheumatic drugs (DMARDs) over a 48-month period, five different treatment stages were studied, based on the clinical protocols recommended by the Brazilian Society of Rheumatology, and, correspondingly, five therapy cycles. The analytical model, based on Markov analysis, considered the chances of the patient remaining in a given stage or moving between stages according to a positive effect on outcomes. Only direct costs, such as drugs, materials and tests used for monitoring these patients, were included in the analysis. The results of the model show that the stage in which methotrexate is used as monotherapy was the most cost-effective (R$ 113,900.00 per patient over 48 months), followed by refractory patients (R$ 1,554,483.43), those who use triple therapy followed by infliximab (R$ 1,701,286.76), methotrexate-intolerant patients (R$ 2,629,919.14), and finally those who use methotrexate and infliximab from the beginning (R$ 9,292,879.31). The sensitivity analysis confirms these results when the efficacy of methotrexate and infliximab is varied.
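The Markov reasoning in the abstract can be sketched as a simple cohort model: a distribution over treatment stages is pushed through a monthly transition matrix for 48 cycles while per-stage costs accumulate. Every probability and cost below is invented for illustration and is not a value from the study or the Brazilian protocols.

```python
import numpy as np

states = ["MTX monotherapy", "triple therapy", "biologic", "refractory"]
# hypothetical monthly transition matrix (rows sum to 1)
P = np.array([
    [0.92, 0.06, 0.02, 0.00],
    [0.00, 0.90, 0.08, 0.02],
    [0.00, 0.00, 0.95, 0.05],
    [0.00, 0.00, 0.00, 1.00],
])
monthly_cost = np.array([150.0, 420.0, 4200.0, 900.0])   # R$, hypothetical

cohort = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts on methotrexate
total_cost = 0.0
for _ in range(48):                        # 48 monthly cycles
    total_cost += cohort @ monthly_cost    # expected cost this cycle
    cohort = cohort @ P                    # move the cohort to the next cycle

print(f"expected 48-month cost per patient: R$ {total_cost:,.2f}")
```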
Abstract:
Background. The purpose of this study was to analyze the cost-effectiveness of cisplatin-based chemoradiation compared to radiation therapy (RT) alone to treat patients with advanced head and neck cancer in Brazil. Methods. Data were collected retrospectively from the medical records of 33 patients treated with RT alone (strategy 1) and from 29 patients treated with cisplatin-based chemoradiation (strategy 2). The perspective of the Brazilian National Health System (Sistema Unico de Saude [SUS]) reimbursement parameters was adopted, and effectiveness was measured in years of disease-free life gained. One-way sensitivity analysis was performed to determine the robustness of this study. Results. In strategy 1, 31% of the patients lived without disease progression for more than 13 months after treatment, compared to 58% of patients in strategy 2. According to SUS parameters, the total cost per patient in strategy 1 was US $1167.00 and in strategy 2 it was US $2058.00. The incremental cost-effectiveness ratio (ICER) was US $3303.00 per life-year gained. Conclusion. Cisplatin-based chemoradiation proved to be more cost-effective than RT alone. (C) 2010 Wiley Periodicals, Inc. Head Neck 33: 1199-1205, 2011
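The headline figure is an incremental cost-effectiveness ratio, i.e. the extra cost divided by the extra effectiveness of chemoradiation over RT alone. The snippet below shows the arithmetic using the per-patient costs from the abstract; the difference in life-years is not reported there, so the value used is a hypothetical placeholder chosen only to illustrate the calculation.

```python
# ICER = (cost_new - cost_old) / (effect_new - effect_old)
def icer(cost_new, cost_old, effect_new, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

cost_rt, cost_chemo_rt = 1167.00, 2058.00   # US$ per patient, from the abstract
delta_life_years = 0.27                     # hypothetical placeholder, for illustration
print(f"ICER ~ US$ {icer(cost_chemo_rt, cost_rt, delta_life_years, 0.0):,.0f} per life-year gained")
```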
Abstract:
Objective. The objective of this study was to conduct a cost-effectiveness analysis of a universal rotavirus vaccination program among children under 5 years of age in Brazil. Methods. Considering a hypothetical annual cohort of approximately 3 300 000 newborns followed over 5 years, a decision-tree model was constructed to examine the possible clinical and economic effects of rotavirus infection with and without routine vaccination of children. Probabilities and unit costs were derived from published research and national administrative data. The impact of different estimates for key parameters was studied using sensitivity analysis. The analysis was conducted from both the healthcare system and societal perspectives. Results. The vaccination program was estimated to prevent approximately 1 735 351 (54%) of the 3 210 361 cases of rotavirus gastroenteritis and 703 (75%) of 933 rotavirus-associated deaths during the 5-year period. At a vaccine price of 18.6 Brazilian reais (R$) per dose, this program would cost R$121 673 966 and would save R$38 536 514 in direct costs to the public healthcare system and R$71 778 377 in direct and indirect costs to society. The program was estimated to cost R$1 028 and R$1 713 per life-year saved (LYS) from the societal and healthcare system perspectives, respectively. Conclusions. Universal rotavirus vaccination was a cost-effective strategy from both perspectives. However, these findings are highly sensitive to the diarrhea incidence rate, proportion of severe cases, vaccine coverage, and vaccine price.
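A back-of-envelope reading of the societal-perspective numbers quoted above: subtracting the costs averted from the programme cost gives the net cost, and dividing by the reported cost per life-year saved gives an implied number of life-years saved. Rounding in the published ratio makes the implied figure approximate; this is only arithmetic on the abstract's figures, not a result of the study.

```python
programme_cost   = 121_673_966      # R$, from the abstract
societal_savings =  71_778_377      # R$, direct + indirect costs averted
cost_per_lys     =       1_028      # R$ per life-year saved (societal perspective)

net_cost = programme_cost - societal_savings
implied_lys = net_cost / cost_per_lys
print(f"net societal cost: R$ {net_cost:,.0f}")
print(f"implied life-years saved: ~{implied_lys:,.0f}")
```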
Abstract:
A conceptual problem that appears in different contexts of clustering analysis is that of measuring the degree of compatibility between two sequences of numbers. This problem is usually addressed by means of numerical indexes referred to as sequence correlation indexes. This paper elaborates on why some specific sequence correlation indexes may not be good choices depending on the application scenario at hand. A variant of the Product-Moment correlation coefficient and a weighted formulation of the Goodman-Kruskal and Kendall's indexes are derived that may be more appropriate for some particular application scenarios. The proposed and existing indexes are analyzed from different perspectives, such as their sensitivity to the ranks and magnitudes of the sequences under evaluation, among other relevant aspects of the problem. The results help suggest scenarios within the context of clustering analysis that are possibly more appropriate for the application of each index. (C) 2008 Elsevier Inc. All rights reserved.
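A toy version of the pairwise indexes under discussion: a Kendall-style concordance between two sequences, plus an illustrative weighted variant in which each pair contributes in proportion to how far apart its values are. The weighting actually derived in the paper is its own formulation; this sketch only shows where weights enter the computation.

```python
from itertools import combinations

def kendall_like(u, v, weighted=False):
    """Pairwise concordance index; optionally weight each pair by magnitude."""
    num = den = 0.0
    for i, j in combinations(range(len(u)), 2):
        du, dv = u[i] - u[j], v[i] - v[j]
        w = abs(du) + abs(dv) if weighted else 1.0   # illustrative pair weight
        sgn_u = (du > 0) - (du < 0)
        sgn_v = (dv > 0) - (dv < 0)
        num += w * sgn_u * sgn_v
        den += w
    return num / den if den else 0.0

u = [0.9, 0.7, 0.4, 0.1]
v = [0.8, 0.75, 0.5, 0.6]
print(kendall_like(u, v), kendall_like(u, v, weighted=True))
```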
A bivariate regression model for matched paired survival data: local influence and residual analysis
Abstract:
The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times based on a copula proposed to model the dependence of bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood. Gains in efficiency from bivariate models are also examined in the censored-data setting. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed to evaluate the performance of the bivariate regression model for matched paired survival data. Sensitivity analysis methods such as local and total influence are presented and derived under three perturbation schemes. The marginal martingale and marginal deviance residual measures are used to check the adequacy of the model. Furthermore, we propose a new measure which we call the modified deviance component residual. The methodology in the paper is illustrated on a lifetime data set for kidney patients.
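To make the copula device concrete, the sketch below couples two Weibull marginal survival functions through a Clayton copula, so that a single dependence parameter theta controls the association between the paired times. The Clayton choice and the Weibull margins are assumptions for illustration; the paper's location-scale margins, censored likelihood and influence diagnostics are not reproduced.

```python
import numpy as np
from scipy.stats import weibull_min

def joint_survival(t1, t2, theta, shape=(1.5, 1.2), scale=(2.0, 3.0)):
    """Clayton copula applied to the marginal survival functions:
    S(t1, t2) = (S1^-theta + S2^-theta - 1)^(-1/theta), theta > 0."""
    s1 = weibull_min.sf(t1, shape[0], scale=scale[0])
    s2 = weibull_min.sf(t2, shape[1], scale=scale[1])
    return (s1 ** (-theta) + s2 ** (-theta) - 1.0) ** (-1.0 / theta)

# stronger dependence (larger theta) raises the joint survival probability
for theta in (0.2, 1.0, 5.0):
    print(theta, round(float(joint_survival(1.0, 1.0, theta)), 4))
```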
Abstract:
In survival analysis applications, the failure rate function may frequently present a unimodal shape. In such cases, the log-normal or log-logistic distributions are used. In this paper, we shall be concerned only with parametric forms, so a location-scale regression model based on the Burr XII distribution is proposed for modeling data with a unimodal failure rate function as an alternative to the log-logistic regression model. Assuming censored data, we consider a classic analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed to compare the performance of the log-logistic and log-Burr XII regression models. In addition, we use sensitivity analysis to detect influential or outlying observations, and residual analysis is used to check the assumptions in the model. Finally, we analyze a real data set under log-Burr XII regression models. (C) 2008 Published by Elsevier B.V.
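A quick look at the Burr XII distribution that motivates the model: for suitable shape parameters its hazard is unimodal, which is the behaviour the log-Burr XII regression targets. The sketch evaluates the hazard on a grid and fits the shapes to simulated, uncensored data with SciPy; the regression structure, censoring, Bayesian analysis and jackknife estimator are omitted.

```python
import numpy as np
from scipy.stats import burr12

c, d = 2.0, 1.5                               # shape parameters (illustrative)
t = np.linspace(0.05, 5.0, 200)
hazard = burr12.pdf(t, c, d) / burr12.sf(t, c, d)   # h(t) = f(t) / S(t)
print("hazard peaks near t =", round(float(t[np.argmax(hazard)]), 2))

sample = burr12.rvs(c, d, size=1000, random_state=0)
c_hat, d_hat, loc_hat, scale_hat = burr12.fit(sample, floc=0)
print("fitted shapes:", round(c_hat, 2), round(d_hat, 2))
```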
Abstract:
We review some issues related to the implications of different missing data mechanisms on statistical inference for contingency tables and consider simulation studies to compare the results obtained under such models to those where the units with missing data are disregarded. We confirm that although, in general, analyses under the correct missing at random and missing completely at random models are more efficient even for small sample sizes, there are exceptions where they may not improve the results obtained by ignoring the partially classified data. We show that under the missing not at random (MNAR) model, estimates on the boundary of the parameter space as well as lack of identifiability of the parameters of saturated models may be associated with undesirable asymptotic properties of maximum likelihood estimators and likelihood ratio tests; even in standard cases the bias of the estimators may be low only for very large samples. We also show that the probability of a boundary solution obtained under the correct MNAR model may be large even for large samples and that, consequently, we may not always conclude that a MNAR model is misspecified because the estimate is on the boundary of the parameter space.
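A compact simulation of the central contrast discussed above: when Y is missing at random given X, the maximum likelihood estimate of a joint cell probability combines P(X) from all units with P(Y | X) from the completely classified units, whereas discarding the partially classified units (complete-case analysis) is biased. The probabilities and missingness rates below are invented for illustration and do not reproduce the paper's models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
x = rng.binomial(1, 0.4, n)
y = rng.binomial(1, np.where(x == 1, 0.7, 0.2))
miss = rng.random(n) < np.where(x == 1, 0.6, 0.1)   # Y missing depending on X (MAR)
obs = ~miss

# complete-case estimate of P(X=1, Y=1): restrict to fully classified units
cc = np.mean((x == 1) & (y == 1) & obs) / np.mean(obs)
# ML estimate under MAR: P(X=1) from all units, P(Y=1 | X=1) from observed Y
ml = np.mean(x == 1) * np.mean(y[(x == 1) & obs] == 1)

print("true P(X=1, Y=1) = 0.28")
print(f"complete-case estimate: {cc:.3f}   ML-under-MAR estimate: {ml:.3f}")
```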
Abstract:
Background: In areas with limited infrastructure for microscopy-based diagnosis, rapid diagnostic tests (RDTs) have been demonstrated to be effective. Method: The cost-effectiveness of the OptiMAL(R) test and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon ran from the onset of fever until the diagnostic result was provided to the patient, and the temporal reference was the year 2006. The results were expressed as costs per adequately diagnosed case in 2006 U.S. dollars. Sensitivity analysis was performed considering key model parameters. Results: In the base-case scenario, considering 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$ 549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion: Microscopy is more cost-effective than OptiMAL(R) in these remote areas if its high accuracy is maintained in the field. The decision regarding the use of rapid tests for malaria diagnosis in these areas depends on the actual microscopy accuracy in the field.
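The one-way sensitivity analysis can be mimicked with a deliberately simplified model: sweep microscopy's sensitivity for P. falciparum while holding everything else fixed, and watch when microscopy stops being the more effective strategy and what its incremental cost per extra adequate diagnosis becomes. Prevalences, per-patient costs and the remaining accuracy figures below are hypothetical placeholders, not the study's field data.

```python
import numpy as np

prev_f, prev_v = 0.30, 0.30                    # hypothetical case mix
cost_mic, cost_rdt = 7.0, 4.5                  # US$ per patient, hypothetical

def frac_adequately_diagnosed(sens_f, sens_v, spec):
    """Expected fraction of patients correctly classified."""
    healthy = 1.0 - prev_f - prev_v
    return prev_f * sens_f + prev_v * sens_v + healthy * spec

eff_rdt = frac_adequately_diagnosed(0.93, 0.88, 0.97)   # hypothetical RDT accuracy
for sens_f in np.arange(0.78, 0.981, 0.04):
    eff_mic = frac_adequately_diagnosed(sens_f, 0.95, 1.00)
    if eff_mic <= eff_rdt:
        verdict = "microscopy dominated (less effective, more costly)"
    else:
        icer = (cost_mic - cost_rdt) / (eff_mic - eff_rdt)
        verdict = f"ICER = US$ {icer:,.1f} per extra adequate diagnosis"
    print(f"microscopy sens(Pf) = {sens_f:.2f}: {verdict}")
```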
Abstract:
This paper presents a new methodology to estimate harmonic distortions in a power system, based on measurements at a limited number of given sites. The algorithm utilizes evolutionary strategies (ES), a branch of evolutionary algorithms. The main advantage of using such a technique lies in its modeling flexibility as well as its potential to solve fairly complex problems. The problem-solving algorithm proposed herein makes use of data from various power-quality (PQ) meters, which can be synchronized either by high-technology global positioning system devices or by using information from a fundamental-frequency load flow. This second approach makes the overall PQ monitoring system much less costly. The algorithm is applied to an IEEE test network, for which sensitivity analysis is performed to determine how the parameters of the ES can be selected so that the algorithm performs effectively. Case studies show fairly promising results and demonstrate the robustness of the proposed method.
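As a flavour of the evolutionary-strategy machinery, the sketch below runs a bare-bones (mu + lambda) ES with Gaussian mutation on a toy harmonic-estimation task: recovering the amplitude and phase of a 5th-harmonic component from noisy waveform samples. Population sizes, mutation step and the synthetic signal are illustrative and unrelated to the paper's IEEE test network or meter placement.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0 / 60.0, 128)                        # one fundamental cycle
true_amp, true_phase = 0.12, 0.8
signal = (np.sin(2 * np.pi * 60 * t)
          + true_amp * np.sin(2 * np.pi * 300 * t + true_phase)
          + 0.01 * rng.normal(size=t.size))                   # noisy measurement

def fitness(ind):
    """Squared error between the measured waveform and the candidate model."""
    amp, phase = ind
    model = np.sin(2 * np.pi * 60 * t) + amp * np.sin(2 * np.pi * 300 * t + phase)
    return np.sum((signal - model) ** 2)

mu, lam, sigma = 5, 20, 0.1
pop = rng.uniform([0.0, -np.pi], [1.0, np.pi], size=(mu, 2))  # initial parents
for _ in range(60):
    parents = pop[rng.integers(0, mu, lam)]
    offspring = parents + sigma * rng.normal(size=(lam, 2))   # Gaussian mutation
    pool = np.vstack([pop, offspring])
    pop = pool[np.argsort([fitness(p) for p in pool])[:mu]]   # (mu + lambda) selection

print("estimated 5th-harmonic amplitude/phase:", np.round(pop[0], 3))
```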