968 results for Regression-analysis
Abstract:
AIM: The study aimed to compare the rate of success and cost of anal fistula plug (AFP) insertion and endorectal advancement flap (ERAF) for anal fistula. METHOD: Patients receiving an AFP or ERAF for a complex single fistula tract, defined as involving more than a third of the longitudinal length of the anal sphincter, were registered in a prospective database. A regression analysis was performed of factors predicting recurrence and contributing to cost. RESULTS: Seventy-one patients (AFP 31, ERAF 40) were analysed. Twelve (39%) recurrences occurred in the AFP group and 17 (43%) in the ERAF group (P = 1.00). The median length of stay was 1.23 and 2.0 days (P < 0.001), respectively, and the mean cost of treatment was €5439 ± €2629 and €7957 ± €5905 (P = 0.021), respectively. On multivariable analysis, postoperative complications, underlying inflammatory bowel disease and fistula recurring after previous treatment were independent predictors of de novo recurrence. The analysis also showed length of hospital stay ≤ 1 day to be the most significant independent contributor to lower cost (P = 0.023). CONCLUSION: Anal fistula plug and ERAF were equally effective in treating fistula-in-ano, but AFP gave a mean cost saving of €2518 per procedure compared with ERAF. The higher cost of ERAF is due to a longer median length of stay.
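A multivariable model of this kind (binary recurrence regressed on binary clinical predictors) is often fitted as a logistic regression; the abstract does not state the exact specification, and all data and variable names in the sketch below are synthetic and illustrative.

```python
# Minimal sketch of a multivariable analysis of recurrence predictors,
# in the spirit of the abstract above. All data are synthetic and the
# predictor names are illustrative, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 71
df = pd.DataFrame({
    "complication": rng.integers(0, 2, n),  # postoperative complication (0/1)
    "ibd":          rng.integers(0, 2, n),  # inflammatory bowel disease (0/1)
    "prior_repair": rng.integers(0, 2, n),  # recurred after prior treatment (0/1)
    "recurrence":   rng.integers(0, 2, n),  # outcome (0/1)
})

X = sm.add_constant(df[["complication", "ibd", "prior_repair"]])
model = sm.Logit(df["recurrence"], X).fit(disp=0)
print(model.summary())       # coefficients and p-values
print(np.exp(model.params))  # odds ratios
```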
Abstract:
Objectives: We present a retrospective analysis of a single-institution experience of radiosurgery (RS) for brain metastasis (BM) with Gamma Knife (GK) and Linac. Methods: From July 2010 to July 2012, 28 patients (with 83 lesions) had RS with GK and 35 patients (with 47 lesions) with Linac. The primary outcome was local progression-free survival (LPFS). The secondary outcome was overall survival (OS). Apart from a standard statistical analysis, we included a Cox regression model with shared frailty to model the within-patient correlation (a preliminary evaluation showed a significant frailty effect, meaning that the correlation within patients could not be ignored). Results: The mean follow-up period was 11.7 months (median 7.9, range 1.7-22.7) for GK and 18.1 months (median 17, range 7.5-28.7) for Linac. The median number of lesions per patient was 2.5 (range 1-9) for GK compared with 1 (range 1-3) for Linac. There were more radioresistant lesions (melanoma) and more lesions located in functional areas in the GK group. The median dose was 24 Gy (GK) compared with 20 Gy (Linac). The actuarial LPFS rate for GK at 3, 6, 9, 12, and 17 months was 96.96, 96.96, 96.96, 88.1, and 81.5%, and remained stable till 32 months; for Linac at 3, 6, 12, 17, 24, and 33 months it was 91.5, 91.5, 91.5, 79.9, 55.5, and 17.1%, respectively (p = 0.03, chi-square test). After the Cox regression analysis with shared frailty, the p-value was not statistically significant between groups. The median overall survival was 9.7 months for the GK group and 23.6 months for the Linac group. Uni- and multivariate analyses showed that a lower GPA score and noncontrolled systemic status were associated with lower OS. A Cox regression analysis adjusting for these two parameters showed comparable OS rates. Conclusions: In this comparative report of GK and Linac, preliminary analysis showed that the more difficult cases are treated by GK, with patients harboring more lesions, radioresistant tumors, and lesions in highly functional locations. The groups are, in this sense, very heterogeneous at baseline. After a Cox frailty model, the LPFS rates seemed very similar (p > 0.05). The OS was similar after adjusting for systemic status and GPA score (p > 0.05). The technical reasons for choosing GK over Linac were anatomical locations in highly functional areas, histology, technical limitations of Linac movements (especially for lower posterior fossa locations), or the closeness of multiple lesions to highly functional areas, which precluded optimal dosimetry with Linac.
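The within-patient clustering these authors handle with a shared-frailty Cox model can be sketched in Python. The lifelines library does not ship a shared-frailty Cox model, so the sketch below substitutes the closely related cluster-robust (sandwich) variance via the cluster_col option of CoxPHFitter; all data and column names are synthetic stand-ins, not the study's.

```python
# Sketch of handling within-patient correlation when each patient
# contributes several lesions. lifelines has no shared-frailty Cox model,
# so this uses cluster-robust standard errors via cluster_col instead.
# Data below are synthetic and column names are illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n_lesions = 130
df = pd.DataFrame({
    "patient_id":  rng.integers(0, 63, n_lesions),  # lesions nested in patients
    "months":      rng.exponential(12, n_lesions),  # time to local progression
    "progressed":  rng.integers(0, 2, n_lesions),   # event indicator
    "gamma_knife": rng.integers(0, 2, n_lesions),   # 1 = GK, 0 = Linac
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="progressed",
        cluster_col="patient_id")  # robust SEs for within-patient correlation
cph.print_summary()
```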
Abstract:
The analysis of price asymmetries in the gasoline market is one of the most studied topics in the energy economics literature. Nevertheless, the great variability of results makes it very difficult to draw conclusive results on the existence or not of asymmetries. This paper shows, through a meta-analysis approach, how the industry segment analysed, the quality and quantity of the data, and the estimator and model used may explain this heterogeneity of results.
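A meta-regression of this kind can be sketched as a weighted least-squares fit of study-level effect estimates on study-level moderators, weighting by inverse variance. The moderators, weights, and effect sizes below are synthetic and illustrative, assuming each primary study reports an asymmetry estimate and its standard error.

```python
# Minimal meta-regression sketch: reported asymmetry estimates regressed
# on study characteristics (segment, data frequency, estimator) with
# inverse-variance weights. Synthetic, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
k = 60  # number of primary estimates
df = pd.DataFrame({
    "asymmetry": rng.normal(0.1, 0.3, k),  # reported asymmetry effect
    "se":        rng.uniform(0.05, 0.2, k),
    "retail":    rng.integers(0, 2, k),    # retail vs wholesale segment
    "weekly":    rng.integers(0, 2, k),    # weekly vs monthly data
    "ecm":       rng.integers(0, 2, k),    # error-correction estimator
})

res = smf.wls("asymmetry ~ retail + weekly + ecm",
              data=df, weights=1.0 / df["se"] ** 2).fit()
print(res.summary())  # moderator coefficients explain heterogeneity
```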
Abstract:
The 51st ERSA Conference, held in Barcelona in 2011, was one of the largest ever. By examining the characteristics of the conference, this paper identifies the main trends in Regional Science, drawing on a broad array of information sources: the delegates' demographic details, the conference program itself, a satisfaction survey conducted among delegates, a quality survey addressed to those chairing the sessions and, finally, a bibliometric database including each author signing a paper presented at the conference. Finally, we run a regression analysis showing that what matters most to ERSA delegates is quality, and this must be the direction in which future conferences move. Ultimately, ERSA conferences are comprehensive, all-embracing occasions, representing an ideal opportunity for regional scientists to present their work to each other and to network.
Abstract:
Two speed management policies were implemented in the metropolitan area of Barcelona with the aim of reducing air pollution concentration levels. In 2008, the maximum speed limit was reduced to 80 km/h and, in 2009, a variable speed system was introduced on some metropolitan motorways. This paper evaluates whether such policies have been successful in promoting cleaner air, not only in terms of mean pollutant levels but also during high and low pollution episodes. We use a quantile regression approach for fixed-effects panel data. We find that the variable speed system improves air quality with regard to the two pollutants considered here, being most effective when nitrogen oxide levels are not too low and when particulate matter concentrations are below extremely high levels. However, reducing the maximum speed limit from 120/100 km/h to 80 km/h has no effect, or even a slightly increasing effect, on the two pollutants, depending on the pollution scenario.
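The estimation strategy described, quantile regression on fixed-effects panel data, can be sketched with statsmodels by entering the panel units (here, hypothetical monitoring stations) as dummies and fitting the policy coefficient at several quantiles; all data and variable names below are synthetic.

```python
# Sketch of quantile regression with fixed effects: pollutant levels
# regressed on a policy dummy at several quantiles, with monitoring
# station fixed effects entered as dummies. Synthetic, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "no2":            rng.gamma(4, 10, n),   # pollutant concentration
    "variable_speed": rng.integers(0, 2, n), # policy in force (0/1)
    "station":        rng.integers(0, 8, n), # monitoring station id
})

for q in (0.25, 0.5, 0.9):  # low, central, and high pollution episodes
    res = smf.quantreg("no2 ~ variable_speed + C(station)", df).fit(q=q)
    print(q, res.params["variable_speed"])
```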
Abstract:
Background: The DNA repair protein O6-Methylguanine-DNA methyltransferase (MGMT) confers resistance to alkylating agents. Several methods have been applied to its analysis, with methylation-specific polymerase chain reaction (MSP) the most commonly used for promoter methylation studies, while immunohistochemistry (IHC) has become the most frequently used for the detection of MGMT protein expression. Agreement on the best and most reliable technique for evaluating MGMT status remains unsettled. The aim of this study was to perform a systematic review and meta-analysis of the correlation between IHC and MSP. Methods: A computer-aided search of MEDLINE (1950-October 2009), EBSCO (1966-October 2009) and EMBASE (1974-October 2009) was performed for relevant publications. Studies meeting the inclusion criteria were those comparing MGMT protein expression by IHC with MGMT promoter methylation by MSP in the same cohort of patients. Methodological quality was assessed using the QUADAS and STARD instruments. Previously published guidelines were followed for the performance of the meta-analysis. Results: Of 254 studies identified as eligible for full-text review, 52 (20.5%) met the inclusion criteria. The review showed that results of MGMT protein expression by IHC are not in close agreement with those obtained with MSP. Moreover, type of tumour (primary brain tumour vs others) was an independent covariate of accuracy estimates in the meta-regression analysis, beyond the cut-off value. Conclusions: Protein expression assessed by IHC alone fails to reflect the promoter methylation status of MGMT. Thus, in attempts at clinical diagnosis the two methods seem to select different groups of patients and should not be used interchangeably.
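One common way to quantify the per-cohort agreement that such a meta-analysis pools is a chance-corrected agreement statistic between the two binary calls on the same patients. The sketch below uses Cohen's kappa on synthetic labels purely as an illustration, not as the authors' pooling method.

```python
# Sketch of quantifying agreement between two binary classifications of
# the same patients (IHC protein expression vs MSP promoter methylation).
# Labels are synthetic and illustrative.
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

rng = np.random.default_rng(4)
ihc = rng.integers(0, 2, 100)  # 1 = MGMT protein expressed
msp = rng.integers(0, 2, 100)  # 1 = promoter methylated

print(confusion_matrix(ihc, msp))   # 2x2 agreement table
print(cohen_kappa_score(ihc, msp))  # chance-corrected agreement
```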
Abstract:
When laboratory intercomparison exercises are conducted, there is no a priori dependence of the concentration of a given compound determined in one laboratory on that determined by another. The same applies when comparing different methodologies. An existing data set of total mercury readings in fish muscle samples from a Brazilian intercomparison exercise was used to show that correlation analysis is the most effective statistical tool in this kind of experiment. Problems associated with alternative analytical tools, such as comparison of means, the paired t-test and regression analysis, are discussed.
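The contrast the authors draw between correlation analysis, mean or paired t-test comparison, and regression can be illustrated on synthetic paired measurements; the data below are invented for the sketch, not from the Brazilian exercise.

```python
# Sketch contrasting the three statistical tools discussed above on
# paired measurements of the same samples from two laboratories.
# Synthetic, illustrative data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
lab_a = rng.lognormal(0.0, 0.5, 30)             # total Hg, lab A (mg/kg)
lab_b = lab_a * 1.05 + rng.normal(0, 0.05, 30)  # lab B: slight bias + noise

r, p_r = stats.pearsonr(lab_a, lab_b)   # correlation: strength of association
t, p_t = stats.ttest_rel(lab_a, lab_b)  # paired t-test: mean difference
reg = stats.linregress(lab_a, lab_b)    # regression: slope/intercept bias

print(f"r={r:.3f} (p={p_r:.3g}); paired t p={p_t:.3g}; "
      f"slope={reg.slope:.3f}, intercept={reg.intercept:.3f}")
```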
Abstract:
This paper measures the connectedness in EMU sovereign market volatility between April 1999 and January 2014, in order to monitor stress transmission and to identify episodes of intensive spillovers from one country to the others. To this end, we first perform a static and dynamic analysis to measure the total volatility connectedness in the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yılmaz (2014). Second, we make use of a dynamic analysis to evaluate the net directional connectedness for each country and apply panel model techniques to investigate its determinants. Finally, to gain further insights, we examine the time-varying behaviour of net pair-wise directional connectedness at different stages of the recent sovereign debt crisis.
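The total connectedness measure of Diebold and Yılmaz (2014) is built from a forecast-error variance decomposition (FEVD) of a VAR: the share of each market's forecast-error variance attributable to shocks in the other markets, averaged across markets. The sketch below approximates this with the Cholesky-based FEVD available in statsmodels (the paper uses the order-invariant generalized FEVD) on synthetic volatility series for four hypothetical markets.

```python
# Sketch of a total connectedness index from a VAR forecast-error
# variance decomposition. statsmodels provides the Cholesky
# (order-dependent) FEVD, used here as an approximation to the
# generalized FEVD of Diebold and Yilmaz (2014). Synthetic data.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
data = pd.DataFrame(rng.normal(size=(300, 4)),
                    columns=["DE", "FR", "IT", "ES"])  # volatility proxies

res = VAR(data).fit(maxlags=2)
H = 10
decomp = res.fevd(H).decomp[:, H - 1, :]  # shares at horizon H; rows sum to 1
cross = decomp.sum() - np.trace(decomp)   # variance due to other markets
total_connectedness = 100 * cross / decomp.shape[0]
print(f"total connectedness: {total_connectedness:.1f}%")
```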
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
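As a rough illustration of the first exercise, the sketch below generates a random calibration line under simple constraints and then computes the regression quality parameters a student would be asked to report. Python stands in for the Matlab-based fusion code of Goodle GMS, and the constraint ranges and LOD formula (the common 3.3·s/slope estimate) are illustrative assumptions.

```python
# Sketch of the exercise pattern described: generate random calibration
# data under constraints, then recover the regression quality parameters.
# All constraint ranges are illustrative, not those of Goodle GMS.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
conc = np.linspace(1, 10, 6)  # standard concentrations
slope, intercept = rng.uniform(0.8, 1.2), rng.uniform(0.0, 0.1)
signal = slope * conc + intercept + rng.normal(0, 0.02, conc.size)

reg = stats.linregress(conc, signal)
resid = signal - (reg.slope * conc + reg.intercept)
s_yx = np.sqrt((resid ** 2).sum() / (conc.size - 2))  # residual std error
lod = 3.3 * s_yx / reg.slope                          # ICH-style LOD estimate

print(f"slope={reg.slope:.4f}, intercept={reg.intercept:.4f}, "
      f"R2={reg.rvalue**2:.5f}, LOD={lod:.3f}")
```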
Abstract:
Due to its non-storability, electricity must be produced at the same time that it is consumed; as a result, prices are determined on an hourly basis, which makes analysis more challenging. Moreover, the seasonal fluctuations in demand and supply lead to seasonal behavior in electricity spot prices. The purpose of this thesis is to identify and remove all causal effects from electricity spot prices, leaving pure prices for modeling purposes. To achieve this, we use Qlucore Omics Explorer (QOE) for the visualization and exploration of the data set and the Time Series Decomposition method to estimate and extract the deterministic components from the series. To obtain the target series, we use regression based on the background variables (water reservoir and temperature). The result is three price series (the Swedish, Norwegian and System prices) with no apparent pattern.
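The two-step filtering described, deterministic seasonal extraction followed by regression on the background variables, can be sketched as follows; the data, frequencies, and column names are synthetic stand-ins for the actual price and background series.

```python
# Sketch of the two-step filtering described: remove the deterministic
# seasonal component, then regress the deseasonalised price on background
# variables (reservoir level, temperature) and keep the residual as the
# "pure" price. Synthetic daily data; names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(8)
idx = pd.date_range("2012-01-01", periods=730, freq="D")
price = pd.Series(30 + 5 * np.sin(2 * np.pi * idx.dayofyear / 365)
                  + rng.normal(0, 2, 730), index=idx)
reservoir = pd.Series(rng.normal(0, 1, 730), index=idx)
temperature = pd.Series(rng.normal(0, 1, 730), index=idx)

deseason = price - seasonal_decompose(price, period=365).seasonal
X = sm.add_constant(pd.concat({"res": reservoir, "temp": temperature}, axis=1))
pure = sm.OLS(deseason, X).fit().resid  # price with causal effects removed
print(pure.describe())
```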
Abstract:
The growing interest in the usage of dietary fiber in food has created the need for precise tools for describing its physical properties. This research examined two dietary fibers, from oats and beets respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The most significant water holding capacity (7.00 g water/g solid) was achieved by the smaller sized oat fiber. Conversely, the water holding capacity was highest (4.20 g water/g solid) in the larger sized beet fiber. There was evidence of water absorption increasing with a decrease in particle size for the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness and width, and the hydration properties measured conventionally. The regression analysis provided the opportunity to assess whether the automated static image analysis method could be an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model, which was verified against conventional WHC measurement results.
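The final regression step, relating image-derived shape parameters to conventionally measured water holding capacity, can be sketched as an ordinary least-squares fit; all measurements, units, and coefficient values below are synthetic and illustrative.

```python
# Sketch of the validation step described: regress conventionally measured
# water holding capacity (WHC) on image-derived shape parameters.
# Synthetic, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 40
df = pd.DataFrame({
    "length":       rng.uniform(50, 400, n),  # fiber length (um)
    "width":        rng.uniform(10, 80, n),   # fiber width (um)
    "straightness": rng.uniform(0.5, 1.0, n),
})
df["whc"] = (2 + 0.01 * df["length"] - 0.02 * df["width"]
             + rng.normal(0, 0.3, n))         # g water / g solid

model = smf.ols("whc ~ length + width + straightness", df).fit()
print(model.rsquared, model.params)           # fit quality, coefficients
```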