944 results for Interval analysis


Relevance: 40.00%

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where the cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. The problem of finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them.

Techniques based on interval extensions have made it possible to obtain accurate models of signal and quantization noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art methodology based on statistical Modified Affine Arithmetic (MAA) in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the system's statistical moments from the partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from the simulation-based reference values.

A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the targeted systems grows, which leads to scalability problems. To address this issue we present a clustered noise injection technique that groups the signals in the system, introduces the noise terms in each group independently and then combines the results at the end. In this way, the number of noise sources in the system at any given time is controlled and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible.

This thesis also covers the development of methodologies for word-length optimization based on Monte-Carlo simulations that run in reasonable time. We do so by presenting two novel techniques that approach the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a given confidence level for the final results of the optimization process, we can use more relaxed confidence levels, and therefore considerably fewer samples per simulation, in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be accelerated by factors of up to ×240 for small/medium-sized problems.

Finally, this work introduces HOPLITE, an automated, flexible and modular framework for quantization that includes the implementation of the previous techniques and is publicly available. The aim is to offer developers and researchers a common ground for easily prototyping and verifying new techniques for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its execution. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
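
The additive round-off noise model that underlies this kind of word-length analysis can be illustrated with a short, self-contained sketch (not part of HOPLITE; plain NumPy, with an assumed uniformly distributed input signal): rounding a signal to B fractional bits introduces a quantization error whose variance is approximately step^2 / 12, where step = 2^-B.

    import numpy as np

    def quantize(x, frac_bits):
        """Round x onto a fixed-point grid with the given number of fractional bits."""
        step = 2.0 ** -frac_bits
        return np.round(x / step) * step

    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=100_000)   # synthetic, uniformly distributed signal

    for bits in (8, 12, 16):
        noise = quantize(x, bits) - x
        step = 2.0 ** -bits
        # Measured round-off noise variance vs. the analytical model step**2 / 12
        print(f"B={bits:2d}  measured={noise.var():.3e}  model={step**2 / 12:.3e}")

Analytical approaches such as the MAA-based models described above propagate noise terms of this kind through the design instead of simulating them, which is what makes them attractive for large systems.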

Relevance: 40.00%

Abstract:

This dissertation presents a unique research opportunity by using recordings that provide an electrocardiogram (ECG) plus a reference breathing signal (RBS). ECG-derived breathing (EDR) is measured and correlated against the RBS. Standard deviations of multiresolution wavelet analysis coefficients (SDMW) are obtained from heart rate and classified using the RBS. Prior work by others used selected patients for sleep apnea scoring with EDR but no RBS; other prior work classified selected heart disease patients with SDMW but no RBS. This study used randomly chosen sleep disorder patient recordings, with central and obstructive apneas, with and without heart disease. Implementation required creating an application because existing systems were limited in power and scope. A review survey was created to choose a development environment; the survey is presented as a learning tool and teaching resource. The development objective was rapid development using limited resources (manpower and money), and Open Source resources were used exclusively for the implementation. Results show: (1) Three groups of patients exist in the study. Grouping the RBS correlations shows a response with either ECG interval or amplitude variation; a third group exists in which neither ECG intervals nor amplitude variation correlate with breathing. (2) Previous work by other groups analyzed SDMW. Similar results were found in this study, but some subjects had higher SDMW, attributed to a large number of apneas, arousals and/or disconnects. SDMW does not need the RBS to show that apneic conditions exist within ECG recordings. (3) Results in this study support the assertion that autonomic nervous system variation was measured with SDMW, and measurements using the RBS are not corrupted by breathing even though respiration overlaps the same frequency band. Overall, this work becomes an Open Source resource that can be reused, modified and/or expanded and might fast-track additional research. In the future the system could also be used for public-domain data; prerecorded data exist in similar formats in public databases, which could provide additional research opportunities.
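
As a rough illustration of the SDMW idea (not the dissertation's actual implementation), the feature can be sketched with PyWavelets: decompose an RR-interval series with a discrete wavelet transform and take the standard deviation of the detail coefficients at each scale. The wavelet family, decomposition depth and synthetic input below are assumptions for the example only.

    import numpy as np
    import pywt  # PyWavelets

    def sdmw(rr_intervals, wavelet="db4", level=5):
        """Standard deviations of multiresolution wavelet (detail) coefficients
        of an RR-interval (heart-rate) series, one value per scale."""
        coeffs = pywt.wavedec(rr_intervals, wavelet, level=level)
        # coeffs[0] is the coarsest approximation; coeffs[1:] are the detail levels
        return [float(np.std(d)) for d in coeffs[1:]]

    rng = np.random.default_rng(1)
    rr = 0.8 + 0.05 * rng.standard_normal(1024)   # synthetic RR series in seconds
    print(sdmw(rr))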

Relevance: 30.00%

Abstract:

To investigate the degree of T2 relaxometry changes over time in groups of patients with familial mesial temporal lobe epilepsy (FMTLE) and asymptomatic relatives, we conducted both cross-sectional and longitudinal analyses of T2 relaxometry with Aftervoxel, in-house software for medical image visualization. The cross-sectional study included 35 subjects (26 with FMTLE and 9 asymptomatic relatives) and 40 controls; the longitudinal study comprised 30 subjects (21 with FMTLE and 9 asymptomatic relatives; the mean interval between MRIs was 4.4 ± 1.5 years) and 16 controls. To increase the size of our groups of patients and relatives, we combined data acquired on 2 scanners (2T and 3T) and obtained z-scores using their respective controls. A general linear model in SPSS 21® was used for statistical analysis. In the cross-sectional analysis, elevated T2 relaxometry values were identified in subjects with seizures and intermediate values in asymptomatic relatives, compared with controls. Subjects with MRI signs of hippocampal sclerosis presented elevated T2 relaxometry in the ipsilateral hippocampus, while patients and asymptomatic relatives with normal MRI presented elevated T2 values in the right hippocampus. The longitudinal analysis revealed a significant increase in T2 relaxometry for the ipsilateral hippocampus exclusively in patients with seizures. The longitudinal increase of T2 signal in patients with seizures suggests an interaction between ongoing seizures and the underlying pathology, causing progressive damage to the hippocampus. The identification of elevated T2 relaxometry in asymptomatic relatives and in patients with normal MRI suggests that genetic factors may be involved in the development of some mild hippocampal abnormalities in FMTLE.
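
The scanner-harmonization step described above (z-scores computed against each scanner's own controls) amounts to the following simple transformation; this is a generic sketch with hypothetical variable names, not the authors' code.

    import numpy as np

    def zscore_by_scanner(t2_values, scanner_ids, is_control):
        """Convert raw T2 relaxometry values into z-scores using the mean and SD
        of the control subjects scanned on the same scanner."""
        z = np.empty_like(t2_values, dtype=float)
        for scanner in np.unique(scanner_ids):
            in_scanner = scanner_ids == scanner
            controls = t2_values[in_scanner & is_control]
            z[in_scanner] = (t2_values[in_scanner] - controls.mean()) / controls.std(ddof=1)
        return z

The z-scores obtained separately for the 2T and 3T cohorts can then be pooled into a single group analysis.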

Relevance: 30.00%

Abstract:

Purpose: The purpose of this work was to evaluate the potential of substituting autogenous bone (AB) with bone marrow aspirate concentrate (BMAC). Both AB and BMAC were tested in combination with a bovine bone mineral (BBM) for their ability to support new bone formation (NBF) in a multicenter, randomized, controlled, clinical and histological noninferiority trial. Materials and Methods: Forty-five severely atrophic maxillary sinuses from 26 patients were evaluated in a partial cross-over design. In the test arm, 34 sinuses from 25 patients were augmented with BBM and BMAC containing mesenchymal stem cells. Eleven control sinuses from 11 patients were augmented with a mixture of 70% BBM and 30% AB. Biopsies were obtained after a 3-4-month healing period, at the time of implant placement, and analyzed histomorphometrically for NBF. Results: NBF was 14.3% ± 1.8% for the control group and nonsignificantly lower (12.6% ± 1.7%) for the test group (90% confidence interval: -4.6 to 1.2). Values for BBM (31.3% ± 2.7%) were significantly higher in the test group compared with the control group (19.3% ± 2.5%) (p < 0.0001). Nonmineralized tissue was lower by 3.3% in the test group compared with the control group (57.6%; p = 0.137). Conclusions: NBF after 3-4 months is equivalent in sinuses augmented with BMAC and BBM or with a mixture of AB and BBM. This technique could be an alternative to using autografts to stimulate bone formation.
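
The noninferiority reasoning behind the reported 90% confidence interval can be written compactly; the noninferiority margin delta is not stated in the abstract, so it appears only symbolically below.

    \Delta = \bar{x}_{\mathrm{test}} - \bar{x}_{\mathrm{control}} = 12.6\% - 14.3\% = -1.7\ \text{percentage points}, \qquad 90\%\ \mathrm{CI}: (-4.6,\ 1.2)

    \text{Noninferiority of the test arm is concluded if the lower bound of this CI exceeds the prespecified margin } -\delta .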

Relevance: 30.00%

Abstract:

In this study we used fluorescence spectroscopy to determine the post-mortem interval. Conventional methods in forensic medicine involve tissue or body-fluid sampling and laboratory tests, which are often time-consuming and may depend on expensive analyses. The presented method consists of using time-dependent variations in the fluorescence spectrum and their correlation with the time elapsed after the cessation of regular metabolic activity. This new approach addresses unmet needs for post-mortem interval determination in forensic medicine by providing rapid, in situ measurements that show improved time resolution relative to existing methods.

Relevance: 30.00%

Abstract:

Matsushigue, KA, Hartmann, K, and Franchini, E. Taekwondo: Physiological responses and match analysis. J Strength Cond Res 23(4): 1112-1117, 2009. The aim of the present study was to determine the time structure and physiological responses during Songahm Taekwondo (TKD) competition and to compare these variables between winner and non-winner athletes. Fourteen male subjects were analyzed. Blood lactate concentration (LA) and heart rate (HR) were determined before and after the match. The match was filmed to determine the number of techniques used, the duration of effort and rest periods (RPs), and the interval between high-intensity movements (HM). Post-match LA was 7.5 ± 3.8 mmol/L, HR was 183 ± 9 beats/min, and HM was 31 ± 16 seconds. The mean effort time (862 seconds) did not differ from the mean interval time (8 ± 3 seconds). Winners used a smaller total number of techniques, but post-match LA and HR did not differ from those of non-winners. In conclusion, glycolytic metabolism was not the predominant energy source and the physiological responses did not differ between winners and non-winners. Coaches and sports scientists should design technical and physical training sessions considering the low glycolytic contribution in this sport; training protocols should therefore involve high-intensity movements interspersed with longer RPs to allow creatine phosphate recovery, with special attention given to the technical quality of TKD skills rather than to a higher technique volume during match simulations.

Relevance: 30.00%

Abstract:

Mixed martial arts (MMA) has become a fast-growing form of martial arts competition worldwide, requiring a high level of skill, physical conditioning, and strategy, and involving a synthesis of combat while standing or on the ground. This study quantified the effort-pause (EP) ratio and classified effort segments as stand-up or groundwork in order to identify the number of actions performed per round in MMA matches. Fifty-two MMA athletes participated in the study (M age = 24 yr., SD = 5; average experience in MMA = 5 yr., SD = 3). A one-way analysis of variance with repeated measures was conducted to compare the type of action across the rounds, and a chi-squared test was applied to the percentages to compare proportions of different events. Only one significant difference (p < .05) was observed among rounds: time in low-intensity groundwork was longer in the second round than in the third. When the interval between rounds was not considered, the EP ratio (high-intensity effort to low-intensity effort plus pauses) ranged from 1:2 to 1:4. This ratio lies between those typical of judo, wrestling, karate, and taekwondo and reflects the combination of ground and stand-up techniques. Most of the matches ended in the third round, involving high-intensity actions predominantly executed during groundwork combat.

Relevance: 30.00%

Abstract:

Objective: To determine the incidence of interval cancers which occurred in the first 12 months after mammographic screening at a mammographic screening service. Design: Retrospective analysis of data obtained by crossmatching the screening Service and the New South Wales Central Cancer Registry databases. Setting: The Central & Eastern Sydney Service of BreastScreen NSW. Participants: Women aged 40-69 years at first screen, who attended for their first or second screen between 1 March 1988 and 31 December 1992. Main outcome measures: Interval-cancer rates per 10 000 screens and as a proportion of the underlying incidence of breast cancer (as estimated by the underlying rate in the total NSW population). Results: The 12-month interval-cancer incidence per 10 000 screens was 4.17 for the 40-49 years age group (95% confidence interval [CI], 1.35-9.73) and 4.64 for the 50-69 years age group (95% CI, 2.47-7.94). Proportional incidence rates were 30.1% for the 40-49 years age group (95% CI, 9.8-70.3) and 22% for the 50-69 years age group (95% CI, 11.7-37.7). There was no significant difference between the proportional incidence rate for the 50-69 years age group for the Central & Eastern Sydney Service and those of major successful overseas screening trials. Conclusion: Screening quality was acceptable and should result in a significant mortality reduction in the screened population. Given the small number of cancers involved, comparison of interval-cancer statistics of mammographic screening programs with trials requires age-specific or age-adjusted data, and consideration of confidence intervals of both program and trial data.
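
The two outcome measures reported above have simple definitions; a symbolic form (the underlying counts are not given in the abstract) is:

    \text{interval-cancer rate} = \frac{\text{interval cancers within 12 months of a negative screen}}{\text{number of screens}} \times 10\,000

    \text{proportional incidence} = \frac{\text{interval-cancer rate}}{\text{expected underlying breast cancer incidence in the absence of screening}}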

Relevance: 30.00%

Abstract:

Objective: To evaluate the impact of increasing the minimum resupply period for prescriptions on the Pharmaceutical Benefits Scheme (PBS) in November 1994. The intervention was designed to reduce the stockpiling of medicines used for chronic medical conditions under the PBS safety net. Methods: Interrupted time series regression analyses were performed on 114 months of PBS drug utilisation data from January 1991 to June 2000. These analyses assessed whether there had been a significant interaction between the onset of the intervention in November 1994 and the extreme levels of drug utilisation in the months of December (peak utilisation) and January (lowest utilisation), respectively. Both serial and 12-month lag autocorrelations were controlled for. Results: The onset of the intervention was associated with a significant reduction in the December peak in drug utilisation; after the introduction of the policy there were 1,150,196 fewer prescriptions on average for that month (95% CI 708,333-1,592,059). There was, however, no significant change in the low level of utilisation in January. The effect of the policy appears to be decreasing across successive post-intervention years, although the odds of a prescription being dispensed in December remained significantly lower in 1999 compared with each of the pre-intervention years (11% vs. 14%). Conclusion: Analysis of the impact of increasing the resupply period for PBS prescriptions showed that the magnitude of peak utilisation in December had been markedly reduced by the policy, although this effect appears to be decreasing over time. Continued monitoring and policy review are warranted to ensure that the initial effect of the intervention is maintained.
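
A minimal sketch of an interrupted time series regression of this kind, using statsmodels; the file and column names are hypothetical, and the serial and 12-month-lag autocorrelation handled in the actual analysis is omitted here.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical monthly PBS utilisation data, one row per month:
    #   scripts  - prescriptions dispensed that month
    #   time     - month index (0..113 for Jan 1991 - Jun 2000)
    #   post     - 1 from November 1994 onward (policy in force), else 0
    #   december - 1 for December observations (safety-net stockpiling peak), else 0
    df = pd.read_csv("pbs_monthly.csv")

    # The post:december interaction tests whether the December peak changed
    # after the minimum resupply period was increased.
    model = smf.ols("scripts ~ time + post + december + post:december", data=df).fit()
    print(model.summary())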

Relevance: 30.00%

Abstract:

We aimed to determine the effectiveness of the vaginally administered spermicide nonoxynol-9 (N-9) among women for the prevention of HIV and other sexually transmitted infections (STIs). We did a systematic review of randomised controlled trials. Nine such trials, including 5096 women, predominantly sex workers, comparing N-9 with placebo or no treatment were included. Primary outcomes were new HIV infection, new episodes of various STIs, and genital lesions. Five trials included HIV outcomes and nine included STI outcomes, and all but one (2% of the data) contributed to the meta-analysis. Overall, the relative risks of HIV infection (1.12, 95% confidence interval 0.88-1.42), gonorrhoea (0.91, 0.67-1.24), chlamydia (0.88, 0.77-1.01), cervical infection (1.01, 0.84-1.22), trichomoniasis (0.84, 0.69-1.02), bacterial vaginosis (0.88, 0.74-1.04) and candidiasis (0.97, 0.84-1.12) were not significantly different in the N-9 and placebo or no-treatment groups. Genital lesions were more common in the N-9 group (1.18, 1.02-1.36). Our review found no statistically significant reduction in the risk of HIV or STIs, and the confidence intervals indicate that any protection that may exist is likely to be very small. There is some evidence of harm through genital lesions. N-9 cannot be recommended for HIV and STI prevention.
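
For reference, the relative risks and confidence intervals pooled in reviews like this one are derived, for a single trial, from the counts a events among a+b women in the N-9 arm and c events among c+d women in the control arm; this is the generic formula, not necessarily the pooling method used in this particular review.

    RR = \frac{a/(a+b)}{c/(c+d)}, \qquad
    SE(\ln RR) = \sqrt{\frac{1}{a} - \frac{1}{a+b} + \frac{1}{c} - \frac{1}{c+d}}, \qquad
    95\%\ \mathrm{CI} = \exp\!\left(\ln RR \pm 1.96\, SE\right)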

Relevance: 30.00%

Abstract:

Background: Candiduria is a hospital-associated infection and a daily problem in the intensive care unit. The treatment of asymptomatic candiduria is not well established and the use of amphotericin B bladder irrigation (ABBI) is controversial. The aim of this systematic review was to determine the place of this therapy in practice. Methods: The databases searched in this study included MEDLINE, EMBASE, Web of Science, and LILACS (January 1960-June 2007). We included manuscripts with data on the treatment of candiduria using ABBI. The studies were classified as comparative, dose-finding, or non-comparative. Results: From 213 studies, nine articles (377 patients) met our inclusion criteria. ABBI showed a higher clearance of candiduria 24 hours after the end of therapy than fluconazole (odds ratio (OR) 0.57, 95% confidence interval (CI) 0.32-1.00). Fungal culture 5 days after the end of both therapies showed a similar response (OR 1.51, 95% CI 0.81-2.80). The evaluation of ABBI using an intermittent or continuous delivery system showed an early candiduria clearance (24 hours after therapy) of 80% and 82%, respectively (OR 0.87, 95% CI 0.52-1.36). Candiduria clearance at more than 5 days after therapy showed a superior response using continuous bladder irrigation with amphotericin B (OR 0.52, 95% CI 0.29-0.94). The use of continuous ABBI for more than 5 days showed a better result (88% vs. 78%) than ABBI for less than 5 days, but the difference was not statistically significant (OR 0.55, 95% CI 0.34-1.04). Conclusion: Although the strength of the results in the underlying literature is not sufficient to allow definitive conclusions, ABBI appears to be as effective as fluconazole; however, it does not offer systemic antifungal therapy and should only be used for asymptomatic candiduria.

Relevance: 30.00%

Abstract:

Background: Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases the statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results: We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite of death, stroke, or myocardial infarction. The secondary end point was the occurrence of major adverse cardiac and cerebrovascular events: death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneity of treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P < 0.001). Major adverse cardiac and cerebrovascular events were significantly higher in the PCI group than in the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P < 0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions: In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG. However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.

Relevance: 30.00%

Abstract:

The importance of overweight as a risk factor for coronary heart disease (CHD) remains unsettled. We estimated the relative risk (RR) for CHD associated with underweight (body mass index, BMI < 20 kg/m2), overweight (25-30 kg/m2) and obesity (≥ 30 kg/m2), compared with normal weight (20-25 kg/m2) in a random effects meta-analysis of 30 prospective studies, including 389,239 healthy, predominantly Caucasian persons. We also explored sources of heterogeneity between studies and examined effects of systematic adjustment for confounding and intermediary variables. Pooled age-, sex- and smoking-adjusted RRs (95% confidence interval) for overweight, obesity and underweight compared with normal weight were 1.33 (1.24-1.43), 1.69 (1.44-1.99) and 1.01 (0.85-1.20), respectively. Stratified analyses showed that pooled RRs for BMI were higher for studies with longer follow-up (≥ vs. < 15 years) and younger populations (< vs. ≥ 60 years). Additional adjustment for blood pressure, cholesterol levels and physical activity decreased the RR per 5 BMI units from 1.28 (1.21-1.34) to 1.16 (1.11-1.21). We conclude that overweight and obesity are associated with a substantially increased CHD risk in Caucasians, whereas underweight is not. Prevention and reduction of overweight and obesity, therefore, remain of importance for preventing CHD.
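
A random-effects pooling of the kind described can be sketched with the DerSimonian-Laird estimator; the snippet below operates on per-study log relative risks and their standard errors (illustrative made-up values, not the 30 studies analysed here).

    import numpy as np

    def dersimonian_laird(log_rr, se):
        """Pool per-study log relative risks with a DerSimonian-Laird random-effects model."""
        log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
        w = 1.0 / se**2                                   # fixed-effect weights
        fixed = np.sum(w * log_rr) / np.sum(w)
        q = np.sum(w * (log_rr - fixed) ** 2)             # Cochran's Q
        k = len(log_rr)
        tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
        pooled = np.sum(w_star * log_rr) / np.sum(w_star)
        se_pooled = np.sqrt(1.0 / np.sum(w_star))
        ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
        return np.exp(pooled), tuple(ci)

    # Three made-up studies: RRs of 1.2, 1.4 and 1.3 with modest standard errors
    print(dersimonian_laird(np.log([1.2, 1.4, 1.3]), [0.10, 0.15, 0.12]))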

Relevance: 30.00%

Abstract:

XPC participates in the initial recognition of DNA damage during the DNA nucleotide excision repair process in global genomic repair. Polymorphisms in the XPC gene have been analyzed in case-control studies to assess the cancer risk attributed to these variants, but results are conflicting. To clarify the impact of XPC polymorphisms on cancer risk, we performed a meta-analysis that included 33 published case-control studies. The polymorphisms analyzed were Lys939Gln and Ala499Val. The overall summary odds ratio (OR) for the association of the 939Gln/Gln genotype with risk of cancer was 1.01 (95% confidence interval (95% CI): 0.94-1.09), but there was a statistically significant association for lung cancer under the recessive genetic model (Lys/Lys + Lys/Gln vs Gln/Gln) (OR 1.30; 95% CI: 1.113-1.53), whereas for breast cancer a reduced but nonsignificant risk was observed under the same model (OR 0.87; 95% CI: 0.74-1.01). The results for Ala499Val showed a significant overall increase in cancer risk (OR 1.15; 95% CI: 1.02-1.31), and for bladder cancer in both the simple genetic model (Ala/Ala vs Val/Val) (OR 1.30; 95% CI: 1.04-1.61) and the recessive genetic model (Ala/Ala + Ala/Val vs Val/Val) (OR 1.32; 95% CI: 1.06-1.63). Our meta-analysis supports the view that polymorphisms in XPC may represent low-penetrance susceptibility variants for breast, bladder, head and neck, and lung cancer. XPC is a good candidate for large-scale epidemiological case-control studies that may lead to improvements in the management of highly prevalent cancers.