962 results for Data quality problems


Relevance: 80.00%

Publisher:

Abstract:

Nitrogen is an essential nutrient: for humans, animals, and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78%), only a few microorganisms can use it directly. To be useful for higher plants and animals, elemental nitrogen must be converted to a reactive oxidized form. This conversion happens within the nitrogen cycle through free-living microorganisms, symbiotic Rhizobium bacteria, or lightning. Since the beginning of the 20th century, humans have been able to synthesize reactive nitrogen through the Haber-Bosch process, which has noticeably improved the food security of the world population. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity, and negative health effects arise for humans, such as fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycles of pollutants and nutrients.

Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidative processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other sources, during combustion processes by oxidation of atmospheric nitrogen, as well as by biological processes within soil. In the atmosphere NO is converted very quickly into NO2. NO2 is then oxidized to nitrate (NO3-) and nitric acid (HNO3), which binds to aerosol particles. The bound nitrate is finally removed from the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or decomposing tropospheric ozone (O3). In the atmosphere NO, NO2 and O3 are in photostationary equilibrium, which is why they are referred to as the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form NO2, altering the equilibrium of ozone formation.

The essential nutrient nitrogen is taken up by plants mainly as dissolved NO3- entering the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, through nitrogen fixation or ammonium formation and nitrification. Additionally, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast, NO2 is disproportionated to nitrate and nitrite (NO2-), which can enter the plant's metabolic processes. The enzymes nitrate and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture, and leaf resistances. Stomatal regulation is in turn affected by climate factors such as light intensity, temperature, and water vapor pressure deficit.

This thesis aims to contribute to the understanding of the role of vegetation in the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, NO2 exchange between the atmosphere and spruce (Picea abies) was measured at the leaf level with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). Additionally, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measurement system allowed the simultaneous determination of NO, NO2, O3, CO2 and H2O exchange rates.
Calculations of NO, NO2 and O3 fluxes are based on the generally small differences (Δm_i) measured between the inlet and outlet of the chamber. Consequently, high accuracy and specificity of the analyzer are necessary. To achieve these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for lasting measurement precision. Data analysis yielded a significant mcomp,NO2 only if Δm_i itself was statistically significant; the significance of Δm_i was therefore used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad within the dynamic plant chamber's volume must be considered when determining NO, NO2 and O3 exchange rates; otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 could be determined for spruce under laboratory conditions, but under field conditions mcomp,NO2 was identified between 0.17 and 0.65 ppb and vdep,NO2 between 0.07 and 0.42 mm s-1. For the oak field data, no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing indication that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only when high NO soil emissions are assumed can more NO2 be formed by reaction with O3 than the plants are able to take up; under these circumstances forests can be a source of NO2.
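Two relations underlying this analysis can be made explicit. Below is a sketch in standard notation, not taken from the thesis itself: a chamber flux estimate assuming a purging flow rate Q, leaf area A, molar density of air ρ_air, and the measured mixing-ratio difference Δm_i, followed by the photostationary (Leighton) relation for the NO-NO2-O3 triad.

```latex
% Chamber flux of species i per unit leaf area (illustrative symbols):
F_i = \frac{Q\,\rho_{\mathrm{air}}}{A}\,\Delta m_i
% Photostationary state: NO2 photolysis (rate j_{NO2}) balances NO + O3 -> NO2
% (rate constant k), giving the Leighton ratio for the NO-NO2-O3 triad:
\frac{[\mathrm{NO}]\,[\mathrm{O_3}]}{[\mathrm{NO_2}]} = \frac{j_{\mathrm{NO_2}}}{k}
```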

Relevance: 80.00%

Publisher:

Abstract:

At the Mainz Microtron (MAMI), Lambda hypernuclei can be produced in (e,e'K^+) reactions. Detecting the produced kaon in the KAOS spectrometer tags reactions in which a hyperon was created. Spectroscopy of the charged pions originating from two-body weak decays of light hypernuclei allows the binding energy of the hyperon in the nucleus to be determined with high precision. Besides the direct production of hypernuclei, production through fragmentation of a highly excited continuum state is also possible, so that different hypernuclei can be studied in a single experiment. High-resolution magnetic spectrometers are available for the spectroscopy of the decay pions. To calculate the ground-state mass of the hypernuclei from the pion momentum, the hyperfragment must be stopped in the target before it decays. Based on the known cross section of elementary kaon photoproduction, the expected event rate was calculated. A Monte Carlo simulation was developed that includes the fragmentation process and the stopping of the hyperfragments in the target; it uses a statistical break-up model to describe the fragmentation. This approach makes it possible to predict the expected count rate of decay pions for hydrogen-4-Lambda hypernuclei. In a pilot experiment in 2011, the detection of hadrons with the KAOS spectrometer at a scattering angle of 0° was demonstrated for the first time at MAMI, with pions detected in coincidence. It turned out that, owing to the high background rates of positrons in KAOS, an unambiguous identification of hypernuclei was not possible in this configuration. Based on these findings, the KAOS spectrometer was modified to act as a dedicated kaon tagger: a lead absorber was mounted in the spectrometer, in which positrons are stopped through shower formation. The effect of such an absorber was investigated in a beam test. A simulation based on Geant4 was developed, with which the arrangement of absorber and detectors was optimized and which enabled predictions of the impact on data quality. In addition, the simulation was used to generate individual back-tracking matrices for kaons, pions, and protons that include the interaction of the particles with the lead wall and thus allow the effects to be corrected. With the improved setup, a production beam time was carried out in 2012, in which kaons at a 0° scattering angle were successfully detected in coincidence with pions from weak decays. In the momentum spectrum of the decay pions, an excess was observed with a significance corresponding to a p-value of 2.5 x 10^-4. Based on their momentum, these events can be attributed to the decays of hydrogen-4-Lambda hypernuclei, and the number of detected pions is consistent with the calculated yield.
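The determination of the ground-state mass from the decay-pion momentum rests on two-body decay kinematics. For a hypernucleus of mass M decaying at rest into a daughter nucleus of mass m_d and a pion of mass m_π (for hydrogen-4-Lambda, ⁴ΛH → ⁴He + π⁻ with p_π ≈ 133 MeV/c), the standard relation, not specific to this thesis, is:

```latex
% Pion momentum from the two-body decay at rest of a hypernucleus of mass M;
% inverting this relation yields M and hence the hyperon binding energy.
p_\pi = \frac{1}{2M}\sqrt{\left[M^2-(m_d+m_\pi)^2\right]\left[M^2-(m_d-m_\pi)^2\right]}
```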

Relevance: 80.00%

Publisher:

Abstract:

Modern ESI-LC-MS/MS techniques, combined with bottom-up approaches, allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and the ion-mobility variants HDMSE and UDMSE are particularly suitable for label-free protein quantification. Owing to their high complexity, the data acquired in this way place special demands on the analysis software, and quantitative analysis of MSE/HDMSE/UDMSE data has so far remained the preserve of a few commercial solutions.

In this work, a strategy and a set of new methods for the cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The commercial software PLGS is used for the first steps of data analysis (feature detection, peptide and protein identification). The independent PLGS results of all runs of an experiment are then merged in a relational database and reprocessed with dedicated algorithms (retention time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of the intensities of shared peptides, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results.

To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of exactly defined hybrid-proteome samples was developed. The samples were acquired with the MSE and UDMSE methods, analyzed with Progenesis QIP, synapter, and ISOQuant, and compared. In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and trueness of protein quantification.

In conclusion, the presented algorithms and the analysis workflow enable reliable and reproducible quantitative data analyses. With the software ISOQuant, a simple and efficient tool for routine high-throughput analyses of label-free MSE/HDMSE/UDMSE data was developed. With the hybrid-proteome samples and the evaluation metrics, a comprehensive system for evaluating quantitative acquisition and data analysis systems was presented.
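One of the post-processing steps listed above, the redistribution of the intensities of shared peptides, can be illustrated with a minimal sketch. This is not ISOQuant's actual algorithm, only one plausible variant: a shared peptide's intensity is split among its proteins in proportion to each protein's unique-peptide evidence. All names are illustrative.

```python
# Minimal sketch (not ISOQuant's code): split shared-peptide intensity among
# proteins in proportion to their evidence from unique peptides.
from collections import defaultdict

def redistribute_shared_intensities(peptides):
    """peptides: list of dicts with keys 'intensity' and 'proteins' (list of IDs)."""
    # 1) Unique-peptide intensity per protein serves as the evidence base.
    unique = defaultdict(float)
    for pep in peptides:
        if len(pep["proteins"]) == 1:
            unique[pep["proteins"][0]] += pep["intensity"]
    # 2) Shared peptides are apportioned proportionally to that evidence.
    totals = defaultdict(float, unique)
    for pep in peptides:
        if len(pep["proteins"]) > 1:
            base = sum(unique[p] for p in pep["proteins"])
            for p in pep["proteins"]:
                share = unique[p] / base if base > 0 else 1.0 / len(pep["proteins"])
                totals[p] += pep["intensity"] * share
    return dict(totals)

example = [
    {"intensity": 100.0, "proteins": ["P1"]},
    {"intensity": 300.0, "proteins": ["P2"]},
    {"intensity": 200.0, "proteins": ["P1", "P2"]},  # shared peptide
]
print(redistribute_shared_intensities(example))  # {'P1': 150.0, 'P2': 450.0}
```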

Relevance: 80.00%

Publisher:

Abstract:

A global metabolic profiling methodology for human plasma, based on gas chromatography coupled to time-of-flight mass spectrometry (GC-TOFMS), was applied to a human exercise study focused on the effects of beverages containing glucose, galactose, or fructose taken after exercise and throughout a recovery period of 6 h and 45 min. A group of 10 well-trained male cyclists performed 3 experimental sessions on separate days (randomized, single center). After performing a standardized depletion protocol on a bicycle, subjects consumed one of three different beverages: maltodextrin (MD)+glucose (2:1 ratio), MD+galactose (2:1), or MD+fructose (2:1), ingested at an average rate of 1.25 g of carbohydrate (CHO) per minute. Blood was taken immediately after exercise and every 45 min during the recovery phase. From the resulting blood plasma, insulin, free fatty acid (FFA) profile, glucose, and GC-TOFMS global metabolic profiling measurements were performed. The profiling data reproduced the results obtained from the other clinical measurements, with the added ability to follow many different metabolites throughout the recovery period. Data quality was assessed: all labelled internal standards yielded values of <15% CV across all samples (n=335), apart from the labelled sucrose, which gave a value of 15.19%. Differences between recovery treatments, including the appearance of galactonic acid from the galactose-based beverage, were also highlighted.
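The data-quality criterion used here, the coefficient of variation (CV) of each labelled internal standard across all samples, is easy to reproduce. The sketch below assumes peak areas arranged per standard; the standard names and numbers (other than the 15% threshold and n=335 from the abstract) are illustrative.

```python
# CV check for labelled internal standards: flag any standard with CV >= 15%.
import numpy as np

def cv_percent(values):
    """Coefficient of variation in percent: 100 * sample SD / mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

rng = np.random.default_rng(0)
standards = {  # illustrative peak areas across n=335 samples
    "labelled sucrose": rng.normal(1000.0, 152.0, 335),
    "labelled alanine": rng.normal(800.0, 60.0, 335),
}
for name, areas in standards.items():
    cv = cv_percent(areas)
    print(f"{name}: CV = {cv:.2f}%" + ("  <-- above 15% threshold" if cv >= 15 else ""))
```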

Relevance: 80.00%

Publisher:

Abstract:

The UNESCO listing as a World Heritage Site confirms the outstanding qualities of the high-mountain region around the Great Aletsch Glacier. The region of the World Heritage Site now faces the responsibility of making these qualities visible and preserving them for future generations. Consequently, the qualities of the site must not be regarded in isolation but in the context of the entire region with its dynamics and developments. Regional monitoring is the observation and evaluation of temporal changes in target variables. It is thus an obligation towards UNESCO, which demands regular reports about the state of the listed World Heritage assets. It also allows statements about sustainable regional development and can be the basis for early recognition of threats to the outstanding qualities. Monitoring programmes face three major challenges. First, great care must be taken in defining the target qualities to be monitored, or the monitoring will remain vague. Secondly, the selection of ideal indicators to describe these qualities is impeded by inadequate data quality and availability; compromises are inevitable. Thirdly, there is always an element of uncertainty in interpreting the results as to what influences and determines the changes in the target qualities. The first survey of the monitoring programme confirmed the exceptional qualities of the region and also highlighted problematic issues.

Relevance: 80.00%

Publisher:

Abstract:

We investigate changes in extreme European winter (December-February) precipitation back to 1700 and show for various European regions that the return periods of extremely wet and dry winters are subject to significant changes both before and after the onset of anthropogenic influences. Generally, winter precipitation has become more extreme. We also examine the spatial pattern of the changes in the extremes over the last 300 years where data quality is sufficient. Over central and eastern Europe, dry winters occurred more frequently during the 18th and the second part of the 19th century relative to 1951–2000. Dry winters were less frequent during both the 18th and 19th centuries over the British Isles and the Mediterranean. Wet winters were less frequent during the last three centuries compared to 1951–2000, except during the early 18th century in central Europe. Although winter precipitation extremes are affected by climate change, no obvious connection was found between these changes and solar, volcanic, or anthropogenic forcing. However, the changes could be meaningfully interpreted in terms of changes in atmospheric circulation.
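The return periods referred to above follow directly from exceedance probabilities; as a reminder (standard definitions, not specific to this paper):

```latex
% Return period T of a winter whose exceedance probability per winter is p:
T = \frac{1}{p}
% Empirical estimate from n ranked winters (rank r = 1 for the most extreme),
% using the Weibull plotting position:
\hat{T} = \frac{n+1}{r}
```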

Relevance: 80.00%

Publisher:

Abstract:

Creating Lakes from Open Pit Mines: Processes and Considerations, with Emphasis on Northern Environments. This document summarizes the literature on mining pit lakes (through 2007), with a particular focus on issues likely to be of special relevance to the creation and management of pit lakes in northern climates. Pit lakes are simply waterbodies formed when the open pit left at the completion of mining operations fills with water. Separate sections deal with different aspects of pit lakes, including their morphometry, geology, hydrogeology, geochemistry, and biology; like natural lakes, pit lakes display huge diversity in each of these respects. However, pit lakes are young and therefore typically in a non-equilibrium state with respect to their rate of filling, water quality, and biology. Depending on the type and location of the mine, there may be opportunities to enhance the recreational or ecological benefits of a given pit lake, for example by re-landscaping and re-vegetating the shoreline, adding engineered habitat for aquatic life, and maintaining water quality. The creation of a pit lake may be a regulatory requirement to mitigate environmental impacts from mining operations, and/or be included as part of a closure and reclamation plan. Based on published case studies of pit lakes, large-scale bio-engineering projects have had mixed success. A common consensus is that manipulating pit lake chemistry is difficult, expensive, and takes many years to achieve remediation goals. For this reason, it is prudent to take steps throughout mine operation to reduce the likelihood of future water quality problems upon closure. It also makes sense to engineer the lake so that it achieves its maximal end-use potential, whether that be permanent and safe storage of mine waste, habitat for aquatic life, recreation, or water supply.

Relevance: 80.00%

Publisher:

Abstract:

Much of the research in the field of participatory modeling (PM) has focused on the developed world. Few cases are focused on developing regions, and even fewer on Latin American developing countries. The work that has been done in Latin America has often involved water management, often specifically involving water users, and has not focused on the decision-making stage of the policy cycle. Little work has been done to measure the effect PM may have on the perceptions and beliefs of decision makers. In fact, throughout the field of PM, very few attempts have been made to quantitatively measure changes in participant beliefs and perceptions following participation, and of the very few exceptions, none has attempted to measure the long-term change. This research fills that gap. As part of a participatory modeling project in Sonora, Mexico, a region with water quantity and quality problems, I measured the change in participants' beliefs about water models: their ability to use and understand them, and the models' usefulness and accuracy. I also measured changes in beliefs about climate change and about water quantity problems, specifically their causes, solutions, and impacts, and assessed participant satisfaction with the process and outputs of the participatory modeling workshops. Participants were from water agencies, academic institutions, NGOs, and independent consulting firms. Results indicated that participants' comfort and self-efficacy with water models, their beliefs in the usefulness of water models, and their beliefs about the impact of water quantity problems changed significantly as a result of the workshops. I present my findings and discuss the results.
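The abstract does not state which statistical test was used to establish significance; a minimal sketch of one common choice for pre/post designs of this kind, a paired t-test on Likert-type ratings, is shown below with purely illustrative data.

```python
# Paired t-test on pre- vs. post-workshop belief ratings (illustrative data;
# the study's actual test and scores are not reproduced here).
from scipy import stats

pre  = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]   # e.g. 1-5 rating of model usefulness
post = [4, 4, 3, 4, 5, 4, 3, 4, 4, 3]

t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")  # small p indicates a significant change
```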

Relevance: 80.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale, and has also proven relevant for assessing other natural hazards such as rockfall, snow avalanches, and floods. The model allows for automatic source area delineation according to user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is left open to the user, which makes the model suitable for various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results; however, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
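The spreading step builds on Holmgren's (1994) multiple-flow-direction algorithm, in which outflow from a cell is apportioned among its downslope neighbours in proportion to tan(β)^x. The sketch below implements the unmodified Holmgren weighting, not Flow-R's improved variant; the exponent value and DEM are illustrative.

```python
# Holmgren (1994) multiple-flow-direction weighting (basis of Flow-R's
# spreading; Flow-R's modifications are not reproduced here).
import numpy as np

def holmgren_weights(z, cell, cellsize=10.0, x=4.0):
    """z: 2-D DEM; cell: (row, col). Returns {neighbour: flow fraction}."""
    r, c = cell
    w = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if not (0 <= rr < z.shape[0] and 0 <= cc < z.shape[1]):
                continue
            dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
            tan_beta = (z[r, c] - z[rr, cc]) / dist
            if tan_beta > 0:            # only downslope neighbours receive flow
                w[(rr, cc)] = tan_beta ** x
    total = sum(w.values())
    return {k: v / total for k, v in w.items()} if total else {}

dem = np.array([[10.0, 9.0, 8.0],
                [ 9.5, 9.0, 7.5],
                [ 9.0, 8.5, 7.0]])
print(holmgren_weights(dem, (1, 1)))   # fractions sum to 1 over downslope cells
```

A larger exponent x concentrates the flow into the steepest direction (approaching single-direction D8 flow), while x = 1 gives classical multiple-flow-direction spreading.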

Relevance: 80.00%

Publisher:

Abstract:

Growth codes are a subclass of Rateless codes that have found interesting applications in data dissemination problems. Compared to other Rateless and conventional channel codes, Growth codes show improved intermediate performance, which is particularly useful in applications where partial data has some utility. In this paper, we investigate the asymptotic performance of Growth codes using the Wormald method, which was originally proposed for studying the Peeling Decoder of LDPC and LDGM codes. In contrast to previous works, the Wormald differential equations are formulated from the nodes' perspective, which enables a numerical solution for the expected asymptotic decoding performance of Growth codes. Our framework is appropriate for any class of Rateless codes that does not include a precoding step. We further study the performance of Growth codes with moderate and large codeblocks through simulations, and we use the generalized logistic function to model the decoding probability. We then exploit the decoding probability model in an illustrative application of Growth codes to error-resilient video transmission. The video transmission problem is cast as a joint source and channel rate allocation problem that is shown to be convex with respect to the channel rate. This illustrative application highlights the main advantage of Growth codes, namely improved performance in the intermediate loss region.
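The decoding probability model mentioned above is based on the generalized logistic function; one common parameterization (the paper's exact parameter choices are not given here) is:

```latex
% Generalized logistic model of the decoding probability P(x), with x the
% fraction of received symbols: A and K are the lower and upper asymptotes,
% B the growth rate, M the inflection location, and nu a shape parameter.
P(x) = A + \frac{K - A}{\bigl(1 + e^{-B\,(x - M)}\bigr)^{1/\nu}}
```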

Relevance: 80.00%

Publisher:

Abstract:

A rain-on-snow flood occurred in the Bernese Alps, Switzerland, on 10 October 2011, and caused significant damage. As the flood peak was not predicted by the flood forecast system, questions were raised concerning the causes and the predictability of the event. Here, we aimed to reconstruct the anatomy of this rain-on-snow flood in the Lötschen Valley (160 km2) by analyzing meteorological data from the synoptic to the local scale and by reproducing the flood peak with the hydrological model WaSiM-ETH (Water Flow and Balance Simulation Model), in order to gain process understanding and to evaluate predictability. The atmospheric drivers of this rain-on-snow flood were (i) sustained snowfall followed by (ii) the passage of an atmospheric river bringing warm and moist air towards the Alps. As a result, intensive rainfall (an average of 100 mm day-1) was accompanied by a temperature increase that shifted the 0° line from 1500 to 3200 m a.s.l. (meters above sea level) within 24 h, with a maximum increase of 9 K in 9 h. The south-facing slope of the valley received significantly more precipitation than the north-facing slope, leading to flooding only in tributaries along the south-facing slope. We hypothesized that the reason for this very local rainfall distribution was a cavity circulation combined with a seeder-feeder cloud system enhancing local rainfall and snowmelt along the south-facing slope. By applying and considerably recalibrating the standard hydrological model setup, we showed that both latent and sensible heat fluxes were needed to reconstruct the snow cover dynamics, and that locally high precipitation sums (160 mm in 12 h) were required to produce the estimated flood peak. However, to reproduce the rapid runoff responses during the event, we conceptually represented likely lateral flow dynamics within the snow cover, causing the model to react "oversensitively" to meltwater. Driving the optimized model with COSMO (Consortium for Small-scale Modeling)-2 forecast data, we still failed to simulate the flood, because the COSMO-2 forecasts underestimated both the local precipitation peak and the temperature increase. We thus conclude that this rain-on-snow flood was, in principle, predictable, but required a special hydrological model setup and extensive and locally precise meteorological input data. Although this data quality may not be achievable with forecast data, an additional model with a specific rain-on-snow configuration can provide useful information when rain-on-snow events are likely to occur.
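The finding that both latent and sensible heat fluxes were needed points to an energy-balance treatment of the snowpack. In generic, simplified form (not WaSiM-ETH's exact formulation):

```latex
% Energy available for melt: net shortwave and longwave radiation, sensible
% (Q_H) and latent (Q_E) turbulent fluxes, and heat advected by rain (Q_R):
Q_M = Q_{SW} + Q_{LW} + Q_H + Q_E + Q_R
% Melt rate, with water density rho_w and latent heat of fusion L_f:
M = \frac{Q_M}{\rho_w\,L_f}
```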

Relevance: 80.00%

Publisher:

Abstract:

There are large variations in the incidence, registration methods, and reported causes of sudden cardiac arrest/sudden cardiac death (SCA/SCD) in competitive and recreational athletes. A crucial question is to what degree these variations are genuine and to what degree they stem from methodological incongruities. This paper discusses the uncertainties in the available data and provides comprehensive suggestions for standard definitions and a guide for uniform registration parameters of SCA/SCD. The parameters include a definition of what constitutes an 'athlete', incidence calculations, enrolment of cases, the importance of gender, ethnicity and age of the athlete, as well as the type and level of sporting activity. Precise instructions for autopsy practice in the case of an SCD of an athlete are given, including the role of molecular samples and the evaluation of possible doping. Rational decisions about cardiac preparticipation screening and cardiac safety at sport facilities require improved data quality concerning the incidence, aetiology, and management of SCA/SCD in sports. Uniform standard registration of SCA/SCD in athletes and recreational sportsmen would be a first step towards this goal.
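The incidence calculations called for above reduce to events per person-time at risk; in the standard form (a generic epidemiological definition, not a formula from this paper):

```latex
% Incidence of SCA/SCD, conventionally scaled to 100,000 athlete-years:
I = \frac{\text{number of SCA/SCD cases}}{\text{person-years at risk}} \times 100{,}000
```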

Relevance: 80.00%

Publisher:

Abstract:

The current study investigated data quality and estimated cancer incidence and mortality rates using data provided by the Pavlodar, Semipalatinsk, and Ust-Kamenogorsk Regional Cancer Registries of Kazakhstan for the period 1996–1998. Data quality was assessed using standard quality indicators, including internal database checks, the proportion of cases verified from death certificates only, the mortality:incidence ratio, data patterns, the proportion of cases with unknown primary site, and the proportion of cases with unknown age. Crude and age-adjusted incidence and mortality rates and 95% confidence intervals were calculated, by gender, for all cancers combined and for 28 specific cancer sites for each year of the study period. The five most frequent cancers were identified and described for every population. The results of the study provide the first simultaneous assessment of data quality and standardized incidence and mortality rates for Kazakh cancer registries.
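The age-adjusted rates reported here are conventionally obtained by direct standardization; the formulas below are the standard ones (the registries' choice of standard population is not stated in the abstract).

```latex
% Direct age standardization: age-specific rates r_i = c_i / n_i (cases over
% person-years in age group i), weighted by standard-population weights w_i:
\mathrm{ASR} = \sum_i w_i\, r_i, \qquad \sum_i w_i = 1
% Approximate 95% confidence interval, assuming Poisson-distributed counts:
\mathrm{ASR} \pm 1.96\,\sqrt{\sum_i w_i^2\,\frac{c_i}{n_i^2}}
```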

Relevance: 80.00%

Publisher:

Abstract:

Racial disparities in prostate cancer are of public health concern. This dissertation used Texas Cancer Registry data to examine racial disparities in prostate cancer incidence in Texas over the period 1995–1998 and subsequent mortality through the year 2001. Incidence, mortality, treatment, and risk factors for survival were examined. It was found that non-Hispanic blacks have higher incidence of and mortality from prostate cancer than non-Hispanic whites, and that Hispanics and non-Hispanic Asians are roughly similar to non-Hispanic whites in cancer survival. The incidence rates in non-Hispanic whites were spread more evenly across the age spectrum than in other racial and ethnic groups. Non-Hispanic blacks were more often diagnosed at a higher stage of disease. All racial and ethnic groups in the Registry had lower death rates from non-prostate-cancer causes than non-Hispanic whites. Age, stage, and grade all conferred about the same relative risks of all-cause and prostate cancer survival within each racial and ethnic group examined. For non-Hispanic blacks and Hispanics, radiation treatment did not confer a relative risk of survival statistically significantly different from surgery, whereas it conferred greater survival in non-Hispanic whites. However, non-Hispanic blacks were statistically significantly less likely to have received radiation treatment, controlling for age, stage, and grade. Among only those who died of prostate cancer, non-Hispanic blacks were less likely than non-Hispanic whites to have received radiation, whereas among those who had not died, non-Hispanic blacks were more likely to have received this treatment. Hispanics were less likely to have received radiation whether they died from prostate cancer or not. All racial and ethnic groups were less likely than non-Hispanic whites to have received surgery, and non-Hispanic blacks and Hispanics were more likely than non-Hispanic whites to have received hormonal treatment. The findings are interpreted with caution with regard to the limitations of data quality and missing information. Results are discussed in the context of previous work, and public health implications are considered. This study confirms some earlier findings, identifies treatment as one possible source of disparity in prostate cancer mortality, and contributes to understanding the epidemiology of prostate cancer in Hispanics.
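The abstract does not name its survival model; relative risks controlling for age, stage, and grade are commonly estimated with a Cox proportional hazards model, sketched below with a tiny, purely illustrative dataset and column names.

```python
# Sketch only: Cox proportional hazards fit of the kind used to estimate
# relative risks of survival controlling for age, stage, and grade.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({                                # illustrative records, not registry data
    "months": [12, 48, 30, 60, 24, 72, 18, 54],    # follow-up time
    "died":   [ 1,  0,  1,  0,  1,  0,  1,  0],    # event indicator
    "age":    [55, 62, 70, 58, 66, 61, 59, 64],
    "stage":  [ 2,  1,  3,  2,  1,  1,  3,  2],
    "grade":  [ 2,  1,  3,  2,  3,  1,  2,  2],
})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()   # hazard ratios play the role of the relative risks above
```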

Relevance: 80.00%

Publisher:

Abstract:

Most studies of p53 function have focused on genes transactivated by p53. It is less widely appreciated that p53 can repress target genes to affect a particular cellular response. There is evidence that repression is important for p53-induced apoptosis and cell cycle arrest; it is less clear whether repression is important for other p53 functions, and a comprehensive knowledge of the genes repressed by p53 and the cellular processes they affect is currently lacking. We used an expression profiling strategy to identify p53-responsive genes following adenoviral p53 gene transfer (Ad-p53) in PC3 prostate cancer cells. A total of 111 genes represented on the Affymetrix U133A microarray were repressed more than two-fold (p ≤ 0.05) by p53. An objective assessment of array data quality was carried out using RT-PCR of 20 randomly selected genes; we estimate a confirmation rate of >95.5% for the complete data set. Functional over-representation analysis was used to identify cellular processes potentially affected by p53-mediated repression. Cell cycle regulatory genes exhibited significant enrichment (p ≤ 5E-28) among the repressed targets. Several of these genes are repressed in a p53-dependent manner following DNA damage, but preceding cell cycle arrest. These findings identify novel p53-repressed targets and indicate that p53-induced cell cycle arrest is a function not only of the transactivation of cell cycle inhibitors (e.g., p21), but also of the repression of targets that act at each phase of the cell cycle. The mechanism of repression of this set of p53 targets was investigated. Most of the repressed genes identified here do not harbor consensus p53 DNA binding sites but do contain binding sites for E2F transcription factors, and we demonstrate a role for E2F/RB repressor complexes in our system. Importantly, p53 is found at the promoter of CDC25A. CDC25A protein is rapidly degraded in response to DNA damage; our group has demonstrated for the first time that CDC25A is also repressed at the transcript level by p53. This work has important implications for understanding the DNA damage cell cycle checkpoint response and the link between E2F/RB complexes and p53 in the repression of target genes.
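Functional over-representation analysis of the kind described reduces to a hypergeometric (one-sided Fisher) test per functional category. In the sketch below, only the 111 repressed genes come from the abstract; the other counts are illustrative.

```python
# Over-representation of a category (e.g. cell cycle regulation) among the
# repressed genes, relative to the array background.
from scipy.stats import hypergeom

N_total = 13000   # genes on the array (illustrative background size)
K_cat   = 650     # background genes annotated to the category (illustrative)
n_hits  = 111     # repressed genes (from the study)
k_cat   = 38      # repressed genes in the category (illustrative)

# P(X >= k) when drawing n_hits genes without replacement from N_total,
# of which K_cat belong to the category:
p = hypergeom.sf(k_cat - 1, N_total, K_cat, n_hits)
print(f"enrichment p = {p:.2e}")
```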