953 results for Prediction Models for Air Pollution


Relevance:

100.00%

Publisher:

Abstract:

Although the area under the receiver operating characteristic curve (AUC) is the most popular measure of the performance of prediction models, it has limitations, especially when it is used to evaluate the added discrimination of a new biomarker in the model. Pencina et al. (2008) proposed two indices, the net reclassification improvement (NRI) and the integrated discrimination improvement (IDI), to supplement the improvement in the AUC (IAUC). Their NRI and IDI are based on binary outcomes in case-control settings and do not involve time-to-event outcomes. However, many disease outcomes are time-dependent, and the onset time can be censored. Measuring the discrimination potential of a prognostic marker without considering time to event can lead to biased estimates. In this dissertation, we have extended the NRI and IDI to survival analysis settings and derived the corresponding sample estimators and asymptotic tests. Simulation studies were conducted to compare the performance of the time-dependent NRI and IDI with Pencina's NRI and IDI. For illustration, we have applied the proposed method to a breast cancer study. Key words: Prognostic model, Discrimination, Time-dependent NRI and IDI
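
For reference, the binary-outcome NRI that the dissertation extends is the sum of the net proportion of cases reclassified upward and the net proportion of non-cases reclassified downward. A minimal sketch of that Pencina-style calculation is shown below; the function name and the toy risk categories are illustrative assumptions, not code or data from the study.

```python
import numpy as np

def net_reclassification_improvement(risk_old, risk_new, events):
    """Pencina-style categorical NRI for a binary outcome.

    risk_old, risk_new : predicted risk categories (integers) from the
                         models without and with the new biomarker.
    events             : 1 for cases, 0 for non-cases.
    """
    risk_old, risk_new, events = map(np.asarray, (risk_old, risk_new, events))
    up, down = risk_new > risk_old, risk_new < risk_old

    cases, controls = events == 1, events == 0
    nri_events = up[cases].mean() - down[cases].mean()           # net upward movement in cases
    nri_nonevents = down[controls].mean() - up[controls].mean()  # net downward movement in controls
    return nri_events + nri_nonevents

# Toy example: three risk categories (0, 1, 2) for six subjects.
old = [0, 1, 1, 2, 0, 1]
new = [1, 2, 1, 1, 0, 0]
y   = [1, 1, 1, 0, 0, 0]
print(net_reclassification_improvement(old, new, y))
```

The time-dependent versions developed in the dissertation replace these case/non-case proportions with survival-based counterparts that account for censoring, which this sketch does not attempt.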

Relevance:

100.00%

Publisher:

Abstract:

The federal regulatory regime for addressing airborne toxic pollutants functions fairly well in most of the country. However, it has proved deficient in addressing local risk issues, especially in urban areas with densely concentrated sources. The problem is especially pronounced in Houston, which is home to one of the world's largest petrochemical complexes and a major port, both located near a large metropolitan center. Although local government's role in regulating air toxics is typically quite limited, from 2004 to 2009 the City of Houston implemented a novel municipality-based air toxics reduction strategy, with initiatives ranging from voluntary agreements to litigation and legislation. This case study considers why the city chose the policy tools it did, how the tools performed relative to their designers' intentions, and how the debate among actors with conflicting values and goals shaped the policy landscape. The city's unconventional approach to controlling hazardous air pollution has not yet been examined rigorously. The case study was developed through reviews of publicly available documents and quasi-public documents obtained through public record requests, as well as interviews with key informants. The informants represented a range of experience and perspectives, including current and former public officials at the city (including Mayor White), former Texas Commission on Environmental Quality staff, faculty at local universities, industry representatives, and environmental public health advocates. Some of the city's tools met their designers' intent, while others were less successful. Ultimately, even the tools that did not achieve their stated purpose succeeded in bringing attention and resources to the air quality issue. Through a series of pleas and prods, the city managed to draw attention to the problem locally and to get reluctant policymakers at higher levels of government to respond. This work demonstrates the potential for local government to overcome limitations in the federal regulatory regime for air toxics control, shifting the balance of local, state, and federal initiative. It also highlights the importance of flexible, cooperative strategies in local environmental protection.

Relevance:

100.00%

Publisher:

Abstract:

Exposure to air pollutants in urban locales has been associated in epidemiological studies with increased risk for chronic diseases, including cardiovascular disease (CVD) and pulmonary diseases. The exact mechanism by which air pollution affects chronic disease is still unknown, although oxidative stress and inflammatory pathways have been posited as likely mechanisms. Data from the Multi-Ethnic Study of Atherosclerosis (MESA) and the Mexican-American Cohort Study (2003-2009) were used to examine the following aims, respectively: 1) to evaluate the association between long-term exposure to ambient particulate matter (PM10 and PM2.5) and nitrogen oxides (NOx) and telomere length (TL) among approximately 1,000 MESA participants; and 2) to evaluate the association between traffic-related air pollution and self-reported asthma, diabetes, and hypertension among Mexican-Americans in Houston, Texas. Our results from MESA were inconsistent regarding associations between long-term exposure to air pollution and shorter telomere length, depending on whether the participants came from New York (NY) or Los Angeles (LA). Although not statistically significant, we observed a negative association between long-term air pollution exposure and mean telomere length for NY participants, consistent with our hypothesis; positive, but also not statistically significant, associations were observed for LA participants. It is possible that our findings were influenced more by outcome and exposure misclassification than by the absence of a relationship between pollution and TL. Future studies are needed that include longitudinal measures of telomere length and that focus on the effects of specific constituents of PM and other pollutant exposures on changes in telomere length over time. This research also provides support that Mexican-American adults who live near a major roadway or in close proximity to a dense street network have a higher prevalence of asthma. There was a non-significant trend towards an increased prevalence of adult asthma with increasing residential traffic exposure, especially for residents who had lived three or more years at their baseline address. Although the prevalence of asthma is low in the Mexican-origin population, this group is the fastest-growing minority population in the U.S., and we would expect a growing number of Mexican-Americans to suffer from asthma in the future. Future studies are needed to better characterize the risks for asthma associated with air pollution in this population.

Relevance:

100.00%

Publisher:

Abstract:

There is scant evidence regarding the associations between ambient levels of combustion pollutants and small for gestational age (SGA) infants, and no studies of this type have been completed in the Southern United States. The main objective of this project was to determine associations between combustion pollutants and SGA infants in Texas using three different exposure assessments. Birth certificate data containing information on maternal and infant characteristics were obtained from the Texas Department of State Health Services (TX DSHS). Exposure assessment data for the three aims came from: (1) the U.S. Environmental Protection Agency (EPA) National Air Toxics Assessment (NATA), (2) the U.S. EPA Air Quality System (AQS), and (3) the Texas Department of Transportation (DOT), respectively. Multiple logistic regression models were used to determine the associations between combustion pollutants and SGA. The first study examined annual estimates of four air toxics at the census tract level in the Greater Houston Area. After controlling for maternal race, maternal education, tobacco use, maternal age, number of prenatal visits, marital status, maternal weight gain, and median census tract income level, adjusted odds ratios (AORs) and 95% confidence intervals (CIs) for exposure to PAHs (per 10 ng/m3), naphthalene (per 10 ng/m3), benzene (per 1 µg/m3), and diesel engine emissions (per 10 µg/m3) were 1.01 (0.97–1.05), 1.00 (0.99–1.01), 1.01 (0.97–1.05), and 1.08 (0.95–1.23), respectively. In the second study, among Hispanics in El Paso County, AORs and 95% CIs for increases of 5 ng/m3 in the sum of carcinogenic PAHs (Σ c-PAHs), 1 ng/m3 in benzo[a]pyrene, and 100 ng/m3 in naphthalene during the third trimester of pregnancy were 1.02 (0.97–1.07), 1.03 (0.96–1.11), and 1.01 (0.97–1.06), respectively. In the third study, which used maternal proximity to major roadways as the exposure metric, there was a negative association with increasing distance from the maternal residence to the nearest major roadway (odds ratio (OR) = 0.96; 95% CI = 0.94–0.97 per 1,000 m); however, once adjusted for covariates this effect was no longer significant (AOR = 0.98; 95% CI = 0.96–1.00). There was no association with distance-weighted traffic density (DWTD). This project is the first to examine SGA and combustion pollutants in the Southern United States with three different exposure metrics. Although no evidence of associations between SGA and these air pollutants was found, the results contribute to the body of literature assessing maternal exposure to ambient air pollution and adverse birth outcomes.
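
To make the reported effect sizes concrete, the sketch below shows one common way an adjusted odds ratio per fixed increment of exposure (e.g. per 10 ng/m3) is obtained from a multiple logistic regression by rescaling the fitted coefficient. The variable names, covariates, and synthetic data are assumptions for illustration only, not the study's actual models or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per birth record (values are random).
df = pd.DataFrame({
    "sga":             np.random.binomial(1, 0.1, 5000),   # 1 = small for gestational age
    "pah_ngm3":        np.random.gamma(2.0, 5.0, 5000),    # annual PAH estimate (ng/m3)
    "maternal_age":    np.random.normal(28, 6, 5000),
    "prenatal_visits": np.random.poisson(10, 5000),
})

# Logistic regression of SGA on exposure, adjusted for covariates.
model = smf.logit("sga ~ pah_ngm3 + maternal_age + prenatal_visits", data=df).fit()

# Adjusted OR per 10 ng/m3 increase in PAHs = exp(10 * beta).
beta = model.params["pah_ngm3"]
lo, hi = model.conf_int().loc["pah_ngm3"]
print("AOR per 10 ng/m3: %.2f (95%% CI %.2f-%.2f)"
      % (np.exp(10 * beta), np.exp(10 * lo), np.exp(10 * hi)))
```

Because the synthetic exposure is unrelated to the synthetic outcome, the printed AOR will hover around 1.0; the point of the sketch is only the rescaling of the coefficient to the increment reported in the abstract.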

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The zebrafish is a clinically relevant model of heart regeneration. Unlike mammals, it has a remarkable capacity for heart repair after injury, which promises novel translational applications. Amputation and cryoinjury models are key research tools for understanding injury response and regeneration in vivo. An understanding of the transcriptional responses following injury is needed to identify key players of heart tissue repair, as well as potential targets for boosting this property in humans. RESULTS: We investigated amputation and cryoinjury in vivo models of heart damage in the zebrafish through unbiased, integrative analyses of independent molecular datasets. To detect genes with potential biological roles, we derived computational prediction models from microarray data from heart amputation experiments. We focused on a top-ranked set of genes highly activated in the early post-injury stage, whose activity was further verified in independent microarray datasets. Next, we performed independent validations of the expression responses with qPCR in a cryoinjury model. Across the in vivo models, the top candidates showed highly concordant responses at 1 and 3 days post-injury, which highlights the predictive power of our analysis strategies and the possible biological relevance of these genes. The top candidates are significantly involved in cell fate specification and differentiation, and include heart failure markers such as periostin, as well as potential new targets for heart regeneration. For example, ptgis and ca2 were overexpressed, while usp2a, a regulator of the p53 pathway, was down-regulated in our in vivo models. Interestingly, high activity of ptgis and ca2 has previously been observed in failing hearts from rats and humans. CONCLUSIONS: We identified genes with potentially critical roles in the response to cardiac damage in the zebrafish. Their transcriptional activities are reproducible in different in vivo models of cardiac injury.
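
The abstract does not give the exact ranking procedure, but the general workflow it describes, ranking genes by early post-injury activation and checking concordance across independent in vivo datasets, can be sketched as below. All gene names, column names, and expression values are fabricated placeholders, not data from the study.

```python
import pandas as pd

def rank_early_responders(expr, baseline_col, injury_col, top_n=2):
    """Rank genes by log2 fold-change at an early post-injury time point.

    expr : DataFrame indexed by gene, one column of mean log2 expression
           per condition (values here are fabricated placeholders).
    """
    lfc = expr[injury_col] - expr[baseline_col]
    return lfc.sort_values(ascending=False).head(top_n)

# Hypothetical per-condition summaries from two independent injury models.
amputation = pd.DataFrame({"uninjured": [5.0, 7.2, 6.1], "dpi1": [8.3, 7.0, 9.4]},
                          index=["geneA", "geneB", "geneC"])
cryoinjury = pd.DataFrame({"uninjured": [4.8, 7.1, 6.0], "dpi1": [7.9, 6.9, 9.1]},
                          index=["geneA", "geneB", "geneC"])

top_amp = rank_early_responders(amputation, "uninjured", "dpi1")
top_cryo = rank_early_responders(cryoinjury, "uninjured", "dpi1")

# Candidates whose early activation is concordant across both injury models.
print(sorted(top_amp.index.intersection(top_cryo.index)))
```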

Relevance:

100.00%

Publisher:

Abstract:

Complexity has always been one of the most important issues in distributed computing. From the first clusters to grid and now cloud computing, dealing correctly and efficiently with system complexity is the key to taking the technology a step further. In this sense, global behavior modeling is an innovative methodology aimed at understanding grid behavior. The main objective of this methodology is to synthesize the grid's vast, heterogeneous nature into a simple but powerful behavior model, represented in the form of a single, abstract entity with a global state. Global behavior modeling has proved very useful in effectively managing grid complexity but, in many cases, deeper knowledge is needed: it generates a descriptive model that could be greatly improved if extended not only to explain behavior but also to predict it. In this paper we present a prediction methodology whose objective is to define the techniques needed to create global behavior prediction models for grid systems. Such global behavior prediction can benefit grid management, especially in areas such as fault tolerance and job scheduling. The paper presents experimental results obtained in real scenarios in order to validate the approach.
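
The paper's own prediction techniques are not detailed in this abstract. Purely to illustrate the idea of forecasting a single global-state signal for a grid, a minimal autoregressive sketch might look like the following; the metric, lag, and synthetic trace are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_global_state_predictor(history, lag=3):
    """Fit an autoregressive predictor for a scalar global-state metric.

    history : 1-D array of past observations of the grid's global state
              (e.g. aggregate load), sampled at regular intervals.
    """
    X = np.array([history[i:i + lag] for i in range(len(history) - lag)])
    y = history[lag:]
    return LinearRegression().fit(X, y)

# Hypothetical global-load trace sampled at a fixed interval.
trace = np.sin(np.linspace(0, 12, 200)) + np.random.normal(0, 0.05, 200)

model = fit_global_state_predictor(trace, lag=3)
next_value = model.predict(trace[-3:].reshape(1, -1))
print("predicted next global state:", next_value[0])
```

In such a setup the prediction feeds management decisions (e.g. rescheduling jobs before a predicted overload), which is the kind of use the abstract mentions for fault tolerance and job scheduling.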

Relevance:

100.00%

Publisher:

Abstract:

Most empirical disciplines promote the reuse and sharing of datasets, as it leads to greater possibility of replication. While this is increasingly the case in Empirical Software Engineering, some of the most popular bug-fix datasets are now known to be biased. This raises two significant concerns: first, that sample bias may lead to underperforming prediction models, and second, that the external validity of studies based on biased datasets may be suspect. This issue has raised considerable consternation in the ESE literature in recent years. However, there is a confounding factor in these datasets that has not been examined carefully: size. Biased datasets sample only some of the data that could be sampled, and do so in a biased fashion; but biased samples can be smaller or larger. Smaller datasets in general provide a less reliable basis for estimating models and thus could lead to inferior model performance. In this setting, we ask: what affects performance more, bias or size? We conduct a detailed, large-scale meta-analysis using simulated datasets sampled with bias from a high-quality dataset that is relatively free of bias. Our results suggest that size always matters just as much as bias direction, and in fact much more than bias direction when considering information-retrieval measures such as AUC and F-score. This indicates that, at least for prediction models, even when dealing with sampling bias, simply finding larger samples can sometimes be sufficient. Our analysis also exposes the complexity of the bias issue and raises further issues to be explored in the future.
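
The original meta-analysis and its defect dataset are not reproduced here; the toy sketch below only illustrates the kind of comparison described, drawing biased versus unbiased training samples of varying size from a synthetic pool and scoring AUC on held-out data. The dataset, bias mechanism, and classifier are all assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for a high-quality, relatively unbiased defect dataset.
X, y = make_classification(n_samples=20000, n_features=20, weights=[0.8, 0.2], random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

def sample_and_score(size, bias_strength=0.0):
    """Draw a (possibly biased) training sample and report test AUC.

    bias_strength skews the sampling probability toward one feature,
    mimicking a bug-fix dataset that over-represents certain commits.
    """
    weights = np.exp(bias_strength * X_pool[:, 0])
    weights /= weights.sum()
    idx = rng.choice(len(X_pool), size=size, replace=False, p=weights)
    clf = LogisticRegression(max_iter=1000).fit(X_pool[idx], y_pool[idx])
    return roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])

for size in (200, 1000, 5000):
    print(size,
          "unbiased AUC %.3f" % sample_and_score(size, 0.0),
          "biased AUC %.3f" % sample_and_score(size, 2.0))
```

Varying size and bias strength independently, as in this loop, is the basic manipulation needed to ask which of the two factors moves the performance measures more.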

Relevance:

100.00%

Publisher:

Abstract:

Maximizing energy autonomy is a consistent challenge when deploying mobile robots in ionizing radiation or other hazardous environments. Having a reliable robot system is essential for the successful execution of missions and to avoid manual recovery of robots in environments that are harmful to human beings. For deployment of robot missions at short notice, the ability to know beforehand the energy required for performing the task is essential. This paper presents an online method for predicting energy requirements based on pre-determined power models for a mobile robot. A small mobile robot, the Khepera III, is used for the experimental study, and the results are promising, with high prediction accuracy. Applications of the energy prediction models in energy optimization and simulation are also discussed, along with examples of significant energy savings.
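
The paper's pre-determined power models for the Khepera III are not given in this abstract. As a hedged illustration of the general idea, predicted mission energy can be obtained by integrating a fitted power model over the planned motion segments; the affine power model and its constants below are assumptions, not the authors' values.

```python
def predict_mission_energy(segments, p_idle=2.5, p_sensing=0.8, k_motion=6.0):
    """Predict mission energy (joules) from a pre-determined power model.

    segments : list of (distance_m, velocity_mps) tuples describing the path.
    The power model assumed here is a simple affine one,
        P_total(v) = p_idle + p_sensing + k_motion * v,
    with constants (watts, watts, watts per m/s) fitted offline for the robot.
    """
    energy = 0.0
    for distance, velocity in segments:
        duration = distance / velocity                     # seconds
        power = p_idle + p_sensing + k_motion * velocity   # watts
        energy += power * duration                         # joules
    return energy

# Hypothetical mission: two straight segments at different speeds.
mission = [(12.0, 0.3), (8.0, 0.5)]
print("predicted energy: %.1f J" % predict_mission_energy(mission))
```

With such a model, the same loop can be run over alternative plans (e.g. slower traversal of long segments) to compare their predicted energy before committing to a mission.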

Relevance:

100.00%

Publisher:

Abstract:

The effect of desert dust on cloud properties and precipitation has so far been studied solely with theoretical models, which predict that rainfall would be enhanced. Here we present observations showing the contrary: the effect of dust on cloud properties is to inhibit precipitation. Using satellite and aircraft observations, we show that clouds forming within desert dust contain small droplets and produce little precipitation by drop coalescence. Measurement of the size distribution and chemical analysis of individual Saharan dust particles collected in such a dust storm suggest a possible mechanism for the diminished rainfall. The detrimental impact of dust on rainfall is smaller than that caused by smoke from biomass burning or anthropogenic air pollution, but the large abundance of desert dust in the atmosphere renders it important. The reduction of precipitation from clouds affected by desert dust can cause drier soil, which in turn raises more dust, thus providing a possible feedback loop that further decreases precipitation. Furthermore, anthropogenic changes of land use that expose the topsoil can initiate such a desertification feedback process.

Relevance:

100.00%

Publisher:

Abstract:

Recent advances in biologically based ecosystem models of the coupled terrestrial, hydrological, carbon, and nutrient cycles have provided new perspectives on the terrestrial biosphere's behavior globally, over a range of time scales. We used the terrestrial ecosystem model Century to examine relationships between carbon, nitrogen, and water dynamics. The model, run to a quasi-steady state, shows strong correlations between carbon, water, and nitrogen fluxes that lead to an equilibration of water/energy and nitrogen limitation of net primary productivity. This occurs because, as the water flux increases, the potential for carbon uptake (photosynthesis) and the inputs and losses of nitrogen all increase. As the flux of carbon increases, the amount of nitrogen that can be captured into organic matter and then recycled also increases. Because most plant-available nitrogen is derived from internal recycling, this latter process is critical to sustaining high productivity in environments where water and energy are plentiful. At steady state, water/energy and nitrogen limitation "equilibrate," but because the water, carbon, and nitrogen cycles have different response times, including nitrogen cycling in ecosystem models adds behavior at longer time scales than in purely biophysical models. The tight correlation of nitrogen fluxes with evapotranspiration implies that either climate change or changes to nitrogen inputs (from fertilization or air pollution) will have large and long-lived effects on both productivity and nitrogen losses through hydrological and trace gas pathways. Comprehensive analyses of the role of ecosystems in the carbon cycle must consider mechanisms that arise from the interaction of the hydrological, carbon, and nutrient cycles in ecosystems.
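
Purely as a hedged illustration of the equilibration argument, and emphatically not the Century model itself, the toy sketch below uses made-up coefficients to show how, when nitrogen inputs, nitrogen losses, and the water-limited productivity potential all scale with the water flux and most available nitrogen comes from recycling, the system settles at the water/energy-limited potential while nitrogen losses come to balance inputs.

```python
def toy_equilibration(water_flux, years=400):
    """Toy caricature (not the Century model) of water/energy and nitrogen
    limitation of NPP equilibrating; all coefficients are made up."""
    soil_n = 1.0                               # soil organic N pool
    cn = 20.0                                  # C:N ratio of new organic matter
    for _ in range(years):
        recycled = 0.05 * soil_n               # mineralization of organic N
        n_input = 0.02 * water_flux            # deposition/fixation scale with water
        available_n = recycled + n_input
        npp_water = 1.0 * water_flux           # water/energy-limited potential NPP
        npp_nitrogen = cn * available_n        # nitrogen-limited potential NPP
        npp = min(npp_water, npp_nitrogen)
        captured = npp / cn                    # N built into new organic matter
        leached = available_n - captured       # surplus N lost via hydrological pathways
        soil_n += captured - recycled
    return npp, recycled / available_n, leached

npp, recycled_fraction, n_loss = toy_equilibration(water_flux=100.0)
print("NPP %.0f, fraction of available N from recycling %.2f, N loss %.1f"
      % (npp, recycled_fraction, n_loss))
```

In this toy, the steady-state loss exactly tracks the input term, so raising inputs (as fertilization or air pollution would) raises losses one-for-one while most available nitrogen still comes from recycling, which mirrors the qualitative point the abstract makes.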

Relevance:

100.00%

Publisher:

Abstract:

"EPA/600/8-89/067F."

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Includes bibliographies.

Relevance:

100.00%

Publisher:

Abstract:

Includes bibliographies.

Relevance:

100.00%

Publisher:

Abstract:

Includes bibliographies.