976 results for "Explicit hazard model"


Relevance: 30.00%

Abstract:

In this paper we propose methods for smooth hazard estimation of an interval-censored time variable. These methods allow one to model the transformed hazard as either a smooth (smoothing-spline) or a linear function of time and other relevant time-varying predictor variables. We illustrate the method on a dataset of hemophiliacs in which the outcome, time to HIV seroconversion, is interval censored and left truncated.
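
The smoothing-spline estimator itself is not reproduced here, but the way interval censoring enters the likelihood can be sketched with a simpler parametric stand-in: a subject whose event time is only known to fall in (L, R] contributes P(L < T ≤ R) = S(L) − S(R). The snippet below is a minimal sketch assuming a Weibull hazard and simulated data; left truncation would additionally divide each contribution by the survival at the entry time.

```python
# Hedged sketch: a parametric (Weibull) analogue of interval-censored hazard
# estimation. The paper's smoothing-spline hazard is replaced by a
# two-parameter Weibull purely for illustration; the data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate true event times, then observe only the interval (L, R] containing them.
t_true = rng.weibull(1.5, size=200) * 10.0      # true shape 1.5, scale 10
L = np.floor(t_true)                            # e.g. last negative test
R = L + 1.0                                     # first positive test

def neg_log_lik(params):
    shape, scale = np.exp(params)               # enforce positivity
    S = lambda t: np.exp(-(t / scale) ** shape) # Weibull survival function
    # Interval-censored contribution: P(L < T <= R) = S(L) - S(R)
    p = S(L) - S(R)
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

fit = minimize(neg_log_lik, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(fit.x)
print(f"estimated shape={shape_hat:.2f}, scale={scale_hat:.2f}")
```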

Relevance: 30.00%

Abstract:

BACKGROUND: Many HIV-infected patients on highly active antiretroviral therapy (HAART) experience metabolic complications, including dyslipidaemia and insulin resistance, which may increase their coronary heart disease (CHD) risk. We developed a prognostic model for CHD tailored to the changes in risk factors observed in patients starting HAART. METHODS: Data from five cohort studies (British Regional Heart Study, Caerphilly and Speedwell Studies, Framingham Offspring Study, Whitehall II) on 13,100 men aged 40-70, with 114,443 person-years of follow-up, were used. CHD was defined as myocardial infarction or death from CHD. Model fit was assessed using the Akaike Information Criterion; generalizability across cohorts was examined using internal-external cross-validation. RESULTS: A parametric model based on the Gompertz distribution generalized best. Variables included in the model were systolic blood pressure, total cholesterol, high-density lipoprotein cholesterol, triglyceride, glucose, diabetes mellitus, body mass index, and smoking status. Compared with patients not on HAART, the estimated CHD hazard ratio (HR) for patients on HAART was 1.46 (95% CI 1.15-1.86) for moderate and 2.48 (95% CI 1.76-3.51) for severe metabolic complications. CONCLUSIONS: The change in the risk of CHD in HIV-infected men starting HAART can be estimated from typical changes in risk factors, assuming that HRs estimated from data on non-infected men are applicable to HIV-infected men. Based on this model, the risk of CHD is likely to increase, but increases may often be modest and could be offset by lifestyle changes.
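
To illustrate how a Gompertz baseline combines with the reported hazard ratios under proportional hazards, the sketch below computes a 5-year CHD risk as 1 − S0(t)^HR. The baseline parameters a and b are hypothetical placeholders, not the values fitted from the five cohorts.

```python
# Hedged sketch of how a Gompertz baseline hazard and the paper's hazard
# ratios combine under proportional hazards. The baseline parameters a, b
# are hypothetical, not the fitted cohort values.
import numpy as np

a, b = 1e-3, 0.09          # hypothetical Gompertz baseline h0(t) = a * exp(b*t)
t = 5.0                    # 5-year horizon

def chd_risk(hr, t=t):
    H0 = (a / b) * (np.exp(b * t) - 1.0)   # baseline cumulative hazard
    return 1.0 - np.exp(-hr * H0)          # risk = 1 - S0(t)**hr

for label, hr in [("no HAART", 1.0),
                  ("HAART, moderate complications", 1.46),
                  ("HAART, severe complications", 2.48)]:
    print(f"{label}: 5-year CHD risk = {chd_risk(hr):.2%}")
```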

Relevance: 30.00%

Abstract:

Between 1966 and 2003, the Golden-winged Warbler (Vermivora chrysoptera) declined by 3.4% per year in large parts of its breeding range and has been identified by Partners in Flight as one of 28 land birds requiring expedient action to prevent continued decline. It is currently being considered for listing under the Endangered Species Act. A major step in advancing our understanding of the status and habitat preferences of Golden-winged Warbler populations in the Upper Midwest was the publication of new predictive, spatially explicit Golden-winged Warbler habitat models for the northern Midwest. Here, I use original data on observed Golden-winged Warbler abundances in Wisconsin and Minnesota to compare two population models: the hierarchical spatial count (HSC) model and the Habitat Suitability Index (HSI) model. I assessed how well the field data matched the model predictions and found that within Wisconsin the HSC model performed slightly better than the HSI model, whereas the two models performed about equally in Minnesota. Errors of commission are sites where a model predicted presence but the Golden-winged Warbler did not occur. For the HSC model, I found a 10% error of commission in Wisconsin and a 24.2% error of commission in Minnesota. Similarly, the HSI model had a 23% error of commission in Minnesota; in Wisconsin, because the HSI model predicted absences in very few areas, the data were insufficient to determine its error of commission. To compare predicted abundances from the two models, a 3x3 contingency table was used; I found that, when overlapped, the models do not complement one another in identifying Golden-winged Warbler presences. Quantifying the discrepancy between the models, the HSI model has only a 6.8% chance of correctly classifying absences in the HSC model, and the HSC model has only a 3.3% chance of correctly classifying absences in the HSI model. These findings highlight the importance of grasses for nesting, shrubs for cover and foraging, and trees for song perches and foraging as key habitat characteristics for breeding-territory occupancy by singing males.
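
The error-of-commission calculation described above can be reproduced from any model-versus-field cross-tabulation. The sketch below uses illustrative counts, not the study's data.

```python
# Hedged sketch: computing an error of commission (predicted presence where
# the bird was absent) from a model-vs-field contingency table. The
# observations and predictions here are illustrative.
import pandas as pd

obs  = pd.Series(["present", "absent", "present", "absent", "absent", "present"])
pred = pd.Series(["present", "present", "present", "absent", "present", "absent"])

table = pd.crosstab(pred, obs, rownames=["predicted"], colnames=["observed"])
print(table)

# Error of commission: share of predicted presences with no observed bird.
commission = table.loc["present", "absent"] / table.loc["present"].sum()
print(f"error of commission = {commission:.1%}")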

Relevance: 30.00%

Abstract:

The present distribution of freshwater fish in the Alpine region has been strongly affected by colonization events that occurred after the last glacial maximum (LGM), some 20,000 years ago. We use a spatially explicit simulation framework to model and better understand colonization dynamics in the Swiss Rhine basin. This approach is applied to the European bullhead (Cottus gobio), an ideal model organism for studying past demographic processes in fish because it has not been managed by humans. The molecular diversity of eight sampled populations is simulated and compared with observed data at six microsatellite loci under an approximate Bayesian computation framework to estimate the parameters of the colonization process. Our demographic estimates fit well with current knowledge about the biology of this species, but they suggest that the Swiss Rhine basin was colonized very recently, after the Younger Dryas, some 6,600 years ago. We discuss the implications of this result, as well as the strengths and limitations of the spatially explicit approach coupled with the approximate Bayesian computation framework.
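
The approximate Bayesian computation step can be illustrated with a minimal rejection-ABC loop: draw parameters from a prior, simulate a summary statistic, and keep draws whose simulated statistic falls within a tolerance of the observed one. The colonization simulator and microsatellite statistics are replaced below by a deliberately toy model; all numbers are illustrative.

```python
# Hedged sketch of rejection-based approximate Bayesian computation (ABC).
# The study's spatially explicit colonization simulator and microsatellite
# summary statistics are replaced by a toy model: inferring the founding
# time of a population from its present-day genetic diversity.
import numpy as np

rng = np.random.default_rng(1)

def simulate_diversity(t_colonization):
    # Toy stand-in for the colonization simulator: diversity accumulates
    # with time since founding, plus sampling noise.
    return 1.0 - np.exp(-t_colonization / 5000.0) + rng.normal(0, 0.02)

observed = 0.72                      # pretend observed summary statistic
tolerance = 0.02

prior_draws = rng.uniform(1000, 20000, size=100_000)   # years before present
accepted = [t for t in prior_draws
            if abs(simulate_diversity(t) - observed) < tolerance]

post = np.array(accepted)
print(f"posterior mean colonization time: {post.mean():.0f} BP "
      f"(n accepted = {post.size})")
```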

Relevance: 30.00%

Abstract:

The municipality of San Juan La Laguna, Guatemala is home to approximately 5,200 people and is located on the western side of the Lake Atitlán caldera. Steep slopes surround all but the eastern side of San Juan. The Lake Atitlán watershed is susceptible to many natural hazards, but the most predictable are the landslides that can occur annually with each rainy season, especially during high-intensity events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the Atitlán region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. This study evaluated multiple attributes at every landslide and non-landslide point and applied different multivariate analyses to optimize a model for landslide prediction during high-intensity precipitation events like Hurricane Stan. The attributes considered in this study are: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature, and topographic wetness index. The attributes were pre-evaluated for their ability to predict landslides using four different attribute evaluators, all available in the open-source data-mining software Weka: filtered subset, information gain, gain ratio, and chi-squared. Three multivariate algorithms (decision tree J48, logistic regression, and BayesNet) were optimized for landslide prediction using different attributes. The following statistical parameters were used to evaluate model accuracy: precision, recall, F-measure, and area under the receiver operating characteristic (ROC) curve. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points. The probability map developed in this study was also compared to the results of a bivariate landslide susceptibility analysis conducted for the watershed, encompassing Lake Atitlán and San Juan. Landslides from Tropical Storm Agatha in 2010 were used to independently validate this study's multivariate model and the bivariate model. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective landslide hazard planning and mitigation in the future.
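
The Weka workflow described above (attribute evaluation followed by classifier comparison via ROC) can be sketched with scikit-learn analogues: mutual information as a stand-in for the information-gain evaluator, and logistic regression, one of the three algorithms actually tested. The data below are synthetic, and the attribute names are placeholders for the study's real layers.

```python
# Hedged sketch of the attribute-evaluation / classification workflow using
# scikit-learn analogues of the Weka tools named above. Synthetic data only.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 1000
X = np.column_stack([
    rng.uniform(0, 45, n),        # slope (degrees) -- placeholder attribute
    rng.uniform(0, 1, n),         # topographic wetness index (scaled)
    rng.integers(0, 4, n),        # land-use class
])
# Synthetic rule: steep, wet slopes fail more often.
p = 1 / (1 + np.exp(-(0.15 * X[:, 0] + 2.0 * X[:, 1] - 6.0)))
y = rng.random(n) < p

# Attribute evaluation, analogous to Weka's information-gain ranking.
scores = mutual_info_classif(X, y, random_state=0)
for name, s in zip(["slope", "TWI", "land use"], scores):
    print(f"{name}: MI = {s:.3f}")

# One of the three tested algorithms, evaluated by area under the ROC curve.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"ROC AUC = {auc:.2f}")
```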

Relevance: 30.00%

Abstract:

PURPOSE: To explore whether population-related pharmacogenomics contribute to differences in patient outcomes between clinical trials performed in Japan and the United States, given similar study designs, eligibility criteria, staging, and treatment regimens. METHODS: We prospectively designed and conducted three phase III trials (Four-Arm Cooperative Study, LC00-03, and S0003) in advanced-stage non-small-cell lung cancer, each with a common arm of paclitaxel plus carboplatin. Genomic DNA was collected from patients in LC00-03 and S0003 who received paclitaxel (225 mg/m²) and carboplatin (area under the concentration-time curve, 6). Genotypic variants of CYP3A4, CYP3A5, CYP2C8, NR1I2-206, ABCB1, ERCC1, and ERCC2 were analyzed by pyrosequencing or by PCR restriction fragment length polymorphism. Results were assessed by Cox model for survival and by logistic regression for response and toxicity. RESULTS: Clinical results were similar in the two Japanese trials and significantly different from the US trial for survival, neutropenia, febrile neutropenia, and anemia. There was a significant difference between Japanese and US patients in genotypic distribution for CYP3A4*1B (P = .01), CYP3A5*3C (P = .03), ERCC1 118 (P < .0001), ERCC2 K751Q (P < .001), and CYP2C8 R139K (P = .01). Genotypic associations were observed between CYP3A4*1B and progression-free survival (hazard ratio [HR], 0.36; 95% CI, 0.14 to 0.94; P = .04) and between ERCC2 K751Q and response (HR, 0.33; 95% CI, 0.13 to 0.83; P = .02). For grade 4 neutropenia, the HR for ABCB1 3435C→T was 1.84 (95% CI, 0.77 to 4.48; P = .19). CONCLUSION: Differences in allelic distribution for genes involved in paclitaxel disposition or DNA repair were observed between Japanese and US patients. In an exploratory analysis, genotype-related associations with patient outcomes were observed for CYP3A4*1B and ERCC2 K751Q. This common-arm approach facilitates the prospective study of population-related pharmacogenomics in settings where ethnic differences in antineoplastic drug disposition are anticipated.
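
A minimal sketch of the survival analysis named above, a Cox model of progression-free survival against a binary genotype indicator, using the lifelines library; the data frame and column names are synthetic placeholders, not the trial data.

```python
# Hedged sketch: Cox regression of progression-free survival on a binary
# genotype indicator, as in the genotypic-association analysis. The data
# frame here is simulated; 'carrier' is a hypothetical genotype flag.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 200
carrier = rng.integers(0, 2, n)                      # e.g. CYP3A4*1B carrier
# Simulate PFS times: carriers progress later (HR < 1), as reported above.
times = rng.exponential(scale=np.where(carrier, 14.0, 8.0))
event = rng.random(n) < 0.8                          # ~20% censored

df = pd.DataFrame({"pfs_months": times,
                   "progressed": event.astype(int),
                   "carrier": carrier})
cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()   # hazard ratio = exp(coef) for 'carrier'
```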

Relevance: 30.00%

Abstract:

Starting from an overview of losses due to mountain hazards in the Russian Federation and the European Alps, the question is raised why a substantial number of events are still recorded despite considerable efforts in hazard mitigation and risk reduction. The main reason for this paradox lies in the absence of a dynamic, risk-based approach, and it is shown that these dynamics have different roots: firstly, the development of hazard scenarios is based on the static approach of design events, neglecting climate change and system dynamics; secondly, due to economic development and population dynamics, the exposed elements at risk are subject to spatial and temporal changes. These issues are discussed with respect to temporal and spatial demands. As a result, it is shown how risk is dynamic on both long-term and short-term scales, which has to be acknowledged in the risk concept if that concept is to target the sustainable development of mountain regions. A conceptual model is presented that can be used for dynamic risk assessment, and different management strategies illustrate how this model may be put into practice. Furthermore, the interconnectedness and interaction between hazard and risk are addressed in order to enhance prevention, the level of protection, and the degree of preparedness.

Relevance: 30.00%

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model has also proved relevant for assessing other natural hazards such as rockfall, snow avalanches, and floods. The model allows for automatic source-area delineation, given user criteria, and for assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, thus producing more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one strictly required for both source-area delineation and propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results; however, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
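
Holmgren's multiple-flow-direction weighting, which the new spreading algorithm improves upon, distributes outflow from a cell among its lower neighbours in proportion to tan(β)^x, where the exponent x controls how divergent the flow is. The sketch below implements the original Holmgren weighting for a single cell, not Flow-R's modified version.

```python
# Hedged sketch of Holmgren's multiple-flow-direction weighting, the basis
# of Flow-R's spreading algorithm (the modified version is not reproduced
# here). Flow from a cell is split among lower neighbours in proportion to
# tan(beta)^x.
import numpy as np

def holmgren_weights(dem, row, col, cell_size=10.0, x=4.0):
    """Fraction of flow sent from (row, col) to each of its 8 neighbours."""
    weights = np.zeros((3, 3))
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            drop = dem[row, col] - dem[row + dr, col + dc]
            dist = cell_size * np.hypot(dr, dc)
            tan_beta = drop / dist
            if tan_beta > 0:                    # downslope neighbours only
                weights[dr + 1, dc + 1] = tan_beta ** x
    total = weights.sum()
    return weights / total if total > 0 else weights   # pit: no outflow

dem = np.array([[12.0, 11.0, 10.0],
                [11.0,  9.0,  8.0],
                [10.0,  8.0,  5.0]])
print(holmgren_weights(dem, 1, 1))   # flow concentrates toward the SE corner
```

Larger x concentrates the flow into the steepest direction, while x = 1 reproduces a conventional multiple-flow-direction split.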

Relevance: 30.00%

Abstract:

Objective: Impaired cognition is an important dimension in psychosis and its at-risk states. Research on the value of impaired cognition for psychosis prediction in at-risk samples, however, mainly relies on study-specific sample means of neurocognitive tests, which, unlike widely available general test norms, are difficult to translate into clinical practice. The aim of this study was to explore the combined predictive value of at-risk criteria and neurocognitive deficits according to test norms with a risk stratification approach. Method: Potential predictors of psychosis (neurocognitive deficits and at-risk criteria) over 24 months were investigated in 97 at-risk patients. Results: The final prediction model included (1) at-risk criteria (attenuated psychotic symptoms plus subjective cognitive disturbances) and (2) a processing speed deficit (digit symbol test). The model was stratified into 4 risk classes with hazard rates between 0.0 (both predictors absent) and 1.29 (both predictors present). Conclusions: The combination of a processing speed deficit and at-risk criteria provides an optimized stratified risk assessment. Based on neurocognitive test norms, the validity of our proposed 3 risk classes could easily be examined in independent at-risk samples and, pending positive validation results, our approach could then be applied in clinical practice.
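
The stratified hazard rates underlying such risk classes can be computed directly as events per person-time within each combination of the two binary predictors. The sketch below does this on a synthetic table; the column names and values are placeholders, not the study data.

```python
# Hedged sketch: crude hazard rates (transitions per person-year of
# follow-up) for the strata defined by two binary predictors, mirroring the
# risk classes above. The patient records are synthetic placeholders.
import pandas as pd

df = pd.DataFrame({
    "at_risk_criteria": [0, 0, 1, 1, 1, 1, 0, 1],
    "speed_deficit":    [0, 1, 0, 1, 1, 0, 0, 1],
    "followup_years":   [2.0, 1.5, 2.0, 0.5, 0.8, 2.0, 2.0, 0.4],
    "transitioned":     [0, 0, 0, 1, 1, 0, 0, 1],
})

strata = df.groupby(["at_risk_criteria", "speed_deficit"]).agg(
    events=("transitioned", "sum"),
    person_years=("followup_years", "sum"),
)
strata["hazard_rate"] = strata["events"] / strata["person_years"]
print(strata)   # rate rises from neither predictor present to both present
```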

Relevance: 30.00%

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns of consistency in these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance: 30.00%

Abstract:

We introduce a version of operational set theory, OST−, without a choice operation, which has a machinery for Δ0 separation based on truth functions and the separation operator, and a new kind of applicative set theory, so-called weak explicit set theory WEST, based on Gödel operations. We show that both theories and Kripke–Platek set theory KP with infinity are pairwise Π1 equivalent. We also show analogous assertions for subtheories with ∈-induction restricted in various ways and for supertheories extended by powerset, beta, limit and Mahlo operations. Whereas the upper bound is given by a refinement of inductive definition in KP, the lower bound is obtained by a combination, in a specific way, of realisability, (intuitionistic) forcing and negative interpretations. Thus, despite interpretability between classical theories, we make "a detour via intuitionistic theories". The combined interpretation, seen as a model construction in the sense of Visser's miniature model theory, is a new way of construction for classical theories and could be said to be the third kind of model construction ever used that is non-trivial on the logical connective level, after generic extension à la Cohen and Krivine's classical realisability model.

Relevance: 30.00%

Abstract:

Domestic dog rabies is endemic in large parts of the developing world and also epidemic in previously free regions. For example, it continues to spread in eastern Indonesia and currently threatens adjacent rabies-free regions with high densities of free-roaming dogs, including remote northern Australia. Mathematical and simulation disease models are useful tools for providing insights into the most effective control strategies and for informing policy decisions. Existing rabies models typically focus on long-term control programs in endemic countries; simulation models describing a dog-rabies incursion into regions where rabies is still exotic are lacking. We here describe such a stochastic, spatially explicit rabies simulation model, based on individual dog information collected in two remote regions in northern Australia. Illustrative simulations produced plausible results with the epidemic characteristics expected for rabies outbreaks in disease-free regions (mean R0 of 1.7, epidemic peak 97 days post-incursion, and vaccination as the most effective response strategy). Systematic sensitivity analysis identified that model outcomes were most sensitive to seven of the 30 model parameters tested. This model is suitable for exploring rabies spread and control before an incursion in populations of largely free-roaming dogs that live in close association with their owners. It can be used for ad hoc contingency or response planning prior to and shortly after an incursion of dog rabies into previously free regions. One challenge that remains is model parameterisation, particularly how dogs' roaming, contact, and biting behaviours change following a rabies incursion into a previously rabies-free population.
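
The paper's model is individual-based and spatially explicit; as a much-reduced sketch, the branching-process toy below captures only the transmission-and-incubation chain that drives the epidemic curve, using illustrative parameters (an R0 of 1.7 as reported, and an assumed mean incubation period).

```python
# Hedged sketch: a minimal stochastic branching-process view of a rabies
# incursion into a fully susceptible dog population. Spatial structure,
# contact heterogeneity, and interventions are deliberately omitted; the
# 22-day mean incubation below is an assumed, illustrative value.
import numpy as np

rng = np.random.default_rng(4)

def simulate_outbreak(r0=1.7, mean_incubation=22.0, n_days=365, seed_cases=1):
    """Daily counts of new rabid dogs: each case infects Poisson(r0) others,
    each becoming rabid after an exponentially distributed incubation delay."""
    new_cases = np.zeros(n_days, dtype=int)
    new_cases[0] = seed_cases
    for day in range(n_days):
        for _ in range(new_cases[day]):
            for _ in range(rng.poisson(r0)):
                onset = day + int(rng.exponential(mean_incubation)) + 1
                if onset < n_days:
                    new_cases[onset] += 1
    return new_cases

cases = simulate_outbreak()
print(f"total cases: {cases.sum()}, epidemic peak on day {cases.argmax()}")
```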

Relevance: 30.00%

Abstract:

The problem of analyzing data with updated measurements in the time-dependent proportional hazards model arises frequently in practice. One available option is to reduce the number of intervals (or updated measurements) included in the Cox regression model. We empirically investigated the bias of the estimator of the coefficient of the time-dependent covariate while varying the failure rate, sample size, true parameter values, and number of intervals. We also evaluated how often a time-dependent covariate needs to be collected and assessed the effect of sample size and failure rate on the power of testing a time-dependent effect. A time-dependent proportional hazards model with two binary covariates was considered. The time axis was partitioned into k intervals. The baseline hazard was assumed to be 1, so that the failure times were exponentially distributed within the ith interval. A type II censoring model was adopted to characterize the failure rate. The factors of interest were sample size (500, 1000), type II censoring with failure rates of 0.05, 0.10, and 0.20, and three values for each of the non-time-dependent and time-dependent covariates (1/4, 1/2, 3/4). The mean bias of the estimator of the coefficient of the time-dependent covariate decreased as sample size and number of intervals increased, whereas it increased as the failure rate and the true values of the covariates increased. The mean bias was smallest when all of the updated measurements were used in the model, compared with two models that used only selected measurements of the time-dependent covariate. For the model that included all the measurements, the coverage rates of the estimator of the coefficient of the time-dependent covariate were in most cases 90% or more, except when the failure rate was high (0.20). The power of testing a time-dependent effect was highest when all of the measurements of the time-dependent covariate were used. An example from the Systolic Hypertension in the Elderly Program Cooperative Research Group is presented.
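
Updated measurements enter a Cox model through the long (start, stop] data format, one row per subject per interval between measurements. A minimal sketch with lifelines' CoxTimeVaryingFitter follows; the tiny data set is synthetic, not the dissertation's simulation design.

```python
# Hedged sketch: fitting a time-dependent covariate via the long (start,
# stop] data format used for updated measurements in Cox regression.
# The data frame below is a synthetic four-subject example.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per subject per interval between updated measurements.
long_df = pd.DataFrame({
    "id":    [1, 1, 2, 2, 3, 3, 4],
    "start": [0, 4, 0, 4, 0, 4, 0],
    "stop":  [4, 7, 4, 6, 4, 5, 3],
    "z":     [0, 1, 0, 0, 1, 1, 1],   # updated (time-dependent) covariate
    "event": [0, 1, 0, 1, 0, 1, 1],   # event only on the terminal interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) estimates the hazard ratio for z
```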

Relevance: 30.00%

Abstract:

The development of the ecosystem approach and of models for the management of ocean marine resources requires easy access to standard, validated datasets of historical catch data for the main exploited species, together with the model estimates obtained from these data, allowing model inter-comparison and evaluation of model skill. North Atlantic albacore tuna is exploited all year round by longline and, in summer and autumn, by surface fisheries, and fishery statistics are compiled by the International Commission for the Conservation of Atlantic Tunas (ICCAT). Catch and effort with geographical coordinates at a monthly spatial resolution of 1° or 5° squares were extracted for this species, with a careful definition of fisheries and data screening. Length frequencies of catch were also extracted according to the definition of fisheries for the period 1956-2010. Using these data, an application of the spatial ecosystem and population dynamics model (SEAPODYM) was developed for the North Atlantic albacore population and fisheries, providing the first spatially explicit estimate of albacore density in the North Atlantic by life stage. These densities by life stage (larval recruits, young immature fish, adult mature fish, and total biomass) are provided in gridded files (NetCDF) at a resolution of 2° × 2° × month.
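
Such a gridded NetCDF product is conveniently read with xarray; the file name and variable name in the sketch below are hypothetical placeholders and should be replaced by the dataset's actual naming.

```python
# Hedged sketch of reading a gridded NetCDF product with xarray. The file
# and variable names are hypothetical placeholders, not the dataset's
# actual naming conventions.
import xarray as xr

ds = xr.open_dataset("seapodym_albacore_north_atlantic.nc")  # placeholder file
print(ds)                         # lists dimensions, e.g. time / lat / lon

# Example: mean field over the last 12 monthly time steps
# ("adult_biomass" is an assumed variable name).
biomass = ds["adult_biomass"]
annual_mean = biomass.isel(time=slice(-12, None)).mean(dim="time")
print(annual_mean)
```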

Relevance: 30.00%

Abstract:

The role of Pre- and Protohistoric anthropogenic land-cover changes needs to be quantified (i) to establish a baseline for comparison with current human impact on the environment and (ii) to separate it from naturally occurring changes in our environment. Results are presented from the simple, adaptation-driven, spatially explicit Global Land Use and technological Evolution Simulator (GLUES) for pre-Bronze Age demographic, technological, and economic change. Using scaling parameters from the History Database of the Global Environment as well as GLUES-simulated population density and subsistence style, the land requirement for growing crops is estimated. The intrusion of cropland into potentially forested areas is translated into carbon loss due to deforestation with the dynamic global vegetation model VECODE. The land demand in important Prehistoric growth areas, converted mostly from forested areas, led to large-scale regional (country-size) deforestation of up to 11% of the potential forest. In total, 29 Gt of carbon were lost from global forests between 10,000 BC and 2000 BC and replaced by crops; this value is consistent with other estimates of Prehistoric deforestation. The generation of realistic (agri-)cultural development trajectories at a regional resolution is a major strength of GLUES. Most of the pre-Bronze Age deforestation is simulated in a broad farming belt stretching from Central Europe via India to China. Regional carbon losses are, for example, 5 Gt in Europe and the Mediterranean, 6 Gt on the Indian subcontinent, 18 Gt in East and Southeast Asia, and 2.3 Gt in sub-Saharan Africa.