993 results for hazard models


Relevance: 60.00%

Abstract:

Background: Eradication of bovine tuberculosis (bTB) through test-and-cull programs is a declared goal of developed countries in which the disease is still endemic. Here, longitudinal data from more than 1,700 cattle herds tested over a 12-year period in the eradication program of the region of Madrid, Spain, were analyzed to quantify the within-herd transmission coefficient (β) depending on herd type (beef/dairy/bullfighting). In addition, the probability of recovering officially bTB-free (OTF) status in infected herds, depending on herd type and the diagnostic strategy implemented, was assessed using Cox proportional hazard models. Results: Overall, dairy herds showed a higher β (median 4.7) than beef or bullfighting herds (2.3 and 2.2, respectively). Introduction of the interferon-gamma (IFN-γ) assay as an ancillary test produced an apparent increase in the β coefficient regardless of production type, likely due to an increase in diagnostic sensitivity. Time to recovery of OTF status was also significantly shorter in dairy herds, and the length of bTB episodes was significantly reduced when the IFN-γ assay was used to manage the outbreak. Conclusions: Our results suggest that bTB spreads more rapidly in dairy herds than in other herd types, likely because of management and demographic factors. However, outbreaks in dairy herds can be controlled more rapidly than in typically extensive herd types. Finally, the IFN-γ assay proved useful for rapidly eradicating bTB at the herd level.
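The within-herd transmission coefficient estimated in studies like this can be sketched under a simple susceptible-infected (SI) model, in which the expected number of new cases per testing interval is β·S·I/N·Δt. The pooled estimator and the herd data below are wholly hypothetical illustrations, not the Madrid dataset or the paper's exact method.

```python
# Hedged sketch: pooled estimate of a within-herd transmission coefficient
# (beta) from longitudinal test rounds, assuming new cases per interval are
# approximately beta * S * I / N * dt. All numbers are illustrative.

def estimate_beta(rounds):
    """rounds: list of (S, I, N, dt_years, new_cases) per testing interval."""
    exposure = sum(S * I / N * dt for S, I, N, dt, _ in rounds)
    cases = sum(c for *_, c in rounds)
    return cases / exposure  # cases per unit of infection pressure

herd_rounds = [
    (95, 5, 100, 0.5, 3),   # S, I, N, interval length (years), new reactors
    (90, 8, 100, 0.5, 4),
    (85, 10, 100, 0.5, 4),
]
beta_hat = estimate_beta(herd_rounds)
```

A higher β for dairy herds would correspond, in this toy setting, to more new reactors per unit of infection pressure between testing rounds.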

Relevance: 60.00%

Abstract:

Background: Oral cancer is a significant public health problem worldwide and exerts high economic, social, psychological, and physical burdens on patients, their families, and their primary care providers. We set out to describe the changing trends in incidence and survival rates of oral cancer in Ireland between 1994 and 2009. Methods: National data on incident oral cancers (ICD-10 codes C01–C06) were obtained from the National Cancer Registry Ireland for 1994 to 2009. We estimated the annual percentage change (APC) in oral cancer incidence during 1994–2009 using joinpoint regression software (version 4.2.0.2). The lifetime risk of oral cancer to age 79 was estimated using Irish incidence and population data from 2007 to 2009. Survival was examined using Kaplan-Meier curves and Cox proportional hazard models to explore the influence of several demographic and lifestyle covariates, with follow-up to the end of 2012. Results: Data were obtained on 2,147 incident oral cancer cases. Men accounted for two-thirds of cases (n = 1,430). Annual rates in men decreased significantly during 1994–2001 (APC = -4.8%, 95% CI: -8.7 to -0.7) and then increased moderately (APC = 2.3%, 95% CI: -0.9 to 5.6). In contrast, annual incidence increased significantly in women throughout the study period (APC = 3.2%, 95% CI: 1.9 to 4.6). Risk of death was elevated among oral cancer patients who were older than 60 years of age, smokers, unemployed or retired, living in the most deprived areas, or whose tumour was sited in the base of the tongue. Being married and being diagnosed in more recent years were associated with reduced risk of death. Conclusion: Oral cancer increased significantly in both sexes between 1999 and 2009 in Ireland. Our analyses demonstrate the influence of measured factors such as smoking, time of diagnosis, and age on observed trends. Unmeasured factors such as alcohol use, HPV, and dietary factors may also contribute to these trends. Several of these are modifiable risk factors, which is crucial for informing public health policies, and more research is therefore needed.
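Within each segment, joinpoint regression reports the annual percentage change obtained by regressing the log incidence rate on calendar year and transforming the slope. A minimal sketch of that transformation, on synthetic rates rather than the NCRI data:

```python
# Illustrative APC computation: fit ln(rate) ~ year by least squares within a
# segment, then APC = 100 * (exp(slope) - 1). Rates below are synthetic.
import math

def apc(years, rates):
    n = len(years)
    ybar = sum(years) / n
    lbar = sum(math.log(r) for r in rates) / n
    num = sum((y - ybar) * (math.log(r) - lbar) for y, r in zip(years, rates))
    den = sum((y - ybar) ** 2 for y in years)
    slope = num / den
    return 100.0 * (math.exp(slope) - 1.0)  # percent change per year

# rates constructed to grow exactly 3.2% per year recover APC = 3.2
years = list(range(1994, 2010))
rates = [5.0 * 1.032 ** (y - 1994) for y in years]
result = apc(years, rates)
```

The joinpoint software additionally searches for the change-points between such segments, which this sketch does not attempt.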

Relevance: 60.00%

Abstract:

Background: To identify those characteristics of self-management interventions in patients with heart failure (HF) that are effective in influencing health-related quality of life, mortality, and hospitalizations. Methods and Results: Randomized trials on self-management interventions conducted between January 1985 and June 2013 were identified, and individual patient data were requested for meta-analysis. Generalized mixed effects models and Cox proportional hazard models including frailty terms were used to assess the relation between characteristics of interventions and health-related outcomes. Twenty randomized trials (5624 patients) were included. Longer intervention duration reduced mortality risk (hazard ratio 0.99, 95% confidence interval [CI] 0.97–0.999 per month increase in duration), risk of HF-related hospitalization (hazard ratio 0.98, 95% CI 0.96–0.99), and HF-related hospitalization at 6 months (risk ratio 0.96, 95% CI 0.92–0.995). Although results were not consistent across outcomes, interventions comprising standardized training of interventionists, peer contact, log keeping, or goal-setting skills appeared less effective than interventions without these characteristics. Conclusion: No specific program characteristics were consistently associated with better effects of self-management interventions, but longer duration seemed to improve the effect of self-management interventions on several outcomes. Future research using factorial trial designs and process evaluations is needed to understand the working mechanisms of specific program characteristics of self-management interventions in HF patients.
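A hazard ratio expressed per month of intervention duration compounds multiplicatively under the proportional hazards assumption, so a seemingly small per-month effect accumulates over longer programs. A one-line sketch of that arithmetic, using the reported point estimate of 0.99:

```python
# Under proportional hazards, a per-month hazard ratio compounds over the
# intervention duration: HR(total) = HR(per month) ** months.
def cumulative_hr(hr_per_month, months):
    return hr_per_month ** months

hr_12_months = cumulative_hr(0.99, 12)  # effect of one extra year of duration
```

So a per-month mortality hazard ratio of 0.99 corresponds to roughly an 11% lower hazard for an intervention lasting twelve months longer, at the point estimate.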

Relevance: 60.00%

Abstract:

Tsunamis are rare events, but their impact can be devastating and may extend over large geographical areas. For low-probability, high-impact events like tsunamis, it is crucial to implement all possible actions to mitigate the risk. Tsunami hazard assessment is the result of a scientific process that integrates traditional geological methods, numerical modelling, and the analysis of tsunami sources and historical records. Analysing past events and understanding how they interacted with the land is therefore the only way to inform tsunami source and propagation models and to quantitatively test forecast models such as hazard analyses. The primary objective of this thesis is to establish an explicit relationship between the macroseismic intensity derived from historical descriptions and the quantitative physical parameters measuring tsunami waves. This is done first by defining an approximate estimation method, based on a simplified 1D physical onshore propagation model, to convert the available observations into one reference physical metric. Wave height at the coast was chosen as the reference because of its stability and its independence from inland effects. This method was then applied to a set of well-known past events to build a homogeneous dataset containing both macroseismic intensity and wave height. By performing an orthogonal regression, a direct and invertible empirical relationship could be established between the two parameters, accounting for their relevant uncertainties. The resulting relationship is extensively tested and finally applied to the Italian Tsunami Effect Database (ITED), providing a homogeneous estimation of wave height for all existing tsunami observations in Italy. This opens the way to meaningful comparisons with models and simulations, as well as to quantitatively testing tsunami hazard models for the Italian coasts and informing tsunami risk management initiatives.
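Orthogonal regression differs from ordinary least squares in that it minimizes perpendicular distances, treating both variables as uncertain, which is what makes the fitted relationship invertible. A self-contained sketch via the principal eigenvector of the 2×2 covariance matrix, on synthetic intensity/height pairs rather than the thesis dataset:

```python
# Hedged sketch of orthogonal (total least squares) regression: the fitted
# line follows the principal axis of the covariance matrix, minimising
# perpendicular distances. Data below are synthetic, not ITED.
import math

def orthogonal_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    lam = 0.5 * (sxx + syy) + math.sqrt(0.25 * (sxx - syy) ** 2 + sxy ** 2)
    slope = (lam - sxx) / sxy
    return slope, my - slope * mx  # slope and intercept; relation is invertible

intensity = [2, 3, 4, 5, 6]                      # macroseismic intensity (toy)
log_height = [0.5 * i - 0.4 for i in intensity]  # log wave height (toy)
slope, intercept = orthogonal_fit(intensity, log_height)
```

Because the fit is symmetric in the two variables, the same line can be read in either direction: intensity from wave height or wave height from intensity.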

Relevance: 40.00%

Abstract:

"March 1977."

Relevance: 40.00%

Abstract:

Final report, issued March 1977.

Relevance: 40.00%

Abstract:

An investigation into karst hazard in southern Ontario has been undertaken with the intention of leading to the development of predictive karst models for this region. These are not currently feasible because of a lack of sufficient karst data, though this is not entirely due to a lack of karst features. Geophysical data were collected at Lake on the Mountain, Ontario, as part of this investigation, in order to test the long-standing hypothesis that Lake on the Mountain was formed by a sinkhole collapse. Sub-bottom acoustic profiling was used to image the lake-bottom sediments and bedrock. Vertical bedrock features interpreted as solutionally enlarged fractures were taken as evidence for karst processes on the lake bottom. Additionally, the bedrock topography shows a narrower and more elongated basin than was previously identified, lying parallel to a mapped fault system in the area. This suggests that Lake on the Mountain formed over a fault zone, which also supports the sinkhole hypothesis, as faulting would provide groundwater pathways for karst dissolution. Previous sediment cores suggest that Lake on the Mountain formed at some point during the Wisconsinan glaciation, with glacial meltwater and glacial loading as potential contributing factors to sinkhole development. A probabilistic karst model for the state of Kentucky, USA, was generated using the Weights of Evidence method. This model is presented as an example of the predictive capabilities of such data-driven modelling techniques and of how they could be applied to karst in Ontario. The model classified 70% of the validation dataset correctly while minimizing false positive identifications; this is moderately successful and could be improved. Finally, improvements to the current karst model of southern Ontario are suggested, with the goals of increasing investigation into karst in Ontario and streamlining the reporting system for sinkholes, caves, and other karst features so as to improve the current Ontario karst database.
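The Weights of Evidence method scores a binary evidence layer by comparing how often it coincides with known occurrences versus non-occurrences. A minimal sketch with hypothetical cell counts (not the Kentucky data): W+ = ln[P(B|D)/P(B|¬D)], W− = ln[P(¬B|D)/P(¬B|¬D)], and the contrast C = W+ − W− summarizes the layer's predictive strength.

```python
# Hedged sketch of the Weights of Evidence calculation for one binary
# evidence layer (e.g. carbonate bedrock) against known sinkhole cells.
# All counts are hypothetical.
import math

def weights_of_evidence(n_b_d, n_b, n_d, n_total):
    """n_b_d: cells with evidence AND occurrence; n_b: cells with evidence;
    n_d: cells with occurrence; n_total: all cells in the study area."""
    p_b_d = n_b_d / n_d                       # P(B | D)
    p_b_nd = (n_b - n_b_d) / (n_total - n_d)  # P(B | ~D)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus  # contrast C = W+ - W-

w_plus, w_minus, contrast = weights_of_evidence(80, 2000, 100, 10000)
```

A positive W+ (and large contrast) means the evidence layer is spatially associated with occurrences; summing the weights of several layers gives the posterior-style susceptibility score used in such models.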

Relevance: 30.00%

Abstract:

Historically, the cure rate model has been used for modeling time-to-event data in which a significant proportion of patients are assumed to be cured of illnesses such as breast cancer, non-Hodgkin lymphoma, leukemia, prostate cancer, melanoma, and head and neck cancer. Perhaps the most popular type of cure rate model is the mixture model introduced by Berkson and Gage [1]. In this model, it is assumed that a certain proportion of the patients are cured, in the sense that they do not present the event of interest during a long period of time and can be considered immune to the cause of failure under study. In this paper, we propose a general hazard model which accommodates comprehensive families of cure rate models as particular cases, including the model proposed by Berkson and Gage. The maximum likelihood estimation procedure is discussed. A simulation study analyzes the coverage probabilities of the asymptotic confidence intervals for the parameters. A real data set on children exposed to HIV by vertical transmission illustrates the methodology.
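The defining feature of the Berkson-Gage mixture model is a survival function that plateaus at the cured fraction π: S(t) = π + (1 − π)S₀(t), where S₀ is the survival of the uncured. A sketch with an exponential baseline, which is an illustrative choice rather than the paper's specification:

```python
# Sketch of the Berkson-Gage mixture cure model: a cured fraction pi never
# experiences the event, so population survival plateaus at pi as t grows.
# The exponential baseline S0(t) = exp(-rate * t) is an illustrative choice.
import math

def mixture_survival(t, pi, rate):
    """S(t) = pi + (1 - pi) * S0(t), with exponential baseline S0."""
    return pi + (1.0 - pi) * math.exp(-rate * t)

pi = 0.3  # hypothetical cured proportion
early = mixture_survival(1.0, pi, 0.5)
late = mixture_survival(50.0, pi, 0.5)  # approaches the plateau at pi
```

This plateau is what distinguishes cure rate models from standard survival models, whose survival functions decay to zero.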

Relevance: 30.00%

Abstract:

Dissertation submitted to obtain the degree of Doctor in Electrical and Computer Engineering – Digital and Perceptional Systems, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance: 30.00%

Abstract:

Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic, and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It is therefore fundamental to include resilience criteria in housing, allowing new houses to better withstand the passage of time and natural disasters in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology was proposed for flood-resilient housing models, in which key criteria that housing should meet were identified. The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to sea level rise resulting from climate change – was analyzed in the light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the resilience to floods of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.

Relevance: 30.00%

Abstract:

The environmental and socio-economic importance of coastal areas is widely recognized, but at present these areas face severe weaknesses and high-risk situations. Increased demand and growing human occupation of coastal zones have greatly contributed to exacerbating such weaknesses. Today, throughout the world, in all countries with coastal regions, episodes of wave overtopping and coastal flooding are frequent. These episodes are usually responsible for property losses and often put human lives at risk. The floods are caused by coastal storms, primarily through the action of very strong winds; the propagation of these storms towards the coast induces high water levels. Climate change phenomena are expected to intensify coastal storms. In this context, estimating coastal flooding hazard is of paramount importance for the planning and management of coastal zones. Consequently, running a series of storm scenarios and analyzing their impacts through numerical modeling is of prime interest to coastal decision-makers. Firstly, in this work, historical storm tracks and intensities for the northeastern United States coast are characterized in terms of probability of occurrence. Secondly, several storm events with a high potential of occurrence are generated using a tool of the DelftDashboard interface for the Delft3D software. Hydrodynamic models are then used to generate ensemble simulations to assess the storms' effects on coastal water levels. A highly refined regional domain is considered surrounding the area of The Battery, New York, situated in New York Harbor. Based on statistics of the numerical modeling results, the impact of coastal storms on different locations within the study area is reviewed.
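One common way to summarize ensemble simulations like these is the empirical exceedance probability of a water-level threshold at a point of interest. The sketch below uses a synthetic ensemble of peak levels, not actual Delft3D output:

```python
# Illustrative post-processing of an ensemble of simulated peak water levels
# at a station: fraction of ensemble members exceeding a threshold.
# The levels are synthetic placeholders, not model output.
def exceedance_probability(levels, threshold):
    return sum(1 for z in levels if z > threshold) / len(levels)

peak_levels_m = [1.2, 1.5, 1.8, 2.1, 2.4, 2.6, 3.0, 3.3]  # toy ensemble (m)
p_flood = exceedance_probability(peak_levels_m, 2.5)
```

Repeating this for a grid of thresholds yields an empirical exceedance curve, which is the basic ingredient of a flooding hazard statement for the station.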

Relevance: 30.00%

Abstract:

Background: 30–40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed by using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25% and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death at hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors of mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
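Internal validation by bootstrapping, as used here, resamples patients with replacement and recomputes the performance measure to gauge its stability. A toy sketch on a synthetic vector of correct/incorrect classification flags, not the 116-patient cohort:

```python
# Hedged sketch of bootstrap validation of a model's accuracy: resample the
# per-patient "classified correctly" flags with replacement and report an
# empirical 95% interval. The flags below are a toy vector, not study data.
import random

def bootstrap_accuracy_ci(correct, n_rep=1000, seed=42):
    """correct: list of 0/1 flags, 1 = model classified the patient right."""
    rng = random.Random(seed)
    n = len(correct)
    accs = []
    for _ in range(n_rep):
        sample = [correct[rng.randrange(n)] for _ in range(n)]
        accs.append(sum(sample) / n)
    accs.sort()
    return accs[int(0.025 * n_rep)], accs[int(0.975 * n_rep)]

flags = [1] * 90 + [0] * 10  # toy cohort with 90% apparent accuracy
ci_low, ci_high = bootstrap_accuracy_ci(flags)
```

A narrow interval around the apparent accuracy suggests the estimate is stable; optimism-corrected bootstrap variants go further by refitting the model on each resample, which this sketch omits.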

Relevance: 30.00%

Abstract:

Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass-failure susceptibility at regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEMs) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper supplements the Slope Angle Distribution of the cliff unit with its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Thus, taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
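The normalized cumulative distribution function described here can be read as an empirical weight: a cell's susceptibility is the fraction of the cliff unit's slope angles that do not exceed the cell's own angle, so steeper cells score higher. A sketch on synthetic angles, not a real DEM:

```python
# Sketch of the proposed weighting: read a cell's susceptibility off the
# normalized (empirical) cumulative distribution of the cliff unit's slope
# angles. The angles below are synthetic, not DEM-derived.
def susceptibility_weight(cell_angle, cliff_angles):
    """Fraction of the cliff unit's slope angles not exceeding cell_angle."""
    below = sum(1 for a in cliff_angles if a <= cell_angle)
    return below / len(cliff_angles)

cliff = [35, 38, 40, 42, 45, 47, 50, 52, 55, 60]  # degrees, hypothetical unit
w_gentle = susceptibility_weight(40, cliff)
w_steep = susceptibility_weight(55, cliff)
```

After calibration against inventoried block releases, such weights can scale the relative frequencies that the runout model propagates downslope.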

Relevance: 30.00%

Abstract:

Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and on the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). Estimation of debris flow magnitude was omitted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used, together with land use, geology, and debris flow hazard initiation maps, as input to the Flow-R model, to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, combining automatic detection of the source areas with a simple assessment of debris flow spreading, provided results suitable for subsequent hazard and risk studies. However, more testing is needed to validate the parameters and results and to transfer them to other study areas.
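A multiple-flow-direction step of the kind used for the spreading probabilities distributes each cell's susceptibility among its downslope neighbours in proportion to a power of the slope gradient (Holmgren's exponent controls how divergent the flow is). The geometry and exponent below are illustrative assumptions, not Flow-R's exact parameterization:

```python
# Hedged sketch of a multiple-flow-direction spreading step: each downslope
# neighbour receives a share proportional to tan(slope) ** x, where dz/dx is
# tan(slope) and x is Holmgren's divergence exponent. Values are illustrative.
def mfd_weights(drops_dists, x=4.0):
    """drops_dists: (elevation drop, horizontal distance) per downslope
    neighbour; returns the fraction of flow routed to each."""
    raw = [(dz / dx) ** x for dz, dx in drops_dists]
    total = sum(raw)
    return [r / total for r in raw]

# three downslope neighbours: the steepest attracts most of the flow
weights = mfd_weights([(2.0, 10.0), (1.0, 10.0), (0.5, 10.0)])
```

With a large exponent the spreading converges toward a single steepest-descent path; smaller exponents give the broader, fan-like spreading probabilities shown in the second hazard map.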