993 results for Hazard Models


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a short history of laser scanner technologies used in the geosciences for imaging relief with high-resolution digital elevation models (HRDEMs) or 3D models. A general overview of light detection and ranging (LIDAR) techniques applied to landslides is given, followed by a review of different applications of LIDAR to landslides, rockfalls and debris flows. These applications are classified as: (1) detection and characterization of mass movements; (2) hazard assessment and susceptibility mapping; (3) modelling; (4) monitoring. This review emphasizes how LIDAR-derived HRDEMs can be used to investigate any type of landslide. It is clear that such HRDEMs are not yet a common tool in landslide investigations, but the technique has opened new domains of application that still have to be developed.

Relevance:

20.00%

Publisher:

Abstract:

In automobile insurance, it is useful to achieve a priori ratemaking by resorting to generalized linear models, and here the Poisson regression model constitutes the most widely accepted basis. However, insurance companies distinguish between claims with or without bodily injuries, or claims with full or partial liability of the insured driver. This paper examines an a priori ratemaking procedure when two different types of claim are included. When independence between claim types is assumed, the premium can be obtained by summing the premiums for each type of guarantee, and it depends on the rating factors chosen. If the independence assumption is relaxed, it is unclear how the tariff system might be affected. To answer this question, bivariate Poisson regression models, suitable for paired count data exhibiting correlation, are introduced. It is shown that the usual independence assumption is unrealistic here. These models are applied to an automobile insurance claims database containing 80,994 contracts belonging to a Spanish insurance company. Finally, the consequences for pure and loaded premiums when the independence assumption is relaxed by using a bivariate Poisson regression model are analysed.
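
As a hedged illustration of the kind of dependence a bivariate Poisson model captures, the sketch below simulates paired claim counts via the common-shock (trivariate reduction) construction, in which a shared Poisson component induces the correlation between the two claim types. All rates are illustrative and are not taken from the Spanish portfolio analysed in the paper.

```python
# Minimal sketch of the common-shock ("trivariate reduction") construction of a
# bivariate Poisson distribution for paired claim counts. Rates are illustrative.
import numpy as np

rng = np.random.default_rng(0)

lam1, lam2, lam0 = 0.08, 0.03, 0.01   # hypothetical rates per policy-year
n = 80_994                            # same order of magnitude as the paper's contracts

y0 = rng.poisson(lam0, n)             # common shock shared by both claim types
n1 = rng.poisson(lam1, n) + y0        # e.g. claims with bodily injury
n2 = rng.poisson(lam2, n) + y0        # e.g. claims with material damage only

print("mean N1:", n1.mean(), "expected:", lam1 + lam0)
print("mean N2:", n2.mean(), "expected:", lam2 + lam0)
print("cov(N1, N2):", np.cov(n1, n2)[0, 1], "expected:", lam0)
```

Ignoring the shared component (i.e. assuming independence) leaves the marginal means unchanged but misstates the joint behaviour of the two claim types, which is why the regression parameters, and hence the tariff, can differ once the correlation is modelled.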

Relevance:

20.00%

Publisher:

Abstract:

Background: Leptin is produced primarily by adipocytes. Although originally associated with the central regulation of satiety and energy metabolism, increasing evidence indicates that leptin may be an important factor in congestive heart failure (CHF). In this study, we aimed to test the hypothesis that leptin may influence CHF pathophysiology via a pathway of increasing body mass index (BMI). Methods: We studied 2,389 elderly participants aged 70 and older (men: 1,161; women: 1,228) without CHF and with serum leptin measures from the Health, Aging, and Body Composition study. We analyzed the association between serum leptin level and risk of incident CHF using Cox proportional hazards regression models. Elevated leptin level was defined as above the highest quartile (Q4) of the leptin distribution in the total sample for each gender. Adjusted covariates included demographic, behavioral, lipid and inflammation variables (partially adjusted models), with BMI added in a further step (fully adjusted models). Results: Over a mean 9-year follow-up, 316 participants (13.2%) developed CHF. The partially adjusted models indicated that men and women with elevated serum leptin levels (>=9.89 ng/ml in men and >=25 ng/ml in women) had significantly higher risks of developing CHF than those with leptin levels below Q4. The adjusted hazard ratios (95% CI) for incident CHF were 1.49 (1.04-2.13) in men and 1.71 (1.12-2.58) in women. However, these associations became non-significant after additional adjustment for BMI in each gender. The fully adjusted hazard ratios (95% CI) were 1.43 (0.94-2.18) in men and 1.24 (0.77-1.99) in women. Conclusion: Subjects with elevated leptin levels have a higher risk of CHF. The study supports the hypothesis that the influence of leptin level on risk of CHF may act through a pathway related to increasing BMI.
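
For readers who want to reproduce this style of analysis, the sketch below shows a partially and a fully adjusted Cox proportional hazards fit using the lifelines package. The file name and column names (time_to_chf, chf_event, elevated_leptin, bmi, and the placeholder covariates) are hypothetical stand-ins for the Health ABC variables described above, not the study's actual dataset.

```python
# Hedged sketch of a Cox proportional hazards fit with the lifelines package.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("health_abc_subset.csv")   # hypothetical analysis file

# Partially adjusted model: leptin indicator plus demographic/behaviour/lipid/
# inflammation covariates (here collapsed into a few placeholder columns).
partial = CoxPHFitter().fit(
    df[["time_to_chf", "chf_event", "elevated_leptin", "age", "smoker", "ldl", "crp"]],
    duration_col="time_to_chf",
    event_col="chf_event",
)
partial.print_summary()

# Fully adjusted model: add BMI and check whether the leptin hazard ratio attenuates.
full = CoxPHFitter().fit(
    df[["time_to_chf", "chf_event", "elevated_leptin", "age", "smoker", "ldl", "crp", "bmi"]],
    duration_col="time_to_chf",
    event_col="chf_event",
)
full.print_summary()
```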

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The goal of antiretroviral therapy (ART) is to reduce HIV-related morbidity and mortality by suppressing HIV replication. The prognostic value of persistent low-level viremia (LLV), particularly for clinical outcomes, is unknown. OBJECTIVE: To assess the association of different levels of LLV with virological failure, AIDS events, and death among HIV-infected patients receiving combination ART. METHODS: We analyzed data from 18 cohorts in Europe and North America contributing to the ART Cohort Collaboration. Eligible patients achieved a viral load below 50 copies/ml within 3-9 months after ART initiation. LLV50-199 was defined as two consecutive viral loads between 50 and 199 copies/ml, and LLV200-499 as two consecutive viral loads between 50 and 499 copies/ml with at least one between 200 and 499 copies/ml. We used Cox models to estimate the association of LLV with virological failure (two consecutive viral loads of at least 500 copies/ml, or one viral load of at least 500 copies/ml followed by a modification of ART) and AIDS events/death. RESULTS: Among 17,902 patients, 624 (3.5%) experienced LLV50-199 and 482 (2.7%) LLV200-499. Median follow-up was 2.3 and 3.1 years for virological and clinical outcomes, respectively. There were 1,903 virological failures, 532 AIDS events and 480 deaths. LLV200-499 was strongly associated with virological failure [adjusted hazard ratio (aHR) 3.97, 95% confidence interval (CI) 3.05-5.17]. LLV50-199 was weakly associated with virological failure (aHR 1.38, 95% CI 0.96-2.00). Neither LLV50-199 nor LLV200-499 was associated with AIDS events/death (aHR 1.19, 95% CI 0.78-1.82; and aHR 1.11, 95% CI 0.72-1.71, respectively). CONCLUSION: LLV200-499 was strongly associated with virological failure but not with AIDS events/death. Our results support the US guidelines, which define virological failure as a confirmed viral load above 200 copies/ml.
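
The sketch below is one literal reading of the two LLV definitions quoted above, applied to a patient's sequence of post-suppression viral loads. It is illustrative only and is not code from the ART Cohort Collaboration analysis; in particular, it labels a patient by the first qualifying pair of loads.

```python
# Illustrative helper applying the stated LLV definitions to consecutive viral
# loads (copies/ml). Hypothetical reading of the definitions, not study code.
def classify_llv(viral_loads):
    """Return 'LLV200-499', 'LLV50-199', or None for a list of consecutive loads."""
    for a, b in zip(viral_loads, viral_loads[1:]):
        if 50 <= a <= 499 and 50 <= b <= 499:
            if a >= 200 or b >= 200:       # at least one load in 200-499 copies/ml
                return "LLV200-499"
            return "LLV50-199"             # both loads in 50-199 copies/ml
    return None

print(classify_llv([40, 60, 180, 45]))   # -> 'LLV50-199'
print(classify_llv([40, 60, 250, 45]))   # -> 'LLV200-499'
print(classify_llv([40, 45, 30]))        # -> None
```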

Relevance:

20.00%

Publisher:

Abstract:

Difficult tracheal intubation assessment is an important research topic in anesthesia, as failed intubations are an important cause of mortality in anesthetic practice. The modified Mallampati score is widely used, alone or in conjunction with other criteria, to predict the difficulty of intubation. This work presents an automatic method to assess the modified Mallampati score from an image of a patient with the mouth wide open. For this purpose we propose a method based on active appearance models (AAM) and use linear support vector machines (SVM) to select a subset of relevant features obtained using the AAM. This feature selection step proves to be essential, as it drastically improves the classification performance, which is obtained using an SVM with RBF kernel and majority voting. We test our method on images of 100 patients undergoing elective surgery and achieve 97.9% accuracy in a leave-one-out cross-validation test, providing a key element of an automatic difficult intubation assessment system.
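
A hedged sketch of the classification stage only is given below, assuming the AAM fitting has already produced a feature matrix X and Mallampati labels y. It shows one plausible implementation of the linear-SVM feature selection step (here an L1-penalized linear SVM with SelectFromModel), followed by an RBF-kernel SVM, scored with leave-one-out cross-validation in scikit-learn; the majority-voting step over multiple AAM fits described in the paper is omitted, and the file names are hypothetical.

```python
# Classification-stage sketch: linear-SVM feature selection, RBF SVM, LOO CV.
import numpy as np
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, LinearSVC

X = np.load("aam_features.npy")   # hypothetical AAM appearance parameters
y = np.load("mallampati.npy")     # hypothetical modified Mallampati classes

clf = make_pipeline(
    StandardScaler(),
    SelectFromModel(LinearSVC(C=0.1, penalty="l1", dual=False)),  # sparse selector
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)

scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```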

Relevance:

20.00%

Publisher:

Abstract:

Study based on a research stay at the Institut National d'Histoire de l'Art - Bibliothèque Nationale de France between 1 and 31 July 2007. The work consisted of documentary research on the artistic relations between France and Catalonia in the Early Modern period. Two sets of materials or documentary collections were of interest. The first was the graphic collections of prints and engravings of the Bibliothèque Nationale, consulted not so much through the originals (although occasionally that as well) as through the vast photographic archives, which allowed the author to gather a good number of images that will in the future serve to relate French figurative culture (above all painting and sculpture, but also architectural treatises) to the Catalan culture of the period, whether to confirm similarities or to point out differences in the use of figurative models. The second was bibliographic material difficult to locate in Catalonia, which was systematically examined: publications on engraving, on the one hand, and on artistic heritage, on the other. The latter was approached from two angles: following reports of the presence of Catalan artists in France and vice versa, and searching for data on the plundering of works of art carried out in Catalonia during the Napoleonic period.

Relevance:

20.00%

Publisher:

Abstract:

This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis), and, further, one that provides a statistical explanation of efficiency as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the 'single-stage procedure') with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as dependent on capital investment in three major areas (viz. land, machinery and office appliances), where land is intended to proxy the effects of unproductive, speculative capital investment, and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive production in manufactures between 1998 and 2002.
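
As a generic illustration of the favoured parametric approach, the sketch below estimates a normal/half-normal stochastic production frontier by maximum likelihood on simulated data. It is not the paper's single-stage inefficiency-effects specification (in which inefficiency depends on the three capital variables), and all parameter values are arbitrary.

```python
# Minimal stochastic production frontier (normal/half-normal composed error),
# estimated by maximum likelihood with scipy on simulated data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import halfnorm, norm

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # constant + 2 log inputs
beta_true = np.array([1.0, 0.6, 0.3])
v = rng.normal(scale=0.2, size=n)                            # symmetric noise
u = halfnorm.rvs(scale=0.3, size=n, random_state=rng)        # inefficiency >= 0
y = X @ beta_true + v - u                                    # log output

def neg_loglik(theta):
    beta, ln_sv, ln_su = theta[:3], theta[3], theta[4]
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.hypot(sv, su)                                 # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = y - X @ beta
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.r_[np.zeros(3), np.log(0.1), np.log(0.1)], method="BFGS")
print("beta estimates:", res.x[:3])
print("sigma_v, sigma_u:", np.exp(res.x[3]), np.exp(res.x[4]))
```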

Relevance:

20.00%

Publisher:

Abstract:

Block factor methods offer an attractive approach to forecasting with many predictors. They extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. It is therefore desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
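
The core of dynamic model averaging can be sketched as a simple recursion on model probabilities. The version below follows the forgetting-factor form popularized by Raftery and co-authors and used by Koop and Korobilis; the per-model predictive densities and point forecasts are assumed to come from K separately estimated forecasting models, and are not computed here.

```python
# Model-probability recursion for dynamic model averaging / selection.
import numpy as np

def dma_weights(pred_dens, alpha=0.99):
    """pred_dens: (T, K) one-step predictive densities p_k(y_t | y^{t-1}).
    Returns (T, K) predicted model probabilities pi_{t|t-1,k}."""
    T, K = pred_dens.shape
    post = np.full(K, 1.0 / K)              # pi_{0|0}: equal initial weights
    pred_weights = np.empty((T, K))
    for t in range(T):
        prior = post ** alpha               # forgetting step
        prior /= prior.sum()
        pred_weights[t] = prior
        post = prior * pred_dens[t]         # Bayesian update with the realised y_t
        post /= post.sum()
    return pred_weights

# Usage (shapes only): DMA forecast = weight-averaged point forecasts,
# DMS = forecast of the single model with the largest predicted weight at each t.
# y_hat_dma = (dma_weights(dens) * point_forecasts).sum(axis=1)
# y_hat_dms = point_forecasts[np.arange(T), dma_weights(dens).argmax(axis=1)]
```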

Relevance:

20.00%

Publisher:

Abstract:

Report on the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. The link between physical and biogeochemical processes in both models is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best fits the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual-year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared with remote sensing chlorophyll images obtained from the MERIS sensor.
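
As a rough illustration of the NPZD structure shared by such biogeochemical models, the sketch below integrates a generic four-compartment nitrogen model (nutrient, phytoplankton, zooplankton, detritus) with scipy. It is not the BLANES or BIOMEC formulation: the parameter values are arbitrary and the physical part of the coupling (advection-diffusion of the tracers by the circulation model) is omitted.

```python
# Generic zero-dimensional NPZD sketch; nitrogen is conserved across compartments.
from scipy.integrate import solve_ivp

def npzd(t, y, Vm=1.0, kN=0.5, g=0.6, kP=0.3, beta=0.7, mP=0.05, mZ=0.05, r=0.1):
    N, P, Z, D = y
    uptake = Vm * N / (kN + N) * P            # nutrient-limited phytoplankton growth
    graz = g * P / (kP + P) * Z               # Holling type-II grazing
    dN = -uptake + r * D                      # remineralization returns N
    dP = uptake - graz - mP * P
    dZ = beta * graz - mZ * Z                 # assimilated fraction of grazing
    dD = (1 - beta) * graz + mP * P + mZ * Z - r * D
    return [dN, dP, dZ, dD]

sol = solve_ivp(npzd, (0, 120), [4.0, 0.2, 0.05, 0.0], dense_output=True)
print("total nitrogen start/end:", sum(sol.y[:, 0]), sum(sol.y[:, -1]))  # conserved
```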

Relevance:

20.00%

Publisher:

Abstract:

This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
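
As a toy illustration of the prior hierarchy behind stochastic search variable selection, the sketch below runs a spike-and-slab Gibbs sampler for an ordinary linear regression on simulated data. The extension to VECMs and to restrictions on the cointegration space developed in the paper is not attempted here, and the hyperparameters are arbitrary.

```python
# Spike-and-slab (George-and-McCulloch-style) Gibbs sampler for a toy regression.
import numpy as np
from scipy.stats import invgamma, norm

rng = np.random.default_rng(2)
n, k = 200, 8
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, -1.0, 0.8] + [0.0] * (k - 3))
y = X @ beta_true + rng.normal(scale=0.5, size=n)

tau0, tau1, q = 0.01, 2.0, 0.5          # spike sd, slab sd, prior inclusion prob
beta, gamma, sig2 = np.zeros(k), np.ones(k, dtype=int), 1.0
incl = np.zeros(k)
draws = 2000

for it in range(draws):
    # beta | gamma, sig2: conjugate multivariate normal draw
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1**2, tau0**2))
    V = np.linalg.inv(X.T @ X / sig2 + D_inv)
    m = V @ X.T @ y / sig2
    beta = rng.multivariate_normal(m, V)
    # gamma_j | beta_j: Bernoulli with slab-vs-spike odds
    p1 = q * norm.pdf(beta, scale=tau1)
    p0 = (1 - q) * norm.pdf(beta, scale=tau0)
    gamma = (rng.uniform(size=k) < p1 / (p1 + p0)).astype(int)
    # sig2 | beta: inverse gamma with a vague prior
    resid = y - X @ beta
    sig2 = invgamma.rvs(a=0.01 + n / 2, scale=0.01 + resid @ resid / 2, random_state=rng)
    if it >= draws // 2:
        incl += gamma                    # accumulate inclusion indicators after burn-in

print("posterior inclusion probabilities:", incl / (draws - draws // 2))
```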

Relevance:

20.00%

Publisher:

Abstract:

1. Model-based approaches have been used increasingly in conservation biology over recent years. Species presence data used for predictive species distribution modelling are abundant in natural history collections, whereas reliable absence data are sparse, most notably for vagrant species such as butterflies and snakes. As predictive methods such as generalized linear models (GLM) require absence data, various strategies have been proposed to select pseudo-absence data. However, only a few studies exist that compare different approaches to generating these pseudo-absence data. 2. Natural history collection data are usually available for long periods of time (decades or even centuries), thus allowing historical considerations. However, this historical dimension has rarely been assessed in studies of species distribution, although there is great potential for understanding current patterns, i.e. the past is the key to the present. 3. We used GLM to model the distributions of three 'target' butterfly species, Melitaea didyma, Coenonympha tullia and Maculinea teleius, in Switzerland. We developed and compared four strategies for defining pools of pseudo-absence data and applied them to natural history collection data from the last 10, 30 and 100 years. Pools included: (i) sites without target species records; (ii) sites where butterfly species other than the target species were present; (iii) sites without butterfly species but with habitat characteristics similar to those required by the target species; and (iv) a combination of the second and third strategies. Models were evaluated and compared by the total deviance explained, the maximized Kappa and the area under the curve (AUC). 4. Among the four strategies, model performance was best for strategy 3. Contrary to expectations, strategy 2 resulted in even lower model performance than models with pseudo-absence data simulated totally at random (strategy 1). 5. Independent of the strategy, model performance was enhanced when sites with historical species presence data were not considered as pseudo-absence data. Accordingly, the combination of strategy 3 with species records from the last 100 years achieved the highest model performance. 6. Synthesis and applications. The protection of suitable habitat for species survival or reintroduction in rapidly changing landscapes is a high priority among conservationists. Model-based approaches offer planning authorities the possibility of delimiting priority areas for species detection or habitat protection. The performance of these models can be enhanced by fitting them with pseudo-absence data drawn from large archives of natural history collection presence data rather than with randomly sampled pseudo-absence data.
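
A hedged sketch of the simplest pseudo-absence strategy (strategy 1: random sites without target-species records) feeding a binomial GLM and an AUC evaluation is given below. The file name, species column and predictor names are hypothetical placeholders for the natural history collection records and environmental layers used in the paper, and the AUC here is computed in-sample for brevity.

```python
# Pseudo-absence sampling (strategy 1) plus a binomial GLM with statsmodels.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

sites = pd.read_csv("sites_with_predictors.csv")     # hypothetical site table
presences = sites[sites["melitaea_didyma"] == 1]

# Strategy 1: draw pseudo-absences at random from sites with no target record.
candidates = sites[sites["melitaea_didyma"] == 0]
pseudo_abs = candidates.sample(n=len(presences), random_state=0)

data = pd.concat([presences, pseudo_abs])
model = smf.glm("melitaea_didyma ~ elevation + temperature + wetness",
                data=data, family=sm.families.Binomial()).fit()

auc = roc_auc_score(data["melitaea_didyma"], model.predict(data))
print(model.summary())
print("in-sample AUC:", auc)
```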