983 results for Penalized likelihood


Relevance:

10.00%

Publisher:

Abstract:

In this paper we review level models of interest rates in Chile. In addition to the traditional level models of Chan, Karolyi, Longstaff and Sanders (1992) for the US and Parisi (1998) for Chile, estimated by maximum likelihood, we allow the conditional volatility to incorporate unexpected information shocks (the GARCH model) and also to be a function of the level of the interest rate (the TVP-LEVEL model), as in Brenner, Harjes and Kroner (1996). To this end we use yields on recognition bonds instead of the average monthly PDBC auction yields, enlarging the size and frequency of the sample to four weekly yield series with different terms to maturity: 1, 5, 10 and 15 years. The main results of the study can be summarized as follows: the volatility of unexpected rate changes depends positively on the level of rates, above all in the TVP-LEVEL model. We obtain evidence of mean reversion, such that interest rate increments were not independent, contrary to the US results of Brenner et al. The LEVEL models are not able to fit volatility adequately in comparison with a GARCH(1,1) model, and finally, the TVP-LEVEL model does not improve on the results of the GARCH(1,1) model.
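The TVP-LEVEL specification described above can be sketched as a short simulation. This is only an illustration of the Brenner-Harjes-Kroner form, in which a GARCH(1,1) variance is scaled by the lagged rate raised to 2*gamma; all parameter values (a0, a1, b1, gamma, kappa, theta) are invented, not the estimates from the paper.

```python
import numpy as np

# Hypothetical parameters for a TVP-LEVEL specification in the spirit of
# Brenner, Harjes and Kroner (1996).
rng = np.random.default_rng(0)
a0, a1, b1 = 1e-6, 0.10, 0.85    # GARCH(1,1) coefficients (assumed)
gamma = 0.5                      # level-effect exponent (assumed)
kappa, theta = 0.1, 0.05         # mean-reversion speed and long-run rate (assumed)

T = 500
r = np.empty(T)
h = np.empty(T)
r[0], h[0] = theta, a0 / (1 - a1 - b1)
eps_prev = 0.0
for t in range(1, T):
    # GARCH(1,1) component of the conditional variance
    h[t] = a0 + a1 * eps_prev**2 + b1 * h[t - 1]
    # total variance scales with the level of the rate (floored for safety)
    lvl = max(r[t - 1], 1e-8)
    var_t = h[t] * lvl ** (2 * gamma)
    eps = np.sqrt(var_t) * rng.standard_normal()
    # mean-reverting drift, consistent with the evidence reported above
    r[t] = r[t - 1] + kappa * (theta - r[t - 1]) + eps
    eps_prev = eps

print(f"mean simulated rate: {r.mean():.3f}")
```

Setting gamma = 0 recovers a pure GARCH(1,1) model, which is the comparison the paper makes.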


One of the challenges facing the winter maintainer is how much chemical to apply to the road under given conditions. Insufficient chemical can lead to the road surface becoming slick, and the road thus becoming unsafe. In all likelihood, additional applications will have to be made, requiring additional effort and use of resources. However, too much chemical can also be bad. While an excess of chemical will ensure (in most circumstances) that a safe road condition is achieved, it may also result in a substantial waste of chemical (with associated costs) and in ancillary damage to the road itself and to the surrounding environment. Ideally, one should apply what might be termed the "goldilocks" amount of chemical to the road: not too much, and not too little, but just right. Of course, the reality of winter maintenance makes achieving the "goldilocks" application rate somewhat of a fairy tale. In the midst of a severe storm, when conditions are poor and getting worse, the last thing on a plow operator's mind is a minute adjustment in the amount of chemical being applied to the road. However, there may be considerable benefit and substantial savings if chemical applications can be optimized to some degree, so that wastage is minimized without compromising safety. The goal of this study was to begin to develop such information through a series of laboratory studies in which the force needed to scrape ice from concrete blocks was measured under a variety of chemical application conditions.


Purpose This study aimed to identify self-perception variables which may predict return to work (RTW) in orthopedic trauma patients 2 years after rehabilitation. Methods A prospective cohort study followed 1,207 orthopedic trauma inpatients hospitalised in rehabilitation clinics, with assessments at admission, discharge, and 2 years after discharge. Information on potential predictors was obtained from self-administered questionnaires. Multiple logistic regression models were applied. Results In the final model, a higher likelihood of RTW was predicted by: better general health and lower pain at admission; health and pain improvements during hospitalisation; a lower Impact of Event Scale-Revised (IES-R) avoidance score; a higher IES-R hyperarousal score; a higher SF-36 mental score; and lower perceived severity of the injury. Conclusion RTW is predicted not only by perceived health, pain and severity of the accident at the beginning of a rehabilitation program, but also by the changes in pain and health perceptions observed during hospitalisation.


OBJECTIVE: To systematically review and meta-analyze published data about the diagnostic performance of Fluorine-18-Fluorodeoxyglucose ((18)F-FDG) positron emission tomography (PET) and PET/computed tomography (PET/CT) in the assessment of pleural abnormalities in cancer patients. METHODS: A comprehensive literature search of studies published through June 2013 regarding the role of (18)F-FDG-PET and PET/CT in evaluating pleural abnormalities in cancer patients was performed. All retrieved studies were reviewed and qualitatively analyzed. Pooled sensitivity, specificity, positive and negative likelihood ratio (LR+ and LR-) and diagnostic odds ratio (DOR) of (18)F-FDG-PET or PET/CT on a per-patient-based analysis were calculated. The area under the summary ROC curve (AUC) was calculated to measure the accuracy of these methods in the assessment of pleural abnormalities. Sub-analyses considering (18)F-FDG-PET/CT only and patients with lung cancer only were carried out. RESULTS: Eight studies comprising 360 cancer patients (323 with lung cancer) were included. The meta-analysis of these selected studies provided the following results: sensitivity 86% [95% confidence interval (95%CI): 80-91%], specificity 80% [95%CI: 73-85%], LR+ 3.7 [95%CI: 2.8-4.9], LR- 0.18 [95%CI: 0.09-0.34], DOR 27 [95%CI: 13-56]. The AUC was 0.907. No significant improvement was found when considering PET/CT studies only or patients with lung cancer only. CONCLUSIONS: (18)F-FDG-PET and PET/CT proved to be useful diagnostic imaging methods in the assessment of pleural abnormalities in cancer patients; nevertheless, possible sources of false-negative and false-positive results should be kept in mind. The literature focusing on the use of (18)F-FDG-PET and PET/CT in this setting remains limited and prospective studies are needed.
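As a sanity check on how these summary measures relate, the likelihood ratios and DOR implied by a given sensitivity and specificity can be computed directly. Note that because the meta-analysis pools each metric separately, the published LR+ of 3.7 and DOR of 27 need not equal the values implied by the pooled sensitivity of 86% and specificity of 80%:

```python
# Likelihood ratios and diagnostic odds ratio implied by a sensitivity and
# specificity; only the definitions, no study data beyond the pooled estimates.
def diagnostic_metrics(sens, spec):
    lr_pos = sens / (1 - spec)      # LR+: how much a positive scan raises the odds
    lr_neg = (1 - sens) / spec      # LR-: how much a negative scan lowers the odds
    dor = lr_pos / lr_neg           # diagnostic odds ratio
    return lr_pos, lr_neg, dor

lr_pos, lr_neg, dor = diagnostic_metrics(0.86, 0.80)
print(lr_pos, lr_neg, dor)  # approximately 4.3, 0.18, 24.6
```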


The genomic era has revealed that the large repertoire of observed animal phenotypes is dependent on changes in the expression patterns of a finite number of genes, which are mediated by a plethora of transcription factors (TFs) with distinct specificities. The dimerization of TFs can also increase the complexity of a genetic regulatory network manifold by combining a small number of monomers into dimers with distinct functions. Therefore, studying the evolution of these dimerizing TFs is vital for understanding how complexity increased during animal evolution. We focus on the second largest family of dimerizing TFs, the basic-region leucine zipper (bZIP), and infer when it expanded and how bZIP DNA-binding and dimerization functions evolved during the major phases of animal evolution. Specifically, we classify the metazoan bZIPs into 19 families and confirm the ancient nature of at least 13 of these families, predating the split of the Cnidaria. We observe fixation of a core dimerization network in the last common ancestor of protostomes-deuterostomes. This was followed by an expansion of the number of proteins in the network, but no major dimerization changes in interaction partners, during the emergence of vertebrates. In conclusion, the bZIPs are an excellent model with which to understand how DNA binding and protein interactions of TFs evolved during animal evolution.


Modeling concentration-response functions became extremely popular in ecotoxicology during the last decade. Indeed, modeling allows determining the full response pattern of a given substance. However, reliable modeling is data-intensive, which conflicts with the current trend in ecotoxicology of reducing, for cost and ethical reasons, the amount of data produced during an experiment. It is therefore crucial to choose experimental designs in a cost-effective manner. In this paper, we propose using the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs for estimating the toxicity of the herbicide dinoseb to daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and is related to their meaning, i.e. the optimal concentrations are located close to the parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. To reduce the experimental cost and the use of test organisms, especially in long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of preliminary experiments.
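A minimal numerical sketch of the idea, assuming a two-parameter log-logistic concentration-response curve with hypothetical nominal values (not the dinoseb estimates) and equal weight on each support point; the D-criterion is the determinant of the Fisher information built from gradients of the model function:

```python
import numpy as np
from itertools import combinations

# Two-parameter log-logistic curve f(x) = 1 / (1 + (x / ec50)**b).
# Nominal values standing in for prior knowledge (purely illustrative):
ec50, b = 1.0, 2.0

def gradient(x):
    # partial derivatives of f with respect to (ec50, b) at the nominal values
    r = (x / ec50) ** b
    f = 1.0 / (1.0 + r)
    df_dec50 = (b / ec50) * r * f**2
    df_db = -np.log(x / ec50) * r * f**2
    return np.array([df_dec50, df_db])

def d_criterion(design):
    # Fisher information for i.i.d. Gaussian errors: sum of outer products
    F = sum(np.outer(g, g) for g in map(gradient, design))
    return np.linalg.det(F)

# brute-force search over two-point designs on a log-spaced candidate grid
candidates = np.geomspace(0.01, 100, 200)
best = max(combinations(candidates, 2), key=d_criterion)
print(best)
```

Consistent with the abstract, the search returns as many support points as parameters, and the optimal concentrations bracket the nominal EC50.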


The likelihood of significant exposure to drugs in infants through breast milk is poorly defined, given the difficulties of conducting pharmacokinetic (PK) studies. Using fluoxetine (FX) as an example, we conducted a proof-of-principle study applying population PK (popPK) modeling and simulation to estimate drug exposure in infants through breast milk. We simulated data for 1,000 mother-infant pairs, assuming conservatively that the FX clearance in an infant is 20% of the allometrically adjusted value in adults. The model-generated estimate of the milk-to-plasma ratio for FX (mean: 0.59) was consistent with those reported in other studies. The median infant-to-mother ratio of FX steady-state plasma concentrations predicted by the simulation was 8.5%. Although the disposition of the active metabolite, norfluoxetine, could not be modeled, popPK-informed simulation may be valid for other drugs, particularly those without active metabolites, thereby providing a practical alternative to conventional PK studies for exposure risk assessment in this population.
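A deliberately simplified version of the simulation logic, with every number invented for illustration (log-normal maternal concentrations, a fixed milk intake, and the study's conservative assumption that infant clearance is 20% of the allometrically adjusted adult value); it will not reproduce the study's 8.5% estimate, which came from a fitted popPK model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
c_mother = rng.lognormal(np.log(100.0), 0.4, n)   # maternal Css, ng/mL (assumed)
mp_ratio = rng.lognormal(np.log(0.59), 0.3, n)    # milk-to-plasma ratio (study mean)
milk_intake = 0.15                                 # L/kg/day infant milk intake (assumed)
f_oral = 1.0                                       # assume complete oral absorption
cl_adult = 0.6                                     # adult clearance, L/h/kg (assumed)
cl_infant = 0.2 * cl_adult                         # the study's conservative 20% assumption

# drug delivered via milk per kg of infant per hour
dose_rate = c_mother * mp_ratio * milk_intake * f_oral / 24.0
# steady state: concentration = input rate / clearance
c_infant = dose_rate / cl_infant

ratio = np.median(c_infant / c_mother) * 100
print(f"median infant-to-mother ratio: {ratio:.1f}%")
```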


OBJECTIVE: This study sought to determine the prevalence of transactional sex among university students in Uganda and to assess the possible relationship between transactional sex and sexual coercion, physical violence, mental health, and alcohol use. METHODS: In 2010, 1954 undergraduate students at a Ugandan university responded to a self-administered questionnaire that assessed mental health, substance use, physical violence and sexual behaviors including sexual coercion and transactional sex. The prevalence of transactional sex was assessed and logistic regression analysis was performed to measure the associations between various risk factors and reporting transactional sex. RESULTS: Approximately 25% of the study sample reported having taken part in transactional sex, with more women reporting having accepted money, gifts or other compensation for sex, and more men reporting having paid, given a gift or otherwise compensated for sex. Sexual coercion in men and women was significantly associated with having accepted money, gifts or other compensation for sex. Men who were victims of physical violence in the last 12 months had a higher probability of having accepted money, gifts or other compensation for sex than other men. Women who were victims of sexual coercion reported a greater likelihood of having paid, given a gift or otherwise compensated for sex. Respondents who had been victims of physical violence in the last 12 months, engaged in heavy episodic drinking or had poor mental health status were more likely to have paid, given a gift or otherwise compensated for sex. CONCLUSIONS: University students in Uganda are at high risk of transactional sex. Young men and women may be equally vulnerable to the risks and consequences of transactional sex and should be included in program initiatives to prevent transactional sex.
The role of sexual coercion, physical violence, mental health, and alcohol use should be considered when designing interventions for countering transactional sex.


The literature dealing with the interpretation of results of examinations performed on "printed" documents is very limited. The absence of published literature reflects the absence of formal guidelines to help scientists assess the relationship between a questioned document and a particular printing technology. Generally, every printout, independent of the printing technology, may bear traces induced by characteristics of manufacture and/or acquired features of the printing device. A logical approach to help the scientist in the formal interpretation of such findings involves the consideration of a likelihood ratio. Three examples aim to show the application of this approach.
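The likelihood-ratio reasoning sketched above can be stated in a few lines. The feature and its frequencies below are invented for illustration only:

```python
# Minimal likelihood-ratio sketch for linking a printout to a printing
# technology. All figures are hypothetical.
def likelihood_ratio(p_evidence_given_h1, p_evidence_given_h2):
    """LR = P(E | H1) / P(E | H2)."""
    return p_evidence_given_h1 / p_evidence_given_h2

# H1: the questioned document was produced by a laser printer.
# H2: it was produced by some other printing technology.
# Suppose the observed fusing trace appears on 95% of laser printouts but
# on only 2% of other printouts (invented frequencies).
lr = likelihood_ratio(0.95, 0.02)
print(lr)  # approximately 47.5: the evidence supports H1 over H2
```

An LR above 1 supports H1, below 1 supports H2; the further from 1, the stronger the support.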


Red light running (RLR) is a problem in the US that results in 165,000 injuries and 907 fatalities annually. In Iowa, RLR-related crashes make up 24.5 percent of all crashes and account for 31.7 percent of fatal and major-injury crashes at signalized intersections. RLR crashes are a safety concern due to the increased likelihood of injury compared to other types of crashes. One tool used to combat red light running is automated enforcement in the form of RLR cameras. Automated enforcement, while effective, is often controversial. Cedar Rapids, Iowa installed RLR and speeding cameras at seven intersections across the city. The intersections were chosen based on crash rates and on whether cameras could feasibly be placed at the intersection approaches. Camera installation began in February 2010, with the last camera becoming operational in December 2010. An analysis of the effect of the cameras on safety at these intersections was deemed prudent to help justify the installation and establish the effectiveness of the cameras. The objective of this research was to assess the safety effectiveness of the RLR program that has been implemented in Cedar Rapids. This was accomplished by analyzing data to determine changes in the following metrics:

- Reductions in red light violation rates, based on overall changes, time-of-day changes, and changes by lane
- Effectiveness of the cameras over time
- Time at which those running the red light enter the intersection
- Changes in the average headway between vehicles entering the intersection


Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting using solid-phase microextraction (SPME) followed by gas chromatography (GC) has been proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks which render them inadequate for assessing the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was thus developed and applied to a hypothetical scenario in which alternative hypotheses about the discharge time of a spent cartridge found at a crime scene were put forward. To estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
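A toy numerical version of this chain, with invented calibration data standing in for the published measurements: fit an exponential decay of a volatile compound against time since discharge by log-linear least squares, then compare two discharge-time hypotheses for a new measurement via a Gaussian likelihood ratio:

```python
import numpy as np

# Invented calibration data: peak area of a volatile GSR compound versus
# hours since discharge.
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 24.0])
y = np.array([9.1, 7.8, 5.9, 3.6, 1.4, 0.7])

# Fit y = A * exp(-k * t) by linearising: log y = log A - k * t
slope, intercept = np.polyfit(t, np.log(y), 1)
k, A = -slope, np.exp(intercept)

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# New measurement from the questioned cartridge, with an assumed
# measurement standard deviation.
y_obs, sigma = 4.0, 1.0
lr = (gaussian(y_obs, A * np.exp(-k * 5.0), sigma)      # Hp: fired ~5 h ago
      / gaussian(y_obs, A * np.exp(-k * 20.0), sigma))  # Hd: fired ~20 h ago
print(lr > 1)  # the measurement supports the 5-hour hypothesis here
```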


Background: In Switzerland, socio-demographic and behavioural factors are associated with obesity, but no study has ever assessed their impact on weight gain using prospective data. Methods: Data from 4,469 participants (53.0% women), aged 35 to 75 years at baseline and followed for 5.5 years. Weight gain was considered either as a rate (kg/year) or as gaining ≥5 kg during the study period. Results: The rate of weight gain was lower among participants who were older (mean ± standard deviation: 0.46 ± 0.92, 0.33 ± 0.88, 0.21 ± 0.86 and 0.06 ± 0.74 kg/year in participants aged [35-45[, [45-55[, [55-65[ and [65+ years, respectively, P < 0.001), physically active (0.27 ± 0.82 vs. 0.35 ± 0.95 kg/year for sedentary, P < 0.005) or living in a couple (0.29 ± 0.84 vs. 0.35 ± 0.96 kg/year for living single, P < 0.05), and higher among current smokers (0.41 ± 0.97, 0.26 ± 0.84 and 0.29 ± 0.85 kg/year for current, former and never smokers, respectively, P < 0.001). These findings were further confirmed by multivariable analysis. Multivariable logistic regression showed that receiving social help, being a current smoker or being obese increased the likelihood of gaining ≥5 kg: odds ratio (OR) and 95% confidence interval (CI) 1.43 (1.16-1.77), 1.63 (1.35-1.95) and 1.95 (1.57-2.43), respectively, while living in a couple or being physically active decreased the risk: 0.73 (0.62-0.86) and 0.72 (0.62-0.83), respectively. No association was found between weight gain and gender, being born in Switzerland, or education. Conclusions: In Switzerland, financial difficulties (indicated by receiving social help) and current smoking were associated with increases in body weight over the 5.5-year follow-up. Living in a couple, being older or being physically active were protective against weight gain.
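Odds ratios and confidence intervals like those quoted above come from exponentiating logistic-regression coefficients. A sketch with an illustrative coefficient and standard error chosen so the result lands close to the reported smoking OR of 1.63 (1.35-1.95):

```python
import math

# OR = exp(beta); 95% CI = exp(beta ± 1.96 * SE), on the log-odds scale.
def odds_ratio_ci(beta, se, z=1.96):
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient and standard error for current smoking,
# not the study's fitted values.
or_, lo, hi = odds_ratio_ci(beta=0.489, se=0.093)
print(f"OR {or_:.2f} ({lo:.2f}-{hi:.2f})")
```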


In this paper, we develop a data-driven methodology to characterize the likelihood of orographic precipitation enhancement using sequences of weather radar images and a digital elevation model (DEM). Geographical locations with topographic characteristics favorable to repeatable and persistent orographic precipitation, such as stationary cells, upslope rainfall enhancement, and repeated convective initiation, are detected by analyzing the spatial distribution of a set of precipitation cells extracted from radar imagery. Topographic features such as terrain convexity and gradients computed from the DEM at multiple spatial scales, as well as velocity fields estimated from sequences of weather radar images, are used as explanatory factors to describe the occurrence of localized precipitation enhancement. The latter is represented as a binary process by defining a threshold on the number of cell occurrences at particular locations. Both two-class and one-class support vector machine classifiers are tested to separate the presumed orographic cells from the nonorographic ones in the space of contributing topographic and flow features. Site-based validation is carried out to estimate realistic generalization skills of the obtained spatial prediction models. Due to the high class separability, the decision function of the classifiers can be interpreted as a likelihood or susceptibility of orographic precipitation enhancement. The developed approach can serve as a basis for refining radar-based quantitative precipitation estimates and short-term forecasts, or for generating stochastic precipitation ensembles conditioned on the local topography.
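A toy two-class version of the classification step, using a linear SVM trained by batch subgradient descent on synthetic features (terrain convexity, upslope-flow strength); this stands in for the paper's kernel SVM classifiers purely to keep the sketch dependency-light, and all data are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic feature vectors: (terrain convexity, upslope-flow strength)
orographic = rng.normal([1.0, 0.8], 0.3, size=(200, 2))       # presumed orographic cells
non_orographic = rng.normal([-1.0, -0.8], 0.3, size=(200, 2))
X = np.vstack([orographic, non_orographic])
y = np.hstack([np.ones(200), -np.ones(200)])

# linear SVM via batch subgradient descent on the regularized hinge loss
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(200):
    margins = y * (X @ w + b)
    viol = margins < 1                                  # margin-violating points
    grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X)
    grad_b = -y[viol].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

# the signed decision value acts as a susceptibility score, as in the paper
score = X @ w + b
accuracy = np.mean(np.sign(score) == y)
print(accuracy)
```

With well-separated classes the decision value orders locations by how orographic-like their features are, which is what makes the susceptibility interpretation possible.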


Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as past claims. There are two main sources of variability in the claims development process: the variability of the speed with which claims are settled, and the variability in claims severity between accident years. Large changes in these processes will generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which, firstly, identifies and quantifies these two influences and, secondly, determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of the variability of the reserve estimates. The first model (PDM) combines a Dirichlet-Multinomial conjugate family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). It was found that the second model captures the variability of the settlement speed and of the development of claims severity as a function of two of these distributions' parameters: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the method of moments and by maximum likelihood. The results were tested using simulated data and then real data originating from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data include different development patterns and specificities. The outcome of the thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method is appropriate for the claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expected future payments and hence high claims reserve estimates. Negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few claims remain expected subsequently. The extreme case arises when all claims are reported at the same time, leading to expected future payments of zero or equal to the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
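Since the Chain-Ladder method is the benchmark throughout, here is a minimal version on an invented cumulative claims triangle; the volume-weighted link ratios encode exactly the positive-correlation situation described above, where high cumulated payments imply high expected future payments:

```python
import numpy as np

# Toy cumulative claims triangle: rows = accident years,
# columns = development years; NaN marks the unobserved future.
tri = np.array([
    [100.0, 150.0, 175.0, 180.0],
    [110.0, 168.0, 196.0, np.nan],
    [120.0, 180.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

n = tri.shape[1]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    # volume-weighted link ratio between development years j and j + 1
    factors.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

# project each accident year to ultimate by chaining the factors
completed = tri.copy()
for i in range(1, n):
    for j in range(n - i, n):
        completed[i, j] = completed[i, j - 1] * factors[j - 1]

# reserve = projected ultimate minus the latest observed diagonal
latest = np.array([tri[i, n - 1 - i] for i in range(n)])
reserves = completed[:, -1] - latest
print(np.round(reserves, 1))
```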