31 results for Multinomial Logit
Abstract:
Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the claims development process: the variability of the speed with which claims are settled and the variability in claims severity between accident years. Large changes in these processes generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator that, first, identifies and quantifies these two influences and, second, determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of variability of the reserve estimates. The first model (PDM) combines the Dirichlet-Multinomial conjugate family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). The second model was found to express the variability in the speed of the claims reporting process and in the development of claims severity as a function of two of these distributions' parameters: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the method of moments and maximum likelihood. The results were tested on simulated data and then on real data from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data include different developments and specificities. The thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method as appropriate for claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expected future payments and thus high claims reserve estimates. Negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few claims are expected subsequently. The extreme case arises when all claims are reported at the same time, leading to expected future payments of zero or equal to the aggregated amount of the ultimate paid claims. In this latter case, the Chain-Ladder method is not recommended.
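As a minimal, illustrative sketch of the NBDM structure described above — Gamma-distributed ultimate amounts combined with a Poisson claim count and a Dirichlet-Multinomial development pattern — the Python snippet below simulates a small run-off triangle. All parameter values, array shapes and names are assumptions for illustration only, not quantities taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

n_years, n_dev = 5, 5                   # accident years and development periods (illustrative)
gamma_shape, gamma_rate = 3.0, 0.002    # Gamma prior on the ultimate mean (assumed values)
dirichlet_param = np.full(n_dev, 2.0)   # Dirichlet parameter of the development pattern (assumed)

triangle = np.zeros((n_years, n_dev))
for i in range(n_years):
    # Poisson-Gamma pair: the ultimate claim count of accident year i is Poisson
    # with a Gamma-distributed mean.
    ultimate_mean = rng.gamma(gamma_shape, 1.0 / gamma_rate)
    ultimate = rng.poisson(ultimate_mean)
    # Dirichlet-Multinomial pair: the ultimate is split over development periods
    # according to Dirichlet-distributed proportions.
    pattern = rng.dirichlet(dirichlet_param)
    triangle[i] = rng.multinomial(ultimate, pattern)

print(np.cumsum(triangle, axis=1))      # cumulative development triangle
```

On such simulated triangles, comparing the Dirichlet parameter with the Gamma shape parameter indicates the sign of the correlation between past and future payments and hence, under the model, whether the Chain-Ladder method is appropriate.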
Abstract:
Aims: To describe the drinking patterns and their baseline predictive factors during a 12-month period after an initial evaluation for alcohol treatment. Methods: CONTROL is a single-center, prospective, observational study evaluating consecutive alcohol-dependent patients. Using a curve clustering methodology based on a polynomial regression mixture model, we identified three clusters of patients with dominant alcohol use patterns described as mostly abstainers, mostly moderate drinkers and mostly heavy drinkers. Multinomial logistic regression analysis was used to identify baseline factors (socio-demographic, alcohol dependence consequences and related factors) predictive of belonging to each drinking cluster. Results: The sample included 143 alcohol-dependent adults (63.6% males), mean age 44.6 ± 11.8 years. The clustering method identified 47 (32.9%) mostly abstainers, 56 (39.2%) mostly moderate drinkers and 40 (28.0%) mostly heavy drinkers. Multivariate analyses indicated that mild or severe depression at baseline predicted belonging to the mostly moderate drinkers cluster during follow-up (relative risk ratio (RRR) 2.42, CI [1.02-5.73], P = 0.045), while living alone (RRR 2.78, CI [1.03-7.50], P = 0.044) and reporting more alcohol-related consequences (RRR 1.03, CI [1.01-1.05], P = 0.004) predicted belonging to the mostly heavy drinkers cluster during follow-up. Conclusion: In this sample, the drinking patterns of alcohol-dependent patients were predicted by baseline factors, i.e. depression, living alone and alcohol-related consequences, findings that may inform clinicians about the likely drinking patterns of their alcohol-dependent patients over the year following the initial evaluation for alcohol treatment.
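As a minimal sketch of the kind of analysis reported above — a multinomial logistic regression whose exponentiated coefficients are read as relative risk ratios — the snippet below uses statsmodels on synthetic data; the predictor names (depression, lives_alone, consequences) and all values are illustrative assumptions, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 143  # same sample size as the study, but entirely synthetic data

df = pd.DataFrame({
    "depression":   rng.integers(0, 2, n),   # 1 = mild/severe depression at baseline
    "lives_alone":  rng.integers(0, 2, n),   # 1 = living alone
    "consequences": rng.normal(20, 10, n),   # alcohol-related consequences score
    # Outcome: 0 = mostly abstainers, 1 = mostly moderate, 2 = mostly heavy drinkers
    "cluster":      rng.integers(0, 3, n),
})

X = sm.add_constant(df[["depression", "lives_alone", "consequences"]])
fit = sm.MNLogit(df["cluster"], X).fit(disp=False)

# Exponentiated coefficients are relative risk ratios (RRR) versus the reference cluster,
# which statsmodels takes to be the lowest-coded category (here, mostly abstainers).
print(np.exp(fit.params))
```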
Abstract:
PURPOSE: The objective of this study was to investigate the effects of weather, rank, and home advantage on international football match results and scores in the Gulf Cooperation Council (GCC) region. METHODS: Football matches (n = 2008) in six GCC countries were analyzed. To determine the influence of weather on the likelihood of a favorable outcome and on goal difference, a generalized linear model with a logit link function and multiple regression analysis were performed. RESULTS: In the GCC region, home teams tend to have a greater likelihood of a favorable outcome (P < 0.001) and a higher goal difference (P < 0.001). Temperature difference was identified as a significant explanatory variable when used independently (P < 0.001) or after adjustment for home advantage and team ranking (P < 0.001). The likelihood of a favorable outcome for GCC teams increases by 3% for every 1-unit increase in temperature difference. After inclusion of the interaction with opposition, this advantage remains significant only when playing against non-GCC opponents. While home advantage increased the odds of a favorable outcome (P < 0.001) and the goal difference (P < 0.001) after inclusion of the interaction term, the likelihood of a favorable outcome for a GCC team decreased (P < 0.001) when playing against a stronger opponent. Finally, temperature and the wet bulb globe temperature approximation were found to be better indicators of the effect of environmental conditions on match outcomes than absolute and relative humidity or the heat index. CONCLUSIONS: In the GCC region, higher temperature increased the likelihood of a favorable outcome when playing against non-GCC teams. However, international ranking should be considered because an opponent with a higher rank reduced, but did not eliminate, the likelihood of a favorable outcome.
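A minimal sketch of a generalized linear model with a logit link, as named in the Methods, fitted with statsmodels on synthetic match data; the predictors (home advantage, temperature difference, rank difference) and the outcome are placeholders chosen for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2008  # number of matches analysed in the study; the data here are synthetic

matches = pd.DataFrame({
    "home":      rng.integers(0, 2, n),   # 1 = home match
    "temp_diff": rng.normal(0, 5, n),     # temperature difference (illustrative units)
    "rank_diff": rng.normal(0, 20, n),    # difference in international ranking
})
matches["favorable"] = rng.integers(0, 2, n)   # 1 = favorable outcome (synthetic)

X = sm.add_constant(matches[["home", "temp_diff", "rank_diff"]])
glm = sm.GLM(matches["favorable"], X, family=sm.families.Binomial()).fit()

# Exponentiated coefficients are odds ratios; exp(beta for temp_diff) approximates the
# multiplicative change in the odds of a favorable outcome per 1-unit temperature difference.
print(np.exp(glm.params))
```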
Abstract:
PURPOSE: To determine the characteristics specific to boys with disordered eating behaviors (DEB) and the general context in which these DEB occur. METHOD: Data were drawn from the SMASH02 database, a survey carried out among post-mandatory school students in Switzerland aged 16-20 years in 2002. Only males (N=3890) were included, and were classified into one of four groups based on their level of concern about weight/food and on their eating behaviors, as follows: group 1: one concern without behavior (N=862); group 2: more than one concern without behavior (N=361); group 3: at least one behavior (N=798); and a control group (N=1869), according to previously validated items. Groups were compared on personal, family, school, experience-of-violence and health-compromising behavior variables at the bivariate level. All significant variables were included in a multinomial logistic regression using Stata 9 software. RESULTS: About one-half of the boys reported either a concern or an unhealthy eating behavior. Compared with the control group, boys from the three groups were more likely to be students and to report a history of sexual abuse, delinquency, depression, and feeling fat. In addition, boys from group 3 were more likely to report a history of dieting, early puberty, peer teasing, having experienced violence, frequent inebriation, and being overweight. CONCLUSION: DEB concern adolescent males more frequently than previously thought and seem to be part of a generally dysfunctional context in which violence is predominant. Adolescent males also need to be screened for DEB. Moreover, prevention programs should target the increasing social and media pressure regarding boys' ideal body shape and raise public consciousness about this phenomenon.
Abstract:
PURPOSE: The purpose of this study is to explore the periodical patterns of events and deaths related to cardiovascular disease (CVD), acute myocardial infarction (AMI) and stroke in Swiss adults (≥ 18 years). METHODS: Mortality data for the period 1969-2007 (N=869,863 CVD events) and hospitalization data for the period 1997-2008 (N=959,990 CVD events) were used. The annual, weekly and circadian distributions of CVD-related deaths and events were assessed. Multivariate analysis was conducted using multinomial logistic regression adjusting for age, gender and calendar year and considering deaths from respiratory diseases, accidents or other causes as competing events. RESULTS: CVD deaths and hospitalizations occurred less frequently in the summer months. Similar patterns were found for AMI and stroke. No significant weekly variation in CVD deaths was found. Stratification by age and gender showed that, among men only, subjects aged <65 years had a higher probability of dying on Mondays and Saturdays. This finding was confirmed after multivariate adjustment. Finally, a circadian variation in CVD mortality was observed, with a first peak in the morning (8-12 am) and a smaller second peak in the late afternoon (2-6 pm). This pattern persisted after multivariate adjustment and was more pronounced for AMI than for stroke. CONCLUSION: There is a periodicity of hospitalizations and deaths related to CVD, AMI and stroke in Switzerland. This pattern changes slightly according to the age and sex of the subjects. Although the underlying mechanisms are not fully identified, preventive measures should take these aspects into account to develop better strategies for the prevention and management of CVD.
Abstract:
OBJECTIVE: To identify the prevalence of and factors associated with intentional use of HIV risk reduction practices by men who have sex with men during anal intercourse with casual partners. METHODS: Cross-sectional survey pertaining to the Swiss HIV behavioral surveillance system, using an anonymous self-administered questionnaire in a self-selected sample of men who have sex with men (n = 2953). Multinomial regression was used to estimate factors associated with reporting either "no or inconsistent condom use" or "one or more risk reduction practices" over "consistent condom use." RESULTS: 57.2% reported anal intercourse with casual partner(s) over the last 12 months. Of these, 24.0% declared having used a risk reduction practice (73.8% of those who did not use condoms consistently). HIV-positive people were more likely to have done so. Most predictors were similarly associated with both regression categories. Four significant predictors were common to both: Internet partner seeking, age, age squared, and the interaction between positive HIV status and number of partners. The only association that differed markedly between the 2 regression categories was having a number of partners above the median, which was significantly associated with the risk reduction category. CONCLUSIONS: Although condom use is the most frequent protection strategy in anal intercourse with casual partners, risk reduction practices are highly prevalent. However, there are no clear differences in predictors between risk reduction practices and inconsistent or no condom use. This suggests that risk reduction is an opportunistic response rather than a strategy per se.
Abstract:
Aim: To assess the geographical transferability of niche-based species distribution models fitted with two modelling techniques. Location: Two distinct geographical study areas in Switzerland and Austria, in the subalpine and alpine belts. Methods: Generalized linear and generalized additive models (GLM and GAM) with a binomial probability distribution and a logit link were fitted for 54 plant species, based on topoclimatic predictor variables. These models were then evaluated quantitatively and used for spatially explicit predictions within (internal evaluation and prediction) and between (external evaluation and prediction) the two regions. Comparisons of evaluations and spatial predictions between regions and models were conducted to test whether species and methods meet the criteria of full transferability. By full transferability, we mean that: (1) the internal evaluation of models fitted in region A and B must be similar; (2) a model fitted in region A must at least retain a comparable external evaluation when projected into region B, and vice versa; and (3) internal and external spatial predictions have to match within both regions. Results: The measures of model fit are, on average, 24% higher for GAMs than for GLMs in both regions. However, the differences between internal and external evaluations (AUC coefficient) are also higher for GAMs than for GLMs (a difference of 30% for models fitted in Switzerland and 54% for models fitted in Austria). Transferability, as measured with the AUC evaluation, fails for 68% of the species in Switzerland and 55% in Austria for GLMs (67% and 53% of the species, respectively, for GAMs). For both GAMs and GLMs, the agreement between internal and external predictions is rather weak on average (Kulczynski's coefficient in the range 0.3-0.4), but varies widely among individual species. The dominant pattern is an asymmetrical transferability between the two study regions (a mean decrease of 20% in the AUC coefficient when the models are transferred from Switzerland and 13% when they are transferred from Austria). Main conclusions: The large inter-specific variability observed among the 54 study species underlines the need to consider more than a few species to properly test the transferability of species distribution models. The pronounced asymmetry in transferability between the two study regions may be due to peculiarities of these regions, such as differences in the ranges of environmental predictors or the varied impact of land-use history, or to species-specific reasons such as differential phenotypic plasticity, the existence of ecotypes or varied dependence on biotic interactions that are not properly incorporated into niche-based models. The lower variation between internal and external evaluation of GLMs compared to GAMs further suggests that overfitting may reduce transferability. Overall, limited geographical transferability calls for caution when projecting niche-based models to assess the fate of species in future environments.
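A minimal sketch of the internal/external evaluation scheme described above — fit a binomial GLM with a logit link on presence/absence data from one region, then evaluate it with AUC both within that region and in the other region — using synthetic data; the region labels, predictor columns and sample sizes are placeholders, and the GAM variant is omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

def synthetic_region(n):
    """Synthetic topoclimatic predictors and presence/absence data for one region."""
    X = pd.DataFrame({
        "temperature": rng.normal(5, 3, n),
        "slope":       rng.uniform(0, 45, n),
        "radiation":   rng.normal(1500, 300, n),
    })
    y = pd.Series(rng.integers(0, 2, n))   # species presence/absence (synthetic)
    return sm.add_constant(X), y

X_ch, y_ch = synthetic_region(500)   # calibration region ("Switzerland", placeholder)
X_at, y_at = synthetic_region(500)   # evaluation region ("Austria", placeholder)

# GLM with binomial distribution and logit link fitted in the calibration region
glm_ch = sm.GLM(y_ch, X_ch, family=sm.families.Binomial()).fit()

auc_internal = roc_auc_score(y_ch, glm_ch.predict(X_ch))   # internal evaluation
auc_external = roc_auc_score(y_at, glm_ch.predict(X_at))   # external (transferred) evaluation
print(auc_internal, auc_external)
```

A drop from auc_internal to auc_external, repeated in both transfer directions, is the kind of asymmetric loss of transferability quantified in the Results.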
Abstract:
When individuals learn by trial-and-error, they perform randomly chosen actions and then reinforce those actions that led to a high payoff. However, individuals do not always have to physically perform an action in order to evaluate its consequences. Rather, they may be able to mentally simulate actions and their consequences without actually performing them. Such fictitious learners can select actions with high payoffs without making long chains of trial-and-error learning. Here, we analyze the evolution of an n-dimensional cultural trait (or artifact) by learning, in a payoff landscape with a single optimum. We derive the stochastic learning dynamics of the distance to the optimum in trait space when choice between alternative artifacts follows the standard logit choice rule. We show that for both trial-and-error and fictitious learners, the learning dynamics stabilize at an approximate distance of √n/(2λe) away from the optimum, where λe is an effective learning performance parameter depending on the learning rule under scrutiny. Individual learners are thus unlikely to reach the optimum when traits are complex (n large), and so face a barrier to further improvement of the artifact. We show, however, that this barrier can be significantly reduced in a large population of learners performing payoff-biased social learning, in which case λe becomes proportional to population size. Overall, our results illustrate the effects of errors in learning, levels of cognition, and population size for the evolution of complex cultural traits.
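A minimal sketch of the standard logit (softmax) choice rule referred to above, under which an alternative with payoff p_i is chosen with probability proportional to exp(λ·p_i); the payoff values and the choice-intensity parameter below are arbitrary illustrations, not quantities from the paper.

```python
import numpy as np

def logit_choice_probabilities(payoffs, lam):
    """Standard logit choice rule: P(i) is proportional to exp(lam * payoff_i)."""
    scaled = lam * np.asarray(payoffs, dtype=float)
    scaled -= scaled.max()             # subtract the max for numerical stability
    weights = np.exp(scaled)
    return weights / weights.sum()

payoffs = [1.0, 1.2, 0.8]              # payoffs of alternative artifacts (illustrative)
lam = 5.0                              # choice intensity; lower values mean noisier choice
probs = logit_choice_probabilities(payoffs, lam)

rng = np.random.default_rng(3)
chosen = rng.choice(len(payoffs), p=probs)   # one stochastic choice among the artifacts
print(probs, chosen)
```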
Abstract:
We construct a rich dataset covering 47 developing countries over the years 1990-2007, combining several micro- and macro-level data sources to explore the link between political factors and body mass index (BMI). We implement a heteroskedastic generalized ordered logit model allowing for different covariate effects across the BMI distribution and accounting for the unequal BMI dispersion by geographical area. We find that systems with democratic qualities are more likely to reduce underweight but increase overweight/obesity, whereas effective political competition entails a double benefit in the form of reducing both underweight and obesity. Our results are robust to the introduction of country fixed effects.
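As a rough sketch of an ordered logit for a three-level BMI outcome, the snippet below fits statsmodels' OrderedModel on synthetic data; this plain proportional-odds specification is a simplification of the heteroskedastic generalized ordered logit used in the study, and all variable names and values are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(4)
n = 2000

covariates = pd.DataFrame({
    "democracy":   rng.normal(0, 1, n),   # democratic-quality index (synthetic)
    "competition": rng.normal(0, 1, n),   # political-competition index (synthetic)
    "gdp_pc":      rng.normal(0, 1, n),   # log GDP per capita (synthetic)
})
# Ordered outcome: 0 = underweight, 1 = normal weight, 2 = overweight/obese (synthetic)
bmi_cat = pd.Series(
    pd.Categorical(rng.integers(0, 3, n), categories=[0, 1, 2], ordered=True)
)

# OrderedModel estimates its own thresholds, so no constant column is added to the exog.
result = OrderedModel(bmi_cat, covariates, distr="logit").fit(method="bfgs", disp=False)
print(result.summary())
```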
Abstract:
INTRODUCTION: Young cannabis users are at increased risk of later cigarette initiation and progression to nicotine addiction. The present study addresses how frequently mulling (adding tobacco to cannabis smoked as joints) is performed and how this practice varies according to cigarette smoking status. METHODS: Data came from the Swiss 2007 European School Survey Project on Alcohol and other Drugs (ESPAD). A total of 881 past-month cannabis users (mean age 15 years, 60.1% boys) were asked about mulling using an anonymous self-administered questionnaire. Participants were further grouped according to their cigarette smoking status (daily, occasional, former, and never-smokers). RESULTS: Four out of every five cannabis users described mulling as frequently performed. The highest occurrence was found among daily cigarette smokers (DSC; 90.3%), while former cigarette smokers reported the lowest (58.9%). The multinomial logistic regression showed that DSC were more likely than never-smokers to report mulling as frequent (risk ratio = 3.56 [95% CI 1.55-8.21]). CONCLUSIONS: Mulling appears to be a very common practice among young cannabis users, especially among concomitant cigarette smokers. Nevertheless, the majority of cigarette abstainers also reported frequently adding tobacco to the cannabis they smoke. Because it may represent a significant exposure to nicotine, mulling should be taken into account when assessing substance use among adolescents and in supporting their quitting attempts.
Abstract:
BACKGROUND AND PURPOSE: Statins display anti-inflammatory and anti-epileptogenic properties in animal models, and may reduce the epilepsy risk in elderly humans; however, a possible modulating role on outcome in patients with status epilepticus (SE) has not been assessed. METHODS: This cohort study was based on a prospective registry including all consecutive adults with incident SE treated in our center between April 2006 and September 2012. SE outcome was categorized at hospital discharge into 'return to baseline', 'new disability' and 'mortality'. The role of potential predictors, including statin treatment on admission, was evaluated using a multinomial logistic regression model. RESULTS: Amongst the 427 patients identified, information on statins was available for 413 (97%). Mean age was 60.9 (±17.8) years; 201 (49%) were women; 211 (51%) had a potentially fatal SE etiology; and 191 (46%) experienced generalized-convulsive or non-convulsive SE in coma. Statins (simvastatin, atorvastatin or pravastatin) were prescribed prior to admission in 76 (18%) subjects, mostly elderly. Whilst 208 (50.4%) patients returned to baseline, 58 (14%) died. After adjustment for established SE outcome predictors (age, etiology, SE severity score), statins correlated significantly with lower mortality (relative risk ratio 0.38, P = 0.046). CONCLUSION: This study suggests for the first time that exposure to statins before an SE episode is related to its outcome, pointing to a possible anti-epileptogenic role. Further studies are needed to confirm this intriguing finding.
Abstract:
OBJECTIVES: Therapeutic coma is advocated in guidelines for the management of refractory status epilepticus; this is, however, based on weak evidence. We here address the specific impact of therapeutic coma on status epilepticus outcome. DESIGN: Retrospective assessment of a prospectively collected cohort. SETTING: Academic hospital. PATIENTS: Consecutive adults with incident status epilepticus lasting greater than or equal to 30 minutes, admitted between 2006 and 2013. MEASUREMENTS AND MAIN RESULTS: We prospectively recorded demographics, clinical status epilepticus features, treatment, and outcome at discharge, and retrospectively recorded medical comorbidities, hospital stay, and infectious complications. Associations between potential predictors and clinical outcome were analyzed using multinomial logistic regressions. Of 467 patients with incident status epilepticus, 238 returned to baseline (51.1%), 162 had new disability (34.6%), and 67 died (14.3%); 50 subjects (10.7%) were managed with therapeutic coma. Therapeutic coma was associated with poorer outcome in the whole cohort (relative risk ratio for new disability, 6.86; 95% CI, 2.84-16.56; for mortality, 9.10; 95% CI, 3.17-26.16); the effect was more pronounced in patients with complex partial status epilepticus than in those with generalized convulsive or nonconvulsive status epilepticus in coma. The prevalence of infections was higher (odds ratio, 3.81; 95% CI, 1.66-8.75), and the median hospital stay in patients discharged alive was longer (16 d [range, 2-240 d] vs 9 d [range, 1-57 d]; p < 0.001), in subjects managed with therapeutic coma. CONCLUSIONS: This study provides class III evidence that therapeutic coma is associated with poorer outcome after status epilepticus; furthermore, it portends higher infection rates and longer hospitalizations. These data suggest caution in the straightforward use of this approach, especially in patients with complex partial status epilepticus.
Reasons to use e-cigarettes and associations with other substances among adolescents in Switzerland.
Abstract:
BACKGROUND: The objectives of this research were to describe the main reason(s) why adolescents use electronic cigarettes, to assess how e-cigarette experimenters and users differ in personal characteristics, and to determine whether e-cigarette use is associated with the use of other substances among a representative sample of youths in Switzerland. METHODS: A representative sample of 621 youths (308 females) was divided into never users (n=353), experimenters ("only once", n=120) and users ("several times", n=148) of e-cigarettes. Groups were compared on socio-demographic data and current smoking, alcohol misuse and cannabis use. Reasons for e-cigarette use were compared between experimenters and users. A multinomial regression was performed using never users as the reference category. RESULTS: Forty-three percent had ever tried e-cigarettes, and the main reason was curiosity. Compared to never users, experimenters were more likely to be out of school (relative risk ratio [RRR]: 2.68) and to misuse alcohol (RRR: 2.08), while users were more likely to be male (RRR: 2.75), to be vocational students (RRR: 2.30) or out of school (RRR: 3.48) and to use any of the studied substances (tobacco, RRR: 5.26; alcohol misuse, RRR: 2.71; cannabis use, RRR: 30.2). CONCLUSIONS: Although often still part of adolescent experimentation, e-cigarettes are becoming increasingly popular among adolescents, and their use should become part of health providers' standard substance use screening. As health providers (and especially paediatricians) do not seem to have high levels of knowledge about e-cigarettes and, consequently, have little comfort in discussing them, training in this domain should be made available to them.
Abstract:
BACKGROUND: Several studies have observed associations between various aspects of diet and mental health, but little is known about the relationship between following the 5-a-day recommendation for fruit and vegetable consumption and mental health. Thus, we examined the associations of the Swiss recommended daily fruit and vegetable intake with psychological distress. METHODS: Data from 20,220 individuals aged 15+ years from the 2012 Swiss Health Survey were analyzed. The recommended portions of fruit and vegetables per day were defined as 5-a-day (at least 2 portions of fruit and 3 of vegetables). The outcome was perceived psychological distress over the previous 4 weeks (measured by the 5-item mental health index [MHI-5]). High distress (MHI-5 score ≤ 52), moderate distress (MHI-5 > 52 and ≤ 72) and low distress (MHI-5 > 72 and ≤ 100) were differentiated, and multinomial logistic regression analyses adjusted for known confounding factors were performed. RESULTS: The 5-a-day recommendation was met by 11.6% of the participants with low distress, 9.3% of those with moderate distress, and 6.2% of those with high distress. Consumers fulfilling the 5-a-day recommendation had lower odds of being highly or moderately distressed than individuals consuming less fruit and vegetables (moderate vs. low distress: OR = 0.82, 95% confidence interval [CI] 0.69-0.97; high vs. low distress: OR = 0.55, 95% CI 0.41-0.75). CONCLUSIONS: Daily intake of 5 servings of fruit and vegetables was associated with lower psychological distress. Longitudinal studies are needed to determine the causal nature of this relationship.
Abstract:
Aim: Obesity and smoking are major CVD risk factors and may be associated with other unfavourable lifestyle behaviours. Our aim was to investigate the significance of obesity, heavy smoking, and both combined in terms of prevalence trends and their relationship with other lifestyle factors. Methods: We used data from the population-based cross-sectional Swiss Health Survey (5 waves, 1992-2012) comprising 85,575 individuals aged ≥18 years. Height, weight, and smoking status were self-reported. We used multinomial logistic regression to analyse differences in lifestyle across combinations of BMI category and smoking status, focusing on obese individuals and heavy smokers. Normal-weight never-smokers were defined as the reference group.