974 results for Negative dispersion mirrors
Abstract:
This paper investigates the role of social capital in reducing the short- and long-run negative health effects associated with stress, as well as indicators of burnout, among police officers. Despite the large volume of research on either social capital or the health effects of stress, the interaction of these factors remains an underexplored topic. In this empirical analysis we aim to address this shortcoming by focusing on a highly stressful and emotionally draining work environment, namely law enforcement agents, who perform an essential role in maintaining modern society. Using multivariate regression analysis with three different proxies for health and three proxies for social capital, and conducting several robustness checks, we find strong evidence that increased levels of social capital are highly correlated with better health outcomes. Additionally, we observe that while social capital at work is very important, social capital in the home environment and work-life balance are even more important. From a policy perspective, our findings suggest that work and stress programs should actively encourage employees to build stronger social networks as well as incorporate better work/home life arrangements.
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, with exploration of additional dispersion functions, the use of an independent data set, and presents an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences of expected crash counts are likely to be different than factors that might help to explain unaccounted-for variation in crashes across sites.
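The core modelling idea can be illustrated with a small sketch. The code below is not the study's Bayesian MCMC/Gibbs estimation; it is a maximum-likelihood analogue of a negative binomial crash model in which the dispersion parameter is itself a function of covariates. All variable names and data are hypothetical.

```python
# Minimal sketch (not the paper's Bayesian estimation): NB2 model with a
# site-specific dispersion parameter modeled as a function of covariates.
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

def nb_varying_dispersion_negloglik(params, X, Z, y):
    """NB2 log-likelihood with mean mu = exp(X @ beta) and
    dispersion alpha = exp(Z @ gamma)."""
    k = X.shape[1]
    beta, gamma = params[:k], params[k:]
    mu = np.exp(X @ beta)
    alpha = np.exp(Z @ gamma)          # dispersion varies across sites
    r = 1.0 / alpha                    # NB "size" parameter
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    return -np.sum(ll)

# Hypothetical design matrices: mean and dispersion both use log traffic flows.
rng = np.random.default_rng(0)
n = 200
log_flow_major = rng.normal(9.0, 0.5, n)
log_flow_minor = rng.normal(7.0, 0.7, n)
X = np.column_stack([np.ones(n), log_flow_major, log_flow_minor])
Z = X.copy()
mu_true = np.exp(-6 + 0.6 * log_flow_major + 0.3 * log_flow_minor)
y = rng.poisson(rng.gamma(2.0, mu_true / 2.0))   # Poisson-gamma (NB) counts

res = minimize(nb_varying_dispersion_negloglik,
               x0=np.zeros(X.shape[1] + Z.shape[1]),
               args=(X, Z, y), method="BFGS")
print("beta:", res.x[:X.shape[1]], "gamma:", res.x[X.shape[1]:])
```

Testing whether the gamma coefficients are significant is, in this toy setting, the analogue of testing the extra-variation function discussed in the abstract.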
Abstract:
At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, simultaneous estimation of the models was conducted. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
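As a rough illustration of a planning-level model of this kind, the sketch below fits a negative binomial GLM to hypothetical traffic-analysis-zone data and applies it to projected zone attributes for a future year. The variables, dispersion value, and growth assumption are illustrative only, not the authors' specification.

```python
# Zone-level negative binomial sketch with a simple future-year forecast.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
taz = pd.DataFrame({
    "pop_density": rng.gamma(3, 500, 300),          # persons per sq. mile (hypothetical)
    "intersection_density": rng.gamma(2, 10, 300),  # intersections per sq. mile
    "pct_minor_arterial": rng.uniform(0, 30, 300),  # percent of zone mileage
})
eta = -1 + 0.0004 * taz["pop_density"] + 0.02 * taz["intersection_density"]
taz["injury_crashes"] = rng.negative_binomial(n=2, p=2 / (2 + np.exp(eta)))

X = sm.add_constant(taz[["pop_density", "intersection_density", "pct_minor_arterial"]])
model = sm.GLM(taz["injury_crashes"], X,
               family=sm.families.NegativeBinomial(alpha=0.5)).fit()

# Forecast: apply the fitted model to projected zone attributes for a future year.
future = X.copy()
future["pop_density"] *= 1.10   # hypothetical 10% growth in population density
print(model.predict(future)[:5])
```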
Abstract:
The driving task requires sustained attention during prolonged periods, and can be performed in highly predictable or repetitive environments. Such conditions could create hypovigilance and impair performance towards critical events. Identifying such impairment in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it in real time. This pilot study aims to show that performance decrements due to monotonous tasks can be predicted through mathematical modelling taking into account sensation seeking levels. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance. The framework for prediction developed on this task could be extended to a monotonous driving task. A Hidden Markov Model (HMM) is proposed to predict participants' lapses in alertness. The driver's vigilance evolution is modelled as a hidden state and is correlated to a surrogate measure: the participant's reaction times. This experiment shows that the monotony of the task can lead to an important decline in performance in less than five minutes. This impairment can be predicted four minutes in advance with 86% accuracy using HMMs. This experiment showed that mathematical models such as HMMs can efficiently predict hypovigilance through surrogate measures. The presented model could result in the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, thus offering the potential to enhance road safety and prevent road crashes.
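A minimal sketch of the general approach (not the authors' exact model) follows: a two-state Gaussian HMM over reaction times, where the decoded states stand in for alert and hypovigilant conditions, and the transition matrix is used to project the state distribution a fixed number of trials ahead. The hmmlearn library and the simulated reaction-time series are assumptions of this sketch.

```python
# Two-state HMM over reaction times; the forward projection of the state
# distribution is the sense in which a lapse can be flagged in advance.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)
# Hypothetical reaction times (seconds): alert early on, slower and more variable later.
rt = np.concatenate([rng.normal(0.35, 0.04, 200), rng.normal(0.55, 0.12, 100)])
X = rt.reshape(-1, 1)

hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200, random_state=0)
hmm.fit(X)
states = hmm.predict(X)                       # decoded alert/hypovigilant labels
print("decoded states (last 10 trials):", states[-10:])

# Project the state distribution k trials ahead from the current posterior.
posterior_now = hmm.predict_proba(X)[-1]
k = 40                                        # hypothetical prediction horizon
forecast = posterior_now @ np.linalg.matrix_power(hmm.transmat_, k)
print("P(state) in", k, "trials:", forecast)
```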
Abstract:
The health of tollbooth workers is seriously threatened by long-term exposure to polluted air from vehicle exhausts. Using traffic data collected at a toll plaza, vehicle movements were simulated by a system dynamics model with different traffic volumes and toll collection procedures. This allowed the average travel time of vehicles to be calculated. A three-dimensional Computational Fluid Dynamics (CFD) model was used with a k–ε turbulence model to simulate pollutant dispersion at the toll plaza for different traffic volumes and toll collection procedures. It was shown that pollutant concentration around tollbooths increases as traffic volume increases. Whether traffic volume is low or high (1500 vehicles/h or 2500 vehicles/h), pollutant concentration decreases if electronic toll collection (ETC) is adopted. In addition, pollutant concentration around tollbooths decreases as the proportion of ETC-equipped vehicles increases. However, if the proportion of ETC-equipped vehicles is very low and the traffic volume is not heavy, then pollutant concentration increases as the number of ETC lanes increases.
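The study relies on a full three-dimensional k–ε CFD model; purely as a toy illustration of how near-booth concentration scales with an emission rate standing in for traffic volume, the sketch below solves a one-dimensional steady advection-diffusion equation with a source region near the booths. All parameter values are hypothetical.

```python
# Toy 1-D steady advection-diffusion solve (not the paper's 3-D CFD model).
import numpy as np

def steady_concentration(emission_rate, u=1.0, K=2.0, L=100.0, n=200):
    """Solve u dC/dx - K d2C/dx2 = S(x) with C = 0 at both ends."""
    dx = L / n
    x = np.linspace(0, L, n)
    S = np.where(x < 5.0, emission_rate, 0.0)   # source region near the booths
    A = np.zeros((n, n))
    for i in range(1, n - 1):                   # central finite differences
        A[i, i - 1] = -u / (2 * dx) - K / dx**2
        A[i, i]     = 2 * K / dx**2
        A[i, i + 1] = u / (2 * dx) - K / dx**2
    A[0, 0] = A[-1, -1] = 1.0                   # Dirichlet boundaries
    b = S.copy(); b[0] = b[-1] = 0.0
    return np.linalg.solve(A, b)

for vehicles_per_hour in (1500, 2500):
    C = steady_concentration(emission_rate=vehicles_per_hour * 1e-3)
    print(vehicles_per_hour, "veh/h -> peak concentration", round(C.max(), 2))
```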
Abstract:
In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time invariant safety. Since the time invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempts have been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, an assessment is lacking on how accurately the modified EB method estimates safety in the presence of the time variant safety and regression-to-the-mean (RTM) effects. This study derives the closed form marginal distribution, and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method by simultaneously accounting for the RTM and time variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of the time variant safety and the RTM effects.
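The key distributional result can be sketched as follows (the notation is mine and may differ from the paper's): integrating the gamma-distributed site effect out of the multivariate Poisson likelihood, with yearly adjustment factors capturing time-variant safety, yields a negative multinomial marginal over the yearly counts.

```latex
% Counts K_{it} at site i in year t, yearly adjustment factors c_{it},
% gamma-distributed site effect lambda_i with mean mu_i integrated out.
\begin{align*}
K_{it}\mid\lambda_i &\sim \mathrm{Poisson}(\lambda_i c_{it}), \qquad
\lambda_i \sim \mathrm{Gamma}\!\left(\phi,\ \phi/\mu_i\right),\\
P(K_{i1},\dots,K_{iT})
&= \int_0^\infty \prod_{t=1}^{T}\frac{(\lambda c_{it})^{K_{it}}e^{-\lambda c_{it}}}{K_{it}!}\,
\frac{(\phi/\mu_i)^{\phi}}{\Gamma(\phi)}\,\lambda^{\phi-1}e^{-\phi\lambda/\mu_i}\,d\lambda\\
&= \frac{\Gamma\!\left(\phi+\sum_t K_{it}\right)}{\Gamma(\phi)\prod_t K_{it}!}
\left(\frac{\phi/\mu_i}{\phi/\mu_i+\sum_t c_{it}}\right)^{\!\phi}
\prod_{t=1}^{T}\left(\frac{c_{it}}{\phi/\mu_i+\sum_t c_{it}}\right)^{\!K_{it}}.
\end{align*}
```

The last expression is the negative multinomial form referred to in the abstract, and it is the same likelihood kernel that appears in the random effects Poisson model.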
Abstract:
Objective: Alcohol-related implicit (preconscious) cognitive processes are established and unique predictors of alcohol use, but most research in this area has focused on alcohol-related implicit cognition and anxiety. This study extends this work into the area of depressed mood by testing a cognitive model that combines traditional explicit (conscious and considered) beliefs, implicit alcohol-related memory associations (AMAs), and self-reported drinking behavior. Method: Using a sample of 106 university students, depressed mood was manipulated using a musical mood induction procedure immediately prior to completion of implicit and then explicit alcohol-related cognition measures. A bootstrapped two-group (weak/strong expectancies of negative affect and tension reduction) structural equation model was used to examine how mood changes and alcohol-related memory associations varied across groups. Results: Expectancies of negative affect moderated the association of depressed mood and AMAs, but there was no such association for tension reduction expectancy. Conclusion: Subtle mood changes may unconsciously trigger alcohol-related memories in vulnerable individuals. Results have implications for addressing subtle fluctuations in depressed mood among young adults at risk of alcohol problems.
Abstract:
The objective was to understand the influence of the surface roughness of lactose carriers on the adhesion and dispersion of salmeterol xinafoate (SX) from interactive mixtures. The surface roughness of lactose carriers was determined by confocal microscopy. Particle images and adhesion forces between SX and lactose particles were determined by atomic force microscopy. The dispersion of SX (2.5%) from interactive mixtures with lactose was determined using a twin-stage impinger (TSI) with a Rotahaler® at an airflow rate of 60 L/min. SX was analysed using a validated HPLC assay. The RMS roughness (Rq) of the lactose carriers ranged from 0.93 to 2.84 μm, the Fine Particle Fraction (FPF) of SX ranged between 4% and 24%, and the average adhesion force between SX and lactose particles ranged between 49 and 134 nN. No direct correlation was observed between the RMS Rq of the lactose carriers and either the FPF of SX for the interactive mixtures or the adhesion force of SX on the lactose particles; however, the presence of fine lactose associated with the carrier surface increased the FPF of SX. Dispersion through direct SX detachment from the carrier surface was not consistent with the poor correlations described and was more likely to occur through complex particulate interactions involving fine lactose.
Abstract:
Dry powder inhaler (DPI) formulations are among the most useful aerosol preparations, in which drugs may be formulated as carrier-based interactive mixtures with micronised drug particles (<5 μm) adhered onto the surface of large inert carriers (lactose powders). The addition of magnesium stearate (MgSt) (1-3) was found to increase the dispersion of various drugs from DPI formulations. Recently, some active compounds coated with 5% (wt/wt) MgSt using the mechanofusion method showed significant improvements in aerosolization behavior due to the reduction in intrinsic cohesion force (4). Application of MgSt in powder formulations is not new; however, no studies have demonstrated the minimum threshold level of this excipient for efficient aerosolization of drug powders from interactive mixtures. Therefore, this study investigated the role of MgSt concentration in the efficient dispersion of salbutamol sulphate (SS) from DPI formulations.
Abstract:
A favorable product country of origin (e.g., an automobile made in Germany) is often considered an asset by marketers. Yet a challenge in today's competitive environment is how marketers of products from less favorably regarded countries can counter negative country of origin perceptions. Three studies investigate how mental imagery can be used to reduce the effects of negative country of origin stereotypes. Study 1 reveals that participants exposed to country of origin information exhibit automatic stereotype activation. Study 2 shows that self-focused counterstereotypical mental imagery (relative to other-focused mental imagery) significantly inhibits the automatic activation of negative country of origin stereotypes. Study 3 shows that this lessening of automatic negative associations persists when measured one day later. The results offer important implications for marketing theory and practice.
Abstract:
Background: Pregnant women exposed to traffic pollution have an increased risk of negative birth outcomes. We aimed to investigate the size of this risk using a prospective cohort of 970 mothers and newborns in Logan, Queensland. Methods: We examined two measures of traffic: distance to the nearest road and the number of roads around the home. To examine the effect of distance, we used the number of roads around the home in radii from 50 to 500 metres. We examined three road types: freeways, highways and main roads. Results: There were no associations with distance to road. A greater number of freeways and main roads around the home was associated with a shorter gestation time. There were no negative impacts on birth weight, birth length or head circumference after adjusting for gestation. The negative effects on gestation were largely due to main roads within 400 metres of the home. For every 10 extra main roads within 400 metres of the home, gestation time was reduced by 1.1% (95% CI: -1.7, -0.5; p-value = 0.001). Conclusions: Our results add weight to the association between exposure to traffic and reduced gestation time. This effect may be due to the chemical toxins in traffic pollutants, or to disturbed sleep caused by traffic noise.
Abstract:
It is a big challenge to guarantee the quality of discovered relevance features in text documents for describing user preferences, because of the large number of terms, patterns, and noise. Most existing popular text mining and classification methods have adopted term-based approaches. However, they have all suffered from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern-based methods should perform better than term-based ones in describing user preferences, but many experiments do not support this hypothesis. The innovative technique presented in this paper makes a breakthrough on this difficulty. This technique discovers both positive and negative patterns in text documents as higher-level features in order to accurately weight low-level features (terms) based on their specificity and their distributions in the higher-level features. Substantial experiments using this technique on Reuters Corpus Volume 1 and TREC topics show that the proposed approach significantly outperforms both state-of-the-art term-based methods underpinned by Okapi BM25, Rocchio or Support Vector Machines and pattern-based methods in precision, recall and F measures.
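To make the general idea concrete, here is a toy sketch (not the authors' algorithm): frequent term-set "patterns" are mined from relevant and non-relevant training documents, term weights are built up from their support in positive patterns and revised downward by negative patterns, and unseen documents are scored by summed term weights. The corpus, support threshold, and penalty factor are all hypothetical.

```python
# Toy pattern-based term weighting with positive and negative patterns.
from itertools import combinations
from collections import Counter

def mine_patterns(docs, min_support=2, max_len=2):
    """Return {pattern (frozenset of terms): support} over a list of token lists."""
    counts = Counter()
    for doc in docs:
        terms = sorted(set(doc))
        for k in range(1, max_len + 1):
            for combo in combinations(terms, k):
                counts[frozenset(combo)] += 1
    return {p: s for p, s in counts.items() if s >= min_support}

def term_weights(pos_docs, neg_docs, penalty=0.5):
    pos = mine_patterns(pos_docs)
    neg = mine_patterns(neg_docs)
    w = Counter()
    for pattern, support in pos.items():
        for term in pattern:                  # spread pattern support over its terms
            w[term] += support / len(pattern)
    for pattern, support in neg.items():      # negative patterns push weights down
        for term in pattern:
            w[term] -= penalty * support / len(pattern)
    return w

def score(doc, weights):
    return sum(weights.get(term, 0.0) for term in set(doc))

# Hypothetical mini corpus of relevant (pos) and non-relevant (neg) documents.
pos_docs = [["oil", "price", "export"], ["oil", "export", "opec"], ["price", "export"]]
neg_docs = [["football", "price"], ["football", "match", "price"]]
w = term_weights(pos_docs, neg_docs)
print(score(["oil", "price", "shipment"], w), score(["football", "price"], w))
```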