108 results for Poisson regression
Abstract:
Background: The transmission of hemorrhagic fever with renal syndrome (HFRS) is influenced by climatic variables. However, few studies have examined the quantitative relationship between climate variation and HFRS transmission. Objective: We examined the potential impact of climate variability on HFRS transmission and developed climate-based forecasting models for HFRS in northeastern China. Methods: We obtained data on monthly counts of reported HFRS cases in Elunchun and Molidawahaner counties for 1997–2007 from the Inner Mongolia Center for Disease Control and Prevention, and climate data from the Chinese Bureau of Meteorology. Cross-correlations assessed crude associations between climate variables, including rainfall, land surface temperature (LST), relative humidity (RH), and the multivariate El Niño Southern Oscillation (ENSO) index (MEI), and monthly HFRS cases over a range of lags. We used time-series Poisson regression models to examine the independent contribution of climatic variables to HFRS transmission. Results: Cross-correlation analyses showed that rainfall, LST, RH, and MEI were significantly associated with monthly HFRS cases at lags of 3–5 months in both study areas. The results of Poisson regression indicated that, after controlling for autocorrelation, seasonality, and long-term trend, rainfall, LST, RH, and MEI at lags of 3–5 months were associated with HFRS in both study areas. The final model had good accuracy in forecasting the occurrence of HFRS. Conclusions: Climate variability plays a significant role in HFRS transmission in northeastern China. The model developed in this study has implications for HFRS control and prevention.
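To make the modelling approach concrete, the sketch below shows a time-series Poisson regression of monthly case counts on climate variables at 3–5 month lags, with harmonic terms for seasonality, a linear trend, and a lagged case count to control autocorrelation. It is illustrative only: the file name, column names, and specific lags are assumptions, not taken from the study.

```python
# Minimal illustrative sketch (not the authors' code): time-series Poisson
# regression of monthly HFRS counts on lagged climate variables, with
# seasonality, trend, and autocorrelation terms. All names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("hfrs_monthly.csv")            # hypothetical input file
df["t"] = np.arange(len(df))                    # long-term trend
df["sin12"] = np.sin(2 * np.pi * df["t"] / 12)  # 12-month seasonality
df["cos12"] = np.cos(2 * np.pi * df["t"] / 12)
for var, lag in [("rainfall", 4), ("lst", 4), ("rh", 4), ("mei", 5)]:
    df[f"{var}_lag{lag}"] = df[var].shift(lag)  # climate variables at assumed lags
df["cases_lag1"] = df["cases"].shift(1)         # control for autocorrelation
df = df.dropna()

X = sm.add_constant(df[["rainfall_lag4", "lst_lag4", "rh_lag4", "mei_lag5",
                        "cases_lag1", "sin12", "cos12", "t"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))                       # rate ratios per unit change
```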
Abstract:
In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time-invariant safety. Since the time-invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempts have been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, an assessment of how accurately the modified EB method estimates safety in the presence of time-variant safety and regression-to-the-mean (RTM) effects is lacking. This study derives the closed-form marginal distribution, and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method, by simultaneously accounting for the RTM and time-variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of time-variant safety and RTM effects.
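As a sketch of the key result, under the usual Poisson-gamma assumptions (notation assumed here, not quoted from the paper: counts y_it | θ_i ~ Poisson(θ_i μ_it) with a site effect θ_i ~ Gamma(φ, φ)), integrating out θ_i gives the marginal distribution of a site's counts:

```latex
% Assumed setup: y_{it} | \theta_i ~ Poisson(\theta_i \mu_{it}),  \theta_i ~ Gamma(\phi, \phi)
\[
P(y_{i1},\dots,y_{iT})
  = \int_0^\infty \prod_{t=1}^{T}
      \frac{(\theta\mu_{it})^{y_{it}} e^{-\theta\mu_{it}}}{y_{it}!}\,
      \frac{\phi^{\phi}\,\theta^{\phi-1} e^{-\phi\theta}}{\Gamma(\phi)}\, d\theta
  = \frac{\Gamma\!\big(\phi+\sum_t y_{it}\big)}{\Gamma(\phi)\,\prod_t y_{it}!}
    \left(\frac{\phi}{\phi+\sum_t \mu_{it}}\right)^{\phi}
    \prod_{t=1}^{T}\left(\frac{\mu_{it}}{\phi+\sum_t \mu_{it}}\right)^{y_{it}}
\]
```

This is the negative multinomial form referred to in the abstract; the same expression is the contribution of site i to the random effects Poisson likelihood.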
Abstract:
Background While helmet usage is often mandated, few motorcycle and scooter riders make full use of protection for the rest of the body. Little is known about the factors associated with riders' usage or non-usage of protective clothing. Methods Novice riders were surveyed prior to their provisional licence test in NSW, Australia. Questions related to usage of and beliefs about protective clothing, riding experience and exposure, risk taking and demographic details. Multivariable Poisson regression models were used to identify factors associated with two measures of usage, comparing those who sometimes vs rarely/never rode unprotected and those who usually wore non-motorcycle pants vs motorcycle pants. Results Ninety-four percent of eligible riders participated and usable data were obtained from 66% (n = 776). Factors significantly associated with riding unprotected were: youth (17–25 years) (RR = 2.00, 95% CI: 1.50–2.65), not seeking protective clothing information (RR = 1.29, 95% CI: 1.07–1.56), non-usage in hot weather (RR = 3.01, 95% CI: 2.38–3.82), awareness of social pressure to wear more protection (RR = 1.48, 95% CI: 1.12–1.95), scepticism about protective benefits (RR = 2.00, 95% CI: 1.22–3.28) and riding a scooter vs any type of motorcycle. A similar cluster of factors, including youth (RR = 1.17, 95% CI: 1.04–1.32), social pressure (RR = 1.32, 95% CI: 1.16–1.50), hot weather (RR = 1.30, 95% CI: 1.19–1.41) and scooter vs motorcycle riding, was also associated with wearing non-motorcycle pants. There was no evidence of an association between use of protective clothing and other indicators of risk-taking behaviour. Conclusions Factors strongly associated with non-use of protective clothing include not having sought information about protective clothing and not believing in its injury-reduction value. Interventions to increase use may therefore need to focus on development of credible information sources about crash risk and the benefits of protective clothing. Further work is required to develop motorcycle protective clothing suitable for hot climates.
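For illustration only, a Poisson regression with robust variance is one common way to obtain adjusted relative risks of this kind from binary usage outcomes; the data file and variable names below are hypothetical, not the study's.

```python
# Illustrative sketch (not the study's code): Poisson regression with robust
# standard errors to estimate relative risks (RR) for a binary usage outcome.
# The file and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("rider_survey.csv")            # hypothetical data file
fit = smf.glm(
    "rode_unprotected ~ age_17_25 + no_info_sought + hot_weather_nonuse "
    "+ social_pressure + sceptical + scooter",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")                           # robust variance for a binary outcome

rr = np.exp(fit.params)                         # adjusted relative risks
ci = np.exp(fit.conf_int())
print(pd.concat([rr.rename("RR"),
                 ci.rename(columns={0: "lower", 1: "upper"})], axis=1))
```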
Abstract:
Nanoindentation is a useful technique for probing the mechanical properties of bone, and finite element (FE) modeling of the indentation allows inverse determination of elasto-plastic constitutive properties. However, all but one FE study to date have assumed frictionless contact between indenter and bone. The aim of this study was to explore the effect of friction in simulations of bone nanoindentation. Two-dimensional axisymmetric FE simulations were performed using a spheroconical indenter of tip radius 0.6 μm and angle 90°. The coefficient of friction between indenter and bone was varied between 0.0 (frictionless) and 0.3. Isotropic linear elasticity was used in all simulations, with bone elastic modulus E = 13.56 GPa and Poisson's ratio of 0.3. Plasticity was incorporated using both Drucker-Prager and von Mises yield surfaces. Friction had a modest effect on the predicted force-indentation curve for both von Mises and Drucker-Prager plasticity, reducing maximum indenter displacement by 10% and 20% respectively as the friction coefficient was increased from zero to 0.3 (at a maximum indenter force of 5 mN). However, friction had a much greater effect on predicted pile-up after indentation, reducing predicted pile-up from 0.27 to 0.11 μm with a von Mises model, and from 0.09 to 0.02 μm with Drucker-Prager plasticity. We conclude that it is potentially important to include friction in nanoindentation simulations of bone if pile-up is used to compare simulation results with experiment.
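For reference, the two yield criteria compared in the abstract have the standard forms below (common notation, not taken from the paper): I_1 is the first stress invariant, J_2 the second deviatoric stress invariant, σ_y the uniaxial yield stress, and α, k are Drucker-Prager material parameters. The pressure-dependent term αI_1 is what distinguishes the two models and contributes to their different pile-up predictions.

```latex
% Common forms of the two criteria; parameters are not taken from the paper.
\[
\text{von Mises: } \sqrt{3 J_2} - \sigma_y = 0,
\qquad
\text{Drucker-Prager: } \sqrt{J_2} + \alpha I_1 - k = 0.
\]
```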
Abstract:
Bistability arises within a wide range of biological systems from the λ phage switch in bacteria to cellular signal transduction pathways in mammalian cells. Changes in regulatory mechanisms may result in genetic switching in a bistable system. Recently, more and more experimental evidence in the form of bimodal population distributions indicates that noise plays a very important role in the switching of bistable systems. Although deterministic models have been used for studying the existence of bistability properties under various system conditions, these models cannot realize cell-to-cell fluctuations in genetic switching. However, there is a lag in the development of stochastic models for studying the impact of noise in bistable systems because of the lack of detailed knowledge of biochemical reactions, kinetic rates, and molecular numbers. In this work, we develop a previously undescribed general technique for developing quantitative stochastic models for large-scale genetic regulatory networks by introducing Poisson random variables into deterministic models described by ordinary differential equations. Two stochastic models have been proposed for the genetic toggle switch interfaced with either the SOS signaling pathway or a quorum-sensing signaling pathway, and we have successfully realized experimental results showing bimodal population distributions. Because the introduced stochastic models are based on widely used ordinary differential equation models, the success of this work suggests that this approach is a very promising one for studying noise in large-scale genetic regulatory networks.
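The core technique can be sketched for a generic birth-death system (an assumed toy model, not the toggle-switch networks studied in the paper): each deterministic rate term in the ODE update is replaced by a Poisson random variable with mean rate × Δt.

```python
# A minimal sketch of the general idea (assumed toy model, not the paper's):
# replace each deterministic rate term in an ODE update with a Poisson random
# variable whose mean is rate * dt, here for a simple birth-death process.
import numpy as np

rng = np.random.default_rng(0)

def production(x):            # illustrative Hill-type production rate
    return 50.0 / (1.0 + (x / 20.0) ** 2)

def degradation(x):           # first-order degradation rate
    return 0.5 * x

def simulate(x0=10.0, dt=0.01, t_end=100.0):
    x, xs = x0, []
    for _ in range(int(t_end / dt)):
        gain = rng.poisson(production(x) * dt)   # stochastic production events
        loss = rng.poisson(degradation(x) * dt)  # stochastic degradation events
        x = max(x + gain - loss, 0)              # keep molecule count non-negative
        xs.append(x)
    return np.array(xs)

trajectory = simulate()
print(trajectory[-5:])
```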
Abstract:
Background: Previous studies have found that high temperatures increase the risk of mortality in summer. However, little is known about whether a sharp decrease or increase in temperature between neighbouring days has any effect on mortality. Methods: Poisson regression models were used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. The temperature change was calculated as the current day's mean temperature minus the previous day's mean. Results: In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.157 (95% confidence interval (CI): 1.024, 1.307) for total non-external mortality (NEM), 1.186 (95% CI: 1.002, 1.405) for NEM in females, and 1.442 (95% CI: 1.099, 1.892) for people aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.353 (95% CI: 1.033, 1.772) for cardiovascular mortality and 1.667 (95% CI: 1.146, 2.425) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with mortality, with RRs of 1.133 (95% CI: 1.053, 1.219) for total NEM, 1.252 (95% CI: 1.131, 1.386) for cardiovascular mortality, and 1.254 (95% CI: 1.135, 1.385) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. Conclusion: A significant change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for the current temperature.
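A minimal sketch of the analysis idea, with assumed file and column names: compute the between-day temperature change, flag drops and rises of more than 3 °C, and fit a Poisson model for daily deaths adjusted for the current day's mean temperature (a full model would typically include further confounders such as season and trend).

```python
# Illustrative sketch (not the study's code): between-day temperature change
# indicators in a Poisson model for daily deaths. Names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("daily_summer.csv")              # hypothetical data file
df["temp_change"] = df["mean_temp"].diff()        # today's mean minus yesterday's
df["drop_gt3"] = (df["temp_change"] < -3).astype(int)
df["rise_gt3"] = (df["temp_change"] > 3).astype(int)
df = df.dropna()

X = sm.add_constant(df[["drop_gt3", "rise_gt3", "mean_temp"]])
fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))                         # rate ratios (RR)
```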
Abstract:
Background Apart from helmets, little is known about the effectiveness of motorcycle protective clothing in reducing injuries in crashes. The study aimed to quantify the association between usage of motorcycle clothing and injury in crashes. Methods and findings Cross-sectional analytic study. Crashed motorcyclists (n = 212, 71% of identified eligible cases) were recruited through hospitals and motorcycle repair services. Data were obtained through structured face-to-face interviews. The main outcomes were hospitalization and motorcycle crash-related injury. Poisson regression was used to estimate relative risks (RR) and 95% confidence intervals for injury, adjusting for potential confounders. Results Motorcyclists were significantly less likely to be admitted to hospital if they crashed wearing motorcycle jackets (RR = 0.79, 95% CI: 0.69–0.91), pants (RR = 0.49, 95% CI: 0.25–0.94), or gloves (RR = 0.41, 95% CI: 0.26–0.66). When garments included fitted body armour, there was a significantly reduced risk of injury to the upper body (RR = 0.77, 95% CI: 0.66–0.89), hands and wrists (RR = 0.55, 95% CI: 0.38–0.81), legs (RR = 0.60, 95% CI: 0.40–0.90), and feet and ankles (RR = 0.54, 95% CI: 0.35–0.83). Non-motorcycle boots were also associated with a reduced risk of injury compared to shoes or joggers (RR = 0.46, 95% CI: 0.28–0.75). No association between use of body armour and risk of fracture injuries was detected. A substantial proportion of motorcycle-designed gloves (25.7%), jackets (29.7%) and pants (28.1%) were assessed to have failed due to material damage in the crash. Conclusions Motorcycle protective clothing is associated with reduced risk and severity of crash-related injury and hospitalization, particularly when fitted with body armour. The proportion of clothing items that failed under crash conditions indicates a need for improved quality control. While mandating usage of protective clothing is not recommended, consideration could be given to providing incentives for usage of protective clothing, such as tax exemptions for safety gear, health insurance premium reductions and rebates.
Abstract:
The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, that no propensity function changes “significantly” during any time-step, is met. Using this method there is a possibility that species numbers can, artificially, become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative. At most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, have a high probability of driving a reactant population negative are labeled critical. The number of firings of more reaction channels can then be approximated using Poisson random variables, thus speeding up the simulation while maintaining accuracy. In implementing this revised method of criticality selection, we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
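A sketch of the criticality rule described above, in assumed notation (not the authors' implementation): a reaction channel is flagged as critical only when the Poisson count of firings it would draw over the leap τ has more than a small probability δ of exhausting one of its reactants.

```python
# Sketch of the idea in assumed notation (not the authors' implementation):
# flag a reaction as critical only when its Poisson firing count over the leap
# has a non-negligible probability of driving some reactant negative.
import numpy as np
from scipy.stats import poisson

def critical_reactions(x, propensities, stoich, tau, delta=0.01):
    """stoich[j, i] = change in species i when reaction j fires once."""
    critical = []
    for j, a_j in enumerate(propensities):
        for i in np.where(stoich[j] < 0)[0]:         # species consumed by reaction j
            max_firings = x[i] // abs(stoich[j, i])  # firings species i can support
            if poisson.sf(max_firings, a_j * tau) > delta:
                critical.append(j)                   # P(K_j > max_firings) too large
                break
    return critical

def tau_leap_step(x, propensities, stoich, tau, rng):
    k = rng.poisson(propensities * tau)              # Poisson firing counts per channel
    # In a full method, critical channels would instead be simulated exactly.
    return x + stoich.T @ k

rng = np.random.default_rng(1)
x = np.array([100, 0])                               # toy system: A <-> B
stoich = np.array([[-1, 1], [1, -1]])                # A -> B, B -> A
a = np.array([1.0 * x[0], 0.5 * x[1]])               # propensities
print(critical_reactions(x, a, stoich, tau=0.1))
print(tau_leap_step(x, a, stoich, tau=0.1, rng=rng))
```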
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
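In the notation commonly used for this setting (assumed here, not quoted from the paper), with unit-rate Poisson processes Y_j, stoichiometric vectors ν_j, and propensities a_j, the random time-change form of the chemical system and its Euler-type (Poisson τ-leap) discretisation are:

```latex
% Assumed notation: Y_j unit-rate Poisson processes, \nu_j stoichiometric vectors,
% a_j propensities, \mathcal{P}(m) a Poisson random variable with mean m.
\[
X(t) = X(0) + \sum_j \nu_j\, Y_j\!\left(\int_0^t a_j\big(X(s)\big)\, ds\right),
\qquad
X(t+\tau) \approx X(t) + \sum_j \nu_j\, \mathcal{P}\big(a_j(X(t))\,\tau\big).
\]
```

The leap methods discussed in the abstract can then be read as low-order discretisations of this Poisson-driven SDE, with higher-order Itô-Taylor and Runge-Kutta analogues refining the one-step approximation.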
Abstract:
This study proposes a framework of a model-based hot spot identification method by applying full Bayes (FB) technique. In comparison with the state-of-the-art approach [i.e., empirical Bayes method (EB)], the advantage of the FB method is the capability to seamlessly integrate prior information and all available data into posterior distributions on which various ranking criteria could be based. With intersection crash data collected in Singapore, an empirical analysis was conducted to evaluate the following six approaches for hot spot identification: (a) naive ranking using raw crash data, (b) standard EB ranking, (c) FB ranking using a Poisson-gamma model, (d) FB ranking using a Poisson-lognormal model, (e) FB ranking using a hierarchical Poisson model, and (f) FB ranking using a hierarchical Poisson (AR-1) model. The results show that (a) when using the expected crash rate-related decision parameters, all model-based approaches perform significantly better in safety ranking than does the naive ranking method, and (b) the FB approach using hierarchical models significantly outperforms the standard EB approach in correctly identifying hazardous sites.
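For contrast with the full Bayes rankings, the standard EB estimate under a Poisson-gamma model reduces to a simple shrinkage formula; the sketch below uses an assumed parameterisation and toy numbers, not the Singapore data.

```python
# Sketch of the standard EB shrinkage estimate under a Poisson-gamma model
# (assumed parameterisation; toy numbers for illustration). mu_hat is the
# safety-performance-function prediction, phi the gamma dispersion parameter,
# y the observed crash count at each site.
import numpy as np

def eb_expected_crashes(y, mu_hat, phi):
    """Posterior mean of each site's expected crash frequency."""
    w = phi / (phi + mu_hat)           # shrinkage weight toward the SPF prediction
    return w * mu_hat + (1 - w) * y

y = np.array([12, 3, 7, 0, 25])        # observed counts (toy values)
mu_hat = np.array([8.0, 5.0, 6.0, 2.0, 10.0])
ranking = np.argsort(-eb_expected_crashes(y, mu_hat, phi=4.0))
print(ranking)                         # sites ordered from highest expected crashes
```

The FB approaches in the abstract replace this point estimate with full posterior distributions, so ranking criteria can use the whole posterior rather than a single shrunken mean.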
Abstract:
Motorcycles are overrepresented in road traffic crashes and are particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting motorcycle crashes at both four-legged and T signalized intersections. Treating the data as time-series cross-section panels, this study explores different hierarchical Poisson models and finds that the model allowing an autoregressive lag-1 (AR-1) specification in the error term is the most suitable. Results show that the number of lanes at four-legged signalized intersections significantly increases motorcycle crashes, largely because of the higher exposure resulting from higher motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane on the major roadways of four-legged intersections exacerbates this potential hazard. For T signalized intersections, the presence of an exclusive right-turn lane on both major and minor roadways and an uncontrolled left-turn lane on major roadways increases motorcycle crashes. Motorcycle crashes increase on high-speed roadways because motorcyclists are more vulnerable and less likely to react in time during conflicts. The presence of red-light cameras reduces motorcycle crashes significantly at both four-legged and T intersections. With a red-light camera present, motorcycles are less exposed to conflicts because they are observed to be more disciplined in queuing at the stop line and less likely to jump-start at the onset of green.
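The model form implied by the preferred specification can be written, in assumed notation, as a hierarchical Poisson model whose log-rate carries an AR(1) error (i indexes intersections, t time periods):

```latex
% Assumed notation: i indexes intersections, t time periods.
\[
y_{it} \mid \lambda_{it} \sim \mathrm{Poisson}(\lambda_{it}), \qquad
\log \lambda_{it} = \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + \varepsilon_{it}, \qquad
\varepsilon_{it} = \rho\,\varepsilon_{i,t-1} + u_{it}, \quad
u_{it} \sim \mathcal{N}(0, \sigma_u^{2}).
\]
```

Here ρ captures the lag-1 serial dependence within each intersection's panel that the abstract identifies as important.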
Abstract:
The Poisson distribution has often been used for count data such as accident counts. The negative binomial (NB) distribution has been adopted for count data to address the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for some unobserved heterogeneity due to spatial and temporal effects in accident data. To overcome this problem, random effects models have been developed. Another challenge for existing traffic accident prediction models is the excess of zero-accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities which are the basic motivation for random effects models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects; Bayesian analysis is recommended for model calibration and assessment.
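A sketch of the kind of model proposed, in assumed notation (not necessarily the paper's exact specification): a zero-inflated Poisson with a location-specific random effect u_i in the count-state mean,

```latex
% Assumed notation: i indexes locations, t time periods; u_i is a location effect.
\[
P(y_{it}=0) = p_{it} + (1-p_{it})\, e^{-\lambda_{it}}, \qquad
P(y_{it}=k) = (1-p_{it})\, \frac{\lambda_{it}^{k} e^{-\lambda_{it}}}{k!}, \quad k \ge 1,
\]
\[
\log \lambda_{it} = \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + u_i, \qquad
u_i \sim \mathcal{N}(0, \sigma_u^{2}).
\]
```

The shared u_i induces the within-location correlation that a plain ZIP model ignores, and Bayesian (e.g., MCMC) estimation is the route to calibration and assessment that the abstract recommends.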