95 results for Binomial theorem.


Relevance:

10.00%

Publisher:

Abstract:

This book provides a general framework for specifying, estimating, and testing time series econometric models. Special emphasis is given to estimation by maximum likelihood, but other methods are also discussed, including quasi-maximum likelihood estimation, generalized method of moments estimation, nonparametric estimation, and estimation by simulation. An important advantage of adopting the principle of maximum likelihood as the unifying framework for the book is that many of the estimators and test statistics proposed in econometrics can be derived within a likelihood framework, thereby providing a coherent vehicle for understanding their properties and interrelationships. In contrast to many existing econometric textbooks, which deal mainly with the theoretical properties of estimators and test statistics through a theorem-proof presentation, this book squarely addresses implementation to provide direct conduits between the theory and applied work.
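
As a minimal illustration of the likelihood-based approach the book takes (this sketch is not from the book; the AR(1) model, parameter values, and simulated data are assumptions), an autoregression can be estimated by numerically maximizing its conditional Gaussian log-likelihood:

```python
# Minimal sketch (not from the book): conditional maximum likelihood for an
# AR(1) model y_t = c + phi*y_{t-1} + e_t, e_t ~ N(0, sigma^2).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = np.empty(500)
y[0] = 0.0
for t in range(1, 500):                      # simulate illustrative data
    y[t] = 0.5 + 0.8 * y[t - 1] + rng.normal(scale=1.0)

def neg_loglik(params, y):
    c, phi, log_sigma = params
    sigma = np.exp(log_sigma)                # keep sigma positive
    resid = y[1:] - c - phi * y[:-1]
    return -np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - resid**2 / (2 * sigma**2))

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], args=(y,), method="BFGS")
c_hat, phi_hat, sigma_hat = res.x[0], res.x[1], np.exp(res.x[2])
print(c_hat, phi_hat, sigma_hat)
```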

Relevance:

10.00%

Publisher:

Abstract:

Background The body of evidence related to breast-cancer-related lymphoedema incidence and risk factors has substantially grown and improved in quality over the past decade. We assessed the incidence of unilateral arm lymphoedema after breast cancer and explored the evidence available for lymphoedema risk factors. Methods We searched Academic Search Elite, Cumulative Index to Nursing and Allied Health, Cochrane Central Register of Controlled Trials (clinical trials), and Medline for research articles that assessed the incidence or prevalence of, or risk factors for, arm lymphoedema after breast cancer, published between January 1, 2000, and June 30, 2012. We extracted incidence data and calculated corresponding exact binomial 95% CIs. We used random effects models to calculate a pooled overall estimate of lymphoedema incidence, with subgroup analyses to assess the effect of different study designs, countries of study origin, diagnostic methods, time since diagnosis, and extent of axillary surgery. We assessed risk factors and collated them into four levels of evidence, depending on consistency of findings and quality and quantity of studies contributing to findings. Findings 72 studies met the inclusion criteria for the assessment of lymphoedema incidence, giving a pooled estimate of 16·6% (95% CI 13·6–20·2). Our estimate was 21·4% (14·9–29·8) when restricted to data from prospective cohort studies (30 studies). The incidence of arm lymphoedema seemed to increase up to 2 years after diagnosis or surgery of breast cancer (24 studies with time since diagnosis or surgery of 12 to <24 months; 18·9%, 14·2–24·7), was highest when assessed by more than one diagnostic method (nine studies; 28·2%, 11·8–53·5), and was about four times higher in women who had an axillary-lymph-node dissection (18 studies; 19·9%, 13·5–28·2) than it was in those who had sentinel-node biopsy (18 studies; 5·6%, 6·1–7·9). 29 studies met the inclusion criteria for the assessment of risk factors. Risk factors that had a strong level of evidence were extensive surgery (ie, axillary-lymph-node dissection, greater number of lymph nodes dissected, mastectomy) and being overweight or obese. Interpretation Our findings suggest that more than one in five women who survive breast cancer will develop arm lymphoedema. A clear need exists for improved understanding of contributing risk factors, as well as of prevention and management strategies to reduce the individual and public health burden of this disabling and distressing disorder.
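
For orientation, the "exact binomial 95% CIs" referred to above are conventionally Clopper-Pearson intervals. A minimal sketch follows; the counts are purely illustrative and not taken from the review:

```python
# Minimal sketch: Clopper-Pearson (exact binomial) 95% CI for an incidence
# proportion k/n. Counts below are illustrative, not from the review.
from scipy.stats import beta

def exact_binomial_ci(k, n, alpha=0.05):
    lower = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    upper = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lower, upper

k, n = 42, 250                     # e.g. 42 lymphoedema cases out of 250 women
print(k / n, exact_binomial_ci(k, n))
```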

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND Dengue fever (DF) outbreaks often arise from imported DF cases in Cairns, Australia. Few studies have incorporated imported DF cases in the estimation of the relationship between weather variability and the incidence of autochthonous DF. The study aimed to examine the impact of weather variability on autochthonous DF infection after accounting for imported DF cases, and then to explore the possibility of developing an empirical forecast system. METHODOLOGY/PRINCIPAL FINDINGS Data on weather variables, notified DF cases (including those acquired locally and overseas), and population size in Cairns were supplied by the Australian Bureau of Meteorology, Queensland Health, and the Australian Bureau of Statistics. A time-series negative-binomial hurdle model was used to assess the effects of imported DF cases and weather variability on autochthonous DF incidence. Our results showed that monthly autochthonous DF incidence was significantly associated with monthly imported DF cases (Relative Risk (RR): 1.52; 95% confidence interval (CI): 1.01-2.28), monthly minimum temperature (°C) (RR: 2.28; 95% CI: 1.77-2.93), monthly relative humidity (%) (RR: 1.21; 95% CI: 1.06-1.37), monthly rainfall (mm) (RR: 0.50; 95% CI: 0.31-0.81) and the monthly standard deviation of daily relative humidity (%) (RR: 1.27; 95% CI: 1.08-1.50). In the zero-hurdle component, the occurrence of monthly autochthonous DF cases was significantly associated with monthly minimum temperature (Odds Ratio (OR): 1.64; 95% CI: 1.01-2.67). CONCLUSIONS/SIGNIFICANCE Our research suggests that monthly autochthonous DF incidence was strongly positively associated with monthly imported DF cases, local minimum temperature and inter-month relative humidity variability in Cairns. Moreover, DF outbreaks in Cairns were driven by imported DF cases only under favourable seasonal and weather conditions during the study period.
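
A hurdle model separates whether any autochthonous cases occur in a month (a binary component) from how many occur when they do (a zero-truncated count component). The sketch below writes down that log-likelihood with a negative binomial count part; it is an intercept-only illustration with made-up counts, not the authors' model or data:

```python
# Minimal sketch of a negative-binomial hurdle log-likelihood (not the
# authors' code; intercept-only, no covariates, illustrative data).
import numpy as np
from scipy.stats import nbinom

def hurdle_nb_loglik(y, p_pos, n_disp, p_nb):
    """p_pos: P(y > 0); (n_disp, p_nb): scipy nbinom parameters."""
    y = np.asarray(y)
    zero = y == 0
    ll = np.sum(np.log(1 - p_pos) * zero)                 # hurdle part: zero months
    pos = y[~zero]
    trunc = 1 - nbinom.pmf(0, n_disp, p_nb)               # renormalise over y > 0
    ll += np.sum(np.log(p_pos) + nbinom.logpmf(pos, n_disp, p_nb) - np.log(trunc))
    return ll

counts = np.array([0, 0, 3, 0, 1, 7, 0, 0, 2, 0])         # illustrative monthly counts
print(hurdle_nb_loglik(counts, p_pos=0.4, n_disp=1.5, p_nb=0.4))
```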

Relevance:

10.00%

Publisher:

Abstract:

Hot spot identification (HSID) aims to identify potential sites (roadway segments, intersections, crosswalks, interchanges, ramps, etc.) with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor-injury and property-damage-only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that are often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean as in most methods in practice, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression. Application of a quantile regression model on equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to society, simultaneously reduces the influence of under-reported PDO and minor-injury crashes, and overcomes the limitation of the traditional NB model in dealing with the preponderance-of-zeros problem and right-skewed data.
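
A minimal sketch of the quantile-regression step is shown below using statsmodels' QuantReg on synthetic equivalent-PDO data; the covariates, quantile, and flagging rule are illustrative assumptions rather than the paper's specification:

```python
# Minimal sketch: model an upper quantile of equivalent-PDO crashes against
# segment traffic (AADT) and length. Synthetic data, illustrative only.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(1)
aadt = rng.uniform(1_000, 20_000, size=300)
length_km = rng.uniform(0.5, 5.0, size=300)
epdo = 0.002 * aadt * length_km * rng.lognormal(sigma=0.8, size=300)  # skewed outcome

X = sm.add_constant(np.column_stack([aadt, length_km]))
res = QuantReg(epdo, X).fit(q=0.9)            # model the 90th percentile
print(res.params)

# Sites whose observed EPDO exceeds their predicted 90th percentile are
# flagged as candidate hot spots (illustrative flagging rule).
hot = epdo > res.predict(X)
print(hot.sum(), "candidate hot spots")
```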

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a new method for stabilizing disturbed power systems using wide-area measurements and FACTS devices. The approach addresses both first-swing and damping stability of power systems following large disturbances. A two-step control algorithm based on Lyapunov stability theory is proposed for the controllers to improve power system stability. The proposed approach is simulated on two test systems, and the results show significant improvement in the first-swing and damping stability of the test systems.

Relevance:

10.00%

Publisher:

Abstract:

The use of graphics processing unit (GPU) parallel processing is becoming part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that there are substantial gains in computation time for MCMC and other methods of evaluation when the likelihood is computed using GPU parallel processing. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010 and a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computation time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
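
The likelihood in question involves the explicit convolution of two negative binomial distributions, so a minimal CPU sketch of that convolution is given below with illustrative parameters and data (not the paper's). A GPU implementation would delegate the same elementwise work to a library such as CuPy or Numba; plain NumPy/SciPy is used here to keep the sketch self-contained:

```python
# Minimal sketch: pmf of Y = Y1 + Y2 where Y1, Y2 are independent negative
# binomials, evaluated by explicit convolution. Parameters/data illustrative.
import numpy as np
from scipy.stats import nbinom

def conv_nb_logpmf(y, n1, p1, n2, p2):
    k = np.arange(y + 1)                                  # all splits of y into k + (y - k)
    terms = nbinom.pmf(k, n1, p1) * nbinom.pmf(y - k, n2, p2)
    return np.log(terms.sum())

counts = np.array([3, 0, 5, 2, 8])                        # illustrative event counts
loglik = sum(conv_nb_logpmf(y, n1=2.0, p1=0.4, n2=1.0, p2=0.6) for y in counts)
print(loglik)
```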

Relevance:

10.00%

Publisher:

Abstract:

We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms using indirect inference. We embed this approach within a sequential Monte Carlo algorithm that is completely adaptive. This methodological development was motivated by an application involving data on macroparasite population evolution modelled with a trivariate Markov process. The main objective of the analysis is to compare inferences on the Markov process when considering two different indirect models. The two indirect models are based on a Beta-Binomial model and a three-component mixture of Binomials, with the former providing a better fit to the observed data.
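
A minimal sketch of the general idea, in which the summary statistics for ABC are the fitted parameters of an auxiliary (indirect) model, is given below. For brevity the auxiliary Beta-Binomial model is fitted by the method of moments, plain rejection sampling replaces the adaptive sequential Monte Carlo algorithm, and the simulator, prior, and tolerances are all illustrative assumptions:

```python
# Minimal sketch of ABC with indirect-inference summaries (illustrative only):
# summaries are method-of-moments estimates of an auxiliary Beta-Binomial model.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 20

def simulate(theta, size=200):
    # Stand-in simulator: over-dispersed binomial counts with mean probability theta.
    p = rng.beta(theta * 10, (1 - theta) * 10, size=size)
    return rng.binomial(n_trials, p)

def betabin_mom_summary(x):
    # Auxiliary-model "fit": method-of-moments mean and overdispersion of a Beta-Binomial.
    m, v = x.mean(), x.var()
    mu = m / n_trials
    od = v / (n_trials * mu * (1 - mu) + 1e-12)            # > 1 indicates over-dispersion
    return np.array([mu, od])

observed = simulate(0.3)
s_obs = betabin_mom_summary(observed)

tol = np.array([0.02, 0.5])                                # illustrative tolerances
accepted = []
for _ in range(20_000):                                    # plain ABC rejection
    theta = rng.uniform(0.05, 0.95)                        # prior draw
    s_sim = betabin_mom_summary(simulate(theta))
    if np.all(np.abs(s_sim - s_obs) < tol):
        accepted.append(theta)

print(len(accepted), np.mean(accepted))
```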

Relevance:

10.00%

Publisher:

Abstract:

A new wave energy flow (WEF) map concept was proposed in this work. Based on this concept, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage and to visualize wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map as waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various types of damage in aluminum and carbon-fiber-reinforced plastic laminated plates were evaluated experimentally and numerically to validate this technique. The results show that it can effectively evaluate the shape and size of damage from the wave field variations around the damage in the WEF map.

Relevance:

10.00%

Publisher:

Abstract:

A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme in which each participant's share can be used several times, which reduces the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, which calls for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction only allows a single secret to be shared per threshold value. In this article we combine the previous two approaches to design a multiple-time verifiable multi-secret sharing scheme where several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with respect to the previous schemes.
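
For orientation, the threshold primitive underlying such schemes is Shamir's polynomial secret sharing. The sketch below shows only that basic (t, n) primitive, not the verifiable multi-secret constructions of Shao and Cao or Chan and Chang; the prime and parameters are illustrative:

```python
# Minimal sketch of plain Shamir (t, n) secret sharing over a prime field.
# Only the underlying primitive, not the verifiable multi-secret schemes
# discussed above; the prime and parameters are illustrative.
import secrets

P = 2**127 - 1                           # a Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):                            # evaluate the degree-(t-1) polynomial mod P
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the secret from any t shares.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))           # any 3 of the 5 shares recover the secret
```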

Relevance:

10.00%

Publisher:

Abstract:

The recent controversy on quantum dot dephasing mechanisms (pure versus inelastic) is re-examined by isolating the quantum dots from their substrate using the appropriate limits of the ionization energy theory and the quantum adiabatic theorem. When the phonons in the quantum dots are isolated adiabatically from the phonons in the substrate, elastic or pure dephasing becomes the dominant mechanism. On the other hand, when the phonons from the substrate are non-adiabatically coupled to the quantum dots, the inelastic dephasing process takes over. This switch-over is due to the different elemental compositions of the quantum dots and their substrate. We also provide an unambiguous analysis of why GaAs/AlGaAs quantum dots may only exhibit pure dephasing while InAs/GaAs quantum dots give rise to inelastic dephasing as the dominant mechanism. It is shown that the elemental composition of both the quantum dots and the substrate plays an important role in evaluating the dephasing mechanisms of quantum dots.

Relevance:

10.00%

Publisher:

Abstract:

Tobacco smoking, alcohol drinking, and occupational exposure to polycyclic aromatic hydrocarbons are the major proven risk factors for human head and neck squamous-cell cancer (HNSCC). Research on gene-environment interactions in HNSCC has focused mainly on genes encoding enzymes that metabolize tobacco-smoke constituents and on repair enzymes. To investigate the role of genetically determined individual predispositions in enzymes of xenobiotic metabolism and in repair enzymes, under the exogenous risk factor tobacco smoke, in the carcinogenesis of HNSCC, we conducted a case-control study of 312 cases and 300 noncancer controls. We focused on the impact of 22 sequence variations in CYP1A1, CYP1B1, CYP2E1, ERCC2/XPD, GSTM1, GSTP1, GSTT1, NAT2, NQO1, and XRCC1. To assess relevant main and interactive effects of polymorphic genes on susceptibility to HNSCC we used statistical models such as logic regression and a Bayesian version of logic regression. In a subgroup analysis of nonsmokers, main effects of the ERCC2 (Lys751Gln) C/C genotype and the combined ERCC2 (Arg156Arg) C/A and A/A genotypes were predominant. When stratifying for smokers, the data revealed main effects of the combined CYP1B1 (Leu432Val) C/G and G/G genotypes, followed by the CYP1B1 (Leu432Val) G/G genotype and the CYP2E1 (-70G>T) G/T genotype. When fitting logistic regression models including relevant main effects and interactions in smokers, we found relevant associations of the CYP1B1 (Leu432Val) C/G genotype and CYP2E1 (-70G>T) G/T genotype (OR, 10.84; 95% CI, 1.64-71.53) as well as the CYP1B1 (Leu432Val) G/G genotype and GSTM1 null/null genotype (OR, 11.79; 95% CI, 2.18-63.77) with HNSCC. The findings underline the relevance of polymorphic CYP1B1 genotypes combined with exposure to tobacco smoke.
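
The odds ratios quoted above come from regression fits; for orientation, the sketch below shows how an odds ratio and a Woolf (log-based) 95% CI are computed from a simple 2x2 genotype-by-case table with made-up counts, not the study's data:

```python
# Minimal sketch: odds ratio and Woolf 95% CI from a 2x2 genotype-by-case
# table. Counts are made up for illustration, not the study's data.
import math

a, b = 40, 25      # cases / controls carrying the genotype
c, d = 272, 275    # cases / controls without it

or_hat = (a * d) / (b * c)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = (math.exp(math.log(or_hat) + s * 1.96 * se_log) for s in (-1, 1))
print(f"OR = {or_hat:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```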

Relevance:

10.00%

Publisher:

Abstract:

Background Detection of outbreaks is an important part of disease surveillance. Although many algorithms have been designed for detecting outbreaks, few have been specifically assessed against diseases that have distinct seasonal incidence patterns, such as those caused by vector-borne pathogens. Methods We applied five previously reported outbreak detection algorithms to Ross River virus (RRV) disease data (1991-2007) for the four local government areas (LGAs) of Brisbane, Emerald, Redland and Townsville in Queensland, Australia. The methods used were the Early Aberration Reporting System (EARS) C1, C2 and C3 methods, the negative binomial cusum (NBC), the historical limits method (HLM), the Poisson outbreak detection (POD) method and the purely temporal SaTScan analysis. Seasonally adjusted variants of the NBC and SaTScan methods were developed. Some of the algorithms were applied using a range of parameter values, resulting in 17 variants of the five algorithms. Results The 9,188 RRV disease notifications that occurred in the four selected regions over the study period showed marked seasonality, which adversely affected the performance of some of the outbreak detection algorithms. Most of the methods examined were able to detect the same major events. The exception was the seasonally adjusted NBC methods, which detected an excess of short signals. The NBC, POD and temporal SaTScan algorithms were the only methods that consistently had high true positive rates and low false positive and false negative rates across the four study areas. The timeliness of outbreak signals generated by each method was also compared, but there was no consistency across outbreaks and LGAs. Conclusions This study has highlighted several issues associated with applying outbreak detection algorithms to seasonal disease data. In the absence of a true gold standard, a quantitative comparison is difficult, and caution should be taken when interpreting the true positives, false positives, sensitivity and specificity.
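
For reference, the EARS C1 and C2 statistics used above are simple moving-baseline z-scores. A minimal sketch follows; the 7-day window, 2-day guard band, and 3-standard-deviation threshold are commonly cited defaults, and the data are illustrative:

```python
# Minimal sketch of the EARS C1/C2 statistics on a daily count series.
# Baseline window of 7 days; C2 adds a 2-day guard band; a threshold of 3
# standard deviations is a commonly used default. Data are illustrative.
import numpy as np

def ears_stat(counts, t, lag):
    """C1 uses lag=0 (days t-7..t-1); C2 uses lag=2 (days t-9..t-3)."""
    base = counts[t - 7 - lag : t - lag]
    mu, sd = base.mean(), base.std(ddof=1)
    return (counts[t] - mu) / max(sd, 1.0)       # floor the sd to avoid division by zero

rng = np.random.default_rng(3)
counts = rng.poisson(4, size=60).astype(float)
counts[50:53] += 12                               # inject an illustrative outbreak

for t in range(10, len(counts)):
    c1 = ears_stat(counts, t, lag=0)
    c2 = ears_stat(counts, t, lag=2)
    if c1 > 3 or c2 > 3:
        print(f"day {t}: signal (C1={c1:.1f}, C2={c2:.1f})")
```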

Relevance:

10.00%

Publisher:

Abstract:

S. japonicum infection is believed to be endemic in 28 of the 80 provinces of the Philippines, and the most recent data on schistosomiasis prevalence have shown considerable variability between provinces. To improve the allocation of parasitic disease control resources in the country, we aimed to describe the small-scale spatial variation in S. japonicum prevalence across the Philippines, quantify the role of the physical environment in driving the spatial variation of S. japonicum, and develop a predictive risk map of S. japonicum infection. Data on S. japonicum infection from 35,754 individuals across the country were geo-located at the barangay level and included in the analysis. The analysis was then stratified geographically for Luzon, the Visayas and Mindanao. Zero-inflated binomial Bayesian geostatistical models of S. japonicum prevalence were developed, and diagnostic uncertainty was incorporated. Results show that in the three regions, males and individuals aged ≥20 years had significantly higher prevalence of S. japonicum compared with females and children <5 years. The role of the environmental variables differed between regions of the Philippines. S. japonicum infection was widespread in the Visayas, whereas it was much more focal in Luzon and Mindanao. This analysis revealed significant spatial variation in the prevalence of S. japonicum infection in the Philippines, which suggests that a spatially targeted approach to schistosomiasis interventions, including mass drug administration, is warranted. When financially possible, additional schistosomiasis surveys should be prioritized in areas identified as high risk but underrepresented in our dataset.
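
A minimal non-spatial sketch of the zero-inflated binomial likelihood at the core of such models is given below; the full Bayesian geostatistical model additionally includes covariates, spatial random effects, and diagnostic-uncertainty terms, and the counts and parameters here are illustrative:

```python
# Minimal sketch of a zero-inflated binomial log-likelihood (no covariates,
# no spatial random effects). y = positives, n = examined per village;
# pi = probability of a "structural zero" village, p = infection prevalence.
import numpy as np
from scipy.stats import binom

def zib_loglik(y, n, pi, p):
    y, n = np.asarray(y), np.asarray(n)
    ll_zero = np.log(pi + (1 - pi) * binom.pmf(0, n, p))    # observed zero counts
    ll_pos = np.log(1 - pi) + binom.logpmf(y, n, p)         # positive counts
    return np.sum(np.where(y == 0, ll_zero, ll_pos))

y = np.array([0, 0, 3, 12, 0, 1])        # illustrative positives per village
n = np.array([40, 35, 50, 60, 20, 45])   # illustrative numbers examined
print(zib_loglik(y, n, pi=0.3, p=0.08))
```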

Relevance:

10.00%

Publisher:

Abstract:

The primary aim of this descriptive exploration of scientists’ life cycle award patterns is to evaluate whether awards breed further awards and identify researcher experiences after reception of the Nobel Prize. To achieve this goal, we collected data on the number of awards received each year for 50 years before and after Nobel Prize reception by all 1901–2000 Nobel laureates in physics, chemistry, and medicine or physiology. Our results indicate an increasing rate of awards before Nobel reception, reaching the summit precisely in the year of the Nobel Prize. After this pinnacle year, awards drop sharply. This result is confirmed by separate analyses of three different disciplines and by a random-effects negative binomial regression model. Such an effect, however, does not emerge for more recent Nobel laureates (1971–2000). In addition, Nobelists in medicine or physiology generate more awards shortly before and after prize reception, whereas laureates in chemistry attract more awards as time progresses.