958 results for LOG-POISSON STATISTICS


Relevance:

20.00%

Publisher:

Abstract:

Peer review procedures and citation statistics are important yet often neglected components of the scientific publication process. Here I discuss fundamental consequences of such quality measures for the scientific community and propose three remedial actions: (1) use of a "Combined Impact Estimate" as a measure of citation statistics, (2) adoption of an open reviewing policy and (3) acceleration of the publication process in order to raise the reputation of the entire discipline (in our case: behavioural science). Authors, reviewers and editors are invited to contribute to the improvement of publication practice.

Relevance:

20.00%

Publisher:

Abstract:

In Switzerland there is a shortage of population-based information on heart failure (HF) incidence and case fatalities (CF). The aim of this study was to estimate HF event rates and both in- and out-of-hospital CF rates.

Relevance:

20.00%

Publisher:

Abstract:

The Michigan Basin is located in the upper Midwest region of the United States and is centered geographically over the Lower Peninsula of Michigan. It is filled primarily with Paleozoic carbonates and clastics, overlying Precambrian basement rocks and covered by Pleistocene glacial drift. In Michigan, more than 46,000 wells have been drilled in the basin, many producing significant quantities of oil and gas since the 1920s in addition to providing a wealth of data for subsurface visualization. Well log tomography, formerly log-curve amplitude slicing, is a visualization method recently developed at Michigan Technological University to correlate subsurface data by utilizing the high vertical resolution of well log curves. The well log tomography method was first successfully applied to the Middle Devonian Traverse Group within the Michigan Basin using gamma ray log curves. The purpose of this study is to prepare a digital data set for the Middle Devonian Dundee and Rogers City Limestones, apply the well log tomography method to this data and from this application, interpret paleogeographic trends in the natural radioactivity. Both the Dundee and Rogers City intervals directly underlie the Traverse Group and combined are the most prolific reservoir within the Michigan Basin. Differences between this study and the Traverse Group include increased well control and “slicing” of a more uniform lithology. Gamma ray log curves for the Dundee and Rogers City Limestones were obtained from 295 vertical wells distributed over the Lower Peninsula of Michigan, converted to Log ASCII Standard files, and input into the well log tomography program. The “slicing” contour results indicate that during the formation of the Dundee and Rogers City intervals, carbonates and evaporites with low natural radioactive signatures on gamma ray logs were deposited. 
This contrasts with the higher gamma ray amplitudes from siliciclastic deltas that cyclically entered the basin during Traverse Group deposition. Additionally, a subtle north-south trend of low natural radioactivity in the center of the basin may correlate with previously published Dundee facies tracts. Prominent trends associated with the distribution of limestone and dolomite are not observed because the regional ranges of gamma ray values for the two carbonates are equivalent in the Michigan Basin, and additional log curves are needed to separate these lithologies.
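The amplitude-"slicing" idea behind well log tomography can be sketched as follows: each well's gamma ray curve is resampled onto a normalized stratigraphic coordinate, and the amplitude at one fractional level is extracted across all wells for contouring. The function names and the single synthetic well are hypothetical; the actual program at Michigan Technological University is considerably more involved.

```python
# Hypothetical sketch of the "slicing" step in well log tomography.

def normalize_depths(depths, top, base):
    """Map measured depths to a 0-1 stratigraphic coordinate."""
    return [(d - top) / (base - top) for d in depths]

def slice_amplitude(norm_depths, amplitudes, level):
    """Linearly interpolate the log amplitude at a fractional level."""
    for i in range(len(norm_depths) - 1):
        d0, d1 = norm_depths[i], norm_depths[i + 1]
        if d0 <= level <= d1:
            w = (level - d0) / (d1 - d0)
            return amplitudes[i] * (1 - w) + amplitudes[i + 1] * w
    raise ValueError("level outside the normalized interval")

# One synthetic well: gamma ray samples between interval top and base.
depths = [1000.0, 1010.0, 1020.0, 1030.0, 1040.0]
gamma = [20.0, 35.0, 25.0, 60.0, 30.0]
nd = normalize_depths(depths, top=1000.0, base=1040.0)
mid = slice_amplitude(nd, gamma, 0.5)   # amplitude at the 50% level
```

Repeating the extraction at each fractional level for every well yields the map-view "slices" that are then contoured.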

Relevance:

20.00%

Publisher:

Abstract:

This work was conducted to study the complications associated with sonic log prediction in carbonate rocks and to investigate possible ways to accurately predict sonic logs in the Traverse Limestone. Well logs from fifty different wells were analyzed to define the mineralogy of the Traverse Limestone using conventional 4-mineral and 3-mineral identification approaches. We modified the conventional 3-mineral identification approach (which completely neglects the gamma ray response) to correct for shale effects on the basis of the gamma ray log before employing the 3-mineral identification. This modification helped us gain meaningful insight into the data when DGA (dry grain density) was plotted against UMA (photoelectric volumetric cross-section) with the characteristic ternary diagram of quartz, calcite and dolomite. The results were then compared with the 4-mineral identification approach. Contour maps of the average mineral fractions present in the Traverse Limestone were prepared to show its basin-wide mineralogy. In the second part, the sonic response of the Traverse Limestone was predicted in fifty randomly distributed wells. We used a modified time average equation that accounts for shale effects on the basis of the gamma ray log to predict the sonic behavior from density porosity and average porosity. To account for the secondary porosity of dolomite, we subtracted the dolomitic fraction of clean porosity from the total porosity. The pseudo-sonic logs were then compared with the measured sonic logs on a root mean square (RMS) basis. Adding the dolomite correction to the modified time average equation improved the sonic predictions from neutron porosity and average porosity. The results demonstrated that sonic logs can be predicted in carbonate rocks with a root mean square error of about 4 μsec/ft.
We also attempted to use individual mineral components for sonic log prediction, but ambiguities in the mineral fractions and in the sonic properties of the minerals limited the accuracy of the results.
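The general shape of such a pseudo-sonic workflow can be sketched with the classic Wyllie time-average relation plus a linear gamma-ray shale index. The slowness constants and the exact form of the shale term here are assumptions for illustration, not the authors' own modified equation.

```python
import math

DT_MATRIX = 47.5   # limestone matrix slowness, usec/ft (typical value)
DT_FLUID = 189.0   # pore-fluid slowness, usec/ft (typical value)
DT_SHALE = 95.0    # assumed shale slowness, usec/ft

def shale_volume(gr, gr_clean=15.0, gr_shale=120.0):
    """Linear gamma-ray index as a rough shale-volume estimate."""
    v = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(v, 0.0), 1.0)

def pseudo_sonic(porosity, gr):
    """Time-average slowness with an assumed shale term (usec/ft)."""
    vsh = shale_volume(gr)
    matrix = max(1.0 - porosity - vsh, 0.0)
    return porosity * DT_FLUID + vsh * DT_SHALE + matrix * DT_MATRIX

def rms_error(predicted, measured):
    """Root mean square misfit between pseudo and measured sonic logs."""
    n = len(predicted)
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / n)
```

Comparing `pseudo_sonic` output against a measured sonic curve with `rms_error` mirrors the RMS comparison described in the abstract.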

Relevance:

20.00%

Publisher:

Abstract:

Free space optical (FSO) communication links can experience extreme signal degradation due to atmospheric-turbulence-induced spatial and temporal irradiance fluctuations (scintillation) in the laser wavefront. In addition, turbulence can cause the laser beam centroid to wander, resulting in power fading, and sometimes complete loss of the signal. Spreading of the laser beam and jitter are also artifacts of atmospheric turbulence. To accurately predict the signal fading that occurs in a laser communication system, and to get a true picture of how this affects crucial performance parameters like bit error rate (BER), it is important to analyze the probability density function (PDF) of the integrated irradiance fluctuations at the receiver. In addition, it is desirable to find a theoretical distribution that accurately models these fluctuations under all propagation conditions. The PDF of integrated irradiance fluctuations is calculated from numerical wave-optic simulations of a laser after propagating through atmospheric turbulence to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes from weak to very strong. Our results show that the gamma-gamma PDF provides a good fit to the simulated data distribution for all aperture sizes studied from weak through moderate scintillation. In strong scintillation, the gamma-gamma PDF is a better fit to the distribution for point-like apertures, and the lognormal PDF is a better fit for apertures the size of the atmospheric spatial coherence radius ρ0 or larger. In addition, the PDF of received power from a Gaussian laser beam, which has been adaptively compensated at the transmitter before propagation to the receiver of an FSO link in the moderate scintillation regime, is investigated.
The complexity of the adaptive optics (AO) system is increased in order to investigate the changes in the distribution of the received power and how this affects the BER. For the 10 km link, due to the non-reciprocal nature of the propagation path, the optimal beam to transmit is unknown. These results show that, for non-reciprocal paths, a low-order level of AO complexity provides a better estimate of the optimal beam to transmit than a higher order. For the 20 km link distance, all AO complexity levels provided an equivalent, although minimal, improvement in BER, and no AO complexity level provided the correction needed for the optimal beam to transmit. Finally, the temporal power spectral density of received power from an FSO communication link is investigated. Simulated and experimental results for the coherence time calculated from the temporal correlation function are presented. Results for both simulation and experimental data show that the coherence time increases as the receiving aperture diameter increases. For finite apertures the coherence time increases as the communication link distance is increased. We conjecture that this is due to the increasing speckle size within the pupil plane of the receiving aperture for an increasing link distance.
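Of the two candidate models compared above, the lognormal irradiance PDF is easy to sketch in closed form; here it is normalized to unit mean irradiance, with its implied scintillation index. (The gamma-gamma model requires a modified Bessel function of the second kind and is omitted from this stdlib-only sketch.)

```python
import math

def lognormal_pdf(i, sigma2):
    """PDF of irradiance I with <I> = 1 and log-irradiance variance sigma2."""
    if i <= 0:
        return 0.0
    mu = -sigma2 / 2.0   # enforces unit mean irradiance
    return (1.0 / (i * math.sqrt(2.0 * math.pi * sigma2))
            * math.exp(-(math.log(i) - mu) ** 2 / (2.0 * sigma2)))

def scintillation_index(sigma2):
    """Normalized irradiance variance implied by the lognormal model."""
    return math.exp(sigma2) - 1.0

# Numerical sanity check: the PDF integrates to ~1 over a wide range.
sigma2 = 0.2
step = 1e-3
total = sum(lognormal_pdf(k * step, sigma2) * step for k in range(1, 20000))
```

Fitting this PDF (or the gamma-gamma PDF) to a histogram of simulated receiver-plane power is the kind of comparison the study performs across scintillation regimes.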

Relevance:

20.00%

Publisher:

Abstract:

Four papers, written in collaboration with the author’s graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student’s, Pearson’s, and the non-central Hotelling’s, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models are considered which illustrate the use of such conditions. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
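The three correlation statistics compared in the third paper can be computed as follows; these are plain-Python versions without tie handling, for illustration only.

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def ranks(v):
    """Ranks 1..n of the values (no tie handling)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson applied to the ranks."""
    return pearson(ranks(x), ranks(y))

def kendall(x, y):
    """Kendall tau: normalized excess of concordant over discordant pairs."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

x = [1.0, 2.0, 3.0, 5.0]
y = [1.0, 4.0, 9.0, 25.0]   # monotone but nonlinear in x
```

On the monotone-but-nonlinear sample, the two rank statistics equal 1 while Pearson falls short of 1, which is the distinction the efficiency comparison turns on.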

Relevance:

20.00%

Publisher:

Abstract:

The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
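The distribution-free property stated above can be checked by simulation: for an i.i.d. (stationary, trend-free) series of length n, the expected number of record-breaking highs is the harmonic number H_n = 1 + 1/2 + ... + 1/n, whatever the underlying distribution.

```python
import random

def count_record_highs(series):
    """Count values strictly exceeding every earlier value."""
    count, best = 0, float("-inf")
    for v in series:
        if v > best:
            count, best = count + 1, v
    return count

def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

random.seed(42)
n, trials = 100, 10_000
avg = sum(count_record_highs([random.random() for _ in range(n)])
          for _ in range(trials)) / trials
# avg is close to harmonic(100), about 5.19
```

Trends in the mean or variance break this symmetry, which is what the indices described above are designed to detect.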

Relevance:

20.00%

Publisher:

Abstract:

This morning Dr. Battle will introduce descriptive statistics and linear regression and how to apply these concepts in mathematical modeling. You will also learn how to use a spreadsheet to help with statistical analysis and to create graphs.
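The session's two topics, descriptive statistics and simple least-squares linear regression, can be sketched in code rather than a spreadsheet; the sample data below are made up.

```python
import math

def describe(values):
    """Return the sample mean and sample standard deviation."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var)

def linear_fit(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
slope, intercept = linear_fit(x, y)   # about 1.94 and 0.15
```

These are the same calculations a spreadsheet performs with its built-in SLOPE, INTERCEPT, AVERAGE and STDEV functions.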

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: In Switzerland there is a shortage of population-based information on stroke incidence and case fatalities (CF). The aim of this study was to estimate stroke event rates and both in- and out-of-hospital CF rates. METHODS: Data on stroke diagnoses, coded as I60-I64 (ICD-10), were taken from the Federal Hospital Discharge Statistics database (HOST) and the Cause of Death database (CoD) for the year 2004. The total number of stroke events and the age- and gender-specific and age-standardised event rates were estimated; overall, in-hospital and out-of-hospital CF rates were determined. RESULTS: Of the overall 13 996 hospital discharges for stroke (HOST), fewer were in women (n = 6736) than in men (n = 7260). A total of 3568 deaths (2137 women and 1431 men) due to stroke were recorded in the CoD database. The number of estimated stroke events was 15 733, and was higher in women (n = 7933) than in men (n = 7800). Men presented significantly higher age-specific stroke event rates and a higher age-standardised event rate (178.7/100 000 versus 119.7/100 000). Overall CF rates were significantly higher for women (26.9%) than for men (18.4%). The same was true of out-of-hospital but not of in-hospital CF rates. CONCLUSION: The estimated stroke events indicate that the stroke discharge rate underestimates the stroke event rate. Out-of-hospital deaths from stroke accounted for the largest proportion of total stroke deaths. Sex differences in both the number of total stroke events and deaths could be explained by the higher proportion of women than men aged 55+ in the Swiss population.
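The reported case-fatality rates follow directly from the counts given in the abstract; a quick arithmetic check (small differences from the published figures reflect rounding in the original report).

```python
deaths = {"women": 2137, "men": 1431}   # stroke deaths, CoD database
events = {"women": 7933, "men": 7800}   # estimated stroke events

def case_fatality(group):
    """Deaths as a percentage of estimated stroke events."""
    return 100.0 * deaths[group] / events[group]

cf_women = case_fatality("women")   # about 26.9%
cf_men = case_fatality("men")       # about 18.3%
```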

Relevance:

20.00%

Publisher:

Abstract:

The rise of evidence-based medicine as well as important progress in statistical methods and computational power have led to a second birth of the >200-year-old Bayesian framework. The use of Bayesian techniques, in particular in the design and interpretation of clinical trials, offers several substantial advantages over the classical statistical approach. First, in contrast to classical statistics, Bayesian analysis allows a direct statement regarding the probability that a treatment was beneficial. Second, Bayesian statistics allow the researcher to incorporate any prior information in the analysis of the experimental results. Third, Bayesian methods can efficiently handle complex statistical models, which are suited for advanced clinical trial designs. Finally, Bayesian statistics encourage a thorough consideration and presentation of the assumptions underlying an analysis, which enables the reader to fully appraise the authors' conclusions. Both Bayesian and classical statistics have their respective strengths and limitations and should be viewed as being complementary to each other; we do not attempt to make a head-to-head comparison, as this is beyond the scope of the present review. Rather, the objective of the present article is to provide a nonmathematical, reader-friendly overview of the current practice of Bayesian statistics coupled with numerous intuitive examples from the field of oncology. It is hoped that this educational review will be a useful resource to the oncologist and result in a better understanding of the scope, strengths, and limitations of the Bayesian approach.
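A hedged, hypothetical illustration of the "direct probability" point above: with flat Beta(1, 1) priors and made-up response counts, the posterior probability that the treatment's response rate exceeds the control's can be estimated by Monte Carlo sampling from the two Beta posteriors. The trial numbers here are invented for illustration.

```python
import random

def prob_treatment_better(succ_t, n_t, succ_c, n_c, draws=20_000):
    """P(rate_treatment > rate_control) under independent Beta posteriors."""
    wins = 0
    for _ in range(draws):
        p_t = random.betavariate(1 + succ_t, 1 + n_t - succ_t)
        p_c = random.betavariate(1 + succ_c, 1 + n_c - succ_c)
        if p_t > p_c:
            wins += 1
    return wins / draws

random.seed(0)
# Hypothetical trial: 30/50 responses on treatment vs 18/50 on control.
p = prob_treatment_better(30, 50, 18, 50)
```

The result is a statement a clinician can use directly ("the probability the treatment is better is p"), in contrast to the indirect interpretation of a classical p-value.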