181 results for Generalized Hypergeometric Series


Relevance: 20.00%

Publisher:

Abstract:

This paper investigates several competing procedures for computing the prices of vanilla European options, such as puts, calls and binaries, in which the underlying model has a characteristic function that is known in semi-closed form. The algorithms investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. Their performance is assessed in simulation experiments in which an analytical solution is available and also for a simple affine model of stochastic volatility in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. However, there are two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a case for pricing options using a full-range Fourier series as opposed to a half-range Fourier cosine series if a large number of options are to be priced in as short a time as possible.
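For intuition, here is a minimal sketch of the half-range cosine series idea, not the paper's implementation: the density of the log return is recovered from the characteristic function as a Fourier cosine expansion on a truncation interval [a, b], and the discounted expected payoff is then integrated numerically. Black-Scholes dynamics are assumed so the result can be checked against the analytical price; the names char_fn and cos_density_price are ours.

```python
# Hedged sketch: half-range Fourier cosine recovery of the log-return density
# from a known characteristic function, assuming Black-Scholes dynamics.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 110.0, 0.05, 0.25, 0.25   # an out-of-the-money call

def char_fn(u):
    """Characteristic function of X = ln(S_T / S_0) under Black-Scholes."""
    return np.exp(1j * u * (r - 0.5 * sigma ** 2) * T - 0.5 * sigma ** 2 * u ** 2 * T)

def cos_density_price(a, b, n_terms=128, n_grid=4001):
    """Recover the density of X on [a, b] from its cosine coefficients, then
    integrate the discounted payoff against it (plain Riemann sum)."""
    k = np.arange(n_terms)
    u = k * np.pi / (b - a)
    coeffs = (2.0 / (b - a)) * np.real(char_fn(u) * np.exp(-1j * u * a))
    coeffs[0] *= 0.5                      # the k = 0 term is halved
    x = np.linspace(a, b, n_grid)
    density = coeffs @ np.cos(np.outer(u, x - a))
    payoff = np.maximum(S0 * np.exp(x) - K, 0.0)
    return np.exp(-r * T) * np.sum(payoff * density) * (x[1] - x[0])

# Analytical Black-Scholes price for comparison.
d1 = (np.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
print(cos_density_price(a=-1.0, b=1.0),
      S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2))
```

Replacing char_fn with the semi-closed-form characteristic function of an affine stochastic volatility model leaves the pricing routine unchanged, which is the appeal of this family of methods.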

Relevance: 20.00%

Publisher:

Abstract:

Background Corneal oedema is a common post-operative problem that delays or prevents visual recovery from ocular surgery. Honey is a supersaturated solution of sugars with an acidic pH, high osmolarity and low water content. These characteristics inhibit the growth of micro-organisms, reduce oedema and promote epithelialisation. This clinical case series describes the use of a regulatory-approved Leptospermum species honey ophthalmic product in the management of post-operative corneal oedema and bullous keratopathy. Methods A retrospective review was conducted of 18 consecutive cases (30 eyes) with corneal oedema persisting beyond one month after single or multiple ocular surgical procedures (phacoemulsification cataract surgery and additional procedures), treated with Optimel Antibacterial Manuka Eye Drops two to three times daily as an adjunctive therapy to conventional topical management with corticosteroid, aqueous suppressants, hypertonic sodium chloride five per cent, eyelid hygiene and artificial tears. Visual acuity and central corneal thickness were measured before and at the conclusion of Optimel treatment. Results A temporary reduction in corneal epithelial oedema lasting up to several hours was observed after the initial Optimel instillation and was associated with a reduction in central corneal thickness, resolution of epithelial microcysts, collapse of epithelial bullae, improved corneal clarity, improved visualisation of the intraocular structures and improved visual acuity. Additionally, with chronic use, a reduction in punctate epitheliopathy, a reduction in central corneal thickness and an improvement in visual acuity were achieved. Temporary stinging after Optimel instillation was experienced. No adverse infectious or inflammatory events occurred during treatment with Optimel. Conclusions Optimel was a safe and effective adjunctive therapeutic strategy in the management of persistent post-operative corneal oedema and warrants further investigation in clinical trials.

Relevance: 20.00%

Publisher:

Abstract:

We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice.
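The Gaussian pseudolikelihood criterion itself is simple to compute once residuals from a fitted marginal mean model are in hand. The sketch below is our own illustration, not the paper's software: balanced clusters, moment estimates of the correlation parameters, and reuse of one set of residuals across candidate structures are simplifying assumptions.

```python
# Hedged sketch of the Gaussian pseudolikelihood criterion for choosing a
# working correlation structure; `resid` is an (n_clusters x n_times) array of
# residuals from an already-fitted marginal mean model.
import numpy as np

def neg2_gaussian_pl(resid, R):
    """-2 x Gaussian pseudolikelihood of cluster residuals under working
    correlation R (smaller is better)."""
    n, t = resid.shape
    phi = np.mean(resid ** 2)                     # moment estimate of the scale
    V = phi * R
    _, logdet = np.linalg.slogdet(V)
    quad = np.einsum('ij,jk,ik->', resid, np.linalg.inv(V), resid)
    return n * logdet + quad

def candidate_correlations(resid):
    """Independence, exchangeable and AR(1) structures with moment estimates of alpha."""
    n, t = resid.shape
    std = resid / np.sqrt(np.mean(resid ** 2))
    off_diag = ~np.eye(t, dtype=bool)
    alpha_exch = np.mean([np.outer(r, r)[off_diag].mean() for r in std])
    alpha_ar1 = np.mean(std[:, 1:] * std[:, :-1])
    exch = np.full((t, t), alpha_exch)
    np.fill_diagonal(exch, 1.0)
    ar1 = alpha_ar1 ** np.abs(np.subtract.outer(np.arange(t), np.arange(t)))
    return {"independence": np.eye(t), "exchangeable": exch, "ar1": ar1}

# Toy residuals with true AR(1) dependence: the criterion should pick "ar1".
rng = np.random.default_rng(1)
true_R = 0.5 ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
resid = rng.multivariate_normal(np.zeros(4), true_R, size=200)
scores = {name: neg2_gaussian_pl(resid, R)
          for name, R in candidate_correlations(resid).items()}
print(min(scores, key=scores.get))
```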

Relevance: 20.00%

Publisher:

Abstract:

Objective To discuss generalized estimating equations as an extension of generalized linear models by commenting on the paper of Ziegler and Vens "Generalized Estimating Equations. Notes on the Choice of the Working Correlation Matrix". Methods An international group of experts was invited to comment on this paper. Results Several perspectives have been taken by the discussants. Econometricians have established parallels to the generalized method of moments (GMM). Statisticians discussed model assumptions and the aspect of missing data. Applied statisticians commented on practical aspects in data analysis. Conclusions In general, careful modeling of the correlation is encouraged when considering estimation efficiency and other implications, and a comparison of choosing instruments in GMM and generalized estimating equations (GEE) would be worthwhile. Some theoretical drawbacks of GEE need to be further addressed and require careful analysis of data. This particularly applies to the situation when data are missing at random.

Relevance: 20.00%

Publisher:

Abstract:

There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and the estimation of trends or determination of optimal sampling regimes impossible to assess. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates by minimizing the biases and making use of possible predictive variables. The load estimation procedure can be summarized by the following four steps (steps (iii) and (iv) are sketched in code after this abstract):

- (i) output the flow rates at regular time intervals (e.g. 10 minutes) using a time series model that captures all the peak flows;
- (ii) output the predicted flow rates as in (i) at the concentration sampling times, if the corresponding flow rates are not collected;
- (iii) establish a predictive model for the concentration data, which incorporates all possible predictor variables, and output the predicted concentrations at the regular time intervals as in (i); and
- (iv) obtain the sum of all the products of the predicted flow and the predicted concentration over the regular time intervals to represent an estimate of the load.

The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized regression (rating-curve) approach with additional predictors that capture unique features in the flow data, namely the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and cumulative discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. The model also has the capacity to accommodate autocorrelation in model errors which result from intensive sampling during floods. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. This method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach using concentrations of total suspended sediment (TSS) and nitrogen oxides (NOx) and gauged flow data from the Burdekin River, a catchment delivering to the Great Barrier Reef. The sampling biases for NOx concentrations range from 2 to 10 times, indicating severe bias. As expected, the traditional average and extrapolation methods produce much higher estimates than those obtained when the sampling bias is taken into account.
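A hedged sketch of steps (iii) and (iv): a rating-curve style regression for log concentration with the additional predictors described (a rising/falling-limb indicator and a cumulative discounted-flow exhaustion term), followed by the flow-times-concentration sum. The variable names, discount factor and synthetic data are illustrative only; the paper's model additionally handles autocorrelated errors and back-transformation bias, which are omitted here.

```python
# Illustrative sketch of the concentration rating curve (step iii) and the
# load sum (step iv); not the paper's implementation.
import numpy as np

def predict_load(flow, conc_obs, obs_idx, dt_seconds=600.0, discount=0.99):
    """flow: predicted flow (m^3/s) on a regular 10-minute grid; conc_obs:
    sampled concentrations (mg/L) taken at grid positions obs_idx."""
    log_q = np.log(flow)
    rising = np.r_[0.0, np.diff(flow) > 0].astype(float)   # rising/falling limb indicator
    cdf_q = np.zeros_like(flow)                            # cumulative discounted flow
    for t in range(1, len(flow)):
        cdf_q[t] = discount * cdf_q[t - 1] + flow[t]
    X = np.column_stack([np.ones_like(flow), log_q, rising, cdf_q])
    beta, *_ = np.linalg.lstsq(X[obs_idx], np.log(conc_obs), rcond=None)
    conc_pred = np.exp(X @ beta)        # mg/L on the full grid (log-bias correction omitted)
    # Step (iv): m^3/s x mg/L x s = g; divide by 1e6 to report tonnes.
    return np.sum(flow * conc_pred * dt_seconds) / 1e6

# Synthetic single-event example.
rng = np.random.default_rng(0)
flow = 50.0 + 400.0 * np.exp(-0.5 * ((np.arange(1000) - 400) / 80.0) ** 2)
obs_idx = np.arange(0, 1000, 50)
conc_obs = np.exp(0.5 * np.log(flow[obs_idx]) + rng.normal(0.0, 0.1, obs_idx.size))
print(predict_load(flow, conc_obs, obs_idx), "tonnes")
```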

Relevance: 20.00%

Publisher:

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE) because an inappropriate choice will lead to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure, and find that the performance of the QIC is deteriorated by a term that is theoretically independent of the correlation structures but has to be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves the QIC performance. Extensive simulation studies indicate that the CIC provides a remarkable improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
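The CIC is the trace term of the QIC, trace(Omega_I V_R), where Omega_I is the inverse of the model-based (naive) covariance of the regression estimates under the independence working structure and V_R is the robust sandwich covariance under the candidate structure. The sketch below uses statsmodels' GEE; the data frame and column names are hypothetical, and evaluating Omega_I at the independence fit is one common convention, not necessarily the authors' exact implementation.

```python
# Hedged sketch: compute CIC = trace(Omega_I V_R) for several candidate
# working correlation structures using statsmodels GEE.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def cic_scores(df, formula="y ~ x1 + x2", groups="id"):
    family = sm.families.Gaussian()
    # Model-based (naive) information evaluated at the independence fit.
    indep = smf.gee(formula, groups, df, family=family,
                    cov_struct=sm.cov_struct.Independence()).fit()
    omega_i = np.linalg.inv(indep.cov_naive)
    scores = {}
    for name, cs in [("independence", sm.cov_struct.Independence()),
                     ("exchangeable", sm.cov_struct.Exchangeable()),
                     ("ar1", sm.cov_struct.Autoregressive())]:
        res = smf.gee(formula, groups, df, family=family, cov_struct=cs).fit()
        scores[name] = np.trace(omega_i @ res.cov_robust)   # CIC: smaller is preferred
    return scores
```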

Relevance: 20.00%

Publisher:

Abstract:

Several articles in this journal have studied optimal designs for testing a series of treatments to identify promising ones for further study. These designs formulate testing as an ongoing process until a promising treatment is identified. This formulation is considered to be more realistic but substantially increases the computational complexity. In this article, we show that these new designs, which control the error rates for a series of treatments, can be reformulated as conventional designs that control the error rates for each individual treatment. This reformulation leads to a more meaningful interpretation of the error rates and hence easier specification of the error rates in practice. The reformulation also allows us to use conventional designs from published tables or standard computer programs to design trials for a series of treatments. We illustrate these using a study in soft tissue sarcoma.

Relevance: 20.00%

Publisher:

Abstract:

The article describes a generalized estimating equations approach that was used to investigate the impact of technology on vessel performance in a trawl fishery during 1988-96, while accounting for spatial and temporal correlations in the catch-effort data. Robust estimation of parameters in the presence of several levels of clustering depended more on the choice of cluster definition than on the choice of correlation structure within the cluster. Models with smaller cluster sizes produced stable results, while models with larger cluster sizes, that may have had complex within-cluster correlation structures and that had within-cluster covariates, produced estimates sensitive to the correlation structure. The preferred model arising from this dataset assumed that catches from a vessel were correlated in the same years and the same areas, but independent in different years and areas. The model that assumed catches from a vessel were correlated in all years and areas, equivalent to a random effects term for vessel, produced spurious results. This was an unexpected finding that highlighted the need to adopt a systematic strategy for modelling. The article proposes a modelling strategy of selecting the best cluster definition first, and the working correlation structure (within clusters) second. The article discusses the selection and interpretation of the model in the light of background knowledge of the data and utility of the model, and the potential for this modelling approach to apply in similar statistical situations.
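The two-step strategy (cluster definition first, working correlation second) can be organized as in the sketch below, which is our illustration rather than the article's code: the data frame and column names (catch, effort, year, area, vessel) are hypothetical, and the selection criterion applied at each step (for example the Gaussian pseudolikelihood or CIC sketched earlier) is left as a comment.

```python
# Hedged sketch of the proposed modelling strategy: compare cluster
# definitions first, then working correlation structures within the chosen
# definition, using statsmodels GEE on a hypothetical catch-effort data frame.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_gee(df, groups, cov_struct):
    return smf.gee("catch ~ effort + C(year)", groups, df,
                   family=sm.families.Gaussian(), cov_struct=cov_struct).fit()

def two_step(df: pd.DataFrame):
    # Step 1: candidate cluster definitions, from coarse (vessel across all
    # years and areas) to fine (vessel within a year and area).
    df = df.assign(
        vessel_year=df.vessel.astype(str) + "_" + df.year.astype(str),
        vessel_year_area=df.vessel.astype(str) + "_" + df.year.astype(str)
                         + "_" + df.area.astype(str))
    step1 = {g: fit_gee(df, g, sm.cov_struct.Exchangeable())
             for g in ["vessel", "vessel_year", "vessel_year_area"]}
    # ... compare the step-1 fits with a criterion such as the Gaussian
    # pseudolikelihood or CIC, then fix the chosen definition ...
    chosen = "vessel_year_area"
    # Step 2: compare working correlation structures within the chosen clusters.
    return {type(cs).__name__: fit_gee(df, chosen, cs)
            for cs in [sm.cov_struct.Independence(), sm.cov_struct.Exchangeable()]}
```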

Relevance: 20.00%

Publisher:

Abstract:

The extended title for this splendid visual feast is a catalogue to accompany the exhibition Postcards from the Edge of the City at the Santos Museum of Economic Botany, 9 December 2014 to 26 April 2015. As a catalogue this book contains the front and back sides of 300 postcards published between 1900 and 1917...

Relevance: 20.00%

Publisher:

Abstract:

In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate unconditional skewness. We consider modeling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional distribution exhibits skewness and nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible, explicit analytical expressions are provided for all third-order moments and cross-moments. Finally, we introduce a new tool, the shock impact curve, for investigating the impact of shocks on the conditional mean squared error of return series.
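To see the effect described, the simulation sketch below (ours, not the paper's) contrasts a symmetric GARCH(1,1), whose third-order moments are essentially zero, with a model whose conditional mean responds asymmetrically to shocks, which produces clear unconditional skewness and a nonzero lag-one third-order cross-moment. The specific model y_t = gamma*(eps_{t-1}^2 - 1) + eps_t is our illustrative choice.

```python
# Illustrative simulation: a nonlinear conditional-mean term generates
# unconditional skewness and nonzero third-order autocovariances, whereas a
# symmetric GARCH(1,1) conditional variance alone does not.
import numpy as np

rng = np.random.default_rng(42)
n, burn = 100_000, 1_000
eps = rng.standard_normal(n + burn)

# (a) GARCH(1,1) with a zero (symmetric) conditional mean.
omega, alpha, beta = 0.05, 0.05, 0.90
h = np.full(n + burn, omega / (1.0 - alpha - beta))
y_garch = np.zeros(n + burn)
for t in range(1, n + burn):
    h[t] = omega + alpha * y_garch[t - 1] ** 2 + beta * h[t - 1]
    y_garch[t] = np.sqrt(h[t]) * eps[t]

# (b) Asymmetric conditional mean: y_t = gamma * (eps_{t-1}^2 - 1) + eps_t.
gamma = 0.3
y_asym = gamma * (np.r_[0.0, eps[:-1]] ** 2 - 1.0) + eps

def third_moments(y, lags=(0, 1, 2)):
    """E[y_t * y_{t-k}^2] for the given lags; lag 0 is the skewness numerator."""
    y = y[burn:] - y[burn:].mean()
    return {k: float(np.mean(y[k:] * y[:len(y) - k] ** 2)) for k in lags}

print("symmetric GARCH:", third_moments(y_garch))   # all approximately zero
print("asymmetric mean:", third_moments(y_asym))    # nonzero at lags 0 and 1
```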

Relevance: 20.00%

Publisher:

Abstract:

Given the growing importance of the Chinese tourist market to Australia, an understanding of Chinese tourists' arrival patterns is essential to accurate forecasting of future arrivals. Drawing on 25 years of records (1991-2015), this study developed a time-series model of monthly arrivals of Chinese tourists in Australia. The model reflects the exponentially increasing trend and strong seasonality of arrivals. Validation of the model's forecasts produced excellent results, supporting the model's potential for policy prescription and management practice in the Australian tourism industry.
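One simple way to capture an exponential trend with stable multiplicative seasonality, offered here as a hedged sketch rather than the study's actual specification, is to regress the log of monthly arrivals on a linear trend and month-of-year dummies and exponentiate the forecasts. The series name and frequency handling below are assumptions.

```python
# Hedged sketch: log-linear trend plus monthly dummies for an exponentially
# trending, seasonal arrivals series; `arrivals` is a hypothetical monthly
# pandas Series indexed by month-start dates.
import numpy as np
import pandas as pd

def fit_and_forecast(arrivals: pd.Series, horizon: int = 12) -> pd.Series:
    t = np.arange(len(arrivals))
    months = arrivals.index.month
    X = np.column_stack([np.ones_like(t), t] +
                        [(months == m).astype(float) for m in range(2, 13)])
    beta, *_ = np.linalg.lstsq(X, np.log(arrivals.values), rcond=None)

    future_idx = pd.date_range(arrivals.index[-1] + pd.offsets.MonthBegin(),
                               periods=horizon, freq="MS")
    tf = np.arange(len(arrivals), len(arrivals) + horizon)
    Xf = np.column_stack([np.ones_like(tf), tf] +
                         [(future_idx.month == m).astype(float) for m in range(2, 13)])
    return pd.Series(np.exp(Xf @ beta), index=future_idx)
```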

Relevance: 20.00%

Publisher:

Abstract:

THE DRINKING DRIVER is a guide for listeners to the Adult Education radio series ONE FOR THE ROAD, a five-part series on drink-driving and Australia’s road toll. ONE FOR THE ROAD was produced by Lee Parker and Julie Levi, with assistance from the Federal Office of Road Safety in Canberra. The five programs, presented by Lee Parker, were first broadcast on ABC Radio National in January 1989, and repeated on Radio National and Regional Stations across Australia in April/May 1989. THE DRINKING DRIVER was written by Mark King, Senior Project Officer with the Road Safety Division of the South Australian Department of Transport.

Relevance: 20.00%

Publisher:

Abstract:

The c-Fos–c-Jun complex forms the activator protein 1 transcription factor, a therapeutic target in the treatment of cancer. Various synthetic peptides have been designed to try to selectively disrupt the interaction between c-Fos and c-Jun at its leucine zipper domain. To evaluate the binding affinity between these synthetic peptides and c-Fos, polarizable and nonpolarizable molecular dynamics (MD) simulations were conducted, and the resulting conformations were analyzed using the molecular mechanics generalized Born surface area (MM/GBSA) method to compute free energies of binding. In contrast to empirical and semiempirical approaches, the estimation of free energies of binding using a combination of MD simulations and the MM/GBSA approach takes into account dynamical properties such as conformational changes, as well as solvation effects and hydrophobic and hydrophilic interactions. The predicted binding affinities of the series of c-Jun-based peptides targeting the c-Fos peptide show good correlation with experimental melting temperatures. This provides the basis for the rational design of peptides based on internal, van der Waals, and electrostatic interactions.
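As a conceptual sketch only (the authors used MD simulations with an MM/GBSA workflow; this is not their code), the single-trajectory MM/GBSA estimate is the average over MD frames of the complex energy minus the receptor and ligand energies, each being a molecular-mechanics term plus a generalized Born and surface-area solvation term. The per-frame energies below are hypothetical placeholders.

```python
# Conceptual sketch of the MM/GBSA averaging step over MD frames.
import numpy as np

def mmgbsa_dG(e_complex, e_receptor, e_ligand):
    """Each argument: (n_frames,) array of E_MM + G_GB + G_SA (kcal/mol)."""
    per_frame = e_complex - e_receptor - e_ligand          # single-trajectory approach
    return per_frame.mean(), per_frame.std(ddof=1) / np.sqrt(per_frame.size)

# Hypothetical per-frame energies for one c-Jun-based peptide bound to c-Fos
# (500 frames), used only to make the sketch runnable.
rng = np.random.default_rng(0)
mean_dG, sem = mmgbsa_dG(rng.normal(-260.0, 8.0, 500),
                         rng.normal(-150.0, 6.0, 500),
                         rng.normal(-70.0, 5.0, 500))
print(f"dG_bind = {mean_dG:.1f} +/- {sem:.1f} kcal/mol")
```

Ranking several such estimates across the peptide series is what is then compared against the experimental melting temperatures.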

Relevance: 20.00%

Publisher:

Abstract:

The Clancestry Conversation series forms part of QPAC's Clancestry Festival which is an annual celebration of the arts and cultural practices of the world's First Nations Peoples with a particular focus on Aboriginal and Torres Strait Islander peoples.

Relevance: 20.00%

Publisher:

Abstract:

Two-dimensional (2D) transition metal oxide systems present exotic electronic properties and high specific surface areas, and also demonstrate promising applications ranging from electronics to energy storage. Yet, in contrast to other types of nanostructures, the question of whether 2D nanomaterials of atomic thickness can be assembled from molecules in a general way, which may endow them with interesting properties such as those of graphene, remains unresolved. Herein, we report a generalized and fundamental approach to the molecular self-assembly synthesis of ultrathin 2D nanosheets of transition metal oxides by rationally employing lamellar reverse micelles. It is worth emphasizing that the synthesized crystallized ultrathin transition metal oxide nanosheets possess confined thickness, high specific surface area and chemically reactive facets, so that they could have promising applications in nanostructured electronics, photonics, sensors, and energy conversion and storage devices.