Abstract:
The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and an often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the power to detect trends was increased to 90%: using the enhanced regression model proposed here, it was possible to detect a 20% trend over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
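To make the trend-power question concrete, the sketch below (Python, with illustrative parameter values, not the authors' compounding-errors model) simulates the power to detect a 20% decline over 20 years when the regression errors follow an AR(1) process. Note that the naive iid p-value used here overstates power when positive autocorrelation is ignored, which is precisely why modelling the error structure matters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years, per_year = 20, 26            # fortnightly sampling (assumed)
n = years * per_year
t = np.linspace(0, years, n)
beta = np.log(0.8) / years          # 20% total decline over 20 years
phi, sigma = 0.5, 0.4               # assumed AR(1) coefficient, innovation sd

def ar1_noise(n):
    """Generate AR(1) errors: x[i] = phi * x[i-1] + e[i]."""
    e = rng.normal(0.0, sigma, n)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

reps, hits = 1000, 0
for _ in range(reps):
    y = beta * t + ar1_noise(n)     # log-load anomaly with a downward trend
    fit = stats.linregress(t, y)
    hits += (fit.pvalue < 0.05) and (fit.slope < 0)
print(f"empirical power = {hits / reps:.2f}")  # naive iid test; see note above
```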
Abstract:
We consider the development of statistical models for predicting the constituent concentrations of riverine pollutants, a key step in load estimation from frequent flow-rate data and less frequently collected concentration data. We consider how to capture the impact of past flow patterns via the average discounted flow (ADF), which discounts past flux based on the time elapsed: more recent fluxes are given more weight. However, the effectiveness of the ADF depends critically on the choice of the discount factor, which reflects the unknown environmental accumulation process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R² value or the Nash-Sutcliffe model efficiency coefficient; the R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets from United States Geological Survey (USGS) gaging stations in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads, with biases from -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
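As an illustration, the sketch below computes an ADF under an assumed exponential-discounting form (the paper's exact definition may differ) and selects the discount factor on a grid by maximizing the adjusted R² of a concentration regression; all variable names and data shapes are illustrative.

```python
import numpy as np

def adf(flow, delta):
    """Exponentially discounted average of past flows (recursive form)."""
    out = np.empty(len(flow))
    out[0] = flow[0]
    for t in range(1, len(flow)):
        out[t] = delta * out[t - 1] + (1 - delta) * flow[t]
    return out

def adjusted_r2(y, X):
    """Adjusted R^2 of an OLS fit of y on X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1 - resid.var() / y.var()
    n, p = X1.shape
    return 1 - (1 - r2) * (n - 1) / (n - p)

def best_delta(log_conc, log_flow, flow, grid=np.linspace(0.5, 0.99, 50)):
    """Pick the discount factor maximizing adjusted R^2 of the regression
    of log-concentration on log-flow and the ADF."""
    scores = [adjusted_r2(log_conc,
                          np.column_stack([log_flow, adf(flow, d)]))
              for d in grid]
    return grid[int(np.argmax(scores))]
```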
Abstract:
Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing mean values may become statistically inappropriate, and even invalid, when substantial proportions of the response values are below the detection limits or otherwise censored, because strong distributional assumptions must be made about the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need to impute the censored values. As a demonstration, we applied the methods to a nutrient monitoring project that is part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that required by the traditional t-test, illustrating the merit of our method.
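One way to see why quantiles sidestep the censoring problem: whether the p-th quantile exceeds a compliance threshold depends only on how many observations fall above that threshold, so values censored below the detection limit carry exactly the information needed. The sketch below is a generic binomial quantile test illustrating this idea, not necessarily the paper's exact procedure.

```python
from scipy import stats

def quantile_exceedance_test(n_above, n_total, p=0.8):
    """Binomial test of H0: the p-th quantile is at or below the threshold.

    Only the count of observations above the threshold is needed, so
    below-detection-limit values require no imputation, provided the
    detection limit sits below the compliance threshold.
    """
    # Under H0, each observation exceeds the threshold with prob <= 1 - p.
    return stats.binomtest(n_above, n_total, 1 - p,
                           alternative="greater").pvalue

# e.g. 12 of 30 samples above the threshold when testing the 80th percentile
print(quantile_exceedance_test(12, 30))
```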
Abstract:
The charge at which adsorption of organic compounds attains a maximum (σ_M^max) at an electrochemical interface is analysed using several multi-state models in a hierarchical manner. The analysis is based on statistical mechanical results for the following models: (A) two-state site parity, (B) two-state multi-site, and (C) three-state site parity. The coulombic interactions due to permanent and induced dipole effects (using the mean field approximation), electrostatic field effects and specific substrate interactions have been taken into account. The simplest model in the hierarchy (two-state site parity) yields the explicit dependence of σ_M^max on the permanent dipole moment, the polarizability of the solvent and the adsorbate, the lattice spacing, the effective coordination number, etc. Other models in the hierarchy bring to light the influence of the solvent structure, the role of substrate interactions, etc. As a result of this approach, the "composition" of σ_M^max in terms of the fundamental molecular constants becomes clear. With a view to using these molecular results to maximum advantage, the derived results for σ_M^max have been converted into those involving experimentally observable parameters like C_0, C_1, E_N, etc. Wherever possible, some of the earlier phenomenological relations reported for σ_M^max, notably by Parsons, Damaskin and Frumkin, and Trasatti, are shown to have a certain molecular basis, viz. a simple two-state site parity model. As a corollary to the hierarchical modelling, σ_M^max and the potential corresponding to it (E_max) are shown to be constants independent of θ_max or C_org for all models. The implication of our analysis for σ_M^max with respect to that predicted by the generalized surface layer equation (which postulates variation of σ_M^max and E_max with θ) is discussed in detail. Finally, we discuss in passing σ_M^max and the electrosorption valency in this context.
Abstract:
A pressed-plate Fe electrode for alkaline storage batteries, designed using a statistical method (the fractional factorial technique), is described. Parameters such as the configuration of the base grid, the electrode compaction temperature and pressure, the binder composition, the mixing time, etc. have been optimised using this method. The optimised electrodes have a capacity of 300 ± 5 mA h/g of active material (a mixture of Fe and magnetite) at the 7 h rate to a cut-off voltage of -0.86 V vs. Hg/HgO, OH⁻.
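For concreteness, a minimal sketch of the kind of two-level fractional factorial design used in such screening studies: a 2^(4-1) half fraction with generator D = ABC (factor names are illustrative, not the paper's own).

```python
import itertools

# Four illustrative two-level factors (-1 = low, +1 = high)
factors = ["grid_type", "compaction_temp", "compaction_pressure", "mix_time"]

runs = []
for a, b, c in itertools.product([-1, 1], repeat=3):
    d = a * b * c                      # generator D = ABC, so I = ABCD
    runs.append(dict(zip(factors, (a, b, c, d))))

for run in runs:                       # 8 runs instead of the full 16
    print(run)
```

The half fraction trades the ability to separate high-order interactions for halving the number of experimental runs, which is the usual motivation for fractional designs in process optimisation.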
Abstract:
In this paper, we tackle the problem of unsupervised domain adaptation for classification. In the unsupervised scenario, where no labeled samples from the target domain are provided, a popular approach consists in transforming the data such that the source and target distributions become similar. To compare the two distributions, existing approaches make use of the Maximum Mean Discrepancy (MMD). However, this does not exploit the fact that probability distributions lie on a Riemannian manifold. Here, we propose to make better use of the structure of this manifold and rely on the distance on the manifold to compare the source and target distributions. In this framework, we introduce a sample selection method and a subspace-based method for unsupervised domain adaptation, and show that both these manifold-based techniques outperform the corresponding approaches based on the MMD. Furthermore, we show that our subspace-based approach yields state-of-the-art results on a standard object recognition benchmark.
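As a concrete instance of a manifold-based comparison (one common choice, not necessarily the authors' exact construction), the sketch below measures the discrepancy between source and target distributions as the log-Euclidean distance between their covariance matrices, which are symmetric positive-definite and hence live on a Riemannian manifold.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(X_src, X_tgt, eps=1e-6):
    """Distance between the covariance matrices of two samples on the
    SPD manifold, using the log-Euclidean metric."""
    logs = []
    for X in (X_src, X_tgt):
        C = np.cov(X, rowvar=False) + eps * np.eye(X.shape[1])  # regularise
        logs.append(logm(C).real)   # matrix log maps SPD -> symmetric matrices
    return np.linalg.norm(logs[0] - logs[1], "fro")

# e.g. 200 source and 150 target samples in a 10-dimensional feature space
rng = np.random.default_rng(0)
print(log_euclidean_distance(rng.normal(size=(200, 10)),
                             1.5 * rng.normal(size=(150, 10))))
```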
Abstract:
Purpose: A retrospective planning study comparing volumetric modulated arc therapy (VMAT) and stereotactic body radiotherapy (SBRT) treatment plans for non-small cell lung cancer (NSCLC). Methods and materials: Five randomly selected early-stage lung cancer patients were included in the study. For each patient, four plans were created: the SBRT plan and three VMAT plans using different optimisation methodologies, giving a total of 20 plans for evaluation. The dose conformity results and the target dose constraint results were compared across these plans. Results: The mean planning target volume (PTV) across all plans (SBRT and VMAT) was 18.3 cm³, with a range from 15.6 to 20.1 cm³. The maximum dose to 1 cc in all plans was within 140% (84 Gy) of the prescribed dose, and 95% of the PTV in all plans received 100% of the prescribed dose (60 Gy). In all plans, 99% of the PTV received a dose >90% of the prescribed dose, and the mean dose ranged from 67 to 72 Gy. The planning target dose conformity for the SBRT and VMAT (0° and 15° collimator single-arc, and dual-arc) plans demonstrated tight conformity of the prescription isodose to the target. Conclusions: SBRT and VMAT are radiotherapy approaches that increase doses to small tumour targets without increasing doses to the organs at risk. Although VMAT offers an alternative to SBRT for NSCLC, and its potential advantage is reduced treatment time, the statistical results show no significant difference between the SBRT and VMAT optimised plans in terms of dose conformity and organ-at-risk sparing.
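For reference, one standard way to quantify the conformity discussed above is the RTOG-style conformity index, the ratio of the prescription isodose volume to the PTV volume; the volumes in the sketch below are illustrative, not taken from the study.

```python
def conformity_index(prescription_isodose_volume_cc, ptv_volume_cc):
    """RTOG-style conformity index: prescription isodose volume / PTV volume.
    Values near 1 indicate the prescription dose tightly wraps the target;
    values well above 1 indicate dose spilling into normal tissue."""
    return prescription_isodose_volume_cc / ptv_volume_cc

# e.g. a 21.0 cc prescription isodose volume around the mean 18.3 cc PTV
print(f"CI = {conformity_index(21.0, 18.3):.2f}")   # CI = 1.15
```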
Abstract:
To facilitate marketing and export, the Australian macadamia industry requires accurate crop forecasts. Each year, two levels of crop prediction are produced for this industry. The first is an overall longer-term forecast based on tree census data for growers in the Australian Macadamia Society (AMS). This data set currently accounts for around 70% of total production and is supplemented by our best estimates for non-AMS orchards. Given these total tree numbers, average yields per tree are needed to complete the long-term forecasts. Yields from regional variety trials were initially used, but were found to be consistently higher than the average yields that growers were obtaining. Hence, a statistical model was developed using growers' historical yields, also taken from the AMS database. This model accounted for the effects of tree age, variety, year, region and tree spacing, and explained 65% of the total variation in the yield-per-tree data. The second level of crop prediction is an annual climate adjustment of these overall long-term estimates, taking into account the expected effects on production of the previous year's climate. This adjustment is based on relative historical yields, measured as the percentage deviance between expected and actual production. The dominant climatic variables are observed temperature, evaporation, solar radiation and modelled water stress. Initially, a number of alternate statistical models showed good agreement within the historical data, with jack-knife cross-validation R² values of 96% or better. However, forecasts varied quite widely between these alternate models. Exploratory multivariate analyses and nearest-neighbour methods were used to investigate these differences. For 2001-2003, the overall forecasts were in the right direction (when compared with the long-term expected values) but were overestimates. In 2004 the forecast was well under the observed production, and in 2005 the revised models produced a forecast within 5.1% of the actual production. Over the first five years of forecasting, the absolute deviance for the climate-adjustment models averaged 10.1%, just outside the targeted objective of 10%.
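A minimal sketch of the jack-knife (leave-one-out) cross-validated R² used to screen the alternate climate-adjustment regressions; the linear model here is illustrative, not the authors' exact specification.

```python
import numpy as np

def jackknife_r2(X, y):
    """Leave-one-out cross-validated R^2 for a linear regression."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept
    preds = np.empty(len(y))
    for i in range(len(y)):
        keep = np.arange(len(y)) != i            # drop observation i
        beta, *_ = np.linalg.lstsq(X1[keep], y[keep], rcond=None)
        preds[i] = X1[i] @ beta                  # predict the held-out point
    ss_res = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot
```

Because each point is predicted from a model fitted without it, this statistic penalises overfitting in a way the in-sample R² cannot, which is why the alternate models could agree in-sample yet diverge in forecasting.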
Abstract:
The recently introduced generalized pencil of Sudarshan, which gives an exact ray picture of wave optics, is analysed in some situations of interest in wave optics. A relationship between ray dispersion and the statistical inhomogeneity of the field is obtained. A paraxial approximation which preserves the rectilinear propagation character of the generalized pencils is presented. Under this approximation the pencils can be computed directly from the field conditions on a plane, without the need to compute the cross-spectral density function in the entire space as an intermediate quantity. The paraxial results are illustrated with examples. The pencils are shown to exhibit an interesting scaling behaviour in the far zone. This scaling leads to a natural generalization of the Fraunhofer range criterion and of the classical van Cittert-Zernike theorem to planar sources of arbitrary state of coherence. The recently derived results of radiometry with partially coherent sources are shown to be simple consequences of this scaling.
Abstract:
A major outcome of this project has been the identification and prioritisation of the major management issues related to the ecological impacts of fish stocking, and the elucidation of appropriate research methodologies that can be used to investigate these issues. This information is paramount to the development of the relevant research projects that will lead to stocking activities aligned with world's best practice, a requisite for ecologically sustainable recreational freshwater fisheries. In order to quantify the major management issues related to the sustainability of freshwater fish stocking, stakeholders from around Australia were identified and sent a questionnaire to determine which particular issues they regarded as important. These stakeholders included fisheries managers and researchers from Federal, Territory and State jurisdictions, although others, including representatives from environment and conservation agencies and peak recreational fishing and stocking groups, were also invited to give their opinions. The survey was completed in late 2007 and the results analysed to give a prioritised list of key management issues relating to the impacts of native fish stocking activities. In the analysis, issues which received high priority rankings were flagged as potential topics for discussion at a future expert workshop. The identified high-priority issues fell into the following core areas: marking techniques, genetics, population dynamics, introduction of pathogens and exotic biological material, and ecological, biological and conservation issues. The next planned outcome, determination of the most appropriate methodologies to address these core issues in research projects, was addressed through the outputs of an expert workshop held in early 2008. Participants at this workshop agreed on a range of methodologies for addressing priority sustainability issues and decided under what circumstances these methodologies should be employed.
Abstract:
In genetic epidemiology, population-based disease registries are commonly used to collect genotype and other risk factor information on affected subjects and their relatives. This work presents two new approaches to the statistical inference of ascertained data: conditional and full likelihood approaches for diseases with a variable age-at-onset phenotype, using familial data obtained from a population-based registry of incident cases. The aim is to obtain statistically reliable estimates of the general population parameters. The statistical analysis of familial data with variable age at onset becomes more complicated when some of the study subjects are non-susceptible, that is to say, these subjects never get the disease. A statistical model for variable age at onset with long-term survivors is proposed for studies of familial aggregation, using a latent variable approach, as well as for prospective genetic association studies with candidate genes. In addition, we explore the possibility of a genetic explanation for the observed increase in the incidence of Type 1 diabetes (T1D) in Finland in recent decades, and the hypothesis of non-Mendelian transmission of T1D-associated genes. Both classical and Bayesian statistical inference were used in the modelling and estimation. Although this work contains five studies with different statistical models, they all concern data obtained from nationwide registries of T1D and the genetics of T1D. In the analyses of the T1D data, non-Mendelian transmission of T1D susceptibility alleles was not observed. In addition, non-Mendelian transmission of T1D susceptibility genes did not provide a plausible explanation for the increase in T1D incidence in Finland. Instead, the Human Leucocyte Antigen associations with T1D were confirmed in the population-based analysis, which combines T1D registry information, a reference sample of healthy subjects and birth cohort information for the Finnish population. Finally, substantial familial variation in susceptibility to T1D nephropathy was observed. The presented studies show the benefits of sophisticated statistical modelling in exploring risk factors for complex diseases.
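A standard way to formalize the long-term survivor (non-susceptible) structure described above is a mixture "cure" model; the notation below is illustrative rather than the thesis's own. With π the probability that an individual is susceptible and S₀(t) the age-at-onset survival function among the susceptible, the population survival function is

```latex
% Mixture ("cure") survival model with a non-susceptible fraction.
% \pi   : probability that an individual is susceptible
% S_0(t): age-at-onset survival function among the susceptible
S(t) = (1 - \pi) + \pi \, S_0(t),
\qquad
\lim_{t \to \infty} S(t) = 1 - \pi > 0 .
```

The plateau at 1 - π captures the fraction of the population that never develops the disease, which is why ordinary survival models that force S(t) → 0 are inadequate for such data.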
Abstract:
The Baltic countries share public health problems typical of most Eastern European transition economies: morbidity and mortality from non-communicable diseases are higher than in Western European countries. This situation has many similarities to that of a neighbouring country, Finland, during the late 1960s. There are reasons to expect that health disadvantage may be increasing among the less advantaged population groups in the Baltic countries. The evidence on social differences in health in the Baltic countries is, however, scattered across studies using different methodologies, making comparisons difficult. This study aims to bridge the evidence gap by providing comparable, standardised cross-sectional and time-trend analyses of the social patterning of variation in health and two key health behaviours, i.e. smoking and drinking, in Estonia, Latvia, Lithuania and Finland in 1994-2004, representing Eastern European transition countries and a stable Western European country. The data consisted of similar cross-sectional postal surveys conducted in 1994, 1996, 1998, 2000, 2002 and 2004 on adult populations (aged 20-64 years) in Estonia (n=9049), Latvia (n=7685), Lithuania (n=11634) and Finland (n=18821) in connection with the Finbalt Health Monitor project. The main statistical method was logistic regression analysis. Perceived health was found to be worse among both men and women in the Baltic countries than in Finland. Poor health was associated with older age and lower education in all countries studied. Urbanisation and marital status were not consistently related to health. The existing educational inequalities in health remained generally stable over time from 1994 to 2004. In the Baltic countries, however, improvement in perceived health was mainly found among the better educated men and women. Daily smoking was associated with young age, lower education and psychological distress in all countries. Among women, smoking was also associated with urbanisation in all countries except Estonia. Among Lithuanian women, the educational gradient in smoking was weakest, and the overall prevalence of smoking increased over time. Drinking was generally associated with young age among men and women, and with education among women: better educated women were more often frequent drinkers, and less educated women binge drinkers. The exception was that among Latvian men and women both frequent drinking and binge drinking were associated with low education. In conclusion, the Baltic countries are likely to resemble Western European countries rather than other transition societies. While health inequalities did not markedly change, substantial inequalities remain, and there were indications of favourable developments mainly among the better educated. Pressures towards increasing health inequalities may therefore become visible in the future, which would be in accordance with the results on smoking and drinking in this study.
Abstract:
We present an introductory overview of several challenging problems in the statistical characterization of turbulence. We provide examples from fluid turbulence in three and two dimensions, from the turbulent advection of passive scalars, turbulence in the one-dimensional Burgers equation, and fluid turbulence in the presence of polymer additives.
Abstract:
The scalar-coupled proton NMR spectra of many organic molecules possessing more than one phenyl ring are generally complex, owing to the degeneracy of transitions arising from closely resonating protons in addition to the several short- and long-range couplings experienced by each proton. Analogous situations are generally encountered in derivatives of halogenated benzanilides. Extracting information from such spectra is challenging and demands the differentiation of the spectrum pertaining to each phenyl ring and the simplification of their spectral complexity. The present study employs a blend of independent spin-system filtering and the spin-state-selective detection of single quantum (SQ) transitions by the two-dimensional multiple quantum (MQ) methodology to achieve this goal. Precise values of scalar couplings of very small magnitude have been derived by double-quantum-resolved experiments. The experiments also provide the relative signs of the heteronuclear couplings. Studies on four isomers of dihalogenated benzanilides are reported in this work.