992 results for Statistical parameters
Abstract:
Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10° images of the ONH were obtained using the HRT. The mean topography image was determined, and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10° sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) than in the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 ± 3.4, respectively) than in the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line manually applied to the normalised rim area-disc area and the rim area-disc area ratio data correctly classified 100% of normal subjects and glaucoma patients. In this study sample, the regression analysis of ranked segment parameters was more effective than conventional ranked segment analysis, in which glaucoma patients were misclassified in approximately 50% of cases.
Further investigation in larger samples will enable the calculation of confidence intervals for normality. These reference standards will then need to be investigated for an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis, and subject to validation in larger scale studies, may potentially be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
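The ranked-segment curve fitting described above (rank the sectoral values in descending order, then fit a least-squares line whose slope and intercept serve as the summary descriptors) can be sketched as follows. The sectoral ratio values are hypothetical illustrations, not HRT output, and the classification thresholds from the study are not reproduced here.

```python
def ranked_segment_fit(sector_values):
    """Rank sectoral ONH values in descending order and fit a
    least-squares line to the ranked curve; the slope and intercept
    summarise the shape of the distribution.  A sketch of the
    approach, not the HRT software itself."""
    ranked = sorted(sector_values, reverse=True)
    n = len(ranked)
    ranks = range(1, n + 1)
    mean_x = sum(ranks) / n
    mean_y = sum(ranked) / n
    sxx = sum((x - mean_x) ** 2 for x in ranks)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ranks, ranked))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical rim area-disc area ratios for 36 ten-degree sectors:
# a glaucomatous eye would show a steeper decline across the ranks.
normal_like = [0.8 - 0.002 * i for i in range(36)]
glaucoma_like = [0.8 - 0.015 * i for i in range(36)]
```

A steeper (more negative) slope of the ranked curve then flags focal neural loss that a single averaged parameter would smooth away.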
Abstract:
We show that a short sequence of time cycles of pulses that modulate the medium parameters can strongly transform an electromagnetic signal. The backward wave is demonstrated to be an inevitable result of the temporal change of the medium. The dependence of the ratio of backward to forward waves on the parameters of the medium modulation is investigated. The finite statistical complexity of the electromagnetic signal transformed by a finite sequence of modulating cycles is calculated, and the complexity is shown to increase with the number of cycles.
Abstract:
This paper is dedicated to modelling network maintenance based on a live example: maintaining an ATM banking network, where any problem means monetary loss. A full analysis is made in order to separate valuable from non-valuable parameters based on a comprehensive analysis of the available data. Correlation analysis helps to evaluate the provided data and to produce a comprehensive solution for increasing the effectiveness of network maintenance.
Abstract:
2010 Mathematics Subject Classification: 60J80.
Abstract:
On 20 July 2010 the Hungarian Power Exchange (HUPX) began operation. On 16 August 2010, instead of the 45-60 EUR/MWh prices experienced in the first days, market participants faced prices of 2,999 EUR in certain hours. According to international experience, the appearance of extremely high prices is not unusual on power exchanges; indeed, research has focused on exploring the causes of these so-called price spikes and on their quantitative and qualitative analysis. In this article the author reviews the results of statistical studies of extreme prices in the literature and examines how their conclusions hold up against the time series of Hungarian prices. The author presents a model framework that describes the behaviour of electricity prices with distributions whose parameters vary periodically over the hours of the week. Unfortunately, the short history of the Hungarian Power Exchange does not allow a separate price distribution to be fitted for each hour of the week. The author therefore classifies the hours of the week into two groups based on the character of the price distribution: hours that are risky and less risky with respect to the appearance of price spikes. A deterministic two-state regime-switching model is then built to describe the HUPX prices, with which the risky and less risky hours can be identified and a picture obtained of the nature of extreme price movements.
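The two-group split of the hours of the week can be illustrated with a minimal sketch: label an hour "risky" when the share of observed prices above a spike threshold exceeds a cutoff. The threshold, the cutoff, and the sample prices below are illustrative assumptions, not the author's calibrated values.

```python
def classify_hours(prices_by_hour, spike_threshold=100.0, risky_share=0.05):
    """Label each hour of the week 'risky' or 'less risky' by the
    share of observed prices exceeding a spike threshold.  A
    simplified sketch of the deterministic two-group idea."""
    labels = {}
    for hour, prices in prices_by_hour.items():
        share = sum(p > spike_threshold for p in prices) / len(prices)
        labels[hour] = "risky" if share > risky_share else "less risky"
    return labels

# Hypothetical EUR/MWh observations for two hours of the week
sample = {
    "Mon-19h": [55, 62, 2999, 71, 2999, 58, 60, 65, 54, 59],
    "Sun-04h": [31, 28, 33, 30, 29, 27, 32, 30, 28, 31],
}
```

With a longer price history, a separate distribution could then be fitted within each of the two groups rather than per individual hour.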
Abstract:
The amount and quality of available biomass is a key factor for the sustainable livestock industry and for agricultural-management decision making. Globally, 31.5% of land cover is grassland, while 80% of Ireland's agricultural land is grassland. In Ireland, grasslands are intensively managed and provide the cheapest feed source for animals. This dissertation presents a detailed state-of-the-art review of satellite remote sensing of grasslands, and the potential application of optical (Moderate-Resolution Imaging Spectroradiometer (MODIS)) and radar (TerraSAR-X) time series imagery to estimate grassland biomass at two study sites (Moorepark and Grange) in the Republic of Ireland, using both statistical and state-of-the-art machine learning algorithms. High quality weather data available from the on-site weather station were also used to calculate the Growing Degree Days (GDD) for Grange, to determine the impact of ancillary data on biomass estimation. In situ and satellite data covering 12 years for the Moorepark and 6 years for the Grange study sites were used to predict grassland biomass using multiple linear regression and Adaptive Neuro-Fuzzy Inference System (ANFIS) models. The results demonstrate that a dense (8-day composite) MODIS image time series, along with high quality in situ data, can be used to retrieve grassland biomass with high performance (R² = 0.86, p < 0.05, RMSE = 11.07 for Moorepark). The model for Grange was modified to evaluate the synergistic use of vegetation indices derived from remote sensing time series and accumulated GDD information. As GDD is strongly linked to plant development, or phenological stage, an improvement in biomass estimation would be expected. It was observed that using the ANFIS model the biomass estimation accuracy increased from R² = 0.76 (p < 0.05) to R² = 0.81 (p < 0.05) and the root mean square error was reduced by 2.72%.
The work on the application of optical remote sensing was further developed using a TerraSAR-X Staring Spotlight mode time series over the Moorepark study site to explore the extent to which very high resolution Synthetic Aperture Radar (SAR) data of interferometrically coherent paddocks can be exploited to retrieve grassland biophysical parameters. After filtering out the non-coherent plots it is demonstrated that interferometric coherence can be used to retrieve grassland biophysical parameters (i.e., height, biomass), and that it is possible to detect changes due to grass growth, grazing, and mowing events when the temporal baseline is short (11 days). However, it is not possible to automatically and uniquely identify the cause of these changes based only on the SAR backscatter and coherence, owing to the ambiguity caused by tall grass flattened by the wind. Overall, the work presented in this dissertation has demonstrated the potential of dense remote sensing and weather data time series to predict grassland biomass using machine-learning algorithms, where high quality ground data were used for training. At present a major limitation for national scale biomass retrieval is the lack of spatial and temporal ground samples, which can be partially resolved by minor modifications to the existing PastureBaseIreland database, adding the location and extent of each grassland paddock. As far as remote sensing data requirements are concerned, MODIS is useful for large scale evaluation, but due to its coarse resolution it is not possible to detect the variations within and between fields at the farm scale. However, this issue will be resolved in terms of spatial resolution by the Sentinel-2 mission, and when both satellites (Sentinel-2A and Sentinel-2B) are operational the revisit time will reduce to 5 days, which, together with Landsat-8, should provide sufficient cloud-free data for operational biomass estimation at a national scale.
The Synthetic Aperture Radar Interferometry (InSAR) approach is feasible if enough coherent interferometric pairs are available; however, this is difficult to achieve due to the temporal decorrelation of the signal. For repeat-pass InSAR over a vegetated area, even an 11-day temporal baseline is too large. In order to achieve better coherence, a very high resolution is required at the cost of spatial coverage, which limits its scope for use in an operational context at a national scale. Future InSAR missions with pair acquisition in Tandem mode will minimise the temporal decorrelation over vegetated areas for more focused studies. The proposed approach complements the current paradigm of Big Data in Earth Observation, and illustrates the feasibility of integrating data from multiple sources. In future, this framework can be used to build an operational decision support system for retrieval of grassland biophysical parameters based on data from long-term planned optical missions (e.g., Landsat, Sentinel) that will ensure the continuity of data acquisition. Similarly, the Spanish X-band PAZ and TerraSAR-X2 missions will ensure continuity for TerraSAR-X and COSMO-SkyMed.
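The accumulated Growing Degree Days used as ancillary input above can be sketched with the standard daily-averaging formula. The base temperature of 5 °C below is an assumption for illustration, not necessarily the value used for the Grange site.

```python
def growing_degree_days(tmin, tmax, base=5.0):
    """Accumulate Growing Degree Days from daily minimum/maximum air
    temperatures (degrees C).  The averaging formula is the standard
    one; the base temperature is an assumed illustrative value."""
    total = 0.0
    for lo, hi in zip(tmin, tmax):
        # a day contributes only when its mean temperature exceeds the base
        total += max((lo + hi) / 2.0 - base, 0.0)
    return total
```

Feeding the running GDD total alongside the vegetation-index time series is what allows the model to account for the phenological stage of the sward.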
Abstract:
Goodness-of-fit tests have been studied by many researchers. Among them, an alternative statistical test for uniformity was proposed by Chen and Ye (2009). The test was used by Xiong (2010) to test normality for the case in which both the location and scale parameters of the normal distribution are known. The purpose of the present thesis is to extend the result to the case in which the parameters are unknown. A table of critical values of the test statistic is obtained using Monte Carlo simulation. The performance of the proposed test is compared with that of the Shapiro-Wilk test and the Kolmogorov-Smirnov test. Monte Carlo simulation results show that the proposed test performs better than the Kolmogorov-Smirnov test in many cases. The Shapiro-Wilk test is still the most powerful test, although in some cases the test proposed in the present research performs better.
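Tabulating critical values by Monte Carlo simulation, as done in the thesis, follows a generic recipe: simulate many samples under the null, compute the statistic for each, and take the empirical upper quantile. The statistic below (absolute sample skewness) is a stand-in for illustration only; the actual Chen and Ye (2009) statistic is not reproduced here.

```python
import random

def sample_statistic(xs):
    # Illustrative statistic: absolute sample skewness.
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    s3 = sum((x - m) ** 3 for x in xs) / n
    return abs(s3 / s2 ** 1.5)

def mc_critical_value(stat_fn, n, alpha=0.05, reps=2000, seed=42):
    """Approximate the upper-alpha critical value of a test statistic
    under the standard normal null by Monte Carlo simulation."""
    rng = random.Random(seed)
    stats = sorted(stat_fn([rng.gauss(0.0, 1.0) for _ in range(n)])
                   for _ in range(reps))
    return stats[int((1.0 - alpha) * reps) - 1]
```

Repeating this over a grid of sample sizes and significance levels yields a critical-value table like the one reported in the thesis.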
Abstract:
This work presents a computational code, called MOMENTS, developed for use in process control to determine a characteristic transfer function of industrial units when radiotracer techniques are applied to study the unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. A nonlinear regression technique was used to fit various mathematical models, and a statistical test was used to select the best result for the transfer function. Using the MOMENTS code, twelve different models can be used to fit a curve and calculate technical parameters for the unit.
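The first and second temporal moments of an RTD curve can be sketched from discretely sampled detector data with trapezoidal integration. This is a minimal illustration of the moment calculation only; the MOMENTS code's model fitting and statistical selection are not reproduced.

```python
def rtd_moments(times, conc):
    """First temporal moment (mean residence time) and second central
    moment (variance) of a tracer curve sampled at `times` with
    concentrations `conc`, using trapezoidal sums."""
    def trapz(y):
        return sum((y[i] + y[i + 1]) * (times[i + 1] - times[i]) / 2.0
                   for i in range(len(times) - 1))
    area = trapz(conc)
    e = [c / area for c in conc]                       # normalised RTD E(t)
    mean_t = trapz([t * ei for t, ei in zip(times, e)])
    var_t = trapz([(t - mean_t) ** 2 * ei for t, ei in zip(times, e)])
    return mean_t, var_t
```

The mean residence time and variance obtained this way are the quantities matched against each candidate transfer-function model during fitting.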
Abstract:
In physics, one attempts to infer the rules governing a system given only the results of imperfect measurements. Hence, microscopic theories may be effectively indistinguishable experimentally. We develop an operationally motivated procedure to identify the corresponding equivalence classes of states, and argue that the renormalization group (RG) arises from the inherent ambiguities associated with the classes: one encounters flow parameters as, e.g., a regulator, a scale, or a measure of precision, which specify representatives in a given equivalence class. This provides a unifying framework and reveals the role played by information in renormalization. We validate this idea by showing that it justifies the use of low-momenta n-point functions as statistically relevant observables around a Gaussian hypothesis. These results enable the calculation of distinguishability in quantum field theory. Our methods also provide a way to extend renormalization techniques to effective models which are not based on the usual quantum-field formalism, and elucidate the relationships between various types of RG.
Abstract:
The protein lysate array is an emerging technology for quantifying the protein concentration ratios in multiple biological samples. It is gaining popularity, and has the potential to answer questions about post-translational modifications and protein pathway relationships. Statistical inference for a parametric quantification procedure has been inadequately addressed in the literature, mainly due to two challenges: the increasing dimension of the parameter space and the need to account for dependence in the data. Each chapter of this thesis addresses one of these issues. In Chapter 1, an introduction to protein lysate array quantification is presented, followed by the motivations and goals of this thesis work. In Chapter 2, we develop a multi-step procedure for sigmoidal models, ensuring consistent estimation of the concentration level with full asymptotic efficiency. The results obtained in this chapter justify inferential procedures based on large-sample approximations. Simulation studies and real data analysis are used to illustrate the performance of the proposed method in finite samples. The multi-step procedure is simpler in both theory and computation than the single-step least squares method used in current practice. In Chapter 3, we introduce a new model that accounts for the dependence structure of the errors through a nonlinear mixed effects model. We consider a method to approximate the maximum likelihood estimator of all the parameters. Using simulation studies on various error structures, we show that for data with non-i.i.d. errors the proposed method leads to more accurate estimates and better confidence intervals than the existing single-step least squares method.
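A common concrete choice of sigmoidal response model in dilution-series quantification is the four-parameter logistic curve, sketched below. This particular parameterisation is an assumption for illustration; the thesis's exact sigmoidal model may differ.

```python
def four_param_logistic(x, a, b, c, d):
    """Four-parameter logistic response curve:
    a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = slope factor.
    An assumed illustrative form of a sigmoidal model."""
    return d + (a - d) / (1.0 + (x / c) ** b)
```

Estimating (a, b, c, d) jointly with the per-sample concentration levels is what drives the growing parameter dimension that the multi-step procedure is designed to handle.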
Abstract:
The sea surface temperature (SST) and chlorophyll-a concentration (CHL-a) were analysed in the Gulf of Tadjourah from two sets of 8-day composite satellite data, from 2008 to 2012 and from 2005 to 2011, respectively. A singular spectrum analysis (SSA) shows that the annual cycle of SST is strong (74.3% of variance) and consists of warming (April-October) and cooling (November-March) of about 2.5 °C relative to the long-term average. The semi-annual cycle captures only 14.6% of the temperature variance and emphasises the drop of SST during July-August. Similarly, the annual cycle of CHL-a (29.7% of variance) depicts high CHL-a from June to October and low concentrations from November to May. In addition, the first spatial empirical orthogonal function (EOF) of SST (93% of variance) shows that the seasonal warming/cooling is in phase across the whole study area, but the southeastern part always remains warmer or cooler. In contrast to the SST, the first EOF of CHL-a (54.1% of variance) indicates that the continental shelf is in phase opposition with the offshore area in winter, during which the CHL-a remains sequestered in the coastal area, particularly in the south-east and in the Ghoubet Al-Kharab Bay. Conversely, during summer, higher CHL-a quantities appear in the offshore waters. In order to investigate the processes generating these patterns, a multichannel spectrum analysis was applied to a set of oceanic (SST, CHL-a) and atmospheric parameters (wind speed, air temperature and air specific humidity). This analysis shows that the SST is well correlated with the atmospheric parameters at an annual scale. The windowed cross-correlation indicates that this correlation is significant only from October to May. During this period, the warming was related to solar heating of the surface water when the wind is low (April-May and October), while the cooling (November-March) was linked to the strong and cold north-east winds and to convective mixing.
The summer drop in SST, followed by a peak of CHL-a, seems strongly correlated with the upwelling. The second EOF modes of SST and CHL-a explain 1.3% and 5% of the variance, respectively, and show an east-west gradient during winter that is reversed during summer. This work showed that the seasonal signals have a wide spatial influence and dominate the variability of the SST and CHL-a, while the east-west gradient is specific to the Gulf of Tadjourah and seems induced by the local wind modulated by the topography.
Abstract:
Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. Rainflow counting methods were originally developed to assess fatigue damage associated with mechanical cycling where creep of the material under load was not considered to be a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling. In this case, fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell time estimate for use with time-dependent fatigue damage models. To show its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to provide data reduction of the rainflow counting results. This algorithm uses a damage model and a statistical test to determine which of the resulting cycles are statistically insignificant at a given confidence level. This makes the resulting data file smaller and allows a simplified load history to be reconstructed.
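The classical three-point rainflow procedure that the study extends can be sketched as follows. This is a simplified textbook version (turning-point reduction plus stack-based cycle extraction); the study's multi-parameter, dwell-time-aware algorithm is not reproduced here.

```python
def rainflow_cycles(series):
    """Simplified three-point rainflow count.  Returns a list of
    (range, count) pairs; residual reversals are counted as half
    cycles.  A textbook sketch, not the study's algorithm."""
    # Reduce the load history to turning points (peaks and valleys).
    tp = [series[0]]
    for x in series[1:]:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x          # still moving in the same direction
        elif x != tp[-1]:
            tp.append(x)
    cycles = []
    stack = []
    for point in tp:
        stack.append(point)
        while len(stack) >= 3:
            x_rng = abs(stack[-1] - stack[-2])   # current range X
            y_rng = abs(stack[-2] - stack[-3])   # previous range Y
            if x_rng < y_rng:
                break
            cycles.append((y_rng, 1.0))          # count Y as a full cycle
            del stack[-3:-1]                     # drop its two points
    for a, b in zip(stack, stack[1:]):           # residue: half cycles
        cycles.append((abs(b - a), 0.5))
    return cycles
```

The study's extension attaches a dwell time to each extracted range so that creep-aware damage models can be applied per cycle.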
Abstract:
Fifty-six specimens of the hybrid tambacu (Piaractus mesopotamicus Holmberg, 1887 male x Colossoma macropomum Cuvier, 1818 female) were collected from a fish farm in Guariba, São Paulo, to evaluate their haematology. Fish ranged from 400.0 to 3,100.0 g in total weight and from 20.0 to 52.0 cm in total length. Haemoglobin, haematocrit, mean corpuscular haemoglobin concentration (MCHC), and the percentage of defence blood cells, including leucocytes and thrombocytes, were studied. Statistical analysis showed a positive correlation (P<0.01) between haematocrit, MCHC and haemoglobin. Nevertheless, thrombocytes and lymphocytes showed a negative correlation (P<0.01).
Abstract:
The multivariate normal distribution is commonly encountered in many fields, and missing values are a frequent issue in practice. The purpose of this research was to estimate the parameters of a three-dimensional normal distribution with permutation-symmetric covariance, with complete data and with all possible patterns of incomplete data. In this study, MLEs with missing data were derived, and the properties of the MLEs as well as their sampling distributions were obtained. A Monte Carlo simulation study was used to evaluate the performance of the considered estimators for both the case in which ρ was known and the case in which it was unknown. All results indicated that, compared with estimators obtained by omitting observations with missing data, the estimators derived in this article performed better. Furthermore, when ρ was unknown, using the estimate of ρ led to the same conclusion.
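For the complete-data case, the MLE under a permutation-symmetric (compound-symmetry) covariance pools the diagonal and off-diagonal sample moments, which can be sketched as below. This covers only the complete-data case; the thesis's missing-data estimators are not reproduced.

```python
def compound_symmetry_mle(data):
    """MLE of (mu, sigma^2, rho) for a 3-dimensional normal with
    permutation-symmetric (compound-symmetry) covariance from
    complete data: pool the diagonal sample variances and the
    off-diagonal sample covariances."""
    n, p = len(data), 3
    mu = [sum(row[j] for row in data) / n for j in range(p)]
    # pooled MLE variance: average of the three diagonal terms
    s2 = sum((row[j] - mu[j]) ** 2 for row in data for j in range(p)) / (n * p)
    # pooled off-diagonal covariance over the three pairs
    pairs = [(0, 1), (0, 2), (1, 2)]
    cov = sum(
        sum((row[j] - mu[j]) * (row[k] - mu[k]) for row in data) / n
        for j, k in pairs) / len(pairs)
    return mu, s2, cov / s2
```

With incomplete data the likelihood no longer factors this neatly, which is what motivates the pattern-by-pattern derivations in the thesis.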
Abstract:
Mine drainage is an important environmental disturbance that affects the chemical and biological components of natural resources. However, little is known about the effects of neutral mine drainage on the soil bacterial community. Here, a high-throughput 16S rDNA pyrosequencing approach was used to evaluate differences in the composition, structure, and diversity of bacterial communities in samples from a neutral drainage channel, and from soil next to the channel, at the Sossego copper mine in Brazil. Advanced statistical analyses were used to explore the relationships between the biological and chemical data. The results showed that the neutral mine drainage caused changes in the composition and structure of the microbial community, but not in its diversity. The Deinococcus/Thermus phylum, especially the Meiothermus genus, was in large part responsible for the differences between the communities, and was positively associated with the presence of copper and other heavy metals in the environmental samples. Other important parameters that influenced the bacterial diversity and composition were the elements potassium, sodium, nickel, and zinc, as well as pH. The findings contribute to the understanding of bacterial diversity in soils impacted by neutral mine drainage, and demonstrate that heavy metals play an important role in shaping the microbial population in mine environments.