971 results for order statistics
Abstract:
Purpose: Poor image quality in the peripheral field may lead to myopia. Most studies measuring higher order aberrations in the periphery have been restricted to the horizontal visual field. The purpose of this study was to measure higher order monochromatic aberrations across the central 42° horizontal × 32° vertical visual field in myopes and emmetropes.

Methods: We recruited 5 young emmetropes with spherical equivalent refractions of +0.17 ± 0.45 D and 5 young myopes with spherical equivalent refractions of -3.9 ± 2.09 D. Measurements were taken with a modified COAS-HD Hartmann-Shack aberrometer (Wavefront Sciences Inc.) while the subjects looked at 38 points arranged in a 7 × 6 matrix (excluding the four corner points) through a beam splitter held between the instrument and the eye. A combination of the instrument's software and our own software was used to estimate OSA Zernike coefficients for a 5 mm pupil diameter at 555 nm for each point; the software took into account the elliptical shape of the off-axis pupil. Nasal and superior fields were taken to have positive x and y signs, respectively.

Results: The total higher order RMS (HORMS) was similar on-axis for emmetropes (0.16 ± 0.02 μm) and myopes (0.17 ± 0.02 μm). There was no common pattern in HORMS across the visual field for emmetropes, whereas 4 out of 5 myopes showed a linear increase in HORMS in all directions away from the minimum. For all subjects, vertical and horizontal coma changed linearly across the visual field. The mean rate of change of vertical coma across the vertical meridian was significantly lower (p = 0.008) for emmetropes (-0.005 ± 0.002 μm/deg) than for myopes (-0.013 ± 0.004 μm/deg). The mean rate of change of horizontal coma across the horizontal meridian was lower (p = 0.07) for emmetropes (-0.006 ± 0.003 μm/deg) than for myopes (-0.011 ± 0.004 μm/deg).

Conclusion: We found differences in the patterns of higher order aberrations across the visual fields of emmetropes and myopes, with myopes showing greater rates of change of horizontal and vertical coma.
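The quantities reported above reduce to simple calculations on the fitted Zernike coefficients. The sketch below is a minimal illustration, not the study's analysis software, and all numbers in it are hypothetical: HORMS is taken as the root-sum-square of the coefficients of radial order three and above, and the rate of change of a coma coefficient is the slope of a linear fit against field angle.

```python
import numpy as np

def higher_order_rms(zernike_coeffs_um, radial_orders):
    """Root-sum-square of OSA Zernike coefficients with radial order >= 3.

    Because the Zernike polynomials are orthonormal over the pupil, the total
    higher-order RMS wavefront error is the root-sum-square of the
    higher-order coefficients.
    """
    coeffs = np.asarray(zernike_coeffs_um, dtype=float)
    orders = np.asarray(radial_orders)
    return np.sqrt(np.sum(coeffs[orders >= 3] ** 2))

def coma_rate_of_change(field_angles_deg, coma_coeffs_um):
    """Slope (micrometres per degree) of a linear fit of coma vs. field angle."""
    slope, _intercept = np.polyfit(field_angles_deg, coma_coeffs_um, 1)
    return slope

# Hypothetical vertical coma values sampled along the vertical meridian.
angles = np.array([-16.0, -8.0, 0.0, 8.0, 16.0])            # degrees
vertical_coma = np.array([0.21, 0.10, 0.01, -0.09, -0.20])  # micrometres
print(coma_rate_of_change(angles, vertical_coma))            # roughly -0.013 um/deg
```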
Abstract:
Productivity is basic statistical information for many international comparisons and country performance assessments. This study estimates the construction labour productivity of 79 selected economies. The real (purchasing power parity converted) and nominal construction expenditure from the report of the 2005 International Comparison Programme published by the World Bank, and construction employment from the labour statistics database (LABORSTA) operated by the Bureau of Statistics of the International Labour Organization, were used in the estimation. Inferential statistics indicate that nominal construction labour productivity does not simply descend from high income economies to low income economies. The average construction labour productivity of low income economies is higher than that of middle income economies when the calculation uses purchasing power parity converted data. Malaysia ranked 50th and 63rd among the 79 selected economies on the real and nominal measures, respectively.
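At its core, the productivity figure is the ratio of construction expenditure to construction employment, computed once with nominal expenditure and once with the PPP-converted ("real") expenditure. The sketch below illustrates that calculation and the ranking step with hypothetical placeholder figures, not values from the ICP report or LABORSTA.

```python
# Hypothetical economies: (nominal expenditure, PPP-converted expenditure, employment).
economies = {
    "Economy A": (50_000, 42_000, 1_200),
    "Economy B": (18_000, 21_000, 900),
    "Economy C": (6_000, 11_000, 700),
}

def labour_productivity(expenditure, employment):
    # Output per worker: construction expenditure divided by construction employment.
    return expenditure / employment

nominal = {name: labour_productivity(n, emp) for name, (n, _r, emp) in economies.items()}
real = {name: labour_productivity(r, emp) for name, (_n, r, emp) in economies.items()}

# Rank economies from highest to lowest productivity on each measure.
for label, table in (("nominal", nominal), ("real (PPP)", real)):
    ranking = sorted(table, key=table.get, reverse=True)
    print(label, ranking)
```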
Abstract:
Not all companies in Australia are amenable to a winding up order pursuant to the Corporations Act 2001 (Cth). The Supreme Court of New South Wales has previously dealt with such winding up applications by apparently focusing on the court's inherent jurisdiction in determining whether it has jurisdiction to consider the winding up application in the first place. This article proposes an original alternative paradigm: the plenary power provided to the court by s 23 of the Supreme Court Act 1970 (NSW) can be utilised to initially attract the jurisdiction of the court, and subsequently the inherent jurisdiction, specifically utilising the equitable “just and equitable” ground, is available to the court to consider and make such a winding up order if appropriate. A variation of such a paradigm may also be available to the court when considering the inherent jurisdiction in relation to corporation matters more generally.
Abstract:
Robust image hashing seeks to transform a given input image into a shorter hashed version using a key-dependent non-invertible transform. These image hashes can be used for watermarking, image integrity authentication or image indexing for fast retrieval. This paper introduces a new method of generating image hashes based on extracting Higher Order Spectral features from the Radon projection of an input image. The feature extraction process is non-invertible and non-linear, and different hashes can be produced from the same image through the use of random permutations of the input. We show that the transform is robust to typical image transformations such as JPEG compression, noise, scaling, rotation, smoothing and cropping. We evaluate our system using a verification-style framework based on calculating false match and false non-match likelihoods using the publicly available Uncompressed Colour Image database (UCID) of 1320 images. We also compare our results to Swaminathan’s Fourier-Mellin based hashing method, showing at least a 1% EER improvement under noise, scaling and sharpening.
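As a rough illustration of the kind of pipeline described above, and not the paper's actual algorithm, the sketch below takes key-selected Radon projections of an image, computes a crude bispectrum-slice statistic from each projection, and permutes and thresholds the pooled features into a binary hash. The angle selection, the bispectrum slice and the quantisation step are all assumptions made for the example.

```python
import numpy as np
from skimage.transform import radon  # Radon projections of the input image

def radon_hos_hash(image, key, n_angles=32):
    """Sketch of a key-dependent hash from higher-order spectral features of
    Radon projections. Illustrative only; not the published algorithm."""
    rng = np.random.default_rng(key)
    angles = np.sort(rng.uniform(0.0, 180.0, size=n_angles))       # key-dependent angles
    sinogram = radon(image.astype(float), theta=angles, circle=False)  # columns = projections

    features = []
    for j in range(sinogram.shape[1]):
        x = sinogram[:, j]
        X = np.fft.rfft(x - x.mean())
        k = np.arange(1, len(X) // 2)                        # keep 2k inside the spectrum
        bispectrum_slice = X[k] * X[k] * np.conj(X[2 * k])   # crude B(k, k) estimate
        features.append(np.log1p(np.abs(bispectrum_slice)).mean())
    features = np.asarray(features)

    features = features[rng.permutation(n_angles)]           # key-dependent permutation
    return (features > np.median(features)).astype(np.uint8) # one bit per angle

# Example with a random array standing in for a greyscale image.
bits = radon_hos_hash(np.random.default_rng(1).random((128, 128)), key=42)
print(bits)
```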
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities in human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project them onto the outermost lattice shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.

For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
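One concrete step in the pipeline above is fitting a generalized Gaussian model to the wavelet coefficients of each detail subband before quantization. The sketch below is a minimal illustration of that idea using PyWavelets for the decomposition and a maximum-likelihood fit from SciPy; the thesis instead estimates the shape parameter with a least squares formulation, and the wavelet filter and decomposition depth used here are assumptions, not the fixed structure designed in the work.

```python
import numpy as np
import pywt                      # PyWavelets: wavelet decomposition
from scipy.stats import gennorm  # generalized Gaussian (generalized normal) model

def subband_ggd_models(image, wavelet="bior4.4", level=3):
    """Fit a generalized Gaussian distribution to every detail subband.

    Returns a dict mapping subband names to (shape, loc, scale) parameters.
    The maximum-likelihood fit stands in for the least squares shape-parameter
    estimator described in the abstract.
    """
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet=wavelet, level=level)
    models = {}
    # coeffs[0] is the approximation; the remaining tuples go from coarsest to finest.
    for depth, (ch, cv, cd) in zip(range(level, 0, -1), coeffs[1:]):
        for name, band in (("horizontal", ch), ("vertical", cv), ("diagonal", cd)):
            shape, loc, scale = gennorm.fit(band.ravel())
            models[f"level{depth}_{name}"] = (shape, loc, scale)
    return models

# Example with a random array standing in for a fingerprint image.
params = subband_ggd_models(np.random.default_rng(0).normal(size=(128, 128)))
for subband, (shape, _loc, scale) in params.items():
    print(f"{subband}: shape={shape:.2f}, scale={scale:.3f}")
```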
Abstract:
In order to estimate the safety impact of roadway interventions, engineers need to collect, analyze, and interpret the results of carefully implemented data collection efforts. The intent of these studies is to develop Accident Modification Factors (AMFs), which are used to predict the safety impact of various road safety features at other locations or for future enhancements. Models are typically estimated to obtain AMFs for total crashes, but they can and should be estimated for specific crash outcomes as well. This paper first describes data collected with the intent to estimate AMFs for rural intersections in the state of Georgia within the United States. Modeling results of crash prediction models for the crash outcomes angle, head-on, rear-end, sideswipe (same direction and opposite direction) and pedestrian-involved crashes are then presented and discussed. The analysis reveals that factors such as the Annual Average Daily Traffic (AADT), the presence of turning lanes, and the number of driveways have a positive association with each type of crash, while the median width and the presence of lighting are negatively associated with crashes. The model covariates are related to the crash outcomes in different ways, suggesting that crash outcomes are associated with different pre-crash conditions.
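The abstract does not state the model form, but crash prediction models of this kind are commonly specified as count regressions of crash frequency on intersection covariates, with the exponentiated coefficient of an indicator covariate interpreted as an approximate accident modification factor. The sketch below fits a negative binomial GLM to entirely hypothetical data whose column names mirror the covariates mentioned above; it illustrates the general modeling approach, not the paper's dataset or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical intersection-level data; names echo the covariates discussed above.
rng = np.random.default_rng(0)
n = 200
data = pd.DataFrame({
    "aadt": rng.uniform(2_000, 20_000, n),          # annual average daily traffic
    "turn_lanes": rng.integers(0, 2, n),            # turning lanes present (0/1)
    "driveways": rng.integers(0, 8, n),             # number of driveways
    "median_width_ft": rng.uniform(0.0, 30.0, n),   # median width
    "lighting": rng.integers(0, 2, n),              # lighting present (0/1)
})

# Simulate crash counts from an assumed log-linear mean, then fit the model.
mean = np.exp(-2.0 + 0.00008 * data["aadt"] + 0.15 * data["turn_lanes"]
              + 0.05 * data["driveways"] - 0.01 * data["median_width_ft"]
              - 0.20 * data["lighting"])
data["angle_crashes"] = rng.poisson(mean)

X = sm.add_constant(data[["aadt", "turn_lanes", "driveways",
                          "median_width_ft", "lighting"]])
fit = sm.GLM(data["angle_crashes"], X,
             family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(fit.params)
# For an indicator covariate such as lighting, exp(coefficient) approximates
# the accident modification factor implied by the fitted model.
print(np.exp(fit.params["lighting"]))
```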
Abstract:
Now in its sixth edition, the Traffic Engineering Handbook continues to be a must-have publication in the transportation industry, as it has been for the past 60 years. The new edition provides updated information for people entering the practice and for those already practicing. The handbook is a convenient desk reference, as well as an all-in-one source of principles and proven techniques in traffic engineering. Most chapters are presented in a new format, which divides the chapters into four areas: basics, current practice, emerging trends and information sources. Chapter topics include road users, vehicle characteristics, statistics, planning for operations, communications, safety, regulations, traffic calming, access management, geometrics, signs and markings, signals, parking, traffic demand, maintenance and studies. In addition, as the focus in transportation has shifted from project-based to operations-based, two new chapters have been added: "Planning for Operations" and "Managing Traffic Demand to Address Congestion: Providing Travelers with Choices." The Traffic Engineering Handbook continues to be one of the primary reference sources for study to become a certified Professional Traffic Operations Engineer™. Chapters are written by notable and experienced authors, and reviewed and edited by a distinguished panel of traffic engineering experts.