940 results for Transform statistics
Abstract:
The breakdown of the usual method of Fourier transforms in the problem of an external line crack in a thin infinite elastic plate is demonstrated, and the correct solution of this problem is derived using the concept of a generalised Fourier transform of a type first discussed by Golecki [1] in connection with Flamant's problem.
Abstract:
Quantization formats of four digital holographic codes (Lohmann, Lee, Burckhardt and Hsueh-Sawchuk) are evaluated. A quantitative assessment is made from errors in both the Fourier transform and image domains. In general, small errors in the Fourier amplitude or phase alone do not guarantee high image fidelity. From quantization considerations, the Lee hologram is shown to be the best choice for randomly phase-coded objects. When phase coding is not feasible, the Lohmann hologram is preferable, as it is easier to plot.
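A toy numpy sketch (not any of the four hologram codes themselves) of how a quantization error introduced in the Fourier domain propagates to the image domain; the object, size and number of phase levels are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 32x32 object with a random phase code, standing in for
# the randomly phase-coded objects discussed in the abstract.
obj = rng.random((32, 32)) * np.exp(2j * np.pi * rng.random((32, 32)))

F = np.fft.fft2(obj)

# Quantize the Fourier phase to L levels (amplitude kept exact),
# mimicking the phase quantization step of a detour-phase hologram.
L = 8
step = 2 * np.pi / L
q_phase = np.round(np.angle(F) / step) * step
F_q = np.abs(F) * np.exp(1j * q_phase)

# Relative errors in the Fourier domain and in the reconstructed image.
fourier_err = np.linalg.norm(F_q - F) / np.linalg.norm(F)
img_err = np.linalg.norm(np.fft.ifft2(F_q) - obj) / np.linalg.norm(obj)
```

For the full complex field the two relative errors coincide by Parseval's theorem; the abstract's point concerns errors in amplitude or phase *alone*, whose effect on image fidelity is not this simple.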
Abstract:
Using an analysis-by-synthesis (AbS) approach, we develop a soft-decision-based switched vector quantization (VQ) method for high-quality, low-complexity coding of wideband speech line spectral frequency (LSF) parameters. For each switching region, a low-complexity transform domain split VQ (TrSVQ) is designed. The overall rate-distortion (R/D) performance optimality of the new switched quantizer is addressed in a Gaussian mixture model (GMM) based parametric framework. In the AbS approach, the reduction of quantization complexity is achieved through the use of nearest neighbor (NN) TrSVQs and by splitting the transform domain vector into a larger number of subvectors. Compared to current LSF quantization methods, the new method is shown to provide a competitive or better trade-off between R/D performance and complexity.
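The split VQ idea, nearest-neighbour search in per-subvector codebooks, can be sketched as follows (the vector dimension, split sizes and codebooks are invented for illustration and are not the paper's trained quantizers):

```python
import numpy as np

rng = np.random.default_rng(1)

# A 10-dimensional "LSF-like" vector is split into subvectors, each
# quantized independently by nearest-neighbour search in its own codebook.
splits = [3, 3, 4]                                   # subvector dimensions
codebooks = [rng.standard_normal((16, d)) for d in splits]

def split_vq(x, splits, codebooks):
    """Quantize x piecewise; return codeword indices and reconstruction."""
    idx, parts, start = [], [], 0
    for d, cb in zip(splits, codebooks):
        sub = x[start:start + d]
        # Nearest neighbour in this subvector's codebook.
        i = int(np.argmin(np.sum((cb - sub) ** 2, axis=1)))
        idx.append(i)
        parts.append(cb[i])
        start += d
    return idx, np.concatenate(parts)

x = rng.standard_normal(10)
indices, x_hat = split_vq(x, splits, codebooks)
```

Search cost is the sum of the codebook sizes rather than their product, which is why splitting the transform-domain vector into more subvectors reduces complexity, as exploited in the abstract.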
Abstract:
The efforts to combine quantum theory with general relativity have been great and marked by several successes. One field where progress has lately been made is the study of noncommutative quantum field theories that arise as a low-energy limit in certain string theories. The idea of noncommutativity comes naturally when combining these two extremes and has profound implications for results widely accepted in traditional, commutative, theories. In this work I review the status of one of the most important connections in physics, the spin-statistics relation. The relation is deeply ingrained in our reality in that it gives us the structure of the periodic table and is of crucial importance for the stability of all matter. The dramatic effects of the noncommutativity of space-time coordinates, mainly the loss of Lorentz invariance, call the spin-statistics relation into question. The spin-statistics theorem is first presented in its traditional setting, with a clarifying proof starting from minimal requirements. Next the notion of noncommutativity is introduced and its implications are studied. The discussion is essentially based on twisted Poincaré symmetry, the space-time symmetry of noncommutative quantum field theory. The controversial issue of microcausality in noncommutative quantum field theory is settled by showing for the first time that the light wedge microcausality condition is compatible with the twisted Poincaré symmetry. The spin-statistics relation is considered both from the point of view of braided statistics and in the traditional Lagrangian formulation of Pauli, with the conclusion that Pauli's age-old theorem withstands even this test, so dramatic for the whole structure of space-time.
Abstract:
This thesis presents novel modelling applications for environmental geospatial data using remote sensing, GIS and statistical modelling techniques. The studies can be classified into four main themes: (i) developing advanced geospatial databases. Paper (I) demonstrates the creation of a geospatial database for the Glanville fritillary butterfly (Melitaea cinxia) in the Åland Islands, south-western Finland; (ii) analysing species diversity and distribution using GIS techniques. Paper (II) presents a diversity and geographical distribution analysis for Scopulini moths at a world-wide scale; (iii) studying spatiotemporal forest cover change. Paper (III) presents a study of exotic and indigenous tree cover change detection in the Taita Hills, Kenya, using airborne imagery and GIS analysis techniques; (iv) exploring predictive modelling techniques using geospatial data. In Paper (IV) human population occurrence and abundance in the Taita Hills highlands were predicted using the generalized additive modelling (GAM) technique. Paper (V) presents techniques to enhance fire prediction and burned area estimation at a regional scale in East Caprivi, Namibia. Paper (VI) compares eight state-of-the-art predictive modelling methods to improve fire prediction, burned area estimation and fire risk mapping in East Caprivi, Namibia. The results in Paper (I) showed that geospatial data can be managed effectively using advanced relational database management systems. Metapopulation data for the Melitaea cinxia butterfly were successfully combined with GPS-delimited habitat patch information and climatic data. Using the geospatial database, spatial analyses were successfully conducted at the habitat patch level or at coarser analysis scales. Moreover, this study showed that, at a large scale, spatially correlated weather conditions are one of the primary causes of spatially correlated changes in Melitaea cinxia population sizes.
In Paper (II) the spatiotemporal characteristics of Scopulini moth description, diversity and distribution were analysed at a world-wide scale, and for the first time GIS techniques were used for Scopulini moth geographical distribution analysis. This study revealed that Scopulini moths have a cosmopolitan distribution. The majority of the species have been described from the low latitudes, sub-Saharan Africa being the hot spot of species diversity. However, the taxonomical effort has been uneven among biogeographical regions. Paper (III) showed that forest cover change can be analysed in great detail using modern airborne imagery techniques and historical aerial photographs. However, when spatiotemporal forest cover change is studied, care has to be taken in co-registration and image interpretation when historical black-and-white aerial photography is used. In Paper (IV) human population distribution and abundance could be modelled with fairly good results using geospatial predictors and non-Gaussian predictive modelling techniques. Moreover, a land cover layer is not necessarily needed as a predictor, because first- and second-order image texture measurements derived from satellite imagery had more power to explain the variation in dwelling unit occurrence and abundance. Paper (V) showed that the generalized linear model (GLM) is a suitable technique for fire occurrence prediction and burned area estimation. GLM-based burned area estimates were found to be superior to the existing MODIS burned area product (MCD45A1). However, the spatial autocorrelation of fires has to be taken into account when using the GLM technique for fire occurrence prediction. Paper (VI) showed that novel statistical predictive modelling techniques can be used to improve fire prediction, burned area estimation and fire risk mapping at a regional scale. However, some noticeable variation existed between the different predictive modelling techniques for fire occurrence prediction and burned area estimation.
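The kind of GLM used for binary fire occurrence in Papers (V)-(VI) can be sketched as a logistic regression fitted by iteratively reweighted least squares; the data below are synthetic stand-ins, not the East Caprivi data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic binary "fire occurrence" data from two made-up geospatial
# predictors plus an intercept column.
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
true_beta = np.array([-1.0, 2.0, -1.5])
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)

# Fit the logistic GLM by Newton's method (IRLS).
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))         # fitted probabilities
    W = mu * (1 - mu)                        # IRLS weights
    grad = X.T @ (y - mu)                    # score vector
    H = X.T @ (X * W[:, None])               # Fisher information
    beta = beta + np.linalg.solve(H, grad)
```

With enough observations the estimated coefficients recover the signs and approximate magnitudes of the generating parameters.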
Abstract:
Spreadsheet of non-target species (bycatch) numbers in the Shark Control Program by species, date of capture, location, size and sex from 2001 onwards. The Shark Control Program (SCP) relies on nets or drumlines, or a combination of both, to minimise the threat of shark attack on humans in particular locations. Following is information on the numbers and locations of sharks that have been caught by the SCP. It is important to reduce the inadvertent impacts of the SCP on other marine animals (bycatch) without compromising human safety. Bycatch levels are carefully monitored and research is focused on minimising impacts on non-target species. This dataset contains details of non-target species numbers in the Shark Control Program by species, date of capture, and location from 2001 onwards.
Abstract:
Access to quality higher education is challenging for many Western Australians who live outside the metropolitan area. In 2010, the School of Education moved to flexible delivery of a fully online Bachelor of Education degree for its non-metropolitan students. The new model of delivery allows access for students from any location, provided they have a computer and an internet connection. A number of academic staff had previously used an asynchronous environment to deliver learning modules housed within a learning management system (LMS) but had not used synchronous software with their students. To enhance the learning environment and to provide high-quality learning experiences to students learning at a distance, synchronous software (Elluminate Live) was introduced. This software is a real-time virtual classroom environment that allows for communication through Voice over Internet Protocol (VoIP) and videoconferencing, along with a large number of collaboration tools to engage learners. This research paper reports on the integration of a live e-learning solution into the current LMS environment. Qualitative data were collected from academic staff through informal interviews and participant observation. The findings discuss (i) the perceived level of support; (ii) identification of strategies used to create an effective online teacher presence; (iii) the perceived impact on students' learning outcomes; and (iv) guidelines for professional development to enhance pedagogy within the live e-learning environment.
Abstract:
The Fourier transforms of the collagen molecular structure have been calculated taking into consideration various side chain atoms, as well as the presence of bound water molecules. There is no significant change in the calculated intensity distribution on including the side chain atoms of non-imino-acid residues. Taking into account the presence of about two bound water molecules per tripeptide unit, the agreement with the observed x-ray pattern is slightly improved. Fourier transforms have also been calculated for the detailed molecular geometries proposed from other laboratories. It is found that there are no major differences between them, as compared to our structure, either in the positions of peak intensity or in the intensity distribution. Hence it is not possible to judge the relative merits of the various molecular geometries for the collagen triple helix from a comparison of the calculated transforms with the meagre data available from its x-ray fibre pattern. It is also concluded that the collagen molecular structure should be regarded as a somewhat flexible chain structure, capable of adapting itself to the requirements of the different side groups which occur in each local region.
Abstract:
We investigate the use of a two-stage transform vector quantizer (TSTVQ) for coding of line spectral frequency (LSF) parameters in wideband speech coding. The first-stage quantizer of TSTVQ provides better matching of the source distribution, and the second-stage quantizer provides additional coding gain through an individual, cluster-specific decorrelating transform and variance normalization. Further coding gain is shown to be achieved by exploiting the slowly time-varying nature of speech spectra, using an inter-frame cluster continuity (ICC) property in the first stage of the TSTVQ method. The proposed method saves 3-4 bits and reduces the computational complexity by 58-66% compared to the traditional split vector quantizer (SVQ), but at the expense of 1.5-2.5 times the memory.
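The second-stage operation, a cluster-specific decorrelating transform with variance normalization, can be sketched as follows (synthetic data; the decorrelating transform here is the KLT, i.e. the eigenbasis of the cluster's sample covariance):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic correlated "residual" vectors standing in for one cluster's
# first-stage quantization residuals.
mix = np.array([[1.0, 0.8, 0.0, 0.0],
                [0.0, 1.0, 0.5, 0.0],
                [0.0, 0.0, 1.0, 0.3],
                [0.0, 0.0, 0.0, 1.0]])
residuals = rng.standard_normal((1000, 4)) @ mix

cov = np.cov(residuals, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # KLT basis for this cluster

z = residuals @ eigvecs                   # decorrelate
z = z / np.sqrt(eigvals)                  # variance-normalize
```

After the transform the components are empirically uncorrelated with unit variance, so the same simple quantizer can serve every component, which is the source of the second-stage coding gain.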
Application of Laplace transform technique to the solution of certain third-order non-linear systems
Abstract:
A number of papers have appeared on the application of operational methods, and in particular the Laplace transform, to problems concerning non-linear systems of one kind or another. This, however, has met with only partial success in solving a class of non-linear problems, as each approach has some limitations and drawbacks. In this study the approach of Baycura has been extended to certain third-order non-linear systems subjected to non-periodic excitations, as this approximate method combines the advantages of engineering accuracy with ease of application to such problems. Under non-periodic excitations the method provides a procedure for quickly estimating the maximum response amplitude, which is important from the point of view of a designer. Limitations of the procedure are brought out, and the method is illustrated by an example taken from a physical situation.
Abstract:
This paper presents a statistical aircraft trajectory clustering approach aimed at discriminating between typical manned and expected unmanned traffic patterns. First, a resampled version of each trajectory is modelled using a mixture of Von Mises distributions (circular statistics). Second, the remodelled trajectories are globally aligned using tools from bioinformatics. Third, the alignment scores are used to cluster the trajectories using an iterative k-medoids approach and an appropriate distance function. The approach is then evaluated using synthetically generated unmanned aircraft flights combined with real air traffic position reports taken over a sector of Northern Queensland, Australia. Results suggest that the technique is useful in distinguishing between expected unmanned and manned aircraft traffic behaviour, as well as identifying some common conventional air traffic patterns.
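The clustering step can be sketched with a minimal k-medoids over a precomputed distance matrix; here Euclidean distances between synthetic 2-D points stand in for the alignment-score distances, and all sizes and seeds are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two well-separated synthetic "trajectory" groups, then a full
# pairwise distance matrix (the paper would use alignment scores).
pts = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)

def k_medoids(D, k, iters=50, seed=0):
    """Iterative k-medoids on a symmetric distance matrix D."""
    init = np.random.default_rng(seed)
    medoids = init.choice(len(D), k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)   # assign to nearest medoid
        new = []
        for c in range(k):
            members = np.where(labels == c)[0]
            within = D[np.ix_(members, members)].sum(axis=1)
            new.append(members[np.argmin(within)])  # point minimizing cluster cost
        new = np.array(new)
        if np.array_equal(new, medoids):
            break
        medoids = new
    labels = np.argmin(D[:, medoids], axis=1)
    return medoids, labels

medoids, labels = k_medoids(D, 2)
```

Because the algorithm only touches the distance matrix, any symmetric dissimilarity (such as the global alignment scores from bioinformatics tools) can be plugged in unchanged.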
Abstract:
We present a signal processing approach using discrete wavelet transform (DWT) for the generation of complex synthetic aperture radar (SAR) images at an arbitrary number of dyadic scales of resolution. The method is computationally efficient and is free from significant system-imposed limitations present in traditional subaperture-based multiresolution image formation. Problems due to aliasing associated with biorthogonal decomposition of the complex signals are addressed. The lifting scheme of DWT is adapted to handle complex signal approximations and employed to further enhance the computational efficiency. Multiresolution SAR images formed by the proposed method are presented.
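A minimal sketch of dyadic multiresolution via a one-dimensional orthonormal Haar DWT; this is a stand-in for the paper's method, which operates separably on complex-valued 2-D SAR data with a lifting-scheme implementation:

```python
import numpy as np

def haar_analysis(x):
    """One level of the orthonormal Haar transform: approximation + detail."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

x = np.arange(16, dtype=float)

# Recurse on the approximation to obtain three dyadic scales of resolution.
scales = []
a = x
for _ in range(3):
    a, d = haar_analysis(a)
    scales.append(a)
```

Each level halves the sample count (here 16 to 8, 4, 2), giving the dyadic resolution pyramid; the orthonormal normalization preserves signal energy between a level's approximation and detail coefficients.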
Abstract:
We derive expressions for the convolution multiplication properties of the discrete cosine transform II (DCT II), starting from equivalent discrete Fourier transform (DFT) representations. Using these expressions, a method for implementing linear filtering through block convolution in the DCT II domain is presented. For the case of a nonsymmetric impulse response, an additional discrete sine transform II (DST II) is required for implementing the filter in the DCT II domain, whereas for a symmetric impulse response the additional transform is not required. Comparison with a recently proposed circular convolution technique in the DCT II domain shows that the proposed new method is computationally more efficient.
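The DFT identity such derivations start from is the circular convolution theorem, sketched below in numpy (this illustrates the underlying DFT property only, not the paper's DCT-domain method):

```python
import numpy as np

rng = np.random.default_rng(5)

x = rng.standard_normal(8)
h = rng.standard_normal(8)
n = len(x)

# Direct circular convolution: y[k] = sum_m x[m] h[(k - m) mod n].
y_direct = np.array([sum(x[m] * h[(k - m) % n] for m in range(n))
                     for k in range(n)])

# Same result via pointwise multiplication of DFTs.
y_dft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real
```

Convolution multiplication properties for the DCT family are obtained by expressing each DCT as a DFT of a symmetrically extended sequence and applying this theorem.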
Abstract:
The images of Hermite and Laguerre-Sobolev spaces under the Hermite and special Hermite semigroups (respectively) are characterized. These are used to characterize the image of the Schwartz class of rapidly decreasing functions f on R^n and C^n under these semigroups. The image of the space of tempered distributions is also considered, and a Paley-Wiener theorem for the windowed (short-time) Fourier transform is proved.
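For reference, the windowed (short-time) Fourier transform appearing in the Paley-Wiener theorem is, for a window g, commonly defined as follows (normalization conventions vary):

```latex
V_g f(x,\xi) = \int_{\mathbb{R}^n} f(t)\,\overline{g(t-x)}\,e^{-i t\cdot\xi}\,dt,
\qquad x,\xi \in \mathbb{R}^n .
```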
Abstract:
Technical or contaminated ethanol products are sometimes ingested, either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g., windshield washer fluids. In addition to less toxic solvents, these liquids may contain deadly methanol. Symptoms of even a lethal solvent poisoning are often non-specific at the early stage. The present series of studies was carried out to develop a method for solvent intoxication breath diagnostics, to speed up the diagnosis procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method should be sufficiently sensitive and accurate to determine the presence of even small amounts of methanol in a mixture of ethanol and other less toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared (FT-IR) analyzer was modified for breath testing. The sample cell fittings were widened and the cell size was reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Actual human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared to blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated by means of artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication.
In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath; the Dräger 7110 evidential breath ethanol analyzer was not equipped to recognize the interfering component. According to the studies, the Gasmet FT-IR analyzer was adequately sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, the fast breath solvent analysis proved feasible for monitoring ethanol and methanol concentrations during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.