982 results for virtual topology, decomposition, hex meshing algorithms
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands.

To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
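To make the coefficient-modeling step concrete, the sketch below fits a generalized Gaussian model to the subbands of a one-level 2-D wavelet decomposition. This is an illustrative approximation, not the thesis's method: SciPy's maximum-likelihood fit stands in for the least-squares shape-parameter estimator described above, a plain pyramid decomposition stands in for the fixed wavelet-packet structure, and random data stands in for a real fingerprint image.

```python
# A hedged sketch: model each wavelet subband's coefficients with a
# generalized Gaussian (scipy's "gennorm"). All inputs are stand-ins.
import numpy as np
import pywt
from scipy import stats

rng = np.random.default_rng(0)
image = rng.standard_normal((256, 256))   # stand-in for a fingerprint image

# One-level 2-D DWT; the thesis instead uses a fixed wavelet-packet tree.
_, (cH, cV, cD) = pywt.dwt2(image, "bior4.4")

for name, band in [("horizontal", cH), ("vertical", cV), ("diagonal", cD)]:
    coeffs = band.ravel()
    # beta is the generalized Gaussian shape parameter; loc is fixed at 0
    # since wavelet detail coefficients are approximately zero-mean.
    beta, loc, scale = stats.gennorm.fit(coeffs, floc=0.0)
    print(f"{name}: shape={beta:.2f}, scale={scale:.3f}")
```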
Abstract:
This thesis deals with the problem of instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of signal, two major approaches are considered. For IF estimation of single-tone or digitally modulated sinusoidal signals (such as frequency shift keying signals), the approach of digital phase-locked loops (DPLLs) is considered; this is Part-I of the thesis. For FM signals the approach of time-frequency analysis is considered; this is Part-II of the thesis. In Part-I we have utilized sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) introduced significant advantages over other existing DPLLs. In the last 10 years many efforts have been made to improve DTL performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The Hilbert transformer can be realized approximately using a finite impulse response (FIR) digital filter. This realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We have tried to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time delay is utilized in the tanlock scheme of the DTL to produce a signal-dependent phase shift, giving rise to the time-delay digital tanlock loop (TDTL). Fixed point theorems are used to analyze the behavior of the new loop. As such, TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed point analysis, and the linear tanlock approach based on arctan phase detection. TDTL preserves the main advantages of the DTL despite its reduced structure. An application of TDTL to FSK demodulation is also considered. The idea of replacing the HT by a time delay may be of interest in other signal processing systems. Hence we have analyzed and compared the behaviors of the HT and the time delay in the presence of additive Gaussian noise. Based on this analysis, the behavior of first- and second-order TDTLs has been analyzed in additive Gaussian noise. Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Non-stationary signals are important in synthetic and real-life applications; an example is the frequency-modulated (FM) signals widely used in communication systems. Part-II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques must be utilized. For the purpose of IF estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We chose the non-parametric approach, which is based on time-frequency analysis. This approach is computationally less expensive and more effective in dealing with multicomponent signals, which are the main aim of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain. Multicomponent signals can be identified by multiple energy peaks in the time-frequency domain.
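As a small illustration of the Part-I idea above, the sketch below contrasts the two quadrature-generation routes: a Hilbert transformer, which gives a signal-independent 90-degree shift, and a simple time delay, whose phase shift depends on the input frequency. All values are illustrative assumptions, and the loop dynamics and non-uniform sampling of the actual TDTL are omitted.

```python
# A hedged sketch: Hilbert-transform quadrature (conventional DTL) versus
# a time-delay phase shift (TDTL idea). Parameters are illustrative.
import numpy as np
from scipy.signal import hilbert

fs = 8000.0                        # sampling rate (Hz)
f = 950.0                          # input tone frequency (Hz)
n = np.arange(2048)
x = np.sin(2 * np.pi * f * n / fs)

# Route 1: analytic signal -> exact 90-degree quadrature, independent of f.
q_ht = np.imag(hilbert(x))

# Route 2: delay by d samples -> phase shift 2*pi*f*d/fs, which depends on
# the (generally unknown) input frequency; the TDTL's fixed-point analysis
# accounts for this signal dependence.
d = 2
x_delayed = np.roll(x, d)

# arctan-based phase detection from each sample pair (edges discarded,
# since np.roll wraps the first d samples around).
phase_ht = np.arctan2(q_ht[d:-d], x[d:-d])
phase_td = np.arctan2(x_delayed[d:-d], x[d:-d])
print("delay-induced phase shift (rad):", 2 * np.pi * f * d / fs)
```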
Many real-life and synthetic signals are of a multicomponent nature, and there is little in the literature concerning IF estimation of such signals. This is why we have concentrated on multicomponent signals in Part-II. An adaptive algorithm for IF estimation using quadratic time-frequency distributions has been analyzed. A class of time-frequency distributions that are more suitable for this purpose has been proposed. The kernels of this class are time-only, or one-dimensional, rather than the time-lag (two-dimensional) kernels; hence this class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the adaptive IF algorithm and proved to be efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
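As a minimal illustration of the non-parametric route, the sketch below estimates the IF of a single linear-FM test signal by tracking the peak of a time-frequency representation over time. A plain spectrogram stands in for the adaptive quadratic T-class TFDs proposed in the thesis, and all parameters are illustrative assumptions.

```python
# Peak-tracking ("ridge") IF estimation on a TFD; a spectrogram is used
# here as a simple stand-in for a quadratic TFD. Illustrative parameters.
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
x = chirp(t, f0=50, f1=250, t1=2.0, method="linear")  # known linear IF

f_axis, t_axis, S = spectrogram(x, fs=fs, nperseg=128, noverlap=120)
if_est = f_axis[np.argmax(S, axis=0)]      # frequency of the peak per slice

if_true = 50 + (250 - 50) * t_axis / 2.0   # the chirp's true linear IF law
print("mean abs IF error (Hz):", np.mean(np.abs(if_est - if_true)))
```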
Abstract:
Schizophrenia is a mental disorder affecting 1-2% of the population, and it is estimated that 12-16% of hospital beds in Australia are occupied by patients with psychosis. The suicide rate for patients with this diagnosis is higher than that of the general population. Any technique which enhances training and treatment of this disorder will have a significant societal and economic impact. A significant research project using Virtual Reality (VR), in which both visual and auditory hallucinations are simulated, is currently being undertaken at the University of Queensland. The virtual environments created by the new software are expected to enhance the experiential learning outcomes of medical students by enabling them to experience the inner world of a patient with psychosis. In addition, the Virtual Environment has the potential to provide a technologically advanced therapeutic setting where behavioral exposure therapies can be conducted with exactly controlled exposure stimuli and an expected reduction in the risk of harm. This paper reports on the current work of the project, previous stages of software development, and future educational and clinical applications of the Virtual Environments.
Abstract:
This project proposes a new conceptual framework for the regulation of social networks and virtual communities. By applying a model based upon the rule of law, this thesis addresses the growing tensions that revolve around the public use of private networks. This research examines the shortcomings of traditional contractual governance models and cyberlaw theory and provides a reconstituted approach that will allow public constitutional-type interests to be recognised in the interpretation and enforcement of contractual doctrine.
Abstract:
Life-cycle management (LCM) has been employed in the management of construction projects for many years in order to reduce whole-life cost, time, and risk, and to improve the service to owners. However, owing to the lack of an effective information-sharing platform, LCM is not currently used effectively in the construction industry. Based upon an analysis of the information flow of LCM, a virtual prototyping (VP)-based communication and collaboration information platform is proposed. Following this, the platform is customized using DASSAULT software. The whole process of implementing the VP-based LCM is also discussed and, from a simple case study, it is demonstrated that the VP-based communication and collaboration information platform is an effective tool to support the LCM of construction projects.
Abstract:
Soil C decomposition is sensitive to changes in temperature, and even small increases in temperature may prompt large releases of C from soils. But much of what we know about soil C responses to global change is based on short-term incubation data and model output that implicitly assumes soil C pools are composed of organic matter fractions with uniform temperature sensitivities. In contrast, kinetic theory based on chemical reactions suggests that older, more-resistant C fractions may be more temperature sensitive. Recent research on the subject is inconclusive, indicating that the temperature sensitivity of labile soil organic matter (OM) decomposition could either be greater than, less than, or equivalent to that of resistant soil OM. We incubated soils at constant temperature to deplete them of labile soil OM and then successively assessed the CO2-C efflux in response to warming. We found that the decomposition response to experimental warming early during soil incubation (when more labile C remained) was less than that later when labile C was depleted. These results suggest that the temperature sensitivity of resistant soil OM pools is greater than that for labile soil OM and that global change-driven soil C losses may be greater than previously estimated.
Abstract:
The relationship between organic matter (OM) lability and temperature sensitivity is disputed, with recent observations suggesting that responses of relatively more resistant OM to increased temperature could be greater than, equivalent to, or less than responses of relatively more labile OM. This lack of clear understanding limits the ability to forecast carbon (C) cycle responses to temperature changes. Here, we derive a novel approach (denoted Q10-q) that accounts for changes in OM quality during decomposition and use it to analyze data from three independent sources. Results from new laboratory soil incubations (labile Q10-q = 2.1 ± 0.2; more resistant Q10-q = 3.8 ± 0.3) and reanalysis of data from other soil incubations reported in the literature (labile Q10-q = 2.3; more resistant Q10-q = 3.3) demonstrate that the temperature sensitivity of soil OM decomposition increases with decreasing soil OM lability. Analysis of data from a cross-site field litter-bag decomposition study (labile Q10-q = 3.3 ± 0.2; resistant Q10-q = 4.9 ± 0.2) shows that litter OM follows the same pattern, with greater temperature sensitivity for more resistant litter OM. Furthermore, the initial response of cultivated soils, presumably containing less labile soil OM (Q10-q = 2.4 ± 0.3), was greater than that for undisturbed grassland soils (Q10-q = 1.7 ± 0.1). Soil C losses estimated using this approach will differ from previous estimates as a function of the magnitude of the temperature increase and the proportion of whole-soil OM comprised of compounds sensitive to temperature over that temperature range. It is likely that increased temperature has already prompted the release of significant amounts of C to the atmosphere as CO2. Our results indicate that future losses of litter and soil C may be even greater than previously supposed.
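For readers unfamiliar with the notation, the sketch below shows the conventional Q10 calculation that the Q10-q approach refines by tracking changes in OM quality during decomposition. The rates and temperatures are illustrative values, not data from the study.

```python
# Conventional Q10 from decomposition rates at two temperatures:
# Q10 = (R2 / R1) ** (10 / (T2 - T1)). Illustrative numbers only.
R1, T1 = 1.0, 15.0   # CO2-C efflux (arbitrary units) at 15 degrees C
R2, T2 = 2.3, 25.0   # CO2-C efflux at 25 degrees C

Q10 = (R2 / R1) ** (10.0 / (T2 - T1))
print(f"Q10 = {Q10:.2f}")   # 2.30: decomposition ~2.3x faster per +10 C
```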
Abstract:
There is a severe tendency in cyberlaw theory to delegitimize state intervention in the governance of virtual communities. Much of the existing theory makes one of two fundamentally flawed assumptions: that communities will always be best governed without the intervention of the state; or that the territorial state can best encourage the development of communities by creating enforceable property rights and allowing the market to resolve any disputes. These assumptions do not ascribe sufficient weight to the value-laden support that the territorial state always provides to private governance regimes, the inefficiencies that will tend to limit the development of utopian communities, and the continued role of the territorial state in limiting autonomy in accordance with communal values. In order to overcome these deterministic assumptions, this article provides a framework based upon the values of the rule of law through which to conceptualise the legitimacy of the private exercise of power in virtual communities. The rule of law provides a constitutional discourse that assists in considering appropriate limits on the exercise of private power. I argue that the private contractual framework that is used to govern relations in virtual communities ought to be informed by the values of the rule of law in order to more appropriately address the governance tensions that permeate these spaces. These values suggest three main limits on the exercise of private power: that governance is limited by community rules and that the scope of autonomy is limited by the substantive values of the territorial state; that private contractual rules should be general, equal, and certain; and that, most importantly, internal norms be predicated upon the consent of participants.
Abstract:
Improving efficiency and flexibility in pulsed power supply technologies is the most substantial concern of pulsed power systems, specifically with regard to plasma generation. Recently, the improvement of pulsed power supplies has become of greater concern due to the extension of pulsed power applications to environmental and industrial areas. In this respect, a current-source-based topology is proposed in this paper as a pulsed power supply that allows power flow control while the load is being supplied. The main contribution of this configuration is the utilization of low-to-medium-voltage semiconductor switches for high-voltage generation. A number of switch-diode-capacitor units are arranged at the output of the topology to convert the current-source energy into voltage form and generate a pulsed power output with sufficient voltage magnitude and stress. Simulations carried out in the Matlab/SIMULINK platform, as well as experimental tests on a prototype setup, have verified the capability of this topology in performing the desired duties. Efficiency and flexibility are its main advantages.
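As a rough illustration of how low-to-medium-voltage switches can yield a high-voltage pulse in a topology of this kind, the sketch below stacks the voltages of several capacitor units charged from a constant current source. The unit count and component values are illustrative assumptions, not parameters from the paper.

```python
# Back-of-the-envelope sketch: n capacitor units charged in parallel from
# a current source, then discharged in series to stack their voltages.
# All numbers are illustrative assumptions.
n_units = 8          # number of switch-diode-capacitor units
C = 1e-6             # capacitance per unit (F)
I_src = 10.0         # current-source charging current (A)
t_charge = 100e-6    # charging interval per unit (s)

V_unit = I_src * t_charge / C    # V = I*t/C for constant-current charging
V_pulse = n_units * V_unit       # series discharge stacks the unit voltages
print(f"per unit: {V_unit:.0f} V; stacked pulse: {V_pulse:.0f} V")
```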