1000 results for Nonstationary method


Relevance:

20.00%

Publisher:

Abstract:

This research project examines the application of the Suzuki Actor Training Method (the Suzuki Method) within the work of Tadashi Suzuki's company in Japan, the Shizuoka Performing Arts Complex (SPAC), within the work of Brisbane theatre company Frank:Austral Asian Performance Ensemble (Frank:AAPE), and as related to the development of the theatre performance Surfacing. These three theatrical contexts have been studied from the viewpoint of a "participant-observer". The researcher has trained in the Suzuki Method with Frank:AAPE and SPAC, performed with Frank:AAPE, and was the solo performer and collaborative developer in the performance Surfacing (directed by Leah Mercer). Observations of these three groups are based on a phenomenological definition of the "integrated actor": an actor who is able to achieve a totality or unity between the body and the mind, and between the body and the voice, through a powerful sense of intention. The term "integrated actor" has been informed by the philosophy of Merleau-Ponty and his concept of the "lived body". Three main hypotheses are presented in this study: that the Suzuki Method focuses on actors learning through their body; that the Suzuki Method presents a holistic approach to the body and the voice; and that the Suzuki Method develops actors with a strong sense of intention. These three aspects of the Suzuki Method are explored in relation to the stylistic features of the work of SPAC, Frank:AAPE and the performance Surfacing.

Relevance:

20.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
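The abstract models wavelet coefficients with a generalized Gaussian distribution (GGD) and estimates its shape parameter. As a minimal sketch, the snippet below uses a moment-matching estimator (a common, simpler alternative to the least-squares formulation described above); the function names and the Laplacian test data are illustrative, not taken from the thesis.

```python
import math
import random

def ggd_ratio(beta):
    """Theoretical ratio E|x| / sqrt(E[x^2]) for a generalized Gaussian
    with shape parameter beta: Gamma(2/b) / sqrt(Gamma(1/b) * Gamma(3/b))."""
    return math.gamma(2.0 / beta) / math.sqrt(
        math.gamma(1.0 / beta) * math.gamma(3.0 / beta))

def estimate_shape(samples, lo=0.1, hi=5.0, iters=60):
    """Estimate the GGD shape parameter by matching the sample moment
    ratio to the theoretical one (bisection; ggd_ratio increases with beta)."""
    n = len(samples)
    m1 = sum(abs(x) for x in samples) / n
    m2 = sum(x * x for x in samples) / n
    target = m1 / math.sqrt(m2)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check with Laplacian data, which is a GGD with beta = 1.
random.seed(0)
data = [random.choice((-1, 1)) * random.expovariate(1.0) for _ in range(20000)]
beta_hat = estimate_shape(data)
```

For sharply peaked subband histograms the estimate falls below 1, which is exactly the regime where the wedge-region problem mentioned above becomes relevant.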

Relevance:

20.00%

Publisher:

Abstract:

The modal strain energy method, which depends on the vibration characteristics of the structure, has been reasonably successful in identifying and localising damage in a structure. However, existing strain energy methods require the first few modes to be measured to provide meaningful damage detection. Using individual modes with existing strain energy methods may produce false alarms or may fail to detect damage at or near the nodal points. This paper proposes a new modal strain energy based damage index which can detect and localise damage using any one of the measured modes, and illustrates its application to beam structures. It becomes evident that the proposed strain energy based damage index also has potential for damage quantification.
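The abstract does not give the new index itself, but a minimal sketch of the classical single-mode strain-energy (Stubbs-type) index it builds on may clarify the idea: damage is located where the ratio of element curvature-energy fractions, damaged to undamaged, is largest. The beam, element count, and the simulated "damage" (a local curvature bump) below are all illustrative assumptions.

```python
import math

def element_curvature_energy(phi, h, n_elems):
    """Approximate the integral of (phi'')^2 over each element of a beam,
    with the mode shape phi sampled at uniform spacing h."""
    n = len(phi)
    curv = [0.0] * n
    for i in range(1, n - 1):  # central-difference curvature
        curv[i] = (phi[i - 1] - 2.0 * phi[i] + phi[i + 1]) / h ** 2
    per_elem = (n - 1) // n_elems
    return [sum(curv[i] ** 2
                for i in range(e * per_elem, (e + 1) * per_elem + 1)) * h
            for e in range(n_elems)]

def damage_index(phi_u, phi_d, h, n_elems):
    """Stubbs-type index per element; the largest values flag the
    likeliest damage locations."""
    eu = element_curvature_energy(phi_u, h, n_elems)
    ed = element_curvature_energy(phi_d, h, n_elems)
    tu, td = sum(eu), sum(ed)
    return [((ed[e] + td) / td) / ((eu[e] + tu) / tu) for e in range(n_elems)]

# Synthetic example: first mode of a simply supported beam on [0, 1],
# with damage simulated as a local curvature increase near x = 0.35.
n, h = 101, 0.01
xs = [i * h for i in range(n)]
phi_u = [math.sin(math.pi * x) for x in xs]
phi_d = [p + 0.003 * math.exp(-((x - 0.35) / 0.05) ** 2)
         for p, x in zip(phi_u, xs)]
beta = damage_index(phi_u, phi_d, h, n_elems=10)
worst = beta.index(max(beta))  # element covering [0.3, 0.4]
```

Note that this single-mode form inherits the nodal-point weakness discussed above; the paper's contribution is an index that avoids it.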

Relevance:

20.00%

Publisher:

Abstract:

The literature was reviewed and analyzed to determine the feasibility of using a combination of acid hydrolysis and CO2-C release during long-term incubation to determine soil organic carbon (SOC) pool sizes and mean residence times (MRTs). Analysis of 1100 data points showed that the SOC remaining after hydrolysis with 6 M HCl ranged from 30 to 80% of total SOC, depending on soil type, depth, texture, and management. Nonhydrolyzable carbon (NHC) in conventional-till soils represented 48% of SOC; no-till averaged 56%, forest 55%, and grassland 56%. Carbon dating showed an average MRT 1200 yr greater for the NHC fraction than for total SOC. Long-term incubation, involving measurement of CO2 evolution and curve fitting, measured active and slow pools. Active-pool C comprised 2 to 8% of the SOC, with MRTs of days to months; the slow pool comprised 45 to 65% of the SOC and had MRTs of 10 to 80 yr. Comparison of field 14C and 13C data with hydrolysis-incubation data showed a high correlation between the independent techniques across soil types and experiments. There were large differences in MRTs depending on the length of the experiment. Insertion of hydrolysis-incubation derived estimates of the active (C-a), slow (C-s), and resistant (C-r) pools into the DAYCENT model provided estimates of daily field CO2 evolution rates. These were well correlated with field CO2 measurements. Although not without some interpretation problems, acid hydrolysis combined with laboratory incubation is useful for determining SOC pools and fluxes, especially when used in combination with associated measurements.
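The pool-based view above assigns each SOC fraction first-order turnover at rate 1/MRT. A minimal sketch of the resulting CO2 evolution under that assumption follows; the pool sizes and MRTs are illustrative values chosen within the ranges reported above, not data from the study.

```python
import math

def co2_evolved(pools, t):
    """Cumulative CO2-C respired by time t (same units as pool sizes),
    with each pool decaying first-order at rate k = 1 / MRT.
    pools: list of (initial_carbon, mean_residence_time_yr) tuples."""
    return sum(c0 * (1.0 - math.exp(-t / mrt)) for c0, mrt in pools)

def co2_flux(pools, t):
    """Instantaneous respiration rate at time t."""
    return sum((c0 / mrt) * math.exp(-t / mrt) for c0, mrt in pools)

# Illustrative pools (g C per kg soil): 4% active with MRT 0.2 yr,
# 55% slow with MRT 30 yr, remainder resistant with MRT 1500 yr.
soc = 20.0
pools = [(0.04 * soc, 0.2), (0.55 * soc, 30.0), (0.41 * soc, 1500.0)]
year1 = co2_evolved(pools, 1.0)
```

In an incubation, curve-fitting this sum of exponentials to measured CO2 evolution is what recovers the active and slow pool sizes and MRTs.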

Relevance:

20.00%

Publisher:

Abstract:

Grassland management affects soil organic carbon (SOC) storage and can be used to mitigate greenhouse gas emissions. However, for a country to assess emission reductions due to grassland management, there must be an inventory method for estimating the change in SOC storage. The Intergovernmental Panel on Climate Change (IPCC) has developed a simple carbon accounting approach for this purpose, and here we derive new grassland management factors that represent the effect of changing management on carbon storage for this method. Our literature search identified 49 studies dealing with effects of management practices that either degraded or improved conditions relative to nominally managed grasslands. On average, degradation reduced SOC storage to 95% +/- 0.06 and 97% +/- 0.05 of the carbon stored under nominal conditions in temperate and tropical regions, respectively. In contrast, improving grasslands with a single management activity enhanced SOC storage by 14% +/- 0.06 and 17% +/- 0.05 in temperate and tropical regions, respectively, and with an additional improvement(s), storage increased by another 11% +/- 0.04. We applied the newly derived factor coefficients to analyze the C sequestration potential of managed grasslands in the U.S., and found that over a 20-year period changing management could sequester from 5 to 142 Tg C yr^-1, or 0.1 to 0.9 Mg C ha^-1 yr^-1, depending on the level of change. This analysis provides revised factor coefficients for the IPCC method that can be used to estimate impacts of management; it also provides a methodological framework for countries to derive factor coefficients specific to conditions in their region.
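The IPCC stock-change approach applies these factor coefficients multiplicatively to a reference stock and spreads the change over a default 20-year transition. A minimal sketch using the temperate values reported above; the 60 Mg C/ha reference stock is an illustrative assumption.

```python
# Stock-change sketch using the temperate grassland management factors
# reported above; the reference stock is illustrative.
FACTORS = {
    "nominal": 1.00,
    "degraded": 0.95,              # 95% +/- 0.06 of nominal
    "improved": 1.14,              # +14% +/- 0.06
    "improved_plus": 1.14 * 1.11,  # one further improvement: +11% +/- 0.04
}

def annual_stock_change(soc_ref, old, new, years=20.0):
    """Mean annual change in SOC (per hectare) when management shifts
    from `old` to `new`, spread over the IPCC default 20-year period."""
    return soc_ref * (FACTORS[new] - FACTORS[old]) / years

# Example: a temperate pasture with 60 Mg C/ha under nominal management
# that is improved with a single activity.
rate = annual_stock_change(60.0, "nominal", "improved")  # 0.42 Mg C/ha/yr
```

The example rate falls inside the 0.1 to 0.9 Mg C ha^-1 yr^-1 range quoted above for U.S. grasslands.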

Relevance:

20.00%

Publisher:

Abstract:

Fractional Fokker-Planck equations (FFPEs) have gained much interest recently for describing transport dynamics in complex systems that are governed by anomalous diffusion and nonexponential relaxation patterns. However, effective numerical methods and analytic techniques for the FFPE are still in their embryonic state. In this paper, we consider a class of time-space fractional Fokker-Planck equations with a nonlinear source term (TSFFPE-NST), which involve the Caputo time fractional derivative (CTFD) of order α ∈ (0, 1) and the symmetric Riesz space fractional derivative (RSFD) of order μ ∈ (1, 2). By approximating the CTFD and RSFD using the L1 algorithm and the shifted Grünwald method, respectively, a computationally effective numerical method is presented to solve the TSFFPE-NST. The stability and convergence of the proposed numerical method are investigated. Finally, numerical experiments are carried out to support the theoretical claims.
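Both discretizations named above reduce to simple coefficient recurrences. The sketch below gives the standard forms of the shifted Grünwald weights and the L1 weights (these are the textbook recurrences, not code from the paper):

```python
def grunwald_weights(mu, n):
    """First n+1 shifted Grunwald weights g_k = (-1)^k * C(mu, k),
    generated by the recurrence g_k = g_{k-1} * (k - 1 - mu) / k."""
    g = [1.0]
    for k in range(1, n + 1):
        g.append(g[-1] * (k - 1 - mu) / k)
    return g

def l1_coefficients(alpha, n):
    """L1 weights b_j = (j+1)^(1-alpha) - j^(1-alpha) for the Caputo
    derivative of order alpha in (0, 1)."""
    return [(j + 1) ** (1 - alpha) - j ** (1 - alpha) for j in range(n)]

w = grunwald_weights(2.0, 4)  # integer order mu = 2 recovers [1, -2, 1, 0, 0]
b = l1_coefficients(0.5, 5)   # positive and decreasing
```

The mu = 2 case reproducing the classical second-difference stencil [1, -2, 1] is a quick consistency check; the positivity and monotone decay of the b_j are what the stability analysis of such schemes typically relies on.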

Relevance:

20.00%

Publisher:

Abstract:

This paper outlines a method of constructing narratives about an individual's self-efficacy. Self-efficacy is defined as "people's judgments of their capabilities to organise and execute courses of action required to attain designated types of performances" (Bandura, 1986, p. 391), and as such represents a useful construct for thinking about personal agency. Social cognitive theory provides the theoretical framework for understanding the sources of self-efficacy, that is, the elements that contribute to a sense of self-efficacy. The narrative approach adopted offers an alternative to traditional, positivist psychology, characterised by a preoccupation with measuring psychological constructs (like self-efficacy) by means of questionnaires and scales. It is argued that these instruments yield scores which are somewhat removed from the lived experience of the person (respondent or subject) associated with the score. The method involves a cyclical and iterative process using qualitative interviews to collect data from the participants: four mature-aged university students. The method builds on a three-interview procedure designed for life history research (Dolbeare & Schuman, cited in Seidman, 1998). This is achieved by introducing reflective homework tasks, as well as written data generated by the research participants, as they are guided in reflecting on those experiences (including behaviours, cognitions and emotions) that constitute a sense of self-efficacy, in narrative and by narrative. The method illustrates how narrative analysis is used "to produce stories as the outcome of the research" (Polkinghorne, 1995, p. 15), with detail and depth contributing to an appreciation of the 'lived experience' of the participants. The method is highly collaborative, with narratives co-constructed by researcher and research participants.
The research outcomes suggest that an enhanced understanding of self-efficacy contributes to motivation, the application of effort, and persistence in overcoming difficulties. The paper concludes with an evaluation of the research process by the students who participated in the author's doctoral study.