Abstract:
This thesis presents an original approach to parametric speech coding at rates below 1 kbit/sec, primarily for speech storage applications. The essential processes considered in this research are: efficient characterization of the evolving configuration of the vocal tract so as to follow phonemic features with high fidelity; representation of the speech excitation using minimal parameters with minor degradation in the naturalness of the synthesized speech; and, finally, quantization of the resulting parameters at the nominated rates. For encoding speech spectral features, a new method relying on Temporal Decomposition (TD) is developed which efficiently compresses spectral information by interpolating, with a new basis function, between the most steady points along the time trajectories of the spectral parameters. The compression ratio provided by the method is independent of the updating rate of the feature vectors, and hence allows high resolution in tracking significant temporal variations of speech formants with no effect on the spectral data rate. Accordingly, regardless of the quantization technique employed, the method yields a high compression ratio without sacrificing speech intelligibility. Several new techniques for improving the performance of the interpolation of spectral parameters through phonetically-based analysis are proposed and implemented in this research, comprising event-approximated TD, near-optimal shaping of the event-approximating functions, efficient speech parametrization for TD based on an extensive investigation originally reported in this thesis, and a hierarchical error-minimization algorithm for the decomposition of feature parameters which significantly reduces the complexity of the interpolation process.
Speech excitation in this work is characterized by a novel Multi-Band Excitation paradigm which accurately determines the harmonic structure in the LPC (linear predictive coding) residual spectra, within individual bands, using the concept of Instantaneous Frequency (IF) estimation in the frequency domain. The model yields an effective two-band approximation to the excitation and also computes pitch and voicing with high accuracy. New methods for interpolative coding of the pitch and gain contours are also developed in this thesis. For pitch, relying on the correlation between phonetic evolution and pitch variations during voiced speech segments, TD is employed to interpolate the pitch contour between critical points introduced by event centroids. This compresses the pitch contour by a ratio of about 1/10 with negligible error. To approximate the gain contour, a set of uniformly-distributed Gaussian event-like functions is used, which reduces the amount of gain information to about 1/6 with acceptable accuracy. The thesis also addresses a new quantization method applied to spectral features on the basis of the statistical properties and spectral sensitivity of the spectral parameters extracted from TD-based analysis. The experimental results show that good quality speech, comparable to that of conventional coders operating at rates over 2 kbits/sec, can be achieved at rates of 650-990 bits/sec.
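The gain-contour approximation described above can be illustrated with a small least-squares sketch. Nothing here is the thesis' implementation: the contour, the number of events (10 amplitudes from 60 frames, roughly the quoted 1/6 reduction), and the Gaussian width are all illustrative choices.

```python
import numpy as np

def fit_gaussian_basis(contour, n_events, width):
    """Least-squares fit of a contour to uniformly spaced Gaussian events.

    Returns the n_events basis amplitudes (the compressed representation)
    and the reconstructed contour.
    """
    t = np.arange(len(contour))
    centres = np.linspace(0, len(contour) - 1, n_events)
    # Basis matrix: one Gaussian "event" per column.
    B = np.exp(-((t[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))
    amps, *_ = np.linalg.lstsq(B, contour, rcond=None)
    return amps, B @ amps

# Smooth synthetic gain contour: 60 frames reduced to 10 amplitudes.
t = np.arange(60)
gain = 0.5 + 0.4 * np.sin(2 * np.pi * t / 60)
amps, approx = fit_gaussian_basis(gain, n_events=10, width=4.0)
```

Only the 10 amplitudes (plus the fixed grid spacing and width) need to be stored; the decoder rebuilds the contour by summing the weighted Gaussians.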
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding-energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding-energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be highly dependent upon the type of application. The most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained.
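The elementary sampling step common to photon-transport Monte Carlo codes of this kind can be sketched in a few lines. This is only a toy narrow-beam model of my own (single energy, one homogeneous slab, no scatter tracking or binding-energy corrections): free path lengths are drawn from the exponential attenuation law and the surviving fraction is checked against the analytic Beer-Lambert value.

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons, seed=1):
    """Fraction of photons whose first sampled free path exceeds the slab.

    mu: linear attenuation coefficient (1/cm); thickness in cm.
    Free paths are sampled as s = -ln(1 - U) / mu, with U uniform on [0, 1).
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_photons):
        s = -math.log(1.0 - rng.random()) / mu  # sampled free path
        if s > thickness:
            survived += 1
    return survived / n_photons

mu, d = 0.2, 5.0                 # roughly soft tissue at ~60 keV, 5 cm slab
mc = transmitted_fraction(mu, d, 100_000)
analytic = math.exp(-mu * d)     # narrow-beam Beer-Lambert prediction
```

A full transport code adds energy-dependent cross-sections, a choice between coherent/incoherent scatter and photoelectric absorption at each interaction, and geometry routines; but every history still rests on this free-path sampling.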
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated here as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate its potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that its precision is poorer (approximately twice the coefficient of variation) than that of standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: (1) demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; (2) demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique; (3) demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and (4) provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
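The two-component tissue decomposition at the heart of DEXA reduces, in the idealized scatter-free case, to a 2x2 linear solve: the log-attenuation measured at each of two energies is a linear combination of the bone-mineral and soft-tissue areal densities. A minimal sketch with made-up, uncalibrated attenuation coefficients (not values from the thesis):

```python
import numpy as np

def dexa_decompose(log_atten, mu_bone, mu_soft):
    """Solve  ln(I0/I)_E = mu_bone(E)*t_b + mu_soft(E)*t_s  at two energies.

    log_atten, mu_bone, mu_soft: (low-energy, high-energy) value pairs.
    Returns (t_b, t_s), the areal densities (g/cm^2) of bone mineral and
    soft tissue along the ray.
    """
    A = np.array([[mu_bone[0], mu_soft[0]],
                  [mu_bone[1], mu_soft[1]]])
    return np.linalg.solve(A, np.asarray(log_atten, dtype=float))

# Illustrative mass attenuation coefficients (cm^2/g), low and high energy.
mu_b = (0.60, 0.30)   # bone mineral (made-up numbers)
mu_s = (0.25, 0.18)   # soft tissue  (made-up numbers)

# Forward-simulate a ray through 1.2 g/cm^2 bone and 18 g/cm^2 soft tissue.
t_b_true, t_s_true = 1.2, 18.0
log_atten = [mu_b[0] * t_b_true + mu_s[0] * t_s_true,
             mu_b[1] * t_b_true + mu_s[1] * t_s_true]
t_b, t_s = dexa_decompose(log_atten, mu_b, mu_s)
```

The DPA(+) idea extends this to a 3x3 system by adding the measured path length as a third equation, which is what permits the three-component (bone mineral, fat, lean soft tissue) decomposition; the noise amplification of that larger inversion is one source of the poorer precision noted above.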
Abstract:
In the terminology of logic programming, current search engines answer Sigma1 queries: formulas of the form ∃x θ(x), where θ is a boolean combination of attributes. Such a query is determined by a particular sequence of keywords input by a user. In order to give more control to users, search engines will have to tackle more expressive queries, namely Sigma2 queries: formulas of the form ∃x ∀y θ(x, y). The purpose of the talk is to examine which directions could be explored in order to move towards more expressive languages and more powerful search engines, and the benefits that users should expect.
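As a concrete toy example (mine, not from the talk), the two query classes can be evaluated directly over a small collection of keyword-annotated documents: a Sigma1 query is a single existential pass, while a Sigma2 query nests a universal check inside it.

```python
# Toy document collection: each document is a set of attribute keywords.
docs = [
    {"python", "tutorial"},
    {"python", "reference"},
    {"java", "tutorial"},
]

def sigma1(docs, phi):
    """Sigma1 query: does there exist a document x with phi(x)?"""
    return any(phi(x) for x in docs)

def sigma2(docs, phi):
    """Sigma2 query: does there exist x such that for all y, phi(x, y)?"""
    return any(all(phi(x, y) for y in docs) for x in docs)

# Sigma1: is there a document matching "python AND tutorial"?
q1 = sigma1(docs, lambda x: "python" in x and "tutorial" in x)

# Sigma2: is there a document whose attributes cover every document
# mentioning "python"?  (exists x, forall y: python in y implies y <= x)
q2 = sigma2(docs, lambda x, y: "python" not in y or y <= x)
```

The nesting also shows the cost gap hinted at in the abstract: over n documents, the Sigma1 query is a single O(n) scan, while the naive Sigma2 evaluation is O(n^2).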
Abstract:
Following the position of Beer and Burrows (2007), this paper proposes a re-conceptualization of Web 2.0 interaction in order to understand the properties of action possibilities in and of Web 2.0. The paper discusses the positioning of Web 2.0 social interaction in light of current descriptions, which point toward the capacities of technology in the production of social affordances within that domain (Bruns 2007; Jenkins 2006; O'Reilly 2005). While this diminishes the agency and reflexivity of users of Web 2.0, it also inadvertently positions tools as the central driver of the interactive potential available (Everitt and Mills 2009; van Dijck 2009). In doing so it neglects the possibility that participants may be more involved in the production of Web 2.0 than the technology that underwrites it. It is this aspect of Web 2.0 that is questioned in the study, with particular interest in how an analytical option may be made available to broaden the scope of investigations into Web 2.0 to include the capacity for an interactive potential in light of how action possibilities are presented to users through communication with others (Bonderup Dohn 2009).
Abstract:
Background: Wandering represents a major problem in the management of patients with Alzheimer's disease (AD). In this study we examined the utility of the Algase Wandering Scale (AWS), a newly developed psychometric instrument that asks caregivers to assess the likelihood of wandering behavior. Methods: The AWS was administered to the caregivers of 40 AD patients, and total and subscale scores were examined in relation to measures of mental and functional status, depressive symptoms and medication usage. Results: AWS scores were comparable to, though slightly lower than, the normative values previously published. Higher scores were associated with more severe dementia. The Negative Outcome subscale showed a significant increase in reported falls or injuries in association with antidepressant use. Conclusions: These data provide some construct validation for the AWS as a potentially useful scale for assessing wandering behaviors in AD.
Abstract:
This study examined the psychometric properties of an expanded version of the Algase Wandering Scale (Version 2) (AWS-V2) in a cross-cultural sample. A cross-sectional survey design was used. Study subjects were 172 English-speaking persons with dementia (PWD) from long-term care facilities in the USA, Canada, and Australia. Two or more facility staff rated each subject on the AWS-V2. Demographic and cognitive data (MMSE) were also obtained. Staff provided information on their own knowledge of the subject and of dementia. Separate factor analyses on data from two samples of raters each explained greater than 66% of the variance in AWS-V2 scores and validated four (persistent walking, navigational deficit, eloping behavior, and shadowing) of the five factors in the original scale. Items added to create the AWS-V2 strengthened the shadowing subscale, failed to improve the routinized walking subscale, and added a factor, attention shifting, as compared to the original AWS. Evidence for validity was found in significant correlations and ANOVAs relating the AWS-V2 and most subscales to a single-item indicator of wandering and to the MMSE. Evidence of reliability was shown by the internal consistency of the AWS-V2 (0.87, 0.88) and of its subscales (range 0.66 to 0.88), by Kappa values for individual items (17 of 27 greater than 0.4), and by ANOVAs comparing ratings across rater groups (nurses, nurse aides, and other staff). Analyses support the validity and reliability of the AWS-V2 overall and of its persistent walking, spatial disorientation, and eloping behavior subscales. The AWS-V2 and its subscales are an appropriate way to measure wandering as conceptualized within the Need-driven Dementia-compromised Behavior Model in studies of English-speaking subjects. Suggestions for further strengthening the scale and for extending its use to clinical applications are described.
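The internal-consistency values quoted (0.87, 0.88) are Cronbach's alpha coefficients. For reference, alpha is computed from the item-score and total-score variances; the ratings below are made up for illustration, not the study's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of k lists, each holding one item's scores over n subjects.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(subject) for subject in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical rating items scored for five subjects.
ratings = [
    [1, 2, 3, 4, 4],
    [1, 3, 3, 4, 3],
    [2, 2, 3, 3, 4],
]
alpha = cronbach_alpha(ratings)
```

Values near the 0.87-0.88 reported for the full scale indicate that the items largely co-vary, i.e. they appear to measure a single underlying construct.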
Abstract:
Soil C decomposition is sensitive to changes in temperature, and even small increases in temperature may prompt large releases of C from soils. But much of what we know about soil C responses to global change is based on short-term incubation data and model output that implicitly assumes soil C pools are composed of organic matter fractions with uniform temperature sensitivities. In contrast, kinetic theory based on chemical reactions suggests that older, more-resistant C fractions may be more temperature sensitive. Recent research on the subject is inconclusive, indicating that the temperature sensitivity of labile soil organic matter (OM) decomposition could either be greater than, less than, or equivalent to that of resistant soil OM. We incubated soils at constant temperature to deplete them of labile soil OM and then successively assessed the CO2-C efflux in response to warming. We found that the decomposition response to experimental warming early during soil incubation (when more labile C remained) was less than that later when labile C was depleted. These results suggest that the temperature sensitivity of resistant soil OM pools is greater than that for labile soil OM and that global change-driven soil C losses may be greater than previously estimated.
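Temperature sensitivity in warming-response experiments of this kind is conventionally summarized as a Q10: the factor by which CO2-C efflux increases per 10 °C of warming. A minimal calculation with illustrative efflux rates (not the study's measurements):

```python
def q10(rate_cold, rate_warm, t_cold, t_warm):
    """Q10 = (R_warm / R_cold) ** (10 / (T_warm - T_cold))."""
    return (rate_warm / rate_cold) ** (10.0 / (t_warm - t_cold))

# Hypothetical CO2-C efflux (ug C / g soil / day) at 15 C and 25 C,
# early in the incubation (labile C present) vs late (labile C depleted).
early = q10(rate_cold=10.0, rate_warm=20.0, t_cold=15.0, t_warm=25.0)
late = q10(rate_cold=2.0, rate_warm=5.0, t_cold=15.0, t_warm=25.0)
# late > early is the pattern reported above: the resistant (late-stage)
# pool responds more strongly to warming than the labile (early) pool.
```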