332 results for decomposition techniques
Abstract:
This paper proposes a combination of source-normalized weighted linear discriminant analysis (SN-WLDA) and short utterance variance (SUV) PLDA modelling to improve short-utterance PLDA speaker verification. Because short-utterance i-vectors vary with the speaker, session conditions, and the phonetic content of the utterance (utterance variation), the combined approach of SN-WLDA projection and SUV PLDA modelling is used to compensate for session and utterance variations. Experimental studies show that combining SN-WLDA and SUV PLDA modelling improves on the baseline system (WCCN[LDA]-projected Gaussian PLDA (GPLDA)), as this approach effectively compensates for the session and utterance variations.
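The abstract gives no implementation detail, so the following is only a minimal sketch of the projection-then-score pipeline it describes: a pre-learned compensating projection is applied to length-normalized i-vectors before verification scoring. The projection matrix W, the i-vector dimensions, and the cosine scorer (standing in for SUV PLDA scoring) are assumptions for illustration, not the authors' method.

```python
import numpy as np

def length_normalize(x):
    """Scale an i-vector to unit length (a common pre-processing step before PLDA)."""
    return x / np.linalg.norm(x)

def project(ivector, W):
    """Apply a session-compensating projection (e.g. an LDA-style transform) learned offline."""
    return W.T @ ivector

def cosine_score(enrol, test):
    """Cosine similarity between enrolment and test vectors (stand-in for PLDA scoring)."""
    return float(enrol @ test / (np.linalg.norm(enrol) * np.linalg.norm(test)))

# Hypothetical 400-dim i-vectors and a 400x200 projection matrix W learned on training data.
rng = np.random.default_rng(0)
W = rng.standard_normal((400, 200))
enrol = length_normalize(project(rng.standard_normal(400), W))
test = length_normalize(project(rng.standard_normal(400), W))
print(cosine_score(enrol, test))
```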
Abstract:
The foliage of a plant performs vital functions; leaf models must therefore be developed so that plant architecture can be modelled from scattered data captured with a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid-movement or biological-process model. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface against a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208--234] and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582--2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces that better represent the original physical leaf.
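As a rough, self-contained illustration of fitting a smooth surface through scattered leaf-scan points, the sketch below uses SciPy's thin-plate-spline radial basis interpolator. The synthetic (x, y, z) data and the smoothing value are assumptions, and this generic RBF fit is not the paper's D2-spline, finite element, or hybrid Clough-Tocher implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical scattered leaf scan: (x, y) positions and surface heights z.
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 1.0, size=(500, 2))
z = np.sin(3 * xy[:, 0]) * np.cos(2 * xy[:, 1]) + 0.01 * rng.standard_normal(500)

# Smoothed thin-plate-spline surface fitted through the scattered points.
surface = RBFInterpolator(xy, z, kernel='thin_plate_spline', smoothing=1e-3)

# Evaluate the fitted surface on a regular grid for visualisation or further modelling.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_fit = surface(grid).reshape(gx.shape)
print(z_fit.shape)  # (50, 50)
```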
Abstract:
This study analyzes the management of air pollutants in Chinese industrial sectors from 1998 to 2009. Decomposition analysis applying the logarithmic mean Divisia index is used to analyze changes in air pollutant emissions with a focus on five factors: coal pollution intensity (CPI), end-of-pipe treatment (EOP), the energy mix (EM), productive efficiency change (EFF), and production scale change (PSC). Three pollutants are the main focus of this study: sulfur dioxide (SO2), dust, and soot. The novelty of this paper is its focus on the impact of the elimination policy on air pollution management in China by type of industry, using the scale-merit effect for pollution abatement technology change. First, SO2 emissions from Chinese industrial sectors are shown to have increased because of the increase in production scale; however, the EOP equipment that induced this change and improvements in energy efficiency have prevented an increase in SO2 emissions commensurate with the increase in production. Second, soot emissions were successfully reduced and controlled between 1998 and 2009 in all industries except the steel industry, even though production scale expanded. This reduction was achieved through improvements in EOP technology and in energy efficiency. Dust emissions in the Chinese industrial sectors decreased by nearly 65% between 1998 and 2009. This successful reduction was achieved by implementing EOP technology and pollution prevention activities during production processes, especially in the cement industry. Finally, pollution prevention in the cement industry is shown to result from production technology development rather than from scale merit.
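For readers unfamiliar with the method, the sketch below illustrates an additive logarithmic mean Divisia index (LMDI) decomposition for a single hypothetical sector whose emissions are expressed as the product of the five factors named above; the numbers are invented and are not taken from the study.

```python
import numpy as np

def log_mean(a, b):
    """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    return a if np.isclose(a, b) else (a - b) / (np.log(a) - np.log(b))

# Hypothetical single-sector example: emissions as the product of five drivers
# (coal pollution intensity, end-of-pipe, energy mix, efficiency, production scale).
factors_1998 = {"CPI": 2.0, "EOP": 0.9, "EM": 0.6, "EFF": 1.5, "PSC": 100.0}
factors_2009 = {"CPI": 1.6, "EOP": 0.7, "EM": 0.5, "EFF": 1.2, "PSC": 180.0}

E0 = np.prod(list(factors_1998.values()))
E1 = np.prod(list(factors_2009.values()))
L = log_mean(E1, E0)

# Additive LMDI: each factor's contribution to the total emission change E1 - E0.
contributions = {k: L * np.log(factors_2009[k] / factors_1998[k]) for k in factors_1998}
print(f"total change: {E1 - E0:.1f}")
print({k: round(v, 1) for k, v in contributions.items()})
# The factor contributions sum exactly to the total change, a key property of LMDI.
```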
Abstract:
This study analyzes toxic chemical substance management in three U.S. manufacturing sectors from 1991 to 2008. Decomposition analysis applying the logarithmic mean Divisia index is used to analyze changes in toxic chemical substance emissions by the following five factors: cleaner production, end-of-pipe treatment, transfer for further management, mixing of intermediate materials, and production scale. Based on our results, the chemical manufacturing sector reduced toxic chemical substance emissions mainly via end-of-pipe treatment. Meanwhile, transfer for further management contributed to the reduction of toxic chemical substance emissions in the metal fabrication industry. This occurred because the environmental business market expanded in the 1990s and the infrastructure for recycling metal and other wastes became more efficient. Cleaner production is the main contributor to toxic chemical reduction in the electrical product industry, implying that this industry has been successful in developing more environmentally friendly product designs and production processes.
Abstract:
This study decomposed the determinants of environmental quality into scale, technique, and composition effects. We applied a semiparametric method of generalized additive models, which enabled us to use flexible functional forms and include several independent variables in the model. The differences in the technique effect were found to play a crucial role in reducing pollution. We found that the technique effect was sufficient to reduce sulfur dioxide emissions. On the other hand, its effect was not enough to reduce carbon dioxide (CO2) emissions and energy use, except for the case of CO2 emissions in high-income countries.
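As a loose illustration of the semiparametric approach described, the sketch below fits a generalized additive model with one smooth term per effect using the third-party pygam library; the synthetic panel data, variable choices, and column ordering are assumptions, not the study's data or specification.

```python
import numpy as np
from pygam import LinearGAM, s  # third-party; pip install pygam

# Hypothetical panel: scale, composition, and technique proxies per observation,
# with an emissions measure as the response variable.
rng = np.random.default_rng(2)
X = rng.uniform(1, 10, size=(300, 3))          # columns: scale, composition, technique
y = 2 * X[:, 0] + X[:, 1] - 1.5 * np.log(X[:, 2]) + rng.normal(0, 0.5, 300)

# Semiparametric additive model: one smooth term per effect, no functional form imposed.
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y)

# Partial dependence of emissions on the "technique" term (column 2).
XX = gam.generate_X_grid(term=2)
pd = gam.partial_dependence(term=2, X=XX)
print(pd.shape)
```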
Abstract:
Purpose: Corneal confocal microscopy (CCM) is a rapid, non-invasive ophthalmic technique which has been shown to diagnose and stratify the severity of diabetic neuropathy. Current morphometric techniques assess individual static images of the sub-basal nerve plexus; this work explores the potential for non-invasive assessment of the wide-field morphology and dynamic changes of this plexus in vivo.
Methods: In this pilot study, laser scanning CCM was used to acquire maps (using a dynamic fixation target and semi-automated tiling software) of the central corneal sub-basal nerve plexus in 4 diabetic patients with neuropathy, 6 without neuropathy, and 2 control subjects. Nerve migration was measured in an additional 7 diabetic patients with neuropathy, 4 without neuropathy and 2 control subjects by repeating a modified version of the mapping procedure within 2-8 weeks, allowing distinctive nerve landmarks to be re-identified in the two montages. The rate of nerve movement was determined from these data and normalised to a weekly rate (µm/week) using customised software.
Results: Wide-field corneal nerve fibre length correlated significantly with the Neuropathy Disability Score (r = -0.58, p < 0.05), vibration perception (r = -0.66, p < 0.05) and peroneal conduction velocity (r = 0.67, p < 0.05). Central corneal nerve fibre length did not correlate with any of these measures of neuropathy (p > 0.05 for all). The rate of corneal nerve migration was 14.3 ± 1.1 µm/week in diabetic patients with neuropathy, 19.7 ± 13.3 µm/week in diabetic patients without neuropathy, and 24.4 ± 9.8 µm/week in control subjects; these differences were not statistically significant (p = 0.543).
Conclusions: Our data demonstrate that it is possible to capture wide-field images of the corneal nerve plexus and to quantify the rate of corneal nerve migration by repeating the procedure over a number of weeks. Further studies with larger sample sizes are required to determine the utility of this approach for the diagnosis and monitoring of diabetic neuropathy.
Abstract:
The thermal decomposition of the kaolinite–potassium acetate intercalation complex has been studied using simultaneous thermogravimetry coupled with Fourier-transform infrared spectroscopy and mass spectrometry (TG-FTIR-MS). The results showed that the thermal decomposition of the complex took place in four temperature ranges, namely 50–100, 260–320, 320–550, and 650–780 °C. The maximum mass-loss rates for the thermal decomposition of the kaolinite–potassium acetate intercalation complex were observed at 81, 296, 378, 411, 486, and 733 °C, attributed to (a) loss of adsorbed water, (b) thermal decomposition of surface-adsorbed potassium acetate (KAc), (c) loss of the water coordinated to potassium acetate in the intercalated kaolinite, (d) thermal decomposition of the intercalated KAc in the kaolinite interlayer and removal of inner-surface hydroxyls, (e) loss of the inner hydroxyls, and (f) thermal decomposition of carbonate derived from the decomposition of KAc. The thermal decomposition of the intercalated potassium acetate started in the range 320–550 °C, accompanied by the release of water, acetone, carbon dioxide, and acetic acid. The identification of pyrolysis fragment ions provided insight into the thermal decomposition mechanism and showed that the main decomposition fragments of the kaolinite–KAc intercalation complex were water, acetone, carbon dioxide, and acetic acid. TG-FTIR-MS was demonstrated to be a powerful tool for investigating kaolinite intercalation complexes, delivering detailed insight into their thermal decomposition processes as characterized by mass loss and the evolved gases.
Abstract:
Grading is basic to the work of Landscape Architects concerned with design on the land. Gradients conducive to easy use, rainwater drained away, and land slope contributing to functional and aesthetic use are all essential to the amenity and pleasure of external environments. This workbook has been prepared specifically to support the program of landscape construction for students in Landscape Architecture. It is concerned primarily with the technical design of grading rather than with its aesthetic design. It must be stressed that the two aspects are rarely separate; what is designed should be technically correct and aesthetically pleasing - it needs to look good as well as to function effectively. This revised edition contains amended and new content which has evolved out of student classes and discussion with colleagues. I am pleased to have on record that every delivery of this workbook material has resulted in my own better understanding of grading and the techniques for its calculation and communication.
Abstract:
The solutions proposed in this thesis improve gait recognition performance in practical scenarios, further enabling the adoption of gait recognition in real-world security and forensic applications that require identifying humans at a distance. Pioneering work has been conducted on frontal gait recognition using depth images, allowing gait to be integrated with biometric walkthrough portals. The effects of challenging conditions, including clothing, carrying goods, and viewpoint, have been explored. Enhanced approaches are proposed for the segmentation, feature extraction, feature optimisation and classification elements, and state-of-the-art recognition performance has been achieved. A frontal depth gait database has been developed and made available to the research community for further investigation. Solutions are explored in the 2D and 3D domains using multiple image sources, and both domain-specific and domain-independent gait features are proposed.
Abstract:
This research is a step forward in improving the accuracy of detecting anomalies in a data graph representing connectivity between people in an online social network. The proposed hybrid methods are based on fuzzy machine learning techniques utilising different types of structural input features. The methods are presented within a multi-layered framework that provides the full set of requirements for finding anomalies in data graphs generated from online social networks, including data modelling and analysis, labelling, and evaluation.
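By way of illustration only, the sketch below extracts a few structural node features from a synthetic graph and flags outlying nodes; the graph generator, the chosen features, and the IsolationForest detector (a stand-in for the fuzzy machine learning methods the thesis actually uses) are all assumptions.

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical social graph; in practice this would be built from online-social-network data.
G = nx.barabasi_albert_graph(200, 3, seed=3)

# Structural features per node: degree, clustering coefficient, average neighbour degree.
deg = dict(G.degree())
clust = nx.clustering(G)
avg_nbr_deg = nx.average_neighbor_degree(G)
X = np.array([[deg[n], clust[n], avg_nbr_deg[n]] for n in G.nodes()])

# IsolationForest stands in here for the fuzzy classifiers used in the thesis.
scores = IsolationForest(random_state=0).fit(X).decision_function(X)
suspects = [n for n, s in zip(G.nodes(), scores) if s < np.percentile(scores, 5)]
print(suspects[:10])
```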
Abstract:
A description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for the mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept, instead of removing part of a phrase as a stop word. Abbreviations appearing in many different forms are manually identified, and only one form of each abbreviation is used. Clustering is used to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse: few features are irrelevant, but the features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space, and classifiers have been built on the reduced feature space. In experiments, a set of tests is conducted to determine which classification method is best for medical text classification. The Non-negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all the traditional classifiers tested. We also found that TF/IDF weighting, which works well for long-text classification, is inferior to binary weighting for short-document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
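As a minimal, self-contained sketch of the kind of pipeline the abstract describes (binary term weighting, NMF dimensionality reduction, then a linear SVM), the example below uses scikit-learn with a handful of invented injury narratives and codes; the component choices and tiny dataset are assumptions for illustration, not the authors' actual implementation or data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical injury-narrative snippets and codes; the real dataset is not public here.
texts = ["fell from ladder fractured left wrist",
         "burn to right hand from hot oil",
         "laceration to scalp hit by cricket bat",
         "sprained ankle playing netball"]
codes = ["fall", "burn", "struck", "fall"]

# Binary term weighting (reported in the paper to beat TF/IDF on short narratives),
# NMF to map the sparse term space to a low-dimensional space, then a linear SVM.
clf = make_pipeline(
    CountVectorizer(binary=True),
    NMF(n_components=3, init="nndsvda", max_iter=500),
    LinearSVC(),
)
clf.fit(texts, codes)
print(clf.predict(["fell off bike grazed knee"]))
```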