238 results for projective techniques
Abstract:
Methods by which bit-level systolic array chips can be made fault tolerant are discussed briefly. Using a simple analysis based on both Poisson and Bose-Einstein statistics, the authors demonstrate that such techniques can provide significant yield enhancement. Alternatively, the dimensions of an array can be increased considerably for the same initial (non-fault-tolerant) chip yield.
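The yield argument above can be sketched numerically. Under a Poisson defect model, a chip with no redundancy works only if it has zero defects, while a fault-tolerant array survives as long as the number of faulty cells does not exceed the available spares. The sketch below is illustrative only — the defect rate and spare count are assumptions, not figures from the paper:

```python
import math

def poisson_yield(mean_defects, spares=0):
    """Probability that at most `spares` defects occur (Poisson model),
    i.e. the chip works if every defect can be absorbed by a spare cell."""
    return sum(math.exp(-mean_defects) * mean_defects**k / math.factorial(k)
               for k in range(spares + 1))

# Illustrative numbers: 2 expected defects per chip on average.
baseline = poisson_yield(2.0)            # no redundancy: e^-2, about 13.5%
enhanced = poisson_yield(2.0, spares=3)  # tolerate up to 3 faulty cells
```

With these assumed numbers the fault-tolerant yield rises from roughly 14% to over 80%, which is the kind of enhancement the abstract refers to.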
Abstract:
Research has been undertaken to investigate the use of artificial neural network (ANN) techniques to improve the performance of a low bit-rate vector transform coder. Considerable improvements in the perceptual quality of the coded speech have been obtained. New ANN-based methods for vector quantiser (VQ) design and for the adaptive updating of VQ codebooks are introduced for use in speech coding applications.
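For context, the classical (non-ANN) baseline for VQ codebook design is the generalised Lloyd / LBG update: assign each training vector to its nearest codeword, then move each codeword to the centroid of its cell. The sketch below shows one such iteration; it is a minimal illustration of standard VQ design, not the ANN-based method the abstract introduces:

```python
def nearest(codebook, v):
    """Index of the codeword closest (in squared error) to vector v."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(codebook[i], v)))

def lloyd_update(codebook, vectors):
    """One LBG-style refinement step: partition the training vectors by
    nearest codeword, then replace each codeword by its cell's centroid.
    Empty cells keep their old codeword."""
    cells = [[] for _ in codebook]
    for v in vectors:
        cells[nearest(codebook, v)].append(v)
    return [
        [sum(c) / len(cell) for c in zip(*cell)] if cell else cw
        for cw, cell in zip(codebook, cells)
    ]
```

Repeating this update until the codebook stops moving gives the locally optimal codebook that adaptive schemes then track as the speech statistics change.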
Abstract:
With the emergence of multicore and manycore processors, engineers must design and develop software in drastically new ways to benefit from the computational power of all cores. However, developing parallel software is much harder than developing sequential software because parallelism cannot easily be abstracted away. Authors Hans Vandierendonck and Tom Mens provide an overview of technologies and tools that support developers in this complex and error-prone task. © 2012 IEEE.
Abstract:
Laughter is a frequently occurring social signal and an important part of human non-verbal communication. However, it is often overlooked as a serious topic of scientific study. While the lack of research in this area is mostly due to laughter's non-serious nature, laughter is also a particularly difficult social signal to produce on demand in a convincing manner, making it a difficult subject for study in laboratory settings. In this paper we provide techniques and guidance for inducing both hilarious laughter and conversational laughter. These techniques were devised with the goal of capturing motion information related to laughter while the person laughing was either standing or seated. Comments on the value of each technique and general guidance on the importance of atmosphere, environment and social setting are provided.
Abstract:
The adoption of each new level of automotive emissions legislation often requires the introduction of additional emissions reduction techniques or the development of existing emissions control systems. This, in turn, usually requires the implementation of new sensors and hardware which must subsequently be monitored by the on-board fault detection systems. The reliable detection and diagnosis of faults in these systems or sensors, which cause tailpipe emissions to rise above the progressively lower failure thresholds, presents enormous challenges for OBD engineers. This paper gives a review of the field of fault detection and diagnostics as used in the automotive industry. Previous work is discussed, with particular emphasis on the various strategies and techniques employed. Methodologies such as state estimation, parity equations and parameter estimation are explained, along with their application within a physical-model diagnostic structure. The utilisation of symptoms and residuals in the diagnostic process is also discussed. These traditional physical-model-based diagnostics are examined in terms of their limitations, and the requirements of the OBD legislation are addressed. Additionally, novel diagnostic techniques such as principal component analysis (PCA) are presented as a potential means of achieving the monitoring requirements of current and future OBD legislation.
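The residual-based diagnosis the abstract describes reduces to comparing a sensor reading with a model prediction and testing the difference against a calibrated threshold. The sketch below is a minimal illustration of that parity-residual idea; the function names, symptom strings and threshold are assumptions for illustration, not values from any OBD standard:

```python
def residual(measured, model_output):
    """Parity residual: difference between the sensor reading and the
    physical model's prediction for the same quantity."""
    return measured - model_output

def diagnose(measured, model_output, threshold):
    """Map the residual to a symptom: within threshold means healthy,
    otherwise report the direction of the deviation."""
    r = residual(measured, model_output)
    if abs(r) <= threshold:
        return "no fault"
    return "positive drift" if r > 0 else "negative drift"
```

In a real OBD monitor the threshold would be calibrated so that a confirmed symptom corresponds to tailpipe emissions exceeding the legislated failure limit.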
Abstract:
This research uses the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods that maintain the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise, Trade and Development and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous studies have demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require a methodology that reproduces the inherently complex multivariate relations. Previous investigation of the Tellus geochemical data has included Gaussian-based techniques; however, earth-science variables are rarely Gaussian, so transformation of the data is integral to the approach. In particular, the stepwise conditional transform is investigated and developed for the Tellus geochemical datasets. Owing to the limited availability of data, the transform is applied to four variables in a bivariate nested fashion. Simulation of these transformed variables is then carried out, along with a corresponding back transformation to original units. Results show that the stepwise transform successfully reproduces both the univariate statistics and the complex bivariate relations exhibited by the data.
Greater fidelity to multivariate relationships will improve uncertainty models, which are required for consequent geological, environmental and economic inferences.
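As a point of reference, the baseline normal score transform mentioned above can be sketched in a few lines: each value is mapped to the standard-normal quantile of its rank. This is a minimal rank-based version; the plotting position (rank + 0.5)/n and the variable names are illustrative assumptions, not the Tellus workflow:

```python
from statistics import NormalDist

def normal_score_transform(values):
    """Map each value to the standard-normal quantile of its rank,
    producing marginally Gaussian scores for geostatistical simulation."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    nd = NormalDist()  # standard normal, mean 0, sd 1
    scores = [0.0] * n
    for rank, i in enumerate(order):
        scores[i] = nd.inv_cdf((rank + 0.5) / n)  # plotting position in (0, 1)
    return scores
```

The stepwise conditional transform extends this idea by transforming each subsequent variable conditional on the class of the previously transformed one, which is what lets it preserve the bivariate relations the plain normal score transform destroys.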
Abstract:
Despite its economic significance, competition law remains fragmented, lacking an international framework that allows for dispute settlement. This, together with the growing importance of non-free-market economies in world trade, requires us to reconsider and re-evaluate the possibilities of bringing an antitrust suit against a foreign state. If a level playing field in the global marketplace is to be achieved, the possibility of hiding behind the bulwark of state sovereignty should be minimised. States should not be free to act in an anti-competitive way, but at present the legal framework seems ill-equipped to handle such challenges.
This paper deals with the defences available in litigation concerning transnational anti-competitive agreements involving or implicating foreign states. Four important legal doctrines are analysed: non-justiciability (the political question doctrine), state immunity, the act of state doctrine and foreign state compulsion. The paper also addresses the general problem of the applicability of competition laws to a foreign state as such. This is a tale of repeated unsuccessful efforts to sue OPEC and of recent attempts in the US to deal with the export cartels of Chinese state-owned enterprises.
Abstract:
The techniques of principal component analysis (PCA) and partial least squares (PLS) are introduced as multivariate statistical methods for modelling process plants. The advantages and limitations of PCA and PLS are discussed from the perspective of the types of data and problems that might be encountered in this application area. These concepts are exemplified by two case studies: the first deals with data from a continuous stirred tank reactor (CSTR) simulation and the second with a low-density polyethylene (LDPE) reactor simulation taken from the literature.
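The core of PCA as used for process monitoring is an eigendecomposition of the covariance matrix of the measured variables. For two variables this has a closed form, which makes a compact sketch possible; this toy version (variable names and data are illustrative, not from either case study) returns the variances along the two principal axes:

```python
import math

def pca_2d(xs, ys):
    """Principal-axis variances of two process variables, via the
    closed-form eigenvalues of their 2x2 sample covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc  # variance along PC1, PC2
```

When the two variables are strongly correlated, nearly all the variance lands on the first component, which is exactly why PCA compresses highly correlated plant measurements so effectively and why residual variance off the principal axes can flag abnormal operation.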
Abstract:
We present three natural language marking strategies based on fast and reliable shallow parsing techniques and on widely available lexical resources: lexical substitution, adjective conjunction swaps, and relativiser switching. We test these techniques on a random sample of the British National Corpus. Individual candidate marks are checked for goodness of structural and semantic fit, using both lexical resources and the web as a corpus. A representative sample of marks is given to 25 human judges to evaluate for acceptability and preservation of meaning. This establishes a correlation between corpus-based felicity measures and perceived quality, and supports qualified predictions. Grammatical acceptability correlates strongly with our automatic measure (Pearson's r = 0.795, p = 0.001), allowing us to account for about two thirds of the variability in human judgements. A moderate but statistically non-significant correlation (Pearson's r = 0.422, p = 0.356) is found with judgements of meaning preservation, indicating that the contextual window of five content words used for our automatic measure may need to be extended. © 2007 SPIE-IS&T.
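The reported r values are sample Pearson correlation coefficients, and squaring them gives the share of variability accounted for (0.795² ≈ 0.63, the "about two thirds" above). A minimal from-scratch version, shown here only to make that statistic concrete (the study's actual data are not reproduced):

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length
    sequences: covariance divided by the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# r squared is the proportion of variance explained, e.g. 0.795**2 ≈ 0.63.
```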