1000 results for Part
Abstract:
This paper reviews the fingerprint classification literature, looking at the problem from a double perspective. We first deal with feature extraction methods, including the different models considered for singular point detection and for orientation map extraction. Then, we focus on the different learning models considered to build the classifiers used to label new fingerprints. Taxonomies and classifications for the feature extraction, singular point detection, orientation extraction and learning methods are presented. A critical view of the existing literature has led us to present a discussion of the existing methods and their drawbacks, such as the difficulty of reimplementing them, the lack of details, and major differences in their evaluation procedures. On this account, an experimental analysis of the most relevant methods is carried out in the second part of this paper, and a new method based on their combination is presented.
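To make the feature-extraction side of the review concrete, the following is a minimal sketch of block-wise orientation-map estimation from image gradients, in the spirit of the classical least-squares approach that many of the surveyed methods build on; it assumes NumPy, and the synthetic test image and 16x16 block size are illustrative choices, not values from the paper.

import numpy as np

def orientation_field(img, block=16):
    # Gradients along rows (y) and columns (x).
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            bx = gx[i:i + block, j:j + block]
            by = gy[i:i + block, j:j + block]
            # Doubled-angle averaging of gradient directions; the ridge
            # orientation is perpendicular to this dominant gradient angle.
            num = 2.0 * np.sum(bx * by)
            den = np.sum(bx ** 2 - by ** 2)
            theta[i // block, j // block] = 0.5 * np.arctan2(num, den)
    return theta

img = np.fromfunction(lambda y, x: np.sin(0.3 * x + 0.1 * y), (128, 128))
print(orientation_field(img).shape)  # one orientation per 16x16 block: (8, 8)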
Abstract:
In the first part of this paper we reviewed the fingerprint classification literature from two different perspectives: feature extraction and classifier learning. Aiming to answer the question of which of the reviewed methods would perform best in a real implementation, we ended up with a discussion that showed the difficulty of answering this question: no previous comparison exists in the literature, and comparisons among papers are made with different experimental frameworks. Moreover, published methods are difficult to implement because of the lack of detail in their descriptions and parameters and the fact that no source code is shared. For this reason, in this paper we carry out a deep experimental study following the proposed double perspective. To do so, we have carefully implemented some of the most relevant feature extraction methods according to the explanations found in the corresponding papers, and we have tested their performance with different classifiers, including the specific proposals made by the authors. Our aim is to develop an objective experimental study in a common framework, which has not been done before and which can serve as a baseline for future works on the topic. This way, we test not only their quality but also their reusability by other researchers, and we are able to indicate which proposals could be considered for future developments. Furthermore, we show that combining different feature extraction models in an ensemble can lead to superior performance, significantly improving on the results obtained by individual models.
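As an illustration of the ensemble idea discussed above, here is a minimal sketch that pairs two feature views with different classifiers and combines their class probabilities by soft voting; it assumes scikit-learn, and the generated data merely stands in for real fingerprint features such as orientation maps or singular points.

from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: 300 "fingerprints", 5 classes (arch, tented arch,
# left loop, right loop, whorl in the usual Henry scheme).
X, y = make_classification(n_samples=300, n_features=64, n_informative=12,
                           n_classes=5, random_state=0)

# Each pipeline plays the role of one feature-extraction model paired
# with a classifier; soft voting averages their class probabilities.
ensemble = VotingClassifier(
    estimators=[
        ("svm_view", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("knn_view", make_pipeline(StandardScaler(), KNeighborsClassifier(5))),
    ],
    voting="soft",
)
ensemble.fit(X[:240], y[:240])
print("held-out accuracy:", ensemble.score(X[240:], y[240:]))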
Abstract:
Two classes of techniques have been developed to whiten the quantization noise in digital delta-sigma modulators (DDSMs): deterministic and stochastic. In this two-part paper, a design methodology for reduced-complexity DDSMs is presented. The design methodology is based on error masking. Rules for selecting the word lengths of the stages in multistage architectures are presented. We show that the hardware requirement can be reduced by up to 20% compared with a conventional design, without sacrificing performance. Simulation and experimental results confirm theoretical predictions. Part I addresses MultistAge noise SHaping (MASH) DDSMs; Part II focuses on single-quantizer DDSMs.
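For readers unfamiliar with the MASH architecture, the following is a minimal sketch of a conventional MASH 1-1-1 DDSM with a constant input; the 16-bit word length and input value are illustrative, and the error-cancellation network implements y = c1 + (1 - z^-1)c2 + (1 - z^-1)^2 c3.

def mash111(x, nbits=16, nsamples=1 << 14):
    """MASH 1-1-1 DDSM with constant input x, 0 <= x < 2**nbits."""
    M = 1 << nbits
    s1 = s2 = s3 = 0              # accumulator residues (quantization errors)
    c2p, c3p, c3pp = 0, 0, 0      # delayed carries for the cancellation network
    y = []
    for _ in range(nsamples):
        s1 += x;  c1, s1 = divmod(s1, M)    # stage 1: carry out + residue
        s2 += s1; c2, s2 = divmod(s2, M)    # stage 2 requantizes stage-1 error
        s3 += s2; c3, s3 = divmod(s3, M)    # stage 3 requantizes stage-2 error
        # error cancellation: y = c1 + (1 - z^-1) c2 + (1 - z^-1)^2 c3
        y.append(c1 + (c2 - c2p) + (c3 - 2 * c3p + c3pp))
        c2p, c3p, c3pp = c2, c3, c3p
    return y

out = mash111(x=12345)
print(sum(out) / len(out), 12345 / 2**16)  # long-run mean tracks the input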
Abstract:
For pt. I see ibid., vol. 44, p. 927-36 (1997). In a digital communications system, data are transmitted from one location to another by mapping bit sequences to symbols, and symbols to sample functions of analog waveforms. The analog waveform passes through a bandlimited (possibly time-varying) analog channel, where the signal is distorted and noise is added. In a conventional system the analog sample functions sent through the channel are weighted sums of one or more sinusoids; in a chaotic communications system the sample functions are segments of chaotic waveforms. At the receiver, the symbol may be recovered by means of coherent detection, where all possible sample functions are known, or by noncoherent detection, where one or more characteristics of the sample functions are estimated. In a coherent receiver, synchronization is the most commonly used technique for recovering the sample functions from the received waveform. These sample functions are then used as reference signals for a correlator. Synchronization-based coherent receivers have advantages over noncoherent receivers in terms of noise performance, bandwidth efficiency (in narrow-band systems) and/or data rate (in chaotic systems). These advantages are lost if synchronization cannot be maintained, for example, under poor propagation conditions. In these circumstances, communication without synchronization may be preferable. The theory of conventional telecommunications is extended to chaotic communications, chaotic modulation techniques and receiver configurations are surveyed, and chaotic synchronization schemes are described.
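As a toy illustration of coherent detection in a chaotic system, the sketch below transmits antipodal chaos-shift-keyed symbols and recovers them with a correlator, assuming perfect synchronization (the receiver knows the chaotic reference exactly); the logistic map, spreading factor and noise level are illustrative choices, not the paper's.

import random

def logistic_segment(x, n):
    """n samples of the logistic map x <- 4x(1-x), shifted to zero mean."""
    seg = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        seg.append(x - 0.5)
    return x, seg

random.seed(1)
bits = [random.choice([-1, +1]) for _ in range(200)]
spread, noise_std, x = 32, 0.4, 0.3
errors = 0
for b in bits:
    x, ref = logistic_segment(x, spread)                    # chaotic sample function
    rx = [b * r + random.gauss(0, noise_std) for r in ref]  # channel + AWGN
    corr = sum(r, s) if False else sum(r * s for r, s in zip(ref, rx))  # coherent correlator
    errors += (corr > 0) != (b > 0)
print(f"BER ~ {errors / len(bits):.3f}")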
Abstract:
In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns"), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, he was critical of the application of probability even to litigation in civil cases; take Boole, he was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, the founding father of computing, nowadays computer scientists approaching the field may happen to do so without full awareness of the pitfalls. Hence, the usefulness of the conceptual landscape I sketch here.
Abstract:
For the purposes of starting to tackle, within artificial intelligence (AI), the narrative aspects of legal narratives in a criminal evidence perspective, traditional AI models of narrative understanding can arguably supplement extant models of legal narratives from the scholarly literature of law, jury studies, or the semiotics of law. Moreover, the literary (or cinematic) models prominent in a given culture impinge, with their poetic conventions, on the way members of the culture make sense of the world. This shows glaringly in the sample narrative from the Continent we analyse in this paper: the Jama murder, the inquiry, and the public outcry. Apparently in the same racist crime category as the case of Stephen Lawrence's murder (in Greenwich on 22 April 1993), with the ensuing still current controversy in the UK, the Jama case (some 20 years ago) stood apart because of a very unusual element: the eyewitnesses identifying the suspects were a group of football referees and linesmen who were eating together at a restaurant and saw the sleeping man as he was set ablaze in a public park nearby. Their professional background as witnesses-cum-factfinders in a mass sport, and public perceptions of its required characteristics, couldn't but feature prominently in the public perception of the case, even more so as the suspects were released by the magistrate conducting the inquiry. There are sides to this case that involve different expected effects in an inquisitorial criminal procedure system from the Continent, where an investigating magistrate leads the inquiry and prepares the prosecution case, as opposed to trial by jury under the Anglo-American adversarial system. In the JAMA prototype, we tried to approach the given case from the coign of vantage of narrative models from AI.
Abstract:
In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, I reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: 'The JAMA Model and Narrative Interpretation Patterns'), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for Artificial Intelligence (AI) researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars; nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, he was critical of the application of probability even to litigation in civil cases; take Boole, he was a starry-eyed believer in probability applications to judicial decision making. Not unlike Boole, the founding father of computing, nowadays computer scientists approaching the field may happen to do so without full awareness of the pitfalls. Hence, the usefulness of the conceptual landscape I sketch here.
Abstract:
The main goal of a cell stability MHD model like MHD-Valdis is to help locate the busbars around the cell in a way that generates a magnetic field inside the cell which itself leads to stable cell operation. Yet as far as cell stability is concerned, the uniformity of the current density in the metal pad is also extremely important, and it can only be achieved with a correct busbar network sizing. This work compares the use of a detailed ANSYS-based 3D thermo-electric model with that of the versatile 1D part of MHD-Valdis to help design a well-balanced busbar network.
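As a back-of-the-envelope illustration of the 1D sizing idea, the sketch below splits a total line current among parallel busbar branches in proportion to their conductances, showing that a uniform current split requires cross-sections sized to equalize branch resistance; the dimensions, current and resistivity are illustrative, not values from the paper.

RHO_AL = 2.8e-8          # aluminium resistivity, ohm*m (illustrative)

def branch_currents(total_i, lengths_m, areas_m2):
    """Current in each parallel busbar branch of given length and area."""
    conductances = [a / (RHO_AL * l) for l, a in zip(lengths_m, areas_m2)]
    g_total = sum(conductances)
    return [total_i * g / g_total for g in conductances]

# Three branches of unequal length; equal cross-sections give unequal currents...
print(branch_currents(100e3, [2.0, 3.0, 4.0], [0.010, 0.010, 0.010]))
# ...while cross-sections proportional to length restore a uniform split.
print(branch_currents(100e3, [2.0, 3.0, 4.0], [0.010, 0.015, 0.020]))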
Abstract:
In 1998, Swissair Flight 111 (SR111) developed an in-flight fire shortly after take-off, which resulted in the loss of the aircraft, a McDonnell Douglas MD-11, and all passengers and crew. The Transportation Safety Board (TSB) of Canada, Fire and Explosion Group launched a four-year investigation into the incident in an attempt to understand the cause and the subsequent mechanisms which led to the rapid spread of the in-flight fire. As part of this investigation, the SMARTFIRE Computational Fluid Dynamics (CFD) software was used to predict the 'possible' development of the fire and associated smoke movement. In this paper the CFD fire simulations are presented and model predictions are compared with key findings from the investigation. The model predictions are shown to be consistent with a number of the investigation findings associated with the early stages of the fire development. The analysis makes use of simulated pre-fire airflow conditions within the MD-11 cockpit and above-ceiling region presented in an earlier publication (Part 1), published in The Aeronautical Journal in January 2006 (4).
Abstract:
In this paper an introduction is given to the history, current situation and future plans of China's railway industry. The history of China's railway is divided into four development phases: the phase in Imperial China, the phase in the Republic of China and the phases before and after the economic rejuvenation of the People's Republic of China. An introduction to the current situation and future plans includes the major projects under construction and development trends of China's railways. The environment of China's railways is also presented. This is the first of two papers on the railway scene in China.