923 results for FINGERPRINT VERIFICATION
Abstract:
To date, clinical trials of heavy-ion radiotherapy for superficially placed tumors have been carried out six times, and more than 60 selected patients have been treated with 80-100 MeV/u carbon ions supplied by the Heavy Ion Research Facility in Lanzhou (HIRFL) at the Institute of Modern Physics, Chinese Academy of Sciences, since November 2006. A passive irradiation system and a dose optimization method for radiotherapy with carbon-ion beams have been developed. Experimental verification of longitudinally ...
Abstract:
The passive beam delivery system in the superficially placed tumor therapy terminal at the Heavy Ion Research Facility in Lanzhou (HIRFL), which includes two orthogonal dipole magnets as the scanning system, a motor-driven energy degrader as the range shifter, a series of ridge filters as range modulators, and a multileaf collimator, is introduced in detail. The capabilities of its key components and of the whole system have been verified experimentally. Tests of the ridge filter for extending the Bragg peak and of the range shifter for energy adjustment show that both work well. To examine the passive beam delivery system, a beam-shaping experiment was carried out, simulating a three-dimensional (3D) conformal irradiation of a tumor. The encouraging experimental results confirm that 3D layer-stacking conformal irradiation can be performed by means of the passive system. The validation of the beam delivery system establishes a substantial basis for the upcoming clinical trials of superficially placed tumors with heavy ions at the HIRFL therapy terminal.
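The layer-stacking idea mentioned above can be pictured with a small numerical toy: pristine Bragg peaks at several ranges (set by the range shifter) are summed with fitted weights so that the total dose is roughly flat across the target depths. The sketch below is only an illustration under made-up assumptions; the analytic peak shape, depths, and target interval are placeholders, not HIRFL data, and this is not the authors' dose optimization method.

```python
# Toy illustration (not from the paper): layer-stacking builds a spread-out
# Bragg peak by summing range-shifted pristine peaks with fitted weights.
# Peak shape, depths, and target interval below are made-up placeholders.
import numpy as np

def pristine_peak(depth, peak_depth, width=0.4):
    """Crude analytic stand-in for a carbon-ion depth-dose curve:
    a low entrance plateau followed by a narrow Gaussian Bragg peak."""
    plateau = 0.3 * (depth < peak_depth)
    peak = np.exp(-0.5 * ((depth - peak_depth) / width) ** 2)
    return plateau + peak

depths = np.linspace(0.0, 12.0, 600)          # depth in water [cm]
layer_ranges = np.arange(6.0, 10.0, 0.5)      # ranges selected by the range shifter

# Dose matrix: one column per energy layer, one row per depth point.
D = np.stack([pristine_peak(depths, r) for r in layer_ranges], axis=1)

# Fit weights so the summed dose is roughly flat (=1) across the target depths.
target = ((depths >= 6.0) & (depths <= 9.5)).astype(float)
w, *_ = np.linalg.lstsq(D, target, rcond=None)
w = np.clip(w, 0.0, None)                     # crude nonnegativity constraint

sobp = D @ w
print("dose ripple inside target: %.3f" %
      (sobp[target > 0].std() / sobp[target > 0].mean()))
```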
Abstract:
The glass transition temperatures (T_g) of PS/PPO blends with different compositions were studied under various pressures by means of a PVT-100 analyzer. A general relation between T_g and pressure for the PS/PPO system was deduced by fitting the experimental T_g values. The volume-based Couchman equation was tested against these data. It was found that the experimental T_g values do not obey the Couchman equation for the glass transition temperature based on thermodynamic theory. According to our studies, the deviation is caused mainly by neglecting ΔV_mix. (C) 2001 Published by Elsevier Science Ltd.
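For context, the volume-based Couchman mixing rule tested here is commonly written in the form below, where w_i are weight fractions and Δα_i are the thermal-expansion increments of the components at their glass transitions; its derivation assumes the volume of mixing ΔV_mix is negligible, which is precisely the approximation the authors identify as the source of the deviation. This is the commonly cited textbook form, not necessarily the exact expression fitted in the paper.

```latex
% Commonly cited volume-based Couchman relation (assumes \Delta V_{\mathrm{mix}} = 0):
T_g \;=\; \frac{w_1\,\Delta\alpha_1\,T_{g,1} \;+\; w_2\,\Delta\alpha_2\,T_{g,2}}
               {w_1\,\Delta\alpha_1 \;+\; w_2\,\Delta\alpha_2}
```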
Abstract:
Elliott, G. N., Worgan, H., Broadhurst, D. I., Draper, J. H., Scullion, J. (2007). Soil differentiation using fingerprint Fourier transform infrared spectroscopy, chemometrics and genetic algorithm-based feature selection. Soil Biology & Biochemistry, 39 (11), 2888-2896. Sponsorship: BBSRC / NERC RAE2008
Abstract:
Manfred Beckmann, David P. Enot, David P. Overy, and John Draper (2007). Representation, comparison, and interpretation of metabolome fingerprint data for total composition analysis and quality trait investigation in potato cultivars. Journal of Agricultural and Food Chemistry, 55(9), pp. 3444-3451. RAE2008
Abstract:
This paper reviews the fingerprint classification literature, looking at the problem from a double perspective. We first deal with feature extraction methods, including the different models considered for singular point detection and for orientation map extraction. Then, we focus on the different learning models considered to build the classifiers used to label new fingerprints. Taxonomies and classifications for the feature extraction, singular point detection, orientation extraction, and learning methods are presented. A critical view of the existing literature has led us to a discussion of the existing methods and their drawbacks, such as the difficulty of reimplementing them, the lack of details, and major differences in their evaluation procedures. On this account, an experimental analysis of the most relevant methods is carried out in the second part of this paper, and a new method based on their combination is presented.
Abstract:
In the first part of this paper we reviewed the fingerprint classification literature from two different perspectives: feature extraction and classifier learning. Aiming to answer the question of which of the reviewed methods would perform best in a real implementation, we ended with a discussion that showed how difficult this question is to answer. No previous comparison exists in the literature, and comparisons across papers are made with different experimental frameworks. Moreover, implementing published methods is difficult because of missing details in their descriptions and parameters, and because no source code is shared. For this reason, in this paper we carry out an in-depth experimental study following the proposed double perspective. To do so, we have carefully implemented some of the most relevant feature extraction methods according to the explanations found in the corresponding papers, and we have tested their performance with different classifiers, including the specific proposals made by their authors. Our aim is to develop an objective experimental study in a common framework, which has not been done before and which can serve as a baseline for future work on the topic. In this way, we test not only their quality but also their reusability by other researchers, and we can indicate which proposals could be considered for future developments. Furthermore, we show that combining different feature extraction models in an ensemble can lead to superior performance, significantly improving on the results obtained by the individual models.
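As a rough illustration of the kind of combination described above (not the authors' actual pipeline), the sketch below trains one classifier per feature-extraction model and fuses their class probabilities by soft voting. The two feature extractors are hypothetical placeholders standing in for real orientation-map and singular-point representations.

```python
# Hypothetical sketch of an ensemble over several fingerprint feature
# representations: one classifier per representation, fused by averaging
# predicted class probabilities. The feature extractors are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def orientation_map_features(images):
    # Placeholder: a real system would compute block-wise ridge orientations.
    return images.reshape(len(images), -1)[:, ::4]

def singular_point_features(images):
    # Placeholder: a real system would encode core/delta positions and types.
    return images.reshape(len(images), -1)[:, 1::4]

EXTRACTORS = [orientation_map_features, singular_point_features]

def train_ensemble(train_images, train_labels):
    models = []
    for extract in EXTRACTORS:
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(extract(train_images), train_labels)
        models.append((extract, clf))
    return models

def predict_ensemble(models, images):
    # Soft-vote fusion: average the per-model class probabilities.
    probs = [clf.predict_proba(extract(images)) for extract, clf in models]
    fused = np.mean(probs, axis=0)
    return models[0][1].classes_[fused.argmax(axis=1)]
```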
Abstract:
Formal correctness of complex multi-party network protocols can be difficult to verify. While models of specific fixed compositions of agents can be checked against design constraints, protocols that lend themselves to arbitrarily many compositions of agents, such as the chaining of proxies or the peering of routers, are more difficult to verify because they represent potentially infinite state spaces and may exhibit emergent behaviors that might not materialize under any particular fixed composition. We address this challenge by developing an algebraic approach that enables us to reduce arbitrary compositions of network agents to a behaviorally equivalent (with respect to some correctness property), compact, canonical representation that is amenable to mechanical verification. Our approach consists of an algebra and a set of property-preserving rewrite rules for the Canonical Homomorphic Abstraction of Infinite Network protocol compositions (CHAIN). Using CHAIN, an expression over our algebra (i.e., a set of configurations of network protocol agents) can be reduced to another behaviorally equivalent expression (i.e., a smaller set of configurations). Repeated application of such rewrite rules produces a canonical expression which can be checked mechanically. We demonstrate our approach by characterizing deadlock-prone configurations of HTTP agents, as well as by establishing useful properties of an overlay protocol for scheduling MPEG frames and of a protocol for Web intra-cache consistency.
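The core mechanism described here, repeatedly applying property-preserving rewrite rules until a canonical form is reached, can be sketched generically as a fixed-point reduction. The rule below (collapsing a chain of identical proxies to a single one) is an invented stand-in for illustration, not one of CHAIN's actual rules, and the data types do not reflect CHAIN's algebra.

```python
# Generic sketch of rewrite-to-canonical-form: apply rules until none fires.
# A composition is a tuple of agent names; the single rule shown (collapsing
# adjacent identical proxies) is an invented example, not CHAIN's rule set.
from typing import Callable, Optional, Tuple

Composition = Tuple[str, ...]
Rule = Callable[[Composition], Optional[Composition]]

def collapse_duplicate_proxies(comp: Composition) -> Optional[Composition]:
    for i in range(len(comp) - 1):
        if comp[i] == comp[i + 1] == "proxy":
            return comp[:i] + comp[i + 1:]        # proxy;proxy -> proxy
    return None                                    # rule does not apply

def canonicalize(comp: Composition,
                 rules: Tuple[Rule, ...] = (collapse_duplicate_proxies,)) -> Composition:
    changed = True
    while changed:
        changed = False
        for rule in rules:
            reduced = rule(comp)
            if reduced is not None:
                comp, changed = reduced, True
                break
    return comp

# e.g. ('client','proxy','proxy','proxy','server') -> ('client','proxy','server')
print(canonicalize(("client", "proxy", "proxy", "proxy", "server")))
```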
Abstract:
In our previous work, we developed TRAFFIC(X), a specification language for modeling bi-directional network flows, featuring a type system with constrained polymorphism. In this paper, we present two ways to customize the constraint system: (1) with linear inequality constraints, TRAFFIC(X) can describe flows with numeric properties such as MTU (maximum transmission unit), RTT (round trip time), traversal order, and bandwidth allocation over parallel paths; (2) with Boolean predicate constraints, TRAFFIC(X) can describe the routing policies of an IP network. These examples illustrate how to use the customized type system.
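A small sketch of the linear-inequality flavor of constraints may help: when flow segments are composed end to end, the path MTU is the minimum over segments and the path RTT is the sum, and a requirement is a set of inequalities the composed flow must satisfy. The types and names below mimic the idea only; they are not TRAFFIC(X) syntax or its constraint solver.

```python
# Hypothetical illustration of linear-inequality constraints on composed flows.
# Not TRAFFIC(X) syntax: a stand-in showing min/sum composition plus inequality checks.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class FlowSegment:
    mtu: int        # bytes
    rtt_ms: float   # milliseconds

def compose(segments: Iterable[FlowSegment]) -> FlowSegment:
    segments = list(segments)
    return FlowSegment(mtu=min(s.mtu for s in segments),
                       rtt_ms=sum(s.rtt_ms for s in segments))

def satisfies(flow: FlowSegment, min_mtu: int, max_rtt_ms: float) -> bool:
    # Linear inequality constraints: mtu >= min_mtu and rtt <= max_rtt_ms.
    return flow.mtu >= min_mtu and flow.rtt_ms <= max_rtt_ms

path = compose([FlowSegment(1500, 10.0), FlowSegment(1400, 25.0), FlowSegment(9000, 5.0)])
print(path, satisfies(path, min_mtu=1400, max_rtt_ms=50.0))   # True
```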
Abstract:
As the commoditization of sensing, actuation, and communication hardware increases, so does the potential for dynamically tasked sense-and-respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged for many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development, and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain-specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can subsequently be re-translated, if needed, for execution on a wide variety of hardware platforms. snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources under soft real-time constraints; (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain; (3) an execution environment and run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language; and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.
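The "thin-waist tasking language plus static verification" idea can be pictured with a miniature example: high-level services compile down to a small common instruction list, and a static pass rejects tasks that violate a simple safety property before deployment. The instruction set, property, and names below are invented for illustration; they are not snBench's actual ISA or analyses.

```python
# Hypothetical miniature of a thin-waist tasking language with a static check.
# The instruction set and the safety property (every READ targets a declared
# sensor) are invented placeholders, not snBench's actual ISA or analysis.
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Instr:
    op: str            # e.g. "READ", "FILTER", "NOTIFY"
    arg: str = ""

Program = List[Instr]

def static_check(program: Program, declared_sensors: Set[str]) -> List[str]:
    """Return a list of violations; an empty list means the task is accepted."""
    errors = []
    for i, instr in enumerate(program):
        if instr.op == "READ" and instr.arg not in declared_sensors:
            errors.append(f"instr {i}: READ from undeclared sensor '{instr.arg}'")
    return errors

task = [Instr("READ", "camera_lobby"), Instr("FILTER", "motion>0.8"), Instr("NOTIFY", "security")]
print(static_check(task, declared_sensors={"camera_lobby", "door_3"}))   # [] -> accepted
```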