956 results for Applied identity-based encryption
Abstract:
This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted based on the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
Abstract:
Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models.
A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.
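The fixed-lid planar water surface approximation described above can be illustrated with a minimal sketch. The paper's actual grid and elevations are not reproduced here; the bed values and water level below are invented for illustration.

```python
import numpy as np

def planar_depths(bed, water_level):
    """Estimate flow depths under a fixed-lid planar water surface.

    bed: 2D array of bed elevations (m); water_level: scalar elevation (m)
    of the assumed planar water surface. Cells above the water surface
    (negative depth) are treated as dry and clipped to zero.
    """
    return np.maximum(water_level - bed, 0.0)

# Toy cross-section: depths follow the bed topography directly,
# which is why the approximation suits low-slope sand-bed rivers.
bed = np.array([[2.0, 1.0, 0.5, 1.5]])
print(planar_depths(bed, 1.2))
```

Because the water surface is fixed rather than solved for, depth estimation costs a single array subtraction, which is part of why the RC model is so much faster than the shallow-water and RANS models.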
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected from pathologists, with significant clinical implications. OT, however, come in numerous types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in typing less common OT. Method and Results: Representative slides of 20 less common OT were scanned (Nano Zoomer Digital Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235) to include: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on evaluation results; diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or from the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
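The Canberra distance-based comparison of image profiles mentioned above can be sketched briefly. The actual Hue/Edge profile extraction is not reproduced here; the profiles below are invented illustrative values, not data from the seized documents.

```python
def canberra(p, q):
    """Canberra distance between two equal-length intensity profiles.

    Each term |p_i - q_i| / (|p_i| + |q_i|) lies in [0, 1], so differences
    in low-intensity bins weigh as much as those in high-intensity bins.
    Bins where both values are zero contribute nothing.
    """
    total = 0.0
    for a, b in zip(p, q):
        denom = abs(a) + abs(b)
        if denom:
            total += abs(a - b) / denom
    return total

# Hypothetical hue-histogram profiles from three scanned documents:
doc_a = [0.10, 0.40, 0.30, 0.20]
doc_b = [0.12, 0.38, 0.31, 0.19]   # similar to doc_a
doc_c = [0.50, 0.10, 0.10, 0.30]   # dissimilar
# A lower distance suggests a more similar profile, hinting at a common source.
print(canberra(doc_a, doc_b) < canberra(doc_a, doc_c))  # True
```

The per-bin normalisation is what makes the Canberra metric attractive for profile comparison: it is scale-free, so small but consistent differences across many bins still separate the documents.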
Abstract:
This paper deals with the product design, engineering, and material selection intended for the manufacturing of an eco-friendly chair. The final product is expected to combine design attributes with technical and legal feasibility through the implementation of new bio-based materials. Considering the industrial design, a range of objectives and trends was determined after setting the market requirements, and the final concept was proposed and modeled. The product geometry, production technology, and legal specifications were the input data for product engineering. The material selection was based on the technical requirements. Polypropylene (PP) composite materials based on coupled-fiberglass, sized-fiberglass, and coupled-stone ground wood reinforcements were prepared and characterized. Final formulations based on these PP composites are proposed and justified.
Abstract:
This paper uses the possibilities provided by the regression-based inequality decomposition (Fields, 2003) to explore the contribution of different explanatory factors to international inequality in CO2 emissions per capita. In contrast to previous emissions inequality decompositions, which were based on identity relationships (Duro and Padilla, 2006), this methodology does not impose any a priori specific relationship. Thus, it allows an assessment of the contribution to inequality of different relevant variables. In short, the paper appraises the relative contributions of affluence, sectoral composition, demographic factors and climate. The analysis is applied to selected years of the period 1993–2007. The results show the important (though decreasing) share of the contribution of demographic factors, as well as a significant contribution of affluence and sectoral composition.
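The regression-based decomposition of Fields (2003) attributes a share of the variance of the outcome to each regressor as s_j = b_j · cov(x_j, y) / var(y), with the shares of the regressors summing to the regression R². A minimal sketch on synthetic data (the paper's emissions data and variables are not reproduced here):

```python
import numpy as np

def fields_shares(X, y):
    """Fields (2003) regression-based inequality decomposition.

    X: (n, k) matrix of explanatory factors; y: (n,) outcome
    (e.g. log CO2 emissions per capita). Returns the share of Var(y)
    attributed to each factor: s_j = b_j * cov(x_j, y) / var(y).
    The k factor shares sum to the regression R^2.
    """
    Xc = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(Xc, y, rcond=None)[0][1:]   # drop intercept
    vy = np.var(y, ddof=1)                             # match np.cov's ddof
    return np.array([beta[j] * np.cov(X[:, j], y)[0, 1] / vy
                     for j in range(X.shape[1])])

rng = np.random.default_rng(42)
x1 = rng.normal(size=500)            # stand-in for, e.g., affluence
x2 = rng.normal(size=500)            # stand-in for, e.g., sectoral composition
y = 2.0 * x1 + 3.0 * x2              # noiseless toy outcome
shares = fields_shares(np.column_stack([x1, x2]), y)
print(shares.sum())                  # ~1.0: the two factors explain everything
```

Because no identity relationship is imposed, any regressor (demographics, climate, and so on) can be added as a column of X, which is the flexibility the abstract contrasts with identity-based decompositions.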
Abstract:
One of the problems that slows the development of off-line programming is the low static and dynamic positioning accuracy of robots. Robot calibration improves positioning accuracy and can also be used as a diagnostic tool in robot production and maintenance. A large number of robot measurement systems are now available commercially, yet there is a dearth of systems that are portable, accurate and low cost. In this work a measurement system that can fill this gap in local calibration is presented. The measurement system consists of a single CCD camera mounted on the robot tool flange with a wide-angle lens, and uses space resection models to measure the end-effector pose relative to a world coordinate system, taking radial distortions into account. Scale factors and the image center are obtained with innovative techniques, making use of a multiview approach. The target plate consists of a grid of white dots impressed on black photographic paper, mounted on the sides of a 90-degree angle plate. Results show that the achieved average accuracy varies from 0.2 mm to 0.4 mm at target distances of 600 mm to 1000 mm, respectively, with different camera orientations.
Abstract:
Greenhouse studies were conducted in 2008-2009 with the objective of adjusting dose-response curves of the main soil-applied herbicides currently used in cotton for the control of Amaranthus viridis, A. hybridus, A. spinosus and A. lividus, as well as comparing susceptibility among the different species using model identity tests. Thirty-six individual experiments were carried out simultaneously in a greenhouse, in a sandy clay loam soil (21% clay, 2.36% OM), combining increasing doses of the herbicides alachlor, clomazone, diuron, oxyfluorfen, pendimethalin, prometryn, S-metolachlor, and trifluralin applied to each species. Dose-response curves were adjusted for visual weed control at 28 days after herbicide application, and the doses required for 80% (C80) and 95% (C95) control were calculated. All herbicides, except clomazone and trifluralin, provided efficient control of most Amaranthus species, but substantial differences in susceptibility to herbicides were found. In general, A. lividus was the least sensitive species, whereas A. spinosus demonstrated the highest sensitivity to herbicides. Alachlor, diuron, oxyfluorfen, pendimethalin, S-metolachlor, and prometryn are efficient alternatives to control Amaranthus spp. at doses lower than those currently recommended for cotton.
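Once a dose-response curve has been fitted, the C80 and C95 doses mentioned above follow by inverting it. The sketch below assumes a rising log-logistic (Hill) curve, control(x) = 100·x^b / (ed50^b + x^b); the ed50 and slope values are hypothetical, not fitted parameters from the paper.

```python
def control_dose(target, ed50, b):
    """Dose giving `target` percent control on a rising log-logistic curve.

    control(x) = 100 * x**b / (ed50**b + x**b); solving for x gives
    x = ed50 * (target / (100 - target)) ** (1 / b).
    """
    return ed50 * (target / (100.0 - target)) ** (1.0 / b)

# With a hypothetical ed50 of 100 g/ha and slope b = 2:
c80 = control_dose(80, 100, 2)   # dose for 80% control
c95 = control_dose(95, 100, 2)   # dose for 95% control
print(round(c80), round(c95))    # 200 436
```

Note how quickly the required dose grows between C80 and C95: the upper tail of the sigmoid is flat, which is why near-complete control can demand disproportionately high doses.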
Abstract:
Vertebrate gap junctions are aggregates of transmembrane channels which are composed of connexin (Cx) proteins encoded by at least fourteen distinct genes in mammals. Since the same Cx type can be expressed in different tissues and more than one Cx type can be expressed by the same cell, the thorough identification of which connexin is in which cell type and how connexin expression changes after experimental manipulation has become quite laborious. Here we describe an efficient, rapid and simple method by which connexin type(s) can be identified in mammalian tissue and cultured cells using endonuclease cleavage of RT-PCR products generated from "multi primers" (sense primer, degenerate oligonucleotide corresponding to a region of the first extracellular domain; antisense primer, degenerate oligonucleotide complementary to the second extracellular domain) that amplify the cytoplasmic loop regions of all known connexins except Cx36. In addition, we provide sequence information on RT-PCR primers used in our laboratory to screen individual connexins and predictions of extension of the "multi primer" method to several human connexins.
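A degenerate oligonucleotide like the "multi primers" described above encodes a pool of concrete sequences via the standard IUPAC ambiguity codes. The paper's actual primer sequences are not reproduced here; the 6-mer below is a made-up example.

```python
from itertools import product

# Standard IUPAC degenerate nucleotide codes (subset sufficient here)
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "GC", "W": "AT",
         "K": "GT", "M": "AC", "N": "ACGT"}

def expand_degenerate(primer):
    """Enumerate every concrete sequence a degenerate oligo encodes."""
    return ["".join(p) for p in product(*(IUPAC[base] for base in primer))]

# Hypothetical degenerate 6-mer (R = A/G, Y = C/T):
print(expand_degenerate("GARTTY"))
# ['GAATTC', 'GAATTT', 'GAGTTC', 'GAGTTT']
```

The pool size multiplies with each ambiguous position (here 2 × 2 = 4), which is why degenerate primers targeting the conserved extracellular domains can anneal to many connexin genes at once.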
Abstract:
Coronary artery disease (CAD) is a worldwide leading cause of death. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique which is invasive, time-consuming, and costly. There are noninvasive approaches for the early detection of CAD. The basis for the noninvasive diagnosis of CAD has been laid in a sequential analysis of the risk factors and of the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially utilized in clinical practice due to the difficulty of properly classifying patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model to select patients for MPS based on fuzzy set theory. A group of 1053 patients was used to develop the model and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinions, and showed that the performance of the fuzzy model was equal or superior to that of the physicians. Therefore, we conclude that the fuzzy model could be a useful tool to assist the general practitioner in the selection of patients for MPS.
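The key fuzzy-logic idea behind such a selection model is that a patient belongs to overlapping likelihood classes with graded membership rather than one crisp class. The sketch below uses triangular membership functions with illustrative cut-offs; the paper's actual fuzzy sets and rules are not reproduced here.

```python
def trimf(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, peak (1.0) at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over pre-test likelihood of CAD (0-100%);
# the breakpoints are invented for illustration, not taken from the paper.
def likelihood_memberships(p):
    return {"low": trimf(p, -1, 0, 35),
            "intermediate": trimf(p, 15, 50, 85),
            "high": trimf(p, 65, 100, 101)}

m = likelihood_memberships(40)
# A patient at 40% belongs mostly to "intermediate": the group for
# which MPS is considered most appropriate.
print(max(m, key=m.get))  # intermediate
```

Because the sets overlap, a patient at, say, 30% has nonzero membership in both "low" and "intermediate", and a rule base can weigh both memberships instead of forcing a hard threshold.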
Abstract:
There is currently little empirical knowledge regarding the construction of a musician’s identity and social class. With a theoretical framework based on Bourdieu’s (1984) distinction theory, Bronfenbrenner’s (1979) theory of ecological systems, and the identity theories of Erikson (1950; 1968) and Marcia (1966), a survey called the Musician’s Social Background and Identity Questionnaire (MSBIQ) is developed to test three research hypotheses related to the construction of a musician’s identity, social class and ecological systems of development. The MSBIQ is administered to the music students at Sibelius Academy of the University of Arts Helsinki and Helsinki Metropolia University of Applied Sciences, representing the ’highbrow’ and the ’middlebrow’ samples in the field of music education in Finland. Acquired responses (N = 253) are analyzed and compared with quantitative methods including Pearson’s chi-square test, factor analysis and an adjusted analysis of variance (ANOVA). The study revealed that (1) the music students at Sibelius Academy and Metropolia construct their subjective musician’s identity differently, but (2) social class does not affect this identity construction process significantly. In turn, (3) the ecological systems of development, especially the individual’s residential location, do significantly affect the construction of a musician’s identity, as well as the age at which one starts to play one’s first musical instrument. Furthermore, a novel finding related to the structure of a musician’s identity was the tripartite model of musical identity consisting of the three dimensions of a musician’s identity: (I) ’the subjective dimension of a musician’s identity’, (II) ’the occupational dimension of a musician’s identity’ and, (III) ’the conservative-liberal dimension of a musician’s identity’. 
According to this finding, a musician’s identity is not a uniform, coherent entity, but a structure consisting of different elements continuously working in parallel within different dimensions. The results and limitations related to the study are discussed, as well as the objectives related to future studies using the MSBIQ to research the identity construction and social backgrounds of a musician or other performing artists.
Abstract:
In this paper we propose a cryptographic transformation based on matrix manipulations for image encryption. Substitution and diffusion operations, based on the matrix, facilitate fast conversion of plaintext and images into ciphertext and cipher images. The paper describes the encryption algorithm, discusses the simulation results, and compares them with results obtained from the Advanced Encryption Standard (AES). It is shown that the proposed algorithm is capable of encrypting images eight times faster than AES.
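The abstract does not detail the matrix manipulations, so the following is only a generic sketch of one substitution-diffusion round on a byte matrix, showing why such a round is invertible. It is not the paper's cipher, and it is far too weak for real use.

```python
import numpy as np

def encrypt(block, sbox, key):
    """Toy substitution-diffusion round on a byte matrix.

    Substitution: each byte goes through an invertible S-box (a
    permutation of 0..255). Diffusion: XOR each row with a rotated
    key matrix, then roll rows so a change spreads positionally.
    """
    sub = sbox[block]                          # byte-wise substitution
    mixed = sub ^ np.roll(key, 1, axis=1)      # key diffusion (XOR)
    return np.roll(mixed, 1, axis=0)           # positional diffusion

def decrypt(ct, sbox, key):
    """Undo the round: reverse each step in the opposite order."""
    inv = np.argsort(sbox).astype(np.uint8)    # inverse S-box
    mixed = np.roll(ct, -1, axis=0)
    return inv[mixed ^ np.roll(key, 1, axis=1)]

rng = np.random.default_rng(1)
sbox = rng.permutation(256).astype(np.uint8)
key = rng.integers(0, 256, (4, 4), dtype=np.uint8)
pt = rng.integers(0, 256, (4, 4), dtype=np.uint8)
print(np.array_equal(decrypt(encrypt(pt, sbox, key), sbox, key), pt))  # True
```

The speed appeal of matrix-based designs is visible even in this toy: the whole round is three vectorised array operations over the image block, with no per-byte round-function loop.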
Abstract:
In symmetric block ciphers, substitution and diffusion operations are performed in multiple rounds using sub-keys generated from a key generation procedure called the key schedule. The key schedule plays a very important role in deciding the security of block ciphers. In this paper we propose a complex key generation procedure, based on matrix manipulations, which could be introduced in symmetric ciphers. The proposed key generation procedure offers two advantages. First, the procedure is simple to implement, yet determining the sub-keys through cryptanalysis is complex. Secondly, the procedure produces a strong avalanche effect, causing many bits in the output block of a cipher to change with a one-bit change in the secret key. As a case study, the matrix-based key generation procedure has been introduced into the Advanced Encryption Standard (AES) by replacing the existing key schedule of AES. The key avalanche and differential key propagation produced in AES have been observed. The paper describes the matrix-based key generation procedure and the enhanced key avalanche and differential key propagation produced in AES. It has been shown that the key avalanche effect and differential key propagation characteristics of AES are improved by replacing the AES key schedule with the matrix-based key generation procedure.
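The key avalanche effect described above is typically measured by flipping a single key bit and counting how many output bits change. The paper's matrix key schedule is not reproduced here, so the sketch below uses SHA-256 as a stand-in keyed transformation; for a strong avalanche the flipped fraction should hover near 0.5.

```python
import hashlib

def avalanche_fraction(key: bytes, bit: int) -> float:
    """Fraction of output bits that flip when one key bit is flipped.

    SHA-256 is used here only as a stand-in transformation with good
    avalanche behaviour; the paper's matrix-based key schedule would
    be measured the same way.
    """
    flipped = bytearray(key)
    flipped[bit // 8] ^= 1 << (bit % 8)        # flip exactly one key bit
    a = hashlib.sha256(key).digest()
    b = hashlib.sha256(bytes(flipped)).digest()
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (len(a) * 8)

frac = avalanche_fraction(b"sixteen byte key", 0)
print(0.25 < frac < 0.75)  # near 0.5 indicates a strong avalanche
```

Averaging this fraction over every key bit position (and many keys) gives the key avalanche statistic that the case study compares between the original AES key schedule and the matrix-based replacement.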