986 results for cold finger system
Abstract:
We present measurements of J/psi yields in d + Au collisions at √s_NN = 200 GeV recorded by the PHENIX experiment and compare them with yields in p + p collisions at the same energy per nucleon-nucleon collision. The measurements cover a large kinematic range in J/psi rapidity (-2.2 < y < 2.4) with high statistical precision and are compared with two theoretical models: one with nuclear shadowing combined with final-state breakup and one with coherent gluon-saturation effects. In order to remove model-dependent systematic uncertainties, we also compare the data to a simple geometric model. The forward-rapidity data are inconsistent with nuclear modifications that are linear or exponential in the density-weighted longitudinal thickness, such as those from the final-state breakup of the bound state.
Abstract:
We present a new analysis of J/psi production yields in deuteron-gold collisions at √s_NN = 200 GeV using data taken by the PHENIX experiment in 2003 and previously published in S. S. Adler et al. [Phys. Rev. Lett. 96, 012304 (2006)]. The high-statistics proton-proton J/psi data taken in 2005 are used to improve the baseline measurement and thus construct updated cold nuclear matter modification factors (R_dAu). A suppression of J/psi in cold nuclear matter is observed as one goes forward in rapidity (in the deuteron-going direction), corresponding to a region more sensitive to initial-state low-x gluons in the gold nucleus. The measured nuclear modification factors are compared to theoretical calculations of nuclear shadowing to which a J/psi (or precursor) breakup cross section is added. Breakup cross sections of sigma_breakup = 2.8(-1.4)(+1.7) (2.2(-1.5)(+1.6)) mb are obtained by fitting these calculations to the data using two different models of nuclear shadowing. These breakup cross-section values are consistent, within large uncertainties, with the 4.2 +/- 0.5 mb determined at lower collision energies. Projecting this range of cold nuclear matter effects to copper-copper and gold-gold collisions reveals that the current constraints are not sufficient to firmly quantify the additional hot nuclear matter effect.
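The nuclear modification factor R_dAu used above is conventionally the yield in d + Au divided by the p + p yield scaled by the mean number of binary nucleon-nucleon collisions. A minimal sketch with illustrative numbers (the function name and values are ours, not PHENIX data):

```python
def r_dau(yield_dau: float, yield_pp: float, n_coll: float) -> float:
    """Nuclear modification factor R_dAu = Y_dAu / (<N_coll> * Y_pp).
    R_dAu = 1 means no nuclear effects; R_dAu < 1 indicates suppression."""
    return yield_dau / (n_coll * yield_pp)

# Illustrative values only:
print(r_dau(yield_dau=7.2, yield_pp=1.0, n_coll=8.0))  # → 0.9 (suppression)
```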
Abstract:
A simple and reliable method for Hg determination in fish samples has been developed. Lyophilised fish tissue samples were extracted in a 25% (w/v) tetramethylammonium hydroxide (TMAH) solution; the extracts were then analysed by FI-CVAFS. This method can be used to determine total and inorganic Hg using the same FI manifold. For total Hg determination, a 0.1% (w/v) KMnO4 solution was added to the FI manifold at the sample zone, followed by the addition of a 0.5% (w/v) SnCl2 solution, whereas inorganic Hg was determined by adding a 0.1% (w/v) L-cysteine solution followed by a 1.0% (w/v) SnCl2 solution to the FI system. The organic fraction was determined as the difference between total and inorganic Hg. Sample preparation, reagent consumption and parameters that can influence the FI-CVAFS performance were also evaluated. The limit of detection for this method is 3.7 ng g-1 for total Hg and 4.3 ng g-1 for inorganic Hg. The relative standard deviation was 1.1% for a 1.0 µg L-1 CH3Hg standard solution (n = 20) and 1.3% for a 1.0 µg L-1 Hg2+ standard solution (n = 20). Accuracy was assessed by the analysis of Certified Reference Material (dogfish: DORM-2, NRCC). Recoveries of 99.1% for total Hg and 93.9% for inorganic Hg were obtained. Mercury losses were not observed when sample solutions were re-analysed after a seven-day period of storage at 4 °C.
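As noted above, the organic fraction is obtained by difference and accuracy is expressed as recovery against a certified value. A minimal arithmetic sketch (function names and numbers are illustrative, not taken from the paper):

```python
def organic_hg(total_ng_g: float, inorganic_ng_g: float) -> float:
    """Organic Hg, determined as the difference between total and inorganic Hg."""
    return total_ng_g - inorganic_ng_g

def recovery_pct(measured: float, certified: float) -> float:
    """Percent recovery of a measured value against a certified reference value."""
    return 100.0 * measured / certified

print(organic_hg(120.0, 45.0))    # → 75.0 (ng/g of organic Hg)
print(recovery_pct(99.1, 100.0))  # percent recovery
```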
Abstract:
A simple method with a fast sample-preparation procedure for total and inorganic mercury determinations in blood samples is proposed, based on flow injection cold vapor inductively coupled plasma mass spectrometry (FI-CVICP-MS). Aliquots of whole blood (500 mL) are diluted 1 + 1 v/v with 10.0% v/v tetramethylammonium hydroxide (TMAH) solution, incubated for 3 h at room temperature and then further diluted 1 + 4 v/v with 2.0% v/v HCl. The inorganic Hg was released by online addition of L-cysteine and then reduced to elemental Hg by SnCl2. Total mercury, in turn, was determined by online addition of KMnO4 followed by reduction to elemental Hg by NaBH4. Calibration was performed against matrix-matched standards. The method detection limit was found to be 0.80 µg L-1 and 0.08 µg L-1 for inorganic and total mercury, respectively. Sample throughput is 20 samples h-1. The method accuracy is traceable to Standard Reference Material (SRM) 966 Toxic Metals in Bovine Blood from the National Institute of Standards and Technology (NIST). For additional validation, human whole blood samples were analyzed by the proposed method and by an established CV-AAS method, with no statistical difference between the two techniques at the 95% confidence level on applying the t-test.
Abstract:
We describe the classical two-dimensional nonlinear dynamics of cold atoms in far-off-resonant donut beams. We show that chaotic dynamics exists there for charge greater than unity when the intensity of the beam is periodically modulated. The two-dimensional distributions of atoms in the (x,y) plane for charge 2 are simulated. We show that the atoms will accumulate on several ring regions when the system enters a regime of global chaos. [S1063-651X(99)03903-3].
Abstract:
A number of theoretical and experimental investigations have been made into the nature of purlin-sheeting systems over the past 30 years. These systems commonly consist of cold-formed zed- or channel-section purlins connected to corrugated sheeting. They have proven difficult to model due to the complexity of both the purlin deformation and the restraint provided to the purlin by the sheeting. Part 1 of this paper presented a non-linear elasto-plastic finite element model which, by incorporating both the purlin and the sheeting in the analysis, allowed the interaction between the two components of the system to be modelled. This paper presents a simplified version of the first model which has considerably decreased requirements in terms of computer memory, running time and data preparation. The Simplified Model includes only the purlin but allows for the sheeting's shear and rotational restraints by modelling these effects as springs located at the purlin-sheeting connections. Two accompanying programs determine the stiffness of these springs numerically. As in the Full Model, the Simplified Model is able to account for the cross-sectional distortion of the purlin, the shear and rotational restraining effects of the sheeting, and failure of the purlin by local buckling or yielding. The model requires no experimental or empirical input and its validity is shown by its good correlation with experimental results. (C) 1997 Elsevier Science Ltd.
Abstract:
We propose two quantum error-correction schemes which increase the maximum storage time for qubits in a system of cold-trapped ions, using a minimal number of ancillary qubits. Both schemes consider only the errors introduced by the decoherence due to spontaneous emission from the upper levels of the ions. Continuous monitoring of the ion fluorescence is used in conjunction with selective coherent feedback to eliminate these errors immediately following spontaneous emission events.
Abstract:
The Egr proteins, Egr-1, Egr-2, Egr-3 and Egr-4, are closely related members of a subclass of immediate early gene-encoded, inducible transcription factors. They share a highly homologous DNA-binding domain which recognises an identical DNA response element. In addition, they have several less-well conserved structural features in common. As immediate early proteins, the Egr transcription factors are rapidly induced by diverse extracellular stimuli within the nervous system in a discretely controlled manner. The basal expression of the Egr proteins in the developing and adult rat brain, and the induction of Egr proteins by neurotransmitter-analogue stimulation, physiological mimetic and brain-injury paradigms, are reviewed. We review evidence indicating that Egr proteins are subject to tight differential control through diverse mechanisms at several levels of regulation. These include transcriptional, translational and posttranslational (including glycosylation, phosphorylation and redox) mechanisms and protein-protein interaction. Ultimately the differentially co-ordinated Egr response may lead to discrete effects on target gene expression. Some of the known target genes of Egr proteins and functions of the Egr proteins in different cell types are also highlighted. Future directions for research into the control and function of the different Egr proteins are also explored. (C) 1997 Elsevier Science Ltd.
Abstract:
We analyzed the mouse Representative Transcript and Protein Set for molecules involved in brain function. We found full-length cDNAs of many known brain genes and discovered new members of known brain gene families, including Family 3 G-protein coupled receptors, voltage-gated channels, and connexins. We also identified previously unknown candidates for secreted neuroactive molecules. The existence of a large number of unique brain ESTs suggests an additional molecular complexity that remains to be explored. A list of genes containing CAG stretches in the coding region represents a first step in the potential identification of candidates for hereditary neurological disorders.
Abstract:
The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability, requiring the acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive 1-lead ECG setup, using gel-free Ag/AgCl electrodes as the interface with the skin. The collected signal is significantly noisier than the ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, which comprises the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Through a simple minimum-distance criterion between the test patterns and the enrollment database, results have revealed this to be a promising technique for biometric applications.
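The final step described above, a minimum-distance match of the normalized test heartbeat against the enrollment database, can be sketched as follows (a toy illustration with invented templates, not the authors' implementation):

```python
import numpy as np

def identify(test_beat: np.ndarray, enrollment: dict) -> str:
    """Return the enrolled subject whose heartbeat template is closest
    to the test pattern under the Euclidean distance."""
    return min(enrollment,
               key=lambda subject: np.linalg.norm(test_beat - enrollment[subject]))

# Toy templates: amplitude- and time-normalized heartbeat waveforms.
templates = {"alice": np.array([0.0, 1.0, 0.2]),
             "bob":   np.array([0.1, 0.5, 0.9])}
print(identify(np.array([0.05, 0.95, 0.25]), templates))  # → alice
```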
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology
Abstract:
Dissertation submitted to obtain the degree of Master in Biomedical Engineering
Abstract:
Magdeburg, University, Faculty of Natural Sciences, Dissertation, 2013
Abstract:
Abstract: In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, the comparison of a fingermark to a fingerprint database and therefore the establishment of a link between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification arises from the growth of these databases, which makes it possible to find impressions that are very similar but that come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be utilized in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation that may be more detailed than one obtained by applying rules established long before the advent of these large databases, or by specialist experience alone. The goal of the present thesis is to evaluate likelihood ratios, computed from the scores of an automated fingerprint identification system, when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true. Moreover, they should support this hypothesis more and more strongly with the addition of information in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors that influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of minutiae. The results show that the likelihood ratios resulting from the use of the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that two impressions were left by the same finger when the impressions actually came from different fingers is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1,000, and 10,000 for 6, 7, and 8 minutiae when the two impressions come from the same finger. These likelihood ratios can therefore be an important aid for decision making. Both positive evolutions linked to the addition of minutiae (a drop in the rates of likelihood ratios that can lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
Abstract:
In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, which allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, and the influence of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (or, for finger number, from its position with respect to other marks) are presented. These results immediately allow case-by-case estimation of the between-finger variability in an operational setting.
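The construction described here (the density of the observed AFIS score under the same-source model divided by its density under the between-finger model) can be sketched with a simple Gaussian approximation. This is a toy model with invented scores; the study itself uses empirical score distributions from the AFIS:

```python
from statistics import NormalDist, mean, stdev

def likelihood_ratio(score: float, same_scores: list, diff_scores: list) -> float:
    """LR = density of the observed score under the within-finger (same-source)
    model over its density under the between-finger (different-source) model."""
    within = NormalDist(mean(same_scores), stdev(same_scores))
    between = NormalDist(mean(diff_scores), stdev(diff_scores))
    return within.pdf(score) / between.pdf(score)

# Toy scores: same-finger comparisons score high, different-finger ones low.
same = [90.0, 95.0, 100.0, 105.0, 110.0]
diff = [10.0, 15.0, 20.0, 25.0, 30.0]
print(likelihood_ratio(100.0, same, diff) > 1)  # True: supports same-source
```

An LR well above 1 supports the same-source hypothesis; an LR well below 1 supports the different-source hypothesis, mirroring the orders of magnitude (100 to 10,000) reported in the abstracts above.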