623 results for NIST
Abstract:
A method has been developed for the direct and simultaneous determination of As, Cu, Mn, Sb, and Se in drinking water by electrothermal atomic absorption spectrometry (ETAAS) using a transversely heated graphite tube atomizer (THGA) with longitudinal Zeeman-effect background correction. The thermal behavior of the analytes during the pyrolysis and atomization stages was investigated in 0.028 mol L⁻¹ HNO₃, 0.14 mol L⁻¹ HNO₃, and 1 + 1 (v/v) diluted water, using mixtures of Pd(NO₃)₂ + Mg(NO₃)₂ as the chemical modifier. With 5 μg Pd + 3 μg Mg as the modifier, the pyrolysis and atomization temperatures of the atomizer heating program were fixed at 1400 °C and 2100 °C, respectively, and 20 μL of the water sample (sample + 0.28 mol L⁻¹ HNO₃, 1 + 1, v/v) were dispensed into the graphite tube. Analytical curves were established ranging from 5.00-50.0 μg L⁻¹ for As, Sb, and Se; 10.0-100 μg L⁻¹ for Cu; and 20.0-200 μg L⁻¹ for Mn. The characteristic masses were around 39 pg As, 17 pg Cu, 60 pg Mn, 43 pg Sb, and 45 pg Se, and the lifetime of the tube was around 500 firings. The limits of detection (LOD) based on integrated absorbance (0.7 μg L⁻¹ As, 0.2 μg L⁻¹ Cu, 0.6 μg L⁻¹ Mn, 0.3 μg L⁻¹ Sb, 0.9 μg L⁻¹ Se) were well below the maximum permissible levels established by the Brazilian Food Regulations (decree # 310-ANVS from the Health Department): 50 μg L⁻¹ for As, 1000 μg L⁻¹ for Cu, 2000 μg L⁻¹ for Mn, 5 μg L⁻¹ for Sb, and 50 μg L⁻¹ for Se. The relative standard deviations (n = 12) were typically < 5.3% for As, < 0.5% for Cu, < 2.1% for Mn, < 11.7% for Sb, and < 9.2% for Se. The recoveries of As, Cu, Mn, Sb, and Se added to the mineral water samples varied from 102-111%, 91-107%, 92-109%, 89-97%, and 101-109%, respectively. Accuracy for the determination of As, Cu, Mn, Sb, and Se was checked using the standard reference materials NIST SRM 1640 (Trace Elements in Natural Water) and NIST SRM 1643d (Trace Elements in Water) and 10 mineral water samples. A paired t-test showed that the results were in agreement with the certified values of the standard reference materials at the 95% confidence level.
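The limit of detection and the characteristic mass quoted above are standard ETAAS figures of merit. As a minimal illustration only (not taken from the paper, with purely hypothetical numbers), they are typically computed as in the Python sketch below:

```python
import numpy as np

# Illustrative only: how the ETAAS figures of merit quoted above (LOD and
# characteristic mass) are typically computed. All numbers are hypothetical.

def detection_limit(blank_signals, slope):
    """LOD = 3 x standard deviation of the blank / calibration slope (ug/L)."""
    return 3 * np.std(blank_signals, ddof=1) / slope

def characteristic_mass(analyte_mass_pg, integrated_absorbance):
    """Characteristic mass m0 (pg): the mass giving 0.0044 s of integrated absorbance."""
    return 0.0044 * analyte_mass_pg / integrated_absorbance

blank = np.array([0.0010, 0.0013, 0.0008, 0.0011, 0.0012])  # A*s, hypothetical blank readings
slope = 0.0015                                              # A*s per (ug/L), hypothetical calibration slope
print(detection_limit(blank, slope))        # LOD in ug/L

# 20 uL of a 10 ug/L As standard contains 200 pg of As
print(characteristic_mass(200.0, 0.022))    # ~40 pg, of the same order as the ~39 pg reported for As
```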
Abstract:
The family Verbenaceae comprises about 175 genera and 2300 species, distributed in the tropics and subtropics, mainly in the temperate zone of the southern hemisphere. Lemon verbena (Aloysia triphylla (L'Herit) Britton) is a perennial, bushy plant originally from South America. The essential oil of this plant is used in the pharmaceutical, cosmetic, and perfumery industries. Its therapeutic properties include febrifuge, sedative, stomachic, diuretic, and antispasmodic activities. The present work aimed to identify the chemical composition of the essential oil of Aloysia triphylla leaves. The study was carried out at the Lageado Experimental Farm of the Department of Plant Production-Horticulture, Agronomical Sciences College, São Paulo State University, Botucatu campus. Leaves of lemon verbena from the Medicinal and Aromatic Plant Garden were collected at the end of winter (September 2001). The essential oil was extracted by hydrodistillation in a Clevenger apparatus; 100 g of leaves were used in each extraction, and four extractions of three hours were performed. The essential oils of the leaves were analyzed by gas chromatography-mass spectrometry (GC-MS, Shimadzu QP-5000) equipped with a DB-5 capillary column (30 m × 0.25 mm × 0.25 μm), split ratio 1:35, injector at 220 °C, detector at 230 °C, helium as carrier gas (1.0 mL/min), and temperature programmed from 60 °C to 240 °C at 3 °C/min. The substances were identified by comparing their mass spectra with the GC-MS library data (NIST 62), literature references, and Kovats retention indices. The main constituents of the essential oils were geranial (29.54%), neral (27.01%), limonene (15.93%), geranyl acetate (4.0%), and geraniol (3.96%). This species possesses a high proportion of monoterpenes and a low proportion of sesquiterpenes.
Abstract:
Coriander (Coriandrum sativum L.) is an annual, herbaceous plant belonging to the Apiaceae family. Native to southern Europe and the western Mediterranean region, this herb is cultivated worldwide. The species, rich in linalool, has potential as a source of essential oil and as a medicinal plant. It has been used as an analgesic, carminative, digestive, depurative, anti-rheumatic, and antispasmodic agent. Its fruits (commonly called seeds) are used for flavoring candies and in cookery, perfumery, beverages, and the tobacco industry. The aim of this study was to analyze the chemical composition of the seed essential oil of this species grown in Botucatu, São Paulo, Brazil. The experiment was carried out at the Lageado Experimental Farm, Department of Plant Production, Agronomical Sciences College, São Paulo State University. The fruits were harvested 108 days after sowing. The essential oils were extracted by hydrodistillation in a Clevenger apparatus; 50 g of fruits were used in each extraction, and three extractions of three hours were performed. The essential oils were analyzed by gas chromatography-mass spectrometry (GC-MS, Shimadzu QP-5000) equipped with a DB-5 capillary column (30 m × 0.25 mm × 0.25 μm), split ratio 1:20, injector at 240 °C, detector at 230 °C, helium as carrier gas (1.7 mL/min), and temperature programmed from 40 °C (5 min) to 150 °C at 4 °C/min, then from 150 °C to 280 °C at 8 °C/min. The compounds were identified by comparing their mass spectra with the GC-MS library data (NIST 62), literature references, and Kovats retention indices. The 18 most important components were identified and quantified. The main components of the oil were linalool (77.48%), γ-terpinene (4.64%), α-pinene (3.97%), limonene (1.28%), geraniol (0.64%), and 2-decenal (0.16%).
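In both of the essential-oil abstracts above, compounds are identified in part by Kovats retention indices. As a minimal illustrative sketch (not taken from either paper), the linear retention index of van den Dool and Kratz, the form normally used with temperature-programmed GC runs, can be computed from the retention times of the analyte and of the two n-alkanes bracketing it:

```python
def linear_retention_index(t_analyte, t_alkane_before, t_alkane_after, n_carbons_before):
    """van den Dool & Kratz (linear) retention index for temperature-programmed GC.

    t_* are retention times (min); n_carbons_before is the carbon number of the
    n-alkane eluting just before the analyte."""
    return 100 * n_carbons_before + 100 * (t_analyte - t_alkane_before) / (
        t_alkane_after - t_alkane_before)

# Hypothetical example: analyte eluting between n-decane (C10) and n-undecane (C11)
print(linear_retention_index(t_analyte=12.8, t_alkane_before=12.1,
                             t_alkane_after=13.9, n_carbons_before=10))  # ~1039
```

The computed index is then matched against values tabulated in the literature for a DB-5 (or equivalent) stationary phase.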
Abstract:
Proton computed tomography (pCT) deals with relatively thick targets like the human head or trunk. Thus, the fidelity of pCT as a tool for proton therapy planning depends on the accuracy of the physical formulas used to describe proton interaction with thick absorbers. Although the actual overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes like TRIM/SRIM, MCNPX, and GEANT4 do not agree with each other. Attempts to validate the codes against experimental data for thick absorbers face some difficulties: only a few data sets are available, and the existing ones were acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of a reduced calibration curve, i.e., the range-energy dependence normalized, on the range scale, by the full projected CSDA range for the given initial proton energy in the given material (taken from the NIST PSTAR database) and, on the final proton energy scale, by the given initial energy of the protons. This approach is almost energy and material independent. The results of our analysis are important for pCT development because the contradictions observed at arbitrarily low initial proton energies can now be easily scaled to typical pCT energies. © 2010 American Institute of Physics.
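As described above, the reduced calibration curve normalizes the measured range-energy dependence by the CSDA range at the initial energy (from NIST PSTAR) and the exit energy by the initial energy itself. A minimal sketch of that normalization, with placeholder range values standing in for the PSTAR table, might look like this:

```python
import numpy as np

# Placeholder CSDA-range table for protons in water (energy in MeV, range in g/cm^2);
# in practice these values are taken from the NIST PSTAR database.
energy_MeV = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
csda_range = np.array([2.2,  7.7,   15.8,  26.0,  37.9])

def reduced_point(E_in, E_out, thickness):
    """Map one measurement (initial energy, exit energy, absorber thickness) onto the
    reduced calibration curve: the thickness is normalized by the CSDA range at E_in,
    and the exit energy by E_in itself."""
    R_csda = np.interp(E_in, energy_MeV, csda_range)
    return thickness / R_csda, E_out / E_in

# Hypothetical measurement: 200 MeV protons leaving a 20 g/cm^2 water absorber with ~85 MeV
print(reduced_point(200.0, 85.0, 20.0))   # -> (normalized depth, normalized exit energy)
```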
Abstract:
GEANT4 simulations are essential for the development of medical tomography with proton beams (pCT). In the case of thin absorbers, the latest releases of GEANT4 generate very similar final spectra, which agree well with the results of other popular Monte Carlo codes like TRIM/SRIM or MCNPX. For thick absorbers, however, the disagreements become evident. In part, these disagreements are due to the known contradictions between the NIST PSTAR and SRIM reference data. Therefore, it is interesting to compare the GEANT4 results with each other, with experiment, and with the results of diverse codes in a reduced form, which is free from this kind of doubt. In this work such a comparison is done within the Reduced Calibration Curve concept elaborated for proton beam tomography. © 2010 IEEE.
Abstract:
Biometric characteristics have been used increasingly as a way to identify individuals, mainly for security reasons. Among them, the fingerprint is the most used biometric characteristic around the world, because it is relatively simple and very efficient. In this scenario, there has been a significant increase in the size of the databases containing the fingerprint information needed to recognize a person. The task of classifying fingerprints beforehand has become extremely important, as it dramatically reduces the size of the problem during a search: it is no longer necessary to go through the whole database. Given its importance, many techniques have been developed over the last thirty years to increase the efficiency of the classification process. This project followed the rule-based approach, and the VeriFinger 6.1 Software Development Kit (SDK) was used to assist in the detection of cores and deltas. Additionally, the classification was also implemented by means of the directional map and the Poincaré index. For the experiments, Special Database 4 from the National Institute of Standards and Technology (NIST), a standard in this area, was used.
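The Poincaré index mentioned above is computed from the ridge-orientation (directional) map. A minimal sketch of that computation, assuming an orientation map theta (in radians, defined modulo π) has already been estimated, is shown below; it is illustrative only and does not reproduce the project's rule set or the VeriFinger-assisted detection.

```python
import numpy as np

def poincare_index(theta, i, j):
    """Poincaré index of the ridge-orientation field `theta` (radians, modulo pi)
    at pixel (i, j), computed over its 8-neighborhood.
    |index| ~ 0.5 at a core or delta (with opposite signs), ~ 0 at an ordinary point."""
    # 8-neighbours traversed in a fixed closed loop around (i, j)
    ring = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    angles = [theta[i + di, j + dj] for di, dj in ring]
    total = 0.0
    for k in range(8):
        d = angles[(k + 1) % 8] - angles[k]
        # orientations are only defined modulo pi, so wrap differences into (-pi/2, pi/2]
        if d > np.pi / 2:
            d -= np.pi
        elif d < -np.pi / 2:
            d += np.pi
        total += d
    return total / (2 * np.pi)
```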
Abstract:
The classification of texts has become a major endeavor with so much electronic material available, for it is an essential task in several applications, including search engines and information retrieval. There are different ways to define similarity for grouping similar texts into clusters, as the concept of similarity may depend on the purpose of the task. For instance, in topic extraction similar texts mean those within the same semantic field, whereas in author recognition stylistic features should be considered. In this study, we introduce ways to classify texts employing concepts of complex networks, which may be able to capture syntactic, semantic and even pragmatic features. The interplay between various metrics of the complex networks is analyzed with three applications, namely identification of machine translation (MT) systems, evaluation of the quality of machine translated texts, and authorship recognition. We shall show that topological features of the networks representing texts can enhance the ability to identify MT systems in particular cases. For evaluating the quality of MT texts, on the other hand, high correlation was obtained with methods capable of capturing the semantics. This was expected because the gold standards used are themselves based on word co-occurrence. Notwithstanding, the Katz similarity, which involves semantics and structure in the comparison of texts, achieved the highest correlation with the NIST metric, indicating that in some cases the combination of both approaches can improve the ability to quantify quality in MT. In authorship recognition, again the topological features were relevant in some contexts, though for the books and authors analyzed good results were obtained with semantic features as well. Because hybrid approaches encompassing semantic and topological features have not been extensively used, we believe that the methodology proposed here may be useful to enhance text classification considerably, as it combines well-established strategies. © 2012 Elsevier B.V. All rights reserved.
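The Katz similarity mentioned above weighs all paths between two nodes of the text network, penalizing longer ones. A minimal sketch of its standard matrix form (not the authors' implementation; the network construction and the comparison against the NIST metric are omitted) is:

```python
import numpy as np

def katz_similarity(A, alpha=None):
    """Katz similarity matrix S = sum_{k>=1} alpha^k A^k = (I - alpha*A)^(-1) - I.

    A is the adjacency matrix of the text network; alpha must be below
    1/lambda_max for the series to converge."""
    A = np.asarray(A, dtype=float)
    lam_max = np.max(np.abs(np.linalg.eigvals(A)))
    if alpha is None:
        alpha = 0.85 / lam_max          # conservative choice below the convergence bound
    n = A.shape[0]
    return np.linalg.inv(np.eye(n) - alpha * A) - np.eye(n)

# Toy 4-node adjacency network (hypothetical), e.g. built from word adjacency in a text
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
print(katz_similarity(A))
```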
Abstract:
Cloud point extraction (CPE) was employed for separation and preconcentration prior to the determination of nickel by graphite furnace atomic absorption spectrometry (GFAAS), flame atomic absorption spectrometry (FAAS), or UV-Vis spectrophotometry. Di-2-pyridyl ketone salicyloylhydrazone (DPKSH) was used for the first time as a complexing agent in CPE. The nickel complex was extracted from the aqueous phase using the Triton X-114 surfactant. Under optimized conditions, the limits of detection obtained with GFAAS, FAAS, and UV-Vis spectrophotometry were 0.14, 0.76, and 1.5 μg L⁻¹, respectively. The extraction was quantitative, and the enrichment factor was estimated to be 27. The method was applied to natural waters, hemodialysis concentrates, urine, and honey samples. Accuracy was evaluated by analysis of the NIST SRM 1643e (Trace Elements in Water) standard reference material.
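The enrichment factor reported above is commonly estimated as the ratio between the slopes of the calibration curves obtained with and without the preconcentration step; whether the authors used exactly this estimate is not stated here, so the sketch below, with hypothetical slopes, is only illustrative:

```python
def enrichment_factor(slope_with_cpe, slope_without_cpe):
    """Enrichment factor estimated as the ratio of the calibration-curve slopes
    obtained with and without the cloud point extraction step."""
    return slope_with_cpe / slope_without_cpe

# Hypothetical slopes (absorbance per ug/L); the abstract reports a factor of about 27
print(enrichment_factor(0.054, 0.002))   # -> 27.0
```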
Abstract:
The term "cloud" originates from the telecommunications world, when providers began to offer services based on virtual private networks (VPNs) for data communication. Cloud computing concerns computation, software, data access, and storage services delivered in such a way that the end user has no notion of the physical location of the data or of the configuration of the system on which they reside. Cloud computing is a recent trend in IT that moves computation and data away from desktops and laptops into large data centers. The NIST definition of cloud computing states that it is a model enabling on-demand network access to a shared pool of computing resources that can be rapidly provisioned and released with minimal management effort and minimal interaction with the service provider. With the large-scale proliferation of the Internet around the world, applications can now be delivered as services over the Internet; as a result, the overall cost of these services is reduced. The main goal of cloud computing is to make better use of distributed resources, combining them to achieve higher throughput and to solve large-scale computational problems. Companies that rely on cloud services save on infrastructure costs and on the maintenance of computing resources, since they transfer this burden to the provider; in this way they can focus exclusively on their core business.
As cloud computing becomes more popular, concerns are raised about the security issues introduced by this new model. The characteristics of this new deployment model differ widely from those of traditional architectures, and traditional security mechanisms turn out to be inefficient or useless. Cloud computing offers many benefits but is also more vulnerable to threats. There are many challenges and risks in cloud computing that increase the threat of data compromise. These concerns make companies reluctant to adopt cloud computing solutions, slowing its diffusion. In recent years much effort has gone into research on the security of cloud environments, on the classification of threats, and on risk analysis; unfortunately, cloud problems arise at various levels and there is no single solution.
After a brief introduction to cloud computing in general, the goal of this work is to provide an overview of the main vulnerabilities of the cloud model, based on its characteristics, and then to carry out a risk analysis of cloud adoption from the customer's point of view. In this way, by weighing risks and opportunities, a customer can decide whether to adopt a cloud solution. Finally, a framework is presented that aims to solve one specific problem, that of malicious traffic on the cloud network.
The work is structured as follows: the first chapter gives an overview of cloud computing, highlighting its characteristics, architecture, service models, deployment models, and related issues. The second chapter introduces security in the IT domain and then focuses specifically on security in the cloud computing model. The vulnerabilities arising from the technologies and characteristics that make up the cloud are considered, followed by a risk analysis. The risks are of different kinds, from purely technological ones to those stemming from legal or administrative issues, and including risks that are not specific to the cloud but still affect it. For each risk, the assets affected in case of an attack are listed, and a risk level ranging from low to very high is assigned. Each risk must be weighed against the opportunities offered by the aspect from which it arises. The last chapter presents a framework for protecting the cloud's internal network by deploying an Intrusion Detection System with pattern recognition and anomaly detection.