970 results for 650200 Mining and Extraction


Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

A fast method was optimized and validated to quantify amphetamine-type stimulants (amphetamine, AMP; methamphetamine, MAMP; fenproporex, FPX; 3,4-methylenedioxymethamphetamine, MDMA; and 3,4-methylenedioxyamphetamine, MDA) in human hair samples. The method was based on an initial decontamination of the hair samples (50 mg) with dichloromethane, followed by alkaline hydrolysis and extraction of the amphetamines by hollow-fiber liquid-phase microextraction (HF-LPME) in the three-phase mode. Gas chromatography-mass spectrometry (GC-MS) was used for identification and quantification of the analytes. The LOQs obtained for all amphetamines (around 0.05 ng/mg) were below the cut-off value (0.2 ng/mg) established by the Society of Hair Testing (SoHT). The method proved simple and precise: intra-day and inter-day precisions were within 10.6% and 11.4%, respectively, using only two deuterated internal standards (AMP-d5 and MDMA-d5). By using weighted least squares linear regression (1/x²), the accuracy of the method was satisfactory at the lower concentration levels (accuracy values better than 87%). Hair samples collected from six volunteers who reported regular use of amphetamines were analyzed with the developed method, and drugs were detected in all samples. (c) 2012 Elsevier B.V. All rights reserved.
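
Since the method's accuracy rests on a 1/x² weighted least squares calibration, a minimal sketch of such a fit may help; the calibration concentrations and instrument responses below are invented for illustration and are not data from the paper.

```python
import numpy as np

# Hypothetical calibration data: spiked hair concentrations (ng/mg) and
# GC-MS peak-area ratios (analyte / deuterated internal standard).
x = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0])
y = np.array([0.021, 0.043, 0.081, 0.210, 0.405, 0.830])

# Weighted least squares with w_i = 1/x_i^2, which counteracts
# heteroscedasticity and keeps relative error small at low levels.
w = 1.0 / x**2
W = w.sum()
x_bar = (w * x).sum() / W                      # weighted means
y_bar = (w * y).sum() / W
slope = (w * (x - x_bar) * (y - y_bar)).sum() / (w * (x - x_bar) ** 2).sum()
intercept = y_bar - slope * x_bar

# Back-calculate each calibrator and report accuracy (%), the criterion
# the abstract applies at the lower concentration levels.
x_calc = (y - intercept) / slope
print(f"slope={slope:.4f}, intercept={intercept:.5f}")
print("accuracy (%):", np.round(100.0 * x_calc / x, 1))
```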

Relevance:

100.00%

Publisher:

Abstract:

We review recent visualization techniques aimed at supporting tasks that require the analysis of text documents, from approaches targeted at visually summarizing the relevant content of a single document to those aimed at assisting exploratory investigation of whole collections of documents. Techniques are organized considering their target input material, either single texts or collections of texts, and their focus, which may be on displaying content, emphasizing relevant relationships, highlighting the temporal evolution of a document or collection, or helping users handle results from a query posed to a search engine. We describe the approaches adopted by distinct techniques and briefly review the strategies they employ to obtain meaningful text models; we discuss how they extract the information required to produce representative visualizations, the tasks they intend to support, the interaction issues involved, and their strengths and limitations. Finally, we present a summary of the techniques, highlighting their goals and distinguishing characteristics, and briefly discuss some open problems and research directions in the fields of visual text mining and text analytics.
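
A concrete example of the "meaningful text models" most surveyed techniques start from is the vector-space (bag-of-words) model; the TF-IDF sketch below is a generic illustration on an invented toy corpus, not the model of any particular technique in the survey.

```python
import numpy as np
from collections import Counter

# Toy corpus (invented). Each document becomes a TF-IDF vector, and the
# resulting similarities feed the projection/layout steps used by many
# collection-level text visualizations.
docs = [
    "mining text collections for trends",
    "visualizing a single text document",
    "exploratory visualization of document collections",
]
tokenized = [d.split() for d in docs]
vocab = sorted({t for doc in tokenized for t in doc})

# Term-frequency matrix: rows = documents, columns = vocabulary terms.
tf = np.array([[Counter(doc)[t] for t in vocab] for doc in tokenized], float)

# Inverse document frequency: rarer terms get higher weight.
idf = np.log(len(docs) / np.count_nonzero(tf, axis=0))
tfidf = tf * idf

# Pairwise cosine similarity between the document vectors.
unit = tfidf / np.linalg.norm(tfidf, axis=1, keepdims=True)
print(np.round(unit @ unit.T, 2))
```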

Relevance:

100.00%

Publisher:

Abstract:

Dry matter yield and chemical composition of forage grasses harvested from an area degraded by urban solid waste deposits were evaluated. A split-plot scheme in a randomized block design with four replicates was used, with five grasses in the plots and three harvests in the subplots. Mineral content, mineral extraction, and heavy metal concentrations were evaluated at the second cut, using a randomized block design with five grasses and four replicates. The grasses were Brachiaria decumbens cv. Basilisk, Brachiaria ruziziensis, Brachiaria brizantha cv. Marandu and cv. Xaraés, and Panicum maximum cv. Tanzânia, cut at 42 days of regrowth. The dry matter yield per cut reached 1,480 kg ha-1; the minimum crude protein content was 9.5% and the average neutral detergent fiber content was 62.3%. The dry matter yield of the grasses was satisfactory, so these grasses may be an alternative for rehabilitating areas degraded by solid waste deposits. The concentration of heavy metals in the plants was below toxicity levels, and the chemical composition was appropriate, except for phosphorus. The rehabilitated areas may therefore be used for grazing.

Relevance:

100.00%

Publisher:

Abstract:

Genome-wide association studies have failed to establish common-variant risk for the majority of common human diseases. The underlying reasons for this failure are explained by recent studies resequencing and comparing over 1,200 human genomes and 10,000 exomes, together with the delineation of DNA methylation patterns (the epigenome) and the full characterization of the coding and noncoding RNAs (the transcriptome) being transcribed. These studies have provided the most comprehensive catalogues of functional elements and genetic variants now available for global integrative analysis and experimental validation in prospective cohort studies. With these datasets, researchers will have unparalleled opportunities for aligning, mining, and testing hypotheses about the roles of specific genetic variants, including copy number variations, single nucleotide polymorphisms, and indels, as causes of specific phenotypes and diseases. Through the use of next-generation sequencing technologies for genotyping and standardized ontological annotation to systematically analyze the effects of genomic variation on human and model-organism phenotypes, we will be able to find candidate genes and new clues to disease etiology and treatment. This article describes essential concepts in genetics and genomic technologies, as well as the emerging computational framework, and comprehensively surveys the websites and platforms available for the analysis and interpretation of genomic data.

Relevance:

100.00%

Publisher:

Abstract:

The ubiquity of time series data across almost all human endeavors has produced a great interest in time series data mining in the last decade. While dozens of classification algorithms have been applied to time series, recent empirical evidence strongly suggests that simple nearest neighbor classification is exceptionally difficult to beat. The choice of distance measure used by the nearest neighbor algorithm is important and depends on the invariances required by the domain. For example, motion capture data typically requires invariance to warping, and cardiology data requires invariance to the baseline (the mean value). Similarly, recent work suggests that for time series clustering, the choice of clustering algorithm is much less important than the choice of distance measure. In this work we make a somewhat surprising claim: there is an invariance that the community seems to have missed, complexity invariance. Intuitively, the problem is that in many domains the different classes may have different complexities, and pairs of complex objects, even those which subjectively may seem very similar to the human eye, tend to be further apart under current distance measures than pairs of simple objects. This fact introduces errors in nearest neighbor classification, where some complex objects may be incorrectly assigned to a simpler class. Similarly, for clustering this effect can introduce errors by “suggesting” to the clustering algorithm that subjectively similar but complex objects belong in a sparser, larger-diameter cluster than is truly warranted. We introduce the first complexity-invariant distance measure for time series and show that it generally produces significant improvements in classification and clustering accuracy. We further show that this improvement does not compromise efficiency, since we can lower bound the measure and use a modification of the triangular inequality, thus making use of most existing indexing and data mining algorithms. We evaluate our ideas with the largest and most comprehensive set of time series mining experiments ever attempted in a single work, and show that complexity-invariant distance measures can produce improvements in classification and clustering in the vast majority of cases.
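
A minimal sketch of a complexity-invariant distance of this kind, assuming the commonly cited formulation in which a complexity estimate (the length of the line obtained by "stretching" the series out) rescales the Euclidean distance:

```python
import numpy as np

def complexity_estimate(t: np.ndarray) -> float:
    """Complexity of a series: sqrt of the sum of squared successive
    differences, i.e. the length of the series when stretched out."""
    return float(np.sqrt(np.sum(np.diff(t) ** 2)))

def cid(q: np.ndarray, c: np.ndarray) -> float:
    """Euclidean distance rescaled by the ratio of the two complexity
    estimates, so pairs of complex series are not unduly far apart."""
    ed = float(np.linalg.norm(q - c))
    ce_q, ce_c = complexity_estimate(q), complexity_estimate(c)
    correction = max(ce_q, ce_c) / max(min(ce_q, ce_c), 1e-12)  # avoid /0
    return ed * correction

# Toy example: a smooth sine versus a noisier (more complex) sine.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 100)
smooth = np.sin(x)
noisy = np.sin(x) + 0.3 * rng.standard_normal(x.size)
print(f"ED  = {np.linalg.norm(smooth - noisy):.3f}")
print(f"CID = {cid(smooth, noisy):.3f}")
```

Under this formulation the correction factor is at least 1, so the plain Euclidean distance lower bounds the corrected measure, which is what makes existing indexing schemes applicable, as the abstract notes.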

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Deep vein thrombosis (DVT) results from numerous factors, including venous stasis, endothelial injury, and blood hypercoagulability. Pulmonary embolism is the major complication of DVT and occurs when a thrombus or blood clot travels to the lungs. Thromboembolism is a serious health problem and accounts for a considerable number of sudden deaths and significant hospitalization costs worldwide. Prophylactic methods used to prevent DVT are classified as mechanical or pharmacological; both are effective, may be used whenever necessary, and have been recommended by international consensus. Objectives: To evaluate the incidence of DVT in surgical patients receiving intermittent pneumatic compression (IPC; thigh/leg or leg/foot) compared with no prophylaxis or with pharmacological agents. Methods: Systematic review of randomized clinical trials following the Cochrane methodology, with electronic and manual searches. Surgical patients of both sexes were included, and the intervention was the use of intermittent pneumatic compression devices. Results: 4,269 surgical patients were investigated across 31 trials covering a number of surgical interventions, including orthopedic (knee and hip), general, gynecological, urological, and neurological surgery. Pooled data showed virtually equal treatment effects for IPC and pharmacological approaches with respect to DVT incidence [0.95 (95% CI 0.82, 1.10)]. However, the risk of hemorrhage was higher, and statistically significant, in patients treated with pharmacological interventions [0.37 (95% CI 0.20, 0.69)]. Conclusion: Comparisons between pharmacological and mechanical strategies indicate that both are similar in preventing deep vein thrombosis. However, current evidence would support choosing the mechanical approaches, since the risk of hemorrhagic events favors mechanical methods.
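
For readers unfamiliar with the bracketed effect estimates, the sketch below computes a risk ratio and its 95% confidence interval from a 2x2 table using the standard large-sample formula; the event counts are hypothetical and are not taken from the review.

```python
import math

# Hypothetical counts for illustration only: DVT events / patients
# under IPC versus pharmacological prophylaxis.
events_ipc, n_ipc = 60, 1000
events_pharm, n_pharm = 63, 1000

rr = (events_ipc / n_ipc) / (events_pharm / n_pharm)

# 95% CI computed on the log scale, then exponentiated back.
se_log_rr = math.sqrt(1 / events_ipc - 1 / n_ipc
                      + 1 / events_pharm - 1 / n_pharm)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
# A CI spanning 1.0, as in the review's 0.95 (0.82, 1.10), indicates no
# statistically significant difference in DVT incidence.
```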

Relevance:

100.00%

Publisher:

Abstract:

The (re)use of gravity-based conveying techniques, which were almost completely displaced in open-pit mining by the low energy prices of the decades following the Second World War, is a challenge for the mining industry under today's economic conditions and the ecological standards it must aim for. Since development of the conveying concept "guided dumping technique" (Geführte Versturztechnik) began in the mid-1990s, the pre-tax cost of crude oil has more than tripled from its low around 1998, and has more than doubled since 2004 alone. Legal frameworks such as the European IPPC Directive 96/61/EC on "integrated pollution prevention and control" require that permits be granted only where the best available techniques (BAT) are used. Their transposition into national rules, such as the German Federal Immission Control Act and its subordinate regulations, further requires that environmental burdens must not be shifted into other media. Arranging a dump chute to exploit gravity-driven mass movements, illustrated here by quartzite extraction in the Rhenish Slate Mountains, where the conveying reference level lies below the benches that must be developed for safe and selective extraction of the raw material, satisfies these requirements by falling back on a supposedly "archaic" conveying concept that uses gravity. The obvious environmental burdens that arise from the combustion of diesel fuel alone, and the resulting pollutant and heat inputs into the air, when heavy dump trucks are widely used for downhill haulage can be reduced considerably. Operational safety of such an arrangement can be achieved by driving a straight structure with dimensions adapted to the conveyed material, together with devices that limit the kinetic energy. These devices also ensure that comminution of the material transported down the chute does not exceed operationally acceptable limits. The extensive body of knowledge on gravity-driven mass movements can be drawn upon here. Environmental protection, which with respect to the medium air reduces to autochthonous dust, can be addressed with dust-suppression equipment. Further in-depth studies are required to lead the open-pit industry, which works with complex but proven techniques, back to the use of gravity(-assisted) conveying techniques, even in densely populated regions. The "guided dumping technique" concept itself must also be adapted in its details to specific applications.
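
To make the energy argument tangible, here is a back-of-the-envelope sketch of the potential energy released when material descends through a chute rather than being hauled by diesel trucks; the annual tonnage, drop height, and diesel conversion figures are all assumptions for illustration.

```python
# Back-of-the-envelope estimate (all figures hypothetical).
g = 9.81                     # m/s^2
height_m = 100.0             # vertical drop through the chute
tonnes_per_year = 500_000.0  # material lowered per year

# Gravitational potential energy released per year: E = m * g * h.
energy_j = tonnes_per_year * 1000.0 * g * height_m
energy_kwh = energy_j / 3.6e6
print(f"potential energy released: {energy_kwh:,.0f} kWh/year")

# Very rough diesel equivalent, assuming ~10 kWh heating value per litre
# and ~30% tank-to-wheel efficiency for haul trucks.
diesel_l = energy_kwh / (10.0 * 0.30)
print(f"rough diesel equivalent: {diesel_l:,.0f} litres/year")
```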

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the dynamics of excited states in donor-acceptor systems for energy-conversion processes using ultrafast time-resolved optical spectroscopy. The main part focuses on the photophysics of organic solar cells whose active layers consist of low-bandgap diketopyrrolopyrrole (DPP)-based polymers as electron donors and fullerenes as electron acceptors. A second part is devoted to artificial primary photosynthetic reaction centers based on porphyrins, quinones, and ferrocenes, which serve as the light-harvesting unit, electron acceptor, and electron donor, respectively, to generate long-lived charge-separated states.

Time-resolved photoluminescence spectroscopy and transient absorption spectroscopy showed that the singlet exciton lifetimes in the polymers PTDPP-TT and PFDPP-TT are short (< 20 ps) and that, in blends of these polymers with PC71BM, geminate recombination of bound charge-transfer states is a major loss channel. In both systems, fast non-geminate recombination of free charges into triplet states on the polymer was also observed. For the donor-acceptor system PDPP5T:PC71BM, it was demonstrated that adding a high-boiling-point solvent, namely ortho-dichlorobenzene, strongly influences the morphology of the active layer and improves the solar-cell efficiency: the donor and acceptor materials are better intermixed and percolation paths to the electrodes form, leading to improved charge-carrier generation and extraction. Fast triplet-state formation was observed in both PDPP5T:PC71BM systems, since the triplet state of the polymer can be populated via charge-transfer states with triplet character. Multivariate curve resolution (MCR) analysis showed a strong intensity dependence, pointing to non-geminate charge-carrier recombination into the triplet state.

In the artificial primary photosynthetic reaction centers, transient absorption spectroscopy confirmed that photoinduced charge transfer is efficient in quinone-porphyrin (Q-P) and porphyrin-ferrocene (P-Fc) dyads as well as in quinone-porphyrin-ferrocene (Q-P-Fc) triads. It was also shown, however, that in the P-Fc and Q-P-Fc systems the charge-separated states recombine into the triplet state of the respective porphyrins. In the Q-P dyad, the charge-separated state could be significantly stabilized by adding a Lewis acid.
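
As a small illustration of how lifetimes such as the < 20 ps singlet exciton lifetime above are typically extracted from time-resolved traces, here is a single-exponential fit with scipy on a synthetic decay; the trace and its 15 ps lifetime are invented, and real analyses often require multi-exponential or global fits.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic transient signal (invented): 15 ps decay plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 200)                  # delay time, ps
signal = np.exp(-t / 15.0) + 0.01 * rng.standard_normal(t.size)

def single_exp(t, a, tau, c):
    """Single-exponential decay: amplitude a, lifetime tau, offset c."""
    return a * np.exp(-t / tau) + c

popt, _ = curve_fit(single_exp, t, signal, p0=(1.0, 10.0, 0.0))
a, tau, c = popt
print(f"fitted lifetime: {tau:.1f} ps")
```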

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we have identified two electrochemical procedures for preparing two copper hexacyanoferrate (CuHCF) films with different compositions and structures. The depositions were carried out using a "two-step" method consisting of the electrochemical oxidation of metallic copper, previously deposited on carbon substrates (glassy carbon and graphite foil electrodes), in a K3[Fe(CN)6] solution. Both films, CuHCF-methodA and CuHCF-methodB, were characterized by cyclic voltammetry, and their study by XANES spectroscopy revealed evidence of different structures. Additionally, insertion and extraction of different cations (Na+, K+, Mg2+, Al3+ and Cs+) were performed, and the results indicate that CuHCF-methodA has slightly better performance and operational stability than CuHCF-methodB. Data from galvanostatic charge-discharge tests confirm the latter observation. An application to amperometric detection of H2O2 and SEM micrographs are also reported for both films (methods A and B). Comparing these results with previous work of our research group, it seems that the deposition of two different compounds by methodA and methodB is due to the different stoichiometry of the Cu2+ and [Fe(CN)6]3– ions created near the electrode surface during the dissolution step.
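
As an aid for the galvanostatic charge-discharge tests mentioned, here is a sketch of the standard specific-capacity calculation, Q = I * t normalized by film mass; all numbers are hypothetical, not measurements from this work.

```python
# Specific capacity from a galvanostatic discharge (hypothetical values).
current_ma = 0.5           # constant discharge current, mA
discharge_time_s = 1800.0  # time to reach the cut-off potential, s
film_mass_mg = 4.0         # mass of the CuHCF film, mg

charge_mah = current_ma * discharge_time_s / 3600.0   # Q = I * t
specific_capacity = charge_mah / (film_mass_mg / 1000.0)
print(f"specific capacity: {specific_capacity:.1f} mAh/g")
```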

Relevance:

100.00%

Publisher:

Abstract:

Synthetic oligonucleotides and peptides have found wide applications in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development; global annual sales of peptide drugs in 2010 were estimated at $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated at $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. These additions are never complete, which generates truncated, undesired failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It requires large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated at more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences. In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods need no chromatography, so the drawbacks of HPLC no longer apply. Using them, purification is achieved by simple manipulations such as shaking and extraction. They are therefore suitable for large-scale purification of oligonucleotide and peptide drugs, and are also ideal for high-throughput purification, which is currently in high demand for research projects involving total gene synthesis. This dissertation presents the development of these techniques in detail. Chapter 1 gives an introduction to oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 describes detailed studies using the catching-failure-sequences-by-polymerization method to purify ODNs. Chapter 3 describes further optimization of this ODN purification technology to the level of practical use. Chapter 4 presents ODN purification by the catching-full-length-sequences-by-polymerization method using an acid-cleavable linker. Chapter 5 gives an introduction to peptides and their synthesis and purification. Chapter 6 describes studies using the catching-full-length-sequences-by-polymerization method for peptide purification.
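
The scale of the failure-sequence problem follows from simple arithmetic: with per-step coupling efficiency p, only about p raised to the number of coupling steps of the chains reach full length. The sketch below uses representative, assumed efficiencies:

```python
# Full-length fraction versus per-step coupling efficiency (assumed
# values). An n-mer needs n - 1 couplings after the first residue is
# loaded, so the full-length fraction is p ** (n - 1).
for p in (0.990, 0.995, 0.999):
    for n in (20, 60, 100):
        print(f"p={p:.3f}, {n:3d}-mer: {p ** (n - 1):6.1%} full length")
```

Even at 99% stepwise efficiency, barely more than a third of 100-mer chains are full length, which is why removing the truncated sequences dominates production costs.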

Relevance:

100.00%

Publisher:

Abstract:

The radiological identification of foreign objects in corpses can be difficult if the objects are fragmented or deformed. With multislice computed tomography, radiodensities, referred to as Hounsfield units (HU), can be measured. We examined the possibility of differentiating 21 frequently occurring foreign bodies, such as metals, rocks, and various man-made materials, by their HU values. Gold, steel, and brass showed mean HU values of 30671-30710 (the upper measurable limit); mean HU values for steel, silver, copper, and limestone were 20346, 16949, 14033, and 2765, respectively. The group consisting of objects such as aluminum, tarmac, car front-window glass, and other rocks displayed mean HU values of 2131-2329. The mean HU value of bottle glass and car side-window glass was 2088, whereas that of windowpane glass was 493. HU value determination may therefore help in preautopsy differentiation between case-relevant and irrelevant foreign bodies, and may thus be useful for autopsy planning and for extraction of the objects in question.
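
A minimal sketch of the kind of preautopsy lookup such reference values enable: classify a measured mean HU by the nearest reference material. The HU values are copied from the abstract; the nearest-value rule itself is our illustration, not the paper's method.

```python
# Reference mean HU values taken from the abstract.
reference_hu = {
    "steel": 20346,
    "silver": 16949,
    "copper": 14033,
    "limestone": 2765,
    "bottle / car side-window glass": 2088,
    "windowpane glass": 493,
}

def nearest_material(measured_hu: float) -> str:
    """Return the reference material whose mean HU is closest."""
    return min(reference_hu, key=lambda m: abs(reference_hu[m] - measured_hu))

print(nearest_material(2100))   # -> bottle / car side-window glass
print(nearest_material(15000))  # -> copper
```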