169 results for Analyzers


Relevance:

10.00%

Publisher:

Abstract:

This work presents an intervention-research study carried out in a shelter institution for children from zero to six years of age, located in the West Zone of the city of Rio de Janeiro. The research was designed around demands constructed collectively with the institution's technical team, presidency, and caregivers. The history of the site studied is intertwined with the histories of tuberculosis and of the neighbourhood, which contextualize the setting and help in understanding the relations between the shelter institution and the children and their families. For the analysis of sheltering practices, three analyzers were chosen: the key, the curtain, and money. Through each of them it was possible to examine the shelter institution with respect to the work performed by its professionals, as well as the care practices currently adopted in child welfare, especially at the institution where the research was carried out. It became apparent that sheltering practices are directly related to the view that the staff, technical team, presidency, and management hold of the sheltered children, prompting reflection on which childhood these children are entitled to.

Relevance:

10.00%

Publisher:

Abstract:

TYPICAL is a package for describing and making automatic inferences about a broad class of SCHEME predicate functions. These functions, called types following popular usage, delineate classes of primitive SCHEME objects, composite data structures, and abstract descriptions. TYPICAL types are generated by an extensible combinator language from either existing types or primitive terminals. These generated types are located in a lattice of predicate subsumption which captures necessary entailment between types; if satisfaction of one type necessarily entails satisfaction of another, the first type is below the second in the lattice. The inference made by TYPICAL computes the position of a new definition within the lattice and establishes it there. This information is then accessible both to later inferences and to other programs (reasoning systems, code analyzers, etc.) which may need it for their own purposes. TYPICAL was developed as a representation language for the discovery program Cyrano; particular examples are given of TYPICAL's application in the Cyrano program.
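
The lattice placement described above can be pictured with a small sketch. The following Python toy (Python rather than SCHEME, and only an approximation: subsumption is tested by entailment over a finite universe of sample objects rather than by TYPICAL's symbolic inference) shows types as predicates, a combinator generating new types, and a new definition being positioned below its subsumers.

```python
# Toy illustration (Python, not SCHEME): types as predicates, a combinator
# building new types, and subsumption approximated by testing entailment
# over a finite universe of sample objects -- a stand-in for TYPICAL's
# symbolic lattice inference.
UNIVERSE = [0, 1, -3, 2.5, "abc", "", [1, 2], [], None]

def subsumes(general, specific):
    """True if every sample satisfying `specific` also satisfies `general`."""
    return all(general(x) for x in UNIVERSE if specific(x))

is_number = lambda x: isinstance(x, (int, float))       # primitive terminals
is_integer = lambda x: isinstance(x, int)

def conjoin(p, q):                                       # a type combinator
    return lambda x: p(x) and q(x)

positive_integer = conjoin(is_integer, lambda x: x > 0)  # a generated type

# Positioning the new type: find which known types sit above it.
for name, t in {"number": is_number, "integer": is_integer}.items():
    if subsumes(t, positive_integer):
        print(f"positive-integer lies below {name} in the lattice")
```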

Relevance:

10.00%

Publisher:

Abstract:

Thesis submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Doctor in Social Sciences, speciality in Psychology

Relevance:

10.00%

Publisher:

Abstract:

Thesis submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Doctor in Social Sciences, speciality in Psychology

Relevance:

10.00%

Publisher:

Abstract:

The in-line measurement of COD and NH4-N in the WWTP inflow is crucial for the timely monitoring of biological wastewater treatment processes and for the development of advanced control strategies for optimized WWTP operation. As a direct measurement of COD and NH4-N requires expensive and high-maintenance in-line probes or analyzers, this paper presents an approach estimating COD and NH4-N from standard and spectroscopic in-line inflow measurement systems using machine learning techniques. The results show that COD estimation with a normalized MSE of 0.3, which is sufficiently accurate for practical applications, can be achieved using Random Forest Regression on standard in-line measurements alone. In the case of NH4-N, a good estimation using Partial Least Squares Regression with a normalized MSE of 0.16 is only possible based on a combination of standard and spectroscopic in-line measurements. Furthermore, the comparison of regression and classification methods shows that both methods perform equally well in most cases.
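
A minimal sketch of the estimation setup described above, using scikit-learn on synthetic data; the five "standard in-line" features and the normalization of the MSE by the target variance are assumptions for illustration, not details taken from the paper.

```python
# Sketch of the COD estimation setup on synthetic data (scikit-learn);
# features and normalization choice are assumptions, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 5))          # e.g. flow, conductivity, turbidity, pH, T
cod = 3.0 * X[:, 2] + 1.5 * X[:, 0] + rng.normal(scale=1.0, size=n)  # synthetic COD

X_tr, X_te, y_tr, y_te = train_test_split(X, cod, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

nmse = np.mean((pred - y_te) ** 2) / np.var(y_te)   # normalized MSE
print(f"normalized MSE: {nmse:.2f}")
```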

Relevance:

10.00%

Publisher:

Abstract:

The rising costs associated with electricity consumption, not only financial but also environmental, increasingly highlight the importance of defining strategies for better resource use and energy efficiency. This importance has been reinforced by decree-laws that set targets and limits on energy expenditure, accompanied by incentive programmes for the energy-efficiency sector. In Portugal, measures in this area have been redirected toward final energy consumption, with targets defined for the largest-consuming facilities. Hospital facilities are major centres of energy consumption, owing not only to the large number of patients they receive but also to the many types of electrical equipment used to deliver medical services. As a consequence, investments and operating costs are high, reinforcing the need to manage energy expenditure and consumption through constant improvement in the collection of information about the whole system and in the design of interventions aimed at greater energy efficiency. Hospital Pedro Hispano has for some time been investing in knowing its facility, and the energy consumption associated with it, more thoroughly. Measures taken to that end include the installation of energy analyzers, so as to obtain a more faithful and reliable picture of the main consumption vectors. At present the hospital's technical management monitors a large part of the facility, collecting data on the hospital's actual electricity consumption. This dissertation analyses and frames the programmes and targets related to the energy sector, with emphasis on the statutes that address and cover hospital facilities. Among the various programmes encouraging the adoption of more energy-efficient policies, special attention is given to the ECO.AP programme, which promotes contracts for the implementation of energy-saving measures in the public sector. In collaboration with the HPH, work began with the study and identification of the main stages and tools used in the building's energy management, with the goal of re-evaluating the energy vectors already identified at the HPH and creating and accounting for new consumption groups. Through numerous measurements of electricity consumption, totalling more than 650 hours of operation, it was possible to build the consumption disaggregation map for the year 2013. The disaggregation includes 3 new energy vectors and the re-evaluation of the relative weight of 5 further consumption groups. Among the measurements, the re-evaluation of the pumping station's consumption stands out: the share assumed until then was 3 times higher than the actual measured value. Based on this disaggregation, implementation measures aimed at reducing energy consumption throughout the hospital were identified and studied, most notably the solution proposed for the pumping station. This measure would have a major impact on the overall energy bill, not only because of its feasibility but also because it would act on a large consumption centre where no action of the kind had yet been implemented.

Relevance:

10.00%

Publisher:

Abstract:

Some recent studies have characterized the stability of blood variables commonly measured for the Athlete Biological Passport. The aim of this study was to characterize the impact of different shipment conditions on the quality of the results returned by the haematological analyzer. Twenty-two healthy male subjects provided five EDTA tubes each. Four shipment durations (24, 36, 48, 72 h) under refrigerated conditions were tested and compared to a set of samples kept in the laboratory, also under refrigerated conditions (control group). All measurements were conducted using two Sysmex XT-2000i analyzers. Haemoglobin concentration, reticulocyte percentage, and OFF-score numerical data were the same for samples analyzed just after collection and after shipment under refrigerated conditions for up to 72 h. Detailed information reported by the differential (DIFF) channel scatterplot of the Sysmex XT-2000i indicated signs of blood deterioration, but these were not of relevance for the variables used in the Athlete Biological Passport. As long as the cold chain is guaranteed, the time delay between the collection and the analysis of blood variables can be extended. Copyright © 2015 John Wiley & Sons, Ltd.
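
For orientation, the OFF-score mentioned above is commonly computed as OFF = Hb [g/L] − 60·√(reticulocytes [%]); the sketch below uses that published formula, which is not quoted in this abstract.

```python
# OFF-score from the commonly published formula
# OFF = Hb [g/L] - 60 * sqrt(reticulocytes [%]) (not quoted in the abstract).
import math

def off_score(hb_g_per_l: float, ret_percent: float) -> float:
    return hb_g_per_l - 60.0 * math.sqrt(ret_percent)

print(off_score(150.0, 1.0))  # Hb 150 g/L at 1.0% reticulocytes -> 90.0
```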

Relevance:

10.00%

Publisher:

Abstract:

Our research aims to describe the grammatical knowledge developed by first-year secondary students during the teaching/learning of subject-verb agreement. This description is based on the observation of didactic interactions between the students and their teacher around the object of knowledge "verb agreement": it concerns in particular the interaction between the "student" and "knowledge" poles. Our research falls within the current of modern pedagogical grammar. Chevallard's (1985/1991) theory of didactic transposition is also the cornerstone of our work: the concepts of external didactic transposition (the passage from scholarly knowledge to knowledge to be taught) and internal didactic transposition (the passage from knowledge to be taught to knowledge actually taught) act as analyzers of the didactic interactions. The observation, description, and theorization of didactic interactions call for an ecological approach to data collection. For our research, data were gathered through video recording of teaching sequences on verb agreement; they consist of verbal interactions among students or between students and their teacher. The data were analysed from both a macro and a micro perspective: (1) The macro analysis indicates that students' prior knowledge resists the institutionalization of knowledge, since the knowledge taught is not the only knowledge mobilized. Students draw on a wide range of procedural and declarative knowledge to identify the verb and the subject, with success by no means guaranteed. Moreover, the knowledge they have built around the agreement rule and the transfer of morphological features is likewise numerous and varied, and does not lead them to make the verb agree consistently. (2) The micro analysis suggests that the development of knowledge about verb agreement depends on how the tools of grammar (syntactic manipulations and the basic sentence) are used by the students. More precisely, knowledge stalls or regresses when syntactic manipulations are not applied within the sentence or are not adapted to certain syntactic contexts; knowledge leaps forward in classes where students are able to use the basic sentence to support their grammatical analysis. The descriptions proposed in this thesis lead to a discussion of their implications for external didactic transposition and, more generally, for the didactics of French and of grammar.

Relevance:

10.00%

Publisher:

Abstract:

In biological mass spectrometry (MS), two ionization techniques are predominantly employed for the analysis of larger biomolecules, such as polypeptides. These are nano-electrospray ionization [1, 2] (nanoESI) and matrix-assisted laser desorption/ionization [3, 4] (MALDI). Both techniques are considered to be “soft”, allowing the desorption and ionization of intact molecular analyte species and thus their successful mass-spectrometric analysis. One of the main differences between these two ionization techniques lies in their ability to produce multiply charged ions. MALDI typically generates singly charged peptide ions whereas nanoESI easily provides multiply charged ions, even for peptides as low as 1000 Da in mass. The production of highly charged ions is desirable as this allows the use of mass analyzers, such as ion traps (including orbitraps) and hybrid quadrupole instruments, which typically offer only a limited m/z range (< 2000–4000). It also enables more informative fragmentation spectra using techniques such as collisioninduced dissociation (CID) and electron capture/transfer dissociation (ECD/ETD) in combination with tandem MS (MS/MS). [5, 6] Thus, there is a clear advantage of using ESI in research areas where peptide sequencing, or in general, the structural elucidation of biomolecules by MS/MS is required. Nonetheless, MALDI with its higher tolerance to contaminants and additives, ease-of-operation, potential for highspeed and automated sample preparation and analysis as well as its MS imaging capabilities makes it an ionization technique that can cover bioanalytical areas for which ESI is less suitable. [7, 8] If these strengths could be combined with the analytical power of multiply charged ions, new instrumental configurations and large-scale proteomic analyses based on MALDI MS(/MS) would become feasible.
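
The point about limited m/z ranges can be made concrete with a short calculation: for a peptide of mass M carrying z protons, m/z = (M + z·1.00728)/z. The peptide mass below is hypothetical.

```python
# m/z versus charge state for a protonated peptide:
# m/z = (M + z * 1.00728) / z, with 1.00728 Da per added proton.
def mz(mass_da: float, z: int) -> float:
    return (mass_da + z * 1.00728) / z

peptide_mass = 2845.0  # hypothetical peptide mass in Da
for z in (1, 2, 3):
    print(f"[M+{z}H]{z}+ at m/z {mz(peptide_mass, z):.2f}")
# Only z >= 2 brings this peptide inside a ~2000 m/z ion-trap window.
```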

Relevance:

10.00%

Publisher:

Abstract:

We describe a one-port de-embedding technique suitable for the quasi-optical characterization of terahertz integrated components at frequencies beyond the operational range of most vector network analyzers. This technique is also suitable when precision terminations cannot be manufactured to sufficiently fine tolerances for the application of a TRL de-embedding technique. The technique is based on vector reflection measurements of a series of easily realizable test pieces. A theoretical analysis is presented for the precision of the technique when implemented using a quasi-optical null-balanced bridge reflectometer. The analysis takes into account quantization effects in the linear and angular encoders associated with the balancing procedure, as well as source power and detector noise-equivalent power. The precision in measuring waveguide characteristic impedance and attenuation using this de-embedding technique is further analyzed after taking into account changes in the coupled power due to axial, rotational, and lateral alignment errors between the device under test and the instrument's test port. The analysis is based on the propagation of errors after assuming imperfect coupling of two fundamental Gaussian beams. The required precision in repositioning the samples at the instrument's test port is discussed. Quasi-optical measurements using the de-embedding process for a WR-8 adjustable precision short at 125 GHz are presented. The de-embedding methodology may be extended to allow the determination of S-parameters of arbitrary two-port junctions. The measurement technique proposed should prove most useful above 325 GHz, where there is a lack of measurement standards.
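
For context, classical one-port de-embedding with three known reflection standards reduces to a three-term error model, Γm = (e00 + (e10·e01 − e00·e11)·Γ)/(1 − e11·Γ). The sketch below illustrates that standard model on synthetic data; it is not the paper's quasi-optical procedure, which replaces precision terminations with easily realizable test pieces.

```python
# Illustrative sketch of the classical three-term one-port error model on
# synthetic data (standard VNA theory, shown for orientation only; the
# paper's quasi-optical procedure differs in its choice of standards).
import numpy as np

def solve_error_terms(gamma_actual, gamma_measured):
    """Solve Gm = a + b*Ga + c*Ga*Gm for (a, b, c) from three standards,
    where a = e00, c = e11 and b = e10*e01 - e00*e11."""
    A = np.array([[1.0, ga, ga * gm]
                  for ga, gm in zip(gamma_actual, gamma_measured)], dtype=complex)
    return np.linalg.solve(A, np.array(gamma_measured, dtype=complex))

def de_embed(gm, a, b, c):
    """Recover the actual reflection coefficient from a raw measurement."""
    return (gm - a) / (b + c * gm)

# Hypothetical standards: short, offset short, matched termination.
ga_std = [-1.0, -np.exp(-2j * 0.8), 0.0]
a_t, b_t, c_t = 0.05 + 0.02j, 0.9 + 0.0j, 0.1 - 0.05j   # made-up error terms
gm_std = [(a_t + b_t * g) / (1 - c_t * g) for g in ga_std]

a, b, c = solve_error_terms(ga_std, gm_std)
g_dut = 0.3 + 0.1j                                       # "true" device reflection
gm_dut = (a_t + b_t * g_dut) / (1 - c_t * g_dut)         # what the instrument reads
print(de_embed(gm_dut, a, b, c))                         # recovers ~0.3+0.1j
```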

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the development of a microfluidic methodology, using RNA extraction and reverse transcription PCR, for investigating expression levels of cytochrome P450 genes. Cytochrome P450 enzymes are involved in the metabolism of xenobiotics, including many commonly prescribed drugs; information on their expression is therefore useful in both pharmaceutical and clinical settings. RNA extraction, from rat liver tissue or primary rat hepatocytes, was performed using a silica-based solid-phase extraction technique. Following elution of the purified RNA, amplification of target sequences for the housekeeping gene glyceraldehyde-3-phosphate dehydrogenase (GAPDH) and the cytochrome P450 gene CYP1A2 was carried out using a one-step reverse transcription PCR. Once the microfluidic methodology had been optimized, analyses of control and 3-methylcholanthrene-induced primary rat hepatocytes were used to evaluate the system. As expected, GAPDH was consistently expressed, whereas CYP1A2 levels were found to be raised in the drug-treated samples. The proposed system offers an initial platform for the development of both rapid-throughput analyzers for pharmaceutical drug screening and point-of-care diagnostic tests to aid the provision of drug regimens tailor-made to the individual patient.
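
If threshold-cycle (Ct) values are available from the amplification, relative expression is often quantified with the 2^(−ΔΔCt) method, normalizing CYP1A2 to GAPDH. Whether this paper's on-chip assay reports Ct values is not stated, so the sketch below is a generic illustration with hypothetical numbers, not the paper's analysis.

```python
# Generic 2^(-ΔΔCt) relative-quantification sketch with hypothetical Ct
# values; not necessarily how this paper's on-chip assay reports results.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative target expression normalized to a housekeeping gene."""
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return 2.0 ** (-ddct)

# CYP1A2 vs. GAPDH, induced vs. control hepatocytes (numbers invented):
print(fold_change(22.0, 18.0, 26.5, 18.2))  # ~20-fold induction
```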

Relevance:

10.00%

Publisher:

Abstract:

The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimuli-dependent. Hence, in order to ensure that the critical situation is taken into account, one would have to exercise all possible input patterns, which is clearly infeasible given the complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only the circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been exhaustively studied in the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
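
Topological timing analysis as described above reduces to a longest-path computation on the DAG; a minimal sketch follows, with gate names and delays invented for illustration.

```python
# Minimal topological timing analysis: critical delay as the longest path
# in a DAG of gates (names and delays are invented for illustration).
from graphlib import TopologicalSorter

delay = {"i1": 0, "i2": 0, "g1": 2, "g2": 3, "g3": 1, "out": 0}
fanin = {"i1": [], "i2": [], "g1": ["i1", "i2"], "g2": ["i1"],
         "g3": ["g1", "g2"], "out": ["g3"]}

arrival = {}
for node in TopologicalSorter(fanin).static_order():   # predecessors first
    arrival[node] = delay[node] + max((arrival[p] for p in fanin[node]), default=0)

print(arrival["out"])  # 4 -- the topological bound; a false path could inflate it
```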

Relevance:

10.00%

Publisher:

Abstract:

Natural nanoclays are of great interest, particularly for the production of polymer-based nanocomposites. In this work, kaolinite clays from two natural deposits in the states of Rio Grande do Norte and Paraíba were purified by thermal and chemical treatments and characterized. Based on the data obtained, methodologies were proposed for the elimination or reduction of coarse particles, iron oxide, and organic matter. These methodologies combined thermal treatments, carried out in an electric oven, with chemical attacks using acid and hydrogen peroxide. Thermogravimetric analysis was used to examine the thermal stability of the nanoclays and indicated weight losses at temperatures below 110 °C and over the range of 350 to 550 °C. Based on the thermal analysis data, the samples were submitted to a thermal treatment at 500 °C for 8 h to remove organic components. The X-ray diffraction patterns indicated that thermal treatment at 500 °C affects the basic structure of kaolinite. The BET surface area measurements ranged from 32 to 38 m²/g for thermally treated samples and from 36 to 53 m²/g for chemically treated samples. Thus, although the thermal treatment increased the surface area through the removal of organic components, the effect was not significant, and chemical treatment, which does not affect the basic structure of kaolinite, is more efficient at improving particle dispersion. SEM analysis confirmed that the clay is agglomerated, forming micron-sized particles.
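
For reference, the quoted surface areas come from BET analysis; the standard linearized BET relation (general background, not taken from the dissertation) is:

```latex
% Standard linearized BET relation. v is the volume of gas adsorbed at
% relative pressure p/p_0, v_m the monolayer capacity, c the BET constant:
\[
  \frac{1}{v\left[(p_0/p) - 1\right]}
    = \frac{c-1}{v_m c}\,\frac{p}{p_0} + \frac{1}{v_m c}
\]
% The specific surface area then follows from v_m, Avogadro's number N_A,
% the adsorbate cross-sectional area \sigma, the molar volume V_mol and
% the sample mass m:
\[
  S_{\mathrm{BET}} = \frac{v_m N_A \sigma}{V_{\mathrm{mol}}\, m}
\]
```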

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)