Abstract:
Formaldehyde (FA) is a colourless gas widely used in industry and in hospitals as an aqueous solution, formalin. It is extremely reactive and induces various genotoxic effects in proliferating cultured mammalian cells. Tobacco smoke has been epidemiologically associated with a higher risk of cancer, especially in the oral cavity, larynx and lungs, as these sites are in direct contact with many of tobacco's carcinogenic compounds. Genetic polymorphisms in metabolic enzymes are important because they can alter individual susceptibility to disease. Alcohol dehydrogenase class 3 (ADH3), also known as glutathione-dependent formaldehyde dehydrogenase, is the major enzyme involved in formaldehyde oxidation, especially in the buccal mucosa. The polymorphism under study is a substitution of an isoleucine for a valine at codon 349. The cytokinesis-blocked micronucleus assay (CBMN) in human lymphocytes is one of the most commonly used methods for measuring DNA damage, namely the detection of micronuclei, nucleoplasmic bridges and nuclear buds, which are classified as genotoxicity biomarkers.
Abstract:
Three organophosphorus compounds (malathion, folithion and temephos) and two synthetic pyrethroids (alphamethrin and deltamethrin) were used to monitor the susceptibility status of larvae and adults of six vector mosquito species in India: Culex quinquefasciatus (filariasis) and Aedes albopictus (dengue), both laboratory and field strains, and laboratory strains of Aedes aegypti (dengue), Anopheles stephensi and Anopheles culicifacies (malaria), and Culex tritaeniorhynchus (Japanese encephalitis). From the LC50 values obtained for these insecticides, all mosquito species, including the field strains of Cx. quinquefasciatus and Ae. albopictus, were found to be highly susceptible. Except for the field strain of Cx. quinquefasciatus against malathion, 100% mortality was observed at the discriminating dosages recommended by the World Health Organization. The residual effect of alphamethrin, deltamethrin, malathion and folithion at 25 mg (a.i.)/m² on different surfaces against the six vector species showed that alphamethrin was the most effective on all four treated surfaces (mud, plywood, cement and thatch), with residual efficacy lasting longer on thatch than on the other surfaces. Synthetic pyrethroids such as alphamethrin can therefore be effectively employed in integrated vector control operations.
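The susceptibility rankings above rest on LC50 values estimated from dose-mortality data. As a minimal sketch of how such values are typically derived (probit regression on log dose), with invented dose-mortality numbers since the abstract reports no raw data:

```python
# Minimal sketch of LC50 estimation via probit regression, the standard
# approach behind larval susceptibility assays. All numbers are hypothetical.
import numpy as np
import statsmodels.api as sm

doses = np.array([0.01, 0.02, 0.04, 0.08, 0.16])   # mg/L, hypothetical
n_tested = np.array([100, 100, 100, 100, 100])
n_dead = np.array([8, 22, 55, 81, 97])

# Probit model: P(death) = Phi(b0 + b1 * log10(dose))
X = sm.add_constant(np.log10(doses))
y = np.column_stack([n_dead, n_tested - n_dead])
fit = sm.GLM(y, X,
             family=sm.families.Binomial(link=sm.families.links.Probit())).fit()

b0, b1 = fit.params
lc50 = 10 ** (-b0 / b1)   # linear predictor = 0  <=>  50% mortality
print(f"LC50 ≈ {lc50:.3f} mg/L")
```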
Abstract:
The exposure index (lgM) obtained from a radiographic image can provide useful feedback to the radiographer about the appropriateness of the exposure level in routine clinical practice. This study evaluates lgM in orthopaedic radiography performed in a standard clinical environment. We analysed the lgM of 267 exposures performed with an AGFA CR system. The mean lgM in our sample is 2.14, significantly different from the 1.96 lgM reference (P < 0.001). The data show that 72% of exposures are above 1.96 lgM and 42% are above the 2.26 limit. Median lgM values lie above 1.96 and below 2.26 for speed class (SC) 200 (2.16) and SC 400 (2.13), and the interquartile range is lower for SC 400 than for SC 200. The data indicate that lgM values exceed the manufacturer's reference of 1.96; departmental exposure charts should therefore be optimised to reduce patient dose.
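To give the 2.14 vs. 1.96 difference a sense of scale, the deviation can be converted to a dose ratio, assuming the usual log10 scaling of the AGFA index in which an increase of 0.3 lgM corresponds to a doubling of detector dose (an assumption stated here for illustration):

```latex
% Deviation of the sample mean from the reference, expressed as a dose ratio:
\frac{D_{\text{sample}}}{D_{\text{ref}}} = 10^{\,\Delta\mathrm{lgM}}
  = 10^{\,2.14 - 1.96} = 10^{\,0.18} \approx 1.5
```

On this reading, the mean exposure is roughly 50% above the reference dose, and the 2.26 limit corresponds to a doubling (10^0.30 ≈ 2).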
Evaluation of exposure parameters in plain radiography: a comparative study with European guidelines
Abstract:
The typical distribution of exposure parameters in plain radiography is unknown in Portugal. This study aims to identify the exposure parameters being used in plain radiography in the Lisbon area and to compare the collected data with European references [Commission of the European Communities (CEC) guidelines]. The results show that in four examinations (skull, chest, lumbar spine and pelvis) there is a strong tendency to use exposure times above the European recommendation, while the X-ray tube potential values (in kV) are below those recommended in the CEC guidelines. This study shows that, at a local level (Lisbon region), radiographic practice does not comply with the CEC guidelines concerning exposure techniques. Further national and local studies are recommended with the objective of improving exposure optimisation and technical procedures in plain radiography. The study also suggests the need to establish national/local diagnostic reference levels and to carry out effective measurements for exposure optimisation.
Abstract:
Transport infrastructure is indispensable to the economic and social development of a nation. In a globalised world, where everything must reach its destination in the shortest possible time, transport routes play a vital role, so building and maintaining an efficient transport network becomes essential. Although it is not the most efficient mode, road transport is often the most economical, allows door-to-door delivery and is in many cases the only feasible mode of transport. For these reasons, road transport holds a significant share of the transport market, for both passengers and freight, making it extremely important in a country's transport network. European countries have invested heavily in creating extensive road networks covering almost all of their territory. We are now reaching the point where the main concern of road administrations is no longer the construction of new roads but the need to maintain and preserve the existing ones. Road pavements, like all other structures, require maintenance in order to guarantee good levels of service with quality, comfort and safety. Given the costs inherent in pavement maintenance operations, these must be rigorously planned on the basis of well-defined scientific criteria, so as to avoid unnecessary interventions while also preventing damage from becoming irreparable and economically harmful, with repercussions for user safety. To estimate the service life of a pavement, its structural characterisation must be carried out first. This requires knowing the type of pavement structure, namely the thickness and modulus of elasticity of its constituent layers. Non-destructive testing methods are increasingly recognised as an effective way to obtain information on the structural behaviour of pavements. Several types of equipment exist for these tests, but two of them, the Falling Weight Deflectometer and Ground Penetrating Radar, have proved particularly efficient for evaluating the bearing capacity of a pavement, and both were used in this study. For pavement load tests, the Falling Weight Deflectometer has been used successfully to measure surface deflections at predetermined points under a standardised load that simulates the passage of a truck wheel. Complementarily, to obtain continuous information on the pavement structure, Ground Penetrating Radar uses electromagnetic waves to determine the number of layers and their thicknesses. These data, when used together with rotary core drilling and test pits at selected locations, allow a more precise characterisation of the structural condition of a pavement and, for existing pavements, the establishment of response models. On the other hand, processing the data obtained during in situ testing proves to be a slow and complex task. Currently, using the thicknesses of the pavement layers, the elastic moduli of the layers are calculated by back-analysis of the deflection basin measured in the load tests.
This method is iterative: an experienced engineer tests several different pavement structures until one is obtained whose response is as close as possible to that measured in situ. The task is highly dependent on the engineer's experience, since the pavement structures to be tested derive mostly from his or her judgement. Another disadvantage of this method is that it admits multiple solutions, as different structures can produce identical response models; the accepted solution is often the one judged most probable, again based on the engineer's reasoning and experience. A possible answer to the problems of the enormous amount of data to process and of the multiple possible solutions is the use of Artificial Neural Networks (ANN) to assist this task. Neural networks are virtual computational elements whose operation is inspired by the way biological nervous systems, such as the brain, process information. They are composed of a series of layers, which in turn are composed of neurons. As information is transmitted between neurons, it is modified by the application of a coefficient called a "weight". Neural networks have a very useful ability: they can map a function without knowing its mathematical formula. This ability is exploited in several scientific fields, such as pattern recognition, classification and data compression. To make use of this characteristic, the network must first be properly "trained", a process carried out by presenting two data sets: the input values and the desired output values. Through a cyclic process of propagating information through the connections between neurons, the network gradually adjusts itself and produces better results. Although several types of network exist, those that appear most suitable for this task are backpropagation networks. These have an important characteristic, namely so-called "supervised training". Because of this training method, the networks operate within the range of variation of the data supplied for training, and consequently the calculated results also fall within that range, preventing the appearance of mathematically valid but practically impossible solutions. To simplify the task further, a computer program, NNPav, was developed with ANN as an integral part of its calculation process. The objective is to make the back-analysis process fully automatic and to prevent errors induced by the user's lack of experience. To extend the program's functionality further, a calculation procedure was implemented that estimates the bearing capacity and the remaining service life of the pavement using two failure criteria. These criteria are normally used in pavement design to prevent fatigue cracking and permanent deformation. In this way, the program allows the remaining service life of a pavement to be estimated efficiently, directly from the deflections and layer thicknesses measured in situ.
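The two failure criteria mentioned are conventionally expressed as power laws in the critical strains; a generic form is given below, with the coefficients left symbolic since the abstract does not state which calibration NNPav adopts:

```latex
% Fatigue cracking (tensile strain at the bottom of the bituminous layer)
% and permanent deformation (compressive strain at the top of the subgrade):
N_f = k_1 \left(\frac{1}{\varepsilon_t}\right)^{k_2}
          \left(\frac{1}{E}\right)^{k_3},
\qquad
N_d = k_4 \left(\frac{1}{\varepsilon_z}\right)^{k_5}
```

Here N_f and N_d are the admissible numbers of load applications against fatigue and rutting, ε_t and ε_z the critical strains, E the stiffness modulus of the bituminous mixture, and k_1 ... k_5 calibration coefficients; the remaining life is governed by the smaller of the two N values.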
All steps of the structural characterisation of the pavement are carried out by NNPav, whether through neural networks or through mathematical calculation, including the correction of the elastic modulus of the bituminous layer to the design temperature and the consideration of traffic characteristics and traffic growth rates. Tests on the neural networks showed that satisfactory results were achieved. The error levels obtained with the neural networks are similar to those obtained with linear-elastic layered models, except for the calculation of service life based on one of the criteria, where the errors were higher. However, the process is considerably faster and allows the data to be processed by less experienced personnel. At the same time, the results files were designed so that all the data calculated by the program, at the various processing stages, can be examined in detail. The ability to estimate the bearing capacity and the remaining service life of a pavement, both included in the program, are also important tools. In essence, NNPav performs a complete structural analysis of a pavement, estimating its service life from the field tests carried out with the Falling Weight Deflectometer and Ground Penetrating Radar in a single step. In addition, a module for the design of new pavements was developed and implemented in NNPav. Given a set of candidate pavement structures, this module estimates the bearing capacity and service life of each, allowing a large number of structures to be analysed and their results easily compared in the exported file. Although the results obtained in this work are quite satisfactory, future developments in the application of neural networks to pavement evaluation are even more promising. Since this work was limited to the time frame inherent in an academic project, the possibility of further improving the response of the ANN remains open. Although several tests were performed on the networks to find the architectures giving the best results, the possible architectures are virtually unlimited, and this is an area worth exploring further. The functionalities implemented in the program were those feasible within the time frame mentioned, but many others could be added or extended, increasing the program's functionality and productivity. Since this is a tool that can be applied at the road network management level, similar networks would need to be studied and developed to evaluate other types of pavement structure. As a final conclusion, despite the various aspects that can and should be improved, the program developed proved to be a very useful and efficient tool for the structural evaluation of pavements based on non-destructive testing methods.
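As a minimal sketch of the neural mapping at the core of this approach, namely deflection basin and layer thicknesses in, layer moduli out, the snippet below uses scikit-learn; the layer counts, value ranges and network architecture are illustrative assumptions rather than the NNPav design, and real training pairs would come from a layered-elastic forward model instead of the random placeholders used here:

```python
# Minimal sketch: a backpropagation network mapping FWD deflection basins
# and layer thicknesses to layer moduli. Training pairs would normally be
# generated with a layered-elastic forward model; random placeholders here.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 2000

# Inputs: 7 deflections (microns) + 2 layer thicknesses (m) -- illustrative.
X = np.hstack([
    rng.uniform(50, 800, (n_samples, 7)),     # deflection basin
    rng.uniform(0.10, 0.40, (n_samples, 2)),  # asphalt and base thickness
])
# Targets: 3 layer moduli (MPa) -- asphalt, base, subgrade (placeholders).
y = rng.uniform([1500, 100, 30], [12000, 600, 200], (n_samples, 3))

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                   solver="adam", max_iter=2000, random_state=0)
net.fit(scaler.transform(X), y)

# Backcalculation for one measured basin plus known thicknesses:
basin = np.array([[420, 350, 270, 195, 140, 100, 70, 0.22, 0.30]])
print(net.predict(scaler.transform(basin)))   # estimated moduli [MPa]
```

Because the network is trained only inside the range of the supplied examples, its predictions stay inside that range as well, which is exactly the property the abstract credits with excluding physically impossible solutions.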
Abstract:
Nowadays, most individuals spend about 80% of their time indoors and, consequently, exposure to the indoor environment becomes more relevant than exposure to the outdoor one. Children spend most of their time at home and at school, so evaluations of their indoor environment are important for estimating time-weighted exposure. Because their airways are still developing, children are a sensitive group at higher risk than adults, and the large impact on their health and educational performance demands indoor air quality studies in schools. The aim of this study was to assess children's exposure to bioaerosols. A methodology based on passive sampling was applied to evaluate fungi, bacteria and pollens, and its procedures and applicability were optimised. An indoor air study by passive sampling is easier and cheaper than the use of automatic active samplers; furthermore, it can yield important air quality information without interfering with classroom activities. The study was conducted in three schools, representative of different environments in the Lisbon urban area, in three different periods of the year, in order to capture seasonal variation, estimate variability across the city and understand the underlying causes. Fungi and bacteria were collected inside and outside the classrooms to determine indoor/outdoor ratios and to assess the influence of outdoor contamination on the indoor environment. Children's exposure to pollen grains inside the classrooms was also assessed.
Abstract:
Planning control programs for diseases such as rabies requires information on the size and structure of the dog and cat populations. In order to evaluate the dog population of the urban area of Araçatuba city, S. Paulo State, Brazil, a survey was conducted using a questionnaire administered to household members. Eighty-eight districts were visited (37,778 houses) and an interview was possible at 77.93% of them. The human population covered was 113,157 inhabitants. Households owning animals represented 55.2%; 26,926 of the animals concerned were dogs and 5,755 were cats. Of the dogs, 56.64% were 1-4 years old, and males represented 56.2% of the total population. The dog:person ratio was estimated at 2.8 dogs to every 10 persons, almost three times the ratio hitherto estimated and used in planning rabies vaccination campaigns.
Abstract:
INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer programs that automatically identify the underlying cause of death. OBJECTIVE: This work compares the underlying causes of death processed by the Automated Classification of Medical Entities (ACME) and by the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation used the ACME input data file covering deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 death certificate records. The differences between the underlying causes selected by the ACME and SCB systems in the month of June, when considered SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: Processing the underlying causes of death with the ACME and SCB systems produced 3,278 differences, which were analysed and attributed to unanswered dialogue boxes during processing, to deaths due to human immunodeficiency virus (HIV) disease, for which neither system made specific provision, to coding and/or keying errors, and to actual problems. Detailed analysis of the latter showed that the majority of the underlying causes processed by the SCB system were correct, that the two systems interpreted some mortality coding rules differently, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were genuine SCB errors. CONCLUSION: These results, disclosing a very small and insignificant number of actual problems, support the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
Abstract:
OBJECTIVE: To assess an easy-to-prepare, low-cost control material for hematology, suitable for both manual and automated methods. MATERIAL AND METHOD: Aliquots of stabilized whole blood were prepared by partial fixation with aldehydes; stability at different temperatures (4, 20 and 37 °C) over periods of up to 8-9 weeks and aliquot variability were monitored with both methods. RESULTS: Aliquot variability with automated methods at day 1, expressed as CV% (coefficient of variation), was: white blood cells (WBC) 2.7, red blood cells (RBC) 0.7, hemoglobin (Hb) 0.6, hematocrit (Hct) 0.7, mean cell volume (MCV) 0.3, mean cell hemoglobin (MCH) 0.6, mean cell hemoglobin concentration (MCHC) 0.7, and platelets (PLT) 4.6. The CV percentages obtained with manual methods in one of the batches were: WBC 23, Hct 2.8, Hb 4.5, MCHC 5.9, PLT 41. Samples stored at 4 °C and 20 °C showed good stability, with only very slight initial hemolysis, whereas those stored at 37 °C deteriorated rapidly (methemoglobin formation, aggregation of WBC and platelets, and alteration of erythrocyte indices). CONCLUSIONS: It was confirmed that, provided there is no exposure to high temperatures during distribution, this material is stable and allows both external and internal quality control assessment, with acceptable reproducibility, for both manual and automated methods.
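For reference, the CV% figures above are the dispersion of replicate aliquot measurements normalised by their mean:

```latex
% Coefficient of variation of replicate aliquot measurements:
CV\% = \frac{s}{\bar{x}} \times 100
```

where s is the standard deviation and x̄ the mean of the replicates; for example (hypothetical numbers), replicate WBC counts with mean 7.0 and standard deviation 0.19 (×10⁹/L) give 0.19/7.0 × 100 ≈ 2.7, matching the automated WBC figure above.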
Computational evaluation of hydraulic system behaviour with entrapped air under rapid pressurization
Abstract:
The pressurization of hydraulic systems containing entrapped air is considered a critical condition for the security of the infrastructure because of the transient pressure variations that often occur. The objective of the present study is the computational evaluation of the trends observed in the variation of the maximum surge pressure resulting from rapid pressurizations, together with a comparison of the results with those obtained in previous studies. A brief state of the art in this domain is presented. The research is applied to an experimental system with entrapped air at the top of a vertical pipe section. The evaluation uses the elastic model, based on the method of characteristics and considering a moving liquid boundary, and the results are compared with those obtained with the rigid liquid column model.
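As a rough illustration of the simpler of the two approaches, a rigid liquid column driving a polytropic air pocket reduces to two ODEs; the geometry, friction factor and polytropic exponent below are invented for the sketch and are not the paper's experimental values (the elevation term is also omitted for brevity):

```python
# Minimal sketch: rigid-column model of rapid pressurization of an air pocket
# at the top of a pipe. All parameters are illustrative, not the paper's rig.
import numpy as np
from scipy.integrate import solve_ivp

rho = 1000.0                   # water density [kg/m^3]
D, L0 = 0.05, 10.0             # pipe diameter [m], initial column length [m]
f = 0.02                       # Darcy friction factor (assumed)
n = 1.2                        # polytropic exponent (between 1.0 and 1.4)
p0, pres = 101325.0, 3e5       # initial air pressure, upstream reservoir [Pa]
La0 = 1.0                      # initial air-pocket length [m]

def rhs(t, y):
    x, v = y                               # column advance [m], velocity [m/s]
    p_air = p0 * (La0 / (La0 - x))**n      # polytropic compression of pocket
    fric = f * (L0 + x) / D * rho * v * abs(v) / 2
    dvdt = (pres - p_air - fric) / (rho * (L0 + x))   # column momentum balance
    return [v, dvdt]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0], max_step=1e-3)
p_max = p0 * (La0 / (La0 - sol.y[0])).max()**n
print(f"max air-pocket pressure ≈ {p_max/1e5:.2f} bar")
```

The elastic model of the paper additionally resolves pressure waves along the pipe via the method of characteristics, which the rigid-column simplification cannot capture.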
Abstract:
Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. Sometimes, however, these fluctuations become determinant, especially when unforeseen large drops in asset prices are observed, which can result in huge losses or even market crashes. The evidence shows that such events happen far more often than would be expected under the generalized assumption of normally distributed financial returns, so it is crucial to model the distribution tails properly in order to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index returns (DJI, FTSE 100 and NIKKEI 225), representing three important financial areas of the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations in out-of-sample estimation (in-sample, this holds for the right tail of the return distribution).
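A minimal sketch of the two-step McNeil-Frey procedure using the `arch` and `scipy` packages; synthetic heavy-tailed returns stand in for the DJI/FTSE/NIKKEI series, and the threshold and confidence level are placeholder choices:

```python
# Minimal sketch of the McNeil-Frey conditional EVT estimator:
# 1) fit AR(1)-GARCH(1,1), 2) fit a GPD to the standardized-residual tail,
# 3) combine into a one-step-ahead conditional quantile (VaR) forecast.
import numpy as np
import pandas as pd
from scipy.stats import genpareto
from arch import arch_model

rng = np.random.default_rng(1)
returns = pd.Series(rng.standard_t(df=5, size=2000))   # placeholder % returns

# Step 1: AR(1)-GARCH(1,1) filter.
am = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
z = (res.resid / res.conditional_volatility).dropna()  # standardized residuals

# Step 2: GPD fit to losses beyond a high threshold u (90th percentile here).
losses = -z.to_numpy()
u = np.quantile(losses, 0.90)
xi, _, beta = genpareto.fit(losses[losses > u] - u, floc=0)

# Step 3: tail quantile of Z via the GPD formula, scaled by the forecasts.
alpha = 0.99
n, n_u = len(losses), int((losses > u).sum())
z_q = u + beta / xi * ((n / n_u * (1 - alpha)) ** (-xi) - 1)

fc = res.forecast(horizon=1)
mu = fc.mean.iloc[-1, 0]
sigma = float(np.sqrt(fc.variance.iloc[-1, 0]))
print(f"one-day 99% VaR ≈ {-mu + sigma * z_q:.3f} (return units)")
```

The point of the two-step design is that the GPD is fitted to residuals that have already been de-volatilized, so the extreme-value tail estimate adapts to current market volatility through the GARCH forecast.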
Abstract:
The aim of this paper is to analyze the forecasting ability of the CARR model proposed by Chou (2005) using the S&P 500. We extend the data sample, allowing for the analysis of different stock market circumstances, and propose the use of various range estimators in order to analyze their forecasting performance. Our results show that two range-based models outperform the forecasting ability of the GARCH model: the Parkinson model is better for upward trends and for volatilities above and below the mean, while the CARR model is better for downward trends and mean volatilities.
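For context, the Parkinson estimator referred to above converts the daily high-low range into a variance estimate; a minimal sketch, with synthetic prices in place of the S&P 500 data:

```python
# Minimal sketch of the Parkinson range-based volatility estimator:
# sigma^2 = (ln(High/Low))^2 / (4 ln 2), averaged over days.
import numpy as np

rng = np.random.default_rng(2)
close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 250)))  # fake price path
high = close * np.exp(np.abs(rng.normal(0, 0.006, 250)))   # fake daily highs
low = close * np.exp(-np.abs(rng.normal(0, 0.006, 250)))   # fake daily lows

park_var = np.log(high / low) ** 2 / (4 * np.log(2))       # daily variance
sigma_annual = np.sqrt(park_var.mean() * 252)              # annualized vol
print(f"annualized Parkinson volatility ≈ {sigma_annual:.1%}")
```

Because the range uses intraday extremes rather than close-to-close moves alone, it extracts more information per day, which is what the range-based models exploit.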
Abstract:
Paper presented at the 8th European Conference on Knowledge Management, Barcelona, 6-7 September 2007. URL: http://www.academic-conferences.org/eckm/eckm2007/eckm07-home.htm
Abstract:
The rapid growth of genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animals (mice and rats) are ubiquitous in the modeling of human diseases and the testing of treatments, and imaging them in vivo is now routine. The use of PET in small animals allows each subject to serve as its own control, reducing inter-animal variability; this permits longitudinal studies on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution and image quantification could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development, and the evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation because they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement for a wide range of problems that cannot be addressed by experimental or analytical approaches.
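A toy illustration of the kind of random process such simulations sample: the fraction of back-to-back 511 keV annihilation photon pairs that escape a water phantom unscattered. The attenuation coefficient is an approximate textbook value and the geometry is invented; real simulators track full physics, geometry and detector response:

```python
# Toy Monte Carlo sketch: fraction of 511 keV annihilation photon pairs
# escaping a water cylinder without interacting. mu ~0.096/cm is an
# approximate value for water at 511 keV; geometry is illustrative.
import numpy as np

rng = np.random.default_rng(3)
MU = 0.096        # linear attenuation coefficient of water at 511 keV [1/cm]
R = 10.0          # phantom radius [cm]
N = 1_000_000     # annihilation events

# Simplification: all annihilations at the centre, so each of the two
# photons traverses R cm of water. Sample free paths from the exponential
# attenuation law; a pair counts only if both photons escape unscattered.
path1 = rng.exponential(1.0 / MU, N)
path2 = rng.exponential(1.0 / MU, N)
escaped = (path1 > R) & (path2 > R)

print(f"unscattered pair fraction ≈ {escaped.mean():.3f}")
print(f"analytic check exp(-2*mu*R) = {np.exp(-2 * MU * R):.3f}")
```

The agreement between the sampled fraction and the closed-form attenuation factor shows the method converging on a case with a known answer; the value of Monte Carlo lies in geometries and physics for which no such closed form exists.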