995 results for Graphics processing units
Abstract:
Several activities performed by dockers increase occupational exposure to several risk factors, one of them being the fungal burden carried by the load. In this study we aimed to characterize fungal contamination in a warehouse storing sugar cane unloaded from a ship, and also in the cabin of the crane that unloads the same sugar cane from the ship. Air samples were collected from the warehouse and from inside the crane cabin. An outdoor sample was also collected at each sampling site and used as reference. The sampling volume was selected according to the expected contamination, and air samples were collected by impaction at a flow rate of 140 L/min onto malt extract agar (MEA) supplemented with chloramphenicol (0.05%), using the Millipore Air Tester (Millipore). Surface samples from the warehouse were collected by swabbing the same indoor sites, using a 10 by 10 cm square stencil according to the International Standard ISO 18593 (2004). The obtained swabs were then plated onto MEA. All collected samples were incubated at 27ºC for 5 to 7 days. After laboratory processing and incubation, quantitative (colony-forming units - CFU/m3 and CFU/m2) and qualitative results were obtained, with identification of the isolated fungal species. Aspergillus fumigatus presented the highest fungal load, and the WHO guideline was exceeded at both indoor sampling sites. The results obtained in this study highlight the need to better characterize dockers' exposure burden, specifically regarding fungal contamination.
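The CFU/m3 figures reported above follow from normalising plate counts to the sampled air volume. A minimal sketch of that arithmetic in Python, assuming the standard conversion; only the 140 L/min flow rate comes from the abstract, and the colony count and sampling time below are made up for illustration:

```python
# Convert impaction-sampler colony counts to airborne concentration (CFU/m3).
# The flow rate (140 L/min) comes from the abstract; the colony count and
# sampling time below are hypothetical, purely for illustration.

FLOW_RATE_L_PER_MIN = 140.0

def cfu_per_m3(colonies: int, sampling_minutes: float) -> float:
    """CFU/m3 = colonies / sampled volume, with the volume in cubic metres."""
    sampled_litres = FLOW_RATE_L_PER_MIN * sampling_minutes
    sampled_m3 = sampled_litres / 1000.0  # 1 m3 = 1000 L
    return colonies / sampled_m3

# Example: 42 colonies on an MEA plate after sampling for 1 minute (140 L).
print(round(cfu_per_m3(42, 1.0)))  # -> 300 CFU/m3
```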
Abstract:
An electrocardiogram (ECG) monitoring system has to deal with several challenges related to noise sources. The main goal of this work was the study of adaptive signal processing algorithms for ECG noise reduction when applied to real signals. This document presents an adaptive filtering technique based on the Least Mean Square (LMS) algorithm to remove the artefacts caused by electromyography (EMG) and power-line noise in the ECG signal. Real noise signals were used in the experiments, mainly to observe the difference between real and simulated noise sources. Very good results were obtained, owing to the noise-removal capability of this technique.
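A minimal NumPy sketch of LMS-based adaptive noise cancellation as described above: a noise reference is adaptively filtered and subtracted from the noisy ECG, and the error signal approximates the clean ECG. The filter length, step size and synthetic signals are illustrative assumptions, not the parameters or data used in the study.

```python
import numpy as np

def lms_noise_cancel(primary, reference, n_taps=16, mu=0.01):
    """Adaptive noise cancellation with the LMS algorithm.

    primary   : ECG corrupted by noise (desired signal + interference)
    reference : noise reference correlated with the interference
    Returns the error signal e, which approximates the clean ECG.
    """
    w = np.zeros(n_taps)                     # adaptive filter weights
    e = np.zeros(len(primary))               # output (cleaned signal)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]    # most recent reference samples
        y = np.dot(w, x)                     # filter output (noise estimate)
        e[n] = primary[n] - y                # error = cleaned ECG sample
        w += 2 * mu * e[n] * x               # LMS weight update
    return e

# Toy usage: a synthetic "ECG" plus 50 Hz power-line interference.
fs = 500.0
t = np.arange(0, 2, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t)            # stand-in for the ECG
mains = 0.5 * np.sin(2 * np.pi * 50 * t)     # power-line noise
cleaned = lms_noise_cancel(ecg + mains, np.sin(2 * np.pi * 50 * t + 0.3))
```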
Abstract:
The impact of mycotoxins on human and animal health is well recognized. Aflatoxin B1 (AFB1) is by far the most prevalent and the most potent natural carcinogen and is usually the major aflatoxin produced by toxigenic fungal strains. Available data point to an increasing frequency of poultry-feed contamination by aflatoxins. Since aflatoxin residues may accumulate in body tissues, this represents a high risk to human health. Samples from commercial poultry birds have already presented detectable levels of aflatoxin in the liver. A descriptive study was developed to assess fungal contamination by species of the Aspergillus flavus complex in seven Portuguese poultry units. Air fungal contamination was studied by conventional and molecular methods. Air, litter and surface samples were collected. For the molecular methods, air samples of 300 L were collected using the Coriolis μ air sampler (Bertin Technologies) at a 300 L/min airflow rate. For the conventional methodologies, all collected samples were incubated at 27ºC for five to seven days. Through conventional methods, Aspergillus flavus was the third most frequently found fungal species (7%) in the 27 indoor air samples analysed and the most commonly isolated species (75%) in air samples containing only the Aspergillus genus...
Abstract:
Solvent extraction is considered a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions, as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet, and Pareto curves for different flow-sheets are compared. The positive weighted-sum approach combined with the sequential quadratic programming method is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the most efficient organic flow-rate.
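The weighted-sum strategy for tracing the Pareto set can be illustrated with SciPy's SLSQP solver (a sequential quadratic programming variant) on a toy two-objective problem standing in for the recovery–purity trade-off; the objective functions, bounds and decision variable below are invented for illustration and are not the mixer–settler model of the study.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-ins for the two conflicting criteria (recovery vs. purity)
# as functions of a single decision variable x (e.g. a normalised
# residence time). These expressions are illustrative only.
def recovery(x):
    return 1.0 - np.exp(-3.0 * x[0])        # increases with x

def purity(x):
    return 1.0 - 0.6 * x[0] ** 2            # decreases with x

def pareto_front(n_points=11):
    """Trace an approximate Pareto set with the positive weighted-sum method."""
    front = []
    for w in np.linspace(0.0, 1.0, n_points):
        # Maximise w*recovery + (1-w)*purity  <=>  minimise the negative.
        obj = lambda x: -(w * recovery(x) + (1.0 - w) * purity(x))
        res = minimize(obj, x0=[0.5], bounds=[(0.0, 1.0)], method="SLSQP")
        front.append((recovery(res.x), purity(res.x)))
    return front

for r, p in pareto_front(5):
    print(f"recovery={r:.3f}  purity={p:.3f}")
```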
Abstract:
Objectives - This study intended to characterize work-environment contamination by particles in 2 waste-sorting plants. Material and Methods - Particles were measured with portable direct-reading equipment. Besides mass concentration in different size fractions, data on particle number concentration were also obtained. Results - Both sorting units showed the same distribution for the 2 exposure metrics: particulate matter 5 (PM5) and particulate matter 10 (PM10) reached the highest levels, and 0.3 μm was the fraction with the highest number of particles. Unit B showed higher (p < 0.05) levels for both exposure metrics; for instance, in unit B the PM10 level was 9-fold higher than in unit A. In unit A, particulate matter values obtained in pre-sorting and in the sequential sorting cabinet were higher when the ventilation was not working. Conclusions - Workers from both waste-sorting plants are exposed to particles. Particle counting provided additional information of extreme value for analyzing the health effects of particles, since higher particle number concentrations were obtained in the smallest fraction.
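The between-unit comparison can be sketched as below; the abstract does not name the statistical test, so the nonparametric Mann–Whitney comparison and the synthetic PM10 readings are assumptions made purely for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Synthetic PM10 mass-concentration readings (mg/m3) for the two sorting
# units; the values are invented and the choice of test is an assumption,
# since the abstract only reports that unit B was significantly higher (p < 0.05).
rng = np.random.default_rng(0)
pm10_unit_a = rng.normal(loc=0.5, scale=0.1, size=30)
pm10_unit_b = rng.normal(loc=4.5, scale=0.9, size=30)   # roughly 9-fold higher

stat, p_value = mannwhitneyu(pm10_unit_b, pm10_unit_a, alternative="greater")
print(f"U={stat:.1f}, p={p_value:.4f}")   # p < 0.05 -> unit B significantly higher
```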
Abstract:
OBJECTIVE: To describe the implementation and the effects of directly observed treatment, short-course (DOTS) in primary health care units. METHODS: Interviews were held with the staff of nine municipal health care units (MHU) that provided DOTS in Rio de Janeiro City, Southeastern Brazil, in 2004-2005. A dataset with records of all tuberculosis treatments started in 2004 in all municipal health care units was collected. Bivariate analyses and a multinomial model were applied to identify associations between treatment outcomes and demographic and treatment-process variables, including being on DOTS or on self-administered therapy (SAT). RESULTS: Of 4,598 tuberculosis cases treated in public health units administered by the municipality, 1,118 (24.3%) were on DOTS and 3,480 (75.7%) on SAT. The odds of DOTS were higher among patients aged under 50 years and those with tuberculosis relapse or a prior history of default or treatment failure. The odds of death were 52.0% higher among patients on DOTS than on SAT. The DOTS modality including community health workers (CHWs) showed the highest treatment success rate. A reduction of 21.0% was observed in the odds of default (vs. cure) among patients on DOTS compared with patients on SAT, and a reduction of 64.0% among patients on DOTS with CHWs compared with those without CHWs. CONCLUSIONS: Patients with a "low compliance profile" were more likely to be included in DOTS. This strategy improves the quality of care provided to tuberculosis patients, although the proposed goals were not achieved.
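The multinomial outcome model can be sketched with statsmodels; the synthetic data, variable names and coefficients below are illustrative stand-ins, not the study's dataset or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic data standing in for the study's variables: outcome
# (0 = cure, 1 = default, 2 = death), DOTS vs. SAT, and age under 50.
rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "dots": rng.integers(0, 2, n),          # 1 = DOTS, 0 = SAT
    "age_under_50": rng.integers(0, 2, n),
})
# Invented outcome probabilities, just so the model has something to fit.
logits = np.column_stack([
    np.zeros(n),                            # cure (reference category)
    -1.0 - 0.2 * df["dots"],                # default
    -2.0 + 0.4 * df["dots"],                # death
])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
df["outcome"] = [rng.choice(3, p=p) for p in probs]

X = sm.add_constant(df[["dots", "age_under_50"]])
model = sm.MNLogit(df["outcome"], X).fit(disp=False)
print(np.exp(model.params))   # odds ratios vs. the 'cure' reference category
```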
Abstract:
Background: Temporal lobe epilepsy (TLE) is a neurological disorder that directly affects cortical areas responsible for auditory processing. The resulting abnormalities can be assessed using event-related potentials (ERPs), which have high temporal resolution. However, little is known about TLE in terms of dysfunction of early sensory memory encoding or possible correlations between EEGs, linguistic deficits, and seizures. Mismatch negativity (MMN) is an ERP component, elicited by introducing a deviant stimulus while the subject is attending to a repetitive behavioural task, that reflects pre-attentive sensory memory function, neuronal auditory discrimination and perceptual accuracy. Hypothesis: We propose an MMN protocol for future clinical application and research based on the hypothesis that children with TLE may have abnormal MMN for speech and non-speech stimuli. The MMN can be elicited with a passive auditory oddball paradigm, and the abnormalities might be associated with the location and frequency of epileptic seizures. Significance: The suggested protocol might contribute to a better understanding of the neuropsychophysiological basis of MMN. We suggest that in TLE central sound representation may be decreased for speech and non-speech stimuli. Discussion: MMN arises as a difference in the responses to speech and non-speech stimuli across electrode sites. TLE in childhood might be a good model for studying topographic and functional auditory processing and its neurodevelopment, pointing to MMN as a possible clinical tool for prognosis, evaluation, follow-up, and rehabilitation in TLE.
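MMN is conventionally obtained as the deviant-minus-standard difference wave. A minimal NumPy sketch with synthetic epochs and an assumed 100-250 ms search window (the actual stimuli, electrodes and analysis windows would be specified by the proposed protocol):

```python
import numpy as np

fs = 500                                   # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.5, 1 / fs)           # epoch from -100 ms to +500 ms

# Synthetic single-trial epochs (trials x samples) for standard and deviant
# stimuli; in practice these would come from the recorded EEG.
rng = np.random.default_rng(1)
standard = rng.normal(0, 1, (200, t.size))
deviant = rng.normal(0, 1, (60, t.size))
deviant += -2.0 * np.exp(-((t - 0.17) ** 2) / 0.002)   # injected MMN-like dip

erp_standard = standard.mean(axis=0)       # average ERP to standards
erp_deviant = deviant.mean(axis=0)         # average ERP to deviants
mmn = erp_deviant - erp_standard           # difference wave

window = (t >= 0.10) & (t <= 0.25)         # typical MMN latency range, assumed
peak_idx = np.argmin(mmn[window])          # MMN is a negative deflection
peak_latency_ms = t[window][peak_idx] * 1000
print(f"MMN peak: {mmn[window][peak_idx]:.2f} uV at {peak_latency_ms:.0f} ms")
```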
Abstract:
TLE in infancy has been the subject of varied research. Topographical and structural evidence coincides with the neuronal systems responsible for the most specialized and complex auditory processing. Recent studies have shown the need for hemispheric asymmetry to optimize central auditory processing (CAP) and the acquisition and learning of a language system. A new functional research paradigm is required to study mental processes using methods that can analyse cognitive-sensory information processed in very short periods of time (milliseconds), such as event-related potentials (ERPs). Thus, in this article we hypothesize that TLE in infancy could be a good model for the topographic and functional study of CAP and its development, contributing to a better understanding of the learning difficulties that children with this neurological disorder experience.
Abstract:
Alheiras are traditional smoked, fermented meat sausages produced in Portugal, with an undeniable cultural and gastronomic legacy. In this study we assessed the nutritional value of this product, as well as the influence of different types of thermal processing. Alheiras from Mirandela were submitted to six different procedures: microwave, skillet, oven, charcoal grill, electric fryer and electric grill. Protein, fat, carbohydrate, mineral, NaCl and cholesterol contents, as well as the fatty acid profile, were evaluated. The results show that alheiras are not hypercaloric but are an unbalanced foodstuff (high levels of proteins and lipids), and that the type of processing has a major impact on their nutritional value. Charcoal grilling is the healthiest option: less fat (12.5 g/100 g) and cholesterol (29.3 mg/100 g), corresponding to a lower caloric intake (231.8 kcal, 13% less than the raw product). Conversely, fried alheiras presented the worst nutritional profile, with the highest levels of fat (18.1 g/100 g) and cholesterol (76.0 mg/100 g).
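The caloric figures above are consistent with the usual Atwater conversion factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat); whether the study used exactly these factors is an assumption. A small sketch of that arithmetic, with the charcoal-grill fat value from the abstract and hypothetical protein and carbohydrate contents:

```python
# Energy estimate per 100 g from macronutrients using the standard Atwater
# factors (4 kcal/g protein, 4 kcal/g carbohydrate, 9 kcal/g fat). Whether
# the study used exactly these factors is an assumption; the protein and
# carbohydrate figures below are hypothetical.

ATWATER = {"protein": 4.0, "carbohydrate": 4.0, "fat": 9.0}

def kcal_per_100g(protein_g, carbohydrate_g, fat_g):
    return (ATWATER["protein"] * protein_g
            + ATWATER["carbohydrate"] * carbohydrate_g
            + ATWATER["fat"] * fat_g)

# Fat content for the charcoal-grilled sample comes from the abstract
# (12.5 g/100 g); protein and carbohydrate values here are placeholders.
print(kcal_per_100g(protein_g=15.0, carbohydrate_g=14.0, fat_g=12.5))  # -> 228.5
```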
Abstract:
The main purpose of this work is to present and interpret the changes in structure and physical properties of tantalum oxynitride (TaNxOy) thin films, produced by dc reactive magnetron sputtering, as the processing parameters are varied. A set of TaNxOy films was prepared by varying the flow rate of the reactive gases, using a N2/O2 gas mixture with a concentration ratio of 17:3. The films obtained by this process exhibited significant differences. The measured composition and the interpretation of X-ray diffraction results show that, depending on the partial pressure of the reactive gases, the films are: essentially dark grey and metallic, when the atomic ratio (N + O)/Ta < 0.1, evidencing a tetragonal β-Ta structure; grey-brownish, when 0.1 < (N + O)/Ta < 1, exhibiting a face-centred cubic (fcc) TaN-like structure; and transparent oxide-type, when (N + O)/Ta > 1, evidencing the existence of Ta2O5, but with an amorphous structure. These transparent films exhibit refractive indexes in the visible region that are always higher than 2.0. The wear resistance of the films is relatively good; the best behaviour was obtained for the films with (N + O)/Ta ≈ 0.5 and (N + O)/Ta ≈ 1.3.
Abstract:
Prostate cancer (PCa) is one of the most incident malignancies worldwide. Although efficient therapy is available for early-stage PCa, treatment of advanced disease is mainly ineffective and remains a clinical challenge. microRNA (miRNA) dysregulation is associated with PCa development and progression. In fact, several studies have reported a widespread downregulation of miRNAs in PCa, which highlights the importance of studying compounds capable of restoring the global miRNA expression. The main aim of this study was to define the usefulness of enoxacin as an anti-tumoral agent in PCa, due to its ability to induce miRNA biogenesis in a TRBP-mediated manner. Using a panel of five PCa cell lines, we observed that all of them were wild type for the TARBP2 gene and expressed TRBP protein. Furthermore, primary prostate carcinomas displayed normal levels of TRBP protein. Remarkably, enoxacin was able to decrease cell viability, induce apoptosis, cause cell cycle arrest, and inhibit the invasiveness of cell lines. Enoxacin was also effective in restoring the global expression of miRNAs. This study is the first to show that PCa cells are highly responsive to the anti-tumoral effects of enoxacin. Therefore, enoxacin constitutes a promising therapeutic agent for PCa.
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation dose applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the workflow used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used are shown, and the epidemiology, imaging, risk factors and prognosis of the disease are described. The software steps and the values that allow the risk of developing CAD to be predicted are presented. A 64-row multidetector dual-source CT scanner and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced, and two measurements were obtained in each of the three experimental protocols (64, 128, 256 mAs). Considerable changes appeared between the CS values as the protocol varied. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
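For context, the Agatston method scores each calcified lesion by its area multiplied by a weight that depends on the lesion's peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above) and sums over all lesions. The sketch below illustrates only this scoring rule with invented lesion data; clinical software applies further slice-thickness and minimum-area criteria.

```python
def agatston_weight(peak_hu: float) -> int:
    """Density weighting factor used by the Agatston calcium score."""
    if peak_hu < 130:
        return 0            # below the calcium threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum of area (mm2) x density weight over all detected lesions."""
    return sum(area_mm2 * agatston_weight(peak_hu) for area_mm2, peak_hu in lesions)

# Hypothetical lesions: (area in mm2, peak attenuation in HU).
lesions = [(4.2, 180), (10.5, 320), (2.0, 450)]
print(agatston_score(lesions))   # 4.2*1 + 10.5*3 + 2.0*4 = 43.7
```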
Abstract:
Adsorption-based separation operations have gained importance in recent years, especially with the development of techniques that simulate moving beds in columns, such as Simulated Moving Bed (SMB) chromatography. This technology was developed in the early 1960s as an alternative to the True Moving Bed (TMB) process, in order to solve several of the problems associated with the movement of the solid phase that are common in counter-current chromatographic separation methods. SMB technology has been widely used at industrial scale, mainly in the petrochemical and sugar-processing industries and, more recently, in the pharmaceutical and fine-chemistry industries. In recent decades, the growing interest in SMB technology, driven by its high yield and efficient solvent consumption, has led to the formulation of different, so-called non-conventional, operating modes that provide more flexible units, able to increase separation performance and further broaden the range of application of the technology. One of the most studied and implemented examples is the Varicol process, in which the ports are moved asynchronously. In this context, the present work focuses on the simulation, analysis and evaluation of SMB technology for two distinct separation cases: the separation of a fructose-glucose mixture and the separation of a racemic mixture of pindolol. For both cases, two operating modes of the SMB unit were considered and compared: the conventional mode and the Varicol mode. Both separation cases were implemented and simulated in the Aspen Chromatography process simulator, using two distinct SMB units (conventional SMB and Varicol SMB). For the separation of the fructose-glucose mixture, two approaches were used to model the conventional SMB unit: a true moving bed (TMB model) and a real simulated moving bed (SMB model). For the separation of the racemic pindolol mixture, only the SMB model was considered. In the case of the fructose-glucose separation, both the conventional and the Varicol SMB units were further optimized with the aim of increasing their productivities. The optimization was carried out by applying a design-of-experiments procedure, in which the experiments were planned, conducted and subsequently analysed through analysis of variance (ANOVA). The statistical analysis made it possible to select the levels of the control factors that give the best results for both SMB units.
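The design-of-experiments and ANOVA step can be sketched with statsmodels on a synthetic two-level factorial dataset; the factor names (switching time, feed flow rate), the response and the coefficients are hypothetical stand-ins for the SMB control factors actually studied.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical 2^2 factorial design with replicates: two SMB control factors
# (coded -1/+1) and a productivity response. The factor names and numbers
# are illustrative, not the study's experimental plan.
rng = np.random.default_rng(7)
levels = [-1, 1]
rows = []
for switch_time in levels:
    for feed_flow in levels:
        for _ in range(3):                       # 3 replicates per run
            productivity = (10.0 + 1.5 * switch_time + 2.2 * feed_flow
                            + 0.4 * switch_time * feed_flow
                            + rng.normal(0, 0.3))
            rows.append((switch_time, feed_flow, productivity))
df = pd.DataFrame(rows, columns=["switch_time", "feed_flow", "productivity"])

model = smf.ols("productivity ~ switch_time * feed_flow", data=df).fit()
print(anova_lm(model))   # which factors significantly affect productivity
```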
Abstract:
Estuaries are perhaps the most threatened environments of the coastal fringe; the coincidence of high natural value and attractiveness for human use has led to conflicts between conservation and development. These conflicts occur in the Sado Estuary, since it is located near the industrialised zone of the Setúbal Peninsula while, at the same time, a great part of the estuary is classified as a Natural Reserve due to its high biodiversity. These facts led to the need to implement a model of environmental management and quality assessment, based on methodologies that enable the assessment of the Sado Estuary's quality and the evaluation of the human pressures on the estuary. These methodologies are based on the indicators that best depict the state of the environment, not necessarily on everything that could be measured or analysed. Sediments have always been considered an important temporary source of some compounds, a sink for other types of material, and an interface where a great diversity of biogeochemical transformations occur; for all these reasons they are of great importance in the formulation of a coastal management system. Many authors have used sediments to monitor aquatic contamination, with great advantages compared to traditional water-column sampling. The main objective of this thesis was to develop an estuary environmental management framework applied to the Sado Estuary using the DPSIR model (EMMSado), including data collection, data processing and data analysis. The supporting infrastructure of EMMSado was a set of spatially contiguous and homogeneous regions of sediment structure (management units). The environmental quality of the estuary was assessed through sediment quality assessment and integrated, at a preliminary stage, with the human pressure for development. Besides the advantages explained earlier, studying the quality of the estuary mainly on the basis of indicators and indexes of the sediment compartment also makes this methodology easier, faster and less demanding in human and financial resources, which are essential factors for the efficient environmental management of coastal areas. Data management, visualization, processing and analysis were achieved through the combined use of indicators and indices, sampling optimization techniques, Geographical Information Systems, remote sensing, statistics for spatial data, Global Positioning Systems and best expert judgment. As a global conclusion, of the nineteen management units delineated and analyzed, three showed no ecological risk (18.5% of the study area). The areas of most concern (5.6% of the study area) are located in the North Channel and are under strong human pressure, mainly due to industrial activities. These areas also have low hydrodynamics and are thus associated with high levels of deposition. In particular, the areas near the Lisnave and Eurominas industries can also accumulate contamination coming from the Águas de Moura Channel, since particles coming from that channel can settle in this area due to the residual flow. In these areas the contaminants of concern, among those analyzed, are the heavy metals and metalloids (Cd, Cu, Zn and As exceeded the PEL guidelines) and the pesticides BHC isomers, heptachlor, isodrin, DDT and its metabolites, endosulfan and endrin. In the remaining management units (76% of the study area) there is a moderate potential for the occurrence of adverse ecological effects, and in some of these areas no stress agents could be identified.
This emphasizes the need for further research, since unmeasured chemicals may be causing or contributing to these adverse effects. Special attention must be paid to the units with a moderate potential for the occurrence of adverse ecological effects located inside the natural reserve. Non-point-source pollution coming from agriculture and aquaculture activities also seems to contribute an important pollution load to the estuary, entering through the Águas de Moura Channel. This pressure is expressed as a moderate potential for ecological risk in the areas near the entrance of this channel. Pressures may also come from the Alcácer Channel, although they were not quantified in this study. The management framework presented here, including all the methodological tools, may be applied and tested in other estuarine ecosystems, which will also allow comparisons between estuarine ecosystems in other parts of the globe.
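The sediment-quality screening step, flagging units where measured concentrations exceed Probable Effect Level (PEL) guidelines, can be sketched as below; both the guideline values and the measured concentrations are placeholders to be replaced with the values from the applicable guideline tables and the actual monitoring data.

```python
# Screen sediment contaminant concentrations (mg/kg dry weight) against
# Probable Effect Level (PEL) guidelines. Both dictionaries hold placeholder
# numbers for illustration; real values must come from the guideline tables
# and the monitoring campaign.
PEL_GUIDELINES = {"Cd": 4.2, "Cu": 108.0, "Zn": 271.0, "As": 41.6}  # placeholders

measured = {"Cd": 5.1, "Cu": 95.0, "Zn": 310.0, "As": 60.0}          # placeholders

def pel_exceedances(concentrations, guidelines):
    """Return the contaminants whose concentration exceeds the PEL."""
    return {m: c for m, c in concentrations.items()
            if m in guidelines and c > guidelines[m]}

print(pel_exceedances(measured, PEL_GUIDELINES))   # -> {'Cd': 5.1, 'Zn': 310.0, 'As': 60.0}
```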
Abstract:
The automatic acquisition of lexical associations from corpora is a crucial issue for Natural Language Processing. A lexical association is a recurrent combination of words that co-occur more often than expected by chance in a given domain. In fact, lexical associations define linguistic phenomena such as idioms, collocations and compound words. Because the sense of a lexical association is not compositional, their identification is fundamental for analysis and synthesis that take into account all the subtleties of the language. In this report, we introduce a new statistically based architecture that extracts contiguous and non-contiguous lexical associations from naturally occurring texts. For that purpose, three new concepts have been defined: the positional N-gram models, the Mutual Expectation and the GenLocalMaxs algorithm. The input text is first transformed into a set of positional N-grams, i.e. ordered vectors of simple lexical units. Then, an association measure, the Mutual Expectation, evaluates the degree of cohesion of each positional N-gram, and the GenLocalMaxs algorithm retrieves the candidate lexical associations based on the identification of local maximum values of the Mutual Expectation. Considerable effort has also been devoted to evaluating our methodology. For that purpose, we have proposed the normalisation of five well-known association measures and shown that both the Mutual Expectation and the GenLocalMaxs algorithm yield significant improvements over existing methodologies.
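A rough sketch of the extraction pipeline described above: build positional n-grams (contiguous and gapped), score them with an association measure, and keep candidates whose score is a local maximum relative to their sub-grams. The scoring function below is a plain frequency-based cohesion ratio, not the Mutual Expectation defined in the report, and the selection rule is a simplified variant of the GenLocalMaxs idea.

```python
from collections import Counter
from itertools import combinations

def positional_ngrams(tokens, max_n=3, window=4):
    """Contiguous and gapped word combinations within a sliding window,
    keeping the words' relative positions (a rough stand-in for the
    positional N-gram models described in the report)."""
    grams = Counter()
    for i in range(len(tokens)):
        span = tokens[i:i + window]
        for n in range(2, max_n + 1):
            for offsets in combinations(range(1, len(span)), n - 1):
                gram = ((0, span[0]),) + tuple((o, span[o]) for o in offsets)
                grams[gram] += 1
    return grams

def cohesion(gram, gram_counts, unigram_counts, total):
    """Simple frequency-based association score (NOT Mutual Expectation):
    how much more often the words co-occur than unigram frequencies predict."""
    expected = 1.0
    for _, word in gram:
        expected *= unigram_counts[word] / total
    observed = gram_counts[gram] / total
    return observed / expected

def local_maxima(scores, top=10):
    """Keep n-grams whose score exceeds every strict sub-gram's score;
    a simplified version of the local-maximum selection idea."""
    def subgrams(gram):
        if len(gram) <= 2:
            return []
        subs = []
        for combo in combinations(gram, len(gram) - 1):
            base = combo[0][0]   # re-anchor so the sub-gram starts at offset 0
            subs.append(tuple((off - base, word) for off, word in combo))
        return subs
    kept = [g for g, s in scores.items()
            if all(scores.get(sub, 0.0) < s for sub in subgrams(g))]
    return sorted(kept, key=scores.get, reverse=True)[:top]

# Toy usage on a tiny text.
text = "natural language processing deals with natural language data".split()
grams = positional_ngrams(text)
unigrams = Counter(text)
scores = {g: cohesion(g, grams, unigrams, len(text)) for g in grams}
for gram in local_maxima(scores, top=5):
    print(gram, round(scores[gram], 2))
```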