44 results for Just-too-different intuition
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Myocardial perfusion gated SPECT (MP-gated-SPECT) imaging often shows radiotracer uptake in abdominal organs. This accumulation frequently interferes with the qualitative and quantitative assessment of the infero-septal region of the myocardium. The objective of this study is to evaluate the effect of ingesting food with different fat content on reducing extra-myocardial uptake and improving MP-gated-SPECT image quality. In this study, 150 patients (65 ± 18 years) referred for MP-gated-SPECT underwent a 1-day protocol including imaging after stress (physical or pharmacological) and at rest. All patients gave written informed consent. Patients were subdivided into five groups: GI, GII, GIII, GIV and GV. In the first four groups, patients ate two chocolate bars with different fat content; patients in GV, the control group (CG), had just water. Uptake indices (UI) of myocardium (M)/liver (L) and M/stomach-proximal bowel (S) revealed lower M/S UI at rest in all groups. Both the stress and rest studies indicate that patients who ate chocolate, whatever its fat content, showed better M/L UI than the CG. The M/L and M/S UI of the groups imaged under physical stress are clearly superior to those of the groups imaged under pharmacological stress; these differences are significant only in patients who ate high-fat chocolate or drank water. The analysis of all stress studies together (GI, GII, GIII and GIV) in comparison with the CG shows higher mean ranks of M/L UI for those who ate high-fat chocolate. After pharmacological stress, the mean ranks of M/L UI were higher for patients who ate high- or low-fat chocolate. In conclusion, eating food with fat content after radiotracer injection increases the M/L UI in both stress and rest MP-gated-SPECT studies. It is therefore recommended that patients eat a chocolate bar after radiotracer injection and before image acquisition.
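A minimal sketch of how such an uptake index is typically computed (the abstract does not spell out the computation; the function and ROI names here are hypothetical):

```python
import numpy as np

def uptake_index(volume, myocardium_mask, organ_mask):
    """Hypothetical uptake index: ratio of mean counts in a myocardium ROI
    to mean counts in an abdominal-organ ROI (liver, or stomach-proximal
    bowel). A higher value means less extra-myocardial activity relative
    to the heart."""
    return volume[myocardium_mask].mean() / volume[organ_mask].mean()

# Illustrative usage on a reconstructed SPECT volume (names assumed):
# ui_ml = uptake_index(spect_volume, myo_mask, liver_mask)    # M/L
# ui_ms = uptake_index(spect_volume, myo_mask, stomach_mask)  # M/S
```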
Abstract:
Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages to using these nano-scale materials, questions related to their impact on the environment and human health must be addressed too, so that potential risks can be limited at early stages of development. At this time, the occupational health risks associated with manufacturing and using nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, since nanoparticles are characterized by a very large surface area, the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, assessing human exposure based on the mass concentration of particles, an approach widely adopted for particles over 1 μm, would not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used as the metric for nanoparticle exposure and dosing, and assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure. It has been shown that surface area plays an important role in the toxicity of nanoparticles and is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
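The mass-versus-surface-area argument can be made concrete with a back-of-the-envelope calculation; this is an illustrative sketch assuming spherical particles of a silica-like density, not a method from the study:

```python
def total_surface_area_per_gram(diameter_m, density_kg_m3=2200.0):
    """Total surface area (m^2) of 1 g of material divided into equal spheres.

    For a sphere, area/volume = 6/d, so area/mass = 6 / (density * d):
    halving the diameter doubles the surface area carried by the same mass.
    """
    mass_kg = 1e-3
    return 6.0 * mass_kg / (density_kg_m3 * diameter_m)

# Same 1 g of (assumed) silica-like material, density 2200 kg/m^3:
for d in (1e-6, 100e-9, 10e-9):   # 1 um, 100 nm and 10 nm particles
    print(f"d = {d*1e9:7.1f} nm -> {total_surface_area_per_gram(d):8.2f} m^2/g")
# 1 um particles: ~2.7 m^2/g; 10 nm particles: ~273 m^2/g, a 100x increase,
# which is why mass concentration alone understates nanoparticle exposure.
```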
Abstract:
The rapid proliferation of mobile communication systems has caused increased concern about the interaction between the human body and the antennas of mobile handsets. To study the problem, a multiband antenna was designed, fabricated and measured to operate over two frequency sub-bands, 900 and 1800 MHz. The same antenna was then simulated in the presence of a human head model to analyze the head's influence. First, the influence of the human head on the radiation efficiency of the antenna was investigated as a function of the distance between the head and the antenna and of the inclination of the antenna. Furthermore, the relative amount of electromagnetic power absorbed in the head was obtained.
Abstract:
This work concerns a real property-investment project: the construction and sale of two semi-detached houses intended for housing. The study allows the developer to assess the project's economic interest, characterize opportunities and identify risk factors, enabling decisions based on objective, well-founded economic studies rather than on intuition alone. After surveying the state of knowledge on the topic, the study began by characterizing the project and carrying out a preliminary assessment of its economic feasibility, using simplified methods to obtain the necessary analysis parameters, such as the land and construction costs, the duration of the works, the presumable transaction value (PVT) of the property, and the distribution of costs and revenues over time. A discounted-cash-flow analysis is then performed to determine the project's profitability through the decision parameters NPV (VAL) and IRR (TIR). Once the project is concluded to be economically viable, construction starts and the actual values of the various analysis parameters are determined, leaving only the PVT and the time needed to sell the property as estimated variables. The importance of the management, coordination and supervision of the works is also addressed. With the actual values obtained, several scenarios are drawn up, analyzing the use of borrowed capital, variations in the PVT and in the time needed to sell the property, and the possibility of renting the property before a later sale. Analyzing the project under these scenarios yields profitability measures that can be compared across scenarios, and conclusions are drawn from the results. For a better understanding of the results, the real-estate crisis felt in Portugal and the possibility of using property exchange to facilitate real-estate deals are also discussed. Finally, recommendations and proposals are made for further studies that may be relevant to the topic and give continuity to this work.
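For reference, the two decision parameters rest on standard definitions (a textbook sketch, not specific to this dissertation):

```latex
% Net present value (VAL/NPV) of cash flows CF_t discounted at rate r over T periods:
\[
  \mathrm{NPV}(r) \;=\; \sum_{t=0}^{T} \frac{CF_t}{(1+r)^{t}}
\]
% The internal rate of return (TIR/IRR) is the rate r^* with NPV(r^*) = 0.
% The project is viable when NPV > 0 at the required discount rate,
% equivalently when the IRR exceeds that rate.
```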
Abstract:
This work uses a stacked p-i-n structure based on a hydrogenated amorphous silicon-carbon alloy (a-Si:H and/or a-SiC:H), which acts as an optical filter in the visible range of the electromagnetic spectrum. The device is intended to demultiplex optical signals, and the aim of this thesis is to implement an algorithm that autonomously recognizes the information transmitted on each channel by reading the photocurrent delivered by the device. The topic follows from the conclusions of previous work in which this device, and others of identical configuration, were analyzed with a view to their use in implementing WDM technology. Three transmission channels were used (blue, 470 nm; green, 525 nm; red, 626 nm), together with several types of background radiation. Spectral-response and photocurrent time-response measurements were performed under different experimental conditions: the channel wavelength and the wavelength of the applied background were varied while the channel intensity and the transmission frequency were kept constant. The results made it possible to assess the influence of the background radiation and of the bias voltage applied to the device, using different data sequences transmitted on the various channels. Under reverse bias, red background radiation was found to amplify the photocurrent of the blue channel, and blue background radiation to amplify the red and green channels. Under forward bias, only blue background radiation amplifies the photocurrent of the red channel, while under both biases green background radiation has little influence on the other channels. Two algorithms were implemented to recognize the information on each channel. The first approach used the photocurrent measurements generated by the device under reverse and forward bias; by comparing the two measurements, an algorithm was developed and tested that recognizes the individual channels. In a second approach, channel recognition was performed with background radiation applied, combining the photocurrent measured under reverse bias without background radiation with the photocurrent measured under reverse bias with background radiation. By comparing these two measurements, a second algorithm was developed and tested that recognizes the individual channels based on the application of background radiation.
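A minimal sketch of the idea behind the first recognition approach, framed as matching the paired photocurrent readings against the eight possible on/off channel combinations; the gain values and function names are hypothetical, not the device's actual calibration:

```python
import itertools

# Hypothetical calibrated photocurrent contributions (nA) of each channel
# when it is ON, under (reverse-bias, forward-bias) conditions; the
# asymmetry between the two bias conditions is what separates the channels.
GAIN = {
    "red":   (5.0, 2.0),
    "green": (3.0, 2.5),
    "blue":  (1.0, 3.5),
}
CHANNELS = ("red", "green", "blue")

def decode(i_reverse, i_forward):
    """Return the ON/OFF pattern whose predicted (reverse, forward)
    photocurrent pair is closest to the measured pair."""
    best, best_err = None, float("inf")
    for bits in itertools.product((0, 1), repeat=3):
        pred_r = sum(b * GAIN[c][0] for b, c in zip(bits, CHANNELS))
        pred_f = sum(b * GAIN[c][1] for b, c in zip(bits, CHANNELS))
        err = (pred_r - i_reverse) ** 2 + (pred_f - i_forward) ** 2
        if err < best_err:
            best, best_err = dict(zip(CHANNELS, bits)), err
    return best

# Illustrative usage: red and blue ON, green OFF -> (6.0, 5.5) nA measured.
print(decode(6.0, 5.5))   # {'red': 1, 'green': 0, 'blue': 1}
```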
Abstract:
We have studied, in particular under normality of the underlying random variables, the connections between different measures of risk, such as the standard deviation, the W-ruin probability and the p-V@R. We discuss conditions granting the equivalence of these measures with respect to risk-preference relations, as well as the equivalence of dominance and efficiency of risk-reward criteria involving these measures. We then apply these concepts to rigorously address the problem of finding the efficient set of de Finetti's variable quota-share proportional reinsurance.
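A sketch of why normality drives such equivalences, assuming X ~ N(μ, σ²) and writing z_p for the standard normal p-quantile (illustrative, not notation from the paper):

```latex
% Under the normality assumption X ~ N(mu, sigma^2), with z_p the standard
% normal p-quantile and Phi the standard normal CDF:
\[
  \mathrm{V@R}_p(X) \;=\; -\mu + z_{1-p}\,\sigma ,
  \qquad
  \Pr\,(X \le -W) \;=\; \Phi\!\left(\frac{-W-\mu}{\sigma}\right).
\]
% Both expressions are increasing in sigma for fixed mu, so a ranking of
% risks by (mu, sigma) already ranks them by p-V@R and by W-ruin
% probability, which is what makes such measures interchangeable in
% risk-reward efficiency arguments of this kind.
```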
Abstract:
As teachers, we are challenged every day to solve pedagogical problems, and we have to fight for our students' attention in a media-rich world. I will talk about how we use ICT in Initial Teacher Training and give you some insight into what we are doing. The most important benefit of using ICT in education is that it makes us reflect on our practice. There is no doubt that our classrooms need to be updated, but we need to be critical about every piece of hardware, software or service that we bring into them. It is not only because our budgets are short, but also because e-learning is primarily about learning, not technology. Therefore, we need to have the knowledge and skills required to act in different situations, and choose the best tool for the job. Not all subjects are suitable for e-learning, nor do all students have the skills to organize their own study time. Also, not all teachers want to spend time programming or learning about instructional design and metadata. The promised land of easy-to-use authoring tools (e.g. eXe and Reload) that would turn every teacher into a Learning Object author sharing LOs in repositories failed, as HyperCard, Toolbook and others failed before it. We need to know a little bit of many different technologies so we can mobilize this knowledge when a situation requires it: integrate e-learning technologies in the classroom, not a flipped classroom, just simple tools. Lecture capture, mobile phones and smartphones, pocket-size camcorders, VoIP, VLEs, live video broadcast, screen sharing, free services for collaborative work, tools to save, share and sync your files. Do not feel pressured to use everything, every time. Just because we have a whiteboard does not mean we have to make it the centre of the classroom. Start from where you are, with your preferred subject and the tools you master. Then go slowly and try some new tool in a non-formal situation with just one or two students. And you don't need to be alone: subscribe to a mailing list and share your thoughts with other teachers in a dedicated forum (even better if both are part of a community of practice) and share resources. We did that for music teachers and it was a success, reaching 1,000 members in two years. Just do it.
Abstract:
Formaldehyde (CH2O), the simplest and most reactive aldehyde, is a colorless, reactive and readily polymerizing gas at room temperature (National Toxicology Program [NTP]). It has a pungent, suffocating odor that most human subjects recognize at concentrations below 1 ppm. Aleksandr Butlerov synthesized the chemical in 1859, but it was August Wilhelm von Hofmann who, in 1867, identified it as the product formed by passing methanol and air over a heated platinum spiral. This method is still the basis for the industrial production of formaldehyde today, in which methanol is oxidized using a metal catalyst. By the early 20th century, with the explosion of knowledge in chemistry and physics, coupled with demands for more innovative synthetic products, the scene was set for the birth of a new material: plastics. According to the Report on Carcinogens, formaldehyde ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Annual formaldehyde production reaches 21 million tons worldwide, and production has increased in China, with 7.5 million tons produced in 2007. Given its economic importance and widespread use, many people are exposed to formaldehyde environmentally and/or occupationally. Commercially, formaldehyde is manufactured as an aqueous solution called formalin, usually containing 37% dissolved formaldehyde by weight. The chemical is present in all regions of the atmosphere, arising from the oxidation of biogenic and anthropogenic hydrocarbons. Formaldehyde concentration levels typically range from 2 to 45 ppbV (parts per billion by volume) in urban settings and are mainly governed by primary emissions and secondary formation.
Abstract:
We investigate the crust, upper mantle and mantle transition zone of the Cape Verde hotspot using seismic P and S receiver functions from several tens of local seismograph stations. We find a strong discontinuity at a depth of ~10 km underlain by a ~15-km-thick layer with a high (~1.9) Vp/Vs velocity ratio. We interpret this discontinuity and the underlying layer as the fossil Moho, inherited from the pre-hotspot era, and the plume-related magmatic underplate. Our uppermost-mantle models are very different from those previously obtained for this region: our S velocity is much lower and there are no indications of low densities. Contrary to previously published arguments for a standard transition-zone thickness, our data indicate that this thickness under the Cape Verde islands is up to ~30 km less than in the ambient mantle. This reduction is a combined effect of a depression of the 410-km discontinuity and an uplift of the 660-km discontinuity. The uplift contrasts with laboratory data, and with some seismic data, that suggest a negligible dependence of the depth of the 660-km discontinuity on temperature in hotspots. The large negative pressure-temperature slope suggested by our data implies that the 660-km discontinuity may resist passage of the plume. Our data also reveal, beneath the islands, a reduction of S velocity of a few percent between 470-km and 510-km depths. This low-velocity layer in the upper transition zone under the Cape Verde archipelago is very similar to those previously found under the Azores and a few other hotspots. The literature contains reports of a regional 520-km discontinuity whose impedance is too large to be explained by the known phase transitions; our observations suggest that the 520-km discontinuity may represent the base of the low-velocity layer in the transition zone.
Abstract:
Cork processing wastewater is a complex aqueous mixture of organic compounds extracted from cork planks during the boiling process. These compounds, such as polysaccharides and polyphenols, have different biodegradability rates, which depend not only on the nature of the compound but also on its size. The aim of this study is to determine the biochemical oxygen demands (BOD) and biodegradation rate constants (k) for cork wastewater fractions with different organic matter characteristics. These fractions were obtained using membrane separation processes, namely nanofiltration (NF) and ultrafiltration (UF); the molecular weight cut-offs (MWCO) of the membranes ranged from 0.125 to 91 kDa. The results showed that the biodegradation rate constant for the cork processing wastewater was around 0.3 d⁻¹, while the k values for the permeates varied between 0.27 and 0.72 d⁻¹, with the lower values observed for permeates generated by the membranes with higher MWCO and the higher values for permeates generated by the membranes with lower MWCO. These higher k values indicate that the biodegradable organic matter permeated by the membranes with tighter MWCO is more readily biodegraded.
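The rate constants reported above are conventionally attached to first-order BOD kinetics; a worked sketch under that standard assumption:

```latex
% First-order BOD model conventionally behind a rate constant k, where
% BOD_t is the oxygen demand exerted by time t and BOD_u the ultimate BOD:
\[
  \mathrm{BOD}_t \;=\; \mathrm{BOD}_u \left( 1 - e^{-kt} \right)
\]
% Example with the reported k ~ 0.3 d^{-1}: after 5 days,
% BOD_5 / BOD_u = 1 - e^{-0.3 x 5} ~ 0.78, i.e. about 78% of the ultimate
% demand is exerted; at k = 0.72 d^{-1} this rises to about 97%.
```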
Abstract:
Dissertation for the degree of Master in Electrical Engineering - Energy branch
Abstract:
This work investigates the impact of treating breast cancer using different radiation therapy (RT) techniques – forwardly-planned intensity-modulated (f-IMRT), inversely-planned IMRT and dynamic conformal arc (DCART) RT – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Seven left-sided breast cancer patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV and OAR, and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the dose distributions in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
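A minimal sketch of the dose-volume-histogram computation underlying these comparisons (not the iPlan implementation; array and variable names are hypothetical):

```python
import numpy as np

def cumulative_dvh(dose, structure_mask, bins=200):
    """Cumulative DVH: fraction of a structure's volume receiving at least
    each dose level.

    dose           -- 3-D array of voxel doses (Gy)
    structure_mask -- boolean array of the same shape selecting the
                      structure (e.g. PTV, ipsilateral lung, heart)
    """
    d = dose[structure_mask]                   # doses inside the structure
    levels = np.linspace(0.0, d.max(), bins)   # dose axis of the histogram
    # For each level, the fraction of voxels at or above that dose.
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Illustrative usage: PTV coverage at 95% of the prescription dose.
# levels, vf = cumulative_dvh(plan_dose, ptv_mask)
# v95 = vf[np.searchsorted(levels, 0.95 * prescription_dose)]
```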
Abstract:
Water covers over 70% of the Earth's surface and is vital for all known forms of life. Yet only 3% of the Earth's water is fresh water, and less than 0.3% of all freshwater is in rivers, lakes, reservoirs and the atmosphere. Rivers and lakes are nevertheless an important part of fresh surface water, amounting to about 89%. This Master's dissertation focuses on three types of water bodies – rivers, lakes and reservoirs – and their water quality issues in Asian countries. The surface water quality in a region is largely determined both by natural processes, such as climate or geographic conditions, and by anthropogenic influences, such as industrial and agricultural activities or land-use conversion. Water quality can be affected by pollutants discharged from a specific point through a sewer pipe, and also by extensive drainage from agricultural/urban areas within a basin. Hence, water pollutant sources can be divided into two categories: point source pollution and non-point source (NPS) pollution. Seasonal variations in precipitation and surface run-off have a strong effect on river discharge and on the concentration of pollutants in water bodies. For example, in the rainy season heavy and persistent rain washes off the ground, runoff flow increases and may carry various kinds of pollutants into the water bodies. In some cases, especially in confined water bodies, quality may be positively related to rainfall in the wet season, because this confined type of freshwater system allows high dilution of pollutants, decreasing their possible impacts. During the dry season, water quality is largely related to industrialization and urbanization pollution. The aim of this study is to identify the most common water quality problems in Asian countries and to enumerate and analyze the methodologies used for assessing the water quality conditions of both rivers and confined water bodies (lakes and reservoirs). Based on the evaluation of a sample of 57 papers, dated between 2000 and 2012, it was found that over the past decade the water quality of rivers, lakes and reservoirs in developing countries has been degraded. Water pollution and the destruction of aquatic ecosystems have caused massive damage to the functions and integrity of water resources. The most widespread NPS in Asian countries, and those with the greatest spatial impact, are urban runoff and agriculture; locally, mine waste runoff and rice paddies are serious NPS problems. The most relevant point pollution sources are effluents from factories, sewage treatment plants, and public or household facilities. The most used methodology was unquestionably monitoring, applied in 49 of the analyzed studies (86%); sometimes data from historical databases were used as well. Taking samples from the water body and then carrying out laboratory work (chemical analyses) is important because it gives an understanding of the water quality. Six papers (11%) used a method combining monitoring data and modeling, and another six (11%) applied only a model to estimate water quality. Modeling is a useful resource when the budget is limited, since some models are free to download and use. In particular, several of the models used come from the U.S.A.; they have their own purposes and features, meaning that careful application of the models to other countries and a critical discussion of the results are crucial.
Five papers (9%) used a method combining monitoring data and statistical analysis: when there is a huge data matrix, researchers need an efficient way to interpret the information, which statistics provides. Three papers (5%) combined monitoring data, statistical analysis and modeling. These different methods are all valuable for evaluating water quality. The evaluation of water quality was also made using types of sampling other than water itself, which likewise provide useful information for understanding the condition of the water body; these additional monitoring activities are air sampling, sediment sampling, phytoplankton sampling and aquatic-animal tissue sampling. Despite considerable progress in developing and applying control regulations to point and NPS pollution, the pollution status of rivers, lakes and reservoirs in Asian countries is not improving. This reflects the slow pace of investment in new infrastructure for pollution control and growing population pressures. Water laws or regulations and public involvement in their enforcement can play a constructive and indispensable role in environmental protection. In the near future, to protect water from further contamination, rapid action is needed to control the various kinds of effluents in each region. Environmental remediation and the treatment of industrial effluents and municipal wastewaters are essential, and it is also important to prevent the direct input of agricultural and mine-site runoff. Finally, stricter environmental regulation of water quality is required to support protection and management strategies. Further information could have been extracted from the sample of 57 papers; for instance, it would have been interesting to compare the concentration levels of some pollutants across the different Asian countries. However, the three-month duration limit of this study prevented further work from taking place. In spite of this, the study objectives were achieved: the work provides an overview of the most relevant water quality problems in rivers, lakes and reservoirs in Asian countries, and also lists and analyzes the most common methodologies.
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences - Specialty in Early Intervention
Abstract:
Purpose: Samples from different environmental sources were screened for the presence of Aspergillus, and the distribution of the different species complexes was determined in order to understand how that distribution differs among the several environmental sources and which species complexes are present in specific environmental settings. Methods: Four distinct environments (beaches, poultries, swineries and a hospital) were studied and analyzed to determine which Aspergillus complexes were present in each setting. After plate incubation and colony isolation, morphological identification was done using macro- and microscopic characteristics. The universal fungal primers ITS1 and ITS4 were used to amplify DNA from all Aspergillus isolates, which was sequenced for identification to species-complex level. SPSS v15.0 for Windows was used to perform the statistical analysis. Results: Thirty-nine isolates of Aspergillus were recovered from the sand beach and another thirty-nine from poultries, with 31 isolates from swineries and 80 isolates from hospital environments, for a total of 189 isolates. Eleven species complexes were found in total. Isolates belonging to the Aspergillus Versicolores species complex were the most frequently found (23.8%), followed by the Flavi (18.0%), Fumigati (15.3%) and Nigri (13.2%) complexes. A significant association was found between the different environmental sources and the distribution of the several species complexes (p<0.001); the hospital environment had a greater variability of species complexes than the other environmental locations (10 in the hospital environment, against nine in swineries, eight in poultries and seven in the sand beach). Isolates belonging to the Nidulantes complex were detected only in the hospital environment, whereas the other complexes were identified in more than one setting. Conclusion: Because different Aspergillus complexes have different susceptibilities to antifungal drugs and different abilities to produce mycotoxins, knowledge of the species-complex epidemiology of each setting may allow preventive or corrective measures to be taken toward decreasing the exposure of workers or patients to those agents.
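A sketch of the kind of association test reported above, here as a chi-square test of independence on a source-by-complex contingency table; the counts below are illustrative only (the study used SPSS v15.0):

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = environmental sources,
# columns = a few species complexes (counts are illustrative only).
#            Versicolores  Flavi  Fumigati  Nigri
table = [
    [ 5,  8,  6,  4],   # sand beach
    [ 9,  7,  5,  6],   # poultries
    [ 6,  5,  8,  4],   # swineries
    [25, 14, 10, 11],   # hospital
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A p-value below the chosen significance level (the paper reports
# p < 0.001) indicates that the distribution of species complexes
# depends on the environmental source.
```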