931 results for Structural hot spot stress
Abstract:
The aim of this dissertation was to study the possibility of soil contamination by hydrocarbons and associated heavy metals in an industrial zone, establishing a sampling methodology for the assessment of potentially contaminated soils. An oil refinery located within the national territory was selected as the case study. Together with the sampling methodology, the following objectives were established: characterization and evaluation of the soil, identification of possible "hot spots" to support monitoring in more detailed investigations of the site, and evaluation of the efficiency of the clay barriers that make up the tank dikes. The soils composing the containment dikes of the crude-oil storage tanks were investigated. The various operations carried out in the area are potential sources of contaminants, which are normally added in small doses and continuously, since the source is never shut off. Two investigation stages were established: a preliminary one, consisting of a survey of the site's history and a listing of the evidence that could indicate which locations might be foci of contamination. Based on the data gathered in the preliminary investigation, a sampling campaign was planned for the collection of soil samples. In addition to the usual characterization analyses, analyses of Total Petroleum Hydrocarbons (TPH) and of the metals cadmium, lead, chromium, nickel, and mercury were carried out. To classify the area from the standpoint of contamination, the "New Dutch List" and the "Guiding values for soils and groundwater for the State of São Paulo" were used as guiding values. Of the four sampled points, one was classified as a "hot spot", reaching values that classify the area as contaminated. The clay barriers that make up the containment dikes around the storage tanks were found to be inefficient, i.e., hydrocarbons are percolating through the containment dikes.
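The screening step described above comes down to comparing each measured concentration against a guiding (intervention) value. A minimal sketch in Python, where all thresholds and sample concentrations are hypothetical placeholders, not the actual Dutch List or São Paulo values:

```python
# Hypothetical intervention values (mg/kg dry soil) -- placeholders only
INTERVENTION_VALUES = {
    "TPH": 5000.0,
    "Cd": 12.0,
    "Pb": 530.0,
    "Cr": 380.0,
    "Ni": 210.0,
    "Hg": 10.0,
}

def classify_sample(sample: dict) -> list:
    """Return the analytes whose concentration exceeds its intervention
    value; a non-empty list marks the point as a potential hot spot."""
    return [a for a, c in sample.items()
            if c > INTERVENTION_VALUES.get(a, float("inf"))]

# Four sampled points with hypothetical concentrations (mg/kg)
points = {
    "P1": {"TPH": 800.0, "Pb": 40.0},
    "P2": {"TPH": 7200.0, "Pb": 610.0},  # exceeds both -> hot spot
    "P3": {"TPH": 1500.0, "Cd": 2.0},
    "P4": {"TPH": 300.0, "Hg": 0.4},
}

for name, sample in points.items():
    exceeded = classify_sample(sample)
    status = f"hot spot ({', '.join(exceeded)})" if exceeded else "below guiding values"
    print(f"{name}: {status}")
```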
Abstract:
The research and development of wind turbine blades is essential to keep pace with worldwide growth in the renewable energy sector. Although blades are currently produced mostly from glass fiber reinforced composite materials, the tendency toward larger blades, particularly for offshore applications, has increased interest in carbon fiber reinforced composites because of their potential for increased stiffness and weight reduction. In this study, a blade model designed for large generators (5 MW) was studied at small scale. A numerical simulation was performed to determine the aerodynamic loading using Computational Fluid Dynamics (CFD) software. Two blades were then designed and manufactured using epoxy matrix composites: one reinforced with glass fibers and the other with carbon fibers. For the structural calculations, the maximum stress failure criterion was adopted. The blades were manufactured by Vacuum Assisted Resin Transfer Molding (VARTM), a process typical for this type of component. A weight comparison of the two blades showed that the carbon fiber blade weighed approximately 45% as much as the fiberglass reinforced blade. Static bending tests were carried out on the blades at various percentages of the design load, and the measured deflections were compared with values obtained from finite element simulations. Good agreement was observed between the measured and calculated deflections. In summary, the results of this study confirm that the low density combined with the high mechanical properties of carbon fibers is particularly attractive for the production of large wind turbine blades.
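The maximum stress criterion adopted above compares each ply stress component independently against the corresponding strength. A minimal sketch, with hypothetical allowables rather than the blade's actual laminate data:

```python
# Maximum stress failure criterion: each component is checked on its own,
# with no interaction terms (unlike Tsai-Wu or Tsai-Hill).
def max_stress_ok(sigma1, sigma2, tau12, Xt, Xc, Yt, Yc, S):
    """True if the ply passes the maximum stress criterion.
    sigma1/sigma2: normal stresses along/across the fibers (MPa);
    tau12: in-plane shear (MPa); Xt/Xc, Yt/Yc: tensile/compressive
    strengths along/across the fibers; S: shear strength."""
    ok_fiber = -Xc <= sigma1 <= Xt
    ok_matrix = -Yc <= sigma2 <= Yt
    ok_shear = abs(tau12) <= S
    return ok_fiber and ok_matrix and ok_shear

# Hypothetical carbon/epoxy ply allowables (MPa) and one sample stress state
print(max_stress_ok(sigma1=900.0, sigma2=20.0, tau12=35.0,
                    Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0))
```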
Abstract:
Double radio sources have been studied since the discovery of extragalactic radio sources in the 1930s. Since then, several numerical studies and analytical models have been proposed seeking a better understanding of the physical phenomena that determine the origin and evolution of such objects. In this thesis, we study the evolution of double radio sources on two fronts. On the first, we developed an analytical self-similar model that generalizes most models found in the literature and solves some existing problems related to the evolution of the jet head. We address this problem using samples of hot spot sizes to find a power-law relation between the jet head dimension and the source length. Using our model, we were able to draw the evolution curves of double sources in a P-D diagram for both compact sources (GPS and CSS) and extended sources of the 3CR catalogue. We have also developed a computational tool that allows us to generate synthetic radio maps of double sources; the objective is to determine the principal physical parameters of these objects by comparing synthetic and observed radio maps. On the second front, we used numerical simulations to study the interaction of extragalactic jets with their environment. We simulated situations in which the jet propagates through a medium containing gas clouds of high density contrast, capable of blocking the jet's forward motion and forming the distorted structures observed in the morphology of real sources. We also analyzed the situation in which the jet changes its propagation direction due to a change of the source's main axis, creating X-shaped sources. The comparison between our simulations and real double radio sources enables us to determine the values of the main physical parameters responsible for the distortions observed in these objects.
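A power-law relation such as the one between hot spot size and source length mentioned above is typically estimated by a linear fit in log-log space. A minimal sketch with invented placeholder data, not the thesis's samples:

```python
import numpy as np

# Hypothetical data: source lengths D and hot spot sizes r_h (kpc)
D = np.array([1.0, 5.0, 20.0, 80.0, 300.0])
r_h = np.array([0.05, 0.18, 0.55, 1.6, 4.9])

# Fit r_h = k * D^alpha  <=>  log r_h = alpha * log D + log k
alpha, log_k = np.polyfit(np.log10(D), np.log10(r_h), 1)
print(f"r_h ≈ {10**log_k:.3f} * D^{alpha:.2f}")
```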
Abstract:
Mirror therapy (MT) is used as a rehabilitation tool in various diseases, including stroke. Although some studies have shown its effectiveness, little is known about the neural mechanisms that underlie the rehabilitation process. Therefore, this study aimed at assessing cortical neuromodulation after a single MT intervention in ischemic stroke survivors by means of functional Magnetic Resonance Imaging (fMRI) and Transcranial Magnetic Stimulation (TMS). Fifteen patients participated in a single thirty-minute MT session. fMRI data were analyzed bilaterally in the following Regions of Interest (ROI): Supplementary Motor Area (SMA), Premotor Cortex (PMC), Primary Motor Cortex (M1), Primary Sensory Cortex (S1), and Cerebellum. In each ROI, changes in the percentage of occupation and in beta values were computed. Group fMRI data showed a significant decrease in the percentage of occupation in the PMC and cerebellum contralateral to the affected hand (p<0.05). A significant increase in beta values was observed in the following contralateral motor areas: SMA, cerebellum, PMC, and M1 (p<0.005). Moreover, a significant decrease was observed in the following ipsilateral motor areas: PMC and M1 (p<0.001). In S1, a bilateral significant decrease (p<0.0005) was observed. TMS consisted of the analysis of the Motor Evoked Potential (MEP) at the M1 hotspot. A significant increase in MEP amplitude was observed after therapy in the group (p<0.0001) and individually in 4 patients (p<0.05). Altogether, our results imply that a single MT intervention is already capable of promoting changes in neurobiological markers toward patterns observed in healthy subjects. Furthermore, the changes in contralateral hemisphere motor areas are opposite to those on the ipsilateral side, suggesting increased system homeostasis.
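The pre/post MEP comparison described above amounts to a paired test on amplitudes measured before and after the session. A minimal sketch with simulated placeholder amplitudes, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(0.6, 0.15, size=15)           # 15 patients: baseline MEPs (mV)
post = pre + rng.normal(0.25, 0.10, size=15)   # simulated post-therapy increase

# Paired t-test: same patients measured twice
t, p = stats.ttest_rel(post, pre)
print(f"mean increase = {np.mean(post - pre):.2f} mV, t = {t:.2f}, p = {p:.2g}")
```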
Abstract:
Paracoccidioides brasiliensis infections have been little studied in wild and/or domestic animals, which may represent an important indicator of the presence of the pathogen in nature. Road-killed wild animals have been used for surveillance of vectors of zoonotic pathogens and may offer new opportunities for eco-epidemiological studies of paracoccidioidomycosis (PCM). The presence of P. brasiliensis infection was evaluated by Nested-PCR in tissue samples collected from 19 road-killed animals: 3 Cavia aperea (guinea pig), 5 Cerdocyon thous (crab-eating fox), 1 Dasypus novemcinctus (nine-banded armadillo), 1 Dasypus septemcinctus (seven-banded armadillo), 2 Didelphis albiventris (white-eared opossum), 1 Eira barbara (tayra), 2 Galictis vittata (grison), 2 Procyon cancrivorus (raccoon) and 2 Sphiggurus spinosus (porcupine). Specific P. brasiliensis amplicons were detected in (a) several organs of the two armadillos and one guinea pig, (b) the lung and liver of the porcupine, and (c) the lungs of the raccoons and grisons. P. brasiliensis infection in wild animals from endemic areas might be more common than initially postulated. Molecular techniques can be used for detecting new hosts and mapping 'hot spot' areas of PCM.
Abstract:
Plasmon-enhanced spectroscopic techniques have expanded single-molecule detection (SMD) and are revolutionizing areas such as bio-imaging and single-cell manipulation. Surface-enhanced (resonance) Raman scattering (SERS or SERRS) combines high sensitivity with molecular-fingerprint information at the single-molecule level. Spectra originating from single-molecule SERS experiments are rare events, which occur only if a single molecule is located in a hot-spot zone. In this spot, the molecule is selectively exposed to a significant enhancement associated with a high local electromagnetic field in the plasmonic substrate. Here, we report an SMD study with an electrostatic approach in which a Langmuir film of a phospholipid with anionic polar head groups (PO4−) was doped with cationic methylene blue (MB), creating a homogeneous, two-dimensional distribution of dyes in the monolayer. The number of dyes in the probed area of the Langmuir-Blodgett (LB) film coating the Ag nanostructures established a regime in which single-molecule events were observed, with identification based on direct matching of the observed spectrum at each point of the mapping with a reference spectrum for the MB molecule. In addition, advanced fitting techniques were tested with the data obtained from micro-Raman mapping, thus achieving real-time processing to extract the MB single-molecule spectra. © 2013 Society for Applied Spectroscopy.
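The identification step described above, matching each mapped spectrum against an MB reference, can be sketched as a simple correlation test. The spectra below are synthetic placeholders, not measured SERS data:

```python
import numpy as np

wavenumbers = np.linspace(400, 1800, 700)  # Raman shift axis (cm^-1)

def lorentz(x, x0, w):
    """Lorentzian band centred at x0 with half-width w."""
    return 1.0 / (1.0 + ((x - x0) / w) ** 2)

# Hypothetical MB reference spectrum: a few characteristic bands
reference = lorentz(wavenumbers, 1625, 8) + 0.6 * lorentz(wavenumbers, 450, 6)

def is_mb_event(spectrum, threshold=0.8):
    """Flag a map pixel as an MB single-molecule event if its spectrum
    correlates strongly with the reference."""
    r = np.corrcoef(spectrum, reference)[0, 1]
    return r >= threshold

noisy_mb = reference + 0.05 * np.random.default_rng(1).normal(size=wavenumbers.size)
print(is_mb_event(noisy_mb))                                             # True
print(is_mb_event(np.random.default_rng(2).normal(size=wavenumbers.size)))  # False
```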
Abstract:
Preserving large tracts of natural habitats is essential to maintain biodiversity. Nevertheless, even large areas may still suffer from less visible impacts such as loss of ecological processes. Because mapping ecological processes over large scales is not practical, an alternative is to map surrogate species that are key for those processes. In this study, we chose four species of Neotropical large mammals (the largest apex predator: jaguar - Panthera onca; the largest herbivore: tapir - Tapirus terrestris; the largest seed predator: white-lipped peccary - Tayassu pecari; and the largest arboreal seed disperser: muriqui - Brachyteles spp.) in an ecosystem with an old history of human impact (the Atlantic Forest) to test whether areas with native forest still harbor ecological processes that may guarantee long-term ecosystem maintenance. We gathered 94 locations with recent presence of the four species to map current ranges and model suitable areas. Our results reveal that 96% of the remaining Atlantic Forest is depleted of at least one of the four surrogate species and 88% is completely depleted of all four surrogate species. We also found that only 16% is still environmentally suitable for all four, and 55% is completely unsuitable to all four of them. Our study highlights the importance of looking beyond land cover to fully depict intactness of natural areas, and suggests that ecosystems with a long history of human impact (such as the Atlantic Forest) may be suffering from ecological impacts not seen at a first glance. © 2013 Elsevier Ltd.
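The depletion percentages reported above come down to counting forest cells where some or all of the four surrogate species are absent. A minimal sketch with random placeholder rasters, not the study's distribution models:

```python
import numpy as np

rng = np.random.default_rng(0)
forest = rng.random((200, 200)) < 0.3                        # remaining forest mask
species = [rng.random((200, 200)) < 0.2 for _ in range(4)]   # 4 presence masks

n_present = np.sum(species, axis=0)   # species count per cell (0..4)
in_forest = forest.sum()

missing_any = np.sum(forest & (n_present < 4)) / in_forest
missing_all = np.sum(forest & (n_present == 0)) / in_forest
print(f"depleted of >=1 species: {100*missing_any:.0f}%, "
      f"of all four: {100*missing_all:.0f}%")
```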
Abstract:
Collectively, the observations indicate that the overall warming of the Arctic system continued in 2007. Some elements are stabilizing or returning to climatological norms. These mixed tendencies illustrate the sensitivity and complexity of the Arctic system.
Atmosphere: hot spot shifts toward Europe.
Ocean: North Pole temperatures at depth returning to 1990s values.
Sea ice: summer extent at record minimum.
Greenland: recent warm temperatures associated with net ice loss.
Biology: increasing tundra shrub cover and variable treeline advance; up to 80% declines in some caribou herds while goose populations double.
Land: increase in permafrost temperatures.
The Arctic Report Card 2007 is introduced as a means of presenting clear, reliable and concise information on recent observations of environmental conditions in the Arctic, relative to historical time series records. It provides a method of updating and expanding the content of the State of the Arctic Report, published in fall 2006, to reflect current conditions. Material presented in the Report Card is prepared by an international team of scientists and is peer-reviewed by topical experts nominated by the US Polar Research Board. The audience for the Arctic Report Card is wide, including scientists, students, teachers, decision makers and the general public interested in the Arctic environment and science. The web-based format will facilitate timely future updates of the content.
Abstract:
Yellowstone National Park is located over a hot spot under the North American tectonic plate and holds a potentially explosive super-volcano that could have deadly consequences for the North American continent. After an eruption, the surrounding region would see the greatest devastation, covered by pyroclastic deposits and thick ash fall exterminating almost all life and destroying all structures in their path. At greater distances from the event, the consequences would be less dramatic yet still substantial. Records from previous eruptions of the Yellowstone super-volcano show that the ash fallout can cover areas as large as one million square kilometers and could leave Nebraska covered in ash up to 10 centimeters thick. This would cause destruction of agriculture, extensive damage to structures, decreased temperatures, and potential respiratory hazards. Volcanic ash has been shown to cause acute respiratory symptoms from heavy exposure, including nasal irritation, throat irritation, and coughing; people with preexisting conditions can develop bronchial symptoms that may last for a few days. People with bronchitis and asthma experience airway irritation and uncomfortable breathing. In most occurrences, exposure to volcanic ash is too short to cause long-term health hazards, and wearing facial protection can alleviate many of the symptoms. Most of the long-term ramifications of the eruption would come from the atmospheric changes caused by the disruption of solar radiation, which would affect much of the global population. The most pertinent concerns for Nebraska citizens are the accumulation of ash deposits over the landscape and the climatic perturbations. Potential mitigation procedures are essential to prepare our largely unaware population for the threat it may soon face if the volcano continues its eruption cycle.
Abstract:
The 1883 eruption of Krakatau is one of the best known volcanic events in the world, although it was neither the largest nor the deadliest of known eruptions. However, the eruption happened at a critical moment (just after the first global telegraph network was established) and in a strategic place (the Sunda Straits were a naval traffic hot spot at that time). The lecture explores these events in some detail before presenting an outline of ongoing multidisciplinary efforts to unravel the past and present-day plumbing systems of the 1883 eruption and of the active Anak Krakatau cone. A mid- and a lower-crustal magma storage level exist beneath the volcano, placing significant emphasis on magma-crust interaction in the uppermost, sediment-rich crust. This final aspect shares similarities with the 2011/2012 El Hierro eruption, highlighting the relevance of the interaction between ascending magmas and the marine deposits that oceanic magmas have to pass through. At Krakatau, shallow-level crustal contamination offers a possible explanation for the explosive nature of the 1883 eruption, as well as those of the presently active Anak Krakatau edifice, and helps constrain the location, style and processes of subvolcanic magma storage.
Abstract:
Today, third generation networks are consolidated realities, and user expectations of new applications and services are becoming higher and higher. Therefore, new systems and technologies are necessary to meet market needs and user requirements. This has driven the development of fourth generation networks. "Wireless networks for the fourth generation" is the expression used to describe the next step in wireless communications. There is no formal definition of what these fourth generation networks are; however, we can say that next generation networks will be based on the coexistence of heterogeneous networks, on integration with existing radio access networks (e.g. GPRS, UMTS, WiFi, ...) and, in particular, on new emerging architectures that are gaining more and more relevance, such as Wireless Ad Hoc and Sensor Networks (WASN). Thanks to their characteristics, fourth generation wireless systems will be able to offer custom-made solutions and applications personalized according to user requirements; they will offer all types of services at an affordable cost, and solutions characterized by flexibility, scalability and reconfigurability. This PhD work focused on WASNs: self-configuring networks that are not based on a fixed infrastructure but are infrastructure-less, where devices have to generate the network automatically in the initial phase and maintain it through reconfiguration procedures (if node mobility, energy drain, etc. cause disconnections). The main part of the PhD activity focused on an analytical study of connectivity models for wireless ad hoc and sensor networks; nevertheless, a small part of the work was experimental. Both the theoretical and experimental activities had a common aim, related to the performance evaluation of WASNs. Concerning the theoretical analysis, the objective of the connectivity studies was the evaluation of models for interference estimation, since interference is the most important cause of performance degradation in WASNs. It is therefore very important to find an accurate model that allows its investigation, and I have tried to obtain a model as realistic and general as possible, in particular for the evaluation of the interference coming from bounded interfering areas (i.e. a WiFi hot spot, a wireless-covered research laboratory, ...). On the other hand, the experimental activity led to throughput and Packet Error Rate measurements on a real IEEE 802.15.4 Wireless Sensor Network.
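For the bounded interfering areas mentioned above, one simple reference point is a Monte Carlo estimate of the mean aggregate interference received from nodes uniformly scattered over a disc. A minimal sketch under a plain power-law path-loss assumption, with all parameters hypothetical:

```python
import numpy as np

def mean_interference(d=50.0, R=10.0, n_nodes=20, alpha=3.5,
                      p_tx=1.0, trials=10_000, seed=0):
    """Average of sum(p_tx * r^-alpha) over random placements of n_nodes
    interferers uniform in a disc of radius R centred at distance d from
    the receiver at the origin (requires d > R)."""
    rng = np.random.default_rng(seed)
    # Uniform sampling in a disc: radius ~ R*sqrt(U), angle ~ 2*pi*U
    r = R * np.sqrt(rng.random((trials, n_nodes)))
    theta = 2 * np.pi * rng.random((trials, n_nodes))
    dist = np.hypot(d + r * np.cos(theta), r * np.sin(theta))
    return np.mean(np.sum(p_tx * dist ** (-alpha), axis=1))

print(f"mean aggregate interference: {mean_interference():.3e}")
```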
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more uncertain as transistor sizes scale down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues, such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable only at the process level. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on some tunable parameter, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case, we discovered the superior robustness of low-swing links to systematic process variation, together with a good response to compensation techniques such as ASV and ABB. Hence, low-swing is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
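As a flavor of the high-level wear-out models mentioned above, here is a minimal sketch of an Arrhenius temperature-acceleration estimate of relative MTTF, which illustrates why keeping hot spot temperatures down extends lifetime. The activation energy and temperatures are hypothetical placeholders, not values from the thesis:

```python
import math

K_B = 8.617e-5  # Boltzmann constant (eV/K)

def mttf_ratio(t_hot_c: float, t_cool_c: float, ea_ev: float = 0.7) -> float:
    """MTTF(cool) / MTTF(hot) under the Arrhenius model
    MTTF ∝ exp(Ea / (k_B * T)), with temperatures in °C."""
    t_hot = t_hot_c + 273.15
    t_cool = t_cool_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_cool - 1.0 / t_hot))

# A hot spot at 95 °C vs. a thermally managed tile at 75 °C:
# roughly a 3.5x lifetime gain under these assumptions.
print(f"lifetime gain: {mttf_ratio(95.0, 75.0):.1f}x")
```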