21 results for Hot thermal environments
Abstract:
A biomonitoring study using transplanted lichens (Flavoparmelia caperata) was conducted to assess the indoor air quality of primary schools at an urban (Lisbon) and a rural (Ponte de Sor) Portuguese site. The lichen exposure period ran from April to June 2010, and two types of environment were studied in each school: classrooms and the outdoor courtyard. Afterwards, the lichen samples were processed and analyzed by instrumental neutron activation analysis (INAA) to determine a total of 20 chemical elements. The elements accumulated in the exposed lichens were assessed and enrichment factors (EF) were determined. Indoor and outdoor biomonitoring results were compared to evaluate how biomonitors such as lichens respond in indoor environments and to assess which pollutants prevail in those environments.
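For reference, a common way of computing the enrichment factor of an element X against average crustal composition is

\[ \mathrm{EF}_X = \frac{(C_X / C_{\mathrm{ref}})_{\text{lichen}}}{(C_X / C_{\mathrm{ref}})_{\text{crust}}} \]

where C_ref is the concentration of a conservative reference element such as Sc, Al or Fe; values well above unity suggest an anthropogenic contribution. This is only the usual convention, given here as an illustration; the abstract does not state which reference element or baseline the study actually used.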
Abstract:
The increasing density of information technology equipment and the simultaneous increases in processor power consumption make it necessary to ensure adequate cold-air distribution, hot-air removal, sufficient cooling capacity and a reduction in energy consumption. Taking cogeneration as an energy-efficient alternative to other methods of energy production, this work analyses the profitability of a possible integration of a cogeneration system into a data center.
Abstract:
Object-oriented programming languages are presently the dominant paradigm of application development (e.g., Java, .NET). Lately, increasingly more Java applications have long (or very long) execution times and manipulate large amounts of data/information, gaining relevance in fields related to e-Science (with Grid and Cloud computing). Significant examples include Chemistry, Computational Biology and Bio-informatics, with many available Java-based APIs (e.g., Neobio). Often, when the execution of such an application is terminated abruptly because of a failure (regardless of whether the cause is a hardware or software fault, lack of available resources, etc.), all of the work it has already performed is simply lost, and when the application is later re-initiated it has to restart from scratch, wasting resources and time, while remaining prone to another failure that may delay its completion with no deadline guarantees. Our proposed solution addresses these issues by incorporating checkpointing and migration mechanisms in a JVM. These make applications more robust and flexible, able to move to other nodes without any intervention from the programmer. This article provides such a solution for Java applications with long execution times by extending a JVM (the Jikes Research Virtual Machine) with these mechanisms. Copyright (C) 2011 John Wiley & Sons, Ltd.
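To illustrate the checkpointing idea in isolation (an application-level sketch with invented names such as WorkState and checkpoint.ser, not the JVM-level, programmer-transparent mechanism implemented in the Jikes RVM extension), a long-running Java computation could periodically serialize its state so that a restarted run resumes from the last snapshot instead of from scratch:

import java.io.*;

// Minimal application-level checkpointing sketch (hypothetical names).
// The paper's approach works inside the JVM, transparently to the
// programmer; this example only illustrates the save/restore idea.
public class CheckpointSketch {

    // State that would otherwise be lost on failure.
    static class WorkState implements Serializable {
        long nextIteration;
        double partialResult;
    }

    static final File CHECKPOINT = new File("checkpoint.ser");

    static WorkState loadOrCreate() {
        if (CHECKPOINT.exists()) {
            try (ObjectInputStream in =
                     new ObjectInputStream(new FileInputStream(CHECKPOINT))) {
                return (WorkState) in.readObject();   // resume from snapshot
            } catch (IOException | ClassNotFoundException e) {
                // corrupt or unreadable checkpoint: start from scratch
            }
        }
        return new WorkState();
    }

    static void save(WorkState s) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new FileOutputStream(CHECKPOINT))) {
            out.writeObject(s);
        }
    }

    public static void main(String[] args) throws Exception {
        WorkState s = loadOrCreate();
        for (long i = s.nextIteration; i < 1_000_000; i++) {
            s.partialResult += Math.sqrt(i);          // the "long" computation
            s.nextIteration = i + 1;
            if (i % 100_000 == 0) save(s);            // periodic checkpoint
        }
        save(s);
        System.out.println("result = " + s.partialResult);
    }
}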
Abstract:
The aim of this study was to contribute to the assessment of exposure levels to ultrafine particles from automobile traffic in the urban environment of Lisbon, Portugal, by monitoring alveolar lung-deposited surface area (resulting from exposure to ultrafine particles) during late spring in a major avenue leading to the town center, as well as inside buildings facing it. Data revealed differentiated patterns for weekdays and weekends, consistent with the PM2.5 and PM10 patterns currently monitored by air quality stations in Lisbon. The observed ultrafine particle levels may be directly correlated with fluxes in automobile traffic. During a typical week, the alveolar-deposited surface area of ultrafine particles varied between 35 and 89.2 μm²/cm³, values comparable with levels reported for towns in Germany and the United States. The measured values also allowed the number of ultrafine particles per cubic centimeter to be determined, which is comparable to levels reported for Madrid and Brisbane. Regarding outdoor/indoor levels, outdoor values were higher (by 32 to 63%), a difference somewhat lower than that observed in houses in Ontario.
Abstract:
The main objective of this document is to design a domestic hot water system for a school. In a first phase, the energy context at the world, European and national levels was researched, together with its legal framework at the European and national levels, and a brief overview of the fundamentals of solar energy was given, focusing on the importance of solar radiation and on the various types of solar thermal systems and their components. The case study is then addressed: surveys were first carried out to determine the hot water consumption of that school. The study continued by varying two characteristics of the solar system: the size of the storage tanks and the type of solar collectors to be installed. After the simulations carried out to determine the solutions to be applied to the solar system, presented throughout this document, economic analyses were performed to verify the viability of the system. Finally, conclusions were drawn about the system to be applied and some financial scenarios for it were presented.
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering
Abstract:
Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although the use of these nano-scale materials brings advantages, questions related to their impact on the environment and human health must be addressed too, so that potential risks can be limited at early stages of development. At this time, the occupational health risks associated with manufacturing and using nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, since nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, it seems that assessing human exposure based on the mass concentration of particles, widely adopted for particles over 1 μm, would not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used for nanoparticle exposure and dosing, and assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure. It has been shown that surface area plays an important role in the toxicity of nanoparticles and is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
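As a back-of-envelope illustration of why surface area outgrows mass at small sizes (a worked example added here, not taken from the study): for a spherical particle of diameter d and density \rho, the specific surface area is

\[ \frac{S}{m} = \frac{\pi d^{2}}{\rho\,\pi d^{3}/6} = \frac{6}{\rho d} \]

so the surface available per unit mass grows as 1/d: a 10 nm particle exposes a thousand times more surface per gram than a 10 μm particle of the same material.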
Abstract:
Using fluid mechanics, we reinterpret the mantle images obtained from global and regional tomography together with geochemical, geological and paleomagnetic observations, and attempt to unravel the pattern of convection in the Indo-Atlantic "box" and its temporal evolution over the last 260 Myr. The "box" presently contains a) a broad slow seismic anomaly at the CMB, which has a shape similar to Pangea 250 Myr ago and divides into several branches higher in the lower mantle, b) a "superswell" centered on the western edge of South Africa, c) at least 6 "primary hotspots" with long tracks related to traps, and d) numerous smaller hotspots. In the last 260 Myr, this mantle box has undergone 10 trap events, 7 of them related to continental breakup. Several of these past events are spatially correlated with present-day seismic anomalies and/or upwellings. Laboratory experiments show that superswells, long-lived hotspot tracks and traps may represent three evolutionary stages of the same phenomenon, i.e. episodic destabilization of a hot, chemically heterogeneous thermal boundary layer close to the bottom of the mantle. When scaled to the Earth's mantle, its recurrence time is on the order of 100-200 Myr. At any given time, the Indo-Atlantic box should contain 3 to 9 of these instabilities at different stages of their development, in agreement with observations. The return flow of the downwelling slabs, although confined to two main "boxes" (Indo-Atlantic and Pacific) by subduction zone geometry, may therefore not be passive, but rather take the form of active thermochemical instabilities. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently part of the standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related to knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work.
Abstract:
Conference: 2nd Experiment at International Conference - 18-20 September 2013
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering
Abstract:
This paper presents a distributed predictive control methodology for indoor thermal comfort that optimizes the consumption of a limited shared energy resource using an integrated demand-side management approach involving a power price auction and an appliance load allocation scheme. The control objective for each subsystem (house or building) is to minimize the energy cost while maintaining the indoor temperature within comfort limits. In a distributed, coordinated multi-agent ecosystem, each house or building control agent achieves its objectives while sharing the available energy with the others through the introduction of particular coupling constraints in their underlying optimization problems. Coordination is maintained through a daily green energy auction, bringing in a demand-side management approach. The implemented distributed MPC algorithm is also described and validated with simulation studies.
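A highly simplified sketch of the coupling idea (my own illustration, not the paper's MPC formulation; the house model, parameter values and proportional scaling rule are all assumptions): each agent computes the heating power it would need to reach its comfort setpoint over the next step, and a coordinator scales the requests down whenever their sum exceeds the shared energy budget.

// Illustrative sketch of a shared-resource constraint among thermal agents.
// First-order house model: T' = a*(Tout - T) + b*u, discretized with step dt.
// All parameter values are invented for illustration.
public class SharedEnergySketch {

    static double requiredPower(double T, double Tout, double Tset,
                                double a, double b, double dt) {
        // Power needed so that the next-step temperature reaches the setpoint.
        double u = ((Tset - T) / dt - a * (Tout - T)) / b;
        return Math.max(0.0, u);          // heating only
    }

    public static void main(String[] args) {
        double[] T    = {18.0, 19.5, 17.0};   // indoor temperatures (°C)
        double[] Tset = {21.0, 21.0, 21.0};   // comfort setpoints (°C)
        double Tout = 5.0, a = 0.1, b = 0.05, dt = 1.0;
        double budget = 150.0;                // shared energy budget (kW)

        // 1. Each agent computes its unconstrained request.
        double[] req = new double[T.length];
        double total = 0.0;
        for (int i = 0; i < T.length; i++) {
            req[i] = requiredPower(T[i], Tout, Tset[i], a, b, dt);
            total += req[i];
        }

        // 2. Coupling constraint: scale requests if the shared budget is exceeded.
        double scale = total > budget ? budget / total : 1.0;
        for (int i = 0; i < T.length; i++) {
            double u = req[i] * scale;
            T[i] += dt * (a * (Tout - T[i]) + b * u);   // apply control
            System.out.printf("house %d: u = %.1f kW, T -> %.2f%n", i, u, T[i]);
        }
    }
}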
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering
Abstract:
This paper presents recent research results on the development of an Observed Time Difference (OTD) based geolocation algorithm that uses network trace data from a real Universal Mobile Telecommunication System (UMTS) network. The initial results were published in [1]; the current paper focuses on increasing the sample convergence rate and on introducing a new filtering approach based on a moving average spatial filter to increase accuracy. Field tests were carried out for two radio environments (urban and suburban) in the Lisbon area, Portugal. The new enhancements produced geopositioning success rates of 47% and 31%, and median accuracies of 151 m and 337 m, for the urban and suburban environments, respectively. The implemented filter produced 16% and 20% increases in accuracy when compared with the geopositioned raw data. The obtained results are rather promising in terms of accuracy and geolocation success rate. OTD positioning smoothed by moving average spatial filtering proves to be a strong approach for positioning trace-extracted events, vital for boosting Self-Organizing Networks (SON) over a 3G network.
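For illustration only (the paper does not give its filter parameters, and the window size and planar coordinates below are assumptions), a moving average spatial filter along a positioning trace can be sketched as a sliding window that replaces each position estimate by the mean of its neighbours:

// Sketch of a moving-average spatial filter over a sequence of position fixes.
// Positions are treated as planar (x, y) coordinates in metres; a real
// implementation would first project latitude/longitude.
public class MovingAverageSpatialFilter {

    static double[][] smooth(double[][] fixes, int window) {
        int half = window / 2;
        double[][] out = new double[fixes.length][2];
        for (int i = 0; i < fixes.length; i++) {
            int from = Math.max(0, i - half);
            int to   = Math.min(fixes.length - 1, i + half);
            double sx = 0, sy = 0;
            for (int j = from; j <= to; j++) { sx += fixes[j][0]; sy += fixes[j][1]; }
            int n = to - from + 1;
            out[i][0] = sx / n;                 // mean easting in the window
            out[i][1] = sy / n;                 // mean northing in the window
        }
        return out;
    }

    public static void main(String[] args) {
        // Noisy raw geopositioned events along a straight road (invented data).
        double[][] raw = {{0, 5}, {100, -40}, {200, 60}, {300, -10}, {400, 30}};
        double[][] filtered = smooth(raw, 3);
        for (double[] p : filtered) System.out.printf("%.1f, %.1f%n", p[0], p[1]);
    }
}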