29 results for Meshfree particle methods
at Universidade do Minho
Abstract:
Since the last decade of the twentieth century, the healthcare industry has been paying attention to the environmental impact of its buildings, and new regulations, policy goals and Healthcare Building Sustainability Assessment (HBSA) methods are therefore being developed and implemented. At present, healthcare is one of the most regulated industries and is also one of the largest consumers of energy per net floor area. Assessing the sustainability of healthcare buildings requires a set of benchmarks related to their life-cycle performance. Such benchmarks are essential both to rate the sustainability of a project and to support designers and other stakeholders in designing and operating a sustainable building, by allowing a project to be compared with conventional and best market practices. This research focuses on the methodology for setting benchmarks for resource consumption, waste production, operation costs and potential environmental impacts related to the operational phase of healthcare buildings. It aims to contribute to reducing the subjectivity found in the definition of the benchmarks used in Building Sustainability Assessment (BSA) methods, and it is applied in the Portuguese context. These benchmarks will be used in the development of a Portuguese HBSA method.
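One common convention for deriving "conventional practice" and "best practice" benchmarks from market data is by percentile of an observed performance sample. The Python sketch below assumes a median/first-quartile convention and uses invented consumption figures; it is illustrative only and is not the methodology developed in the paper.

```python
import statistics

# Hypothetical annual energy-use intensities (kWh/m2) for a sample of
# healthcare buildings; the figures are invented for illustration.
energy_use = [320, 410, 280, 510, 390, 350, 445, 300, 475, 360]

# Assumed convention: "conventional practice" = median of the sample,
# "best practice" = a low percentile (here the 25th).
conventional = statistics.median(energy_use)
best = statistics.quantiles(energy_use, n=4)[0]  # first quartile

print(f"Conventional-practice benchmark: {conventional:.0f} kWh/m2 per year")
print(f"Best-practice benchmark:         {best:.0f} kWh/m2 per year")
```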
Abstract:
As a renewable energy source, the use of forest biomass for electricity generation is advantageous in comparison with fossil fuels; however, the activity of forest biomass power plants causes adverse impacts, particularly on neighbouring communities. The main objective of this study is to estimate the effects of the activity of forest biomass power plants on the welfare of two groups of stakeholders, local residents and the general population, applying a stated preference method to each: contingent valuation and discrete choice experiments, respectively. The former was used to estimate the minimum compensation that residents of communities neighbouring two forest biomass power plants in Portugal would be willing to accept. The latter was applied to the general population to estimate their willingness to pay to avoid specific environmental impacts. The results show that the presence of the selected facilities affects individuals' well-being. Moreover, in the discrete choice experiments conducted among the general population, all impacts considered were significant determinants of respondents' welfare levels. These findings stress the importance of performing an equity analysis of the welfare effects of forest biomass power plants on different groups of stakeholders, as those effects are location and impact specific. Policy makers should take into account the views of all stakeholders directly or indirectly involved when deciding crucial issues regarding the siting of new forest biomass power plants, in order to achieve an efficient and equitable outcome.
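In a discrete choice experiment analysed with a conditional logit model, the marginal willingness to pay for an attribute is the negative ratio of its coefficient to the cost coefficient, WTP_k = -beta_k / beta_cost. The sketch below illustrates that standard computation; the attribute names and coefficient values are invented, not estimates from the study.

```python
# Marginal willingness to pay from a fitted conditional-logit model.
# All coefficients below are invented for illustration.
beta_cost = -0.042            # coefficient on the cost attribute (EUR)
betas = {
    "air_emissions": -1.10,   # disutility of air emissions
    "truck_traffic": -0.60,   # disutility of heavy-vehicle traffic
    "noise": -0.85,           # disutility of noise
}

for attribute, beta in betas.items():
    wtp = -beta / beta_cost   # WTP to avoid one level of the impact
    print(f"WTP to avoid {attribute}: {wtp:.2f} EUR")
```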
Abstract:
To solve a health and safety problem at a waste treatment facility, different multicriteria decision methods were used, including the PROV Exponential decision method. Four alternatives and ten attributes were considered. We found a congruent solution, validated by the different methods. The AHP and the PROV Exponential decision methods led to the same ordering of the options, but the latter reinforced one option as the best performing and set apart the least performing one. The ELECTRE I method also produced the same ordering, which allowed the best solution to be identified with reasonable confidence. This paper demonstrates the potential of multicriteria decision methods to support decision making on complex problems such as risk control and accident prevention.
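As a concrete illustration of one of the methods mentioned, the sketch below computes an AHP priority vector (the principal eigenvector of a pairwise-comparison matrix) and Saaty's consistency ratio. The 4x4 comparison matrix is invented, not the study's data.

```python
import numpy as np

# Invented pairwise-comparison matrix for four alternatives on one criterion.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

# Priorities = principal right eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty's consistency index, compared against the random index RI(4) = 0.90.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90

print("priorities:", np.round(w, 3))
print("consistency ratio:", round(cr, 3))  # < 0.1 is conventionally acceptable
```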
Abstract:
Occupational risks in nanotechnology research laboratories are an important topic, since a great number of researchers work in this area. Risk assessment, performed by both qualitative and quantitative methods, is a necessary step in managing occupational risks. It can be carried out with qualitative methods that have gathered consensus in the scientific community, and it is also possible to use quantitative methods, based on different techniques and metrics, as indicative exposure limits are being established by several institutions. When performing a risk assessment, information on the materials used is very important; if it is not up to date, it can bias the assessment results. The risk of exposure to TiO2 nanoparticles was assessed in a research laboratory using a quantitative exposure method and qualitative risk assessment methods. The results from the direct-reading Condensation Particle Counter (CPC) equipment and from the CB Nanotool appear to be related and aligned, while the results obtained with the Stoffenmanager Nano seem to indicate a higher risk level.
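Direct-reading CPC data of this kind are typically interpreted by comparing the task-related particle number concentration, net of background, with a nano reference value (NRV). The sketch below assumes an NRV of 40,000 particles/cm3 (the class commonly used for granular biopersistent nanomaterials) and invented readings; neither figure comes from the study.

```python
# Illustrative comparison of CPC readings against an assumed nano
# reference value (NRV). All numbers are invented for illustration.
background = 8_000                                 # particles/cm3 before the task
task_readings = [35_000, 52_000, 48_000, 41_000]   # particles/cm3 during handling
nrv = 40_000                                       # assumed reference value

net_exposure = sum(task_readings) / len(task_readings) - background
ratio = net_exposure / nrv

print(f"Net particle concentration: {net_exposure:.0f} #/cm3")
print(f"Exposure ratio vs NRV: {ratio:.2f} ({'above' if ratio > 1 else 'below'} 1)")
```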
Abstract:
Modifying a polymer binder with inorganic nanomaterials (NM) could be an efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposure can occur all along the life cycle of an NM and of "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into three main phases (pre-production, production and post-production), which allowed the assessment methods to be tested in different situations. The risk assessment of the PM manufacturing process was performed using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); the Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and the Stoffenmanager Nano. The different methods produced different final results: in phases 1 and 3 the risk tends to be classified as medium-high, while for phase 2 the most common result is a medium level. The use of qualitative methods needs to be improved by defining narrow criteria for selecting a method for each assessed situation, bearing in mind that uncertainty is also a relevant factor when dealing with risks in the nanotechnology field.
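Several of the methods compared are control-banding tools: a severity score and an exposure-probability score are combined into a risk level that dictates the required controls. The sketch below shows that generic logic; the score cut-offs and band matrix are illustrative, not the published CB Nanotool scoring.

```python
# Generic control-banding logic of the kind CB Nanotool implements.
# Cut-offs and the band matrix are illustrative assumptions.
def risk_band(severity: float, probability: float) -> str:
    """Map severity and probability scores (0-100) to a risk level RL1-RL4."""
    sev = "low" if severity < 25 else "medium" if severity < 50 else "high"
    prob = "low" if probability < 25 else "medium" if probability < 50 else "high"
    matrix = {
        ("low", "low"): "RL1",    ("low", "medium"): "RL1",    ("low", "high"): "RL2",
        ("medium", "low"): "RL1", ("medium", "medium"): "RL2", ("medium", "high"): "RL3",
        ("high", "low"): "RL2",   ("high", "medium"): "RL3",   ("high", "high"): "RL4",
    }
    return matrix[(sev, prob)]

# Example: a handling task scored 45/100 for severity, 60/100 for probability.
print(risk_band(45, 60))  # -> RL3
```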
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, yielding routing configurations that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework and of the devised optimization methods.
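At the core of such weight-setting approaches is a fitness evaluation: given a candidate vector of OSPF link weights, route each demand on its shortest path and measure congestion. The sketch below shows one plausible such evaluation (single shortest path, ignoring ECMP splitting for brevity); the topology, capacities and demands are invented, and a real MOEA such as NSGA-II would call this function for every candidate weight vector.

```python
import networkx as nx

def max_utilisation(weights, capacities, demands):
    """Worst link utilisation when demands follow shortest paths under `weights`."""
    g = nx.DiGraph()
    for (u, v), w in weights.items():
        g.add_edge(u, v, weight=w)
    load = {edge: 0.0 for edge in weights}
    for (src, dst), volume in demands.items():
        path = nx.shortest_path(g, src, dst, weight="weight")
        for edge in zip(path, path[1:]):
            load[edge] += volume
    return max(load[e] / capacities[e] for e in weights)

# Toy 3-node network, all figures invented.
weights = {("a", "b"): 1, ("b", "a"): 1, ("b", "c"): 1,
           ("c", "b"): 1, ("a", "c"): 3, ("c", "a"): 1}
capacities = {e: 10.0 for e in weights}
demands = {("a", "c"): 4.0, ("b", "c"): 5.0}

print(max_utilisation(weights, capacities, demands))  # 0.9 on link (b, c)
```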
Abstract:
In this work we compare two different numerical schemes for the solution of the time-fractional diffusion equation with a variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method in which Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
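For reference, a standard form of this equation class, assuming (as is usual in this literature, though not stated in the abstract) a Caputo time derivative of order 0 < alpha < 1:

```latex
{}^{C}\!D_{t}^{\alpha}u(x,t)
  = \frac{\partial}{\partial x}\!\left(k(x)\,\frac{\partial u}{\partial x}(x,t)\right)
  + f\bigl(t,x,u(x,t)\bigr), \qquad 0<\alpha<1,
\quad\text{where}\quad
{}^{C}\!D_{t}^{\alpha}u(x,t)
  = \frac{1}{\Gamma(1-\alpha)}\int_{0}^{t}(t-s)^{-\alpha}\,
    \frac{\partial u}{\partial s}(x,s)\,ds .
```

Here k(x) is the variable diffusion coefficient and f the nonlinear source term; for alpha = 1 the equation reduces to the classical reaction-diffusion equation.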
Abstract:
PhD thesis in Bioengineering
Abstract:
PhD thesis in Bioengineering
Abstract:
Shifting from chemical to biotechnological processes is one of the cornerstones of 21st century industry. The production of a great range of chemicals via biotechnological means is a key challenge on the way toward a bio-based economy. However, this shift is occurring at a pace slower than initially expected. The development of efficient cell factories that allow for competitive production yields is of paramount importance for this leap to happen. Constraint-based models of metabolism, together with in silico strain design algorithms, promise to reveal insights into the best genetic design strategies, a step further toward achieving that goal. In this work, a thorough analysis of the main in silico constraint-based strain design strategies and algorithms is presented, their application in real-world case studies is analyzed, and a path for the future is discussed.
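The workhorse of constraint-based metabolic modelling is flux balance analysis (FBA): maximise a biomass flux subject to steady-state mass balance S v = 0 and flux bounds, a linear program. The sketch below illustrates this on a toy 2-metabolite, 4-reaction network with invented bounds; it is not a genome-scale model or the paper's own tooling.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites, columns: reactions).
#             uptake  A->B  biomass  secretion
S = np.array([[  1,    -1,     0,       0],    # metabolite A
              [  0,     1,    -1,      -1]])   # metabolite B
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake capped at 10
c = [0, 0, -1, 0]  # linprog minimises, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", -res.fun)  # 10.0 with these bounds
print("flux distribution:", res.x)
```

In this setting, a strain-design algorithm simulates a gene knockout by pinning the corresponding reaction's bounds to (0, 0) and re-solving, searching over knockout sets for designs that couple production to growth.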
Abstract:
Charged-particle spectra obtained in 0.15 nb⁻¹ of Pb+Pb interactions at √s_NN = 2.76 TeV and 4.2 pb⁻¹ of pp interactions at √s = 2.76 TeV with the ATLAS detector at the LHC are presented in a wide transverse momentum (0.5
Abstract:
This Letter presents a search for a heavy neutral particle decaying into an opposite-sign different-flavor dilepton pair, e±μ∓, e±τ∓, or μ±τ∓, using 20.3 fb⁻¹ of pp collision data at √s = 8 TeV collected by the ATLAS detector at the LHC. The numbers of observed candidate events are compatible with the Standard Model expectations. Limits are set on the cross section of new phenomena in two scenarios: the production of ν̃τ in R-parity-violating supersymmetric models and the production of a lepton-flavor-violating Z′ vector boson.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons, employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons, using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb⁻¹. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
Project management involves one-time endeavors that demand getting it right the first time. Project scheduling, one of the most modeled stages of the project management process, still faces a wide gap between theory and practice. Computationally demanding models, and the consequent call for their simplification, divert the implementation of such models in project management tools from the actual day-to-day project management process. Special focus is placed on the robustness of the generated project schedules in the face of omnipresent uncertainty. An "easy" way out is to add time buffers, more or less cleverly calculated, which always increase project duration and, correspondingly, cost. A better approach to dealing with uncertainty is to exploit the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in resource allocation modeling and scheduling techniques that cope with increasing resource flexibility, as expressed in "Flexible Resource Constraint Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research toward more adequate project management tools. In practice, this approach has frequently been used by project managers in an ad hoc way.
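The slack referred to above is the total float of classical critical-path analysis, computed with a forward and a backward pass over the activity network. A minimal sketch, with an invented four-activity network:

```python
# Critical-path method on a toy network; durations and precedences invented.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
order = ["A", "B", "C", "D"]                  # a topological order

# Forward pass: earliest start / earliest finish.
es, ef = {}, {}
for t in order:
    es[t] = max((ef[p] for p in preds[t]), default=0)
    ef[t] = es[t] + durations[t]

# Backward pass: latest finish / latest start.
project_end = max(ef.values())
succs = {t: [s for s, ps in preds.items() if t in ps] for t in preds}
lf, ls = {}, {}
for t in reversed(order):
    lf[t] = min((ls[s] for s in succs[t]), default=project_end)
    ls[t] = lf[t] - durations[t]

for t in order:
    print(t, "total float =", ls[t] - es[t])  # 0 on the critical path A-C-D
```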