989 results for Meshfree particle methods
Abstract:
Occupational risks in nanotechnology research laboratories are an important topic, since a great number of researchers work in this area. Risk assessment, performed by both qualitative and quantitative methods, is a necessary step in the management of occupational risks. Risk assessment can be performed by qualitative methods that gather consensus in the scientific community; it is also possible to use quantitative methods, based on different techniques and metrics, as indicative exposure limits are being set by several institutions. When performing a risk assessment, information on the materials used is very important and, if it is not up to date, it can bias the assessment results. The risk of exposure to TiO2 nanoparticles was assessed in a research laboratory using a quantitative exposure method and qualitative risk assessment methods. It was found that the results from the direct-reading Condensation Particle Counter (CPC) equipment and the CB Nanotool seem to be related and aligned, while the results obtained with Stoffenmanager Nano seem to indicate a higher risk level.
Abstract:
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposure can occur all along the life cycle of an NM and of "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into three main phases (pre-production, production, and post-production), which allowed the assessment methods to be tested in different situations. The risk assessment of the PM manufacturing process was carried out using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); the Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance for working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy, method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. It was verified that the different methods applied produce different final results: in phases 1 and 3 the risk tends to be classified as medium-high, while for phase 2 the most common result is a medium level. The use of qualitative methods needs to be improved by defining narrow criteria for method selection in each assessed situation, bearing in mind that uncertainties are also a relevant factor when dealing with risks in the nanotechnology field.
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, making it possible to attain routing configurations that are robust to changes in the traffic demands and that keep the network stable even in the presence of link-failure events. The illustrative results presented clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
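To give a flavour of how an evolutionary engine can search for link weights, the following is a deliberately simplified (1+1) evolutionary sketch on a hypothetical two-path instance. It is not the framework described above: the inverse-weight traffic split is a crude stand-in for real shortest-path routing, and all names, capacities, and parameters are illustrative assumptions.

```python
import random

random.seed(1)

# Hypothetical toy instance: one demand of 12 units split between two
# parallel paths in inverse proportion to their OSPF-style weights (a
# crude stand-in for real shortest-path routing); the paths have
# different capacities, so balanced utilisation needs unequal weights.
CAPS = [10.0, 5.0]
DEMAND = 12.0

def max_utilisation(w):
    """Worst link utilisation under the inverse-weight traffic split."""
    inv = [1.0 / w[0], 1.0 / w[1]]
    share = [x / sum(inv) for x in inv]
    return max(DEMAND * s / c for s, c in zip(share, CAPS))

def evolve(generations=2000):
    """(1+1) evolutionary search over integer link weights in [1, 20]."""
    best = [random.randint(1, 20), random.randint(1, 20)]
    best_fit = max_utilisation(best)
    for _ in range(generations):
        # mutate each weight by -1, 0 or +1, clipped to the valid range
        child = [min(20, max(1, w + random.choice((-1, 0, 1)))) for w in best]
        fit = max_utilisation(child)
        if fit <= best_fit:            # accept non-worsening mutations
            best, best_fit = child, fit
    return best, best_fit

weights, util = evolve()
print(weights, util)   # utilisation should approach the optimum of 0.8
```

A real framework would replace the toy fitness with a routing simulation over the actual topology and use a population-based MOEA rather than a single-individual loop.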
Abstract:
In this work we compare two different numerical schemes for the solution of the time-fractional diffusion equation with a variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method in which Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
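A concrete building block of implicit schemes for time-fractional equations is the standard L1 finite-difference approximation of the Caputo derivative. The sketch below is this generic textbook construction, not the authors' scheme; the function name and the test function u(t) = t² are illustrative choices.

```python
import math

def caputo_l1(u_vals, dt, alpha):
    """L1 finite-difference approximation of the Caputo derivative of
    order alpha (0 < alpha < 1) at the last grid point, given values
    u_vals = [u(0), u(dt), ..., u(n*dt)]."""
    n = len(u_vals) - 1
    coeff = dt ** (-alpha) / math.gamma(2 - alpha)
    total = 0.0
    for j in range(n):
        # standard L1 weights b_j = (j+1)^(1-alpha) - j^(1-alpha)
        b_j = (j + 1) ** (1 - alpha) - j ** (1 - alpha)
        total += b_j * (u_vals[n - j] - u_vals[n - j - 1])
    return coeff * total

# sanity check against the exact Caputo derivative of u(t) = t^2,
# which is 2 t^(2-alpha) / Gamma(3-alpha)
alpha, dt, n = 0.5, 1e-3, 1000
u = [(i * dt) ** 2 for i in range(n + 1)]
approx = caputo_l1(u, dt, alpha)
exact = 2 * 1.0 ** (2 - alpha) / math.gamma(3 - alpha)
print(approx, exact)   # the two values should agree to a few decimals
```

The L1 approximation converges at rate O(dt^(2-alpha)), which is why fine time grids are needed for small alpha.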
Abstract:
PhD Thesis in Bioengineering
Abstract:
PhD thesis in Bioengineering
Abstract:
Shifting from chemical to biotechnological processes is one of the cornerstones of 21st century industry. The production of a great range of chemicals via biotechnological means is a key challenge on the way toward a bio-based economy. However, this shift is occurring at a pace slower than initially expected. The development of efficient cell factories that allow for competitive production yields is of paramount importance for this leap to happen. Constraint-based models of metabolism, together with in silico strain design algorithms, promise to reveal insights into the best genetic design strategies, a step further toward achieving that goal. In this work, a thorough analysis of the main in silico constraint-based strain design strategies and algorithms is presented, their application in real-world case studies is analyzed, and a path for the future is discussed.
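The core computation behind constraint-based models of metabolism is flux balance analysis, posed as a linear program: maximise a target flux subject to steady-state mass balance S·v = 0 and capacity bounds. A minimal sketch on a hypothetical three-reaction toy network (all names, stoichiometry, and bounds are illustrative assumptions):

```python
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# uptake, conversion, biomass drain) of a hypothetical toy network.
S = [[1, -1,  0],   # metabolite A: produced by uptake, consumed by conversion
     [0,  1, -1]]   # metabolite B: produced by conversion, drained by biomass

# Flux balance analysis: maximise the biomass flux v3 subject to S v = 0
# (steady state) and capacity bounds on each flux.
c = [0, 0, -1]                             # linprog minimises, so negate v3
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units
res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
print(res.x)   # optimal flux distribution; biomass flux hits the uptake cap
```

Strain design algorithms build on this by searching over gene or reaction knockouts, re-solving the linear program for each candidate design.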
Abstract:
Charged-particle spectra obtained in 0.15 nb−1 of Pb+Pb interactions at √sNN = 2.76 TeV and 4.2 pb−1 of pp interactions at √s = 2.76 TeV with the ATLAS detector at the LHC are presented in a wide transverse momentum range (0.5
Abstract:
This Letter presents a search for a heavy neutral particle decaying into an opposite-sign different-flavor dilepton pair, e±μ∓, e±τ∓, or μ±τ∓, using 20.3 fb−1 of pp collision data at √s = 8 TeV collected by the ATLAS detector at the LHC. The numbers of observed candidate events are compatible with the Standard Model expectations. Limits are set on the cross section of new phenomena in two scenarios: the production of ν̃τ in R-parity-violating supersymmetric models and the production of a lepton-flavor-violating Z′ vector boson.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC, at a center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
Project management involves one-time endeavors that demand getting it right the first time. On the other hand, project scheduling, despite being one of the most modeled project management process stages, still faces a wide gap between theory and practice. Demanding computational models, and the consequent calls for their simplification, divert the implementation of such models in project management tools from the actual day-to-day project management process. Special attention is given to the robustness of the generated project schedules in the face of the omnipresence of uncertainty. An "easy" way out is to add, more or less cleverly calculated, time buffers, which always increase project duration and, correspondingly, cost. A better approach to dealing with uncertainty seems to be to exploit the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in modeling resource allocation and scheduling techniques that cope with increasing flexibility in resources, as expressed in "Flexible Resource Constraint Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research toward more adequate project management tools. In practice, this approach has frequently been used by project managers in an ad hoc way.
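The slack referred to above is, in classical critical-path terms, the total float of each activity. A minimal sketch of its computation on a hypothetical four-activity toy project (activity names, durations, and precedences are illustrative):

```python
# Hypothetical toy project: activity durations and precedence relations.
DUR = {"A": 3, "B": 2, "C": 4, "D": 2}
PRED = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def total_float(dur, pred):
    """Classic CPM forward/backward pass; slack = LS - ES per activity."""
    order = list(dur)  # already topologically ordered in this toy example
    es = {}            # earliest start times (forward pass)
    for a in order:
        es[a] = max((es[p] + dur[p] for p in pred[a]), default=0)
    makespan = max(es[a] + dur[a] for a in order)
    succ = {a: [b for b in order if a in pred[b]] for a in order}
    lf = {}            # latest finish times (backward pass)
    for a in reversed(order):
        lf[a] = min((lf[s] - dur[s] for s in succ[a]), default=makespan)
    return {a: lf[a] - dur[a] - es[a] for a in order}

print(total_float(DUR, PRED))   # activities with zero float are critical
```

Here activity B carries two units of float that a robust schedule could exploit to absorb uncertainty without adding buffers; FRCPSP formulations extend this picture with flexible resource assignments.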
Abstract:
Extreme value theory (EVT) deals with the occurrence of extreme phenomena. The tail index is a very important parameter appearing in the estimation of the probability of rare events. Under a semiparametric framework, inference requires the choice of a number k of upper order statistics to be considered. This is the crux of the matter, and there is no definitive formula for the choice, since a small k leads to high variance while large values of k tend to increase the bias. Several methodologies have emerged in the literature, especially concerning the most popular Hill estimator (Hill, 1975). In this work we compare through simulation the well-known procedures presented in Drees and Kaufmann (1998), Matthys and Beirlant (2000), Beirlant et al. (2002) and de Sousa and Michailidis (2004) with a heuristic scheme considered in Frahm et al. (2005), proposed there for the estimation of a different tail measure but in a similar context. We will see that the new method may be an interesting alternative.
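The Hill estimator mentioned above has a simple closed form: the average log-excess of the k largest observations over the (k+1)-th largest. A sketch (function name and the Pareto sanity check are illustrative):

```python
import math
import random

def hill_estimator(sample, k):
    """Hill (1975) estimator of the tail index gamma from the k upper
    order statistics of a positive sample."""
    xs = sorted(sample)           # ascending order statistics
    n = len(xs)
    if not 0 < k < n:
        raise ValueError("k must satisfy 0 < k < n")
    x_nk = xs[n - k - 1]          # X_{(n-k)}: the (k+1)-th largest value
    # average log-excess of the k largest observations over X_{(n-k)}
    return sum(math.log(xs[n - i] / x_nk) for i in range(1, k + 1)) / k

# Pareto(alpha) data have tail index gamma = 1/alpha; a quick sanity check:
random.seed(0)
alpha = 2.0
data = [random.paretovariate(alpha) for _ in range(20000)]
print(hill_estimator(data, 500))   # should be close to 1/alpha = 0.5
```

Plotting this estimate against k (the Hill plot) makes the bias-variance trade-off discussed above visible: the curve is erratic for small k and drifts away from the true value as k grows.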
Abstract:
The paper presents studies of Bose-Einstein Correlations (BEC) for pairs of like-sign charged particles measured in the kinematic range pT > 100 MeV and |η| < 2.5 in proton-proton collisions at centre-of-mass energies of 0.9 and 7 TeV with the ATLAS detector at the CERN Large Hadron Collider. The integrated luminosities are approximately 7 μb−1, 190 μb−1 and 12.4 nb−1 for the 0.9 TeV, 7 TeV minimum-bias and 7 TeV high-multiplicity data samples, respectively. The multiplicity dependence of the BEC parameters characterizing the correlation strength and the correlation source size is investigated for charged-particle multiplicities of up to 240. A saturation effect in the multiplicity dependence of the correlation source size is observed using the high-multiplicity 7 TeV data sample. The dependence of the BEC parameters on the average transverse momentum of the particle pair is also investigated.
Abstract:
PhD thesis in Chemical and Biological Engineering (knowledge area: Enzymatic and Fermentation Engineering)