982 results for Differential Permeation Method
Abstract:
This paper aims to describe the Sequential Excavation Method, used for excavation in underground works, as well as the related risks and preventive measures. This method has characteristics that differentiate it from other tunnelling techniques: it uses a larger number of workers and equipment; it has a high concurrency of tasks, with various workers and equipment quite exposed to hazards; and it uses many potentially aggressive chemicals. Firstly, a broad overview of this issue is given. Afterwards, the results of a survey of a sample of experienced technicians are presented, aimed at gauging the relevance of a set of guidelines relating to the design and work phases, applicable to the domestic market and prepared following technical visits to works abroad.
Abstract:
In this work we compare two different numerical schemes for the solution of the time-fractional diffusion equation with a variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method where Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
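Neither scheme is reproduced in the abstract, but implicit methods for Caputo-type time-fractional equations typically build on the classical L1 discretization of the fractional derivative. The sketch below is a minimal illustration of that formula only; the step size, test function and names are ours, not the paper's:

```python
import math
import numpy as np

def caputo_l1(u, dt, alpha):
    """Classical L1 approximation of the Caputo derivative D_t^alpha u
    at t_n (0 < alpha < 1), given the history u[0], ..., u[n]."""
    n = len(u) - 1
    k = np.arange(n)
    # L1 weights: b_k = (k+1)^(1-alpha) - k^(1-alpha)
    b = (k + 1.0) ** (1.0 - alpha) - k ** (1.0 - alpha)
    # Backward differences u^{n-k} - u^{n-k-1}, k = 0, ..., n-1
    diffs = u[n - k] - u[n - k - 1]
    return dt ** (-alpha) / math.gamma(2.0 - alpha) * np.dot(b, diffs)

# Sanity check on u(t) = t, whose Caputo derivative is t^(1-alpha)/Gamma(2-alpha);
# the L1 formula is exact for piecewise-linear functions.
dt, alpha = 0.01, 0.5
t = np.arange(0.0, 1.0 + dt, dt)
print(caputo_l1(t, dt, alpha), t[-1] ** (1.0 - alpha) / math.gamma(2.0 - alpha))
```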
Abstract:
The observational method in tunnel engineering allows the real-time evaluation of the actual ground conditions and the adoption of measures if the ground's behaviour deviates considerably from predictions. However, it lacks a consistent and structured methodology for using the monitoring data to adapt the support system in real time. Limit criteria above which adaptation is required are not defined, and complex inverse analysis procedures (Rechea et al. 2008, Levasseur et al. 2010, Zentar et al. 2001, Lecampion et al. 2002, Finno and Calvello 2005, Goh 1999, Cui and Pan 2012, Deng et al. 2010, Mathew and Lehane 2013, Sharifzadeh et al. 2012, 2013) may be needed to analyse the problem consistently. In this paper a methodology for the real-time adaptation of support systems during tunnelling is presented. In a first step, limit criteria for displacements and stresses are proposed. The methodology uses graphics constructed during the project stage, based on parametric calculations, to assist in the process; when these graphics are not available, since it is not possible to predict every possible scenario, inverse analysis calculations are carried out. The methodology is applied to the “Bois de Peu” tunnel, which is composed of two tubes, each over 500 m long. High uncertainty levels existed concerning the heterogeneity of the soil and, consequently, the geomechanical design parameters. The methodology was applied in four sections, and the results focus on two of them. It is shown that the methodology has the potential to be applied in real cases, contributing to a consistent approach to real-time adaptation of the support system, and the results highlight the importance of good-quality, specific monitoring data for improving the inverse analysis procedure.
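The inverse analysis step referenced above amounts to back-calculating geomechanical parameters so that model-predicted displacements match the monitoring data. A minimal sketch of that loop follows; the closed-form forward model, parameter names and numbers are placeholders, not the tunnel models used in the paper:

```python
import numpy as np
from scipy.optimize import least_squares

# Monitored displacements (mm) at several instrumented sections (illustrative values)
measured = np.array([12.1, 14.8, 9.7, 11.3])
depths = np.array([40.0, 55.0, 35.0, 42.0])  # overburden depth of each section (m)

def forward_model(params, depths):
    """Stand-in for a numerical tunnel model: predicts displacements from
    geomechanical parameters (here a stiffness E and a stress ratio K0)."""
    E, K0 = params
    return K0 * depths / E  # placeholder closed-form response

def residuals(params):
    return forward_model(params, depths) - measured

# Identify the parameter set that best reproduces the monitoring data
fit = least_squares(residuals, x0=[2.0, 0.5], bounds=([0.1, 0.2], [50.0, 2.0]))
print("back-calculated parameters:", fit.x)
```

In practice the forward model is a numerical simulation, so each residual evaluation is expensive; this is why the methodology prefers pre-computed parametric graphics and falls back on inverse analysis only when they are unavailable.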
Abstract:
A new very high-order finite volume method to solve problems with harmonic and biharmonic operators in one-dimensional geometries is proposed. The main ingredient is a polynomial reconstruction based on local interpolations of mean values, providing accurate approximations of the solution up to sixth-order accuracy. First developed for the harmonic operator, an extension to the biharmonic operator is obtained, which allows designing a very high-order finite volume scheme where the solution is obtained by solving a matrix-free problem. An application in elasticity coupling the two operators is presented: we consider a beam subject to a combination of tensile and bending loads, where the main goal is the determination of the critical stress point for an intramedullary nail.
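The reconstruction step can be illustrated in isolation: given the mean values of the solution over a stencil of cells, find the polynomial whose cell averages match them. The sketch below is a toy version of that idea; the stencil, degree and function names are ours:

```python
import numpy as np

def reconstruct_from_means(edges, means, degree):
    """Return coefficients c of p(x) = sum_j c[j] * x**j such that the
    average of p over each cell [edges[i], edges[i+1]] equals means[i]."""
    n = len(means)
    A = np.empty((n, degree + 1))
    for i in range(n):
        a, b = edges[i], edges[i + 1]
        for j in range(degree + 1):
            # Exact cell average of x^j over [a, b]
            A[i, j] = (b ** (j + 1) - a ** (j + 1)) / ((j + 1) * (b - a))
    # Least-squares fit (exact when n == degree + 1)
    c, *_ = np.linalg.lstsq(A, means, rcond=None)
    return c

# The cell averages of u(x) = x^2 over three unit cells recover u exactly
edges = np.array([0.0, 1.0, 2.0, 3.0])
means = np.array([1.0 / 3.0, 7.0 / 3.0, 19.0 / 3.0])
print(reconstruct_from_means(edges, means, degree=2))  # ~[0, 0, 1]
```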
Abstract:
A selection of Melipona scutellaris queens from the most productive colonies was carried out over eight months during an orange honeyflow. Each colony was evaluated by its production, that is, the gross weight produced (pollen, brood, geopropolis and wax of each hive). With these data a coefficient of repeatability was estimated by the intraclass correlation method, yielding r = 0.835 ± 0.071. The repeatability is very high, showing that the analysed trait (production) is repeatable. Selection was then carried out using the regression coefficient of each colony and the respective production gain. Using these data, the colonies were divided into three groups according to the method of Vencovsky and Kerr (1982): (a) the colonies of highest productivity, (b) those of lowest productivity, and (c) those of intermediate productivity. The colonies with the highest production (group a) gave their queens to those with the lowest production (group b), whose queens were removed and killed, while those of intermediate productivity (group c) kept the same queens throughout the experiment, both before and after the selection. The change in weight, that is, the genetic response, was R = 7.98 g per day, which indicated a selection gain. The realized heritability is estimated as twice the ratio of the response to selection (R) to the selection differential (S), that is, h²R = 2(R/S), giving h²R = 0.166.
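For reference, the realized-heritability estimator quoted above can be written out; the factor 2 is the standard correction when selection is practised through one sex only (here, only the queens were replaced), which is our reading rather than something the abstract states:

```latex
% R: response to selection; S: selection differential.
% With selection applied through one sex only, R = h^2 S / 2, hence
h^2_R = \frac{2R}{S} = 0.166
```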
Abstract:
The usually high cost of commercial codes, together with some technical limitations, clearly limits the employment of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly speeding up the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. In the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified by comparing the predicted results with analytical solutions and results published in the literature, and by using the Method of Manufactured Solutions.
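The Method of Manufactured Solutions mentioned at the end is solver-agnostic and easy to illustrate: choose a target solution, symbolically derive the source term that makes it exact, then check that the numerical error decays at the expected order. The toy sketch below applies the idea to a 1-D diffusion problem, not to the authors' viscoelastic solver:

```python
import sympy as sp
import numpy as np

# Manufacture a solution of -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0
x = sp.symbols('x')
u_exact = sp.sin(sp.pi * x)
f = sp.simplify(-sp.diff(u_exact, x, 2))  # source term that makes u_exact exact
f_num = sp.lambdify(x, f, 'numpy')
u_num = sp.lambdify(x, u_exact, 'numpy')

def max_error(n):
    """Solve -u'' = f with second-order finite differences on n interior nodes."""
    h = 1.0 / (n + 1)
    xi = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h ** 2
    u = np.linalg.solve(A, f_num(xi))
    return np.max(np.abs(u - u_num(xi)))

# The error should drop by a factor of ~4 per refinement (second order)
for n in (16, 32, 64):
    print(n, max_error(n))
```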
Abstract:
Various differential cross-sections are measured in top-quark pair (tt̄) events produced in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV at the LHC with the ATLAS detector. These differential cross-sections are presented for a data set corresponding to an integrated luminosity of 4.6 fb⁻¹. The differential cross-sections are presented in terms of kinematic variables of a top-quark proxy referred to as the pseudo-top-quark, whose dependence on theoretical models is minimal. The pseudo-top-quark can be defined in terms of either reconstructed detector objects or stable particles in an analogous way. The measurements are performed on tt̄ events in the lepton+jets channel, requiring exactly one charged lepton and at least four jets, with at least two of them tagged as originating from a b-quark. The hadronic and leptonic pseudo-top-quarks are defined via the leptonic or hadronic decay mode of the W boson produced by the top-quark decay in events with a single charged lepton. The cross-section is measured as a function of the transverse momentum and rapidity of both the hadronic and leptonic pseudo-top-quark, as well as of the transverse momentum, rapidity and invariant mass of the pseudo-top-quark pair system. The measurements are corrected for detector effects and are presented within a kinematic range that closely matches the detector acceptance. Differential cross-section measurements of the pseudo-top-quark variables are compared with several Monte Carlo models that implement next-to-leading-order or leading-order multi-leg matrix-element calculations.
Abstract:
Integrated master's dissertation in Psychology
Abstract:
We developed an alternative culture method, which we named PKO (initials in tribute to Petroff, Kudoh and Ogawa), for isolating Mycobacterium tuberculosis from sputum for the diagnosis of pulmonary tuberculosis (TB), and compared its performance with the Swab and Petroff methods. For validation of the technique, sputum samples from patients with suspected pulmonary TB were examined by acid-fast microscopy (direct and concentrated smear) and by the PKO, Swab and Petroff methods. We found that the Petroff and PKO methods are equally effective in isolating M. tuberculosis. However, with the PKO method 65% of the isolated strains were detected within ≤15 days, while with the Petroff method detection occurred mostly (71%) in an interval of 16-29 days. For smear-positive samples, the average isolation time of the PKO method is longer only than that reported for Bactec 460TB. In conclusion, excluding the pH neutralization stage in the PKO method reduces the manipulation of the samples, shortens the execution time of the culture compared with the Petroff method, and facilitates the training of the professionals involved in the laboratory diagnosis of tuberculosis.
Abstract:
Nowadays, the sustainability of buildings is extremely important. This concept is in line with the European aims of the Horizon 2020 programme, which addresses the reduction of environmental impacts through aspects such as energy efficiency and renewable technologies, among others. Sustainability is an extremely broad concept, but this work is concerned with sustainability in buildings. Within this concept, which aims at integrating the environmental, social and economic levels towards the preservation of the planet and the integrity of the users, there are currently several environmental certification tools applicable to the construction industry (LEED, BREEAM, DGNB, SBTool, among others). In this context, the SBTool (Sustainable Building Tool) stands out: it is employed in several countries and can be applied to institutions of basic education, which are the basis for forming a country's critical mass and for its development. The main aim of this research is to select indicators that can be used in a methodology for the sustainability assessment (SBTool) of school buildings in Portugal and Brazil. To this end, other methodologies that already incorporate parameters directly related to the school environment, such as BREEAM or LEED, will also be analysed.
Abstract:
Measurements of the total and differential cross sections of Higgs boson production are performed using 20.3 fb⁻¹ of pp collisions produced by the Large Hadron Collider at a center-of-mass energy of √s = 8 TeV and recorded by the ATLAS detector. Cross sections are obtained from measured H→γγ and H→ZZ∗→4ℓ event yields, which are combined accounting for detector efficiencies, fiducial acceptances and branching fractions. Differential cross sections are reported as a function of Higgs boson transverse momentum, Higgs boson rapidity, number of jets in the event, and transverse momentum of the leading jet. The total production cross section is determined to be σ(pp→H) = 33.0 ± 5.3 (stat) ± 1.6 (syst) pb. The measurements are compared to state-of-the-art predictions.
Abstract:
Our objective was to validate a new device dedicated to measuring the light disturbances surrounding bright light sources under different sources of potential variability. Twenty subjects were involved in the study. Light distortion was measured using an experimental prototype (light distortion analyzer, CEORLab, University of Minho, Portugal) comprising a panel of twenty-four LED arrays at 2 m. Sources of variability included intrasession and intersession repeated measures, pupil size (3 versus 6 mm), defocus (+0.50) correction for the working distance, angular resolution (15 deg versus 30 deg), and temporal stimuli presentation. Size, shape, location and irregularity parameters were obtained. At a low speed of presentation of the stimuli, changes in angular resolution did not affect the parameters measured. Results did not change with pupil size. The intensity of the central glare source significantly influenced the outcomes. Examination time was reduced by 30% when a 30 deg angular resolution was explored instead of 15 deg. Measurements were fast and repeatable under the same experimental conditions. Size and shape parameters showed the highest consistency, whereas location and irregularity parameters showed lower consistency. The system was sensitive to changes in the intensity of the central glare source but not to pupil changes in this sample of healthy subjects.
Abstract:
The main features of most components consist of simple basic functional geometries: planes, cylinders, spheres and cones. Shape and position recognition of these geometries is essential for the dimensional characterization of components and represents an important contribution to the life cycle of the product, concerning in particular the manufacturing and inspection processes of the final product. This work aims to establish an algorithm to recognize such geometries automatically, without operator intervention. Using differential geometry, large volumes of data can be processed and the basic functional geometries recognized. The original data can be obtained by rapid acquisition methods, such as 3D survey or photography, and then converted into Cartesian coordinates. The satisfaction of intrinsic decision conditions allows the different geometries to be identified quickly, without operator intervention. Since inspection is generally a time-consuming task, this method reduces operator intervention in the process. The algorithm was first tested using geometric data generated in MATLAB and then on sets of data points acquired with a coordinate measuring machine and a 3D scanner on real physical surfaces. A comparison of the time spent in measuring is presented to show the advantage of the method. The results validated the suitability and potential of the algorithm proposed here.
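The decision conditions alluded to can be pictured through the curvature signature of the basic geometries: a plane has zero Gaussian and mean curvature, cylinders and cones are developable (zero Gaussian curvature only), and a sphere is umbilical. The sketch below classifies a patch from estimated curvatures; the thresholds and rules are illustrative and need not coincide with the paper's actual conditions:

```python
def classify_patch(K, H, tol=1e-6):
    """Classify a smooth surface patch from its estimated Gaussian
    curvature K and mean curvature H (illustrative rules only)."""
    if abs(K) < tol and abs(H) < tol:
        return "plane"
    if abs(K) < tol:
        # Both are developable; telling them apart needs the variation
        # of H along the patch (constant for a cylinder, not for a cone)
        return "cylinder or cone"
    if K > tol and abs(H * H - K) < tol:
        return "sphere"  # umbilical: both principal curvatures equal
    return "other"

# A sphere of radius R has K = 1/R^2 and H = 1/R
R = 2.0
print(classify_patch(K=1.0 / R ** 2, H=1.0 / R))  # -> sphere
```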
Abstract:
The normalized differential cross section for top-quark pair production in association with at least one jet is studied as a function of the inverse of the invariant mass of the tt̄+1-jet system. This distribution can be used for a precise determination of the top-quark mass, since gluon radiation depends on the mass of the quarks. The experimental analysis is based on proton-proton collision data collected by the ATLAS detector at the LHC at a centre-of-mass energy of 7 TeV, corresponding to an integrated luminosity of 4.6 fb⁻¹. The selected events were identified using the lepton+jets top-quark-pair decay channel, where lepton refers to either an electron or a muon. The observed distribution is compared to a theoretical prediction at next-to-leading-order accuracy in quantum chromodynamics using the pole-mass scheme. With this method, the measured value of the top-quark pole mass is m_t^pole = 173.7 ± 1.5 (stat.) ± 1.4 (syst.) +1.0/−0.5 (theory) GeV. This result represents the most precise measurement of the top-quark pole mass to date.