995 results for "Pointing deviation"
Abstract:
OBJECTIVE: To evaluate the effects of 2 different doses of exogenous surfactant on pulmonary mechanics and on the regularity of pulmonary parenchyma inflation in newborn rabbits. METHOD: Newborn rabbits were submitted to tracheostomy and randomized into 4 study groups: the Control group did not receive any material inside the trachea; the MEC group was instilled with meconium, without surfactant treatment; the S100 and S200 groups were instilled with meconium and were treated with 100 and 200 mg/kg of exogenous surfactant (produced by Instituto Butantan), respectively. Animals from the 4 groups were mechanically ventilated during a 25-minute period. Dynamic compliance, ventilatory pressure, tidal volume, and maximum lung volume (P-V curve) were evaluated. Histological analysis was conducted using the mean linear intercept (Lm), and the lung tissue distortion index (SDI) was derived from the standard deviation of the means of the Lm. One-way analysis of variance was used with α = 0.05. RESULTS: After 25 minutes of ventilation, dynamic compliance (mL/cm H2O · kg) was 0.87 ± 0.07 (Control); 0.49 ± 0.04 (MEC*); 0.67 ± 0.06 (S100); and 0.67 ± 0.08 (S200), and ventilatory pressure (cm H2O) was 9.0 ± 0.9 (Control); 16.5 ± 1.7 (MEC*); 12.4 ± 1.1 (S100); and 12.1 ± 1.5 (S200). Both treated groups had lower Lm values and more homogeneity in the lung parenchyma compared to the MEC group: SDI = 7.5 ± 1.9 (Control); 11.3 ± 2.5 (MEC*); 5.8 ± 1.9 (S100); and 6.7 ± 1.7 (S200) (*P < 0.05 versus all the other groups). CONCLUSIONS: Animals treated with surfactant showed significant improvement in pulmonary mechanics and more regularity of the lung parenchyma in comparison to untreated animals. There was no difference in results between the two doses used.
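The SDI described above, the standard deviation of the per-field mean linear intercept (Lm) means, can be sketched as follows. The two-level structure of the data (chord measurements grouped by microscopic field) is an assumption based on how Lm is usually measured, not a detail given in the abstract.

```python
import statistics

def mean_linear_intercept(chord_lengths):
    # Lm for one microscopic field: the mean of the measured intercept chords
    return statistics.mean(chord_lengths)

def distortion_index(fields):
    # SDI: the standard deviation of the per-field Lm means,
    # as described in the abstract
    lm_means = [mean_linear_intercept(f) for f in fields]
    return statistics.stdev(lm_means)
```

A homogeneous parenchyma gives similar Lm means across fields and hence a low SDI; heterogeneous inflation spreads the Lm means out and raises the SDI.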
Abstract:
This thesis justifies the need for, and develops, a new integrated model of practical reasoning and argumentation. After framing the work in terms of what is reasonable rather than what is rational (chapter 1), I apply the model for practical argumentation analysis and evaluation provided by Fairclough and Fairclough (2012) to a paradigm case of unreasonable individual practical argumentation provided by the mass murderer Anders Behring Breivik (chapter 2). The application shows that, by following the model, Breivik can with relative ease conclude that his reasoning to mass murder is reasonable, which is understood to be an unacceptable result. The causes that allow the model to reach such a conclusion are identified as conceptual confusions ingrained in the model, a tension in how values function within it, and a lack of creativity on Breivik's part. By distinguishing between the dialectical and the dialogical, and between reasoning and argumentation, for individual and multiple participants, chapter 3 addresses these conceptual confusions and helps lay the foundation for the design of a new integrated model of practical reasoning and argumentation (chapter 4). After the theoretical aspects of the new model are laid out, it is used to re-test Breivik's reasoning in light of a developed discussion of the motivation for the new place and role of moral considerations (chapter 5). The application of the new model shows ways in which Breivik could have concluded that his practical argumentation was unreasonable, and the model is thus argued to improve upon that of Fairclough and Fairclough. It is acknowledged, however, that since the model cannot guarantee a reasonable conclusion, improving the critical creative capacity of the individual using it is also of paramount importance (chapter 6). The thesis concludes by discussing the contemporary importance of improving practical reasoning and by pointing to areas for further research (chapter 7).
Abstract:
A notable phenomenon of our century has been the emergence of so-called "global art" (Belting), signalling the crisis of the "art world" and the widespread dissemination of artistic practices. In this context, Rothko's work gains an unexpected force. Although he is usually inscribed within "modernism", with its values of purity and specificity of the medium, in this case painting, our research reveals that the Rothkonian gesture largely exceeds this representation, which would lead to a radical distinction between a mythical and surrealist phase, an abstractionist "colour field" phase and, finally, a sublime phase of the paintings for Rothko's Ecumenical Chapel. There is an evident continuity that points to a geoaesthetics, in which the earth and its habitability play a crucial role. Hence the need to inscribe Rothko's work within contemporary geophilosophy, as outlined by Gilles Deleuze and Félix Guattari. A critical analysis of Rothko's work and aesthetics was therefore carried out, which, prophetically yet unconsciously, seems to open the way for the thought of an art of the earth. This line of continuity runs through Rothko's entire oeuvre, reflecting a picturing of the world and the will to create a pictorial and poetic world, reduced to minimal, post-figurative elements, yet in which the incidence of motifs such as frame and opening, horizon line, porticos and passages can be recognized. In a second moment, this "unconscious" dimension is explored in a personal artistic project, unfolding into pictorial approaches of painting, installation and video, which we call "A Terra como Acontecimento" ("The Earth as Event"). This project extends the Rothkonian effort while profoundly altering it, namely through the use of materials, the mutation in the use of colour, and the way in which the figurative elements are radically altered by the mere transposition of the perspective used.
If the Rothkonian resonance is very much present, no less present is the intention of a dialogical confrontation with Mark Rothko's oeuvre. What in this important artist was the unconscious, marked by myth and theology, by the delimitation of a quite classical horizon line and, above all, by a markedly theological verticality, is in "A Terra como Acontecimento" the matter that is profoundly radicalized, as is the conceptual logic, which is preferentially circular, without absolute orientation, and incomplete, implying another vision of the "opening"/"closure" so essential in Rothko's work. This research is expected to make a significant contribution to current debates on art in contemporaneity.
Abstract:
The objective of the work presented in this thesis was the development of an innovative approach for the separation of enantiomers of secondary alcohols, combining the use of an ionic liquid (IL) - both as solvent for conducting enzymatic kinetic resolution and as acylating agent - with the use of carbon dioxide (CO2) as solvent for extraction. Menthol was selected for testing this reaction/separation approach due to the increasing demand for this substance, which is widely used in the pharmaceutical, cosmetics and food industries. With a view to using an ionic ester as acylating agent, whose conversion releases ethanol, and given the need to remove this alcohol so as to drive the reaction equilibrium forward, a phase equilibrium study was conducted for the ethanol/(±)-menthol/CO2 system, at pressures between 8 and 10 MPa and temperatures between 40 and 50 °C. It was found that CO2 is more selective towards ethanol, especially at the lowest pressure and highest temperature tested, leading to separation factors in the range 1.6-7.6. The pressure-temperature-composition data obtained were correlated with the Peng-Robinson equation of state and the Mathias-Klotz-Prausnitz mixing rule. The model fit the experimental results well, with an average absolute deviation (AAD) of 3.7%. The resolution of racemic menthol was studied using two lipases, namely lipase from Candida rugosa (CRL) and immobilized lipase B from Candida antarctica (CALB), and two ionic acylating esters. No reaction was detected in either case. (R,S)-1-phenylethanol was used next, and it was found that with CRL only low, nonselective conversion of the alcohol took place, whereas CALB led to an enantiomeric excess (ee) of the substrate of 95%, at 30% conversion.
Other acylating agents were tested for the resolution of (±)-menthol, namely vinyl esters and acid anhydrides, using several lipases and varying other parameters that affect conversion and enantioselectivity, such as substrate concentration, solvent and temperature. One such acylating agent was propionic anhydride. A phase equilibrium study was thus performed on the propionic anhydride/CO2 system, at temperatures between 35 and 50 °C. This study revealed that, at 35 °C and pressures from 7 MPa, the system is monophasic for all compositions. The enzymatic catalysis studies carried out with propionic anhydride revealed that the extent of noncatalyzed reaction was high, with a negative effect on enantioselectivity. These studies also showed that it was possible to reduce considerably the impact of the noncatalyzed reaction relative to the reaction catalyzed by CRL by lowering the temperature to 4 °C. Vinyl decanoate was shown to lead to the best results at conditions amenable to a process combining the use of supercritical CO2 as agent for post-reaction separation. The use of vinyl decanoate in a number of IL solvents, namely [bmim][PF6], [bmim][BF4], [hmim][PF6], [omim][PF6], and [bmim][Tf2N], led to enantiomeric excess of product (eep) values of over 96%, at about 50% conversion, using CRL. In n-hexane and supercritical CO2, the reaction progressed more slowly. (...)
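In phase-equilibrium correlation work, the quoted average absolute deviation (AAD) is most commonly a mean absolute relative deviation between calculated and experimental values. A minimal sketch under that assumption (the exact definition used by the authors is not stated in the abstract):

```python
def aad_percent(experimental, calculated):
    # AAD in percent, assumed here as (100/N) * sum(|y_calc - y_exp| / y_exp);
    # experimental and calculated are paired sequences of measured quantities
    # (e.g. equilibrium-phase compositions)
    pairs = list(zip(experimental, calculated))
    return 100.0 * sum(abs(c - e) / e for e, c in pairs) / len(pairs)
```

With this metric, an AAD of 3.7% means the Peng-Robinson/Mathias-Klotz-Prausnitz model reproduces the measured values to within about 4% on average.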
Abstract:
This paper presents a methodology based on Bayesian data fusion techniques applied to non-destructive and destructive tests for the structural assessment of historical constructions. The aim of the methodology is to reduce the uncertainties of the parameter estimation. The Young's modulus of granite stones was chosen as an example for the present paper. The methodology considers several levels of uncertainty, since the parameters of interest are treated as random variables with random moments. A new concept, the Trust Factor, was introduced to weight the uncertainty associated with each test's results, expressed through their standard deviation, according to the higher or lower reliability of each test in predicting a given parameter.
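The core idea can be illustrated with a minimal sketch of precision-weighted Bayesian fusion of independent Gaussian test estimates. The Trust Factor is interpreted here, purely for illustration, as a factor in (0, 1] that inflates the effective standard deviation of less reliable tests; the paper's actual formulation (random moments, hierarchical priors) is richer than this.

```python
def fuse_normal_estimates(tests):
    # Precision-weighted fusion of independent Gaussian estimates of one
    # parameter (e.g. Young's modulus of a granite stone).
    # Each test is (mean, std, trust) with trust in (0, 1]; a lower trust
    # down-weights the test, equivalent to inflating its std by 1/sqrt(trust)
    # (an illustrative reading of the Trust Factor, not the paper's exact rule).
    # Returns the fused mean and fused standard deviation.
    weights = [trust / std ** 2 for _, std, trust in tests]
    total = sum(weights)
    mean = sum(w * m for w, (m, _, _) in zip(weights, tests)) / total
    return mean, (1.0 / total) ** 0.5
```

Combining several tests this way always yields a fused standard deviation no larger than that of the best individual test, which is the uncertainty reduction the methodology targets.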
Abstract:
In recent decades it has been possible to identify several problems in construction industry project management, related to systematic failures in meeting schedule, cost and quality targets, which highlight the need to evaluate the factors that may cause these failures. It is therefore important to understand how project managers plan their projects, so that performance and results can be improved. It is also important to understand whether areas beyond cost and time management, which several studies identify as the most critical, receive the necessary attention from construction project managers. Although cost and time are the most sensitive areas, several other factors may lead to project failure. This study aims to understand the reasons that may cause deviations in cost, time and quality, from the project management point of view, in light of the knowledge areas defined by the PMI (Project Management Institute).
Abstract:
Rainwater harvesting systems allow properly collected, treated and supplied rainwater to be used for domestic purposes that do not require high water quality. To be sustainable, a rainwater harvesting system must be truly ecological, economically viable, socially fair and culturally diverse. The key element of such a system is the first-flush device, which diverts the first rains, which carry a significant load of pollutants and are unsuitable even for non-potable use. This article develops a theoretical and experimental study of a rainwater harvesting system for use in a single-family dwelling. The main goal is to describe the hydraulic operation of siphonic drainage systems through the incorporation of a first-flush device in a laboratory-installed rainwater harvesting system.
Abstract:
Doctoral thesis in Sociology
Abstract:
Master's dissertation in Administrative Law
Abstract:
Results of a search for decays of massive particles to fully hadronic final states are presented. This search uses 20.3 fb−1 of data collected by the ATLAS detector in √s = 8 TeV proton-proton collisions at the LHC. Signatures based on high jet multiplicities without requirements on the missing transverse momentum are used to search for R-parity-violating supersymmetric gluino pair production with subsequent decays to quarks. The analysis is performed using a requirement on the number of jets, in combination with separate requirements on the number of b-tagged jets, as well as a topological observable formed from the scalar sum of the mass values of large-radius jets in the event. Results are interpreted in the context of all possible branching ratios of direct gluino decays to various quark flavors. No significant deviation is observed from the expected Standard Model backgrounds estimated using jet-counting as well as data-driven templates of the total-jet-mass spectra. Gluino pair decays to ten or more quarks via intermediate neutralinos are excluded for gluino masses m(g̃) < 1 TeV, assuming a neutralino mass m(χ̃₁⁰) = 500 GeV. Direct gluino decays to six quarks are excluded for m(g̃) < 917 GeV for light-flavor final states, and results for various flavor hypotheses are presented.
Abstract:
A search for new charged massive gauge bosons, called W′, is performed with the ATLAS detector at the LHC, in proton-proton collisions at a centre-of-mass energy of √s = 8 TeV, using a dataset corresponding to an integrated luminosity of 20.3 fb−1. This analysis searches for W′ bosons in the W′ → tb̄ decay channel in final states with electrons or muons, using a multivariate method based on boosted decision trees. The search covers masses between 0.5 and 3.0 TeV, for right-handed or left-handed W′ bosons. No significant deviation from the Standard Model expectation is observed, and limits are set on the W′ → tb̄ cross-section times branching ratio and on the W′-boson effective couplings as a function of the W′-boson mass using the CLs procedure. For a left-handed (right-handed) W′ boson, masses below 1.70 (1.92) TeV are excluded at 95% confidence level.
Abstract:
Doctoral thesis in Architecture / Architectural Culture.
Abstract:
A search for the production of single top quarks in association with missing energy is performed in proton-proton collisions at a centre-of-mass energy of √s = 8 TeV with the ATLAS experiment at the Large Hadron Collider using data collected in 2012, corresponding to an integrated luminosity of 20.3 fb−1. In this search, the W boson from the top quark is required to decay into an electron or a muon and a neutrino. No deviation from the Standard Model prediction is observed, and upper limits are set on the production cross-section for resonant and non-resonant production of an invisible exotic state in association with a right-handed top quark. In the case of resonant production, for a spin-0 resonance with a mass of 500 GeV, an effective coupling strength above 0.15 is excluded at 95% confidence level for the top quark and an invisible spin-1/2 state with mass between 0 GeV and 100 GeV. In the case of non-resonant production, an effective coupling strength above 0.2 is excluded at 95% confidence level for the top quark and an invisible spin-1 state with mass between 0 GeV and 657 GeV.
Abstract:
A search for the bb̄ decay of the Standard Model Higgs boson is performed with the ATLAS experiment using the full dataset recorded at the LHC in Run 1. The integrated luminosities used from pp collisions at √s = 7 and 8 TeV are 4.7 and 20.3 fb−1, respectively. The processes considered are associated (W/Z)H production, where W → eν/μν, Z → ee/μμ and Z → νν. The observed (expected) deviation from the background-only hypothesis corresponds to a significance of 1.4 (2.6) standard deviations, and the ratio of the measured signal yield to the Standard Model expectation is found to be μ = 0.52 ± 0.32 (stat.) ± 0.24 (syst.) for a Higgs boson mass of 125.36 GeV. The analysis procedure is validated by a measurement of the yield of (W/Z)Z production with Z → bb̄ in the same final states as for the Higgs boson search, from which the ratio of the observed signal yield to the Standard Model expectation is found to be 0.74 ± 0.09 (stat.) ± 0.14 (syst.).
Abstract:
Research on stereotactic apparatus to guide surgical devices began in 1908, yet a major part of today's stereotactic neurosurgeries still rely on stereotactic frames developed almost half a century ago. Robots excel at handling spatial information and are thus obvious candidates for guiding instrumentation along precisely planned trajectories. In this review, we introduce the concept of stereotaxy and describe a standard stereotactic neurosurgery. Neurosurgeons' expectations and demands regarding the role of robots as assistive tools are also addressed. We list the most successful robotic systems developed specifically for, or capable of executing, stereotactic neurosurgery. A critical review is presented for each robotic system, emphasizing the differences between them and detailing positive features and drawbacks. An analysis of the listed robotic systems' features is also undertaken, in the context of robotic application in stereotactic neurosurgery. Finally, we discuss the current perspective and future directions of robotic technology in this field. All robotic systems follow a very similar and structured workflow despite the technical differences that set them apart. No system unequivocally stands out as an absolute best. The trend of technological progress is pointing toward the development of miniaturized, cost-effective solutions with more intuitive interfaces.