961 results for "pacemaker experiment"
Abstract:
Vitamin A deficiency is a widespread public health problem in Sub-Saharan Africa. This paper analyzes the impact of a food-based intervention to fight vitamin A deficiency using orange-fleshed sweet potato (OFSP). We conducted a randomized evaluation of OFSP-related training for female farmers in Mozambique, in which the treatment group was taught basic concepts of nutrition as well as OFSP planting and cooking skills. We found encouraging evidence of changes in behavior and attitudes towards OFSP consumption and planting, and considerable increases in nutrition-related knowledge, as well as in knowledge of cooking and planting OFSP.
Abstract:
Do information flows matter for remittance behavior? We design and implement a randomized controlled trial to quantitatively assess the role of communication between migrants and their contacts abroad on the extent and value of remittance flows. In the experiment, a random sample of 1,500 migrants residing in Ireland was offered the possibility of contacting their networks outside the host country for free over a varying number of months. We find a sizable, positive impact of our intervention on the value of migrant remittances sent. Our results rule out the possibility that the remittance effect we identify is a simple substitution effect. Instead, our analysis points to this effect being a likely result of improved information via factors such as better migrant control over remittance use, enhanced trust in remittance channels due to experience sharing, or increased social pressure on migrants from remittance recipients.
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's degree in Economics from the NOVA – School of Business and Economics
Abstract:
INTRODUCTION: The septal position is an alternative site for cardiac pacing (CP) that is potentially less harmful to cardiac function. METHODS: Patients with Chagas disease without heart failure who underwent permanent pacemaker (PP) implantation at the Clinics Hospital of the Triângulo Mineiro Federal University (UFTM) were selected from February 2009 to February 2010. The parameters analyzed were ventricular remodeling, the degree of electromechanical dyssynchrony (DEM), exercise time and VO2 max during exercise testing (ET), and functional class (NYHA). Echocardiography was performed 24 to 48h following implantation and after one year of follow-up. The patients underwent ET one month post-procedure and at the end of one year. RESULTS: Thirty patients were included. Mean patient age was 59±13 years. The indication for PP implantation was complete atrioventricular (AV) block in 22 (73.3%) patients and 2nd-degree AV block in the other eight (26.7%). All patients were in NYHA class I and no changes occurred in the ET parameters. No variations were detected in echocardiographic remodeling measurements. Intraventricular dyssynchrony was observed in 46.6% of cases and interventricular dyssynchrony in 33.3% of patients after one year. CONCLUSIONS: The findings of this work suggest that there is no significant morphological or functional cardiac change following pacemaker implantation in the septal position in chagasic patients with normal left ventricular function after one year of follow-up. Thus, patients may remain asymptomatic, with maintained functional capacity and no left ventricular remodeling.
Abstract:
In this research we conducted a mixed-methods study, using qualitative and quantitative analysis, to examine the relationship between mobile advertising and mobile app user acquisition, and the conclusions companies can derive from it. Data were gathered from the management of mobile advertising campaigns for a portfolio of three different mobile apps. We found that a number of implications can be extracted from this intersection, namely for product development, internationalisation, and the management of marketing budgets. We propose further research on alternative app user sources, the impact of revenue on apps, and the exploitation of two product segments: wearable technology and the Internet of Things.
Abstract:
We are living in the era of Big Data, a time characterized by the continuous creation of vast amounts of data, originating from different sources and in different formats. First with the rise of social networks and, more recently, with the advent of the Internet of Things (IoT), in which everyone and (eventually) everything is linked to the Internet, data with enormous potential for organizations is being continuously generated. In order to be more competitive, organizations want to access and explore all the richness present in those data. Indeed, Big Data is only as valuable as the insights organizations gather from it to make better decisions, which is the main goal of Business Intelligence. In this paper we describe an experiment in which data obtained from a NoSQL data source (a database technology developed explicitly to deal with the specificities of Big Data) is used to feed a Business Intelligence solution.
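The abstract above describes feeding a Business Intelligence solution from a NoSQL source. A common step in such a pipeline is flattening nested, schema-flexible documents into the flat, columnar rows a BI tool expects. The sketch below is a minimal illustration of that idea, not the paper's implementation; the document shape and field names are hypothetical.

```python
# Minimal sketch: flattening JSON-style documents from a NoSQL source
# into flat rows suitable for loading into a BI / data-warehouse table.
# The documents below stand in for records fetched from a NoSQL database.
import csv
import io

documents = [
    {"user": "a1", "event": "click", "meta": {"device": "mobile", "os": "android"}},
    {"user": "b2", "event": "view", "meta": {"device": "desktop"}},
]

def flatten(doc, prefix=""):
    """Recursively flatten nested documents into dot-separated columns."""
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, name + "."))
        else:
            row[name] = value
    return row

rows = [flatten(d) for d in documents]
columns = sorted({c for r in rows for c in r})  # union of all columns seen

# Emit CSV; documents missing a column get an empty cell (DictWriter restval).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=columns)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue().strip())
```

Because NoSQL collections have no fixed schema, the column set is taken as the union over all documents, with missing fields left empty; a real pipeline would add type handling and incremental loading.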
Abstract:
A search for a charged Higgs boson, H±, decaying to a W± boson and a Z boson is presented. The search is based on 20.3 fb−1 of proton–proton collision data at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the LHC. The H± boson is assumed to be produced via vector-boson fusion and the decays W± → qq̄′ and Z → e+e−/μ+μ− are considered. The search is performed in a range of charged Higgs boson masses from 200 to 1000 GeV. No evidence for the production of an H± boson is observed. Upper limits of 31–1020 fb at 95% CL are placed on the cross section for vector-boson fusion production of an H± boson times its branching fraction to W±Z. The limits are compared with predictions from the Georgi-Machacek Higgs Triplet Model.
Abstract:
A search for the decay to a pair of new particles of either the 125 GeV Higgs boson (h) or a second CP-even Higgs boson (H) is presented. The dataset corresponds to an integrated luminosity of 20.3 fb−1 of pp collisions at √s = 8 TeV recorded by the ATLAS experiment at the LHC in 2012. The search was done in the context of the next-to-minimal supersymmetric standard model, in which the new particles are the lightest neutral pseudoscalar Higgs bosons (a). One of the two a bosons is required to decay to two muons while the other is required to decay to two τ-leptons. No significant excess is observed above the expected backgrounds in the dimuon invariant mass range from 3.7 GeV to 50 GeV. Upper limits are placed on the production of h→aa relative to the Standard Model gg→h production, assuming no coupling of the a boson to quarks. The most stringent limit is placed at 3.5% for ma = 3.75 GeV. Upper limits are also placed on the production cross section of H→aa from 2.33 pb to 0.72 pb, for fixed ma = 5 GeV with mH ranging from 100 GeV to 500 GeV.
Abstract:
This paper describes the trigger and offline reconstruction, identification, and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton–proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
Biofilm research is growing more diverse and dependent on high-throughput technologies, and the large-scale production of results makes data substantiation harder. In particular, it is often the case that experimental protocols are adapted to meet the needs of a particular laboratory and no statistical validation of the modified method is provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, which simulate minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is found to be critical to ensure data quality and enable the comparison of results within and among laboratories.
Abstract:
The normalized differential cross section for top-quark pair production in association with at least one jet is studied as a function of the inverse of the invariant mass of the tt̄+1-jet system. This distribution can be used for a precise determination of the top-quark mass since gluon radiation depends on the mass of the quarks. The experimental analysis is based on proton–proton collision data collected by the ATLAS detector at the LHC with a centre-of-mass energy of 7 TeV corresponding to an integrated luminosity of 4.6 fb−1. The selected events were identified using the lepton+jets top-quark-pair decay channel, where lepton refers to either an electron or a muon. The observed distribution is compared to a theoretical prediction at next-to-leading-order accuracy in quantum chromodynamics using the pole-mass scheme. With this method, the measured value of the top-quark pole mass, m_t^pole, is: m_t^pole = 173.7 ± 1.5 (stat.) ± 1.4 (syst.) +1.0/−0.5 (theory) GeV. This result represents the most precise measurement of the top-quark pole mass to date.
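The abstract above quotes statistical, systematic, and (asymmetric) theory uncertainty components separately. As a worked arithmetic example, one common convention is to combine independent components in quadrature, treating the upper and lower theory sides separately; this is an illustration of that convention, not necessarily how the paper itself quotes a total uncertainty.

```python
# Combining the quoted uncertainty components of the top-quark pole mass
# in quadrature (an illustrative convention, not taken from the paper).
import math

stat, syst = 1.5, 1.4               # GeV: statistical, systematic
theory_up, theory_down = 1.0, 0.5   # GeV: asymmetric theory uncertainty

total_up = math.sqrt(stat**2 + syst**2 + theory_up**2)
total_down = math.sqrt(stat**2 + syst**2 + theory_down**2)

print(f"m_t(pole) = 173.7 +{total_up:.1f} / -{total_down:.1f} GeV")
```

Quadrature addition assumes the components are independent; correlated uncertainties would require a covariance treatment instead.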
Abstract:
A summary of the constraints from the ATLAS experiment on R-parity-conserving supersymmetry is presented. Results from 22 separate ATLAS searches are considered, each based on analysis of up to 20.3 fb−1 of proton-proton collision data at centre-of-mass energies of √s = 7 and 8 TeV at the Large Hadron Collider. The results are interpreted in the context of the 19-parameter phenomenological minimal supersymmetric standard model, in which the lightest supersymmetric particle is a neutralino, taking into account constraints from previous precision electroweak and flavour measurements as well as from dark matter related measurements. The results are presented in terms of constraints on supersymmetric particle masses and are compared to limits from simplified models. The impact of ATLAS searches on parameters such as the dark matter relic density, the couplings of the observed Higgs boson, and the degree of electroweak fine-tuning is also shown. Spectra for surviving supersymmetry model points with low fine-tuning are presented.
Abstract:
A summary is presented of ATLAS searches for gluinos and first- and second-generation squarks in final states containing jets and missing transverse momentum, with or without leptons or b-jets, in the √s = 8 TeV data set collected at the Large Hadron Collider in 2012. This paper reports the results of new interpretations and statistical combinations of previously published analyses, as well as a new analysis. Since no significant excess of events over the Standard Model expectation is observed, the data are used to set limits in a variety of models. In all the considered simplified models that assume R-parity conservation, the limit on the gluino mass exceeds 1150 GeV at 95% confidence level, for an LSP mass smaller than 100 GeV. Furthermore, exclusion limits are set for left-handed squarks in a phenomenological MSSM model, a minimal Supergravity/Constrained MSSM model, R-parity-violation scenarios, a minimal gauge-mediated supersymmetry breaking model, a natural gauge mediation model, a non-universal Higgs mass model with gaugino mediation and a minimal model of universal extra dimensions.
Abstract:
Many extensions of the Standard Model predict the existence of charged heavy long-lived particles, such as R-hadrons or charginos. These particles, if produced at the Large Hadron Collider, should be moving non-relativistically and are therefore identifiable through the measurement of an anomalously large specific energy loss in the ATLAS pixel detector. Measuring heavy long-lived particles through their track parameters in the vicinity of the interaction vertex provides sensitivity to metastable particles with lifetimes from 0.6 ns to 30 ns. A search for such particles with the ATLAS detector at the Large Hadron Collider is presented, based on a data sample corresponding to an integrated luminosity of 18.4 fb−1 of pp collisions at √s = 8 TeV. No significant deviation from the Standard Model background expectation is observed, and lifetime-dependent upper limits on R-hadron and chargino production are set. Gluino R-hadrons with 10 ns lifetime and masses up to 1185 GeV are excluded at 95% confidence level, and so are charginos with 15 ns lifetime and masses up to 482 GeV.
Abstract:
PURPOSE: To determine the indication for and incidence and evolution of temporary and permanent pacemaker implantation in cardiac transplant recipients. METHODS: A retrospective review of 114 patients who underwent orthotopic heart transplantation at InCor (Heart Institute, USP, Brazil) between March 1985 and May 1993. We studied the incidence of and indication for temporary pacing, the relationship between pacing and rejection, the need for permanent pacing, and the clinical follow-up. RESULTS: Fourteen of 114 (12%) heart transplant recipients required temporary pacing and 4 of 114 (3.5%) patients required permanent pacing. The indication for temporary pacing was sinus node dysfunction in 11 patients (78.5%) and atrioventricular (AV) block in 3 patients (21.4%). The indication for permanent pacemaker implantation was sinus node dysfunction in 3 patients (75%) and atrioventricular (AV) block in 1 patient (25%). We observed rejection in 3 patients (21.4%) who required temporary pacing and in 2 patients (50%) who required permanent pacing. Previous use of amiodarone was observed in 10 patients (71.4%) with temporary pacing. Seven of the 14 patients (50%) died during follow-up. CONCLUSION: Sinus node dysfunction was the principal indication for temporary and permanent pacemaker implantation in cardiac transplant recipients. The need for pacing was associated with a worse prognosis after cardiac transplantation.