928 results for Non-perturbative methods


Relevance: 80.00%

Abstract:

In this thesis we further develop the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for $k \rightarrow \infty$ and the standard effective action for $k \rightarrow 0$. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting in which to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations: the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way to iteratively solve the flow equation if the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NL$\sigma$M) and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge-invariant way. We propose a new truncation scheme in which the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis, which makes it possible to project the flow of all terms in the EAA that are analytic in the fields. As an application we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow becomes paramount. In the EAA framework this problem is dealt with through the use of the background field formalism, at the cost of enlarging the theory space in which the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space where the running of the cosmological constant and of Newton's constant is influenced by fluctuation couplings.
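For reference, the exact flow equation mentioned above, which the EAA satisfies, is the Wetterich equation; writing $t = \ln k$, $R_k$ for the infrared cutoff kernel and $\Gamma_k^{(2)}$ for the second functional derivative of the EAA, it reads

$\partial_t \Gamma_k = \frac{1}{2}\,\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)} + R_k\right)^{-1}\partial_t R_k\right].$

Expanding both sides in powers of the fields yields the hierarchy of coupled flow equations for the proper vertices referred to in the abstract.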

Relevance: 80.00%

Abstract:

The dominant process in hard proton-proton collisions is the production of hadronic jets. These sprays of particles are produced by colored partons, which are struck out of their confinement within the proton. Previous measurements of inclusive jet cross sections have provided valuable information for the determination of parton density functions and allow for stringent tests of perturbative QCD at the highest accessible energies.

This thesis presents a measurement of inclusive jet cross sections in proton-proton collisions using the ATLAS detector at the LHC at a center-of-mass energy of 7 TeV. Jets are identified using the anti-$k_t$ algorithm with jet radii of R=0.6 and R=0.4. They are calibrated using a dedicated $p_T$- and $\eta$-dependent jet calibration scheme. The cross sections are measured for 40 GeV < $p_T$ <= 1 TeV and |y| < 2.8 in four bins of absolute rapidity, using data recorded in 2010 corresponding to an integrated luminosity of 3 pb^-1. The data are fully corrected for detector effects and compared to theoretical predictions calculated at next-to-leading order including non-perturbative effects. The theoretical predictions are found to agree with data within the experimental and theoretical uncertainties.

The ratio of cross sections for R=0.4 and R=0.6 is measured, exploiting the significant correlations of the systematic uncertainties, and is compared to recently developed theoretical predictions. The underlying event can be characterized by the amount of transverse momentum per unit rapidity and azimuth, called $\rho_{UE}$. Using analytical approaches to the calculation of non-perturbative corrections to jets, $\rho_{UE}$ at the LHC is estimated from the ratio measurement. A feasibility study of a combined measurement of $\rho_{UE}$ and the average strong coupling in the non-perturbative regime, $\alpha_0$, is presented, and proposals for future jet measurements at the LHC are made.
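As a rough guide to why the ratio of cross sections at two jet radii is sensitive to $\rho_{UE}$: in analytical approaches of the kind referred to above, the non-perturbative shift of the jet transverse momentum receives a hadronisation contribution that scales like $-1/R$ and an underlying-event contribution that scales with the jet catchment area, roughly $\delta p_T^{UE} \approx \rho_{UE}\,\pi R^2$ (the precise coefficients depend on the jet algorithm and on the calculation). The opposite $R$-dependence of the two effects is what allows the ratio measurement at R=0.4 and R=0.6 to constrain $\rho_{UE}$.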

Relevance: 80.00%

Abstract:

In the last decade the near-surface mounted (NSM) strengthening technique using carbon fibre reinforced polymers (CFRP) has been increasingly used to improve the load carrying capacity of concrete members. Compared to externally bonded reinforcement (EBR), the NSM system presents considerable advantages. The technique consists of inserting carbon fibre reinforced polymer laminate strips into pre-cut slits opened in the concrete cover of the elements to be strengthened. The CFRP reinforcement is bonded to the concrete with an appropriate groove filler, typically an epoxy adhesive or a cement grout. Up to now, research efforts have mainly focused on structural aspects such as bond behaviour, flexural and/or shear strengthening effectiveness, and the energy dissipation capacity of beam-column joints. In such research works, as well as in field applications, the most widespread adhesives used to bond reinforcements to concrete are epoxy resins. It is largely accepted that the performance of the whole NSM application strongly depends on the mechanical properties of the epoxy resins, for which proper curing conditions must be assured. Therefore, non-destructive methods that allow the curing process of epoxy resins in NSM CFRP systems to be monitored are desirable, with a view to obtaining continuous information on the effectiveness of curing and the expected bond behaviour of CFRP/adhesive/concrete systems. The experimental research was developed at the Laboratory of the Structural Division of the Civil Engineering Department of the University of Minho in Guimarães, Portugal (LEST). The main objective was to develop and propose a new method for continuous quality control of the curing of epoxy resins applied in NSM CFRP strengthening systems. This objective is pursued through the adaptation of an existing technique, termed EMM-ARM (Elasticity Modulus Monitoring through Ambient Response Method), which was developed for monitoring the early stiffness evolution of cement-based materials. The experimental program was composed of two parts: (i) direct pull-out tests on concrete specimens strengthened with NSM CFRP laminate strips, conducted to assess the evolution of the bond behaviour between CFRP and concrete from early ages; and (ii) EMM-ARM tests, carried out to monitor the progressive stiffness development of the structural adhesive used in CFRP applications. In order to verify the capability of the proposed method for evaluating the elastic modulus of the epoxy, the static E-modulus was determined through tension tests. The results of the two series of tests were then combined and compared to evaluate the possibility of implementing a new method for the continuous monitoring and quality control of NSM CFRP applications.

Relevance: 80.00%

Abstract:

Quantum Chromodynamics (QCD) is the theory of strong interactions, one of the four fundamental forces in our Universe. It describes the interaction of gluons and quarks, which build up hadrons such as protons and neutrons. Most of the visible matter in our universe is made of protons and neutrons; hence, we are interested in their fundamental properties, such as their masses, their charge distribution and their shape.

The only known theoretical, non-perturbative and ab initio method to investigate hadron properties at low energies is lattice Quantum Chromodynamics (lattice QCD). However, up-to-date simulations (especially of baryonic quantities) do not achieve the accuracy of experiments. In fact, current simulations do not even reproduce the experimental values of the form factors. The question arises whether these deviations can be explained by systematic effects in lattice QCD simulations.

This thesis is about the computation of nucleon form factors and other hadronic quantities from lattice QCD. So-called Wilson fermions are used, and the u- and d-quarks are treated fully dynamically. The simulations were performed using gauge ensembles spanning a range of lattice spacings, volumes and pion masses. First of all, the lattice spacing was set in order to make contact between the lattice results and their experimental counterparts and to allow a continuum extrapolation. The light quark mass has been computed and found to be $m_{ud}^{\overline{\text{MS}}}(2\text{ GeV}) = 3.03(17)(38)\text{ MeV}$. This value is in good agreement with values from experiments and other lattice determinations. Electromagnetic and axial form factors of the nucleon have been calculated, and from these form factors the nucleon radii and the coupling constants were computed. The different ensembles enabled us to investigate systematically the dependence of these quantities on the volume, the lattice spacing and the pion mass. Finally, we perform a continuum extrapolation and chiral extrapolations to the physical point. In addition, we investigated so-called excited-state contributions to these observables. The summation method was used, which reduces these effects significantly, and a much better agreement with experimental data was achieved. On the lattice, the Dirac radius and the axial charge are usually found to be much smaller than the experimental values. However, owing to the careful investigation of all the aforementioned systematic effects, we obtain $\langle r_1^2\rangle_{u-d}=0.627(54)\text{ fm}^2$ and $g_A=1.218(92)$, in agreement with the experimental values within the errors.

The first three chapters introduce the theoretical background of nucleon form factors and of lattice QCD in general. In chapter four the lattice spacing is determined. The computation of nucleon form factors is described in chapter five, where the systematic effects are investigated. All results are presented in chapter six. The thesis ends with a summary of the results and identifies options to complement and extend the calculations presented.
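For orientation, the quoted Dirac radius and axial charge follow the standard definitions in terms of the nucleon form factors, namely the slope of the Dirac form factor $F_1$ at vanishing momentum transfer and the axial form factor at $Q^2 = 0$:

$\langle r_1^2\rangle = -6\,\frac{\mathrm{d}F_1(Q^2)}{\mathrm{d}Q^2}\bigg|_{Q^2=0}, \qquad g_A = G_A(0).$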

Relevance: 80.00%

Abstract:

The availability of a high-intensity antiproton beam with momentum up to 15 GeV/c at the future FAIR facility will open a unique opportunity to investigate wide areas of nuclear physics with the $\overline{\text{P}}$ANDA (anti$\overline{\text{P}}$roton ANnihilations at DArmstadt) detector. Part of these investigations concern the Electromagnetic Form Factors of the proton in the time-like region and the study of the Transition Distribution Amplitudes, for which feasibility studies have been performed in this thesis.

Moreover, simulations to study the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter of $\overline{\text{P}}$ANDA are presented. This detector is crucial especially for the reconstruction of processes like $\bar{p}p \rightarrow e^+ e^- \pi^0$, investigated in this work. Different arrangements of dead material were studied. The results show that both the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter fulfil the requirements for the detection of backward particles, and that this detector is necessary for the reconstruction of the channels of interest.

The study of the annihilation channel $\bar{p}p \rightarrow e^+ e^-$ will improve the knowledge of the Electromagnetic Form Factors in the time-like region and will help to understand their connection with the Electromagnetic Form Factors in the space-like region. In this thesis the feasibility of a measurement of the $\bar{p}p \rightarrow e^+ e^-$ cross section with $\overline{\text{P}}$ANDA is studied using Monte-Carlo simulations. The major background channel, $\bar{p}p \rightarrow \pi^+ \pi^-$, is taken into account. The results show a $10^9$ background suppression factor, which assures a sufficiently clean signal with less than 0.1% background contamination. The signal can be measured with an efficiency greater than 30% up to $s = 14\,(\text{GeV}/c)^2$. The Electromagnetic Form Factors are extracted from the reconstructed signal and the corrected angular distribution. Above this limit in $s$, the low cross section will not allow a direct extraction of the Electromagnetic Form Factors. However, the total cross section can still be measured, and an extraction of the Electromagnetic Form Factors is possible under certain assumptions on the ratio between the electric and magnetic contributions.

The Transition Distribution Amplitudes are new non-perturbative objects describing the transition between a baryon and a meson. They are accessible in hard exclusive processes like $\bar{p}p \rightarrow e^+ e^- \pi^0$. The study of this process with $\overline{\text{P}}$ANDA will test the Transition Distribution Amplitude approach. This work includes a feasibility study for measuring this channel with $\overline{\text{P}}$ANDA. The main background reaction here is $\bar{p}p \rightarrow \pi^+ \pi^- \pi^0$. A background suppression factor of $10^8$ has been achieved while keeping the signal efficiency above 20%.

Part of this work has been published in the European Physical Journal A 44, 373-384 (2010).
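For context, the extraction of the Electromagnetic Form Factors from the corrected angular distribution of $\bar{p}p \rightarrow e^+ e^-$ relies on the standard one-photon-exchange expression (quoted here for orientation; normalisation conventions vary between references), with $\tau = s/4m_p^2$ and $\theta$ the electron polar angle in the centre-of-mass frame:

$\frac{\mathrm{d}\sigma}{\mathrm{d}\cos\theta} \;\propto\; |G_M(s)|^2\,(1+\cos^2\theta) + \frac{1}{\tau}\,|G_E(s)|^2\,\sin^2\theta .$

At large $s$, where only the total cross section is accessible, an assumption on the ratio $|G_E|/|G_M|$ is therefore needed to quote individual form factors.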

Relevance: 80.00%

Abstract:

Cardiotocography (CTG) is a widespread foetal diagnostic method. However, it lacks objectivity and reproducibility because of its dependence on the observer's expertise. To overcome these limitations, more objective methods for CTG interpretation have been proposed. In particular, many of the developed techniques aim to assess the foetal heart rate variability (FHRV). Among them, some methodologies from nonlinear systems theory have been applied to the study of FHRV. All of these techniques have proved to be helpful in specific cases; nevertheless, none of them is more reliable than the others, and an in-depth study is therefore necessary. The aim of this work is to deepen the FHRV analysis through Symbolic Dynamics Analysis (SDA), a nonlinear technique already successfully employed for HRV analysis. Thanks to its simplicity of interpretation, it could be a useful tool for clinicians. We performed a literature study involving about 200 references on HRV and FHRV analysis; approximately 100 works were focused on non-linear techniques. Then, in order to compare linear and non-linear methods, we carried out a multiparametric study in which 580 antepartum recordings of healthy fetuses were examined. The signals were processed using an updated software package for CTG analysis and newly developed software for generating simulated CTG traces. Finally, statistical tests and regression analyses were carried out to estimate the relationships among the extracted indexes and other clinical information. The results confirm that none of the employed techniques is more reliable than the others. Moreover, in agreement with the literature, each analysis should take into account two relevant parameters: the foetal status and the week of gestation. Regarding SDA, the results show its promising capabilities in FHRV analysis. It allows the foetal status, the gestation week and the global variability of FHR signals to be recognized, even better than other methods. Nevertheless, further studies, which should also involve pathological cases, are necessary to establish its reliability.
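To make the idea behind Symbolic Dynamics Analysis concrete, the following minimal Python sketch (an illustration only, not the software used in the study; the symbol boundaries, word length and input series are placeholder assumptions) quantises an FHR series into a small alphabet, builds short overlapping words and computes the Shannon entropy of the word distribution as a simple complexity index:

```python
import numpy as np
from collections import Counter

def symbolic_dynamics_entropy(fhr, n_symbols=4, word_len=3):
    """Quantise an FHR series into symbols, form overlapping words of
    length `word_len`, and return the Shannon entropy of the word
    distribution (a simple SDA-style complexity index)."""
    fhr = np.asarray(fhr, dtype=float)
    # Equally spaced symbol boundaries between the series minimum and maximum.
    edges = np.linspace(fhr.min(), fhr.max(), n_symbols + 1)[1:-1]
    symbols = np.digitize(fhr, edges)           # values in 0 .. n_symbols-1
    words = [tuple(symbols[i:i + word_len])     # overlapping words
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))              # Shannon entropy in bits

# Example with a synthetic baseline around 140 bpm plus slow random variability.
rng = np.random.default_rng(0)
fhr = 140 + np.cumsum(rng.normal(0, 0.5, size=2400))
print(symbolic_dynamics_entropy(fhr))
```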

Relevance: 80.00%

Abstract:

The asymptotic safety scenario makes it possible to define a consistent theory of quantized gravity within the framework of quantum field theory. The central conjecture of this scenario is the existence of a non-Gaussian fixed point of the theory's renormalization group flow, which allows renormalization conditions to be formulated that render the theory fully predictive. Investigations of this possibility use an exact functional renormalization group equation as their primary non-perturbative tool. This equation implements Wilsonian renormalization group transformations and is demonstrated to represent a reformulation of the functional integral approach to quantum field theory.

As its main result, this thesis develops an algebraic algorithm which allows the renormalization group flow of gauge theories, as well as of gravity, to be constructed systematically in arbitrary expansion schemes. In particular, it uses off-diagonal heat kernel techniques to efficiently handle the non-minimal differential operators which appear due to gauge symmetries. The central virtue of the algorithm is that no additional simplifications need to be employed, opening the possibility for more systematic investigations of the emergence of non-perturbative phenomena. As a by-product, several novel results on the heat kernel expansion of the Laplace operator acting on general gauge bundles are obtained.

The constructed algorithm is used to re-derive the renormalization group flow of gravity in the Einstein-Hilbert truncation, showing the manifest background independence of the results. The well-studied Einstein-Hilbert case is advanced further by taking the effect of a running ghost field renormalization on the gravitational coupling constants into account. A detailed numerical analysis reveals a further stabilization of the non-Gaussian fixed point found.

Finally, the proposed algorithm is applied to the case of higher-derivative gravity including all curvature-squared interactions. This improves on existing computations by taking the independent running of the Euler topological term into account. Known perturbative results are reproduced in this case from the renormalization group equation, which, however, also identifies a unique non-Gaussian fixed point.
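As a point of reference for the heat kernel techniques mentioned above, the early-time expansion of the traced heat kernel of the Laplacian $\Delta = -\nabla^2$ acting on scalars on a $d$-dimensional curved manifold has the well-known form

$\mathrm{Tr}\, e^{-s\Delta} \simeq \frac{1}{(4\pi s)^{d/2}} \int \mathrm{d}^d x \sqrt{g}\,\left(1 + \frac{s}{6}\,R + \mathcal{O}(s^2)\right),$

and the off-diagonal and gauge-bundle generalisations developed in the thesis extend this type of expansion to the non-minimal operators appearing in gauge theories and gravity.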

Relevance: 80.00%

Abstract:

Quantum biology (QB) is an emerging research field that seeks to address non-trivial quantum phenomena within biological contexts, drawing on experimental data, theoretical explorations and numerical techniques. Biological systems are by definition open, warm, wet and noisy systems, and these conditions are essential to them; one would therefore expect such systems to undergo rapid decoherence that suppresses any controlled quantum dynamics. QB, through the principles of noise-assisted transport and of the phonon antenna, maintains that the presence of an adequate level of environmental noise increases the efficiency of a transport network; moreover, if specific persistent vibrational modes are present in the environmental spectrum, resonance effects arise that regenerate quantum coherence. The environment-system interaction is non-Markovian, non-perturbative and strongly out of equilibrium, and the noise is not treated as traditional white noise. The first numerical technique to predict the regeneration of coherence within these protein networks was TEBD (Time Evolving Block Decimation), a numerical scheme that makes it possible to simulate one-dimensional many-body systems characterized by nearest-neighbour interactions and low entanglement. By means of the Orthopol numerical algorithms, the spin-boson Hamiltonian is projected onto a discrete one-dimensional chain, taking into account the environment-system interaction effects contained in the spectrum (which determines the dynamics of the system). Finally, the evolution of the state is computed.
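For orientation, the spin-boson Hamiltonian referred to above and its chain representation have the generic structure below (the symbols are generic placeholders: $\varepsilon$ and $\Delta$ are the bias and tunnelling of the two-level system, the $a_k$ are the environmental modes, and the chain frequencies $\omega_n$ and hoppings $t_n$ follow from the recurrence coefficients of the polynomials orthogonal with respect to the spectral density):

$H = \frac{\varepsilon}{2}\sigma_z + \frac{\Delta}{2}\sigma_x + \sigma_z\sum_k g_k\,(a_k + a_k^{\dagger}) + \sum_k \omega_k\, a_k^{\dagger}a_k \;\longrightarrow\; H_{\text{chain}} = H_{\text{sys}} + c_0\,\sigma_z\,(b_0 + b_0^{\dagger}) + \sum_n \left[\omega_n\, b_n^{\dagger}b_n + t_n\left(b_n^{\dagger}b_{n+1} + \text{h.c.}\right)\right].$

The resulting one-dimensional chain with nearest-neighbour couplings is precisely the structure that TEBD can evolve efficiently.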

Relevance: 80.00%

Abstract:

Most quark actions in lattice QCD encounter difficulties with chiral symmetry and its spontaneous breakdown. Minimally doubled fermions (MDF) are a category of strictly local chiral lattice fermions whose continuum limit reproduces two degenerate quark flavours. The two poles of their Dirac operator are aligned such that symmetries under charge conjugation or reflection of one particular direction are explicitly broken at finite lattice spacing. Properties of MDF are scrutinised with regard to the broken symmetries and the meson spectrum in order to discern their suitability for numerical studies of QCD.

Interactions induce anisotropic operator mixing for MDF. Hence, restoration of the broken symmetries in the continuum limit requires three counterterms, one of which is power-law divergent. Counterterms and operator mixing are studied perturbatively for two variants of MDF. Two independent non-perturbative procedures for removal of the power-law divergence are developed by means of a numerical study of hadronic observables for one variant of MDF in the quenched approximation. Though three out of four pseudoscalar mesons are affected by lattice artefacts, the continuum limit of the spectrum is consistent with two-flavour QCD. Thus, the suitability of MDF for numerical studies of QCD in the quenched approximation is demonstrated.

Relevance: 80.00%

Abstract:

We give a brief review of the functional renormalization method in quantum field theory, which is intrinsically non-perturbative, in terms of both the Polchinski equation for the Wilsonian action and the Wetterich equation for the generator of the proper vertices. For the latter case we show a simple application to a theory with one real scalar field within the LPA and LPA' approximations. For the former case, instead, we give a covariant "Hamiltonian" version of the Polchinski equation, which consists in performing a Legendre transform of the flow for the corresponding effective Lagrangian, replacing arbitrarily high-order derivatives of the fields with momentum fields. This approach is suitable for studying new truncations in the derivative expansion. We apply this formulation to a theory with one real scalar field and, as a novel result, derive the flow equations for a theory with N real scalar fields with O(N) internal symmetry. Within this new approach we analyze numerically the scaling solutions for N=1 in d=3 (the critical Ising model), at leading order in the derivative expansion with an infinite number of couplings, encoded in two functions $V(\phi)$ and $Z(\phi)$, obtaining an estimate of the quantum anomalous dimension with 10% accuracy (compared with Monte Carlo results).
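As a concrete illustration of the LPA mentioned above: with the commonly used optimised (Litim) regulator $R_k(q) = (k^2 - q^2)\,\theta(k^2 - q^2)$, the Wetterich equation for a single real scalar field closes on the effective potential and reads

$\partial_t V_k(\phi) = c_d\,\frac{k^{d+2}}{k^2 + V_k''(\phi)}, \qquad c_d = \frac{1}{(4\pi)^{d/2}\,\Gamma(d/2+1)}, \qquad t = \ln k .$

The LPA' and the leading-order derivative expansion additionally keep a wave-function renormalisation, which is neglected in this simplest closure.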

Relevance: 80.00%

Abstract:

For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including the ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain patterns and the traces, were created. For the determination of the areas of origin of the bloodstain pattern, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software; the ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains and the high accuracy of the bloodstain analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, relevant forensic conclusions regarding the course of events were enabled by the ballistic bloodstain pattern analysis.
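As a minimal illustration of the trajectory reconstruction described above (not the CAD/ballistics software used in the case work; the stain dimensions and distances are invented for the example), the impact angle of an elliptical stain follows from its width-to-length ratio, and a straight-line back-projection already gives a first, purely geometric estimate of the height of the area of origin, which the ballistic treatment then refines by accounting for gravity and drag:

```python
import math

def impact_angle(width_mm, length_mm):
    """Impact angle alpha (radians) from the stain ellipse: sin(alpha) = width / length."""
    return math.asin(width_mm / length_mm)

def origin_height_straight_line(horizontal_distance_m, width_mm, length_mm):
    """Straight-line back-projection of a single floor stain: estimated height of
    the blood source above the point of impact, at a given horizontal distance
    along the stain's directionality (gravity and drag neglected)."""
    return horizontal_distance_m * math.tan(impact_angle(width_mm, length_mm))

# Example: a 9.0 mm x 4.5 mm floor stain, with the presumed source
# located 1.2 m away horizontally along the stain's long axis.
print(math.degrees(impact_angle(4.5, 9.0)))                # 30 degrees
print(origin_height_straight_line(1.2, 4.5, 9.0))          # about 0.69 m
```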

Relevance: 80.00%

Abstract:

Microvascular surgery has become an important method for reconstructing surgical defects due to trauma, tumors or burns. The most important factor for successful free flap transfer is a well-executed anastomosis; the time needed to perform the anastomosis and the failure rate are not negligible despite high levels of operator experience. Throughout history, many alternatives have been tried to assist the microsurgeon and to reduce complications. A Medline literature search was performed to find articles dealing with non-suture methods of microvascular anastomosis; many historical books were also included. The non-suture techniques can be divided into four groups based on the mechanism employed: double intubation, including tubes and stents; intubation-eversion, including simple rings; double eversion, including staples and double rings; and wall adjustment with adhesives or laser. All of these techniques were able to produce a faster and easier microvascular anastomosis. Nevertheless, disadvantages of the sutureless techniques include toxicity, high cost, leakage and aneurysm formation, and more refinement is needed before their widespread adoption. Laser-assisted microvascular anastomosis using a 1.9 μm diode laser, however, appears to be a safe and reliable aid for the microsurgeon and may be further developed in the near future.

Relevance: 80.00%

Abstract:

Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that used averaged predictions from a set of 10 pre-selected input spaces chosen by the training data, and a "Minimum Variance Committee" technique in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. The latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (the Best Combination technique), the Simple Committee technique and the Minimum Variance Committee technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or in the calibration of the underlying GT-Power model.
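A minimal sketch of the "Simple Committee" idea described above (illustrative only; the actual GT-Power-derived transformations, the pre-selection of the 10 input spaces and the engine data are not reproduced here, and the transformations below are stand-ins) averages k-Nearest Neighbor predictions made in several transformed input spaces:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def simple_committee_predict(train_spaces, y_train, test_spaces, n_neighbors=5):
    """Average k-NN predictions over a committee of transformed input spaces.
    `train_spaces` and `test_spaces` are lists holding, for each (hypothetical)
    transformation, the training and test design matrices in that space."""
    predictions = []
    for X_train, X_test in zip(train_spaces, test_spaces):
        model = KNeighborsRegressor(n_neighbors=n_neighbors).fit(X_train, y_train)
        predictions.append(model.predict(X_test))
    return np.mean(predictions, axis=0)

# Toy example: three placeholder "input spaces" derived from the same points.
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))
y = np.exp(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=200)
train_spaces = [X, np.log1p(X), X ** 2]          # stand-ins for transformations
test_spaces = [s[:10] for s in train_spaces]
print(simple_committee_predict(train_spaces, y, test_spaces))
```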

Relevance: 80.00%

Abstract:

Parasites are linked with their hosts in a trophic interaction that has implications for both hosts and parasites. The interaction ranges from the host's immune response to the structuring of communities and the evolution of biodiversity. Since in many species sex determines life-history strategy, the response to parasites may be sex-specific: males of vertebrate species tend to exhibit higher parasite burdens than females. Sex-associated hormones may influence immunocompetence and are hypothesised to lead to this bias. In a field study, we tested the prediction of male-biased parasitism (MBP) in free-ranging chamois (Rupicapra rupicapra rupicapra), which are intensely infested by gastrointestinal and lung helminths. We further investigated sex differences in faecal androgen (testosterone and epiandrosterone), cortisol and oestrogen metabolites using enzyme immunoassays (EIA) to evaluate the impact of these hormones on sex-dependent parasite susceptibility. Non-invasive methods were used, and the study was conducted throughout a year to detect seasonal patterns. Hormone levels and parasite counts varied significantly throughout the year. Male chamois had a higher output of gastrointestinal eggs and lungworm larvae than females. The hypothesis of MBP originating in sex-related hormone levels was confirmed for the elevated output of lungworm larvae, but not for the gastrointestinal nematodes. The faecal output of lungworm larvae was significantly correlated with androgen and cortisol metabolite levels. Our study shows that sex differences in steroid levels play an important role in explaining MBP, although they alone cannot fully explain the phenomenon.

Relevance: 80.00%

Abstract:

OBJECTIVE: To characterize the impact of hepatitis C (HCV) serostatus on adherence to antiretroviral treatment (ART) among HIV-infected adults initiating ART. METHODS: The British Columbia HIV/AIDS Drug Treatment Program distributes, at no cost, all ART in this Canadian province. Eligible individuals used triple combination ART as their first HIV therapy and had documented HCV serology. Statistical analyses used parametric and non-parametric methods, including multivariate logistic regression. The primary outcome was ≥95% adherence, defined as receiving ≥95% of prescription refills during the first year of antiretroviral therapy. RESULTS: There were 1186 patients eligible for analysis, including 606 (51%) who were positive for HCV antibody and 580 (49%) who were negative. In adjusted analyses, adherence was independently associated with HCV seropositivity [adjusted odds ratio (AOR), 0.48; 95% confidence interval (CI), 0.23-0.97; P = 0.003], higher plasma albumin levels (AOR, 1.07; 95% CI, 1.01-1.12; P = 0.002) and male gender (AOR, 2.53; 95% CI, 1.04-6.15; P = 0.017), but not with injection drug use (IDU), age or other markers of liver injury. There was no evidence of an interaction between HCV and liver injury in adjusted analyses; comparing different strata of HCV and IDU confirmed that HCV was associated with poor adherence independently of IDU. CONCLUSIONS: HCV-coinfected individuals and those with lower albumin are less likely to be adherent to their ART.