984 results for "Accumulation rate, n-alkanes C29-C33 per year"


Relevance:

100.00%

Publisher:

Abstract:

Background Low back pain (LBP) is one of the major concerns in health care. In Switzerland, musculoskeletal problems represent the third largest illness group, with 9.4 million consultations per year. Active treatment programmes increase the return-to-work rate and reduce societal costs. However, results after rehabilitation are generally poorer in patients with a Southeast European cultural background than in other patients. This qualitative study of the rehabilitation of patients with LBP and a Southeast European cultural background therefore explores possible barriers to successful rehabilitation. Methods We used a triangulation of three qualitative methods of data collection: 13 semi-structured in-depth interviews with patients who have a Southeast European cultural background and live in Switzerland, five semi-structured in-depth interviews and two focus groups with health professionals, and a literature review. Between June and December 2008, we recruited participants at a rehabilitation centre in the German-speaking part of Switzerland. Results To cope with pain, patients prefer passive strategies, which are not in line with recommended coping strategies. Moreover, patients' families tend to support passive behaviour and reduce patients' autonomy. Health professionals and researchers advocate active strategies, including activity in the presence of pain, yet patients do not consider the psychological factors contributing to LBP. The views of physicians and other health professionals are in line with research evidence demonstrating the importance of psychosocial factors for LBP. Treatment goals focusing on increasing daily activities and returning to work are not well understood by patients, partly because of communication problems that both patients and health professionals are aware of. Poor job satisfaction and other work-related factors are additional barriers to returning to work.
Conclusions LBP rehabilitation can be improved by addressing the following points. Early management of LBP should be activity-centred rather than pain-centred. Return-to-work management, including return to adapted work, must be implemented early to improve rehabilitation; it should begin once patients have been off work for three months. Using interpreters more frequently would improve communication between health professionals and patients and reduce misunderstandings about treatment procedures. Special emphasis must be placed on the process of goal formulation, spending more time with patients to identify barriers to goal attainment. Information on the return-to-work process should also cover the financial consequences of unemployment and disability.


BACKGROUND: Stent thrombosis is a safety concern associated with the use of drug-eluting stents. Little is known about the occurrence of stent thrombosis more than 1 year after implantation of such stents. METHODS: Between April 2002 and December 2005, 8146 patients underwent percutaneous coronary intervention with sirolimus-eluting stents (SES; n=3823) or paclitaxel-eluting stents (PES; n=4323) at two academic hospitals. We assessed data from this group to ascertain the incidence, time course, and correlates of stent thrombosis, as well as the differences between early (0-30 days) and late (>30 days) stent thrombosis and between SES and PES. FINDINGS: Angiographically documented stent thrombosis occurred in 152 patients (incidence density 1.3 per 100 person-years; cumulative incidence at 3 years 2.9%). Early stent thrombosis was noted in 91 (60%) patients, and late stent thrombosis in 61 (40%) patients. Late stent thrombosis occurred at a steady rate of 0.6% per year up to 3 years after stent implantation. The incidence of early stent thrombosis was similar for SES (1.1%) and PES (1.3%), but late stent thrombosis was more frequent with PES (1.8%) than with SES (1.4%; p=0.031). At the time of stent thrombosis, dual antiplatelet therapy was being taken by 87% (early) and 23% (late) of patients (p<0.0001). Independent predictors of overall stent thrombosis were acute coronary syndrome at presentation (hazard ratio 2.28, 95% CI 1.29-4.03) and diabetes (2.03, 1.07-3.83). INTERPRETATION: Late stent thrombosis was encountered steadily, with no evidence of diminution, up to 3 years of follow-up. Early and late stent thrombosis were observed with both SES and PES. Acute coronary syndrome at presentation and diabetes were independent predictors of stent thrombosis.
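The incidence density quoted above (1.3 per 100 person-years) is simply the event count divided by accumulated follow-up time. A minimal sketch; the person-year denominator below is hypothetical, since the abstract does not report it:

```python
def incidence_density(events, person_years, per=100):
    """Events per `per` person-years of accumulated follow-up."""
    return events / person_years * per

# Hypothetical denominator: 152 thromboses over ~11,700 person-years
# would give roughly the reported 1.3 per 100 person-years.
rate = incidence_density(152, 11700)
```

Note that cumulative incidence at a fixed time (2.9% at 3 years here) is estimated separately, because individual follow-up is censored.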


AIMS: To determine whether current sweat-testing practice in Swiss hospitals is consistent with current international guidelines. METHODS: A questionnaire was mailed to all children's hospitals (n = 8), regional paediatric sections of general hospitals (n = 28), and all adult pulmonology centres (n = 8) in Switzerland that care for patients with cystic fibrosis (CF). The results were compared with the published 2000 guidelines of the American National Committee for Clinical Laboratory Standards (NCCLS) and the UK guidelines of 2003. RESULTS: The response rate was 89%. All 8 children's hospitals and 18 of the 23 responding paediatric sections performed sweat tests, but none of the adult pulmonology centres did. In total, 1560 sweat tests were done per year (range: 5-200 tests/centre/year, median 40). 88% (23/26) of centres used Wescor systems: 73% (19/26) the Macroduct system for collecting sweat and 31% (8/26) the Nanoduct system. Sweat chloride was determined by only 62% (16/26) of all centres; of these, only 63% (10/16) reported using the recommended diagnostic chloride CF reference value of >60 mmol/l. Osmolality was measured in 35%, sodium in 42%, and conductivity in 62% of the hospitals. Sweat was collected for a maximum of 30-120 (median 55) minutes; only three centres used the maximum 30-minute collection time recommended by the international guidelines. CONCLUSIONS: Sweat-testing practice in Swiss hospitals was inconsistent and seldom followed the current international guidelines for sweat collection, analysis method, and reference values. Only 62% used the chloride concentration as a diagnostic reference, the only diagnostic measurement accepted by the NCCLS or UK guidelines.


BACKGROUND: CD4+ T-cell recovery in patients with continuous suppression of plasma HIV-1 viral load (VL) is highly variable. This study aimed to identify predictive factors for long-term CD4+ T-cell increase in treatment-naive patients starting combination antiretroviral therapy (cART). METHODS: Treatment-naive patients in the Swiss HIV Cohort Study reaching two VL measurements <50 copies/ml >3 months apart during the first year of cART were included (n=1816). We studied CD4+ T-cell dynamics until the end of suppression or up to 5 years, subdivided into three periods: the first year, years 2-3 and years 4-5 of suppression. Multiple median regression adjusted for repeated CD4+ T-cell measurements was used to study the dependence of CD4+ T-cell slopes on clinical covariates and drug classes. RESULTS: Median CD4+ T-cell increases following VL suppression were 87, 52 and 19 cells/microl per year in the three periods. In the multiple regression model, median CD4+ T-cell increases over all three periods were significantly higher for female gender, lower age, higher VL at cART start, CD4+ T-cell count <650 cells/microl at the start of the period, and low CD4+ T-cell increase in the previous period. Patients on tenofovir showed significantly lower CD4+ T-cell increases than patients on stavudine. CONCLUSIONS: In our observational study, long-term CD4+ T-cell increase in treatment-naive patients with suppressed VL was higher in regimens without tenofovir. The clinical relevance of these findings should ideally be confirmed in clinical trials or large collaborative cohort projects, but could influence the treatment of older patients and those starting cART at low CD4+ T-cell levels.


Background The Swiss government decided to freeze new accreditations for physicians in private practice in Switzerland on the assumption that demand-induced health-care spending can be cut by limiting the supply of care. This legislation initiated an ongoing and controversial public debate in Switzerland. The aim of this study is therefore to determine socio-demographic and health-system-related factors of per capita consultation rates with primary care physicians in the multicultural population of Switzerland. Methods The data were derived from the complete claims data of Swiss health insurers for 2004 and included 21.4 million consultations provided by 6564 Swiss primary care physicians on a fee-for-service basis. Socio-demographic data were obtained from the Swiss Federal Statistical Office. Utilisation-based health service areas were created and used as observational units for statistical procedures. Multivariate and hierarchical models were applied to analyse the data. Results The models allowed the definition of 1018 primary care service areas with a median population of 3754 and an average per capita consultation rate of 2.95 per year. Statistical models yielded significant effects for various geographical, socio-demographic and cultural factors. The regional density of physicians in independent practice was also significantly associated with annual consultation rates, indicating an increase of 0.10 consultations per capita per year for each additional primary care physician per 10,000 inhabitants. Considerable differences across Swiss language regions were observed in the supply of ambulatory health resources provided by primary care physicians, specialists, or hospital-based ambulatory care. Conclusion The study documents large small-area variation in the utilisation and provision of health-care resources in Switzerland. 
Effects of physician density appeared to be strongly related to Swiss language regions and may be rooted in the different cultural backgrounds of the served populations.


Private car ownership in China increased greatly from 1980 to 2009 as the economy developed. To explain the relationship between car ownership and economic and social changes, an ordinary least squares linear regression model is developed using car ownership per capita as the dependent variable and GDP, savings deposits and highway mileage per capita as the independent variables. The model is tested and corrected for econometric problems such as spurious correlation and cointegration. Finally, the regression model is used to project oil consumption by the Chinese transportation sector through 2015. The results show that in 2015 private cars will consume about 2.0 million barrels of oil per day in the conservative scenario and about 2.6 million barrels per day in the high-case scenario, both much higher than the 2009 consumption level of 1.9 million barrels per day. The annual growth rate of transportation oil demand from 2010 to 2015 is 2.7% - 3.1% per year in the conservative scenario and 6.9% - 7.3% per year in the high-case scenario. Consequently, actions such as increasing oil efficiency need to be taken to deal with the challenges of increasing oil demand.
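The study's fit-then-project approach can be sketched with a single-predictor toy example. The data below are illustrative, not the paper's, and the actual model uses three predictors plus corrections for spurious correlation and cointegration:

```python
def ols_fit(x, y):
    """Return (intercept, slope) of the least-squares line y = a + b*x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Slope = covariance / variance; intercept passes through the means.
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

# Hypothetical series: GDP per capita (1000s of yuan) vs cars per 1000 people.
gdp = [10, 15, 20, 25, 30]
cars = [5, 12, 21, 27, 35]

a, b = ols_fit(gdp, cars)
projected = a + b * 40  # project ownership at GDP per capita of 40
```

Under this toy fit, the projection at a GDP per capita of 40 is 50 cars per 1000 people; the paper applies the same logic, then converts the projected fleet into oil demand.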


Sensitive, specific and timely surveillance is necessary to monitor progress towards measles elimination. We evaluated the performance of sentinel and mandatory surveillance systems for measles in Switzerland during a 5-year period by comparing 145 sentinel and 740 mandatorily notified cases. The higher proportion of physicians who reported at least one case per year in the sentinel system suggests underreporting in the recently introduced mandatory surveillance for measles. Accordingly, the latter yielded 2-36-fold lower incidence estimates than the sentinel surveillance. However, these estimates were only 0.6-12-fold lower when we considered confirmed cases alone, which indicates a higher specificity of the mandatory surveillance system. In contrast, the sentinel network, which covers 3.5% of all outpatient consultations, detected a major national measles epidemic in 2003 only weakly and late, and completely missed 2 of 10 cantonal outbreaks. Despite its better timeliness and greater sensitivity in case detection, the sentinel system is, in the current situation of low incidence, insufficient for measles control and for monitoring progress towards elimination.


Viral infections account for over 13 million deaths per year. Antiviral drugs and vaccines are the most effective methods for treating viral diseases. Antiviral compounds have revolutionized the treatment of AIDS and reduced its mortality rate; however, the disease still causes many deaths in developing countries that lack these drugs. Vaccination is the most effective method of preventing viral disease, averting around 2.5 million deaths per year. Vaccines cannot offer full coverage because of the high operational costs of the manufacturing processes. Although vaccines have saved millions of lives, conventional vaccines often cause reactogenic effects. New technologies have been created to eliminate these undesired side effects; however, the new vaccines are less immunogenic, and adjuvants such as vaccine delivery vehicles are required. This work focuses on the discovery of new natural antivirals that can reduce the high cost and side effects of synthetic drugs. We discovered that two osmolytes, trimethylamine N-oxide (TMAO) and glycine, reduce the infectivity of a model virus, porcine parvovirus (PPV), by 4 LRV (99.99%), likely by disrupting capsid assembly. These osmolytes have the potential to be used as drugs, since they showed antiviral activity after 20 h. We have also focused on improving current vaccine manufacturing processes to allow fast, effective and economical vaccines to be produced worldwide. We propose virus flocculation in osmolytes, followed by microfiltration, as an economical alternative for vaccine manufacturing. Osmolytes specifically flocculate hydrophobic virus particles by depleting the hydration layer around the particles, causing virus aggregation. The osmolyte mannitol flocculated virus particles and demonstrated high virus removal: 81% for PPV and 98.1% for Sindbis virus (SVHR).
Virus flocculation with mannitol, followed by microfiltration, could be used as a platform process for virus purification. Finally, we performed biocompatibility studies on soft-templated mesoporous carbon materials with the aim of using them as vaccine delivery vehicles. We found that these materials are biocompatible, with a degree of biocompatibility within the range of other biomaterials currently employed in biomedical applications.
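The "4 LRV (99.99%)" figure above reflects the standard log reduction value, the base-10 logarithm of the fold-reduction in infectious titer. A small sketch of the conversion; the titers used are illustrative, not measurements from this work:

```python
import math

def log_reduction_value(initial_titer, final_titer):
    """LRV = log10 of the fold-reduction in infectious titer."""
    return math.log10(initial_titer / final_titer)

def percent_reduction(lrv):
    """Percent of infectivity removed for a given LRV: 1 - 10^(-LRV)."""
    return (1 - 10 ** (-lrv)) * 100

lrv = log_reduction_value(1e8, 1e4)  # e.g. titer drops from 10^8 to 10^4
pct = percent_reduction(lrv)         # 4 LRV corresponds to 99.99% removal
```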


BACKGROUND: Studies continue to identify percutaneous coronary intervention procedural volume, both at the institutional level and at the operator level, as being strongly correlated with outcome. High-volume centers have been defined as those that perform >400 percutaneous coronary intervention procedures per year. The relationship between drug-eluting stent procedural volume and outcome is unknown. We investigated this relationship in the German Cypher Registry. METHODS AND RESULTS: The present analysis included 8201 patients treated with sirolimus-eluting stents between April 2002 and September 2005 in 51 centers. Centers that recruited >400 sirolimus-eluting stent patients in this time period were considered high-volume centers; those with 150 to 400 patients were considered intermediate-volume centers; and those with <150 patients were designated as low-volume centers. The primary end point was the composite of all-cause death, myocardial infarction, and target-vessel revascularization at 6 months. This end point occurred in 11.3%, 12.1%, and 9.0% of patients in the low-, intermediate-, and high-volume center groups, respectively (P=0.0001). There was no difference between groups in the rate of target-vessel revascularization (P=0.2) or cerebrovascular accidents (P=0.5). The difference in death/myocardial infarction remained significant after adjustment for baseline factors (odds ratio 1.85, 95% confidence interval 1.31 to 2.59, P<0.001 for low-volume centers; odds ratio 1.69, 95% confidence interval 1.29 to 2.21, P<0.001 for intermediate-volume centers). Patient and lesion selection, procedural features, and postprocedural medications differed significantly between groups. CONCLUSIONS: The volume of sirolimus-eluting stent procedures performed at an institutional level was inversely related to death and myocardial infarction, but not to target-vessel revascularization, at 6-month follow-up. Safety outcomes were better in high-volume centers. 
These findings have important public health policy implications.


HYPOTHESIS: Clinically apparent surgical glove perforation increases the risk of surgical site infection (SSI). DESIGN: Prospective observational cohort study. SETTING: University Hospital Basel, with an average of 28,000 surgical interventions per year. PARTICIPANTS: Consecutive series of 4147 surgical procedures performed in the Visceral Surgery, Vascular Surgery, and Traumatology divisions of the Department of General Surgery. MAIN OUTCOME MEASURES: The outcome of interest was SSI occurrence as assessed pursuant to the Centers for Disease Control and Prevention standards. The primary predictor variable was compromised asepsis due to glove perforation. RESULTS: The overall SSI rate was 4.5% (188 of 4147 procedures). Univariate logistic regression analysis showed a higher likelihood of SSI in procedures in which gloves were perforated compared with interventions with maintained asepsis (odds ratio [OR], 2.0; 95% confidence interval [CI], 1.4-2.8; P < .001). However, multivariate logistic regression analyses showed that the increase in SSI risk with perforated gloves was different for procedures with vs those without surgical antimicrobial prophylaxis (test for effect modification, P = .005). Without antimicrobial prophylaxis, glove perforation entailed significantly higher odds of SSI compared with the reference group with no breach of asepsis (adjusted OR, 4.2; 95% CI, 1.7-10.8; P = .003). On the contrary, when surgical antimicrobial prophylaxis was applied, the likelihood of SSI was not significantly higher for operations in which gloves were punctured (adjusted OR, 1.3; 95% CI, 0.9-1.9; P = .26). CONCLUSION: Without surgical antimicrobial prophylaxis, glove perforation increases the risk of SSI.
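The odds ratios above come from logistic regression; for a single binary exposure, the unadjusted OR reduces to the cross-product ratio of a 2x2 table. A minimal sketch with hypothetical counts, since the abstract does not report the raw table:

```python
def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 exposure/outcome table:
    (odds of outcome among exposed) / (odds among unexposed)."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: SSI odds twice as high with glove perforation,
# illustrating an OR of 2.0 like the univariate estimate reported.
or_perforation = odds_ratio(20, 200, 10, 200)
```

The adjusted ORs in the abstract additionally condition on covariates such as antimicrobial prophylaxis, which the raw cross-product cannot capture.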


PURPOSE: This systematic review sought to determine the long-term clinical survival rates of single-tooth restorations fabricated with computer-aided design/computer-assisted manufacture (CAD/CAM) technology, as well as the frequency of failures depending on the CAD/CAM system, the type of restoration, the selected material, and the luting agent. MATERIALS AND METHODS: An electronic search from 1985 to 2007 was performed using two databases: Medline/PubMed and Embase. Selected keywords and well-defined inclusion and exclusion criteria guided the search. All articles were first reviewed by title, then by abstract, and subsequently by a full text reading. Data were assessed and extracted by two independent examiners. The pooled results were statistically analyzed and the overall failure rate was calculated by assuming a Poisson-distributed number of events. In addition, reported failures were analyzed by CAD/CAM system, type of restoration, restorative material, and luting agent. RESULTS: From a total of 1,957 single-tooth restorations with a mean exposure time of 7.9 years and 170 failures, the failure rate was 1.75% per year, estimated per 100 restoration years (95% CI: 1.22% to 2.52%). The estimated total survival rate after 5 years of 91.6% (95% CI: 88.2% to 94.1%) was based on random-effects Poisson regression analysis. CONCLUSIONS: Long-term survival rates for CAD/CAM single-tooth Cerec 1, Cerec 2, and Celay restorations appear to be similar to conventional ones. No clinical studies or randomized clinical trials reporting on other CAD/CAM systems currently used in clinical practice and with follow-up reports of 3 or more years were found at the time of the search.
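The 5-year survival estimate of 91.6% follows directly from the 1.75%-per-year failure rate under the constant-hazard assumption implied by the Poisson model, since survival is S(t) = exp(-lambda*t). A short sketch of that relationship:

```python
import math

def survival_constant_hazard(annual_failure_rate, years):
    """Survival under a constant hazard: S(t) = exp(-lambda * t)."""
    return math.exp(-annual_failure_rate * years)

# 1.75% failures per restoration-year over 5 years -> ~0.916,
# i.e. the 91.6% 5-year survival quoted in the abstract.
s5 = survival_constant_hazard(0.0175, 5)
```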


BACKGROUND: Unlike most antihyperglycaemic drugs, glucagon-like peptide-1 (GLP-1) receptor agonists have a glucose-dependent action and promote weight loss. We compared the efficacy and safety of liraglutide, a human GLP-1 analogue, with exenatide, an exendin-based GLP-1 receptor agonist. METHODS: Adults with inadequately controlled type 2 diabetes on maximally tolerated doses of metformin, sulphonylurea, or both, were stratified by previous oral antidiabetic therapy and randomly assigned to receive additional liraglutide 1.8 mg once a day (n=233) or exenatide 10 microg twice a day (n=231) in a 26-week open-label, parallel-group, multinational (15 countries) study. The primary outcome was change in glycosylated haemoglobin (HbA(1c)). Efficacy analyses were by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT00518882. FINDINGS: Mean baseline HbA(1c) for the study population was 8.2%. Liraglutide reduced mean HbA(1c) significantly more than did exenatide (-1.12% [SE 0.08] vs -0.79% [0.08]; estimated treatment difference -0.33; 95% CI -0.47 to -0.18; p<0.0001) and more patients achieved an HbA(1c) value of less than 7% (54% vs 43%, respectively; odds ratio 2.02; 95% CI 1.31 to 3.11; p=0.0015). Liraglutide reduced mean fasting plasma glucose more than did exenatide (-1.61 mmol/L [SE 0.20] vs -0.60 mmol/L [0.20]; estimated treatment difference -1.01 mmol/L; 95% CI -1.37 to -0.65; p<0.0001) but postprandial glucose control was less effective after breakfast and dinner. Both drugs promoted similar weight losses (liraglutide -3.24 kg vs exenatide -2.87 kg). Both drugs were well tolerated, but nausea was less persistent (estimated treatment rate ratio 0.448, p<0.0001) and minor hypoglycaemia less frequent with liraglutide than with exenatide (1.93 vs 2.60 events per patient per year; rate ratio 0.55; 95% CI 0.34 to 0.88; p=0.0131; 25.5% vs 33.6% had minor hypoglycaemia). 
Two patients taking both exenatide and a sulphonylurea had a major hypoglycaemic episode. INTERPRETATION: Liraglutide once a day provided significantly greater improvements in glycaemic control than did exenatide twice a day, and was generally better tolerated. The results suggest that liraglutide might be a treatment option for type 2 diabetes, especially when weight loss and risk of hypoglycaemia are major considerations.


The winter component of a year-round grazing system, in which cows grazed corn crop residues followed by stockpiled grass-legume forages, was compared at the McNay Research Farm with the winter component of a minimal land system that maintained cows in drylot. In the summers of 1995 and 1996, two cuttings and one cutting of hay, respectively, were harvested from two 15-acre fields containing “Johnston” low-endophyte tall fescue and red clover. Two cuttings of hay in 1995 and one cutting in 1996 were harvested from two 15-acre fields of smooth bromegrass and red clover. Hay yields were 4,236 and 4,600 pounds of dry matter per acre for the tall fescue-red clover in 1995 and 1996, and 2,239 and 2,300 pounds of dry matter per acre for the smooth bromegrass-red clover in 1995 and 1996. Following grain harvest, four 7.5-acre fields containing corn crop residues were stocked with cows at midgestation at an allowance of 1.5 acres per cow. Forage yields at the initiation of corn crop grazing in 1995 and 1996 were 3,757 and 3,551 pounds of dry matter per acre for corn crop residues. Stockpiled forage yields were 1,748 and 2,912 pounds of dry matter for tall fescue-red clover and 1,880 and 2,187 pounds for smooth bromegrass-red clover. Corn crop residues and stockpiled forages were grazed in a strip stocking system. For comparison, 20 cows in 1995 and 16 cows in 1996 were placed in two drylots simultaneously with the initiation of corn crop grazing, where they remained throughout the winter and spring grazing periods. Cows maintained in drylots or grazing corn crop residue and stockpiled forages were supplemented with hay as large round bales to maintain a body condition score of five. In both years, no seasonal differences in body weight or body condition score were observed between grazing cows and cows maintained in drylots, but grazing cows required 85% and 98% less harvested hay in years 1 and 2 than cows in drylot during the winter and spring. 
Because less hay was needed to maintain grazing cows, excesses of 12,354 and 5,244 pounds of hay dry matter per cow in 1995 and 1996 remained in the year-round grazing system. During corn crop grazing, organic matter yield decreased at 23.5 and 28.8 pounds of organic matter per day from grazed areas of corn crop residues in 1995 and 1996. Organic matter losses due to weathering were 6.8, 10.3, and 12.7 pounds per day in corn crop residue, tall fescue-red clover and smooth bromegrass-red clover in 1995 and 12.1, 10.7, and 12.1 in 1996. Organic matter losses from grazed and ungrazed areas of tall fescue-red clover and smooth bromegrass-red clover during stockpiled grazing were 6.9, 6.9, and 2.1, 2.9 in 1995 and 13.4, 4.3, and +6.9, 4.4 pounds per day in 1996.


BACKGROUND There is weak evidence to support the benefit of periodontal maintenance therapy in preventing tooth loss. In addition, the effects of long-term periodontal treatment on general health are unclear. METHODS Patients who were compliant and partially compliant (15 to 25 years' follow-up) in private practice were observed for oral and systemic health changes. RESULTS A total of 219 patients who were compliant (91 males and 128 females) were observed for 19.1 (range 15 to 25; SD ± 2.8) years. Age at reassessment was 64.6 (range: 39 to 84; SD ± 9.0) years. A total of 145 patients were stable (0 to 3 teeth lost), 54 were downhill (4 to 6 teeth lost), and 21 patients extreme downhill (>6 teeth lost); 16 patients developed hypertension, 13 developed type 2 diabetes, and 15 suffered myocardial infarcts (MIs). A minority developed other systemic diseases. Risk factors for MI included overweight (odds ratio [OR]: 9.04; 95% confidence interval [CI]: 2.9 to 27.8; P = 0.000), family history with cardiovascular disease (OR: 3.10; 95% CI: 1.07 to 8.94; P = 0.029), type 1 diabetes at baseline (P = 0.02), and developing type 2 diabetes (OR: 7.9; 95% CI: 2.09 to 29.65; P = 0.000). A total of 25 patients who were partially compliant (17 males and eight females) were observed for 19 years. This group had a higher proportion of downhill and extreme downhill cases and MI. CONCLUSIONS Patients who left the maintenance program in a periodontal specialist practice in Norway had a higher rate of tooth loss than patients who were compliant. Patients who were compliant with maintenance in a specialist practice in Norway have a similar risk of developing type 2 diabetes as the general population. A rate of 0.0037 MIs per patient per year was recorded for this group. Due to the lack of external data, it is difficult to assess how this compares with patients who have periodontal disease and are untreated.


Background Whereas it is well established that various soluble biomarkers can predict the level of liver fibrosis, their ability to predict liver-related clinical outcomes is less clearly established, in particular among HIV/viral hepatitis co-infected persons. We investigated the ability of plasma hyaluronic acid (HA) to predict the risk of liver-related events (LRE; hepatic coma or liver-related death) in the EuroSIDA study. Methods Patients included were positive for anti-HCV and/or HBsAg with at least one available plasma sample. The earliest collected plasma sample was tested for HA (normal range 0–75 ng/mL) and levels were associated with risk of LRE. Change in HA per year of follow-up was estimated after measuring HA levels in the latest sample before the LRE for those experiencing this outcome (cases) and in a random selection of one sixth of the remaining patients (controls). Results During a median of 8.2 years of follow-up, 84/1252 (6.7%) patients developed an LRE. Baseline median (IQR) HA in those without and with an LRE was 31.8 (17.2–62.6) and 221.6 ng/mL (74.9–611.3), respectively (p<0.0001). After adjustment, HA levels predicted the risk of an LRE; incidence rate ratios for HA levels 75–250 or ≥250 vs. <75 ng/mL were 5.22 (95% CI 2.86–9.26, p<0.0007) and 28.22 (95% CI 14.95–46.00, p<0.0001), respectively. Median HA levels increased substantially before an LRE developed (107.6 ng/mL, IQR 0.8 to 251.1) but remained stable for controls (1.0 ng/mL, IQR –5.1 to 8.2) (p<0.0001 comparing cases and controls), and greater increases predicted the risk of an LRE in adjusted models (p<0.001). Conclusions An elevated plasma HA level, particularly one that further increases over time, substantially raises the risk of an LRE over the next five years. HA is an inexpensive, standardized and non-invasive supplement to other methods for identifying HIV/viral hepatitis co-infected patients at risk of hepatic complications.