946 results for RECEIVING TENOFOVIR
Abstract:
OBJECTIVES: Two factors have been considered important contributors to tooth wear: dietary abrasives in plant foods themselves and mineral particles adhering to ingested food. Each factor limits the functional life of teeth. Cross-population studies of wear rates in a single species living in different habitats may point to the relative contributions of each factor. MATERIALS AND METHODS: We examine macroscopic dental wear in populations of Alouatta palliata (Gray, 1849) from Costa Rica (115 specimens), Panama (19), and Nicaragua (56). The sites differ in mean annual precipitation, with the Panamanian sites receiving more than twice the precipitation of those in Costa Rica or Nicaragua (∼3,500 mm vs. ∼1,500 mm). Additionally, many of the Nicaraguan specimens were collected downwind of active Plinian volcanoes. Molar wear is expressed as the ratio of exposed dentin area to tooth area; premolar wear is scored using a ranking system. RESULTS: Despite substantial variation in environmental variables and the added presence of ash in some environments, molar wear rates do not differ significantly among the populations. Premolar wear, however, is greater in individuals collected downwind from active volcanoes compared with those living in environments that did not experience ash-fall. DISCUSSION: Volcanic ash seems to be an important contributor to anterior tooth wear but less so to molar wear. That wear is not found uniformly across the tooth row may be related to malformation in the premolars due to fluorosis. A surge of fluoride accompanying the volcanic ash may differentially affect the premolars because the molars fully mineralize early in the life of Alouatta.
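The molar wear metric named in the METHODS above is a simple area ratio. A minimal sketch of that calculation, using placeholder areas rather than data from the study:

```python
# Illustration of the molar wear metric described above:
# wear = exposed dentin area / occlusal tooth area.
# The areas below are placeholder values (mm^2), not data from the study.

def dentin_exposure_ratio(dentin_area_mm2: float, tooth_area_mm2: float) -> float:
    """Proportion of the occlusal surface on which dentin is exposed."""
    if tooth_area_mm2 <= 0:
        raise ValueError("tooth area must be positive")
    return dentin_area_mm2 / tooth_area_mm2

# Example: 12.4 mm^2 of exposed dentin on a 55.0 mm^2 occlusal surface.
print(f"wear ratio = {dentin_exposure_ratio(12.4, 55.0):.3f}")  # -> 0.225
```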
Abstract:
While technologies for genetic sequencing have increased the promise of personalized medicine, they simultaneously pose threats to personal privacy. The public’s desire to protect itself from unauthorized access to information may limit the uses of this valuable resource. To date, there is limited understanding of the public’s attitudes toward the regulation and sharing of such information. We sought to understand the drivers of individuals’ decisions to disclose genetic information to a third party in a setting where disclosure potentially creates both private and social benefits but also carries the risk of misuse of private information. We conducted two separate but related studies. First, we administered surveys to college students and parents to determine individual attitudes toward, and inter-generational influences on, the disclosure decision. Second, we conducted a game-theory-based experiment that assessed how participants’ decisions to disclose genetic information are influenced by societal and health factors. Key survey findings indicate that concerns about genetic information privacy negatively affect the likelihood of disclosure, while the perceived benefits of disclosure and trust in the institution receiving the information have a positive influence. The experiment results also show that the risk of discrimination negatively affects the likelihood of disclosure, while the positive impact that disclosure has on the probability of finding a cure and the presence of a monetary incentive to disclose increase the likelihood. We also study the determinants of individuals’ decisions to be informed of findings about their health, and how information about health status is used for financial decisions.
Abstract:
BACKGROUND: The Affordable Care Act encourages healthcare systems to integrate behavioral and medical healthcare, as well as to employ electronic health records (EHRs) for health information exchange and quality improvement. Pragmatic research paradigms that employ EHRs in research are needed to produce clinical evidence in real-world medical settings for informing learning healthcare systems. Adults with comorbid diabetes and substance use disorders (SUDs) tend to use costly inpatient treatments; however, there is a lack of empirical data on implementing behavioral healthcare to reduce health risk in adults with high-risk diabetes. Given the complexity of high-risk patients' medical problems and the cost of conducting randomized trials, a feasibility project is warranted to guide practical study designs. METHODS: We describe the study design, which explores the feasibility of implementing substance use Screening, Brief Intervention, and Referral to Treatment (SBIRT) among adults with high-risk type 2 diabetes mellitus (T2DM) within a home-based primary care setting. Our study includes the development of an integrated EHR datamart to identify eligible patients and collect diabetes healthcare data, and the use of a geographic health information system to understand the social context in patients' communities. Analysis will examine recruitment, proportion of patients receiving brief intervention and/or referrals, substance use, SUD treatment use, diabetes outcomes, and retention. DISCUSSION: By capitalizing on an existing T2DM project that uses home-based primary care, our study results will provide timely clinical information to inform the designs and implementation of future SBIRT studies among adults with multiple medical conditions.
Abstract:
BACKGROUND: Anticoagulation can reduce quality of life, and different models of anticoagulation management might have different impacts on satisfaction with this component of medical care. Yet, to our knowledge, there are no scales measuring quality of life and satisfaction with anticoagulation that can be generalized across different models of anticoagulation management. We describe the development and preliminary validation of such an instrument - the Duke Anticoagulation Satisfaction Scale (DASS). METHODS: The DASS is a 25-item scale addressing (a) the negative impacts of anticoagulation (limitations, hassles and burdens) and (b) the positive impacts of anticoagulation (confidence, reassurance, satisfaction). Each item has 7 possible responses. The DASS was administered to 262 patients currently receiving oral anticoagulation. Scales measuring generic quality of life, satisfaction with medical care, and tendency to provide socially desirable responses were also administered. Statistical analysis included assessment of item variability, internal consistency (Cronbach's alpha), scale structure (factor analysis), and correlations between the DASS and demographic variables, clinical characteristics, and scores on the above scales. A follow-up study of 105 additional patients assessed test-retest reliability. RESULTS: 220 subjects answered all items. Ceiling and floor effects were modest, and 25 of the 27 proposed items grouped into 2 factors (positive impacts and negative impacts, the latter potentially subdivided into limitations versus hassles and burdens). Each factor had a high degree of internal consistency (Cronbach's alpha 0.78-0.91). The limitations and hassles factors consistently correlated with the SF-36 scales measuring generic quality of life, while the positive psychological impact scale correlated with age and time on anticoagulation. The intra-class correlation coefficient for test-retest reliability was 0.80. CONCLUSIONS: The DASS has demonstrated reasonable psychometric properties to date. Further validation is ongoing. To the degree that dissatisfaction with anticoagulation leads to decreased adherence, poorer INR control, and poor clinical outcomes, the DASS has the potential to help identify reasons for dissatisfaction (and positive satisfaction) and thus help to develop interventions to break this cycle. As an instrument designed to be applicable across multiple models of anticoagulation management, the DASS could be crucial in the scientific comparison of those models of care.
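The internal-consistency statistic quoted above (Cronbach's alpha) can be computed directly from an item-response matrix. A minimal sketch on toy 7-point responses (the values are invented, not DASS data):

```python
# Toy Cronbach's alpha calculation; the 7-point responses below are invented,
# not DASS data. alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

responses = np.array([  # 5 respondents, 4 items, each scored 1-7
    [6, 5, 6, 7],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
    [2, 3, 2, 3],
    [7, 6, 7, 6],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```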
Abstract:
BACKGROUND: The development of a microcomputer-based device permits quick, simple, and noninvasive quantification of the respiratory sinus arrhythmia (RSA) during quiet breathing. METHODS AND RESULTS: We prospectively and serially measured the radionuclide left ventricular ejection fraction and the RSA amplitude in 34 cancer patients receiving up to nine monthly bolus treatments with doxorubicin hydrochloride (60 mg/m2). Of the eight patients who ultimately developed symptomatic doxorubicin-induced congestive heart failure, seven (87.5%) demonstrated a significant decline in RSA amplitude; five of 26 subjects without clinical symptoms of cardiotoxicity (19.2%) showed a similar RSA amplitude decline. On average, significant RSA amplitude decline occurred 3 months before the last planned doxorubicin dose in patients destined to develop clinical congestive heart failure. CONCLUSION: Overall, RSA amplitude abnormality proved to be a more specific predictor of clinically significant congestive heart failure than did serial resting radionuclide ejection fractions.
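The proportions quoted above are simple ratios of counts. A small sketch reproducing that arithmetic, with counts taken from the abstract (the derived specificity is implied rather than stated there):

```python
# Arithmetic behind the proportions quoted in the abstract: RSA-amplitude decline
# in 7 of 8 patients who developed CHF and in 5 of 26 patients who did not.
decline_chf, total_chf = 7, 8
decline_no_chf, total_no_chf = 5, 26

sensitivity = decline_chf / total_chf                  # 0.875 -> 87.5%
false_positive_rate = decline_no_chf / total_no_chf    # 0.192 -> 19.2%
specificity = 1 - false_positive_rate                  # ~0.808 (implied, not stated in the abstract)

print(f"sensitivity = {sensitivity:.1%}, false positives = {false_positive_rate:.1%}, "
      f"specificity = {specificity:.1%}")
```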
Abstract:
PURPOSE: Risk-stratified guidelines can improve quality of care and cost-effectiveness, but their uptake in primary care has been limited. MeTree, a Web-based, patient-facing risk-assessment and clinical decision support tool, is designed to facilitate uptake of risk-stratified guidelines. METHODS: A hybrid implementation-effectiveness trial of three clinics (two intervention, one control). PARTICIPANTS: consentable nonadopted adults with upcoming appointments. PRIMARY OUTCOME: agreement between patient risk level and risk management for those meeting evidence-based criteria for increased-risk risk-management strategies (increased risk) and those who do not (average risk) before MeTree and after. MEASURES: chart abstraction was used to identify risk management related to colon, breast, and ovarian cancer, hereditary cancer, and thrombosis. RESULTS: Participants = 488, female = 284 (58.2%), white = 411 (85.7%), mean age = 58.7 (SD = 12.3). Agreement between risk management and risk level for all conditions for each participant, except for colon cancer, which was limited to those <50 years of age, was (i) 1.1% (N = 2/174) for the increased-risk group before MeTree and 16.1% (N = 28/174) after and (ii) 99.2% (N = 2,125/2,142) for the average-risk group before MeTree and 99.5% (N = 2,131/2,142) after. Of those receiving increased-risk risk-management strategies at baseline, 10.5% (N = 2/19) met criteria for increased risk. After MeTree, 80.7% (N = 46/57) met criteria. CONCLUSION: MeTree integration into primary care can improve uptake of risk-stratified guidelines and potentially reduce "overuse" and "underuse" of increased-risk services. Genet Med 18(10): 1020-1028.
Abstract:
AIM: To evaluate, for quality improvement, pretreatment hepatitis B virus (HBV) testing, vaccination, and antiviral treatment rates in Veterans Affairs patients receiving anti-CD20 Ab. METHODS: We performed a retrospective cohort study using a national repository of Veterans Health Administration (VHA) electronic health record data. We identified all patients receiving anti-CD20 Ab treatment (2002-2014). We ascertained patient demographics, laboratory results, HBV vaccination status (from vaccination records), pharmacy data, and vital status. The high-risk period for HBV reactivation is during anti-CD20 Ab treatment and the 12 mo of follow-up. Therefore, we analyzed those who were followed to death or for at least 12 mo after completing anti-CD20 Ab. Pretreatment serologic tests were used to categorize chronic HBV (hepatitis B surface antigen positive or HBsAg+), past HBV (HBsAg-, hepatitis B core antibody positive or HBcAb+), resolved HBV (HBsAg-, HBcAb+, hepatitis B surface antibody positive or HBsAb+), likely prior vaccination (isolated HBsAb+), HBV negative (HBsAg-, HBcAb-), or unknown. Acute hepatitis B was defined by the appearance of HBsAg+ in the high-risk period in patients who were HBV negative pretreatment. We assessed HBV antiviral treatment and the incidence of hepatitis, liver failure, and death during the high-risk period. Cumulative hepatitis, liver failure, and death after anti-CD20 Ab initiation were compared across HBV disease categories, with differences tested using the χ² test. Mean time to hepatitis (peak alanine aminotransferase), liver failure, and death relative to anti-CD20 Ab administration and follow-up was also compared by HBV disease group. RESULTS: Among 19304 VHA patients who received anti-CD20 Ab, 10224 (53%) had pretreatment HBsAg testing during the study period, with 49% and 43% tested for HBsAg and HBcAb, respectively, within 6 mo pretreatment in 2014. Of those tested, 2% (167/10224) had chronic HBV, 4% (326/7903) past HBV, 5% (427/8110) resolved HBV, 8% (628/8110) likely prior HBV vaccination, and 76% (6022/7903) were HBV negative. In those with chronic HBV infection, ≤ 37% received HBV antiviral treatment during the high-risk period, while 21% and 23% of those with past or resolved HBV, respectively, received HBV antiviral treatment. During and 12 mo after anti-CD20 Ab, the rate of hepatitis was significantly greater in those HBV positive vs negative (P = 0.001). The mortality rate was 35%-40% in chronic or past hepatitis B and 26%-31% in those who were hepatitis B negative. Of 4947 patients who were HBV negative pretreatment and tested during anti-CD20 Ab treatment and follow-up, 16 (0.3%) developed acute hepatitis B. CONCLUSION: While HBV testing of Veterans has increased prior to anti-CD20 Ab, few HBV+ patients received HBV antivirals, suggesting that electronic health record algorithms may enhance health outcomes.
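The serologic categories defined in the METHODS above amount to a small classification rule over three markers. A hedged sketch of that rule (the function, its encoding of results, and the tie-breaking order are illustrative, not the study's code):

```python
# Sketch of the pretreatment serologic classification described above.
# Marker values: True = positive, False = negative, None = not tested.
# The function name, encoding, and ordering are illustrative only.
from typing import Optional

def classify_hbv(hbsag: Optional[bool], hbcab: Optional[bool], hbsab: Optional[bool]) -> str:
    if hbsag is True:
        return "chronic HBV"                               # HBsAg+
    if hbsag is False and hbcab is True:
        return "resolved HBV" if hbsab else "past HBV"     # HBsAg-, HBcAb+, with/without HBsAb+
    if hbsag is False and hbcab is False:
        return "likely prior vaccination" if hbsab else "HBV negative"  # isolated HBsAb+ vs negative
    return "unknown"                                       # incomplete pretreatment testing

print(classify_hbv(False, True, True))   # -> resolved HBV
print(classify_hbv(None, None, None))    # -> unknown
```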
Abstract:
BACKGROUND: Dolutegravir (S/GSK1349572), a once-daily, unboosted integrase inhibitor, was recently approved in the United States for the treatment of human immunodeficiency virus type 1 (HIV-1) infection in combination with other antiretroviral agents. Dolutegravir, in combination with abacavir-lamivudine, may provide a simplified regimen. METHODS: We conducted a randomized, double-blind, phase 3 study involving adult participants who had not received previous therapy for HIV-1 infection and who had an HIV-1 RNA level of 1000 copies per milliliter or more. Participants were randomly assigned to dolutegravir at a dose of 50 mg plus abacavir-lamivudine once daily (DTG-ABC-3TC group) or combination therapy with efavirenz-tenofovir disoproxil fumarate (DF)-emtricitabine once daily (EFV-TDF-FTC group). The primary end point was the proportion of participants with an HIV-1 RNA level of less than 50 copies per milliliter at week 48. Secondary end points included the time to viral suppression, the change from baseline in CD4+ T-cell count, safety, and viral resistance. RESULTS: A total of 833 participants received at least one dose of study drug. At week 48, the proportion of participants with an HIV-1 RNA level of less than 50 copies per milliliter was significantly higher in the DTG-ABC-3TC group than in the EFV-TDF-FTC group (88% vs. 81%, P = 0.003), thus meeting the criterion for superiority. The DTG-ABC-3TC group had a shorter median time to viral suppression than did the EFV-TDF-FTC group (28 vs. 84 days, P<0.001), as well as greater increases in CD4+ T-cell count (267 vs. 208 per cubic millimeter, P<0.001). The proportion of participants who discontinued therapy owing to adverse events was lower in the DTG-ABC-3TC group than in the EFV-TDF-FTC group (2% vs. 10%); rash and neuropsychiatric events (including abnormal dreams, anxiety, dizziness, and somnolence) were significantly more common in the EFV-TDF-FTC group, whereas insomnia was reported more frequently in the DTG-ABC-3TC group. No participants in the DTG-ABC-3TC group had detectable antiviral resistance; one tenofovir DF-associated mutation and four efavirenz-associated mutations were detected in participants with virologic failure in the EFV-TDF-FTC group. CONCLUSIONS: Dolutegravir plus abacavir-lamivudine had a better safety profile and was more effective through 48 weeks than the regimen with efavirenz-tenofovir DF-emtricitabine. Copyright © 2013 Massachusetts Medical Society.
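The week-48 comparison above is a difference between two response proportions (88% vs. 81%). A hedged sketch of a simple two-proportion z-test of that kind, with illustrative counts chosen only to reproduce the reported percentages in 833 participants; this is not the trial's prespecified analysis:

```python
# Two-proportion z-test sketch for a comparison like the one above.
# The counts (366/416 vs. 338/417) are illustrative, chosen only to match the
# reported 88% vs. 81%; they are not taken from the trial.
import math

def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                      # pooled response rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided normal p-value
    return z, p_value

z, p = two_proportion_ztest(366, 416, 338, 417)
print(f"difference = {366/416 - 338/417:.1%}, z = {z:.2f}, p = {p:.4f}")
```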
Abstract:
Background: The role of home parenteral nutrition (HPN) in incurable cachectic cancer patients unable to eat is extremely controversial. The aim of this study is to analyse which factors can influence the outcome. Patients and methods: We prospectively studied 414 incurable cachectic (sub)obstructed cancer patients receiving HPN and analysed the association between patient or clinical characteristics and survival status. Results: Median weight loss, versus the pre-disease weight and over the last 6-month period, was 24% and 16%, respectively. Median body mass index was 19.5, median KPS was 60, and median life expectancy was 3 months. Mean/median survival was 4.7/3.0 months; 50.0% and 22.9% of patients survived 3 and 6 months, respectively. In the multivariable analysis, the variables significantly associated with 3- and 6-month survival were the Glasgow Prognostic Score (GPS) and KPS, and GPS, KPS and tumour spread, respectively. By aggregating the significant variables, it was possible to distinguish several classes of patients with different survival probabilities. Conclusions: The outcome of cachectic incurable cancer patients on HPN is not homogeneous. It is possible to identify groups of patients with a ≥6-month survival (possibly longer than would be allowed by starvation). The indications for HPN can be modulated according to these clinical/biochemical indices. © The Author 2013. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved.
Abstract:
Background: Tuberculosis-associated immune reconstitution inflammatory syndrome (TB-IRIS) remains a poorly understood complication in HIV-TB co-infected patients initiating antiretroviral therapy (ART). The role of the innate immune system in TB-IRIS is becoming increasingly apparent; however, the potential involvement in TB-IRIS of a leaky gut and of proteins that interfere with TLR stimulation by binding PAMPs has not been investigated before. Here we aimed to investigate the innate nature of the cytokine response in TB-IRIS and to identify novel potential biomarkers. Methods: From a large prospective cohort of HIV-TB co-infected patients receiving TB treatment, we compared 40 patients who developed TB-IRIS during the first month of ART with 40 patients matched for age, sex and baseline CD4 count who did not. We analyzed plasma levels of lipopolysaccharide (LPS)-binding protein (LBP), LPS, sCD14, endotoxin-core antibody, intestinal fatty acid-binding protein (I-FABP) and 18 pro- and anti-inflammatory cytokines before and during ART. Results: We observed lower baseline levels of IL-6 (p = 0.041), G-CSF (p = 0.036) and LBP (p = 0.016) in TB-IRIS patients. At the IRIS event, we detected higher levels of LBP, IL-1RA, IL-4, IL-6, IL-7, IL-8, G-CSF (p ≤ 0.032) and lower I-FABP levels (p = 0.013) compared with HIV-TB co-infected controls. Only IL-6 showed an independent effect in multivariate models containing significant cytokines from pre-ART (p = 0.039) and during TB-IRIS (p = 0.034). Conclusion: We report pre-ART IL-6 and LBP levels, as well as IL-6, LBP and I-FABP levels during the IRIS event, as potential biomarkers in TB-IRIS. Our results show no evidence of a contribution of a leaky gut to TB-IRIS and indicate that IL-6 holds a distinct role in the disturbed innate cytokine profile before and during TB-IRIS. Future clinical studies should investigate the importance and clinical relevance of these markers for the diagnosis and treatment of TB-IRIS. Copyright: © 2013 Goovaerts et al.
Abstract:
Tuberculosis-associated immune reconstitution inflammatory syndrome (TB-IRIS) remains a poorly understood complication in HIV-TB patients receiving antiretroviral therapy (ART). TB-IRIS could be associated with an exaggerated immune response to TB-antigens. We compared the recovery of IFNγ responses to recall and TB-antigens and explored in vitro innate cytokine production in TB-IRIS patients.
Abstract:
Tuberculosis-associated immune reconstitution inflammatory syndrome (TB-IRIS) is a common complication in HIV-TB co-infected patients receiving combined antiretroviral therapy (cART). This study investigated a putative contribution of monocytes to the development of TB-IRIS.
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models offer the promise of quickly and efficiently bringing evacuation considerations into the design phase, while the ship is still "on the drawing board". maritimeEXODUS - winner of the BCS, CITIS and RINA awards - is such a model. Features such as the ability to realistically simulate human response to fire, the capability to model human performance in heeled orientations, a virtual reality environment that produces realistic visualisations of the modelled scenarios, and an integrated abandonment model make maritimeEXODUS a unique tool for assessing the evacuation capabilities of all types of vessels under a variety of conditions. This paper describes the maritimeEXODUS model, the SHEBA facility from which data concerning passenger/crew performance in conditions of heel are derived, and an example application demonstrating the model's use in performing an evacuation analysis for a large passenger ship based partially on the requirements of MSC circular 1033.
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe, from an evacuation point of view, in the event of fire or another incident? In the wake of major maritime disasters such as the Scandinavian Star, the Herald of Free Enterprise and the Estonia, and in light of the growth in the number of high-density, high-speed ferries and large-capacity cruise ships, issues concerning the evacuation of passengers and crew at sea are receiving renewed interest. Fire and evacuation models are now available with features such as the ability to realistically simulate the spread of heat and smoke and the human response to fire, the capability to model human performance in heeled orientations, and a link to a virtual reality environment that produces realistic visualisations of the modelled scenarios; these can be used to aid the engineer in assessing ship design and procedures. This paper describes the maritimeEXODUS ship evacuation model and the SMARTFIRE fire simulation model and provides an example application demonstrating the use of the models in performing fire and evacuation analysis for a large passenger ship based partially on the requirements of MSC circular 1033.
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe, from an evacuation point of view, in the event of fire or another incident? In the wake of major maritime disasters such as the Scandinavian Star, the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerning the evacuation of passengers and crew at sea are receiving renewed interest. Fire and evacuation models are now available with features such as the ability to realistically simulate fire spread, fire suppression systems and the human response to fire, the capability to model human performance in heeled orientations, and a link to a virtual reality environment that produces realistic visualisations of the modelled scenarios; these can be used to aid the engineer in assessing ship design and procedures. This paper describes the maritimeEXODUS ship evacuation model and the SMARTFIRE fire simulation model and provides an example application demonstrating the use of the models in performing fire and evacuation analysis for a large passenger ship based partially on, but exceeding, the requirements of MSC circular 1033.
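As a purely illustrative toy of one capability mentioned above, degrading walking speed under static heel and estimating a naive egress time can be sketched as follows; the speed factors, geometry, and names are invented for illustration and this is not the maritimeEXODUS implementation:

```python
# Toy illustration only: reduce passenger walking speed as static heel angle
# increases and estimate a simple egress time. The speed reduction factors and
# route lengths are invented; this is NOT the maritimeEXODUS model.
from dataclasses import dataclass

@dataclass
class Passenger:
    distance_to_exit_m: float     # walking distance along the escape route
    base_speed_mps: float = 1.2   # unimpeded walking speed on a level deck

def heel_speed_factor(heel_deg: float) -> float:
    """Hypothetical linear degradation: full speed at 0 deg, half speed at 20 deg or more."""
    return max(0.5, 1.0 - 0.025 * heel_deg)

def egress_time_s(passengers: list[Passenger], heel_deg: float) -> float:
    """Time for the slowest passenger to reach an exit, ignoring congestion and fire effects."""
    factor = heel_speed_factor(heel_deg)
    return max(p.distance_to_exit_m / (p.base_speed_mps * factor) for p in passengers)

pax = [Passenger(35.0), Passenger(60.0, 1.0), Passenger(48.0, 0.9)]
print(f"egress time at  0 deg heel: {egress_time_s(pax, 0):.0f} s")
print(f"egress time at 15 deg heel: {egress_time_s(pax, 15):.0f} s")
```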