900 results for Gastrointestinal bleeding
Abstract:
We report the largest market basket survey of arsenic (As) in U.S. rice to date. Our findings show differences in transition-metal levels between polished and unpolished rice and geographical variation in As and selenium (Se) between rice processed in California and the South Central U.S. The mean and median As grain levels for the South Central U.S. were 0.30 and 0.27 µg As g-1, respectively, for 107 samples. Levels for California were 41% lower than the South Central U.S., with a mean of 0.17 µg As g-1 and a median of 0.16 µg As g-1 for 27 samples. The mean and median Se grain levels for the South Central U.S. were 0.19 µg Se g-1. Californian rice levels were lower, averaging only 0.08 and 0.06 µg Se g-1 for mean and median values, respectively. The difference between the two regions was significant for both As and Se (General Linear Model (GLM): As p < 0.001; Se p < 0.001). No statistically significant differences were observed in As or Se levels between polished and unpolished rice (GLM: As p = 0.213; Se p = 0.113). No significant differences in grain levels of manganese (Mn), cobalt (Co), copper (Cu), or zinc (Zn) were observed between California and the South Central U.S. Modeling arsenic intake for the U.S. population based on this survey shows that for certain groups (namely Hispanics, Asians, sufferers of celiac disease, and infants) dietary exposure to inorganic As from elevated levels in rice potentially exceeds the maximum intake of As from drinking water (based on consumption of 1 L of 0.01 mg L-1 inorganic As) and Californian state exposure limits. Further studies on the transformation of As in soil, grain As bioavailability in the human gastrointestinal tract, and grain elemental speciation trends are critical.
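The drinking-water benchmark in the abstract is simple arithmetic: 1 L/day at 0.01 mg L-1 gives 10 µg inorganic As per day. A minimal Python sketch of the comparison; the daily rice intake (100 g dry weight) and the inorganic fraction of grain As (50%) are illustrative assumptions, not figures from the survey:

```python
# The water-derived ceiling is fixed by the abstract: 1 L/day x 0.01 mg/L = 10 ug/day.
WATER_LIMIT_UG_PER_DAY = 10.0

def daily_inorganic_as_ug(rice_g_per_day, grain_as_ug_per_g, inorganic_fraction):
    """Estimated inorganic-As intake (ug/day) from rice alone."""
    return rice_g_per_day * grain_as_ug_per_g * inorganic_fraction

# Survey means: South Central U.S. 0.30 ug/g, California 0.17 ug/g.
# ASSUMPTIONS (not from the survey): 100 g dry rice/day; 50% of grain As inorganic.
south_central = daily_inorganic_as_ug(100, 0.30, 0.5)  # ~15 ug/day
california = daily_inorganic_as_ug(100, 0.17, 0.5)     # ~8.5 ug/day

print(south_central > WATER_LIMIT_UG_PER_DAY)  # True
print(california > WATER_LIMIT_UG_PER_DAY)     # False
```

On these assumptions, rice at the South Central mean grain level alone exceeds the water-derived ceiling, while rice at the Californian mean does not; real exposure modeling would use measured intake distributions and speciation data.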
Abstract:
Arsenic is accumulated by free-living small mammals, but there is little information on the resultant concentrations in different tissues other than liver and kidney. Such information is important because the severity of toxicological effects may be related to the amount of arsenic accumulated in specific organs, and the availability of arsenic to predators is, in part, dependent on which tissues accumulate arsenic. The objective of this study was to quantify the arsenic concentrations and the percentage of the total body burden (%TBB) accumulated in different body tissues of free-living small mammals and to determine how these factors varied with severity of habitat contamination. Arsenic concentrations were measured in various tissues of wood mice (Apodemus sylvaticus) and bank voles (Clethrionomys glareolus) from a range of arsenic-contaminated sites in southwest Britain. Arsenic concentrations in the gastrointestinal (GI) tract (including contents), liver, kidneys, spleen, lung, femur, and fur of both species varied significantly between sites and were higher in mice and voles from heavily contaminated areas. Heart and brain arsenic concentrations did not vary with degree of environmental contamination. The GI tract and excised carcass contained roughly equal amounts of arsenic and, in sum, comprised 75-85% of the TBB on uncontaminated sites and 90-99% on contaminated sites. Although the excised carcass contains about half of the TBB, its importance in food-chain transfer of arsenic to predators may depend on the bioavailability of arsenic sequestered in fur. In contrast, the GI tract and its contents, provided that it is consumed, will always be a major transfer pathway for arsenic to predators, regardless of the severity of habitat contamination.
Abstract:
OBJECTIVES: The gastrointestinal microbiota is considered important in inflammatory bowel disease (IBD) pathogenesis. Discoveries from established disease cohorts report reduced bacterial diversity, changes in bacterial composition, and a protective role for Faecalibacterium prausnitzii in Crohn's disease (CD). The majority of studies to date are, however, potentially confounded by the effect of treatment and a reliance on established rather than de novo disease.
METHODS: Microbial changes at diagnosis were examined by biopsying the colonic mucosa of 37 children: 25 with newly presenting, untreated IBD with active colitis (13 CD and 12 ulcerative colitis (UC)), and 12 pediatric controls with a macroscopically and microscopically normal colon. We utilized a dual-methodology approach with pyrosequencing (threshold >10,000 reads) and confirmatory real-time PCR (RT-PCR).
RESULTS: Threshold pyrosequencing output was obtained on 34 subjects (11 CD, 11 UC, 12 controls). No significant changes were noted at phylum level among the Bacteroidetes, Firmicutes, or Proteobacteria. A significant reduction in bacterial alpha-diversity was noted in CD vs. controls by three methods (Shannon, Simpson, and phylogenetic diversity) but not in UC vs. controls. An increase in Faecalibacterium was observed in CD compared with controls by pyrosequencing (mean 16.7% vs. 9.1% of reads, P = 0.02) and replicated by specific F. prausnitzii RT-PCR (36.0% vs. 19.0% of total bacteria, P = 0.02). No disease-specific clustering was evident on principal components analysis.
CONCLUSIONS: Our results offer a comprehensive examination of the IBD mucosal microbiota at diagnosis, unaffected by therapeutic confounders or changes over time. Our results challenge the current model of a protective role for F. prausnitzii in CD, suggesting a more dynamic role for this organism than previously described.
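The alpha-diversity indices named in the results (Shannon and Simpson) reduce to short formulas over per-taxon read counts: H' = -Σ p_i ln p_i and D = 1 - Σ p_i². A stand-alone sketch with invented read counts for illustration:

```python
import math

def shannon(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero reads."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson(counts):
    """Simpson diversity 1 - sum(p_i**2): chance two random reads differ in taxon."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

# An even community scores higher than a skewed one under both indices,
# which is the sense in which CD mucosa showed "reduced alpha-diversity".
even = [25, 25, 25, 25]
skewed = [85, 5, 5, 5]
print(shannon(even) > shannon(skewed))  # True
print(simpson(even) > simpson(skewed))  # True
```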
Abstract:
Common variable immunodeficiency (CVID) is a primary immunodeficiency characterized by hypogammaglobulinaemia and antibody deficiency to both T-dependent and T-independent antigens. Patients suffer from recurrent sinopulmonary infections, mostly caused by Streptococcus pneumoniae and Haemophilus influenzae, but also from gastrointestinal or autoimmune symptoms. Their response to vaccination is poor or absent. In this study we investigated B cell activation induced by the TLR9-specific ligand (CpG-ODN) and bacterial extracts from S. pneumoniae and H. influenzae known to stimulate several TLRs. We found that B cells from CVID patients express lower levels of CD86 after stimulation with CpG-ODN, S. pneumoniae and H. influenzae extracts in combination with anti-IgM antibody and also display a lower proliferative index when stimulated with bacterial extracts. Our results point to a broad TLR signalling defect in B lymphocytes from CVID patients that may be related to the hypogammaglobulinaemia and poor response to vaccination characteristic of these patients.
Abstract:
While the influence of temperature and moisture on the free-living stages of gastrointestinal nematodes has been described in detail, and evidence for global climate change is mounting, there have been only a few attempts to relate altered incidence or seasonal patterns of disease to climate change. Studies of this type have been completed for England, Scotland and Wales, but not for Northern Ireland (NI). Here we present an analysis of veterinary diagnostic data that relates three categories of gastrointestinal nematode infection in sheep to historical meteorological data for NI. The infections are: trichostrongylosis/teladorsagiosis (Teladorsagia/Trichostrongylus), strongyloidosis and nematodirosis. This study aims to provide a baseline for future climate change analyses and to provide basic information for the development of nematode control programmes. After identifying and evaluating possible sources of bias, climate change was found to be the most likely explanation for the observed patterns of change in parasite epidemiology, although other hypotheses could not be refuted. Seasonal rates of diagnosis showed a uniform year-round distribution for Teladorsagia and Trichostrongylus infections, suggesting consistent levels of larval survival throughout the year and extension of the traditionally expected seasonal transmission windows. Nematodirosis showed a higher level of autumn than spring infection, suggesting that suitable conditions for egg and larval development occurred after the spring infection period. Differences between regions within the Province were shown for strongyloidosis, with peaks of infection falling in the period September-November. For all three infection categories (trichostrongylosis/teladorsagiosis, strongyloidosis and nematodirosis), significant differences in the rates of diagnosis, and in the seasonality of disease, were identified between regions. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
A questionnaire to obtain information on nematode control practices and sheep management was sent to over 1000 farmers in Northern Ireland. Replies were received from 305 flock owners, and data from 252 of them were analysed. Farms were divided into lowland and upland areas. Mean pasture sizes and stocking rates on lowland and upland farms were 59.5 hectares and 6.99 sheep/hectare, and 62.9 hectares and 10.01 sheep/hectare, respectively. Mean drenching rates for lambs and adults were 2.33 and 2.44, respectively, in lowland flocks and 2.73 and 2.71, respectively, in upland flocks. Between 2008 and 2011, the most frequently identified compounds in use were benzimidazoles and moxidectin in lowland flocks, and benzimidazoles and avermectins in upland flocks. Over the same period the most frequently identified commercial formulations were Tramazole®, Panacur® and Allverm® (white drenches), Levacide® (yellow drench), Oramec® (clear drench; avermectin), Cydectin® (clear drench; moxidectin) and Monepantel® (orange drench).
Most respondents (56.35%) treated their lambs at weaning, and the most common time to treat ewes was pre-mating (67.86% of respondents).
The results of the questionnaire survey revealed that lowland annual drench frequencies were 2.33 and 2.44 in lambs and ewes, respectively, although drench frequencies were higher in upland flocks: 2.73 and 2.71 for lambs and ewes, respectively.
Annual drench rotation was practiced by 43.96% of flock owners, but whether this was true rotation or pseudo-rotation (i.e., substitution of one anthelmintic product by another product belonging to the same chemical group of anthelmintics) could not be explicitly determined.
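The rotation/pseudo-rotation distinction above is mechanical once each product is mapped to its chemical group. A small sketch using the product-group pairings the abstract itself lists; the exact mapping is reconstructed for illustration:

```python
# Product -> chemical group, reconstructed from the drench colours/compounds
# named in the abstract (illustrative, not an authoritative formulary).
GROUP = {
    "Tramazole": "benzimidazole (white)",
    "Panacur": "benzimidazole (white)",
    "Allverm": "benzimidazole (white)",
    "Levacide": "levamisole (yellow)",
    "Oramec": "macrocyclic lactone (clear)",
    "Cydectin": "macrocyclic lactone (clear)",
}

def classify_switch(previous, current):
    """'rotation' if the chemical group changes; switching products within
    one group is only 'pseudo-rotation'."""
    if GROUP[previous] == GROUP[current]:
        return "pseudo-rotation"
    return "rotation"

print(classify_switch("Tramazole", "Panacur"))  # pseudo-rotation
print(classify_switch("Panacur", "Levacide"))   # rotation
```

A questionnaire that recorded product names but not chemical groups cannot make this distinction, which is why the survey could not determine it explicitly.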
Abstract:
Purpose: Polymorphisms in the vitamin D receptor (VDR) gene may be of etiological importance in determining cancer risk. The aim of this study was to assess the association between common VDR gene polymorphisms and esophageal adenocarcinoma (EAC) risk in an all-Ireland population-based case-control study. Methods: EAC cases and frequency-matched controls by age and gender recruited between March 2002 and December 2004 throughout Ireland were included. Participants were interviewed, and a blood sample collected for DNA extraction. Twenty-seven single nucleotide polymorphisms in the VDR gene were genotyped using Sequenom or TaqMan assays while the poly(A) microsatellite was genotyped by fluorescent fragment analysis. Unconditional logistic regression was applied to assess the association between VDR polymorphisms and EAC risk. Results: A total of 224 cases of EAC and 256 controls were involved in analyses. After adjustment for potential confounders, TT homozygotes at rs2238139 and rs2107301 had significantly reduced risks of EAC compared with CC homozygotes. In contrast, SS alleles of the poly(A) microsatellite had significantly elevated risks of EAC compared with SL/LL alleles. However, following permutation analyses to adjust for multiple comparisons, no significant associations were observed between any VDR gene polymorphism and EAC risk. Conclusions: VDR gene polymorphisms were not significantly associated with EAC development in this Irish population. Confirmation is required from larger studies. © Springer Science+Business Media, LLC 2011.
Abstract:
Inhibition of histone deacetylases may be an important target in patients with myeloproliferative neoplasms. This investigator-initiated, non-randomized, open-label phase II multi-centre study included 63 patients (19 essential thrombocythaemia, 44 polycythaemia vera) from 15 centres. The primary objective was to evaluate if vorinostat was followed by a decline in clonal myeloproliferation as defined by European Leukaemia Net. Thirty patients (48%) completed the intervention period (24 weeks of therapy). An intention-to-treat response rate of 35% was identified. Pruritus was resolved [19% to 0% (P = 0·06)] and the prevalence of splenomegaly was lowered from 50% to 27% (P = 0·03). Sixty-five per cent of the patients experienced a decrease in JAK2 V617F allele burden (P = 0·006). Thirty-three patients (52% of patients) discontinued study drug before end of intervention due to adverse events (28 patients) or lack of response (5 patients). In conclusion, vorinostat showed effectiveness by normalizing elevated leucocyte and platelet counts, resolving pruritus and significantly reducing splenomegaly. However, vorinostat was associated with significant side effects resulting in a high discontinuation rate. A lower dose of vorinostat in combination with conventional and/or novel targeted therapies may be warranted in future studies.
Abstract:
Comprehensive history-taking and clinical examination skills are examples of role development for a stoma care nurse specialist. Comprehensive history-taking is a thorough exploration of a patient's presenting complaint and the gathering of subjective information, while clinical examination is the gathering of objective information from a head-to-toe assessment or a focused assessment of a particular body system. This paper demonstrates the application of comprehensive history-taking and gastrointestinal clinical examination skills by the stoma care nurse in a clinical community setting, and explores their advantages and disadvantages in stoma care practice.
Abstract:
Background: Depression in palliative care patients is important because of its intrinsic burden and association with elevated physical symptoms, reduced immunity and increased mortality risk. Identifying risk factors associated with depression can enable clinicians to more readily diagnose it, which is important since depression is treatable. The purpose of this cross-sectional study was to determine the prevalence of depressive symptoms and risk factors associated with them in a large sample of palliative home care patients.
Methods: The data come from interRAI Palliative Care assessments completed between 2006 and 2012. The sample (n = 5144) consists of adults residing in Ontario (Canada), receiving home care services, classified as palliative, and not experiencing significant cognitive impairment. Logistic regression identified the risk factors associated with depressive symptoms. The dependent variable was the Depression Rating Scale (DRS) and the independent variables were functional indicators from the interRAI assessment and other variables identified in the literature. We examined the results of the complete case and multiple imputation analyses, and found them to be similar.
Results: The prevalence of depressive symptoms was 9.8%. The risk factors associated with depressive symptoms were (pooled estimates, multiple imputation): low life satisfaction (OR = 3.01 [CI = 2.37-3.82]), severe and moderate sleep disorders (2.56 [2.05-3.19] and 1.56 [1.18-2.06]), health instability (2.12 [1.42-3.18]), caregiver distress (2.01 [1.62-2.51]), daily pain (1.73 [1.35-2.22]), cognitive impairment (1.45 [1.13-1.87]), being female (1.37 [1.11-1.68]), and gastrointestinal symptoms (1.27 [1.03-1.55]). Life satisfaction mediated the effect of prognostic awareness on depressive symptoms.
Conclusions: The prevalence of depressive symptoms in our study was close to the median of 10-20% reported in the palliative care literature, suggesting they are present but by no means inevitable in palliative patients. Most of the factors associated with depressive symptoms in our study are amenable to clinical intervention and often targeted in palliative care programs. Designing interventions to address them can be challenging, however, requiring careful attention to patient preferences, the spectrum of comorbid conditions they face, and their social supports. Life satisfaction was one of the strongest factors associated with depressive symptoms in our study, and is likely to be among the most challenging to address.
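Each pooled estimate above is a logistic-regression coefficient exponentiated into an odds ratio with a Wald 95% CI. A stand-alone sketch; the coefficient and standard error below are back-calculated so the output matches the low-life-satisfaction estimate (3.01 [2.37-3.82]) and are not the study's actual model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald 95% CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative inputs only (back-calculated, not from the paper's output).
or_, lo, hi = odds_ratio_ci(1.102, 0.121)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 3.01 2.37 3.82
```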
Abstract:
Objective Conventional surgical management of prolapsing haemorrhoids is by excisional haemorrhoidectomy. Postoperative pain has restricted the application of such procedures in the day case setting. These operations remain associated with a period of restricted activity. The use of circular stapling devices as an alternative to the excisional approach in the management of haemorrhoids has been described. This study reports our experience of stapled haemorrhoidopexy as a day case procedure.
Methods Patients with third or fourth degree haemorrhoids were eligible for the procedure. Patients were considered suitable candidates for day case surgery based on conventional parameters. Symptoms were assessed using a previously validated symptom severity rating score. Stapled haemorrhoidopexy was carried out using a circular stapling device. Pain scores were obtained prior to discharge. Patients were admitted if pain was uncontrolled despite oral analgesia. Symptoms were re-scored at six-week follow-up.
Results Over a 70-month period, 168 consecutive stapled haemorrhoidopexies were performed or directly supervised by one consultant colorectal surgeon. One hundred and ten (65%) patients were considered appropriate candidates for day case surgery by conventional criteria. Ninety-six (87.3%) patients successfully underwent stapled haemorrhoidopexy on a day case basis. Fourteen (12.7%) patients required admission on the day of surgery (five for early postoperative bleeding, four for pain necessitating continuing opiate analgesia, two for urinary retention and three for surgery performed late in the day). Six (5%) patients were re-admitted postoperatively: four for pain relief and two because of urinary retention. Of the day case patients, 91 (82.7%) and 56 (50.9%) had been seen for six-week and six-month review, respectively, at the time of analysis. Symptom scores were 6 (pre-operatively) vs 0 (postoperatively) (P < 0.01). Of the patients reviewed at six weeks, 76/91 (83.5%) were asymptomatic.
Conclusion Stapled haemorrhoidopexy is a safe and effective procedure that can be carried out on selected patients on a day case basis. Complications are of a similar nature to excisional haemorrhoidectomy.
Abstract:
This research investigates the relationship between elevated trace elements in soils, stream sediments and stream water and the prevalence of Chronic Kidney Disease (CKD). The study combines several datasets: the UK Renal Registry (UKRR) report on patients with renal diseases requiring treatment, including Renal Replacement Therapy (RRT); the soil geochemical dataset for Northern Ireland provided by the Tellus Survey, Geological Survey of Northern Ireland (GSNI); and measurements of the bioaccessibility of Potentially Toxic Elements (PTEs) in soil samples obtained using the Unified BARGE Method (UBM). The motivation derives from the UKRR report, which highlights regional variation in incidence rates of renal-impaired patients, with cases of unknown aetiology. Studies suggest that a potential cause of the large variation and uncertain aetiology is underlying environmental factors such as the oral bioaccessibility of trace elements in the gastrointestinal tract.
As previous research indicates that long-term exposure is related to environmental factors, Northern Ireland is ideally placed for this research, as people traditionally live in the same location for long periods of time. Exploratory data analysis and multivariate analyses are used to examine the soil, stream sediment and stream water geochemistry data for a range of key elements, including arsenic, lead, cadmium and mercury, identified from a review of previous renal disease literature. The spatial prevalence of patients with long-term CKD is analysed on an area basis. Further work includes cluster analysis to detect areas of low or high incidence of CKD that are significantly correlated in space, Geographically Weighted Regression (GWR) and Poisson kriging to examine locally varying relationships between elevated concentrations of PTEs and the prevalence of CKD.
Abstract:
Background: Chronic antigenic stimulation may initiate non-Hodgkin (NHL) and Hodgkin lymphoma (HL) development. Antecedent, infection-related conditions have been associated, but evidence by lymphoproliferative subtype is limited. Methods: From the US SEER-Medicare database, 44 191 NHL, 1832 HL and 200 000 population-based controls, frequency-matched to all SEER cancer cases, were selected. Logistic regression models, adjusted for potential confounders, compared infection-related conditions in controls with HL and NHL patients and by the NHL subtypes diffuse large B-cell, T-cell, follicular and marginal zone lymphoma (MZL). Stratification by race was undertaken. Results: Respiratory tract infections were broadly associated with NHL, particularly MZL. Skin infections were associated with a 15-28% increased risk of NHL and with most NHL subtypes, particularly cellulitis with T-cell lymphoma (OR 1.36, 95% CI 1.24-1.49). Only herpes zoster remained associated with HL following Bonferroni correction (OR 1.55, 95% CI 1.28-1.87). Gastrointestinal and urinary tract infections were not strongly associated with NHL or HL. In stratified analyses by race, sinusitis, pharyngitis, bronchitis and cellulitis showed stronger associations with total NHL in blacks than whites (P < 0.001). Conclusions: Infections may contribute to the aetiologic pathway and/or be markers of underlying immune modulation. Precise elucidation of these mechanisms may provide important clues for understanding how immune disturbance contributes to lymphoma.
Abstract:
The simultaneous delivery of multiple cancer drugs in combination therapies to achieve optimal therapeutic effects in patients can be challenging. This study investigated whether co-encapsulation of the BH3-mimetic ABT-737 and the topoisomerase I inhibitor camptothecin (CPT) in PEGylated polymeric nanoparticles (NPs) was a viable strategy for overcoming their clinical limitations and to deliver both compounds at optimal ratios. We found that thrombocytopenia induced by exposure to ABT-737 was diminished through its encapsulation in NPs. Similarly, CPT-associated leukopenia and gastrointestinal toxicity were reduced compared with the administration of free CPT. In addition to the reduction of dose-limiting side effects, the co-encapsulation of both anticancer compounds in a single NP produced synergistic induction of apoptosis in both in vitro and in vivo colorectal cancer models. This strategy may widen the therapeutic window of these and other drugs and may enhance the clinical efficacy of synergistic drug combinations.
Abstract:
All mammals lose their ability to produce lactase (β-galactosidase), the enzyme that cleaves lactose into galactose and glucose, after weaning. The prevalence of lactase deficiency (LD) ranges from 2-15% among northern Europeans to nearly 100% among Asians. Following lactose consumption, people with LD often experience gastrointestinal symptoms such as abdominal pain, bowel distension, cramps and flatulence, or even systemic problems such as headache, loss of concentration and muscle pain. These symptoms vary depending on the amount of lactose ingested, the type of food and the degree of intolerance. Although those affected can avoid dairy products, in doing so they lose a readily available source of calcium and protein. In this work, gels obtained by complexation of Tetronic 90R4 with α-cyclodextrin and loaded with β-galactosidase are proposed as a way to administer the enzyme immediately before or with a lactose-containing meal. Both molecules are biocompatible, can form gels in situ, and show sustained erosion kinetics in aqueous media. The complex was characterized by FTIR, which evidenced an inclusion complex between the polyethylene oxide block and α-cyclodextrin. Release profiles of β-galactosidase from two different matrices (in situ gels and tablets) were obtained, and the influence of the percentage of Tetronic in media of different pH was evaluated. No differences were observed in the release rate from the gel matrices at pH 6 (t50 = 105 min). In the case of the tablets, however, the kinetics were faster and a greater amount of 90R4 was released (25%, t50 = 40-50 min). The amount of enzyme released was also higher for mixtures with 25% Tetronic. Using suitable mathematical models, the corresponding kinetic parameters have been calculated.
In all cases, the release data fit quite well to the Peppas–Sahlin model equation, indicating that the release of β-galactosidase is governed by a combination of diffusion and erosion processes. It has been observed that the diffusion mechanism prevails over erosion during the first 50 minutes, followed by continued release of the enzyme due to the disintegration of the matrix.
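The Peppas-Sahlin equation expresses fractional release as Q_t/Q_inf = k1·t^m + k2·t^(2m), attributing the first term to Fickian diffusion and the second to matrix erosion/relaxation. A minimal sketch of the model and of the diffusion-versus-erosion balance it implies; the parameter values are illustrative, not fits to this paper's data:

```python
def peppas_sahlin(t, k1, k2, m):
    """Fractional release Q_t/Q_inf = k1*t**m (Fickian diffusion)
    + k2*t**(2*m) (erosion/chain relaxation)."""
    return k1 * t**m + k2 * t**(2 * m)

def diffusion_fraction(t, k1, k2, m):
    """Share of release due to diffusion at time t: 1 / (1 + (k2/k1)*t**m)."""
    return 1.0 / (1.0 + (k2 / k1) * t**m)

# Illustrative parameters (NOT fitted to the paper's data): diffusion dominates
# early and the erosion term grows later, mirroring the behaviour described
# for the beta-galactosidase matrices.
k1, k2, m = 0.05, 0.0005, 0.8
early = diffusion_fraction(10, k1, k2, m)
late = diffusion_fraction(120, k1, k2, m)
print(early > 0.5)   # True: diffusion-dominated at early times
print(late < early)  # True: erosion's share grows as the matrix disintegrates
```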