967 results for somatic cell count in cows
Abstract:
Background: Most adults infected with HIV achieve viral suppression within a year of starting combination antiretroviral therapy (cART). It is important to understand the risk of AIDS events or death for patients with a suppressed viral load. Methods and Findings: Using data from the Collaboration of Observational HIV Epidemiological Research Europe (2010 merger), we assessed the risk of a new AIDS-defining event or death in successfully treated patients. We accumulated episodes of viral suppression for each patient while on cART, each episode beginning with the second of two consecutive plasma viral load measurements <50 copies/ml and ending with either a measurement >500 copies/ml, the first of two consecutive measurements between 50–500 copies/ml, cART interruption, or administrative censoring. We used stratified multivariate Cox models to estimate the association between time-updated CD4 cell count and a new AIDS event or death, or death alone. 75,336 patients contributed 104,265 suppression episodes and were suppressed while on cART for a median of 2.7 years. The mortality rate was 4.8 per 1,000 years of viral suppression. A higher CD4 cell count was always associated with a reduced risk of a new AIDS event or death, with hazard ratios per 100 cells/µl (95% CI) of 0.35 (0.30–0.40) for counts <200 cells/µl, 0.81 (0.71–0.92) for counts 200 to <350 cells/µl, 0.74 (0.66–0.83) for counts 350 to <500 cells/µl, and 0.96 (0.92–0.99) for counts ≥500 cells/µl. A higher CD4 cell count became even more beneficial over time for patients with CD4 cell counts <200 cells/µl. Conclusions: Despite the low mortality rate, the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression. A higher CD4 cell count was associated with the greatest benefit for patients with a CD4 cell count <200 cells/µl, but still some slight benefit for those with a CD4 cell count ≥500 cells/µl.
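The episode definition in this abstract is effectively an algorithm over a patient's chronological viral-load series. A minimal sketch of that rule (illustrative only, not the authors' code; measurement dates, cART interruption, and per-patient bookkeeping are omitted):

```python
def suppression_episodes(vl):
    """Segment a chronological series of plasma viral-load values
    (copies/ml) into suppression episodes, per the rule above:
    an episode begins with the second of two consecutive values <50
    and ends with a value >500, with the first of two consecutive
    values in 50-500, or with administrative censoring (series end).
    Returns (start_index, end_index) pairs."""
    episodes, i, n = [], 1, len(vl)
    while i < n:
        if vl[i - 1] < 50 and vl[i] < 50:      # second of two consecutive <50
            start, end = i, n - 1              # default end: censored at last value
            j = i + 1
            while j < n:
                if vl[j] > 500:                # one value >500 ends the episode
                    end = j
                    break
                if 50 <= vl[j] <= 500 and j + 1 < n and 50 <= vl[j + 1] <= 500:
                    end = j                    # first of two consecutive 50-500
                    break
                j += 1
            episodes.append((start, end))
            i = end + 1
        else:
            i +=  1
    return episodes
```

Note that a single "blip" of 50–500 copies/ml followed by a value <50 does not terminate an episode under this rule.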
Abstract:
Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected for all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors and by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf, for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference regarding the daily periodicity of milk cortisol concentrations observed between the AMSp and AMSf needs further investigation.
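The reported SCC effects are multiplicative per unit of ln(SCC/1,000). As a rough illustration of what such a factor implies (factors taken from the abstract; the functional form, factor ** ln(SCC/1,000), is an assumption about how the model scales):

```python
import math

def scc_multiplier(scc, factor):
    """Multiplicative change in predicted milk cortisol for a somatic
    cell count `scc` (cells/ml), relative to SCC = 1,000 cells/ml,
    assuming the reported factor applies once per unit of
    ln(SCC/1,000), i.e. factor ** ln(scc / 1000)."""
    return factor ** math.log(scc / 1000)

# At SCC = 1,000 cells/ml the multiplier is 1 by construction.
# With the AMSf factor of 1.161, an SCC of 100,000 cells/ml roughly
# doubles the predicted cortisol, while the parlor factor of 0.915
# implies a decrease over the same SCC range.
```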
Abstract:
The objective was to compare the prevalence of subclinical mastitis (SM) and of udder pathogens in 60 Swiss organic (OP) and 60 conventional production systems (CP). Cows (n=970) were studied for SM prevalence and udder pathogens at a median of 31 d and 102 d post partum. Cows showing a ≥1+ positive California Mastitis Test (CMT) in at least one quarter were considered to have SM. Cow-level prevalences of SM for visits at 31 d and 102 d post partum (39% and 40% in OP and 34% and 35% in CP) were similar, but quarter-level prevalences of SM were higher (P<0.02) in OP than CP (15% and 18% in OP and 12% and 15% in CP). Median somatic cell counts in milk at 31 d post partum were higher (P<0.05) in OP than CP cows (43,000 and 28,000 cells/ml, respectively), but were similar at 102 d post partum in OP and CP cows (45,000 and 38,000 cells/ml, respectively). In milk samples from quarters showing a CMT reaction ≥2+, the prevalences of coagulase-negative staphylococci were lower (P<0.05) at 102 d post partum, whereas prevalences of non-agalactiae streptococci were higher (P<0.05) in OP than in CP cows at 31 d and 102 d post partum. In conclusion, under Swiss conditions, subclinical mastitis is a greater problem in organic than in conventional production systems, but differences are not marked.
Abstract:
BACKGROUND: CD4+ T-cell recovery in patients with continuous suppression of plasma HIV-1 viral load (VL) is highly variable. This study aimed to identify predictive factors for long-term CD4+ T-cell increase in treatment-naive patients starting combination antiretroviral therapy (cART). METHODS: Treatment-naive patients in the Swiss HIV Cohort Study reaching two VL measurements <50 copies/ml >3 months apart during the 1st year of cART were included (n=1816 patients). We studied CD4+ T-cell dynamics until the end of suppression or up to 5 years, subdivided into three periods: 1st year, years 2-3, and years 4-5 of suppression. Multiple median regression adjusted for repeated CD4+ T-cell measurements was used to study the dependence of CD4+ T-cell slopes on clinical covariates and drug classes. RESULTS: Median CD4+ T-cell increases following VL suppression were 87, 52 and 19 cells/µl per year in the three periods. In the multiple regression model, median CD4+ T-cell increases over all three periods were significantly higher for female gender, lower age, higher VL at cART start, CD4+ T-cell count <650 cells/µl at the start of the period, and low CD4+ T-cell increase in the previous period. Patients on tenofovir showed significantly lower CD4+ T-cell increases compared with stavudine. CONCLUSIONS: In our observational study, long-term CD4+ T-cell increase in drug-naive patients with suppressed VL was higher in regimens without tenofovir. The clinical relevance of these findings must be confirmed in, ideally, clinical trials or large, collaborative cohort projects but could influence treatment of older patients and those starting cART at low CD4+ T-cell levels.
Abstract:
BACKGROUND: Estimates of the decrease in CD4+ cell counts in untreated patients with human immunodeficiency virus (HIV) infection are important for patient care and public health. We analyzed CD4+ cell count decreases in the Cape Town AIDS Cohort and the Swiss HIV Cohort Study. METHODS: We used mixed-effects models and joint models that allowed for the correlation between CD4+ cell count decreases and survival, and stratified analyses by the initial cell count (50-199, 200-349, 350-499, and 500-750 cells/µL). Results are presented as the mean decrease in CD4+ cell count with 95% confidence intervals (CIs) during the first year after the initial CD4+ cell count. RESULTS: A total of 784 South African (629 nonwhite) and 2030 Swiss (218 nonwhite) patients with HIV infection contributed 13,388 CD4+ cell counts. Decreases in CD4+ cell count were steeper in white patients, patients with higher initial CD4+ cell counts, and older patients. Decreases ranged from a mean of 38 cells/µL (95% CI, 24-54 cells/µL) in nonwhite patients from the Swiss HIV Cohort Study 15-39 years of age with an initial CD4+ cell count of 200-349 cells/µL to a mean of 210 cells/µL (95% CI, 143-268 cells/µL) in white patients in the Cape Town AIDS Cohort ≥40 years of age with an initial CD4+ cell count of 500-750 cells/µL. CONCLUSIONS: Among both patients from Switzerland and patients from South Africa, CD4+ cell count decreases were greater in white patients with HIV infection than they were in nonwhite patients with HIV infection.
Abstract:
Although associated with adverse outcomes in other cardiovascular diseases, the prognostic value of an elevated white blood cell (WBC) count, a marker of inflammation and hypercoagulability, is uncertain in patients with pulmonary embolism (PE). We therefore sought to assess the prognostic impact of the WBC count in a large, state-wide retrospective cohort of patients with PE. We evaluated 14,228 patient discharges with a primary diagnosis of PE from 186 hospitals in Pennsylvania. We used random-intercept logistic regression to assess the independent association between WBC count at the time of presentation and mortality and hospital readmission within 30 days, adjusting for patient and hospital characteristics. Patients with an admission WBC count <5.0, 5.0-7.8, 7.9-9.8, 9.9-12.6, and >12.6 × 10^9/L had a cumulative 30-day mortality of 10.9%, 6.2%, 5.4%, 8.3%, and 16.3% (P < 0.001), and a readmission rate of 17.6%, 11.9%, 10.9%, 11.5%, and 15.0%, respectively (P < 0.001). Compared with patients with a WBC count of 7.9-9.8 × 10^9/L, the adjusted odds of 30-day mortality were significantly greater for patients with a WBC count <5.0 × 10^9/L (odds ratio [OR] 1.52, 95% confidence interval [CI] 1.14-2.03), 9.9-12.6 × 10^9/L (OR 1.55, 95% CI 1.26-1.91), or >12.6 × 10^9/L (OR 2.22, 95% CI 1.83-2.69). The adjusted odds of readmission were also significantly increased for patients with a WBC count <5.0 × 10^9/L (OR 1.34, 95% CI 1.07-1.68) or >12.6 × 10^9/L (OR 1.29, 95% CI 1.10-1.51). In patients presenting with PE, the WBC count is an independent predictor of short-term mortality and hospital readmission.
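The WBC band boundaries in this abstract define a simple lookup; a sketch pairing each admission-WBC band with the crude 30-day mortality quoted above (illustrative only; cut-points are treated at the one-decimal precision reported):

```python
def crude_30d_mortality(wbc):
    """Crude 30-day mortality (%) for an admission WBC count given in
    x 10^9/L, using the five bands and rates quoted above:
    <5.0, 5.0-7.8, 7.9-9.8, 9.9-12.6, >12.6."""
    bands = [  # (exclusive upper bound, mortality %)
        (5.0, 10.9),
        (7.9, 6.2),
        (9.9, 5.4),
        (12.7, 8.3),
        (float("inf"), 16.3),
    ]
    for upper, mortality in bands:
        if wbc < upper:
            return mortality

# Note the J-shaped pattern: mortality is lowest in the middle band
# (7.9-9.8 x 10^9/L) and elevated at both the low and high extremes.
```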
Abstract:
The provision of quality colostrum with a high concentration of immunoglobulins is critical for newborn calf health. Because the overall immunoglobulin concentration of first colostrum may be too low to effectively reduce the risk of newborn infections, we tested equivalent milking fractions of colostrum for possible IgG differences. The objective of this study was to determine whether the fractional composition of colostrum changes during the course of milking, with a focus on immunoglobulins. Twenty-four Holstein and Simmental cows were milked (first colostrum) within 4 h after calving. The colostrum of 1 gland per animal was collected in 4 percentage fractions over the course of milking: 0 to 25%, 25 to 50%, 50 to 75%, and 75 to 100%. The IgG concentration among the various fractions did not change in any significant pattern. Concentrations of protein, casein, and lactose and the somatic cell count remained the same or exhibited only minor changes during fractional milking of colostrum. We determined that no benefit exists in feeding any particular fraction of colostrum to the newborn.
Abstract:
BACKGROUND Prophylactic measures are key components of dairy herd mastitis control programs, but some are only relevant in specific housing systems. To assess the association between management practices and mastitis incidence, we analyzed data collected in 2011 by a survey among 979 randomly selected Swiss dairy farms, together with information from the regular test-day recordings of 680 of these farms. RESULTS The median incidence of farmer-reported clinical mastitis (ICM) was 11.6 (mean 14.7) cases per 100 cows per year. The median annual proportion of milk samples with a composite somatic cell count above 200,000 cells/ml (PSCC) was 16.1% (mean 17.3%). A multivariable negative binomial regression model was fitted for each of the mastitis indicators, separately for farms with tie-stall and free-stall housing systems, to study the effect of management practices other than housing system on ICM and PSCC events (above 200,000 cells/ml). The results differed substantially by housing system and outcome. In tie-stall systems, clinical mastitis incidence was mainly affected by region (mountainous production zone; incidence rate ratio (IRR) = 0.73), the dairy herd replacement system (IRR = 1.27), and the farmer's age (IRR = 0.81). The proportion of high SCC was mainly associated with dry-cow udder controls (IRR = 0.67), clean bedding material at calving (IRR = 1.72), using total merit values to select bulls (IRR = 1.57), and body condition scoring (IRR = 0.74). In free-stall systems, clinical mastitis was mainly associated with stall climate/temperature (IRR = 1.65), comfort mats as a resting surface (IRR = 0.75), and the absence of a feed analysis (IRR = 1.18). The proportion of high SCC was only associated with hand and arm cleaning after calving (IRR = 0.81) and using beef production values to select bulls (IRR = 0.66). CONCLUSIONS There were substantial differences in the risk factors identified in the four models. Some of the factors were in agreement with the reported literature, while others were not. This highlights the multifactorial nature of the disease and the differences in the risks for the two mastitis manifestations. Attempting to understand these multifactorial associations within larger management groups continues to play an important role in mastitis control programs.
Abstract:
The objective of this study was to describe udder health management in Swiss dairy herds with udder health problems. One hundred dairy herds with a yield-corrected somatic cell count of 200,000 to 300,000 cells/ml during 2010 were selected. Data concerning farm structure, housing system, milking technique, milking procedures, and dry-cow and mastitis management were collected during farm visits between September and December 2011. In addition, quarter milk samples were collected for bacteriological culturing from cows with a composite somatic cell count ≥150,000 cells/ml. The highest quarter-level prevalence, 12.3 %, was found for C. bovis. Eighty-two percent of the pipeline milking machines in tie-stalls and 88 % of the milking parlours fulfilled the criteria for the vacuum drop, and only 74 % of the pipeline milking machines met the criteria of the 10-l water test. Eighty-five percent of the farms changed their milk liners too late. The correct order of teat preparation before cluster attachment was carried out by only 37 % of the farmers. With these results, Swiss dairy farmers and herd health veterinarians can be alerted to common mistakes in mastitis management. The data will be used for future information campaigns to improve udder health on Swiss dairy farms.
Abstract:
OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (HIV-RNA >200 copies/ml) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
Abstract:
Effective activation of a recipient oocyte and its compatibility with the nuclear donor are critical to successful nuclear reprogramming during nuclear transfer. We designed a series of experiments using various activation methods to determine the optimum activation efficiency of bovine oocytes. We then performed nuclear transfer (NT) of embryonic and somatic cells into cytoplasts presumably at G1/S phase (with prior activation) or at metaphase II (MII, without prior activation). Oocytes at 24 hr of maturation in vitro were activated with various combinations of calcium ionophore A23187 (A187) (5 µM, 5 min), electric pulse (EP), ethanol (7%, 7 min), and cycloheximide (CHX) (10 µg/ml, 6 hr), and then cultured in cytochalasin D (CD) for a total of 18 hr. Through a series of experiments (Exp. 1-4), an improved activation protocol (A187/EP/CHX/CD) was identified and used for comparison of the NT efficiency of embryonic versus somatic donor cells (Exp. 5). When embryonic cells from morulae and blastocysts (BL) were used as nuclear donors, a significantly higher rate of blastocyst development from cloned embryos was obtained with G1/S-phase cytoplasts than with MII-phase cytoplasts (36 vs. 11%, P < 0.05). In contrast, when skin fibroblasts were used as donor cells, the use of an MII cytoplast (vs. G1/S phase) was imperative for blastocyst development (30 vs. 6%, P < 0.05). Differential staining showed that parthenogenetic, embryonic, and somatic cloned BL contained 26, 29, and 33% presumptive inner cell mass (ICM) cells, respectively, which is similar to that of frozen-thawed in vivo embryos at a comparable developmental stage (23%). These data indicate that embryonic and somatic nuclei require different recipient cytoplast environments for remodeling/reprogramming, and this is likely due to the different cell cycle stages and profiles of molecular differentiation of the transferred donor nuclei.
Abstract:
A colonial protochordate, Botryllus schlosseri, undergoes a natural transplantation reaction in the wild that results alternatively in colony fusion (chimera formation) or inflammatory rejection. A single, highly polymorphic histocompatibility locus (called Fu/HC) is responsible for rejection versus fusion. Gonads are seeded and gametogenesis can occur in colonies well after fusion, and involves circulating germ-line progenitors. Buss proposed that colonial organisms might develop self/non-self histocompatibility systems to limit the possibility of interindividual germ cell “parasitism” (GCP) to histocompatible kin [Buss, L. W. (1982) Proc. Natl. Acad. Sci. USA 79, 5337–5341 and Buss, L. W. (1987) The Evolution of Individuality (Princeton Univ. Press, Princeton)]. Here we demonstrate in laboratory and field experiments that both somatic cell and (more importantly) germ-line parasitism are a common occurrence in fused chimeras. These experiments support the tenet in Buss’s hypothesis that germ cell and somatic cell parasitism can occur in fused chimeras and that a somatic appearance may mask the winner of a gametic war. They also provide an interesting challenge to develop formulas that describe the inheritance of competing germ lines rather than competing individuals. The fact that fused B. schlosseri have higher rates of GCP than unfused colonies additionally provides a rational explanation for the generation and maintenance of a high degree of Fu/HC polymorphism, largely limiting GCP to sibling offspring.
Abstract:
Background: Elevated serum inflammatory marker levels are associated with a greater long-term risk of cardiovascular events. Because 3-hydroxy-3-methylglutaryl coenzyme-A reductase inhibitors (statins) may have an anti-inflammatory action, it has been suggested that patients with elevated inflammatory marker levels may have a greater reduction in cardiovascular risk with statin treatment. Methods and Results: We evaluated the association between the white blood cell (WBC) count and coronary heart disease mortality during a mean follow-up of 6.0 years in the Long-Term Intervention With Pravastatin in Ischemic Disease (LIPID) Study, a clinical trial comparing pravastatin (40 mg/d) with a placebo in 9014 stable patients with previous myocardial infarction or unstable angina. An increase in baseline WBC count was associated with greater coronary heart disease mortality in patients randomized to placebo (hazard ratio for a 1 × 10^9/L increase in WBC count, 1.18; 95% CI, 1.12 to 1.25; P<0.001) but not pravastatin (hazard ratio, 1.02; 95% CI, 0.96 to 1.09; P=0.56; P for interaction=0.004). The numbers of coronary heart disease deaths prevented per 1000 patients treated with pravastatin were 0, 9, 30, and 38 for baseline WBC count quartiles of <5.9, 6.0 to 6.9, 7.0 to 8.1, and >8.2 × 10^9/L, respectively. The WBC count was a stronger predictor of this treatment benefit than the ratio of total to high-density lipoprotein cholesterol or a global measure of cardiac risk. There was also a greater reduction (P=0.052) in the combined incidence of cardiovascular mortality, nonfatal myocardial infarction, and stroke with pravastatin as baseline WBC count increased (by quartile: 3, 41, 61, and 60 events prevented per 1000 patients treated). Conclusions: These data support the hypothesis that individuals with evidence of inflammation may obtain a greater benefit from statin therapy.