821 results for Fetal macrosomia - Risk factors
Abstract:
Background Few studies have monitored late presentation (LP) of HIV infection over the European continent, including Eastern Europe. Study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 were LP (53.8%). LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men, and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and in male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and IDUs in Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe. 
Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
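The per-calendar-year adjusted odds ratio above compounds multiplicatively on the odds scale. A minimal sketch of that arithmetic (not the study's analysis code; the projected figure differs slightly from the observed 51.7% because the reported aOR is adjusted for other covariates):

```python
# Illustration: compounding a per-year adjusted odds ratio (aOR 0.96)
# over a decade, starting from the reported 57.3% LP proportion in 2000.
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

baseline_lp = 0.573      # LP proportion in 2000 (from the abstract)
aor_per_year = 0.96      # adjusted OR per calendar year (from the abstract)
years = 10               # 2000 -> 2010

projected = prob(odds(baseline_lp) * aor_per_year ** years)
print(f"Projected LP proportion after {years} years: {projected:.1%}")
```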
Abstract:
Although persons infected with human immunodeficiency virus (HIV), particularly men who have sex with men, are at excess risk for anal cancer, it has been difficult to disentangle the influences of anal human papillomavirus (HPV) infection, immunodeficiency, and combined antiretroviral therapy. A case-control study that included 59 anal cancer cases and 295 individually matched controls was nested in the Swiss HIV Cohort Study (1988-2011). In a subset of 41 cases and 114 controls, HPV antibodies were tested. A majority of anal cancer cases (73%) were men who have sex with men. Current smoking was significantly associated with anal cancer (odds ratio (OR) = 2.59, 95% confidence interval (CI): 1.25, 5.34), as were antibodies against L1 (OR = 4.52, 95% CI: 2.00, 10.20) and E6 (OR = ∞, 95% CI: 4.64, ∞) of HPV16, as well as low CD4+ cell counts, whether measured at nadir (OR per 100-cell/μL decrease = 1.53, 95% CI: 1.18, 2.00) or at cancer diagnosis (OR per 100-cell/μL decrease = 1.24, 95% CI: 1.08, 1.42). However, the influence of CD4+ cell counts appeared to be strongest 6-7 years prior to anal cancer diagnosis (OR for <200 vs. ≥500 cells/μL = 14.0, 95% CI: 3.85, 50.9). Smoking cessation and avoidance of even moderate levels of immunosuppression appear to be important in reducing long-term anal cancer risks.
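The CD4+ odds ratios above are reported per 100-cell/μL decrease; under the standard log-linear logistic model such per-unit ORs scale multiplicatively. A small illustration (the 300-cell decrease is a hypothetical example, not a figure from the study):

```python
# Scaling a per-unit odds ratio under a log-linear (logistic) model:
# the OR for k units of exposure is the per-unit OR raised to the k-th power.
def scale_or(or_per_unit, n_units):
    """Odds ratio for n_units of exposure, assuming log-linearity."""
    return or_per_unit ** n_units

# OR per 100-cell/uL decrease in nadir CD4+ count, from the abstract:
or_per_100 = 1.53
# Implied OR for a hypothetical 300-cell/uL decrease (3 units of 100 cells):
print(round(scale_or(or_per_100, 3), 2))
```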
Abstract:
BACKGROUND In recent years, the occurrence and relevance of Mycoplasma hyopneumoniae infections in suckling pigs have been examined in several studies. Whereas most of these studies focused solely on prevalence estimation within different age groups, follow-up of infected piglets, or assessment of pathological findings, none included a detailed analysis of individual and environmental risk factors. Therefore, the aim of the present study was to investigate the frequency of M. hyopneumoniae infections in suckling pigs of endemically infected herds and to identify individual risk factors potentially influencing the infection status of suckling pigs at the age of weaning. RESULTS The animal-level prevalence of M. hyopneumoniae infections in suckling pigs examined in three conventional pig breeding herds was 3.6% (41/1127) at the time of weaning. A prevalence of 1.2% was found in the same pigs at the end of their nursery period. In a multivariable Poisson regression model, the incidence rate ratio (IRR) for suckling pigs was significantly lower than 1 when teeth grinding was conducted (IRR: 0.10). Moreover, high temperatures in the piglet nest during the first two weeks of life (occasionally >40°C) were associated with a decreased probability of infection (IRR: 0.23-0.40). In contrast, the application of PCV2 vaccines to piglets was associated with an increased infection risk (IRR: 9.72). CONCLUSIONS Since single infected piglets are thought to act as initiators for the transmission of this pathogen in nursery and fattening pigs, eliminating the risk factors described in this study should help to reduce the incidence rate of M. hyopneumoniae infections and thereby might contribute to a reduced probability of high prevalences in older pigs.
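An incidence rate ratio like those the Poisson model reports can be sketched from raw event counts and person-time. The counts below are hypothetical, chosen only so the point estimate matches the teeth-grinding IRR of 0.10; the study's own estimate came from a multivariable model:

```python
import math

# Incidence rate ratio with a Wald-type 95% CI on the log scale.
def irr_ci(events_exposed, time_exposed, events_unexposed, time_unexposed):
    """Return (IRR, lower, upper) from event counts and person-time."""
    irr = (events_exposed / time_exposed) / (events_unexposed / time_unexposed)
    se_log = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lo = irr * math.exp(-1.96 * se_log)
    hi = irr * math.exp(1.96 * se_log)
    return irr, lo, hi

# Hypothetical: 4 infections over 400 pig-weeks with teeth grinding
# vs 40 infections over 400 pig-weeks without.
irr, lo, hi = irr_ci(4, 400, 40, 400)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```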
Abstract:
BACKGROUND Mycoplasma hyopneumoniae is the etiologic agent of enzootic pneumonia, occurring mainly in fattening pigs. It is assumed that horizontal transmission of the pathogen during the nursery and growing phases starts with a few suckling pigs vertically infected by the sow. The aim of the present study was to explore the herd prevalence of M. hyopneumoniae infections in suckling pigs and then to investigate various herd-specific factors for their potential to influence the occurrence of this pathogen at the age of weaning. RESULTS In this cross-sectional study, 125 breeding herds were examined by taking nasal swabs from 20 suckling pigs in each herd. In total, 3.9% (98/2500) of all nasal swabs tested positive for M. hyopneumoniae by real-time PCR. Positive piglets originated from 46 different herds, resulting in an overall herd prevalence of 36.8% (46/125) for M. hyopneumoniae infection in pigs at the age of weaning. When the herds were characterized epidemiologically, the risk of detecting M. hyopneumoniae was significantly increased when the number of purchased gilts per year was more than 120 (OR: 5.8) and when the number of farrowing pens per compartment was higher than 16 (OR: 3.3). In herds with a planned and segregated production, where groups of sows entered previously emptied farrowing units, the risk of detecting M. hyopneumoniae in piglets was higher in herds with two or four weeks between batches than in herds with one or three weeks between batches (OR: 2.7). CONCLUSIONS In this cross-sectional study, several risk factors were identified that enhance the probability of breeding herds raising suckling pigs already infected with M. hyopneumoniae at the time of weaning. Interestingly, some factors (farrowing rhythm, gilt acclimatisation issues) overlapped with those also influencing seroprevalences among sows or the transmission of the pathogen between older age groups.
Taking the multifactorial character of enzootic pneumonia into account, the results of this study substantiate that a comprehensive herd specific prevention programme is a prerequisite to reduce transmission of and disease caused by M. hyopneumoniae.
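The reported herd prevalence of 36.8% (46/125) can be given a confidence interval. A minimal sketch using the Wilson score interval (the abstract itself reports no CI, so the interval below is purely illustrative):

```python
import math

# Wilson score 95% CI for a binomial proportion (more stable than the
# simple Wald interval for small samples or extreme proportions).
def wilson_ci(successes, n, z=1.96):
    """Return (lower, upper) Wilson score interval bounds."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

lo, hi = wilson_ci(46, 125)  # 46 positive herds out of 125, from the abstract
print(f"Herd prevalence 46/125 = {46/125:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```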
Abstract:
Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9×10−4). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
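A genetic risk score of the kind described is typically a weighted sum of risk-allele counts. A minimal sketch with hypothetical SNP weights and genotypes (the study's 23 CAD-associated SNPs and their effect sizes are not reproduced here):

```python
# Weighted genetic risk score (GRS): sum over SNPs of the risk-allele
# count (0, 1, or 2) times that SNP's per-allele log odds ratio.
def risk_score(allele_counts, log_or_weights):
    """Return the weighted GRS for one individual."""
    return sum(c * w for c, w in zip(allele_counts, log_or_weights))

# Hypothetical per-allele log-OR weights for four illustrative SNPs:
weights = [0.10, 0.15, 0.08, 0.20]
# Three hypothetical individuals' risk-allele counts at those SNPs:
genotypes = [[0, 1, 2, 1], [2, 2, 1, 2], [0, 0, 1, 0]]

scores = [risk_score(g, weights) for g in genotypes]
print(scores)  # individuals ranked by genetic burden; top quartile = "unfavorable"
```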
Abstract:
In Switzerland, group-housing for breeding rabbit does is not explicitly required by law, but label programmes, as well as the general public and animal welfare groups, are advocating it. Although group-housing is of great benefit to the gregariously living rabbits, the establishment of a social hierarchy within the group might lead to stress and lesions. In the present epidemiological study, lesions were scored twice on 30% of the breeding does on all 28 commercial Swiss farms with group-housed breeding does. Additionally, a detailed questionnaire was filled out with all producers to determine risk factors potentially associated with lesions. Data were analysed using hierarchical proportional odds models. About 33% of the does examined had lesions, including wounds that were almost healed and small scratches. Severe lesions were counted on 9% of the animals. Seasonal differences in lesion scores were identified, with the extent of lesions being higher in summer than in spring. Fewer lesions occurred on farms on which mastitis was more common. More lesions were found on farms where the does were isolated between littering and artificial insemination than on farms without isolation. According to the producers, most of the aggression occurred directly after the isolation phase, when the does were regrouped. We conclude that lesions in group-housed breeding does might be reduced by appropriate reproductive management.
Abstract:
Over the last couple of decades, the UK experienced a substantial increase in the incidence and geographical spread of bovine tuberculosis (TB), in particular since the epidemic of foot-and-mouth disease (FMD) in 2001. The initiation of the Randomized Badger Culling Trial (RBCT) in 1998 in south-west England provided an opportunity for an in-depth collection of questionnaire data (covering farming practices, herd management and husbandry, trading and wildlife activity) from herds having experienced a TB breakdown between 1998 and early 2006 and randomly selected control herds, both within and outside the RBCT (the so-called TB99 and CCS2005 case-control studies). The data collated were split into four separate and comparable substudies related to either the pre-FMD or post-FMD period, which are brought together and discussed here for the first time. The findings suggest that the risk factors associated with TB breakdowns may have changed. Higher Mycobacterium bovis prevalence in badgers following the FMD epidemic may have contributed to the identification of the presence of badgers on a farm as a prominent TB risk factor only post-FMD. The strong emergence of contact/trading TB risk factors post-FMD suggests that the purchasing and movement of cattle, which took place to restock FMD-affected areas after 2001, may have exacerbated the TB problem. Post-FMD analyses also highlighted the potential impact of environmental factors on TB risk. Although no unique and universal solution exists to reduce the transmission of TB to and among British cattle, there is evidence to suggest that applying the broad principles of biosecurity on farms reduces the risk of infection. However, with trading remaining an important route of local and long-distance TB transmission, improvements in the detection of infected animals during pre- and post-movement testing should further reduce the geographical spread of the disease.
Abstract:
PRINCIPLES To evaluate the validity and feasibility of a novel photography-based home assessment (PhoHA) protocol as a possible substitute for on-site home assessment (OsHA). METHODS A total of 20 patients aged ≥65 years who were hospitalised in a rehabilitation centre for musculoskeletal disorders affecting mobility participated in this prospective validation study. For PhoHA, occupational therapists rated photographs and measurements of patients' homes provided by patients' confidants. For OsHA, occupational therapists conducted a conventional home visit. RESULTS Information obtained by PhoHA was 79.1% complete (1,120 environmental factors identified by PhoHA vs 1,416 by OsHA). Of the 1,120 factors, 749 had dichotomous scores (potential hazards) and 371 had continuous scores (measurements with a tape measure). Validity of PhoHA for identifying potential hazards was good (sensitivity 78.9%, specificity 84.9%), except for two subdomains (pathways, slippery surfaces). Pearson's correlation coefficient for the validity of measurements was 0.87 (95% confidence interval [CI] 0.80-0.92, p <0.001). Agreement between methods was 0.52 (95% CI 0.34-0.67, p <0.001, Cohen's kappa coefficient) for dichotomous scores and 0.86 (95% CI 0.79-0.91, p <0.001, intraclass correlation coefficient) for continuous scores. Costs of PhoHA were 53.0% lower than those of OsHA (p <0.001). CONCLUSIONS PhoHA has good concurrent validity for environmental assessment, provided that instructions for confidants are improved. PhoHA is potentially a cost-effective method for environmental assessment.
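The agreement statistics reported above (sensitivity, specificity, Cohen's kappa) come from a 2x2 cross-classification of PhoHA against OsHA ratings. A sketch with hypothetical counts, chosen only to roughly match the reported sensitivity and specificity:

```python
# Convention: rows = PhoHA rating, columns = OsHA (reference) rating.
# tp = both positive, fp = PhoHA+/OsHA-, fn = PhoHA-/OsHA+, tn = both negative.
def sens_spec(tp, fp, fn, tn):
    """Return (sensitivity, specificity) against the reference method."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement between two binary raters."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                            # observed
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # expected
    return (po - pe) / (1 - pe)

# Hypothetical counts (not the study's data):
sens, spec = sens_spec(tp=150, fp=30, fn=40, tn=169)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
print(f"kappa {cohens_kappa(150, 30, 40, 169):.2f}")
```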
Abstract:
The objective of this study was to evaluate risk factors associated with foot lesions and lameness in Swiss dairy cows. Potential risk factors were recorded by examining 1,449 Swiss cows and the management systems of 78 farms during routine claw-trimming, and during personal interviews with the associated farmers. Statistical analyses of animal-based and herd-level risk factors were performed using multivariate logistic regression models. The risk of being lame was increased in cows affected by digital dermatitis complex, heel-horn erosion, interdigital hyperplasia, Rusterholz' sole ulcer, deep laceration, double sole and severe hemorrhages. Cleanliness, BCS, presence of other foot lesions, breed, importance of claw health to the farmer, frequency of routine claw-trimming, producing according to the guidelines of the welfare label program RAUS, and silage feeding were shown to be associated with the occurrence of some of the evaluated foot lesions and lameness. The identified risk factors may help to improve management, lameness and claw health in dairy cows in Switzerland and other alpine areas with similar housing and pasturing systems.
Abstract:
OBJECTIVE To determine the frequency of and risk factors for complications associated with casts in horses. DESIGN Multicenter retrospective case series. ANIMALS 398 horses with a half-limb or full-limb cast treated at 1 of 4 hospitals. PROCEDURES Data collected from medical records included age, breed, sex, injury, limb affected, time from injury to hospital admission, surgical procedure performed, type of cast (bandage cast [BC; fiberglass tape applied over a bandage] or traditional cast [TC; fiberglass tape applied over polyurethane resin-impregnated foam]), limb position in cast (flexed, neutral, or extended), and complications. Risk factors for cast complications were identified via multiple logistic regression. RESULTS Cast complications were detected in 197 of 398 (49%) horses (18/53 [34%] horses with a BC and 179/345 [52%] horses with a TC). Of the 197 horses with complications, 152 (77%) had clinical signs of complications prior to cast removal; the most common clinical signs were increased lameness severity and visibly detectable soft tissue damage. Cast sores were the most common complication (179/398 [45%] horses). Casts broke for 20 (5%) horses. Three (0.8%) horses developed a bone fracture attributable to casting. Median time to detection of complications was 12 days and 8 days for horses with TCs and BCs, respectively. Complications developed in 71%, 48%, and 47% of horses with the casted limb in a flexed, neutral, and extended position, respectively. For horses with TCs, hospital, limb position in the cast, and sex were significant risk factors for development of cast complications. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that 49% of horses with a cast developed cast complications.
Abstract:
The increased use of vancomycin in hospitals has made it standard practice to monitor serum vancomycin levels because of possible nephrotoxicity. However, routine monitoring of vancomycin serum concentration is under criticism, and the cost-effectiveness of such monitoring is in question, because frequent monitoring results in neither increased efficacy nor decreased nephrotoxicity. The purpose of the present study was to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderate to highly nephrotoxic drugs. Factors found to be significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderate to highly nephrotoxic drugs, and APACHE III scores of 40 or more.
Significant factors from the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors, with their corresponding odds ratios and 95% confidence limits, selected by stepwise logistic regression analysis as predictive of vancomycin-induced nephrotoxicity were: concurrent therapy with moderate to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderate to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients. However, only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant for nephrotoxicity in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
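An unadjusted odds ratio with Woolf (log-based) confidence limits, the form in which risk factors like those above are reported, can be sketched from a 2x2 table. The counts below are hypothetical, not the study's data (its ORs come from an adjusted stepwise model):

```python
import math

# Odds ratio from a 2x2 table with a Woolf 95% CI on the log scale.
def or_woolf_ci(a, b, c, d):
    """a/b: exposed cases/controls; c/d: unexposed cases/controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = or_ * math.exp(-1.96 * se_log)
    hi = or_ * math.exp(1.96 * se_log)
    return or_, lo, hi

# Hypothetical: nephrotoxicity by concurrent nephrotoxic-drug exposure.
or_, lo, hi = or_woolf_ci(a=60, b=286, c=35, d=371)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```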
Abstract:
A cohort of 418 United States Air Force (USAF) personnel from over 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine whether the USAF was medically prepared to deploy with its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies, indicating that the USAF medical readiness posture has not fully responded to its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rainboots, DEET insecticide cream) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient on vaccinations and 36% had not received a tuberculin skin test. Excluding injections, the overall compliance for preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was "receiving a medical intelligence briefing" from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. "Personnel on mobility" had the second strongest positive effect on medical readiness.
When mobility and briefing were included in models, "personnel on mobility" were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or deploy only personnel on mobility, and conduct research to capitalize on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.
Abstract:
The association of measures of physical activity with coronary heart disease (CHD) risk factors in children, especially those for atherosclerosis, is unknown. The purpose of this study was to determine the association of physical activity and cardiovascular fitness with blood lipids and lipoproteins in pre-adolescent and adolescent girls. The study population comprised 131 girls aged 9 to 16 years who participated in the Children's Nutrition Research Center's Adolescent Study. The dependent variables, blood lipids and lipoproteins, were measured by standard techniques. The independent variables were physical activity, measured as the difference between total energy expenditure (TEE) and basal metabolic rate (BMR), and cardiovascular fitness, VO2max (ml/min/kg). TEE was measured by the doubly-labeled water (DLW) method, and BMR by whole-room calorimetry. Cardiovascular fitness, VO2max (ml/min/kg), was measured on a motorized treadmill. The potential confounding variables were sexual maturation (Tanner breast stage), ethnic group, body fat percent, and dietary variables. A systematic strategy for data analysis was used to isolate the effects of physical activity and cardiovascular fitness on blood lipids, beginning with assessment of confounding and interaction. Next, from regression models predicting each blood lipid and controlling for covariables, hypotheses were evaluated by the direction and value of the coefficients for physical activity and cardiovascular fitness. The main result was that cardiovascular fitness appeared to be more strongly associated with blood lipids than physical activity.
An interaction between cardiovascular fitness and sexual maturation indicated that the effect of cardiovascular fitness on most blood lipids was dependent on the stage of sexual maturation. A difference of 760 kcal/d in physical activity (which represents the difference between the 25th and 75th percentiles of physical activity) was associated with negligible differences in blood lipids. In contrast, a difference of 10 ml/min/kg in VO2max (which represents the difference between the 25th and 75th percentiles of cardiovascular fitness) in the early stages of sexual maturation was associated with an average positive difference of 15 mg/100 ml in ApoA-1 and 10 mg/100 ml in HDL-C.
Abstract:
Diarrheal disease is a leading cause of morbidity and mortality, especially among children in developing countries. Global mortality caused by diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and the first reported case of human cryptosporidiosis did not occur until 1976. This study was conducted to ascertain the risk factors for first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses of the data demonstrated that infants greater than six months of age had a two-fold risk of infection compared with infants less than six months of age (RR = 2.17; 95% C.I. = 1.01-4.82). When stratified, male infants greater than six months of age were four times more likely to become infected than male infants less than six months of age. Among female infants, there was no difference in risk between those greater than and less than six months of age. Female infants less than six months of age were twice as likely to become infected as male infants less than six months of age. The reverse occurred for infants greater than six months of age, i.e., male infants greater than six months of age had twice the risk of infection compared with females of the same age group. Further analysis of the data revealed an increased risk of cryptosporidiosis in infants who were attended in childbirth by traditional childbirth attendants compared with infants attended by modern childbirth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% C.I. = 1.05-36.06). The final risk factor of significance was the number of people residing in the household.
Infants in households with more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors in the homes.
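The relative risks reported in this cohort are ratios of incidence proportions. A minimal sketch with hypothetical counts, chosen only so the point estimate reproduces the reported RR = 2.17:

```python
# Relative risk (risk ratio) for a cohort study: the incidence proportion
# in the exposed group divided by that in the unexposed group.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Return the risk ratio for exposed vs unexposed cohort members."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical: 26 first infections among 100 infants older than six months
# vs 12 among 100 infants younger than six months.
rr = relative_risk(26, 100, 12, 100)
print(round(rr, 2))
```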