819 results for Cardiovascular risk factors


Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Hemodialysis patients are high absorbers of intestinal cholesterol; they benefit less than other patient groups from statin therapy, which inhibits cholesterol synthesis. OBJECTIVES This study sought to investigate whether the individual cholesterol absorption rate affects the effectiveness of atorvastatin in reducing cardiovascular risk in hemodialysis patients. METHODS This post-hoc analysis included 1,030 participants in the German Diabetes and Dialysis Study (4D) who were randomized to either 20 mg of atorvastatin (n = 519) or placebo (n = 511). The primary endpoint was a composite of major cardiovascular events. Secondary endpoints included all-cause mortality and all cardiac events. Tertiles of the cholestanol-to-cholesterol ratio, an established biomarker of cholesterol absorption, were used to identify high and low cholesterol absorbers. RESULTS A total of 454 primary endpoints occurred. In multivariate time-to-event analyses, the interaction term between tertiles and treatment with atorvastatin was significantly associated with the risk of reaching the primary endpoint. Stratified analysis by cholestanol-to-cholesterol ratio tertiles confirmed this effect modification: atorvastatin reduced the risk of reaching the primary endpoint in the first tertile (hazard ratio [HR]: 0.72; p = 0.049), but not in the second (HR: 0.79; p = 0.225) or third tertile (HR: 1.21; p = 0.287). Atorvastatin significantly reduced all-cause mortality and the risk of all cardiac events only in the first tertile. CONCLUSIONS Intestinal cholesterol absorption, as reflected by the cholestanol-to-cholesterol ratio, predicts the effectiveness of atorvastatin in reducing cardiovascular risk in hemodialysis patients. Patients with low cholesterol absorption appear to benefit from atorvastatin, whereas those with high absorption do not.
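The analysis above hinges on splitting patients into tertiles of the cholestanol-to-cholesterol ratio. A minimal sketch of such a tertile assignment, using invented ratio values rather than 4D study data:

```python
def assign_tertiles(values):
    """Split biomarker values into tertiles (1 = lowest third, 3 = highest)."""
    ranked = sorted(values)
    n = len(ranked)
    cut1 = ranked[n // 3]        # upper bound of the first tertile
    cut2 = ranked[2 * n // 3]    # upper bound of the second tertile
    return [1 if v < cut1 else 2 if v < cut2 else 3 for v in values]

# Hypothetical cholestanol-to-cholesterol ratios for nine patients
ratios = [1.1, 2.4, 0.9, 3.0, 1.8, 2.1, 0.7, 2.8, 1.5]
print(assign_tertiles(ratios))
```

In the study itself, these group labels would then enter a time-to-event model as a stratification variable and as an interaction term with treatment; the cut-offs and values here are purely illustrative.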

Relevance: 100.00%

Publisher:

Abstract:

Background. Although acquired immune deficiency syndrome-associated morbidity has diminished owing to excellent viral control, multimorbidity may be increasing among human immunodeficiency virus (HIV)-infected persons compared with the general population. Methods. We assessed the prevalence of comorbidities and multimorbidity in participants of the Swiss HIV Cohort Study (SHCS) compared with the population-based CoLaus study and the primary care-based FIRE (Family Medicine ICPC-Research using Electronic Medical Records) records. The incidence of the respective endpoints was assessed among SHCS and CoLaus participants. Poisson regression models were adjusted for age, sex, body mass index, and smoking. Results. Overall, 74,291 participants contributed data to the prevalence analyses (3,230 HIV-infected; 71,061 controls). In CoLaus, FIRE, and SHCS, multimorbidity was present in 26%, 13%, and 27% of participants, respectively. Compared with nonsmoking individuals from CoLaus, the incidence of cardiovascular disease was elevated among smoking individuals but independent of HIV status (HIV-negative smoking: incidence rate ratio [IRR] = 1.7, 95% confidence interval [CI] = 1.2-2.5; HIV-positive smoking: IRR = 1.7, 95% CI = 1.1-2.6; HIV-positive nonsmoking: IRR = 0.79, 95% CI = 0.44-1.4). Compared with nonsmoking HIV-negative persons, multivariable Poisson regression identified associations of HIV infection with hypertension (nonsmoking: IRR = 1.9, 95% CI = 1.5-2.4; smoking: IRR = 2.0, 95% CI = 1.6-2.4), kidney disease (nonsmoking: IRR = 2.7, 95% CI = 1.9-3.8; smoking: IRR = 2.6, 95% CI = 1.9-3.6), and liver disease (nonsmoking: IRR = 1.8, 95% CI = 1.4-2.4; smoking: IRR = 1.7, 95% CI = 1.4-2.2). No evidence was found for an association of HIV infection or smoking with diabetes mellitus. Conclusions. Both the prevalence and the incidence of multimorbidity are higher in HIV-positive than in HIV-negative individuals. Smoking, but not HIV status, has a strong impact on cardiovascular risk and multimorbidity.
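The incidence rate ratios (IRRs) reported above compare event rates between exposure groups. As a sketch of the underlying arithmetic (the event counts and person-years below are invented, not SHCS or CoLaus data):

```python
def incidence_rate(events, person_years):
    """Events per person-year of follow-up."""
    return events / person_years

def irr(events_exposed, py_exposed, events_ref, py_ref):
    """Crude incidence rate ratio: exposed-group rate over reference-group rate."""
    return incidence_rate(events_exposed, py_exposed) / incidence_rate(events_ref, py_ref)

# Hypothetical: 34 cardiovascular events over 10,000 person-years among
# smokers vs. 20 events over 10,000 person-years among nonsmoking controls
print(round(irr(34, 10_000, 20, 10_000), 2))  # 1.7
```

The adjusted IRRs in the abstract come from Poisson regression, which generalises this crude ratio while controlling for age, sex, body mass index, and smoking.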

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND This study evaluated whether risk factors for sternal wound infections vary with the type of surgical procedure in cardiac operations. METHODS This was a university hospital surveillance study of 3,249 consecutive patients (28% women) from 2006 to 2010 (median age, 69 years [interquartile range, 60 to 76]; median additive European System for Cardiac Operative Risk Evaluation [EuroSCORE], 5 [interquartile range, 3 to 8]) after (1) isolated coronary artery bypass grafting (CABG), (2) isolated valve repair or replacement, or (3) combined valve procedures and CABG. All other operations were excluded. Univariate and multivariate binary logistic regression analyses were conducted to identify independent predictors of sternal wound infection. RESULTS We detected 122 sternal wound infections (3.8%) in 3,249 patients: 74 of 1,857 patients (4.0%) after CABG, 19 of 799 (2.4%) after valve operations, and 29 of 593 (4.9%) after combined procedures. In CABG patients, bilateral internal thoracic artery harvest, procedural duration exceeding 300 minutes, diabetes, obesity, chronic obstructive pulmonary disease, and female sex (model 1) were independent predictors of sternal wound infection. A second model (model 2), using the EuroSCORE, revealed that bilateral internal thoracic artery harvest, diabetes, obesity, and the second and third EuroSCORE quartiles were independent predictors. In valve patients, model 1 showed only revision for bleeding as an independent predictor of sternal infection, and model 2 yielded both revision for bleeding and diabetes. For combined valve and CABG operations, both regression models demonstrated that revision for bleeding and a duration of operation exceeding 300 minutes were independent predictors of sternal infection. CONCLUSIONS Risk factors for sternal wound infections after cardiac operations vary with the type of surgical procedure. In patients undergoing valve operations or combined operations, procedure-related risk factors (revision for bleeding, duration of operation) independently predict infection. In patients undergoing CABG, not only procedure-related risk factors but also bilateral internal thoracic artery harvest and patient characteristics (diabetes, chronic obstructive pulmonary disease, obesity, female sex) are predictive of sternal wound infection. Preventive interventions may be justified according to the type of operation.

Relevance: 100.00%

Publisher:

Abstract:

Fish, like mammals, can be affected by neoplastic proliferations. As yet, only a small number of studies have reported on the occurrence of tumours in koi carp Cyprinus carpio koi, and there are only sporadic reports on the nature of these tumours or on risk factors associated with their development. Between 2008 and 2012, koi with abdominal swelling were examined pathologically: neoplastic lesions were diagnosed and classified histologically. We evaluated possible risk factors for the development of these internal neoplasms in koi carp in Switzerland using an online two-part questionnaire sent to fish keepers with koi affected by internal tumours and to fish keepers who had not previously reported any affected koi. Part 1 addressed all participants and focused on general information about koi husbandry and pond technical data; Part 2 addressed participants who had had one or several cases of koi with internal tumours between 2008 and 2012, and consisted of specific questions about the affected koi. A total of 112 internal tumours were reported by the 353 koi keepers participating in the survey. Analysis of the data revealed that tumour occurrence was significantly associated with the location (indoors vs. outdoors) and volume of the pond, the frequency of water changes, the origin of the koi, the number of koi kept in a pond, and the use of certain pond disinfectant/medication products. Our results contribute to the identification of possible risk factors, which in turn could help to establish prophylactic measures to reduce the occurrence of internal neoplasms in koi.

Relevance: 100.00%

Publisher:

Abstract:

REASONS FOR PERFORMING THE STUDY: Racetrack injuries are a welfare concern, and prevention of injuries is an important goal in many racing jurisdictions. Over the years this has led to more detailed recording of clinical events on racecourses. However, risk factor analyses of clinical events at race meetings have never been reported for Switzerland. OBJECTIVE: To identify discipline-specific factors that influence the occurrence of clinical events during race meetings, with the ultimate aim of improving monitoring and safety on racetracks in Switzerland and optimising racehorse welfare. STUDY DESIGN: Retrospective study of horse race data collected by the Swiss horse racing association. METHODS: All race starts (n = 17,670, including 6,198 flat, 1,257 obstacle and 10,215 trot race starts) recorded over a period of four years (2009-2012) were analysed in multivariable mixed effect logistic regression models including horse- and racecourse-related data. The models were designed to identify discipline-specific factors influencing the occurrence of clinical events on racecourses in Switzerland. RESULTS: Factors influencing the risk of clinical events during races differed for each discipline. The risk of a clinical event in trot racing was lower for racing on a Porphyre-sand track than on grass tracks. Horses whose driver was also their trainer had an approximately twofold higher risk of clinical events. In obstacle races, longer distances (2,401-3,300 m and 3,301-5,400 m) had a protective effect compared with racing over shorter distances. In flat racing, five racecourses reported significantly fewer clinical events. In all three disciplines, finishing in 8th place or later was associated with clinical events. CONCLUSIONS: Changes in management that aim to improve the safety and welfare of racehorses, such as racetrack adaptations, need to be individualised for each discipline.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVES Hypothetically, the atherogenic effect of the metabolic syndrome may be mediated through the increased occurrence of small LDL particles, which are easily modified to atherogenic oxidized LDL (ox-LDL). The aim of this study was to test this concept by examining the association between circulating ox-LDL, LDL particle size, and the metabolic syndrome. DESIGN AND RESULTS A population-based sample of clinically healthy 58-year-old men (n = 391) was recruited. Ox-LDL was measured by ELISA (specific monoclonal antibody, mAb-4E6) and LDL particle size by gradient gel electrophoresis. Ox-LDL correlated significantly with factors constituting the metabolic syndrome: triglycerides (r = 0.43), plasma insulin (r = 0.20), body mass index (r = 0.20), waist-to-hip ratio (r = 0.21) and HDL (r = -0.24) (all P < 0.001). Ox-LDL also correlated with LDL particle size (r = -0.42), Apo-B (r = 0.70) and LDL (r = 0.65) (P < 0.001), and, furthermore, with Apo A-1 (r = -0.13) and heart rate (r = 0.13) (P < 0.01). CONCLUSION Men with the metabolic syndrome had higher plasma ox-LDL concentrations than those without the syndrome. Ox-LDL levels were associated with most of the risk factors constituting the metabolic syndrome and were, in addition, related to small LDL particle size. To our knowledge, the present study is the first to demonstrate that circulating ox-LDL levels are associated with small LDL particle size in a population-representative sample of clinically healthy middle-aged men. The high degree of intercorrelation amongst several factors makes it difficult to clarify the independent role of any specific factor.
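The r values above are Pearson correlation coefficients. A from-scratch computation on made-up paired measurements (not study data) shows the quantity being reported:

```python
def pearson_r(x, y):
    """Pearson correlation: covariance scaled by the two standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

ox_ldl = [40, 55, 62, 70, 85]        # hypothetical ox-LDL values
trigly = [1.1, 0.9, 1.6, 1.4, 2.6]   # hypothetical triglycerides (mmol/L)
print(round(pearson_r(ox_ldl, trigly), 2))
```

A value near +1 or -1 indicates a strong linear relationship, which is why the abstract's closing caveat about intercorrelated factors matters: several strongly correlated predictors cannot be disentangled by pairwise r alone.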

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Physicians traditionally treat ulcerative colitis (UC) using a step-up approach. Given the paucity of data, we aimed to assess the cumulative probability of UC-related need for step-up therapy and to identify escalation-associated risk factors. METHODS Patients with UC enrolled in the Swiss IBD Cohort Study were analyzed. The following steps from the bottom to the top of the therapeutic pyramid were examined: (1) 5-aminosalicylic acid (5-ASA) and/or rectal corticosteroids, (2) systemic corticosteroids, (3) immunomodulators (IM) (azathioprine, 6-mercaptopurine, methotrexate), (4) TNF antagonists, (5) calcineurin inhibitors, and (6) colectomy. RESULTS Data on 996 patients with UC with a median disease duration of 9 years were examined. The point estimates of cumulative use of the different treatments at years 1, 5, 10, and 20 after UC diagnosis were 91%, 96%, 96%, and 97%, respectively, for 5-ASA and/or rectal corticosteroids; 63%, 69%, 72%, and 79% for systemic corticosteroids; 43%, 57%, 59%, and 64% for IM; 15%, 28%, and 35% (up to year 10 only) for TNF antagonists; 5%, 9%, 11%, and 12% for calcineurin inhibitors; and 1%, 5%, 9%, and 18% for colectomy. The presence of extraintestinal manifestations and extended disease location (at least left-sided colitis) were identified as risk factors for step-up in therapy with systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and surgery. Cigarette smoking at diagnosis was protective against surgery. CONCLUSIONS The presence of extraintestinal manifestations, left-sided colitis, and extensive colitis/pancolitis at the time of diagnosis were associated with the use of systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and colectomy during the disease course.
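The point estimates quoted above ("x% by year 1, 5, 10, 20") are cumulative probabilities of having reached a given treatment step. Ignoring censoring, which the study's time-to-event estimates handle properly, the idea can be sketched with invented escalation times:

```python
def cumulative_probability(event_times, horizon, n_total):
    """Fraction of patients who escalated on or before `horizon` years (no censoring)."""
    return sum(1 for t in event_times if t <= horizon) / n_total

# Hypothetical: 6 of 10 patients escalate therapy at these years after diagnosis
years_to_escalation = [0.5, 0.8, 2.0, 4.0, 6.5, 9.0]
for horizon in (1, 5, 10):
    print(horizon, cumulative_probability(years_to_escalation, horizon, 10))
```

The real estimates additionally account for patients whose follow-up ends before the horizon, which is why the abstract reports them as point estimates rather than simple fractions.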

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Calcium disorders are common both in intensive care units and in patients with chronic kidney disease, and are associated with increased morbidity and mortality. It is unknown whether calcium abnormalities in unselected emergency department admissions have an impact on in-hospital mortality. METHODS This cross-sectional analysis included all admissions to the Emergency Department at the Inselspital Bern, Switzerland, from 2010 to 2011. For hyper- and hypocalcaemic patients, differences between subgroups divided by age, length of hospital stay, creatinine, sodium, chloride, phosphate, potassium and magnesium were compared with the Mann-Whitney U-test. Associations between calcium disorders and 28-day in-hospital mortality were assessed using the Cox proportional hazard regression model. RESULTS 8,270 patients with calcium measurements were included in our study. Overall, 264 (3.2%) patients died: 150 patients with hypocalcaemia (6.13%) and 7 patients with hypercalcaemia (6.19%) died, in contrast to 104 normocalcaemic patients (1.82%). In univariate analysis, serum calcium levels were associated with sex, mortality and pre-existing diuretic therapy (all p<0.05). In multivariate Cox regression analysis, hypocalcaemia and hypercalcaemia were independent risk factors for mortality (HR 2.00 and HR 1.88, respectively; both p<0.01). CONCLUSION Both hypocalcaemia and hypercalcaemia are associated with increased 28-day in-hospital mortality in unselected emergency department admissions.

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE The aim of this study was to analyze the patient pool referred to a specialty clinic for implant surgery over a 3-year period. MATERIALS AND METHODS All patients receiving dental implants between 2008 and 2010 at the Department of Oral Surgery and Stomatology were included in the study. As primary outcome parameters, the patients were analyzed according to the following criteria: age, sex, systemic diseases, and indication for therapy. For the inserted implants, the type of surgical procedure, the types of implants placed, postsurgical complications, and early failures were recorded. A logistic regression analysis was performed to identify possible local and systemic risk factors for complications. As a secondary outcome, data regarding demographics and surgical procedures were compared with the findings of a historic study group (2002 to 2004). RESULTS A total of 1,568 patients (792 women and 776 men; mean age, 52.6 years) received 2,279 implants. The most frequent indication was a single-tooth gap (52.8%). Augmentative procedures were performed in 60% of cases. Tissue-level implants (72.1%) were used more frequently than bone-level implants (27.9%). Regarding implant dimensions, a diameter of 4.1 mm (59.7%) and a length of 10 mm (55.0%) were used most often. An early failure rate of 0.6% was recorded (13 implants). Compared with the patient pool of 2002 to 2004, patients were older, more implants were placed in the maxilla, and the complexity of surgical interventions had increased. CONCLUSION Implant therapy performed in a surgical specialty clinic utilizing strict patient selection and evidence-based surgical protocols showed a very low early failure rate of 0.6%.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND There has been little research on bathroom accidents. It is unknown whether the shower or bathtub is associated with particular dangers in different age groups, or whether there are specific risk factors for adverse outcomes. METHODS This cross-sectional analysis included all direct admissions to the Emergency Department at the Inselspital Bern, Switzerland, from 1 January 2000 to 28 February 2014 after accidents associated with the bathtub or shower. Time, age, location, mechanism and diagnosis were assessed, and specific risk factors were examined. Patient groups with and without intracranial bleeding were compared with the Mann-Whitney U test. The association of risk factors with intracranial bleeding was investigated using univariate analysis with Fisher's exact test or logistic regression. The effects of different variables on cerebral bleeding were analysed by multivariate logistic regression. RESULTS Two hundred and eighty (280) patients with accidents associated with the bathtub or shower were included in our study. Two hundred and thirty-five (235) patients suffered direct trauma by hitting an object (83.9%), and traumatic brain injury (TBI) was detected in 28 patients (10%). Eight (8) of the 27 patients with mild traumatic brain injury (GCS 13-15) exhibited intracranial haemorrhage (29.6%). All patients with intracranial haemorrhage were older than 48 years and needed in-hospital treatment. Patients with intracranial haemorrhage were significantly older and had higher haemoglobin levels than the control group with TBI but without intracranial bleeding (p<0.05 for both). In univariate analysis, we found that intracranial haemorrhage in patients with TBI was associated with direct trauma in general and with age (both p<0.05), but not with the mechanism of the fall, its location (shower or bathtub) or the gender of the patient. Multivariate logistic regression analysis identified only age as a risk factor for cerebral bleeding (p<0.05; OR 1.09, 95% CI 1.01-1.17). CONCLUSION In patients admitted to the emergency department after accidents associated with the bathtub or shower, direct trauma and age are risk factors for intracranial haemorrhage. Additional preventive efforts should be considered, especially in the elderly.
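The only multivariate predictor, age, carries an odds ratio of about 1.09 per year. Because odds ratios from logistic regression are multiplicative per unit of the predictor, the implied effect of a larger age gap can be read off directly (a back-of-the-envelope interpretation, not an analysis from the paper):

```python
def odds_multiplier(or_per_unit, delta):
    """Multiplicative change in odds for a `delta`-unit increase in the predictor."""
    return or_per_unit ** delta

# With OR 1.09 per year, a patient 20 years older has roughly
# 1.09**20, i.e. about 5.6 times, the odds of intracranial bleeding
print(round(odds_multiplier(1.09, 20), 1))
```

This compounding is why a per-year OR that looks modest can translate into a substantial risk difference between younger and elderly patients, which is the abstract's prevention message.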

Relevance: 100.00%

Publisher:

Abstract:

UNLABELLED In a prospective multicentre study of bloodstream infection (BSI) from November 1, 2007 to July 31, 2010, seven paediatric cancer centres (PCC) from Germany and one from Switzerland included 770 paediatric cancer patients (58% males; median age 8.3 years, interquartile range (IQR) 3.8-14.8 years) comprising 153,193 individual days of surveillance (in- and outpatient days during intensive treatment). Broviac catheters were used in 63% of all patients and ports in 20%. One hundred forty-two patients (18%; 95% CI 16 to 21%) experienced at least one BSI (179 BSIs in total; bacteraemia 70%, bacterial sepsis 27%, candidaemia 2%). In 57%, the BSI occurred in inpatients, and in 79% after conventional chemotherapy. Only 56% of the patients showed neutropenia at BSI onset. In the multivariate analysis, patients with acute lymphoblastic leukaemia (ALL) or acute myeloblastic leukaemia (AML), patients with relapsed malignancy, and patients with a Broviac catheter faced an increased risk of BSI. Relapsed malignancy (16%) was an independent risk factor both for all BSIs and for Gram-positive BSIs. CONCLUSION This study confirms relapsed malignancy as an independent risk factor for BSIs in paediatric cancer patients. On a unit level, data on BSIs in this high-risk population derived from prospective surveillance are not only mandatory for deciding on empiric antimicrobial treatment but also beneficial in planning and evaluating preventive bundles. WHAT IS KNOWN • Paediatric cancer patients face an increased risk of nosocomial bloodstream infections (BSIs). • In most cases, these BSIs are associated with the use of a long-term central venous catheter (Broviac, port), severe and prolonged immunosuppression (e.g. neutropenia) and other chemotherapy-induced alterations of host defence mechanisms (e.g. mucositis). WHAT IS NEW • This study is the first multicentre study confirming relapsed malignancy as an independent risk factor for BSIs in paediatric cancer patients. • It describes the epidemiology of nosocomial BSI in paediatric cancer patients mainly outside the stem cell transplantation setting, during conventional intensive therapy, and argues for prospective surveillance programmes to target and evaluate preventive bundle interventions.

Relevance: 100.00%

Publisher:

Abstract:

The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a useable questionnaire (response rate = 28.5%), of which 40.9% fattened only their own calves (group O), 56.9% their own calves plus additional purchased calves (group O&P), and 2.3% only purchased calves (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% both upon arrival and later): in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR = 8.9) and herd size (OR = 1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR = 2.9) and the risk of respiratory disease (OR = 1.2 per 10% higher risk), and negatively with the use of individual antimicrobial treatment (OR = 0.3). In many countries, purchase and a large herd size are inherently connected to veal production. The Swiss situation, with large commercial herds but also smaller herds with little or no purchase of calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system in which small farms raise the calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.

Relevance: 100.00%

Publisher:

Abstract:

Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase, and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12 kg. A mean of 58±33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf and year. The mean mortality risk was 4.1%; calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastro-intestinal disease (33%). Two multivariable models were constructed: one for antimicrobial drug treatment incidence (53 farms) and one for mortality (91 farms). No quarantine, a shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with an increased mortality risk, while vaccination and beef breed were associated with a decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine, or with spiramycin, were used in 54.9% of the oral group treatments, and amoxicillin in 43.7%. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg. Individual treatments were mainly applied by injection (88.5%) and included fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, third- and fourth-generation cephalosporins in 4.7%, and florfenicol in 3.9% of cases. The present study identified risk factors for increased antimicrobial drug treatment and mortality, an important basis for future studies aiming to reduce treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent antibiotic use.
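The treatment incidence (TIADD) above rests on the defined-daily-dose methodology: total drug use is converted into standard daily doses and normalised per calf and year. A sketch of the formula's shape, with invented farm figures (the DDD value, standard weight and drug amount are assumptions, chosen only so the result lands near the reported mean of 21 doses per calf and year):

```python
def treatment_incidence(total_mg_used, ddd_mg_per_kg, standard_weight_kg,
                        n_calves, days_at_risk, days_per_year=365):
    """Daily doses administered per calf and year (defined-daily-dose style)."""
    daily_doses = total_mg_used / (ddd_mg_per_kg * standard_weight_kg)
    return daily_doses * days_per_year / (n_calves * days_at_risk)

# Hypothetical farm: 414,000 mg of drug used, assumed DDD of 10 mg/kg at a
# 100 kg standard weight, 60 calves each at risk for a 120-day fattening period
print(round(treatment_incidence(414_000, 10, 100, 60, 120), 1))
```

The normalisation by days at risk is what makes TIADD comparable between farms with different herd sizes and fattening durations.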

Relevance: 100.00%

Publisher:

Abstract:

We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so that they can be communicated to, and managed by, the slaughter industry and veterinary services. At meat inspection, there were three main predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse it was processed in. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with fattening end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights the fact that the risk factors for WCC are as complex as the production system itself, interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing feedback to farmers on condemnation reasons and on their performance compared with the national/regional average could be a first step towards improving herd management and financial returns for producers.