120 results for Inhalation dose and risk
Abstract:
BACKGROUND: Reduced bone mineral density (BMD) is common in adults infected with human immunodeficiency virus (HIV). The role of proximal renal tubular dysfunction (PRTD) and alterations in bone metabolism in HIV-related low BMD are incompletely understood. METHODS: We quantified BMD (dual-energy x-ray absorptiometry), blood and urinary markers of bone metabolism and renal function, and risk factors for low BMD (hip or spine T score, -1 or less) in an ambulatory care setting. We determined factors associated with low BMD and calculated 10-year fracture risks using the World Health Organization FRAX equation. RESULTS: We studied 153 adults (98% men; median age, 48 years; median body mass index, 24.5; 67 [44%] were receiving tenofovir, 81 [53%] were receiving a boosted protease inhibitor [PI]). Sixty-five participants (42%) had low BMD, and 11 (7%) had PRTD. PI therapy was associated with low BMD in multivariable analysis (odds ratio, 2.69; 95% confidence interval, 1.09-6.63). Tenofovir use was associated with increased osteoblast and osteoclast activity (P ≤ .002). The mean estimated 10-year risks were 1.2% for hip fracture and 5.4% for any major osteoporotic fracture. CONCLUSIONS: In this mostly male population, low BMD was significantly associated with PI therapy. Tenofovir recipients showed evidence of increased bone turnover. Measurement of BMD and estimation of fracture risk may be warranted in treated HIV-infected adults.
Abstract:
Background: Previous research has focused on the positive consequences of flow, an intrinsically rewarding state of deep absorption. In contrast, the present research links flow to impaired risk awareness and to risky behaviour. We expected flow to enhance self-efficacy beliefs, which in turn were hypothesised to result in low risk awareness and risky behaviour in sports. In addition, we predicted that individuals' level of experience in the activity would moderate the expected effects. Methods: One study with kayakers (Study 1) and two studies with rock climbers (Studies 2 and 3) were conducted. Kayakers completed a survey while still on the river; climbers responded during and upon completion of a climb. Results: In all studies flow was related to risk awareness. Study 2 additionally showed its association with risky behaviour. Studies 2 and 3 revealed that these relationships were mediated by self-efficacy. The mediations were moderated by level of experience (Study 3). Conclusions: The results indicated that inexperienced but not experienced participants respond to self-efficacy beliefs evoked by flow with impaired risk awareness and with risky behaviour. Theoretical implications for flow and risk research as well as practical implications for risk prevention are discussed.
Abstract:
BACKGROUND The optimal schedule and the need for a booster dose are unclear for Haemophilus influenzae type b (Hib) conjugate vaccines. We systematically reviewed relative effects of Hib vaccine schedules. METHODS We searched 21 databases to May 2010 or June 2012 and selected randomized controlled trials or quasi-randomized controlled trials that compared different Hib schedules (3 primary doses with no booster dose [3p+0], 3p+1 and 2p+1) or different intervals in primary schedules and between primary and booster schedules. Outcomes were clinical efficacy, nasopharyngeal carriage and immunological response. Results were combined in random-effects meta-analysis. RESULTS Twenty trials from 15 countries were included; 16 used vaccines of polyribosylribitol phosphate conjugated to tetanus toxoid (PRP-T). No trials assessed clinical or carriage outcomes. Twenty trials examined immunological outcomes and found few relevant differences. Comparing PRP-T 3p+0 with 2p+0, there was no difference in seropositivity at the 1.0 μg/mL threshold by 6 months after the last primary dose (combined risk difference -0.02; 95% confidence interval: -0.10, 0.06). Only small differences were seen between schedules starting at different ages, with different intervals between primary doses, or with different intervals between primary and booster doses. Individuals receiving a booster were more likely to be seropositive than those of the same age who did not. CONCLUSIONS There is no clear evidence from trials that any 2p+1, 3p+0 or 3p+1 schedule of Hib conjugate vaccine is likely to provide better protection against Hib disease than other schedules. Until more data become available, scheduling is likely to be determined by epidemiological and programmatic considerations in individual settings.
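For readers unfamiliar with the pooling step, the sketch below illustrates a standard DerSimonian-Laird random-effects computation for risk differences, the kind of calculation behind combined estimates such as the -0.02 risk difference above. The per-trial values are hypothetical placeholders, not data from the included trials.

```python
# Minimal DerSimonian-Laird random-effects pooling of risk differences.
# The (risk difference, variance) pairs below are hypothetical, not trial data.
import math

trials = [(-0.03, 0.0016), (0.01, 0.0025), (-0.05, 0.0009)]

# Fixed-effect (inverse-variance) weights and pooled estimate
w = [1 / v for _, v in trials]
rd_fe = sum(wi * rd for (rd, _), wi in zip(trials, w)) / sum(w)

# Cochran's Q and the DerSimonian-Laird between-trial variance tau^2
q = sum(wi * (rd - rd_fe) ** 2 for (rd, _), wi in zip(trials, w))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(trials) - 1)) / c)

# Random-effects weights incorporate tau^2
w_re = [1 / (v + tau2) for _, v in trials]
rd_re = sum(wi * rd for (rd, _), wi in zip(trials, w_re)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled RD = {rd_re:.3f} "
      f"(95% CI {rd_re - 1.96 * se:.3f} to {rd_re + 1.96 * se:.3f})")
```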
Abstract:
Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis. In that meta-analysis, patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most currently available data for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I² = 51%; 38 trials) without evidence of meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I² = 0%; 37 trials) without difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores changed by −0.6 in control arms (95% CI, −6.4 to 5.2; I² = 0%) and by 2.1 in ESA arms (95% CI, −3.9 to 8.1; I² = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I² = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher.
There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I² = 38%; 44 trials), but many trials did not include an overall survival endpoint and potential time-dependent confounding was not considered. Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use, but the magnitude was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
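The "one additional death for every N treated patients" figures follow from standard number-needed-to-harm arithmetic. Assuming the pooled on-study hazard ratio of 1.17 approximates a relative risk, the absolute risk increase at a control-arm mortality p is p(HR − 1), and its reciprocal reproduces the reported numbers:

```latex
\mathrm{NNH} \approx \frac{1}{p\,(\mathrm{HR}-1)}:\qquad
\frac{1}{0.10 \times 0.17} \approx 59,\qquad
\frac{1}{0.01 \times 0.17} \approx 588.
```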
Abstract:
BACKGROUND A number of epidemiological studies indicate an inverse association between atopy and brain tumors in adults, particularly gliomas. We investigated the association between atopic disorders and intracranial brain tumors in children and adolescents, using international collaborative CEFALO data. PATIENTS AND METHODS CEFALO is a population-based case-control study conducted in Denmark, Norway, Sweden, and Switzerland, including all children and adolescents in the age range 7-19 years diagnosed with a primary brain tumor between 2004 and 2008. Two controls per case were randomly selected from population registers matched on age, sex, and geographic region. Information about atopic conditions and potential confounders was collected through personal interviews. RESULTS In total, 352 cases (83%) and 646 controls (71%) participated in the study. For all brain tumors combined, there was no association between ever having had an atopic disorder and brain tumor risk [odds ratio (OR) 1.03; 95% confidence interval (CI) 0.70-1.34]. The OR was 0.76 (95% CI 0.53-1.11) for a current atopic condition (in the year before diagnosis) and 1.22 (95% CI 0.86-1.74) for an atopic condition in the past. Similar results were observed for glioma. CONCLUSIONS There was no association between atopic conditions and risk of all brain tumors combined or of glioma in particular. Stratification on current or past atopic conditions suggested the possibility of reverse causality, but this may also be the result of random variation because of small numbers in subgroups. In addition, ongoing tumor treatment may affect the manifestation of atopic conditions, which could possibly affect recall when reporting a history of atopic diseases. Only a few studies on atopic conditions and pediatric brain tumors are currently available, and the evidence is conflicting.
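As a hedged illustration of where such odds ratios and confidence intervals come from, the sketch below computes an unadjusted OR with a Wald 95% CI from a 2×2 case-control table. The counts are invented; the actual CEFALO estimates come from a matched (conditional) analysis with confounder adjustment.

```python
# Unadjusted odds ratio and Wald 95% CI from a 2x2 case-control table.
# Counts are hypothetical, for illustration only.
import math

cases_exposed, cases_unexposed = 150, 202        # hypothetical
controls_exposed, controls_unexposed = 270, 376  # hypothetical

# OR = (a*d) / (b*c) for the standard 2x2 layout
odds_ratio = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

# Standard error of log(OR), then exponentiate the Wald limits
se_log_or = math.sqrt(1/cases_exposed + 1/cases_unexposed +
                      1/controls_exposed + 1/controls_unexposed)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```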
Abstract:
In the 1980s, leukaemia clusters were discovered around the nuclear fuel reprocessing plants in Sellafield and Dounreay in the United Kingdom. This raised public concern about the risk of childhood leukaemia near nuclear power plants (NPPs). Since then, the topic has been well studied, but methodological limitations make results difficult to interpret. Our review aims to: (1) summarise current evidence on the relationship between NPPs and risk of childhood leukaemia, with a focus on the Swiss CANUPIS (Childhood cancer and nuclear power plants in Switzerland) study; (2) discuss the limitations of previous research; and (3) suggest directions for future research. There are various reasons why previous studies produced inconclusive results. These include: inadequate study designs and limited statistical power due to the low prevalence of exposure (living near an NPP) and outcome (leukaemia); lack of accurate exposure estimates; limited knowledge of the aetiology of childhood leukaemia, particularly of vulnerable time windows and latent periods; use of residential location at time of diagnosis only and lack of data on address histories; and inability to adjust for potential confounders. We conclude that the risk of childhood leukaemia around NPPs should continue to be monitored and that study designs should be improved and standardised. Data should be pooled internationally to increase statistical power. More research needs to be done on other putative risk factors for childhood cancer, such as low-dose ionizing radiation, exposure to certain chemicals and exposure to infections. Studies should be designed to allow the examination of multiple exposures.
Abstract:
Over the last two decades, imaging of the aorta has undergone a clinically relevant change: non-invasive imaging techniques have replaced invasive intra-arterial digital subtraction angiography, the former imaging gold standard for aortic diseases. Computed tomography (CT) and magnetic resonance imaging (MRI) constitute the backbone of pre- and postoperative aortic imaging because they allow for imaging of the entire aorta and its branches. The first part of this review article describes the imaging principles of CT and MRI with regard to aortic disease and shows how both technologies can be applied in everyday clinical practice, offering exciting perspectives. Recent CT scanner generations deliver excellent image quality with high spatial and temporal resolution, and technical developments now allow the entire aorta to be scanned within a few seconds. Therefore, CT angiography (CTA) is the imaging technology of choice for evaluating acute aortic syndromes, for the diagnosis of most aortic pathologies, for preoperative planning, and for postoperative follow-up after endovascular aortic repair. However, radiation dose and the risk of contrast-induced nephropathy are major downsides of CTA. Optimisation of scan protocols and contrast media administration can help to reduce the required radiation dose and the amount of contrast media. MR angiography (MRA) is an excellent alternative to CTA for both the diagnosis of aortic pathologies and postoperative follow-up. The lack of radiation is particularly beneficial for younger patients. A potential side effect of gadolinium contrast agents is nephrogenic systemic fibrosis (NSF). In patients at high risk of NSF, unenhanced MRA can be performed with both ECG- and breath-gating techniques. Additionally, MRI offers the possibility of visualising and measuring both dynamic and flow information.
Abstract:
BACKGROUND: Many studies have been conducted to define risk factors for the transmission of bovine paratuberculosis, mostly in countries with large herds. Little is known about the epidemiology in infected Swiss herds and the risk factors important for transmission in smaller herds. Therefore, the presence of known factors that might favor the spread of paratuberculosis and could be related to the animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis was assessed in 17 infected herds (10 dairy, 7 beef). Additionally, the level of knowledge of herd managers about the disease was assessed. In a case-control study with 4 matched negative control herds per infected herd, the association of potential risk factors with the infection status of the herd was investigated. RESULTS: Exposure of the young stock to feces of older animals was frequently observed in infected and in control herds. The farmers' knowledge about paratuberculosis was very limited, even in infected herds. An overall animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis of 6.1% was found in infected herds, whereby shedders younger than 2 years of age were found in 46.2% of the herds where the young stock was available for testing. Several factors related to contamination of the heifer area with cows' feces and to the management of the calving area were significantly associated with the within-herd prevalence. Animal purchase was associated with a positive herd infection status (OR = 7.25, p = 0.004). CONCLUSIONS: Numerous risk factors favoring the spread of Mycobacterium avium subsp. paratuberculosis from adult animals to the young stock were observed in infected Swiss dairy and beef herds; these may be amenable to improvement in order to control the disease. Important factors were contamination of the heifer and calving areas, which was associated with higher within-herd prevalence of fecal shedding. The awareness of farmers of paratuberculosis was very low, even in infected herds. Animal purchase was significantly associated with the probability of a herd being infected and is thus the most important factor for the control of the spread of disease between farms.
Abstract:
BACKGROUND AND PURPOSE To assess the association of lesion location with risk of aspiration and to establish predictors of transient versus extended risk of aspiration after supratentorial ischemic stroke. METHODS Atlas-based localization analysis was performed in consecutive patients with MRI-proven first-time acute supratentorial ischemic stroke. Standardized swallowing assessment was carried out within 8±18 hours and 7.8±1.2 days after admission. RESULTS In a prospective, longitudinal analysis, 34 of 94 patients (36%) were classified as having acute risk of aspiration, which was extended (≥7 days) in 17 cases and transient (<7 days) in the remaining 17. There were no between-group differences in age, sex, cause of stroke, risk factors, prestroke disability, lesion side, or the degree of age-related white-matter changes. Correcting for stroke volume and National Institutes of Health Stroke Scale with a multiple logistic regression model, significant adjusted odds ratios in favor of acute risk of aspiration were demonstrated for the internal capsule (adjusted odds ratio, 6.2; P<0.002) and the insular cortex (adjusted odds ratio, 4.8; P<0.003). In a multivariate model of extended versus transient risk of aspiration, a combined lesion of the frontal operculum and insular cortex was the only significant independent predictor of poor recovery (adjusted odds ratio, 33.8; P<0.008). CONCLUSIONS Lesions of the insular cortex and the internal capsule are significantly associated with acute risk of aspiration after stroke. Combined ischemic infarctions of the frontal operculum and the insular cortex are likely to cause extended risk of aspiration in stroke patients, whereas risk of aspiration tends to be transient in subcortical stroke.
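The adjusted odds ratios above come from a multiple logistic regression; a minimal sketch of that kind of model, fitted on simulated placeholder data with lesion location, stroke volume, and NIHSS as covariates, might look as follows. None of the values reproduce the study's data.

```python
# Multiple logistic regression yielding an "adjusted OR" for lesion location,
# correcting for stroke volume and NIHSS. All data below are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 94
insula = rng.integers(0, 2, n)       # 1 = lesion involves insular cortex
volume = rng.gamma(2.0, 10.0, n)     # stroke volume in mL (hypothetical)
nihss = rng.integers(0, 20, n)       # NIHSS score (hypothetical)

# Simulate aspiration risk from an assumed true model
logit = -2.0 + 1.4 * insula + 0.02 * volume + 0.08 * nihss
aspiration = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([insula, volume, nihss]))
fit = sm.Logit(aspiration, X).fit(disp=0)
# exp(coefficient) of the lesion indicator is the adjusted odds ratio
print("adjusted OR (insula) =", round(float(np.exp(fit.params[1])), 2))
```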
Abstract:
BACKGROUND Despite substantial evidence supporting a pharmacogenetic approach to warfarin therapy in adults, evidence on the importance of genetics in warfarin therapy in children is limited, particularly for clinical outcomes. We assessed the contribution of CYP2C9/VKORC1/CYP4F2 genotypes and variation in other genes involved in vitamin K and coagulation pathways to warfarin dose and related clinical outcomes in children. PROCEDURE Clinical and genetic data for 93 children (age ≤ 18 years) who received warfarin therapy were obtained. DNA was genotyped for 93 selected single nucleotide polymorphisms using a custom assay. RESULTS With a median age of 4.8 years, our cohort included more young children than most previous studies. Overall, 76.3% of dose variability was explained by weight, indication, VKORC1-1639G/A and CYP2C9*2/*3, with genotypes accounting for 21.1% of variability. There was a strong correlation (R² = 0.68; P < 0.001) between actual and predicted warfarin dose using a pediatric genotype-based dosing model. VKORC1 genotype had a significant impact on time to therapeutic international normalized ratio (INR) (P = 0.047) and time to over-anticoagulation (INR > 4; P = 0.024) during the initiation of therapy. CYP2C9*3 carriers were also at increased risk of major bleeding while receiving warfarin (adjusted OR = 11.28). An additional variant in CYP2C9 (rs7089580) was significantly associated with warfarin dose (P = 0.020) in a multivariate clinical and genetic model. CONCLUSIONS This study confirms the importance of VKORC1/CYP2C9 genotypes for warfarin dosing in a young pediatric cohort and demonstrates an impact of genetic factors on clinical outcomes in children. Furthermore, we identified an additional variant in CYP2C9 of potential relevance for warfarin dosing in children.
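As a reminder of what the reported R² summarizes, the sketch below computes the coefficient of determination between actual and model-predicted doses. The dose values are hypothetical, and the published figure may instead be a squared Pearson correlation.

```python
# Coefficient of determination R^2 between actual and predicted warfarin
# doses, the kind of summary reported above. All doses are hypothetical.
actual = [1.5, 2.0, 0.8, 3.1, 2.4]     # mg/day, hypothetical
predicted = [1.3, 2.2, 1.0, 2.8, 2.5]  # from a dosing model, hypothetical

mean_actual = sum(actual) / len(actual)
ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))  # residual SS
ss_tot = sum((a - mean_actual) ** 2 for a in actual)           # total SS
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```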
Abstract:
Ischaemic stroke (IS) in young adults has been increasingly recognized as a serious health condition. Stroke aetiology is different in young adults than in the older population. This study aimed to investigate aetiology and risk factors, and to search for predictors of outcome and recurrence in young IS patients. We conducted a prospective multicentre study of consecutive IS patients aged 16-55 years. Baseline demographic data, risk factors, stroke aetiology (including systematic genetic screening for Fabry disease) and stroke severity were assessed and related to functional neurological outcome (modified Rankin Scale, mRS), case fatality, employment status, place of residence, and recurrent cerebrovascular events at 3 months. In 624 IS patients (60 % men), median age was 46 (IQR 39-51) years and median NIHSS on admission was 3 (IQR 1-8). Modifiable vascular risk factors were found in 73 %. Stroke aetiology was mostly cardioembolism (32 %) or of other defined origin (24 %), including cervicocerebral artery dissection (17 %). Fabry disease was diagnosed in 2 patients (0.3 %). Aetiology remained unknown in 20 %. Outcome at 3 months was favourable (mRS 0-1) in 61 % and fatal in 2.9 %. Stroke severity (p < 0.001) and diabetes mellitus (p = 0.023) predicted unfavourable outcome. The stroke recurrence rate at 3 months was 2.7 %. Previous stroke or TIA predicted recurrent cerebrovascular events (p = 0.012). In conclusion, most young adults with IS had modifiable vascular risk factors, emphasizing the importance of prevention strategies. Outcome was unfavourable in more than a third of patients and was associated with initial stroke severity and diabetes mellitus. Previous cerebrovascular events predicted recurrent ones.
Abstract:
INTRODUCTION Results on mitochondrial dysfunction in sepsis are controversial. We aimed to assess the effects of LPS over wide dose and time ranges on hepatocytes and isolated skeletal muscle mitochondria. METHODS Human hepatocellular carcinoma cells (HepG2) were exposed to placebo or LPS (0.1, 1, and 10 μg/mL) for 4, 8, 16, and 24 hours, and primary human hepatocytes to 1 μg/mL LPS or placebo (4, 8, and 16 hours). Mitochondria from porcine skeletal muscle samples were exposed to increasing doses of LPS (0.1-100 μg/mg) for 2 and 4 hours. Respiration rates of intact and permeabilized cells and isolated mitochondria were measured by high-resolution respirometry. RESULTS In HepG2 cells, LPS reduced mitochondrial membrane potential and cellular ATP content but did not modify basal respiration. Stimulated complex II respiration was reduced time-dependently by 1 μg/mL LPS. In primary human hepatocytes, stimulated mitochondrial complex II respiration was likewise reduced time-dependently by 1 μg/mL LPS. In isolated porcine skeletal muscle mitochondria, stimulated respiration decreased at high doses (50 and 100 μg/mL LPS). CONCLUSION LPS reduced the cellular ATP content of HepG2 cells, most likely as a result of the induced decrease in membrane potential. LPS decreased cellular and isolated mitochondrial respiration in a time-dependent, dose-dependent and complex-dependent manner.
Abstract:
INTRODUCTION External beam radiotherapy (EBRT), with or without androgen deprivation therapy (ADT), is an established treatment option for nonmetastatic prostate cancer. Despite high-level evidence from several randomized trials, risk group stratification and treatment recommendations vary due to contradictory or inconclusive data, particularly with regard to EBRT dose prescription and ADT duration. Our aim was to investigate current patterns of practice in primary EBRT for prostate cancer in Switzerland. MATERIALS AND METHODS Treatment recommendations on EBRT and ADT for localized and locally advanced prostate cancer were collected from 23 Swiss radiation oncology centers. Written recommendations were converted into center-specific decision trees, and analyzed for consensus and differences using a dedicated software tool. Additionally, specific radiotherapy planning and delivery techniques from the participating centers were assessed. RESULTS The most commonly prescribed radiation dose was 78 Gy (range 70-80 Gy) across all risk groups. ADT was recommended for intermediate-risk patients for 6 months in over 80 % of the centers, and for high-risk patients for 2 or 3 years in over 90 % of centers. For recommendations on combined EBRT and ADT treatment, consensus levels did not exceed 39 % in any clinical scenario. Arc-based intensity-modulated radiotherapy (IMRT) is implemented for routine prostate cancer radiotherapy by 96 % of the centers. CONCLUSION Among Swiss radiation oncology centers, considerable ranges of radiotherapy dose and ADT duration are routinely offered for localized and locally advanced prostate cancer. In the vast majority of cases, doses and durations are within the range of those described in current evidence-based guidelines.
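One plausible way to quantify a "consensus level" across center-specific decision trees is the share of centers whose recommendation matches the most common one for a given clinical scenario; the sketch below assumes that definition and uses invented recommendations for 23 centers, not the survey's data.

```python
# Share of centers agreeing with the modal recommendation for one scenario.
# The recommendation strings and their distribution are hypothetical.
from collections import Counter

recommendations = (["78 Gy + 6 mo ADT"] * 9 +
                   ["76 Gy + 6 mo ADT"] * 8 +
                   ["78 Gy alone"] * 6)  # 23 hypothetical centers

counts = Counter(recommendations)
modal, n_modal = counts.most_common(1)[0]
print(f"consensus = {n_modal / len(recommendations):.0%} for '{modal}'")
```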
Abstract:
Importance In treatment-resistant schizophrenia, clozapine is considered the standard treatment. However, clozapine use has restrictions owing to its many adverse effects. Moreover, an increasing number of randomized clinical trials (RCTs) of other antipsychotics have been published. Objective To integrate all the randomized evidence from the available antipsychotics used for treatment-resistant schizophrenia by performing a network meta-analysis. Data Sources MEDLINE, EMBASE, Biosis, PsycINFO, PubMed, Cochrane Central Register of Controlled Trials, World Health Organization International Trial Registry, and clinicaltrials.gov were searched up to June 30, 2014. Study Selection At least 2 independent reviewers selected published and unpublished single- and double-blind RCTs in treatment-resistant schizophrenia (any study-defined criterion) that compared any antipsychotic (at any dose and in any form of administration) with another antipsychotic or placebo. Data Extraction and Synthesis At least 2 independent reviewers extracted all data into standard forms and assessed the quality of all included trials with the Cochrane Collaboration's risk-of-bias tool. Data were pooled using a random-effects model in a Bayesian setting. Main Outcomes and Measures The primary outcome was efficacy as measured by overall change in symptoms of schizophrenia. Secondary outcomes included change in positive and negative symptoms of schizophrenia, categorical response to treatment, dropouts for any reason and for inefficacy of treatment, and important adverse events. Results Forty blinded RCTs with 5172 unique participants (71.5% men; mean [SD] age, 38.8 [3.7] years) were included in the analysis. Few significant differences were found in all outcomes. In the primary outcome (reported as standardized mean difference; 95% credible interval), olanzapine was more effective than quetiapine (-0.29; -0.56 to -0.02), haloperidol (-0.29; -0.44 to -0.13), and sertindole (-0.46; -0.80 to -0.06); clozapine was more effective than haloperidol (-0.22; -0.38 to -0.07) and sertindole (-0.40; -0.74 to -0.04); and risperidone was more effective than sertindole (-0.32; -0.63 to -0.01). A pattern of superiority for olanzapine, clozapine, and risperidone was seen in other efficacy outcomes, but results were not consistent and effect sizes were usually small. In addition, relatively few RCTs were available for antipsychotics other than clozapine, haloperidol, olanzapine, and risperidone. The most surprising finding was that clozapine was not significantly better than most other drugs. Conclusions and Relevance Insufficient evidence exists on which antipsychotic is more efficacious for patients with treatment-resistant schizophrenia, and blinded RCTs, in contrast to unblinded randomized effectiveness studies, provide little evidence of the superiority of clozapine compared with other second-generation antipsychotics. Future clozapine studies with high doses and patients with extremely treatment-refractory schizophrenia might be most promising to change the current evidence.
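The standardized mean differences pooled in this network meta-analysis are each derived from two trial arms. A minimal sketch of that per-trial computation, with invented summary statistics and omitting the Hedges small-sample correction and the Bayesian pooling itself, is shown below.

```python
# Standardized mean difference (Cohen's d) between two trial arms, the unit
# of analysis pooled in the network meta-analysis above. The arm summaries
# are hypothetical, not data from any included RCT.
import math

n1, mean1, sd1 = 60, -22.0, 18.0  # symptom change, drug A (hypothetical)
n2, mean2, sd2 = 58, -16.0, 19.0  # symptom change, drug B (hypothetical)

# Pooled standard deviation across the two arms
sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
smd = (mean1 - mean2) / sd_pooled  # negative favors drug A (greater reduction)
print(f"SMD = {smd:.2f}")
```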
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate=28.5%), of which 40.9% only fattened their own calves (group O), 56.9% fattened their own calves and additional purchased calves (group O&P), and 2.3% only purchased calves for fattening (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% upon arrival and later): in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively associated with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production. The Swiss situation, with large commercial herds but also smaller herds purchasing few or no calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system in which small farms raise the calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.
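Odds ratios reported "per 10 calves" scale multiplicatively through the underlying logistic coefficient. Assuming herd size entered the model as a linear term, the conversion to other herd-size differences is straightforward, as sketched below.

```python
# Rescaling a logistic-regression odds ratio: OR = 1.2 "per 10 calves"
# corresponds to exp(beta * 10) = 1.2 for the per-calf coefficient beta.
import math

or_per_10 = 1.2
beta_per_calf = math.log(or_per_10) / 10   # log-odds change per extra calf
or_per_calf = math.exp(beta_per_calf)
or_per_50 = math.exp(beta_per_calf * 50)   # e.g., a herd larger by 50 calves
print(f"OR per calf = {or_per_calf:.3f}, OR per 50 calves = {or_per_50:.2f}")
```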