999 results for "point dose"
Abstract:
Aprotinin is widely used in cardiac surgery to reduce postoperative bleeding and the need for blood transfusion. Controversy exists regarding the influence of aprotinin on renal function and its effect on the incidence of perioperative myocardial infarction (MI) and cerebrovascular incidents (CVI). In the present study, we analyzed the incidence of these adverse events in patients who underwent coronary artery bypass grafting (CABG) surgery under full-dose aprotinin and compared the data with those recently reported by Mangano et al. (2006). For 751 consecutive patients undergoing CABG surgery under full-dose aprotinin (>4 million kallikrein-inhibitor units), we analyzed in-hospital data on renal dysfunction or failure, MI (defined as creatine kinase-myocardial band > 60 IU/L), and CVI (defined as persistent or transient neurological symptoms and/or a positive computed tomographic scan). Average age was 67.0 +/- 9.9 years, and patient pre- and perioperative characteristics were similar to those in the Society of Thoracic Surgeons database. Mortality (2.8%) and the incidence of renal failure (5.2%) were within the range of reported results. The incidence rates of MI (8% versus 16%; P < .01) and CVI (2% versus 6%; P < .01), however, were significantly lower than those reported by Mangano et al. Thus, the data from our single-center experience do not confirm the recently reported negative effect of full-dose aprotinin on the incidence of MI and CVI. Therefore, aprotinin may still remain a valid option to reduce postoperative bleeding, especially given the increased use of aggressive fibrinolytic therapy following percutaneous transluminal coronary angioplasty.
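The key comparison in this abstract is between incidence proportions (e.g., MI 8% versus 16%). As a purely illustrative sketch, a two-proportion z-test of that kind can be computed as below; the comparison-group denominator and the event counts are hypothetical placeholders, not values taken from either study.

```python
# Two-proportion z-test sketch for comparing incidence rates such as the
# MI rates quoted above (8% vs. 16%). Counts and the comparison denominator
# are hypothetical placeholders, NOT data from either study.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Return the z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical example: 8% of 751 patients vs. 16% of 1000 comparison patients.
z, p = two_proportion_z(60, 751, 160, 1000)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```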
Abstract:
BACKGROUND: The role of adjuvant dose-intensive chemotherapy and its efficacy according to baseline features have not yet been established. PATIENTS AND METHODS: Three hundred and forty-four patients were randomized to receive seven courses of standard-dose chemotherapy (SD-CT) or three cycles of dose-intensive epirubicin and cyclophosphamide (DI-EC; epirubicin 200 mg/m(2) plus cyclophosphamide 4 g/m(2) with filgrastim and progenitor cell support). All patients were assigned tamoxifen at the completion of chemotherapy. The primary end point was disease-free survival (DFS). This paper updates the results and explores patterns of recurrence according to predictive baseline features. RESULTS: At 8.3 years' median follow-up, patients assigned DI-EC had a significantly better DFS than those assigned SD-CT [8-year DFS 47% and 37%, respectively; hazard ratio (HR) 0.76; 95% confidence interval 0.58-1.00; P = 0.05]. Only patients with estrogen receptor (ER)-positive disease benefited from DI-EC (HR 0.61; 95% confidence interval 0.39-0.95; P = 0.03). CONCLUSIONS: After prolonged follow-up, DI-EC significantly improved DFS, but the effect was observed only in patients with ER-positive disease, leading to the hypothesis that the efficacy of DI-EC may relate to its endocrine effects. Further studies designed to confirm the importance of endocrine responsiveness in patients treated with dose-intensive chemotherapy are encouraged.
Abstract:
BACKGROUND: A complete remission is essential for prolonging survival in patients with acute myeloid leukemia (AML). Daunorubicin is a cornerstone of the induction regimen, but the optimal dose is unknown. In older patients, it is usual to give daunorubicin at a dose of 45 to 50 mg per square meter of body-surface area. METHODS: Patients in whom AML or high-risk refractory anemia had been newly diagnosed and who were 60 to 83 years of age (median, 67) were randomly assigned to receive cytarabine, at a dose of 200 mg per square meter by continuous infusion for 7 days, plus daunorubicin for 3 days, either at the conventional dose of 45 mg per square meter (411 patients) or at an escalated dose of 90 mg per square meter (402 patients); this treatment was followed by a second cycle of cytarabine at a dose of 1000 mg per square meter every 12 hours [DOSAGE ERROR CORRECTED] for 6 days. The primary end point was event-free survival. RESULTS: The complete remission rates were 64% in the group that received the escalated dose of daunorubicin and 54% in the group that received the conventional dose (P=0.002); the rates of remission after the first cycle of induction treatment were 52% and 35%, respectively (P<0.001). There was no significant difference between the two groups in the incidence of hematologic toxic effects, 30-day mortality (11% and 12% in the two groups, respectively), or the incidence of moderate, severe, or life-threatening adverse events (P=0.08). Survival end points in the two groups did not differ significantly overall, but patients in the escalated-treatment group who were 60 to 65 years of age, as compared with the patients in the same age group who received the conventional dose, had higher rates of complete remission (73% vs. 51%), event-free survival (29% vs. 14%), and overall survival (38% vs. 23%). CONCLUSIONS: In patients with AML who are older than 60 years of age, escalation of the dose of daunorubicin to twice the conventional dose, with the entire dose administered in the first induction cycle, effects a more rapid response and a higher response rate than does the conventional dose, without additional toxic effects. (Current Controlled Trials number, ISRCTN77039377; and Netherlands National Trial Register number, NTR212.)
Abstract:
We report on a female who is a compound heterozygote for two new point mutations in the CYP19 gene. The allele inherited from her mother carried a base-pair deletion (C) at P408 (CCC, exon 9), causing a frameshift that results in a nonsense codon 111 bp (37 aa) further downstream in the CYP19 gene. The allele inherited from her father showed a G-->A point mutation at the splice site (canonical GT to mutant AT) between exon 3 and intron 3. This mutation causes the splice site to be ignored, and a stop codon occurs 3 bp downstream. Aromatase deficiency was already suspected because of the marked virilization occurring prepartum in the mother, and the diagnosis was confirmed shortly after birth. Extremely low levels of serum estrogens were found, in contrast to high levels of androgens. Ultrasonographic follow-up studies revealed persistently enlarged ovaries (19.5-22 mL) during early childhood (2 to 4 yr), which contained numerous large cysts up to 4.8 x 3.7 cm and normal-appearing large tertiary follicles already at the age of 2 yr. In addition, both basal and GnRH-induced FSH levels remained consistently and strikingly elevated. Low-dose estradiol (E2) (0.4 mg/day) given for 50 days at the age of 3 6/12 yr resulted in normalization of serum gonadotropin levels, regression of ovarian size, and an increase in whole-body and lumbar spine (L1-L4) bone mineral density. The FSH concentration and ovarian size returned to pretreatment levels shortly (150 days) after cessation of E2 therapy. Therefore, we recommend that affected females be treated with low-dose E2 in amounts sufficient to achieve physiological prepubertal E2 concentrations, verified using an ultrasensitive estrogen assay. However, E2 replacement needs to be adjusted throughout childhood and puberty to ensure normal skeletal maturation, an adequate adolescent growth spurt, normal accretion of bone mineral density, and, at the appropriate age, female secondary sex maturation.
Abstract:
OBJECTIVES: To investigate the modulation of the nociceptive withdrawal reflex (NWR) and temporal summation (TS) by low-dose acepromazine (ACP) in conscious dogs, and to assess the short- and long-term stability of the reflex thresholds. STUDY DESIGN: Randomized, blinded, placebo-controlled cross-over experimental study. ANIMALS: Eight adult male Beagles. METHODS: The NWR was elicited using single transcutaneous electrical stimulation of the ulnar nerve. Repeated stimuli (10 pulses, 5 Hz) were applied to evoke TS. The responses of the deltoideus muscle were recorded and quantified by surface electromyography, and behavioural reactions were scored. Each dog received 0.01 mg kg(-1) ACP or an equal volume of saline intravenously (IV) at 1-week intervals. Measurements were performed before (baseline) and 20, 60 and 100 minutes after drug administration. Sedation was scored before drug administration and then at 10-minute intervals. Data were analyzed with Friedman repeated-measures analysis of variance on ranks and Wilcoxon signed-rank tests. RESULTS: Acepromazine resulted in mild tranquilization becoming obvious at 20 minutes and peaking 30 minutes after injection. Single-stimulus (I(t)) and repeated-stimulus (TS(t)) threshold intensities, NWR and TS characteristics, and behavioural responses were not affected by ACP at any time point. Both I(t) and TS(t) were stable over time. CONCLUSIONS AND CLINICAL RELEVANCE: In dogs, 0.01 mg kg(-1) ACP IV had no modulatory action on the NWR evoked by single or repeated stimuli, suggesting no antinociceptive activity against phasic nociceptive stimuli. The stability of the NWR thresholds supports the use of the model as an objective tool to investigate nociception in conscious dogs. A low dose of ACP, administered as the sole drug, can be used to facilitate recordings in anxious subjects without altering the validity of this model.
Abstract:
This dissertation explores phase I dose-finding designs in cancer trials from three perspectives: alternative Bayesian dose-escalation rules, a design based on a time-to-dose-limiting-toxicity (DLT) model, and a design based on a discrete-time multi-state (DTMS) model. We list alternative Bayesian dose-escalation rules and perform a simulation study for intra-rule and inter-rule comparisons based on two statistical models to identify the most appropriate rule under certain scenarios. We provide evidence that all the Bayesian rules outperform the traditional "3+3" design in the allocation of patients and the selection of the maximum tolerated dose. The design based on a time-to-DLT model uses patients' DLT information over multiple treatment cycles to estimate the probability of DLT at the end of treatment cycle 1. Dose-escalation decisions are made whenever a cycle-1 DLT occurs, or two months after the previous checkpoint. Compared with a design based on a logistic regression model, the new design shows greater safety benefits for trials in which more late-onset toxicities are expected. As a trade-off, the new design requires more patients on average. The design based on a discrete-time multi-state (DTMS) model has three important attributes: (1) toxicities are categorized over a distribution of severity levels, (2) early toxicity may inform dose escalation, and (3) no suspension is required between accrual cohorts. The proposed model accounts for the difference in importance of the toxicity severity levels and for transitions between toxicity levels. We compare the operating characteristics of the proposed design with those of a similar design based on a fully evaluated model that directly models the maximum observed toxicity level within the patients' entire assessment window. We describe settings in which, under comparable power, the proposed design shortens the trial. The proposed design offers more benefit compared with the alternative design as patient accrual becomes slower.
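The "3+3" rule used as the benchmark in this dissertation is easy to simulate. The sketch below is a simplified rendering of that rule (cohorts of three, expand on one DLT, stop on more than one DLT out of six), with hypothetical true DLT probabilities; it is not the dissertation's Bayesian or multi-state code.

```python
# Minimal, simplified simulation of the classical "3+3" dose-escalation rule.
# True DLT probabilities per dose level are hypothetical; the function returns
# the index of the dose declared as the MTD (-1 if the lowest dose is too toxic).
import random

def run_3plus3(true_dlt_probs, seed=None):
    rng = random.Random(seed)
    dose = 0
    mtd = -1
    while 0 <= dose < len(true_dlt_probs):
        # Treat a cohort of 3 patients at the current dose.
        dlts = sum(rng.random() < true_dlt_probs[dose] for _ in range(3))
        if dlts == 0:
            mtd = dose          # dose tolerated so far; escalate
            dose += 1
        elif dlts == 1:
            # Expand with 3 more patients at the same dose.
            dlts += sum(rng.random() < true_dlt_probs[dose] for _ in range(3))
            if dlts <= 1:
                mtd = dose      # at most 1/6 DLTs; escalate
                dose += 1
            else:
                break           # >1/6 DLTs: stop, MTD is the last tolerated dose
        else:
            break               # >=2/3 DLTs: stop, MTD is the last tolerated dose
    return mtd

# Example: five dose levels with rising toxicity.
print(run_3plus3([0.05, 0.10, 0.20, 0.35, 0.55], seed=1))
```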
Abstract:
Background: Aromatase deficiency may result in a complete block of estrogen synthesis because of the failure to convert androgens to estrogens. In females, this results in virilisation at birth, ovarian cysts in prepuberty, and a lack of pubertal development but virilisation thereafter. Objective and methods: We studied the impact of oral 17β-estradiol treatment on ovarian and uterine development, and on LH/FSH and inhibin B, during the long-term follow-up of a girl harboring compound heterozygous point mutations in the CYP19A1 gene. Results: In early childhood, low doses of oral 17β-estradiol were needed. During prepuberty, treatment with slowly increasing doses of E2 resulted in normal uterine development and almost normal development of ovarian volume, as well as of follicle number and size. Regarding hormonal feedback mechanisms, inhibin B levels were in the upper normal range during childhood and puberty. Low doses of estradiol did not suffice to achieve physiological gonadotropin levels in late prepuberty and puberty. However, when estradiol doses were further increased in late puberty, levels of both FSH and LH declined, with estradiol levels within the normal range. Conclusion: Complete aromatase deficiency provides an excellent model of how ovarian and uterine development, in relation to E2, LH, FSH and inhibin B feedback, progresses from infancy to adolescence.
Abstract:
PURPOSE Patients with biochemical failure (BF) after radical prostatectomy may benefit from dose-intensified salvage radiation therapy (SRT) of the prostate bed. We performed a randomized phase III trial assessing dose intensification. PATIENTS AND METHODS Patients with BF but without evidence of macroscopic disease were randomly assigned to either 64 or 70 Gy. Three-dimensional conformal radiation therapy or intensity-modulated radiation therapy/rotational techniques were used. The primary end point was freedom from BF. Secondary end points were acute toxicity according to the National Cancer Institute Common Terminology Criteria for Adverse Events (version 4.0) and quality of life (QoL) according to the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaires C30 and PR25. RESULTS Three hundred fifty patients were enrolled between February 2011 and April 2014. Three patients withdrew informed consent, and three patients were not eligible, resulting in 344 patients age 48 to 75 years in the safety population. Thirty patients (8.7%) had grade 2 and two patients (0.6%) had grade 3 genitourinary (GU) baseline symptoms. Acute grade 2 and 3 GU toxicity was observed in 22 patients (13.0%) and one patient (0.6%), respectively, with 64 Gy and in 29 patients (16.6%) and three patients (1.7%), respectively, with 70 Gy (P = .2). Baseline grade 2 GI toxicity was observed in one patient (0.6%). Acute grade 2 and 3 GI toxicity was observed in 27 patients (16.0%) and one patient (0.6%), respectively, with 64 Gy, and in 27 patients (15.4%) and four patients (2.3%), respectively, with 70 Gy (P = .8). Changes in early QoL were minor. Patients receiving 70 Gy reported a more pronounced and clinically relevant worsening in urinary symptoms (mean difference in change score between arms, 3.6; P = .02). CONCLUSION Dose-intensified SRT was associated with low rates of acute grade 2 and 3 GU and GI toxicity. The impact of dose-intensified SRT on QoL was minor, except for a significantly greater worsening in urinary symptoms.
Abstract:
Bacterial artificial chromosomes (BACs) and P1 artificial chromosomes (PACs), which contain large fragments of genomic DNA, have been successfully used as transgenes to create mouse models of dose-dependent diseases. They are also potentially valuable as transgenes for dominant diseases given that point mutations and/or small rearrangements can be accurately introduced. Here, we describe a new method to introduce small alterations in BACs, which results in the generation of point mutations with high frequency. The method involves homologous recombination between the original BAC and a shuttle vector providing the mutation. Each recombination step is monitored using positive and negative selection markers, which are the Kanamycin-resistance gene, the sacB gene and temperature-sensitive replication, all conferred by the shuttle plasmid. We have used this method to introduce four different point mutations and the insertion of the β-galactosidase gene in a BAC, which has subsequently been used for transgenic animal production.
Abstract:
Objective To determine the pharmacokinetics of carboplatin in sulphur-crested cockatoos, so that its use in clinical studies in birds can be considered. Design A pharmacokinetic study of carboplatin, following a single intravenous (IV) or intraosseous (IO) infusion over 3 min, was performed in six healthy sulphur-crested cockatoos (Cacatua galerita). Procedure Birds were anaesthetised and a jugular vein cannulated for blood collection. Carboplatin (5 mg/kg) was infused over 3 min by the IV route in four birds via the contralateral jugular vein, and by the IO route in two birds via the ulna. Serial blood samples were collected for 96 h after initiation of the infusion. Tissue samples from 11 organs were obtained at necropsy, 96 h after carboplatin administration. Total Pt and filterable Pt in plasma and tissue Pt concentrations were assayed by inductively coupled plasma-mass spectrometry. A noncompartmental pharmacokinetic analysis was performed on the plasma data. Results The mean +/- SD Cmax of filterable Pt was 27.3 +/- 4.06 mg/L and in all six birds occurred at the end of the 3 min infusion, thereafter declining exponentially over the next 6 h to an average concentration of 0.128 +/- 0.065 mg/L. The terminal half-life (t1/2) was 1.0 +/- 0.17 h, the systemic clearance (Cl) was 5.50 +/- 1.06 mL/min/kg and the volume of distribution (Vss) was 0.378 +/- 0.073 L/kg. The extrapolated area under the curve (AUC(0-∞)) was 0.903 +/- 0.127 mg/mL·min; the area extrapolated past the last (6 h) data point to infinite time averaged only 1.25% of the total AUC(0-∞). The kidneys had the greatest accumulation of Pt (7.04 +/- 3.006 µg/g), followed by the liver (3.08 +/- 1.785 µg/g DM). Conclusions and clinical relevance Carboplatin infusion in sulphur-crested cockatoos produced mild, transient alimentary tract signs, and the Pt plasma concentration was similar whether carboplatin was given intravenously or intraosseously. Filterable plasma Pt concentrations for carboplatin persisted longer than for cisplatin, due mostly to the difference in systemic clearance between these drugs in sulphur-crested cockatoos. The distribution of tissue Pt after carboplatin administration was similar to that reported for cisplatin in sulphur-crested cockatoos. Despite anatomical, physiological and biochemical differences among animal species, the pharmacokinetic disposition of filterable Pt in the sulphur-crested cockatoo shares some features with the kinetics reported previously in other animals and human beings.
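The noncompartmental analysis reported here rests on standard formulas: a trapezoidal AUC, a terminal log-linear fit for lambda_z and t1/2, CL = dose/AUC(0-∞), and Vss = CL x MRT. The sketch below illustrates those calculations on hypothetical concentration-time data (not the cockatoo measurements); the helper name and sampling times are assumptions for illustration.

```python
# Noncompartmental PK sketch: trapezoidal AUC, terminal slope, clearance, Vss.
# The concentration data below are hypothetical, not the carboplatin values.
import math

def nca(times_h, conc_mg_per_L, dose_mg_per_kg, n_terminal=3):
    # AUC(0-tlast) by the linear trapezoidal rule
    auc = sum((conc_mg_per_L[i] + conc_mg_per_L[i + 1]) / 2 * (times_h[i + 1] - times_h[i])
              for i in range(len(times_h) - 1))
    # Terminal rate constant (lambda_z) from a log-linear fit to the last points
    t = times_h[-n_terminal:]
    lnc = [math.log(c) for c in conc_mg_per_L[-n_terminal:]]
    t_mean, lnc_mean = sum(t) / len(t), sum(lnc) / len(lnc)
    slope = (sum((ti - t_mean) * (ci - lnc_mean) for ti, ci in zip(t, lnc))
             / sum((ti - t_mean) ** 2 for ti in t))
    lam_z = -slope
    t_half = math.log(2) / lam_z
    auc_inf = auc + conc_mg_per_L[-1] / lam_z          # extrapolate to infinity
    cl = dose_mg_per_kg / auc_inf                      # L/h/kg
    # AUMC for mean residence time, then Vss = CL * MRT
    aumc = sum(((times_h[i] * conc_mg_per_L[i] + times_h[i + 1] * conc_mg_per_L[i + 1]) / 2)
               * (times_h[i + 1] - times_h[i]) for i in range(len(times_h) - 1))
    aumc_inf = (aumc + times_h[-1] * conc_mg_per_L[-1] / lam_z
                + conc_mg_per_L[-1] / lam_z ** 2)
    vss = cl * aumc_inf / auc_inf
    return dict(auc_inf=auc_inf, t_half=t_half, cl=cl, vss=vss)

# Hypothetical plasma concentrations (mg/L) after a 5 mg/kg infusion.
print(nca([0.05, 0.5, 1, 2, 4, 6], [27.0, 14.0, 7.5, 2.0, 0.55, 0.13], 5.0))
```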
Abstract:
Background: Renal transplant recipients were noted to appear cushingoid while on low doses of steroid as part of triple-therapy immunosuppression with cyclosporin A (CsA), prednisolone, and azathioprine. Methods: The study group comprised adult renal transplant recipients with stable graft function who had received their renal allograft a minimum of 1 year previously (43 studies undertaken in 22 men and 20 women), with a median daily prednisone dose of 7 mg (range 3-10). The control group was healthy nontransplant subjects [median dose 10 mg (10-30)]. Prednisolone bioavailability was measured using a limited 6-hour area under the curve (AUC), with prednisolone measured using a specific HPLC assay. Results: The median prednisolone AUC/mg dose for all transplant recipients was significantly greater than that of the control group, by approximately 50% (316 nmol·h/L/mg prednisolone versus 218). AUC was significantly higher in female recipients (median 415 versus 297 for men) and in recipients receiving cyclosporin (348 versus 285). The highest AUC was in women on estrogen supplements who were receiving cyclosporin (median 595). A significantly higher proportion of patients on triple therapy had steroid side effects compared with those on steroid and azathioprine (17/27 versus 4/15), more women than men had side effects (14/16 versus 7/22), and the AUC/mg prednisone was greater in those with side effects than in those without (median 377 versus 288 nmol·h/L/mg). Discussion: The results are consistent with the hypothesis that CsA increases the bioavailability of prednisolone, most likely through inhibition of P-glycoprotein. The increased exposure to steroid increased the side-effect profile of steroids in the majority of patients. Because the major contributor to AUC is the maximum postdose concentration, it may be possible to use single-point monitoring (2 hours postdose) for routine clinical studies.
Abstract:
It is unclear whether a random plasma cortisol measurement and the corticotropin (ACTH) test adequately reflect glucocorticoid secretory capacity in critical illness. This study aimed to determine whether these tests provide information representative of the 24-hour period. Plasma cortisol was measured hourly for 24 hours in 21 critically ill septic patients, followed by a corticotropin test with a 1 μg dose administered intravenously. Serum and urine were analysed for ACTH and free cortisol, respectively. Marked hourly variability in plasma cortisol was evident (coefficient of variation 8-30%) with no demonstrable circadian rhythm. The individual mean plasma cortisol concentrations ranged from 286 ± 59 nmol/l to 796 ± 83 nmol/l. The 24-hour mean plasma cortisol was strongly correlated with both the random plasma cortisol (r(2) = 0.9, P < 0.0001) and the cortisol response to corticotropin (r(2) = 0.72, P < 0.001). Only nine percent of patients increased their plasma cortisol by 250 nmol/l after corticotropin (euadrenal response). However, 35% of non-responders had spontaneous hourly rises > 250 nmol/l, thus highlighting the limitations of a single-point corticotropin test. Urinary free cortisol was elevated (865 ± 937 nmol) in both corticotropin responders and non-responders, suggesting elevated plasma free cortisol. No significant relationship was demonstrable between plasma cortisol and ACTH. We conclude that although random cortisol measurements and the low-dose corticotropin test reliably reflect the 24-hour mean cortisol in critical illness, they do not take into account the pulsatile nature of cortisol secretion. Consequently, there is the potential for erroneous conclusions about adrenal function based on a single measurement. We suggest that caution be exercised when drawing conclusions on the adequacy of adrenal function based on a single random plasma cortisol or the corticotropin test.
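The summary statistics quoted here (hourly coefficient of variation, correlation of a single random sample with the 24-hour mean) are straightforward to compute. The sketch below illustrates them on simulated hourly cortisol values; the per-patient mean levels and the 15% within-patient variability are assumptions for illustration, not the study data.

```python
# Sketch of the descriptive statistics behind the cortisol study: hourly CV per
# patient and the correlation between a single random sample and the 24-h mean.
# All cortisol values below are simulated, not the study data.
import random
import statistics

def pearson_r(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

rng = random.Random(0)
patients = []
for _ in range(21):                               # 21 hypothetical patients
    patient_mean = rng.uniform(300, 800)          # nmol/L, patient-specific level
    patients.append([rng.gauss(patient_mean, 0.15 * patient_mean) for _ in range(24)])

for i, series in enumerate(patients[:3], start=1):
    cv = statistics.stdev(series) / statistics.mean(series) * 100
    print(f"patient {i}: mean {statistics.mean(series):.0f} nmol/L, hourly CV {cv:.1f}%")

# One randomly timed sample per patient versus that patient's 24-hour mean.
random_sample = [s[rng.randrange(24)] for s in patients]
daily_mean = [statistics.mean(s) for s in patients]
print(f"r^2 (random sample vs 24-h mean): {pearson_r(random_sample, daily_mean) ** 2:.2f}")
```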
Abstract:
Objectives: Hospital discharge is a transition of care where medication discrepancies are likely to occur and can potentially cause patient harm. The purpose of our study was to assess the prescribing accuracy of hospital discharge medication orders at a London, UK teaching hospital. The timeliness of the discharge summary reaching the general practitioner (GP, family physician) was also assessed, based on the 72 h target referenced in the Care Quality Commission report.1 Method: 501 consecutive discharge medication orders from 142 patients were examined and the following records were compared: (1) the final inpatient drug chart at the point of discharge, (2) the printed, signed copy of the initial to-take-away (TTA) discharge summary produced electronically by the physician, (3) the pharmacist's handwritten amendments on the initial TTA, (4) the final electronic patient discharge summary record, and (5) the patient's final take-home medication from the hospital. Discrepancies between the physician's orders and the pharmacist's changes were classified into two types of failure – 'failure to make a required change' and 'change where none was required'. Once the patient was discharged, the patient's GP was contacted 72 h after discharge to establish whether the discharge summary, sent by post or via email, had been received. Results: Over half of the patients seen (73 out of 142) had at least one discrepancy that was made on the initial TTA by the doctor and amended by the pharmacist. Out of the 501 drugs, there were 140 discrepancies: 108 were 'failures to make a required change' (77%) and 32 were 'changes where none were required' (23%). The types of 'failure to make a required change' discrepancies found between the initial TTA and the pharmacist's amendments were paracetamol and ibuprofen changes (dose banding), 38 (27%); directions of use, 34 (24%); incorrect formulation of medication, 28 (20%); and incorrect strength, 8 (6%). The types of 'change where none was required' discrepancies were omitted medication, 15 (11%); unnecessary drug, 14 (10%); and incorrect medicine, including spelling mistakes, 3 (2%). After contacting the GPs of the discharged patients 72 h post-discharge, 49% had received the discharge summary and 45% had not; the remaining 6% were patients who were discharged without a GP. Conclusion: This study shows that doctor prescribing at discharge is often not accurate, and that interventions made by the pharmacist to reconcile medications are important at this point of care. It was also found that half of the discharge summaries had not reached the patient's family physician (according to the GP) within 72 h.
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility to optimize the radiation dose used in CT examinations. The key to dose optimization is to determine the minimum amount of radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and image-based noise-addition software for protocol optimization.
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, these studies used the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
With effective quantification of organ dose under constant tube current conditions, Chapter 4 aims to extend the organ dose-prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with the dose estimated from Monte Carlo simulations in which the TCM function was explicitly modeled.
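The convolution-based idea referred to here can be illustrated in a highly simplified form: a z-axis tube current (mAs) profile is convolved with a longitudinal dose-spread kernel to approximate the dose delivered at each slice position. The kernel shape, mAs values, normalization factor, and function names in the sketch below are hypothetical assumptions, and the sketch is not the thesis implementation.

```python
# Highly simplified illustration of a convolution-based dose estimate under
# tube current modulation (TCM): the per-slice mAs profile is convolved with a
# longitudinal dose-spread kernel to approximate dose at each z position.
# Kernel shape, mAs profile, and dose_per_mas are hypothetical placeholders.
import math

def gaussian_kernel(half_width_slices=5, sigma=2.0):
    ks = [math.exp(-0.5 * (i / sigma) ** 2)
          for i in range(-half_width_slices, half_width_slices + 1)]
    s = sum(ks)
    return [k / s for k in ks]                     # normalize to unit area

def convolve_dose(mas_profile, kernel, dose_per_mas=0.01):
    """Approximate dose (arbitrary units) at each slice from the TCM mAs profile."""
    half = len(kernel) // 2
    doses = []
    for z in range(len(mas_profile)):
        acc = 0.0
        for k, w in enumerate(kernel):
            src = z + k - half
            if 0 <= src < len(mas_profile):        # truncate outside the scan range
                acc += w * mas_profile[src]
        doses.append(acc * dose_per_mas)
    return doses

# Hypothetical chest TCM profile: lower mAs over the lungs, higher at shoulders/abdomen.
mas = [220, 210, 180, 140, 120, 110, 115, 130, 160, 200, 230]
print([round(d, 3) for d in convolve_dose(mas, gaussian_kernel())])
```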
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (DoseWatch, GE Healthcare, Waukesha, WI). In the first phase of the study we focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
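The image-based noise-addition idea in step (1) can be illustrated with a minimal sketch: to emulate a scan at a fraction f of the acquired dose, zero-mean noise is added so that the total variance scales as 1/f, i.e. sigma_added^2 = sigma_full^2 (1/f - 1), assuming quantum noise dominates and scales inversely with dose. The white-noise model, the noise magnitude, and the function name below are simplifying assumptions; the thesis's actual technique is likely more sophisticated, so this is only a conceptual sketch.

```python
# Conceptual sketch of image-based noise addition for simulating reduced-dose CT.
# Assumes quantum noise dominates and its variance scales inversely with dose;
# sigma_full and the white-noise model are illustrative assumptions.
import random

def simulate_reduced_dose(image_hu, sigma_full=10.0, dose_fraction=0.5, seed=0):
    rng = random.Random(seed)
    # Added-noise standard deviation so that total variance scales as 1/dose_fraction.
    sigma_added = sigma_full * ((1.0 / dose_fraction) - 1.0) ** 0.5
    return [[px + rng.gauss(0.0, sigma_added) for px in row] for row in image_hu]

# Hypothetical 3x3 patch of HU values, simulated at 50% of the acquired dose.
patch = [[40, 42, 41], [39, 45, 43], [38, 40, 44]]
print(simulate_reduced_dose(patch, sigma_full=10.0, dose_fraction=0.5))
```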
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Abstract:
BACKGROUND: The neonatal and pediatric antimicrobial point prevalence survey (PPS) of the Antibiotic Resistance and Prescribing in European Children project (http://www.arpecproject.eu/) aims to standardize a method for surveillance of antimicrobial use in children and neonates admitted to the hospital within Europe. This article describes the audit criteria used and reports overall country-specific proportions of antimicrobial use. An analytical review presents methodologies on antimicrobial use.
METHODS: A 1-day PPS on antimicrobial use in hospitalized children was organized in September 2011, using a previously validated and standardized method. The survey included all inpatient pediatric and neonatal beds and identified all children receiving an antimicrobial treatment on the day of survey. Mandatory data were age, gender, (birth) weight, underlying diagnosis, antimicrobial agent, dose and indication for treatment. Data were entered through a web-based system for data-entry and reporting, based on the WebPPS program developed for the European Surveillance of Antimicrobial Consumption project.
RESULTS: A total of 2760 pediatric and 1154 neonatal inpatients were reported from 50 European hospitals (14 countries), and 1565 pediatric and 589 neonatal inpatients from 23 non-European hospitals (9 countries). Overall, pediatric and neonatal antibiotic use was significantly higher in non-European hospitals (43.8%; 95% confidence interval [CI]: 41.3-46.3% and 39.4%; 95% CI: 35.5-43.4%) than in European hospitals (35.4%; 95% CI: 33.6-37.2% and 21.8%; 95% CI: 19.4-24.2%). Proportions of antibiotic use were highest in hematology/oncology wards (61.3%; 95% CI: 56.2-66.4%) and pediatric intensive care units (55.8%; 95% CI: 50.3-61.3%).
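The confidence intervals quoted for these proportions are of the standard Wald type. As an illustration, the sketch below reproduces an interval close to the quoted 43.8% (95% CI: 41.3-46.3%), assuming a numerator of 686 among the 1565 non-European pediatric inpatients; the exact numerator is an assumption, not a figure reported in the survey.

```python
# Wald-type 95% confidence interval for a proportion, of the kind quoted for the
# antimicrobial-use percentages. The numerator below is an assumed illustration.
from math import sqrt

def proportion_ci(successes, n, z=1.96):
    p = successes / n
    half_width = z * sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Assumed count consistent with 43.8% among 1565 non-European pediatric inpatients.
p, lo, hi = proportion_ci(686, 1565)
print(f"{p:.1%} (95% CI: {lo:.1%}-{hi:.1%})")
```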
CONCLUSIONS: An Antibiotic Resistance and Prescribing in European Children standardized web-based method for a 1-day PPS was successfully developed and conducted in 73 hospitals worldwide. It offers a simple, feasible and sustainable way of data collection that can be used globally.