7 results for Design quality
at Duke University
Abstract:
BACKGROUND: The Lung Cancer Exercise Training Study (LUNGEVITY) is a randomized trial to investigate the efficacy of different types of exercise training on cardiorespiratory fitness (VO2peak), patient-reported outcomes, and the organ components that govern VO2peak in post-operative non-small cell lung cancer (NSCLC) patients. METHODS/DESIGN: Using a single-center, randomized design, 160 subjects (40 patients/study arm) with histologically confirmed stage I-IIIA NSCLC following curative-intent complete surgical resection at Duke University Medical Center (DUMC) will be potentially eligible for this trial. Following baseline assessments, eligible participants will be randomly assigned to one of four conditions: (1) aerobic training alone, (2) resistance training alone, (3) the combination of aerobic and resistance training, or (4) attention-control (progressive stretching). The ultimate goal for all exercise training groups will be three supervised exercise sessions per week, at an intensity above 70% of the individually determined VO2peak for aerobic training and an intensity between 60% and 80% of one-repetition maximum for resistance training, for 30-45 minutes/session. Progressive stretching will be matched to the exercise groups in terms of program length (i.e., 16 weeks), social interaction (participants will receive one-on-one instruction), and duration (30-45 mins/session). The primary study endpoint is VO2peak. Secondary endpoints include patient-reported outcomes (PROs) (e.g., quality of life, fatigue, depression) and organ components of the oxygen cascade (i.e., pulmonary function, cardiac function, skeletal muscle function). All endpoints will be assessed at baseline and postintervention (16 weeks). Substudies will include genetic studies regarding individual responses to an exercise stimulus, theoretical determinants of exercise adherence, examination of the psychological mediators of the exercise-PRO relationship, and exercise-induced changes in gene expression. DISCUSSION: VO2peak is becoming increasingly recognized as an outcome of major importance in NSCLC. LUNGEVITY will identify the optimal form of exercise training for NSCLC survivors as well as provide insight into the physiological mechanisms underlying this effect. Overall, this study will contribute to the establishment of clinical exercise therapy rehabilitation guidelines for patients across the entire NSCLC continuum. TRIAL REGISTRATION: NCT00018255.
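As a back-of-the-envelope illustration (not part of the trial protocol), the sketch below shows how the prescribed intensities would translate into individual training targets; the baseline VO2peak and one-repetition-maximum values are hypothetical.

    # Illustrative only: turning the LUNGEVITY intensity prescriptions into
    # per-participant training targets (baseline values below are made up).

    def aerobic_target(vo2peak_ml_kg_min, fraction=0.70):
        # Aerobic target: a fraction of the individually determined VO2peak.
        return fraction * vo2peak_ml_kg_min

    def resistance_load_range(one_rep_max_kg, low=0.60, high=0.80):
        # Resistance load range: 60-80% of one-repetition maximum.
        return low * one_rep_max_kg, high * one_rep_max_kg

    print(aerobic_target(22.5))            # -> 15.75 ml/kg/min (hypothetical VO2peak)
    print(resistance_load_range(80.0))     # -> (48.0, 64.0) kg (hypothetical 1RM)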
Abstract:
BACKGROUND: The Exercise Intensity Trial (EXcITe) is a randomized trial to compare the efficacy of supervised moderate-intensity aerobic training to moderate- to high-intensity aerobic training, relative to attention control, on aerobic capacity, physiologic mechanisms, patient-reported outcomes, and biomarkers in women with operable breast cancer following the completion of definitive adjuvant therapy. METHODS/DESIGN: Using a single-center, randomized design, 174 postmenopausal women (58 patients/study arm) with histologically confirmed, operable breast cancer presenting to Duke University Medical Center (DUMC) will be enrolled in this trial following completion of primary therapy (including surgery, radiation therapy, and chemotherapy). After baseline assessments, eligible participants will be randomized to one of two supervised aerobic training interventions (moderate-intensity or moderate- to high-intensity aerobic training) or an attention-control group (progressive stretching). The aerobic training interventions will consist of 150 minutes per week of supervised treadmill walking at an intensity of 60%-70% (moderate-intensity) or 60%-100% (moderate- to high-intensity) of the individually determined peak oxygen consumption (VO₂peak), for 20-45 minutes/session, for 16 weeks. The progressive stretching program will be matched to the exercise interventions in terms of program length (16 weeks), social interaction (participants will receive one-on-one instruction), and duration (20-45 mins/session). The primary study endpoint is VO₂peak, as measured by an incremental cardiopulmonary exercise test. Secondary endpoints include physiologic determinants that govern VO₂peak, patient-reported outcomes, and biomarkers associated with breast cancer recurrence/mortality. All endpoints will be assessed at baseline and after the intervention (16 weeks). DISCUSSION: EXcITe is designed to investigate the intensity of aerobic training required to induce optimal improvements in VO₂peak and other pertinent outcomes in women who have completed definitive adjuvant therapy for operable breast cancer. Overall, this trial will inform and refine exercise guidelines to optimize recovery in breast and other cancer survivors following the completion of primary cytotoxic therapy. TRIAL REGISTRATION: NCT01186367.
Abstract:
BACKGROUND: Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. METHODS AND PRINCIPAL FINDINGS: The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. CONCLUSIONS: Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks.
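For context, the source-to-database error rate cited above is simply a discrepancy count normalized to 10,000 fields; the sketch below shows the arithmetic with made-up audit counts chosen to reproduce a rate of 14.3.

    # Minimal sketch (not the Network's actual tooling): a source-to-database
    # error rate expressed per 10,000 audited fields.

    def error_rate_per_10k(discrepant_fields, fields_audited):
        if fields_audited == 0:
            raise ValueError("no fields audited")
        return 10_000 * discrepant_fields / fields_audited

    # Hypothetical audit: 43 discrepancies found across 30,000 audited fields.
    print(round(error_rate_per_10k(43, 30_000), 1))   # -> 14.3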
Abstract:
Ambient sampling for the Pittsburgh Air Quality Study (PAQS) was conducted from July 2001 to September 2002. The study was designed (1) to characterize particulate matter (PM) by examination of size, surface area, and volume distribution; chemical composition as a function of size and on a single-particle basis; morphology; and temporal and spatial variability in the Pittsburgh region; (2) to quantify the impact of various sources (transportation, power plants, biogenic sources, etc.) on aerosol concentrations in the area; and (3) to develop and evaluate the next generation of atmospheric aerosol monitoring and modeling techniques. The PAQS objectives, study design, site descriptions, and routine and intensive measurements are presented. Special study days are highlighted, including those associated with elevated concentrations of daily average PM2.5 mass. Monthly average and diurnal patterns in aerosol number concentration; aerosol nitrate, sulfate, elemental carbon, and organic carbon concentrations; light scattering; and gas-phase ozone, nitrogen oxides, and carbon monoxide are discussed, with emphasis on the processes affecting them. Preliminary findings reveal day-to-day variability in aerosol mass and composition, but consistency in seasonal average diurnal profiles and concentrations. For example, the seasonal average variations in the diurnal PM2.5 mass were predominantly driven by the sulfate component.
Abstract:
Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both computer science theory and practice. In this thesis, we design algorithms that optimize flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges in scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.
The technically interesting aspect of our work lies in the surprising connections we establish between approximation and online algorithms, economics, game theory, and queueing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.
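For readers unfamiliar with the objective, flow-time is the time a job spends in the system; the sketch below spells out the quantities the thesis optimizes, using invented job data purely for illustration.

    # Flow-time of a job j: F_j = C_j - r_j, where r_j is its release time and
    # C_j its completion time under a given schedule. The thesis studies the
    # average (weighted) flow-time and the maximum flow-time. Job data below
    # are invented purely to illustrate the definitions.

    jobs = [  # (release_time, completion_time, weight)
        (0, 4, 1.0),
        (1, 3, 2.0),
        (2, 9, 1.0),
    ]

    flow_times = [c - r for r, c, _ in jobs]                      # [4, 2, 7]
    avg_weighted_flow = (sum(w * (c - r) for r, c, w in jobs)
                         / sum(w for _, _, w in jobs))            # 3.75
    max_flow = max(flow_times)                                    # 7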
The main contributions of the thesis can be placed in one of the following categories.
1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and minimizing the maximum flow-time in the offline setting. In the online, non-clairvoyant setting, we design the first non-clairvoyant algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique for offline flow-time optimization and gives the first framework for analyzing non-clairvoyant algorithms for unrelated machines.
2. Polytope Scheduling Problem: To capture the multidimensional nature of the scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP problem generalizes almost all classical scheduling models, and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multi-dimensional resource allocation. We design several competitive algorithms for the PSP problem and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, fairness and non-clairvoyant scheduling, and the queueing-theoretic notion of stability and resource augmentation analysis.
3. Energy Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online and resource augmentation model for the most general setting of unrelated machines.
4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch, and show that for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first linear/convex programming duality-based framework to bound the price of anarchy for general equilibrium concepts such as coarse correlated equilibrium.
Abstract:
BACKGROUND: The Affordable Care Act encourages healthcare systems to integrate behavioral and medical healthcare, as well as to employ electronic health records (EHRs) for health information exchange and quality improvement. Pragmatic research paradigms that employ EHRs in research are needed to produce clinical evidence in real-world medical settings for informing learning healthcare systems. Adults with comorbid diabetes and substance use disorders (SUDs) tend to use costly inpatient treatments; however, there is a lack of empirical data on implementing behavioral healthcare to reduce health risk in adults with high-risk diabetes. Given the complexity of high-risk patients' medical problems and the cost of conducting randomized trials, a feasibility project is warranted to guide practical study designs. METHODS: We describe the study design, which explores the feasibility of implementing substance use Screening, Brief Intervention, and Referral to Treatment (SBIRT) among adults with high-risk type 2 diabetes mellitus (T2DM) within a home-based primary care setting. Our study includes the development of an integrated EHR datamart to identify eligible patients and collect diabetes healthcare data, and the use of a geographic health information system to understand the social context in patients' communities. Analysis will examine recruitment, proportion of patients receiving brief intervention and/or referrals, substance use, SUD treatment use, diabetes outcomes, and retention. DISCUSSION: By capitalizing on an existing T2DM project that uses home-based primary care, our study results will provide timely clinical information to inform the designs and implementation of future SBIRT studies among adults with multiple medical conditions.
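To make the datamart idea concrete, the sketch below shows the kind of eligibility filter an integrated EHR datamart could support; the table layout, column names, and the high-risk HbA1c cutoff are hypothetical and are not taken from the study protocol.

    # Purely illustrative: filtering an EHR-derived datamart for potentially
    # eligible patients. All names and thresholds here are hypothetical.
    import pandas as pd

    datamart = pd.DataFrame({
        "patient_id": [101, 102, 103],
        "t2dm_diagnosis": [True, True, True],
        "hba1c_pct": [9.4, 7.1, 10.2],
        "home_based_primary_care": [True, True, False],
    })

    eligible = datamart[
        datamart["t2dm_diagnosis"]
        & (datamart["hba1c_pct"] >= 9.0)          # assumed "high-risk" cutoff
        & datamart["home_based_primary_care"]
    ]
    print(eligible["patient_id"].tolist())         # -> [101]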
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has prompted substantial efforts within the imaging community to manage and optimize CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus the responsibility of the imaging community to optimize radiation dose for CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.
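As a toy illustration of the optimization principle (not a model from this thesis), assume quantum noise scales inversely with the square root of the tube-current-time product; the minimum mAs that still meets a noise target then follows directly.

    # Toy dose-optimization sketch: find the lowest mAs whose predicted noise
    # still meets the image-quality target. The 1/sqrt(mAs) noise model and
    # all numbers are simplifying assumptions.
    import math

    def predicted_noise(mas, noise_at_ref_hu, ref_mas=200.0):
        return noise_at_ref_hu * math.sqrt(ref_mas / mas)

    def minimum_mas(noise_target_hu, noise_at_ref_hu, ref_mas=200.0):
        # Solve noise_target = noise_at_ref * sqrt(ref_mas / mas) for mas.
        return ref_mas * (noise_at_ref_hu / noise_target_hu) ** 2

    print(round(minimum_mas(15.0, 10.0), 1))   # -> 88.9 mAs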
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, this study uses the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.
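A schematic version of a convolution-style estimate is sketched below: a z-axis tube-current profile is smoothed by a normalized kernel to approximate the dose field, then averaged over an organ's z-extent. The kernel shape, modulation profile, and organ extent are all invented; the actual technique in Chapter 4 is considerably more detailed.

    # Schematic only: convolution-based estimate of relative organ dose under
    # tube current modulation. Profile, kernel, and organ extent are invented.
    import numpy as np

    z = np.arange(0, 40)                                 # slice index along the scan
    ma_profile = 150 + 100 * np.sin(z / 6.0) ** 2        # hypothetical TCM mA values
    kernel = np.exp(-np.abs(np.arange(-5, 6)) / 2.0)
    kernel /= kernel.sum()                               # normalized spread along z

    dose_field = np.convolve(ma_profile, kernel, mode="same")   # relative dose per slice

    organ_mask = (z >= 10) & (z <= 18)                   # hypothetical organ z-extent
    organ_dose_rel = float(dose_field[organ_mask].mean())
    print(round(organ_dose_rel, 1))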
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (DoseWatch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations: the patient's major body landmark information was extracted from the scout image in order to match each clinical patient against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
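A simplified version of the monitoring calculation might look like the sketch below: a size-dependent organ dose coefficient (organ dose per unit CTDIvol) is looked up for the matched phantom and scaled by the exam's reported CTDIvol. The exponential size dependence and the fit parameters are placeholders, not values from Chapter 3.

    # Simplified monitoring step: organ dose = coefficient(size) * CTDIvol.
    # Coefficient form and parameters below are placeholders.
    import math

    coeff_table = {"liver": (3.2, 0.037)}   # hypothetical (a, b) fit parameters

    def organ_dose_mgy(ctdivol_mgy, organ, patient_diameter_cm, table):
        a, b = table[organ]
        coefficient = a * math.exp(-b * patient_diameter_cm)   # assumed size dependence
        return coefficient * ctdivol_mgy

    print(round(organ_dose_mgy(12.0, "liver", 30.0, coeff_table), 1))   # mGy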
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this work accurately assessed quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
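One simple way to gauge quantum noise in a clinical image (a sketch only, not necessarily the technique developed in Chapter 6) is to take the standard deviation of HU values in small, approximately homogeneous regions.

    # Sketch: estimate image noise as the HU standard deviation in small,
    # approximately uniform regions of interest. Synthetic data for illustration.
    import numpy as np

    def roi_noise_hu(image_hu, rois):
        # rois: list of (row_slice, col_slice) over approximately uniform tissue
        return float(np.mean([np.std(image_hu[r, c]) for r, c in rois]))

    rng = np.random.default_rng(0)
    image = 50.0 + rng.normal(0.0, 12.0, size=(256, 256))   # "uniform tissue" + noise
    rois = [(slice(10, 30), slice(10, 30)), (slice(100, 120), slice(200, 220))]
    print(round(roi_noise_hu(image, rois), 1))               # close to the injected 12 HU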
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
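Step (1) of this strategy is commonly approximated in the image domain by injecting additional noise scaled to the simulated dose fraction; the flat-noise sketch below shows the basic scaling, whereas realistic implementations also shape the inserted noise by the local noise power spectrum and attenuation.

    # If quantum noise scales as 1/sqrt(dose), simulating a dose fraction f from a
    # full-dose image requires adding zero-mean noise with
    # sigma_add = sigma_full * sqrt(1/f - 1). Flat (white) noise here is a
    # simplification of image-based noise insertion.
    import numpy as np

    def simulate_low_dose(image_hu, sigma_full_hu, dose_fraction, rng=None):
        rng = rng or np.random.default_rng()
        sigma_add = sigma_full_hu * np.sqrt(1.0 / dose_fraction - 1.0)
        return image_hu + rng.normal(0.0, sigma_add, size=image_hu.shape)

    full_dose = np.full((4, 4), 40.0)                              # toy image
    half_dose = simulate_low_dose(full_dose, sigma_full_hu=10.0, dose_fraction=0.5)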
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.