Abstract:
AIMS/HYPOTHESIS To investigate exercise-related fuel metabolism in intermittent high-intensity (IHE) and continuous moderate-intensity (CONT) exercise in individuals with type 1 diabetes mellitus. METHODS In a prospective randomised open-label cross-over trial, twelve male individuals with well-controlled type 1 diabetes underwent a 90 min iso-energetic cycling session at 50% of maximal oxygen consumption (V̇O₂max), with (IHE) or without (CONT) interspersed 10 s sprints every 10 min, without insulin adaptation. Euglycaemia was maintained using oral ¹³C-labelled glucose. ¹³C magnetic resonance spectroscopy (MRS) served to quantify hepatocellular and intramyocellular glycogen. Measurements of glucose kinetics (stable isotopes), hormones and metabolites complemented the investigation. RESULTS Glucose and insulin levels were comparable between interventions. Exogenous glucose requirements during the last 30 min of exercise were significantly lower in IHE (p = 0.02). Hepatic glucose output did not differ significantly between interventions, but glucose disposal was significantly lower in IHE (p < 0.05). There was no significant difference in glycogen consumption. Growth hormone, catecholamine and lactate levels were significantly higher in IHE (p < 0.05). CONCLUSIONS/INTERPRETATION IHE in individuals with type 1 diabetes without insulin adaptation reduced exogenous glucose requirements compared with CONT. The difference was not related to increased hepatic glucose output, nor to enhanced muscle glycogen utilisation, but to decreased glucose uptake. The lower glucose disposal in IHE implies a shift towards consumption of alternative substrates. These findings indicate a high flexibility of exercise-related fuel metabolism in type 1 diabetes, and point towards a novel and potentially beneficial role of IHE in these individuals.
TRIAL REGISTRATION ClinicalTrials.gov NCT02068638. FUNDING Swiss National Science Foundation (grant number 320030_149321) and R&A Scherbarth Foundation (Switzerland).
Abstract:
Investigations focused on the effects of light on the allocation of root-borne macronutrients (calcium, magnesium and potassium) and micronutrients (iron, manganese, zinc and copper) in roots, shoots and harvested grains of wheat (Triticum aestivum L.). Plants were exposed to low (100 μmol photons m⁻² s⁻¹) or high (380 μmol photons m⁻² s⁻¹) light. High light stimulated both root and shoot growth. While the total per-plant contents of some nutrients were markedly higher (calcium and potassium) or lower (copper) under high light, no major differences were observed for other nutrients. The distribution of nutrients, and their further redistribution within the shoot, was influenced by light intensity in an element-specific manner. Nutrients were selectively directed to the leaves of the main shoot (low light) or to the tillers (high light). The quality of the harvested grains was also affected by light intensity.
Abstract:
The purpose of this work was to develop a comprehensive IMSRT QA procedure that examined, using EPID dosimetry and Monte Carlo (MC) calculations, each step in the treatment planning and delivery process. These steps included verification of the field shaping, treatment planning system (RTPS) dose calculations, and patient dose delivery. Verification of each step in the treatment process is assumed to result in correct dose delivery to the patient.

The accelerator MC model was verified against commissioning data for field sizes from 0.8 × 0.8 cm² to 10 × 10 cm². Depth doses were within 2% local percent difference (LPD) in low-gradient regions and 1 mm distance to agreement (DTA) in high-gradient regions. Lateral profiles were within 2% LPD in low-gradient regions and 1 mm DTA in high-gradient regions. Calculated output factors were within 1% of measurement for field sizes ≥1 × 1 cm².

The measured and calculated pretreatment EPID dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass for successful verification. Pretreatment field verification resulted in 97% of the points passing.

The RTPS and Monte Carlo phantom dose calculations were compared using 5% LPD, 2 mm DTA, or 2% of the maximum dose, with ≥95% of compared points required to pass for successful verification. RTPS calculation verification resulted in 97% of the points passing.

The measured and calculated EPID exit dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass for successful verification. Exit dose verification resulted in 97% of the points passing.

Each of the processes above verified an individual step in the treatment planning and delivery process. The combination of these verification steps ensures accurate treatment delivery to the patient. This work shows that Monte Carlo calculations and EPID dosimetry can be used to quantitatively verify IMSRT treatments, resulting in improved patient care and, potentially, improved clinical outcomes.
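The composite point test used in these comparisons (a point passes on local percent difference, on distance-to-agreement, or on a fraction of the central-axis value) can be sketched for a 1-D profile. Everything below is an illustrative reconstruction: the function name, the linear-interpolation DTA search, and the default tolerances are assumptions, not the dissertation's implementation.

```python
def composite_pass_rate(measured, calculated, spacing_mm,
                        lpd_tol=0.05, dta_tol_mm=1.0, cax_frac=0.02):
    """Fraction of points passing a composite LPD / DTA / CAX test.

    A measured point passes if ANY criterion holds:
      - local percent difference vs. the calculated point <= lpd_tol,
      - the calculated profile takes the measured value within
        dta_tol_mm of this position (distance to agreement),
      - absolute difference <= cax_frac of the central-axis (max) dose.
    Uniformly spaced 1-D profiles; a sketch, not clinical software.
    """
    cax = max(calculated)
    n = len(measured)
    search = int(dta_tol_mm / spacing_mm) + 1
    passed = 0
    for i, (m, c) in enumerate(zip(measured, calculated)):
        # 1) local percent difference
        if c != 0 and abs(m - c) / abs(c) <= lpd_tol:
            passed += 1
            continue
        # 2) central-axis criterion
        if abs(m - c) <= cax_frac * cax:
            passed += 1
            continue
        # 3) distance to agreement, by linear interpolation between
        #    neighbouring calculated samples
        lo, hi = max(0, i - search), min(n - 1, i + search)
        ok = False
        for j in range(lo, hi):
            a, b = calculated[j], calculated[j + 1]
            if min(a, b) <= m <= max(a, b):
                t = 0.0 if a == b else (m - a) / (b - a)
                if abs((j + t - i) * spacing_mm) <= dta_tol_mm:
                    ok = True
                    break
        if ok:
            passed += 1
    return passed / n
```

With the ≥95% rule quoted above, a verification would succeed when this function returns at least 0.95.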
Abstract:
The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. 
Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
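The gamma index referred to above reduces dose difference and distance-to-agreement to a single score per measured point; a point passes when gamma ≤ 1. A minimal 1-D version with global normalisation and a brute-force search might look like this (the function name and default 3%/3 mm tolerances are illustrative, not the software described in the text):

```python
import math

def gamma_pass_rate(measured, calculated, spacing_mm,
                    dose_tol=0.03, dist_tol_mm=3.0):
    """1-D gamma-index pass rate with global dose normalisation.

    gamma_i = min over j of sqrt((r_ij / dist_tol)^2 + (dD_ij / dD_tol)^2)
    where r_ij is the spatial offset and dD_ij the dose difference.
    dose_tol is taken as a fraction of the calculated maximum.
    """
    d_norm = dose_tol * max(calculated)
    n_pass = 0
    for i, m in enumerate(measured):
        best = math.inf
        for j, c in enumerate(calculated):
            r = (j - i) * spacing_mm / dist_tol_mm
            d = (c - m) / d_norm
            best = min(best, math.hypot(r, d))
        n_pass += best <= 1.0
    return n_pass / len(measured)
```

The "percent of pixels failing the gamma index" tracked in the study corresponds to one minus this pass rate.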
Abstract:
The current standard treatment for head and neck cancer at our institution uses intensity-modulated x-ray therapy (IMRT), which improves target coverage and sparing of critical structures by delivering complex fluence patterns from a variety of beam directions to conform dose distributions to the shape of the target volume. The standard treatment for breast patients is field-in-field forward-planned IMRT, with initial tangential fields and additional reduced-weight tangents with blocking to minimize hot spots. For these treatment sites, the addition of electrons has the potential of improving target coverage and sparing of critical structures due to rapid dose falloff with depth and reduced exit dose. In this work, the use of mixed-beam therapy (MBT), i.e., combined intensity-modulated electron and x-ray beams using the x-ray multi-leaf collimator (MLC), was explored. The hypothesis of this study was that addition of intensity-modulated electron beams to existing clinical IMRT plans would produce MBT plans that were superior to the original IMRT plans for at least 50% of selected head and neck and 50% of breast cases. Dose calculations for electron beams collimated by the MLC were performed with Monte Carlo methods. An automation system was created to facilitate communication between the dose calculation engine and the treatment planning system. Energy and intensity modulation of the electron beams was accomplished by dividing the electron beams into 2×2 cm² beamlets, which were then beam-weight optimized along with intensity-modulated x-ray beams. Treatment plans were optimized to obtain equivalent target dose coverage, and then compared with the original treatment plans. MBT treatment plans were evaluated by participating physicians with respect to target coverage, normal structure dose, and overall plan quality in comparison with original clinical plans.
The physician evaluations did not support the hypothesis for either site, with MBT selected as superior in 1 of 15 head and neck cases (p=1) and 6 of 18 breast cases (p=0.95). While MBT was not shown to be superior to IMRT, reductions were observed in doses to critical structures distal to the target along the electron beam direction and to non-target tissues, at the expense of target coverage and dose homogeneity.
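The beamlet weight optimisation step can be pictured as a non-negative least-squares problem: choose weights w_b ≥ 0 so that the summed beamlet doses match the prescription. The toy projected-gradient solver below (invented step size, iteration count, and data layout) shows the idea, not the study's optimiser:

```python
def optimise_beam_weights(dose_per_beamlet, prescription,
                          iters=500, lr=0.05):
    """Non-negative least-squares beamlet weighting via projected gradient.

    dose_per_beamlet[b][v]: dose to voxel v from beamlet b at unit weight.
    Minimises sum_v (sum_b w_b * D[b][v] - p_v)^2 subject to w_b >= 0.
    A toy sketch of beam-weight optimisation, not the thesis code.
    """
    nb = len(dose_per_beamlet)
    nv = len(prescription)
    w = [1.0] * nb
    for _ in range(iters):
        # current dose and residual per voxel
        dose = [sum(w[b] * dose_per_beamlet[b][v] for b in range(nb))
                for v in range(nv)]
        resid = [dose[v] - prescription[v] for v in range(nv)]
        # gradient step on each weight, projected onto w >= 0
        for b in range(nb):
            g = 2.0 * sum(resid[v] * dose_per_beamlet[b][v]
                          for v in range(nv))
            w[b] = max(0.0, w[b] - lr * g)
    return w
```

In practice the objective also carries organ-at-risk penalty terms and the x-ray beamlets, but the non-negativity projection is the essential constraint: a beamlet cannot subtract dose.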
Abstract:
External beam radiation therapy is used to treat nearly half of the more than 200,000 new cases of prostate cancer diagnosed in the United States each year. During a radiation therapy treatment, healthy tissues in the path of the therapeutic beam are exposed to high doses. In addition, the whole body is exposed to a low-dose bath of unwanted scatter radiation from the pelvis and leakage radiation from the treatment unit. As a result, survivors of radiation therapy for prostate cancer face an elevated risk of developing a radiogenic second cancer. Recently, proton therapy has been shown to reduce the dose delivered by the therapeutic beam to normal tissues during treatment compared to intensity modulated x-ray therapy (IMXT, the current standard of care). However, the magnitude of stray radiation doses from proton therapy, and their impact on the incidence of radiogenic second cancers, was not known.

The risk of a radiogenic second cancer following proton therapy for prostate cancer relative to IMXT was determined for 3 patients of large, median, and small anatomical stature. Doses delivered to healthy tissues from the therapeutic beam were obtained from treatment planning system calculations. Stray doses from IMXT were taken from the literature, while stray doses from proton therapy were simulated using a Monte Carlo model of a passive scattering treatment unit and an anthropomorphic phantom. Baseline risk models were taken from the Biological Effects of Ionizing Radiation VII report. A sensitivity analysis was conducted to characterize the sensitivity of the risk calculations to uncertainties in the risk model, the relative biological effectiveness (RBE) of neutrons for carcinogenesis, and inter-patient anatomical variations.

The risk projections revealed that proton therapy carries a lower risk of radiogenic second cancer incidence following prostate irradiation compared to IMXT. The sensitivity analysis revealed that the results of the risk analysis depended only weakly on uncertainties in the risk model and inter-patient variations. Second cancer risks were sensitive to changes in the RBE of neutrons. However, the findings of the study were qualitatively consistent for all patient sizes and risk models considered, and for all neutron RBE values less than 100.
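A linear risk projection with a neutron RBE sensitivity sweep can be sketched as follows. All doses, organ names, and coefficients in the example are hypothetical placeholders, not the study's data or the BEIR VII coefficients; the point is only that the proton-to-IMXT risk ratio is recomputed as the assumed neutron RBE varies:

```python
def relative_risk(photon_organ_doses, proton_organ_doses,
                  proton_neutron_doses, risk_coeffs, neutron_rbe):
    """Ratio of projected second-cancer risk, proton vs. photon (IMXT).

    Linear risk model: risk = sum over organs of coeff * equivalent dose,
    where the stray-neutron absorbed dose is weighted by an assumed RBE.
    """
    photon = sum(risk_coeffs[o] * photon_organ_doses[o]
                 for o in risk_coeffs)
    proton = sum(risk_coeffs[o] * (proton_organ_doses[o]
                                   + neutron_rbe * proton_neutron_doses[o])
                 for o in risk_coeffs)
    return proton / photon

# Sensitivity sweep over the neutron RBE (all inputs hypothetical).
doses_x = {"bladder": 20.0, "rectum": 25.0}   # Gy, therapeutic + stray
doses_p = {"bladder": 8.0, "rectum": 10.0}    # Gy, therapeutic beam
doses_n = {"bladder": 0.05, "rectum": 0.05}   # Gy, stray neutrons
coeff = {"bladder": 0.01, "rectum": 0.005}    # excess risk per Gy
risks = {rbe: relative_risk(doses_x, doses_p, doses_n, coeff, rbe)
         for rbe in (5, 25, 100)}
```

With inputs of this shape, the ratio rises as the assumed neutron RBE grows, which is why the study's conclusion holds only for RBE values below some bound (here, below 100).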
Abstract:
Research studies on the association between exposures to air contaminants and disease frequently use worn dosimeters to measure the concentration of the contaminant of interest. But investigation of exposure determinants requires additional knowledge beyond concentration, i.e., knowledge about personal activity such as whether the exposure occurred in a building or outdoors. Current studies frequently depend upon manual activity logging to record location. This study's purpose was to evaluate the use of a worn data logger recording three environmental parameters—temperature, humidity, and light intensity—as well as time of day, to determine indoor or outdoor location, with an ultimate aim of eliminating the need to manually log location or at least providing a method to verify such logs. For this study, data collection was limited to a single geographical area (Houston, Texas metropolitan area) during a single season (winter) using a HOBO H8 four-channel data logger. Data for development of a Location Model were collected using the logger for deliberate sampling of programmed activities in outdoor, building, and vehicle locations at various times of day. The Model was developed by analyzing the distributions of environmental parameters by location and time to establish a prioritized set of cut points for assessing locations. The final Model consisted of four "processors" that varied these priorities and cut points. Data to evaluate the Model were collected by wearing the logger during "typical days" while maintaining a location log. The Model was tested by feeding the typical day data into each processor and generating assessed locations for each record. These assessed locations were then compared with true locations recorded in the manual log to determine accurate versus erroneous assessments. The utility of each processor was evaluated by calculating overall error rates across all times of day, and calculating individual error rates by time of day. 
Unfortunately, the error rates were large, such that there would be no benefit in using the Model. Another analysis, in which assessed locations were classified as either indoor (including both building and vehicle) or outdoor, yielded slightly lower error rates that still precluded any benefit from using the Model.
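The Location Model's notion of prioritised cut points on the logged parameters, switched by time of day, can be sketched like this. The thresholds and the two-stage priority below are invented stand-ins for the cut points the study derived from its training data:

```python
def assess_location(temp_c, humidity_pct, light_lux, hour):
    """Classify a logged record as 'indoor' or 'outdoor' from cut points.

    The thresholds are hypothetical placeholders for the prioritised
    cut points the study derived, switched on time of day.
    """
    if 7 <= hour <= 18:
        # daytime processor: light intensity dominates
        if light_lux > 1000:
            return "outdoor"
        return "indoor"
    # night-time processor: outdoor air is colder and more humid
    if temp_c < 15 and humidity_pct > 70:
        return "outdoor"
    return "indoor"

def error_rate(records, true_locations):
    """Fraction of records whose assessed location disagrees with the
    manually logged true location (the study's evaluation metric)."""
    wrong = sum(assess_location(*r) != t
                for r, t in zip(records, true_locations))
    return wrong / len(records)
```

Each of the study's four "processors" would be a variant of `assess_location` with different priorities and cut points, compared by exactly this kind of error rate, overall and by time of day.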
Abstract:
The intensity of care for patients at the end of life has increased in recent years. Publications have focused on intensity of care for many cancers, but none on melanoma patients. Substantial gaps exist in knowledge about intensive care and its alternative, hospice care, among advanced melanoma patients at the end of life. End-of-life care may be used in quite different patterns and induce both intended and unintended clinical and economic consequences. We used the Surveillance, Epidemiology, and End Results (SEER)-Medicare linked databases to identify patients aged 65 years or older with metastatic melanoma who died between 2000 and 2007. We evaluated trends and associations between sociodemographic and health services characteristics and the use of hospice care, chemotherapy, surgery, and radiation therapy, as well as costs. Survival, end-of-life costs, and the incremental cost-effectiveness ratio were evaluated using propensity score methods. Costs were analyzed from the perspective of Medicare in 2009 dollars. In the first journal article, we found increasing use of surgery for patients with metastatic melanoma, from 13% in 2000 to 30% in 2007 (P=0.03 for trend), with no significant fluctuation in the use of chemotherapy (P=0.43) or radiation therapy (P=0.46). Older patients were less likely to receive radiation therapy or chemotherapy. The use of hospice care increased from 61% in 2000 to 79% in 2007 (P=0.07 for trend). Enrollment in short-term (1-3 days) hospice care increased, while long-term hospice care (≥4 days) remained stable. Patients living in the SEER Northeast and South regions were less likely to undergo surgery. Patients enrolled in long-term hospice care used significantly less chemotherapy, surgery and radiation therapy. In the second journal article, of 611 patients identified for this study, 358 (59%) received no hospice care after their diagnosis, 168 (27%) received 1 to 3 days of hospice care, and 85 (14%) received 4 or more days of hospice care. The median survival time was 181 days for patients with no hospice care, 196 days for patients enrolled in hospice for 1 to 3 days, and 300 days for patients enrolled for 4 or more days (log-rank test, P < 0.001). The estimated hazard ratios (HR) between 4 or more days of hospice use and survival were similar in the original-cohort Cox proportional hazards model (HR, 0.62; 95% CI, 0.49-0.78; P < 0.0001) and the propensity score-matched model (HR, 0.61; 95% CI, 0.47-0.78; P = 0.0001). Patients with ≥4 days of hospice care incurred lower end-of-life costs than the other two groups ($14,298 versus $19,380 for 1-3 days of hospice care and $24,351 for patients with no hospice care; P < 0.0001). In conclusion, surgery and hospice care use increased over the years of this study, while the use of chemotherapy and radiation therapy remained consistent for patients diagnosed with metastatic melanoma. Patients diagnosed with advanced melanoma who enrolled in ≥4 days of hospice care experienced longer survival than those who had 1-3 days of hospice or no hospice care, and this longer overall survival was accompanied by lower end-of-life costs.
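Median survival times like those reported above come from Kaplan-Meier estimates, which step the survival curve down at each observed death while censored patients simply leave the risk set. A textbook sketch of that estimator (not the propensity-matched analysis the study performed):

```python
def km_median_survival(times, events):
    """Median survival from a Kaplan-Meier estimate.

    times:  follow-up in days; events: 1 = death observed, 0 = censored.
    Returns the first time at which the estimated survival S(t) drops to
    0.5 or below, or None if it never does.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    s = 1.0
    at_risk = len(times)
    i = 0
    # walk distinct follow-up times in increasing order
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        removed = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            # survival steps down only at observed deaths
            s *= 1.0 - deaths / at_risk
            if s <= 0.5:
                return t
        # deaths and censored subjects both leave the risk set
        at_risk -= removed
    return None
```

The log-rank test quoted in the abstract then compares such curves between the hospice-duration groups.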