943 results for Quality models


Relevance:

30.00%

Publisher:

Abstract:

The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. The mud zone shrank by about 10.5% from 1988 to 1998 and expanded by about 6.2% from 1998 to 2006. Mud areas, volumes, and weights were calculated using validated kriging models. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area, while mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area and mud volume decreased by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998 but decreased by over 20% from 1998 to 2006. The reduction of mud sediments is likely due to re-suspension and redistribution by waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR), and regression-kriging models were calibrated and validated for the spatial analysis of the lake's sediment total phosphorus (TP) and total nitrogen (TN). Based on model performance and error analysis, GWR models provided the most accurate predictions for TP and TN. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions, with the biggest increases in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998 and decreased by about 7%–11% from 1998 to 2006.
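As a companion to the abstract above, the following is a minimal sketch of ordinary kriging in Python using the pykrige package (which the dissertation does not necessarily use); the coordinates, TP values, and variogram choice are illustrative placeholders rather than the study's data or calibrated models.

```python
# Minimal sketch of ordinary kriging for interpolating a sediment variable (e.g., mud
# thickness or TP) from point samples onto a grid; synthetic sample points only.
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 40_000.0, 80)        # sample easting (m), illustrative
y = rng.uniform(0.0, 40_000.0, 80)        # sample northing (m), illustrative
tp = 600.0 + 0.002 * x - 0.001 * y + rng.normal(0.0, 30.0, 80)   # TP (mg/kg), trend + noise

gridx = np.linspace(0.0, 40_000.0, 50)
gridy = np.linspace(0.0, 40_000.0, 50)
ok = OrdinaryKriging(x, y, tp, variogram_model="spherical")   # variogram would be cross-validated in practice
tp_grid, kriging_variance = ok.execute("grid", gridx, gridy)
print(tp_grid.shape, float(tp_grid.mean()))
```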

Relevance:

30.00%

Publisher:

Abstract:

Limestone-based (karstic) freshwater wetlands of the Everglades, Belize, Mexico, and Jamaica are distinctive in having a high biomass of CaCO3-rich periphyton mats. Diatoms are common components of these mats and show predictable responses to environmental variation, making them good candidates for assessing nutrient enrichment in these naturally ultraoligotrophic wetlands. However, aside from the Everglades of southern Florida, very little research has been done to document the diatoms and their environmental preferences in karstic Caribbean wetlands, which are increasingly threatened by eutrophication. We identified diatoms in periphyton mats collected during wet and dry periods from the Everglades and similar freshwater karstic wetlands in Belize, Mexico, and Jamaica. We compared diatom assemblage composition and diversity among locations and periods, and examined the effect of the limiting nutrient, phosphorus (P), on species composition among locations, using periphyton-mat total P (TP) as a metric of P availability. A total of 176 diatom species in 45 genera were recorded from the 4 locations. Twenty-three of these species, including 9 that are considered indicative of the Everglades diatom flora, were found in all 4 locations. In Everglades and Caribbean sites, we identified assemblages and indicator species associated with low and high periphyton-mat TP and calculated TP optima and tolerances for each indicator species. TP optima and tolerances of indicator species differed between the Everglades and the Caribbean, but weighted averaging models predicted periphyton-mat TP concentrations from diatom assemblages at both Everglades (R² = 0.56) and Caribbean (R² = 0.85) locations. These results show that diatoms can be effective indicators of water quality in karstic wetlands of the Caribbean, but application of regionally generated transfer functions to distant sites provides less reliable estimates than locally developed functions.
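The weighted-averaging (WA) calibration mentioned above can be illustrated with a short sketch: species TP optima are computed as abundance-weighted means of sample TP, and sample TP is inferred back from those optima. This is a simplified sketch with made-up numbers; it omits the tolerance downweighting and deshrinking steps a real transfer function would use.

```python
# Minimal sketch of weighted-averaging (WA) calibration for a diatom-TP transfer function.
import numpy as np

abund = np.array([[10, 0, 5],      # rows: samples, columns: diatom species (relative abundance)
                  [ 2, 8, 0],
                  [ 0, 6, 4]], dtype=float)
tp = np.array([120.0, 450.0, 300.0])   # periphyton-mat TP per sample, illustrative units

# Species optima: abundance-weighted average of TP across samples.
optima = (abund * tp[:, None]).sum(axis=0) / abund.sum(axis=0)

# WA-inferred TP per sample: optima weighted by the sample's species abundances.
tp_inferred = (abund * optima[None, :]).sum(axis=1) / abund.sum(axis=1)
r2 = np.corrcoef(tp, tp_inferred)[0, 1] ** 2
print(optima.round(1), tp_inferred.round(1), round(r2, 2))
```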

Relevance:

30.00%

Publisher:

Abstract:

Voice communication systems such as Voice-over-IP (VoIP), Public Switched Telephone Networks, and Mobile Telephone Networks are an integral means of human tele-interaction. These systems pose distinctive challenges due to their unique characteristics, such as low volume, burstiness, and stringent delay/loss requirements across heterogeneous underlying network technologies. Effective quality evaluation methodologies are important for system development and refinement, particularly those that adopt user-feedback-based measurement. Presently, most evaluation models are system-centric (Quality of Service, or QoS, based), which prompted us to explore a user-centric (Quality of Experience, or QoE, based) approach as a step towards the human-centric paradigm of system design. We investigate an affect-based QoE evaluation framework that attempts to capture users' perception while they are engaged in voice communication. Our modular approach consists of feature extraction from multiple information sources, including various affective cues, and different classification procedures such as Support Vector Machines (SVM) and k-Nearest Neighbor (kNN). The experimental study is illustrated in depth with a detailed analysis of results. The evidence collected demonstrates the potential feasibility of our approach for QoE evaluation and suggests that human affective attributes should be considered when modeling user experience.
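The classification stage of such a framework can be sketched with scikit-learn as below; the features, labels, and hyperparameters are placeholders rather than the paper's actual affective feature set or experimental protocol.

```python
# Minimal sketch of the classification stage only: SVM and kNN models trained on
# extracted affective/acoustic feature vectors and compared by cross-validated accuracy.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))        # 12 hypothetical affective-cue features per call segment
y = rng.integers(0, 2, size=200)      # QoE label (e.g., acceptable vs. degraded), illustrative

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```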

Relevance:

30.00%

Publisher:

Abstract:

Objectives: We investigated the relationship among factors predicting inadequate glucose control among 182 Cuban-American adults (females = 110, males = 72) with type 2 diabetes mellitus (CAA). Study Design: Cross-sectional study of CAA drawn from a randomized mailing list in two counties of South Florida. Methods: Fasting blood parameters and anthropometric measures were collected during the study. BMI was calculated (kg/m2). Characteristics and diabetes care of CAA were self-reported. Participants were screened by trained interviewers for heritage and diabetes status (inclusion criteria: self-reported type 2 diabetes; age ≥ 35 years, male or female; not pregnant or lactating; no thyroid disorders; no major psychiatric disorders). Participants signed an informed consent form. Statistical analyses were conducted in SPSS and included descriptive statistics and multiple logistic and ordinal logistic regression models, all with 95% CIs. Results: Eighty-eight percent of CAA had a BMI ≥ 25 kg/m2. Only 54% reported having been prescribed a diet or told to schedule meals. CAA told to schedule meals were 3.62 times more likely to plan meals (95% CI 1.81, 7.26; p < 0.001), and being given a prescribed diet, controlling for age, corresponded with following a meal plan (OR 4.43; 95% CI 2.52, 7.79; p < 0.001). The overall relationship between HbA1c < 8.5 and following a meal plan was OR 9.34 (95% CI 2.84, 30.7; p < 0.001). Conclusions: Having a medical professional prescribe a diet appears to be an important environmental support factor in this sample's diabetes care, since obesity rates are well above the national average. Nearly half of CAA are not given dietary guidance, yet our results indicate that CAA may improve glycemic control by receiving dietary instructions.
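For readers unfamiliar with how odds ratios and confidence intervals of the kind reported above arise, a minimal logistic-regression sketch is shown below, using statsmodels on simulated data with hypothetical variable names; it is not the study's SPSS analysis.

```python
# Minimal sketch: fit a logistic regression and report ORs with 95% CIs (exponentiated
# coefficients and confidence limits). Data and variable names are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 182
df = pd.DataFrame({
    "told_to_schedule_meals": rng.integers(0, 2, n),
    "age": rng.normal(60, 10, n),
})
lin_pred = -1.0 + 1.3 * df["told_to_schedule_meals"] + 0.01 * (df["age"] - 60)
df["follows_meal_plan"] = (rng.random(n) < 1 / (1 + np.exp(-lin_pred))).astype(int)

X = sm.add_constant(df[["told_to_schedule_meals", "age"]])
fit = sm.Logit(df["follows_meal_plan"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)      # OR per predictor
ci = np.exp(fit.conf_int())           # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```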

Relevance:

30.00%

Publisher:

Abstract:

Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as lack of awareness of its immediate and long-term consequences. The objectives of this study were 1) to ascertain, using cross-sectional data, the association between suboptimal (hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g., work characteristics); 2) to assess whether aspects of poor health and well-being, such as poor health-related quality of life (HRQOL), unemployment, and missed work, are associated with glycemic control; and 3) to assess, using prospective data, the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status, and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazards models were used to assess effects of glycemic control on mortality risk. Using Stata software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. Having more days of poor mental health was associated with suboptimal, borderline, and poor glycemic control. Having more days of inactivity was associated with poor glycemic control, while having more days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person's ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
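A minimal sketch of the mortality analysis step is shown below, using a Cox proportional hazards fit from the lifelines package on simulated data; the study itself used survey-weighted estimation in Stata, which is not reproduced here, and the variable names are placeholders.

```python
# Minimal sketch of a Cox proportional hazards model relating an HbA1c indicator to
# mortality, adjusting for age and BMI; data frame is simulated, not NHANES.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "followup_years": rng.exponential(8.0, n),
    "died": rng.integers(0, 2, n),
    "hba1c_lt_6_5": rng.integers(0, 2, n),   # indicator: HbA1c < 6.5%
    "age": rng.normal(60, 12, n),
    "bmi": rng.normal(30, 6, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```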

Relevance:

30.00%

Publisher:

Abstract:

The authors would like to thank the College of Life Sciences of Aberdeen University and Marine Scotland Science, which funded CP's PhD project. Skate tagging experiments were undertaken as part of Scottish Government project SP004. We thank Ian Burrett for help in catching the fish, and the other fishermen and anglers who returned tags. We thank José Manuel Gonzalez-Irusta for extracting and making available the environmental layers used as covariates in the environmental suitability modelling procedure. We also thank Jason Matthiopoulos for insightful suggestions on habitat utilization metrics, as well as Stephen C.F. Palmer and three anonymous reviewers for useful suggestions to improve the clarity and quality of the manuscript.

Relevance:

30.00%

Publisher:

Abstract:

We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Similar past studies used only space (routing) information and thus neglected spectrum, and they focused on old-generation noncoherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward error correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to a 47% reduction in required regenerators, a substantial saving in equipment cost.
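One way to picture the graph transformation idea, though not the paper's actual construction, is a graph whose nodes are established lightpaths and whose edges connect lightpaths that share a fiber and occupy adjacent spectrum slots; a new lightpath's QoT could then be estimated from the monitored QoT of its graph neighbors. The sketch below uses networkx and hypothetical lightpaths.

```python
# Illustrative interference graph: link lightpaths that share a fiber and sit in
# adjacent spectrum slots. Topology, slot assignments, and the adjacency rule are assumptions.
import networkx as nx

# Hypothetical lightpaths: id -> (set of fiber links, occupied spectrum slots)
lightpaths = {
    "LP1": ({("A", "B"), ("B", "C")}, range(0, 4)),
    "LP2": ({("B", "C"), ("C", "D")}, range(4, 8)),
    "LP3": ({("A", "B")},             range(10, 14)),
}

G = nx.Graph()
G.add_nodes_from(lightpaths)
for u, (links_u, slots_u) in lightpaths.items():
    for v, (links_v, slots_v) in lightpaths.items():
        if u < v and links_u & links_v:
            gap = min(abs(a - b) for a in slots_u for b in slots_v)
            if gap <= 1:                      # spectrum-adjacent on a shared fiber
                G.add_edge(u, v, shared_links=links_u & links_v)

print(list(G.edges(data=True)))   # LP1-LP2 interfere; LP3 is spectrally isolated
```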

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for assessing modern CT scanners that have implemented the aforementioned dose reduction technologies.

Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the "task-based" definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an "observer" to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer's performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection [FBP] vs. Advanced Modeled Iterative Reconstruction [ADMIRE]). A mathematical observer model (i.e., a computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, for large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (≤6 mm) low-contrast (≤20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
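To make the distinction concrete, the sketch below computes a simple CNR and a non-prewhitening matched-filter detectability index d' = (s^T s) / sqrt(s^T K s) from a signal template s and noise covariance K; the arrays are synthetic stand-ins, not the dissertation's measured data, and the channelized Hotelling observer is not reproduced here.

```python
# Minimal sketch contrasting CNR with a non-prewhitening (NPW) matched-filter
# detectability index computed from an expected signal and a noise covariance.
import numpy as np

n = 64                                   # pixels in a small flattened ROI, illustrative
signal = np.zeros(n)
signal[27:37] = 10.0                     # expected lesion signal (HU), illustrative
noise_cov = np.diag(np.full(n, 25.0))    # white noise, sigma = 5 HU (illustrative)

# CNR: peak signal contrast over noise standard deviation.
cnr = signal.max() / np.sqrt(noise_cov[0, 0])

# NPW matched filter with template equal to the signal: d' = (s^T s) / sqrt(s^T K s).
dprime_npw = (signal @ signal) / np.sqrt(signal @ noise_cov @ signal)
print(f"CNR = {cnr:.2f}, NPW d' = {dprime_npw:.2f}")
```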

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
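For reference, a conventional ensemble NPS estimate from repeated scans with square ROIs (not the dissertation's irregular-ROI method) can be sketched as follows; the pixel size and image stack are synthetic placeholders.

```python
# Minimal sketch of an ensemble noise power spectrum (NPS) estimate from repeated scans.
import numpy as np

rng = np.random.default_rng(4)
pixel_mm = 0.5                                        # assumed pixel spacing (mm)
scans = rng.normal(0.0, 10.0, size=(50, 64, 64))      # 50 repeated scans of one slice (HU)

rois = scans - scans.mean(axis=0)                     # remove the ensemble-mean (structure) image
nps = np.zeros((64, 64))
for roi in rois:
    nps += np.abs(np.fft.fft2(roi - roi.mean())) ** 2
nps *= (pixel_mm ** 2) / (64 * 64 * len(rois))        # NPS in HU^2 * mm^2

# Sanity check: integrating the NPS over frequency should return the pixel variance.
variance = nps.sum() / (64 * 64 * pixel_mm ** 2)
print(f"integrated NPS variance = {variance:.1f} HU^2 (expect ~{rois.var():.1f})")
```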

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called "Clustered Lumpy Background" texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance:

30.00%

Publisher:

Abstract:

Light rainfall is the baseline input to the annual water budget in mountainous landscapes throughout the tropics and at mid-latitudes. In the Southern Appalachians, the contribution from light rainfall ranges from 50-60% during wet years to 80-90% during dry years, with convective activity and tropical cyclone input providing most of the interannual variability. The Southern Appalachians is a region characterized by rich biodiversity that is vulnerable to land use/land cover changes due to its proximity to a rapidly growing population. Persistent near-surface moisture and associated microclimates observed in this region have been well documented since the colonization of the area, in terms of species health, fire frequency, and overall biodiversity. The overarching objective of this research is to elucidate the microphysics of light rainfall and the dynamics of low-level moisture in the inner region of the Southern Appalachians during the warm season, with a focus on orographically mediated processes. The overarching research hypothesis is that the physical processes leading to and governing the life cycle of orographic fog, low-level clouds, and precipitation, and their interactions, are strongly tied to landform, land cover, and the diurnal cycles of flow patterns, radiative forcing, and surface fluxes at the ridge-valley scale. The following science questions are addressed specifically: 1) How do orographic clouds and fog affect the hydrometeorological regime from event to annual scale and as a function of terrain characteristics and land cover? 2) What are the source areas, governing processes, and relevant time-scales of near-surface moisture convergence patterns in the region? 3) What are the four-dimensional microphysical and dynamical characteristics, including variability and controlling factors and processes, of fog and light rainfall? The research was conducted with two major components: 1) ground-based high-quality observations using multi-sensor platforms, and 2) interpretive numerical modeling guided by the analysis of the in situ data collection. Findings illuminate a high level of spatial (down to the ridge scale) and temporal (from event to annual scale) heterogeneity in the observations, and a significant impact on the hydrological regime as a result of seeder-feeder interactions among fog, low-level clouds, and stratiform rainfall that enhance coalescence efficiency and lead to significantly higher rainfall rates at the land surface. Specifically, results show that enhancement of up to one order of magnitude in an event's short-term accumulation can occur as a result of concurrent fog presence. Results also show that events are modulated strongly by terrain characteristics including elevation, slope, geometry, and land cover. These factors produce interactions of highly localized flows and gradients of temperature and moisture with larger-scale circulations. The resulting observations of drop size distributions (DSD) and rainfall patterns are stratified by region and altitude and exhibit clear diurnal and seasonal cycles.
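As a small illustration of what a DSD observation yields quantitatively, the sketch below integrates a drop size distribution into a rain rate using the standard relation R = 6π × 10⁻⁴ ∫ N(D) v(D) D³ dD (D in mm, N in mm⁻¹ m⁻³, v in m/s, R in mm/h); the exponential DSD and the Atlas et al. (1973) fall-speed fit are assumed, illustrative choices, not the study's observations or retrievals.

```python
# Minimal sketch: rain rate from a binned drop size distribution (DSD), e.g. from a disdrometer.
import numpy as np

d = np.arange(0.25, 5.0, 0.25)                 # drop diameter bin centers (mm)
dd = 0.25                                       # bin width (mm)
n_d = 8000.0 * np.exp(-3.5 * d)                 # illustrative exponential DSD, N(D) in mm^-1 m^-3

v = 9.65 - 10.3 * np.exp(-0.6 * d)              # fall speed (m/s), Atlas et al. (1973) fit
rain_rate = 6.0e-4 * np.pi * np.sum(n_d * v * d**3 * dd)   # mm/h
print(f"rain rate ~ {rain_rate:.2f} mm/h")
```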

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility of the community to optimize the amount of radiation dose for CT examinations. The key to dose optimization is to determine the minimum amount of radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would significantly benefit from effective metrics to characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of the radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current condition. The study effectively modeled the anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distribution. The dependence of organ dose coefficients on patient size and scanner models was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date with representative age, weight percentile, and body mass index (BMI) range.

With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice to develop an organ dose-monitoring program based on a commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study we focused on body CT examinations, and so the patient’s major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. Chapter 6 outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
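A heavily simplified version of the image-based noise addition idea is sketched below: for quantum-limited noise, sigma scales as 1/sqrt(dose), so simulating a dose fraction f means adding noise with sigma_add = sigma_full * sqrt(1/f - 1). The thesis's actual technique also shapes the added noise spatially and spectrally, which this white-noise sketch omits; the image and noise level are placeholders.

```python
# Simplified sketch of image-based noise addition to simulate a reduced-dose scan.
import numpy as np

def simulate_reduced_dose(image, sigma_full, dose_fraction, rng=None):
    """Add white Gaussian noise so total noise matches the target dose fraction."""
    if rng is None:
        rng = np.random.default_rng()
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, size=image.shape)

rng = np.random.default_rng(5)
full_dose = rng.normal(50.0, 12.0, size=(128, 128))   # placeholder CT slice (HU), sigma ~ 12
half_dose = simulate_reduced_dose(full_dose, sigma_full=12.0, dose_fraction=0.5, rng=rng)
print(f"std full = {full_dose.std():.1f} HU, std simulated half-dose = {half_dose.std():.1f} HU")
```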

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance:

30.00%

Publisher:

Abstract:

The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in US NAVY divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.

The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique. Model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those previously published. Results favored a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.
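The likelihood machinery common to these models can be sketched briefly: the hazard integrated over a dive gives P(DCS) = 1 - exp(-∫ r dt), and parameters are chosen to maximize the likelihood of the observed outcomes. The single-parameter hazard and simulated dive data below are illustrative only, not one of the seventeen dissertation models.

```python
# Minimal sketch of maximum-likelihood fitting of a probabilistic decompression model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
# Hypothetical per-dive summaries: integrated exposure and observed outcome (1 = DCS).
exposure = rng.uniform(0.0, 5.0, size=300)
true_gain = 0.08
outcome = rng.random(300) < 1.0 - np.exp(-true_gain * exposure)

def neg_log_likelihood(params):
    gain = params[0]
    p_dcs = 1.0 - np.exp(-gain * exposure)            # P(DCS) from the integrated hazard
    p_dcs = np.clip(p_dcs, 1e-12, 1.0 - 1e-12)
    return -np.sum(np.where(outcome, np.log(p_dcs), np.log(1.0 - p_dcs)))

fit = minimize(neg_log_likelihood, x0=[0.01], bounds=[(1e-6, None)])
print(f"estimated hazard gain = {fit.x[0]:.3f} (true value used to simulate: {true_gain})")
```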

We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine if predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that, for many of our models, symptoms are often reported after risk accumulation begins, we hypothesized that the inclusion of delays might improve correlation between the model predictions and observed data. Model selection techniques identified two models as having the best overall performance, but comparison to the best-performing model without delay, and model selection using our best identified no-delay pharmacokinetic model, both indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.

Our final investigation explored parameter bounding techniques to identify parameter regions for which statistical model failure will not occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we successfully identify regions where model failure will not occur and identify the boundaries of the region using a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the performance of a continuous quality improvement collaboration at Ridge Regional Hospital, Accra, Ghana, that aimed to halve maternal and neonatal deaths. METHODS: In a quasi-experimental, pre- and post-intervention analysis, system deficiencies were analyzed and 97 improvement activities were implemented from January 2007 to December 2011. Data were collected on outcomes and implementation rates of improvement activities. Severity-adjustment models were used to calculate counterfactual mortality ratios. Regression analysis was used to determine the association between improvement activities, staffing, and maternal mortality. RESULTS: Maternal mortality decreased by 22.4% between 2007 and 2011, from 496 to 385 per 100,000 deliveries, despite a 50% increase in deliveries and five- and three-fold increases in the proportion of pregnancies complicated by obstetric hemorrhage and hypertensive disorders of pregnancy, respectively. Case fatality rates for obstetric hemorrhage and hypertensive disorders of pregnancy decreased from 14.8% to 1.6% and 3.1% to 1.1%, respectively. The mean implementation score was 68% for the 97 improvement processes. Overall, 43 maternal deaths were prevented by the intervention; however, risk severity-adjustment models indicated that an even greater number of deaths was averted. Mortality reduction was correlated with 26 continuous quality improvement activities, and with the number of anesthesia nurses and labor midwives. CONCLUSION: The implementation of quality improvement activities was closely correlated with improved maternal mortality.

Relevance:

30.00%

Publisher:

Abstract:

Knight M, Acosta C, Brocklehurst P, Cheshire A, Fitzpatrick K, Hinton L, Jokinen M, Kemp B, Kurinczuk JJ, Lewis G, Lindquist A, Locock L, Nair M, Patel N, Quigley M, Ridge D, Rivero-Arias O, Sellers S, Shah A, on behalf of the UKNeS coapplicant group. Background: Studies of maternal mortality have been shown to result in important improvements to women's health. It is now recognised that in countries such as the UK, where maternal deaths are rare, the study of near-miss severe maternal morbidity provides additional information to aid disease prevention, treatment and service provision. Objectives: To (1) estimate the incidence of specific near-miss morbidities; (2) assess the contribution of existing risk factors to incidence; (3) describe different interventions and their impact on outcomes and costs; (4) identify any groups in which outcomes differ; (5) investigate factors associated with maternal death; (6) compare an external confidential enquiry or a local review approach for investigating quality of care for affected women; and (7) assess the longer-term impacts. Methods: Mixed quantitative and qualitative methods including primary national observational studies, database analyses, surveys and case studies overseen by a user advisory group. Setting: Maternity units in all four countries of the UK. Participants: Women with near-miss maternal morbidities, their partners and comparison women without severe morbidity. Main outcome measures: The incidence, risk factors, management and outcomes of uterine rupture, placenta accreta, haemolysis, elevated liver enzymes and low platelets (HELLP) syndrome, severe sepsis, amniotic fluid embolism and pregnancy at advanced maternal age (≥ 48 years at completion of pregnancy); factors associated with progression from severe morbidity to death; associations between severe maternal morbidity and ethnicity and socioeconomic status; lessons for care identified by local and external review; economic evaluation of interventions for management of postpartum haemorrhage (PPH); women's experiences of near-miss maternal morbidity; long-term outcomes; and models of maternity care commissioned through experience-led and standard approaches. Results: Women and their partners reported long-term impacts of near-miss maternal morbidities on their physical and mental health. Older maternal age and caesarean delivery are associated with severe maternal morbidity in both current and future pregnancies. Antibiotic prescription for pregnant or postpartum women with suspected infection does not necessarily prevent progression to severe sepsis, which may be rapidly progressive. Delay in delivery, of up to 48 hours, may be safely undertaken in women with HELLP syndrome in whom there is no fetal compromise. Uterine compression sutures are a cost-effective second-line therapy for PPH. Medical comorbidities are associated with a fivefold increase in the odds of maternal death from direct pregnancy complications. External reviews identified more specific clinical messages for care than local reviews. Experience-led commissioning may be used as a way to commission maternity services. Limitations: This programme used observational studies, some with limited sample size, and the possibility of uncontrolled confounding cannot be excluded. Conclusions: Implementation of the findings of this research could both prevent future severe pregnancy complications and improve the outcome of pregnancy for women. One of the clearest findings relates to the population of women with other medical and mental health problems in pregnancy and their risk of severe morbidity. Further research into models of pre-pregnancy, pregnancy and postnatal care is clearly needed.

Relevance:

30.00%

Publisher:

Abstract:

Clustering algorithms, pattern mining techniques, and associated quality metrics have emerged as reliable methods for modeling learners' performance, comprehension, and interaction in given educational scenarios. The specificity of the available data, such as missing values, extreme values, or outliers, creates a challenge for extracting significant user models from an educational perspective. In this paper we introduce a pattern detection mechanism within our data analytics tool based on k-means clustering and on the SSE, silhouette, Dunn index, and Xie-Beni index quality metrics. Experiments performed on a dataset obtained from our online e-learning platform show that the extracted interaction patterns were representative in classifying learners. Furthermore, the performed monitoring activities created a strong basis for generating automatic feedback to learners regarding their course participation, while relying on their previous performance. In addition, our analysis introduces automatic triggers that highlight learners who will potentially fail the course, enabling tutors to take timely actions.
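A minimal sketch of the clustering-plus-quality-metric step (k-means with SSE and silhouette, via scikit-learn) is shown below; the interaction features are placeholders, and the Dunn and Xie-Beni indices, which are not available in scikit-learn, are omitted.

```python
# Minimal sketch: k-means on learner interaction features, comparing k by SSE and silhouette.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (60, 4)),      # 4 hypothetical interaction features
               rng.normal(4, 1, (60, 4))])      # (e.g., logins, forum posts, quiz attempts, time on task)

for k in (2, 3, 4):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(f"k={k}: SSE={km.inertia_:.1f}, silhouette={silhouette_score(X, km.labels_):.2f}")
```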

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the literature on alternative convenience food choices and analyses the findings from consumer behaviour and manufacturing/retailing perspectives. As consumers' demand for easily prepared and healthier food products has gradually increased, so has the related research activity. This review provides a synopsis of 60 relevant peer-reviewed publications identified through an online search carried out using search terms related to organic ready-to-eat meals. An overview of the topic's most important outcomes is presented, compared, and evaluated. Results reveal positive attitudes, increased interest, and willingness to purchase such products. Research gaps are identified in the field of personal and social norms as well as in the regulation and information-seeking process. Policy-making implications and recommendations are also discussed in conjunction with future research opportunities.