945 results for Advanced Transaction Models


Relevance:

30.00%

Publisher:

Abstract:

This thesis reports the results of research into the connections between transaction attributes and buyer-supplier relationships (BSR) in advanced manufacturing technology (AMT) acquisition and implementation. It also examines the impact of different patterns of BSR on performance. Specifically, it addresses how three transaction attributes, namely the level of complexity, the level of asset specificity, and the level of uncertainty, affect the relationship between the technology buyer and supplier in AMT acquisition and implementation, and then examines the impact of different patterns of BSR on two aspects of performance, namely technology and implementation performance. In understanding these phenomena, the study draws on and integrates the literature on transaction cost economics, buyer-supplier relationships, and advanced manufacturing technology as the basis for its theoretical framework and hypothesis development. Data were gathered through a questionnaire survey with 147 responses and seven semi-structured interviews of manufacturing firms in Malaysia. Quantitative data were analysed mainly using the AMOS (Analysis of Moment Structures) package for structural equation modelling and SPSS (Statistical Package for the Social Sciences) for analysis of variance (ANOVA). Data from the interview sessions were used to develop a case study with the intention of providing a richer and deeper understanding of the subject under investigation and to offer triangulation in the research process. The results of the questionnaire survey indicate that the higher the level of technological specificity and uncertainty, the more likely firms are to engage in a closer relationship with technology suppliers. However, the complexity of the technology being implemented is associated with BSR only because it is associated with the level of uncertainty, which has a direct impact on BSR. The analysis also provides strong support for the premise that developing strong BSR can lead to improved performance. However, at high levels of the transaction attributes, implementation performance suffers more when firms have weak relationships with technology suppliers than at moderate and low levels of the transaction attributes. The implications of the study are offered for both academic and practitioner audiences. The thesis closes with a report on its limitations and suggestions for further research that would address some of these limitations.

Relevance:

30.00%

Publisher:

Abstract:

Formative measurement has seen increasing acceptance in organizational research since the turn of the 21st century. However, in more recent times, a number of criticisms of the formative approach have appeared. Such work argues that formatively measured constructs are empirically ambiguous and thus flawed in a theory-testing context. The aim of the present paper is to examine the underpinnings of formative measurement theory in light of theories of causality and ontology in measurement in general. In doing so, a thesis is advanced which draws a distinction between reflective, formative, and causal theories of latent variables. This distinction is shown to be advantageous in that it clarifies the ontological status of each type of latent variable, and thus provides advice on appropriate conceptualization and application. The distinction also reconciles in part both recent supportive and critical perspectives on formative measurement. In light of this, advice is given on how to model formative composites most appropriately in theory-testing applications, placing the onus on the researcher to make clear their conceptualization and operationalization.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an investigation of a two-dimensional water model and the development of a multiscale method for modelling large systems, such as a virus in water or a peptide immersed in solvent. We have implemented a two-dimensional ‘Mercedes Benz’ (MB), or BN2D, water model using Molecular Dynamics. We have studied how its dynamical and structural properties depend on the model’s parameters. For the first time we derived formulas to calculate thermodynamic properties of the MB model in the microcanonical (NVE) ensemble. We also derived equations of motion in the isothermal–isobaric (NPT) ensemble. We have analysed the rotational degree of freedom of the model in both ensembles. We have developed and implemented a self-consistent multiscale method that is able to couple the micro- and macroscales. This multiscale method assumes that matter consists of two phases, one related to the microscale and the other to the macroscale. We simulate the macroscale using Landau–Lifshitz fluctuating hydrodynamics, while we describe the microscale using Molecular Dynamics. We have demonstrated that communication between the disparate scales is possible without introducing a fictitious interface or approximations that reduce the accuracy of the information exchange between the scales. We have investigated the control parameters that were introduced to control the contribution of each phase to the behaviour of the matter. We have shown that the microscale inherits dynamical properties of the macroscale and vice versa, depending on the concentration of each phase. We have shown that the radial distribution function is not altered and that velocity autocorrelation functions are gradually transformed from the Molecular Dynamics to the Fluctuating Hydrodynamics description when the phase balance is changed. In this work we test our multiscale method for liquid argon and for the BN2D and SPC/E water models. For the SPC/E water model we investigate microscale fluctuations, which are computed using an advanced technique for mapping the small scales to the large scales developed by Voulgarakis et al.
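
The abstract uses the radial distribution function and velocity autocorrelation function as the diagnostics for comparing the Molecular Dynamics and Fluctuating Hydrodynamics descriptions. As background, the sketch below shows how these two quantities are typically computed from a 2D periodic trajectory; the array shapes and the `box` parameter are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def radial_distribution_2d(positions, box, n_bins=100, r_max=None):
    """g(r) for a single 2D frame with periodic boundaries.

    positions: (N, 2) array of particle coordinates.
    box: (2,) array with the periodic box lengths.
    """
    n = len(positions)
    r_max = r_max if r_max is not None else min(box) / 2.0
    edges = np.linspace(0.0, r_max, n_bins + 1)
    hist = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)          # minimum-image convention
        r = np.hypot(d[:, 0], d[:, 1])
        hist += np.histogram(r, bins=edges)[0]
    rho = n / (box[0] * box[1])               # number density
    shell_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    g = 2.0 * hist / (n * rho * shell_area)   # factor 2: each pair counted once
    r_mid = 0.5 * (edges[1:] + edges[:-1])
    return r_mid, g

def velocity_autocorrelation(velocities):
    """Normalised single-origin VACF from velocities of shape (n_frames, N, 2)."""
    v0 = velocities[0]
    c = np.array([np.mean(np.sum(v0 * vt, axis=1)) for vt in velocities])
    return c / c[0]
```

In a multiscale run these functions would be evaluated on the particle part of the state at different phase-balance settings to reproduce comparisons like those described in the abstract.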

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Particle delivery to the airways is an attractive prospect for many potential therapeutics, including vaccines. Developing strategies for inhalation of particles provides a targeted, controlled and non-invasive delivery route but, as with all novel therapeutics, in vitro and in vivo testing is needed prior to clinical use. Whilst advanced vaccine testing demands the use of animal models to address safety issues, the production of robust in vitro cellular models would take account of the ethical framework known as the 3Rs (Replacement, Reduction and Refinement of animal use), by permitting initial screening of potential candidates prior to animal use. There is thus a need for relevant, realistic in vitro models of the human airways.

Key findings: Our laboratory has designed and characterised a multi-cellular model of human airways that takes account of the conditions in the airways and recapitulates many salient features, including the epithelial barrier and mucus secretion.

Summary: Our human pulmonary models recreate many of the obstacles to successful pulmonary delivery of particles and therefore represent a valid test platform for screening compounds and delivery systems.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a detailed numerical analysis, a fabrication method and an experimental investigation of 45º tilted fiber gratings (45º-TFGs) and excessively tilted fiber gratings (Ex-TFGs), and their applications in fiber laser and sensing systems. One of the most significant contributions of the work reported in this thesis is that 45º-TFGs with high polarization extinction ratio (PER) have been fabricated in single-mode telecom and polarization maintaining (PM) fibers, with spectral responses covering three prominent optical communication wavelength ranges centred at 1060 nm, 1310 nm and 1550 nm. The highest PERs achieved for the 45º-TFGs are in the range of 35-50 dB, matching and in some cases exceeding many commercial in-fiber polarizers. It has been proposed that 45º-TFGs of high PER can be used as ideal in-fiber polarizers for a wide range of fiber systems and applications. In addition, detailed theoretical models and analysis have been developed, and systematic experimental evaluation has been conducted, producing results in excellent agreement with the theoretical modeling. Another important outcome of the research is the proposal and demonstration of all-fiber Lyot filters (AFLFs), implemented by utilizing two (for a single-stage type) or more (for multi-stage types) 45º-TFGs in a PM fiber cavity structure. Detailed theoretical analysis and modelling of such AFLFs has also been carried out, giving design guidance for practical implementation. The unique functional advantages of 45º-TFG-based AFLFs have been revealed, showing high-finesse multi-wavelength transmission of a single polarization and a wide tuning range. The temperature tuning results show that the AFLFs have a thermal sensitivity 60 times higher than normal FBGs, permitting a thermal tuning rate of ~8 nm per 10 ºC. By using an intra-cavity AFLF, an all-fiber soliton mode-locked laser with almost total suppression of soliton sidebands, single-polarization output and single/multi-wavelength switchable operation has been demonstrated. The final significant contribution is the theoretical analysis and experimental verification of the design, fabrication and sensing applications of Ex-TFGs. A model of the Ex-TFG sensitivity to the surrounding-medium refractive index (SRI) has been developed for the first time, and the factors that affect the thermal and SRI sensitivity in relation to the wavelength range, tilt angle and cladding size have been investigated. As a practical SRI sensor, an 81º-TFG UV-inscribed in a fiber with a small (40 μm) cladding radius has shown an SRI sensitivity of up to 1180 nm/RIU around an index of 1.345. Finally, to ensure single-polarization detection in such an SRI sensor, a hybrid configuration obtained by UV-inscribing a 45º-TFG and an 81º-TFG close together on the same piece of fiber has been demonstrated as a more advanced SRI sensing system.
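
As background to the all-fiber Lyot filters described above, a single Lyot stage (a birefringent PM fiber section between aligned polarizers) has the textbook transmission T(λ) = cos²(π·B·L/λ), where B is the fiber birefringence and L the PM fiber length. The sketch below plots this comb response and its free spectral range for illustrative parameter values; B and L are assumptions, not values from the thesis.

```python
import numpy as np

def lyot_transmission(wavelength_m, birefringence, length_m):
    """Single-stage Lyot filter transmission between aligned polarizers."""
    phase = np.pi * birefringence * length_m / wavelength_m
    return np.cos(phase) ** 2

# Illustrative values only (not taken from the thesis).
wl = np.linspace(1540e-9, 1560e-9, 2001)      # wavelengths around 1550 nm
B = 5e-4                                       # typical PM-fiber birefringence
L = 2.0                                        # metres of PM fiber
T = lyot_transmission(wl, B, L)

# Free spectral range of the transmission comb: FSR ~ lambda^2 / (B * L)
fsr_nm = (1550e-9) ** 2 / (B * L) * 1e9
print(f"FSR ~ {fsr_nm:.2f} nm")
```

Cascading stages with doubled fiber lengths narrows the passbands, which is the usual route to the high-finesse multi-wavelength response the thesis reports.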

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62P10, 92C20

Relevance:

30.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support the effective use of DTA to model managed lane operations. Static and dynamic traffic assignment consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions. With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in different stages of the modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as proper definition of performance measures, results in a calibrated and stable model, which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.

Relevance:

30.00%

Publisher:

Abstract:

Intensive Care Units (ICUs) account for over 10 percent of all US hospital beds, have over 4.4 million patient admissions yearly and approximately 360,000 deaths, and account for close to 30% of acute care hospital costs. The need for critical care services has increased due to an aging population and medical advances that extend life. The result is efforts to improve patient outcomes, optimize financial performance, and implement models of ICU care that enhance quality of care and reduce health care costs. This retrospective chart review study examined the dose effect of APN Intensivists in a surgical intensive care unit (SICU) on differences in patient outcomes, healthcare charges, SICU length of stay, charges for APN Intensivist services, and the frequency of APN special initiatives when the SICU was staffed at differing levels of APN Intensivist coverage over four time periods (T1-T4) between 2009 and 2011. The sample consisted of 816 randomly selected patient charts (204 per time period). Study findings indicated that reported ventilator-associated pneumonia (VAP) rates, ventilator days, catheter days and catheter-associated urinary tract infection (CAUTI) rates increased at T4 (when there was the lowest number of APN Intensivists), and that pressure ulcer incidence increased in the first two quarters of T4. There was no statistically significant difference in post-surgical glycemic control (M = 142.84, SD = 40.00), t(223) = 1.40, p = .17, and no statistically significant difference in SICU length of stay among the time periods (M = 3.27, SD = 3.32), t(202) = 1.02, p = .31. Charges for APN services increased over the four time periods from $11,268 at T1 to $51,727 at T4, when a system to capture APN billing was put into place. The number of new APN initiatives declined in T4 as the number of APN Intensivists declined. Study results suggest a dose effect of APN Intensivists on important patient health outcomes and on the number of APN initiatives to prevent health complications in the SICU.
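
As a quick consistency check on the reported statistics, the two-tailed p-values implied by the quoted t statistics and degrees of freedom can be recomputed. This is a minimal sketch using scipy, assuming standard two-tailed tests; the recomputed values should land near the reported ones, with small differences possible from rounding of the reported t statistics.

```python
from scipy import stats

# Test statistics as reported in the abstract.
tests = {
    "post-surgical glycemic control": (1.40, 223),  # reported p = .17
    "SICU length of stay":            (1.02, 202),  # reported p = .31
}

for name, (t_value, df) in tests.items():
    p_two_tailed = 2 * stats.t.sf(abs(t_value), df)
    print(f"{name}: t({df}) = {t_value:.2f}, two-tailed p ~ {p_two_tailed:.2f}")
```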

Relevance:

30.00%

Publisher:

Abstract:

High street optometric practices are for-profit businesses. They mostly provide sight testing and eye examination services and sell optical products, such as spectacles and contact lenses. The sight testing services are often sold at a vastly reduced price, and profits are generated primarily through high-margin spectacle sales, in a loss-leading strategy. The published literature highlights weaknesses in this strategy, as it forms a barrier to widening the scope of services provided within optometric practices. This includes specialist non-refraction-based services, such as shared care. In addition, this business strategy discourages investment in advanced diagnostic equipment and higher professional qualifications. The aim of this thesis was to develop a greater understanding of the traditional loss-leading strategy. The thesis also aimed to assess the plausibility of alternative business models to support the development of specialist non-refraction services within high street optometric practice. This research was based on a single independent optometric practice that specialises in advanced retinal imaging and offers a broad range of shared care services. Specialist non-refraction-based services were found to be poor generators of spectacle sales, likely due to patient needs and presenting concerns. Alternative business strategies to support these services included charging more realistic professional fees via cost-based pricing and monthly payment plans. These strategies enabled specialist services to be more self-sustaining, with less reliance on cross-subsidy from spectacle sales. Furthermore, improving operational efficiency can increase stand-alone profits for specialist services. Practice managers may be reluctant to increase professional fees due to market pressures and confidence. However, this thesis found that patients were accepting of increased professional fees. Practice managers can implement alternative business models to enhance eye care provision in high street optometric practices. These alternative business models also improve the revenues and profits generated via clinical services and improve patient loyalty.

Relevance:

30.00%

Publisher:

Abstract:

Improvements in genomic technology, both in the increased speed and the reduced cost of sequencing, have expanded our appreciation of the abundance of human genetic variation. However, the sheer amount of variation, as well as the varying type and genomic content of variation, poses a challenge in understanding the clinical consequences of a single mutation. This work uses several methodologies to interpret the observed variation in the human genome, and presents novel strategies for the prediction of allele pathogenicity.

Using the zebrafish model system as an in vivo assay of allele function, we identified a novel driver of Bardet-Biedl Syndrome (BBS) in CEP76. A combination of targeted sequencing of 785 cilia-associated genes in a cohort of BBS patients and subsequent in vivo functional assays recapitulating the human phenotype gave strong evidence for the role of CEP76 mutations in the pathology of an affected family. This portion of the work demonstrated the necessity of functional testing in validating disease-associated mutations, and added to the catalogue of known BBS disease genes.

Further study into the role of copy-number variations (CNVs) in a cohort of BBS patients showed the significant contribution of CNVs to disease pathology. Using high-density array comparative genomic hybridization (aCGH), we were able to identify pathogenic CNVs as small as several hundred bp. Dissection of the constituent genes and in vivo experiments investigating epistatic interactions between affected genes allowed for an appreciation of several paradigms by which CNVs can contribute to disease. This study revealed that the contribution of CNVs to disease in BBS patients is much higher than previously expected, and demonstrated the necessity of considering the CNV contribution in future (and retrospective) investigations of human genetic disease.

Finally, we used a combination of comparative genomics and in vivo complementation assays to identify second-site compensatory modification of pathogenic alleles. These pathogenic alleles, which are found compensated in other species (termed compensated pathogenic deviations [CPDs]), represent a significant fraction (from 3% to 10%) of human disease-associated alleles. In silico pathogenicity prediction algorithms, a valuable method of allele prioritization, often misrepresent these alleles as benign, leading to the omission of possibly informative variants in studies of human genetic disease. We created a mathematical model that was able to predict CPDs and putative compensatory sites, and showed functionally in vivo that second-site mutation can mitigate the pathogenicity of disease alleles. Additionally, we made publicly available an in silico module for the prediction of CPDs and modifier sites.

These studies have advanced the ability to interpret the pathogenicity of multiple types of human variation and have made tools available for others to do the same.

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities and, as a result of this fact coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for assessing modern CT scanners that implement the aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., a computer detection algorithm) was implemented and used as the basis of the image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163%, depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
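
The detectability index referred to above is the output of a mathematical observer model. One common formulation, a non-prewhitening (NPW) model observer evaluated in the Fourier domain, combines a task function, a system transfer function, and a noise power spectrum; the sketch below implements that generic formulation with illustrative placeholder inputs, not the dissertation's measured TTF or NPS.

```python
import numpy as np
from scipy.special import j1

def npw_detectability(task_fn, ttf, nps, d_area):
    """Non-prewhitening model-observer detectability index (Fourier-domain form)."""
    signal_power = (task_fn * ttf) ** 2
    numerator = (signal_power.sum() * d_area) ** 2
    denominator = (signal_power * nps).sum() * d_area
    return np.sqrt(numerator / denominator)

# Illustrative inputs only.
n, df = 256, 0.01                        # frequency grid size and step (cycles/mm)
fx = (np.arange(n) - n // 2) * df
FX, FY = np.meshgrid(fx, fx)
f = np.hypot(FX, FY)

contrast, radius = 20.0, 3.0             # a 6 mm diameter, 20 HU disc detection task
W = np.full_like(f, contrast * np.pi * radius ** 2)   # disc spectrum value at f = 0
nz = f > 0
W[nz] = contrast * radius * j1(2 * np.pi * f[nz] * radius) / f[nz]

ttf = np.exp(-(f / 0.6) ** 2)             # assumed Gaussian-shaped transfer function
nps = np.full_like(f, 50.0)               # assumed flat noise power spectrum

print("d' =", round(npw_detectability(W, ttf, nps, df * df), 2))
```

In practice the TTF and NPS terms would be measured from phantom images at each dose, size, and reconstruction setting, and the resulting d' values compared across conditions.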

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
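
The channelized Hotelling observer mentioned above reduces each image to a small number of channel responses and applies a Hotelling (linear discriminant) template to those responses. Below is a minimal sketch of a CHO with simple difference-of-Gaussians channels applied to simulated signal-present and signal-absent ROIs; the channel definitions and image statistics are placeholders, not the dissertation's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dog_channels(n, sigmas):
    """Difference-of-Gaussians radial channels, flattened to column vectors."""
    y, x = np.mgrid[:n, :n] - n // 2
    r2 = x ** 2 + y ** 2
    chans = []
    for s in sigmas:
        c = np.exp(-r2 / (2 * s ** 2)) - np.exp(-r2 / (2 * (0.6 * s) ** 2))
        chans.append(c.ravel() / np.linalg.norm(c))
    return np.stack(chans, axis=1)                     # (n*n, n_channels)

def cho_dprime(signal_present, signal_absent, channels):
    """Channelized Hotelling observer detectability from two image ensembles."""
    vp = signal_present.reshape(len(signal_present), -1) @ channels
    va = signal_absent.reshape(len(signal_absent), -1) @ channels
    dv = vp.mean(0) - va.mean(0)                       # mean channel-output difference
    s = 0.5 * (np.cov(vp, rowvar=False) + np.cov(va, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(s, dv)))

# Toy ensembles: a faint Gaussian blob in white noise (placeholders only).
n, n_img = 64, 200
y, x = np.mgrid[:n, :n] - n // 2
blob = 0.4 * np.exp(-(x ** 2 + y ** 2) / (2 * 4.0 ** 2))
noise = rng.normal(0, 1, size=(2 * n_img, n, n))
sp, sa = noise[:n_img] + blob, noise[n_img:]

channels = dog_channels(n, sigmas=[2, 4, 8, 16])
print("CHO d' ~", round(cho_dprime(sp, sa, channels), 2))
```

Correlating d' values from observer models like this one with measured human detection accuracy is what distinguishes the well-performing models from simple metrics such as CNR in the study described above.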

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
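
The image subtraction technique referred to above is commonly implemented by subtracting two repeated scans of the same phantom: the static background (uniform or textured) cancels, and what remains is quantum noise from both scans, hence a sqrt(2) scaling. A minimal sketch of that standard approach follows; `scan_a` and `scan_b` are stand-ins for two repeated acquisitions, not the dissertation's data.

```python
import numpy as np

def subtraction_noise(scan_a, scan_b, roi=None):
    """Estimate quantum noise (in HU) from two repeated scans of the same phantom."""
    diff = scan_a.astype(float) - scan_b.astype(float)
    if roi is not None:
        diff = diff[roi]
    # The difference image contains noise from both scans, hence divide by sqrt(2).
    return diff.std(ddof=1) / np.sqrt(2.0)

# Synthetic example standing in for two repeated textured-phantom scans.
rng = np.random.default_rng(1)
texture = rng.normal(0, 30, size=(256, 256))            # static background texture
scan_a = texture + rng.normal(0, 12, size=texture.shape)
scan_b = texture + rng.normal(0, 12, size=texture.shape)
print("estimated noise ~", round(subtraction_noise(scan_a, scan_b), 1), "HU (true 12)")
```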

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise ranged from 20% higher to 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and that texture should be considered when assessing the image quality of iterative algorithms.
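
For context, the conventional NPS estimator from which such work usually starts uses square, noise-only ROIs and an ensemble of repeated scans; the dissertation's contribution is extending this to irregularly shaped ROIs, which the sketch below does not attempt. The sketch shows the standard square-ROI estimator with synthetic data, as a baseline illustration only.

```python
import numpy as np

def nps_2d(noise_rois, pixel_size_mm):
    """Ensemble 2D noise power spectrum from square, noise-only ROIs.

    noise_rois: (n_rois, n, n) stack of noise realisations (e.g., from subtracted
    repeat scans). Returns the NPS (HU^2 * mm^2) and the frequency axis (cycles/mm).
    """
    n_rois, n, _ = noise_rois.shape
    nps = np.zeros((n, n))
    for roi in noise_rois:
        roi = roi - roi.mean()                       # detrend each ROI
        nps += np.abs(np.fft.fftshift(np.fft.fft2(roi))) ** 2
    nps *= pixel_size_mm ** 2 / (n * n * n_rois)     # standard normalisation
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size_mm))
    return nps, freqs

# Sanity check: white noise of sigma = 10 HU gives a flat NPS whose integral
# over frequency recovers the variance (~100 HU^2).
rng = np.random.default_rng(2)
rois = rng.normal(0, 10, size=(50, 64, 64))
nps, freqs = nps_2d(rois, pixel_size_mm=0.5)
df = freqs[1] - freqs[0]
print("recovered variance ~", round(nps.sum() * df * df, 1), "HU^2")
```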

To move beyond just assessing noise properties in textured phantoms and towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in the textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
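
The lesion models described above parameterize morphology (size, shape, contrast, edge profile) analytically so the lesion can be voxelized and inserted into a patient image. The sketch below shows one plausible form of such a model, a spherical lesion with a sigmoid edge profile, and a simple additive insertion into an image volume; the functional form and parameter values are assumptions for illustration, not the dissertation's actual equations (which, notably, inserted lesions into raw projection data in the later study).

```python
import numpy as np

def lesion_model(shape, center, radius_mm, contrast_hu, edge_mm, voxel_mm):
    """Voxelize a spherical lesion with a smooth (sigmoid) edge profile."""
    zz, yy, xx = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    r = voxel_mm * np.sqrt((zz - center[0]) ** 2 +
                           (yy - center[1]) ** 2 +
                           (xx - center[2]) ** 2)
    # Contrast falls from contrast_hu inside the lesion to 0 outside over ~edge_mm.
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_mm))

def insert_lesion(image_hu, lesion_hu):
    """Create a 'hybrid' image by adding the voxelized lesion to a patient image."""
    return image_hu + lesion_hu

# Example: a 6 mm, -15 HU lesion inserted at the centre of a synthetic volume.
volume = np.zeros((64, 64, 64))                      # stand-in for patient data
lesion = lesion_model(volume.shape, center=(32, 32, 32), radius_mm=3.0,
                      contrast_hu=-15.0, edge_mm=0.5, voxel_mm=0.7)
hybrid = insert_lesion(volume, lesion)
print("min HU in hybrid ~", round(float(hybrid.min()), 1))
```

Because the inserted lesion's location and morphology are known exactly, detection or estimation performance on such hybrid images can be scored against ground truth, which is the advantage the dissertation highlights.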

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a two-alternative forced choice (2AFC) human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
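
In a 2AFC experiment, the proportion of correct responses maps onto a detectability index through d' = sqrt(2) * Phi^-1(Pc), which is one standard way to put human-observer results on the same scale as model-observer d' values. A minimal sketch of that conversion follows; the Pc values are illustrative, not data from the study.

```python
import numpy as np
from scipy.stats import norm

def dprime_from_2afc(percent_correct):
    """Convert 2AFC proportion correct to detectability index d'."""
    return np.sqrt(2.0) * norm.ppf(percent_correct)

# Illustrative proportions correct only.
for pc in (0.70, 0.80, 0.90):
    print(f"Pc = {pc:.2f}  ->  d' = {dprime_from_2afc(pc):.2f}")
```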

In conclusion, this dissertation provides the scientific community with a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: EGFR overexpression occurs in 27-55% of oesophagogastric adenocarcinomas, and correlates with poor prognosis. We aimed to assess addition of the anti-EGFR antibody panitumumab to epirubicin, oxaliplatin, and capecitabine (EOC) in patients with advanced oesophagogastric adenocarcinoma. METHODS: In this randomised, open-label phase 3 trial (REAL3), we enrolled patients with untreated, metastatic, or locally advanced oesophagogastric adenocarcinoma at 63 centres (tertiary referral centres, teaching hospitals, and district general hospitals) in the UK. Eligible patients were randomly allocated (1:1) to receive up to eight 21-day cycles of open-label EOC (epirubicin 50 mg/m(2) and oxaliplatin 130 mg/m(2) on day 1 and capecitabine 1250 mg/m(2) per day on days 1-21) or modified-dose EOC plus panitumumab (mEOC+P; epirubicin 50 mg/m(2) and oxaliplatin 100 mg/m(2) on day 1, capecitabine 1000 mg/m(2) per day on days 1-21, and panitumumab 9 mg/kg on day 1). Randomisation was blocked and stratified for centre region, extent of disease, and performance status. The primary endpoint was overall survival in the intention-to-treat population. We assessed safety in all patients who received at least one dose of study drug. After a preplanned independent data monitoring committee review in October, 2011, trial recruitment was halted and panitumumab withdrawn. Data for patients on treatment were censored at this timepoint. This study is registered with ClinicalTrials.gov, number NCT00824785. FINDINGS: Between June 2, 2008, and Oct 17, 2011, we enrolled 553 eligible patients. Median overall survival in 275 patients allocated EOC was 11.3 months (95% CI 9.6-13.0) compared with 8.8 months (7.7-9.8) in 278 patients allocated mEOC+P (hazard ratio [HR] 1.37, 95% CI 1.07-1.76; p=0.013). mEOC+P was associated with increased incidence of grade 3-4 diarrhoea (48 [17%] of 276 patients allocated mEOC+P vs 29 [11%] of 266 patients allocated EOC), rash (29 [11%] vs two [1%]), mucositis (14 [5%] vs none), and hypomagnesaemia (13 [5%] vs none) but reduced incidence of haematological toxicity (grade ≥ 3 neutropenia 35 [13%] vs 74 [28%]). INTERPRETATION: Addition of panitumumab to EOC chemotherapy does not increase overall survival and cannot be recommended for use in an unselected population with advanced oesophagogastric adenocarcinoma. FUNDING: Amgen, UK National Institute for Health Research Biomedical Research Centre.

Relevance:

30.00%

Publisher:

Abstract:

Key Performance Indicators (KPIs) and their predictions are widely used by enterprises for informed decision making. Nevertheless, a very important factor, which is generally overlooked, is that the top-level strategic KPIs are actually driven by the operational-level business processes. These two domains are, however, mostly segregated and analysed in silos with different Business Intelligence solutions. In this paper, we propose an approach for advanced Business Simulations, which converges the two domains by utilising process execution and business data, together with concepts from Business Dynamics (BD) and Business Ontologies, to promote better system understanding and detailed KPI predictions. Our approach incorporates the automated creation of Causal Loop Diagrams, thus empowering the analyst to critically examine the complex dependencies hidden in the massive amounts of available enterprise data. We have further evaluated our proposed approach in the context of a retail use case that involved verification of the automatically generated causal models by a domain expert.
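
For readers unfamiliar with Business Dynamics, the causal loops referred to above are typically simulated as simple stock-and-flow models whose state drives the KPI of interest. The toy sketch below simulates one reinforcing loop (marketing spend drives customer acquisition, customers drive revenue, and part of the revenue is reinvested in marketing); all variable names and coefficients are illustrative assumptions, not taken from the paper or its retail use case.

```python
import numpy as np

def simulate_kpi(steps=24, dt=1.0):
    """Simulate a toy reinforcing causal loop and return the revenue KPI over time."""
    customers, marketing = 1000.0, 50.0
    acquisition_rate, churn_rate = 0.8, 0.05      # customers per spend unit / per period
    revenue_per_customer, reinvest_share = 0.12, 0.10
    history = []
    for _ in range(steps):
        revenue = customers * revenue_per_customer            # the strategic KPI
        acquired = acquisition_rate * marketing
        churned = churn_rate * customers
        customers += dt * (acquired - churned)                # stock update
        marketing = reinvest_share * revenue                  # close the causal loop
        history.append(revenue)
    return np.array(history)

print(np.round(simulate_kpi()[:6], 1))
```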

Relevance:

30.00%

Publisher:

Abstract:

Literature describing the notion and practice of business models has grown considerably over the last few years. Innovative business models appear in every sector of the economy, challenging traditional ways of creating and capturing value. However, research describing the theoretical foundations of the field is scarce and many questions still remain. This article examines business models promoting various aspects of sustainable development and tests the explanatory power of two theoretical approaches, namely the resource-based view of the firm and transaction cost theory, regarding their emergence and successful market performance. Through the examples of industrial ecology and the sharing economy, the author shows that a sharp reduction of transaction costs (e.g. in the form of internet-based systems), coupled with resources that are widely available but not utilised before, may result in fast-growing new markets. This research also provides evidence for the notion that these two theoretical approaches can complement each other in explaining corporate behaviour.