879 results for Metrics of management


Relevance: 90.00%

Abstract:

Magnetoencephalography (MEG) is the measurement of the magnetic fields generated outside the head by the brain’s electrical activity. The technique offers the promise of high temporal and spatial resolution. There is, however, an ambiguity in the inversion process of estimating what goes on inside the head from what is measured outside. Other techniques, such as functional Magnetic Resonance Imaging (fMRI), have no such inversion problem yet suffer from poorer temporal resolution. In this study we examined metrics of mutual information and linear correlation between volumetric images from the two modalities. Measures of mutual information reveal a significant, non-linear relationship between MEG and fMRI datasets across a number of frequency bands.
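
As a rough illustration of the mutual-information metric used to compare the two modalities, the sketch below computes histogram-based mutual information between two co-registered volumes; the arrays, bin count, and variable names are illustrative stand-ins, not the study's actual data or implementation.

```python
import numpy as np

def mutual_information(x, y, bins=64):
    """Histogram-based mutual information between two co-registered volumes.

    x, y : arrays of identical shape (voxel intensities); result is in bits.
    """
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()              # joint probability
    px = pxy.sum(axis=1, keepdims=True)    # marginal over y
    py = pxy.sum(axis=0, keepdims=True)    # marginal over x
    nz = pxy > 0                           # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Illustrative use with random stand-ins for MEG and fMRI volumes
meg_vol = np.random.rand(32, 32, 32)
fmri_vol = 0.5 * meg_vol + 0.5 * np.random.rand(32, 32, 32)  # partially dependent
print(mutual_information(meg_vol, fmri_vol))
```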

Relevance: 90.00%

Abstract:

Analysing the molecular polymorphism and interactions of DNA, RNA and proteins is of fundamental importance in biology. Predicting the functions of polymorphic molecules is important in order to design more effective medicines. Analysing major histocompatibility complex (MHC) polymorphism is important for mate choice, epitope-based vaccine design and transplantation rejection, among other applications. Most existing exploratory approaches cannot analyse these datasets because of the large number of molecules with a high number of descriptors per molecule. This thesis develops novel methods for data projection in order to explore high-dimensional biological datasets by visualising them in a low-dimensional space. With increasing dimensionality, some existing data visualisation methods such as generative topographic mapping (GTM) become computationally intractable. We propose variants of these methods, in which log-transformations are used at certain steps of the expectation-maximisation (EM) based parameter learning process, to make them tractable for high-dimensional datasets. We demonstrate these proposed variants on both synthetic data and an electrostatic potential dataset of MHC class-I. We also propose to extend a latent trait model (LTM), suitable for visualising high-dimensional discrete data, to simultaneously estimate feature saliency as an integrated part of the parameter learning process of the visualisation model. This LTM variant not only gives better visualisation, by modifying the projection map based on feature relevance, but also helps users to assess the significance of each feature. Another problem that is not addressed much in the literature is the visualisation of mixed-type data. We propose to combine GTM and LTM in a principled way, using an appropriate noise model for each type of data, in order to visualise mixed-type data in a single plot. We call this model a generalised GTM (GGTM). We also extend the GGTM model to estimate feature saliencies while training the visualisation model; this is called GGTM with feature saliency (GGTM-FS). We evaluate visualisation quality using quality metrics such as a distance distortion measure and rank-based measures: trustworthiness, continuity, and mean relative rank errors with respect to data space and latent space. In cases where the labels are known, we also use the quality metrics of KL divergence and nearest-neighbour classification error in order to determine the separation between classes. We demonstrate the efficacy of these proposed models on both synthetic and real biological datasets, with a main focus on the MHC class-I dataset.
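
As an illustration of why log-domain computation matters here, the sketch below shows a log-sum-exp normalisation of GTM-style E-step responsibilities; it is a generic numerical-stability device under our own assumptions, not the thesis's exact reformulation.

```python
import numpy as np
from scipy.special import logsumexp

def gtm_responsibilities(log_lik):
    """E-step responsibilities computed entirely in log space.

    log_lik : (K, N) array of log p(x_n | latent grid node k).  For very
    high-dimensional data the raw likelihoods underflow to zero, so the
    normalisation uses log-sum-exp rather than exponentiating first.
    """
    log_resp = log_lik - logsumexp(log_lik, axis=0, keepdims=True)
    return np.exp(log_resp)          # values are now <= 1, safe to exponentiate

# Illustrative: 100 latent nodes, 5 data points, log-likelihoods so negative that
# exp(log_lik) would underflow to 0.0 and break a naive normalisation
rng = np.random.default_rng(0)
log_lik = -rng.random((100, 5)) * 2000.0
resp = gtm_responsibilities(log_lik)
print(resp.sum(axis=0))              # each column sums to 1
```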

Relevance: 90.00%

Abstract:

In terms of binary relations, the author analyses the task of an individual consumer’s choice over a set of teaching excerpts. It is suggested that the consumer’s value function be analysed as an additive reduction. For localising the vector of weighting coefficients of the additive reduction, procedures based on metrics of the distance of objects from an ideal point are suggested.
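
As a hedged reconstruction in generic notation (the symbols and the Minkowski form of the distance are our assumptions, not the author's), the additive value function and an ideal-point distance criterion can be written as:

```latex
% Additive value function over m criteria with weights w_j
u(x) \;=\; \sum_{j=1}^{m} w_j\, v_j(x_j), \qquad w_j \ge 0,\; \sum_{j=1}^{m} w_j = 1.

% Localising the weight vector via the distance of an object x from the ideal point x^{*}
d_p(x, x^{*}) \;=\; \Bigl( \sum_{j=1}^{m} w_j \,\bigl| v_j(x^{*}_j) - v_j(x_j) \bigr|^{p} \Bigr)^{1/p}.
```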

Relevance: 90.00%

Abstract:

This research sought to understand the role that differentially assessed lands (lands in the United States given tax breaks in return for their guarantee to remain in agriculture) play in influencing urban growth. Our method was to calibrate the SLEUTH urban growth model under two different conditions. The first used an excluded layer that ignored such lands, effectively rendering them available for development. The second treated those lands as totally excluded from development. Our hypothesis was that excluding those lands would yield better metrics of fit with past data. Our results validate our hypothesis, since two different metrics that evaluate goodness of fit both yielded higher values when differentially assessed lands were treated as excluded. This suggests that, at least in our study area, differential assessment, which protects farm and ranch lands for tenuous periods of time, has indeed allowed farmland to resist urban development. Including differentially assessed lands also yielded very different calibrated coefficients of growth as the model tried to account for the same growth patterns over two very different excluded areas. Excluded layer design can greatly affect model behavior. Since differentially assessed lands are quite common throughout the United States and are often ignored in urban growth modeling, the findings of this research can assist other urban growth modelers in designing excluded layers that result in more accurate model calibration and thus forecasting.
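
For context, one widely used goodness-of-fit metric in SLEUTH calibration is the Lee-Sallee shape index (intersection over union of modeled and observed urban extent); the sketch below computes it for two hypothetical model runs, with purely synthetic maps standing in for real calibration data.

```python
import numpy as np

def lee_sallee(modeled, observed):
    """Lee-Sallee shape index: intersection over union of urban cells (1 = urban).

    Values closer to 1 indicate better spatial agreement between the modeled
    and observed urban extents.
    """
    modeled = modeled.astype(bool)
    observed = observed.astype(bool)
    union = np.logical_or(modeled, observed).sum()
    return np.logical_and(modeled, observed).sum() / union if union else 0.0

# Illustrative comparison of two excluded-layer scenarios against the same observed map
observed = np.random.rand(100, 100) > 0.7
run_ignoring_da_lands = np.random.rand(100, 100) > 0.7               # unrelated to observed
run_excluding_da_lands = observed ^ (np.random.rand(100, 100) > 0.95)  # close to observed
print(lee_sallee(run_ignoring_da_lands, observed),
      lee_sallee(run_excluding_da_lands, observed))
```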

Relevance: 90.00%

Abstract:

Seascape ecology provides a useful framework from which to understand the processes governing spatial variability in ecological patterns. Seascape context, or the composition and pattern of habitat surrounding a focal patch, has the potential to impact resource availability, predator-prey interactions, and connectivity with other habitats. For my dissertation research, I combined a variety of approaches to examine how habitat quality for fishes is influenced by a diverse range of seascape factors in sub-tropical, back-reef ecosystems. In the first part of my dissertation, I examined how seascape context can affect reef fish communities on an experimental array of artificial reefs created in various seascape contexts in Abaco, Bahamas. I found that the amount of seagrass at large spatial scales was an important predictor of community assembly on these reefs. Additionally, seascape context had differing effects on various aspects of habitat quality for the most common reef species, White grunt Haemulon plumierii. The amount of seagrass at large spatial scales had positive effects on fish abundance and secondary production, but not on metrics of condition and growth. The second part of my dissertation focused on how foraging conditions for fish varied across a linear seascape gradient in the Loxahatchee River estuary in Florida, USA. Gray snapper, Lutjanus griseus, traded food quality for quantity along this estuarine gradient, maintaining similar growth rates and condition among sites. Additional work focused on identifying major energy flow pathways to two consumers in oyster-reef food webs in the Loxahatchee. Algal and microphytobenthos resource pools supported most of the production to these consumers, and body size for one of the consumers mediated food web linkages with surrounding mangrove habitats. Each of these studies examined a different facet of the importance of seascape context in governing ecological processes occurring in focal habitats, and together they underscore the role of connectivity among habitats in back-reef systems. The results suggest that management approaches should consider the surrounding seascape when prioritizing areas for conservation or attempting to understand the impacts of seascape change on focal habitat patches. For this reason, spatially based management approaches are recommended to most effectively manage back-reef systems.

Relevance: 90.00%

Abstract:

1. The niche variation hypothesis predicts that among-individual variation in niche use will increase in the presence of intraspecific competition and decrease in the presence of interspecific competition. We sought to determine whether the local isotopic niche breadth of fish inhabiting a wetland was best explained by competition for resources and the niche variation hypothesis, by dispersal of individuals from locations with different prey resources or by a combination of the two. We analysed stable isotopes of carbon and nitrogen as indices of feeding niche and compared metrics of within-site spread to characterise site-level isotopic niche breadth. We then evaluated the explanatory power of competing models of the direct and indirect effects of several environmental variables spanning gradients of disturbance, competition strength and food availability on among-individual variation of the eastern mosquitofish (Gambusia holbrooki).

2. The Dispersal model posits that only the direct effect of disturbance (i.e. changes in water level known to induce fish movement) influences among-individual variation in isotopic niche. The Partitioning model allows for only direct effects of local food availability on among-individual variation. The Combined model allows for both hypotheses by including the direct effects of disturbance and food availability.

3. A linear regression of the Combined model described more variance than models limited to the variables of either the Dispersal or Partitioning models. Of the independent variables considered, the food availability variable (per cent edible periphyton) explained the most variation in isotopic niche breadth, followed closely by the disturbance variable (days since last drying event).

4. Structural equation modelling provided further evidence that the Combined model was best supported by the data, with the Partitioning and the Dispersal models only modestly less informative. Again, the per cent edible periphyton was the variable with the largest direct effect on niche variability, with other food availability variables and the disturbance variable only slightly less important. Indirect effects of heterospecific and conspecific competitor densities were also important, through their effects on prey density.

5. Our results support the Combined hypotheses, although partitioning mechanisms appear to explain the most diet variation among individuals in the eastern mosquitofish. The results also support some predictions of the niche variation hypothesis, although both conspecific and interspecific competition appeared to increase isotopic niche breadth in contrast to predictions that interspecific competition would decrease it. We think this resulted from high diet overlap of co-occurring species, most of which consume similar macroinvertebrates.
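
As an illustration of site-level isotopic niche breadth metrics of the kind described (metrics of within-site spread), the sketch below computes mean distance to centroid and convex hull area in δ13C-δ15N space for simulated data; the values and sample sizes are invented.

```python
import numpy as np
from scipy.spatial import ConvexHull

def niche_breadth_metrics(d13c, d15n):
    """Two common isotopic niche-breadth metrics for one site.

    Mean distance to centroid (MDC) and total convex hull area (TA) in the
    delta13C-delta15N bi-plot (Layman-style metrics; illustrative only).
    """
    pts = np.column_stack([d13c, d15n])
    centroid = pts.mean(axis=0)
    mdc = np.linalg.norm(pts - centroid, axis=1).mean()
    ta = ConvexHull(pts).volume   # in 2-D, .volume is the enclosed area
    return mdc, ta

# Simulated isotope values for fish collected at one wetland site
rng = np.random.default_rng(1)
d13c = rng.normal(-26.0, 1.2, size=30)
d15n = rng.normal(8.0, 0.8, size=30)
print(niche_breadth_metrics(d13c, d15n))
```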

Relevance: 90.00%

Abstract:

A growing human population, shifting human dietary habits, and climate change are negatively affecting global ecosystems on a massive scale. Expanding agricultural areas to feed a growing population drives extensive habitat loss, and climate change compounds stresses on both food security and ecosystems. Understanding the negative effects of human diet and climate change on agricultural and natural ecosystems provides a context within which potential technological and behavioral solutions can be proposed to help maximize conservation. The purpose of this research was to (1) examine the potential effects of climate change on the suitability of areas for commercial banana plantations in Latin America in the 2050s and how shifts in growing areas could affect protected areas; (2) test the ability of small unmanned aerial vehicles (UAVs) to map productivity of banana plantations as a potential tool for increasing yields and decreasing future plantation expansions; (3) project the effects on biodiversity of increasing rates of animal product consumption in developing megadiverse countries; and (4) estimate the capacity of global pasture biomass production with integrated gasification combined cycle and Fischer-Tropsch hydrocarbon synthesis (IGCC-FT) processing to meet electricity, gasoline and diesel needs. The results indicate that (1) the overall extent of areas suitable for conventional banana cultivation is predicted to decrease by 19% by 2050 because of a hotter and drier climate, but all current banana-exporting countries are predicted to maintain some suitable areas, with no effects on protected areas; (2) spatial patterns of NDVI and ENDVI were significantly positively correlated with several metrics of fruit yield and quality, indicating that UAV systems can be used in banana plantations to map spatial patterns of fruit yield; (3) livestock production is the single largest driver of habitat loss, and both livestock and feedstock production are increasing in developing biodiverse tropical countries, so reducing global animal product consumption should be at the forefront of strategies aimed at reducing biodiversity loss; and (4) removing livestock from global pasture lands and instead utilizing the biomass production could produce enough energy to meet 100% of the electricity, gasoline, and diesel needs of over 40 countries with extensive grassland ecosystems, primarily in tropical developing countries.
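
As an illustration of the vegetation indices referred to, the sketch below computes NDVI and one common formulation of ENDVI from band reflectances and correlates a plot-level index with simulated yield; ENDVI formulations vary between sensors, and all numbers here are synthetic.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, per pixel or per plot."""
    return (nir - red) / (nir + red + 1e-9)

def endvi(nir, green, blue):
    """Enhanced NDVI; one common formulation used with modified RGB/NIR cameras."""
    return ((nir + green) - 2 * blue) / ((nir + green) + 2 * blue + 1e-9)

# Illustrative: correlate a plot-level vegetation index with fruit yield
rng = np.random.default_rng(2)
nir, red = rng.random((2, 50))                        # mean reflectance per plantation plot
yield_kg = 30 + 40 * ndvi(nir, red) + rng.normal(0, 2, 50)
print(np.corrcoef(ndvi(nir, red), yield_kg)[0, 1])    # expected to be strongly positive
```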

Relevance: 90.00%

Abstract:

We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, and they focused on old-generation non-coherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward-error-correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to a 47% reduction in required regenerators, a substantial saving in equipment cost.
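
As a loose illustration of the graph-transformation idea (not the paper's actual algorithm), the sketch below builds a graph whose nodes are established lightpaths and whose edges mark pairs that share a fiber link and occupy spectrum-adjacent slots; the lightpaths, slot numbering, and guard band are made up.

```python
import networkx as nx

# Established lightpaths: the links they traverse, the spectrum slots they occupy,
# and a monitored QoT metric (e.g., pre-FEC BER) reported by the coherent receivers.
lightpaths = {
    "LP1": {"links": {("A", "B"), ("B", "C")}, "slots": range(0, 4),   "ber": 1e-4},
    "LP2": {"links": {("B", "C"), ("C", "D")}, "slots": range(4, 8),   "ber": 3e-4},
    "LP3": {"links": {("A", "B")},             "slots": range(10, 14), "ber": 8e-5},
}

def spectrum_neighbors(a, b, guard=1):
    """True if two lightpaths share a fiber link and sit in adjacent spectrum slots."""
    share_link = bool(a["links"] & b["links"])
    gap = min(abs(min(a["slots"]) - max(b["slots"])),
              abs(min(b["slots"]) - max(a["slots"])))
    return share_link and gap <= guard

# Graph transformation: nodes are lightpaths, edges mark space-and-spectrum interference
G = nx.Graph()
G.add_nodes_from(lightpaths)
for u in lightpaths:
    for v in lightpaths:
        if u < v and spectrum_neighbors(lightpaths[u], lightpaths[v]):
            G.add_edge(u, v)

print(list(G.edges()))  # LP1-LP2 interfere (shared link B-C, adjacent slots)
```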

Relevance: 90.00%

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities and, as a result of this fact coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
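
As an illustration of two of the metrics compared, the sketch below computes CNR and a non-prewhitening matched-filter detectability index on simulated signal-present and signal-absent images; the lesion geometry, contrast, and noise model are invented and far simpler than the CT images used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 2-D detection task: a low-contrast disc lesion in white noise
n, contrast, sigma = 64, 4.0, 20.0
y, x = np.mgrid[:n, :n]
lesion = contrast * ((x - n / 2) ** 2 + (y - n / 2) ** 2 < 8 ** 2)  # expected signal (template)

def realizations(signal_present, trials=200):
    """Generate noisy image realizations (illustrative noise model only)."""
    base = lesion if signal_present else np.zeros_like(lesion)
    return base + rng.normal(0, sigma, (trials, n, n))

present, absent = realizations(True), realizations(False)

# Non-prewhitening matched filter: test statistic is the inner product with the template
t_p = (present * lesion).sum(axis=(1, 2))
t_a = (absent * lesion).sum(axis=(1, 2))
d_npw = (t_p.mean() - t_a.mean()) / np.sqrt(0.5 * (t_p.var() + t_a.var()))

# Contrast-to-noise ratio measured in an ROI over the lesion
roi = lesion > 0
cnr = (present[:, roi].mean() - absent[:, roi].mean()) / absent[:, roi].std()

print(f"NPW d' = {d_npw:.2f}, CNR = {cnr:.2f}")
```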

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
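
As an illustration of the image subtraction technique for isolating quantum noise, the sketch below subtracts two simulated repeated scans of the same phantom; dividing the standard deviation of the difference by √2 recovers the single-image noise level. The phantom texture and noise values are synthetic.

```python
import numpy as np

def quantum_noise_from_subtraction(scan1, scan2, roi):
    """Noise magnitude from two repeated scans of the same phantom.

    Subtracting the scans removes the (identical) phantom background; the noise
    standard deviation of a single image is std(difference) / sqrt(2).
    """
    diff = scan1 - scan2
    return diff[roi].std() / np.sqrt(2)

# Illustrative use with simulated repeated scans of a textured phantom
rng = np.random.default_rng(4)
background = rng.normal(50, 10, (256, 256))          # stands in for phantom texture
scan1 = background + rng.normal(0, 12, (256, 256))   # quantum noise realization 1
scan2 = background + rng.normal(0, 12, (256, 256))   # quantum noise realization 2
roi = np.zeros((256, 256), bool)
roi[64:192, 64:192] = True
print(quantum_noise_from_subtraction(scan1, scan2, roi))  # approximately 12
```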

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and that texture should be considered when assessing image quality of iterative algorithms.
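
As an illustration of ensemble NPS estimation (the standard rectangular-ROI estimator, not the dissertation's novel irregular-ROI method), the sketch below averages the squared 2-D DFT of mean-subtracted noise ROIs; the data are simulated white noise, so the expected NPS is flat.

```python
import numpy as np

def nps_2d(rois, pixel_size_mm=0.5):
    """Ensemble 2-D noise power spectrum from rectangular noise-only ROIs.

    rois : (num_rois, ny, nx) array of noise realizations (e.g., subtraction images).
    Returns the NPS (e.g., HU^2 mm^2).  Standard rectangular-ROI estimator only.
    """
    num, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # remove per-ROI mean
    dft2 = np.abs(np.fft.fft2(detrended)) ** 2
    return dft2.mean(axis=0) * (pixel_size_mm ** 2) / (nx * ny)

# Illustrative use on simulated white-noise ROIs: the NPS should be roughly flat
rng = np.random.default_rng(5)
rois = rng.normal(0, 10, (50, 64, 64))
nps = nps_2d(rois)
print(nps.mean(), (10 ** 2) * (0.5 ** 2))   # mean NPS ~ variance * pixel area
```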

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
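
As a generic illustration of an analytical lesion model (size, contrast, and edge profile as a function of radius), the sketch below voxelizes a sigmoid-edged disc and inserts it into a uniform stand-in for a patient ROI; it is not the dissertation's actual model family.

```python
import numpy as np

def lesion_model(shape, center, radius_mm, contrast_hu, edge_mm, pixel_mm=0.7):
    """Simple analytical lesion: a disc of given contrast with a sigmoid edge profile.

    A generic illustration of describing size, contrast, and edge blur as an
    analytical function of radius; not the dissertation's actual lesion models.
    """
    y, x = np.indices(shape)
    r = np.hypot(x - center[1], y - center[0]) * pixel_mm            # radial distance in mm
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_mm))   # sigmoid edge falloff

# Voxelize the model and add it to a patient ROI to create a "hybrid" image
patient_roi = np.full((128, 128), 60.0)                 # stand-in for liver background (HU)
hybrid = patient_roi + lesion_model((128, 128), (64, 64),
                                    radius_mm=6, contrast_hu=-15, edge_mm=1.0)
print(hybrid[64, 64], hybrid[0, 0])                     # ~45 HU at the centre, ~60 HU far away
```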

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
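
For reference, under the usual equal-variance Gaussian assumption, 2AFC proportion correct relates to the detectability index as Pc = Φ(d′/√2); the sketch below applies the inversion to illustrative proportions, not the study's measured data.

```python
import numpy as np
from scipy.stats import norm

def dprime_from_2afc(percent_correct):
    """Convert 2AFC proportion correct to a detectability index: d' = sqrt(2) * Phi^-1(Pc)."""
    return np.sqrt(2) * norm.ppf(percent_correct)

# Illustrative: proportion correct measured under two hypothetical conditions
print(dprime_from_2afc(0.85), dprime_from_2afc(0.92))
```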

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance: 90.00%

Abstract:

This research adds to a body of work exploring the role of Social Network Analysis (SNA) in the study of both relational and structural characteristics of supply chain networks. Two contrasting network cases (food enterprises and digital-based enterprises) are chosen in order to elicit structural differences in business networks subject to divergences in local embeddedness and the relative materiality of the goods and services produced. Our analysis and findings draw out differences in network structure as evidenced by metrics of network centralization and cohesion, the presence of components and other sub-groupings, and the position of central actors. We relate these structural features both to the nature of the networks and to the (qualitative) experiences of the actors themselves. In particular, we find that customers act as co-creators of knowledge (for the Food network), that infrastructure and services play a central role (for the Digital network), and that ICT is an important source of codified knowledge inputs, alongside the continuing importance of geographical proximity for the development and transfer of tacit knowledge and for incremental learning.
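
As a minimal illustration of the structural metrics mentioned (centralization, cohesion, components, and central actors), the sketch below computes them with networkx on a toy supply-chain network; the nodes and ties are invented.

```python
import networkx as nx

# Toy directed supply-chain network: edges point from supplier to customer
G = nx.DiGraph([
    ("FarmA", "Processor"), ("FarmB", "Processor"),
    ("Processor", "Retailer1"), ("Processor", "Retailer2"),
    ("LogisticsCo", "Retailer1"), ("LogisticsCo", "Retailer2"),
])
U = G.to_undirected()

# Cohesion: density and connected components (sub-groupings)
print("density:", nx.density(U))
print("components:", [sorted(c) for c in nx.connected_components(U)])

# Freeman degree centralization: how dominated the network is by its most central actor
deg = dict(U.degree())
n = U.number_of_nodes()
max_deg = max(deg.values())
centralization = sum(max_deg - d for d in deg.values()) / ((n - 1) * (n - 2))
print("degree centralization:", centralization)

# Position of central actors
print("betweenness:", nx.betweenness_centrality(U))
```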