961 results for chromosomal aberration and reconstruction


Relevance: 100.00%

Publisher:

Abstract:

Previous research has indicated that schematic eyes incorporating aspheric surfaces but lacking a gradient index are unable to model ocular spherical aberration and peripheral astigmatism simultaneously, which limits their use as wide-angle schematic eyes. This thesis challenges that assumption by investigating the flexibility of schematic eyes comprising aspheric optical surfaces and homogeneous optical media. The full variation of ocular component dimensions found in human eyes was established from the literature, and schematic eye parameter variants were limited to these dimensions. The levels of spherical aberration and peripheral astigmatism modelled by these schematic eyes were compared to the range of measured levels, also established from the literature. To simplify comparison of modelled and measured data, single-value parameters were introduced: the spherical aberration function (SAF) and the peripheral astigmatism function (PAF). Some ocular component variations produced a wide range of aberrations without exceeding the limits of human ocular components. The effect of ocular component variations on coma was also investigated, but no comparison could be made as no empirical data exist. It was demonstrated that by combined manipulation of a number of parameters in the schematic eyes it was possible to model all levels of ocular spherical aberration and peripheral astigmatism. However, the unique parameters of a human eye could not be obtained in this way, as a number of models could produce the same spherical aberration and peripheral astigmatism while giving very different coma levels. It was concluded that these schematic eyes are flexible enough to model the monochromatic aberrations tested, the absence of a gradient index being compensated for by altering the asphericity of one or more surfaces.

The catecholic cephalosporin BRL 41897A is resistant to β-lactamases and is taken up by bacteria via the iron transport system. The uptake of this antibiotic in E. coli uses the Fiu and Cir outer membrane proteins, whereas in P. aeruginosa it enters via the pyochelin transport system. In this thesis, mutants of K. pneumoniae resistant to BRL 41897A were isolated using TnphoA mutagenesis and used to study the mechanism of uptake of BRL 41897A by K. pneumoniae. The activity of BRL 41897A towards the parent strain (M10) was increased in iron-depleted media, whereas no significant differences were observed in the resistant (KSL) mutants. Three mutants (KSL19, KSL38 and KSL59) produced decreased amounts of certain iron-regulated outer membrane proteins. The uptake of 55Fe-BRL 41897A by M10 in iron-deficient medium was higher than in iron-rich medium, indicating the involvement of an iron transport system in the uptake of BRL 41897A by K. pneumoniae. Uptake by the KSL mutants in iron-deficient culture was higher than that by M10. This result, supported by analysis of outer membrane and periplasmic proteins of the KSL mutants, indicates that loss of one outer membrane protein can be compensated by overexpression of other outer membrane and/or periplasmic proteins. However, the increased uptake of BRL 41897A by the KSL mutants was not reflected in increased activity towards these strains, indicating defects in the transport of BRL 41897A that result in failure to reach the penicillin-binding protein target sites in the cytoplasmic membrane. Southern blotting of chromosomal digests and sequencing in one mutant (KSL19) showed that only one copy of TnphoA was inserted into its chromosome. A putative TnphoA-inserted gene in KSL19, designated kslA, carrying a signal sequence was identified. Transformation of a fragment containing the kslA gene into KSL19 cells restored sensitivity to BRL 41897A to that of the parent strain.
Database peptide sequence searches revealed that the product of the kslA gene in KSL19 has some amino acid homology with the E. coli ExbD protein, which is involved in stabilisation of the TonB protein.

Determining an appropriate research methodology is an important element of a research study, especially a doctoral study. It involves the approach to the entire research process, from theoretical underpinnings through data collection and analysis to developing solutions for the problems investigated. Research methodology is, in essence, focused on the problems to be investigated and therefore varies with them. Thus, identifying the methodology that best suits the research at hand is important, not only because it benefits achievement of the research objectives, but also because it serves to establish the credibility of the work. Research philosophy, approach, strategy, choice, and techniques are inherent components of the methodology. The research strategy provides the overall direction of the research, including the process by which the research is conducted. Case study, experiment, survey, action research, grounded theory and ethnography are examples of such research strategies. A case study is documented as an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between phenomenon and context are not clearly evident. Case study was adopted as the overarching research strategy in a doctoral study developed to investigate the resilience of construction Small and Medium-sized Enterprises (SMEs) in the UK to extreme weather events (EWEs). The research sought to investigate how construction SMEs are affected by EWEs, how they respond to the risk of EWEs, and the means of enhancing their resilience to future EWEs. By comparing and contrasting with the alternative strategies available, it is argued that utilising a case study strategy benefits the study in achieving its objectives and answering the research questions raised.
It is also claimed that the selected strategy will contribute towards addressing the call for improved methodological pluralism in construction management research, enhancing the understanding of the complex network of relationships pertinent to the industry and the phenomenon being studied.


Enhancing the resilience of local communities to weather extremes has gained significant interest over the years, amidst the increased intensity and frequency of such events. The fact that such weather extremes are forecast to further increase in number and severity has added extra weight to the importance of the issue. As a local community consists of a number of community groups, such as households, businesses and policy makers, the actions of the different community groups in combination will determine the resilience of the community as a whole. An important role has to be played in this regard by Small and Medium-sized Enterprises (SMEs), which are an integral segment of local communities in the UK. While it is recognised that they are vital to the economy of a country and determine the prosperity of communities, they are increasingly vulnerable to the effects of extreme weather. This paper discusses some of the exploratory studies conducted in the UK on SMEs and their ability to cope with extreme weather events, specifically flooding. Although a reasonable level of awareness of the risk was observed among the SMEs, this has not always resulted in increased preparedness, even where they are located in areas at risk of flooding. The attitude and the motivation to change differed widely between SMEs. The paper presents schemas by which SMEs can better identify their vulnerability, so that these schemas can be disseminated among a community of SMEs and taken forward to inform policy making in this area. The main contribution the paper makes to the body of knowledge is therefore a novel way of communicating to SMEs about improving resilience against extreme weather, which will inform some of the policy-making initiatives in the UK.

Despite Government investment in flood defence schemes, many properties remain at high risk of flooding, and a substantial portion of these properties are business establishments. Flooding can have serious consequences for businesses, including damage to property and stock, being out of business for a considerable period, and ultimately business failure. Recent flood events, such as those in 2007 and 2009 that affected many parts of the UK, have helped to establish the true costs of flooding to businesses. This greater understanding of the risks to businesses has heightened the need for business owners to adapt their businesses to the threat of future flooding. Government policy has now shifted away from investment in engineered flood defences towards encouraging the uptake of property-level flood resistance and resilience measures by businesses. However, implementing such adaptation strategies remains a challenge due to a range of reasons. A review of the current state of property-level flood risk adaptation of UK businesses is presented, drawing from the extant literature. Barriers that may hinder the uptake of property-level adaptation by businesses are revealed, and drivers that may enhance uptake and effectively overcome these barriers are also discussed. It is concluded that the professions of the construction sector have the potential to contribute towards the adaptation of business properties and thereby the flood resilience of businesses at risk of flooding.

This piece of research addresses the construction and reconstruction of Câmara Cascudo's discursive representations in Mário de Andrade's discourse. In order to describe, analyze and interpret these representations, we draw on semantic categories from Textual Analysis of Discourse (TAD), articulating them with other categories, notably from Grize's logic (1996, 1997), text linguistics and semantics. The purpose is thus to analyze how these representations are constructed discursively, in written letters, by means of semantic categories such as referentiation, predication, modification, connection, and spatial and temporal location. The theoretical foundation articulates the proposals of Textual Analysis of Discourse, conceived by the linguist Jean-Michel Adam (1990, 2008a, 2011a), with text linguistics, semantics and logic, focusing especially on the phenomenon of discursive representations. The research approach is qualitative in nature, supported by some quantitative data (OLIVEIRA, 2012), an option which makes the analysis richer and more comprehensive. The hypothesis is that the categories used by Mário de Andrade in his discourse not only enable the (re)construction of the interlocutor's discursively constructed images, but also provide a multiplicity of information and viewpoints about the RN writer's personality. The study corpus consists of 20 texts written by Mário de Andrade and sent to Câmara Cascudo between 1924 and 1944, from which 35 fragments were selected and analyzed. It can be verified that, in the analyzed corpus, a set of discursive representations of Câmara Cascudo is constructed from the semantic categories proposed for analysis and used in Mário de Andrade's discourse. These categories make it possible to construct and reconstruct the representations that emerge in the texts.
The analysis therefore points to the construction of a set of different representations, highlighting those of the writer, the intellectual and the friend.

This work has as its research subject the popular education policies of the city of Natal, Rio Grande do Norte, from 1957 to 1964. It aims to identify and analyze the popular education policies developed and implemented by the Municipality of Natal in those years. To obtain the historical data, we established the following guiding research question: which educational policies were elaborated and implemented by the Municipality of Natal between 1957 and 1964? We adopted as method the Evidential Paradigm, as proposed in Pinheiro (2009). The research is anchored in documentary sources of educational legislation at the national, state and municipal levels, as well as in the newspapers Folha de Tarde and Jornal de Natal; in documents from the archives of the Historical and Geographical Institute of Rio Grande do Norte (IHGRN) and the Municipal Public Archives of Natal; in iconographic sources; and in interviews and academic publications. In addition to these sources, we were inspired by the works of Aristotle (2011), Hobbes (2009), Freire (2011), Góes (1980), Germano (1989), Cortez (2005) and Galvão (2004). This research allowed us to understand that the popular education policies of Natal (RN) were based on a democratic educational practice supported on three pillars, namely: participation and involvement of Natal's population; construction and reconstruction of teaching practices, prioritizing in their action programs mass literacy and the training of lay teachers; and the democratization of culture. This historical process made Natal an educating city.

The fast growth of the elderly population is a reality throughout the world and has become one of the greatest challenges for contemporary public health. When considering increased life expectancy and aging as a multidimensional phenomenon, one should highlight the need to investigate whether increased longevity is associated with satisfactory levels of Quality of Life (QOL). This study has the objective of assessing the QOL of elderly people from Paraíba's Western Curimataú microregion, explained by their health and living conditions. This is a cross-sectional, observational study with a quantitative design, held with 444 elderly people from five cities: Barra de Santa Rosa, Cuité, Nova Floresta, Remígio and Sossego. In order to obtain information, the following instruments were used: I) a questionnaire for collecting data on the elderly population, covering sociodemographic, clinical and behavioral characteristics; and II) the WHOQOL-Old questionnaire, with a view to measuring and assessing QOL. Data were processed in the IBM-SPSS Statistics 20.0 software by means of the one-way ANOVA, Student's t, Mann-Whitney, Kruskal-Wallis and Pearson's correlation tests, with p-values < 0.05 accepted as statistically significant. The results indicate a good global QOL (ETT = 65.69%), with better assessment by elderly men, those aged between 60 and 74 years, married, living with partner and children, without a caregiver, physical activity practitioners, with up to one health problem with regard to multimorbidity, and with very good and/or good assessment of basic needs. Self-reported stress showed a significant negative correlation with global QOL: the greater the perception of stress, the worse the assessment of QOL. In the faceted assessment of QOL, Sensory Operation showed the best performance (ETF = 68.86%) and Social Participation (SP) the worst (ETF = 60.37%).
In the multiple linear regression model, SP alone accounts for 51.8% (R² = 0.518) of the explanation of global QOL. In the intercorrelation among the WHOQOL-Old facets, only Death and Dying did not reveal significance. The harmony highlighted among the facets raises the need to ensure comprehensive health care for the elderly population, especially in understanding social participation as an intrinsic part of QOL, one that requires the re-discussion and reconstruction of individual and collective, family and community, political and governmental actions. Hence, guaranteeing an active, healthy and participatory aging, with QOL, is the major challenge.
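The regression result above (SP alone accounting for 51.8% of global QOL) can be illustrated with a minimal ordinary least squares sketch. The data below are simulated, not the study's actual WHOQOL-Old responses; all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a Social Participation (SP) facet score and a global
# QOL score for n simulated respondents -- illustrative values only.
n = 444
sp = rng.normal(60, 10, n)                 # SP facet, percent scale
qol = 0.7 * sp + rng.normal(0, 7, n) + 20  # global QOL with noise

# Fit global QOL on SP by ordinary least squares and compute R^2,
# the share of QOL variance explained by SP (the study reports 0.518).
X = np.column_stack([np.ones(n), sp])
beta, *_ = np.linalg.lstsq(X, qol, rcond=None)
pred = X @ beta
ss_res = np.sum((qol - pred) ** 2)
ss_tot = np.sum((qol - qol.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

With the simulated signal-to-noise ratio chosen here, R² comes out near 0.5, in the same range as the reported value.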


X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection, FBP, vs. Advanced Modeled Iterative Reconstruction, ADMIRE). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
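As a rough illustration of the simplest of these observer models, the sketch below computes CNR and the non-prewhitening (NPW) matched-filter detectability index d' for a synthetic disc lesion in white Gaussian noise. The image size, lesion size, contrast, and noise level are assumptions chosen for illustration, not the dissertation's actual settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic task: a low-contrast disc lesion in white Gaussian noise.
N, radius, contrast, sigma = 64, 6, 10.0, 25.0
y, x = np.mgrid[:N, :N]
signal = contrast * (((x - N // 2) ** 2 + (y - N // 2) ** 2) <= radius ** 2)

# CNR: lesion contrast over background noise standard deviation.
cnr = contrast / sigma

# NPW matched filter in white noise: the template is the expected signal
# itself, giving d' = ||s|| / sigma analytically.
d_prime = np.linalg.norm(signal) / sigma

# Empirical check: template responses to signal-present vs signal-absent
# images; d' is the separation of the two response distributions.
t = signal.ravel()
present = [t @ (signal + rng.normal(0, sigma, (N, N))).ravel() for _ in range(2000)]
absent = [t @ rng.normal(0, sigma, (N, N)).ravel() for _ in range(2000)]
d_emp = (np.mean(present) - np.mean(absent)) / np.std(absent)
print(f"CNR = {cnr:.2f}, analytic d' = {d_prime:.2f}, empirical d' = {d_emp:.2f}")
```

Note how the same lesion can have a modest CNR yet a large d', since d' grows with lesion area; this is one reason CNR alone correlates poorly with detection performance.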

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity, and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing the image quality of iterative algorithms.
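The ensemble NPS estimation from repeated scans can be sketched as follows. The noise model (3x3-smoothed white noise standing in for reconstruction correlations) and the pixel pitch are assumptions chosen only to illustrate the mean-subtraction and DFT-averaging steps:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ensemble NPS estimation: subtract the ensemble mean over repeated scans
# to isolate the noise, then average the squared 2D DFT over realizations.
n_scans, roi = 50, 64
phantom = np.full((roi, roi), 100.0)  # stand-in for a uniform phantom

def correlate(img):
    # cheap periodic 3x3 smoothing to mimic reconstruction correlations
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(img, dy, 0), dx, 1)
    return out / 9.0

scans = np.array([phantom + correlate(rng.normal(0, 10, (roi, roi)))
                  for _ in range(n_scans)])
noise = scans - scans.mean(axis=0)  # remove deterministic structure

pixel = 0.5  # mm, assumed pixel pitch
nps = np.mean(np.abs(np.fft.fft2(noise)) ** 2, axis=0) * pixel ** 2 / roi ** 2

# Sanity check (Parseval): the NPS integrates back to the noise variance.
var_from_nps = nps.sum() / (roi ** 2 * pixel ** 2)
print(f"noise variance = {noise.var():.3f}, from NPS = {var_from_nps:.3f}")
```

The Parseval check at the end is a useful habit when implementing NPS code: if the integral of the NPS does not reproduce the measured pixel variance, the normalization is wrong.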

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized, using a genetic algorithm, to match the texture in the liver regions of actual patient CT images. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
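A minimal sketch of the kind of analytical lesion model described above is a radially symmetric lesion whose contrast rolls off at the edge through a sigmoid profile, which can then be voxelized for insertion into images. The functional form and every parameter below are assumptions for illustration, not the dissertation's actual models:

```python
import numpy as np

def lesion(shape, center, radius, contrast, edge_width):
    """Voxelize a hypothetical spherical lesion with a sigmoid edge profile."""
    grids = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    r = np.sqrt(sum((g - c) ** 2 for g, c in zip(grids, center)))
    # sigmoid edge: ~contrast inside, ->0 outside, smooth over edge_width
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

# A subtle -15 HU lesion on a 32^3 grid; adding this array to a patient
# volume would produce a "hybrid" image with exactly known ground truth.
vox = lesion((32, 32, 32), (16, 16, 16), radius=6, contrast=-15.0, edge_width=1.5)
print(vox[16, 16, 16], vox[16, 16, 31])  # core near -15 HU, far voxel near 0
```

Because the model is an analytical equation, the lesion's size, contrast, and edge profile are known exactly, which is precisely what makes hybrid images useful for estimability studies.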

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at six dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5); lesion-free images were also reconstructed. Noise, contrast, contrast-to-noise ratio (CNR), and the detectability index of an observer model (non-prewhitening matched filter) were assessed. Compared with FBP, SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased the detectability index by 65%. Further, a two-alternative forced-choice (2AFC) human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% relative to the standard-of-care dose.
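The non-prewhitening (NPW) matched-filter observer uses the expected signal itself as its template. In practice such studies evaluate d' in the frequency domain with the measured system transfer and noise power spectrum; the simplified image-domain, white-noise form below is only a sketch of the concept, together with the standard link between d' and 2AFC percent correct:

```python
import numpy as np
from math import erf

def dprime_npw(expected_signal, noise_sigma):
    """NPW matched filter: template t equals the expected signal s.
    For uncorrelated (white) noise of std sigma, d' reduces to ||s|| / sigma.
    (Real CT noise is correlated; the full form weights by the NPS.)"""
    s = np.asarray(expected_signal, dtype=float).ravel()
    return np.linalg.norm(s) / noise_sigma

def pc_2afc(dprime):
    """Predicted 2AFC percent correct: Pc = Phi(d'/sqrt(2)),
    which simplifies to 0.5 * (1 + erf(d'/2))."""
    return 0.5 * (1.0 + erf(dprime / 2.0))

# Hypothetical numbers: a 5x5-pixel, -15 HU lesion in 30 HU noise
d = dprime_npw(np.full((5, 5), -15.0), noise_sigma=30.0)
pc = pc_2afc(d)
```

This mapping from d' to percent correct is what allows a model observer's detectability index to be compared against human 2AFC performance when estimating dose reduction potential.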

In conclusion, this dissertation provides the scientific community with a series of new methodologies, phantoms, analysis techniques, and modeling tools for rigorously assessing image quality from modern CT systems. In particular, methods to properly evaluate iterative reconstruction have been developed and are expected to aid the safe clinical implementation of dose reduction technologies.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The Atlantic is regarded as a huge carbonate depocenter owing to an on average deep calcite lysocline. However, calculations and models that tie the calcite lysocline to the critical undersaturation depth (the hydrographic or chemical lysocline), rather than to the depth at which significant calcium carbonate dissolution is actually observed (the sedimentary calcite lysocline), strongly overestimate the preservation potential of calcareous deep-sea sediments. On that basis, significant calcium carbonate dissolution would be expected to begin only below 5000 m in the deep Guinea and Angola Basins and below 4400 m in the Cape Basin. Our study, based on different calcium carbonate dissolution stages of the planktic foraminifer Globigerina bulloides, clearly shows that dissolution starts 400 to 1600 m shallower, depending on the hydrographic setting within the South Atlantic Ocean. Coastal areas in particular are severely affected by an increased supply of organic matter and the resultant production of metabolic CO2, which appears to create microenvironments favorable for calcite dissolution well above the hydrographic lysocline.
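The "critical undersaturation depth" above refers to where seawater becomes undersaturated with respect to calcite. That condition is conventionally expressed through the saturation state Ω; the sketch below uses purely illustrative numbers, not measured South Atlantic concentrations:

```python
def calcite_saturation_state(ca, co3, ksp_prime):
    """Calcite saturation state: Omega = [Ca2+][CO3 2-] / K'sp.

    Omega < 1 means the water is undersaturated and calcite can dissolve.
    K'sp (the stoichiometric solubility product) increases with pressure,
    so Omega decreases with depth; the depth where Omega crosses 1 is the
    chemical (hydrographic) lysocline discussed in the text.
    Inputs here are illustrative placeholders, not real ocean data.
    """
    return (ca * co3) / ksp_prime

# Illustrative: the same ion concentrations are supersaturated at a shallow
# K'sp but undersaturated once pressure has raised K'sp sufficiently
omega_shallow = calcite_saturation_state(ca=0.01, co3=1e-4, ksp_prime=5e-7)  # > 1
omega_deep = calcite_saturation_state(ca=0.01, co3=1e-4, ksp_prime=2e-6)     # < 1
```

The abstract's point is that sedimentary dissolution can begin well above the Ω = 1 horizon, because metabolic CO2 in pore-water microenvironments lowers carbonate ion concentration locally.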

Relevância:

100.00% 100.00%

Publicador:

Resumo:

In this study, we test various parameters in deep-sea sediments (bulk sediment parameters and changes in microfossil abundances and preservation) that are generally accepted as indicators of calcium carbonate dissolution. We investigate sediment material from station GeoB 1710-3 in the northern Cape Basin (eastern South Atlantic), 280 km off the Namibian coast and well outside today's coastal upwelling. As northern Benguela upwelling cells were displaced westward and periodically passed over the core location during the past 245 kyr (Volbers et al., submitted), GeoB 1710-3 sediments record these changes in upwelling productivity. The most commonly used calcium carbonate dissolution proxies not only monitor dissolution within these calcareous sediments but also reflect changes in upwelling intensity; accordingly, these conventional proxy parameters partly misrepresent the extent of calcium carbonate dissolution. These results were verified with an independent dissolution proxy, the Globigerina bulloides dissolution index (BDX') (Volbers and Henrich, 2002, doi:10.1016/S0025-3227(02)00333-X). The BDX' is based on scanning electron microscope investigation of the ultrastructure of planktonic foraminiferal tests and indicates persistently good carbonate preservation throughout the past 245 kyr, with the exception of one pronounced dissolution event in early oxygen isotope stage (OIS) 6. Early OIS 6 is characterized by calcium carbonate contents, sand contents, and planktonic foraminiferal concentrations all at their lowest levels of the last 245 kyr. At the same time, the ratio of radiolarian to planktonic foraminiferal abundances and the ratio of benthic to planktonic foraminiferal tests are strongly increased, as are the rain ratio, the fragmentation index, and the BDX'.
The sedimentary calcite lysocline rose above the core position, and GeoB 1710-3 sediments were heavily altered, as attested by the unusual accumulation of pellets, aggregates, sponge spicules, radiolaria, benthic foraminifera, and planktonic foraminiferal assemblages. Only the early OIS 6 dissolution event altered the coarse fraction intensely enough to be reflected by all conventional calcium carbonate preservation proxies as well as the BDX'. We attribute the more-than-1000 m rise of the sedimentary calcite lysocline to the combination of two processes: (a) a prominent change in the deep-water mass distribution within the South Atlantic and (b) intense degradation of organic material within the sediment (preserved as a maximum in total organic carbon content), creating microenvironments favorable for calcium carbonate dissolution.
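Two of the count-based proxies named above admit a compact worked form. The formulations below are common generic definitions, labeled as such; the exact conventions applied to the GeoB 1710-3 counts may differ:

```python
def fragmentation_index(fragments, whole_tests):
    """Foraminiferal fragmentation index (%) in a common generic form:
    fragments / (fragments + whole tests) * 100. Higher values indicate
    stronger dissolution (tests break apart as they dissolve).
    The exact formulation used for GeoB 1710-3 may differ."""
    return 100.0 * fragments / (fragments + whole_tests)

def benthic_planktonic_ratio(benthic, planktonic):
    """Benthic/planktonic foraminifera ratio (%): rises when dissolution
    preferentially removes the more fragile planktonic tests."""
    return 100.0 * benthic / (benthic + planktonic)

# Hypothetical counts for a single sample
fi = fragmentation_index(fragments=300, whole_tests=700)
bp = benthic_planktonic_ratio(benthic=10, planktonic=90)
```

As the abstract notes, such count-based proxies respond to changes in the supply of each component as well as to dissolution, which is why an independent, preservation-based index like the BDX' was needed to separate the two signals.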