764 results for S960 QC teräs


Relevance:

10.00%

Publisher:

Abstract:

We give a relativistic spin network model for quantum gravity based on the Lorentz group and its q-deformation, the Quantum Lorentz Algebra. We propose a combinatorial model for the path integral given by an integral over suitable representations of this algebra. This generalises the state sum models for the case of the four-dimensional rotation group previously studied in gr-qc/9709028. As a technical tool, formulae for the evaluation of relativistic spin networks for the Lorentz group are developed, with some simple examples which show that the evaluation is finite in interesting cases. We conjecture that the `10J' symbol needed in our model has a finite value.

Relevance:

10.00%

Publisher:

Abstract:

Mammography equipment must be evaluated to ensure that images will be of acceptable diagnostic quality at the lowest radiation dose. Quality Assurance (QA) aims to provide systematic and continuous improvement through a feedback mechanism addressing the technical, clinical and training aspects. Quality Control (QC), in relation to mammography equipment, comprises a series of tests to determine equipment performance characteristics. The introduction of digital technologies prompted changes in QC tests and protocols, and some tests are specific to each manufacturer. Within each country, specific QC tests should comply with regulatory requirements and guidance. Ideally, one mammography practitioner should take overarching responsibility for QC within a service, with all practitioners having responsibility for the actual QC testing. All QC results must be documented to facilitate troubleshooting, internal audit and external assessment. Generally speaking, the practitioner’s role includes performing, interpreting and recording the QC tests, as well as reporting any results outside action limits to their service lead. Practitioners must undertake additional continuous professional development to maintain their QC competencies. They are usually supported by technicians and medical physicists; in some countries the involvement of the latter is mandatory. Technicians and/or medical physicists often perform many of the tests indicated within this chapter. It is important to recognise that this chapter is an attempt to encompass the main tests performed within European countries; you must familiarise yourself with, and adhere to, the specific tests related to the service within which you work.
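
To illustrate the documentation-and-reporting step described above, a minimal sketch might look like the following; the test names, units and action limits are hypothetical and not drawn from any particular protocol.

```python
# Minimal sketch of the "document every QC result and report anything outside
# action limits" workflow described above. Test names, units and limits are
# hypothetical examples, not values from any specific QC protocol.
from datetime import date

# Hypothetical action limits: test name -> (minimum, maximum)
ACTION_LIMITS = {
    "mean_glandular_dose_mGy": (0.0, 2.5),
    "snr": (40.0, float("inf")),
}

qc_log = []

def record_qc_result(test: str, value: float) -> None:
    """Record a QC measurement and flag it if it falls outside the action limits."""
    lo, hi = ACTION_LIMITS[test]
    outside = not (lo <= value <= hi)
    qc_log.append({"date": date.today().isoformat(), "test": test,
                   "value": value, "outside_action_limits": outside})
    if outside:
        print(f"Report to service lead: {test} = {value} outside ({lo}, {hi})")

record_qc_result("mean_glandular_dose_mGy", 2.8)   # flagged for reporting
record_qc_result("snr", 52.0)                      # within limits
```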

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Adolescence, as a developmental period favourable to the establishment of the first romantic relationships, is conducive to the construction of attitudes about intimacy, but also to the first manifestations of power and control in these relationships (Leitão et al., 2013). Assessing knowledge about the phenomenon is the first step towards implementing programmes aimed at building skills that promote happy and healthy intimate relationships. Objectives: To present the evaluation of the psychometric properties of the Questionnaire of Knowledge about Violence in Intimate Relationships (QC-VRI), which measures the knowledge adolescents hold about this phenomenon. Methodology: Validation study of a measurement instrument, carried out in two phases: the development of a 47-item questionnaire on the causes, consequences and frequency of dating violence, and the evaluation of its psychometric properties. For the psychometric evaluation, the questionnaire was administered to a sample of 465 Portuguese adolescents and young adults attending secondary and higher education, of whom 81.5% were female and 18.5% male, with a mean age of 17.91 years. Results: The sampling adequacy test exceeded 0.6 (KMO = 0.655), supporting the adequacy of the correlation matrix, and Bartlett's test of sphericity was significant at the 5% level (p < 0.001). The QC-VRI shows good reliability indices (> 0.70) and a factor structure consistent with the principles that guided its development, comprising 21 items organised into a five-factor solution that explains 47.33% of the variance. Conclusions: The questionnaire can be used both as a screening measure of knowledge and as a measure of the impact of awareness-raising or training interventions on violence in intimate relationships among adolescents and young adults.
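
A sketch of the kind of psychometric checks reported above (KMO, Bartlett's test of sphericity, a five-factor exploratory solution, and internal consistency), assuming item responses in a pandas DataFrame and the third-party factor_analyzer package; the file name and settings are illustrative, not those of the QC-VRI study.

```python
# Sketch of the psychometric checks reported above (KMO, Bartlett's test,
# exploratory factor analysis, Cronbach's alpha). Assumes item responses are
# columns of a pandas DataFrame; file name and settings are illustrative only.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

items = pd.read_csv("qc_vri_items.csv")      # hypothetical item-response matrix

# Sampling adequacy and sphericity (the abstract reports KMO = 0.655, p < 0.001)
chi2, p_value = calculate_bartlett_sphericity(items)
_, kmo_model = calculate_kmo(items)
print(f"Bartlett chi2={chi2:.1f}, p={p_value:.4f}, KMO={kmo_model:.3f}")

# Exploratory factor analysis with a 5-factor solution, as in the final scale
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(items)
cumulative_variance = fa.get_factor_variance()[2][-1]
print(f"Cumulative variance explained: {cumulative_variance:.2%}")

# Cronbach's alpha for internal consistency (> 0.70 reported above)
def cronbach_alpha(df: pd.DataFrame) -> float:
    item_vars = df.var(axis=0, ddof=1)
    total_var = df.sum(axis=1).var(ddof=1)
    k = df.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

print(f"alpha = {cronbach_alpha(items):.2f}")
```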

Relevance:

10.00%

Publisher:

Abstract:

With the construction of operational oceanography systems, the need for real-time data has become increasingly important. A great deal of work has been done in the past, within National Oceanographic Data Centres (NODCs) and the International Oceanographic Data and Information Exchange (IODE), to standardise delayed-mode quality control procedures. For quality control procedures applicable in real time (within hours to at most a week of acquisition), which must therefore run automatically, some recommendations were established for physical parameters, but mainly within individual projects and without consolidation with other initiatives. During the past ten years the EuroGOOS community has been working on such procedures within international programmes such as Argo, OceanSITES and GOSUD, and within EC projects such as Mersea, MFSTEP, FerryBox, ECOOP, and MyOcean. In collaboration with the FP7 SeaDataNet project, which is standardising the delayed-mode quality control procedures used in NODCs, and the MyOcean GMES FP7 project, which is standardising near-real-time quality control procedures for operational oceanography purposes, the DATA-MEQ working group decided to put together this document to summarise the recommendations for near-real-time QC procedures that it judged mature enough to be advertised and recommended to EuroGOOS.
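
To make the idea of automatic near-real-time QC concrete, the sketch below implements two typical checks (a global range test and a spike test) that assign quality flags; the thresholds and flag values are illustrative assumptions, not quoted from the recommendations this document summarises.

```python
# Illustrative sketch of two automated checks of the kind used in near-real-time
# QC of ocean temperature data: a global range test and a spike test.
# Thresholds and flag values are assumptions for illustration only.
import numpy as np

GOOD, PROBABLY_BAD, BAD = 1, 3, 4   # hypothetical QC flag scheme

def global_range_test(temp_c: np.ndarray, lo: float = -2.5, hi: float = 40.0) -> np.ndarray:
    """Flag sea temperatures outside a plausible global range."""
    flags = np.full(temp_c.shape, GOOD, dtype=int)
    flags[(temp_c < lo) | (temp_c > hi)] = BAD
    return flags

def spike_test(temp_c: np.ndarray, threshold: float = 10.0) -> np.ndarray:
    """Flag points that differ sharply from the mean of their two neighbours."""
    flags = np.full(temp_c.shape, GOOD, dtype=int)
    for i in range(1, len(temp_c) - 1):
        spike = abs(temp_c[i] - 0.5 * (temp_c[i - 1] + temp_c[i + 1]))
        if spike > threshold:
            flags[i] = PROBABLY_BAD
    return flags

profile = np.array([12.1, 12.0, 11.9, 25.0, 11.8, 11.7, 55.0])
flags = np.maximum(global_range_test(profile), spike_test(profile))
print(flags)   # the spike, the neighbour of the bad value, and the out-of-range value are flagged
```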

Relevance:

10.00%

Publisher:

Abstract:

Objective: To highlight the ethical conflicts that may arise between legal and ethical discourses, by examining the content of Royal Decree-Law 16/2012, which modifies the health law in Spain, and the nursing codes of ethics. Method: Review and critical discourse analysis of five nursing codes of ethics (Barcelona, Catalonia, Spain, Europe and International) and of the health legislation in force in Spain in 2013, in which linguistic structures referring to five ethical principles and values from the theoretical framework of the ethics of care were identified and compared: equity, human rights, the right to health, accessibility and continuity of care. Results: Whereas the ethical discourse defines the nursing role in terms of equity, the recognition of human rights, the right to health, and the accessibility and continuity of care for the person, the legal discourse is structured around the concept of the beneficiary or insured person. Conclusions: The divergence between the ethical and legal discourses may produce ethical conflicts that negatively affect nursing practice. The application of RDL 16/2012 promotes a framework of action that prevents nurses from providing care to uninsured groups, which violates human rights and the principles of the ethics of care.

Relevance:

10.00%

Publisher:

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were drawn by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance to implementation may be experienced if the system is implemented; this resistance needs to be investigated further, and more R&D work is needed in order to advance to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify it and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research: Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems with the implementation of web-based project management systems in the dredging industry. Keen's (1981) incremental changes and facilitative approach tactics were used as a basis for classifying solutions and for overcoming resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance:

10.00%

Publisher:

Abstract:

Quantum mechanics, optics and indeed any wave theory exhibit the phenomenon of interference. In this thesis we present two problems investigating interference due to indistinguishable alternatives, and a mostly unrelated investigation into the free-space propagation speed of light pulses in particular spatial modes. In chapter 1 we introduce the basic properties of the electromagnetic field needed for the subsequent chapters. In chapter 2 we review the properties of interference using the beam splitter and the Mach-Zehnder interferometer. In particular, we review what happens when one of the paths of the interferometer is marked in some way, so that a particle having traversed it carries information about which path it went down (followed up in chapter 3), and we review Hong-Ou-Mandel interference at a beam splitter (followed up in chapter 5). In chapter 3 we present the first of the interference problems. This consists of a nested Mach-Zehnder interferometer in which each of the free-space propagation segments is weakly marked by mirrors vibrating at different frequencies [1]. The original experiment drew the conclusion that the photons followed disconnected paths. We partition the description of the light in the interferometer according to the number of paths about which it contains which-way information, and reinterpret the results reported in [1] in terms of the interference of paths spatially connected from source to detector. In chapter 4 we briefly review optical angular momentum, entanglement and spontaneous parametric down-conversion. These concepts feed into chapter 5, in which we present the second of the interference problems, namely Hong-Ou-Mandel interference with particles possessing two degrees of freedom. We analyse the problem in terms of exchange symmetry for both boson and fermion pairs and show that the particle statistics at a beam splitter can be controlled for suitably chosen states. We propose an experimental test of these ideas using orbital-angular-momentum-entangled photons. In chapter 6 we look at the effect that the transverse spatial structure of the mode in which a pulse of light is excited has on its group velocity. We show that the resulting group velocity is slower than the plane-wave speed of light in vacuum, and that this reduction in the group velocity is related to the spread in wave vectors required to create the transverse spatial structure. We present experimental results of the measurement of this slowing using Hong-Ou-Mandel interference.
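
For readers unfamiliar with the Hong-Ou-Mandel effect reviewed in chapter 2 and built on in chapter 5, the standard textbook beam-splitter calculation (a generic result, not taken from the thesis itself) is:

```latex
% Standard Hong-Ou-Mandel calculation at a 50:50 beam splitter (textbook result).
% Input modes a, b; output modes c, d.
\[
\hat a^{\dagger} \;\to\; \tfrac{1}{\sqrt 2}\bigl(\hat c^{\dagger} + \hat d^{\dagger}\bigr),
\qquad
\hat b^{\dagger} \;\to\; \tfrac{1}{\sqrt 2}\bigl(\hat c^{\dagger} - \hat d^{\dagger}\bigr),
\]
\[
\hat a^{\dagger}\hat b^{\dagger}\lvert 0\rangle
\;\to\;
\tfrac{1}{2}\bigl(\hat c^{\dagger\,2} - \hat d^{\dagger\,2}\bigr)\lvert 0\rangle
= \tfrac{1}{\sqrt 2}\bigl(\lvert 2,0\rangle - \lvert 0,2\rangle\bigr).
\]
```

The cross term $\hat c^{\dagger}\hat d^{\dagger}$ cancels for indistinguishable photons, so coincidence counts are suppressed and both photons leave through the same output port; when the photons carry distinguishing information in a second degree of freedom this cancellation can be partial or, as chapter 5 discusses, deliberately controlled.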

Relevance:

10.00%

Publisher:

Abstract:

Little is known about historic wood as it ages naturally. Instead, most studies focus on biological decay, as it is often assumed that wood remains otherwise stable with age. This PhD project was organised by Historic Scotland and the University of Glasgow to investigate the natural chemical and physical aging of wood. The natural aging of wood was a concern for Historic Scotland because traditional timber replacement is the standard form of repair used in wooden cultural heritage: rotten timber is replaced with new timber of the same species. The project was set up to look at what differences could exist, both chemically and physically, between old and new wood, which could put unforeseen stress on the joint between them. Through Historic Scotland it was possible to work with genuine historic wood of two species, oak and Scots pine, both from the 1500s, rather than relying on artificial aging. Artificial aging of wood is still a debated topic, with ongoing discussion of whether it truly mimics the aging process or simply damages the wood cells. The chemical stability of the wood was investigated using Fourier-transform infrared (FTIR) microscopy, as well as wet chemistry methods including a test for soluble sugars from the possible breakdown of the wood polymers. The physical assessment included tensile testing to uncover possible differences in mechanical properties, and an environmental chamber was used to test the reaction to moisture of wood of different ages, as moisture is the most damaging aspect of the environment for wooden cultural objects. The project uncovered several differences, both physical and chemical, between the modern and historic wood which could affect the success of traditional ‘like for like’ repairs. Both oak and pine lost acetyl groups, over historic time, from their hemicellulose polymers. This chemical reaction releases acetic acid, which had no effect on the historic oak but was associated with reduced stiffness in historic pine, probably due to degradation of the hemicellulose polymers by acid hydrolysis. The stiffness of historic oak and pine was also reduced by decay. Visible pest decay led to loss of wood density, but there was evidence that fungal decay, extending beyond what was visible, degraded the S2 layer of the pine cell walls, reducing the stiffness of the wood by depleting the cellulose microfibrils most aligned with the grain. Fungal decay of polysaccharides in pine wood left behind sugars that attracted increased levels of moisture. The degradation of essential polymers in the wood structure with age had different impacts on the two species of wood, and raised questions concerning both the mechanism of aging of wood and the way traditional repairs are implemented, especially in Scots pine. These repairs need to be done with more care and precision, especially in choosing new timber to match the old. Within this project a quantitative method of measuring the microfibril angle (MFA) of wood using polarised FTIR microscopy was developed, allowing the MFA of both new and historic pine to be measured. This provides some of the information needed for a more specific match when selecting replacement timbers for historic buildings.

Relevance:

10.00%

Publisher:

Abstract:

Generalised refraction is a topic which has, thus far, garnered far less attention than it deserves. The purpose of this thesis is to highlight the potential that generalised refraction has to offer with regard to imaging and its application to designing new passive optical devices. Specifically, in this thesis we will explore two types of generalised refraction which take place across a planar interface: refraction by generalised confocal lenslet arrays (gCLAs), and refraction by ray-rotation sheets. We will show that the corresponding laws of refraction for these interfaces produce, in general, light-ray fields with non-zero curl, and as such do not have a corresponding outgoing waveform. We will then show that gCLAs perform integral, geometrical imaging, and that this enables them to be considered as approximate realisations of metric-tensor interfaces. The concept of piecewise transformation optics will be introduced and we will show that it is possible to use gCLAs along with other optical elements such as lenses to design simple piecewise transformation-optics devices such as invisibility cloaks and insulation windows. Finally, we shall show that ray-rotation sheets can be interpreted as performing geometrical imaging into complex space, and that as a consequence ray-rotation sheets and gCLAs may in fact be more closely related than first realised. We conclude with a summary of potential future projects which lead naturally from the results of this thesis.

Relevance:

10.00%

Publisher:

Abstract:

Hypertension is a major risk factor for cardiovascular disease and mortality, and a growing global public health concern, with up to one-third of the world’s population affected. Despite the vast amount of evidence for the benefits of blood pressure (BP) lowering accumulated to date, elevated BP is still the leading risk factor for disease and disability worldwide. It is well established that hypertension and BP are common complex traits, in which multiple genetic and environmental factors contribute to BP variation. Furthermore, family and twin studies confirm the genetic component of BP, with heritability estimates in the range of 30-50%. Contemporary genomic tools, which enable the genotyping of millions of genetic variants across the human genome in an efficient, reliable, and cost-effective manner, have transformed hypertension genetics research. This has been accompanied by international consortia offering unprecedentedly large sample sizes for genome-wide association studies (GWASs). While GWASs for hypertension and BP have identified more than 60 loci, variants in these loci have modest effects on BP and in aggregate explain less than 3% of the variance in BP. The aim of this thesis is to study the genetic and environmental factors that influence BP and hypertension traits in the Scottish population through several genetic epidemiological analyses. The first part of the thesis examines the burden of hypertension in the Scottish population and assesses the familial aggregation and heritability of BP and hypertension traits; the second part validates the association of common SNPs reported in large GWASs and estimates the variance explained by these variants. Comprehensive genetic epidemiology analyses were performed on Generation Scotland: Scottish Family Health Study (GS:SFHS), one of the largest population-based family-design studies. The availability of clinical data, biological samples, self-reported information, and medical records for study participants allowed several assessments of the factors that influence BP variation in the Scottish population. Of the 20,753 subjects genotyped in the study, a total of 18,470 individuals (grouped into 7,025 extended families) passed the stringent quality control (QC) criteria and were available for all subsequent analyses. Based on the sources of BP-lowering treatment exposure, subjects were further classified into two groups: first, subjects with both a self-reported medications (SRMs) history and electronic prescription records (EPRs; n = 12,347); second, all subjects with at least one medication history source (n = 18,470). In the first group, the analysis showed good concordance between SRMs and EPRs (kappa = 71%), indicating that SRMs can be used as a surrogate to assess exposure to BP-lowering medication in GS:SFHS participants. Although both sources suffer from some limitations, SRMs can be considered the best available source for estimating drug exposure history in those without EPRs. The prevalence of hypertension was 40.8%, with a higher prevalence in men (46.3%) than in women (35.8%). The prevalence of awareness, treatment and controlled hypertension, as defined by the study, was 25.3%, 31.2%, and 54.3%, respectively.
These figures are lower than those reported in similar studies of other populations, with the exception of the prevalence of controlled hypertension, which compares favourably with other populations. The odds of hypertension were higher in men, in obese or overweight individuals, in people with a parental history of hypertension, and in those living in the most deprived areas of Scotland. On the other hand, deprivation was associated with higher odds of treatment, awareness and controlled hypertension, suggesting that people living in the most deprived areas may have been receiving better quality of care, or may have higher comorbidity levels requiring greater engagement with doctors. These findings highlight the need for further work to improve hypertension management in Scotland. The family design of GS:SFHS allowed family-based analyses of the familial aggregation and heritability of BP and hypertension traits. The familial correlation of BP traits ranged from 0.07 to 0.20 for parent-offspring pairs and from 0.18 to 0.34 for sibling pairs, with higher correlations observed among first-degree relatives than among other types of relative pairs. A variance-component model adjusted for sex, body mass index (BMI), age, and age-squared was used to estimate the heritability of BP traits, which ranged from 24% to 32%, with pulse pressure (PP) having the lowest estimate. Genetic correlations between BP traits were high among systolic BP (SBP), diastolic BP (DBP) and mean arterial pressure (MAP) (81% to 94%), but lower with PP (22% to 78%). The sibling recurrence risk ratios (λS) for hypertension and treatment were 1.60 and 2.04, respectively. These findings confirm the genetic component of BP traits in GS:SFHS and justify further work to investigate the genetic determinants of BP. Genetic variants reported in recent large GWASs of BP traits were selected for genotyping in GS:SFHS using a custom-designed TaqMan® OpenArray®. The genotyping plate included 44 single nucleotide polymorphisms (SNPs) previously reported to be associated with BP or hypertension at genome-wide significance. A linear mixed model adjusted for age, age-squared, sex, and BMI was used to test for association between the genetic variants and BP traits. Of the 43 variants that passed QC, 11 showed a statistically significant association with at least one BP trait. The phenotypic variance explained by these variants was 1.4%, 1.5%, 1.6%, and 0.8% for SBP, DBP, MAP, and PP, respectively. A genetic risk score (GRS) constructed from the selected variants was positively associated with BP level and hypertension prevalence, with an average increase of one mmHg for each 0.80-unit increase in the GRS across the different BP traits. The impact of BP-lowering medication on genetic association studies of BP traits is well established, the typical practice being to add a fixed value (15/10 mmHg) to the measured BP of treated individuals. Using the subset of participants with both treatment exposure sources (SRMs and EPRs), the influence of using either source to trigger the addition of these fixed values on SNP association signals was analysed. BP phenotypes derived from EPRs were considered the true phenotypes, and those derived from SRMs were considered less accurate, with some phenotypic noise.
Comparing SNP association signals for the four BP traits between the two models derived from these different adjustments showed that MAP was the least affected by the phenotypic noise: the same significant SNPs were identified in both models for MAP, whereas the other BP traits showed some discrepancy between the two sources.
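
As an illustration of the genetic risk score mentioned above, a weighted allele-dosage score can be computed as in the following sketch; the SNP identifiers, effect sizes and genotypes are hypothetical, not those used in GS:SFHS.

```python
# Minimal sketch of a weighted genetic risk score (GRS): a sum of risk-allele
# dosages weighted by published per-allele effect sizes. SNP IDs, weights and
# genotypes below are hypothetical, not those used in the study.
import numpy as np
import pandas as pd

# Per-allele effect sizes (mmHg per risk allele) from a reference GWAS -- illustrative
effect_sizes = pd.Series({"rs0000001": 0.40, "rs0000002": 0.55, "rs0000003": 0.30})

# Genotypes coded as risk-allele dosage 0/1/2, one row per individual
genotypes = pd.DataFrame(
    np.random.default_rng(0).integers(0, 3, size=(5, 3)),
    columns=effect_sizes.index,
)

# Weighted GRS: dot product of dosages with effect sizes
grs = genotypes.dot(effect_sizes)

# With roughly one mmHg of BP per 0.80 GRS units (as reported above), the
# predicted BP shift relative to the sample mean would be:
predicted_shift_mmHg = (grs - grs.mean()) / 0.80
print(pd.DataFrame({"GRS": grs, "predicted_BP_shift_mmHg": predicted_shift_mmHg}))
```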

Relevance:

10.00%

Publisher:

Abstract:

Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability compared to Alpgen, the suitability of MadGraph for the generation of $t\bar{t}b\bar{b}$ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process $t\bar{t}H,\ H \to b\bar{b}$. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with a top-quark pair and decaying into a $b\bar{b}$ pair, using 20.3 fb$^{-1}$ of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN's Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of $4.2^{+4.1}_{-2.0}$ times the Standard Model expectation; the corresponding observed limit is found to be 5.9, which is within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass $m_{H^\pm}$ in the range $200 \le m_{H^\pm}/\mathrm{GeV} \le 600$, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model process $gb \to tH^\pm \to t\bar{t}b$, with one top quark decaying leptonically and the other decaying hadronically, is presented, using the same 20.3 fb$^{-1}$ 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the $H^\pm$ boson are computed for six mass points, and these are found to be compatible within experimental uncertainty with those obtained by the corresponding published ATLAS analysis.

Relevance:

10.00%

Publisher:

Abstract:

Droplet microfluidics is an active multidisciplinary area of research that evolved out of the larger field of microfluidics. It enables the user to handle, process and manipulate micrometer-sized emulsion droplets on a microfabricated platform. The capability to carry out a large number of individual experiments per unit time makes droplet microfluidic technology an ideal high-throughput platform for the analysis of biological and biochemical samples. The objective of this thesis was to use such a technology to design systems with novel applications in the newly emerging field of synthetic biology. Chapter 4, the first results chapter, introduces a novel method of droplet coalescence using a flow-focusing capillary device. In Chapter 5, the development of a microfluidic platform for the fabrication of a cell-free micro-environment for site-specific gene manipulation and protein expression is described. Furthermore, a novel fluorescent reporter system which functions both in vivo and in vitro is introduced in this chapter. Chapter 6 covers the microfluidic fabrication of polymeric vesicles from a poly(2-methyloxazoline-b-dimethylsiloxane-b-2-methyloxazoline) triblock copolymer. The polymersomes made from this polymer were used in the next chapter for the study of a chimeric membrane protein called mRFP1-EstA∗. In Chapter 7, the application of microfluidics to the fabrication of synthetic biological membranes, recreating artificial cell-like chassis structures for the reconstitution of a membrane-anchored protein, is described.

Relevance:

10.00%

Publisher:

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more complex from the chemical perspective, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide two-dimensional projection (shadow) images of the 3D structure, leaving the three-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool for exploring the 3D nano-world. ET provides (sub-)nanometer resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality-assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram, which motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, the sparsity is applied to overlapping image patches, favouring local structures; furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulation and real ET experiments on several morphologies were performed with a variety of setups. The reconstruction results validate its efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images used strictly follow the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that can be introduced by specific sparsifying transforms (e.g. the staircase artifacts that may result when using Total Variation minimisation).
Moreover, this thesis shows that reliable, elementally sensitive tomography using electron energy loss spectroscopy (EELS) is possible through the combination of appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss EELS from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
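
The alternating "learn a dictionary, sparsely code patches, then restore data consistency" procedure described above can be sketched generically as follows, using scikit-learn and scikit-image with a parallel-beam Radon model; this is an illustrative compressed-sensing loop under those assumptions, not the authors' DLET implementation.

```python
# Rough sketch of an alternating dictionary-learning reconstruction loop of the
# kind described above: a patch-based sparse-coding (denoising) step followed by
# a data-consistency step against the measured tilt series. Generic illustration
# built on scikit-learn and scikit-image, not the DLET code itself.
import numpy as np
from skimage.transform import radon, iradon
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

def reconstruct(sinogram, angles, n_iter=5, patch_size=(8, 8), n_atoms=64):
    # Initial estimate from filtered back-projection of the undersampled tilt series
    recon = iradon(sinogram, theta=angles)
    for _ in range(n_iter):
        # Sparse-coding step: learn a dictionary on overlapping patches and
        # re-express each patch sparsely, suppressing noise and streak artifacts.
        patches = extract_patches_2d(recon, patch_size)
        shape = patches.shape
        flat = patches.reshape(shape[0], -1)
        mean = flat.mean(axis=1, keepdims=True)
        dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0, batch_size=256)
        codes = dico.fit_transform(flat - mean)
        denoised = (codes @ dico.components_ + mean).reshape(shape)
        recon = reconstruct_from_patches_2d(denoised, recon.shape)
        # Data-consistency step: push the estimate back towards the measured
        # projections with a simple SIRT-like correction.
        residual = sinogram - radon(recon, theta=angles)
        recon = recon + 0.1 * iradon(residual, theta=angles, filter_name=None)
    return recon
```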

Relevance:

10.00%

Publisher:

Abstract:

This thesis describes the application of multispectral imaging to several novel oximetry applications. Chapter 1 motivates optical microvascular oximetry, outlines oxygen transport in the body, describes the theory of oximetry, and describes the challenges associated with in vivo oximetry, in particular imaging through tissue. Chapter 2 reviews various imaging techniques for quantitative in vivo oximetry of the microvasculature, including multispectral and hyperspectral imaging, photoacoustic imaging, optical coherence tomography, and laser speckle techniques. Chapter 3 describes a two-wavelength oximetry study of two microvascular beds in the anterior segment of the eye: the bulbar conjunctival and episcleral microvasculature. This study reveals previously unseen oxygen diffusion from ambient air into the bulbar conjunctival microvasculature, altering the oxygen saturation of the bulbar conjunctiva. The response of the bulbar conjunctival and episcleral microvascular beds to acute mild hypoxia is quantified, and the rate at which oxygen diffuses into bulbar conjunctival vessels is measured. Chapter 4 describes the development and application of a highly novel non-invasive retinal angiography technique: Oximetric Ratio Contrast Angiography (ORCA). ORCA requires only multispectral imaging and a small perturbation of blood oxygen saturation to produce angiographic sequences. A pilot study of ORCA in human subjects was conducted, demonstrating that ORCA can produce angiographic sequences with features such as sequential vessel filling and laminar flow. The applications and challenges of ORCA are discussed, with emphasis on comparison with other angiography techniques, such as fluorescein angiography. Chapter 5 describes the development of a multispectral microscope for oximetry in the spinal cord dorsal vein of rats. Measurements of blood oxygen saturation are made in the dorsal vein of healthy rats and of rats with the experimental autoimmune encephalomyelitis (EAE) disease model of multiple sclerosis. The venous blood oxygen saturation of EAE disease-model rats was found to be significantly lower than that of healthy controls, indicating increased oxygen uptake from blood in the EAE disease model of multiple sclerosis. Chapter 6 describes the development of video-rate red eye oximetry, a technique which could enable stand-off oximetry of the blood supply of the eye with high temporal resolution. The various challenges associated with video-rate red eye oximetry are investigated and their influence quantified. The eventual aim of this research is to track circulating deoxygenation perturbations as they arrive in both eyes, which could provide a screening method for carotid artery stenosis, a major risk factor for stroke. However, due to time constraints, it was not possible to investigate thoroughly whether video-rate red eye oximetry can detect such perturbations. Directions and recommendations for future research are outlined.
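
As background to the two-wavelength oximetry used in chapter 3, the standard Beer-Lambert, isosbestic-referenced relation (a textbook sketch, not quoted from the thesis) is:

```latex
% Standard two-wavelength (isosbestic-referenced) oximetry relation;
% a textbook Beer-Lambert sketch, not taken from the thesis summarised above.
\[
\mathrm{OD}(\lambda) \;=\; \bigl[\,S\,\varepsilon_{\mathrm{HbO_2}}(\lambda)
  + (1-S)\,\varepsilon_{\mathrm{Hb}}(\lambda)\bigr]\,c\,d ,
\qquad
R \;=\; \frac{\mathrm{OD}(\lambda_1)}{\mathrm{OD}(\lambda_2)} .
\]
% If lambda_2 is isosbestic, the path length d and concentration c cancel in R, giving
\[
S \;=\; \frac{R\,\varepsilon_{\mathrm{Hb}}(\lambda_2) - \varepsilon_{\mathrm{Hb}}(\lambda_1)}
            {\varepsilon_{\mathrm{HbO_2}}(\lambda_1) - \varepsilon_{\mathrm{Hb}}(\lambda_1)} .
\]
```

Here $S$ is the oxygen saturation, $\varepsilon$ the molar extinction coefficients of oxy- and deoxyhaemoglobin, and $R$ the optical density ratio, which is why $R$ acts as a (calibrated) measure of saturation in two-wavelength retinal and conjunctival oximetry.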