943 results for Instrumentation: spectrographs


Relevance: 10.00%

Abstract:

An attempt to delineate, rather than precisely define, what we mean by "ecophysiology" is based on a brief historical overview of what eventually led to the development of instrumentation and sampling strategies for analyses that allow description of physiological performance in the field. These techniques are surveyed. Ecophysiology was originally autecology, dedicated to the behaviour of individual plants, species or higher taxa, viz. "physiotypes", in particular habitats. Examples of ecophysiological diversity are developed which illustrate a gradual merging with more integrative considerations of the functions and dynamics of habitats or ecosystems, i.e. a trend of research towards physiological synecology. The latter is exemplified by studies comparing a variety of morphotypes and physiotypes within a given habitat or ecosystem and across a range of habitats or ecosystems. The high demands, complexity and excitement of ecology and ecophysiology arise from the quest to cover all conditions of the existence of organisms, according to Ernst Haeckel's original definition of "ecology".

Relevance: 10.00%

Abstract:

Electrokinetics has emerged as a promising technique for in situ soil remediation, and is especially valuable for its ability to work in low-permeability soil. In electrokinetic remediation, non-polar contaminants such as most organic compounds are transported primarily by electroosmosis, so the process is effective only if the contaminants are soluble in the pore fluid. Enhancement is therefore needed to improve the mobility of these hydrophobic compounds, which tend to adsorb strongly to the soil. Meanwhile, as a novel and rapidly growing field, the application of ultrasound in environmental technology holds a promising future. Compared with conventional methods, ultrasonication offers several benefits, such as environmental friendliness (no toxic chemicals are used or produced), low cost and compact instrumentation; it can also be applied on site. Ultrasonic energy applied to contaminated soils can increase the desorption and mobilization of contaminants, and the porosity and permeability of the soil, through the development of cavitation. This research investigated the coupling effect of combining these two techniques, electrokinetics and ultrasonication, for removing persistent organic pollutants (POPs) from contaminated low-permeability clayey soil (with kaolin as a model medium). A preliminary study checked the feasibility of ultrasonic treatment of kaolin highly contaminated with POPs. Laboratory experiments were conducted under various conditions (moisture, frequency, power, duration, initial concentration) to examine the effects of these parameters on the treatment process. The results showed that ultrasonication has the potential to remove POPs, although removal efficiencies were not high for short treatment durations. The study also suggested intermittent ultrasonication over a longer period as an effective means of increasing removal efficiency.
Experiments were then conducted to compare the performance of the electrokinetic process alone with electrokinetic processes combined with surfactant addition and, mainly, with ultrasonication, both in purpose-designed cylinders (with filter cloth separating the central part from the electrolyte compartments) and in open pans. Combined electrokinetic and ultrasonic treatment showed a positive coupling effect compared with either process alone, although the level of enhancement was not very significant. Ultrasound assistance in electrokinetic remediation can help remove POPs from clayey soil by improving the mobility of hydrophobic organic compounds and degrading these contaminants through pyrolysis and oxidation. Ultrasonication also sustains a higher current and increases electroosmotic flow. The initial contaminant concentration is an essential input parameter that can affect removal effectiveness.

Relevance: 10.00%

Abstract:

Measuring protein biomarkers from a sample matrix such as plasma is one of the basic tasks in clinical diagnostics. Bioanalytical assays used for these measurements should be able to measure proteins with high sensitivity and specificity; multiplexing capability would also be advantageous. To ensure the utility of a diagnostic test in the point-of-care setting, additional requirements such as short turn-around times, ease-of-use and low costs need to be met. On the other hand, enhanced assay sensitivity could enable the exploitation of novel biomarkers that are present at very low concentrations and that current immunoassays are unable to measure. Highly sensitive assays could also enable the use of minimally invasive sampling. In the development of high-sensitivity assays, the label technology and affinity binders play a pivotal role. Additionally, innovative assay designs contribute to the sensitivity and other characteristics of the assay, as well as its applicability. The aim of this thesis was to study the impact of assay components on the performance of both homogeneous and heterogeneous assays. The applicability of two different lanthanide-based label technologies, upconverting nanoparticles and switchable lanthanide luminescence, to protein detection was explored. Moreover, the potential of recombinant antibodies and aptamers as alternative affinity binders was evaluated, and alternative conjugation chemistries for the production of labeled binders were studied. Different assay concepts were also evaluated with respect to their applicability to point-of-care testing, which requires simple yet sensitive methods. The applicability of upconverting nanoparticles to the simultaneous quantitative measurement of multiple analytes using imaging-based detection was demonstrated.
Additionally, the required instrumentation was relatively simple and inexpensive compared with that for other luminescent lanthanide-based labels, which require time-resolved measurement. The homogeneous assays developed here, exploiting switchable lanthanide luminescence, were rapid and simple to perform and thus applicable even to point-of-care testing. The sensitivities of the homogeneous assays were in the picomolar range, which is still inadequate for some analytes, such as cardiac troponins, that require ultra-low limits of detection; for most analytes, however, the obtained limits of detection were sufficient. The use of recombinant antibody fragments and aptamers as binders allowed site-specific, controlled covalent conjugation, so that labeled binders could be constructed reproducibly either by chemical modification or by recombinant technology. Luminescent lanthanide labels were shown to be widely applicable to protein detection in various assay setups and to contribute to assay sensitivity.

Relevance: 10.00%

Abstract:

In the present paper we discuss the development of "wave-front", an instrument for determining the lower- and higher-order optical aberrations of the human eye. We also discuss the advantages that such instrumentation and techniques may bring to the ophthalmology professional of the 21st century. By shining a small light spot on the retina of subjects and observing the light reflected back from within the eye, we are able to quantitatively determine the amount of lower-order aberrations (astigmatism, myopia, hyperopia) and higher-order aberrations (coma, spherical aberration, etc.). We measured artificial eyes with calibrated ametropia ranging from +5 to -5 D, with and without 2 D of astigmatism with axis at 45° and 90°. We used a device known as the Hartmann-Shack (HS) sensor, originally developed for measuring the optical aberrations of optical instruments and general refracting surfaces in astronomical telescopes. The HS sensor sends information to computer software that decomposes the wave-front aberrations into a set of Zernike polynomials. These polynomials have special mathematical properties and are more suitable in this case than the traditional Seidel polynomials. We have demonstrated that this technique is more precise than conventional autorefraction, with a root mean square error (RMSE) of less than 0.1 µm for a 4-mm diameter pupil. In terms of dioptric power, this represents an RMSE of less than 0.04 D, and 5° for the axis. This precision is sufficient for customized corneal ablations, among other applications.
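The decomposition step described above can be sketched as a least-squares fit of sampled wavefront data to a handful of low-order Zernike terms. This is a minimal illustration only: the basis terms, sampling and coefficients below are hypothetical stand-ins, not the paper's actual software.

```python
import numpy as np

# Cartesian forms of a few low-order Zernike-like terms (unnormalized):
def zernike_basis(x, y):
    return np.column_stack([
        np.ones_like(x),        # piston
        x,                      # tip
        y,                      # tilt
        2 * (x**2 + y**2) - 1,  # defocus
        x**2 - y**2,            # 0/90-degree astigmatism
    ])

def fit_zernike(x, y, wavefront):
    """Least-squares fit of sampled wavefront values to the basis above."""
    A = zernike_basis(x, y)
    coeffs, *_ = np.linalg.lstsq(A, wavefront, rcond=None)
    return coeffs

# Synthetic wavefront over a unit pupil: pure defocus plus a little astigmatism
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 400)
y = rng.uniform(-1, 1, 400)
w = 0.8 * (2 * (x**2 + y**2) - 1) + 0.1 * (x**2 - y**2)
c = fit_zernike(x, y, w)   # recovers ~[0, 0, 0, 0.8, 0.1]
```

In a real Hartmann-Shack instrument the fit is done on the measured spot-displacement slopes rather than on wavefront values directly, but the least-squares structure is the same.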

Relevance: 10.00%

Abstract:

Reducing energy consumption, and studying it, is a subject of growing interest. Measuring generated heat is one way to measure the transfer of energy. Temperature measurement is common, although it is often more relevant to determine where and how thermal energy has been transferred. For this reason, heat flux sensors are needed that respond directly to heat flux, i.e. the transfer of thermal energy. In this study, the measurement electronics for a heat flux sensor are designed and implemented for a demanding operating environment. The voltage signal produced by the gradient heat flux sensor used in this work is on the order of microvolts, and ambient noise can be considerably larger. The sensor signal must therefore be amplified so that it can be measured reliably. The study focuses on amplifier design, but the whole system must be taken into account in the design: the electrical properties of the sensor and the environment impose constraints on the amplifier. The goal is to determine how a microvolt-level voltage signal can be measured over as wide a frequency band as possible in a demanding operating environment. The result of the work is a measurement device that can be used to measure heat flux in a demanding environment. The designed device achieved the specified gain, passband and offset-voltage drift, although the offset voltage and noise were slightly larger than designed. Using the device and the heat flux sensor, changes in heat flux were clearly detected with artificial stimuli.

Relevance: 10.00%

Abstract:

The main objective of the present study was to upgrade a clinical gamma camera to obtain high-resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional emission distribution from a set of planar projections, based on the maximum-likelihood algorithm. We present details of the hardware and software implementation. We imaged phantoms and the hearts and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and the pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and a spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity for studying the targets of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good-quality reconstructions, a result that implies a significant reduction in acquisition time and opens the possibility of dynamic radiotracer studies. In conclusion, a high-resolution single photon emission computed tomography (SPECT) system was developed from a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as cardiology, neurology and oncology.
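The maximum-likelihood reconstruction mentioned above is commonly implemented as the MLEM multiplicative update; a toy one-dimensional sketch is shown below. The system matrix and emission distribution are synthetic stand-ins, not the study's pinhole geometry.

```python
import numpy as np

def mlem(A, y, n_iter=2000):
    """Maximum-likelihood EM reconstruction for emission tomography.

    A : (n_rays, n_voxels) system matrix; y : measured projection counts.
    Each iteration forward-projects the estimate, compares it with the
    data, and back-projects the ratio as a multiplicative correction.
    """
    x = np.ones(A.shape[1])           # flat, strictly positive start
    sens = A.T @ np.ones(A.shape[0])  # per-voxel sensitivity
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)
        x *= (A.T @ ratio) / sens
    return x

# Toy "scan": 40 rays through an 8-voxel emission distribution
rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, size=(40, 8))
x_true = np.array([0.0, 1.0, 4.0, 8.0, 4.0, 1.0, 0.0, 0.0])
y = A @ x_true                        # noise-free projections
x_hat = mlem(A, y)                    # converges toward x_true
```

The multiplicative form keeps the estimate nonnegative, which is one reason MLEM is a standard choice for emission data.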

Relevance: 10.00%

Abstract:

Milk and egg matrices were assayed for aflatoxin M1 (AFM1) and B1 (AFB1), respectively, by AOAC official and modified methods, with detection and quantification by thin-layer chromatography (TLC) and high-performance thin-layer chromatography (HPTLC). The modified methods, Blanc followed by Romer, proved the most appropriate for AFM1 analysis in milk. Both methods reduced emulsion formation, produced cleaner extracts with no streaking spots, and improved precision and accuracy, especially when quantification was performed by HPTLC. The use of a ternary mixture in the Blanc method was advantageous, as the solvent could extract AFM1 directly in the first stage (extraction), leaving other compounds in the binary-mixture layer and avoiding emulsion formation, thus reducing toxin loss. The relative standard deviation (RSD%) values were low, 16 and 7% for TLC and HPTLC, with mean recoveries of 94 and 97%, respectively. As far as the egg matrix and final extract are concerned, both methods evaluated for AFB1 need further study. Although that matrix leads to emulsions with consequent loss of toxin, the modified Romer method produced a reasonably clean extract (mean recoveries of 92 and 96% for TLC and HPTLC, respectively). Most of the methods studied did not perform as expected, mainly because of the matrices' high content of triglycerides (rich in saturated fatty acids), cholesterol, carotene and proteins. Although most current methodology for AFM1 is based on HPLC, TLC determination (modified Blanc and Romer) of AFM1 and AFB1 is particularly recommended for those inexperienced in food and feed mycotoxin analysis, and especially for those who cannot afford sophisticated (HPLC, HPTLC) instrumentation.
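Figures of merit like the recovery and RSD% quoted above are computed from spiked replicates in the usual way; a small sketch follows, using hypothetical replicate values rather than the study's data.

```python
import statistics

def recovery_and_rsd(measured, spiked_level):
    """Mean recovery (%) and relative standard deviation (%) of replicates."""
    recoveries = [100 * m / spiked_level for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd

# Hypothetical replicate results (ng/mL) for a 0.50 ng/mL AFM1 spike
measured = [0.47, 0.49, 0.46, 0.50, 0.48]
mean_rec, rsd = recovery_and_rsd(measured, 0.50)
print(f"recovery {mean_rec:.0f}%, RSD {rsd:.1f}%")  # prints: recovery 96%, RSD 3.3%
```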

Relevance: 10.00%

Abstract:

In recent years, technological advancements in microelectronics and sensor technologies have revolutionized the field of electrical engineering. New manufacturing techniques have enabled a higher level of integration, combining sensors and electronics into compact and inexpensive systems. Previously, the challenge in measurement was to understand the operation of the electronics and sensors, but this has now changed: the challenge in measurement instrumentation lies in mastering the whole system, not just the electronics. To address this issue, this doctoral dissertation studies whether it is beneficial to consider a measurement system as a whole, from the physical phenomenon to the digital recording device, where each piece of the measurement system affects the system performance, rather than as a collection of small independent parts, such as a sensor or an amplifier, that could be designed separately. The objective of this doctoral dissertation is to describe in depth the development of such a measurement system, taking into account the challenges posed by the electrical and mechanical requirements and by the measurement environment. The work is carried out as an empirical case study of two example applications, both intended for scientific studies: a light-sensitive biological sensor used in imaging, and a gas electron multiplier detector for particle physics. The study showed that in these two cases a number of different parts of the measurement system interacted with each other. Without considering these interactions, the reliability of the measurement may be compromised, which may lead to wrong conclusions about the measurement. For this reason it is beneficial to conceptualize the measurement system as a whole, from the physical phenomenon to the digital recording device. The results serve as examples of how a measurement system can be successfully constructed to support a study of sensors and electronics.

Relevance: 10.00%

Abstract:

Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained from time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC applications has been scarce, due to limitations set by the instrumentation required for their detection and by the shortcomings, e.g. low brightness, of these labels. Along with advances in the research of lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays. The aim of this thesis was to explore ways of utilizing time-resolved or anti-Stokes detection in POC applications. The long-lived fluorescence needed for time-resolved measurement can be produced with lanthanide chelates, but the ultraviolet (UV) excitation these chelates require is cumbersome to produce with POC-compatible fluorescence readers. In this thesis the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending into the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation, thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were also studied as an internal light source in glucose-sensing dry-chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored.
The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to the current standard method of measuring reflectance. Microsized, visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefiting from the non-linear relationship between the excitation power and emission intensity of the UCPs, and enabled amplification of the signal response from the indicator dye.
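The amplification effect described in the last sentences can be illustrated numerically, assuming a quadratic power dependence for the upconversion emission (a typical two-photon exponent, n = 2) and an indicator dye that attenuates both the excitation and the emission path by the same transmittance T; both assumptions are illustrative, not the thesis's measured values.

```python
def detected_signal(T, n=2.0):
    excitation = T               # dye filters the incoming excitation once
    emission = excitation ** n   # non-linear upconversion response
    return emission * T          # dye filters the outgoing emission too

# Halving the dye transmittance changes the signal by T**(n+1) = 8-fold,
# a steeper response than the 2-fold change of a linear readout:
ratio = detected_signal(1.0) / detected_signal(0.5)   # → 8.0
```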

Relevance: 10.00%

Abstract:

The objective of this work was to monitor the operational conditions of the transport of chilled and frozen foods during delivery within cities, and to evaluate the impact of door openings on the internal temperature of the refrigerated environment. Several temperature and pressure sensors were installed in the refrigeration system unit and on the internal and external surfaces of a refrigerated container with two compartments. The monitoring tests showed that door openings during deliveries caused a disturbance that raised the internal temperature of the refrigerated container above the values recommended for adequate conservation of the products transported. Moreover, increasing the number of door openings had a cumulative effect on the internal temperature, mainly in the chilled-food compartment of the container. It was concluded that the refrigeration system unit presented serious limitations with regard to maintaining the container's internal temperature during the actual distribution routine, since it did not possess enough instantaneous capacity to restore the temperature set-point between deliveries.

Relevance: 10.00%

Abstract:

Increased awareness and evolving consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants, such as residues of veterinary antibiotics or mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin and a widely recognized problem for crops and animal feeds globally; and 3) skatole (3-methylindole), one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against these low-molecular-weight food contaminants, and the consecutive development of immunoassays for the detection of the respective compounds.

Relevance: 10.00%

Abstract:

This case study examines the impact of a computer information system as it was being implemented in one Ontario hospital. The attitudes of a cross-section of the hospital staff acted as a barometer of their perceptions of the implementation process. With The Mississauga Hospital in the early stages of an extensive computer implementation project, the opportunity existed to identify staff attitudes toward the computer system and their overall knowledge of it, and to compare the findings with the literature. The goal of the study was to build a broader knowledge base about the affective domain in the relationship between people and the computer system. Eight exploratory questions shaped the focus of the investigation. Data were collected from three sources: a survey questionnaire, focused interviews, and internal hospital documents; both quantitative and qualitative data were analyzed. Instrumentation in the study consisted of a survey distributed at two points in time to randomly selected hospital employees representing all staff levels. Other sources of data included hospital documents and twenty-five focused interviews with staff who replied to both surveys. Leavitt's socio-technical system, with its four subsystems (task, structure, technology, and people), was used to classify staff responses to the research questions. The study findings revealed that the majority of respondents felt positive about using the computer as part of their jobs. No apparent correlations were found between sex, age, or staff group and feelings about using the computer. Differences in attitudes, and changes in attitude, appeared to be related to time. Another difference was found between staff group and the perception of being involved in the decision-making process. These findings, and other evidence about the role of change agents in this change process, help to emphasize that planning change is one thing; managing the transition is another.

Relevance: 10.00%

Abstract:

Digital reading occupies an ever larger place in students' overall reading. Although the first digital reading systems, commonly called electronic books, date back several years, opinions about their potential still diverge. A variety of digital academic content is now available to students, bringing with it a multiplication of uses and a variety of reading modes. Digital reading systems are now an integral part of the electronic environment to which students have access and deserve to be studied in greater depth. Many experiments on electronic books have been carried out in public and academic libraries. Research has been conducted on their usability and on reader satisfaction with the aim of improving their design. However, very few studies have examined the actual reading practices of academics (notably students) and their perceptions of these new reading systems. Our research addresses these aspects by studying two digital reading systems: a Tablet PC (a mobile device) and a Web-book system, NetLibrary (a reading interface integrated into a Web browser). Our research examines students' reading practices on these digital reading systems. It is guided by three research questions concerning (1) the reading strategies employed by students (before, during and after reading), (2) the elements of the reading system that influence (positively or negatively) the reading process, and (3) students' perceptions of electronic book technology and its contribution to their academic work.
To conduct this research, a mixed methodological approach was adopted, using three modes of data collection: a questionnaire, semi-structured interviews with students who had used one of the systems studied, and the collection of the reading traces left by students in the systems after use. The respondents (n=46) were students at the Université de Montréal from three departments (Bibliothéconomie & sciences de l'information, Communication, and Linguistique & traduction). Nearly half of them (n=21) were interviewed. In parallel, the reading traces left in the reading systems by the students (annotations, highlights, etc.) were collected and analyzed. The interview data and the answers to the open questions of the questionnaire were subjected to content analysis, while statistical treatment was applied to the data from the closed questions of the questionnaire and from the reading traces. The results show that, in general, the reading objective, the novelty of the content, the student's reading habits and the capabilities of the reading system are the elements that guide the choice and application of reading strategies. Aids and obstacles to reading were identified for each of the reading systems studied. The aids consist of the presence of certain elements of the paper-book metaphor in the digital reading system (the notion of a delimited page, pagination, etc.), the dictionary integrated into the system, and the fact that the systems studied facilitate skimming. As for the obstacles, the instrumentation of reading made it difficult for the reader to appropriate the text. Moreover, digital (that is, on-screen) reading led to a lack of concentration and to visual fatigue, particularly with NetLibrary.
The Tablet PC, like NetLibrary, was perceived as easy to use but not always comfortable, the discomfort being more pronounced with NetLibrary. Students consider both reading systems practical tools for academic work, but for different reasons specific to each system. Respondents' overall evaluation of their digital reading experience proved, on the whole, positive for the Tablet PC and rather mixed for NetLibrary. This research contributes to knowledge about (1) digital reading, notably by student academic readers, and (2) the impact of a reading system on reading efficiency, on readers, on the achievement of the reading objective, and on the reading strategies used. In addition to the limitations of the study, directions for future research are presented.

Relevance: 10.00%

Abstract:

This thesis concerns the improvement of high-contrast imaging techniques enabling the direct detection of companions at small separations from their host star. More specifically, it is part of the development of the Gemini Planet Imager (GPI), a second-generation instrument for the Gemini telescopes. This camera will use an integral field spectrometer (IFS) to characterize detected companions and to reduce the speckle noise limiting their detection, and will correct atmospheric turbulence to a level never before reached by using two deformable mirrors in its adaptive optics (AO) system: the woofer and the tweeter. The woofer will correct low-spatial-frequency, large-amplitude aberrations, while the tweeter will compensate for higher-frequency aberrations of smaller amplitude. First, the performance achievable with the IFSs currently operating on 8-10 m telescopes is investigated by observing the companion of the star GQ Lup with the NIFS IFS and the ALTAIR AO system installed on the Gemini North telescope. The angular differential imaging (ADI) technique is used to attenuate the speckle noise by a factor of 2 to 6. The JHK-band spectra obtained were used to constrain the companion's mass, by comparison with the predictions of atmospheric and evolutionary models, to 8-60 MJup, where MJup is the mass of Jupiter. It is thus determined that the companion is more probably a brown dwarf than a planet. Since the IFSs currently in operation are general-purpose cameras usable in many areas of astrophysics, their design was not optimized for high-contrast imaging. The second stage of this thesis therefore consisted in designing and laboratory-testing a prototype IFS optimized for this task.
Four speckle-suppression algorithms were tested on the data obtained: simple difference, double difference, spectral deconvolution, and a new algorithm developed in this thesis, named the twin-spectra algorithm. We find that the twin-spectra algorithm performs best for both types of companions tested, methanated and non-methanated. The signal-to-noise ratio of the detection was improved by a factor of up to 14 for a methanated companion and by a factor of 2 for a non-methanated companion. Finally, we address certain problems related to splitting the command between two deformable mirrors in the GPI AO system. We first present a method using analytical calculations and Monte Carlo simulations to determine the key parameters of the woofer, such as its diameter, its number of actuators and their stroke, which in turn affected the overall design of the instrument. Then, since the system studied uses a Fourier reconstructor, we propose to split the command between the two mirrors in Fourier space and to limit the modes transferred to the woofer to those it can accurately reproduce. In the context of GPI, this makes it possible to replace the two 1600×69 matrices required for a "classical" command split with a single 45×69 matrix, and thus to use an off-the-shelf processor rather than a more complex computing architecture.
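The Fourier-domain command split described above can be sketched as a low-pass/high-pass partition of the correction phase: low spatial frequencies go to the woofer, the remainder to the tweeter. The grid size and cutoff below are arbitrary, and a real woofer's response is of course more complicated than a sharp frequency mask; this is an illustration of the partition idea only.

```python
import numpy as np

def split_command(phase, cutoff):
    """Partition a correction phase between woofer and tweeter in Fourier
    space: spatial frequencies up to `cutoff` (cycles per aperture) go to
    the woofer, the remainder to the tweeter."""
    n, m = phase.shape
    F = np.fft.fft2(phase)
    fy, fx = np.meshgrid(np.fft.fftfreq(n) * n, np.fft.fftfreq(m) * m,
                         indexing="ij")
    low = np.hypot(fx, fy) <= cutoff     # modes the woofer can reproduce
    woofer = np.fft.ifft2(F * low).real
    tweeter = np.fft.ifft2(F * ~low).real
    return woofer, tweeter

phase = np.random.default_rng(2).normal(size=(64, 64))
woofer, tweeter = split_command(phase, cutoff=4.0)
assert np.allclose(woofer + tweeter, phase)   # the split is lossless
```

Limiting the woofer to the modes it can accurately reproduce is what lets the transfer matrix shrink, as the abstract notes for the GPI case.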

Relevance: 10.00%

Abstract:

Intervention analysis report presented to the Faculté des arts et sciences in partial fulfillment of the requirements for the degree of Maîtrise ès sciences (M.Sc.) in psychoeducation.