Abstract:
The thesis concerns X-ray spectrometry for both medical and space applications and is divided into two sections. The first section addresses an X-ray spectrometric system designed to study radiological beams and is devoted to the optimization of diagnostic procedures in medicine. A parametric semi-empirical model capable of efficiently reconstructing diagnostic X-ray spectra on mid-range computers was developed and tested. In addition, different silicon diode detectors were tested as real-time detectors in order to provide a real-time evaluation of the spectrum during diagnostic procedures. This project contributes to the field by presenting an improved simulation of a realistic X-ray beam emerging from a common X-ray tube, with a complete and detailed spectrum that lends itself to further studies of added filtration, thus providing an optimized beam for different diagnostic applications in medicine. The second section describes the preliminary tests carried out on the first version of an Application Specific Integrated Circuit (ASIC) integrated with a large-area position-sensitive Silicon Drift Detector (SDD) to be used on board future space missions. This technology has been developed for the ESA LOFT (Large Observatory for X-ray Timing) project, a medium-class space mission that the European Space Agency has been assessing since February 2011. The LOFT project was proposed as part of the Cosmic Vision Program (2015-2025).
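As an illustration of the kind of semi-empirical reconstruction described above, the following Python sketch builds a filtered tube spectrum from Kramers' bremsstrahlung law and Beer-Lambert attenuation. This is a stand-in, not the thesis' parametric model: characteristic tungsten lines are omitted, and the aluminium attenuation table holds approximate NIST values.

```python
# Minimal semi-empirical X-ray spectrum sketch: Kramers' law hardened by
# an added aluminium filter (illustrative only, not the thesis model).
import numpy as np

# Approximate NIST mass attenuation coefficients for Al: (keV, cm^2/g)
AL_MU_RHO = np.array([[10, 26.23], [15, 7.955], [20, 3.441], [30, 1.128],
                      [40, 0.5685], [50, 0.3681], [60, 0.2778],
                      [80, 0.2018], [100, 0.1704], [150, 0.1378]])
AL_DENSITY = 2.699  # g/cm^3

def kramers_spectrum(kvp, mm_al=2.5, n_bins=200):
    """Relative photon fluence vs energy for a tube at `kvp` kilovolts,
    filtered by `mm_al` millimetres of aluminium."""
    e = np.linspace(1.0, kvp, n_bins)            # photon energy grid [keV]
    fluence = np.maximum(kvp - e, 0.0) / e       # unfiltered Kramers shape
    mu = np.interp(e, AL_MU_RHO[:, 0], AL_MU_RHO[:, 1]) * AL_DENSITY  # 1/cm
    fluence *= np.exp(-mu * mm_al / 10.0)        # Beer-Lambert filtration
    return e, fluence / fluence.max()

energies, spectrum = kramers_spectrum(kvp=80.0, mm_al=2.5)
print(f"mean energy: {np.average(energies, weights=spectrum):.1f} keV")
```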
Abstract:
The membrane protein cytochrome c oxidase (CcO) is one of the most important functional biomolecules. It is found in almost every eukaryotic cell and in many bacteria. Although the species differ in the number of subunits, the functional differences are marginal. CcO is the terminal link in the electron transfer pathway of the mitochondrial respiratory chain. Electrons transferred to the catalytic center of the enzyme lead to the reduction of molecular oxygen to water. Oxygen reduction is coupled to the pumping of protons into the intermembrane space and hence generates a difference in the electrochemical potential of protons across the inner mitochondrial membrane. This potential difference drives the synthesis of adenosine triphosphate (ATP), the universal energy carrier of all biological cells.

The goal of the present work is to contribute to a better understanding of the functional mechanism of CcO by using time-resolved surface-enhanced resonance Raman spectroscopy (TR-SERRS). Despite intensive research effort over the last decades, the functional mechanism of CcO is still the subject of controversial discussion. The primary goal of this dissertation was to initiate electron transfer to the redox centers CuA, heme a, heme a3 and CuB electrochemically and to observe the corresponding redox transitions in situ, with a focus on the two heme structures, by using SERRS. A measuring cell was developed that allowed electrochemical excitation to be combined with Raman spectroscopy for these measurements. Cytochrome c was used as a benchmark system to test the new measuring cell and to prove the feasibility of the Raman measurements. In contrast to CcO, the heme protein cytochrome c contains only a single heme structure. Nevertheless, characteristic Raman bands of the hemes can be observed for both proteins.

In order to investigate CcO, it was immobilized on top of a silver substrate and embedded into an artificial membrane. The catalytic activity of CcO, and therefore the complete functional capability of the enzyme within the biomimetic membrane architecture, was verified using cyclic voltammetry. Raman spectroscopy was performed using a special nano-structured silver surface, which was developed within the scope of the present work. This new substrate combines two fundamental properties: it facilitates the formation of a protein-tethered bilayer lipid membrane (ptBLM) and it allows Raman spectra to be obtained with sufficiently high signal-to-noise ratios.

Spectro-electrochemical investigations showed that at open circuit potential the enzyme exists in a mixed-valence state, with heme a and heme a3 in the reduced and oxidized state, respectively. This was considered an intermediate state between the non-activated and the fully activated state of CcO. Time-resolved SERRS measurements revealed that a hampered electron transfer to the redox center heme a3 characterizes this intermediate state.
Abstract:
Cytochrome c oxidase (CcO), complex IV of the respiratory chain, is one of the heme-copper oxidases and plays an important role in cell metabolism. The enzyme contains four prosthetic groups and is located in the inner membrane of mitochondria and in the cell membrane of some aerobic bacteria. CcO catalyzes the electron transfer (ET) from cytochrome c to O2, with the actual reaction taking place at the binuclear center (CuB-heme a3). The reduction of O2 to two H2O consumes four protons; in addition, four protons are transported across the membrane, creating an electrochemical potential difference of these ions between the matrix and the intermembrane phase. Despite their importance, membrane proteins such as CcO remain poorly studied, which is why the mechanism of the respiratory chain has not yet been fully elucidated. The aim of this work is to contribute to the understanding of the function of CcO. To this end, CcO from Rhodobacter sphaeroides was bound in a defined orientation to a functionalized metal electrode via a His-tag attached to the C-terminus of subunit II, so that the first electron acceptor, CuA, lies closest to the metal surface. A lipid bilayer was then inserted in situ between the bound proteins, yielding the so-called protein-tethered bilayer lipid membrane (ptBLM); for this, the optimal surface concentration of bound proteins had to be determined. Electrochemical impedance spectroscopy (EIS), surface plasmon resonance spectroscopy (SPR) and cyclic voltammetry (CV) were applied to characterize the activity of CcO as a function of packing density. The main part of the work concerns the investigation of direct ET to CcO under anaerobic conditions, for which the combination of time-resolved surface-enhanced infrared absorption spectroscopy (tr-SEIRAS) and electrochemistry proved particularly suitable. In a first study, ET was investigated by fast-scan CV, with CVs of non-activated as well as activated CcO measured at different scan rates; the activated form was obtained after catalytic turnover of the protein in the presence of O2. A four-ET model was developed to analyze the CVs. The method makes it possible to distinguish between sequential and independent ET to the four centers CuA, heme a, heme a3 and CuB, and to determine the standard redox potentials and the kinetic coefficients of the ET steps. In a second study, tr-SEIRAS was applied in step-scan mode: square-wave potential pulses were applied to the CcO, and SEIRAS in ATR mode was used to record spectra at defined time slices. From these spectra, individual bands were isolated that reflect changes in the vibrational modes of amino acids and peptide groups as a function of the redox state of the centers. Based on assignments from the literature, obtained by potentiometric titration of CcO, the bands could be tentatively assigned to the redox centers. Band areas plotted against time then reflect the redox kinetics of the centers and were again evaluated with the four-ET model. The results of both studies support the conclusion that ET to CcO in a ptBLM most likely follows the sequential mechanism, which corresponds to the natural ET from cytochrome c to CcO.
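The abstract does not reproduce the four-ET model itself; the toy sketch below only illustrates the shape of a sequential electron-transfer scheme (electrode to CuA to heme a to heme a3 to CuB) with purely illustrative rate constants, where each transfer step requires a reduced donor and an oxidized acceptor.

```python
# Toy sequential electron-transfer kinetics (a sketch, not the thesis'
# four-ET model; all rate constants are illustrative).
import numpy as np
from scipy.integrate import solve_ivp

K_E = 50.0                    # electrode -> CuA injection rate [1/s]
K = [30.0, 10.0, 5.0]         # inter-center transfer rates [1/s]

def sequential_et(t, x):
    """x[i] = reduced fraction of center i."""
    cu_a, heme_a, heme_a3, cu_b = x
    j0 = K_E * (1.0 - cu_a)                 # injection from electrode
    j1 = K[0] * cu_a * (1.0 - heme_a)       # CuA -> heme a
    j2 = K[1] * heme_a * (1.0 - heme_a3)    # heme a -> heme a3
    j3 = K[2] * heme_a3 * (1.0 - cu_b)      # heme a3 -> CuB
    return [j0 - j1, j1 - j2, j2 - j3, j3]

sol = solve_ivp(sequential_et, (0.0, 1.0), [0.0] * 4)
print("reduced fractions at t = 1 s:", np.round(sol.y[:, -1], 3))
```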
An Integrated Transmission-Media Noise Calibration Software For Deep-Space Radio Science Experiments
Abstract:
The thesis describes the implementation of calibration, format-translation and data-conditioning software for radiometric tracking data of deep-space spacecraft. All propagation-media noise rejection techniques available as features in the code are covered in their mathematical formulation, performance and software implementation. Some techniques are drawn from the literature and the current state of the art, while other algorithms were conceived ex novo. All three typical deep-space refractive environments (solar plasma, ionosphere, troposphere) are handled by specific subroutines. Particular attention has been devoted to the GNSS-based tropospheric path delay calibration subroutine, since it is the largest module of the software suite in terms of both lines of code and development time. The software is currently in its final stage of development and, once completed, will serve as a pre-processing stage for orbit determination codes. Calibration of transmission-media noise sources in radiometric observables proved to be an essential operation on radiometric data in order to meet the increasingly demanding error-budget requirements of modern deep-space missions. A completely autonomous, all-around propagation-media calibration package is a novelty in orbit determination, although standalone codes are currently employed by ESA and NASA. The described software is planned to be compatible with the tropospheric noise calibration standards currently used by both agencies, such as the AMC, TSAC and ESA IFMS weather data, and it natively works with the Tracking Data Message (TDM) file format adopted by CCSDS as a standard to promote and simplify inter-agency collaboration.
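For a flavour of what a tropospheric path delay calibration computes, here is a minimal Python sketch based on the classical Saastamoinen zenith delays and a crude 1/sin(elevation) mapping. It is not the thesis' GNSS-based subroutine: operational codes use mapping functions such as Niell or Vienna and calibrate from GNSS observables rather than surface meteorology alone.

```python
# Saastamoinen zenith delays plus a flat-Earth mapping (sketch only).
import math

def zenith_hydrostatic_delay(p_hpa, lat_rad, h_km):
    """Saastamoinen zenith hydrostatic delay [m] from surface pressure."""
    return 0.0022768 * p_hpa / (1.0 - 0.00266 * math.cos(2.0 * lat_rad)
                                - 0.00028 * h_km)

def zenith_wet_delay(e_hpa, t_kelvin):
    """Saastamoinen zenith wet delay [m] from water-vapour pressure."""
    return 0.002277 * (1255.0 / t_kelvin + 0.05) * e_hpa

def slant_delay(elev_rad, p_hpa, e_hpa, t_kelvin, lat_rad, h_km):
    """Total slant tropospheric path delay [m] with a 1/sin(el) mapping."""
    zhd = zenith_hydrostatic_delay(p_hpa, lat_rad, h_km)
    zwd = zenith_wet_delay(e_hpa, t_kelvin)
    return (zhd + zwd) / math.sin(elev_rad)

# Example: a pass at 20 degrees elevation from a mid-latitude station.
d = slant_delay(math.radians(20.0), p_hpa=1013.0, e_hpa=10.0,
                t_kelvin=290.0, lat_rad=math.radians(44.0), h_km=0.1)
print(f"slant tropospheric delay: {d:.2f} m")
```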
Abstract:
The PhD research activity has taken place in the field of space debris. In detail, it focuses on the possibility of detecting space debris from a space-based platform. The research addresses both the software and the hardware of such a detection system. On the software side, a program was developed that can detect an object in space and locate it in the sky by solving the surrounding star field. On the hardware side, the possibility of adapting a ground telescope for space activity was considered, and a candidate electronic board was tested.
Abstract:
The purpose of this thesis is, on the one hand, to illustrate the peculiarities of children's literature and fantasy fiction and of their translation and, on the other hand, to propose a translation from English into Italian of some chapters of the e-book The Explorers' Gate by the American author Chris Grabenstein. The first chapters of this work offer an analysis of different critical studies on children's literature and fantasy fiction and illustrate the characteristics of these two literary expressions. I will also discuss the different approaches to their translation in order to produce a translated text that is consistent with its literary genre and with translation theories. The third chapter is about the author and includes an interview on his idea of children's literature and his opinions about translation. The second part of this thesis consists of the actual translation of the e-book. Firstly, I will analyze the source text, dividing the analysis into extra-textual and intra-textual levels and focusing on sender, addressee, time and space, function of the text, plot, structure, narrator, and the style and language used by the author. I will also highlight those elements likely to prove challenging during the translation phase. Secondly, I will explain the macro-strategy I adopted during the process of translation, which can be defined as child-oriented. In the last chapter I will highlight the passages that represented translation challenges and show how I tackled them.
Abstract:
The study of supermassive black hole (SMBH) accretion during the phase of activity (when they shine as active galactic nuclei, AGN), and of its relation to host-galaxy growth, requires large datasets of AGN, including a significant fraction of obscured sources. X-ray data are strategic for AGN selection because at X-ray energies the contamination from non-active galaxies is far less significant than in optical/infrared surveys, and the selection of obscured AGN, including a fraction of heavily obscured AGN, is much more effective. In this thesis, I present the results of the Chandra COSMOS Legacy survey, a 4.6 Ms X-ray survey covering the equatorial COSMOS area. The COSMOS Legacy depth (flux limit f = 2x10^(-16) erg s^(-1) cm^(-2) in the 0.5-2 keV band) is significantly better than that of other X-ray surveys of similar area, and paves the way for surveys with future facilities such as Athena and X-ray Surveyor. The final Chandra COSMOS Legacy catalog contains 4016 point-like sources, 97% of which have a redshift. 65% of the sources are optically obscured and potentially caught in the phase of main BH growth. We used the sample of 174 Chandra COSMOS Legacy sources at z>3 to place constraints on the BH formation scenario. We found a significant disagreement between our space density and the predictions of a physical model of AGN activation through major mergers. This suggests that in our luminosity range BH triggering through secular accretion is likely preferred to a major-merger triggering scenario. Thanks to its large statistics, the Chandra COSMOS Legacy dataset, combined with the other multiwavelength COSMOS catalogs, will be used to answer questions on a large number of astrophysical topics, with particular focus on SMBH accretion in different luminosity and redshift regimes.
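A back-of-the-envelope version of the space density measurement mentioned above can be sketched with astropy. The survey area, redshift range and source count below are illustrative placeholders, and the published analysis additionally applies 1/Vmax weighting and completeness corrections.

```python
# Crude comoving space density over a redshift shell (illustrative only).
from astropy.cosmology import Planck15

def space_density(n_src, z_lo, z_hi, area_deg2):
    """Comoving space density [Mpc^-3] of n_src objects found between
    z_lo and z_hi over a survey of area_deg2 square degrees."""
    sky_fraction = area_deg2 / 41252.96          # whole sky in deg^2
    shell_mpc3 = (Planck15.comoving_volume(z_hi)
                  - Planck15.comoving_volume(z_lo)).value * sky_fraction
    return n_src / shell_mpc3

# e.g. 174 sources at 3 < z < 6.8 over ~2.2 deg^2 (assumed values)
print(f"Phi ~ {space_density(174, 3.0, 6.8, 2.2):.2e} Mpc^-3")
```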
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is the verification of safety clearances between individual components, the so-called clearance analysis. For selected components, engineers determine whether these maintain a prescribed safety distance to the surrounding components, both in their rest position and during a motion. If components fall below the safety distance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety distance.

In this work we present a solution for the real-time computation of all regions of two geometric objects that fall below a safety distance. Each object is given as a set of primitives (e.g., triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives falling below the safety distance and call these the tolerance-violating primitives. We present a complete solution, which can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check whether two triangles are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that dedicated tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the fastest.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure composed of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is essential to account for the required safety distance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested, and we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions can compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure, which we call shrubs, for managing the cell contents of the uniform grids used above. Previous approaches to reducing the memory footprint of uniform grids mostly rely on hashing, which does not reduce the memory consumption of the cell contents themselves. In our application, neighboring cells often have similar contents. Exploiting this redundancy, our approach losslessly compresses the cell contents of a uniform grid to one fifth of their former size and decompresses them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we present applications to various path-planning problems.
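As a toy illustration of the grid-based culling described in the second part, the following Python sketch bins the tolerance-expanded bounding boxes of one object's triangles into a uniform grid and prunes candidate pairs with a conservative bounding-sphere test. The exact triangle-triangle tolerance test, the flat hierarchy and the shrub compression of the thesis are all abstracted away.

```python
# Uniform-grid candidate culling for a safety distance `tol` (sketch).
import numpy as np
from collections import defaultdict
from itertools import product

def build_grid(tris, cell, tol):
    """Map each grid cell to the triangles whose tolerance-expanded
    AABB overlaps it; tris has shape (n, 3, 3)."""
    grid = defaultdict(list)
    for i, t in enumerate(tris):
        lo = np.floor((t.min(axis=0) - tol) / cell).astype(int)
        hi = np.floor((t.max(axis=0) + tol) / cell).astype(int)
        for key in product(*(range(a, b + 1) for a, b in zip(lo, hi))):
            grid[key].append(i)
    return grid

def candidate_pairs(tris_a, tris_b, tol, cell=None):
    """Conservatively find pairs (i, j) that may violate the safety
    distance; only these pairs would go to an exact tolerance test."""
    cell = cell or 2.0 * tol
    grid_b = build_grid(tris_b, cell, tol)
    centers = tris_b.mean(axis=1)
    radii = np.linalg.norm(tris_b - centers[:, None], axis=2).max(axis=1)
    pairs = set()
    for i, t in enumerate(tris_a):
        c_a = t.mean(axis=0)
        r_a = np.linalg.norm(t - c_a, axis=1).max()
        lo = np.floor(t.min(axis=0) / cell).astype(int)
        hi = np.floor(t.max(axis=0) / cell).astype(int)
        for key in product(*(range(a, b + 1) for a, b in zip(lo, hi))):
            for j in grid_b.get(key, []):
                # bounding-sphere gap is a lower bound on the true distance
                if np.linalg.norm(c_a - centers[j]) - r_a - radii[j] <= tol:
                    pairs.add((i, j))
    return pairs

# Two single-triangle 'objects' 0.05 apart, checked against tol = 0.1:
a = np.array([[[0, 0, 0], [1, 0, 0], [0, 1, 0]]], dtype=float)
b = a + np.array([0, 0, 0.05])
print(candidate_pairs(a, b, tol=0.1))  # -> {(0, 0)}
```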
Abstract:
Neglect is defined as the failure to attend and to orient to the contralesional side of space. A horizontal bias towards the right visual field is a classical finding in patients who have suffered a right-hemispheric stroke. The vertical dimension of spatial attention orienting has so far been only sparsely investigated. The aim of this study was to investigate the specificity of this vertical bias by means of a search task, which taps a more pronounced top-down attentional component. Eye movements and behavioural search performance were measured in thirteen patients with left-sided neglect after right-hemispheric stroke and in thirteen age-matched controls. Concerning behavioural performance, patients found significantly fewer targets than healthy controls in both the upper and the lower left quadrant. However, when targets were located in the lower left quadrant, patients needed more visual fixations (and therefore longer search times) to find them, suggesting a time-dependent vertical bias.
Abstract:
Identification of the subarachnoid space has traditionally been achieved by either a blind landmark-guided approach or prepuncture ultrasound assistance. To assess the feasibility of performing spinal anaesthesia under real-time ultrasound guidance in routine clinical practice, we conducted a single-centre prospective observational study among patients undergoing lower limb orthopaedic surgery. A spinal needle was inserted, unassisted, within the ultrasound transducer imaging plane using a paramedian approach (i.e., the operator held the transducer in one hand and the spinal needle in the other). The primary outcome measure was the success rate of CSF acquisition under real-time ultrasound guidance; CSF was located in 97 of 100 consecutive patients within a median of three needle passes (IQR 1-6). CSF was not acquired in three patients. Subsequent attempts combining landmark palpation and prepuncture ultrasound scanning resulted in successful spinal anaesthesia in two of these patients, with the third patient requiring general anaesthesia. The median time from spinal needle insertion to completion of intrathecal injection was 1.2 minutes (IQR 0.83-4.1), demonstrating the feasibility of this technique in routine clinical practice.
Abstract:
Stem cell regeneration of damaged tissue has recently been reported in many different organs. Since the loss of retinal pigment epithelium (RPE) in the eye is associated with a major cause of visual loss, namely age-related macular degeneration, we investigated whether hematopoietic stem cells (HSC) given systemically can home to the damaged subretinal space and express markers of the RPE lineage. Green fluorescent protein (GFP)-expressing cells of bone marrow origin were used in a sodium iodate (NaIO3) model of RPE damage in the mouse. The optimal time for adoptive transfer of bone marrow-derived stem cells relative to the time of injury, and the optimal cell type [whole bone marrow, mobilized peripheral blood, HSC, facilitating cells (FC)], were determined by counting the number of GFP+ cells in whole-eye flat mounts. Immunocytochemistry was performed to identify the bone marrow origin of the cells in the RPE using antibodies for CD45, Sca-1 and c-kit, as well as the expression of the RPE-specific marker RPE-65. The time at which bone marrow-derived cells were adoptively transferred relative to the time of NaIO3 injection did not significantly influence the number of cells that homed to the subretinal space. At both one and two weeks after intravenous (i.v.) injection, GFP+ cells of bone marrow origin were observed in the damaged subretinal space, at sites of RPE loss, but not in the normal subretinal space. The combined transplantation of HSC and FC appeared to favor the survival of the homed stem cells at two weeks, and RPE-65 was expressed by adoptively transferred HSC by four weeks. We have shown that systemically injected HSC homed to the subretinal space in the presence of RPE damage and that FC promoted the survival of these cells. Furthermore, the RPE-specific marker RPE-65 was expressed on adoptively transferred HSC in the denuded areas.
Abstract:
Boston Harbor has had a history of poor water quality, including contamination by enteric pathogens. We conduct a statistical analysis of data collected by the Massachusetts Water Resources Authority (MWRA) between 1996 and 2002 to evaluate the effects of court-mandated improvements in sewage treatment. Motivated by the ineffectiveness of standard Poisson mixture models and their zero-inflated counterparts, we propose a new negative binomial model for time series of Enterococcus counts in Boston Harbor, where nonstationarity and autocorrelation are modeled using a nonparametric smooth function of time in the predictor. Without further restrictions, this function is not identifiable in the presence of time-dependent covariates; consequently we use a basis orthogonal to the space spanned by the covariates and use penalized quasi-likelihood (PQL) for estimation. We conclude that Enterococcus counts were greatly reduced near the Nut Island Treatment Plant (NITP) outfalls following the transfer of wastewaters from NITP to the Deer Island Treatment Plant (DITP) and that the transfer of wastewaters from Boston Harbor to the offshore diffusers in Massachusetts Bay reduced the Enterococcus counts near the DITP outfalls.
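A minimal sketch of the identifiability device described above, on simulated data: a smooth time basis is projected onto the orthogonal complement of the covariate space before fitting. A plain negative binomial GLM from statsmodels stands in for the paper's penalized quasi-likelihood estimation, and the covariate, counts and basis below are all illustrative.

```python
# Negative binomial GLM with a smooth time basis orthogonalized against
# the covariates (simulated data; PQL replaced by a plain GLM fit).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 365
time = np.linspace(0.0, 1.0, n)
rainfall = rng.gamma(2.0, 1.0, n)                  # measured covariate
X = sm.add_constant(rainfall)

# Smooth function of time: a low-order polynomial basis for illustration.
S = np.vander(time, 5, increasing=True)[:, 1:]     # t, t^2, t^3, t^4

# Project S onto the orthogonal complement of X so the smooth term cannot
# absorb variation that is explainable by the covariates.
beta, *_ = np.linalg.lstsq(X, S, rcond=None)
S_perp = S - X @ beta

lam = np.exp(0.2 + 0.3 * rainfall + np.sin(2 * np.pi * time))
counts = rng.poisson(lam)                          # stand-in count data
model = sm.GLM(counts, np.column_stack([X, S_perp]),
               family=sm.families.NegativeBinomial(alpha=1.0))
print(model.fit().params[:2])                      # intercept, rainfall effect
```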
Abstract:
In environmental epidemiology, exposure X and health outcome Y vary in space and time. We present a method to diagnose the possible influence of unmeasured confounders U on the estimated effect of X on Y, and we propose several approaches to robust estimation. The idea is to use space and time as proxy measures for the unmeasured factors U. We start with the time series case, where X and Y are continuous variables at equally spaced times and a linear model is assumed. We define matching estimators b(u) that correspond to pairs of observations with a specific lag u. Controlling for a smooth function of time, S_t, using a kernel estimator is roughly equivalent to estimating the association with a linear combination of the b(u), with weights that involve two components: the assumptions about the smoothness of S_t and the normalized variogram of the X process. When an unmeasured confounder U exists, but the model otherwise correctly controls for measured confounders, the excess variation in the b(u) is evidence of confounding by U. We use the plot of b(u) versus lag u, the lagged-estimator plot (LEP), to diagnose the influence of U on the effect of X on Y. We use appropriate linear combinations of the b(u), or extrapolate to b(0), to obtain novel estimators that are more robust to the influence of a smooth U. The methods are extended to time series log-linear models and to spatial analyses. The LEP gives a direct view of the magnitude of the estimators at each lag u and provides evidence when a model does not adequately describe the data.
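The matching estimators and the LEP diagnostic can be sketched in a few lines on simulated data; the paper's weighting and variogram-based interpretation are omitted, and the smooth confounder below is a random walk chosen so that its lagged differences grow with u.

```python
# Lagged matching estimators b(u) and the LEP diagnostic (simulated data).
import numpy as np
import matplotlib.pyplot as plt

def lagged_estimators(x, y, max_lag):
    """b(u) = sum(dy*dx) / sum(dx^2) over all pairs separated by lag u."""
    b = np.empty(max_lag)
    for u in range(1, max_lag + 1):
        dx = x[u:] - x[:-u]
        dy = y[u:] - y[:-u]
        b[u - 1] = np.sum(dy * dx) / np.sum(dx * dx)
    return b

rng = np.random.default_rng(1)
n = 1000
conf = np.cumsum(rng.normal(size=n)) / 10.0    # smooth unmeasured confounder U
x = rng.normal(size=n) + conf                  # exposure contaminated by U
y = 0.5 * x + 2.0 * conf + rng.normal(size=n)  # outcome; true effect is 0.5

b = lagged_estimators(x, y, max_lag=30)
# b(u) close to the true effect 0.5 at short lags but drifting away at
# long lags is the LEP signature of a smooth unmeasured confounder.
plt.plot(np.arange(1, 31), b, marker="o")
plt.axhline(0.5, ls="--", color="k")
plt.xlabel("lag u"); plt.ylabel("b(u)")
plt.savefig("lep.png")
```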
Abstract:
We present an overview of different methods for decomposing a multichannel spontaneous electroencephalogram (EEG) into sets of temporal patterns and topographic distributions. All of the methods presented here consider the scalp electric field as the basic analysis entity in space. In time, the resolution of the methods ranges from milliseconds (time-domain analysis) through subseconds (combined time- and frequency-domain analysis) to seconds (frequency-domain analysis). For any of these methods, we show that large parts of the data can be explained by a small number of topographic distributions. Physically, this implies that the brain regions that generated one of those topographies must have been active with a common phase: if several brain regions produce EEG signals at the same time and frequency, they have a strong tendency to do so in a synchronized mode. This view is illustrated by several examples (including combined EEG and functional magnetic resonance imaging, fMRI) and a selective review of the literature. The findings are discussed in terms of short-lasting binding between different brain regions through synchronized oscillations, which could constitute a mechanism for forming transient, functional neurocognitive networks.
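One concrete instance of such a decomposition is microstate-style clustering of the scalp maps at peaks of the global field power (GFP). The sketch below uses plain k-means on simulated data; dedicated microstate algorithms additionally ignore map polarity, which ordinary k-means does not.

```python
# Cluster GFP-peak scalp maps into a few template topographies (sketch).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n_channels, n_samples = 32, 5000
eeg = rng.normal(size=(n_channels, n_samples))    # stand-in for real EEG

gfp = eeg.std(axis=0)                             # global field power
peaks = np.flatnonzero((gfp[1:-1] > gfp[:-2]) & (gfp[1:-1] > gfp[2:])) + 1

maps = eeg[:, peaks].T                            # one scalp map per GFP peak
maps /= np.linalg.norm(maps, axis=1, keepdims=True)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(maps)
templates = kmeans.cluster_centers_
# Spatial similarity of each map to its assigned template topography.
fit = np.abs(np.sum(maps * templates[kmeans.labels_], axis=1))
print(f"{len(peaks)} GFP peaks, mean map-template similarity {fit.mean():.2f}")
```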