885 results for Early case detection


Relevance:

30.00%

Publisher:

Abstract:

Although Recovery is often described as the least studied and documented phase of the Emergency Management Cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not provide an overall perspective, because recovery has not been monitored systematically and consistently with advanced technologies such as remote sensing and GIS. Given the key role of Remote Sensing in Response and Damage Assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery advancements over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, the post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to investigating the relation between urban types, damage and recovery state, with reference to geographical and technological parameters. Damage and recovery scales were proposed to review critical observations of notable surge-induced effects on various typologies of structures, analysed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was employed to develop methodological procedures suited to recognizing and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, obtained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, proved that this methodology can help the analyst detect and monitor recovery activities in areas affected by medium damage. The study found that Mahalanobis Distance was the classifier providing the most accurate results in localizing blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%; it was also the classifier least sensitive to spectral signature alteration. The application of dissimilarity textural classification to satellite imagery demonstrated the suitability of this technique for detecting debris distribution and for monitoring demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies, combined with an advanced ground survey approach, provide extremely valuable capability for monitoring Recovery activities and may constitute a technical basis to guide aid organizations and local governments in Recovery management.
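As a purely illustrative aside, a supervised Mahalanobis Distance classifier of the kind named above can be sketched in a few lines. The band values, class labels and training spectra below are hypothetical placeholders, not the thesis's imagery or parameters; this is a minimal sketch of the technique, assuming multispectral pixel vectors with per-class training samples.

```python
import numpy as np

def train_mahalanobis(classes):
    """Estimate per-class mean vector and inverse covariance from
    training pixels. `classes` maps a label to an (n_pixels, n_bands)
    array of training spectra."""
    params = {}
    for label, pixels in classes.items():
        mean = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False)
        params[label] = (mean, np.linalg.inv(cov))
    return params

def classify(pixel, params):
    """Assign the class whose Mahalanobis distance to the pixel is smallest."""
    def dist(mean, inv_cov):
        d = pixel - mean
        return float(d @ inv_cov @ d)
    return min(params, key=lambda lbl: dist(*params[lbl]))

# Hypothetical 4-band training spectra for two classes.
rng = np.random.default_rng(0)
training = {
    "blue_tarp": rng.normal([60, 80, 160, 90], 8, size=(200, 4)),
    "other":     rng.normal([90, 95, 100, 110], 12, size=(200, 4)),
}
params = train_mahalanobis(training)
print(classify(np.array([62, 78, 150, 95]), params))  # expected: "blue_tarp"
```

In a per-pixel application the same `classify` call would simply be mapped over every pixel vector of the scene.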

Relevance:

30.00%

Publisher:

Abstract:

CHAPTER 1 INTRODUCTION. This work concerns the use, for metric purposes, of historical satellite images with panoramic geometry; in particular, it deals with satellite images acquired by the US CORONA platform, designed and operated essentially for military purposes between the 1960s and 1970s and recently declassified, which has made them accessible to non-military users and applications. The recovery of aerial and satellite images of the past is of great interest for a wide range of territorial applications, from the analysis of urban or regional development to specific local investigations of sites of archaeological, industrial or environmental interest. There is in fact an enormous body of information that could fill the gaps in cartographic documentation, which, for obvious technical and economic reasons, represents territorial evolution only asynchronously and sporadically, and with distortions and limitations in information content tied to the purposes and representation conventions of maps over time and across applications. A photographic image offers a complete, non-subjective representation of what exists and can very effectively complement cartographic data, or replace it where none exists. Most of the heritage of historical images is certainly linked to photogrammetric flights which, from the first decades of the 1900s, covered vast areas of the most advanced countries or regions of wartime interest. Alongside these, and obviously over more recent periods, are images acquired from satellite platforms, among which those taken for military espionage are of great interest, being of high geometric resolution and excellent detail. Unfortunately, this rich heritage is still largely inaccessible today, although initiatives have recently been launched to allow access for civilian purposes, also in view of the obsolescence of the data and the availability of other and better sources of information offered by modern remote sensing. The use of historical images, whether aerial or satellite, is in most cases qualitative, intended to investigate the presence or absence of objects or phenomena, and rarely takes on a metric and objective character, which would require, among other things, knowledge of technical data (for example the calibration certificate, in the case of aerial photogrammetric cameras) that have been lost or are inaccessible. It should also be remembered that the imaging systems of the time were often subject to optical distortion or other forms of image degradation that made metric use difficult. On the other hand, a metric use of these images would give the analysis of the territory and of the changes that have occurred in it an objective meaning that would be essential for several purposes: for example, to measure objects that no longer exist, or to accurately compare or co-register historical images with suitably georeferenced current ones.
The case of the CORONA images is very interesting because of a number of specific features: first of all, they combine high resolution (ground pixel size down to 1.80 m) with wide ground coverage (the frames of some missions cover strips up to 250 km long). These two characteristics derive from the acquisition principle adopted, namely the panoramic geometry, chosen precisely because it is the only one that combines the two properties above and is therefore well suited to reconnaissance purposes. Moreover, given the number and frequency of the missions within the CORONA program, the historical series of these frames allow a rich and detailed reconstruction of past territorial configurations, thanks to the greater amount of information and the impartiality associated with photographic products. It should be made clear from the outset that these images, although they represent a remarkable historical resource (they date from 1959 to 1972 and cover very wide regions of great interest for territorial analysis), have very rarely been used for metric purposes. This is probably because their metric processing is far from simple, for a whole series of reasons highlighted in the following chapters. The experimental work carried out within the thesis had two primary objectives, one general and one more specific: on the one hand, an attempt to assess in a broad sense the potential of the enormous heritage represented by these images (obtainable at low cost compared with similar products); on the other, the opportunity to investigate the local territorial situation of an area in south-eastern Turkey (around the archaeological site of Tilmen Höyük) on which a project led by the University of Bologna is active (scientific coordinator Prof. Nicolò Marchetti of the Department of Archaeology), with which DISTART has actively collaborated since 2005. The activity is conducted in cooperation with Istanbul University and the Gaziantep Archaeological Museum. This work is also part of a broader, regional-scale study of the area containing the Tilmen Höyük excavations; the availability of multi-temporal images over a wide time span, as well as of multi-sensor and multispectral data, would equip this study with tools of the highest interest for characterising the changes that have occurred. As for the more general aspect, developing a procedure for the metric processing of CORONA images may prove useful to the entire community revolving around the world of GIS and remote sensing; as mentioned above, these images (which cover an area of almost two million square kilometres) represent an immense historical photographic heritage that could (and should) be used both for archaeological purposes and as a support for the GIS-based study of territorial development dynamics in areas where earlier satellite images or cartographic data are scarce or even absent. The work is divided into six chapters, of which this is the first.
The second chapter gives a brief description of the CORONA space program (a US project for photo-reconnaissance of the territory of the former Soviet Union and of politically related Middle Eastern areas); it reports on the birth and evolution of the program, describes in some detail the optics employed and the image acquisition modes, and provides all the references (historical and otherwise) useful to anyone wishing to learn more about this extraordinary space program. The third chapter briefly discusses panoramic images in general: their acquisition modes, the geometric and perspective principles underlying the panoramic concept, and the strengths and weaknesses of this type of image. It also presents the different methods found in the literature for the correction of panoramic images, and those used by the (admittedly few) authors who chose to give CORONA images a metric meaning, that is, quantitative and not only qualitative, as was the case for a long time. The fourth chapter briefly describes the archaeological site of Tilmen Höyük: its geographical location, the chronology of the study campaigns that have concerned it, and the monuments and artefacts found in the area, which made possible a virtual reconstruction of the original appearance of the city and a deeper understanding of the situation of the Mediterranean capitals during the Middle Bronze Age. The fifth chapter is devoted to the main goal of the work, namely the generation of the orthophotomosaic of the above-mentioned area. After a theoretical introduction on the production of this kind of product (applicable procedures and transformations, pixel interpolation methods, quality of the DEM used), the results obtained are presented and discussed, highlighting the correlations between them and the problems of various kinds encountered during the thesis work. The sixth and final chapter contains the conclusions of the work presented here. Appendix A contains the tables of the control points used for the exterior orientation of the frames.

Relevance:

30.00%

Publisher:

Abstract:

Machines with moving parts give rise to vibrations and consequently noise. The set-up and status of each machine yield a distinctive vibration signature. Therefore, a change in the vibration signature, due to a change in the machine state, can be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine's signature is used to detect faults at an early stage. A large number of signal processing techniques can be used to extract useful information from a measured vibration signal. This study seeks to detect rotating machine defects using a range of techniques including synchronous time averaging, Hilbert transform-based demodulation, continuous wavelet transform, Wigner-Ville distribution and the spectral correlation density function. The detection and diagnostic capabilities of these techniques are discussed and compared on the basis of experimental results concerning gear tooth faults, i.e. a fatigue crack at the tooth root and tooth spalls of different sizes, as well as assembly faults in a diesel engine. Moreover, the sensitivity to fault severity is assessed by applying these signal processing techniques to gear tooth faults of different sizes.
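To make one of the listed techniques concrete, the sketch below illustrates Hilbert transform-based envelope demodulation on a synthetic amplitude-modulated vibration signal. The sampling rate, carrier frequency and 30 Hz fault repetition frequency are hypothetical; this is a minimal example of the general technique, not the thesis's processing chain.

```python
import numpy as np
from scipy.signal import hilbert

fs = 20_000                      # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Synthetic gear-mesh vibration: a 1 kHz carrier amplitude-modulated
# at a hypothetical 30 Hz fault repetition frequency.
fault_hz = 30.0
signal = (1 + 0.5 * np.cos(2 * np.pi * fault_hz * t)) * np.sin(2 * np.pi * 1000 * t)

# Envelope via the analytic signal; its spectrum exposes the modulation.
envelope = np.abs(hilbert(signal))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)

print(f"dominant envelope frequency: {freqs[spectrum.argmax()]:.1f} Hz")  # ~30 Hz
```

The fault frequency is invisible in the raw spectrum's sidebands to the untrained eye, but appears as the dominant peak of the envelope spectrum, which is the point of demodulation-based diagnosis.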

Relevance:

30.00%

Publisher:

Abstract:

Precipitation retrieval over high latitudes, particularly snowfall retrieval over ice and snow using satellite-based passive microwave spectrometers, is currently an unsolved problem. The challenge results from the large variability of microwave emissivity spectra for snow and ice surfaces, which can mimic, to some degree, the spectral characteristics of snowfall. This work investigates a new snowfall detection algorithm specific to high-latitude regions, based on a combination of active and passive sensors able to discriminate between snowing and non-snowing areas. The space-borne Cloud Profiling Radar (on CloudSat), the Advanced Microwave Sounding Units A and B (on NOAA-16) and the infrared spectrometer MODIS (on AQUA) were co-located for 365 days, from October 1st, 2006 to September 30th, 2007. CloudSat products were used as truth to calibrate and validate all the proposed algorithms. The methodological approach can be summarised in two steps. In the first step, an empirical search for a threshold discriminating the no-snow case was performed, following Kongoli et al. [2003]. Since this single-channel approach did not produce adequate results, a more statistically sound approach was attempted. Two different techniques, which allow the probability above and below a Brightness Temperature (BT) threshold to be computed, were applied to the available data. The first technique is based on a Logistic Distribution to represent the probability of snow given the predictors. The second technique, called the Bayesian Multivariate Binary Predictor (BMBP), is a fully Bayesian technique that requires no hypothesis on the shape of the probabilistic model (such as, for instance, the Logistic) and only requires the estimation of the BT thresholds. The results show that both proposed methods are able to discriminate snowing and non-snowing conditions over the Polar regions with a probability of correct detection larger than 0.5, highlighting the importance of a multispectral approach.
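As an illustration of the first, logistic technique, the sketch below fits a logistic model mapping brightness temperatures to a snow probability. The channel values, class means and sample counts are invented placeholders, and scikit-learn's LogisticRegression stands in for the thesis's actual fitting procedure; in practice the labels would come from co-located CloudSat products.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical brightness temperatures (K) for two microwave channels;
# CloudSat-derived snow/no-snow flags would supply the labels in practice.
bt_snow    = rng.normal([245.0, 230.0], 5.0, size=(500, 2))
bt_no_snow = rng.normal([260.0, 250.0], 5.0, size=(500, 2))
X = np.vstack([bt_snow, bt_no_snow])
y = np.r_[np.ones(500), np.zeros(500)]

model = LogisticRegression().fit(X, y)

# P(snow | BT) for a new observation.
print(model.predict_proba([[248.0, 233.0]])[0, 1])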

Relevance:

30.00%

Publisher:

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more uncertain as transistor sizes scale down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperature and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable only at the process level. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability (DFR). The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard CAD-automated design flow: i) new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on some tunable parameter, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design.
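As a schematic illustration of the kind of analysis involved, the sketch below couples a lumped-RC thermal model of a single tile with a threshold-based manager acting on the supply voltage. All constants, the power model and the traffic pattern are invented placeholders, not values from the thermal analysis library described above.

```python
# Minimal lumped-RC thermal model of one NoC tile with a threshold-based
# manager that scales supply voltage down when temperature exceeds a cap.
# All constants are illustrative placeholders, not calibrated values.

R_TH = 2.0     # K/W, junction-to-ambient thermal resistance
C_TH = 0.05    # J/K, thermal capacitance
T_AMB = 45.0   # C, ambient/package temperature
T_CAP = 85.0   # C, management threshold
DT = 1e-3      # s, simulation step

def dynamic_power(v_dd, activity):
    """P ~ a * C * V^2 * f, with all constants folded into one coefficient."""
    return 30.0 * activity * v_dd ** 2

temp, v_dd = T_AMB, 1.0
for step in range(20_000):
    activity = 0.8 if (step // 5000) % 2 == 0 else 0.2   # bursty traffic
    p = dynamic_power(v_dd, activity)
    # Explicit Euler update of C*dT/dt = P - (T - T_amb)/R
    temp += DT / C_TH * (p - (temp - T_AMB) / R_TH)
    # Threshold-based management acting on a tunable parameter (Vdd).
    v_dd = 0.9 if temp > T_CAP else 1.0

print(f"final temperature: {temp:.1f} C, Vdd: {v_dd} V")
```

Even this toy model reproduces the qualitative behaviour the abstract describes: during high-activity bursts the manager toggles Vdd to hold the tile near the temperature cap, trading some performance for fewer and shallower thermal cycles.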
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we found a superior robustness of low-swing links to systematic process variation, with a good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
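Similarly, a statistical performance analysis can be illustrated by a minimal Monte Carlo sketch that splits delay variation into a die-wide systematic component shared by all gates and independent per-gate random components. The gate count, nominal delay and sigma values are assumptions for the example, not the tool's models.

```python
import numpy as np

rng = np.random.default_rng(42)
N_GATES, N_MC = 12, 100_000
T_NOM = 20.0  # ps, nominal gate delay (placeholder)

# Systematic component: one die-wide shift shared by all gates;
# random component: independent per-gate fluctuations.
systematic = rng.normal(0.0, 0.05, size=(N_MC, 1))        # 5 % sigma
random_var = rng.normal(0.0, 0.03, size=(N_MC, N_GATES))  # 3 % sigma

# Critical-path delay of a chain of N_GATES gates per Monte Carlo sample.
path_delay = (T_NOM * (1 + systematic + random_var)).sum(axis=1)

mean, sigma = path_delay.mean(), path_delay.std()
print(f"mean {mean:.1f} ps, sigma {sigma:.2f} ps, "
      f"3-sigma worst case {mean + 3 * sigma:.1f} ps")
```

The systematic term shifts the whole path and so dominates the spread, while the random terms partially average out along the chain; distinguishing the two is exactly what makes compensation knobs like ASV and ABB effective against the systematic part.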

Relevance:

30.00%

Publisher:

Abstract:

The AMANDA-II detector is primarily designed for the direction-resolved detection of high-energy neutrinos. Nevertheless, low-energy neutrino bursts, such as those expected from supernovae, can also be detected with high significance, provided they occur within the Milky Way. The experimental signature in the detector is a collective increase in the noise rates of all optical modules. To estimate the strength of the expected signal, theoretical models and simulations of supernovae and experimental data from supernova SN1987A were studied. In addition, the sensitivities of the optical modules were re-determined. For this purpose, the energy losses of charged particles in South Polar ice had to be investigated and a simulation of photon propagation developed. Finally, the signal measured in the Kamiokande-II detector could be scaled to the conditions of the AMANDA-II detector. As part of this work, an algorithm for the real-time search for supernova signals was implemented as a submodule of the data acquisition. It includes various improvements over the version previously used by the AMANDA collaboration. Thanks to an optimization for computational speed, several real-time searches with different analysis time bases can now run simultaneously within the data acquisition. The disqualification of optical modules with unsuitable behaviour takes place in real time; however, the behaviour of the modules must be assessed for this purpose on the basis of buffered data, so the analysis of the data of the qualified modules cannot proceed without a delay of about 5 minutes. If a supernova is detected, the data are archived in 10-millisecond intervals for several minutes for later evaluation. Since the noise-rate data of the optical modules are otherwise available in 500 ms intervals, the time base of the analysis can be freely chosen in units of 500 ms. Within this work, three analyses of this kind were activated at the South Pole: one with the data-acquisition time base of 500 ms, one with a time base of 4 s and one with a time base of 10 s. This maximizes the sensitivity to signals with a characteristic exponential decay time of 3 s while maintaining good sensitivity over a wide range of exponential decay times. These analyses were examined in detail using data from the years 2000 to 2003. While the analysis with t = 500 ms produced results that could not be fully explained, the results of the two analyses with the longer time bases could be reproduced by simulations and are correspondingly well understood. The expected supernova signals were simulated on the basis of the measured data. From a comparison between this simulation, the measured data from 2000 to 2003 and the simulation of the expected statistical background, it can be concluded at a confidence level of at least 90% that no more than 3.2 supernovae per year occur in the Milky Way. To identify a supernova, a rate increase with a significance of at least 7.4 standard deviations is required. The number of events expected from the statistical background at this level is less than one in a million; nevertheless, one such event was measured.
With the chosen significance threshold, 74% of all possible supernova progenitor stars in the Galaxy are monitored. In combination with the last result published by the AMANDA collaboration, this even yields an upper limit of only 2.6 supernovae per year. In the real-time analysis, a significance of at least 5.5 standard deviations is required for the collective rate increase before an alert about the detection of a supernova candidate is sent. This raises the monitored fraction of stars in the Galaxy to 81%, but the rate of false alarms also rises, to about 2 events per week. The alert messages are transmitted to the northern hemisphere via an Iridium modem and are expected to contribute soon to SNEWS, the worldwide network for the early detection of supernovae.
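For illustration, the collective rate-increase test described above can be sketched as a sliding-window significance computation over binned noise rates. The background mean, spread, burst amplitude and bin counts below are invented placeholders, not AMANDA-II values; this is a minimal sketch of the time-base idea, assuming Gaussian background counts in 500 ms bins.

```python
import numpy as np

def excess_significance(counts, window, mu, sigma):
    """Significance (in standard deviations) of the collective rate
    increase in each sliding window of `window` bins, given the expected
    per-bin background mean `mu` and standard deviation `sigma`."""
    summed = np.convolve(counts, np.ones(window), mode="valid")
    return (summed - window * mu) / (sigma * np.sqrt(window))

# Hypothetical summed module rates in 500 ms bins, with an injected
# exponentially decaying burst (decay time 3 s, i.e. 6 bins).
rng = np.random.default_rng(7)
mu, sigma = 10_000.0, 100.0
counts = rng.normal(mu, sigma, size=2_000)
counts[1_000:1_040] += 400.0 * np.exp(-np.arange(40) / 6.0)

for window in (1, 8, 20):          # 0.5 s, 4 s and 10 s time bases
    z = excess_significance(counts, window, mu, sigma)
    print(f"{window * 0.5:5.1f} s base: max significance {z.max():.1f} sigma")
```

Running the sketch shows why several time bases are kept in parallel: the 4 s window, closest to the 3 s decay time of the injected burst, yields the largest significance.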

Relevance:

30.00%

Publisher:

Abstract:

In this study, the conditions of deposition and the stratigraphical architecture of Neogene (Tortonian, 11-6.7 Ma) sediments of southern central Crete were analysed. In order to improve the resolution of paleoclimatic data, new methods were applied to quantify environmental parameters and to increase the chronostratigraphic resolution in shallow-water sediments. A relationship between the paleoenvironmental change observed on Crete and global processes was established, and a depositional model was developed. Based on a detailed analysis of the distribution of non-geniculate coralline red algae, index values for water temperature and water depth were established and tested against the distribution patterns of benthic foraminifera and symbiont-bearing corals. Calcite-shelled bivalves were sampled from the Algarve coast (southern Portugal) and central Crete, and their 87Sr/86Sr ratios were measured. A high-resolution chronostratigraphy was developed based on the correlation between fluctuations in Sr ratios in the measured sections and a late Miocene global seawater Sr isotope reference curve. Applying this method, a time frame was established to compare paleoenvironmental data from southern central Crete with global information on climate change reflected in oxygen isotope data. The comparison between paleotemperature data based on red algae and global oxygen isotope data showed that the employed index values reflect global changes in temperature. The data indicate a warm interval during the earliest Tortonian, a second short warm interval between 10 and 9.5 Ma, a longer climatic optimum between 9 and 8 Ma, and an interval of increasing temperatures in the latest Tortonian. The distribution of coral reefs and carpets shows that during the warm intervals the depositional environment became tropical, while temperate climates prevailed during the cold interval. Since relative tectonic movements after the initial half-graben formation in the early Tortonian were small in southern central Crete, the sedimentary successions strongly respond to global sea-level fluctuations. A characteristic sedimentary succession formed during a 3rd-order sea-level cycle: it comprises mixed siliciclastics and limestone deposited during sea-level fall and lowstand, homogeneous red algal deposits formed during sea-level rise, and coral carpets formed during late rise and highstand. Individual beds in the succession reflect glacioeustatic fluctuations that are most prominent in the mixed siliciclastic-limestone interval. These results confirm that sedimentary successions deposited at the critical threshold between temperate and tropical environments develop characteristic changes in depositional systems and biotic associations that can be used to assemble paleoclimatic datasets.

Relevance:

30.00%

Publisher:

Abstract:

Nanotechnology entails the manufacturing and manipulation of matter at length scales ranging from single atoms to micron-sized objects. The ability to address properties on the biologically relevant nanometer scale has made nanotechnology attractive for Nanomedicine. This is perceived as a great opportunity in healthcare, especially in diagnostics and therapeutics and, more generally, for developing personalized medicine. Nanomedicine has the potential to enable early detection and prevention, and to improve diagnosis, mass screening, treatment and follow-up of many diseases. From the biological standpoint, nanomaterials match the typical size of naturally occurring functional units or components of living organisms and, for this reason, enable more effective interaction with biological systems. Nanomaterials have the potential to influence functionality and cell fate in the regeneration of organs and tissues. To this aim, nanotechnology provides an arsenal of techniques for intervening in, fabricating, and modulating the environment where cells live and function. Unconventional micro- and nano-fabrication techniques allow the patterning of biomolecules and biocompatible materials down to feature sizes of a few nanometers. Patterning is not simply the deterministic placement of a material; in a broader sense, it allows the controlled fabrication of structures and gradients of different natures. Gradients are emerging as one of the key factors guiding cell adhesion, proliferation, migration and even differentiation in the case of stem cells. The main goal of this thesis has been to devise a nanotechnology-based strategy and tools to spatially and temporally control biologically relevant in-vitro phenomena that are important in some fields of medical research.

Relevance:

30.00%

Publisher:

Abstract:

The success of schizophrenia treatment depends for the most part on the patient's response to antipsychotic medication. Which drug and which dose are effective in an individual patient can currently only be judged after several weeks of treatment. One reason for variable therapy response is variable plasma concentrations of the antipsychotics. The aim of this work was to investigate to what extent the therapeutic outcome can be predicted at an early stage of treatment by objective symptom assessment, and which factors influence the high variability of antipsychotic levels in blood. An 18-month naturalistic clinical study in schizophrenic patients was conducted to answer the following questions: can therapy response be predicted, and which instruments are suitable for this? Psychopathology was assessed weekly using two rating scales (Brief Psychiatric Rating Scale, BPRS, and Clinical Global Impressions, CGI) to evaluate the improvement of the disease symptoms over 8 weeks. Alongside therapy, the serum concentrations of the antipsychotics were measured. Objective symptom assessment with the BPRS or CGI proved suitable for predicting therapy response. Relative to the beginning of treatment, the reduction of symptoms was highly predictive of later therapy failure or response. A reduction of more than 36.5% on the BPRS scale in week 2 was identified as a significant threshold for non-response. Patients whose symptom improvement lay below this threshold had an 11.2-fold higher probability of not responding to their medication at the end of the study than patients who improved by at least 36.5%. Other factors, such as age, sex, duration of illness or number of hospitalizations, had no influence on the prediction of therapy response. Therapeutic antipsychotic levels had a positive influence on the response rate: in patients with therapeutic levels, response was faster and the response rate higher than in those whose levels lay outside the usual therapeutic ranges. An important prerequisite for the use of TDM (therapeutic drug monitoring) is the availability of a precise, reproducible, time- and cost-saving analytical method for the quantitative determination of the substances under investigation. Such a method was developed and validated for the determination of haloperidol; an HPLC method with column switching proved suitable for TDM. Based on the results of the clinical study on response prediction, it was investigated which factors influence the variability of the pharmacokinetics of antipsychotics, since pharmacokinetic variability is one reason for absent or insufficient response. The influence of the galenic formulation on drug release, and the influence of inflammatory processes on the metabolism of an antipsychotic, were examined by retrospective analysis of patient data. The analysis of 247 serum levels of patients treated with paliperidone in the OROS® formulation, a recently introduced extended-release form, showed that the intraindividual variability (Vk) of the paliperidone trough levels was 35%, comparable to the 32% found for non-extended-release risperidone (not significant).
The extended-release formulation therefore had no variance-reducing effect on the trough levels of the antipsychotic. The drug concentration range was 21-55 ng/ml, likewise nearly matching the therapeutic range of risperidone (20-60 ng/ml). Inflammatory processes can alter the metabolism of drugs; this had previously been demonstrated for drugs metabolized via CYP1A2. The analysis of 84 patient serum levels showed that the metabolism of quetiapine was impaired during an inflammatory process, probably through inhibition of CYP3A4. This suggests that the pharmacokinetics of drugs metabolized via CYP3A4 can also be affected during an inflammatory process in the body. For this reason, special attention should be paid to side effects during an infection under quetiapine therapy, and the serum level should be monitored during this time in order to protect the patient from possible adverse effects or even intoxication. The findings of this work show that, in the treatment of schizophrenic patients with antipsychotics, the measurement of psychopathology is suitable for predicting therapy response, and the measurement of blood levels for identifying the factors underlying pharmacokinetic variability. Objective symptom assessment and therapeutic drug monitoring are therefore instruments that should be used to guide antipsychotic pharmacotherapy.
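As an aside, the week-2 prediction rule reported above can be written down directly. This is a minimal sketch assuming the percentage reduction is computed from raw BPRS totals; the abstract does not state whether the denominator is corrected for the BPRS minimum score of 18, and the example scores are hypothetical.

```python
def predicted_nonresponse(bprs_baseline, bprs_week2, threshold=36.5):
    """Flag predicted non-response when the week-2 BPRS reduction
    falls below the study's 36.5 % threshold (raw-score convention assumed)."""
    reduction = 100.0 * (bprs_baseline - bprs_week2) / bprs_baseline
    return reduction < threshold, reduction

flag, reduction = predicted_nonresponse(bprs_baseline=60, bprs_week2=45)
print(f"week-2 reduction {reduction:.1f} % -> predicted non-responder: {flag}")
# 25.0 % < 36.5 % -> True
```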

Relevance:

30.00%

Publisher:

Abstract:

This thesis is about plant breeding in early 20th-century Italy. The stories of the two most prominent Italian plant breeders of the time, Nazareno Strampelli and Francesco Todaro, are used to explore a fragment of the often-neglected history of Italian agricultural research. While Italy was not at the forefront of agricultural innovation, research programs aimed at varietal innovation did emerge in the country, along with an early diffusion of Mendelism. Using philosophical as well as historical analysis, plant breeding is analysed throughout this thesis as a process: a sequence of steps that rests on practical skills and theoretical assumptions, acting on various elements of production. Systematic plant-breeding programs in Italy started from small individual efforts, attracting more and more resources until they became a crucial part of the fascist regime's infamous agricultural policy. Hybrid varieties developed in the early 20th century survived World War II and are ancestors of varieties still cultivated today. Despite this relevance, the history of Italian wheat hybrids is today largely forgotten: this thesis is an effort to re-evaluate a part of it. The research allowed previously unknown or neglected facts to emerge, giving a new perspective on the infamous alliance between plant-breeding programs and the fascist regime. Although a complete history of Italian plant breeding remains to be written, the Italian case can now be considered alongside the case studies that other scholars have developed in the history of plant breeding. The hope is that this historical and philosophical analysis will contribute to the ongoing effort to understand the history of plants.

Relevance:

30.00%

Publisher:

Abstract:

Complement deficiencies are associated with an increased susceptibility to infection by certain pathogens in the first years of life (MBL deficiency) and beyond (C1q and other complement deficiencies). This underlines the role of the complement system as an effective defence mechanism in the transition phase between the loss of maternal "nest protection" and the maturation of the individual's own acquired immunity. The occurrence of autoimmune diseases, such as the SLE-like syndrome seen in deficiencies of the classical pathway, highlights additional functions of the complement system during the maturation of acquired immunity and as an essential effector in the recognition of apoptotic cells and their elimination from the system. Hereditary C1q deficiencies are associated with a high probability of an SLE-like syndrome. Among the deficiencies of the complement system they are a rarity, but their clinical picture is all the more striking. They must be distinguished from functional C1q deficiency arising from increased turnover and subsequent formation of C1q autoantibodies. They are caused by a mutation in one of the three C1q genes, which are located on chromosome 1. Homozygous mutation carriers cannot compensate for the defect and show a C1q deficiency with loss of total haemolytic activity (CH50). Clusters occur among the offspring of sibling and consanguineous marriages. In this work, the case of a female patient with a severe SLE-like syndrome with onset in early childhood is presented. A hereditary C1q defect, without immunological evidence of C1q or LMQ-C1q, was identified as the cause of the disease. Since none of the previously described mutation patterns could be detected in the patient, all three C1q genes were sequenced, revealing a new mutation pattern. The mutation presented in this work differs from the mutations described so far in that it is not a point mutation but a deletion of 29 bases (c.283_311) in exon 2 of the C1q B-chain gene, with a resulting frameshift and premature stop codon (p.Met95TrpfsX8). Analysis of the parents and siblings of the affected patient allowed the mode of inheritance to be established. In addition, the mutation was excluded by prenatal diagnostics in an unborn sibling.

Relevance:

30.00%

Publisher:

Abstract:

Early-type galaxies (ETGs) are embedded in hot (10^6-10^7 K), X-ray emitting gaseous haloes, produced mainly by stellar winds and heated by Type Ia supernova explosions, by the thermalization of stellar motions and occasionally by the central super-massive black hole (SMBH). In particular, the thermalization of stellar motions is due to the interaction between the stellar and SNIa ejecta and the hot interstellar medium (ISM) already residing in the ETG. A number of different astrophysical phenomena determine the X-ray properties of the hot ISM, such as stellar population formation and evolution, galaxy structure and internal kinematics, the presence of an Active Galactic Nucleus (AGN), and environmental effects. With the aid of high-resolution hydrodynamical simulations performed on state-of-the-art galaxy models, this Thesis focuses on the effects of galaxy shape, stellar kinematics and star formation on the evolution of the X-ray coronae of ETGs. The numerical simulations show that the relative importance of flattening and rotation is a function of galaxy mass: at low galaxy masses, adding flattening and rotation induces a galactic wind, thus lowering the X-ray luminosity; at high galaxy masses, angular momentum conservation keeps the central regions of rotating galaxies at low density, whereas in non-rotating models a denser and brighter atmosphere is formed. The same dependence on galaxy mass is present in the effects of star formation (SF): in light galaxies SF contributes to increasing the spread in Lx, while at high galaxy masses the halo X-ray properties are only marginally sensitive to SF. In every case, the star formation rate at the present epoch agrees quite well with observations, and the massive, cold gaseous discs are partially or completely consumed by SF on a time-scale of a few Gyr, excluding the presence of young stellar discs at the present epoch.

Relevance:

30.00%

Publisher:

Abstract:

In the present thesis, a new diagnostic methodology based on advanced time-frequency analysis is presented. More precisely, a new fault index is defined that allows individual fault components to be tracked in a single frequency band. In detail, a frequency sliding is applied to the signals being analyzed (currents, voltages, vibration signals), so that each fault frequency component is shifted into a prefixed single frequency band. The discrete Wavelet Transform is then applied to the resulting signal to extract the fault signature in the chosen frequency band. Once the state of the machine has been qualitatively diagnosed, a quantitative evaluation of the fault degree is necessary. For this purpose, a fault index based on the energy of the approximation and/or detail signals resulting from the wavelet decomposition has been introduced to quantify the fault extent. The main advantages of the new method over existing diagnosis techniques are the following: - capability of monitoring the fault evolution continuously over time under any transient operating condition; - no requirement for speed/slip measurement or estimation; - higher accuracy in filtering frequency components around the fundamental in the case of rotor faults; - reduced likelihood of false indications by avoiding confusion with other fault harmonics (the contributions of the most relevant fault frequency components under speed-varying conditions are clamped into a single frequency band); - low memory requirement due to the low sampling frequency; - reduced processing latency (no repeated sampling operations required).
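As a rough illustration of the energy-based fault index, the sketch below computes the per-band energy share of a wavelet decomposition using PyWavelets. The sampling rate, wavelet family, decomposition depth and test signal are assumptions for the example; the frequency-sliding preprocessing step described above is not reproduced here.

```python
import numpy as np
import pywt

def wavelet_energy_index(signal, wavelet="db8", level=4):
    """Fault index: energy of the wavelet coefficients at each
    decomposition level, normalised by total signal energy."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    total = sum(float(np.sum(c ** 2)) for c in coeffs)
    return [float(np.sum(c ** 2)) / total for c in coeffs]

# Hypothetical current signal: a 50 Hz fundamental plus a weak component
# assumed to have been frequency-shifted into a known band by preprocessing.
fs = 10_000
t = np.arange(0, 1.0, 1 / fs)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 1_800 * t)

print(wavelet_energy_index(current))  # per-band energy shares, summing to 1
```

Tracking the energy share of the band that hosts the shifted fault component over successive acquisitions then gives a scalar indicator of fault severity, in the spirit of the index described above.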

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a collection of works focused on Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of aspects analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are discussed first. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. Limitations of the standard approaches for large events emerge in this chapter; the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation of the observations are proposed. The third part of the thesis focuses on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested: the first is a threshold-based method using traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
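As a purely illustrative companion to the third part, a threshold-based flagging of the potentially damaged zone can be sketched as follows. The trigger level, station names and displacement series are hypothetical placeholders, and the logic is a generic stand-in, not the calibrated method developed in the thesis.

```python
# Minimal sketch of a threshold-based alert on real-time ground motion:
# stations whose running peak displacement exceeds a trigger level are
# flagged as inside the potentially damaged zone.
import numpy as np

THRESHOLD_CM = 1.0   # hypothetical peak-displacement trigger

def damaged_zone(stations, threshold=THRESHOLD_CM):
    """Return station IDs whose peak |displacement| exceeds the threshold.
    `stations` maps an ID to a displacement time series in cm."""
    return [sid for sid, disp in stations.items()
            if np.max(np.abs(disp)) > threshold]

rng = np.random.default_rng(3)
stations = {
    "ST01": 2.5 * rng.standard_normal(1_000),   # near-field: strong shaking
    "ST02": 0.2 * rng.standard_normal(1_000),   # far-field: weak shaking
}
print(damaged_zone(stations))   # expected: ["ST01"]
```

In a real system the same comparison would run incrementally as samples arrive, so the flagged zone grows outward in real time as the wavefield propagates.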

Relevance:

30.00%

Publisher:

Abstract:

Plant communities on weathered rock and outcrops are characterized by high species richness (Dengler 2006) and often persist on small and fragmented surfaces. Yet very few studies have examined the relationships between heterogeneity and plant diversity at small scales, in particular in nutrient-poor and low-productivity environments (Shmida and Wilson 1985, Lundholm 2003). In order to assess these relationships both in space and time, two different approaches were employed in the present study, in two gypsum outcrops of the Northern Apennines. Diachronic and synchronic samplings were performed from April 2012 to March 2013, using a 50x50 cm plot as the base sampling unit in both. The diachronic survey aimed to investigate the seasonal patterning of plant diversity through image analysis techniques integrated with field data, also considering the seasonal climatic trend and the substrate quality and its variation in time. The purpose of the further, synchronic sampling was to describe the plant diversity pattern as a function of environmental heterogeneity, in terms of substrate typologies, soil depth and topographic features. The results showed that the responses of the diversity pattern depend on resource availability, environmental heterogeneity and the manner in which the different taxonomic groups access resources during the year. Species richness and Shannon diversity were positively affected by increasing substrate heterogeneity, and a good turnover in seasonal species occurrence was detected. This vegetation may be described by the coexistence of three groups of species, which create a gradient from early colonization stages, characterized by steeper slopes and a predominance of bare rock, to situations with more developed soil.
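For reference, the two diversity measures named above can be computed per plot as follows. The species abundance counts are hypothetical, and the Shannon index uses natural logarithms, a common convention that the abstract does not specify.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical species abundances from one 50x50 cm plot.
plot_counts = [12, 7, 3, 1, 1]
print(f"species richness: {sum(1 for c in plot_counts if c > 0)}")
print(f"Shannon H': {shannon_index(plot_counts):.2f}")
```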