913 results for Monitoring, SLA, JBoss, Middleware, J2EE, Java, Service Level Agreements
Abstract:
This thesis deals with context-aware services, smart environments, context management, and solutions for device and service interoperability. Multi-vendor devices offer an increasing number of services and end-user applications whose value rests on the ability to exploit information originating from the surrounding environment by means of a growing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras, and so on. However, such devices are usually unable to exchange information because they lack a shared data storage and common information exchange methods. A large number of standards and domain-specific building blocks are available and heavily used in today's products. However, solutions based on ready-to-use modules are not without problems: the integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In such scenarios it is interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability. The result should also lead to faster and simpler design, development, and deployment of cross-domain applications. This thesis mainly focuses on software architectures supporting context-aware service providers, especially on the following subjects:
- user-preference-based service adaptation
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability
Experimental activities were carried out in several domains, including cultural heritage and indoor and personal smart spaces, all of which are considered significant test-beds in context-aware computing.
The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on "Processing Open Cultural Heritage", and within SOFIA, a project of the ARTEMIS JU on embedded systems. I worked in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland), and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis focuses on the problem statement and related work, addressing interoperability issues and the related architecture components. The second part focuses on specific architectures and frameworks:
- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context, preference and profile based application broker which I designed within the EPOCH Network of Excellence
- M3: a Semantic Web based information-sharing infrastructure for smart spaces, designed by Nokia within the European project SOFIA
- NoTa: a service- and transport-independent connectivity framework
- OSGi: the well-known Java based service support framework
The final section is dedicated to the middleware, the tools, and the software agents developed during my doctorate to support context-aware services in smart environments.
Abstract:
The thesis deals with channel coding theory applied to the upper layers of the protocol stack of a communication link, and it is the outcome of a four-year research activity. A specific aspect of this activity has been the continuous interaction between the natural curiosity of academic blue-sky research and the system-oriented design deriving from the collaboration with European industry in the framework of European funded research projects. In this dissertation, the classical channel coding techniques that are traditionally applied at the physical layer find their application at the upper layers, where the encoding units (symbols) are packets of bits rather than single bits, which is why such upper layer coding techniques are usually referred to as packet layer coding. The rationale behind the adoption of packet layer techniques is that physical layer channel coding is a suitable countermeasure against small-scale fading, while it is less efficient against large-scale fading. This is mainly due to the limited time diversity inherent in the necessity of adopting a physical layer interleaver of reasonable size, so as to avoid increasing the modem complexity and the latency of all services. Packet layer techniques, thanks to their longer codeword duration (each codeword is composed of several packets of bits), offer intrinsically longer protection against long fading events. Furthermore, being implemented at the upper layers, packet layer techniques have the indisputable advantages of simpler implementations (very close to pure software implementations) and of selective applicability to different services, thus enabling a better match with the service requirements (e.g. latency constraints).
Packet coding techniques have been widely recognized in recent communication standards as a viable and efficient coding solution: Digital Video Broadcasting standards, like DVB-H, DVB-SH, and DVB-RCS mobile, and 3GPP standards (MBMS) employ packet coding techniques working at layers higher than the physical one. In this framework, the aim of the research work has been the study of state-of-the-art coding techniques working at the upper layer, the performance evaluation of these techniques in realistic propagation scenarios, and the design of new coding schemes for upper layer applications. After a review of the most important packet layer codes, i.e. Reed-Solomon, LDPC and Fountain codes, the thesis focuses on the performance evaluation of ideal codes (i.e. Maximum Distance Separable codes) working at the upper layer. In particular, we analyze the performance of UL-FEC techniques in Land Mobile Satellite channels. We derive an analytical framework, a useful system design tool that allows the performance of the upper layer decoder to be predicted. We also analyze a system in which upper layer and physical layer codes work together, and we derive the optimal splitting of redundancy when a frequency non-selective, slowly varying fading channel is taken into account. The whole analysis is supported and validated through computer simulation. In the last part of the dissertation, we propose LDPC Convolutional Codes (LDPCCC) as a possible coding scheme for future UL-FEC applications. Since one of the main drawbacks of packet layer codes is the large decoding latency, we introduce a latency-constrained decoder for LDPCCC (called the windowed erasure decoder). We analyze the performance of state-of-the-art LDPCCCs when our decoder is adopted. Finally, we propose a design rule which allows performance and latency to be traded off.
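The advantage of ideal MDS packet codes mentioned above can be made concrete with a small numerical sketch: an (n, k) MDS code recovers a source block whenever at least k of its n packets survive, so the residual block failure probability over a memoryless packet-erasure channel is a binomial tail. The parameter values below are illustrative, not taken from the thesis.

```python
from math import comb

def mds_block_failure_prob(n: int, k: int, p: float) -> float:
    """Probability that an ideal (n, k) MDS packet code fails to recover
    a source block when each packet is independently erased with
    probability p: decoding fails when more than n - k packets are lost."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n - k + 1, n + 1))

# Example: a (60, 40) code over a channel with 10% packet loss.
uncoded = 1 - (1 - 0.1)**40          # any single loss kills an uncoded block
coded = mds_block_failure_prob(60, 40, 0.1)
```

With these (hypothetical) numbers the uncoded block is almost certainly lost, while the coded block survives with overwhelming probability, which is the intuition behind the longer codeword duration discussed above.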
Abstract:
Sea-level variability is characterized by multiple interacting factors, described in the Fourth Assessment Report (Bindoff et al., 2007) of the Intergovernmental Panel on Climate Change (IPCC), that act over wide spectra of temporal and spatial scales. In Church et al. (2010), sea-level variability and changes are defined as manifestations of climate variability and change. The European Environment Agency (EEA) defines sea level as one of the most important indicators for monitoring climate change, as it integrates the response of different components of the Earth's system and is also affected by anthropogenic contributions (EEA, 2011). The balance between the different sea-level contributions represents an important source of uncertainty, involving stochastic processes that are very difficult to describe and understand in detail, to the point that they are defined as an enigma in Munk (2002). Sea-level rate estimates are affected by all these uncertainties, in particular when considering the possible responses of the sea-level contributions to future climate. At the regional scale, lateral fluxes also contribute to sea-level variability, adding complexity to sea-level dynamics. The research strategy adopted in this work to approach such an interesting and challenging topic has been to develop an objective methodology to study sea-level variability at different temporal and spatial scales, applicable to each part of the Mediterranean basin in particular, and to the global ocean in general, using the best calibrated sources of data (for the Mediterranean): in-situ data, remote sensing data and numerical models. The overall objective of this work was to achieve a deep understanding of all of the components of the sea-level signal contributing to sea-level variability, tendency and trend, and to quantify them.
Abstract:
The continuous advancement and enhancement of wireless systems are enabling new compelling scenarios where mobile services can adapt to the current execution context, represented by the computational resources available at the local device, the current physical location, the people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and that introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware. We discuss the main functions needed by context data distribution in large mobile systems, and we advocate the precise definition and strict enforcement of quality-based contracts between context consumers and the CDDI, used to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions.
Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to highlight the wide technical validity of our analysis and solutions under different network deployments and quality constraints.
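As a loose illustration of the quality-based contracts discussed above, a CDDI-style broker might deliver a context item only when it honours the consumer's contract, and otherwise fall back to reconfiguration. All class and field names here are invented for this sketch, not the dissertation's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QoSContract:
    # Illustrative contract terms between a context consumer and the CDDI.
    max_age_s: float      # delivered context data must be fresher than this
    min_coverage: float   # fraction of relevant sources that must be reachable

@dataclass
class ContextItem:
    key: str
    value: object
    age_s: float          # time elapsed since the datum was produced

class CDDIBroker:
    """Toy broker: delivers an item only if the consumer contract holds."""
    def __init__(self, contract: QoSContract, coverage: float):
        self.contract, self.coverage = contract, coverage

    def deliver(self, item: ContextItem) -> Optional[ContextItem]:
        if self.coverage < self.contract.min_coverage:
            return None   # contract violated: trigger middleware reconfiguration
        return item if item.age_s <= self.contract.max_age_s else None

broker = CDDIBroker(QoSContract(max_age_s=5.0, min_coverage=0.8), coverage=0.9)
fresh = broker.deliver(ContextItem("position", (44.5, 11.3), age_s=2.0))
stale = broker.deliver(ContextItem("position", (44.5, 11.3), age_s=9.0))
```

The point of the sketch is only the shape of the contract check; a real CDDI would also handle replication, transport heterogeneity, and runtime component reconfiguration.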
Abstract:
The first chapter studies the new technology of cloud computing, providing a simple analysis of its main characteristics, the actors involved, and the related deployment models and services offered. The second chapter introduces the notion of coordination as a service, discussing the abstractions that make up its logical architecture. The TuCSoN coordination model is then considered, defining what is meant by node, agent, tuple centre and agent coordination context, and the coordination language through which they interact is analysed. In the third chapter, the previously acquired notions of TuCSoN are revised and extended in the context of cloud computing, and an abstract model and a possible architecture of TuCSoN in the Cloud are provided. The aspects of a possible service of this kind in a pay-per-use scenario are also analysed. Finally, the fourth and last chapter develops a case study in which an interface for the current TuCSoN CLI is implemented in the form of an applet, which is then deployed to the cloud through the Cloudify PaaS platform.
Abstract:
This thesis examines the architectural characteristics of the TuCSoN coordination middleware that most affect the performance of the coordinated systems. Where possible, both the architecture and the underlying technology were modified to improve the performance of the middleware. As a final result, a significant increase in system performance was obtained. Not all possible improvements have been implemented; however, some directions for future developments are provided.
Abstract:
The common earwig (Forficula auricularia LINNAEUS 1758) has so far been regarded in viticulture as a beneficial insect, acting as a natural antagonist of various grapevine pests. Since about 2005, however, it has been causing damage in Palatinate vineyards owing to sharply rising population densities. Earwigs gather in large numbers in the grape clusters, and together with their excrement they end up in the harvested crop in great quantities at picking time. Growers perceive the animals as a serious, quality-reducing nuisance, and their classification as beneficial insects is viewed critically. Because of this problem, a research project funded by the Forschungsring des Deutschen Weinbaus (FDW) was started in May 2007 at the Dienstleistungszentrum Ländlicher Raum Rheinpfalz in Neustadt an der Weinstraße. Until 2010, open questions concerning the monitoring and population biology of the common earwig in vineyards were addressed, the damage it causes was described, and strategies for regulating infestations were developed. Ground-active earwigs were recorded with pitfall traps after BARBER (1931). In the canopy of the vine, earwigs were recorded with specially designed bamboo traps. F. auricularia is the dominant earwig species in Palatinate vineyards. In the course of the project, the univoltine life cycle of the common earwig in Palatinate vineyards was fully elucidated. During the growing season, the intensity of soil management and the resulting ground cover, the soil type, the air temperature, the humidity and the amount of precipitation significantly influenced infestation densities on the vine. Earwig infestation of the grape clusters depended significantly on cluster compactness and weight, on the proportion of rot per cluster, and on vine leaves grown into the clusters.
Overwintering and brood behaviour were influenced or disturbed by the type of soil management. Laboratory and field experiments showed that F. auricularia transfers fungal pathogens such as grey mould (Botrytis cinerea PERSOON 1794) and the mould Penicillium crustosum THOM 1930 to healthy grapes. Feeding experiments further showed that the earwig can only feed on rotten and pre-damaged berries and does not injure intact berries. Analytical and sensory investigations established that earwig excrement can cause sensory off-flavours in wine. These are caused by the 2-methyl-1,4-benzoquinone contained in the excrement, a component of the species' own defensive secretion. Since the earwig, depending on the season, lives either in the soil or on the vine, infestation-regulating measures were carried out in both the soil and the canopy zone of the vineyard. Deep cultivation with incorporation of the ground cover in autumn and spring damaged the overwintering adults and the egg clutches, so that in the following growing season infestation densities in the canopy decreased slightly but not significantly. Mechanical disturbance of the ground cover carried out at the end of June, during the earwigs' migration up the vines, significantly reduced earwig infestation on the vine for up to three weeks after the measure. In the canopy, infestation densities were reduced until harvest by the insecticides SpinTor (active ingredient spinosad, 0.01%) and Steward® (active ingredient indoxacarb, 0.0125%), and secondarily by partial defoliation of the canopy.
Abstract:
Geochemical mapping is a valuable tool for territorial control: it can be used not only in the identification of mineral resources and in geological, agricultural and forestry studies, but also in the monitoring of natural resources, offering solutions to environmental and economic problems. Stream sediments are widely used in the sampling campaigns carried out by governments and research groups worldwide because they are broadly representative of rocks and soils, easy to sample, and suitable for very detailed sampling. In this context, the environmental role of stream sediments provides a good basis for the implementation of environmental management measures. Indeed, the composition of river sediments is an important factor in understanding the complex dynamics that develop within catchment basins, and they represent a critical environmental compartment: they can persistently incorporate pollutants after a contamination event and release them into the biosphere if environmental conditions change. It is essential to determine whether the concentrations of certain elements, in particular heavy metals, result from the natural erosion of rocks containing high concentrations of specific elements or are generated as residues of human activities in a given study area. This PhD thesis aims to extract the widest spectrum of information from an extensive database on the stream sediments of the Romagna rivers. The study involved low- and high-order streams in the mountain and hilly areas, but also the sediments of the floodplain, where intensive agriculture is active. The geochemical signals recorded by the stream sediments are interpreted in order to reconstruct the natural variability related to bedrock and soil contributions, the effects of river dynamics and the anomalous sites, and, through the calculation of background values, to evaluate their level of degradation and predict the environmental risk.
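One common robust way to separate natural background concentrations from anomalies, sketched here with invented values (the thesis's actual background-calculation method may differ), is a median-plus-MAD threshold per element:

```python
import statistics

def background_threshold(values, k=2.0):
    """Robust background estimate for one element in stream sediments:
    median + k * MAD (median absolute deviation). Samples above the
    threshold are flagged as potential anomalies, either geogenic
    or anthropogenic."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return med + k * mad

# Hypothetical Ni concentrations (mg/kg) from one set of catchment samples.
ni = [28, 31, 25, 30, 29, 27, 33, 26, 30, 95]   # one suspect value
thr = background_threshold(ni)
anomalies = [v for v in ni if v > thr]
```

Whether a flagged sample reflects erosion of metal-rich bedrock or a human source then requires the kind of geological interpretation described above; the statistic only localizes the candidates.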
Abstract:
The monitoring of cognitive functions aims at gaining information about the current cognitive state of the user by decoding brain signals. In recent years, this approach has made it possible to acquire valuable information about the cognitive aspects of the interaction of humans with the external world. From this consideration, researchers started to consider passive applications of the brain-computer interface (BCI) in order to provide a novel input modality for technical systems based solely on brain activity. The objective of this thesis is to demonstrate how passive Brain-Computer Interface (BCI) applications can be used to assess the mental states of users in order to improve human-machine interaction. Two main studies have been carried out. The first investigates whether morphological variations of Event-Related Potentials (ERPs) can be used to predict the users' mental states (e.g. attentional resources, mental workload) during different reactive BCI tasks (e.g. P300-based BCIs), and whether this information can predict the subjects' performance on those tasks. In the second study, a passive BCI system able to estimate online the mental workload of the user by relying on the combination of EEG and ECG biosignals is proposed. The latter study was performed by simulating an operational scenario in which errors or lapses in performance could have significant consequences. The results showed that the proposed system is able to estimate the mental workload of the subjects online, discriminating three different difficulty levels of the tasks with high reliability.
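A heavily simplified sketch of the kind of EEG/ECG feature fusion described above. The features, weights and thresholds are placeholders invented for illustration, not the trained model of the study: frontal theta power typically rises with workload, parietal alpha power falls, and heart rate rises.

```python
def workload_index(theta_z: float, alpha_z: float, hr_z: float) -> float:
    """Fuse z-scored features (relative to a resting baseline) into a
    scalar workload score; signs follow the usual physiological trends."""
    return (theta_z - alpha_z + hr_z) / 3.0

def workload_level(score: float, low=0.5, high=1.5) -> str:
    """Map the score onto three task-difficulty levels."""
    return "low" if score < low else "medium" if score < high else "high"

easy = workload_level(workload_index(theta_z=0.2, alpha_z=0.5, hr_z=0.1))
hard = workload_level(workload_index(theta_z=2.0, alpha_z=-1.5, hr_z=1.6))
```

An actual online system would of course learn these weights per subject (e.g. with a classifier calibrated on labelled easy/hard trials) rather than fix them by hand.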
Abstract:
EUMETSAT (www.eumetsat.int) is the European agency that operates satellites to monitor the climate, the weather and the terrestrial environment. From the operations centre located in Darmstadt (Germany), it controls meteorological satellites in geostationary and polar orbits that collect data for observing the atmosphere, the oceans and the Earth's surface in a continuous 24/7 service. GEMS (Generic Event Monitoring System) provides centralized monitoring for different programmes within the EUMETSAT operational environment. The software supports the monitoring of different platforms and the cross-monitoring of different operational sections, and is designed to be extensible to future missions. The current version of the GEMS MMI (Multi Media Interface), v. 3.6, uses standard Java Server Pages (JSP) and makes heavy use of Java code; it also uses ASCII files for filtering and displaying the data. A direct consequence is, for example, that the information is not updated automatically but requires a page reload. Further input for a new version of the GEMS MMI comes from various anomalous behaviours reported during daily use of the software. The thesis focuses on the definition of the requirements for a new version of the GEMS MMI (v. 4.4) by the EUMETSAT operations engineering and maintenance division. The supporting activities and tests were conducted at Solenix. The new software will provide a better web application, with faster response times, automatic updating of the information, full use of the GEMS database and its filtering capabilities, together with mobile phone applications to support on-call activities. The new version of GEMS will have a new Graphical User Interface (GUI) based on modern technologies.
For an operations environment such as EUMETSAT's, where the reliability of the technologies and the longevity of the chosen approach are of vital importance, not all of the currently available tools are suitable, and some need to be improved. At the same time, a modern interface, in terms of visual design, interactivity and functionality, is important for the new GEMS MMI.
Abstract:
Coral reefs are the most biodiverse ecosystems of the ocean and provide notable ecosystem services. Nowadays they face a number of local anthropogenic threats, and environmental change is threatening their survival on a global scale. Large-scale monitoring is necessary to understand environmental changes and to implement useful conservation measures. Governmental agencies are often underfunded and not able to sustain the necessary spatial and temporal large-scale monitoring. To overcome these economic constraints, scientists can in some cases engage volunteers in environmental monitoring. Citizen science enables the collection and analysis of scientific data at larger spatial and temporal scales than otherwise possible, addressing issues that would otherwise be logistically or financially unfeasible. "STE: Scuba Tourism for the Environment" was a volunteer-based Red Sea coral reef biodiversity monitoring programme in which SCUBA divers and snorkelers collected data on 72 taxa by completing survey questionnaires after their dives. In my thesis, I evaluated the reliability of the data collected by volunteers, comparing their questionnaires with those completed by professional scientists. Validation trials showed a sufficient level of reliability, indicating that non-specialists performed similarly to conservation volunteer divers on accurate transects. Using the data collected by volunteers, I developed a biodiversity index that revealed spatial trends across the surveyed areas. The project results provided important feedback to the local authorities on the current health status of Red Sea coral reefs and on the effectiveness of environmental management. I also analysed the spatial and temporal distribution of each surveyed taxon, identifying abundance trends related to anthropogenic impacts.
Finally, I evaluated the effectiveness of the project in increasing the environmental education of volunteers, and showed that participation in the STE project significantly increased both knowledge of coral reef biology and ecology and awareness of the impacts of human behaviour on the environment.
Abstract:
Resource management is of paramount importance in network scenarios, and it is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been kept in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, the centralized control makes it possible to define more specific and complex tasks that may involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees is leading network programmers to design protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with a high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between this architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network, with differentiated services and stringent QoS requirements.
We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
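The kind of differentiated treatment described above can be sketched as a controller-side policy that maps traffic classes to switch egress queues; in OpenFlow this corresponds to installing flow entries whose action list includes a set-queue action. The queue identifiers, rates and DSCP mappings below are illustrative placeholders, not the thesis's actual configuration.

```python
# Controller-side policy table: service class -> egress queue.
# In a real deployment the switch would first be provisioned with
# these queues (e.g. via OVSDB/ovs-vsctl), and flows would then be
# matched to them through a set-queue action in the flow entry.
QUEUES = {
    "voice":       {"queue_id": 0, "min_rate_mbps": 10},
    "video":       {"queue_id": 1, "min_rate_mbps": 50},
    "best_effort": {"queue_id": 2, "min_rate_mbps": 0},
}

def classify(dscp: int) -> str:
    """Map a DSCP codepoint to a service class (simplified DiffServ)."""
    if dscp == 46:         # EF: expedited forwarding (voice)
        return "voice"
    if 34 <= dscp <= 38:   # AF4x: assured forwarding (video)
        return "video"
    return "best_effort"

def flow_entry(match_dscp: int) -> dict:
    """Build an abstract flow entry the controller would push to a switch."""
    cls = classify(match_dscp)
    return {"match": {"ip_dscp": match_dscp},
            "actions": [{"set_queue": QUEUES[cls]["queue_id"]},
                        {"output": "NORMAL"}]}

entry = flow_entry(46)   # voice traffic steered to the high-priority queue
```

The separation between `classify` (policy) and `flow_entry` (mechanism) mirrors the modular control/data plane split that the architecture above relies on.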
Abstract:
Climate monitoring requires an operational, spatio-temporal analysis of climate variability. With the aim of producing ready-to-use maps at regular intervals, it is helpful to show at a glance the spatial variability of the climate elements and its change over time. For current and recent years, the Deutscher Wetterdienst developed a standard procedure for producing such maps. The method used to produce them varies between climate elements, depending on the underlying data, the natural variability, and the availability of in-situ data.
As part of the analysis of spatio-temporal variability in this dissertation, various interpolation methods are applied to the mean temperature of the five decades of the years 1951-2000 for a relatively large area, Region VI of the World Meteorological Organization (Europe and the Middle East). The region covers a climatologically rather heterogeneous study area, from Greenland in the northwest to Syria in the southeast.
The central goal of the dissertation is to develop a method for the spatial interpolation of the mean decadal temperature values for Region VI. This method should be suitable in the future for operational monthly climate map production; it should also be transferable to other climate elements and applicable anywhere with the corresponding software. Two central databases are used in this dissertation: so-called CLIMAT data over land and ship data over the sea.
In essence, the transfer of the point temperature values to the area by spatial interpolation is carried out in three steps. The first step is a multiple regression that reduces the station values to a common level using the four predictors geographic latitude, elevation above sea level, annual temperature amplitude and thermal continentality.
In the second step, the reduced temperature values, the so-called residuals, are interpolated with the radial basis function method from the family of neural network models (NNM). In the last step, the interpolated temperature grids are scaled back to their original level by inverting the multiple regression of step one with the help of the four predictors.
For all station values, the difference between the value estimated by the interpolation and the true measured value is computed and expressed by the geostatistical measure of the root mean square error (RMSE). The central advantages are the faithful reproduction of the station values, the absence of generalization and the avoidance of interpolation islands. The developed procedure is transferable to other climate elements such as precipitation, snow depth or sunshine duration.
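The three-step procedure (regression reduction, radial basis function interpolation of the residuals, back-transformation) can be sketched on synthetic station data. All coefficients and predictor values below are invented for illustration; the RMSE is evaluated at the stations themselves, where exact reproduction of the values mirrors the value-true, non-generalizing character of the method.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Synthetic stations: lon/lat positions plus the four predictors
# (latitude, elevation, annual amplitude, continentality index).
n = 200
xy = rng.uniform([-10.0, 35.0], [40.0, 70.0], size=(n, 2))   # lon, lat
X = np.column_stack([xy[:, 1],                               # latitude [deg]
                     rng.uniform(0, 2000, n),                # elevation [m]
                     rng.uniform(5, 25, n),                  # amplitude [K]
                     rng.uniform(0, 1, n)])                  # continentality
beta_true = np.array([-0.6, -0.0065, -0.1, 2.0])             # invented effects
t_obs = 30 + X @ beta_true + rng.normal(0, 0.5, n)           # station means [C]

# Step 1: multiple regression reduces the stations to a common level.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, t_obs, rcond=None)
residuals = t_obs - A @ coef

# Step 2: interpolate the residual field with radial basis functions.
rbf = RBFInterpolator(xy, residuals, kernel="thin_plate_spline")

# Step 3: back-transform at target points using their own predictors;
# checked here at the stations via the RMSE.
t_hat = A @ coef + rbf(xy)
rmse = float(np.sqrt(np.mean((t_hat - t_obs) ** 2)))
```

In operational use, step 3 would be evaluated on a regular grid of predictor fields rather than at the stations, and the RMSE would come from cross-validation.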
Abstract:
Sea level variation is one of the parameters directly related to climate change. Monitoring sea level rise is an important scientific issue, since many populated areas of the world, including megacities, are located in low-lying regions. At present, sea level is measured by means of two techniques: tide gauges and satellite radar altimetry. Tide gauges measure sea level relative to a ground benchmark; hence, their measurements are directly affected by vertical ground motion. Satellite radar altimetry measures sea level relative to a geocentric reference and is not affected by vertical land motion. In this study, the linear relative sea level trends of 35 tide gauge stations distributed across the Mediterranean Sea have been computed over the period 1993-2014. In order to extract the real sea level variation, the vertical land motion has been estimated using the observations of available GPS stations and removed from the tide gauge records. These GPS-corrected trends have then been compared with satellite altimetry measurements over the same time interval (AVISO data set). A further comparison has been performed, over the period 1993-2013, using the CCI satellite altimetry data set, which was generated using updated modeling. The absolute sea level trends obtained from satellite altimetry and from GPS-corrected tide gauge data are mostly consistent, meaning that GPS data have provided reliable corrections for most of the sites. The trend values range between +2.5 and +4 mm/yr almost everywhere in the Mediterranean area; the largest trends were found in the Northern Adriatic Sea and in the Aegean. These results are in agreement with estimates of the global mean sea level rise over the last two decades. Where GPS data were not available, the information on vertical land motion deduced from the differences between absolute and relative trends is in agreement with the results of other studies.
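The tide-gauge correction described above reduces to adding the GPS-derived vertical land motion rate to the relative trend: a gauge on subsiding ground records an apparent sea-level rise that is too fast, and the GPS rate removes that bias. A synthetic single-site sketch (rates and noise levels are illustrative, not values from the study):

```python
import numpy as np

def linear_trend(t_years, series_mm):
    """Least-squares linear rate in mm/yr."""
    slope, _intercept = np.polyfit(t_years, series_mm, 1)
    return slope

rng = np.random.default_rng(1)
t = np.arange(1993, 2015, 1 / 12)                 # monthly samples, 1993-2014

# Hypothetical site: 3.0 mm/yr absolute sea-level rise over ground
# subsiding at -1.2 mm/yr (vertical land motion, VLM).
absolute = 3.0 * (t - t[0]) + rng.normal(0, 5, t.size)   # altimetry-like [mm]
vlm_rate = -1.2                                           # from GPS [mm/yr]
relative = absolute - vlm_rate * (t - t[0])               # tide-gauge record

# Adding the GPS VLM rate back to the relative trend recovers the
# absolute (geocentric) sea-level trend:
corrected_rate = linear_trend(t, relative) + vlm_rate
```

With these numbers the raw gauge trend is about 4.2 mm/yr, while the GPS-corrected trend falls back near the true 3.0 mm/yr, which is the consistency check the study performs against altimetry.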
Abstract:
An imaging biomarker providing an early quantitative metric of clinical treatment response in cancer patients would represent a paradigm shift in cancer care. Current non-image-based clinical outcome metrics include morphological, clinical, and laboratory parameters; however, these are obtained relatively late after treatment. Diffusion-weighted MRI (DW-MRI) holds promise as a cancer treatment response biomarker, as it is sensitive to macromolecular and microstructural changes which can occur at the cellular level earlier in therapy than anatomical changes. Studies have shown that successful treatment of many tumor types can be detected with DW-MRI as an early increase in apparent diffusion coefficient (ADC) values. Additionally, low pretreatment ADC values of various tumors are often predictive of better outcomes. These capabilities, once validated, could provide an important opportunity to individualize therapy, minimizing the unnecessary systemic toxicity associated with ineffective therapies, with the additional advantage of improving overall patient health care and the associated costs. In this report, we provide a brief technical overview of DW-MRI acquisition protocols and quantitative image analysis approaches, and we review studies which have implemented DW-MRI for the purpose of early prediction of cancer treatment response.
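With the standard mono-exponential signal model S(b) = S0 * exp(-b * ADC), the ADC follows directly from signals acquired at two b-values. The voxel signal values below are hypothetical, chosen only to illustrate the early post-treatment ADC increase discussed above:

```python
import math

def adc(s_low: float, s_high: float, b_low: float = 0.0,
        b_high: float = 800.0) -> float:
    """Apparent diffusion coefficient from a two-point mono-exponential
    fit, S(b) = S0 * exp(-b * ADC); b in s/mm^2, ADC in mm^2/s."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Hypothetical voxel before and after therapy: the diffusion-weighted
# signal decays more strongly after treatment, i.e. the ADC increases
# as cell density drops.
pre = adc(1000.0, 420.0)    # roughly 1.08e-3 mm^2/s
post = adc(1000.0, 300.0)   # roughly 1.50e-3 mm^2/s
```

Clinical protocols typically fit more than two b-values per voxel (log-linear least squares), but the two-point form above captures the quantity being tracked.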