922 results for data storage concept
Abstract:
Changes in oceanic heat storage (HS) can reveal important evidence of climate variability related to ocean heat fluxes. Specifically, long-term variations in HS are a powerful indicator of climate change, as HS represents the balance between the net surface energy flux and the poleward heat transported by ocean currents. HS is estimated from the sea surface height anomaly measured by the TOPEX/Poseidon and Jason-1 altimeters from 1993 to 2006. To characterize and validate the altimeter-based HS in the Atlantic, we used data from the Pilot Research Moored Array in the Tropical Atlantic (PIRATA). Correlations and rms differences are used as statistical figures of merit to compare the HS estimates. The correlations range from 0.50 to 0.87 at the buoys located on the equator and in the southern part of the array. In that region the rms differences range between 0.40 and 0.51 × 10^9 J m^-2. These results are encouraging and indicate that the altimeter has the precision necessary to capture interannual trends in HS in the Atlantic. Albeit relatively small, salinity changes can also affect the sea surface height anomaly. To account for this effect, NCEP/GODAS reanalysis data are used to estimate the haline contraction. To understand which dynamical processes are involved in the HS variability, the total signal is decomposed into nonpropagating basin-scale and seasonal (HS_l) components, planetary waves, mesoscale eddies, and small-scale residual components. In general, HS_l is the dominant signal in the tropical region. Results show a warming trend of HS_l over the past 13 years almost everywhere in the Atlantic basin, with the most prominent slopes found at high latitudes. Positive interannual trends are found in the halosteric component at high latitudes of the South Atlantic and near the Labrador Sea. This could be an indication that the salinity anomaly increased in the upper layers during this period.
The dynamics of the South Atlantic subtropical gyre could also be subject to low-frequency changes caused by a trend in the halosteric component on each side of the South Atlantic Current.
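The altimeter-buoy comparison above uses correlation and rms difference as its figures of merit. A minimal plain-Python sketch of those two statistics (function name and sample data are illustrative, not from the study):

```python
import math

def figures_of_merit(a, b):
    """Pearson correlation and rms difference between two equally
    sampled series, e.g. altimeter-based vs. buoy-based heat storage.
    Illustrative helper, not the study's processing chain."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    r = cov / (sa * sb)                                   # correlation
    rms = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / n)
    return r, rms
```

A pair of perfectly proportional series gives r = 1 while the rms difference still reports the offset between them, which is why both statistics are needed.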
Abstract:
The Amazon basin is a region of constant scientific interest due to its environmental importance and the global significance of its biodiversity and climate. The seasonal variation in water volume is one of the topics currently studied. In general, variations in river levels depend primarily on the climate and the physical characteristics of the corresponding basins. The main factor influencing the water level in the Amazon basin is the intense rainfall over the region, a consequence of the humid tropical climate. Unfortunately, the Amazon basin lacks water-level information due to the difficulty of access for local operations. The purpose of this study is to compare and evaluate the Equivalent Water Height (Ewh) from the GRACE (Gravity Recovery And Climate Experiment) mission in order to study the connection between water loading and the vertical crustal variations it induces. To achieve this goal, the Ewh is compared with in-situ information from limnimeters. For the analysis, the correlation coefficients, phase, and amplitude of the GRACE Ewh solutions and the in-situ data were computed, as well as the timing of drought periods in different parts of the basin. The results indicate that vertical variations of the lithosphere due to water mass loading can reach 5 to 7 cm per year in the sedimentary and flooded areas of the region, where water-level variations can reach 8 to 10 m.
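The comparison above extracts the amplitude and phase of the seasonal signal in each series. A minimal sketch of a harmonic fit for a monthly series, assuming even sampling over whole years (function name and data illustrative; the GRACE processing itself is far more involved):

```python
import math

def annual_harmonic(t_months, y):
    """Amplitude and phase of the annual cycle in a monthly series
    sampled evenly over whole years (so the Fourier sums decouple).
    Illustrative stand-in for comparing Ewh and gauge seasonality."""
    w = 2.0 * math.pi / 12.0          # annual frequency, per month
    n = len(y)
    mean = sum(y) / n
    a = (2.0 / n) * sum((yi - mean) * math.cos(w * ti)
                        for ti, yi in zip(t_months, y))
    b = (2.0 / n) * sum((yi - mean) * math.sin(w * ti)
                        for ti, yi in zip(t_months, y))
    amp = math.hypot(a, b)
    phase = math.atan2(b, a)          # radians; seasonal peak at t = phase / w
    return amp, phase
```

Comparing the fitted phases of the two series then gives the lag between water loading and the gauge response.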
Abstract:
The miniaturization race in the hardware industry, aimed at continuously increasing transistor density on a die, no longer brings corresponding improvements in application performance. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to cope with the problem of memory organization and data structure. Using the MORPHEUS heterogeneous platform as an example, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which accomplishes its task by separating computation from communication, providing reconfigurable engines with computation and configuration data, and unifying heterogeneous computational devices through local storage buffers.
It is distinguished from related solutions by its distributed data-flow organization, mechanisms specifically engineered to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
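The core idea of separating computation from communication through local storage buffers can be illustrated, very loosely, with a ping-pong (double-buffering) scheme: while one buffer is being computed on, the other is being refilled. This is a generic sketch, not the MORPHEUS design; all names are illustrative.

```python
def double_buffered_process(stream, chunk, compute):
    """Process `stream` in chunks, alternating two local buffers so
    that the 'communication' (refill) of one buffer conceptually
    overlaps the 'computation' on the other. Toy sequential model."""
    buffers = [[], []]
    results = []
    fill, work = 0, 1
    buffers[fill] = stream[:chunk]                 # initial DMA-like fill
    for offset in range(chunk, len(stream) + chunk, chunk):
        fill, work = work, fill                    # swap roles (ping-pong)
        buffers[fill] = stream[offset:offset + chunk]      # refill one buffer
        results.extend(compute(x) for x in buffers[work])  # compute on the other
    return results
```

In real hardware the two roles run concurrently; the point of the buffer swap is that the compute engine never waits on the interconnect as long as a refill finishes within one compute pass.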
Abstract:
Cloud storage is a model of networked data storage in which the data are kept on multiple servers, real and/or virtual, generally hosted by third parties or on dedicated servers. Through this model it is possible to access personal or corporate information, whether videos, photographs, music, databases, or files, in a "dematerialized" way, from anywhere in the world and from any suitable device, without knowing the physical location of the data. The advantages of this approach are many: practically unlimited storage capacity, payment only for the amount of storage actually used, files accessible from anywhere in the world, extremely low maintenance, and greater safety, since files are protected from the theft, fire, or damage that could affect local computers. Google Cloud Storage falls into this category: it is a service for developers, provided by Google, that allows data to be saved and manipulated directly on Google's infrastructure. In more detail, Google Cloud Storage provides a programming interface that uses simple HTTP requests to perform operations on its infrastructure. Examples of supported operations are: uploading a file, downloading a file, deleting a file, listing the stored files, and obtaining the size of a given file. Each of these HTTP requests encapsulates information about the method used (the request type, such as GET, PUT, ...) and "scope" information (the resource on which to perform the request). It follows that it becomes possible to build an application that, by means of these HTTP requests, provides a cloud storage service (in which applications save data remotely, generally through third-party servers).
In this thesis, after analyzing the Google Cloud Storage service in detail, an application called iHD was implemented that uses this service to save, manipulate, and share data remotely (in the "cloud"). Common operations of this application allow folders to be shared among several users subscribed to the service, files to be uploaded and downloaded, folders or files to be deleted, and folders to be created. The need for an application of this kind arose from the strong growth, in the mobile phone market, of devices whose technologies and features are increasingly tied to the Internet and the connectivity it offers. The thesis also describes the design and implementation phases of the iHD application. In the design phase, all the functional and non-functional requirements of the application were analyzed, together with the modules of which it is composed. Finally, regarding the implementation phase, the thesis presents all the classes and their methods for each module and, in some cases, how these classes were actually implemented in the programming language used.
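The method-plus-scope request pattern described above can be sketched as follows. The bucket and object names are hypothetical, the endpoint is the public Google Cloud Storage host, authentication headers are omitted, and the requests are only constructed, never sent:

```python
import urllib.request

BASE = "https://storage.googleapis.com"   # public GCS endpoint

def build_request(method, bucket, obj=None, data=None):
    """Build (but do not send) an HTTP request for a storage
    operation: the method (GET, PUT, DELETE, ...) says what to do,
    the URL path says which resource (bucket or object) it acts on.
    Illustrative sketch; real calls also need auth headers."""
    path = f"{BASE}/{bucket}" + (f"/{obj}" if obj else "")
    return urllib.request.Request(path, data=data, method=method)

upload = build_request("PUT", "my-bucket", "photo.jpg", data=b"...")   # upload a file
listing = build_request("GET", "my-bucket")                            # list the bucket
```

An application such as the one described then wraps these request pairs (method, resource) behind higher-level operations like "share folder" or "download file".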
Abstract:
Alongside the traditional paradigm of "centralized" power generation, a new concept of "distributed" generation is emerging, in which the user becomes a prosumer. During this transition, Energy Storage Systems (ESS) can provide multiple services and features that are necessary for a higher-quality electrical system and for the optimization of non-programmable Renewable Energy Source (RES) power plants. An ESS prototype was designed, developed, and integrated into a renewable energy production system in order to create a smart microgrid and thereby manage the energy flow efficiently and intelligently as a function of the power demand. The produced energy can be fed into the grid, supplied directly to the load, or stored in batteries. The microgrid is composed of a 7 kW wind turbine (WT) and a 17 kW photovoltaic (PV) plant. The load is given by the electrical utilities of a cheese factory. The ESS comprises two subsystems: a Battery Energy Storage System (BESS) and a Power Control System (PCS). To size the ESS, a Remote Grid Analyzer (RGA) was designed, built, and connected to the wind turbine, the photovoltaic plant, and the switchboard. Different electrochemical storage technologies were then studied and, taking into account the load requirements of the cheese factory, the most suitable solution was identified in the high-temperature molten-salt Na-NiCl2 battery technology. Data acquisition from all electrical utilities provided a detailed load analysis, indicating an optimal storage size of a 30 kW battery system. Moreover, a container was designed and built to house the BESS and PCS, meeting all requirements and safety conditions. Furthermore, a smart control system was implemented to handle the different applications of the ESS, such as peak shaving or load levelling.
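The peak-shaving application mentioned above can be illustrated with a toy model: the battery discharges just enough to clamp the grid draw at a threshold. The threshold and power figures are hypothetical, and no energy capacity, state of charge, or efficiency is modelled:

```python
def peak_shaving(load_kw, threshold_kw, battery_kw):
    """For each interval, discharge the battery to keep grid draw at
    or below `threshold_kw`, limited by the battery's power rating.
    Toy sketch of the control idea, not the prototype's controller."""
    grid = []
    for p in load_kw:
        discharge = min(max(p - threshold_kw, 0.0), battery_kw)
        grid.append(p - discharge)
    return grid
```

Load levelling works analogously in the other direction, charging the battery when demand falls below a floor instead of discharging above a ceiling.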
Abstract:
Recent developments in clinical radiology have resulted in additional developments in the field of forensic radiology. After the implementation of cross-sectional radiology and optical surface documentation in forensic medicine, difficulties were encountered in the validation and analysis of the acquired data. To address this problem, and to allow comparison of autopsy and radiological data, a centralized, Internet-based database for forensic cases was created. The main goals of the database are (1) to create a digital and standardized documentation tool for forensic-radiological and pathological findings; (2) to establish a basis for validating forensic cross-sectional radiology as a non-invasive examination method in forensic medicine, that is, to compare and evaluate the radiological and autopsy data and analyze the accuracy of such data; and (3) to provide a conduit for continuing research and education in forensic medicine. Considering the infrequent availability of CT or MRI in forensic institutions and the heterogeneous nature of case material in forensic medicine, an evaluation of the benefits and limitations of cross-sectional imaging with respect to certain forensic features by a single institution may be of limited value. A centralized database permitting international forensic and cross-disciplinary collaborations may provide important support for forensic-radiological casework and research.
Abstract:
Traditionally, ontologies describe knowledge representation in a denotational, formalized, and deductive way. In this paper, we propose, in addition, a semiotic, inductive, and approximate approach to ontology creation. We define a conceptual framework, a semantics extraction algorithm, and a first proof of concept applying the algorithm to a small set of Wikipedia documents. Intended as an extension to the prevailing top-down ontologies, we introduce an inductive fuzzy grassroots ontology, which organizes itself organically from existing natural language Web content. Using inductive and approximate reasoning to reflect the natural way in which knowledge is processed, the ontology's bottom-up build process creates emergent semantics learned from the Web. By this means, the ontology acts as a hub for computing with words described in natural language. For Web users, the structural semantics are visualized as inductive fuzzy cognitive maps, allowing an initial form of intelligence amplification. Finally, we present an implementation of our inductive fuzzy grassroots ontology. Thus, this paper contributes an algorithm for the extraction of fuzzy grassroots ontologies from Web data by inductive fuzzy classification.
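Inductive fuzzy classification, as invoked above, assigns graded membership degrees learned from data rather than crisp class labels. A generic normalized-likelihood sketch of that idea, not the paper's exact algorithm, with illustrative counts:

```python
def inductive_membership(count_in_class, docs_in_class,
                         count_outside, docs_outside):
    """Graded membership of a term in a concept, induced from how
    much more often the term appears in documents of the concept
    than outside it, scaled into [0, 1]. Generic stand-in for the
    paper's inductive fuzzy classification step."""
    p_in = count_in_class / docs_in_class      # relative frequency inside
    p_out = count_outside / docs_outside       # relative frequency outside
    if p_in + p_out == 0.0:
        return 0.0                             # term never observed
    return p_in / (p_in + p_out)               # normalized likelihood
```

A term appearing only inside the concept's documents gets membership 1.0; a term equally common everywhere gets 0.5, i.e. no evidence either way.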
Abstract:
Aims: To evaluate the implications of an Absorb bioresorbable vascular scaffold (Absorb BVS) on the morphology of the superficial plaques. Methods and results: Forty-six patients who underwent Absorb BVS implantation and 20 patients implanted with bare metal stents (BMS) who had serial optical coherence tomographic examination at baseline and follow-up were included in this analysis. The thin-capped fibroatheromas (TCFA) were identified in the device implantation regions and in the adjacent native coronary segments. Within all regions, circumferential locations of TCFA and calcific tissues were identified, and the neointimal thickness was measured at follow-up. At six to 12-month follow-up, only 8% of the TCFA detected at baseline were still present in the Absorb BVS and 27% in the BMS implantation segment (p=0.231). Sixty percent of the TCFA in native segments did not change their phenotype at follow-up. At short-term follow-up, significant reduction in the lumen area of the BMS was noted, which was higher compared to that reported in the Absorb BVS group (-2.11±1.97 mm2 vs. -1.34±0.99 mm2, p=0.026). In Absorb BVS, neointima tissue continued to develop at midterm follow-up (2.17±0.48 mm2 vs. 1.38±0.52 mm2, p<0.0001) and covered the underlying tissues without compromising the luminal dimensions (5.93±1.49 mm2 vs. 6.14±1.49 mm2, p=0.571) as it was accommodated by the expanded scaffold (8.28±1.74 mm2 vs. 7.67±1.28 mm2, p<0.0001). Conclusions: Neointimal tissue develops following either Absorb BVS or BMS implantation and shields lipid tissues. The neointimal response in the BMS causes a higher reduction of luminal dimensions compared to the Absorb BVS. Thus, Absorb BVS may have a value in the invasive re-capping of high-risk plaques.
Abstract:
The oceans play a critical role in the Earth's climate, but unfortunately, the extent of this role is only partially understood. One major obstacle is the difficulty associated with making high-quality, globally distributed observations, a feat that is nearly impossible using only ships and other ocean-based platforms. The data collected by satellite-borne ocean color instruments, however, provide environmental scientists a synoptic look at the productivity and variability of the Earth's oceans and atmosphere on high-resolution temporal and spatial scales. Three such instruments, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) onboard ORBIMAGE's OrbView-2 satellite, and two Moderate Resolution Imaging Spectroradiometers (MODIS) onboard the National Aeronautics and Space Administration's (NASA) Terra and Aqua satellites, have been in continuous operation since September 1997, February 2000, and June 2002, respectively. To facilitate the assembly of a suitably accurate data set for climate research, members of the NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project and SeaWiFS Project Offices devote significant attention to the calibration and validation of these and other ocean color instruments. This article briefly presents results from the SIMBIOS and SeaWiFS Project Office's (SSPO) satellite ocean color validation activities and describes the SeaWiFS Bio-optical Archive and Storage System (SeaBASS), a state-of-the-art system for archiving, cataloging, and distributing the in situ data used in these activities.
Abstract:
A comprehensive hydroclimatic data set is presented for the 2011 water year to improve understanding of hydrologic processes in the rain-snow transition zone. This type of dataset is extremely rare in the scientific literature because of the quality and quantity of its soil depth, soil texture, soil moisture, and soil temperature data. Standard meteorological and snow cover data for the entire 2011 water year, which included several rain-on-snow events, are also provided. Surface soil textures and soil depths from 57 points are presented, as well as soil texture profiles from 14 points. Meteorological data include continuous hourly shielded, unshielded, and wind-corrected precipitation, wind speed, air temperature, relative humidity, dew point temperature, and incoming solar and thermal radiation. Sub-surface data include hourly soil moisture at multiple depths from 7 soil profiles within the catchment, and soil temperatures at multiple depths from 2 soil profiles. Hydrologic response data include hourly stream discharge from the catchment outlet weir, continuous snow depths from one location, intermittent snow depths from 5 locations, and snow depth and density data from ten weekly snow surveys. Though it represents only a single water year, the presentation of both above- and below-ground hydrologic conditions makes it one of the most detailed and complete hydroclimatic datasets from the climatically sensitive rain-snow transition zone, suitable for a wide range of modeling and descriptive studies.
Abstract:
A PET imaging system demonstrator based on LYSO crystal arrays coupled to SiPM matrices is under construction at the University and INFN of Pisa. Two SiPM matrices, each composed of 8×8 SiPM pixels with a 1.5 mm pitch, have been coupled one-to-one to LYSO crystal arrays and read out by a custom electronics system. Front-end ASICs were used to read 8 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port for storage and off-line data processing. In this paper we show the first preliminary tomographic image of a point-like radioactive source acquired with part of the two detection heads in time coincidence.
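Part of the off-line processing for such a system is sorting single events from the two detection heads into time coincidences. A minimal sketch of that pairing (window and timestamps illustrative; real systems work on hardware clock ticks):

```python
def coincidences(times_a, times_b, window):
    """Pair single events from two detection heads whose timestamps
    differ by at most `window`. Both lists must be sorted; each
    single participates in at most one coincidence. Toy off-line
    sorter, not the demonstrator's actual DAQ logic."""
    pairs, j = [], 0
    for ta in times_a:
        # skip events on head B that are already too early to match
        while j < len(times_b) and times_b[j] < ta - window:
            j += 1
        if j < len(times_b) and abs(times_b[j] - ta) <= window:
            pairs.append((ta, times_b[j]))
            j += 1
    return pairs
```

Each retained pair defines a line of response between the two crystals involved, which is the input to the tomographic reconstruction.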
Abstract:
We dedicate this paper to the memory of Prof. Andrés Pérez-Estaún, who was a great and committed scientist, a wonderful colleague, and an even better friend. The datasets in this work have been funded by Fundación Ciudad de la Energía (Spanish Government, www.ciuden.es) and by the European Union through the European Energy Programme for Recovery and the Compostilla OXYCFB300 project. Dr. Juan Alcalde is currently funded by NERC grant NE/M007251/1. Simon Campbell and Samuel Cheyney are acknowledged for thoughtful comments on the gravity inversion.
Abstract:
For many inborn errors of metabolism, early treatment is critical to prevent long-term developmental sequelae. We have used a gene-therapy approach to demonstrate this concept in a murine model of mucopolysaccharidosis type VII (MPS VII). Newborn MPS VII mice received a single intravenous injection of 5.4 × 10^6 infectious units of recombinant adeno-associated virus encoding the human β-glucuronidase (GUSB) cDNA. Therapeutic levels of GUSB expression were achieved by 1 week of age in liver, heart, lung, spleen, kidney, brain, and retina. GUSB expression persisted in most organs for the 16-week duration of the study at levels sufficient to either reduce or completely prevent lysosomal storage. Of particular significance, neurons, microglia, and meninges of the central nervous system were virtually cleared of disease. In addition, neonatal treatment of MPS VII mice provided access to the central nervous system via an intravenous route, avoiding a more invasive procedure later in life. These data suggest that gene transfer mediated by adeno-associated virus can achieve therapeutically relevant levels of enzyme very early in life and that the rapid growth and differentiation of tissues do not limit long-term expression.