986 results for "unison sincronizzazione condivisione file system sistemi operativi"
Abstract:
The Tara Oceans Expedition (2009-2013) was a global survey of ocean ecosystems aboard the Sailing Vessel Tara. It carried out extensive measurements of environmental conditions and collected plankton (viruses, bacteria, protists and metazoans) for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data publication provides permanent links to original and updated versions of validated data files containing measurements from the Continuous Surface Sampling System (CSSS). Water was pumped at the bow of the vessel from ~2 m depth, then de-bubbled and circulated to a WETLabs AC-S spectrophotometer and a WETLabs chlorophyll fluorometer. System maintenance (instrument cleaning, flushing) was done approximately once a week and in port between successive legs. All data were time-stamped and geo-referenced using GPS.
Abstract:
The Tara Oceans Expedition (2009-2013) was a global survey of ocean ecosystems aboard the Sailing Vessel Tara. It carried out extensive measurements of environmental conditions and collected plankton (viruses, bacteria, protists and metazoans) for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data publication provides permanent links to original and updated versions of validated data files containing measurements from the Continuous Surface Sampling System (CSSS). Water was pumped at the bow of the vessel from ~2 m depth, then de-bubbled and circulated to a Sea-Bird TSG temperature and conductivity sensor. System maintenance (instrument cleaning, flushing) was done approximately once a week and in port between successive legs. All data were time-stamped and geo-referenced using GPS.
Abstract:
The localization process responds to the human need for an increasingly detailed and precise perception of the surrounding context, with the aim of improving and simplifying interaction with the objects present in it. The current idea is to design positioning systems with particular regard to indoor environments, whose properties and density of elements strongly limit the performance of established tracking systems that are highly efficient in open spaces. Aware of this need, this work analyzes the performance of a localization system developed by the University of Bologna based on Ultra-Wide Bandwidth (UWB) technology and installed in the DEI laboratories of the Alma Mater Studiorum in Cesena. The objective is to characterize the localization accuracy of the system, suggesting new operational approaches and defining the role of the main parameters involved in the position-estimation mechanism, both in markedly static scenarios and in contexts where objects interact dynamically with the surrounding space. Indeed, a characteristic of UWB technology is that it limits the positioning error in indoor localization, thanks to the physical and electrical characteristics of its signals, opening up new application scenarios that are favourable in terms of cost, energy consumption and reduced device complexity.
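As a minimal sketch of the kind of position estimation such a system performs, the following assumes a time-of-arrival range model solved by the standard linearization (subtracting one range equation from the others); the anchor positions and measurements are hypothetical, not from the DEI installation:

```python
# Sketch: 2-D position from UWB range measurements to fixed anchors.
# Subtracting the first range equation from the others turns the
# circle equations into linear ones, solved by least squares.
import numpy as np

def locate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """anchors: (N,2) known positions; ranges: (N,) measured distances."""
    x0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - x0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol

anchors = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0], [6.0, 6.0]])
true_pos = np.array([2.0, 3.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free demo
print(locate(anchors, ranges))  # ~ [2. 3.]
```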
Abstract:
This thesis aims to develop a model for the optimized management of the generation and storage units of an electrical microgrid. As a reference case study, the thesis analyzes a microgrid comprising renewable generation plants, battery energy storage systems (BES: Battery Energy System) and charging stations for electric vehicles. In particular, the charging stations are bidirectional, able to provide both "grid-to-vehicle" (G2V) and "vehicle-to-grid" (V2G) services. The model allows a central dispatching system to define the power that the various distributed resources must inject into or absorb from the grid over the following 24 hours. Dispatching is performed by solving a problem that minimizes operating costs and the energy drawn from the external grid. The problem is formulated using a stochastic linear programming approach in which the uncertain parameters of the model are represented as stochastic processes. The model was implemented in AIMMS, an optimization software package that provides built-in functionality for stochastic programming.
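Schematically, a stochastic linear program of this kind minimizes an expected operating cost over a scenario set $S$ covering the 24-hour horizon; the notation below is an illustrative sketch, not the thesis's exact model:

$$\min \; \sum_{s \in S} \pi_s \sum_{t=1}^{24} \left( c_t^{\mathrm{grid}}\, P_{s,t}^{\mathrm{grid}} \;+\; c^{\mathrm{deg}} \left( P_{s,t}^{\mathrm{ch}} + P_{s,t}^{\mathrm{dis}} \right) \right)$$

subject, in each scenario, to a linear power balance such as $P_{s,t}^{\mathrm{grid}} + P_{s,t}^{\mathrm{res}} + P_{s,t}^{\mathrm{dis}} = L_{s,t} + P_{s,t}^{\mathrm{ch}} + P_{s,t}^{\mathrm{ev}}$ and linear storage dynamics, where $\pi_s$ is the scenario probability and the uncertain renewable output $P^{\mathrm{res}}$ and load $L$ take scenario-specific realizations.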
Abstract:
Dimensional and form inspections are key to the manufacturing and assembly of products. Product verification can involve a number of different measuring instruments operated using their dedicated software. Typically, each of these instruments with its associated software is more suitable for the verification of a pre-specified quality characteristic of the product than others. The number of different systems and software applications needed to perform a complete measurement of products and assemblies within a manufacturing organisation is therefore expected to be large, and it grows larger still as advances in measurement technologies are made. A universal software application for any instrument still appears to be only a theoretical possibility, so a need for information integration is apparent. In this paper, the design of an information system to consistently manage (store, search, retrieve, secure) measurement results from various instruments and software applications is introduced. The proposed system rests on two main ideas. First, the structures and formats of measurement files are abstracted from the data, so that the complexity of, and incompatibility between, different approaches to measurement data modelling are avoided. Secondly, the information within a file is enriched with meta-information to facilitate its consistent storage and retrieval. To demonstrate the designed information system, a web application is implemented. © Springer-Verlag Berlin Heidelberg 2010.
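A minimal sketch of the enrichment idea follows; the field names are hypothetical, not the paper's schema. The instrument's native file stays an opaque payload, while a metadata record carries what consistent storage and retrieval need:

```python
# Sketch: wrap an instrument's native measurement file in a metadata
# record so storage/search never needs to parse the vendor format.
import datetime
import hashlib
import json

def make_record(payload: bytes, filename: str, instrument: str,
                characteristic: str) -> dict:
    """Metadata envelope; the vendor file remains an opaque payload."""
    return {
        "meta": {                                # searchable, format-independent
            "filename": filename,
            "instrument": instrument,
            "characteristic": characteristic,    # e.g. "flatness"
            "acquired": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "sha256": hashlib.sha256(payload).hexdigest(),
        },
        "payload": payload.hex(),                # stored as-is, never parsed
    }

rec = make_record(b"<vendor-specific bytes>", "part42_cmm.dml",
                  "CMM-01", "flatness")
print(json.dumps(rec["meta"], indent=2))
```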
Abstract:
The early Pliocene warm phase was characterized by high sea surface temperatures and a deep thermocline in the eastern equatorial Pacific. A new hypothesis suggests that the progressive closure of the Panamanian seaway contributed substantially to the termination of this zonally symmetric state in the equatorial Pacific. According to this hypothesis, intensification of the Atlantic meridional overturning circulation (AMOC) - induced by the closure of the gateway - was the principal cause of equatorial Pacific thermocline shoaling during the Pliocene. In this study, twelve Panama seaway sensitivity experiments from eight ocean/climate models of different complexity are analyzed to examine the effect of an open gateway on AMOC strength and thermocline depth. All models show an eastward Panamanian net throughflow, leading to a reduction in AMOC strength compared to the corresponding closed-Panama case. In those models that do not include a dynamic atmosphere, deepening of the equatorial Pacific thermocline appears to scale almost linearly with the throughflow-induced reduction in AMOC strength. Models with dynamic atmosphere do not follow this simple relation. There are indications that in four out of five models equatorial wind-stress anomalies amplify the tropical Pacific thermocline deepening. In summary, the models provide strong support for the hypothesized relationship between Panama closure and equatorial Pacific thermocline shoaling.
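The near-linear scaling reported for the fixed-wind models can be written schematically (the coefficient $\alpha$ is illustrative, not estimated in the study) as

$$\Delta D_{\mathrm{tc}} \;\approx\; \alpha \, \Delta \Psi_{\mathrm{AMOC}},$$

where $\Delta D_{\mathrm{tc}}$ is the equatorial Pacific thermocline deepening and $\Delta \Psi_{\mathrm{AMOC}}$ the throughflow-induced reduction in AMOC strength; models with a dynamic atmosphere depart from this relation through wind-stress feedbacks.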
Abstract:
Transient simulations are widely used in studying the past climate as they provide better comparison with any exisiting proxy data. However, multi-millennial transient simulations using coupled climate models are usually computationally very expensive. As a result several acceleration techniques are implemented when using numerical simulations to recreate past climate. In this study, we compare the results from transient simulations of the present and the last interglacial with and without acceleration of the orbital forcing, using the comprehensive coupled climate model CCSM3 (Community Climate System Model 3). Our study shows that in low-latitude regions, the simulation of long-term variations in interglacial surface climate is not significantly affected by the use of the acceleration technique (with an acceleration factor of 10) and hence, large-scale model-data comparison of surface variables is not hampered. However, in high-latitude regions where the surface climate has a direct connection to the deep ocean, e.g. in the Southern Ocean or the Nordic Seas, acceleration-induced biases in sea-surface temperature evolution may occur with potential influence on the dynamics of the overlying atmosphere. The data provided here are from both accelerated and non-accelerated runs as decadal mean values.
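A minimal sketch of the acceleration idea (the factor of 10 is from the abstract; the function and variable names are hypothetical): each simulated model year consumes ten calendar years of orbital forcing, so long forcing trends are traversed in a tenth of the model time.

```python
# Sketch: accelerated orbital forcing (acceleration factor 10).
# Each simulated model year advances the orbital parameters by
# `accel` calendar years, compressing long forcing trends.

ACCEL = 10             # acceleration factor used in the study
START_YEAR = -125_000  # hypothetical start of the forcing window

def forcing_year(model_year: int, accel: int = ACCEL,
                 start: int = START_YEAR) -> int:
    """Calendar year whose orbital parameters drive this model year."""
    return start + accel * model_year

# A 12,500-calendar-year forcing window is covered in 1,250 model years:
assert forcing_year(1_250) == START_YEAR + 12_500
```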
Abstract:
Detailed design of a test bench for testing ADCS systems for CubeSats: Alma Test-Bed (AlmaTB). The work focuses on a first core of AlmaTB capable of testing magnetic attitude control. AlmaTB comprises a Helmholtz cage, an air-bearing system, a test CubeSat and a metrology system. The Helmholtz cage is an apparatus made of three pairs of coils, one per spatial axis, used to cancel the local magnetic field and reproduce the field that will be encountered in orbit around the Earth. A software tool derives the Earth's magnetic field from the IGRF model at given coordinates and altitude and tells the power supplies of the coil set how much current to deliver. The air-bearing system is an air cushion generated by a compressor that recreates the characteristic space-environment conditions of microgravity and near-zero friction; the test CubeSat is mounted on this system. In the first version of AlmaTB, the test CubeSat contains the magnetic sensors and actuators needed to determine and control the attitude of a nanosatellite. Its onboard magnetometer is also used to verify the correct operation of the Helmholtz cage. The metrology system tracks the movements and inclination of the CubeSat, providing the ground-truth attitude reference needed to assess whether the ADCS works correctly. Once the test bench is complete and operational, it will be possible to test attitude determination and control algorithms that use the various sensors and actuators available in the mock-up. Control and data-processing software is installed on a workstation. A "turnkey" approach was chosen, i.e. selecting, where available, complete off-the-shelf systems. The first version of AlmaTB is the result of a substantial effort in matching the different apparatuses.
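A minimal sketch of the per-axis current computation (the coil turns and radius are hypothetical; the target field would come from the IGRF model and the local field from the magnetometer), using the standard on-axis centre field of a Helmholtz pair:

```python
# Sketch: current a Helmholtz coil pair must carry so that the field
# it adds to the local ambient field equals the target (IGRF) field.
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability [T*m/A]

def helmholtz_current(b_target: float, b_local: float,
                      n_turns: int = 50, radius_m: float = 0.5) -> float:
    """Current [A] for one axis. Centre field of a Helmholtz pair:
    B = (4/5)**1.5 * MU0 * N * I / R
    """
    b_coil = b_target - b_local  # field the pair must generate
    return b_coil * radius_m / ((4 / 5) ** 1.5 * MU0 * n_turns)

# Example: target 20 uT along x, local field 45 uT (hypothetical values)
i_x = helmholtz_current(20e-6, 45e-6)
print(f"x-axis pair current: {i_x:+.3f} A")  # negative = reversed polarity
```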
Abstract:
2016 was the breakout year of the virtual reality industry, in which 3D surveying plays an important role and has received increasing attention. This project establishes and optimizes a WebGL three-dimensional playback platform combined with streaming media technology, taking a streaming media server and in-browser panoramic video playback as the application background. It discusses the architecture from the streaming media server to the panoramic media player and analyzes the relevant theoretical problems. The paper focuses on debugging the streaming media platform, building the WebGL player environment, analyzing different types of sphere models, and 3D texture mapping. The main work contains the following points: first, a streaming service platform was built on the EasyDarwin open source streaming media server; it receives an RTSP stream and forwards HLS-sliced video to clients. Then, a WebGL panoramic video player was written based on the Three.js library, with jQuery playback controls in the browser, yielding an HTML5 panoramic video player. Next, the latitude-longitude sphere model from the Three.js library was analyzed with respect to the WebGL rendering method, and its drawbacks and the starting point for improvement were identified. After that, a Schneider sphere projection model was established on the basis of the Schneider transform principle, and the resulting OBJ file was converted to a JS file for the media player to read, finally achieving plugin-free, real-time, high-precision panoramic video playback. Finally, the whole project is summarized, and directions for future optimization and market extension are proposed.
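For reference, a latitude-longitude sphere model of the kind analyzed maps each equirectangular video texel $(u, v) \in [0,1]^2$ onto the unit sphere (notation ours, not the paper's):

$$\theta = 2\pi u, \qquad \phi = \pi\!\left(v - \tfrac{1}{2}\right), \qquad (x, y, z) = (\cos\phi\cos\theta,\; \sin\phi,\; \cos\phi\sin\theta).$$

Equal steps in $(u, v)$ crowd together near the poles, a well-known drawback of this mapping and a plausible motivation for an alternative projection model.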
Abstract:
Artificial Immune Systems have been used successfully to build recommender systems for film databases. In this research, an attempt is made to extend this idea to web site recommendation. A collection of more than 1000 individuals' web profiles (alternatively called preferences / favourites / bookmarks file) will be used. URLs will be classified using the DMOZ (Directory Mozilla) database of the Open Directory Project as our ontology. This will then be used as the data for the Artificial Immune Systems rather than the actual addresses. The first attempt will involve using a simple classification code number coupled with the number of pages within that classification code. However, this implementation does not make use of the hierarchical tree-like structure of DMOZ. Consideration will then be given to the construction of a similarity measure for web profiles that makes use of this hierarchical information to build a better-informed Artificial Immune System.
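A minimal sketch of one hierarchy-aware similarity of the kind proposed (the path format and normalisation are hypothetical): score two DMOZ category paths by the depth of their longest shared prefix, so that siblings deep in the tree count as more similar than topics that diverge near the root.

```python
# Sketch: hierarchy-aware similarity between two DMOZ category paths,
# e.g. "Top/Computers/Software/Operating_Systems". A flat classification
# code ignores the tree; sharing a deep prefix should count for more.

def dmoz_similarity(path_a: str, path_b: str) -> float:
    """Shared-prefix depth normalised by the deeper path's depth (0..1)."""
    a, b = path_a.split("/"), path_b.split("/")
    shared = 0
    for seg_a, seg_b in zip(a, b):
        if seg_a != seg_b:
            break
        shared += 1
    return shared / max(len(a), len(b))

print(dmoz_similarity(
    "Top/Computers/Software/Operating_Systems",
    "Top/Computers/Software/Databases",
))  # 0.75 -- siblings under Software score higher than unrelated topics
```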
Abstract:
The OPIT program is briefly described. OPIT is a basis-set-optimising, self-consistent field, molecular orbital program for calculating properties of closed-shell ground states of atoms and molecules. A file handling technique is then put forward which enables core storage to be used efficiently in large FORTRAN scientific application programs. Hashing and list processing techniques, of the type frequently used in writing system software and computer operating systems, are here applied to the creation of data files (integral label and value lists etc.). Files consist of a chained series of blocks which may exist in core or on backing store or both. Efficient use of core store is achieved, and the processes of file deletion, file re-writing and garbage collection of unused blocks can be easily arranged. The scheme is exemplified with reference to the OPIT program. A subsequent paper will describe a job scheduling scheme for large programs of this sort.
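A minimal sketch of the chained-block idea in modern terms (names are ours; OPIT itself is FORTRAN): a file is a linked chain of fixed-size blocks, each resident in core or on backing store, with deleted chains recycled through a free list.

```python
# Sketch: a file as a chained series of fixed-size blocks; freed chains
# go to a free list (the "garbage collection of unused blocks" above).

BLOCK_SIZE = 512

class Block:
    def __init__(self):
        self.data = []        # e.g. integral label/value pairs
        self.next = None      # next block in the chain, or None
        self.in_core = True   # False once evicted to backing store

class BlockPool:
    def __init__(self):
        self.free = []        # free list of recycled blocks

    def allocate(self):
        return self.free.pop() if self.free else Block()

    def delete(self, file):
        blk = file.head       # file deletion: recycle the whole chain
        while blk:
            blk.data, nxt = [], blk.next
            self.free.append(blk)
            blk = nxt

class BlockFile:
    def __init__(self, pool):
        self.pool = pool
        self.head = self.tail = pool.allocate()

    def append(self, record):
        if len(self.tail.data) >= BLOCK_SIZE:
            self.tail.next = self.pool.allocate()   # extend the chain
            self.tail = self.tail.next
        self.tail.data.append(record)

pool = BlockPool()
f = BlockFile(pool)
for i in range(2000):
    f.append(i)
pool.delete(f)   # chain recycled; the next allocate() reuses its blocks
```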
Abstract:
This paper deals with the development and the analysis of asymptotically stable and consistent schemes in the joint quasi-neutral and fluid limits for the collisional Vlasov-Poisson system. In these limits, classical explicit schemes suffer from time step restrictions due to the small plasma period and Knudsen number. To solve this problem, we propose a new scheme that is stable for time steps independent of the small-scale dynamics and has a computational cost comparable to that of standard explicit schemes. In addition, this scheme reduces automatically to consistent discretizations of the underlying asymptotic systems. In this first work on the subject, we propose a first-order-in-time scheme and perform a relative linear stability analysis for such problems. The framework we propose permits extending this approach to high-order schemes in the near future. We finally show the capability of the method in dealing with small scales through numerical experiments.
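Schematically, a first-order scheme of this type treats the stiff field and collision terms implicitly while keeping transport explicit; the following is a generic sketch in our notation, not the authors' exact scheme:

$$\frac{f^{n+1} - f^n}{\Delta t} + v \cdot \nabla_x f^n + E^{n+1} \cdot \nabla_v f^{n+1} = \frac{1}{\varepsilon}\, Q\!\left(f^{n+1}\right),$$

with the field $E^{n+1}$ obtained from an implicitly advanced Poisson equation. The implicit treatment of the stiff terms is what removes the time-step restrictions tied to the plasma period and the Knudsen number $\varepsilon$.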
Abstract:
Part 20: Health and Care Networks
Abstract:
Part 17: Risk Analysis