952 results for Data-stream balancing


Relevance:

30.00%

Publisher:

Abstract:

The wide diffusion of cheap, small, and portable sensors, integrated into a vast variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to provisioning differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, performing a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
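The idea of partial fault tolerance under a cost budget can be sketched in a few lines. This is a hypothetical illustration, not the actual LAAR algorithm (whose details are in the thesis): given per-operator importance weights and replica costs, replicate the most important operators first until the budget is exhausted.

```python
def choose_replicas(operators, budget):
    """Hypothetical partial fault-tolerance sketch (not the real LAAR
    technique): operators maps name -> (importance, replica_cost).
    Replicate operators in decreasing order of importance while the
    total replication cost stays within the budget."""
    replicated = set()
    spent = 0
    for name, (importance, cost) in sorted(
            operators.items(), key=lambda kv: -kv[1][0]):
        if spent + cost <= budget:
            replicated.add(name)
            spent += cost
    return replicated
```

Operator names, weights, and costs here are illustrative; the point is only that weaker guarantees let the system skip replicating low-importance operators and save resources.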

Abstract:

Big Data have driven new technologies that improve quality of life by combining heterogeneous data representations across disciplines. A real-time system capable of processing data as it arrives is therefore needed. Such a system is called the speed layer; as the name suggests, it is designed to guarantee that new data are returned by the query functions as quickly as they arrive. This thesis concerns the implementation of an architecture modeled on the Speed Layer of the Lambda Architecture, capable of receiving meteorological data published on an MQTT queue, processing it in real time, and storing it in a database to make it available to data scientists. The programming environment used is Java; the project was deployed on the Hortonworks platform, which is based on the Hadoop framework and on the Storm computation system, which makes it possible to work with unbounded data streams and perform processing in real time. Unlike traditional stream-processing approaches built from networks of queues and workers, Storm is fault-tolerant and scalable. The development effort dedicated to it by the Apache Software Foundation, its growing use in production by major companies, and the support from cloud hosting providers all suggest that this technology will become an increasingly popular solution for managing distributed, event-oriented computations. To store and analyze these volumes of data, which have always posed a problem beyond the reach of traditional databases, a non-relational database was used: HBase.
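The speed-layer data path described above can be sketched with a minimal, self-contained toy (it assumes nothing about the thesis code, which uses Java, Storm, and HBase): a producer thread stands in for the MQTT queue, a consumer processes each reading as it arrives, and a plain dict stands in for the HBase store.

```python
import queue
import threading

def speed_layer_demo(readings):
    """Toy speed layer: a producer publishes (station, temperature)
    readings onto a queue (standing in for the MQTT broker); a consumer
    processes each reading as it arrives and appends it to a dict
    (standing in for the HBase table)."""
    q = queue.Queue()
    store = {}                      # station -> list of temperatures

    def producer():
        for r in readings:
            q.put(r)
        q.put(None)                 # poison pill: signal end of stream

    def consumer():
        while True:
            r = q.get()             # blocks until data is available
            if r is None:
                break
            station, temp = r
            store.setdefault(station, []).append(temp)

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return store
```

In the real system, Storm's spouts and bolts replace these threads and provide fault tolerance and horizontal scaling, which a single-process queue cannot.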

Abstract:

Large volumes of data are generated every day from diverse sources. These data, called Big Data, are currently the object of strong interest in the IT (Information Technology) sector. Digitized processes, social media interactions, sensors, and the mobile systems we use every day are only a small subset of all the sources that contribute to producing these data. Many technologies have been developed to analyze and extract information from these large data volumes, and many of them exploit distributed and parallel approaches. One of the most successful technologies for processing Big Data is Apache Hadoop. Cloud computing, in particular solutions following the IaaS (Infrastructure as a Service) model, provides an effective tool for provisioning resources simply and quickly. For this reason, this proposal uses OpenStack as the IaaS platform. By integrating OpenStack and Hadoop through Sahara, the potential of a cloud environment can be exploited to improve the performance of distributed and parallel processing. The goal of this work is to achieve a better distribution of the resources used in the cloud system, with load balancing as the objective. To reach these goals, modifications to both the Hadoop framework and the Sahara project were necessary.

Abstract:

In-stream structures, including cross-vanes, J-hooks, rock vanes, and W-weirs, are widely used in river restoration to limit bank erosion, prevent changes in channel gradient, and improve aquatic habitat. During this investigation, a rapid assessment protocol was combined with post-project monitoring data to assess factors influencing the performance of more than 558 in-stream structures and rootwads in North Carolina. Cross-sectional survey data examined for 221 cross sections from 26 sites showed that channel adjustments were highly variable from site to site, but approximately 60% of the sites underwent at least a 20% net change in channel capacity. Evaluation of in-stream structures ranging from 1 to 8 years in age showed that about half of the structures were impaired at 10 of the 26 sites. Major structural damage was often associated with floods of low to moderate frequency and magnitude. Failure mechanisms varied between sites and structure types, but included: (1) erosion of the channel bed and banks (outflanking); (2) movement of rock materials during floods; and (3) burial of the structures in the channel bed. Sites with reconstructed channels that exhibited large changes in channel capacity had the highest rates of structural impairment, suggesting that channel adjustments between structures led to the degradation of their function. The data call into question whether currently used in-stream structures can stabilize reconfigured channels for even short periods when applied to dynamic rivers.

Abstract:

The aim of this study was to evaluate the survival and success rates of immediately restored implants with sandblasted, large-grit, acid-etched (SLA) surfaces over a period of 5 years. Twenty patients (mean age, 47.3 years) received a total of 21 SLA wide-neck implants in healed mandibular first molar sites after initial periodontal treatment. To be included in the study, the implants had to demonstrate primary stability with an insertion torque value of 35 Ncm. A provisional restoration was fabricated chairside and placed on the day of surgery. Definitive cemented restorations were inserted 8 weeks after surgery. Community Periodontal Index of Treatment Needs (CPITN) indices and the radiographic distance between the implant shoulder and the first visible bone-implant contact (DIB) were measured and compared over the study period. The initial mean CPITN was 3.24, and decreased over the study period to 1.43. At the postoperative radiographic examination, the mean DIB was 1.41 mm for the 21 implants, indicating that part of the machined neck of the implants was placed slightly below the osseous crest. The mean DIB value increased to 1.99 mm at the 5-year examination. This increase proved to be statistically significant (P < .0001). Between the baseline and 5-year examinations, the mean bone crest level loss was 0.58 mm. Success and survival rates of the 21 implants after 5 years of function were 100%. This 5-year study confirms that immediate restoration of mandibular molar wide-neck implants with good primary stability, as noted by insertion torque values of at least 35 Ncm, is a safe and predictable procedure.

Abstract:

Virtualization has become a common abstraction layer in modern data centers. By multiplexing hardware resources into multiple virtual machines (VMs) and thus enabling several operating systems to run on the same physical platform simultaneously, it can effectively reduce power consumption and building size, or improve security by isolating VMs. In a virtualized system, memory resource management plays a critical role in achieving high resource utilization and performance. Insufficient memory allocation to a VM will degrade its performance dramatically. Conversely, over-allocation wastes memory resources. Meanwhile, a VM's memory demand may vary significantly. As a result, effective memory resource management calls for a dynamic memory balancer, which, ideally, can adjust memory allocation in a timely manner for each VM based on its current memory demand and thus achieve the best memory utilization and the optimal overall performance. In order to estimate the memory demand of each VM and to arbitrate possible memory resource contention, a widely proposed approach is to construct an LRU-based miss ratio curve (MRC), which provides not only the current working set size (WSS) but also the correlation between performance and the target memory allocation size. Unfortunately, the cost of constructing an MRC is nontrivial. In this dissertation, we first present a low-overhead LRU-based memory demand tracking scheme, which includes three orthogonal optimizations: AVL-based LRU organization, dynamic hot set sizing, and intermittent memory tracking. Our evaluation results show that, for the whole SPEC CPU 2006 benchmark suite, after applying the three optimizing techniques, the mean overhead of MRC construction is lowered from 173% to only 2%. Based on the current WSS, we then predict its trend in the near future and adopt different strategies for different prediction results.
When there is a sufficient amount of physical memory on the host, it locally balances its memory resource for the VMs. Once the local memory resource is insufficient and the memory pressure is predicted to sustain for a sufficiently long time, a relatively expensive solution, VM live migration, is used to move one or more VMs from the hot host to other host(s). Finally, for transient memory pressure, a remote cache is used to alleviate the temporary performance penalty. Our experimental results show that this design achieves 49% center-wide speedup.
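The LRU-based miss ratio curve at the heart of the scheme above can be illustrated with a minimal sketch. This naive version walks a list to find each reference's stack distance, which is exactly the O(n·m) cost that the dissertation's AVL-based organization and sampling optimizations are designed to avoid; the trace and sizes are illustrative.

```python
def miss_ratio_curve(trace, max_size):
    """Naive LRU miss ratio curve from a page-reference trace.

    For each reference, the stack (reuse) distance is the number of
    distinct pages touched since the last access to the same page; an
    LRU cache of size c hits exactly when that distance is <= c.
    """
    stack = []                       # most-recently-used page first
    hist = [0] * (max_size + 1)      # hist[d] = refs with stack distance d
    cold = 0                         # first-time references always miss
    for page in trace:
        if page in stack:
            d = stack.index(page) + 1    # 1-based stack distance
            if d <= max_size:
                hist[d] += 1
            stack.remove(page)
        else:
            cold += 1
        stack.insert(0, page)        # page becomes most recently used
    n = len(trace)
    # mrc[c-1] = fraction of references missing in an LRU cache of size c
    hits = 0
    mrc = []
    for c in range(1, max_size + 1):
        hits += hist[c]
        mrc.append((n - hits) / n)
    return mrc
```

A balancer can read the knee of this curve to pick the smallest allocation whose predicted miss ratio is acceptable, which is the correlation between memory size and performance that the dissertation exploits.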

Abstract:

Discussion of a new, innovative method for dating rocks, called laser ablation split stream (LASS) petrochronology: an in situ method that couples geochronological and geochemical data from minerals that remain in the rock matrix. The talk focuses on the application of this technique to U-Th-Pb dating of the phosphate minerals monazite and xenotime in metamorphic rocks. Examples from the Ruby Range in southwestern Montana and metamorphic core complexes in the northern Idaho panhandle will be explored.

Abstract:

Major episodic acidifications were observed on several occasions in first-order brooks at Acadia National Park, Mount Desert Island, Maine. Short-term declines of up to 2 pH units and 130 μeq L⁻¹ of acid-neutralizing capacity were caused by HCl from soil solutions, rather than by H2SO4 or HNO3 from precipitation, because (1) SO4 concentrations were constant or decreased during the pH depression, (2) Cl concentrations were greatest at the time of lowest pH, and (3) Na:Cl ratios decreased from values much greater than those in precipitation (a result of chemical weathering) to values equal to or less than those in precipitation. Dilution, increases in NO3 concentrations, or increased export of organic acidity from soils were insufficient to cause the observed decreases in pH. These data represent surface water acidifications due primarily to an ion exchange "salt effect" of Na+ for H+ in soil solution, and secondarily to dilution, neither of which is a consequence of acidic deposition. The requisite conditions for a major episodic salt-effect acidification include acidic soils and either an especially salt-laden wet precipitation event or a period of accumulation of marine salts from dry deposition, followed by wet inputs.

Abstract:

Lesni Potok stream drains a forested headwater catchment in the central Czech Republic. It was artificially acidified with hydrochloric acid (HCl) for four hours to assess the role of the stream substrate in acid neutralisation and recovery. The pH was lowered from 4.7 to 3.2. Desorption of Ca and Mg and desorption or dissolution of Al dominated acid neutralisation; Al mobilisation was more important later. The stream substrate released 4,542 meq Ca, 1,184 meq Mg, and 2,329 meq Al over a 45 m long and 1 m wide stream segment; smaller amounts of Be, Cd, Fe, and Mn were released. Adsorption of SO42- and desorption of F- occurred during the acidification phase of the experiment. The exchange reactions were rapidly reversible for Ca, Mg, and SO42-, but not symmetric, as the substrate resorbed 1,083, 790, and 0 meq of Ca, Mg, and Al, respectively, in a 4-hour recovery period. Desorption of SO42- occurred during the resorption of Ca and Mg. These exchange and dissolution reactions delay acidification, diminish the pH depression, and retard recovery from episodic acidification. The behaviour of the stream substrate-water interaction resembles that of soil-soil water interactions. A mathematical dynamic mass-balance model, MASS (Modelling Acidification of Stream Sediments), was developed; it simulates the adsorption and desorption of base cations during the experiment and was successfully calibrated to the experimental data.
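The kind of reversible adsorption/desorption mass balance that a model like MASS simulates can be sketched as a two-box explicit-Euler system; the rate constants and initial pools below are hypothetical, not calibrated values from the Lesni Potok experiment.

```python
def exchange_step(c, s, k_des, k_ads, dt):
    """One explicit-Euler step of reversible exchange between stream
    water (dissolved amount c) and substrate (adsorbed amount s):
        dc/dt = k_des*s - k_ads*c
        ds/dt = k_ads*c - k_des*s
    Total mass c + s is conserved by construction."""
    flux = (k_des * s - k_ads * c) * dt   # net desorption into the water
    return c + flux, s - flux

def simulate(c0, s0, k_des, k_ads, dt, steps):
    """Step the exchange forward in time from initial pools c0, s0."""
    c, s = c0, s0
    for _ in range(steps):
        c, s = exchange_step(c, s, k_des, k_ads, dt)
    return c, s
```

Run toward equilibrium, the dissolved fraction settles at k_des/(k_des + k_ads) of the total pool; asymmetry like that observed in the experiment would require different rate constants (or capacities) for the acidification and recovery phases.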

Abstract:

When genetic constraints restrict phenotypic evolution, diversification can be predicted to evolve along so-called lines of least resistance. To address the importance of such constraints and their resolution, studies of parallel phenotypic divergence that differ in age are valuable. Here, we investigate the parapatric evolution of six lake-and-stream threespine stickleback systems from Iceland and Switzerland, ranging in age from a few decades to several millennia. Using phenotypic data, we test for parallelism in ecotypic divergence between parapatric lake and stream populations and compare the observed patterns to an ancestral-like marine population. We find strong and consistent phenotypic divergence, both among lake and stream populations and between our freshwater populations and the marine population. Interestingly, ecotypic divergence in low-dimensional phenotype space (i.e. single traits) is rapid and often seems to be completed within 100 years. Yet the dimensionality of ecotypic divergence was highest in our oldest systems, and only there was the parallel evolution of unrelated ecotypes strong enough to overwrite phylogenetic contingency. Moreover, the dimensionality of divergence varies between trait complexes across systems, suggesting different constraints and different evolutionary pathways to their resolution among freshwater systems.

Abstract:

For a reliable simulation of the time- and space-dependent CO2 redistribution between ocean and atmosphere, an appropriate time-dependent simulation of particle dynamics processes is essential, but had not been carried out so far. The major difficulties were the lack of suitable modules for particle dynamics and early diagenesis (needed to close the carbon and nutrient budgets) in ocean general circulation models, and the lack of understanding of biogeochemical processes such as the partial dissolution of calcareous particles in oversaturated water. The main target of ORFOIS was to fill this gap in our knowledge and prediction capability. This goal was achieved step by step. First, comprehensive databases of existing observations relevant to the three major types of biogenic particles, organic carbon (POC), calcium carbonate (CaCO3), and biogenic silica (BSi, or opal), as well as to refractory particles of terrestrial origin, were collated and made publicly available.