920 results for BigData StreamProcessing ApacheStorm Storm ApacheCassandra Eddystone XMPP LoadBalancing Metric
Abstract:
Big data is the term used to describe a collection of data so large in terms of volume, velocity and variety that specific technologies and analytical methods are required to extract meaningful value from it. More and more systems are built around, and characterised by, enormous amounts of data to be managed, originating from highly heterogeneous sources, in widely differing formats and with very uneven data quality. Another requirement in these systems may be the time factor: a growing number of systems need to obtain meaningful results from Big Data as quickly as possible, and the input to be handled is increasingly a continuous stream of information. Online Stream Processing solutions target exactly these cases. The goal of this thesis is to propose a working prototype that processes Instant Coupon data coming from different sources, with different information formats and transmission protocols, and stores the processed data efficiently so that responses can be provided in real time. The information sources can be of two types: XMPP and Eddystone. Once the incoming information has been received, the system extracts and processes it until meaningful data are obtained that can be used by third parties. These data are stored on Apache Cassandra. The biggest problem to be solved is that Apache Storm does not rebalance resources automatically, whereas in this specific case the distribution of customers during the day varies greatly and is full of peaks. The internal rebalancing system exploits innovative technologies such as metrics and, based on throughput and execution latency, decides whether to increase or decrease the number of resources, or simply to do nothing if the statistics are within the desired threshold values.
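The threshold logic described in this abstract might look roughly like the sketch below. This is a minimal illustration, not the thesis code: the metric names, threshold parameters and the Action type are assumptions introduced here for clarity.

```java
// Illustrative sketch of a threshold-based scaling decision driven by Storm metrics.
// Thresholds and metric semantics are hypothetical; real values would come from the
// topology's built-in metrics (tuples acked per second, execute latency, etc.).
public class RebalanceDecision {

    enum Action { SCALE_UP, SCALE_DOWN, NONE }

    static Action decide(double throughputTuplesPerSec, double executeLatencyMs,
                         double minThroughput, double maxLatencyMs) {
        if (executeLatencyMs > maxLatencyMs) {
            return Action.SCALE_UP;   // component is saturated: add executors
        }
        if (throughputTuplesPerSec < minThroughput && executeLatencyMs < maxLatencyMs / 2) {
            return Action.SCALE_DOWN; // resources are idle: release executors
        }
        return Action.NONE;           // statistics are within the desired thresholds
    }

    public static void main(String[] args) {
        // Example: high latency during a customer peak triggers a scale-up decision.
        System.out.println(decide(1200.0, 85.0, 500.0, 50.0)); // prints SCALE_UP
    }
}
```

In Storm itself, the chosen action would then typically be applied by rebalancing the running topology, for example via the `storm rebalance` command, which changes the number of workers or the parallelism of individual components.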
Abstract:
An essential parameter for evaluating any product or service nowadays is web reputation. A growing number of companies monitor their "online reputation", which can be defined as the set of messages, comments and feedback, whether positive, neutral or negative, posted by users who express their opinion on the web about a particular product or service offered to the public. The application developed here aims to analyse data from heterogeneous sources in real time using Apache Storm, classify them with KNIME using classification techniques such as SVM, decision trees and Naive Bayes, persist them in the NoSQL database HBase, and visualise them in real time through charts served by servlets, so as to provide a valid decision-support tool.
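As a rough illustration of one stage of such a topology, the sketch below shows a Storm bolt that labels an incoming comment and emits it for a downstream persistence bolt. The class name, field names, keyword rule and the Storm 2.x-style prepare signature are assumptions made here; in the actual application the classification is performed with models built in KNIME and the results are persisted to HBase.

```java
import java.util.Map;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

// Hypothetical classification bolt: reads a raw comment, assigns a sentiment label,
// and emits (text, sentiment) for a later bolt that writes to HBase.
public class SentimentBolt extends BaseRichBolt {
    private OutputCollector collector;

    @Override
    public void prepare(Map<String, Object> conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public void execute(Tuple input) {
        String text = input.getStringByField("text");
        String label = classify(text);            // placeholder for the trained KNIME model
        collector.emit(input, new Values(text, label));
        collector.ack(input);
    }

    // Trivial keyword rule standing in for the SVM / decision tree / Naive Bayes scoring.
    private String classify(String text) {
        return text.toLowerCase().contains("bad") ? "negative" : "positive";
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("text", "sentiment"));
    }
}
```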
Abstract:
With the advent of the Internet, the number of users with effective access to the network, and the ability to share information with the whole world, has grown steadily over the years. With the introduction of social media, moreover, users are led to transfer large amounts of personal information to the web, making it available to companies. In addition, the Internet of Things, in which sensors and machines become agents on the network, means that each user owns a growing number of devices connected directly to one another and to the global network. In proportion to these factors, the amount of data generated and stored is increasing dramatically, giving rise to a new concept: Big Data. This in turn creates the need for new tools that can exploit the computing power offered today by more complex architectures, which gather, under a single system, a set of hosts dedicated to analysis. Such a vast quantity of data, routine when speaking of Big Data, combined with equally high transmission and transfer speeds, makes storage difficult, all the more so if the storage technique is a traditional DBMS. A classic relational solution would in fact allow data to be processed only on demand, producing delays, significant latencies and the inevitable loss of fractions of the dataset. It is therefore necessary to turn to new technologies and tools suited to needs that differ from classic batch analysis. In particular, this thesis addresses Data Stream Processing, designing and prototyping a system based on Apache Storm, with cyber security chosen as the field of application.
Abstract:
A computational fluid dynamics (CFD) analysis has been performed for a flat plate photocatalytic reactor using the CFD code FLUENT. Under the simulated conditions (Reynolds number, Re, around 2650), a detailed time-accurate computation shows the different stages of flow evolution and the effects of the finite length of the reactor in creating flow instability, which is important to improve the performance of the reactor for storm and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on reactor hydrodynamics and configurations. This study aims to investigate the role of different parameters on the optimization of the reactor design for its improved performance. In this regard, more modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat plate photocatalytic reactor.
Abstract:
In recent years, there has been a significant amount of research and development in the area of solar photocatalysis. This paper reviews and summarizes the mechanism of the photocatalytic oxidation process, types of photocatalyst, the factors influencing photoreactor efficiency, and the most recent findings related to solar detoxification and disinfection of water contaminants. Various solar reactors for photocatalytic water purification are also briefly described. The future potential of solar photocatalysis for storm water treatment and reuse is also discussed to ensure sustainable use of solar energy and storm water resources.
Abstract:
The heterogeneous photocatalytic oxidation process offers a versatile promise in the detoxification and disinfection of wastewater containing hazardous organic compounds such as pesticides and phenolic compounds in storm and wastewater effluent. This process has gained wide attention due to its effectiveness in degrading and mineralizing the organic compounds into harmless and often useful components. To develop an efficient photocatalytic process, titanium dioxide has been actively studied in recent years due to its excellent performance as a photocatalyst under UV light irradiation. This paper aims at critically evaluating and highlighting the recent developments of the heterogeneous photocatalytic systems with a special focus on storm and wastewater treatment applications.
Abstract:
Particle number concentrations and size distributions, visibility and particulate mass concentrations and weather parameters were monitored in Brisbane, Australia, on 23 September 2009, during the passage of a dust storm that originated 1400 km away in the dry continental interior. The dust concentration peaked at about mid-day when the hourly average PM2.5 and PM10 values reached 814 and 6460 µg m⁻³, respectively, with a sharp drop in atmospheric visibility. A linear regression analysis showed a good correlation between the coefficient of light scattering by particles (Bsp) and both PM10 and PM2.5. The particle number in the size range 0.5–20 µm exhibited a lognormal size distribution with modal and geometrical mean diameters of 1.6 and 1.9 µm, respectively. The modal mass was around 10 µm with less than 10% of the mass carried by particles smaller than 2.5 µm. The PM10 fraction accounted for about 68% of the total mass. By mid-day, as the dust began to increase sharply, the ultrafine particle number concentration fell from about 6×10³ cm⁻³ to 3×10³ cm⁻³ and then continued to decrease to less than 1×10³ cm⁻³ by 14:00, showing a power-law decrease with Bsp with an R² value of 0.77 (p < 0.01). Ultrafine particle size distributions also showed a significant decrease in number during the dust storm. This is the first scientific study of particle size distributions in an Australian dust storm.
Abstract:
In September 2009 an enormous dust storm swept across eastern Australia. Dust is potentially hazardous to health as it interferes with breathing, and previous dust storms have been linked to increased risks of asthma and even death. We examined whether the 2009 Australian dust storm changed the volume or characteristics of emergency admissions to hospital. We used an observational study design, using time series analyses to examine changes in the number of admissions, and case-only analyses to examine changes in the characteristics of admissions. The admission data were from the Prince Charles Hospital, Brisbane, between 1 January 2009 and 31 October 2009. There was a 39% increase in emergency admissions associated with the storm (95% confidence interval: 5, 81%), which lasted for just one day. The health effects of the storm could not be detected using particulate matter levels. We found no significant change in the characteristics of admissions during the storm, specifically there was no increase in respiratory admissions. The dust storm had a short-lived impact on emergency hospital admissions. This may be because the public took effective avoidance measures, or because the dust was simply not toxic, being mainly composed of soil. Emergency departments should be prepared for a short-term increase in admissions during dust storms.
Abstract:
The multifractal properties of two indices of geomagnetic activity, Dst (representative of low latitudes) and ap (representative of the global geomagnetic activity), with the solar X-ray brightness, Xl, during the period from 1 March 1995 to 17 June 2003 are examined using multifractal detrended fluctuation analysis (MF-DFA). The h(q) curves of Dst and ap in the MF-DFA are similar to each other, but they are different from that of Xl, indicating that the scaling properties of Xl are different from those of Dst and ap. Hence, one should not predict the magnitude of magnetic storms directly from solar X-ray observations. However, a strong relationship exists between the classes of the solar X-ray irradiance (the classes being chosen to separate solar flares of class X-M, class C, and class B or less, including no flares) in hourly measurements and the geomagnetic disturbances (large to moderate, small, or quiet) seen in Dst and ap during the active period. Each time series was converted into a symbolic sequence using three classes. The frequency, yielding the measure representations, of the substrings in the symbolic sequences then characterizes the pattern of space weather events. Using the MF-DFA method and traditional multifractal analysis, we calculate the h(q), D(q), and τ(q) curves of the measure representations. The τ(q) curves indicate that the measure representations of these three indices are multifractal. On the basis of this three-class clustering, we find that the h(q), D(q), and τ(q) curves of the measure representations of these three indices are similar to each other for positive values of q. Hence, a positive flare-storm class dependence is reflected in the scaling exponents h(q) in the MF-DFA and the multifractal exponents D(q) and τ(q). This finding indicates that the use of the solar flare classes could improve the prediction of the Dst classes.
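As background for readers unfamiliar with the notation, the quantities h(q), τ(q) and D(q) above are connected by the standard MF-DFA relations, quoted here from the usual formulation of the method rather than from the paper itself:

\[
F_q(s) = \left\{ \frac{1}{2N_s} \sum_{\nu=1}^{2N_s} \left[ F^2(s,\nu) \right]^{q/2} \right\}^{1/q} \sim s^{h(q)},
\qquad
\tau(q) = q\,h(q) - 1,
\qquad
D(q) = \frac{\tau(q)}{q-1},
\]

where F²(s, ν) is the detrended variance of segment ν at scale s, so that h(q) is read off from the scaling of the fluctuation function F_q(s) with s.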
Abstract:
Ingredients: - 1 cup Vision - 100ml ‘Real World’ Application - 100ml Unit Structure/Organisation - 100ml Student-centric Approach [optional: Add Social Media/Popular Culture for extra goodness] - Large Dollop of Passion + Enthusiasm - Sprinkle of Approachability Mix all ingredients well. Cover and leave to rise in a Lecture Theatre for 1.5 hours. Cook in a Classroom for 1.5 hours. Garnish with a dash of Humour before serving. Serves 170 Students
Abstract:
The characteristics of dust particles deposited during the 2009 dust storm in the Gold Coast and Brisbane regions of Australia are discussed in this paper. The study outcomes provide important knowledge in relation to the potential impacts of dust storm related pollution on ecosystem health, in the context that the frequency of dust storms is predicted to increase due to anthropogenic desert surface modifications and climate change impacts. The investigated dust storm contributed a large fraction of fine particles to the environment with an increased amount of total suspended solids, compared to dry deposition under ambient conditions. Although the dust storm passed over forested areas, the organic carbon content in the dust was relatively low. The primary metals present in the dust storm deposition were aluminium, iron and manganese, which are common soil minerals in Australia. The dust storm deposition did not contain significant loads of nickel, cadmium, copper and lead, which are commonly present in the urban environment. Furthermore, the comparison between the ambient and dust storm chromium and zinc loads suggested that these metals were contributed to the dust storm by local anthropogenic sources. The potential ecosystem health impacts of the 2009 dust storm include increased fine solids deposition on ground surfaces, resulting in an enhanced capacity to adsorb toxic pollutants, as well as increased aluminium, iron and manganese loads. In contrast, the ecosystem health impacts related to organic carbon and other metals from dust storm atmospheric deposition are not considered to be significant.
Abstract:
The policy objectives of the continuous disclosure regime augmented by the misleading or deceptive conduct provisions in the Corporations Act are to enhance the integrity and efficiency of Australian capital markets by ensuring equality of opportunity for all investors through public access to accurate and material company information to enable them to make well-informed investment decisions. This article argues that there were failures by the regulators in the performance of their roles to protect the interests of investors in Forrest v ASIC; FMG v ASIC (2012) 247 CLR 486: ASX failed to enforce timely compliance with the continuous disclosure regime and ensure that the market was properly informed by seeking immediate clarification from FMG as to the agreed fixed price and/or seeking production of a copy of the CREC agreement; and ASIC failed to succeed in the High Court because of the way it pleaded its case. The article also examines the reasoning of the High Court in Forrest v ASIC and whether it might have changed previous understandings of the Campomar test for determining whether representations directed to the public generally are misleading.
Abstract:
Background: How accurately do people perceive extreme water speeds and how does their perception affect perceived risk? Prior research has focused on the characteristics of moving water that can reduce human stability or balance. The current research presents the first experiment on people's perceptions of risk and moving water at different speeds and depths. Methods: Using a randomized within-person 2 (water depth: 0.45, 0.90 m) × 3 (water speed: 0.4, 0.8, 1.2 m/s) experiment, we immersed 76 people in moving water and asked them to estimate water speed and the risk they felt. Results: Multilevel modeling showed that people increasingly overestimated water speeds as actual water speeds increased or as water depth increased. Water speed perceptions mediated the direct positive relationship between actual water speeds and perceptions of risk; the faster the moving water, the greater the perceived risk. Participants' prior experience with rip currents and tropical cyclones moderated the strength of the actual–perceived water speed relationship; consequently, mediation was stronger for people who had experienced no rip currents or fewer storms. Conclusions: These findings provide a clearer understanding of water speed and risk perception, which may help communicate the risks associated with anticipated floods and tropical cyclones.
Abstract:
The occurrence of extreme water levels along low-lying, highly populated and/or developed coastlines can lead to considerable loss of life and billions of dollars of damage to coastal infrastructure. It is therefore vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood management, engineering and future land-use planning. This ensures that the risks of catastrophic structural failure due to under-design and of expensive waste due to over-design are minimised. This paper estimates, for the first time, present-day extreme water level exceedance probabilities around the whole coastline of Australia. A high-resolution depth-averaged hydrodynamic model has been configured for the Australian continental shelf region and forced with tidal levels from a global tidal model and meteorological fields from a global reanalysis to generate a 61-year hindcast of water levels. Output from this model has been successfully validated against measurements from 30 tide gauge sites. At each coastal grid point of the model, extreme value distributions have been fitted to the derived time series of annual maxima and of the several largest water levels each year to estimate exceedance probabilities. This provides a reliable estimate of water level probabilities around southern Australia, a region mainly impacted by extra-tropical cyclones. However, as the meteorological forcing only weakly includes the effects of tropical cyclones, extreme water level probabilities are underestimated around the western, northern and north-eastern Australian coastline. In a companion paper we build on the work presented here and more accurately include tropical cyclone-induced surges in the estimation of extreme water levels. The multi-decadal hindcast generated here has been used primarily to estimate extreme water level exceedance probabilities but could be used more widely in the future for a variety of other research and practical applications.
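As background on the extreme value fitting mentioned above (the abstract does not name the distribution family, so the generalised extreme value (GEV) form below is only the conventional choice for annual maxima, not necessarily the one used in the paper):

\[
G(z) = \exp\left\{ -\left[ 1 + \xi\left( \frac{z-\mu}{\sigma} \right) \right]^{-1/\xi} \right\},
\qquad
z_p = \mu + \frac{\sigma}{\xi}\left[ \left( -\ln(1-p) \right)^{-\xi} - 1 \right],
\]

where μ, σ and ξ are the location, scale and shape parameters, and z_p is the water level with annual exceedance probability p (the 1/p-year return level), obtained by inverting G(z_p) = 1 − p.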