876 results for bigdata, data stream processing, dsp, apache storm, cyber security


Relevance:

30.00%

Publisher:

Abstract:

Current state-of-the-art climate models fail to capture accurately the path of the Gulf Stream and North Atlantic Current. This leads to a warm bias near the North American coast, where the modelled Gulf Stream separates from the coast further north, and a cold anomaly to the east of the Grand Banks of Newfoundland, where the North Atlantic Current remains too zonal in this region. Using an atmosphere-only model forced with the sea surface temperature (SST) biases in the North Atlantic, we consider the impact they have on the mean state and the variability in the North Atlantic European region in winter. Our results show that the SST errors produce a mean sea-level pressure response that is similar in magnitude and pattern to the atmospheric circulation errors in the coupled climate model. The work also suggests that errors in the coupled model storm tracks and North Atlantic Oscillation, compared to reanalysis data, can also be explained partly by these SST errors. Our results suggest that both the error in the Gulf Stream separation location and the path of the North Atlantic Current around the Grand Banks play important roles in affecting the atmospheric circulation. Reducing these coupled model errors could improve significantly the representation of the large-scale atmospheric circulation of the North Atlantic and European region.

Relevance:

30.00%

Publisher:

Abstract:

Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. Finally we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
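
A sketch of the decomposition referred to above may help orientation. For a friction force per unit mass F confined to the boundary layer, and with no other non-conservative processes acting, the frictional source term in the potential vorticity budget can be written in a standard form; the notation below is ours and is only a sketch of the split into the two contributions, not necessarily the paper's exact formulation:

```latex
\left.\frac{Dq}{Dt}\right|_{\mathrm{friction}}
  = \frac{1}{\rho}\,\nabla\theta\cdot\left(\nabla\times\mathbf{F}\right)
  = \underbrace{\frac{1}{\rho}\,\frac{\partial\theta}{\partial z}\,
      \hat{\mathbf{z}}\cdot\left(\nabla\times\mathbf{F}\right)}_{\text{Ekman generation}}
  \;+\;
  \underbrace{\frac{1}{\rho}\,\nabla_{h}\theta\cdot\left(\nabla\times\mathbf{F}\right)_{h}}_{\text{baroclinic generation}}
```

The second term is non-zero only where the frictionally turned boundary-layer wind acts across a low-level horizontal temperature gradient, which is why the baroclinic generation described in the abstract is concentrated near the developing warm and cold fronts.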

Relevance:

30.00%

Publisher:

Abstract:

When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
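
To make the proposed "orthogonalised" architecture concrete, the sketch below keeps the data-association problem and the out-of-sequence problem behind separate interfaces, so that any associator can be paired with any out-of-sequence update strategy. It is illustrative only; all class and method names are hypothetical and are not taken from the paper.

```python
# Illustrative sketch of the architecture described in the abstract; concrete
# algorithms (e.g. a retrodiction-based out-of-sequence update) would plug in
# behind these interfaces.
from abc import ABC, abstractmethod


class Associator(ABC):
    @abstractmethod
    def associate(self, tracks, measurements):
        """Return a list of (track, measurement) pairs."""


class OutOfSequenceUpdater(ABC):
    @abstractmethod
    def update(self, track, measurement, timestamp):
        """Fold a (possibly delayed) measurement into the track state."""


class Tracker:
    """Composes any Associator with any OutOfSequenceUpdater."""

    def __init__(self, associator: Associator, updater: OutOfSequenceUpdater):
        self.associator = associator
        self.updater = updater
        self.tracks = []

    def process_scan(self, measurements, timestamp):
        # Association and delayed-measurement handling are decided independently,
        # so either component can be swapped without touching the other.
        for track, measurement in self.associator.associate(self.tracks, measurements):
            self.updater.update(track, measurement, timestamp)
```

Because the two concerns never see each other's internals, any approximation made in one of them (for example, discarding or reprocessing stale measurements) remains explicit and local.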

Relevance:

30.00%

Publisher:

Abstract:

In data fusion systems, one often encounters measurements of past target locations and then wishes to deduce where the targets are currently located. Recent research on the processing of such out-of-sequence data has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships between the algorithms so that any approximations made are explicit.

Relevance:

30.00%

Publisher:

Abstract:

A weekly programme of water quality monitoring has been conducted by Slapton Ley Field Centre since 1970. Samples have been collected for the four main streams draining into Slapton Ley, from the Ley itself and from other sites within the catchment. On occasions, more frequent sampling has been undertaken during short-term research projects, usually in relation to nutrient export from the catchment. These water quality data, unparalleled in length for a series of small drainage basins in the British Isles, provide a unique resource for analysis of spatial and temporal variations in stream water quality within an agricultural area. Not surprisingly, given the eutrophic status of the Ley, most attention has focused on the nutrients nitrate and phosphate. A number of approaches to modelling nutrient loss have been attempted, including time series analysis and the application of nutrient export and physically-based models.
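
As a purely illustrative aside, the nutrient export modelling mentioned above is often formulated as an export-coefficient calculation. The sketch below assumes hypothetical land-use classes and coefficient values; they are not figures from the Slapton Ley studies.

```python
# Hypothetical export-coefficient sketch: the annual catchment load is the sum of
# a per-land-use export coefficient (kg ha^-1 yr^-1) multiplied by the area of
# that land use. Coefficients and areas below are invented for illustration.
EXPORT_COEFFS = {"arable": 30.0, "pasture": 10.0, "woodland": 2.0}


def annual_nutrient_load(areas_ha, coeffs=EXPORT_COEFFS):
    """Return the annual nutrient load (kg/yr) for the given land-use areas (ha)."""
    return sum(coeffs[land_use] * area for land_use, area in areas_ha.items())


print(annual_nutrient_load({"arable": 450.0, "pasture": 300.0, "woodland": 50.0}))
```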

Relevance:

30.00%

Publisher:

Abstract:

The rapid increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
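
One common data-parallel pattern in classification rule induction, sketched below purely for illustration (it is not any specific algorithm from the survey), is to partition the training data across workers, let each worker compute local class counts for every attribute-value pair, and merge the counts before selecting the next rule term.

```python
# Illustrative data-parallel rule-induction step: each worker counts
# (attribute, value, class) occurrences on its partition, the counts are merged,
# and the purest attribute-value pair is chosen as the next rule term.
from collections import Counter
from multiprocessing import Pool


def local_counts(partition):
    counts = Counter()
    for features, label in partition:
        for attribute, value in features.items():
            counts[(attribute, value, label)] += 1
    return counts


def best_term(data, n_workers=2):
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        merged = sum(pool.map(local_counts, chunks), Counter())
    totals = Counter()
    for (attribute, value, _label), n in merged.items():
        totals[(attribute, value)] += n
    # Toy selection criterion: the attribute-value pair whose dominant class is purest.
    return max(merged.items(), key=lambda kv: kv[1] / totals[kv[0][:2]])


if __name__ == "__main__":
    toy_data = [
        ({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "stay"),
        ({"outlook": "rainy", "windy": "no"}, "stay"),
        ({"outlook": "sunny", "windy": "no"}, "play"),
    ]
    print(best_term(toy_data))
```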

Relevance:

30.00%

Publisher:

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model where both the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture of applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together, Chaney and Hernly developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk

Relevance:

30.00%

Publisher:

Abstract:

We investigated the on-line processing of unaccusative and unergative sentences in a group of eight Greek-speaking individuals diagnosed with Broca aphasia and a group of language-unimpaired subjects used as the baseline. The processing of unaccusativity refers to the reactivation of the postverbal trace by retrieving the mnemonic representation of the verb’s syntactically defined antecedent provided in the early part of the sentence. Our results demonstrate that the Broca group showed selective reactivation of the antecedent for the unaccusatives. We consider several interpretations for our data, including explanations focusing on the transitivization properties of nonactive and active voice-alternating unaccusatives, the costly procedure claimed to underlie the parsing of active nonvoice-alternating unaccusatives, and the animacy of the antecedent modulating the syntactic choices of the patients.

Relevance:

30.00%

Publisher:

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
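
For context, the "straightforward" data-parallel formulation that the abstract contrasts against can be sketched as follows; the global Allreduce at the end of each iteration is exactly the communication step the proposed reformulation removes. This is an illustrative sketch assuming mpi4py and NumPy, not the authors' code.

```python
# Sketch of the standard data-parallel k-means step: local assignments and
# partial sums, followed by a global reduction so every node obtains the same
# updated centroids.
import numpy as np
from mpi4py import MPI


def kmeans_step(local_points, centroids, comm):
    k, d = centroids.shape
    # Assign each local point to its nearest centroid.
    distances = np.linalg.norm(local_points[:, None, :] - centroids[None, :, :], axis=2)
    labels = distances.argmin(axis=1)
    # Local partial sums and counts per cluster.
    local_sums = np.zeros((k, d))
    local_counts = np.zeros(k)
    for j in range(k):
        members = local_points[labels == j]
        local_sums[j] = members.sum(axis=0)
        local_counts[j] = len(members)
    # Global reduction: the communication step the paper's reformulation avoids.
    global_sums = np.empty_like(local_sums)
    global_counts = np.empty_like(local_counts)
    comm.Allreduce(local_sums, global_sums, op=MPI.SUM)
    comm.Allreduce(local_counts, global_counts, op=MPI.SUM)
    return global_sums / np.maximum(global_counts, 1.0)[:, None]
```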

Relevance:

30.00%

Publisher:

Abstract:

This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.

Relevance:

30.00%

Publisher:

Abstract:

A physically based gust parameterisation is added to the atmospheric mesoscale model FOOT3DK to estimate wind gusts associated with storms over West Germany. The gust parameterisation follows the Wind Gust Estimate (WGE) method and its functionality is verified in this study. The method assumes that gusts occurring at the surface are induced by turbulent eddies in the planetary boundary layer, deflecting air parcels from higher levels down to the surface under suitable conditions. Model simulations are performed with horizontal resolutions of 20 km and 5 km. Ten historical storm events of different characteristics and intensities are chosen in order to include a wide range of typical storms affecting Central Europe. All simulated storms occurred between 1990 and 1998. The accuracy of the method is assessed objectively by validating the simulated wind gusts against data from 16 synoptic stations by means of “quality parameters”. Concerning these parameters, the temporal and spatial evolution of the simulated gusts is well reproduced. Simulated values for low altitude stations agree particularly well with the measured gusts. For orographically exposed locations, the gust speeds are partly underestimated. The absolute maximum gusts lie in most cases within the bounding interval given by the WGE method. Focussing on individual storms, the performance of the method is better for intense and large storms than for weaker ones. Particularly for weaker storms, the gusts are typically overestimated. The results for the sample of ten storms document that the method is generally applicable with the mesoscale model FOOT3DK for mid-latitude winter storms, even in areas with complex orography.
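
The deflection idea behind the WGE method can be illustrated with a heavily simplified sketch: a parcel at some model level is assumed to reach the surface if the mean turbulent kinetic energy of the layer below it exceeds the buoyant energy barrier it must overcome, and the gust estimate is the largest wind speed among such levels. Variable names and the discretisation below are ours, not the FOOT3DK implementation.

```python
# Simplified WGE-style gust estimate: a level's wind speed counts as a possible
# surface gust if the mean TKE below that level exceeds the buoyancy barrier the
# descending parcel must overcome. This is a sketch, not the model's code.
import numpy as np

G = 9.81  # gravitational acceleration (m s^-2)


def wge_gust(z, wind_speed, tke, theta_v):
    """All inputs are 1-D arrays ordered from the surface upwards."""
    gust = wind_speed[0]
    for p in range(1, len(z)):
        mean_tke = np.trapz(tke[: p + 1], z[: p + 1]) / z[p]
        buoyancy_barrier = np.trapz(
            G * (theta_v[p] - theta_v[: p + 1]) / theta_v[: p + 1], z[: p + 1]
        )
        if mean_tke >= buoyancy_barrier:  # parcel can be deflected to the surface
            gust = max(gust, wind_speed[p])
    return gust
```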

Relevance:

30.00%

Publisher:

Abstract:

The synoptic evolution and some meteorological impacts of the European winter storm Kyrill that swept across Western, Central, and Eastern Europe between 17 and 19 January 2007 are investigated. The intensity and large storm damage associated with Kyrill are explained based on synoptic and mesoscale environmental storm features, as well as on comparisons to previous storms. Kyrill appeared on weather maps over the US state of Arkansas about four days before it hit Europe. It underwent an explosive intensification over the Western North Atlantic Ocean while crossing a very intense zonal polar jet stream. A superposition of several favourable meteorological conditions west of the British Isles caused a further deepening of the storm when it started to affect Western Europe. Evidence is provided that a favourable alignment of three polar jet streaks and a dry air intrusion over the occlusion and cold fronts were causal factors in maintaining Kyrill's low pressure very far into Eastern Europe. Kyrill, like many other strong European winter storms, was embedded in a pre-existing, anomalously wide, north-south mean sea-level pressure (MSLP) gradient field. In addition to the range of gusts that might be expected from the synoptic-scale pressure field, mesoscale features associated with convective overturning at the cold front are suggested as the likely causes for the extremely damaging peak gusts observed at many lowland stations during the passage of Kyrill's cold front. Compared to other storms, Kyrill was far from being the most intense system in terms of core pressure and circulation anomaly. However, the system moved into a pre-existing strong MSLP gradient located over Central Europe which extended into Eastern Europe. This fact is considered a determining factor in the anomalously large area affected by Kyrill. Additionally, considerations of windiness in climate change simulations using two state-of-the-art regional climate models driven by ECHAM5 indicate that not only Central, but also Eastern Central Europe may be affected by higher surface wind speeds at the end of the 21st century. These changes are partially associated with the increased pressure gradient over Europe which is identified in the ECHAM5 simulations. Thus, with respect to the area affected, as well as to the synoptic and mesoscale storm features, it is proposed that Kyrill may serve as an interesting case study to assess future storm impacts.

Relevance:

30.00%

Publisher:

Abstract:

Synoptic activity over the Northern Hemisphere is evaluated in ensembles of ECHAM5/MPI-OM1 simulations for recent climate conditions (20C) and for three climate scenarios (following SRES A1B, A2, B1). A close agreement is found between the simulations for present-day climate and the respective results from reanalysis. Significant changes in the winter mid-tropospheric storm tracks are detected in all three scenario simulations. Ensemble mean climate signals are rather similar, with particularly large activity increases downstream of the Atlantic storm track over Western Europe. The magnitude of this signal is largely dependent on the imposed change in forcing. However, differences between individual ensemble members may be large. With respect to the surface cyclones, the scenario runs produce a reduction in cyclonic track density over the mid-latitudes, even in the areas with increasing mid-tropospheric activity. The largest decrease in track densities occurs at subtropical latitudes, e.g., over the Mediterranean Basin. An increase in cyclone intensities is detected for limited areas (e.g., near Great Britain and the Aleutian Islands) for the A1B and A2 experiments. The changes in synoptic activity are associated with alterations of the Northern Hemisphere circulation and background conditions (blocking frequencies, jet stream). The North Atlantic Oscillation index also shows increased values with enhanced forcing. With respect to the effects of changing synoptic activity, the regional change in cyclone intensities is accompanied by alterations of the extreme surface winds, with increasing values over Great Britain and the North and Baltic Seas, as well as the areas with vanishing sea ice, and decreases over much of the subtropics.

Relevance:

30.00%

Publisher:

Abstract:

A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes of insured loss potentials over Europe in the 21st century. Losses are computed based on the daily maximum wind speed for each grid point. The calibration of the loss model is performed using wind data from the ERA40-Reanalysis and German loss data. The obtained annual losses for the present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES-Scenarios A1B and A2, and for each of them three realisations are considered. On average, insured loss potentials increase for all analysed European regions at the end of the 21st century. Changes are largest for Germany and France, and lowest for Portugal/Spain. Additionally, the spread between the single realisations is large, ranging e.g. for Germany from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show an increasing interannual variability of storm damage. This assessment is even more pronounced if no adaptation of building structure to climate change is considered. The increased loss potentials are linked with enhanced values for the high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with an enhanced number and increased intensity of extreme cyclones over the British Isles and the North Sea.
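
The abstract does not give the loss model's functional form. A widely used choice in this line of work, assumed here purely for illustration, is a population-weighted cubic excess of the daily maximum wind over its local 98th percentile; calibration against observed insurance losses then maps such an index to monetary amounts.

```python
# Hypothetical loss-index sketch: cubic excess of daily maximum wind over the
# local 98th percentile, weighted by population. The calibration step that maps
# this index to insured losses is omitted; nothing here is taken from the paper.
import numpy as np


def storm_loss_index(daily_vmax, v98, population):
    """daily_vmax: (days, cells); v98 and population: (cells,). Returns a scalar."""
    excess = np.maximum(daily_vmax / v98 - 1.0, 0.0) ** 3
    return float((excess * population).sum())
```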

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from these datasets. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining relies on the processing power of silicon hardware, visualization techniques also aim to exploit the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.