942 results for Centralised data warehouse Architecture


Relevance:

30.00%

Publisher:

Abstract:

This paper provides insight into the development of a process model for the essential expansion of an automatic miniload warehouse. The model is based on literature research and covers four phases of a warehouse expansion: the preparatory phase, the current-state analysis, the design phase and the decision-making phase. In addition to the literature research, the presented model is grounded in a reliable data set and can be applied with reasonable effort to ensure an informed decision on the warehouse layout. The model is addressed to users who are typically employees of the logistics department, and is oriented toward improving daily business organization in combination with warehouse expansion planning.

Relevance:

30.00%

Publisher:

Abstract:

We present in this paper several contributions to collision detection optimization centered on hardware performance. We focus on the broad phase, the first step of the collision detection process, and propose three new ways of parallelizing the well-known Sweep and Prune algorithm. We first developed a multi-core model that takes into account the number of available cores. The multi-core architecture enables us to distribute geometric computations using multi-threading. Critical writing sections and thread idling have been minimized by introducing new data structures for each thread. Programming with directives, such as OpenMP, appears to be a good compromise for code portability. We then proposed a new GPU-based algorithm, also based on Sweep and Prune, that has been adapted to multi-GPU architectures. Our technique is based on a spatial subdivision method used to distribute computations among GPUs. Results show that a significant speed-up can be obtained by moving from 1 to 4 GPUs in a large-scale environment.
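For reference, the serial form of the broad phase that this abstract parallelizes can be sketched in a few lines. The following is a minimal single-axis Sweep and Prune in Python; the box representation, axis choice and object names are illustrative assumptions, not the authors' implementation:

```python
# Minimal single-axis Sweep and Prune broad phase (serial sketch).
# Each box is (id, min_corner, max_corner) with 3D axis-aligned bounds.

def aabb_overlap(a, b):
    """Full AABB test, run only on pairs surviving the sweep."""
    return all(a[1][k] <= b[2][k] and b[1][k] <= a[2][k] for k in range(3))

def sweep_and_prune(boxes, axis=0):
    """Return candidate colliding pairs by sweeping along one axis."""
    # Sort boxes by their lower bound on the sweep axis.
    boxes = sorted(boxes, key=lambda box: box[1][axis])
    pairs = []
    for i, a in enumerate(boxes):
        for b in boxes[i + 1:]:
            # Once b starts beyond a's upper bound on the sweep axis,
            # no later box in sorted order can overlap a either.
            if b[1][axis] > a[2][axis]:
                break
            if aabb_overlap(a, b):
                pairs.append((a[0], b[0]))
    return pairs

boxes = [("A", (0, 0, 0), (2, 2, 2)),
         ("B", (1, 1, 1), (3, 3, 3)),
         ("C", (5, 0, 0), (6, 1, 1))]
print(sweep_and_prune(boxes))  # → [('A', 'B')]
```

The early `break` after sorting is what makes the algorithm attractive for parallelization: each outer iteration is independent, so iterations can be distributed across threads or GPUs once per-thread output structures avoid contended writes.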

Relevance:

30.00%

Publisher:

Abstract:

This article provides a holistic legal analysis of the use of cookies in Online Behavioural Advertising. The current EU legislative framework is outlined in detail, and the legal obligations are examined. Consent and the debates surrounding its implementation form a large portion of the analysis. The article outlines the current difficulties associated with reliance on this requirement as a condition for the placing and accessing of cookies. Alternatives to this approach are explored, and the implementation of solutions based on the Privacy by Design and Privacy by Default concepts is presented. This discussion involves an analysis of the use of code, and therefore product architecture, to ensure adequate protections.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we investigate content-centric data transmission in the context of short opportunistic contacts and base our work on an existing content-centric networking architecture. In the case of short interconnection times, file transfers may not complete and the received information is discarded. Caches in content-centric networks are used for short-term storage and do not guarantee persistence. We implemented a mechanism that extends caching onto persistent storage, enabling the completion of disrupted content transfers. The mechanism has been implemented in the CCNx framework and evaluated on wireless mesh nodes. Our evaluations using multicast and unicast communication show that the implementation can support content transfers in opportunistic environments without significant processing and storage overhead.
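The paper implements this mechanism inside the CCNx framework; the spillover idea itself can be illustrated standalone. The following Python sketch (a hypothetical toy, with assumed capacity, naming and eviction policy) shows a memory cache that writes evicted chunks to persistent storage so partially transferred content survives a disrupted contact:

```python
import os
import tempfile

class PersistentCache:
    """Toy in-memory chunk cache with a persistent spillover directory,
    illustrating how a short-term cache can be extended so interrupted
    transfers remain resumable: chunks evicted from memory survive on disk."""

    def __init__(self, capacity=2, store_dir=None):
        self.capacity = capacity
        self.mem = {}  # content name -> chunk bytes (insertion-ordered)
        self.dir = store_dir or tempfile.mkdtemp()

    def _path(self, name):
        return os.path.join(self.dir, name.replace("/", "_"))

    def put(self, name, chunk):
        if len(self.mem) >= self.capacity:
            # Evict the oldest chunk, but persist it instead of discarding.
            old_name, old_chunk = next(iter(self.mem.items()))
            del self.mem[old_name]
            with open(self._path(old_name), "wb") as f:
                f.write(old_chunk)
        self.mem[name] = chunk

    def get(self, name):
        if name in self.mem:
            return self.mem[name]
        path = self._path(name)
        if os.path.exists(path):  # fall back to persistent storage
            with open(path, "rb") as f:
                return f.read()
        return None
```

A real CCN cache would also handle chunk verification and expiry; this sketch only demonstrates that an eviction need not lose data already received.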

Relevance:

30.00%

Publisher:

Abstract:

Here we present datasets from a hydroacoustic survey conducted in July 2011 at Lake Torneträsk, northern Sweden. Our hydroacoustic data reveal lake-floor morphologies formed by glacial erosion and accumulation processes, provide insights into lacustrine sediment accumulation since the beginning of deglaciation, and yield information on seismic activity along the Pärvie Fault. Features of glacial scouring with high-energy relief, steep slopes, and relative reliefs of more than 50 m are observed in the large W-basin. The remainder of the lacustrine subsurface appears to host a broad variety of well-preserved formations from glacial accumulation related to the last retreat of the Fennoscandian ice sheet. Deposition of glaciolacustrine and lacustrine sediments is focused in areas situated in proximity to major inlets. Sediment accumulation in distal areas of the lake seldom exceeds 2 m or is not observable. We assume that the lack of sediment deposition in the lake results from several factors, including low rates of erosion in the catchment, a previously high lake level leading to deposition of sediments in higher-elevated paleodeltas, tributaries carrying low suspension loads as a result of sedimentation in upstream lakes, and an overall low productivity in the lake. A clear offshore trace of the Pärvie Fault could not be detected in our hydroacoustic data. However, the absence of sediment disturbance in close proximity to the presumed fault trace implies minimal seismic activity since deposition of the glaciolacustrine and lacustrine sediments.

Relevance:

30.00%

Publisher:

Abstract:

Long Term Evolution (LTE) is the fourth-generation (4G) technology capable of providing high data rates as well as support for high-speed mobility. The EU FP7 Mobile Cloud Networking (MCN) project integrates cloud computing concepts into LTE mobile networks in order to increase LTE's performance. In this way, a shared, distributed, virtualized LTE mobile network is built that can optimize the utilization of virtualized computing, storage and network resources and minimize communication delays. Two important features that can be used in such a virtualized system to improve its performance are user mobility prediction and bandwidth prediction. This paper introduces the architecture and challenges associated with user mobility and bandwidth prediction approaches in virtualized LTE systems.

Relevance:

30.00%

Publisher:

Abstract:

Information-centric networking (ICN) has been proposed to cope with the drawbacks of the Internet Protocol, namely scalability and security. The majority of research efforts in ICN have focused on routing and caching in wired networks, while little attention has been paid to optimizing communication and caching efficiency in wireless networks. In this work, we study the application of Raptor codes to Named Data Networking (NDN), a popular ICN architecture, in order to minimize the number of transmitted messages and accelerate content retrieval times. We propose RC-NDN, an NDN-compatible Raptor code architecture. In contrast to other coding-based NDN solutions that employ network codes, RC-NDN respects the security architecture inherent to NDN. Moreover, unlike existing network-coding-based solutions for NDN, RC-NDN does not require significant computational resources, which renders it appropriate for low-cost networks. We evaluate RC-NDN in scenarios with high mobility. Evaluations show that RC-NDN outperforms the original NDN significantly. RC-NDN is particularly efficient in dense environments, where retrieval times can be reduced by 83% and the number of Data transmissions by 84.5% compared to NDN.
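The abstract does not detail the coding scheme, but the low computational cost it claims comes from the peeling-style decoding common to fountain codes such as Raptor codes. The toy Python example below (block contents, symbol choices and the encoding pattern are illustrative assumptions, not RC-NDN's actual code construction) shows the peeling idea: encoded symbols are XOR combinations of source blocks, degree-1 symbols seed recovery, and substitutions cascade:

```python
# Toy illustration of peeling decoding, the cheap XOR-based mechanism
# behind fountain/Raptor codes (not the RC-NDN construction itself).

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def peel_decode(symbols, n_blocks):
    """symbols: list of (set_of_block_ids, payload) pairs, where payload
    is the XOR of the named source blocks. Returns recovered blocks."""
    symbols = [(set(ids), data) for ids, data in symbols]
    recovered = {}
    progress = True
    while progress and len(recovered) < n_blocks:
        progress = False
        for ids, data in symbols:
            pending = ids - recovered.keys()
            if len(pending) == 1:            # degree 1 after substitution
                block = pending.pop()
                for known in ids - {block}:  # strip already-known blocks
                    data = xor(data, recovered[known])
                recovered[block] = data
                progress = True
    return recovered

# Four source blocks and four encoded symbols: B0, B0^B1, B1^B2, B2^B3.
blocks = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]
enc = [({0}, blocks[0]),
       ({0, 1}, xor(blocks[0], blocks[1])),
       ({1, 2}, xor(blocks[1], blocks[2])),
       ({2, 3}, xor(blocks[2], blocks[3]))]
print(peel_decode(enc, 4))  # recovers all four original blocks
```

Because decoding reduces to XORs rather than the Gaussian elimination typical of random linear network codes, it suits the low-cost nodes the abstract targets.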

Relevance:

30.00%

Publisher:

Abstract:

In this work, we propose a novel network-coding-enabled NDN architecture for the delivery of scalable video. Our scheme utilizes network coding in order to address a problem that arises in the original NDN protocol, where optimal use of bandwidth and caching resources necessitates the coordination of forwarding decisions. To optimize the performance of the proposed network-coding-based NDN protocol and render it appropriate for transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects will maximize the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimum.
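The Bloom filters mentioned above trade a small false-positive rate for very compact set membership, which is why they fit inside Interest and Data messages. A minimal Python sketch follows; the hash construction, bit-array size and example names are illustrative assumptions, not the parameters of the proposed architecture:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-bit array.
    Membership tests may yield false positives but never false negatives."""

    def __init__(self, m_bits=1024, k_hashes=4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item):
        # Derive k positions by salting a single hash function.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("/video/layer0/segment3")        # e.g. a pending Interest name
assert "/video/layer0/segment3" in bf   # added items are always found
```

The key design property for a coding node is that the filter summarizes which Interests or Data objects it has seen in constant space, regardless of how many names are inserted.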

Relevance:

30.00%

Publisher:

Abstract:

The oceans play a critical role in the Earth's climate, but unfortunately, the extent of this role is only partially understood. One major obstacle is the difficulty associated with making high-quality, globally distributed observations, a feat that is nearly impossible using only ships and other ocean-based platforms. The data collected by satellite-borne ocean color instruments, however, provide environmental scientists a synoptic look at the productivity and variability of the Earth's oceans and atmosphere, respectively, on high-resolution temporal and spatial scales. Three such instruments, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) onboard ORBIMAGE's OrbView-2 satellite, and two Moderate Resolution Imaging Spectroradiometers (MODIS) onboard the National Aeronautics and Space Administration's (NASA) Terra and Aqua satellites, have been in continuous operation since September 1997, February 2000, and June 2002, respectively. To facilitate the assembly of a suitably accurate data set for climate research, members of the NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project and SeaWiFS Project Offices devote significant attention to the calibration and validation of these and other ocean color instruments. This article briefly presents results from the SIMBIOS and SeaWiFS Project Offices' (SSPO) satellite ocean color validation activities and describes the SeaWiFS Bio-optical Archive and Storage System (SeaBASS), a state-of-the-art system for archiving, cataloging, and distributing the in situ data used in these activities.

Relevance:

30.00%

Publisher:

Abstract:

Ground penetrating radar (GPR) and capacitive coupled resistivity (CCR) measurements were conducted in order to image subsurface structures in the Orkhon Valley, Central Mongolia. The data are complemented by information from drill cores along the entire transects, distinguishing different sedimentary environments in the valley. The Orkhon Valley is part of the highly sensitive steppe region in Central Mongolia, one of the most important cultural landscapes in Central Asia. There, archaeological, geoarchaeological and sedimentological research aims to reconstruct the landscape evolution and the interaction between man and environment during the millennia since the first settlement. In May 2009 and 2010, geophysical surveys were conducted, including transects with lengths between 1.5 and 30 km crossing the entire valley and a kilometre-scale grid in the southern part of the investigation area. The geoelectrical and GPR data revealed the existence of two layers characterized by different resistivity values and radar reflectors. The two layers do not only represent material contrasts, but also reflect the influence of sporadic permafrost, which occurs in several areas of Mongolia. The results help to reconstruct the evolution of the braided Orkhon River and therefore provide important clues for understanding the environmental history of the Orkhon Valley.

Relevance:

30.00%

Publisher:

Abstract:

Fine-grained sediment depocenters on continental shelves are of increasing scientific interest since they sensitively record environmental changes. A north-south elongated mud depocenter extends along the Senegalese coast in a mid-shelf position. Shallow-acoustic profiling was carried out to determine the extent, geometry and internal structures of this sedimentary body. In addition, four sediment cores were retrieved with the main aim of identifying how paleoclimatic signals and coastal changes have controlled the formation of this mud depocenter. A general paleoclimatic pattern in terms of fluvial input appears to be recorded in this depositional archive. Intervals characterized by high terrigenous input, high sedimentation rates and fine grain sizes occur roughly contemporaneously in all cores and are interpreted as corresponding to intensified river discharge related to more humid conditions in the hinterland. From 2750 to 1900 and from 1000 to 700 cal a BP, wetter conditions are recorded off Senegal, an observation which is in accordance with other records from NW Africa. Nevertheless, the three employed proxies (sedimentation rate, grain size and elemental distribution) do not always display consistent inter-core patterns. Major differences between the individual core records are attributed to sediment remobilization, which was linked to local hydrographic variations as well as reorganizations of the coastal system. The Senegal mud belt is a layered, inhomogeneous sedimentary body deposited on an irregular erosive surface. Early Holocene deceleration in the rate of sea-level rise could have enabled initial mud deposition on the shelf. These favorable conditions for mud deposition occurred coevally with a humid period over NW Africa, and thus high river discharge. Sedimentation started preferentially in the northern areas of the mud belt. During the mid-Holocene, a marine incursion led to the formation of an embayment. Afterwards, sedimentation in the north was interrupted in association with a remarkable southward shift in the location of the active depocenter, as reflected by the sedimentary architecture and confirmed by radiocarbon dates. These sub-recent shifts in depocenter location are caused by migrations of the Senegal River mouth. During late Holocene times, the weakening of river discharge allowed the longshore currents to build up a chain of beach barriers, which have forced the river mouth to shift southwards.

Relevance:

30.00%

Publisher:

Abstract:

Bathymetry based on data recorded during cruise M52-1 between 02.01.2002 and 01.02.2002 in the Black Sea. The cruise focused on studying the distribution, structure and architecture of gas hydrate deposits in the Black Sea as well as their relationship to fluid migration pathways. High-resolution geoacoustic investigation tools covering a whole range of frequencies and techniques render detailed images of near-surface gas hydrates and associated fluid migration pathways.

Relevance:

30.00%

Publisher:

Abstract:

This airborne hyperspectral (19-band) image data of Heron Reef, Great Barrier Reef, Australia is derived from Compact Airborne Spectrographic Imager (CASI) data acquired on the 1st and 3rd of July 2002, at latitude -23.45, longitude 151.92. Processing and correction to at-surface data was completed by Karen Joyce (Joyce, 2004). The raw imagery consisted of several images corresponding to the number of flight paths required to cover the entire Heron Reef. Spatial resolution is one meter. Radiometric corrections converted the at-sensor digital number values to at-surface spectral radiance values using sensor-specific calibration coefficients and CSIRO's c-WomBat-c atmospheric correction software. Geometric corrections were done using field-collected coordinates of features identified in the image. The projection used was Universal Transverse Mercator Zone 56 South and the datum was WGS 84. Image data is in TIFF format.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, the Internet is a place where social networks have had a major impact on collaboration among people around the world in different ways. This article proposes a new paradigm for building CSCW business tools following the novel ideas provided by the social web to collaborate and generate awareness. An implementation of these concepts is described, including the components we provide for collaborating in workspaces (such as videoconferencing, chat, desktop sharing, forums or temporal events) and the way we generate awareness from these complex social data structures. Figures and validation results are also presented to show that this architecture has been defined to support awareness generation by joining current and future social data from the business and social network worlds, based on the idea of using social data stored in the cloud.

Relevance:

30.00%

Publisher:

Abstract:

The fuzzy min–max neural network classifier is a supervised learning method that takes a hybrid neural-network and fuzzy-systems approach. All input variables in the network are required to be continuously valued, which can be a significant constraint in many real-world situations where the data are not only quantitative but also categorical. The usual way of dealing with this type of variable is to replace the categorical values with numerical ones and treat them as if they were continuously valued. But this method implicitly defines a possibly unsuitable metric for the categories. A number of different procedures have been proposed to tackle the problem. In this article, we present a new method. The procedure extends the fuzzy min–max neural network input to categorical variables by introducing new fuzzy sets, a new operation, and a new architecture. This provides greater flexibility and wider application. The proposed method is then applied to missing data imputation in voting intention polls. The micro data of this type of poll (the set of respondents' individual answers to the questions) are especially suited for evaluating the method, since they include a large number of numerical and categorical attributes.
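For the continuous case that the article extends, a fuzzy min–max network classifies points by their membership in learned hyperboxes. The Python sketch below is a simplified membership function in the style of the base model (the exact formula, the sensitivity parameter `gamma` and the example boxes are illustrative assumptions, not the article's categorical extension):

```python
def hyperbox_membership(x, v, w, gamma=4.0):
    """Fuzzy membership of point x in the hyperbox with min corner v and
    max corner w (all coordinates in [0, 1]). gamma controls how quickly
    membership decays outside the box; points inside score exactly 1.0."""
    total = 0.0
    for xi, vi, wi in zip(x, v, w):
        # Penalty per dimension grows with the distance beyond either face,
        # clipped to [0, 1] so one bad dimension cannot dominate unboundedly.
        below = max(0.0, min(1.0, gamma * (vi - xi)))
        above = max(0.0, min(1.0, gamma * (xi - wi)))
        total += 1.0 - below - above
    return total / len(x)

# A point inside the box has full membership; one outside decays.
inside = hyperbox_membership([0.4, 0.5], v=[0.3, 0.4], w=[0.6, 0.7])
outside = hyperbox_membership([0.9, 0.9], v=[0.3, 0.4], w=[0.6, 0.7])
print(inside, outside)  # → 1.0 and a value below 1.0
```

The constraint the article addresses is visible here: the arithmetic on `xi - wi` presupposes an ordering and a distance between values, which categorical attributes such as party preference do not naturally have.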