922 results for Data replication processes
Abstract:
This research is supported by the UK Research Councils’ Digital Economy IT as a Utility Network+ (EP/K003569/1) and the dot.rural Digital Economy Hub (EP/G066051/1).
Abstract:
The research described here is supported by the award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Hub; award reference: EP/G066051/1.
Abstract:
During the Regional Forest Agreement (RFA) process in south-east Queensland, the conservation status of, and threats to, priority vascular plant taxa in the region were assessed. Characteristics of biology, demography and distribution were used to assess the species' intrinsic risk of extinction. In contrast, the threats to the taxa (their extrinsic risk of extinction) were assessed using a decision-support protocol for setting conservation targets for taxa lacking population viability analyses and habitat modelling data. Disturbance processes known or suspected to be adversely affecting the taxa were evaluated for their intensity, extent and time-scale. Expert opinion was used to provide much of the data and to assess the recommended protection areas. Five categories of intrinsic risk of extinction were recognised for the 105 priority taxa: critically endangered (43 taxa); endangered (29); vulnerable (21); rare (10); and presumed extinct (2). Only 6 of the 103 extant taxa were found to be adequately reserved, and the majority were considered inadequately protected to survive the current regimes of threatening processes affecting them. Data were insufficient to calculate a protection target for one extant taxon. Over half of the taxa require all populations to be conserved, as well as active management to alleviate threatening processes. The most common threats to particular taxa were competition from weeds or native species, inappropriate fire regimes, agricultural clearing, forestry, grazing by native or feral species, drought, urban development, illegal collection of plants, and altered hydrology. Apart from drought and competition from native species, these disturbances are largely influenced or initiated by human actions. Therefore, as well as increased protection of most of the taxa, active management interventions are necessary to reduce the effects of threatening processes and to enable the persistence of the taxa.
Abstract:
This article presents some of the results of the Ph.D. thesis Class Association Rule Mining Using MultiDimensional Numbered Information Spaces by Iliya Mitov (Institute of Mathematics and Informatics, BAS), successfully defended at Hasselt University, Faculty of Science, on 15 November 2011 in Belgium.
Abstract:
3D geographic information systems (GIS) are data- and computation-intensive in nature. Internet users are usually equipped with low-end personal computers and network connections of limited bandwidth. Data reduction and performance optimization techniques are therefore of critical importance in quality of service (QoS) management for online 3D GIS. In this research, QoS management issues regarding distributed 3D GIS presentation were studied to develop 3D TerraFly, an interactive 3D GIS that supports high-quality online terrain visualization and navigation. To tackle the QoS management challenges, a multi-resolution rendering model, adaptive level of detail (LOD) control and mesh simplification algorithms were proposed to effectively reduce the terrain model complexity. The rendering model is adaptively decomposed into sub-regions of up to three detail levels according to viewing distance and other dynamic quality measurements. The mesh simplification algorithm was designed as a hybrid algorithm that combines edge straightening and quad-tree compression to reduce the mesh complexity by removing geometrically redundant vertices. The main advantage of this mesh simplification algorithm is that the grid mesh can be processed directly in parallel without triangulation overhead. Algorithms facilitating remote access and distributed processing of volumetric GIS data, such as data replication, directory service, request scheduling, predictive data retrieval and caching, were also proposed. A prototype of the proposed 3D TerraFly implemented in this research demonstrates the effectiveness of our proposed QoS management framework in handling interactive online 3D GIS. The system implementation details and future directions of this research are also addressed in this thesis.
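As a rough illustration of the kind of distance-based LOD control described above, the following Python sketch picks one of three detail levels for a terrain tile from its distance to the viewer. The thresholds, simplification ratios and the TerrainTile structure are illustrative assumptions, not values or code from the thesis.

```python
# Minimal sketch of distance-based level-of-detail (LOD) selection with
# up to three detail levels per sub-region. Thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class TerrainTile:
    center_x: float
    center_y: float

# (max viewing distance, fraction of mesh vertices kept) per detail level
LOD_LEVELS = [
    (500.0, 1.00),          # level 0: full-resolution grid mesh
    (2000.0, 0.25),         # level 1: simplified mesh
    (float("inf"), 0.05),   # level 2: coarsest mesh
]

def select_lod(tile: TerrainTile, viewer_x: float, viewer_y: float) -> int:
    """Pick a detail level for a tile from its distance to the viewer."""
    dist = ((tile.center_x - viewer_x) ** 2 + (tile.center_y - viewer_y) ** 2) ** 0.5
    for level, (max_dist, _keep_fraction) in enumerate(LOD_LEVELS):
        if dist <= max_dist:
            return level
    return len(LOD_LEVELS) - 1

print(select_lod(TerrainTile(100.0, 0.0), 0.0, 0.0))    # 0 (near: full detail)
print(select_lod(TerrainTile(5000.0, 0.0), 0.0, 0.0))   # 2 (far: coarsest)
```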
Abstract:
In the finance literature, many economic theories and models have been proposed to explain and estimate the relationship between risk and return. Assuming risk averseness and rational behavior on the part of the investor, models are developed that are supposed to help in forming efficient portfolios that either maximize (minimize) the expected rate of return (risk) for a given level of risk (rate of return). One of the most widely used models for forming these efficient portfolios is Sharpe's Capital Asset Pricing Model (CAPM). In the development of this model it is assumed that investors have homogeneous expectations about the future probability distribution of the rates of return; that is, every investor assumes the same values of the parameters of the probability distribution. Likewise, financial volatility homogeneity is commonly assumed, where volatility is taken as investment risk, usually measured by the variance of the rates of return. Typically the square root of the variance is used to define financial volatility; furthermore, it is often assumed that the data-generating process is made up of independent and identically distributed random variables, which again implies that financial volatility is measured from homogeneous time series with stationary parameters. In this dissertation, we investigate the assumptions of homogeneity of market agents and provide evidence of heterogeneity in market participants' information, objectives, and expectations about the parameters of the probability distribution of prices, as given by the differences in the empirical distributions corresponding to different time scales, which in this study are associated with different classes of investors. We also demonstrate that the statistical properties of the underlying data-generating processes, including the volatility in the rates of return, are quite heterogeneous. In other words, we provide empirical evidence against the traditional views about homogeneity using non-parametric wavelet analysis on trading data. The results show heterogeneity of financial volatility at different time scales, and time scale is one of the most important aspects in which trading behavior differs. In fact, we conclude that heterogeneity, as posited by the Heterogeneous Markets Hypothesis, is the norm and not the exception.
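To make the notion of volatility heterogeneity across time scales concrete, the sketch below aggregates a synthetic return series over several horizons and compares the volatility at each scale with the sqrt-of-scale growth implied by homogeneous i.i.d. returns. The data and scales are assumptions for illustration; this is not the wavelet machinery used in the dissertation.

```python
# Compare volatility at several aggregation scales of a toy return series.
import numpy as np

rng = np.random.default_rng(0)
log_prices = np.cumsum(rng.normal(0.0, 0.01, size=4096))   # toy random walk
returns = np.diff(log_prices)                               # 1-step log returns

def scale_volatility(r: np.ndarray, scale: int) -> float:
    """Std. dev. of returns aggregated over non-overlapping blocks of `scale`."""
    n = (len(r) // scale) * scale
    aggregated = r[:n].reshape(-1, scale).sum(axis=1)
    return float(aggregated.std(ddof=1))

for scale in (1, 4, 16, 64):   # e.g. tick, intraday, daily, weekly horizons
    vol = scale_volatility(returns, scale)
    # Under homogeneity (i.i.d. returns) volatility grows like sqrt(scale);
    # systematic deviations across scales are the heterogeneity at issue.
    print(f"scale {scale:3d}: vol = {vol:.4f}, vol/sqrt(scale) = {vol/np.sqrt(scale):.4f}")
```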
Abstract:
Radiogenic isotopes of hafnium (Hf) and neodymium (Nd) are powerful tracers for water mass transport and trace metal cycling in the present and past oceans. However, due to the scarcity of available data, the processes governing their distribution are not well understood. Here we present the first combined dissolved Hf and Nd isotope and concentration data from surface waters of the Atlantic sector of the Southern Ocean. The samples were collected along the Zero Meridian, in the Weddell Sea and in the Drake Passage during RV Polarstern expeditions ANT-XXIV/3 and ANT-XXIII/3 in the framework of the International Polar Year (IPY) and the GEOTRACES program. The general distribution of Hf and Nd concentrations in the region is similar. However, at the northernmost station, located 200 km southwest of Cape Town, a pronounced increase of the Nd concentration is observed whereas the Hf concentration is minimal, suggesting that much less Hf than Nd is released by the weathering of the South African Archean cratonic rocks. From the southern part of the Subtropical Front (STF) to the Polar Front (PF), Hf and Nd show the lowest concentrations (<0.12 pmol/kg and 10 pmol/kg, respectively), most probably due to the low terrigenous flux in this area and efficient scavenging of Hf and Nd by biogenic opal. In the vicinity of landmasses the dissolved Hf and Nd isotope compositions are clearly labelled by terrigenous inputs. Near South Africa, Nd isotope values as low as epsilon-Nd = -18.9 indicate unradiogenic inputs supplied via the Agulhas Current. Further south, the isotopic data show significant increases to epsilon-Hf = +6.1 and epsilon-Nd = -4.0, documenting exchange of seawater Nd and Hf with the Antarctic Peninsula. In the open Southern Ocean the Nd isotope compositions are relatively homogeneous (epsilon-Nd ~ -8 to -8.5) towards the STF, within the Antarctic Circumpolar Current, in the Weddell Gyre, and in the Drake Passage. The Hf isotope compositions in the entire study area show only a small range between epsilon-Hf = +6.1 and +2.8, supporting the view that Hf is more readily released from young mafic rocks than from old continental ones. The Nd isotope composition ranges from epsilon-Nd = -18.9 to -4.0, showing Nd isotopes to be a sensitive tracer for the provenance of weathering inputs into surface waters of the Southern Ocean.
Abstract:
This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software component and a game engine should be implemented to achieve seamless integration. To this end, a RAGE-compliant C# software component providing a difficulty adaptation routine was integrated with an exemplary strategic tile-based game, "TileZero". Implementations in MonoGame, Unity and Xamarin, respectively, demonstrated successful portability of the adaptation component. Portability across various delivery platforms (Windows desktop, iOS, Android, Windows Phone) was also established. This study has thereby established the validity of the RAGE architecture and its underlying interaction processes for the cross-platform and cross-game-engine reuse of software components. The RAGE architecture thus accommodates the large-scale development and application of reusable software components for serious gaming.
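The sketch below illustrates, in Python for brevity, the general shape of an engine-agnostic component interface plus a thin per-engine bridge; the class and method names are hypothetical and are not the actual RAGE or TileZero APIs (the component in the paper is a C# implementation).

```python
# Hypothetical sketch of a reusable difficulty-adaptation component that only
# talks to the host game engine through a small bridge interface.
from abc import ABC, abstractmethod

class GameBridge(ABC):
    """What the host engine must provide to the component."""
    @abstractmethod
    def log(self, message: str) -> None: ...

class DifficultyAdapter:
    """Reusable component: adjusts difficulty from recent player outcomes."""
    def __init__(self, bridge: GameBridge):
        self.bridge = bridge
        self.difficulty = 0.5          # normalised difficulty in [0, 1]

    def report_outcome(self, player_won: bool) -> float:
        step = 0.05
        self.difficulty += step if player_won else -step
        self.difficulty = min(1.0, max(0.0, self.difficulty))
        self.bridge.log(f"difficulty -> {self.difficulty:.2f}")
        return self.difficulty

# Engine-specific part: each engine (MonoGame, Unity, Xamarin, ...) would
# implement its own bridge; here a console bridge stands in for all of them.
class ConsoleBridge(GameBridge):
    def log(self, message: str) -> None:
        print(message)

adapter = DifficultyAdapter(ConsoleBridge())
adapter.report_outcome(player_won=True)
```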
Abstract:
Individuals and corporate users are increasingly considering cloud adoption due to its significant benefits compared to traditional computing environments. Data and applications in the cloud are stored in an environment that is separated from the organisation and managed and maintained externally. It is therefore essential for cloud providers to demonstrate and implement adequate security practices to protect the data and processes put under their stewardship. Security transparency in the cloud is likely to become the core theme underpinning the systematic disclosure of security designs and practices that enhance customer confidence in using cloud service and deployment models. In this paper, we present a framework that enables a detailed analysis of security transparency for cloud-based systems. In particular, we consider security transparency at three different levels of abstraction, i.e., the conceptual, organisational and technical levels, and identify the relevant concepts within these levels. This allows us to elaborate the essential concepts at the core of transparency and to analyse the means for implementing them from a technical perspective. Finally, an example from a real-world migration context is given to provide a solid discussion of the applicability of the proposed framework.
Abstract:
We carry out quasi-classical trajectory calculations for the C + CH+ → C2+ + H reaction on a high-level ab initio potential energy surface computed ad hoc. Thermal rate coefficients at the temperatures of relevance in cold interstellar clouds are derived and compared with the assumed, temperature-independent estimates publicly available in the kinetic databases KIDA and UDfA. At a temperature of 10 K, the database value overestimates the one obtained by us by a factor of two (thus improperly enhancing the destruction route of CH+ in astrochemical kinetic models); our computed rate coefficient is seen to double over the temperature range 5–300 K, with a sharp increase in the first 50 K. The computed values are fitted via the popular Arrhenius–Kooij formula, and the best-fitting parameters α = 1.32 × 10⁻⁹ cm³ s⁻¹, β = 0.10 and γ = 2.19 K, to be included in the online databases mentioned above, are provided. Further investigation shows that the temperature dependence of the thermal rate coefficient conforms better to the recently proposed 'deformed Arrhenius' law of Aquilanti and Mundim.
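Assuming the standard KIDA/UDfA convention for the Arrhenius–Kooij (modified Arrhenius) form, k(T) = α (T/300 K)^β exp(−γ/T), the fitted parameters quoted above can be evaluated as in the following sketch; the chosen temperatures are illustrative.

```python
# Evaluate the Arrhenius-Kooij fit k(T) = alpha * (T/300)**beta * exp(-gamma/T)
# with the best-fitting parameters quoted in the abstract.
import math

ALPHA = 1.32e-9   # cm^3 s^-1
BETA = 0.10
GAMMA = 2.19      # K

def rate_coefficient(T: float) -> float:
    """Thermal rate coefficient k(T) in cm^3 s^-1 from the quoted fit."""
    return ALPHA * (T / 300.0) ** BETA * math.exp(-GAMMA / T)

for T in (10.0, 50.0, 300.0):
    print(f"T = {T:5.1f} K -> k = {rate_coefficient(T):.3e} cm^3 s^-1")
```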
Abstract:
Continuous training and preparation of IT staff is one of the most effective strategies for improving the quality, stability and security of networks and their associated services. Along these lines, CEDIA has been running training courses and workshops for its members and, within CSIRT-CEDIA, consideration has been given to optimising the processes involved in deploying the infrastructure needed to provide the participants of these training sessions with suitably customised material in the area of information security. It was therefore decided to use virtualisation techniques to make the most of the available resources; however, even though this in itself is not a new trend, using a complete copy of the virtual disk for each participant is impractical not only in terms of time but also in terms of the storage it consumes. This work focuses precisely on optimising the time and resource consumption associated with replicating the same virtual machine and virtual disk for the individual use of several participants.
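One common way to avoid a full disk copy per participant is to give each participant a thin copy-on-write overlay that shares a single read-only base image. The sketch below shows this general technique with qemu-img backing files; it is not necessarily the exact mechanism adopted in the work described, the image names are hypothetical, and qemu-img is assumed to be installed.

```python
# Create per-participant copy-on-write overlays over one shared base image.
import subprocess

BASE_IMAGE = "base-lab.qcow2"           # master disk prepared once

def create_overlay(participant: str) -> str:
    """Create a thin copy-on-write disk for one participant on top of the base."""
    overlay = f"participant-{participant}.qcow2"
    subprocess.run(
        ["qemu-img", "create",
         "-f", "qcow2",                 # overlay format
         "-b", BASE_IMAGE,              # shared read-only backing file
         "-F", "qcow2",                 # backing file format
         overlay],
        check=True,
    )
    return overlay

# Each overlay starts out tiny and only grows with that participant's own
# writes, instead of duplicating the whole base disk for every attendee.
disks = [create_overlay(name) for name in ("alice", "bob", "carol")]
print(disks)
```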
Abstract:
Data replication is a mechanism to synchronize and integrate data between databases distributed over a computer network. It is an important tool in several situations, such as the creation of backup systems, load balancing between nodes, distribution of information across several locations, and the integration of heterogeneous systems. Replication also enables a reduction in network traffic, because data remain available locally, and it allows access to the data even in the event of a temporary network failure. This thesis is based on work carried out to develop a generic application for database replication, to be made available as open source software. The application that was built allows for data integration between various systems, with particular focus on the integration of heterogeneous data, data fragmentation, replication in cascade, data format changes between replicas, and master/slave and multi-master synchronization, and it can be adapted to a variety of situations.
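As a minimal, generic illustration of change-capture replication between a master and a replica (not the application described in the abstract), the following sketch logs changes in the master through SQLite triggers and replays them on the replica; the items and change_log tables are hypothetical.

```python
# Trigger-based change capture on the master, replayed on a replica.
import sqlite3

def setup_master(db):
    db.executescript("""
        CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT);
        CREATE TABLE change_log (
            seq INTEGER PRIMARY KEY AUTOINCREMENT,
            op TEXT, id INTEGER, value TEXT);
        CREATE TRIGGER items_ins AFTER INSERT ON items BEGIN
            INSERT INTO change_log (op, id, value) VALUES ('upsert', NEW.id, NEW.value);
        END;
        CREATE TRIGGER items_upd AFTER UPDATE ON items BEGIN
            INSERT INTO change_log (op, id, value) VALUES ('upsert', NEW.id, NEW.value);
        END;
        CREATE TRIGGER items_del AFTER DELETE ON items BEGIN
            INSERT INTO change_log (op, id, value) VALUES ('delete', OLD.id, NULL);
        END;
    """)

def replicate(master, replica, last_seq):
    """Apply all master changes newer than last_seq to the replica."""
    rows = master.execute(
        "SELECT seq, op, id, value FROM change_log WHERE seq > ? ORDER BY seq",
        (last_seq,)).fetchall()
    for seq, op, id_, value in rows:
        if op == 'upsert':
            replica.execute(
                "INSERT INTO items (id, value) VALUES (?, ?) "
                "ON CONFLICT(id) DO UPDATE SET value = excluded.value", (id_, value))
        else:
            replica.execute("DELETE FROM items WHERE id = ?", (id_,))
        last_seq = seq
    replica.commit()
    return last_seq

master, replica = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
setup_master(master)
replica.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)")
master.execute("INSERT INTO items (id, value) VALUES (1, 'a')")
master.commit()
last = replicate(master, replica, 0)
print(replica.execute("SELECT * FROM items").fetchall())  # [(1, 'a')]
```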
Abstract:
High energy efficiency and high performance are the key requirements for Internet of Things (IoT) end-nodes. Exploiting clusters of multiple programmable processors has recently emerged as a suitable solution to address this challenge. However, one of the main bottlenecks for multi-core architectures is the instruction cache. While private caches suffer from data replication and wasted area, fully shared caches lack scalability and form a bottleneck for the operating frequency. Hence we propose a hybrid solution in which a larger shared cache (L1.5) is shared by multiple cores connected through a low-latency interconnect to small private caches (L1). However, this is still limited by capacity misses when the L1 is small. Thus, we propose a sequential prefetch from L1 to L1.5 to improve performance with little area overhead. Moreover, to cut the critical path for better timing, we optimized the core instruction fetch stage with non-blocking transfers by adopting a 4 x 32-bit ring-buffer FIFO and adding a pipeline stage for conditional branches. We present a detailed comparison of the performance and energy efficiency of different instruction cache architectures recently proposed for Parallel Ultra-Low-Power clusters. On average, when executing a set of real-life IoT applications, our two-level cache improves performance by up to 20% and loses 7% energy efficiency with respect to the private cache. Compared to a shared cache system, it improves performance by up to 17% and keeps the same energy efficiency. Finally, up to 20% timing (maximum frequency) improvement and software control enable the two-level instruction cache with prefetch to adapt to various battery-powered use cases, balancing high performance and energy efficiency.
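A toy software model can convey the idea of a small private L1 backed by a larger shared L1.5 with sequential (next-line) prefetch; the cache sizes, line size and instruction trace in the sketch below are assumptions for illustration and do not model the actual hardware.

```python
# Toy two-level instruction cache model with sequential prefetch into L1.5.
from collections import OrderedDict

class Cache:
    def __init__(self, n_lines):
        self.lines = OrderedDict()      # LRU order: oldest entry first
        self.n_lines = n_lines

    def access(self, line):
        """Return True on hit; on miss, fill the line (evicting LRU if full)."""
        if line in self.lines:
            self.lines.move_to_end(line)
            return True
        self.lines[line] = True
        if len(self.lines) > self.n_lines:
            self.lines.popitem(last=False)
        return False

LINE_BYTES = 16
l1 = Cache(n_lines=32)      # small private cache of one core
l15 = Cache(n_lines=256)    # larger shared mid-level cache

def fetch(pc, prefetch=True):
    """Return (hit_in_l1, found_at_or_above_l1_5) for one instruction fetch."""
    line = pc // LINE_BYTES
    if l1.access(line):
        return True, True
    found = l15.access(line)
    if prefetch:            # sequential prefetch: pull the next line into L1.5
        l15.access(line + 1)
    return False, found

# Toy trace: a 2 KiB loop executed three times. Cross-pass reuse misses in the
# small L1, but after the first pass every line (plus its prefetched successor)
# sits in the shared L1.5, so later passes are served from there.
trace = list(range(0, 2048, 4)) * 3
hits1 = hits15 = 0
for pc in trace:
    h1, h15 = fetch(pc)
    hits1 += h1
    hits15 += h15
print(f"L1 hit rate: {hits1 / len(trace):.2f}, "
      f"served by L1/L1.5: {hits15 / len(trace):.2f}")
```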
Abstract:
In-network storage of data in wireless sensor networks helps to reduce communications inside the network and to favor data aggregation. In this paper, we consider the use of n out of m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n out of m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model aimed at evaluating the probability of correct data encoding and decoding; we exploit this model and simulations to show how, in the case studies, the parameters of the n out of m codes and the network should be configured in order to achieve correct data coding and decoding with high probability.
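The decoding-probability side of such a model can be illustrated directly: with an (n, m) code the data are recoverable from any n of the m dispersed fragments, so, assuming independent node failures with per-node survival probability p, the probability of correct decoding is the binomial tail computed in the sketch below; n, m and p are example values, not parameters from the paper.

```python
# P(decodable) = probability that at least n of m fragments survive,
# assuming independent node failures with per-node survival probability p.
from math import comb

def decode_probability(n: int, m: int, p: float) -> float:
    """P(at least n of m fragments available)."""
    return sum(comb(m, k) * p**k * (1 - p)**(m - k) for k in range(n, m + 1))

# Example: disperse data over m = 8 nodes so that any n = 5 fragments suffice.
for p in (0.80, 0.90, 0.95):
    print(f"p = {p:.2f} -> P(decodable) = {decode_probability(5, 8, p):.4f}")
```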
Abstract:
This paper describes the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s = 8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples.