822 results for Data transmission systems.


Relevance: 40.00%

Abstract:

The availability of critical services and their data can be significantly increased by replicating them on multiple systems connected to each other, even in the face of system and network failures. In some platforms, such as peer-to-peer (P2P) systems, the inherent characteristics of the platform mandate some form of replication to provide acceptable service to users. However, how best to replicate data to build highly available peer-to-peer systems remains an open problem. In this paper, we propose an approach to address the data replication problem in P2P systems. The proposed scheme is compared with other techniques and is shown to require less communication cost per operation as well as to provide a higher degree of data availability.
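
As a minimal illustration of the trade-off the abstract refers to, the sketch below computes the availability of an object replicated on n nodes under a majority-quorum rule, given a per-replica availability p. The quorum rule and the values of n and p are assumptions for illustration; the paper's actual scheme is not described in this abstract.

```python
from math import comb

def majority_availability(n: int, p: float) -> float:
    """Probability that at least a majority of n replicas are reachable,
    given each replica is independently available with probability p."""
    q = n // 2 + 1  # majority quorum size
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(q, n + 1))

# A single copy is available with probability p; replication raises this
# at the cost of contacting a quorum of replicas per operation.
for n in (1, 3, 5, 7):
    print(f"n={n}: quorum size={n // 2 + 1}, "
          f"availability={majority_availability(n, 0.9):.4f}")
```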

Relevance: 40.00%

Abstract:

Mobile computing has enabled users to seamlessly access databases even when they are on the move. Mobile computing environments require data management approaches that can provide complete and highly available access to shared data at any time, from anywhere. In this paper, we propose a novel replicated data protocol for achieving this goal. The proposed scheme replicates data synchronously over stationary sites based on a three-dimensional grid structure, while objects at mobile sites are replicated asynchronously based on the sites most commonly visited by each user. This combination allows the proposed protocol to operate with less than full connectivity, to adapt easily to changes in group membership, and to avoid requiring all sites to agree to update data objects at any given time, giving the technique flexibility in mobile environments. The proposed replication technique is compared with a baseline replication technique and shown to exhibit high availability, fault tolerance, and minimal access times for data and services, which are very important in an environment with low-quality communication links.
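
The abstract does not give the quorum construction, so the following is only one plausible way to use a three-dimensional grid for synchronous replication, not necessarily the authors' protocol: sites occupy an n×n×n logical grid, a read quorum is any full column along the z-axis, and a write quorum is a full z-plane plus such a column, which guarantees that reads intersect writes and writes intersect each other.

```python
from itertools import product

def read_quorum(n, x, y):
    """A read quorum: the full column through (x, y) along the z-axis."""
    return {(x, y, z) for z in range(n)}

def write_quorum(n, k, x, y):
    """A write quorum: the plane z = k plus the column through (x, y)."""
    return {(i, j, k) for i, j in product(range(n), repeat=2)} | read_quorum(n, x, y)

n = 3
r = read_quorum(n, 0, 0)
w1 = write_quorum(n, 1, 2, 2)
w2 = write_quorum(n, 2, 0, 1)
print(len(r), len(w1))              # 3 vs n*n + n - 1 = 11 sites contacted
print(bool(r & w1), bool(w1 & w2))  # True True: quorums always intersect
```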

Relevance: 40.00%

Abstract:

Determining the provenance of data, i.e. the process that led to that data, is vital in many disciplines. For example, in science, the process that produced a given result must be demonstrably rigorous for the result to be deemed reliable. A provenance system supports applications in recording adequate documentation about process executions to answer queries regarding provenance, and provides functionality to perform those queries. Several provenance systems are being developed, but all focus on systems in which the components are reactive, for example Web Services that act on the basis of a request, job submission systems, etc. This limitation means that questions regarding the motives of autonomous actors, or agents, in such systems remain unanswerable in the general case. Such questions include: who was ultimately responsible for a given effect, what was their reason for initiating the process, and does the effect of a process match what those initiating the process intended? In this paper, we address this limitation by integrating two solutions: a generic, re-usable framework for representing the provenance of data in service-oriented architectures, and a model for describing the goal-oriented delegation and engagement of agents in multi-agent systems. Using these solutions, we present algorithms to answer common questions regarding the responsibility for and success of a process, and evaluate the approach with a simulated healthcare example.
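
As a sketch of the kind of responsibility query described here, the snippet below models process documentation as a graph of causal and delegation edges and walks it backwards from an effect to the agent that ultimately initiated it. The records and the healthcare names are invented for illustration; this is not the paper's framework.

```python
# Hypothetical provenance records: each entry says which actor or process
# caused a given item, and on whose behalf (delegation) an actor acted.
caused_by = {
    "prescription": "doctor_decision",
    "doctor_decision": "triage_request",
    "triage_request": "nurse_agent",
}
delegated_by = {"nurse_agent": "patient"}  # nurse acted on the patient's behalf

def ultimately_responsible(effect: str) -> str:
    """Walk causal edges back from an effect, then delegation edges,
    to find the agent ultimately responsible for it."""
    node = effect
    while node in caused_by:
        node = caused_by[node]
    while node in delegated_by:
        node = delegated_by[node]
    return node

print(ultimately_responsible("prescription"))  # patient
```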

Relevance: 40.00%

Abstract:

With the continually evolving social nature of information systems research, there is a need to identify different “modes of analysis” (Myers, 1997) to deepen our understanding of the complex, messy, and often chaotic nature of human factors. One suggested mode of analysis is that of social dramas, a tool developed in the discipline of anthropology by Victor Turner. The social drama approach also draws on the work of Goffman (1959; 1997) and enables the researcher to investigate events from the front stage, reporting the obvious issues in systems implementation, and from the back stage, identifying the hidden aspects of systems implementation and the underpinning discourses. A case study exploring the social dramas involved in systems selection and implementation is provided to support the use of this methodological tool.

Relevance: 40.00%

Abstract:

The widespread adoption of cluster computing as a high-performance computing platform has seen the growth of data-intensive scientific, engineering, and commercial applications such as digital libraries, climate modeling, computational chemistry, computational fluid dynamics, and image repositories. However, I/O subsystem performance has not been keeping pace with processor and memory performance, and is fast becoming the dominant factor in overall system performance. Thus, parallel I/O has become a necessity in the face of performance improvements in other areas of computing systems. This paper addresses the problem of parallel I/O scheduling on cluster computing systems in the presence of data replication. We propose two new I/O scheduling algorithms and evaluate the relative performance of the proposed policies against two existing approaches. Simulation results show that the proposed policies perform substantially better than the baseline policies.
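
The proposed algorithms are not described in the abstract, but a baseline of the kind they would be compared against can be sketched: schedule each I/O request to the least-loaded server that holds a replica of the requested block. The replica placement and request stream below are illustrative.

```python
# Replica placement: block -> servers holding a copy (illustrative data).
replicas = {"b1": [0, 1], "b2": [1, 2], "b3": [0, 2], "b4": [2, 3]}
load = [0.0, 0.0, 0.0, 0.0]  # outstanding service time per server

def schedule(block: str, service_time: float) -> int:
    """Shortest-queue replica selection: send the request to the
    least-loaded server that stores a replica of the block."""
    server = min(replicas[block], key=lambda s: load[s])
    load[server] += service_time
    return server

for blk in ["b1", "b2", "b1", "b4", "b3"]:
    print(blk, "-> server", schedule(blk, 1.0))
print("final loads:", load)
```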

Relevance: 40.00%

Abstract:

In data-intensive distributed systems, replication is the most widely used approach for offering high data availability, low bandwidth consumption, increased fault tolerance, and improved scalability of the overall system. Replication-based systems implement replica control protocols that enforce a specified semantics for accessing the data. Performance also depends on a number of factors, chief among which is the protocol used to maintain consistency among object replicas. In this paper, we propose a new low-cost, high-availability protocol, called the box-shaped grid structure, for maintaining the consistency of replicated data in networked distributed computing systems. We show that the proposed protocol provides high data availability, low communication costs, and increased fault tolerance compared with baseline replica control protocols.
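
The abstract does not specify the box-shaped structure, but the communication-cost advantage that grid-structured protocols typically claim over majority voting can be illustrated with the classical grid protocol, whose write quorum touches about 2√N−1 replicas instead of more than N/2; the grid formula below stands in for the proposed protocol as an assumption.

```python
from math import ceil, sqrt

def majority_cost(n: int) -> int:
    """Replicas contacted per operation under majority voting."""
    return n // 2 + 1

def grid_cost(n: int) -> int:
    """Replicas contacted for a write in a classical sqrt(n) x sqrt(n)
    grid protocol: one full column plus one element from each other column."""
    side = ceil(sqrt(n))
    return 2 * side - 1

for n in (9, 25, 100):
    print(f"N={n}: majority={majority_cost(n)}, grid={grid_cost(n)}")
```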

Relevance: 40.00%

Abstract:

Researchers are working to identify and promote environment and policy initiatives that encourage more active and healthy communities. Measuring environmental attributes by objective means can verify which physical environment factors are most important. We describe how Geographic Information Systems (GIS) may be used to objectively measure the features of the built environment that may influence walking. We show how four key attributes currently believed to be of most relevance to walking for transport may be used to create a ‘walkability’ index. These are: dwelling density (higher-density neighbourhoods support greater retail and service variety, resulting in shorter, walkable distances between facilities, while driving and parking are more difficult); street connectivity (higher intersection density provides people with a greater choice of potential routes, easier access to major roads where public transport is available, and shorter times to reach destinations); land-use mix (the more varied the land-use mix and built form, the more conducive it is to walk to various destinations); and net retail area (people who live near multiple and diverse retail opportunities can make more frequent and shorter shopping trips by walking, and can walk to more local employment opportunities). The potential relationships between each of the objective environmental-attribute measures and walking behaviours are discussed, together with suggestions as to how such measures might be used to guide community infrastructure planning. GIS mapping can assist decision makers in choosing where to focus transportation investments and where to guide future growth. Readily accessible GIS data can be used to guide and support urban planning and infrastructure investment decisions in both the private and public sectors, to increase walking in communities.
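
One common way to combine such attributes into a single index is to standardise each measure across neighbourhoods and sum the z-scores; the equal weighting and the attribute values below are assumptions for illustration, not the paper's exact construction.

```python
import statistics

# Illustrative values for three neighbourhoods, one row per attribute:
# dwelling density (dwellings/ha), street connectivity (intersections/km^2),
# land-use mix (entropy, 0-1), net retail area (m^2/ha).
attributes = {
    "density":      [35.0, 12.0, 55.0],
    "connectivity": [80.0, 40.0, 95.0],
    "land_use_mix": [0.7, 0.3, 0.8],
    "retail_area":  [120.0, 20.0, 200.0],
}

def z_scores(values):
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# Walkability index: equal-weight sum of standardised attribute scores.
z = {name: z_scores(vals) for name, vals in attributes.items()}
walkability = [sum(z[name][i] for name in attributes) for i in range(3)]
print([round(w, 2) for w in walkability])
```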

Relevance: 40.00%

Abstract:

Multisensor data fusion has attracted a great deal of research in recent years. It has been widely used in many applications, especially military applications such as target tracking and identification. In this paper, we address the multisensor data fusion problem for systems subject to the possibility of missing measurements. We present the optimal recursive fusion filter for measurements obtained from two sensors subject to random intermittent measurement loss. The noise covariance in the observation process is allowed to be singular, which requires the use of the generalized inverse. An illustrative example shows the effectiveness of the proposed filter in the case of measurement loss, compared with the available optimal linear fusion methods.
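
A single-step sketch of the idea, not the paper's recursive filter: stack both sensors' observations, drop the rows whose measurements were lost, and fuse the rest by weighted least squares, where the Moore-Penrose pseudoinverse (np.linalg.pinv) stands in for the generalized inverse so that a singular noise covariance is handled. The model matrices and loss pattern are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, -2.0])                  # true state (illustrative)
H = np.vstack([np.eye(2), np.eye(2)])      # two sensors, each observing x directly
R = np.diag([0.1, 0.1, 0.0, 0.2])          # noise covariance; third entry singular

gamma = np.array([1, 1, 1, 0])             # arrival indicators: last row lost
z = H @ x + rng.multivariate_normal(np.zeros(4), R)

# Keep only the rows that actually arrived, then fuse by weighted least
# squares; np.linalg.pinv (the generalized inverse) tolerates singular R.
Hk, zk = H[gamma == 1], z[gamma == 1]
Rk = R[np.ix_(gamma == 1, gamma == 1)]
W = np.linalg.pinv(Rk)
x_hat = np.linalg.pinv(Hk.T @ W @ Hk) @ Hk.T @ W @ zk
print(x_hat)
```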

Relevance: 40.00%

Abstract:

In this paper, we provide the optimal data fusion filter for linear systems subject to possible missing measurements. The noise covariance in the observation process is allowed to be singular, which requires the use of the generalized inverse. Data fusion is performed on the raw data provided by two sensors observing the same entity, with each sensor losing measurements at its own data-loss rate. The data fusion filter is given in recursive form for ease of implementation in real-world applications.
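
A minimal recursive sketch in the spirit of this abstract: a Kalman-style filter that, at each step, updates with a sensor's measurement only when that sensor's Bernoulli arrival indicator fires, with a separate loss rate per sensor. The scalar state model, noise levels, and rates are assumptions, not the paper's exact filter.

```python
import numpy as np

rng = np.random.default_rng(1)
F, Q = np.array([[1.0]]), np.array([[0.01]])             # random-walk state model
H1, R1, p1 = np.array([[1.0]]), np.array([[0.2]]), 0.9   # sensor 1, arrival rate 0.9
H2, R2, p2 = np.array([[1.0]]), np.array([[0.4]]), 0.7   # sensor 2, arrival rate 0.7

x_hat, P = np.array([0.0]), np.array([[1.0]])
x_true = 0.0
for k in range(50):
    x_true += rng.normal(0, np.sqrt(Q[0, 0]))
    # Predict.
    x_hat, P = F @ x_hat, F @ P @ F.T + Q
    # Sequentially update with each sensor whose measurement arrived.
    for H, R, p in ((H1, R1, p1), (H2, R2, p2)):
        if rng.random() < p:                 # Bernoulli data-loss model
            z = H @ np.array([x_true]) + rng.normal(0, np.sqrt(R[0, 0]))
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x_hat = x_hat + K @ (z - H @ x_hat)
            P = (np.eye(1) - K @ H) @ P

print("estimate:", float(x_hat[0]), "truth:", x_true)
```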

Relevance: 40.00%

Abstract:

Long Term Evolution (LTE) is designed for high data rates, higher spectral efficiency, and lower latency, as well as high-capacity voice support. LTE uses the single-carrier frequency division multiple access (SC-FDMA) scheme for uplink transmission and orthogonal frequency division multiple access (OFDMA) in the downlink. Among the most important challenges for a terminal implementation are channel estimation (CE) and equalization. In this paper, a minimum mean square error (MMSE) based channel estimator is proposed for OFDMA systems that avoids the ill-conditioned least squares (LS) problem with lower computational complexity. This channel estimation technique uses knowledge of the channel properties to estimate the unknown channel transfer function at non-pilot subcarriers.
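
A standard form of such an estimator can be sketched as follows: take least-squares estimates at the pilot subcarriers, then apply MMSE interpolation to all subcarriers using the channel's frequency correlation; the regularising noise term is what avoids the ill-conditioning of the plain LS problem. The exponential correlation model, pilot spacing, and SNR below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, pilots, snr_db = 64, np.arange(0, 64, 8), 20   # subcarriers, pilot grid, SNR
sigma2 = 10 ** (-snr_db / 10)

# Assumed channel frequency correlation (exponential model, illustrative).
k = np.arange(N)
R_hh = np.exp(-np.abs(k[:, None] - k[None, :]) / 8.0)     # N x N correlation
h = np.linalg.cholesky(R_hh + 1e-9 * np.eye(N)) @ (
    rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

# LS estimates at the pilots (pilot symbols assumed to be 1).
h_ls = h[pilots] + np.sqrt(sigma2 / 2) * (
    rng.normal(size=len(pilots)) + 1j * rng.normal(size=len(pilots)))

# MMSE interpolation to all subcarriers; the sigma2 term regularises the
# inversion that would otherwise be ill-conditioned.
R_hp = R_hh[:, pilots]                                    # cross-correlation
R_pp = R_hh[np.ix_(pilots, pilots)]
h_mmse = R_hp @ np.linalg.solve(R_pp + sigma2 * np.eye(len(pilots)), h_ls)
print("MSE:", np.mean(np.abs(h - h_mmse) ** 2))
```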

Relevance: 40.00%

Abstract:

Nearly all drinking water distribution systems experience a "natural" reduction of disinfection residuals. The most frequently used disinfectant is chlorine, which can decay due to reactions with organic and inorganic compounds in the water and by liquid/solids reaction with the biofilm, pipe walls and sediments. Usually levels of 0.2-0.5 mg/L of free chlorine are required at the point of consumption to maintain bacteriological safety. Higher concentrations are not desirable as they present the problems of taste and odour and increase formation of disinfection by-products. It is usually a considerable concern for the operators of drinking water distribution systems to manage chlorine residuals at the "optimum level", considering all these issues. This paper describes how the chlorine profile in a drinking water distribution system can be modelled and optimised on the basis of readily and inexpensively available laboratory data. Methods are presented for deriving the laboratory data, fitting a chlorine decay model of bulk water to the data and applying the model, in conjunction with a simplified hydraulic model, to obtain the chlorine profile in a distribution system at steady flow conditions. Two case studies are used to demonstrate the utility of the technique. Melbourne's Greenvale-Sydenham distribution system is unfiltered and uses chlorination as its only treatment. The chlorine model developed from laboratory data was applied to the whole system and the chlorine profile was shown to be accurately simulated. Biofilm was not found to critically affect chlorine decay. In the other case study, Sydney Water's Nepean system was modelled from limited hydraulic data. Chlorine decay and trihalomethane (THM) formation in raw and treated water were measured in a laboratory, and a chlorine decay and THM model was derived on the basis of these data. Simulated chlorine and THM profiles agree well with the measured values available. Various applications of this modelling approach are also briefly discussed.
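
The simplest bulk-decay model of the kind fitted here is first-order, C(t) = C0·exp(−kt): k is estimated from laboratory bottle-test data, and the hydraulic model supplies the travel time to each node. The numbers below are invented for illustration.

```python
import numpy as np

# Laboratory bottle-test data (illustrative): time in hours, free chlorine mg/L.
t = np.array([0.0, 2.0, 6.0, 12.0, 24.0])
c = np.array([1.00, 0.88, 0.68, 0.47, 0.22])

# First-order bulk decay C(t) = C0 * exp(-k t): fit k by log-linear regression.
slope, log_c0 = np.polyfit(t, np.log(c), 1)
k = -slope
print(f"decay coefficient k = {k:.4f} /h")

# Apply to nodes of a distribution system given travel times from the
# hydraulic model (hours), starting from the dose fitted at the plant.
travel_times = {"node_A": 3.0, "node_B": 10.0, "node_C": 30.0}
for node, tau in travel_times.items():
    print(node, f"{np.exp(log_c0) * np.exp(-k * tau):.2f} mg/L")
```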

Relevance: 40.00%

Abstract:

In this paper we address the problem of securing networked RFID applications. We develop and present an RFID security protocol that allows mutual authentication between the reader and tag, as well as secure communication of tag data. The protocol uses a hybrid method to provide strong security while keeping resource requirements low. To this end, it employs a mix of simple one-way hashing and low-cost bitwise operations. Our protocol ensures the confidentiality and integrity of all data being communicated, and allows reliable mutual authentication between tags and readers. The protocol is also resistant to a large number of common attacks.
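
The message flow is not spelled out in this abstract, so the sketch below is a generic challenge-response mutual authentication in the same spirit (shared secret, one-way hashing, cheap bitwise masking of tag data), not the paper's exact protocol.

```python
import hashlib, os

def h(*parts: bytes) -> bytes:
    """One-way hash over concatenated message parts."""
    return hashlib.sha256(b"".join(parts)).digest()

K = os.urandom(16)                    # secret shared by tag and reader

# Reader -> tag: challenge nonce. Tag -> reader: its own nonce + keyed response.
n_r = os.urandom(8)
n_t = os.urandom(8)
tag_resp = h(K, n_r, n_t)

# Reader authenticates the tag by recomputing the response, then proves
# knowledge of K back to the tag with the nonces swapped.
assert tag_resp == h(K, n_r, n_t)     # reader's check of the tag
reader_resp = h(K, n_t, n_r)
assert reader_resp == h(K, n_t, n_r)  # tag's check of the reader

# Tag data is masked with a cheap bitwise operation keyed to the session.
data = b"EPC-0001"
pad = h(K, n_r, n_t, b"data")[:len(data)]
cipher = bytes(d ^ p for d, p in zip(data, pad))
print(cipher.hex())
```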

Relevance: 40.00%

Abstract:

RFID (Radio Frequency Identification) technology can be employed to track and detect each container, pallet, case, and product uniquely in the supply chain. It connects the supply chain stakeholders (i.e., suppliers, manufacturers, wholesalers/distributors, retailers, and customers) and allows them to exchange data and product information. Despite these potential benefits, security issues are the key factor in the deployment of an RFID-enabled system in the global supply chain. This paper proposes a hybrid approach to securing RFID transmission in Supply Chain Management (SCM) systems using a modified Wired Equivalent Privacy (WEP) scheme and the Rivest-Shamir-Adleman (RSA) cryptosystem. The proposed system also addresses the common loophole of the WEP key algorithm, making it more secure compared with the existing modified WEP key process.
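
A sketch of the hybrid idea under toy parameters: the symmetric session key is wrapped with RSA, and the payload is encrypted with the RC4 stream cipher on which WEP is built. The paper's modifications to WEP are not described here, and the RSA modulus below is far too small for real use; this only shows the structure.

```python
# Toy RSA key wrap + RC4-style stream encryption (illustrative only:
# the modulus is tiny and there is no padding).
def rc4(key: bytes, data: bytes) -> bytes:
    S = list(range(256))
    j = 0
    for i in range(256):                       # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for byte in data:                          # keystream generation + XOR
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# Tiny RSA keypair: n = p*q, e public, d private (demo numbers only).
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)

session_key = b"\x2a\x07\x5c\x19"              # symmetric (WEP-style) key
wrapped = [pow(b, e, n) for b in session_key]  # RSA-encrypt each key byte
ciphertext = rc4(session_key, b"pallet 42: 96 units")

# Receiver: unwrap the session key with the RSA private exponent, decrypt.
recovered = bytes(pow(c, d, n) for c in wrapped)
print(rc4(recovered, ciphertext))
```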