937 results for Distributed data


Relevance: 30.00%

Abstract:

Recently, two approaches have been introduced that distribute the molecular fragment mining problem. The first applies a master/worker topology; the second, a completely distributed peer-to-peer system, solves the scalability problem caused by the bottleneck at the master node. However, in many real-world scenarios the participating computing nodes cannot communicate directly because of administrative policies such as security restrictions, so potential computing power remains inaccessible for accelerating the mining run. To address this shortcoming, this work introduces a hierarchical topology of computing resources that distributes the management over several levels and adapts to the natural structure of such multi-domain architectures. The most important aspect is the load balancing scheme, which has been designed and optimized for the hierarchical structure. The approach allows dynamic aggregation of heterogeneous computing resources and is applied to wide area network scenarios.
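Purely as an illustration of the general shape of hierarchical work distribution described above (not the authors' scheme; all names are hypothetical), the sketch below routes mining tasks towards the least-loaded subtree so that no single master manages every worker:

    # Illustrative sketch of hierarchical load balancing (hypothetical names,
    # not the paper's implementation).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        name: str
        children: List["Node"] = field(default_factory=list)
        local_load: int = 0  # tasks queued on this node's own workers

        def load(self) -> int:
            # Aggregate load of this subtree (own queue plus all descendants).
            return self.local_load + sum(c.load() for c in self.children)

        def submit(self, task: str) -> "Node":
            # Route the task towards the least-loaded subtree; leaves execute it.
            if not self.children:
                self.local_load += 1
                print(f"{self.name} runs {task}")
                return self
            target = min(self.children, key=lambda c: c.load())
            return target.submit(task)

    # Example: two administrative domains, each with two workers.
    root = Node("root", [
        Node("domainA", [Node("a1"), Node("a2")]),
        Node("domainB", [Node("b1"), Node("b2")]),
    ])
    for i in range(4):
        root.submit(f"fragment-batch-{i}")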

Relevance: 30.00%

Abstract:

This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p0. Next, the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright (C) 2007 John Wiley & Sons, Ltd.
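For orientation only: the frequentist benchmark alluded to above, for the two-arm case with known common variance \sigma^2, clinically relevant difference \delta, one-sided significance level \alpha and power 1-\beta, is the familiar per-arm sample size

    n = \frac{2\sigma^2 \,(z_{1-\alpha} + z_{1-\beta})^2}{\delta^2},

where z_q is the q-quantile of the standard normal distribution. The abstract states that the Bayesian answer coincides with the frequentist one only in certain circumstances involving non-informative priors; the exact conditions are given in the paper.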

Relevance: 30.00%

Abstract:

The monophyly of the Peltophorum group, one of nine informal groups recognized by Polhill in the Caesalpinieae, was tested using sequence data from the trnL-F, rbcL, and rps16 regions of the chloroplast genome. Exemplars were included from all 16 genera of the Peltophorum group, and from 15 genera representing seven of the other eight informal groups in the tribe. The data were analyzed separately and in combined analyses using parsimony and Bayesian methods. The analysis method had little effect on the topology of well-supported relationships. The molecular data recovered a generally well-supported phylogeny with many intergeneric relationships resolved. Results show that the Peltophorum group as currently delimited is polyphyletic, but that eight genera plus one undescribed genus form a core Peltophorum group, which is referred to here as the Peltophorum group sensu stricto. These genera are Bussea, Conzattia, Colvillea, Delonix, Heteroflorum (inedit.), Lemuropisum, Parkinsonia, Peltophorum, and Schizolobium. The remaining eight genera of the Peltophorum group s.l. are distributed across the Caesalpinieae. Morphological support for the redelimited Peltophorum group and the other recovered clades was assessed, and no unique synapomorphy was found for the Peltophorum group s.s. A proposal for the reclassification of the Peltophorum group s.l. is presented.

Relevance: 30.00%

Abstract:

Heterogeneity in lifetime data may be modelled by multiplying an individual's hazard by an unobserved frailty. We test for the presence of frailty of this kind in univariate and bivariate data with Weibull distributed lifetimes, using statistics based on the ordered Cox-Snell residuals from the null model of no frailty. The form of the statistics is suggested by outlier testing in the gamma distribution. We find through simulation that the sum of the k largest or k smallest order statistics, for suitably chosen k, provides a powerful test when the frailty distribution is assumed to be gamma or positive stable, respectively. We provide recommended values of k for sample sizes up to 100 and simple formulae for estimated critical values for tests at the 5% level.
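A minimal sketch of the kind of statistic described, assuming complete (uncensored) univariate data and known Weibull parameters; the actual test uses parameters fitted under the null model and the critical values tabulated in the paper:

    import numpy as np

    def coxsnell_residuals(times, shape, scale):
        # Cox-Snell residuals under a Weibull null: the cumulative hazard
        # H(t) = (t/scale)^shape, unit exponential if the null model holds.
        return (np.asarray(times, dtype=float) / scale) ** shape

    def frailty_statistic(times, shape, scale, k, largest=True):
        # Sum of the k largest ordered residuals (gamma-frailty alternative)
        # or the k smallest (positive-stable alternative).
        r = np.sort(coxsnell_residuals(times, shape, scale))
        return float(r[-k:].sum() if largest else r[:k].sum())

    # Hypothetical usage on simulated Weibull lifetimes (shape 1.5, scale 2.0).
    rng = np.random.default_rng(0)
    t = 2.0 * rng.weibull(1.5, size=50)
    print(frailty_statistic(t, shape=1.5, scale=2.0, k=5, largest=True))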

Relevance: 30.00%

Abstract:

Physiological parameters measured by an embedded body sensor system were demonstrated to respond to changes in air temperature in an office environment. The thermal parameters were monitored with a wireless sensor system that made it possible to turn any existing room into a field laboratory. Two human subjects were monitored during daily activities and at various steady-state thermal conditions as the air temperature of the room was altered from 22-23°C to 25-28°C. The subjects indicated their thermal sensation on questionnaires. The measured skin temperature was distributed close to the calculated mean skin temperature corresponding to the given activity level. The variation of the Galvanic Skin Response (GSR) reflected the evaporative heat loss through the body surfaces and indicated whether sweating occurred on the subjects. Further investigations are needed to fully evaluate the influence of thermal and other factors on the output of the investigated body sensor system.

Relevance: 30.00%

Abstract:

A wireless sensor network (WSN) is a group of sensors linked by a wireless medium to perform distributed sensing tasks. WSNs have attracted wide interest from academia and industry alike due to their diversity of applications, including home automation, smart environments, and emergency services in various buildings. The primary goal of a WSN is to collect the data sensed by its sensors. These data are characteristically noisy and exhibit temporal and spatial correlation. As this paper will demonstrate, extracting useful information from such data requires a range of analysis techniques. Data mining is a process in which a wide spectrum of data analysis methods is applied; it is used in this paper to analyse data collected from WSNs monitoring an indoor environment in a building. A case study is given to demonstrate how data mining can be used to optimise the use of office space in a building.
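As an illustration only of the kind of analysis meant (not the case study's actual method; the data and threshold below are hypothetical), a simple occupancy summary per room already supports the space-optimisation question:

    from collections import defaultdict

    # Hypothetical readings: (room, hour_of_day, occupied_flag) from WSN motion sensors.
    readings = [
        ("office-101", 9, 1), ("office-101", 10, 1), ("office-101", 14, 0),
        ("office-102", 9, 0), ("office-102", 10, 0), ("office-102", 14, 1),
    ]

    def utilisation(readings):
        # Fraction of readings per room during which the room was occupied.
        counts, occupied = defaultdict(int), defaultdict(int)
        for room, _hour, flag in readings:
            counts[room] += 1
            occupied[room] += flag
        return {room: occupied[room] / counts[room] for room in counts}

    # Rooms below a chosen utilisation threshold are candidates for reallocation.
    under_used = {r: u for r, u in utilisation(readings).items() if u < 0.5}
    print(under_used)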

Relevance: 30.00%

Abstract:

This paper analyzes the performance of Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under transmission errors. The idea of ErDCF is to use high data rate nodes to work as relays for the low data rate nodes. ErDCF achieves higher throughput and reduces energy consumption compared to IEEE 802.11 Distributed Coordination Function (DCF) in an ideal channel environment. However, there is a possibility that this expected gain may decrease in the presence of transmission errors. In this work, we modify the saturation throughput model of ErDCF to accurately reflect the impact of transmission errors under different rate combinations. It turns out that the throughput gain of ErDCF can still be maintained under reasonable link quality and distance.
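For orientation: in Bianchi-style saturation models extended for channel errors, a transmission is commonly treated as failing either by collision or by channel error (the paper's exact model may differ),

    p = 1 - (1 - p_c)\,(1 - p_e), \qquad p_e = 1 - (1 - \mathrm{BER})^{L},

where p_c is the conditional collision probability, p_e the packet error probability of an L-bit frame at a given bit error rate, and p the overall failure probability that drives the backoff process.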

Relevance: 30.00%

Abstract:

In this paper we evaluate the performance of our earlier proposed enhanced relay-enabled distributed coordination function (ErDCF) for wireless ad hoc networks. The idea of ErDCF is to use high data rate nodes to work as relays for the low data rate nodes. ErDCF achieves higher throughput and reduced energy consumption compared to IEEE 802.11 distributed coordination function (DCF). This is a result of: 1) using a relay, which helps to increase the throughput and lower the overall blocking time of nodes due to faster dual-hop transmission; 2) using a dynamic preamble (i.e. a short preamble for the relay transmission), which further increases the throughput and lowers the overall blocking time; and 3) reducing unnecessary overhearing by other nodes not involved in the transmission. We evaluate the throughput and energy performance of ErDCF with different rate combinations. ErDCF (11,11) (i.e. R1 = R2 = 11 Mbps) yields a throughput improvement of 92.9% (at a packet length of 1000 bytes) and an energy saving of 72.2% at 50 nodes.
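A back-of-the-envelope illustration of why dual-hop relaying pays off (ignoring MAC and PHY overheads, which the paper's model accounts for; the 2 Mbps direct rate is an assumed example): sending an L-bit payload directly at a low rate R_d takes L/R_d, whereas relaying it over two fast hops at R_1 and R_2 takes L/R_1 + L/R_2. For L = 8000 bits (1000 bytes), R_d = 2 Mbps and R_1 = R_2 = 11 Mbps,

    \frac{L}{R_d} = 4\ \mathrm{ms} \qquad \text{vs.} \qquad \frac{L}{R_1} + \frac{L}{R_2} \approx 1.45\ \mathrm{ms},

so the two-hop path frees the channel far sooner, which is the source of the throughput and blocking-time gains quoted above.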

Relevance: 30.00%

Abstract:

This paper analyzes the performance of enhanced relay-enabled distributed coordination function (ErDCF) for wireless ad hoc networks under transmission errors. The idea of ErDCF is to use high data rate nodes to work as relays for the low data rate nodes. ErDCF achieves higher throughput and reduces energy consumption compared to IEEE 802.11 distributed coordination function (DCF) in an ideal channel environment. However, there is a possibility that this expected gain may decrease in the presence of transmission errors. In this work, we modify the saturation throughput model of ErDCF to accurately reflect the impact of transmission errors under different rate combinations. It turns out that the throughput gain of ErDCF can still be maintained under reasonable link quality and distance.

Relevance: 30.00%

Abstract:

This paper analyzes the delay performance of Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under ideal conditions and in the presence of transmission errors. Relays are nodes capable of supporting high data rates for other, low data rate nodes. In an ideal channel, ErDCF achieves higher throughput and reduced energy consumption compared to IEEE 802.11 Distributed Coordination Function (DCF), and this gain is still maintained in the presence of errors. Relays are also expected to reduce the delay; however, the delay behaviour of ErDCF under transmission errors has not been characterised. In this work we present the impact of transmission errors on delay. It turns out that when transmission errors are severe enough to increase the number of dropped packets, the packet delay is reduced. This is due to an increase in the probability of failure: as a result the packet drop time increases, reflecting the throughput degradation.

Relevance: 30.00%

Abstract:

We report on a distributed moisture detection scheme which uses a cable design based on water-swellable hydrogel polymers. The cable modulates the loss characteristic of light guided within a multi-mode optical fibre in response to relative water potentials in the surrounding environment. Interrogation of the cable using conventional optical time-domain reflectometry (OTDR) instruments allows water ingress points to be identified and located with a spatial resolution of 50 cm. The system has been tested in a simulated tendon duct grouting experiment as a means of mapping the extent of fill along the duct during the grouting process. Voided regions were detected and identified to within 50 cm. A series of salt solutions has been used to determine the sensor behaviour over a range of water potentials. These experiments predict that measurements of soil moisture content can be made over the range 0 to −1500 kPa. Preliminary data on soil measurements have shown that the sensor can detect water pressure changes with a resolution of 45 kPa. Applications for the sensor include quality assurance of grouting procedures, verification of waterproofing barriers and soil moisture content determination (for load-bearing calculations).
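For orientation only, the sketch below shows how an OTDR event time maps to a position along the fibre; the 50 cm resolution quoted above is a property of the instrument and cable, and the group index used here is a typical assumed value, not one from the paper:

    C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

    def otdr_event_distance(round_trip_time_s: float, group_index: float = 1.468) -> float:
        # An OTDR measures the round-trip time of backscattered light, so the
        # one-way distance to a loss event is c * t / (2 * n_group).
        return C_VACUUM * round_trip_time_s / (2.0 * group_index)

    # Hypothetical example: a loss event seen 1.0 microsecond after launch
    # sits roughly 102 m along the fibre.
    print(f"{otdr_event_distance(1.0e-6):.1f} m")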

Relevance: 30.00%

Abstract:

Construction planning plays a fundamental role in construction project management, requiring teamwork among planners from a diverse range of disciplines who are often geographically dispersed. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork due to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution to enable a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team working under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning and yields a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.

Relevance: 30.00%

Abstract:

Numerical weather prediction (NWP) centres use numerical models of the atmospheric flow to forecast future weather states from an estimate of the current state. Variational data assimilation (VAR) is commonly used to determine an optimal state estimate that minimizes the errors between observations of the dynamical system and model predictions of the flow. The rate of convergence of the VAR scheme and the sensitivity of the solution to errors in the data depend on the condition number of the Hessian of the variational least-squares objective function. The traditional formulation of VAR is ill-conditioned and hence leads to slow convergence and an inaccurate solution. In practice, operational NWP centres precondition the system via a control variable transform to reduce the condition number of the Hessian. In this paper we investigate the conditioning of VAR for a single, periodic, spatially distributed state variable. We present theoretical bounds on the condition number of the original and preconditioned Hessians and hence demonstrate the improvement produced by the preconditioning. We also investigate theoretically the effect of observation position and error variance on the preconditioned system and show that the problem becomes more ill-conditioned with increasingly dense and accurate observations. Finally, we confirm the theoretical results in an operational setting by giving experimental results from the Met Office variational system.
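For reference, a sketch of the standard variational formulation behind this analysis (linear observation operator assumed here): the state estimate minimizes

    J(x) = \tfrac{1}{2}(x - x_b)^{T} B^{-1} (x - x_b) + \tfrac{1}{2}(y - Hx)^{T} R^{-1} (y - Hx),

whose Hessian is S = B^{-1} + H^{T} R^{-1} H, with B and R the background- and observation-error covariances and H the observation operator. The control variable transform x - x_b = B^{1/2} v converts the Hessian to

    \hat{S} = I + B^{T/2} H^{T} R^{-1} H B^{1/2},

whose condition number is typically far smaller than that of S; the bounds in the paper quantify this for the single periodic state variable considered.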

Relevance: 30.00%

Abstract:

A new generation of advanced surveillance systems is being conceived as a collection of multi-sensor components, such as video, audio and mobile robots, interacting cooperatively to enhance situation awareness and assist surveillance personnel. The prominent issues these systems face are: the improvement of existing intelligent video surveillance systems, the inclusion of wireless networks, the use of low power sensors, the design architecture, the communication between different components, the fusion of data emerging from different types of sensors, the location of personnel (providers and consumers), and the scalability of the system. This paper focuses on the aspects pertaining to real-time distributed architecture and scalability. For example, to meet real-time requirements these systems need to process data streams in concurrent environments designed with scheduling and synchronisation in mind. The paper proposes a framework for the design of visual surveillance systems based on components derived from the principles of Real Time Networks/Data Oriented Requirements Implementation Scheme (RTN/DORIS). It also proposes the implementation of these components using the well-known middleware technology Common Object Request Broker Architecture (CORBA). Results using this architecture for video surveillance are presented through an implemented prototype.