876 results for Distributed data


Relevance: 30.00%

Abstract:

This paper deals with the convergence of a remote iterative learning control system subject to data dropouts. The system is composed of a set of discrete-time multiple-input multiple-output linear models, each with its corresponding actuator device and sensor. Each actuator applies the input signal vector to its corresponding model at the sampling instants, and the sensor measures the output signal vector. The iterative learning law is processed in a controller located far away from the models, so the control signal vector has to be transmitted from the controller to the actuators through transmission channels. The law uses the measurements of each model to generate the input vector to be applied to the subsequent model, so the measurements of the models have to be transmitted from the sensors to the controller. All transmissions are subject to failures, which are described as a binary sequence taking the value 1 or 0. A dropout compensation technique is used to replace the data lost in the transmission processes. Convergence to zero of the error between the output signal vector and a reference one is achieved as the number of models tends to infinity.
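
As a rough illustration of the setting, not the paper's algorithm, the sketch below runs a P-type iterative learning update on a scalar stand-in plant, with Bernoulli transmission dropouts compensated by holding the last successfully received error; the plant gain, learning gain and dropout probability are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def plant(u, g=0.8):
    """Scalar stand-in for the discrete-time MIMO models in the paper."""
    return g * u

def ilc_with_dropouts(y_ref, iterations=50, gamma=0.5, p_drop=0.3):
    """P-type learning update u_{k+1} = u_k + gamma * e_k, where measurements lost
    in transmission (Bernoulli dropouts) are replaced by the last received values."""
    n = len(y_ref)
    u = np.zeros(n)
    e_held = np.zeros(n)          # last error successfully received by the controller
    history = []
    for _ in range(iterations):
        e = y_ref - plant(u)
        received = rng.random(n) > p_drop        # True where transmission succeeded
        e_held = np.where(received, e, e_held)   # dropout compensation: hold previous data
        u = u + gamma * e_held                   # learning law applied to compensated error
        history.append(np.max(np.abs(e)))
    return history

errors = ilc_with_dropouts(np.sin(np.linspace(0, 2 * np.pi, 100)))
print(f"max |error|: first iteration {errors[0]:.3f}, last iteration {errors[-1]:.3f}")
```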

Relevance: 30.00%

Abstract:

Background: Jumping to conclusions (JTC) is associated with psychotic disorder and psychotic symptoms. If JTC represents a trait, its rate should (i) be increased in people with elevated levels of psychosis proneness, such as individuals diagnosed with borderline personality disorder (BPD), and (ii) show a degree of stability over time. Methods: The JTC rate was examined in three groups: patients with first-episode psychosis (FEP), BPD patients and controls, using the Beads Task. The PANSS, SIS-R and CAPE scales were used to assess positive psychotic symptoms. Four WAIS III subtests were used to assess IQ. Results: A total of 61 FEP, 26 BPD and 150 controls were evaluated; 29 FEP were re-evaluated after one year. A JTC reasoning bias was displayed by 44% of FEP (OR = 8.4, 95% CI: 3.9-17.9), versus 19% of BPD (OR = 2.5, 95% CI: 0.8-7.8) and 9% of controls. JTC was not associated with the level of psychotic symptoms or, specifically, delusionality across the different groups. Differences between FEP and controls were independent of sex, educational level, cannabis use and IQ. After one year, 47.8% of FEP with JTC at baseline again displayed JTC. Conclusions: JTC in part reflects trait vulnerability to develop disorders with expression of psychotic symptoms.

Relevance: 30.00%

Abstract:

Sand seatrout (Cynoscion arenarius) and silver seatrout (C. nothus) are both found within the immediate offshore areas of the Gulf of Mexico, especially around Texas; however, information is limited on how much distributional overlap actually occurs between these species. To investigate spatial and seasonal differences between the species, we analyzed twenty years of bay and offshore trawl data collected by biologists of the Coastal Fisheries Division, Texas Parks and Wildlife Department. Sand seatrout and silver seatrout were distributed differently among offshore sampling areas, and salinity and water depth appeared to correlate with their distribution. Additionally, within the northernmost sampling area of the gulf waters, water depth correlated significantly with the presence of silver seatrout, which were found at greater depths than sand seatrout. There was also an overall significant decrease in silver seatrout abundance during the summer season, when temperatures were at their highest; this decrease may indicate a migration farther offshore. Sand seatrout abundance had an inverse relationship with salinity and water depth offshore. In addition, sand seatrout abundance was highest in bays with direct passes to the gulf and correlated with corresponding abundance in offshore areas. These data highlight the seasonal and spatial differences in abundance between sand and silver seatrout and relate these differences to the hydrological and geological features found along the Texas coastline.

Relevance: 30.00%

Abstract:

In the problem of one-class classification (OCC), one of the classes, the target class, has to be distinguished from all other possible objects, considered as non-targets. This situation arises in many biomedical problems, for example in diagnosis, image-based tumor recognition, or the analysis of electrocardiogram data. In this paper, an approach to OCC based on a typicality test is experimentally compared with reference state-of-the-art OCC techniques (Gaussian, mixture of Gaussians, naive Parzen, Parzen, and support vector data description) using biomedical data sets. We evaluate the ability of the procedures using twelve experimental data sets with not necessarily continuous data. As there are few benchmark data sets for one-class classification, all data sets considered in the evaluation have multiple classes. Each class in turn is considered as the target class, and the units in the other classes are treated as new units to be classified. The results of the comparison show the good performance of the typicality approach, which is applicable to high-dimensional data; it is worth noting that it can be used for any kind of data (continuous, discrete, or nominal), whereas applying the state-of-the-art approaches is not straightforward when nominal variables are present.
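
To make the evaluation protocol concrete (each class used in turn as the target class, with the remaining classes treated as new units), here is a minimal sketch using a plain Gaussian one-class baseline, one of the reference methods listed, rather than the typicality test itself; the iris data and the 95% acceptance threshold are arbitrary stand-ins.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import EmpiricalCovariance
from sklearn.datasets import load_iris

def gaussian_occ_accepts(X_target, X_new, level=0.95):
    """Accept a new unit as 'target' when its squared Mahalanobis distance to the
    target class falls below the corresponding chi-square quantile."""
    cov = EmpiricalCovariance().fit(X_target)
    return cov.mahalanobis(X_new) < chi2.ppf(level, df=X_target.shape[1])

X, y = load_iris(return_X_y=True)
for target in np.unique(y):
    X_tgt, X_rest = X[y == target], X[y != target]
    tpr = gaussian_occ_accepts(X_tgt, X_tgt).mean()   # fraction of targets accepted
    fpr = gaussian_occ_accepts(X_tgt, X_rest).mean()  # fraction of non-targets accepted
    print(f"target class {target}: target acceptance {tpr:.2f}, non-target acceptance {fpr:.2f}")
```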

Relevance: 30.00%

Abstract:

The stone marten is a widely distributed mustelid in the Palaearctic region that exhibits variable habitat preferences in different parts of its range. The species is a Holocene immigrant from southwest Asia which, according to fossil remains, followed the expansion of the Neolithic farming cultures into Europe and possibly colonized the Iberian Peninsula during the Early Neolithic (ca. 7,000 years BP). However, the population genetic structure and historical biogeography of this generalist carnivore remain essentially unknown. In this study we combined mitochondrial DNA (mtDNA) sequencing (621 bp) and microsatellite genotyping (23 polymorphic markers) to infer the population genetic structure of the stone marten within the Iberian Peninsula. The mtDNA data revealed low haplotype and nucleotide diversities and a lack of phylogeographic structure, most likely due to a recent colonization of the Iberian Peninsula by a few mtDNA lineages during the Early Neolithic. The microsatellite data set was analysed with (a) spatial and non-spatial Bayesian individual-based clustering (IBC) approaches (STRUCTURE, TESS, BAPS and GENELAND), and (b) multivariate methods [discriminant analysis of principal components (DAPC) and spatial principal component analysis (sPCA)]. Additionally, because isolation by distance (IBD) is a common spatial genetic pattern in mobile and continuously distributed species and may challenge the performance of the above methods, the microsatellite data set was also tested for its presence. Overall, the genetic structure of the stone marten in the Iberian Peninsula was characterized by a NE-SW spatial pattern of IBD, which may explain the observed disagreement between the clustering solutions obtained by the different IBC methods. However, there was a significant, albeit weak, indication of contemporary genetic structuring into at least three different subpopulations. The detected subdivision could be attributed to the influence of the rivers Ebro, Tagus and Guadiana, suggesting that main watercourses in the Iberian Peninsula may act as semi-permeable barriers to gene flow in stone martens. To our knowledge, this is the first phylogeographic and population genetic study of the species at a broad regional scale. We also make the case for the importance and benefits of using and comparing multiple clustering and multivariate methods in spatial genetic analyses of mobile and continuously distributed species.
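
For readers unfamiliar with the isolation-by-distance test mentioned above, the following is a generic permutation Mantel test between pairwise genetic and geographic distance matrices; the toy matrices are invented and this is not the study's data or pipeline.

```python
import numpy as np

def mantel_test(genetic_d, geographic_d, permutations=999, seed=0):
    """Correlation between the upper triangles of two distance matrices, with a
    permutation p-value obtained by shuffling individuals in one matrix."""
    rng = np.random.default_rng(seed)
    n = genetic_d.shape[0]
    iu = np.triu_indices(n, k=1)

    def corr(a, b):
        return np.corrcoef(a[iu], b[iu])[0, 1]

    observed = corr(genetic_d, geographic_d)
    hits = 0
    for _ in range(permutations):
        p = rng.permutation(n)
        if corr(genetic_d[np.ix_(p, p)], geographic_d) >= observed:
            hits += 1
    return observed, (hits + 1) / (permutations + 1)

# Toy data: 20 individuals whose genetic distances loosely track geographic distances.
rng = np.random.default_rng(1)
coords = rng.random((20, 2))
geo = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
noise = rng.random(geo.shape)
gen = geo + 0.1 * (noise + noise.T) / 2     # keep the genetic matrix symmetric
r, p = mantel_test(gen, geo)
print(f"Mantel r = {r:.2f}, p = {p:.3f}")
```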

Relevance: 30.00%

Abstract:

Recent player tracking technology provides new information about basketball game performance. The aims of this study were to (i) compare the game performances of all-star and non-all-star basketball players from the National Basketball Association (NBA), and (ii) describe different basketball game performance profiles based on different game roles. Archival data were obtained from all 2013-2014 regular season games (n = 1230). The variables analyzed included points per game, minutes played and the game actions recorded by the player tracking system. To address the first aim, performance per minute of play was analyzed using a descriptive discriminant analysis to identify which variables best predict the all-star and non-all-star playing categories. The all-star players showed slower velocities in defense and performed better in elbow touches, defensive rebounds, close touches, close points and pull-up points, possibly due to optimized attention processes that are key for perceiving the relevant environmental information. The second aim was addressed using a k-means cluster analysis to create maximally different performance profile groupings; a descriptive discriminant analysis then identified which variables best predict the different playing clusters. The results identified different playing profiles, particularly related to the game roles of scoring, passing, defensive and all-round game behavior. Coaching staffs may apply this information to different players, while accounting for individual differences and functional variability, to optimize practice planning and, consequently, the game performances of individuals and teams.
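
A minimal sketch of the two-step analysis described above (k-means clustering of per-minute performance variables followed by a descriptive discriminant analysis of the resulting clusters); the variable names mirror the tracking metrics mentioned, but the data are random, so this illustrates only the workflow, not the study's results.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
features = ["elbow_touches", "def_rebounds", "close_touches", "pull_up_points", "speed_defense"]
X = rng.random((200, len(features)))                 # stand-in per-minute performance matrix

X_std = StandardScaler().fit_transform(X)            # put all variables on a common scale
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_std)

# Descriptive discriminant analysis: which variables best separate the cluster profiles?
lda = LinearDiscriminantAnalysis().fit(X_std, clusters)
for name, loading in zip(features, lda.scalings_[:, 0]):
    print(f"{name:15s} loading on first discriminant function: {loading:+.2f}")
```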

Relevance: 30.00%

Abstract:

A single-contact, mode-hop-free, single-longitudinal-mode laser operating CW (continuous wave) under large-signal modulation at 2.5 Gbit/s at room temperature was created by introducing a short etched region around the ridge waveguide of a Fabry-Perot laser. The device could be suitable for use in extended-range data communication applications.

Relevance: 30.00%

Abstract:

An optical fiber strain-sensing technique based on Brillouin Optical Time Domain Reflectometry (BOTDR) was used to obtain the full deformation profile of a secant pile wall during construction of an adjacent basement in London. Details of the installation of the sensors, as well as the data processing, are described. By installing optical fiber down opposite sides of the pile, the distributed strain profiles obtained can be used to derive both the axial and lateral movements along the pile. Measurements obtained from the BOTDR were found to be in good agreement with inclinometer data from adjacent piles. The relative merits of the two different techniques are discussed. © 2007 ASCE.
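
The conversion from opposite-side strain profiles to axial and lateral movements can be sketched as follows; this is a generic illustration (average of the two strains for the axial component, differential strain divided by the fibre separation for curvature, integrated twice) with invented numbers and boundary conditions, not the processing used in the paper.

```python
import numpy as np

def pile_movements(strain_a, strain_b, z, spacing):
    """Axial movement from the mean strain; lateral movement from curvature
    (differential strain / fibre separation) integrated twice along the pile.
    Assumes zero slope and displacement at the pile toe (illustrative only)."""
    dz = np.gradient(z)
    axial_strain = (strain_a + strain_b) / 2.0
    curvature = (strain_a - strain_b) / spacing
    axial = np.cumsum(axial_strain * dz)      # integrate strain once
    slope = np.cumsum(curvature * dz)         # integrate curvature once
    lateral = np.cumsum(slope * dz)           # and once more for deflection
    return axial, lateral

# Toy profiles: 30 m pile, fibres 0.6 m apart on opposite faces.
z = np.linspace(0.0, 30.0, 301)
strain_a = 50e-6 + 20e-6 * (z / 30.0)
strain_b = 50e-6 - 20e-6 * (z / 30.0)
axial, lateral = pile_movements(strain_a, strain_b, z, spacing=0.6)
print(f"movement at pile head: axial {axial[-1]*1e3:.2f} mm, lateral {lateral[-1]*1e3:.2f} mm")
```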

Relevance: 30.00%

Abstract:

Radio Frequency Identification (RFID) technology allows automatic data capture from tagged objects moving in a supply chain. These data can be very useful for answering traceability queries; however, they are distributed across many different repositories owned by different companies. © 2012 IEEE.

Relevance: 30.00%

Abstract:

We describe simple yet scalable and distributed algorithms for solving the maximum flow problem and its minimum-cost flow variant, motivated by problems of interest in object-similarity visualization. We formulate the fundamental problem as a convex-concave saddle-point problem. We then show that this problem can be efficiently solved by a first-order method or by exploiting faster quasi-Newton steps. Our proposed approach costs at most O(|E|) per iteration for a graph with |E| edges. Further, the number of required iterations can be shown to be independent of the number of edges for the first-order approximation method. We present experimental results in two applications: mosaic generation and color-similarity-based image layout. © 2010 IEEE.
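
A hedged sketch of the kind of first-order saddle-point iteration the abstract refers to, not the authors' algorithm: the max-flow LP on a tiny invented graph is written as a maximization over box-constrained flows and a minimization over node potentials of c·f + y·(Af), and solved with a plain extragradient scheme; the graph, step size and iteration count are illustrative choices.

```python
import numpy as np

# Tiny directed graph: (tail, head, capacity); node 0 is the source, node 3 the sink.
edges = [(0, 1, 1.0), (0, 2, 1.0), (1, 2, 1.0), (1, 3, 1.0), (2, 3, 1.0)]
cap = np.array([c for _, _, c in edges])
interior = [1, 2]                                        # nodes with conservation constraints

# Incidence matrix so that (A @ f)[i] is the net inflow at interior node i.
A = np.zeros((len(interior), len(edges)))
for j, (u, v, _) in enumerate(edges):
    for i, node in enumerate(interior):
        A[i, j] += (v == node) - (u == node)

c = np.array([float(u == 0) for u, _, _ in edges])       # objective: flow leaving the source

# Saddle problem  max_{0 <= f <= cap} min_y  c @ f + y @ (A @ f), solved by extragradient.
f, y, eta = np.zeros(len(edges)), np.zeros(len(interior)), 0.2
for _ in range(5000):
    f_h = np.clip(f + eta * (c + A.T @ y), 0.0, cap)     # predictor: primal ascent + projection
    y_h = y - eta * (A @ f)                              # predictor: dual descent
    f = np.clip(f + eta * (c + A.T @ y_h), 0.0, cap)     # corrector uses predictor gradients
    y = y - eta * (A @ f_h)

print(f"approximate max-flow value: {c @ f:.2f}  (the exact value for this graph is 2)")
print("edge flows:", np.round(f, 2))
```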

Relevance: 30.00%

Abstract:

This paper addresses a new way of handling distributed design known as the Macro concept. It is based around the assumption that future design teams will become more distributed in nature as industry exploits the Internet and other integrated communication and data exchange systems. The paper notes that this concept is part of an attack on the problems associated with the total process of Distributed Multi-Disciplinary design and Optimisation. The concept relies on the creation of distributed, self-building and self-organising teams made up of globally distributed members. The paper describes both the approach adopted and its implementation in a prototype software system operating over the Internet. In essence, the work presented describes a novel method for implementing a distributed design process which is far from complete but which is producing challenging ideas. © 2000 by Cranfield University.

Relevance: 30.00%

Abstract:

With long-term marine surveys and research, and especially with the development of new marine environment monitoring technologies, prodigious amounts of complex marine environmental data are generated and continue to increase rapidly. Features of these data include massive volume, widespread distribution, multiple sources, heterogeneity, multi-dimensionality, and dynamic structure in time. The present study recommends an integrative visualization solution for these data, to enhance the visual display of data and data archives, and to promote the joint use of data distributed among different organizations or communities. The study also analyses web services technologies and defines the concept of the marine information grid, then focuses on spatiotemporal visualization and proposes a process-oriented spatiotemporal visualization method. We discuss how marine environmental data can be organized based on this method, and how the organized data are represented for use with web services and stored in a reusable fashion. In addition, we provide an original visualization architecture that is integrative and based on the explored technologies. Finally, we present a prototype system for marine environmental data of the South China Sea, with visualizations of Argo floats, sea surface temperature fields, sea current fields, salinity, in-situ investigation data, and ocean stations. The integrative visualization architecture is illustrated with the prototype system, which highlights the process-oriented spatiotemporal visualization method and demonstrates the benefits of the architecture and methods described in this study.
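
As a purely hypothetical illustration of the process-oriented organization advocated above, the sketch below groups observations of one evolving phenomenon into a time-ordered "process" object that a web service could serialize for visualization clients; the class and field names are invented for this example, not taken from the prototype system.

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class Observation:
    time: str        # ISO 8601 timestamp
    lat: float
    lon: float
    depth_m: float
    value: float     # e.g. temperature or salinity

@dataclass
class MarineProcess:
    """One evolving phenomenon (e.g. an Argo float trajectory or an SST time series)
    stored as time-ordered observations, the unit a web service would expose."""
    process_id: str
    variable: str
    observations: List[Observation] = field(default_factory=list)

    def add(self, obs: Observation):
        self.observations.append(obs)
        self.observations.sort(key=lambda o: o.time)   # keep the process time-ordered

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)      # payload for visualization clients

proc = MarineProcess("float-0001", "temperature")      # hypothetical identifiers
proc.add(Observation("2015-06-11T00:00:00Z", 15.4, 114.9, 1000.0, 4.0))
proc.add(Observation("2015-06-01T00:00:00Z", 15.2, 114.5, 1000.0, 4.1))
print(proc.to_json())
```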

Relevance: 30.00%

Abstract:

We leverage the buffering capabilities of end-systems to achieve scalable, asynchronous delivery of streams in a peer-to-peer environment. Unlike existing cache-and-relay schemes, we propose a distributed prefetching protocol in which peers prefetch and store portions of the streaming media ahead of their playout time, not only turning themselves into possible sources for other peers but also allowing them to use their prefetched data to overcome the departure of their source-peer. This stands in sharp contrast to existing cache-and-relay schemes, where the departure of the source-peer forces its peer children to go to the original server, disrupting their service and increasing server and network load. Through mathematical analysis and simulations, we show the effectiveness of maintaining such asynchronous multicasts from several source-peers to other children peers, and the efficacy of prefetching in the face of peer departures. We confirm the scalability of our dPAM protocol, which is shown to significantly reduce server load.
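
A toy back-of-the-envelope simulation of why prefetching helps when source-peers depart; this is not the dPAM model or its analysis, and the lifetime, reconnection and prefetch-rate assumptions are invented for illustration.

```python
import random

def disruption_rate(prefetch_target_s, n_peers=100_000, mean_source_lifetime_s=300.0,
                    mean_reconnect_s=5.0, seed=0):
    """Fraction of peers whose playout is disrupted when their source-peer departs.
    A peer survives if the media prefetched beyond its playout point outlasts the
    time needed to locate and connect to a new source."""
    rng = random.Random(seed)
    disrupted = 0
    for _ in range(n_peers):
        departure = rng.expovariate(1.0 / mean_source_lifetime_s)  # source-peer lifetime
        reconnect = rng.expovariate(1.0 / mean_reconnect_s)        # time to find a new source
        # Assume the prefetch lead builds up at one media-second per wall-clock second,
        # so it is capped by the time elapsed before the source departs.
        lead = min(prefetch_target_s, departure)
        disrupted += lead < reconnect
    return disrupted / n_peers

for target in (0.0, 2.0, 10.0, 30.0):
    print(f"prefetch target {target:5.1f} s -> disrupted fraction {disruption_rate(target):.3f}")
```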

Relevance: 30.00%

Abstract:

The proliferation of inexpensive workstations and networks has created a new era in distributed computing. At the same time, non-traditional applications such as computer-aided design (CAD), computer-aided software engineering (CASE), geographic-information systems (GIS), and office-information systems (OIS) have placed increased demands for high-performance transaction processing on database systems. The combination of these factors gives rise to significant challenges in the design of modern database systems. In this thesis, we propose novel techniques whose aim is to improve the performance and scalability of these new database systems. These techniques exploit client resources through client-based transaction management. Client-based transaction management is realized by providing logging facilities locally even when data is shared in a global environment. This thesis presents several recovery algorithms which utilize client disks for storing recovery related information (i.e., log records). Our algorithms work with both coarse and fine-granularity locking and they do not require the merging of client logs at any time. Moreover, our algorithms support fine-granularity locking with multiple clients permitted to concurrently update different portions of the same database page. The database state is recovered correctly when there is a complex crash as well as when the updates performed by different clients on a page are not present on the disk version of the page, even though some of the updating transactions have committed. This thesis also presents the implementation of the proposed algorithms in a memory-mapped storage manager as well as a detailed performance study of these algorithms using the OO1 database benchmark. The performance results show that client-based logging is superior to traditional server-based logging. This is because client-based logging is an effective way to reduce dependencies on server CPU and disk resources and, thus, prevents the server from becoming a performance bottleneck as quickly when the number of clients accessing the database increases.
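
A minimal sketch of the general idea of client-based logging, not the thesis' recovery algorithms: each client appends redo records for the pages it updates to its own local log and forces that log at commit, so recovery information is neither written by nor merged at the server. The record layout, checksum scheme and file name are illustrative assumptions.

```python
import json, os, zlib
from dataclasses import dataclass, asdict

@dataclass
class LogRecord:
    txn_id: int
    page_id: int
    offset: int
    after_image: str      # redo-only image of the updated bytes

class ClientLog:
    """Append-only write-ahead log kept on the client's local disk."""
    def __init__(self, path):
        self.f = open(path, "a", encoding="utf-8")

    def append(self, rec: LogRecord):
        payload = json.dumps(asdict(rec))
        # Prefix each record with a checksum so torn writes are detected at recovery.
        self.f.write(f"{zlib.crc32(payload.encode()):08x} {payload}\n")

    def commit(self, txn_id: int):
        self.f.write(f"COMMIT {txn_id}\n")
        self.f.flush()
        os.fsync(self.f.fileno())   # force the log to stable storage before acknowledging

log = ClientLog("client_42.log")
log.append(LogRecord(txn_id=7, page_id=123, offset=64, after_image="new-bytes"))
log.commit(7)
```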

Relevance: 30.00%

Abstract:

The pervasiveness of personal computing platforms offers an unprecedented opportunity to deploy large-scale services that are distributed over wide physical spaces. Two major challenges face the deployment of such services: the often resource-limited nature of these platforms, and the necessity of preserving the autonomy of the owners of these devices. These challenges preclude using centralized control and preclude considering services that are subject to performance guarantees. To that end, this thesis advances a number of new distributed resource management techniques that are shown to be effective in such settings, focusing on two application domains: distributed Field Monitoring Applications (FMAs) and Message Delivery Applications (MDAs). In the context of FMAs, this thesis presents two techniques that are well suited to the fairly limited storage and power resources of autonomously mobile sensor nodes. The first technique relies on amorphous placement of sensory data through the use of novel storage management and sample diffusion techniques. The second relies on an information-theoretic framework to optimize local resource management decisions. Both approaches are proactive in that they aim to provide nodes with a view of the monitored field that reflects the characteristics of queries over that field, enabling them to handle more queries locally and thus reduce communication overheads. The thesis then recognizes node mobility as a resource to be leveraged and, in that respect, proposes novel mobility coordination techniques for FMAs and MDAs. Assuming that node mobility is governed by a spatio-temporal schedule featuring some slack, this thesis presents novel algorithms of various computational complexities to orchestrate the use of this slack to improve the performance of the supported applications. The findings in this thesis, which are supported by analysis and extensive simulations, highlight the importance of two general design principles for distributed systems. First, a priori knowledge (e.g., about the target phenomena of FMAs and/or the workload of either FMAs or MDAs) can be used effectively for local resource management. Second, judicious leverage and coordination of node mobility can lead to significant performance gains for distributed applications deployed over resource-impoverished infrastructures.
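
A hypothetical toy example of query-aware local storage management in the spirit of the proactive techniques described above (it is not the thesis' amorphous-placement or information-theoretic schemes): a node with room for only a few samples preferentially retains samples from the field cells most likely to be queried, so more queries can be answered locally. The query distribution, cell grid and capacity are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, capacity = 100, 20
query_prob = rng.dirichlet(np.ones(n_cells))       # assumed-known query distribution over cells
sample_cells = rng.integers(0, n_cells, size=500)  # field cell of each sample the node collected

# Greedy retention: keep at most one sample per cell, preferring frequently queried cells.
held = []
for cell in np.argsort(query_prob)[::-1]:
    if len(held) == capacity:
        break
    if cell in sample_cells:
        held.append(int(cell))

local_hit_rate = query_prob[held].sum()            # fraction of queries answerable locally
print(f"queries answerable from local storage: {local_hit_rate:.2f}")
```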