975 results for Vertical Data Distribution


Relevância:

100.00%

Publicador:

Resumo:

Continuous field mapping has to address two conflicting remote sensing requirements when collecting training data. On the one hand, continuous field mapping trains fractional land cover and thus favours mixed training pixels. On the other hand, the spectral signature has to be preferably distinct and thus favours pure training pixels. The aim of this study was to evaluate the sensitivity of the resulting mapping performance to the distribution of training data along fractional and spectral gradients. We derived four continuous fields (tree, shrub/herb, bare, water) from aerial photographs as response variables and processed corresponding spectral signatures from multitemporal Landsat 5 TM data as explanatory variables. Subsequent controlled experiments along fractional cover gradients were then based on generalised linear models. The resulting fractional and spectral distributions differed between the individual continuous fields, but all could be satisfactorily trained and mapped. Pixels with fractional cover, or with none of the respective cover type, were much more critical than pure full-cover pixels. The error distribution of the continuous field models was non-uniform with respect to the horizontal and vertical spatial distribution of the target fields. We conclude that sampling of continuous field training data should be based on extent and densities in the fractional and spectral space, rather than in the real spatial space. Consequently, adequate training plots are most probably not systematically distributed in the real spatial space, but instead cover the gradient and covariate structure of the fractional and spectral space well. (C) 2009 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
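The controlled experiments were based on generalised linear models; the core idea can be sketched with a logit link relating a single hypothetical spectral band to fractional cover. The data below are synthetic (not the study's Landsat bands), and the squared-error gradient descent is a simple stand-in for a proper binomial GLM fit:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def fit_fractional_glm(x, y, lr=0.5, iters=5000):
    """Fit y ~ sigmoid(a + b*x) for fractional cover y in [0, 1] by
    gradient descent on squared error (a stand-in for a binomial GLM
    with a logit link)."""
    a, b = 0.0, 0.0
    n = len(x)
    for _ in range(iters):
        ga = gb = 0.0
        for xi, yi in zip(x, y):
            p = sigmoid(a + b * xi)
            g = (p - yi) * p * (1.0 - p)   # chain rule through the logistic
            ga += g
            gb += g * xi
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

# Synthetic gradient of mixed pixels: fractional tree cover rises with
# the (hypothetical) band value; true parameters are a=0, b=2.
xs = [i / 10.0 for i in range(-20, 21)]
ys = [sigmoid(2.0 * xi) for xi in xs]
a_hat, b_hat = fit_fractional_glm(xs, ys)
```

Because the response is a fraction rather than a class label, mixed pixels along the whole gradient contribute information, which is exactly the sampling consideration the abstract raises.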

Relevância:

100.00%

Publicador:

Resumo:

Unauthorized accesses to digital contents are serious threats to international security and informatics. We propose an offline oblivious data distribution framework that preserves the sender's security and the receiver's privacy using tamper-proof smart cards. This framework provides persistent content protections from digital piracy and promises private content consumption.

Relevância:

100.00%

Publicador:

Resumo:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature as the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested on a parallel computing system with 64 processors and in simulations with 1024 processing elements.
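The contrast between the two formulations can be sketched on a toy 1-D example with two simulated "nodes" (hypothetical data, not the paper's code): the standard version combines partial centroid statistics from all nodes with a global reduction each iteration, while a locality-preserving partition, such as one induced by a binary search tree, lets each node update its own centroid with no communication:

```python
def assign(points, centroids):
    """Label each point with the index of its nearest centroid."""
    return [min(range(len(centroids)), key=lambda k: (p - centroids[k]) ** 2)
            for p in points]

def lloyd_step_global(nodes, centroids):
    """Standard formulation: every node contributes partial sums for
    every centroid, then a global reduction combines them."""
    k = len(centroids)
    sums, counts = [0.0] * k, [0] * k
    for points in nodes:                       # each node computes locally...
        for p, lab in zip(points, assign(points, centroids)):
            sums[lab] += p
            counts[lab] += 1                   # ...then all sums are reduced
    return [sums[i] / counts[i] if counts[i] else centroids[i]
            for i in range(k)]

def lloyd_step_local(nodes):
    """Non-uniform distribution: the partition keeps each cluster on a
    single node, so node i updates centroid i with no communication."""
    return [sum(pts) / len(pts) for pts in nodes]

# Two nodes, each holding one well-separated cluster.
nodes = [[0.1, 0.2, -0.1], [9.9, 10.2, 10.0]]
centroids = [0.0, 10.0]
global_step = lloyd_step_global(nodes, centroids)
local_step = lloyd_step_local(nodes)
```

When the partition aligns with the clusters, both steps yield identical centroids, which is the determinism the paper insists on preserving while dropping the reduction.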

Relevância:

100.00%

Publicador:

Resumo:

In systems that combine the outputs of classification methods (combination systems), such as ensembles and multi-agent systems, one of the main constraints is that the base components (classifiers or agents) should be diverse among themselves. In other words, there is clearly no accuracy gain in a system composed of a set of identical base components. One way of increasing diversity is through the use of feature selection or data distribution methods in combination systems. In this work, an investigation of the impact of using data distribution methods among the components of combination systems will be performed. In this investigation, different methods of data distribution will be used, and an analysis of the combination systems, using several different configurations, will be performed. As a result of this analysis, the aim is to identify which combination systems are best suited to distributing features and data among their components.
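A minimal sketch of the idea (hypothetical 1-D data and a deliberately simple base classifier): the training set is distributed round-robin among the components, each component learns its own decision threshold from its share, and the combination system votes. The diversity comes precisely from the components seeing different data:

```python
def train_threshold(points, labels):
    """Toy base classifier: threshold at the midpoint of the class means."""
    pos = [p for p, l in zip(points, labels) if l == 1]
    neg = [p for p, l in zip(points, labels) if l == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0

def distribute(data, n_components):
    """Round-robin data distribution: component i gets items i, i+n, ..."""
    return [data[i::n_components] for i in range(n_components)]

def ensemble_predict(thresholds, x):
    """Majority vote over the components' individual decisions."""
    votes = sum(1 for t in thresholds if x > t)
    return 1 if votes > len(thresholds) / 2 else 0

# Separable toy data: class 0 around -1, class 1 around +1.
data = [(-1.2, 0), (-0.9, 0), (-1.1, 0), (-0.8, 0),
        (0.9, 1), (1.1, 1), (1.3, 1), (0.8, 1)]
parts = distribute(data, 3)
thresholds = [train_threshold([p for p, _ in part], [l for _, l in part])
              for part in parts]
accuracy = sum(ensemble_predict(thresholds, x) == y for x, y in data) / len(data)
```

The components end up with different thresholds (diversity), yet the vote remains accurate, which is the trade-off the investigation analyses across configurations.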

Relevância:

100.00%

Publicador:

Resumo:

Vertical distributions of turbulent energy dissipation rates and fluorescence were measured simultaneously with a high-resolution micro-profiler in four different oceanographic regions, from temperate to polar and from coastal to open-water settings. High fluorescence values, forming a deep chlorophyll maximum (DCM), were often located in weakly stratified portions of the upper water column, just below layers with maximum levels of turbulent energy dissipation rate. In the vicinity of the DCM, a significant negative relationship between fluorescence and turbulent energy dissipation rate was found. We discuss the mechanisms that may explain the observed patterns of planktonic biomass distribution within the ocean mixed layer, including a vertically variable diffusion coefficient and the alteration of the cells' sinking velocity by turbulent motion. These findings provide further insight into the processes controlling the vertical distribution of the pelagic community and the position of the DCM.

Relevância:

100.00%

Publicador:

Resumo:

Data Distribution Management (DDM) is a core part of the High Level Architecture standard, as its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the set of information generated during a simulation, so that each federate (that is, a simulation entity) only receives the information it needs. It is important that this is done quickly and accurately, in order to achieve good performance and avoid the transmission of irrelevant data that may quickly saturate network resources. The main topic of this thesis is the implementation of an impartial DDM testbed. It evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it may also support other, as yet unknown, methods. It ranks them on three factors: execution time, memory usage and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. Then in the first chapter we describe the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
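In region-based DDM, the core task the testbed measures is extent matching. A brute-force baseline for 1-D extents (closed intervals, hypothetical data, not the testbed's code) makes a natural "distance from the optimal solution" reference, since it enumerates every overlapping (update, subscription) pair exactly:

```python
def brute_force_match(updates, subscriptions):
    """Report every (update, subscription) index pair whose closed
    intervals overlap. O(n * m) comparisons, but exact - a natural
    reference for a DDM testbed to rank faster approaches against."""
    matches = set()
    for i, (a, b) in enumerate(updates):
        for j, (c, d) in enumerate(subscriptions):
            if a <= d and c <= b:            # standard interval-overlap test
                matches.add((i, j))
    return matches

# Hypothetical update and subscription extents along one routing dimension.
updates = [(0, 4), (6, 9), (3, 7)]
subscriptions = [(2, 5), (8, 12)]
result = brute_force_match(updates, subscriptions)
```

An approach that reports exactly this set has distance zero from the optimal solution; the testbed's other two factors, execution time and memory, then separate the exact approaches from one another.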

Relevância:

100.00%

Publicador:

Resumo:

Data Distribution Management (DDM) is a component of the High Level Architecture standard. Its task is to detect overlaps between update and subscription extents efficiently. This thesis discusses the need for a framework and the reasons why it was implemented. Testing algorithms under fair, comparable conditions, libraries that ease the implementation of algorithms, and automation of the build phase were the fundamental motivations for starting work on the framework. The driving motivation was that, in surveying the scientific literature on DDM and its various algorithms, we noticed that each paper generated its own ad hoc data for testing. A goal of this framework is therefore to compare the algorithms on a consistent data set. We chose to test the framework on the Cloud in order to obtain a more reliable comparison between runs by different users. Two of the most widely used services were considered: Amazon AWS EC2 and Google App Engine. The advantages and disadvantages of each are presented, along with the reasons for choosing Google App Engine. Four algorithms were implemented: Brute Force, Binary Partition, Improved Sort and Interval Tree Matching. Tests were carried out on execution time and peak memory usage. The results show that Interval Tree Matching and Improved Sort are the most efficient. All tests were run on the sequential versions of the algorithms, so a further reduction in execution time is possible for the Interval Tree Matching algorithm.
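The gap between Brute Force and the sort-based approaches can be illustrated with a sketch (hypothetical 1-D extents; a generic sweep over sorted endpoints in the spirit of Improved Sort, not the thesis code). Both must report the same overlap set, which is exactly the cross-check a consistent data set enables:

```python
import random

def brute_force(updates, subscriptions):
    """O(n * m) reference: test every pair of closed intervals."""
    return {(i, j)
            for i, (a, b) in enumerate(updates)
            for j, (c, d) in enumerate(subscriptions)
            if a <= d and c <= b}

def sort_based(updates, subscriptions):
    """Sweep over sorted endpoints, keeping the currently open extents;
    each extent is matched against the extents active when it opens."""
    events = []
    for i, (a, b) in enumerate(updates):
        events += [(a, 0, 'U', i), (b, 1, 'U', i)]
    for j, (c, d) in enumerate(subscriptions):
        events += [(c, 0, 'S', j), (d, 1, 'S', j)]
    events.sort()               # opens sort before closes at equal coords
    active_u, active_s, matches = set(), set(), set()
    for _, kind, side, idx in events:
        if kind == 0:           # an extent opens
            if side == 'U':
                active_u.add(idx)
                matches |= {(idx, j) for j in active_s}
            else:
                active_s.add(idx)
                matches |= {(i, idx) for i in active_u}
        elif side == 'U':       # an extent closes
            active_u.discard(idx)
        else:
            active_s.discard(idx)
    return matches

# Randomly generated extents, fixed seed for reproducibility.
random.seed(7)
us = [tuple(sorted(random.sample(range(50), 2))) for _ in range(20)]
ss = [tuple(sorted(random.sample(range(50), 2))) for _ in range(20)]
```

After sorting, each extent is only compared against extents that are simultaneously open, which is what lets sort-based methods beat the quadratic baseline on sparse workloads.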

Relevância:

100.00%

Publicador:

Resumo:

This work investigates the possibility of using the DDS (Data Distribution Service) standard, developed by the OMG (Object Management Group), for real-time monitoring of glucose levels in diabetic patients. The standard follows the publisher/subscriber pattern: in the proof of concept developed here, the point-of-care sensors publish the patients' glucose values and different supervisors subscribe to that information. These supervisors react in the most appropriate way to the patient's glucose values and their evolution, for example by logging the sample value or raising an alarm. The middleware supporting the data communication follows the DDS standard. This facilitates, on the one hand, the scalability and interoperability of the developed solution and, on the other, the monitoring of glucose levels and the activation of predefined protocols in real time. The research is part of the intramural PERSONA project of CIBER-BBN, whose goal is the design of decision-support tools for continuous, personalised patient monitoring, integrated into a technology platform for diabetes.
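DDS itself has a rich API; the publisher/subscriber pattern it rests on can be sketched generically. This is an illustrative observer-pattern stand-in, not the DDS API, and the topic name and glucose thresholds below are hypothetical:

```python
class Topic:
    """Minimal stand-in for a DDS topic: publishers write samples and
    every subscriber receives each sample via a callback."""
    def __init__(self, name):
        self.name = name
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, sample):
        for cb in self._subscribers:
            cb(sample)

# Supervisors subscribing to a hypothetical glucose topic.
glucose = Topic("PatientGlucose")
log, alarms = [], []
glucose.subscribe(log.append)                        # recorder supervisor
glucose.subscribe(lambda s: alarms.append(s)         # alarm supervisor
                  if not 70 <= s["mg_dl"] <= 180 else None)

# A point-of-care sensor publishing readings.
glucose.publish({"patient": "p01", "mg_dl": 110})
glucose.publish({"patient": "p01", "mg_dl": 220})
```

The decoupling shown here is what gives the real DDS-based design its scalability: new supervisors subscribe without any change to the sensors.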

Relevância:

100.00%

Publicador:

Resumo:

The vertical profile of aerosol is important for its radiative effects, but weakly constrained by observations on the global scale, and highly variable among different models. To investigate the controlling factors in one particular model, we investigate the effects of individual processes in HadGEM3–UKCA and compare the resulting diversity of aerosol vertical profiles with the inter-model diversity from the AeroCom Phase II control experiment. In this way we show that (in this model at least) the vertical profile is controlled by a relatively small number of processes, although these vary among aerosol components and particle sizes. We also show that sufficiently coarse variations in these processes can produce a similar diversity to that among different models in terms of the global-mean profile and, to a lesser extent, the zonal-mean vertical position. However, there are features of certain models' profiles that cannot be reproduced, suggesting the influence of further structural differences between models. In HadGEM3–UKCA, convective transport is found to be very important in controlling the vertical profile of all aerosol components by mass. In-cloud scavenging is very important for all except mineral dust. Growth by condensation is important for sulfate and carbonaceous aerosol (along with aqueous oxidation for the former and ageing by soluble material for the latter). The vertical extent of biomass-burning emissions into the free troposphere is also important for the profile of carbonaceous aerosol. Boundary-layer mixing plays a dominant role for sea salt and mineral dust, which are emitted only from the surface. Dry deposition and below-cloud scavenging are important for the profile of mineral dust only. In this model, the microphysical processes of nucleation, condensation and coagulation dominate the vertical profile of the smallest particles by number (e.g. total CN > 3 nm), while the profiles of larger particles (e.g. CN > 100 nm) are controlled by the same processes as the component mass profiles, plus the size distribution of primary emissions. We also show that the processes that affect the AOD-normalised radiative forcing in the model are predominantly those that affect the vertical mass distribution, in particular convective transport, in-cloud scavenging, aqueous oxidation, ageing and the vertical extent of biomass-burning emissions.

Relevância:

100.00%

Publicador:

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância:

100.00%

Publicador:

Resumo:

Foraminiferal data were obtained from 66 samples of box cores on the southeastern Brazilian upper margin (between 23.8°–25.9° S and 42.8°–46.13° W) to evaluate the benthic foraminiferal fauna distribution and its relation to some selected abiotic parameters. We focused on areas with different primary production regimes on the southern Brazilian margin, which is generally considered an oligotrophic region. The total density (D), richness (R), mean diversity (H̄), average living depth (ALDx) and percentages of specimens of different microhabitats (epifauna, shallow infauna, intermediate infauna and deep infauna) were analyzed. The dominant species identified were Uvigerina spp., Globocassidulina subglobosa, Bulimina marginata, Adercotryma wrighti, Islandiella norcrossi, Rhizammina spp. and Brizalina sp. We also established a set of mathematical functions for analyzing the vertical foraminiferal distribution patterns, providing a quantitative tool that allows correlating the microfaunal density distributions with abiotic factors. In general, the cores that fit pure exponential decaying functions were related to the oligotrophic conditions prevalent on the Brazilian margin and to the flow of the Brazilian Current (BC). Different foraminiferal responses were identified in cores located in higher productivity zones, such as the northern and the southern regions of the study area, where high percentages of infauna were encountered; the functions used to fit these profiles differ appreciably from a pure exponential function, in response to the significant living fauna in deeper layers of the sediment. One of the main factors supporting the different foraminiferal assemblage responses may be related to the differences in primary productivity of the water column and, consequently, in the estimated carbon flux to the sea floor. Nevertheless, bottom water velocities, substrate type and water depth also need to be considered.
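The "pure exponential decaying function" used for the oligotrophic cores can be fitted by a log-linear least-squares regression; a sketch with synthetic density values (illustrative numbers, not the study's data):

```python
import math

def fit_exp_decay(depths, densities):
    """Fit n(z) = n0 * exp(-k*z) by linear regression of log-density on
    depth; returns (n0, k)."""
    ys = [math.log(n) for n in densities]
    zbar = sum(depths) / len(depths)
    ybar = sum(ys) / len(ys)
    slope = (sum((z - zbar) * (y - ybar) for z, y in zip(depths, ys)) /
             sum((z - zbar) ** 2 for z in depths))
    return math.exp(ybar - slope * zbar), -slope

# Synthetic profile: surface density 100, decay rate 0.8 per cm depth.
depths = [0.5, 1.5, 2.5, 3.5, 4.5]        # sediment layer midpoints (cm)
densities = [100.0 * math.exp(-0.8 * z) for z in depths]
n0, k = fit_exp_decay(depths, densities)
```

A profile with a living-fauna peak in deeper sediment layers would show a large residual from this fit, which is how the approach separates the higher-productivity cores from the oligotrophic ones.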

Relevância:

100.00%

Publicador:

Resumo:

Background: Cranial cruciate ligament rupture (CCLR) is one of the most important stifle injuries and a common cause of lameness in dogs. Our objective was to measure the vertical forces in the pads of Pitbulls with CCLR. A pressure-sensitive walkway was used to collect vertical force data from the pads of 10 Pitbulls affected with unilateral CCLR. Ten healthy Pitbulls were included in the study as controls. Velocity varied between 1.3 and 1.6 m/s and acceleration was kept below ±0.1 m/s². Differences between groups and between pads in the same limb within groups were investigated using ANOVA and the Tukey test. The paired Student t-test was employed to assess gait symmetry (p < 0.05). Results: Peak vertical forces (PVF) were lower in the affected limb, particularly in the metatarsal pad. Increased PVF values in the forelimb and the contralateral hind limb pads of affected dogs suggest a compensatory effect. Conclusions: A consistent pattern of vertical force distribution was observed in the pads of dogs with CCLR. These data are important for increased understanding of vertical force distribution in the limbs of dogs with CCLR. Kinetic analysis using pressure-sensitive walkways can be useful in follow-up assessment of surgically treated dogs regardless of the surgical technique employed.
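Gait symmetry was assessed with a paired Student t-test; the statistic for paired limb measurements can be sketched as follows (toy PVF values, purely illustrative, not the study's measurements):

```python
import math

def paired_t(xs, ys):
    """Paired Student t statistic: mean of the per-subject differences
    divided by the standard error of those differences."""
    d = [x - y for x, y in zip(xs, ys)]
    n = len(d)
    dbar = sum(d) / n
    var = sum((di - dbar) ** 2 for di in d) / (n - 1)
    return dbar / math.sqrt(var / n)

# Toy peak vertical forces (% body weight), one value per dog:
# affected vs contralateral hind limb in a CCLR-like asymmetry.
affected      = [28.0, 27.5, 29.1, 26.8, 28.4]
contralateral = [35.2, 34.8, 36.0, 33.9, 35.5]
t_asym = paired_t(affected, contralateral)

# A symmetric (healthy-like) pair for contrast.
left  = [34.9, 35.1, 35.0, 34.8, 35.2]
right = [35.0, 34.9, 35.1, 35.0, 34.8]
t_sym = paired_t(left, right)
```

Pairing within each dog removes between-dog variation, so a consistent limb-to-limb offset yields a large |t| even with only a few subjects, which suits a 10-dog design.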

Relevância:

100.00%

Publicador:

Resumo:

We present a vertically resolved zonal mean monthly mean global ozone data set spanning the period 1901 to 2007, called HISTOZ.1.0. It is based on a new approach that combines information from an ensemble of chemistry climate model (CCM) simulations with historical total column ozone information. The CCM simulations incorporate important external drivers of stratospheric chemistry and dynamics (in particular solar and volcanic effects, greenhouse gases and ozone depleting substances, sea surface temperatures, and the quasi-biennial oscillation). The historical total column ozone observations include ground-based measurements from the 1920s onward and satellite observations from 1970 to 1976. An off-line data assimilation approach is used to combine model simulations, observations, and information on the observation error. The period starting in 1979 was used for validation with existing ozone data sets and therefore only ground-based measurements were assimilated. Results demonstrate considerable skill from the CCM simulations alone. Assimilating observations provides additional skill for total column ozone. With respect to the vertical ozone distribution, assimilating observations increases on average the correlation with a reference data set, but does not decrease the mean squared error. Analyses of HISTOZ.1.0 with respect to the effects of El Niño–Southern Oscillation (ENSO) and of the 11 yr solar cycle on stratospheric ozone from 1934 to 1979 qualitatively confirm previous studies that focussed on the post-1979 period. The ENSO signature exhibits a much clearer imprint of a change in strength of the Brewer–Dobson circulation compared to the post-1979 period. The imprint of the 11 yr solar cycle is slightly weaker in the earlier period. Furthermore, the total column ozone increase from the 1950s to around 1970 at northern mid-latitudes is briefly discussed. 
Indications for contributions of a tropospheric ozone increase, greenhouse gases, and changes in atmospheric circulation are found. Finally, the paper points at several possible future improvements of HISTOZ.1.0.
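The off-line assimilation step can be sketched, for a single scalar state, as an optimal-interpolation update in which the weight given to the observation depends on the background and observation error variances (illustrative numbers only, not HISTOZ.1.0's actual error statistics):

```python
def oi_update(background, obs, var_b, var_o):
    """Optimal-interpolation (scalar Kalman) update: the analysis moves
    from the background toward the observation by the gain K."""
    gain = var_b / (var_b + var_o)
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b          # analysis error variance
    return analysis, var_a

# CCM-simulated total column ozone vs a ground-based measurement (DU),
# with hypothetical error variances.
background, obs = 310.0, 298.0
analysis, var_a = oi_update(background, obs, var_b=64.0, var_o=16.0)
```

Because the gain shrinks as the observation error grows, sparse or noisy early-century measurements pull the analysis only weakly away from the CCM ensemble, consistent with the finding that assimilation adds skill for total column ozone while the model provides considerable skill on its own.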
