931 results for k-Means algorithm


Relevance:

30.00%

Publisher:

Abstract:

The Lanczos algorithm is appreciated in many situations due to its speed and economy of storage. However, the advantage that the Lanczos basis vectors need not be kept is lost when the algorithm is used to compute the action of a matrix function on a vector: either the basis vectors need to be kept, or the Lanczos process needs to be applied twice. In this study we describe an augmented Lanczos algorithm to compute a dot product relative to a function of a large sparse symmetric matrix, without keeping the basis vectors.
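
A minimal sketch of the standard Lanczos building block may help situate the problem. For the special case u = v, the quadratic form v^T f(A) v can already be approximated from the tridiagonal matrix alone, without storing the basis vectors; the function below is illustrative only (the names, the step count m, and the choice f = expm are assumptions, and this is not the augmented algorithm of the paper).

    import numpy as np
    from scipy.linalg import expm

    def lanczos_quadratic_form(A, v, f, m=30):
        # Approximate v^T f(A) v for a symmetric matrix A with m Lanczos steps.
        # Only the tridiagonal matrix T is kept; basis vectors are discarded as
        # soon as the three-term recurrence no longer needs them.
        beta0 = np.linalg.norm(v)
        q_prev = np.zeros_like(v, dtype=float)
        q = v / beta0
        alphas, betas = [], []
        for _ in range(m):
            w = A @ q
            alpha = q @ w
            w = w - alpha * q - (betas[-1] if betas else 0.0) * q_prev
            alphas.append(alpha)
            beta = np.linalg.norm(w)
            if beta < 1e-12:        # lucky breakdown: Krylov space exhausted
                break
            betas.append(beta)
            q_prev, q = q, w / beta
        k = len(alphas)
        T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
        return beta0 ** 2 * f(T)[0, 0]

    # Illustrative use with f = expm (matrix exponential):
    #   B = np.random.default_rng(0).standard_normal((200, 200))
    #   A = (B + B.T) / 2
    #   v = np.random.default_rng(1).standard_normal(200)
    #   lanczos_quadratic_form(A, v, expm, m=40)   # close to v @ expm(A) @ v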

Relevance:

30.00%

Publisher:

Abstract:

The IEEE 802.15.4 standard provides appealing features to simultaneously support real-time and non-real-time traffic, but it is only capable of supporting real-time communications from at most seven devices, and it cannot guarantee delay bounds lower than the superframe duration. Motivated by this problem, in this paper we propose an Explicit Guaranteed time slot Sharing and Allocation scheme (EGSA) for beacon-enabled IEEE 802.15.4 networks. This scheme provides tighter delay bounds for real-time communications by splitting the Contention-Free Period (CFP) into smaller mini time slots and by means of a new guaranteed bandwidth allocation scheme for a set of devices with periodic messages. At the same time, the novel bandwidth allocation scheme can maximize the duration of the CFP for non-real-time communications. Performance analysis results show that the EGSA scheme works efficiently and outperforms competing schemes in terms of both guaranteed delay and bandwidth utilization.
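
As an illustration of the bookkeeping such a scheme implies, the sketch below splits a CFP of given duration into fixed-length mini time slots and assigns each periodic flow enough of them, per superframe, to drain its payload. All names, parameters, and the allocation rule are hypothetical stand-ins, not the EGSA algorithm itself.

    from dataclasses import dataclass
    from math import ceil

    @dataclass
    class Flow:
        name: str
        payload_bits: int     # bits generated per period
        period_sf: int        # message period, in superframes

    def allocate_minislots(flows, cfp_duration_s, minislot_s, rate_bps):
        # Split the CFP into fixed-length mini time slots and give each
        # periodic flow enough of them, per superframe, to drain its payload.
        total_slots = int(cfp_duration_s // minislot_s)
        bits_per_slot = int(rate_bps * minislot_s)
        schedule, used = {}, 0
        for f in flows:
            need = ceil(f.payload_bits / (f.period_sf * bits_per_slot))
            schedule[f.name] = need
            used += need
        return schedule, used <= total_slots   # allocation and feasibility flag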

Relevance:

30.00%

Publisher:

Abstract:

A hexagonal wireless sensor network is one whose topology contains a subset of nodes with six peer neighbors each; these nodes form a backbone for multi-hop communications. In previous work, we proposed the use of the hexagonal topology in wireless sensor networks and discussed its properties in relation to real-time (bounded-latency) multi-hop communications in large-scale deployments. That work did not consider the problem of forming the hexagonal topology in practice, which is the subject of this research. In this paper, we present a decentralized algorithm that forms the hexagonal topology backbone in an arbitrary but sufficiently dense network deployment. We implemented a prototype of our algorithm in nesC for TinyOS-based platforms and present data from field tests of the implementation, collected using a deployment of fifty wireless sensor nodes.
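
The paper's algorithm is decentralized and runs on the nodes themselves; purely to illustrate what a hexagonal backbone is, the sketch below picks, in a centralized way, the deployed node closest to each vertex of an ideal hexagonal (triangular) lattice laid over the deployment area. The function name and the spacing and max_offset parameters are assumptions.

    import numpy as np

    def hexagonal_backbone(nodes, spacing, max_offset):
        # nodes: (N, 2) array of deployed node coordinates.
        # For every vertex of an ideal hexagonal (triangular) lattice over the
        # deployment area, pick the nearest deployed node as a backbone member.
        xs, ys = nodes[:, 0], nodes[:, 1]
        backbone = set()
        row, y = 0, ys.min()
        while y <= ys.max():
            x = xs.min() + (spacing / 2 if row % 2 else 0.0)
            while x <= xs.max():
                d = np.hypot(xs - x, ys - y)
                k = int(np.argmin(d))
                if d[k] <= max_offset:        # close enough to the ideal vertex
                    backbone.add(k)
                x += spacing
            y += spacing * np.sqrt(3) / 2     # row pitch of a triangular lattice
            row += 1
        return sorted(backbone)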

Relevance:

30.00%

Publisher:

Abstract:

Fingerprinting is an indoor location technique, based on wireless networks, in which data stored during the offline phase is compared with data collected by the mobile device during the online phase. In most real-life scenarios, the mobile node used during the offline phase is different from the mobile nodes that will be used during the online phase, so there may be very significant differences between the Received Signal Strength (RSS) values acquired by the mobile node and the ones stored in the Fingerprinting Map. As a consequence, this difference between RSS values can increase the location estimation error. One possible way to minimize these differences is to adapt the RSS values acquired during the online phase before sending them to the Location Estimation Algorithm. The internal parameters of the Location Estimation Algorithms, for example the weights of the Weighted k-Nearest Neighbour, may also need to be tuned for every type of terminal. This paper addresses both approaches, using Direct Search optimization methods to adapt the Received Signal Strength values and to tune the Location Estimation Algorithm parameters. As a result, it was possible to reduce the location estimation error relative to that originally obtained without any calibration procedure.
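
A compact sketch of the Weighted k-Nearest Neighbour estimator mentioned above follows, with an additive RSS offset standing in for the Direct Search calibration; the offset value, the inverse-distance weighting, and all names are placeholders for the parameters the paper actually tunes.

    import numpy as np

    def wknn_locate(fp_rss, fp_xy, online_rss, k=4, rss_offset=0.0):
        # fp_rss: (M, A) RSS values stored offline (M reference points, A access points)
        # fp_xy:  (M, 2) coordinates of the reference points
        # online_rss: (A,) RSS vector measured online; rss_offset is an additive
        # adaptation term standing in for the Direct Search calibration.
        adapted = np.asarray(online_rss, dtype=float) + rss_offset
        dists = np.linalg.norm(fp_rss - adapted, axis=1)
        nearest = np.argsort(dists)[:k]
        weights = 1.0 / (dists[nearest] + 1e-9)   # inverse-distance WkNN weights
        weights /= weights.sum()
        return weights @ fp_xy[nearest]           # weighted centroid = position estimate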

Relevance:

30.00%

Publisher:

Abstract:

No literature data above atmospheric pressure could be found for the viscosity of TOTIVI. As a consequence, the present viscosity results could only be compared upon extrapolation of the vibrating-wire data to 0.1 MPa. Independent viscosity measurements were performed at atmospheric pressure using an Ubbelohde capillary in order to compare with the vibrating-wire results, extrapolated by means of the above-mentioned correlation. The two data sets agree within +/- 1%, which is commensurate with the mutual uncertainty of the experimental methods. Comparisons of the literature data obtained at atmospheric pressure with the present extrapolated vibrating-wire viscosity measurements show agreement within +/- 2% for temperatures up to 339 K and within +/- 3.3% for temperatures up to 368 K.

Relevance:

30.00%

Publisher:

Abstract:

Solvatochromic UV-Vis shifts of four indicators (4-nitroaniline, 4-nitroanisole, 4-nitrophenol and N,N-dimethyl-4-nitroaniline) have been measured at 298.15 K in the ternary mixture methanol/1-propanol/acetonitrile (MeOH/1-PrOH/MeCN) at a total of 22 mole fractions, along with 18 additional mole fractions for each of the corresponding binary mixtures, MeOH/1-PrOH, 1-PrOH/MeCN and MeOH/MeCN. These values, combined with our previous experimental results for 2,6-diphenyl-4-(2,4,6-triphenylpyridinium-1-yl)phenolate (Reichardt's betaine dye) in the same mixtures, permitted the computation of the Kamlet-Taft solvent parameters alpha, beta and pi*. The spectroscopic behavior of each probe over each mixture's whole mole fraction range was rationalized using the Bosch and Rosés preferential solvation model. The applied model allowed synergistic behaviors in MeCN/alcohol mixtures to be identified and, thus, the existence of solvent complexes in solution to be inferred. Also, the addition of small amounts of MeCN to the binary mixtures was seen to cause a significant variation in pi*, whereas the addition of alcohol to MeCN mixtures always led to a sudden change in alpha and beta. The behavior of these parameters in the ternary mixture was shown to be mainly determined by the contributions of the underlying binary mixtures.

Relevance:

30.00%

Publisher:

Abstract:

8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.

Relevance:

30.00%

Publisher:

Abstract:

This paper characterizes four ‘fractal vegetables’: (i) cauliflower (Brassica oleracea var. botrytis); (ii) broccoli (Brassica oleracea var. italica); (iii) round cabbage (Brassica oleracea var. capitata); and (iv) Brussels sprout (Brassica oleracea var. gemmifera), by means of electrical impedance spectroscopy and fractional calculus tools. Experimental data is approximated using fractional-order models and the corresponding parameters are determined with a genetic algorithm. The Havriliak-Negami five-parameter model fits the data well, demonstrating that classical formulae can constitute simple and reliable models for characterizing biological structures.
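
For reference, the Havriliak-Negami model and a population-based fit can be written in a few lines. Differential evolution is used here merely as a stand-in for the genetic algorithm mentioned in the abstract, and the impedance form of the model and the parameter bounds are assumptions.

    import numpy as np
    from scipy.optimize import differential_evolution

    def hn_impedance(omega, r_inf, dr, tau, alpha, beta):
        # Havriliak-Negami five-parameter model, written here in impedance form.
        return r_inf + dr / (1.0 + (1j * omega * tau) ** alpha) ** beta

    def fit_hn(omega, z_measured):
        # Minimize the complex misfit with a population-based optimizer
        # (differential evolution as a stand-in for a genetic algorithm).
        def cost(p):
            return float(np.sum(np.abs(hn_impedance(omega, *p) - z_measured) ** 2))
        bounds = [(0, 1e4), (0, 1e6), (1e-7, 1.0), (0.05, 1.0), (0.05, 1.0)]
        return differential_evolution(cost, bounds, seed=0).x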

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Informatics Engineering.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to predict time series of SO2 concentrations emitted by coal-fired power stations, in order to estimate emission episodes in advance and to analyze the influence of some meteorological variables on the prediction. An emission episode is said to occur when the series of bi-hourly means of SO2 is greater than a specific level. For coal-fired power stations it is essential to predict emission episodes sufficiently in advance so that appropriate preventive measures can be taken. We propose a methodology to predict SO2 emission episodes based on an additive model and an algorithm for variable selection. The methodology was applied to the estimation of SO2 emissions registered at sampling locations near a coal-fired power station located in Northern Spain. The results obtained indicate good performance of the model when only two terms of the time series are considered, and show that the inclusion of the meteorological variables in the model is not significant.
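
A rough sketch of an additive model built only on lagged terms of the bi-hourly series, flagging an episode when the one-step-ahead prediction exceeds a level, is shown below. The spline basis, the choice of two lags, and the threshold handling are assumptions, not the additive model and variable-selection procedure of the paper.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import SplineTransformer
    from sklearn.linear_model import LinearRegression

    def episode_forecast(series, level, lags=(1, 2)):
        # Additive spline model on lagged terms of the bi-hourly SO2 series;
        # SplineTransformer expands each lag separately, so a linear model on
        # the expanded features is additive in the lagged terms.
        series = np.asarray(series, dtype=float)
        max_lag = max(lags)
        X = np.column_stack([series[max_lag - l:len(series) - l] for l in lags])
        y = series[max_lag:]
        model = make_pipeline(SplineTransformer(n_knots=6, degree=3), LinearRegression())
        model.fit(X, y)
        next_value = float(model.predict([[series[-l] for l in lags]])[0])
        return next_value, next_value > level    # prediction and episode flag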

Relevance:

30.00%

Publisher:

Abstract:

Waveform tomographic imaging of crosshole georadar data is a powerful method to investigate the shallow subsurface because of its ability to provide images of pertinent petrophysical parameters with extremely high spatial resolution. All current crosshole georadar waveform inversion strategies are based on the assumption of frequency-independent electromagnetic constitutive parameters. However, in reality, these parameters are known to be frequency-dependent and complex and thus recorded georadar data may show significant dispersive behavior. In this paper, we evaluate synthetically the reconstruction limits of a recently published crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. Our results indicate that, when combined with a source wavelet estimation procedure that provides a means of partially accounting for the frequency-dependent effects through an "effective" wavelet, the inversion algorithm performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
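
As a concrete example of the frequency-dependent constitutive behavior referred to above, a single-pole Debye model of the complex relative permittivity can be written as follows. It is a textbook dispersion model offered purely for illustration, not the dispersion description used in the paper, and the parameter names are assumptions.

    import numpy as np

    EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

    def debye_permittivity(freq_hz, eps_inf, delta_eps, tau_s, sigma_dc):
        # Single-pole Debye model of the complex relative permittivity,
        # including a DC-conductivity loss term.
        omega = 2 * np.pi * np.asarray(freq_hz, dtype=float)
        return eps_inf + delta_eps / (1 + 1j * omega * tau_s) - 1j * sigma_dc / (omega * EPS0)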

Relevance:

30.00%

Publisher:

Abstract:

High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far-reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from limited resolution, and may prove inadequate in complex environments. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism, and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential to reliably resolve both the geometry and the acoustic properties of features smaller than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in the absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments, where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
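
The kernel described above solves the 2-D acoustic wave equation with finite differences in the time domain. A minimal second-order-in-space-and-time update of that kind is sketched below; the names, boundary handling, and source injection are simplified assumptions, and a real inversion kernel would add absorbing boundaries and gradient computations.

    import numpy as np

    def acoustic_fdtd_2d(c, dx, dt, n_steps, src_ij, wavelet):
        # c: (ny, nx) velocity model [m/s]; this explicit scheme requires the
        # CFL condition dt <= dx / (c.max() * sqrt(2)).
        p_prev = np.zeros(c.shape)
        p = np.zeros(c.shape)
        coeff = (c * dt / dx) ** 2
        for it in range(n_steps):
            lap = np.zeros_like(p)
            lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] +
                               p[1:-1, 2:] + p[1:-1, :-2] - 4.0 * p[1:-1, 1:-1])
            p_next = 2.0 * p - p_prev + coeff * lap
            if it < len(wavelet):
                p_next[src_ij] += wavelet[it]    # inject the source wavelet
            p_prev, p = p, p_next
        return p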

Relevance:

30.00%

Publisher:

Abstract:

Genes underlying mutant phenotypes can be isolated by combining marker discovery, genetic mapping and resequencing, but a more straightforward strategy for mapping mutations would be the direct comparison of mutant and wild-type genomes. Applying such an approach, however, is hampered by the need for reference sequences and by mutational loads that confound the unambiguous identification of causal mutations. Here we introduce NIKS (needle in the k-stack), a reference-free algorithm based on comparing k-mers in whole-genome sequencing data for precise discovery of homozygous mutations. We applied NIKS to eight mutants induced in nonreference rice cultivars and to two mutants of the nonmodel species Arabis alpina. In both species, comparing pooled F2 individuals selected for mutant phenotypes revealed small sets of mutations including the causal changes. Moreover, comparing M3 seedlings of two allelic mutants unambiguously identified the causal gene. Thus, for any species amenable to mutagenesis, NIKS enables forward genetics without requiring segregating populations, genetic maps and reference sequences.
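
A toy version of the underlying idea, comparing the k-mer spectra of two read sets and returning k-mers private to the mutant sample, is sketched below; the k-mer length, the count threshold, and the names are illustrative, and this is not the NIKS pipeline itself.

    from collections import Counter

    def mutant_specific_kmers(mutant_reads, wildtype_reads, k=31, min_count=3):
        # Count k-mers in each read set and keep those that occur repeatedly in
        # the mutant sample but never in the wild type: candidate mutation sites.
        def spectrum(reads):
            counts = Counter()
            for read in reads:
                for i in range(len(read) - k + 1):
                    counts[read[i:i + k]] += 1
            return counts

        mutant = spectrum(mutant_reads)
        wildtype = spectrum(wildtype_reads)
        return {kmer for kmer, n in mutant.items()
                if n >= min_count and kmer not in wildtype}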

Relevance:

30.00%

Publisher:

Abstract:

Positron emission tomography with [18F]fluorodeoxyglucose (FDG-PET) plays a well-established role in assisting early detection of frontotemporal lobar degeneration (FTLD). Here, we examined the impact of intensity normalization to different reference areas on the accuracy of FDG-PET in discriminating between patients with mild FTLD and healthy elderly subjects. FDG-PET was conducted at two centers using different acquisition protocols: 41 FTLD patients and 42 controls were studied at center 1, and 11 FTLD patients and 13 controls were studied at center 2. All PET images were intensity normalized to the cerebellum, the primary sensorimotor cortex (SMC), the cerebral global mean (CGM), and a reference cluster with the most preserved FDG uptake in the aforementioned patient group of center 1. Metabolic deficits in the patient group at center 1 appeared 1.5, 3.6, and 4.6 times greater in spatial extent when tracer uptake was normalized to the reference cluster rather than to the cerebellum, SMC, and CGM, respectively. Logistic regression analyses based on normalized values from FTLD-typical regions showed that, at center 1, cerebellar, SMC, CGM, and cluster normalizations differentiated patients from controls with accuracies of 86%, 76%, 75% and 90%, respectively. A similar order of effects was found at center 2. Cluster normalization leads to a significant increase in statistical power for detecting early FTLD-associated metabolic deficits. The established FTLD-specific cluster can be used to improve detection of FTLD on a single-case basis at independent centers, a decisive step towards early diagnosis and prediction of FTLD syndromes enabling specific therapies in the future.
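
In practice, intensity normalization to a reference area reduces to dividing each scan by the mean uptake inside a reference mask. The sketch below assumes the image and mask are already co-registered numpy arrays; names and the simple global scaling are illustrative, not the paper's processing pipeline.

    import numpy as np

    def normalize_to_reference(pet_image, reference_mask):
        # Divide the scan by the mean uptake inside the reference region
        # (e.g. cerebellum, SMC, CGM, or a data-driven reference cluster).
        pet_image = np.asarray(pet_image, dtype=float)
        return pet_image / pet_image[reference_mask].mean()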