878 results for Topologies on an arbitrary set


Relevance: 100.00%

Abstract:

Presentation given at the Al-Azhar Engineering First Conference, AEC'89, Dec. 9-12, 1989, Cairo, Egypt. The paper suggests an infinite storage scheme divided into one online volume and an arbitrary number of offline volumes, arranged in a linear chain, which hold records that have not been accessed recently. The online volume holds the records in sorted order (e.g. as a B-tree) and contains the shortest prefixes of keys of records already pushed offline. As new records enter, older ones are retired to the volume that will go offline next. Statistical arguments are given for the rate at which an offline volume needs to be fetched to reload a record that was retired earlier. The rate depends on the distribution of access probabilities as a function of time. Applications include medical records, production records and other data which need to be kept for a long time for legal reasons.
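
A minimal sketch of the retire-and-lookup idea (all names, the capacity trigger, and the prefix bookkeeping are illustrative assumptions; the paper's online volume is a sorted structure such as a B-tree, approximated here by a plain dictionary):

```python
# Illustrative sketch of the online/offline storage scheme described above.
# A real design would batch retirements per volume and track access recency.

class TieredStore:
    def __init__(self, capacity):
        self.capacity = capacity   # max records kept online
        self.online = {}           # key -> record
        self.prefixes = {}         # shortest key prefix -> offline volume id
        self.offline = []          # one dict per offline volume

    def insert(self, key, record):
        self.online[key] = record
        if len(self.online) > self.capacity:
            self._retire_oldest()

    def _retire_oldest(self):
        # Retire the oldest online record to a new offline volume and keep
        # only a short key prefix online as a pointer to that volume.
        key, record = next(iter(self.online.items()))
        del self.online[key]
        self.offline.append({key: record})
        self.prefixes[self._shortest_free_prefix(key)] = len(self.offline) - 1

    def _shortest_free_prefix(self, key):
        for i in range(1, len(key) + 1):
            if key[:i] not in self.prefixes:
                return key[:i]
        return key

    def lookup(self, key):
        if key in self.online:
            return self.online[key]
        # Probe offline volumes whose index prefix matches, longest first;
        # each probe corresponds to fetching an offline volume.
        for i in range(len(key), 0, -1):
            vol_id = self.prefixes.get(key[:i])
            if vol_id is not None:
                record = self.offline[vol_id].get(key)
                if record is not None:
                    return record
        return None
```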

Relevance: 100.00%

Abstract:

We present a component-based approach for recognizing objects under large pose changes. From a set of training images of a given object, we extract a large number of components, which are clustered based on the similarity of their image features and their locations within the object image. The cluster centers form an initial set of component templates, from which we select a subset for the final recognizer. In experiments we evaluate different sizes and types of components and three standard techniques for component selection. The component classifiers are finally compared to global classifiers on a database of four objects.
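
As a hedged illustration of the clustering step (patch size, stride, the location weighting, and the number of clusters are assumptions, not the paper's settings; scikit-learn's KMeans stands in for whatever clustering the authors used):

```python
# Sketch: extract patches from training images, cluster on appearance plus
# normalized image location, and keep cluster centers as component templates.
import numpy as np
from sklearn.cluster import KMeans

def component_templates(images, patch=12, stride=6, k=50, loc_weight=0.5):
    feats = []
    for img in images:                       # img: 2-D grayscale array
        h, w = img.shape
        for y in range(0, h - patch, stride):
            for x in range(0, w - patch, stride):
                appearance = img[y:y + patch, x:x + patch].ravel()
                location = loc_weight * np.array([y / h, x / w])
                feats.append(np.concatenate([appearance, location]))
    feats = np.array(feats)
    km = KMeans(n_clusters=k, n_init=10).fit(feats)
    # Drop the location coordinates; the appearance part is the template.
    return km.cluster_centers_[:, :patch * patch].reshape(k, patch, patch)
```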

Relevance: 100.00%

Abstract:

The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a Hermite-polynomial-based basis. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. Thus we propose a weighted linear regression approach, in which all polynomials up to order k are used as predictor variables and weights are proportional to the reference density. Finally, in the case of second-order Hermite polynomials (normal reference) and first-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison among different rocks of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
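
A sketch of the weighted-regression route to the coordinates for the Hermite (normal-reference) case; treating the log-density as the regression response is an assumption made for this example, as are the grid and the polynomial order:

```python
# Estimate coordinates of a density w.r.t. a Hermite-polynomial basis with a
# normal reference, via weighted least squares with weights proportional to
# the reference density (the weighting scheme named in the abstract).
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from scipy.stats import norm

def hermite_coordinates(x, density_vals, order=4):
    """x: grid points; density_vals: density evaluated on x."""
    X = hermevander(x, order)        # columns: He_0(x), ..., He_order(x)
    w = norm.pdf(x)                  # weights ~ normal reference density
    y = np.log(density_vals)         # assumed response for this sketch
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return coef
```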

Relevance: 100.00%

Abstract:

This paper presents a complete control architecture that has been designed to fulfill predefined missions with an autonomous underwater vehicle (AUV). The control architecture has three levels of control: mission level, task level and vehicle level. The novelty of the work resides in the mission level, which is built with a Petri net that defines the sequence of tasks to be executed depending on the unpredictable situations that may occur. The task control system is composed of a set of active behaviours and a coordinator that selects the most appropriate vehicle action at each moment. The paper focuses on the design of the mission controller and its interaction with the task controller. Simulations, inspired by an industrial underwater inspection of a dam grate, show the effectiveness of the control architecture.
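
A minimal Petri-net sketch of the mission level (places, transitions, and the toy mission are hypothetical; the paper's actual net and task vocabulary are not reproduced):

```python
# Tiny Petri net: transitions fire when all input places hold a token,
# consuming input tokens and producing output tokens.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Toy mission: transit to the grate, then either inspect or abort.
net = PetriNet({"at_surface": 1})
net.add_transition("transit", ["at_surface"], ["at_grate"])
net.add_transition("inspect", ["at_grate"], ["done"])
net.add_transition("abort", ["at_grate"], ["at_surface"])
net.fire("transit")
if net.enabled("inspect"):
    net.fire("inspect")
```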

Relevance: 100.00%

Abstract:

Describes a method to encode a decimated model of an isosurface in an octree representation while maintaining volume data if it is needed. The proposed technique is based on grouping the marching cubes (MC) patterns into five configurations according to the topology and the number of planes of the surface that are contained in a cell. Moreover, the discrete number of planes on which the surface lies is fixed. Starting from a complete volume octree, with the isosurface encoded at terminal nodes according to the new configurations, a bottom-up strategy is taken for merging cells. Such a strategy allows one to implicitly represent co-planar faces in the upper octree levels without introducing any error. At the end of this merging process, when it is required, a reconstruction strategy is applied to generate the surface contained in the intersected octree leaves. Some examples with medical data demonstrate that a reduction of up to 50% in the number of polygons can be achieved.
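
A hedged sketch of the bottom-up merging strategy (the Node layout and the configuration-equality test stand in for the paper's five MC-pattern configurations):

```python
# Bottom-up octree merge: collapse a node whose eight children are leaves
# carrying the same (co-planar) surface configuration, so coplanar faces
# are represented implicitly at upper levels without error.

class Node:
    def __init__(self, children=None, config=None):
        self.children = children or []   # 8 children, or [] for a leaf
        self.config = config             # surface configuration at a leaf

def merge(node):
    if not node.children:
        return node
    node.children = [merge(c) for c in node.children]
    if all(not c.children for c in node.children):
        configs = {c.config for c in node.children}
        if len(configs) == 1:
            node.config, node.children = configs.pop(), []
    return node
```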

Relevance: 100.00%

Abstract:

The no response test is a new scheme in inverse problems for partial differential equations which was recently proposed in [D. R. Luke and R. Potthast, SIAM J. Appl. Math., 63 (2003), pp. 1292–1312] in the framework of inverse acoustic scattering problems. The main idea of the scheme is to construct special probing waves which are small on some test domain. Then the response to these waves is constructed. If the response is small, the unknown object is assumed to be a subset of the test domain. The response is constructed from one, several, or many particular solutions of the problem under consideration. In this paper, we investigate the convergence of the no response test for the reconstruction of information about inclusions D from the Cauchy values of solutions to the Helmholtz equation on an outer surface $\partial\Omega$ with $\overline{D} \subset \Omega$. We show that the one-wave no response test provides a criterion to test the analytic extensibility of a field. In particular, we investigate the construction of approximations to the set of singular points $N(u)$ of the total field u from one given pair of Cauchy data. Thus, the no response test solves a particular version of the classical Cauchy problem. Also, if an infinite number of fields is given, we prove that a multifield version of the no response test reconstructs the unknown inclusion D. This is the first convergence analysis that could be achieved for the no response test.
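
Schematically, in the original acoustic-scattering setting of Luke and Potthast, the test evaluates, for a test domain $G$, a functional of the following form (the notation and the exact expression below are assumptions for orientation, not quoted from either paper):

\[
  I_\epsilon(G) \;=\; \sup\Big\{\, \Big|\int_{\mathbb{S}} u^\infty(d)\, g(d)\, \mathrm{d}s(d)\Big| \;:\; \|v_g\|_{C(\overline{G})} \le \epsilon \,\Big\},
  \qquad
  v_g(x) \;=\; \int_{\mathbb{S}} e^{\mathrm{i}k\, x\cdot d}\, g(d)\, \mathrm{d}s(d),
\]

where $v_g$ is the Herglotz wave with density $g$ and $u^\infty$ the measured far field: a small response $I_\epsilon(G)$ is taken as evidence that the unknown object lies in the test domain.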

Relevance: 100.00%

Abstract:

There are various situations in which it is natural to ask whether a given collection of k functions, $\rho_j(r_1,\dots,r_j)$, $j=1,\dots,k$, defined on a set X, are the first k correlation functions of a point process on X. Here we describe some necessary and sufficient conditions on the $\rho_j$'s for this to be true. Our primary examples are $X=\mathbb{R}^d$, $X=\mathbb{Z}^d$, and X an arbitrary finite set. In particular, we extend a result by Ambartzumian and Sukiasian showing realizability at sufficiently small densities $\rho_1(r)$. Typically, if any realizing process exists there will be many (even an uncountable number); in this case we prove, when X is a finite set, the existence of a realizing Gibbs measure with k-body potentials which maximizes the entropy among all realizing measures. We also investigate in detail a simple example in which a uniform density $\rho$ and a translation-invariant $\rho_2$ are specified on $\mathbb{Z}$; there is a gap between our best upper bound on possible values of $\rho$ and the largest $\rho$ for which realizability can be established.
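
For reference, the standard definition assumed here (the abstract does not spell it out): for a simple point process on a discrete set X with occupation numbers $\eta(r)\in\{0,1\}$, the correlation functions are

\[
  \rho_j(r_1,\dots,r_j) \;=\; \mathbb{E}\Big[\prod_{i=1}^{j} \eta(r_i)\Big],
  \qquad r_1,\dots,r_j \in X \ \text{distinct},
\]

so that $\rho_1$ is the density and $\rho_2$ encodes pair correlations.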

Relevance: 100.00%

Abstract:

A methodology for using remotely sensed data to both generate and evaluate a hydraulic model of floodplain inundation is presented for a rural case study in the United Kingdom: Upton-upon-Severn. Remotely sensed data have been processed and assembled to provide an excellent test data set for both model construction and validation. In order to assess the usefulness of the data and the issues encountered in their use, two models for floodplain inundation were constructed: one based on an industry-standard one-dimensional approach and the other based on a simple two-dimensional approach. The results and their implications for the future use of remotely sensed data for predicting flood inundation are discussed. Key conclusions for the use of remotely sensed data are that care must be taken to integrate different data sources for both model construction and validation, and that improvements in ground-height data shift the focus of model uncertainty to other sources, such as boundary conditions. The differences between the two models are found to be of minor significance.

Relevance: 100.00%

Abstract:

We consider boundary value problems posed on an interval [0,L] for an arbitrary linear evolution equation in one space dimension with spatial derivatives of order n. We characterize a class of such problems that admit a unique solution and are well posed in this sense. Such well-posed boundary value problems are obtained by prescribing N conditions at x=0 and n−N conditions at x=L, where N depends on n and on the sign of the coefficient of the highest-degree term in the dispersion relation of the equation. For the problems in this class, we give a spectrally decomposed integral representation of the solution; moreover, we show that these are the only problems that admit such a representation. These results can be used to establish the well-posedness, at least locally in time, of some physically relevant nonlinear evolution equations in one space dimension.
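
For orientation, the count of boundary conditions takes the following form (a standard statement for equations of this type, assumed here rather than quoted from the paper): for $q_t + \alpha\,\partial_x^{\,n} q = 0$ on $[0,L]$,

\[
  N \;=\;
  \begin{cases}
    n/2, & n \ \text{even},\\
    (n+1)/2 \ \text{or}\ (n-1)/2, & n \ \text{odd, depending on the sign of}\ \alpha,
  \end{cases}
\]

with the remaining $n-N$ conditions prescribed at $x=L$.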

Relevance: 100.00%

Abstract:

Disease-weather relationships influencing Septoria leaf blotch (SLB) preceding growth stage (GS) 31 were identified using data from 12 sites in the UK covering 8 years. Based on these relationships, an early-warning predictive model for SLB on winter wheat was formulated to predict the occurrence of a damaging epidemic (defined as disease severity of at least 5% on the top three leaf layers). The final model was based on accumulated rain > 3 mm in the 80-day period preceding GS 31 (roughly from early February to the end of April) and accumulated minimum temperature above a 0°C base in the 50-day period starting 120 days before GS 31 (approximately January and February). The model was validated on an independent data set, on which the prediction accuracy was influenced by cultivar resistance. Over all observations, the model had a true positive proportion of 0.61, a true negative proportion of 0.73, a sensitivity of 0.83, and a specificity of 0.18. The true negative proportion increased to 0.85 for resistant cultivars and decreased to 0.50 for susceptible cultivars. Potential fungicide savings are most likely to be made with resistant cultivars, but such benefits would need to be identified with an in-depth evaluation.
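
An illustrative computation of the two weather predictors (the window lengths follow the abstract; reading "accumulated rain > 3 mm" as summing rainfall on days exceeding 3 mm, along with the array layout, is an assumption):

```python
# Compute the two model inputs from daily weather series ending at GS 31.
import numpy as np

def slb_predictors(daily_rain, daily_tmin, gs31_index):
    """daily_rain, daily_tmin: 1-D arrays of daily values;
    gs31_index: index of the day GS 31 is reached."""
    # 80-day rain window immediately preceding GS 31.
    rain_window = daily_rain[gs31_index - 80:gs31_index]
    acc_rain = np.sum(rain_window[rain_window > 3.0])
    # 50-day minimum-temperature window starting 120 days before GS 31,
    # accumulated above a 0 degrees C base.
    temp_window = daily_tmin[gs31_index - 120:gs31_index - 70]
    acc_tmin = np.sum(np.maximum(temp_window, 0.0))
    return acc_rain, acc_tmin
```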

Relevance: 100.00%

Abstract:

In the past decade, airborne LIght Detection And Ranging (LIDAR) has been recognised by both the commercial and public sectors as a reliable and accurate source for land surveying in environmental, engineering and civil applications. Commonly, the first task in investigating LIDAR point clouds is to separate ground and object points. Skewness Balancing has been proven to be an efficient non-parametric, unsupervised classification algorithm for this challenge. Initially developed for moderate terrain, the algorithm needs to be adapted to handle sloped terrain. This paper addresses the difficulty of separating object and ground points in LIDAR data over hilly terrain. A case study has been carried out on LIDAR data sets that are diverse in terms of data provider, resolution and LIDAR echo. Several sites in urban and rural areas with man-made structures and vegetation in moderate and hilly terrain have been investigated, and three categories have been identified. An urban scene with a river bank has been selected for deeper investigation in order to extend the existing algorithm. The results show that an iterative use of Skewness Balancing is suitable for sloped terrain.
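
A hedged sketch of the core Skewness Balancing step as commonly described in the literature (the paper's iterative, slope-adapted variant is not reproduced): points are peeled off from the top until the skewness of the remaining height distribution is no longer positive.

```python
# Naive Skewness Balancing: remove the highest points one at a time until
# skewness <= 0; the remaining points are treated as ground.
import numpy as np
from scipy.stats import skew

def skewness_balancing(heights):
    """heights: 1-D array of LIDAR point elevations.
    Returns a boolean mask marking ground points."""
    order = np.argsort(heights)        # ascending by elevation
    keep = len(heights)
    while keep > 2 and skew(heights[order[:keep]]) > 0:
        keep -= 1                      # peel off the highest point
    mask = np.zeros(len(heights), dtype=bool)
    mask[order[:keep]] = True
    return mask
```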

Relevance: 100.00%

Abstract:

The popularity of wireless local area networks (WLANs) has resulted in their dense deployments around the world. While this increases capacity and coverage, the resulting increase in interference can severely degrade the performance of WLANs. However, the impact of interference on throughput in dense WLANs with multiple access points (APs) has received very limited prior research attention. This is believed to be due to 1) the inaccurate assumption that throughput is always a monotonically decreasing function of interference and 2) the prohibitively high complexity of an accurate analytical model. In this work, we firstly provide a useful classification of commonly found interference scenarios. Secondly, we investigate the impact of interference on throughput for each class, based on an approach that determines the possibility of parallel transmissions. Extensive packet-level simulations using OPNET have been performed to support the observations made. Interestingly, the results show that in some topologies increased interference can lead to higher throughput, and vice versa.
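
A toy geometric check in the spirit of the parallel-transmission approach mentioned above (the disc-based carrier-sense/interference model and the threshold values are assumptions, not the paper's model):

```python
# Two links can be active simultaneously if each transmitter is outside the
# other's carrier-sense range and neither transmitter lies within
# interference range of the other link's receiver.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def can_transmit_in_parallel(tx1, rx1, tx2, rx2,
                             cs_range=550.0, interference_range=300.0):
    return (dist(tx1, tx2) > cs_range and
            dist(tx2, rx1) > interference_range and
            dist(tx1, rx2) > interference_range)
```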

Relevance: 100.00%

Abstract:

We agree with Duckrow and Albano [Phys. Rev. E 67, 063901 (2003)] and Quian Quiroga et al. [Phys. Rev. E 67, 063902 (2003)] that mutual information (MI) is a useful measure of dependence for electroencephalogram (EEG) data, but we show that the improvement seen in the performance of MI in extracting dependence trends from EEG depends more on the type of MI estimator than on any embedding technique used. In an independent study, conducted in search of an optimal MI estimator for EEG applications in particular, we examined the performance of a number of MI estimators on the data set used by Quian Quiroga et al. in their original study, where the performance of different dependence measures on real data was investigated [Phys. Rev. E 65, 041903 (2002)]. We show that for EEG applications the best performance among the investigated estimators is achieved by the k-nearest-neighbors estimator, which supports the conjecture by Quian Quiroga et al. in Phys. Rev. E 67, 063902 (2003) that the nearest-neighbor estimator is the most precise method for estimating MI.
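
The abstract does not name a particular k-nearest-neighbor estimator; a common choice, assumed here, is the Kraskov-Stögbauer-Grassberger (KSG) estimator, sketched below.

```python
# KSG estimator (algorithm 1) for mutual information between 1-D samples.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_mutual_information(x, y, k=4):
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    n = len(x)
    joint = np.hstack([x, y])
    # Chebyshev distance to each point's k-th neighbor in the joint space
    # (k + 1 because the query returns the point itself first).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    tx, ty = cKDTree(x), cKDTree(y)
    # Count marginal neighbors strictly within eps (self excluded).
    nx = np.array([len(tx.query_ball_point(xi, ei - 1e-12)) - 1
                   for xi, ei in zip(x, eps)])
    ny = np.array([len(ty.query_ball_point(yi, ei - 1e-12)) - 1
                   for yi, ei in zip(y, eps)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```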

Relevance: 100.00%

Abstract:

Uncertainty affects all aspects of the property market, but one area where its impact is particularly significant is feasibility analysis. Any development is affected by differences between market conditions at the conception of the project and the market realities at the time of completion. The feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcomes for the sale price of the completed development, the construction costs, and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time sensitive, and analysis needs to be undertaken to show the impact of time on the viability of the project. The future is uncertain, and a full feasibility analysis should be able to model the upside and downside risk across a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value, but they tend to be single-point analyses based upon a single set of "likely" inputs. In this paper we look at the practical impact of uncertainty in variables using a simulation model (Crystal Ball©) with an actual case study of an urban redevelopment plan for an Italian municipality. This allows the appraiser to address the issues of uncertainty involved and thus provide the decision maker with a better understanding of the risk of development. This approach is then refined using a "two-dimensional technique" to distinguish between "uncertainty" and "variability" and thus create a more robust model.
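
A plain-NumPy Monte Carlo sketch of the kind of simulation described (Crystal Ball replaced for illustration; every distribution and figure below is invented, not taken from the case study):

```python
# Monte Carlo residual land valuation: sample uncertain inputs, compute the
# residual left for land after costs and finance, and report its spread.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uncertain inputs: sale value on completion, build cost, schedule (months).
sale_value = rng.normal(12_000_000, 1_500_000, n)
build_cost = rng.triangular(6_000_000, 7_000_000, 9_000_000, n)
months = rng.triangular(18, 24, 36, n)

annual_rate = 0.06   # assumed finance rate
# Rough finance cost: interest on half the build cost over the schedule.
finance = (build_cost / 2) * ((1 + annual_rate) ** (months / 12) - 1)

residual = sale_value - build_cost - finance
print(f"mean residual land value: {residual.mean():,.0f}")
print(f"5th/95th percentiles: {np.percentile(residual, [5, 95])}")
```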