969 results for Real-world


Relevance: 60.00%

Abstract:

Despite significant advances in recent years, structure-from-motion (SfM) pipelines suffer from two important drawbacks. Apart from requiring significant computational power to solve the large-scale computations involved, such pipelines sometimes fail to reconstruct correctly when the accumulated error of incremental reconstruction is large or when the number of 3D-to-2D correspondences is insufficient. In this paper we present a novel approach to mitigate the above-mentioned drawbacks. Using an image match graph based on matching features, we partition the image data set into smaller components which are reconstructed independently. Following such reconstructions, we utilise the available epipolar relationships that connect images across components to align the individual reconstructions in a global frame of reference. This yields a significant speed-up of at least one order of magnitude and also mitigates reconstruction failures, with only a marginal loss in accuracy. The effectiveness of our approach is demonstrated on several large-scale real-world data sets.
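A minimal sketch of the partitioning step described above, using a generic modularity-based community detection as a stand-in for the paper's own partitioning criterion; the input format and the networkx library choice are assumptions:

```python
import networkx as nx

def partition_match_graph(pairwise_matches):
    # pairwise_matches: {(img_i, img_j): number_of_feature_matches},
    # a hypothetical input format for the image match graph.
    G = nx.Graph()
    for (i, j), n_matches in pairwise_matches.items():
        G.add_edge(i, j, weight=n_matches)
    # Split the data set into densely matched components; each component
    # would then be reconstructed independently and later aligned into a
    # global frame using cross-component epipolar relationships.
    communities = nx.algorithms.community.greedy_modularity_communities(
        G, weight="weight")
    return [sorted(c) for c in communities]
```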

Relevance: 60.00%

Abstract:

We propose a practical, feature-level and score-level fusion approach that combines acoustic and estimated articulatory information for both text-independent and text-dependent speaker verification. From a practical point of view, we study how to improve speaker verification performance by combining dynamic articulatory information with conventional acoustic features. For text-independent speaker verification, we find that concatenating articulatory features obtained from measured speech production data with conventional Mel-frequency cepstral coefficients (MFCCs) improves performance dramatically. However, since directly measuring articulatory data is not feasible in many real-world applications, we also experiment with estimated articulatory features obtained through acoustic-to-articulatory inversion. We explore both feature-level and score-level fusion methods and find that overall system performance is significantly enhanced even with estimated articulatory features. Such a performance boost could be due to the inter-speaker variation information embedded in the estimated articulatory features. Since the dynamics of articulation contain important information, we also include inverted articulatory trajectories in text-dependent speaker verification, and demonstrate that the articulatory constraints introduced by inverted articulatory features help to reject wrong-password trials and improve performance after score-level fusion. We evaluate the proposed methods on the X-ray Microbeam database and the RSR2015 database, respectively, for the aforementioned two tasks. Experimental results show more than 15% relative equal error rate reduction for both speaker verification tasks.
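A minimal sketch of score-level fusion under the usual convex-combination scheme; the z-normalization and the weight are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fuse_scores(acoustic_scores, articulatory_scores, w=0.7):
    """Combine per-trial verification scores from two subsystems."""
    def znorm(s):
        s = np.asarray(s, dtype=float)
        return (s - s.mean()) / s.std()
    # Convex combination of normalized scores; w would be tuned on a
    # development set, e.g. to minimize the equal error rate.
    return w * znorm(acoustic_scores) + (1.0 - w) * znorm(articulatory_scores)
```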

Relevance: 60.00%

Abstract:

Complex-systems-inspired analysis suggests the hypothesis that financial meltdowns are abrupt critical transitions that occur when the system reaches a tipping point. Theoretical and empirical studies of climatic and ecological dynamical systems have shown that the approach to a tipping point is preceded by a generic phenomenon called critical slowing down, i.e. an increasingly slow response of the system to perturbations. It has therefore been suggested that critical slowing down may be used as an early warning signal of imminent critical transitions. Whether financial markets exhibit critical slowing down prior to meltdowns remains unclear. Here, our analysis reveals that three major US markets (Dow Jones Index, S&P 500 and NASDAQ) and two European markets (DAX and FTSE) did not exhibit critical slowing down prior to major financial crashes over the last century. However, all markets showed strong trends of rising variability, quantified by time series variance and the spectral function at low frequencies, prior to crashes. These results suggest that financial crashes are not critical transitions that occur in the vicinity of a tipping point. Using a simple model, we argue that financial crashes are likely to be stochastic transitions, which can occur even when the system is far away from the tipping point. Specifically, we show that a gradually increasing strength of stochastic perturbations may have caused abrupt transitions in the financial markets. Broadly, our results highlight the importance of stochastically driven abrupt transitions in real-world scenarios. Our study offers rising variability as a precursor of financial meltdowns, albeit with the limitation that it may signal false alarms.
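A sketch of the two indicators discussed above, computed on rolling windows of returns: variance (the precursor the study reports) and lag-1 autocorrelation (the standard critical slowing down signal). The window length is an arbitrary assumption:

```python
import pandas as pd

def early_warning_indicators(prices, window=250):
    returns = pd.Series(prices).pct_change().dropna()
    # Rising rolling variance: the precursor of crashes reported above.
    variance = returns.rolling(window).var()
    # Rising lag-1 autocorrelation would indicate critical slowing down,
    # which the study finds to be absent before major crashes.
    autocorr = returns.rolling(window).apply(
        lambda x: x.autocorr(lag=1), raw=False)
    return variance, autocorr
```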

Relevance: 60.00%

Abstract:

We propose a completely automatic approach for recognizing low-resolution face images captured in uncontrolled environments. The approach uses multidimensional scaling to learn a common transformation matrix for the entire face, which simultaneously transforms the facial features of the low-resolution and high-resolution training images such that the distance between them approximates the distance that would have been obtained had both images been captured under the same controlled imaging conditions. The stereo matching cost is used to obtain the similarity of two images in the transformed space. Though this gives very good recognition performance, the time taken to compute the stereo matching cost is significant. To overcome this limitation, we propose a reference-based approach in which each face image is represented by its stereo matching costs with respect to a few reference images. Experimental evaluation on challenging real-world databases and comparison with state-of-the-art super-resolution, classifier-based and cross-modal synthesis techniques show the effectiveness of the proposed algorithm.
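A sketch of the reference-based idea: each face is summarized by its vector of stereo matching costs against a few reference images, and recognition then compares these cheap descriptors. The function names and the Euclidean comparison are illustrative assumptions:

```python
import numpy as np

def reference_descriptor(face, references, stereo_cost):
    # stereo_cost(a, b) stands in for the stereo matching cost between
    # two face images; the paper's exact cost is not reproduced here.
    return np.array([stereo_cost(face, ref) for ref in references])

def identify(probe, gallery, references, stereo_cost):
    # Nearest gallery face in the reference-cost space, avoiding a full
    # stereo matching pass against every gallery image.
    d_probe = reference_descriptor(probe, references, stereo_cost)
    dists = [np.linalg.norm(d_probe -
                            reference_descriptor(g, references, stereo_cost))
             for g in gallery]
    return int(np.argmin(dists))
```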

Relevance: 60.00%

Abstract:

Changepoints are abrupt variations in the generative parameters of a data sequence. Online detection of changepoints is useful in modelling and prediction of time series in application areas such as finance, biometrics, and robotics. While frequentist methods have yielded online filtering and prediction techniques, most Bayesian papers have focused on the retrospective segmentation problem. Here we examine the case where the model parameters before and after the changepoint are independent, and we derive an online algorithm for exact inference of the most recent changepoint. We compute the probability distribution of the length of the current "run", or time since the last changepoint, using a simple message-passing algorithm. Our implementation is highly modular so that the algorithm may be applied to a variety of types of data. We illustrate this modularity by demonstrating the algorithm on three different real-world data sets.
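A compact sketch of the run-length recursion described above, in the style of Adams and MacKay's algorithm, using a Normal-Inverse-Gamma conjugate model with a Student-t predictive; the constant hazard rate and the prior values are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def bocpd(data, hazard=0.01, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    T = len(data)
    R = np.zeros((T + 1, T + 1))   # R[t, r] = P(run length r after t points)
    R[0, 0] = 1.0
    mu = np.array([mu0]); kappa = np.array([kappa0])
    alpha = np.array([alpha0]); beta = np.array([beta0])
    for t, x in enumerate(data):
        # Predictive probability of x under each run-length hypothesis.
        pred = stats.t.pdf(x, df=2 * alpha, loc=mu,
                           scale=np.sqrt(beta * (kappa + 1) / (alpha * kappa)))
        # Message passing: grow each run, or reset it at a changepoint.
        R[t + 1, 1:t + 2] = R[t, :t + 1] * pred * (1 - hazard)
        R[t + 1, 0] = np.sum(R[t, :t + 1] * pred * hazard)
        R[t + 1] /= R[t + 1].sum()
        # Update the sufficient statistics of every hypothesis.
        mu_new = (kappa * mu + x) / (kappa + 1)
        beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
        mu = np.concatenate(([mu0], mu_new))
        kappa = np.concatenate(([kappa0], kappa + 1))
        alpha = np.concatenate(([alpha0], alpha + 0.5))
        beta = np.concatenate(([beta0], beta_new))
    return R
```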

Relevance: 60.00%

Abstract:

The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our method to competing methods on synthetic data and apply it to several real-world data sets.
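The generative side of such a model can be sketched by thinning: draw candidates from a homogeneous Poisson process at an upper-bound rate, then accept each with the sigmoid of a latent function, so the accepted points follow intensity lambda(t) = lambda_star * sigmoid(g(t)). The toy latent function below stands in for a Gaussian process draw:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sigmoidal_cox(lam_star, g, T=10.0):
    # Homogeneous candidate events at the bounding rate lam_star on [0, T].
    n = rng.poisson(lam_star * T)
    t = rng.uniform(0.0, T, size=n)
    # Keep each candidate with probability sigmoid(g(t)); the survivors
    # form a draw from the Cox process with intensity lam_star * sigmoid(g).
    keep = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-g(t)))
    return np.sort(t[keep])

# Toy latent function standing in for one Gaussian process sample path.
events = sample_sigmoidal_cox(5.0, lambda t: np.sin(t), T=10.0)
```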

Relevance: 60.00%

Abstract:

In this paper we introduce a weighted complex network model to investigate and recognize the structure of patterns. The usual treatment in pattern recognition models is to describe each pattern as a high-dimensional vector, which, however, is insufficient to express structural information. Thus, a number of methods have been developed to extract structural information, such as the feature extraction algorithms used in pre-processing steps, or the local receptive fields in convolutional networks. In our model, each pattern is represented by a weighted complex network whose topology represents the structure of that pattern. From the training samples we obtain several prototypal complex networks which stand for the general structural characteristics of patterns in different categories, and we use these prototypal networks to recognize unknown patterns. This is an attempt to use complex networks in pattern recognition, and our results show the potential for real-world pattern recognition. A spatial parameter is introduced to obtain the optimal recognition accuracy, and it remains essentially constant, insensitive to the number of training samples. We discuss the interesting properties of the prototypal networks: an approximately linear relation is found between the strength and color of vertices, through which we can compare the structural differences between categories. We visualize these prototypal networks to show that their topology indeed represents the common characteristics of patterns, and we show that the asymmetric strength distribution in these prototypal networks brings high robustness to recognition. Our study may also shed light on understanding the mechanisms of biological neuronal systems in object recognition.
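The construction of such a pattern network can be sketched as follows; the abstract does not specify the weighting rule, so this intensity-and-distance rule is an assumption:

```python
import numpy as np

def pattern_to_network(image, r_spatial=3, sigma=0.5):
    # Vertices are pixels; pixels within the spatial radius are joined by
    # an edge whose weight decays with their intensity difference.
    # r_spatial plays the role of the spatial parameter mentioned above.
    h, w = image.shape
    coords = [(i, j) for i in range(h) for j in range(w)]
    n = len(coords)
    W = np.zeros((n, n))
    for a in range(n):
        i1, j1 = coords[a]
        for b in range(a + 1, n):
            i2, j2 = coords[b]
            if abs(i1 - i2) <= r_spatial and abs(j1 - j2) <= r_spatial:
                W[a, b] = W[b, a] = np.exp(
                    -(float(image[i1, j1]) - float(image[i2, j2])) ** 2 / sigma)
    strength = W.sum(axis=1)  # vertex strength, as analyzed above
    return W, strength
```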

Relevance: 60.00%

Abstract:

This article, after sketching the history of the ways in which philosophy has been historicized, addresses the relationship between two aspects: theoretical philosophy and the history of philosophy. It takes critical distance both from the conception that makes the history of philosophy independent of the search for theoretical truth, and from the identification of the two dimensions, a thesis that took shape above all from Hegel onwards. It proposes an application of hermeneutics to the reading of the philosophical past, in order to open new questions and horizons of inquiry into the world of reality and into the demands and challenges of one's own historical epoch. Such a hermeneutics, in turn, does not slide into relativism, but rather confirms the perennial character of truth in its fruitful persistence over time.

Relevance: 60.00%

Abstract:

Authority files serve to uniquely identify real-world ‘things’ or entities, like documents, persons and organisations, together with their properties, such as relations and features. Already important in the classical library world, authority files are indispensable for adequate information retrieval and analysis in the computer age. This is because, even more than humans, computers are poor at handling ambiguity. Through authority files, people tell computers which terms, names or numbers refer to the same thing or have the same meaning, by giving equivalent notions the same identifier. Authority files thus signpost the internet, where these identifiers are interlinked on the basis of relevance. When executing a query, computers are able to navigate from identifier to identifier by following these links and collect the queried information along these so-called ‘crosswalks’. In this context, identifiers also go under the name of controlled access points. Identifiers become even more crucial now that massive data collections, like library catalogues or research datasets, are releasing their hitherto contained data directly to the internet. This development is known as Linked Open Data, and the corresponding name for the internet is the Web of Data, as opposed to the classical Web of Documents.
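A toy illustration of the core mechanism: equivalent labels share one identifier, so a query on any variant resolves to the same entity. The labels and the identifier are invented for the example:

```python
# Variant names mapped to one shared (invented) identifier.
AUTHORITY = {
    "Tolstoy, Leo": "auth:0001",
    "Tolstoi, Lev Nikolaevich": "auth:0001",
    "Tolstoj, Lev": "auth:0001",
}

def resolve(label):
    # Two labels denote the same entity iff they share one identifier.
    return AUTHORITY.get(label)

assert resolve("Tolstoy, Leo") == resolve("Tolstoj, Lev")
```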

Relevance: 60.00%

Abstract:

The problem discussed is the stability of two input-output feedforward and feedback relations under an integral-type constraint defining an admissible class of feedback controllers. Sufficiency-type conditions are given for the positive, bounded, closed-range feed-forward operator to be strictly positive and hence boundedly invertible, with its inverse also being a strictly positive operator. The general formalism is first established and then linked to properties of some typical contractive and pseudocontractive mappings, while some real-world applications and links of the above formalism to the asymptotic hyperstability of dynamic systems are discussed later on.
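For orientation, the standard argument behind such bounded invertibility (our paraphrase, not the paper's proof) runs as follows:

```latex
\text{If } \langle Gx, x\rangle \ge \varepsilon\|x\|^{2} \text{ for some } \varepsilon > 0,
\text{ then by Cauchy--Schwarz}\quad
\varepsilon\|x\|^{2} \le \langle Gx, x\rangle \le \|Gx\|\,\|x\|
\;\Longrightarrow\; \|Gx\| \ge \varepsilon\|x\|,
```

so G is bounded below, hence injective with closed range; if its range is also dense, G is boundedly invertible with norm of the inverse at most 1/epsilon.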

Relevance: 60.00%

Abstract:

If a product is being designed to be genuinely inclusive, then the designers need to be able to assess the level of exclusion of the product that they are working on and to identify possible areas of improvement. To be of practical use, the assessments need to be quick, consistent and repeatable. The aim of this workshop is to invite attendees to participate in the evaluation of a number of everyday objects using an assessment technique being considered by the workshop organisers. The objectives of the workshop include evaluating the effectiveness of the assessment method, evaluating the accessibility of the products being assessed, and suggesting revisions to the assessment scales being used.

The assessment technique is to be based on the ONS capability measures [1]. This source recognises fourteen capability scales, of which seven are particularly pertinent to product evaluation, namely: motion, dexterity, reach and stretch, vision, hearing, communication, and intellectual functioning. Each of these scales ranges from 0 (fully able) through 1 (minimal impairment) to 10 (severe impairment). The attendees will be asked to rate the products on these scales; a sketch of such a rating record appears after this abstract.

Clearly the assessed accessibility of a product depends on the assumptions made about the context of use, so the attendees will be asked to note clearly the assumptions that they are making about the context in which the product is being assessed. For instance, with a hot water bottle, assumptions have to be made about the availability of hot water, and these can affect the overall accessibility rating. The workshop organisers will not specify the context of use, as the aim is to identify how assessors would use the assessment method in the real world.

The objects being assessed will include items such as remote controls, pill bottles, food packaging, hot water bottles and mobile telephones. The attendees will be encouraged to assess two or more products in detail. Helpers will be on hand to assist and observe the assessments. The assessments will be collated and compared, and feedback about the assessment method sought from the attendees. Drawing on a preliminary review of the assessment results, initial conclusions will be presented at the end of the workshop; more detailed analyses will be made available in subsequent proceedings.

It is intended that the workshop will provide attendees with an opportunity to perform hands-on assessments of a number of everyday products and to identify features which are inclusive and those which are not. It is also intended to encourage an appreciation of the capabilities to be considered when evaluating accessibility.
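As referenced above, a minimal sketch of how one assessment could be recorded on the seven scales; the product, ratings and context are invented for illustration:

```python
# 0 = fully able ... 10 = severe impairment, per the ONS-based scales above.
ONS_SCALES = ("motion", "dexterity", "reach_and_stretch", "vision",
              "hearing", "communication", "intellectual_functioning")

def assessment(product, context, ratings):
    assert set(ratings) == set(ONS_SCALES)
    assert all(0 <= v <= 10 for v in ratings.values())
    # The most demanding capability gives a crude headline figure.
    return {"product": product, "context": context, "ratings": ratings,
            "most_demanding": max(ratings, key=ratings.get)}

example = assessment(
    "hot water bottle",
    "hot water assumed to be readily available",
    {"motion": 1, "dexterity": 6, "reach_and_stretch": 2, "vision": 3,
     "hearing": 0, "communication": 0, "intellectual_functioning": 2},
)
```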

Relevance: 60.00%

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop on Towed Vehicles: Undulating Platforms as Tools for Mapping Coastal Processes and Water Quality Assessment was convened February 5-7, 2007 at the Embassy Suites Hotel, Seaside, California, and sponsored by the ACT-Pacific Coast partnership at the Moss Landing Marine Laboratories (MLML). The workshop was co-chaired by Richard Burt (Chelsea Technology Group) and Stewart Lamerdin (MLML Marine Operations). Invited participants were selected to provide a uniform representation of academic researchers, private-sector product developers, and existing and potential data product users from the resource management community, to enable development of broad consensus opinions on the application of towed undulating vehicle (TUV) platforms in coastal resource assessment and management.

The workshop was organized to address recognized limitations of point-based monitoring programs, which, while providing valuable data, are incapable of describing the spatial heterogeneity and extent of features distributed in the bulk solution. This is particularly true as surveys approach the coastal zone, where tidal and estuarine influences result in spatially and temporally heterogeneous water masses and entrained biological components. Aerial or satellite-based remote sensing can provide an assessment of the areal extent of plumes and blooms, yet provides no information on the third dimension of these features. Towed vehicles offer a cost-effective solution to this problem by providing platforms which can sample in the horizontal, vertical, and time-based domains, and TUVs in particular represent useful platforms for event-response characterization.

This workshop reviewed the current status of towed vehicle technology, focusing on limitations of depth, data telemetry, instrument power demands, and ship requirements, in an attempt to identify means to incorporate such technology more routinely in monitoring and event-response programs. Specifically, the participants were charged to: (1) summarize the state of the art in TUV technologies; (2) identify how TUV platforms are used and how they can assist coastal managers in fulfilling their regulatory and management responsibilities; (3) identify barriers and challenges to the application of TUV technologies in management and research activities; and (4) recommend a series of community actions to overcome the identified barriers and challenges.

A series of plenary presentations was provided to inform the subsequent breakout discussions. Dave Nelson (University of Rhode Island) provided extensive summaries and a real-world assessment of the operational features of a variety of TUV platforms available in the UNOLS scientific fleet. Dr. Burke Hales (Oregon State University) described the modification of a TUV to provide a novel sampling platform for high-resolution mapping of chemical distributions in near real time. Dr. Sonia Batten (Sir Alister Hardy Foundation for Ocean Science) provided an overview of the deployment of specialized towed vehicles equipped with rugged continuous plankton recorders on ships of opportunity to obtain long-term, basin-wide surveys of zooplankton community structure, enhancing our understanding of trends in secondary production in the upper ocean.

Relevance: 60.00%

Abstract:

Book DOI: http://dx.doi.org/10.5772/1399

Relevance: 60.00%

Abstract:

The dissertation is concerned with the mathematical study of various network problems. First, three real-world networks are considered: (i) the human brain network, (ii) communication networks, and (iii) electric power networks. Although these networks perform very different tasks, they share similar mathematical foundations. The high-level goal is to analyze and/or synthesize each of these systems from a “control and optimization” point of view. After studying these three real-world networks, two abstract network problems, both motivated by power systems, are also explored: “flow optimization over a flow network” and “nonlinear optimization over a generalized weighted graph”. The results derived in this dissertation are summarized below.

Brain Networks: Neuroimaging data reveals the coordinated activity of spatially distinct brain regions, which may be represented mathematically as a network of nodes (brain regions) and links (interdependencies). To obtain the brain connectivity network, the graphs associated with the correlation matrix and the inverse covariance matrix, which describe marginal and conditional dependencies between brain regions respectively, have been proposed in the literature. A question arises as to whether any of these graphs provides useful information about brain connectivity. Due to the electrical properties of the brain, this problem will be investigated in the context of electrical circuits. First, we consider an electric circuit model and show that the inverse covariance matrix of the node voltages reveals the topology of the circuit. Second, we study the problem of finding the topology of the circuit from measurements alone. In this case, by assuming that the circuit is hidden inside a black box and only the nodal signals are available for measurement, the aim is to find the topology of the circuit when only a limited number of samples is available. For this purpose, we deploy the graphical lasso technique to estimate a sparse inverse covariance matrix. It is shown that the graphical lasso may recover most of the circuit topology if the exact covariance matrix is well-conditioned; however, it may fail to work well when this matrix is ill-conditioned. To deal with ill-conditioned matrices, we propose a small modification to the graphical lasso algorithm and demonstrate its performance. Finally, the technique developed in this work will be applied to the resting-state fMRI data of a number of healthy subjects.
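A minimal sketch of the estimation step with scikit-learn's graphical lasso; the data and regularization strength are placeholders, and this is the vanilla algorithm, not the modified variant proposed in the thesis:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))        # placeholder nodal measurements

# Sparse inverse covariance estimate; nonzero off-diagonal entries are
# the inferred conditional dependencies, i.e. candidate circuit edges.
model = GraphicalLasso(alpha=0.05).fit(X)
edges = np.argwhere(np.abs(np.triu(model.precision_, k=1)) > 1e-6)
```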

Communication Networks: Congestion control techniques aim to adjust the transmission rates of competing users of the Internet so that network resources are shared efficiently. Despite the progress in the analysis and synthesis of Internet congestion control, almost all existing fluid models of congestion control assume that every link in the path of a flow observes the original source rate. To address this issue, a more accurate model is derived in this work for the behavior of the network under an arbitrary congestion controller, which takes into account the effect of buffering (queueing) on data flows. Using this model, it is proved that the well-known Internet congestion control algorithms may no longer be stable for the common pricing schemes unless a sufficient condition is satisfied. It is also shown that these algorithms are guaranteed to be stable if a new pricing mechanism is used.
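For context, a toy single-link primal fluid model in the classic Kelly style, which is the kind of model the paragraph says usually ignores buffering; it is not the buffering-aware model derived in the thesis, and the price function is an invented example:

```python
def primal_rate_update(x0, w, k=0.05, steps=5000):
    # dx/dt = k * (w - x * p(x)): the source raises its rate in proportion
    # to its willingness-to-pay w and backs off against the link price p.
    price = lambda y: max(y - 1.0, 0.0)   # toy price, active above capacity 1
    x = float(x0)
    for _ in range(steps):
        x = max(x + k * (w - x * price(x)), 1e-9)
    return x   # the equilibrium rate satisfies x * p(x) = w
```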

Electrical Power Networks: Optimal power flow (OPF) has been one of the most studied problems for power systems since its introduction by Carpentier in 1962. This problem is concerned with finding an optimal operating point of a power network minimizing the total power generation cost subject to network and physical constraints. It is well known that OPF is computationally hard to solve due to the nonlinear interrelation among the optimization variables. The objective is to identify a large class of networks over which every OPF problem can be solved in polynomial time. To this end, a convex relaxation is proposed, which solves the OPF problem exactly for every radial network and every meshed network with a sufficient number of phase shifters, provided power over-delivery is allowed. The concept of “power over-delivery” is equivalent to relaxing the power balance equations to inequality constraints.
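One way to write the “power over-delivery” relaxation at a bus i; the sign conventions here are our assumption, not the thesis's notation:

```latex
\text{exact balance:}\quad P_{G_i} - P_{D_i} \;=\; \sum_{j \in \mathcal{N}(i)} P_{ij}(V,\theta)
\qquad\longrightarrow\qquad
\text{relaxed:}\quad P_{G_i} - P_{D_i} \;\ge\; \sum_{j \in \mathcal{N}(i)} P_{ij}(V,\theta)
```

Under the relaxed constraint each load may receive at least its nominal demand, which is the extra slack that makes the convex relaxation exact on the stated network classes.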

Flow Networks: In this part of the dissertation, the minimum-cost flow problem over an arbitrary flow network is considered. In this problem, each node is associated with some possibly unknown injection, each line has two unknown flows at its ends related to each other via a nonlinear function, and all injections and flows need to satisfy certain box constraints. This problem, named generalized network flow (GNF), is highly non-convex due to its nonlinear equality constraints. Under the assumption of monotonicity and convexity of the flow and cost functions, a convex relaxation is proposed, which always finds the optimal injections. A primary application of this work is in the OPF problem. The results of this work on GNF prove that the relaxation on power balance equations (i.e., load over-delivery) is not needed in practice under a very mild angle assumption.
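In symbols, the GNF problem described above can be sketched as follows (the notation is ours):

```latex
\begin{aligned}
\min_{p}\;\; & \textstyle\sum_i f_i(p_i) \\
\text{s.t.}\;\; & p_i = \textstyle\sum_{j \in \mathcal{N}(i)} p_{ij} && \forall i, \\
& p_{ji} = \phi_{ij}(p_{ij}) && \forall (i,j), \\
& p_i \in [\underline{p}_i, \overline{p}_i], \;\; p_{ij} \in [\underline{p}_{ij}, \overline{p}_{ij}],
\end{aligned}
```

where each line's end flows are coupled by the nonlinear function phi_ij; monotonicity of the flow functions and convexity of the cost functions are the stated assumptions under which the convex relaxation recovers the optimal injections.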

Generalized Weighted Graphs: Motivated by power optimizations, this part aims to find a global optimization technique for a nonlinear optimization defined over a generalized weighted graph. Every edge of this type of graph is associated with a weight set corresponding to the known parameters of the optimization (e.g., the coefficients). The motivation behind this problem is to investigate how the (hidden) structure of a given real- or complex-valued optimization makes the problem easy to solve, and indeed the generalized weighted graph is introduced to capture the structure of an optimization. Various sufficient conditions are derived which relate the polynomial-time solvability of different classes of optimization problems to weak properties of the generalized weighted graph, such as its topology and the sign definiteness of its weight sets. As an application, it is proved that a broad class of real and complex optimizations over power networks are polynomial-time solvable due to the passivity of transmission lines and transformers.