893 results for SPARSE


Relevance: 10.00%

Abstract:

Inference and optimisation of real-valued edge variables in sparse graphs are studied using tree-based Bethe-approximation optimisation algorithms. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained for networks in various cases. These include different cost functions, connectivity values, constraints on the edge bandwidth and the case of multiclass optimisation.
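The paper's Bethe-approximation algorithms for continuous edge variables are not reproduced here. As a minimal sketch of the tree-based message passing they build on, the following runs sum-product on a three-node chain with binary variables (all potentials are made-up toy values) and checks the resulting marginal against brute-force enumeration, which message passing matches exactly on a tree.

```python
import itertools
import numpy as np

# Toy factor graph: chain 0 - 1 - 2, binary states (illustrative potentials).
psi0 = np.array([0.6, 0.4])                   # node potentials
psi1 = np.array([0.3, 0.7])
psi2 = np.array([0.5, 0.5])
psi01 = np.array([[0.9, 0.1], [0.2, 0.8]])    # edge potentials
psi12 = np.array([[0.7, 0.3], [0.4, 0.6]])

# Brute-force marginal of node 1 by summing over all configurations.
joint = np.zeros(2)
for x0, x1, x2 in itertools.product(range(2), repeat=3):
    joint[x1] += psi0[x0] * psi1[x1] * psi2[x2] * psi01[x0, x1] * psi12[x1, x2]
joint /= joint.sum()

# Sum-product messages toward node 1 (exact on a tree).
m01 = psi01.T @ psi0     # message from node 0 to node 1
m21 = psi12 @ psi2       # message from node 2 to node 1
bp = psi1 * m01 * m21
bp /= bp.sum()

print(joint, bp)
```

On a tree the two computations agree exactly; the Bethe approximation generalizes this machinery to sparse graphs with cycles.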

Relevance: 10.00%

Abstract:

This thesis examines the UK onshore oil and gas production industry and follows the history of a population of firms over a fifteen-year period following the industry's renaissance. It examines the linkage between firm survival, selection pressures and adaptation responses at the firm level, especially the role of discretionary adaptation, specifically exploration and exploitation strategies. Taking a Realist approach and using quantitative and qualitative methods for triangulation on a new database derived from archival data, as well as informant interviews, it tests seven hypotheses about the post-entry survival of firms. The quantitative findings suggest that firm survival within this industry is linked to discretionary adaptation when measured at the firm level, and to a mixture of selection and adaptation forces when measured for each firm in each individual year. The qualitative research suggests that selection factors dominate. This difference in views is unresolved. However, the small, sparse population and the nature of the oil and gas industry, compared with more common research contexts such as manufacturing or service firms, suggest that the results be treated with caution, as befits a preliminary investigation. The major findings include limited support for the theory that the external environment is the major determinant of firm survival, though environmental components affect firms differentially; resolution of apparent differences in the literature relating to the sequencing of exploration and exploitation; and potential tangible evidence of coevolution. The research also finds that, though selection may be considered important by industry players, discretionary adaptation appears to play the key role, and that the key survival drivers for this population are intra-industry ties, exploitation experience and a learning/experience component.
Selection has a place, however, in determining the life-cycle of the firm, returning as a key survival driver at certain ages of the firm inside the industry boundary.

Relevance: 10.00%

Abstract:

While much has been discussed about the relationship between ownership and the financial performance of banks in emerging markets, the literature on cross-ownership differences in the credit-market behaviour of banks in emerging economies is sparse. Using a portfolio choice model and bank-level data from India for 9 years (1995–96 to 2003–04), we examine banks' behaviour in the context of the credit markets of an emerging market economy. Our results indicate that, in India, the data for the domestic banks fit the aforementioned portfolio-choice model well, especially for private banks, but the model cannot explain the behaviour of foreign banks. In general, the allocation of assets between risk-free government securities and risky credit is affected by past allocation patterns, stock exchange listing (for private banks), the risk averseness of banks, regulations regarding the treatment of non-performing assets (NPA), and the ability of banks to recover doubtful credit. It is also evident that banks deal with changing levels of systematic risk by altering the ratio of securitized to non-securitized credit.
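The specific portfolio-choice model estimated in the paper is not given here; as a hedged illustration of the trade-off it formalizes, the textbook mean-variance allocation between a risk-free asset (government securities) and a risky asset (credit) shows how greater risk aversion shifts a bank's portfolio toward the risk-free asset. All numbers are made up.

```python
def risky_share(mu_risky, r_free, risk_aversion, var_risky):
    """Optimal fraction of the portfolio in the risky asset under
    mean-variance preferences: w* = (mu - r) / (gamma * sigma^2)."""
    return (mu_risky - r_free) / (risk_aversion * var_risky)

# A more risk-averse bank holds a larger share of risk-free securities.
w_low_aversion  = risky_share(0.12, 0.06, 2.0, 0.04)
w_high_aversion = risky_share(0.12, 0.06, 6.0, 0.04)
print(w_low_aversion, w_high_aversion)
```

Under these illustrative numbers the risky share falls from 0.75 to 0.25 as risk aversion triples, consistent with the qualitative finding that risk averseness affects the securities/credit split.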

Relevance: 10.00%

Abstract:

A property of sparse representations in relation to their capacity for information storage is discussed. It is shown that this feature can be used for an application that we term Encrypted Image Folding. The proposed procedure is realizable through any suitable transformation. In particular, in this paper we illustrate the approach by recourse to the Discrete Cosine Transform and a combination of redundant Cosine and Dirac dictionaries. The main advantage of the proposed technique is that both storage and encryption can be achieved simultaneously using simple processing steps.
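The Encrypted Image Folding procedure itself is not reproduced here; the sketch below only illustrates the property it relies on, namely that smooth signals have sparse representations under the Discrete Cosine Transform. It builds an orthonormal DCT-II matrix by hand (a 1-D stand-in for image blocks) and shows that a signal built from two basis vectors is captured by exactly two significant coefficients.

```python
import numpy as np

N = 64
k = np.arange(N)[:, None]
i = np.arange(N)[None, :]
D = np.sqrt(2.0 / N) * np.cos(np.pi * k * (i + 0.5) / N)  # DCT-II rows
D[0, :] /= np.sqrt(2.0)                                    # orthonormalize row 0

# Smooth signal: a combination of two low-frequency DCT basis vectors.
x = 3.0 * D[3] + 1.5 * D[7]
c = D @ x                          # transform coefficients

support = np.abs(c) > 1e-8         # only two significant coefficients survive
x_rec = D.T @ (c * support)        # reconstruct from the sparse support
print(int(support.sum()), np.abs(x - x_rec).max())
```

Because few coefficients carry the information, the unused transform capacity is what the folding procedure exploits for simultaneous storage and encryption.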

Relevance: 10.00%

Abstract:

This thesis considers sparse approximation of still images as the basis of a lossy compression system. The Matching Pursuit (MP) algorithm is presented as a method particularly suited to lossy scalable image coding. Its multichannel extension, capable of exploiting inter-channel correlations, is found to be an efficient way to represent colour data in RGB colour space. Known problems with MP, namely the high computational complexity of encoding and of dictionary design, are tackled by finding an appropriate partitioning of the image. The idea of performing MP in the spatio-frequency domain after a transform such as the Discrete Wavelet Transform (DWT) is explored. The main challenge, though, is to encode the image representation obtained after MP into a bit-stream. Novel approaches for encoding the atomic decomposition of a signal and for quantising colour amplitudes are proposed and evaluated. The image codec that has been built is capable of competing with scalable coders such as JPEG 2000 and SPIHT in terms of compression ratio.
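The basic single-channel Matching Pursuit step can be sketched as follows, assuming a generic dictionary of unit-norm atoms rather than the specific dictionaries used in the thesis: at each iteration, pick the atom most correlated with the residual and subtract its projection.

```python
import numpy as np

def matching_pursuit(x, D, n_iter):
    """Greedy MP over a dictionary D whose columns are unit-norm atoms."""
    residual = x.astype(float).copy()
    coef = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual                 # correlate atoms with residual
        k = int(np.argmax(np.abs(corr)))      # best-matching atom
        coef[k] += corr[k]
        residual -= corr[k] * D[:, k]         # remove its contribution
    return coef, residual

# Overcomplete toy dictionary: identity atoms plus one averaging atom.
I = np.eye(8)
avg = np.array([0, 0, 1, 1, 0, 0, 0, 0]) / np.sqrt(2.0)
D = np.column_stack([I, avg])
x = 2.0 * I[:, 1] - 1.0 * I[:, 5]             # a 2-sparse signal
coef, residual = matching_pursuit(x, D, 10)
print(np.linalg.norm(residual))
```

For this 2-sparse signal MP recovers the representation exactly; on real image data the decomposition is truncated and the chosen atom indices and amplitudes are what the codec must encode into the bit-stream.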

Relevance: 10.00%

Abstract:

Web document cluster analysis plays an important role in information retrieval by organizing large amounts of documents into a small number of meaningful clusters. Traditional web document clustering is based on the Vector Space Model (VSM), which takes into account only two-level (document and term) knowledge granularity but ignores the bridging paragraph granularity. This two-level granularity may lead to unsatisfactory clustering results with "false correlation". To deal with this problem, a Hierarchical Representation Model with Multi-granularity (HRMM), which consists of a five-layer representation of data and a two-phase clustering process, is proposed based on granular computing and article structure theory. To deal with the zero-valued similarity problem resulting from the sparse term-paragraph matrix, an ontology-based strategy and a tolerance-rough-set-based strategy are introduced into HRMM. By using granular computing, structural knowledge hidden in documents can be captured more efficiently and effectively in HRMM, and thus web document clusters of higher quality can be generated. Extensive experiments show that HRMM, HRMM with the tolerance-rough-set strategy, and HRMM with ontology all significantly outperform VSM and a representative non-VSM-based algorithm, WFP, in terms of F-Score.
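HRMM itself is not reproduced here; the sketch below only illustrates the zero-valued similarity problem and a heavily simplified, tolerance-rough-set-style fix. Two documents sharing no terms get similarity zero; extending each document by the tolerance classes of its terms (terms that co-occur somewhere in the collection) makes the hidden relatedness visible. The toy corpus and the use of Jaccard similarity instead of cosine are simplifications for illustration.

```python
# Toy collection (illustrative): each "paragraph" is a set of terms.
corpus = [
    {"car", "engine"},
    {"engine", "motor"},
    {"motor", "vehicle"},
]

def tolerance_class(term):
    """All terms co-occurring with `term` in at least one paragraph."""
    cls = {term}
    for para in corpus:
        if term in para:
            cls |= para
    return cls

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

d1, d2 = {"car", "engine"}, {"motor", "vehicle"}
plain = jaccard(d1, d2)                                   # no shared terms
ext1 = set().union(*(tolerance_class(t) for t in d1))
ext2 = set().union(*(tolerance_class(t) for t in d2))
enriched = jaccard(ext1, ext2)                            # related via "motor"
print(plain, enriched)
```

The plain similarity is 0.0 even though the documents are about the same topic; after the tolerance-class extension it becomes positive, which is the effect the tolerance-rough-set strategy aims for on the sparse term-paragraph matrix.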

Relevance: 10.00%

Abstract:

We investigate the problem of obtaining a dense reconstruction in real time from a live video stream. In recent years, multi-view stereo (MVS) has received considerable attention and a number of methods have been proposed. However, most methods operate under the assumption of a relatively sparse set of still images as input and unlimited computation time. Video-based MVS has received less attention despite the fact that video sequences offer significant benefits in terms of the usability of MVS systems. In this paper we propose a novel video-based MVS algorithm that is suitable for real-time, interactive 3D modeling with a hand-held camera. The key idea is a per-pixel, probabilistic depth estimation scheme that updates posterior depth distributions with every new frame. The current implementation is capable of updating 15 million distributions per second. We evaluate the proposed method against the state-of-the-art real-time MVS method and show improvement in terms of accuracy. © 2011 Elsevier B.V. All rights reserved.
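The paper's parametric posterior model is not reproduced here; a minimal histogram version of per-pixel probabilistic depth estimation conveys the idea. A discrete distribution over depth hypotheses is multiplied by a likelihood (Gaussian inlier plus uniform outlier, with made-up parameters) for each new frame's measurement, and the posterior concentrates around the true depth.

```python
import numpy as np

depths = np.linspace(1.0, 5.0, 81)             # depth hypotheses (metres)
posterior = np.full(depths.size, 1.0 / depths.size)

true_depth, sigma, inlier_ratio = 3.0, 0.1, 0.8
rng = np.random.default_rng(0)
for _ in range(30):                            # one noisy measurement per frame
    z = true_depth + rng.normal(0.0, sigma)
    likelihood = (inlier_ratio * np.exp(-0.5 * ((depths - z) / sigma) ** 2)
                  + (1.0 - inlier_ratio) * 0.25)   # uniform over the 4 m range
    posterior *= likelihood                    # Bayesian update
    posterior /= posterior.sum()

estimate = depths[np.argmax(posterior)]
print(estimate)
```

Running one such update per pixel per frame is what the reported throughput of 15 million distribution updates per second refers to.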

Relevance: 10.00%

Abstract:

The human accommodation system has been extensively examined for over a century, with a particular focus on trying to understand the mechanisms that lead to the loss of accommodative ability with age (presbyopia). The accommodative process, along with the potential causes of presbyopia, is disputed, hindering efforts to develop methods of restoring accommodation in the presbyopic eye. One method that can be used to provide insight into this complex area is Finite Element Analysis (FEA). The effectiveness of FEA in modelling the accommodative process has been illustrated by a number of accommodative FEA models developed to date. However, these previous models have had limitations, principally due to the variation in data on the geometry of the accommodative components, combined with sparse measurements of their material properties. Despite advances in available data, continued oversimplification has occurred in the modelling of the crystalline lens structure and the zonular fibres that surround the lens. A new accommodation model was proposed by the author that aims to eliminate these limitations. A novel representation of the zonular structure was developed, combined with updated lens and capsule modelling methods. The model has been designed to be adaptable so that accommodation systems across a range of ages can be modelled, allowing the age-related changes that occur to be simulated. The new modelling methods were validated by comparing the changes induced within the model to available in vivo data, leading to the definition of three models of different ages. These were used in an extended sensitivity study on age-related changes, in which individual parameters were altered to investigate their effect on the accommodative process. The material properties were found to have the largest impact on the decline in accommodative ability, in particular compared to changes in ciliary body movement or zonular structure.
Novel data on the importance of capsule stiffness and thickness were also established. The new model detailed within this thesis provides further insight into the accommodation mechanism, as well as a foundation for future, more detailed investigations into accommodation, presbyopia and accommodative restoration techniques.

Relevance: 10.00%

Abstract:

Excess calorie consumption is associated with metabolic disorders and increased incidence of morbidity. Restricting calorie content, either by daily calorie restriction or intermittent fasting periods, has multiple benefits including weight loss and improved body composition. Previous research has shown that restricting calories in this way can increase longevity and slow the ageing process in laboratory animals, although only sparse data exist in human populations. This review critically evaluates the benefits of these dietary interventions on age-related decline and longevity.

Relevance: 10.00%

Abstract:

Learning user interests from online social networks helps to better understand user behaviors and provides useful guidance for designing user-centric applications. Apart from analyzing users' online content, it is also important to consider users' social connections in the social Web. Graph regularization methods have been widely used in various text mining tasks, as they can leverage the graph structure information extracted from data. Previous graph regularization methods operate under the cluster assumption: nearby nodes are more similar, and nodes on the same structure (typically referred to as a cluster or a manifold) are likely to be similar. We argue that learning user interests from complex, sparse, and dynamic social networks should instead be based on the link structure assumption, under which node similarities are evaluated from local link structures rather than from explicit links between two nodes. We propose a regularization framework based on the relation bipartite graph, which can be constructed from any type of relation. Using Twitter as our case study, we evaluate the proposed framework on social networks built from retweet relations. Both quantitative and qualitative experiments show that our proposed method outperforms several competitive baselines in learning user interests over a set of predefined topics. It also gives superior results compared to the baselines on retweet prediction and topical authority identification. © 2014 ACM.
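The relation-bipartite-graph framework itself is not reproduced here; the sketch below shows the classic graph-regularization step that such frameworks build on: smooth a sparse vector of observed interest scores over a social graph by minimizing ||f - y||^2 + lam * f'Lf, whose closed-form minimizer is f = (I + lam*L)^(-1) y. The graph and scores are made up.

```python
import numpy as np

# Two loosely connected friend groups (made-up adjacency matrix).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
L = np.diag(A.sum(axis=1)) - A        # unnormalized graph Laplacian

y = np.array([1.0, 0, 0, 0, 0, 0])    # only user 0 has an observed interest
lam = 1.0
# Minimizer of ||f - y||^2 + lam * f.T @ L @ f
f = np.linalg.solve(np.eye(6) + lam * L, y)
print(f)
```

The seed user keeps the highest score, and users in the same group receive more of the propagated interest than users in the other group; the paper's contribution is to build the graph from relation structure (e.g. retweets) rather than raw links.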

Relevance: 10.00%

Abstract:

An effective aperture approach is used as a tool for the analysis and parameter optimization of the most widely known ultrasound imaging systems: phased array systems, compounding systems and synthetic aperture imaging systems. Both characteristics of an imaging system, the effective aperture function and the corresponding two-way radiation pattern, provide information about two of the most important parameters of the images produced by an ultrasound system: lateral resolution and contrast. Therefore, at the design stage, optimization of the effective aperture function leads to an optimal choice of those system parameters that influence the lateral resolution and contrast of the images produced. It is shown that the effective aperture approach can be used for the optimization of a sparse synthetic transmit aperture (STA) imaging system. A new two-stage algorithm is proposed for optimizing both the positions of the transmit elements and the weights of the receive elements. The proposed system employs a 64-element array with only four active elements used during transmit. The numerical results show that Hamming apodization gives the best compromise between image contrast and lateral resolution.
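The two-stage optimization itself is not reproduced here; the sketch below computes the basic quantities named above under the standard narrow-band, far-field assumption: the effective aperture is the convolution of the transmit and receive aperture functions, and the two-way radiation pattern is its Fourier transform. The four transmit element positions and the Hamming receive apodization are illustrative.

```python
import numpy as np

n_elem = 64
tx = np.zeros(n_elem)
tx[[0, 21, 42, 63]] = 1.0          # 4 active transmit elements (illustrative)
rx = np.hamming(n_elem)            # Hamming apodization on receive

effective = np.convolve(tx, rx)    # effective aperture function
# Two-way radiation pattern: magnitude of the FT of the effective aperture.
pattern = np.abs(np.fft.fftshift(np.fft.fft(effective, 1024)))

print(effective.size, int(np.argmax(pattern)))
```

Optimizing the transmit positions and receive weights amounts to shaping `effective` (and hence `pattern`) to trade main-lobe width (lateral resolution) against side-lobe level (contrast).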

Relevance: 10.00%

Abstract:

We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = -0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson-Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. © 2013 The Author(s).
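The clinical analysis is not reproduced here; the following sketch shows the idea behind the Edelson-Krolik method for irregularly sampled series: form unbinned correlations for every pair of samples and average those whose time separation falls in a given lag bin, so no interpolation onto a uniform grid is needed. The series are synthetic, and the measurement-error terms of the full estimator are omitted for simplicity.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 20.0, 120))      # irregular sample times
a = np.sin(t) + 0.1 * rng.normal(size=t.size)
b = np.sin(t) + 0.1 * rng.normal(size=t.size)

def dcf(t, a, b, lag, half_width):
    """Simplified discrete correlation function: mean standardized product
    over pairs whose separation lies within half_width of the target lag."""
    ua = (a - a.mean()) / a.std()
    ub = (b - b.mean()) / b.std()
    dt = t[None, :] - t[:, None]              # dt[i, j] = t_j - t_i
    mask = np.abs(dt - lag) <= half_width
    return (ua[:, None] * ub[None, :])[mask].mean()

print(dcf(t, a, b, 0.0, 0.25), dcf(t, a, b, np.pi, 0.25))
```

For these sinusoidal series the correlation is strongly positive at zero lag and negative near a half-period lag, illustrating how the method recovers correlation structure despite the non-uniform sampling.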

Relevance: 10.00%

Abstract:

Despite the increasing body of evidence supporting the hypothesis of schizophrenia as a disconnection syndrome, studies of resting-state EEG Source Functional Connectivity (EEG-SFC) in people affected by schizophrenia are sparse. The aim of the present study was to investigate resting-state EEG-SFC in 77 stable, medicated patients with schizophrenia (SCZ) compared to 78 healthy volunteers (HV). In order to study the effect of illness duration, SCZ were divided into those with a short duration of disease (SDD; n = 25) and those with a long duration of disease (LDD; n = 52). Resting-state EEG recordings in the eyes-closed condition were analyzed and lagged phase synchronization (LPS) indices were calculated for each ROI pair in the source-space EEG data. In the delta and theta bands, SCZ had greater EEG-SFC than HV; higher theta-band connectivity in frontal regions was observed in LDD compared with SDD. In the alpha band, SCZ showed lower frontal EEG-SFC compared with HV, whereas no differences were found between LDD and SDD. In the beta1 band, SCZ had greater EEG-SFC compared with HV, and in the beta2 band, LDD presented lower frontal and parieto-temporal EEG-SFC compared with HV. In the gamma band, SDD had greater connectivity values compared with LDD and HV. This study suggests that resting-state brain network connectivity is abnormally organized in schizophrenia, with different patterns for the different EEG frequency components, and that EEG can be a powerful tool to further elucidate the complexity of such disordered connectivity.

Relevance: 10.00%

Abstract:

MSC 2010: 05C50, 15A03, 15A06, 65K05, 90C08, 90C35

Relevance: 10.00%

Abstract:

As one of the most popular deep learning models, the convolutional neural network (CNN) has achieved huge success in image information extraction. Traditionally, a CNN is trained by supervised learning with labeled data and used as a classifier by adding a classification layer at the end. Its capability for extracting image features is largely limited by the difficulty of setting up a large training dataset. In this paper, we propose a new unsupervised learning CNN model, which uses a so-called convolutional sparse auto-encoder (CSAE) algorithm to pre-train the CNN. Instead of using labeled natural images for CNN training, the CSAE algorithm can be used to train the CNN with unlabeled artificial images, which enables easy expansion of the training data and unsupervised learning. The CSAE algorithm is especially designed for extracting complex features from specific objects such as Chinese characters. After the features of artificial images are extracted by the CSAE algorithm, the learned parameters are used to initialize the first convolutional layer of the CNN, and the CNN model is then fine-tuned on scene image patches with a linear classifier. The new CNN model is applied to Chinese scene text detection and is evaluated on a multilingual image dataset in which Chinese, English and numeral texts are labeled separately. A detection precision gain of more than 10% is observed over two CNN models.
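The CSAE algorithm itself is not reproduced here; the sketch below shows only the standard sparsity penalty commonly used in sparse auto-encoders, a KL divergence between a small target activation rate and each hidden unit's mean activation. Whether CSAE uses exactly this penalty is an assumption; the numbers are illustrative.

```python
import numpy as np

def kl_sparsity_penalty(rho, rho_hat):
    """Sum over hidden units of KL(rho || rho_hat) for Bernoulli rates:
    grows as a unit's mean activation rho_hat drifts from the target rho."""
    rho_hat = np.clip(rho_hat, 1e-8, 1 - 1e-8)   # numerical safety
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

target = 0.05
on_target = kl_sparsity_penalty(target, np.full(16, 0.05))   # no penalty
too_dense = kl_sparsity_penalty(target, np.full(16, 0.50))   # large penalty
print(on_target, too_dense)
```

Adding this term (weighted) to the reconstruction loss forces most hidden units to stay inactive on any given input, which is what makes the learned features sparse.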