898 results for Exponential Random Graph Model


Relevance:

30.00%

Publisher:

Abstract:

The transducer function mu for contrast perception describes the nonlinear mapping of stimulus contrast onto an internal response. Under a signal detection theory approach, the transducer model of contrast perception states that the internal response elicited by a stimulus of contrast c is a random variable with mean mu(c). Using this approach, we derive the formal relations between the transducer function, the threshold-versus-contrast (TvC) function, and the psychometric functions for contrast detection and discrimination in 2AFC tasks. We show that the mathematical form of the TvC function is determined only by mu, and that the psychometric functions for detection and discrimination have a common mathematical form with common parameters emanating from, and only from, the transducer function mu and the form of the distribution of the internal responses. We discuss the theoretical and practical implications of these relations, which have bearings on the tenability of certain mathematical forms for the psychometric function and on the suitability of empirical approaches to model validation. We also present the results of a comprehensive test of these relations using two alternative forms of the transducer model: a three-parameter version that renders logistic psychometric functions and a five-parameter version using Foley's variant of the Naka-Rushton equation as transducer function. Our results support the validity of the formal relations implied by the general transducer model, and the two versions that were contrasted account for our data equally well.
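
A minimal numerical sketch of these relations (illustrative only, not the paper's fitted parameters): assuming unit-variance Gaussian internal responses and a Naka-Rushton-style transducer, the 2AFC proportion correct for detection and discrimination follows directly from mu:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mu(c, rmax=30.0, n=2.0, c50=0.1):
    """Naka-Rushton-style transducer; parameter values are made up."""
    return rmax * c**n / (c**n + c50**n)

def p_correct_2afc(pedestal, increment):
    """Equal-variance Gaussian SDT: d' = mu(c+dc) - mu(c),
    P(correct) = Phi(d' / sqrt(2))."""
    dprime = mu(pedestal + increment) - mu(pedestal)
    return phi(dprime / sqrt(2.0))

# detection (zero pedestal) vs. discrimination (pedestal 0.1)
for dc in (0.02, 0.05, 0.1):
    print(dc, p_correct_2afc(0.0, dc), p_correct_2afc(0.1, dc))
```

Sweeping the increment at a fixed pedestal traces out a psychometric function, and inverting it at a criterion level of performance yields the TvC function, which is why both are determined by mu alone once the noise distribution is fixed.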

Relevance:

30.00%

Publisher:

Abstract:

We study theoretically the effect of a new type of blocklike positional disorder on the effective electromagnetic properties of one-dimensional chains of resonant, high-permittivity dielectric particles, in which particles are arranged into perfectly ordered blocks whose relative position is a random variable. This creates a finite order-correlation length that mimics the situation encountered in metamaterials fabricated through self-assembly techniques, whose structures often display short-range order between near neighbors but long-range disorder due to stacking defects. Using a spectral theory approach combined with a principal component statistical analysis, we study, in the long-wavelength regime, the evolution of the electromagnetic response as the composite filling fraction and the block size are changed. Modifications in key features of the resonant response (amplitude, width, etc.) are investigated, showing a regime transition at a filling fraction around 50%.
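
The structural model itself is easy to reproduce; the sketch below (hypothetical parameters) generates a chain of perfectly ordered blocks separated by random gaps, giving the short-range order / long-range disorder combination described above:

```python
import numpy as np

rng = np.random.default_rng(0)

def block_disordered_chain(n_blocks=20, particles_per_block=8,
                           spacing=1.0, jitter=0.5):
    """Particle positions: perfect lattice inside each block,
    random relative block positions (all parameters illustrative)."""
    positions, x0 = [], 0.0
    for _ in range(n_blocks):
        block = x0 + spacing * np.arange(particles_per_block)
        positions.append(block)
        # next block starts after a random gap -> long-range disorder
        x0 = block[-1] + spacing * (1.0 + jitter * rng.random())
    return np.concatenate(positions)

gaps = np.diff(block_disordered_chain())
# one repeated intra-block spacing, plus one random gap per block boundary
print(np.unique(np.round(gaps, 3)))
```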

Relevance:

30.00%

Publisher:

Abstract:

A main unsolved problem in the RNA world scenario for the origin of life is how a template-dependent RNA polymerase ribozyme emerged from the short RNA oligomers generated by random polymerization of ribonucleotides (Joyce and Orgel 2006). Current estimates establish a minimum size of about 165 nt for such a ribozyme (Johnston et al. 2001), a length three to four times that of the longest RNA oligomers obtained by random polymerization on clay mineral surfaces (Huang and Ferris 2003, 2006). To overcome this gap, we have developed a stepwise model of ligation-based, modular evolution of RNA (Briones et al. 2009), whose main conceptual steps are summarized in Figure 1. This scenario has two main advantages with respect to previous hypotheses put forward for the origin of the RNA world: i) short RNA....

Relevance:

30.00%

Publisher:

Abstract:

Objective
Pedestrian detection under video surveillance systems has long been a hot topic in computer vision research. These systems are widely used in train stations, airports, large commercial plazas, and other public places. However, pedestrian detection remains difficult because of complex backgrounds. In recent years, the visual attention mechanism has attracted increasing interest in object detection and tracking research, and previous studies have achieved substantial progress and breakthroughs. We propose a novel pedestrian detection method based on semantic features under the visual attention mechanism.
Method
The proposed semantic feature-based visual attention model is a spatial-temporal model that consists of two parts: a static visual attention model and a motion visual attention model. The static visual attention model in the spatial domain is constructed by combining bottom-up with top-down attention guidance. Based on the characteristics of pedestrians, the bottom-up visual attention model of Itti is improved by intensifying the orientation vectors of elementary visual features to make the visual saliency map suitable for pedestrian detection. In terms of pedestrian attributes, skin color is selected as a semantic feature for pedestrian detection. Regional and Gaussian models are adopted to construct the skin color model. Skin feature-based visual attention guidance is then proposed to complete the top-down process. The bottom-up and top-down visual attentions are linearly combined, using weights obtained from experiments, to construct the static visual attention model in the spatial domain (this combination step is sketched after the abstract). The spatial-temporal visual attention model is then constructed via motion features in the temporal domain. Building on the static visual attention model in the spatial domain, the frame difference method is combined with optical flow to detect motion vectors, and filtering is applied to the resulting field of motion vectors. The saliency of motion vectors is evaluated via motion entropy to make the selected motion feature more suitable for the spatial-temporal visual attention model.
Result
Standard datasets and practical videos are selected for the experiments. The experiments are performed on a MATLAB R2012a platform. The experimental results show that our spatial-temporal visual attention model demonstrates favorable robustness under various scenes, including indoor train station surveillance videos and outdoor scenes with swaying leaves. Our proposed model outperforms the visual attention model of Itti, the graph-based visual saliency model, the phase spectrum of quaternion Fourier transform model, and the motion channel model of Liu in terms of pedestrian detection. The proposed model achieves a 93% accuracy rate on the test video.
Conclusion
This paper proposes a novel pedestrian detection method based on the visual attention mechanism. A spatial-temporal visual attention model that uses both low-level and semantic features is proposed to calculate the saliency map. Based on this model, pedestrian targets can be detected through shifts in the focus of attention. The experimental results verify the effectiveness of the proposed attention model for detecting pedestrians.
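
As a rough sketch of the linear combination step described in the Method section (the weights and the feature maps here are placeholders, not the experimentally tuned values from the paper):

```python
import numpy as np

def normalize(m):
    """Scale a saliency map to [0, 1]."""
    m = m.astype(float)
    span = m.max() - m.min()
    return (m - m.min()) / span if span > 0 else np.zeros_like(m)

def combined_saliency(bottom_up, skin_map, motion_map,
                      w_bu=0.4, w_td=0.3, w_mo=0.3):
    """Linear spatial-temporal combination of the three channels."""
    return (w_bu * normalize(bottom_up)
            + w_td * normalize(skin_map)
            + w_mo * normalize(motion_map))

# random maps stand in for the real bottom-up, skin-color and motion channels
rng = np.random.default_rng(1)
h, w = 120, 160
s = combined_saliency(rng.random((h, w)), rng.random((h, w)),
                      rng.random((h, w)))
print(s.shape, float(s.min()), float(s.max()))
```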

Relevance:

30.00%

Publisher:

Abstract:

The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets including Worldview-2 bands, band difference ratios (BDR) and topographical derivatives. Principal components analysis is further used to test and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested against sites identified through geological field survey. Testing shows the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
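
A compact sketch of the PCA-then-LDA posterior probability workflow (synthetic feature vectors standing in for the Worldview-2 bands, band difference ratios and topographic derivatives):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X_sites = rng.normal(1.0, 1.0, size=(40, 12))     # known workshop samples
X_nonsites = rng.normal(0.0, 1.0, size=(60, 12))  # known non-sites
X = np.vstack([X_sites, X_nonsites])
y = np.array([1] * 40 + [0] * 60)

# PCA removes redundancy between correlated layers, LDA classifies,
# and predict_proba yields the posterior probability surface
model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
model.fit(X, y)
new_pixels = rng.normal(0.5, 1.0, size=(5, 12))
print(model.predict_proba(new_pixels)[:, 1])  # P(site) per pixel
```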

Relevance:

30.00%

Publisher:

Abstract:

Costs related to inventory usually account for a significant share of a company's total assets. Despite this, companies in general pay little attention to inventory management, even though its benefits are obvious: less tied-up capital, increased customer satisfaction, and a better working environment. Permobil AB, Timrå is in an intense period of revenue growth, and the production unit is aiming to increase output by 30% over the next two years. To make this possible, the company has to improve the way it distributes and handles material. The purpose of this study is to provide useful information and concrete proposals for action, so that the company can build a strategy for an effective and sustainable inventory management solution. Alternative methods for making forecasts are suggested in order to reach a more nuanced perception of different articles and how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected persons decide the criteria by which an article should be valued; the criteria they agreed upon were annual volume value, lead time, frequency rate, and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency determine the class in which an article is placed. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecast methods were tested and compared with the current solution. It turned out that the current forecast method performed worse than both moving average and exponential smoothing with trend. The small sample of ten randomly chosen articles is not large enough to reject the current solution, but the result is still reason enough for the company to control the quality of its forecasts.
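
For reference, the two forecast methods that outperformed the current solution are short enough to sketch directly (made-up demand series, illustrative smoothing constants):

```python
def moving_average_forecast(series, window=3):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(series[-window:]) / window

def holt_forecast(series, alpha=0.3, beta=0.1):
    """Exponential smoothing with trend (Holt's method), one step ahead."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend

demand = [12, 14, 13, 17, 18, 21, 22, 25]  # hypothetical monthly demand
print(moving_average_forecast(demand), holt_forecast(demand))
```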

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

Even though a large amount of evidence suggests that the PP2A serine/threonine protein phosphatase acts as a tumour suppressor, the genomics data to support this claim are limited. We fit a sparse binary Markov random field, with each sample's total mutational frequency as an additional covariate, to model the dependencies between mutations occurring in the PP2A-encoding genes. We utilize data from recent large-scale cancer genomics studies in which the whole genome from human tumour biopsies has been analysed. Our results show a complex network of interactions between the occurrence of mutations in the twenty examined genes. According to our analysis, the mutations occurring in the genes PPP2R1A, PPP2R3A, and PPP2R2B are identified as the key mutations; these genes form the core of the network of conditional dependencies between mutations in the investigated genes. Additionally, we note that mutations occurring in PPP2R4 appear to be more influential in samples with a higher total number of mutations. Mutations in the set of genes suggested by our results have been shown to contribute to the transformation of human cells. We conclude that our evidence further supports the claim that PP2A acts as a tumour suppressor and that restoring PP2A activity is an appealing therapeutic strategy.
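
One standard way to fit such a model is pseudolikelihood (neighborhood selection): regress each gene's mutation indicator on all the others plus the covariate, with an L1 penalty. The sketch below uses toy data and only approximates the paper's fitting procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_genes = 500, 20
X = (rng.random((n_samples, n_genes)) < 0.1).astype(int)  # toy 0/1 mutation matrix
tmf = X.sum(axis=1, keepdims=True)  # total mutational frequency covariate

edges = {}
for g in range(n_genes):
    others = np.delete(np.arange(n_genes), g)
    Z = np.hstack([X[:, others], tmf])
    # L1-penalized logistic regression of gene g on the rest + covariate
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(Z, X[:, g])
    w = clf.coef_[0][:-1]  # drop the covariate weight
    edges[g] = [int(others[j]) for j in np.nonzero(w)[0]]
print(edges)  # nonzero weights suggest conditional dependencies
```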

Relevance:

30.00%

Publisher:

Abstract:

This paper provides an agent-based software exploration of the well-known free-market efficiency/equality trade-off. Our study simulates the interaction of agents producing, trading and consuming goods in the presence of different market structures, and looks at how efficient the producers/consumers mapping turns out to be, as well as at the resulting distribution of welfare among agents at the end of an arbitrarily large number of iterations. Two market mechanisms are compared: the competitive market (a double auction in which agents outbid each other in order to buy and sell products) and the random one (in which products are allocated randomly). Our results confirm that the superior efficiency of the competitive market (an effective, never-stalling producers/consumers mapping and superior aggregate welfare) comes at a very high price in terms of inequality, above all when severe budget constraints are in play.
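
A stylized, much-simplified version of the comparison (random valuations and costs; not the paper's full agent-based model) already illustrates the efficiency gap between the two allocation mechanisms:

```python
import random
random.seed(1)

def gini(xs):
    """Gini coefficient of a welfare distribution."""
    xs = sorted(xs)
    n, s = len(xs), sum(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * s) - (n + 1) / n if s > 0 else 0.0

buyers = [random.uniform(0, 1) for _ in range(100)]   # valuations
sellers = [random.uniform(0, 1) for _ in range(100)]  # production costs

def surplus(pairs):
    """Midpoint-price trades; returns individual welfare gains."""
    w = []
    for v, c in pairs:
        if v >= c:
            p = (v + c) / 2
            w += [v - p, p - c]
    return w

# competitive matching: best bids meet best asks; random: blind pairing
competitive = surplus(zip(sorted(buyers, reverse=True), sorted(sellers)))
randomized = surplus(zip(buyers, sellers))
print("competitive: %.1f (gini %.2f)" % (sum(competitive), gini(competitive)))
print("random:      %.1f (gini %.2f)" % (sum(randomized), gini(randomized)))
```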

Relevance:

30.00%

Publisher:

Abstract:

The recently reported Monte Carlo Random Path Sampling (RPS) method is here improved and its application expanded to the study of the 2D and 3D Ising and discrete Heisenberg models. The methodology was implemented for both CPU-based high-performance computing infrastructures (C/MPI) and GPU-based (CUDA) parallel computation, with significant computational performance gains. Convergence is discussed, both in terms of the free energy and of the magnetization dependence on field and temperature. From the calculated magnetization-energy joint density of states, fast calculations of field- and temperature-dependent thermodynamic properties are performed, including the effects of anisotropy on coercivity and the magnetocaloric effect. The emergence of first-order magneto-volume transitions in the compressible Ising model is interpreted using the Landau theory of phase transitions. Using metallic gadolinium as a real-world example, the possibility of using RPS as a tool for computational magnetic materials design is discussed. Experimental magnetic and structural properties of a gadolinium single crystal are compared to RPS-based calculations using microscopic parameters obtained from density functional theory.
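
RPS itself is a specific random-path sampling scheme, but the downstream idea, computing field- and temperature-dependent properties directly from the joint magnetization-energy density of states, can be illustrated by brute-force enumeration on a lattice small enough to enumerate exactly:

```python
import itertools
import numpy as np

L = 4  # 2**(L*L) = 65536 states: exact enumeration is feasible
dos = {}  # joint magnetization-energy density of states g(E, M)
for bits in itertools.product((-1, 1), repeat=L * L):
    s = np.array(bits).reshape(L, L)
    # nearest-neighbour Ising energy, periodic boundaries, J = 1
    E = -int((s * np.roll(s, 1, axis=0)).sum()
             + (s * np.roll(s, 1, axis=1)).sum())
    M = int(s.sum())
    dos[(E, M)] = dos.get((E, M), 0) + 1

def mean_abs_magnetization(T, h=0.0):
    """<|M|>/N at temperature T and field h, straight from g(E, M)."""
    Z, m = 0.0, 0.0
    for (E, M), g in dos.items():
        w = g * np.exp(-(E - h * M) / T)
        Z += w
        m += abs(M) * w
    return m / (Z * L * L)

print(mean_abs_magnetization(1.0), mean_abs_magnetization(5.0))
```

Once g(E, M) is tabulated, sweeping field and temperature costs only a reweighting of the stored histogram, which is what makes the density-of-states route fast for thermodynamic properties.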

Relevance:

30.00%

Publisher:

Abstract:

In this work, the Kuramoto model is studied on a complete graph and on scale-free networks with a degree distribution P(q) ~ q^(-γ), in the presence of random fields of constant and of Gaussian-distributed magnitude. To this end, the Ott-Antonsen method and an "annealed network" approximation were used. Continuous phase transitions were found on a complete graph in the presence of Gaussian random fields, and on scale-free networks with 2 < γ < 5 in the presence of both types of random field. For random fields of constant magnitude on a complete graph and on scale-free networks with γ > 5, continuous (h < √2) and discontinuous (h > √2) phase transitions were found. For a scale-free network with γ = 3, an infinite-order phase transition was observed. The results for the Kuramoto model on a complete graph in the presence of random fields of constant magnitude were compared with simulations, showing good agreement. Regardless of the network topology, the critical coupling constant increases with the magnitude of the field. For the scale-free topology, the critical coupling decreases as γ decreases, and the degree of synchronization increases with the mean number of connections in the network. The presence of Gaussian-magnitude random fields on a complete graph and on scale-free networks with γ > 2 neither destroys the continuous phase transition nor changes the critical behavior of the Kuramoto model.
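
A minimal simulation of the complete-graph case with constant-magnitude random fields (the sinusoidal field coupling h·sin(ψ_i − θ_i) is the standard choice in the random-field Kuramoto literature and is assumed here):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, h, dt, steps = 2000, 3.0, 1.0, 0.02, 3000

omega = rng.standard_cauchy(N)        # natural frequencies
psi = rng.uniform(0, 2 * np.pi, N)    # random field directions
theta = rng.uniform(0, 2 * np.pi, N)  # initial phases

for _ in range(steps):
    z = np.exp(1j * theta).mean()     # complex order parameter r e^{i phi}
    # complete-graph coupling in mean-field form + constant-magnitude field
    theta += dt * (omega
                   + K * np.abs(z) * np.sin(np.angle(z) - theta)
                   + h * np.sin(psi - theta))

print("degree of synchronization r =", abs(np.exp(1j * theta).mean()))
```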

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the hybridization of two graph coloring heuristics (Saturation Degree and Largest Degree) and their application within a hyper-heuristic for exam timetabling problems. Hyper-heuristics can be seen as algorithms which intelligently select appropriate algorithms/heuristics for solving a problem. We developed a Tabu Search based hyper-heuristic to search for heuristic lists (of graph heuristics) for solving problems, and investigated the heuristic lists found by employing knowledge discovery techniques. Two hybrid approaches (involving Saturation Degree and Largest Degree), including one which employs Case Based Reasoning, are presented and discussed. Both the Tabu Search based hyper-heuristic and the hybrid approaches are tested on random and real-world exam timetabling problems. Experimental results are comparable with the best state-of-the-art approaches (as measured against established benchmark problems). The results also demonstrate an increased level of generality in our approach.
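
For concreteness, the two low-level heuristics being hybridized are vertex orderings inside a greedy coloring; a minimal sketch (toy conflict graph, hypothetical helper names):

```python
def color_graph(adj, priority):
    """Greedy coloring; `priority` picks the next vertex to color."""
    colors = {}
    while len(colors) < len(adj):
        uncolored = [v for v in adj if v not in colors]
        v = max(uncolored, key=lambda u: priority(u, colors, adj))
        used = {colors[u] for u in adj[v] if u in colors}
        colors[v] = next(c for c in range(len(adj)) if c not in used)
    return colors

def largest_degree(v, colors, adj):
    return len(adj[v])

def saturation_degree(v, colors, adj):
    # number of distinct colors already used by neighbours
    return len({colors[u] for u in adj[v] if u in colors})

# toy conflict graph: vertices are exams, edges mean shared students
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2, 4}, 4: {3}}
print(color_graph(adj, saturation_degree), color_graph(adj, largest_degree))
```

A heuristic list in the hyper-heuristic sense is then a sequence choosing which of these orderings to apply at each step, with Tabu Search exploring the space of such lists.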

Relevance:

30.00%

Publisher:

Abstract:

Edge-labeled graphs have proliferated rapidly over the last decade due to the increased popularity of social networks and the Semantic Web. In social networks, relationships between people are represented by edges, and each edge is labeled with a semantic annotation. Hence, a huge single graph can express many different relationships between entities. The Semantic Web represents each single fragment of knowledge as a triple (subject, predicate, object), which is conceptually identical to an edge from subject to object labeled with the predicate. A set of triples constitutes an edge-labeled graph on which knowledge inference is performed. Subgraph matching has been extensively used as a query language for patterns in the context of edge-labeled graphs. For example, in social networks, users can specify a subgraph matching query to find all people that have certain neighborhood relationships. Heavily used fragments of the SPARQL query language for the Semantic Web, and the graph queries of other graph DBMSs, can also be viewed as subgraph matching over large graphs. Though subgraph matching has been extensively studied as a query paradigm in the Semantic Web and in social networks, a user can get a large number of answers in response to a query. These answers can be shown to the user in accordance with an importance ranking. In this thesis proposal, we present four different scoring models along with scalable algorithms to find the top-k answers via a suite of intelligent pruning techniques. The suggested models cover a practically important subset of the SPARQL query language augmented with some additional useful features. The first model, called Substitution Importance Query (SIQ), identifies the top-k answers whose scores are calculated from the matched vertices' properties in each answer, in accordance with a user-specified notion of importance. The second model, called Vertex Importance Query (VIQ), identifies important vertices in accordance with a user-defined scoring method that builds on top of various subgraphs articulated by the user. Approximate Importance Query (AIQ), our third model, allows partial and inexact matchings and returns the top-k of them according to user-specified approximation terms and scoring functions. In the fourth model, called Probabilistic Importance Query (PIQ), a query consists of several sub-blocks: one mandatory block that must be mapped and other blocks that can be opportunistically mapped. The probability is calculated from various aspects of the answers, such as the number of mapped blocks and the vertices' properties in each block, and the top-k most probable answers are returned. An important distinguishing feature of our work is that we allow the user a huge amount of freedom in specifying: (i) what pattern and approximation he considers important, (ii) how to score answers, irrespective of whether they are vertices or substitutions, and (iii) how to combine and aggregate scores generated by multiple patterns and/or multiple substitutions. Because so much power is given to the user, indexing is more challenging than in situations where additional restrictions are imposed on the queries the user can ask. The proposed algorithms for the first model can also be used for answering SPARQL queries with ORDER BY and LIMIT, and the method for the second model also works for SPARQL queries with GROUP BY, ORDER BY and LIMIT. We test our algorithms on multiple real-world graph databases, showing that they are far more efficient than popular triple stores.
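
A toy version of scored top-k subgraph matching on an edge-labeled graph (networkx matches exhaustively here, unlike the pruned algorithms proposed in the thesis; the names and the scoring function are invented for illustration):

```python
import heapq
import networkx as nx
from networkx.algorithms import isomorphism as iso

G = nx.Graph()  # toy edge-labeled social graph
G.add_edge("alice", "bob", label="friend")
G.add_edge("alice", "carol", label="friend")
G.add_edge("carol", "dave", label="friend")
G.add_edge("bob", "carol", label="colleague")
followers = {"alice": 120, "bob": 40, "carol": 300, "dave": 10}

Q = nx.Graph()  # query pattern: two people linked by a "friend" edge
Q.add_edge("x", "y", label="friend")

gm = iso.GraphMatcher(G, Q,
                      edge_match=iso.categorical_edge_match("label", None))

def score(mapping):
    """User-defined importance: total followers of the matched vertices."""
    return sum(followers[v] for v in mapping)

# rank all matches by the user-defined score and keep the top-k
for m in heapq.nlargest(2, gm.subgraph_isomorphisms_iter(), key=score):
    print(score(m), m)
```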

Relevance:

30.00%

Publisher:

Abstract:

We study spatially localized states of a spiking neuronal network populated by a pulse-coupled phase oscillator known as the lighthouse model. We show that, in the limit of slow synaptic interactions, the continuum dynamics reduce to those of the standard Amari model. For non-slow synaptic connections we are able to go beyond the standard firing rate analysis of localized solutions, allowing us to explicitly construct a family of co-existing one-bump solutions and then track bump width and firing pattern as a function of system parameters. We also present an analysis of the model on a discrete lattice. We show that bump states of multiple widths can co-exist, and we uncover a mechanism for bump wandering linked to the speed of synaptic processing. Moreover, beyond a wandering transition point, we show that the bump undergoes an effective random walk with a diffusion coefficient that scales exponentially with the rate of synaptic processing and linearly with the lattice spacing.
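
The Amari reduction mentioned above is easy to simulate; below is a sketch of a one-bump solution in a 1D field with a Mexican-hat kernel and Heaviside firing rate (all parameters illustrative, not taken from the paper):

```python
import numpy as np

n, L = 256, 20.0                         # periodic domain [-L/2, L/2)
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = L / n

def w(d):
    """Mexican-hat kernel: local excitation, broader inhibition."""
    return 1.5 * np.exp(-d**2) - 0.5 * np.exp(-d**2 / 9)

kernel = np.fft.ifftshift(w(x))          # zero distance at index 0
f = lambda u: (u > 0.3).astype(float)    # Heaviside firing rate

u, dt = np.exp(-x**2), 0.05              # localized initial condition
for _ in range(2000):
    # du/dt = -u + w * f(u), convolution done circularly via FFT
    conv = dx * np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(f(u))))
    u += dt * (-u + conv)

print("bump width:", dx * f(u).sum())
```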

Relevance:

30.00%

Publisher:

Abstract:

Despite record-setting performance demonstrated by superconducting Transition Edge Sensors (TESs) and growing utilization of the technology, a theoretical model of the physics governing the superconducting phase transition of TES devices has proven elusive. Earlier attempts to describe TESs assumed them to be uniform superconductors. Sadleir et al. (2010) showed that TESs are weak links and that the superconducting order parameter strength has significant spatial variation. Measurements are presented of the temperature T and magnetic field B dependence of the critical current Ic, spanning 7 orders of magnitude, on square Mo/Au bilayers ranging in length from 8 to 290 microns. We find that our measurements have a natural explanation in terms of a spatially varying order parameter that is enhanced in proximity to the higher-transition-temperature superconducting leads (the longitudinal proximity effect) and suppressed in proximity to added normal-metal structures (the lateral inverse proximity effect). These in-plane proximity effects and scaling relations are observed over unprecedentedly long lengths (in excess of 1000 times the mean free path) and are explained in terms of a Ginzburg-Landau model. Our low-temperature Ic(B) measurements agree with a general derivation for a superconducting strip with an edge or geometric barrier to vortex entry, and we also derive two conditions that lead to Ic rectification. At high temperatures, Ic(B) exhibits distinct Josephson-effect behavior over long length scales, following functional dependences not previously reported. We also investigate how film stress changes the transition, explain some transition features in terms of a nonequilibrium superconductivity effect, and show that our measurements of the resistive transition are not consistent with a percolating resistor network model.