950 results for Probabilistic metrics


Relevance:

10.00%

Publisher:

Abstract:

We develop a full theoretical approach to clustering in complex networks. A key concept is introduced, the edge multiplicity, which measures the number of triangles passing through an edge. This quantity extends the clustering coefficient in that it involves the properties of two vertices, not just one. The formalism is completed with the definition of a three-vertex correlation function, the fundamental quantity describing the properties of clustered networks. The formalism suggests different metrics able to thoroughly characterize transitive relations. A rigorous analysis of several real networks, using this formalism and these metrics, is also provided. We find that clustered networks can be classified into two main groups: the weak and the strong transitivity classes. In the first class, edge multiplicity is small and triangles are disjoint; in the second, edge multiplicity is high and triangles share many edges. As we shall see in the following paper, the class a network belongs to has strong implications for its percolation properties.
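A minimal sketch of how the edge multiplicity described above can be computed for an undirected simple graph; the function name is ours, and networkx is used only as a convenient graph container:

```python
# Edge multiplicity: the number of triangles passing through an edge,
# i.e. the number of common neighbours of its two endpoint vertices.
import networkx as nx

def edge_multiplicity(G):
    """Return {(u, v): number of triangles through edge (u, v)}."""
    return {(u, v): len((set(G[u]) & set(G[v])) - {u, v})
            for u, v in G.edges()}

G = nx.karate_club_graph()
mult = edge_multiplicity(G)
print(max(mult.values()))  # the most "reinforced" edge in the network
```

In the paper's terms, a network whose edges mostly have small multiplicity would fall in the weak transitivity class, and one with many high-multiplicity edges in the strong class.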

Relevance:

10.00%

Publisher:

Abstract:

We study spatio-temporal pattern formation in a ring of N oscillators with inhibitory unidirectional pulse-like interactions. The attractors of the dynamics are limit cycles in which each oscillator fires once and only once. Since some of these limit cycles lead to the same pattern, we introduce the concept of pattern degeneracy to account for this. Moreover, we give a qualitative estimate of the volume of the basin of attraction of each pattern by means of probabilistic arguments and pattern degeneracy, and show how these volumes change with the coupling strength. In the limit of small coupling, our estimate agrees perfectly with numerical simulations.
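A small simulation sketch, under assumed dynamics (phase oscillators advancing at unit rate, firing at threshold 1, each pulse delaying the clockwise neighbour); this is our illustration of the setup, not the paper's exact model:

```python
# Ring of N pulse-coupled oscillators with unidirectional inhibition.
# After a transient, the firing sequence settles onto a limit cycle in
# which each oscillator fires exactly once per period.
import numpy as np

def firing_pattern(N=5, eps=0.05, steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.random(N)
    order = []
    for _ in range(steps):
        i = int(np.argmax(phase))      # next oscillator to reach threshold
        phase += 1.0 - phase[i]        # advance all phases to that event
        order.append(i)
        phase[i] = 0.0                 # reset the firing oscillator
        j = (i + 1) % N                # inhibit its clockwise neighbour
        phase[j] = max(0.0, phase[j] - eps)
    return order[-N:]                  # one period of the attractor

print(firing_pattern())
```

Running this from many random initial conditions and tallying how often each pattern appears gives the kind of basin-volume estimate the abstract refers to.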

Relevance:

10.00%

Publisher:

Abstract:

We develop a statistical theory to characterize correlations in weighted networks. We define the appropriate metrics quantifying correlations and show that strictly uncorrelated weighted networks do not exist, owing to the presence of structural constraints. We also introduce an algorithm for generating maximally random weighted networks with arbitrary P(k,s) to be used as null models. The application of our measures to real networks reveals the importance of weights for a correct understanding and modeling of these heterogeneous systems.
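As one concrete example of the correlation metrics involved, a sketch (our naming, not the paper's) of the average nearest-neighbour strength as a function of degree for a weighted networkx graph:

```python
# Average strength of a node's neighbours, binned by the node's degree:
# a weighted analogue of degree-degree correlation measures.
from collections import defaultdict
import networkx as nx

def avg_neighbour_strength(G, weight="weight"):
    strength = dict(G.degree(weight=weight))   # s_i = sum of edge weights
    acc = defaultdict(list)
    for n in G:
        k = G.degree(n)
        if k:
            acc[k].append(sum(strength[v] for v in G[n]) / k)
    return {k: sum(v) / len(v) for k, v in sorted(acc.items())}
```

A rising or falling trend in this curve signals weighted correlations that a proper null model with the same P(k,s) must be compared against.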

Relevance:

10.00%

Publisher:

Abstract:

Generalized Kerr-Schild space-times for a perfect-fluid source are investigated. New Petrov type D perfect-fluid solutions are obtained starting from conformally flat perfect-fluid metrics.
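For context, the generalized Kerr-Schild ansatz referred to here has the standard form (seed metric \tilde g_{ab}, scalar field H, null vector l_a):

```latex
% Generalized Kerr-Schild ansatz: the new metric is the seed metric
% plus a term quadratic in a null vector of the seed.
g_{ab} = \tilde g_{ab} + 2 H \, l_a l_b ,
\qquad \tilde g^{ab} l_a l_b = 0 .
```

Here the seed \tilde g_{ab} is a conformally flat perfect-fluid metric, and new solutions follow from solving the field equations for H and l_a.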

Relevance:

10.00%

Publisher:

Abstract:

Petrov type D and type II perfect-fluid solutions are obtained starting from conformally flat perfect-fluid metrics by using a generalized Kerr-Schild ansatz. Most of the Petrov type D metrics obtained have the property that the velocity of the fluid does not lie in the two-space defined by the principal null directions of the Weyl tensor. The properties of the perfect-fluid sources are studied. Finally, a detailed analysis of a new class of spherically symmetric static perfect-fluid metrics is given.

Relevance:

10.00%

Publisher:

Abstract:

Red light running (RLR) is a problem in the US, resulting in 165,000 injuries and 907 fatalities annually. In Iowa, RLR-related crashes make up 24.5 percent of all crashes and account for 31.7 percent of fatal and major-injury crashes at signalized intersections. RLR crashes are a safety concern due to the increased likelihood of injury compared with other types of crashes. One tool used to combat red light running is automated enforcement in the form of RLR cameras. Automated enforcement, while effective, is often controversial. Cedar Rapids, Iowa installed RLR and speeding cameras at seven intersections across the city. The intersections were chosen based on crash rates and on whether cameras could feasibly be placed at the intersection approaches. The cameras were installed starting in February 2010, with the last one becoming operational in December 2010. An analysis of the effect of the cameras on safety at these intersections was deemed prudent to help justify the installation and establish the effectiveness of the cameras. The objective of this research was to assess the safety effectiveness of the RLR program implemented in Cedar Rapids. This was accomplished by analyzing data to determine changes in the following metrics (a data-analysis sketch follows the list):

- Reductions in red light violation rates, based on overall changes, time-of-day changes, and changes by lane
- Effectiveness of the cameras over time
- Time at which red light runners enter the intersection
- Changes in the average headway between vehicles entering the intersection
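A hedged sketch of how the first two metrics could be computed from an event log; the CSV schema, column names, and camera start date below are illustrative assumptions, not data from the study:

```python
# Before/after red-light-violation rates, overall and by time of day,
# from a hypothetical per-vehicle event log ("violation" is 0 or 1).
import pandas as pd

events = pd.read_csv("rlr_events.csv", parse_dates=["timestamp"])
camera_start = pd.Timestamp("2010-02-01")   # first camera activation

events["period"] = (events["timestamp"] >= camera_start).map(
    {False: "before", True: "after"})
events["hour"] = events["timestamp"].dt.hour

overall = events.groupby(["approach", "period"])["violation"].mean()
by_hour = events.groupby(["period", "hour"])["violation"].mean().unstack(0)
print(overall, by_hour, sep="\n\n")
```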

Relevance:

10.00%

Publisher:

Abstract:

Timber material repair and replacement costs for timber bridges are a considerable expense to highway agencies in Iowa, especially to county road departments. To address these needs, the objectives of this investigation were to study the field effectiveness of various treatment alternatives used on Iowa roadway projects and to determine whether the current specifications and testing are adequate for providing proper wood preservation. To satisfy the research needs, the project scope involved a literature review, identification of metrics, a questionnaire survey of Iowa counties, on-site inspections, and a review of current specifications and testing procedures. Based on the preservative information obtained, the following general conclusions were made:

- Copper naphthenate is recommended as the plant-applied preservative treatment for timber bridges.
- Best Management Practices should be followed to ensure quality treatment of timber materials.
- Bridge maintenance programs need to be developed and implemented.
- The Iowa Department of Transportation specifications for preservative treatment are the regulating specification for bridges constructed with state or federal funding in Iowa and are also recommended for all other bridges.

Relevance:

10.00%

Publisher:

Abstract:

It is well known that the first successful valuation of a stock option was done by solving a deterministic partial differential equation (PDE) of parabolic type with complementary conditions specific to the option. In this approach, the randomness in the option-value process is eliminated through a no-arbitrage argument. An alternative approach is to construct a replicating portfolio for the option. From this viewpoint, the payoff function of the option is a random process which, under a new probability measure, turns out to be of a special type: a martingale. Accordingly, the value of the replicating portfolio (equivalently, of the option) is calculated as an expectation, with respect to this new measure, of the discounted value of the payoff function. Since the expectation is, by definition, an integral, its calculation can be made simpler by resorting to powerful methods already available in the theory of analytic functions. In this paper we use precisely two of those techniques to find the well-known value of a European call.
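For reference, the discounted-expectation valuation described above and its closed-form result for a European call, in standard Black-Scholes notation (spot S_0, strike K, rate r, volatility σ, maturity T):

```latex
% Risk-neutral (martingale) pricing of a European call and the
% resulting Black-Scholes formula; N is the standard normal CDF.
C = e^{-rT}\,\mathbb{E}^{Q}\!\left[(S_T - K)^{+}\right]
  = S_0 N(d_1) - K e^{-rT} N(d_2),
\qquad
d_{1,2} = \frac{\ln(S_0/K) + \left(r \pm \tfrac{1}{2}\sigma^{2}\right)T}{\sigma\sqrt{T}} .
```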

Relevance:

10.00%

Publisher:

Abstract:

The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to risk assessment of ENMs, encompassing the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn: Due to the high batch variability of the characteristics of commercially available (and, to a lesser degree, laboratory-made) ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs.

1) Concomitant with using the OECD priority list of ENMs, other criteria for selecting ENMs, such as relevance for mechanistic (scientific) studies or risk-assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focusses on the validity of OECD tests, so source material will be first in scope for testing. For risk assessment, however, it is much more relevant to have toxicity data from material as present in the products/matrices to which humans and the environment are exposed.

2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods can determine only a single characteristic, and some of them can be rather expensive. In practice, it is currently not feasible to fully characterise ENMs. Many techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully.

3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that protocols be exchanged. The precise methods used to disperse ENMs should be specifically, yet succinctly, described in the experimental section of a publication.

4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo).

5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are not possible with current knowledge.

Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we consider diffusion of a passive substance C in a temporally and spatially inhomogeneous two-dimensional medium. As a realization of the latter we choose a phase-separating medium consisting of two substances A and B, whose dynamics is determined by the Cahn-Hilliard equation. Assuming different diffusion coefficients of C in A and B, we find that the variance of the distribution function of the said substance grows less than linearly in time. We derive a simple identity for the variance using a probabilistic ansatz and are then able to identify the interface between A and B as the main cause of this nonlinear dependence. We argue that, finally, for very large times the time-dependent diffusion "constant" considered here approaches a constant asymptotic value D∞ as t^{-1/3}. The latter is calculated approximately by employing the effective-medium approximation and by fitting the simulation data to the said time dependence.
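For context, a sketch of the symmetric (Bruggeman) effective-medium approximation for a two-phase two-dimensional medium, which is the kind of estimate presumably behind D∞; the notation (area fractions f_A, f_B, local diffusivities D_A, D_B) is ours:

```latex
% 2D symmetric effective-medium approximation for phases A and B
% with f_A + f_B = 1; at equal fractions it reduces to the geometric mean.
f_A \frac{D_A - D_{\mathrm{eff}}}{D_A + D_{\mathrm{eff}}}
+ f_B \frac{D_B - D_{\mathrm{eff}}}{D_B + D_{\mathrm{eff}}} = 0
\quad\Longrightarrow\quad
D_{\mathrm{eff}} = \sqrt{D_A D_B} \ \text{ at } f_A = f_B = \tfrac{1}{2} .
```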

Relevance:

10.00%

Publisher:

Abstract:

Background: Network reconstructions at the cell level are a major development in Systems Biology. However, we are far from fully exploiting their potential. Often, the increasing complexity of the systems pursued outstrips experimental capabilities, or increasingly sophisticated protocols are underutilized, serving merely to refine confidence levels of already established interactions. For metabolic networks, the currently employed confidence scoring system rates reactions discretely according to nested categories of experimental evidence or model-based likelihood. Results: Here, we propose a complementary network-based scoring system that exploits the statistical regularities of a metabolic network as a bipartite graph. As an illustration, we apply it to the metabolism of Escherichia coli. The model is adjusted to the observations to derive connection probabilities between individual metabolite-reaction pairs and, after validation, to assess the reliability of each reaction in probabilistic terms. This network-based scoring system uncovers very specific reactions that could be functionally or evolutionarily important, identifies prominent experimental targets, and enables further confirmation of modeling results. Conclusions: We foresee a wide range of potential applications at different sub-cellular or supra-cellular levels of biological interactions, given the natural bipartivity of many biological networks.
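A toy sketch of one simple degree-based stand-in for such connection probabilities on a bipartite metabolite-reaction graph (a Chung-Lu style factorized model; the names and the model choice are ours, not the paper's):

```python
# Factorized link probabilities p_ij ~ k_i k_j / E for a bipartite graph,
# where k are metabolite/reaction degrees and E the total number of edges.
# An observed edge with very low p_ij is statistically "surprising" and
# may flag a functionally specific reaction.
import numpy as np

def link_probabilities(k_met, k_rxn):
    k_met = np.asarray(k_met, dtype=float)
    k_rxn = np.asarray(k_rxn, dtype=float)
    E = k_met.sum()                      # equals k_rxn.sum() on real data
    return np.clip(np.outer(k_met, k_rxn) / E, 0.0, 1.0)

print(link_probabilities([3, 1, 7], [2, 2, 4, 3]).round(3))
```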

Relevance:

10.00%

Publisher:

Abstract:

In social insects, workers perform a multitude of tasks, such as foraging, nest construction, and brood rearing, without central control of how work is allocated among individuals. It has been suggested that workers choose a task by responding to stimuli gathered from the environment. Response-threshold models assume that individuals in a colony vary in the stimulus intensity (response threshold) at which they begin to perform the corresponding task. Here we highlight the limitations of these models with respect to colony performance in task allocation. First, we show with analysis and quantitative simulations that the deterministic response-threshold model constrains the workers' behavioral flexibility under some stimulus conditions. Next, we show that the probabilistic response-threshold model fails to explain precise colony responses to varying stimuli. Both of these limitations would be detrimental to colony performance when dynamic and precise task allocation is needed. To address these problems, we propose extensions of the response-threshold model by adding variables that weigh stimuli. We test the extended response-threshold model in a foraging scenario and show in simulations that it results in efficient task allocation. Finally, we show that response-threshold models can be formulated as artificial neural networks, which consequently provide a comprehensive framework for modeling task allocation in social insects.
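A minimal sketch of the standard probabilistic response-threshold rule the abstract discusses (Hill-type form; the exponent and parameter values below are illustrative):

```python
# Probability that a worker with threshold theta engages in a task,
# given stimulus intensity s and steepness exponent n.
def response_probability(s, theta, n=2):
    return s**n / (s**n + theta**n)

# A low-threshold worker responds to weak stimuli; a high-threshold
# worker needs a much stronger stimulus before engaging.
for theta in (0.2, 0.8):
    print(theta, round(response_probability(0.5, theta), 3))
```

The extensions proposed in the paper amount to weighting s before it enters such a rule, which is what makes the neural-network formulation natural.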

Relevance:

10.00%

Publisher:

Abstract:

Summary: The metrics of Larin Paraske's epic folk poetry in the Kalevala metre

Relevance:

10.00%

Publisher:

Abstract:

Detailed knowledge of the anatomy and connectivity pattern of cortico-basal ganglia circuits is essential to an understanding of abnormal cortical function and pathophysiology associated with a wide range of neurological and neuropsychiatric diseases. We aim to study the spatial extent and topography of human basal ganglia connectivity in vivo. Additionally, we explore at an anatomical level the hypothesis of coexistent segregated and integrative cortico-basal ganglia loops. We use probabilistic tractography on magnetic resonance diffusion-weighted imaging data to segment the basal ganglia and thalamus in 30 healthy subjects based on their cortical and subcortical projections. We introduce a novel method to define voxel-based connectivity profiles that allow representation of projections from a source to more than one target region. Using this method, we localize specific relay nuclei within predefined functional circuits. We find strong correlation between tractography-based basal ganglia parcellation and anatomical data from previously reported invasive tracing studies in nonhuman primates. Additionally, we show in vivo the anatomical basis of segregated loops and the extent of their overlap in prefrontal, premotor, and motor networks. Our findings in healthy humans support the notion that probabilistic diffusion tractography can be used to parcellate subcortical gray matter structures on the basis of their connectivity patterns. The coexistence of clearly segregated and also overlapping connections from cortical sites to basal ganglia subregions is a neuroanatomical correlate of both parallel and integrative networks within them. We believe that this method can be used to examine pathophysiological concepts in a number of basal ganglia-related disorders.
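A sketch of the voxel-based connectivity-profile idea in the abstract: streamline counts from each seed voxel to several target regions are normalised into a probability profile, and a winner-take-all rule gives a hard parcellation. The arrays and the rule here are illustrative, not the authors' pipeline:

```python
# counts[v, t]: streamlines seeded in voxel v that reach target region t.
import numpy as np

def connectivity_profiles(counts):
    """Normalise per-voxel streamline counts into target probabilities."""
    totals = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, totals, where=totals > 0,
                     out=np.zeros(counts.shape))

counts = np.array([[900, 80, 20],     # voxel dominated by target 0
                   [100, 500, 400]])  # voxel with a mixed profile
profiles = connectivity_profiles(counts)
print(profiles.round(2), profiles.argmax(axis=1))  # profiles + hard labels
```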

Relevance:

10.00%

Publisher:

Abstract:

The research reported in this series of articles aimed (1) to automate the search of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography (HPTLC), despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
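A minimal sketch of the likelihood-ratio form in which such probabilistic models typically express evidential value; the two score distributions below are placeholder Gaussians, not the fitted HPTLC comparison model from the paper:

```python
# Evidential value as a likelihood ratio:
#   LR = p(score | same ink) / p(score | different inks).
from statistics import NormalDist

same = NormalDist(mu=0.9, sigma=0.05)   # scores when inks share a source
diff = NormalDist(mu=0.4, sigma=0.15)   # scores for unrelated inks

def likelihood_ratio(score):
    return same.pdf(score) / diff.pdf(score)

print(round(likelihood_ratio(0.85), 1))  # LR >> 1 supports "same ink"
```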