960 results for Probability Metrics
Abstract:
We have studied enhancer function in transient and stable expression assays in mammalian cells by using systems that distinguish expressing from nonexpressing cells. When expression is studied in this way, enhancers are found to increase the probability of a construct being active but not the level of expression per template. In stably integrated constructs, large differences in expression level are observed but these are not related to the presence of an enhancer. Together with earlier studies, these results suggest that enhancers act to affect a binary (on/off) switch in transcriptional activity. Although this idea challenges the widely accepted model of enhancer activity, it is consistent with much, if not all, experimental evidence on this subject. We hypothesize that enhancers act to increase the probability of forming a stably active template. When randomly integrated into the genome, enhancers may affect a metastable state of repression/activity, permitting expression in regions that would not permit activity of an isolated promoter.
Abstract:
It is well known that quantum correlations for bipartite dichotomic measurements are those of the form (Formula presented.), where the vectors u_i and v_j lie in the unit ball of a real Hilbert space. In this work we study the probability that these correlations are nonlocal, as a function of (Formula presented.), where the above vectors are sampled according to the Haar measure on the unit sphere of (Formula presented.). In particular, we prove the existence of an (Formula presented.) such that if (Formula presented.), (Formula presented.) is nonlocal with probability tending to 1 as (Formula presented.), while for (Formula presented.), (Formula presented.) is local with probability tending to 1 as (Formula presented.).
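The first "(Formula presented.)" placeholder corresponds to the standard (Tsirelson-type) form of quantum correlation matrices for bipartite dichotomic measurements; a plausible reconstruction, offered only as orientation and not as the paper's exact notation, is:

    % Standard form of quantum correlations for n x n bipartite dichotomic
    % measurements: entries are inner products of vectors in the unit ball
    % of a real Hilbert space.
    \[
      \gamma = \bigl( \langle u_i , v_j \rangle \bigr)_{i,j=1}^{n},
      \qquad \| u_i \| \le 1, \quad \| v_j \| \le 1 .
    \]
    % The remaining placeholders (the sampling dimension, the threshold and
    % the limits) cannot be recovered from the abstract and are left as printed.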
Abstract:
In this paper we introduce the concept of the Lateral Trigger Probability (LTP) function, i.e., the probability for an Extensive Air Shower (EAS) to trigger an individual detector of a ground-based array as a function of the distance to the shower axis, taking into account the energy, mass and direction of the primary cosmic ray. We apply this concept to the surface array of the Pierre Auger Observatory, which consists of about 1600 water Cherenkov stations on a 1.5 km spaced grid. Using Monte Carlo simulations of ultra-high-energy showers, the LTP functions are derived for energies in the range between 10^17 and 10^19 eV and zenith angles up to 65 degrees. A parametrization combining a step function with an exponential is found to reproduce them very well in the considered range of energies and zenith angles. The LTP functions can also be obtained from data, using events simultaneously observed by the fluorescence and the surface detector of the Pierre Auger Observatory (hybrid events). We validate the Monte Carlo results by showing that the LTP functions obtained from data are in good agreement with the simulations.
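As a hedged illustration of the kind of parametrization described above (a step function combined with an exponential), the following Python sketch assumes a simple form with full efficiency out to a cut-off radius r0 and an exponential fall-off of scale lam beyond it; both the functional form and the parameter values are illustrative assumptions, not the parametrization published by the Pierre Auger Collaboration.

    # Illustrative sketch only: a step function joined to an exponential tail
    # for a lateral trigger probability P(r). Parameters r0 and lam are
    # hypothetical, chosen for demonstration.
    import numpy as np

    def ltp(r, r0, lam):
        """Trigger probability vs. distance r (in metres) from the shower axis:
        fully efficient (P = 1) inside r0, exponential fall-off beyond it."""
        r = np.asarray(r, dtype=float)
        return np.where(r < r0, 1.0, np.exp(-(r - r0) / lam))

    # Example: probabilities at 500 m, 1500 m and 2500 m for r0 = 1000 m, lam = 400 m
    print(ltp([500.0, 1500.0, 2500.0], r0=1000.0, lam=400.0))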
Abstract:
Successful HR departments should support key business objectives by establishing metrics that determine the effectiveness of their processes. Functions such as recruiting, benefits, and training are processes that should have metrics. Understanding who measures what, when, and how often is the first step in measuring how much it costs to run HR. The next step is determining which processes are most critical, and then determining the metrics that fit the business needs. Slight adjustments will need to be made as business needs change, but the process for measuring outcomes should not change. This paper focuses on multinational corporations that employ at least ten thousand employees and have a ratio of one HR professional to every hundred full-time equivalents (FTEs).
Abstract:
The Iterative Closest Point (ICP) algorithm is commonly used in engineering applications to solve the rigid registration problem for partially overlapping point sets that are pre-aligned with a coarse estimate of their relative positions. This iterative algorithm is applied in many areas, such as medicine for the volumetric reconstruction of tomography data, robotics to reconstruct surfaces or scenes from range-sensor information, industrial systems for the quality control of manufactured objects, and even biology to study the structure and folding of proteins. One of the algorithm's main problems is its high computational complexity (quadratic in the number of points for the non-optimized original variant), in a context where high-density point sets acquired by high-resolution scanners must be processed. Many variants have been proposed in the literature that aim to improve performance, either by reducing the number of points or the required iterations, or by reducing the complexity of the most expensive phase: the closest-neighbor search. Despite lowering the complexity, some of these variants tend to have a negative impact on the final registration precision or on the convergence domain, which limits the possible application scenarios. The goal of this work is to improve the algorithm's computational cost so that a wider range of computationally demanding problems among those described above can be addressed. For that purpose, an experimental and mathematical convergence analysis and validation of point-to-point distance metrics has been performed, considering distances with a lower computational cost than the Euclidean distance, which is the de facto standard in implementations of the algorithm. In that analysis, the behavior of the algorithm in different topological spaces, characterized by different metrics, has been studied to assess the convergence, efficacy and cost of the method, in order to determine which metric offers the best results. Given that distance calculation represents a significant part of the computation performed by the algorithm, any reduction in the cost of that operation is expected to have a significant, positive effect on the overall performance of the method. As a result, a performance improvement has been achieved by applying these reduced-cost metrics, whose quality in terms of convergence and error has been analyzed and experimentally validated as comparable to the Euclidean distance, using a heterogeneous set of objects, scenarios and initial configurations.
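Since the work above centres on swapping the Euclidean distance for cheaper point-to-point metrics in the closest-neighbor search, here is a minimal Python sketch of that step with an interchangeable metric; it is a brute-force toy for illustration, not the thesis' optimized implementation, and the helper names are assumptions.

    # Nearest-neighbour step of ICP with a pluggable point-to-point metric.
    # Cheaper metrics (L1, L-infinity) avoid squares and square roots.
    import numpy as np

    def manhattan(p, q):
        return np.abs(p - q).sum(axis=-1)        # L1 distance, no square root

    def chebyshev(p, q):
        return np.abs(p - q).max(axis=-1)        # L-infinity: largest coordinate gap

    def euclidean(p, q):
        return np.sqrt(((p - q) ** 2).sum(axis=-1))

    def closest_points(source, target, metric=euclidean):
        """For each source point, return the index of the closest target point
        under the chosen metric (brute force, O(N*M))."""
        idx = np.empty(len(source), dtype=int)
        for i, p in enumerate(source):
            idx[i] = np.argmin(metric(p, target))
        return idx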
Abstract:
This paper proposes an adaptive algorithm for clustering the cumulative probability distribution functions (c.p.d.f.) of a continuous random variable, observed in different populations, into the minimum number of homogeneous clusters, making no parametric assumptions about the c.p.d.f.'s. The proposed distance function for clustering c.p.d.f.'s is based on the Kolmogorov–Smirnov two-sample statistic, which can detect differences in the position, dispersion or shape of the c.p.d.f.'s. In our context, this statistic allows us to cluster the recorded data with a homogeneity criterion based on the whole distribution of each data set, and to decide whether or not more clusters need to be added. In this sense, the proposed algorithm is adaptive, as it automatically increases the number of clusters only as necessary; there is therefore no need to fix the number of clusters in advance. The outputs of the algorithm are, for each cluster, the common c.p.d.f. of all the data observed in the cluster (the centroid) and the Kolmogorov–Smirnov statistic between the centroid and the most distant c.p.d.f. The proposed algorithm has been applied to a large data set of solar global irradiation spectra distributions; the results reduce the information in more than 270,000 c.p.d.f.'s to only 6 different clusters, each corresponding to a different c.p.d.f.
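A minimal Python sketch of the distance and the per-cluster summary described above, using SciPy's two-sample Kolmogorov–Smirnov statistic; pooling a cluster's samples as its centroid follows the abstract, while the function names are assumptions and the adaptive splitting logic is omitted.

    # Two-sample KS statistic as a distance between empirical c.p.d.f.'s,
    # plus the per-cluster summary (centroid and farthest member).
    import numpy as np
    from scipy.stats import ks_2samp

    def ks_distance(sample_a, sample_b):
        """sup |F_a(x) - F_b(x)| over the pooled data of the two samples."""
        return ks_2samp(sample_a, sample_b).statistic

    def cluster_summary(samples):
        """Pool all samples of a cluster as its 'centroid' c.p.d.f. and report
        the KS statistic between that centroid and the most distant member."""
        centroid = np.concatenate(samples)
        return centroid, max(ks_distance(centroid, s) for s in samples)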
Abstract:
Development of desalination projects requires simple methodologies and tools for cost-effective and environmentally sensitive management. Sentinel taxa and biotic indices are easily interpreted from the perspective of environmental management. Echinoderms are a potential sentinel taxon for gauging the impact produced by brine discharge, and the BOPA index is considered an effective tool for monitoring different types of impact. The salinity increase due to desalination brine discharge was evaluated in terms of these two indicators, which reflected both the environmental impact and the recovery that followed implementation of a mitigation measure. Echinoderms disappeared at the station closest to the discharge during the years with the highest salinity and recovered their abundance after the installation of a diffuser reduced the salinity increase. In the same period, BOPA responded to the decrease in sensitive amphipods and the increase in tolerant polychaete families as salinity rose. Although salinity changes explained most of the observed variability in both indicators, other abiotic parameters were also significant in explaining this variability.
Abstract:
In 1991, Bryant and Eckard estimated the annual probability that a cartel would be detected by the US Federal authorities, conditional on being detected, to be at most between 13% and 17%. Fifteen years later, we estimated the same probability for a European sample and found an annual probability that falls between 12.9% and 13.3%. We also develop a detection model to clarify this probability. Our estimate is based on detection durations, calculated from data reported for all the cartels convicted by the European Commission from 1969 to the present date, and on a statistical birth-and-death process model describing the onset and detection of cartels.
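As a simplified illustration of how detection durations map to an annual detection probability (a toy geometric model, not the authors' birth-and-death process): if a cartel that will eventually be caught is detected in any given year with constant probability p, its detection duration T is geometric, so

    \[
      \Pr(T = t) = (1 - p)^{t-1} \, p , \qquad
      \mathbb{E}[T] = \frac{1}{p}
      \quad\Longrightarrow\quad
      \hat{p} = \frac{1}{\bar{T}} .
    \]

Under this toy model, a mean detection duration of roughly 7.5 to 7.75 years would correspond to an annual probability of about 12.9% to 13.3%, the range quoted above.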
Abstract:
Both the EU and its member states are in a period of rethinking security strategy to adapt to contemporary challenges both in the European region and beyond, including Northeast Asia. In this Security Policy Brief, Mason Richey discusses what difficulties and risks a North Korean regime collapse would pose, the likelihood that it will occur sooner rather than later, and how Europe will be affected by such a scenario.
Abstract:
rrreg fits a linear probability model for randomized response data
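A minimal Python sketch of the underlying idea, assuming a forced-response randomized response design with known design probabilities; this is not the rr package's rrreg implementation, only an illustration of why a linear probability model is identifiable from the randomized answers.

    # Least-squares linear probability model for forced-response randomized
    # response data (illustrative; design probabilities assumed known).
    import numpy as np

    def rr_linear_probability(X, y_obs, p_truth, p_forced_yes):
        """Least-squares fit of a linear probability model for the latent
        sensitive answer. Under a forced-response design,
            P(observed 'yes' | x) = p_forced_yes + p_truth * P(true 'yes' | x),
        so regressing (y_obs - p_forced_yes) / p_truth on the covariates by
        ordinary least squares recovers the linear-model coefficients
        (intercept first)."""
        y_adj = (np.asarray(y_obs, dtype=float) - p_forced_yes) / p_truth
        Xd = np.column_stack([np.ones(len(y_adj)), np.asarray(X, dtype=float)])
        beta, *_ = np.linalg.lstsq(Xd, y_adj, rcond=None)
        return beta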