937 results for Probability densities


Relevance:

20.00%

Publisher:

Abstract:

Expert knowledge is used to assign probabilities to events in many risk analysis models. However, experts sometimes find it hard to provide specific values for these probabilities, preferring to express them in vague or imprecise terms that are then mapped onto a previously defined fuzzy number scale. The rigidity of these scales introduces bias into the probability elicitation process and does not allow experts to adequately express their probabilistic judgments. We present an interactive method for eliciting from experts a fuzzy number that represents their probabilistic judgment for a given event, along with a quality measure of that judgment, useful in a final information-filtering and sensitivity-analysis step.
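
As a rough illustration of the idea (not the authors' actual elicitation procedure), the sketch below represents an expert's probabilistic judgment as a triangular fuzzy number built from three elicited anchor points, together with a simple width-based quality score; the class name and the scoring rule are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class TriangularFuzzyProbability:
        """Expert judgment for P(event): lowest, most plausible, and highest value."""
        low: float
        mode: float
        high: float

        def membership(self, p: float) -> float:
            """Degree to which a probability value p is compatible with the judgment."""
            if self.low < p <= self.mode:
                return (p - self.low) / (self.mode - self.low)
            if self.mode < p < self.high:
                return (self.high - p) / (self.high - self.mode)
            return 1.0 if p == self.mode else 0.0

        def quality(self) -> float:
            """Illustrative quality score: narrower judgments score closer to 1."""
            return 1.0 - (self.high - self.low)

    # Example: the expert judges the event probability to be around 0.3,
    # and certainly between 0.2 and 0.5.
    judgment = TriangularFuzzyProbability(low=0.2, mode=0.3, high=0.5)
    print(judgment.membership(0.25), judgment.quality())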

Relevance:

20.00%

Publisher:

Abstract:

The current space environment, consisting of man-made debris and micrometeoroids, poses a risk to safe operations in space, and the situation is continuously deteriorating due to in-orbit debris collisions and new satellite launches. Bare electrodynamic tethers can provide an efficient mechanism for rapid deorbiting of satellites from low Earth orbit at end of life. Because of its particular geometry (length very much larger than cross-sectional dimensions), a tether may have a relatively high risk of being severed by a single impact of small debris. The rates of fatal impact of orbital debris on round and tape tethers of equal length and mass, evaluated with an analytical approximation to the debris flux modeled by NASA's ORDEM2000, show a much higher survival probability for tapes. A comparative numerical analysis using the debris flux models ORDEM2000 and ESA's MASTER2005 validates the analytical result and shows that, for a given time in orbit, a tape has a probability of survival about one and a half orders of magnitude higher than that of a round tether of equal mass and length. Because deorbiting from a given altitude is much faster for the tape due to its larger perimeter, its probability of survival in a practical sense is quite high.
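
The survival figures above follow from treating fatal debris impacts as a Poisson process: for a fatal-impact rate r, the probability of surviving a mission of duration t is exp(-r t). The sketch below shows only this relation with purely illustrative rates; the actual tape-versus-wire comparison requires the ORDEM2000/MASTER2005 flux models, which are not reproduced here.

    import math

    def survival_probability(fatal_impact_rate_per_year: float, years: float) -> float:
        """Poisson survival: probability of zero fatal impacts over the mission time."""
        return math.exp(-fatal_impact_rate_per_year * years)

    # Illustrative (made-up) fatal-impact rates, not values from ORDEM2000.
    rate_round = 0.5   # fatal impacts per year for a round wire tether (assumed)
    rate_tape = 0.02   # fatal impacts per year for a tape tether (assumed)

    for years in (0.5, 1.0, 2.0):
        print(years, survival_probability(rate_round, years),
              survival_probability(rate_tape, years))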

Relevance:

20.00%

Publisher:

Abstract:

Motivated by these difficulties, Castillo et al. (2012) made some suggestions on how to build consistent stochastic models, avoiding the selection of easy-to-use mathematical functions and replacing them with functions derived from a set of properties that the model must satisfy.

Relevance:

20.00%

Publisher:

Abstract:

In previous papers, type-I intermittency with a continuous reinjection probability density (RPD) has been extensively studied. In this paper, type-I intermittency with a discontinuous RPD function in one-dimensional maps is analyzed. To carry out the present study, the analytic approximation presented by del Río and Elaskar (Int. J. Bifurc. Chaos 20:1185-1191, 2010) and Elaskar et al. (Physica A 390:2759-2768, 2011) is extended to consider discontinuous RPD functions. The results of this analysis show that the characteristic relation depends only on the position of the lower bound of reinjection (LBR); therefore, for an LBR below the tangent point, the characteristic relation ⟨l⟩ ∝ ε^(-1/2), where ε is the control parameter and ⟨l⟩ is the average length of the laminar phases, remains robust regardless of the form of the RPD, although the actual value of ⟨l⟩ can change. Finally, the study of discontinuous RPDs for the type-I intermittency that occurs in a three-wave truncation model for the derivative nonlinear Schrödinger equation is presented. In all tests, the theoretical results agree well with the numerical data.
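
A minimal numerical illustration of type-I intermittency (not the specific maps or RPDs analyzed in the paper): iterate the local map x_{n+1} = ε + x_n + x_n², reinject uniformly once the trajectory escapes the laminar region, and measure the average laminar length, which should scale roughly as ε^(-1/2). The threshold, reinjection interval, and phase count below are illustrative choices.

    import random

    def average_laminar_length(eps: float, threshold: float = 0.2,
                               n_phases: int = 2000, lbr: float = -0.2) -> float:
        """Average laminar length for the local type-I map x -> eps + x + x**2.

        A laminar phase lasts while |x| < threshold; after escape, x is
        reinjected uniformly in [lbr, threshold] (a crude, illustrative RPD).
        """
        lengths = []
        x = random.uniform(lbr, threshold)
        for _ in range(n_phases):
            count = 0
            while abs(x) < threshold:
                x = eps + x + x * x
                count += 1
            lengths.append(count)
            x = random.uniform(lbr, threshold)   # reinjection
        return sum(lengths) / len(lengths)

    for eps in (1e-3, 1e-4, 1e-5):
        print(eps, average_laminar_length(eps))   # grows roughly like eps**-0.5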

Relevance:

20.00%

Publisher:

Abstract:

Reliability is becoming the main concern in integrated circuits as technology scales below 22 nm. Small imperfections in device manufacturing now result in significant random differences in the electrical characteristics of devices, which must be taken into account during the design phase. The new processes and materials required to fabricate such extremely small devices give rise to effects that ultimately lead to increased static power consumption or a higher vulnerability to radiation. SRAMs have become the most vulnerable part of electronic systems, not only because they account for more than half of the chip area of today's SoCs and microprocessors, but also because process variations affect them critically, with the failure of a single cell making the whole memory fail. This thesis addresses the different challenges that SRAM design faces in the smallest technology nodes. In a scenario of increasing variability, issues such as energy consumption, design aware of low-level technology effects, and radiation hardening are considered.

First, given the increasing magnitude of device variability in the smallest nodes, as well as the new sources of variability introduced by new devices and shrinking dimensions, accurate modeling of this variability is crucial. We propose to extend the injectors method, which models variability at circuit level while abstracting its physical sources, by adding two new injectors that model the sub-threshold slope and drain-induced barrier lowering (DIBL), both of growing importance in FinFET technology. The two proposed injectors increase the accuracy of figures of merit at different abstraction levels of electronic design: transistor, gate, and circuit level. The mean square error when estimating performance and stability metrics of SRAM cells is reduced by a factor of at least 1.5 and up to 7.5, while the estimation of the failure probability is improved by several orders of magnitude.

Low-power design is a major constraint given the growing market of battery-powered mobile devices. It is equally necessary because of the high power densities of current systems, in order to reduce thermal dissipation and its impact on aging. The traditional approach of lowering the supply voltage to reduce energy consumption is challenging for SRAMs, given the increased impact of process variations at low supply voltages. We propose a cell design that uses a negative bit-line write assist to overcome write failures as the main supply voltage is lowered. Despite using a second power source for the negative bit-line voltage, the proposed design achieves an energy reduction of up to 20% compared with a conventional cell. A new metric, the hold trip point, is introduced to capture new types of failure that arise when a negative bit-line voltage is used, together with an alternative method for estimating read speed that requires fewer simulations.

As device scaling continues, new mechanisms are introduced to ease the fabrication process or to meet the performance targets of each new technology node. One example is the compressive or tensile strain applied to the fins in FinFET technologies, which alters the mobility of the transistors built on those fins. The effects of these mechanisms are strongly layout dependent: the position of a transistor affects its neighbors, and different types of transistors can be affected in different ways. We propose a complementary SRAM cell that uses pMOS pass-gate transistors, thereby shortening the fins of the nMOS transistors and lengthening those of the pMOS transistors, extending them into the neighboring cells and up to the edges of the cell array. When shallow trench isolation (STI) and SiGe stressors are considered, the proposed design improves both types of transistor, boosting the performance of the complementary SRAM cell by more than 10% for the same failure probability and static power consumption, with no area overhead.

Finally, radiation has been a long-standing concern in space electronics, but the small currents and voltages of today's devices are making them vulnerable to radiation-induced transient noise even at ground level. Although technologies such as SOI and FinFET reduce the amount of energy collected by the circuit when a particle strikes, the large process variations of the smallest nodes will affect their radiation hardness. We show that radiation-induced errors can increase by up to 40% at the 7 nm node when process variations are considered, compared with the nominal case. This increase is larger than the improvement achieved by specifically radiation-hardened memory cells, suggesting that reducing variability would bring a greater benefit.
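
As a generic illustration of variability-aware SRAM yield analysis (not the injectors method itself, which requires circuit-level simulation), the sketch below estimates a cell failure probability by Monte Carlo sampling of a static noise margin that degrades with random threshold-voltage mismatch; the distributions, the linear sensitivity model, and the failure criterion are all illustrative assumptions.

    import random

    def estimate_failure_probability(n_samples: int = 100_000,
                                     nominal_snm_mv: float = 180.0,
                                     sigma_vth_mv: float = 30.0,
                                     snm_sensitivity: float = 1.2,
                                     fail_threshold_mv: float = 0.0) -> float:
        """Monte Carlo estimate of the probability that the cell's static noise
        margin (SNM) drops below the failure threshold.

        Each sample draws independent threshold-voltage shifts for the two
        halves of the cell and degrades the SNM linearly with their mismatch
        (a crude first-order model, for illustration only).
        """
        failures = 0
        for _ in range(n_samples):
            dvth_left = random.gauss(0.0, sigma_vth_mv)
            dvth_right = random.gauss(0.0, sigma_vth_mv)
            snm = nominal_snm_mv - snm_sensitivity * abs(dvth_left - dvth_right)
            if snm < fail_threshold_mv:
                failures += 1
        return failures / n_samples

    print(estimate_failure_probability())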

Relevance:

20.00%

Publisher:

Abstract:

The effects of cell toxicity are known to be inherent in carcinogenesis induced by radiation or chemical carcinogens. The death of an initiated cell precludes tumor induction from that cell. A long-standing problem is to estimate the proportion of initiated cells that die before tumor induction. No experimental techniques are currently available for directly gauging the rate of cell death over extended periods of time. This obstacle can be surmounted by newly developed theoretical methods of carcinogenesis modeling. In this paper, we apply such methods to published data on multiple lung tumors in mice receiving different schedules of urethane. Bioassays of this type play an important role in testing environmental chemicals for carcinogenic activity. Our estimates for urethane-induced carcinogenesis show that, unexpectedly, many initiated cells die early in the course of tumor promotion. We present numerical estimates of the probability of initiated cell death for different schedules (and doses) of urethane administration.
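
One standard way to reason about the fate of a single initiated cell (a generic branching-process construction, not the specific model fitted in the paper) is a linear birth-death process: if the cell's clone grows at birth rate b and dies at death rate d, it eventually goes extinct with probability min(1, d/b). The rates below are illustrative only.

    def extinction_probability(birth_rate: float, death_rate: float) -> float:
        """Probability that the clone founded by one initiated cell dies out
        in a linear birth-death process (standard branching-process result)."""
        if birth_rate <= 0:
            return 1.0
        return min(1.0, death_rate / birth_rate)

    # Illustrative rates only; the paper estimates cell death from bioassay data.
    for b, d in [(1.0, 0.2), (1.0, 0.8), (1.0, 1.5)]:
        print(b, d, extinction_probability(b, d))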

Relevance:

20.00%

Publisher:

Abstract:

Structural genomics aims to solve a large number of protein structures that represent the protein space. Currently, an exhaustive solution for all structures seems prohibitively expensive, so the challenge is to define a relatively small set of proteins with new, currently unknown folds. This paper presents a method that assigns to each protein a probability of having an unsolved fold. The method makes extensive use of protomap, a sequence-based classification, and scop, a structure-based classification. According to protomap, the protein space encodes the relationship among proteins as a graph whose vertices correspond to 13,354 clusters of proteins. A representative fold for a cluster with at least one solved protein is determined after superposition of all scop (release 1.37) folds onto protomap clusters. Distances within the protomap graph are computed from each representative fold to the neighboring folds. The distribution of these distances is used to create a statistical model for distances among those folds that are already known and those that have yet to be discovered. The distribution of distances for solved and unsolved proteins is significantly different. This difference makes it possible to use Bayes' rule to derive a statistical estimate that any given protein has a yet undetermined fold. Proteins with the highest probability of representing a new fold constitute the target list for structural determination. Our predicted probabilities for unsolved proteins correlate very well with the proportion of new folds among recently solved structures (new scop 1.39 records) that are disjoint from our original training set.
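
The core statistical step described above is a direct application of Bayes' rule: combine the distance distributions observed for solved and unsolved proteins with a prior on fold novelty. The sketch below shows only that step, using illustrative Gaussian distance models in place of the empirical protomap/scop distributions used by the method.

    import math

    def gaussian_pdf(x: float, mean: float, std: float) -> float:
        return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

    def prob_new_fold(distance: float, prior_new: float = 0.3) -> float:
        """Bayes' rule: P(new fold | distance to nearest solved fold).

        Illustrative likelihoods: proteins with already-solved folds tend to lie
        close to a representative fold; proteins with new folds tend to lie far.
        """
        like_new = gaussian_pdf(distance, mean=5.0, std=1.5)     # assumed
        like_solved = gaussian_pdf(distance, mean=2.0, std=1.0)  # assumed
        numerator = like_new * prior_new
        denominator = numerator + like_solved * (1.0 - prior_new)
        return numerator / denominator

    for d in (1.0, 3.0, 5.0):
        print(d, round(prob_new_fold(d), 3))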

Relevance:

20.00%

Publisher:

Abstract:

ATP-binding cassette (ABC) transporters bind and hydrolyze ATP. In the cystic fibrosis transmembrane conductance regulator Cl− channel, this interaction with ATP generates a gating cycle between a closed (C) and two open (O1 and O2) conformations. To better understand how ATP controls channel activity, we examined gating transitions from the C to the O1 and O2 states and from these open states to the C conformation. We made three main observations. First, we found that the channel can open into either the O1 or O2 state, that the frequency of transitions to both states increased with ATP concentration, and that ATP increased the relative proportion of openings into O1 vs. O2. These results indicate that ATP can interact with the closed state to open the channel in at least two ways, which may involve binding to nucleotide-binding domains (NBDs) NBD1 and NBD2. Second, ATP prolonged the burst duration and altered the way in which the channel closed. These data suggest that ATP also interacts with the open channel. Third, the channel showed runs of specific types of open–closed transitions. This finding suggests a mechanism with more than one cycle of gating transitions. These data suggest models to explain how ATP influences conformational transitions in cystic fibrosis transmembrane conductance regulator and perhaps other ABC transporters.
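
One way to make such a gating scheme concrete (an illustrative three-state kinetic sketch, not the authors' fitted model) is a Gillespie-style simulation of C ⇌ O1 and C ⇌ O2 transitions in which the opening rates grow with ATP concentration; all rate constants and their ATP dependence below are assumptions for illustration.

    import random

    def simulate_gating(atp_mM: float, t_max: float = 10.0, seed: int = 1):
        """Stochastic simulation of a closed state C and two open states O1, O2.

        Opening into O1 is assumed to need ATP at both NBDs (rate ~ [ATP]^2),
        opening into O2 scales linearly with [ATP]; closing rates are constant.
        Returns the sequence of visited states.
        """
        random.seed(seed)
        rates = {
            ("C", "O1"): 2.0 * atp_mM ** 2,   # per second, assumed
            ("C", "O2"): 0.5 * atp_mM,        # per second, assumed
            ("O1", "C"): 3.0,
            ("O2", "C"): 1.0,
        }
        state, t, trace = "C", 0.0, ["C"]
        while t < t_max:
            choices = [(dst, k) for (src, dst), k in rates.items() if src == state]
            total = sum(k for _, k in choices)
            t += random.expovariate(total)          # waiting time to next transition
            r, acc = random.uniform(0, total), 0.0
            for dst, k in choices:
                acc += k
                if r <= acc:
                    state = dst
                    break
            trace.append(state)
        return trace

    trace = simulate_gating(atp_mM=1.0)
    openings_o1 = sum(1 for a, b in zip(trace, trace[1:]) if a == "C" and b == "O1")
    openings_o2 = sum(1 for a, b in zip(trace, trace[1:]) if a == "C" and b == "O2")
    print(trace[:12], "openings into O1:", openings_o1, "into O2:", openings_o2)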

Relevance:

20.00%

Publisher:

Abstract:

We have investigated the efficiency of packing by calculating intramolecular packing density above and below peptide planes of internal beta-pleated sheet residues in five globular proteins. The orientation of interest was chosen to allow study of regions that are approximately perpendicular to the faces of beta-pleated sheets. In these locations, nonbonded van der Waals packing interactions predominate over hydrogen bonding and solvent interactions. We observed considerable variability in packing densities within these regions, confirming that the interior packing of a protein does not result in uniform occupation of the available space. Patterns of fluctuation in packing density suggest that the regular backbone-to-backbone network of hydrogen bonds is not likely to be interrupted to maximize van der Waals interactions. However, high-density packing tends to occur toward the ends of beta-structure strands where hydrogen bonds are more likely to involve nonpolar side-chain groups or solvent molecules. These features result in internal protein folding with a central low-density core surrounded by a higher-density subsurface shell, consistent with our previous calculations regarding overall protein packing density.
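
A generic sketch of the kind of calculation described (not the authors' exact protocol): count the atoms inside a probe sphere centered a fixed distance above or below a peptide plane, and divide by the sphere volume. The toy coordinates, probe radius, and offset below are illustrative; in practice the coordinates would come from a PDB structure.

    import math

    def local_packing_density(atoms, center, probe_radius=4.0):
        """Number of atoms per cubic angstrom inside a probe sphere.

        atoms  -- list of (x, y, z) coordinates in angstroms
        center -- probe center, e.g. a point offset along the peptide-plane normal
        """
        r2 = probe_radius ** 2
        inside = sum(1 for (x, y, z) in atoms
                     if (x - center[0]) ** 2 + (y - center[1]) ** 2
                        + (z - center[2]) ** 2 <= r2)
        volume = 4.0 / 3.0 * math.pi * probe_radius ** 3
        return inside / volume

    toy_atoms = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 2.0, 1.0), (6.0, 6.0, 6.0)]
    above_plane = (0.0, 0.0, 2.5)   # e.g. 2.5 angstroms along the plane normal
    print(local_packing_density(toy_atoms, above_plane))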

Relevance:

20.00%

Publisher:

Abstract:

The reconstruction of multitaxon trees from molecular sequences is confounded by the variety of algorithms and criteria used to evaluate trees, making it difficult to compare the results of different analyses. A global method of multitaxon phylogenetic reconstruction described here, Bootstrappers Gambit, can be used with any four-taxon algorithm, including distance, maximum likelihood, and parsimony methods. It incorporates a Bayesian-Jeffreys'-bootstrap analysis to provide a uniform probability-based criterion for comparing the results from diverse algorithms. To examine the usefulness of the method, the origin of the eukaryotes has been investigated by the analysis of ribosomal small subunit RNA sequences. Three common algorithms (paralinear distances, Jukes-Cantor distances, and Kimura distances) support the eocyte topology, whereas one (maximum parsimony) supports the archaebacterial topology, suggesting that the eocyte prokaryotes are the closest prokaryotic relatives of the eukaryotes.
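
For concreteness, one of the four-taxon ingredients mentioned above is the Jukes-Cantor distance, which corrects the observed fraction p of mismatched sites between two aligned sequences as d = -(3/4) ln(1 - 4p/3). The small sketch below shows only this distance; the bootstrap and quartet-assembly machinery of Bootstrappers Gambit is not reproduced here.

    import math

    def jukes_cantor_distance(seq_a: str, seq_b: str) -> float:
        """Jukes-Cantor corrected distance between two aligned nucleotide sequences."""
        if len(seq_a) != len(seq_b):
            raise ValueError("sequences must be aligned to the same length")
        mismatches = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
        p = mismatches / len(seq_a)
        if p >= 0.75:
            return float("inf")   # correction undefined: sequences look saturated
        return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

    print(jukes_cantor_distance("ACGTACGTACGT", "ACGTACGAACGA"))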

Relevance:

20.00%

Publisher:

Abstract:

We have studied enhancer function in transient and stable expression assays in mammalian cells by using systems that distinguish expressing from nonexpressing cells. When expression is studied in this way, enhancers are found to increase the probability of a construct being active but not the level of expression per template. In stably integrated constructs, large differences in expression level are observed but these are not related to the presence of an enhancer. Together with earlier studies, these results suggest that enhancers act to affect a binary (on/off) switch in transcriptional activity. Although this idea challenges the widely accepted model of enhancer activity, it is consistent with much, if not all, experimental evidence on this subject. We hypothesize that enhancers act to increase the probability of forming a stably active template. When randomly integrated into the genome, enhancers may affect a metastable state of repression/activity, permitting expression in regions that would not permit activity of an isolated promoter.

Relevance:

20.00%

Publisher:

Abstract:

Some islands in the Gulf of California support very high densities of spiders. Spider density is negatively correlated with island size; many small islands support 50-200 spiders per m3 of cactus. Energy for these spiders comes primarily from the ocean and not from in situ productivity by land plants. We explicitly connect the marine and terrestrial systems to show that insular food webs represent one endpoint of the marine web. We describe two conduits for marine energy entering these islands: shore drift and seabird colonies. Both conduits are related to island area, having a much stronger effect on smaller islands. This asymmetric effect helps to explain the exceptionally high spider densities on small islands. Although productivity sets the maximal potential densities, predation (by scorpions) limits realized spider abundance. Thus, prey availability and predation act in concert to set insular spider abundance.

Relevance:

20.00%

Publisher:

Abstract:

It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure on the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
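
A small sketch of the sampling procedure described above: draw the vectors u_i and v_j uniformly (Haar measure) on the unit sphere of R^m by normalizing standard Gaussian vectors, and form the correlation matrix γ_ij = ⟨u_i, v_j⟩. Deciding whether a given γ is local or nonlocal (for example via Bell inequality violations) is a separate, much harder step and is not attempted here.

    import numpy as np

    def sample_correlation_matrix(n: int, m: int, rng=None) -> np.ndarray:
        """Sample gamma_ij = <u_i, v_j> with u_i, v_j Haar-uniform on the unit
        sphere of R^m (obtained by normalizing i.i.d. Gaussian vectors)."""
        rng = np.random.default_rng() if rng is None else rng
        u = rng.standard_normal((n, m))
        v = rng.standard_normal((n, m))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        return u @ v.T

    gamma = sample_correlation_matrix(n=8, m=4, rng=np.random.default_rng(0))
    print(gamma.shape, float(np.max(np.abs(gamma))))   # all entries lie in [-1, 1]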