917 results for Probabilistic constraints


Relevance: 20.00%

Abstract:

In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known to be NP-complete and, therefore, most contributions on the topic focus on developing heuristics that can obtain good solutions for the problem in short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
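The abstract does not give the heuristic's details, but a minimal sketch of a TOC-style constructive procedure is shown below, assuming the classic rule of ranking products by contribution margin per minute on the bottleneck resource and filling capacity greedily; the product data and capacity figures are hypothetical.

```python
# Sketch: TOC-style constructive heuristic for the product-mix problem.
# Products are ranked by contribution margin per minute on the bottleneck,
# then capacity is filled greedily.

def product_mix(products, capacity):
    """products: list of dicts with margin, bottleneck_time, demand."""
    # Rank by margin earned per minute of bottleneck time (descending).
    ranked = sorted(products,
                    key=lambda p: p["margin"] / p["bottleneck_time"],
                    reverse=True)
    mix, remaining = {}, capacity
    for p in ranked:
        # Produce as many units as demand and remaining capacity allow.
        units = min(p["demand"], int(remaining // p["bottleneck_time"]))
        mix[p["name"]] = units
        remaining -= units * p["bottleneck_time"]
    return mix, remaining

products = [
    {"name": "P", "margin": 45.0, "bottleneck_time": 15.0, "demand": 100},
    {"name": "Q", "margin": 60.0, "bottleneck_time": 30.0, "demand": 50},
]
print(product_mix(products, capacity=2400.0))
```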

Relevance: 20.00%

Abstract:

Excesses in the positron and electron fluxes measured by ATIC and by the PAMELA and Fermi-LAT telescopes can be explained by dark matter annihilation in the Galaxy; however, this requires large boosts of the dark matter annihilation rate. There are many possible enhancement mechanisms, such as the Sommerfeld effect or the existence of dark matter clumps in our halo. If enhancements of the dark matter annihilation cross section are taking place, dark matter annihilation in the core of the Earth will be enhanced. Here we use recent results from the IceCube 40-string configuration to probe generic enhancement scenarios. We present results for the dark matter-proton interaction cross section $\sigma_{\chi p}$, weighted by the branching fraction into neutrinos $f_{\nu\bar{\nu}}$, as a function of a generic boost factor $B_F$, which parametrizes the expected enhancement of the annihilation rate. We find that dark matter models that require annihilation enhancements of $O(100)$ or more and that annihilate significantly into neutrinos are excluded as an explanation for these excesses. We also determine the boost range that can be probed by the full IceCube telescope.

Relevance: 20.00%

Abstract:

We report the first tungsten isotopic measurements in stardust silicon carbide (SiC) grains recovered from the Murchison carbonaceous chondrite. The isotopes (182,183,184,186)W and (179,180)Hf were measured on both an aggregate (KJB fraction) and single stardust SiC grains (LS+LU fraction) believed to have condensed in the outflows of low-mass carbon-rich asymptotic giant branch (AGB) stars with close-to-solar metallicity. The SiC aggregate shows small deviations from terrestrial (= solar) composition in the (182)W/(184)W and (183)W/(184)W ratios, with deficits in (182)W and (183)W with respect to (184)W. The (186)W/(184)W ratio, however, shows no apparent deviation from the solar value. Tungsten isotopic measurements in single mainstream stardust SiC grains revealed lower than solar (182)W/(184)W, (183)W/(184)W, and (186)W/(184)W ratios. We have compared the SiC data with theoretical predictions of the evolution of W isotopic ratios in the envelopes of AGB stars. These ratios are affected by the slow neutron-capture process and match the SiC data regarding their (182)W/(184)W, (183)W/(184)W, and (179)Hf/(180)Hf isotopic compositions, although a small adjustment in the s-process production of (183)W is needed for better agreement between the SiC data and model predictions. The models cannot explain the (186)W/(184)W ratios observed in the SiC grains, even when the current (185)W neutron-capture cross section is increased by a factor of two. Further study is required to better assess how model uncertainties (e.g., the formation of the (13)C neutron source, the mass-loss law, the modeling of the third dredge-up, and the efficiency of the (22)Ne neutron source) may affect current s-process predictions.

Relevance: 20.00%

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs, but at the expense of increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but will depend on the actual structural configuration.
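As a concrete illustration of the risk-optimization idea, the sketch below minimizes total expected cost (manufacturing cost plus failure probability times failure cost) for a toy design variable under a random load; the limit state, distributions, and cost figures are hypothetical, not the paper's examples.

```python
# Sketch: risk optimization (RO) of a toy design variable d.
# Total expected cost = manufacturing cost + P_f(d) * cost of failure.
import numpy as np
from scipy.stats import norm

C_FAIL = 1.0e5   # hypothetical monetary consequence of failure
C_UNIT = 10.0    # hypothetical manufacturing cost per unit of d

def failure_probability(d, mu_s=50.0, sigma_s=10.0):
    # Limit state g = R - S with deterministic resistance R = d and
    # normally distributed load S ~ N(mu_s, sigma_s): P_f = Phi(-beta).
    beta = (d - mu_s) / sigma_s
    return norm.cdf(-beta)

def total_expected_cost(d):
    return C_UNIT * d + C_FAIL * failure_probability(d)

# Scan a range of designs and pick the one balancing economy and safety.
grid = np.linspace(50.0, 120.0, 701)
costs = [total_expected_cost(d) for d in grid]
d_opt = grid[int(np.argmin(costs))]
print(f"optimum d = {d_opt:.1f}, P_f = {failure_probability(d_opt):.2e}")
```

A larger failure cost C_FAIL pushes the optimum toward safer (more expensive) designs, which is exactly the economy-versus-safety balance RO formalizes.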

Relevance: 20.00%

Abstract:

This paper addresses the numerical solution of random crack propagation problems by coupling the boundary element method (BEM) with reliability algorithms. The crack propagation phenomenon is efficiently modelled using BEM, due to its mesh reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in the search for the design point. Different experiment designs and adaptive schemes are considered. An alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of both coupling methods is compared in application to some crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of the problem's nonlinearity. The computational cost of direct coupling was shown to be a fraction of the cost of the response surface solutions, regardless of the experiment design or adaptive scheme considered.
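The paper's solvers are not reproduced here, but a minimal sketch of the direct-coupling idea, assuming a FORM search (the HLRF iteration) with finite-difference gradients of an implicit limit state function standing in for the BEM response, is shown below.

```python
# Sketch: FORM (HLRF iteration) directly coupled to an implicit limit state.
# In the paper the mechanical response comes from a BEM crack model; here a
# cheap analytic stand-in g(u) is used, with gradients by finite differences.
import numpy as np

def g(u):
    # Hypothetical implicit limit state in standard normal space.
    return 3.0 - u[0] - 0.4 * u[1] ** 2

def grad_fd(f, u, h=1e-6):
    # Finite-difference gradient: mimics differentiating the numerical
    # mechanical response directly, without a response surface.
    base = f(u)
    return np.array([(f(u + h * e) - base) / h for e in np.eye(len(u))])

def form_hlrf(u0, tol=1e-8, max_iter=50):
    u = np.asarray(u0, float)
    for _ in range(max_iter):
        gu, grad = g(u), grad_fd(g, u)
        u_new = (grad @ u - gu) / (grad @ grad) * grad  # HLRF update
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)  # reliability index at the design point
    return beta, u

beta, u_star = form_hlrf([0.0, 0.0])
print(f"beta = {beta:.3f}, design point = {u_star}")
```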

Relevance: 20.00%

Abstract:

A variety of seemingly unrelated processes, such as core-mantle interaction, desulfurization, and direct precipitation from a silicate melt, have been proposed to explain the formation of Ru-Os-Ir alloys (here referred to as osmiridiums) found in terrestrial mantle rocks. However, no consensus has yet been reached on how these important micrometer-sized phases form. In this paper we report the results of an experimental study on the solubilities of Ru, Os and Ir in sulfide melts (or mattes) as a function of alloy composition at 1300 °C. Considering the low solubilities of Ru, Os, and Ir in silicate melts, coupled with their high matte/silicate-melt partition coefficients, our results indicate that these elements initially concentrate at the ppm level in a matte phase in the mantle source region. During partial melting, the extraction of sulfur into the silicate melt leads to a decrease in $f_{\mathrm{S}_2}$ that triggers the exsolution of osmiridiums from the refractory matte in the residue. The newly formed osmiridiums may persist in the terrestrial mantle for billions of years.

Relevance: 20.00%

Abstract:

The Dom Feliciano Belt, situated in southernmost Brazil and Uruguay, contains a large mass of granite-gneissic rocks (also known as the Florianopolis/Pelotas Batholith) formed during the pre-, syn- and post-orogenic phases of the Brasiliano/Pan-African cycle. At the NE extreme of this granitic mass, pre-, syn- and post-tectonic granites associated with the Major Gercino Shear Zone (MGSZ) are exposed. The granitic manifestation along the MGSZ can be divided into pre-kinematic tonalitic gneisses, peraluminous high-K calc-alkaline early-kinematic granites, and shoshonitic, metaluminous post-kinematic granites. U-Pb zircon data suggest an age of 649 ± 10 Ma for the pre-tectonic gneisses, and a time span from 623 ± 6 Ma to 588 ± 3 Ma for the early- to post-tectonic magmatism. Negative $\varepsilon_{\mathrm{Hf}}(t)$ values ranging from -4.6 to -14.6 and Hf model ages ranging from 1.64 to 2.39 Ga for magmatic zircons, coupled with whole-rock Nd model ages ranging from 1.24 to 2.05 Ga and $\varepsilon_{\mathrm{Nd}}(t)$ values ranging from -3.84 to -7.50, point to a crustal derivation for the granitic magmatism. The geochemical and isotope data support a continental magmatic arc generated from melting of dominantly Paleoproterozoic crust, and a similar evolution for the granitic batholiths of the eastern Dom Feliciano Belt and western Kaoko Belt.

Relevance: 20.00%

Abstract:

Fraud is a global problem that has demanded increasing attention due to the rapid expansion of modern technology and communication. When statistical techniques are used to detect fraud, whether a fraud detection model is accurate enough to classify a case correctly as fraudulent or legitimate is a critical factor. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by fitting models to several bootstrap replicates of the dataset, obtaining their predicted values, and then combining them into a single predictive classification in order to improve classification accuracy. In this paper we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Via a large simulation study and various real datasets, we found that probabilistic networks are a strong modeling option, with high predictive capacity and a substantial gain from the bagging procedure when compared to traditional techniques.
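As an illustration of the bagging idea described above, the sketch below aggregates decision trees fitted to bootstrap replicates by majority vote; the base learner and dataset are placeholders, not the paper's k-dependence probabilistic networks.

```python
# Sketch: bootstrap aggregating (bagging) by majority vote.
# Each classifier is fitted to a bootstrap replicate of the training data;
# predictions are combined into a single classification.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

def bagging_fit(X, y, n_models=25):
    models = []
    n = len(y)
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)  # bootstrap replicate (with replacement)
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Majority vote over the ensemble's predictions (binary labels 0/1).
    votes = np.stack([m.predict(X) for m in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)

models = bagging_fit(X, y)
print("training accuracy:", (bagging_predict(models, X) == y).mean())
```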

Relevance: 20.00%

Abstract:

Background: The structure of regulatory networks remains an open question in our understanding of complex biological systems. Interactions during complete viral life cycles present unique opportunities to understand how host-parasite networks take shape and behave. The Anticarsia gemmatalis multiple nucleopolyhedrovirus (AgMNPV) is a large double-stranded DNA virus, whose genome may encode 152 open reading frames (ORFs). Here we present an analysis of the ordered cascade of AgMNPV gene expression.

Results: We observed an earlier onset of expression than previously reported for other baculoviruses, especially for genes involved in DNA replication. Most ORFs were expressed at higher levels in a more permissive host cell line. Genes with more than one copy in the genome had distinct expression profiles, which could indicate the acquisition of new functionalities. The transcriptional gene regulatory network (GRN) for 149 ORFs had a modular topology comprising five communities of highly interconnected nodes that separated functionally related key genes into different communities, possibly maximizing redundancy and GRN robustness by compartmentalization of important functions. Core conserved functions showed expression synchronicity, distinct GRN features, and significantly less genetic diversity, consistent with the evolutionary constraints imposed on key elements of biological systems. This reduced genetic diversity also had a positive correlation with the importance of the gene in our estimated GRN, supporting a relationship between phylogenetic data of baculovirus genes and network features inferred from expression data. We also observed that gene arrangement in overlapping transcripts was conserved among related baculoviruses, suggesting a principle of genome organization.

Conclusions: Albeit with a reduced number of nodes (149), the AgMNPV GRN had a topology and key characteristics similar to those observed in complex cellular organisms, which indicates that modularity may be a general feature of biological gene regulatory networks.
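The paper's network inference pipeline is not specified in the abstract; as a loose illustration of detecting communities of highly interconnected nodes in a GRN, the sketch below runs greedy modularity maximization on a random graph standing in for the 149-node network.

```python
# Sketch: finding communities of highly interconnected nodes in a network.
# A random graph stands in for the inferred 149-node AgMNPV GRN.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.gnm_random_graph(149, 600, seed=0)  # placeholder for the real GRN
communities = greedy_modularity_communities(G)
print(f"{len(communities)} communities found")
for i, c in enumerate(communities[:5]):
    print(f"community {i}: {len(c)} nodes")

# Modularity quantifies how much denser within-community edges are than
# expected at random; higher values indicate stronger modular structure.
print("modularity:", nx.algorithms.community.modularity(G, communities))
```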

Relevance: 20.00%

Abstract:

Structural durability is an important criterion that must be evaluated for every type of structure. Concerning reinforced concrete members, the chloride diffusion process is widely used to evaluate durability, especially when these structures are constructed in aggressive atmospheres. Chloride ingress triggers the corrosion of reinforcements; therefore, by modelling this phenomenon, the corrosion process, and hence structural durability, can be better evaluated. Corrosion begins when a threshold level of chloride concentration is reached at the steel bars of the reinforcements. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately due to the inherent randomness observed in this process. In this regard, structural durability can be represented more realistically using probabilistic approaches. This paper addresses the analysis of the probabilistic corrosion initiation time in reinforced concrete structures exposed to chloride penetration. Chloride penetration is modelled using Fick's diffusion law, which simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for the concrete cover.
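To make the probabilistic formulation concrete, the sketch below uses the classical error-function solution of Fick's second law, $C(x,t) = C_s\,[1 - \mathrm{erf}(x / 2\sqrt{D t})]$, and Monte Carlo sampling to estimate the probability that the chloride threshold is exceeded at the rebar depth; all distribution parameters are hypothetical, not the paper's data.

```python
# Sketch: Monte Carlo estimate of the corrosion-initiation probability.
# Chloride ingress follows the erf solution of Fick's second law:
#   C(x, t) = C_s * (1 - erf(x / (2 * sqrt(D * t))))
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(1)
N = 100_000
t = 50.0  # years of exposure

# Hypothetical random variables (all parameters are made up):
cover = rng.normal(50.0, 8.0, N)          # concrete cover, mm
D = rng.lognormal(np.log(20.0), 0.4, N)   # diffusion coefficient, mm^2/year
C_s = rng.lognormal(np.log(3.5), 0.3, N)  # surface chloride, kg/m^3
C_th = rng.normal(0.9, 0.15, N)           # corrosion threshold, kg/m^3

# Chloride concentration at the rebar after t years:
C_rebar = C_s * (1.0 - erf(cover / (2.0 * np.sqrt(D * t))))

# Failure event: threshold exceeded -> corrosion initiation.
p_f = np.mean(C_rebar >= C_th)
print(f"P(corrosion initiated within {t:.0f} years) ~ {p_f:.4f}")
```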

Relevance: 20.00%

Abstract:

Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in time linear in the number of nodes if there is a single observed node. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.
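The linear-time algorithm itself is not given in the abstract; as a rough illustration of why single-evidence inference in a polytree can be linear in the number of nodes, the sketch below runs a single forward pass over a chain-shaped Bayesian network (the purely numeric special case of an SQPN, and a special case of a polytree) with one observed node upstream of the query; the CPTs are made up.

```python
# Sketch: linear-time inference on a chain-shaped (hence polytree) Bayesian
# network X1 -> X2 -> ... -> Xn with binary variables and one observed node.
import numpy as np

n = 6
prior = np.array([0.7, 0.3])   # hypothetical P(X1)
T = np.array([[0.9, 0.1],      # hypothetical P(X_{k+1} | X_k)
              [0.2, 0.8]])

def posterior_last(evidence_index, evidence_value):
    """P(Xn | X_e = v), evidence upstream of Xn, in a single O(n) pass."""
    belief = prior.copy()
    for k in range(1, n):
        if k - 1 == evidence_index:
            # Condition on the observed node, then renormalize.
            mask = np.zeros(2)
            mask[evidence_value] = 1.0
            belief = belief * mask
            belief /= belief.sum()
        belief = belief @ T  # propagate one step forward along the chain
    return belief

print(posterior_last(evidence_index=2, evidence_value=1))
```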

Relevance: 20.00%

Abstract:

Due to the growing interest in social networks, link prediction has received significant attention. Link prediction is mostly based on graph-based features, with some recent approaches focusing on domain semantics. We propose algorithms for link prediction that use a probabilistic ontology to enhance the analysis of the domain and to handle the unavoidable uncertainty in the task (the ontology is specified in the probabilistic description logic crALC). The scalability of the approach is investigated through a combination of semantic assumptions and graph-based features. We empirically evaluate our proposal and compare it with standard solutions in the literature.
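The crALC-based component is not detailed in the abstract, but the graph-based feature side of link prediction can be illustrated briefly: the sketch below scores candidate links with the standard Adamic-Adar index on a toy graph, one of the baselines such proposals are typically compared against.

```python
# Sketch: graph-based link prediction with the Adamic-Adar index.
# Non-adjacent node pairs sharing many (low-degree) neighbours score high.
import networkx as nx

G = nx.karate_club_graph()  # toy stand-in for a social network

# Score all non-edges and list the most likely future links.
scores = sorted(nx.adamic_adar_index(G),
                key=lambda triple: triple[2], reverse=True)
for u, v, score in scores[:5]:
    print(f"predicted link ({u}, {v}) with score {score:.3f}")
```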

Relevance: 20.00%

Abstract:

A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above $10^{18}$ eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination, and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above $10^{18}$ eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.
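As a simplified illustration of how a dipole moment is estimated from arrival directions, the sketch below performs the classical first-harmonic (Rayleigh) analysis in right ascension on simulated isotropic events; the actual Auger analysis also uses the declination and corrects for non-uniform exposure.

```python
# Sketch: first-harmonic (Rayleigh) analysis of right ascensions.
# Isotropic data give an amplitude r ~ sqrt(pi / N) on average.
import numpy as np

rng = np.random.default_rng(2)
alpha = rng.uniform(0.0, 2.0 * np.pi, 10_000)  # simulated isotropic RAs

a = 2.0 * np.mean(np.cos(alpha))  # first-harmonic cosine coefficient
b = 2.0 * np.mean(np.sin(alpha))  # first-harmonic sine coefficient
r = np.hypot(a, b)                # first-harmonic amplitude
phi = np.arctan2(b, a)            # phase (radians)

# Chance probability of an amplitude >= r under isotropy (Rayleigh test):
N = len(alpha)
p_chance = np.exp(-N * r**2 / 4.0)
print(f"r = {r:.4f}, phase = {np.degrees(phi):.1f} deg, P = {p_chance:.3f}")
```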

Relevance: 20.00%

Abstract:

The aim of this work is to propose a model for computing the optical flow in a sequence of images. We introduce a new temporal regularizer that is suitable for large displacements, and we propose to decouple the spatial and temporal regularizations to avoid an incongruous formulation. For the spatial regularization we use the Nagel-Enkelmann operator; the temporal regularization is newly designed. Our model is based on an energy functional that yields a partial differential equation (PDE). This PDE is embedded into a multi-pyramidal strategy to recover large displacements, and a gradient descent technique is applied at each scale to reach the minimum.
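The full model is not reproducible from the abstract alone; as a minimal illustration of minimizing an optical-flow energy functional by iterative descent, the sketch below implements the classical Horn-Schunck iteration (a quadratic spatial regularizer rather than the Nagel-Enkelmann operator, with no temporal term or pyramid).

```python
# Sketch: iterative minimization of an optical-flow energy functional.
# Classical Horn-Schunck scheme on a synthetic pair of shifted frames.
import numpy as np

def horn_schunck(I1, I2, alpha=10.0, n_iter=200):
    # Image derivatives (simple finite differences).
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    neighbour_mean = lambda f: (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                                + np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
    for _ in range(n_iter):
        u_bar, v_bar = neighbour_mean(u), neighbour_mean(v)
        # Update derived from the Euler-Lagrange equations of the energy.
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v

# Synthetic test: smooth frame shifted by one pixel to the right.
y, x = np.mgrid[0:64, 0:64].astype(float)
I1 = np.sin(x / 4.0) + np.cos(y / 5.0)
I2 = np.roll(I1, 1, axis=1)
u, v = horn_schunck(I1, I2)
print("mean horizontal flow:", u.mean())  # expect roughly +1 for this shift
```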