932 results for Random-set theory
Abstract:
This paper addresses the time-variant reliability analysis of structures with random resistance or random system parameters. It deals with the problem of a random load process crossing a random barrier level. The implications of approximating the arrival rate of the first overload by an ensemble-crossing rate are studied. The error involved in this so-called "ensemble-crossing rate" approximation is described in terms of load process and barrier distribution parameters, and in terms of the number of load cycles. Existing results are reviewed, and significant improvements involving load process bandwidth, mean-crossing frequency and time are presented. The paper shows that the ensemble-crossing rate approximation can be accurate enough for problems where the load process variance is large in comparison to the barrier variance, and especially when the number of load cycles is small. This includes important practical applications like random vibration due to impact and earthquake loading. Two application examples are presented, one involving earthquake loading and one involving a frame structure subject to wind and snow loadings. (C) 2007 Elsevier Ltd. All rights reserved.
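The approximation discussed in this abstract can be illustrated with a toy Monte Carlo computation. The sketch below is a minimal illustration, not the paper's method: it assumes a stationary Gaussian load process whose upcrossing rate follows Rice's formula, a Gaussian barrier, and illustrative parameter values, and compares the first-passage probability obtained by conditioning on the barrier with the ensemble-crossing rate approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not taken from the paper):
nu0 = 1.0                 # mean-crossing frequency of the load process [1/s]
mu_S, sig_S = 0.0, 1.0    # load process mean and standard deviation
mu_R, sig_R = 3.0, 0.3    # random barrier mean and standard deviation
T = 10.0                  # exposure time [s]

def upcross_rate(b):
    """Rice's formula: mean rate of upcrossings of level b by a
    stationary Gaussian process with mean-crossing frequency nu0."""
    return nu0 * np.exp(-((b - mu_S) ** 2) / (2.0 * sig_S ** 2))

R = rng.normal(mu_R, sig_R, size=200_000)  # barrier realizations

# Conditional (reference) estimate: average the Poisson survival
# probability over the barrier distribution.
pf_cond = 1.0 - np.mean(np.exp(-upcross_rate(R) * T))

# Ensemble-crossing rate approximation: average the crossing rate
# first, then apply the Poisson assumption once.
nu_E = np.mean(upcross_rate(R))
pf_ens = 1.0 - np.exp(-nu_E * T)

print(f"conditional P_f = {pf_cond:.4e}")
print(f"ensemble    P_f = {pf_ens:.4e}")
```

By Jensen's inequality the ensemble estimate is never below the conditional one, and the gap closes as the barrier variance shrinks relative to the load variance or as the exposure (number of load cycles) decreases, consistent with the accuracy conditions stated in the abstract.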
Abstract:
Mass transfer across a gas-liquid interface was studied theoretically and experimentally, using the transfer of oxygen into water as the gas-liquid system. The experimental results support the conclusions of a theoretical description of the concentration field that uses random square wave approximations. The effect of diffusion on the concentration records was quantified. It is shown that the peak of the normalized rms concentration fluctuation profiles must be lower than 0.5, and that the position of the peak of the rms value is an adequate measure of the thickness of the diffusive layer. The position of the peak is the boundary between the regions more subject to molecular diffusion or to turbulent transport of dissolved mass.
Abstract:
We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error-correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. The preliminary experiments performed well, showing improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
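As a rough illustration of the insert-or-remove-one-feature idea, the sketch below wraps a greedy toggle loop around scikit-learn's LogisticRegression, used here as a stand-in exponential-family model; MaxEnt itself, the paper's feature classes, and its convergence test are not reproduced, so every name and setting is an assumption.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def adaptive_feature_loop(X, y, max_iter=20, seed=0):
    """Greedy loop that inserts or removes a single feature per
    iteration, keeping a change only if cross-validated accuracy
    improves (a sketch of the adaptive idea, not the AME algorithm)."""
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    active = {int(i) for i in rng.choice(n_feat, size=2, replace=False)}
    best = -np.inf
    for _ in range(max_iter):
        improved = False
        for j in range(n_feat):
            trial = active ^ {j}          # toggle feature j in or out
            if not trial:
                continue
            cols = sorted(trial)
            score = cross_val_score(LogisticRegression(max_iter=1000),
                                    X[:, cols], y, cv=3).mean()
            if score > best:
                best, active, improved = score, trial, True
                break                     # one change per iteration
        if not improved:
            break                         # no single toggle helps: stop
    return sorted(active), best

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
feats, score = adaptive_feature_loop(X, y)
print(f"selected features {feats}, CV accuracy {score:.3f}")
```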
Abstract:
Concrete offshore platforms are subjected to several loading combinations and thus require as general an analysis as possible. They can be designed using the concepts adopted for shell elements, but their resistance to shear forces must be verified in particular cross-sections. This work on the design of shell elements uses the three-layer shell theory. The elements are subjected to combined membrane and plate loading, totaling eight components of internal forces: three membrane forces, three moments (two out-of-plane bending moments and one in-plane, or torsion, moment), and two shear forces. The design method adopted, using the iterative process proposed by Lourenco & Figueiras (1993) based on the equilibrium equations developed by Gupta (1986), is compared with results for experimentally tested shell elements found in the literature using the program DIANA.
Abstract:
On February 6, 1994, a large debris flow developed because of intense rains in an 800-m-high mountain range called Serra do Cubatao, the local name for the Serra do Mar, located along the coast of the state of Sao Paulo, Brazil. It affected the Presidente Bernardes Refinery, owned by Petrobras, in Cubatao. The damages amounted to about US $40 million because of the muck cleaning, repairs, and 3-week interruption of operations. This prompted Petrobras to conduct studies, carried out by the authors, to develop protection works, which were built at a cost of approximately US $12 million. The paper describes the studies conducted on debris flow mechanics. A new criterion to define rainfall intensities that trigger debris flows is presented, as well as a correlation of slipped area with soil porosity and rain intensity. Also presented are (a) the actual grain size distribution of a deposited material, determined by laboratory tests and a large-scale field test, and (b) the size distribution of large boulders along the river bed. Based on theory, empirical experience, and back-analysis of the events, the main parameters, such as the front velocity, the peak discharge, and the volume of the transported sediments, were determined on a rational basis for the design of the protection works. Finally, the paper describes the set of protection works built, emphasizing their concept and function; they also included some low-cost innovative works.
Abstract:
As many countries move toward water sector reforms, practical issues of how water management institutions can better effect allocation, regulation, and enforcement of water rights have emerged. The problem of nonavailability of water to tailenders on an irrigation system in developing countries, due to unlicensed upstream diversions, is well documented. The reliability of access, or equivalently the uncertainty associated with water availability at their diversion point, becomes a parameter that is likely to influence users' applications for water licenses, as well as their willingness to pay for licensed use. The ability of a water agency to reduce this uncertainty through effective water rights enforcement is related to the fiscal ability of the agency to monitor and enforce licensed use. In this paper, this interplay between the users and the agency is explored, considering the hydraulic structure or sequence of water use and the parameters that define the users' and the agency's economics. The potential for free-rider behavior by the users, as well as their proposals for licensed use, are derived conditional on this setting. The analyses presented are developed in the framework of the theory of "Law and Economics," with user interactions modeled as a game-theoretic enterprise. The state of Ceara, Brazil, is used loosely as an example setting, with parameter values for the experiments indexed to be approximately those relevant for current decisions. The potential for using the ideas in participatory decision making is discussed. This paper is an initial attempt to develop a conceptual framework for analyzing such situations, with a focus on reservoir-canal system water rights enforcement.
Abstract:
Although the Hertz theory is not applicable in the analysis of the indentation of elastic-plastic materials, it is common practice to incorporate the concept of indenter/specimen combined modulus to account for indenter deformation. The appropriateness of using the reduced modulus to incorporate the effect of indenter deformation in the analysis of indentation with spherical indenters was assessed. The analysis, based on finite element simulations, considered four values of the ratio of the indented material elastic modulus to that of the diamond indenter, E/E(i) (0, 0.04, 0.19, 0.39), four values of the ratio of the elastic reduced modulus to the initial yield strength, E(r)/Y (0, 10, 20, 100), and two values of the ratio of the indenter radius to maximum total displacement, R/delta(max) (3, 10). Indenter deformation effects are better accounted for by the reduced modulus if the indented material behaves entirely elastically. In this case, identical load-displacement (P - delta) curves are obtained with rigid and elastic spherical indenters for the same elastic reduced modulus. Changes in the ratio E/E(i), from 0 to 0.39, resulted in variations lower than 5% in the load dimensionless functions, lower than 3% in the contact area, A(c), and lower than 5% in the ratio H/E(r). However, deformations of the elastic indenter made the actual radius of contact change, even in the indentation of elastic materials. Even though the load dimensionless functions showed only a slight increase with the ratio E/E(i), the hardening coefficient and the yield strength could be slightly overestimated when algorithms based on rigid indenters are used. For the unloading curves, the ratio delta(e)/delta(max), where delta(e) is the point corresponding to zero load of a straight line with slope S from the point (P(max), delta(max)), varied less than 5% with the ratio E/E(i). Similarly, the relationship between the reduced modulus and the unloading indentation curve, expressed by Sneddon's equation, did not reveal the necessity of a correction with the ratio E/E(i). The parameter of the indentation curve most affected by indenter deformation was the ratio between the residual indentation depth after complete unloading and the maximum indenter displacement, delta(r)/delta(max) (up to 26%), but this variation did not significantly decrease the capability to estimate hardness and elastic modulus based on the ratio of the residual indentation depth to maximum indentation depth, h(r)/h(max). In general, the results confirm the convenience of using the reduced modulus in spherical instrumented indentation tests.
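For reference, the reduced (combined) modulus discussed above follows the standard definition used in instrumented indentation. The helper below encodes it together with the Sneddon-type unloading relation, using common handbook values for a diamond indenter as defaults; the numbers are illustrative and are not the simulation inputs of the paper.

```python
import numpy as np

def reduced_modulus(E, nu, E_i=1141.0, nu_i=0.07):
    """Indenter/specimen combined (reduced) modulus:
    1/E_r = (1 - nu^2)/E + (1 - nu_i^2)/E_i.
    Defaults are typical handbook values for diamond (units: GPa)."""
    return 1.0 / ((1.0 - nu ** 2) / E + (1.0 - nu_i ** 2) / E_i)

def modulus_from_unloading(S, A_c):
    """Sneddon-type relation between the unloading stiffness S = dP/dh
    and the projected contact area A_c: E_r = (sqrt(pi)/2) * S / sqrt(A_c)."""
    return 0.5 * np.sqrt(np.pi) * S / np.sqrt(A_c)

# Example: steel (E = 210 GPa, nu = 0.3) against a diamond indenter.
print(f"E_r = {reduced_modulus(210.0, 0.3):.1f} GPa")
```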
Abstract:
We present a method to simulate Magnetic Barkhausen Noise using the Random Field Ising Model with magnetic long-range interaction. The method allows calculating the magnetic flux density behavior in particular sections of the lattice reticule. The results show an internal demagnetizing effect that proceeds from the magnetic long-range interactions. This demagnetizing effect induces the appearance of a magnetic pattern in the region of magnetic avalanches. When compared with the traditional method, the proposed numerical procedure markedly reduces the computational cost of the simulation. (c) 2008 Published by Elsevier B.V.
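A minimal zero-temperature random-field Ising sketch in the spirit of this abstract is shown below; a mean-field demagnetizing term stands in for the magnetic long-range interaction, and the lattice size, couplings, and field sweep are illustrative assumptions (the paper's section-wise flux-density calculation is not reproduced).

```python
import numpy as np

rng = np.random.default_rng(1)

L, J, sigma, k_d = 32, 1.0, 1.5, 0.1   # lattice side, coupling, disorder, demag strength
h_rand = rng.normal(0.0, sigma, size=(L, L))
s = -np.ones((L, L))                   # start fully magnetized down

def local_field(s, H):
    """Nearest-neighbour field plus random field, external field, and a
    mean-field demagnetizing term standing in for long-range coupling."""
    nn = (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
          np.roll(s, 1, 1) + np.roll(s, -1, 1))
    return J * nn + h_rand + H - k_d * s.mean()

avalanche_sizes, H = [], -6.0
while s.min() < 0:
    f = local_field(s, H)
    H += -f[s < 0].max() + 1e-9        # raise H until one spin destabilizes
    size = 0
    while True:                        # relax: one Barkhausen avalanche
        unstable = (s < 0) & (local_field(s, H) > 0)
        n = int(unstable.sum())
        if n == 0:
            break
        s[unstable] = 1.0
        size += n
    avalanche_sizes.append(size)

print(f"{len(avalanche_sizes)} avalanches, largest = {max(avalanche_sizes)}")
```

As spins flip, the growing magnetization lowers the effective field through the -k_d * s.mean() term, which is the internal demagnetizing effect the abstract describes.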
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived from the integral conservation of mechanical energy. The weak formulation of the thermal energy equation is modeled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second principle of thermodynamics. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
This paper presents concentration inequalities and laws of large numbers under weak assumptions of irrelevance that are expressed using lower and upper expectations. The results build upon De Cooman and Miranda's recent inequalities and laws of large numbers. The proofs indicate connections between the theory of martingales and concepts of epistemic and regular irrelevance. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
Gamma ray tomography experiments were carried out to detect spatial patterns in the porosity of a 0.27 m diameter column packed with steel Raschig rings of three sizes (12.6, 37.9, and 76 mm), using a first-generation CT system (Chen et al., 1998). A fast Fourier transform tomographic reconstruction algorithm was used to calculate the spatial variation over the column cross section. The cross-sectional gas porosity and solid holdup distributions were determined. The values of the cross-sectional average gas porosity were epsilon = 0.849, 0.938, and 0.966 for the 12.6, 37.9, and 76 mm rings, respectively. The radial holdup variation within the packed bed was also determined. The variation of the circumferentially averaged gas holdup in the radial direction indicates that the porosity in the column wall region is somewhat higher than that in the bulk region, due to the effect of the column wall. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Model predictive control (MPC) is usually implemented as a control strategy where the system outputs are controlled within specified zones instead of at fixed set points. One way to implement zone control is through the selection of different weights for the output error in the control cost function. A disadvantage of this approach is that closed-loop stability cannot be guaranteed, as a different linear controller may be activated at each time step. A stable zone control can instead be implemented by means of an infinite horizon cost in which the set point is an additional decision variable of the control problem. In this case, the set point is restricted to remain inside the output zone, and an appropriate output slack variable is included in the optimisation problem to assure the recursive feasibility of the control optimisation problem. Following this approach, a robust MPC is developed for the case of multi-model uncertainty of open-loop stable systems. The controller is designed to maintain the outputs within their corresponding feasible zones while reaching the desired optimal input target. Simulation of a process from the oil refining industry illustrates the performance of the proposed strategy.
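The zone-control idea summarized above, with the set point as a decision variable and a slack variable preserving feasibility, can be sketched as a small finite-horizon quadratic program. The snippet below uses cvxpy with an invented first-order model, weights, and horizon; it is a stand-in for, not a reproduction of, the paper's infinite-horizon multi-model formulation.

```python
import cvxpy as cp

# Illustrative stable SISO model x+ = a*x + b*u, y = c*x (assumed values).
a, b, c = 0.9, 0.5, 1.0
N = 20                        # prediction horizon
y_lo, y_hi = 1.0, 2.0         # output zone
u_tgt = 0.4                   # desired input target
x0 = 0.0

x = cp.Variable(N + 1)
u = cp.Variable(N)
y_sp = cp.Variable()          # the set point is itself a decision variable
delta = cp.Variable()         # output slack for recursive feasibility

cost = (cp.sum_squares(c * x[1:] - y_sp - delta)   # track the (slacked) set point
        + 0.1 * cp.sum_squares(u - u_tgt)          # pull inputs toward the target
        + 1e3 * cp.square(delta))                  # heavily penalize the slack

cons = [x[0] == x0]
cons += [x[k + 1] == a * x[k] + b * u[k] for k in range(N)]
cons += [y_sp >= y_lo, y_sp <= y_hi]               # set point stays inside the zone
cons += [u >= -1, u <= 1]

cp.Problem(cp.Minimize(cost), cons).solve()
print(f"set point {float(y_sp.value):.3f}, first move u0 = {float(u[0].value):.3f}")
```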
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit exactly to the PD-based input space. Both the input stimuli and coverage model enhancements resulted in a notable testbench efficiency increase compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
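A toy version of the parameter-domain pruning step reads as follows: declare a domain for each stimulus parameter, remove invalid or irrelevant combinations, and draw stimuli from the remaining legal space. The domain names and the constraint below are invented for illustration and do not come from the paper's tooling.

```python
import itertools
import random

# Illustrative parameter domains for a hypothetical bus-interface DUT.
domains = {
    "opcode": ["ADD", "SUB", "LOAD", "STORE"],
    "width": [8, 16, 32],
    "addr_aligned": [True, False],
}

def valid(stim):
    """Example constraint: narrow memory operations must be aligned."""
    if stim["opcode"] in ("LOAD", "STORE") and stim["width"] < 32:
        return stim["addr_aligned"]
    return True

space = [dict(zip(domains, vals))
         for vals in itertools.product(*domains.values())]
legal = [s for s in space if valid(s)]   # pruned input space

print(f"{len(legal)}/{len(space)} scenarios remain after pruning")
print("random stimulus:", random.choice(legal))
```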
Abstract:
The classical approach to acoustic imaging consists of beamforming, and produces the source distribution of interest convolved with the array point spread function. This convolution smears the image of interest, significantly reducing its effective resolution. Deconvolution methods have been proposed to enhance acoustic images and have produced significant improvements. Other proposals involve covariance fitting techniques, which avoid deconvolution altogether. However, in their traditional presentation, these enhanced reconstruction methods have very high computational costs, mostly because they have no means of efficiently transforming back and forth between a hypothetical image and the measured data. In this paper, we propose the Kronecker Array Transform (KAT), a fast separable transform for array imaging applications. Under the assumption of a separable array, it enables the acceleration of imaging techniques by several orders of magnitude with respect to the fastest previously available methods, and enables the use of state-of-the-art regularized least-squares solvers. Using the KAT, one can reconstruct images with higher resolutions than was previously possible and use more accurate reconstruction techniques, opening new and exciting possibilities for acoustic imaging.
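The speed-up claimed in this abstract rests on the separability assumption, under which the imaging operator factors as a Kronecker product that never needs to be formed explicitly. The NumPy sketch below demonstrates the underlying identity (B kron A) vec(X) = vec(A X B^T) with illustrative sizes; it shows the mechanism, not the KAT itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a separable operator mapping an nx-by-ny image
# to mx*my measurements.
mx, nx, my, ny = 8, 32, 8, 32
Ax = rng.standard_normal((mx, nx)) + 1j * rng.standard_normal((mx, nx))
Ay = rng.standard_normal((my, ny)) + 1j * rng.standard_normal((my, ny))
X = rng.standard_normal((nx, ny))

# Direct product with the explicit Kronecker matrix: O(mx*my*nx*ny).
slow = np.kron(Ay, Ax) @ X.reshape(-1, order="F")

# Separable evaluation: two small matrix products, same result.
fast = (Ax @ X @ Ay.T).reshape(-1, order="F")

print(np.allclose(slow, fast))   # True
```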