980 results for reliability theory


Relevance:

30.00%

Publisher:

Abstract:

The problem of time-variant reliability analysis of existing structures subjected to stationary random dynamic excitations is considered. The study assumes that samples of the dynamic response of the structure, under the action of external excitations, have been measured at a set of sparse points on the structure. The utilization of these measurements in updating reliability models, postulated prior to making any measurements, is considered. This is achieved by using dynamic state estimation methods which combine results from Markov process theory and Bayes' theorem. The uncertainties present in the measurements, as well as in the postulated model for the structural behaviour, are accounted for. The samples of external excitations are taken to emanate from known stochastic models, and allowance is made for the ability (or lack of it) to measure the applied excitations. The future reliability of the structure is modelled using the expected structural response conditioned on all the measurements made. This expected response is shown to have a time-varying mean and a random component that can be treated as weakly stationary. For linear systems, an approximate analytical solution to the problem of reliability model updating is obtained by combining the theories of the discrete Kalman filter and level crossing statistics. For nonlinear systems, the problem is tackled by combining particle filtering strategies with data-based extreme value analysis. In all these studies, the governing stochastic differential equations are discretized using strong forms of Ito-Taylor discretization schemes. The possibility of using conditional simulation strategies, when the applied external actions are measured, is also considered. The proposed procedures are exemplified by considering the reliability analysis of a few low-dimensional dynamical systems based on synthetically generated measurement data. The performance of the procedures developed is also assessed based on a limited number of pertinent Monte Carlo simulations. (C) 2010 Elsevier Ltd. All rights reserved.
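
For the linear case, the abstract combines a discrete Kalman filter with level-crossing statistics. The Python sketch below shows the two ingredients in their textbook form: one Kalman predict/update cycle, and Rice's mean up-crossing rate for a stationary Gaussian response together with the Poisson approximation of first-passage reliability. Matrices, the barrier level and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def kalman_step(x, P, z, A, Q, H, R):
    """One predict/update cycle of a discrete Kalman filter."""
    x_pred = A @ x                       # predicted state
    P_pred = A @ P @ A.T + Q             # predicted state covariance
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

def first_passage_reliability(sigma_x, sigma_xdot, barrier, duration):
    """Rice's mean up-crossing rate of a level `barrier` for a zero-mean
    stationary Gaussian response, then the Poisson approximation of the
    probability of no crossing over `duration`."""
    nu = (sigma_xdot / (2 * np.pi * sigma_x)) * np.exp(-barrier**2 / (2 * sigma_x**2))
    return np.exp(-nu * duration)

print(first_passage_reliability(sigma_x=1.0, sigma_xdot=5.0, barrier=3.0, duration=10.0))
```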

Relevance:

30.00%

Publisher:

Abstract:

In this paper the use of probability theory in reliability-based optimum design of reinforced gravity retaining walls is described. The formulation for computing the system reliability index is presented. A parametric study is conducted using the advanced first-order second-moment method (AFOSM) developed by Hasofer-Lind and Rackwitz-Fiessler (HL-RF) to assess the effect of uncertainties in design parameters on the probability of failure of a reinforced gravity retaining wall. In total, eight failure modes are considered, viz. overturning, sliding, eccentricity, bearing capacity failure, and shear and moment failure in the toe slab and heel slab. The analysis is performed by treating the backfill soil properties, foundation soil properties, geometric properties of the wall, reinforcement properties and concrete properties as random variables. These results are used to investigate optimum wall proportions for different coefficients of variation of φ (5% and 10%) and a target system reliability index (βt) in the range of 3.0 to 3.2.
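
The HL-RF iteration named above has a standard compact form: in standard normal space, the design point is sought iteratively and the reliability index β is its distance from the origin. A minimal sketch follows, using a hypothetical linear limit state standing in for one failure mode (e.g. sliding); the coefficients are illustrative only.

```python
import numpy as np
from scipy.stats import norm

def hlrf(g, grad_g, n_dim, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space.
    g and grad_g take a standard-normal vector u; returns (beta, u*)."""
    u = np.zeros(n_dim)
    for _ in range(max_iter):
        grad = grad_g(u)
        u_next = grad * (grad @ u - g(u)) / (grad @ grad)
        if np.linalg.norm(u_next - u) < tol:
            u = u_next
            break
        u = u_next
    return np.linalg.norm(u), u

# Hypothetical limit state, already transformed to standard normal variables
g = lambda u: 3.0 + 0.8 * u[0] - 1.2 * u[1]
grad_g = lambda u: np.array([0.8, -1.2])

beta, u_star = hlrf(g, grad_g, n_dim=2)
print(beta, norm.cdf(-beta))  # reliability index and failure probability
```

For a system with eight failure modes as in the paper, per-mode indices of this kind would be combined through system reliability bounds rather than used individually.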

Relevance:

30.00%

Publisher:

Abstract:

In this work, the development of a probabilistic approach to robust control is motivated by structural control applications in civil engineering. Often in civil structural applications, a system's performance is specified in terms of its reliability. In addition, the model and input uncertainty for the system may be described most appropriately using probabilistic or "soft" bounds on the model and input sets. The probabilistic robust control methodology contrasts with existing H∞/μ robust control methodologies that do not use probability information for the model and input uncertainty sets, yielding only the guaranteed (i.e., "worst-case") system performance, and no information about the system's probable performance, which would be of interest to civil engineers.

The design objective for the probabilistic robust controller is to maximize the reliability of the uncertain structure/controller system for a probabilistically-described uncertain excitation. The robust performance is computed for a set of possible models by weighting the conditional performance probability for a particular model by the probability of that model, then integrating over the set of possible models. This integration is accomplished efficiently using an asymptotic approximation. The probable performance can be optimized numerically over the class of allowable controllers to find the optimal controller. Also, if structural response data becomes available from a controlled structure, its probable performance can easily be updated using Bayes's Theorem to update the probability distribution over the set of possible models. An updated optimal controller can then be produced, if desired, by following the original procedure. Thus, the probabilistic framework integrates system identification and robust control in a natural manner.
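
The robust performance described here is an integral of the conditional failure probability over the model set, weighted by the model probability. The thesis evaluates it with an asymptotic approximation; the sketch below substitutes plain Monte Carlo over models, the simplest correct instance of the same integral. The limit state and the prior over the model parameter are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def conditional_failure_prob(theta):
    """Hypothetical stand-in for P(F | model theta): failure probability of a
    toy system whose safety margin shifts with the uncertain parameter."""
    return float(norm.cdf(-(2.0 + theta)))

# Robust (probable) performance: integrate P(F | theta) against the model
# probability p(theta). Plain Monte Carlo stands in for the asymptotic
# (Laplace-type) approximation used in the thesis.
thetas = rng.normal(loc=0.0, scale=0.3, size=10_000)  # assumed prior p(theta)
p_fail_robust = np.mean([conditional_failure_prob(t) for t in thetas])
print(p_fail_robust)
```

Bayesian updating with measured response data would replace p(theta) by the posterior over models, leaving the rest of the calculation unchanged.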

The probabilistic robust control methodology is applied to two systems in this thesis. The first is a high-fidelity computer model of a benchmark structural control laboratory experiment. For this application, uncertainty in the input model only is considered. The probabilistic control design minimizes the failure probability of the benchmark system while remaining robust with respect to the input model uncertainty. The performance of an optimal low-order controller compares favorably with higher-order controllers for the same benchmark system which are based on other approaches. The second application is to the Caltech Flexible Structure, which is a light-weight aluminum truss structure actuated by three voice coil actuators. A controller is designed to minimize the failure probability for a nominal model of this system. Furthermore, the method for updating the model-based performance calculation given new response data from the system is illustrated.

Relevance:

30.00%

Publisher:

Abstract:

The density and distribution of spatial samples heavily affect the precision and reliability of estimated population attributes. An optimization method based on Mean of Surface with Nonhomogeneity (MSN) theory has been developed into a computer package with the purpose of improving the accuracy of global estimates of spatial properties from a spatial sample distributed over a heterogeneous surface; conversely, for a given variance of estimation, the program can export both the optimal number of sample units needed and their appropriate distribution within a specified research area. (C) 2010 Elsevier Ltd. All rights reserved.
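
MSN theory itself is specific to the published package, but the input-output relationship it describes (a target estimation variance in, a sample size and its allocation over a heterogeneous surface out) has a classical analogue in stratified sampling with Neyman allocation, sketched below with the surface treated as strata. All numbers are hypothetical; this is an illustration of the relationship, not the MSN algorithm.

```python
import numpy as np

def neyman_allocation(strata_sizes, strata_stds, target_variance):
    """Total sample size and per-stratum allocation for a stratified mean
    estimate with a target variance (Neyman allocation, ignoring the
    finite-population correction)."""
    N = np.sum(strata_sizes)
    W = np.asarray(strata_sizes) / N            # stratum weights
    S = np.asarray(strata_stds)
    n = (np.sum(W * S)) ** 2 / target_variance  # required total sample size
    n = int(np.ceil(n))
    n_h = np.ceil(n * W * S / np.sum(W * S)).astype(int)
    return n, n_h

# Three hypothetical strata of a heterogeneous surface
print(neyman_allocation([500, 300, 200], [2.0, 5.0, 9.0], target_variance=0.25))
```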

Relevance:

30.00%

Publisher:

Abstract:

Future high-speed communications networks will transmit data predominantly over optical fibres. As consumer and enterprise computing will remain the domain of electronics, the electro-optical conversion will be pushed further downstream towards the end user. Consequently, efficient tools are needed for this conversion and, due to many potential advantages, including low cost and high output powers, long-wavelength Vertical Cavity Surface Emitting Lasers (VCSELs) are a viable option. Drawbacks, such as broader linewidths than competing options, can be mitigated through the use of additional techniques such as Optical Injection Locking (OIL), which can require significant expertise and expensive equipment. This thesis addresses these issues by removing some of the experimental barriers to achieving performance increases via remote OIL. Firstly, numerical simulations of the phase and the photon and carrier numbers of an OIL semiconductor laser allowed the classification of the stable locking phase limits into three distinct groups. The frequency detuning of constant phase values (φ) was considered, in particular φ = 0, where the modulation response parameters were shown to be independent of the linewidth enhancement factor, α. A new method to estimate α and the coupling rate in a single experiment was formulated. Secondly, a novel technique to remotely determine the locked state of a VCSEL, based on voltage variations of 2 mV to 30 mV during detuned injection, has been developed which can identify oscillatory and locked states. 2D and 3D maps of voltage, optical and electrical spectra illustrate the corresponding behaviours. Finally, the use of directly modulated VCSELs as light sources for passive optical networks was investigated through the successful transmission of data at 10 Gbit/s over 40 km of single-mode fibre (SMF), using cost-effective electronic dispersion compensation to mitigate errors due to wavelength chirp. A widely tuneable MEMS-VCSEL was established as a good candidate for an externally modulated colourless source after a record error-free transmission at 10 Gbit/s over 50 km of SMF across a 30 nm single-mode tuning range. The ability to remotely set the emission wavelength using the novel methods developed in this thesis was demonstrated.
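
The simulations described operate on the standard injection-locking rate equations for the photon number S, phase φ and carrier number N. A minimal sketch of integrating them is below; the parameter magnitudes are typical textbook orders for a semiconductor laser, not the thesis's device values, and the injection level and detuning are chosen arbitrarily to fall inside a locking range.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (typical orders of magnitude only)
g_n, N0 = 1.0e4, 1.0e8            # differential gain [1/s], transparency carrier number
tau_p, tau_s = 2.0e-12, 2.0e-9    # photon and carrier lifetimes [s]
alpha = 3.0                       # linewidth enhancement factor
kappa = 1.0e11                    # injection coupling rate [1/s]
pump = 1.5e17                     # pump rate I/q [1/s]
S_inj = 1.5e3                     # injected photon number
d_omega = 2 * np.pi * 1.0e9       # master-slave detuning [rad/s]

def oil_rates(t, y):
    """Standard rate equations of an optically injection-locked laser."""
    S, phi, N = y
    G = g_n * (N - N0)  # material gain [1/s]
    dS = (G - 1/tau_p) * S + 2*kappa*np.sqrt(S_inj*S)*np.cos(phi)
    dphi = 0.5*alpha*(G - 1/tau_p) - d_omega - kappa*np.sqrt(S_inj/S)*np.sin(phi)
    dN = pump - N/tau_s - G*S
    return [dS, dphi, dN]

# Start near the free-running steady state and integrate for 20 ns
sol = solve_ivp(oil_rates, (0, 20e-9), [1.5e5, 0.0, 1.5e8], method="LSODA", rtol=1e-8)
print(sol.y[1, -1])  # steady phase offset if the detuning lies within the locking range
```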

Relevance:

30.00%

Publisher:

Abstract:

Critical decisions are made by decision-makers throughout the life-cycle of large-scale projects. These decisions are crucial as they have a direct impact upon the outcome and the success of projects. To aid decision-makers in the decision-making process we present an evidential reasoning framework. This approach utilizes Dezert-Smarandache theory to fuse heterogeneous evidence sources that suffer from levels of uncertainty, imprecision and conflict, to provide beliefs for decision options. To analyze the impact of source reliability and priority upon the decision-making process, a reliability discounting technique and a priority discounting technique are applied. A maximal consistent subset is constructed to aid in defining where discounting should be applied. Application of the evidential reasoning framework is illustrated using a case study from the aerospace domain.
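
Of the two discounting techniques mentioned, reliability discounting has a compact classical form (Shafer's discounting): each focal mass is scaled by the source's reliability and the removed mass is transferred to the whole frame, i.e. to total ignorance. A sketch with a hypothetical two-option frame follows; the priority discounting variant used in the paper is not reproduced here.

```python
def discount_mass(mass, reliability):
    """Shafer's reliability discounting of a belief mass assignment.
    `mass` maps focal sets (frozensets) to masses summing to 1;
    `reliability` in [0, 1] down-weights the source, moving the removed
    mass onto total ignorance (the frame Theta)."""
    theta = frozenset().union(*mass)  # frame of discernment
    out = {A: reliability * m for A, m in mass.items() if A != theta}
    out[theta] = mass.get(theta, 0.0) * reliability + (1.0 - reliability)
    return out

# Two decision options a, b; the source is judged 80% reliable
m = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.1}
print(discount_mass(m, 0.8))  # {a}: 0.48, {b}: 0.24, {a,b}: 0.28
```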

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the first known empirical use of the Reversal Theory State Measure (RTSM) since its publication by Desselles et al. (2014). The RTSM was employed to track responses to three purposely selected video commercials in a between-subjects design. The results of the study provide empirical support for the central conceptual premise of reversal theory, the experience of metamotivational reversals, and the ability of the RTSM to capture them. The RTSM was also found to be psychometrically sound after adjustments were made to two of its three component subscales. A detailed account and rationale are provided for the analytical process of assessing the psychometric robustness of the RTSM, with a number of techniques and interpretations relating to component structure and reliability discussed. The two available versions of the RTSM, the bundled and the branched, are also examined and critiqued. Researchers are encouraged to assist development of the RTSM through further use, taking into account the analysis and recommendations presented.
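
The abstract does not reproduce its psychometric calculations; as background to the subscale reliability assessment it describes, the sketch below computes Cronbach's alpha, the usual internal-consistency index, for a hypothetical respondents-by-items score matrix. This is generic background, not the paper's own analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical scores: four respondents, three subscale items
scores = np.array([[3, 4, 3], [2, 2, 3], [5, 4, 4], [4, 5, 4]])
print(cronbach_alpha(scores))
```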

Relevance:

30.00%

Publisher:

Abstract:

Objective: To determine the overall, test-retest and inter-rater reliability of posture indices among persons with idiopathic scoliosis. Design: A reliability study using two raters and two test sessions. Setting: Tertiary care paediatric centre. Participants: Seventy participants aged between 10 and 20 years with different types of idiopathic scoliosis (Cobb angle 15 to 60°) were recruited from the scoliosis clinic. Main outcome measures: Based on the XY co-ordinates of natural reference points (e.g. eyes) as well as markers placed on several anatomical landmarks, 32 angular and linear posture indices taken from digital photographs in the standing position were calculated with specially developed software. Generalisability theory served to estimate the reliability and standard error of measurement (SEM) for the overall, test-retest and inter-rater designs. Bland and Altman's method was also used to document agreement between sessions and raters. Results: In the random design, dependability coefficients demonstrated a moderate level of reliability for six posture indices (ϕ = 0.51 to 0.72) and a good level of reliability for 26 of the 32 posture indices (ϕ ≥ 0.79). Error attributable to marker placement was negligible for most indices. Limits of agreement and SEM values were larger for shoulder protraction, trunk list, Q angle, cervical lordosis and scoliosis angles. The most reproducible indices were the waist angles and knee valgus and varus. Conclusions: Posture can be assessed in a global fashion from photographs in persons with idiopathic scoliosis. Despite the good reliability of marker placement, further studies are needed to minimise measurement errors in order to provide a suitable tool for monitoring change in posture over time.
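
The dependability coefficients and SEM reported here come from generalisability-theory variance components, and agreement between sessions or raters is documented with Bland and Altman's method. A minimal sketch of both computations, with hypothetical variance components and session data:

```python
import numpy as np

def dependability_and_sem(var_person, var_error):
    """Dependability coefficient (phi) and standard error of measurement
    from generalisability-theory variance components."""
    phi = var_person / (var_person + var_error)
    return phi, np.sqrt(var_error)

def bland_altman_limits(session1, session2):
    """Bland & Altman 95% limits of agreement between two sessions (or raters)."""
    d = np.asarray(session1, float) - np.asarray(session2, float)
    half_width = 1.96 * d.std(ddof=1)
    return d.mean() - half_width, d.mean() + half_width

print(dependability_and_sem(var_person=4.0, var_error=1.0))  # phi = 0.8, SEM = 1.0
print(bland_altman_limits([12.1, 9.8, 15.2, 11.0], [11.5, 10.4, 14.6, 11.9]))
```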

Relevance:

30.00%

Publisher:

Abstract:

Applications of queueing theory in areas such as computer networking, ATM facilities and telecommunications, among many other situations, have led to the extensive study of queueing models, and the subject has become an ever-expanding branch of applied probability. The thesis discusses the reliability of a k-out-of-n system in which the server also attends to external customers when there are no failed components (main customers), under a retrial policy, which is explained in detail. It also studies a multi-server, infinite-capacity queueing system where each customer arrives as an ordinary customer but, while waiting in the queue, can turn into a priority customer. The study further gives details of a finite-capacity multi-server queueing system with self-generation of priority customers, and of a single-server, infinite-capacity retrial queue where a customer in the orbit can turn into a priority customer and leaves the system if the server is already busy with a priority-generated customer; otherwise he is taken up for service immediately. Arrivals follow a Markovian arrival process (MAP) and service times follow a Markovian service process (MSP).
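
The k-out-of-n systems analysed in the thesis involve repair, retrials and external customers; the baseline notion underneath them is the static k-out-of-n reliability of independent components, which has a one-line closed form. A sketch, as a point of reference rather than the thesis's model:

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """Reliability of a k-out-of-n system of independent components, each
    working with probability p: at least k of the n must work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(k_out_of_n_reliability(2, 3, 0.9))  # classic 2-out-of-3 system: 0.972
```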

Relevance:

30.00%

Publisher:

Abstract:

Reliability analysis is a well-established branch of statistics that deals with the statistical study of different aspects of the lifetimes of a system of components. As pointed out earlier, a major part of the theory and applications connected with reliability analysis has been discussed in terms of measures based on the distribution function. In the opening chapters of the thesis, we describe some attractive features of quantile functions and the relevance of their use in reliability analysis. Motivated by the works of Parzen (1979), Freimer et al. (1988) and Gilchrist (2000), who indicated the scope of quantile functions in reliability analysis, and as a follow-up of the systematic study in this connection by Nair and Sankaran (2009), the present work attempts to extend their ideas to develop the necessary theoretical framework for lifetime data analysis. Chapter 1 gives the relevance and scope of the study and a brief outline of the work carried out. Chapter 2 is devoted to the presentation of various concepts and brief reviews of them, which are useful for the discussions in the subsequent chapters. In the introduction to Chapter 4, we point out the role of ageing concepts in reliability analysis and in identifying life distributions. In Chapter 6, we study the first two L-moments of residual life and their relevance in various applications of reliability analysis; we show that the first L-moment of the residual function is equivalent to the vitality function, which has been widely discussed in the literature. In Chapter 7, we define the percentile residual life in reversed time (RPRL) and derive its relationship with the reversed hazard rate (RHR); we discuss the characterization problem for the RPRL and demonstrate with an example that the RPRL, for a given percentile, does not determine the distribution uniquely.
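
The quantile-based approach replaces distribution-function measures with quantile-function analogues; a standard example is the hazard quantile function H(u) = 1 / ((1 - u) q(u)), where q(u) = Q'(u) is the quantile density. The sketch below evaluates it numerically and checks it against the exponential distribution, whose hazard rate is constant; it illustrates the general idea, not a construction from the thesis.

```python
import numpy as np

def hazard_quantile(Q, u, du=1e-6):
    """Hazard quantile function H(u) = 1 / ((1-u) * q(u)), with the quantile
    density q(u) = Q'(u) taken by central finite difference."""
    q = (Q(u + du) - Q(u - du)) / (2 * du)
    return 1.0 / ((1.0 - u) * q)

# Check: exponential distribution, Q(u) = -ln(1-u)/lam, has hazard rate lam
lam = 2.0
Q_exp = lambda u: -np.log(1.0 - u) / lam
print(hazard_quantile(Q_exp, 0.5))  # ~2.0 for any u in (0, 1)
```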

Relevance:

30.00%

Publisher:

Abstract:

In population sampling it is vitally important to clarify and distinguish: first, the design or sampling method used to solve the research problem; second, the sample size, taking into account its different components (precision, reliability, variance); third, random selection; and fourth, the precision estimate (sampling errors), so as to determine whether it is possible to infer the obtained estimates to the target population. The difficulty in using concepts from sampling theory lies in understanding them with absolute clarity, and to achieve this, didactic-pedagogical strategies arranged as conceptual “mentefactos” (simple hierarchical diagrams organized from propositions) may prove useful. This paper presents the conceptual definition, through conceptual “mentefactos”, of the most important probabilistic population sampling concepts, in order to obtain representative samples from populations in health research.
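
The components listed here (precision, reliability, variance) are exactly the inputs to the standard sample-size calculation. As a concrete instance, the sketch below computes the sample size for estimating a proportion, with an optional finite-population correction; the numbers in the usage lines are illustrative.

```python
from math import ceil
from scipy.stats import norm

def sample_size_proportion(p, margin, confidence=0.95, population=None):
    """Sample size for estimating a proportion p to within +/- margin at the
    given confidence level, with an optional finite-population correction."""
    z = norm.ppf(1 - (1 - confidence) / 2)  # reliability coefficient
    n0 = z**2 * p * (1 - p) / margin**2     # precision and variance components
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return ceil(n0)

print(sample_size_proportion(0.5, 0.05))                    # ~385
print(sample_size_proportion(0.5, 0.05, population=2000))   # ~323 with correction
```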

Relevance:

30.00%

Publisher:

Abstract:

1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley. 2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application. 3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions. 4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past. 5. Many future changes in the environment driven by management will not have been experienced by a population in the past. Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level. 6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley’s declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.

Relevance:

30.00%

Publisher:

Abstract:

The kinetics of the reactions of the atoms O(³P), S(³P), Se(³P), and Te(³P) with a series of alkenes are examined for correlations relating the logarithms of the rate coefficients to the energies of the highest occupied molecular orbitals (HOMOs) of the alkenes. These correlations may be employed to predict rate coefficients from the calculated HOMO energy of any other alkene of interest. The rate coefficients obtained from the correlations were used to formulate structure-activity relations (SARs) for the reactions of O(³P), S(³P), Se(³P), and Te(³P) with alkenes. A comparison of the values predicted by both the correlations and the SARs with experimental data, where they exist, allowed us to assess the reliability of our method. We demonstrate the applicability of perturbation frontier molecular orbital theory to gas-phase reactions of these atoms with alkenes. The correlations are apparently not applicable to reactions of C(³P), Si(³P), N(⁴S), and Al(²P) atoms with alkenes, a conclusion that could be explained in terms of a different mechanism for the reaction of these atoms.
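
A correlation of this kind is a straight-line fit of log k against HOMO energy, which then predicts the rate coefficient for any alkene whose HOMO energy has been calculated. The sketch below shows that workflow on purely hypothetical numbers; neither the energies nor the rate coefficients are from the paper.

```python
import numpy as np

# Hypothetical data: HOMO energies (eV) of a few alkenes and measured rate
# coefficients k (cm^3 molecule^-1 s^-1); values are illustrative only.
e_homo = np.array([-9.1, -8.9, -8.6, -8.3, -8.0])
k = np.array([2.1e-12, 4.0e-12, 1.1e-11, 2.9e-11, 7.5e-11])

# Linear correlation of log k with HOMO energy, as described in the abstract
slope, intercept = np.polyfit(e_homo, np.log10(k), 1)

def predict_k(e):
    """Predicted rate coefficient for an alkene with calculated HOMO energy e."""
    return 10 ** (intercept + slope * e)

print(predict_k(-8.5))  # prediction for a new alkene of interest
```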

Relevance:

30.00%

Publisher:

Abstract:

Modern transaction cost economics (TCE) thinking has developed into a key intellectual foundation of international business (IB) research, but the Williamsonian version has faced substantial criticism for adopting the behavioral assumption of opportunism. In this paper we assess both the opportunism concept and existing alternatives such as trust within the context of IB research, especially work on multinational enterprise (MNE) governance. Case analyses of nine global MNEs illustrate an alternative to the opportunism assumption that captures more fully the mechanisms underlying failed commitments inside the MNE. As a substitute for the often-criticized assumption of opportunism, we propose the envelope concept of bounded reliability (BRel), an assumption that represents more accurately and more completely the reasons for failed commitments, without invalidating the other critical assumption in conventional TCE (and internalization theory) thinking, namely the widely accepted envelope concept of bounded rationality (BRat). Bounded reliability as an envelope concept includes two main components, within the context of global MNE management: opportunism as intentional deceit, and benevolent preference reversal. The implications for IB research of adopting the bounded reliability concept are far-reaching, as this concept may increase the legitimacy of comparative institutional analysis in the social sciences.