819 results for rater reliability


Relevance: 20.00%

Abstract:

Objective To evaluate and compare the intraobserver and interobserver reliability and agreement of the biparietal diameter (BPD), abdominal circumference (AC), femur length (FL) and estimated fetal weight (EFW) obtained by two-dimensional ultrasound (2D-US) and three-dimensional ultrasound (3D-US). Methods Women with singleton pregnancies between 24 and 40 weeks' gestation were invited to participate in this study. Each was examined by 2D-US in a blinded manner, twice by one observer with a scan by a second observer in between, to determine BPD, AC and FL. In each of the three examinations, three 3D-US datasets (head, abdomen and thigh) were acquired for measurement of the same parameters. EFW was determined using Hadlock's formula. Systematic errors between 3D-US and 2D-US were examined using the paired t-test. Reliability and agreement were assessed by intraclass correlation coefficients (ICCs), limits of agreement (LoA), the SD of differences and the proportion of differences below arbitrary cut-off points. Results We evaluated 102 singleton pregnancies. No significant systematic error between 2D-US and 3D-US was observed. ICC values were higher for 3D-US in both intra- and interobserver evaluations, although only for FL was there no overlap in the 95% CIs. LoA were wider for 2D-US, and the SDs of the 3D-US differences were smaller than those for 2D-US, indicating smaller random errors with 3D-US. Higher proportions of differences fell below the arbitrarily defined cut-off points when using 3D-US. Conclusion 3D-US improved the reliability and agreement of fetal measurements and EFW compared with 2D-US.
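The agreement statistics used above (the mean and SD of paired differences, which give Bland-Altman-style limits of agreement) can be sketched in a few lines; the measurement values below are invented for illustration and are not data from the study.

```python
import statistics

def limits_of_agreement(first, second):
    """95% limits of agreement for paired measurements
    (e.g., repeated femur-length readings by one observer)."""
    diffs = [a - b for a, b in zip(first, second)]
    mean_diff = statistics.mean(diffs)   # systematic (bias) component
    sd_diff = statistics.stdev(diffs)    # random-error component
    return mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Hypothetical repeated FL measurements (mm) by the same observer:
scan1 = [61.2, 58.9, 64.1, 60.3, 62.7]
scan2 = [61.0, 59.3, 63.8, 60.6, 62.5]
low, high = limits_of_agreement(scan1, scan2)
```

Narrower limits (a smaller SD of differences) are what the abstract reports in favor of 3D-US.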

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure; however, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs of construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct; hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
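The trade-off RO formalizes (construction cost rising with the design variable while the expected failure cost falls) can be illustrated on a one-variable toy problem; every number here is invented, and a closed-form normal tail stands in for a full reliability analysis.

```python
import math

def total_expected_cost(d, c_unit=1.0, c_failure=500.0,
                        mu_load=3.0, sigma=0.5):
    """Risk-optimization objective for a toy member of size d resisting
    a normally distributed load: construction cost grows with d, while
    the expected failure cost P_f * C_f shrinks. Illustrative only."""
    beta = (d - mu_load) / sigma               # reliability index
    pf = 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal tail
    return c_unit * d + c_failure * pf

# Coarse search for the design balancing economy and safety:
best_d = min((d * 0.01 for d in range(200, 700)),
             key=total_expected_cost)
```

The minimizer sits where the marginal construction cost equals the marginal drop in expected failure cost, i.e. neither at minimum weight (DDO) nor at any preassigned failure probability (RBDO).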

Objective To evaluate the intra- and interobserver reliability of assessment of three-dimensional power Doppler (3D-PD) indices from single spherical samples of the placenta. Methods Women with singleton pregnancies at 24-40 weeks' gestation were included. Three scans were performed independently by two observers: Observer 1 performed the first and third scans, intercalated by a scan by Observer 2. Each observer independently analyzed the 3D-PD datasets that they had previously acquired, using four different spherical sampling methods: a random sample extending from the basal to the chorionic plate; a random sample of 2 cm³ volume; a sample directed to the region subjectively judged to contain the most color Doppler signal, extending from the basal to the chorionic plate; and a directed sample of 2 cm³ volume. The vascularization index (VI), flow index (FI) and vascularization flow index (VFI) were evaluated in each case. The observers were blinded to their own and each other's results. An additional evaluation was performed according to placental location: anterior, posterior, and fundal or lateral. Intra- and interobserver reliability was assessed by intraclass correlation coefficients (ICCs). Results Ninety-five pregnancies were included in the analysis. All three placental 3D-PD indices showed only weak to moderate reliability (ICC < 0.66 and ICC < 0.48, intra- and interobserver, respectively). The highest ICC values were observed when using directed spherical samples extending from the basal to the chorionic plate. When analyzed by placental location, ICCs were lower for lateral and fundal placentae than for anterior and posterior ones. Conclusion Intra- and interobserver reliability of assessment of placental 3D-PD indices from single spherical samples in pregnant women beyond 24 weeks' gestation is poor to moderate, and the clinical usefulness of these indices is likely to be limited. Copyright (c) 2012 ISUOG. Published by John Wiley & Sons, Ltd.

Solution of structural reliability problems by the First Order Reliability Method requires optimization algorithms to find the smallest distance between the limit state function and the origin of standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on HLRF but uses a new differentiable merit function with Wolfe conditions to select the step length in the line search. It is shown that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. The performance and robustness of the new algorithms are compared to those of the classical augmented Lagrangian method, HLRF and the improved HLRF (iHLRF) algorithm in the solution of 25 benchmark problems from the literature. The new HLRF-based algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as iHLRF. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
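For readers unfamiliar with HLRF, a minimal sketch of the basic iteration (without the merit function and Wolfe-condition line search that the article adds) might look as follows; the limit state is a toy linear example with a known answer, not one of the 25 benchmarks.

```python
import math

def hlrf(g, grad_g, u0, tol=1e-8, max_iter=100):
    """Plain HLRF iteration in standard Gaussian space: seek the
    minimum-norm point on g(u) = 0 (the design point). Sketch only;
    without step-length control it can diverge on hard problems."""
    u = list(u0)
    for _ in range(max_iter):
        gu, dg = g(u), grad_g(u)
        norm2 = sum(c * c for c in dg)
        scale = (sum(c * ui for c, ui in zip(dg, u)) - gu) / norm2
        u_new = [scale * c for c in dg]       # HLRF update formula
        converged = max(abs(a - b) for a, b in zip(u_new, u)) < tol
        u = u_new
        if converged:
            break
    return u, math.sqrt(sum(c * c for c in u))  # design point, beta

# Linear limit state g(u) = 3 - u1 - u2: exact beta = 3 / sqrt(2).
g = lambda u: 3.0 - u[0] - u[1]
grad = lambda u: [-1.0, -1.0]
u_star, beta = hlrf(g, grad, [0.0, 0.0])
```

On a linear limit state the update lands on the design point in one step; the non-convergence the article addresses arises for strongly nonlinear g.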

Objectives. To evaluate whether the overall dysphonia Grade, Roughness, Breathiness, Asthenia, and Strain (GRBAS) scale and the Consensus Auditory-Perceptual Evaluation of Voice (CAPE-V) scale show the same reliability and consensus when applied to the same vocal sample at different times. Study Design. Observational cross-sectional study. Methods. Sixty subjects had their voices recorded according to the tasks proposed in the CAPE-V scale. The vowels /a/ and /i/ were sustained for between 3 and 5 seconds. Reproduction of six sentences and spontaneous speech elicited by the request "Tell me about your voice" were analyzed. For the GRBAS scale, the sustained-vowel and sentence-reading tasks were used. Auditory-perceptual voice analyses were conducted by three expert speech therapists, each with more than 5 years of experience and familiar with both scales. Results. A strong correlation was observed in the intrajudge consensus analysis for both the GRBAS and CAPE-V scales, with intraclass correlation coefficient values ranging from 0.923 to 0.985. A high degree of correlation between the overall GRBAS and CAPE-V grades (coefficient = 0.842) was observed, with similar distributions of dysphonia grades on the two scales. The evaluators reported mild difficulty in applying the GRBAS scale and low to mild difficulty in applying the CAPE-V scale. The three evaluators agreed in indicating the GRBAS scale as the fastest and the CAPE-V scale as the most sensitive, especially for detecting small changes in voice. Conclusions. The two scales are reliable and are indicated for use in analyzing voice quality.

Objectives: Because the mechanical behavior of the implant-abutment system is critical to the longevity of implant-supported reconstructions, this study evaluated the fatigue reliability of different implant-abutment systems used as single-unit crowns, and their failure modes. Methods and Materials: Sixty-three Ti-6Al-4V implants were divided into 3 groups (Replace Select (RS), IC-IMP Osseotite, and Unitite) and restored with their respective abutments. Anatomically correct central incisor metal crowns were cemented and subjected to separate single-load-to-failure tests and step-stress accelerated life testing (n = 18). A master Weibull curve and the reliability for a mission of 50,000 cycles at 200 N were calculated. Polarized-light and scanning electron microscopes were used for failure analysis. Results: Mean loads at failure during step-stress accelerated life testing were 348.14 N for RS, 324.07 N for Osseotite, and 321.29 N for Unitite. No differences in reliability were detected between systems, and only the RS system's mechanical failures were shown to be accelerated by damage accumulation. Failure modes differed between systems. Conclusions: The 3 evaluated systems did not present significantly different reliability; however, their failure modes differed. (Implant Dent 2012;21:67-71)
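The "reliability for a mission of 50,000 cycles" reported above is the survival probability read off a fitted Weibull curve. A minimal sketch of that final step follows; the scale and shape parameters are invented, not the study's fitted values.

```python
import math

def weibull_reliability(cycles, eta, beta):
    """Two-parameter Weibull reliability R(n) = exp(-(n/eta)^beta):
    probability of surviving `cycles` load cycles. eta is the
    characteristic life, beta the shape parameter."""
    return math.exp(-((cycles / eta) ** beta))

# Hypothetical fit: characteristic life 400,000 cycles, shape 1.2.
r_mission = weibull_reliability(50_000, eta=400_000, beta=1.2)
```

A shape parameter above 1 indicates failures accelerated by damage accumulation (wear-out), which is what the abstract reports for the RS system; a shape near 1 would indicate load-driven, memoryless failures.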

INTRODUCTION: Schizophrenia is a chronic mental disorder associated with impairment in social functioning. The most widely used scale to measure social functioning is the GAF (Global Assessment of Functioning), but it has the disadvantage of measuring symptoms and functioning at the same time, as described in its anchors. OBJECTIVE: To translate and culturally adapt the PSP (Personal and Social Performance) scale, proposing a final version in Portuguese for use in Brazil. METHODS: We performed five steps: 1) translation; 2) back-translation; 3) formal assessment of semantic equivalence; 4) debriefing; 5) analysis by experts. Interrater reliability (intraclass correlation coefficient, ICC) between two raters was also measured. RESULTS: The final version was applied by two independent investigators to 18 adults with schizophrenia (DSM-IV-TR). The interrater reliability (ICC) was 0.812 (p < 0.001). CONCLUSION: The translation and adaptation of the PSP showed an adequate level of semantic equivalence between the Portuguese version and the original English version. There were no difficulties related to understanding the content expressed in the translated texts and terms. Its application was easy, and it showed good interrater reliability. The PSP is a valid instrument for the measurement of personal and social functioning in schizophrenia.

This study compared the effectiveness of multifocal visual evoked cortical potentials (mfVEP) elicited by pattern pulse stimulation with that of pattern reversal stimulation in producing reliable responses (signal-to-noise ratio > 1.359). Participants were 14 healthy subjects. Visual stimulation used a 60-sector dartboard display consisting of 6 concentric rings presented in either pulse or reversal mode. Each sector, consisting of 16 checks at 99% Michelson contrast and 80 cd/m² mean luminance, was controlled by a binary m-sequence in the time domain. The signal-to-noise ratio was generally larger in pattern reversal than in pattern pulse mode. The number of reliable responses was similar in the central sectors for the two stimulation modes; at the periphery, pattern reversal yielded a larger number of reliable responses. Pattern pulse stimuli performed similarly to pattern reversal stimuli in generating reliable waveforms in R1 and R2. The advantage of using both protocols to study mfVEP responses is their complementarity: in some patients, reliable waveforms in specific sectors may be obtained with only one of the two methods. The joint analysis of pattern reversal and pattern pulse stimuli increased the reliability rate by 7.14% in R1, 5.35% in R2, 4.76% in R3, 3.57% in R4, 2.97% in R5, and 1.78% in R6. From R1 to R4, the reliability of generating mfVEPs was above 70% when using both protocols. Thus, for very high reliability and a thorough examination of visual performance, it is recommended to use both stimulation protocols.

In deterministic optimization, the uncertainties of the structural system (e.g., dimensions, model, material, loads) are not explicitly taken into account. Hence, the resulting optimal solutions may lead to reduced reliability levels. The objective of Reliability-Based Design Optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the First Order Reliability Method (FORM) is an optimization procedure itself, RBDO in its classical version is a double-loop strategy: the reliability analysis (inner loop) nested within the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of FORM-based RBDO, several authors have proposed decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to reduce the RBDO problem to a single optimization loop. This paper presents a review of such RBDO strategies. The performance (computational cost) of the main strategies is compared for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
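The classical double loop described above can be made concrete on a toy problem where the inner FORM analysis has a closed-form answer; everything here (limit state, targets, the use of bisection for the one-variable outer loop) is an invented illustration, not any of the reviewed strategies.

```python
def reliability_index(d, mu_s=2.0, sigma_s=0.3):
    """Inner loop: for g = d - S with S ~ N(mu_s, sigma_s), FORM gives
    beta in closed form. In general this is itself an optimization,
    which is why classical RBDO is so expensive."""
    return (d - mu_s) / sigma_s

def rbdo_double_loop(beta_target=3.0, d_min=0.0, d_max=10.0, tol=1e-6):
    """Outer loop: minimize cost (here simply d) subject to
    beta(d) >= beta_target, by bisection on the design variable."""
    lo, hi = d_min, d_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if reliability_index(mid) >= beta_target:
            hi = mid   # feasible: try a cheaper design
        else:
            lo = mid   # infeasible: need more material
    return hi

d_opt = rbdo_double_loop()   # analytic optimum: mu_s + 3*sigma_s = 2.9
```

Every outer iteration calls the inner reliability analysis; decoupled and unilevel methods exist precisely to avoid this nesting.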

Brazilian design code ABNT NBR6118:2003 - Design of Concrete Structures - Procedures [1] proposes the use of simplified models for the consideration of non-linear material behavior in the evaluation of horizontal displacements in buildings. These models penalize the stiffness of columns and beams to represent the effects of concrete cracking, avoiding costly physically non-linear analyses. The objectives of the present paper are to investigate the accuracy and uncertainty of these simplified models, and to evaluate the reliability of structures designed following ABNT NBR6118:2003 [1] with respect to the serviceability limit state for horizontal displacements. Model error statistics are obtained from 42 representative plane frames. The reliabilities of three typical buildings (4, 8 and 12 floors) are evaluated using the simplified models and a rigorous, physically and geometrically non-linear analysis. Results show that the 70/70 (column/beam stiffness reduction) model is more accurate and less conservative than the 80/40 model. Results also show that the ABNT NBR6118:2003 [1] design criteria for horizontal displacement limit states (masonry damage according to ACI 435.3R-68 (1984) [10]) are conservative, resulting in reliability indices larger than those recommended in EUROCODE [2] for irreversible serviceability limit states.

This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion which must be evaluated for every type of structure, especially those constructed in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability: by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. Corrosion begins when a threshold chloride concentration is reached at the steel reinforcement bars. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately, owing to the inherent randomness of the process. Durability can therefore be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride penetration is simulated using Fick's second law of diffusion, which represents the chloride diffusion process while accounting for time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Examples are analyzed to study these phenomena, and a simplified method is proposed to determine optimal values of the concrete cover.
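The Monte Carlo part of such an analysis can be sketched using the standard erfc solution of Fick's second law for a constant surface concentration; all statistics below (cover, diffusion coefficient, surface and critical concentrations) are invented for illustration.

```python
import math
import random

def chloride_at_depth(x_cm, t_years, c_surface, diff_cm2_yr):
    """Fick's second law, constant-surface-concentration solution:
    C(x, t) = Cs * erfc(x / (2 * sqrt(D * t)))."""
    return c_surface * math.erfc(
        x_cm / (2.0 * math.sqrt(diff_cm2_yr * t_years)))

def pf_initiation(t_years, cover_mean=4.0, cover_cv=0.1, c_s=0.9,
                  d_mean=0.2, d_cv=0.3, c_crit=0.4, n=20_000, seed=1):
    """Monte Carlo estimate of P(C(cover, t) >= C_crit), with random
    cover depth (cm) and diffusion coefficient (cm^2/yr)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        cover = rng.gauss(cover_mean, cover_cv * cover_mean)
        diff = rng.gauss(d_mean, d_cv * d_mean)
        if diff > 0 and chloride_at_depth(cover, t_years,
                                          c_s, diff) >= c_crit:
            fails += 1
    return fails / n

pf_50 = pf_initiation(50.0)   # probability corrosion has initiated by year 50
```

Evaluating this probability over a range of cover depths is one way to pick an optimal cover, in the spirit of the simplified method the paper proposes.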

In Performance-Based Earthquake Engineering (PBEE), evaluating the seismic performance (or seismic risk) of a structure at a given site has gained major attention, especially in the past decade. One of the objectives of PBEE is to quantify the seismic reliability of a structure at a site under future random earthquakes. For that purpose, Probabilistic Seismic Demand Analysis (PSDA) is used as a tool to estimate the Mean Annual Frequency (MAF) of exceeding a specified value of a structural Engineering Demand Parameter (EDP). This dissertation focuses mainly on applying the average of a number of spectral acceleration ordinates over an interval of periods, Sa,avg(T1, ..., Tn), as a scalar ground motion Intensity Measure (IM) when assessing the seismic performance of inelastic structures. Since the interval of periods over which Sa,avg is computed reflects the greater or lesser influence of higher vibration modes on the inelastic response, it is appropriate to speak of improved IMs. Results using these improved IMs are compared with conventional elastic-based scalar IMs (e.g., pseudo-spectral acceleration, Sa(T1), or peak ground acceleration, PGA) and with an advanced inelastic-based scalar IM (inelastic spectral displacement, Sdi). The advantages of the improved IMs are: (i) "computability" of the seismic hazard by traditional Probabilistic Seismic Hazard Analysis (PSHA), because ground motion prediction models are already available for Sa(Ti) and hence existing models can be employed to assess hazard in terms of Sa,avg; and (ii) "efficiency", i.e. smaller variability of the structural response, which was minimized in order to find the optimal period range over which to compute Sa,avg. More work is needed to assess the desirable properties of "sufficiency" and "scaling robustness", which are not addressed in this dissertation.
For ordinary records (i.e., without pulse-like effects), using the improved IMs is found to be more accurate than using the elastic- and inelastic-based IMs. For structural demands dominated by the first mode of vibration, the benefit of Sa,avg over the conventionally used Sa(T1) and the advanced Sdi can be negligible; for structural demands with significant higher-mode contributions, an improved scalar IM that incorporates higher modes needs to be utilized. In order to fully understand the influence of the IM on the seismic risk, a simplified closed-form expression for the probability of exceeding a limit-state capacity was chosen as a reliability measure under seismic excitation and applied to Reinforced Concrete (RC) frame structures. This closed-form expression is particularly useful for the seismic assessment and design of structures, taking into account the uncertainty in the generic variables, structural "demand" and "capacity", as well as the uncertainty in the seismic excitation. The framework employs nonlinear Incremental Dynamic Analysis (IDA) to estimate the variability of the structural response (demand) to seismic excitation, conditioned on the IM. The estimate of the seismic risk obtained from the simplified closed-form expression is affected by the choice of IM: the final seismic risk is not constant across IMs, although it remains of the same order of magnitude. Possible reasons concern the assumed non-linear model or the insufficiency of the selected IM. Since it is impossible to state what the "true" probability of exceeding a limit state is by looking at the total risk, the only recourse is to optimize the desirable properties of the IM.
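Sa,avg is commonly computed as the geometric mean of the spectral ordinates over the chosen period range; a minimal sketch follows, where the response-spectrum values are invented rather than taken from any ground-motion record.

```python
import math

def sa_avg(spectrum, periods):
    """Averaged spectral acceleration Sa,avg(T1, ..., Tn): geometric
    mean of Sa(T) over the chosen periods. `spectrum` maps each
    period (s) to its spectral acceleration (g)."""
    vals = [spectrum[t] for t in periods]
    return math.exp(sum(math.log(v) for v in vals) / len(vals))

# Hypothetical response spectrum sampled at a few periods:
spec = {0.5: 0.80, 1.0: 0.45, 1.5: 0.28, 2.0: 0.18}
im = sa_avg(spec, [0.5, 1.0, 1.5, 2.0])
```

Using the geometric rather than arithmetic mean keeps the averaged IM compatible with the lognormal form of standard ground motion prediction models, which is what makes the "computability" property above hold.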

A new algorithm for evaluating the top event probability of large fault trees (FTs) is presented. This algorithm does not require any prior qualitative analysis of the FT. Indeed, its efficiency is independent of the FT logic, depending only on the number n of basic system components and on their failure probabilities. Our method provides exact lower and upper bounds on the top event probability by using new properties of the intrinsic order relation between binary strings. The intrinsic order makes it possible to select binary n-tuples with large occurrence probabilities without needing to evaluate them. This drastically reduces the complexity of the problem from exponential (2^n binary n-tuples) to linear (n Boolean variables)...
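To make the target quantity concrete, here is the exponential brute force that the intrinsic-order method avoids: exact top-event probability by enumerating all 2^n component states. The fault tree and failure probabilities are a toy example; this sketch does not implement the authors' linear-complexity bounds.

```python
from itertools import product

def top_event_probability(structure, probs):
    """Exact top-event probability of a small fault tree by summing
    the occurrence probabilities of all 2^n binary n-tuples for
    which the structure function reports the top event."""
    total = 0.0
    for states in product([0, 1], repeat=len(probs)):
        p = 1.0
        for s, q in zip(states, probs):
            p *= q if s else (1.0 - q)   # 1 = component failed
        if structure(states):
            total += p
    return total

# Toy tree: TOP = A OR (B AND C), with assumed failure probabilities.
tree = lambda s: bool(s[0] or (s[1] and s[2]))
p_top = top_event_probability(tree, [0.01, 0.05, 0.10])
```

For n beyond a few dozen components this enumeration is infeasible, which is exactly why bounding methods that rank n-tuples by occurrence probability matter.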

Next-generation electronic devices have to guarantee high performance while being less power-consuming and highly reliable, for application domains ranging from entertainment to business. In this context, multicore platforms have proven to be the most efficient design choice, but new challenges must be faced. The ever-increasing miniaturization of components produces unexpected variations in technological parameters as well as wear-out, manifesting as soft and hard errors. Hardware techniques, which lend themselves to application at design time, have been studied with the objective of mitigating these effects, but they are not sufficient; software adaptive techniques are therefore necessary. In this thesis we focus on multicore task allocation strategies that minimize energy consumption while meeting performance constraints. We first devise a technique based on an Integer Linear Programming (ILP) formulation, which provides the optimal solution but cannot be applied online because solving it is too time-consuming; we then propose a sub-optimal two-step technique that can be applied online. We demonstrate the effectiveness of the latter through an exhaustive comparison against the optimal solution, state-of-the-art policies, and variability-agnostic task allocations, by running multimedia applications on the virtual prototype of a next-generation industrial multicore platform. We also address performance and lifetime degradation. We first focus on embedded multicore platforms and propose an idleness distribution policy that increases expected core lifetimes by duty-cycling their activity; we then investigate the use of micro thermoelectric coolers in general-purpose multicore processors to control core temperatures at runtime, with the objective of meeting lifetime constraints without performance loss.
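The kind of variability-aware allocation problem the thesis formulates as an ILP can be sketched with a simple greedy heuristic: place each task on the feasible core with the lowest per-unit energy, where cores differ in energy cost due to process variability. All numbers are invented, and this greedy rule is an illustrative stand-in, not the thesis's two-step online technique.

```python
def allocate_tasks(task_loads, core_energy_per_load, core_capacity):
    """Greedy task allocation: biggest tasks first, each placed on the
    feasible (capacity-respecting) core with the lowest marginal
    energy. Returns the placement and the total energy."""
    used = [0.0] * len(core_capacity)
    placement, energy = [], 0.0
    for load in sorted(task_loads, reverse=True):
        candidates = [i for i, cap in enumerate(core_capacity)
                      if used[i] + load <= cap]
        if not candidates:
            raise ValueError("performance constraint cannot be met")
        best = min(candidates, key=lambda i: core_energy_per_load[i])
        used[best] += load
        placement.append((load, best))
        energy += load * core_energy_per_load[best]
    return placement, energy

# Four cores whose per-unit energy differs due to process variability:
plan, e = allocate_tasks([0.5, 0.4, 0.3, 0.2, 0.2],
                         [1.0, 1.2, 0.8, 1.5],
                         [1.0, 1.0, 1.0, 1.0])
```

An ILP solver would search over all placements for the true optimum; the point of a heuristic like this is that it runs fast enough to be applied online, at the cost of possible sub-optimality.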