818 results for Reliability assessments


Relevance: 20.00%

Publisher:

Abstract:

INTRODUCTION: Schizophrenia is a chronic mental disorder associated with impairment in social functioning. The most widely used scale to measure social functioning is the GAF (Global Assessment of Functioning), but it has the disadvantage of measuring symptoms and functioning at the same time, as described in its anchors. OBJECTIVES: Translation and cultural adaptation of the PSP (Personal and Social Performance) scale, proposing a final version in Portuguese for use in Brazil. METHODS: We performed five steps: 1) translation; 2) back-translation; 3) formal assessment of semantic equivalence; 4) debriefing; 5) analysis by experts. Interrater reliability (intraclass correlation coefficient, ICC) between two raters was also measured. RESULTS: The final version was applied by two independent investigators to 18 adults with schizophrenia (DSM-IV-TR). The interrater reliability (ICC) was 0.812 (p < 0.001). CONCLUSION: The translation and adaptation of the PSP had an adequate level of semantic equivalence between the Portuguese version and the original English version. There were no difficulties related to understanding the content expressed in the translated texts and terms. Its application was easy and it showed good interrater reliability. The PSP is a valid instrument for the measurement of personal and social functioning in schizophrenia.
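For illustration of the interrater statistic reported above, the following minimal Python sketch computes a two-way random-effects, absolute-agreement, single-measures intraclass correlation, ICC(2,1). It is not the authors' code, and the two-rater ratings in the example are hypothetical.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    ratings: (n_subjects, k_raters) array of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: 18 subjects scored by 2 raters on a 0-100 scale
rng = np.random.default_rng(0)
true_scores = rng.uniform(30, 90, size=(18, 1))
scores = true_scores + rng.normal(0, 5, size=(18, 2))   # two noisy raters
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```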

Relevance: 20.00%

Publisher:

Abstract:

This study compared the effectiveness of multifocal visual evoked cortical potentials (mfVEP) elicited by pattern pulse stimulation with that of pattern reversal in producing reliable responses (signal-to-noise ratio > 1.359). Participants were 14 healthy subjects. Visual stimulation was obtained using a 60-sector dartboard display consisting of 6 concentric rings presented in either pulse or reversal mode. Each sector, consisting of 16 checks at 99% Michelson contrast and 80 cd/m² mean luminance, was controlled by a binary m-sequence in the time domain. The signal-to-noise ratio was generally larger in the pattern reversal mode than in the pattern pulse mode. The number of reliable responses was similar in the central sectors for the two stimulation modes. At the periphery, pattern reversal showed a larger number of reliable responses. Pattern pulse stimuli performed similarly to pattern reversal stimuli in generating reliable waveforms in rings R1 and R2. The advantage of using both protocols to study mfVEP responses is their complementarity: in some patients, reliable waveforms in specific sectors may be obtained with only one of the two methods. The joint analysis of pattern reversal and pattern pulse stimuli increased the rate of reliability for central sectors by 7.14% in R1, 5.35% in R2, 4.76% in R3, 3.57% in R4, 2.97% in R5, and 1.78% in R6. From R1 to R4, the reliability of generating mfVEPs was above 70% when both protocols were used. Thus, for very high reliability and a thorough examination of visual performance, the use of both stimulation protocols is recommended.
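The reliability criterion above is a signal-to-noise threshold. The sketch below assumes a common mfVEP-style definition of SNR (RMS amplitude in a signal window divided by RMS in a later noise-only window); the window limits and the simulated waveform are hypothetical, and the paper's exact computation may differ.

```python
import numpy as np

def mfvep_snr(response, fs, signal_win=(0.045, 0.150), noise_win=(0.325, 0.430)):
    """SNR of one averaged mfVEP sector response.

    response: 1-D waveform [volts]; fs: sampling rate [Hz].
    Window limits are in seconds (hypothetical values).
    """
    def rms(lo, hi):
        seg = response[int(lo * fs):int(hi * fs)]
        return np.sqrt(np.mean(seg ** 2))
    return rms(*signal_win) / rms(*noise_win)

# Hypothetical waveform: a small deflection on top of noise
fs = 1000
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(3)
wave = 2e-6 * np.exp(-((t - 0.1) / 0.02) ** 2) + rng.normal(0, 0.5e-6, t.size)
print(f"SNR = {mfvep_snr(wave, fs):.2f}  (reliable if > 1.359 by the criterion above)")
```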

Relevance: 20.00%

Publisher:

Abstract:

In deterministic optimization, the uncertainties of the structural system (i.e., dimensions, model, material, loads, etc.) are not explicitly taken into account. Hence, the resulting optimal solutions may lead to reduced reliability levels. The objective of reliability-based design optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the First Order Reliability Method (FORM) is an optimization procedure itself, RBDO (in its classical version) is a double-loop strategy: the reliability analysis (inner loop) and the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of RBDO based on FORM, several authors propose decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially, until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to obtain a single optimization loop that solves the RBDO problem. This paper presents a review of such RBDO strategies. A comparison of the performance (computational cost) of the main strategies is presented for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
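As a minimal sketch of the inner (reliability) loop mentioned above, the following Python code implements FORM via the standard HL-RF iteration for a limit state already expressed in standard normal space; the limit-state function is a hypothetical linear example, not one of the benchmark problems.

```python
import numpy as np

def form_hlrf(g, grad_g, n_vars, tol=1e-6, max_iter=100):
    """First Order Reliability Method via the HL-RF iteration.

    g, grad_g: limit-state function and its gradient in standard normal
    space (u). Returns the reliability index beta.
    """
    u = np.zeros(n_vars)                      # start at the mean point
    for _ in range(max_iter):
        grad = grad_g(u)
        # HL-RF update: project onto the linearized limit state g(u) = 0
        u_new = (grad @ u - g(u)) * grad / (grad @ grad)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)                  # beta = distance to the origin

# Hypothetical linear limit state g(u) = 3 - u1 - 0.5*u2 (already in u-space)
g = lambda u: 3.0 - u[0] - 0.5 * u[1]
grad_g = lambda u: np.array([-1.0, -0.5])
print(f"beta = {form_hlrf(g, grad_g, n_vars=2):.3f}")   # exact: 3/sqrt(1.25) ~ 2.683
```

In classical double-loop RBDO, a call like `form_hlrf` sits inside each iteration of the outer design-optimization loop, which is what makes the coupled strategy so expensive.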

Relevance: 20.00%

Publisher:

Abstract:

Because the discomfort caused by different approximal caries detection methods can influence their performance, assessing this discomfort is important. Thus, this study aimed to evaluate the discomfort reported by children after the use of different diagnostic methods to detect approximal caries lesions in primary teeth: visual inspection, bitewing radiography, laser fluorescence (DIAGNOdent pen, LFpen), and temporary separation with orthodontic rubbers. Seventy-six children aged 4 to 12 years were examined using these methods. Their discomfort was assessed using the Wong-Baker scale and compared among the methods. Visual inspection caused less discomfort than the other methods. Radiography and the LFpen presented similar levels of discomfort. Older children reported greater discomfort with temporary separation, whereas younger children reported less discomfort with the LFpen. In conclusion, the radiographic, temporary separation, and LFpen methods provoke greater discomfort than visual inspection.

Relevance: 20.00%

Publisher:

Abstract:

Brazilian design code ABNT NBR 6118:2003 - Design of Concrete Structures - Procedures [1] proposes the use of simplified models for the consideration of non-linear material behavior in the evaluation of horizontal displacements in buildings. These models penalize the stiffness of columns and beams, representing the effects of concrete cracking and avoiding costly physically non-linear analyses. The objectives of the present paper are to investigate the accuracy and uncertainty of these simplified models, as well as to evaluate the reliability of structures designed following ABNT NBR 6118:2003 [1] in the serviceability limit state for horizontal displacements. Model error statistics are obtained from 42 representative plane frames. The reliabilities of three typical (4-, 8- and 12-floor) buildings are evaluated using the simplified models and a rigorous, physically and geometrically non-linear analysis. Results show that the 70/70 (column/beam stiffness reduction) model is more accurate and less conservative than the 80/40 model. Results also show that the ABNT NBR 6118:2003 [1] design criteria for horizontal displacement limit states (masonry damage according to ACI 435.3R-68 (1984) [10]) are conservative, and result in reliability indexes larger than those recommended in EUROCODE [2] for irreversible serviceability limit states.
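As a rough illustration of how a reliability index for a horizontal-displacement serviceability limit state can be estimated once model error statistics are available, the sketch below runs a crude Monte Carlo simulation; all distributions and numerical values are hypothetical placeholders, not the statistics obtained from the 42 frames.

```python
import numpy as np
from scipy.stats import norm

# Serviceability limit state: g = delta_lim - theta * delta_model,
# where theta is a model error variable on the predicted sway.
rng = np.random.default_rng(42)
n = 1_000_000

delta_lim = 0.02                                                   # displacement limit [m]
theta = rng.lognormal(mean=np.log(1.0), sigma=0.15, size=n)        # model error (hypothetical)
delta_model = rng.lognormal(mean=np.log(0.012), sigma=0.25, size=n)  # predicted sway (hypothetical)

g = delta_lim - theta * delta_model        # g < 0 means the limit is exceeded
pf = np.mean(g < 0)
beta = -norm.ppf(pf)
print(f"pf = {pf:.4e}, beta = {beta:.2f}")
```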

Relevance: 20.00%

Publisher:

Abstract:

This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion which must be evaluated for every type of structure, especially when structures are built in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability. Therefore, by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. This process begins when a threshold chloride concentration is reached at the reinforcing steel bars. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately due to the inherent randomness of this process. In this regard, durability can be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride penetration is simulated using Fick's second law of diffusion, which represents the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Some examples are considered in order to study these phenomena, and a simplified method is proposed to determine optimal values for the concrete cover.
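A minimal sketch of the probabilistic corrosion-initiation analysis described above: the error-function solution of Fick's second law, C(x, t) = Cs * (1 - erf(x / (2*sqrt(D*t)))), is inverted to obtain the initiation time, and Monte Carlo sampling yields the probability that initiation occurs before a target service life. All random-variable parameters are hypothetical, not the paper's data.

```python
import numpy as np
from scipy.special import erfinv

rng = np.random.default_rng(1)
n = 500_000

cover = rng.normal(40e-3, 5e-3, n)                 # concrete cover [m]
D = rng.lognormal(np.log(1e-12), 0.3, n)           # diffusion coefficient [m^2/s]
Cs = rng.lognormal(np.log(3.5), 0.2, n)            # surface chloride content [% binder]
Ccr = rng.normal(0.9, 0.15, n)                     # critical chloride content [% binder]

# Initiation time from inverting the erf solution: C(cover, t_i) = Ccr
t_init = cover**2 / (4.0 * D * erfinv(1.0 - Ccr / Cs) ** 2)   # [s]
t_init_years = t_init / (365.25 * 24 * 3600)

for t in (25, 50, 75):                             # target service lives [years]
    pf = np.mean(t_init_years < t)
    print(f"P(initiation before {t} y) = {pf:.3f}")
```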

Relevance: 20.00%

Publisher:

Abstract:

This work provides a numerical and experimental investigation of fatigue crack growth behavior in steel weldments, including crack closure effects and their coupled interaction with weld strength mismatch. A central objective of this study is to extend previously developed frameworks for the evaluation of crack closure effects on FCGR to steel weldments while, at the same time, gaining additional understanding of commonly adopted criteria for crack closure loads and their influence on the fatigue life of structural welds. Very detailed non-linear finite element analyses using 3-D models of compact tension C(T) fracture specimens with center-cracked, square-groove welds provide the evolution of crack growth with cyclic stress intensity factor which is required for the estimation of the closure loads. Fatigue crack growth tests conducted on plane-sided, shallow-cracked C(T) specimens provide the necessary data against which crack closure effects on fatigue crack growth behavior can be assessed. Overall, the present investigation provides additional support for estimation procedures of plasticity-induced crack closure loads in fatigue analyses of structural steels and their weldments.
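As an illustration of how crack closure enters a fatigue life estimate, the sketch below integrates a Paris-type growth law driven by an effective stress intensity range reduced by a closure parameter; the constants, the closure ratio, and the edge-crack geometry factor are hypothetical and unrelated to the weldment tests reported above.

```python
import numpy as np

# Effective stress intensity range: dK_eff = U * dK, where U < 1 accounts
# for plasticity-induced crack closure (K below the opening load does not
# drive growth). Paris constants and U are hypothetical.
C, m = 1.0e-11, 3.0            # da/dN in m/cycle, K in MPa*sqrt(m)
U = 0.75                       # closure parameter

def grow_crack(a0, a_final, d_sigma, Y=1.12, da=1e-5):
    """Integrate da/dN = C*(U*dK)^m for an edge crack, K = Y*sigma*sqrt(pi*a)."""
    a, n_cycles = a0, 0.0
    while a < a_final:
        dK = Y * d_sigma * np.sqrt(np.pi * a)     # stress intensity range
        dadn = C * (U * dK) ** m                  # closure-corrected growth rate
        n_cycles += da / dadn
        a += da
    return n_cycles

N = grow_crack(a0=2e-3, a_final=20e-3, d_sigma=100.0)
print(f"Cycles to grow 2 mm -> 20 mm (with closure): {N:.3e}")
```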

Relevance: 20.00%

Publisher:

Abstract:

In Performance-Based Earthquake Engineering (PBEE), evaluating the seismic performance (or seismic risk) of a structure at a given site has gained major attention, especially in the past decade. One of the objectives of PBEE is to quantify the seismic reliability of a structure (due to future random earthquakes) at a site. For that purpose, Probabilistic Seismic Demand Analysis (PSDA) is used as a tool to estimate the Mean Annual Frequency (MAF) of exceeding a specified value of a structural Engineering Demand Parameter (EDP). This dissertation focuses mainly on applying the average of a number of spectral acceleration ordinates over an interval of periods, Sa,avg(T1,…,Tn), as a scalar ground motion Intensity Measure (IM) when assessing the seismic performance of inelastic structures. Since the period interval over which Sa,avg is computed reflects the greater or lesser influence of higher vibration modes on the inelastic response, it is appropriate to speak of improved IMs. The results using these improved IMs are compared with conventional elastic-based scalar IMs (e.g., pseudo-spectral acceleration, Sa(T1), or peak ground acceleration, PGA) and with the advanced inelastic-based scalar IM (i.e., inelastic spectral displacement, Sdi). The advantages of applying improved IMs are: (i) "computability" of the seismic hazard according to traditional Probabilistic Seismic Hazard Analysis (PSHA), because ground motion prediction models are already available for Sa(Ti), and hence it is possible to employ existing models to assess hazard in terms of Sa,avg; and (ii) "efficiency", i.e. smaller variability of structural response, which was minimized in order to identify the optimal period range over which to compute Sa,avg. More work is needed to also assess the desirable properties of "sufficiency" and "scaling robustness", which are disregarded in this dissertation. However, for ordinary records (i.e., without pulse-like effects), using the improved IMs is found to be more accurate than using the elastic- and inelastic-based IMs. For structural demands that are dominated by the first mode of vibration, the advantage of using Sa,avg can be negligible relative to the conventionally used Sa(T1) and the advanced Sdi. For structural demands with significant higher-mode contribution, an improved scalar IM that incorporates higher modes needs to be used. In order to fully understand the influence of the IM on the seismic risk, a simplified closed-form expression for the probability of exceeding a limit-state capacity was chosen as a reliability measure under seismic excitations and implemented for Reinforced Concrete (RC) frame structures. This closed-form expression is particularly useful for the seismic assessment and design of structures, taking into account the uncertainty in the generic variables, structural "demand" and "capacity", as well as the uncertainty in seismic excitations. The assumed framework employs nonlinear Incremental Dynamic Analysis (IDA) procedures in order to estimate the variability in the response of the structure (demand) to seismic excitations, conditioned on the IM. The estimate of the seismic risk obtained with the simplified closed-form expression is affected by the IM: the final seismic risk is not constant across IMs, although it remains of the same order of magnitude. Possible reasons concern the non-linear model assumed, or the insufficiency of the selected IM. Since it is impossible to state what the "real" probability of exceeding a limit state is by looking at the total risk, the only way forward is the optimization of the desirable properties of an IM.
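As a small illustration of the improved IM discussed above, the following sketch computes Sa,avg(T1,…,Tn) as the geometric mean of spectral ordinates over a period range, which is a common definition in the literature; the response spectrum and period range used here are hypothetical placeholders.

```python
import numpy as np

def sa_avg(spectrum, periods):
    """Geometric mean of Sa over the given periods.

    spectrum: callable T -> Sa(T) for one ground-motion record.
    """
    sa_values = np.array([spectrum(t) for t in periods])
    return np.exp(np.mean(np.log(sa_values)))

# Hypothetical smooth spectrum for illustration only
example_spectrum = lambda T: 0.8 * np.exp(-0.5 * (np.log(T / 0.5)) ** 2) + 0.05

T1 = 1.0                                        # fundamental period of the structure
periods = np.linspace(0.2 * T1, 2.0 * T1, 10)   # range covering higher modes and period lengthening
print(f"Sa,avg = {sa_avg(example_spectrum, periods):.3f} g")
```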

Relevance: 20.00%

Publisher:

Abstract:

Impairment due to narcolepsy strongly limits job performance, but there are no standard criteria to assess disability in people with narcolepsy, and a scale of disease severity is still lacking. We explored: 1) the interobserver reliability among Italian Medical Commissions making disability and handicap benefit decisions for people with narcolepsy, searching for correlations between the recognized disability degree and patients' features; 2) the willingness to report patients to the driving licence authority; and 3) possible sources of variance in judgement. Fifteen narcoleptic patients were examined by four Medical Commissions in simulated sessions. Raw agreement and interobserver reliability among Commissions were calculated for disability and handicap benefit decisions and for driving licence decisions. Levels of judgement differed on the percentage of disability (p < 0.001), the severity of handicap (p = 0.0007), and the need to inform the driving licence authority (p = 0.032). Interobserver reliability ranged from Kappa = -0.10 to Kappa = 0.35 for disability benefit decisions and from Kappa = -0.26 to Kappa = 0.36 for handicap benefit decisions. The raw agreement on driving licence decisions ranged from 73% to 100% (Kappa not calculable). Spearman's correlation between percentages of disability and patients' features showed correlations with age, daytime naps, sleepiness, cataplexy, and quality of life. This first interobserver reliability study on social benefit decisions for narcolepsy shows the difficulty of reaching agreement in this field, mainly due to variance in the interpretation of the assessment criteria. The minimum set of disease severity indicators correlating with patients' self-assessments encourages a disability classification of narcolepsy.
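For reference, the interobserver agreement statistic used above can be illustrated with a minimal Cohen's kappa computation for two raters; the study reports agreement among four Commissions, so this two-rater sketch with invented category labels is only a simplified illustration.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b, categories):
    """Cohen's kappa for two raters assigning each case to one category."""
    rater_a, rater_b = np.asarray(rater_a), np.asarray(rater_b)
    p_obs = np.mean(rater_a == rater_b)                    # observed agreement
    p_exp = sum(np.mean(rater_a == c) * np.mean(rater_b == c) for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical example: two commissions classifying 15 patients into
# handicap-severity categories (labels are placeholders)
a = ["none", "mild", "severe", "mild", "mild", "severe", "none", "mild",
     "severe", "mild", "none", "severe", "mild", "mild", "severe"]
b = ["mild", "mild", "severe", "none", "mild", "mild", "none", "mild",
     "severe", "severe", "none", "severe", "mild", "none", "severe"]
print(f"kappa = {cohens_kappa(a, b, ['none', 'mild', 'severe']):.2f}")
```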

Relevance: 20.00%

Publisher:

Abstract:

A new algorithm for evaluating the top event probability of large fault trees (FTs) is presented. This algorithm does not require any previous qualitative analysis of the FT. Indeed, its efficiency is independent of the FT logic, and it only depends on the number n of basic system components and on their failure probabilities. Our method provides exact lower and upper bounds on the top event probability by using new properties of the intrinsic order relation between binary strings. The intrinsic order enables one to select binary n-tuples with large occurrence probabilities without the need to evaluate them. This drastically reduces the complexity of the problem from exponential (2^n binary n-tuples) to linear (n Boolean variables)...
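The sketch below is not the intrinsic-order algorithm of the paper; it is only a brute-force baseline for a small hypothetical fault tree that shows the quantities involved: the exact top-event probability from all 2^n basic-event states, and crude lower/upper bounds obtained by evaluating only the k most probable states.

```python
from itertools import product

p = [0.02, 0.05, 0.01, 0.10]          # failure probabilities of basic events (hypothetical)

def top_event(x):
    """Hypothetical fault tree: TOP = (x0 AND x1) OR (x2 AND (x1 OR x3))."""
    return (x[0] and x[1]) or (x[2] and (x[1] or x[3]))

def prob(x):
    """Occurrence probability of one binary n-tuple (independent events)."""
    q = 1.0
    for xi, pi in zip(x, p):
        q *= pi if xi else (1.0 - pi)
    return q

states = sorted(product([0, 1], repeat=len(p)), key=prob, reverse=True)

exact = sum(prob(x) for x in states if top_event(x))

k = 8                                  # evaluate only the k most probable states
covered = states[:k]
lower = sum(prob(x) for x in covered if top_event(x))
upper = lower + (1.0 - sum(prob(x) for x in covered))   # unseen mass could all be failures
print(f"exact = {exact:.5f}, bounds with k={k}: [{lower:.5f}, {upper:.5f}]")
```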

Relevance: 20.00%

Publisher:

Abstract:

In this paper we present our concept of a virtual classroom for mathematics lectures in the initial phase of study. It describes the two main elements the course uses within our Learning Management System. The first element is the dynamic e-test method, and the second is interactive teaching videos. The tests are already in active use, so initial experience and feedback are described. The videos are at the prototype stage, so only the evaluation approaches for them are described.

Relevance: 20.00%

Publisher:

Abstract:

In cases of severe knee osteoarthritis causing pain, deformity, and loss of stability and mobility, clinicians consider replacing the articular surfaces with joint prostheses. The objectives pursued by this surgery are: complete pain elimination, restoration of normal physiological mobility and joint stability, and correction of all deformities and, thus, of limping. Knee surgical navigation systems have been developed in computer-aided surgery in order to improve the final surgical outcome in total knee arthroplasty. These systems provide the surgeon with quantitative and real-time information about each surgical action, such as bone cut execution and prosthesis component alignment, by means of tracking tools rigidly fixed onto the femur and the tibia. Nevertheless, there is still a margin of error due to incorrect surgical procedures and to the still limited kinematic information provided by current systems. In particular, patello-femoral joint kinematics is not considered in knee surgical navigation. It is also unclear, and thus a source of misunderstanding, what the most appropriate methodology for studying patellar motion is. In addition, the knee ligamentous apparatus is only superficially considered in navigated total knee arthroplasty, without taking into account how its physiological behavior is altered by this surgery. The aim of the present research work was to provide new functional and biomechanical assessments for the improvement of surgical navigation systems for joint replacement in the human lower limb. This was mainly realized by means of the identification and development of new techniques that allow a thorough comprehension of the functioning of the knee joint, with particular attention to the patello-femoral joint and to the main knee soft tissues. A knee surgical navigation system with active markers was used in all research activities presented in this work. Preliminary tests were performed in order to assess the system accuracy and the robustness of a number of navigation procedures. Four studies were performed in-vivo on patients requiring total knee arthroplasty and randomly implanted by means of traditional and navigated procedures in order to check the real efficacy of the latter with respect to the former. In order to assess patello-femoral joint kinematics in intact and replaced knees, twenty in-vitro tests were performed using a prototype tracking tool for the patella as well. In addition to standard anatomical and articular recommendations, original proposals for defining the patellar anatomically based reference frame and for studying patello-femoral joint kinematics were reported and used in these tests. These definitions were applied to two further in-vitro tests in which, for the first time, the implantation of the patellar component was also fully navigated. In addition, an original technique to analyze the main knee soft tissues by means of anatomically based fiber mappings was also reported and used in the same tests. The preliminary instrumental tests revealed a system accuracy within one millimeter and good inter- and intra-observer repeatability in defining all anatomical reference frames. In the in-vivo studies, the general alignments of the femoral and tibial prosthesis components and of the lower limb mechanical axis, as measured on radiographs, were more satisfactory, i.e. within ±3°, in those patients in whom total knee arthroplasty was performed with navigated procedures. As for the in-vitro tests, consistent patello-femoral joint kinematic patterns were observed across specimens throughout the knee flexion arc. Generally, the physiological patellar motion of the intact knee was not restored after implantation. This restoration was successfully achieved in the two further tests where all components, including the patellar insert, were implanted with full navigation, i.e. with intra-operative assessment of patellar component positioning in addition to general tibio-femoral and patello-femoral joint assessment. The tests assessing the behavior of the main knee ligaments revealed their complexity and the different functional roles played by the several sub-bundles composing each ligament. Also in this case, total knee arthroplasty altered the physiological behavior of these knee soft tissues. These results reveal, in-vitro, the relevance and feasibility of applying new techniques for accurate intra-operative monitoring of the knee soft tissues, patellar tracking assessment, and navigated patellar resurfacing in the context of the most modern operative techniques. This research work contributes to the much-debated knowledge of normal and replaced knee kinematics by testing the reported new methodologies. The consistency of these results provides fundamental information for the understanding and improvement of knee orthopedic treatments. In the future, the reported new techniques can be safely applied in-vivo and also adopted in other joint replacements.

Relevance: 20.00%

Publisher:

Abstract:

Next generation electronic devices have to guarantee high performance while consuming less power and remaining highly reliable across application domains ranging from entertainment to business. In this context, multicore platforms have proven to be the most efficient design choice, but new challenges have to be faced. The ever-increasing miniaturization of components produces unexpected variations in technological parameters and wear-out, characterized by soft and hard errors. Even though hardware techniques, which lend themselves to application at design time, have been studied with the objective of mitigating these effects, they are not sufficient; thus adaptive software techniques are necessary. In this thesis we focus on multicore task allocation strategies to minimize energy consumption while meeting performance constraints. We first devise a technique based on an Integer Linear Programming (ILP) formulation, which provides the optimal solution but cannot be applied on-line since the required algorithm is too time-consuming; we then propose a sub-optimal two-step technique which can be applied on-line. We demonstrate the effectiveness of the latter solution through an exhaustive comparison against the optimal solution, state-of-the-art policies, and variability-agnostic task allocations by running multimedia applications on the virtual prototype of a next generation industrial multicore platform. We also address the problem of performance and lifetime degradation. We first focus on embedded multicore platforms and propose an idleness distribution policy that increases the expected core lifetimes by duty-cycling their activity; we then investigate the use of micro thermoelectric coolers in general-purpose multicore processors to control core temperature at runtime, with the objective of meeting lifetime constraints without performance loss.
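A minimal sketch of an ILP-based task allocation of the kind mentioned above, written with the PuLP modeling library: binary variables map tasks to cores, the objective is total energy, and a per-core utilization bound stands in for the performance constraint. The task/core data are hypothetical and this is not the thesis formulation.

```python
import pulp

tasks = ["t0", "t1", "t2", "t3"]
cores = ["c0", "c1"]
# Hypothetical per-mapping energy and utilization figures
energy = {("t0","c0"): 4, ("t0","c1"): 6, ("t1","c0"): 5, ("t1","c1"): 3,
          ("t2","c0"): 2, ("t2","c1"): 4, ("t3","c0"): 7, ("t3","c1"): 5}
util   = {("t0","c0"): 0.4, ("t0","c1"): 0.3, ("t1","c0"): 0.5, ("t1","c1"): 0.6,
          ("t2","c0"): 0.2, ("t2","c1"): 0.3, ("t3","c0"): 0.4, ("t3","c1"): 0.3}

prob = pulp.LpProblem("task_allocation", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (tasks, cores), cat="Binary")   # x[t][c] = 1 if t runs on c

# Objective: total energy of the chosen mapping
prob += pulp.lpSum(energy[(t, c)] * x[t][c] for t in tasks for c in cores)

# Each task is mapped to exactly one core
for t in tasks:
    prob += pulp.lpSum(x[t][c] for c in cores) == 1

# Performance constraint: utilization of each core must not exceed 0.8
for c in cores:
    prob += pulp.lpSum(util[(t, c)] * x[t][c] for t in tasks) <= 0.8

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in tasks:
    chosen = next(c for c in cores if pulp.value(x[t][c]) > 0.5)
    print(f"{t} -> {chosen}")
print("total energy:", pulp.value(prob.objective))
```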