880 results for reliability analyses
Abstract:
Reliability analyses provide an adequate tool to account for the inherent uncertainties in geotechnical parameters. This dissertation develops a simple linearization-based approach that uses first- or second-order approximations to efficiently evaluate the system reliability of geotechnical problems. First, reliability methods are employed to analyze two tunnel design aspects: face stability and the performance of support systems. Several reliability approaches —the first-order reliability method (FORM), the second-order reliability method (SORM), the response surface method (RSM) and importance sampling (IS)— are employed, with results showing that the assumed distribution types and correlation structures for the random variables have a significant effect on the reliability results.
This emphasizes the importance of an adequate characterization of geotechnical uncertainties for practical applications. Results also show that both FORM and SORM can be used to estimate the reliability of tunnel support systems, and that SORM can outperform FORM with an acceptable additional computational effort. A linearization approach is then developed to evaluate the system reliability of series geotechnical problems. The approach only needs information provided by FORM: the vector of reliability indices of the limit state functions (LSFs) composing the system, and their correlation matrix. Two common geotechnical problems —the stability of a slope in layered soil and a circular tunnel in rock— are employed to demonstrate the simplicity, accuracy and efficiency of the suggested procedure. Advantages of the linearization approach with respect to alternative computational tools are discussed. It is also found that, if necessary, SORM —which approximates the true LSF better than FORM— can be employed to compute more accurate estimates of the system's reliability. Finally, a new approach using Genetic Algorithms (GAs) is presented to identify the fully specified representative slip surfaces (RSSs) of layered soil slopes, and such RSSs are then employed to estimate the system reliability of slopes using the proposed linearization approach. Three typical benchmark slopes with layered soils are adopted to demonstrate the efficiency, accuracy and robustness of the suggested procedure, and advantages of the proposed method with respect to alternative methods are discussed. Results show that the proposed approach provides reliability estimates that improve on previously published results, emphasizing the importance of finding good RSSs —and, especially, good (probabilistic) critical slip surfaces that might be non-circular— to obtain accurate estimates of the reliability of soil slope systems.
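The linearization approach summarized above can be sketched directly: for a series system whose linearized LSFs have FORM reliability indices β and correlation matrix R, the probability that all components are safe is the multivariate normal CDF Φ(β; R), so the system failure probability is approximately 1 - Φ(β; R). A minimal sketch of this idea (the two-component β and R values are illustrative, not taken from the dissertation):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def series_system_pf(beta, corr):
    """Approximate failure probability of a series system from the
    FORM reliability indices and their correlation matrix."""
    beta = np.asarray(beta, dtype=float)
    # P(all components safe) = Phi_k(beta; R) for the linearized margins
    p_safe = multivariate_normal(mean=np.zeros(len(beta)), cov=corr).cdf(beta)
    return 1.0 - p_safe

beta = [2.5, 3.0]                       # illustrative component indices
R = np.array([[1.0, 0.6], [0.6, 1.0]])  # correlation between linearized LSFs
pf = series_system_pf(beta, R)
```

For a two-component series system the result must lie between the largest component failure probability and the sum of the two (the Bonferroni bounds), which gives a quick sanity check on the computation.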
Abstract:
The authors investigated the extent to which the joint-attention behaviors of gaze following, social referencing, and object-directed imitation were related to each other and to infants' vocabulary development in a sample of 60 infants between the ages of 8 and 14 months. Joint-attention skills and vocabulary development were assessed in a laboratory setting. Split-half reliability analyses of the joint-attention measures indicated that the tasks reliably assessed infants' capabilities. In the main analysis, no significant correlations were found among the joint-attention behaviors; the only significant relationship was between gaze following and the number of names in infants' productive vocabularies. The overall pattern of results did not replicate results of previous studies (e.g., M. Carpenter, K. Nagell, & M. Tomasello, 1998) that found relationships between various emerging joint-attention behaviors.
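The split-half reliability reported above is conventionally computed by correlating odd- and even-item halves and stepping the correlation up with the Spearman-Brown formula; a minimal sketch under that assumption (the item scores below are invented for illustration, not data from the study):

```python
def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_half_reliability(scores):
    """Correlate odd- and even-item totals, then apply the
    Spearman-Brown correction for full test length."""
    odd = [sum(items[0::2]) for items in scores]
    even = [sum(items[1::2]) for items in scores]
    r = pearson(odd, even)
    return 2 * r / (1 + r)  # Spearman-Brown step-up

# each row: one infant's item scores on a joint-attention task (invented)
data = [[1, 2, 1, 2], [2, 3, 2, 2], [0, 1, 1, 1], [3, 3, 2, 3], [1, 1, 0, 1]]
rel = split_half_reliability(data)
```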
Abstract:
BACKGROUND: Patient behavior accounts for half or more of the variance in health, disease, mortality and treatment outcome and costs. Counseling using motivational interviewing (MI) effectively improves the substance use and medical compliance behavior of patients. Medical training should include substantial focus on this key issue of health promotion. The objective of the study is to test the efficacy of teaching MI to medical students. METHODS: Thirteen fourth-year medical students volunteered to participate. Seven days before and after an 8-hour interactive MI training workshop, each student performed a video-recorded interview with two standardized patients: a 60 year-old alcohol dependent female consulting a primary care physician for the first time about fatigue and depression symptoms; and a 50 year-old male cigarette smoker hospitalized for myocardial infarction. All 52 videos (13 students×2 interviews before and after training) were independently coded by two blinded clinicians using the Motivational Interviewing Training Integrity (MITI, 3.0). MITI scores consist of global spirit (Evocation, Collaboration, Autonomy/Support), global Empathy and Direction, and behavior count summary scores (% Open questions, Reflection to question ratio, % Complex reflections, % MI-adherent behaviors). A "beginning proficiency" threshold (BPT) is defined for each of these 9 scores. The proportion of students reaching BPT before and after training was compared using McNemar exact tests. Inter-rater reliability was evaluated by comparing double coding, and test-retest analyses were conducted on a sub-sample of 10 consecutive interviews by each coder. Weighted Kappas were used for global rating scales and intra-class correlations (ICC) were computed for behavior count summary scores. 
RESULTS: The percent of counselors reaching BPT before and after MI training increased significantly for Evocation (15% to 65%, p<.001), Collaboration (27% to 77%, p=.001), Autonomy/Support (15% to 54%, p=.006), and % Open questions (4% to 38%, p=.004). Proportions increased, but were not statistically significant for Empathy (38% to 58%, p=.18), Reflection to question ratio (0% to 15%, p=.12), % Complex reflection (35% to 54%, p=.23), and % MI-adherent behaviors (8% to 15%, p=.69). There was virtually no change for the Direction scale (92% to 88%, p=1.00). The reliability analyses produced mixed results. Weighted kappas for inter-rater reliability ranged from .14 for Direction to .51 for Collaboration, and from .27 for Direction to .80 for Empathy for test-retest. ICCs ranged from .20 for Complex reflections to .89 for Open questions (inter-rater), and from .67 for Complex reflections to .99 for Reflection to question ratio (test-retest). CONCLUSION: This pilot study indicates that a single 8-hour training in motivational interviewing for voluntary fourth-year medical students results in significant improvement of some MI skills. A larger sample of randomly selected medical students observed over longer periods should be studied to test if MI training generalizes to medical students. Inter-rater reliability and test-retest findings indicate a need for caution when interpreting the present results, as well as for more intensive training to help appropriately capture more dimensions of the process in future studies.
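The weighted kappa used above for the global rating scales can be computed from the two coders' cross-tabulated ratings; a minimal sketch with linear weights (the rating vectors are invented for illustration, not MITI data):

```python
import numpy as np

def weighted_kappa(r1, r2, n_levels, weights="linear"):
    """Weighted Cohen's kappa for two raters on an ordinal scale 0..n_levels-1."""
    obs = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # expected agreement table under independence of the two raters
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    i, j = np.indices((n_levels, n_levels))
    d = np.abs(i - j)
    w = d / (n_levels - 1) if weights == "linear" else (d / (n_levels - 1)) ** 2
    return 1.0 - (w * obs).sum() / (w * exp).sum()

rater1 = [4, 3, 5, 2, 4, 3, 5, 1]  # invented global ratings on a 1-5 scale
rater2 = [4, 4, 5, 2, 3, 3, 4, 2]
kappa = weighted_kappa([x - 1 for x in rater1], [x - 1 for x in rater2], 5)
```

Perfect agreement yields kappa = 1, and disagreements one scale point apart are penalised less than distant ones, which is why weighted kappa suits ordinal scales like the MITI global ratings.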
Abstract:
The application of the three-voltage-level 20/1/0.4 kV distribution system in Finland has proved to be an economic solution to enhance the reliability of electricity distribution. By using the 1 kV voltage level between the medium- and low-voltage networks, the improvement in reliability can be reached especially in aerial-line networks. Considerable savings in investment and outage costs can also be achieved compared to the traditional distribution system. This master's thesis focuses on describing the situation in Russian distribution networks and subsequently analyses the possibility of applying the 1000 V distribution system in Russia. The goal is to investigate, on the basis of Finnish experience, whether there are possible installation targets in Russia for the new system. Compatibility with Russian safety and quality standards is also studied in this thesis.
Abstract:
Strategic development of distribution networks plays a key role in asset management in electricity distribution companies. Owing to the capital-intensive nature of the field and the long time span of operations, the significance of a strategy is emphasised. A well-devised strategy combines awareness of the challenges posed by the operating environment with the future targets of the distribution company. Economic regulation, ageing infrastructure, scarcity of resources and tightening supply requirements, together with challenges created by climate change, put pressure on the strategy work. On the other hand, technology development related to network automation and underground cabling assists in answering these challenges. This dissertation aims at developing process knowledge and establishing a methodological framework by which key issues related to network development can be addressed. Moreover, the work develops tools by which the effects of changes in the operating environment on the distribution business can be analysed in the strategy work. To this end, the work discusses certain characteristics of the distribution business and describes the strategy process at a general level. Further, the work defines the subtasks in the strategy process and presents the key elements in the strategy work and long-term network planning. The work delineates the factors having either a direct or indirect effect on strategic planning and development needs in the networks; in particular, outage costs constitute an important part of the economic regulation of the distribution business, reliability thus being a key driver in network planning. The dissertation describes the methodology and tools applied to cost and reliability analyses in the strategy work.
The work focuses on determining the techno-economic feasibility of different network development technologies; these feasibility surveys are linked to the economic regulation model of the distribution business, in particular from the viewpoint of reliability of electricity supply and allowed return. The work introduces the asset management system developed for research purposes and to support the strategy work, the calculation elements of the system, and the initial data used in the network analysis. The key elements of this asset management system are utilised in the dissertation. Finally, the study addresses the stages of strategic decision-making and the compilation of investment strategies. Further, the work illustrates the implementation of strategic planning in an actual distribution company environment.
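Feasibility surveys of the kind described above typically reduce to comparing life-cycle present values of investment plus discounted expected outage costs; a hedged sketch of that comparison (the cost figures, interest rate, and horizon are illustrative assumptions, not values from the dissertation):

```python
def life_cycle_cost(investment, annual_outage_cost, rate, years):
    """Present value of an initial investment plus discounted
    expected annual outage costs over the planning horizon."""
    pv_outage = sum(annual_outage_cost / (1 + rate) ** t
                    for t in range(1, years + 1))
    return investment + pv_outage

# illustrative comparison: underground cabling vs. overhead line (per km)
overhead = life_cycle_cost(investment=40_000, annual_outage_cost=3_000,
                           rate=0.05, years=40)
cabled = life_cycle_cost(investment=70_000, annual_outage_cost=500,
                         rate=0.05, years=40)
```

Under these assumed figures the higher cabling investment is offset by lower outage costs over the 40-year horizon, which is the kind of trade-off the regulation model makes visible.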
Abstract:
The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model, named the generalized BS (GBS) distribution, was derived based on symmetrical distributions on the real line. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators for GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated by using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study for evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed.
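The gbs package itself is written in R, but the base BS distribution underlying it is available in SciPy as `fatiguelife`, which can reproduce the kinds of probabilistic and reliability indicators mentioned above; a sketch (the shape and scale values are illustrative):

```python
from scipy.stats import fatiguelife

alpha, beta = 0.5, 1.0            # BS shape and scale (illustrative values)
bs = fatiguelife(alpha, loc=0, scale=beta)

pdf_at_scale = bs.pdf(beta)       # density indicator
reliability = bs.sf(2.0)          # reliability (survival) function at t = 2
sample = bs.rvs(size=1000, random_state=42)  # random number generation

# maximum likelihood fit to the simulated data, analogous to the gbs package
alpha_hat, _, beta_hat = fatiguelife.fit(sample, floc=0)
```

A useful check: the BS median equals the scale parameter, since the CDF argument vanishes at t = β.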
Abstract:
Birnbaum-Saunders (BS) models have been widely applied in material fatigue studies and reliability analyses to relate the total time until failure to some type of cumulative damage. In many problems in the medical field, such as chronic cardiac diseases and different types of cancer, cumulative damage caused by several risk factors might cause some degradation that leads to a fatigue process. In these cases, BS models can be suitable for describing the propagation lifetime. However, since the cumulative damage is assumed to be normally distributed in the BS distribution, the parameter estimates from this model can be sensitive to outlying observations. In order to attenuate this influence, we present in this paper BS models in which a Student-t distribution is assumed to explain the cumulative damage. In particular, we show that the maximum likelihood estimates of the Student-t log-BS models attribute smaller weights to outlying observations, which produces robust parameter estimates. Some inferential results are also presented. In addition, a diagnostics analysis is derived based on local influence and on deviance component and martingale-type residuals. Finally, a motivating example from the medical field is analyzed using log-BS regression models. Since the parameter estimates appear to be very sensitive to outlying and influential observations, the Student-t log-BS regression model should attenuate such influences. The model checking methodologies developed in this paper are used to compare the fitted models.
Abstract:
OBJECTIVES Cognitive fluctuation (CF) is a common feature of dementia and a core diagnostic symptom for dementia with Lewy bodies (DLB). CF remains difficult to accurately and reliably detect clinically. This study aimed to develop a psychometric test that could be used by clinicians to facilitate the identification of CF and improve the recognition and diagnosis of DLB and Parkinson disease, and to improve differential diagnosis of other dementias. METHODS We compiled a 17-item psychometric test for identifying CF and applied this measure in a cross-sectional design. Participants were recruited from the North East of England, and assessments were made in individuals' homes. We recruited people with four subtypes of dementia and a healthy comparison group, and all subjects were administered this pilot scale together with other standard ratings. The psychometric properties of the scale were examined with exploratory factor analysis. We also examined the ability of individual items to identify CF to discriminate between dementia subtypes. The sensitivity and specificity of discriminating items were explored along with validity and reliability analyses. RESULTS Participants comprised 32 comparison subjects, 30 people with Alzheimer disease, 30 with vascular dementia, 29 with DLB, and 32 with dementia associated with Parkinson disease. Four items significantly discriminated between dementia groups and showed good levels of sensitivity (range: 78.6%-80.3%) and specificity (range: 73.9%-79.3%). The scale had very good levels of test-retest (Cronbach's alpha: 0.82) and interrater (0.81) reliabilities. The four items loaded onto three different factors. These items were: 1) marked differences in functioning during the daytime; 2) daytime somnolence; 3) daytime drowsiness; and 4) altered levels of consciousness during the day. 
CONCLUSIONS We identified four items that provide valid, sensitive, and specific questions for reliably identifying CF and distinguishing the Lewy body dementias from other major causes of dementia (Alzheimer disease and vascular dementia).
Abstract:
The aim of this study was to test a newly developed LED-based fluorescence device for approximal caries detection in vitro. We assembled 120 extracted molars without frank cavitations or fillings pairwise in order to create contact areas. The teeth were independently assessed by two examiners using visual caries detection (International Caries Detection and Assessment System, ICDAS), bitewing radiography (BW), laser fluorescence (LFpen), and LED fluorescence (Midwest Caries I.D., MW). The measurements were repeated at least 1 week later. The diagnostic performance was calculated with Bayesian analyses. Post-test probabilities were calculated in order to judge the diagnostic performance of combined methods. Reliability analyses were performed using kappa statistics for nominal data and intraclass correlation (ICC) for absolute data. Histology served as the gold standard. Sensitivities/specificities at the enamel threshold were 0.33/0.84 for ICDAS, 0.23/0.86 for BW, 0.47/0.78 for LFpen, and 0.32/0.87 for MW. Sensitivities/specificities at the dentine threshold were 0.04/0.89 for ICDAS, 0.27/0.94 for BW, 0.39/0.84 for LFpen, and 0.07/0.96 for MW. Reliability data were fair to moderate for MW and good for BW and LFpen. The combination of ICDAS and radiography yielded the best diagnostic performance (post-test probability of 0.73 at the dentine threshold). The newly developed LED device cannot be recommended for approximal caries detection: there might be too much signal loss during signal transduction from the occlusal aspect to the proximal lesion site and back.
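The post-test probabilities reported above follow from Bayes' rule via likelihood ratios; a minimal sketch using the dentine-threshold sensitivity and specificity reported for bitewing radiography (the 40% pretest probability is an assumed illustration, not a value from the study):

```python
def posttest_probability(pretest, sensitivity, specificity):
    """Update a pretest disease probability after a positive test result."""
    lr_positive = sensitivity / (1.0 - specificity)  # positive likelihood ratio
    pre_odds = pretest / (1.0 - pretest)
    post_odds = pre_odds * lr_positive               # Bayes' rule in odds form
    return post_odds / (1.0 + post_odds)

# bitewing radiography at the dentine threshold (sens/spec from the study)
p = posttest_probability(pretest=0.40, sensitivity=0.27, specificity=0.94)
```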
Abstract:
The purpose of this study was to investigate a selection of children's historical nonfiction literature for evidence of coherence. Although research has been conducted on the coherence of textbook material and its influence on comprehension, there has been limited study of coherence in children's nonfiction literature. Generally, textual coherence has been seen as critical to the comprehensibility of content area textbooks because it concerns the unity of connections among ideas and information. Disciplinary coherence concerns the extent to which authors of historical text show readers how historians think and write. Since young readers are apprentices in learning historical content and conventions of historical thinking, evidence of disciplinary coherence is significant in nonfiction literature for young readers. The sample of the study contained 32 books published between 1989 and 2000, ranging in length from less than 90 pages to more than 150 pages. Content analysis was the quantitative research technique used to measure 84 variables of textual and disciplinary coherence in three passages of each book, as proportions of the total number of words for each book. Reliability analyses and an examination of 750 correlations showed the extent to which variables were related in the books. Three important findings emerged from the study that should be considered in the selection and use of children's historical nonfiction literature in classrooms. First, characteristics of coherence are significantly related in high-quality nonfiction literature. Second, shorter books have a higher proportion of textual coherence than longer books as measured in three passages. Third, the presence of the author is related to characteristics of coherence throughout the books. The findings show that nonfiction literature offers students content that researchers have found textbooks lack.
Both younger and older students have the opportunity to learn the conventions of historical thinking as they learn content through nonfiction literature. Further, the children's literature, represented in the Orbis Pictus list, shows students that authors select, interpret, and question information, and give other interpretations. The implications of the study for teaching history, teacher preparation in content and literacy, school practices, children's librarians, and publishers of children's nonfiction are discussed.
Abstract:
Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, such as highways and bridges. The corrosion process is accompanied by an increase in volume of the corrosion products at the rebar-concrete interface. Depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete, which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chloride is the local breakdown of the passive layer formed in the highly alkaline environment of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover has occurred. The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface to facilitate their use by researchers even if their background is not in numerical simulation. For the evaluation of the initiation period, different tools have been developed. Program TavProbabilidade provides means to carry out a probability analysis of a chloride ingress model; such a tool is necessary due to the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion.
It differs from the deterministic approach because it computes not just a chloride profile at a certain age, but a range of chloride profiles for each probability of occurrence. Program TavProbabilidade_Fiabilidade carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration at the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion is going to begin or not, but quantifies the probability of corrosion initiation. Program TavDif_1D was created to perform a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the different FEM solvers already developed in one dimension, the decision to create a new code (TavDif_1D) was taken because of the need for a solver with a friendly interface for pre- and post-processing according to the needs of IETCC. An innovative tool was also developed with a systematic method devised to compare the ability of the different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and also to quantify the degree of agreement of the models with each other. For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple both service life periods: initiation and propagation. The 2D program (TavDif_2D) allows the complementary use of two external programs in a unique friendly interface: GMSH, a finite element mesh generator and post-processing viewer, and OOFEM, a finite element solver.
This program (TavDif_2D) is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module as a function of the chloride concentration and the corrosion parameters (Icorr, etc.). The program is also responsible for verifying the presence and degree of fracture in each element, in order to update the diffusion coefficient as a function of the crack width. The advantages of the FEM with the interface provided by the tool are: the flexibility to input data such as material properties and boundary conditions as time-dependent functions; the flexibility to predict the chloride concentration profile for different geometries; and the possibility to couple chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage). The OOFEM code had to be modified to accept temperature, humidity and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3D simulation has been performed to simulate the behavior of a beam under both the action of the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the curve of the deflection of the central region of the beam versus the external load and compare it with the experimental data.
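The initiation-stage tools described above build on Fick's second law, whose classical error-function solution for a semi-infinite medium with constant surface concentration underlies most chloride-ingress models; a minimal sketch computing a chloride profile and the corresponding corrosion initiation time (the surface concentration, diffusion coefficient, cover depth, and critical content are illustrative values, not data from this research):

```python
from math import sqrt, erfc
from scipy.special import erfcinv

def chloride(x_m, t_s, c_surface, diff):
    """Chloride content at depth x and time t for a semi-infinite medium
    with constant surface concentration: C = Cs * erfc(x / (2*sqrt(D*t)))."""
    return c_surface * erfc(x_m / (2.0 * sqrt(diff * t_s)))

def initiation_time(cover_m, c_surface, c_critical, diff):
    """Time for the critical chloride content to reach the rebar depth,
    obtained by inverting the erfc profile at x = cover."""
    return (cover_m / (2.0 * sqrt(diff) * erfcinv(c_critical / c_surface))) ** 2

Cs, Ccrit = 0.5, 0.05   # surface and critical contents, % by binder mass
D = 1e-12               # m^2/s, a typical order of magnitude for concrete
cover = 0.05            # 50 mm cover depth

t_init = initiation_time(cover, Cs, Ccrit, D)  # seconds until depassivation
```

The probabilistic tools described above would wrap such a deterministic profile in a Monte Carlo loop over uncertain Cs, D, cover and Ccrit.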
Abstract:
Objective: Expectancies about the outcomes of alcohol consumption are widely accepted as important determinants of drinking. This construct is increasingly recognized as a significant element of psychological interventions for alcohol-related problems. Much effort has been invested in producing reliable and valid instruments to measure this construct for research and clinical purposes, but very few have had their factor structure subjected to adequate validation. Among them, the Drinking Expectancies Questionnaire (DEQ) was developed to address some theoretical and design issues with earlier expectancy scales. Exploratory factor analyses, in addition to validity and reliability analyses, were performed when the original questionnaire was developed. The object of this study was to undertake a confirmatory analysis of the factor structure of the DEQ. Method: Confirmatory factor analysis through LISREL 8 was performed using a randomly split sample of 679 drinkers. Results: Results suggested that a new 5-factor model, which differs slightly from the original 6-factor version, was a more robust measure of expectancies. A new method of scoring the DEQ consistent with this factor structure is presented. Conclusions: The present study shows more robust psychometric properties of the DEQ using the new factor structure.
Abstract:
The objectives of this thesis were threefold: (1) to review the concept of attributional style, (2) to demonstrate its applicability to affiliative behavior, and (3) to document the existence of actual and perceived sex differences in attributional style for affiliative behavior. To fulfill the first two objectives, the development of attributional theory was traced from the Abramson, Seligman, and Teasdale (1978) presentation of the reformulated learned helplessness model, through Weiner's (1979) examination of attributional style as a motivational feature of achievement behavior, to the application of attribution theory to affiliative behavior. To fulfill the third objective, the evidence detailing the sex differences in achievement and affiliative attributional styles was reviewed within the framework of perceptions of sex-appropriate behavior. A study was then designed to assess both actual and perceived affiliative attributional sex differences. The Escovar, Brown, and Rodriguez Attributional Style Questionnaire for Affiliative Behavior (EBR-ASQ) was administered to 107 male and female University of Miami and Florida International University students. Each subject answered the questionnaire twice, once for themselves and once as if they were a member of the opposite sex. The results indicated that the EBR-ASQ maintained previous levels of internal consistency and reliability. Analyses of the covariate, order of perspective presentation, were negative; all further analyses were performed without a covariate. The data were analyzed using a 2 (Sex) x 2 (Perspective) x 2 (Outcome) factorial, multivariate, repeated measures design with the three attributional dimensions serving as the repeated dependent measures. As expected, all multivariate tests revealed that each of the three factors was a significant influence on all three of the dependent variables.
Of the 21 univariate tests, 12 of the main effects and two-way interactions were significant, and one approached significance. Examination of the means revealed that, of the eight significant main effects, six were in the expected direction; of the four significant two-way interactions, three were in the expected direction. Although the results were not totally supportive of the hypotheses, they did support the thesis that affiliation is the female sex-role-appropriate analogue to male achievement behavior.
Abstract:
Objectives To examine the extent of multiplicity of data in trial reports and to assess the impact of multiplicity on meta-analysis results. Design Empirical study on a cohort of Cochrane systematic reviews. Data sources All Cochrane systematic reviews published from issue 3 in 2006 to issue 2 in 2007 that presented a result as a standardised mean difference (SMD). We retrieved trial reports contributing to the first SMD result in each review, and downloaded review protocols. We used these SMDs to identify a specific outcome for each meta-analysis from its protocol. Review methods Reviews were eligible if SMD results were based on two to ten randomised trials and if protocols described the outcome. We excluded reviews if they only presented results of subgroup analyses. Based on review protocols and index outcomes, two observers independently extracted the data necessary to calculate SMDs from the original trial reports for any intervention group, time point, or outcome measure compatible with the protocol. From the extracted data, we used Monte Carlo simulations to calculate all possible SMDs for every meta-analysis. Results We identified 19 eligible meta-analyses (including 83 trials). Published review protocols often lacked information about which data to choose. Twenty-four (29%) trials reported data for multiple intervention groups, 30 (36%) reported data for multiple time points, and 29 (35%) reported the index outcome measured on multiple scales. In 18 meta-analyses, we found multiplicity of data in at least one trial report; the median difference between the smallest and largest SMD results within a meta-analysis was 0.40 standard deviation units (range 0.04 to 0.91). Conclusions Multiplicity of data can affect the findings of systematic reviews and meta-analyses. To reduce the risk of bias, reviews and meta-analyses should comply with prespecified protocols that clearly identify time points, intervention groups, and scales of interest.
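The SMD at the centre of the review above is a mean difference divided by the pooled standard deviation; a minimal sketch with Hedges' small-sample correction (the two outcome samples are invented for illustration, not trial data):

```python
from statistics import mean, variance

def smd(group1, group2):
    """Standardised mean difference (Cohen's d) with Hedges' correction."""
    n1, n2 = len(group1), len(group2)
    # pooled variance across the two groups
    pooled_var = (((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2))
                  / (n1 + n2 - 2))
    d = (mean(group1) - mean(group2)) / pooled_var ** 0.5
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)  # small-sample correction
    return d * j

treated = [13, 14, 12, 15, 14, 13]  # invented outcome scores
control = [10, 12, 11, 13, 12, 11]
g = smd(treated, control)
```

The multiplicity problem the review documents arises because this calculation can be repeated for every compatible intervention group, time point, and outcome scale, each choice yielding a different SMD.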