866 results for random number generation


Relevance: 80.00%

Publisher:

Abstract:

This thesis is devoted to the study of some stochastic models in inventories. An inventory system is a facility at which items of material are stocked. To promote the smooth and efficient running of a business, and to provide adequate service to customers, an inventory of materials is essential for any enterprise. When uncertainty is present, inventories are used as protection against the risk of stock-out. It is advantageous to procure an item before it is needed, at a lower marginal cost, and bulk purchasing allows price discounts to be exploited. All of these contribute to the formation of inventory. Maintaining inventories is a major expenditure for any organization. For each inventory, the fundamental questions are how much new stock should be ordered and when the orders should be placed. The present study considers several models for single- and two-commodity stochastic inventory problems. The thesis discusses two models. In the first model we examine the case in which the times elapsed between two consecutive demand points are independent and identically distributed with a common distribution function F(.) and finite mean, and in which the demand magnitude depends only on the time elapsed since the previous demand epoch; the time between disasters has an exponential distribution. In Model II, the interarrival times of disasters have a general distribution F(.) with finite mean and the quantity destroyed depends on the time elapsed between disasters, while demands form a compound Poisson process with interarrival times of finite mean. The thesis also deals with a linearly correlated bulk-demand two-commodity inventory problem, in which each arrival demands a random number of items of each commodity C1 and C2, the maximum quantities demanded being a (< S1) and b (< S2), respectively.
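
As a rough illustration of the kind of single-commodity system studied here, the sketch below simulates an (s,S) inventory with i.i.d. demand interarrival times and random demand sizes, replenishing instantly up to S whenever the level falls to s or below. The distributions, parameter values and the zero lead time are illustrative assumptions and not the models analysed in the thesis.

    # Minimal (s,S) inventory simulation with renewal demand arrivals.
    # All distributions and parameters are illustrative assumptions.
    import random

    def simulate_sS(s=20, S=100, horizon=10_000.0, seed=1):
        random.seed(seed)
        t, level = 0.0, S
        stockouts, orders = 0, 0
        while True:
            t += random.expovariate(1.0)        # i.i.d. interarrival times (here exponential)
            if t > horizon:
                break
            demand = random.randint(1, 10)      # random demand magnitude at each epoch
            if demand > level:
                stockouts += 1                  # demand (partially) lost
            level = max(level - demand, 0)
            if level <= s:                      # reorder point reached:
                level = S                       # replenish up to S (zero lead time assumed)
                orders += 1
        return orders, stockouts

    if __name__ == "__main__":
        orders, stockouts = simulate_sS()
        print(f"replenishments: {orders}, stock-out epochs: {stockouts}")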

Relevance: 80.00%

Publisher:

Abstract:

This thesis, entitled Analysis of Some Stochastic Models in Inventories and Queues, is devoted to the study of some stochastic models in inventories and queues which are physically realizable, though complex. It contains a detailed analysis of the basic stochastic processes underlying these models. The thesis studies (s,S) inventory systems with non-identically distributed interarrival demand times and random lead times, state-dependent demands, varying ordering levels, and perishable commodities with exponential lifetimes. The queueing systems analysed include the Ek/Ga,b/1 queue with server vacations, service systems with single and batch services, queueing systems with phase-type arrival and service processes, and the finite-capacity M/G/1 queue in which the server goes on vacation after serving a random number of customers. The analogy between queueing systems and inventory systems can be exploited in solving certain models. In vacation models, one important result is the stochastic decomposition property of the system size or waiting time, which one could extend to the transient case. In inventory theory, the present study can be extended to multi-item, multi-echelon problems. The study of the perishable inventory problem when the commodities have a general lifetime distribution would also be quite interesting.
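
The stochastic decomposition property mentioned above can be illustrated numerically. The sketch below simulates an M/M/1 queue under a multiple-vacation discipline (exponential vacation lengths) and compares the mean waiting time with the classical decomposition E[W] = E[W_{M/M/1}] + E[V^2]/(2E[V]); the parameter values and the specific vacation rule are illustrative assumptions, not the Ek/Ga,b/1 or M/G/1 models analysed in the thesis.

    # M/M/1 queue with multiple server vacations: simulated mean wait
    # versus the stochastic decomposition formula. Parameters are illustrative.
    import random

    def simulate(lam=0.6, mu=1.0, v_mean=0.5, n=200_000, seed=2):
        random.seed(seed)
        next_free = 0.0          # time at which the server can start the next service
        t = 0.0                  # arrival clock
        total_wait = 0.0
        for _ in range(n):
            t += random.expovariate(lam)
            if t >= next_free:   # queue emptied earlier; server has been taking vacations
                while next_free <= t:
                    next_free += random.expovariate(1.0 / v_mean)
            total_wait += next_free - t          # waiting time in queue
            next_free += random.expovariate(mu)  # service time
        return total_wait / n

    if __name__ == "__main__":
        lam, mu, v_mean = 0.6, 1.0, 0.5
        sim = simulate(lam, mu, v_mean)
        theory = lam / (mu * (mu - lam)) + v_mean  # E[W_{M/M/1}] + E[V^2]/(2E[V]) for exp. vacations
        print(f"simulated mean wait: {sim:.3f}, decomposition formula: {theory:.3f}")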

Relevance: 80.00%

Publisher:

Abstract:

A new fast stream cipher, MAJE4, is designed and developed with a variable key size of 128 or 256 bits. The randomness of the key stream is analysed using statistical tests, and the performance of the cipher is evaluated in comparison with another fast stream cipher, JEROBOAM. The focus is on generating a long, unpredictable key stream with better performance, which can be used in cryptographic applications.
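
The abstract does not list the statistical tests used; as one hedged example, the sketch below applies the standard monobit (frequency) test to a bit stream, the kind of check commonly run on a cipher's key stream. The MAJE4 generator itself is not reproduced here, so an ordinary library PRNG stands in for it.

    # Monobit (frequency) randomness test applied to a bit stream.
    # A library PRNG stands in for the cipher key stream under test.
    import math
    import random

    def monobit_p_value(bits):
        # Map bits to +/-1, sum, normalise, and compute the two-sided p-value.
        s = sum(1 if b else -1 for b in bits)
        s_obs = abs(s) / math.sqrt(len(bits))
        return math.erfc(s_obs / math.sqrt(2.0))

    if __name__ == "__main__":
        random.seed(42)
        keystream_bits = [random.getrandbits(1) for _ in range(1_000_000)]
        p = monobit_p_value(keystream_bits)
        print(f"monobit p-value: {p:.4f}  ({'pass' if p >= 0.01 else 'fail'} at the 1% level)")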

Relevance: 80.00%

Publisher:

Abstract:

The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model, the generalized BS (GBS) distribution, was derived based on symmetrical distributions on the real line. The R package gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators for GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in the package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed. (C) 2008 Elsevier B.V. All rights reserved.
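
The abstract does not reproduce the gbs functions themselves; as a hedged sketch of the idea behind a BS random number generator, the standard transformation of a standard normal variate can be used: if Z ~ N(0,1), then T = beta*(alpha*Z/2 + sqrt((alpha*Z/2)^2 + 1))^2 follows a BS(alpha, beta) distribution. The Python version below is illustrative and is not the R implementation in the gbs package.

    # Birnbaum-Saunders random numbers via the standard normal transformation:
    # T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2,  Z ~ N(0, 1).
    # Illustrative sketch; not the gbs R package implementation.
    import numpy as np

    def rbs(n, alpha, beta, seed=None):
        rng = np.random.default_rng(seed)
        x = 0.5 * alpha * rng.standard_normal(n)
        return beta * (x + np.sqrt(x**2 + 1.0))**2

    if __name__ == "__main__":
        sample = rbs(100_000, alpha=0.5, beta=2.0, seed=123)
        # For BS(alpha, beta): E[T] = beta * (1 + alpha**2 / 2)
        print(f"sample mean: {sample.mean():.3f}, theoretical mean: {2.0 * (1 + 0.5**2 / 2):.3f}")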

Relevance: 80.00%

Publisher:

Abstract:

Background: Studies evaluating the acceptability of simplified follow-up after medical abortion have focused on high-resource or urban settings where telephones, road connections and modes of transport are available and where women have formal education. Objective: To investigate women's acceptability of home-assessment of abortion and whether acceptability of medical abortion differs between in-clinic and home-assessment of abortion outcome in a low-resource setting in India. Design: Secondary outcome of a randomised, controlled, non-inferiority trial. Setting: Outpatient primary health care clinics in rural and urban Rajasthan, India. Population: Women were eligible if they sought abortion at a gestation of up to 9 weeks, lived within the defined study area and agreed to follow-up. Women were ineligible if they had known contraindications to medical abortion, haemoglobin < 85 g/L or were below 18 years of age. Methods: Abortion outcome assessment through routine clinic follow-up by a doctor was compared with home-assessment using a low-sensitivity pregnancy test and a pictorial instruction sheet. A computerized random number generator produced the randomisation sequence (1:1) in blocks of six. Research assistants randomly allocated eligible women who opted for medical abortion (mifepristone and misoprostol), using opaque sealed envelopes. Blinding during outcome assessment was not possible. Main outcome measures: Women's acceptability of home-assessment was measured as future preference of follow-up. Overall satisfaction, expectations, and comparison with previous abortion experiences were compared between study groups. Results: 731 women were randomized to the clinic follow-up group (n = 353) or the home-assessment group (n = 378). 623 (85%) women were successfully followed up; of those, 597 (96%) were satisfied and 592 (95%) found the abortion better than or as expected, with no difference between study groups. The majority, 355 (57%) women, preferred home-assessment in the event of a future abortion. Significantly more women in the home-assessment group, 284 (82%), preferred home-assessment in the future, compared with 188 (70%) of women in the clinic follow-up group who preferred clinic follow-up in the future (p < 0.001). Conclusion: Home-assessment is highly acceptable among women in low-resource and rural settings. The choice to follow up an early medical abortion according to women's preference should be offered to foster women's reproductive autonomy.
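
As a hedged sketch of the allocation procedure described (a computer-generated 1:1 sequence in blocks of six), the snippet below shuffles balanced blocks with a pseudo-random number generator; the group labels and block count are illustrative and the trial's actual software is not reproduced.

    # 1:1 block randomisation in blocks of six, as an illustrative sketch.
    import random

    def block_randomisation(n_blocks=10, block_size=6, groups=("clinic", "home"), seed=2024):
        rng = random.Random(seed)
        sequence = []
        for _ in range(n_blocks):
            block = list(groups) * (block_size // len(groups))  # 3 of each group per block
            rng.shuffle(block)                                  # random order within the block
            sequence.extend(block)
        return sequence

    if __name__ == "__main__":
        seq = block_randomisation()
        print(seq[:12])
        print("allocated to home:", seq.count("home"), "of", len(seq))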

Relevance: 80.00%

Publisher:

Abstract:

The objective is to analyse the relationship between risk and the number of stocks in a portfolio for an individual investor when stocks are chosen by a "naive strategy". To this end, we carried out an experiment in which individuals selected stocks in order to reproduce this relationship. 126 participants were told that the risk of the first choice would be the average of the standard deviations of all portfolios consisting of a single asset, and that the same procedure would be used for portfolios composed of two, three, and so on, up to 30 stocks. They selected the assets they wanted in their portfolios without the support of any financial analysis. For comparison, we also ran a hypothetical simulation of 126 investors who selected shares from the same universe by means of a random number generator, so that each real participant is matched with a hypothetical random investor facing the same opportunity. Patterns were observed in the portfolios of individual participants, characterizing the risk curves for the components of the samples. Because these groupings are somewhat arbitrary, a more objective measure of behaviour was used: a simple linear regression for each participant, predicting the variance of the portfolio as a function of the number of assets. In addition, we ran a pooled cross-section regression on all observations. The expected pattern holds on average but not for most individuals, many of whom effectively "de-diversify" when adding seemingly random securities. Furthermore, the results are slightly worse using a random number generator. This finding challenges the belief that only a small number of securities is needed for diversification and shows that the result applies only to large samples. The implications are important, since many individual investors hold few stocks in their portfolios.
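
The relationship studied here can be reproduced numerically. The sketch below draws random equally weighted portfolios of k = 1..30 stocks from a simulated return universe and records the average portfolio standard deviation for each k; the return model and parameters are illustrative assumptions, not the experimental data.

    # Average portfolio risk versus number of stocks under the naive (1/k) strategy,
    # with stocks picked by a random number generator. Return model is illustrative.
    import numpy as np

    def naive_diversification(n_assets=100, n_obs=250, max_k=30, n_trials=500, seed=7):
        rng = np.random.default_rng(seed)
        # One common factor plus idiosyncratic noise, so stocks are positively correlated.
        market = rng.normal(0.0, 0.01, size=n_obs)
        returns = market[:, None] + rng.normal(0.0, 0.02, size=(n_obs, n_assets))
        avg_risk = []
        for k in range(1, max_k + 1):
            risks = []
            for _ in range(n_trials):
                picks = rng.choice(n_assets, size=k, replace=False)  # random selection of k stocks
                portfolio = returns[:, picks].mean(axis=1)           # equal weights
                risks.append(portfolio.std(ddof=1))
            avg_risk.append(np.mean(risks))
        return avg_risk

    if __name__ == "__main__":
        curve = naive_diversification()
        for k in (1, 5, 10, 30):
            print(f"k={k:2d}  average std dev: {curve[k - 1]:.4f}")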

Relevance: 80.00%

Publisher:

Abstract:

Psyllium is a rich source of mucilaginous soluble fibre and is considered a useful dietary supplement in the treatment of patients with hypercholesterolaemia. The aim of this study was to evaluate the efficacy and safety of psyllium supplementation in reducing the lipid profile of dyslipidaemic Brazilian children and adolescents. Fifty-five subjects (6-19 years) with moderate hypercholesterolaemia were evaluated in a parallel, double-blind, controlled, randomised clinical trial conducted in two periods. Initially, all recruited participants underwent a six-week adaptation period, before treatment, on a diet restricted in saturated fat (<7%) and cholesterol (<200 mg/day). After this period, eligible participants were randomly allocated to two groups (control n=25 and psyllium n=30) using a computer-generated randomised number sequence. During the eight-week trial, the psyllium group maintained the diet restricted in saturated fat and cholesterol, supplemented daily with 7.0 g of psyllium, while the control group received the same diet with an equivalent amount of cellulose (placebo). At the end of the treatment, four subjects were excluded after randomisation (losses to follow-up), leaving 51 subjects (control group n=24; psyllium group n=27) who completed the study. The psyllium group showed a significant decrease in total cholesterol (TC) (4.1% [-0.20 mmol/L]; p=0.01) and LDL-cholesterol (LDL-c) (7.2% [-0.24 mmol/L]; p<0.001) concentrations compared with baseline. Additional reductions were observed in comparison with the control group (TC: 4.1% [0.20 mmol/L]; p=0.002; LDL-c: 7.8% [0.26 mmol/L]; p=0.007). None of the participants reported aversion to the smell, taste or texture of psyllium, nor any significant adverse effects. Psyllium therapy was effective in reducing LDL-c concentrations and proved to be safe and acceptable to the study population.

Relevance: 80.00%

Publisher:

Abstract:

The identification and description of the lithological characteristics of a formation are essential to the evaluation of complex formations. To this end, combinations of nuclear tools have been used systematically in uncased boreholes. The resulting logs can be viewed as the interaction of two distinct phases: • a transport phase, in which the radiation travels from the source through the formation to one or more detectors; • a detection phase, which consists of collecting the radiation, converting it into current pulses and, finally, building the spectral distribution of these pulses. Since the presence of the detector does not strongly affect the result of the radiation transport, each phase can be simulated independently of the other, which allows a new type of modelling that decouples the two phases. In this work, the final response is simulated by combining numerical transport solutions with a library of detector response functions, for different incident energies and for each specific source-detector arrangement. The radiation transport is computed with a finite-element (FEM) algorithm, as a 2½-D scalar flux obtained from the numerical solution of the multigroup diffusion approximation of the Boltzmann transport equation in phase space, the so-called P1 approximation, in which the angular variable is expanded in orthogonal Legendre polynomials. This reduces the dimensionality of the problem, making it more compatible with the FEM algorithm, where the flux depends exclusively on the spatial variable and on the physical properties of the formation. The response function of the NaI(Tl) detector is obtained independently by the Monte Carlo (MC) method, in which the life of a particle inside the scintillator crystal is reconstructed by simulating, interaction by interaction, the position, direction and energy of the different particles with the help of random numbers associated with appropriate probability laws. The possible interaction types (Rayleigh scattering, photoelectric effect, Compton scattering and pair production) are sampled in the same way. The simulation is completed when the detector response functions are convolved with the scalar flux, producing as the final response the pulse-height spectrum of the modelled system. In this spectrum, sets of channels called detection windows are selected. The count rates in each window depend in different ways on the electron density and on the lithology, which makes it possible to combine these windows to determine the density and the photoelectric absorption factor of the formations. With the methodology developed, logs could be simulated for both thick- and thin-layer models. The performance of the method was tested in complex formations, mainly those in which the presence of clay minerals, feldspar and mica produced effects strong enough to perturb the final tool response. The results showed that formations with densities between 1.8 and 4.0 g/cm3 and photoelectric absorption factors in the range 1.5 to 5 barns/e- had their physical and lithological characteristics perfectly identified. The concentrations of potassium, uranium and thorium could be obtained by introducing a new calibration system, capable of correcting the effects of the high variances and negative correlations observed mainly in the computation of the uranium and potassium mass concentrations.
In the simulation of the CNL sonde response, using Tittle's polynomial regression algorithm, it was found that, because of the tool's limited vertical resolution, layers thinner than the spacing between the source and the far detector had their apparent porosity measured incorrectly, since Tittle's algorithm applies exclusively to thick layers. To correct this error, a method was developed that takes into account a contribution factor determined by the relative area of each layer within the zone of maximum information. The porosity at each subsurface point could then be determined by convolving these factors with the local porosity indices, while treating each layer as thick enough to satisfy Tittle's algorithm. Finally, the additional limitations imposed by the presence of perturbing minerals were resolved by assuming the formation to be composed of a fully water-saturated base mineral, with the remaining components treated as perturbations of this base case. These results make it possible to compute synthetic well logs, which can be used in inversion schemes aimed at a more detailed quantitative evaluation of complex formations.
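
The Monte Carlo step described above, in which each photon interaction type is drawn with the help of random numbers and the associated probability laws, can be illustrated as follows; the cross-section values are placeholders and not data from this work.

    # Sampling the photon interaction type (Rayleigh, photoelectric, Compton, pair
    # production) from relative cross sections, as done interaction by interaction in a
    # Monte Carlo detector simulation. Cross-section values are illustrative placeholders.
    import random

    def sample_interaction(cross_sections, rng):
        total = sum(cross_sections.values())
        u = rng.uniform(0.0, total)          # one uniform random number per interaction
        running = 0.0
        for name, sigma in cross_sections.items():
            running += sigma
            if u <= running:
                return name
        return name                          # numerical safety for u == total

    if __name__ == "__main__":
        rng = random.Random(0)
        sigma = {"photoelectric": 0.05, "Compton": 0.80, "Rayleigh": 0.10, "pair production": 0.05}
        counts = {k: 0 for k in sigma}
        for _ in range(100_000):
            counts[sample_interaction(sigma, rng)] += 1
        print(counts)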

Relevance: 80.00%

Publisher:

Abstract:

Spatial selectivity for colour has been investigated with invasive and non-invasive electrophysiological methods and with psychophysical methods. In non-invasive visual cortical electrophysiology, this topic has been studied using conventional methods of periodic stimulation and response extraction by simple averaging. Newer methods of stimulation (pseudorandom presentation) and of non-invasive cortical response extraction (cross-correlation) have been developed but had not yet been used to investigate the chromatic spatial selectivity of cortical responses. This work aimed to introduce this new pseudorandom electrophysiological method to study chromatic spatial selectivity. Fourteen trichromats and 16 colour-deficient subjects with normal or corrected visual acuity were evaluated. Colour vision was characterised with the HMC anomaloscope and the Ishihara plates to establish the presence of trichromacy. Red-green sinusoidal gratings subtending 8° of visual angle were used at 8 spatial frequencies between 0.2 and 10 cpd. The stimulus was temporally modulated by a binary m-sequence in a pattern-reversal presentation mode. The VERIS system was used to extract the first and second slices of the second-order kernel (K2.1 and K2.2, respectively). After fitting the spatial-frequency responses with a difference-of-Gaussians function, the optimal spatial frequency and the band of frequencies with amplitudes above ¾ of the maximum amplitude of the function were extracted as indicators of the spatial selectivity of the function. Chromatic visual acuity was also estimated by fitting a linear function to the amplitude data from the spatial frequency of the amplitude peak up to the highest spatial frequency tested. In trichromats, chromatic responses with different spatial selectivities were found in K2.1 and K2.2. The negative components of K2.1 and K2.2 showed band-pass tuning, while the positive component of K2.1 showed low-pass tuning. The visual acuity estimated from all the components studied was close to that found by Mullen (1985) and Kelly (1983). Different cellular components may be contributing to the generation of the pseudorandom VECP. This new method is a candidate to become an important tool for the non-invasive assessment of human colour vision.
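
The binary m-sequence used to modulate the stimulus is typically produced by a maximal-length linear feedback shift register; the small sketch below generates one such sequence (degree 5, period 31) as an illustration, far shorter than the sequences used by stimulation systems such as VERIS.

    # Binary m-sequence from a maximal-length LFSR (degree 5, taps 5 and 3, period 31).
    # Illustrative only; stimulation systems use much longer sequences.
    def m_sequence(degree=5, taps=(5, 3), length=31):
        state = [1] * degree                     # any nonzero initial state
        out = []
        for _ in range(length):
            out.append(state[-1])                # output bit is the last stage
            feedback = 0
            for t in taps:
                feedback ^= state[t - 1]         # XOR of the tapped stages
            state = [feedback] + state[:-1]      # shift, insert feedback
        return out

    if __name__ == "__main__":
        seq = m_sequence()
        print("".join(map(str, seq)))
        print("ones:", sum(seq), "zeros:", len(seq) - sum(seq))  # 16 ones, 15 zeros per period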

Relevance: 80.00%

Publisher:

Abstract:

Background: Diabetes is associated with long-term damage, dysfunction and failure of various organs, especially the eyes, kidneys, nerves, heart and blood vessels. The risk of developing type 2 diabetes increases with age, obesity and lack of physical activity. Insulin resistance is a fundamental aspect of the aetiology of type 2 diabetes. Insulin resistance has been shown to be associated with atherosclerosis, dyslipidaemia, glucose intolerance, hyperuricaemia, hypertension and polycystic ovary syndrome. The mineral zinc plays a key role in the synthesis and action of insulin, both physiologically and in diabetes mellitus. Zinc seems to stimulate insulin action and insulin receptor tyrosine kinase activity. Objectives: To assess the effects of zinc supplementation for the prevention of type 2 diabetes mellitus in adults with insulin resistance. Search methods: This review is an update of a previous Cochrane systematic review published in 2007. We searched the Cochrane Library (2015, Issue 3), MEDLINE, EMBASE, LILACS and the ICTRP trial register (from inception to March 2015). There were no language restrictions. We conducted citation searches and screened reference lists of included studies. Selection criteria: We included studies if they had a randomised or quasi-randomised design and if they investigated zinc supplementation compared with placebo or no intervention in adults with insulin resistance living in the community. Data collection and analysis: Two review authors selected relevant trials, assessed risk of bias and extracted data. Main results: We included three trials with a total of 128 participants in this review. The duration of zinc supplementation ranged between four and 12 weeks. Risk of bias was unclear for most studies regarding selection bias (random sequence generation, allocation concealment) and detection bias (blinding of outcome assessment). No study reported on our key outcome measures (incidence of type 2 diabetes mellitus, adverse events, health-related quality of life, all-cause mortality, diabetic complications, socioeconomic effects). Evaluation of insulin resistance as measured by the Homeostasis Model Assessment of Insulin Resistance (HOMA-IR) showed neutral effects when comparing zinc supplementation with control (two trials; 114 participants). There were neutral effects for trials comparing zinc supplementation with placebo for total cholesterol, high-density lipoprotein (HDL) cholesterol, low-density lipoprotein (LDL) cholesterol and triglycerides (2 studies, 70 participants). The one trial comparing zinc supplementation with exercise also showed neutral effects for total cholesterol, HDL and LDL cholesterol, and a mean difference in triglycerides of -30 mg/dL (95% confidence interval (CI) -49 to -10) in favour of zinc supplementation (53 participants). Various surrogate laboratory parameters were also analysed in the included trials. Authors' conclusions: There is currently no evidence on which to base the use of zinc supplementation for the prevention of type 2 diabetes mellitus. Future trials should investigate patient-important outcome measures such as incidence of type 2 diabetes mellitus, health-related quality of life, diabetic complications, all-cause mortality and socioeconomic effects.

Relevance: 80.00%

Publisher:

Abstract:

We investigate the nonequilibrium roughening transition of a one-dimensional restricted solid-on-solid model by directly sampling the stationary probability density of a suitable order parameter as the surface adsorption rate varies. The shapes of the probability density histograms suggest a typical Ginzburg-Landau scenario for the phase transition of the model, and estimates of the "magnetic" exponent seem to confirm its mean-field critical behavior. We also found that the flipping times between the metastable phases of the model scale exponentially with the system size, signaling the breaking of ergodicity in the thermodynamic limit. Incidentally, we discovered that a closely related model not considered before also displays a phase transition with the same critical behavior as the original model. Our results support the usefulness of off-critical histogram techniques in the investigation of nonequilibrium phase transitions. We also briefly discuss in the appendix a good and simple pseudo-random number generator used in our simulations.
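
The appendix's pseudo-random number generator is not reproduced in this abstract; as a hedged stand-in, the sketch below shows a 64-bit xorshift generator of the simple, fast kind often used in lattice simulations (the constants follow Marsaglia's xorshift64 and are illustrative, not necessarily the generator used in the paper).

    # Marsaglia-style xorshift64: a simple, fast pseudo-random number generator of the
    # kind often used in Monte Carlo lattice simulations. Not the generator from the paper.
    MASK64 = (1 << 64) - 1

    def xorshift64(seed=88172645463325252):
        x = seed & MASK64
        while True:
            x ^= (x << 13) & MASK64
            x ^= x >> 7
            x ^= (x << 17) & MASK64
            yield x / 2**64          # uniform deviate in [0, 1)

    if __name__ == "__main__":
        rng = xorshift64()
        sample = [next(rng) for _ in range(100_000)]
        print(f"mean: {sum(sample) / len(sample):.4f}  (should be close to 0.5)")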

Relevance: 80.00%

Publisher:

Abstract:

In this treatise we consider finite systems of branching particles where the particles move independently of each other according to d-dimensional diffusions. Particles are killed at a position-dependent rate, leaving at their death position a random number of descendants according to a position-dependent reproduction law. In addition, particles immigrate at a constant rate (one immigrant per immigration time). A process with the above properties is called a branching diffusion with immigration (BDI). In the first part we present the model in detail and discuss the properties of the BDI under our basic assumptions. In the second part we consider the problem of reconstructing the trajectory of a BDI from discrete observations. We observe the positions of the particles at discrete times; in particular, we assume that we have no information about the pedigree of the particles. A natural question arises if we want to apply statistical procedures to the discrete observations: how can we find pairs of particle positions which belong to the same particle? We give an easy-to-implement 'reconstruction scheme' which allows us to redraw, or 'reconstruct', parts of the trajectory of the BDI with high accuracy. Moreover, asymptotically the whole path can be reconstructed. Further, we present simulations which show that our partial reconstruction rule is tractable in practice. In the third part we study how the partial reconstruction rule fits into statistical applications. As an extensive example we present a nonparametric estimator for the diffusion coefficient of a BDI where the particles move according to one-dimensional diffusions. This estimator is based on the Nadaraya-Watson estimator for the diffusion coefficient of one-dimensional diffusions and uses the partial reconstruction rule developed in the second part. We prove a rate of convergence for this estimator and finally present simulations which show that the estimator works well even if we leave our set of assumptions.
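
The reconstruction question posed above, matching particle positions observed at consecutive discrete times without pedigree information, can be illustrated with a simple greedy nearest-neighbour pairing; this sketch only conveys the idea and is not the reconstruction scheme developed in the thesis.

    # Greedy nearest-neighbour pairing of particle positions observed at two
    # consecutive times. Illustrative only; not the thesis' reconstruction scheme.
    def match_positions(prev, curr):
        """Return a list of (i, j) pairs linking prev[i] to curr[j]."""
        pairs = []
        unused = set(range(len(curr)))
        used_prev = set()
        # Consider all cross distances, closest first.
        candidates = sorted(
            (abs(p - c), i, j) for i, p in enumerate(prev) for j, c in enumerate(curr)
        )
        for dist, i, j in candidates:
            if i in used_prev or j not in unused:
                continue
            pairs.append((i, j))
            used_prev.add(i)
            unused.discard(j)
        return pairs

    if __name__ == "__main__":
        prev = [0.10, 0.55, 1.30]           # positions at time t
        curr = [0.12, 1.27, 0.60, 2.40]     # positions at time t + delta (one extra particle)
        print(match_positions(prev, curr))  # [(0, 0), (2, 1), (1, 2)]; curr[3] stays unmatched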

Relevance: 80.00%

Publisher:

Abstract:

The idea of balancing the resources spent in the acquisition and encoding of natural signals strictly to their intrinsic information content has driven nearly a decade of research under the name of compressed sensing. In this doctoral dissertation we develop some extensions and improvements upon this technique's foundations, by modifying the random sensing matrices on which the signals of interest are projected to achieve different objectives. Firstly, we propose two methods for the adaptation of sensing matrix ensembles to the second-order moments of natural signals. These techniques leverage the maximisation of different proxies for the quantity of information acquired by compressed sensing, and are efficiently applied in the encoding of electrocardiographic tracks with minimum-complexity digital hardware. Secondly, we focus on the possibility of using compressed sensing as a method to provide a partial, yet cryptanalysis-resistant form of encryption; in this context, we show how a random matrix generation strategy with a controlled amount of perturbations can be used to distinguish between multiple user classes with different quality of access to the encrypted information content. Finally, we explore the application of compressed sensing in the design of a multispectral imager, by implementing an optical scheme that entails a coded aperture array and Fabry-Pérot spectral filters. The signal recoveries obtained by processing real-world measurements show promising results that leave room for improvement in the sensing matrix calibration of the devised imager.
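
As a minimal illustration of the compressed sensing setting this dissertation builds on, the sketch below projects a sparse signal onto a random Gaussian sensing matrix and recovers it with orthogonal matching pursuit; the dimensions, sparsity and recovery algorithm are illustrative choices, not the adapted or perturbed matrix ensembles developed in the thesis.

    # Compressed sensing toy example: random Gaussian sensing matrix and
    # orthogonal matching pursuit (OMP) recovery. Dimensions are illustrative.
    import numpy as np

    def omp(A, y, k):
        """Recover a k-sparse x from y = A @ x by orthogonal matching pursuit."""
        residual, support = y.copy(), []
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))   # column most correlated with residual
            support.append(j)
            coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coeffs
        x_hat = np.zeros(A.shape[1])
        x_hat[support] = coeffs
        return x_hat

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, m, k = 256, 80, 8                      # signal length, measurements, sparsity
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
        A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
        y = A @ x                                 # compressed measurements
        x_hat = omp(A, y, k)
        print(f"relative recovery error: {np.linalg.norm(x - x_hat) / np.linalg.norm(x):.2e}")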

Relevance: 80.00%

Publisher:

Abstract:

We consider systems of finitely many particles, where the particles move independently of each other according to one-dimensional diffusions dX_t = b(X_t) dt + σ(X_t) dW_t. Particles die at position-dependent rates and leave behind a random number of offspring, which are distributed in space according to a transition kernel. In addition, new particles immigrate at a constant rate. A process with these properties is called a branching process with immigration. If we observe such a process at discrete points in time, it is not immediately obvious which discretely observed points belong to which path. We therefore develop an algorithm to reconstruct the underlying path. Using this algorithm, we construct a nonparametric estimator for the squared diffusion coefficient σ²(·), where the construction essentially rests on filling in a classical regression scheme. We prove consistency and a central limit theorem.
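
A hedged sketch of the type of nonparametric estimator described, applied here to a single discretely observed diffusion rather than to reconstructed branching-particle paths, is the kernel (Nadaraya-Watson-type) estimator of σ²(x) built from squared increments; the diffusion model, bandwidth and kernel below are illustrative choices.

    # Kernel (Nadaraya-Watson-type) estimator of the squared diffusion coefficient
    # sigma^2(x) from discrete observations of a single diffusion path.
    # The model sigma(x) = 0.3 + 0.2*sin(x), bandwidth and kernel are illustrative.
    import numpy as np

    def simulate_path(n=200_000, delta=1e-3, seed=5):
        rng = np.random.default_rng(seed)
        x = np.empty(n)
        x[0] = 0.0
        for i in range(n - 1):                      # Euler-Maruyama: dX = -X dt + sigma(X) dW
            sig = 0.3 + 0.2 * np.sin(x[i])
            x[i + 1] = x[i] - x[i] * delta + sig * np.sqrt(delta) * rng.standard_normal()
        return x, delta

    def sigma2_hat(x, delta, grid, bandwidth=0.1):
        incr2 = np.diff(x) ** 2 / delta             # squared rescaled increments
        est = []
        for g in grid:
            w = np.exp(-0.5 * ((x[:-1] - g) / bandwidth) ** 2)   # Gaussian kernel weights
            est.append(np.sum(w * incr2) / np.sum(w))
        return np.array(est)

    if __name__ == "__main__":
        x, delta = simulate_path()
        grid = np.array([-0.5, 0.0, 0.5])
        for g, s2 in zip(grid, sigma2_hat(x, delta, grid)):
            print(f"x = {g:+.1f}: estimate {s2:.3f}, true {(0.3 + 0.2*np.sin(g))**2:.3f}")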

Relevance: 80.00%

Publisher:

Abstract:

Published evidence suggests that aspects of trial design lead to biased intervention effect estimates, but findings from different studies are inconsistent. This study combined data from 7 meta-epidemiologic studies and removed overlaps to derive a final data set of 234 unique meta-analyses containing 1973 trials. Outcome measures were classified as "mortality," "other objective," or "subjective," and Bayesian hierarchical models were used to estimate associations of trial characteristics with average bias and between-trial heterogeneity. Intervention effect estimates seemed to be exaggerated in trials with inadequate or unclear (vs. adequate) random-sequence generation (ratio of odds ratios, 0.89 [95% credible interval {CrI}, 0.82 to 0.96]) and with inadequate or unclear (vs. adequate) allocation concealment (ratio of odds ratios, 0.93 [CrI, 0.87 to 0.99]). Lack of or unclear double-blinding (vs. double-blinding) was associated with an average of 13% exaggeration of intervention effects (ratio of odds ratios, 0.87 [CrI, 0.79 to 0.96]), and between-trial heterogeneity was increased for such studies (SD increase in heterogeneity, 0.14 [CrI, 0.02 to 0.30]). For each characteristic, average bias and increases in between-trial heterogeneity were driven primarily by trials with subjective outcomes, with little evidence of bias in trials with objective and mortality outcomes. This study is limited by incomplete trial reporting, and findings may be confounded by other study design characteristics. Bias associated with study design characteristics may lead to exaggeration of intervention effect estimates and increases in between-trial heterogeneity in trials reporting subjectively assessed outcomes.