953 results for Random Number Generation
Abstract:
This paper presents a new framework for studying irreversible (dis)investment when a market follows a random number of random-length cycles (such as a high-tech product market). It is assumed that a firm facing such market evolution is always unsure about whether the current cycle is the last one, although it can update its beliefs about the probability of facing a permanent decline by observing that no further growth phase arrives. We show that the existence of regime shifts in fluctuating markets suffices for an option value of waiting to (dis)invest to arise, and we provide a marginal interpretation of the optimal (dis)investment policies, absent in the real options literature. The paper also shows that, although the stochastic process of the underlying variable has a continuous sample path, the discreteness of the regime changes implies that the sample path of the firm's value experiences jumps whenever the regime switches suddenly, irrespective of whether the firm is active or not.
Abstract:
Background: Although randomized clinical trials (RCTs) are considered the gold standard of evidence, their reporting is often suboptimal. Trial registries have the potential to contribute important methodologic information for critical appraisal of study results. Methods and Findings: The objective of the study was to evaluate the reporting of key methodologic study characteristics in trial registries. We identified a random sample (n = 265) of actively recruiting RCTs using the World Health Organization International Clinical Trials Registry Platform (ICTRP) search portal in 2008. We assessed the reporting of relevant domains from the Cochrane Collaboration's 'Risk of bias' tool and other key methodological aspects. Our primary outcomes were the proportion of registry records with adequate reporting of random sequence generation, allocation concealment, blinding, and trial outcomes. Two reviewers independently assessed each record. Weighted overall proportions in the ICTRP search portal for adequate reporting of sequence generation, allocation concealment, blinding (including and excluding open-label RCTs), and primary outcomes were 5.7% (95% CI 3.0–8.4%), 1.4% (0–2.8%), 41% (35–47%), 8.4% (4.1–13%), and 66% (60–72%), respectively. The proportion of adequately reported RCTs was higher for registries that used specific methodological fields for describing methods of randomization and allocation concealment than for registries that did not. Concerning other key methodological aspects, weighted overall proportions of RCTs with adequately reported items were as follows: eligibility criteria (81%), secondary outcomes (46%), harm (5%), follow-up duration (62%), description of the interventions (53%), and sample size calculation (1%). Conclusions: Trial registries currently contain limited methodologic information about registered RCTs.
In order to permit adequate critical appraisal of trial results reported in journals and registries, trial registries should consider requesting details on key RCT methods to complement journal publications. Full protocols remain the most comprehensive source of methodologic information and should be made publicly available.
Abstract:
The purpose of this master's thesis was to perform simulations involving the use of random numbers while testing hypotheses, especially on two sample populations compared by their means, variances, or Sharpe ratios. Specifically, we simulated some well-known distributions in Matlab and checked the accuracy of hypothesis testing. Furthermore, we went deeper and checked what happens once the bootstrapping method described by Efron is applied to the simulated data. In addition, the robust Sharpe-ratio hypothesis test stated in the paper of Ledoit and Wolf was applied to measure the statistical significance of the performance difference between two investment funds, based on testing whether there is a statistically significant difference between their Sharpe ratios. We collected much literature on our topic and generated as many simulated random numbers in Matlab as possible to carry out our purpose. As a result, we came to a good understanding that tests are not always accurate; for instance, when testing whether two normally distributed random vectors come from the same normal distribution, the Jarque-Bera test for normality showed that for the normal random vectors r1 and r2, only 94.7% and 95.7% respectively were identified as coming from a normal distribution, while 5.3% and 4.3% failed to show the truth already known; but when we introduced Efron's bootstrapping method for estimating the p-values on which the hypothesis decision is based, the test was 100% accurate. From the above results, we conclude that bootstrapping methods should always be considered when testing or estimating statistics, because in most cases the outcomes are accurate and computational errors are minimized.
The robust Sharpe-ratio test, which is known to use one of the bootstrapping methods (the studentised one), was applied first on different simulated data, covering distributions of many kinds and shapes, and secondly on real data from hedge and mutual funds. The test performed quite well, agreeing with the existence of a statistically significant difference between their Sharpe ratios, as described in the paper of Ledoit and Wolf.
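The bootstrap procedure applied in the thesis can be sketched in a few lines. The following Python sketch (not the thesis's Matlab code; the return series, seed, and resample count are illustrative assumptions) estimates a two-sided bootstrap p-value for the hypothesis that two return series have equal Sharpe ratios:

```python
import random
import statistics

random.seed(42)

def sharpe(xs):
    # Sample Sharpe ratio: mean return over sample standard deviation.
    return statistics.mean(xs) / statistics.stdev(xs)

def bootstrap_pvalue(r1, r2, n_boot=2000):
    """Two-sided bootstrap p-value for H0: Sharpe(r1) == Sharpe(r2),
    resampling each series with replacement (Efron-style bootstrap)."""
    observed = sharpe(r1) - sharpe(r2)
    diffs = []
    for _ in range(n_boot):
        s1 = random.choices(r1, k=len(r1))
        s2 = random.choices(r2, k=len(r2))
        diffs.append(sharpe(s1) - sharpe(s2))
    # Centre the bootstrap distribution so it mimics the null hypothesis.
    centre = statistics.mean(diffs)
    return sum(abs(d - centre) >= abs(observed) for d in diffs) / n_boot

# Two simulated return series with equal true Sharpe ratios.
r1 = [random.gauss(0.01, 0.05) for _ in range(120)]
r2 = [random.gauss(0.01, 0.05) for _ in range(120)]
p = bootstrap_pvalue(r1, r2)
```

Since both series share the same true Sharpe ratio here, the p-value should usually be large; the studentised variant of Ledoit and Wolf additionally scales the difference by a bootstrap estimate of its standard error.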
Abstract:
This thesis is devoted to the study of some stochastic models in inventories. An inventory system is a facility at which items of materials are stocked. In order to promote the smooth and efficient running of a business, and to provide adequate service to customers, an inventory of materials is essential for any enterprise. When uncertainty is present, inventories are used as a protection against the risk of stock-out. It is advantageous to procure an item before it is needed, at a lower marginal cost; moreover, bulk purchasing secures the advantage of price discounts. All of these contribute to the formation of inventory. Maintaining inventories is a major expenditure for any organization. For each inventory, the fundamental questions are how much new stock should be ordered and when the orders should be placed. The present study considers several models for single- and two-commodity stochastic inventory problems. The thesis discusses two models. In the first model, we examine the case in which the times elapsed between two consecutive demand points are independent and identically distributed with common distribution function F(.) and finite mean, and in which the demand magnitude depends only on the time elapsed since the previous demand epoch; the time between disasters has an exponential distribution. In Model II, the interarrival times of disasters have a general distribution with finite mean, and the quantity destroyed depends on the time elapsed between disasters; demands form a compound Poisson process. The thesis also deals with a linearly correlated bulk-demand two-commodity inventory problem, where each arrival demands a random number of items of each commodity C1 and C2, the maximum quantity demanded being a (< S1) and b(
Abstract:
This thesis, entitled Analysis of Some Stochastic Models in Inventories and Queues, is devoted to the study of some stochastic models in inventories and queues which are physically realizable, though complex. It contains a detailed analysis of the basic stochastic processes underlying these models. In this thesis, (s,S) inventory systems with non-identically distributed interarrival demand times and random lead times, state-dependent demands, varying ordering levels, and perishable commodities with exponential lifetimes have been studied. The queueing system of the type Ek/Ga,b/1 with server vacations, service systems with single and batch services, queueing systems with phase-type arrival and service processes, and the finite-capacity M/G/1 queue in which the server goes on vacation after serving a random number of customers are also analysed. The analogy between queueing systems and inventory systems could be exploited in solving certain models. In vacation models, one important result is the stochastic decomposition property of the system size or waiting time; one can think of extending this to the transient case. In inventory theory, the present study can be extended to multi-item, multi-echelon problems. The study of the perishable inventory problem when the commodities have a general lifetime distribution would be quite an interesting problem.
Abstract:
A new fast stream cipher, MAJE4, is designed and developed with a variable key size of 128 or 256 bits. The randomness of the cipher's key stream is analysed using statistical tests. The performance of the stream cipher is evaluated in comparison with another fast stream cipher called JEROBOAM. The focus is to generate a long, unpredictable key stream with better performance, which can be used in cryptographic applications.
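As an illustration of the kind of statistical test used to analyse a key stream's randomness, here is a sketch of the classic frequency (monobit) test from the NIST SP 800-22 suite. The MAJE4 cipher itself is not reproduced here, so a Python PRNG stands in for its key stream:

```python
import math
import random

def monobit_test(bits):
    """Frequency (monobit) test: p-value for the hypothesis that ones
    and zeros are equally likely in the bit stream."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # +1 for a one, -1 for a zero
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))  # p-value from the normal tail

# A Python PRNG stream stands in for a cipher key stream here.
random.seed(1)
stream = [random.getrandbits(1) for _ in range(10000)]
p = monobit_test(stream)  # small p-values indicate bias
```

A heavily biased stream (for example, all ones) yields a p-value near zero and fails the test, while a balanced stream passes.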
Abstract:
The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model was derived based on symmetrical distributions on the real line, named the generalized BS (GBS) distribution. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators for GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed.
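The random number generators in gbs are written in R, but the underlying mechanism is the classical normal representation of the BS distribution: if Z ~ N(0,1), then T = β[αZ/2 + √((αZ/2)² + 1)]² follows BS(α, β). A minimal Python sketch (the parameter values are illustrative):

```python
import math
import random

def rbs(n, alpha, beta, seed=None):
    """Draw n Birnbaum-Saunders(alpha, beta) variates via the normal
    representation T = beta * (a*Z/2 + sqrt((a*Z/2)**2 + 1))**2."""
    rng = random.Random(seed)
    sample = []
    for _ in range(n):
        w = alpha * rng.gauss(0.0, 1.0) / 2.0
        sample.append(beta * (w + math.sqrt(w * w + 1.0)) ** 2)
    return sample

# Shape alpha and scale beta chosen only for illustration.
sample = rbs(10000, alpha=0.5, beta=2.0, seed=7)
```

A quick sanity check on any such generator: the median of BS(α, β) equals β, so the sample median should sit close to the scale parameter.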
Abstract:
Background: Studies evaluating the acceptability of simplified follow-up after medical abortion have focused on high-resource or urban settings where telephones, road connections, and modes of transport are available and where women have formal education. Objective: To investigate women's acceptability of home-assessment of abortion, and whether acceptability of medical abortion differs between in-clinic and home-assessment of abortion outcome, in a low-resource setting in India. Design: Secondary outcome of a randomised, controlled, non-inferiority trial. Setting: Outpatient primary health care clinics in rural and urban Rajasthan, India. Population: Women were eligible if they sought abortion with a gestation up to 9 weeks, lived within the defined study area, and agreed to follow-up. Women were ineligible if they had known contraindications to medical abortion, haemoglobin < 85 mg/l, or were below 18 years. Methods: Abortion outcome assessment through routine clinic follow-up by a doctor was compared with home-assessment using a low-sensitivity pregnancy test and a pictorial instruction sheet. A computerized random number generator generated the randomisation sequence (1:1) in blocks of six. Research assistants randomly allocated eligible women who opted for medical abortion (mifepristone and misoprostol), using opaque sealed envelopes. Blinding during outcome assessment was not possible. Main outcome measures: Women's acceptability of home-assessment was measured as future preference of follow-up. Overall satisfaction, expectations, and comparison with previous abortion experiences were compared between study groups. Results: 731 women were randomized to the clinic follow-up group (n = 353) or the home-assessment group (n = 378). 623 (85%) women were successfully followed up; of those, 597 (96%) were satisfied and 592 (95%) found the abortion better than or as expected, with no difference between study groups.
The majority, 355 (57%), preferred home-assessment in the event of a future abortion. Significantly more women in the home-assessment group, 284 (82%), preferred home-assessment in the future, compared with 188 (70%) of women in the clinic follow-up group who preferred clinic follow-up in the future (p < 0.001). Conclusion: Home-assessment is highly acceptable among women in low-resource and rural settings. The choice of how to follow up an early medical abortion according to women's preference should be offered to foster women's reproductive autonomy.
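The allocation mechanism described above, a 1:1 sequence generated by a computerized random number generator in blocks of six, can be sketched as follows (arm labels and seed are illustrative assumptions, not trial artifacts):

```python
import random

def block_randomisation(n, block_size=6, seed=None):
    """1:1 allocation sequence in randomly permuted blocks: each block
    holds an equal number of each arm, shuffled, so group sizes can
    never drift apart by more than half a block."""
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n:
        block = ["clinic"] * (block_size // 2) + ["home"] * (block_size // 2)
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n]

allocation = block_randomisation(731, seed=2008)
```

Permuted blocks keep the two arms balanced throughout recruitment, which is why the trial's final group sizes (353 vs 378 after a 731-woman sequence) stay close to 1:1.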
Abstract:
The objective is to analyze the relationship between risk and the number of stocks in a portfolio for an individual investor when stocks are chosen by the "naive strategy". To do this, we carried out an experiment in which individuals selected stocks so as to reproduce this relationship. 126 participants were told that the risk of the first choice would be the average of the standard deviations of all portfolios consisting of a single asset, and that the same procedure would be applied to portfolios composed of two, three, and so on, up to 30 stocks. They selected the assets they wanted in their portfolios without the support of financial analysis. For comparison, we also ran a hypothetical simulation of 126 investors who selected shares from the same universe by means of a random number generator; thus, each real participant is paired with a random hypothetical investor facing the same opportunity. Patterns were observed in the participants' individual portfolios, characterizing the risk curves for subsets of the sample. Because such groupings are somewhat arbitrary, a more objective measure of behavior was used: a simple linear regression for each participant, predicting the portfolio variance as a function of the number of assets. In addition, we conducted a pooled cross-section regression on all observations. The expected pattern occurs on average but not for most individuals, many of whom effectively "de-diversify" when adding seemingly random stocks. Furthermore, the results are slightly worse than those obtained with a random number generator. This finding challenges the belief that only a small number of stocks is necessary for diversification, and shows that it holds only for a large sample. The implications are important, since many individual investors hold few stocks in their portfolios.
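The hypothetical comparison group, investors who pick stocks with a random number generator, can be sketched as follows. The universe of stocks and the return model below are invented for illustration; they are not the study's data:

```python
import random
import statistics

random.seed(10)

# Hypothetical universe of 50 stocks with 60 periods of returns each.
universe = {
    f"stock{i}": [random.gauss(0.01, 0.02 + 0.001 * (i % 10)) for _ in range(60)]
    for i in range(50)
}

def portfolio_risk(tickers):
    """Standard deviation of an equally weighted portfolio's returns."""
    returns = [
        sum(universe[t][p] for t in tickers) / len(tickers) for p in range(60)
    ]
    return statistics.stdev(returns)

def naive_curve(max_n=30, reps=100):
    """Average risk of randomly selected portfolios of 1..max_n stocks."""
    names = list(universe)
    curve = []
    for n in range(1, max_n + 1):
        risks = [portfolio_risk(random.sample(names, n)) for _ in range(reps)]
        curve.append(statistics.mean(risks))
    return curve

curve = naive_curve()
```

Averaged over many random portfolios the curve falls as stocks are added, which is exactly the "pattern on average" that the experiment found many individual participants failed to reproduce.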
Abstract:
Psyllium is a rich source of mucilaginous soluble fibre and is considered a useful dietary supplement in the treatment of patients with hypercholesterolaemia. The objective of this study was to evaluate the efficacy and safety of psyllium supplementation in improving the lipid profile of dyslipidaemic Brazilian children and adolescents. Fifty-five subjects (6-19 years) with moderate hypercholesterolaemia were evaluated in a randomized, controlled, double-blind, parallel clinical trial conducted in two periods. Initially, all recruited participants went through a six-week adaptation stage on a diet restricted in saturated fat (<7%) and cholesterol (<200 mg/day) before treatment. After this period, eligible participants were randomly allocated to two groups (control n=25 and psyllium n=30) using a computer-generated randomized number sequence. During the eight-week clinical trial, the psyllium group maintained the diet restricted in saturated fat and cholesterol, supplemented daily with 7.0 g of psyllium, while the control group received the same diet with an equivalent amount of cellulose (placebo) added. By the end of the treatment, four subjects had been excluded after randomization (losses to follow-up), leaving 51 subjects (control group n=24; psyllium group n=27) who completed the study. The psyllium group showed a significant decrease in total cholesterol (TC) (4.1% [-0.20 mmol/L]; p=0.01) and LDL-cholesterol (LDL-c) (7.2% [-0.24 mmol/L]; p<0.001) concentrations compared with baseline. Additional reductions were observed in comparison with the control group (TC: 4.1% [0.20 mmol/L]; p=0.002; LDL-c: 7.8% [0.26 mmol/L]; p=0.007). None of the participants reported aversion to the smell, taste, or texture of psyllium, nor any significant adverse effects.
Psyllium therapy proved effective in reducing LDL-c concentrations and was shown to be safe and acceptable to the study population.
Abstract:
The identification and description of the lithological characteristics of a formation are indispensable to the evaluation of complex formations. To this end, combinations of nuclear tools have been used systematically in uncased boreholes. The resulting logs can be regarded as the interaction of two distinct phases: (i) the transport phase, in which radiation travels from the source through the formation to one or more detectors; and (ii) the detection phase, which consists of collecting the radiation, transforming it into current pulses and, finally, forming the spectral distribution of these pulses. Since the presence of the detector does not strongly affect the result of the radiation transport, each phase can be simulated independently of the other, which makes it possible to introduce a new type of modelling that decouples the two phases. In this work, the final response is simulated by combining numerical transport solutions with a library of detector response functions, for different incident energies and for each specific source-detector arrangement. The radiation transport is computed with a finite element (FEM) algorithm, in the form of a 2½-D scalar flux obtained from the numerical solution of the multigroup diffusion approximation to the Boltzmann transport equation in phase space, the so-called P1 approximation, in which the direction variable is expanded in terms of orthogonal Legendre polynomials. This reduces the dimensionality of the problem, making it more compatible with the FEM algorithm, where the flux depends exclusively on the spatial variable and on the physical properties of the formation. The response function of the NaI(Tl) detector is obtained independently by the Monte Carlo (MC) method, in which the life of a particle inside the scintillator crystal is reconstructed by simulating, interaction by interaction, the position, direction, and energy of the different particles, with the help of random numbers governed by appropriate probability laws.
The possible interaction types (Rayleigh scattering, photoelectric effect, Compton scattering, and pair production) are determined in a similar way. The simulation is completed when the detector response functions are convolved with the scalar flux, producing as the final response the pulse-height spectrum of the modelled system. In this spectrum, sets of channels called detection windows are selected. The count rates in each window have different dependences on electron density and lithology, which makes it possible to combine these windows to determine the density and the photoelectric absorption factor of the formations. With the methodology developed, logs could be simulated in both thick- and thin-layer models. The performance of the method was tested in complex formations, mainly those in which the presence of clay minerals, feldspar, and mica produced considerable effects capable of perturbing the final tool response. The results showed that formations with densities between 1.8 and 4.0 g/cm3 and photoelectric absorption factors in the range 1.5 to 5 barns/e- had their physical and lithological characteristics perfectly identified. The concentrations of potassium, uranium, and thorium could be obtained by introducing a new calibration system capable of correcting the effects of high variances and negative correlations, observed mainly in the calculation of the mass concentrations of uranium and potassium. In simulating the response of the CNL tool with Tittle's polynomial regression algorithm, it was found that, owing to the tool's limited vertical resolution, layers thinner than the longest source-detector spacing had their apparent porosity values measured erroneously, because Tittle's algorithm applies exclusively to thick layers.
Because of this error, a method was developed that takes into account a contribution factor determined by the relative area of each layer within the zone of maximum information. Thus, the porosity at each point in the subsurface could be determined by convolving these factors with the local porosity indices, while assuming each layer to be thick enough to suit Tittle's algorithm. Finally, the additional limitations imposed by the presence of perturbing minerals were resolved by treating the formation as composed of a base mineral fully saturated with water, the remaining components being considered perturbations of this base case. These results make it possible to compute synthetic well logs, which can be used in inversion schemes to obtain a more detailed quantitative evaluation of complex formations.
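The Monte Carlo step that chooses a photon's interaction type using random numbers with appropriate probability laws amounts to sampling in proportion to the interaction cross-sections. A minimal Python sketch (the cross-section values are illustrative placeholders, not tabulated NaI(Tl) data):

```python
import random

def sample_interaction(cross_sections, rng):
    """Choose an interaction type with probability proportional to its
    cross-section: the standard inverse-CDF step of a photon Monte Carlo."""
    u = rng.random() * sum(cross_sections.values())
    acc = 0.0
    for kind, sigma in cross_sections.items():
        acc += sigma
        if u <= acc:
            return kind
    return kind  # guard against floating-point round-off

rng = random.Random(3)
# Relative cross-sections chosen only for illustration.
xs = {"rayleigh": 0.05, "photoelectric": 0.25, "compton": 0.65, "pair": 0.05}
counts = {k: 0 for k in xs}
for _ in range(100000):
    counts[sample_interaction(xs, rng)] += 1
```

Over many histories the empirical frequencies converge to the relative cross-sections, which is what lets the simulated pulse-height spectrum reflect the physics of each interaction channel.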
Abstract:
Spatial selectivity for colour has been investigated using invasive and non-invasive electrophysiological methods, as well as psychophysical methods. In non-invasive visual cortical electrophysiology, this topic has been investigated with conventional methods of periodic stimulation and response extraction by simple averaging. New methods of stimulation (pseudorandom presentation) and of non-invasive cortical response extraction (cross-correlation) have been developed but have not yet been used to investigate the spatial selectivity of cortical colour responses. This work aimed to introduce this new pseudorandom electrophysiological method to study chromatic spatial selectivity. Fourteen trichromats and 16 colour-deficient subjects with normal or corrected visual acuity were evaluated. The volunteers were assessed with the HMC anomaloscope and the Ishihara figure test to characterize colour vision with respect to the presence of trichromacy. Red-green sinusoidal gratings of 8° of visual angle were used at 8 spatial frequencies between 0.2 and 10 cpd. The stimulus was temporally modulated by a binary m-sequence in a pattern-reversal presentation mode. The VERIS system was used to extract the first and second slices of the second-order kernel (K2.1 and K2.2, respectively). After modelling the response as a function of spatial frequency with a difference-of-Gaussians function, the optimal spatial frequency and the band of frequencies with amplitudes above ¾ of the function's maximum amplitude were extracted to serve as indicators of the function's spatial selectivity. Chromatic visual acuity was also estimated by fitting a linear function to the amplitude data from the spatial frequency of the amplitude peak up to the highest spatial frequency tested. In trichromats, chromatic responses with different spatial selectivities were found in K2.1 and K2.2.
The negative components of K2.1 and K2.2 showed band-pass tuning, and the positive component of K2.1 showed low-pass tuning. The visual acuity estimated from all the components studied was close to that found by Mullen (1985) and Kelly (1983). Different cellular components may be contributing to the generation of the pseudorandom VECP. This new method is a candidate to become an important tool for the non-invasive assessment of human colour vision.
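Binary m-sequences like the one used to modulate the stimulus are conventionally produced by a linear feedback shift register with a primitive feedback polynomial. A short Python sketch (the register length here is illustrative; VERIS uses its own, much longer sequences):

```python
def m_sequence(taps, nbits, length):
    """Binary m-sequence from a Fibonacci linear feedback shift register;
    with a primitive feedback polynomial the period is 2**nbits - 1."""
    state = [1] * nbits            # any non-zero seed works
    out = []
    for _ in range(length):
        out.append(state[-1])      # output the last register bit
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]  # shift, inserting the feedback bit
    return out

# Taps (5, 3) give a primitive degree-5 recurrence, so the sequence
# repeats with period 2**5 - 1 = 31 and contains 16 ones per period.
seq = m_sequence(taps=(5, 3), nbits=5, length=62)
```

The near-balance of ones and zeros and the flat autocorrelation of m-sequences are what make cross-correlation against the recorded signal recover the response kernels cleanly.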
Abstract:
Background: Diabetes is associated with long-term damage, dysfunction and failure of various organs, especially the eyes, kidneys, nerves, heart and blood vessels. The risk of developing type 2 diabetes increases with age, obesity and lack of physical activity. Insulin resistance is a fundamental aspect of the aetiology of type 2 diabetes. Insulin resistance has been shown to be associated with atherosclerosis, dyslipidaemia, glucose intolerance, hyperuricaemia, hypertension and polycystic ovary syndrome. The mineral zinc plays a key role in the synthesis and action of insulin, both physiologically and in diabetes mellitus. Zinc seems to stimulate insulin action and insulin receptor tyrosine kinase activity. Objectives: To assess the effects of zinc supplementation for the prevention of type 2 diabetes mellitus in adults with insulin resistance. Search methods: This review is an update of a previous Cochrane systematic review published in 2007. We searched the Cochrane Library (2015, Issue 3), MEDLINE, EMBASE, LILACS and the ICTRP trial register (from inception to March 2015). There were no language restrictions. We conducted citation searches and screened reference lists of included studies. Selection criteria: We included studies if they had a randomised or quasi-randomised design and if they investigated zinc supplementation compared with placebo or no intervention in adults with insulin resistance living in the community. Data collection and analysis: Two review authors selected relevant trials, assessed risk of bias and extracted data. Main results: We included three trials with a total of 128 participants in this review. The duration of zinc supplementation ranged between four and 12 weeks. Risk of bias was unclear for most studies regarding selection bias (random sequence generation, allocation concealment) and detection bias (blinding of outcome assessment).
No study reported on our key outcome measures (incidence of type 2 diabetes mellitus, adverse events, health-related quality of life, all-cause mortality, diabetic complications, socioeconomic effects). Evaluation of insulin resistance as measured by the Homeostasis Model Assessment of Insulin Resistance (HOMA-IR) showed neutral effects when comparing zinc supplementation with control (two trials; 114 participants). There were neutral effects in trials comparing zinc supplementation with placebo for total cholesterol, high-density lipoprotein (HDL) cholesterol, low-density lipoprotein (LDL) cholesterol and triglycerides (2 studies, 70 participants). The one trial comparing zinc supplementation with exercise also showed neutral effects for total cholesterol, HDL and LDL cholesterol, and a mean difference in triglycerides of -30 mg/dL (95% confidence interval (CI) -49 to -10) in favour of zinc supplementation (53 participants). Various surrogate laboratory parameters were also analysed in the included trials. Authors' conclusions: There is currently no evidence on which to base the use of zinc supplementation for the prevention of type 2 diabetes mellitus. Future trials should investigate patient-important outcome measures such as incidence of type 2 diabetes mellitus, health-related quality of life, diabetic complications, all-cause mortality and socioeconomic effects.
Abstract:
We investigate the nonequilibrium roughening transition of a one-dimensional restricted solid-on-solid model by directly sampling the stationary probability density of a suitable order parameter as the surface adsorption rate varies. The shapes of the probability density histograms suggest a typical Ginzburg-Landau scenario for the phase transition of the model, and estimates of the "magnetic" exponent seem to confirm its mean-field critical behavior. We also found that the flipping times between the metastable phases of the model scale exponentially with the system size, signaling the breaking of ergodicity in the thermodynamic limit. Incidentally, we discovered that a closely related model not considered before also displays a phase transition with the same critical behavior as the original model. Our results support the usefulness of off-critical histogram techniques in the investigation of nonequilibrium phase transitions. We also briefly discuss in the appendix a good and simple pseudo-random number generator used in our simulations.
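The paper's own generator is described in its appendix and is not reproduced here; as a generic example of the "good and simple" kind of pseudo-random number generator suited to such simulations, here is Marsaglia's xorshift sketched in Python:

```python
MASK64 = 0xFFFFFFFFFFFFFFFF

class Xorshift64:
    """Marsaglia's 64-bit xorshift generator: three shift-XOR steps per
    draw; with shift triple (13, 7, 17) a non-zero state cycles through
    all 2**64 - 1 values before repeating."""

    def __init__(self, seed=88172645463325252):
        self.state = seed & MASK64

    def next_u64(self):
        x = self.state
        x ^= (x << 13) & MASK64
        x ^= x >> 7
        x ^= (x << 17) & MASK64
        self.state = x
        return x

    def uniform(self):
        return self.next_u64() / 2.0**64  # double in [0, 1)

rng = Xorshift64()
samples = [rng.uniform() for _ in range(100000)]
```

Three XOR-shift operations per draw make the generator both fast and easy to verify, which is why such generators are popular in lattice-model simulations like the one above.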
Abstract:
In this treatise we consider finite systems of branching particles where the particles move independently of each other according to d-dimensional diffusions. Particles are killed at a position-dependent rate, leaving at their death position a random number of descendants according to a position-dependent reproduction law. In addition, particles immigrate at a constant rate (one immigrant per immigration time). A process with the above properties is called a branching diffusion with immigration (BDI). In the first part we present the model in detail and discuss the properties of the BDI under our basic assumptions. In the second part we consider the problem of reconstructing the trajectory of a BDI from discrete observations. We observe the positions of the particles at discrete times; in particular, we assume that we have no information about the pedigree of the particles. A natural question arises if we want to apply statistical procedures to the discrete observations: how can we find pairs of particle positions which belong to the same particle? We give an easy-to-implement 'reconstruction scheme' which allows us to redraw or 'reconstruct' parts of the trajectory of the BDI with high accuracy; moreover, asymptotically the whole path can be reconstructed. Further, we present simulations which show that our partial reconstruction rule is tractable in practice. In the third part we study how the partial reconstruction rule fits into statistical applications. As an extensive example we present a nonparametric estimator for the diffusion coefficient of a BDI where the particles move according to one-dimensional diffusions. This estimator is based on the Nadaraya-Watson estimator for the diffusion coefficient of one-dimensional diffusions, and it uses the partial reconstruction rule developed in the second part.
We are able to prove a rate of convergence of this estimator and finally we present simulations which show that the estimator works well even if we leave our set of assumptions.
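The core idea of such a reconstruction, linking observed particle positions across consecutive observation times without pedigree information, can be illustrated by a naive greedy nearest-neighbour matching. This is a stand-in sketch, not the thesis's actual rule, which is more careful about error control:

```python
def match_positions(prev, curr):
    """Greedily pair each previous position with its nearest unused
    current position; leftover positions in curr are treated as
    newly immigrated particles."""
    pairs = []
    unused = list(range(len(curr)))
    for i, x in enumerate(prev):
        if not unused:
            break
        j = min(unused, key=lambda k: abs(curr[k] - x))
        pairs.append((i, j))
        unused.remove(j)
    return pairs

prev = [0.0, 1.0, 5.0]           # positions at observation time t
curr = [0.9, 0.1, 5.2, 7.0]      # positions at t + delta; 7.0 is new
pairs = match_positions(prev, curr)
```

Over short observation intervals a diffusing particle moves little, so with high probability the nearest current position really does belong to the same particle; that is the intuition the thesis makes rigorous.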