955 results for statistical potentials
Abstract:
Many people regard hypothesis testing as fundamental to inferential statistics. Various schools of thought, notably frequentist and Bayesian, have promoted radically different solutions for deciding between competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, controversy arose from the editorial decision of a scientific journal [1] to refuse any submitted paper containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals evaluate statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
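As a minimal illustration of the frequentist procedure at issue, the sketch below computes a two-sided p-value with a permutation test for a difference in means; the data and function names are hypothetical, not taken from the paper.

```python
import random

def permutation_p_value(sample_a, sample_b, n_perm=10000, seed=0):
    """Two-sided permutation test for a difference in group means.

    Returns the fraction of label permutations whose absolute mean
    difference is at least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(sample_a) / len(sample_a) - sum(sample_b) / len(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign group labels
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            hits += 1
    return hits / n_perm
```

A small p-value means the observed difference would be rare under the null hypothesis of exchangeable groups; the editorial dispute concerns how such a number should (or should not) drive decisions.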
Abstract:
This article proposes a checklist to improve statistical reporting in the manuscripts submitted to Public Understanding of Science. Generally, these guidelines will allow the reviewers (and readers) to judge whether the evidence provided in the manuscript is relevant. The article ends with other suggestions for improving the statistical quality of the journal.
Abstract:
This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales.
Abstract:
Construction of multiple sequence alignments is a fundamental task in Bioinformatics. Multiple sequence alignments are used as a prerequisite in many Bioinformatics methods, and consequently the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always provide biologically relevant alignments. Therefore, there is a need for an objective approach for evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. The comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased while the same level of accuracy was maintained.
To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in the profile hidden Markov models.
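A common entropy-based positional conservation score can be sketched as below. This is an illustrative measure assuming a 20-letter protein alphabet, not necessarily the exact score developed in the thesis.

```python
import math
from collections import Counter

PROTEIN_ALPHABET_SIZE = 20  # assumption: protein sequences

def conservation_score(column):
    """Positional conservation of one alignment column.

    Computed as 1 minus the Shannon entropy of the residue distribution,
    normalized by log2(20): 1.0 for a fully conserved column, values near
    0 for a highly variable one.
    """
    counts = Counter(column)
    n = len(column)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - entropy / math.log2(PROTEIN_ALPHABET_SIZE)
```

Scores of this kind, computed per column, can then serve as features in a model that predicts overall alignment quality.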
Abstract:
In this thesis, X-ray tomography is discussed from the Bayesian statistical viewpoint. The unknown parameters are treated as random variables and, in contrast to traditional methods, the solution is obtained as a large sample from the distribution of all possible solutions. As an introduction to tomography, an inversion formula for the Radon transform on the plane is presented. The widely used filtered backprojection algorithm is derived. The traditional regularization methods are presented in enough detail to ground the Bayesian approach. The measurements are photon counts at the detector pixels, so the assumption of a Poisson-distributed measurement error is justified. Often the error is assumed Gaussian, although the electronic noise caused by the measurement device can change the error structure. The assumption of Gaussian measurement error is discussed. The thesis also discusses the use of different prior distributions in X-ray tomography. Especially in severely ill-posed problems, the choice of a suitable prior is the main part of the whole solution process. In the empirical part, the presented prior distributions are tested using simulated measurements, and the effects produced by different prior distributions are shown. The use of a prior is shown to be obligatory in the case of a severely ill-posed problem.
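The Poisson measurement model can be illustrated with a minimal Metropolis sampler for a single detector rate under a Gamma prior; this is a toy sketch of the Bayesian machinery, not the tomographic reconstruction itself, and all names and tuning values are assumptions.

```python
import math
import random

def poisson_log_lik(lam, counts):
    """Poisson log-likelihood, dropping the lam-independent log(k!) term."""
    return sum(k * math.log(lam) - lam for k in counts)

def sample_rate_posterior(counts, a=1.0, b=1.0, n_iter=20000, seed=1):
    """Random-walk Metropolis sampler for a Poisson rate with a
    Gamma(a, b) prior; returns the chain of sampled rates."""
    rng = random.Random(seed)
    lam = max(sum(counts) / len(counts), 1e-6)  # start at the sample mean
    log_post = lambda l: poisson_log_lik(l, counts) + (a - 1) * math.log(l) - b * l
    current = log_post(lam)
    samples = []
    for _ in range(n_iter):
        prop = lam + rng.gauss(0.0, 0.5)
        if prop > 0:  # rates must stay positive
            cand = log_post(prop)
            if math.log(rng.random()) < cand - current:
                lam, current = prop, cand
        samples.append(lam)
    return samples
```

For this conjugate toy case the posterior is known to be Gamma(a + sum(counts), b + n), which gives a way to check that the chain concentrates in the right place.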
Abstract:
This thesis focused on statistical analysis methods and proposes the use of Bayesian inference to extract the information contained in experimental data by estimating the parameters of an Ebola model. The model is a system of differential equations expressing the behavior and dynamics of Ebola. Two sets of data (onset and death data) were both used to estimate parameters, which had not been done in previous work (Chowell, 2004). To be able to use both data sets, a new version of the model was built. Model parameters were estimated and then used to calculate the basic reproduction number and to study the disease-free equilibrium. The parameter estimates were used to determine how well the model fits the data and how informative they were about the possible relationships between variables. The solution showed that the Ebola model fits the observed onset data at 98.95% and the observed death data at 93.6%. Since Bayesian inference cannot be performed analytically, the Markov chain Monte Carlo approach was used to generate samples from the posterior distribution over the parameters. The samples were used to check the accuracy of the model and other characteristics of the target posteriors.
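For an SIR/SEIR-type model, the basic reproduction number and the stability condition of the disease-free equilibrium can be sketched as below; the parameter names (transmission rate beta, removal rate gamma) are generic assumptions, and the thesis model may define R0 differently.

```python
def basic_reproduction_number(beta, gamma):
    """R0 for a simple compartmental model: expected secondary cases per
    infectious case, i.e. transmission rate over removal rate."""
    return beta / gamma

def disease_free_equilibrium_stable(beta, gamma):
    """The disease-free equilibrium is locally stable exactly when R0 < 1:
    each case then generates, on average, less than one new case."""
    return basic_reproduction_number(beta, gamma) < 1.0
```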
Abstract:
The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for machine design variables. The goals of minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, and of maximizing the momentary heat recovery output with the given constraints satisfied, while taking into account the uncertainty in the models, were successfully met. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of a system is presented and implemented in a Matlab environment. Markov Chain Monte Carlo (MCMC) methods are also used to take into account the uncertainty in the models. Results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger, even though its construction is expensive (160 EUR/m2 compared to 50 EUR/m2 for a dry heat exchanger). It was found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa; the effect of model uncertainty was identified in a simplified case of minimizing the area of a dry heat exchanger.
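The core of NSGA-II's selection is non-dominated sorting. A minimal sketch of Pareto dominance and the first front, for minimization objectives such as cost, follows (names and data are illustrative, not from the thesis):

```python
def dominates(a, b):
    """a dominates b (minimization) if a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_pareto_front(points):
    """Return the non-dominated points: NSGA-II's first front, the set of
    designs for which no other design is better in all objectives."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

NSGA-II repeats this sorting into successive fronts and adds crowding-distance selection; only the first-front step is shown here.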
Abstract:
The importance of logistics for companies is a well-known and justified issue. Today, enterprises are developing their logistics processes in order to match their products and services to the requirements of their most important customers. Therefore, there is a need to develop analysis tools for logistics, and especially for analysing the significance of various customer service elements. The aim of this paper is to propose analytic tools for supporting strategic-level logistics decision making by emphasizing service level elements on two levels: (1) to introduce and propose approaches to categorize the development efforts of logistics, and (2) to introduce and/or propose approaches for solving some customer-service-related strategic-level logistics problems. This study consists of two parts. In the first part an overview of the work is presented; the second part comprises eight research papers on the topic of the study. The overview includes an introduction, where strategic- and tactical-level logistics problems are discussed and the relation of logistics to marketing and customer service issues is presented. The first part also describes the objectives, structure, research strategy and contribution of the research, and discusses challenges for future research. In the second part, the first three papers deal with the identification of objectives for logistics, while the remaining five papers concentrate on solving customer-service-related strategic-level logistics problems.
Abstract:
Two high performance liquid chromatography (HPLC) methods for the quantitative determination of indinavir sulfate were tested, validated and statistically compared. Assays were carried out using, as mobile phases, mixtures of dibutylammonium phosphate buffer pH 6.5 and acetonitrile (55:45) at 1 mL/min or citrate buffer pH 5 and acetonitrile (60:40) at 1 mL/min, an octylsilane column (RP-8) and a UV spectrophotometric detector at 260 nm. Both methods showed good sensitivity, linearity, precision and accuracy. Statistical analysis using Student's t-test for the determination of indinavir sulfate in raw material and capsules indicated no statistically significant difference between the two methods.
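A pooled-variance two-sample Student's t statistic of the kind used for such method comparisons can be sketched as follows (illustrative only; the paper may have used a paired variant):

```python
import math
import statistics

def two_sample_t(x, y):
    """Pooled-variance two-sample Student's t statistic.

    |t| below the critical value at the chosen significance level (with
    len(x) + len(y) - 2 degrees of freedom) means no statistically
    significant difference between the two methods.
    """
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * statistics.variance(x)
           + (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
    return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))
```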
Abstract:
In the current study, we evaluated various robust statistical methods for comparing two independent groups. Two simulation scenarios were generated: one of population mean equality and another of population mean differences. In each scenario, 33 experimental conditions were used as a function of sample size, standard deviation and asymmetry. For each condition, 5000 replications per group were generated. The results show an adequate type I error rate but not high power for the confidence intervals. In general, for the two scenarios studied (with and without population mean differences) across the different conditions analysed, the Mann-Whitney U-test demonstrated strong performance, with the Yuen-Welch t-test performing slightly worse.
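The Mann-Whitney U statistic itself can be computed by direct pairwise comparison, as in this minimal sketch (O(n·m), adequate for small simulated samples):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for x over y: the number of (x, y) pairs
    where x wins, with ties counting one half."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u
```

A useful sanity check is that the two one-sided statistics always sum to len(x) * len(y).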
Abstract:
The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master’s thesis using synthetically made data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested with a couple of case studies and a heat exchanger model. The two-step MCMC method worked well and decreased the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect on identifiability. The use of the posterior distribution of the parameters in different heat exchanger geometries was studied. It would be computationally most efficient to use the same posterior distribution among different geometries in the optimisation of heat exchanger networks. According to the results, this was possible when the frontal surface areas were the same among the different geometries. In the other cases the same posterior distribution can be used for optimisation too, but that will give a wider predictive distribution as a result. For condensing surface heat exchangers the numerical stability of the simulation model was studied, and a stable algorithm was developed as a result.
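One simple diagnostic for parameter identifiability from MCMC output is the correlation between posterior sample chains: parameter pairs with |r| near 1 are strongly coupled and hence poorly identified individually. A minimal sketch (illustrative, not the thesis's own method):

```python
import math

def pearson_correlation(xs, ys):
    """Pearson correlation between two posterior sample chains of equal
    length; values near +/-1 flag weakly identifiable parameter pairs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```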
Abstract:
A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all the three solvents (water, ethanol and acetone) were better than the binary mixtures in generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters in this analytical method, namely, the time for the chromophoric reaction to stabilize, wavelength of the absorption maxima to be monitored, the reference standard and the concentration of sodium carbonate were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w/v, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
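A mixture design over three solvents of the kind described (pure solvents, binary blends, and the ternary centroid) can be enumerated as a simplex-centroid design; a minimal sketch, with the solvent names assumed from the abstract:

```python
from itertools import combinations

def simplex_centroid_design(components):
    """Simplex-centroid mixture design: one equal-proportion blend for
    every non-empty subset of components. For three solvents this gives
    7 runs: 3 pure solvents, 3 binary 50:50 blends, 1 ternary centroid."""
    design = []
    for r in range(1, len(components) + 1):
        for subset in combinations(components, r):
            share = 1.0 / r
            design.append({c: (share if c in subset else 0.0)
                           for c in components})
    return design
```

Each run's proportions sum to 1, as mixture designs require; the response (yield, polyphenol content, antioxidant capacity) would then be measured at each run and fitted with a response-surface model.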
Abstract:
This research was motivated by the need to examine the potential application areas of process intensification technologies at Neste Oil Oyj. According to the company's interest, membrane reactor technology was chosen, and the applicability of this technology in the refining industry was investigated. Moreover, Neste Oil suggested a project related to CO2 capture from the FCC unit flue gas stream. The flowrate of the flue gas is 180 t/h, and it consists of approximately 14% CO2 by volume. A membrane-based absorption process (membrane contactor) was chosen as a potential technique to model CO2 capture from the fluid catalytic cracking (FCC) unit effluent. In the design of the membrane contactor, a mathematical model was developed to describe CO2 absorption from a gas mixture using an aqueous monoethanolamine (MEA) solution. According to the results of the literature survey, in the hollow fiber contactor under laminar flow conditions approximately 99% of the CO2 can be removed by using a 20 cm long polyvinylidene fluoride (PVDF) membrane. Furthermore, the design of the whole process was performed using PRO/II simulation software, and the CO2 removal efficiency of the whole process was obtained as 97%. Technical and economic comparisons with existing MEA absorption processes were performed to determine the advantages and disadvantages of membrane contactor technology.
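The quoted removal efficiencies reduce to a simple component balance across the contactor; a minimal sketch with hypothetical function names:

```python
def removal_efficiency(inlet_co2, outlet_co2):
    """Fractional CO2 removal across the process: captured flow over
    inlet flow (same units for both arguments)."""
    return (inlet_co2 - outlet_co2) / inlet_co2

def outlet_from_efficiency(inlet_co2, efficiency):
    """Outlet CO2 flow implied by a given removal efficiency."""
    return inlet_co2 * (1.0 - efficiency)
```

A 97% overall efficiency thus means only 3% of the inlet CO2 leaves the process uncaptured.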