998 results for Statistical Convergence


Relevance: 20.00%

Abstract:

Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular the frequentist and the Bayesian, have promoted radically different solutions for deciding on the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, controversy was sparked by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals evaluate statistical evidence using so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
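As a concrete illustration of the procedure at the centre of this debate, the sketch below computes a frequentist p-value for a two-sample comparison. The data, group sizes and effect size are invented for illustration; none of this is taken from the paper itself.

```python
# Illustrative only: a frequentist two-sample t-test p-value,
# the quantity whose use the cited journal refused to accept.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 50)   # hypothetical control measurements
b = rng.normal(0.5, 1.0, 50)   # hypothetical treatment measurements

t, p = stats.ttest_ind(a, b)
# p is the probability, assuming the null hypothesis of equal means,
# of observing a t-statistic at least as extreme as the one obtained.
print(f"t = {t:.3f}, p = {p:.4f}")
```

The frequentist reports this tail probability; the Bayesian would instead compare posterior probabilities (or a Bayes factor) of the two hypotheses, which is precisely the methodological fork the paper discusses.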

Relevance: 20.00%

Abstract:

This article proposes a checklist to improve statistical reporting in manuscripts submitted to Public Understanding of Science. These guidelines will generally allow reviewers (and readers) to judge whether the evidence provided in a manuscript is relevant. The article ends with further suggestions for improving the statistical quality of the journal.

Relevance: 20.00%

Abstract:

This paper analyses the differential impact of human capital, measured at different levels of schooling, on regional productivity and convergence. The potential existence of geographical spillovers of human capital is also considered by applying spatial panel data techniques. The empirical analysis of Spanish provinces between 1980 and 2007 confirms the positive impact of human capital on regional productivity and convergence, but reveals no evidence of positive geographical spillovers of human capital. In fact, in some specifications the spatial lag of tertiary education has a negative effect on the variables under consideration.

Relevance: 20.00%

Abstract:

Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, so the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always yield biologically relevant alignments. There is therefore a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In a profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position, which can increase the dimension of the parameter space and cause overfitting. Both of these research problems are related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues at an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. Comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased, while the same level of accuracy was maintained.
To conclude, we have shown that conservation can be successfully used in a statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
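The abstract does not reproduce the thesis's actual scoring formulas, but a common way to build a positional conservation score of the kind described is normalised Shannon entropy per alignment column. The toy alignment and function below are illustrative assumptions, not the thesis's method:

```python
# Minimal sketch of a positional conservation score for an MSA,
# using normalised Shannon entropy. The alignment is a toy example.
import math
from collections import Counter

def column_conservation(column):
    """Return a 0..1 conservation score for one alignment column
    (1.0 = fully conserved), based on Shannon entropy over residues."""
    counts = Counter(c for c in column if c != '-')  # ignore gap characters
    n = sum(counts.values())
    if n == 0:
        return 0.0
    entropy = -sum((k / n) * math.log2(k / n) for k in counts.values())
    max_entropy = math.log2(20)  # 20 possible amino acids
    return 1.0 - entropy / max_entropy

alignment = ["MKVL", "MKIL", "MRVL"]   # toy MSA, one sequence per row
scores = [column_conservation(col) for col in zip(*alignment)]
print(scores)  # columns 0 and 3 are fully conserved -> score 1.0
```

A quality-assessment model like the one in the thesis would then aggregate such per-column scores over the whole alignment.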

Relevance: 20.00%

Abstract:

In this thesis, X-ray tomography is discussed from the Bayesian statistical viewpoint. The unknown parameters are treated as random variables and, in contrast to traditional methods, the solution is obtained as a large sample from the distribution of all possible solutions. As an introduction to tomography, an inversion formula for the Radon transform on the plane is presented, and the widely used filtered backprojection algorithm is derived. Traditional regularization methods are presented in sufficient detail to motivate the Bayesian approach. The measurements are photon counts at the detector pixels, so the assumption of a Poisson-distributed measurement error is justified. The error is often assumed Gaussian, although the electronic noise caused by the measurement device can change the error structure; the assumption of Gaussian measurement error is therefore discussed. The thesis also discusses the use of different prior distributions in X-ray tomography. Especially in severely ill-posed problems, the choice of a suitable prior is the main part of the whole solution process. In the empirical part, the presented prior distributions are tested using simulated measurements, and the effects that different prior distributions produce are shown. The use of a prior is shown to be obligatory in the case of a severely ill-posed problem.
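The Poisson-versus-Gaussian noise assumption discussed above can be made concrete with a small numerical sketch: for photon counts, a Gaussian with variance equal to the mean approximates the Poisson log-likelihood well at high counts but poorly at low counts. All counts and rates below are illustrative, not from the thesis:

```python
# Compare the Poisson log-likelihood of photon counts with its
# Gaussian approximation (variance = mean). Illustrative data only.
import math

def poisson_loglik(counts, mean):
    return sum(k * math.log(mean) - mean - math.lgamma(k + 1) for k in counts)

def gaussian_loglik(counts, mean):
    # Gaussian approximation with variance equal to the Poisson mean
    return sum(-0.5 * math.log(2 * math.pi * mean)
               - (k - mean) ** 2 / (2 * mean) for k in counts)

low = [2, 0, 3, 1]      # low-count regime: approximation is poor
high = [105, 98, 101]   # high-count regime: approximation is close

for data, mu in [(low, 1.5), (high, 100.0)]:
    print(poisson_loglik(data, mu), gaussian_loglik(data, mu))
```

This is why the thesis treats the Gaussian error assumption as something to be examined rather than taken for granted in low-dose tomography.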

Relevance: 20.00%

Abstract:

This thesis focused on statistical analysis methods and proposes the use of Bayesian inference to extract information from experimental data by estimating the parameters of an Ebola model. The model is a system of differential equations expressing the behaviour and dynamics of Ebola. Two data sets (onset data and death data) were used together to estimate the parameters, which had not been done in previous work (Chowell, 2004). To be able to use both data sets, a new version of the model was built. The estimated model parameters were then used to calculate the basic reproduction number and to study the disease-free equilibrium. The estimates were useful for determining how well the model fits the data and how informative they are about possible relationships between the variables. The fitted Ebola model matches the observed onset data at 98.95% and the observed death data at 93.6%. Since Bayesian inference cannot be performed analytically here, the Markov chain Monte Carlo approach was used to generate samples from the posterior distribution over the parameters. These samples were used to check the accuracy of the model and other characteristics of the target posteriors.
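For an SEIR-type epidemic model such as the one in Chowell (2004), the basic reproduction number mentioned above reduces to the ratio of the transmission rate to the removal rate. The sketch below shows this computation; the parameter values are illustrative, not the thesis's posterior estimates:

```python
# Hedged sketch: basic reproduction number for a simple SEIR-type model.
# beta and gamma values below are invented for illustration.
def basic_reproduction_number(beta, gamma):
    """R0 = beta / gamma: the expected number of secondary infections
    caused by one case in a fully susceptible population."""
    return beta / gamma

r0 = basic_reproduction_number(beta=0.33, gamma=0.19)
print(r0)  # R0 > 1 means the disease-free equilibrium is unstable
```

In the Bayesian setting of the thesis, each posterior sample of (beta, gamma) yields one draw of R0, giving a full posterior distribution for it rather than a point estimate.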

Relevance: 20.00%

Abstract:

The optimal design of a heat exchanger system is based on given model parameters together with standard ranges for the machine design variables. Two goals were pursued and achieved: minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, and maximizing the momentary heat recovery output with the given constraints satisfied, while taking into account the uncertainty in the models. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of such a system is presented and implemented in the Matlab environment. Markov chain Monte Carlo (MCMC) methods are also used to account for the uncertainty in the models. The results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger, even though its construction is more expensive (160 EUR/m2, compared to 50 EUR/m2 for a dry heat exchanger). It was found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa, and the effect of the uncertainty in the models was identified in a simplified case of minimizing the area of a dry heat exchanger.
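At the core of NSGA-II is the Pareto-dominance test used to sort candidate designs. The toy sketch below extracts the non-dominated front of a small point set; the objective values are invented, and this is only the dominance step, not the thesis's full Matlab implementation:

```python
# Minimal sketch of Pareto dominance and the non-dominated front,
# the building block of NSGA-II's sorting. Both objectives minimised.
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimisation convention)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# illustrative objectives: (life-cycle cost, negative heat recovery)
designs = [(3.0, -5.0), (2.0, -4.0), (4.0, -4.5), (2.5, -5.5)]
front = pareto_front(designs)
print(front)
```

NSGA-II repeats this sorting over successive fronts and adds crowding-distance selection to keep the front well spread.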

Relevance: 20.00%

Abstract:

Two high-performance liquid chromatography (HPLC) methods for the quantitative determination of indinavir sulfate were tested, validated, and statistically compared. Assays were carried out on an octylsilane column (RP-8) with UV spectrophotometric detection at 260 nm, using as mobile phases mixtures of dibutylammonium phosphate buffer pH 6.5 and acetonitrile (55:45) or citrate buffer pH 5 and acetonitrile (60:40), both at 1 mL/min. Both methods showed good sensitivity, linearity, precision, and accuracy. Statistical analysis using Student's t-test for the determination of indinavir sulfate in raw material and capsules indicated no statistically significant difference between the two methods.
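The kind of two-method comparison described above can be sketched as a Student's t-test on assay results from both methods. The recovery values below are fabricated for illustration; only the test itself mirrors the paper's analysis:

```python
# Illustrative comparison of two assay methods with Student's t-test.
# The % recovery values are invented, not the paper's measurements.
from scipy import stats

method_a = [99.1, 100.4, 99.8, 100.1, 99.6]   # % recovery, method 1
method_b = [99.4, 100.0, 99.9, 100.3, 99.5]   # % recovery, method 2

t, p = stats.ttest_ind(method_a, method_b)
print(f"t = {t:.3f}, p = {p:.3f}")
# a large p-value -> no statistically significant difference detected
```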

Relevance: 20.00%

Abstract:

In the current study, we evaluated various robust statistical methods for comparing two independent groups. Two simulation scenarios were generated: one with equal population means and another with population mean differences. In each scenario, 33 experimental conditions were used, varying sample size, standard deviation and asymmetry, and for each condition 5000 replications per group were generated. The results show an adequate type I error rate but not high power for the confidence intervals. In general, across the two scenarios (population mean differences and no population mean differences) and the different conditions analysed, the Mann-Whitney U-test performed strongly, with the Yuen-Welch t-test performing slightly worse.
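A scaled-down sketch of this simulation design is shown below: estimating the empirical type I error rate of the Mann-Whitney U-test when both groups come from the same population. The study used 5000 replications over 33 conditions; the replication count, sample size and distribution here are reduced placeholders:

```python
# Reduced sketch of a type I error simulation for the Mann-Whitney
# U-test under the null (both groups from the same normal population).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
alpha, reps, n = 0.05, 500, 30
rejections = 0
for _ in range(reps):
    a = rng.normal(size=n)
    b = rng.normal(size=n)   # same population as a: the null is true
    if mannwhitneyu(a, b).pvalue < alpha:
        rejections += 1

rate = rejections / reps
print(rate)  # empirical type I error, close to the nominal 0.05
```

Extending the population to skewed distributions and unequal variances, as the study does, only changes how `a` and `b` are generated.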

Relevance: 20.00%

Abstract:

The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetically generated data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested on a couple of case studies and on a heat exchanger model. The two-step MCMC method worked well and reduced the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect. The use of the posterior distribution of the parameters across different heat exchanger geometries was studied as well. It would be computationally most efficient to use the same posterior distribution among different geometries in the optimisation of heat exchanger networks. According to the results, this is possible when the frontal surface areas are the same among the different geometries. In the other cases the same posterior distribution can still be used for optimisation, but it yields a wider predictive distribution. For condensing-surface heat exchangers, the numerical stability of the simulation model was studied, and a stable algorithm was developed as a result.
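For readers unfamiliar with MCMC parameter estimation, the sketch below shows a plain random-walk Metropolis sampler for a single model parameter (not the thesis's two-step variant, and with a trivial constant model standing in for the heat exchanger). The synthetic data, noise level and proposal scale are all illustrative assumptions:

```python
# Minimal random-walk Metropolis sketch for one parameter theta,
# given synthetic data with known Gaussian noise. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=3.0, scale=0.5, size=40)   # synthetic measurements

def log_post(theta):
    # flat prior; Gaussian likelihood with known sigma = 0.5
    return -0.5 * np.sum((data - theta) ** 2) / 0.5 ** 2

theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.2)          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                              # accept the proposal
    chain.append(theta)

print(np.mean(chain[1000:]))  # posterior mean after burn-in, near 3.0
```

A two-step scheme of the kind the thesis tests typically replaces the expensive model evaluation in `log_post` with a cheap approximation for a first screening step, calling the full model only for proposals that survive it.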

Relevance: 20.00%

Abstract:

A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all three solvents (water, ethanol and acetone) were better than binary mixtures at generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the parameters of this method, namely the time for the chromophoric reaction to stabilize, the wavelength of the absorption maximum, the reference standard, and the concentration of sodium carbonate, were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w/v, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
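The linearity claim behind a spectrophotometric calibration like this one is usually checked by regressing absorbance on standard concentration. The sketch below uses fabricated absorbance readings (the paper's reference standard is pyrogallol, read at 780 nm, but these numbers are not its data):

```python
# Illustrative linearity check for a calibration curve:
# absorbance vs. standard concentration, fitted by least squares.
import numpy as np

conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])          # µg/mL standard
absorbance = np.array([0.101, 0.198, 0.305, 0.399, 0.502])  # fabricated

slope, intercept = np.polyfit(conc, absorbance, 1)
predicted = slope * conc + intercept
ss_res = np.sum((absorbance - predicted) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"slope = {slope:.5f}, r^2 = {r_squared:.4f}")  # r^2 near 1 -> linear
```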

Relevance: 20.00%

Abstract:

We explore a DNA statistical model to obtain information about the behavior of its thermodynamic quantities. Special attention is given to the thermal denaturation of this macromolecule.

Relevance: 20.00%

Abstract:

This thesis studies properties of transforms based on parabolic scaling, such as the Curvelet, Contourlet, Shearlet and Hart-Smith transforms. Essentially, two different questions are considered: how these transforms can characterize Hölder regularity, and how the non-linear approximation of a piecewise smooth function converges. In the study of Hölder regularity, several theorems are presented that relate the regularity of a function f : R2 → R to decay properties of its transform. Of particular interest is the case where a function has lower regularity along some line segment than elsewhere; theorems are given that estimate the direction and location of this line and the regularity of the function. Numerical demonstrations suggest that similar theorems would hold for more general shapes of the low-regularity segment. Theorems related to uniform and pointwise Hölder regularity are presented as well. Although none of the theorems presented give a full characterization of regularity, the sufficient and necessary conditions are very similar. The other theme of the thesis is the study of the convergence of non-linear M-term approximation of functions that are discontinuous along some curves and otherwise smooth. Under particular smoothness assumptions, it is well known that the squared L2 approximation error is O(M^-2 (log M)^3) for curvelet, shearlet or contourlet bases. Here it is shown that, assuming higher smoothness, the log-factor can be removed, even if the function is still discontinuous.
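For reference, the standard pointwise Hölder condition underlying these theorems (stated here in its textbook form for exponents between 0 and 1, not quoted from the thesis) is:

```latex
% f is pointwise H\"older regular of exponent \alpha at x_0,
% written f \in C^{\alpha}(x_0), for 0 < \alpha \le 1, if there is
% a constant C > 0 such that
|f(x) - f(x_0)| \le C \, |x - x_0|^{\alpha}
\quad \text{for all } x \text{ in a neighbourhood of } x_0 .
```

Larger exponents replace f(x_0) by a Taylor polynomial; the thesis's theorems link the size of α to the decay rate of the transform coefficients across scales.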

Relevance: 20.00%

Abstract:

Actors in the Finnish telecommunications industry have undergone an intense transformation over the past 25 years, from the independent companies of the 1980s to companies dependent on one another and on actors in neighbouring industries. Today the telecommunications market is shaped not only by the operators, but also by media companies (e.g. MTV Media) and IT companies (e.g. TietoEnator). The boundaries between industries are thus being blurred, a phenomenon generally referred to as technological convergence. Convergence means that something is integrated: technologies (telephony and the Internet), companies (AOL and Time Warner), industries (telecom, media and IT), services (mobile TV), products (PDAs), and so on. As a result, very few telecom actors can develop the market and technical solutions on their own. Cooperation between actors is required; mobile phone manufacturers, content producers, operators and others must intensify their cooperation in order to offer attractive services and products to customers and end users. The thesis focuses in particular on business networks and patterns of cooperation between network actors as a means of gaining access to the resources required in a business environment characterized by convergence. The thesis highlights what technological convergence has meant for telecom actors: companies have been forced to change their strategies and business models. For many companies in the industry, adapting to convergence thinking has been challenging, and in some cases one can even speak of companies experiencing an identity crisis. The research shows that convergence is perceived in the market as an ongoing process of change, in which every telecom actor is forced to evaluate its role and position in relation to the other actors in the industry. Convergence processes will continue in the future with increased intensity; actors consciously shape their environment by acting in different roles, which may stretch across industry boundaries.
The thesis also shows that external events and the industry context affect the dynamics of a business network.

Relevance: 20.00%

Abstract:

The study of convergence and divergence in the global economy and social development uses comparative indicators to investigate the content of economic and social development policy and its effects on global samples representing the rich industrial, semi-industrial and poor developing nations. The study searches for answers to questions such as: what are the objectives of economic growth policies in globalisation under the imperatives of convergence and divergence, and how do these affect human well-being in light of the objectives of social policy in various nations? The empirical verification of the data draws on the concept of the 'logic of industrialism' for a comparative analysis focused mainly on identifying the levels of well-being in world nations after the Second World War. The perspectives of convergence and divergence in the global economy and social development critically examine the stages of early development processes in the global economy, distinguish the differences between economic and social development, illustrate the contents of economic and social development policies and their effects on rich and poor countries, and describe how convergence and divergence propel economic growth and unequal social development across nations. The measurement of convergence and divergence used both economic and social data, combined into an index that measures the effects of economic and social development policies on human well-being in the rich and poor nations. The task of finding policy solutions to resolve the controversies is reviewed through empirical investigation and analysis of the trends indicated by the economic and social indicators and data.
These analyses revealed how the adoption of social policy measures that translate the gains from economic growth into education, public health and equity generates social progress, longer life expectancy, higher economic growth, and a more stable macroeconomy for the nations concerned. Social policy is concerned with translating the benefits of global economic growth policies into the objectives of social development policy in nation states. Social policy therefore represents an open door through which the benefits of economic growth policies are linked with the broader objectives of social development policy, extending the benefits of economic growth to all human beings in every nation.