979 results for Methods : Statistical


Relevance: 30.00%

Abstract:

Seaports play an important part in the wellbeing of a nation. Many nations are highly dependent on foreign trade, and most trade is carried by sea vessels. This study is part of a larger research project in which a simulation model is required for further analyses of Finnish macro-logistical networks. The objective of this study is to create a system dynamics simulation model that gives an accurate forecast of the development of demand for Finnish seaports up to 2030. The emphasis of this study is on showing how a detailed harbor-demand system dynamics model can be created with the help of statistical methods. The forecasting methods used were ARIMA (autoregressive integrated moving average) and regression models. The resulting simulation model gives a forecast with confidence intervals and allows different scenarios to be studied. The building process proved useful, and the model can be extended in more detail. Required capacity for other parts of the Finnish logistical system could easily be included in the model.
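As an illustration of the forecasting machinery this abstract mentions, the sketch below fits a first-order autoregressive model (a minimal special case of the ARIMA family) by ordinary least squares to a synthetic demand series and produces a forecast whose 95% interval widens with the horizon. The series, coefficients and horizon are invented for illustration and are not taken from the thesis.

```python
import random
import statistics

def fit_ar1(series):
    """Fit y[t] = a + b*y[t-1] by ordinary least squares; return (a, b, residual sd)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return a, b, statistics.stdev(resid)

def forecast(series, a, b, sigma, steps):
    """Iterate the fitted model forward; the 95% interval widens with the horizon."""
    level, var, out = series[-1], 0.0, []
    for _ in range(steps):
        level = a + b * level
        var = sigma ** 2 + (b ** 2) * var   # forecast-error variance recursion
        half = 1.96 * var ** 0.5
        out.append((level, level - half, level + half))
    return out

random.seed(1)
# synthetic "port demand" series generated from a stationary AR(1) process
demand = [100.0]
for _ in range(60):
    demand.append(20.0 + 0.8 * demand[-1] + random.gauss(0, 2))

a, b, sigma = fit_ar1(demand)
fc = forecast(demand, a, b, sigma, steps=10)
```

The same structure extends to a full ARIMA fit (differencing plus moving-average terms); a library such as statsmodels would normally be used in practice.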


In this thesis, X-ray tomography is discussed from the Bayesian statistical viewpoint. The unknown parameters are assumed to be random variables and, in contrast to traditional methods, the solution is obtained as a large sample from the distribution of all possible solutions. As an introduction to tomography, an inversion formula for the Radon transform on the plane is presented, and the widely used filtered backprojection algorithm is derived. Traditional regularization methods are presented in sufficient detail to ground the Bayesian approach. The measurements are photon counts at the detector pixels, which justifies the assumption of a Poisson-distributed measurement error. The error is often assumed to be Gaussian, although the electronic noise caused by the measurement device can change the error structure; the assumption of Gaussian measurement error is discussed. The thesis also discusses the use of different prior distributions in X-ray tomography. Especially in severely ill-posed problems, the choice of a suitable prior is the main part of the whole solution process. In the empirical part, the presented prior distributions are tested using simulated measurements, and the effects that different priors produce are shown. The use of a prior is shown to be indispensable in the case of a severely ill-posed problem.
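The role of the prior in an ill-posed problem can be seen even in a two-dimensional toy example. Below, a nearly singular forward operator makes the unregularized solution blow up under a tiny data perturbation, while a zero-mean Gaussian prior (whose MAP estimate coincides with Tikhonov regularization) keeps the estimate near the truth. The matrix and data are invented for illustration and have nothing to do with the Radon transform itself.

```python
def solve2(M, v):
    """Solve a 2x2 linear system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - v[0] * M[1][0]) / det]

def map_estimate(A, y, lam):
    """MAP estimate under a zero-mean Gaussian prior: solve (A^T A + lam I) x = A^T y."""
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    Aty = [sum(A[k][i] * y[k] for k in range(2)) for i in range(2)]
    AtA[0][0] += lam
    AtA[1][1] += lam
    return solve2(AtA, Aty)

# An almost-singular forward operator: a severely ill-posed toy problem.
A = [[1.0, 1.0],
     [1.0, 1.0001]]
x_true = [1.0, 1.0]                 # exact data would be A @ x_true = [2.0, 2.0001]
y = [2.001, 2.0001]                 # a 0.001 perturbation in the first component

naive = solve2(A, y)                # unregularized inversion: wildly wrong
prior = map_estimate(A, y, lam=1e-3)  # Gaussian prior stabilises the estimate
```

The same mechanism, scaled up to thousands of pixels, is why prior modelling dominates the solution process in limited-data tomography.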


Viruses are among the most important pathogens present in water contaminated with feces or urine and represent a serious risk to human health. Four procedures for concentrating viruses from sewage were compared in this work, three of which were developed in the present study. Viruses were quantified using PCR techniques. According to statistical analysis and the sensitivity in detecting human adenoviruses (HAdV), JC polyomaviruses (JCPyV) and noroviruses genogroup II (NoV GGII), (i) a new procedure, the elution and skimmed-milk flocculation procedure (ESMP), based on elution of the viruses with glycine-alkaline buffer followed by organic flocculation with skimmed milk, was found to be the most efficient method compared with (ii) ultrafiltration and glycine-alkaline elution, (iii) a lyophilization-based method and (iv) ultracentrifugation and glycine-alkaline elution. Through the analysis of replicate sewage samples, ESMP showed reproducible results, with a coefficient of variation (CV) of 16% for HAdV, 12% for JCPyV and 17% for NoV GGII. Using spiked samples, the viral recoveries were estimated at 30-95% for HAdV, 55-90% for JCPyV and 45-50% for NoV GGII. ESMP was validated in a field study using twelve 24-h composite sewage samples collected at an urban sewage treatment plant in the north of Spain; 100% of the samples were positive, with mean values of HAdV, JCPyV and NoV GGII similar to those observed in other studies. Although all of the methods compared in this work yield consistently high virus detection and recovery in urban sewage, some require expensive laboratory equipment. ESMP is an effective low-cost procedure that allows a large number of samples to be processed simultaneously and is easily standardized for routine use in a water-monitoring laboratory. Moreover, in the present study, the CV was applied and proposed as a parameter for evaluating and comparing methods for detecting viruses in sewage samples.
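The coefficient of variation used above to quantify reproducibility is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch, with invented replicate quantification values (the real study's replicate data are not reproduced in the abstract):

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# hypothetical replicate quantification results for one sample (genome copies per mL)
hadv_replicates = [5.1e4, 6.0e4, 4.4e4, 5.5e4]
cv = coefficient_of_variation(hadv_replicates)
```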



This thesis focused on statistical analysis methods and proposes the use of Bayesian inference to extract the information contained in experimental data by estimating the parameters of an Ebola model. The model is a system of differential equations describing the behavior and dynamics of Ebola. Two data sets (onset and death data) were both used to estimate the parameters, which had not been done in earlier work (Chowell, 2004). To be able to use both data sets, a new version of the model was built. The model parameters were estimated and then used to calculate the basic reproduction number and to study the disease-free equilibrium. The parameter estimates were useful for determining how well the model fits the data and how good the estimates were, in terms of the information they provided about the possible relationship between variables. The solution showed that the Ebola model fits the observed onset data at 98.95% and the observed death data at 93.6%. Since Bayesian inference cannot be performed analytically, the Markov chain Monte Carlo approach was used to generate samples from the posterior distribution over the parameters. The samples were used to check the accuracy of the model and other characteristics of the target posteriors.
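The Markov chain Monte Carlo step described above can be sketched with a random-walk Metropolis sampler. This is not the thesis's Ebola ODE model; as a self-contained stand-in, the target below is the posterior of a Poisson rate given invented daily onset counts, under a flat prior, so the sampler's output can be checked against the analytic posterior mean.

```python
import math
import random

def log_posterior(lam, data):
    """Log posterior for a Poisson rate with a flat prior on lam > 0."""
    if lam <= 0:
        return float("-inf")
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in data)

def metropolis(data, n_samples=20000, step=0.3, start=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with the usual ratio."""
    rng = random.Random(seed)
    lam, lp = start, log_posterior(start, data)
    samples = []
    for _ in range(n_samples):
        cand = lam + rng.gauss(0, step)
        lp_cand = log_posterior(cand, data)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject step
            lam, lp = cand, lp_cand
        samples.append(lam)
    return samples

# hypothetical daily onset counts
onsets = [3, 5, 4, 6, 2, 5, 4, 3]
draws = metropolis(onsets)[5000:]        # discard burn-in
post_mean = sum(draws) / len(draws)
```

With a flat prior the posterior is Gamma(33, 8), mean 4.125, so the sampled mean should land close to that value.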


The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for the machine design variables. The goals set for the optimization, namely minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, and maximizing the momentary heat recovery output with the given constraints satisfied, while taking the uncertainty in the models into account, were successfully met. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of such a system is presented and implemented in the Matlab environment. Markov chain Monte Carlo (MCMC) methods are also used to account for the uncertainty in the models. The results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger, even though its construction is expensive (160 EUR/m²) compared to that of a dry heat exchanger (50 EUR/m²). It was found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa, and the effect of the uncertainty in the models was identified in a simplified case of minimizing the area of a dry heat exchanger.
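The core operation of NSGA-II is non-dominated sorting of candidate solutions. The sketch below extracts the first Pareto front from a handful of invented objective pairs, here written as (LCC, negated heat recovery) so that both objectives are minimised; the numbers are purely illustrative.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated set: the first front extracted in NSGA-II's sorting."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# hypothetical (LCC, -heat_recovery) objective pairs for candidate designs
designs = [(10.0, -5.0), (8.0, -4.0), (12.0, -6.0), (9.0, -3.0)]
front = pareto_front(designs)
```

Here (9.0, -3.0) is dominated by (8.0, -4.0), which is cheaper and recovers more heat, so it drops out of the front; the full algorithm repeats this sorting to rank successive fronts and adds crowding-distance selection.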


In the current study, we evaluated various robust statistical methods for comparing two independent groups. Two simulation scenarios were generated: one of equal population means and another of population mean differences. In each scenario, 33 experimental conditions were used, varying sample size, standard deviation and asymmetry. For each condition, 5000 replications per group were generated. The results show an adequate Type I error rate but not high power for the confidence intervals. In general, for the two scenarios studied (population mean differences and no population mean differences) across the different conditions analysed, the Mann-Whitney U-test demonstrated strong performance, with the Yuen-Welch t-test performing slightly worse.
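A Type I error simulation of the kind described above can be sketched directly: draw both groups from the same population, apply the Mann-Whitney U-test many times, and count rejections at the 5% level. The sketch uses a small number of replications and only the normal (equal-means) condition; the study's 33 conditions and 5000 replications are not reproduced here.

```python
import random

def mann_whitney_u(x, y):
    """U statistic: number of (x_i, y_j) pairs with x_i > y_j (+0.5 for ties)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def reject_at_05(x, y):
    """Two-sided test via the large-sample normal approximation of U."""
    n, m = len(x), len(y)
    u = mann_whitney_u(x, y)
    mean = n * m / 2.0
    sd = (n * m * (n + m + 1) / 12.0) ** 0.5
    return abs((u - mean) / sd) > 1.96

rng = random.Random(42)
reps = 2000
false_positives = sum(
    reject_at_05([rng.gauss(0, 1) for _ in range(30)],
                 [rng.gauss(0, 1) for _ in range(30)])
    for _ in range(reps)
)
type_i_rate = false_positives / reps   # should hover around the nominal 0.05
```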


The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetically generated data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested with a couple of case studies and a heat exchanger model. The two-step MCMC method worked well and decreased the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect on identifiability. The use of the posterior distribution of the parameters across different heat exchanger geometries was studied as well. It would be computationally most efficient to use the same posterior distribution for different geometries in the optimisation of heat exchanger networks. According to the results, this is possible when the frontal surface areas are the same across geometries. In the other cases the same posterior distribution can still be used for optimisation, but it will give a wider predictive distribution as a result. For condensing-surface heat exchangers, the numerical stability of the simulation model was studied, and a stable algorithm was developed as a result.


Etiological diagnosis of community-acquired pneumonia in adult patients using rapid microbiological methods. Background. Pneumonia is a serious illness that affects approximately 60,000 adults in Finland every year. Although its treatment has improved, it is still associated with a significant mortality of 6-15%. Identifying the microbes that cause lower respiratory tract infections also remains challenging. Aims. The aim of this work was to study the etiology of pneumonia in adult patients treated at Turku University Hospital and to assess the usefulness of new rapid microbiological methods in identifying the causative agent. Material. The material for Studies I and III consisted of 384 pneumonia patients treated on the infectious diseases ward of Turku University Hospital. In Study I, the microbes causing pneumonia were investigated using, in addition to conventional methods, rapid methods based on antigen detection and PCR techniques. Study II comprised a subgroup of 231 patients whose throat swab samples were tested for the presence of rhinoviruses and enteroviruses. In Study III, the patients' plasma C-reactive protein (CRP) concentration was measured during the first five days of hospital care, and extensive statistical analyses were used to assess the usefulness of CRP in evaluating the severity of the disease and in predicting the development of complications. In Study IV, the expression of neutrophil surface receptors was determined from samples taken from 68 pneumonia patients on admission to hospital. In Study V, the laboratory results of bronchoalveolar lavage (BAL) samples taken from pneumonia patients on internal medicine wards in 1996-2000 were analysed. 
Results. A causative agent of pneumonia was found in 209 patients, and a total of 230 causative microbes were identified. Of these, 135 (58.7%) were detected by antigen detection or PCR methods, and the majority, 95 (70.4%), were detected by these rapid methods alone. A respiratory virus was detected by antigen detection in 11.1% of the pneumonia patients, most often in patients with severe pneumonia (20.3%). In the subgroup of 231 pneumonia patients, a picornavirus was detected by PCR in 19 (8.2%) patients. In this patient group, a respiratory virus was found in a total of 47 (20%) patients, 17 (36%) of whom had a concurrent bacterial infection. On admission, CRP levels were significantly higher in patients with severe pneumonia (PSI classes III-V) than in patients with mild pneumonia (PSI classes I-II) (p < 0.001). A CRP level above 100 mg/l four days after admission predicted a complication of pneumonia or a poor response to treatment. Neutrophil complement receptor expression was significantly higher in patients with pneumococcal pneumonia than in patients with influenza pneumonia. Only one of 71 BAL samples (1.3%) showed diagnostic bacterial growth in quantitative culture, and even with the new methods a causative agent was found in only 9.8% of the BAL samples. 
Conclusions. With the new antigen detection and PCR methods, the etiology of pneumonia can be established rapidly. In addition, when these methods were used, the causative microbe was identified in a considerably larger proportion of the patients than with conventional methods alone. The usefulness of the rapid methods varied with the severity of the disease. A respiratory virus was found remarkably often in patients with pneumonia, and the clinical picture of these patients was often severe. A high CRP level on admission can be used as an additional tool in assessing the severity of pneumonia, and CRP is particularly useful in evaluating the response to treatment and the risk of developing complications. Measuring neutrophil complement receptor expression appears to be a promising rapid method for distinguishing between bacterial and viral infections. In patients receiving antimicrobial treatment, the findings of BAL investigations were scant and only rarely influenced treatment.


Reversed-phase liquid chromatographic (LC) and ultraviolet (UV) spectrophotometric methods were developed and validated for the assay of bromopride in oral and injectable solutions. The methods were validated according to ICH guidelines. Both methods were linear in the range of 5-25 μg mL-1 (y = 41837x - 5103.4, r = 0.9996 and y = 0.0284x - 0.0351, r = 1, respectively). Statistical analysis showed no significant difference between the results obtained by the two methods. The proposed methods were found to be simple, rapid, precise, accurate and sensitive, and both can be used in the routine quantitative analysis of bromopride in oral and injectable solutions.
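In routine use, the reported LC calibration line (y = 41837x - 5103.4, peak area versus concentration in μg mL-1) is inverted to back-calculate the concentration of a sample from its measured peak area. A minimal sketch, where the test peak area is synthesized from the line itself rather than taken from a real chromatogram:

```python
# Calibration line reported for the LC method: y = 41837*x - 5103.4
SLOPE, INTERCEPT = 41837.0, -5103.4

def concentration_from_area(area):
    """Invert the calibration line to recover concentration in ug/mL."""
    return (area - INTERCEPT) / SLOPE

# a peak area corresponding to the mid-range standard (15 ug/mL) on the line
area_15 = SLOPE * 15.0 + INTERCEPT
c = concentration_from_area(area_15)
```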


Systems biology is a new, rapidly developing, multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology: to comprehend the function of complex biological systems. Systems biology combines methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Unlike “traditional” biology, systems biology focuses on high-level concepts such as network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization and concurrency, among many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. 
The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods, originating in the fields of computer science and mathematics, for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques, together with the model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
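The kind of ODE-based mass-action modelling described above can be sketched in a few lines. The example below is not the thesis's heat shock or filament model; it is a generic reversible binding reaction A + B <-> AB integrated with the explicit Euler method, the simplest numerical approach mentioned among the techniques, with mass conservation as a built-in sanity check.

```python
def simulate_binding(kf, kr, a0, b0, dt=1e-3, steps=5000):
    """Explicit Euler integration of A + B <-> AB with mass-action kinetics."""
    a, b, ab = a0, b0, 0.0
    for _ in range(steps):
        rate = kf * a * b - kr * ab   # net forward flux
        a -= rate * dt
        b -= rate * dt
        ab += rate * dt
    return a, b, ab

# symmetric initial condition: equilibrium is a = b = ab = 0.5 (from a^2 = 0.5*(1 - a))
a, b, ab = simulate_binding(kf=1.0, kr=0.5, a0=1.0, b0=1.0)
```

Real models of the heat shock response or filament self-assembly have many more species and use adaptive stiff solvers, but the structure (state vector, rate laws, time stepping) is the same.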


Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
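One way the model-based optimization theme above is realised in practice is by pushing MCMC posterior samples through a derived quantity, giving a predictive distribution rather than a single point value. The sketch below propagates hypothetical posterior draws of a first-order rate constant into the product yield of a simple reaction; the rate constant, its posterior and the reaction model are all invented for illustration.

```python
import math
import random

# hypothetical posterior draws of a first-order rate constant k (stand-in for MCMC output)
rng = random.Random(7)
k_draws = [rng.gauss(0.8, 0.1) for _ in range(10000)]

def yield_at(k, t=2.0):
    """Product yield of a first-order reaction A -> B after time t."""
    return 1.0 - math.exp(-k * t)

# push every posterior sample through the model: predictive distribution of the yield
yields = sorted(yield_at(k) for k in k_draws)
median = yields[len(yields) // 2]
lo, hi = yields[int(0.025 * len(yields))], yields[int(0.975 * len(yields))]
```

The interval (lo, hi) carries the parameter uncertainty into the quantity being optimized, which is exactly what point estimates with Gaussian approximations can miss.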


Evapotranspiration is the process of water loss from vegetated soil due to evaporation and transpiration, and it may be estimated by various empirical methods. The objective of this study was to evaluate the performance of the Blaney-Criddle, Jensen-Haise, Linacre, Solar Radiation, Hargreaves-Samani, Makkink, Thornthwaite, Camargo, Priestley-Taylor and Original Penman methods in estimating potential evapotranspiration, compared with the Penman-Monteith standard method (FAO56), under the climatic conditions of Uberaba, state of Minas Gerais, Brazil. A 21-year set of monthly data (1990 to 2010) was used, comprising the climatic elements temperature, relative humidity, wind speed and insolation. The empirical methods for estimating reference evapotranspiration were compared with the standard method using linear regression, simple statistical analysis, the Willmott agreement index (d) and the performance index (c). The Makkink and Camargo methods showed the best performance, with "c" values of 0.75 and 0.66, respectively. The Hargreaves-Samani method presented the best linear relation with the standard method, with a correlation coefficient (r) of 0.88.
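The Willmott agreement index d and the performance index c used above can be computed directly; in the Camargo-Sentelhas scheme the performance index is c = r * d, with r the Pearson correlation. The monthly ETo values below are invented for illustration, not taken from the Uberaba data set.

```python
def willmott_d(est, obs):
    """Willmott agreement index d (1 = perfect agreement)."""
    mo = sum(obs) / len(obs)
    num = sum((e - o) ** 2 for e, o in zip(est, obs))
    den = sum((abs(e - mo) + abs(o - mo)) ** 2 for e, o in zip(est, obs))
    return 1.0 - num / den

def pearson_r(est, obs):
    """Pearson correlation coefficient r."""
    n = len(obs)
    me, mo = sum(est) / n, sum(obs) / n
    cov = sum((e - me) * (o - mo) for e, o in zip(est, obs))
    se = sum((e - me) ** 2 for e in est) ** 0.5
    so = sum((o - mo) ** 2 for o in obs) ** 0.5
    return cov / (se * so)

# hypothetical monthly ETo (mm/day): a reference series and an empirical estimate
eto_ref = [3.1, 3.5, 3.2, 2.8, 2.4, 2.1, 2.2, 2.7, 3.0, 3.4, 3.6, 3.3]
eto_est = [3.0, 3.6, 3.1, 2.9, 2.5, 2.0, 2.3, 2.6, 3.1, 3.3, 3.7, 3.2]

d = willmott_d(eto_est, eto_ref)
c = pearson_r(eto_est, eto_ref) * d   # performance index c = r * d
```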


Statistical analyses of measurements that can be described by statistical models are of the essence in astronomy and in scientific inquiry in general. The sensitivity of such analyses, of the modelling approaches, and of the consequent predictions is sometimes highly dependent on the exact techniques applied, and improvements therein can result in a significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential in estimating the properties of the population of such planets, and in the race to detect Earth analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness through various examples and detailed descriptions of the mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, for the estimation of model parameters and for hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools for analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.


Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some of the choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. 
Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters until the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers who develop methods to correct for non-sampling biases in event history data.
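The Kaplan-Meier estimator into which the IPCW correction plugs can be written so that the weights enter the risk-set and event sums directly. The sketch below runs with unit weights on invented unemployment-spell data; replacing the unit weights with inverse censoring probabilities (times the design weights) gives the IPCW version discussed above.

```python
def kaplan_meier(times, events, weights=None):
    """Weighted Kaplan-Meier survival estimates; weights default to 1.

    For IPCW, pass each subject's inverse probability of remaining uncensored.
    """
    if weights is None:
        weights = [1.0] * len(times)
    event_times = sorted({t for t, e in zip(times, events) if e})
    surv, s = [], 1.0
    for t in event_times:
        at_risk = sum(w for ti, w in zip(times, weights) if ti >= t)
        died = sum(w for ti, e, w in zip(times, events, weights) if e and ti == t)
        s *= 1.0 - died / at_risk          # product-limit update
        surv.append((t, s))
    return surv

# invented spell durations (months) and event indicators (True = exit observed)
durations = [2, 3, 3, 5, 6, 6, 8, 10]
observed  = [True, True, False, True, True, False, True, False]

km = kaplan_meier(durations, observed)
```

With unit weights this reproduces the classical estimator; the dependent-censoring bias discussed above arises precisely when the censored spells (the False entries) are not exchangeable with the rest, which is what the IPCW weights correct for.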