981 results for Parametric modelling


Relevance:

60.00%

Publisher:

Abstract:

Objectives: To compare the population modelling programs NONMEM and P-PHARM during investigation of the pharmacokinetics of tacrolimus in paediatric liver-transplant recipients. Methods: Population pharmacokinetic analysis was performed using NONMEM and P-PHARM on retrospective data from 35 paediatric liver-transplant patients receiving tacrolimus therapy. The same data were presented to both programs. Maximum likelihood estimates were sought for apparent clearance (CL/F) and apparent volume of distribution (V/F). Covariates screened for influence on these parameters were weight, age, gender, post-operative day, days of tacrolimus therapy, transplant type, biliary reconstructive procedure, liver function tests, creatinine clearance, haematocrit, corticosteroid dose, and potential interacting drugs. Results: A satisfactory model was developed in both programs with a single categorical covariate - transplant type - providing stable parameter estimates and small, normally distributed (weighted) residuals. In NONMEM, the continuous covariates - age and liver function tests - improved modelling further. Mean parameter estimates were CL/F (whole liver) = 16.3 l/h, CL/F (cut-down liver) = 8.5 l/h and V/F = 565 l in NONMEM, and CL/F = 8.3 l/h and V/F = 155 l in P-PHARM. Individual Bayesian parameter estimates were CL/F (whole liver) = 17.9 +/- 8.8 l/h, CL/F (cut-down liver) = 11.6 +/- 18.8 l/h and V/F = 712 +/- 792 l in NONMEM, and CL/F (whole liver) = 12.8 +/- 3.5 l/h, CL/F (cut-down liver) = 8.2 +/- 3.4 l/h and V/F = 221 +/- 164 l in P-PHARM. Marked interindividual kinetic variability (38-108%) and residual random error (approximately 3 ng/ml) were observed. P-PHARM was more user friendly and readily provided informative graphical presentation of results. NONMEM allowed a wider choice of errors for statistical modelling and coped better with complex covariate data sets. Conclusion: Results from parametric modelling programs can vary due to different algorithms employed to estimate parameters, alternative methods of covariate analysis and variations and limitations in the software itself.
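The estimation step described above can be illustrated with a much simpler, hedged sketch: a naive single-profile maximum-likelihood fit of CL/F and V/F under an assumed one-compartment model with first-order absorption. It is not a mixed-effects population fit of the kind NONMEM or P-PHARM performs, and the dose, absorption rate constant and concentration-time data are illustrative values, not data from the study.

```python
# Hedged sketch: maximum-likelihood estimation of CL/F and V/F for a single
# concentration-time profile, assuming a one-compartment model with first-order
# absorption. All numbers below are illustrative, not study data.
import numpy as np
from scipy.optimize import minimize

ka = 4.5            # assumed absorption rate constant (1/h)
dose = 5000.0       # assumed oral dose (micrograms), so predictions come out in ng/ml

times = np.array([1.0, 2.0, 4.0, 8.0, 12.0])          # hours post-dose (illustrative)
conc = np.array([19.0, 18.2, 16.5, 14.0, 11.2])       # ng/ml (illustrative)

def predicted_conc(t, cl_f, v_f):
    """One-compartment oral model; V/F in litres gives ng/ml for a dose in micrograms."""
    ke = cl_f / v_f
    return dose * ka / (v_f * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def neg_log_likelihood(log_params):
    cl_f, v_f, sigma = np.exp(log_params)             # log-parameterised to stay positive
    resid = conc - predicted_conc(times, cl_f, v_f)
    return 0.5 * np.sum(resid**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

fit = minimize(neg_log_likelihood, x0=np.log([10.0, 300.0, 3.0]), method="Nelder-Mead")
cl_f, v_f, sigma = np.exp(fit.x)
print(f"CL/F = {cl_f:.1f} l/h, V/F = {v_f:.0f} l, residual SD = {sigma:.1f} ng/ml")
```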


Relevance:

60.00%

Publisher:

Abstract:

This Master's thesis investigated the suitability of commercial multibody dynamics software for studying the dynamics and vibrations of a reel-up. Of particular interest were the description of the nip and the vibrations occurring in it. In this thesis, the primary and secondary drives of the reel-up and the reel spool were modelled. The model was later combined with a model produced in a parallel Master's thesis at Metso Paper in Järvenpää, forming a simulation model based on two solvers. The simulation model was built to use two separate solvers: the ADAMS software, used to build the mechanical model, and a Simulink model describing the control system and the hydraulic circuits. To model the nip, the reel spool and the reeling cylinder were modelled as flexible bodies using the lumped-mass method. The flexibilities of the transfer devices and the frame structures were represented as single-degree-of-freedom systems described by spring-damper forces. The thesis also presents the use of the ADAMS software in the manner of a tutorial and discusses the advantages of parametric modelling. The work showed that multibody dynamics is suitable for studying the dynamics of a reel-up and the vibrations caused by dynamic forces. Only estimates could be made from the vibration measurements that were performed. The model was found to require further research and development.
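As a toy illustration of the single-degree-of-freedom spring-damper representation mentioned above, the following sketch integrates one such system under a harmonic excitation. The mass, stiffness, damping and forcing values are assumptions chosen for the example, not parameters of the reel-up model.

```python
# Hedged sketch of a single-degree-of-freedom spring-damper system, of the kind
# used above to represent flexibilities; all numerical values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

m, k, c = 250.0, 1.2e6, 4.0e3          # mass (kg), stiffness (N/m), damping (Ns/m)
f_amp, f_freq = 500.0, 25.0            # harmonic excitation amplitude (N) and frequency (Hz)

def rhs(t, y):
    x, v = y
    force = f_amp * np.sin(2 * np.pi * f_freq * t)
    return [v, (force - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0], max_step=1e-4)
print(f"natural frequency: {np.sqrt(k / m) / (2 * np.pi):.1f} Hz, "
      f"peak displacement: {np.max(np.abs(sol.y[0])) * 1e3:.3f} mm")
```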

Relevance:

60.00%

Publisher:

Abstract:

This Master's thesis deals with the design and productisation of the corner car used in elevators in special cases. The work was carried out for KONE Oyj. In the thesis, a modular product architecture was created for the corner car and the delivery process of the car was defined. The goal of the work was to cover 48.12% of possible customer requirements and to reduce the design time from the previous 24 hours to four hours. The goal was achieved, based on the comments of an experienced designer of case-specific corner cars: 48.12% of the customer requirements were included in the product model as configuration options. The beginning of the thesis introduces product design, quality management, parametric modelling, mass customisation and product data management. After that, all the variables most important for the productisation of the corner car are discussed. The product model of the corner car is then designed and modelled systematically using a top-down modelling approach, and manufacturing drawings are created for the parts and assemblies. The main tool used in the work was the Pro/ENGINEER software, with which the parametric product model was modelled; the Ansys software was used for the strength analysis of the structures. The goal of the work was achieved by analysing the most essential elements of the fundamentals of mass customisation and by following an analytical and systematic product development process. With an emphasis on quality, the product architecture was validated by carrying out a limited production run consisting of three corner cars configured with the product model. One of the cars was test-assembled at the Hyvinkää factory.
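The top-down parametric idea, in which a few driving parameters determine the rest of the geometry through explicit relations, can be sketched in a few lines. The class, parameter names and relations below are hypothetical illustrations, not part of the actual KONE product model.

```python
# Hedged, purely illustrative sketch of parameter-driven (top-down) modelling:
# a few driving dimensions fix derived quantities through explicit relations.
# Names and relations are hypothetical, not taken from the corner-car model.
from dataclasses import dataclass
from math import ceil

@dataclass
class CornerCarConfig:
    car_width: float              # mm, driving parameter
    car_depth: float              # mm, driving parameter
    car_height: float             # mm, driving parameter
    panel_module: float = 400.0   # mm, assumed width of one wall panel

    @property
    def floor_area_m2(self) -> float:
        return self.car_width * self.car_depth / 1e6

    @property
    def wall_panel_count(self) -> int:
        # assumed rule: one panel per module along the two wall runs
        return ceil((self.car_width + self.car_depth) / self.panel_module)

cfg = CornerCarConfig(car_width=1400, car_depth=1600, car_height=2200)
print(f"floor area: {cfg.floor_area_m2:.2f} m2, wall panels: {cfg.wall_panel_count}")
```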

Relevance:

60.00%

Publisher:

Abstract:

One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…

Relevance:

60.00%

Publisher:

Abstract:

Building Information Modelling has been changing the design and construction field ever since it entered the market. It took some time to show its capabilities, and it takes some time to be mastered before all of its best features can be exploited. Because it was conceived to be adopted from the earliest stage of design, so as to get the maximum from project decisions, it still struggles to adapt to existing buildings. In fact, there is a branch of this methodology dedicated to what has already been built, called Historic BIM or HBIM. This study aims to make clear what BIM and HBIM are, both from a theoretical point of view and in practice, by applying the state of the art from scratch to a case study. The fortress of San Felice sul Panaro was chosen: a marvellous building with a thousand years of history in its bricks, which has suffered violent earthquakes but is still standing. By means of this example, the limits that can be encountered when applying the BIM methodology to existing heritage will be shown, and all the new features that a simple 2D design could not achieve will be pointed out.

Relevance:

40.00%

Publisher:

Abstract:

This article considers alternative methods to calculate the fair premium rate of crop insurance contracts based on county yields. The premium rate was calculated using parametric and nonparametric approaches to estimate the conditional agricultural yield density. These methods were applied to a data set of county yields provided by the Brazilian Institute of Geography and Statistics (IBGE), for the period 1990 through 2002, for soybean, corn and wheat in the State of Paraná. In this article, we propose methodological alternatives for pricing crop insurance contracts, resulting in more accurate premium rates in a situation of limited data.
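A hedged sketch of the nonparametric variant of this calculation is given below: the county yield density is estimated with a Gaussian kernel, and the fair premium rate is taken as the expected yield shortfall below the guarantee divided by the guarantee. The yield series and the coverage level are illustrative and are not IBGE data.

```python
# Hedged sketch: fair premium rate of an area-yield contract from a nonparametric
# (Gaussian kernel) estimate of the county yield density. Data are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

yields = np.array([2.1, 2.4, 1.8, 2.9, 2.6, 2.2, 1.5, 2.8, 2.5, 2.3, 2.0, 2.7, 2.6])  # t/ha
coverage = 0.75                              # assumed coverage level
guarantee = coverage * yields.mean()         # guaranteed county yield

kde = gaussian_kde(yields)
grid = np.linspace(0.0, 1.5 * yields.max(), 2000)
density = kde(grid)
density /= np.trapz(density, grid)           # renormalise on the truncated grid

expected_shortfall = np.trapz(np.maximum(guarantee - grid, 0.0) * density, grid)
premium_rate = expected_shortfall / guarantee
print(f"fair premium rate: {premium_rate:.3%}")
```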

Relevance:

40.00%

Publisher:

Abstract:

Financial prediction has attracted a lot of interest due to the financial implications that the accurate prediction of financial markets can have. A variety of data-driven modelling approaches have been applied, but their performance has produced mixed results. In this study we apply both parametric (neural networks with active neurons) and nonparametric (analog complexing) self-organising modelling methods for the daily prediction of the exchange-rate market. We also propose a combined approach where the parametric and nonparametric self-organising methods are combined sequentially, exploiting the advantages of the individual methods with the aim of improving their performance. The combined method is found to produce promising results and to outperform the individual methods when tested with two exchange rates: the American Dollar and the Deutsche Mark against the British Pound.
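The sequential-combination idea can be sketched in a hedged, simplified form: a least-squares autoregressive model gives a first forecast, and a nearest-neighbour ("analog") search over past patterns corrects its residual. This is only a stand-in for the self-organising neural network and analog complexing methods used in the study, and the exchange-rate series is synthetic.

```python
# Hedged sketch of sequentially combining a parametric and a nonparametric
# predictor on a synthetic exchange-rate series.
import numpy as np

rng = np.random.default_rng(0)
rate = 1.6 + np.cumsum(rng.normal(0.0, 0.002, 600))   # synthetic exchange rate
p = 5                                                 # AR order / pattern length

X = np.stack([rate[i:i + p] for i in range(len(rate) - p)])   # lagged patterns
y = rate[p:]                                                  # next-day values
train = 500

coef, *_ = np.linalg.lstsq(X[:train], y[:train], rcond=None)  # parametric AR fit

def combined_forecast(t):
    pattern = X[t]
    parametric = pattern @ coef
    # analog step: reuse the residual of the most similar past pattern
    j = np.argmin(np.linalg.norm(X[:train] - pattern, axis=1))
    return parametric + (y[j] - X[j] @ coef)

errors = [y[t] - combined_forecast(t) for t in range(train, len(y))]
print(f"combined out-of-sample RMSE: {np.sqrt(np.mean(np.square(errors))):.5f}")
```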

Relevance:

40.00%

Publisher:

Abstract:

This thesis aims to understand the behavior of a low-rise unreinforced masonry (URM) building, the typical residential house in the Netherlands, when subjected to low-intensity earthquakes. In the last decades, the Groningen region has in fact been hit by several shallow earthquakes caused by the extraction of natural gas. In particular, the focus is on the internal non-structural walls and on their interaction with the structural parts of the building. A simple and cost-efficient 2D FEM model is developed, focused on the interfaces representing the mortar layers present between the non-structural walls and the rest of the structure. As a reference for geometries and materials, a full-scale prototype built at the EUCENTRE laboratory in Pavia (Italy) was taken into consideration. Firstly, a quasi-static analysis is performed by gradually applying a prescribed displacement at the roof floor of the structure. Sensitivity analyses are conducted on some key parameters characterizing the mortar; this allows the calibration of their values and an evaluation of the reliability of the model. Subsequently, a transient analysis is performed to subject the model to a seismic action and hence also evaluate the mechanical response of the building over time. Moreover, by creating a model representing the entire structure under consideration, it was possible to compare the results of this analysis with the displacements recorded in the experimental tests. As a result, some conditions for the model calibration are defined. The reliability of the model is then confirmed both by the reasonable results obtained from the sensitivity analysis and by the compatibility between the roof-floor top displacement measured in the experimental test and the same value obtained from the structural model.
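As a toy illustration of how a sensitivity analysis on an interface parameter feeds the calibration, the sketch below sweeps an assumed mortar-interface stiffness placed in series with a wall stiffness and records the force transmitted under an imposed drift. All values are assumptions, not properties of the EUCENTRE prototype or of the FEM model.

```python
# Hedged sketch of a sensitivity sweep over an interface (mortar-layer) stiffness:
# the interface acts in series with the wall, and the transmitted force under an
# imposed drift is recorded for each trial value. All numbers are assumptions.
imposed_drift = 0.002            # m, prescribed top displacement
wall_stiffness = 8.0e6           # N/m, assumed in-plane wall stiffness

for k_interface in (1e5, 1e6, 1e7, 1e8):            # N/m, swept interface stiffness
    k_series = 1.0 / (1.0 / k_interface + 1.0 / wall_stiffness)
    force_kN = k_series * imposed_drift / 1e3
    print(f"k_interface = {k_interface:.0e} N/m -> transmitted force = {force_kN:.2f} kN")
```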

Relevance:

30.00%

Publisher:

Abstract:

Matrix population models, elasticity analysis and loop analysis can potentially provide powerful techniques for the analysis of life histories. Data from a capture-recapture study on a population of southern highland water skinks (Eulamprus tympanum) were used to construct a matrix population model. Errors in elasticities were calculated by using the parametric bootstrap technique. Elasticity and loop analyses were then conducted to identify the life history stages most important to fitness. The same techniques were used to investigate the relative importance of fast versus slow growth, and rapid versus delayed reproduction. Mature water skinks were long-lived, but there was high immature mortality. The most sensitive life history stage was the subadult stage. It is suggested that life history evolution in E. tympanum may be strongly affected by predation, particularly by birds. Because our population declined over the study, slow growth and delayed reproduction were the optimal life history strategies over this period. Although the techniques of evolutionary demography provide a powerful approach for the analysis of life histories, there are formidable logistical obstacles in gathering enough high-quality data for robust estimates of the critical parameters.
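The parametric-bootstrap step can be sketched as follows: a stage-structured projection matrix gives the population growth rate and elasticities, and the bootstrap redraws the transition entries from binomial distributions to attach standard errors to the elasticities. The matrix entries, stage structure and sample size below are illustrative, not the Eulamprus tympanum estimates.

```python
# Hedged sketch: elasticities of a stage-structured matrix model and parametric
# bootstrap standard errors. Matrix entries and sample sizes are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def elasticities(A):
    """Return (lambda, E) with e_ij = (a_ij / lambda) * v_i * w_j / (v . w)."""
    vals, W = np.linalg.eig(A)
    k = np.argmax(vals.real)
    lam = vals[k].real
    w = np.abs(W[:, k].real)                            # stable stage distribution
    vals_l, V = np.linalg.eig(A.T)
    v = np.abs(V[:, np.argmax(vals_l.real)].real)       # reproductive values
    S = np.outer(v, w) / (v @ w)                        # sensitivities
    return lam, A * S / lam

# stages: juvenile, subadult, adult (transitions plus adult fecundity)
A = np.array([[0.0, 0.0, 2.5],
              [0.3, 0.4, 0.0],
              [0.0, 0.4, 0.8]])
lam, E = elasticities(A)

n_marked = 50                                           # assumed animals per transition
boot = []
for _ in range(1000):
    B = A.copy()
    for i, j in [(1, 0), (1, 1), (2, 1), (2, 2)]:       # survival/transition entries
        B[i, j] = rng.binomial(n_marked, A[i, j]) / n_marked
    boot.append(elasticities(B)[1])

print(f"lambda = {lam:.3f}")
print("elasticity standard errors:\n", np.std(boot, axis=0).round(3))
```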

Relevance:

30.00%

Publisher:

Abstract:

Izenman and Sommer (1988) used a non-parametric kernel density estimation technique to fit a seven-component model to the paper thickness of the 1872 Hidalgo stamp issue of Mexico. They observed an apparent conflict when fitting a normal mixture model with three components with unequal variances. This conflict is examined further by investigating the most appropriate number of components when fitting a normal mixture of components with equal variances.
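The model-choice question can be sketched with a standard equal-variance (tied-covariance) Gaussian mixture fitted for a range of component counts and compared by BIC. The thickness data below are synthetic, standing in for the 1872 Hidalgo measurements.

```python
# Hedged sketch: choosing the number of components of an equal-variance normal
# mixture by BIC. The "thickness" data are synthetic, not the stamp measurements.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
thickness = np.concatenate([
    rng.normal(0.072, 0.004, 200),
    rng.normal(0.080, 0.004, 150),
    rng.normal(0.090, 0.004, 100),
]).reshape(-1, 1)

for k in range(1, 8):
    gm = GaussianMixture(n_components=k, covariance_type="tied",
                         n_init=5, random_state=0).fit(thickness)
    print(f"{k} components: BIC = {gm.bic(thickness):.1f}")
```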

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied across any field, irrespective of the type of response.
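A hedged sketch of the two-step idea follows: a nonparametric learner (here a tree ensemble, standing in for the decision trees and generalized additive models mentioned above) screens the predictors, and a parametric logistic regression is then fitted on the retained variables. The data are synthetic, not the cardiac dataset.

```python
# Hedged sketch of the two-step process: nonparametric screening of predictors,
# then a parametric predictive model on the retained set. Synthetic data only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1710, n_features=20, n_informative=5,
                           random_state=0)

# step 1: nonparametric screening via tree-ensemble variable importance
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
keep = np.argsort(forest.feature_importances_)[::-1][:5]

# step 2: parametric predictive model on the screened variables
logit = LogisticRegression(max_iter=1000)
auc = cross_val_score(logit, X[:, keep], y, cv=5, scoring="roc_auc")
print(f"retained variables: {sorted(keep.tolist())}, cross-validated AUC = {auc.mean():.3f}")
```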

Relevance:

30.00%

Publisher:

Abstract:

Doctoral Thesis in Sciences (Specialisation in Mathematics)

Relevance:

30.00%

Publisher:

Abstract:

In the PhD thesis “Sound Texture Modeling” we deal with the statistical modelling of textural sounds like water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of the more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows a database of units to be explored sonically by means of their representation in a perceptual feature space. Concatenative synthesis with “molecules” built from sparse atomic representations also allows low-level correlations in perceptual audio features to be captured, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models contribute as low-level building blocks within a semi-automated soundscape generation system.
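The first stage above, decomposing a texture-like signal into a tree of wavelet coefficients before any probabilistic modelling, can be sketched with a plain Haar pyramid. The per-level statistics printed here are only a crude stand-in for the hidden Markov tree model, and the input is synthetic noise rather than a recorded texture.

```python
# Hedged sketch: a plain Haar wavelet pyramid of a texture-like signal, with
# per-level coefficient statistics as a crude stand-in for the probabilistic
# (hidden Markov tree) modelling step. The signal is synthetic noise.
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(size=2**14)                # stand-in for a water/rain texture

def haar_pyramid(x, levels):
    """Return per-level detail coefficients and the final approximation."""
    details, approx = [], x
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return details, approx

details, approx = haar_pyramid(signal, levels=6)
for level, d in enumerate(details, start=1):
    print(f"level {level}: {d.size} coefficients, std = {d.std():.3f}")
```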

Relevance:

30.00%

Publisher:

Abstract:

A quasi-optical de-embedding technique for characterizing waveguides is demonstrated using wideband time-resolved terahertz spectroscopy. A transfer function representation is adopted for the description of the signal at the input and output ports of the waveguides. The time-domain responses were discretised and the waveguide transfer function was obtained through a parametric approach in the z-domain after describing the system with an ARX as well as with a state-space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize signal distortion and the noise propagating in the ARX and subspace models. The model identification procedure requires isolation of the phase delay in the structure, and therefore the time-domain signatures must first be aligned with respect to each other before they are compared. An initial estimate of the number of propagating modes was provided by comparing the measured phase delay in the structure with theoretical calculations that take into account the physical dimensions of the waveguide. Models derived from measurements of THz transients in a precision WR-8 waveguide adjustable short will be presented.
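The z-domain parametric step can be sketched with a plain least-squares ARX fit on discretised input/output records. The second-order system and noise level below are assumptions used to generate synthetic signals, not THz measurements or the model orders used in the work.

```python
# Hedged sketch: least-squares ARX identification from discretised input/output
# records. The simulated second-order system and signals are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(4)
n, na, nb = 2000, 2, 2                       # record length and ARX orders (assumed)
u = rng.normal(size=n)                       # "input port" signal

a_true, b_true = [1.2, -0.5], [0.4, 0.3]     # assumed true coefficients
y = np.zeros(n)
for t in range(2, n):
    y[t] = (a_true[0] * y[t - 1] + a_true[1] * y[t - 2]
            + b_true[0] * u[t - 1] + b_true[1] * u[t - 2]
            + 0.01 * rng.normal())

# regressor matrix [y(t-1), y(t-2), u(t-1), u(t-2)] and least-squares solution
start = max(na, nb)
Phi = np.array([np.r_[y[t - 1:t - na - 1:-1], u[t - 1:t - nb - 1:-1]]
                for t in range(start, n)])
theta, *_ = np.linalg.lstsq(Phi, y[start:], rcond=None)
print("estimated [a1, a2, b1, b2]:", theta.round(3))
```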