937 results for "Inovation models in nets"


Relevance: 100.00%

Abstract:

This work studied external simulation models for a disc filter in an integrated simulation environment. The aim was to improve an existing mechanistic disc filter model. The model was built for a dynamic simulator developed for the needs of the paper industry (APMS); an external supplementary model, which makes use of measurement data from the disc filter manufacturer, was added to the original mechanistic model in the simulator. The availability of equipment data to filter users was improved by creating disc filter equipment data definitions on a server located on the Internet. The filter manufacturer can serve its customers by uploading equipment data to the server and linking the data to the simulation model. This is made possible by an integrated simulation environment used over the Internet, whose purpose is to comprehensively combine simulation and process design. The designer is offered tools for dynamic simulation, balance simulation and diagram drawing while process equipment data is available. These tools are to be implemented in a project named Galleria, which creates a process model and equipment data server on the Internet. Through the Galleria user interface, a process designer can use various simulation programs and the ready-made models created for them, as well as obtain up-to-date equipment data. The external disc filter model computes the filtrate flows and filtrate consistencies for the cloudy, clear and superclear filtrates. The model's input parameters are the rotation speed of the discs, the consistency of the incoming feed, the freeness, and a control parameter that adjusts the ratio between the cloudy and clear filtrates. The freeness indicates which pulp is in question: the higher the freeness, the better the pulp filters and the cleaner the filtrates generally are. The model parameters were tuned by regression analysis, aided by feedback from the manufacturer.
The user can choose whether to use the external or the original model. The original model must first be initialised by giving it nominal operating points for the flows and consistencies at a given rotation speed. The equations of the external model can be used to initialise the original model if the original model performs better than the external one. The external model can also be used without the simulation program, directly from the Galleria server. The user thus gains the possibility to examine disc filter parameters and view the filtration results from his or her own workstation anywhere, as long as an Internet connection is available. As a result of this work, the availability of disc filter equipment data to users improved, and the limitations and shortcomings of the original simulation model were reduced.

Relevance: 100.00%

Abstract:

The aim of this study was to examine the value network and business models of wireless Internet services. The study was qualitative in nature and used a constructive case study as its research strategy. The example service was the Treasure Hunters mobile phone game. The study consisted of a theoretical and an empirical part. The theoretical part conceptually linked innovation, business models and the value network to one another, and laid the foundation for the development of business models. The empirical part focused first on creating business models based on the developed innovations; finally, the aim was to define the value network for implementing the service. Innovation sessions, interviews and a questionnaire survey were used as research methods. Based on the results, several business concepts were formed, as well as a description of a basic value network model for wireless games. The conclusion was that wireless services require a value network consisting of several actors in order to be realised.

Relevance: 100.00%

Abstract:

Development of research methods requires a systematic review of their status. This study focuses on the use of Hierarchical Linear Modeling methods in psychiatric research. The evaluation covers 207 documents published up to 2007 and indexed in the ISI Web of Knowledge databases; the analysis focuses on the 194 articles in the sample. Bibliometric methods are used to describe publication patterns. Results indicate a growing interest in applying the models and an establishment of the methods after 2000. Both Lotka's and Bradford's distributions fit the data.
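Lotka's law, mentioned above, posits that the number of authors who publish n papers falls off roughly as n^(-a) with a near 2. As a hedged aside (the function and data below are illustrative inventions, not code or counts from the study), the exponent can be estimated from a productivity table with a log-log regression:

```python
import math

def fit_lotka_exponent(papers_per_author_counts):
    """Estimate the Lotka exponent a in y(n) = C / n^a.

    papers_per_author_counts: dict mapping n (papers written)
    to the number of authors who wrote exactly n papers.
    """
    xs = [math.log(n) for n in papers_per_author_counts]
    ys = [math.log(c) for c in papers_per_author_counts.values()]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return -slope  # classical Lotka's law predicts a close to 2

# Synthetic table following an inverse-square law, y(n) = 1000 / n^2
counts = {n: 1000 // n**2 for n in range(1, 6)}
a = fit_lotka_exponent(counts)
```

On data generated from an exact inverse-square law, the recovered exponent is close to 2, which is the kind of agreement the bibliometric fit above checks for.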

Relevance: 100.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has increased considerably over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays we face many problems, ranging from water prospection to the sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem considered, the main challenge remains dealing with our incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost of performing complex flow simulations for each realization.
In the first part of the thesis, this issue is explored in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods use approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex, computationally heavy flow model is then run only for this subset, and inference is based on these exact responses. Our objective is to improve the performance of this approach by using all of the available information, not solely the subset of exact responses. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. Following a machine learning approach, this information is used to construct an error model that corrects the remaining approximate responses and predicts the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational cost, and it makes the uncertainty propagation more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the proxy and exact flow models. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty, and the individual correction of each proxy response leads to an excellent prediction of the exact response, opening the door to many applications.
The concept of a functional error model is useful not only for uncertainty propagation but also, and perhaps even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, in which the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of each proposal. In the third part of the thesis, the proxy flow model coupled with an error model provides this preliminary evaluation for the two-stage MCMC set-up. We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 with respect to classical one-stage MCMC.
An open question remains: how to choose the size of the learning set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy such that, as new flow simulations are performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, in which the methodology is applied to a problem of saline intrusion in a coastal aquifer.
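The error-model idea described above (learn, on a small training subset, the map from cheap proxy responses to exact responses in a reduced space, then correct every proxy curve) can be sketched in a few lines. This is a toy illustration only: the synthetic curves, the affine proxy/exact relationship, and plain PCA standing in for FPCA are all assumptions, not the thesis's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
# "Exact" responses: each realization is a random mix of two temporal modes
ab = rng.uniform(-1.0, 1.0, (200, 2))
exact_all = ab @ np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
proxy = 0.8 * exact_all + 0.1  # cheap, biased approximation of each curve

train = np.arange(20)          # subset where the exact model was also run
mean = proxy.mean(axis=0)
_, _, Vt = np.linalg.svd(proxy - mean, full_matrices=False)
k = 2
scores = (proxy - mean) @ Vt[:k].T  # reduced representation of each curve

# Linear error model fitted on the training subset: exact ~ scores @ B + b0
A = np.hstack([scores[train], np.ones((len(train), 1))])
coef, *_ = np.linalg.lstsq(A, exact_all[train], rcond=None)
corrected = np.hstack([scores, np.ones((len(scores), 1))]) @ coef

mse_proxy = float(np.mean((proxy - exact_all) ** 2))
mse_corrected = float(np.mean((corrected - exact_all) ** 2))
```

Because the proxy here is an affine distortion of the exact response, twenty training runs suffice for the corrected curves to be far closer to the exact ones than the raw proxy is; in the thesis, the same construction is carried out for functional flow responses with FPCA providing the reduced space.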

Relevance: 100.00%

Abstract:

Objectives: The purpose of this study is to determine the possible differences in deflection between two needles of the same length and external gauge but different internal gauges during truncal block of the inferior alveolar nerve. The initial working hypothesis was that greater deflection may be expected with larger internal gauge needles. Study design: Four clinicians subjected 346 patients to inferior alveolar nerve block and infiltrating anesthesia along the buccal nerve trajectory for the surgical or conventional extraction of the lower third molar. A non-autoaspirating syringe system with two types of needle was used: a standard 27-gauge x 35-mm needle with an internal gauge of 0.215 mm, or an XL Monoprotect® 27-gauge x 35-mm needle with an internal gauge of 0.265 mm. The following information was systematically recorded for each patient: needle type, gender, anesthetic technique (direct or indirect truncal block), the number of bone contacts during the procedure, the extraction side, the practitioner performing the technique, and blood aspiration (positive or negative). Results: 346 needles were used in total: 190 standard needles (27-gauge x 35-mm, internal gauge 0.215 mm) and 156 XL Monoprotect® needles. Deflection was observed in 49.1% of cases (170 needles), comprising 94 standard and 76 XL Monoprotect® needles. Needle torsion ranged from 0° to 6°. Conclusions: No significant differences in deflection were recorded with respect to internal gauge, operator, extraction side, anesthetic technique or the number of bone contacts during the procedure.

Relevance: 100.00%

Abstract:

This thesis examines whether global, local and exchange rate risks are priced in the Scandinavian countries' equity markets, using conditional international asset pricing models. The employed models are the world capital asset pricing model, the international asset pricing model augmented with currency risk, and the partially segmented model augmented with currency risk. Moreover, this research traces the estimated equity risk premiums for the Scandinavian countries. The empirical part of the study is performed using the generalized method of moments (GMM) approach, with monthly observations from February 1994 to June 2007. Investors' conditional expectations are modeled using several instrumental variables. In order to keep the system parsimonious, the prices of risk are assumed to be constant, whereas expected returns and conditional covariances vary over time. The empirical findings of this thesis suggest that both global and local market risks are priced in the Scandinavian countries, indicating that these countries are mildly segmented from the global markets. Furthermore, the results show that exchange rate risk is priced in the Danish and Swedish stock markets when the partially segmented model is augmented with the currency risk factor.

Relevance: 100.00%

Abstract:

In general, models of ecological systems can be broadly categorized as 'top-down' or 'bottom-up', based on the hierarchical level on which the model processes are formulated. The structure of a top-down, also known as phenomenological, population model can be interpreted in terms of population characteristics, but it typically lacks an interpretation on a more basic level. In contrast, bottom-up, also known as mechanistic, population models are derived from assumptions and processes on a more basic level, which allows interpretation of the model parameters in terms of individual behavior. Both approaches, phenomenological and mechanistic modelling, have their advantages and disadvantages in different situations. However, mechanistically derived models may be better at capturing the properties of the system at hand, and thus give more accurate predictions. In particular, when models are used for evolutionary studies, mechanistic models are more appropriate, since natural selection takes place on the individual level, and in mechanistic models the direct connection between model parameters and individual properties has already been established. The purpose of this thesis is twofold. Firstly, a systematic way to derive mechanistic discrete-time population models is presented. The derivation is based on combining explicitly modelled, continuous processes on the individual level within a reproductive period with a discrete-time maturation process between reproductive periods. Secondly, as an example of how evolutionary studies can be carried out in mechanistic models, the evolution of the timing of reproduction is investigated. Thus, these two lines of research, the derivation of mechanistic population models and evolutionary studies, complement each other.
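A classic textbook example of the derivation style described above (chosen for illustration; it is not necessarily the model derived in the thesis) is the Beverton-Holt map: if the b*N offspring of N adults compete pairwise within a season of unit length, dN/dτ = -c N², then solving the within-season dynamics yields the discrete-time recursion N(t+1) = b N(t) / (1 + c b N(t)), whose parameters b (per-capita fecundity) and c (competition rate) are individual-level quantities. The constants below are arbitrary.

```python
def beverton_holt(n, b=4.0, c=0.02):
    """Closed-form between-season map derived from within-season competition."""
    return b * n / (1.0 + c * b * n)

def within_season_numeric(n, b=4.0, c=0.02, steps=100000):
    """Euler integration of dN/dtau = -c N^2 from N(0) = b*n over tau in [0, 1]."""
    x, dt = b * n, 1.0 / steps
    for _ in range(steps):
        x -= c * x * x * dt
    return x

n = 50.0
closed = beverton_holt(n)          # exact solution of the within-season ODE
numeric = within_season_numeric(n)  # should agree closely with `closed`
```

Iterating `beverton_holt` converges to the equilibrium (b - 1)/(c b), so the population-level fixed point is expressed directly in terms of the individual-level parameters, which is exactly what makes such models convenient for evolutionary analysis.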

Relevance: 100.00%

Abstract:

Selected papers from the workshop "Development of models and forest soil surveys for monitoring of soil carbon", Koli, Finland, April 5-9, 2006.

Relevance: 100.00%

Abstract:

Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology: to comprehend the function of complex biological systems. Systems biology combines methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Unlike "traditional" biology, systems biology focuses on high-level concepts such as network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization and concurrency, among many others. The very terminology of systems biology is "foreign" to "traditional" biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods, originating in computer science and mathematics, for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton.
The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although the methods are applied to these case studies, they are not limited to them and can be utilized in the analysis of other biological mechanisms, as well as of complex systems in general. The full range of developed and applied modelling techniques, together with the model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussion of their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.

Relevance: 100.00%

Abstract:

The aim of this study was to compare hydrographically conditioned digital elevation models (HCDEMs) generated from data of the VNIR (Visible and Near Infrared) sensor of ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), from SRTM (Shuttle Radar Topography Mission), and from IBGE topographical maps at a scale of 1:50,000, processed in a Geographical Information System (GIS), aiming at the morphometric characterization of watersheds. The São Bartolomeu River sub-basin was taken as the study area, and its morphometric characteristics were obtained from the HCDEMs. The Root Mean Square Error (RMSE) and cross-validation were the statistical indexes used to evaluate the quality of the HCDEMs. The percentage differences in the morphometric parameters obtained from the three data sets were less than 10%, except for the mean slope (21%). In general, good agreement was observed between the HCDEMs generated from remote sensing data and from the IBGE maps. The HCDEM from ASTER performed slightly better than the HCDEM from SRTM, and was more accurate in basins with high altitudes and rugged terrain, presenting an altimetric frequency distribution closest to the HCDEM from IBGE, taken as the standard in this study.
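The RMSE used above as a quality index compares elevations sampled from a tested DEM against the reference at the same points. A minimal sketch of the calculation (the elevation values below are invented for illustration, not data from the study):

```python
import math

def rmse(reference, tested):
    """Root Mean Square Error between paired elevation samples (same points)."""
    return math.sqrt(
        sum((r - t) ** 2 for r, t in zip(reference, tested)) / len(reference)
    )

# Hypothetical elevations (m) at four check points: reference map vs. tested DEM
ibge = [612.0, 640.5, 701.2, 688.0]
aster = [615.0, 638.0, 705.0, 690.5]
error = rmse(ibge, aster)
```

The same function is applied per DEM, and the model with the smaller RMSE against the reference is judged the more accurate.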

Relevance: 100.00%

Abstract:

Cells of epithelial origin, e.g. from breast and prostate cancers, effectively differentiate into complex multicellular structures when cultured in three dimensions (3D) instead of on conventional two-dimensional (2D) adherent surfaces. The spectrum of different organotypic morphologies is highly dependent on the culture environment, which can be either non-adherent or scaffold-based. When embedded in physiological extracellular matrices (ECMs), such as laminin-rich basement membrane extracts, normal epithelial cells differentiate into acinar spheroids reminiscent of glandular ductal structures. Transformed cancer cells, in contrast, typically fail to undergo acinar morphogenesis, forming poorly differentiated or invasive multicellular structures. The 3D cancer spheroids are widely accepted to better recapitulate various tumorigenic processes and drug responses. So far, however, 3D models have been employed predominantly in academia; the pharmaceutical industry has yet to adopt them for wide and routine use. This is mainly due to poor characterisation of the cell models and the lack of standardised workflows, high-throughput cell culture platforms, and proper readout and quantification tools. In this thesis, a complete workflow has been established, entailing well-characterised 3D cell culture models for prostate cancer, a standardised 3D cell culture routine based on a high-throughput-ready platform, automated image acquisition with concomitant morphometric image analysis, and data visualisation, in order to enable large-scale high-content screens. Our integrated suite of software and statistical analysis tools was optimised and validated using a comprehensive panel of prostate cancer cell lines and 3D models.
The tools quantify multiple key cancer-relevant morphological features, ranging from cancer cell invasion through multicellular differentiation to growth, and detect dynamic changes both in morphology and in function, such as cell death and apoptosis, in response to experimental perturbations including RNA interference and small-molecule inhibitors. Our panel of cell lines included many non-transformed lines and most currently available classic prostate cancer cell lines, which were characterised for their morphogenetic properties in 3D laminin-rich ECM. The phenotypes and gene expression profiles were evaluated for their relevance to pre-clinical drug discovery, disease modelling and basic research. In addition, a spontaneous model of invasive transformation was discovered, displaying a high degree of epithelial plasticity. This plasticity is mediated by an abundant bioactive serum lipid, lysophosphatidic acid (LPA), and its receptor LPAR1. The invasive transformation was caused by abrupt cytoskeletal rearrangement through impaired G protein alpha 12/13 and RhoA/ROCK signalling, and was mediated by upregulated adenylyl cyclase/cyclic AMP (cAMP)/protein kinase A and Rac/PAK pathways. The spontaneous invasion model tangibly exemplifies the biological relevance of organotypic cell culture models. Overall, this thesis work underlines the power of novel morphometric screening tools in drug discovery.

Relevance: 100.00%

Abstract:

The objective of this thesis is to concretize the potential benefits that the industrial maintenance case network could achieve by using the value-based life-cycle model and the flexible asset management model. It is also examined which factors prevent value creation and sharing in the maintenance contract practices of the case network. This thesis is a case study that utilizes modelling. Four scenarios were developed to demonstrate value creation in the future. The data was partly provided by the collaborating company and partly gathered from public financial statements. The results indicate that value has been created through the past maintenance of the collaborating company's rod mill, and that the profitability of the collaborating company has mostly been at a satisfactory level during the past few years. Additional value might be created by increasing the share of proactive maintenance of the rod mill in the future. The profitability of the network could be improved through flexible asset management operations. The main obstacle to value creation and sharing appears to be the lack of sufficient trust between the network members.

Relevance: 100.00%

Abstract:

The purpose of this Master's thesis was to study business model development in the Finnish newspaper industry over the next ten years through scenario planning. The objective was to see how the business models will develop amidst the many changes in the industry, what factors are driving the change, what the implications of these changes are for the players in the industry, and how the Finnish newspaper companies should evolve in order to succeed in the future. In this thesis, business model change is studied across all the elements of business models, as it was discovered that the industry too often focuses on changes in only a few of those elements, and a broader view can provide valuable information for the companies. The results revealed that the industry will be affected by many changes during the next ten years. Scenario planning provides a good tool for analyzing this change and for developing valuable options for businesses. After conducting a series of interviews and identifying the forces driving the change, four different scenarios were developed, centered on the role that newspapers will take and the level at which they will provide content in the future. These scenarios indicated that there is a variety of ways in which the business models may develop, and that companies should start making decisions proactively in order to succeed. As the business model elements are interdependent, changes made in one element will affect the whole model, making these decisions about the role and the level of content important for the companies. In the future, the Finnish newspaper industry is likely to include many different kinds of business models; some may be drastically different from the current ones, while others may remain similar but better take into account the new kind of media environment.

Relevance: 100.00%

Abstract:

Previous genetic association studies have overlooked the potential for biased results when analyzing different population structures in ethnically diverse populations. The purpose of the present study was to quantify this bias in two-locus association studies conducted on an admixed urban population. We studied the genetic structure distribution of the angiotensin-converting enzyme insertion/deletion (ACE I/D) and angiotensinogen methionine/threonine (M/T) polymorphisms in 382 subjects from three subgroups of a highly admixed urban population. Group I included 150 white subjects; group II, 142 mulatto subjects; and group III, 90 black subjects. We conducted sample size simulation studies using these data under different genetic models of gene action and interaction, and used genetic distance calculation algorithms to help determine the population structure for the studied loci. Our results showed a statistically different population structure distribution of both the ACE I/D (P = 0.02, OR = 1.56, 95% CI = 1.05-2.33 for the D allele, white versus black subgroup) and the angiotensinogen M/T polymorphism (P = 0.007, OR = 1.71, 95% CI = 1.14-2.58 for the T allele, white versus black subgroup). Sample size is predicted to be a determinant of the power to detect a given genotypic association with a particular phenotype when conducting two-locus association studies in admixed populations. In addition, the postulated genetic model is also a major determinant of the power to detect any association in a given sample. The present simulation study helped to demonstrate the complex interrelation among ethnicity, the power of the association, and the postulated genetic model of action of a particular allele in the context of clustering studies. This information is essential for the correct planning and interpretation of future association studies conducted on this population.
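The odds ratios and 95% confidence intervals reported above are standard 2x2 allele-count statistics (Woolf's log-normal approximation). A minimal sketch of the calculation; the counts below are hypothetical placeholders, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table.

    a/b: allele present/absent in group 1; c/d: same in group 2.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical D-allele carriage counts in two subgroups
or_d, lo, hi = odds_ratio_ci(90, 60, 45, 45)
```

If the resulting confidence interval excludes 1, the allele frequency difference between the subgroups is nominally significant, which is the form of the comparisons (e.g. white versus black subgroup) reported in the abstract.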

Relevance: 100.00%

Abstract:

Animal models have a long history of being useful tools, not only to test and select vaccines, but also to help understand the elaborate details of the immune response that follows infection. Different models have been extensively used to investigate putative immunological correlates of protection against parasitic diseases that are important to reach a successful vaccine. The greatest challenge has been the improvement and adaptation of these models to reflect the reality of human disease and the screening of vaccine candidates capable of overcoming the challenge of natural transmission. This review will discuss the advantages and challenges of using experimental animal models for vaccine development and how the knowledge achieved can be extrapolated to human disease by looking into two important parasitic diseases: malaria and leishmaniasis.