988 results for "Gaussian assumption"


Relevance:

10.00%

Publisher:

Abstract:

This article seeks to refute the hypothesis that parties choose majoritarian electoral systems, and choose to keep them unchanged, whenever the party system approaches two-party competition and neither of the two major parties risks losing its position to a new competitor. Conversely, proportional electoral systems are the consequence of multipartism, in which no party has a realistic chance of winning a majority of the votes. The Valencian case, however, confirms the hypothesis only partially: in 1982 the parties adopted proportional rules because elections to the regional parliaments were considered secondary, and not only because of the multipartism existing at the time. On the other hand, the case does confirm that the change initiated in 2006 with the statutory reform preserves, for now, the status quo so as not to alter the formation of parliamentary majorities. It remains pending, however, to amend the Electoral Law, which will determine the minimum share of votes required to enter the Corts.


Over the past several years we conducted a comprehensive study of the pore systems of limestones used as coarse aggregate in portland cement concrete (PCC) and their relationship to freeze-thaw aggregate failure. A simple test, the Iowa Pore Index Test, was developed and used to identify coarse aggregates with freeze-thaw-susceptible pore systems. Essentially, it identified aggregates that could take on a considerable amount of water, but only at a slow rate. The assumption was that if an aggregate takes on a considerable amount of water at a slow rate, its pore system will impede the outward movement of water through a critically saturated particle during freezing, causing particle fracture. The test was quite successful in identifying aggregates with susceptible pore systems when the aggregates were clean carbonates containing less than 2% or 3% insolubles. The correlation between service record, ASTM C666B and the pore index test was good, but not good enough. It became apparent over the past year that factors other than the pore system could cause an aggregate to fail when used in PCC. The role that silica and clay play in aggregate durability was therefore studied.


At present, no in vivo or in vitro method for assessing chemically induced photosensitization has been adopted by regulatory authorities. Recently, we proposed the use of THP-1 cells and IL-8 release to identify the potential of chemicals to induce skin sensitization. Based on the assumption that sensitization and photosensitization share common mechanisms, the aim of this work was to explore the THP-1 model as an in vitro model for identifying photoallergenic chemicals. THP-1 cells were exposed to 7 photoallergens and 3 photoirritants and either irradiated with UVA light or kept in the dark. Non-phototoxic allergens and irritants were also included as negative compounds. Following 24 h of incubation, cytotoxicity and IL-8 release were measured. At subtoxic concentrations, photoallergens produced a dose-related increase in IL-8 release after irradiation. Some photoirritants also produced a slight increase in IL-8 release. However, when the overall IL-8 stimulation indexes were calculated for each chemical, 6 of the 7 photoallergens tested reached a stimulation index above 2, whereas the entire set of negative compounds had stimulation indexes below 2. Our data suggest that this assay may become a useful cell-based in vitro test for evaluating the photosensitizing potential of chemicals.
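The decision rule described above — flag a chemical as a potential photoallergen when its IL-8 stimulation index (UVA-irradiated versus dark release) exceeds 2 — can be sketched as follows. The function names and the example readings are illustrative assumptions, not data or code from the study.

```python
def stimulation_index(il8_uva, il8_dark):
    """Ratio of IL-8 release after UVA irradiation to release in the dark."""
    if il8_dark <= 0:
        raise ValueError("baseline IL-8 release must be positive")
    return il8_uva / il8_dark

def is_potential_photoallergen(il8_uva, il8_dark, threshold=2.0):
    """Apply the SI > 2 cut-off described in the abstract."""
    return stimulation_index(il8_uva, il8_dark) > threshold

# Illustrative readings (e.g. pg/mL), not data from the study:
flagged = is_potential_photoallergen(520.0, 180.0)      # SI roughly 2.9
not_flagged = is_potential_photoallergen(210.0, 190.0)  # SI roughly 1.1
```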


Positron emission tomography (PET) is a functional, noninvasive method for imaging regional metabolic processes that is nowadays most often combined with morphological imaging by computed tomography (CT). Its use is based on the well-founded assumption that metabolic changes occur earlier in tumors than morphologic changes, adding another dimension to imaging. This article reviews the established and investigational indications and radiopharmaceuticals for PET/CT imaging of prostate cancer, bladder cancer and testicular cancer, before presenting upcoming applications in radiation therapy.


Machine Learning for geospatial data: algorithms, software tools and case studies. This thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced variables ("geo-features"). They are well suited for implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition through modeling and prediction to automatic data mapping. Their efficiency is comparable to that of geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.
The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multilayer perceptron (MLP), a workhorse of machine learning; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising (Kohonen) maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial, and a very important, part of any data analysis. In this thesis, exploratory spatial data analysis (ESDA) is treated both with the traditional geostatistical approach, experimental variography, and with machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps detect spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is very simple and has excellent interpretation and visualization properties.
An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model for this task. The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to make the software user-friendly and easy to use.
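The general regression neural network proposed above for automatic mapping is, at its core, Nadaraya-Watson kernel regression: the prediction at a query location is a kernel-weighted average of the observed values. A minimal sketch with an isotropic Gaussian kernel follows; the bandwidth and the toy spatial data are illustrative assumptions, not the thesis's settings.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=1.0):
    """GRNN / Nadaraya-Watson prediction: Gaussian-kernel-weighted
    average of training targets, one prediction per query point."""
    # Squared Euclidean distances, shape (n_query, n_train).
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

# Toy 2-D "spatial" field: the value trends with the x-coordinate.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(500, 2))
vals = pts[:, 0] + 0.1 * rng.standard_normal(500)

query = np.array([[2.0, 5.0], [8.0, 5.0]])
pred = grnn_predict(pts, vals, query, sigma=1.0)
# Predictions should roughly track the trend: near 2 and near 8.
```

The single bandwidth parameter is what makes the GRNN attractive for automatic mapping: it can be tuned by cross-validation without any manual variogram modelling.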


Innate immunity reacts to conserved bacterial molecules. The outermost lipopolysaccharide (LPS) of Gram-negative organisms is highly inflammatory. It activates responsive cells via the specific surface receptor CD14 and the Toll-like receptor-4 (TLR4) co-receptor. Gram-positive bacteria do not contain LPS, but carry surface teichoic acids, lipoteichoic acids and peptidoglycan instead. Among these, the thick peptidoglycan is the most conserved. It also triggers cytokine release via CD14, but uses the TLR2 co-receptor instead of the TLR4 used by LPS. Moreover, whole peptidoglycan is 1000-fold less active than LPS on a weight-to-weight basis. This suggests either that it is not important for inflammation, or that only part of it is reactive while the rest acts as ballast. Biochemical dissection of Staphylococcus aureus and Streptococcus pneumoniae cell walls indicates that the second assumption is correct. Long, soluble peptidoglycan chains (approximately 125 kDa) are poorly active. Hydrolysing these chains to their minimal unit (2 sugars and a stem peptide) completely abrogates inflammation. Enzymatic dissection of the pneumococcal wall generated a mixture of highly active fragments, constituted of trimeric stem peptides, and poorly active fragments, constituted of simple monomers and dimers or of highly polymerized structures. Hence, the optimal constraint for activation might be 3 cross-linked stem peptides. The importance of structural constraint was demonstrated in additional studies. For example, replacing the first L-alanine in the stem peptide with a D-alanine totally abrogated inflammation in experimental meningitis. Likewise, modifying the D-alanine decorations of lipoteichoic acids with L-alanine, or deacylating them from their diacylglycerol lipid anchor, also decreased the inflammatory response. Thus, although considered a broad-spectrum pattern-recognition system, innate immunity can detect very subtle differences in Gram-positive walls. This high specificity underlines the importance of using well-characterized microbial material when investigating the system.


Classical models of sex-chromosome evolution assume that sexually antagonistic genes accumulate on sex chromosomes, leading to a non-recombining region that progressively expands and favors the accumulation of deleterious mutations. Consistent with this theory, the sex chromosomes of extant mammals and birds are considerably differentiated. In most ectothermic vertebrates, such as frogs, however, sex chromosomes are undifferentiated, and a striking diversity of sex-determination systems is observed. This thesis investigates this apparent contrast in sex-chromosome evolution between endothermic and ectothermic vertebrates. The "high-turnover" hypothesis holds that sex chromosomes arise regularly from autosomes, preventing decay. The "fountain-of-youth" hypothesis posits that sex chromosomes undergo episodic X-Y recombination in sex-reversed XY females, thereby purging ("rejuvenating") the Y chromosome. Based on theoretical and empirical studies, we suggest that both processes can be driven by the environment and likely played an important role in the sex-chromosome evolution of ectothermic vertebrates.
The literature largely views sex determination as a dichotomous process: individual sex is assumed to be determined either by genetic factors (genotypic sex determination, GSD) or by environmental factors (environmental sex determination, ESD), most often temperature (temperature-dependent sex determination, TSD). We endorse an alternative view, which sees GSD and TSD as the ends of a continuum. The conservatism of molecular processes among different sex-determination systems strongly supports the continuum view. We propose to define sex as a threshold trait underlain by a liability factor, with reaction norms allowing the modeling of interactions between genotypic and temperature effects. We show that temperature changes (due, e.g., to climatic changes or range expansions) are expected to provoke turnovers in the sex-determination mechanisms that maintain homomorphic sex chromosomes. The balanced lethal system of crested newts might be the result of such a sex-determination turnover, originating from two variants of ancient Y chromosomes. Observations from a group of tree frogs, on the other hand, support the fountain-of-youth hypothesis. We then show that low rates of sex reversal in species with GSD might actually be adaptive, considering the joint effects of deleterious-mutation purging and sexually antagonistic selection. Ongoing climatic changes are expected to threaten species with TSD by biasing population sex ratios. In contrast, species with GSD are implicitly assumed to be immune to such changes, because genetic systems are thought to necessarily produce even sex ratios. We show that this assumption may be wrong, and that sex-ratio biases caused by climatic change may represent a previously unrecognized extinction threat for some GSD species.


We study biased, diffusive transport of Brownian particles through narrow, spatially periodic structures in which the motion is constrained in the lateral directions. The problem is analyzed from the perspective of the Fick-Jacobs equation, which accounts for the effect of the lateral confinement by introducing an entropic barrier into a one-dimensional diffusion. The validity of this approximation, based on the assumption of instantaneous equilibration of the particle distribution in the cross-section of the structure, is analyzed by comparing the different time scales that characterize the problem. A validity criterion is established in terms of the shape of the structure and of the applied force. It is analytically corroborated, and verified by numerical simulations, that the critical value of the force up to which this description holds true scales as the square of the periodicity of the structure. The criterion can be visualized by means of a diagram representing the regions where the Fick-Jacobs description becomes inaccurate, in terms of the scaled force versus the periodicity of the structure.
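For reference, the Fick-Jacobs reduction invoked above replaces diffusion in a channel of varying cross-section σ(x) by a one-dimensional equation with an entropic contribution to the free energy; in a common form (normalizations and conventions vary between papers):

```latex
% Fick-Jacobs equation for the marginal density P(x,t) in a channel of
% cross-section sigma(x), with free energy A(x) combining the applied
% force F and the entropic term -k_B T ln sigma(x):
\begin{align}
  \frac{\partial P(x,t)}{\partial t}
    &= \frac{\partial}{\partial x}\,
       D(x)\, e^{-\beta A(x)}\,
       \frac{\partial}{\partial x}\!\left[ e^{\beta A(x)}\, P(x,t) \right],
  \\
  A(x) &= -F x - k_B T \ln \sigma(x),
  \qquad \beta = \frac{1}{k_B T}.
\end{align}
```

The entropic term −k_B T ln σ(x) is what turns the geometric constriction into an effective barrier in the one-dimensional description.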


In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic one and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior distributional assumption and relies on well-tested heuristics.
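The core step of such a hybrid approach — scoring a candidate job permutation by Monte Carlo simulation of the stochastic processing times — can be sketched as follows. The two-machine instance, the lognormal noise model and the sample size are illustrative assumptions, not the paper's setup.

```python
import random

def makespan(perm, proc_times):
    """Completion time of the last job on the last machine for a
    permutation flow shop; proc_times[j][m] is job j's time on machine m."""
    n_machines = len(proc_times[0])
    finish = [0.0] * n_machines
    for j in perm:
        for m in range(n_machines):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc_times[j][m]
    return finish[-1]

def expected_makespan(perm, mean_times, n_samples=500, rng=None):
    """Monte Carlo estimate: resample processing times around their means
    (illustratively lognormal) and average the resulting makespans."""
    rng = rng or random.Random(42)
    total = 0.0
    for _ in range(n_samples):
        sample = [[rng.lognormvariate(0.0, 0.25) * t for t in job]
                  for job in mean_times]
        total += makespan(perm, sample)
    return total / n_samples

mean_times = [[3.0, 2.0], [1.0, 4.0], [2.0, 2.0]]  # 3 jobs, 2 machines
est = expected_makespan([1, 2, 0], mean_times)
```

A deterministic heuristic (e.g. one based on the mean processing times) would propose permutations, and this estimator would rank them under uncertainty.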



We analyse the variations produced in tsunami propagation and impact on a straight coastline by the presence of a submarine canyon incised in the continental margin. For ease of calculation we assume that the shoreline and the shelf edge are parallel and that the incident wave approaches them normally. A total of 512 synthetic scenarios were computed by combining the bathymetry of a continental margin incised by a parameterised single canyon with the incident tsunami waves. The margin bathymetry, the canyon and the tsunami waves were generated using mathematical functions (e.g. Gaussian). The canyon parameters analysed are: (i) incision length into the continental shelf, which for a constant shelf width relates directly to the distance from the canyon head to the coast, (ii) canyon width, and (iii) canyon orientation with respect to the shoreline. The tsunami wave parameters considered are period and sign. The COMCOT tsunami model from Cornell University was applied to propagate the waves across the synthetic bathymetric surfaces. Five simulations of tsunami propagation over a non-canyoned margin were also performed for reference. The analysis of the results reveals a strong variation in tsunami arrival times and in the amplitudes reaching the coastline when a tsunami wave travels over a submarine canyon, with changes in the location of the maximum height and in its alongshore extension. In general, the presence of a submarine canyon shortens the arrival time at the shoreline but prevents wave build-up directly over the canyon axis. This leads to a decrease in tsunami amplitude along the coastal stretch located just shoreward of the canyon head, which results in a lower run-up in comparison with a non-canyoned margin. Conversely, an increased wave build-up occurs on both sides of the canyon head, generating two coastal stretches with enhanced run-up.
These aggravated or reduced tsunami effects are modulated by (i) the proximity of the canyon tip to the coast, which amplifies the wave height, (ii) the canyon width, which enlarges the stretches of coastline with lower and higher maximum wave height, and (iii) the canyon obliquity with respect to the shoreline and shelf edge, which increases the wave height shoreward of the leeward flank of the canyon. Moreover, the presence of a submarine canyon near the coast produces a variation of wave energy along the shore, eventually resulting in edge waves shoreward of the canyon head. These edge waves subsequently spread out alongshore, reaching significant amplitudes especially when they couple with secondary tsunami waves. The model results were ground-truthed using the actual bathymetry of the Blanes Canyon area in the North Catalan margin. This paper underlines the presence, morphology and orientation of submarine canyons as a determining factor in tsunami propagation and impact, one that could prevail over other effects deriving from the coastal configuration.
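A synthetic bathymetric surface of the kind described — a planar shelf and slope with a single Gaussian canyon incised normal to the shelf edge — can be sketched as follows. All dimensions, the grid and the exact functional form are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def synthetic_margin(nx=200, ny=200, shelf_width_km=20.0,
                     shelf_depth_m=100.0, basin_depth_m=2000.0,
                     canyon_width_km=5.0, canyon_incision_km=10.0,
                     canyon_depth_m=600.0):
    """Depth grid (positive down, metres): flat shelf, linear slope,
    and a Gaussian canyon cut into the shelf along the alongshore midline."""
    x = np.linspace(0.0, 60.0, nx)        # km, offshore distance
    y = np.linspace(0.0, 60.0, ny)        # km, alongshore distance
    X, Y = np.meshgrid(x, y)

    # Background margin: shelf, then a linear ramp down to the basin.
    depth = np.where(
        X < shelf_width_km, shelf_depth_m,
        np.minimum(basin_depth_m,
                   shelf_depth_m + (X - shelf_width_km) * 100.0))

    # Gaussian canyon: deepest at the shelf edge, shoaling toward its
    # head (incision into the shelf) and decaying laterally across width.
    head_x = shelf_width_km - canyon_incision_km
    along = np.clip((X - head_x) / canyon_incision_km, 0.0, 1.0)
    across = np.exp(-((Y - 30.0) / canyon_width_km) ** 2)
    canyon = canyon_depth_m * along * across * (X < shelf_width_km)
    return depth + canyon

z = synthetic_margin()
```

Varying `canyon_incision_km`, `canyon_width_km` and an obliquity angle over a grid of values is how the 512 scenario combinations described above would be generated.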


The omega-loop gastric bypass (OLGBP), also called "mini-gastric bypass" or "single-anastomosis gastric bypass", is a form of gastric bypass in which a long, narrow gastric pouch is created and anastomosed to the jejunum about 200-250 cm from the angle of Treitz in an omega-loop fashion, thereby avoiding a jejuno-jejunostomy. Proponents of the OLGBP claim that it is a safer and simpler operation than the traditional Roux-en-Y gastric bypass (RYGBP), that it is easier to teach, and that it gives the same weight-loss results as the RYGBP. One randomized study comparing the two techniques showed similar results after five years. The OLGBP is criticized because it creates an anastomosis between the gastric pouch and a jejunal segment carrying a large amount of biliopancreatic juice, a situation in which reflux of the latter into the stomach and distal esophagus is likely to develop. Such a situation has clearly been associated, in several animal studies, with an increased incidence of gastric cancer, especially at or close to the gastro-jejunostomy, and with an increased risk of lower esophageal cancer. In clinical practice, omega-loop gastrojejunostomies, such as those used for reconstruction after gastric resection for benign disease or distal gastric cancer, have been associated with the so-called classical anastomotic cancer, linked to biliary reflux into the stomach, although epidemiological studies on this point do not show uniform results. Although no evidence at present links the OLGBP to an increased risk of gastric cancer in humans, this possibility raises concern among many bariatric surgeons, especially given that bariatric surgery is performed in relatively young patients with a long life expectancy, who would be prone to develop cancer if the risk is indeed increased. Another argument used against the OLGBP is that the jejuno-jejunostomy in the traditional RYGBP is easy to perform and associated with virtually no complications. Supporters of the OLGBP claim that the liquid that refluxes into the stomach after their procedure is not pure bile and pancreatic juice, but a combination of these with jejunal secretions, and that the latter is not as harmful. We would urge the proponents of the OLGBP to undertake the necessary animal studies to show that this assumption is indeed true before the procedure is performed widely, possibly leading to the development of hundreds of late gastric or esophageal carcinomas in the bariatric population. In the meantime, we strongly believe that the RYGBP should remain the gold standard in gastric bypass surgery for morbid obesity.


Daily precipitation is recorded as the total amount of water collected by a rain gauge in 24 h. Events are modelled as a Poisson process and the 24 h precipitation by a Generalised Pareto Distribution (GPD) of excesses. Hazard assessment is complete when estimates of the Poisson rate and the distribution parameters, together with a measure of their uncertainty, are obtained. The shape parameter of the GPD determines the support of the variable: the Weibull domain of attraction (DA) corresponds to finite-support variables, as should be the case for natural phenomena. However, the Fréchet DA has been reported for daily precipitation, which implies an infinite support and a heavy-tailed distribution. Bayesian techniques are used to estimate the parameters. The approach is illustrated with precipitation data from the Eastern coast of the Iberian Peninsula, which is affected by severe convective precipitation. The estimated GPD is mainly in the Fréchet DA, something incompatible with the common-sense assumption that precipitation is a bounded phenomenon. The bounded character of precipitation is then taken as an a priori hypothesis, and the consistency of this hypothesis with the data is checked in two cases: using the raw data (in mm) and using log-transformed data. As expected, Bayesian model checking clearly rejects the model in the raw-data case. However, the log-transformed data seem to be consistent with the model. This fact may be due to the adequacy of the log scale for representing positive measurements, for which differences are better expressed in relative than in absolute terms.
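The peaks-over-threshold step described above — fitting a GPD to the excesses and reading the domain of attraction off the sign of the shape parameter — can be sketched as follows. The simple method-of-moments estimators and the synthetic data are illustrative assumptions; the paper itself uses Bayesian estimation.

```python
import numpy as np

def gpd_fit_moments(excesses):
    """Method-of-moments GPD fit. For excess mean m and variance v:
    xi = (1 - m^2/v) / 2,  sigma = m * (1 - xi).
    xi < 0: Weibull DA (bounded support); xi > 0: Frechet DA (heavy tail)."""
    m = excesses.mean()
    v = excesses.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = m * (1.0 - xi)
    return xi, sigma

def gpd_sample(xi, sigma, n, rng):
    """Draw GPD excesses by inverting F(x) = 1 - (1 + xi*x/sigma)^(-1/xi)."""
    u = 1.0 - rng.random(n)               # uniform on (0, 1]
    if abs(xi) < 1e-12:                   # exponential limit as xi -> 0
        return -sigma * np.log(u)
    return sigma * (u ** (-xi) - 1.0) / xi

# Synthetic heavy-tailed (Frechet DA) excesses, then recover the parameters.
rng = np.random.default_rng(1)
x = gpd_sample(0.1, 10.0, 100_000, rng)
xi_hat, sigma_hat = gpd_fit_moments(x)
```

Note that the moment estimators require the shape parameter to be below 1/4 for a finite fourth moment; for heavier tails, likelihood-based or Bayesian estimation (as in the paper) is the appropriate route.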


We compute the exact vacuum expectation value of 1/2-BPS circular Wilson loops of N = 4 U(N) super Yang-Mills theory in arbitrary irreducible representations. By localization arguments, the computation reduces to evaluating certain integrals in a Gaussian matrix model, which we do using the method of orthogonal polynomials. Our results are particularly simple for Wilson loops in antisymmetric representations; in this case, we observe that the final answers admit an expansion in which the coefficients are positive integers and can be written in terms of sums over skew Young diagrams. As an application of our results, we use them to discuss the exact Bremsstrahlung functions associated with the corresponding heavy probes.
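For orientation, the localization argument referred to above reduces the Wilson-loop expectation value to an average in a Gaussian matrix model; schematically (overall normalizations vary with conventions):

```latex
% Circular Wilson loop in representation R as a Gaussian matrix-model
% average over an N x N Hermitian matrix M ('t Hooft coupling lambda):
\begin{equation}
  \langle W_R \rangle
  \;\propto\;
  \frac{1}{Z} \int \! dM \;
    e^{-\frac{2N}{\lambda} \operatorname{Tr} M^2}\,
    \operatorname{Tr}_R\, e^{M},
  \qquad
  Z = \int \! dM \; e^{-\frac{2N}{\lambda} \operatorname{Tr} M^2}.
\end{equation}
```

It is integrals of this type, with Tr_R e^M expanded over the eigenvalues of M, that the method of orthogonal polynomials evaluates exactly.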


Due to their high polymorphism, microsatellites have become one of the most valued genetic markers in population biology. We review the first two published studies on hybrid zones of the common shrew based on microsatellites. Both reveal surprisingly high interracial gene flow. It can be shown that these are overestimates. Indeed, in classical population-genetics models such as F-statistics, mutation is neglected. This is an acceptable assumption as long as migration is higher than mutation. However, in hybrid zones, where genetic exchanges can be rare, neglecting mutation leads to strong overestimates of migration when working with microsatellites, which display mutation rates of up to 10^-3. As there seems to be no straightforward way to correct for this bias, interracial gene-flow estimates based on microsatellites should be taken with caution. This problem should, however, not conceal the enormous potential of microsatellites for unravelling the genetics of hybrid zones.
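The source of the bias can be made explicit with a textbook island-model approximation (not the reviewed papers' exact model): for local effective size N, migration rate m and mutation rate μ, both rates enter the differentiation statistic symmetrically,

```latex
% Island-model approximation including mutation (small m, mu):
\begin{equation}
  F_{ST} \;\approx\; \frac{1}{1 + 4N(m + \mu)} .
\end{equation}
% Inverting the mutation-free formula F_ST = 1/(1 + 4Nm) then gives
\begin{equation}
  \widehat{Nm} \;=\; \frac{1}{4}\!\left(\frac{1}{F_{ST}} - 1\right)
  \;=\; N(m + \mu),
\end{equation}
```

so the inferred gene flow overestimates Nm by Nμ, a term that is negligible for slowly mutating markers but substantial for microsatellites, whose mutation rates can approach or exceed the rare migration rates typical of hybrid zones.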