940 results for Lattice-binary parameter
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex) - which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century - was used as the paradigm species. 
This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support program - named SIM-Ibex - allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and dispersal obstacles therefore also had to be modelled, so a software package - named Biomapper - was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of the results; a module also allows the mapping of dispersal barriers and corridors. The domain of application of the ENFA was then explored by means of a simulated species distribution. It was compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, the population density. 
The latter varies according to local reproduction, survival and dispersal dynamics, modified by density-dependence and stochasticity. A program - named HexaSpace - was developed to perform two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); 2° running simulations. It allows the spread of an invading species to be studied across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these programs were designed to build a complex, realistic model from raw data, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
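The local dynamics described above (a Leslie matrix augmented with density-dependence, environmental stochasticity and culling) can be sketched as follows; all vital rates, the carrying capacity and the noise level are invented for illustration, not the parameters estimated by SIM-Ibex:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-age-class Leslie matrix: fecundities on the first row,
# survival rates on the sub-diagonal (illustrative values only).
L = np.array([[0.0, 0.8, 0.6],
              [0.7, 0.0, 0.0],
              [0.0, 0.9, 0.0]])

K = 500.0  # assumed carrying capacity

def step(n, cull=0.0):
    """One yearly projection with density-dependence, environmental
    stochasticity and proportional culling, as sketched in the abstract."""
    n = L @ n                                 # deterministic Leslie projection
    n = n / (1.0 + n.sum() / K)               # Beverton-Holt style damping
    n = n * rng.normal(1.0, 0.05, 3).clip(0)  # environmental noise
    return n * (1.0 - cull)                   # culling rate

n = np.array([100.0, 50.0, 25.0])
for _ in range(20):
    n = step(n, cull=0.1)
```

Simulating many such trajectories under different culling rates is the kind of scenario testing the abstract attributes to SIM-Ibex.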
Abstract:
This thesis presents a topological approach to studying fuzzy sets by means of modifier operators. Modifier operators are mathematical models, e.g., for hedges, and we briefly present different approaches to studying them. We are interested in compositional modifier operators, modifiers for short, and these modifiers depend on binary relations. We show that if a modifier depends on a reflexive and transitive binary relation on U, then there exists a unique topology on U such that this modifier is the closure operator in that topology. Also, if U is finite, then there exists a lattice isomorphism between the class of all reflexive and transitive relations and the class of all topologies on U. We define a topological similarity relation "≈" between L-fuzzy sets in a universe U, and show that the class L^U/≈ is isomorphic to the class of all topologies on U, if U is finite and L is suitable. We consider finite bitopological spaces as approximation spaces, and we show that lower and upper approximations can be computed by means of α-level sets, also in the case of equivalence relations. This means that approximations in the sense of Rough Set Theory can be computed by means of α-level sets. Finally, we present an application to data analysis: we study an approach to detecting dependencies of attributes in database-like systems, called information systems.
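The closure-operator correspondence stated above can be illustrated on a small finite universe; the relation R below is an arbitrary reflexive and transitive example, not one from the thesis:

```python
# Hypothetical preorder (reflexive and transitive) on U = {0, 1, 2}.
U = {0, 1, 2}
R = {(0, 0), (1, 1), (2, 2), (0, 1), (1, 2), (0, 2)}

def closure(A):
    """Modifier induced by R: x belongs to the closure of A iff some
    a in A relates to x, i.e. (a, x) is in R."""
    return {x for x in U for a in A if (a, x) in R}

# Kuratowski closure axioms, which hold exactly because R is reflexive
# and transitive:
assert closure(set()) == set()
for A in [{0}, {1}, {0, 2}]:
    assert A <= closure(A)                    # extensivity (reflexivity)
    assert closure(closure(A)) == closure(A)  # idempotence (transitivity)
```

The closed sets of the induced topology are then exactly the fixed points of `closure`, which is the finite case of the theorem quoted above.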
Abstract:
Tractable cases of the binary CSP are mainly divided into two classes: constraint language restrictions and constraint graph restrictions. To better understand and identify the hardest binary CSPs, in this work we propose methods to increase their hardness by increasing the balance of both the constraint language and the constraint graph. The balance of a constraint is increased by maximising the number of domain elements with the same number of occurrences. The balance of the graph is defined using the classical definition from graph theory. In this sense we present two graph models: a first graph model that increases the balance of a graph by maximising the number of vertices with the same degree, and a second one that additionally increases the girth of the graph, because a high girth implies a high treewidth, an important parameter for binary CSP hardness. Our results show that our more balanced graph models and constraints produce instances that are harder, by several orders of magnitude, than typical random binary CSP instances. We also detect, at least for sparse constraint graphs, a higher treewidth in our graph models.
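A minimal sketch of the degree-balancing idea, using the configuration model to give every vertex the same number of half-edges; this is an illustrative assumption of the sketch, not the paper's generator, and the second model's girth control is not attempted here:

```python
import random

def balanced_graph(n, d, seed=0):
    """Configuration-model sketch of a degree-balanced constraint graph:
    every vertex receives exactly d half-edges (the extreme case of the
    degree balance discussed above). Self-loops and parallel edges are
    simply dropped, so degrees end up only approximately equal."""
    rng = random.Random(seed)
    stubs = [v for v in range(n) for _ in range(d)]  # d half-edges each
    rng.shuffle(stubs)
    edges = set()
    while stubs:
        a, b = stubs.pop(), stubs.pop()      # pair half-edges at random
        if a != b:                           # drop self-loops
            edges.add((min(a, b), max(a, b)))  # drop parallel edges
    return edges

E = balanced_graph(10, 3)
deg = {v: 0 for v in range(10)}
for a, b in E:
    deg[a] += 1
    deg[b] += 1
```

Since each vertex owns exactly `d` half-edges, no degree can exceed `d`, which is what keeps the degree distribution tightly balanced.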
Abstract:
The most general black M5-brane solution of eleven-dimensional supergravity (with a flat R^4 spacetime on the brane and a regular horizon) is characterized by its charge, mass and two angular momenta. We use this metric to construct general dual models of large-N QCD (at strong coupling) that depend on two free parameters. The mass spectrum of scalar particles is determined analytically (in the WKB approximation) and numerically over the whole two-dimensional parameter space. We compare the mass spectrum with analogous results from lattice calculations, and find that the supergravity predictions are close to the lattice results everywhere on the two-dimensional parameter space except along a special line. We also examine the mass spectrum of the supergravity Kaluza-Klein (KK) modes and find that the KK modes along the compact D-brane coordinate decouple from the spectrum for large angular momenta. There are, however, KK modes charged under a U(1)×U(1) global symmetry which do not decouple anywhere on the parameter space. General formulas for the string tension and action are also given.
Abstract:
We study large N SU(N) Yang-Mills theory in three and four dimensions using a one-parameter family of supergravity models which originate from non-extremal rotating D-branes. We show explicitly that varying this angular momentum parameter decouples the Kaluza-Klein modes associated with the compact D-brane coordinate, while the mass ratios for ordinary glueballs are quite stable against this variation, and are in good agreement with the latest lattice results. We also compute the topological susceptibility and the gluon condensate as a function of the "angular momentum" parameter.
Abstract:
LS 5039 is one of the few TeV-emitting X-ray binaries detected so far. The powering source of its multiwavelength emission can be accretion in a microquasar scenario or wind interaction in a young non-accreting pulsar scenario. Aims. To present new high-resolution radio images and compare them with the expected behavior in the different scenarios. Methods. We analyze Very Long Baseline Array (VLBA) radio observations that provide morphological and astrometric information at milliarcsecond scales. Results. We detect a changing morphology between two images obtained five days apart. In both runs there is a core component with a constant flux density, and an elongated emission with a position angle (PA) that changes by 12° ± 3° between the runs. The source is nearly symmetric in the first run and asymmetric in the second one. The astrometric results are not conclusive. Conclusions. A simple and shockless microquasar scenario cannot easily explain the observed changes in morphology. An interpretation within the young non-accreting pulsar scenario requires the inclination of the binary system to be very close to the upper limit imposed by the absence of X-ray eclipses.
Abstract:
BACKGROUND: Best corrected visual acuity (BCVA) of 0.8 or above in AMD patients can sometimes coexist with poor macular function causing a serious visual handicap. Microperimetry can be used to objectify this discrepancy. PATIENTS AND METHODS: A retrospective study was undertaken on 233 files of AMD patients, of whom 82 had undergone microperimetry. BCVA was compared with microperimetry performance. All examinations were performed in an identical setting by the same team of 3 persons. RESULTS: Among the 82 patients included, 32 (39.0%) had a BCVA equal to or above 0.8 even though their microperimetry performance was lower than 200/560 dB. 10 of them (12.2% of the total) had an even poorer microperimetry score, below 120/560 dB, indicating poor macular function. CONCLUSIONS: More than a third of the AMD patients had a bad or very bad microperimetry performance despite good visual acuity. Microperimetry is a valuable tool to assess and follow real macular function in AMD patients when visual acuity alone can be misleading.
Abstract:
BACKGROUND: Most peripheral T-cell lymphoma (PTCL) patients have a poor outcome, and the identification of prognostic factors at diagnosis is needed. PATIENTS AND METHODS: The prognostic impact of total metabolic tumor volume (TMTV0), measured on baseline [(18)F]2-fluoro-2-deoxy-d-glucose positron emission tomography/computed tomography, was evaluated in a retrospective study including 108 PTCL patients (27 PTCL not otherwise specified, 43 angioimmunoblastic T-cell lymphomas and 38 anaplastic large-cell lymphomas). All received anthracycline-based chemotherapy. TMTV0 was computed with the 41% maximum standardized uptake value threshold method, and an optimal cut-off point for binary outcomes was determined and compared with other prognostic factors. RESULTS: With a median follow-up of 23 months, 2-year progression-free survival (PFS) was 49% and 2-year overall survival (OS) was 67%. High TMTV0 was significantly associated with a worse prognosis. At 2 years, PFS was 26% in patients with a high TMTV0 (>230 cm(3), n = 53) versus 71% for those with a low TMTV0 [P < 0.0001, hazard ratio (HR) = 4], whereas OS was 50% versus 80%, respectively (P = 0.0005, HR = 3.1). In multivariate analysis, TMTV0 was the only significant independent parameter for both PFS and OS. TMTV0 combined with PIT discriminated patients with an adverse outcome (TMTV0 >230 cm(3) and PIT >1, n = 33) from those with a good prognosis (TMTV0 ≤230 cm(3) and PIT ≤1, n = 40) even better than TMTV0 alone: 19% versus 73% 2-year PFS (P < 0.0001) and 43% versus 81% 2-year OS, respectively (P = 0.0002). Thirty-one patients (other TMTV0-PIT combinations) had an intermediate outcome: 50% 2-year PFS and 68% 2-year OS. CONCLUSION: TMTV0 appears to be an independent predictor of PTCL outcome. Combined with PIT, it could identify different risk categories at diagnosis and warrants further validation as a prognostic marker.
Abstract:
Objective: The objective of the present study was to evaluate current radiographic parameters designed to investigate adenoid hypertrophy and nasopharyngeal obstruction, and to present an alternative radiographic assessment method. Materials and Methods: To do so, children (4 to 14 years old) who presented with nasal obstruction or oral breathing complaints underwent cavum radiographic examination. One hundred and twenty records were evaluated according to quantitative radiographic parameters, and the data were correlated with a gold-standard videonasopharyngoscopic study with respect to the percentage of choanal obstruction. Subsequently, a regression analysis was performed to create an original model with which the percentage of choanal obstruction could be predicted. Results: The quantitative parameters demonstrated moderate, if not weak, correlation with the real percentage of choanal obstruction. The regression model (110.119*A/N) demonstrated a satisfactory ability to "predict" the actual percentage of choanal obstruction. Conclusion: Since current adenoid quantitative radiographic parameters present limitations, the model presented by this study might be considered an alternative assessment method in cases where videonasopharyngoscopic evaluation is unavailable.
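The reported regression model can be applied directly; the A/N ratio below is a hypothetical measurement, not a value from the study:

```python
def predicted_obstruction(a_n_ratio):
    """Regression model reported in the abstract: percentage of choanal
    obstruction predicted from the adenoid/nasopharynx (A/N) ratio."""
    return 110.119 * a_n_ratio

# Hypothetical A/N ratio measured on a cavum radiograph:
pct = predicted_obstruction(0.62)  # about 68.3% predicted obstruction
```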
Abstract:
An empirical equation, ΔmH° = t_i / b (where t_i is the Kelvin temperature of the onset of thermal decomposition, obtained from the thermogravimetry of the adducts, and b is an empirical parameter which depends on the metal halide and on the number of ligands), was obtained and tested for 53 adducts MX2·nL (where MX2 is a metal halide of the zinc group). The difference between experimental and calculated values was less than 6% for 22 adducts. For another 22 adducts, the difference was less than 10%. Only for 4 compounds did the difference between experimental and calculated values exceed 15%.
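A worked instance of the empirical relation (decomposition enthalpy estimated as t_i / b); both numbers are invented for illustration and are not taken from the paper's 53 adducts:

```python
# Hypothetical onset temperature and empirical parameter for one adduct.
t_i = 520.0   # K, onset of thermal decomposition from thermogravimetry
b = 4.0       # empirical parameter for a given halide and ligand count

delta_m_H = t_i / b          # predicted value: 130.0

experimental = 138.0         # hypothetical experimental value
relative_error = abs(delta_m_H - experimental) / experimental
# ≈ 0.058, i.e. within the 6% band reported for 22 of the adducts
```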
Abstract:
A nanostructured disordered Fe(Al) solid solution was obtained from elemental powders of Fe and Al using a high-energy ball mill. The transformations occurring in the material during milling were studied by X-ray diffraction. In addition, the lattice microstrain, average crystallite size, dislocation density and lattice parameter were determined. Scanning electron microscopy (SEM) was employed to examine the morphology of the samples as a function of milling time. The thermal behaviour of the milled powders was examined by differential scanning calorimetry (DSC). The results, as well as the dissimilarity between the calorimetric curves of the powders after 2 and 20 h of milling, indicated the formation of a nanostructured Fe(Al) solid solution.
Abstract:
We report a Lattice-Boltzmann scheme that accounts for adsorption and desorption in the calculation of mesoscale dynamical properties of tracers in media of arbitrary complexity. Lattice Boltzmann simulations made it possible to solve numerically the coupled Navier-Stokes equations of fluid dynamics and Nernst-Planck equations of electrokinetics in complex, heterogeneous media. With the moment propagation scheme, it became possible to extract the effective diffusion and dispersion coefficients of tracers, or solutes, of any charge, e.g., in porous media. Nevertheless, the dynamical properties of tracers depend on the tracer-surface affinity, which is not purely electrostatic and also includes a species-specific contribution. In order to capture this important feature, we introduce specific adsorption and desorption processes in a lattice Boltzmann scheme through a modified moment propagation algorithm, in which tracers may adsorb and desorb from surfaces through kinetic reaction rates. The method is validated on exact results for pure diffusion and diffusion-advection in Poiseuille flows in a simple geometry. We finally illustrate the importance of taking such processes into account in the time-dependent diffusion coefficient in a more complex porous medium.
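A much-simplified, one-dimensional sketch of the idea of coupling kinetic adsorption/desorption to a diffusive lattice update; the actual scheme works within full lattice-Boltzmann moment propagation, and the rates and geometry here are arbitrary assumptions:

```python
import numpy as np

# Free tracer density c on lattice nodes, adsorbed density c_ads held at
# the two boundary ("surface") nodes, exchanged with kinetic rates ka, kd.
N = 50
c = np.zeros(N); c[N // 2] = 1.0   # tracer pulse in the middle
c_ads = np.zeros(N)
D = 0.25                            # fraction moved to each neighbour per step
ka, kd = 0.1, 0.05                  # assumed adsorption/desorption rates
surface = np.zeros(N, bool); surface[[0, -1]] = True

for _ in range(200):
    # Diffusion step with reflecting walls.
    flux = D * c
    c = c - 2 * flux
    c[1:] += flux[:-1]; c[:-1] += flux[1:]
    c[0] += flux[0]; c[-1] += flux[-1]   # reflected flux at the walls
    # Kinetic exchange with the surface, the new ingredient of the
    # modified moment propagation algorithm described above.
    ads = ka * c * surface
    des = kd * c_ads
    c += des - ads
    c_ads += ads - des
# Total mass (free + adsorbed) is conserved by construction.
```

Tracking the pulse's spread over time under varying `ka`/`kd` mimics, in miniature, how surface affinity modifies the time-dependent diffusion coefficient discussed in the abstract.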