997 results for Problem Resolution
Abstract:
Spectroscopic ellipsometry and high-resolution transmission electron microscopy have been used to characterize microcrystalline silicon films. We obtain excellent agreement between the multilayer model used in the analysis of the optical data and the microscopy measurements. Moreover, thanks to the high resolution achieved in the microscopy measurements and to the improved optical models, two new features of the layer-by-layer deposition of microcrystalline silicon have been detected: (i) the microcrystalline films present large crystals extending from the a-Si:H substrate to the film surface, despite the sequential process of layer-by-layer deposition; and (ii) a porous layer exists between the amorphous silicon substrate and the microcrystalline silicon film.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of use is quite large, from understanding the requirements of a single species, to designing nature reserves based on species hotspots, to modelling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used at resolutions below the kilometre scale (100 m x 100 m or 25 m x 25 m) and are then called high resolution models. Quite recently, a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very high resolution modelling. These new variables, however, are very costly and require a substantial amount of processing time, especially when they enter complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge on what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography is more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated the more local responses to these variables for a subset of species living in this area at two specific elevation belts. During this thesis I showed that high resolution data require very good datasets (both species data and model variables) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modelled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers their importance: topographic factors proved highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land use factors are more important, high resolution topographic data matter more at the subalpine level. Finally, the biggest improvement in the models comes from adding edaphic variables. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modelling but requires very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors has been highlighted as fundamental to producing significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
-- In recent years, the use of species distribution models (SDMs) has grown steadily. These models use different statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic ones, and presence data collected in the field. Their use covers many domains, from the study of a species' ecology to the reconstruction of communities or the impact of climate warming. Most of the time, these models use occurrences drawn from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless allow working at high resolution, hence below the kilometre scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared that makes it possible to work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projections of species distributions over large areas, demands powerful computers and a lot of time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high resolution variables such as microtopography or temperature is not certain. Other factors such as competition or natural stochasticity could have just as strong an influence. This is the context of my thesis work. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient. During this thesis, I was able to show that high resolution variables, whether related to temperature or to microtopography, provide only a modest improvement to the models. To obtain a substantial improvement, it is necessary to work with richer datasets, both for the species and for the variables used. For example, the usual interpolated climatic layers must be replaced by temperature layers modelled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. However, at a more local scale, high resolution topography is extremely important in the subalpine belt. At the montane belt, by contrast, variables related to soils and land use are very important. Finally, the species distribution models were particularly improved by the addition of edaphic variables, chiefly pH, whose importance equals or surpasses that of the topographic variables when they are added to the usual species distribution models.
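As a concrete, hedged illustration of the kind of presence-absence model this thesis works with, the sketch below fits an SDM with logistic regression. All names and data are synthetic stand-ins, not the thesis datasets; in a real workflow the predictors would be raster layers (temperature, topography, pH, etc.) sampled at the chosen resolution.

```python
# Minimal illustrative SDM: logistic-regression reconstruction of a species'
# realized niche from presence/absence points and environmental predictors.
# All data are synthetic; predictor names are placeholders for real layers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
temperature = rng.uniform(-2, 12, n)      # e.g. growing-season mean, deg C
slope = rng.uniform(0, 45, n)             # topographic predictor, degrees
ph = rng.uniform(3.5, 8.0, n)             # edaphic predictor

# Synthetic "true" niche: the species prefers cool, acidic sites.
logit = 1.5 - 0.4 * temperature - 1.0 * (ph - 5.0)
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temperature, slope, ph])
model = LogisticRegression().fit(X, presence)

# Predicted habitat suitability and a standard discrimination score.
suitability = model.predict_proba(X)[:, 1]
print("AUC:", round(roc_auc_score(presence, suitability), 3))
```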
Abstract:
1. Digital elevation models (DEMs) are often used in landscape ecology to retrieve elevation or first-derivative terrain attributes such as slope or aspect in the context of species distribution modelling. However, DEM-derived variables are scale-dependent and, given the increasing availability of very high-resolution (VHR) DEMs, their ecological relevance must be assessed for different spatial resolutions. 2. In a study area located in the Swiss Western Alps, we computed VHR DEM-derived variables related to morphometry, hydrology and solar radiation. Based on an original spatial resolution of 0.5 m, we generated DEM-derived variables at 1, 2 and 4 m spatial resolutions, applying a Gaussian pyramid. Their associations with local climatic factors, measured by sensors (direct and ambient air temperature, air humidity and soil moisture), as well as with ecological indicators derived from species composition, were assessed with multivariate generalized linear models (GLM) and mixed models (GLMM). 3. Specific VHR DEM-derived variables showed significant associations with climatic factors. In addition to slope, aspect and curvature, the underused wetness and ruggedness indices modelled measured ambient humidity and soil moisture, respectively. Remarkably, the spatial resolution of VHR DEM-derived variables had a significant influence on the models' strength, with coefficients of determination decreasing with coarser resolutions or showing a local optimum at a 2 m resolution, depending on the variable considered. 4. These results support the relevance of using multi-scale DEM variables to provide surrogates for important climatic variables such as humidity, moisture and temperature, offering suitable alternatives to direct measurements for evolutionary ecology studies at a local scale.
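A minimal sketch of the multi-resolution step described above, assuming scikit-image's Gaussian pyramid and a synthetic elevation grid; the slope and aspect formulas follow one common convention and are not necessarily the authors' exact implementation.

```python
# Sketch: build coarser DEM levels with a Gaussian pyramid, then derive
# slope and aspect at each level. The 0.5 m cell size mirrors the study
# setup; the random surface is a stand-in for a real DEM.
import numpy as np
from skimage.transform import pyramid_gaussian

dem = np.random.rand(512, 512) * 100.0   # placeholder elevation grid (m)
cell = 0.5                               # original resolution (m)

for level, coarse in enumerate(pyramid_gaussian(dem, max_layer=3, downscale=2)):
    res = cell * 2 ** level              # 0.5, 1, 2, 4 m
    dz_dy, dz_dx = np.gradient(coarse, res)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0  # one convention
    print(f"{res:>4} m: mean slope {slope.mean():.1f} deg")
```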
Abstract:
This paper investigates the example of Cyprus as a case study for the Europeanisation of conflict resolution. The argument advanced is that the European Union (EU) affects the positions of the parties to the conflict (here, Greek Cypriots, Turkish Cypriots and Turkey), but not always in the direction of resolving the dispute and complying with EU conditionality. Conformity with EU conditionality depends on its credibility, which is decreased by the internalisation of the conflict into the EU. In this context, this work contributes to the discussion on Europeanisation and the EU's aptitude for conflict resolution, as well as on the role of the EU in the Cyprus conflict during the post-accession years.
Abstract:
Network neutrality is a growing policy controversy. Traffic management techniques affect not only high-speed, high-money content, but by extension all other content too. Internet regulators and users may tolerate much more discrimination in the interests of innovation. For instance, in the absence of regulatory oversight, ISPs could use Deep Packet Inspection (DPI) to block some content altogether, if they decide it is not to the benefit of ISPs, copyright holders, parents or the government. ISP blocking is currently widespread in controlling spam email and, in some countries, in blocking sexually graphic illegal images. In 1999 this led to scrutiny of the foreclosure of Instant Messaging and of a video and cable-telephony horizontal merger. Fourteen years later, in 2013, net neutrality laws had been implemented in Slovenia, the Netherlands, Chile and Finland, regulation in the United States and Canada, co-regulation in Norway, and self-regulation in Japan, the United Kingdom and many other European countries. Both Germany and France debated new net neutrality legislation in mid-2013, and the European Commission announced on 11 September 2013 that it would aim to introduce legislation in early 2014. This paper analyses these legal developments, and in particular the difficulty of assessing reasonable traffic management and ‘specialized’ (i.e. unregulated) faster services in both EU and US law. It also assesses net neutrality law against international legal norms for user privacy and freedom of expression.
Abstract:
Aim The aim of this study was to test different modelling approaches, including a new framework, for predicting the spatial distribution of the richness and composition of two insect groups. Location The western Swiss Alps. Methods We compared two community modelling approaches: the classical method of stacking the binary predictions obtained from individual species distribution models (binary stacked species distribution models, bS-SDMs), and various implementations of a recent framework (spatially explicit species assemblage modelling, SESAM) based on four steps that integrate the different drivers of the assembly process in a single modelling procedure. We used: (1) five methods to create bS-SDM predictions; (2) two approaches for predicting species richness, by summing individual SDM probabilities or by modelling the number of species (i.e. richness) directly; and (3) five different biotic rules based either on ranking probabilities from SDMs or on community co-occurrence patterns. Combining these various options resulted in 47 implementations for each taxon. Results Species richness of the two taxonomic groups was predicted with good accuracy overall, and in most cases bS-SDM did not produce a biased prediction exceeding the actual number of species in each unit. In the prediction of community composition, bS-SDM often also yielded the best evaluation score. In cases where bS-SDM performed poorly (i.e. when it overestimated richness), the SESAM framework improved predictions of species composition. Main conclusions Our results differed from previous findings using community-level models. First, we show that overprediction of richness by bS-SDM is not a general rule, thus highlighting the relevance of producing good individual SDMs to capture the ecological filters that are important for the assembly process. Second, we confirm the potential of SESAM when richness is overpredicted by bS-SDM; limiting the number of species for each unit and applying biotic rules (here using the ranking of SDM probabilities) can improve predictions of species composition.
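The richness-constrained ranking step can be sketched in a few lines of Python. The probability matrix below is synthetic, and this shows only one of the biotic rules mentioned (ranking SDM probabilities), not the full SESAM framework.

```python
# Sketch of the two community-prediction steps discussed above: predict site
# richness by summing per-species SDM probabilities, then pick the composition
# by ranking those probabilities (the "probability ranking" biotic rule).
# Rows are sites, columns are species; values are SDM suitabilities.
import numpy as np

rng = np.random.default_rng(1)
probs = rng.random((4, 10))              # 4 sites x 10 species, synthetic

richness = np.rint(probs.sum(axis=1)).astype(int)   # macroecological constraint

communities = np.zeros_like(probs, dtype=bool)
for site, k in enumerate(richness):
    top = np.argsort(probs[site])[::-1][:k]         # k most suitable species
    communities[site, top] = True

print(richness)          # predicted number of species per site
print(communities[0])    # predicted presence/absence at the first site
```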
Abstract:
Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy [1], Total Variation (TV)-based energies [2,3] and, more recently, non-local means [4]. Although TV energies are quite attractive because of their edge-preserving ability, so far only standard explicit steepest-gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
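In generic form (a sketch with generic operators, not necessarily the authors' exact model), the TV-regularized reconstruction problem reads:

```latex
% x: high-resolution volume; y_k: k-th low-resolution slice stack;
% A_k: acquisition operator (motion + blur + downsampling); lambda > 0.
\hat{x} \;=\; \arg\min_{x} \; \frac{1}{2} \sum_{k} \left\| y_k - A_k x \right\|_2^2
\;+\; \lambda \,\mathrm{TV}(x),
\qquad \mathrm{TV}(x) \;=\; \int \lVert \nabla x \rVert_2 \, dx .
```

Accelerated first-order schemes for such convex problems (FISTA-type methods, for instance) attain the O(1/n²) rate quoted above, whereas plain explicit gradient descent converges in O(1/n).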
Abstract:
Glucose is the primary source of energy for the brain, but also an important source of building blocks for proteins, lipids and nucleic acids. Little is known about the use of glucose for biosynthesis in tissues at the cellular level. We demonstrate that local cerebral metabolic activity can be mapped in mouse brain tissue by quantitatively imaging the biosynthetic products deriving from [U-13C]glucose metabolism using a combination of in situ electron microscopy and secondary ion mass spectrometry (NanoSIMS). Images of the 13C label incorporated into cerebral ultrastructure with ca. 100 nm resolution allowed us to determine the timescale on which the metabolic products of glucose are incorporated into different cells, their sub-compartments and organelles. These were mapped in astrocytes and neurons in the different layers of the motor cortex. We see evidence for high metabolic activity in neurons via the 13C enrichment of the nucleus. We observe that in all the major cell compartments, such as the nucleus and the Golgi apparatus, neurons incorporate substantially higher concentrations of the 13C label than astrocytes.
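For illustration, isotopic enrichment of this kind is typically expressed as the 13C atomic fraction compared with natural abundance; the ion counts in this sketch are invented, not the study's measurements.

```python
# Sketch: 13C enrichment from ion counts. The isotope fraction
# 13C / (12C + 13C) is compared with natural abundance (~1.1%).
counts_12c = 1_000_000   # illustrative 12C ion counts
counts_13c = 25_000      # illustrative 13C ion counts

fraction = counts_13c / (counts_12c + counts_13c)
natural = 0.0110
enrichment = fraction - natural          # excess 13C atomic fraction
print(f"13C fraction {fraction:.4f}, enrichment {enrichment:.4f}")
```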
Abstract:
This work describes the formation of transformation products (TPs) by the enzymatic degradation at laboratory scale of two highly consumed antibiotics: tetracycline (Tc) and erythromycin (ERY). The analysis of the samples was carried out by a fast and simple method based on a novel configuration of an on-line turbulent flow system coupled to a hybrid linear ion trap – high resolution mass spectrometer. The method was optimized and validated for the complete analysis of ERY, Tc and their transformation products within 10 min without any other sample manipulation. Furthermore, the applicability of the on-line procedure was evaluated for 25 additional antibiotics, covering a wide range of chemical classes in different environmental waters, with satisfactory quality parameters. Degradation rates obtained for Tc by laccase enzyme and for ERY by EreB esterase enzyme, without the presence of mediators, were ∼78% and ∼50%, respectively. Concerning the identification of TPs, three suspected compounds for Tc and five for ERY have been proposed. In the case of Tc, the tentative molecular formulas, with mass errors within 2 ppm, have been based on the hypothesis of dehydroxylation, (bi)demethylation and oxidation of rings A and C as the major reactions. In contrast, the major TP detected for ERY has been identified as "dehydration ERY-A", with the same molecular formula as its parent compound. In addition, the evaluation of the antibiotic activity of the samples along the enzymatic treatments showed a decrease of around 100% in both cases.
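The 2 ppm criterion mentioned above is the standard relative mass error used to accept a candidate molecular formula; a minimal worked example (with illustrative masses, not the paper's data):

```python
# Sketch: ppm mass error between an observed m/z and the exact mass of a
# proposed molecular formula. Values below are illustrative.
measured = 734.4685      # m/z observed in the high-resolution spectrum
theoretical = 734.4691   # exact mass of the candidate formula

ppm_error = (measured - theoretical) / theoretical * 1e6
print(f"{ppm_error:.2f} ppm")   # |error| <= 2 ppm would support the formula
```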
Abstract:
In fetal brain MRI, most high-resolution reconstruction algorithms rely on brain segmentation as a preprocessing step. Manual brain segmentation is, however, highly time-consuming and therefore not a realistic solution. In this work, we assess on a large dataset the performance of Multiple Atlas Fusion (MAF) strategies to automatically address this problem. Firstly, we show that MAF significantly increases the accuracy of brain segmentation compared with a single-atlas strategy. Secondly, we show that MAF compares favorably with the most recent approach (Dice above 0.90). Finally, we show that MAF could in turn provide an enhancement in terms of reconstruction quality.
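A minimal sketch of one common MAF strategy, majority-vote label fusion, together with the Dice score used for evaluation. It assumes the atlas masks are already registered to the target image; the arrays are synthetic, and the exact fusion strategies evaluated in the paper may differ.

```python
# Sketch: multi-atlas label fusion by majority voting, plus the Dice score.
# Atlas masks are assumed already propagated to the target; arrays are fake.
import numpy as np

rng = np.random.default_rng(2)
atlases = rng.random((5, 64, 64, 64)) > 0.5    # 5 propagated binary masks
reference = rng.random((64, 64, 64)) > 0.5     # manual segmentation

fused = atlases.mean(axis=0) > 0.5             # majority vote across atlases

def dice(a, b):
    """Dice overlap between two binary masks."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

print(f"Dice: {dice(fused, reference):.3f}")
```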
Abstract:
Focal epilepsy is increasingly recognized as the result of an altered brain network, at both the structural and functional levels, and the characterization of these widespread brain alterations is crucial for our understanding of the clinical manifestations of seizures and cognitive deficits, as well as for the management of candidates for epilepsy surgery. Tractography based on Diffusion Tensor Imaging allows non-invasive mapping of white matter tracts in vivo. Recently, diffusion spectrum imaging (DSI), based on an increased number of diffusion directions and intensities, has improved the sensitivity of tractography, notably with respect to the problem of fiber crossing, and recent developments allow acquisition times compatible with clinical application. We used DSI and parcellation of the gray matter into regions of interest to build whole-brain connectivity matrices describing the mutual connections between cortical and subcortical regions in patients with focal epilepsy and healthy controls. In addition, the high angular and radial resolution of DSI allowed us to evaluate some of the biophysical compartment models, to better understand the cause of the changes in diffusion anisotropy. Global connectivity, hub architecture and regional connectivity patterns were altered in patients with temporal lobe epilepsy (TLE) and showed different characteristics in right-sided (RTLE) vs left-sided (LTLE) cases, with stronger abnormalities in RTLE. The microstructural analysis suggested that disturbed axonal density contributed more than fiber orientation to the connectivity changes affecting the temporal lobes, whereas fiber orientation changes were more involved in extratemporal changes. Our study provides further structural evidence that RTLE and LTLE are not symmetrical entities, and DSI-based imaging could help investigate the microstructural correlates of these imaging abnormalities.
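To make the connectivity-matrix analysis concrete, here is a hedged sketch of the kind of graph metrics typically computed on such matrices, using networkx; the matrix is synthetic and the specific measures used in the study may differ.

```python
# Sketch: whole-brain connectivity matrix -> simple graph metrics of the kind
# used to compare patients and controls (global efficiency, hubs by strength).
# Real entries would be DSI tractography connection weights between regions.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_regions = 20
w = rng.random((n_regions, n_regions))
w = np.triu(w, 1)
w = w + w.T                                    # symmetric, zero diagonal

g = nx.from_numpy_array(w)
efficiency = nx.global_efficiency(g)           # computed on the topology
strength = w.sum(axis=0)                       # node strength; hubs = top nodes
print(f"global efficiency {efficiency:.3f}; hub region {strength.argmax()}")
```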
Abstract:
A minimum cost spanning tree (mcst) problem analyzes the way to efficiently connect individuals to a source when they are located at different places. Once the efficient tree is obtained, the question of how to allocate the total cost among the involved agents defines, in a natural way, a conflicting claims situation. For instance, we may take the endowment to be the total cost of the network, whereas each individual's claim is the maximum amount she can be allocated, that is, her connection cost to the source. We thus have a conflicting claims problem, so we can apply claims rules in order to obtain an allocation of the total cost. Nevertheless, the allocation obtained by using claims rules might not satisfy some appealing properties (in particular, it does not belong to the core of the associated cooperative game). We will define other natural claims problems that appear if we analyze the maximum and minimum amounts that an individual should pay in order to support the minimum cost tree. Keywords: Minimum cost spanning tree problem, Claims problem, Core. JEL classification: C71, D63, D71.
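A small sketch of the construction described above: compute the minimum cost spanning tree, take its total cost as the endowment, let each agent's direct connection cost be her claim, and apply the proportional rule (one standard claims rule; the paper considers claims rules in general). The costs are illustrative.

```python
# Sketch: mcst problem as a conflicting claims situation.
# Endowment E = total cost of the minimum cost spanning tree;
# claim of agent i = her direct connection cost to the source.
import networkx as nx

g = nx.Graph()
g.add_weighted_edges_from([
    ("source", "a", 6), ("source", "b", 8), ("source", "c", 10),
    ("a", "b", 3), ("b", "c", 4), ("a", "c", 9),
])

mst = nx.minimum_spanning_tree(g)
endowment = mst.size(weight="weight")                    # total network cost
claims = {i: g["source"][i]["weight"] for i in ("a", "b", "c")}

# Proportional rule: share E in proportion to the claims.
total_claims = sum(claims.values())
allocation = {i: endowment * c / total_claims for i, c in claims.items()}
print(endowment, allocation)
```

Here the endowment is 13 while the claims sum to 24, so the situation is genuinely conflicting; as the abstract notes, an allocation obtained this way need not lie in the core of the associated cooperative game.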