923 results for Convergence And Extension
Abstract:
A Digital Elevation Model (DEM) provides the information basis for many geographic applications, such as topographic and geomorphologic studies and landscape analysis through GIS (Geographic Information Systems), among others. The capacity of a DEM to represent the Earth's surface depends on the surface roughness and the resolution used. Each DEM pixel depends on the scale used, which is characterized by two variables: the resolution and the extent of the area studied. DEMs can vary in resolution and accuracy depending on the production method, although there are statistical characteristics that remain constant or very similar across a wide range of scales. Based on this property, several techniques directly related to fractal geometry have been applied to characterize DEMs through multiscale analysis: the multifractal spectrum and the structure function. The comparison of the results obtained by both methods is discussed. The study area is represented by a 1024 x 1024 data matrix obtained from a DEM with a resolution of 10 x 10 m per point, which corresponds to a region known as "Monte de El Pardo", a 15,820 ha property of the Spanish National Heritage (Patrimonio Nacional Español) located a short distance from the center of Madrid. The Manzanares River crosses this area from north to south. In the southern part there is a reservoir with a capacity of 43 hm3, whose water level ranges from an altitude of 603.3 m up to 632 m at maximum capacity. The minimum altitude of the area is reached in the middle of the reservoir.
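For readers unfamiliar with the structure-function approach mentioned in this abstract, the following Python sketch illustrates the general idea; the function names, the synthetic `dem` array and the chosen lags are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only (not the paper's code): q-th order structure
# functions S_q(r) = <|z(x + r) - z(x)|^q> of a gridded DEM, and the scaling
# exponents zeta(q) estimated from their log-log slopes.
import numpy as np

def structure_function(dem, q_values, lags):
    """Return {q: [S_q(r) for r in lags]} using increments along the x-axis."""
    sq = {q: [] for q in q_values}
    for lag in lags:
        diffs = np.abs(dem[:, lag:] - dem[:, :-lag])  # height increments at lag r
        for q in q_values:
            sq[q].append((diffs ** q).mean())
    return sq

def scaling_exponents(dem, q_values, lags, pixel_size=10.0):
    """Slope of log S_q(r) versus log r estimates the exponent zeta(q)."""
    sq = structure_function(dem, q_values, lags)
    r = np.array(lags) * pixel_size  # lags in metres (10 m pixels)
    return {q: np.polyfit(np.log(r), np.log(sq[q]), 1)[0] for q in q_values}

# Example on a synthetic rough surface; a real analysis would load the
# 1024 x 1024 "Monte de El Pardo" DEM instead.
dem = np.cumsum(np.random.randn(1024, 1024), axis=1)
zeta = scaling_exponents(dem, q_values=[1, 2, 3], lags=[1, 2, 4, 8, 16, 32])
```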
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major areas, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Its best-known applications are probably the different tools developed so far for processing human language, such as machine translation systems, speech recognizers and dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other lower-level annotation tools and their outputs to generate their own annotations. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
Thus, to summarise, the main aim of the present work was to combine these separate annotation approaches, mechanisms and tools from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
Dislocation mobility, the relation between applied stress and dislocation velocity, is an important property for modeling the mechanical behavior of structural materials. These mobilities reflect the interaction between the dislocation core and the host lattice and thus require atomistic resolution to capture their details. Because the mobility function is multiparametric, its computation is often highly demanding in terms of computational requirements. Optimizing how tractions are applied can be greatly advantageous in accelerating convergence and reducing the overall computational cost of the simulations. In this paper we perform molecular dynamics simulations of ½ 〈1 1 1〉 screw dislocation motion in tungsten using step and linear time functions for applying the external stress. We find that linear functions over time scales of the order of 10–20 ps reduce fluctuations and speed up convergence to the steady-state velocity value by up to a factor of two.
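As a rough sketch of the two loading schedules compared in this abstract, the snippet below contrasts a step function with a linear ramp that is held after roughly 10-20 ps; the function names and the 15 ps default are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch (assumed, not the paper's code): two schedules for the
# externally applied stress in an MD run of screw-dislocation motion.
def step_stress(t_ps, target):
    """Step loading: the full target stress is applied from t = 0."""
    return target

def ramp_stress(t_ps, target, ramp_ps=15.0):
    """Linear loading: the stress grows linearly for ~10-20 ps, then is held."""
    return target * min(t_ps / ramp_ps, 1.0)

# The returned value would be fed to the MD engine's external-stress mechanism
# at each update interval; ramping reduces the initial shock and the velocity
# fluctuations it causes, which is the effect described in the abstract.
```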
Abstract:
In recent years, Ge has regained attention for integration into existing microelectronic technologies. Even though it is not thought to be a feasible full replacement for Si in the near future, it will likely serve as an excellent complement to enhance electrical properties in future devices, especially due to its high carrier mobilities. This integration requires a significant upgrade of the state of the art of regular manufacturing processes. Simulation techniques, such as kinetic Monte Carlo (KMC) algorithms, provide an appealing environment for research and innovation in the field, especially in terms of time and funding costs. In the present study, KMC techniques are used, for the first time, to understand Ge front-end processing, specifically the damage accumulation and amorphization produced by ion implantation and the Solid Phase Epitaxial Regrowth (SPER) of the amorphized layers. First, Binary Collision Approximation (BCA) simulations are used to calculate the damage caused by every ion. The evolution of this damage over time is simulated using non-lattice, or Object, KMC (OKMC), in which only the defects are considered. SPER is simulated through a Lattice KMC (LKMC) approach, which is able to follow the evolution of the lattice atoms forming the amorphous/crystalline interface. With the amorphization model developed in this work, implemented into a multi-material process simulator, all of these processes can be simulated. It has been possible to understand damage accumulation, from point defect generation up to the formation of full amorphous layers. This accumulation occurs in three differentiated regimes, starting with a slow formation rate of damage regions, followed by a fast local relaxation of areas into the amorphous phase where the crystalline and amorphous phases coexist, and ending in the full amorphization of extended layers, where the accumulation rate saturates. This transition occurs when the damage concentration overcomes a certain threshold value, which is independent of the implantation conditions. When ions are implanted at relatively high temperatures, dynamic annealing takes place, healing the previously induced damage and establishing a competition between damage generation and its dissolution. These effects become especially important for light ions, such as B, for which the created damage is more diluted, smaller and differently distributed than that caused by implanting heavier ions, such as Ge. This description successfully reproduces the damage quantity and the extent of the amorphous layers caused by ion implantation reported in the literature. The recrystallization velocity of a previously amorphized sample strongly depends on the substrate orientation.
The presented LKMC model has been able to explain these differences between orientations through a simple model, dominated by a single activation energy and different prefactors for the SPER rates depending on the neighboring configurations of the recrystallizing atoms. Twin defect formation appears as a consequence of this description and is predominant for substrates grown in the (111) Ge orientation. This model is able to reproduce experimental results for different orientations, temperatures and evolution times of the amorphous/crystalline interface reported by different authors. Preliminary parameterizations of the activation strain tensors are also able to provide a good match between simulations and reported experimental results for SPER velocities at different temperatures under applied hydrostatic pressure. The studies presented in this thesis have helped to achieve a greater understanding of the damage generation, evolution, amorphization and SPER mechanisms in Ge, and they also provide a useful tool to continue research in this field.
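As a generic illustration of the non-lattice (object) KMC scheme mentioned above, the sketch below selects one defect event with probability proportional to its Arrhenius rate and advances the simulation clock; the names and the event representation are assumptions, not part of the simulator described in the thesis.

```python
# Generic object-KMC sketch (assumed, not the thesis simulator): each defect
# event has an Arrhenius rate nu = nu0 * exp(-Ea / kB T); one event is chosen
# with probability proportional to its rate and the clock advances accordingly.
import math
import random

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius(prefactor_hz, e_act_ev, temp_k):
    return prefactor_hz * math.exp(-e_act_ev / (K_B * temp_k))

def okmc_step(events, temp_k):
    """events: list of (apply_event_callable, prefactor_Hz, activation_energy_eV).

    Executes one event and returns the elapsed time in seconds."""
    rates = [arrhenius(nu0, ea, temp_k) for _, nu0, ea in events]
    total = sum(rates)
    threshold = random.random() * total
    acc = 0.0
    for (apply_event, _, _), rate in zip(events, rates):
        acc += rate
        if threshold <= acc:
            apply_event()  # e.g. defect jump, recombination, clustering
            break
    # residence-time algorithm: exponentially distributed waiting time
    return -math.log(1.0 - random.random()) / total
```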
Abstract:
Human telomerase, a cellular reverse transcriptase (hTERT), is a nuclear ribonucleoprotein enzyme complex that catalyzes the synthesis and extension of telomeric DNA. This enzyme is specifically activated in most malignant tumors but is usually inactive in normal somatic cells, suggesting that telomerase plays an important role in cellular immortalization and tumorigenesis. Terminal maturation of tumor cells has been associated with the repression of telomerase activity. Using maturation-sensitive and -resistant NB4 cell lines, we analyzed the pattern of telomerase expression during the therapeutic treatment of acute promyelocytic leukemia (APL) by retinoids. Two pathways leading to the down-regulation of hTERT and telomerase activity were identified. The first pathway results in a rapid down-regulation of telomerase that is associated with retinoic acid receptor (RAR)-dependent maturation of NB4 cells. Furthermore, during NB4 cell maturation, obtained independently of RAR by retinoic X receptor (RXR)-specific agonists (rexinoids), no change in telomerase activity was observed, suggesting that hTERT regulation requires a specific signaling and occurs autonomously. A second pathway of hTERT regulation, identified in the RAR-responsive, maturation-resistant NB4-R1 cell line, results in a down-regulation of telomerase that develops slowly during two weeks of all-trans retinoic acid (ATRA) treatment. This pathway leads to telomere shortening, growth arrest, and cell death, all events that are overcome by ectopic expression of hTERT. These findings demonstrate a clear and full dissociation between the process of tumor cell maturation and the regulation of hTERT mRNA expression and telomerase activity by retinoids. We propose telomerase expression as an efficient and selective target of retinoids in the therapy of tumors.
Abstract:
Fossorial salamanders typically have elongate and attenuated heads and bodies, diminutive limbs, hands and feet, and extremely elongate tails. Batrachoseps from California, Lineatriton from eastern México, and Oedipina from southern México to Ecuador, all members of the family Plethodontidae, tribe Bolitoglossini, resemble one another in external morphology, which has evolved independently. Whereas Oedipina and Batrachoseps are elongate because there are more trunk vertebrae, a widespread homoplasy (parallelism) in salamanders, the genus Lineatriton is unique in having evolved convergently by an alternate “giraffe-neck” developmental program. Lineatriton has the same number of trunk vertebrae as related, nonelongated taxa, but individual trunk vertebrae are elongated. A robust phylogenetic hypothesis, based on sequences of three mtDNA genes, finds Lineatriton to be deeply nested within a clade characterized by generalized ecology and morphology. Lineatriton lineolus, the only currently recognized taxon in the genus, shows unanticipated genetic diversity. Surprisingly, geographically separated populations of L. lineolus are not monophyletic, but are sister taxa of different species of the morphologically generalized genus Pseudoeurycea. Lineatriton, long thought to be a unique monospecific lineage, is polyphyletic. Accordingly, the specialized morphology of Lineatriton displays homoplasy at two hierarchical levels: (i) with respect to other elongate lineages in the family (convergence), and (ii) within what is currently recognized as a single taxon (parallelism). These evolutionary events are of adaptive significance because to invade the lowland tropics salamanders must be either arboreal or fossorial; the repeated evolution of elongation and attenuation has led to multiple lowland invasions.
Abstract:
Microsatellites are tandem repeat sequences abundant in the genomes of higher eukaryotes and hitherto considered as "junk DNA." Analysis of a human genome representative data base (2.84 Mb) reveals a distinct juxtaposition of A-rich microsatellites and retroposons and suggests their coevolution. The analysis implies that most microsatellites were generated by a 3'-extension of retrotranscripts, similar to mRNA polyadenylylation, and that they serve in turn as "retroposition navigators," directing the retroposons via homology-driven integration into defined sites. Thus, they became instrumental in the preservation and extension of primordial genomic patterns. A role is assigned to these reiterating A-rich loci in the higher-order organization of the chromatin. The disease-associated triplet repeats are mostly found in coding regions and do not show an association with retroposons, constituting a unique set within the family of microsatellite sequences.
Abstract:
Fructans play an important role in assimilate partitioning and possibly in stress tolerance in many plant families. Sucrose:fructan 6-fructosyltransferase (6-SFT), an enzyme catalyzing the formation and extension of beta-2,6-linked fructans typical of grasses, was purified from barley (Hordeum vulgare L.). It occurred in two closely similar isoforms with indistinguishable catalytic properties, both consisting of two subunits with apparent masses of 49 and 23 kDa. Oligonucleotides, designed according to the sequences of tryptic peptides from the large subunit, were used to amplify corresponding sequences from barley cDNA. The main fragment generated was cloned and used to screen a barley cDNA expression library. The longest cDNA obtained was transiently expressed in Nicotiana plumbaginifolia protoplasts and shown to encode a functional 6-SFT. The deduced amino acid sequence of the cDNA comprises both subunits of 6-SFT. It has high similarity to plant invertases and other beta-fructosyl hydrolases but only little to bacterial fructosyltransferases catalyzing the same type of reaction as 6-SFT.
Abstract:
Phase equilibrium data regression is an unavoidable task in obtaining appropriate parameter values for any model used in separation equipment design for chemical process simulation and optimization. The accuracy of this process depends on different factors, such as the quality of the experimental data, the selected model and the calculation algorithm. The present paper summarizes the results and conclusions of our research on the capabilities and limitations of existing GE models and on strategies that can be included in the correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and some examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed-phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper aims to present the work as a whole in order to show the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
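For reference, the representative local composition model named in this abstract, NRTL, has a standard binary form; the sketch below evaluates the two activity coefficients with illustrative parameter names and values, not the modified equation or the parameters regressed in the paper.

```python
# Minimal sketch of the standard binary NRTL activity-coefficient model.
# tau12, tau21 and alpha are illustrative binary interaction parameters.
import math

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Return (gamma1, gamma2) for a binary mixture at mole fraction x1."""
    x2 = 1.0 - x1
    g12 = math.exp(-alpha * tau12)
    g21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (g21 / (x1 + x2 * g21))**2
                     + tau12 * g12 / (x2 + x1 * g12)**2)
    ln_g2 = x1**2 * (tau12 * (g12 / (x2 + x1 * g12))**2
                     + tau21 * g21 / (x1 + x2 * g21)**2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Example usage with made-up parameters:
gamma1, gamma2 = nrtl_binary(0.4, tau12=1.2, tau21=0.8)
```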
Abstract:
In this article I investigate to what extent European Integration stimulates policy convergence and diffusion of various forms of tax policy. Using a mixed-methods design, I find that several causal mechanisms contribute to an EU-wide diffusion of tax policies: imposition, competition, harmonization and learning/communication. I show that these mechanisms have different effects on different forms of taxation. Even if the ultimate outcome of this influence leads to unconditional convergence in only a few cases, the EU has markedly accelerated policy diffusion among its member states.
Abstract:
As the Greek debt drama reaches another supposedly decisive point, Daniel Gros urges creditors (and indeed all policy-makers) to think about the long term and poses one key question in this CEPS High-Level Brief: What can be gained by keeping Greece inside the euro area at “whatever it takes”? As he points out, the US, with its unified politics and its federal fiscal transfer system, is often taken as a model for the Eurozone, and it is thus instructive to consider the longer-term performance of an area of the US which has for years been kept afloat by massive transfers and which is now experiencing a public debt crisis. The entity in question is Puerto Rico, which is an integral part of the US in all relevant economic dimensions (currency, economic policy, etc.). The dismal fiscal and economic performance of Puerto Rico carries two lessons: 1) Keeping Greece in the eurozone by increasing implicit subsidies in the form of debt forgiveness might create a low-growth equilibrium with increasing aid dependency. 2) It is wrong to assume that further integration, including a fiscal and political union, would be sufficient to foster convergence and prevent further problems of the type the EU is experiencing with Greece.
Abstract:
Triassic turbidites of the Nanpanjiang basin of south China represent the most expansive and voluminous siliciclastic turbidite accumulation in south China. The Nanpanjiang basin occurs at a critical junction between the southern margin of the south China plate and the Indochina, Simao and Sibumasu plates to the south and southwest. The Triassic Yangtze carbonate shelf and the isolated carbonate platforms in the basin have been extensively studied, but the siliciclastic turbidites in the basin have received relatively little attention. Deciphering the facies, paleocurrent indicators and provenance of the Triassic turbidites is important for several reasons: it promises to help resolve the timing of plate collisions along the suture zones bordering the basin to the south and southwest, it will enable evaluation of which suture zones and Precambrian massifs were source areas, and it will allow an evaluation of the impact of the siliciclastic flux on carbonate platform evolution within the basin. Turbidites in the basin include the Early Triassic Shipao Formation and the Middle-Late Triassic Baifeng, Xinyuan, Lanmu, Bianyang and Laishike formations. Each ranges upward of 700 m in thickness and the thickest is nearly 3 km. The turbidites contain very fine sand in the northern part of the basin, whereas the central and southern parts of the basin also commonly contain fine and, rarely, medium sand sizes. Coarser sand sizes occur where paleocurrents are from the south, and in this area some turbidites exhibit complete Bouma sequences with graded A divisions. Successions contain numerous alternations between mud-rich and sand-rich intervals, with thickness trends corresponding to proximal/distal fan components. Spectacularly preserved sedimentary structures enable robust evaluation of the turbidite systems and paleocurrent analyses. Analysis of paleocurrent measurements indicates two major directions of sediment fill. The northern part of the basin was sourced primarily by the Jiangnan massif to the northeast, and the central and southern parts of the basin were sourced primarily from suture zones and the Yunkai massif to the south and southeast, respectively. Sandstones of the Lower Triassic Shipao Fm. have a volcaniclastic composition, including embayed quartz and glass shards. Middle Triassic sandstones are moderately mature, matrix-rich lithic wackes. The average QFL ratio from all point-count samples is 54.1/18.1/27.8% and the QmFLt ratio is 37.8/18.1/44.1%. Lithic fragments are dominantly claystone and siltstone clasts and metasedimentary clasts such as quartz mica tectonite. Volcanic lithics are rare. Most samples fall in the recycled orogen field of QmFLt plots, indicating a relatively quartz- and lithic-rich composition consistent with derivation from Precambrian massifs such as the Jiangnan and Yunkai. A few samples from the southwest part of the basin fall into the dissected arc field, indicating a somewhat more lithic- and feldspar-rich composition consistent with derivation from a suture zone. Analysis of detrital zircon populations from 17 samples collected across the basin indicates: (1) several samples contain zircons with concordant ages greater than 3000 Ma; (2) there are widespread peaks across the basin at 1800 and 2500 Ma; (3) there is a widespread 900 Ma population; (4) there is a widespread population of zircons at 440 Ma; and (5) a larger population of younger zircons of about 250 Ma occurs in the southwestern part of the basin and is replaced to the north and northwest by a somewhat older population of around 260-290 Ma.
The 900 Ma provenance fits derivation from the Jiangnan massif; the 2500, 1800 and 440 Ma provenances fit the Yunkai massif; and the 250 Ma population is consistent with convergence and arc development in suture zones bordering the basin on the south or southwest. Early siliciclastic turbidite flux, proximal to the source areas, impacted carbonate platform evolution by infilling the basin, reducing accommodation space, stabilizing carbonate platform margins and promoting margin progradation. Late arrival, in areas far from the source areas, caused margin aggradation over a starved basin, the development of high-relief aggradational escarpments, and unstable scalloped margins.
Abstract:
Using the concept of 'orbital tuning', a continuous, high-resolution deep-sea chronostratigraphy has been developed spanning the last 300,000 yr. The chronology is developed using a stacked oxygen-isotope stratigraphy and four different orbital tuning approaches, each of which is based upon a different assumption concerning the response of the orbital signal recorded in the data. Each approach yields a separate chronology. The error measured by the standard deviation about the average of these four results (which represents the 'best' chronology) has an average magnitude of only 2500 yr. This small value indicates that the chronology produced is insensitive to the specific orbital tuning technique used. Excellent convergence between chronologies developed using each of five different paleoclimatological indicators (from a single core) is also obtained. The resultant chronology is also insensitive to the specific indicator used. The error associated with each tuning approach is estimated independently and propagated through to the average result. The resulting error estimate is independent of that associated with the degree of convergence and has an average magnitude of 3500 yr, in excellent agreement with the 2500-yr estimate. Transfer of the final chronology to the stacked record leads to an estimated error of +/-1500 yr. Thus the final chronology has an average error of +/-5000 yr.
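The error bookkeeping in this abstract can be sketched numerically; the snippet below is one assumed reading of that bookkeeping, not the authors' procedure: it averages four tuned age models, measures their spread, and adds the quoted tuning and transfer error components to reach the stated total.

```python
# Numerical sketch (an assumed reading of the abstract, not the authors' code):
# the 'best' chronology is the average of four orbitally tuned age models.
import numpy as np

def best_chronology(age_models):
    """age_models: array (4, n_levels) of ages in yr at common depth levels."""
    ages = np.asarray(age_models, dtype=float)
    best = ages.mean(axis=0)                # average of the four tunings
    convergence = ages.std(axis=0, ddof=1)  # spread between tunings (~2500 yr)
    return best, convergence

# The independently propagated tuning error (~3500 yr) agrees with the ~2500 yr
# spread; adding the +/-1500 yr transfer error reproduces the quoted total.
tuning_error_yr = 3500.0
transfer_error_yr = 1500.0
total_error_yr = tuning_error_yr + transfer_error_yr  # ~5000 yr
```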
Abstract:
Errata sheets tipped in.
Abstract:
Mode of access: Internet.