888 results for Symphonic music, analysis, hermeneutics, Bax, music and nature
Abstract:
This research aims to diachronically analyze the worldwide scientific production on open access, in the academic and scientific context, in order to contribute to knowledge and visualization of its main actors. As a method, bibliographical, descriptive and analytical research was used, drawing on bibliometric studies, especially production indicators, scientific collaboration indicators and indicators of thematic co-occurrence. The Scopus database was used as a source to retrieve the articles on the subject, resulting in a corpus of 1179 articles. Using Bibexcel software, frequency tables were constructed for the variables; Pajek software was used to visualize the collaboration network, and VOSviewer to construct the keyword network. As for the results, the most productive researchers come from countries such as the United States, Canada, France and Spain. The journals with the highest impact in the academic community have disseminated the newly constructed knowledge. A collaboration network with a few subnets, in which co-authors come from different countries, was observed. In conclusion, this study identifies the themes of debate that mark the development of open access at the international level, and it is possible to state that open access is one of the new emerging and frontier fields of library and information science.
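As an aside, the keyword co-occurrence counting behind such a network is straightforward to reproduce. The following is a minimal sketch, assuming a hypothetical list of per-article keyword sets; the study itself used Bibexcel and VOSviewer on the Scopus corpus.

from collections import Counter
from itertools import combinations

# Hypothetical sample: the keyword set attached to each retrieved article.
articles = [
    {"open access", "bibliometrics", "scholarly communication"},
    {"open access", "institutional repositories"},
    {"bibliometrics", "scholarly communication", "open access"},
]

# Count how often each pair of keywords appears in the same article.
cooccurrence = Counter()
for keywords in articles:
    for pair in combinations(sorted(keywords), 2):
        cooccurrence[pair] += 1

# Each weighted pair is an edge of the keyword co-occurrence network.
for (kw1, kw2), weight in cooccurrence.most_common():
    print(f"{kw1} -- {kw2}: {weight}")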
Abstract:
Molybdenum and vanadium were analysed in 9 sediment cores recovered from the continental slope and rise off NW Africa. Additional chemical and sedimentological parameters, as well as isotope stage boundaries, were available for the same core profiles from other investigations. Molybdenum, ranging between <1 and 10 ppm, occurs in two associations: either with organic carbon and sulphides in sediments with reducing conditions, or with Mn oxides in oxidized near-surface core sections. The highest values (between 4 and 10 ppm Mo) are found in sulphide-rich core sections deposited during glacial times in a core from 200 m water depth. The possibility of anoxic near-bottom water conditions prevailing at this site during certain glacial intervals is discussed. In oxidized near-surface core sections, the diagenetic mobility of Mo becomes evident from strong Mo enrichment together with Mn oxides (values up to 4 ppm Mo). This enrichment is probably due to coprecipitation and/or adsorption of Mo from interstitial water onto the diagenetically forming Mn oxides. The close relation between Mo and Corg results in strongly covarying sedimentation rates for both components, reaching up to 10 times higher rates in glacial compared to interglacial core sections. Vanadium (values between 20 and 100 ppm) does not show clear relations to climate or to the near-bottom or sediment milieu. It occurs mainly bound to the fine-grained terrigenous fraction, associated with aluminium silicates (clay minerals) and iron oxides. Additionally, the positive covariation of vanadium with phosphorus in most core profiles suggests that some V may be bound to phosphates.
Abstract:
Understanding past human-climate-environment interactions is essential for assessing the vulnerability of landscapes and ecosystems to future climate change. This is particularly important in southern Morocco, where the current vegetation is impacted by pastoralism and the region is highly sensitive to climate variability. Here, we present a 2000-year record of vegetation, sedimentation rate, XRF chemical element intensities, and particle size from two decadally resolved marine sediment cores raised from offshore Cape Ghir, southern Morocco. The results show that between 650 and 850 AD the sedimentation rate increased dramatically from 100 cm/1000 years to 300 cm/1000 years, and the Fe/Ca and pollen flux doubled, together indicating higher inputs of terrestrial sediment. Particle size measurements and end-member modelling suggest increased fluvial transport of the sediment. Beginning at 650 AD, pollen levels of Cichorioideae species show a sharp rise from 10% to 20%. Pollen levels of Artemisia and Plantago also increase from this time. Deciduous oak pollen percentages show a decline, whereas those of evergreen oak barely change. The abrupt increase in terrestrial/fluvial input from 650 to 850 AD coincides, within the age uncertainty, with the arrival of Islam (Islamisation) in Morocco at around 700 AD. Historical evidence suggests Islamisation led to population increase and development of southern Morocco, including expanded pastoralism, deforestation and agriculture. Livestock pressure may have changed the vegetation structure, accounting for the increase in pollen from Cichorioideae, Plantago, and Artemisia, which include many weedy species. Goats in particular may have played a dominant role as agents of erosion, and intense browsing may have led to the decline in deciduous oak; evergreen oak is more likely to survive, as it re-sprouts more vigorously after browsing. From 850 AD to the present, sedimentation rates, Fe/Ca ratios and fluvial discharge remain stable, whereas pollen results suggest continued degradation. Pollen results from the past 150 years suggest expanded cultivation of olives and the native argan tree, and the introduction of Australian eucalyptus trees. The rapidly increasing population in southern Morocco is causing continued pressure to expand pastoralism and agriculture. The history of land degradation presented here suggests that the vegetation in southern Morocco may have been degraded for a longer period than previously thought and may be particularly sensitive to further land-use changes. These results should be included in land management strategies for southern Morocco.
Abstract:
Cold-water corals are common along the Moroccan continental margin off Melilla in the Alboran Sea (western Mediterranean Sea), where they colonise and largely cover mound and ridge structures. Radiocarbon ages of the reef-forming coral species Lophelia pertusa and Madrepora oculata sampled from those structures reveal that they were prolific in this area during the last glacial-interglacial transition, with pronounced growth periods covering the Bølling-Allerød interstadial (13.5-12.8 ka BP) and the Early Holocene (11.3-9.8 ka BP). Their proliferation during these periods is expressed in vertical accumulation rates of 266-419 cm ka⁻¹ for an individual coral ridge that consists of coral fragments embedded in a hemipelagic sediment matrix. Following a period of coral absence, as noted in the records, cold-water corals re-colonised the area during the Mid-Holocene (5.4 ka BP), and underwater photographs indicate that corals currently thrive there. It appears that periods of sustained cold-water coral growth in the Melilla Coral Province were closely linked to phases of high marine productivity. The increased productivity was related to the deglacial formation of the most recent organic-rich layer in the western Mediterranean Sea and to the development of modern circulation patterns in the Alboran Sea.
Abstract:
Strong climatic and temperature fluctuations mark the Late Campanian and Maastrichtian, as indicated by stable isotope records from the equatorial Pacific (Site 463) and the middle- and high-latitude South Atlantic (Sites 525, 689 and 690). The first major global cooling decreased intermediate water temperatures (IWT) by 5-6°C between 73-70 Ma. At the same time, sea surface temperature (SST) decreased by 4-5°C in middle and high latitudes. Intermediate waters (IW) temporarily warmed by 2°C in low and middle latitudes between 70-68.5 Ma. Global cooling resumed between 68.5-65.5 Ma, when IWT decreased by 3-4°C and SST by 5°C in middle latitudes. About 450 ka before the Cretaceous-Tertiary boundary, rapid global warming increased IWT and SST by 3-4°C, though SST in the tropics changed little. During the last 200 ka of the Maastrichtian, the climate cooled rapidly, with IWT and SST decreasing by 2-3°C. During the global cooling at 71-70 Ma, and possibly at 67-65.5 Ma, the cold intermediate waters in the equatorial Pacific, Indo-Pacific and South Atlantic were derived from the high-latitude North Pacific. In contrast, during the global climate warming between 65.4-65.2 Ma, the middle-latitude South Atlantic was closest to the source of IW production, which implies that the low-latitude Tethys played a major role in global climate change. Climate changes, sea-level fluctuations and associated restricted seaways appear to be the most likely mechanisms for the alternating sources of IW production.
Abstract:
This article presents a probabilistic method for vehicle detection and tracking through the analysis of monocular images obtained from a vehicle-mounted camera. The method is designed to address the main shortcomings of traditional particle filtering approaches, namely Bayesian methods based on importance sampling, for use in traffic environments. These methods do not scale well when the dimensionality of the feature space grows, which creates significant limitations when tracking multiple objects. Alternatively, the proposed method is based on a Markov chain Monte Carlo (MCMC) approach, which allows efficient sampling of the feature space. The method involves important contributions in both the motion and the observation models of the tracker. Indeed, as opposed to particle filter-based tracking methods in the literature, which typically resort to observation models based on appearance or template matching, this study introduces a likelihood model that combines appearance analysis with information from motion parallax. Regarding the motion model, a new interaction treatment is defined based on Markov random fields (MRF) that allows for the handling of possible inter-dependencies in vehicle trajectories. As for vehicle detection, the method relies on a supervised classification stage using support vector machines (SVM). The contribution in this field is twofold. First, a new descriptor based on the analysis of gradient orientations in concentric rectangles is defined. This descriptor involves a much smaller feature space compared to traditional descriptors, which are too costly for real-time applications. Second, a new vehicle image database is generated to train the SVM and made public. The proposed vehicle detection and tracking method is proven to outperform existing methods and to successfully handle challenging situations in the test sequences.
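The abstract does not spell out the descriptor beyond "gradient orientations in concentric rectangles", so the following NumPy sketch is only one plausible reading of that idea; the ring count, bin count, and normalisation are illustrative assumptions, not the paper's parameters.

import numpy as np

def concentric_rectangle_descriptor(patch, n_rings=4, n_bins=8):
    """Histogram gradient orientations inside concentric rectangular rings.

    `patch` is a 2-D grayscale image patch; ring and bin counts are
    illustrative choices, not the values from the paper."""
    gy, gx = np.gradient(patch.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)  # in [-pi, pi]

    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized Chebyshev distance from the patch centre defines the rings.
    dist = np.maximum(np.abs(ys - (h - 1) / 2) / (h / 2),
                      np.abs(xs - (w - 1) / 2) / (w / 2))
    ring = np.minimum((dist * n_rings).astype(int), n_rings - 1)

    descriptor = []
    for r in range(n_rings):
        mask = ring == r
        hist, _ = np.histogram(orientation[mask], bins=n_bins,
                               range=(-np.pi, np.pi),
                               weights=magnitude[mask])
        descriptor.append(hist / (hist.sum() + 1e-9))  # per-ring normalisation
    return np.concatenate(descriptor)

# Example: a 32x32 patch yields a 4*8 = 32-dimensional feature vector.
vec = concentric_rectangle_descriptor(np.random.rand(32, 32))
print(vec.shape)  # (32,)

With 4 rings and 8 bins the descriptor has 32 dimensions, consistent in spirit with the paper's point that the feature space is much smaller than that of traditional dense descriptors.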
Abstract:
This communication presents an overview of first results and innovative methodologies, focusing on their possibilities and limitations for the reconstruction of recent floods and paleofloods worldwide.
Abstract:
Precise modeling of the program heap is fundamental for understanding the behavior of a program, and is thus of significant interest for many optimization applications. One of the fundamental properties of the heap that can be used in a range of optimization techniques is the set of sharing relationships between the elements in an array or collection. If an analysis can determine that the memory locations pointed to by different entries of an array (or collection) are disjoint, then in many cases loops that traverse the array can be vectorized or transformed into a thread-parallel version. This paper introduces several novel sharing properties over the concrete heap and corresponding abstractions to represent them. In conjunction with an existing shape analysis technique, these abstractions allow us to precisely resolve the sharing relations in a wide range of heap structures (arrays, collections, recursive data structures, composite heap structures) in a computationally efficient manner. The effectiveness of the approach is evaluated on a set of challenge problems from the JOlden and SPECjvm98 suites. Sharing information obtained from the analysis is used to achieve substantial thread-level parallel speedups.
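The paper's analysis is static, but the sharing property itself can be stated operationally. A minimal Python sketch, purely illustrative and not the paper's abstraction: two array entries "share" if their reachable heap regions overlap, and a traversal is safe to parallelise only when all entries are pairwise disjoint.

def reachable_ids(obj, seen=None):
    """Collect identities of all heap objects reachable from `obj`
    through lists, tuples, sets, dicts and object attributes."""
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return seen
    seen.add(id(obj))
    if isinstance(obj, (list, tuple, set)):
        children = obj
    elif isinstance(obj, dict):
        children = list(obj.keys()) + list(obj.values())
    else:
        children = vars(obj).values() if hasattr(obj, "__dict__") else ()
    for child in children:
        reachable_ids(child, seen)
    return seen

def entries_are_disjoint(array):
    """True if no two entries of `array` reach a common heap object --
    the condition under which a traversal can be safely parallelised."""
    regions = [reachable_ids(e) for e in array]
    for i in range(len(regions)):
        for j in range(i + 1, len(regions)):
            if regions[i] & regions[j]:
                return False
    return True

shared = [1, 2]
print(entries_are_disjoint([[1], [2], [3]]))       # True: disjoint entries
print(entries_are_disjoint([[shared], [shared]]))  # False: both reach `shared`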
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions on their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy and the reliability of the analysis results still needed to be investigated further. This thesis tackles this need from the following four perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first is to consider array accesses (in addition to object fields), while the second focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed tool cooperation can be used for automatically producing verified resource guarantees.
(4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, which communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
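For point (2), the multi-domain algorithm mentioned above is, at its core, a worklist fixed point. The sketch below shows only that generic shape with a hypothetical toy domain (call-chain depth); the thesis adds multiple interdependent domains, incrementality, and real cost expressions.

def fixpoint(nodes, deps, transfer, join, bottom):
    """Generic worklist fixed point: recompute a node's abstract value from
    its dependencies until nothing changes. `deps[n]` lists the nodes whose
    value feeds into `n`; `transfer` and `join` come from the analysis
    domain. (Termination on cyclic graphs needs a finite-height domain or
    widening, which this sketch omits.)"""
    value = {n: bottom for n in nodes}
    worklist = list(nodes)
    while worklist:
        n = worklist.pop()
        inputs = join(value[d] for d in deps.get(n, ()))
        new = transfer(n, inputs)
        if new != value[n]:
            value[n] = new
            # Re-queue every node that depends on n.
            worklist.extend(m for m in nodes if n in deps.get(m, ()))
    return value

# Toy instance: longest call-chain depth, a stand-in for a cost domain.
nodes = ["main", "f", "g"]
deps = {"main": ["f"], "f": ["g"], "g": []}
result = fixpoint(nodes, deps,
                  transfer=lambda n, x: x + 1,
                  join=lambda xs: max(xs, default=0),
                  bottom=0)
print(result)  # {'main': 3, 'f': 2, 'g': 1}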
Abstract:
We use multifractal analysis (MFA) to investigate how the Rényi dimensions of the solid mass and the pore space in porous structures are related to each other. To our knowledge, the relationship between the Rényi (generalized) dimensions of two phases of the same structure has not been investigated before.
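For reference, the Rényi (generalized) dimensions in question are standardly defined through box counting. With p_i(\varepsilon) the mass fraction of phase material in box i at scale \varepsilon:

D_q = \frac{1}{q-1} \lim_{\varepsilon \to 0} \frac{\log \sum_i p_i(\varepsilon)^q}{\log \varepsilon}, \qquad q \neq 1,

with the q = 1 case taken as the limit, D_1 = \lim_{\varepsilon \to 0} \frac{\sum_i p_i(\varepsilon) \log p_i(\varepsilon)}{\log \varepsilon}. Here D_0 is the box-counting dimension, D_1 the information dimension, and D_2 the correlation dimension; the question raised in the abstract is how the spectra D_q of the solid and pore phases constrain each other.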
Abstract:
In this paper, a fully automatic goal-oriented hp-adaptive finite element strategy for open-region electromagnetic problems (radiation and scattering) is presented. The methodology leads to exponential rates of convergence in terms of an upper bound of a user-prescribed quantity of interest. Thus, the adaptivity may be guided to provide an optimal error, not globally for the field in the whole finite element domain, but for specific parameters of engineering interest. For instance, the error in the numerical computation of the S-parameters of an antenna array, the field radiated by an antenna, or the Radar Cross Section in given directions can be minimized. The efficiency of the approach is illustrated with several numerical simulations on two-dimensional problem domains. Results include a comparison with the previously developed energy-norm based hp-adaptivity.
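In broad strokes, goal-oriented adaptivity of this kind rests on a dual (adjoint) problem; what follows is the standard linear formulation, not necessarily the paper's exact derivation. With a(\cdot,\cdot) the variational bilinear form, u the exact solution, u_h the finite element solution, and Q a linear quantity of interest, the dual solution z solves a(v, z) = Q(v) for all admissible v, which yields the error representation

Q(u) - Q(u_h) = a(u - u_h,\, z) = a(u - u_h,\, z - z_h),

where the last equality uses Galerkin orthogonality for any discrete z_h. Localising this quantity element by element gives refinement indicators that target the error in Q (an S-parameter, a radiated field value, the radar cross section in a given direction) rather than the global energy norm.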