853 results for Exploration-exploitation


Relevance:

20.00%

Publisher:

Abstract:

Colombia has oceanic waters, catchment areas such as lakes, ciénagas and swamps, water courses such as rivers, gorges, streams and small rivers, and groundwater. The oceanic waters are the Caribbean Sea (1,600 km) and the Pacific Ocean (1,300 km), which bound the continental territory to the north and west, respectively. The Darién Region, geographically bounded by the Caribbean Sea to the north, is increasingly the focus of studies because of use conflicts and disputes over water and a forest reserve within its territory. Given its strategic location in northwestern Colombia, on the border with Central America, several dynamics are imposed on the region. One of them is the construction of a road system known as the Connecting Road of the Americas, which implies infrastructure crossing a special zone of swamps and jungle known as the Darién Gap. Evidence of such interests is revealed by projects such as the construction of the Port of Turbo on the Atlantic coast, Department of Antioquia, and the Port of Tribugá on the Pacific coast, Department of Chocó, the mountain road, and the coastal connection between Colombia and Venezuela, all serving the main interests of the central region of the department (Metropolitan Area of the Aburrá Valley, AMVA). Human settlements form a productive system based on small- and medium-scale family agriculture, corresponding to the western slope and piedmont of the Abibe mountain range in its Antioquian portion, the alluvial plain formed by the rivers of the area, the littoral zone bordering the Caribbean Sea, the Darién and Baudó mountain ranges, and the gulf that receives, among others, the waters of the Atrato and León rivers; at the same time, an exodus process constitutes a forced displacement resulting from the actions of several armed groups. Intense historical, cultural, political and environmental relations can be identified, the last of these associated with strategic ecosystems that are fundamental for the hydric regulation of the region and for the food security of its inhabitants. Results from two research projects (UPB, 2007 and 2010) reveal a rapid transformation in the spatial reconfiguration, in demographic and economic indicators, and in an exacerbated struggle for resources, damaging the extractive vocation of the region. It has become a corridor for the trade of illegal goods (drugs, weapons) and for the implementation of an agro-industrial biofuel production project, a cooperation programme involving Venezuela, Brazil and Colombia. Modes of appropriation allow strategies driven by global interests, revealing a development logic that privileges the conception of an artificialized nature. Down to the smallest rural areas, specific modes of resource exploitation are tied to interests imposed by transnational corporations. Disparate consequences are deepening, evidenced by social, technical and natural transformations, and posing risks to the conditions of habitability.

Relevance:

20.00%

Publisher:

Abstract:

The Palestine Exploration Fund (PEF) Survey of Western Palestine (1871-1877) is highly praised for its accuracy and completeness; the first systematic analysis of its planimetric accuracy was published by Levin (2006). To study the potential of these 1:63,360 maps for a quantitative analysis of land cover changes over time, Levin compared them with 20th-century topographic maps. The map registration error of the PEF maps was 74.4 m using 123 control points of trigonometrical stations and a 1st-order polynomial. The median RMSE of all control and test points (n = 1104) was 153.6 m. Following the georeferencing of each of the 26 sheets of the PEF maps of the Survey of Western Palestine, a mosaicked file was created. Care should be taken when analysing historical maps, as it cannot be assumed that their accuracy is consistent in different parts of a map or for different features depicted on them.
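As an illustration of the kind of accuracy assessment described above, here is a minimal sketch (with hypothetical coordinates, not the PEF control points) of fitting a 1st-order polynomial (affine) transformation between historical and modern map coordinates and reporting the RMSE of the residuals:

```python
# Minimal sketch: 1st-order polynomial (affine) fit from historical map
# coordinates to modern map coordinates, plus the RMSE of the residuals.
# The control points below are hypothetical, not the PEF data.
import numpy as np

def fit_affine(src, dst):
    """Least-squares 1st-order polynomial: dst ~ [x, y, 1] @ coeffs."""
    design = np.hstack([src, np.ones((src.shape[0], 1))])   # (n, 3)
    coeffs, *_ = np.linalg.lstsq(design, dst, rcond=None)   # (3, 2)
    return coeffs

def rmse(src, dst, coeffs):
    design = np.hstack([src, np.ones((src.shape[0], 1))])
    residuals = np.linalg.norm(design @ coeffs - dst, axis=1)
    return np.sqrt(np.mean(residuals ** 2))

# Hypothetical control points in projected coordinates (metres)
src = np.array([[1000.0, 2000.0], [1500.0, 2400.0], [900.0, 1800.0], [2000.0, 2500.0]])
dst = np.array([[1070.0, 2055.0], [1572.0, 2451.0], [968.0, 1853.0], [2071.0, 2549.0]])

coeffs = fit_affine(src, dst)
print(f"RMSE over control points: {rmse(src, dst, coeffs):.1f} m")
```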

Relevance:

20.00%

Publisher:

Abstract:

Bathymetry based on data recorded during MSM34-2 between 27.12.2013 and 18.01.2014 in the Black Sea. The main objective of this cruise was the mapping and imaging of the gas hydrate distribution and gas accumulations, as well as possible gas migration pathways. Objectives of Cruise: Gas hydrates have been the focus of scientific and economic interest for the past 15-20 years, mainly because the amount of carbon stored in gas hydrates is much greater than in other carbon reservoirs. Several countries, including Japan, Korea and India, have launched vast research programmes dedicated to the exploration for gas hydrate resources and ultimately the exploitation of the gas hydrates for methane. The German SUGAR project, financed by the Ministry of Education and Research (BMBF) and the Ministry of Economics (BMWi), aims at developing technology to exploit gas hydrate resources by injecting and storing CO2 in the hydrates in place of methane. This approach includes techniques to locate and quantify hydrate reservoirs, drill into the reservoir, extract methane from the hydrates by replacing it with CO2, and monitor the CO2-hydrate reservoir thus formed. Numerical modelling has shown that exploitation of the gas hydrates can only be successful if sufficient hydrate resources are present within permeable reservoirs such as sandy or gravelly deposits. Since the ultimate goal of the SUGAR project is a field test of the technology developed within the project, knowledge of a suitable test site is crucial. Within European waters only the Norwegian margin and the Danube deep-sea fan show clear geophysical evidence for large gas hydrate accumulations, but only the Danube deep-sea fan most likely contains gas hydrates within sandy deposits. The main objective of cruise MSM34 is therefore to locate and characterise suitable gas hydrate deposits on the Danube deep-sea fan.

Relevance:

20.00%

Publisher:

Abstract:

Exhumed faults hosting hydrothermal systems provide direct insight into relationships between faulting and fluid flow, which in turn are valuable for making hydrogeological predictions in blind settings. The Grimsel Breccia Fault (Aar massif, Central Swiss Alps) is a late Neogene, exhumed dextral strike-slip fault with a maximum displacement of 25–45 m, and is associated with both fossil and active hydrothermal circulation. We mapped the fault system and modelled it in three dimensions, using the distinctive hydrothermal mineralisation as well as active thermal fluid discharge (the highest elevation documented in the Alps) to reveal the structural controls on fluid pathway extent and morphology. With progressive uplift and cooling, brittle deformation inherited the mylonitic shear zone network at Grimsel Pass, preconditioning fault geometry into segmented brittle reactivations of ductile shear zones and brittle inter-shear zone linkages. We describe ‘pipe’-like, vertically oriented fluid pathways: (1) within brittle fault linkage zones and (2) through along-strike-restricted segments of formerly ductile shear zones reactivated by brittle deformation. In both cases, low-permeability mylonitic shear zones that escaped brittle reactivation provide important hydraulic seals. These observations show that fluid flow along brittle fault planes is not planar, but rather highly channelised into sub-vertical flow domains, with important implications for the exploration and exploitation of geothermal energy.

Relevance:

20.00%

Publisher:

Abstract:

A walking machine is an alternative to a wheeled rover, well suited for work in an unstructured environment and especially on abrupt terrain. Walking machines have some drawbacks, such as speed and power consumption, but they can achieve complex movements while disturbing the environment they work in very little. The locomotion system is determined by the terrain conditions and, in our case, this legged design was chosen for a working area like Rio Tinto in the south of Spain, a river area with abrupt terrain. A walking robot with so many degrees of freedom can be a challenge when dealing with the analysis and simulation of the legs. This paper shows how to carry out the kinematic analysis of the equations of a hexapod robot, based on a design developed by the Center of Astrobiology (INTA-CSIC), following the classical formulation of the equations.
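To make the kind of analysis concrete, the following is a minimal sketch (not the paper's actual model) of the forward kinematics of a single 3-DOF leg (coxa-femur-tibia) built from classical homogeneous transforms; the link lengths and joint angles are hypothetical:

```python
# Minimal sketch: forward kinematics of one 3-DOF hexapod leg using
# homogeneous transforms. Link lengths (metres) and angles (radians)
# are hypothetical, not taken from the paper.
import numpy as np

def rot_z(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def rot_y(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]])

def trans_x(d):
    T = np.eye(4)
    T[0, 3] = d
    return T

def foot_position(q1, q2, q3, l1=0.05, l2=0.10, l3=0.15):
    """Foot-tip position in the leg-base frame (q1 coxa yaw, q2 femur pitch, q3 tibia pitch)."""
    T = rot_z(q1) @ trans_x(l1) @ rot_y(q2) @ trans_x(l2) @ rot_y(q3) @ trans_x(l3)
    return T[:3, 3]

print(foot_position(0.1, -0.3, 0.6))   # x, y, z of the foot tip
```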

Relevance:

20.00%

Publisher:

Abstract:

Single-core processors have reached their maximum clock speeds; new multicore architectures provide an alternative way to tackle this issue. The design of decoding applications running on top of these multicore platforms, and their optimisation to exploit all of the system's computational power, is crucial to obtaining the best results. Since development at the integration level of printed circuit boards is increasingly difficult to optimise, due to physical constraints and the inherent increase in power consumption, the development of multiprocessor architectures is becoming the new Holy Grail. In this sense, it is crucial to develop applications that can run on the new multi-core architectures and to find distributions that maximise the potential use of the system. Today most commercial electronic devices available on the market are built around embedded systems, and these devices have recently begun to incorporate multi-core processors. Task management across multiple cores/processors is not a trivial issue, and good task/actor scheduling can yield significant improvements in terms of efficiency and processor power consumption. Scheduling the data flows between the actors that implement an application aims to harness multi-core architectures for more types of applications, with an explicit expression of parallelism in the application. On the other hand, the recent development of the MPEG Reconfigurable Video Coding (RVC) standard allows the reconfiguration of video decoders. RVC is a flexible standard, compatible with codecs developed by MPEG, making it the ideal tool to integrate into the new multimedia terminals for decoding video sequences. With the new versions of the Open RVC-CAL Compiler (Orcc), a static mapping of the actors that implement the functionality of the application can be performed once the application executable has been generated. This static mapping must be done for each of the cores available on the target platform. An embedded system with a dual-core ARMv7 processor has been chosen. This platform allows us to run the desired tests, measure the improvement over execution on a single core, and contrast both with a PC-based multiprocessor system, in the context of developing MPEG RVC-based video decoders and studying their scheduling and static mapping.
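As a toy illustration of static mapping (not Orcc's actual mapping algorithm or file format), the sketch below partitions a set of decoder actors, with hypothetical workload estimates, onto two cores using a greedy load-balancing heuristic:

```python
# Illustrative sketch only: greedy static partitioning of dataflow actors
# onto a fixed number of cores, balancing estimated per-actor workloads.
# Actor names and costs are hypothetical.
def greedy_static_mapping(actor_costs, n_cores=2):
    """Assign each actor to the currently least-loaded core."""
    loads = [0.0] * n_cores
    mapping = {}
    # Place heavy actors first so the final balance is tighter.
    for actor, cost in sorted(actor_costs.items(), key=lambda kv: -kv[1]):
        core = loads.index(min(loads))
        mapping[actor] = core
        loads[core] += cost
    return mapping, loads

actors = {"parser": 1.0, "idct": 3.5, "motion_comp": 4.0, "deblock": 2.0}
mapping, loads = greedy_static_mapping(actors, n_cores=2)
print(mapping)   # actor -> core index
print(loads)     # estimated load per core
```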

Relevance:

20.00%

Publisher:

Abstract:

Andorra-I is the first implementation of a language based on the Andorra Principle, which states that determinate goals can (and should) be run before other goals, and even in a parallel fashion. This principle has materialized in a framework called the Basic Andorra model, which allows or-parallelism as well as (dependent) and-parallelism for determinate goals. In this report we show that it is possible to further extend this model in order to allow general independent and-parallelism for nondeterminate goals, without greatly modifying the underlying implementation machinery. A simple and easy way to realize such an extension is to make each (nondeterminate) independent goal determinate, by using a special "bagof" construct. We also show that this can be achieved automatically by compile-time translation from original Prolog programs. A transformation that fulfils this objective and which can be easily automated is presented in this report.

Relevance:

20.00%

Publisher:

Abstract:

Biomedical ontologies are key elements for building the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre-Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists' toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than manual treatment. Results: We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy's capability for enriching, modifying and querying biomedical ontologies. Conclusions: Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a data-intensive architecture that demonstrates the ability to support applications from a wide range of application domains, and to support the different types of users involved in defining, designing and executing data-intensive processing tasks. The prototype architecture is introduced, and the pivotal role of DISPEL as a canonical language is explained. The architecture promotes the exploration and exploitation of distributed and heterogeneous data and spans the complete knowledge discovery process, from data preparation, to analysis, to evaluation and reiteration. The architecture evaluation included large-scale applications from astronomy, cosmology, hydrology, functional genetics, image processing and seismology.

Relevance:

20.00%

Publisher:

Abstract:

In the last years significant efforts have been devoted to the development of advanced data analysis tools, both to predict the occurrence of disruptions and to investigate the operational spaces of devices, with the long-term goal of advancing the understanding of the physics of these events and preparing for ITER. On JET the latest generation of the disruption predictor, called APODIS, has been deployed in the real-time network during the last campaigns with the new metallic wall. Even though it was trained only on discharges with the carbon wall, it has reached very good performance, with both missed alarms and false alarms on the order of a few percent (and strategies to improve the performance have already been identified). Since predicting the type of disruption is also considered very important for optimising the mitigation measures, a new clustering method, based on the geodesic distance on a probabilistic manifold, has been developed. This technique allows automatic classification of an incoming disruption with a success rate better than 85%. Various other manifold learning tools, particularly Principal Component Analysis and Self-Organising Maps, are also producing very interesting results in the comparative analysis of the JET and ASDEX Upgrade (AUG) operational spaces, on the route to developing predictors capable of extrapolating from one device to another.
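As an illustration of classification based on a geodesic distance on a probabilistic manifold (a sketch, not the actual JET implementation), the following computes the Fisher-Rao geodesic distance between univariate Gaussians and assigns an incoming signal, summarised by its mean and standard deviation, to the nearest of two hypothetical cluster centroids:

```python
# Illustrative sketch: Fisher-Rao geodesic distance between univariate
# Gaussians, derived from the hyperbolic geometry of the (mu, sigma)
# half-plane, used here for toy nearest-centroid classification.
import numpy as np

def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
    """Geodesic distance between N(mu1, sigma1^2) and N(mu2, sigma2^2)."""
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    den = 2.0 * sigma1 * sigma2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / den)

# Hypothetical cluster centroids (mean, std) and an incoming signal summary
centroids = {"cluster_A": (1.2, 0.3), "cluster_B": (2.5, 0.8)}
incoming = (1.4, 0.35)
label = min(centroids, key=lambda k: fisher_rao_distance(*incoming, *centroids[k]))
print(label)   # -> "cluster_A"
```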

Relevance:

20.00%

Publisher:

Abstract:

The use of electrodynamic bare tethers to explore the Jovian system by tapping its rotational energy for power and propulsion is studied. The position of perijove and apojove in elliptical orbits, relative to the synchronous orbit at 2.24 times Jupiter's radius, is exploited to conveniently make the induced Lorentz force act as drag or thrust, while generating power and navigating the system. Capture and evolution to a low elliptical orbit near Jupiter, and capture into low circular orbits at the moons Io and Europa, are discussed.
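A rough back-of-the-envelope sketch (rounded constants and a circular-orbit simplification, rather than the paper's elliptical orbits) of why the Lorentz force on a tether switches from drag to thrust at the synchronous radius of about 2.24 Jupiter radii:

```python
# Rough sketch with rounded constants: below the synchronous radius the
# corotating plasma moves slower than the spacecraft, so the Lorentz force
# on a bare tether acts as drag; above it the plasma overtakes the
# spacecraft and the force becomes thrust.
import math

MU_J = 1.267e17      # Jupiter gravitational parameter, m^3/s^2
R_J = 7.149e7        # Jupiter equatorial radius, m
OMEGA_J = 1.76e-4    # Jupiter spin rate, rad/s (~9.93 h period)

def lorentz_force_sense(r_over_RJ):
    r = r_over_RJ * R_J
    v_orbit = math.sqrt(MU_J / r)   # circular orbital speed
    v_plasma = OMEGA_J * r          # corotating plasma speed
    return "thrust" if v_plasma > v_orbit else "drag"

for x in (1.5, 2.24, 4.0):
    print(x, lorentz_force_sense(x))   # drag below ~2.24 R_J, thrust above
```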

Relevance:

20.00%

Publisher:

Abstract:

The spatial distribution of metal grades analysed on drill cores obtained during the exploration campaigns of the Pallancata Vein is investigated. Factor analysis is applied to this distribution and to the ratios of the metal values, discriminating those that are correlated with the silver mineralisation and that serve as exploration guides for finding zones of potential reserves through their variation gradients.
Abstract: The metal distribution in a vein may show the paths of hydrothermal fluid flow at the time of mineralization. Such information may assist in-fill drilling. The Pallancata Vein has been intersected by 52 drill holes, whose cores were sampled and analysed, and the results plotted to examine the mineralisation trends. The spatial distribution of the ore is observed from the logAg/logPb ratio distribution. Au is in this case closely related to Ag (electrum and uytenbogaardtite, Ag3AuS2). The Au grade shows the same spatial distribution as the Ag grade. The logAg/logPb ratio distribution also suggests possible ore at deeper levels. Shallow supergene Ag enrichment was also observed.
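As a toy illustration of the kind of processing described (synthetic assay values, not the Pallancata data), the sketch below computes the logAg/logPb ratio used as an exploration guide and runs a factor analysis on log-transformed grades to see which metals load together with Ag:

```python
# Illustrative sketch with synthetic drill-core assays: logAg/logPb ratio
# per sample plus a factor analysis of log-transformed metal grades.
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Columns: Ag (g/t), Au (g/t), Pb (ppm), Zn (ppm) -- hypothetical values
assays = np.array([
    [350.0, 1.2,  4200.0,  6100.0],
    [120.0, 0.4,  9800.0, 15300.0],
    [780.0, 2.9,  2100.0,  3500.0],
    [210.0, 0.7,  7600.0, 11000.0],
    [560.0, 1.8,  3300.0,  5200.0],
    [ 90.0, 0.3, 12400.0, 18900.0],
])
logs = np.log10(assays)

ag_pb_ratio = logs[:, 0] / logs[:, 2]        # logAg / logPb per sample
fa = FactorAnalysis(n_components=2, random_state=0).fit(logs)
print(np.round(ag_pb_ratio, 2))
print(np.round(fa.components_, 2))           # factor loadings per metal
```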

Relevance:

20.00%

Publisher:

Abstract:

Modern sensor technologies and simulators applied to large and complex dynamic systems (such as road traffic networks, sets of river channels, etc.) produce large amounts of behavior data that are difficult for users to interpret and analyze. Software tools that generate presentations combining text and graphics can help users understand this data. In this paper we describe the results of our research on automatic multimedia presentation generation (including text, graphics, maps, images, etc.) for interactive exploration of behavior datasets. We designed a novel user interface that combines automatically generated text and graphical resources. We describe the general knowledge-based design of our presentation generation tool. We also present applications that we developed to validate the method, and a comparison with related work.

Relevance:

20.00%

Publisher:

Abstract:

Earth observation is a very useful tool for studying the phenomena that occur on the Earth's surface. Observation can be carried out at different scales and by different methods, depending on the purpose. This Final Degree Project sets out to present the observation of territory by means of remote sensing techniques and their application to hydrocarbon exploration. From the Second World War onwards, capturing aerial images of regions of the Earth was restricted to cartographic uses in the strict sense. Since then, a series of scientific advances has made it possible to deduce intrinsic characteristics of the Earth through complex systems that we cannot appreciate with the naked eye but that, configured with specific geometric and electronic parameters, can generate time series of the physical phenomena occurring on the Earth's surface. Today it can be said that the exploitation of the electromagnetic spectrum is at a peak: analysis has moved from the visible region of the spectrum to the spectrum as a whole. This entails the development of new algorithms, techniques and processes to extract as much information as possible about the interaction of matter with electromagnetic radiation. The information generated by these acquisition systems supports both direct and indirect methods of hydrocarbon prospecting. Remote sensing techniques applied in geophysical campaigns are used to minimise costs and maximise the results of field investigations. The prediction of anomalies in the study area depends on the analyst, who designs, computes and evaluates the variations in the electromagnetic energy reflected or emitted by the Earth's surface. For this prediction, different space programmes are reviewed, and the quality of the data registration and the spectral separability are assessed by means of different classifications (supervised and unsupervised). Because of its direct influence on the observations, a study of atmospheric correction is carried out: several atmospheric correction models are programmed for multispectral images, and atmospheric correction methods for hyperspectral data are evaluated. The temperature of the area of interest is obtained using the TM-4, ASTER and OLI sensors, together with a Digital Terrain Model generated from the stereoscopic pair captured by the ASTER sensor. Once these procedures have been applied, direct and indirect methods are used to locate zones probably affected by the presence of hydrocarbons and to detect hydrocarbons directly by means of hyperspectral remote sensing. The indirect method uses images captured by the ETM+ and ASTER sensors; the direct method uses images captured by the Hyperion sensor.
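As a small illustration of one of the simple image-based atmospheric correction models mentioned above (a sketch under the dark-object-subtraction assumption, not the code developed in the thesis), the following corrects a hypothetical multispectral band:

```python
# Minimal sketch: dark-object subtraction (DOS) style atmospheric
# correction for one multispectral band. Input values are hypothetical
# top-of-atmosphere reflectances.
import numpy as np

def dos_correction(band, percentile=0.01):
    """Subtract the 'dark object' value (a very low percentile of the band),
    assuming that value is mostly atmospheric path radiance / haze."""
    dark = np.percentile(band, percentile)
    return np.clip(band - dark, 0.0, None)

band = np.array([[0.08, 0.10, 0.35],
                 [0.07, 0.22, 0.40],
                 [0.09, 0.31, 0.12]])
print(dos_correction(band))
```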