867 results for virtual worlds working group



Abstract:

Manganese nodule research has focused on the area between the Clarion Fracture Zone to the north and the Clipperton Fracture Zone to the south, where significant Ni-Cu concentrations were found. During the CCOP/SOPAC-IOC/IDOE International Workshop on the "Geology, Mineral Resources and Geophysics of the South Pacific", held in Fiji in September 1975, a working group on manganese nodules was formed by scientists from CNEXO (Brest), the Institute of Oceanography (New Zealand), Imperial College (London) and the Technical University of Aachen. A draft project was presented in July 1976 by J. Andrews (University of Hawaii) and G. Pautot (CNEXO) for a joint survey under the name "Hawaii-Tahiti Transect program". Further details were worked out in September 1976 during the International Geological Congress in Sydney, with the participation of D. Cronan (Imperial College), Glasby (New Zealand Geological Survey) and G. Friedrich (Aachen TU). The final scientific program was established in July 1977, planning on the participation of three research vessels: the Suroit (CNEXO), the Kana Keoki (U. of Hawaii) and the Sonne (Aachen TU). Several survey areas were selected across the Pacific Ocean (Areas A, B, C, D, E, F, G and H) with approximately the same crustal age (about 40 million years) and similar water depths. Being near large fault zones, the areas would be adequate to study the influences of biological productivity, sedimentation rate and possibly volcanic activity on the formation and growth of manganese nodules. The study of the influence of volcanic activity would particularly apply to Area G, situated near the Marquesas Fracture Zone. The R/V Sonne cruise started in August 1978, covering Areas C, D, F, G and K. The R/V Suroit conducted a similar expedition in 1979 over Areas A, B, C, D, E, H and I. Other cruises were planned during 1979-1980 for the R/V Kana Keoki. The present text relates to the R/V Sonne cruises SO-06/1 and SO-06/2, carried out within the framework of this international cooperative project.


Abstract:

In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's OrbView-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, act as a channel to the broader end-user community, and provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006. Error statistics and inter-sensor biases were quantified by comparison with in situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 TB of input (Level 2) data have been ingested and 14 TB of intermediate and output products created, with 4 TB of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in situ data collection or interesting oceanographic phenomena. This Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available for use by the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products. These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them in order to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used. Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour web site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
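
To illustrate how per-sensor error statistics can propagate into a merged product and its error estimate, here is a minimal, illustrative Python sketch of an inverse-variance weighted merge for a single pixel. It is not the GlobColour merging algorithm itself; the function name, retrieval values and error figures are assumptions made for the example.

```python
import numpy as np

def merge_pixels(values, sigmas):
    """Inverse-variance weighted merge of co-located sensor retrievals.

    values : per-sensor chlorophyll-a retrievals for one pixel (NaN = no data)
    sigmas : per-sensor error estimates from the sensor characterisation
    Returns the merged value and its propagated error estimate.
    """
    values = np.asarray(values, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    ok = ~np.isnan(values)
    if not ok.any():
        return np.nan, np.nan              # no sensor saw this pixel (e.g. clouds)
    w = 1.0 / sigmas[ok] ** 2              # weight each sensor by its error statistics
    merged = np.sum(w * values[ok]) / np.sum(w)
    merged_sigma = np.sqrt(1.0 / np.sum(w))  # propagated error of the merged value
    return merged, merged_sigma

# Hypothetical chlorophyll-a retrievals (mg m-3) from three sensors for one pixel
print(merge_pixels([0.21, 0.25, np.nan], [0.05, 0.04, 0.06]))
```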


Abstract:

Vast portions of Arctic and sub-Arctic Siberia, Alaska and the Yukon Territory are covered by ice-rich silty to sandy deposits that contain large ice wedges, resulting from syngenetic sedimentation and freezing. Accompanied by wedge-ice growth in polygonal landscapes, the sedimentation process was driven by cold continental climatic and environmental conditions in unglaciated regions during the late Pleistocene, inducing the accumulation of the unique Yedoma deposits, up to >50 meters thick. Because of the fast incorporation of organic material into syngenetic permafrost during its formation, Yedoma deposits include well-preserved organic matter. Ice-rich deposits like Yedoma are especially prone to degradation triggered by climate change or human activity. When Yedoma deposits degrade, large amounts of sequestered organic carbon as well as other nutrients are released and become part of active biogeochemical cycling. This could be of global significance for future climate warming, as increased permafrost thaw is likely to lead to a positive feedback through enhanced greenhouse gas fluxes. Therefore, a detailed assessment of the current Yedoma deposit coverage and its volume is important for estimating its potential response to future climate changes. We synthesized a map of Yedoma coverage and thickness estimates, which will provide critical data needed for further research. In particular, this preliminary Yedoma map is a great step forward in understanding the spatial heterogeneity of Yedoma deposits and their regional coverage. There will be further applications in the context of reconstructing paleo-environmental dynamics and past ecosystems like the mammoth-steppe-tundra, or ground ice distribution including future thermokarst vulnerability. Moreover, the map will be a crucial improvement of the data basis needed to refine the present-day Yedoma permafrost organic carbon inventory, which is assumed to be between 83±12 (Strauss et al., 2013, doi:10.1002/2013GL058088) and 129±30 (Walter Anthony et al., 2014, doi:10.1038/nature13560) gigatonnes (Gt) of organic carbon in perennially frozen archives. Hence, here we synthesize data on the circum-Arctic and sub-Arctic distribution and thickness of Yedoma to compile a preliminary circum-polar Yedoma map. For compiling this map, we used (1) maps of previous Yedoma coverage estimates, (2) the digitized areas from Grosse et al. (2013), and (3) areas of potential Yedoma distribution extracted from additional surface geological and Quaternary geological maps (1.: 1:500,000: Q-51-V,G; P-51-A,B; P-52-A,B; Q-52-V,G; P-52-V,G; Q-51-A,B; R-51-V,G; R-52-V,G; R-52-A,B; 2.: 1:1,000,000: P-50-51; P-52-53; P-58-59; Q-42-43; Q-44-45; Q-50-51; Q-52-53; Q-54-55; Q-56-57; Q-58-59; Q-60-1; R-(40)-42; R-43-(45); R-(45)-47; R-48-(50); R-51; R-53-(55); R-(55)-57; R-58-(60); S-44-46; S-47-49; S-50-52; S-53-55; 3.: 1:2,500,000: Quaternary map of the territory of the Russian Federation; 4.: Alaska Permafrost Map). The digitization was done using GIS techniques (ArcGIS) and vectorization of raster images (Adobe Photoshop and Illustrator). Data on Yedoma thickness were obtained from boreholes and exposures reported in the scientific literature. The map and database are still preliminary and will have to undergo a technical and scientific vetting and review process.

In their current form, the Yedoma area polygons include a range of attributes based on lithological and stratigraphical information from the original source maps, as well as a confidence level for our classification of an area as Yedoma (three stages: confirmed, likely, or uncertain). In its current version, our database includes more than 365 boreholes and exposures and more than 2000 digitized Yedoma areas. We expect that the database will continue to grow. At this preliminary stage, we estimate the Northern Hemisphere Yedoma deposit area to cover approximately 625,000 km². We estimate that 53% of the total Yedoma area today is located in the tundra zone and 47% in the taiga zone. Separated from west to east, 29% of the Yedoma area is found in North America and 71% in North Asia. The latter includes 9% in West Siberia, 11% in Central Siberia, 44% in East Siberia and 7% in Far East Russia. Adding the recent maximum Yedoma region (including all Yedoma uplands, thermokarst lakes and basins, and river valleys) of 1.4 million km² (Strauss et al., 2013, doi:10.1002/2013GL058088) and postulating that Yedoma occupied up to 80% of the adjacent formerly exposed and now flooded Beringia shelves (1.9 million km², down to 125 m below modern sea level, between 105°E - 128°W and >68°N), we assume that the Last Glacial Maximum Yedoma region likely covered more than 3 million km² of Beringia. Acknowledgements: This project is part of the Action Group "The Yedoma Region: A Synthesis of Circum-Arctic Distribution and Thickness" (funded by the International Permafrost Association (IPA) to J. Strauss) and is embedded into the Permafrost Carbon Network (working group Yedoma Carbon Stocks). We acknowledge the support of the European Research Council (Starting Grant #338335), the German Federal Ministry of Education and Research (Grant 01DM12011 and "CarboPerm" (03G0836A)), the Initiative and Networking Fund of the Helmholtz Association (#ERC-0013) and the German Federal Environment Agency (UBA, project UFOPLAN FKZ 3712 41 106).
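
As a quick check of the regional breakdown quoted above, the following short Python sketch reproduces the arithmetic from the rounded percentages given in the text (the figures are those of the abstract, not an independent calculation, so small rounding differences are expected).

```python
# Minimal arithmetic sketch using the rounded figures quoted in the abstract.
total_km2 = 625_000  # estimated Northern Hemisphere Yedoma deposit area

shares = {
    "North America": 0.29,
    "West Siberia": 0.09,
    "Central Siberia": 0.11,
    "East Siberia": 0.44,
    "Far East Russia": 0.07,
}

for region, share in shares.items():
    print(f"{region:>16}: ~{share * total_km2:,.0f} km²")

# The four Siberian/Far East shares together should give the ~71% quoted for North Asia.
print("North Asia share:", sum(v for k, v in shares.items() if k != "North America"))
```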


Abstract:

A special working group to study and develop standards related to building restoration, rehabilitation and maintenance was formed in January 2001 within AENOR (Spanish Association for Codes and Standards), under the management of the Department of Building Construction of the School of Architecture of Madrid (DCTA-UPM). Three groups were organized to deal with different topics: Diagnosis, Techniques and Materials, and Maintenance. In this paper, the different topics on which the Diagnosis Subgroup is working are described: historical studies, constructive description of the building, and building pathology. These are basic to carrying out a correct diagnosis of any type of building, whether historic or not. In developing these topics, the recognizable architectural values are justified, as they are prior to the diagnosis stage. As an example of the subgroup's work, several pathology cards are shown, covering longitudinal cracks of mechanical origin in beams of concrete structures, façade enclosures and claddings, and general symptoms of building service installations.


Abstract:

El reciente desarrollo de la instrumentación diseñada para proporcionar datos de aceleraciones y movimientos del cajón número 8 del dique Botafoc (Ibiza), perteneciente a la Autoridad Portuaria de Baleares (Puertos del Estado), en conjunción con datos procedentes de una instrumentación compuesta por sensores de presión existente en el paramento vertical, proporciona un novedoso medio para analizar la respuesta estructural del cajón, no sólo ante la acción del oleaje, sino también ante los efectos producidos por las maniobras de los buques en el muelle. Como la medición de estas aceleraciones y velocidades angulares se hace a altas frecuencias (de hasta 400 Hz), podemos proporcionar datos válidos acerca del comportamiento estructural y de los movimientos reales del cajón, tratando de correlacionar este comportamiento con los resultados obtenidos por el grupo de trabajo PROVERBS (Probabilistic design of vertical breakwaters, MAST III EU Programme), y generando una base de datos estadística de movimientos que deben considerarse para enriquecer los conocimientos en este ámbito. Además, la posibilidad de registrar los efectos causados por las maniobras de atraque-desatraque-estancia de los buques, abre un nuevo punto de vista al diseño estructural de un dique-muelle, siendo también de gran interés para los diseñadores de obras marítimas y para la correcta definición de las maniobras del buque en el muelle.

The recent deployment of new instrumentation designed to provide accelerations and angular velocities from caisson #8 at the Botafoc seawall, Ibiza, along with an existing pressure sensor instrumentation at the vertical wall, provides a way to record and process data on the structural response, not only to waves, but also to the effects caused by ship mooring operations at the Botafoc seawall. As the measurement of these angular velocities and accelerations is programmed with sampling frequencies of up to 400 Hz, by integrating the data through time we can provide suitable data on the structural behaviour and the real movements of the caisson. We try to correlate this behaviour with the results of the PROVERBS working group (Probabilistic design of vertical breakwaters, MAST III EU Programme), generating a statistical movement database that should be used to improve knowledge on this subject. The possibility of recording the effects caused by the different ship mooring operations also opens a new point of view on the complete structural design of a seawall-wharf, which is of interest for coastal designers as well as for the correct definition of ship mooring procedures.
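
As an illustration of the processing hinted at above (integrating a high-rate acceleration record through time to approximate caisson motion), here is a minimal Python sketch. The sampling rate matches the 400 Hz mentioned in the abstract, but the synthetic signal, units and detrending choices are illustrative assumptions rather than the actual processing chain of the Botafoc instrumentation.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.signal import detrend

fs = 400.0                          # assumed sampling frequency, Hz
t = np.arange(0, 60.0, 1.0 / fs)    # one minute of (synthetic) record

# Synthetic horizontal acceleration of the caisson (m/s^2): a slow sway plus noise.
acc = 0.02 * np.sin(2 * np.pi * 0.1 * t) + 0.002 * np.random.randn(t.size)

# Remove offset/linear drift before each integration step to limit drift growth.
vel = cumulative_trapezoid(detrend(acc), t, initial=0.0)    # m/s
disp = cumulative_trapezoid(detrend(vel), t, initial=0.0)   # m

print(f"peak displacement ~ {np.max(np.abs(disp)) * 1000:.1f} mm")
```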


Abstract:

The goal of the W3C's Media Annotation Working Group (MAWG) is to promote interoperability between multimedia metadata formats on the Web. As experienced by everybody, audiovisual data is omnipresent on today's Web. However, different interaction interfaces and especially diverse metadata formats prevent unified search, access, and navigation. MAWG has addressed this issue by developing an interlingua ontology and an associated API. This article discusses the rationale and core concepts of the ontology and API for media resources. The specifications developed by MAWG enable interoperable, contextualized and semantic annotation and search, independent of the source metadata format, and connect multimedia data to the Linked Data cloud. Some demonstrators of such applications are also presented in this article.
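
The interlingua idea can be illustrated with a small, hypothetical sketch: heterogeneous source records are mapped onto a handful of shared core property names so that one query works across formats. The field and property names below are illustrative assumptions, not the actual MAWG ontology terms or API.

```python
# Illustrative sketch only (not the W3C ontology/API): map two differently
# structured source records onto a few shared "core" property names, so a
# single query works regardless of the original metadata format.
def from_exif_like(rec):
    return {"title": rec.get("ImageDescription"),
            "creator": rec.get("Artist"),
            "locator": rec.get("SourceFile")}

def from_video_portal_like(rec):
    return {"title": rec.get("snippet", {}).get("title"),
            "creator": rec.get("snippet", {}).get("channelTitle"),
            "locator": rec.get("contentUrl")}

records = [
    from_exif_like({"ImageDescription": "Sunset over Lisbon",
                    "Artist": "A. Photographer",
                    "SourceFile": "http://example.org/sunset.jpg"}),
    from_video_portal_like({"snippet": {"title": "Sunset timelapse",
                                        "channelTitle": "A. Videographer"},
                            "contentUrl": "http://example.org/sunset.mp4"}),
]

# One query over heterogeneous sources, thanks to the shared vocabulary.
print([r["title"] for r in records])
```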


Abstract:

Provenance models are crucial for describing experimental results in science. The W3C Provenance Working Group has recently released the PROV family of specifications for provenance on the Web. While provenance focuses on what is executed, it is important in science to publish the general methods that describe scientific processes at a more abstract and general level. In this paper, we propose P-PLAN, an extension of PROV to represent plans that guided the execution and their correspondence to provenance records that describe the execution itself. We motivate and discuss the use of P-PLAN and PROV to publish scientific workflows as Linked Data.
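
A minimal sketch of the idea in RDF, using Python and rdflib: an abstract plan with one step, and a PROV activity in the execution trace linked back to the step it corresponds to. The P-PLAN namespace and term names used here are assumptions to be checked against the published vocabulary; the PROV terms are standard.

```python
# Sketch only: the P-PLAN namespace and property names below are assumed.
from rdflib import Graph, Namespace, RDF

PPLAN = Namespace("http://purl.org/net/p-plan#")   # assumed namespace
PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/")

g = Graph()
g.bind("pplan", PPLAN)
g.bind("prov", PROV)

# The abstract plan and one of its steps.
g.add((EX.alignmentPlan, RDF.type, PPLAN.Plan))
g.add((EX.runBlast, RDF.type, PPLAN.Step))
g.add((EX.runBlast, PPLAN.isStepOfPlan, EX.alignmentPlan))

# Execution trace: a PROV activity that corresponds to the abstract step.
g.add((EX.runBlast_2014_01_01, RDF.type, PROV.Activity))
g.add((EX.runBlast_2014_01_01, PPLAN.correspondsToStep, EX.runBlast))

print(g.serialize(format="turtle"))
```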


Abstract:

En los últimos años hemos sido testigos de la creciente demanda de software para resolver problemas cada vez más complejos y de mayor valor agregado. Bajo estas circunstancias, nos podemos hacer la siguiente pregunta: ¿Está preparada la industria de software para entregar el software que se necesita en los próximos años, de acuerdo con las demandas del cliente? Hoy en día, muchos expertos creen que el éxito de esta industria dependerá de su capacidad para gestionar los proyectos, las personas y los recursos. En este sentido, la gestión de proyectos es un factor clave para el éxito de los proyectos software en todo el mundo. Además, considerando que las Pequeñas y Medianas Empresas de software (PYMEs) representan el 99,87% de las empresas españolas, es vital para este tipo de empresas la implementación de los procesos involucrados con la gestión de proyectos. Es cierto que existen muchos modelos que mejoran la eficacia de la gestión de proyectos, pero la mayoría de ellos se centra únicamente en dos procesos: la planificación del proyecto y la monitorización y control del proyecto, ninguno de los cuales a menudo es asequible para las PYMEs. Estos modelos se basan en el consenso de un grupo de trabajo designado para establecer cómo debe ser gestionado el proceso software. Los modelos son bastante útiles ya que proporcionan lineamientos generales sobre dónde empezar a mejorar la gestión de los proyectos, y en qué orden, a personas que no saben cómo hacerlo. Sin embargo, como se ha dicho anteriormente, la mayoría de estos modelos solamente funcionan en escenarios dentro de las grandes empresas. Por lo tanto, es necesario adaptar los modelos y herramientas para el contexto de PYMEs. Esta tesis doctoral presenta una solución complementaria basada en la aplicación de un metamodelo. Este metamodelo es creado para mejorar la calidad de los procesos de la gestión de proyectos a través de la incorporación de prácticas eficaces identificadas a través del análisis y estudio de los modelos y normas existentes relacionadas con la gestión de proyectos. Por lo tanto, el metamodelo PROMEP (Gestión de Proyectos basada en Prácticas Efectivas) permitirá establecer un proceso estándar de gestión de proyectos que puede adaptarse a los proyectos de cada empresa a través de dos pasos: En primer lugar, para obtener una fotografía instantánea (o base) de los procesos de gestión de proyectos de las PYMEs se creó un cuestionario de dos fases para identificar tanto las prácticas realizadas como las no realizadas. El cuestionario propuesto se basa en el Modelo de Madurez y Capacidad Integrado para el Desarrollo v1.2 (CMMI-DEV v1.2). Como resultado adicional, se espera que la aplicación de este cuestionario ayude a las PYMEs a identificar aquellas prácticas que se llevan a cabo, pero no son documentadas, aquellas que necesitan más atención, y aquellas que no se realizan debido a la mala gestión o al desconocimiento. En segundo lugar, para apoyar fácilmente y eficazmente las tareas de gestión de proyectos software del metamodelo PROMEP, se diseñó una biblioteca de activos de proceso (PAL) para apoyar la definición de los procesos de gestión de proyectos y realizar una gestión cuantitativa de cada proyecto de las PYMEs. Ambos pasos se han implementado como una herramienta computacional que apoya nuestro enfoque de metamodelo.
En concreto, la presente investigación propone la construcción del metamodelo PROMEP para aquellas PYMEs que desarrollan productos software de tal forma que les permita planificar, monitorizar y controlar sus proyectos software, identificar los riesgos y tomar las medidas correctivas necesarias, establecer y mantener un conjunto de activos de proceso, definir un mecanismo cuantitativo para predecir el rendimiento de los procesos, y obtener información de mejora. Por lo tanto, nuestro estudio sugiere un metamodelo alternativo para lograr mayores niveles de rendimiento en los entornos de PYMEs. Así, el objetivo principal de esta tesis es ayudar a reducir los excesos de trabajo y el tiempo de entrega, y aumentar así la calidad del software producido en este tipo de organizaciones.

Abstract: In recent years we have been witnessing the increasing demand for software to solve more and more complex problems of greater added value. Under these circumstances, we can ask ourselves the following question: Is the software industry prepared to deliver the software that is needed in the coming years, according to client demands? Nowadays, many experts believe that the industry's success will depend on its capacity to manage projects, people and resources. In this sense, project management is a key factor for software project success around the world. Moreover, taking into account that small and medium-sized software enterprises (SMSe) account for 99.87% of Spanish enterprises, it is vital for this type of enterprise to implement the processes involved in project management. It is true that there are many models that improve project management effectiveness, but most of them focus only on two processes: project planning and project monitoring and control, neither of which is often affordable for SMSe. Such models are based on the consensus of a designated working group on how the software process should be managed. They are very useful in that they provide general guidelines on where to start improving project management, and in which order, to people who do not know how to do it. However, as we said, the majority of these models have only worked in scenarios within large companies. So, it is necessary to adapt these models and tools to the context of SMSe. A complementary solution based on the implementation of a metamodel is presented in this thesis. This metamodel is created to improve the quality of project management processes through the incorporation of effective practices identified through the analysis and study of relevant models and standards related to project management. Thus, the PROMEP (PROject Management based on Effective Practices) metamodel will allow establishing a standard project management process to be tailored to each enterprise's projects through two steps: Firstly, to obtain a baseline snapshot of project management processes in SMSe, a two-phase questionnaire was created to identify both performed and non-performed practices. The proposed questionnaire is based on Capability Maturity Model Integration for Development v1.2 (CMMI-DEV v1.2). As an additional result, it is expected that applying the questionnaire will help SMSe to identify those practices which are performed but not documented, which practices need more attention, and which are not implemented due to bad management or unawareness.
Secondly, to easily and effectively support the software project management tasks of the PROMEP metamodel, a Process Asset Library (PAL) was designed to support the definition of project management processes and to achieve quantitative project management in SMSe. Both steps have been implemented as a computational tool that supports our metamodel approach. Concretely, the present research proposes the construction of the PROMEP metamodel for those SMSe which develop software products, enabling them to plan, monitor and control their software projects, identify risks and take corrective actions, establish and maintain a set of process assets, define quantitative models that predict process performance, and obtain improvement information. So, our study suggests an alternative metamodel to achieve higher performance levels in SMSe environments. The main objective of this thesis is to help reduce software overruns and delivery times, and to increase the quality of the software produced in these types of organizations.
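
As an illustration of the kind of record the two-phase questionnaire could produce, here is a minimal Python sketch. The practice identifiers and state names are hypothetical examples, not the thesis's actual schema.

```python
# Illustrative sketch only: one way the two-phase questionnaire results could be
# recorded per CMMI-DEV practice. Practice IDs and state names are hypothetical.
from dataclasses import dataclass
from collections import Counter

@dataclass
class PracticeAssessment:
    practice_id: str    # e.g. a CMMI-DEV Project Planning specific practice
    performed: bool     # phase 1: is the practice carried out?
    documented: bool    # phase 2: is it documented/institutionalized?

    def state(self) -> str:
        if not self.performed:
            return "not performed"
        return "performed, documented" if self.documented else "performed, undocumented"

answers = [
    PracticeAssessment("PP SP 1.1 Estimate scope", performed=True, documented=True),
    PracticeAssessment("PP SP 2.1 Establish budget/schedule", performed=True, documented=False),
    PracticeAssessment("PMC SP 1.1 Monitor planning parameters", performed=False, documented=False),
]

# Baseline snapshot: which practices need attention, and which lack documentation.
print(Counter(a.state() for a in answers))
```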


Abstract:

Automated and semi-automated accessibility evaluation tools are key to streamlining the process of accessibility assessment, and ultimately to ensuring that software products, contents, and services meet accessibility requirements. Different evaluation tools may better fit different needs and concerns, accounting for a variety of corporate and external policies, content types, invocation methods, deployment contexts, exploitation models, intended audiences and goals, and the specific overall process where they are introduced. This has led to the proliferation of many evaluation tools tailored to specific contexts. However, tool creators, who may not be familiar with the realm of accessibility and may be part of a larger project, lack any systematic guidance when facing the implementation of accessibility evaluation functionalities. Herein we present a systematic approach to the development of accessibility evaluation tools, leveraging the different artifacts and activities of a standardized development process model (the Unified Software Development Process), and providing templates of these artifacts tailored to accessibility evaluation tools. The work presented especially considers the work in progress in this area by the W3C/WAI Evaluation and Report Working Group (ERT WG).


Abstract:

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Building Code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geográfico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by a committee of national experts from public institutions involved in the subject of seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) an update and homogenization of the project catalogue to Mw; 2) a proposal of zoning models and source characterization; 3) calibration of Ground Motion Prediction Equations (GMPEs) with actual data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on hazard results was carried out in order to have criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes for quantifying uncertainties corresponding to: 1) the method for estimating hazard (zoning and zoneless); 2) zoning models; 3) GMPE combinations used; and 4) the regression method for estimating source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, recurrence parameters and maximum magnitude for each zone have also been considered, using probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975 and 2475 years. The map of the coefficient of variation (COV) is also presented to give an idea of the zones where the dispersion among results is highest and the zones where the results are robust.
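
The return periods quoted above can be read through the standard Poisson relation between return period T and probability of exceedance P over an exposure time t, P = 1 - exp(-t/T); the short sketch below reproduces the usual building-code figures (roughly 10%, 5% and 2% in 50 years). This is the conventional interpretation of such return periods, not a computation specific to this study.

```python
import math

# Poisson-model relation between return period T and the probability of
# exceedance in an exposure time t: P = 1 - exp(-t / T).
t = 50.0  # years of exposure commonly assumed for building codes
for T in (475, 975, 2475):
    p = 1.0 - math.exp(-t / T)
    print(f"T = {T:4d} yr  ->  P(exceedance in {t:.0f} yr) ~ {p:.1%}")
```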


Abstract:

Given the sustained growth that we are experiencing in the number of SPARQL endpoints available, the need to be able to send federated SPARQL queries across them has also grown. To address this use case, the W3C SPARQL working group is defining a federation extension for SPARQL 1.1 which allows combining graph patterns that can be evaluated over several endpoints within a single query. In this paper, we describe the syntax of that extension and formalize its semantics. Additionally, we describe how a query evaluation system can be implemented for that federation extension, presenting some static optimization techniques and reusing a query engine used for data-intensive science, so as to deal with large amounts of intermediate and final results. Finally, we carry out a series of experiments that show that our optimizations speed up the federated query evaluation process.
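
As a brief illustration of the federation extension, the query below uses the SPARQL 1.1 SERVICE keyword to delegate part of the graph pattern to a second endpoint, submitted here with the SPARQLWrapper Python client. The endpoints, prefixes and predicates are illustrative, and whether a given endpoint actually forwards SERVICE blocks depends on its configuration.

```python
# Illustrative federated query: the SERVICE block is evaluated by a remote endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?city ?population WHERE {
  ?city a dbo:City ;
        dbo:country <http://dbpedia.org/resource/Spain> ;
        owl:sameAs ?wd .
  FILTER(STRSTARTS(STR(?wd), "http://www.wikidata.org/entity/"))
  SERVICE <https://query.wikidata.org/sparql> {   # evaluated remotely
    ?wd wdt:P1082 ?population .
  }
}
LIMIT 10
"""

endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
endpoint.setQuery(query)
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["city"]["value"], row["population"]["value"])
```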


Abstract:

Lo virtual es el lugar donde todo empieza, el germen de la imaginación productiva, un ámbito de pulsiones inaugurales y preexistencias sin forma donde todo convive, a la espera de ser diferenciado. Lo virtual es el sitio donde nacen las primeras exploraciones de cualquier acto de concepción, incluida la creación artística o el proyectar arquitectónico. Sin embargo, en las últimas tres décadas de revolución digital, el término ha sido utilizado de forma abusiva para referirse a todo tipo de entornos simulados informáticamente, es decir, a ficciones cerradas, programadas, controladas por el software y sus rutinas, radicalmente actualizadas, acabadas, completas, formalizadas. Paradójicamente, lo virtual ha servido para nombrar construcciones profundamente anti-virtuales. La telemática está propiciando el acceso del ser humano a un nuevo tipo de irrealidad cotidiana sustentada en prácticas espaciales cada vez menos vinculadas con la física y la biología. Esta condición fantasmagórica del habitar digital exige nuevos espacios de diálogo entre arquitectura y tecnología que se centren en el hecho imaginario. Para ello esta tesis propone, a partir de la recuperación del término griego arquitectónica, llevar el alcance de la disciplina hasta el hecho global del habitar. Y, al mismo tiempo, devolver al adjetivo virtual su auténtico significado preliminar, entendiendo que los verdaderos mundos virtuales no pueden simular nada, representar nada, formalizar nada, porque ellos son el origen infinito y amorfo de todo mundo.

ABSTRACT The virtual is where it all starts, the seed of productive imagination, an area of inaugural impulses and formless preexistences that beat together, waiting to be differentiated. The virtual is the birthplace of any creative exploration, including those of the architectural project. However, in the last three decades of digital revolution, the term has mostly been misused to refer to all types of computer-simulated environments: closed, finished, complete, formalized, radically actualized fictions controlled by software routines. Paradoxically, the virtual has been giving its name to profoundly anti-virtual constructions. Telematics is allowing humans to access a new kind of unreality, based on everyday spatial practices that are increasingly detached from physics and biology. This spectral condition of digital living demands new dialectics between architecture and technology, focused on the imaginary. This thesis proposes, beginning by recovering the Greek word architectonics, to extend the scope of the discipline beyond edification to the overall fact of inhabiting. And, at the same time, to return to the adjective virtual its authentic preliminary meaning, realizing that true virtual worlds cannot simulate, represent or formalize anything, because they are the amorphous and endless source of every world.


Abstract:

La presente tesis se enmarca dentro de los trabajos realizados en el Proyecto CENIT OASIS (Operación de Autopistas Seguras Inteligentes y Sostenibles) sobre el impacto y la integración paisajística de las autopistas, y en los trabajos realizados por el grupo de trabajo GT 13 (paisaje) dentro del comité técnico nacional CTN 157 (proyectos) para normalización del Paisaje en España, del que la doctoranda es secretaria técnica. El objetivo principal de esta tesis es desarrollar una Metodología que permita la normalización del paisaje en España. Por ello, establece las bases para el desarrollo científico y profesional en el ámbito del paisaje, a través de la caracterización de la actividad científica y de la actividad normalizadora internacional. Para después elaborar una propuesta de documentos normativos para su regulación en España. Por último, se pone en práctica la única de las normas aplicables a un caso real, concretamente en la AP-7 a su paso por la provincia de Gerona. La caracterización de la actividad científica en el ámbito del paisaje proporciona una visión global que sirve de referencia a las futuras investigaciones en la materia, no existente hasta la fecha. Entre los múltiples resultados, se identifican las áreas de conocimiento y disciplinas afines desde las que se aborda el paisaje, se analiza la evolución de las temáticas y líneas de investigación en el campo, se determina la distribución e impacto de la producción científica, destacando los países y centros de investigación punteros y sus colaboraciones, y se determinan las publicaciones más destacadas en la materia. La caracterización de la actividad normativa internacional hasta la fecha supone un referente en este campo, habiendo traducido, analizado y clasificado decenas de documentos sobre temas como la terminología, la profesión de paisajista, las reglas generales para las intervenciones en el paisaje, las normas para la protección del paisaje y normas para la evaluación del impacto paisajístico. La tesis desarrolla tres documentos normativos, que se espera sean el germen de los futuros documentos legales para normalización del Paisaje en España. El principal objetivo de la normalización es dotar a los profesionales de las herramientas necesarias para desarrollar sus intervenciones en el paisaje. Para ello, se ha elaborado un documento normativo sobre terminología del concepto clave y los términos asociados en castellano, que sirva de referencia para un futuro documento normativo; un documento normativo que regule los estudios de integración e impacto paisajístico en España, definiendo una serie de pautas que ayuden a los profesionales a desarrollar los proyectos de intervención en el paisaje; un documento que regule y defina la profesión de arquitecto paisajista, identificando sus capacidades, formaciones y competencias. Por último, el documento de impacto e integración paisajística se aplica a un caso concreto de infraestructuras del transporte, dentro del proyecto OASIS, sirviendo como ejemplo a los profesionales de la materia para desarrollar futuras intervenciones. El enfoque de este documento coincide con el de paisaje ecológico, el análisis del paisaje se aborda desde lo visible (fenosistema) y desde los procesos que lo conforman (criptosistema). Y las medidas de integración pretenden conseguir que la infraestructura forme parte del paisaje y de los procesos que ocurren en él, lo que en la tesis se define como Infraestructuras Verdes. 
ABSTRACT The thesis is within the framework of the CENIT OASIS Project (Operation of Safe, Intelligent and Sustainable Highways), about the landscape impact and integration of highways, and of the work done by the working group GT 13 (landscape) in the national technical committee CTN 157 (projects) for landscape standardization in Spain, of which the PhD candidate is technical secretary. The main objective of this thesis is to develop a methodology that allows landscape standardization in Spain. Therefore, it establishes the basis for scientific and professional development in the landscape field, through the characterization of the scientific activity and of the international standardization activity, and then proposes a set of normative documents for landscape regulation in Spain. Finally, the only one of the standards applicable to a real case is put into practice, specifically on the AP-7 as it passes through the province of Gerona. The characterization of scientific activity in the landscape field provides an overview, non-existent to date, that serves as a reference for future research in the field. Among the many results, the areas of knowledge and related disciplines from which the landscape is addressed are identified; the evolution of topics and lines of research in the field is analyzed; the distribution and impact of scientific production is determined, highlighting the leading countries and research centres and their collaborations; and the leading publications in the field are determined. The characterization of the international regulatory activity to date is a reference in this field, having translated, analyzed and classified dozens of documents about terminology, the landscapist profession, general rules for interventions in the landscape, standards for landscape protection and rules for the assessment of landscape impact. The thesis develops three normative documents, which are expected to be the germ of future legal documents for landscape standardization in Spain. The main objective of standardization is to provide professionals with the necessary tools to develop their interventions in the landscape. To do this, a normative document has been developed on the terminology of the key concept and the associated terms in Spanish, to serve as a reference for a future normative document; a normative document that regulates studies of landscape integration and impact in Spain, defining a set of guidelines to help professionals to develop intervention projects in the landscape; and a document that regulates and defines the profession of landscape architect, identifying their capabilities, training and competencies. Finally, the document on landscape impact and integration is applied to a particular case of transport infrastructure within the OASIS project, serving as an example to professionals in the field for developing future interventions. The approach of this document coincides with that of the ecological landscape: the landscape analysis is approached from the visible (phenosystem) and from the processes that shape it (cryptosystem), and the integration measures aim to make the infrastructure become part of the landscape and of the processes occurring in it, which in this thesis is defined as Green Infrastructure.