978 results for GLOBAL DEFECT SOLUTIONS
Abstract:
Earth observations (EO) represent a growing and valuable resource for many scientific, research and practical applications carried out by users around the world. Access to EO data for some applications or activities, such as climate change research or emergency response, is indispensable for their success. However, EO data, or products derived from them, are often (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning and development. It seeks to share data with users worldwide with the fewest possible restrictions on their use by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed. The paper focuses on the issue of the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring the options for making data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure the legal interoperability of data. To this end, the paper analyses legal protection regimes and their norms applicable to EO data. Based on the findings, it highlights the existing public law statutory, regulatory, and policy approaches, as well as private law instruments, such as waivers, licenses and contracts, that may be used to place datasets in the public domain, or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS and its particular characteristics as a system to identify ways to reconcile the vast possibilities it provides through the sharing of data from various sources and jurisdictions on the one hand, and the restrictions on the use of the shared resources on the other. On a more general level, the paper seeks to draw attention to the obstacles and potential regulatory solutions for sharing factual or research data for purposes that go beyond research and education.
Abstract:
LARES is a new spherical geodetic satellite designed for SLR observations. It is made of a solid tungsten alloy covered with 92 corner-cube retroreflectors. Owing to its very small area-to-mass ratio, the sensitivity of LARES orbits to non-gravitational forces is greatly reduced. We processed 82 weeks (February 2012 to August 2013) of LARES observations from a global SLR network and analyzed the contribution of LARES data to the current SLR products (e.g., global scale and geocenter coordinates). The quality of the combined LARES+LAGEOS-1/2 solutions is also addressed in the paper.
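To make the area-to-mass argument concrete, the back-of-the-envelope comparison below contrasts LARES with LAGEOS-1 using approximate published mass and radius values; treat the numbers as indicative only.

```python
import math

# Approximate published physical parameters (indicative values only).
satellites = {
    "LARES":    {"mass_kg": 386.8, "radius_m": 0.182},
    "LAGEOS-1": {"mass_kg": 406.9, "radius_m": 0.300},
}

for name, p in satellites.items():
    cross_section_m2 = math.pi * p["radius_m"] ** 2   # projected area
    a_over_m = cross_section_m2 / p["mass_kg"]        # area-to-mass ratio
    print(f"{name}: A/m = {a_over_m:.2e} m^2/kg")

# Accelerations from drag and radiation pressure scale with A/m, so the
# denser LARES (A/m roughly 2.6x smaller than LAGEOS-1) flies a "cleaner"
# orbit for geodetic analysis.
```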
Abstract:
Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at the Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or rule out individual model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emissions and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr⁻¹. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, owing to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics of CH4 emissions, but we propose solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but it still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
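As a purely illustrative sketch of how floodplain-extent uncertainty propagates into basin-scale emissions, the toy calculation below integrates a fixed CH4 flux density over two hypothetical extent maps; all numbers are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy basin of 500 grid cells (all values purely illustrative).
n_cells = 500
cell_area_m2 = 1.2e10                                 # ~ a 1-degree tropical cell
flux_g_m2_yr = rng.uniform(5.0, 50.0, n_cells)        # CH4 flux density per cell

# Two hypothetical floodplain-extent products for the same cells, standing
# in for e.g. a GLC2000-derived versus a PCR-GLOBWB-derived extent.
extent_a = rng.uniform(0.0, 0.10, n_cells)            # flooded fraction, product A
extent_b = extent_a * rng.uniform(1.0, 3.0, n_cells)  # product B, systematically wetter

def basin_emissions_Tg(flooded_fraction):
    """Integrate flux density over the flooded area; return Tg CH4 per year."""
    grams = np.sum(flux_g_m2_yr * flooded_fraction * cell_area_m2)
    return grams / 1e12

e_a = basin_emissions_Tg(extent_a)
e_b = basin_emissions_Tg(extent_b)
print(f"product A: {e_a:.1f} Tg/yr, product B: {e_b:.1f} Tg/yr, "
      f"ratio: {e_b / e_a:.2f}")  # extent uncertainty scales the total directly
```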
Abstract:
This article provides an overview of the most essential issues in the trade and culture discourse from a global law perspective. It looks into the intensified disconnect between trade and culture and exposes its flaws and the considerable drawbacks it brings with it. It is argued that these drawbacks become especially pronounced in the digital media environment, which has strongly affected both the conditions of trade in cultural products and services and cultural diversity in local and global contexts. In this modified setting, there could have been a number of feasible ‘trade and culture’ solutions, i.e. regulatory designs that, whilst enhancing trade liberalisation, are also conducive to cultural policy. Yet the realisation of any of these options becomes chimerical as long as the line between trade and culture matters is drawn in a clear and resolute manner. The article is intended for an interdisciplinary audience and is forthcoming in the Journal of Arts Management, Law and Society.
Abstract:
The intensified flows of goods, services, people and ideas across borders intrinsic to globalization have had numerous and multi-faceted effects. Those affecting culture have been perhaps the most controversial, since it is more often than not difficult to identify the spill-overs across economic and non-economic areas and across borders, and it is equally hard to qualify the effects of these spill-overs as positive or negative. The debate also tends to be politically and even emotionally charged, which has so far not proven advantageous to establishing a genuine dialogue, nor to finding solutions. This contention and the divergent interests of major players in the international community have been reflected in the institutions and rules of global law. The objective of this chapter is to explore this institutional architecture, in particular its main (and opposing) constituent fora, the World Trade Organization (WTO) and the United Nations Educational, Scientific and Cultural Organization (UNESCO). The chapter traces the evolution of these institutions and their interaction over time, as well as the underlying objectives, demands and strategies of the key proponents in the trade versus culture discourse, which ultimately shaped the existing law and policy. The chapter concludes with an appraisal of the present state of affairs, situating the discussion in the contemporary global governance landscape.
Abstract:
The contribution of Starlette, Stella, and AJISAI is currently neglected in the definition of the International Terrestrial Reference Frame, despite a long time series of precise SLR observations and a huge amount of available data. The inferior accuracy of the orbits of low orbiting geodetic satellites is the main reason for this neglect. The Analysis Centers of the International Laser Ranging Service (ILRS ACs) do, however, consider including low orbiting geodetic satellites in the derivation of the standard ILRS products, which are currently based on the LAGEOS and Etalon satellites, in place of the sparsely observed, and thus virtually negligible, Etalons. We process ten years of SLR observations to Starlette, Stella, AJISAI, and LAGEOS and assess the impact of these Low Earth Orbiting (LEO) SLR satellites on the SLR-derived parameters. We study different orbit parameterizations, in particular different arc lengths and the impact of pseudo-stochastic pulses and dynamical orbit parameters on the quality of the solutions. We find that the repeatability of the East and North components of station coordinates, the quality of polar coordinates, and the scale estimates of the reference frame are improved when combining LAGEOS with low orbiting SLR satellites. In the multi-SLR solutions, the scale and the Z component of geocenter coordinates are less affected by deficiencies in solar radiation pressure modeling than in the LAGEOS-1/2 solutions, due to substantially reduced correlations between the Z geocenter coordinate and empirical orbit parameters. Finally, we find that the standard values of center-of-mass corrections (CoM) for geodetic LEO satellites are not valid for the currently operating SLR systems. The variations of station-dependent differential range biases reach 52 and 25 mm for AJISAI and Starlette/Stella, respectively, which is why estimating station-dependent range biases or using station-dependent CoM values, instead of one value for all SLR stations, is strongly recommended. This clearly indicates that the ILRS effort to produce CoM corrections for each satellite, which are site-specific and depend on the system characteristics at the time of tracking, is very important and needs to be implemented in the SLR data analysis.
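A minimal sketch of the recommended reduction step, assuming station-dependent centre-of-mass (CoM) values instead of one standard value per satellite. The station IDs and millimetre-level values below are hypothetical, not the estimates from the paper.

```python
# Reduce a measured SLR range (to the reflector array) to the satellite's
# centre of mass.  All numerical values are hypothetical illustrations.

STANDARD_COM_M = 0.078          # single standard CoM value for the satellite

# Hypothetical station-dependent CoM values, reflecting e.g. single-photon
# versus multi-photon detection systems at different sites.
STATION_COM_M = {
    "7090": 0.075,              # single-photon system
    "7840": 0.082,              # multi-photon system
}

def range_to_com(measured_range_m, station, station_dependent=True):
    """Add the CoM correction appropriate for the tracking station."""
    if station_dependent:
        com = STATION_COM_M.get(station, STANDARD_COM_M)
    else:
        com = STANDARD_COM_M
    return measured_range_m + com

r = 1_500_000.0                 # hypothetical one-way range [m]
diff_mm = (range_to_com(r, "7840") - range_to_com(r, "7840", False)) * 1e3
print(f"station-dependent vs standard CoM: {diff_mm:+.1f} mm")  # +4.0 mm
```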
Abstract:
While most events related to the International Year of Deserts and Desertification 2006 adopted a mainly problem-oriented perspective and approach, the Bern Symposium held in May 2006 sought a more positive attitude by taking stock of experience as well as best and worst practices, both in development practice and in research. Through this deliberate focus on potentials, positive experiences, solutions and pathways, predominantly passive and reactive attitudes and hopelessness might be better overcome. The Symposium was organized by CDE, NCCR North-South and Forum SLM.
Abstract:
The main aim of the methodology presented in this paper is to provide a framework for a participatory process for the appraisal and selection of options to mitigate desertification and land degradation. The methodology is being developed within the EU project DESIRE (www.desire-project.eu/) in collaboration with WOCAT (www.wocat.org). It is used to select promising conservation strategies for test implementation in each of the 16 degradation and desertification hotspot sites in the Mediterranean and around the world. The methodology consists of three main parts. In the first step, prevention and mitigation strategies already applied at the respective DESIRE study site are identified and listed during a workshop with representatives of different stakeholder groups (land users, policy makers, researchers). The participatory and process-oriented approach initiates a mutual learning process among the different stakeholders through sharing knowledge and jointly reflecting on current problems and solutions related to land degradation and desertification. In the second step, these identified, locally applied solutions (technologies and approaches) are assessed with the help of the WOCAT methodology. Comprehensive questionnaires and a database system have been developed to document and evaluate all relevant aspects of technical measures as well as implementation approaches by teams of researchers and specialists, together with land users. This research process ensures the systematic assessment and piecing together of local information, together with specific details about the environmental and socio-economic setting. The third part consists of another stakeholder workshop where promising strategies for sustainable land management in the given context are selected, based on the WOCAT best-practices database, which includes the evaluated strategies applied locally at the DESIRE sites. These promising strategies are then assessed with the help of a selection and decision support tool and adapted for test implementation at the study site.
Abstract:
This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers, from which many candidate points are generated by random perturbations. Based on the surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
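A compact sketch of the SOP loop as described in the abstract: non-dominated sorting over (expensive value, spread), P centers per iteration, surrogate-guided candidate selection, and a tabu tenure for unproductive centers. It uses SciPy's RBFInterpolator as the surrogate; the batch evaluation is serial here rather than on P processors, and the front-traversal and step-size rules are simplified relative to the actual algorithm.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.spatial.distance import cdist

def nondominated(F):
    """Indices of non-dominated rows of F (both objectives minimized)."""
    keep = []
    for i, f in enumerate(F):
        if not np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1)):
            keep.append(i)
    return keep

def sop_sketch(f, lb, ub, n_init=12, P=4, iters=20, tenure=3, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = rng.uniform(lb, ub, (n_init, dim))           # initial design
    y = np.array([f(x) for x in X])                  # expensive evaluations
    tabu = {}                                        # center index -> remaining tenure

    for _ in range(iters):
        surrogate = RBFInterpolator(X, y)            # cheap model of f
        d = cdist(X, X) + 1e9 * np.eye(len(X))
        # Objective 1: expensive value; objective 2: negated minimum distance
        # to other evaluated points (so larger spread reads as "smaller").
        F = np.column_stack([y, -d.min(axis=1)])
        front = [i for i in nondominated(F) if tabu.get(i, 0) == 0]
        centers = front[:P] or list(range(min(P, len(X))))

        batch = []
        for c in centers:                            # one candidate batch per center
            cand = X[c] + 0.1 * (ub - lb) * rng.standard_normal((100, dim))
            cand = np.clip(cand, lb, ub)
            batch.append(cand[np.argmin(surrogate(cand))])
        batch = np.array(batch)
        new_y = np.array([f(x) for x in batch])      # would run on P processors

        tabu = {c: t - 1 for c, t in tabu.items() if t > 1}
        for c, fy in zip(centers, new_y):
            if fy >= y[c]:                           # center failed to improve
                tabu[c] = tenure
        X, y = np.vstack([X, batch]), np.concatenate([y, new_y])
    return X[np.argmin(y)], y.min()

# Usage on a toy quadratic in 2-D:
x_best, f_best = sop_sketch(lambda x: np.sum((x - 0.3) ** 2),
                            np.zeros(2), np.ones(2))
print(x_best, f_best)
```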
Abstract:
Abstract interpretation-based data-flow analysis of logic programs is, at this point, relatively well understood from the point of view of general frameworks and abstract domains. On the other hand, comparatively little attention has been given to the problems which arise when the analysis of a full, practical dialect of the Prolog language is attempted, and only a few solutions to these problems have been proposed to date. Existing proposals generally restrict, in one way or another, the classes of programs which can be analyzed. This paper attempts to fill this gap by considering a full dialect of Prolog, essentially the recent ISO standard, pointing out the problems that may arise in the analysis of such a dialect, and proposing a combination of known and novel solutions that together allow the correct analysis of arbitrary programs which use the full power of the language.
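To give a flavour of the kind of solution involved, here is a minimal, generic groundness-analysis sketch in Python (an illustration of the general idea only, not the framework of the paper): known operations get precise abstract transfer functions, while an unknown built-in is handled by a sound default that discards information about the variables it may touch.

```python
# Two-point groundness domain: "g" (definitely ground) <= "any" (unknown).
# A sketch of the general idea, not the paper's analysis framework.

def unify(state, x, term_vars):
    """Abstract effect of X = f(t1,...,tn): X is ground iff all ti are."""
    state[x] = "g" if all(state.get(v, "any") == "g" for v in term_vars) else "any"
    return state

def known_ground_builtin(state, x):
    """A built-in known to ground its argument (e.g. an arithmetic result)."""
    state[x] = "g"
    return state

def unknown_builtin(state, touched_vars):
    """Sound default for an unanalyzed built-in: lose all information
    about the variables it may bind."""
    for v in touched_vars:
        state[v] = "any"
    return state

s = {"A": "g", "B": "g"}
s = unify(s, "X", ["A", "B"])   # X = f(A,B): X becomes ground
s = unknown_builtin(s, ["B"])   # an unanalyzed built-in touches B
s = unify(s, "Y", ["A", "B"])   # Y = g(A,B): Y can no longer be proven ground
print(s)                        # {'A': 'g', 'B': 'any', 'X': 'g', 'Y': 'any'}
```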
Abstract:
Global data-flow analysis of (constraint) logic programs, which is generally based on abstract interpretation [7], is reaching a comparatively high level of maturity. A natural question is whether it is time for its routine incorporation in standard compilers, something which, beyond a few experimental systems, has not happened to date. Such incorporation arguably makes good sense only if:
• the range of applications of global analysis is large enough to justify the additional complication in the compiler, and
• global analysis technology can deal with all the features of "practical" languages (e.g., the ISO-Prolog built-ins) and "scales up" for large programs.
We present a tutorial overview of a number of concepts and techniques directly related to the issues above, with special emphasis on the first one. In particular, we concentrate on novel uses of global analysis during program development and debugging, rather than on the more traditional application area of program optimization. The idea of using abstract interpretation for validation and diagnosis has been studied in the context of imperative programming [2] and also of logic programming. The latter work includes issues such as using approximations to reduce the burden posed on programmers by declarative debuggers [6, 3] and automatically generating and checking assertions [4, 5] (which includes the more traditional type checking of strongly typed languages, such as Gödel or Mercury [1, 8, 9]). We also review some solutions for scalability, including modular analysis, incremental analysis, and widening. Finally, we discuss solutions for dealing with meta-predicates, side-effects, delay declarations, constraints, dynamic predicates, and other such features which may appear in practical languages. In the discussion we draw both from the literature and from our experience, and that of others, in the development and use of the CIAO system analyzer. In order to emphasize the practical aspects of the solutions discussed, the presentation of several concepts is illustrated by examples run on the CIAO system, which makes extensive use of global analysis and assertions.
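Among the scalability techniques listed, widening is easy to demonstrate in isolation. The sketch below (a generic interval-domain illustration, unrelated to the CIAO analyzer's actual domains) abstractly iterates the loop `i = 0; while ...: i = i + 1`: without widening the fixpoint iteration climbs one value per step, while with widening the unstable upper bound jumps to infinity and the analysis stabilizes within a couple of iterations.

```python
import math

def join(a, b):
    """Least upper bound of two intervals (lo, hi)."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(old, new):
    """Standard interval widening: any unstable bound jumps to infinity."""
    lo = old[0] if new[0] >= old[0] else -math.inf
    hi = old[1] if new[1] <= old[1] else math.inf
    return (lo, hi)

def analyze_loop(use_widening, max_steps=10_000):
    """Abstract fixpoint for: i = 0; while ...: i = i + 1 (test ignored)."""
    i = (0, 0)
    for step in range(max_steps):
        body = (i[0] + 1, i[1] + 1)        # abstract effect of i = i + 1
        new = join((0, 0), body)           # loop head: join with entry value
        if use_widening:
            new = widen(i, new)
        if new == i:                       # fixpoint reached
            return i, step
        i = new
    return i, max_steps

print(analyze_loop(True))    # ((0, inf), 1): stable right after widening
print(analyze_loop(False))   # ((0, 10000), 10000): never stabilizes
```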
Abstract:
Global linear instability theory is concerned with the temporal or spatial development of small-amplitude perturbations superposed upon laminar steady or time-periodic three-dimensional flows, which are inhomogeneous in two (and periodic in one) or all three spatial directions. After a brief exposition of the theory, some recent advances are reported. First, results are presented on the implementation of a Jacobian-free Newton–Krylov time-stepping method into a standard finite-volume aerodynamic code to obtain global linear instability results in flows of industrial interest. Second, connections are sought between established and more modern approaches for structure identification in flows, such as proper orthogonal decomposition and Koopman mode analysis (dynamic mode decomposition), and the possibility of connecting solutions of the eigenvalue problem obtained by matrix formation or time-stepping with those delivered by dynamic mode decomposition, the residual algorithm, and proper orthogonal decomposition analysis is highlighted in the laminar regime; turbulent and three-dimensional flows are identified as open areas for future research. Finally, a new stable very-high-order finite-difference method is implemented for the spatial discretization of the operators describing the spatial BiGlobal eigenvalue problem, the three-dimensional parabolized stability equation analysis, and the TriGlobal eigenvalue problem; it is shown that, combined with sparse matrix treatment, all these problems may now be solved on standard desktop computers.
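As a concrete illustration of the dynamic mode decomposition connection mentioned above, here is a standard exact-DMD sketch in Python (a generic textbook construction, not the paper's implementation): the eigenvalues of the projected low-rank operator approximate those that matrix-forming or time-stepping global instability methods would recover.

```python
import numpy as np

def dmd(snapshots, r):
    """Exact DMD of a snapshot matrix whose k-th column is the state at step k."""
    X1, X2 = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]              # rank-r truncation
    # Low-rank representation of the linear map advancing X1 to X2.
    A_tilde = U.conj().T @ X2 @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(A_tilde)
    modes = (X2 @ Vh.conj().T / s) @ W                 # exact DMD modes
    return eigvals, modes

# Synthetic check: two traveling waves sampled with dt = 0.1.
dt = 0.1
t = np.arange(200) * dt
x = np.linspace(0.0, 1.0, 64)[:, None]
data = np.sin(3 * x - 2.3 * t) + 0.5 * np.sin(7 * x - 5.1 * t)

lam, _ = dmd(data, r=4)
print(np.sort_complex(np.log(lam) / dt))   # ~ +/-2.3i and +/-5.1i, zero growth
```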
Abstract:
A unified solution framework is presented for one-, two- or three-dimensional complex non-symmetric eigenvalue problems, respectively governing linear modal instability of incompressible fluid flows in rectangular domains having two, one or no homogeneous spatial directions. The solution algorithm is based on subspace iteration in which the spatial discretization matrix is formed, stored and inverted serially. Results delivered by spectral collocation based on the Chebyshev-Gauss-Lobatto (CGL) points and a suite of high-order finite-difference methods, comprising the Dispersion-Relation-Preserving (DRP) and Padé finite-difference schemes previously employed for this type of work, as well as the Summation-by-parts (SBP) scheme and the new high-order finite-difference scheme of order q (FD-q), have been compared from the point of view of accuracy and efficiency in standard validation cases of temporal local and BiGlobal linear instability. The FD-q method has been found to significantly outperform all other finite-difference schemes in solving classic linear local, BiGlobal, and TriGlobal eigenvalue problems, as regards both memory and CPU time requirements. The results of the present study disprove the paradigm that spectral methods are superior to finite-difference methods in terms of computational cost at equal accuracy, the FD-q spatial discretization delivering a speedup of O(10^4). Consequently, the three-dimensional (TriGlobal) eigenvalue problems may be solved accurately on typical desktop computers with modest computational effort.
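For reference, the spectral-collocation side of such a comparison can be reproduced in a few lines: the classic Chebyshev-Gauss-Lobatto differentiation matrix (Trefethen's cheb construction) applied to the 1-D model problem u'' = λu with u(±1) = 0, whose exact eigenvalues are λ_n = -(nπ/2)^2. This is a generic illustration, not the FD-q solver of the paper.

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto points and differentiation matrix (Trefethen)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))             # negative-sum trick for the diagonal
    return D, x

N = 32
D, x = cheb(N)
D2 = (D @ D)[1:-1, 1:-1]                    # u'' with Dirichlet rows/columns removed
lam = np.sort(np.linalg.eigvals(D2).real)[::-1]

exact = np.array([-(n * np.pi / 2) ** 2 for n in range(1, 5)])
print(lam[:4])    # leading eigenvalues, spectrally accurate
print(exact)      # [-2.4674, -9.8696, -22.2066, -39.4784]
```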