938 results for Open Data, Bologna
Abstract:
Synoptic climatology relates the atmospheric circulation to the surface environment. The aim of this study is to examine the variability of the surface meteorological patterns that develop under different synoptic-scale categories over a suburban area with complex topography. Multivariate Data Analysis techniques were applied to a data set of surface meteorological elements. Three principal components were found, related to the thermodynamic status of the surface environment and the two components of the wind speed. The variability of the surface flows was related to atmospheric circulation categories by applying Correspondence Analysis. Similar surface thermodynamic fields develop under the cyclonic categories, in contrast with the anticyclonic category. A strong, steady wind flow characterized by high shear values develops under the cyclonic Closed Low and the anticyclonic H–L categories, in contrast to the variable weak flow under the anticyclonic Open Anticyclone category.
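As a rough illustration of the kind of multivariate analysis described above, the sketch below applies principal component analysis to a synthetic table of surface meteorological elements (temperature, humidity and the two wind components). The variable names and data are illustrative only and are not taken from the study.

```python
# Minimal sketch of PCA applied to surface meteorological elements.
# The data below are synthetic; the study used a real station data set.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_obs = 1000
# Synthetic hourly observations: temperature, relative humidity, u-wind, v-wind
temperature = 15 + 8 * np.sin(np.linspace(0, 20 * np.pi, n_obs)) + rng.normal(0, 1, n_obs)
humidity = 60 - 2 * (temperature - 15) + rng.normal(0, 5, n_obs)
u_wind = rng.normal(0, 2, n_obs)
v_wind = rng.normal(0, 2, n_obs)

X = np.column_stack([temperature, humidity, u_wind, v_wind])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize before PCA

pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings (components x variables):")
print(pca.components_)
```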
Abstract:
We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing time series and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
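To make the WMS interaction concrete, the sketch below assembles a standard WMS 1.3.0 GetMap request of the kind ncWMS answers, including the TIME and ELEVATION dimension parameters used for multidimensional data. The server URL, layer name and dimension values are hypothetical placeholders, not taken from the paper.

```python
# Sketch of a standard WMS 1.3.0 GetMap request such as ncWMS serves.
# The endpoint and layer name below are hypothetical placeholders.
from urllib.parse import urlencode

base_url = "https://example.org/ncWMS/wms"   # hypothetical ncWMS endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean/sea_water_temperature",  # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",                       # one of many supported CRSs
    "BBOX": "-90,-180,90,180",                # WMS 1.3.0 axis order for EPSG:4326
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2010-01-01T00:00:00Z",           # extra dimension of the gridded data
    "ELEVATION": "0",                         # vertical level, if the layer has one
}
getmap_url = base_url + "?" + urlencode(params)
print(getmap_url)
# The returned PNG can then be fetched with any HTTP client and displayed.
```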
Abstract:
This Editorial presents the focus, scope and policies of the inaugural issue of Nature Conservation, a new open access, peer-reviewed journal bridging natural sciences, social sciences and hands-on applications in conservation management. The journal covers all aspects of nature conservation and aims particularly at facilitating better interaction between scientists and practitioners. The journal will impose no restrictions on manuscript size or the use of colour. We will use an XML-based editorial workflow and several cutting-edge innovations in publishing and information dissemination. These include semantic mark-up of, and enhancements to, the published text and data, and extensive cross-linking within the journal and to external sources. We believe the journal will make an important contribution to better linking science and practice, offering rapid, peer-reviewed and flexible publication for authors and unrestricted access to content.
Abstract:
The development of global magnetospheric models, such as the Space Weather Modeling Framework (SWMF), which can accurately reproduce and track space weather processes, has high practical utility. We present an interval on 5 June 1998, where the location of the polar cap boundary, or open-closed field line boundary (OCB), can be determined in the ionosphere using a combination of instruments during a period encompassing a sharp northward to southward interplanetary field turning. We present both point- and time-varying comparisons of the observed and simulated boundaries in the ionosphere and find that, when using solely the coupled ideal magnetohydrodynamic magnetosphere-ionosphere model, the rate at which the OCB responds to a southward turning of the interplanetary field is significantly faster than that computed from the observational data. However, when the inner magnetospheric module is incorporated, the modeling framework both qualitatively, and often quantitatively, reproduces many elements of the studied interval prior to an observed substorm onset. This result demonstrates that the physics of the inner magnetosphere is critical in shaping the boundary between open and closed field lines during periods of southward interplanetary magnetic field (IMF) and provides significant insight into the 3-D time-dependent behavior of the Earth's magnetosphere in response to a northward-southward IMF turning. We assert that during periods that do not include the tens of minutes surrounding substorm expansion phase onset, the coupled SWMF model may provide a valuable and reliable tool for estimating both the OCB and magnetic field topology over a wide range of latitudes and local times.
Abstract:
Historic geomagnetic activity observations have been used to reveal centennial variations in the open solar flux and the near-Earth heliospheric conditions (the interplanetary magnetic field and the solar wind speed). The various methods are in very good agreement for the past 135 years when there were sufficient reliable magnetic observatories in operation to eliminate problems due to site-specific errors and calibration drifts. This review underlines the physical principles that allow these reconstructions to be made, as well as the details of the various algorithms employed and the results obtained. Discussion is included of: the importance of the averaging timescale; the key differences between “range” and “interdiurnal variability” geomagnetic data; the need to distinguish source field sector structure from heliospherically-imposed field structure; the importance of ensuring that regressions used are statistically robust; and uncertainty analysis. The reconstructions are exceedingly useful as they provide calibration between the in-situ spacecraft measurements from the past five decades and the millennial records of heliospheric behaviour deduced from measured abundances of cosmogenic radionuclides found in terrestrial reservoirs. Continuity of open solar flux, using sunspot number to quantify the emergence rate, is the basis of a number of models that have been very successful in reproducing the variation derived from geomagnetic activity. These models allow us to extend the reconstructions back to before the development of the magnetometer and to cover the Maunder minimum. Allied to the radionuclide data, the models are revealing much about how the Sun and heliosphere behaved outside of grand solar maxima and are providing a means of predicting how solar activity is likely to evolve now that the recent grand maximum (that had prevailed throughout the space age) has come to an end.
Abstract:
Traditionally, the formal scientific output in most fields of natural science has been limited to peer-reviewed academic journal publications, with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. In effect, this has constrained the representation and verification of the data provenance to the confines of the related publications. Detailed knowledge of a dataset’s provenance is essential to establish the pedigree of the data for its effective re-use, and to avoid redundant re-enactment of the experiment or computation involved. For open-access data it is increasingly important to determine their authenticity and quality, especially considering the growing volumes of datasets appearing in the public domain. To address these issues, we present an approach that combines the Digital Object Identifier (DOI) – a widely adopted citation technique – with existing, widely adopted climate science data standards to formally publish detailed provenance of a climate research dataset as an associated scientific workflow. This is integrated with linked-data-compliant data re-use standards (e.g. OAI-ORE) to enable a seamless link between a publication and the complete trail of lineage of the corresponding dataset, including the dataset itself.
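A rough sketch of how a dataset DOI, its provenance workflow and the associated publication might be tied together in a linked-data style aggregation, in the spirit of OAI-ORE and PROV. The identifiers below are hypothetical placeholders and the property names are only indicative of the standard vocabularies, not the project's actual schema.

```python
# Illustrative sketch: aggregating a dataset DOI, its provenance workflow and a
# publication into a single linked-data style record (in the spirit of OAI-ORE).
# All identifiers are hypothetical placeholders; property names are indicative
# of the ORE/PROV vocabularies rather than the project's actual schema.
import json

aggregation = {
    "@id": "https://doi.org/10.0000/example-dataset",      # hypothetical dataset DOI
    "@type": "ore:Aggregation",
    "ore:aggregates": [
        {
            "@id": "https://example.org/workflows/run-42",  # hypothetical workflow record
            "@type": "prov:Activity",
            "prov:used": "https://example.org/data/raw-input.nc",
            "prov:wasAssociatedWith": "https://example.org/people/analyst",
        },
        {
            "@id": "https://doi.org/10.0000/example-paper",  # hypothetical publication DOI
            "@type": "fabio:JournalArticle",
        },
    ],
}

print(json.dumps(aggregation, indent=2))
```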
Abstract:
Semi-open street roofs protect pedestrians from intense sunshine and rain. Their effects on the natural ventilation of urban canopy layers (UCL) are less well understood. This paper investigates two idealized urban models consisting of 4 (2×2) or 16 (4×4) buildings under a neutral atmospheric condition with parallel (0°) or non-parallel (15°, 30°, 45°) approaching wind. The aspect ratio (building height (H) / street width (W)) is 1 and the building width is B=3H. Computational fluid dynamics (CFD) simulations were first validated against experimental data, confirming that the standard k-ε model predicted airflow velocity better than the RNG k-ε, realizable k-ε and Reynolds stress models. Three ventilation indices were numerically analyzed for ventilation assessment: flow rates across street roofs and openings, to show the mechanisms of air exchange; age of air, to show how long external air takes to reach a location after entering the UCL; and purging flow rate, to quantify the net UCL ventilation capacity induced by mean flows and turbulence. Five semi-open roof types are studied: walls hung above street roofs (coverage ratio λa=100%) at z=1.5H, 1.2H and 1.1H ('Hung1.5H', 'Hung1.2H', 'Hung1.1H' types); walls partly covering street roofs (λa=80%) at z=H ('Partly-covered' type); and walls fully covering street roofs (λa=100%) at z=H ('Fully-covered' type). These roofs generally produce worse UCL ventilation than the open street roof type because of the reduced roof ventilation. The 'Hung1.1H', 'Hung1.2H' and 'Hung1.5H' types are better designs than the 'Fully-covered' and 'Partly-covered' types. A greater urban size contains a larger UCL volume and requires a longer time to ventilate. The methodologies and ventilation indices are confirmed to be effective in quantifying UCL ventilation.
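For reference, the two volume-based indices mentioned above are commonly defined along the following lines (for example via the homogeneous emission method); the exact formulations used in the paper may differ in detail.

```latex
% Commonly used definitions via the homogeneous emission method (illustrative;
% the paper's exact formulations may differ in detail).
% Local mean age of air at point p, for a uniform volumetric tracer source S_c
% and local tracer concentration c_p:
\[ \tau_p = \frac{c_p}{S_c} \]
% Purging flow rate of the UCL volume V_{UCL}, with volume-averaged tracer
% concentration \langle c \rangle:
\[ \mathrm{PFR} = \frac{S_c \, V_{\mathrm{UCL}}}{\langle c \rangle} \]
```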
Abstract:
We describe the CHARMe project, which aims to link climate datasets with publications, user feedback and other items of "commentary metadata". The system will help users learn from previous community experience and select datasets that best suit their needs, as well as providing direct traceability between conclusions and the data that supported them. The project applies the principles of Linked Data and adopts the Open Annotation standard to record and publish commentary information. CHARMe contributes to the emerging landscape of "climate services", which will provide climate data and information to influence policy and decision-making. Although the project focuses on climate science, the technologies and concepts are very general and could be applied to other fields.
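As a rough illustration of the Open Annotation approach described above, the sketch below builds a minimal annotation linking a commentary body (for example a publication) to a target climate dataset. The identifiers are hypothetical, and the context URL shown is only indicative of the Open Annotation vocabulary rather than the exact form used by CHARMe.

```python
# Minimal sketch of an Open Annotation style record linking commentary to a
# climate dataset. Identifiers are hypothetical; the context URL is indicative
# of the Open Annotation community draft rather than CHARMe's exact usage.
import json

annotation = {
    "@context": "http://www.w3.org/ns/oa-context-20130208.json",  # assumed context
    "@type": "oa:Annotation",
    "oa:motivatedBy": "oa:linking",
    "oa:hasBody": "https://doi.org/10.0000/example-paper",          # hypothetical publication
    "oa:hasTarget": "https://example.org/datasets/sst-record-v1",   # hypothetical dataset
    "oa:annotatedBy": "https://example.org/users/jane",             # hypothetical user
}

print(json.dumps(annotation, indent=2))
```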
Abstract:
Observations of Earth from space have been made for over 40 years and have contributed to advances in many aspects of climate science. However, attempts to exploit this wealth of data are often hampered by a lack of homogeneity and continuity and by insufficient understanding of the products and their uncertainties. There is, therefore, a need to reassess and reprocess satellite datasets to maximize their usefulness for climate science. The European Space Agency has responded to this need by establishing the Climate Change Initiative (CCI). The CCI will create new climate data records for (currently) 13 essential climate variables (ECVs) and make these open and easily accessible to all. Each ECV project works closely with users to produce time series from the available satellite observations relevant to users' needs. A climate modeling users' group provides a climate system perspective and a forum to bring the data and modeling communities together. This paper presents the CCI program. It outlines its benefit and presents approaches and challenges for each ECV project, covering clouds, aerosols, ozone, greenhouse gases, sea surface temperature, ocean color, sea level, sea ice, land cover, fire, glaciers, soil moisture, and ice sheets. It also discusses how the CCI approach may contribute to defining and shaping future developments in Earth observation for climate science.
Abstract:
Large changes in the extent of northern subtropical arid regions during the Holocene are attributed to orbitally forced variations in monsoon strength and have been implicated in the regulation of atmospheric trace gas concentrations on millennial timescales. Models that omit biogeophysical feedback, however, are unable to account for the full magnitude of African monsoon amplification and extension during the early to middle Holocene (~9500–5000 years B.P.). A data set describing land-surface conditions 6000 years B.P. on a 1° × 1° grid across northern Africa and the Arabian Peninsula has been prepared from published maps and other sources of palaeoenvironmental data, with the primary aim of providing a realistic lower boundary condition for atmospheric general circulation model experiments similar to those performed in the Palaeoclimate Modelling Intercomparison Project. The data set includes information on the percentage of each grid cell occupied by specific vegetation types (steppe, savanna, xerophytic woods/scrub, tropical deciduous forest, and tropical montane evergreen forest), open water (lakes), and wetlands, plus information on the flow direction of major drainage channels for use in large-scale palaeohydrological modeling.
Abstract:
In the concluding paper of this tetralogy, we here use the different geomagnetic activity indices to reconstruct the near-Earth interplanetary magnetic field (IMF) and solar wind flow speed, as well as the open solar flux (OSF), from 1845 to the present day. The differences in how the various indices vary with near-Earth interplanetary parameters, which are here exploited to separate the effects of the IMF and solar wind speed, are shown to be statistically significant at the 93% level or above. Reconstructions are made using four combinations of different indices, compiled using different data and different algorithms, and the results are almost identical for all parameters. The required correction to the aa index is discussed by comparison with the Ap index from a more extensive network of mid-latitude stations. Data from the Helsinki magnetometer station are used to extend the aa index back to 1845, and the results are confirmed by comparison with the nearby St Petersburg observatory. The optimum variations, using all available long-term geomagnetic indices, of the near-Earth IMF and solar wind speed, and of the open solar flux, are presented, all with ±2σ uncertainties computed using the Monte Carlo technique outlined in the earlier papers. The open solar flux variation derived is shown to be very similar indeed to that obtained using the method of Lockwood et al. (1999).
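The ±2σ uncertainties mentioned above come from a Monte Carlo treatment of the regression errors. The sketch below shows the general flavour of such an approach on synthetic data: repeatedly resampling the calibration data, refitting, and collecting the spread of the reconstructed quantity. It is purely illustrative and does not reproduce the specific procedure of the papers.

```python
# Illustrative Monte Carlo estimate of regression-based reconstruction
# uncertainty on synthetic data (not the specific procedure of the papers).
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data: a geomagnetic index against in-situ IMF strength
n = 200
index = rng.uniform(5.0, 30.0, n)              # stand-in geomagnetic index
imf_true = 0.25 * index + 2.0                  # assumed underlying relation
imf_obs = imf_true + rng.normal(0.0, 0.5, n)   # noisy "spacecraft" observations

# Historic index value for which a reconstructed IMF and uncertainty are wanted
historic_index = 12.0

n_trials = 5000
reconstructions = np.empty(n_trials)
for k in range(n_trials):
    # Resample the calibration data with replacement and refit the regression
    pick = rng.integers(0, n, n)
    slope, intercept = np.polyfit(index[pick], imf_obs[pick], 1)
    reconstructions[k] = slope * historic_index + intercept

mean = reconstructions.mean()
two_sigma = 2.0 * reconstructions.std()
print(f"reconstructed IMF: {mean:.2f} +/- {two_sigma:.2f} (2-sigma)")
```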
Abstract:
We analyse the widely-used International/Zürich sunspot number record, R, with a view to quantifying a suspected calibration discontinuity around 1945 (which has been termed the “Waldmeier discontinuity” [Svalgaard, 2011]). We compare R against the composite sunspot group data from the Royal Greenwich Observatory (RGO) network and the Solar Optical Observing Network (SOON), using both the number of sunspot groups, N_G, and the total area of the sunspots, A_G. In addition, we compare R with the recently developed interdiurnal variability geomagnetic indices IDV and IDV(1d). In all four cases, linearity of the relationship with R is not assumed and care is taken to ensure that the relationship of each with R is the same before and after the putative calibration change. It is shown that the probability that no correction is needed is of order 10^−8 and that R is indeed too low before 1945. The optimum correction to R for values before 1945 is found to be 11.6%, 11.7%, 10.3% and 7.9% using A_G, N_G, IDV, and IDV(1d), respectively. The optimum value obtained by combining the sunspot group data is 11.6%, with an uncertainty range of 8.1–14.8% at the 2σ level. The geomagnetic indices provide an independent yet less stringent test but do give values that fall within the 2σ uncertainty band, with optimum values slightly lower than those from the sunspot group data. The probability of the correction needed being as large as 20%, as advocated by Svalgaard [2011], is shown to be 1.6 × 10^−5.
Abstract:
Background: Massive Open Online Courses (MOOCs) have become immensely popular in a short span of time. However, there is very little research exploring MOOCs in the discipline of Health and Medicine. This paper aims to fill this void by providing a review of Health and Medicine related MOOCs. Objective: Provide a review of Health and Medicine related MOOCs offered by various MOOC platforms within the year 2013. Analyze and compare the various offerings, their target audience, typical length of a course and credentials offered. Discuss opportunities and challenges presented by MOOCs in the discipline of Health and Medicine. Methods: Health and Medicine related MOOCs were gathered using several methods to ensure the richness and completeness of data. Identified MOOC platform websites were used to gather the lists of offerings. In parallel, these MOOC platforms were contacted to access official data on their offerings. Two MOOC aggregator sites (Class Central and MOOC List) were also consulted to gather data on MOOC offerings. Eligibility criteria were defined to concentrate on the courses that were offered in 2013 and primarily on the subject ‘Health and Medicine’. All language translations in this paper were achieved using Google Translate. Results: The search identified 225 courses, of which 98 were eligible for the review (n = 98). 58% (57) of the MOOCs considered were offered on the Coursera platform, and 94% (92) of all the MOOCs were offered in English. 90 MOOCs were offered by universities, and Johns Hopkins University offered the largest number of MOOCs (12). Only three MOOCs were offered by developing countries (China, West Indies, and Saudi Arabia). The duration of MOOCs varied from three weeks to 20 weeks, with an average length of 6.7 weeks. On average, MOOCs expected a participant to work on the material for 4.2 hours a week. Verified Certificates were offered by 14 MOOCs, while three others offered other professional recognition. Conclusions: The review presents evidence to suggest that MOOCs can be used as a way to provide continuing medical education. It also shows the potential of MOOCs as a means of increasing health literacy among the public.
Abstract:
Variational data assimilation is commonly used in environmental forecasting to estimate the current state of the system from a model forecast and observational data. The assimilation problem can be written simply in the form of a nonlinear least squares optimization problem. However, the practical solution of the problem in large systems requires many careful choices to be made in the implementation. In this article we present the theory of variational data assimilation and then discuss in detail how it is implemented in practice. Current solutions and open questions are discussed.
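As a concrete reference point, the nonlinear least squares problem mentioned above is typically written as the minimisation of a cost function of roughly the following form, in conventional notation: x_b is the background (forecast) state, B and R are the background and observation error covariance matrices, H is the observation operator, and y is the vector of observations. Details vary between implementations.

```latex
% Standard form of the variational data assimilation cost function
% (conventional notation; details vary between implementations).
\[
J(\mathbf{x}) =
  \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}\,\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)
\]
```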
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
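To give a flavour of the data-driven benchmarking idea, the sketch below fits a very simple timing model, t(n, p) ≈ a + b·n/p, to synthetic benchmark measurements by least squares and then predicts the runtime at a new core count. The model form and timing data are illustrative only and are not those used in the paper.

```python
# Illustrative data-driven performance model: fit t(n, p) ~ a + b * n / p to
# benchmark timings and predict the runtime at a new core count.
# The model form and timing data are synthetic, not those of the paper.
import numpy as np

# Synthetic benchmark measurements: (grid points n, cores p, measured seconds t)
measurements = np.array([
    #        n,    p,     t
    [1_000_000,   16, 64.0],
    [1_000_000,   32, 33.1],
    [1_000_000,   64, 17.5],
    [4_000_000,   64, 66.8],
    [4_000_000,  128, 34.9],
])

n = measurements[:, 0]
p = measurements[:, 1]
t = measurements[:, 2]

# Design matrix for the linear model t = a + b * (n / p)
A = np.column_stack([np.ones_like(t), n / p])
(a, b), *_ = np.linalg.lstsq(A, t, rcond=None)
print(f"fitted model: t = {a:.3f} + {b:.3e} * n/p")

# Predict the runtime for a configuration not in the benchmark set
n_new, p_new = 4_000_000, 256
print(f"predicted t(n={n_new}, p={p_new}) = {a + b * n_new / p_new:.1f} s")
```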