48 results for Open Data, Dati Aperti, Open Government Data
in CentAUR: Central Archive University of Reading - UK
Abstract:
We use the third perihelion pass by the Ulysses spacecraft to illustrate and investigate the “flux excess” effect, whereby open solar flux estimates from spacecraft increase with increasing heliocentric distance. We analyze the potential effects of small-scale structure in the heliospheric field (giving fluctuations in the radial component on timescales smaller than 1 h) and kinematic time-of-flight effects of longitudinal structure in the solar wind flow. We show that the flux excess is explained neither by very small-scale structure (timescales < 1 h) nor by the kinematic “bunching effect” on spacecraft sampling. The observed flux excess is, however, well explained by the kinematic effect of larger-scale (>1 day) solar wind speed variations on the frozen-in heliospheric field. We show that averaging over an interval T (one long enough to eliminate structure originating in the heliosphere yet short enough to avoid cancelling the opposite-polarity radial field that originates from genuine sector structure in the coronal source field) is only an approximately valid way of allowing for these effects and does not adequately explain or account for differences between the streamer belt and the polar coronal holes.
Abstract:
We investigate the “flux excess” effect, whereby open solar flux estimates from spacecraft increase with increasing heliocentric distance. We analyze the kinematic effect on these open solar flux estimates of large-scale longitudinal structure in the solar wind flow, with particular emphasis on correcting estimates made using data from near-Earth satellites. We show that scatter, but no net bias, is introduced by the kinematic “bunching effect” on sampling and that this is true for both compression and rarefaction regions. The observed flux excesses, as a function of heliocentric distance, are shown to be consistent with open solar flux estimates from solar magnetograms made using the potential field source surface method and are well explained by the kinematic effect of solar wind speed variations on the frozen-in heliospheric field. Applying this kinematic correction to the Omni-2 interplanetary data set shows that the open solar flux at solar minimum fell from an annual mean of 3.82 × 10^14 Wb in 1987 to close to half that value (1.98 × 10^14 Wb) in 2007, making the fall in the minimum value over the last two solar cycles considerably faster than the rise inferred from geomagnetic activity observations over four solar cycles in the first half of the 20th century.
Abstract:
Svalgaard and Cliver (2010) recently reported a consensus between the various reconstructions of the heliospheric field over recent centuries. This is a significant development because, individually, each has uncertainties introduced by instrument calibration drifts, limited numbers of observatories, and the strength of the correlations employed. However, taken collectively, a consistent picture is emerging. We here show that this consensus extends to more data sets and methods than reported by Svalgaard and Cliver, including that used by Lockwood et al. (1999), when their algorithm is used to predict the heliospheric field rather than the open solar flux. One area where there is still some debate relates to the existence and meaning of a floor value to the heliospheric field. From cosmogenic isotope abundances, Steinhilber et al. (2010) have recently deduced that the near-Earth IMF at the end of the Maunder minimum was 1.80 ± 0.59 nT, which is considerably lower than the revised floor of 4 nT proposed by Svalgaard and Cliver. We here combine cosmogenic and geomagnetic reconstructions and modern observations (with allowance for the effect of solar wind speed and structure on the near-Earth data) to derive an estimate for the open solar flux of (0.48 ± 0.29) × 10^14 Wb at the end of the Maunder minimum. By way of comparison, the largest and smallest annual means recorded by instruments in space between 1965 and 2010 are 5.75 × 10^14 Wb and 1.37 × 10^14 Wb, set in 1982 and 2009 respectively, and the maximum of the 11-year running means was 4.38 × 10^14 Wb in 1986. Hence the average open solar flux during the Maunder minimum is found to have been 11% of its peak value during the recent grand solar maximum.
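The closing 11% figure follows directly from the two open solar flux values quoted in this abstract (both in units of 10^14 Wb); a minimal check:

```python
# Ratio of the Maunder-minimum open solar flux estimate to the peak
# 11-year running mean during the recent grand maximum (both values
# quoted in the abstract above, in units of 10^14 Wb).
maunder_flux = 0.48      # (+/- 0.29), end of the Maunder minimum
peak_11yr_mean = 4.38    # maximum 11-year running mean, set in 1986

ratio = maunder_flux / peak_11yr_mean
print(f"{ratio:.0%}")    # → 11%
```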
Abstract:
We investigate the relationship between interdiurnal variation geomagnetic activity indices, IDV and IDV(1d), corrected sunspot number, R_C, and the group sunspot number R_G. R_C uses corrections for both the “Waldmeier discontinuity”, as derived in Paper 1 [Lockwood et al., 2014c], and the “Wolf discontinuity” revealed by Leussu et al. [2013]. We show that the simple correlation of the geomagnetic indices with R_C^n or R_G^n masks a considerable solar cycle variation. Using IDV(1d) or IDV to predict or evaluate the sunspot numbers, the errors are almost halved by allowing for the fact that the relationship varies over the solar cycle. The results indicate that differences between R_C and R_G have a variety of causes and are highly unlikely to be attributable to errors in either R_C or R_G alone, as has recently been assumed. Because it is not known whether R_C or R_G is a better predictor of open flux emergence before 1874, a simple sunspot number composite is suggested which, like R_G, enables modelling of the open solar flux for 1610 onwards in Paper 3, but maintains the characteristics of R_C.
Abstract:
A comprehensive atmospheric boundary layer (ABL) data set was collected in eight field experiments (two during each season) over open water and sea ice in the Baltic Sea during 1998–2001 with the primary objective to validate the coupled atmospheric-ice-ocean-land surface model BALTIMOS (BALTEX Integrated Model System). Measurements were taken by aircraft, ships and surface stations and cover the mean and turbulent structure of the ABL including turbulent fluxes, radiation fluxes, and cloud conditions. Measurement examples of the spatial variability of the ABL over the ice edge zone and of the stable ABL over open water demonstrate the wide range of ABL conditions collected and the strength of the data set, which can also be used to validate other regional models.
Abstract:
Chongqing is the largest central-government-controlled municipality in China, and it is now undergoing rapid urbanization. The question remains open: what are the consequences of such rapid urbanization in Chongqing in terms of urban microclimates? An integrated study comprising three different research approaches is adopted in the present paper. By analyzing the observed annual climate data, an average rising trend of 0.10 °C/decade was found for the annual mean temperature from 1951 to 2010 in Chongqing, indicating a marked degree of urban warming. In addition, two complementary types of field measurements were conducted: fixed weather stations and mobile transverse measurements. Numerical simulations using an in-house program are able to predict the urban air temperature in Chongqing. The urban heat island intensity in Chongqing is stronger in summer than in autumn and winter. The maximum urban heat island intensity occurs at around midnight and can be as high as 2.5 °C. In the daytime, an urban cool island exists. Local greenery has a great impact on the local thermal environment. Urban green spaces can reduce urban air temperature and therefore mitigate the urban heat island. The cooling effect of an urban river is limited in Chongqing, as both sides of the river are among the most developed areas, but the relative humidity is much higher near the river than in places far from it.
Abstract:
Massive Open Online Courses (MOOCs) have become very popular among learners: millions of users from around the world have registered with the leading platforms, and hundreds of universities (and other organizations) now offer MOOCs. However, the sustainability of MOOCs is a pressing concern, as MOOCs incur up-front creation costs, maintenance costs to keep content relevant, and ongoing support costs to provide facilitation while a course is being run. At present, charging a fee for certification (for example, the Coursera Signature Track and the FutureLearn Statement of Completion) seems a popular business model. In this paper, the authors discuss other possible business models and their pros and cons. The business models discussed include:
- Freemium model – providing content freely but charging for premium services such as course support, tutoring and proctored exams.
- Sponsorships – courses can be created in collaboration with industry, with industry sponsorships used to cover the costs of course production and delivery. For example, the Teaching Computing course was offered by the University of East Anglia on the FutureLearn platform with sponsorship from British Telecom, while the UK Government sponsored the course Introduction to Cyber Security offered by the Open University on FutureLearn.
- Initiatives and grants – governments, the EU Commission or corporations could commission the creation of courses through grants and initiatives according to the skills gaps identified for the economy. For example, the UK Government's National Cyber Security Programme has supported a course on cyber security; similar initiatives could also fund relevant course development and delivery.
- Donations – free software, Wikipedia and early OER initiatives such as MIT OpenCourseWare accept donations from the public, and this could equally serve as a business model, with learners contributing (if they wish) to the maintenance and facilitation of a course.
- Merchandise – selling merchandise could also bring revenue to MOOCs. As many participants do not seek formal recognition for their completion of a MOOC (European Commission, 2014), merchandise that presents their achievement in a playful way could well be attractive to them.
- Sale of supplementary material – supplementary course material, in the form of an online or physical book or similar, could be sold, with the revenue reinvested in course delivery.
- Selective advertising – courses could carry advertisements relevant to learners.
- Data sharing – though a controversial topic, sharing learner data with relevant employers or similar could be another revenue model for MOOCs.
- Follow-on events – courses could lead to follow-on summer schools, courses or other real-life or online paid-for events, in which case a percentage of the revenue could be passed on to the MOOC for its upkeep.
Though these models are all possible ways of generating revenue for MOOCs, some are more controversial and sensitive than others. Nevertheless, unless appropriate business models are identified, the sustainability of MOOCs will be problematic.
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
Abstract:
Cross-hole anisotropic electrical and seismic tomograms of fractured metamorphic rock have been obtained at a test site where extensive hydrological data were available. A strong correlation between electrical resistivity anisotropy and seismic compressional-wave velocity anisotropy has been observed. Analysis of core samples from the site reveals that the shale-rich rocks have fabric-related average velocity anisotropy of between 10% and 30%. The cross-hole seismic data are consistent with these values, indicating that the observed anisotropy might be principally due to the inherent rock fabric rather than to aligned sets of open fractures. One region with velocity anisotropy greater than 30% has been modelled as aligned open fractures within an anisotropic rock matrix, and this model is consistent with the available fracture density and hydraulic transmissivity data from the boreholes and the cross-hole resistivity tomography data. In general, however, the study highlights the uncertainties that can arise, due to the relative influence of rock fabric and fluid-filled fractures, when using geophysical techniques for hydrological investigations.
Abstract:
The near-Earth heliospheric magnetic field intensity, |B|, exhibits a strong solar cycle variation, but returns to the same “floor” value each solar minimum. The current minimum, however, has seen |B| drop below previous minima, bringing into question the existence of a floor, or at the very least requiring a reassessment of its value. In this study we assume the heliospheric flux consists of a constant open flux component and a time-varying contribution from CMEs. In this scenario, the true floor is |B| with zero CME contribution. Using observed CME rates over the solar cycle, we estimate the “no-CME” |B| floor at ~4.0 ± 0.3 nT, lower than previous floor estimates and below the |B| observed this solar minimum. We speculate that the drop in |B| observed this minimum may be due to a persistently lower CME rate than in the previous minimum, though there are large uncertainties in the supporting observational data.
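In the two-component model described in this abstract, the floor is the intercept of |B| against CME rate extrapolated to zero rate. A minimal sketch of that idea, using purely illustrative numbers rather than the paper's data:

```python
import numpy as np

# Two-component model from the abstract: |B| = floor + CME contribution,
# with the CME contribution here taken proportional to the CME rate.
# The numbers below are illustrative only, not the observed values.
cme_rate = np.array([0.5, 1.0, 2.0, 3.0, 4.0])  # CMEs per day (hypothetical)
b_obs = 4.0 + 0.6 * cme_rate                    # nT, built from a 4.0 nT floor

# Linear least-squares fit; the intercept recovers the "no-CME" floor.
slope, floor = np.polyfit(cme_rate, b_obs, 1)
print(round(floor, 1), "nT")   # → 4.0 nT
```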
Abstract:
We use geomagnetic activity data to study the rise and fall over the past century of the solar wind flow speed V_SW, the interplanetary magnetic field strength B, and the open solar flux F_S. Our estimates include allowance for the kinematic effect of longitudinal structure in the solar wind flow speed. As well as solar cycle variations, all three parameters show a long-term rise during the first half of the 20th century followed by peaks around 1955 and 1986 and then a recent decline. Cosmogenic isotope data reveal that this constitutes a grand maximum of solar activity which began in 1920, using the definition that such grand maxima occur when 25-year averages of the heliospheric modulation potential exceed 600 MV. Extrapolating the linear declines seen in all three parameters since 1985 yields predictions that the grand maximum will end in 2013, 2014, or 2027 using V_SW, F_S, or B, respectively. These estimates are consistent with predictions based on the probability distribution of the durations of past grand solar maxima seen in cosmogenic isotope data. The data contradict any suggestion of a floor to the open solar flux: we show that the solar minimum open solar flux, kinematically corrected to allow for the excess flux effect, has halved over the past two solar cycles.
Abstract:
Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information. Copyright © 2003 John Wiley & Sons, Ltd.
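The net benefit parameter referred to in this abstract is commonly defined as the incremental net monetary benefit NB = λ·ΔE − ΔC, which is positive when the new intervention is cost-effective at willingness-to-pay λ. A sketch with hypothetical numbers (the trial's actual costs and benefits are not reproduced here):

```python
def net_benefit(delta_e, delta_c, wtp):
    """Incremental net monetary benefit: wtp * delta_e - delta_c.

    delta_e: incremental health benefit (e.g. QALYs) of new vs. standard,
    delta_c: incremental cost, wtp: willingness to pay per unit of benefit.
    A positive value means the new intervention is cost-effective.
    """
    return wtp * delta_e - delta_c

# Hypothetical example: 0.02 extra QALYs for 300 extra cost at 20000/QALY.
print(net_benefit(0.02, 300.0, 20000.0))   # → 100.0 (cost-effective)
```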
Abstract:
Because of the importance and potential usefulness of construction market statistics to firms and government, consistency between different sources of data is examined with a view to building a predictive model of construction output using construction data alone. However, a comparison of Department of Trade and Industry (DTI) and Office for National Statistics (ONS) series shows that the correlation coefficient (used as a measure of consistency) of the DTI output and DTI orders data and the correlation coefficient of the DTI output and ONS output data are low. It is not possible to derive a predictive model of DTI output based on DTI orders data alone. The question arises whether or not an alternative independent source of data may be used to predict DTI output data. Independent data produced by Emap Glenigan (EG), based on planning applications, potentially offers such a source of information. The EG data records the value of planning applications and their planned start and finish dates. However, as this data is ex ante and is not correlated with DTI output, it is not possible to use this data to describe the volume of actual construction output. Nor is it possible to use the EG planning data to predict DTI construction orders data. Further consideration of the issues raised reveals that it is not practically possible to develop a consistent predictive model of construction output using construction statistics gathered at different stages in the development process.
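The consistency measure used in this abstract is the ordinary Pearson correlation coefficient between two data series; a minimal sketch (the series below are hypothetical, not the DTI, ONS or EG data):

```python
import numpy as np

def consistency(series_a, series_b):
    """Pearson correlation coefficient, used as the measure of consistency
    between two output series."""
    return np.corrcoef(series_a, series_b)[0, 1]

# Hypothetical output series in arbitrary units.
dti_output = np.array([100.0, 104.0, 99.0, 110.0, 115.0])
ons_output = np.array([101.0, 103.0, 100.0, 108.0, 117.0])
print(round(consistency(dti_output, ons_output), 3))  # close to 1: consistent
```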
Abstract:
The high variability of the intensity of suprathermal electron flux in the solar wind is usually ascribed to the high variability of sources on the Sun. Here we demonstrate that a substantial amount of the variability arises from peaks in stream interaction regions, where fast wind runs into slow wind and creates a pressure ridge at the interface. Superposed epoch analysis centered on stream interfaces in 26 interaction regions previously identified in Wind data reveals a twofold increase in 250 eV flux (integrated over pitch angle). Whether the peaks result from the compression there or are solar signatures of the coronal hole boundary, to which interfaces may map, is an open question. Suggestive of the latter, some cases show a displacement between the electron and magnetic field peaks at the interface. Since solar information is transmitted to 1 AU much more quickly by suprathermal electrons than by convected plasma signatures, the displacement may imply a shift in the coronal hole boundary through transport of open magnetic flux via interchange reconnection. If so, however, the fact that displacements occur in both directions and that the electron and field peaks in the superposed epoch analysis are nearly coincident indicates that any systematic transport expected from differential solar rotation is overwhelmed by a random pattern, possibly owing to transport across a ragged coronal hole boundary.
Abstract:
The ability to display and inspect powder diffraction data quickly and efficiently is a central part of the data analysis process. Whilst many computer programs are capable of displaying powder data, their focus is typically on advanced operations such as structure solution or Rietveld refinement. This article describes a lightweight software package, Jpowder, whose focus is fast and convenient visualization and comparison of powder data sets in a variety of formats from computers with network access. Jpowder is written in Java and uses its associated Web Start technology to allow ‘single-click deployment’ from a web page, http://www.jpowder.org. Jpowder is open source, free and available for use by anyone.