889 results for Open Data, Dati Aperti, Open Government Data
Abstract:
Background: The purpose of the study is to identify factors predictive of outcome after open globe injury in 273 patients admitted to the Royal Brisbane Hospital, Queensland, Australia between 1992 and 2003. Methods: Data were collected retrospectively regarding demographic and geographical factors, injury circumstances, initial visual acuity (VA), injury parameters, details of initial and subsequent surgery, final best corrected VA and complications. Multivariate analysis using binary logistic regression was used to identify which factors were related to outcome. Results: 83% of patients were male, with a mean age of 38.3 years. The mean duration of follow-up was 16.6 months, and 58% of patients (135 of 231) achieved an overall improvement in their vision. Forty-one cases (15%) were enucleated, half of them as primary procedures. The prognostic factors indicating the likelihood of a VA of counting fingers or worse were poor initial VA, a large laceration (> 10 mm) and the presence of a relative afferent pupil defect. Rural patients had a significantly worse final VA than city dwellers and had higher rates of endophthalmitis and enucleation. Conclusions: Assessment of prognostic factors at the time of presentation of an open globe injury gives the doctor and the patient realistic expectations of the final visual outcome. To improve outcomes in patients from rural areas, access to specialized eye services needs to be upgraded.
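The multivariate method named above, binary logistic regression on presentation findings, can be sketched briefly. The variable names and simulated data below are hypothetical stand-ins, not the study's dataset or coding scheme.

```python
# Minimal sketch of identifying prognostic factors with binary logistic
# regression, as described above. All variable names and the simulated
# data are hypothetical; the study's dataset is not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
predictors = pd.DataFrame({
    "poor_initial_va": rng.integers(0, 2, n),     # poor initial VA (0/1)
    "laceration_gt_10mm": rng.integers(0, 2, n),  # laceration > 10 mm (0/1)
    "rapd_present": rng.integers(0, 2, n),        # relative afferent pupil defect (0/1)
})
# Simulate an outcome whose odds rise with each risk factor
# (illustrative effect sizes only).
logit = (-2.0 + 1.5 * predictors["poor_initial_va"]
         + 1.0 * predictors["laceration_gt_10mm"]
         + 1.2 * predictors["rapd_present"])
outcome = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # final VA <= counting fingers

fit = sm.Logit(outcome, sm.add_constant(predictors)).fit(disp=False)
print(fit.summary())
print(np.exp(fit.params))  # odds ratios per prognostic factor
```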
Abstract:
Fires are integral to the healthy functioning of most ecosystems, yet they are often poorly understood in policy and management; floristic composition and habitat structure are intrinsically linked, particularly after fire. The aim of this study was to test whether the variability of habitat structure, or of floristic composition and abundance, in forests at a regional scale can be explained in terms of fire frequency, using historical data and experimental prescribed burns. We tested this hypothesis in open eucalypt forests on Fraser Island, off the east coast of Australia. Fraser Island dunes show progressive stages in plant succession as access to nutrients decreases across the Island. We found that fire frequency was not a good predictor of floristic composition or abundance across dune systems; rather, its effects were dune-specific. In contrast, habitat structure was strongly influenced by fire frequency, independent of dune system. A dense understorey occurred in frequently burnt areas, whereas infrequently burnt areas had a more even distribution of plant heights. Plant communities returned to pre-burn composition and abundance within 6 months of a fire, and frequently burnt areas were dominated by early-successional plant species. These ecosystems were characterized by low diversity, and frequently burnt areas on the east coast were dominated by Pteridium. Greater midstorey canopy cover in low-frequency areas reduces light penetration and allows other species to compete more effectively with Pteridium. Our results strongly indicate that frequent fires on the Island have resulted in a decrease in relative diversity through the dominance of several species. Prescribed fire represents a powerful management tool to shape the habitat structure and complexity of Fraser Island forests.
Abstract:
We developed a method to rapidly and safely live capture wild dugongs based on the “rodeo method” employed to catch marine turtles. This method entails close pursuit of a dugong by boat until it is fatigued. The dugong is then caught around the peduncle region by a catcher leaping off the boat, and the dugong is restrained at the water surface by several people while data are collected. Our sampling protocol involves a short restraint time, typically < 5 min. No ropes or nets were attached to the dugong to avoid the risk of entanglement and subsequent drowning. This method is suitable for shallow, open-water captures when weather and water conditions are fair, and may be adapted for deeper waters.
Abstract:
This paper outlines the methodology of blast fragmentation modeling undertaken for a greenfield feasibility study at the Riska gold deposit in Indonesia. The favoured milling process for the feasibility study was dump leaching, with no crushing of the ore material extracted from the pit. For this reason, blast fragmentation was a critical issue to be addressed by the study. A range of blast designs was considered, with bench heights ranging from 4 m to 7 m and blasthole diameters from 76 mm to 102 mm. Rock mass data were obtained from 19 diamond drill cores across the deposit (total drill length approximately 2200 m). Intact rock strength was estimated from qualitative strength descriptors, while the in situ block size distribution of the rock mass was estimated from the Rock Quality Designation (RQD) of the core.
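The abstract does not state how block size was derived from RQD, but a widely used relationship is the Priest and Hudson (1976) negative-exponential model linking RQD to mean fracture frequency. A sketch under that assumption, not necessarily the paper's procedure:

```python
# Sketch: back-calculating mean fracture frequency (and hence a mean block
# dimension) from RQD via the Priest & Hudson (1976) relationship
#   RQD = 100 * exp(-0.1*lam) * (1 + 0.1*lam),   lam in fractures/metre.
# Offered as one common approach; the paper's exact procedure is not given
# in the abstract.
import numpy as np
from scipy.optimize import brentq

def rqd_from_frequency(lam):
    return 100.0 * np.exp(-0.1 * lam) * (1.0 + 0.1 * lam)

def frequency_from_rqd(rqd):
    # RQD is strictly decreasing in lam, so a bracketed root-find suffices.
    return brentq(lambda lam: rqd_from_frequency(lam) - rqd, 1e-6, 100.0)

for rqd in (90.0, 75.0, 50.0):
    lam = frequency_from_rqd(rqd)
    print(f"RQD {rqd:5.1f}% -> {lam:5.2f} fractures/m, "
          f"mean spacing ~ {1.0 / lam:.2f} m")
```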
Abstract:
Extensible Business Reporting Language (XBRL) is being adopted by European regulators as a data standard for the exchange of business information. This paper examines the approach of XBRL International (XII) to the meta-data standard's development and diffusion. We theorise the development of XBRL using concepts drawn from a model of successful open source projects. Comparison of the open source model with XBRL enables us to identify a number of interesting similarities and differences. In common with open source projects, the benefits and progress of XBRL have been overstated and 'hyped' by enthusiastic participants. While XBRL is an open data standard in terms of access to the equivalent of its 'source code', we find that the governance structure of the XBRL consortium differs significantly from a model open source approach. The barrier to participation created by requiring paid membership, and the focus on transacting business at physical conferences and meetings, are identified as particularly critical. Decisions about the technical structure of XBRL, the regulator-led pattern of adoption and the organisation of XII are discussed. Finally, areas for future research are identified.
Abstract:
The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through the selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component based software that is flexible, extensible, and expressive. We apply a data-driven, object oriented programming methodology to component based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
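As a rough, purely schematic illustration of the data driven component style the thesis describes (declarative content data selecting and configuring component implementations at runtime), consider the sketch below. The class and field names are invented for illustration and are not the Fluid project's API.

```python
# Schematic sketch of data-driven component composition: a declarative
# content description selects and configures component implementations
# at runtime. Invented names; not the Fluid framework's actual API.
import json

class PointRenderer:
    def __init__(self, colour="white"):
        self.colour = colour
    def render(self, item):
        print(f"point {item} in {self.colour}")

class LabelRenderer:
    def __init__(self, font="sans"):
        self.font = font
    def render(self, item):
        print(f"label {item!r} in {self.font}")

REGISTRY = {"points": PointRenderer, "labels": LabelRenderer}

# Content data, not code, decides which components run and how they are
# configured.
content = json.loads("""
{"pipeline": [
   {"component": "points", "config": {"colour": "red"}},
   {"component": "labels", "config": {"font": "mono"}}
]}
""")

pipeline = [REGISTRY[stage["component"]](**stage["config"])
            for stage in content["pipeline"]]
for component in pipeline:
    component.render("sample")
```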
Abstract:
This research was conducted at the Space Research and Technology Centre of the European Space Agency at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions has been approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work. This allows the interests of the research to be correlated with European public policy. The principles of communication were also considered in the topic of information visualisation; this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.
Abstract:
Forests play a pivotal role in timber production, maintenance and development of biodiversity and in carbon sequestration and storage in the context of the Kyoto Protocol. Policy makers and forest experts therefore require reliable information on forest extent, type and change for management, planning and modeling purposes. It is becoming increasingly clear that such forest information is frequently inconsistent and unharmonised between countries and continents. This research paper presents a forest information portal that has been developed in line with the GEOSS and INSPIRE frameworks. The web portal provides access to forest resources data at a variety of spatial scales, from global through to regional and local, as well as providing analytical capabilities for monitoring and validating forest change. The system also allows for the utilisation of forest data and processing services within other thematic areas. The web portal has been developed using open standards to facilitate accessibility, interoperability and data transfer.
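Because the portal is built on open OGC standards, a client can pull forest layers programmatically. A sketch using the OWSLib library follows; the endpoint URL and layer name are placeholders, not the portal's actual services.

```python
# Sketch of retrieving a forest-cover layer from an OGC Web Map Service,
# the kind of open-standards access the portal described above provides.
# The service URL and layer name are placeholders, not real endpoints.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))  # layers advertised by the service

img = wms.getmap(
    layers=["forest:cover_change"],   # hypothetical layer name
    styles=[""],
    srs="EPSG:4326",
    bbox=(-10.0, 35.0, 30.0, 70.0),   # lon/lat bounding box over Europe
    size=(800, 600),
    format="image/png",
)
with open("forest_cover.png", "wb") as f:
    f.write(img.read())
```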
Abstract:
PURPOSE: To validate a new miniaturised, open-field wavefront device which has been developed with the capacity to be attached to an ophthalmic surgical microscope or slit-lamp. SETTING: Solihull Hospital and Aston University, Birmingham, UK. DESIGN: Comparative non-interventional study. METHODS: The dynamic range of the Aston Aberrometer was assessed using a calibrated model eye. The validity of the Aston Aberrometer was compared to a conventional desk-mounted Shack-Hartmann aberrometer (Topcon KR1W) by measuring the refractive error and higher order aberrations of 75 dilated eyes with both instruments in random order. The Aston Aberrometer measurements were repeated five times to assess intra-session repeatability. Data were converted to vector form for analysis. RESULTS: The Aston Aberrometer had a large dynamic range of at least +21.0 D to -25.0 D. It gave similar measurements to a conventional aberrometer for mean spherical equivalent (mean difference ± 95% confidence interval: 0.02 ± 0.49 D; correlation: r=0.995, p<0.001), astigmatic components (J0: 0.02 ± 0.15 D; r=0.977, p<0.001; J45: 0.03 ± 0.28 D; r=0.666, p<0.001) and higher order aberrations RMS (0.02 ± 0.20 D; r=0.620, p<0.001). Intraclass correlation coefficient assessments of intra-session repeatability for the Aston Aberrometer were excellent (spherical equivalent = 1.000, p<0.001; astigmatic components J0 = 0.998, p<0.001; J45 = 0.980, p<0.01; higher order aberrations RMS = 0.961, p<0.001). CONCLUSIONS: The Aston Aberrometer gives valid and repeatable measures of refractive error and higher order aberrations over a large range. As it is able to measure continuously, it can provide direct feedback to surgeons during intraocular lens implantation and corneal surgery as to the optical status of the visual system.
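The conversion to vector form mentioned in the methods is conventionally the Thibos power-vector transform, which maps a sphere/cylinder/axis refraction to the components (M, J0, J45) analysed above. A minimal sketch of that standard transform, not the authors' analysis code:

```python
# Standard power-vector conversion (Thibos et al.): sphere S, cylinder C
# and axis alpha map to (M, J0, J45). A sketch of the conventional
# transform, not the authors' actual analysis code.
import math

def power_vector(sphere, cyl, axis_deg):
    a = math.radians(axis_deg)
    m = sphere + cyl / 2.0                 # spherical equivalent
    j0 = -(cyl / 2.0) * math.cos(2 * a)    # 0/90-degree astigmatism
    j45 = -(cyl / 2.0) * math.sin(2 * a)   # oblique astigmatism
    return m, j0, j45

print(power_vector(-2.00, -1.50, 180))  # e.g. a -2.00/-1.50 x 180 refraction
```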
Abstract:
It is generally believed that the structural reforms that were introduced in India following the macro-economic crisis of 1991 ushered in competition and forced companies to become more efficient. However, whether the post-1991 growth is an outcome of more efficient use of resources or greater use of factor inputs remains an open empirical question. In this paper, we use plant-level data from 1989–1990 and 2000–2001 to address this question. Our results indicate that while there was an increase in the productivity of factor inputs during the 1990s, most of the growth in value added is explained by growth in the use of factor inputs. We also find that median technical efficiency declined in all but one of the industries between 1989–1990 and 2000–2001, and that change in technical efficiency explains a very small proportion of the change in gross value added.
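The decomposition underlying this distinction between input-driven and productivity-driven growth can be written as a standard growth-accounting identity. The notation below is illustrative and need not match the paper's exact specification:

```latex
% Growth in value added V decomposes into the contributions of capital K,
% labour L (with output elasticities alpha, beta) and total factor
% productivity A. Illustrative notation, not necessarily the paper's model.
\Delta \ln V = \alpha \, \Delta \ln K + \beta \, \Delta \ln L + \Delta \ln A
```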
Abstract:
Monitoring land-cover changes on sites of conservation importance allows environmental problems to be detected, solutions to be developed and the effectiveness of actions to be assessed. However, the remoteness of many sites or a lack of resources means these data are frequently not available. Remote sensing may provide a solution, but large-scale mapping and change detection may not be appropriate, necessitating site-level assessments. These need to be easy to undertake, rapid and cheap. We present an example of a Web-based solution based on free and open-source software and standards (including PostGIS, OpenLayers, Web Map Services, Web Feature Services and GeoServer) to support assessments of land-cover change (and validation of global land-cover maps). Authorised users are provided with means to assess land-cover visually and may optionally provide uncertainty information at various levels: from a general rating of their confidence in an assessment to a quantification of the proportions of land-cover types within a reference area. Versions of this tool have been developed for the TREES-3 initiative (Simonetti, Beuchle and Eva, 2011), which monitors tropical land-cover change through ground-truthing at latitude/longitude degree confluence points, and for monitoring change within and around Important Bird Areas (IBAs) by BirdLife International and the Royal Society for the Protection of Birds (RSPB). In this paper we present results from the second of these applications. We also present further details on the potential use of the land-cover change assessment tool on sites of recognised conservation importance, in combination with NDVI and other time series data from the eStation (a system for receiving, processing and disseminating environmental data). We show how the tool can be used to increase the usability of earth observation data by local stakeholders and experts, and assist in evaluating the impact of protection regimes on land-cover change.
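A site-level assessment of this kind typically reduces to spatial queries against the PostGIS store. The sketch below computes land-cover class proportions within a single site polygon; the connection string, table names (landcover, iba_sites) and columns are hypothetical, not the deployed schema.

```python
# Sketch of a PostGIS query that could back a land-cover assessment tool:
# the proportion of each land-cover class intersecting one site polygon.
# Connection details, table and column names are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=landcover user=reader password=secret")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT lc.class_name,
               SUM(ST_Area(ST_Intersection(lc.geom, s.geom)))
                 / MAX(ST_Area(s.geom)) AS proportion
        FROM landcover lc
        JOIN iba_sites s ON ST_Intersects(lc.geom, s.geom)
        WHERE s.site_id = %s
        GROUP BY lc.class_name;
    """, ("IBA-0001",))
    for class_name, proportion in cur.fetchall():
        print(f"{class_name}: {proportion:.1%}")
```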
Abstract:
Linked Data semantic sources, in particular DBpedia, can be used to answer many user queries. PowerAqua is an open multi-ontology Question Answering (QA) system for the Semantic Web (SW). However, the emergence of Linked Data, characterized by its openness, heterogeneity and scale, introduces a new dimension to the Semantic Web scenario, in which exploiting the relevant information to extract answers for Natural Language (NL) user queries is a major challenge. In this paper we discuss the issues and lessons learned from our experience of integrating PowerAqua as a front-end for DBpedia and a subset of Linked Data sources. As such, we go one step beyond the state of the art on end-user interfaces for Linked Data by introducing mapping and fusion techniques needed to translate a user query by means of multiple sources. Our first informal experiments probe whether, in fact, it is feasible to obtain answers to user queries by composing information across semantic sources and Linked Data, even in its current form, where the strength of Linked Data is more a by-product of its size than its quality. We believe our experiences can be extrapolated to a variety of end-user applications that wish to scale, open up, exploit and re-use what is possibly the greatest wealth of data about everything in the history of Artificial Intelligence. © 2010 Springer-Verlag.
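The translation PowerAqua performs, from a natural-language question to structured queries over sources such as DBpedia, ultimately bottoms out in queries of the kind sketched below. This example is hand-written against the public DBpedia endpoint and is not PowerAqua's generated output.

```python
# Hand-written SPARQL query against the public DBpedia endpoint,
# illustrating the structured-query target of NL question translation.
# Not PowerAqua's generated output.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    PREFIX dbr: <http://dbpedia.org/resource/>
    SELECT ?person WHERE {
        ?person a dbo:Person ;
                dbo:birthPlace dbr:Brisbane .
    } LIMIT 5
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["person"]["value"])
```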
Abstract:
The software architecture and development considerations for an open metadata extraction and processing framework are outlined. Special attention is paid to the aspects of reliability and fault tolerance. A Grid infrastructure is shown to be a useful backend for general-purpose tasks.