940 results for Data Standards
Abstract:
Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, this is referred to as the Digital Built Environment (DBE). It encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis.

The creation of a National Data Policy for the DBE will facilitate the creation of additional high-technology industries in Australia; provide governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013 [1]. Strong open data policy equates to $16 billion in new value [2]. Australian Government initiatives such as the Digital Earth-inspired “National Map” offer a platform and pathway to embrace the concept of a “BIM Globe”, while also leveraging unprecedented growth in open source / open data collaboration. Australia must address these challenges by learning from international experiences (most notably the UK and NZ): mandating the use of BIM across government, extending the Foundation Spatial Data Framework to include the Built Environment as a theme, and engaging collaboration through a “BIM Globe” metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry. It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle.

There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either: a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context; or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM).
The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia’s GDP [3], has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies. The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies: mandating the use of BIM on public procurement projects while providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures [4].

There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies, so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while, at the same time, developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area.

Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done. We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not act now. Strategic collaboration between governments, industry and academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.
Abstract:
Aim: The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, up-to-date, evidence-based competency standards are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach.

Methods: A mixed-methods research design was applied, using a three-step sequential exploratory design: step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, producing the draft competency standards; and step 3 involved quantitative data collection and analysis via a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups – an expert group and a recent graduate group of Australian orthotist/prosthetists – were led by an experienced facilitator to identify gaps in the current competency standards and then to outline a key purpose, work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association over three rounds of a Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages.

Results: In stage 1, the expert (n = 10) and new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members (n = 44 in the final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities and 68 performance criteria of the final competency standards.

Conclusion: This study outlines a rigorous, evidence-based mixed-methods approach for developing and endorsing professional competency standards, representative of the views of the orthotist/prosthetist profession.
Abstract:
Accelerator mass spectrometry (AMS) is an ultrasensitive technique for measuring the concentration of a single isotope. The electric and magnetic fields of an electrostatic accelerator system are used to filter out other isotopes from the ion beam. The high velocities mean that molecules can be destroyed and removed from the measurement background. As a result, concentrations down to one atom in 10^16 atoms are measurable. This thesis describes the construction of the new AMS system in the Accelerator Laboratory of the University of Helsinki. The system is described in detail along with the relevant ion optics. System performance and some of the 14C measurements made with the system are described.

In the second part of the thesis, a novel statistical model for the analysis of AMS data is presented. Bayesian methods are used in order to make the best use of the available information. In the new model, instrumental drift is modelled with a continuous first-order autoregressive process. This enables rigorous normalization to standards measured at different times. The Poisson statistical nature of a 14C measurement is also taken into account properly, so that uncertainty estimates are much more stable. It is shown that, overall, the new model improves both the accuracy and the precision of AMS measurements. In particular, the results can be improved for samples with very low 14C concentrations or samples measured only a few times.
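The structure of the model described above can be sketched in a few lines of Python. This is a toy illustration rather than the thesis model: the AR(1) parameters, count rates and true sample/standard ratio below are invented, and the estimator shown is a simple pooled ratio rather than a full Bayesian posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Instrumental drift d_t as a stationary AR(1) process around 1.0 (assumed values).
n = 20                  # interleaved measurement passes
phi, sigma = 0.9, 0.05  # AR(1) persistence and innovation scale
drift = np.empty(n)
drift[0] = 1.0
for t in range(1, n):
    drift[t] = 1.0 + phi * (drift[t - 1] - 1.0) + rng.normal(0.0, sigma)

# Poisson counts from a standard and a sample sharing the same drift.
true_ratio = 0.8                          # sample/standard 14C ratio (assumed)
counts_std = rng.poisson(500 * drift)
counts_smp = rng.poisson(500 * drift * true_ratio)

# The drift multiplies both channels, so it cancels (in expectation) in the
# pooled ratio, and Poisson pooling avoids unstable pass-by-pass uncertainties.
pointwise = counts_smp / counts_std
pooled = counts_smp.sum() / counts_std.sum()
print(f"pass-by-pass ratio: {pointwise.mean():.3f} +/- {pointwise.std(ddof=1):.3f}")
print(f"pooled ratio:       {pooled:.3f}  (true value {true_ratio})")
```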
Abstract:
One of the central issues in making efficient use of IT in the design, construction and maintenance of buildings is the sharing of digital building data across disciplines and lifecycle stages. One technology which enables data sharing is CAD layering which, to be of real use, requires the definition of standards. This paper focuses on the background, objectives and effectiveness of the international standard ISO 13567, Organisation and naming of layers for CAD. The focus is on the efficiency and effectiveness of the standardisation and standard-implementation process, rather than on the technical details. The study was conducted as a qualitative study with a number of experts who responded to a semi-structured mail questionnaire, supplemented by personal interviews. The main results were that CAD layer standards based on the ISO standard have been implemented, particularly in northern European countries, but are not very widely used. A major problem identified was the lack of resources for marketing and implementing the standard as national variants once it had been formally accepted.
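For readers unfamiliar with the standard, ISO 13567 structures a layer name as a sequence of fixed-width fields (a mandatory agent/element/presentation core plus optional fields). The sketch below is purely illustrative: the field widths and the example name are assumptions made for demonstration, not values quoted from the standard.

```python
from typing import NamedTuple

class LayerName(NamedTuple):
    agent: str         # responsible party, e.g. architect or engineer
    element: str       # building-element classification code
    presentation: str  # graphic presentation, e.g. geometry vs. annotation

def parse_layer(name: str) -> LayerName:
    """Split a fixed-width layer name into its mandatory fields.

    Assumed widths: 2 (agent) + 6 (element) + 2 (presentation).
    """
    return LayerName(name[0:2], name[2:8], name[8:10])

# Hypothetical layer name: architect-owned wall geometry.
print(parse_layer("A-21----E-"))
```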
Abstract:
For sign languages used by deaf communities, linguistic corpora have until recently been unavailable, due to the lack of a writing system and a written culture in these communities, and the very recent advent of digital video. Recent improvements in video and computer technology have now made larger sign language datasets possible; however, large sign language datasets that are fully machine-readable remain elusive. This is due to two challenges: (1) inconsistencies that arise when signs are annotated by means of spoken/written language; and (2) the fact that many parts of signed interaction are not necessarily fully composed of lexical signs (the equivalent of words), instead consisting of constructions that are less conventionalised. As sign language corpus building progresses, the potential for some standards in annotation is beginning to emerge, but before this project there had been no attempt to standardise these practices across corpora, which is required for comparing data crosslinguistically. This project thus had the following aims: (1) to develop annotation standards for glosses (the lexical/word level); (2) to test their reliability and validity; and (3) to improve current software tools that facilitate a reliable workflow. Overall, the project aimed not only to set a standard for the whole field of sign language studies throughout the world but also to make significant advances toward two of the world’s largest machine-readable datasets for sign languages – specifically the BSL Corpus (British Sign Language, http://bslcorpusproject.org) and the Corpus NGT (Sign Language of the Netherlands, http://www.ru.nl/corpusngt).
Abstract:
This study was undertaken by UKOLN on behalf of the Joint Information Systems Committee (JISC) in the period April to September 2008. Application profiles are metadata schemata which consist of data elements drawn from one or more namespaces, optimized for a particular local application. They offer a way for particular communities to base the interoperability specifications they create and use for their digital material on established open standards. This offers the potential for digital materials to be accessed, used and curated effectively both within and beyond the communities in which they were created. The JISC recognized the need to undertake a scoping study to investigate metadata application profile requirements for scientific data in relation to digital repositories, specifically concerning descriptive metadata to support resource discovery and other functions such as preservation. This followed on from the development of the Scholarly Works Application Profile (SWAP), undertaken within the JISC Digital Repositories Programme and led by Andy Powell (Eduserv Foundation) and Julie Allinson (RRT UKOLN) on behalf of the JISC.

Aims and Objectives: (1) to assess whether a single metadata application profile (AP) for research data, or a small number thereof, would improve resource discovery or discovery-to-delivery in any useful or significant way; (2) if so, then to (a) assess whether the development of such AP(s) is practical and, if so, how much effort it would take, and (b) scope a community uptake strategy that is likely to be successful, identifying the main barriers and key stakeholders; (3) otherwise, to investigate how best to improve cross-discipline, cross-community discovery-to-delivery for research data, and make recommendations to the JISC and others as appropriate.

Approach: The Study used a broad conception of what constitutes scientific data, namely data gathered, collated, structured and analysed using a recognizably scientific method, with a bias towards quantitative methods. The approach taken was to map out the landscape of existing data centres, repositories and associated projects, and to conduct a survey of the discovery-to-delivery metadata they use or have defined, alongside any insights they have gained from working with this metadata. This was followed up by a series of unstructured interviews discussing use cases for a Scientific Data Application Profile and how widely a single profile might be applied. On the latter point, matters of granularity, the experimental/measurement contrast, the quantitative/qualitative contrast, the raw/derived data contrast, and the homogeneous/heterogeneous data collection contrast were discussed. The Study report was loosely structured according to the Singapore Framework for Dublin Core Application Profiles, and in turn considered: the possible use cases for a Scientific Data Application Profile; existing domain models that could either be used or adapted for use within such a profile; and a comparison of existing metadata profiles and standards to identify candidate elements for inclusion in the description set profile for scientific data. The report also considered how the application profile might be implemented, its relationship to other application profiles, the alternatives to constructing a Scientific Data Application Profile, the development effort required, and what could be done to encourage uptake in the community. The conclusions of the Study were validated through a reference group of stakeholders.
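To make the notion of an application profile concrete: it mixes elements drawn from established namespaces and constrains them for a local purpose. The sketch below is entirely invented for illustration, using the common dc/dcterms prefixes; it is not an element set proposed by the Study.

```python
# A minimal, hypothetical description of one dataset, drawing its properties
# from two established namespaces and adding a local constraint (a set of
# mandatory elements, assumed here for the sake of the example).
record = {
    "dc:title":           "Hourly rainfall measurements, example catchment",
    "dc:creator":         "Example Research Group",
    "dcterms:created":    "2008-06-01",
    "dcterms:spatial":    "Example catchment, UK",
    "dcterms:conformsTo": "Scientific Data Application Profile (hypothetical)",
}

mandatory = {"dc:title", "dc:creator", "dcterms:created"}  # assumed local rule
missing = mandatory - record.keys()
print("valid against profile" if not missing else f"missing: {missing}")
```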
Abstract:
Scientific research revolves around the production, analysis, storage, management and re-use of data. Data sharing offers important benefits for scientific progress and the advancement of knowledge. However, several limitations and barriers to the general adoption of data sharing are still in place. Probably the most important challenge is that data sharing is not yet seen as a regular activity among scientists, although important efforts are being invested in promoting it. In addition, there is a relatively low commitment of scholars to citing data. The most important problems and challenges regarding data metrics are closely tied to these more general problems of data sharing. The development of data metrics depends on the growth of data sharing practices; after all, data metrics are nothing more than the registration of researchers’ behaviour. At the same time, the availability of proper metrics can help researchers make their data work more visible. This may subsequently act as an incentive for more data sharing, setting a virtuous circle in motion. This report explores the possibilities of metrics for datasets (i.e. the creation of reliable data metrics) and an effective reward system that aligns the interests of the main stakeholders involved in the process. The report reviews the current literature on data sharing and data metrics, presents interviews with the main stakeholders on both topics, and analyses the existing repositories and tools in the field of data sharing that have special relevance for the promotion and development of data metrics. On the basis of these three pillars, the report presents a number of solutions and necessary developments, as well as a set of recommendations regarding data metrics. The most important recommendations include: the general adoption of data sharing and data publication among scholars; the development of a reward system for scientists that includes data metrics; reducing the costs of data publication; reducing existing negative cultural perceptions of researchers regarding data publication; developing standards for the preservation, publication, identification and citation of datasets; more coordination of data repository initiatives; and further development of interoperability protocols across different actors.
Abstract:
On 23-24 September 2009, an international discussion workshop on “Main Drivers for Successful Re-Use of Research Data” was held in Berlin, prepared and organised by the Knowledge Exchange working group on Primary Research Data. The main focus of the workshop was on the benefits, challenges and obstacles of re-using data from a researcher’s perspective. The use cases presented by researchers from a variety of disciplines were supplemented by two keynotes and selected presentations by specialists from infrastructure institutions, publishers and funding bodies at national and European levels.

Researchers’ perspectives: The workshop provided a critical evaluation of what lessons have been learned on sharing and re-using research data from a researcher’s perspective, and what actions might still be taken to improve successful re-use. Despite the individual differences characterising the diverse disciplines, it became clear that the important issues are comparable across them.

Combining forces to support re-use and sharing of data: Apart from several technical challenges, such as metadata exchange standards and quality assurance, it was obvious that the most important obstacles to re-using research data more efficiently are socially determined. It was agreed that, in order to overcome this problem, more effort should be made to raise awareness and combine forces to support re-using and sharing of research data at all levels (researchers, institutions, publishers, funders, governments).
Abstract:
This report considers the development of environmental quality standards (EQSs) for five uses of controlled surface waters: salmonid fishery, cyprinid fishery, migratory fishery, commercial harvesting of marine fish for public consumption, and commercial harvesting of shellfish for public consumption. Previous reports have been used to identify the parameters necessary for the maintenance of these five uses. Each water use is considered in a separate section, within which the identified parameters are discussed and standards proposed; a summary of the proposed standards is presented at the beginning of the relevant section. For salmonid, cyprinid and migratory fisheries, EQSs for substances in water have been proposed for the protection of those fisheries. For the commercial harvesting of marine fish and shellfish for public consumption, 'Warning Levels' of substances in waters have been proposed instead. These 'Warning Levels' were derived from data on bioaccumulation and food standards, and aim to prevent acceptable intake values and concentrations in fish/shellfish flesh from exceeding statutory or recommended levels. For the commercial harvesting of marine fish for public consumption, it is concluded that the current EQSs for most List II substances for the protection of saltwater life should be adequately stringent to protect this use; for the commercial harvesting of shellfish for public consumption, however, these List II EQSs do not appear adequate, and more stringent 'Warning Levels' have been proposed. For all five uses, limited information was found on a number of the parameters considered, in general and for indigenous species in particular; this was especially so for migratory fisheries and the commercial harvesting of marine fish and shellfish.
Abstract:
This is the Proposed Environmental Quality Standards (EQS) for Phenol in Water report, prepared for the National Rivers Authority and published by the Environment Agency in 1995. The report reviews the properties and uses of phenol, its fate, behaviour and reported concentrations in the environment, and critically assesses the available data on its toxicity and bioaccumulation. This information is used to derive EQSs for the protection of fresh and saltwater life and for the abstraction of water to potable supply. Phenol is widely used as a chemical intermediate, and the main sources of phenol in the environment are of anthropogenic origin; phenol may also be formed during the natural decomposition of organic material. The persistence of phenol in the aquatic environment is low, with biodegradation the main degradation process (half-lives of hours to days). Phenol is moderately toxic to aquatic organisms, and its potential to bioaccumulate in aquatic organisms is low.
Abstract:
This is the Proposed Environmental Quality Standards (EQS) for Nonylphenol in Water report, produced by the Environment Agency in 1997. The report reviews the properties and uses of nonylphenol, its fate, behaviour and reported concentrations in the environment, and critically assesses the available data on its toxicity and bioaccumulation. This information is used to derive EQSs for the protection of fresh and saltwater life as well as for water abstracted to potable supply. Nonylphenol (NP) is used extensively in the production of other substances, such as non-ionic ethoxylate surfactants. It is through the incomplete anaerobic biodegradation of these surfactants that most nonylphenol reaches the aquatic environment in effluents, e.g. from sewage treatment works and certain manufacturing operations. The Environment Agency explicitly stated that the EQS was to be derived for NP and not for nonylphenol ethoxylates. However, since NP is unlikely to be present in the aquatic environment in the absence of other nonylphenol ethoxylate (NPE) degradation by-products, the toxicity, fate and behaviour of some of these (i.e. nonylphenol mono- and diethoxylates (NP1EO and NP2EO) and mono- and di-nonylphenoxy carboxylic acids (NP1EC and NP2EC)) have also been considered in this report. In the aquatic environment and during sewage treatment, NPEs are rapidly degraded to NP under aerobic conditions. NP may then be either fully mineralised or adsorbed to sediments. Since NP cannot be biodegraded under anaerobic conditions, it can accumulate in sediments to high concentrations.
Abstract:
This brief note reviews five papers presented at the 1993 IFAC World Congress on the theme of standards and guidelines for computer-aided control engineering (CACE). The session was organized as part of the CACE Software Standardization Initiative, a combined effort of the committees on standards of IFAC and the IEEE Control Systems Society. The motivation of this report is to note the substantial progress made in this initiative and to provide a basis for further discussion and work. The papers under review were concerned with integrated design environments, the use of the EXPRESS language for defining standard data structures, database management, user interfaces, and the modeling and simulation of hybrid systems.
Abstract:
As operational impacts from buildings are reduced, embodied impacts are increasing. However, the latter are seldom calculated in the UK; when they are, they tend to be calculated after the building has been constructed, or are underestimated by considering only the initial materials stage. In 2010, the UK Government recommended that a standard methodology for calculating the embodied impacts of buildings be developed for early-stage design decisions. This was followed in 2011-12 by the publication of the European TC350 standards, which define the 'cradle to grave' impact of buildings and products through a process-based life cycle analysis. This paper describes a new whole-life embodied carbon and energy of buildings (ECEB) tool, designed as a usable, empirically based approach to early-stage design decisions for UK buildings. The tool complies where possible with the TC350 standards. Initial results for a simple masonry-construction dwelling are given in terms of the percentage contribution of each life-cycle stage. The main difficulty in obtaining these results was found to be the lack of data, and the paper suggests that the construction and manufacturing industries now have a responsibility to develop new data in order to support this task.
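The stage-by-stage roll-up that such a tool performs can be sketched simply. The quantities and carbon factors below are invented placeholders, not ECEB outputs; the module grouping (A: product and construction, B: use, C: end of life) follows the TC350/EN 15978 convention.

```python
# Illustrative whole-life embodied carbon roll-up, reported as the
# percentage contribution of each life-cycle stage. All numbers assumed.
materials = {
    # material: (mass in kg, cradle-to-gate factor in kgCO2e/kg -> modules A1-A3)
    "brick":    (24_000, 0.24),
    "concrete": (18_000, 0.15),
    "timber":   (3_000, 0.45),
}

a1_a3 = sum(mass * factor for mass, factor in materials.values())
a4_a5 = 0.07 * a1_a3   # transport and construction, assumed proportional
b_use = 0.25 * a1_a3   # maintenance/replacement over service life, assumed
c_eol = 0.05 * a1_a3   # demolition and disposal, assumed

total = a1_a3 + a4_a5 + b_use + c_eol
for stage, value in [("A1-A3", a1_a3), ("A4-A5", a4_a5), ("B", b_use), ("C", c_eol)]:
    print(f"{stage:6} {value:10.0f} kgCO2e  ({100 * value / total:4.1f} %)")
```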
Abstract:
Due to concerns about environmental protection and resource utilization, product lifecycle management for end-of-life (EOL) has received increasing attention in many industrial sectors, including the manufacturing, maintenance/repair, and recycling/refurbishing of products. To support these functions, crucial issues are studied in order to realize a product recovery management system (PRMS), including: (1) an architecture design for EOL services, such as remanufacturing and recycling; (2) a product data model for EOL activity based on international standards; and (3) an infrastructure for information acquisition and mapping to product lifecycle information. The presented work is illustrated via a realistic scenario.
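As a rough illustration of the kind of information a product data model for EOL activity might carry, a minimal record could look like the following. The fields and the routing rule are assumptions made for this sketch, not the paper's standards-based model.

```python
from dataclasses import dataclass, field

@dataclass
class EolProductRecord:
    """Hypothetical minimal record for an end-of-life product."""
    product_id: str                # identifier captured at information acquisition
    product_class: str             # classification code, e.g. from a standard taxonomy
    usage_hours: float             # lifecycle information mapped from field data
    recoverable_parts: list[str] = field(default_factory=list)

    def recommend_route(self) -> str:
        # Toy decision rule (assumed): lightly used products are remanufactured.
        return "remanufacture" if self.usage_hours < 5_000 else "recycle"

record = EolProductRecord("PRD-0001", "pump", 3_200, ["motor", "housing"])
print(record.recommend_route())  # -> remanufacture
```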
Abstract:
During a parasite fauna investigation in 2005, the freshwater fish trypanosome Trypanosoma siniperca Chang 1964 was isolated from the blood of mandarin fish (Siniperca chuatsi) from Niushan Lake, Hubei Province, central China. Only blood trypomastigotes were observed, and the intensity of infection was low. Light microscopy of this material made it possible to study the morphology of this parasite in detail and to redescribe it according to current standards. T. siniperca is also characterized at the molecular level using sequences of the SSU rRNA gene. Phylogenetic analyses based on these sequences allowed clearer phylogenetic relationships to be established with the other fish trypanosomes sequenced to date.