Abstract:
Traffic simulation models tend to have their own data input and output formats. In an effort to standardise the input for traffic simulations, we introduce in this paper a set of data marts that aim to serve as a common interface between the necessary data, stored in dedicated databases, and the software packages that require the input in a certain format. The data marts are developed based on real-world objects (e.g. roads, traffic lights, controllers) rather than abstract models and hence contain all necessary information, which the importing software package can transform to its needs. The paper contains a full description of the data marts for network coding, simulation results, and scenario management, which have been discussed with industry partners to ensure sustainability.
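The abstract describes the data marts only at the level of real-world objects, not as a concrete schema. As a rough sketch of the idea, the following shows how a network-coding mart built from such objects might be represented in code; all class and field names here (Road, TrafficLight, Controller, NetworkDataMart) are illustrative assumptions, not the schema from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical network-coding data mart built from real-world objects
# (roads, traffic lights, controllers) instead of simulator-specific models.

@dataclass
class Road:
    road_id: str
    length_m: float          # physical length in metres
    lanes: int
    speed_limit_kmh: float

@dataclass
class TrafficLight:
    light_id: str
    road_id: str             # road the signal head belongs to
    position_m: float        # offset from the start of the road

@dataclass
class Controller:
    controller_id: str
    light_ids: List[str]     # signal heads driven by this controller
    cycle_time_s: float

@dataclass
class NetworkDataMart:
    """Format-neutral container; each simulation package supplies its own importer."""
    roads: List[Road] = field(default_factory=list)
    lights: List[TrafficLight] = field(default_factory=list)
    controllers: List[Controller] = field(default_factory=list)
```

An importer for a particular package would walk such a NetworkDataMart and emit that package's native input files, so the mart itself stays independent of any one simulator's format.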
Abstract:
In response to the need to leverage private finance and the lack of competition in some parts of the Australian public sector major infrastructure market, especially in very large economic infrastructure procured using Public Private Partnerships, the Australian Federal government has demonstrated its desire to attract new sources of in-bound foreign direct investment (FDI) into the Australian construction market. This paper reports on progress towards an investigation into the determinants of multinational contractors’ willingness to bid for Australian public sector major infrastructure projects, designed to give an improved understanding of matters surrounding FDI into the Australian construction sector. This research deploys Dunning’s eclectic theory for the first time in terms of in-bound FDI by multinational contractors bidding, as head contractors, for Australian major infrastructure public sector projects. Elsewhere, the authors have developed Dunning’s principal hypothesis associated with his eclectic framework in order to suit the context of this research and to address a weakness in that hypothesis: it is based on a nominal approach to the factors in the eclectic framework and so fails to speak to the relative explanatory power of these factors. In this paper, an approach to reviewing and analysing secondary data, as part of the first-stage investigation in this research, is developed and some illustrations given, vis-à-vis the selected sector (roads, bridges and tunnels) in Australia (as the host location) and one of the selected home countries (Spain). In conclusion, some tentative thoughts are offered in anticipation of the completion of the first-stage investigation, concerning the extent to which this first stage, based on secondary data only, might suggest the relative importance of the factors in the eclectic framework. More robust conclusions are expected following the planned future stages of the research, which include primary data and are briefly outlined. Finally, beyond the theoretical contributions expected from the overall approach taken to developing and testing Dunning’s framework, other expected contributions concerning research method and practical implications are mentioned.
Abstract:
Researchers are increasingly involved in data-intensive research projects that cut across geographic and disciplinary borders. Quality research now often involves virtual communities of researchers participating in large-scale web-based collaborations, opening their early-stage research to the research community in order to encourage broader participation and accelerate discoveries. The result of such large-scale collaborations has been the production of ever-increasing amounts of data. In short, we are in the midst of a data deluge. Accompanying these developments has been a growing recognition that if the benefits of enhanced access to research are to be realised, it will be necessary to develop the systems and services that enable data to be managed and secured. It has also become apparent that to achieve seamless access to data it is necessary not only to adopt appropriate technical standards, practices and architecture, but also to develop legal frameworks that facilitate access to and use of research data. This chapter provides an overview of the current research landscape in Australia as it relates to the collection, management and sharing of research data. The chapter then explains the Australian legal regimes relevant to data, including copyright, patent, privacy, confidentiality and contract law. Finally, this chapter proposes the infrastructure elements that are required for the proper management of legal interests, ownership rights and rights to access and use data collected or generated by research projects.
Abstract:
This report provides an evaluation of the currently available evidence base for identification and surveillance of product-related injuries in children in Queensland. While the focal population was children in Queensland, the identification of information needs and data sources for product safety surveillance has applicability nationally for all age groups. The report firstly summarises the data needs of product safety regulators regarding product-related injury in children, describing the current sources of information informing product safety policy and practice, and documenting the priority product surveillance areas affecting children which have been a focus over recent years in Queensland. Health data sources in Queensland which have the potential to inform product safety surveillance initiatives were evaluated in terms of their ability to address the information needs of product safety regulators. Patterns in product-related injuries in children were analysed using routinely available health data to identify areas for future intervention, and the patterns in product-related injuries in children identified in health data were compared to those identified by product safety regulators. Recommendations were made for information system improvements and improved access to and utilisation of health data for more proactive approaches to product safety surveillance in the future.
Abstract:
Assurance of learning is a predominant feature in both quality enhancement and assurance in higher education. It is a process that articulates explicit program outcomes and standards, and systematically gathers evidence to determine the extent to which performance matches expectations. Benefits accrue to the institution through the systematic assessment of whole-of-program goals. Data may be used for continuous improvement, program development, and to inform external accreditation and evaluation bodies. Recent developments, including the introduction of the Tertiary Education and Quality Standards Agency (TEQSA), will require universities to review the methods they use to assure learning outcomes. This project investigates two critical elements of assurance of learning: 1. the mapping of graduate attributes throughout a program; and 2. the collection of assurance of learning data. An audit was conducted with 25 of the 39 Business Schools in Australian universities to identify current methods of mapping graduate attributes and of collecting assurance of learning data across degree programs, as well as to review the key challenges faced in these areas. Our findings indicate that external drivers such as professional body accreditation (for example, the Association to Advance Collegiate Schools of Business (AACSB)) and TEQSA are important motivators for assuring learning, and that those undertaking AACSB accreditation had more robust assurance of learning systems in place. It was reassuring to see that the majority of institutions (96%) had adopted an embedding approach to assuring learning rather than opting for independent standardised testing. The main challenges were the development of sustainable processes that were not considered a burden to academic staff, and obtaining academic buy-in to the benefits of assuring learning rather than its being seen as a tick-box exercise. This cultural change is the real challenge in assurance of learning practice.
Abstract:
Cold-formed steel stud walls are a major component of Light Steel Framing (LSF) building systems used in commercial, industrial and residential buildings. In conventional LSF stud wall systems, thin steel studs are protected from fire by placing one or two layers of plasterboard on both sides, with or without cavity insulation. However, there is very limited data about the structural and thermal performance of stud wall systems, and past research showed contradicting results, for example, about the benefits of cavity insulation. This research was therefore conducted to improve the knowledge and understanding of the structural and thermal performance of cold-formed steel stud wall systems (both load-bearing and non-load-bearing) under fire conditions and to develop new improved stud wall systems, including reliable and simple methods to predict their fire resistance rating. Full-scale fire tests of cold-formed steel stud wall systems formed the basis of this research. This research proposed an innovative LSF stud wall system in which a composite panel made of two plasterboards with insulation between them was used to improve the fire rating. Hence fire tests included both conventional steel stud walls, with and without cavity insulation, and the new composite panel system. A propane-fired gas furnace was specially designed and constructed first. The furnace was designed to deliver heat in accordance with the standard time-temperature curve proposed by AS 1530.4 (SA, 2005). A compression loading frame capable of loading the individual studs of a full-scale steel stud wall system was also designed and built for the load-bearing tests. Fire tests included comprehensive time-temperature measurements across the thickness and along the length of all the specimens using K-type thermocouples. They also included measurements of the load-deformation characteristics of stud walls until failure. The first phase of fire tests comprised 15 small-scale fire tests of gypsum plasterboards and composite panels using different types of insulating material of varying thickness and density. The fire performance of single and multiple layers of gypsum plasterboards was assessed, including the effect of interfaces between adjacent plasterboards on thermal performance. The effects of insulations such as glass fibre, rock fibre and cellulose fibre were also determined, and the tests provided important data on the temperature at which the fall-off of external plasterboards occurred. In the second phase, nine small-scale non-load-bearing wall specimens were tested to investigate the thermal performance of conventional and innovative steel stud wall systems. The effects of single and multiple layers of plasterboards, with and without vertical joints, were investigated. The new composite panels were seen to offer greater thermal protection to the studs than the conventional panels. In the third phase of fire tests, nine full-scale load-bearing wall specimens were tested to study the thermal and structural performance of the load-bearing wall assemblies. A full-scale test was also conducted at ambient temperature. These tests showed that the use of cavity insulation led to inferior fire performance of walls, and provided good explanations and supporting research data to overcome the incorrect industry assumptions about cavity insulation. They demonstrated that the use of insulation externally in a composite panel enhanced the thermal and structural performance of stud walls and increased their fire resistance rating significantly. Hence this research recommends the use of the new composite panel system for cold-formed LSF walls. This research also included steady-state tensile tests at ambient and elevated temperatures to address the lack of reliable mechanical properties for high-grade cold-formed steels at elevated temperatures. Suitable predictive equations were developed for calculating the yield strength and elastic modulus at elevated temperatures. In summary, this research has developed comprehensive experimental thermal and structural performance data for both the conventional and the proposed non-load-bearing and load-bearing stud wall systems under fire conditions. Idealised hot flange temperature profiles have been developed for non-insulated, cavity-insulated and externally insulated load-bearing wall models, along with suitable equations for predicting their failure times. A graphical method has also been proposed to predict the failure times (fire rating) of non-load-bearing and load-bearing walls under different load ratios. The results from this research are useful to both fire researchers and engineers working in this field. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF walls under fire conditions, and developed an innovative LSF wall system with increased fire rating. It has clearly demonstrated the detrimental effects of using cavity insulation, and has paved the way for Australian building industries to develop new wall panels with increased fire rating for commercial applications worldwide.
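The abstract does not reproduce the predictive equations themselves. In the fire engineering literature, elevated-temperature mechanical properties are commonly expressed as reduction factors relative to ambient values, so equations of the kind developed here would predict quantities such as the following (the notation is illustrative, not the thesis's own):

```latex
k_{y,T} = \frac{f_{y,T}}{f_{y,20}}, \qquad k_{E,T} = \frac{E_T}{E_{20}}
```

where $f_{y,T}$ and $E_T$ are the yield strength and elastic modulus at temperature $T$, $f_{y,20}$ and $E_{20}$ are their ambient-temperature counterparts, and the predictive equations would presumably express these factors as functions of $T$ for high-grade cold-formed steels.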
Abstract:
The relationship between the environment and human rights has long been recognised. It is now largely accepted that a ‘good’ environment is a necessary precondition for the enjoyment of a wide range of human rights, including the right to health, the right to an adequate standard of living, and even the right to life. It has even been suggested that as humans we all possess a right to live in an environment of a certain standard, based on the intrinsic value of the natural world to all human beings. In this context much has been written regarding the important role that the environment plays in human lives. This paper looks at the flip-side of this discussion, and examines what human rights can do for the environment. It is argued that, while there are valid criticisms of linking environmental protection too strongly to human needs, there is nonetheless much to be gained from using human rights law as a framework to achieve environmental protection.
Abstract:
In 2001, amendments to the Migration Act 1958 (Cth) made possible the offshore processing of protection claims. The same amendments also foreshadowed the processing of claims by ‘offshore entry persons’ in Australia according to non-statutory procedures. After disbanding offshore processing, the then Rudd Labor Government commenced processing protection claims by ‘offshore entry persons’ in Australia under the Refugee Status Assessment (RSA) process. The RSA process sought to replace well-established legislative criteria for the grant of a protection visa, as interpreted by the courts, with administrative guidelines and decision-making immune from judicial review. This approach was rejected by the High Court in the cases M61 and M69. This article analyses these developments in light of Australia’s international protection obligations, and considers the practical obstacles that continue to confront offshore entry persons as they pursue judicial review of adverse refugee status determinations after the High Court’s decision.
Abstract:
This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling. Some of the core components of data modelling are addressed. A selection of results from the first data modelling activity implemented during the second year (2010; second grade) of a current longitudinal study is reported. Data modelling involves investigations of meaningful phenomena, deciding what is worthy of attention (identifying complex attributes), and then progressing to organising, structuring, visualising, and representing data. Reported here are children's abilities to identify diverse and complex attributes, sort and classify data in different ways, and create and interpret models to represent their data.
Abstract:
Notwithstanding the obvious potential advantages of information and communications technology (ICT) in the enhanced provision of healthcare services, there are some concerns associated with integration of and access to electronic health records. A security violation in health records, such as an unauthorised disclosure or unauthorised alteration of an individual's health information, can significantly undermine both healthcare providers' and consumers' confidence and trust in e-health systems. A crisis in confidence in any national-level e-health system could seriously degrade the realisation of the system's potential benefits. In response to the privacy and security requirements for the protection of health information, this research project investigated national and international e-health development activities to identify the requirements for a trusted health information system architecture consistent with legislative and regulatory requirements and relevant health informatics standards. The research examined the appropriateness and sustainability of current approaches for the protection of health information. It then proposed an architecture to facilitate the viable and sustainable enforcement of privacy and security in health information systems, under the project title "Open and Trusted Health Information Systems (OTHIS)". OTHIS addresses the security controls needed to protect sensitive health information when such data is at rest, during processing and in transit, with three separate and achievable security function-based concepts and modules: a) Health Informatics Application Security (HIAS); b) Health Informatics Access Control (HIAC); and c) Health Informatics Network Security (HINS). The outcome of this research is a roadmap for a viable and sustainable architecture providing robust protection and security of health information, including elucidations of the three achievable security control subsystem requirements within the proposed architecture. The successful completion of two proof-of-concept prototypes demonstrated the comprehensibility, feasibility and practicality of the HIAC and HIAS models for the development and assessment of trusted health systems. Meanwhile, the OTHIS architecture has provided guidance for technical and security design appropriate to the development and implementation of trusted health information systems, while simultaneously offering guidance for ongoing research projects. The socio-economic implication of this research is that it embraces the need for low-cost security strategies that acknowledge economic realities, using open-source technologies for the overall test implementation. This allows the proposed architecture to be publicly accessible, providing a platform for interoperability to meet real-world application security demands. On the whole, the OTHIS architecture sets a high security standard for the establishment and maintenance of both current and future health information systems, thereby increasing healthcare providers' and consumers' trust in the adoption of electronic health records to realise the associated benefits.
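The abstract names the three OTHIS modules but not their interfaces. A minimal sketch of how such function-based controls might be composed is given below; the class and method names (check_access, encrypt/decrypt, send_secure, TrustedHealthRecordService) are illustrative assumptions, not the project's actual API.

```python
from abc import ABC, abstractmethod

# Illustrative decomposition of the three OTHIS security control concepts.
# All names and signatures are hypothetical; the thesis defines its own models.

class HIAC(ABC):
    """Health Informatics Access Control: who may do what to which record."""
    @abstractmethod
    def check_access(self, user_id: str, record_id: str, action: str) -> bool: ...

class HIAS(ABC):
    """Health Informatics Application Security: protect data at rest and in processing."""
    @abstractmethod
    def encrypt(self, record: bytes) -> bytes: ...
    @abstractmethod
    def decrypt(self, blob: bytes) -> bytes: ...

class HINS(ABC):
    """Health Informatics Network Security: protect data in transit."""
    @abstractmethod
    def send_secure(self, endpoint: str, payload: bytes) -> None: ...

class TrustedHealthRecordService:
    """Composes the three controls so every read passes an access check,
    storage is always encrypted, and transmission always goes through HINS."""
    def __init__(self, hiac: HIAC, hias: HIAS, hins: HINS, store: dict):
        self.hiac, self.hias, self.hins, self.store = hiac, hias, hins, store

    def share_record(self, user_id: str, record_id: str, endpoint: str) -> None:
        if not self.hiac.check_access(user_id, record_id, "read"):
            raise PermissionError(f"{user_id} may not read {record_id}")
        record = self.hias.decrypt(self.store[record_id])  # data at rest
        self.hins.send_secure(endpoint, record)            # data in transit
```

The point of such a decomposition is that each concern can be evaluated and replaced independently, which matches the abstract's emphasis on separate, achievable security function-based modules.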
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
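To make the expression-level modelling concrete, here is a minimal sketch of bit-level taint propagation through two expression elements. The representation (a per-bit taint mask on each value) follows the byte- and bit-level idea in the abstract, but the names and the two modelled operators are illustrative assumptions; the paper's library covers many more expression elements.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TaintedWord:
    """A machine word plus a per-bit taint mask (bit i set => bit i is classified)."""
    value: int
    taint: int  # same width as value

def bitand_const(x: TaintedWord, const: int) -> TaintedWord:
    # x & const: bits cleared by the constant cannot carry classified data,
    # so the taint mask is cleared there too.
    return TaintedWord(x.value & const, x.taint & const)

def shift_right(x: TaintedWord, n: int) -> TaintedWord:
    # A logical right shift moves data bits and their taint together.
    return TaintedWord(x.value >> n, x.taint >> n)

# Example: a 16-bit word whose high byte is classified.
secret = TaintedWord(value=0xAB3C, taint=0xFF00)

# Bit-level modelling shows this expression blocks all classified bits:
low_nibble = bitand_const(secret, 0x000F)
assert low_nibble.taint == 0      # safely downgraded: no classified bits remain

# Whereas extracting the high byte genuinely propagates classified data:
high_byte = shift_right(bitand_const(secret, 0xFF00), 8)
assert high_byte.taint == 0xFF    # still classified
```

A conventional analysis treating `secret & 0x000F` as atomic would report a classified flow into `low_nibble`; the bit-level model shows the constant mask blocks every classified bit, which is exactly the kind of false-positive path the approach eliminates.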
Abstract:
This paper argues for a renewed focus on statistical reasoning in the elementary school years, with opportunities for children to engage in data modeling. Data modeling involves investigations of meaningful phenomena, deciding what is worthy of attention, and then progressing to organizing, structuring, visualizing, and representing data. Reported here are some findings from a two-part activity (Baxter Brown’s Picnic and Planning a Picnic) implemented at the end of the second year of a current three-year longitudinal study (grade levels 1-3). Planning a Picnic was also implemented in a grade 7 class to provide an opportunity for the different age groups to share their products. Addressed here are the grade 2 children’s predictions for missing data in Baxter Brown’s Picnic, the questions posed and representations created by both grade levels in Planning a Picnic, and the metarepresentational competence displayed in the grade levels’ sharing of their products for Planning a Picnic.