894 results for Data Integrity
Abstract:
This paper describes a number of techniques for GNSS navigation message authentication. A detailed analysis of the security facilitated by navigation message authentication is given. The analysis takes into consideration the risks to critical applications that rely on GPS, including transportation, finance and telecommunication networks. We propose a number of cryptographic authentication schemes for navigation data authentication. These schemes provide authenticity and integrity of the navigation data to the receiver. Their performance is quantified through software simulation, which enables the collection of authentication performance data for different data channels and of the impact of the various schemes on the infrastructure and the receiver. Navigation message authentication schemes have been simulated at the proposed data rates of the Galileo and GPS services, and the resulting performance data are presented. The paper concludes with recommendations for optimal implementation of navigation message authentication for Galileo and next-generation GPS systems.
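As an illustration of the general idea only (not the specific schemes proposed in the paper), the sketch below signs a hypothetical navigation data frame with Ed25519 so that a receiver can check its authenticity and integrity; the frame contents, key handling and library choice are all assumptions.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical navigation subframe payload (illustrative bytes only).
nav_frame = bytes.fromhex("8b0cbf1234a5" * 7)

# Ground segment: sign the navigation data before broadcast.
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(nav_frame)

# Receiver: verify the broadcast frame against the published public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, nav_frame)
    print("navigation frame authenticated")
except InvalidSignature:
    print("authentication failed - discard frame")
```

In practice the signature bits must fit within the limited navigation data bandwidth, which is exactly the trade-off the simulated schemes at Galileo and GPS data rates explore.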
Abstract:
Over the last century, environmental and occupational medicine has played a significant role in the protection and improvement of public health. However, scientific integrity in this field has been increasingly threatened by pressure from some industries and governments. For example, it has been reported that the tobacco industry manipulated eminent scientists to legitimise their industrial positions, irresponsibly distorted risk and deliberately subverted scientific processes, and influenced many organisations in receipt of tobacco funding. Many environmental whistleblowers were sued and encountered numerous personal attacks. In some countries, scientific findings have been suppressed and distorted, and scientific advisory committees manipulated for political purposes by government agencies. How to respond to these threats is an important challenge for environmental and occupational medicine professionals and their societies. The authors recommend that professional organisations adopt a code of ethics that requires openness from public health professionals; that they not undertake research or use data where they do not have freedom to publish their results if these data have public health implications; that they disclose all possible conflicts; that the veracity of their research results should not be compromised; and that their research independence be protected through professional and legal support. The authors furthermore recommend that research funding for public health not be directly from the industry to the researcher. An independent, intermediate funding scheme should be established to ensure that there is no pressure to analyse data and publish results in bad faith. Such a funding system should also provide equal competition for funds and selection of the best proposals according to standard scientific criteria.
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
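As a rough, hypothetical illustration of the blocking behaviour described (not the authors' expression models or toolkit), the sketch below tracks byte-level taint through a bitwise AND, showing how a masking expression can provably stop classified bits from propagating and so let a false-positive flow path be discounted.

```python
def taint_and(value_taint: int, mask: int) -> int:
    """Bitwise AND can only pass taint where the constant mask has 1-bits."""
    return value_taint & mask

# Hypothetical 16-bit word whose high byte holds classified data.
classified_taint = 0xFF00   # taint bits mark the classified byte
expr_mask = 0x00FF          # expression under analysis: x & 0x00FF

residual = taint_and(classified_taint, expr_mask)
if residual == 0:
    print("expression blocks all classified bits - downgrade is safe")
else:
    print(f"classified bits may still flow: {residual:#06x}")
```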
Abstract:
The Council of Australian Governments (COAG) in 2003 gave in-principle approval to a best-practice report recommending a holistic approach to managing natural disasters in Australia, incorporating a move from a traditional response-centric approach to a greater focus on mitigation, recovery and resilience, with community well-being at the core. Since that time, there have been a range of complementary developments that have supported the COAG-recommended approach. These developments have been administrative, legislative and technological, both in reaction to the COAG initiative and in response to regular natural disasters. This paper reviews the characteristics of the spatial data that are becoming increasingly available across federal, state and regional jurisdictions, with respect to their fitness for purpose for disaster planning and mitigation and for strengthening community resilience. In particular, Queensland foundation spatial data, which are increasingly accessible to the public under the provisions of the Right to Information Act 2009, the Information Privacy Act 2009 and recent open data reform initiatives, are evaluated. The Fitzroy River catchment and floodplain is used as a case study for the review. The catchment covers an area of 142,545 km², the largest river catchment flowing to the eastern coast of Australia. The Fitzroy River basin experienced extensive flooding during the 2010–2011 Queensland floods. The basin is an area of important economic, environmental and heritage values and contains significant infrastructure critical for the mining and agricultural sectors, the two most important economic sectors for the State of Queensland. Consequently, the spatial datasets for this area play a critical role in disaster management and in protecting critical infrastructure essential for economic and community well-being. The foundation spatial datasets are assessed for disaster planning and mitigation purposes using data quality indicators such as resolution, accuracy, integrity, validity and audit trail.
Abstract:
The main theme of this thesis is to allow the users of cloud services to outsource their data without the need to trust the cloud provider. The method is based on combining existing proof-of-storage schemes with distance-bounding protocols. Specifically, cloud customers will be able to verify the confidentiality, integrity, availability, fairness (or mutual non-repudiation), data freshness, geographic assurance and replication of their stored data directly, without having to rely on the word of the cloud provider.
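For flavour only, here is a toy challenge-response check in the spirit of a proof-of-storage scheme combined with a crude timing-based distance bound; it is not the scheme developed in the thesis, and all names and parameters are illustrative.

```python
import hashlib, hmac, os, secrets, time

BLOCK_SIZE = 4096
key = os.urandom(32)

def block_tag(key: bytes, index: int, block: bytes) -> bytes:
    """Keyed tag over (index, block) kept by the customer before upload."""
    return hmac.new(key, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

# Customer side: tag a few random blocks of the file before outsourcing it.
data = os.urandom(BLOCK_SIZE * 16)                      # stand-in for the outsourced file
blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
challenge_indices = [secrets.randbelow(len(blocks)) for _ in range(3)]
tags = {i: block_tag(key, i, blocks[i]) for i in challenge_indices}

# Later: challenge the provider and time the response (toy distance bound).
start = time.perf_counter()
responses = {i: blocks[i] for i in challenge_indices}   # provider returns the blocks
elapsed = time.perf_counter() - start

ok = all(hmac.compare_digest(tags[i], block_tag(key, i, responses[i]))
         for i in challenge_indices)
print(f"integrity check {'passed' if ok else 'failed'} in {elapsed * 1e6:.0f} microseconds")
```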
Abstract:
Within the QUT Business School (QUTBS), researchers across economics, finance and accounting depend on data-driven research. They analyze historic and global financial data across a range of instruments to understand the relationships and effects between them as they respond to news and events in their region. Scholars and Higher Degree Research (HDR) students in turn seek out universities which offer these particular datasets to further their research. This involves downloading and manipulating large datasets, often with a focus on depth of detail, frequency and long-tail historical data. This is stock exchange data with potential commercial value, so the license for access tends to be very expensive. This poster reports the following findings:
• The library has a part to play in freeing researchers from the burden of negotiating subscriptions, fundraising and managing the legal requirements around licensing and access.
• The role of the library is to communicate the nature and potential of these complex resources across the university, to disciplines as diverse as Mathematics, Health, Information Systems and Creative Industries.
• The initiative has demonstrated clear, concrete support for research by QUT Library and built relationships into faculty. It has made data available to all researchers and attracted new HDR students. The aim is to reach the threshold of research outputs to submit into FOR Code 1502 (Banking, Finance and Investment) for ERA 2015.
• It is difficult to identify what subset of a dataset will be obtained, given somewhat vague price tiers.
• The integrity of the data is variable, as it is limited by the way it is collected; this occasionally raises issues for researchers (Cook, Campbell, & Kelly, 2012).
• Improved library understanding of the content of our products and the nature of finance-based research is a necessary part of the service.
Abstract:
This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each pair of laser scan-camera image acquired, the information at corners of the laser scan is compared with the content of the image, resulting in a likelihood of correspondence. The result of this process is then used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
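A minimal sketch of the projection step, with assumed camera intrinsics and laser-to-camera extrinsics rather than the calibrated, uncertainty-propagated values used in the work, shows how laser points are mapped into the image frame before the correspondence check.

```python
import numpy as np

# Assumed pinhole intrinsics and laser-to-camera extrinsics (illustrative values only).
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # rotation: laser frame -> camera frame
t = np.array([0.1, 0.0, 0.0])          # translation in metres

def project_laser_points(points_laser: np.ndarray) -> np.ndarray:
    """Project Nx3 laser points into pixel coordinates of the camera image."""
    points_cam = points_laser @ R.T + t
    pixels_h = points_cam @ K.T
    return pixels_h[:, :2] / pixels_h[:, 2:3]

scan = np.array([[1.0, 0.2, 5.0], [-0.5, 0.1, 4.0]])   # toy laser returns (x, y, z)
print(project_laser_points(scan))
```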
Abstract:
The quality of the data collection methods selected and the integrity of the data collected are integral to the success of a study. This chapter focuses on data collection and study validity. After reading the chapter, readers should be able to define types of data collection methods in quantitative research; list the advantages and disadvantages of each method; discuss factors related to internal and external validity; critically evaluate data collection methods; and discuss the need to operationalise variables of interest for data collection.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over 50-year-old phrase reflecting mistrust in computer systems, “garbage in, garbage out” or “GIGO”, describes problems of unqualified and unquestioning dependency on information systems. A more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results which are unverifiable. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
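As a small, hypothetical illustration of one piece of the audit problem (not a technology examined in the paper), the sketch below records a digest manifest for a set of input datasets so that later analysis can at least detect silent modification of the data it depends on.

```python
import hashlib, json

def manifest(datasets: dict[str, bytes]) -> dict[str, str]:
    """Record a SHA-256 digest per named dataset so later use can be audited."""
    return {name: hashlib.sha256(blob).hexdigest() for name, blob in datasets.items()}

def verify(datasets: dict[str, bytes], recorded: dict[str, str]) -> list[str]:
    """Return the names of datasets whose content no longer matches the manifest."""
    return [name for name, blob in datasets.items()
            if hashlib.sha256(blob).hexdigest() != recorded.get(name)]

# Hypothetical open and ERP extracts (names and contents are illustrative).
source = {"open_budget.csv": b"year,spend\n2013,100\n", "erp_extract.csv": b"id,total\n1,42\n"}
recorded = manifest(source)
print(json.dumps(recorded, indent=2))

source["open_budget.csv"] += b"2014,999\n"   # an unnoticed modification
print("tampered:", verify(source, recorded))
```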
Abstract:
This paper provides a three-layered framework to monitor the positioning performance requirements of Real-time Relative Positioning (RRP) systems of the Cooperative Intelligent Transport Systems (C-ITS) that support Cooperative Collision Warning (CCW) applications. These applications exploit state data of surrounding vehicles obtained solely from Global Positioning System (GPS) and Dedicated Short-Range Communications (DSRC) units, without using other sensors. To this end, the paper argues the need for GPS/DSRC-based RRP systems to have an autonomous monitoring mechanism, since the operation of CCW applications is meant to augment safety on roads, and the advantages of autonomous integrity monitoring are essential and integral to any safety-of-life system. The proposed autonomous integrity monitoring framework requires the RRP systems to detect or predict the unavailability of their sub-systems and of the integrity monitoring module itself and, where applicable, to account for the effects of data link delays and breakages of DSRC links, as well as of faulty measurement sources of GPS and/or integrated augmentation positioning systems, before the information used for safety warnings/alarms becomes unavailable, unreliable, inaccurate or misleading. Hence, a monitoring framework using a tight integration and correlation approach is proposed for instantaneous reliability assessment of the RRP systems. Ultimately, using the proposed framework, the RRP systems will provide timely alerts to users when the RRP solutions cannot be trusted or used for the intended operation.
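To make the alerting idea concrete, the following toy check flags a relative-positioning fix as unusable when its error bound exceeds an alert limit or its DSRC data are stale; the thresholds and data structure are assumptions for illustration, not the paper's framework.

```python
from dataclasses import dataclass

@dataclass
class RelativeFix:
    range_m: float              # estimated inter-vehicle range
    protection_level_m: float   # bound on the range error from the estimator
    age_s: float                # age of the last DSRC update

ALERT_LIMIT_M = 3.0    # assumed alert limit for a collision-warning function
MAX_LATENCY_S = 0.3    # assumed tolerable data-link delay

def integrity_status(fix: RelativeFix) -> str:
    """Flag a fix as unusable when it is stale or its error bound exceeds the alert limit."""
    if fix.age_s > MAX_LATENCY_S:
        return "unavailable: DSRC data too old"
    if fix.protection_level_m > ALERT_LIMIT_M:
        return "do not use: error bound exceeds alert limit"
    return "ok"

print(integrity_status(RelativeFix(range_m=25.0, protection_level_m=1.2, age_s=0.1)))
print(integrity_status(RelativeFix(range_m=25.0, protection_level_m=4.8, age_s=0.1)))
```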
Abstract:
The effects of reductions in cell wall lignin content, achieved by RNA interference suppression of coumaroyl 3'-hydroxylase, on plant growth, water transport, gas exchange and photosynthesis were evaluated in hybrid poplar trees (Populus alba × grandidentata). The growth characteristics of the reduced-lignin trees were significantly impaired, resulting in smaller stems and reduced root biomass compared with wild-type trees, as well as altered leaf morphology and architecture. The severe inhibition of cell wall lignification produced trees with a collapsed xylem phenotype and compromised vascular integrity; these trees displayed reduced hydraulic conductivity and a greater susceptibility to wall failure and cavitation. In the reduced-lignin trees, photosynthetic carbon assimilation and stomatal conductance were also greatly reduced; however, shoot xylem pressure potential and carbon isotope discrimination were higher and water-use efficiency was lower, inconsistent with water stress. Reductions in assimilation rate could not be ascribed to increased stomatal limitation. Analysis of starch and soluble sugars in leaves revealed that photosynthate was accumulating to high levels, suggesting that the trees with substantially reduced cell wall lignin were not carbon limited and that reductions in sink strength were, instead, limiting photosynthesis.
Abstract:
The first consideration of any Australian Human Research Ethics Committee should be to satisfy itself that the project before it is worth undertaking. If the project does not add to the body of knowledge, or if it does not improve social welfare or individual wellbeing, then the use of human participants, their tissue or their data must be questioned. Sometimes, however, committees are criticised for appearing to adopt the role of scientific review committees. The intent of this paper is to provide researchers with an understanding of the ethical importance of demonstrating the merit of their research project and to help them develop protocols that show ethics committees that adequate attention has been paid to this central tenet of dealing ethically with human research participants. Any person proposing human research must be prepared to show that it is worthwhile. This paper clarifies the relationship between research merit and integrity, research ethics and the responsibilities of human research ethics committees.
Abstract:
This paper presents a multi-criteria based approach for nondestructive diagnostic structural integrity assessment of a decommissioned flatbed rail wagon (FBRW) used for road bridge superstructure rehabilitation and replacement applications. First, full-scale vibration and static test data sets are employed in a finite element (FE) model of the FBRW to obtain the best ‘initial’ estimate of the model parameters. Second, the ‘final’ model parameters are predicted using sensitivity-based perturbation analysis without significant difficulty. The updated FBRW model is then validated using independent sets of full-scale laboratory static test data. Finally, the updated and validated FE model of the FBRW is used for structural integrity assessment of a single-lane FBRW bridge subjected to the Australian bridge design traffic load.
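For orientation, sensitivity-based updating can be sketched as an iterative least-squares adjustment of model parameters against measured responses; the toy frequency model and values below are assumptions and bear no relation to the actual FBRW finite element model.

```python
import numpy as np

measured = np.array([4.2, 11.8, 19.5])   # "measured" natural frequencies, Hz (illustrative)

def model_frequencies(theta: np.ndarray) -> np.ndarray:
    """Hypothetical model: each frequency scales with the square root of a stiffness factor."""
    base = np.array([4.0, 11.0, 18.0])
    return base * np.sqrt(theta)

theta = np.ones(3)   # initial parameter estimates
for _ in range(10):
    residual = measured - model_frequencies(theta)
    eps = 1e-6
    # Finite-difference sensitivity matrix d f / d theta
    S = np.column_stack([
        (model_frequencies(theta + eps * np.eye(3)[j]) - model_frequencies(theta)) / eps
        for j in range(3)])
    theta += np.linalg.lstsq(S, residual, rcond=None)[0]

print("updated parameters:", np.round(theta, 3))
```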
Abstract:
Decision-making is such an integral aspect of health care routine that the ability to make the right decisions at crucial moments can lead to patient health improvements. Evidence-based practice (EBP), the paradigm used to make those informed decisions, relies on the use of current best evidence from systematic research such as randomized controlled trials (RCTs). Limitations of the outcomes of RCTs, such as the “quantity” and “quality” of the evidence generated, have lowered healthcare professionals’ confidence in using EBP. An alternative paradigm, Practice-Based Evidence, has evolved, the key being evidence drawn from practice settings. Through the use of health information technology, electronic health records (EHRs) capture relevant clinical practice “evidence”. A data-driven approach is proposed to capitalize on the benefits of EHRs. The issues of data privacy, security and integrity are mitigated by an information accountability concept. A data warehouse architecture completes the data-driven approach by integrating health data from multi-source systems, unique within the healthcare environment.
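As a hypothetical sketch of the multi-source integration idea (not the architecture proposed in the paper), the snippet below loads observations from two assumed source systems into a single warehouse table with a provenance column, which is one simple hook for information accountability.

```python
import sqlite3

# Toy warehouse integration: load observations from two assumed source systems
# into one fact table, keeping a provenance column so use of the data stays auditable.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE fact_observation (
    patient_id TEXT, code TEXT, value REAL, source_system TEXT, loaded_at TEXT)""")

ehr_rows = [("p01", "BP_SYS", 142.0), ("p02", "BP_SYS", 118.0)]
lab_rows = [("p01", "HBA1C", 7.9)]

for source, rows in (("ehr_core", ehr_rows), ("lab_feed", lab_rows)):
    con.executemany(
        "INSERT INTO fact_observation VALUES (?, ?, ?, ?, datetime('now'))",
        [(pid, code, value, source) for pid, code, value in rows])

for row in con.execute("SELECT * FROM fact_observation ORDER BY patient_id"):
    print(row)
```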
Abstract:
Purpose: To test an interventional patient skin integrity bundle, the InSPiRE protocol, on the incidence of pressure injuries (PrIs) in critically ill patients in an Australian adult intensive care unit (ICU). Methods: A before-and-after design was used in which the group of patients receiving the intervention (InSPiRE protocol) was compared with a similar control group who received standard care. Data collected included demographic and clinical variables, skin assessment, PrI presence and stage, and the Sequential Organ Failure Assessment (SOFA) score. Results: Overall, 207 patients were enrolled, 105 in the intervention group and 102 in the control group. Most patients were men, with a mean age of 55. The groups were similar on major demographic variables (age, SOFA scores, ICU length of stay). Pressure injury cumulative incidence was significantly lower in the intervention group (18%) than in the control group (30.4%) for skin injuries (χ²=4.271, df=1, p=0.039) and for mucous membrane injuries (t=3.27, p<0.001). Significantly fewer PrIs developed over time in the intervention group (log-rank=11.842, df=1, p<0.001), and intervention patients developed fewer skin injuries (>3 PrIs/patient: 1/105) than control patients (>3 injuries/patient: 10/102) (p=0.018). Conclusion: The intervention group, receiving the InSPiRE protocol, had lower PrI cumulative incidence and a reduced number and severity of PrIs developing over time. Systematic and ongoing assessment of the patient's skin and PrI risk, as well as implementation of tailored prevention measures, are central to preventing PrIs.
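For orientation, the reported skin-injury comparison can be reproduced approximately with a Pearson chi-square test; the counts below (19 of 105 vs 31 of 102) are inferred from the stated percentages and are therefore an assumption, and no continuity correction is applied.

```python
from scipy.stats import chi2_contingency

# Counts inferred from the reported cumulative incidence (an assumption):
# intervention 18% of 105 is roughly 19 patients, control 30.4% of 102 is roughly 31 patients.
table = [[19, 105 - 19],
         [31, 102 - 31]]

chi2, p, df, _ = chi2_contingency(table, correction=False)
print(f"chi2={chi2:.3f}, df={df}, p={p:.3f}")   # approximately chi2=4.271, df=1, p=0.039
```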