144 results for information flow properties
at Queensland University of Technology - ePrints Archive
Abstract:
Security-critical communications devices must be evaluated to the highest possible standards before they can be deployed. This process includes tracing potential information flow through the device's electronic circuitry, for each of the device's operating modes. Increasingly, however, security functionality is being entrusted to embedded software running on microprocessors within such devices, so new strategies are needed for integrating information flow analyses of embedded program code with hardware analyses. Here we show how standard compiler principles can augment high-integrity security evaluations to allow seamless tracing of information flow through both the hardware and software of embedded systems. This is done by unifying input/output statements in embedded program execution paths with the hardware pins they access, and by associating significant software states with corresponding operating modes of the surrounding electronic circuitry.
Abstract:
In this paper we present a model for defining and enforcing a fine-grained information flow policy. We describe how the policy can be enforced on a typical computer and present experiments using the proposed model. A key feature of the model is that it allows the expression of rules which detail precisely which information elements are allowed to mix together. For example, the model allows the expression of a policy which forbids a doctor from mixing the personal medical details of different patients. The enforcement mechanism tracks and records information flows within the system so that dynamic changes to the policy can be made with respect to information elements which may have propagated to different locations in the system.
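The mixing rules described in this abstract can be sketched as a label-tracking enforcement mechanism. The class, method names and policy below are illustrative assumptions, not the paper's actual model:

```python
# A minimal sketch of fine-grained information-flow enforcement, assuming a
# label-based model: each location accumulates labels of the information
# elements that have flowed into it, and the policy forbids certain mixes.

class FlowTracker:
    """Tracks which information elements have propagated to each location."""

    def __init__(self, forbidden_pairs):
        # Pairs of element labels that the policy forbids mixing together.
        self.forbidden = {frozenset(p) for p in forbidden_pairs}
        self.labels = {}  # location -> set of element labels seen there

    def flow(self, src_labels, dest):
        """Propagate labels into dest; raise if the mix violates the policy."""
        merged = self.labels.get(dest, set()) | set(src_labels)
        for a in merged:
            for b in merged:
                if a != b and frozenset((a, b)) in self.forbidden:
                    raise PermissionError(f"policy forbids mixing {a} and {b}")
        self.labels[dest] = merged

# Example: details of two patients must never mix in the same document.
tracker = FlowTracker([("patient_A", "patient_B")])
tracker.flow({"patient_A"}, "report.doc")      # allowed
try:
    tracker.flow({"patient_B"}, "report.doc")  # violates the policy
except PermissionError as e:
    print(e)
```

Because flows are recorded per location, a later policy change can be checked against everywhere an element has already propagated, which is the dynamic-update property the abstract highlights.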
Abstract:
The preparation of macroporous methacrylate monolithic material with controlled pore structures can be carried out in an unstirred mould through careful and precise control of the polymerisation kinetics and parameters. Contemporary synthesis conditions for methacrylate monolithic polymers are based on existing polymerisation schemes without an in-depth understanding of the dynamics of pore structure and formation. This leads to poor performance in polymer usage, thereby affecting final product recovery and purity, retention time, productivity and process economics. The unique porosity of methacrylate monolithic polymer, which propels its usage in many industrial applications, can be controlled easily during its preparation. Control of the kinetics of the overall process through changes in reaction time, temperature and overall composition, such as cross-linker and initiator contents, allows fine tuning of the macroporous structure and provides an understanding of the mechanism of pore formation within the unstirred mould. The significant effect of temperature on the reaction kinetics serves as an effectual means to control and optimise the pore structure, and allows the preparation of polymers with different pore size distributions from the same composition of the polymerisation mixture. Increasing the concentration of the cross-linking monomer affects the composition of the final monoliths and also decreases the average pore size, as a result of premature formation of highly cross-linked globules with a reduced propensity to coalesce. The choice and concentration of porogen solvent is also imperative: different porogens and porogen mixtures produce different pore structures. For example, larger pores are obtained in a poor solvent due to early phase separation.
Abstract:
Based on unique news data relating to gold and crude oil, we investigate how news volume and sentiment, shocks in trading activity, market depth and trader positions unrelated to information flow covary with realized volatility. Positive shocks to the rate of news arrival, and negative shocks to news sentiment, exhibit the largest effects. After controlling for the level of news flow and cross-correlations, net trader positions play only a minor role. These findings are at odds with those of [Wang (2002a). The Journal of Futures Markets, 22, 427–450; Wang (2002b). The Financial Review, 37, 295–316], but are consistent with the previous literature, which does not find a strong link between volatility and trader positions.
Abstract:
This paper discusses the areawide Dynamic ROad traffic NoisE (DRONE) simulator, and its implementation as a tool for noise abatement policy evaluation. DRONE involves integrating a road traffic noise estimation model with a traffic simulator to estimate road traffic noise in urban networks. An integrated traffic simulation-noise estimation model provides an interface for direct input of traffic flow properties from simulation model to noise estimation model that in turn estimates the noise on a spatial and temporal scale. The output from DRONE is linked with a geographical information system for visual representation of noise levels in the form of noise contour maps.
Abstract:
Language Modeling (LM) has been successfully applied to Information Retrieval (IR). However, most of the existing LM approaches only rely on term occurrences in documents, queries and document collections. In traditional unigram-based models, terms (or words) are usually considered to be independent. In some recent studies, dependence models have been proposed to incorporate term relationships into LM, so that links can be created between words in the same sentence, and term relationships (e.g. synonymy) can be used to expand the document model. In this study, we further extend this family of dependence models in the following two ways: (1) term relationships are used to expand the query model instead of the document model, so that the query expansion process can be implemented naturally; (2) we exploit more sophisticated inferential relationships extracted with Information Flow (IF). Information flow relationships are not simply pairwise term relationships as those used in previous studies, but hold between a set of terms and another term. This allows for context-dependent query expansion. Our experiments conducted on TREC collections show that we can obtain large and significant improvements with our approach. This study shows that LM is an appropriate framework in which to implement effective query expansion.
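The set-of-terms-to-term shape of IF relationships can be illustrated with a toy expansion routine. The relation below is hand-made for illustration; real IF relationships are derived statistically from corpora, and the names and cutoff are assumptions:

```python
# Toy context-dependent query expansion: an "information flow" relation maps
# a *set* of premise terms to related terms, so the same word expands
# differently depending on the other terms in the query.

IF_RELATIONS = {
    frozenset({"java"}): ["programming"],
    frozenset({"java", "island"}): ["indonesia"],  # context disambiguates
    frozenset({"java", "coffee"}): ["beverage"],
}

def expand(query_terms, k=2):
    """Add up to k terms whose IF premise set is contained in the query."""
    expanded = list(query_terms)
    for premise, conclusions in IF_RELATIONS.items():
        if premise <= set(query_terms):
            expanded.extend(c for c in conclusions if c not in expanded)
    return expanded[: len(query_terms) + k]

print(expand(["java", "island"]))  # -> ['java', 'island', 'programming', 'indonesia']
```

A pairwise relation could only ever map "java" to one fixed set of expansions; conditioning on the whole premise set is what makes the expansion context-dependent.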
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
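The flow-tracing idea behind such metrics can be illustrated with a toy taint-propagation count. The function, program representation and metric below are hypothetical simplifications for illustration, not the thesis's actual metric definitions:

```python
# Illustrative data-flow metric: count assignments through which information
# from high-security variables can reach non-high-security variables,
# propagating taint transitively along the way.

def insecure_flow_count(assignments, high_vars):
    """assignments: ordered list of (target, sources) pairs.
    Returns the number of assignments where a (transitively) high-security
    source flows into a target that is not itself high-security."""
    tainted = set(high_vars)
    count = 0
    for target, sources in assignments:
        if tainted & set(sources):
            if target not in high_vars:
                count += 1
            tainted.add(target)  # taint propagates to the target
    return count

program = [
    ("tmp", ["secret"]),  # secret -> tmp (low security): insecure
    ("out", ["tmp"]),     # tmp is now tainted -> out: insecure
    ("log", ["public"]),  # public data only: fine
]
print(insecure_flow_count(program, {"secret"}))  # -> 2
```

A lower count for one of two functionally equivalent designs suggests less opportunity for high-security data to leak, which is the kind of relative comparison the abstract describes.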
Abstract:
Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
Abstract:
This chapter presents the preliminary findings of a qualitative study exploring people’s information experiences during the 2012 Queensland State election in Australia. Six residents of South East Queensland who were eligible to vote in the state election participated in a semi-structured interview. The interviews revealed five themes that depict participants’ information experience during the election: information sources, information flow, personal politics, party politics and sense making. Together these themes represent what is experienced as information, how information is experienced, as well as contextual aspects that were unique to voting in an election. The study outlined here is one in an emerging area of enquiry that has explored information experience as a research object. This study has revealed that people’s information experiences are rich, complex and dynamic, and that information experience as a construct of scholarly inquiry provides deep insights into the ways in which people relate to their information worlds. More studies exploring information experience within different contexts are needed to help develop our theoretical understanding of this important and emerging construct.
Abstract:
The overall aim of this research project was to provide a broader range of value propositions (beyond upfront traditional construction costs) that could transform both the demand side and supply side of the housing industry. The project involved gathering information about how building information is created, used and communicated, and classifying building information, leading to the formation of an Information Flow Chart and Stakeholder Relationship Map. These were then tested via broad housing industry focus groups and surveys. The project revealed four key relationships that appear to operate in isolation from the rest of the housing sector and may have significant impact on the sustainability outcomes and life cycle costs of dwellings over their life cycle. It also found that although a lot of information about individual dwellings does already exist, this information is not coordinated or inventoried in any systematic manner, and that national building information files or building passports would present value to a wide range of stakeholders.
Abstract:
Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this perspective. These were drawn from a detailed literature survey. A research framework for analysis of the interaction between lean and BIM was then compiled. The goal of the framework is to both guide and stimulate research; as such, the approach adopted up to this point is constructive. Ongoing research has identified 55 such interactions, the majority of which show positive synergy between the two.
Abstract:
In some delay-tolerant communication systems, such as vehicular ad-hoc networks, information flow can be represented as an infectious process, where each entity having already received the information will try to share it with its neighbours. The random walk and random waypoint models are popular analysis tools for these epidemic broadcasts, and represent two types of random mobility. In this paper, we introduce a simulation framework investigating the impact of a gradual increase of bias in path selection (i.e. a reduction of randomness) when moving from the former to the latter. Randomness in path selection can significantly alter system performance, in both regular and irregular network structures. The implications of these results for real systems are discussed in detail.
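An epidemic broadcast under random-walk mobility, the unbiased end of the spectrum studied above, can be sketched as follows. The torus grid, co-location transmission rule and all parameters are illustrative assumptions, not the paper's framework:

```python
# Minimal SI-style epidemic broadcast: nodes perform random walks on a torus
# grid, and any informed node shares the information with co-located nodes.
import random

def simulate(n_nodes=30, size=10, steps=200, seed=1):
    rng = random.Random(seed)
    pos = [(rng.randrange(size), rng.randrange(size)) for _ in range(n_nodes)]
    informed = {0}  # node 0 starts with the information
    for _ in range(steps):
        # Each node takes one unbiased random-walk step on the torus.
        pos = [((x + rng.choice((-1, 0, 1))) % size,
                (y + rng.choice((-1, 0, 1))) % size) for x, y in pos]
        # Informed nodes infect any node sharing their cell.
        for i in list(informed):
            for j in range(n_nodes):
                if j not in informed and pos[j] == pos[i]:
                    informed.add(j)
    return len(informed)

print(simulate())  # number of informed nodes after the walk
```

Biasing the step choice (e.g. weighting moves toward a waypoint instead of the uniform `rng.choice`) is where a framework like the paper's would interpolate between random walk and random waypoint mobility.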
Abstract:
In this research we modelled computer network devices to ensure their communication behaviours meet various network standards. By modelling devices as finite-state machines and examining their properties in a range of configurations, we discovered a flaw in a common network protocol and produced a technique to improve organisations' network security against data theft.
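Modelling a device as a finite-state machine and checking event traces against it can be sketched as follows. The protocol, states and events here are entirely hypothetical; the research models real network standards:

```python
# Toy finite-state model of a request/response protocol, used to check that
# no admissible trace of events drives the device into a bad state.

TRANSITIONS = {
    ("idle", "connect"): "open",
    ("open", "request"): "waiting",
    ("waiting", "response"): "open",
    ("open", "close"): "idle",
    ("waiting", "close"): "leak",  # closing mid-request loses the reply
}

def run(trace):
    """Replay a trace of events from the initial state."""
    state = "idle"
    for event in trace:
        state = TRANSITIONS.get((state, event), state)  # ignore invalid events
    return state

assert run(["connect", "request", "response", "close"]) == "idle"
assert run(["connect", "request", "close"]) == "leak"  # flawed interleaving
print("model checked")
```

Exhaustively exploring such a machine's reachable states under every configuration is how a protocol flaw like the one reported in the abstract can be surfaced before deployment.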
Abstract:
Background: Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited. Purpose: The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics; (ii) identify gaps in information exchange in the handover process and analyze implications for resident safety; and (iii) develop practical recommendations on how information communication technology (ICT) can improve the process and resident safety. Methods: The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted over the period from July to October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book that was developed prior to the analysis. Results: Three major sub-processes were identified and mapped: Handover Process (HOP) I, "Information gathering by RN"; HOP II, "Preparation of preliminary handover sheet"; and HOP III, "Execution of handover meeting". Inefficient processes were identified in relation to the handover, including duplication of information, utilization of multiple communication modes and information sources, and lack of standardization. Conclusion: By providing a robust process model of handover, this study has made two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care.
The mapping of this process enabled analysis of gaps in information flow and their potential impacts on resident safety. In addition, it offers the basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, remains under-researched.