908 results for information flow
Abstract:
With the increasing threat of cyber and other attacks on critical infrastructure, governments throughout the world have been organizing industry to share information on possible threats. In Australia, the Office of the Attorney General has formed Trusted Information Sharing Networks (TISNs) for critical industries such as banking and electricity. Currently, the majority of information in a TISN is shared at physical meetings, which have clear limitations when it comes to meeting cyber threats. Many of these limitations can be overcome by creating a virtual information sharing network (VISN). However, designing a VISN poses many challenges, from both a policy and a technical viewpoint. We shall discuss some of these challenges in this talk.
Abstract:
A statistical modeling method to accurately determine combustion chamber resonance is proposed and demonstrated. The method uses Markov chain Monte Carlo (MCMC), via the Metropolis-Hastings (MH) algorithm, to yield a probability density function for the combustion chamber frequency and to find the best estimate of the resonant frequency along with its uncertainty. The accurate determination of combustion chamber resonance is then used to investigate various engine phenomena, with appropriate uncertainty, over a range of engine cycles. It is shown that, when operating on various ethanol/diesel fuel combinations, a 20% substitution yields the least inter-cycle variability in combustion chamber resonance.
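The abstract names the Metropolis-Hastings algorithm but gives no implementation details. As a minimal sketch, the following random-walk MH sampler recovers a posterior for a single resonant-frequency parameter; the observations, noise level, and flat prior are hypothetical stand-ins, not the paper's model.

```python
import math
import random

def metropolis_hastings(log_post, x0, step, n_samples, seed=0):
    """Generic random-walk Metropolis-Hastings sampler for one scalar parameter."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)            # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Hypothetical data: noisy per-cycle estimates of a resonant frequency (Hz)
obs = [4920.0, 4935.0, 4911.0, 4950.0, 4928.0]
sigma = 20.0  # assumed measurement noise (Hz)

def log_post(f):
    # Flat prior; independent Gaussian likelihood around each observation
    return -sum((y - f) ** 2 for y in obs) / (2 * sigma ** 2)

draws = metropolis_hastings(log_post, x0=4900.0, step=10.0, n_samples=5000)
burned = draws[2000:]                      # discard burn-in
est = sum(burned) / len(burned)            # posterior-mean frequency estimate
```

The retained draws form the probability density function the abstract refers to: their mean is the best estimate and their spread quantifies its uncertainty.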
Abstract:
Creative processes, for instance the development of visual effects or computer games, are increasingly becoming part of the agenda of information systems researchers and practitioners. Such processes are managerially challenging because they comprise both well-structured, transactional parts and creative parts. The latter often cannot be precisely specified in terms of control flow, required resources, and outcome. The processes' high uncertainty sets boundaries for the application of traditional business process management concepts, such as process automation, process modeling, process performance measurement, and risk management. Organizations must thus exercise caution when managing creative processes and supporting them with information technology. This, in turn, requires a profound understanding of the concept of creativity in business processes. In response, the present paper introduces a framework for conceptualizing creativity within business processes. The conceptual framework describes three types of uncertainty and constraints, as well as the interrelationships among them. The study is grounded in the findings of three case studies conducted in the film and visual effects industry. Moreover, we provide initial evidence for the framework's validity beyond this narrow focus. The framework is intended to serve as a sensitizing device that can guide further information systems research on creativity-related phenomena.
Abstract:
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon, and shows one set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is some reason to believe that the human mental lexicon displays entanglement.
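As a toy illustration of what "non-separable" means here: a concept combination is separable when its joint sense probabilities factorize into the product of the marginals. The sense labels and probabilities below are invented for illustration and are not the paper's experimental data.

```python
# Toy (non-)separability check for a two-word concept combination.
# Joint probabilities of sense interpretations -- illustrative values only.
joint = {                      # P(sense_A, sense_B) for a combination like "boxer bat"
    ("animal", "animal"): 0.45,
    ("animal", "sport"):  0.05,
    ("sport",  "animal"): 0.05,
    ("sport",  "sport"):  0.45,
}

# Marginal sense probabilities for each word
pA = {a: sum(p for (x, _), p in joint.items() if x == a) for a in {"animal", "sport"}}
pB = {b: sum(p for (_, y), p in joint.items() if y == b) for b in {"animal", "sport"}}

# A separable (product) state would satisfy P(a, b) = P(a) * P(b) for all pairs.
deviation = max(abs(p - pA[a] * pB[b]) for (a, b), p in joint.items())
separable = deviation < 1e-9
```

Here the senses of the two words are strongly correlated, so the joint distribution cannot be written as a product of marginals, which is the statistical signature the abstract describes.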
Abstract:
We have developed a bioreactor vessel design which has the advantages of simplicity and ease of assembly and disassembly and, with an appropriately determined flow rate, even allows a scaffold to be suspended freely regardless of its weight. This article reports our experimental and numerical investigations to evaluate the performance of this newly developed non-perfusion conical bioreactor by visualizing the flow through scaffolds with 45° and 90° fiber lay-down patterns. The experiments were conducted at Reynolds numbers (Re) of 121, 170, and 218, based on the local velocity and the width of the scaffolds. The flow fields were captured using short-time exposures of 60 µm particles suspended in the bioreactor and illuminated by a thin laser sheet. The effects of scaffold fiber lay-down pattern and Reynolds number were obtained and compared to results from a computational fluid dynamics (CFD) software package. The objectives of this article are twofold: first, to investigate the hypothesis that there may be insufficient exchange of medium within the interior of the scaffold when using our non-perfusion bioreactor; and second, to compare the flows within and around scaffolds with 45° and 90° fiber lay-down patterns. Scaffold porosity was also found to influence flow patterns. It was shown that fluidic transport can be achieved within scaffolds using our bioreactor design, despite it being a non-perfusion vessel. Fluid velocities were generally of the same order of magnitude as, or one order lower than, the inlet flow velocity. Additionally, the 90° fiber lay-down pattern scaffold allowed slightly higher fluid velocities within it than the 45° pattern scaffold, owing to its architecture and pore arrangement, which allow fluid to flow directly through (channel-like flow).
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts, that is, variation over and above that accounted for by the Poisson density. This extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, which is tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed-dispersion-parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine whether their findings hold for rural and other intersection types, to corroborate those findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord by exploring additional dispersion functions and using an independent data set, presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major- and minor-road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics.
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that the known influences on expected crash counts are likely to differ from the factors that might help to explain unaccounted-for variation in crashes across sites.
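The competing specifications can be sketched as a negative binomial (NB2) model in which the dispersion parameter is itself a log-linear function of covariates; setting the covariate coefficients in the dispersion function to zero recovers the conventional fixed-dispersion model. All coefficients and data values below are hypothetical, chosen only to show the structure being compared.

```python
import math

def nb_loglik(y, mu, alpha):
    """Negative binomial (NB2) log-likelihood for one site: Var = mu + alpha*mu^2."""
    r = 1.0 / alpha
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))

def site_loglik(y, aadt_major, aadt_minor, beta, gamma):
    """Mean and dispersion both modeled from traffic flows (hypothetical form)."""
    # Log-linear mean structure: expected crash count from major/minor road AADT
    mu = math.exp(beta[0] + beta[1] * math.log(aadt_major)
                  + beta[2] * math.log(aadt_minor))
    # Varying-dispersion structure: alpha is a log-linear function of the flows;
    # gamma = (g0, 0, 0) collapses this to the usual fixed-dispersion model
    alpha = math.exp(gamma[0] + gamma[1] * math.log(aadt_major)
                     + gamma[2] * math.log(aadt_minor))
    return nb_loglik(y, mu, alpha)

# Illustrative values only, not estimates from the Georgia data
beta = (-8.0, 0.6, 0.4)
gamma = (-0.5, 0.0, 0.0)   # fixed dispersion: alpha = exp(-0.5) at every site
ll = site_loglik(y=3, aadt_major=12000, aadt_minor=1500, beta=beta, gamma=gamma)
```

Testing whether the elements of `gamma` beyond the intercept are significantly different from zero is, in this sketch, the "testing extra-variation functions for significance" that the findings recommend.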
Abstract:
System analysis within the traction power system is vital to the design and operation of an electrified railway. Loads in traction power systems are often characterised by their mobility, wide range of power variation, regeneration, and service dependence. In addition, the feeding systems may take different forms in AC electrified railways. Comprehensive system studies are usually carried out by computer simulation. A number of traction power simulators are available; they allow calculation of the electrical interaction among trains and deterministic solutions of the power network. In this paper, a different approach is presented that enables load-flow analysis for various feeding systems and service demands in AC railways by adopting probabilistic techniques. It is intended to provide a different viewpoint on the load condition. Simulation results are given to verify the probabilistic load-flow models.
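The paper's formulation is not reproduced here, but the probabilistic idea can be sketched, under heavy simplifying assumptions, as Monte Carlo sampling of train demand on a single feeder section: each trial draws a number of trains and their traction or regenerating power, and the resulting voltage-drop distribution replaces a single deterministic answer. All electrical figures below are assumed for illustration.

```python
import random

def mc_load_flow(n_trials, seed=1):
    """Monte Carlo probabilistic load-flow sketch for one AC feeder section.
    Train demand is treated as random; all figures are hypothetical."""
    rng = random.Random(seed)
    V_s = 25_000.0          # substation voltage (V); 25 kV AC is a common standard
    Z = 0.4                 # lumped feeder impedance magnitude (ohm), assumed
    drops = []
    for _ in range(n_trials):
        n_trains = rng.randint(1, 4)                 # trains in the section
        # Each train: mostly a traction load, occasionally regenerating
        P = sum(rng.uniform(1e6, 5e6) if rng.random() > 0.2
                else -rng.uniform(0.5e6, 2e6)        # regeneration feeds power back
                for _ in range(n_trains))
        I = P / V_s                                  # crude current estimate (A)
        drops.append(abs(I) * Z)                     # voltage-drop magnitude (V)
    drops.sort()
    return {"mean": sum(drops) / len(drops),
            "p95": drops[int(0.95 * len(drops))]}    # 95th-percentile drop

stats = mc_load_flow(20_000)
```

The output is a distribution of feeder voltage drop rather than one operating point, which is the kind of probabilistic view of the load condition the paper argues for.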
Abstract:
Fuzzy logic has been applied to control traffic at road junctions. A simple controller with one fixed rule-set is inadequate to minimise delays when traffic flow rate is time-varying and likely to span a wide range. To achieve better control, fuzzy rules adapted to the current traffic conditions are used.
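A minimal sketch of such a controller, assuming hypothetical membership functions and rules (the abstract does not specify a rule-set): queue length and arrival rate are fuzzified, a small rule base fires, and a green-time extension is defuzzified by a weighted average.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def green_extension(queue, arrival_rate):
    """Mamdani-style fuzzy inference, defuzzified by weighted average.
    Membership functions and rules are illustrative, not from the paper."""
    # Fuzzify the inputs (queue in vehicles, arrival rate in vehicles/second)
    q_short, q_long = tri(queue, -1, 0, 10), tri(queue, 5, 15, 100)
    a_low, a_high = tri(arrival_rate, -1, 0, 0.5), tri(arrival_rate, 0.2, 1, 2)
    # Rule base: (firing strength, crisp extension in seconds)
    rules = [
        (min(q_short, a_low), 0.0),    # short queue, few arrivals -> no extension
        (min(q_short, a_high), 5.0),   # arrivals building up -> small extension
        (min(q_long, a_low), 8.0),     # long queue -> extend
        (min(q_long, a_high), 15.0),   # long queue, heavy flow -> extend a lot
    ]
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.0
```

Adapting the controller to time-varying flow, as the abstract describes, would amount to switching or reshaping this rule base as traffic conditions change, rather than keeping one fixed rule-set.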
Abstract:
Information security policy defines the governance and implementation strategy for information security in alignment with the corporate risk policy objectives and strategies. Research has established that alignment between corporate concerns may be enhanced when strategies are developed concurrently using the same development process, as an integrative relationship is thereby established. Utilizing the corporate risk management framework for security policy management establishes such an integrative relationship between information security and corporate risk management objectives and strategies. The current literature, however, falls short of presenting a definitive approach that fully integrates security policy management with the corporate risk management framework. This paper presents an approach that adopts a conventional corporate risk management framework for security policy development and management in order to achieve alignment with the corporate risk policy. A case example is examined to illustrate the alignment achieved in each process step, with a security policy structure consequently being derived in the process. It is shown that information security policy management outcomes become both integral drivers and major elements of corporate-level risk management considerations. Further study should assess the impact of using the proposed framework on enhancing alignment as envisaged in this paper.
Abstract:
The MPEG-21 Multimedia Framework provides for controlled distribution of multimedia works through its Intellectual Property Management and Protection ("IPMP") Components and Rights Expression Language ("MPEG REL"). The IPMP Components provide a framework by which the components of an MPEG-21 digital item can be protected from undesired access, while MPEG REL provides a mechanism for describing the conditions under which a component of a digital item may be used and distributed. This chapter describes how the IPMP Components and MPEG REL were used to implement a series of digital rights management applications at the Cooperative Research Centre for Smart Internet Technology in Australia. While the IPMP Components and MPEG REL were initially designed to facilitate the protection of copyright, the applications also show how the technology can be adapted to the protection of private personal information and sensitive corporate information.
Abstract:
National estimates of the prevalence of child abuse-related injuries are obtained from a variety of sectors, including welfare, justice, and health, resulting in inconsistent estimates across sectors. The International Classification of Diseases (ICD) is the international standard for categorising health data and aggregating data for statistical purposes, though there has been limited validation of the quality and completeness of these data, or of their concordance with other sectors. This research study examined the quality of documentation and coding of child abuse recorded in hospital records in Queensland and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of over 1000 hospitalised injured children from 20 hospitals in Queensland. A data linkage methodology was used to link these records with records in the child welfare database. Cases were sampled from three sub-groups according to the presence of target ICD codes: definite abuse, possible abuse, and unintentional injury. Less than 2% of cases coded as unintentional were recoded after review as possible abuse, and only 5% of cases coded as possible abuse were reclassified as unintentional, though there was greater variation in the classification of cases as definite abuse compared to possible abuse. Concordance of health data with child welfare data varied across patient subgroups. This study will inform the development of strategies to improve the quality, consistency, and concordance of information between health and welfare agencies, to ensure adequate system responses to children at risk of abuse.
Abstract:
Emergency departments (EDs) are often the first point of contact with an abused child. Despite legal mandate, the reporting of definite or suspected abusive injury to child safety authorities by ED clinicians varies due to a number of factors including training, access to child safety professionals, departmental culture and a fear of ‘getting it wrong’. This study examined the quality of documentation and coding of child abuse captured by ED based injury surveillance data and ED medical records in the state of Queensland and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of almost 1000 injured children included in the Queensland Injury Surveillance Unit database (QISU) from 10 hospitals in urban and rural centres. Independent experts re-coded the records based on their review of the notes. A data linkage methodology was then used to link these records with records in the state government’s child welfare database. Cases were sampled from three sub-groups according to the surveillance intent codes: Maltreatment by parent, Undetermined and Unintentional injury. Only 0.1% of cases coded as unintentional injury were recoded to maltreatment by parent, while 1.2% of cases coded as maltreatment by parent were reclassified as unintentional and 5% of cases where the intent was undetermined by the triage nurse were recoded as maltreatment by parent. Quality of documentation varied across type of hospital (tertiary referral centre, children’s, urban, regional and remote). Concordance of health data with child welfare data varied across patient subgroups. Outcomes from this research will guide initiatives to improve the quality of intentional child injury surveillance systems.
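The two studies above report concordance between coded data sources, though the abstracts do not state which agreement statistic was used. Cohen's kappa is a common chance-corrected choice and is sketched below on invented injury-intent codes.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders or data sources."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected agreement if both coded independently at their marginal rates
    ca, cb = Counter(codes_a), Counter(codes_b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical injury-intent codes: hospital record vs. child-welfare record
# for the same ten children (invented, not the Queensland data)
hosp = ["unintentional"] * 8 + ["possible"] * 1 + ["definite"] * 1
welf = ["unintentional"] * 7 + ["possible"] * 2 + ["definite"] * 1
kappa = cohens_kappa(hosp, welf)
```

Raw percent agreement overstates concordance when one category (here, unintentional injury) dominates; kappa discounts the agreement expected by chance alone.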
Abstract:
With an increasing level of collaboration among researchers, software developers, and industry practitioners over the past three decades, building information modelling (BIM) is now recognized as an emerging technological and procedural shift within the architecture, engineering and construction (AEC) industry. BIM is not only considered a way to make a profound impact on the AEC professions, but is also regarded as an approach that can help the industry develop new ways of thinking and practice. Despite the widespread development and recognition of BIM, succinct and systematic reviews of existing BIM research and achievements are scarce. It is also necessary to take stock of existing applications and have a fresh look at where BIM should be heading and how it can benefit from the advances being made. This paper first presents a review of BIM research and achievements in the AEC industry. A number of suggestions are then made for future research in BIM. This paper maintains that the value of BIM during the design and construction phases has been well documented over the last decade, and that new research needs to expand the level of development and analysis from the design/build stage to post-construction and facility asset management. New research in BIM could also move beyond the traditional building type to managing a broader range of facilities and built assets and to providing preventative maintenance schedules for sustainable and intelligent buildings.
Abstract:
Probabilistic load flow techniques have been adopted in AC electrified railways to study the load demand under various train service conditions. This paper highlights how probabilistic load flow analysis differs between conventional power systems and the power supply systems of AC railways, discusses possible difficulties in problem formulation, and presents the link between train movement and the corresponding power demand for load flow calculation.