74 results for "bureaucratic requirements"
Abstract:
Engineering design processes are necessary to attain the requisite standards of integrity for high-assurance safety-related systems, and human factors design initiatives can provide critical insights that parameterise their development. Unfortunately, the popular perception of human factors as a "forced marriage" between engineering and psychology often fosters the view that the 'human factor' is a threat to systems design. Some popular performance-based standards for developing safety-related systems advocate identifying and managing human factors throughout the system lifecycle, yet they tend to fall short in their guidance on applying human factors methods and tools, let alone on how the outputs generated can be integrated into the various stages of the design process. This case study describes a project that converged engineering with human factors to develop a safety argument for new low-cost railway level crossing technology intended for system-wide implementation in Australia. The paper joins the perspectives of a software engineer and a cognitive psychologist, drawing on their involvement in the project over two years of collaborative work to develop a safety argument for low-cost level crossing technology. Safety and reliability requirements were informed by applying human factors analytical tools that supported the evaluation and quantification of human reliability where users interfaced with the technology. The project team was confronted with significant challenges in cross-disciplinary engagement, particularly the complexities of dealing with incongruences in disciplinary language. They were also encouraged to think 'outside the box' about how users of a system interpreted system states and behaviour. Importantly, some of these states, while considered safe within the boundary of the constituent systems that implemented safety-related functions, could actually lead users to engage in deviant behaviour. Psychology explained how user compliance could be eroded to levels that effectively undermined the risk reduction afforded by the systems. By linking the engineering and psychology disciplines, overall safety performance was improved through technical requirements and design decisions that minimised the system states and behaviours leading to user deviancy. As a commentary on the utility of transdisciplinary collaboration for technical specification, the processes used to bridge the two disciplines are conceptualised in a graphical model.
Abstract:
This article considers the changes to the Swimming Pools Act 1992 (NSW) (the Act), which established a State-wide online register of all private swimming pools in NSW and required pool owners to register their pools by 19 November 2013. Amendments to the Act introduced changes to the conveyancing and residential tenancy regulations requiring vendors and landlords to hold a valid Compliance Certificate for their swimming pool before offering the property for sale or lease. This article provides a brief overview of the new sale and leasing requirements effective from 29 April 2014, focusing on their application to lot owners within strata and community title schemes and to other owners of waterfront properties with pools on Crown Land Reserves.
Abstract:
In seeking to achieve Australian workplaces free from injury and disease, NOHSC works to lead and coordinate national efforts to prevent workplace death, injury and disease. We seek to achieve our mission through the quality and relevance of the information we provide and by influencing the activities of all parties with roles in improving Australia's OHS performance. NOHSC has five strategic objectives:
• improving national data systems and analysis;
• improving national access to OHS information;
• improving national components of the OHS and related regulatory framework;
• facilitating and coordinating national OHS research efforts;
• monitoring progress against the National OHS Improvement Framework.
This publication is a contribution to achieving those objectives.
Abstract:
NIST's forthcoming Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction supporting the FIPS 198 HMAC application. As part of its evaluation, NIST aims to select either a candidate hash function that is more resistant to known side channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode more resistant to known SCA than the other submitted alternatives. In response, we perform differential power analysis (DPA) on possible smart card implementations of some recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and of NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash offer security similar to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC settings of the MDC-2, Grindahl and MAME compression functions.
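For context, the nested two-call structure of FIPS 198 HMAC (which NMAC generalises) can be sketched as follows. This is a minimal illustration using SHA-256 as a stand-in hash, not one of the SHA-3 candidates analysed in the paper.

```python
# Minimal sketch of the FIPS 198 HMAC construction, with SHA-256 as a
# placeholder hash function (illustrative only).
import hashlib
import hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    block_size = 64  # SHA-256 block size in bytes
    if len(key) > block_size:
        key = hashlib.sha256(key).digest()   # hash overlong keys first
    key = key.ljust(block_size, b"\x00")     # pad key to the block size
    ipad = bytes(b ^ 0x36 for b in key)      # inner pad
    opad = bytes(b ^ 0x5C for b in key)      # outer pad
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()  # outer (keyed) call

# Cross-check against the standard library implementation.
assert hmac_sha256(b"key", b"msg") == hmac.new(b"key", b"msg", hashlib.sha256).digest()
```

The two keyed hash invocations are exactly where side channel leakage matters: a DPA attacker targets the compression function calls that absorb the padded key material.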
Abstract:
By definition, regulatory rules (in a legal context, norms) are intended to elicit specific behaviour from business processes and may be relevant to the whole or part of a business process. They can impose conditions on different aspects of process models, e.g., control-flow, data and resources. Based on their rule sets, norms can be classified into various classes and sub-classes according to their effects. This paper presents an abstract framework consisting of a list of norms and a generic compliance checking approach based on the idea of (possible) executions of processes. The proposed framework is independent of any existing formalism and provides a conceptually rich and exhaustive ontology and semantics of norms needed for business process compliance checking. Possible uses of the proposed framework include comparing different compliance management frameworks (CMFs).
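To make the idea concrete, here is a hedged sketch of checking norms against a possible execution (a trace) of a process. The two norm classes and all names are illustrative stand-ins, not the paper's actual ontology, which is considerably richer.

```python
# Illustrative sketch (not the paper's formalism): classify norms by effect
# and check them against one possible execution trace of a process.
from dataclasses import dataclass
from enum import Enum, auto

class NormType(Enum):
    ACHIEVEMENT_OBLIGATION = auto()  # some task must occur in the trace
    PROHIBITION = auto()             # some task must never occur

@dataclass
class Norm:
    norm_type: NormType
    task: str

def complies(trace: list[str], norms: list[Norm]) -> bool:
    """Return True iff the execution trace satisfies every norm."""
    for norm in norms:
        if norm.norm_type is NormType.ACHIEVEMENT_OBLIGATION and norm.task not in trace:
            return False
        if norm.norm_type is NormType.PROHIBITION and norm.task in trace:
            return False
    return True

# Example: an order process must verify identity and must never skip approval.
norms = [Norm(NormType.ACHIEVEMENT_OBLIGATION, "verify_identity"),
         Norm(NormType.PROHIBITION, "skip_approval")]
print(complies(["receive_order", "verify_identity", "ship"], norms))  # True
```

A full framework would also need to handle, e.g., maintenance obligations and compensations over all possible executions rather than a single trace.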
New paradigm, new educational requirements? Australian viewpoints on education for digital libraries
Abstract:
The rise in popularity of the digital library has led to studies addressing digital library education and curricula development, emanating chiefly from the United States and Europe. To date, however, very little research has been conducted with an Australian focus, and very few studies worldwide have sought the opinions of practitioners or considered the influence these opinions may have on developing appropriate digital library curricula. The current paper is drawn from a larger study that sought to determine the skills and knowledge required of library and information professionals to work in a digital library environment. Data were collected via an online questionnaire from two target groups: practitioners working in academic libraries and Library and Information Science (LIS) educators across Australia. This paper examines in depth the survey findings relating to the following topics: firstly, whether or not there is a need for an educational programme targeted solely at the digital library environment; secondly, the preferred delivery options for such a programme and the preferred models of digital library education; and, in addition, the elements that should be included in the curricula of a digital library education programme. Findings are compared and discussed with reference to the literature which informed the study. Finally, implications for the sustainability of library education programmes in Australia are identified and directions for further research highlighted.
Abstract:
CoMFA and CoMSIA analyses were utilized in this investigation to define the important interacting regions in the paclitaxel/tubulin binding site and to develop selective paclitaxel-like active compounds. The starting geometry of the paclitaxel analogs was taken from the crystal structure of docetaxel. A total of 28 derivatives of paclitaxel were divided into two groups: a training set comprising 19 compounds and a test set comprising nine compounds. They were constructed and geometrically optimized using SYBYL v6.6. CoMFA studies provided good predictability (q² = 0.699, r² = 0.991, PC = 6, S.E.E. = 0.343 and F = 185.910). They showed steric and electrostatic properties to be the major interacting forces, whilst the lipophilic property was a minor contributor to the recognition forces of the binding site. These results were in agreement with the experimental binding activity data for these compounds. Five fields in the CoMSIA analysis (steric, electrostatic, hydrophobic, and hydrogen-bond acceptor and donor properties) were considered contributors to the ligand–receptor interactions. The results obtained from the CoMSIA studies were: q² = 0.535, r² = 0.983, PC = 5, S.E.E. = 0.452 and F = 127.884. The data obtained from both the CoMFA and CoMSIA studies were interpreted with respect to the paclitaxel/tubulin binding site, suggesting where the most significant anchoring points for binding affinity are located. This information could be used to develop new chemical entities with paclitaxel-like activity that overcome the existing pharmaceutical barriers and the economic problems associated with synthesizing paclitaxel analogs. This would broaden the use of this valuable class of compounds, e.g. in brain tumors, as most of the presently active compounds have poor blood–brain barrier crossing ratios and various tubulin isotypes have shown resistance to taxanes and other antimitotic agents.
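The q² reported above is the leave-one-out cross-validated analogue of r² for a partial least squares (PLS) model. A hedged sketch of how such a statistic is computed follows; the descriptor matrix and activities here are random placeholders, not the paper's CoMFA fields.

```python
# Sketch of computing q^2 (leave-one-out cross-validated r^2) for a PLS
# model. X and y are random placeholders standing in for field descriptors
# and binding activities; PC = 6 mirrors the CoMFA model above.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.normal(size=(19, 50))  # 19 training compounds, 50 descriptors (placeholder)
y = rng.normal(size=19)        # binding activities (placeholder)

press = 0.0  # predictive residual sum of squares
for train_idx, test_idx in LeaveOneOut().split(X):
    model = PLSRegression(n_components=6).fit(X[train_idx], y[train_idx])
    y_pred = model.predict(X[test_idx]).ravel()
    press += (y[test_idx][0] - y_pred[0]) ** 2

q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
print(f"q^2 = {q2:.3f}")
```

With real, information-bearing descriptors a q² near 0.7, as reported for the CoMFA model, indicates genuine predictive ability rather than mere fitting.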
Abstract:
This paper analyzes the limitations imposed by the amount of in-domain (NIST SRE) data required for training a probabilistic linear discriminant analysis (PLDA) speaker verification system based on out-domain (Switchboard) total variability subspaces. By limiting the number of speakers, the number of sessions per speaker, and the length of active speech per session available in the target domain for PLDA training, we investigated the relative effect of these three parameters on PLDA speaker verification performance on the NIST 2008 and NIST 2010 speaker recognition evaluation datasets. Experimental results indicate that while these parameters depend highly on each other, to beat out-domain PLDA training, more than 10 seconds of active speech should be available for at least 4 sessions/speaker for a minimum of 800 speakers. If further data is available, considerable improvement can be made over solely out-domain PLDA training.
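A small hypothetical helper makes the reported thresholds concrete; the function and data layout are illustrative assumptions, not code from the paper.

```python
# Hypothetical helper (names are illustrative): select in-domain speakers
# meeting the thresholds the experiments point to -- more than 10 s of
# active speech in at least 4 sessions each -- and use them for in-domain
# PLDA training only if at least 800 such speakers exist.
from collections import defaultdict

MIN_SESSIONS, MIN_SPEECH_SEC, MIN_SPEAKERS = 4, 10.0, 800

def select_plda_speakers(sessions):
    """sessions: iterable of (speaker_id, active_speech_seconds) pairs."""
    qualifying_sessions = defaultdict(int)
    for speaker_id, speech_sec in sessions:
        if speech_sec > MIN_SPEECH_SEC:
            qualifying_sessions[speaker_id] += 1
    eligible = [s for s, n in qualifying_sessions.items() if n >= MIN_SESSIONS]
    # Below the speaker threshold, out-domain PLDA training wins; signal that.
    return eligible if len(eligible) >= MIN_SPEAKERS else []
```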
Abstract:
Aim: To estimate the colonoscopy burden of introducing population screening for colorectal cancer in New Zealand. Methods: Screening for colorectal cancer using biennial immunochemical faecal occult blood tests offered to people aged 50-74 years of age was modelled using population estimates from Statistics New Zealand for 2011-2031. Modelling to determine colonoscopy requirements was based on participation and test positivity rates from published results of screening programmes. Estimates of the number of procedures required for ongoing adenoma surveillance were calculated using screening literature results of adenoma yield, and New Zealand Guidelines for Adenoma Surveillance. Sensitivity analysis was undertaken on key parameters. Results: For a test positivity of 6.4%, biennial screening using immunochemical faecal occult blood testing with a 60% participation rate, would require 18,000 colonoscopies nationally, increasing to 28,000 by 2031. The majority of procedures are direct referrals from a positive FOBT, with surveillance colonoscopy numbers building over time. Conclusion: Colonoscopy requirements for immunochemical faecal occult blood based population screening for colorectal cancer are high. Significant expansion of services is required and careful management of surveillance procedures to ensure timely delivery of initial colonoscopies whilst maintaining symptomatic services. A model re-run informed by data from the screening pilot will allow improved estimates for the New Zealand setting.
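The core arithmetic of such a demand model is a short chain of multiplications. The sketch below reads the reported figure as an annual requirement and uses an assumed eligible-population size chosen only so the illustration reproduces the stated ~18,000; it is not a number from the paper.

```python
# Illustrative arithmetic only. The eligible-population figure is an
# assumption for demonstration, not a value reported in the study.
eligible_population = 940_000                 # assumed NZ residents aged 50-74
invited_per_year = eligible_population / 2    # biennial screening: half each year
participation, positivity = 0.60, 0.064       # rates quoted in the abstract

direct_referrals = invited_per_year * participation * positivity
print(f"~{direct_referrals:,.0f} screening-positive colonoscopies per year")
# ~18,000 per year, before adding adenoma-surveillance procedures,
# which accumulate as successive screening rounds detect adenomas.
```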
Abstract:
Organ-specific immunity is a feature of many infectious diseases, including visceral leishmaniasis caused by Leishmania donovani. Experimental visceral leishmaniasis in genetically susceptible mice is characterized by an acute, resolving infection in the liver and chronic infection in the spleen. CD4+ T cell responses are critical for the establishment and maintenance of hepatic immunity in this disease model, but their role in chronically infected spleens remains unclear. In this study, we show that dendritic cells are critical for CD4+ T cell activation and expansion in all tissue sites examined. We found that FTY720-mediated blockade of T cell trafficking early in infection prevented Ag-specific CD4+ T cells from appearing in lymph nodes, but not the spleen and liver, suggesting that early CD4+ T cell priming does not occur in liver-draining lymph nodes. Extended treatment with FTY720 over the first month of infection increased parasite burdens, although this was associated with blockade of lymphocyte egress from secondary lymphoid tissue, as well as with more generalized splenic lymphopenia. Importantly, we demonstrate that CD4+ T cells are required for the establishment and maintenance of antiparasitic immunity in the liver, as well as for immune surveillance and suppression of parasite outgrowth in chronically infected spleens. Finally, although early CD4+ T cell priming appeared to occur most effectively in the spleen, we unexpectedly revealed that protective CD4+ T cell-mediated hepatic immunity could be generated in the complete absence of all secondary lymphoid tissues.
Abstract:
The Body Area Network (BAN) is an emerging technology that focuses on monitoring physiological data in, on and around the human body. BAN technology permits wearable and implanted sensors to collect vital data about the human body and transmit it to other nodes via low-energy communication. In this paper, we investigate interactions, in terms of data flows, between the parties involved in BANs under four different scenarios targeting outdoor and indoor medical environments: hospital, home, emergency and open areas. Based on these scenarios, we identify data flow requirements between BAN elements such as sensors and control units (CUs) and the parties involved in BANs such as the patient, doctors, nurses and relatives. The identified requirements are used to generate BAN data flow models, with Petri Nets (PNs) as the formal modelling language. We check the validity of the models and compare them with existing related work. Finally, using the models, we identify communication and security requirements based on the most common active and passive attack scenarios.
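As a hedged illustration of the modelling style, the sketch below encodes a tiny Petri net in which a sensor reading flows to the control unit and then to a doctor; the place and transition names are invented for this example, not taken from the paper's models.

```python
# Minimal Petri-net sketch (illustrative, not the paper's actual models):
# a reading flows sensor -> control unit (CU) -> doctor, each hop modelled
# as a transition that consumes and produces tokens.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)  # current token count per place

    def fire(self, inputs, outputs):
        """Fire a transition if every input place holds a token."""
        if not all(self.marking.get(p, 0) > 0 for p in inputs):
            return False  # transition not enabled
        for p in inputs:
            self.marking[p] -= 1          # consume input tokens
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1  # produce outputs
        return True

net = PetriNet({"sensor_data": 1, "cu_buffer": 0, "doctor_inbox": 0})
net.fire(["sensor_data"], ["cu_buffer"])   # sensor -> control unit
net.fire(["cu_buffer"], ["doctor_inbox"])  # control unit -> doctor
print(net.marking)  # {'sensor_data': 0, 'cu_buffer': 0, 'doctor_inbox': 1}
```

Validity checks like those described in the abstract would then amount to verifying reachability and boundedness properties of such nets under each scenario.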