934 results for requirements process
Abstract:
This research used the Queensland Police Service, Australia, as a major case study. Information on the principles, techniques and processes used, and the reasons for recording, storing and releasing audit information for evidentiary purposes, is reported. It is shown that law enforcement agencies have a two-fold interest in, and legal obligation pertaining to, audit trails. The first interest relates to situations where audit trails are used by criminals in the commission of crime, and the second to audit trails generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented. Two frameworks, managerial and technical, are proposed: the first defines an environment in which a computer system may be considered to be operating “properly”; the second defines the education, training, qualifications, expertise and similar attributes appropriate for the persons responsible within that environment. Analysis was undertaken to determine whether audit and control of information in a high-security environment, such as law enforcement, could be judged to have improved, or not, in the transition from manual to electronic processes.
Information collection, control of processing and audit in the manual processes used by the Queensland Police Service, Australia, in the period 1940 to 1980 were assessed against the current electronic systems introduced to policing during the 1980s and 1990s. Results show that electronic systems do provide faster communications, with centrally controlled and updated information readily available to large numbers of users connected across significant geographical distances. However, it is clearly evident that the price paid for this is a lack of ability, and/or a reluctance, to provide improved audit and control processes. To compare the information systems audit and control arrangements of the Queensland Police Service with those of other government departments and agencies, an Australia-wide survey was conducted. Its results were contrasted with those of a survey conducted four years earlier by the Australian Commonwealth Privacy Commission, which had shown that security in relation to the recording of activity against access to information held on Australian government computer systems was poor and a cause for concern. Within this four-year period, however, there is evidence to suggest that government organisations became increasingly inclined to generate audit trails. An attack on the overall security of audit trails in computer operating systems was initiated to further investigate the findings of the government systems survey, which showed that information systems audit trails in Microsoft Corporation's “Windows” operating system environments are relied on quite heavily. An audit of the security of audit trails generated, stored and managed in the Microsoft “Windows 2000” operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the “UNIX” and “Linux” operating systems.
The strength of passwords and the exploitation of any security problems in access control were targeted using software tools freely available in the public domain. Results showed that such security for the “Windows 2000” system is seriously flawed and that the integrity of audit trails stored within these environments cannot be relied upon. A framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are then proposed. This is achieved by examining the current rules and guidelines related to the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge used by the medical profession and forensic science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. However, unlike some discipline areas, this analysis clearly identified two distinct aspects that appear particularly relevant to IT: expertise gained through the application of IT to information needs in a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.
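The password-strength testing mentioned above can be illustrated with a minimal sketch (hypothetical, and far simpler than the public-domain cracking tools used in the study): a naive brute-force entropy estimate based on the character classes a password draws from.

```python
import math
import string

def charset_size(password: str) -> int:
    """Estimate the attacker's search alphabet from the character classes used."""
    size = 0
    if any(c in string.ascii_lowercase for c in password):
        size += 26
    if any(c in string.ascii_uppercase for c in password):
        size += 26
    if any(c in string.digits for c in password):
        size += 10
    if any(c in string.punctuation for c in password):
        size += len(string.punctuation)  # 32 printable ASCII symbols
    return size

def entropy_bits(password: str) -> float:
    """Naive brute-force entropy: length * log2(alphabet size)."""
    size = charset_size(password)
    return len(password) * math.log2(size) if size else 0.0

print(round(entropy_bits("admin"), 1))         # short, lowercase only: weak
print(round(entropy_bits("T7#kq9!Lm2xZ"), 1))  # long, mixed classes: much stronger
```

A measure like this only bounds a blind brute-force search; dictionary and rainbow-table attacks of the kind the thesis describes defeat "high-entropy" passwords that are predictable in other ways.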
Abstract:
A browser is a convenient way to access resources located remotely on computer networks. Security in browsers has become a crucial issue for users who rely on them for sensitive applications without knowledge of the hazards. This research utilises a structured approach to analyse and propose enhancements to browser security. Standard evaluation of computer products is important as it helps users ensure that the product they use is appropriate for their needs. Security in browsers has therefore been evaluated using the Common Criteria. The outcome was a security requirements profile that attempts to formalise the security needs of browsers. The information collected during the research was used to produce a prototype model for a secure browser program, and modifications to the Lynx browser were made to demonstrate the proposed enhancements.
Abstract:
A review of the main rolling models is conducted to assess their suitability for modelling the foil rolling process. Two such models are Fleck and Johnson's Hertzian model and Fleck, Johnson, Mear and Zhang's Influence Function model. Both models are approximated through the use of perturbation methods, which reduces computation time compared with the full numerical solutions. The Hertzian model was approximated using the ratio of the yield stress of the strip to the plane-strain Young's modulus of the rolls as the small perturbation parameter. The Influence Function model approximation takes advantage of the solution of the well-known Aerofoil Integral Equation to gain insight into how the choice of interior boundary points affects the stability of the numerical solution of the model's equations. These approximations require less computation than their full models and, in the case of the Hertzian approximation, introduce only a small error in the predictions of roll force and roll torque. Hence the Hertzian approximate method is suitable for on-line control. The predictions from the Influence Function approximation underestimate those from the numerical results; the main source of this error is the approximation of the pressure in the plastic reduction regions.
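The perturbation scheme described above can be sketched generically (illustrative notation only, not the models' exact equations): with the small parameter taken as the ratio of the strip's yield stress $\sigma_Y$ to the rolls' plane-strain Young's modulus $E'$, the roll pressure is expanded in powers of that parameter and the governing equations are solved order by order:

```latex
\epsilon = \frac{\sigma_Y}{E'} \ll 1, \qquad
p(x) \approx p_0(x) + \epsilon\, p_1(x) + \epsilon^2 p_2(x) + O(\epsilon^3)
```

Truncating after the first one or two terms is what makes the approximation cheap enough for on-line control, at the cost of the small error in roll force and torque noted above.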
Abstract:
The present paper focuses on some interesting classes of process-control games, where winning essentially means successfully controlling the process. A master for one of these games is an agent who plays a winning strategy. In this paper we investigate situations in which even a complete model (given by a program) of a particular game does not provide enough information to synthesize—even incrementally—a winning strategy. However, if in addition to getting a program, a machine may also watch masters play winning strategies, then the machine is able to incrementally learn a winning strategy for the given game. Studied are successful learning from arbitrary masters and from pedagogically useful selected masters. It is shown that selected masters are strictly more helpful for learning than are arbitrary masters. Both for learning from arbitrary masters and for learning from selected masters, though, there are cases where one can learn programs for winning strategies from masters but not if one is required to learn a program for the master's strategy itself. Both for learning from arbitrary masters and for learning from selected masters, one can learn strictly more by watching m+1 masters than one can learn by watching only m. Last, a simulation result is presented where the presence of a selected master reduces the complexity from infinitely many semantic mind changes to finitely many syntactic ones.
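The idea of learning a winning strategy by watching a master can be illustrated with a toy sketch (hypothetical, and far simpler than the games and learning-theoretic machinery in the paper): a control game where the goal is to keep an integer state inside a safe band, and a learner that imitates the state-to-move pairs it observes a master play.

```python
# Toy process-control game: keep the state inside [0, 4] against a disturbance.

def master_move(state):
    """A winning strategy for this toy game: push the state back toward 2."""
    return -1 if state > 2 else 1

def play(strategy, state=2, rounds=20):
    """Run the game; win (True) if the state never leaves [0, 4]."""
    for r in range(rounds):
        state += -1 if r % 3 == 0 else 1  # environment disturbance
        state += strategy(state)
        if not 0 <= state <= 4:
            return False
    return True

# The learner watches one master game and memorizes state -> move pairs.
observed = {}

def watched_master(state):
    move = master_move(state)
    observed[state] = move
    return move

play(watched_master)                    # observe the master winning
learner = lambda s: observed.get(s, 0)  # imitate; default: do nothing
print(play(learner))                    # the imitated strategy also wins here
```

The sketch only shows imitation succeeding on one fixed game; the paper's point is subtler, concerning incremental learning across whole classes of games where the program alone does not suffice.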
Abstract:
Nature Refuges encompass the second largest extent of protected area estate in Queensland. Major problems exist in the data capture, map presentation, data quality and integrity of their boundaries. The spatial accuracy of the Nature Refuge administrative boundaries directly influences the ability to preserve valuable ecosystems by challenging negative environmental impacts on these properties. This research supports the Nature Refuge Program's efforts to secure Queensland's natural and cultural values on private land by utilising GIS and its advanced functionalities. The research design organises and enters Queensland's Nature Refuge boundaries into a spatial environment. Survey-quality data collection techniques such as the Global Positioning System (GPS) are investigated as a means of capturing Nature Refuge boundary information. Using the concepts of map communication, GIS cartography is utilised for the protected area plan design. New spatial datasets are generated, facilitating effective investigative data analysis. The geodatabase model developed by this study adds rich GIS behaviour, providing the capability to store, query and manipulate geographic information. It provides the ability to leverage data relationships and enforces topological integrity, creating savings in customisation and gains in productivity. The final phase of the research design incorporates the advanced functions of ArcGIS to build spatial system models. The geodatabase and process models developed by this research can be easily modified, and the data relating to mining can be replaced by data on other negative environmental impacts affecting the Nature Refuges. Results are presented as graphs and maps, providing visual evidence of the usefulness of GIS as a means of capturing, visualising and enhancing the spatial quality and integrity of Nature Refuge boundaries.
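The kind of spatial integrity check such a geodatabase enforces can be illustrated with a minimal sketch (pure Python with a hypothetical rectangular boundary, rather than the ArcGIS tooling used in the study): a ray-casting point-in-polygon test that flags coordinates falling outside a declared refuge boundary.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a ray to the right
    of (x, y) crosses. `polygon` is a list of (x, y) vertices in order;
    an odd crossing count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical refuge boundary as (easting, northing) vertex pairs.
refuge = [(0, 0), (10, 0), (10, 5), (0, 5)]
print(point_in_polygon(3, 2, refuge))   # inside the boundary
print(point_in_polygon(12, 2, refuge))  # outside: flag for review
```

Real boundary data would of course use projected survey coordinates and handle edge cases (points exactly on a vertex or edge), which this sketch ignores.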
Abstract:
In this conversation, Kevin K. Kumashiro shares his reflections on challenges to publishing anti-oppressive research in educational journals. He then invites eight current and former editors of leading educational research journals--William F. Pinar, Elizabeth Graue, Carl A. Grant, Maenette K. P. Benham, Ronald H. Heck, James Joseph Scheurich, Allan Luke, and Carmen Luke--to critique and expand on his analysis. Kumashiro begins the conversation by describing his own experiences submitting manuscripts to educational research journals and receiving comments by anonymous reviewers and journal editors. He suggests three ways to rethink the collaborative potential of the peer-review process: as constructive, as multilensed, and as situated. The eight current and former editors of leading educational research journals then critique and expand Kumashiro's analysis. Kumashiro concludes the conversation with additional reflections on barriers and contradictions involved in advancing anti-oppressive educational research in educational journals.
Abstract:
Business process modeling is widely regarded as one of the most popular forms of conceptual modeling. However, little is known about the capabilities and deficiencies of process modeling grammars and how existing deficiencies impact actual process modeling practice. This paper is a first contribution towards a theory-driven, exploratory empirical investigation of the ontological deficiencies of process modeling with the industry standard Business Process Modeling Notation (BPMN). We perform an analysis of BPMN using a theory of ontological expressiveness. Through a series of semi-structured interviews with BPMN adopters we explore empirically the actual use of this grammar. Nine ontological deficiencies related to the practice of modeling with BPMN are identified, for example, the capture of business rules and the specification of process decompositions. We also uncover five contextual factors that impact on the use of process modeling grammars, such as tool support and modeling conventions. We discuss implications for research and practice, highlighting the need for consideration of representational issues and contextual factors in decisions relating to BPMN adoption in organizations.
Abstract:
Purpose / Design/methodology/approach: The acknowledgement of state significance in relation to development projects can result in special treatment by regulatory authorities, particularly in terms of environmental compliance and certain economic and other government support measures. However, what constitutes a “significant project”, or a project of “state significance”, varies considerably between Australian states, and in terms of establishing threshold levels there is even less clarity in Queensland. Despite this lack of definition, the implications of “state significance” can nevertheless be considerable. For example, in Queensland, if the Coordinator-General declares a project to be a “significant project” under the State Development and Public Works Organisation Act 1971, the environmental impact assessment process may become more streamlined, potentially circumventing certain provisions under the Integrated Planning Act 1997. If the project is not large enough to be so deemed, an extractive resource under State Planning Policy 2/07 (Protection of Extractive Resources, 2007) may be considered to be of state or regional significance and subsequently designated a “Key Resource Area”. Such a project is consequently afforded some measure of resource protection but remains subject to the normal assessment process under the Integrated Development Assessment System, as well as the usual requirements of the vegetation management codes and other regulations. Findings (Originality/value) / Research limitations/implications: This paper explores the various meanings of “state significance” in Queensland and the ramifications for development projects in that state. It argues for a streamlining of the assessment process in order to avoid or minimise constraints acting on the state's development. In so doing, it questions the existence of a strategic threat to the delivery of an already over-stretched infrastructure program.
Abstract:
Purpose: This paper aims to show that identification of expectations and software functional requirements via consultation with potential users is an integral component of the development of an emergency department patient admissions prediction tool. ---------- Design/methodology/approach: Thematic analysis of semi-structured interviews with 14 key health staff delivered rich data regarding existing practice and future needs. Participants included emergency department staff, bed managers, nurse unit managers, directors of nursing, and personnel from health administration. ---------- Findings: Participants contributed contextual insights on the current system of admissions, revealing a culture of crisis, imbued with misplaced communication. Their expectations and requirements of a potential predictive tool provided strategic data that moderated the development of the Emergency Department Patient Admissions Prediction Tool, based on their insistence that it feature availability, reliability and relevance. In order to deliver these stipulations, participants stressed that it should be incorporated, validated, defined and timely. ---------- Research limitations/implications: Participants were envisaging a concept and use of a tool that was somewhat hypothetical. However, further research will evaluate the tool in practice. ---------- Practical implications: Participants' unsolicited recommendations regarding implementation will not only inform a subsequent phase of the tool evaluation, but are eminently applicable to any process of implementation in a healthcare setting. ---------- Originality/value: The consultative process engaged clinicians and the paper delivers an insider view of an overburdened system, rather than an outsider's observations.
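The kind of forecasting at the core of such a tool can be illustrated with a minimal sketch (hypothetical data and a deliberately simple method; the actual Emergency Department Patient Admissions Prediction Tool is not described at this level in the abstract): predicting a day's admissions as the average of the same weekday over recent weeks.

```python
from statistics import mean

def predict_admissions(history, weekday, weeks=4):
    """Predict admissions for `weekday` as the mean count for that weekday
    over the most recent `weeks` weeks.
    `history` maps (week_number, weekday) -> admission count."""
    recent_weeks = sorted({w for w, d in history if d == weekday})[-weeks:]
    return mean(history[(w, weekday)] for w in recent_weeks)

# Hypothetical admission counts keyed by (week number, weekday 0=Mon, 1=Tue).
history = {(1, 0): 38, (2, 0): 42, (3, 0): 40, (4, 0): 44,
           (1, 1): 30, (2, 1): 33, (3, 1): 29, (4, 1): 32}

print(predict_admissions(history, weekday=0))  # mean of 38, 42, 40, 44
print(predict_admissions(history, weekday=1))  # mean of 30, 33, 29, 32
```

The participants' stipulations (availability, reliability, relevance) concern how a predictor is delivered and validated in practice, which matters far more than the choice of averaging method sketched here.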
Abstract:
Dental pulp cells (DPCs) are capable of differentiating into odontoblasts that secrete reparative dentin after pulp injury. The molecular mechanisms governing reparative dentinogenesis are yet to be fully understood. Here we investigated the differential protein profile of human DPCs undergoing odontogenic induction for 7 days. Using two-dimensional differential gel electrophoresis coupled with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, 23 protein spots related to early odontogenic differentiation were identified. These proteins included cytoskeletal proteins, nuclear proteins, cell membrane-bound molecules, proteins involved in matrix synthesis, and metabolic enzymes. The expression of four identified proteins, heterogeneous nuclear ribonucleoprotein C, annexin VI, collagen type VI, and matrilin-2, was confirmed by Western blot and real-time polymerase chain reaction analyses. This study generated a proteome reference map of the odontoblast-like differentiation of human DPCs, which will be valuable for better understanding the underlying molecular mechanisms of odontoblast-like differentiation.