339 results for Computer Systems
Abstract:
Composite web services comprise several component web services. When a composite web service is executed centrally, a single web service engine is responsible for coordinating the execution of the components, which may create a bottleneck and degrade the overall throughput of the composite service when there are a large number of service requests. Potentially this problem can be handled by decentralizing execution of the composite web service, but this raises the issue of how to partition a composite service into groups of component services such that each group can be orchestrated by its own execution engine while ensuring acceptable overall throughput of the composite service. Here we present a novel penalty-based genetic algorithm to solve the composite web service partitioning problem. Empirical results show that our new algorithm outperforms existing heuristic-based solutions.
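The abstract does not give the algorithm's details, but the general shape of a penalty-based GA for this kind of partitioning can be sketched. Everything below, the fitness function, the capacity constraint, and the genetic operators, is an illustrative assumption, not the paper's actual formulation:

```python
import random

def partition_ga(n_services, n_groups, comm, cap, pop_size=40, gens=200, penalty=10.0):
    """Toy penalty-based GA: assign each component service to one of n_groups
    execution engines, minimising inter-group communication; oversized groups
    are penalised rather than repaired. comm[i][j] = traffic between i and j
    (hypothetical cost model)."""
    random.seed(0)

    def fitness(chrom):
        # communication cost of edges cut by the partition
        cost = sum(comm[i][j] for i in range(n_services)
                   for j in range(i + 1, n_services) if chrom[i] != chrom[j])
        for g in range(n_groups):
            over = sum(1 for a in chrom if a == g) - cap
            if over > 0:
                cost += penalty * over   # penalty term for infeasible groups
        return cost

    pop = [[random.randrange(n_groups) for _ in range(n_services)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        next_pop = pop[:pop_size // 4]                    # elitism
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)  # select from better half
            cut = random.randrange(1, n_services)         # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                     # point mutation
                child[random.randrange(n_services)] = random.randrange(n_groups)
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=fitness)
    return best, fitness(best)
```

The point of the penalty scheme is that infeasible partitions stay in the population but pay a fitness cost, letting the search cross infeasible regions rather than rejecting or repairing such individuals outright.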
Abstract:
This research used the Queensland Police Service, Australia, as a major case study. Information on the principles, techniques and processes used, and the reasons for recording, storing and releasing audit information for evidentiary purposes, is reported. It is shown that law enforcement agencies have a two-fold interest in, and legal obligation pertaining to, audit trails. The first interest relates to situations where audit trails are used by criminals in the commission of crime, and the second to audit trails generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented. Managerial and technical frameworks are proposed for, firstly, what may be considered an environment in which a computer system is operating “properly” and, secondly, the aspects of education, training, qualifications, expertise and the like that may be considered appropriate for the persons responsible within that environment. Analysis was undertaken to determine whether audit and control of information in a high-security environment, such as law enforcement, could be judged to have improved, or not, in the transition from manual to electronic processes.
Information collection, control of processing and audit in the manual processes used by the Queensland Police Service, Australia, in the period 1940 to 1980 were assessed against the current electronic systems introduced to policing during the 1980s and 1990s. Results show that electronic systems do provide faster communications, with centrally controlled and updated information readily available to large numbers of users connected across significant geographical distances. However, it is clearly evident that the price paid for this is a lack of ability and/or a reluctance to provide improved audit and control processes. To compare the information systems audit and control arrangements of the Queensland Police Service with those of other government departments and agencies, an Australia-wide survey was conducted. Results were contrasted with those of a survey conducted four years previously by the Australian Commonwealth Privacy Commission, which showed that security in relation to the recording of activity against access to information held on Australian government computer systems had been poor and a cause for concern. However, within this four-year period there is evidence to suggest that government organisations have become increasingly inclined to generate audit trails. An attack on the overall security of audit trails in computer operating systems was initiated to further investigate findings reported in relation to the government systems survey. The survey showed that information systems audit trails in Microsoft Corporation's “Windows” operating system environments are relied on quite heavily. An audit of the security of audit trails generated, stored and managed in the Microsoft “Windows 2000” operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the “UNIX” and “Linux” operating systems.
The strength of passwords and the exploitation of security weaknesses in access control were targeted using software tools freely available in the public domain. Results showed that such security for the “Windows 2000” system is seriously flawed and that the integrity of audit trails stored within these environments cannot be relied upon. A framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are proposed. This is achieved by examining the current rules and guidelines governing the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge used by the medical profession and forensic science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. However, unlike some discipline areas, this analysis has clearly identified two distinct aspects that appear particularly relevant to IT: expertise gained through the application of IT to information needs in a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.
Abstract:
Music making affects relationships with self and others by generating a sense of belonging to a culture or ideology (Bamford, 2006; Barovick, 2001; Dillon & Stewart, 2006; Fiske, 2000; Hallam, 2001). Whilst studies from arts education research present compelling examples of these relationships, others argue that they do not present sufficiently validated evidence of a causal link between music making experiences and cognitive or social change (Winner & Cooper, 2000; Winner & Hetland, 2000a, 2000b, 2001). I have suggested elsewhere that this disconnection between compelling evidence and observations of the effects of music making is in part due to a lack of rigor in research and the incapacity of many methods to capture these experiences in meaningful ways (Dillon, 2006). Part of the answer to these questions about rigor and causality lies in the creative use of new media technologies that capture the results of relationships in music artefacts. Crucially, it is the effective management of these artefacts within computer systems that allows researchers and practitioners to collect, organise, analyse and then theorise such music making experiences.
Abstract:
Scientists need to transfer semantically similar queries across multiple heterogeneous linked datasets. These queries may require data from different locations and the results are not simple to combine due to differences between datasets. A query model was developed to make it simple to distribute queries across different datasets using RDF as the result format. The query model, based on the concept of publicly recognised namespaces for parts of each scientific dataset, was implemented with a configuration that includes a large number of current biological and chemical datasets. The configuration is flexible, providing the ability to transparently use both private and public datasets in any query. A prototype implementation of the model was used to resolve queries for the Bio2RDF website, including both Bio2RDF datasets and other datasets that do not follow the Bio2RDF URI conventions.
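As an illustration of the namespace idea (not the actual Bio2RDF implementation; the endpoint URLs, prefixes and extraction logic below are hypothetical), a resolver can map each dataset namespace to the endpoint that serves it and group the sub-queries accordingly, after which the RDF results can be merged:

```python
# Hypothetical registry mapping dataset namespace prefixes to SPARQL endpoints.
ENDPOINTS = {
    "geneid": "http://bio2rdf.org/sparql",
    "chebi":  "http://bio2rdf.org/sparql",
    "mydata": "http://localhost:8890/sparql",  # a private dataset used transparently
}

def route(query_uris):
    """Group query URIs by namespace prefix and return, per endpoint, the URIs
    each sub-query should resolve; RDF result graphs can then be merged."""
    plan = {}
    for uri in query_uris:
        # crude namespace extraction: drop the local identifier, keep the prefix
        ns = uri.rsplit(":", 1)[0].rsplit("/", 1)[-1]
        plan.setdefault(ENDPOINTS.get(ns, "unknown"), []).append(uri)
    return plan
```

Because both public and private endpoints sit behind the same registry, a query never needs to know where a namespace is hosted, which is the flexibility the abstract describes.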
Abstract:
The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
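For reference, the standard (exact, non-relaxed) uniformization iteration computes p(t) = Σ_k e^{-Λt} (Λt)^k / k! · p(0)Pᵏ with P = I + Q/Λ, where Λ bounds the largest exit rate; the matrix-vector products p(0)Pᵏ are the cost the paper's relaxation strategy makes cheaper. A minimal pure-Python sketch:

```python
import math

def uniformization(Q, p0, t, tol=1e-12):
    """Transient distribution p(t) = p(0) expm(Q t) of a CTMC via uniformization.
    Q: generator matrix (list of lists), p0: initial distribution, t: time."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n))      # uniformization rate >= max exit rate
    # DTMC matrix P = I + Q / lam
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)] for i in range(n)]
    v = list(p0)                               # v_k = p(0) P^k
    weight = math.exp(-lam * t)                # Poisson(lam*t) pmf at k = 0
    result = [weight * x for x in v]
    k, accumulated = 0, weight
    while 1.0 - accumulated > tol:             # truncate once the Poisson tail is negligible
        k += 1
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]  # one mat-vec product
        weight *= lam * t / k                  # Poisson pmf recursion
        accumulated += weight
        for j in range(n):
            result[j] += weight * v[j]
    return result
```

The numerical stability the abstract mentions comes from every term being non-negative: no subtractive cancellation occurs, unlike a direct Taylor expansion of expm(Qt).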
Abstract:
Many modern business environments employ software to automate the delivery of workflows, whereas workflow design and generation remain laborious technical tasks for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove the associativity of a composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to derive tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
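The paper's operator and deduction rules are not reproduced in the abstract; as a purely illustrative stand-in, the associativity property that makes hierarchical composition well-defined can be demonstrated with the simplest possible composition, sequential concatenation of workflow fragments:

```python
def compose(a, b):
    """Hypothetical sequential composition of two workflow fragments.
    Concatenation is associative, so nested compositions can be flattened
    into a hierarchy without ambiguity about grouping."""
    return {"tasks": a["tasks"] + b["tasks"]}

w1 = {"tasks": ["load"]}
w2 = {"tasks": ["machine"]}
w3 = {"tasks": ["inspect"]}
# (w1 . w2) . w3 == w1 . (w2 . w3): grouping does not matter
left = compose(compose(w1, w2), w3)
right = compose(w1, compose(w2, w3))
```

Associativity is exactly what permits "crisp" hierarchies: any bracketing of the same fragment sequence denotes the same workflow.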
Abstract:
Monitoring environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling the analysis of acoustic data: online collaboration, manual, automatic and human-in-the-loop analysis.
Abstract:
Intelligent Tutoring Systems (ITSs) are computer systems designed to provide individualised help to students learning in a problem-solving context. The difference between an ITS and a Computer Assisted Instruction (CAI) system is that an ITS has a Student Model, which allows it to provide a better educational environment. The Student Model contains information on what the student knows, and does not know, about the domain being learnt, as well as other personal characteristics such as preferred learning style. This research has resulted in the design and development of a new ITS: Personal Access Tutor (PAT). PAT is an ITS that helps students to learn Rapid Application Development in a database environment. More specifically, PAT focuses on helping students to learn how to create forms and reports in Microsoft Access. To provide an augmented learning environment, PAT’s architecture differs from that of most other ITSs. Instead of relying on a simulation, PAT uses a widely-used database development environment (Microsoft Access). This enables students to ask for help while developing real applications using real database software. As part of this research, I designed and created the knowledge base required for PAT, which comprises four models: the domain, student, tutoring and exercises models. The Instructional Expert I created for PAT provides individualised help to students to help them correctly finish each exercise, and also proposes the next exercise that a student should work on. PAT was evaluated by students enrolled in the Databases subject at QUT, and by staff members involved in teaching the subject. The results of the evaluation were positive and are discussed in the thesis.
Abstract:
Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. A system that continuously checks the identity of the user throughout the session, without being intrusive to the end-user, is therefore needed. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, most of them based on biometrics. These continuous biometric authentication systems (CBAS) are driven by user traits and characteristics. One of the main types of biometric is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics remain available after the authentication step at the start of the computer session. Currently, there is insufficient research in the field of CBAS with keystroke dynamics. To date, most existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary CBAS with keystroke dynamics approaches use character sequences as features representative of user typing behavior, but their feature selection criteria do not guarantee features with strong statistical significance, which may lead to a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behavior. Finally, existing CBAS based on keystroke dynamics are typically dependent on pre-defined user-typing models for continuous authentication.
This dependency restricts such systems to authenticating only known users whose typing samples have been modelled. This research addresses these limitations of existing CBAS schemes by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistical feature selection techniques that yield features with the highest statistical significance and encompass different user typing behaviors, representing user typing patterns effectively. Finally, the research proposes a user-independent threshold approach that can authenticate a user accurately without needing any predefined user typing model a priori. We also enhance the technique to detect an impostor or intruder who may take over at any point during the computer session.
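The thesis's four feature selection techniques are not detailed in the abstract; the sketch below only shows the common raw inputs of keystroke dynamics, per-key hold times and per-digraph flight times, from which such statistical features are typically derived (the event format and the press-to-press definition of flight time are assumptions):

```python
import statistics

def keystroke_features(events):
    """Extract simple timing features from keyboard events.
    events: list of (key, press_ms, release_ms) tuples in typing order
    (illustrative format). Returns per-key hold times and per-digraph
    flight times, each summarised as (mean, population std dev)."""
    holds, flights = {}, {}
    for i, (key, press, release) in enumerate(events):
        holds.setdefault(key, []).append(release - press)      # dwell time
        if i + 1 < len(events):
            nxt_key, nxt_press, _ = events[i + 1]
            # press-to-press latency for the digraph key->next_key
            flights.setdefault(key + nxt_key, []).append(nxt_press - press)

    def summarise(d):
        return {k: (statistics.mean(v), statistics.pstdev(v)) for k, v in d.items()}

    return summarise(holds), summarise(flights)
```

A continuous authenticator would compare such summaries against a reference profile (or, in the thesis's user-independent approach, against a threshold that needs no per-user model).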
Abstract:
Modern applications comprise multiple components, such as browser plug-ins, often of unknown provenance and quality. Statistics show that failure of such components accounts for a high percentage of software faults. Enabling isolation of such fine-grained components is therefore necessary to increase the robustness and resilience of security-critical and safety-critical computer systems. In this paper, we evaluate whether such fine-grained components can be sandboxed using the hardware virtualization support available in modern Intel and AMD processors. We compare the performance and functionality of this approach to two previous software-based approaches. The results demonstrate that hardware isolation minimizes the difficulties encountered with software-based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution's correctness. We also show that our relatively simple implementation has equivalent run-time performance, with overheads of less than 34%, does not require custom tool chains, and provides enhanced functionality over software-only approaches, confirming that hardware virtualization technology is a viable mechanism for fine-grained component isolation.
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well-suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably-saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems that combine computational hardware such as CPUs and GPUs are attractive for scientific computing due to the potential of GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function.
This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes would be required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
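The matrix-free idea mentioned here, approximating a Jacobian-vector product from residual evaluations so the Krylov solver never needs an assembled matrix, can be sketched as a first-order finite difference (the forward-difference form and the fixed ε below are illustrative; production Jacobian-free Newton-Krylov codes scale ε to the norms of u and v):

```python
def jacvec(F, u, v, eps=1e-7):
    """Matrix-free approximation of J(u) @ v for residual F, using one extra
    residual evaluation: J v ~ (F(u + eps*v) - F(u)) / eps. This is what lets
    an inexact Newton-Krylov iteration run entirely off residual evaluations."""
    Fu = F(u)
    up = [ui + eps * vi for ui, vi in zip(u, v)]
    Fup = F(up)
    return [(a - b) / eps for a, b in zip(Fup, Fu)]
```

Because only `F` is ever called, porting the residual evaluation to the GPU (as the thesis does) is sufficient to run the whole implicit solve there.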
Abstract:
Alcohol consumption is enmeshed with Australian culture (Palk, 2008), and the use and misuse of alcohol contributes to considerable health and social harms (Barbor et al., 2010; English et al., 1995; Gutjahr, Gmel, & Rehm, 2001; Palk, 2008; Steenkamp, Harrison, & Allsop, 2002). Despite shifts in the way that alcohol is consumed and how it is used, it has been reported that one-third of all alcohol consumed is done so within licensed premises (Lang, Stockwell, Rydon, & Gamble, 1992). Consequently, licensed premises are over-represented as settings in which alcohol-related harms occur. These harms, particularly those related to violence, are associated with particular licensed premises operating in the night-time economy (Briscoe & Donnelly, 2001b; Chikritzhs, Stockwell, & Masters, 1997; Homel, Tomsen, & Thommeny, 1991; Stockwell, 1997). Police have a role not only in responding to the manifestation of harms, such as crime, injuries, assaults, domestic violence, stealing and sexual offences, but also in preventing problems and thereby reducing alcohol and other drug-related harms (Doherty & Roche, 2003). Given the extent of alcohol consumption within licensed premises and the nature and extent of the harms, as well as the lack of opportunity to influence outcomes in other settings (e.g. the home), licensed premises offer police and other stakeholders a significant opportunity to positively influence the reduction of alcohol-related harm. This research focuses specifically on the police role in policing licensed premises. Primarily, this research aims to investigate the factors relevant to why and how police officers respond to alcohol-related incidents inside and outside licensed premises. It examines the attitudes and beliefs of police and assesses their knowledge, capacity and ability to effectively police licensed premises. The research methodology uses three distinct surveys.
Each contributes to understanding the motivations and practice of police officers in this important area of harm reduction. Study One involved a survey of police officers within a police district (Brisbane Central District) in Queensland, Australia, and used a comprehensive questionnaire involving both quantitative and qualitative techniques. A key research outcome of Study One was the finding that officers had low levels of knowledge of the strategies that are effective in addressing alcohol-related harm both inside and outside licensed premises. Paradoxically, these officers also reported extensive recent experience in dealing with alcohol issues in these locations. In addition, these officers reported that alcohol was a significant contextual factor in the majority of matters to which they responded. Officers surveyed reported that alcohol increased the difficulty of responding to situations and that licensed premises (e.g. nightclubs, licensed clubs and hotels) were the most difficult contexts to police. Those surveyed were asked to self-assess their knowledge of the Liquor Act (Qld), which is the primary legislative authority in Queensland for regulating licensed premises. Surprisingly, well over half of the officers (65%) reported ‘no’ to ‘fair’ knowledge of the Act, despite believing that their skill level to police such premises was in the ‘good to very good’ range. In an important finding, officers reported a greater skill level to police outside licensed premises than inside such premises, indicating that officers felt less capable, from a skill perspective, of operating within the confines of a licensed premises than in the environment immediately outside such premises. Another key finding was that officers reported greater levels of training in responding to situations outside and around licensed premises than to situations inside licensed premises.
Officers were also asked to identify the frequency with which they employed specified regulatory enforcement and community-based strategies. Irrespective of the type of response, ‘taking no action’ or passive policing interventions were not favoured by officers. The findings identified that officers favoured a range of strategies (sending home, releasing into the custody of friends, etc.) in preference to arrest. In another key finding, officers generally reported their support for operational stakeholder partnership approaches to policing licensed premises. This was evidenced by the high number of officers (over 90%) reporting that there should be shared responsibility for enforcing the provisions of the Liquor Act. Importantly, those surveyed also identified the factors which constrain or prevent them from policing licensed premises. Study Two involved interviewing a small but comprehensive group (n=11) of senior managers from within the Queensland Police Service (QPS) who have responsibility for setting operational and strategic policy. The aim of this study was to examine the attitudes, perceptions and influence that senior officers (at the strategy and policy-setting level) had on the officers at the operational level. This qualitative study was carried out using purposive sampling (Denzin & Lincoln, 2005; Guba & Lincoln, 1989) with a focused interview and thematic analytic approach. The interview participants were drawn from three tiers of management at district, regional and whole-of-organisation levels. The first key theme emerging from the study related to role, in terms of both the QPS's broader organisational role and the individual officer's role with respect to the policing of licensed premises. For the QPS organisational role, participants at all three strategic levels showed a high degree of congruity as to the organisation's service role; that is, to enhance public safety.
With respect to participants’ beliefs as to whether police officers have knowledge and understanding of their individual roles concerning licensed premises (as opposed to the QPS role), participants reported most commonly that officers had a reasonable to clear understanding of their role. Participant comments also were supportive of the view that officers operating in the research area, Brisbane Central District (BCD), had a clearer understanding of their role than police operating in other locations. The second key theme to emerge identified a disparity between the knowledge and capability of specialist police, compared with general duties police, to police licensed premises. In fact, a number of the responses to a variety of questions differentiated specialist and general police in a range of domains. One such example related to the clarity of understanding of officer role. Participants agreed that specialist police (Liquor Enforcement & Proactive Strategies [LEAPS] officers) had more clarity of understanding in terms of their role than generalist police. Participants also were strongly of the opinion that specialist police had higher skill levels to deal with issues both inside and outside licensed premises. Some participants expressed the view that general duty police undertook purely response-related activities, or alternatively, dealt with lower order matters. Conversely, it was viewed that specialist police undertook more complex tasks because of their higher levels of knowledge and skill. The third key theme to emerge concerned the identification of barriers that serve to restrict or prevent police officers from policing licensed premises. Participant responses strongly indicated that there was a diversity of resourcing barriers that restrict police from undertaking their roles in licensed premises. 
Examples of such barriers were the lack of police and the low ratio of police to patrons, available officer time, and lack of organisational investment in skills and knowledge acquisition. However, some participants indicated that police resourcing in the BCD was appropriate and officers were equipped with sufficient powers (policy and legislation). Again, the issue of specialist police was raised by one participant who argued that increasing the numbers of specialist police would ameliorate the difficulties for police officers policing licensed premises. The fourth and last key theme to emerge from Study Two related to the perception of senior officers regarding the opportunity and capability of officers to leverage off external partnerships to reduce harms inside and outside licensed premises. Police working in partnership in BCD was seen as an effective harm reduction strategy and strongly supported by the participants. All participants demonstrated a high degree of knowledge as to who these partners were and could identify those government, non-government and community groups precisely. Furthermore, the majority of participants also held strong views that the partnerships were reasonably effective and worked to varying degrees depending on the nature of the partnership and issues such as resourcing. These senior officers identified better communication and coordination as factors that could potentially strengthen these partnerships. This research finding is particularly important for senior officers who have the capacity to shape the policy and strategic direction of the police service, not only in Queensland but throughout Australasia. 
Study Three examined the perceptions of those with links to the broader liquor industry (government, non-government and community but exclusive of police) concerning their understanding of the police role and the capacity of police to reduce alcohol-related harm inside and outside licensed premises, and their attitudes towards police. Participants (n=26) surveyed represented a range of areas including the liquor industry, business representatives and government representatives from Queensland Fire and Rescue Service, Queensland Ambulance Service, Brisbane City Council and Queensland Health. The first key theme to emerge from Study Three related to participant understanding of the QPS organisational role and, importantly, the individual officer role in policing licensed premises. In terms of participant understanding of the QPS role, there was a clear understanding by the majority of participants that the police role was to act in ways consistent with the law and to otherwise engage in a range of enforcement-related activities. Participants saw such activities falling into two categories. The first category related to reactive policing, which included actions around responding to trouble in licensed premises, monitoring crowd controllers and removing trouble-makers. In the second category, proactive approaches, participants identified the following activities as consistent with that approach: early intervention with offenders, support of licensed premises operators and high visibility policing. When participants were asked about their understanding of individual officer roles in the policing of licensed premises, a range of responses were received, but the consistent message that emerged was that there is a different role to be played by general duty (uniformed) police compared to specialist (LEAPS Unit) police, which reflects differences in knowledge, skill and capability.
The second key theme that emerged from the data related to the external participants’ views of the knowledge and capability of specialist police, compared with general duty police, to police licensed premises. As noted in the first key theme, participants were universally of the view that the knowledge, skill and capability of police in specialist units (LEAPS Unit) was at a higher level than that of general duty police. Participants observed that these specialist officers were better trained than their colleagues in generalist areas and were therefore better able to intervene knowledgeably and authoritatively to deal with problems and issues as they emerged. Participants also reported that officers working within BCD generally had a positive attitude to their duties and had important local knowledge that they could use in the resolution of alcohol-related issues. Participants also commented on the importance of sound and effective QPS leadership, as well as the quality of the leadership in BCD. On both these measures, there was general consensus from participants, who reported positively on the importance and effectiveness of such leadership in BCD. The third key theme to emerge from Study Three concerned the identification of barriers that serve to restrict or prevent police officers from policing licensed premises. Overwhelmingly, external participants reported the lack of human resources (i.e. police officers) as the key barrier. Other resourcing limitations, such as available officer time, police computer systems, and the time taken to charge offenders, were identified as barriers. Some participants identified barriers in the liquor industry such as ‘dodgy operators’ and negative media attention as limitations. Other constraints to emerge related to government and policy barriers. These were reflected in comments about the collection by government of fees from licensees and better ‘powers’ for police to deal with offenders. 
The fourth and final key theme that emerged from Study Three related to the opportunities for and capability of police to leverage external partnerships to reduce harms inside and outside licensed premises. Not surprisingly, participants had comprehensive knowledge of a broad range of stakeholders, from a diversity of contexts, influential in addressing issues in licensed premises. Many participants reported their relationships with the police and other stakeholders as effective, productive and consistent with the objectives of partnering to reduce alcohol-related harm. On the other hand, some were concerned about their relationships with other stakeholders, particularly those with a compliance function (e.g. Office of Liquor & Gaming Regulation [OLGR]). The resourcing limitations of partners and stakeholders were also raised as an important constraint on realising the optimum relationship. Again, political issues were mentioned in terms of their impact on partnerships, with participants stating that there is at times political interference and that politicians complicate the relationships of stakeholders. There are some significant strengths with respect to the methodology of this research. The research is distinguished from previous work in that it examines these critical issues from three distinct perspectives (i.e. police officer, senior manager and external stakeholder). Other strengths relate to the strong theoretical framework that guides and informs the research. There are also some identified limitations, including the subjective nature of self-report data as well as the potential for bias by the author, which was controlled for using a range of measures. A further limitation concerns the potential for transferability and generalisability of the findings to other locations, given the distinctive nature of the BCD. These limitations and issues of transferability are dealt with at length in the thesis.
Despite a growing body of literature about contextual harms associated with alcohol, and specific research concerning police intervention in such contextual harms, there is still much to learn. While research on the subject of police engaging in alcohol-related incidents has focused on police behaviours and strategies in response to such issues, there is a paucity of research that focuses on the knowledge and understanding of officers engaged in such behaviours and practices. Given the scarcity of research dealing with the knowledge, skills and attitudes of police officers responding to harms inside and outside licensed premises, this research contributes significantly to what is a recent and growing body of research and literature in the field. The research makes a practical contribution to police agencies’ understanding of officer knowledge and police practice in ways that have the potential to shape education and training agendas, policy approaches around generalist versus specialist policing, strategic and operational strategy, as well as partnership engagements. The research also makes a theoretical contribution given that the research design is informed by the Three Circle
Abstract:
The Remote Sensing Core Curriculum (RSCC) was initiated in 1993 to meet the demand for a college-level set of resources to enhance the quality of education across national and international campuses. The American Society of Photogrammetry and Remote Sensing adopted the RSCC in 1996 to sustain support of this educational initiative for its membership and collegiate community. A series of volumes, containing lectures, exercises, and data, is being created by expert contributors to address the different technical fields of remote sensing. The RSCC program is designed to operate on the Internet, taking full advantage of World Wide Web (WWW) technology for distance learning. The issues of curriculum development related to the educational setting, with demands on faculty, students, and facilities, are considered in order to understand the new paradigms for WWW-influenced computer-aided learning. The WWW is shown to be especially appropriate for facilitating remote sensing education, with its requirements for addressing image data sets and multimedia learning tools. The RSCC is located at http://www.umbc.edu/rscc.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which usually originate outside the enterprise, to expand activity through better service to current clients as well as the identification of new opportunities. Moreover, these activities are now largely based on software systems hosted in a “cloud computing” environment. The more than 50-year-old phrase expressing mistrust in computer systems, “garbage in, garbage out” (“GIGO”), describes the problems of unqualified and unquestioning dependence on information systems. A more relevant interpretation of GIGO arose sometime later, namely “garbage in, gospel out”: with large-scale information systems built around ERP, open data sets and “big data” analytics, particularly in a cloud environment, it may be almost impossible to verify the authenticity and integrity of the data sets used. In turn, this may easily result in decision making based upon questionable and unverifiable results. Illicit “impersonation” of, and modification to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. This paper discusses the pressing need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment. Some current and appropriate technologies now being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
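The verification problem the abstract raises can be made concrete with a minimal sketch. This is not the paper's proposed mechanism; it is an illustrative Python example, under the assumption that a data custodian publishes a manifest of SHA-256 digests separately from the data sets themselves, so a consumer can detect modification in transit. The file names and contents are hypothetical, and a production system would additionally need the manifest itself to be authenticated (e.g. digitally signed).

```python
import hashlib


def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


def verify_against_manifest(datasets: dict, manifest: dict) -> dict:
    """Check each named data set against its expected digest.

    A data set passes only if it appears in the manifest and its
    recomputed digest matches the published one.
    """
    return {
        name: manifest.get(name) == sha256_digest(blob)
        for name, blob in datasets.items()
    }


# Hypothetical open data sets as retrieved from an external provider.
datasets = {
    "census_2023.csv": b"region,population\nA,1000\nB,2000\n",
    "rainfall.csv": b"month,mm\nJan,120\nFeb,95\n",
}

# Manifest of digests, assumed published separately by the custodian.
manifest = {name: sha256_digest(blob) for name, blob in datasets.items()}

# Simulate illicit modification of one data set in transit.
tampered = dict(datasets)
tampered["rainfall.csv"] = b"month,mm\nJan,120\nFeb,999\n"

results = verify_against_manifest(tampered, manifest)
# The unmodified data set verifies; the tampered one does not.
```

Note that this sketch only detects modification relative to a trusted manifest; it cannot establish the authenticity of the original publisher, which is why the paper argues that identity, naming and addressing services are needed alongside integrity checks.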