796 results for Police, Information Technology, Knowledge Work, Knowledge Organisation, Systems Applications
Abstract:
Purpose – This paper aims to provide insights into the moral values embodied by a popular social networking site (SNS), Facebook. Design/methodology/approach – This study is based upon qualitative fieldwork, involving participant observation, conducted over a two-year period. The authors adopt the position that technology as well as humans has a moral character in order to disclose ethical concerns that are not transparent to users of the site. Findings – Much research on the ethics of information systems has focused on the way that people deploy particular technologies, and the consequences arising, with a view to making policy recommendations and ethical interventions. By focusing on technology as a moral actor with reach across and beyond the internet, the authors reveal the complex and diffuse nature of ethical responsibility and the consequent implications for governance of SNS. Research limitations/implications – The authors situate their research in a body of work known as disclosive ethics, and argue for an ongoing process of evaluating SNS to reveal their moral importance. Along with that of other authors in the genre, this work is largely descriptive, but the paper engages with prior research by Brey and Introna to highlight the scope for theory development. Practical implications – Governance measures that require the developers of social networking sites to revise their designs fail to address the diffuse nature of ethical responsibility in this case. Such technologies need to be opened up to scrutiny on a regular basis to increase public awareness of the issues and thereby disclose concerns to a wider audience. The authors suggest that there is value in studying the development and use of these technologies in their infancy, or if established, in the experiences of novice users. Furthermore, flash points in technological trajectories can prove useful sites of investigation. 
Originality/value – Existing research on social networking sites either fails to address ethical concerns head on or adopts a tool view of the technologies so that the focus is on the ethical behaviour of users. The authors focus upon the agency, and hence the moral character, of technology to show both the possibilities for, and limitations of, ethical interventions in such cases.
Unpacking user relations in an emerging ubiquitous computing environment: introducing the bystander
Abstract:
The move towards technological ubiquity is allowing a more idiosyncratic and dynamic working environment to emerge that may result in the restructuring of information communication technologies, and changes in their use through different user groups' actions. Taking a ‘practice’ lens to human agency, we explore the evolving roles of, and relationships between, these user groups and their appropriation of emergent technologies by drawing upon Lamb and Kling's social actor framework. To illustrate our argument, we draw upon a study of a UK Fire Brigade that has introduced a variety of technologies in an attempt to move towards embracing mobile and ubiquitous computing. Our analysis of the enactment of such technologies reveals that Bystanders, a group yet to be taken as the central unit of analysis in information systems research, or considered in practice, are emerging as important actors. The research implications of our work relate to the need to further consider Bystanders in deployments other than those that are mobile and ubiquitous. For practice, we suggest that Bystanders require consideration in the systems development life cycle, particularly in terms of design and education in processes of use.
Abstract:
The idea of information literacy, broadly defined as the ability to recognise information needs and identify, evaluate and use information effectively, has been of growing concern in the education sectors for a number of years; whilst in the workplace, employers and managers have perhaps attended more to the need for computer and information technology skill. New descriptions of information literacy, that may be of value to the business sector, are now beginning to appear as a result of qualitative research into how professional employees experience the effective use of information. This paper summarises the outcomes of an investigation into the experience of information literacy amongst various types of professionals; and explores the possible differences and interrelations between individual and organisational information literacy suggested by these outcomes. Seven different ways of experiencing information literacy were identified. These experiences are closely related to important workplace processes such as environmental scanning, information management, corporate memory, and research and development; confirming that information literacy should be considered a significant part of the character of learning organisations as well as being a key characteristic of the organisation's employees. Implications of individual and organisational information literacy for beginning and continuing professional education are explored.
Abstract:
The Australian economy is currently supported by a resources boom, and work opportunities in the traditionally male-dominated fields of construction, engineering and information technology are at a premium. Yet despite more than 25 years of anti-discrimination and equal employment opportunity legislation, these industries still employ few women in operational or management roles. This paper investigates the issue of the low representation of women in project management and their different work and career experiences through interviews with male and female project managers.
Abstract:
There is an increasing interest in the use of information technology as a participatory planning tool, particularly the use of geographical information technologies to support collaborative activities such as community mapping. However, despite their promise, the introduction of such technologies does not necessarily promote better participation or improve collaboration. In part this can be attributed to a tendency for planners to focus on the technical considerations associated with these technologies at the expense of broader participation considerations. In this paper we draw on the experiences of a community mapping project with disadvantaged communities in suburban Australia to highlight the importance of selecting tools and techniques which support and enhance participatory planning. This community mapping project, designed to identify and document community-generated transport issues and solutions, had originally intended to use cadastral maps extracted from the government's digital cadastral database as the foundation for its community mapping approach. It was quickly discovered that the local residents found the cadastral maps confusing, as the maps lacked sufficient detail to orient them to their suburb (the study area). In response to these concerns, and consistent with the project's participatory framework, a conceptual base map based on residents' views of landmarks of local importance was developed to support the community mapping process. Based on this community mapping experience we outline four key lessons learned regarding the process of community mapping and the place of geographical information technologies within this process.
Abstract:
Recent road safety statistics show that the decades-long downward trend in fatalities is stalling. Statistics further show that crashes are mostly driven by human error, compared with other factors such as environmental conditions and mechanical defects. Within human error, the dominant source is perceptive error, which represents about 50% of the total. The next two sources, interpretation and evaluation, together with perception account for more than 75% of human-error-related crashes. These statistics show that helping drivers perceive and understand their environment better, or supplementing them when they are clearly at fault, is a way to assess road risk well and, as a consequence, further decrease fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of the environment. However, because of inherent limitations in range and field of view, these systems' perception of the environment remains largely limited to a small zone of interest around a single vehicle. Such limitations can be overcome by enlarging that zone through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim at compensating for local systems' limitations by combining embedded information technology with intervehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing not only information from multiple in-vehicle sensors but also information obtained from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map.
It is a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims to demonstrate that augmented perception performs better than non-cooperative approaches and that it can be used to successfully identify road risk. We found it necessary to evaluate the performance of augmented perception in order to better understand its limitations. Indeed, while many promising results have already been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, has not yet been thoroughly assessed; nor have the limitations of augmented perception and its underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver the quality of service needed to support life-critical safety systems. This is especially true as the road environment is a complex, highly variable setting with many sources of imperfection and error, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. We then develop a new CS-applications simulation architecture. This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. We first confirm earlier results in terms of reduced crash numbers, but raise doubts about benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map.
Our approach aims to provide a generalist architecture that can use many different types of sensors to create the map, and is not limited to any specific application. The data association problem is tackled with an MHT approach based on Belief Theory. Then, augmented and single-vehicle perception are compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier; we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this might hold only for our specific scenario. Finally, we propose a new approach using augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and then compared with a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
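The layered augmented map described above can be sketched in miniature: layers keyed by information source are fused into a single perceived environment, letting the ego vehicle "see" objects reported over IVC beyond its own sensor range. All names, the one-dimensional road model and the local-first fusion rule are illustrative assumptions, not the thesis's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    obj_id: str
    position: float  # 1-D position along the road (m), for simplicity
    source: str      # "local" or the identifier of the sending vehicle

@dataclass
class AugmentedMap:
    # Layered repository, one layer per information source, echoing the
    # "layered architecture" the abstract describes.
    layers: dict = field(default_factory=dict)

    def ingest(self, det: Detection) -> None:
        self.layers.setdefault(det.source, {})[det.obj_id] = det

    def fused(self) -> dict:
        # Naive fusion rule: local observations win; remote ones fill gaps.
        merged = {}
        for source in sorted(self.layers, key=lambda s: s != "local"):
            for obj_id, det in self.layers[source].items():
                merged.setdefault(obj_id, det)
        return merged

SENSOR_RANGE = 80.0  # assumed local perception range (m)

amap = AugmentedMap()
amap.ingest(Detection("car-1", 40.0, "local"))
amap.ingest(Detection("car-2", 250.0, "vehicle-B"))  # received via IVC

perceived = amap.fused()
beyond_local = [d for d in perceived.values() if d.position > SENSOR_RANGE]
```

A real implementation would associate and fuse overlapping detections probabilistically (the thesis uses an MHT approach based on Belief Theory) rather than simply preferring local observations.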
Abstract:
Over the last decade, the majority of existing search techniques have been either keyword-based or category-based, resulting in unsatisfactory effectiveness. Meanwhile, studies have illustrated that more than 80% of users prefer personalized search results. As a result, many studies have devoted a great deal of effort (in work referred to as collaborative filtering) to investigating personalized notions for enhancing retrieval performance. One of the fundamental yet most challenging steps is to capture precise user information needs. Most Web users are inexperienced or lack the capability to express their needs properly, whereas existing retrieval systems are highly sensitive to vocabulary. Researchers have increasingly proposed the use of ontology-based techniques to improve current mining approaches. These techniques are not only able to refine search intentions within specific generic domains, but also to access new knowledge by tracking semantic relations. In recent years, some researchers have attempted to build ontological user profiles according to discovered user background knowledge. This knowledge is drawn from both global and local analyses, which aim to produce tailored ontologies from a group of concepts. However, a key problem that has not been addressed is how to accurately match diverse local information to universal global knowledge. This research conducts a theoretical study of the use of personalized ontologies to enhance text mining performance. The objective is to understand user information needs in terms of a "bag of concepts" rather than "words". The concepts are gathered from a general world knowledge base, the Library of Congress Subject Headings. To return desirable search results, a novel ontology-based mining approach is introduced to discover accurate search intentions and learn personalized ontologies as user profiles.
The approach can not only pinpoint users' individual intentions in a rough hierarchical structure, but can also interpret their needs through a set of acknowledged concepts. Along with the global and local analyses, a further concept matching approach is carried out to address the mismatch between local information and world knowledge. Relevance features produced by the Relevance Feature Discovery model are used as representatives of local information. These features have been proven the best alternative to user queries for avoiding ambiguity, and consistently outperform the features extracted by other filtering models. The two proposed approaches are both evaluated in a scientific evaluation with the standard Reuters Corpus Volume 1 testing set. A comprehensive comparison is made with a number of state-of-the-art baseline models, including TF-IDF, Rocchio, Okapi BM25, the Pattern Taxonomy Model, and an ontology-based model. The gathered results indicate that top precision can be improved remarkably with the proposed ontology mining approach, and that the matching approach is successful, achieving significant improvements in most information filtering measures. This research contributes to the fields of ontological filtering, user profiling, and knowledge representation. The related outputs are critical when systems are expected to return proper mining results and provide personalized services. The findings have the potential to facilitate the design of advanced preference mining models with impact on people's daily lives.
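The motivating problem, retrieval systems being highly sensitive to vocabulary, and the "bag of concepts" remedy can be illustrated with a toy sketch. The term-to-concept table below merely stands in for Library of Congress Subject Headings relations and is not part of the thesis:

```python
# Hypothetical miniature "world knowledge base" mapping terms to broader
# subject concepts, standing in for the Library of Congress Subject Headings.
TERM_TO_CONCEPT = {
    "laptop": "Computers",
    "notebook": "Computers",
    "pc": "Computers",
    "virus": "Computer security",
    "malware": "Computer security",
}

def bag_of_words(text):
    return set(text.lower().split())

def bag_of_concepts(text):
    # Lift each recognised term to its concept; unknown terms are dropped.
    return {TERM_TO_CONCEPT[t] for t in bag_of_words(text) if t in TERM_TO_CONCEPT}

query = "laptop virus"
document = "malware on a notebook pc"

# No shared words at all, yet both concepts match: the concept space
# bridges the vocabulary mismatch between user and document.
word_overlap = bag_of_words(query) & bag_of_words(document)
concept_overlap = bag_of_concepts(query) & bag_of_concepts(document)
```

The thesis's actual approach additionally learns a personalized ontology over such concepts rather than using a flat lookup table.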
Abstract:
The purpose of the current study was to develop a measurement of information security culture in developing countries such as Saudi Arabia. In order to achieve this goal, the study commenced with a comprehensive review of the literature, the outcome being the development of a conceptual model as a reference base. The literature review revealed a lack of academic and professional research into information security culture in developing countries, and more specifically in Saudi Arabia. Given the increasing importance of, and significant investment developing countries are making in, information technology, there is a clear need to investigate information security culture from the perspective of developing countries such as Saudi Arabia. Furthermore, our analysis indicated a lack of clear conceptualisation of, and distinction between, the factors that constitute information security culture and the factors that influence it. Our research aims to fill this gap by developing and validating a measurement model of information security culture, as well as developing an initial understanding of the factors that influence security culture. A sequential mixed method, consisting of a qualitative phase to explore the conceptualisation of information security culture and a quantitative phase to validate the model, is adopted for this research. In the qualitative phase, eight interviews with information security experts in eight different Saudi organisations were conducted, revealing that security culture can be constituted as a reflection of security awareness, security compliance and security ownership. Additionally, the qualitative interviews revealed that the factors that influence security culture are top management involvement, policy enforcement, policy maintenance, training and ethical conduct policies.
These factors were confirmed by the literature review as critical and important for the creation of security culture, and formed the basis for our initial information security culture model, which was operationalised and tested in different Saudi Arabian organisations. Using data from two hundred and fifty-four valid responses, we demonstrated the validity and reliability of the information security culture model through Exploratory Factor Analysis (EFA), followed by Confirmatory Factor Analysis (CFA). In addition, using Structural Equation Modelling (SEM), we were further able to demonstrate the validity of the model in a nomological net, as well as provide some preliminary findings on the factors that influence information security culture. The current study contributes to the existing body of knowledge in two major ways: firstly, it develops an information security culture measurement model; secondly, it presents empirical evidence for the nomological validity of the security culture measurement model and the discovery of factors that influence information security culture. The current study also indicates possible future related research needs.
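As a minimal illustration of the kind of scale-reliability check that typically accompanies EFA/CFA validation of a measurement model (an assumption here; the abstract does not state which reliability statistics were used), Cronbach's alpha for a small item set can be computed from made-up Likert responses:

```python
def cronbach_alpha(items):
    # items: one inner list per questionnaire item, each holding all
    # respondents' scores for that item.
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Illustrative responses (3 items x 5 respondents), not data from the study.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
]
alpha = cronbach_alpha(items)  # high: the items move together
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.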
Abstract:
Building Information Modeling (BIM) is the use of virtual building information models to develop building design solutions and design documentation and to analyse construction processes. Recent advances in IT have enabled advanced knowledge management, which in turn facilitates sustainability and improves asset management in the civil construction industry. There are several important qualifiers and some disadvantages of the current suite of technologies. This paper outlines the benefits, enablers, and barriers associated with BIM and makes suggestions about how these issues may be addressed. The paper highlights the advantages of BIM, particularly the increased utility and speed, enhanced fault finding in all construction phases, and enhanced collaboration and visualisation of data. The paper additionally identifies a range of issues concerning the implementation of BIM: intellectual property, liability, risks, contracts, and the authenticity of users. Implementing BIM requires investment in new technology and skills training, the development of new ways of collaborating, and attention to Trade Practices concerns. However, when these challenges are overcome, BIM as a new information technology promises a new level of collaborative engineering knowledge management, designed to facilitate sustainability and asset management in design, construction, asset management practices, and, eventually, decommissioning for the civil engineering industry.
Abstract:
Almost half of all game players are now women. However, women represent only a small proportion of game developers. There is a lack of previous research to suggest why women do not pursue careers in games and how we can attract more women to the industry. In this paper, we investigate the issues and barriers that prevent women from entering the games industry, as well as the solutions and steps that can be taken to attract more women to the industry. We draw on the lessons learned by the information technology industry and report on a program of events that was conducted at the Queensland University of Technology in 2011. These events provided some insight into the issues surrounding the lack of women in the games industry, as well as some initial steps that we can take as an industry to attract and support more female developers.
Abstract:
There is no doubt that social engineering plays a vital role in compromising most security defenses, and in attacks on people, organizations, companies, and even governments. It is the art of deceiving and tricking people into revealing critical information or performing an action that benefits the attacker in some way. Fraudulent and deceptive actors have been setting social engineering traps using information technology such as e-mail, social networks, web sites, and applications to trick victims into obeying them, accepting threats, and falling victim to crimes and attacks such as phishing, sexual abuse, financial abuse, identity theft, impersonation, physical crime, and many other forms of attack. Although organizations, researchers, practitioners, and lawyers recognize the severe risk of social engineering-based threats, there is a severe lack of understanding and control of such threats. One side of the problem is perhaps the unclear concept of social engineering, as well as the complexity of understanding how humans behave toward, approach, accept, and fail to recognize threats or the deception behind them. The aim of this paper is to explain the definition of social engineering based on theories from the many related disciplines, such as psychology, sociology, information technology, marketing, and behaviourism. We hope, through this work, to help researchers, practitioners, lawyers, and other decision makers get a fuller picture of social engineering and, therefore, to open new directions of collaboration toward detecting and controlling it.
Abstract:
Although recommender systems and reputation systems have quite different theoretical and technical bases, both types of system have the purpose of providing advice for decision making in e-commerce and online service environments. This similarity in purpose makes it natural to integrate the two in order to produce better online advice, but their differences in theory and implementation make the integration challenging. In this paper, we propose mappings to subjective opinions from the values produced by recommender systems as well as from the scores produced by reputation systems, and combine the resulting opinions within the framework of subjective logic.
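The kind of mapping the paper proposes can be sketched using the standard subjective-logic evidence-to-opinion mapping; the bridge from star ratings to evidence below is a hypothetical illustration, not necessarily the paper's own mapping:

```python
def opinion_from_evidence(r, s, W=2.0, a=0.5):
    # Standard subjective-logic mapping from positive (r) and negative (s)
    # evidence to an opinion: belief b, disbelief d, uncertainty u, base
    # rate a, with b + d + u = 1. W is the non-informative prior weight.
    denom = r + s + W
    return {"b": r / denom, "d": s / denom, "u": W / denom, "a": a}

def opinion_from_rating(avg_stars, count, scale=5.0):
    # Hypothetical bridge from a recommender's output: treat each of the
    # `count` ratings as fractionally positive/negative evidence.
    r = count * (avg_stars / scale)
    s = count * (1 - avg_stars / scale)
    return opinion_from_evidence(r, s)

# A 4-star average over 10 ratings gives a fairly confident opinion...
op_many = opinion_from_rating(4.0, 10)
# ...while the same average over 2 ratings retains more uncertainty.
op_few = opinion_from_rating(4.0, 2)
```

Expressing both recommender values and reputation scores in this common opinion form is what allows them to be combined with subjective-logic fusion operators.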
Abstract:
The complex supply chain relations of the construction industry, coupled with the substantial amount of information to be shared on a regular basis between the parties involved, make traditional paper-based data interchange methods inefficient, error-prone and expensive. The information technology (IT) applications that enable seamless data interchange elsewhere, such as Electronic Data Interchange (EDI) systems, have generally failed to be implemented successfully in the construction industry. An alternative emerging technology, Extensible Markup Language (XML), and its applicability to streamlining business processes and improving data interchange methods within the construction industry are analysed, alongside EDI technology, to identify the strategic advantages that XML provides in overcoming the barriers to implementation. In addition, the successful implementation of an XML-based automated data interchange platform for a large organization, and the proposed benefits thereof, are presented as a case study.
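XML's self-describing structure, the property contrasted here with rigid EDI message formats, can be seen in a toy interchange message; the element names are illustrative and not taken from any published construction-industry schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical construction-supply-chain message: tags carry the semantics,
# so any party can parse it without prior agreement on field positions.
xml_doc = """<MaterialOrder orderId="PO-1042">
  <Supplier>Acme Concrete</Supplier>
  <Item code="C30" unit="m3" quantity="12.5"/>
  <RequiredBy>2024-05-01</RequiredBy>
</MaterialOrder>"""

order = ET.fromstring(xml_doc)
item = order.find("Item")
quantity = float(item.get("quantity"))
```

An EDI equivalent would encode the same facts positionally against a pre-negotiated standard, which is part of the implementation barrier the paper analyses.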
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", usually originating outside the enterprise, to expand activity through better service to current clients as well as to identify new opportunities. Moreover, these activities are now largely based on relevant software systems hosted in a "cloud computing" environment. The more than 50-year-old phrase expressing mistrust in computer systems, "garbage in, garbage out" or "GIGO", describes the problems of unqualified and unquestioning dependency on information systems. A more relevant GIGO interpretation arose sometime later: "garbage in, gospel out", signifying that with large-scale information systems based around ERP, open data sets and "big data" analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable and unverifiable results. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently on offer are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
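One elementary building block for the data set integrity the paper calls for is content hashing, sketched below with made-up data; this addresses only tamper detection, not the naming, identity or audit services the paper discusses:

```python
import hashlib
import hmac

def digest(data: bytes) -> str:
    # Content fingerprint: any modification to the data set changes the hash.
    return hashlib.sha256(data).hexdigest()

# Hypothetical published open data set and its advertised fingerprint.
published = b"region,count\nNorth,120\nSouth,98\n"
fingerprint = digest(published)

# A consumer receiving a (here, tampered) copy can detect the modification
# by recomputing the digest and comparing it to the advertised one.
received = b"region,count\nNorth,120\nSouth,9800\n"
unchanged = hmac.compare_digest(digest(received), fingerprint)
```

Note that a bare hash only proves integrity relative to a trusted fingerprint; binding the fingerprint to a publisher identity (e.g. a digital signature) is among the services whose absence the paper highlights.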
Abstract:
Within Human-Computer Interaction (HCI) and Computer Supported Cooperative Work (CSCW) research, the notion of technologically-mediated awareness is often used to allow relevant people to maintain a mental model of each other's activities, behaviors and status information so that they can organize and coordinate work or other joint activities. Initial conceptions of awareness focused largely on improving productivity and efficiency within work environments. With new social, cultural and commercial needs and the emergence of novel computing technologies, the focus of technologically-mediated awareness has extended from work environments to people's everyday interactions. Hence, the scope of awareness has extended from conveying work-related activities to conveying people's emotions, love, social status and a broad range of other aspects. This trend in conceptualizing HCI design is termed experience-focused HCI. In my PhD dissertation, Designing for Awareness, I report on how we, as HCI researchers, can design awareness systems from an experience-focused HCI perspective, following the trend of conveying awareness beyond task-based, instrumental and productive needs. Within the overall aim of designing for awareness, my research advocates ethnomethodologically-informed approaches to conceptualizing and designing for awareness. In this sense, awareness is not a predefined phenomenon but something situated and particular to a given environment. I have used this approach in two design cases of developing interactive systems that support awareness beyond task-based aspects in work environments. In both cases, I followed a complete design cycle: collecting an in-situ understanding of an environment, developing implications for a new technology, implementing a prototype technology, and studying the use of the technology in its natural settings.