934 results for 280506 Coding and Information Theory


Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the results of a multi-method investigation of employee perceptions of fairness in relation to their career management experiences. Organisational justice theory (OJT) was developed as a theoretical framework, and data were gathered via 325 quantitative questionnaires, 20 semi-structured interviews and the analysis of a variety of company documents and materials. The results of the questionnaire survey provided strong support for the salience of employee perceptions of justice in regard to their evaluations of organisational career management (OCM) practices, with statistical support emerging for both an agent-systems and an interaction model of organisational justice. The qualitative semi-structured interviews provided a more detailed analysis of how fairness was experienced in practice, and confirmed the importance of the OJT constructs of fairness within this career management context. Fairness themes to emerge from this analysis included equity, needs, voice, bias suppression, consistency, ethicality, respect and feedback, drawing on many of the central tenets of distributive, procedural, interpersonal and informational justice. For the career management literature, there is empirical confirmation of a new theoretical framework for understanding employee evaluations of, and reactions to, OCM practices. For the justice literature, a new contextual domain is explored and confirmed, thus further extending the influence and applicability of the theory. For practitioners, a new framework for developing, delivering and evaluating their own OCM policies and systems is presented.

Relevance:

100.00%

Publisher:

Abstract:

Due to incomplete paperwork, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

In the agrifood sector, the explosive increase in information about environmental sustainability, often held in uncoordinated information systems, has created a new form of ignorance ('meta-ignorance') that diminishes the effectiveness of information for decision-makers. Flows of information are governed by informal and formal social arrangements that we can collectively call Informational Institutions. In this paper we review the recent literature on such institutions. From the perspectives of information theory and new institutional economics, current informational institutions are increasing both the information entropy of communications concerning environmental sustainability and stakeholders' transaction costs of using the relevant information. In our view, this reduces the effectiveness of informational governance. Future research on informational governance should explicitly address these aspects.
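
For readers outside information theory, the entropy notion invoked in this abstract can be made concrete. The sketch below is a generic illustration (not from the paper, and all names are ours): it shows how the Shannon entropy of a distribution grows as the same sustainability information is spread evenly over more uncoordinated labelling schemes, which is one way to read the claim about rising entropy and search costs.

```python
import math

# Generic Shannon entropy illustration (not from the paper under review).
# The more evenly information is spread across uncoordinated labels/channels,
# the more bits a decision-maker needs, on average, to locate what is relevant.

def entropy(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

one_standard = [1.0]              # a single coordinated sustainability label
four_schemes = [0.25] * 4         # the same information split over four schemes
sixteen_schemes = [1 / 16] * 16

print(entropy(one_standard), entropy(four_schemes), entropy(sixteen_schemes))  # 0.0 2.0 4.0
```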

Relevance:

100.00%

Publisher:

Abstract:

A possible gap exists between what parents and preschool providers know concerning children's readiness for school and what they should know when compared to teacher expectations. Students are experiencing difficulty in early schooling as a result of this gap in perspectives. This study's purpose was to describe, explain, and analyze the perspectives of parents, teachers, and preschool providers concerning school readiness. The qualitative strategy of interviewing was used with six parents, six teachers, and two preschool provider participants. Interview transcripts, field notes, member checking, and document analysis were used to interpret data and support findings. Categorization and coding organized the data and aided in theory development. Major findings of the study include: (a) all participant groups stress social skills, communication skills, and enthusiasm as most valuable for school readiness; (b) all participant groups agree that parents have primary responsibility for readiness preparation; (c) many participants suggest variables concerning family, economics, and home life contribute to a lack of readiness; (d) parents place greater value on academic skills than teachers or preschool providers; (e) preschool programs are identified as having the potential to significantly influence readiness; (f) communicating, providing positive learning experiences, and providing preschool experience are valuable ways to prepare students for school, yet differences were found in the types of experiences noted; (g) participant perspectives indicate that informing parents of readiness expectations is of major importance, and they offer suggestions to accomplish this goal, such as using public libraries and pediatrician offices as distribution points for written information and having kindergarten teachers make presentations at preschools. This study concludes that parents and preschool providers do have knowledge concerning readiness for school. They may not, however, be in a position to carry out their responsibilities due to the intervening variables that limit the amount of time, interaction, and communication they have with the children in their care. This study discloses the beliefs of parents and preschool providers that children are ready for school, while teachers conclude that many children are not ready. Suggestions for readiness preparation and information dissemination are significant findings that offer implications for practice and future study.

Relevance:

100.00%

Publisher:

Abstract:

Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies: forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ t_e/n, and by k(1-p)/(n-t_e) when p > t_e/n, where t_e is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-t_e)/(n-t_e) for t_e/n < p ≤ 1. In particular, if the code used is maximum distance separable, the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of the different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of a scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Results from comparisons between the two erasure control schemes exhibit their advantages as well as their disadvantages in delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study demonstrates how erasure control coding could be used to maximize the performance of practical systems. © 2010 IEEE.
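
The bounds quoted in this abstract can be evaluated directly. The following sketch is an illustration of those formulas only (function and parameter names are ours, not the authors'):

```python
# Illustrative evaluation of the forward-erasure-coding bounds quoted above.

def throughput_bound(n: int, k: int, t_e: int, p: float) -> float:
    """Upper bound on normalized throughput of an (n, k) code that can
    recover up to t_e erased packets, at packet loss rate p."""
    r = k / n  # code rate
    if p <= t_e / n:
        return r                        # capped by the code rate
    return k * (1 - p) / (n - t_e)      # loss exceeds the erasure-control capability

def residual_loss_bound(n: int, k: int, t_e: int, p: float) -> float:
    """Lower bound on the residual packet loss rate after decoding."""
    if p <= t_e / n:
        return 0.0
    return (n * p - t_e) / (n - t_e)

# Example: an (n, k) = (255, 223) code with t_e = 32, at 5% and 20% packet loss.
for p in (0.05, 0.20):
    print(p, throughput_bound(255, 223, 32, p), residual_loss_bound(255, 223, 32, p))
```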

Relevance:

100.00%

Publisher:

Abstract:

Backscatter communication is an emerging wireless technology that has recently gained increasing attention from both academic and industry circles. The key innovation of the technology is the ability of ultra-low-power devices to utilize nearby existing radio signals to communicate. As there is no need to generate a radio signal of their own, the devices benefit from a simple design, are very inexpensive, and are extremely energy efficient compared with traditional wireless communication. These benefits have made backscatter communication a desirable candidate for distributed wireless sensor network applications with energy constraints.

The backscatter channel presents a unique set of challenges. Unlike a conventional one-way communication (in which the information source is also the energy source), the backscatter channel experiences strong self-interference and spread Doppler clutter that mask the information-bearing (modulated) signal scattered from the device. Both of these sources of interference arise from the scattering of the transmitted signal off of objects, both stationary and moving, in the environment. Additionally, the measurement of the location of the backscatter device is negatively affected by both the clutter and the modulation of the signal return.

This work proposes a channel coding framework for the backscatter channel consisting of a bi-static transmitter/receiver pair and a quasi-cooperative transponder. It proposes the use of run-length-limited coding to mitigate the background self-interference and spread-Doppler clutter with only a small decrease in communication rate. The proposed method applies to both binary phase-shift keying (BPSK) and quadrature-amplitude modulation (QAM) schemes and increases the rate by up to a factor of two compared with previous methods.
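
For readers unfamiliar with run-length-limited (RLL) coding, the sketch below shows a generic (d, k) constraint check, in which every run of zeros between consecutive ones must have length between d and k. It is an illustration of the constraint class only, not the code construction proposed in this work, and all names are ours.

```python
# Illustrative (d, k) run-length-limited constraint check (not the proposed code design).
# Every run of zeros between two ones must have length in [d, k]; leading and
# trailing zero runs are only required to be at most k long.

def satisfies_rll(bits: list[int], d: int, k: int) -> bool:
    run = 0            # length of the current run of zeros
    seen_one = False   # whether a one has occurred yet
    for b in bits:
        if b == 1:
            if seen_one and not (d <= run <= k):
                return False
            if not seen_one and run > k:
                return False
            run, seen_one = 0, True
        else:
            run += 1
    return run <= k    # trailing run of zeros

print(satisfies_rll([1, 0, 0, 1, 0, 0, 0, 1], d=1, k=3))  # True
print(satisfies_rll([1, 1, 0, 0, 1], d=1, k=3))           # False: adjacent ones violate d = 1
```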

Additionally, this work analyzes the use of frequency modulation and bi-phase waveform coding for the transmitted (interrogating) waveform for high-precision range estimation of the transponder location. Compared with previous methods, lower range sidelobes are achieved. Moreover, since both the transmitted (interrogating) waveform coding and the transponder communication coding result in instantaneous phase modulation of the signal, cross-interference exists between the localization and communication tasks. A phase-discriminating algorithm is proposed to separate the waveform coding from the communication coding upon reception, achieving localization with up to 3 dB more signal energy than previously reported results.
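
For intuition about range sidelobes, the snippet below computes the aperiodic autocorrelation of the length-13 Barker code, a classic bi-phase sequence whose peak sidelobe magnitude is 1 (about -22 dB below the mainlobe). This is generic radar-waveform material for orientation only, not the specific waveform design analyzed in this work.

```python
import numpy as np

# Aperiodic autocorrelation of the length-13 Barker bi-phase code (generic illustration).
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

acf = np.correlate(barker13, barker13, mode="full")       # mainlobe of 13 at zero lag
mainlobe = acf.max()
sidelobe = np.abs(np.delete(acf, len(acf) // 2)).max()    # largest off-peak magnitude

print(mainlobe, sidelobe, 20 * np.log10(sidelobe / mainlobe))  # 13 1 ~-22.3 dB
```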

The joint communication-localization framework also enables a low-complexity receiver design because the same radio is used both for localization and communication.

Simulations comparing the performance of different codes corroborate the theoretical results and reveal a trade-off between information rate and clutter mitigation, as well as a trade-off among choices of waveform-channel coding pairs. Experimental results from a brass-board microwave system in an indoor environment are also presented and discussed.

Relevance:

100.00%

Publisher:

Abstract:

We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. The aim of this research is to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
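
As background for the self-nonself discrimination that this project aims to move beyond, a toy negative-selection detector can be sketched as follows. This is a generic AIS illustration under simplifying assumptions (bit-string patterns, r-contiguous matching), not the algorithm developed in this research, and all names are ours.

```python
import random

# Toy negative-selection sketch (generic AIS illustration, not this project's method).
# Detectors are random bit strings; a detector "matches" a sample if they agree on
# at least r contiguous positions (r-contiguous matching).

def r_contiguous_match(a: str, b: str, r: int) -> bool:
    run = best = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        best = max(best, run)
    return best >= r

def train_detectors(self_set: list[str], n_bits: int, n_detectors: int, r: int) -> list[str]:
    detectors = []
    while len(detectors) < n_detectors:
        cand = "".join(random.choice("01") for _ in range(n_bits))
        if not any(r_contiguous_match(cand, s, r) for s in self_set):
            detectors.append(cand)   # keep only detectors that do NOT match "self"
    return detectors

self_set = ["0000000000", "0000011111"]            # patterns of "normal" activity
detectors = train_detectors(self_set, n_bits=10, n_detectors=20, r=6)
sample = "1111100000"
flagged = any(r_contiguous_match(d, sample, 6) for d in detectors)
print(flagged)  # True here would mean the sample is flagged as non-self
```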

Relevance:

100.00%

Publisher:

Abstract:

The paper presents a critical analysis of the extant literature pertaining to the networking behaviours of young jobseekers in both offline and online environments. A framework derived from information behaviour theory is proposed as a basis for conducting further research in this area.

Method

Relevant material for the review was sourced from key research domains such as library and information science, job search research, and organisational research.

Analysis

Three key research themes emerged from the analysis of the literature: (1) social networks, and the use of informal channels of information during job search; (2) the role of networking behaviours in job search; and (3) the adoption of social media tools. Tom Wilson's general model of information behaviour was also identified as a suitable framework to conduct further research.

Results

Social networks have a crucial informational utility during the job search process. However, the processes whereby young jobseekers engage in networking behaviours, both offline and online, remain largely unexplored.

Conclusion

Identification and analysis of the key research themes reveal opportunities to acquire further knowledge regarding the networking behaviours of young jobseekers. Wilson's model can be used as a framework to provide a holistic understanding of the networking process from an information behaviour perspective.

Relevance:

100.00%

Publisher:

Abstract:

Background

Despite the effectiveness of brief lifestyle intervention delivered in primary healthcare (PHC), implementation in routine practice remains suboptimal. Beliefs and attitudes have been shown to be associated with risk factor management practices, but little is known about the process by which clinicians' perceptions shape implementation. This study aims to describe a theoretical model to understand how clinicians' perceptions shape the implementation of lifestyle risk factor management in routine practice. The implications of the model for enhancing practices will also be discussed.

Methods

The study analysed data collected as part of a larger feasibility project of risk factor management in three community health teams in New South Wales (NSW), Australia. This included journal notes kept through the implementation of the project, and interviews with 48 participants comprising 23 clinicians (including community nurses, allied health practitioners and an Aboriginal health worker), five managers, and two project officers. Data were analysed using grounded theory principles of open, focused, and theoretical coding and constant comparative techniques to construct a model grounded in the data.

Results

The model suggests that implementation reflects both clinician beliefs about whether they should (commitment) and can (capacity) address lifestyle issues. Commitment represents the priority placed on risk factor management and reflects beliefs about role responsibility congruence, client receptiveness, and the likely impact of intervening. Clinician beliefs about their capacity for risk factor management reflect their views about self-efficacy, role support, and the fit between risk factor management and their ways of working. The model suggests that clinicians formulate different expectations and intentions about how they will intervene based on these beliefs about commitment and capacity and their philosophical views about appropriate ways to intervene. These expectations then provide a cognitive framework guiding their risk factor management practices. Finally, clinicians' appraisal of the overall benefits versus costs of addressing lifestyle issues acts to positively or negatively reinforce their commitment to implementing these practices.

Conclusion

The model extends previous research by outlining a process by which clinicians' perceptions shape implementation of lifestyle risk factor management in routine practice. This provides new insights to inform the development of effective strategies to improve such practices.

Relevance:

100.00%

Publisher:

Abstract:

The conceptual domain of agency theory is one of the dominant organisational theory perspectives applied in current family business research (Chrisman et al., 2010). According to agency theory (Jensen and Meckling, 1976), agency costs generally arise due to individuals' self-interest and decision making based on rational thinking oriented toward their own preferences. With more people involved in decision making, such as through the separation of ownership and management, agency costs occur due to different preferences and information asymmetries between the owner (principal) and the employed management (agent) (Jensen and Meckling, 1976). In other words, agents take decisions based on their individual preferences (for example, short-term financial gains) instead of the owners' preferences (for example, long-term, sustainable development).

Relevance:

100.00%

Publisher:

Abstract:

In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the "tyranny" of classes. They attribute a number of problems in information modeling to inherent classification, or the disregard for the fact that instances can be conceptualized independently of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the critique of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema and the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence such as Parsons and Wand propose for information modeling.
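
The class/instance separation discussed above can be made concrete with a small sketch. The two-layer model below is our illustration, not Parsons and Wand's formalism: instances live in a base layer as property-bearing records, while classes are defined afterwards as predicates over those properties, so the same instance can be surfaced by many classes or by none.

```python
# Minimal two-layer sketch (illustrative only): class-free instances in a base layer,
# with classes defined as predicates (views) over them rather than as containers.

instances = [
    {"id": 1, "title": "Dewey Decimal Classification", "medium": "print", "year": 1876},
    {"id": 2, "title": "Colon Classification", "medium": "print", "year": 1933},
    {"id": 3, "title": "schema.org", "medium": "web", "year": 2011},
]

classes = {
    "PrintScheme": lambda i: i["medium"] == "print",
    "ModernScheme": lambda i: i["year"] >= 1950,
}

def members(class_name: str) -> list[dict]:
    """Return the instances that currently satisfy a class predicate."""
    predicate = classes[class_name]
    return [i for i in instances if predicate(i)]

print([i["title"] for i in members("PrintScheme")])   # class membership is derived, not inherent
print([i["title"] for i in members("ModernScheme")])
```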

Relevance:

100.00%

Publisher:

Abstract:

This paper reports on a study of ERP lifecycle major issues from the perspectives of individuals with substantial and diverse involvement with SAP Financials in Queensland Government. A survey was conducted of 117 ERP system project participants in five closely related state government agencies. A modified Delphi technique identified, rationalized and weighed perceived major issues in ongoing ERP life cycle implementation, management and support. The five agencies each implemented SAP Financials simultaneously using a common implementation partner. The three survey rounds of the Delphi technique, together with coding and synthesizing procedures, resulted in a set of 10 major issue categories with 38 sub-issues. Relative scores of issue importance are compared across government agencies, roles (client vs implementation partner) and organizational levels (strategic, technical and operational). Study findings confirm the importance of this finer partitioning of the data, and distinctions identified reflect the circumstances of ERP lifecycle implementation, management and support among the stakeholder groups. The study findings should also be of interest to stakeholders who seek to better understand the issues surrounding ERP systems and to better realise the benefits of ERP.

Relevance:

100.00%

Publisher:

Abstract:

In the globalizing world, knowledge and information (and the social and technological settings for their production and communication) are now seen as keys to economic prosperity. The economy of a knowledge city creates value-added products using research, technology, and brainpower. The social benefit of knowledge-based urban development (KBUD), however, extends beyond aggregate economic growth.