95 results for Information Ethics and its Applications
Abstract:
This study investigates variation in IT professionals' experience of ethics with a view to enhancing their formation and support. This is explored through an examination of the experience of IT, IT professional ethics and IT professional ethics education. The study's principal contribution is the empirical study and description of IT professionals' experience of ethics. The empirical phase is preceded by a review of conceptions of IT and followed by an application of the findings to IT education. The study's empirical findings are based on 30 semi-structured interviews with IT professionals representing a wide range of demographics, experience levels and IT sub-disciplines. Their experience of ethics is depicted as five citizenships: Citizenship of my world, Citizenship of the corporate world, Citizenship of a shared world, Citizenship of the client's world and Citizenship of the wider world. These signify an expanding awareness, which progressively accords rights to others and defines responsibility in terms of others. The empirical findings inform a Model of Ethical IT, which maps an IT professional space increasingly oriented towards others. Such a model provides a conceptual tool, available to prompt discussion and reflection, which may be employed in pursuing formation aimed at experiential change. Its usefulness for the education of IT professionals with respect to ethics is explored. The research approach employed in this study is phenomenography, a method that seeks to elicit and represent variation of experience. It understands experience as a relationship between a subject (IT professionals) and an object (ethics), and describes this relationship in terms of its foci and boundaries. The study's findings culminate in three observations indicating that change is needed in the formation and support of IT professionals: 1. in IT professionals' experience of their discipline, moving towards a focus on information users; 2. in IT professionals' experience of professional ethics, moving towards the adoption of other-centred attitudes; and 3. in IT professionals' experience of professional development, moving towards an emphasis on a change in lived experience. Based on these results, employers, educators and professional bodies may want to evaluate how they approach professional formation and support if they aim to promote a comprehensive awareness of ethics in IT professionals.
Abstract:
We present a new penalty-based genetic algorithm for the multi-source, multi-sink minimum vertex cut problem and illustrate its usefulness with two real-world applications. We prove that, by exploiting domain-specific knowledge, the genetic algorithm always produces a feasible solution. The algorithm has been implemented for the example applications and evaluated to show how well it scales as the problem size increases.
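The abstract does not detail the penalty scheme, so the following is only a minimal sketch of how a penalty-based fitness function for the multi-source, multi-sink minimum vertex cut problem might look: candidates that leave some sink reachable from a source are penalised rather than discarded. All names and the penalty weight are illustrative assumptions, not the authors' implementation.

```python
from collections import deque

def fitness(candidate, graph, sources, sinks, penalty=1000):
    """Lower is better: number of removed vertices plus a penalty for
    every sink that remains reachable from some source."""
    removed = {v for v, bit in candidate.items() if bit}
    frontier = deque(s for s in sources if s not in removed)
    reachable = set(frontier)
    while frontier:                      # BFS over the residual graph
        u = frontier.popleft()
        for w in graph[u]:
            if w not in removed and w not in reachable:
                reachable.add(w)
                frontier.append(w)
    violations = sum(1 for t in sinks if t in reachable)
    return len(removed) + penalty * violations

# Toy check: removing 'b' separates source 'a' from sink 'c' -> fitness 1.
g = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
print(fitness({'a': 0, 'b': 1, 'c': 0}, g, sources=['a'], sinks=['c']))
```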
Abstract:
Bana et al. proposed the formal indistinguishability relation (FIR), i.e. an equivalence between two terms built from an abstract algebra. Later, Ene et al. extended it to cover active adversaries and random oracles. This notion enables a framework to verify computational indistinguishability while still offering the simplicity and formality of symbolic methods. We are in the process of building an automated tool for checking FIR between two terms. First, we extend the work of Ene et al. further by covering ordered sorts and simplifying the treatment of random oracles. Second, we investigate the possibility of combining algebras, since this makes the tool scalable and able to cover a wide class of cryptographic schemes. Specifically, we show that the combined algebra is still computationally sound as long as each constituent algebra is sound. Third, we design proving strategies and implement the tool. Essentially, the strategies allow us to find a sequence of intermediate terms, each formally indistinguishable from the next, between two given terms; FIR between the two given terms is then guaranteed by the transitivity of FIR. Finally, we show applications of the work, e.g. to key exchanges and encryption schemes. In the future, the tool should be easily extensible to cover many more schemes. This work continues our previous research on the use of compilers to aid automated proofs for key exchange.
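As a rough illustration of the proving strategy described above (finding a chain of intermediate terms and concluding by transitivity), the sketch below searches for such a chain by breadth-first search. The one-step prover is abstracted as an oracle passed in as a function; all names are hypothetical and this is not the authors' tool.

```python
from collections import deque

def find_chain(t1, t2, one_step_equiv, max_depth=10):
    """BFS for a chain t1 ~ ... ~ t2, where one_step_equiv(t) yields terms
    provable indistinguishable from t in one step (assumed oracle);
    transitivity of FIR then gives FIR(t1, t2)."""
    parent, frontier = {t1: None}, deque([(t1, 0)])
    while frontier:
        term, depth = frontier.popleft()
        if term == t2:                        # reconstruct the chain
            chain = []
            while term is not None:
                chain.append(term)
                term = parent[term]
            return list(reversed(chain))
        if depth < max_depth:
            for nxt in one_step_equiv(term):
                if nxt not in parent:
                    parent[nxt] = term
                    frontier.append((nxt, depth + 1))
    return None  # no chain found within the depth bound

# Toy oracle: enc(m, k) is one-step indistinguishable from a fresh random r.
step = lambda t: ['r'] if t == 'enc(m, k)' else []
print(find_chain('enc(m, k)', 'r', step))  # ['enc(m, k)', 'r']
```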
Abstract:
Stereolithography is a solid freeform fabrication (SFF) technique that was introduced in the late 1980s. Although many other techniques have been developed since then, stereolithography remains one of the most powerful and versatile of all SFF techniques. It has the highest fabrication accuracy, and an increasing number of materials that can be processed is becoming available. In this paper we discuss the characteristic features of the stereolithography technique and compare it to other SFF techniques. The biomedical applications of stereolithography are reviewed, as well as the biodegradable resin materials that have been developed for use with stereolithography. Finally, an overview of the application of stereolithography in preparing porous structures for tissue engineering is given.
Abstract:
Nowadays, everyone can effortlessly access a range of information on the World Wide Web (WWW). As information resources on the web continue to grow tremendously, it becomes progressively more difficult to meet users' high expectations and find relevant information. Although existing search engine technologies can find valuable information, they suffer from the problems of information overload and information mismatch. This paper presents a hybrid Web Information Retrieval approach that allows personalised search using ontology, a user profile and collaborative filtering. The approach finds the context of a user query, with minimal user involvement, using ontology. Simultaneously, it updates the user profile automatically over time as the user's behaviour changes. It then incorporates recommendations from similar users using a collaborative filtering technique. The proposed method is evaluated on the FIRE 2010 dataset and a manually generated dataset. Empirical analysis reveals that the Precision, Recall and F-Score of most queries for many users are improved with the proposed method.
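For reference, the evaluation measures named in the abstract are the standard retrieval metrics, defined over the sets of retrieved and relevant documents:

```latex
\mathrm{Precision} = \frac{|\mathrm{relevant} \cap \mathrm{retrieved}|}{|\mathrm{retrieved}|},
\qquad
\mathrm{Recall} = \frac{|\mathrm{relevant} \cap \mathrm{retrieved}|}{|\mathrm{relevant}|},
\qquad
F = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
```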
Abstract:
This thesis is an exploratory study of alumina nanofiber material aimed at developing its applications in the field of biology. It demonstrates that, with an appropriate modification strategy, alumina nanofiber is a promising material for protein purification and enzyme immobilisation. Hydrophobic modification dramatically improved the rejection of protein molecules in the purification system. In addition, the use of a cross-linking agent firmly bound the alumina nanofiber to the target enzyme for immobilisation. This progress could inspire applications of alumina nanofiber in various other areas.
Abstract:
A power electronics-based buffer is examined in which, through control of its PWM converters, the buffer-load combination is driven to operate in either constant power or constant impedance mode. A battery incorporated within the buffer provides the energy storage needed for the necessary power flow control. Real power demand from the upstream supply is regulated under fault conditions, and the possibility of voltage or network instability is reduced. The proposed buffer is also applied to a wind farm, where it is shown to stabilize the power contribution from the farm. Based on a battery cost-benefit analysis, a method is developed to determine the optimal level of power supplied from the wind farm and the corresponding capacity of the battery storage system.
Abstract:
In this paper we describe our investigation of the effect of investment in information technology (IT) on economic output and productivity in Australia over a period of about four decades. The framework used is the aggregate production function, in which IT capital is treated as a separate input of production alongside non-IT capital and labour. The empirical results indicate robust technical progress in the Australian economy in the 1990s. IT capital had a significant impact on output, labour productivity and technical progress in the 1990s. In recent years, however, the contribution of IT capital to output and labour productivity has slowed. Regaining IT capital productivity therefore remains a key challenge for Australia, especially in the context of greater IT investment in the future.
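The abstract does not state the functional form of the aggregate production function; a common Cobb-Douglas specification with IT capital as a separate input, consistent with the description above but assumed here for illustration, would be:

```latex
Y_t = A_t \, K_{IT,t}^{\alpha} \, K_{N,t}^{\beta} \, L_t^{\gamma}
\quad\Longrightarrow\quad
\ln Y_t = \ln A_t + \alpha \ln K_{IT,t} + \beta \ln K_{N,t} + \gamma \ln L_t
```

where Y_t is output, K_{IT,t} IT capital, K_{N,t} non-IT capital, L_t labour and A_t total factor productivity; the log form is what would typically be estimated on the time series.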
Abstract:
Spectrum sensing is considered to be one of the most important tasks in cognitive radio. Many sensing detectors have been proposed in the literature, with the common assumption that the primary user is either fully present or completely absent within the window of observation. In reality, there are scenarios where the primary user signal only occupies a fraction of the observed window. This paper analyses the effect of the primary user duty cycle on spectrum sensing performance through the analysis of a few common detectors. Simulations show that the probability of detection degrades severely with reduced duty cycle, regardless of the detection method. Furthermore, we show that reducing the duty cycle degrades performance more than lowering the signal strength does.
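A minimal sketch of the kind of experiment described above, assuming a simple energy detector (the paper also considers other detectors): the primary user signal occupies only a fraction of the observation window, and the probability of detection is estimated by Monte Carlo simulation. Parameter values and the burst placement are illustrative assumptions.

```python
import numpy as np

def prob_detection(duty_cycle, snr_db, n_samples=1000, n_trials=5000,
                   target_pfa=0.1, rng=np.random.default_rng(0)):
    """Estimate P_d of an energy detector when the primary user signal
    occupies only duty_cycle of the observation window."""
    noise_power = 1.0
    signal_power = noise_power * 10 ** (snr_db / 10)
    # Threshold from the noise-only statistic for the target false-alarm rate.
    noise_energy = rng.chisquare(n_samples, size=n_trials) * noise_power
    threshold = np.quantile(noise_energy, 1 - target_pfa)
    detections = 0
    for _ in range(n_trials):
        noise = rng.normal(0, np.sqrt(noise_power), n_samples)
        occupied = int(duty_cycle * n_samples)   # PU present as one burst
        signal = np.zeros(n_samples)
        signal[:occupied] = rng.normal(0, np.sqrt(signal_power), occupied)
        detections += np.sum((signal + noise) ** 2) > threshold
    return detections / n_trials

# P_d drops as the duty cycle shrinks, even at a fixed SNR.
for dc in (1.0, 0.5, 0.2):
    print(dc, prob_detection(dc, snr_db=-10))
```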
Abstract:
Cooperative Systems, by multiplying information sources along the road, offer great potential to improve the safety of road users, especially drivers. However, developing cooperative ITS applications requires additional resources compared to non-cooperative applications, making development both time consuming and expensive. In this paper, we present a simulation architecture aimed at prototyping cooperative ITS applications in an accurate, detailed and close-to-reality environment. The architecture is designed to be modular and generalist, and can be used to simulate any type of cooperative system (CS) application as well as augmented perception. We then discuss the results of two applications deployed with our architecture, using a common freeway emergency braking scenario. The first application is Emergency Electronic Brake Light (EEBL); we discuss improvements in safety in terms of the number and severity of crashes. The second application compares the performance of cooperative risk assessment using an augmented map against a non-cooperative approach based on local perception only. Our results show a systematic improvement in forward warning time for most vehicles in the string when using the augmented-map-based risk assessment.
Abstract:
Focuses on various aspects of advances in future information communication technology and its applications. Presents the latest issues and progress in the area of future information communication technology. Applicable to both researchers and professionals. These proceedings are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), to be held in Shenyang, China, from June 24 to 26, 2013. The conference is open to participants from all over the world, and participation from the Asia-Pacific region is particularly encouraged. The focus of the conference is on all technical aspects of electronics, information and communications. ICFICE-13 will provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference will publish high-quality papers closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."
Abstract:
Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system, and can hardly be adapted to cope with the rapidly growing variety of receivers and signals. In this contribution, a generalized positioning model is proposed that is independent of the navigation system and of the number of carriers, and that is suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined more generally to compensate for the signal-specific offsets in code and phase observations respectively. In addition, ionospheric delays are carefully included in the parameterization. Based on an analysis of the algebraic structures, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated for GPS using 37 stations evenly distributed across China, with a consistency of about 0.3 cycles. Additional experiments concerning the performance of this model in point positioning with mixed frequencies and mixed constellations are analysed, in which the USD parameters are fixed to our generated values. The results are evaluated in terms of both positioning accuracy and convergence time.
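The "traditional" dual-frequency treatment referred to above is the ionosphere-free linear combination; for code observations P_1, P_2 on carrier frequencies f_1, f_2 it takes the standard form

```latex
P_{IF} = \frac{f_1^{2}\,P_1 - f_2^{2}\,P_2}{f_1^{2} - f_2^{2}}
```

which eliminates the first-order ionospheric delay but does not generalize naturally to three or more frequencies across multiple constellations, motivating the generalized model with explicit USD and ionospheric parameters.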
Abstract:
Social Media Analytics is an emerging interdisciplinary research field that aims at combining, extending and adapting methods for the analysis of social media data. On the one hand, it can support IS and other research disciplines in answering their research questions; on the other hand, it helps to provide architectural designs as well as solution frameworks for new social media-based applications and information systems. The authors suggest that IS should contribute to this field and help to develop and pursue an interdisciplinary research agenda.