916 results for Information Ethics and its Applications


Relevance:

100.00%

Publisher:

Abstract:

Digitization is the main feature of modern information science. Because digitized data are naturally described by coordinates, information science and high-dimensional geometry are closely related, and information problems can be transformed into geometric problems in high-dimensional spaces. From this basic idea, we propose Computational Information Geometry (CIG) for information analysis and processing. Two applications of CIG are given, blurred image restoration and pattern recognition, with satisfactory experimental results. The paper also shows how groups of simple operators acting in 2D planes can be combined to implement geometrical computations in high-dimensional space. Many of the algorithms have been implemented in software.
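The abstract does not spell out the mapping; as a minimal, hypothetical sketch of the basic idea only — a digitized signal becomes a point in a high-dimensional space, so that pattern recognition reduces to a distance computation between points:

```python
import numpy as np

def signal_to_point(samples):
    """Treat an N-sample digitized signal as a point in R^N."""
    return np.asarray(samples, dtype=float)

def nearest_pattern(query, references):
    """Pattern recognition as a geometry problem: return the index of the
    reference point closest (in Euclidean distance) to the query point."""
    query = signal_to_point(query)
    dists = [np.linalg.norm(query - signal_to_point(r)) for r in references]
    return int(np.argmin(dists))

# Example: classify a noisy ramp against two stored prototypes.
protos = [[0, 1, 2, 3], [3, 2, 1, 0]]
print(nearest_pattern([0.1, 0.9, 2.2, 2.8], protos))  # -> 0
```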

Relevance:

100.00%

Publisher:

Abstract:

This study investigates variation in IT professionals' experience of ethics with a view to enhancing their formation and support. This is explored through an examination of the experience of IT, IT professional ethics and IT professional ethics education. The study's principal contribution is the empirical study and description of IT professionals' experience of ethics. The empirical phase is preceded by a review of conceptions of IT and followed by an application of the findings to IT education. The empirical findings are based on 30 semi-structured interviews with IT professionals representing a wide range of demographics, experience and IT sub-disciplines. Their experience of ethics is depicted as five citizenships: Citizenship of my world, Citizenship of the corporate world, Citizenship of a shared world, Citizenship of the client's world and Citizenship of the wider world. These signify an expanding awareness, which progressively accords rights to others and defines responsibility in terms of others. The empirical findings inform a Model of Ethical IT, which maps an IT professional space increasingly oriented towards others. The model provides a conceptual tool for prompting discussion and reflection, and may be employed in pursuing formation aimed at experiential change; its usefulness for the education of IT professionals with respect to ethics is explored. The research approach employed in this study is phenomenography, a method that seeks to elicit and represent variation of experience. It understands experience as a relationship between a subject (IT professionals) and an object (ethics), and describes this relationship in terms of its foci and boundaries. The study's findings culminate in three observations: that change is indicated in IT professionals' experience of their discipline, moving towards a focus on information users; in their experience of professional ethics, moving towards the adoption of other-centred attitudes; and in their experience of professional development, moving towards an emphasis on a change in lived experience. Based on these results, employers, educators and professional bodies may wish to evaluate how they approach professional formation and support if they aim to promote a comprehensive awareness of ethics in IT professionals.

Relevance:

100.00%

Publisher:

Abstract:

We present a new penalty-based genetic algorithm for the multi-source, multi-sink minimum vertex cut problem, and illustrate the algorithm's usefulness with two real-world applications. We prove that, by exploiting domain-specific knowledge, the genetic algorithm always produces a feasible solution. The algorithm has been implemented for the example applications and evaluated to show how well it scales as the problem size increases.
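The abstract gives neither the encoding nor the penalty function; the following is a minimal sketch under the assumption of a bit-string encoding (one bit per removable vertex) and a penalty charged whenever a candidate leaves some source connected to some sink. It is illustrative only, not the paper's algorithm or its feasibility-preserving mechanism:

```python
import random
from collections import deque

def still_connected(adj, removed, sources, sinks):
    """Return True if some source can still reach some sink after the
    vertices in `removed` are deleted from the graph."""
    targets = set(sinks) - removed
    for s in set(sources) - removed:
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u in targets:
                return True
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
    return False

def fitness(bits, vertices, adj, sources, sinks, penalty=1000):
    """Penalty-based objective: cut size, plus a large penalty whenever the
    candidate fails to separate every source from every sink."""
    removed = {v for v, b in zip(vertices, bits) if b}
    cost = len(removed)
    if still_connected(adj, removed, sources, sinks):
        cost += penalty
    return cost

def ga_vertex_cut(adj, sources, sinks, pop_size=40, generations=200, mutation=0.05):
    vertices = [v for v in adj if v not in set(sources) | set(sinks)]
    score = lambda bits: fitness(bits, vertices, adj, sources, sinks)
    population = [[random.randint(0, 1) for _ in vertices] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=score)
        parents = population[:pop_size // 2]           # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
            child = [1 - g if random.random() < mutation else g for g in child]
            children.append(child)
        population = parents + children
    return {v for v, b in zip(vertices, min(population, key=score)) if b}

# Tiny example: removing {'c'} separates sources {'a', 'b'} from sink {'d'}.
adj = {'a': ['c'], 'b': ['c'], 'c': ['a', 'b', 'd'], 'd': ['c']}
print(ga_vertex_cut(adj, sources=['a', 'b'], sinks=['d']))   # -> {'c'}
```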

Relevance:

100.00%

Publisher:

Abstract:

Bana et al. proposed the notion of a formal indistinguishability relation (FIR), i.e. an equivalence between two terms built from an abstract algebra. Later, Ene et al. extended it to cover active adversaries and random oracles. This notion enables a framework for verifying computational indistinguishability while still offering the simplicity and formality of symbolic methods. We are developing an automated tool for checking FIR between two terms. First, we extend the work of Ene et al. further by covering ordered sorts and by simplifying the treatment of random oracles. Second, we investigate the possibility of combining algebras, which makes the tool scalable and able to cover a wide class of cryptographic schemes; in particular, we show that the combined algebra remains computationally sound as long as each component algebra is sound. Third, we design proving strategies and implement the tool. In essence, the strategies find a sequence of intermediate terms, each formally indistinguishable from the next, between the two given terms; FIR between the two given terms is then guaranteed by the transitivity of FIR. Finally, we show applications of the work, e.g. to key exchanges and encryption schemes. In the future, the tool should be easily extensible to cover many more schemes. This work continues our previous research on the use of compilers to aid automated proofs for key exchange.
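The tool's term representation and algebra are not given in the abstract; the sketch below only illustrates the proving strategy as a breadth-first search for a chain of intermediate terms, with purely hypothetical string "terms" and rewrite "rules" standing in for the algebra's axioms:

```python
from collections import deque

# Hypothetical rewrite rules standing in for axioms of the abstract algebra:
# each rule relates two formally indistinguishable terms.
RULES = [
    ("enc(m, k) xor r", "r'"),          # fresh randomness masks a ciphertext
    ("r'", "r''"),                      # renaming of fresh random values
    ("h(x) where h is a random oracle", "r''"),
]

def indistinguishable_chain(start, goal, rules=RULES, max_depth=10):
    """Breadth-first search for a sequence of intermediate terms linking
    `start` to `goal`; by transitivity of FIR, the chain witnesses
    start ~ goal.  Terms here are plain strings for illustration only."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        chain = queue.popleft()
        if chain[-1] == goal:
            return chain
        if len(chain) > max_depth:
            continue
        for lhs, rhs in rules:
            for a, b in ((lhs, rhs), (rhs, lhs)):   # rules apply in both directions
                if chain[-1] == a and b not in seen:
                    seen.add(b)
                    queue.append(chain + [b])
    return None

print(indistinguishable_chain("enc(m, k) xor r", "r''"))
# -> ['enc(m, k) xor r', "r'", "r''"]
```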

Relevance:

100.00%

Publisher:

Abstract:

Objective speckle from a stick-on foil is a new approach to applying the objective white-light speckle method to in-plane displacement measurements. Using a relatively simple technique, a thin aluminum foil is mounted onto the specimen surface and a random grating is scratched onto it, yielding high reflectance and fine optical detail. After double exposure with a direct (lensless) recording system, the resulting holographic film possesses a broad spatial spectrum and carries the displacement information. Full-field contour maps of equal displacement can be obtained that are of good contrast and high sensitivity and that have a large, adjustable measurement range. The method can be applied to practical engineering problems for both plane and developable curved surfaces.

Relevance:

100.00%

Publisher:

Abstract:

With a view to solving problems in modern information science, we put forward a new subject named High-Dimensional Space Geometrical Informatics (HDSGI). It builds a bridge between information science and point-distribution analysis in high-dimensional space. Numerous experimental results have confirmed the correctness and applicability of the theory of HDSGI. The proposed method for image restoration is an instance of its application in signal processing: using an iterative "further blurring - deblurring - further blurring" algorithm, the deblurred image can be obtained.
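The abstract does not describe the "further blurring - deblurring" iteration in detail; as a rough stand-in, the sketch below uses the classical Van Cittert iteration, which also recovers a sharper signal by repeatedly re-blurring the current estimate and adding back the residual (a substituted, standard technique, not necessarily the paper's method):

```python
import numpy as np

def blur(x, kernel):
    """The (known) blurring operator: a symmetric FIR smoothing filter."""
    return np.convolve(x, kernel, mode="same")

def iterative_deblur(observed, kernel, iterations=50, beta=0.5):
    """Van Cittert-style iteration: re-blur the current estimate and add back
    the residual, f_{k+1} = f_k + beta * (observed - blur(f_k))."""
    estimate = observed.copy()
    for _ in range(iterations):
        estimate = estimate + beta * (observed - blur(estimate, kernel))
    return estimate

# Example: a smoothed step edge is partially sharpened; the restored error
# should be noticeably smaller than the blurred error.
kernel = np.array([0.25, 0.5, 0.25])
signal = np.concatenate([np.zeros(32), np.ones(32)])
blurred = blur(signal, kernel)
restored = iterative_deblur(blurred, kernel)
print(np.abs(blurred - signal).sum(), np.abs(restored - signal).sum())
```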

Relevance:

100.00%

Publisher:

Abstract:

Ophthalmic drug delivery is an interesting and challenging field because the normal physiological features of the eye reduce the bioavailability of ocular products. Developing new ophthalmic dosage forms for existing drugs to improve efficacy and bioavailability, as well as patient compliance and convenience, has become a trend in most pharmaceutical companies. The present review covers various conventional and novel ocular drug delivery systems, their methods of preparation and characterization, and recent research carried out on them. Furthermore, it covers information on various commercially available in situ gel preparations and existing patents on in situ drug delivery systems, i.e. in situ gel formation of pectin, in situ gels for therapeutic use, medical uses of in situ formed gels, and in situ gelling systems for sustained delivery to the front of the eye.

Relevance:

100.00%

Publisher:

Abstract:

Accurate sensing of vehicle position and attitude is still a very challenging problem in many mobile robot applications. Mobile robot applications must have some means of estimating where the vehicle is and in which direction it is heading. Many existing indoor positioning systems are limited in workspace and robustness because they require clear lines-of-sight or do not provide absolute, drift-free measurements.

The research work presented in this dissertation provides a new approach to a position and attitude sensing system designed specifically to meet the challenges of operation in a realistic, cluttered indoor environment, such as an office building, hospital, industrial plant or warehouse. This is accomplished by an innovative assembly of infrared LED sources that restricts the spreading of the light intensity distribution, confining it to a sheet of light encoded with localization and traffic information. This Digital Infrared Sheet of Light Beacon (DISLiB) developed for mobile robots is a high-resolution absolute localization system which is simple, fast, accurate and robust, without much computational burden or significant processing. The performance of most available beacons in corridors and narrow passages is not satisfactory, whereas the performance of DISLiB is very encouraging in such situations. This research overcomes most of the inherent limitations of existing systems.

The work further examines the odometric localization errors caused by over-count readings of an optical-encoder-based odometric system in a mobile robot due to wheel slippage and terrain irregularities. A simple and efficient method for reducing the errors is investigated and realized using an FPGA. The detection and correction are based on redundant encoder measurements; the method relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle.

The application of the encoded Digital Infrared Sheet of Light Beacon (DISLiB) system can be extended to intelligent control of public transportation systems. The system is capable of receiving traffic status input through a GSM (Global System for Mobile communications) modem. The vehicles have infrared receivers and processors capable of decoding the information and generating audio and video messages to assist the driver. The thesis further examines the usefulness of the technique in assisting the movement of differently-abled (blind) persons in the indoor or outdoor premises of their residence.

The work addressed in this thesis suggests a new way forward in the development of autonomous robotics and guidance systems. However, this work can easily be extended to many other challenging domains as well.
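The thesis' FPGA detection logic is not given in the abstract; the following is a minimal software sketch of the underlying idea only — compare the driven-wheel encoder against a redundant encoder and discard over-counts as slip — with all parameter values hypothetical:

```python
def corrected_distance(drive_counts, redundant_counts,
                       counts_per_metre=2000, slip_tolerance=5):
    """Detect wheel slippage by comparing the driven-wheel encoder against a
    redundant encoder; excess counts on the driven wheel are treated as slip
    and discarded.  All parameters are illustrative, not from the thesis."""
    distance = 0.0
    for drive, ref in zip(drive_counts, redundant_counts):
        if drive - ref > slip_tolerance:      # over-count => wheel slipped
            counts = ref                      # trust the redundant encoder
        else:
            counts = drive
        distance += counts / counts_per_metre
    return distance

# Per-interval counts; the third interval shows an over-count from slippage.
drive = [200, 198, 260, 201]
reference = [200, 199, 202, 200]
print(corrected_distance(drive, reference))   # ~0.4 m, slip interval corrected
```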

Relevance:

100.00%

Publisher:

Abstract:

A conceptual problem that appears in different contexts of clustering analysis is that of measuring the degree of compatibility between two sequences of numbers. This problem is usually addressed by means of numerical indexes referred to as sequence correlation indexes. This paper elaborates on why some specific sequence correlation indexes may not be good choices depending on the application scenario at hand. A variant of the product-moment correlation coefficient and weighted formulations of the Goodman-Kruskal and Kendall indexes are derived that may be more appropriate for some particular application scenarios. The proposed and existing indexes are analyzed from different perspectives, such as their sensitivity to the ranks and magnitudes of the sequences under evaluation, among other relevant aspects of the problem. The results help suggest scenarios within the context of clustering analysis that may be more appropriate for the application of each index. (C) 2008 Elsevier Inc. All rights reserved.
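The paper's exact weighting schemes are not reproduced in the abstract; the sketch below shows one plausible pairwise-weighted, Kendall-style index in which each pair contributes in proportion to the magnitudes involved — an assumed weighting for illustration only:

```python
def weighted_kendall(x, y):
    """Kendall-style sequence correlation in which each pair (i, j) is
    weighted by |x_i - x_j| * |y_i - y_j|, so pairs with larger magnitude
    differences contribute more (illustrative weighting, not the paper's)."""
    num = den = 0.0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = x[i] - x[j], y[i] - y[j]
            w = abs(dx) * abs(dy)
            if w == 0:
                continue                      # ties carry no weight here
            num += w if dx * dy > 0 else -w   # +w concordant, -w discordant
            den += w
    return num / den if den else 0.0

# Two sequences that agree on ranks but differ in magnitudes:
print(weighted_kendall([1, 2, 3, 4], [1, 2, 3, 40]))   # 1.0 (fully concordant)
print(weighted_kendall([1, 2, 3, 4], [4, 3, 2, 1]))    # -1.0 (fully discordant)
```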

Relevance:

100.00%

Publisher:

Abstract:

Anycast in the next-generation Internet Protocol is a hot topic in computer networking research. It has promising potential but also many challenges, such as architecture, routing, quality of service, anycast in ad hoc networks, application-layer anycast, etc. In this thesis we tackle some of the important topics among them. The thesis first presents an introduction to anycast, followed by related work; then, as our major contributions, a number of challenging issues are addressed in the subsequent chapters. We tackle the anycast routing problem by proposing a requirement-based probing algorithm at the application layer; compared with the existing periodic probing routing algorithm, the proposed algorithm improves performance in terms of delay. We address the reliable-service problem with a twin-server model for anycast servers, providing a transparent and reliable service for all anycast queries. We address the load-balancing problem of anycast servers by proposing new job deviation strategies, to provide a similar quality of service to all clients of anycast servers. We apply the mesh routing methodology to anycast routing in ad hoc networking environments, which provides a reliable routing service while using far fewer network resources. We combine the anycast protocol and the multicast protocol to provide a bidirectional service, and apply it to Web-based database applications, achieving better query efficiency and data synchronization. Finally, we propose a new Internet-based service, minicast, as the combination of the anycast and multicast protocols; such a service has potential applications in information retrieval, parallel computing, cache queries, etc. We show that the minicast service consumes fewer network resources while providing the same services. The last chapter presents the conclusions and discusses future work.
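The thesis' algorithm details are not given in the abstract; as a rough sketch of requirement-based (rather than periodic) probing for application-layer anycast — probe the group only when the current server violates the client's delay requirement — with hypothetical server names and thresholds:

```python
import random
import time

def probe_delay(server):
    """Stand-in for a real probe (e.g. an application-layer ping)."""
    start = time.monotonic()
    # ... send a request to `server` and wait for the reply ...
    time.sleep(random.uniform(0.01, 0.05))    # simulated network delay
    return time.monotonic() - start

def select_server(current, candidates, required_delay=0.03):
    """Requirement-based probing: probe the anycast group only when the
    current server violates the client's delay requirement, then switch
    to the best responder."""
    if probe_delay(current) <= required_delay:
        return current                        # requirement met, no probing
    delays = {s: probe_delay(s) for s in candidates}
    return min(delays, key=delays.get)

# Hypothetical anycast group.
group = ["mirror-a.example.net", "mirror-b.example.net", "mirror-c.example.net"]
print(select_server(group[0], group))
```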

Relevance:

100.00%

Publisher:

Abstract:

We present a large-scale mood analysis of social media texts. The paper is organised in three parts: (1) we address the problem of feature selection and classification of mood in the blogosphere; (2) we extract global mood patterns at different levels of aggregation from a large-scale data set of approximately 18 million documents; and (3) we extract the mood trajectory of an egocentric user and study how it can be used to detect subtle emotion signals in a user-centric manner, supporting the discovery of hyper-groups of communities based on sentiment information. For mood classification, two feature sets proposed in psychology are used; we show that these features are efficient, do not require a training phase, and yield classification results comparable to state-of-the-art supervised feature selection schemes. On mood patterns, empirical results for mood organisation in the blogosphere are provided, analogous to the structure of human emotion proposed independently in the psychology literature. On community structure discovery, we show that a sentiment-based approach can yield useful insights into community formation.
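The two psychological feature sets used in the paper are not reproduced in the abstract; the sketch below only illustrates the general mechanism of lexicon-based mood scoring with no training phase, using a tiny made-up word list (not the paper's features):

```python
from collections import Counter
import re

# Tiny, made-up lexicon standing in for a psychological feature set
# (an ANEW- or LIWC-style word list); not the paper's features.
MOOD_LEXICON = {
    "happy": "joy", "great": "joy", "love": "joy",
    "sad": "sadness", "lonely": "sadness", "cry": "sadness",
    "angry": "anger", "hate": "anger", "furious": "anger",
}

def classify_mood(text):
    """Score a post by counting lexicon hits per mood category; no training
    phase is required because the feature set is fixed in advance."""
    words = re.findall(r"[a-z']+", text.lower())
    scores = Counter(MOOD_LEXICON[w] for w in words if w in MOOD_LEXICON)
    return scores.most_common(1)[0][0] if scores else "neutral"

print(classify_mood("I love this great sunny day, so happy!"))   # joy
print(classify_mood("Feeling lonely tonight, might just cry."))  # sadness
```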

Relevance:

100.00%

Publisher:

Abstract:

The aim of program specialization is to optimize programs by exploiting certain knowledge about the context in which the program will execute. There exist many program manipulation techniques which allow specializing the program in different ways. Among them, one of the best known techniques is partial evaluation, often referred to simply as program specialization, which optimizes programs by specializing them for (partially) known input data. In this work we describe abstract specialization, a technique whose main features are: (1) specialization is performed with respect to "abstract" values rather than "concrete" ones, and (2) abstract interpretation rather than standard interpretation of the program is used in order to propagate information about execution states. The concept of abstract specialization is at the heart of the specialization system in CiaoPP, the Ciao system preprocessor. In this paper we present a unifying view of the different specialization techniques used in CiaoPP and discuss their potential applications by means of examples. The applications discussed include program parallelization, optimization of dynamic scheduling (concurrency), and integration of partial evaluation techniques.
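As a minimal illustration of partial evaluation — the best-known of the specialization techniques mentioned — the sketch below specializes a generic power function for a statically known exponent. It is a toy example only; it is not CiaoPP's abstract specialization, which propagates abstract values via abstract interpretation rather than concrete ones:

```python
def power(x, n):
    """General program: x ** n computed by repeated multiplication."""
    result = 1
    for _ in range(n):
        result *= x
    return result

def specialize_power(n):
    """Partial evaluator for `power` with the exponent statically known:
    the loop is unrolled at specialization time, leaving a residual
    program that depends only on the dynamic input x."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    source = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(source, namespace)          # build the residual (specialized) program
    return namespace[f"power_{n}"], source

power_3, residual = specialize_power(3)
print(residual)        # the generated residual program: return x * x * x
print(power_3(5))      # 125, same result as power(5, 3)
```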