941 results for Networked Digital Environment
Abstract:
The purpose of this article is to provide a doctrinal summary of the concept, rules and policy of exhaustion, first at the international and EU level and later under the law of the United States. Building on this introduction, the paper turns to an analysis of the doctrine in the pioneering court decisions handed down in UsedSoft, ReDigi, the German e-book/audiobook cases, and the pending Tom Kabinet case from the Netherlands. Questions related to the licence-versus-sale dichotomy; the so-called umbrella solution; the "new copy theory"; the migration of digital copies via the internet; forward-and-delete technology; the issue of lex specialis; and the theory of functional equivalence are covered later on. The author stresses that the answers given by the judges in the referred cases are not the final word in the discussion. The UsedSoft preliminary ruling and the subsequent German domestic decisions highlight a special treatment for computer programs. On the other hand, the refusal of digital exhaustion in ReDigi and the audiobook/e-book cases might accord with the present wording of copyright law; however, it does not necessarily reflect the trends of our age. The paper takes the position that the need for digital exhaustion is constantly growing in society and among businesses. Indeed, there are reasonable arguments in favour of treating the resale of works sold in tangible and intangible formats equally. Consequently, the paper urges a reconsideration of the norms on exhaustion at the international and EU level.
Abstract:
The spread of computer-based work can be examined from many perspectives and with many methods. Starting from the need to establish harmony between human, machine and environment, the author's research seeks to identify the factors that influence the efficiency of organising and carrying out computer-based activities. The study reviews the main results of the initial phase of the research, namely computer usage habits, which is essential for uncovering the critical factors associated with office and administrative activities and for grounding the measurement and development tasks. Beyond ergonomic aspects in the narrow sense, the work also covers digital competences, which the author considers relevant to measuring efficiency, since the choice of computer and the design of the workplace cannot be evaluated without the suitability of the human factor and the content of the task to be performed. __________ Office and administrative work, business correspondence, private contacts and learning are increasingly supported by computers. Moreover, the technical possibilities for correspondence go beyond using a PC: it is accessible on the go via a cell phone. Using a survey, the author analysed the characteristics of the devices used, the working environment, satisfaction factors connected with computer work, and digital competence. In his opinion, development from an ergonomic approach is important not only for introducing technological novelties but also for utilising the present possibilities of hardware and environment, because many people cannot (or do not want to) follow the dynamic technological development of computers by buying the newest devices. The study compares the characteristics of computer work at home and at the workplace. This research was carried out as part of the "TAMOP-4.2.1.B-10/2/KONV-2010-0001" project with support from the European Union, co-financed by the European Social Fund.
Abstract:
This study describes how we used a prototype e-participation platform as a digital cultural probe to investigate youth motivation and engagement strategies. This is a novel way of using digital cultural probes that can contribute to the better design of e-participation platforms. The probe was conducted as part of the research project STEP, which aims to create an e-participation platform to engage young European citizens in environmental decision making. Our probe technique has given insight into the environmental issues concerning young people across Europe, as well as possible strategies for encouraging participation. We discuss how the e-participation platform can be utilised to support youth engagement through opportunities for social interaction and leadership. This study leads to a better understanding of how young people can cooperate with each other to provide collective intelligence, and how this knowledge could contribute to effective e-participation of young people.
Abstract:
This study collected a sample of YouTube videos in which parents recorded their young children using mobile touchscreen devices. Focusing on the most frequently viewed and most discussed videos, the paper analyzes the ways in which babies' 'digital dexterity' is coded and understood in terms of contested notions of 'naturalness', and how the display of these capabilities is produced for a networked public. This reading of the 'baby-iPad encounter' helps expand existing scholarly concepts such as parental mediation and technology domestication. Drawing on several theoretical frameworks, the paper seeks to go beyond concerns of mobile devices and immobile children by analyzing children's digital dexterity not just as a kind of mobility, but also as a set of reciprocal mobilizations that work across domestic, virtual and publicly networked spaces.
Abstract:
Knowledge organization in the networked environment is guided by standards, and standards in knowledge organization are built on principles. For example, NISO Z39.19-1993, Guide to the Construction of Monolingual Thesauri (now undergoing revision), and NISO Z39.85-2001, Dublin Core Metadata Element Set, are two standards used in many implementations. Both were crafted with knowledge organization principles in mind. It is therefore standards work, guided by knowledge organization principles, that can affect the design of information services and technologies. This poster outlines five threads of thought that inform knowledge organization principles in the networked environment. An understanding of each of these five threads informs system evaluation. The evaluation of knowledge organization systems should be tightly linked to a rigorous understanding of the principles of their construction. Thus some foundational evaluation questions grow from an understanding of standards and principles: on what principles is this knowledge organization system built? How well does this implementation meet the ideal conceptualization of those principles? How does this tool compare to others built on the same principles?
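As a concrete illustration of principle-driven evaluation, the sketch below (our own illustration, not part of the poster) checks a metadata record against the fifteen elements of the NISO Z39.85 / Dublin Core Metadata Element Set:

```python
# The fifteen elements of the Dublin Core Metadata Element Set (NISO Z39.85).
DC_ELEMENTS = {
    "title", "creator", "subject", "description", "publisher",
    "contributor", "date", "type", "format", "identifier",
    "source", "language", "relation", "coverage", "rights",
}

def validate_dc_record(record):
    """Return the record keys that are not Dublin Core elements."""
    return set(record) - DC_ELEMENTS

# Example: 'topic' is not a DC element, so it is flagged.
record = {"title": "Networked KO Systems", "creator": "Doe, J.", "topic": "KOS"}
unknown = validate_dc_record(record)  # -> {'topic'}
```

A foundational evaluation question such as "on what principles is this system built?" can then be posed mechanically, element by element.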
Abstract:
The dissertation addresses the still unsolved challenges of source-based digital 3D reconstruction, visualisation and documentation in the domains of archaeology, art and architectural history. The emerging BIM methodology and the IFC data exchange format are changing the way collaboration, visualisation and documentation are handled in the planning, construction and facility management process. The introduction and development of the Semantic Web (Web 3.0), spreading the idea of structured, formalised and linked data, offers semantically enriched human- and machine-readable data. In contrast to civil engineering and cultural heritage, academic object-oriented disciplines such as archaeology, art and architectural history are acting as outside spectators. Since the 1990s, it has been argued that a 3D model is not likely to be considered a scientific reconstruction unless it is grounded on accurate documentation and visualisation. However, these standards are still missing and the validation of the outcomes is not fulfilled. Meanwhile, the digital research data remain ephemeral and continue to fill the growing digital cemeteries. This study therefore focuses on the evaluation of source-based digital 3D reconstructions and, especially, on uncertainty assessment in the case of hypothetical reconstructions of destroyed or never-built artefacts according to scientific principles, making the models shareable and reusable by a potentially wide audience. The work initially focuses on terminology and on the definition of a workflow, especially related to the classification and visualisation of uncertainty. The workflow is then applied to specific cases of 3D models uploaded to the DFG repository of the AI Mainz. In this way, the available methods of documenting, visualising and communicating uncertainty are analysed.
In the end, this process will lead to a validation or a correction of the workflow and the initial assumptions, but also (dealing with different hypotheses) to a better definition of the levels of uncertainty.
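One simple way such a classification of uncertainty might be encoded — the levels, labels and false colours below are our illustrative assumptions, not the dissertation's actual scheme — is as a discrete scale mapped to colours for visualisation:

```python
# Hypothetical uncertainty scale for reconstruction hypotheses: level 1 is
# best documented, level 5 is pure conjecture. Labels and colours are
# illustrative assumptions only.
UNCERTAINTY_SCALE = {
    1: ("directly documented by sources", "#ffffff"),
    2: ("deduced from analogous structures", "#ccccff"),
    3: ("conjectural but plausible", "#9999ff"),
    4: ("largely hypothetical", "#6666ff"),
    5: ("no evidence, free reconstruction", "#3333ff"),
}

def colour_for(level):
    """Return the false colour assigned to an element's uncertainty level."""
    return UNCERTAINTY_SCALE[level][1]
```

Tagging each model element with such a level would make the uncertainty both machine-readable and visually communicable.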
Abstract:
Introduction. The ToLigado Project - Your School Interactive Newspaper is an interactive virtual learning environment conceived, developed, implemented and supported by researchers at the School of the Future Research Laboratory of the University of Sao Paulo, Brazil. Method. This virtual learning environment aims to motivate trans-disciplinary research among public school students and teachers in 2,931 schools equipped with Internet-access computer rooms. Within this virtual community, students produce collective multimedia research documents that are immediately published in the portal. The project also aims to increase students' autonomy for research, collaborative work and Web authorship. Main sections of the portal are presented and described. Results. Partial results of the first two years' implementation are presented and indicate a strong motivation among students to produce knowledge despite the fragile hardware and software infrastructure at the time. Discussion. In this new environment, students should be seen as 'knowledge architects' and teachers as facilitators, or 'curiosity managers'. The ToLigado portal may constitute a repository for future studies regarding student attitudes in virtual learning environments, students' behaviour as 'authors', Web authorship involving collective knowledge production, teachers' behaviour as facilitators, and virtual learning environments as digital repositories of students' knowledge construction and social capital in virtual learning communities.
Abstract:
Context. Fossil systems are defined as X-ray bright galaxy groups (or clusters) with a two-magnitude difference between their two brightest galaxies within half the projected virial radius, and represent an interesting extreme of the population of galaxy agglomerations. However, the physical conditions and processes leading to their formation are still poorly constrained. Aims. We compare the outskirts of fossil systems with those of normal groups to understand whether environmental conditions play a significant role in their formation. We study groups of galaxies in both numerical simulations and observations. Methods. We use a variety of statistical tools, including the spatial cross-correlation function and the local density parameter Delta(5), to probe differences in the density and structure of the environments of "normal" and "fossil" systems in the Millennium simulation. Results. We find that the number density of galaxies surrounding fossil systems evolves from greater than that observed around normal systems at z = 0.69 to lower than that of normal systems by z = 0. Both fossil and normal systems exhibit an increment in their otherwise radially declining local density measure (Delta(5)) at distances of order 2.5 r(vir) from the system centre. We show that this increment is more noticeable for fossil systems than for normal systems, and demonstrate that this difference is linked to the earlier formation epoch of fossil groups. Despite the importance of the assembly time, we show that the environment differs between fossil and non-fossil systems of similar masses and formation times throughout their evolution. We also confirm that the physical characteristics identified in the Millennium simulation can be detected in SDSS observations. Conclusions. Our results confirm the commonly held belief that fossil systems assembled earlier than normal systems, but also show that the surroundings of fossil groups could be responsible for the formation of their large magnitude gap.
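The local density parameter Delta(5) is conventionally derived from the distance to a galaxy's fifth nearest neighbour. The sketch below uses one common 3-D volume-density convention, which may differ from the exact definition used in the article:

```python
import math

def delta5(target, galaxies):
    """Local density from the distance r5 to the 5th nearest neighbour:
    Delta_5 = 5 / ((4/3) * pi * r5**3)  (one common 3-D convention)."""
    dists = sorted(math.dist(target, g) for g in galaxies if g != target)
    r5 = dists[4]  # distance to the 5th nearest neighbour
    return 5.0 / ((4.0 / 3.0) * math.pi * r5 ** 3)
```

Denser neighbourhoods have smaller r5 and hence larger Delta(5), so the radial profile of Delta(5) traces the surrounding galaxy density.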
Abstract:
The purpose of this article is to study the application of holographic interferometry techniques to the structural analysis of the submarine environment. These techniques are widely used today, with applications in many areas; nevertheless, their application in submarine environments presents some challenges. The application of two techniques, electronic speckle pattern interferometry (ESPI) and digital holography, is presented, together with a comparison of the advantages and disadvantages of each. A brief study is made of the influence of water properties and of the optical effects due to suspended particles, as well as of possible solutions to minimise these problems. (C) 2009 Elsevier Ltd. All rights reserved.
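For reference, ESPI and digital holography typically recover phase from several intensity measurements taken at known phase shifts. The standard four-step phase-shifting formula (a textbook result, not specific to this article) is phase = atan2(I4 − I2, I1 − I3):

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting formula: the four intensities are captured
    with reference-beam phase shifts of 0, pi/2, pi and 3*pi/2, so that
    phase = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)
```

In water, turbidity and refractive-index variations perturb these intensities, which is one reason the submarine setting is challenging.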
Abstract:
In this article we argue that digital simulations promote and explore complex relations between the player and the machine's cybernetic system, with which the player relates through gameplay, that is, the actual application of tactics and strategies used by participants as they play the game. We aim to show that the realism of simulation, together with the merger of artificial objects with the real world, can generate interactive empathy between players and their avatars. In this text, we explore augmented reality as a means to visualise interactive communication projects. Using the ARToolkit, Virtools and 3ds Max applications, we show how to create a portable interactive platform that relies on the environment and on markers for constructing the game's scenario. Many of the conventional functions of the human eye are being replaced by techniques in which images are not positioned in the traditional manner in which we observe them (Crary, 1998), or in the way we perceive the real world. The digitalization of the real world into a new informational layer over objects, people or environments needs to be processed and mediated by tools that amplify the natural human senses.
Abstract:
Since collaborative networked organisations are usually formed by independent and heterogeneous entities, it is natural that each member holds its own set of values, and that conflicts among partners might emerge because of some misalignment of values. By contrast, it is often stated in the literature that alignment between the value systems of members involved in collaborative processes is a prerequisite for successful co-working. As a result, the issue of core value alignment in collaborative networks has started to attract attention. However, methods to analyse such alignment are lacking, mainly because the concept of 'alignment' in this context is still ill-defined and shows a multifaceted nature. As a contribution to the area, this article introduces an approach based on causal models and graph theory for the analysis of core value alignment in collaborative networks. The potential application of the approach is then discussed in the context of virtual organisations' breeding environments.
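As a rough illustration of what a quantitative alignment measure might look like — cosine similarity over core-value weight vectors is our own assumption, not necessarily the article's causal-model approach — consider:

```python
import math

def alignment(v1, v2):
    """Cosine similarity between two members' core-value weight vectors,
    e.g. {'trust': 1.0, 'profit': 0.3}. Returns 1.0 for identical emphasis,
    0.0 when the members value entirely disjoint things."""
    keys = sorted(set(v1) | set(v2))
    a = [v1.get(k, 0.0) for k in keys]
    b = [v2.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0
```

A low score between two prospective partners would flag exactly the kind of value misalignment the article argues can derail co-working.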
Abstract:
Plain radiography still accounts for the vast majority of imaging studies performed in many clinical settings. Digital detectors are now prominent in many imaging facilities, and they are the main driving force towards filmless environments. There has been a shift in the working paradigm due to the functional separation of acquisition, visualization, and storage, with deep impact on imaging workflows. Moreover, with direct digital detectors, images are made available almost immediately. Digital radiology is now completely integrated in Picture Archiving and Communication System (PACS) environments governed by the Digital Imaging and Communications in Medicine (DICOM) standard. In this chapter, a brief overview of PACS architectures and components is presented, together with a necessarily brief account of the DICOM standard. Special focus is given to the DICOM digital radiology objects and to how specific attributes may now be used to improve and enlarge the metadata repository associated with image data. Regular scrutiny of the metadata repository may serve as a valuable tool for improved, cost-effective, and multidimensional quality control procedures.
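A minimal sketch of the kind of metadata scrutiny described might aggregate acquisition attributes per detector. The attribute keywords below (StationName, KVP, ExposureIndex) are real DICOM keywords, but the records and values are fabricated examples, not data from the chapter:

```python
from statistics import mean

# Illustrative records mimicking attributes extracted from DICOM digital
# radiography objects; values are fabricated for the example.
studies = [
    {"StationName": "DX-1", "KVP": 70.0, "ExposureIndex": 410.0},
    {"StationName": "DX-1", "KVP": 72.0, "ExposureIndex": 455.0},
    {"StationName": "DX-2", "KVP": 65.0, "ExposureIndex": 300.0},
]

def exposure_index_by_station(records):
    """Mean ExposureIndex per acquisition station: a simple QC indicator
    for spotting detectors drifting away from target exposure."""
    by_station = {}
    for r in records:
        by_station.setdefault(r["StationName"], []).append(r["ExposureIndex"])
    return {s: mean(v) for s, v in by_station.items()}
```

Run regularly against the PACS metadata repository, such summaries support the cost-effective quality control the chapter describes.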
Abstract:
In the past few years, so-called gadgets such as cellular phones, personal digital assistants and digital cameras have become widespread, even among less technologically aware users. However, for several reasons, the factory floor itself seems to be hermetic to these changes. After the fieldbus revolution, the factory floor has seen an increased use of ever more powerful programmable logic controllers and user interfaces, but the way they are used remains almost the same. We believe that new user-computer interaction techniques, including multimedia and augmented reality, combined with now affordable technologies like wearable computers and wireless networks, can change the way factory personnel work together with the machines and the information system on the factory floor. This new age is already starting, with innovative uses of communication networks on the factory floor, either using "standard" networks or enhancing industrial networks with multimedia and wireless capabilities.
Abstract:
Networked control systems (NCSs) are spatially distributed systems in which the communication between sensors, actuators and controllers occurs through a shared band-limited digital communication network. The use of a shared communication network, in contrast to several dedicated independent connections, introduces new challenges, which are even more acute in large-scale and dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network for use in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data offers a good tradeoff between accuracy in the measurement of the input signals and the delay to actuation, both important aspects for the quality of control. We introduce a variation on the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining the approximate interpolation.
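The paper's specific algorithm is not reproduced here; as a sketch of the general idea of approximate interpolation with a bounded error, a greedy piecewise-linear reduction of a sensed signal (our own illustration) might look like:

```python
def approx_interpolation(samples, eps):
    """Greedy piecewise-linear approximation of (x, y) samples: keep a
    sample only when linearly interpolating between the last kept sample
    and the next sample would err by more than eps. The reduced set is
    what a controller would receive instead of the full sensor trace."""
    if len(samples) <= 2:
        return list(samples)
    kept = [samples[0]]
    for i in range(1, len(samples) - 1):
        x0, y0 = kept[-1]
        x1, y1 = samples[i]
        x2, y2 = samples[i + 1]
        y_lin = y0 + (y2 - y0) * (x1 - x0) / (x2 - x0)
        if abs(y1 - y_lin) > eps:
            kept.append(samples[i])
    kept.append(samples[-1])
    return kept
```

Raising eps shrinks the transmitted set (less delay to actuation) at the cost of measurement accuracy, which is exactly the tradeoff discussed above.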
Abstract:
In this paper, we focus on large-scale and dense Cyber-Physical Systems, and discuss methods that tightly integrate communication and computing with the underlying physical environment. We present the Physical Dynamic Priority Dominance ((PD)²) protocol, which exemplifies a key mechanism for devising low time-complexity communication protocols for large-scale networked sensor systems. We show that, using this mechanism, one can compute aggregate quantities such as the maximum or minimum of sensor readings with a time complexity that is equivalent to essentially one message exchange. We also illustrate the use of this mechanism in the more complex task of computing an interpolation of smooth as well as non-smooth sensor data with very low time complexity.
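Dominance-based medium access (as in CAN bus arbitration) is one way an aggregate can emerge in essentially one message exchange: every node transmits its reading as a priority, and the channel itself resolves the maximum. The simulation below is our illustrative sketch of that mechanism, not the (PD)² protocol itself:

```python
def dominant_max(readings, bits=8):
    """Simulate one dominance-based arbitration round: all nodes transmit
    their reading bit by bit, most significant bit first; a dominant '1'
    on the shared medium silences every node currently sending '0'.
    After `bits` bit-slots (one message exchange) only nodes holding the
    maximum reading are still active."""
    active = list(readings)
    for bit in range(bits - 1, -1, -1):
        if any((r >> bit) & 1 for r in active):
            active = [r for r in active if (r >> bit) & 1]
    return active[0]
```

The minimum is obtained the same way by arbitrating on the bitwise complement of each reading.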