899 results for extended collective licensing
Abstract:
The starting point of the European individualistic copyright ideology is that an individual author creates a work and controls its use. However, this paper argues that it is (and has always been) impossible to control the use of works after their publication. The legislator has also acknowledged this, introducing collective licensing arrangements precisely because of this impossibility. Since it is impossible to rigorously control the use of works, this paper, "Rough Justice or Zero Tolerance - Reassessing the Nature of Copyright in Light of Collective Licensing", examines what the reality of copyright is actually about. Finding alternative (and hopefully more "true") ways to understand copyright helps us create alternative solutions to the problems we face, for example in the use of content in the online environment. The paper claims that copyright is actually about defining negotiation points for different stakeholders, and that nothing in the copyright reality prevents us from defining, for example, a new negotiation point at which representatives of consumers would meet representatives of right holders to agree on the terms of use for certain content types in the online environment.
Abstract:
The ability of EU member states to adapt copyright legislation to new circumstances and to address unforeseen issues is limited by the list of exceptions and limitations in the InfoSoc Directive. In spite of this constraint, the EU copyright framework provides for the possibility of introducing non-voluntary forms of collective rights management, which can help to tackle some of the contemporary problems with remuneration and access. This article is an attempt to deepen the understanding of non-voluntary collective management and its possible use. First, it provides a detailed description of the French mechanism adopted to facilitate mass digitization and make out-of-commerce books available, which was implemented through a new form of collective management of copyright. Then, it examines the mechanism's compatibility with the InfoSoc Directive through comparison with extended collective licensing.
Abstract:
Currently, lawmakers on both sides of the Atlantic are struggling with the problem of orphan works. In the impact assessment of its proposal for a directive of the European Parliament and of the Council on certain permitted uses of orphan works, the European Commission mentions six possible ways of dealing with the problem. Three of the six (a statutory exception to copyright; extended collective licensing; an orphan-specific license granted by collecting societies) have each had their heyday during the past few years. This article examines how and why these changes in popularity occurred. In addition, it explains why a limitation on remedies would be the most adequate solution to the problem in Europe.
Abstract:
This article examines the conditions under which a system of extended collective licensing (ECL) for the use of works contained in the collections of cultural heritage institutions (CHIs) participating in Europeana could function on a cross-border basis. ECL is understood as a form of collective rights management whereby the application of freely negotiated copyright licensing agreements between a user and a collective management organisation ("CMO") is extended by law to non-members of the organisation. ECL regimes have already been put in place in a few Member States, but so far all of them apply only on a national basis. This article proposes a mechanism that would allow works licensed under an ECL system in one territory of the European Union to be made available in all the territories of the Union. The proposal rests on the statutory recognition of the "country of origin" principle as the necessary and sufficient territory for the negotiation and application of an ECL solution for the rights clearance of works contained in the collection of a cultural heritage institution, including orphan works.
Abstract:
Schizophrenia may not be a single disease, but the result of a diverse set of related conditions. Modern neuroscience is beginning to reveal some of the genetic and environmental underpinnings of schizophrenia; however, an approach less well travelled is to examine the medical disorders that produce symptoms resembling schizophrenia. This book is the first major attempt to bring together the diseases that produce what has been termed 'secondary schizophrenia'. International experts from diverse backgrounds ask the questions: does this medical disorder, or drug, or condition cause psychosis? If yes, does it resemble schizophrenia? What mechanisms form the basis of this relationship? What implications does this understanding have for aetiology and treatment? The answers are a feast for clinicians and researchers of psychosis and schizophrenia. They mark the next step in trying to meet the most important challenge to modern neuroscience – understanding and conquering this most mysterious of human diseases.
Abstract:
Rats are superior to the most advanced robots when it comes to creating and exploiting spatial representations. A wild rat can have a foraging range of hundreds of meters, possibly kilometers, and yet the rodent can unerringly return to its home after each foraging mission, and return to profitable foraging locations at a later date (Davis et al., 1948). The rat runs through undergrowth and pipes with few distal landmarks, along paths where the visual, textural, and olfactory appearance constantly change (Hardy and Taylor, 1980; Recht, 1988). Despite these challenges, the rat builds, maintains, and exploits internal representations of large areas of the real world throughout its two- to three-year lifetime. While algorithms exist that allow robots to build maps, the questions of how to maintain those maps and how to handle change in appearance over time remain open. The robotic approach to map building has been dominated by algorithms that optimise the geometry of the map based on measurements of distances to features. In a robotic approach, measurements of distance to features are taken with range-measuring devices such as laser range finders or ultrasound sensors, and in some cases estimates of depth from visual information. The features are incorporated into the map based on previous readings of other features in view and estimates of self-motion. The algorithms explicitly model the uncertainty in measurements of range and the measurement of self-motion, and use probability theory to find optimal solutions for the geometric configuration of the map features (Dissanayake et al., 2001; Thrun and Leonard, 2008). Some of the results from the application of these algorithms have been impressive, ranging from three-dimensional maps of large urban structures (Thrun and Montemerlo, 2006) to natural environments (Montemerlo et al., 2003).
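The core of the probabilistic approach sketched in this abstract is fusing uncertain range readings into a map estimate. A minimal one-dimensional sketch (not from any of the cited works; the numbers and the sensor-noise value are invented for illustration) shows the Kalman-style update that underlies such map optimisation:

```python
# Minimal illustration: fusing repeated noisy range readings to a single
# landmark with a 1-D Kalman-style update. Real SLAM systems do this jointly
# over many features and the robot's own pose; this is the one-feature core.

def kalman_update(mean, var, measurement, meas_var):
    """Fuse one noisy range reading into the current landmark estimate."""
    k = var / (var + meas_var)           # gain: how much to trust the new reading
    new_mean = mean + k * (measurement - mean)
    new_var = (1.0 - k) * var            # uncertainty always shrinks
    return new_mean, new_var

# Vague prior on the landmark's range, then fuse four readings (meas_var is
# an assumed sensor noise variance).
mean, var = 5.0, 100.0
for z in [4.2, 4.0, 4.1, 3.9]:
    mean, var = kalman_update(mean, var, z, meas_var=0.25)

print(round(mean, 2))   # estimate converges to ~4.05
print(var < 0.25)       # posterior uncertainty drops below the sensor noise
```

The same recursion, generalised to joint state vectors and covariance matrices, is what the cited geometric map-optimisation algorithms carry out at scale.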
Abstract:
Two hundred years ago life writing was already highly popular in the form of autobiography, memoir, biography, journals, essays and diaries. It now commands a huge share of the publishing market, as there is an enormous demand from readers for narratives based directly on 'real lives'. There is a lot of common ground between the two main forms - autobiography/memoir and biography: both require skilled storytelling (rather than a listing of facts and events), research and imagination. The quality of the writing itself is crucial to the impact on the reader. A person can have an exciting, worthy life but unfortunately write about it (or be written about) in a dull way. And how a person is remembered and valued can depend on the life writing about or by them. This chapter will define and contextualise life writing, look at specific detailed examples, and offer guidance on how to write effectively.
Abstract:
This case study examines the way in which Knowledge Unlatched is combining collective action and open access licenses to encourage innovation in markets for specialist academic books. Knowledge Unlatched is a not-for-profit organisation that has been established to help a global community of libraries coordinate their book purchasing activities more effectively and, in so doing, to ensure that books librarians select for their own collections become available for free for anyone in the world to read. The Knowledge Unlatched model is an attempt to re-coordinate a market in order to facilitate a transition to digitally appropriate publishing models that include open access. It offers librarians an opportunity to facilitate the open access publication of books that their own readers would value access to. It provides publishers with a stable income stream on titles selected by libraries, as well as an ability to continue selling books to a wider market on their own terms. Knowledge Unlatched provides a rich case study for researchers and practitioners interested in understanding how innovations in procurement practices can be used to stimulate more effective, equitable markets for socially valuable products.
Abstract:
We generalized the Enskog theory, originally developed for the hard-sphere fluid, to fluids with continuous potentials such as the Lennard–Jones. We derived the expression for the k- and ω-dependent transport coefficient matrix, which enables us to calculate the transport coefficients for arbitrary length and time scales. Our results reduce to the conventional Chapman–Enskog expression in the low-density limit and to the conventional k-dependent Enskog theory in the hard-sphere limit. As examples, the self-diffusion of a single atom, vibrational energy relaxation, and the activated barrier crossing dynamics problem are discussed.
Abstract:
In recent decades the debate among scholars, lawyers, politicians and others about how societies deal with their past has been constant and intensive. 'Legal Institutions and Collective Memories' situates the processes of transitional justice at the intersection between legal procedures and the production of collective and shared meanings of the past. Building upon the work of Maurice Halbwachs, this collection of essays emphasises the extended role and active involvement of contemporary law and legal institutions in public discourse about the past, and explores their impact on the shape that collective memories take in the course of time. The authors uncover a complex pattern of searching for truth, negotiating the past and cultivating the art of forgetting. Their contributions explore the ambiguous and intricate links between the production of justice, truth and memory. The essays cover a broad range of legal institutions, countries and topics. These include transitional trials as 'monumental spectacles' as well as constitutional courts, and the restitution of property rights in Central and Eastern Europe and Australia. The authors explore the biographies of victims and how their voices were repressed, as in the case of Korean Comfort Women. They explore the role of law and legal institutions in linking individual and collective memories in the transitional period through processes of lustration, and they analyse divided memories about the past and their impact on future reconciliation in South Africa. The collection offers a genuinely comparative approach, allied to cutting-edge theory.
Abstract:
The paper presents a detailed analysis of the collective dynamics and delayed state feedback control of a three-dimensional delayed small-world network. The trivial equilibrium of the model is first investigated, showing that the uncontrolled model exhibits complicated unbounded behavior. Three control strategies, namely position feedback control, velocity feedback control, and a hybrid control combining velocity with acceleration feedback, are then introduced to stabilize this unstable system. Among these three control schemes, only the hybrid control can easily stabilize the 3-D network system. With properly chosen delay and gain in the delayed feedback path, the hybrid-controlled model may have a stable equilibrium, periodic solutions resulting from the Hopf bifurcation, or a complex strange attractor arising from the period-doubling bifurcation. Moreover, the direction of the Hopf bifurcation and the stability of the bifurcating periodic solutions are analyzed. The results are further extended to any d-dimensional network, showing that to stabilize a d-dimensional delayed small-world network, a complete differential feedback of order at least d – 1 is needed. This work provides constructive guidance for high-dimensional delayed systems.
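The stabilisation idea can be made concrete with a toy scalar analogue (deliberately far simpler than the paper's three-dimensional model; the coefficients, delay, and gain below are arbitrary illustrative values): a delayed drive term makes the uncontrolled state grow without bound, and a feedback term with sufficient gain stabilises it.

```python
# Toy scalar delayed system, integrated by Euler stepping over a history
# buffer: x'(t) = a*x(t - tau) + u(t). With u = 0 and a > 0 the state grows
# without bound; with feedback u = -k*x(t), k > a, it decays to equilibrium.

def simulate(a, k, tau, dt=0.01, steps=5000):
    delay_steps = int(tau / dt)
    history = [1.0] * (delay_steps + 1)       # constant initial history x(t<=0)=1
    for _ in range(steps):
        x = history[-1]                       # current state
        x_delayed = history[-1 - delay_steps] # state tau seconds ago
        dx = a * x_delayed - k * x            # delayed drive + feedback
        history.append(x + dt * dx)
    return history[-1]

uncontrolled = simulate(a=1.0, k=0.0, tau=0.5)
controlled = simulate(a=1.0, k=3.0, tau=0.5)
print(abs(uncontrolled) > 100)   # grows without bound, as in the paper's model
print(abs(controlled) < 0.01)    # feedback drives the state to equilibrium
```

The paper's point is that in higher dimensions a single such feedback term no longer suffices: a d-dimensional delayed network needs differential feedback up to order d – 1.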
Abstract:
This chapter considers the legal ramifications of Wikipedia and other online media, such as the Encyclopedia of Life. Nathaniel Tkacz (2007) has observed: 'Wikipedia is an ideal entry-point from which to approach the shifting character of knowledge in contemporary society.' He adds: 'Scholarship on Wikipedia from computer science, history, philosophy, pedagogy and media studies has moved beyond speculation regarding its considerable potential, to the task of interpreting - and potentially intervening in - the significance of Wikipedia's impact' (Tkacz 2007). After an introduction, Part II considers the evolution and development of Wikipedia, and the legal troubles that have attended it. It also considers the establishment of rival online encyclopedias - such as Citizendium, set up by Larry Sanger, the co-founder of Wikipedia; and Knol, the mysterious new project of Google. Part III explores the use of mass, collaborative authorship in the field of science. In particular, it looks at the development of the Encyclopedia of Life, which seeks to document the world's biodiversity. This chapter expresses concern that Wiki-based software had to develop in a largely hostile legal environment. It contends that copyright law and related fields of intellectual property need to be reformed in order better to accommodate users of copyright material (Rimmer 2007). This chapter makes a number of recommendations. First, there is a need to acknowledge and recognize forms of mass, collaborative production and consumption - not just individual authorship. Second, the view of a copyright 'work' and other subject matter as a complete and closed piece of cultural production should also be reconceptualised. Third, the defense of fair use should be expanded to accommodate a wide range of amateur, peer-to-peer production activities - not only in the United States, but in other jurisdictions as well. Fourth, the safe harbor protections accorded to Internet intermediaries, such as Wikipedia, should be strengthened. Fifth, there should be a defense in respect of the use of 'orphan works' - especially in cases of large-scale digitization. Sixth, the innovations of open source licensing should be expressly incorporated and entrenched within the formal framework of copyright laws. Finally, courts should craft judicial remedies that take into account concerns about political censorship and freedom of speech.
Abstract:
This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of a flood can be extracted. However, the great challenge in interpreting the data is to achieve more reliable flood extent mapping, covering both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machine method for spectral unmixing has been applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated with a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
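The linear spectral mixture model used here as the baseline can be sketched in a few lines (a hedged two-endmember illustration, not the paper's method; the band reflectance values are invented): a mixed pixel is modelled as a fractional combination of "pure" endmember spectra, and unmixing recovers the water fraction per pixel.

```python
# Two-endmember linear unmixing: pixel = f*water + (1-f)*land, solved for f
# by least squares over the spectral bands, then clamped to a physical range.

def water_fraction(pixel, water, land):
    """Least-squares f minimising ||pixel - (f*water + (1-f)*land)||."""
    num = sum((p - l) * (w - l) for p, w, l in zip(pixel, water, land))
    den = sum((w - l) ** 2 for w, l in zip(water, land))
    return min(1.0, max(0.0, num / den))   # clamp to a physical fraction

water = [0.05, 0.03, 0.02]                 # assumed reflectance in three bands
land = [0.20, 0.30, 0.40]

print(water_fraction(water, water, land))  # pure water pixel -> 1.0
print(water_fraction(land, water, land))   # dry pixel -> 0.0
mixed = [(w + l) / 2 for w, l in zip(water, land)]
print(round(water_fraction(mixed, water, land), 2))  # half-flooded -> 0.5
```

The extended SVM approach of the paper replaces this per-pixel least-squares step with a learned decision function, which is what allows it to handle pure and mixed pixels in one integrated map.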
Abstract:
A microscopic study of the non‐Markovian (or memory) effects on the collective orientational relaxation in a dense dipolar liquid is carried out by using an extended hydrodynamic approach which provides a reliable description of the dynamical processes occurring at the molecular length scales. Detailed calculations of the wave‐vector dependent orientational correlation functions are presented. The memory effects are found to play an important role; the non‐Markovian results differ considerably from those of the Markovian theory. In particular, a slow long‐time decay of the longitudinal orientational correlation function is observed for dense liquids, which becomes weaker in the presence of a sizeable translational contribution to the collective orientational relaxation. This slow decay can be attributed to the intermolecular correlations at the molecular length scales. The longitudinal component of the orientational correlation function becomes oscillatory in the underdamped limit of momentum relaxation, and the frequency dependence of the friction reduces the frictional resistance on the collective excitations (commonly known as dipolarons), making them long lived. The theory predicts that these dipolarons can, therefore, be important in chemical relaxation processes, in contradiction to the claims of some earlier theoretical studies.
Abstract:
We propose a method for the dynamic simulation of a collection of self-propelled particles in a viscous Newtonian fluid. We restrict attention to particles whose size and velocity are small enough that the fluid motion is in the creeping flow regime. We propose a simple model for a self-propelled particle, and extend the Stokesian Dynamics method to conduct dynamic simulations of a collection of such particles. In our description, each particle is treated as a sphere with an orientation vector p, whose locomotion is driven by the action of a force dipole Sp of constant magnitude S0 at a point slightly displaced from its centre. To simplify the calculation, we place the dipole at the centre of the particle, and introduce a virtual propulsion force Fp to effect propulsion. The magnitude F0 of this force is proportional to S0. The directions of Sp and Fp are determined by p. In isolation, a self-propelled particle moves at a constant velocity u0 p, with the speed u0 determined by S0. When it coexists with many such particles, its hydrodynamic interaction with the other particles alters its velocity and, more importantly, its orientation. As a result, the motion of the particle is chaotic. Our simulations are not restricted to low particle concentration, as we implement the full hydrodynamic interactions between the particles, but we restrict the motion of particles to two dimensions to reduce computation. We have studied the statistical properties of a suspension of self-propelled particles for a range of particle concentrations, quantified by the area fraction φa. We find several interesting features in the microstructure and statistics. We find that particles tend to swim in clusters wherein they are in close proximity. Consequently, incorporating the finite size of the particles and the near-field hydrodynamic interactions is of the essence. There is a continuous process of breakage and formation of the clusters.
We find that the distributions of particle velocity at low and high φa are qualitatively different; at high φa the distribution is close to normal, in agreement with experimental measurements. The motion of the particles is diffusive at long times, and the self-diffusivity decreases with increasing φa. The pair correlation function shows a large anisotropic build-up near contact, which decays rapidly with separation. There is also an anisotropic orientation correlation near contact, which decays more slowly with separation. Movies are available with the online version of the paper.
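The diffusive long-time behaviour noted above can be illustrated with a toy model that is far simpler than the paper's Stokesian Dynamics (no hydrodynamic interactions; all parameter values are invented): each particle swims at constant speed along its orientation, and random reorientation stands in for the interaction-induced chaos, producing the ballistic-to-diffusive crossover.

```python
import math
import random

# 2-D active-particle toy: constant swim speed u0 along orientation theta,
# with Gaussian rotational noise randomising the orientation over time.

def step(particles, u0=1.0, dt=0.1, rot_noise=1.0, rng=random):
    for p in particles:
        p["theta"] += rot_noise * rng.gauss(0.0, 1.0) * math.sqrt(dt)
        p["x"] += u0 * math.cos(p["theta"]) * dt
        p["y"] += u0 * math.sin(p["theta"]) * dt

rng = random.Random(0)   # seeded for reproducibility
particles = [{"x": 0.0, "y": 0.0, "theta": rng.uniform(0, 2 * math.pi)}
             for _ in range(50)]
for _ in range(1000):
    step(particles, rng=rng)

# After many reorientations the motion is diffusive, so the mean squared
# displacement sits well below the ballistic bound (u0 * t)^2 = (1.0 * 100)^2.
msd = sum(p["x"] ** 2 + p["y"] ** 2 for p in particles) / len(particles)
print(0 < msd < 100.0 ** 2)
```

In the paper the loss of orientational persistence comes from hydrodynamic interactions between particles rather than imposed noise, but the resulting long-time diffusive scaling is the same qualitative signature.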