Abstract:
"The Protection of Traditional Knowledge Associated with Genetic Resources: The Role of Databases and Registers" ABSTRACT Yovana Reyes Tagle The misappropriation of TK has sparked a search for national and international laws to govern the use of indigenous peoples knowledge and protection against its commercial exploitation. There is a widespread perception that biopiracy or illegal access to genetic resources and associated traditional knowledge (TK) continues despite national and regional efforts to address this concern. The purpose of this research is to address the question of how documentation of TK through databases and registers could protect TK, in light of indigenous peoples increasing demands to control their knowledge and benefit from its use. Throughout the international debate over the protection of TK, various options have been brought up and discussed. At its core, the discussion over the legal protection of TK comes down to these issues: 1) The doctrinal question: What is protection of TK? 2) The methodological question: How can protection of TK be achieved? 3) The legal question: What should be protected? And 4) The policy questions: Who has rights and how should they be implemented? What kind of rights should indigenous peoples have over their TK? What are the central concerns the TK databases want to solve? The acceptance of TK databases and registers may bring with it both opportunities and dangers. How can the rights of indigenous peoples over their documented knowledge be assured? Documentation of TK was envisaged as a means to protect TK, but there are concerns about how documented TK can be protected from misappropriation. The methodology used in this research seeks to contribute to the understanding of the protection of TK. 
The steps taken in this research attempt to describe and to explain a) what has been done to protect TK through databases and registers, b) how this protection is taking place, and c) why the establishment of TK databases can or cannot be useful for the protection of TK. The selected case studies (Peru and Venezuela) seek to illustrate the complexity and multidisciplinary nature of the establishment of TK databases, which entails not only legal but also political, socio-economic and cultural issues. The study offers some conclusions and recommendations that emerged from reviewing the national experiences, international instruments, the work of international organizations, and indigenous peoples' perspectives. This thesis concludes that if TK is to be protected from disclosure and unauthorized use, confidential databases are required. Finally, the TK database strategy needs to be strengthened by legal protection of the TK itself.
Abstract:
A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even to cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. It is now thought that a significant fraction of all atmospheric particles is produced by vapour-to-liquid nucleation. In atmospheric sciences, as well as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapour-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapour-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapour and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory, once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few or some tens of molecules, depending on the interaction potential and temperature. However, the error made in modelling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law.
By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory of Frenkel, Band, and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. Finally, we show that the size dependence of the cluster surface tension at the equimolar surface is a function of virial coefficients, a result confirmed by our cluster simulations.
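The liquid-drop form of the Classical Nucleation Theory barrier referred to above can be sketched numerically. The following is a minimal illustration, not code or data from the thesis; the parameter values (temperature, saturation ratio, surface-energy coefficient `theta`) are assumptions chosen purely for demonstration.

```python
import math

# CNT liquid-drop work of forming an n-molecule cluster:
#   W(n) = -n*kT*ln(S) + theta * n**(2/3)
# S is the saturation ratio; theta bundles the surface tension and
# molecular volume of the liquid (illustrative values only).
K_B = 1.380649e-23  # Boltzmann constant, J/K

def work_of_formation(n, temp, sat_ratio, theta):
    """CNT free energy (J) of forming an n-molecule cluster."""
    return -n * K_B * temp * math.log(sat_ratio) + theta * n ** (2.0 / 3.0)

def critical_size(temp, sat_ratio, theta):
    """Cluster size n* maximizing W: setting dW/dn = 0 gives
    n* = (2*theta / (3*kT*ln S))**3."""
    return (2.0 * theta / (3.0 * K_B * temp * math.log(sat_ratio))) ** 3
```

The barrier height W(n*) at the critical size is what enters the nucleation rate exponentially, which is why even modest corrections to the work of formation change predicted rates by orders of magnitude.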
Abstract:
There is a lack of integrative conceptual models that would help to better understand the underlying reasons for the alleged problems of MBA education. To address this challenge, we draw on the work of Pierre Bourdieu to examine MBA education as an activity with its own 'economy of exchange' and 'rules of the game.' We argue that applying Bourdieu's theoretical ideas elucidates three key issues in the debate around MBA education: the outcomes of MBA programs, the inculcation of potentially problematic values and practices through the programs, and the potential of self-regulation, such as accreditation and ranking, to impede the development of MBA education. First, Bourdieu's notions of capital – intellectual, social and symbolic – shed light on the 'economy of exchange' in MBA education. Critics of MBA programs have pointed out that the value of MBA degrees lies not only in 'learning.' Bourdieu's framework allows further analysis of this issue by distinguishing between intellectual capital (learning), social capital (social networks), and symbolic capital (credentials and prestige). Second, the concept of 'habitus' suggests how values and practices are inculcated through MBA education. This process is often a 'voluntary' one in which problematic or ethically questionable ideas may come to be regarded as natural. Third, Bourdieu's reflections on the 'doxa' and its reproduction and legitimation illuminate the role of accreditation and ranking in MBA education. An analysis of such self-regulation explains in part how the system may end up impeding change.
Abstract:
The triangular space between memory, narrative and pictorial representation is the terrain on which this article is developed. Taking the art of memory developed by Giordano Bruno (1548–1600) and the art of painting subtly revolutionised by Adam Elsheimer (1578–1610) as test cases, it is shown how both subvert the norms of mimesis and narration prevalent throughout the Renaissance, how disrupted memory creates "incoherent" narratives, and how perspective and the notion of "place" are questioned in a corollary way. Two paintings by Elsheimer are analysed and shown to include, in spite of their supposed "realism", numerous incoherencies, aporias and strange elements – often overlooked. Thus, they do not conform to two of the basic rules governing both the classical art of memory and the humanist art of painting: well-defined places and the exhaustive translatability of words into images (and vice versa). In the work of Bruno, both his philosophical claims and the literary devices he uses are analysed as hints of a similar (and contemporaneous) undermining of conventions about the transparency and immediacy of representation.
Abstract:
The title of the 14th International Conference on Electronic Publishing (ELPUB), "Publishing in the networked world: Transforming the nature of communication", is a timely one. Scholarly communication and scientific publishing have recently been undergoing subtle changes. Published papers are no longer the fixed physical objects they once were. The "convergence" of information, communication, publishing and web technologies, along with the emergence of Web 2.0 and social networks, has completely transformed scholarly communication, and scientific papers have turned into living, changing entities in the online world. The themes (electronic publishing and social networks; scholarly publishing models; and technological convergence) selected for the conference are meant to address the issues involved in this transformation process. We are pleased to present the proceedings book with more than 30 papers and short communications addressing these issues. What you hold in your hands is the culmination of almost a year's work by many people, including conference organizers, authors, reviewers, editors, and print and online publishers. The ELPUB 2010 conference was organized and hosted by the Hanken School of Economics in Helsinki, Finland. Professors Turid Hedlund of the Hanken School of Economics and Yaşar Tonta of the Hacettepe University Department of Information Management (Ankara, Turkey) served as General Chair and Program Chair, respectively. We received more than 50 submissions from several countries. All submissions were peer-reviewed by members of an international Program Committee, whose contributions proved most valuable and appreciated. The 14th ELPUB conference carries on the tradition of previous conferences held in the United Kingdom (1997 and 2001), Hungary (1998), Sweden (1999), Russia (2000), the Czech Republic (2002), Portugal (2003), Brazil (2004), Belgium (2005), Bulgaria (2006), Austria (2007), Canada (2008) and Italy (2009).
The ELPUB Digital Library (http://elpub.scix.net) serves as an archive for the papers presented at ELPUB conferences through the years. The 15th ELPUB conference will be organized by the Department of Information Management of Hacettepe University and will take place in Ankara, Turkey, from 14-16 June 2011. (Details can be found at the ELPUB web site as the conference date nears.) We thank Marcus Sandberg and Hannu Sääskilahti for copyediting, and Library Director Tua Hindersson-Söderholm for agreeing to publish the online as well as the print version of the proceedings. Thanks also to Patrik Welling for maintaining the conference web site and to Tanja Dahlgren for administrative support. We warmly acknowledge the support of our colleagues at the Hanken School of Economics and of our sponsors in organizing the conference.
Abstract:
The thesis investigates the local dimension of EU cohesion policy through an alternative approach centred on the analysis of discourse and structures of power. The concrete case under analysis is the Interreg IV programme "Alpenrhein-Bodensee-Hochrhein", which is conducted in the border region between Germany, Switzerland, Austria and the Principality of Liechtenstein. The main research questions are: What governmental rationalities can be found at work in the field of EU cross-border cooperation programmes? How are directive action and cooperation envisioned? How coherent are the different rationalities found at work? The theoretical framework is based on a Foucauldian understanding of power and discourse and utilizes the notion of governmentalities as a way to de-stabilize the understanding of directive action and to highlight the dispersed and heterogeneous nature of governmental activity. The approach is situated within the general field of research on the European Union connected to basic conceptualisations such as the nature of power, the role of discourse and modes of subjectification. An approach termed "analytics of government", based on the work of researchers such as Mitchell Dean, is introduced as the basic framework for the analysis. Four dimensions (visibilities, subjectivities, techniques/practices, problematisations) are presented as a set of tools with which governmental regimes of practices can be analysed. The empirical part of the thesis starts out with a discussion of the general framework of the European Union's cohesion policy and places the Interreg IV Alpenrhein-Bodensee-Hochrhein programme in this general context. The main analysis is based on eleven interviews conducted with individuals participating in the programme at different levels.
The selection of interview partners aimed at maximising heterogeneity by including individuals from all parts of the programme region who hold different functions within the programme. The analysis reveals interesting aspects pertaining to the implementation and routine aspects of work within initiatives conducted under the heading of EU cohesion policy. The central aspects of an Interreg IV Alpenrhein-Bodensee-Hochrhein governmentality are sketched out. These include a positive perception of the work atmosphere, an administrative/professional characterisation of the selves, and a de-politicization of the programme. Characteristic are the tensions experienced by interview partners and the discursive strategies used to resolve them. Negative perceptions play an important role in the specific governmental rationality. The thesis contributes to a better understanding of the local dimension of European Union cohesion policy and questions established ways of thinking about governmental activity. It provides insight into the working of power mechanisms in the constitution of fields of discourse and points out matters of practical importance as well as subsequent research questions.
Abstract:
This monograph describes the emergence of independent research on logic in Finland. The emphasis is placed on three well-known students of Eino Kaila: Georg Henrik von Wright (1916-2003), Erik Stenius (1911-1990), and Oiva Ketonen (1913-2000), and on their research between the early 1930s and the early 1950s. The early academic work of these scholars laid the foundations for today's strong tradition in logic in Finland and also became internationally recognized. However, these works have since received little attention, nor have they been comprehensively presented together. Each chapter of the book focuses on the life and work of one of Kaila's aforementioned students, with a fourth chapter discussing works on logic by authors who would later become known within other disciplines. Through extensive use of correspondence and other archived material, some insight has been gained into the persons behind the academic personae. Unique and unpublished biographical material has been available for this task. The chapter on Oiva Ketonen focuses primarily on his work on what is today known as proof theory, especially on his proof-theoretical system with invertible rules that permits a terminating root-first proof search. The independence of the parallel postulate is proved as an example of the strength of root-first proof search. Ketonen was, to our knowledge, the only student of Gerhard Gentzen, the 'father' of proof theory. Correspondence and a hitherto unavailable autobiographical manuscript, together with an unpublished article on the relationship between logic and epistemology, are presented. The chapter on Erik Stenius discusses his work on paradoxes and set theory, more specifically on how a rigid theory of definitions is employed to avoid these paradoxes. A presentation by Paul Bernays on Stenius' attempted proof of the consistency of arithmetic is reconstructed on the basis of Bernays' lecture notes.
Stenius' correspondence with Paul Bernays, Evert Beth, and Georg Kreisel is discussed. The chapter on Georg Henrik von Wright presents his early work on probability and epistemology, along with his later work on modal logic, which made him internationally famous. Correspondence from various archives (especially with Kaila and Charlie Dunbar Broad) sheds further light on his academic achievements and his experiences during the challenging circumstances of the 1940s.
Abstract:
This dissertation inquires into the relationship between gender and biopolitics. Biopolitics, according to Michel Foucault, is the mode of politics that is situated and exercised at the level of life. The dissertation claims that gender is a technology of biopower specific to the optimisation of the sexual reproduction of human life, deployed through the scientific and governmental problematisation of declining fertility rates in the mid-twentieth century. Just as Michel Foucault claimed that sexuality became a scientific and political discourse in the nineteenth century, gender has since emerged in these fields as well. In this dissertation, gender is treated neither as a representation of sex nor as a cultural construct or category of identity. Rather, a genealogy of gender as an apparatus of biopower is conducted. It demonstrates how scientific and theoretical developments in the twentieth century marshalled gender into the sex/sexuality apparatus as a new technology of liberal biopower. Gender, I argue, has become necessary for the Western liberal order to recapture and re-optimise the life-producing functions of sex that reproduce the very object of biopolitics: life. The concept of the life function is introduced to analyse the life-producing violence of the sex/sexuality/gender apparatus. To do this, the thesis rereads the work of Michel Foucault through Gilles Deleuze for a deeper grasp of the material strategies of biopower and of how it produces categories of difference and divides populations according to them. The work of Judith Butler, in turn, is used as a foil against which to rearticulate the question of how to examine gender genealogically and biopolitically. The dissertation then executes a genealogy of gender, tracing the changing rationalities of sex/sexuality/gender from early feminist thought, through mid-twentieth-century sexological, feminist, and demographic research, to current EU policy.
According to this genealogy, in the mid-twentieth century demographers perceived that sexuality/sex, which Foucault observed as the life-producing biopolitical apparatus, was no longer sufficiently disciplining human bodies to reproduce. The life function was escaping the grasp of biopower. The analysis demonstrates how gender theory was taken up as a means of reterritorialising the life function: nature would be disciplined to reproduce by controlling culture. The crucial theoretical and genealogical argument of the thesis, that gender is a discourse with biopolitical foundations and a technology of biopower, radically challenges the premises of gender theory and feminist politics, as well as the emancipatory potential often granted to the gender concept. The project asks what gender means, what biopolitical function it performs, and what is at stake for feminist politics when it engages with it. In so doing, it identifies biopolitics and the problem of life as possibly the most urgent arena for feminist politics today.
Abstract:
We propose to compress weighted graphs (networks), motivated by the observation that large networks of social, biological, or other relations can be complex to handle and visualize. In the process, also known as graph simplification, nodes and (unweighted) edges are grouped into supernodes and superedges, respectively, to obtain a smaller graph. We propose models and algorithms for weighted graphs. The interpretation (i.e., decompression) of a compressed, weighted graph is that a pair of original nodes is connected by an edge if their supernodes are connected by one, and that the weight of an edge is approximated by the weight of the superedge. The compression problem then consists of choosing supernodes, superedges, and superedge weights so that the approximation error is minimized while the amount of compression is maximized. In this paper, we formulate this task as the 'simple weighted graph compression problem'. We then propose a much wider class of tasks under the name of the 'generalized weighted graph compression problem'. The generalized task extends the optimization to preserve longer-range connectivities between nodes, not just individual edge weights. We study the properties of these problems and propose a range of algorithms to solve them, with different balances between complexity and quality of the result. We evaluate the problems and algorithms experimentally on real networks. The results indicate that weighted graphs can be compressed efficiently with relatively little compression error.
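The compression and decompression semantics described above can be sketched in a few lines. This is a minimal illustration of the idea, not the algorithms proposed in the paper: given a fixed partition of nodes into supernodes, each superedge weight is taken as the mean weight over all node pairs it spans (absent edges counting as weight 0), which is the choice that minimizes the squared approximation error for that partition. The function names and the dict-based graph representation are assumptions made for the example.

```python
# Sketch of 'simple weighted graph compression' semantics (illustrative only).

def compress(edges, partition):
    """edges: {(u, v): weight} for an undirected graph.
    partition: {supernode: set of original nodes}.
    Returns {frozenset of supernodes: superedge weight}, where each weight
    is the mean over all node pairs the superedge spans."""
    group = {v: g for g, nodes in partition.items() for v in nodes}
    sums = {}
    for (u, v), w in edges.items():
        key = frozenset((group[u], group[v]))
        sums[key] = sums.get(key, 0.0) + w
    super_w = {}
    for key, total in sums.items():
        gs = sorted(key)
        if len(gs) == 1:  # superedge inside a single supernode (self-loop)
            a = len(partition[gs[0]])
            n_pairs = a * (a - 1) // 2
        else:
            n_pairs = len(partition[gs[0]]) * len(partition[gs[1]])
        super_w[key] = total / n_pairs
    return super_w

def decompressed_weight(u, v, partition, super_w):
    """Approximate weight of edge (u, v) after decompression: the weight
    of the superedge joining the nodes' supernodes, or 0 if there is none."""
    group = {x: g for g, nodes in partition.items() for x in nodes}
    return super_w.get(frozenset((group[u], group[v])), 0.0)
```

For example, merging nodes 1 and 2 into one supernode while node 3 stays alone replaces edges (1,3) and (2,3) by a single superedge carrying their mean weight; the approximation error is then the difference between the original weights and that mean over all spanned pairs.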