926 results for UNCITRAL Model Law
Abstract:
Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development, which has multiple further implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and reregulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we are faced with a perhaps even harder regulatory puzzle, since we must figure out how to regulate a sector that is as dynamic and as unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model involves essentially deliberations on the question "Can competition law do it all?", since generic competition rules are largely seen as the appropriate regulatory tool for the communications domain. This perception was the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive, and ultimately bestows the regulation of the sector upon competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic and endowed with a special societal role and function. 
A thorough evaluation of the regulatory objectives in the communications environment contributes further to the comprehensive picture of the communications industry. Upon this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up-to-date and takes into account important recent developments in EC competition law practice, in particular in the fields of refusal to supply and tying, in the reform of the EC electronic communications framework, and in new decisions of the WTO dispute settlement body, notably the Mexico-Telecommunications Services Panel Report. Building on these elements, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn go beyond the current situation of EC electronic communications and the applicable law, and explore the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field. Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector, the most advanced network industry in terms of liberalisation and reregulation.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test’s open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, Web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. 
In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
Following European legislative initiatives in the field of copyright limitations and exceptions, the policy flexibilities formerly available to member states have been greatly diminished. The law in this area is increasingly incapable of accommodating any expansion in the scope of freely permitted acts, even where such expansion may be an appropriate response to changes in social and technological conditions. In this article, the causes of this problem are briefly canvassed and a number of potential solutions are noted. It is suggested that one such solution – the adoption of an open, factor-based model similar to s 107 of the United States’ Copyright Act – has not received the serious attention it deserves. The fair use paradigm has generally been dismissed as excessively unpredictable, contrary to international law and/or culturally alien. Drawing on recent fair use scholarship, it is argued here that these disadvantages are overstated and that the potential for the development of a European fair use model merits investigation.
Abstract:
This research aimed to explore the extent to which police use of force was related to attitudes towards violence, agency type, and racism. Previous studies have found a culture of honor in the psychology of violence in the Southern United States. Were similar attitudes measurable among Texas professional line officers? Are there predictors of use of force? A self-reported anonymous survey was administered to Texas patrol officers in the cities of Austin and Houston and the counties of Harris and Travis. A total of seventy-four questionnaires were used in the statistical analyses. Scales were developed measuring use of force, attitudes towards violence, and feelings on racism, and their relationships were examined. A regression model shows a strong and significant relationship between the officers' attitudes towards violence and self-reported use of force. Further, agency type, municipal versus sheriff, also predicts use of force. Attitudes regarding race or racism, as measured by this study, were not predictive of use of force.
Abstract:
Irrespective of the diverse stances taken on the effect of the UNESCO Convention on Cultural Diversity in the external relations context, since its wording is fairly open-ended, it is clear to all observers that the Convention’s impact will largely depend on how it is implemented domestically. The discussion on the national implementation of the Convention, both in the policy and in the academic discourses, is only just emerging. The implementation model of the EU could set an important example for the international community and for the other State Parties that have ratified the UNESCO Convention, as both the EU and its Member States, acting individually, have played a critical role in the adoption of the Convention, as well as in the longer process of promoting cultural concerns on the international scene. Against this backdrop, this article analyses the extent to which EU internal law and policies, in particular in the key area of media, take into account the spirit and the letter of the UNESCO Convention on Cultural Diversity. The article seeks to critically evaluate the present state of affairs and make some recommendations for the calibration of future policies.
Abstract:
Irrespective of the diverse stances taken on the effect of the UNESCO Convention on Cultural Diversity in the external relations context, since its wording is fairly open-ended, it is clear to all observers that the Convention’s impact will largely depend on how it is implemented domestically. The discussion on the national implementation of the Convention, both in the policy and in the academic discourses, is only just emerging, although six years have passed since the Convention’s entry into force. The implementation model of the EU can set an important example for the international community and for the other State Parties that have ratified the UNESCO Convention, as both the EU and its Member States, acting individually, have played a critical role in the adoption of the Convention, as well as in the longer process of promoting cultural concerns on the international scene. Against this backdrop, this article analyses the extent to which EU internal law and policies, in particular in the key area of media, take into account the spirit and the letter of the UNESCO Convention on Cultural Diversity. Next to an assessment of the EU’s implementation of the Convention, the article also offers remarks of a normative character – in the sense of what should be done to actually attain the objective of protecting and promoting cultural diversity. The article seeks to critically evaluate the present state of affairs and make some recommendations for the calibration of future policies.
Impact of epinephrine and norepinephrine on two dynamic indices in a porcine hemorrhagic shock model
Abstract:
BACKGROUND: Pulse pressure variation (PPV) and stroke volume variation (SVV) are dynamic indices for predicting fluid responsiveness in intensive care unit patients. These hemodynamic markers underscore the Frank-Starling law, by which volume expansion increases cardiac output (CO). The aim of the present study was to evaluate the impact of the administration of catecholamines on PPV, SVV, and inferior vena cava flow (IVCF). METHODS: In this prospective, physiologic, animal study, hemodynamic parameters were measured in deeply sedated and mechanically ventilated pigs. Systemic hemodynamics and pressure-volume loops obtained by inferior vena cava occlusion were recorded. Measurements were collected under two conditions, normovolemia and hypovolemia, the latter generated by blood removal to obtain a mean arterial pressure lower than 60 mm Hg. Under each condition, CO, IVCF, SVV, and PPV were assessed by catheters and flow meters. Data were compared between the normovolemic and hypovolemic conditions, before and after intravenous administration of norepinephrine and epinephrine, using a nonparametric Wilcoxon test. RESULTS: Eight pigs were anesthetized, mechanically ventilated, and equipped. Both norepinephrine and epinephrine significantly increased IVCF and decreased PPV and SVV, regardless of volemic conditions (p < 0.05). However, epinephrine was also able to significantly increase CO regardless of volemic conditions. CONCLUSION: The present study demonstrates that intravenous administration of norepinephrine and epinephrine increases IVCF, whatever the volemic conditions. The concomitant decreases in PPV and SVV corroborate the fact that catecholamine administration recruits unstressed blood volume. In this regard, interpreting a decrease in PPV and SVV values after catecholamine administration as an obvious indication of restored volemia could be an outright misinterpretation.
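For readers unfamiliar with the index, PPV is conventionally computed from the maximal and minimal beat-to-beat pulse pressures over one respiratory cycle. A minimal sketch of that standard definition (the beat values below are hypothetical, not data from this study):

```python
# Illustrative computation of pulse pressure variation (PPV) using the
# standard definition:
#   PPV (%) = 100 * (PPmax - PPmin) / ((PPmax + PPmin) / 2)
# where PP = systolic - diastolic pressure per beat.

def pulse_pressure_variation(systolic, diastolic):
    """Return PPV (%) from paired per-beat pressures (mm Hg)."""
    pulse_pressures = [s - d for s, d in zip(systolic, diastolic)]
    pp_max, pp_min = max(pulse_pressures), min(pulse_pressures)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Hypothetical beat-to-beat values across one ventilation cycle.
systolic = [118, 122, 125, 120, 115]
diastolic = [78, 79, 80, 79, 77]
print(round(pulse_pressure_variation(systolic, diastolic), 1))  # 16.9
```

SVV is defined analogously, with stroke volume in place of pulse pressure.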
Abstract:
Patients suffering from cystic fibrosis (CF) show thick secretions, mucus plugging and bronchiectasis in bronchial and alveolar ducts. This results in substantial structural changes of the airway morphology and heterogeneous ventilation. Disease progression and treatment effects are monitored by so-called gas washout tests, in which the change in concentration of an inert gas is measured over a single breath or multiple breaths. The test result, derived from the profile of the measured concentration, is a marker for the severity of ventilation inhomogeneity and is strongly affected by the airway morphology. However, it is hard to localize underlying obstructions to specific parts of the airways, especially if they occur in the lung periphery. In order to support the analysis of lung function tests (e.g. multi-breath washout), we developed a numerical model of the entire airway tree, coupling a lumped-parameter model for lung ventilation with a 4th-order accurate finite difference model of a 1D advection-diffusion equation for the transport of an inert gas. The boundary conditions for the flow problem comprise the pressure and flow profile at the mouth, which is typically known from clinical washout tests. The natural asymmetry of the lung morphology is approximated by a generic, fractal, asymmetric branching scheme applied to the conducting airways. A conducting airway ends when its dimension falls below a predefined limit. A model acinus is then connected to each terminal airway. The morphology of an acinus unit comprises a network of expandable cells. A regional, linear constitutive law describes the pressure-volume relation between the pleural gap and the acinus. The cyclic expansion (breathing) of each acinus unit depends on the resistance of the feeding airway and on the flow resistance and stiffness of the cells themselves. 
Special care was taken in the development of a conservative numerical scheme for the gas transport across bifurcations, handling spatially and temporally varying advective and diffusive fluxes over a wide range of scales. Implicit time integration was applied to account for the numerical stiffness resulting from the discretized transport equation. Local or regional modifications of the airway dimension, resistance or tissue stiffness are introduced to mimic pathological airway restrictions typical for CF. This leads to a more heterogeneous ventilation of the model lung. As a result, the concentration in some distal parts of the lung model remains elevated for a longer duration. The inert gas concentration at the mouth towards the end of expiration is composed of gas from regions with very different washout efficiency. This results in a steeper slope of the corresponding part of the washout profile.
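To illustrate the core transport equation, here is a much-simplified sketch of a washout-like scenario in a single airway segment, using a first-order explicit upwind scheme rather than the authors' 4th-order, implicit, conservative solver; all grid parameters are assumed, nondimensional values:

```python
import numpy as np

def advect_diffuse_step(c, u, D, dx, dt):
    """One explicit step of dc/dt = -u*dc/dx + D*d2c/dx2 (u > 0)."""
    c_new = c.copy()
    c_new[1:-1] = (c[1:-1]
                   - u * dt / dx * (c[1:-1] - c[:-2])                 # upwind advection
                   + D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2]))   # central diffusion
    c_new[-1] = c_new[-2]  # zero-gradient outflow boundary
    return c_new

# Segment initially full of inert gas (c = 1); fresh gas (c = 0) held
# at the inlet, so the washout front is advected and diffused downstream.
c = np.ones(51)
c[0] = 0.0  # Dirichlet inlet condition
for _ in range(200):
    c = advect_diffuse_step(c, u=1.0, D=0.5, dx=1.0, dt=0.1)
# After 200 steps the front has moved ~20 cells: c is low near the
# inlet and still close to 1 far downstream.
```

The chosen time step keeps the scheme stable and monotone (u*dt/dx + 2*D*dt/dx² = 0.2 ≤ 1); a production solver would instead use implicit integration, as the abstract describes, to handle stiffness.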
Abstract:
We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature in computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton and Kortum model can each be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles. In the Melitz model, generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason, the Melitz model features a demand externality: in a larger market, more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
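The Armington structure underlying all four models can be sketched in a few lines. This is the textbook one-sector version with assumed parameter values, not the paper's multi-sector CGE: goods are differentiated by origin, demand is CES with elasticity sigma, and a cut in iceberg trade costs lowers the CES price index, i.e. raises real income for a given nominal income.

```python
# Textbook Armington sketch (illustrative parameters, not calibrated):
# expenditure share of origin i at delivered price p_i * tau_i is
#   s_i = (p_i * tau_i)**(1 - sigma) / sum_j (p_j * tau_j)**(1 - sigma)
# and the CES price index is P = (sum_j (p_j * tau_j)**(1 - sigma))**(1/(1 - sigma)).

def shares_and_price_index(p, tau, sigma):
    terms = [(pi * ti) ** (1 - sigma) for pi, ti in zip(p, tau)]
    P = sum(terms) ** (1 / (1 - sigma))       # CES price index
    return [t / sum(terms) for t in terms], P

sigma = 4.0                  # elasticity of substitution (assumed)
p = [1.0, 1.2, 0.9]          # producer prices by origin (home first)
tau = [1.0, 1.3, 1.5]        # iceberg trade-cost factors

shares0, P0 = shares_and_price_index(p, tau, sigma)
# Cut the trade-cost wedge (tau - 1) on imports by 10%.
tau_cut = [1 + 0.9 * (t - 1) for t in tau]
shares1, P1 = shares_and_price_index(p, tau_cut, sigma)
print(P1 < P0)  # True: cheaper imports lower the price index
```

In the paper's framework, the Ethier-Krugman, Melitz and Eaton-Kortum variants replace the fixed `p` with generalized marginal costs and add the demand externality on top of this Armington core.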
Abstract:
This paper examines how US and proposed international law relate to the recovery of archaeological data from historic shipwrecks. It argues that US federal admiralty law of salvage gives far less protection to historic submerged sites than do US laws protecting archaeological sites on US federal and Indian lands. The paper offers a simple model in which the net present value of the salvage and archaeological investigation of an historic shipwreck is maximized. It is suggested that salvage law gives insufficient protection to archaeological data, but that UNESCO's Convention on the Protection of the Underwater Cultural Heritage goes too far in the other direction. It is also suggested that a move towards maximizing the net present value of a wreck would be promoted if the US admiralty courts explicitly tied the size of salvage awards to the quality of the archaeology performed.
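The trade-off the paper's model captures can be illustrated with a stylized net-present-value comparison (the cash flows and discount rate below are assumed for illustration, not taken from the paper): quick salvage realizes revenue early but destroys archaeological data, while careful excavation spreads smaller cash flows over time yet can yield a higher total value.

```python
# Stylized NPV comparison for a hypothetical wreck:
#   NPV = sum_t cashflow_t / (1 + r)**t
def npv(cashflows, r):
    """Discounted sum of yearly cash flows, starting at t = 0."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))

r = 0.05                          # assumed discount rate
fast_salvage = [100, 0, 0, 0]     # quick recovery, archaeological data lost
careful_dig = [30, 30, 30, 30]    # slower recovery plus documented finds

print(npv(fast_salvage, r), npv(careful_dig, r))
```

With these illustrative numbers the careful excavation has the higher NPV, which is the kind of outcome the paper argues salvage awards should reward by being tied to the quality of the archaeology performed.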