956 results for Covering law model


Relevance:

30.00%

Publisher:

Abstract:

Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development, which has multiple further implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and re-regulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we face perhaps an even harder regulatory puzzle: how to regulate a sector that is as dynamic and unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model turns essentially on the question "Can competition law do it all?", since generic competition rules are widely seen as the appropriate regulatory tool for the communications domain. That perception was the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive, ultimately entrusting regulation of the sector to competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules, or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic and endowed with a special societal role and function. 
A thorough evaluation of the regulatory objectives in the communications environment contributes further to this comprehensive picture of the communications industry. On this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up to date and takes into account important recent developments: in EC competition law practice, in particular in the fields of refusal to supply and tying; in the reform of the EC electronic communications framework; and in new decisions of the WTO dispute settlement body, notably the Mexico–Telecommunications Services Panel Report. On these building blocks, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn reach beyond the current situation of EC electronic communications and the applicable law, exploring the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field. Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector, the most advanced one in terms of liberalisation and re-regulation.

Relevance:

30.00%

Publisher:

Abstract:

Irrespective of the diverse stances taken on the effect of the UNESCO Convention on Cultural Diversity in the external relations context, since its wording is fairly open-ended, it is clear to all observers that the Convention's impact will largely depend on how it is implemented domestically. The discussion on the national implementation of the Convention, both in the policy and in the academic discourses, is only just emerging. The implementation model of the EU could set an important example for the international community and for the other State Parties that have ratified the UNESCO Convention, as both the EU and its Member States, acting individually, have played a critical role in the adoption of the Convention, as well as in the longer process of promoting cultural concerns on the international scene. Against this backdrop, this article analyses the extent to which EU internal law and policies, in particular in the key area of media, take into account the spirit and the letter of the UNESCO Convention on Cultural Diversity. The article seeks to critically evaluate the present state of affairs and make some recommendations for the calibration of future policies.

Relevance:

30.00%

Publisher:

Abstract:

Irrespective of the diverse stances taken on the effect of the UNESCO Convention on Cultural Diversity in the external relations context, since its wording is fairly open-ended, it is clear to all observers that the Convention's impact will largely depend on how it is implemented domestically. The discussion on the national implementation of the Convention, both in the policy and in the academic discourses, is only just emerging, although six years have passed since the Convention's entry into force. The implementation model of the EU can set an important example for the international community and for the other State Parties that have ratified the UNESCO Convention, as both the EU and its Member States, acting individually, have played a critical role in the adoption of the Convention, as well as in the longer process of promoting cultural concerns on the international scene. Against this backdrop, this article analyses the extent to which EU internal law and policies, in particular in the key area of media, take into account the spirit and the letter of the UNESCO Convention on Cultural Diversity. Next to an assessment of the EU's implementation of the Convention, the article also offers remarks of a normative character, in the sense of what should be done to actually attain the objective of protecting and promoting cultural diversity. The article seeks to critically evaluate the present state of affairs and make some recommendations for the calibration of future policies.

Relevance:

30.00%

Publisher:

Abstract:

As the understanding and representation of the impacts of volcanic eruptions on climate have improved in recent decades, uncertainties in the stratospheric aerosol forcing from large eruptions are now linked not only to visible optical depth estimates on a global scale but also to details of the size, latitude and altitude distributions of the stratospheric aerosols. Based on our understanding of these uncertainties, we propose a new model-based approach to generating a volcanic forcing for general circulation model (GCM) and chemistry–climate model (CCM) simulations. This new volcanic forcing, covering the 1600–present period, uses an aerosol microphysical model to provide a realistic, physically consistent treatment of the stratospheric sulfate aerosols. Twenty-six eruptions were modeled individually using the latest available ice-core aerosol mass estimates and historical data on the latitude and date of each eruption. The evolution of the aerosol spatial and size distributions after the sulfur dioxide discharge is thus characterized for each volcanic eruption. Large variations are seen in hemispheric partitioning and size distributions in relation to the location/date of eruptions and the injected SO2 masses. Results for recent eruptions show reasonable agreement with observations. By providing these new estimates of the spatial distributions of shortwave and long-wave radiative perturbations, this volcanic forcing may help to better constrain climate model responses to volcanic eruptions over the 1600–present period. The final data set consists of 3-D fields (with constant longitude) of spectrally resolved extinction coefficients, single scattering albedos and asymmetry factors, calculated for different wavelength bands upon request. Surface area densities for heterogeneous chemistry are also provided.
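The chain from injected SO2 mass to a radiative perturbation can be illustrated with a deliberately crude back-of-envelope sketch. The linear scaling coefficients below (`aod_per_tg`, `forcing_per_aod`) are illustrative assumptions, not values from the data set described; a microphysical model such as the one used here produces a sub-linear mass-to-AOD relation because larger particles sediment faster.

```python
# Illustrative conversion from injected SO2 mass to a peak global-mean
# stratospheric aerosol optical depth (AOD) and a shortwave radiative
# perturbation. Both scaling constants are assumptions for illustration.

def so2_to_peak_aod(so2_mass_tg, aod_per_tg=0.006):
    """Peak global-mean AOD from injected SO2 mass (Tg).

    aod_per_tg is an assumed linear scaling; microphysical models give a
    sub-linear relation because large particles fall out faster.
    """
    return so2_mass_tg * aod_per_tg

def aod_to_sw_forcing(aod, forcing_per_aod=-25.0):
    """Global-mean shortwave forcing (W/m^2) from AOD, linear scaling."""
    return forcing_per_aod * aod

# Example: a Pinatubo-sized injection of ~17 Tg SO2
aod = so2_to_peak_aod(17.0)
forcing = aod_to_sw_forcing(aod)
print(f"peak AOD ~ {aod:.3f}, SW forcing ~ {forcing:.2f} W/m^2")
```

A spatially resolved forcing set like the one described replaces these two scalars with full latitude-altitude-wavelength fields.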

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Pulse pressure variation (PPV) and stroke volume variation (SVV) are dynamic indices for predicting fluid responsiveness in intensive care unit patients. These hemodynamic markers rest on the Frank-Starling law, by which volume expansion increases cardiac output (CO). The aim of the present study was to evaluate the impact of the administration of catecholamines on PPV, SVV, and inferior vena cava flow (IVCF). METHODS: In this prospective, physiologic, animal study, hemodynamic parameters were measured in deeply sedated and mechanically ventilated pigs. Systemic hemodynamics and pressure-volume loops obtained by inferior vena cava occlusion were recorded. Measurements were collected under two conditions, normovolemia and hypovolemia, the latter generated by blood removal to obtain a mean arterial pressure lower than 60 mm Hg. Under each condition, CO, IVCF, SVV, and PPV were assessed by catheters and flow meters. Data were compared between the normovolemic and hypovolemic conditions, before and after intravenous administration of norepinephrine and epinephrine, using a nonparametric Wilcoxon test. RESULTS: Eight pigs were anesthetized, mechanically ventilated, and instrumented. Both norepinephrine and epinephrine significantly increased IVCF and decreased PPV and SVV, regardless of volemic conditions (p < 0.05). However, epinephrine also significantly increased CO regardless of volemic conditions. CONCLUSION: The present study demonstrates that intravenous administration of norepinephrine and epinephrine increases IVCF, whatever the volemic conditions. The concomitant decreases in PPV and SVV corroborate the fact that catecholamine administration recruits unstressed blood volume. In this regard, reading a decrease in PPV and SVV values after catecholamine administration as an obvious indication of restored volemia could be an outright misinterpretation.
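For readers unfamiliar with the index, PPV is conventionally computed from the maximal and minimal beat-to-beat pulse pressures over one respiratory cycle. A minimal sketch of that standard formula, with invented sample pressures:

```python
# Standard pulse pressure variation (PPV) calculation from beat-to-beat
# systolic/diastolic pairs recorded over one respiratory cycle. Values
# above roughly 12-13% are classically read as predicting fluid
# responsiveness. The sample pressures below are invented.

def pulse_pressure_variation(sys_dia_pairs):
    """PPV (%) = 100 * (PPmax - PPmin) / ((PPmax + PPmin) / 2)."""
    pps = [s - d for s, d in sys_dia_pairs]
    pp_max, pp_min = max(pps), min(pps)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

beats = [(118, 72), (112, 70), (106, 69), (110, 71), (116, 72)]  # mmHg
print(f"PPV = {pulse_pressure_variation(beats):.1f} %")
```

The study's point is precisely that this number can fall after catecholamines without any change in volemia, so the formula alone cannot be read as a volume gauge.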

Relevance:

30.00%

Publisher:

Abstract:

Patients suffering from cystic fibrosis (CF) show thick secretions, mucus plugging and bronchiectasis in bronchial and alveolar ducts. This results in substantial structural changes of the airway morphology and heterogeneous ventilation. Disease progression and treatment effects are monitored by so-called gas washout tests, where the change in concentration of an inert gas is measured over a single breath or multiple breaths. The test result, based on the profile of the measured concentration, is a marker for the severity of the ventilation inhomogeneity, which is strongly affected by the airway morphology. However, it is hard to localize underlying obstructions to specific parts of the airways, especially if they occur in the lung periphery. In order to support the analysis of lung function tests (e.g. multi-breath washout), we developed a numerical model of the entire airway tree, coupling a lumped-parameter model for lung ventilation with a 4th-order accurate finite-difference model of a 1D advection-diffusion equation for the transport of an inert gas. The boundary conditions for the flow problem comprise the pressure and flow profiles at the mouth, which are typically known from clinical washout tests. The natural asymmetry of the lung morphology is approximated by a generic, fractal, asymmetric branching scheme applied to the conducting airways. A conducting airway ends when its dimension falls below a predefined limit. A model acinus is then connected to each terminal airway. The morphology of an acinus unit comprises a network of expandable cells. A regional, linear constitutive law describes the pressure-volume relation between the pleural gap and the acinus. The cyclic expansion (breathing) of each acinus unit depends on the resistance of the feeding airway and on the flow resistance and stiffness of the cells themselves. 
Special care was taken in the development of a conservative numerical scheme for the gas transport across bifurcations, handling spatially and temporally varying advective and diffusive fluxes over a wide range of scales. Implicit time integration was applied to account for the numerical stiffness resulting from the discretized transport equation. Local or regional modifications of airway dimension, resistance or tissue stiffness are introduced to mimic pathological airway restrictions typical of CF. This leads to a more heterogeneous ventilation of the model lung. As a result, the concentration in some distal parts of the lung model remains elevated for a longer duration. The inert gas concentration at the mouth towards the end of expiration is composed of gas from regions with very different washout efficiency. This results in a steeper slope of the corresponding part of the washout profile.
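The transport step described above can be illustrated with a much-simplified sketch: backward-Euler (implicit) time integration of a 1D advection-diffusion equation dc/dt + u dc/dx = D d2c/dx2 on a single airway segment. This uses first-order upwind advection rather than the paper's 4th-order conservative scheme, and no bifurcations; it only shows why implicit integration keeps the stiff discretized system stable at large time steps. All parameter values are invented.

```python
import numpy as np

def step_implicit(c, u, D, dx, dt):
    """One backward-Euler step of 1D upwind advection + central diffusion."""
    n = len(c)
    a = u * dt / dx          # advective Courant number
    d = D * dt / dx**2       # diffusive number
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1] = -a - d
        A[i, i] = 1 + a + 2 * d
        A[i, i + 1] = -d
    A[0, 0] = 1.0            # fixed concentration at the inlet
    A[n - 1, n - 1] = 1.0    # fixed concentration at the outlet
    return np.linalg.solve(A, c)

c = np.zeros(50)
c[0] = 1.0                   # inert gas enters at the mouth end
for _ in range(200):
    c = step_implicit(c, u=0.05, D=1e-3, dx=0.02, dt=0.1)
print(np.round(c[:5], 3))
```

Because the implicit upwind operator is an M-matrix, the concentration stays bounded in [0, 1] at any time step, which is the stability property the abstract alludes to; a production model would instead assemble a sparse system over the whole bifurcating tree.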

Relevance:

30.00%

Publisher:

Abstract:

We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature into computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton-Kortum model can each be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the quantity of factor input bundles. In the Melitz model generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
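The common Armington core that the paper maps the other models onto can be sketched in a few lines: CES expenditure shares over origins follow from delivered prices, i.e. producer prices scaled by (generalized) trade costs. The parameter values below are invented for illustration; the Melitz and Eaton-Kortum variants enter by modifying the marginal-cost and trade-cost terms rather than the share formula itself.

```python
import numpy as np

def armington_shares(p, t, sigma):
    """CES expenditure shares over origins given producer prices p,
    iceberg trade-cost factors t (>= 1) and elasticity sigma > 1."""
    delivered = p * t                 # delivered (cif) prices
    s = delivered ** (1 - sigma)      # CES share weights
    return s / s.sum()

p = np.array([1.0, 0.9, 1.1])   # producer prices by origin (invented)
t = np.array([1.0, 1.3, 1.2])   # iceberg trade-cost factors (invented)
print(armington_shares(p, t, sigma=5.0))
```

With sigma = 5, the cheap-at-the-factory-gate second origin loses out once its higher trade cost is applied, which is the sense in which welfare effects of trade-cost changes are governed by delivered, not producer, prices.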

Relevance:

30.00%

Publisher:

Abstract:

This paper examines how US and proposed international law relate to the recovery of archaeological data from historic shipwrecks. It argues that US federal admiralty law of salvage gives far less protection to historic submerged sites than do US laws protecting archaeological sites on US federal and Indian lands. The paper offers a simple model in which the net present value of the salvage and archaeological investigation of an historic shipwreck is maximized. It is suggested that salvage law gives insufficient protection to archaeological data, but that UNESCO's Convention on the Protection of the Underwater Cultural Heritage goes too far in the other direction. It is also suggested that a move towards maximizing the net present value of a wreck would be promoted if the US admiralty courts explicitly tied the size of salvage awards to the quality of the archaeology performed.

Relevance:

30.00%

Publisher:

Abstract:

The doctrine of fair use allows limited copying of creative works based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops a formal model of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. The analysis highlights the role of technology in shaping the efficient standard. Discussion of several key cases illustrates the applicability of the model.

Relevance:

30.00%

Publisher:

Abstract:

Institutional Review Boards (IRBs) are the primary gatekeepers for the protection of ethical standards of federally regulated research on human subjects in this country. This paper focuses on what general, broad measures may be instituted or enhanced to exemplify a "model IRB". This is done by examining the current regulatory standards of federally regulated IRBs, not private or commercial boards, and how many of those standards have been found either inadequate or not generally understood or followed. The analysis includes suggestions on how to make the IRB process more efficient and less subject to litigation, and how to create standardized educational protocols for members. The paper also considers how to provide better oversight for multi-center research, increased centralization of IRBs, utilization of Data Safety Monitoring Boards when necessary, payment for research protocol review, voluntary accreditation, and the institution of evaluation/quality assurance programs. This is a policy study utilizing secondary analysis of publicly available data. The research for this paper therefore draws on scholarly medical/legal journals, web information from the Department of Health and Human Services, the Food and Drug Administration, and the Office of the Inspector General, accreditation programs, law review articles, and current regulations applicable to the relevant portions of the paper. Two issues are consistently cited by the literature as major concerns. The first is a need for basic, standardized educational requirements across all IRBs and their members; the second is much stricter and more informed management of continuing research. There is no federally mandated formal education system currently in place for IRB members, except for certain NIH-based trials. Also, IRBs are not keeping up with research once a study has begun, and although they are required by regulation to do so, it does not appear to be a high priority. 
This is the area most in danger of increased litigation. Other issues such as voluntary accreditation and outcomes evaluation are slowly gaining steam as the processes become more available and more sought after, as with JCAHO accreditation of hospitals. Adopting the principles discussed in this paper should promote better use of a local IRB's time, money, and expertise in protecting the vulnerable populations in its care. Without further improvements to the system, there is concern that private and commercial IRBs will attempt to create a monopoly on much of the clinical research in the future, as they are not as heavily regulated and can therefore offer companies quicker and more convenient reviews. IRBs need to consider the advantages of charging for their unique and important services as a cost of doing business. More importantly, there must be a minimum standard of education for all IRB members in the ethical standards of human research, and a greater emphasis placed on the follow-up of ongoing research, as this is the most critical time for study participants and may soon become the largest area for litigation. Additionally, there should be a centralized IRB for multi-site trials, or a study website carrying important information affecting the trial in real time. Standards and metrics need to be developed to assess the performance of IRBs for quality assurance and outcome evaluations. The boards should not be content to run the business of human subjects' research without determining how well that function is actually being carried out. It is important that federally regulated IRBs provide excellence in human research and promote those values most important to the public at large.

Relevance:

30.00%

Publisher:

Abstract:

Background. Retail clinics, also called convenience care clinics, have become a rapidly growing trend since their initial development in 2000. These clinics are coupled within a larger retail operation and are generally located in "big-box" discount stores such as Wal-Mart or Target, grocery stores such as Publix or H-E-B, or in retail pharmacies such as CVS or Walgreen's (Deloitte Center for Health Solutions, 2008). Care is typically provided by nurse practitioners. Research indicates that this new health care delivery system reduces cost, raises quality, and provides a means of access to the uninsured population (e.g., Deloitte Center for Health Solutions, 2008; Convenient Care Association, 2008a, 2008b, 2008c; Hansen-Turton, Miller, Nash, Ryan, & Counts, 2007; Salinsky, 2009; Scott, 2006; Ahmed & Fincham, 2010). Some healthcare analysts even suggest that retail clinics offer a feasible solution to the shortage of primary care physicians facing the nation (AHRQ Health Care Innovations Exchange, 2010). The development and performance of retail clinics is heavily dependent upon individual state policies regulating NPs. Texas currently has one of the most highly regulated practice environments for NPs (Stout & Elton, 2007; Hammonds, 2008). In September 2009, Texas passed Senate Bill 532 addressing the scope of practice of nurse practitioners in the convenience care model. In comparison to other states, this law still heavily regulates nurse practitioners. However, little research has been conducted to evaluate the impact of state laws regulating nurse practitioners on the development and performance of retail clinics. Objectives. (1). To describe the potential impact that SB 532 has on retail clinic performance. (2). To discuss the effectiveness, efficiency, and equity of the convenience care model. (3). To describe possible alternatives to Texas' nurse practitioner scope of practice guidelines as delineated in Texas Senate Bill 532. (4). 
To describe the type of nurse practitioner state regulation (i.e., independent, light, moderate, or heavy) that best promotes the convenience care model. Methods. State regulations governing nurse practitioners can be characterized as independent, light, moderate, or heavy. Four state NP regulatory types and retail clinic performance were compared and contrasted with those of Texas regulations, using Dunn and Aday's theoretical models for conducting policy analysis and evaluating healthcare systems. Criteria for measurement included effectiveness, efficiency, and equity. Comparison states were Arizona (independent), Minnesota (light), Massachusetts (moderate), and Florida (heavy). Results. A comparative states analysis of Texas SB 532 and alternative NP scope of practice guidelines among the four comparison states (Arizona, Florida, Massachusetts, and Minnesota) indicated that SB 532 has minimal potential to affect the shortage of primary care providers in the state. Although SB 532 may increase the number of NPs a physician may supervise, NPs are still heavily restricted in their scope of practice and limited in their ability to act as primary care providers. Arizona's example of independent NP practice provided the best alternative for addressing the shortage of PCPs in Texas, as evidenced by a lower uninsured rate and fewer ED visits per 1,000 population. A survey of the comparison states suggests that retail clinics thrive in states that more heavily restrict NP scope of practice as opposed to those that are more permissive, with the exception of Arizona. An analysis of the effectiveness, efficiency, and equity of the convenience care model indicates that retail clinics perform well in the areas of effectiveness and efficiency but fall short in the area of equity. Conclusion. Texas Senate Bill 532 represents an incremental step towards addressing the shortage of PCPs in the state. 
A comparative policy analysis of the other four states, with their varying degrees of NP scope of practice, indicates that a more aggressive policy allowing independent NP practice will be needed to achieve positive changes in health outcomes. Retail clinics offer a temporary solution to the shortage of PCPs and will need to expand their locations into poorer regions and incorporate some chronic care to achieve measurable health outcomes.

Relevance:

30.00%

Publisher:

Abstract:

Dust can affect the radiative balance of the atmosphere by absorbing or reflecting incoming solar radiation, and it can be a source of micronutrients, such as iron, to the ocean. It has been suggested that the production, transport, and deposition of dust are influenced by climatic changes on glacial-interglacial timescales. Here we present a high-resolution aeolian dust record from the EPICA Dome C ice core in East Antarctica, which provides an undisturbed climate sequence over the last eight climatic cycles. We find a significant correlation between the dust flux and temperature records during glacial periods that is absent during interglacial periods. Our data suggest that dust flux is increasingly correlated with Antarctic temperature as climate becomes colder. We interpret this as a progressive coupling of the climates of Antarctica and the lower latitudes. Limited changes in glacial-interglacial atmospheric transport times (Mahowald et al., 1999, doi:10.1029/1999JD900084; Jouzel et al., 2007, doi:10.1126/science.1141038; Werner et al., 2002, doi:10.1029/2002JD002365) suggest that the sources and lifetime of dust are the major factors controlling the high glacial dust input. We propose that the observed ~25-fold increase in glacial dust flux over all eight glacial periods can be attributed to a strengthening of South American dust sources, together with a longer atmospheric particle lifetime in the upper troposphere resulting from a reduced hydrological cycle during the ice ages.
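The kind of test behind the stated glacial-versus-interglacial result can be sketched as a Pearson correlation between dust flux and temperature computed separately over the two sets of samples. The series below are synthetic, built so that dust tracks temperature only in the "glacial" sample, purely to illustrate the statistic:

```python
import numpy as np

rng = np.random.default_rng(0)
temp = rng.normal(-6.0, 2.0, 500)                     # glacial T anomalies
dust_glacial = -0.8 * temp + rng.normal(0, 1.0, 500)  # coupled to T
dust_interglacial = rng.normal(2.0, 1.0, 500)         # decoupled from T

r_glacial = np.corrcoef(temp, dust_glacial)[0, 1]
r_interglacial = np.corrcoef(temp, dust_interglacial)[0, 1]
print(f"r (glacial) = {r_glacial:.2f}, r (interglacial) = {r_interglacial:.2f}")
```

On the real record, the samples would be classified as glacial or interglacial from the temperature proxy itself, and significance assessed against the serial correlation of the series.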

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a new digital elevation model (DEM) is derived for the ice sheet in western Dronning Maud Land, Antarctica. It is based on differential interferometric synthetic aperture radar (SAR) data from the European Remote Sensing 1/2 (ERS-1/2) satellites, in combination with ICESat's Geoscience Laser Altimeter System (GLAS). A DEM mosaic is compiled from 116 scenes from the ERS-1 ice phase in 1994 and the ERS-1/2 tandem mission between 1996 and 1997, with GLAS data acquired in 2003 serving as ground control. Using three different SAR processors, uncertainties in phase stability and in the baseline model, resulting in height errors of up to 20 m, are exemplified. Atmospheric influences of the same order of magnitude are demonstrated, and the corresponding scenes are excluded. For validation of the DEM mosaic, covering an area of about 130,000 km² on a 50-m grid, independent ICESat heights (2004-2007), ground-based kinematic GPS (2005), and airborne laser scanner data (ALS, 2007) are used. Excluding small areas with low phase coherence, the DEM differs in mean and standard deviation by 0.5 +/- 10.1, 1.1 +/- 6.4, and 3.1 +/- 4.0 m from ICESat, GPS, and ALS, respectively. The excluded data points may deviate by more than 50 m. In order to suppress the spatially variable noise below a 5-m threshold, 18% of the DEM area is selectively averaged into a final product at varying horizontal resolution. Apart from mountainous areas, the new DEM outperforms other currently available DEMs and may serve as a benchmark for future elevation models, such as those from the TanDEM-X mission, to spatially monitor ice sheet elevation.
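The validation statistic quoted above (mean +/- standard deviation of DEM-minus-reference height differences, after masking low-coherence points) is simple to reproduce in outline. All numbers below are synthetic; the coherence threshold `coh_min` is an illustrative assumption, not the paper's criterion:

```python
import numpy as np

def dem_validation(dem_h, ref_h, coherence, coh_min=0.3):
    """Mean and std of height differences where coherence >= coh_min."""
    diff = dem_h - ref_h
    ok = coherence >= coh_min
    return diff[ok].mean(), diff[ok].std()

rng = np.random.default_rng(1)
ref = rng.uniform(0, 2000, 10_000)              # reference heights (m)
dem = ref + rng.normal(0.5, 10.0, ref.size)     # DEM = reference + bias + noise
coh = rng.uniform(0, 1, ref.size)               # interferometric coherence
mean, std = dem_validation(dem, ref, coh)
print(f"DEM - reference: {mean:.1f} +/- {std:.1f} m")
```

Reporting the masked and unmasked statistics separately, as the abstract does, makes clear how much of the error budget sits in the low-coherence areas.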

Relevance:

30.00%

Publisher:

Abstract:

To explore cause and consequences of past climate change, very accurate age models such as those provided by the astronomical timescale (ATS) are needed. Beyond 40 million years the accuracy of the ATS critically depends on the correctness of orbital models and radioisotopic dating techniques. Discrepancies in the age dating of sedimentary successions and the lack of suitable records spanning the middle Eocene have prevented development of a continuous astronomically calibrated geological timescale for the entire Cenozoic Era. We now solve this problem by constructing an independent astrochronological stratigraphy based on Earth's stable 405 kyr eccentricity cycle between 41 and 48 million years ago (Ma) with new data from deep-sea sedimentary sequences in the South Atlantic Ocean. This new link completes the Paleogene astronomical timescale and confirms the intercalibration of radioisotopic and astronomical dating methods back through the Paleocene-Eocene Thermal Maximum (PETM, 55.930 Ma) and the Cretaceous-Paleogene boundary (66.022 Ma). Coupling of the Paleogene 405 kyr cyclostratigraphic frameworks across the middle Eocene further paves the way for extending the ATS into the Mesozoic.
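The tuning target described above, isolating the stable 405 kyr eccentricity band in a proxy series, can be sketched with a simple FFT band-pass. The series here is synthetic (a 405 kyr cosine plus shorter-period "orbital" components); real astrochronology uses carefully designed filters and orbital solutions rather than this rectangular band:

```python
import numpy as np

dt = 1.0                                   # sample spacing, kyr
t = np.arange(0, 8000, dt)                 # 8 Myr synthetic record
signal = np.cos(2 * np.pi * t / 405.0)     # 405 kyr eccentricity cycle
noise = np.cos(2 * np.pi * t / 100.0) + np.cos(2 * np.pi * t / 23.0)
series = signal + noise

freqs = np.fft.rfftfreq(t.size, dt)        # cycles per kyr
spec = np.fft.rfft(series)
band = (freqs > 1 / 500.0) & (freqs < 1 / 300.0)   # pass ~300-500 kyr
filtered = np.fft.irfft(spec * band, n=t.size)

# The filtered series should track the 405 kyr component closely.
print(round(float(np.corrcoef(filtered, signal)[0, 1]), 3))
```

Counting and phasing the extracted cycles against an orbital solution is what anchors the floating cyclostratigraphy to absolute time.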

Relevance:

30.00%

Publisher:

Abstract:

The Chinese government has committed to reaching its peak carbon emissions before 2030, which requires China to implement new policies. Using a CGE model, this study simulates the functioning of an energy tax and a carbon tax and analyzes their effects on macro-economic indices. The Chinese economy is affected at an acceptable level by the two taxes: GDP loss is below 0.8% under a carbon tax of 100, 50, or 10 RMB/ton CO2, or under an energy tax of 5% of the delivery price, and the loss of real disposable personal income is correspondingly smaller. Compared with implementing a single tax, a combined carbon and energy tax induces larger emission reductions at relatively smaller economic cost. Under these taxes, the domestic competitiveness of energy-intensive industries improves. Additionally, we find that the sooner such taxes are launched, the smaller the economic costs and the more significant the achieved emission reductions.