Abstract:
How are resources split between caring for offspring and self-maintenance? Is the timing of an immune challenge important? In burying beetles, challenging the immune system prior to breeding does not affect the total number and quality of offspring produced during the individual's lifetime. However, the immune system is suppressed during breeding, and if an immune challenge is presented during this time the beetle will upregulate its immune system, but to the detriment of the number of offspring produced during that breeding opportunity.
We know that parental investment and immune investment are costly processes, but it is unclear which trait will be prioritized when both may be required. Here, we address this question using the burying beetle Nicrophorus vespilloides, a carrion breeder that exhibits biparental care of young. Our results show that immunosuppression occurs during the provision of parental care. We measured phenoloxidase (PO) on Days 1-8 of the breeding bout, and the results show a clear decrease in PO from presentation of the breeding resource onward. Having established baseline immune investment during breeding, we then manipulated immune investment at different times by applying a wounding challenge. Beetles were wounded prior to and during the parental care period, and reproductive investment was quantified. Different effects on reproductive output occur depending on the timing of wounding. Challenging the immune system with wounding prior to breeding does not affect reproductive output or subsequent lifetime reproductive success (LRS). LRS is also unaffected by applying an immune elicitor prior to breeding, though different arms of the immune system are up- or downregulated, perhaps indicating a trade-off between cellular and humoral immunity. In contrast, wounding during breeding reduces reproductive output, and to the greatest extent if the challenge is applied early in the breeding bout.
Despite being immunosuppressed, breeding beetles can still respond to wounding by increasing PO, albeit not to prebreeding levels. This upregulation of PO during breeding may affect parental investment, resulting in a reduction in reproductive output. The potential role of juvenile hormone in controlling this trade-off is discussed.
Abstract:
Loss-of-mains protection is an important component of the protection systems of embedded generation. The role of loss-of-mains is to disconnect the embedded generator from the utility grid in the event that connection to utility dispatched generation is lost. This is necessary for a number of reasons, including the safety of personnel during fault restoration and the protection of plant against out-of-synchronism reclosure to the mains supply. The incumbent methods of loss-of-mains protection were designed when the installed capacity of embedded generation was low, and known problems with nuisance tripping of the devices were considered acceptable because of the insignificant consequence to system operation. With the dramatic increase in the installed capacity of embedded generation over the last decade, the limitations of current islanding detection methods are no longer acceptable. This study describes a new method of loss-of-mains protection based on phasor measurement unit (PMU) technology, specifically using a low-cost PMU device of the authors' design which has been developed for distribution network applications. The proposed method addresses the limitations of the incumbent methods, providing a solution that is free of nuisance tripping and has a zero non-detection zone. This system has been tested experimentally and is shown to be practical, feasible and effective. Threshold settings for the new method are recommended based on data acquired from both the Great Britain and Ireland power systems.
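The abstract does not state the detection algorithm itself. Purely as an illustrative sketch of how a PMU-based scheme might declare islanding from drift in the local-versus-reference phase-angle difference (the function name, rate limit and persistence count below are assumptions for illustration, not the authors' recommended GB/Ireland settings):

```python
def detect_loss_of_mains(local_deg, ref_deg, rate_limit=0.2, persistence=5):
    """Trip when the local-vs-reference phase-angle difference drifts faster
    than rate_limit (degrees per sample) for `persistence` consecutive
    samples.  Returns the trip index, or None if no islanding is declared.

    local_deg / ref_deg: time-aligned phase-angle samples in degrees from
    the embedded generator's PMU and a grid reference PMU.
    """
    # wrap each angle difference into [-180, 180)
    diff = [((l - r + 180.0) % 360.0) - 180.0
            for l, r in zip(local_deg, ref_deg)]
    count = 0
    for i in range(1, len(diff)):
        step = diff[i] - diff[i - 1]
        # undo a +/-360 degree wrap between consecutive samples
        if step > 180.0:
            step -= 360.0
        elif step < -180.0:
            step += 360.0
        if abs(step) > rate_limit:
            count += 1
            if count >= persistence:
                return i
        else:
            count = 0
    return None
```

While the mains connection holds, the local angle tracks the reference and the difference stays flat; after islanding, the embedded generator's frequency deviates, so the difference ramps and the relay trips once the drift persists. The persistence count is one plausible way to suppress nuisance trips on transient disturbances.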
Abstract:
Despite previous attempts at codification of international law regarding international responses to natural and human-made disasters, there is currently no binding international legal framework to regulate the provision of humanitarian assistance outside armed conflicts. Nevertheless, since the International Law Commission (ILC) included the protection of persons in the event of disasters on its programme of work in 2006, it has provisionally adopted eleven draft articles that have the potential to create binding obligations on states and humanitarian actors in disaster settings. Draft articles adopted include the definition of ‘a disaster’, the relationship of the draft articles to the international humanitarian law of armed conflict, recognition of the inherent dignity of the human person, and the duty of international cooperation. However, the final form of the draft articles has not been agreed. The Codification Division of the UN Office of Legal Affairs has proposed a framework convention format, which has seen support in the ILC and the UN General Assembly Sixth Committee. The overall aim of this article is to provide an analysis of the potential forms of international regulation open to the ILC and states in the context of humanitarian responses to disasters. However, to avoid endowing the ILC draft articles with unwarranted power, any examination of form requires an understanding of the substantive subject matter of the planned international regulation. The article therefore provides an overview of the international legal regulation of humanitarian assistance following natural and human-made disasters, and the ILC’s work to date on the topic. It then examines two key issues that remain to be addressed by the ILC and representatives of states in the UN General Assembly Sixth Committee.
Drawing on the UN Guiding Principles on Internal Displacement, the development and implications of binding and non-binding international texts are examined, followed by an analysis of the suggested framework convention approach identified by the Special Rapporteur as a potential outcome of the ILC work.
Abstract:
In order to gain access to the EU, nations must be seen to implement formal instruments that protect the rights of minorities. This book examines the ways in which these tools have worked in a number of post-communist states, and explores the interaction of domestic and international structures that determine the application of these policies.
Abstract:
This publication traces how asylum seekers are repositioned in existing European asylum legislation, from victims in need of protection to criminals. It is argued that this is due to the European legislation concerning the area of freedom, security and justice. The latest asylum legislation seems to undermine refugee status, which, as is widely known, is safeguarded by the 1951 Geneva Convention relating to the Status of Refugees and its 1967 Protocol. Additionally, this paper presents the role of social workers and other social scientists in protecting the rights of asylum seekers and questioning the existing legislation.
Abstract:
In this paper, we propose a system level design approach considering voltage over-scaling (VOS) that achieves error resiliency using unequal error protection of different computation elements, while incurring minor quality degradation. Depending on user specifications and severity of process variations/channel noise, the degree of VOS in each block of the system is adaptively tuned to ensure minimum system power while providing "just-the-right" amount of quality and robustness. This is achieved by taking into consideration system level interactions and ensuring that under any change of operating conditions only the "less-crucial" computations, that contribute less to block/system output quality, are affected. The design methodology applied to a DCT/IDCT system shows large power benefits (up to 69%) at reasonable image quality while tolerating errors induced by varying operating conditions (VOS, process variations, channel noise). Interestingly, the proposed IDCT scheme conceals channel noise at scaled voltages. ©2009 IEEE.
Abstract:
In this paper, we present a unique cross-layer design framework that allows systematic exploration of the energy-delay-quality trade-offs at the algorithm, architecture and circuit level of design abstraction for each block of a system. In addition, taking into consideration the interactions between different sub-blocks of a system, it identifies the design solutions that can ensure the least energy at the "right amount of quality" for each sub-block/system under user quality/delay constraints. This is achieved by deriving sensitivity based design criteria, the balancing of which form the quantitative relations that can be used early in the system design process to evaluate the energy efficiency of various design options. The proposed framework when applied to the exploration of energy-quality design space of the main blocks of a digital camera and a wireless receiver, achieves 58% and 33% energy savings under 41% and 20% error increase, respectively. © 2010 ACM.
Abstract:
Polar codes are one of the most recent advancements in coding theory and they have attracted significant interest. While they are provably capacity achieving over various channels, they have seen limited practical applications. Unfortunately, the successive nature of successive cancellation based decoders hinders fine-grained adaptation of the decoding complexity to design constraints and operating conditions. In this paper, we propose a systematic method for enabling complexity-performance trade-offs by constructing polar codes based on an optimization problem which minimizes the complexity under a suitably defined mutual information based performance constraint. Moreover, a low-complexity greedy algorithm is proposed in order to solve the optimization problem efficiently for very large code lengths.
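As a toy illustration only (the paper's optimization and its mutual-information constraint are more elaborate): for a binary erasure channel, the reliabilities of the polarized bit-channels follow a closed-form recursion, and a greedy pass can pick the fewest, most reliable channels whose summed mutual information meets a target. The function names and the selection rule below are assumptions for illustration.

```python
def bec_bhattacharyya(eps, n_levels):
    """Bhattacharyya parameters of the 2**n_levels polarized bit-channels
    of a binary erasure channel with erasure probability eps."""
    z = [eps]
    for _ in range(n_levels):
        # each channel splits into a worse (W-) and a better (W+) channel
        z = [v for p in z for v in (2.0 * p - p * p, p * p)]
    return z

def greedy_select(z, mi_target):
    """Greedily take the most reliable bit-channels until their summed
    mutual information (1 - Z for the BEC) reaches mi_target; the number
    of channels used serves as a crude complexity proxy."""
    order = sorted(range(len(z)), key=lambda i: z[i])  # most reliable first
    chosen, mi = [], 0.0
    for i in order:
        if mi >= mi_target:
            break
        chosen.append(i)
        mi += 1.0 - z[i]
    return sorted(chosen)
```

Polarization conserves total mutual information, so for eps = 0.5 and eight bit-channels the capacities 1 - Z still sum to 4; the greedy pass simply concentrates the budget on the channels that polarized toward being noiseless.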
Abstract:
Power dissipation and tolerance to process variations pose conflicting design requirements. Scaling of voltage is associated with larger variations, while Vdd upscaling or transistor up-sizing for process tolerance can be detrimental for power dissipation. However, for certain signal processing systems such as those used in color image processing, we noted that effective trade-offs can be achieved between Vdd scaling, process tolerance and "output quality". In this paper we demonstrate how these trade-offs can be effectively utilized in the development of novel low-power variation tolerant architectures for color interpolation. The proposed architecture supports a graceful degradation in the PSNR (Peak Signal to Noise Ratio) under aggressive voltage scaling as well as extreme process variations in sub-70nm technologies. This is achieved by exploiting the fact that some computations are more important and contribute more to the PSNR improvement than others. The computations are mapped to the hardware in such a way that only the less important computations are affected by Vdd-scaling and process variations. Simulation results show that even at a scaled voltage of 60% of nominal Vdd value, our design provides reasonable image PSNR with 69% power savings.
Abstract:
In this paper, we propose a system level design approach considering voltage over-scaling (VOS) that achieves error resiliency using unequal error protection of different computation elements, while incurring minor quality degradation. Depending on user specifications and severity of process variations/channel noise, the degree of VOS in each block of the system is adaptively tuned to ensure minimum system power while providing "just-the-right" amount of quality and robustness. This is achieved by taking into consideration block level interactions and ensuring that under any change of operating conditions, only the "less-crucial" computations, that contribute less to block/system output quality, are affected. The proposed approach applies unequal error protection to various blocks of a system (logic and memory) and spans multiple layers of the design hierarchy (algorithm, architecture and circuit). The design methodology, when applied to a multimedia subsystem, shows large power benefits (up to 69% improvement in power consumption) at reasonable image quality while tolerating errors introduced due to VOS, process variations, and channel noise.
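A toy numerical illustration of the "less-crucial computations" idea (not the authors' architecture; the 8-point transform, the failure model of zeroed coefficients, and all names are assumptions): low-order DCT coefficients carry most of a smooth signal's energy, so letting VOS-induced failures fall only on the high-order coefficients degrades quality far less than the reverse.

```python
import math

def dct(x):
    """Unnormalized N-point DCT-II."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k) for n in range(N))
            for k in range(N)]

def idct(X):
    """Inverse of dct() above (DCT-III with the matching scaling)."""
    N = len(X)
    return [X[0] / N + (2.0 / N) * sum(X[k] * math.cos(math.pi / N * (n + 0.5) * k)
                                       for k in range(1, N))
            for n in range(N)]

def vos_failures(X, protected):
    """Model a VOS-induced timing failure as a zeroed coefficient;
    only the unprotected ("less crucial") coefficients can fail."""
    return [c if k in protected else 0.0 for k, c in enumerate(X)]

def mse(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)

signal = [float(n) for n in range(8)]            # a smooth ramp
X = dct(signal)
protect_low = mse(signal, idct(vos_failures(X, {0, 1, 2, 3})))
protect_high = mse(signal, idct(vos_failures(X, {4, 5, 6, 7})))
# Protecting the energy-carrying low-order coefficients hurts far less.
assert protect_low < protect_high
```

The asymmetry is what unequal error protection exploits: the same number of failing computations costs far less quality when steered onto the low-energy terms.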
Abstract:
Developed countries, led by the EU and the US, have consistently called for ‘deeper integration’ over the course of the past three decades, i.e. the convergence of ‘behind-the-border’ or domestic policies and rules in areas such as services, competition, public procurement, intellectual property (“IP”) and so forth. Following the collapse of the Doha Development Round, the EU and the US have pursued this push for deeper integration by entering into deep and comprehensive free trade agreements (“DCFTAs”) that are comprehensive insofar as they are not limited to tariffs but extend to regulatory trade barriers. More recently, the EU and the US launched negotiations on a Transatlantic Trade and Investment Partnership (“TTIP”) and a Trade in Services Agreement (“TISA”), which put tackling barriers resulting from divergences in domestic regulation in the area of services at the very top of the agenda. Should these agreements come to pass, they may well set the template for the rules of international trade and define the core features of domestic services market regulation. This article examines the regulatory disciplines in the area of services included in existing EU and US DCFTAs from a comparative perspective in order to delineate possible similarities and divergences and assess the extent to which these DCFTAs can shed some light on the possible outcome and limitations of future trade negotiations in services. It also discusses the potential impact of such negotiations on developing countries and, more generally, on the multilateral process.
Abstract:
Using first-principles molecular dynamics simulations, we have investigated the notion that amino acids can play a protective role when DNA is exposed to excess electrons produced by ionizing radiation. In this study we focus on the interaction of glycine with the DNA nucleobase thymine. We studied thymine-glycine dimers and a condensed phase model consisting of one thymine molecule solvated in amorphous glycine. Our results show that the amino acid acts as a protective agent for the nucleobase in two ways. If the excess electron is initially captured by the thymine, then a proton is transferred in a barrier-less way from a neighboring hydrogen-bonded glycine. This stabilizes the excess electron by reducing the net partial charge on the thymine. In the second mechanism the excess electron is captured by a glycine, which acts as an electron scavenger that prevents electron localization in DNA. Both these mechanisms introduce obstacles to further reactions of the excess electron within a DNA strand, e.g. by raising the free energy barrier associated with strand breaks.