918 results for rhetoric principle of imitatio
Abstract:
Preferentialism and multilateralism are not two independent and distinct avenues in the pursuit of market access and regulatory policies. They historically build upon each other in a dialectical process, closely related and linked through regulatory bridges and references, and they influence and direct each other in various ways. The paper focuses mainly on the evolution of the international protection of intellectual property rights and of services. The multilateral regulation of TRIPS and other instruments derives from years of regulatory experience and a large number of preferential agreements across the globe. The GATS and other instruments, on the other hand, entered the pluri- or multilateral stage early. Once regulation has reached the multilateral stage, preferentialism focuses on WTO-plus and WTO-extra commitments. Both areas, however, show close interaction. The principle of MFN ensures that multilateralism and preferentialism do not evolve independently from each other; it produces significant spill-over effects of preferential agreements. Such effects and the need to develop uniform and coherent regulatory standards have led in parallel to a number of preferential, plurilateral and multilateral regulatory initiatives. We submit that this process will eventually encourage a return to multilateralism and to negotiations in international fora, in particular the WTO, while traditional market access may remain with preferential relations among nations. Such burden-sharing between different regulatory fora should be reflected in future WTO rules providing the overall backbone of the system.
Abstract:
Sino-African trade saw a fifty-fold increase between 1999 and 2008. In some African regions, particularly in sub-Saharan Africa, China has even replaced the US as the most important trading partner today. Yet China does not hold a single FTA on the African continent, while other major trading partners of African economies rely on an extensive framework of different trade agreements. What, then, is the legal basis of the recent increase in Sino-African trade? Interestingly, Sino-African trade has increased particularly strongly in countries that have entered into tied aid agreements with China. These agreements are commonly known under the term ‘Angola-Model’ and consist of a multifaceted network of barter-trading systems, elements of tied aid, and concessions for oil and other commodities linked with a state loan. It is likely that these agreements have an impact on the trade flows between African countries and China. This paper discusses the legal character of this new form of economic cooperation, or modern version of tied aid. Critical legal aspects of this form of tied aid concern violation of the most-favoured-nation (MFN) principle, illegitimate export subsidies, market access, public procurement and transparency in the international trading system. However, despite the recent outcry from the Western community against the strategy of the Chinese government on the African continent, the practice of Angola-Model tied aid is not entirely new, nor is it against the law. Tied aid sits in a legal grey area that should be examined thoroughly in order to strengthen the international trading system and to support developing countries in their attempt to gain from tied aid arrangements.
Abstract:
We agree with the authors' attitude toward fostering the principle of parsimony (also known as Ockham's razor), whereby no unnecessary entities or labels should be posited whenever a phenomenon can be reduced to a set of less complex constituents. Nevertheless, we take issue with some of the shortcuts that we feel they engaged in along their line of reasoning.
Abstract:
This article discusses the tensions between the principle of state sovereignty and the idea of a "humanitarian intervention" (or an intervention on humanitarian grounds) as they emerged from the debate among leading legal scholars in the 19th and early 20th centuries. While prominent scholars such as Johann Caspar Bluntschli, Gustave Rolin Jaequemyns or Aegidius Arntz spoke out in favour of a form of "humanitarian intervention", others such as August Wilhelm Heffter or Pasquale Fiore were much more critical and in many cases argued for absolute state sovereignty.
Abstract:
Because natural selection is likely to act on multiple genes underlying a given phenotypic trait, we study here the potential effect of ongoing and past selection on the genetic diversity of human biological pathways. We first show that genes included in gene sets are generally under stronger selective constraints than other genes and that their evolutionary response is correlated. We then introduce a new procedure to detect selection at the pathway level based on a decomposition of the classical McDonald–Kreitman test extended to multiple genes. This new test, called 2DNS, detects outlier gene sets and takes into account past demographic effects and evolutionary constraints specific to gene sets. Selective forces acting on gene sets can be easily identified by a mere visual inspection of the position of the gene sets relative to their two-dimensional null distribution. We thus find several outlier gene sets that show signals of positive, balancing, or purifying selection but also others showing an ancient relaxation of selective constraints. The principle of the 2DNS test can also be applied to other genomic contrasts. For instance, the comparison of patterns of polymorphisms private to African and non-African populations reveals that most pathways show a higher proportion of nonsynonymous mutations in non-Africans than in Africans, potentially due to different demographic histories and selective pressures.
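As a rough illustration of how a pathway-level McDonald–Kreitman-style contrast can be computed, the sketch below pools per-gene counts of nonsynonymous and synonymous polymorphisms and divergences over a gene set and compares the two resulting summary proportions with a resampling null. The choice of the two axes, the resampling scheme and all counts are illustrative assumptions, not the exact 2DNS statistic defined in the paper.

```python
# A minimal sketch, assuming per-gene counts Pn/Ps (nonsynonymous/synonymous
# polymorphisms) and Dn/Ds (nonsynonymous/synonymous divergences) are available.
# The two summary axes and the random-resampling null are illustrative choices,
# not the 2DNS test itself.
import random

def pooled_axes(genes):
    """Pool counts over a gene set and return the two summary proportions."""
    Pn = sum(g["Pn"] for g in genes)
    Ps = sum(g["Ps"] for g in genes)
    Dn = sum(g["Dn"] for g in genes)
    Ds = sum(g["Ds"] for g in genes)
    poly_axis = Pn / (Pn + Ps) if (Pn + Ps) else float("nan")  # polymorphism axis
    div_axis = Dn / (Dn + Ds) if (Dn + Ds) else float("nan")   # divergence axis
    return poly_axis, div_axis

def null_distribution(all_genes, set_size, n_resamples=1000, seed=0):
    """Two-dimensional null: random gene sets of the same size drawn from the genome."""
    rng = random.Random(seed)
    return [pooled_axes(rng.sample(all_genes, set_size)) for _ in range(n_resamples)]

# Toy usage with hypothetical counts:
genome = [{"Pn": k % 7, "Ps": 5, "Dn": 3, "Ds": 7} for k in range(1, 200)]
pathway = genome[:15]
print("observed:", pooled_axes(pathway))
null = null_distribution(genome, len(pathway))
```

A gene set falling outside the cloud of resampled points along either axis would then be flagged as an outlier, in the spirit of the visual inspection described in the abstract.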
Abstract:
The concentrations of the long-lived nuclear reaction products 129I and 36Cl have been measured in samples from the MEGAPIE liquid metal spallation target. Samples from the bulk target material (lead-bismuth eutectic, LBE), from the interface of the free metal surface with the cover gas, from LBE/steel interfaces and from noble metal absorber foils installed in the cover gas system were analysed using Accelerator Mass Spectrometry at the Laboratory of Ion Beam Physics at ETH Zürich. The major part of the 129I and 36Cl was found accumulated at the interfaces, particularly at the interface between the LBE and the steel walls of the target container, while bulk LBE samples contain only a minor fraction of these nuclides. Both nuclides were also detected on the absorber foils to a certain extent (≪ 1% of the total amount). The latter number is negligible with respect to the radiological hazard of the irradiated target material; however, it indicates a certain affinity of the absorber foils for halogens, thus proving the principle of using noble metal foils for catching these volatile radionuclides. The total amounts of 129I and 36Cl in the target were estimated from the analytical data by averaging within the different groups of samples and summing these averages over the total target. This estimate accounts for about half of the amount of 129I and 36Cl predicted to be produced using nuclear physics modelling codes for both nuclides. The significance of the results and the associated uncertainties are discussed.
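A minimal sketch of the group-wise inventory estimate described in this abstract: average the measured concentration within each sample group, scale each average by that group's share of the target, and sum over the groups. The group names, concentrations and masses below are hypothetical placeholders, not values from the study.

```python
# Hypothetical illustration of "averaging within groups and summing over the target".

def estimate_total(samples, group_scale):
    """samples: {group: [measured concentration per kg]};
    group_scale: {group: total mass of that group in kg}."""
    total = 0.0
    for group, values in samples.items():
        mean_conc = sum(values) / len(values)      # average within the group
        total += mean_conc * group_scale[group]    # scale to the whole target
    return total

# Hypothetical 129I concentrations (atoms per kg) and group masses (kg):
samples_129I = {
    "bulk_LBE": [1.2e12, 0.9e12, 1.1e12],
    "LBE_steel_interface": [8.5e13, 9.1e13],
    "free_surface": [4.0e13],
}
group_mass_kg = {"bulk_LBE": 900.0, "LBE_steel_interface": 20.0, "free_surface": 5.0}
print(f"Estimated total 129I: {estimate_total(samples_129I, group_mass_kg):.3e} atoms")
```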
Abstract:
A total of 23 pollen diagrams [stored in the Alpine Palynological Data-Base (ALPADABA), Geobotanical Institute, Bern] cover the last 100 to over 1000 years. The sites include 15 lakes, seven mires, and one soil profile distributed in the Jura Mts (three sites), Swiss Plateau (two sites), northern Pre-Alps and Alps (six sites), central Alps (five sites), southern Alps (three sites), and southern Pre-Alps (four sites) in the western and southern part of Switzerland or just outside the national borders. The pollen diagrams have both a high taxonomic resolution and a high temporal resolution, with sampling distances of 0.5–3 cm, equivalent to 1 to 11 years for the last 100 years and 8 to 130 years for earlier periods. The chronology is based on absolute dating (14 sites: 210Pb 11 sites; 14C six sites; varve counting two sites) or on biostratigraphic correlation among pollen diagrams. The latter relies mainly on trends in Cannabis sativa, Ambrosia, Mercurialis annua, and Ostrya-type pollen. Individual pollen stratigraphies are discussed and sites are compared within each region. The principle of designating local, extra-local, and regional pollen signals and vegetation is exemplified by two pairs of sites lying close together. Trends in biostratigraphies shared by a major part of the pollen diagrams allow the following generalisations. Forest declined in phases since medieval times up to the late 19th century. Abies and Fagus declined consistently, whereas the behaviour of short-lived trees and trees of moist habitats differed among sites (Alnus glutinosa-type, Alnus viridis, Betula, Corylus avellana). In the present century, however, Picea and Pinus increased, followed by Fraxinus excelsior in the second half of this century. Grassland (traced by Gramineae and Plantago lanceolata-type pollen) increased, replacing much of the forest, and declined again in the second half of this century. Nitrate enrichment of the vegetation (traced by Urtica) took place in the first half of this century. These trends reflect the intensification of forest use and the expansion of grassland from medieval times up to the end of the last century, whereas subsequently parts of the grassland became used more intensively and the marginal parts were abandoned for forest regrowth. In most pollen diagrams human impact is the dominant factor in explaining inferred changes in vegetation, but climatic change plays a role at three sites.
Abstract:
In a marvelous but somewhat neglected paper, 'The Corporation: Will It Be Managed by Machines?', Herbert Simon articulated from the perspective of 1960 his vision of what we now call the New Economy: the machine-aided system of production and management of the late twentieth century. Simon's analysis sprang from what I term the principle of cognitive comparative advantage: one has to understand the quite different cognitive structures of humans and machines (including computers) in order to explain and predict the tasks to which each will be most suited. Perhaps unlike Simon's better-known predictions about progress in artificial intelligence research, the predictions of this 1960 article hold up remarkably well and continue to offer important insights. In what follows I attempt to tell a coherent story about the evolution of machines and the division of labor between humans and machines. Although inspired by Simon's 1960 paper, I weave many other strands into the tapestry, from classical discussions of the division of labor to present-day evolutionary psychology. The basic conclusion is that, with growth in the extent of the market, we should see humans 'crowded into' tasks that call for the kinds of cognition for which humans have been equipped by biological evolution. These human cognitive abilities range from the exercise of judgment in situations of ambiguity and surprise to more mundane abilities in spatio-temporal perception and locomotion. Conversely, we should see machines 'crowded into' tasks with a well-defined structure. This conclusion is not based (merely) on a claim that machines, including computers, are specialized idiots-savants today because of the limits (whether temporary or permanent) of artificial intelligence; rather, it rests on a claim that, for what are broadly 'economic' reasons, it will continue to make economic sense to create machines that are idiots-savants.
Abstract:
In Thailand, communitarian ideas have been widely accepted and even institutionalized as a principle of national development plans and the Constitution of Thailand. This paper examines how and why the communitarian body of thought described as "community culture thought", originally created and shared within a small circle of social activists and academics in the early 1980s, came to be disseminated and authorized in Thai society. Contributors and participants, modes of expression, and avenues for disseminating this paradigm are the main topics of this paper. The paper reveals that these thoughts and concepts have been diversified and used as guiding principles by state elites, anti-state activists, and social reformists since the late 1980s. People with such different political ideologies were connected through a few key individuals, and these critical connections brought them onto the same side in promoting communitarian thought in Thailand. When such leading advocates assumed key political positions, it was easy for them to push communitarian ideas into the guidelines and principles of state administration.
Abstract:
Many specialists in international trade have started saying that the era of the mega FTA is approaching. If the three poles of the global economy, namely East Asia, the EU and the United States, form mega FTAs, most of the volume of global trade will be covered. That may be fine, but many countries will be left out of the mega FTAs, most of them least developed countries (LDCs). Since the inception of the Doha Development Agenda (DDA) negotiations in 2001, the WTO and its member countries have tried to include LDCs in the world trading system through various means, including DFQF and AfT. Although these means have had some positive impact on the economic development of LDCs, most LDCs will never feel comfortable with the current world trading system. To overcome the stalemate in the DDA and to create an inclusive world trading system, we need more commitment from both LDCs and non-LDCs. To surmount the prolonged stalemate in the DDA, we should understand how ordinary people in LDCs feel and think about the current world trading system. Those voices have seldom been listened to, even by the decision makers of their own countries. To understand the situation of the people in LDCs, IDE-JETRO carried out several research projects using macro, meso and micro approaches. At the micro level, we collected and analyzed statements from ordinary people concerning their opinions about the world trading system. The interviewees were ordinary people such as street vendors, farmers and factory workers. We asked where they buy and sell daily necessities, and about their perception of imported goods, export promotion and free trade at large. These ‘voices of the people’ surveys were conducted in Madagascar and Cambodia during 2013. Based on this research, and especially the findings from the ‘voices of the people’ surveys, we propose a ‘DDA-MDGs hybrid’ strategy to conclude the DDA negotiations and develop a more inclusive and somewhat more ethical world trading system. Our proposal may be summarized in the following three points.
(1) Aid for Trade (AfT) ver. 2. Currently AfT is mainly focused on coordinating several aid projects related to LDCs’ capacity building. However, this is inadequate; the proposed ‘DDA-MDGs hybrid’ requires a super AfT. The WTO, other development agencies and LDC governments would not only coordinate but also jointly plan aid projects for trade capacity building. AfT ver. 2 includes infrastructure projects financed by grant aid, ODA loans or private investment. This is in accordance with the post-MDGs argument, which emphasizes the role of the private sector.
(2) Ethical attitude. Reciprocity is a principle of multilateral agreement and has been a core premise since the GATT. However, for designing an inclusive system, special and differential treatment (S&D) is still needed for disadvantaged members. To reconcile full reciprocity with less-than-full reciprocity, an ethical attitude is needed on the part of every member, in which each refrains from insisting on the full rights and demands of its own country. As used here, the term ‘ethical’ implies more consideration for LDCs; it is almost identical to S&D but with a more positive attitude from developed countries (super S&D).
(3) Collect voices of the people. In order to grasp the real situation of the people, the voices of the people on free trade will continue to be collected in other LDCs, and the findings and learnings will be fed back to the WTO negotiation space.
Abstract:
This doctoral thesis, entitled Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths, aims to deepen the knowledge of a particular antenna measurement system: the compact range, operating in the millimeter wavelength frequency bands. The thesis was developed at the Radiation Group (GR), an antenna laboratory which belongs to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and currently runs four facilities with different configurations: a Gregorian compact antenna test range, a spherical near-field range, a planar near-field range and a semi-anechoic arch system. The research work performed for this thesis contributes to the knowledge of the first measurement configuration at higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this goal, a set of scientific tasks was carried out sequentially; they are described succinctly in the following paragraphs. The first step was a review of the state of the art. The study of the scientific literature covered measurement practices in compact antenna test ranges together with the particularities of millimeter wavelength technologies. The joint study of both fields of knowledge converged, where such measurement facilities are concerned, on a series of technological challenges that become serious bottlenecks at different stages: analysis, design and assessment. The second step, after this overview, focused on electromagnetic analysis algorithms. These formulations allow certain electromagnetic features of interest, such as the field distribution phase or the stray signal behaviour of particular structures, to be evaluated when those structures interact with sources of electromagnetic waves. A properly operated CATR facility features wave-collimating optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis tasks introduce a large number of mathematical unknowns that grows with frequency, following different polynomial-order laws depending on the algorithm used. In particular, the optics configuration of interest here is the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility becomes the core of the algorithm's ability to support the subsequent design tasks. The contribution of this thesis to this field of knowledge consisted in reaching a formulation that is powerful both in dealing with various analysis geometries and in computational terms. Two algorithms were developed; while based on the same hybridization principle, they achieve different orders of physical accuracy at different computational cost. Their CATR design capabilities were inter-compared, reaching both qualitative and quantitative conclusions on their scope. The third step shifted the interest from analysis and design tasks towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced in the analysis stage appears again in the in-chamber field probing stage.
The natural decrease of the dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities greatly increase the difficulty of performing assessment processes in CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of the millimeter wavelength range, whereas the value of range assessment, on the contrary, moves towards the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques that achieve substantial data reduction ratios. Collaterally, they increase the robustness of the results to noise, which amounts to a virtual increase of the setup's available dynamic range. The fourth step addressed the environmental sensitivity of millimeter wavelengths. The drift of electromagnetic experiments due to the dependence of the results on the surrounding environment is well known. At millimeter wavelengths, this sensitivity relegates many industrial practices common at microwave frequencies to the experimental stage. In particular, evolution of the atmosphere within acceptable conditioning bounds results in drift phenomena that completely mask the experimental results. The contribution of this thesis in this respect consists in modeling electrically the indoor atmosphere of a CATR as a function of the environmental variables that affect the range's performance. A simple model was developed, able to relate high-level phenomena, such as feed-probe phase drift, to low-level magnitudes that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed and chamber conditioning is automatically extended towards higher frequencies. In summary, the purpose of this thesis is to advance the knowledge of compact antenna test ranges at millimeter wavelengths. This knowledge is presented through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operative facility, stages at each of which bottleneck phenomena currently exist and seriously compromise antenna measurement practices at millimeter wavelengths.
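To make the environmental-compensation idea concrete, the following sketch relates a feed-probe phase drift to chamber temperature and relative humidity through a standard non-dispersive radio-refractivity approximation (Smith-Weintraub type) with a Magnus saturation-pressure formula. The formula choice and all numbers are illustrative assumptions, not the specific model developed in the thesis.

```python
# A minimal sketch: phase drift along a fixed path when indoor temperature and
# relative humidity change, using a standard non-dispersive refractivity formula.
import math

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def refractivity(temp_c, rel_hum_pct, pressure_hpa=1013.25):
    """Radio refractivity N (n = 1 + N*1e-6) from temperature, humidity, pressure."""
    t_k = temp_c + 273.15
    e_sat = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Magnus formula, hPa
    e = rel_hum_pct / 100.0 * e_sat                                # water-vapour pressure
    return 77.6 / t_k * (pressure_hpa + 4810.0 * e / t_k)

def phase_drift_deg(freq_hz, path_m, ref_env, new_env):
    """Feed-probe phase change (degrees) when the chamber environment changes."""
    dn = (refractivity(*new_env) - refractivity(*ref_env)) * 1e-6
    return math.degrees(2.0 * math.pi * freq_hz * path_m * dn / C0)

# Hypothetical example: 10 m path at 100 GHz, chamber drifting from 21 C / 45 % RH
# to 23 C / 55 % RH during a measurement session.
print(f"{phase_drift_deg(100e9, 10.0, (21.0, 45.0), (23.0, 55.0)):.1f} deg")
```

Even a refractive-index change of a few parts per million accumulates to a clearly measurable phase shift over a path of several metres at 100 GHz, which is why sampling humidity and temperature and compensating for them matters at these frequencies.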
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies occurring when considering free-surface inviscid flows. Actually, in SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support are neglected, (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical viscous weakly compressible SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases including a standing wave, with the three viscous term formulations investigated.
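For readers unfamiliar with how a viscous term enters an SPH summation and how the global energy dissipation it induces can be monitored, the sketch below implements one common inter-particle viscosity (a Morris-type formulation) and sums the resulting kinetic-energy decay over all particles. It only illustrates the kind of energy budget discussed in the abstract; it is not necessarily one of the three formulations investigated in the paper, and the particle data are hypothetical.

```python
# A minimal sketch of a Morris-type SPH viscous term and its global dissipation.
import numpy as np

def cubic_spline_grad(rij, h):
    """Gradient (w.r.t. particle i) of the 2-D cubic spline kernel."""
    r = np.linalg.norm(rij)
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    if q < 1.0:
        dw = sigma * (-3.0 * q + 2.25 * q**2) / h
    elif q < 2.0:
        dw = sigma * (-0.75 * (2.0 - q) ** 2) / h
    else:
        return np.zeros(2)
    return dw * rij / max(r, 1e-12)

def viscous_accel(pos, vel, mass, rho, mu, h):
    """Morris-type viscous acceleration for every particle (O(N^2) loop for clarity)."""
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rij = pos[i] - pos[j]
            vij = vel[i] - vel[j]
            grad = cubic_spline_grad(rij, h)
            coef = mass[j] * (mu[i] + mu[j]) / (rho[i] * rho[j])
            acc[i] += coef * np.dot(rij, grad) / (np.dot(rij, rij) + 0.01 * h**2) * vij
    return acc

def dissipation_rate(vel, mass, acc_visc):
    """Global rate of kinetic-energy change due to the viscous term (expected <= 0)."""
    return float(np.sum(mass[:, None] * vel * acc_visc))

# Toy usage with a few random particles (hypothetical values):
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, (20, 2)); vel = rng.normal(0.0, 0.1, (20, 2))
mass = np.full(20, 0.05); rho = np.full(20, 1000.0); mu = np.full(20, 1e-3)
acc = viscous_accel(pos, vel, mass, rho, mu, h=0.1)
print(dissipation_rate(vel, mass, acc))  # should come out negative
```

Monitoring such a global dissipation term against the expected viscous decay is one simple way to check, in a weak sense, how a given viscous formulation behaves near a free surface where the kernel support is truncated.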
Abstract:
Although there are numerous accurate methods to measure soil moisture content at a point, until very recently there were no precise in situ, real-time methods able to measure soil moisture content along a line. By means of the Distributed Fiber Optic Temperature measurement method (DFOT), temperature is determined at 0.12 m intervals over long distances (up to 10,000 m) with high temporal frequency and an accuracy of ±0.2 °C. The principle of temperature measurement along a fiber optic cable is based on the thermal sensitivity of the relative intensities of backscattered photons that arise from collisions with electrons in the core of the glass fiber. A laser pulse generated by the DTS unit and traversing the fiber optic cable results in backscatter at two frequencies. The DTS quantifies the intensity of these backscattered photons and the time elapsed between the pulse and the observed returned light. The intensity at one of the frequencies is strongly dependent on the temperature at the point where the scattering process occurred. The computed temperature is attributed to the position along the cable from which the light was reflected, which is computed from the light's travel time.
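The two measured quantities described above, time of flight and the relative intensity of the two backscatter bands, map to a position and a temperature roughly as in the sketch below, which uses a commonly cited single-ended Raman DTS calibration equation. The group index and the calibration constants are hypothetical values that would normally be obtained from reference fibre sections, not parameters of the instrument used in the study.

```python
# A minimal sketch of the Raman DTS principle: position from two-way time of flight,
# temperature from the Stokes / anti-Stokes intensity ratio. All constants are
# illustrative assumptions.
import math

C0 = 299_792_458.0      # speed of light in vacuum, m/s
N_GROUP = 1.468         # assumed group refractive index of the fibre core

def position_m(elapsed_s):
    """Distance along the fibre from the two-way travel time of the pulse."""
    return C0 * elapsed_s / (2.0 * N_GROUP)

def temperature_c(stokes, anti_stokes, z_m, gamma=482.0, c_cal=1.47, dalpha=1.6e-5):
    """Temperature from the intensity ratio at position z along the fibre.
    gamma [K], c_cal [-] and dalpha [1/m] are hypothetical calibration constants."""
    t_kelvin = gamma / (math.log(stokes / anti_stokes) + c_cal - dalpha * z_m)
    return t_kelvin - 273.15

# Example: a return observed 9.79 microseconds after the pulse, i.e. about 1 km down
# the cable, with a Stokes / anti-Stokes intensity pair of 1.00 / 0.80.
z = position_m(9.79e-6)
print(f"z = {z:.1f} m, T = {temperature_c(1.00, 0.80, z):.1f} C")
```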