929 results for Principle of authority
Abstract:
This article discusses the tensions between the principle of state sovereignty and the idea of a "humanitarian intervention" (or an intervention on humanitarian grounds) as they emerged from the debates of leading legal scholars in the 19th and early 20th centuries. While prominent scholars such as Johann Caspar Bluntschli, Gustave Rolin-Jaequemyns and Aegidius Arntz spoke out in favour of a form of "humanitarian intervention", others such as August Wilhelm Heffter or Pasquale Fiore were much more critical and in many cases argued for absolute state sovereignty.
Abstract:
Because natural selection is likely to act on multiple genes underlying a given phenotypic trait, we study here the potential effect of ongoing and past selection on the genetic diversity of human biological pathways. We first show that genes included in gene sets are generally under stronger selective constraints than other genes and that their evolutionary response is correlated. We then introduce a new procedure to detect selection at the pathway level based on a decomposition of the classical McDonald–Kreitman test extended to multiple genes. This new test, called 2DNS, detects outlier gene sets and takes into account past demographic effects and evolutionary constraints specific to gene sets. Selective forces acting on gene sets can be easily identified by a mere visual inspection of the position of the gene sets relative to their two-dimensional null distribution. We thus find several outlier gene sets that show signals of positive, balancing, or purifying selection but also others showing an ancient relaxation of selective constraints. The principle of the 2DNS test can also be applied to other genomic contrasts. For instance, the comparison of patterns of polymorphisms private to African and non-African populations reveals that most pathways show a higher proportion of nonsynonymous mutations in non-Africans than in Africans, potentially due to different demographic histories and selective pressures.
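The multi-gene decomposition can be illustrated with a short sketch: pool the per-gene McDonald–Kreitman counts over a gene set, compute two MK-style proportions, and compare the gene set's position against a null cloud built from random gene sets of the same size. This is a simplified stand-in under those assumptions, not the published 2DNS statistic (which additionally accounts for demography and gene-set-specific constraints).

```python
import numpy as np

def pool_counts(counts):
    # counts: (n_genes, 4) array with columns [Pn, Ps, Dn, Ds]
    # (nonsynonymous/synonymous polymorphisms and substitutions)
    return counts.sum(axis=0)  # pool over the genes of the set

def two_d_stat(pooled):
    # Two MK-style proportions: the nonsynonymous share of polymorphism
    # and of divergence; selection shifts a gene set's point in this plane
    pn, ps, dn, ds = pooled
    return pn / (pn + ps), dn / (dn + ds)

def null_cloud(all_counts, set_size, n_draws=10_000, seed=0):
    # Null distribution: random gene sets of the same size drawn genome-wide
    rng = np.random.default_rng(seed)
    draws = [two_d_stat(pool_counts(
                 all_counts[rng.choice(len(all_counts), set_size, replace=False)]))
             for _ in range(n_draws)]
    return np.asarray(draws)  # observed sets far outside this cloud are outliers
```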
Abstract:
The concentrations of the long-lived nuclear reaction products 129I and 36Cl have been measured in samples from the MEGAPIE liquid metal spallation target. Samples from the bulk target material (lead-bismuth eutectic, LBE), from the interface of the free metal surface with the cover gas, from LBE/steel interfaces and from noble metal absorber foils installed in the cover gas system were analysed using Accelerator Mass Spectrometry at the Laboratory of Ion Beam Physics at ETH Zürich. The major part of the 129I and 36Cl was found accumulated at the interfaces, particularly the interface between the LBE and the steel walls of the target container, while the bulk LBE samples contain only a minor fraction of these nuclides. Both nuclides were also detected on the absorber foils to a certain extent (≪ 1% of the total amount). The latter number is negligible with regard to the radio-hazard of the irradiated target material; however, it indicates a certain affinity of the absorber foils for halogens, thus proving the principle of using noble metal foils to catch these volatile radionuclides. The total amounts of 129I and 36Cl in the target were estimated from the analytical data by averaging within the different groups of samples and summing these averages over the whole target. This estimate accounts for about half of the amount of 129I and 36Cl predicted by nuclear physics modelling codes for both nuclides. The significance of the results and the associated uncertainties are discussed.
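The inventory estimation described here is simple arithmetic: average the measured concentration within each sample group, multiply by that group's mass in the target, and sum over groups. A minimal sketch with purely hypothetical placeholder numbers, not MEGAPIE data:

```python
# Hypothetical sample groups: mean measured concentration (atoms per gram
# of material) and the total mass of that group within the target.
groups = {
    "bulk_LBE":            {"mean_atoms_per_g": 1.0e10, "mass_g": 8.0e5},
    "LBE_steel_interface": {"mean_atoms_per_g": 5.0e12, "mass_g": 2.0e3},
    "free_surface":        {"mean_atoms_per_g": 2.0e12, "mass_g": 1.0e3},
}

# Total inventory = sum over groups of (group mean concentration x group mass)
total_atoms = sum(g["mean_atoms_per_g"] * g["mass_g"] for g in groups.values())
print(f"estimated total inventory: {total_atoms:.2e} atoms")
```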
Abstract:
A total of 23 pollen diagrams [stored in the Alpine Palynological Data-Base (ALPADABA), Geobotanical Institute, Bern] cover the last 100 to over 1000 years. The sites include 15 lakes, seven mires, and one soil profile distributed across the Jura Mts (three sites), the Swiss Plateau (two sites), the northern Pre-Alps and Alps (six sites), the central Alps (five sites), the southern Alps (three sites), and the southern Pre-Alps (four sites), in the western and southern parts of Switzerland or just outside the national borders. The pollen diagrams have both a high taxonomic resolution and a high temporal resolution, with sampling distances of 0.5–3 cm, equivalent to 1 to 11 years for the last 100 years and 8 to 130 years for earlier periods. The chronology is based on absolute dating (14 sites: 210Pb at 11 sites, 14C at six sites, varve counting at two sites) or on biostratigraphic correlation among pollen diagrams; the latter relies mainly on trends in Cannabis sativa, Ambrosia, Mercurialis annua, and Ostrya-type pollen. Individual pollen stratigraphies are discussed and sites are compared within each region. The principle of designating local, extra-local, and regional pollen signals and vegetation is exemplified by two pairs of sites lying close together. Trends in biostratigraphy shared by a major part of the pollen diagrams allow the following generalisations. Forest declined in phases from medieval times up to the late 19th century. Abies and Fagus declined consistently, whereas the behaviour of short-lived trees and trees of moist habitats differed among sites (Alnus glutinosa-type, Alnus viridis, Betula, Corylus avellana). In the present century, however, Picea and Pinus increased, followed by Fraxinus excelsior in the second half of the century. Grassland (traced by Gramineae and Plantago lanceolata-type pollen) increased, replacing much of the forest, and declined again in the second half of this century. Nitrate enrichment of the vegetation (traced by Urtica) took place in the first half of this century. These trends reflect the intensification of forest use and the expansion of grassland from medieval times up to the end of the 19th century, after which parts of the grassland came to be used more intensively while the marginal parts were abandoned to forest regrowth. In most pollen diagrams human impact is the dominant factor explaining the inferred changes in vegetation, but climatic change plays a role at three sites.
Abstract:
In a marvelous but somewhat neglected paper, 'The Corporation: Will It Be Managed by Machines?', Herbert Simon articulated from the perspective of 1960 his vision of what we now call the New Economy: the machine-aided system of production and management of the late twentieth century. Simon's analysis sprang from what I term the principle of cognitive comparative advantage: one has to understand the quite different cognitive structures of humans and machines (including computers) in order to explain and predict the tasks to which each will be most suited. Perhaps unlike Simon's better-known predictions about progress in artificial intelligence research, the predictions of this 1960 article hold up remarkably well and continue to offer important insights. In what follows I attempt to tell a coherent story about the evolution of machines and the division of labor between humans and machines. Although inspired by Simon's 1960 paper, I weave many other strands into the tapestry, from classical discussions of the division of labor to present-day evolutionary psychology. The basic conclusion is that, with growth in the extent of the market, we should see humans 'crowded into' tasks that call for the kinds of cognition for which humans have been equipped by biological evolution. These human cognitive abilities range from the exercise of judgment in situations of ambiguity and surprise to more mundane abilities in spatio-temporal perception and locomotion. Conversely, we should see machines 'crowded into' tasks with a well-defined structure. This conclusion is not based (merely) on a claim that machines, including computers, are specialized idiots savants today because of the limits (whether temporary or permanent) of artificial intelligence; rather, it rests on a claim that, for broadly 'economic' reasons, it will continue to make economic sense to create machines that are idiots savants.
Abstract:
This study investigates the social and behavioral predictors of consistent condom use among female commercial sex workers (FCSWs) in Ghana. Street commercial sex workers were interviewed in Accra, Kumasi and Techiman. Whereas respondents had acquired some accurate knowledge about HIV transmission routes, misconceptions were still commonly reported. The level of condom education was very low (14%); however, consistent condom use (all the time) with clients was relatively high (49.6%), while 38.89% reported using condoms sometimes and 11.56% reported never using condoms. 277 of the respondents did not use condoms all the time: 163 of them reported not using condoms because their clients refused, and 64 respondents did not even ask their clients to use condoms, owing to cultural perceptions of power, lack of authority and the fear of losing clients. Significant predictors of consistent condom use among FCSWs in a multivariate analysis were age, level of education, religion, and number of customers. Major obstacles to condom use by the FCSWs included refusal by clients, the availability of free condoms, wanting to communicate trust to their clients, and the lack of empowerment to negotiate safer sex with clients. Some respondents may have developed a false sense of safety by subjectively judging whether their clients looked well rather than sick, unaware that HIV carriers may show no obvious symptoms of illness at all. In summary, this study points to an urgent need to re-establish effective prevention interventions and offers insights into what such a program in Ghana would require.
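A multivariate analysis of this kind is commonly implemented as a logistic regression of consistent condom use on the candidate predictors. The sketch below uses a hypothetical data layout and invented values to show the general approach; it does not reproduce the study's actual model, variables, or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical respondents: [age, years of education, customers per week]
X = np.array([[23, 9, 5], [31, 4, 12], [27, 11, 6], [40, 2, 15],
              [22, 12, 4], [29, 8, 7], [35, 5, 10], [26, 7, 9]])
y = np.array([1, 0, 1, 0, 1, 1, 0, 0])  # 1 = consistent condom use

# Fit the multivariate model and inspect the per-predictor coefficients
clf = LogisticRegression().fit(X, y)
print(dict(zip(["age", "education_yrs", "customers"], clf.coef_[0])))
```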
Abstract:
Introduction: Today many countries, developed and developing alike, are trying to promote decentralization. According to Manor, quoting Nickson's argument, decentralization stems from the necessity to strengthen local governments, as proxies of civil society, to fill the yawning gap between the state and civil society (Manor [1999]: 30). With the end of the Cold War following the collapse of the Soviet Union rendering the cause of "leadership of the central government to counter communism" meaningless, Manor points out, it has become increasingly difficult to respond flexibly to changes in society under a centralized system. What benefits, then, can be expected from decentralization? Litvack, Ahmad and Bird cite four: attainment of allocative efficiency in the face of different local preferences for local public goods; improvement of government competitiveness; realization of good governance; and enhancement of the legitimacy and sustainability of heterogeneous national states (Litvack, Ahmad & Bird [1998]: 5). All of these contribute to reducing the economic and social costs of a central government unable to respond to changes in society and to enhancing the efficiency of state administration through the delegation of authority to local governments. Why did Indonesia attempt decentralization? As Maryanov notes, the reasons for the implementation of decentralization in Indonesia have never been explicitly presented (Maryanov [1958]: 17). But there was strong momentum toward building a democratic state in Indonesia at the time of independence, and, as indicated by the provisions of Article 18 of the 1945 Constitution, there was a tendency in Indonesia from the beginning to debate decentralization in association with democratization. That said, the debate about democratization was fairly abstract, and its main aims were to ease tensions, quiet complaints, satisfy the political forces and thus stabilize the process of government (Maryanov [1958]: 26-27). What triggered decentralization in Indonesia in earnest, of course, was the collapse of the Soeharto regime in May 1998. The Soeharto regime, regarded as the epitome of centralized power, had become incapable of dealing effectively with problems of state administration and development administration. Moreover, the post-Soeharto era of "reform" (reformasi) demanded the complete erasure of the Soeharto image, and decentralization stood in direct contraposition to the centralization of power. The Soeharto regime that ruled Indonesia for 32 years was established in 1966 under the banner of "anti-communism." The end of the Cold War structure in the late 1980s undermined the regime's claimed rationale for centralized power, namely countering communism; the factor for decentralization cited by Manor is applicable here. Decentralization can be interpreted to mean not only the reversal of a centralized system of government unable to respond to changes in society, as Manor points out, but also the participation of local governments in the process of nation state building through a more positive transfer of power (democratic decentralization) and in the coordinated pursuit, together with the central government, of a new shape of the state. However, it is also true that a variety of problems have emerged in the process of implementing decentralization in Indonesia.
This paper discusses the relationship between decentralization and the formation of the nation state with the above problems and issues in mind. Section 1 retraces the history of decentralization by examining laws and regulations for local administration and how they were, or were not, actually implemented. Section 2 focuses on the relationships among the central government, local governments, foreign companies and other actors in the struggle over the distribution of profits from the exploitation of natural resources, and examines how the ulterior motives of these actors and the amplification of mistrust spawned intense conflicts that, in extreme cases, grew into separation and independence movements. Section 3 considers the merits and demerits, at this stage, of the decentralization implemented since 2001 and sheds light on the significance of decentralization for nation state building. Finally, Section 4 attempts to review decentralization as an "opportunity to learn by doing" for the central and local governments in the process of nation state building. In the context of decentralization in Indonesia, deconcentration (dekonsentrasi), decentralization (desentralisasi) and support assignments (tugas pembantuan; medebewind, a Dutch word, was used previously) are defined as follows. Dekonsentrasi means that the central government puts a local office of its own, an outpost agency, in charge of implementing a service without delegating administrative authority over that service; the outpost agency carries out the service as instructed by the central government, and the head of a local government, when acting for the central government, is involved in the process. Desentralisasi, meanwhile, occurs when the central government cedes administrative authority over a particular service to local governments; under desentralisasi, local governments can undertake the service at their own discretion, and the central government, after the delegation of authority, cannot interfere with how local governments handle it. Tugas pembantuan occurs when the central government has local governments or villages, or local governments have villages, undertake a particular service; in this case the central government (or local government) provides the necessary funding, equipment and materials, and officials of the local governments and villages undertake the service under the supervision and guidance of the central or local governments. Tugas pembantuan is maintained until the local governments and villages become capable of undertaking the service on their own.
Abstract:
In Thailand, communitarian ideas have been widely accepted and even institutionalized as a principle of national development plans and the Constitution of Thailand. This paper examines how and why this communitarian body of thought, described as "community culture thought" and originally created and shared within a small circle of social activists and academics in the early 1980s, came to be disseminated and authorized in Thai society. The contributors and participants, modes of expression, and avenues for disseminating this paradigm are the main topics of the paper. The paper reveals that these thoughts and concepts have been diversified and used as guiding principles by state elites, anti-state activists, and social reformists since the late 1980s. People with such different political ideologies were connected through a few key individuals, and these critical connections networked them onto the same side in promoting communitarian thought in Thailand. When leading advocates assumed key political positions, it was easy for them to push communitarian ideas into the guidelines and principles of state administration.
Abstract:
Many specialists in international trade have started saying that the era of the mega FTA is approaching. If the three poles of the global economy, namely East Asia, the EU and the United States, form mega FTAs, most of the volume of global trade will be covered. That may be fine, but many countries will be left out of the mega FTAs, most of them least developed countries (LDCs). Since the inception of the Doha Development Agenda (DDA) negotiations in 2001, the WTO and its member countries have tried to include LDCs in the world trading system through various means, including duty-free quota-free (DFQF) access and Aid for Trade (AfT). Although these means have had some positive impact on the economic development of LDCs, most LDCs will never feel comfortable with the current world trading system. To overcome the stalemate in the DDA and to create an inclusive world trading system, we need more commitment from both LDCs and non-LDCs. To surmount the prolonged stalemate in the DDA, we should understand how ordinary people in LDCs feel and think about the current world trading system. Those voices have seldom been listened to, even by the decision makers of their own countries. To understand the situation of the people in LDCs, IDE-JETRO carried out several research projects using macro, meso and micro approaches. At the micro level, we collected and analyzed statements from ordinary people concerning their opinions about the world trading system. The interviewees were ordinary people such as street vendors, farmers and factory workers. We asked where they buy and sell daily necessities, and about their perceptions of imported goods, export promotion and free trade at large. These 'voices of the people' surveys were conducted in Madagascar and Cambodia during 2013. Based on this research, and especially the findings from the 'voices of the people' surveys, we propose a 'DDA-MDGs hybrid' strategy to conclude the DDA negotiations and develop a more inclusive and somewhat more ethical world trading system. Our proposal may be summarized in the following three points.
(1) Aid for Trade (AfT) ver. 2. Currently AfT is mainly focused on coordinating several aid projects related to LDCs' capacity building. This is inadequate; the proposed 'DDA-MDGs hybrid' needs a super AfT. The WTO, other development agencies and LDC governments would not only coordinate but also jointly plan aid projects for trade capacity building. AfT ver. 2 includes infrastructure projects, whether grant aid, ODA loans or private investment. This is in accordance with the post-MDGs argument, which emphasizes the role of the private sector.
(2) Ethical attitude. Reciprocity is a principle of multilateral agreement and has been a core premise since GATT. However, to design an inclusive system, special and differential treatment (S&D) is still needed for disadvantaged members. To reconcile full reciprocity with less-than-full reciprocity, an ethical attitude is needed on the part of every member, whereby each refrains from insisting on the full rights and demands of its own country. As used herein, the term 'ethical' implies more consideration for LDCs; it is almost identical to S&D but with a more positive attitude from developed countries (super S&D).
(3) Collect voices of the people. In order to grasp the real situation of the people, the voices of the people on free trade will continue to be collected in other LDCs, and the findings and learnings will be fed back to the WTO negotiation space.
Abstract:
This doctoral thesis, entitled Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths, aims to deepen the knowledge of a particular antenna measurement system: the compact range operating in the millimeter-wavelength frequency bands. The thesis was developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and at present runs four facilities in different configurations: a Gregorian compact antenna test range, spherical near field, planar near field and a semianechoic arch system. The research work carried out for this thesis extends the knowledge of the first of these configurations to higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this high-level goal, a set of scientific tasks was carried out sequentially; these are succinctly described below. The first step was a state-of-the-art review. The study of the scientific literature covered measurement practices in compact antenna test ranges together with the particularities of millimeter-wavelength technology. Where such measurement facilities are concerned, these two fields of knowledge converge on a series of technological challenges that become serious bottlenecks at different stages: analysis, design and assessment. Second, the focus was set on electromagnetic analysis algorithms. These formulations make it possible to evaluate electromagnetic features of interest, such as the phase of a field distribution or the stray-signal behaviour of particular structures, when they interact with sources of electromagnetic waves. A properly operated CATR facility features wave-collimating optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis introduces a large number of mathematical unknowns that grows with frequency, following polynomial laws whose order depends on the algorithm used. The optics configuration of interest here was the reflection-type serrated-edge collimator. Analysing these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility is the core of an algorithm's ability to support the subsequent design tasks. The thesis' contribution to this field is a formulation that is powerful both in handling various analysis geometries and in computational terms. Two algorithms were developed; while based on the same hybridization principle, they achieve different orders of physical accuracy at different computational cost. Their CATR design capabilities were compared, yielding both qualitative and quantitative conclusions about their scope. Third, interest shifted from analysis and design tasks towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced in the analysis stage reappears in the in-chamber field-probing stage.
The naturally lower dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities sharply increase the difficulty of performing assessment processes in CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of the millimeter-wavelength band, whereas the value of range assessment lies, on the contrary, towards the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques that achieve substantial data-reduction ratios. As a side effect, they also increase the robustness of the results to noise, which amounts to a virtual rise of the setup's available dynamic range. Fourth, the environmental sensitivity of millimeter wavelengths was addressed. Electromagnetic experiments are well known to drift because the results depend on the surrounding environment. This relegates many practices that are industrial routine at microwave frequencies to the experimental stage at millimeter wavelengths. In particular, the evolution of the atmosphere, even within acceptable conditioning bounds, produces drift phenomena that can completely mask the experimental results. The contribution of this thesis in this respect consists of electrically modeling the indoor atmosphere of a CATR as a function of the environmental variables that affect the range's performance. A simple model was developed that relates high-level phenomena, such as feed-probe phase drift, to low-level quantities that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed, and chamber conditioning is automatically extended towards higher frequencies. In summary, the purpose of this thesis is to advance the knowledge of millimeter-wavelength compact antenna test ranges. This knowledge is delivered through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operational facility; at each of these stages, bottlenecks exist today that seriously compromise antenna measurement practice at millimeter wavelengths.
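As an illustration of how such an environmental model can work, the sketch below estimates feed-probe phase drift from temperature and relative humidity using the standard radio refractivity formula for air; this is a minimal, hypothetical stand-in for the thesis' actual model, and the frequency, path length and pressure values are assumptions.

```python
import math

C_LIGHT = 299_792_458.0  # speed of light in vacuum (m/s)

def saturation_vapor_pressure_hpa(t_c):
    # Magnus formula for saturation vapor pressure over water (hPa)
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def radio_refractivity_ppm(t_c, rh_percent, p_hpa=1013.25):
    # Standard radio refractivity N (ppm): N = 77.6/T * (P + 4810*e/T)
    t_k = t_c + 273.15
    e = rh_percent / 100.0 * saturation_vapor_pressure_hpa(t_c)
    return 77.6 / t_k * (p_hpa + 4810.0 * e / t_k)

def phase_drift_deg(freq_ghz, path_m, env_a, env_b):
    # Feed-probe phase drift (deg) between two (temp degC, RH %) states
    lam = C_LIGHT / (freq_ghz * 1e9)  # free-space wavelength (m)
    dn = (radio_refractivity_ppm(*env_b) - radio_refractivity_ppm(*env_a)) * 1e-6
    return math.degrees(2.0 * math.pi * path_m * dn / lam)

# Example: 10 m path at 100 GHz, warming from 20 degC to 22 degC at 40% RH
print(phase_drift_deg(100.0, 10.0, (20.0, 40.0), (22.0, 40.0)))
```

Even the roughly 3 ppm refractivity change in this example produces a phase drift of a few degrees over a 10 m path at 100 GHz, which is why uncompensated conditioning drift can mask millimeter-wavelength measurements.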
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies occurring when considering free-surface inviscid flows. Actually, in SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support can be neglected, and (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical viscous weakly compressible SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided for three free-surface test cases, including a standing wave, using the three viscous term formulations investigated.
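For context, the sketch below shows the generic weakly compressible SPH building blocks the analysis above refers to: kernel interpolation, density summation, and the Tait equation of state. It is a standard textbook illustration under those assumptions, not the specific viscous term formulations analysed in the paper.

```python
import numpy as np

def cubic_spline_w(r, h):
    # Monaghan's 2D cubic spline kernel with support radius 2h
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h * h)  # 2D normalization constant
    return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def density_summation(pos, m, h):
    # rho_i = sum_j m_j W(|r_i - r_j|, h): the basic SPH interpolation;
    # near a free surface the truncated kernel support causes the
    # boundary-term issues discussed in the abstract
    r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    return (m[None, :] * cubic_spline_w(r, h)).sum(axis=1)

def tait_pressure(rho, rho0=1000.0, c0=10.0, gamma=7.0):
    # Weakly compressible (Tait) equation of state closing the system
    return rho0 * c0**2 / gamma * ((rho / rho0)**gamma - 1.0)
```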
Abstract:
Although there are numerous accurate methods to measure soil moisture content at a single spot, until very recently there were no precise in situ, real-time methods able to measure soil moisture content along a line. By means of the Distributed Fiber Optic Temperature measurement method (DFOT), temperature is determined at 0.12 m intervals over long distances (up to 10,000 m), with high temporal frequency and an accuracy of ±0.2 °C. The principle of temperature measurement along a fiber optic cable is based on the thermal sensitivity of the relative intensities of backscattered photons that arise from collisions with electrons in the core of the glass fiber. A laser pulse generated by the DTS unit and traversing a fiber optic cable results in backscatter at two frequencies. The DTS quantifies the intensity of these backscattered photons and the time elapsed between the pulse and the observed returned light. The intensity of one of the two frequencies is strongly dependent on the temperature at the point where the scattering occurred. The computed temperature is attributed to the position along the cable from which the light was reflected, computed from the light's travel time.
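To make the two measurement principles concrete, here is a minimal sketch of a Raman-type DTS computation: position along the fiber from the pulse's round-trip time of flight, and temperature from the temperature-sensitive anti-Stokes/Stokes intensity ratio. The calibration constant, group index and Raman shift energy are assumptions of the sketch, not values from the study.

```python
import math

K_B = 1.380649e-23       # Boltzmann constant (J/K)
C_LIGHT = 299_792_458.0  # speed of light in vacuum (m/s)

def fiber_position_m(round_trip_s, group_index=1.468):
    # Position along the fiber from the pulse's round-trip time of flight
    return C_LIGHT * round_trip_s / (2.0 * group_index)

def temperature_k(ratio_as_s, delta_e_j, calib_c):
    # Temperature from the anti-Stokes/Stokes intensity ratio, assuming
    # ratio = calib_c * exp(-delta_e / (k_B * T)); calib_c lumps the
    # wavelength factors and system losses (hypothetical calibration value)
    return delta_e_j / (K_B * math.log(calib_c / ratio_as_s))

# Example: a return observed 100 microseconds after the pulse
print(fiber_position_m(100e-6))                     # ~10.2 km down the fiber
# Example: ratio 0.15 with an assumed Raman shift energy and calibration
print(temperature_k(0.15, delta_e_j=8.7e-21, calib_c=1.3))
```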
Abstract:
This paper presents the results of research aimed at formulating a general model to support the implementation and management of an urban road pricing scheme. After preliminary work to establish the state of the art in sustainable urban mobility strategies, the problem was set up theoretically in terms of transport economics, introducing the concept of external costs, duly translated into the principle of pricing for the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators to characterize urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical urban areas in Europe on the basis of the results of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified, in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), a large benchmarking exercise and a cross-analysis of the direct and indirect indicators made it possible to define a simple general model, guidelines and key requirements for implementing a pricing-based traffic restriction scheme in a generic urban area. The model was finally applied to the design of a road pricing scheme for a particular area of Madrid, and to the quantification of the expected results of its implementation from land use, mobility, environmental and economic perspectives.