888 results for Continuity principle


Relevance:

20.00%

Publisher:

Abstract:

This brief article is devoted to a critique of the arguments put forward by the Attorney General of Canada in connection with the Reference concerning certain questions relating to the secession of Quebec (hereinafter, "the Reference"). This critique will not be presented from a plainly positivist standpoint. On the contrary, I will be examining in particular (1) how the approach taken by the Attorney General impoverished the legal concepts of the rule of law and federalism, both of which were, however, central to her submission; and, in a more general way, (2) how the excessively detailed analysis of constitutional texts contributes to the impoverishment of the symbolic function of the law, however essential that dimension may be to its legitimacy. My criticism will take into account the reasons for judgement delivered recently by the Supreme Court in the Reference.

Relevance:

20.00%

Publisher:

Abstract:

[Support Institutions:] Department of Administration of Health, University of Montreal, Canada; Public Health School of Fudan University, Shanghai, China

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of regulating an economy with environmental pollution. We examine the distributional impact of the polluter-pays principle, which requires that any agent compensate all other agents for the damages caused by his or her (pollution) emissions. With constant marginal damages we show that regulation via the polluter-pays principle leads to the unique welfare distribution that assigns non-negative individual welfare and renders each agent responsible for his or her pollution impact. We extend both the polluter-pays principle and this result to increasing marginal damages due to pollution. We also discuss the acceptability of the polluter-pays principle and compare it with the Vickrey-Clarke-Groves mechanism.
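As a hypothetical numerical illustration (not taken from the paper, whose notation and model are richer), the constant-marginal-damages case can be sketched in a few lines of Python: each agent compensates every other agent for the damage its own emissions cause, and the resulting net transfers balance to zero.

```python
def polluter_pays_transfers(emissions, marginal_damage):
    """Net transfer for each agent when every agent compensates all
    others for the damage caused by its own emissions (constant
    marginal damage per unit of emission, per affected agent).

    Function name and the linear damage form are illustrative
    assumptions, not the paper's formulation."""
    n = len(emissions)
    net = []
    for i in range(n):
        # agent i pays each of the n-1 other agents for its own emissions
        paid = marginal_damage * emissions[i] * (n - 1)
        # agent i is compensated for everyone else's emissions
        received = marginal_damage * sum(e for j, e in enumerate(emissions) if j != i)
        net.append(received - paid)
    return net
```

With emissions `[2, 1, 1]` and unit marginal damage, the heaviest polluter ends up a net payer while the others are net receivers, and the transfers sum to zero.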

Relevance:

20.00%

Publisher:

Abstract:

This thesis contributes to a general theory of project design. Situated within a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects related, in particular, to the application of these standards confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which were aimed precisely at remedying the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on the evolution of the ways sustainability has been taken into account. From this perspective, we observe that the theories of "green design" dating from the early 1960s, and the theories of "ecological design" of the 1970s and 1980s, ultimately converged with the more recent theories of "sustainable design" from the early 1990s onward. The various approaches to the "precautionary principle" are then examined from the standpoint of project sustainability. Standard risk-assessment methods are compared with approaches using the precautionary principle, revealing certain limits in the design of a project.
A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched. This model offers a global vision for judging a project that integrates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgement. What, then, of the challenges posed by the judgement of architectural projects amid the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory as put forward by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED-standardized sustainability assessment method. A preliminary series of "tensions" is identified in the process of designing and judging the projects. These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization (analogical/logical); (2) uncertainty (epistemological/methodological); (3) comparability (interpretive/analytical); and (4) proposition (universality/contextual relevance).
These conceptual tensions are treated as vectors that correlate with the theoretical model and enrich it, without constituting validations in the positivist sense of the term. These confrontations with reality make it possible to better define the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impacts of environmental standards on the process of designing and judging projects, taking as its example, in a non-restrictive way, the examination of Canadian architecture competitions for public buildings. The conclusion underlines the need for a new form of "reflexive prudence" as well as a more critical use of current sustainability-assessment tools, and calls for an instrumentation founded on global integration rather than on the opposition of environmental approaches.

Relevance:

20.00%

Publisher:

Abstract:

This thesis has been realised through a scholarship offered by the Government of Canada to the Government of the Republic of Mauritius under the Programme Canadien de Bourses de la Francophonie

Relevance:

20.00%

Publisher:

Abstract:

In the year 408 AD, Spain, despite its peninsular position at the edge of Europe, was integrated into a pan-Mediterranean culture stretching from Portugal to Syria. Barely three decades had passed since the establishment of Christianity as the religion of the Roman state, and the Catholic Church was in full growth. The following year, Spain entered a path of irremediable transformation as pagans with their barbarian Germanic tongues crossed the Pyrenees, bringing war and misery to the Hispano-Romans and founding their kingdoms where the Roman state had previously governed. In the turmoil of the fifth century, Catholic bishops struggled to impose their dominance over their communities and over the hearts of the faithful. In light of advances in archaeology and in the quality of the critical editions of our literary sources, the time has come to identify the Iberian bishops with attention to regional conditions. This thesis characterizes the bishops of Spain and Portugal and demonstrates the trials they faced as intermediaries between natives and invaders, as evangelists among the pagans, as persecutors of apostates, and as guardians of romanitas at the end of the ancient world.

Relevance:

20.00%

Publisher:

Abstract:

We show that every cardinal incentive compatible voting mechanism satisfying a continuity condition must be ordinal. Our results apply to many standard models in mechanism design without transfers, including the standard voting models with any domain restrictions.

Relevance:

20.00%

Publisher:

Abstract:

This work examines the proposal for EU regulation COM/2000/7/final, 2000/0212(COD) of the European Parliament and of the Council, put forward by the Commission of the European Communities as the basis of a market-opening regulation, and the changes it will bring about in Germany. Tendering of transport services will increase. The types of tender will differ between rural regions and transport centres. In rural regions, demand-responsive solutions will gain more ground; cuts in transport services will be more severe there than in the centres and will thus lead to a smaller volume of services. Because of the small scope of services, there are fewer interested bidders, so in standard tenders the variants of restricted or negotiated award are frequently chosen, and functional tenders are of only minor importance. In the transport centres, the lots are larger and therefore attractive to many operators, and the additional transport services are more complex; standard tenders under the public award procedure will presumably become the norm there. The VOL/A will certainly retain its significance, and the regulatory scope it requires, in Germany as German or European law. Whether the advisory character of the DIN EN 13816 standard "Public passenger transport: definition, specification of service-quality targets and measurement of service quality" can be preserved, and whether it will not instead contribute as a steering element to standardization in public transport, is doubtful. Through this obligation to tender, the public transport authority becomes the purchaser of transport services. Transport planning thereby passes into the responsibility of the authority and comes more strongly under the influence of politics. Strategic-abstract and concrete transport planning merge in the normal case of the standard tender.
The hope arises of better network integration and of standardization of public transport supply and quality. At the same time, however, there is a danger that the local transport supply becomes dependent on the current budget situation or on the interests of politics. Continuity of supply and quality become declared planning goals. The transport planner on the purchaser side must translate the planning into tender documents. This requires extended competencies in business administration, logistics, law, informatics and leadership. Training institutions must react to this before implementation begins. Because the tendering steps delay the implementation of plans, longer planning lead times must be factored into transport planning: planning with more foresight becomes more important, and planning errors are no longer so easy to correct. Thanks to the increased use of technology in the vehicles, the transport planner more often has the opportunity to check the effectiveness of planning changes, with regard to attractiveness for passengers, against the recorded ridership figures. The same applies to marketing and sales measures, as well as to fare policy. These figures serve not only for this feedback but also as a planning basis for future measures; the planner has more concrete figures at his disposal. One aspect that will gain importance because of the sanction mechanisms in tendering is the ability to assess the quality of transport services as objectively as possible. The practical effects on the planning of local public transport arise mainly from the increased complexity of planning itself and the resulting indispensable use of computer support.
Translating the planning into tender documents, and monitoring it, constitute new elements in the transport planner's field of tasks and thus require more broadly based core competencies; more transport planners with broader training will be needed. This work has shown that integrating the tendering concept into the course of transport planning will bring about a leap in the development of planning activity. Given the future increase in the quality and quantity of planning data, and the likewise increased demands on the evaluation parameters, changes and new requirements arise in this field, primarily for universities and other training institutions, but also for the industry that supports transport planning.

Relevance:

20.00%

Publisher:

Abstract:

Relativistic density functional theory is widely applied in molecular calculations with heavy atoms, where relativistic and correlation effects are on the same footing. Variational stability of the Dirac Hamiltonian has been a very important field of research since the beginning of relativistic molecular calculations, alongside efforts on accuracy, efficiency, density functional formulation, etc. One- or two-component approximations and the search for suitable basis sets are the two major means of obtaining good projection power against the negative continuum. The minimax two-component spinor linear combination of atomic orbitals (LCAO) is applied in the present work to both light and super-heavy one-electron systems, providing good approximations over the whole energy spectrum, close to the benchmark minimax finite element method (FEM) values and free of spurious and contaminated states, in contrast to the presence of these artifacts in the traditional four-component spinor LCAO. Variational stability assures that minimax LCAO is bounded from below. New balanced basis sets, kinetic and potential defect balanced (TVDB), following the minimax idea, are applied with the Dirac Hamiltonian. Their performance on the same super-heavy one-electron quasi-molecules also shows very good projection capability against variational collapse, with minimax LCAO taken as the best projection for comparison. The TVDB method has twice as many basis coefficients as the four-component spinor LCAO, but the problem becomes linear, overcoming the heavy computational cost of the minimax method. Calculations with both the TVDB method and the traditional LCAO method for the dimers of the group-11 elements of the periodic table investigate their difference. Basis sets larger than in previous research are constructed, achieving high accuracy within the functionals involved.
Their difference in total energy is much smaller than the basis-incompleteness error, showing that the traditional four-spinor LCAO retains enough projection power from the numerical atomic orbitals and is suitable for research in relativistic quantum chemistry. In scattering investigations made for the same comparison purpose, the failure of the traditional LCAO method to provide a stable spectrum with increasing basis-set size is contrasted with the TVDB method, which contains no spurious states even without pre-orthogonalization of the basis sets. Keeping the same conditions, including the accuracy of the matrix elements, shows that the variational instability prevails over the linear dependence of the basis sets. The success of the TVDB method demonstrates its capability not only in relativistic quantum chemistry but also for scattering problems and under the influence of strong external electric and magnetic fields. The good accuracy in total energy with large basis sets and the good projection property encourage wider research on different molecules, with better functionals, and on small effects.

Relevance:

20.00%

Publisher:

Abstract:

The basic idea behind improving local food security consists of two paths: first, accessibility (price, stock), and second, availability (quantity and biodiversity); both are prerequisites for the provision of nutrients and a continuous food supply from locally available resources. The objectives of this thesis are to investigate whether indigenous knowledge still plays an important role in traditional farming in Minangkabau culture, thus supporting local food security, and whether indigenous knowledge still shapes Minangkabau food culture, which is linked to the matrilineal role and leads to sound nutrition. Further, the thesis tests whether marantau influences traditional farming and food culture among the Minangkabau, and whether the local government plays a role in changing traditional farming systems and food culture. It also examines whether education and gender play a role in changing the traditional farming system and food culture, and whether the mass media affect them. The study was completed at four locations in West Sumatera: Nagari Ulakan (NU) (coastal area), Nagari Aia Batumbuak (NAB) (hilly area), Nagari Padang Laweh Malalo (NPLM) (lake area), and Nagari Pandai Sikek (NPS) (hilly area). Rainfall ranges from 1400-4800 mm annually, with fertile soils. Data was collected using PRA (Participatory Rural Appraisal) to investigate indigenous knowledge (IK) and its interactions, combined with in-depth interviews, life histories, a survey using a semi-structured questionnaire, pictures, mapping, and expert interviews. The data was collected from June to September 2009 and in June 2010. The materials were: a map of the area, a list of names, questionnaires, a voice recorder, a notebook, and a digital camera. The sampling method was snowball sampling, yielding both qualitative and quantitative data. For qualitative data, ethnography and life history were used.
For quantitative data, a statistical survey with a semi-structured questionnaire was used; 50 respondents per site participated voluntarily. Data was analyzed with MAXQDA 10 and the F4 audio analysis software (created and developed at Philipps-University Marburg), and clustered based on causality. The results show the role of IK in the TFS (traditional farming system) in NPLM, which has higher food-crop biodiversity than the other three sites despite relatively similar temperature and rainfall. This high biodiversity is due to the awareness of local people, who realized that they live in an unfavourable climate and topography and are therefore better prepared for any changes that may occur. Carbohydrate intake is 100% from rice, even though different staple crops are grown; most people said in the interviews that, for them, not eating rice is like not really eating. In addition, mothers still play an important role in kitchen activities, but when agricultural income is low, mothers have to decide whether to change the meals or to feel insecure about their food supply. Marantau yields a positive impact through the remittances it provides for investment on the farm; on the other hand, it leaves fewer workers for agriculture and therefore has a negative impact on the transfer of IK. The investigation showed that the local government's PTS (Padi Tanam Sabatang) programme still does not guarantee that farmers obtain sufficient revenue from their land; the low agricultural income leads to a situation of potential food insecurity. Education is equal among men and women, but in some cases women tend to leave school earlier because of arranged marriages or the distance of school from their homes. Men predominantly work in agriculture and fishing, while women work in the kitchen. In NAB, even though women work on farmland, they earn less than men.
Weaving (NPS) and kitchen activity are recognized as women's work, which also supports the household income. The mass media are not currently yielding any changes in the TFS or food culture. The traditional farming system has changed because of intensive agricultural extension, which has introduced new methods of agriculture over the last three decades (since the 1980s). There is no evidence that people want to change their food habits because of the mass media, despite the lapau activity, which offers them more food choices than preparing traditional meals at home. The recommendations of this thesis are: 1) The empowerment of farmers, regarding a self-sufficient supply of manure, cooperative seed, and sustainable farm management. Farmers should know where they stand in their state of knowledge, so they can use their local wisdom while still collaborating with new sources of knowledge, and should learn the prognosis of supply and demand prior to harvest. Farm-management guidelines are needed, drawing on both local wisdom and modern knowledge. 2) The increase of non-agricultural income. Increasing non-agricultural income is strongly recommended; remittances can be invested in non-agricultural jobs. 3) The empowerment of the mother. The mother plays an important role in farm-to-fork activities and can be an initiator and promoter of cultivating spices in the backyard. Improvement of nutritional knowledge through information and informal public education can be achieved through arisan ibu-ibu and lapau activity. The challenges in applying these recommendations are: 1) the gap between institutions and organizations of local government, since more than one institution is involved in food-security policy; and 2) the need for training and facilities for field extension agriculture (FEA), because the rapid change in the interaction between local government and farmers depends on this agency.

Relevance:

20.00%

Publisher:

Abstract:

During the past few years, there has been much discussion of a shift from rule-based systems to principle-based systems for natural language processing. This paper outlines the major computational advantages of principle-based parsing, its differences from the usual rule-based approach, and surveys several existing principle-based parsing systems used for handling languages as diverse as Warlpiri, English, and Spanish, as well as language translation.

Relevance:

20.00%

Publisher:

Abstract:

Machine translation has been a particularly difficult problem in the area of Natural Language Processing for over two decades. Early approaches to translation failed because interaction effects of complex phenomena made translation appear unmanageable. Later approaches to the problem have succeeded (although only bilingually), but are based on many language-specific rules of a context-free nature. This report presents an alternative approach to natural language translation that relies on principle-based descriptions of grammar rather than rule-oriented descriptions. The model that has been constructed is based on abstract principles as developed by Chomsky (1981) and several other researchers working within the "Government and Binding" (GB) framework. Thus, the grammar is viewed as a modular system of principles rather than a large set of ad hoc language-specific rules.

Relevance:

20.00%

Publisher:

Abstract:

In any discipline where uncertainty and variability are present, it is important to have principles that are accepted as inviolate and that should therefore drive statistical modelling, statistical analysis of data, and any inferences from such an analysis. Despite the fact that two such principles have existed for the last two decades, and that from them a sensible, meaningful methodology has been developed for the statistical analysis of compositional data, the application of inappropriate and/or meaningless methods persists in many areas of application. This paper identifies at least ten common fallacies and confusions in compositional data analysis with illustrative examples, and provides readers with necessary, and hopefully sufficient, arguments to persuade the culprits why and how they should amend their ways.
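One fallacy the compositional literature warns against is applying standard statistics directly to data constrained to a constant sum. As a minimal sketch (an illustration of the accepted log-ratio methodology in general, not code from the paper), the closure operation and the centred log-ratio (clr) transform can be written as:

```python
import math

def closure(parts):
    """Rescale positive parts so they sum to 1 (the unit-sum constraint)."""
    total = sum(parts)
    return [p / total for p in parts]

def clr(composition):
    """Centred log-ratio transform: log of each part relative to the
    geometric mean, mapping a composition into unconstrained real space
    where ordinary multivariate statistics become meaningful."""
    gmean = math.exp(sum(math.log(p) for p in composition) / len(composition))
    return [math.log(p / gmean) for p in composition]
```

A useful sanity check: clr coordinates always sum to zero, reflecting the one degree of freedom lost to the unit-sum constraint.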

Relevance:

20.00%

Publisher:

Abstract:

The Hardy-Weinberg law, formulated about 100 years ago, states that under certain assumptions the three genotypes AA, AB and BB at a bi-allelic locus are expected to occur in the proportions p², 2pq, and q² respectively, where p is the allele frequency of A, and q = 1-p. Many statistical tests are used to check whether empirical marker data obey the Hardy-Weinberg principle. Among these are the classical chi-square test (with or without continuity correction), the likelihood ratio test, Fisher's exact test, and exact tests in combination with Monte Carlo and Markov chain algorithms. Tests for Hardy-Weinberg equilibrium (HWE) are numerical in nature, requiring the computation of a test statistic and a p-value. There is, however, ample room for the use of graphics in HWE tests, in particular for the ternary plot. Nowadays, many genetic studies use genetic markers known as Single Nucleotide Polymorphisms (SNPs). SNP data comes in the form of counts, but from the counts one typically computes genotype frequencies and allele frequencies. These frequencies satisfy the unit-sum constraint, and their analysis therefore falls within the realm of compositional data analysis (Aitchison, 1986). SNPs are usually bi-allelic, which implies that the genotype frequencies can be adequately represented in a ternary plot. Compositions that are in exact HWE describe a parabola in the ternary plot. Compositions for which HWE cannot be rejected in a statistical test are typically "close" to the parabola, whereas compositions that differ significantly from HWE are "far". By rewriting the statistics used to test for HWE in terms of heterozygote frequencies, acceptance regions for HWE can be obtained that can be depicted in the ternary plot. This way, compositions can be tested for HWE purely on the basis of their position in the ternary plot (Graffelman & Morales, 2008).
This leads to attractive graphical representations in which large numbers of SNPs can be tested for HWE in a single graph. Several examples of graphical tests for HWE (implemented in R software) will be shown, using SNP data from different human populations.
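The numerical side of such a test is straightforward. A minimal sketch (an illustration in Python, not the R implementation referred to above) of the classical chi-square statistic for HWE from genotype counts:

```python
def hwe_chi_square(n_aa, n_ab, n_bb):
    """Pearson chi-square statistic (1 d.f., no continuity correction)
    for Hardy-Weinberg equilibrium at a bi-allelic locus, computed
    from observed genotype counts."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # allele frequency of A
    q = 1.0 - p
    observed = [n_aa, n_ab, n_bb]
    expected = [n * p * p, 2 * n * p * q, n * q * q]  # HWE proportions times n
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

For example, the counts (25, 50, 25) give p = 0.5 and match the expected proportions exactly, so the statistic is 0; any departure from the p², 2pq, q² parabola yields a positive value, to be compared against the chi-square distribution with one degree of freedom.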