15 results for Correspondences, Doctrine of.
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The new Swiss interdepartmental cooperation structure results in a new setting for negotiations and encourages the counterpart to adopt a comparable setting, or at least comparable domestic cooperation.
Abstract:
Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem was expounded in a number of cases. Panels and the Appellate Body adopted different philosophies in interpreting the Agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science. Different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence of, and differences between, positivism and relativism in philosophy and the natural sciences. He clarifies the relationship of fundamental concepts such as risk, hazard and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law – often unconsciously – is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of implied and assumed underlying philosophies and perceptions as to the relationship of law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodology of falsification developed by Karl R. Popper. Critical rationalism allows discourse in science and law to be combined and helps prepare the ground for a new approach to risk assessment and risk management.
Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving the matter of risk management to national and democratically accountable governments. While the author questions throughout the thesis the possibility of separating risk assessment and risk management, the thesis offers new avenues which may assist in structuring a complex and difficult problem.
Abstract:
Mapping the relevant principles and norms of international law, the paper discusses scientific evidence and identifies current legal foundations of climate change mitigation, adaptation and communication in international environmental law, human rights protection and international trade regulation in WTO law. It briefly discusses the evolution and architecture of relevant multilateral environmental agreements, in particular the UN Framework Convention on Climate Change. It discusses the potential role of human rights in identifying pertinent goals and values of mitigation and adaptation and eventually turns to principles and rules of international trade regulation and investment protection, which are likely to be of crucial importance should a new multilateral agreement fail to materialize. The economic and legal relevance of rules on tariffs, border tax adjustment, subsidies, services, intellectual property and investment law is discussed in relation to the production, supply and use of energy. Moreover, lessons from trade negotiations may be drawn for negotiations of future environmental instruments. The paper offers a survey of the main interacting areas of public international law and discusses the intricate interaction of all these components informing climate change mitigation, adaptation and communication in international law in light of an emerging doctrine of multilayered governance. It seeks to contribute to greater coherence in what today is highly fragmented and rarely discussed in an overall context. The paper argues that trade regulation will be of critical importance in assessing domestic policies, and that potential trade remedies offer powerful incentives for all nations alike to participate in a multilateral framework defining appropriate goals and principles.
Abstract:
This appraisal of David Scott FitzGerald and David Cook-Martín's Culling the Masses: The Democratic Origins of Racist Immigration Policy in the Americas argues that there is no ‘elective affinity’ between liberalism and racism, which is the core argument of the book. The notion of ‘elective affinity’, which the authors borrow from Max Weber, requires a structural homology between the ‘electively’ related elements that just does not exist in this case. The relationship between both is entirely contingent, ‘racism’ being a doctrine of inter-group relations while ‘liberalism’ is a doctrine of intra-group relations, with no consideration of how the boundaries of the group are constituted.
Abstract:
In this work, a method that synchronizes two video sequences is proposed. Unlike previous methods, which require the existence of correspondences between features tracked in the two sequences, and/or that the cameras be static or jointly moving, the proposed approach does not impose any of these constraints. It works when the cameras move independently, even if different features are tracked in the two sequences. The assumptions underlying the proposed strategy are that the intrinsic parameters of the cameras are known and that two rigid objects, with independent motions in the scene, are visible in both sequences. The relative motion between these objects is used as a clue for the synchronization. The extrinsic parameters of the cameras are assumed to be unknown. A new synchronization algorithm for static or jointly moving cameras that see (possibly) different parts of a common rigidly moving object is also proposed. Proof-of-concept experiments that illustrate the performance of these methods are presented, as well as a comparison with a state-of-the-art approach.
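The core idea of the abstract above, using relative motion as a synchronization clue, can be illustrated in a much-simplified form. The sketch below is not the authors' algorithm: it assumes the relative motion has already been reduced to a hypothetical 1-D signal per sequence and recovers a constant frame offset by normalized cross-correlation.

```python
import numpy as np

def estimate_offset(sig_a, sig_b):
    """Estimate the constant frame offset between two relative-motion
    signals via normalized cross-correlation."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    corr = np.correlate(a, b, mode="full")
    # Peak index minus (len(b) - 1) gives the lag L where a[n + L] ~ b[n].
    return int(np.argmax(corr)) - (len(b) - 1)

# Toy signals: sig_b[j] corresponds to sig_a[j + 5], i.e. an offset of 5 frames.
t = np.linspace(0, 4 * np.pi, 200)
motion = np.sin(t) + 0.5 * np.sin(3 * t)
sig_a, sig_b = motion[0:150], motion[5:155]
print(estimate_offset(sig_a, sig_b))  # → 5
```

In the actual method the cameras move independently and the clue is the full rigid relative motion between the two objects, not a scalar signal; this toy reduction only shows the alignment step.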
Abstract:
Statistical shape models (SSMs) have been used widely as a basis for segmenting and interpreting complex anatomical structures. The robustness of these models is sensitive to the registration procedure, i.e., the establishment of a dense correspondence across a training data set. In this work, two SSMs based on the same training data set of scoliotic vertebrae, but on different registration procedures, were compared. The first model was constructed from the original binary masks without applying any image pre- or post-processing, and the second was obtained by means of a feature-preserving smoothing method applied to the original training data set, followed by a standard rasterization algorithm. The accuracy of the correspondences was assessed quantitatively by means of the maximum of the mean minimum distance (MMMD) and the Hausdorff distance (HD). The anatomical validity of the models was quantified by means of three different criteria: compactness, specificity, and model generalization ability. The objective of this study was to compare quasi-identical models based on standard metrics. Preliminary results suggest that the MMMD distance and eigenvalues are not sensitive metrics for evaluating the performance and robustness of SSMs.
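Of the three criteria mentioned above, compactness is the easiest to sketch: it is commonly computed as the fraction of total shape variance captured by the first m PCA modes of the model. The snippet below is a generic PCA-based illustration on synthetic data, not the authors' implementation.

```python
import numpy as np

def ssm_eigenvalues(shapes):
    """Eigenvalues of the shape covariance (in decreasing order) for a
    training matrix of flattened shapes, shape (n_shapes, n_coords)."""
    X = shapes - shapes.mean(axis=0)
    s = np.linalg.svd(X, compute_uv=False)
    # Singular values s relate to covariance eigenvalues by s**2 / (n - 1).
    return s ** 2 / (len(shapes) - 1)

def compactness(eigenvalues, m):
    """Fraction of total shape variance captured by the first m modes."""
    return float(eigenvalues[:m].sum() / eigenvalues.sum())

rng = np.random.default_rng(0)
train = rng.normal(size=(20, 60))   # 20 synthetic shapes, 30 2-D landmarks each
lam = ssm_eigenvalues(train)
print(compactness(lam, 5))          # grows toward 1.0 as m increases
```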
Abstract:
Statistical models have recently been introduced in computational orthopaedics to investigate bone mechanical properties across several populations. A fundamental aspect of the construction of statistical models concerns the establishment of accurate anatomical correspondences among the objects of the training dataset. Various methods, such as mesh morphing or image registration algorithms, have been proposed to solve this problem. The objective of this study is to compare a mesh-based and an image-based statistical appearance model approach for the creation of finite element (FE) meshes. A computed tomography (CT) dataset of 157 human left femurs was used for the comparison. For each approach, 30 finite element meshes were generated with the models. The quality of the obtained FE meshes was evaluated in terms of the volume, size and shape of the elements. Results showed that the quality of the meshes obtained with the image-based approach was higher than that of the meshes obtained with the mesh-based approach. Future studies are required to evaluate the impact of this finding on the final mechanical simulations.
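As a rough illustration of the element-level quality checks referred to above (the abstract does not specify its metrics), one can compute the volume of a tetrahedral element and a simple shape measure such as the min/max edge-length ratio; both metric choices here are assumptions for the sketch.

```python
import numpy as np

def tet_quality(verts):
    """Volume and a simple shape measure (min/max edge-length ratio)
    for one tetrahedral element, verts given as a 4x3 array."""
    a, b, c, d = verts
    # Signed volume from the scalar triple product, made positive.
    vol = abs(np.linalg.det(np.array([b - a, c - a, d - a]))) / 6.0
    pairs = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
    edges = [np.linalg.norm(verts[i] - verts[j]) for i, j in pairs]
    return vol, min(edges) / max(edges)

# A regular tetrahedron: all edges equal, so the shape ratio is 1.0.
reg = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)
vol, ratio = tet_quality(reg)
print(round(vol, 3), round(ratio, 3))  # → 2.667 1.0
```

Degenerate (sliver) elements show up as a vanishing volume or a ratio far below 1.0.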
Abstract:
Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Outliers are handled by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis testing procedure to estimate it automatically. We present here our validations using four experiments: (1) a leave-one-out experiment; (2) an experiment evaluating the present approach for handling pathology; (3) an experiment evaluating the present approach for handling outliers; and (4) an experiment on reconstructing surface models of seven dry cadaver femurs using clinically relevant data without noise and with noise added. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95th-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
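The LTS idea used in all three stages can be sketched in a toy setting: repeatedly fit on the h points with the smallest squared residuals, where h is set from an assumed outlier rate. The generic sketch below (a 1-D line fit, not the paper's surface registration) illustrates only the trimming step.

```python
import numpy as np

def lts_line_fit(x, y, outlier_rate=0.2, n_iter=20):
    """Least trimmed squares fit of y = a*x + b: repeatedly refit on the
    h points with the smallest squared residuals, h being set by the
    assumed outlier rate."""
    h = int(np.ceil(len(x) * (1.0 - outlier_rate)))
    keep = np.arange(len(x))                 # start from all points
    for _ in range(n_iter):
        a, b = np.polyfit(x[keep], y[keep], 1)
        r2 = (y - (a * x + b)) ** 2
        keep = np.argsort(r2)[:h]            # retain the h best-fitting points
    return a, b

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.05, size=50)
y[:8] += 15.0                                # gross outliers
a, b = lts_line_fit(x, y)
print(round(a, 2), round(b, 2))              # close to 2.0 and 1.0
```

An ordinary least-squares fit on the same data would be dragged toward the outliers; the trimmed refits discard them after a few iterations.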
Abstract:
Turkish agriculture has been experiencing a period of unique policy experiment over the last couple of years. A World Bank-initiated project, called the Agricultural Reform Implementation Project (ARIP), has been at the forefront of policy change. It was initially promoted by the Bank as an exemplary reform package which could also be adopted by other developing countries. It was introduced in 2001 as part of a major International Monetary Fund (IMF)/World Bank-imposed program of "structural adjustment" after the country had been hit by a major financial crisis. The project finally came to an end in 2009, and there is now an urgent need for a retrospective assessment of its overall impact on the agricultural sector. Has it fulfilled its ambitious objective of reforming and restructuring Turkish agriculture? Or should it be recorded as a failure of the neo-liberal doctrine? This book aims at finding answers to these questions by investigating the legacy of ARIP from a multi-disciplinary perspective.
Abstract:
Spatial scaling is an integral aspect of many spatial tasks that involve symbol-to-referent correspondences (e.g., map reading, drawing). In this study, we asked 3–6-year-olds and adults to locate objects in a two-dimensional spatial layout using information from a second spatial representation (map). We examined how scaling factor and reference features, such as the shape of the layout or the presence of landmarks, affect performance. Results showed that spatial scaling on this simple task undergoes considerable development, especially between 3 and 5 years of age. Furthermore, the youngest children showed large individual variability and profited from landmark information. Accuracy differed between scaled and un-scaled items, but not between items using different scaling factors (1:2 vs. 1:4), suggesting that participants encoded relative rather than absolute distances.
Abstract:
Statistical appearance models have recently been introduced in bone mechanics to investigate bone geometry and mechanical properties in population studies. The establishment of accurate anatomical correspondences is a critical aspect of the construction of reliable models. Depending on the representation of a bone as an image or a mesh, correspondences are detected using image registration or mesh morphing. The objective of this study was to compare image-based and mesh-based statistical appearance models of the femur for finite element (FE) simulations. To this aim, (i) we compared correspondence detection methods on the bone surface and in the bone volume; (ii) we created an image-based and a mesh-based statistical appearance model from 130 images, which we validated using compactness, representation and generalization, and we analyzed the FE results on 50 recreated bones vs. the original bones; (iii) we created 1000 new instances and compared the quality of the FE meshes. Results showed that the image-based approach was more accurate in volume correspondence detection and quality of FE meshes, whereas the mesh-based approach was more accurate for surface correspondence detection and model compactness. Based on our results, we recommend the use of image-based statistical appearance models for FE simulations of the femur.
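Generalization, one of the validation criteria named above, is typically measured by leave-one-out reconstruction: rebuild each left-out shape from a model trained on the rest and average the error. The sketch below is a generic PCA-based illustration on synthetic data, not the authors' pipeline.

```python
import numpy as np

def generalization_error(shapes, n_modes):
    """Leave-one-out generalization of a PCA shape model: the mean error
    of reconstructing each left-out shape using n_modes modes."""
    errors = []
    for i in range(len(shapes)):
        train = np.delete(shapes, i, axis=0)
        mean = train.mean(axis=0)
        # Principal directions from the SVD of the centered training set.
        _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
        basis = vt[:n_modes]
        coeffs = basis @ (shapes[i] - mean)
        recon = mean + basis.T @ coeffs
        errors.append(np.linalg.norm(recon - shapes[i]))
    return float(np.mean(errors))

rng = np.random.default_rng(2)
data = rng.normal(size=(15, 40))      # 15 synthetic shapes, 40 coordinates each
print(generalization_error(data, 5))  # decreases as n_modes grows
```

Because the mode bases are nested, the error is non-increasing in the number of modes; a model that generalizes well reaches a low error with few modes.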
Abstract:
This paper demonstrates a mixed approach to the theme of the instrumentality of law, analysing both the goal of a legal transformation and the techniques adopted to achieve it. The correct recognition of a certain practical necessity led the Swiss Federal Tribunal to an intriguing judgement, the “Fussballclub Lohn-Fall” of 1997. The legal remedies provided for cases of unfair advantage were then creatively modified praeter legem. The adaptation was strongly influenced by foreign legal patterns. The Swiss Code of Obligations of 1911 provides a norm in art. 21 on unfair advantage (unconscionable contract), prescribing that if one party takes unjustified advantage of the weaknesses of another in order to receive an excessive benefit, such a contract is avoidable. Its wording was shaped over a hundred years ago and still remains intact. However, over the course of the 20th century the necessity for more efficient protection arose. Legal doctrine and jurisprudence constantly pointed out the incompleteness of the remedies provided by art. 21 of the Code of Obligations. In the “Fussballclub Lohn-Fall” (BGE 123 III 292) the Swiss Federal Tribunal finally introduced the possibility of modifying the contract. Its decision has been described as “a sign of the zeitgeist, the spirit of the time”. It was Swiss legal doctrine that imposed the new measure, under the influence of the German “quantitative Teilnichtigkeit” (quantitative partial nullity). The historical heritage of the Roman laesio enormis has also played its role.