937 results for International relief - Evaluation
Abstract:
Systemic thinking may be traced back to several roots. Some of them can be found in Taoism, the basic concepts of which are the achievement of cosmic harmony and a well-balanced social order. Others can be found in Greek philosophy. Similarly, modern physics in its most advanced branches is now recognizing basic aspects of these same roots in a scientific guise. The more the process of research and theory building advances, the more phenomena are recognized as complex and interdependent with other phenomena. Interdisciplinary research and the constitution of new disciplines are contributing to a scientific approximation of integral reality, which is becoming more and more like the one everyone knows as prescientific. The transcendence of the narrow boundaries of positivist sciences seems to be becoming a necessity for scientific evolution. The ecological crisis of the twentieth century may itself lead to increased systemic thinking, and it is in full awareness of the fact that there are no simple solutions that the systemic evaluator tries to cope with the problems of the dynamics of social and political interventions in the Third World as a means of development co-operation.
Abstract:
We review and extend the core literature on international transfer price manipulation to avoid or evade taxes. Under negotiated transfer pricing with a viable bargaining structure, including performance evaluation disconnected from the transfer price, divisions voluntarily exchange accurate information to obtain firm-wide optimality, a result not dependent on restraint from exercising internal market power. For intangible licenses, a larger optimal profit shift for a given tax rate change strengthens incentives for transfer pricing abuse. In practice, an intangible's arm's length range is viewed as a guideline, a context in which incentives for abuse materialize. Transfer pricing for intangibles therefore warrants greater tax authority scrutiny.
Abstract:
Genetics education for physicians has been a popular publication topic in the United States and in Europe for over 20 years. Decreasing numbers of medical genetics professionals and an increasing volume of genetic information have created a dire need for increased genetics training in medical school and in clinical practice. This study aimed to assess how well pediatrics-focused primary care physicians apply their general genetics knowledge to clinical genetic testing, using scenario-based questions. We chose to focus specifically on knowledge of the diagnostic applicability of Chromosomal Microarray (CMA) technology in pediatrics because of its recent recommendation by the International Standard Cytogenomic Array (ISCA) Consortium as a first-tier genetic test for individuals with developmental disabilities and/or congenital anomalies. Proficiency in ordering baseline genetic testing was evaluated for eighty-one respondents from four pediatrics-focused residencies (categorical pediatrics, pediatric neurology, internal medicine/pediatrics, and family practice) at two large residency programs in Houston, Texas. Similar to other studies, we found an overall deficit of genetic testing knowledge, especially among family practice residents. Interestingly, residents who elected to complete a genetics rotation in medical school scored significantly better than expected, as well as better than residents who did not. We suspect that insufficient knowledge among physicians regarding a baseline genetics work-up is leading to redundant (i.e. concurrent karyotype and CMA) and incorrect (i.e. ordering CMA to detect achondroplasia) genetic testing and is contributing to rising health care costs in the United States.
Our results provide specific teaching points upon which medical schools can focus education about clinical genetic testing and suggest that increased collaboration between primary care physicians and genetics professionals could benefit patient health care overall.
Abstract:
Basic information about the relief of a watershed, obtained through analytical-descriptive methodologies, provides those who evaluate projects related to the use of natural resources (such as integrated watershed management, environmental impact studies, soil degradation, deforestation, and water resource conservation, among others) with the physical parameters needed for their analysis. These processes have a strong spatial component, and Geographic Information Systems (GIS) are of great utility for them, with Digital Elevation Models (DEM) and their derivatives being a relevant component of this database. Products derived from these models, such as slope, aspect, or curvature, will be only as accurate as the DEM used to derive them. It is also essential to maximize the model's ability to represent terrain variations; to do so, an adequate grid resolution must be selected according to the data available for its generation. This study evaluates the altimetric quality of six DEMs generated from two different source-data capture systems and different grid resolutions. To determine the accuracy of a DEM, a group of control points regarded as "ground truth" is usually compared with the values generated by the model at the same geographic positions. The selected study area is located in Arrecifes, Buenos Aires province (Argentina), and covers approximately 120 ha. For the two algorithms and the three grid sizes analyzed, the results were as follows: for the DEM from contour algorithm, an RMSE (Root Mean Squared Error) of ± 0.11 m (1 m grid), ± 0.11 m (5 m grid), and ± 0.15 m (10 m grid).
For the DEM from vector/points algorithm, an RMSE of ± 0.09 m (1 m grid), ± 0.11 m (5 m grid), and ± 0.11 m (10 m grid). These results support the conclusion that the DEM generated from surveyed terrain spot heights as source data and with the smallest grid size is the only one that satisfies the values reported in the national and international literature, which makes it suitable for natural resource projects at the ecotope (farm) level. The remaining DEMs present an RMSE that ensures their suitability for the evaluation of projects related to the use of natural resources at the landscape-unit level (a set of ecotopes).
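The control-point comparison behind these RMSE figures can be sketched in a few lines of Python. This is a minimal illustration of the metric itself, not the authors' actual workflow, and the elevation values below are hypothetical:

```python
import math

def rmse(control_elevations, dem_elevations):
    """Root Mean Squared Error between 'ground truth' control points
    and the elevations the DEM yields at the same positions."""
    assert len(control_elevations) == len(dem_elevations)
    squared_errors = [(c - d) ** 2
                      for c, d in zip(control_elevations, dem_elevations)]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Hypothetical control-point elevations (m) vs. DEM-derived elevations (m)
control = [31.20, 30.85, 30.40, 29.95, 29.50]
dem = [31.31, 30.76, 30.49, 29.88, 29.62]
print(round(rmse(control, dem), 3))  # -> 0.098
```

A lower RMSE at a finer grid, as reported above for the vector/points DEM, simply means the per-point deviations entering this sum are smaller.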
Abstract:
The present volume contains the planktological data collected during the expedition of the "Meteor" to the Indian Ocean in 1964/65. The main objective of the expedition was to study the up- and downwelling conditioned by the northeast monsoon along the western and eastern coasts of the Arabian Sea. It is from these areas that the greater part of the data presented here was obtained. A few values from the Red Sea have been added. As the title "Planktological-Chemical Data" implies, the planktological investigations, with the exception of the particle size analysis and the phytoplankton counting conducted optically, were carried out chiefly with the help of chemical methods. These investigations were above all devoted to a quantitative survey of particulate matter and plankton, the latter being sampled by water bottle and net. The zooplankton hauls were taken with the Indian Ocean Standard Net according to the international guidelines laid down for the expedition. As a rule, double catches were made at every station, one sample being intended for laboratory analysis at the Indian Ocean Biological Centre in Ernakulam, South India, and the other for the Institut für Meereskunde in Kiel. In addition to determining the standing stock, the production rate of phytoplankton was measured by the 14C method. These experiments were mainly conducted during the latter half of the expedition. The planktological studies primarily covered the euphotic zone, extending into the underlying water layers down to a depth of 600 m. The investigations were above all directed towards ascertaining the quantity of organic substance formed by primary production in relation to environmental conditions, and towards determining whether or not organic substance is actively transported from the surface into the deeper layers by the periodically migrating organisms of the deep scattering layers.
Depending on the station time available, a few samples could now and then be taken from deeper layers. The present volume of planktological-chemical data addresses itself to all those concerned with processing the extensive material collected during the International Indian Ocean Expedition. As a readily accessible work of reference, it hopes to serve as an aid in the evaluation and interpretation of the expedition results. The complementary ecological data such as temperature, salinity, and oxygen content, as well as the figures obtained on the abundance and depth distribution of the nutrients essential for primary production, may be found in the volume of physical-chemical data published in Series A of the "Meteor"-Forschungsergebnisse No. 2, 1966 (Dietrich et al., 1966).
Abstract:
Politicians, social scientists and general readers have noted in both Cuban and international academic forums and periodicals that the well-being enjoyed by the Cuban people in the 1980s has been seriously compromised since the economic crisis of the 1990s. Even for the most skeptical of observers it is clear that this worsening of conditions can be attributed not only to external factors, such as the breakup of the international socialist system, the tightening of the US blockade, and the worldwide economic crisis suffered by underdeveloped countries, but also to internal factors that have kept the country from taking full advantage of the human and material potential available on the island. Although Cuba is currently experiencing an economic recovery from the mid-1990s collapse in GDP that followed the breakdown of its ties with the Socialist Bloc, it continues to maintain high import coefficients due to longstanding structural.
Abstract:
The International Fusion Materials Irradiation Facility (IFMIF) is a future neutron source based on the D-Li stripping reaction, planned to test candidate fusion materials at relevant fusion irradiation conditions. During the design of IFMIF, special attention was paid to the structural materials for the blanket and first wall, because they will be exposed to the most severe irradiation conditions in a fusion reactor. The irradiation of candidate materials for solid breeder blankets is also planned in the IFMIF reference design. This paper focuses on assessing the suitability of IFMIF irradiation conditions for testing functional materials to be used in liquid blankets and diagnostics systems, since these are also being considered within the IFMIF objectives. The study is based on the analysis and comparison of the main expected irradiation parameters in IFMIF and the DEMO reactor.
Abstract:
Non-destructive, visual evaluation and mechanical testing techniques were used to assess the structural properties of 374 samples of chestnut (Castanea sativa). The principal components method was applied to establish and interpret correlations between the variables obtained: modulus of elasticity, bending strength, and density. The static modulus of elasticity presented higher correlation values than those obtained using non-destructive methods. Bending strength presented low correlations with the non-destructive parameters, but showed some relation to the different knot ratios defined; the relationship was strongest with the most widely used ratio, CKDR. No significant correlations were observed between any of the variables and density.
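As a hedged sketch of the kind of analysis described (not the authors' actual code, and with hypothetical sample values standing in for the 374 chestnut measurements), the correlation matrix of the three variables and its principal components can be computed in Python:

```python
import numpy as np

# Hypothetical measurements for a few samples:
# columns = static MOE (GPa), bending strength (MPa), density (kg/m3)
X = np.array([
    [11.2, 62.0, 590.0],
    [9.8, 55.5, 610.0],
    [12.5, 70.1, 575.0],
    [10.4, 58.3, 600.0],
    [11.9, 66.7, 585.0],
])

# Pairwise correlations between the three variables
R = np.corrcoef(X, rowvar=False)

# Principal components via eigen-decomposition of the correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]           # largest variance first
explained = eigvals[order] / eigvals.sum()  # variance share per component
```

Inspecting the off-diagonal entries of `R` is what reveals, for example, a weak correlation between density and the other variables, while `explained` shows how much of the total variance each component captures.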
Abstract:
Experiences relating to the International Masters in Rural Development from the Technical University of Madrid (Universidad Politécnica de Madrid, UPM), the first Spanish programme to receive a mention as a Registered Education Programme by the International Project Management Association (IPMA), are considered. Backed by an educational strategy based on Project-Based Learning dating back twenty years, this programme has managed to adapt to the competence evaluation requirements proposed by the European Space for Higher Education (ESHE). To do this, the training is linked to the professional qualification, using as a reference the competences leading to the qualification in project management established by the IPMA.
Abstract:
Accreditation models in the international context mainly consider the evaluation of learning outcomes and the ability of programs (or higher education institutions) to achieve the educational objectives stated in their mission. However, it is not clear whether these objectives, and therefore their outcomes, satisfy real national and regional needs, a critical point in engineering master's programs, especially in developing countries. The aim of this paper is to study the importance of evaluating the local relevancy of these programs and to analyze the main models of the quality assurance and accreditation bodies of the USA, Europe and Latin America, in order to ascertain whether relevancy is evaluated or not. After a literature review, we found that in a context of free-market economics and international education, the accreditation of master's programs follows an international accreditation model and in most cases does not take into account criteria and indicators for local relevancy. We conclude that both are necessary: international accreditation to ensure the effectiveness of the program (achievement of learning outcomes), and national accreditation to ensure the local relevancy of programs, for which we propose some indicators.
Abstract:
This paper describes the first five SEALS Evaluation Campaigns over the semantic technologies covered by the SEALS project (ontology engineering tools, ontology reasoning tools, ontology matching tools, semantic search tools, and semantic web service tools). It presents the evaluations and test data used in these campaigns and the tools that participated in them along with a comparative analysis of their results. It also presents some lessons learnt after the execution of the evaluation campaigns and draws some final conclusions.
Abstract:
In the present paper, the influence of the reference system on the characterization of the surface finish is analyzed. The effect of the choice of reference system on the most representative surface finish parameters (e.g. the roughness average Ra and the root mean square roughness Rq) is studied. The study can also be applied to their equivalent parameters in the waviness and primary profiles. Based on ISO and ASME standards, three different types of regression lines (central, mean and orthogonal) are theoretically and experimentally analyzed, identifying the validity and applicability fields of each one depending on the profile's geometry. For this purpose, several types of profiles are considered and a theoretical and experimental study of them is developed.
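To illustrate how Ra and Rq are measured from a fitted reference line, here is a minimal Python sketch. The profile data are hypothetical, and the least-squares mean line stands in for just one of the reference-line types such a study compares:

```python
import numpy as np

def roughness_params(x, z):
    """Ra and Rq of a profile z(x), measured as deviations from a
    least-squares mean reference line fitted to the profile."""
    # Fit the mean (least-squares) reference line z = a*x + b
    a, b = np.polyfit(x, z, 1)
    deviations = z - (a * x + b)
    ra = np.mean(np.abs(deviations))        # arithmetic average roughness
    rq = np.sqrt(np.mean(deviations ** 2))  # root mean square roughness
    return ra, rq

# Hypothetical sampled profile: a tilted surface with fine texture
x = np.linspace(0.0, 4.0, 200)
z = 0.05 * x + 0.002 * np.sin(40.0 * x)
ra, rq = roughness_params(x, z)
```

Because the deviations, and hence Ra and Rq, are computed relative to the fitted line, choosing a different reference line (central or orthogonal instead of mean) changes the reported values for the same physical profile, which is the sensitivity the paper examines.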
Abstract:
In the present uncertain global context of achieving social stability and a steadily thriving economy, power demand is expected to grow, and global electricity generation could nearly double from 2005 to 2030. Fossil fuels will remain a significant contributor to this energy mix up to 2050, with an expected share of around 70% of global and ca. 60% of European electricity generation. Coal will remain a key player. Hence, under a business-as-usual scenario, CO2 concentrations are forecast to reach three times present values, up to 1,200 ppm, by the end of this century. The Kyoto Protocol was the first approach to taking global responsibility for CO2 emissions, with monitoring and cap targets set for 2012 with reference to 1990. Some of the principal CO2 emitters did not ratify the reduction targets, although the USA and China are taking their own actions and parallel reduction measures. More efficient combustion processes that consume less fuel, although a significant contribution from the electricity generation sector to reducing CO2 concentration levels, might not be sufficient. Carbon Capture and Storage (CCS) technologies have gained importance since the beginning of the decade, with research and funding emerging to bring them into use. After the first research projects and initial scale testing, three principal capture processes are available today, with first figures showing up to 90% CO2 removal in their standard applications in coal-fired power stations. Regarding the last part of the CO2 reduction chain, two options could be considered worthy: reuse (EOR & EGR) and storage. The study evaluates the state of CO2 capture technology development and the availability and investment cost of the different technologies, with few operating cost analyses possible at the time. The main findings and the abatement potential for coal applications are presented.
DOE, NETL, MIT, European universities and research institutions, key technology enterprises and utilities, and key technology suppliers are the main sources of this study. A vision of the technology deployment is presented.
Abstract:
We present an evaluation of a spoken language dialogue system with a module for the management of user-related information, stored as user preferences and privileges. The flexibility of our dialogue management approach, based on Bayesian Networks (BN), together with a contextual information module, which performs different strategies for handling such information, allows us to include user information as a new level in the Context Manager hierarchy. We propose a set of objective and subjective metrics to measure the relevance of the different contextual information sources. The analysis of our evaluation scenarios shows that the relevance of the short-term information (i.e. the system status) remains fairly stable throughout the dialogue, whereas the dialogue history and the user profile (i.e. the middle-term and long-term information, respectively) play a complementary role, their usefulness evolving as the dialogue progresses.