Abstract:
This dissertation presents a study of changes in the governance of higher education in Vietnam. Its central aim is to investigate the origin of, and changes in, the power relationship between the Vietnamese state and higher education institutions (HEIs), a relationship that results mainly from the interaction of these two actors. The power of both actors is socially constructed and is determined chiefly by their usefulness and contributions to higher education; this work focuses in particular on the aspect of teaching quality. The study adopts a general governance perspective to examine the relationship between the state and HEIs, and applies Resource Dependence Theory (RDT) to analyse how HEIs respond to a changing environment characterised by policy shifts and declining funding. Through an empirical examination of government policy and of the internal governance and practices of four leading universities, the study concludes that, under the pressure to generate income, Vietnamese universities have developed both strategies and tactics to control resource flows and legitimacy. Decision-making and goal-setting committees composed of a majority of academics are more powerful than the managers, so university initiatives largely involve academics. Based on the evolving patterns of resource contributions by academics and students to higher education, the study predicts an emerging governance configuration in which the dimensions of academic self-governance and the competitive market grow stronger while state regulation increases in a rational manner.
The country's current institutional design and administrative system, and the specific weighting and coordination mechanisms between the three key actors (the state, the HEIs/academics and the students) that would constitute an effective oversight system, will take a long time to identify and establish. In the current phase of searching for such a system, the government should strengthen management tools such as accreditation, reward-based and market-based instruments, and information-based decision-making. Beyond that, it is necessary to increase policy transparency and to disclose more information.
Abstract:
The right to food has become a pillar of international humanitarian and human rights law. The increasing number of food-related emergencies and the evolution of the international order brought the more precise notion of food security and made a potential right to receive food aid emerge. Despite this apparent centrality, recent statistics show that a life free from hunger is, for many people all over the world, still a utopian idea. The paper explores the nature and content of the right to food, food security and food aid under international law in order to understand the reasons behind the substantial failure of this right-centred approach, emphasising the lack of legal effect of many food-related provisions owing to the excessive moral connotations of the right to be free from hunger. Bearing in mind the three-dimensional nature of food security, the paper also suggests that attention has been focused entirely on the availability of food, while the real difficulties arise in terms of accessibility and adequacy. Emergency situations provide an excellent example of this imbalance, as the emerging right to receive food aid focuses on the availability of food without improving local production or adequacy. Looking at other evolving sectors of international law, such as the protection of the environment, and particularly the safeguarding of biological diversity, alternative solutions will be envisaged in order to "feed" the right to food.
Abstract:
The study aims to gain deeper insight into the highly extensive system of animal husbandry in the Mahafaly region of southwestern Madagascar. It seeks to understand the major drivers of pastoral dynamics and of land and resource use along a gradient in altitude and vegetation, in order to account for the area's high spatial and temporal heterogeneity. The study also analyzes the reproductive performance of local livestock as well as the owners' culling strategies, to determine herd dynamics, opportunities for economic growth, and future potential for rural development. Across seasons, plateau herds of both livestock species covered longer distances (cattle 13.6±3.02 km, goats 12.3±3.48 km) and were found further away from the settlements (cattle 3.1±0.96 km, goats 2.8±0.98 km) than those from the coastal plain (walking_dist: cattle 9.5±3.25 km, goats 9.2±2.57 km; max_dist: cattle 2.6±1.28 km, goats 1.8±0.61 km). Transhumant cattle were found to be more vulnerable than local herds owing to limited access to pasture land and water resources. Seasonal water shortage was confirmed as a key constraint on the plateau, while livestock keeping along the coast is limited more by dry-season forage availability. However, recent security issues and land-use conflicts with local crop farmers are gaining importance and are forcing livestock owners to adapt their traditional grazing management, resulting in spatio-temporal variation of livestock numbers and in an impending risk of local overgrazing and rangeland degradation. Among the 133 plant species consumed by livestock, 13 were determined to be of major importance for the animals' nutrition. The nutritive value and digestibility of the natural forage, as well as its abundance in the coastal zone, decreased substantially over the course of the dry season, underlining the importance of supplementary forage plants, in particular Euphorbia stenoclada.
At the same time, unsustainable utilization and overexploitation of its wild stocks may raise the pressure on the vegetation and pasture resources within the nearby Tsimanampetsotsa National Park. Age at first parturition was 40.5±0.59 months for cattle and 21.3±0.63 months for goats. Both species showed long parturition intervals (cattle 24.2±0.48 months, goats 12.4±0.30 months), mostly due to the retention of poorly performing breeding females within the herds. Reported offspring mortality, however, was low, with 2.5% of cattle and 18.8% of goats dying before reaching maturity. The analysis of economic information revealed higher-than-expected market dynamics, especially for zebus, resulting in annual contribution margins of 33 € per cattle unit and 11 € per goat unit. Applying the PRY Herd Life model to simulate herd development under the present management and under two alternative scenarios confirmed the economic profitability of the current livestock system and showed potential for further productive and economic development, although this may be clearly limited by the region's restricted carrying capacity. In summary, this study illustrates the highly extensive and resource-driven character of the livestock system in the Mahafaly region, with herd mobility as a central element for coping with seasonal shortages of forage and water. But additional key drivers and external factors are gaining importance and increasingly affect migration decisions and grazing management. This leads to an increased risk of local overgrazing and overexploitation of natural pasture resources, and intensifies the tension between pastoral and conservation interests. At the same time, it hampers the region's agronomic development, whose potential has not yet been fully exploited.
The situation therefore calls for practical improvement measures, such as the systematic planting of supplementary forage species in the coastal zone or a stronger integration of animal husbandry and crop production, in order to sustain the traditional livestock system without compromising people's livelihoods, while at the same time minimizing the pastoral impact on the area's unique nature and environment.
Abstract:
Investing in global environmental and adaptation benefits in the context of agriculture and food security initiatives can play an important role in promoting sustainable intensification. This is a priority for the Global Environment Facility (GEF), created in 1992 with a mandate to serve as the financial mechanism of several multilateral environmental agreements. To demonstrate the nature and extent of GEF financing, we conducted an assessment of the entire portfolio over two decades (1991–2011) to identify projects with direct links to agriculture and food security. A cohort of 192 projects and programs was identified and used as a basis for analyzing trends in GEF financing. Together, these projects and programs accounted for total GEF financing of US$1,086.8 million and attracted an additional US$6,343.5 million from other sources. The value added by GEF financing for ecosystem services and resilience in production systems was demonstrated through a diversity of interventions that utilized US$810.6 million of the total financing. The interventions fall into four main categories, in accordance with GEF priorities: sustainable land management (US$179.3 million), management of agrobiodiversity (US$113.4 million), sustainable fisheries and water resource management (US$379.8 million), and climate change adaptation (US$138.1 million). By aligning GEF priorities with global aspirations for the sustainable intensification of production systems, the study shows that it is possible to help developing countries tackle food insecurity while generating global environmental benefits for a healthy and resilient planet.
Abstract:
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles needed to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. An alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has made migration an impractical solution. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency-hiding mechanisms, such as multithreading. The architecture is abstract and presents programmers with a simple, uniform, fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to distribute data and threads automatically across the physical machine using a set of high-performance migration mechanisms. An implementation of this architecture could migrate a null thread in 66 cycles, over a factor of 1000 improvement on previous work. Performance also scales well; the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar, and scales linearly with data block size.
Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
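The latency premise of this abstract can be illustrated with a back-of-envelope sketch. The die size, clock rate, and on-chip signal speed below are illustrative assumptions, not figures from the thesis:

```python
# Back-of-envelope sketch of why light-speed delay matters on chip.
# All parameters are illustrative assumptions, not values from the thesis.
C = 299_792_458        # speed of light in vacuum, m/s
SIGNAL_FRACTION = 0.5  # assume on-chip signals travel at ~0.5c (optimistic)

def cycles_to_cross(chip_mm: float, clock_ghz: float) -> float:
    """Clock cycles for a signal to traverse a chip edge to edge."""
    seconds = (chip_mm / 1000.0) / (C * SIGNAL_FRACTION)
    return seconds * clock_ghz * 1e9

# A 20 mm die at 4 GHz already costs about half a cycle even at this
# idealised limit; real RC wire delays are considerably worse, so
# cross-chip round trips span multiple cycles and the gap grows with
# clock frequency.
print(f"{cycles_to_cross(20, 4.0):.2f} cycles")
```

Doubling either the die size or the clock rate doubles the cycle cost of crossing the chip, which is the trend the abstract's latency argument rests on.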
Abstract:
When discussing the traditional and new missions of higher education (1996 Report to UNESCO of the International Commission on Education for the 21st Century), Jacques Delors stated that "Excessive attraction to social sciences has broken the equilibrium of available graduates for the workforce, thus causing doubts among graduates and employers about the quality of knowledge provided by higher education". Likewise, when discussing the progress of science and technology, the 1998 UNESCO World Conference on Higher Education concluded that "Another challenge concerns the latest advancements of Science, the sine qua non of sustainable development"; and that "with Information Technology, the unavoidable invasion of virtual reality has increased the distance between industrial and developing countries". Recreational Science has a long tradition all over the educational world; it aims to show the basic aspects of Science, to entertain, and to induce thinking. Until a few years ago, this field of knowledge consisted of a few books, a few kits and other classical (yet innovative) ways to popularize knowledge of Nature and the laws governing it. In Spain, interest in recreational science has grown in recent years. First, new recreational books are being published and found in bookstores. Second, the number of Science-related museums and exhibits is increasing. And third, new television shows are being produced, and short, superficial science-based sketches appear in variety programs. However, current programs on Spanish television dealing seriously with Science are scarce. Recreational Science, especially that related to physical phenomena such as light or motion, is generally found at Science Museums because special equipment is required. By contrast, Science-related mathematics, quizzes and puzzles tend to be gathered into books, e.g. the extensive collections by Martin Gardner. Lately, however, Science podcasts have entered the field of science communication.
Not only are traditional science journals and television channels providing audio and video podcasts, but new websites deal exclusively with science podcasts, in particular on Recreational Science. In this communication we discuss the above-mentioned trends and present our experience over the last two years of participating in Science Fairs and university-sponsored events to attract students to science and technology careers. We show a combination of real examples (e.g., mathemagic), imagination, use of information technology, and use of social networks. We also present an experience in designing a computational, interactive tool to promote chemistry among prospective high-school students using computers ("Dancing with Bionanomolecules"). Echoing the concepts of Web 2.0, it has already been proposed that a new framework for the communication of science is emerging, i.e., Science Communication 2.0, in which people and institutions develop innovative new ways to explain science topics to diverse publics, and in which Recreational Science is likely to play a leading role.
Abstract:
This paper describes the nature and acquisition sequence of partial (wh-) interrogatives in Catalan- and/or Spanish-speaking children, within an analytical framework in which the acquisition of linguistic structures is built up gradually, from concrete structures to more abstract ones. The sample comprises 10 boys and girls drawn from longitudinal corpora, aged from 17 months to 3 years. The analysis considers the syntactic structure of the sentence, errors, interrogative pronouns and adverbs, and verb typology. The results show that the acquisition sequence passes through an initial stage characterised by stereotyped productions or formulas, during which only some interrogative particles appear, and only in very specific structures. Later, interrogation appears with other pronouns and adverbs and spreads to other verbs; moreover, no errors in syntactic construction are observed. These results mark a point of difference with respect to previous studies of English.
Abstract:
The Universidad del Rosario is one of the oldest universities in Colombia; founded in 1653, it has been characterised as a traditional university. Despite this tradition, one Faculty within the institution has generated a profound cultural change that has transformed the nature and performance of the University. This research explores that change using a model that studies culture as a complex reality. The result is interesting from a theoretical standpoint, since it is an example of "change of agents": of how an old, traditional institution can be transformed in academic terms, and of how such a case can be studied.
Abstract:
As ubiquitous systems have moved out of the lab and into the world, the need to think more systematically about how they are realised has grown. This talk will present intradisciplinary work I have been engaged in with other computing colleagues on how we might develop more formal models and understanding of ubiquitous computing systems. The formal modelling of computing systems has proved valuable in areas as diverse as reliability, security and robustness. However, the emergence of ubiquitous computing raises new challenges for formal modelling, due to such systems' contextual nature and their dependence on unreliable sensing. In this work we undertook an exploration of modelling an example ubiquitous system, the Savannah game, using the approach of bigraphical rewriting systems. This required an unusual intradisciplinary dialogue between formal computing and human-computer interaction researchers to model systematically four perspectives on Savannah: computational, physical, human and technical. Each perspective in turn drew upon a range of different modelling traditions. For example, the human perspective built upon previous work on proxemics, which uses physical distance as a means to understand interaction. In this talk I hope to show how our model explains observed inconsistencies in Savannah, and to extend it to resolve these. I will then reflect on the need for intradisciplinary work of this form and on the importance of the bigraph diagrammatic form in supporting this kind of engagement. Speaker Biography: Tom Rodden (rodden.info) is a Professor of Interactive Computing at the University of Nottingham. His research brings together a range of human and technical disciplines, technologies and techniques to tackle the human, social, ethical and technical challenges involved in ubiquitous computing and the increasing use of personal data.
He leads the Mixed Reality Laboratory (www.mrl.nott.ac.uk), an interdisciplinary research facility that is home to a team of over 40 researchers. He founded and currently co-directs the Horizon Digital Economy Research Institute (www.horizon.ac.uk), a university-wide interdisciplinary research centre focusing on the ethical use of our growing digital footprint. He previously directed the EPSRC Equator IRC (www.equator.ac.uk), a national interdisciplinary research collaboration exploring the place of digital interaction in our everyday world. He is a fellow of the British Computer Society and the ACM and was elected to the ACM SIGCHI Academy in 2009 (http://www.sigchi.org/about/awards/).
Abstract:
What are the effects of natural disasters on electoral results? Some authors claim that catastrophes have a negative effect on the survival of leaders in a democracy because voters have a propensity to punish politicians for not preventing, or poorly handling, a crisis. In contrast, this paper finds that these events can be beneficial for leaders. Disasters are linked to leader survival through clientelism: they generate an inflow of resources in the form of aid, which increases the money available for buying votes. Analyzing the 2010-2011 rainy season in Colombia, considered the country's worst disaster in history, I use a difference-in-differences strategy to show that incumbent parties benefited from the disaster in the local elections. The result is robust to different specifications and alternative explanations. Moreover, places receiving more aid, and those with judicial evidence of vote-buying irregularities, are more likely to reelect the incumbent, supporting the mechanism proposed by this paper.
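The identification strategy can be illustrated with a minimal sketch of the difference-in-differences logic. The vote shares below are made up for illustration and do not come from the paper's data:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """DiD estimate: the outcome change in the treated (disaster-hit)
    group minus the change in the control (unaffected) group.
    The control group's change proxies for the counterfactual trend."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical incumbent vote shares (percent) before and after the
# 2010-2011 rainy season, for affected vs. unaffected municipalities.
effect = diff_in_diff(treated_pre=40.0, treated_post=46.0,
                      control_pre=41.0, control_post=43.0)
print(effect)  # a positive estimate: incumbents gained in hit areas
```

In practice the estimate comes from a regression with an interaction between treatment and post-disaster indicators, plus fixed effects and controls; this stripped-down two-by-two comparison is only the core intuition.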
Abstract:
The contributions of the correlated and uncorrelated components of the electron-pair density to atomic and molecular intracule I(r) and extracule E(R) densities and to their Laplacian functions ∇²I(r) and ∇²E(R) are analyzed at the Hartree-Fock (HF) and configuration interaction (CI) levels of theory. The topologies of the uncorrelated components of these functions can be rationalized in terms of the corresponding one-electron densities. In contrast, by analyzing the correlated components of I(r) and E(R), namely IC(r) and EC(R), the effect of electron Fermi and Coulomb correlation can be assessed at the HF and CI levels of theory. Moreover, the contribution of Coulomb correlation can be isolated by means of difference maps between the IC(r) and EC(R) distributions calculated at the two levels of theory. As application examples, the He, Ne, and Ar atomic series, the C2(2-), N2, O2(2+) molecular series, and the C2H4 molecule have been investigated. For these atoms and molecules, it is found that Fermi correlation accounts for the main characteristics of IC(r) and EC(R), with Coulomb correlation slightly increasing the locality of these functions at the CI level of theory. Furthermore, IC(r), EC(R), and the associated Laplacian functions reveal the short-ranged nature and high isotropy of Fermi and Coulomb correlation in atoms and molecules.
Abstract:
Given an observed test statistic and its degrees of freedom, one may compute the observed P value with most statistical packages. It is unknown to what extent test statistics and P values are congruent in published medical papers. Methods: We checked the congruence of statistical results reported in all the papers of volumes 409–412 of Nature (2001) and in a random sample of 63 results from volumes 322–323 of BMJ (2001). We also tested whether the frequencies of the last digit of a sample of 610 test statistics deviated from a uniform distribution (i.e., equally probable digits). Results: 11.6% (21 of 181) and 11.1% (7 of 63) of the statistical results published in Nature and BMJ, respectively, during 2001 were incongruent, probably mostly due to rounding, transcription, or type-setting errors. At least one such error appeared in 38% and 25% of the papers of Nature and BMJ, respectively. In 12% of the cases, the significance level might change by one or more orders of magnitude. The frequencies of the last digit of the statistics deviated from the uniform distribution, suggesting digit preference in rounding and reporting. Conclusions: This incongruence of test statistics and P values is another example that statistical practice is generally poor, even in the most renowned scientific journals, and that the quality of papers should be more closely controlled and valued.
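The congruence check described above can be sketched for the simple case of a standard-normal (z) statistic. This is an illustrative reconstruction, not the authors' actual procedure, and it ignores one-sided tests and statistics with degrees of freedom:

```python
import math

def p_from_z(z: float) -> float:
    """Two-sided P value for a standard-normal test statistic z."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def is_congruent(z: float, reported_p: float, decimals: int = 3) -> bool:
    """True if the reported P value matches the one recomputed from the
    test statistic after both are rounded to the reported precision:
    this tolerates rounding but catches transcription or type-setting
    errors of the kind the study found."""
    return round(p_from_z(z), decimals) == round(reported_p, decimals)

print(is_congruent(1.96, 0.050))  # a consistent report
print(is_congruent(1.96, 0.005))  # flagged: off by an order of magnitude
```

For t, F, or chi-square statistics the same idea applies, with the P value recomputed from the appropriate distribution and the reported degrees of freedom.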
Abstract:
This dissertation studies the effects of Information and Communication Technologies (ICT) on the banking sector and the payments system. It provides insight into how technology-induced changes occur by exploring both the nature and the scope of the main technological innovations and evidencing their economic implications for banks and payment systems. Some parts of the dissertation are descriptive: they summarise the main technological developments in the field of finance and link them to economic policies. These are complemented by sections that assess the extent of technology application to banking and payment activities. Finally, the dissertation also includes some work that borrows from the economic literature on banking. The need for an interdisciplinary approach arises from the complexity of the topic and the rapid pace of change to which it is subject. The first chapter provides an overview of the influence of developments in ICT on the evolution of financial services and international capital flows. We include the main indicators and discuss innovation in the financial sector, exchange rates and international capital flows. The chapter concludes with an impact analysis and policy options regarding the international financial architecture, some monetary policy issues and the role of international institutions. The second chapter is a technology assessment study that focuses on the relationship between technology and money. The application of technology to payment systems is transforming the way we use money and, in some instances, is blurring the definition of what constitutes money. This chapter surveys developments in electronic forms of payment and their relationship to the banking system. It also analyses the challenges posed by electronic money for regulators and policy makers, and in particular the opportunities created by two simultaneous processes: Economic and Monetary Union and the increasing use of electronic payment instruments.
The third chapter deals with the implications of developments in ICT on relationship banking. The financial intermediation literature explains relationship banking as a type of financial intermediation characterised by proprietary information and multiple interactions with customers. This form of banking is important for the financing of small and medium-sized enterprises. We discuss the effects of ICT on the banking sector as a whole and then apply these developments to the case of relationship banking. The fourth chapter is an empirical study of the effects of technology on the banking business, using a sample of data from the Spanish banking industry. The design of the study is based on some of the events described in the previous chapters, and also draws from the economic literature on banking. The study shows that developments in information management have differential effects on wholesale and retail banking activities. Finally, the last chapter is a technology assessment study on electronic payments systems in Spain and the European Union. It contains an analysis of existing payment systems and ongoing or planned initiatives in Spain. It forms part of a broader project comprising a series of country-specific analyses covering ten European countries. The main issues raised across the countries serve as the starting point to discuss implications of the development of electronic money for regulation and policies, and in particular, for monetary-policy making.
Abstract:
What do designers tend to achieve? They relate themselves to reality either by producing visual registers of emotions and thoughts, or by designing and producing functional objects, adapting technologies to daily needs. This requires that a designer be a keen observer of his physical surroundings and have a fine sensibility to cultures, enabling him to disassemble the latent forms of reality and its cultural symbolisms in order to perceive the order underlying them and the principles of their composition and unity. Only then can he reproduce nature and respond to cultural callings. In this process of understanding the surrounding reality of nature and cultures, a designer always moves, generally without being aware of it, between two processes: identity search and self-identification.
Abstract:
Regulatory agencies such as Europol, Frontex, Eurojust and CEPOL, as well as bodies such as OLAF, have over the past decade become increasingly active within the institutional architecture constituting the EU's Area of Freedom, Security and Justice, and are now at the forefront of implementing and developing the EU's internal security model. A prominent feature of agency activity is the large-scale proliferation of 'knowledge' on security threats via the production of policy tools such as threat assessments, risk analyses, and periodic and situation reports. These instruments now play a critical role in providing the evidence base that supports EU policymaking, with agency-generated 'knowledge' feeding political priority-setting and decision-making within the EU's new Internal Security Strategy (ISS). This paper examines the nature and purpose of the knowledge generated by EU Home Affairs agencies. Where does this knowledge originate? How does it measure up against criteria of objectivity, scientific rigour, reliability and accuracy? And how is it processed in order to frame threats, justify actions and set priorities under the ISS?