1000 results for EUROPE


Relevance:

20.00%

Publisher:

Abstract:

Review of Memory and Gender in Medieval Europe, 900-1200 by Elizabeth van Houts (Toronto UP, 1999).

Relevance:

20.00%

Publisher:

Abstract:

This work focuses on the role of macroseismology in the assessment of seismicity and probabilistic seismic hazard in Northern Europe. The main type of data under consideration is the set of macroseismic observations available for a given earthquake. The macroseismic questionnaires used to collect earthquake observations from local residents since the late 1800s constitute a special part of the seismological heritage in the region.

Information on the earthquakes felt on the coasts of the Gulf of Bothnia between 31 March and 2 April 1883 and on 28 July 1888 was retrieved from contemporary Finnish and Swedish newspapers, while the earthquake of 4 November 1898 GMT is an example of an early systematic macroseismic survey in the region. A data set of more than 1200 macroseismic questionnaires is available for the earthquake in Central Finland on 16 November 1931. Basic macroseismic investigations, including the preparation of new intensity data point (IDP) maps, were conducted for these earthquakes, and previously disregarded usable observations were found in the press. The improved collection of IDPs for the 1888 earthquake shows that this event was a rare occurrence in the area: in contrast to earlier notions, it was felt on both sides of the Gulf of Bothnia. The data on the earthquake of 4 November 1898 GMT were augmented with historical background information discovered in various archives and libraries. This earthquake was of some concern to the authorities, because extra fire inspections were conducted in at least three towns, namely Tornio, Haparanda and Piteå, located in the centre of the area of perceptibility. This event thus posed the indirect hazard of fire, although its magnitude of around 4.6 was minor on the global scale. The distribution of slightly damaging intensities was larger than previously outlined, which may have resulted from amplification of the ground shaking in the soft soils of the coast and river valleys where most of the population lived.

The large data set of the 1931 earthquake provided an opportunity to apply statistical methods and to assess methodologies for dealing with macroseismic intensity. The data set was evaluated using correspondence analysis, and different approaches such as gridding were tested to estimate the macroseismic field from intensity values distributed irregularly in space. In general, the characteristics of intensity warrant careful consideration; a more pervasive perception of intensity as an ordinal quantity affected by uncertainties is advocated.

A parametric earthquake catalogue comprising entries from both the macroseismic and instrumental eras was used for probabilistic seismic hazard assessment. The parametric-historic methodology was applied to estimate seismic hazard at a given site in Finland and to prepare a seismic hazard map for Northern Europe. The interpretation of these results is an important issue, because the recurrence times of damaging earthquakes may well exceed thousands of years in an intraplate setting such as Northern Europe. This application may therefore be seen as an example of short-term hazard assessment.
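A minimal sketch of the gridding idea mentioned above, under stated assumptions: the cell size, coordinates and example values are illustrative only and do not reproduce the thesis's actual procedure. The median is used per cell because macroseismic intensity is an ordinal quantity, for which a mean is not strictly meaningful.

```python
# Illustrative sketch: aggregate irregularly spaced intensity data points
# (IDPs) onto a regular lon/lat grid, taking the median per cell because
# intensity is an ordinal quantity. Cell size and sample data are assumed.
from collections import defaultdict
from statistics import median

def grid_intensities(idps, cell_deg=0.5):
    """idps: iterable of (lon, lat, intensity); returns {cell index: median intensity}."""
    cells = defaultdict(list)
    for lon, lat, intensity in idps:
        key = (int(lon // cell_deg), int(lat // cell_deg))
        cells[key].append(intensity)
    return {key: median(values) for key, values in cells.items()}

# Hypothetical observations (lon, lat, intensity class) for demonstration.
example_idps = [(24.1, 65.8, 5), (24.3, 65.9, 4), (23.0, 65.6, 3), (25.2, 66.1, 4)]
print(grid_intensities(example_idps))
```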

Relevance:

20.00%

Publisher:

Abstract:

The role of people as buyers and eaters of food has changed significantly. From being protected by a paternalistic welfare state, people appear to be accorded more freedom and responsibility as individuals, and attention is redirected from the state towards market relations. Many have asserted that these changes are accompanied by fragmentation, individualisation and privatisation, leading to individual uncertainty and lack of confidence. But empirical observations do not always confirm this: distrust is not necessarily growing, and while responsibilities may change, the state still plays an active role.

This dissertation explores the changing relationships between states and markets, on the one hand, and ordinary people in their capacities as consumers and citizens, on the other. Do we see the emergence of new forms of regulation of food consumption? If so, what is their scope and what are their characteristics? Theories of regulation addressing questions about individualisation and self-governance are combined with a conceptualisation of consumption as processes of institutionalisation, involving daily routines, the division of labour between production and consumption, and the institutional field in which consumption is embedded. The analyses focus on the involvement of the state, food producers and scientific (primarily nutritional) expertise in regulating consumption, and on popular responses. Two periods emerge as important: first, when the ideas of “designing the good life” emerged, giving the state a very particular role in regulating food consumption, and, second, when this “designing” was replaced by ideas of choice and individual responsibility. One might say that “consumer choice” has become a mode of regulation. I use mainly historical studies from Norway to analyse the shifting role of the state in regulating food consumption, complemented with population surveys from six European countries to study how modernisation processes are associated with trust.

The studies find that changing regulation is not only a question of societal or state versus individual responsibilities; degrees of organisation and formalisation are important as well. While increasing organisation may represent discipline and abuse of power (including exploitation of consumer loyalty), organisation can also provide the consumer with higher predictability, systems to deal with malfeasance, and efficiency, which may create conditions for acting. The welfare state and the neo-liberal state offer very different types of solutions. The welfare-state solution is based on (national) egalitarianism, paternalism and discipline (of the market as well as of households), and such solutions are still prominent in Norway. Individualisation and self-regulation may represent a regulatory response not only to the declining legitimacy of this kind of interventionism, but also to increasing organisational complexity. This is reflected in large-scale re-regulation of markets as well as in relationships with households and consumers. To the consumer, individualisation of responsibility is not a matter of the number of choices presented on the shelves, but of how choice as a form of consumer-based involvement is institutionalised. It is recognition of people as “end-consumers” and as social actors, with systems of empowerment both politically and via the provisioning system.

‘Consumer choice’ as a regulatory strategy includes not only communicative efforts to make people into “choosing consumers”, but also the provision of institutions which recognise consumer interests and agency. Where this is lacking, we find distrust as an expression of powerlessness. Individual responsibility-taking represents agency and is not always a matter of loyal support for shared goals, but can involve protest and creativity. More informal (‘communitarian’) innovations may be an indication of this, where self-realisation is intimately combined with responsibility for social problems. But as solutions to counteract existing imbalances of power in the food market, the impact of such initiatives probably lies more in consumer mobilisation and politicisation than in alternative provisioning.

Relevance:

20.00%

Publisher:

Abstract:

A growing body of empirical research examines the structure and effectiveness of corporate governance systems around the world. An important insight from this literature is that corporate governance mechanisms address the excessive use of managerial discretion to extract private benefits by expropriating value from shareholders. One possible way of expropriation is to reduce the quality of disclosed earnings by manipulating the financial statements. According to the value relevance theorem, this lower quality of earnings should then be reflected in the firm's stock price. Hence, instead of testing the direct effect of corporate governance on the firm's market value, it is important to understand the causes of lower-quality accounting earnings. This thesis contributes to the literature by increasing knowledge about the extent of earnings management – measured as the extent of discretionary accruals in total disclosed earnings – and its determinants across transitional European countries. The thesis comprises three essays of empirical analysis, of which the first two utilise data on Russian listed firms, whereas the third essay uses data from 10 European economies.

More specifically, the first essay adds to existing research connecting earnings management to corporate governance. It tests the impact of the Russian corporate governance reforms of 2002 on the quality of disclosed earnings in all publicly listed firms. This essay provides empirical evidence that the desired impact of the reforms is not fully realised in Russia without proper enforcement. Instead, firm-level factors such as long-term capital investments and compliance with International Financial Reporting Standards (IFRS) determine the quality of earnings. The results presented in the essay support the notion proposed by Leuz et al. (2003) that reforms aimed at bringing transparency do not produce the desired results in economies where investor protection is low and legal enforcement is weak.

The second essay focuses on the relationship between internal control mechanisms, such as the types and levels of ownership, and the quality of disclosed earnings in Russia. The empirical analysis shows that controlling shareholders in Russia use their powers to manipulate reported performance in order to obtain private benefits of control. Comparatively, firms owned by the State have significantly better quality of disclosed earnings than firms with other controllers such as oligarchs and foreign corporations. Interestingly, the market performance of firms controlled by either the State or oligarchs is better than that of widely held firms.

The third essay provides evidence that both ownership structures and economic characteristics are important factors in determining the quality of disclosed earnings in three groups of European countries. The evidence suggests that ownership structure is the more important determinant in developed and transparent countries, while economic characteristics matter more in developing and transitional countries.
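Purely as an illustration of how discretionary accruals are often estimated in this literature, the sketch below implements a modified Jones-type regression; the thesis does not necessarily use this exact specification, and the variable names, scaling and grouping are assumptions.

```python
# Sketch of a modified Jones model: within an industry-year group, regress
# scaled total accruals on 1/lagged assets, scaled (change in revenue minus
# change in receivables) and scaled gross PPE; the residuals are treated as
# discretionary accruals. All inputs and names are illustrative assumptions.
import numpy as np

def discretionary_accruals(total_accruals, lagged_assets, delta_rev, delta_rec, ppe):
    """All inputs are 1-D arrays over firm-years within one estimation group."""
    ta = total_accruals / lagged_assets
    x = np.column_stack([
        1.0 / lagged_assets,
        (delta_rev - delta_rec) / lagged_assets,
        ppe / lagged_assets,
    ])
    coef, *_ = np.linalg.lstsq(x, ta, rcond=None)
    return ta - x @ coef  # residuals = discretionary component
```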

Relevance:

20.00%

Publisher:

Abstract:

This article analyses the results of five Eurobarometer surveys (from 1995, 1997, 1998, 2000 and 2005) designed to measure which languages Europeans consider most useful to know. Most Europeans are of the opinion that English is the most useful, followed by French and German. During the last decade the popularity of French and German as useful languages has decreased significantly, while English has remained universally favoured as the most useful language. French and German have lost their popularity especially among those who do not speak them as a foreign language. On the other hand, Spanish, Russian and other languages (often including languages of neighbouring countries, minority languages or a second official language of the country in question) have maintained or even increased their former level of popularity. Opinions about useful languages vary according to a respondent’s knowledge of languages, education and profession. This article analyses these differences and discusses their implications for the study and the future use of foreign languages in Europe.

Relevance:

20.00%

Publisher:

Abstract:

For the past two centuries, nationalism has been among the most influential legitimizing principles of political organization. According to its simple definition, nationalism is a principle or a way of thinking and acting which holds that the world is divided into nations, and that national and political units should be congruent. Nationalism can thus be divided into two aspects: internal and external. Internally, the political units, i.e. states, should be made up of only one nation; externally, each nation-state should be sovereign. Transnational governance of the rights of national minorities violates both these principles.

This study explores the formation, operation and effectiveness of the European post-Cold War minorities system. The study identifies two basic approaches to minority rights: security and justice. These approaches have been used to legitimize international minority politics, and they also inform the practice of transnational governance. The security approach is based on the recognition that the norm of national self-determination cannot be fulfilled in all relevant cases, so minority rights are offered as compensation to dissatisfied national groups, reducing their aspiration to challenge the status quo. From the justice perspective, minority rights are justified as a compensatory strategy against discrimination caused by majority nation-building.

The research concludes that the post-Cold War minorities system was justified on the basis of a particular version of the security approach, according to which only Eastern European minority situations are threatening, because of the ethnic variant of nationalism that exists in that region. This security frame was essential in internationalising minority issues and justifying the swift development of norms and institutions to deal with them. From the justice perspective, however, this approach is problematic, since it justified double standards in European minority politics: even though majority nation-building is often detrimental to minorities in Western Europe as well, Western countries can treat their minorities more or less however they choose.

One of the main contributions of this thesis is its detailed investigation of the operation of the post-Cold War minorities system. For the first decade after its creation in the early 1990s, the system operated mainly through its security track, which is based on the field activities of the OSCE, supported by the EU. The study shows how the effectiveness of this track was based on inter-organizational cooperation in which various transnational actors compensate for each other's weaknesses. After the enlargement of the EU and the dissolution of membership conditionality, this track, which was limited to Eastern Europe from the start, has become increasingly ineffective. Since the EU enlargement, the focus of the minorities system has shifted more and more towards its legal track, which is based on the Framework Convention for the Protection of National Minorities (Council of Europe). The study presents in detail how a network of like-minded representatives of governments, international organizations and independent experts was able to strengthen the framework convention's (originally weak) monitoring system considerably.

The development of the legal track allows for a more universal and consistent, justice-based approach to minority rights in contemporary Europe, but the nationalist principle of organization still severely hinders the materialization of this possibility.

Relevance:

20.00%

Publisher:

Abstract:

Embryonic stem cells potentially offer ground-breaking insights into health and disease and are said to offer hope of discovering cures for many ailments, unimaginable only a few years ago. Human embryonic stem cells are undifferentiated, immature cells that possess an amazing ability to develop into almost any body cell, such as heart muscle, bone, nerve and blood cells, and possibly even organs in due course. This remarkable feature, combined with their ability to proliferate indefinitely in vitro (in a test tube), has branded them as a so-called ‘miracle cure’. Their potential use in clinical applications provides hope to many sufferers of debilitating and fatal medical conditions. However, the emergence of stem cell research has resulted in intense debates about its promises and dangers. On the one hand, advocates hail its potential to alleviate and even cure fatal and debilitating diseases such as Parkinson’s disease, diabetes and heart ailments. On the other hand, opponents decry its dangers, drawing attention to the inherent risks of human embryo destruction, cloning for research purposes and, eventually, reproductive cloning.

Lately, however, the policy battles surrounding human embryonic stem cell innovation have shifted from the controversy over the research itself to disputes over intellectual property rights. In fact, the ability to obtain patents represents a pivotal factor in the economic success or failure of this new biotechnology. Although stem cell patents tend to more or less satisfy the standard patentability requirements, they also raise serious ethical and moral questions about the meaning of the exclusions on ethical or moral grounds found in European and, to an extent, American and Australian patent laws. At present there is considerable turmoil over human embryonic stem cell patents in Europe and, to an extent, in Australia and the United States. This has created a sense of urgency to engage all relevant parties in the discourse on how best to approach the patenting of this new form of scientific innovation. In essence, this should become a highly favoured patenting priority; otherwise, stem cell innovation and its reliance on patent protection risk turmoil, uncertainty, confusion and even a halt not only to stem cell research but also to further emerging biotechnology research and development.

The patent system is premised upon the fundamental principle of balance, which ought to ensure that the temporary monopoly awarded to the inventor equals the social benefit provided by the disclosure of the invention. Ensuring and maintaining this balance within the patent system when patenting human embryonic stem cells is of crucial contemporary relevance. Yet the patenting of human embryonic stem cells raises fundamental moral, social and legal questions. Overall, the present approach to patenting human embryonic stem cell related inventions is unsatisfactory and ineffective. This draws attention to the specific question that provides the conceptual framework for this work: how can the investigated patent offices successfully deal with the patentability of human embryonic stem cells? This in turn points to the thorny issue of the application of the morality clause in this field, in particular the interpretation of the exclusions on ethical or moral grounds found in Australian, American and European legislative and judicial precedents.

The Thesis seeks to compare the laws and legal practices surrounding the patentability of human embryonic stem cells in Australia and the United States with those of Europe. By using Europe as the primary case study for lessons and guidance, the central goal of the Thesis becomes the determination of the type of solutions available to Europe, with prospects of applying them to Australia and the United States. The Dissertation sets out to define the ethical implications that arise with patenting human embryonic stem cells and to offer resolutions to the key ethical dilemmas surrounding the patentability of human embryonic stem cells and other morally controversial biotechnology inventions. In particular, the goal of the Thesis is to propose a functional framework that may be used as a benchmark for an informed discussion on resolving the ethical and legal tensions that come with the patentability of human embryonic stem cells in the Australian, American and European patent worlds. Key research questions that arise from these objectives and which thread continuously throughout the monograph are:

1. How do common law countries such as Australia and the United States approach and deal with the patentability of human embryonic stem cells in their jurisdictions? These practices are then compared with the situation in Europe as represented by the United Kingdom (first two chapters) and the decisions of the Court of Justice of the European Union and the European Patent Office (Chapter 3 onwards), in order to obtain a full picture of present patenting procedures on European soil.

2. How are ethical and moral considerations taken into account at the investigated patent offices when assessing the patentability of human embryonic stem cell related inventions? In order to assess this, the Thesis evaluates how ethical issues that arise with patent applications are dealt with by: a) the legislative history of the modern patent system from its inception in 15th-century England to present-day patent laws; b) the Australian, American and European patent offices, presently and in the past, including other relevant legal precedents on the subject matter; c) normative ethical theories; d) the notion of human dignity used as the lowest common denominator for the interpretation of the European morality clause.

3. Given the existence of the morality clause in the form of Article 6(1) of Directive 98/44/EC of the European Parliament and of the Council of 6 July 1998 on the legal protection of biotechnological inventions, which corresponds to Article 53(a) of the European Patent Convention, a special emphasis is put on Europe as a guiding principle for Australia and the United States. Any room for improvement of the European morality clause and of Europe’s current manner of evaluating the ethical tensions surrounding human embryonic stem cell inventions is examined.

4. A summary of the options (as represented by Australia, the United States and Europe) available as a basis for an optimal examination procedure for human embryonic stem cell inventions is presented, and the best of these alternatives is selected in order to create a benchmark framework. This framework is then promoted as a tool to assist Europe (as represented by the European Patent Office) in examining human embryonic stem cell patent applications. This method suggests the possibility of implementing an institutional solution.

5. Ultimately, the question of whether such a reformed European patent system can be used as a foundation for potential patent reform in Australia and the United States when examining human embryonic stem cells or other morally controversial inventions is surveyed.

The author wishes to emphasise that the guiding thought throughout this work is to convey the significance of identifying, analysing and clarifying the ethical tensions surrounding the patenting of human embryonic stem cells, and ultimately to present a solution that adequately assesses the patentability of human embryonic stem cell inventions and related biotechnologies. In answering the key questions above, the Thesis strives to contribute to the broader stem cell debate about how and to what extent ethical and social positions should be integrated into the patenting procedure in the pluralistic and morally divided democracies of Europe and, subsequently, Australia and the United States.

Relevance:

20.00%

Publisher:

Abstract:

This study compares microsite occupancy and the spatial structure of regeneration in three areas of late-successional Norway spruce dominated forest. Pallas-Ylläs is understood to have been influenced only by small-scale disturbance; Dvina-Pinega has had sporadic larger-scale disturbances; Kazkim has been affected by fire. All spruce and birch trees with diameter at breast height (DBH) ≥ 10 cm were mapped in five stands on 40 m x 400 m transects, and those with DBH < 10 cm on 2 or 4 m x 400 m subplots. Microsite type was inventoried at 1 m intervals along the centre line and for each tree with DBH < 10 cm. At all study areas small seedlings (h < 0.3 m, DBH < 10 cm) preferentially occupied disturbed microsites. In contrast, spruce saplings (h ≥ 1.3 m, DBH < 10 cm) at all study areas showed less, or no, preference. At Pallas-Ylläs spruce seedlings (h < 1.3 m, DBH < 10 cm) and saplings (h ≥ 1.3 m, DBH < 10 cm) exhibited spatial correlation at scales of 32-52 m. At Dvina-Pinega saplings of both spruce and birch exhibited spatial correlation at scales of 32-81 m. At Kazkim spatial correlation of seedlings and saplings of both species was exhibited over variable distances. No spatial cross-correlation was found between overstorey basal area (DBH ≥ 10 cm) and regeneration (h ≥ 1.3 m, DBH < 10 cm) at any study area. The results confirm the importance of disturbed microsites for seedling establishment, but suggest that undisturbed microsites may sometimes be more advantageous for long-term tree survival. The regeneration gap concept may not be useful in describing the regeneration dynamics of late-successional boreal forests.
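As a rough, hypothetical illustration of how spatial correlation of regeneration along a transect could be examined (not necessarily the method used in the study above), the sketch below correlates stem counts in fixed-length transect segments at increasing lag distances; the segment length and lag range are assumptions.

```python
# Illustrative transect correlogram: correlation of per-segment stem counts
# at increasing lag distances. Segment length and maximum lag are assumed.
import numpy as np

def transect_correlogram(counts, segment_m=4.0, max_lag_m=100.0):
    """counts: stems per consecutive transect segment; returns [(lag_m, r), ...]."""
    counts = np.asarray(counts, dtype=float)
    results = []
    for lag in range(1, int(max_lag_m / segment_m) + 1):
        if lag >= counts.size - 1:
            break
        a, b = counts[:-lag], counts[lag:]
        if a.std() > 0 and b.std() > 0:
            results.append((lag * segment_m, float(np.corrcoef(a, b)[0, 1])))
    return results
```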

Relevance:

20.00%

Publisher:

Abstract:

A thunderstorm is a dangerous electrical phenomenon in the atmosphere. A thundercloud is formed when thermal energy is transported rapidly upwards in convective updraughts. Electrification occurs in collisions between cloud particles in the strong updraught. When the amount of charge in the cloud is large enough, electrical breakdown, better known as a flash, occurs. Lightning location is nowadays an essential tool for the detection of severe weather. Located flashes indicate in real time the movement of hazardous areas and the intensity of lightning activity, and an estimate of the flash peak current can also be determined. The observations can be used in damage surveys. The simplest way to represent lightning data is to plot the locations on a map, but the data can also be processed into more complex end-products and exploited in data fusion. Lightning data also serve as an important tool in research on lightning-related phenomena, such as Transient Luminous Events. Most of the world's thunderstorms occur in areas with plenty of heat, moisture and tropospheric instability, for example in tropical land areas. At higher latitudes, as in Finland, the thunderstorm season is practically restricted to the summer. A particular feature of high-latitude climatology is the large annual variation, which applies to thunderstorms as well. Knowing the performance of any measuring device is important because it affects the accuracy of the end-products. In lightning location systems, the detection efficiency is the ratio of located flashes to flashes that actually occurred. Because in practice it is impossible to know the true number of flashes that occurred, the detection efficiency has to be estimated with theoretical methods.
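One simple theoretical way to estimate detection efficiency (offered here only as an illustrative sketch, not the method used in the work above) is to assume a distribution of flash peak currents and compute the probability that a flash exceeds the smallest current the network can detect at a given distance; the log-normal parameters and the threshold model below are assumptions.

```python
# Illustrative sketch: detection efficiency as the probability that a flash's
# peak current exceeds the minimum detectable current at a given distance,
# assuming a log-normal peak-current distribution and a minimum detectable
# current that grows linearly with distance (signal ~ current / distance).
import math

MEDIAN_KA = 30.0   # assumed median peak current of first strokes (kA)
SIGMA_LN = 0.6     # assumed log-normal standard deviation (natural log)

def detection_efficiency(distance_km, threshold_at_100km_ka=5.0):
    i_min = threshold_at_100km_ka * distance_km / 100.0
    z = (math.log(i_min) - math.log(MEDIAN_KA)) / SIGMA_LN
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # log-normal survival function

for d in (50, 100, 200, 400, 800):
    print(f"{d:4d} km: DE ~ {detection_efficiency(d):.2f}")
```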