Abstract:
Contribution to ARI Remix. ARI Remix is a three-year digital humanities, artist-interview and oral-history project collecting and presenting memories of Australian artist-run culture in Queensland, New South Wales and the Australian Capital Territory between 1980 and 2000. Its focus is fleshing out and illuminating the ephemeral and neglected histories of the many lively and socially engaged artistic scenes along the east coast of Australia during the last two decades of the 20th century.
Abstract:
Wild salmon stocks in the northern Baltic rivers became endangered in the second half of the 20th century, mainly due to recruitment overfishing. As a result, supplementary stocking was widely practised, and supplementation of the Tornionjoki salmon stock took place over a 25-year period until 2002. The stock has been closely monitored by electrofishing, smolt trapping, mark-recapture studies, catch samples and catch surveys. Background information on hatchery-reared stocked juveniles was also collected for this study. Bayesian statistics were applied to the data, as this approach makes it possible to bring prior information into the analysis, offers an advanced ability to incorporate uncertainty, and provides probabilities for a multitude of hypotheses. Substantial divergences between reared and wild Tornionjoki salmon were identified in both demographic and phenological characteristics. The divergences tended to be larger the longer the time spent in the hatchery and the more favourable the hatchery conditions were for fast growth. Differences in environment likely induced most of the divergences, but selection of brood fish might have resulted in genotypic divergence in the maturation age of reared salmon. Survival of stocked one-year-old juveniles to the smolt stage varied from about 10% to about 25%. Stocking on the lower reach of the river seemed to decrease survival, and the negative effect of stocking volume on survival raises concern about possible similar effects on the extant wild population. Post-smolt survival of wild Tornionjoki smolts was on average two times higher than that of smolts stocked as parr and 2.5 times higher than that of stocked smolts. Smolts of the different groups showed synchronous variation and similar long-term survival trends. Both groups of reared salmon were more vulnerable to offshore driftnet and coastal trapnet fishing than wild salmon. Average survival from smolt to spawner of wild salmon was 2.8 times higher than that of salmon stocked as parr and 3.3 times higher than that of salmon stocked as smolts. Wild salmon and salmon stocked as parr were found to have similar lifetime survival rates, while stocked smolts had a lifetime survival rate over four times higher than the two other groups. If eggs are collected from wild brood fish, stocking parr would therefore not be a sensible option. Stocking smolts instead would create a net benefit in terms of the number of spawners, but this strategy has serious drawbacks and risks associated with the larger phenotypic and demographic divergences from wild salmon. Supplementation was shown not to be the key factor behind the recovery of the Tornionjoki and other northern Baltic salmon stocks. Instead, a combination of restrictions in the sea fishery and the simultaneous occurrence of favourable natural conditions for survival were the main reasons for the revival in the 1990s. This study questions the effectiveness of supplementation as a conservation management tool. The benefits of supplementation seem at best limited. Relatively high occurrences of reared fish in catches may generate false optimism concerning the effects of supplementation. Supplementation may also entail genetic risks due to problems in brood fish collection and to artificial rearing with relaxed natural selection and domestication. Appropriate management of fisheries is the main alternative to supplementation; without it, all other efforts for the long-term maintenance of a healthy fish resource fail.
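The abstract does not detail the Bayesian models used; the snippet below is a minimal, purely illustrative sketch of the kind of calculation it alludes to: a beta-binomial estimate of parr-to-smolt survival from hypothetical release and recapture counts. The prior, the counts and the 10% threshold are invented for illustration and are not taken from the thesis.

```python
# Hedged sketch: beta-binomial posterior for parr-to-smolt survival.
# All numbers are hypothetical; the thesis combined several data sources
# (electrofishing, smolt trapping, mark-recapture) in richer models.
from scipy import stats

released = 2000           # hypothetical number of stocked one-year-old parr
survived_to_smolt = 340   # hypothetical number estimated to reach the smolt stage

# Weakly informative Beta(1, 1) prior on survival, binomial likelihood.
posterior = stats.beta(1 + survived_to_smolt, 1 + released - survived_to_smolt)

print(f"posterior mean survival: {posterior.mean():.3f}")
print(f"95% credible interval:   {posterior.interval(0.95)}")
print(f"P(survival > 0.10):      {1 - posterior.cdf(0.10):.3f}")
```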
Abstract:
The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves and local light element abundances into the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data exploded, incorporating new, exciting cosmological observables such as lensing, Lyman alpha forests, type Ia supernovae, baryon acoustic oscillations and Sunyaev-Zeldovich regions, to name a few. -- The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, encoded in delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred precision. This progress has made it possible to build and test models of the Universe that differ in how the cosmos evolved during a fraction of the first second after the Big Bang. -- This thesis is concerned with high precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The approximate methods studied are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage. -- We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. Next we discuss the map-making problem of a CMB experiment and the characterization of residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model, that of correlated isocurvature fluctuations. Currently available data are shown to indicate that future experiments are certainly needed to provide more information on these extra degrees of freedom. Any solid evidence of isocurvature modes would have a considerable impact due to their power in model selection.
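The thesis itself develops destriping in detail; the toy sketch below only illustrates the basic idea under strong simplifying assumptions: one constant offset per scan segment stands in for the low-frequency noise, and the offsets and the binned map are estimated alternately. Pixel counts, segment lengths and noise levels are invented, and this is not a real CMB pipeline.

```python
# Hedged sketch of destriping map-making: time-ordered data (TOD) are modelled
# as sky signal seen through a pointing matrix, plus one unknown offset per
# scan segment, plus white noise; offsets and the binned map are solved
# for alternately. Toy data only.
import numpy as np

rng = np.random.default_rng(0)
npix, nseg, seg_len = 50, 40, 200
pointing = rng.integers(0, npix, nseg * seg_len)       # pixel hit by each sample
sky = rng.normal(0.0, 1.0, npix)                       # toy sky map
true_offsets = rng.normal(0.0, 5.0, nseg)              # low-frequency noise as offsets
tod = sky[pointing] + np.repeat(true_offsets, seg_len) + rng.normal(0.0, 0.1, nseg * seg_len)

offsets = np.zeros(nseg)
for _ in range(50):                                    # alternate map <-> offset updates
    cleaned = tod - np.repeat(offsets, seg_len)
    hits = np.bincount(pointing, minlength=npix)
    m = np.bincount(pointing, weights=cleaned, minlength=npix) / hits
    offsets = (tod - m[pointing]).reshape(nseg, seg_len).mean(axis=1)

# The destriped map is recovered up to a global constant.
print("rms map error:", np.std((m - m.mean()) - (sky - sky.mean())))
```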
Abstract:
Since the second half of the 20th century, cancer has become a dominant disease in Western countries, endangering people regardless of age, gender, race or social status. Every year almost eight million people die of cancer worldwide. In Finland every fourth person is expected to fall ill with cancer at some stage of his or her life. During the 20th century, along with rapid changes in the medical system, people's awareness of cancer has increased a great deal. This has also influenced the image of cancer in popular discourse over the past decades. However, from the scientific point of view there is still much that is unclear about the disease. This thesis shows that this is a big problem for ordinary people, as, according to culture-bound illness ideology, people need an explanation of the origin of their illness in order to help them cope. The main aim of this thesis is to examine the process of being ill with cancer from the patient's point of view, in order to analyse attitudes and behaviour towards cancer and its significance and culture-bound images. This narrative-based study concentrates on patients' 'voicings', which are important in understanding the cancer experience and when attempting to make it more open within current cultural and societal settings. The Kun sairastuin syöpään ('When I fell ill with cancer') writing competition, organised by Suomen Syöpäpotilaat ry (the Finnish Cancer Patients Association), Suomen Syöpäyhdistys ry (the Finnish Cancer Union), and Suomalaisen Kirjallisuuden Seuran kansanrunousarkisto (the Finnish Literary Society Folklore Archive), was announced on the 1st of May 1994 and lasted until the 30th of September 1994. As a result, a total of 672 cancer narratives, totalling 6384 pages and filled with experiences relating to cancer, were received. Written cancer narratives form a body of empirical data that is suitable for content or textual analysis. In this thesis, content analysis is adopted in order to become familiar with the texts and to preselect the themes and analytical units for further examination. I use multiple perspectives in order to interpret cancer patients' ideas and reasoning. The ethnomedical approach unites popular health beliefs that originated in Finnish folk medicine, as well as connecting alternative medicine, which patients make use of, with biomedicine, the dominant form of medicine today. In addition to this, patients' narratives, which are composed of various structural segments, are approached from the folklorist's perspective. In this way they can be seen as short pathographies, reconstructions of self-negotiation and individual decision making during the illness process. Above all, cancer patients' writings describe their feelings, thoughts and experiences, factors that appear insignificant to modern medicine, overwhelmed as it is by medical technologies that concentrate on dysfunctional tissue within diseased bodies. Ethnomedical study of cancer patients' writings gives access to the human side of cancer discourse, and combines both medical and popular knowledge of cancer. In my view, the natural world and glimpses of tradition are bound together with one general aim within cancer narratives: to tackle the illness and mediate its meanings. Furthermore, the narrative approach reveals that participants write with the hope of offering a different interpretation of the cancer experience, and thus of confronting culturally pre-defined images and ideologies.
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by a French scientist, P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, which still prevails, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
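As a small illustration of the kind of calculation Laplace's 1781 plan required, the sketch below computes, in modern notation, the sample size needed to estimate a population proportion to a chosen accuracy using the normal approximation to the binomial (the Central Limit Theorem mentioned above). The confidence levels and margins of error are illustrative and are not Laplace's own figures.

```python
# Hedged sketch: sample size for estimating a proportion to a given accuracy,
# via the normal approximation to the binomial. Illustrative only.
import math

Z = {0.90: 1.644854, 0.95: 1.959964, 0.99: 2.575829}  # two-sided normal quantiles

def sample_size(margin_of_error, confidence=0.95, p=0.5):
    """Smallest n such that the normal-approximation interval for a
    proportion p has half-width <= margin_of_error."""
    z = Z[confidence]
    return math.ceil((z / margin_of_error) ** 2 * p * (1 - p))

print(sample_size(0.01))  # about 9604 units for +/- 1 percentage point
print(sample_size(0.03))  # about 1068 units for +/- 3 percentage points
```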
Abstract:
Trafficking in human beings has become one of the most talked about criminal concerns of the 21st century. But this is not all that it has become. Trafficking has also been declared one of the most pressing human rights issues of our time. In this sense, it has become a part of the expansion of the human rights phenomenon. Although it is easy to see that the crime of trafficking violates several of the human rights of its victims, it is still, in its essence, a fairly conventional, although particularly heinous and often transnational, crime, consisting of acts between private actors and therefore lacking the vertical effect traditionally associated with human rights violations. This thesis asks, then, why and how the anti-trafficking campaign has been translated into human rights language. And, even more fundamentally: in light of the critical, theoretical studies surrounding the expansion of the human rights phenomenon, especially that of Costas Douzinas, who has declared that we have come to the end of human rights as a consequence of the expansion and bureaucratization of the phenomenon, can human rights actually bring salvation to the victims of trafficking? The thesis demonstrates that the translation of the anti-trafficking campaign into human rights language has been a complicated process involving various actors, including scholars, feminist NGOs, local activists and global human rights NGOs. It has also been driven by a complicated web of interests, the most prevalent one, the sincere will to help the victims, having become entangled with other aims, such as political, economic and structural goals. As a consequence of its fragmented background, the human rights approach to trafficking still seeks its final form and consists of several different claims. After an assessment of these claims from a legal perspective, this thesis concludes that the approach is most relevant regarding the mistreatment of victims of trafficking at the hands of state authorities. It seems to be quite common that authorities have trouble identifying the victims of trafficking, which means that the rights granted to them in international and national documents are not realized in practice; instead, victims of trafficking are systematically deported as illegal immigrants. It is argued that in order to understand the measures of the authorities, and to assess the usefulness of human rights, it is necessary to adopt a Foucauldian perspective and to view these measures as biopolitical defence mechanisms. From a biopolitical perspective, the victims of trafficking can be seen as a threat to the population, a threat that must be eliminated either by assimilating them into the main population with the help of disciplinary techniques, or by excluding them completely from society. This biopolitical aim is accomplished through an impenetrable net of seemingly insignificant practices and discourses of which not even the participants are aware. As a result of these practices and discourses, trafficking victims, only very few of whom fit the myth of the 'perfect victim' produced by biopolitical discourses, become invisible and therefore subject to deportation as (risky) illegal immigrants, turning them into bare life in the Agambenian sense, represented by the homo sacer, who cannot be sacrificed, yet does not enjoy the protection of society and its laws.
It is argued, following Jacques Rancière and Slavoj Žižek, that human rights can, through their universality and formal equality, provide bare life with the tools to formulate political claims and thereby to use the politicization produced by its exclusion to return to the sphere of power and politics. Even though human rights have inevitably become entangled with biopolitical practices, they are still perhaps the most efficient way to challenge biopower. Human rights have not, therefore, become useless for the victims of trafficking, but they must be conceived as a universal tool for formulating political claims and challenging power. In the case of trafficking this means that human rights must be utilized to constantly renegotiate the borders of the problematic concept of the victim of trafficking created by international instruments, policies and discourses, including those that are sincerely aimed at providing help for the victims.
Abstract:
The estimation of water and solute transit times in catchments is crucial for predicting the response of hydrosystems to external forcings (climatic or anthropogenic). The hydrogeochemical signatures of tracers (either natural or anthropogenic) in streams have been widely used to estimate transit times in catchments, as they integrate the various processes at stake. However, most of these tracers are well suited only for catchments with mean transit times shorter than about 4-5 years. Since the second half of the 20th century, the intensification of agriculture has led to a general increase in the nitrogen load of rivers. As nitrate is mainly transported by groundwater in agricultural catchments, this signal can be used to estimate transit times greater than several years, even though nitrate is not a conservative tracer. Conceptual hydrological models can be used to estimate catchment transit times provided their consistency is demonstrated, based on their ability to simulate stream chemical signatures at various time scales and catchment-internal processes such as N storage in groundwater. The objective of this study was to assess whether a conceptual lumped model was able to simulate the observed patterns of nitrogen concentration at various time scales, from seasonal to pluriannual, and thus whether it was relevant for estimating nitrogen transit times in headwater catchments. A conceptual lumped model, representing shallow groundwater flow as two parallel linear stores with double porosity, and riparian processes by a constant nitrogen removal function, was applied to two paired agricultural catchments which belong to the research observatory ORE AgrHys. The Generalized Likelihood Uncertainty Estimation (GLUE) approach was used to estimate parameter values and uncertainties. The model performance was assessed on (i) its ability to simulate the contrasting patterns of stream flow and stream nitrate concentrations at seasonal and inter-annual time scales, (ii) its ability to simulate the patterns observed in groundwater at the same temporal scales, and (iii) the consistency of long-term simulations using the calibrated model with the general pattern of the nitrate concentration increase in the region since the beginning of the intensification of agriculture in the 1960s. The simulated nitrate transit times were found to be more sensitive to climate variability than to parameter uncertainty, and average values were found to be consistent with results from other studies in the same region involving modeling and groundwater dating. This study shows that a simple model can be used to simulate the main dynamics of nitrogen in an intensively polluted catchment and then be used to estimate the transit times of these pollutants in the system, which is crucial for guiding the design and assessment of mitigation plans. (C) 2015 Elsevier B.V. All rights reserved.
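The exact model equations are not given in the abstract; the sketch below is a minimal, hedged reading of the structure it describes: recharge split between two parallel linear stores drained with different time constants, with a constant fraction of the nitrogen flux removed in the riparian zone. Parameter values, initial storages and the input series are invented for illustration, and the double-porosity refinement is omitted; the study calibrated its parameters with GLUE against the ORE AgrHys observations.

```python
# Hedged sketch: two parallel linear stores with a constant riparian N removal.
# Units: water storages in mm (per m^2, so 1 mm = 1 L), N storages in mg/m^2,
# concentrations in mg/L. All values are illustrative, not calibrated.
import numpy as np

def run_model(recharge, conc_in, k_fast=0.05, k_slow=0.005, split=0.6, riparian_removal=0.3):
    """recharge: daily recharge (mm/day); conc_in: nitrate concentration of recharge (mg/L).
    Returns daily stream flow (mm/day) and stream nitrate concentration (mg/L)."""
    s_fast = s_slow = 100.0            # initial water storages (mm)
    n_fast = n_slow = 1000.0           # initial nitrogen storages (mg/m^2)
    flow, conc = [], []
    for r, c in zip(recharge, conc_in):
        s_fast += split * r;       n_fast += split * r * c
        s_slow += (1 - split) * r; n_slow += (1 - split) * r * c
        qf, qs = k_fast * s_fast, k_slow * s_slow       # linear-store outflows
        nf, ns = k_fast * n_fast, k_slow * n_slow       # nitrogen leaves with the water
        s_fast -= qf; s_slow -= qs; n_fast -= nf; n_slow -= ns
        q = qf + qs
        n_out = (nf + ns) * (1 - riparian_removal)      # constant riparian removal
        flow.append(q)
        conc.append(n_out / q)
    return np.array(flow), np.array(conc)

rng = np.random.default_rng(1)
recharge = rng.gamma(0.5, 2.0, 365)        # toy daily recharge series (mm/day)
conc_in = np.full(365, 50.0)               # toy recharge nitrate concentration (mg/L)
q, c = run_model(recharge, conc_in)
print(f"mean flow {q.mean():.2f} mm/day, mean stream nitrate {c.mean():.1f} mg/L")
```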
Abstract:
Background: Health expectancy is a useful tool for monitoring health inequalities. The evidence about recent changes in social inequalities in health expectancy is relatively scarce and inconclusive, and most studies have focused on Anglo-Saxon and central or northern European countries. The objective of this study was to analyse the changes in socioeconomic inequalities in disability-free life expectancy in a Southern European population, the Basque Country, during the first decade of the 21st century. Methods: This was an ecological cross-sectional study of temporal trends in the Basque population in 1999-2003 and 2004-2008. All-cause mortality rate, life expectancy, prevalence of disability and disability-free life expectancy were calculated for each period according to the deprivation level of the area of residence. The slope index of inequality and the relative index of inequality were calculated to summarize and compare the inequalities in the two periods. Results: Disability-free life expectancy decreased as area deprivation increased, both in men and in women. The difference between the most extreme groups in 2004-2008 was 6.7 years in men and 3.7 years in women. Between 1999-2003 and 2004-2008, socioeconomic inequalities in life expectancy decreased, while inequalities in disability-free life expectancy increased in men and decreased in women. Conclusions: This study found important socioeconomic inequalities in health expectancy in the Basque Country. These inequalities increased in men and decreased in women in the first decade of the 21st century, during which the Basque Country saw considerable economic growth.
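Disability-free life expectancy in studies of this kind is typically computed with Sullivan's method, which weights the person-years of a life table by the proportion free of disability in each age group. The sketch below shows that calculation on an invented, truncated abridged life table; the numbers are purely illustrative and are not the Basque data.

```python
# Hedged sketch of Sullivan's method for disability-free life expectancy (DFLE).
# Toy, truncated life table: all values are invented for illustration.
def sullivan_dfle(person_years, prevalence, radix=100_000):
    """person_years: Lx column of an abridged life table;
    prevalence: proportion with disability in each age group;
    radix: number of survivors at age 0 (l0)."""
    disability_free_years = sum(L * (1 - p) for L, p in zip(person_years, prevalence))
    return disability_free_years / radix

Lx = [495_000, 493_000, 490_000, 485_000, 470_000, 430_000, 350_000, 200_000]  # toy Lx
prev = [0.02, 0.02, 0.03, 0.05, 0.08, 0.15, 0.30, 0.55]                        # toy prevalence

print(f"LE   : {sum(Lx) / 100_000:.1f} years")
print(f"DFLE : {sullivan_dfle(Lx, prev):.1f} years")
```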
Abstract:
William Shakespeare's Hamlet (1601) has, since the Folio of 1623, been surrounded by an enormous and varied body of readings, ranging from critical and theoretical texts to the most diverse theatrical and film adaptations. Since the end of the 19th century, cinema has been adapting Shakespeare's plays, providing new points of view and suggestions for staging this work by bringing it to the screen countless times. From a long list of film adaptations of Hamlet, Franco Zeffirelli's mainstream Hamlet (1990) and Hamlet 2000 (2000), Michael Almereyda's independent film, make up the corpus selected for analysis in this dissertation. In dialogue with the ideas of critics and theorists who have developed studies on the concept of adaptation, such as André Bazin, Robert Stam and Linda Hutcheon, I suggest a de-hierarchization between the Shakespearean play and the films, and thus between literature/theatre and cinema. The final aim of this work is to propose a reflection on these films as potentially illuminating critical material for the study of the play, useful in discussing some of its most important themes and/or questions.
Abstract:
This is 'The history of contamination in sediments from the Mersey Estuary: development of a chronology for the contamination of the Mersey Estuary by heavy metals and organochlorines', a report produced by the Environment Agency in 1998. The report looks at the history of industrial contamination of the Mersey and Ribble Estuaries back to the early part of the last century, many decades before the start of monitoring programmes, providing a remarkably detailed picture of very complex changes. There is a clear record in the sediment of the contamination by each heavy metal (including Cu, Cr, Hg, Pb and Zn) and organochlorine chemical (including DDT isomers and PCB congeners) studied. The results of the study clearly show the increases in levels of contamination as industry expanded early last century, followed by various improvements as this century progressed. Each pollutant has its own idiosyncratic pattern of change, with some improvements predating modern environmental concerns whilst other changes seem to relate directly to recent improvements in legislative control. Overall, for the pollutants studied, the results clearly demonstrate the magnitude of the improvement that has been achieved in what was a very polluted area. The only major reservation is that, despite the wide range of substances covered, many other potentially important pollutants remain to be studied in a similar manner.
Abstract:
A sediment core was collected from the centre of Wanghu Lake, in the Middle Reaches of the Yangtze River. The recent part of the core was dated using a combination of Pb-210 and spheroidal carbonaceous particle (SCP) techniques. Extrapolating this chronology dated the laminated section of the core, between 723 and 881 mm, to the first half of the 18th century, and this section was selected for detailed study. The thicknesses of the laminae were measured using reflecting and polarizing microscopes, whilst geochemistry was determined by an electron probe. The thickness of the dark layers was found to be positively correlated with titanium concentrations, and negatively correlated with aluminium and potassium concentrations. The thickness of the light layers was found to be negatively correlated with titanium concentrations. It is concluded that the dark layers were deposited from the Fushui River, a tributary of the Yangtze River, during periods of normal flow, whilst the light layers were mainly deposited from the Yangtze River itself during flood periods. Documentary evidence for floods occurring in the lake catchment corresponded with thick laminations of high titanium concentration. Further, two of the three thickest light laminations with low titanium concentrations were found to be synchronous with recorded flood dates of the main Yangtze River in its Middle Reaches, but one was synchronous with a local drought. These data suggest that the lake sediment provides an archive of the relative water levels of the Yangtze and Wanghu, including floods of both the main Yangtze River and the local hydrological regime. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
The discipline of Artificial Intelligence (AI) was born in the summer of 1956 at Dartmouth College in Hanover, New Hampshire. Half of a century has passed, and AI has turned into an important field whose influence on our daily lives can hardly be overestimated. The original view of intelligence as a computer program - a set of algorithms to process symbols - has led to many useful applications now found in internet search engines, voice recognition software, cars, home appliances, and consumer electronics, but it has not yet contributed significantly to our understanding of natural forms of intelligence. Since the 1980s, AI has expanded into a broader study of the interaction between the body, brain, and environment, and how intelligence emerges from such interaction. This advent of embodiment has provided an entirely new way of thinking that goes well beyond artificial intelligence proper, to include the study of intelligent action in agents other than organisms or robots. For example, it supplies powerful metaphors for viewing corporations, groups of agents, and networked embedded devices as intelligent and adaptive systems acting in highly uncertain and unpredictable environments. In addition to giving us a novel outlook on information technology in general, this broader view of AI also offers unexpected perspectives into how to think about ourselves and the world around us. In this chapter, we briefly review the turbulent history of AI research, point to some of its current trends, and to challenges that the AI of the 21st century will have to face. © Springer-Verlag Berlin Heidelberg 2007.
Abstract:
Chemical and isotopic data are presented for lava samples dredged from the southern Bach Ridge and the northern Italian Ridge of the Musicians Seamounts province, northeast of Hawaii. Although most of the samples analyzed are altered, a few are fresh. The latter exhibit geochemical and isotopic characteristics similar to normal MORB (Mid-Ocean Ridge Basalts). There are systematic geochemical trends from hotspot to mid-ocean ridge across the province. Incompatible element and isotopic variations suggest that the flow field had at least two distinct parental magmas, one with higher and one with lower MgO concentrations. The two parental magmas could be related by a magma mixing model. Major and trace element modeling shows that the two parental magmas could not have been produced by different degrees of melting of a homogeneous mantle source, but they are consistent with melting of a generally depleted mantle containing variable volumes of embedded enriched heterogeneities.
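As a hedged illustration of the relation implied by "related by a magma mixing model", the sketch below mixes two hypothetical end-member magmas: element concentrations combine linearly with the mixing fraction, while isotope ratios combine weighted by the element's concentration in each end member. All end-member values are invented and are not the dredged-sample data.

```python
# Hedged sketch of two-end-member magma mixing. Illustrative values only.
import numpy as np

def mix_concentration(f, c_a, c_b):
    """Element concentration in a mixture with mass fraction f of end member A."""
    return f * c_a + (1 - f) * c_b

def mix_isotope_ratio(f, c_a, r_a, c_b, r_b):
    """Isotope ratio of the mixture, weighted by the element's concentration."""
    return (f * c_a * r_a + (1 - f) * c_b * r_b) / mix_concentration(f, c_a, c_b)

f = np.linspace(0.0, 1.0, 5)                     # mixing fractions of end member A
print(mix_concentration(f, c_a=8.5, c_b=6.0))    # e.g. MgO (wt%) of the two parents
print(mix_isotope_ratio(f, c_a=10.0, r_a=0.7026, c_b=4.0, r_b=0.7032))  # e.g. Sr, 87Sr/86Sr
```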
Abstract:
Emeseh, Engobo, 'Corporate Responsibility for Crime: Thinking outside the Box' I University of Botswana Law Journal (2005) 28-49 RAE2008