Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a memoir of 1774 which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he represented by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples: the sample should be a miniature of the population. This idea still prevails. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method that kept the costs of data collection acceptable and provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
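The sample-size problem Laplace posed for Bernoulli trials can be illustrated with a minimal modern sketch. This is not Laplace's inverse-probability derivation; it is a hypothetical normal-approximation calculation (the approximation his Central Limit Theorem justifies) of how many binomial observations are needed to estimate a proportion to within a given margin of error:

```python
import math

def sample_size(p_guess, margin, z=1.96):
    """Smallest n such that z * sqrt(p(1-p)/n) <= margin,
    using the normal approximation to the binomial.
    p_guess: anticipated proportion (0.5 is the conservative worst case)
    margin:  desired half-width of the interval around the estimate
    z:       normal quantile for the desired confidence (1.96 for ~95%)
    """
    return math.ceil(z**2 * p_guess * (1 - p_guess) / margin**2)

# Worst-case proportion 0.5, margin of +/- 2 percentage points, ~95% confidence:
print(sample_size(0.5, 0.02))  # 2401
```

The names `sample_size` and `p_guess` are illustrative, not from the thesis; the point is only that accuracy requirements translate directly into a required number of observations, which is the question Laplace answered with his own (Bayesian, inverse-probability) machinery.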
Abstract:
Trafficking in human beings has become one of the most talked-about criminal concerns of the 21st century. But this is not all that it has become. Trafficking has also been declared one of the most pressing human rights issues of our time; in this sense, it has become part of the expansion of the human rights phenomenon. Although it is easy to see that the crime of trafficking violates several of the human rights of its victims, it is still, in its essence, a fairly conventional (although particularly heinous and often transnational) crime, consisting of acts between private actors and therefore lacking the vertical effect traditionally associated with human rights violations. This thesis asks, then, why and how the anti-trafficking campaign has been translated into human rights language. And even more fundamentally: in light of the critical theoretical studies surrounding the expansion of the human rights phenomenon, especially that of Costas Douzinas, who has declared that we have come to the end of human rights as a consequence of the expansion and bureaucratization of the phenomenon, can human rights actually bring salvation to the victims of trafficking? The thesis demonstrates that the translation of the anti-trafficking campaign into human rights language has been a complicated process involving various actors, including scholars, feminist NGOs, local activists, and global human rights NGOs. It has also been driven by a complicated web of interests, the most prevalent of which, the sincere will to help the victims, has become entangled with other aims, such as political, economic, and structural goals. As a consequence of its fragmented background, the human rights approach to trafficking is still seeking its final form and consists of several different claims.
After assessing these claims from a legal perspective, this thesis concludes that the approach is most relevant to the mistreatment of victims of trafficking at the hands of state authorities. It seems to be quite common that authorities have trouble identifying victims of trafficking, which means that the rights granted to them in international and national documents are not realized in practice; instead, victims of trafficking are systematically deported as illegal immigrants. It is argued that in order to understand the measures taken by the authorities, and to assess the usefulness of human rights, it is necessary to adopt a Foucauldian perspective and to view these measures as biopolitical defence mechanisms. From a biopolitical perspective, the victims of trafficking can be seen as a threat to the population: a threat that must be eliminated either by assimilating them into the main population with the help of disciplinary techniques, or by excluding them completely from society. This biopolitical aim is accomplished through an impenetrable net of seemingly insignificant practices and discourses of which not even the participants are aware. As a result of these practices and discourses, trafficking victims (only very few of whom fit the myth of the perfect victim produced by biopolitical discourses) become invisible and therefore subject to deportation as (risky) illegal immigrants. This turns them into bare life in the Agambenian sense, represented by the homo sacer, who cannot be sacrificed, yet does not enjoy the protection of society and its laws. It is argued, following Jacques Rancière and Slavoj Žižek, that human rights can, through their universality and formal equality, provide bare life with the tools to formulate political claims and thus to utilize its politicization through exclusion in order to return to the sphere of power and politics.
Even though human rights have inevitably become entangled with biopolitical practices, they are still perhaps the most efficient way to challenge biopower. Human rights have not, therefore, become useless for the victims of trafficking, but they must be conceived as a universal tool to formulate political claims and challenge power. In the case of trafficking, this means that human rights must be utilized to constantly renegotiate the borders of the problematic concept of the victim of trafficking created by international instruments, policies, and discourses, including those sincerely aimed at providing help for the victims.