10 results for NORMALIZATION

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Abstract:

In Czechoslovakia, the occupation of 1968 marked the beginning of normalization, a political and societal stagnation that lasted two decades. The dissident initiative Charter 77 emerged in 1977, demanding that the leaders of the country respect human rights. The Helsinki process provided a macro-level framework that influenced opposition and dissident activities throughout Eastern Europe. The study contributes a focused empirical analysis of the period of normalization and the dissident movement Charter 77. Dissent in general is seen as an existential attitude; it can be encapsulated as a morally rationalized critical stance derived from a shared experience or interpretation of injustice, which serves as the basis for a shared collective identity with oppositional consciousness as one unifying factor. The study suggests that normalization can be understood as a fundamentally violent process and discusses the structural and cultural manifestations of violence in relation to Charter 77. In general, the aim of the system was to render society so passive that it would not constitute a potential threat to the hegemonic rule of the regime. Normalization caused societal stagnation and apoliticization, but it also benefited those who accepted the new political reality. The study, however, questions the image of Czechoslovakia's allegedly highly repressive rule by showing that there was also quite considerable tolerance of Charter 77 and consideration before severe repression was brought to bear against dissidents. Furthermore, the study provides an understanding of the motives and impetuses behind dissent, the strategic shifts in Charter 77's activities, and the changes in the regime's policies toward Charter 77. The study also adds a new perspective to the common image of Charter 77 as a non-political initiative and suggests that Charter 77 was, in fact, a political entity, and an actively political one in the latter half of the 1980s. Charter 77 was a de facto hybrid of a traditional dissident initiative and an oppositional actor. It adopted a two-dimensional approach: first, it continued to emphasize its role as a citizens' initiative supporting human rights, but second, at the same time, it was a directly political actor supporting and furthering the development of political opposition against the ruling power.

Relevance:

10.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
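
As a concrete illustration of the contrast normalization mentioned above, the following is a minimal divisive normalization sketch in Python; the Gaussian neighbourhood size and the stabilizing constant are illustrative choices, not values from the thesis.

```python
# Minimal sketch of divisive contrast normalization of an image, loosely in
# the spirit of retinal gain control; parameter values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_normalize(image, sigma=4.0, eps=1e-2):
    """Subtract the local mean and divide by the local contrast."""
    image = image.astype(float)
    local_mean = gaussian_filter(image, sigma)
    centered = image - local_mean
    local_var = gaussian_filter(centered ** 2, sigma)
    local_std = np.sqrt(local_var)
    return centered / (local_std + eps)   # eps avoids division by zero

# Usage on a random "image"; with natural images, the output statistics
# become more homogeneous across regions of different contrast.
patch = np.random.rand(128, 128)
normalized = contrast_normalize(patch)
```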

Relevance:

10.00%

Abstract:

The era of large sequencing projects has enabled unprecedented possibilities for investigating more complex aspects of living organisms. Among the high-throughput technologies based on genomic sequences, DNA microarrays are widely used for many purposes, including the measurement of the relative quantity of messenger RNAs. However, the reliability of microarrays has been strongly doubted, as robust analysis of the complex microarray output data was developed only after the technology had already spread in the community. One objective of this study was to increase the performance of microarrays, measured by the successful validation of the results by independent techniques. To this end, emphasis has been given to the possibility of selecting candidate genes with remarkable biological significance within specific experimental designs. Along with literature evidence, re-annotation of the probes and model-based normalization algorithms were found to be beneficial when analyzing Affymetrix GeneChip data. Typically, the analysis of microarrays aims at selecting genes whose expression is significantly different in different conditions and then grouping them into functional categories, enabling a biological interpretation of the results. Another approach investigates the global differences in the expression of functionally related groups of genes. Here, this technique has been effective in discovering patterns related to temporal changes during infection of human cells. Another aspect explored in this thesis is the possibility of combining independent gene expression data to create a catalog of genes that are selectively expressed in healthy human tissues. Not all the genes present in human cells are active; some, involved in basic activities (housekeeping genes), are expressed ubiquitously. Other genes (tissue-selective genes) provide more specific functions, and they are expressed preferentially in certain cell types or tissues. Defining the tissue-selective genes is also important, as these genes can cause diseases with a phenotype in the tissues where they are expressed. The hypothesis that gene expression could be used as a measure of the relatedness of tissues has also been proven. Microarray experiments provide long lists of candidate genes that are often difficult to interpret and prioritize. The power of microarray results can be extended by inferring the relationships of genes under certain conditions. Gene transcription is constantly regulated by the coordinated binding of proteins, named transcription factors, to specific portions of a gene's promoter sequence. In this study, the analysis of promoters from groups of candidate genes has been used to predict gene networks and to highlight modules of transcription factors playing a central role in the regulation of their transcription. Specific modules were found to regulate the expression of genes selectively expressed in the hippocampus, an area of the brain with a central role in Major Depressive Disorder. Similarly, gene networks derived from microarray results have elucidated aspects of the development of the mesencephalon, another region of the brain, involved in Parkinson's disease.
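
For illustration, a minimal quantile normalization sketch is given below; quantile normalization is one common way to make arrays comparable and is used here as a generic stand-in rather than the specific model-based algorithm evaluated in the thesis.

```python
# Minimal sketch of quantile normalization across arrays; a generic
# illustration, not the thesis's exact normalization procedure.
import numpy as np

def quantile_normalize(expr):
    """expr: genes x arrays matrix; returns a matrix in which every array
    (column) shares the same empirical distribution."""
    order = np.argsort(expr, axis=0)              # per-array ranking
    ranked = np.sort(expr, axis=0)                # per-array sorted values
    mean_quantiles = ranked.mean(axis=1)          # reference distribution
    normalized = np.empty_like(expr, dtype=float)
    for j in range(expr.shape[1]):
        normalized[order[:, j], j] = mean_quantiles
    return normalized

# Usage: three hypothetical arrays measuring five genes.
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0],
                 [1.0, 3.0, 2.0]])
print(quantile_normalize(expr))
```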

Relevance:

10.00%

Abstract:

Septic shock is a common killer in intensive care units (ICUs). The most crucial issue concerning the outcome is the early and aggressive start of treatment aimed at normalization of hemodynamics, together with the early start of antibiotics during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to the prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed. A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999-2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study. We found that the most important hemodynamic variables concerning the outcome were the mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and the MAP and mixed venous oxygen saturation (SvO2) under 70% during the first 48 hours. MAP under 65 mmHg and SvO2 below 70% were the best predictive thresholds. A high central venous pressure (CVP) also correlated with adverse outcome. We assessed the correlation and agreement of SvO2 and central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between the saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU. Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis. We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) for mortality in severe sepsis or septic shock. NT-proBNP levels were significantly higher in hospital nonsurvivors, and the NT-proBNP level 72 hours after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction. Delays in the start of treatment are associated with a poorer prognosis in sepsis. We assessed how the early treatment guidelines had been adopted and what the impact of early treatment on mortality in septic shock was in Finland. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. Delayed initiation of antimicrobial agents in particular was associated with an unfavorable outcome.
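
For reference, a Bland-Altman agreement analysis of the kind used above to compare SvO2 and ScvO2 can be sketched as follows; the paired saturation values are invented for illustration and are not data from the study.

```python
# Minimal sketch of a Bland-Altman analysis: bias and 95% limits of
# agreement between two paired measurement methods. Sample values are
# hypothetical, not study data.
import numpy as np

def bland_altman(a, b):
    """Return bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired saturation readings (%) from the same patients.
scvo2 = np.array([71.0, 75.0, 70.0, 73.0, 76.0])
svo2  = np.array([68.0, 72.0, 65.0, 70.0, 74.0])
bias, lower, upper = bland_altman(scvo2, svo2)
print(f"bias {bias:.1f}%, limits of agreement {lower:.1f}% to {upper:.1f}%")
```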

Relevance:

10.00%

Abstract:

Accelerator mass spectrometry (AMS) is an ultrasensitive technique for measuring the concentration of a single isotope. The electric and magnetic fields of an electrostatic accelerator system are used to filter out other isotopes from the ion beam. The high velocity means that molecules can be destroyed and removed from the measurement background. As a result, concentrations down to one atom in 10^16 atoms are measurable. This thesis describes the construction of the new AMS system in the Accelerator Laboratory of the University of Helsinki. The system is described in detail along with the relevant ion optics. System performance and some of the 14C measurements done with the system are described. In the second part of the thesis, a novel statistical model for the analysis of AMS data is presented. Bayesian methods are used in order to make the best use of the available information. In the new model, instrumental drift is modelled with a continuous first-order autoregressive process. This enables rigorous normalization to standards measured at different times. The Poisson statistical nature of a 14C measurement is also taken into account properly, so that uncertainty estimates are much more stable. It is shown that, overall, the new model improves both the accuracy and the precision of AMS measurements. In particular, the results can be improved for samples with very low 14C concentrations or for samples measured only a few times.
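
A rough sketch of the kind of generative model described above is given below: multiplicative first-order autoregressive (AR(1)) instrumental drift with Poisson-distributed counts, normalized to a concurrently measured standard. The parameter values and the naive normalization step are illustrative assumptions, not the thesis's Bayesian implementation.

```python
# Sketch: AR(1) drift multiplying the true 14C ratio, Poisson counts,
# and naive normalization to a standard. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 50
phi, sigma = 0.9, 0.05          # AR(1) persistence and innovation scale
true_ratio = 1.0                # sample 14C ratio relative to the standard
expected_counts = 2000          # mean counts per run without drift

# Simulate multiplicative drift around 1 as an AR(1) process.
drift = np.empty(n_runs)
drift[0] = 1.0
for t in range(1, n_runs):
    drift[t] = 1.0 + phi * (drift[t - 1] - 1.0) + rng.normal(0.0, sigma)

# Poisson counts for the sample and for a standard measured alongside it.
sample_counts = rng.poisson(expected_counts * true_ratio * drift)
standard_counts = rng.poisson(expected_counts * drift)

# Naive normalization: divide by the concurrently measured standard.
# A Bayesian treatment would instead infer the drift and the ratio jointly.
print((sample_counts / standard_counts).mean())
```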

Relevance:

10.00%

Abstract:

We search for b → sμ+μ- transitions in B meson (B+, B0, or Bs0) decays with 924 pb^-1 of pp̅ collisions at √s = 1.96 TeV collected with the CDF II detector at the Fermilab Tevatron. We find excesses with significances of 4.5, 2.9, and 2.4 standard deviations in the B+ → μ+μ-K+, B0 → μ+μ-K*(892)0, and Bs0 → μ+μ-φ decay modes, respectively. Using B → J/ψh (h = K+, K*(892)0, φ) decays as normalization channels, we report branching fractions for the previously observed B+ and B0 decays, BR(B+ → μ+μ-K+) = (0.59 ± 0.15 ± 0.04) × 10^-6 and BR(B0 → μ+μ-K*(892)0) = (0.81 ± 0.30 ± 0.10) × 10^-6, where the first uncertainty is statistical and the second is systematic. These measurements are consistent with the world-average results and are competitive with the best available measurements. We set an upper limit on the relative branching fraction BR(Bs0 → μ+μ-φ)/BR(Bs0 → J/ψφ) < 2.6 (2.3) × 10^-3 at the 95 (90)% confidence level.

Relevance:

10.00%

Abstract:

We search for b→sμ+μ- transitions in B meson (B+, B0, or Bs0) decays with 924 pb^-1 of pp̅ collisions at √s = 1.96 TeV collected with the CDF II detector at the Fermilab Tevatron. We find excesses with significances of 4.5, 2.9, and 2.4 standard deviations in the B+→μ+μ-K+, B0→μ+μ-K*(892)0, and Bs0→μ+μ-ϕ decay modes, respectively. Using B→J/ψh (h = K+, K*(892)0, ϕ) decays as normalization channels, we report branching fractions for the previously observed B+ and B0 decays, B(B+→μ+μ-K+) = (0.59±0.15±0.04)×10^-6 and B(B0→μ+μ-K*(892)0) = (0.81±0.30±0.10)×10^-6, where the first uncertainty is statistical and the second is systematic. We set an upper limit on the relative branching fraction B(Bs0→μ+μ-ϕ)/B(Bs0→J/ψϕ) < 2.6 (2.3)×10^-3 at the 95 (90)% confidence level, which is the most stringent to date.
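
For context, the normalization-channel method converts fitted event yields into a branching fraction roughly as sketched below; all yields, efficiencies, and normalization branching fractions are illustrative placeholders, not numbers from the CDF analysis.

```python
# Sketch of a relative branching-fraction measurement:
# BR_sig = (N_sig / N_norm) * (eff_norm / eff_sig) * BR_norm.
# All numbers are illustrative placeholders, not CDF results.
n_sig, n_norm = 40.0, 3500.0           # hypothetical fitted yields
eff_ratio = 0.85                        # hypothetical eff_norm / eff_sig
br_norm = 1.02e-3 * 5.96e-2             # approx BR(B+ -> J/psi K+) * BR(J/psi -> mu mu)

br_sig = (n_sig / n_norm) * eff_ratio * br_norm
print(f"BR(B+ -> mu+ mu- K+) ~ {br_sig:.2e}")
```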

Relevance:

10.00%

Abstract:

We report on a search for the flavor-changing neutral-current decay D0 → μ+μ- in pp̅ collisions at √s = 1.96 TeV using 360 pb^-1 of integrated luminosity collected by the CDF II detector at the Fermilab Tevatron collider. A displaced-vertex trigger selects long-lived D0 candidates in the μ+μ-, π+π-, and K-π+ decay modes. We use the Cabibbo-favored D0 → K-π+ channel to optimize the selection criteria in an unbiased manner, and the kinematically similar D0 → π+π- channel for normalization. We set an upper limit on the branching fraction B(D0 → μ+μ-)
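
Similarly, a counting-experiment limit can be converted into a branching-fraction limit via the normalization mode, as in the hedged sketch below; the observed counts, efficiencies, and the simple background-free Poisson limit are illustrative assumptions only, not the values or statistical treatment of the CDF search.

```python
# Sketch: turn a Poisson event-count upper limit into a branching-fraction
# limit using a normalization mode. All inputs are illustrative.
from scipy.stats import poisson
from scipy.optimize import brentq

def poisson_upper_limit(n_obs, cl=0.90):
    """Classical (background-free) upper limit on the Poisson mean."""
    return brentq(lambda mu: poisson.cdf(n_obs, mu) - (1.0 - cl), 1e-9, 100.0)

n_obs = 2                      # hypothetical candidates in the signal window
n_norm = 2.0e4                 # hypothetical reconstructed D0 -> pi+ pi- events
eff_ratio = 1.0                # eff_norm / eff_sig, ~1 for similar kinematics
br_norm = 1.45e-3              # approximate BR(D0 -> pi+ pi-)

n_ul = poisson_upper_limit(n_obs)                 # ~5.3 events at 90% CL
br_ul = (n_ul / n_norm) * eff_ratio * br_norm
print(f"90% CL upper limit on BR(D0 -> mu+ mu-): {br_ul:.1e}")
```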

Relevance:

10.00%

Abstract:

Despite the central role of legitimacy in social and organizational life, we know little of the subtle meaning-making processes through which organizational phenomena, such as industrial restructuring, are legitimated in contemporary society. Therefore, this paper examines the discursive legitimation strategies used when making sense of global industrial restructuring in the media. Based on a critical discourse analysis of extensive media coverage of a revolutionary pulp and paper sector merger, we distinguish and analyze five legitimation strategies: (1) normalization, (2) authorization, (3) rationalization, (4) moralization, and (5) narrativization. We argue that while these specific legitimation strategies appear in individual texts, their recurring use in the intertextual totality of the public discussion establishes the core elements of the emerging legitimating discourse.

Relevance:

10.00%

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is carried out in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction decreases the first component of the vector, and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.
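
As background for the ordinal bound mentioned above, a standard presentation of the ordinals below epsilon_0 uses Cantor normal form; the following is general notation only, not Gentzen's or Howard's specific ordinal assignment to derivations.

```latex
% Cantor normal form for ordinals below epsilon_0 (general background,
% not the ordinal assignment used in the thesis).
\[
  \alpha \;=\; \omega^{\alpha_1}\cdot n_1 + \omega^{\alpha_2}\cdot n_2
  + \cdots + \omega^{\alpha_k}\cdot n_k,
  \qquad \alpha_1 > \alpha_2 > \cdots > \alpha_k, \quad n_i \ge 1,
\]
% where each exponent alpha_i is again an ordinal below epsilon_0
% (and strictly below alpha itself when alpha < epsilon_0), with
\[
  \varepsilon_0 \;=\; \sup\{\omega,\; \omega^{\omega},\; \omega^{\omega^{\omega}},\; \ldots\}.
\]
% Termination of a reduction procedure that strictly decreases such an
% ordinal then follows from the well-foundedness of this ordering.
```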