696 results for normalization


Relevance: 10.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex-cell-like units, a better statistical description of the visual environment is obtained than with linear simple-cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
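
As a concrete illustration of the retinal gain-control idea mentioned in the abstract, the sketch below applies plain divisive contrast normalization to image patches. This is a generic textbook operation, not the thesis's data-driven method; the patch data and the `eps` constant are invented for the example.

```python
import numpy as np

def contrast_normalize(patches, eps=1e-3):
    """Divisive contrast normalization (gain control) of image patches.

    Each patch is divided by its local contrast (standard deviation),
    a simplified stand-in for retinal gain control. `eps` prevents
    amplification of near-empty patches; its value is a guess here.
    """
    patches = patches - patches.mean(axis=1, keepdims=True)  # remove mean luminance
    contrast = patches.std(axis=1, keepdims=True)            # local contrast per patch
    return patches / (contrast + eps)

# Example: normalize 1000 synthetic 8x8 "patches" (rows are flattened patches).
rng = np.random.default_rng(0)
patches = rng.normal(size=(1000, 64)) * rng.gamma(2.0, size=(1000, 1))
normalized = contrast_normalize(patches)
print(normalized.std(axis=1)[:5])  # per-patch contrast is now roughly 1
```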

Relevance: 10.00%

Abstract:

We develop a two-stage split vector quantization method with optimum bit allocation for achieving minimum computational complexity. This also results in a much lower memory requirement than the recently proposed switched split vector quantization method. To further improve the rate-distortion performance, a region-specific normalization is introduced, which yields a 1 bit/vector improvement over the typical two-stage split vector quantizer for wide-band LSF quantization.
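
To make the two-stage structure concrete, here is a minimal sketch of a two-stage split vector quantizer. It uses a uniform bit allocation and omits the optimized allocation and region-specific normalization of the paper; the dimensions, split counts, and bit counts are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_split_vq(X, n_splits, bits):
    """Train one split-VQ stage: partition the dimensions into n_splits
    sub-vectors and fit a 2**bits codebook to each (bit allocation is
    uniform here for simplicity; the paper optimizes it)."""
    parts = np.array_split(np.arange(X.shape[1]), n_splits)
    books = [KMeans(n_clusters=2**bits, n_init=4, random_state=0)
             .fit(X[:, p]).cluster_centers_ for p in parts]
    return parts, books

def quantize_split(X, parts, books):
    """Encode/decode each sub-vector with its codebook (nearest centroid)."""
    Xq = np.zeros_like(X)
    for p, cb in zip(parts, books):
        d = ((X[:, p][:, None, :] - cb[None, :, :]) ** 2).sum(-1)
        Xq[:, p] = cb[d.argmin(1)]
    return Xq

# Two-stage scheme: stage 1 codes the vectors, stage 2 codes the residual.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))                 # stand-in for 10-dim LSF vectors
p1, b1 = train_split_vq(X, n_splits=2, bits=4)  # stage 1
R = X - quantize_split(X, p1, b1)               # residual after stage 1
p2, b2 = train_split_vq(R, n_splits=2, bits=4)  # stage 2 on the residual
Xhat = quantize_split(X, p1, b1) + quantize_split(R, p2, b2)
print("MSE:", ((X - Xhat) ** 2).mean())
```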

Relevance: 10.00%

Abstract:

We investigate the use of a two-stage transform vector quantizer (TSTVQ) for coding of line spectral frequency (LSF) parameters in wideband speech coding. The first-stage quantizer of TSTVQ provides better matching of the source distribution, and the second-stage quantizer provides additional coding gain by using an individual cluster-specific decorrelating transform and variance normalization. Further coding gain is shown to be achieved by exploiting the slowly time-varying nature of speech spectra, using an inter-frame cluster continuity (ICC) property in the first stage of the TSTVQ method. The proposed method saves 3-4 bits and reduces the computational complexity by 58-66% compared to the traditional split vector quantizer (SVQ), but at the expense of 1.5-2.5 times the memory.
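
The second-stage idea, a cluster-specific decorrelating transform followed by variance normalization, can be sketched as follows. The cluster count and data are placeholders, and the KLT-plus-scaling shown is a generic stand-in for the transform described in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal sketch of the second stage: within each first-stage cluster,
# decorrelate vectors with a cluster-specific KLT (eigenvectors of the
# cluster covariance) and normalize component variances before quantization.
rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 10))                     # stand-in for LSF vectors
km = KMeans(n_clusters=8, n_init=4, random_state=0).fit(X)

transforms = {}
for c in range(8):
    Xc = X[km.labels_ == c] - km.cluster_centers_[c]
    cov = np.cov(Xc, rowvar=False)
    w, V = np.linalg.eigh(cov)                      # KLT basis for this cluster
    transforms[c] = (V, np.sqrt(np.maximum(w, 1e-12)))

def to_normalized_domain(x, c):
    """Map a vector into its cluster's decorrelated, unit-variance domain."""
    V, s = transforms[c]
    return (x - km.cluster_centers_[c]) @ V / s

z = to_normalized_domain(X[0], km.labels_[0])
print(z.round(2))  # decorrelated, variance-normalized coefficients
```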

Relevance: 10.00%

Abstract:

The era of large sequencing projects has enabled unprecedented possibilities for investigating more complex aspects of living organisms. Among the high-throughput technologies based on genomic sequences, DNA microarrays are widely used for many purposes, including measuring the relative quantity of messenger RNAs. However, the reliability of microarrays has been strongly doubted, as robust analysis of the complex microarray output data was developed only after the technology had already spread in the community. One objective of this study was to increase the performance of microarrays, measured by the successful validation of the results by independent techniques. To this end, emphasis was given to the possibility of selecting candidate genes of remarkable biological significance within specific experimental designs. Along with literature evidence, re-annotation of the probes and model-based normalization algorithms were found to be beneficial when analyzing Affymetrix GeneChip data. Typically, the analysis of microarrays aims at selecting genes whose expression differs significantly between conditions and then grouping them into functional categories, enabling a biological interpretation of the results. Another approach investigates global differences in the expression of functionally related groups of genes. Here, this technique was effective in discovering patterns related to temporal changes during infection of human cells. Another aspect explored in this thesis is the possibility of combining independent gene expression data to create a catalog of genes that are selectively expressed in healthy human tissues. Not all the genes present in human cells are active; some, involved in basic activities (named housekeeping genes), are expressed ubiquitously. Other genes (named tissue-selective genes) provide more specific functions, and they are expressed preferentially in certain cell types or tissues. Defining the tissue-selective genes is also important because these genes can cause diseases whose phenotype appears in the tissues where they are expressed. The hypothesis that gene expression can be used as a measure of the relatedness of tissues was also confirmed. Microarray experiments provide long lists of candidate genes that are often difficult to interpret and prioritize. The power of microarray results can be extended by inferring the relationships of genes under certain conditions. Gene transcription is constantly regulated by the coordinated binding of proteins, named transcription factors, to specific portions of its promoter sequence. In this study, the analysis of promoters from groups of candidate genes was used to predict gene networks and to highlight modules of transcription factors playing a central role in the regulation of their transcription. Specific modules were found to regulate the expression of genes selectively expressed in the hippocampus, an area of the brain with a central role in Major Depressive Disorder. Similarly, gene networks derived from microarray results elucidated aspects of the development of the mesencephalon, another region of the brain, involved in Parkinson's disease.
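
As an example of the kind of model-based normalization mentioned above, the following sketch implements quantile normalization of a genes x arrays expression matrix. It is a standard preprocessing step for multi-array data, not necessarily the exact algorithm used in the thesis; the data are synthetic.

```python
import numpy as np

def quantile_normalize(expr):
    """Quantile normalization of a genes x arrays expression matrix.

    Forces every array (column) to share the same empirical distribution,
    so that between-array intensity differences do not masquerade as
    differential expression.
    """
    ranks = expr.argsort(axis=0).argsort(axis=0)      # rank of each value per array
    mean_sorted = np.sort(expr, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]

rng = np.random.default_rng(0)
arrays = rng.lognormal(mean=[[6.0, 6.5, 7.0]], size=(500, 3))  # 3 arrays, offset scales
norm = quantile_normalize(arrays)
print(norm.mean(axis=0))  # per-array means now agree
```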

Relevance: 10.00%

Abstract:

Septic shock is a common killer in intensive care units (ICUs). The most crucial issue for the outcome is the early and aggressive start of treatment aimed at normalization of hemodynamics, and the early start of antibiotics, during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to the prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed. A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999-2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study. We found that the hemodynamic variables most important for the outcome were the mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and the MAP and a mixed venous oxygen saturation (SvO2) under 70% during the first 48 hours. A MAP under 65 mmHg and an SvO2 below 70% were the best predictive thresholds. A high central venous pressure (CVP) also correlated with adverse outcome. We assessed the correlation and agreement of SvO2 and mean central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below the ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between the saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU. Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis. We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) on mortality in severe sepsis or septic shock. The NT-proBNP levels were significantly higher in hospital nonsurvivors. The NT-proBNP level 72 hrs after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction. Delays in the start of treatment are associated with a poorer prognosis in sepsis. We assessed how the early treatment guidelines had been adopted, and what the impact of early treatment on mortality in septic shock in Finland was. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. A delayed initiation of antimicrobial agents was especially associated with unfavorable outcome.
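
The agreement analysis quoted above follows the standard Bland-Altman recipe: the bias is the mean difference between paired measurements, and the 95% limits of agreement are bias ± 1.96 SD of the differences. A minimal sketch with synthetic data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement analysis between two measurement methods.

    Returns the bias (mean difference) and the 95% limits of agreement
    (bias +/- 1.96 * SD of differences), the quantities reported in the
    SvO2 vs. ScvO2 comparison above. The data here are invented.
    """
    d = a - b
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, bias - half_width, bias + half_width

rng = np.random.default_rng(0)
svo2 = rng.normal(68, 6, 200)           # hypothetical SvO2 readings (%)
scvo2 = svo2 + rng.normal(4.2, 6, 200)  # ScvO2 tends to read higher
print(bland_altman(scvo2, svo2))        # bias and 95% limits of agreement
```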

Relevance: 10.00%

Abstract:

Anatomical brain networks change throughout life and with disease. Genetic analysis of these networks may help identify the processes giving rise to heritable brain disorders, but we do not yet know which network measures are promising for genetic analyses. Many factors affect the downstream results, such as the tractography algorithm used to define structural connectivity. We tested nine different tractography algorithms and four normalization methods to compute brain networks for 853 young healthy adults (twins and their siblings). We fitted genetic structural equation models to the network measures from all nine tractography algorithms, after a normalization step to increase network consistency across algorithms. Probabilistic tractography algorithms with global optimization (such as Probtrackx and Hough) yielded higher heritability statistics than 'greedy' algorithms (such as FACT), which process small neighborhoods at each step. Some global network measures (Probtrackx-derived GLOB and ST) showed significant genetic effects, making them attractive targets for genome-wide association studies.
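
As an illustration of a global network measure of the kind tested here, the sketch below computes global efficiency from a thresholded structural connectivity matrix. Whether this matches the study's exact GLOB measure is an assumption, and the matrix and threshold are synthetic.

```python
import numpy as np
import networkx as nx

# Build a fake symmetric "connectivity matrix" standing in for a
# tractography-derived network, binarize it, and compute global efficiency.
rng = np.random.default_rng(0)
W = rng.random((90, 90))
W = (W + W.T) / 2                   # symmetrize
np.fill_diagonal(W, 0)              # no self-connections

A = (W > 0.7).astype(int)           # binarize at an illustrative threshold
G = nx.from_numpy_array(A)
print("global efficiency:", nx.global_efficiency(G))
```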

Relevance: 10.00%

Abstract:

Accelerator mass spectrometry (AMS) is an ultrasensitive technique for measuring the concentration of a single isotope. The electric and magnetic fields of an electrostatic accelerator system are used to filter out other isotopes from the ion beam. The high velocity means that molecules can be destroyed and removed from the measurement background. As a result, concentrations down to one atom in 10^16 atoms are measurable. This thesis describes the construction of the new AMS system in the Accelerator Laboratory of the University of Helsinki. The system is described in detail along with the relevant ion optics. System performance and some of the 14C measurements made with the system are described. In the second part of the thesis, a novel statistical model for the analysis of AMS data is presented. Bayesian methods are used in order to make the best use of the available information. In the new model, instrumental drift is modelled with a continuous first-order autoregressive process. This enables rigorous normalization to standards measured at different times. The Poisson statistical nature of a 14C measurement is also taken into account properly, so that the uncertainty estimates are much more stable. It is shown that, overall, the new model improves both the accuracy and the precision of AMS measurements. In particular, the results can be improved for samples with very low 14C concentrations or samples measured only a few times.
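
The drift-normalization idea can be illustrated with a toy simulation: the machine efficiency follows an AR(1) process, interleaved standards with a known ratio track that drift, and unknowns are corrected accordingly. The thesis's Bayesian model infers drift and ratios jointly; the sketch below just interpolates between standards, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi, sigma = 60, 0.9, 0.02
drift = np.empty(n)
drift[0] = 1.0
for t in range(1, n):                       # AR(1) efficiency drift around 1.0
    drift[t] = 1.0 + phi * (drift[t - 1] - 1.0) + rng.normal(0, sigma)

true_ratio = 0.25                           # unknown sample's ratio (arbitrary)
std_ratio = 1.0                             # standard's known ratio
is_std = np.arange(n) % 5 == 0              # every 5th slot measures a standard
measured = np.where(is_std, std_ratio, true_ratio) * drift

# Estimate drift from the standards and normalize the unknown measurements.
est_drift = np.interp(np.arange(n), np.arange(n)[is_std],
                      measured[is_std] / std_ratio)
corrected = measured[~is_std] / est_drift[~is_std]
print("naive mean:", measured[~is_std].mean(), "corrected:", corrected.mean())
```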

Relevance: 10.00%

Abstract:

The hydrolysis of the cupric ion has been studied at various ionic strengths (0.01, 0.05, 0.1, and 0.5 M). The results are analyzed employing the 'core + links' theory, the log-log plot, the normalization plot, and an extrapolation method for obtaining the pure mononuclear curve. The stability constants of Cu₂(OH)₂²⁺, Cu₃(OH)₄²⁺, Cu(OH)⁺, and Cu(OH)₂ have been reported.
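
For readers unfamiliar with such equilibrium fits, the sketch below shows the speciation calculation that underlies them: given hydrolysis constants, solve the copper mass balance for free [Cu²⁺] at a fixed pH. The log *β values in the code are rough placeholders, not the constants reported in this work.

```python
from scipy.optimize import brentq

# Placeholder hydrolysis constants: Cu2+ + n H2O <-> species + n H+.
logB = {"Cu(OH)+": -8.0, "Cu(OH)2": -16.2,
        "Cu2(OH)2++": -10.6, "Cu3(OH)4++": -21.1}

def species(cu_free, h):
    """Concentrations of all hydrolysis species at free [Cu2+] and [H+]."""
    return {
        "Cu2+":       cu_free,
        "Cu(OH)+":    10**logB["Cu(OH)+"] * cu_free / h,
        "Cu(OH)2":    10**logB["Cu(OH)2"] * cu_free / h**2,
        "Cu2(OH)2++": 10**logB["Cu2(OH)2++"] * cu_free**2 / h**2,
        "Cu3(OH)4++": 10**logB["Cu3(OH)4++"] * cu_free**3 / h**4,
    }

def total_cu(cu_free, h):
    """Copper mass balance, counting 2 and 3 Cu per polynuclear species."""
    s = species(cu_free, h)
    return (s["Cu2+"] + s["Cu(OH)+"] + s["Cu(OH)2"]
            + 2 * s["Cu2(OH)2++"] + 3 * s["Cu3(OH)4++"])

pH, Cu_T = 6.0, 1e-3
h = 10**-pH
cu = brentq(lambda c: total_cu(c, h) - Cu_T, 1e-15, Cu_T)
for name, conc in species(cu, h).items():
    print(f"{name:>11s}: {conc:.3e} M")
```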

Relevance: 10.00%

Abstract:

We search for b→sμ⁺μ⁻ transitions in B meson (B⁺, B⁰, or Bₛ⁰) decays with 924 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV collected with the CDF II detector at the Fermilab Tevatron. We find excesses with significances of 4.5, 2.9, and 2.4 standard deviations in the B⁺→μ⁺μ⁻K⁺, B⁰→μ⁺μ⁻K*(892)⁰, and Bₛ⁰→μ⁺μ⁻ϕ decay modes, respectively. Using B→J/ψh (h = K⁺, K*(892)⁰, ϕ) decays as normalization channels, we report branching fractions for the previously observed B⁺ and B⁰ decays, B(B⁺→μ⁺μ⁻K⁺) = (0.59 ± 0.15 ± 0.04) × 10⁻⁶ and B(B⁰→μ⁺μ⁻K*(892)⁰) = (0.81 ± 0.30 ± 0.10) × 10⁻⁶, where the first uncertainty is statistical and the second is systematic. We set an upper limit on the relative branching fraction B(Bₛ⁰→μ⁺μ⁻ϕ)/B(Bₛ⁰→J/ψϕ) < 2.6 (2.3) × 10⁻³ at the 95% (90%) confidence level, which is the most stringent to date.
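
The normalization-channel technique used here can be summarized in one line: the signal branching fraction is measured relative to a well-known mode, so that the luminosity and most reconstruction efficiencies cancel. A back-of-the-envelope sketch, with invented yields and efficiency ratio and approximate world-average branching fractions:

```python
def branching_fraction(n_sig, n_norm, eff_ratio, br_norm):
    """BR(signal) = (N_sig / N_norm) * (eff_norm / eff_sig) * BR(norm)."""
    return (n_sig / n_norm) * eff_ratio * br_norm

# Hypothetical yields for a B+ -> mu+ mu- K+ style search, normalized to
# B+ -> J/psi K+ with J/psi -> mu+ mu- (approximate world-average BRs).
br = branching_fraction(n_sig=90, n_norm=9000, eff_ratio=1.1,
                        br_norm=1.0e-3 * 0.0596)
print(f"BR(signal) ~ {br:.2e}")
```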

Relevance: 10.00%

Abstract:

Previously, it was reported from this laboratory that the heme groups of hemoglobin are "buried" within globin at pH 4.0 and not dissociated, on the basis of the obligatory requirement of urea for the reaction of N-bromosuccinimide with the heme groups of hemoglobin at pH 4.0, and also on the basis of the "normalization" of the spectrum of hemoglobin at this pH in the presence of urea or sucrose. In the present study, it has been shown that the behaviour of sperm whale myoglobin, with respect to its reaction with N-bromosuccinimide and with respect to spectral "normalization" in urea or sucrose, is essentially similar to that of hemoglobin. It has also been demonstrated that the spectral "normalization" obtained with crystalline hemin is not identical with that obtained with either hemoglobin or myoglobin. The bearing of the results of the present study on the earlier work on hemoglobin is indicated.

Relevance: 10.00%

Abstract:

A reaction of N-bromosuccinimide with the heme groups of hemoglobin has been studied spectrophotometrically. The reaction brings about the disappearance of the characteristic absorption peaks of hemoglobin and is accompanied by the release of inorganic iron from the heme groups. Urea is obligatory for the reaction to take place at pH 4.0, while it can occur in the absence of urea at pH 7.0. The spectrum of hemoglobin, which does not show any peak in the Soret region at pH 4.0, is "normalized" in the presence of urea or sucrose at the same pH. The effect of "normalization" in 8 M urea is apparent over the pH range 3.0-4.5. From the obligatory requirement of urea or sucrose for "normalization" of the spectrum, and the dependence of the release of inorganic iron on the concentration of urea, it is suggested that the heme groups are "buried" within the globin at pH 4.0 and not dissociated from globin as previously supposed.

Relevance: 10.00%

Abstract:

We report on a search for the flavor-changing neutral-current decay D⁰→μ⁺μ⁻ in pp̄ collisions at √s = 1.96 TeV using 360 pb⁻¹ of integrated luminosity collected by the CDF II detector at the Fermilab Tevatron collider. A displaced-vertex trigger selects long-lived D⁰ candidates in the μ⁺μ⁻, π⁺π⁻, and K⁻π⁺ decay modes. We use the Cabibbo-favored D⁰→K⁻π⁺ channel to optimize the selection criteria in an unbiased manner, and the kinematically similar D⁰→π⁺π⁻ channel for normalization. We set an upper limit on the branching fraction B(D⁰→μ⁺μ⁻)
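
The final step of such a search, turning an observed event count and an expected background into a branching-fraction limit, can be sketched with a classical Poisson upper limit. All numbers below are invented, and the simple construction shown is not necessarily the limit-setting method used in the analysis:

```python
from scipy.stats import poisson
from scipy.optimize import brentq

def poisson_upper_limit(n_obs, bkg, cl=0.90):
    """Smallest signal s with P(N <= n_obs | s + bkg) = 1 - cl."""
    return brentq(lambda s: poisson.cdf(n_obs, s + bkg) - (1 - cl), 0, 100)

# Hypothetical observed count, background, and single-event sensitivity
# (the BR corresponding to one signal event via the normalization mode).
s_up = poisson_upper_limit(n_obs=3, bkg=2.5)
single_event_sens = 1.0e-7
print(f"N_sig < {s_up:.2f}  =>  BR < {s_up * single_event_sens:.1e} (90% CL)")
```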

Relevance: 10.00%

Abstract:

Despite the central role of legitimacy in social and organizational life, we know little of the subtle meaning-making processes through which organizational phenomena, such as industrial restructuring, are legitimated in contemporary society. Therefore, this paper examines the discursive legitimation strategies used when making sense of global industrial restructuring in the media. Based on a critical discourse analysis of extensive media coverage of a revolutionary pulp and paper sector merger, we distinguish and analyze five legitimation strategies: (1) normalization, (2) authorization, (3) rationalization, (4) moralization, and (5) narrativization. We argue that while these specific legitimation strategies appear in individual texts, their recurring use in the intertextual totality of the public discussion establishes the core elements of the emerging legitimating discourse.

Relevance: 10.00%

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. Each reduction step reduces the first component of the vector, and this component can be interpreted as an ordinal less than ε₀, thus ordering the derivations by complexity and proving termination of the process.
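
The detour conversion at the heart of such normalization procedures can be displayed explicitly. The LaTeX fragment below (assuming amsmath for \dfrac and amssymb for \leadsto) shows the standard implication detour reduction, the kind of step whose termination the vector assignment is used to prove:

```latex
% Standard implication detour conversion in natural deduction: an
% introduction of A -> B immediately followed by its elimination is
% contracted by substituting the derivation of A for the assumption [A].
\[
\frac{\dfrac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \supset B}\;(\supset I)
      \qquad
      \begin{array}{c}\vdots\\ A\end{array}}
     {B}\;(\supset E)
\;\;\leadsto\;\;
\begin{array}{c}\vdots\\ A\\ \vdots\\ B\end{array}
\]
```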