49 results for graphics processing units


Relevance: 20.00%

Abstract:

Cognitive impairments of attention, memory and executive functions are a fundamental feature of the pathophysiology of schizophrenia. Neurophysiological and neurochemical changes in the auditory cortex have been shown to underlie cognitive impairments in schizophrenia patients. The functional state of the neural substrate of auditory information processing can be probed objectively and non-invasively with auditory event-related potentials (ERPs) and event-related fields (ERFs). In the current work, we explored neurochemical effects on the neural origins of auditory information processing in relation to schizophrenia. By means of ERPs/ERFs, we aimed to determine how the neural substrates of auditory information processing are modulated by antipsychotic medication in schizophrenia spectrum patients (Studies I, II) and by neuropharmacological challenges in healthy human subjects (Studies III, IV). First, with auditory ERPs we investigated the effects of olanzapine (Study I) and risperidone (Study II) in a group of patients with schizophrenia spectrum disorders. After 2 and 4 weeks of treatment, olanzapine had no significant effects on mismatch negativity (MMN) or P300, which have been suggested to reflect preattentive and attention-dependent information processing, respectively. After 2 weeks of treatment, risperidone had no significant effect on P300; however, it reduced the P200 amplitude. This effect of risperidone on the neural resources responsible for P200 generation could be partly explained through the action of dopamine. Subsequently, we used simultaneous EEG/MEG to investigate the effects of memantine (Study III) and methylphenidate (Study IV) in healthy subjects. We found that memantine modulated the MMN response without changing other ERP components. This could be interpreted as an influence of memantine, through NMDA receptors, on the auditory change-detection mechanism, with the processing of auditory stimuli otherwise remaining unchanged. Further, we found that methylphenidate did not modulate the MMN response. This finding could indicate that there is no association between catecholaminergic activity and the electrophysiological measures of preattentive auditory discrimination reflected in the MMN. However, methylphenidate decreased P200 amplitudes, which could be interpreted as a modulation of the auditory information processing reflected in P200 by the dopaminergic and noradrenergic systems. Taken together, our studies indicate a complex pattern of neurochemical influences on the neural substrate of auditory information processing, produced by antipsychotic drugs in patients with schizophrenia spectrum disorders and by pharmacological challenges in healthy subjects, as studied with ERPs and ERFs.
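The ERP components discussed here (MMN, P200, P300) are obtained by averaging EEG epochs time-locked to stimuli; the MMN in particular is the deviant-minus-standard difference wave. A minimal sketch of this computation with synthetic data (the array shapes, sampling rate and 100-250 ms analysis window are illustrative assumptions, not the thesis protocol):

import numpy as np

# Assumed synthetic data: epochs x channels x samples, 1 kHz sampling,
# each epoch spanning -100..400 ms relative to stimulus onset.
times = np.arange(-100, 400) / 1000.0  # seconds
rng = np.random.default_rng(0)
standard_epochs = rng.normal(0.0, 5.0, (200, 32, times.size))
deviant_epochs = rng.normal(0.0, 5.0, (50, 32, times.size))

# ERP = average over epochs for each stimulus type.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)

# MMN = deviant-minus-standard difference wave; its amplitude is often
# taken as the mean over a window around 100-250 ms post-stimulus.
mmn = erp_deviant - erp_standard
window = (times >= 0.100) & (times <= 0.250)
mmn_amplitude = mmn[:, window].mean(axis=1)  # one value per channel
print(mmn_amplitude.shape)  # (32,)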

Relevance: 20.00%

Abstract:

It has been suggested that semantic information processing is modularized according to input form (e.g., visual, verbal, non-verbal sound). A great deal of research has concentrated on detecting a separate verbal module. It has also traditionally been assumed in linguistics that the meaning of a single clause is computed before integration into a wider context. Recent research has called these views into question. The present study explored whether it is reasonable to assume separate verbal and nonverbal semantic systems in the light of evidence from event-related potentials (ERPs). The study also provided information on whether context influences the processing of a single clause before its local meaning is computed. The focus was on an ERP component called the N400, whose amplitude is assumed to reflect the effort required to integrate an item into the preceding context. For instance, if a word is anomalous in its context, it will elicit a larger N400. The N400 has been observed in experiments using both verbal and nonverbal stimuli. The contents of a single sentence alone were not hypothesized to influence the N400 amplitude; only the combined contents of the sentence and the picture were. The subjects (n = 17) viewed pictures on a computer screen while hearing sentences through headphones. Their task was to judge the congruency of the picture and the sentence. There were four conditions: 1) the picture and the sentence were congruent and sensible, 2) the picture and the sentence were congruent, but the sentence ended anomalously, 3) the picture and the sentence were incongruent but sensible, 4) the picture and the sentence were incongruent and anomalous. Stimuli from the four conditions were presented in a semi-randomized sequence while the electroencephalogram was recorded, and ERPs were computed for each condition. The amplitude of the N400 effect was largest for the incongruent sentence-picture pairs, while the anomalously ending sentences did not elicit a larger N400 than the sensible ones. The results suggest that there is no separate verbal semantic system, and that the meaning of a single clause is not processed independently of its context.
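N400 effects like the one reported here are typically quantified as the mean ERP amplitude in a window around 300-500 ms and compared across the 2 x 2 design. A toy sketch with synthetic single-channel ERPs (the window, sampling rate and data are illustrative assumptions, not the study's pipeline):

import numpy as np

# Assumed synthetic ERPs (one channel, 0-800 ms at 1 kHz) for the four
# congruency x sensibility conditions of the experiment.
times = np.arange(0, 800) / 1000.0
rng = np.random.default_rng(1)
conditions = ["congruent/sensible", "congruent/anomalous",
              "incongruent/sensible", "incongruent/anomalous"]
erps = {c: rng.normal(0.0, 1.0, times.size) for c in conditions}

# Mean amplitude in the N400 window; a more negative value indicates
# a larger N400 (i.e., harder contextual integration).
window = (times >= 0.300) & (times <= 0.500)
n400 = {c: erps[c][window].mean() for c in conditions}

# Congruency effect: incongruent minus congruent conditions.
effect = (n400["incongruent/sensible"] + n400["incongruent/anomalous"]
          - n400["congruent/sensible"] - n400["congruent/anomalous"]) / 2.0
print(n400, effect)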

Relevance: 20.00%

Abstract:

Candida yeast species are widespread opportunistic microbes, which remain harmless unless the systemic or local defense system of the host becomes compromised. When they adhere to a fertile substrate, such as a moist, warm, protein-rich human mucosal membrane or a biomaterial surface, they become activated and start to grow pseudohyphae and true hyphae. Their growth is intricately guided by their ability to detect surface defects (which provide secure hiding places; thigmotropism) and nutrients (a source of energy; chemotropism). The hypothesis of this work was that the body mobilizes both non-specific and specific host defense against invading candidal cells and that these interactions involve resident epithelial cells, rapidly responding non-specific protector neutrophils and mast cells, as well as the antigen-presenting and -responding dendritic cell-lymphocyte-plasma cell system. It is supposed that Candida albicans, as a result of Darwinian pressure, has developed or utilizes strategies to evade these host defense reactions, e.g. by adhering to biomaterial surfaces and biofilms. The aim of the study was to assess the host defense by focusing on key molecules of the anti-candidal defense that are also more or less characteristic of the main cellular players in candida-host cell interactions. As a model for candida-host interaction, sections of chronic hyperplastic candidosis were used and compared with sections of non-infected leukoplakia and healthy tissue. In this thesis work, neutrophil-derived anti-candidal α-defensin was found not only diffusely all over the epithelium, but also as a strong α-defensin-rich superficial front probably able to slow down or prevent the penetration of candida into the epithelium. The neutrophil represents the main host defense cell in the epithelium, to which it can rapidly transmigrate from the circulation and where it forms organized multicellular units known as microabscesses (study I). Neutrophil-chemotactic interleukin-8 (IL-8) and its receptor (IL-8R) were studied and, surprisingly, were also found in the candidal cells, probably helping the candida to keep away from IL-8- and neutrophil-rich danger zones (study IV). Both leukocytes and resident epithelial cells contained TLR2, TLR4 and TLR6 receptors, able to recognize candidal structures via receptors similar to the Toll receptor of the fruit fly. It seems that candida can avoid host defense by stimulating the candida-permissive TLR2 instead of the candida-injurious TLR4 (study V). TLRs also provide the danger signal to the immune system, without which it will not be activated to respond specifically against candidal antigens. Indeed, diseased sites contained receptor activator of nuclear factor kappa B ligand (RANKL; study II), which is important for the antigen-capturing, -processing and -presenting dendritic cells and for T lymphocyte activation (study III). Chronic hyperplastic candidosis provides a disease model that is very useful for studying the local and systemic host factors which under normal circumstances restrain C. albicans to a harmless commensal state, but whose failure in e.g. HIV infection, cancer and aging may lead to chronic infection.

Relevance: 20.00%

Abstract:

This thesis examines the feasibility of a forest inventory method based on two-phase sampling for estimating forest attributes at the stand or substand level for forest management purposes. The method is based on multi-source forest inventory, combining auxiliary data consisting of remote sensing imagery or other geographic information with field measurements. The auxiliary data are utilized as first-phase data covering all inventory units. Various methods were examined for improving the accuracy of the forest estimates. Pre-processing of auxiliary data, in the form of correcting the spectral properties of aerial imagery, was examined (I), as was the selection of aerial image features for estimating forest attributes (II). Various spatial units were compared for extracting image features in a remote sensing aided forest inventory utilizing very high resolution imagery (III). A number of data sources were combined, and different weighting procedures were tested, in estimating forest attributes (IV, V). Correction of the spectral properties of aerial images proved to be a straightforward and advantageous method for improving the correlation between the image features and the measured forest attributes. Testing the different image features that can be extracted from aerial photographs (and other very high resolution images) showed that the images contain a wealth of relevant information that can be extracted only by utilizing the spatial organization of the image pixel values. Furthermore, careful selection of image features for the inventory task generally gives better results than feeding all extractable features into the estimation procedure. When the spatial units for extracting very high resolution image features were examined, an approach based on image segmentation generally showed advantages over the traditional sample-plot-based approach. Combining several data sources resulted in more accurate estimates than any of the individual data sources alone. The best combined estimate can be derived by weighting the estimates produced by the individual data sources by the inverses of their mean square errors. Although the plot-level estimation accuracy of a two-phase sampling inventory can be improved in many ways, the accuracy of forest estimates based mainly on single-view satellite and aerial imagery remains a relatively poor basis for making stand-level management decisions.
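The inverse mean-square-error weighting mentioned above has a simple closed form: each source's estimate is weighted by 1/MSE, normalized over the sources, which minimizes the variance of the combination when the source errors are independent. A minimal sketch with invented values (not the thesis data):

import numpy as np

# Assumed stand volume estimates (m3/ha) from three data sources and
# their estimated mean square errors.
estimates = np.array([210.0, 185.0, 198.0])
mse = np.array([900.0, 400.0, 625.0])

# Weight each source by the inverse of its MSE and normalize; the most
# accurate source (smallest MSE) gets the largest weight.
weights = (1.0 / mse) / (1.0 / mse).sum()
combined = (weights * estimates).sum()
print(weights.round(3), round(combined, 1))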

Relevance: 20.00%

Abstract:

Department of Forest Resource Management in the University of Helsinki has in years 2004?2007 carried out so-called SIMO -project to develop a new generation planning system for forest management. Project parties are organisations doing most of Finnish forest planning in government, industry and private owned forests. Aim of this study was to find out the needs and requirements for new forest planning system and to clarify how parties see targets and processes in today's forest planning. Representatives responsible for forest planning in each organisation were interviewed one by one. According to study the stand-based system for managing and treating forests continues in the future. Because of variable data acquisition methods with different accuracy and sources, and development of single tree interpretation, more and more forest data is collected without field work. The benefits of using more specific forest data also calls for use of information units smaller than tree stand. In Finland the traditional way to arrange forest planning computation is divided in two elements. After updating the forest data to present situation every stand unit's growth is simulated with different alternative treatment schedule. After simulation, optimisation selects for every stand one treatment schedule so that the management program satisfies the owner's goals in the best possible way. This arrangement will be maintained in the future system. The parties' requirements to add multi-criteria problem solving, group decision support methods as well as heuristic and spatial optimisation into system make the programming work more challenging. Generally the new system is expected to be adjustable and transparent. Strict documentation and free source code helps to bring these expectations into effect. Variable growing models and treatment schedules with different source information, accuracy, methods and the speed of processing are supposed to work easily in system. Also possibilities to calibrate models regionally and to set local parameters changing in time are required. In future the forest planning system will be integrated in comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular method of implementing the system and the use of a simple data transmission interface between modules and together with other systems. No major differences in parties' view of the systems requirements were noticed in this study. Rather the interviews completed the full picture from slightly different angles. In organisation the forest management is considered quite inflexible and it only draws the strategic lines. It does not yet have a role in operative activity, although the need and benefits of team level forest planning are admitted. Demands and opportunities of variable forest data, new planning goals and development of information technology are known. Party organisations want to keep on track with development. One example is the engagement in extensive SIMO-project which connects the whole field of forest planning in Finland.
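The simulate-then-optimise arrangement described above can be illustrated with a toy computation: enumerate alternative treatment schedules per stand, then pick one schedule per stand that maximizes an owner goal under a constraint. The stand data, schedules, income figures and volume constraint below are invented for illustration and are not part of SIMO:

import itertools

# Toy alternative treatment schedules per stand:
# (schedule name, net income in euros, remaining standing volume in m3).
stands = {
    "stand_1": [
        ("no treatment", 0.0, 250.0),
        ("thinning", 1800.0, 160.0),
        ("clearcut", 5200.0, 0.0),
    ],
    "stand_2": [
        ("no treatment", 0.0, 180.0),
        ("thinning", 1200.0, 120.0),
    ],
}
min_total_volume = 300.0  # assumed owner constraint over the holding

# Exhaustive optimisation: choose one schedule per stand so that total
# income is maximized subject to the remaining-volume constraint.
best_income, best_plan = None, None
names = list(stands)
for combo in itertools.product(*(stands[n] for n in names)):
    income = sum(s[1] for s in combo)
    volume = sum(s[2] for s in combo)
    if volume >= min_total_volume and (best_income is None or income > best_income):
        best_income = income
        best_plan = {n: s[0] for n, s in zip(names, combo)}

print(best_income, best_plan)
# -> 1800.0 {'stand_1': 'thinning', 'stand_2': 'no treatment'}

Real systems replace the exhaustive loop with heuristic or mathematical-programming optimisers, since the number of schedule combinations grows exponentially with the number of stands.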

Relevance: 20.00%

Abstract:

This thesis reports on investigations into the influence of heat treatment on the manufacturing of oat flakes. Sources of variation in oat flake quality are reviewed, covering the whole chain from the farm to the consumer. The most important quality parameters of oat flakes are the absence of lipid-hydrolysing enzymes, specific weight, thickness, breakage (fines) and water absorption. Flavour, colour and pasting properties are also important, but were not included in the experimental part of this study. Of particular interest was the role of heat processing. The first possible heat treatment may occur already during grain drying, which in Finland generally happens on the farm. At the mill, oats are often kilned to stabilise the product by inactivating lipid-hydrolysing enzymes. Almost invariably, steaming is used during flaking to soften the groats and reduce flake breakage. This thesis presents the use of a material science approach to investigating a complex system, typical of food processes. A combination of fundamental and empirical rheological measurements was used together with a laboratory-scale process to simulate industrial processing, and the results were verified by means of industrial trials. Industrially produced flakes at three thickness levels (nominally 0.75, 0.85 and 0.90 mm) were produced from kilned and unkilned oat groats, and the flake strength was measured at different moisture contents. Kilning was not found to significantly affect the force required to puncture a flake with a 2 mm cylindrical probe, which was taken as a measure of flake strength. To further investigate how heat processing contributes to flake quality, dynamic mechanical analysis was used to characterise the effect of heat on the mechanical properties of oats. A marked stiffening of the groat, of up to about a 50% increase in storage modulus, was observed during first heating at around 36 to 57°C. This was also observed in tablets prepared from ground groats and from extracted oat starch, and the stiffening was thus attributed to increased adhesion between starch granules. Groats were steamed in a laboratory steamer and tempered in an oven at 80 to 110°C for 30 to 90 min. The maximum force required to compress the steamed groats to 50% strain increased from 50.7 N to 57.5 N as the tempering temperature was increased from 80 to 110°C. Tempering conditions also affected water absorption. A significantly higher moisture content was observed for kilned (18.9%) than for unkilned (17.1%) groats, but kilning otherwise had no effect on groat height, maximum force or final force after a 5 s relaxation time. Flakes were produced from the tempered groats using a laboratory flaking machine with a roll gap of 0.4 mm. Apart from specific weight, the flake properties were not influenced by kilning. Tempering conditions, however, had significant effects on the specific weight, thickness and water absorption of the flakes, as well as on the amount of fine material (<2 mm) produced during flaking. Flake strength correlated significantly with groat strength and flake thickness. Trial flaking at a commercial mill confirmed that groat temperature after tempering influenced water absorption. Variation in flake strength was observed, but at the groat temperatures required to inactivate lipase it was rather small. Cold flaking of groats resulted in soft, floury flakes. The results presented in this thesis suggest that heating increased the adhesion between starch granules, which in turn increased the stiffness and brittleness of the groat. Brittle fracture, rather than plastic flow, during flaking could result in flaws and cracks in the flake, which would be expected to increase water absorption; this was indeed observed as the tempering temperature increased. Industrial trials conducted with different groat temperatures confirmed the main findings of the laboratory experiments. The approach used in the present study allowed the systematic study of the effect of interacting process parameters on product quality. There have been few scientific studies of oat processing, and these results can be used to understand the complex effects of process variables on flake quality. They also offer an insight into what happens as the oat groat is deformed into a flake.

Relevance: 20.00%

Abstract:

Milk microfiltration (0.05-0.2 µm) is a membrane separation technique which divides milk components into a casein-enriched fraction and a native whey fraction. Hitherto, the effect of intensive microfiltration, including a diafiltration step, on both cheese and whey processing had not been studied. The microfiltration performance of skimmed milk was studied with polymeric and ceramic MF membranes. The changes in cheese milk quality and ripening caused by decreased lactose, whey protein and ash content were studied. The effects of cheese milk modification by microfiltration on milk coagulation properties, cheese recovery yield, cheese composition, ripening and sensory quality, as well as on whey recovery yield and composition, were also studied. The functional properties of whey protein concentrate made from native whey were examined, and the detailed compositions of whey protein concentrate powders made from cheese wheys after cheese milk pretreatments, such as high-temperature heat treatment (HH), microfiltration (MF) and ultrafiltration (UF), were compared. The studied polymeric spiral-wound microfiltration membranes had 38.5% lower energy consumption, 30.1% higher retention of whey proteins in the milk retentate and 81.9% lower permeate flux values compared with the ceramic membranes. All the studied microfiltration membranes were able to separate the main whey proteins from skimmed milk. The optimal lactose content of Emmental cheese milk exceeded 3.2%, and reduction of the whey protein and ash content of cheese milk with high concentration factor (CF) values increased the rate of cheese ripening. Reduction of the whey protein content in cheese milk increased the concentration of caseinomacropeptide (CMP) in the total protein of cheese whey. Reduction of the milk whey protein, lactose and ash content shortened the rennet clotting time and increased the firmness of the coagulum. Cheese yield calculated from raw milk to cheese was lower with microfiltrated milks because of the native whey production. The amounts of α-lactalbumin (α-LA) and β-lactoglobulin (β-LG) were significantly higher in the reference whey, indicating that the HH, MF and UF milk pretreatments decrease the amounts of these valuable whey proteins in whey. Even low CF values in milk microfiltration (CF 1.4) reduced the nutritional value of cheese whey. From the point of view of the utilization of milk components, it would be beneficial if the amount of native whey and the CMP content of cheese whey could be maximized. Whey protein concentrate powders made from native whey had excellent functional properties, and their detailed amino acid composition differed from that of cheese whey protein concentrate powders.

Relevance: 20.00%

Abstract:

The average daily intake of folate, one of the B vitamins, falls below recommendations among the Finnish population. Bread and cereals are the main sources of folate, rye being the most significant single source. Processing is a prerequisite for the consumption of whole grain rye; however, little is known about the effect of processing on folates. Moreover, data on the bioavailability of endogenous cereal folates are scarce. The aim of this study was to examine the variation in folate contents in rye, as well as the effects of fermentation, germination and thermal processes on them. The bioavailability of endogenous rye folates was investigated in a four-week human intervention study. One of the objectives throughout the work was to optimise and evaluate analytical methods for determining folate contents in cereals. Affinity chromatographic purification followed by high-performance liquid chromatography (HPLC) was a suitable method for analysing cereal products for folate vitamers, and a microbiological assay with Lactobacillus rhamnosus reliably quantified the total folate; however, HPLC gave approximately 30% lower results than the microbiological assay. The folate content of rye was high and could be further increased by targeted processing. The vitamer distribution of whole grain rye was characterised by a large proportion of formylated vitamers, followed by 5-methyltetrahydrofolate. In sourdough fermentation of rye, the studied yeasts synthesized folate, while lactic acid bacteria mainly depleted it. Two endogenous bacteria isolated from rye flour were found to produce folate during fermentation. Inclusion of baker's yeast in sourdough fermentation raised the folate level so that the bread could contain more folate than the flour it was made of. Germination markedly increased the folate content of rye, with particularly high folate concentrations in the hypocotylar roots. Thermal treatments caused significant folate losses, but preceding germination compensated well for the losses. In the bioavailability study, moderate amounts of endogenous folates, in the form of different rye products and orange juice incorporated in the diet, improved folate status among healthy adults. Endogenous folates from rye and orange juice showed bioavailability similar to that of folic acid from fortified white bread. In brief, it was shown that the folate content of rye can be enhanced manifold by optimising and combining food processing techniques. This offers practical means to increase the daily intake of folate in a bioavailable form.

Relevance: 20.00%

Abstract:

Dimeric phenolic compounds, lignans and dilignols, form in the so-called oxidative coupling reaction of phenols. Enzymes such as peroxidases and laccases catalyze the reaction, using hydrogen peroxide or oxygen, respectively, as the oxidant and generating phenoxy radicals which couple together according to certain rules. In this thesis, the effects of the structures of the starting materials, monolignols, and the effects of reaction conditions such as pH and solvent system on this coupling mechanism and on its regio- and stereoselectivity have been studied. After the primary coupling of two phenoxy radicals, a very reactive quinone methide intermediate is formed. This intermediate reacts quickly with a suitable nucleophile, which can be, for example, an intramolecular hydroxyl group or another nucleophile such as water, methanol, or a phenolic compound in the reaction system. This reaction is catalyzed by acids. After the nucleophilic addition to the quinone methide, other hydrolytic reactions, rearrangements and elimination reactions occur, finally leading to stable dimeric structures called lignans or dilignols. Similar reactions occur also in the so-called lignification process, when a monolignol (or dilignol) reacts with the growing lignin polymer. New kinds of structures have been observed in this thesis. Dimeric compounds with a so-called spirodienone structure were observed to form both in the dehydrodimerization of methyl sinapate and in the beta-1-type cross-coupling reaction of two different monolignols. This beta-1-type dilignol with a spirodienone structure was the first synthesized and published dilignol model compound, and it has since been observed to exist as a fundamental construction unit in lignins. The enantioselectivity of the oxidative coupling reaction was also studied with the aim of obtaining enantiopure lignans and dilignols. Rather good enantioselectivity was obtained in the oxidative coupling reaction of two monolignols with chiral auxiliary substituents, using peroxidase/H2O2 as the oxidation system. This observation was published as one of the first enantioselective oxidative coupling reactions of phenols. Pure enantiomers of lignans were also obtained by using chiral cryogenic chromatography as a chiral resolution technique, which was shown to be an alternative route for preparing enantiopure lignans or lignin model compounds on a preparative scale.

Relevance: 20.00%

Abstract:

The Taita Hills in southeastern Kenya form the northernmost part of Africa's Eastern Arc Mountains, which have been identified by Conservation International as one of the top ten biodiversity hotspots on Earth. As with many areas of the developing world, over recent decades the Taita Hills have experienced significant population growth, leading to major changes in land use and land cover (LULC), as well as escalating land degradation, particularly soil erosion. Multi-temporal medium resolution multispectral optical satellite data, such as imagery from the SPOT HRV, HRVIR and HRG sensors, provide a valuable source of information for environmental monitoring and modelling at a landscape level at local and regional scales. However, utilization of multi-temporal SPOT data in quantitative remote sensing studies requires the removal of atmospheric effects and the derivation of the surface reflectance factor. Furthermore, for areas of rugged terrain, such as the Taita Hills, topographic correction is necessary to derive comparable reflectance throughout a SPOT scene. Reliable monitoring of LULC change over time, and modelling of land degradation and of human population distribution and abundance, are of crucial importance to sustainable development, natural resource management, biodiversity conservation, and understanding and mitigating climate change and its impacts. The main purpose of this thesis was to develop and validate enhanced processing of SPOT satellite imagery for use in environmental monitoring and modelling at a landscape level, in regions of the developing world with limited ancillary data availability. The Taita Hills formed the application study site, whilst the Helsinki metropolitan region was used as a control site for validation and assessment of the applied atmospheric correction techniques; there, multiangular reflectance field measurements were taken, and horizontal visibility meteorological data concurrent with image acquisition were available. The proposed historical empirical line method (HELM) for absolute atmospheric correction was found to be the only applied technique that could derive the surface reflectance factor within an RMSE of less than 0.02 reflectance units in the SPOT visible and near-infrared bands; an accuracy level identified as a benchmark for successful atmospheric correction. A multi-scale segmentation/object relationship modelling (MSS/ORM) approach was applied to map LULC in the Taita Hills from the multi-temporal SPOT imagery. This object-based procedure was shown to yield significant improvements over a uni-scale maximum-likelihood technique. The derived LULC data were used, in combination with low-cost GIS geospatial layers describing elevation, rainfall and soil type, to model degradation in the Taita Hills in the form of potential soil loss, utilizing the simple universal soil loss equation (USLE). Furthermore, human population distribution and abundance were modelled with satisfactory results using only SPOT- and GIS-derived data and non-Gaussian predictive modelling techniques. The SPOT-derived LULC data were found to be unnecessary as a predictor, because the first- and second-order image texture measurements had greater power to explain variation in dwelling unit occurrence and abundance. The ability of the procedures to be implemented locally in the developing world, using low-cost or freely available data and software, was considered.
The techniques discussed in this thesis are considered equally applicable to other medium- and high-resolution optical satellite imagery, as well as to the utilized SPOT data.
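For readers unfamiliar with empirical line atmospheric correction, the sketch below shows the basic idea behind methods of the HELM family: fit a per-band linear model between image digital numbers and known ground target reflectances, then invert it across the scene. The target values and data are invented for illustration and do not reproduce HELM itself, which additionally exploits historical (pseudo-invariant) targets:

import numpy as np

# Assumed calibration targets for one spectral band: raw digital
# numbers (DN) from the image and surface reflectance factors measured
# in the field for the same targets.
target_dn = np.array([38.0, 74.0, 121.0, 180.0])
target_reflectance = np.array([0.04, 0.12, 0.23, 0.38])

# Empirical line: reflectance = gain * DN + offset, by least squares.
gain, offset = np.polyfit(target_dn, target_reflectance, 1)

# Apply the fitted line to the whole band to obtain surface
# reflectance factor for every pixel.
band_dn = np.array([[40.0, 90.0], [150.0, 200.0]])
band_reflectance = gain * band_dn + offset
print(gain, offset)
print(band_reflectance)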

Relevance: 20.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so that we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être of the visual system. Following the hypothesis that optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach, we can model images of arbitrary size while still being able to estimate the model parameters from the data.
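As a minimal illustration of the kind of model referred to above, the sketch below runs FastICA on natural-image patches; with real photographs, the learned basis functions resemble the localized, oriented simple-cell receptive fields mentioned in the abstract. Random data stands in for the images here, so the result will not show that structure:

import numpy as np
from sklearn.decomposition import FastICA

# Stand-in data: with real natural images, sample the patches from
# photographs; random noise will not yield oriented filters.
rng = np.random.default_rng(0)
images = rng.normal(size=(10, 256, 256))

def sample_patches(imgs, n=5000, size=16):
    # Draw random size x size patches and flatten them to vectors.
    out = np.empty((n, size * size))
    for i in range(n):
        img = imgs[rng.integers(len(imgs))]
        y = rng.integers(0, img.shape[0] - size)
        x = rng.integers(0, img.shape[1] - size)
        out[i] = img[y:y + size, x:x + size].ravel()
    return out

patches = sample_patches(images)
patches -= patches.mean(axis=0)  # center the data

# FastICA whitens internally by default and estimates the independent
# components; the rows of components_ are the learned filters.
ica = FastICA(n_components=64, random_state=0, max_iter=500)
ica.fit(patches)
filters = ica.components_.reshape(64, 16, 16)
print(filters.shape)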

Relevance: 20.00%

Abstract:

The paradigm of computational vision hypothesizes that any visual function -- such as the recognition of your grandparent -- can be replicated by computational processing of the visual input. What are these computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which we attempt to learn the suitable computations from the natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and seven peer-reviewed publications; the purpose of the introduction is to illustrate the area of study to a reader who is not familiar with computational vision research. In the introduction, we briefly overview the primary challenges to visual processing and recall some of the current opinions on visual processing in the early visual systems of animals. Next, we describe the methodology used in our research and discuss the presented results; we have included in this discussion some additional remarks, speculations and conclusions that were not featured in the original publications. We present the following results in the publications of this thesis. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories which suggested that luminance and contrast are processed separately in natural systems because of their independence in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence. Further, we provide first-time reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent orientation) processing due to nonlinear projection pursuit with simple objective functions related to sparseness and response energy optimization. Then, we show that attempting to extract independent components of the nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable for priming, i.e. the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that, for artificial systems, some processing mechanisms could be selected from the environment without optimizing the mechanisms themselves. In summary, this thesis explores learning visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives and estimation algorithms. The presented work adds to the growing body of evidence showing that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms. The work also presents some predictions and ideas regarding biological visual processing.
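The luminance-contrast dependence result above is straightforward to probe: estimate local luminance as the patch mean and local contrast as the patch standard deviation, and measure their statistical dependence. A toy sketch with stand-in data (a real natural image is needed to reproduce the reported dependence, and the patch size is an illustrative choice):

import numpy as np

# Stand-in for a natural image; replace with a real photograph to see
# the dependence reported in the thesis.
rng = np.random.default_rng(0)
image = rng.normal(size=(512, 512))

# Local luminance = patch mean; local contrast = patch standard deviation.
size = 16
lum, con = [], []
for y in range(0, image.shape[0] - size, size):
    for x in range(0, image.shape[1] - size, size):
        patch = image[y:y + size, x:x + size]
        lum.append(patch.mean())
        con.append(patch.std())

# Correlation of contrast with luminance magnitude; independence would
# predict a value near zero.
r = np.corrcoef(np.abs(lum), con)[0, 1]
print(r)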