35 results for Ecosystem Function Analysis
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Rapid changes in biodiversity are occurring globally, as a consequence of anthropogenic disturbance. This has raised concerns, since biodiversity is known to significantly contribute to ecosystem functions and services. Marine benthic communities participate in numerous functions provided by soft-sedimentary ecosystems. Eutrophication-induced oxygen deficiency is a growing threat against infaunal communities, both in open sea areas and in coastal zones. There is thus a need to understand how such disturbance affects benthic communities, and what is lost in terms of ecosystem functioning if benthic communities are harmed. In this thesis, the status of benthic biodiversity was assessed for the open Baltic Sea, a system severely affected by broad-scale hypoxia. Long-term monitoring data made it possible to establish quantitative biodiversity baselines against which change could be compared. The findings show that benthic biodiversity is currently severely impaired in large areas of the open Baltic Sea, from the Bornholm Basin to the Gulf of Finland. The observed reduction in biodiversity indicates that benthic communities are structurally and functionally impoverished in several of the sub-basins due to the hypoxic stress. A more detailed examination of disturbance impacts (through field studies and experiments) on benthic communities in coastal areas showed that changes in benthic community structure and function took place well before species were lost from the system. The degradation of benthic community structure and function was directed by the type of disturbance and by its specific temporal and spatial characteristics. The observed shifts in benthic trait composition were primarily the result of reductions in species’ abundances, or of changes in demographic characteristics, such as the loss of large, adult bivalves. The reduction in community functions was expressed as declines in the benthic bioturbation potential and in secondary biomass production. The benthic communities and their degradation accounted for a substantial proportion of the changes observed in ecosystem multifunctionality. Individual ecosystem functions (i.e. measures of sediment ecosystem metabolism, elemental cycling, biomass production, organic matter transformation and physical structuring) were observed to differ in their response to increasing hypoxic disturbance. Interestingly, the results suggested that an impairment of ecosystem functioning could be detected at an earlier stage if multiple functions were considered. Importantly, the findings indicate that even small-scale hypoxic disturbance can reduce the buffering capacity of the sedimentary ecosystem and increase the susceptibility of the system to further stress. Although the results of the individual papers are context-dependent, their combined outcome implies that healthy benthic communities are important for sustaining overall ecosystem functioning as well as ecosystem resilience in the Baltic Sea.
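The benthic bioturbation potential mentioned above is commonly quantified with the community bioturbation potential index (BPc) in the form popularised by Solan et al. (2004). As a minimal illustrative sketch only (the thesis's exact metric, scoring scheme and data may differ, and the input values below are hypothetical):

```python
import numpy as np

def community_bioturbation_potential(biomass, abundance, mobility, reworking):
    """Community bioturbation potential, BPc = sum_i sqrt(B_i / A_i) * A_i * M_i * R_i,
    where B_i is biomass, A_i abundance, and M_i, R_i are categorical mobility
    and sediment-reworking scores for taxon i (Solan et al. 2004 form)."""
    biomass = np.asarray(biomass, dtype=float)
    abundance = np.asarray(abundance, dtype=float)
    per_capita = np.sqrt(biomass / abundance)  # mean individual body-size effect
    return float(np.sum(per_capita * abundance * np.asarray(mobility) * np.asarray(reworking)))

# Hypothetical three-taxon sample (g/m^2, ind/m^2, categorical scores):
bpc = community_bioturbation_potential(
    biomass=[12.0, 3.5, 0.8],
    abundance=[40, 120, 15],
    mobility=[2, 3, 1],
    reworking=[4, 2, 1],
)
print(f"BPc = {bpc:.1f}")
```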
Abstract:
The target of the thesis was to find out whether the decision to outsource part of the Filtronic LK warehouse function has been profitable. A further aim was to describe the current logistics processes between the third-party logistics provider (TPLP) and the company, and to identify targets for developing these processes. The decision to outsource part of the logistical functions proved profitable during the first business year. A partnership always involves business risks, and the risk is increased by highly asset-specific investments. On the other hand, investment in the partnership increases mutual trust and commitment between the parties. By developing the partnership, risks and opportunistic behaviour can be reduced. The potential of managing material and data flows between the logistics service provider and the company was examined. An analysis of inventory efficiency highlighted the need to decrease the capital invested in inventories. Recommendations for managing the outsourced logistical functions were established, such as improving the partnership, process development, performance measurement and invoice checking.
Abstract:
Coastal areas harbour high biodiversity, but are simultaneously affected by rapid degradation of species and habitats due to human activities. Such alterations also affect the functioning of the ecosystem, which is primarily governed by the characteristics or traits expressed by the organisms present. Marine benthic fauna is involved in numerous functions such as organic matter transformation and transport, secondary production, oxygen transport as well as nutrient cycling. Approaches utilising the variety of faunal traits to assess benthic community functioning have rapidly increased and shown the need for further development of the concept. In this thesis, I applied biological trait analysis, which allows for the assessment of a multitude of categorical traits and thus the evaluation of multiple functional aspects simultaneously. I determined the functional trait structure, diversity and variability of coastal zoobenthic communities in the Baltic Sea. The measures were related to recruitment processes, habitat heterogeneity, large-scale environmental and taxonomic gradients as well as anthropogenic impacts. The studies comprised spatial scales from metres to thousands of kilometres, and temporal scales spanning one season as well as a decade. The benthic functional structure was found to vary within and between seagrass landscape microhabitats and four different habitats within a coastal bay, in papers I and II respectively. Expressions of trait categories varied within habitats, while the density of individuals was found to drive the functional differences between habitats. The findings in paper III unveiled a high trait richness of Finnish coastal benthos (25 traits and 102 categories), although this differed between areas high and low in salinity and human pressure. In paper IV, the natural reduction in taxonomic richness across the Baltic Sea led to an overall reduction in function. However, functional richness in terms of the number of trait categories remained comparatively high at low taxon richness. Changes in the number of taxa within trait categories were also subtle, and some individual categories were maintained or even increased. The temporal analysis in papers I and III highlighted generalities in trait expressions and dominant trait categories in a seagrass landscape as well as a “type organism” for the northern Baltic Sea. Some initial findings were made in all four papers on the role of common and rare species and traits for benthic community functioning. The findings show that common and rare species may not always express the same trait categories in relation to each other. Rare species in general did not express unique functional properties. In order to advance the understanding of the approach, I also assessed some issues concerning the limitations of the concept. This was conducted by evaluating the link between trait category and taxonomic richness using, in particular, univariate measures. My results also show the need to collaborate nationally and internationally on safeguarding the utility of taxonomic and trait data. The findings also highlight the importance of including functional trait information in current efforts in marine spatial planning and biomonitoring.
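As a rough illustration of the core arithmetic behind biological trait analysis (a sketch under assumed data, not the thesis's actual trait set or samples): community-level trait expression is obtained by weighting a fuzzy-coded taxa-by-trait-category matrix with taxon abundances.

```python
import numpy as np

# Fuzzy-coded trait matrix: rows = taxa, columns = trait categories.
# Scores express each taxon's affinity to a category.
trait_scores = np.array([
    # burrower  tube-dweller  surface  suspension-f.  deposit-f.
    [3, 0, 1, 0, 3],   # e.g. a deposit-feeding bivalve (hypothetical)
    [0, 3, 0, 3, 0],   # e.g. a tube-building suspension feeder
    [1, 0, 3, 0, 2],   # e.g. a surface-grazing gastropod
], dtype=float)

abundance = np.array([40.0, 120.0, 15.0])  # individuals per sample

# Core step: abundance-weight the trait scores to get the
# community-level expression of each trait category.
community_traits = abundance @ trait_scores

# Trait-category richness: number of categories expressed at all.
trait_richness = int(np.count_nonzero(community_traits))
print(community_traits, trait_richness)
```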
Abstract:
This work is devoted to the problem of reconstructing the basis weight structure of the paper web with black-box techniques. The data that is analyzed comes from a real paper machine and was collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analyzing properties of the paper web. Both ARMA and DFT are used independently to represent the given signal in a simplified version of our algorithm, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the Root Mean Squared Error coefficient, gives a tool to separate significant signals from noise.
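A minimal sketch of this workflow (assuming a synthetic stand-in signal and the statsmodels library; the model orders and data here are illustrative, not the thesis's):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
n = 512
# Synthetic stand-in for a scanned basis-weight profile: a periodic
# component buried in noise.
t = np.arange(n)
y = 0.5 * np.sin(2 * np.pi * t / 32) + rng.normal(scale=0.3, size=n)

# Fit an ARMA(p, q) model (ARIMA with d = 0).
model = ARIMA(y, order=(4, 0, 2)).fit()
residuals = model.resid

# Ljung-Box Q lack-of-fit test: small p-values mean the residuals are
# still autocorrelated, i.e. the model has not captured all structure.
lb = acorr_ljungbox(residuals, lags=[20], return_df=True)
rmse = np.sqrt(np.mean(residuals ** 2))
print(lb, f"RMSE = {rmse:.3f}")

# DFT of the raw signal reveals the dominant periodic component.
spectrum = np.abs(np.fft.rfft(y))
peak_bin = np.argmax(spectrum[1:]) + 1
peak_period = n / peak_bin   # ~32 samples for the signal above
```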
Abstract:
The new EPR reactor concept is designed to cope with accidents in which the reactor core melts and the melt breaches the pressure vessel. Inside the containment, an area has been designed where the melt is passively collected, retained and cooled. In this area, a so-called core catcher is constructed from cast iron elements and flooded with water. The decay heat produced by the corium is transferred to the water, from which it is removed via the containment heat removal system. A large part of the heat is removed from the corium to the water above it, but to enhance heat transfer, water-filled cooling channels have also been placed beneath the core catcher. To verify the operation of the core catcher, the Volley test facility has been built at Lappeenranta University of Technology. The facility consists of two full-scale cooling channels made of cast iron, and the decay heat produced by the corium is simulated with electric resistance heaters. This thesis describes how the simulations were performed and compares the computed values with the measured results. The work focuses on the theory and mechanisms of heat transfer from the core catcher to the cooling channels. Three different correlations for heat transfer coefficients in pool boiling are presented. These correlations are particularly suitable for cases where only a few measured parameters are known. The second part of the work is the simulation of the Volley 04 experiments. The simulation approach was first validated by comparing the results with those Volley 04 and 05 experiments in which the test could be run to a steady state and the behaviour of the coolant in the cooling channel was also recorded on video. The results of these simulations agree very well with the measurements. At higher heating powers, water hammers occurred in the experiments and broke the windows that made the video recording possible. For this reason, in some of the Volley 04 experiments the windows were covered with metal plates. Some experiments had to be interrupted because of large thermal stresses in the facility. Simulations of such tests are not simple to perform: there is no visual observation of the water level, and the steady-state coolant temperatures are not known exactly, although some assumptions can be made on the basis of the Volley 05 experiments run with the same parameters. The measurements from the Volley 04 and 05 experiments that were recorded on video and could be run to a steady state gave temperature values very similar to the simulations. Extrapolating the interrupted experiments to a steady state did not succeed very well: the experiments had to be interrupted so long before thermohydraulic equilibrium that the steady-state boundary conditions could not be predicted, and without video recording no additional information on the water level was available. From these results, mainly order-of-magnitude estimates of the temperatures at the measurement points can be given. These temperatures are, however, clearly below the melting temperature of the cast iron used in the core catcher. Based on the simulations, it can therefore be said that the structures of the cooling channels will not melt as long as there is even a small coolant flow in them and no more than a few adjacent channels are completely dry.
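The abstract does not name the three correlations. As one standard, hedged example of the kind of pool-boiling heat-transfer correlation referred to, the classic Rohsenow correlation can be sketched as follows (the fluid properties below are textbook values for saturated water at 1 atm, not the Volley test conditions, and this is not necessarily one of the correlations used in the thesis):

```python
import math

def rohsenow_heat_flux(dT_e, mu_l=2.82e-4, h_fg=2.257e6, rho_l=957.9,
                       rho_v=0.596, sigma=0.0589, c_pl=4217.0,
                       Pr_l=1.76, C_sf=0.013, n=1.0, g=9.81):
    """Rohsenow nucleate pool boiling correlation. Defaults: saturated
    water at 1 atm; C_sf and n depend on the fluid-surface pair.
    Returns heat flux q'' [W/m^2] for a wall superheat dT_e [K]."""
    bubble_term = math.sqrt(g * (rho_l - rho_v) / sigma)
    driving = c_pl * dT_e / (C_sf * h_fg * Pr_l ** n)
    return mu_l * h_fg * bubble_term * driving ** 3

dT_e = 10.0                      # wall superheat [K]
q = rohsenow_heat_flux(dT_e)
h = q / dT_e                     # heat transfer coefficient [W/(m^2 K)]
print(f"q'' = {q / 1e3:.0f} kW/m^2, h = {h / 1e3:.1f} kW/(m^2 K)")
```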
Abstract:
In a centrifugal compressor the flow around the diffuser is collected and led to the pipe system by a spiral-shaped volute. In this study a single-stage centrifugal compressor with three different volutes is investigated. The compressor was first equipped with the original volute, the cross-section of which was a combination of a rectangle and a semi-circle. Next a new volute with a fully circular cross-section was designed and manufactured. Finally, the circular volute was modified by rounding the tongue and smoothing the tongue area. The overall performance of the compressor as well as the static pressure distribution after the impeller and on the volute surface were measured. The flow entering the volute was measured using a three-hole Cobra probe, and flow visualisations were carried out in the exit cone of the volute. In addition, the radial force acting on the impeller was measured using magnetic bearings. The complete compressor with the circular volute (inlet pipe, full impeller, diffuser, volute and outlet pipe) was also modelled using computational fluid dynamics (CFD). The fully 3-D viscous flow was solved using a Navier-Stokes solver, Finflo, developed at Helsinki University of Technology. Chien's k-e model was used to take account of the turbulence. The differences observed in the performance of the different volutes were quite small. The biggest differences were at low speeds and high volume flows, i.e. when the flow entered the volute most radially. In this operating regime the efficiency of the compressor with the modified circular volute was about two percentage points higher than with the other volutes. Also, according to the Cobra-probe measurements and flow visualisations, the modified circular volute performed better than the other volutes in this operating area. The circumferential static pressure distribution in the volute showed an increase at low flow, a constant distribution at the design flow and a decrease at high flow. The non-uniform static pressure distribution of the volute was transmitted backwards across the vaneless diffuser and observed at the impeller exit. At low volume flow a strong two-wave pattern developed in the static pressure distribution at the impeller exit due to the response of the impeller to the non-uniformity of pressure. The radial force on the impeller was greatest at the choke limit, smallest at the design flow, and moderate at low flow. At low flow the force increase was quite mild, whereas the increase at high flow was rapid. Thus, the non-uniformity of pressure and the force related to it are strong especially at high flow. The force caused by the modified circular volute was weaker at choke and more symmetric as a function of the volume flow than the force caused by the other volutes.
Abstract:
Electric motors driven by adjustable-frequency converters may produce periodic excitation forces that can cause torque and speed ripple. Interaction with the driven mechanical system may cause undesirable vibrations that affect the system performance and lifetime. Direct drives in sensitive applications, such as elevators or paper machines, emphasize the importance of smooth torque production. This thesis analyses the non-idealities of frequency converters that produce speed and torque ripple in electric drives. The origin of low-order harmonics in speed and torque is examined, and it is shown how different types of current measurement error affect the torque. As the application environment, the direct torque control (DTC) method is applied to permanent magnet synchronous machines (PMSM). A simulation model is created to analyse the effect of the frequency converter non-idealities on the performance of electric drives. The model makes it possible to identify potential problems causing torque vibrations and possibly damaging oscillations in electrically driven machine systems. The model can be coupled with separate simulation software for complex mechanical loads. Furthermore, the simulation model of the frequency converter's control algorithm can be applied to control a real frequency converter. A commercial frequency converter with standard software, a permanent magnet axial flux synchronous motor and a DC motor as the load are used to detect the effect of current measurement errors on the load torque. A method to reduce the speed and torque ripple by compensating the current measurement errors is introduced. The method is based on analysing the amplitude of a selected harmonic component of the speed as a function of time and selecting a suitable compensation alternative for the current error. The speed can be either measured or estimated, so the compensation method is applicable also to speed-sensorless drives. The proposed compensation method is tested with a laboratory drive, which consists of commercial frequency converter hardware with self-made software and a prototype PMSM. The speed and torque ripple of the test drive are reduced by applying the compensation method. In addition to direct torque controlled PMSM drives, the compensation method can also be applied to other motor types and control methods.
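A hedged sketch of the harmonic-tracking step described above (function and signal names are hypothetical; as a rule of thumb, current-offset errors typically show up in speed/torque at the electrical fundamental frequency and gain errors at twice it):

```python
import numpy as np

def harmonic_amplitude(speed, fs, f_target, window_s=0.5):
    """Track the amplitude of one harmonic component of the measured or
    estimated speed over time with a sliding-window DFT. A compensation
    loop would perturb the current-offset estimate and keep the change
    whenever this tracked amplitude decreases."""
    n_win = int(window_s * fs)
    k = np.arange(n_win)
    win = np.hanning(n_win)
    # Complex demodulation at the target harmonic frequency.
    carrier = np.exp(-2j * np.pi * f_target * k / fs)
    amps = []
    for start in range(0, len(speed) - n_win, n_win // 2):
        seg = speed[start:start + n_win] * win
        amps.append(2.0 * abs(np.dot(seg, carrier)) / win.sum())
    return np.array(amps)

# Example: 50 Hz ripple on a 1500 rpm mean speed, sampled at 5 kHz.
fs = 5000.0
t = np.arange(0, 4.0, 1 / fs)
speed = 157.0 + 0.4 * np.sin(2 * np.pi * 50.0 * t)
print(harmonic_amplitude(speed, fs, f_target=50.0)[:3])  # ~0.4 rad/s
```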
Abstract:
This thesis reviews temporal and stochastic software reliability models and studies a few of the models in practice. The theoretical part of the thesis contains the key definitions and metrics used in describing and assessing software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of models based on the failure risk (hazard rate). The second group comprises models based on fault "seeding" and fault significance. The empirical part of the thesis contains the descriptions and results of the experiments. The experiments were carried out using three models of the first group: the Jelinski-Moranda model, the first geometric model and the simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models, and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved to be the most sensitive to the distribution, owing to convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
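For concreteness, a sketch of the Jelinski-Moranda model (an assumed profile-likelihood implementation with hypothetical failure data): the model posits N initial faults and a hazard rate phi*(N - i + 1) between failures i-1 and i, so each fix removes one fault. The convergence problems mentioned above arise when this profile likelihood has no finite maximum in N.

```python
import numpy as np

def jelinski_moranda_fit(times, max_extra=200):
    """Profile-likelihood fit of the Jelinski-Moranda model to
    inter-failure times. Returns (log-likelihood, N_hat, phi_hat)."""
    x = np.asarray(times, dtype=float)
    n = len(x)
    i = np.arange(1, n + 1)
    best = (-np.inf, None, None)
    for N in range(n, n + max_extra):
        remaining = N - i + 1
        phi = n / np.sum(remaining * x)      # MLE of phi for fixed N
        loglik = np.sum(np.log(phi * remaining) - phi * remaining * x)
        if loglik > best[0]:
            best = (loglik, N, phi)
    return best

# Hypothetical inter-failure times (hours) with lengthening gaps:
ll, N_hat, phi_hat = jelinski_moranda_fit([5, 7, 13, 19, 31, 43, 70])
print(N_hat, phi_hat)
```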
Abstract:
Coherent anti-Stokes Raman scattering (CARS) is a powerful method of laser spectroscopy with which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the measured spectra. The objective of this thesis is to develop a new phase retrieval algorithm for CARS. It utilizes the maximum entropy method and a new wavelet approach for the spectroscopic background correction of the phase function. The method was developed to be easily automated and usable on a large number of spectra of different substances. The algorithm was successfully tested on experimental data.
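A minimal sketch of the wavelet background-correction idea (using the PyWavelets library and synthetic data; the wavelet family, decomposition level and the maximum entropy phase retrieval step itself are assumptions, not the thesis's implementation):

```python
import numpy as np
import pywt

def wavelet_background(phase, wavelet="sym8", level=6):
    """Estimate the slowly varying background of a retrieved phase
    function by keeping only the coarse approximation of a discrete
    wavelet decomposition and zeroing all detail coefficients."""
    coeffs = pywt.wavedec(phase, wavelet, mode="smooth", level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet, mode="smooth")[: len(phase)]

# Hypothetical retrieved phase: narrow Raman lines on a broad error term.
x = np.linspace(0, 1, 1024)
lines = (0.8 * np.exp(-((x - 0.4) / 0.004) ** 2)
         + 0.5 * np.exp(-((x - 0.6) / 0.006) ** 2))
background = 0.3 * np.sin(2 * np.pi * 0.7 * x) + 0.2 * x
phase = lines + background

corrected = phase - wavelet_background(phase)  # background-free line spectrum
```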
Abstract:
Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus dreaming is considered a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework, and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted to test this specific theory put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what counts as a 'threatening event' in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life? What are the effects of exposure to severe waking-life threat on dreams? The results reveal that threatening events are relatively frequent in dreams, and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism, it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted to test the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems to have always resided in the periphery of science, never regarded by the mainstream as worth studying.
Nevertheless, when brought to the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge learned from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs, that is, into the spotlight at the crossroads of various disciplines.
Abstract:
The main objective of this master’s thesis was to quantitatively study the reliability of the market and sales forecasts of a certain company by measuring the bias, precision and accuracy of these forecasts against actual values. Secondly, the differences in bias, precision and accuracy between markets were explained by various macroeconomic variables and market characteristics. The accuracy and precision of the forecasts seem to vary significantly depending on the market that is being forecasted, the variable that is being forecasted, the estimation period, the length of the estimated period, the forecast horizon and the granularity of the data. High inflation, a low income level and high year-on-year market volatility seem to be related to higher annual market forecast uncertainty, and high year-on-year sales volatility to higher sales forecast uncertainty. When quarterly market size is forecasted, the correlation between macroeconomic variables and forecast errors decreases. The uncertainty of the sales forecasts cannot be explained by macroeconomic variables. Longer forecasts are more uncertain, a shorter estimated period leads to higher uncertainty, and more recent market forecasts are usually less uncertain. Sales forecasts seem to be more uncertain than market forecasts, because they incorporate both market size and market share risks. When the lead time is more than one year, forecast risk seems to grow as a function of the square root of the forecast horizon. When the lead time is less than a year, sequential error terms are typically correlated, and therefore forecast errors are trending or mean-reverting. The bias of the forecasts seems to change in cycles, and therefore future forecasts cannot be systematically adjusted for it. The mean absolute scaled error (MASE) cannot be used to measure whether a forecast anticipates year-on-year volatility; instead, we constructed a new relative accuracy measure to cope with this particular situation.
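A sketch of the basic diagnostics named above, with hypothetical numbers (the thesis's new relative accuracy measure is not reproduced here; the MASE variant shown scales by the in-sample naive lag-1 error):

```python
import numpy as np

def forecast_diagnostics(actual, forecast, naive_lag=1):
    """Bias, precision and accuracy of a forecast series, plus a scaled
    error in the spirit of the MASE (mean absolute scaled error)."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    err = f - a
    naive_mae = np.mean(np.abs(a[naive_lag:] - a[:-naive_lag]))
    return {
        "bias": err.mean(),                      # systematic over/under-forecast
        "precision": err.std(ddof=1),            # spread of the errors
        "mae": np.abs(err).mean(),               # accuracy (absolute)
        "mape": np.mean(np.abs(err / a)) * 100,  # accuracy (relative, %)
        "mase": np.abs(err).mean() / naive_mae,  # < 1 beats the naive forecast
    }

print(forecast_diagnostics(actual=[100, 104, 98, 110, 107],
                           forecast=[102, 101, 103, 104, 111]))
```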
Abstract:
The uncertainty of any analytical determination depends on both analysis and sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy’s sampling theory is currently the most complete theory about sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work Gy’s sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy’s sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties achieved are reliable. The variographic experiments introduced in Gy’s sampling theory are beneficially applied in analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with the fast Fourier transform and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, as it can easily be estimated how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources will be used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may in the end have beneficial effects on the economics of chemical analysis.
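A minimal sketch of a variographic experiment (the standard experimental variogram computed on synthetic auto-correlated data; function and variable names are illustrative):

```python
import numpy as np

def experimental_variogram(h, max_lag=None):
    """Experimental variogram of a 1-D process series, as used in
    Gy-style variographic analysis:
        V(j) = sum_i (h[i+j] - h[i])^2 / (2 * (N - j)).
    The value extrapolated to lag 0 estimates the minimum sampling plus
    analysis variance; the rise with lag shows how a longer sampling
    interval inflates the overall uncertainty."""
    h = np.asarray(h, dtype=float)
    N = len(h)
    max_lag = max_lag or N // 2
    lags = np.arange(1, max_lag + 1)
    V = np.array([np.sum((h[j:] - h[:-j]) ** 2) / (2.0 * (N - j))
                  for j in lags])
    return lags, V

# Hypothetical auto-correlated process data (e.g. hourly discharge values):
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=500)) * 0.05 + rng.normal(scale=0.5, size=500)
lags, V = experimental_variogram(x, max_lag=50)
```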
Abstract:
This study presents an automatic, computer-aided analytical method called Comparison Structure Analysis (CSA), which can be applied to different dimensions of music. The aim of CSA is first and foremost practical: to produce dynamic and understandable representations of musical properties by evaluating the prevalence of a chosen musical data structure through a musical piece. Such a comparison structure may refer to a mathematical vector, a set, a matrix or another type of data structure, and even a combination of data structures. CSA depends on an abstract systematic segmentation that allows for a statistical or mathematical survey of the data. To choose a comparison structure is to tune the apparatus to be sensitive to an exclusive set of musical properties. CSA settles somewhere between traditional music analysis and computer-aided music information retrieval (MIR). Theoretically defined musical entities, such as pitch-class sets, set-classes and particular rhythm patterns, are detected in compositions using pattern extraction and pattern comparison algorithms that are typical within the field of MIR. In principle, the idea of comparison structure analysis can be applied to any time-series type of data and, in the music-analytical context, to polyphonic as well as homophonic music. Tonal trends, set-class similarities, invertible counterpoints, voice-leading similarities, short-term modulations, rhythmic similarities and multiparametric changes in musical texture were studied. Since CSA allows for a highly accurate classification of compositions, its methods may be applicable to symbolic music information retrieval as well. The strength of CSA lies especially in the possibility of making comparisons between observations concerning different musical parameters and of combining it with statistical and perhaps other music-analytical methods. The results of CSA depend on the adequacy of the similarity measure. New similarity measures for tonal stability, rhythmic and set-class similarity measurements were proposed. The most advanced results were attained by employing automated function generation (comparable with so-called genetic programming) to search for an optimal model for set-class similarity measurements. However, the results of CSA seem to agree strongly, independent of the type of similarity function employed in the analysis.
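As a hedged illustration of one such pattern-comparison building block (a generic textbook construction, not one of the similarity measures proposed in the study): interval-class vectors of pitch-class sets compared with a simple cosine similarity.

```python
import numpy as np
from itertools import combinations

def interval_class_vector(pcset):
    """Interval-class vector of a pitch-class set: counts of the six
    interval classes (1..6) over all unordered pairs, mod 12."""
    icv = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        d = (b - a) % 12
        icv[min(d, 12 - d) - 1] += 1
    return np.array(icv, dtype=float)

def icv_similarity(set_a, set_b):
    """Cosine similarity between interval-class vectors: a simple
    stand-in for a set-class similarity measure."""
    u, v = interval_class_vector(set_a), interval_class_vector(set_b)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Major and minor triads share the same ICV, so similarity is 1.0:
print(icv_similarity([0, 4, 7], [0, 3, 7]))
```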
Abstract:
B lymphocytes constitute a key branch of adaptive immunity by providing the specificity to recognize a vast variety of antigens through B cell antigen receptors (BCR) and secreted antibodies. Antigen recognition activates the cells and can produce antibody-secreting plasma cells via the germinal center reaction, which leads to the maturation of antigen recognition affinity and the switching of the antibody effector class. The specificity of antigen recognition is achieved through a multistep developmental pathway that is organized by the interplay of transcription factors and signals through the BCR. Lymphoid malignancies arise from different stages of development when transcriptional regulation malfunctions. To understand B cell development and the function of B cells, a thorough understanding of the regulation of gene expression is important. The transcription factors of the Ikaros family and Bcl6 are frequently associated with lymphoma generation. The aim of this study was to reveal the targets of Ikaros-, Helios- and Bcl6-mediated gene regulation and to find out the function of Ikaros and Helios in B cells. This study uses gene-targeted DT40 B cell lines and establishes a role for the Ikaros family factors Ikaros and Helios in the regulation of BCR signaling, which is important at developmental checkpoints, for cell survival and in activation. Ikaros and Helios had opposing roles in the regulation of BCR signals. Ikaros was found to directly repress the SHIP gene, which encodes a signaling lipid-metabolizing enzyme, whereas Helios had an activating effect on SHIP expression. The findings demonstrate a balancing function for these two Ikaros family transcription factors in the regulation of BCR signaling as well as in the regulation of gene expression. Bcl6 was found to repress the plasma cell gene expression program while maintaining the gene expression profile of B cells. Analysis of direct Bcl6 target genes suggested novel mechanisms for the Bcl6-mediated suppression of plasma cell differentiation and the promotion of the germinal center phenotype.
Abstract:
Diabetes is a rapidly increasing worldwide problem characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) together with automatic or semi-automatic image analysis algorithms provides a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For the training and ground truth estimation, the algorithm combines manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm’s performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is a benchmarking framework for eye fundus image analysis algorithms, needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations. During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made publicly available on the web to set baseline results for the automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented. Optic disc localisation is discussed, since normal eye fundus structures are fundamental in the characterisation of DR.
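A minimal sketch of the one-class idea (scikit-learn's GaussianMixture fitted to hypothetical lesion colour samples; the actual algorithm, colour spaces, expert-annotation fusion and thresholding protocol differ):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# One-class classification by density estimation: fit a GMM only to
# colour samples of the target lesion type, then flag any pixel whose
# likelihood under that model exceeds a threshold chosen on a
# validation set (e.g. for a target operating point on the ROC curve).
rng = np.random.default_rng(0)
lesion_rgb = rng.normal(loc=[150, 40, 30], scale=12, size=(2000, 3))  # hypothetical

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(lesion_rgb)

# Threshold the log-density so that 95% of training lesion pixels pass.
threshold = np.quantile(gmm.score_samples(lesion_rgb), 0.05)

def is_lesion_colour(pixels):
    """Score image pixels of shape (n, 3) against the lesion colour model."""
    return gmm.score_samples(pixels) >= threshold

test = np.array([[148.0, 42.0, 28.0], [60.0, 120.0, 200.0]])
print(is_lesion_colour(test))   # typically -> [ True False ]
```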