916 results for Topology-based methods


Relevance: 80.00%

Publisher:

Abstract:

Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which calls into question the reliability of count-based indices for estimating and comparing population abundances. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and resource availability.
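The two-process view in the abstract above (an index count is true abundance filtered through detectability) can be illustrated with a minimal simulation. The abundances, detection probabilities, and site labels are hypothetical, assuming simple binomial detection:

```python
import random

random.seed(1)

def simulate_count(n_true, p_detect):
    """One transect count: each individual is detected with probability p_detect."""
    return sum(random.random() < p_detect for _ in range(n_true))

# Two hypothetical sites with the SAME true abundance (100 individuals)
# but different detectability (e.g. open vs. dense vegetation).
counts_open = [simulate_count(100, 0.6) for _ in range(1000)]
counts_dense = [simulate_count(100, 0.3) for _ in range(1000)]

mean_open = sum(counts_open) / len(counts_open)     # index count near 60
mean_dense = sum(counts_dense) / len(counts_dense)  # index count near 30

# Dividing the index by an (independently estimated) detection probability
# recovers comparable abundance estimates for both sites.
n_hat_open = mean_open / 0.6
n_hat_dense = mean_dense / 0.3
```

Ignoring detectability, the raw indices would suggest very different population sizes for identical true abundances, which is exactly the bias the abstract warns about.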

Relevance: 80.00%

Publisher:

Abstract:

As culture-based methods for the diagnosis of invasive fungal diseases (IFD) in leukemia and hematopoietic SCT patients have limited performance, non-culture methods are increasingly being used. The third European Conference on Infections in Leukemia (ECIL-3) meeting aimed at establishing evidence-based recommendations for the use of biological tests in adult patients, based on the grading system of the Infectious Diseases Society of America. The following biomarkers were investigated as screening tests: galactomannan (GM) for invasive aspergillosis (IA); β-glucan (BG) for invasive candidiasis (IC) and IA; Cryptococcus Ag for cryptococcosis; mannan (Mn) Ag/anti-mannan (A-Mn) Ab for IC; and PCR for IA. Testing for GM, Cryptococcus Ag and BG is included in the revised EORTC/MSG (European Organization for Research and Treatment of Cancer/Mycoses Study Group) consensus definitions for IFD. Strong evidence supports the use of GM in serum (A II), and Cryptococcus Ag in serum and cerebrospinal fluid (CSF) (A II). Evidence is moderate for BG detection in serum (B II), and the combined Mn/A-Mn testing in serum for hepatosplenic candidiasis (B III) and candidemia (C II). No recommendations were formulated for the use of PCR owing to a lack of standardization and clinical validation. The clinical utility of these markers for the early management of IFD should be further assessed in prospective randomized interventional studies.

Relevance: 80.00%

Publisher:

Abstract:

Background Nowadays, combining different sources of information to improve the available biological knowledge is a challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, the right kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge.
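The two steps named in the abstract (choose a kernel per source, then combine) can be sketched as follows. The data, the Gaussian kernels, and the unweighted-sum combination rule are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, gamma):
    """Gram matrix of a Gaussian (RBF) kernel for one data source."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Two hypothetical heterogeneous sources describing the same 30 samples.
X1 = rng.normal(size=(30, 5))   # e.g. expression-like features
X2 = rng.normal(size=(30, 12))  # e.g. annotation-like features

# Step 1: a kernel per data set; step 2: combine (here an unweighted sum).
K = rbf_kernel(X1, gamma=0.1) + rbf_kernel(X2, gamma=0.05)

# Kernel PCA: double-center the combined Gram matrix and eigendecompose.
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H
eigvals, eigvecs = np.linalg.eigh(Kc)
idx = np.argsort(eigvals)[::-1]

# Projection of the samples onto the first two kernel principal components.
scores = eigvecs[:, idx[:2]] * np.sqrt(np.abs(eigvals[idx[:2]]))
```

Any weighted kernel combination could replace the plain sum; the dimensionality reduction step is unchanged.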

Relevance: 80.00%

Publisher:

Abstract:

Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and the limitations of remote sensing data per se. This paper reviews a decade of experiments on land use/cover classification in the Brazilian Amazon. Comprehensive analysis of the classification results leads to the conclusion that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Among the available classification algorithms, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time for parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
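Since the review singles out the maximum likelihood classifier, a minimal sketch of Gaussian maximum likelihood classification of multispectral pixels may help. The class statistics, band values, and class names below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training pixels for two land-cover classes in two spectral bands.
forest = rng.multivariate_normal([50, 120], [[30, 5], [5, 30]], size=200)
pasture = rng.multivariate_normal([90, 80], [[40, 8], [8, 40]], size=200)

def gaussian_ml_discriminant(x, samples):
    """Log-likelihood (up to a constant) of pixel x under a class-conditional Gaussian."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    d = x - mu
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

# Classify one pixel by the class with the largest discriminant value.
pixel = np.array([55.0, 115.0])  # spectrally close to the forest cluster
scores = {name: gaussian_ml_discriminant(pixel, s)
          for name, s in [("forest", forest), ("pasture", pasture)]}
label = max(scores, key=scores.get)
```

The parametric Gaussian assumption made here is exactly what nonparametric alternatives such as classification trees avoid.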

Relevance: 80.00%

Publisher:

Abstract:

Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze such situations and give recommendations. Among the numerous MCDM methods, two large families are the methods based on multi-attribute utility theory and the outranking methods. Traditionally, both families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in situations where exact values are not available. Different variants of SMAA can handle all types of MCDM problems and support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI to sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods has been developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of the SMAA methodology, including a definition of a unified framework.
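The core SMAA idea, replacing exact weights with simulation over their feasible space, can be sketched as follows. The decision matrix, the additive utility model, and the uniform weight distribution are simplifying assumptions; SMAA variants support much richer preference information:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decision matrix: 3 alternatives x 2 criteria (higher is
# better), already scaled to [0, 1].
values = np.array([[0.9, 0.20],
                   [0.5, 0.50],
                   [0.1, 0.95]])

n_alt = values.shape[0]
rank_acceptability = np.zeros((n_alt, n_alt))

# With weights unknown, sample them uniformly from the weight simplex and
# record how often each alternative attains each rank.
n_iter = 10000
for _ in range(n_iter):
    w = rng.dirichlet([1.0, 1.0])            # uniform over the 2-criterion simplex
    utilities = values @ w                    # additive utility per alternative
    ranks = np.argsort(np.argsort(-utilities))  # 0 = best rank
    for alt, r in enumerate(ranks):
        rank_acceptability[alt, r] += 1

rank_acceptability /= n_iter  # descriptive rank acceptability indices
```

Each row is a distribution over ranks for one alternative; for this matrix the middle "compromise" alternative can never rank first under any weighting, which the simulation makes visible without requiring exact weights.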

Relevance: 80.00%

Publisher:

Abstract:

The widespread misuse of drugs has increased the number of multiresistant bacteria, and this means that tools that can rapidly detect and characterize bacterial response to antibiotics are much needed in the management of infections. Various techniques, such as the resazurin-reduction assays, the mycobacterial growth indicator tube or polymerase chain reaction-based methods, have been used to investigate bacterial metabolism and its response to drugs. However, many are relatively expensive or unable to distinguish between living and dead bacteria. Here we show that the fluctuations of highly sensitive atomic force microscope cantilevers can be used to detect low concentrations of bacteria, characterize their metabolism and quantitatively screen (within minutes) their response to antibiotics. We applied this methodology to Escherichia coli and Staphylococcus aureus, showing that live bacteria produced larger cantilever fluctuations than bacteria exposed to antibiotics. Our preliminary experiments suggest that the fluctuation is associated with bacterial metabolism.
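The read-out described above amounts, in essence, to comparing fluctuation amplitudes of the deflection signal. A toy sketch with synthetic traces; the noise levels are invented, not measured values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cantilever deflection traces (arbitrary units): metabolically
# active bacteria drive larger nanoscale fluctuations than treated ones.
live = rng.normal(0.0, 5.0, size=20000)     # bacteria before antibiotic
treated = rng.normal(0.0, 1.5, size=20000)  # after antibiotic exposure

def fluctuation(signal):
    """Variance of the deflection signal around its mean."""
    centered = signal - signal.mean()
    return float(np.var(centered))

# A large ratio flags metabolically active (live) bacteria.
ratio = fluctuation(live) / fluctuation(treated)
```

In practice the traces come from the microscope's deflection sensor and the comparison is made within minutes of antibiotic exposure.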

Relevance: 80.00%

Publisher:

Abstract:

This Master's thesis covers the design of the supersonic stator and subsonic rotor of a single-stage turbine, together with the inlet section and the diffuser. The thesis begins with a review of the applications and theory of axial turbines, after which the methods and principles underlying the design are presented. The basic design is carried out with Traupel's method using the WinAxtu 1.1 design program, and the efficiency is additionally estimated with an Excel-based calculation. The supersonic stator is designed on the basis of the basic-design results by applying the method of characteristics to the diverging part of the nozzle and area ratios to the converging part. The rotor camber line is drawn with Sahlberg's method, and the blade shape is determined by combining the A3K7 thickness distribution with the design principles of dense blade cascades. The inlet section is designed to be as smooth as possible on the basis of the geometry data and examples from the literature, and is finally modeled with CFD calculations. The diffuser is designed using applicable data from the literature, the inlet geometry, and CFD calculations. Finally, the design results are compared with results presented in the literature, and the success of the design and possible problem areas are assessed.
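Nozzle sizing from area ratios, as used for the converging part above, rests on the standard isentropic area-Mach relation. A minimal sketch; the heat-capacity ratio is an assumed value, not taken from the thesis:

```python
import math

def area_ratio(mach, gamma=1.4):
    """Isentropic area-Mach relation A/A* for a perfect gas:
    A/A* = (1/M) * [ (2/(gamma+1)) * (1 + (gamma-1)/2 * M^2) ]^((gamma+1)/(2(gamma-1)))
    """
    t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach * mach)
    return t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / mach

# A/A* = 1 at the throat (M = 1); accelerating the flow to M = 2 requires
# an exit-to-throat area ratio of 1.6875 for gamma = 1.4.
throat = area_ratio(1.0)
exit_ratio = area_ratio(2.0)
```

The diverging, supersonic part cannot be shaped from this one-dimensional relation alone, which is why the method of characteristics is applied there.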

Relevance: 80.00%

Publisher:

Abstract:

Convective transport, both pure and combined with diffusion and reaction, can be observed in a wide range of physical and industrial applications, such as heat and mass transfer, crystal growth or biomechanics. The numerical approximation of this class of problems can present substantial difficulties due to regions of high gradients (steep fronts) in the solution, where the generation of spurious oscillations or smearing must be precluded. This work is devoted to the development of an efficient numerical technique for pure linear convection and convection-dominated problems in the framework of convection-diffusion-reaction systems. The particle transport method developed in this study is based on meshless numerical particles that carry the solution along the characteristics defining the convective transport. The resolution of steep fronts in the solution is controlled by a special spatial adaptivity procedure. The semi-Lagrangian particle transport method uses a fixed Eulerian grid to represent the solution. In the case of convection-diffusion-reaction problems, the method is combined with diffusion and reaction solvers within an operator splitting approach. To transfer the solution from the particle set onto the grid, a fast monotone projection technique is designed. Our numerical results confirm that the method has second-order spatial accuracy and can be faster than typical grid-based methods of the same order; for pure linear convection problems the method demonstrates optimal linear complexity. The method works on structured and unstructured meshes, demonstrating a high-resolution property in regions of steep fronts in the solution. Moreover, the particle transport method can be successfully used for the numerical simulation of real-life problems in, for example, chemical engineering.
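The semi-Lagrangian idea described above, values carried unchanged along characteristics and then transferred back to a fixed Eulerian grid, can be sketched in one dimension. Plain linear interpolation stands in for the method's monotone projection, and the pulse and step sizes are illustrative:

```python
import numpy as np

# Semi-Lagrangian step for 1D linear advection u_t + a*u_x = 0: the exact
# solution is constant along characteristics x(t) = x0 + a*t, so each grid
# value at the new time level is found by tracing the characteristic back
# and interpolating the old solution at the departure point.
a, dt = 1.0, 0.05
x = np.arange(0.0, 1.0, 0.005)           # periodic Eulerian grid
u = np.exp(-200.0 * (x - 0.3) ** 2)      # steep initial front at x = 0.3

def semi_lagrangian_step(u, x, a, dt):
    departure = x - a * dt                          # foot of the characteristic
    return np.interp(departure, x, u, period=1.0)   # periodic linear interpolation

for _ in range(8):
    u = semi_lagrangian_step(u, x, a, dt)

peak_x = x[np.argmax(u)]  # the front has moved a*8*dt = 0.4, to x = 0.7
```

Note that the time step is not limited by a CFL condition, since the departure point may lie arbitrarily far from the grid node; this is one reason semi-Lagrangian schemes can outperform explicit grid-based methods.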

Relevance: 80.00%

Publisher:

Abstract:

The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways. An algorithm most efficient with one representation may be less efficient with others. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its application. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with the possible variables including the centers, widths, and weights of the basis functions, both with control parameters kept fixed and with them adjusted by a fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to parameter setting, and the best setting was found to be problem dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than the variants using all-fixed parameters. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
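A plain DE/rand/1/bin iteration, the baseline that the thesis makes adaptive, can be sketched as follows. The control parameters F and CR are kept fixed here (the fuzzy controller that adapts them is not reproduced), and the test function and settings are illustrative:

```python
import random

random.seed(0)

def sphere(x):
    """Classic test function: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def differential_evolution(f, dim, bounds, pop_size, F, CR, gens):
    """DE/rand/1/bin: mutation v = a + F*(b - c), binomial crossover,
    greedy one-to-one selection. Bounds are used only for initialization."""
    pop = [[random.uniform(*bounds) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            jrand = random.randrange(dim)  # guarantees one mutated component
            trial = [a[k] + F * (b[k] - c[k])
                     if (random.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            if f(trial) <= f(pop[i]):      # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(sphere, dim=5, bounds=(-5.0, 5.0),
                              pop_size=30, F=0.5, CR=0.9, gens=200)
```

In the adaptive variant, F and CR would be updated between generations by a fuzzy controller observing the optimization progress instead of staying fixed.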

Relevance: 80.00%

Publisher:

Abstract:

Technological progress has made a huge amount of data available at increasing spatial and spectral resolutions. Therefore, the compression of hyperspectral data is an area of active research. In some fields, the original quality of a hyperspectral image cannot be compromised, and in these cases lossless compression is mandatory. The main goal of this thesis is to provide improved methods for the lossless compression of hyperspectral images. Both prediction- and transform-based methods are studied. Two kinds of prediction-based methods are examined. In the first method, the spectra of a hyperspectral image are first clustered and an optimized linear predictor is calculated for each cluster. In the second prediction method, the linear prediction coefficients are not fixed but are recalculated for each pixel. A parallel implementation of the above-mentioned linear prediction method is also presented. Two transform-based methods are presented as well. Vector Quantization (VQ) was used together with a new coding of the residual image. In addition, we have developed a new back end for a compression method utilizing Principal Component Analysis (PCA) and the Integer Wavelet Transform (IWT). The performance of the compression methods is compared to that of other compression methods. The results show that the proposed linear prediction methods outperform the previous methods. In addition, a novel fast exact nearest-neighbor search method is developed and used to speed up the Linde-Buzo-Gray (LBG) clustering method.
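The prediction-based idea (encode spectral bands as residuals of a linear predictor) can be sketched with a single predictor for one band pair. The synthetic two-band cube and the 1-tap predictor are simplifications of the clustered multi-band scheme described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-band cube: band 2 is strongly correlated with band 1,
# as neighboring spectral bands typically are.
band1 = rng.integers(0, 256, size=(32, 32)).astype(float)
band2 = np.rint(0.9 * band1 + rng.normal(0.0, 2.0, band1.shape))

# Spectral linear prediction: fit a 1-tap least-squares predictor for
# band 2 from band 1 and store only the integer residual.
x, y = band1.ravel(), band2.ravel()
coeff = (x @ y) / (x @ x)          # least-squares predictor coefficient
predicted = np.rint(coeff * x)     # integer prediction
residual = y - predicted           # small integers -> cheap to entropy-code

# Decoding is lossless: the decoder recomputes the same rounded prediction
# and adds the stored residual back.
reconstructed = predicted + residual
```

The residual has far less spread than the raw band, so an entropy coder spends fewer bits on it, while the reconstruction is bit-exact.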

Relevance: 80.00%

Publisher:

Abstract:

When a bloodstream infection (BSI) is suspected, most of the laboratory results (biochemical and haematologic) are available within the first hours after hospital admission of the patient. This is not the case for diagnostic microbiology, which generally takes longer because blood culture, to date the reference standard for documenting the microbial agents of BSI, relies on bacterial or fungal growth. Microbial diagnosis of BSI directly from blood has been proposed to speed up determination of the etiological agent, but has been limited by the very low number of circulating microbes during these paucibacterial infections. Thanks to recent advances in molecular biology, including improvements in nucleic acid extraction and amplification, several PCR-based methods for the diagnosis of BSI directly from whole blood have emerged. In the present review, we discuss the advantages and limitations of these new molecular approaches, which at best complement the culture-based diagnosis of BSI.

Relevance: 80.00%

Publisher:

Abstract:

During evolution, the immune system has diversified to protect the host from the extremely wide array of possible pathogens. Until recently, immune responses were dissected by use of global approaches and bulk tools, averaging responses across samples and potentially missing particular contributions of individual cells. This is a strongly limiting factor, considering that initial immune responses are likely to be triggered by a restricted number of cells at the vanguard of host defenses. The development of novel, single-cell technologies is a major innovation offering great promise for basic and translational immunology with the potential to overcome some of the limitations of traditional research tools, such as polychromatic flow cytometry or microscopy-based methods. At the transcriptional level, much progress has been made in the fields of microfluidics and single-cell RNA sequencing. At the protein level, mass cytometry already allows the analysis of twice as many parameters as flow cytometry. In this review, we explore the basis and outcome of immune-cell diversity, how genetically identical cells become functionally different, and the consequences for the exploration of host-immune defense responses. We will highlight the advantages, trade-offs, and potential pitfalls of emerging, single-cell-based technologies and how they provide unprecedented detail of immune responses.

Relevance: 80.00%

Publisher:

Abstract:

This Master's thesis presents a method for measuring population diversity in floating-point-coded evolutionary algorithms and examines its behavior experimentally. Evolutionary algorithms are population-based methods for solving optimization problems. In evolutionary algorithms, controlling population diversity is necessary for the search to be sufficiently reliable and, at the same time, sufficiently fast. Measuring diversity is particularly useful when studying the dynamic behavior of evolutionary algorithms. The thesis considers measuring diversity in both the search space and the objective-function space. No fully satisfactory diversity measures have existed so far, and the goal of the work is to develop a general-purpose method for measuring the relative and absolute diversity of floating-point-coded evolutionary algorithms in the search space. The behavior and usefulness of the developed measures are examined experimentally by solving optimization problems with a differential evolution algorithm. The implemented measures are based on computing standard deviations over the population. The standard deviations are scaled with respect to either the initial population or the current population, depending on whether absolute or relative diversity is being computed. In the experiments, the developed measures were found to work well and to be useful. Stretching the objective function along the coordinate axes does not affect the measure, nor does rotating the objective function in the coordinate system. The time complexity of the presented method is linear in the population size, so the measure remains fast even for large populations. Relative diversity yields comparable results regardless of the number of parameters or the population size.
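One plausible reading of the measure described above (per-parameter standard deviations of the population, scaled against a reference population) can be sketched as follows. The choice of the initial population as the reference and the averaging over parameters are assumptions for illustration, not the thesis's exact definitions:

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_diversity(population, initial_population):
    """Search-space diversity: per-parameter standard deviations of the
    current population, scaled by those of the initial population and
    averaged. Close to 1.0 at the start; tends to 0 as the population
    converges. The per-axis scaling makes the value invariant to
    stretching the search space along coordinate axes."""
    s_now = population.std(axis=0)
    s_init = initial_population.std(axis=0)
    return float(np.mean(s_now / s_init))

# A hypothetical population of 50 individuals with 10 parameters each.
initial = rng.uniform(-5.0, 5.0, size=(50, 10))
converged = initial * 0.1  # population shrunk to 10% of its initial spread

d_init = relative_diversity(initial, initial)
d_conv = relative_diversity(converged, initial)
```

The computation is a single pass over the population, consistent with the linear time complexity claimed above.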

Relevance: 80.00%

Publisher:

Abstract:

The main objective of the study was to define the financial competence needed in key positions at a paper mill to promote profit- and cost-conscious behavior. Based on a literature analysis, a model of how financial competence is constructed was formed. The model was tested by interviewing people working in key positions at a paper mill. Based on the results, a final view was formed of the financial competence needed at a paper mill and of suitable methods for developing it. The study showed that financial competence is built from a combination of an employee's personal characteristics and his or her financial knowledge and skills. The level of knowledge and skills appears to divide into several layers according to how widely the knowledge and skills can be applied within the organization. The most important personal characteristics turned out to be interaction skills, a sense of responsibility, and problem-solving ability. In strengthening financial competence, the key roles were played by interactive means, especially those related to communication, and by creating a work environment that supports the use of competence.