897 results for Simulation-based methods
Abstract:
The newsworthiness of an event is partly determined by how unusual it is, and this paper investigates the business cycle implications of this fact. In particular, we analyze the consequences of information structures in which some types of signals are more likely to be observed after unusual events. Such signals may increase both uncertainty and disagreement among agents and, when embedded in a simple business cycle model, can help us understand why we observe (i) occasional large changes in macroeconomic aggregate variables without a correspondingly large change in underlying fundamentals, (ii) persistent periods of high macroeconomic volatility, and (iii) a positive correlation between absolute changes in macro variables and the cross-sectional dispersion of expectations as measured by survey data. These results are consequences of optimal updating by agents when the availability of some signals is positively correlated with tail events. The model is estimated by likelihood-based methods using individual survey responses and a quarterly time series of total factor productivity, along with standard aggregate time series. The estimated model suggests that there have been episodes in recent US history when the impact on output of productivity innovations of a given magnitude was more than eight times as large as at other times.
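As a minimal numerical illustration of the updating mechanism described above (not the paper's estimated model), the sketch below draws a fundamental shock and lets a signal arrive with a probability that rises in the shock's absolute size; the logistic arrival rule and the normal shock distribution are assumptions made only for this example. Conditioning on the mere arrival of such a signal widens the distribution of the shock, so perceived uncertainty rises exactly when tail events are more likely.

```python
import numpy as np

# Toy illustration (not the paper's model): a fundamental shock is drawn from
# a standard normal, and a signal becomes available with a probability that
# rises in the absolute size of the shock, i.e. after unusual events.
# Conditioning on the availability of such a signal widens the distribution
# of the shock, so measured uncertainty is higher when tails are more likely.
rng = np.random.default_rng(0)

shocks = rng.standard_normal(1_000_000)                              # underlying fundamentals
arrival_prob = 1.0 / (1.0 + np.exp(-3.0 * (np.abs(shocks) - 2.0)))   # assumed logistic arrival rule
signal_arrived = rng.random(shocks.size) < arrival_prob

print("std of shock, unconditional:        ", round(float(shocks.std()), 3))
print("std of shock, given signal arrived: ", round(float(shocks[signal_arrived].std()), 3))
```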
Abstract:
Copy number variation (CNV) has recently gained considerable interest as a source of genetic variation likely to play a role in phenotypic diversity and evolution. Much effort has been put into the identification and mapping of regions that vary in copy number among seemingly normal individuals in humans and in a number of model organisms, using bioinformatics or hybridization-based methods. These efforts have allowed the uncovering of associations between copy number changes and complex diseases in whole-genome association studies, as well as the identification of new genomic disorders. At the genome-wide scale, however, the functional impact of CNV remains poorly studied. Here we review the current catalogs of CNVs, their association with diseases, and how they link genotype and phenotype. We describe initial evidence revealing that genes in CNV regions are expressed at lower and more variable levels than genes mapping elsewhere, and also that CNV not only affects the expression of genes varying in copy number but also has a global influence on the transcriptome. Further studies are warranted for the complete cataloguing and fine mapping of CNVs, as well as to elucidate the different mechanisms by which they influence gene expression.
Abstract:
The objectives of this work were to analyze the theoretical genetic gains of maize from recurrent selection among full-sib and half-sib families obtained by Design I, the Full-Sib Design, and the Half-Sib Design, as well as the genotypic variability and gene loss under long-term selection. The designs were evaluated by simulation, based on the average estimated gains after ten selection cycles. The simulation process was based on seven gene systems with ten genes (with distinct degrees of dominance), three population classes (with different gene frequencies), three environmental conditions (heritability values), and four selection strategies. Each combination was repeated ten times, amounting to 25,200 simulations. Full-sib selection is generally more efficient than half-sib selection, mainly with favorable dominant genes. The use of full-sib families derived by Design I is generally more efficient than using progenies obtained by the Full-Sib Design. Using Design I with 50 males and 200 females (effective size of 160) did not result in improved populations with minimum genotypic variability. In the populations with lower effective size (160 and 400), the loss of favorable genes was restricted to recessive genes with reduced frequencies.
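The sketch below is a heavily simplified, hypothetical version of this kind of Monte Carlo: ten biallelic loci with a common degree of dominance, truncation selection over ten cycles, and tracking of the mean genotypic value and of lost favorable alleles. It does not reproduce the Design I, Full-Sib, or Half-Sib mating schemes, and all parameter values are invented for illustration.

```python
import numpy as np

# Toy recurrent-selection simulation (not the authors' mating designs):
# 10 biallelic loci with dominance, phenotype = genotypic value + noise set by
# heritability, truncation selection, random mating of the selected fraction.
rng = np.random.default_rng(1)
n_loci, n_ind, n_cycles, sel_frac, h2, dom = 10, 400, 10, 0.2, 0.4, 0.5

p = np.full(n_loci, 0.3)                          # initial favorable-allele frequencies

def genotypes(p, n):
    # count of favorable alleles per locus (0, 1 or 2), Hardy-Weinberg sampling
    return (rng.random((n, p.size)) < p).astype(int) + (rng.random((n, p.size)) < p).astype(int)

def genotypic_value(g):
    # additive effect of 1 per favorable allele plus a dominance deviation for heterozygotes
    return (g + dom * (g == 1)).sum(axis=1)

for cycle in range(n_cycles):
    g = genotypes(p, n_ind)
    gv = genotypic_value(g)
    env_sd = np.sqrt(gv.var() * (1 - h2) / h2) if gv.var() > 0 else 1.0
    phen = gv + rng.normal(0, env_sd, n_ind)      # heritability h2 sets the noise level
    keep = np.argsort(phen)[-int(sel_frac * n_ind):]
    p = g[keep].mean(axis=0) / 2                  # allele frequencies among selected parents
    print(f"cycle {cycle + 1:2d}  mean genotypic value {gv.mean():5.2f}  lost favorable loci {(p == 0).sum()}")
```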
Abstract:
Under IAS 40, companies are required to report the fair values of investment properties on the balance sheet or to disclose them in the notes. The standard also requires companies to disclose the methods and significant assumptions applied in determining the fair values of investment properties. However, IAS 40 does not include any illustrative examples or other guidance on how to apply the disclosure requirements. We use a sample of publicly traded companies from the real estate sector in the EU. We find that a majority of the companies use income-based methods to measure fair values, but there are considerable cross-country variations in the level of disclosure about the assumptions used in determining fair values. More specifically, we find that companies of Scandinavian and German origin disclose more than companies of French and English origin. We also test whether disclosure quality is associated with enforcement quality, measured with the "Rule of Law" index of Kaufmann et al. (2010), and with a secrecy-versus-transparency measure based on Gray (1988). We find a positive association between disclosure and earnings quality and a negative association with secrecy.
Abstract:
The objective of this study was to determine the minimum number of plants per plot that must be sampled in experiments with sugarcane (Saccharum officinarum) full-sib families in order to provide effective estimates of the genetic and phenotypic parameters of yield-related traits. The data were collected in a randomized complete block design with 18 sugarcane full-sib families and 6 replicates, with 20 plants per plot. The sample size was determined using resampling with replacement, followed by estimation of the genetic and phenotypic parameters. Sample-size estimates varied according to the parameter and trait evaluated. The resampling method permits an efficient comparison of the effects of sample size on the estimation of genetic and phenotypic parameters. A sample of 16 plants per plot, or 96 individuals per family, was sufficient to obtain good estimates for all the traits evaluated. However, if sample sizes could be set separately for each trait, ten plants per plot would give an efficient estimate for Brix.
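A hedged sketch of the resampling idea follows: plant-level data are simulated (they are not the study's data), k plants per plot are drawn with replacement, and a crude genotypic-variance proxy (the variance among family means) is re-estimated to show how the estimate and its spread stabilize as k grows.

```python
import numpy as np

# Sketch of resampling with replacement (simulated data, simplified estimator):
# draw k plants per plot with replacement and re-estimate the variance among
# family means to see how the estimate behaves as the sample size increases.
rng = np.random.default_rng(2)
n_fam, n_rep, n_plants = 18, 6, 20

fam_eff = rng.normal(0, 2.0, n_fam)                          # simulated family effects
plot_eff = rng.normal(0, 1.0, (n_fam, n_rep))                # simulated plot (block) effects
data = (fam_eff[:, None, None] + plot_eff[:, :, None]
        + rng.normal(0, 3.0, (n_fam, n_rep, n_plants)))      # plant-level trait values

def variance_among_family_means(sample):
    return sample.mean(axis=(1, 2)).var(ddof=1)              # crude genotypic-variance proxy

for k in (4, 8, 12, 16, 20):
    estimates = []
    for _ in range(200):                                     # 200 bootstrap resamples
        idx = rng.integers(0, n_plants, (n_fam, n_rep, k))   # k plants per plot, with replacement
        estimates.append(variance_among_family_means(np.take_along_axis(data, idx, axis=2)))
    print(f"{k:2d} plants/plot: variance among family means = "
          f"{np.mean(estimates):.2f} +/- {np.std(estimates):.2f}")
```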
Abstract:
Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which calls into question the reliability of count-based indices for estimating and comparing population abundances. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and the resources available.
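The toy example below (invented numbers, not the study's data) illustrates the core point: a transect-style count confounds detection probability with true abundance, whereas a simple two-occasion capture-mark-recapture estimate (Chapman's bias-corrected Lincoln-Petersen form) recovers the true population size even when detectability changes.

```python
import numpy as np

# Illustration with invented numbers: a count index scales with p * N and so
# tracks detectability p as much as abundance N, while a two-occasion
# capture-mark-recapture (Lincoln-Petersen / Chapman) estimate recovers N.
rng = np.random.default_rng(3)
N = 500                                   # true population size, held fixed

for p in (0.2, 0.4, 0.6):                 # detectability varies among sites or years
    count_index = rng.binomial(N, p)      # Pollard-walk style count

    marked = rng.random(N) < p            # occasion 1: individuals caught and marked
    caught2 = rng.random(N) < p           # occasion 2: individuals caught again
    n1, n2, m2 = marked.sum(), caught2.sum(), (marked & caught2).sum()
    chapman = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1   # bias-corrected Lincoln-Petersen estimator

    print(f"p={p:.1f}  count index={count_index:4d}  CMR estimate of N={chapman:6.1f}")
```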
Abstract:
As culture-based methods for the diagnosis of invasive fungal diseases (IFD) in leukemia and hematopoietic SCT patients have limited performance, non-culture methods are increasingly being used. The third European Conference on Infections in Leukemia (ECIL-3) meeting aimed to establish evidence-based recommendations for the use of biological tests in adult patients, based on the grading system of the Infectious Diseases Society of America. The following biomarkers were investigated as screening tests: galactomannan (GM) for invasive aspergillosis (IA); β-glucan (BG) for invasive candidiasis (IC) and IA; Cryptococcus Ag for cryptococcosis; mannan (Mn) Ag/anti-mannan (A-Mn) Ab for IC; and PCR for IA. Testing for GM, Cryptococcus Ag, and BG is included in the revised EORTC/MSG (European Organization for Research and Treatment of Cancer/Mycoses Study Group) consensus definitions for IFD. Strong evidence supports the use of GM in serum (A II) and of Cryptococcus Ag in serum and cerebrospinal fluid (CSF) (A II). Evidence is moderate for BG detection in serum (B II) and for combined Mn/A-Mn testing in serum for hepatosplenic candidiasis (B III) and candidemia (C II). No recommendations were formulated for the use of PCR owing to a lack of standardization and clinical validation. The clinical utility of these markers for the early management of IFD should be further assessed in prospective randomized interventional studies.
Abstract:
Background: Combining different sources of information to improve the available biological knowledge is a current challenge in bioinformatics. Kernel-based methods are among the most powerful approaches for integrating heterogeneous data types. Kernel-based data integration consists of two basic steps: first, a suitable kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of dimensionality reduction. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to any of the datasets. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of the biological knowledge.
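A minimal sketch of the two integration steps with scikit-learn is shown below; the simulated data, the RBF kernels, and the equal kernel weights are assumptions for illustration only, not the configuration used in the paper.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import rbf_kernel

# Minimal sketch of kernel-based integration (simulated data, equal weights):
# 1) choose a kernel per data source, 2) combine the kernel matrices,
# 3) run kernel PCA on the combined (precomputed) kernel.
rng = np.random.default_rng(4)
n_samples = 60
expression = rng.normal(size=(n_samples, 200))     # e.g. a gene-expression data source
methylation = rng.normal(size=(n_samples, 50))     # e.g. a second omics data source

K1 = rbf_kernel(expression, gamma=1.0 / expression.shape[1])
K2 = rbf_kernel(methylation, gamma=1.0 / methylation.shape[1])
K = 0.5 * K1 + 0.5 * K2                            # unweighted average of the two kernels

kpca = KernelPCA(n_components=2, kernel="precomputed")
scores = kpca.fit_transform(K)                     # samples projected on the first 2 kernel PCs
print(scores.shape)                                # (60, 2)
```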
Abstract:
Land use/cover classification is one of the most important applications in remote sensing. However, accurately mapping land use/cover spatial distribution is a challenge, particularly in moist tropical regions, owing to the complex biophysical environment and the limitations of remote sensing data per se. This paper reviews a decade of experiments on land use/cover classification in the Brazilian Amazon. A comprehensive analysis of the classification results leads to the conclusion that the spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into the multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when a proper data fusion method was used. Among the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results; however, they often require more time for parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classifications, mainly from historical remotely sensed data.
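As an illustration of the classic parametric approach mentioned above, the sketch below implements a per-class Gaussian maximum likelihood classifier on simulated multispectral pixels; the class names, band count, and data are invented and do not come from the reviewed experiments.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Sketch of a Gaussian maximum likelihood classifier on simulated multispectral
# pixels: each class is modeled by the mean vector and covariance matrix of its
# training pixels, and each pixel is assigned to the most likely class.
rng = np.random.default_rng(5)
n_bands, classes = 6, ("forest", "pasture", "water")

train = {c: rng.normal(loc=10 * i, scale=2.0, size=(100, n_bands))
         for i, c in enumerate(classes)}                        # simulated training pixels
models = {c: (x.mean(axis=0), np.cov(x, rowvar=False)) for c, x in train.items()}

def classify(pixels):
    loglik = np.column_stack([
        multivariate_normal.logpdf(pixels, mean=mu, cov=cov)
        for mu, cov in models.values()])
    names = list(models)
    return [names[i] for i in loglik.argmax(axis=1)]            # most likely class per pixel

test = rng.normal(loc=10, scale=2.0, size=(5, n_bands))         # pixels near the "pasture" mean
print(classify(test))
```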
Abstract:
The widespread misuse of drugs has increased the number of multiresistant bacteria, and this means that tools that can rapidly detect and characterize bacterial response to antibiotics are much needed in the management of infections. Various techniques, such as the resazurin-reduction assays, the mycobacterial growth indicator tube or polymerase chain reaction-based methods, have been used to investigate bacterial metabolism and its response to drugs. However, many are relatively expensive or unable to distinguish between living and dead bacteria. Here we show that the fluctuations of highly sensitive atomic force microscope cantilevers can be used to detect low concentrations of bacteria, characterize their metabolism and quantitatively screen (within minutes) their response to antibiotics. We applied this methodology to Escherichia coli and Staphylococcus aureus, showing that live bacteria produced larger cantilever fluctuations than bacteria exposed to antibiotics. Our preliminary experiments suggest that the fluctuation is associated with bacterial metabolism.
Abstract:
This Master's thesis presents the design of the supersonic stator and subsonic rotor of a single-stage turbine, together with its inlet section and diffuser. The thesis begins with a review of the applications and theory of axial turbines, after which the methods and principles on which the design is based are presented. The basic design is carried out with the Traupel method using the WinAxtu 1.1 design program, and the efficiency is additionally evaluated with an Excel-based calculation. The supersonic stator is designed on the basis of the results of the basic design, applying the method of characteristics to the diverging part of the nozzle and area ratios to the converging part. The rotor centerline is drawn with the Sahlberg method, and the blade shape is determined by combining the A3K7 thickness distribution with design principles for dense blade cascades. The inlet section is designed to be as smooth as possible, following the geometry data and examples from the literature, and is finally modeled with CFD calculations. The diffuser is designed using, where applicable, data presented in the literature, the inlet geometry, and CFD calculations. Finally, the design results are compared with results presented in the literature, and the success of the design and possible problem areas are assessed.
Abstract:
The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways; an algorithm that is most efficient for one representation may be less efficient for others. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its application. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with the centers, widths, and weights of the basis functions as possible variables, both with the control parameters kept fixed and with them adjusted by the fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to the parameter setting and that the best setting is problem dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than the versions using fixed parameters. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
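The sketch below shows a compact DE/rand/1/bin loop with fixed control parameters F and CR on the Rastrigin test function; it illustrates only the basic algorithm whose parameters the thesis tunes, and does not reproduce the fuzzy adaptation or the RBF network training.

```python
import numpy as np

# Compact DE/rand/1/bin with fixed control parameters F and CR on the Rastrigin
# test function. Only the base algorithm is shown; the fuzzy adaptation studied
# in the thesis is not reproduced here.
rng = np.random.default_rng(6)
dim, pop_size, F, CR, generations = 10, 50, 0.7, 0.9, 300

def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

pop = rng.uniform(-5.12, 5.12, (pop_size, dim))
fit = np.array([rastrigin(ind) for ind in pop])

for _ in range(generations):
    for i in range(pop_size):
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)                    # differential mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True             # guarantee at least one gene from the mutant
        trial = np.where(cross, mutant, pop[i])     # binomial crossover
        f_trial = rastrigin(trial)
        if f_trial <= fit[i]:                       # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best value found:", fit.min())
```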
Abstract:
Technological progress has made a huge amount of data available at ever increasing spatial and spectral resolutions, and the compression of hyperspectral data is therefore an area of active research. In some fields the original quality of a hyperspectral image cannot be compromised, and in these cases lossless compression is mandatory. The main goal of this thesis is to provide improved methods for the lossless compression of hyperspectral images. Both prediction- and transform-based methods are studied. Two kinds of prediction-based methods are studied. In the first method, the spectra of a hyperspectral image are first clustered and an optimized linear predictor is calculated for each cluster. In the second prediction method, the linear prediction coefficients are not fixed but are recalculated for each pixel. A parallel implementation of the above-mentioned linear prediction method is also presented. Two transform-based methods are presented as well. Vector quantization (VQ) is used together with a new coding of the residual image. In addition, we have developed a new back end for a compression method utilizing principal component analysis (PCA) and the integer wavelet transform (IWT). The performance of the compression methods is compared with that of other compression methods. The results show that the proposed linear prediction methods outperform the previous methods. In addition, a novel fast exact nearest-neighbor search method is developed and used to speed up the Linde-Buzo-Gray (LBG) clustering method.
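A sketch of the first prediction approach, on invented data and with simplified choices, is given below: spectra are clustered, a least-squares predictor of each band from the two preceding bands is fitted per cluster, and only the integer residuals (plus the small set of predictor coefficients per cluster and band) would need to be entropy-coded; the entropy coder itself is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of clustered linear prediction for lossless coding (simulated cube):
# cluster the spectra, fit a per-cluster least-squares predictor of band b from
# the two previous bands, and keep the integer residuals. A decoder holding the
# coefficients and the previous decoded bands can reproduce the prediction and
# add the residual back, so the scheme is lossless.
rng = np.random.default_rng(7)
n_pixels, n_bands, n_clusters = 5000, 30, 4

base = rng.integers(100, 4000, (n_pixels, 1))
cube = (base + np.cumsum(rng.integers(-20, 21, (n_pixels, n_bands)), axis=1)).astype(np.int32)

labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(cube)

residuals = np.zeros_like(cube)
residuals[:, :2] = cube[:, :2]                       # first two bands stored as-is
for b in range(2, n_bands):
    for k in range(n_clusters):
        m = labels == k
        X = np.column_stack([cube[m, b - 2], cube[m, b - 1], np.ones(m.sum())])
        coef, *_ = np.linalg.lstsq(X, cube[m, b], rcond=None)   # per-cluster predictor
        pred = np.rint(X @ coef).astype(np.int32)
        residuals[m, b] = cube[m, b] - pred

print("std of raw bands:     ", cube[:, 2:].std())
print("std of residual bands:", residuals[:, 2:].std())   # smaller spread, cheaper to encode
```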
Abstract:
When a bloodstream infection (BSI) is suspected, most of the laboratory results (biochemical and haematologic) are available within the first hours after hospital admission of the patient. This is not the case for diagnostic microbiology, which generally takes longer because blood culture, to date the reference standard for documenting the microbial agents of BSI, relies on bacterial or fungal growth. Microbial diagnosis of BSI directly from blood has been proposed to speed up the determination of the etiological agent, but has been limited by the very low number of circulating microbes during these paucibacterial infections. Thanks to recent advances in molecular biology, including improvements in nucleic acid extraction and amplification, several PCR-based methods for the diagnosis of BSI directly from whole blood have emerged. In the present review, we discuss the advantages and limitations of these new molecular approaches, which at best complement the culture-based diagnosis of BSI.
Abstract:
The production of stone groundwood pulp is energy intensive, and more than 90 percent of the energy used is converted into heat. About half of the power converted into heat at the grinding mill can be transferred to the paper machine. The water circuits of mechanical pulp production and of the paper machine are kept separate to prevent the transport of detrimental substances, but separating the water circuits also cuts off the transfer of heat from the grinding mill to the paper machine with the pulp flows. By using heat exchangers to cool the grinding mill waters, the shower water temperature of the grinders can be lowered; at the paper machine, this heat transfer raises the headbox temperature through the dilution of the dosed pulps. The task of this work was defined as eliminating the need, arising during the summer months, for raw-water cooling of the grinder shower water, primarily so that the excess heat is utilized at the mill. A further objective was the control of the temperature of the dosed pulps, in particular changes that would allow the dosing temperature of the broke pulp to be raised. The experimental part of the work was carried out at the UPM Kymmene Oyj Kajaani mills during the autumn of 2004. Using models built with the WinGEMS simulation program, the work studied heat transfer between the grinding mill and paper machine 2, as well as heat transfer out of the balance area. A detailed simulation model of the current situation was built to match the existing production process, and different alternatives were derived from it to solve the tasks set for the study. With changes to the process connections, more than 85% of the excess heat of the grinder shower water could be transferred from the grinding mill without new equipment purchases. Finally, by installing a heat exchanger in the clear filtrate line of the grinding mill, the need for cooling the grinder shower water was eliminated entirely. At the same time, the temperature of the pulp going to bleaching was lowered, reducing the chemical consumption of peroxide bleaching by more than 10%. A pinch analysis of the heat exchanger network for the summer situation was carried out to determine the heating and cooling needs of the process. The analysis showed that the connections do not violate the pinch rules and that the process exhibits a threshold problem in which only cooling is needed.