501 results for "Bootstrap"
Abstract:
Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We argue that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, casting a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems non-identical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed-form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore make the recommendation that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
Abstract:
Botryosphaeria rhodina (anamorph Lasiodiplodia theobromae) is a common endophyte and opportunistic pathogen on more than 500 tree species in the tropics and subtropics. During routine disease surveys of plantations in Australia and Venezuela, several isolates differing from L. theobromae were identified and subsequently characterized based upon morphology and ITS and EF1-α nucleotide sequences. These isolates grouped into three strongly supported clades related to but different from the known taxa B. rhodina and L. gonubiensis. They are described here as three new species: L. venezuelensis sp. nov., L. crassispora sp. nov. and L. rubropurpurea sp. nov. The three could be distinguished easily from each other and from the two described species of Lasiodiplodia, thus confirming the phylogenetic separations. Furthermore, all five Lasiodiplodia spp. now recognized separated from Diplodia spp. and Dothiorella spp. with 100% bootstrap support.
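The bootstrap support cited above comes from resampling alignment columns with replacement and asking how often a grouping reappears across replicates. A minimal sketch of that idea on an invented toy alignment (the taxa, sequences and the closest-pair criterion are illustrative stand-ins, not the study's data or phylogenetic method):

```python
import random

# Toy alignment; real analyses use far longer alignments and a full
# tree-building method instead of the closest-pair shortcut below.
alignment = {
    "taxonA": "AAGTCCGGTA",
    "taxonB": "AAGTCCGGTT",
    "taxonC": "GCTTACGGTA",
}

def p_distance(s1, s2):
    """Proportion of differing sites between two equal-length sequences."""
    return sum(a != b for a, b in zip(s1, s2)) / len(s1)

def closest_pair(seqs):
    """Return the pair of taxa with the smallest p-distance."""
    taxa = sorted(seqs)
    pairs = [(taxa[i], taxa[j])
             for i in range(len(taxa)) for j in range(i + 1, len(taxa))]
    return min(pairs, key=lambda p: p_distance(seqs[p[0]], seqs[p[1]]))

def bootstrap_support(seqs, pair, n_reps=200, seed=1):
    """Fraction of column-resampled replicates in which `pair` recurs
    as the closest pair -- the 'bootstrap support' of that grouping."""
    rng = random.Random(seed)
    length = len(next(iter(seqs.values())))
    hits = 0
    for _ in range(n_reps):
        cols = [rng.randrange(length) for _ in range(length)]
        resampled = {t: "".join(s[c] for c in cols) for t, s in seqs.items()}
        if closest_pair(resampled) == pair:
            hits += 1
    return hits / n_reps

support = bootstrap_support(alignment, ("taxonA", "taxonB"))
```

High support (near 1.0) indicates the grouping is robust to which alignment columns happened to be observed.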
Abstract:
Avian haemophili demonstrating in vitro satellitic growth, also referred to as the V-factor or NAD requirement, have mainly been classified with Avibacterium paragallinarum (Haemophilus paragallinarum), Avibacterium avium (Pasteurella avium), Avibacterium volantium (Pasteurella volantium) and Avibacterium sp. A (Pasteurella species A). The aim of the present study was to assess the taxonomic position of 18 V-factor-requiring isolates of unclassified Haemophilus-like organisms isolated from galliforme, anseriforme, columbiforme and gruiforme birds as well as kestrels and psittacine birds, including budgerigars, by conventional phenotypic tests and 16S rRNA gene sequencing. All isolates shared phenotypic characteristics which allowed classification with Pasteurellaceae. Haemolysis of bovine red blood cells was negative. Haemin (X-factor) was not required for growth. Maximum-likelihood phylogenetic analysis including bootstrap analysis showed that six isolates were related to the avian 16S rRNA group and were classified as Avibacterium according to 16S rRNA sequence analysis. Surprisingly, the other 12 isolates were unrelated to Avibacterium. Two isolates were unrelated to any of the known 16S rRNA groups of Pasteurellaceae. Two isolates were related to Volucribacter of the avian 16S rRNA group. Seven isolates belonged to the Testudinis 16S rRNA group; of these, two isolates were closely related to taxa 14 and 32 of Bisgaard, four other isolates were found to form a genus-like group distantly related to taxon 40, and one isolate remained distantly related to other members of the Testudinis group. One isolate was closely related to taxon 26 (a member of Actinobacillus sensu stricto). The study documented major genetic diversity among V-factor-requiring avian isolates beyond the traditional interpretation that they belong only to Avibacterium, underlining the limited value of satellitic growth for identification of avian members of Pasteurellaceae.
Our study also emphasized that these organisms will never be isolated without the use of special media satisfying the V-factor requirement.
Abstract:
Objective: To examine if streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.
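A bootstrap p value for a difference in means, as reported above, can be obtained by pooling the two samples under the null of no difference and resampling. A minimal sketch with invented numbers (chosen only so the group means echo the 34- and 38-day averages; the real survey data are not available):

```python
import random
import statistics

# Hypothetical working days per application, before and after streamlining.
before = [30, 32, 34, 33, 35, 36, 34, 33, 35, 38]
after = [36, 39, 38, 37, 40, 38, 39, 37, 41, 35]

def bootstrap_p_value(x, y, n_reps=5000, seed=42):
    """Two-sided bootstrap p value for a difference in means: resample both
    groups from the pooled data (the null of no difference) and count how
    often the resampled difference is at least as extreme as the observed."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(y) - statistics.mean(x))
    pooled = x + y
    extreme = 0
    for _ in range(n_reps):
        bx = [rng.choice(pooled) for _ in x]
        by = [rng.choice(pooled) for _ in y]
        if abs(statistics.mean(by) - statistics.mean(bx)) >= observed:
            extreme += 1
    return extreme / n_reps

p = bootstrap_p_value(before, after)
```

With a true 4-day shift and this little within-group spread, the resampled null distribution rarely produces a difference as large as the observed one, so the p value comes out small.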
Abstract:
The DNA polymorphism among 22 isolates of Sclerospora graminicola, the causal agent of downy mildew disease of pearl millet, was assessed using 20 inter-simple sequence repeat (ISSR) primers. The objective of the study was to examine the effectiveness of using ISSR markers for unravelling the extent and pattern of genetic diversity in 22 S. graminicola isolates collected from different host cultivars in different states of India. The 19 functional ISSR primers generated 410 polymorphic bands, revealed 89% polymorphism and were able to distinguish all the 22 isolates. Polymorphic bands were used to construct an unweighted pair-group method with arithmetic mean (UPGMA) dendrogram based on Jaccard's coefficient of similarity, and principal coordinate analysis resulted in the formation of four major clusters of the 22 isolates. The standardized Nei genetic distance among the 22 isolates ranged from 0.0050 to 0.0206. The UPGMA clustering using the standardized genetic distance matrix resulted in the identification of four clusters of the 22 isolates, with bootstrap values ranging from 15 to 100. The 3D-scale data supported the UPGMA results, which resulted in four clusters amounting to 70% variation among each other. However, comparing the two methods shows that the sub-clustering given by the dendrogram and the multidimensional scaling plot is slightly different. All the S. graminicola isolates had distinct ISSR genotypes, and cluster analysis broadly reflected their origin. The results of ISSR fingerprints revealed a significant level of genetic diversity among the isolates and suggest that ISSR markers could be a powerful tool for fingerprinting and diversity analysis in fungal pathogens.
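UPGMA clustering on Jaccard distances from presence/absence band profiles can be sketched as follows. The profiles below are invented, and in practice one would use a library implementation (e.g. SciPy's average-linkage clustering) rather than this hand-rolled version:

```python
# Hypothetical 0/1 ISSR band profiles (presence/absence of each scored band).
profiles = {
    "iso1": [1, 1, 0, 1, 0, 1],
    "iso2": [1, 1, 0, 1, 1, 1],
    "iso3": [0, 1, 1, 0, 1, 0],
    "iso4": [0, 1, 1, 0, 0, 0],
}

def jaccard_distance(a, b):
    """1 - (shared bands) / (bands present in either profile)."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return 1 - inter / union

def upgma(dist, items):
    """Repeatedly merge the closest pair of clusters, averaging distances
    weighted by cluster size; returns the dendrogram as nested tuples."""
    clusters = {frozenset([i]): i for i in items}
    while len(clusters) > 1:
        keys = list(clusters)
        pairs = [(x, y) for i, x in enumerate(keys) for y in keys[i + 1:]]
        a, b = min(pairs, key=lambda p: dist[p])
        subtree = (clusters.pop(a), clusters.pop(b))
        merged = a | b
        for c in clusters:
            d = (len(a) * dist[(a, c)] + len(b) * dist[(b, c)]) / (len(a) + len(b))
            dist[(merged, c)] = dist[(c, merged)] = d
        clusters[merged] = subtree
    return next(iter(clusters.values()))

items = list(profiles)
dist = {}
for i, a in enumerate(items):
    for b in items[i + 1:]:
        d = jaccard_distance(profiles[a], profiles[b])
        dist[(frozenset([a]), frozenset([b]))] = d
        dist[(frozenset([b]), frozenset([a]))] = d
tree = upgma(dist, items)
```

Bootstrap values on the branches would then come from resampling the band columns and re-running the clustering, exactly as in the phylogenetic case.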
Abstract:
It is known that DNA-binding proteins can slide along the DNA helix while searching for specific binding sites, but their path of motion remains obscure. Do these proteins undergo simple one-dimensional (1D) translational diffusion, or do they rotate to maintain a specific orientation with respect to the DNA helix? We measured 1D diffusion constants as a function of protein size while maintaining the DNA-protein interface. Using bootstrap analysis of single-molecule diffusion data, we compared the results to theoretical predictions for pure translational motion and rotation-coupled sliding along the DNA. The data indicate that DNA-binding proteins undergo rotation-coupled sliding along the DNA helix and can be described by a model of diffusion along the DNA helix on a rugged free-energy landscape. A similar analysis including the 1D diffusion constants of eight proteins of varying size shows that rotation-coupled sliding is a general phenomenon. The average free-energy barrier for sliding along the DNA was 1.1 ± 0.2 k_BT. Such small barriers facilitate rapid search for binding sites.
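Bootstrapping single-molecule diffusion data, as described above, amounts to resampling measured displacements and recomputing the diffusion constant each time. A minimal sketch on simulated 1D displacements (the time step, units and true diffusion constant are illustrative assumptions, not values from the paper):

```python
import random
import statistics

# Simulated 1D single-molecule displacements over a time step dt,
# drawn so that <dx^2> = 2 * D * dt for a hypothetical D = 0.5.
random.seed(0)
dt = 0.01
true_D = 0.5
displacements = [random.gauss(0, (2 * true_D * dt) ** 0.5) for _ in range(400)]

def estimate_D(dx, dt):
    """Mean-squared-displacement estimator: D = <dx^2> / (2 dt)."""
    return statistics.fmean(x * x for x in dx) / (2 * dt)

def bootstrap_ci(dx, dt, n_reps=1000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for D obtained by
    resampling the displacements with replacement."""
    rng = random.Random(seed)
    estimates = sorted(
        estimate_D([rng.choice(dx) for _ in dx], dt) for _ in range(n_reps)
    )
    lo = estimates[int(n_reps * alpha / 2)]
    hi = estimates[int(n_reps * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = bootstrap_ci(displacements, dt)
```

Comparing such intervals across proteins of different size is what lets one discriminate translation-only from rotation-coupled sliding models.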
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of the U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and it also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of univariate models considered in Chapters 2 to 4 to the case of a bivariate model.
A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained and the bivariate model is found to outperform the univariate models in terms of predictive power.
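The parametric bootstrap suggested in Chapter 3 simulates many datasets from the model estimated under the null hypothesis, recomputes the test statistic on each, and uses the simulated distribution in place of the asymptotic one. A generic sketch of the mechanics, with a simple t-type statistic standing in for the thesis's LM statistic and a normal null model standing in for the probit (both are simplifying assumptions):

```python
import random
import statistics

def t_stat(sample, mu0):
    """t-type statistic for H0: mean == mu0."""
    n = len(sample)
    return (statistics.fmean(sample) - mu0) / (statistics.stdev(sample) / n ** 0.5)

def parametric_bootstrap_pvalue(sample, mu0, n_reps=999, seed=7):
    """Simulate samples from the null model fitted to the data (normal with
    mean mu0 and the estimated std), recompute the statistic each time, and
    report the fraction of simulated statistics at least as extreme as the
    observed one (with the usual +1 finite-sample adjustment)."""
    rng = random.Random(seed)
    observed = abs(t_stat(sample, mu0))
    sigma = statistics.stdev(sample)
    n = len(sample)
    count = sum(
        abs(t_stat([rng.gauss(mu0, sigma) for _ in range(n)], mu0)) >= observed
        for _ in range(n_reps)
    )
    return (count + 1) / (n_reps + 1)

random.seed(3)
null_sample = [random.gauss(0.0, 1.0) for _ in range(25)]
p_null = parametric_bootstrap_pvalue(null_sample, 0.0)
shifted = [x + 1.5 for x in null_sample]
p_shift = parametric_bootstrap_pvalue(shifted, 0.0)
```

Because the critical values come from the fitted null model rather than the asymptotic distribution, the test's size is approximately correct even in small samples, which is exactly the motivation given in the abstract.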
Abstract:
Topics in Spatial Econometrics — With Applications to House Prices
Spatial effects in data occur when geographical closeness of observations influences the relation between the observations. When two points on a map are close to each other, the observed values on a variable at those points tend to be similar. The further away the two points are from each other, the less similar the observed values tend to be. Recent technical developments, geographical information systems (GIS) and global positioning systems (GPS) have brought about a renewed interest in spatial matters. For instance, it is possible to observe the exact location of an observation and combine it with other characteristics. Spatial econometrics integrates spatial aspects into econometric models and analysis. The thesis concentrates mainly on methodological issues, but the findings are illustrated by empirical studies on house price data. The thesis consists of an introductory chapter and four essays. The introductory chapter presents an overview of topics and problems in spatial econometrics. It discusses spatial effects, spatial weights matrices, especially k-nearest neighbours weights matrices, and various spatial econometric models, as well as estimation methods and inference. Further, the problem of omitted variables, a few computational and empirical aspects, the bootstrap procedure and the spatial J-test are presented. In addition, a discussion on hedonic house price models is included. In the first essay a comparison is made between spatial econometrics and time series analysis. By restricting the attention to unilateral spatial autoregressive processes, it is shown that a unilateral spatial autoregression, which enjoys similar properties as an autoregression with time series, can be defined.
By an empirical study on house price data, the second essay shows that it is possible to form coordinate-based, spatially autoregressive variables, which are at least to some extent able to replace the spatial structure in a spatial econometric model. In the third essay a strategy for specifying a k-nearest neighbours weights matrix by applying the spatial J-test is suggested, studied and demonstrated. In the fourth and final essay the properties of the asymptotic spatial J-test are further examined. A simulation study shows that the spatial J-test can be used for distinguishing between general spatial models with different k-nearest neighbours weights matrices. A bootstrap spatial J-test is suggested to correct the size of the asymptotic test in small samples.
Abstract:
In the thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is directly based on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small sample correction for the likelihood ratio (LR) test of cointegrating rank and the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock prices data, and the finite sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden 1970–2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen’s LR tests for cointegration. In all papers we work with two data sets. The first data set is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)–2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
Abstract:
This paper is concerned with using the bootstrap to obtain improved critical values for the error correction model (ECM) cointegration test in dynamic models. In the paper we investigate the effects of dynamic specification on the size and power of the ECM cointegration test with bootstrap critical values. The results from a Monte Carlo study show that the size of the bootstrap ECM cointegration test is close to the nominal significance level. We find that overspecification of the lag length results in a loss of power. Underspecification of the lag length results in size distortion. The performance of the bootstrap ECM cointegration test deteriorates if the correct lag length is not used in the ECM. The bootstrap ECM cointegration test is therefore not robust to model misspecification.
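Bootstrap critical values of the kind used above are obtained by resampling residuals estimated under the null, rebuilding artificial series, and taking quantiles of the recomputed statistic. A simplified sketch with a Dickey-Fuller-type unit-root statistic standing in for the ECM cointegration statistic (the series, lag structure and model are illustrative assumptions, not the paper's design):

```python
import random

def ols_slope(y, x):
    """OLS slope and residuals for y = b*x + e (no intercept, for brevity)."""
    b = sum(a * c for a, c in zip(x, y)) / sum(a * a for a in x)
    resid = [c - b * a for a, c in zip(x, y)]
    return b, resid

def df_t_stat(series):
    """t-statistic on rho in  dy_t = rho * y_{t-1} + e_t  (H0: rho = 0)."""
    y_lag = series[:-1]
    dy = [b - a for a, b in zip(series, series[1:])]
    rho, resid = ols_slope(dy, y_lag)
    s2 = sum(e * e for e in resid) / (len(resid) - 1)
    se = (s2 / sum(a * a for a in y_lag)) ** 0.5
    return rho / se

def bootstrap_critical_value(series, alpha=0.05, n_reps=499, seed=11):
    """Resample the null-model residuals (under H0 the series is a pure
    random walk, so the differences are the residuals), rebuild artificial
    series, and take the alpha-quantile of the recomputed statistic."""
    rng = random.Random(seed)
    dy = [b - a for a, b in zip(series, series[1:])]
    stats_ = []
    for _ in range(n_reps):
        boot = [0.0]
        for _ in dy:
            boot.append(boot[-1] + rng.choice(dy))
        stats_.append(df_t_stat(boot))
    stats_.sort()
    return stats_[int(alpha * n_reps)]

random.seed(5)
walk = [0.0]
for _ in range(200):
    walk.append(walk[-1] + random.gauss(0, 1))
crit = bootstrap_critical_value(walk)
```

The paper's point about lag length carries over directly: if the residuals used for resampling come from a misspecified null model, the bootstrap distribution, and hence the critical values, inherit that misspecification.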
Abstract:
In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can only have discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and snow analyses for numerical weather prediction models from FMI and ECMWF was complicated by the fact that we did not have the true knowledge of snow extent, and we were forced simply to measure the agreement between different products. The Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between different products. The trustworthiness of the results for cloud analyses [EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)] compared with ceilometers of the Helsinki Testbed was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct CIs, especially in the presence of spatial and temporal correlation. The reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, for the accuracy and the timeliness of the particular data and methods.
In this vein, we discuss tentatively how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that they are of reasonable quality and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored. The Autoclass algorithm was used to construct compact representations of synoptic conditions of fog at Finnish airports.
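Constructing bootstrap CIs in the presence of temporal correlation, as above, typically means resampling blocks of consecutive observations rather than individual ones, so that the serial dependence is preserved inside each pseudo-sample. A minimal moving-block bootstrap sketch on a simulated, autocorrelated 0/1 agreement series (the data-generating process and block length are illustrative assumptions):

```python
import random
import statistics

# Simulated "sticky" 0/1 series standing in for per-observation agreement
# between a satellite cloud mask and a ceilometer: the state flips rarely,
# which induces temporal correlation.
random.seed(2)
series, state = [], 1
for _ in range(300):
    if random.random() < 0.1:
        state = 1 - state
    series.append(state)

def moving_block_bootstrap_ci(data, block_len=20, n_reps=2000, alpha=0.05, seed=9):
    """Percentile CI for the mean agreement rate, built by concatenating
    randomly chosen overlapping blocks of length block_len."""
    rng = random.Random(seed)
    n = len(data)
    starts = range(n - block_len + 1)
    means = []
    for _ in range(n_reps):
        sample = []
        while len(sample) < n:
            s = rng.choice(starts)
            sample.extend(data[s:s + block_len])
        means.append(statistics.fmean(sample[:n]))
    means.sort()
    return means[int(n_reps * alpha / 2)], means[int(n_reps * (1 - alpha / 2)) - 1]

lo, hi = moving_block_bootstrap_ci(series)
```

An ordinary (observation-level) bootstrap on the same series would understate the variance and produce intervals that are too narrow, which is why the block variant matters for correlated data.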
Abstract:
The thesis applies a quality indicator measuring the representativeness of data to the Finnish victimisation survey pilot in a situation with nonresponse. Nonresponse is a growing problem in survey research: if the respondents do not represent the sample with respect to the variable under study, the bias caused by nonresponse in the estimated statistics can be very large. Studies often publish the response rate as if it were a complete measure of survey quality. A high response rate alone does not, however, guarantee unbiased estimates, since it says nothing about the differences between respondents and nonrespondents with respect to the variable of interest. Other measures are therefore needed to better assess the quality of the respondent set, and the R-indicator offers one alternative. The R-indicator measures the variation in the response propensities of the sample units. Estimating the R-indicator thus requires estimating the response propensities, which in turn requires that auxiliary variables are available for all sample units. The response propensities were estimated using both a logistic model as the link function and the response influence model of Särndal and Lundström (2008). The set of auxiliary variables affecting response behaviour was selected based on the literature (Groves & Couper 1998). Since the estimator of the R-indicator is a random variable, its variance and possible bias had to be estimated (Shlomo et al. 2009). The estimation used the bootstrap resampling method, in which so-called pseudo-samples are drawn from the original data and used to compute a standard error for the R-indicator estimator. The Finnish victimisation survey pilot consisted of three samples collected with different data collection modes: CAPI, CATI and CAWI. Response rates varied widely across the data sets, but the R-indicator estimates were nearly the same for all of them. A higher response rate thus did not mean better representativeness.
Moreover, in the CAWI data, raising the response rate with reminder messages and letters worsened representativeness from the R-indicator's point of view. Arbitrarily increasing the response rate is therefore not necessarily justified. Regarding the properties of the R-indicator estimator, the empirical results confirmed earlier findings of the RISQ project. The value of the estimator decreased as more explanatory variables were included in the response propensity model, because the variance of the propensities then increased (Schouten et al. 2009). Sample size had a substantial effect on the variance: the smaller the sample, the wider the confidence intervals and the harder it was to draw conclusions about representativeness.
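The R-indicator and its bootstrap standard error can be sketched as follows. Here response propensities are estimated as group response rates from a single categorical auxiliary variable, a deliberately simplified stand-in for the logistic and Särndal and Lundström models used in the thesis, and all data are invented:

```python
import random
import statistics

# Hypothetical sample: each unit has an auxiliary group (known for the whole
# sample, respondents and nonrespondents alike) and a 0/1 response outcome.
random.seed(4)
groups = ["urban", "rural", "student"]
true_prop = {"urban": 0.5, "rural": 0.7, "student": 0.3}
sample = [
    (g, 1 if random.random() < true_prop[g] else 0)
    for g in (random.choice(groups) for _ in range(600))
]

def r_indicator(units):
    """R = 1 - 2 * sd(estimated propensities): 1 means a perfectly
    representative response, lower values mean more variable propensities."""
    rates = {
        g: statistics.fmean(r for gg, r in units if gg == g)
        for g in {gg for gg, _ in units}
    }
    props = [rates[g] for g, _ in units]
    return 1 - 2 * statistics.pstdev(props)

def bootstrap_se(units, n_reps=500, seed=8):
    """Standard error of the R-indicator estimator, computed from
    pseudo-samples drawn with replacement from the original data."""
    rng = random.Random(seed)
    reps = [
        r_indicator([rng.choice(units) for _ in units]) for _ in range(n_reps)
    ]
    return statistics.stdev(reps)

r_hat = r_indicator(sample)
se = bootstrap_se(sample)
```

Note the thesis's finding reappears in miniature here: adding more auxiliary variables to the propensity model can only increase the spread of the estimated propensities, which mechanically lowers the R-indicator estimate.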
Abstract:
The objectives of this study were to make a detailed and systematic empirical analysis of microfinance borrowers and non-borrowers in Bangladesh and also examine how efficiency measures are influenced by the access to agricultural microfinance. In the empirical analysis, this study used both parametric and non-parametric frontier approaches to investigate differences in efficiency estimates between microfinance borrowers and non-borrowers. This thesis, based on five articles, applied data obtained from a survey of 360 farm households from north-central and north-western regions in Bangladesh. The methods used in this investigation involve stochastic frontier analysis (SFA) and data envelopment analysis (DEA) in addition to sample selectivity and limited dependent variable models. In article I, technical efficiency (TE) estimation and identification of its determinants were performed by applying an extended Cobb-Douglas stochastic frontier production function. The results show that farm households had a mean TE of 83%, with lower TE scores for the non-borrowers of agricultural microfinance. Institutional policies addressing the consolidation of individual plots into farm units, ensuring access to microfinance, and providing extension education for farmers with longer farming experience are suggested to improve the TE of the farmers. In article II, the objective was to assess the effects of access to microfinance on household production and cost efficiency (CE) and to determine the efficiency differences between the microfinance participating and non-participating farms. In addition, a non-discretionary DEA model was applied to capture directly the influence of microfinance on farm households' production and CE. The results suggested that under both pooled DEA models and non-discretionary DEA models, farmers with access to microfinance were significantly more efficient than their non-borrowing counterparts.
Results also revealed that land fragmentation, family size, household wealth, on-farm training and off-farm income share are the main determinants of inefficiency after effectively correcting for sample selection bias. In article III, the TE of traditional variety (TV) and high-yielding-variety (HYV) rice producers was estimated in addition to investigating the determinants of the adoption rate of HYV rice. Furthermore, the role of TE as a potential determinant to explain the differences in adoption rates of HYV rice among the farmers was assessed. The results indicated that in spite of its much higher yield potential, HYV rice production was associated with lower TE and had a greater variability in yield. It was also found that TE had a significant positive influence on the adoption rates of HYV rice. In article IV, we estimated profit efficiency (PE) and profit-loss between microfinance borrowers and non-borrowers by a sample selection framework, which provided a general framework for testing and taking into account the sample selection in the stochastic (profit) frontier function analysis. After effectively correcting for selectivity bias, the mean PE of the microfinance borrowers and non-borrowers was estimated at 68% and 52%, respectively. This suggested that a considerable share of profits was lost due to profit inefficiencies in rice production. The results also demonstrated that access to microfinance contributes significantly to increasing PE and reducing profit-loss per hectare of land. In article V, the effects of credit constraints on TE, allocative efficiency (AE) and CE were assessed while adequately controlling for sample selection bias. The confidence intervals were determined by the bootstrap method for both samples. The results indicated that differences in average efficiency scores of credit-constrained and unconstrained farms were not statistically significant, although the average efficiencies tended to be higher in the group of unconstrained farms.
After effectively correcting for selectivity bias, household experience, number of dependents, off-farm income, farm size, access to on-farm training and yearly savings were found to be the main determinants of inefficiencies. In general, the results of the study revealed the existence of substantial technical, allocative and economic inefficiencies, as well as considerable profit inefficiencies. The results of the study suggested the need to streamline agricultural microfinance by the microfinance institutions (MFIs), donor agencies and government at all tiers. Moreover, formulating policies that ensure greater access to agricultural microfinance for smallholder farmers on a sustainable basis in the study areas to enhance productivity and efficiency has been recommended. Key Words: Technical, allocative, economic efficiency, DEA, Non-discretionary DEA, selection bias, bootstrapping, microfinance, Bangladesh.
Abstract:
The problem of structural system identification when measurements originate from multiple tests and multiple sensors is considered. An offline solution to this problem using bootstrap particle filtering is proposed. The central idea of the proposed method is the introduction of a dummy independent variable that allows for simultaneous assimilation of multiple measurements in a sequential manner. The method can treat linear/nonlinear structural models and allows for measurements on strains and displacements under static/dynamic loads. Illustrative examples consider measurement data from numerical models and also from laboratory experiments. The results from the proposed method are compared with those from a Kalman filter-based approach and the superior performance of the proposed method is demonstrated. Copyright (C) 2009 John Wiley & Sons, Ltd.
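A bootstrap particle filter of the kind proposed above can be sketched in a few lines for a scalar random-walk state observed in Gaussian noise. This is a generic illustration of the predict-weight-resample cycle that gives the filter its name, not the paper's structural identification setup or its measurement model:

```python
import math
import random
import statistics

# Simulate a hypothetical scalar state-space model:
#   x_t = x_{t-1} + N(0, q),   y_t = x_t + N(0, r)
random.seed(6)
T, q, r = 50, 0.1, 0.5
true_x, ys = [0.0], []
for _ in range(T):
    true_x.append(true_x[-1] + random.gauss(0, q))
    ys.append(true_x[-1] + random.gauss(0, r))

def bootstrap_particle_filter(ys, n_particles=500, q=0.1, r=0.5, seed=10):
    """Propagate particles through the dynamics (predict), weight them by
    the Gaussian likelihood of each observation, report the weighted mean
    as the state estimate, then resample -- the 'bootstrap' step."""
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    estimates = []
    for y in ys:
        particles = [p + rng.gauss(0, q) for p in particles]
        weights = [math.exp(-0.5 * ((y - p) / r) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

est = bootstrap_particle_filter(ys)
```

The paper's "dummy independent variable" trick would enter here by treating measurements from multiple tests and sensors as one sequence indexed by that variable, so they can be assimilated by the same sequential loop.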
Abstract:
We recast the reconstruction problem of diffuse optical tomography (DOT) in a pseudo-dynamical framework and develop a method to recover the optical parameters using particle filters, i.e., stochastic filters based on Monte Carlo simulations. In particular, we have implemented two such filters, viz., the bootstrap (BS) filter and the Gaussian-sum (GS) filter and employed them to recover optical absorption coefficient distribution from both numerically simulated and experimentally generated photon fluence data. Using either indicator functions or compactly supported continuous kernels to represent the unknown property distribution within the inhomogeneous inclusions, we have drastically reduced the number of parameters to be recovered and thus brought the overall computation time to within reasonable limits. Even though the GS filter outperformed the BS filter in terms of accuracy of reconstruction, both gave fairly accurate recovery of the height, radius, and location of the inclusions. Since the present filtering algorithms do not use derivatives, we could demonstrate accurate contrast recovery even in the middle of the object where the usual deterministic algorithms perform poorly owing to the poor sensitivity of measurement of the parameters. Consistent with the fact that the DOT recovery, being ill posed, admits multiple solutions, both the filters gave solutions that were verified to be admissible by the closeness of the data computed through them to the data used in the filtering step (either numerically simulated or experimentally generated). (C) 2011 Optical Society of America