864 results for Hierarchical sampling


Relevance: 100.00%

Abstract:

Structural information over the entire course of binding interactions, based on analyses of energy landscapes, is described, providing a framework for understanding the events involved in biomolecular recognition. The conformational dynamics of malectin's exquisite selectivity for diglucosylated N-glycan (Dig-N-glycan), a highly flexible oligosaccharide comprising numerous dihedral torsion angles, are described as an example. For this purpose, a novel approach based on hierarchical sampling is proposed for acquiring metastable molecular conformations constituting low-energy minima, to elucidate the structural features involved in this recognition. Four variants of principal component analysis were employed recursively in both Cartesian space and dihedral angle space, characterized by free energy landscapes, to select the most stable conformational substates. Subsequently, a k-means clustering algorithm was implemented for geometric separation of the major native state to acquire a final ensemble of metastable conformers. A comparison of malectin complexes was then performed to characterize their conformational properties. Analyses of stereochemical metrics and other concerted binding events revealed surface complementarity, cooperative and bidentate hydrogen bonds, water-mediated hydrogen bonds, and carbohydrate-aromatic interactions, including CH-pi and stacking interactions, involved in this recognition. Additionally, a striking structural transition from loop to beta-strands in the malectin CRD upon specific binding to Dig-N-glycan is observed. The interplay of these binding events between malectin and Dig-N-glycan supports an extended conformational selection model as the underlying binding mechanism.
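The dimensionality-reduction-then-clustering step can be sketched as follows. This is a minimal illustration only, not the authors' pipeline (which uses four PCA variants recursively and free energy landscapes): a single PCA via SVD followed by a naive k-means on synthetic, hypothetical "dihedral-space" snapshots drawn from two metastable basins.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in for MD snapshots: two metastable basins in a
# 3-dimensional "dihedral" space (hypothetical data, not from the study)
basin_a = rng.normal([0.0, 0.0, 0.0], 0.1, size=(100, 3))
basin_b = rng.normal([1.0, 1.0, 0.0], 0.1, size=(100, 3))
X = np.vstack([basin_a, basin_b])

# PCA via SVD of the centred snapshot matrix
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T                      # project onto two leading components

# naive k-means (k=2) in PC space to separate the two substates;
# initialise from two snapshots known to lie in different basins
centers = np.array([pcs[0], pcs[-1]])
for _ in range(20):
    labels = np.argmin(((pcs[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([pcs[labels == k].mean(axis=0) for k in range(2)])
```

On data this well separated, the two cluster labels recover the two basins exactly; each cluster then yields one representative metastable conformer.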

Relevance: 70.00%

Abstract:

Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. 
We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the experimental design. Our findings provide novel evidence to aid the design of future sampling programs and improve our general understanding of the mechanisms regulating elemental fingerprints.
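The role of within-otolith replication can be illustrated with a toy two-level variance component calculation: ablations nested within otoliths, analysed with the standard nested ANOVA estimators. All numbers here are simulated for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_oto, n_abl = 20, 5                      # otoliths, ablations per otolith (hypothetical)
sig_among, sig_within = 2.0, 1.0          # true standard deviations at each level
oto_means = rng.normal(10.0, sig_among, n_oto)
data = oto_means[:, None] + rng.normal(0.0, sig_within, (n_oto, n_abl))

# nested (one-way) ANOVA mean squares
grand = data.mean()
ms_among = n_abl * ((data.mean(axis=1) - grand) ** 2).sum() / (n_oto - 1)
ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n_oto * (n_abl - 1))

# method-of-moments variance components: within = MS_w, among = (MS_a - MS_w) / n
var_within = ms_within
var_among = max((ms_among - ms_within) / n_abl, 0.0)
```

The within-otolith component is only estimable because multiple ablations per otolith were retained in the design, which is the paper's point about allocating some replication at the finest level.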

Relevance: 60.00%

Abstract:

This paper considers the use of servo-mechanisms as part of a tightly integrated homogeneous Wireless Multimedia Sensor Network (WMSN). We describe the design of our second-generation WMSN node platform, which has increased image resolution, in-built audio sensors, PIR sensors, and servo-mechanisms. These devices vary widely in their energy consumption and in the information quality they return. As a result, we propose a framework that establishes a hierarchy of devices (sensors and actuators) within the node and uses frequent sampling of cheaper devices to trigger the activation of more energy-hungry devices. Within this framework, we consider the suitability of servos for WMSNs by examining their functional characteristics and by measuring the energy consumption of two analog and two digital servos, in order to determine their impact on overall node energy cost. We also implement a simple version of our hierarchical sampling framework to evaluate the energy consumption of servos relative to other node components. The evaluation results show that: (1) the energy consumption of servos is small relative to the audio/image signal processing energy cost in WMSN nodes; (2) digital servos do not necessarily consume as much energy as is currently believed; and (3) the energy cost per degree of panning is lower for larger panning angles.
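The cheap-device-gates-expensive-device idea can be sketched in a few lines. The cost figures and threshold below are invented placeholders, not measurements from the paper:

```python
# hypothetical per-sample energy costs in millijoules (not from the paper)
COST = {"pir": 0.1, "servo_pan": 20.0, "camera": 50.0}

def run_node(pir_trace, threshold=0.5):
    """Sample the cheap PIR every tick; wake servo + camera only on triggers."""
    energy, events = 0.0, 0
    for reading in pir_trace:
        energy += COST["pir"]              # always-on cheap tier
        if reading > threshold:            # cheap device gates the hierarchy
            energy += COST["servo_pan"]    # pan toward the stimulus
            energy += COST["camera"]       # capture an image frame
            events += 1
    return energy, events

energy, events = run_node([0.1, 0.9, 0.2, 0.7])
```

With this trace, two of four ticks trigger the expensive tier, so most of the budget is spent only when the cheap sensor reports something worth imaging.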

Relevance: 60.00%

Abstract:

ecosystems. Coastal oceanic upwelling, for example, has been associated with elevated biomass and abundance patterns of certain functional groups, e.g., corticated macroalgae. In the upwelling system of Northern Chile, we examined measures of intertidal macrobenthic composition, structure and trophic ecology across eighteen shores varying in their proximity to two coastal upwelling centres, in a hierarchical sampling design (spatial scales of >1 and >10 km). The influence of coastal upwelling on intertidal communities was confirmed by the stable isotope values (δ13C and δ15N) of consumers, including a dominant suspension feeder, grazers, and their putative resources of POM, epilithic biofilm, and macroalgae. We highlight the utility of muscle δ15N from the suspension-feeding mussel, Perumytilus purpuratus, as a proxy for upwelling, supported by satellite data and previous studies. Where possible, we used corrections for broader-scale trends, spatial autocorrelation, ontogenetic dietary shifts and spatial baseline isotopic variation prior to analysis. Our results showed that macroalgal assemblage composition, and benthic consumer assemblage structure, varied significantly with the intertidal influence of coastal upwelling, especially contrasting bays and coastal headlands. Coastal topography also separated differences in consumer resource use. This suggested that coastal upwelling, itself driven by coastline topography, influences intertidal communities by advecting nearshore phytoplankton populations offshore and cooling coastal water temperatures. We recommend the isotopic values of benthic organisms, specifically long-lived suspension feeders, as in situ alternatives to offshore measurements of upwelling influence.

Relevance: 60.00%

Abstract:

Spatial patterns in assemblage structures are generated by ecological processes that occur on multiple scales. Identifying these processes is important for the prediction of impact, for restoration and for conservation of biodiversity. This study used a hierarchical sampling design to quantify variations in assemblage structures of Brazilian estuarine fish across 2 spatial scales and to reveal the ecological processes underlying the patterns observed. Eight areas separated by 0.7 to 25 km (local scale) were sampled in 5 estuaries separated by 970 to 6000 km (regional scale) along the coast, encompassing both tropical and subtropical regions. The assemblage structure varied significantly in terms of relative biomass and presence/absence of species on both scales, but the regional variation was greater than the local variation for either dataset. However, the 5 estuaries sampled segregated into 2 major groups largely congruent with the Brazilian and Argentinian biogeographic provinces. Three environmental variables (mean temperature of the coldest month, mangrove area and mean annual precipitation) and distance between estuaries explained 44.8 and 16.3%, respectively, of the regional-scale variability in the species relative biomass. At the local scale, the importance of environmental predictors for the spatial structure of the assemblages differed between estuarine systems. Overall, these results support the idea that on a regional scale, the composition of fish assemblages is simultaneously determined by environmental filters and species dispersal capacity, while on a local scale, the effect of environmental factors should vary depending on estuary-specific physical and hydrological characteristics. © 2013 Inter-Research.

Relevance: 60.00%

Abstract:

Seagrasses are ecosystem engineers that offer important habitat for a large number of species and provide a range of ecosystem services. Many seagrass ecosystems are dominated by a single species, with research showing that genotypic diversity at fine spatial scales plays an important role in maintaining a range of ecosystem functions. However, for most seagrass species, information on fine-scale patterns of genetic variation in natural populations is lacking. In this study we use a hierarchical sampling design to determine levels of genetic and genotypic diversity at different spatial scales (centimetres, metres, kilometres) in the Australian seagrass Zostera muelleri. Our analysis shows that at fine spatial scales (< 1 m) levels of genotypic diversity are relatively low (R (plots) = 0.37 ± 0.06 SE), although there is some intermingling of genotypes. At the site (tens of metres) and meadow location (kilometres) scales we found higher levels of genotypic diversity (R (sites) = 0.79 ± 0.04 SE; R (locations) = 0.78 ± 0.04 SE). We found some sharing of genotypes between sites within meadows, but no sharing of genotypes between meadow locations. We also detected a high level of genetic structuring between meadow locations (FST = 0.278). Taken together, our results indicate that both sexual and asexual reproduction are important in maintaining meadows of Z. muelleri. The dominant mechanism of asexual reproduction appears to be localised rhizome extension, although the sharing of a limited number of genotypes over scales of tens of metres could also result from localised dispersal and recruitment of fragments. The large number of unique genotypes at the meadow scale indicates that sexual reproduction is important in maintaining these populations, while the high level of genetic structuring suggests little gene flow and connectivity between our study sites.
These results imply that recovery from disturbances will occur through both sexual and asexual regeneration, but that the limited connectivity at the landscape scale means recovery from meadow-scale losses is likely to be limited.
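The R values quoted above are consistent with the commonly used clonal (genotypic) richness index R = (G − 1)/(N − 1), where G is the number of distinct genotypes among N sampled ramets; that this study used exactly this index is an assumption here, not stated in the abstract. As a one-liner:

```python
def clonal_richness(n_genotypes, n_samples):
    """Clonal (genotypic) richness R = (G - 1) / (N - 1): 0 when all
    samples share one genotype, 1 when every sample is a unique genotype."""
    return (n_genotypes - 1) / (n_samples - 1)
```

For example, 5 genotypes among 11 shoots gives R = 0.4, a mixed clonal/sexual signature like the plot-scale values reported above.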

Relevance: 60.00%

Abstract:

The sirex woodwasp, Sirex noctilio Fabricius (Hymenoptera: Siricidae), was introduced into Brazil in 1988 and has become the main pest of pine plantations. It is distributed across approximately 1,000,000 ha, at different population levels, in the states of Rio Grande do Sul, Santa Catarina, Paraná, São Paulo and Minas Gerais. The woodwasp population is controlled mainly with the nematode Deladenus siricidicola Bedding (Nematoda: Neothylenchidae). Evaluating the efficiency of these natural enemies is difficult because no appropriate sampling system exists. This study tested a hierarchical sampling system to define the sample size needed to monitor the S. noctilio population and the efficiency of its natural enemies, and the system proved adequate.

Relevance: 40.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging, yet the variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme, with separating distances increasing in geometric progression from stage to stage, will achieve that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence every lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
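The accumulation step described above can be sketched directly: the rough semivariance at each stage's lag is the running sum of the variance components from the shortest lag upward. The stage distances and components below are invented placeholders, not data from the surveys:

```python
# hypothetical variance components from a nested ANOVA, one per stage,
# with separating distances in geometric progression (not survey data)
stages = [10.0, 30.0, 90.0, 270.0]        # separating distance of each stage (m)
components = [0.8, 0.5, 0.3, 0.2]         # estimated variance component per stage

# accumulate components from the shortest lag upward: gamma(h_i) ~ sum of
# components of all stages with separating distance <= h_i
semivariance = []
total = 0.0
for comp in components:
    total += comp
    semivariance.append(total)
variogram = list(zip(stages, semivariance))   # (lag, rough semivariance) pairs
```

The result is a monotone, four-point first approximation to the variogram obtained from very few samples, exactly the "rough variogram for modest effort" the abstract describes.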

Relevance: 40.00%

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.

Relevance: 30.00%

Abstract:

This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and on the computation of normalization constants arose from pursuing these data-analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of the zeroes recorded: these may represent a zero response given some threshold (presence), or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses while taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts; the dingo, cypress and toad case studies described in the motivation chapter are examples. Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for the parameters of these models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics, and model choice can be assessed by incorporating another tier in the modelling hierarchy.
This requires evaluation of a normalization constant, a notoriously difficult problem. The difficulty of estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model; this work introduces the background computations required for the full implementation of the four-tier model in Chapter 7. Two extensions of the three-tier model to a four-tier version are investigated. The first incorporates temporal dependence in the underlying spatio-temporal process; the second allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer. A major contribution of the thesis is the development, for the first time, of a fully Bayesian approach to inference for these hierarchical models.
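A Gibbs sampler for a binary MRF of the kind discussed above can be sketched in a few lines. This uses a simplified two-parameter autologistic model (logit of each site's conditional probability is alpha plus beta times the sum of its four neighbours), not the thesis's three-parameter version, and plain Gibbs updates rather than its hybrid Metropolis/Gibbs scheme:

```python
import numpy as np

def gibbs_sweep(grid, alpha, beta, rng):
    """One Gibbs sweep over a binary (0/1) lattice under a simplified
    autologistic model: logit P(x_ij = 1 | rest) = alpha + beta * (4-neighbour sum)."""
    n, m = grid.shape
    for i in range(n):
        for j in range(m):
            nb = 0                          # sum of the available 4-neighbours
            if i > 0:     nb += grid[i - 1, j]
            if i < n - 1: nb += grid[i + 1, j]
            if j > 0:     nb += grid[i, j - 1]
            if j < m - 1: nb += grid[i, j + 1]
            p = 1.0 / (1.0 + np.exp(-(alpha + beta * nb)))
            grid[i, j] = 1 if rng.random() < p else 0
    return grid

rng = np.random.default_rng(2)
grid = rng.integers(0, 2, size=(16, 16))    # random initial configuration
for _ in range(50):                          # burn-in sweeps (illustrative length)
    gibbs_sweep(grid, alpha=-1.0, beta=0.6, rng=rng)
```

Positive beta encourages neighbouring sites to agree, producing the spatially clustered binary fields the hierarchical model places beneath the logistic data layer.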

Relevance: 30.00%

Abstract:

The paper investigates a detailed active shock control bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil, RAE 5243, to reduce cruise drag at transonic flow conditions using Evolutionary Algorithms (EAs) coupled to a robust design approach. For the uncertain design parameters, the position of boundary layer transition (xtr) and the coefficient of lift (Cl) are considered (250 stochastic samples in total). Two robust design methods are considered: the first uses a standard robust design method, which evaluates each design model at 250 stochastic conditions for uncertainty. The second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50, 15) for uncertainty. Numerical results show that the evolutionary optimization method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes that have low sensitivity and high aerodynamic performance while achieving significant total drag reduction. The results also show the benefit of using the hierarchical robust method for detailed uncertainty design optimization.
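The hierarchical (multi-population) sampling idea can be sketched as a screening cascade: every candidate design is evaluated under a few stochastic samples, and only the most promising survivors are re-evaluated with larger sample populations. The objective function, budgets per stage, and survivor counts below are invented for illustration, not the paper's CFD setup:

```python
import random
import statistics

# hypothetical noisy objective: "drag" of a scalar design variable under
# uncertain operating conditions (stand-in for the xtr / Cl uncertainty)
def noisy_drag(design, rng):
    return (design - 0.3) ** 2 + rng.gauss(0.0, 0.05)

def hierarchical_robust_eval(designs, budgets=(15, 50, 250), keep=(6, 3, 1)):
    """Screen with 15 samples, re-evaluate survivors with 50, then 250."""
    rng = random.Random(0)                 # fixed seed for reproducibility
    pool = list(designs)
    for budget, k in zip(budgets, keep):
        scored = [(statistics.mean(noisy_drag(d, rng) for _ in range(budget)), d)
                  for d in pool]
        scored.sort()                      # lower mean drag is better
        pool = [d for _, d in scored[:k]]  # only survivors advance a stage
    return pool[0]

best = hierarchical_robust_eval([i / 10 for i in range(11)])
```

Most of the sampling budget is spent on the handful of finalists, which is the economy the (250, 50, 15) scheme buys over evaluating every design 250 times.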

Relevance: 30.00%

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and they are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause identification in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. This approach yields highly informative estimates of change point parameters, since the results are based on probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then pursued in healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts; variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance, as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
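A minimal illustration of Bayesian change point estimation for a step change in a Poisson rate is a discrete grid posterior with conjugate Gamma priors on the before/after rates and a uniform prior on the change time. This textbook formulation is far simpler than the thesis's MCMC machinery and risk-adjusted extensions; the counts below are invented:

```python
import math

def changepoint_posterior(counts, a=1.0, b=1.0):
    """Posterior over the step-change time tau for Poisson counts, with
    independent Gamma(a, b) priors on the two rates and a uniform prior on
    tau; the rates are integrated out analytically (conjugacy)."""
    n = len(counts)
    logpost = []
    for tau in range(1, n):                    # change occurs after observation tau
        s1, s2 = sum(counts[:tau]), sum(counts[tau:])
        lp = (math.lgamma(a + s1) - (a + s1) * math.log(b + tau)
              + math.lgamma(a + s2) - (a + s2) * math.log(b + n - tau))
        logpost.append(lp)
    mx = max(logpost)                          # normalise in log space for stability
    w = [math.exp(lp - mx) for lp in logpost]
    z = sum(w)
    return [wi / z for wi in w]                # P(tau = 1, ..., n-1 | counts)

counts = [2, 3, 1, 2, 3, 9, 8, 10, 9, 11]      # simulated rate shift after t = 5
post = changepoint_posterior(counts)
tau_hat = 1 + max(range(len(post)), key=post.__getitem__)
```

The posterior mode lands on the true change time, and, as the abstract emphasises, the full distribution over tau (not just a point estimate) is what tightens the root-cause search window.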

Relevance: 30.00%

Abstract:

This paper discusses a method for scaling SVMs with a Gaussian kernel function to handle large data sets by using a selective sampling strategy for the training set. It employs a scalable hierarchical clustering algorithm to construct cluster indexing structures of the training data in the kernel-induced feature space. These are then used for selective sampling of the training data for the SVM, imparting scalability to the training process. Empirical studies on real-world data sets show that the proposed strategy performs well on large data sets.
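The cluster-then-sample idea can be sketched end to end. This is a deliberately simplified stand-in for the paper's method: it uses flat k-means in input space (not a scalable hierarchical clustering in the kernel-induced feature space) and trains a linear SVM by hinge-loss sub-gradient descent (not a Gaussian-kernel SVM); the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic two-class training set (hypothetical stand-in for a large data set)
X = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y = np.hstack([-np.ones(500), np.ones(500)])

def kmeans(pts, k, iters=20):
    centers = pts[rng.choice(len(pts), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((pts[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([pts[lab == c].mean(axis=0) if (lab == c).any()
                            else centers[c] for c in range(k)])
    return centers

# selective sampling: cluster each class, keep the point nearest each centre;
# the reduced set stands in for the full data when training the SVM
keep = []
for cls in (-1, 1):
    idx = np.where(y == cls)[0]
    for c in kmeans(X[idx], k=10):
        keep.append(idx[np.argmin(((X[idx] - c) ** 2).sum(-1))])
Xs, ys = X[keep], y[keep]

# linear SVM via Pegasos-style hinge-loss sub-gradient descent (no bias term;
# the synthetic classes are symmetric about the origin)
w, lam = np.zeros(2), 0.01
for t in range(1, 2001):
    i = rng.integers(len(Xs))
    eta = 1.0 / (lam * t)
    if ys[i] * (Xs[i] @ w) < 1:
        w = (1 - eta * lam) * w + eta * ys[i] * Xs[i]
    else:
        w = (1 - eta * lam) * w
acc = ((X @ w) * y > 0).mean()               # accuracy on the full set
```

Training sees only 20 representatives instead of 1000 points, yet the resulting classifier still separates the full data well, which is the scalability trade the paper exploits.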