904 results for Hierarchical scaffold


Relevance: 20.00%

Publisher:

Abstract:

An emerging consensus in cognitive science views the biological brain as a hierarchically organized predictive processing system. This is a system in which higher-order regions continuously attempt to predict the activity of lower-order regions at a variety of (increasingly abstract) spatial and temporal scales. The brain is thus revealed as a hierarchical prediction machine that is constantly engaged in the effort to predict the flow of information originating from the sensory surfaces. Such a view affords a great deal of explanatory leverage when it comes to a broad swathe of seemingly disparate psychological phenomena (e.g., learning, memory, perception, action, emotion, planning, reasoning, imagination, and conscious experience). In the most positive case, the predictive processing story provides a first glimpse of what a unified (computationally tractable and neurobiologically plausible) account of human psychology might look like. This marks out one reason why such models should be the focus of current empirical and theoretical attention. Another reason, however, is rooted in the potential of such models to advance the current state of the art in machine intelligence and machine learning. Interestingly, the vision of the brain as a hierarchical prediction machine establishes contact with work that goes under the heading of 'deep learning'. Deep learning systems often make use of predictive processing schemes and (increasingly abstract) generative models as a means of supporting the analysis of large data sets. But are such computational systems sufficient (by themselves) to provide a route to general human-level analytic capabilities? I will argue that they are not, and that closer attention to a broader range of forces and factors (many of which are not confined to the neural realm) may be required to understand what gives human cognition its distinctive (and largely unique) flavour. The vision that emerges is one of 'homomimetic deep learning systems': systems that situate a hierarchically organized predictive processing core within a larger nexus of developmental, behavioural, symbolic, technological and social influences. Relative to that vision, I suggest that we should see the Web as a form of 'cognitive ecology', one that is as much involved with the transformation of machine intelligence as with the progressive reshaping of our own cognitive capabilities.
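
The following is a minimal sketch, not taken from the abstract, of the core loop the author describes: each level of a hierarchy tries to predict the activity of the level below, and only the prediction errors are passed upward. The layer sizes, the linear generative mapping and the update rule are illustrative assumptions.

```python
# Minimal hierarchical predictive-coding sketch (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
sizes = [16, 8, 4]  # sensory layer -> increasingly abstract higher layers
W = [rng.normal(scale=0.1, size=(sizes[i], sizes[i + 1])) for i in range(len(sizes) - 1)]

def settle(x, n_steps=50, lr=0.1):
    """Infer higher-level activity by iteratively reducing prediction errors."""
    acts = [x] + [np.zeros(s) for s in sizes[1:]]
    for _ in range(n_steps):
        for i in range(1, len(acts)):
            pred = W[i - 1] @ acts[i]            # top-down prediction of the layer below
            err = acts[i - 1] - pred             # bottom-up prediction error
            acts[i] += lr * (W[i - 1].T @ err)   # adjust the higher layer to explain the error
    return acts

acts = settle(rng.normal(size=sizes[0]))
print([a.shape for a in acts])
```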

Relevance: 20.00%

Publisher:

Abstract:

We demonstrate that it is possible to link multi-chain molecular dynamics simulations with the tube model using a single-chain slip-link model as a bridge. This hierarchical approach allows a significant speed-up of the simulations, permitting us to span the time scales relevant for a comparison with tube theory. By fitting the mean-square displacement of individual monomers in the molecular dynamics simulations with the slip-spring model, we show that it is possible to predict the stress relaxation. We then analyze the stress relaxation from slip-spring simulations in the framework of the tube theory. In the absence of constraint release, we establish that the relaxation modulus can be decomposed as the sum of contributions from fast and longitudinal Rouse modes, and tube survival. Finally, we discuss some open questions regarding possible future directions that could be profitable in rendering the tube model quantitative, even for mildly entangled polymers.
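
As an illustration of the kind of decomposition described above, the sketch below evaluates a relaxation modulus built from Rouse-mode and tube-survival contributions. The functional forms are the textbook Rouse and Doi-Edwards expressions, not the specific fits from this paper, and all parameter values are placeholders.

```python
# Illustrative decomposition of G(t) into Rouse modes plus tube survival.
import numpy as np

def rouse_modes(t, tau_R, n_modes, prefactor):
    """Sum of exponentially relaxing Rouse modes with tau_p = tau_R / p**2."""
    p = np.arange(1, n_modes + 1)
    return prefactor * np.exp(-np.outer(t, p**2) / tau_R).sum(axis=1)

def tube_survival(t, tau_d, n_terms=99):
    """Doi-Edwards tube survival function mu(t), odd modes only."""
    p = np.arange(1, n_terms + 1, 2)
    return (8.0 / np.pi**2) * ((1.0 / p**2) * np.exp(-np.outer(t, p**2) / tau_d)).sum(axis=1)

t = np.logspace(-2, 4, 200)          # placeholder time grid (simulation units)
G_e, tau_R, tau_d = 1.0, 10.0, 1e3   # placeholder plateau modulus and relaxation times
G = rouse_modes(t, tau_R, 20, 0.2) + G_e * tube_survival(t, tau_d)
print(G[0], G[-1])
```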

Relevance: 20.00%

Publisher:

Abstract:

We have investigated the use of a laminin-coated compressed collagen gel containing corneal fibroblasts (keratocytes) as a novel scaffold to support the growth of corneal limbal epithelial stem cells. The growth of limbal epithelial cells was compared between the compressed collagen gel and a clinically proven conventional substrate, denuded amniotic membrane. Following compression of the collagen gel, the encapsulated keratocytes remained viable, and scanning electron microscopy showed that fibres within the compressed gel were dense, homogeneous and similar in structure to those within denuded amniotic membrane. Limbal epithelial cells were successfully expanded upon the compressed collagen, resulting in stratified layers of cells containing desmosome and hemidesmosome structures. The resulting corneal constructs of the two groups were comparable in transparency, cell morphology and cell stratification. Similar protein expression profiles for cytokeratin 3 and cytokeratin 14, and no significant difference in cytokeratin 12 mRNA expression levels by real-time PCR, were also observed. This study provides the first line of evidence that a laminin-coated compressed collagen gel containing keratocytes can adequately support limbal epithelial cell expansion, stratification and differentiation to a degree that is comparable to the leading conventional scaffold, denuded amniotic membrane.

Relevance: 20.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
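
The accumulation step described above is simple enough to show directly. In this sketch (not the paper's Fortran code), the variance components per stage are placeholder numbers; in practice they would come from the hierarchical analysis of variance (balanced design) or a REML fit (unbalanced design), and each stage is associated with its separating distance.

```python
# Rough variogram by accumulating nested variance components from the shortest lag.
import numpy as np

lags = np.array([10.0, 30.0, 90.0, 270.0])     # separating distances (m), geometric progression
components = np.array([0.8, 1.5, 2.1, 0.6])    # placeholder variance components, shortest-lag stage first

semivariance = np.cumsum(components)           # accumulate from the shortest lag upward
for d, g in zip(lags, semivariance):
    print(f"lag {d:6.1f} m   gamma ~ {g:.2f}")
```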

Relevance: 20.00%

Publisher:

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at larger scales. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
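
For readers who want to reproduce this kind of REML analysis of an unbalanced nested design, the sketch below is one possible route using statsmodels rather than the authors' own software. The file and column names (value, stage1, stage2) are hypothetical, and the example handles a simple two-level nesting; the fitted components can then be accumulated into a rough variogram as in the previous entry.

```python
# Hedged sketch: nested variance components by REML with statsmodels MixedLM.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nested_survey.csv")          # hypothetical file with columns: value, stage1, stage2

model = smf.mixedlm(
    "value ~ 1", df,
    groups="stage1",                           # coarsest stage
    re_formula="1",                            # random intercept: stage-1 component
    vc_formula={"stage2": "0 + C(stage2)"},    # stage-2-within-stage-1 component
)
result = model.fit(reml=True)                  # REML criterion (the default)

print(result.cov_re)    # stage-1 variance component
print(result.vcomp)     # stage-2 variance component
print(result.scale)     # residual (finest-stage) component
```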

Relevance: 20.00%

Publisher:

Abstract:

In this paper we focus on the one-year-ahead prediction of the daily electricity peak-demand trajectory during the winter season in Central England and Wales. We define a Bayesian hierarchical model for predicting the winter trajectories and present results based on past observed weather. Thanks to the flexibility of the Bayesian approach, we are able to produce the marginal posterior distributions of all the predictands of interest; this is fundamental progress with respect to the classical methods. The results are encouraging in both skill and representation of uncertainty. Further extensions are straightforward, at least in principle. The two main extensions consist in conditioning the weather generator model on additional information, such as knowledge of the first part of the winter and/or the seasonal weather forecast. Copyright (C) 2006 John Wiley & Sons, Ltd.
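
The paper's model couples a weather generator to demand, which is not reproduced here. The sketch below is only a minimal illustration, under stated assumptions, of the general shape of a Bayesian hierarchical model for daily winter peak demand (a per-winter level plus a temperature term) and of how marginal posteriors for all quantities fall out of the posterior samples. Variable names, priors and the synthetic data are illustrative.

```python
# Minimal hierarchical model sketch in PyMC (synthetic data, placeholder priors).
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_winters, n_days = 10, 90
winter_idx = np.repeat(np.arange(n_winters), n_days)
temp = rng.normal(3.0, 4.0, size=n_winters * n_days)            # daily temperature (deg C)
demand = 50.0 - 1.2 * temp + rng.normal(0, 2, size=temp.size)   # synthetic daily peak demand (GW)

with pm.Model():
    mu_w = pm.Normal("mu_w", 50.0, 10.0)
    sigma_w = pm.HalfNormal("sigma_w", 5.0)
    winter = pm.Normal("winter", mu_w, sigma_w, shape=n_winters)  # per-winter level (hierarchical)
    beta = pm.Normal("beta", 0.0, 1.0)                            # temperature sensitivity
    sigma = pm.HalfNormal("sigma", 5.0)
    mu = winter[winter_idx] + beta * temp
    pm.Normal("obs", mu, sigma, observed=demand)
    idata = pm.sample(1000, tune=1000, chains=2)                  # marginal posteriors for all parameters
```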

Relevance: 20.00%

Publisher:

Abstract:

Recently, two approaches have been introduced that distribute the molecular fragment mining problem. The first applies a master/worker topology; the second, a completely distributed peer-to-peer system, removes the scalability problem caused by the bottleneck at the master node. However, in many real-world scenarios the participating computing nodes cannot communicate directly due to administrative policies such as security restrictions. Thus, potential computing power is not accessible to accelerate the mining run. To address this shortcoming, this work introduces a hierarchical topology of computing resources, which distributes the management over several levels and adapts to the natural structure of such multi-domain architectures. The most important aspect is the load balancing scheme, which has been designed and optimized for the hierarchical structure. The approach allows dynamic aggregation of heterogeneous computing resources and is applied to wide-area network scenarios.
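
The basic idea of a hierarchical topology with capacity-aware load balancing can be illustrated with a small sketch; this is not the paper's implementation, and the class and method names are hypothetical. Managers form a tree that mirrors the administrative domains, and each manager splits incoming work among its children in proportion to the capacity they report upward.

```python
# Hierarchical, proportional work distribution over a tree of domains (sketch).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    capacity: float = 0.0                 # leaf: own computing power; managers aggregate children
    children: list["Node"] = field(default_factory=list)

    def total_capacity(self) -> float:
        if not self.children:             # worker leaf inside one administrative domain
            return self.capacity
        return sum(c.total_capacity() for c in self.children)

    def assign(self, work_units: int) -> dict[str, int]:
        """Recursively split work among children proportionally to capacity."""
        if not self.children:
            return {self.name: work_units}
        total = self.total_capacity()
        out: dict[str, int] = {}
        remaining = work_units
        for c in self.children[:-1]:
            share = round(work_units * c.total_capacity() / total)
            out.update(c.assign(share))
            remaining -= share
        out.update(self.children[-1].assign(remaining))   # last child absorbs rounding error
        return out

root = Node("root", children=[
    Node("domainA", children=[Node("a1", 4.0), Node("a2", 2.0)]),
    Node("domainB", children=[Node("b1", 6.0)]),
])
print(root.assign(1200))   # -> {'a1': 400, 'a2': 200, 'b1': 600}
```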

Relevance: 20.00%

Publisher:

Abstract:

R. H. Whittaker's idea that plant diversity can be divided into a hierarchy of spatial components, from alpha at the within-habitat scale through beta for the turnover of species between habitats to gamma along regional gradients, implies the underlying existence of alpha, beta, and gamma niches. We explore the hypothesis that the evolution of alpha, beta, and gamma niches is also hierarchical, with traits that define the alpha niche being labile, while those defining the beta and gamma niches are conservative. At the alpha level we find support for the hypothesis in the lack of a close significant phylogenetic relationship between meadow species that have similar alpha niches. In a second test, alpha niche overlap based on a variety of traits is compared between congeners and noncongeners in several communities; here, too, there is no evidence of a correlation between alpha niche and phylogeny. To test whether beta and gamma niches evolve conservatively, we reconstructed the evolution of relevant traits on evolutionary trees for 14 different clades. Tests against null models revealed a number of instances, including some in island radiations, in which habitat (beta niche) and elevational maximum (an aspect of the gamma niche) showed evolutionary conservatism.

Relevance: 20.00%

Publisher:

Abstract:

Genetic polymorphisms in deoxyribonucleic acid coding regions may have a phenotypic effect on the carrier, e.g. by influencing susceptibility to disease. Detection of deleterious mutations via association studies is hampered by the large number of candidate sites; therefore methods are needed to narrow down the search to the most promising sites. One possible approach is to use structural and sequence-based information about the encoded protein to predict whether a mutation at a particular site is likely to disrupt the functionality of the protein itself. We propose a hierarchical Bayesian multivariate adaptive regression spline (BMARS) model for supervised learning in this context and assess its predictive performance by using data from mutagenesis experiments on lac repressor and lysozyme proteins. In these experiments, about 12 amino-acid substitutions were performed at each native amino-acid position and the effect on protein functionality was assessed. The training data thus consist of repeated observations at each position, which the hierarchical framework is needed to account for. The model is trained on the lac repressor data and tested on the lysozyme mutations, and vice versa. In particular, we show that the hierarchical BMARS model, by allowing for the clustered nature of the data, yields lower out-of-sample misclassification rates compared with a non-hierarchical BMARS model, a frequentist MARS model, a support vector machine classifier and an optimally pruned classification tree.
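
BMARS itself is not available in standard libraries, so the sketch below illustrates only the hierarchical ingredient the abstract emphasises: repeated substitutions at each native position are modelled with a position-level random effect. A plain Bayesian logistic regression in PyMC stands in for the adaptive regression splines, and the feature and data are synthetic placeholders.

```python
# Hedged stand-in: hierarchical (per-position) Bayesian classifier, not BMARS.
import numpy as np
import pymc as pm

rng = np.random.default_rng(2)
n_positions, n_subs = 50, 12
pos_idx = np.repeat(np.arange(n_positions), n_subs)         # ~12 substitutions per native position
x = rng.normal(size=pos_idx.size)                           # e.g. a structural feature score (synthetic)
pos_effect_true = rng.normal(0, 1.0, n_positions)
p = 1 / (1 + np.exp(-(0.8 * x + pos_effect_true[pos_idx])))
y = rng.binomial(1, p)                                      # 1 = substitution disrupts protein function

with pm.Model():
    sigma_pos = pm.HalfNormal("sigma_pos", 1.0)
    a_pos = pm.Normal("a_pos", 0.0, sigma_pos, shape=n_positions)  # per-position intercept (clustered data)
    beta = pm.Normal("beta", 0.0, 1.0)
    logit_p = a_pos[pos_idx] + beta * x
    pm.Bernoulli("obs", logit_p=logit_p, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```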

Relevance: 20.00%

Publisher:

Abstract:

Helices and sheets are ubiquitous in nature. However, there are also examples of self-assembling molecules forming supramolecular helices and sheets in unnatural systems. Unlike supramolecular sheets, there are very few examples of peptide sub-units that can be used to construct supramolecular helical architectures using the backbone hydrogen-bonding functionalities of peptides. In this report we describe the design and synthesis of two single-turn/bend-forming peptides (Boc-Phe-Aib-Ile-OMe 1 and Boc-Ala-Leu-Aib-OMe 2) (Aib: alpha-aminoisobutyric acid) and a series of double-turn-forming peptides (Boc-Phe-Aib-Ile-Aib-OMe 3, Boc-Leu-Aib-Gly-Aib-OMe 4 and Boc-gamma-Abu-Aib-Leu-Aib-OMe 5) (gamma-Abu: gamma-aminobutyric acid). It has been found that, in crystals, on self-assembly, the single-turn/bend-forming peptides form either a supramolecular sheet (peptide 1) or a supramolecular helix (peptide 2), unlike the self-associating double-turn-forming peptides, which have only the option of forming supramolecular helical assemblages. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

A series of water-soluble synthetic dipeptides (1-3) with an N-terminally located beta-alanine residue, beta-alanyl-L-valine (1), beta-alanyl-L-isoleucine (2), and beta-alanyl-L-phenylalanine (3), form hydrogen-bonded supramolecular double helices with a pitch length of 1 nm, whereas the C-terminally positioned beta-alanine-containing dipeptide (4), L-phenylalanyl-beta-alanine, does not form a supramolecular double-helical structure. beta-Ala-Xaa (Xaa = Val/Ile/Phe) can therefore be regarded as a new motif for the formation of supramolecular double-helical structures in the solid state.

Relevance: 20.00%

Publisher:

Abstract:

Anion-directed template syntheses of two dinuclear copper(II) complexes of the mono-condensed Schiff base ligand Hdipn (4-[(3-aminopentylimino)-methyl]-benzene-1,3-diol), involving 2,4-dihydroxybenzaldehyde and 1,3-diaminopentane, were realized in the presence of bridging azide and acetate anions. Both complexes, [Cu-2(dipn)(2)(N-3)(2)] (1) and [Cu-2(dipn)(2)(OAc)(2)] (2), have been characterized by X-ray crystallography. The two mononuclear units are joined together by basal-apical, double end-on azido bridges in complex 1 and by basal-apical, double mono-atomic acetate oxygen bridges in 2. Both complexes form rectangular grid-like supramolecular structures via H-bonds connecting the azide or acetate anion and the p-hydroxy group of 2,4-dihydroxybenzaldehyde. Variable-temperature (300-2 K) magnetic susceptibility measurements reveal that complex 1 has antiferromagnetic coupling (J = -2.10 cm(-1)) through the azide bridge, while 2 has intra-dimer ferromagnetic coupling through the acetate bridge and inter-dimer antiferromagnetic coupling through H-bonds (J = 2.85 cm(-1), J' = -1.08 cm(-1)). (C) 2009 Elsevier B. V. All rights reserved.