947 results for RIETVELD REFINEMENT
Abstract:
During cortical synaptic development, thalamic axons must establish synaptic connections despite the presence of the more abundant intracortical projections. How thalamocortical synapses are formed and maintained in this competitive environment is unknown. Here, we show that the astrocyte-secreted protein hevin is required for normal thalamocortical synaptic connectivity in the mouse cortex. Absence of hevin results in a profound, long-lasting reduction in thalamocortical synapses accompanied by a transient increase in intracortical excitatory connections. Three-dimensional reconstructions of cortical neurons from serial section electron microscopy (ssEM) revealed that, during early postnatal development, dendritic spines often receive multiple excitatory inputs. Immuno-EM and confocal analyses revealed that the majority of spines with multiple excitatory contacts (SMECs) receive simultaneous thalamic and cortical inputs. The proportion of SMECs diminishes as the brain develops, but SMECs remain abundant in Hevin-null mice. These findings reveal that, through secretion of hevin, astrocytes control an important developmental synaptic refinement process at dendritic spines.
Abstract:
Transcranial magnetic stimulation (TMS) is a widely used, noninvasive method for stimulating nervous tissue, yet its mechanisms of effect are poorly understood. Here we report new methods for studying the influence of TMS on single neurons in the brain of alert non-human primates. We designed a TMS coil that focuses its effect near the tip of a recording electrode and recording electronics that enable direct acquisition of neuronal signals at the site of peak stimulus strength minimally perturbed by stimulation artifact in awake monkeys (Macaca mulatta). We recorded action potentials within ∼1 ms after 0.4-ms TMS pulses and observed changes in activity that differed significantly for active stimulation as compared with sham stimulation. This methodology is compatible with standard equipment in primate laboratories, allowing easy implementation. Application of these tools will facilitate the refinement of next generation TMS devices, experiments and treatment protocols.
Abstract:
Belief revision is a well-researched topic within AI. We argue that the new model of distributed belief revision as discussed here is suitable for general modelling of judicial decision making, along with the extant approach as known from jury research. The new approach to belief revision is of general interest, whenever attitudes to information are to be simulated within a multi-agent environment with agents holding local beliefs yet by interacting with, and influencing, other agents who are deliberating collectively. In the approach proposed, it is the entire group of agents, not an external supervisor, that integrates the different opinions; this is achieved through an election mechanism. The principle of "priority to the incoming information", as known from AI models of belief revision, is problematic when applied to factfinding by a jury. The present approach incorporates a computable model for local belief revision, such that a principle of recoverability is adopted. By this principle, any previously held belief must belong to the current cognitive state if consistent with it. For the purposes of jury simulation such a model calls for refinement. Yet, we claim, it constitutes a valid basis for an open system where other AI functionalities (or outer stimuli) could attempt to handle other aspects of the deliberation which are more specific to legal narrative, to argumentation in court, and then to the debate among the jurors.
Abstract:
In this paper, we discuss the problem of maintenance of a CBR system for retrieval of rotationally symmetric shapes. The special feature of this system is that similarity is derived primarily from graph matching algorithms. The special problem of such a system is that it does not operate on search indices that may be derived from single cases and then used for visualisation and principal component analysis. Rather, the system is built on a similarity metric defined directly over pairs of cases. The problems of efficiency, consistency, redundancy, completeness and correctness are discussed for such a system. Performance measures for the CBR system are given, and the results for trials of the system are presented. The competence of the current case-base is discussed, with reference to a representation of cases as points in an n-dimensional feature space, and a Gramian visualisation. A refinement of the case base is performed as a result of the competence analysis and the performance of the case-base before and after refinement is compared.
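A hedged sketch of how such a case base can still be analysed when the only primitive is a pairwise similarity (the matrix, threshold and function names below are invented for illustration, not taken from the system described): the full similarity matrix is treated as a Gramian, embedded via its leading eigenvectors for visualisation, and scanned for near-duplicate case pairs as candidates for removal during refinement.

    import numpy as np

    def embed_cases(S, dims=2):
        """Embed cases from a symmetric pairwise-similarity (Gram) matrix."""
        n = S.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n        # double-centring, as in classical MDS
        G = J @ S @ J
        vals, vecs = np.linalg.eigh(G)             # eigenvalues in ascending order
        order = np.argsort(vals)[::-1][:dims]      # keep the largest ones
        return vecs[:, order] * np.sqrt(np.clip(vals[order], 0.0, None))

    def redundant_pairs(S, threshold=0.95):
        """Case pairs so similar that one member may be redundant."""
        n = S.shape[0]
        return [(i, j) for i in range(n) for j in range(i + 1, n) if S[i, j] >= threshold]

    # Toy 4-case similarity matrix: cases 0 and 1 are near-duplicates.
    S = np.array([[1.00, 0.97, 0.30, 0.20],
                  [0.97, 1.00, 0.35, 0.25],
                  [0.30, 0.35, 1.00, 0.60],
                  [0.20, 0.25, 0.60, 1.00]])
    print(embed_cases(S))        # 2-D coordinates for plotting the case base
    print(redundant_pairs(S))    # [(0, 1)]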
Abstract:
Belief revision is a well-researched topic within Artificial Intelligence (AI). We argue that the new model of belief revision as discussed here is suitable for general modelling of judicial decision making, along with the extant approach as known from jury research. The new approach to belief revision is of general interest, whenever attitudes to information are to be simulated within a multi-agent environment with agents holding local beliefs yet by interacting with, and influencing, other agents who are deliberating collectively. The principle of 'priority to the incoming information', as known from AI models of belief revision, is problematic when applied to factfinding by a jury. The present approach incorporates a computable model for local belief revision, such that a principle of recoverability is adopted. By this principle, any previously held belief must belong to the current cognitive state if consistent with it. For the purposes of jury simulation such a model calls for refinement. Yet, we claim, it constitutes a valid basis for an open system where other AI functionalities (or outer stimuli) could attempt to handle other aspects of the deliberation which are more specific to legal narratives, to argumentation in court, and then to the debate among the jurors.
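As a purely illustrative sketch of the recoverability principle stated above (beliefs are reduced here to propositional literals, which is far cruder than the computable model the paper refers to; all names are invented): after revising by the incoming item, each previously held belief is restored provided it remains consistent with the resulting state.

    def negate(lit):
        return lit[1:] if lit.startswith("-") else "-" + lit

    def consistent(beliefs):
        return all(negate(b) not in beliefs for b in beliefs)

    def revise(current, incoming, history):
        """Accept the incoming literal, then recover old beliefs that still fit."""
        state = {b for b in current if b != negate(incoming)}   # drop direct conflicts
        state.add(incoming)
        for old in history:                                     # recoverability step
            if consistent(state | {old}):
                state.add(old)
        return state

    history = ["guilty", "weapon_found"]
    print(sorted(revise({"guilty"}, "-guilty", history)))
    # ['-guilty', 'weapon_found']  -- 'guilty' cannot be recovered, 'weapon_found' can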
Abstract:
The problem of deriving parallel mesh partitioning algorithms for mapping unstructured meshes to parallel computers is discussed in this chapter. In itself this raises a paradox: we seek to find a high quality partition of the mesh, but to compute it in parallel we require a partition of the mesh. In fact, we overcome this difficulty by deriving an optimisation strategy which can find a high quality partition even if the quality of the initial partition is very poor, and then using a crude distribution scheme for the initial partition. The basis of this strategy is to use a multilevel approach combined with local refinement algorithms. Three such refinement algorithms are outlined and some example results presented which show that they can produce very high global quality partitions, very rapidly. The results are also compared with a similar multilevel serial partitioner and shown to be almost identical in quality. Finally, we consider the impact of the initial partition on the results and demonstrate that the final partition quality is, modulo a certain amount of noise, independent of the initial partition.
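As a heavily simplified illustration of the multilevel paradigm described above (a toy, not the chapter's algorithms; every function name is invented for this sketch): the graph is coarsened by collapsing a random edge matching, the coarsest graph is bisected directly, and the partition is then projected back through the levels with a greedy boundary refinement at each step.

    import random

    def coarsen(adj):
        """Collapse a random edge matching; return the coarse graph and the vertex mapping."""
        mapping, matched, next_id = {}, set(), 0
        nodes = list(adj)
        random.shuffle(nodes)
        for u in nodes:
            if u in matched:
                continue
            partner = next((v for v in adj[u] if v not in matched), None)
            matched.add(u)
            mapping[u] = next_id
            if partner is not None:
                matched.add(partner)
                mapping[partner] = next_id
            next_id += 1
        coarse = {c: set() for c in range(next_id)}
        for u, nbrs in adj.items():
            for v in nbrs:
                if mapping[u] != mapping[v]:
                    coarse[mapping[u]].add(mapping[v])
        return coarse, mapping

    def refine(adj, part, passes=4):
        """Flip a vertex to the other part if that reduces the edge cut,
        keeping the two part sizes within two vertices of each other."""
        for _ in range(passes):
            counts = [sum(1 for p in part.values() if p == s) for s in (0, 1)]
            for u in adj:
                gain = sum(1 if part[v] != part[u] else -1 for v in adj[u])
                if gain > 0 and (counts[1 - part[u]] + 1) - (counts[part[u]] - 1) <= 2:
                    counts[part[u]] -= 1
                    counts[1 - part[u]] += 1
                    part[u] = 1 - part[u]
        return part

    def multilevel_bisect(adj, min_size=4):
        if len(adj) <= min_size:
            half = len(adj) // 2
            return {u: int(i >= half) for i, u in enumerate(adj)}
        coarse, mapping = coarsen(adj)
        if len(coarse) == len(adj):                    # matching failed to shrink the graph
            half = len(adj) // 2
            return {u: int(i >= half) for i, u in enumerate(adj)}
        coarse_part = multilevel_bisect(coarse, min_size)
        part = {u: coarse_part[mapping[u]] for u in adj}   # project the coarse partition
        return refine(adj, part)                           # then refine it locally

    def cut_size(adj, part):
        return sum(part[u] != part[v] for u in adj for v in adj[u]) // 2

    # Toy example: bisect a 4 x 4 grid graph (the optimum for this grid cuts 4 edges).
    adj = {(i, j): set() for i in range(4) for j in range(4)}
    for (i, j) in list(adj):
        for (di, dj) in ((1, 0), (0, 1)):
            if (i + di, j + dj) in adj:
                adj[(i, j)].add((i + di, j + dj))
                adj[(i + di, j + dj)].add((i, j))

    part = multilevel_bisect(adj)
    print("cut edges:", cut_size(adj, part))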
Abstract:
An unstructured cell-centred finite volume method for modelling viscoelastic flow is presented. The method is applied to the flow through a planar channel and the 4:1 planar contraction for creeping flow of an Oldroyd-B fluid. The results are presented for a range of Weissenberg numbers. In the case of the planar channel, results are compared with analytical solutions. For the 4:1 planar contraction benchmark problem the convection terms in the constitutive equations are approximated using both first and second order differencing schemes to compare the techniques, and the effect of mesh refinement on the solution is investigated. This is the first time that a fully unstructured, cell-centred finite volume technique has been used to model the Oldroyd-B fluid for the test cases presented in this paper.
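For readers who want the equations behind this description, the Oldroyd-B constitutive model in its standard textbook form (not quoted from the paper) relates the extra-stress tensor to the rate-of-deformation tensor through a relaxation time λ1, a retardation time λ2 and a total viscosity η0; the convected terms referred to above enter through the upper-convected derivative, and the Weissenberg number is commonly taken as We = λ1 U/L for a characteristic velocity U and length L:

\[
\boldsymbol{\tau} + \lambda_1 \overset{\nabla}{\boldsymbol{\tau}}
  = 2\eta_0\left(\mathbf{D} + \lambda_2 \overset{\nabla}{\mathbf{D}}\right),
\qquad
\overset{\nabla}{\mathbf{A}}
  = \frac{\partial \mathbf{A}}{\partial t}
  + \mathbf{u}\cdot\nabla\mathbf{A}
  - (\nabla\mathbf{u})\,\mathbf{A}
  - \mathbf{A}\,(\nabla\mathbf{u})^{\mathsf{T}},
\qquad
\mathbf{D} = \tfrac{1}{2}\bigl(\nabla\mathbf{u} + (\nabla\mathbf{u})^{\mathsf{T}}\bigr),
\]
with the convention \((\nabla\mathbf{u})_{ij} = \partial u_i/\partial x_j\) and \(\mathrm{We} = \lambda_1 U/L\).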
Abstract:
This paper describes research into retrieval based on 3-dimensional shapes for use in the metal casting industry. The purpose of the system is to advise a casting engineer on the design aspects of a new casting by reference to similar castings which have been prototyped and tested in the past. The key aspects of the system are the orientation of the shape within the mould, the positions of feeders and chills, and particular advice concerning special problems and solutions, and possible redesign. The main focus of this research is the effectiveness of similarity measures based on 3-dimensional shapes. The approach adopted here is to construct similarity measures based on a graphical representation deriving from a shape decomposition used extensively by experienced casting design engineers. The paper explains the graphical representation and discusses similarity measures based on it. Performance measures for the CBR system are given, and the results for trials of the system are presented. The competence of the current case-base is discussed, with reference to a representation of cases as points in an n-dimensional feature space, and its principal components visualization. A refinement of the case base is performed as a result of the competence analysis and the performance of the case-base before and after refinement is compared.
Abstract:
This paper describes work towards the deployment of flexible self-management into real-time embedded systems. A challenging project which focuses specifically on the development of a dynamic, adaptive automotive middleware is described, and the specific self-management requirements of this project are discussed. These requirements have been identified through the refinement of a wide-ranging set of use cases requiring context-sensitive behaviours. A sample of these use cases is presented to illustrate the extent of the demands for self-management. The strategy that has been adopted to achieve self-management, based on the use of policies, is presented. The embedded and real-time nature of the target system brings the constraints that dynamic adaptation capabilities must not require changes to the run-time code (except during hot update of complete binary modules), that adaptation decisions must have low latency, and, because the target platforms are resource-constrained, that the self-management mechanism must have low resource requirements (especially in terms of processing and memory). Policy-based computing is thus an ideal candidate for achieving self-management, because the policy itself is loaded at run-time and can be replaced or changed in the future in the same way that a data file is loaded. Policies represent a relatively low-complexity and low-risk means of achieving self-management, with low run-time costs. Policies can be stored internally in ROM (such as default policies) as well as externally to the system. The architecture of a designed-for-purpose, powerful yet lightweight policy library is described. A suitable evaluation platform, supporting the whole life-cycle of feasibility analysis, concept evaluation, development, rigorous testing and behavioural validation, has been devised and is described.
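A minimal, hypothetical sketch of the policy idea described above (the rule format, signal names and actions are invented; this is not the project's policy library): the adaptation logic is held as data that can be loaded, stored in ROM or replaced at run time, and evaluated against the current context without any change to the compiled code.

    import json

    # A policy expressed as data: condition/action rules that can be swapped at run time.
    POLICY_JSON = """
    [
      {"if": {"signal": "cpu_load",   "op": ">", "value": 0.9}, "then": "shed_low_priority_tasks"},
      {"if": {"signal": "bus_errors", "op": ">", "value": 5},   "then": "switch_to_backup_bus"}
    ]
    """

    OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b, "==": lambda a, b: a == b}

    def evaluate(policy, context):
        """Return the actions whose conditions hold in the current context."""
        actions = []
        for rule in policy:
            cond = rule["if"]
            if OPS[cond["op"]](context.get(cond["signal"], 0), cond["value"]):
                actions.append(rule["then"])
        return actions

    policy = json.loads(POLICY_JSON)   # could equally be read from ROM or an external file
    print(evaluate(policy, {"cpu_load": 0.95, "bus_errors": 2}))
    # ['shed_low_priority_tasks']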
Abstract:
In this chapter we look at JOSTLE, the multilevel graph-partitioning software package, and highlight some of the key research issues that it addresses. We first outline the core algorithms and place them in the context of the multilevel refinement paradigm. We then look at issues relating to its use as a tool for parallel processing and, in particular, partitioning in parallel. Since its first release in 1995, JOSTLE has been used for many mesh-based parallel scientific computing applications and so we also outline some enhancements such as multiphase mesh-partitioning, heterogeneous mapping and partitioning to optimise subdomain shape.
Abstract:
This paper presents two multilevel refinement algorithms for the capacitated clustering problem. Multilevel refinement is a collaborative technique capable of significantly aiding the solution process for optimisation problems. The central methodologies of the technique are filtering solutions from the search space and reducing the level of problem detail to be considered at each level of the solution process. The first multilevel algorithm uses a simple tabu search while the other executes a standard local search procedure. Both algorithms demonstrate that the multilevel technique is capable of aiding the solution process for this combinatorial optimisation problem.
Abstract:
Within the building evacuation context, wayfinding describes the process in which an individual located within an arbitrarily complex enclosure attempts to find a path which leads them to relative safety, usually the exterior of the enclosure. Within most evacuation modelling tools, wayfinding is completely ignored; agents are either assigned the shortest distance path or use a potential field to find the shortest path to the exits. In this paper a novel wayfinding technique that attempts to represent the manner in which people wayfind within structures is introduced and demonstrated through two examples. The first step is to encode the spatial information of the enclosure in terms of a graph. The second step is to apply search algorithms to the graph to find possible routes to the destination and to assign a cost to each route based on the agent's personal route preferences, such as "least time" or "least distance", or a combination of criteria. The third step is route execution and refinement. In this step, the agent moves along the chosen route, reassesses it at regular intervals, and may decide to take an alternative path if it determines that another route is more favourable, e.g. if the initial path is highly congested or is blocked due to fire.
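A small illustrative sketch of the graph-based steps described above (not the paper's implementation; the node names, edge weights and cost function are invented): the enclosure is encoded as a graph whose edges carry a distance and an estimated travel time, and a route is chosen by searching the graph under a weighted combination of the agent's preferences. Periodic reassessment can be modelled by re-running the search with updated edge data.

    import heapq

    def best_route(graph, start, goal, w_dist=0.5, w_time=0.5):
        """Dijkstra search over a composite edge cost w_dist*distance + w_time*time."""
        frontier = [(0.0, start, [start])]
        visited = set()
        while frontier:
            cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for nbr, dist, time in graph.get(node, []):
                if nbr not in visited:
                    step = w_dist * dist + w_time * time
                    heapq.heappush(frontier, (cost + step, nbr, path + [nbr]))
        return float("inf"), []

    # edges: node -> [(neighbour, distance_m, estimated_time_s), ...]
    graph = {
        "room":     [("corridor", 10, 12)],
        "corridor": [("exit_A", 30, 90), ("stair", 15, 20)],   # exit_A is congested
        "stair":    [("exit_B", 25, 30)],
    }
    # With time weighted heavily, the agent prefers the longer but faster stair route.
    print(best_route(graph, "room", "exit_B", w_dist=0.2, w_time=0.8))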
Abstract:
Rhodanines (2-thio-4-oxothiazolidines) are synthetic small molecular weight organic molecules with diverse applications in biochemistry, medicinal chemistry, photochemistry, coordination chemistry and industry. The X-ray crystal structure determination of two rhodanine derivatives, namely (I), 3-aminorhodanine [3-amino-2-thio-4-oxothiazolidine], C3H4N2OS2, and (II), 3-methylrhodanine [3-methyl-2-thio-4-oxothiazolidine], C4H5NOS2, has been conducted at 100 K. (I) crystallizes in the monoclinic space group P2(1)/n with unit cell parameters a = 9.662(2), b = 9.234(2), c = 13.384(2) Å, β = 105.425(3)°, V = 1151.1(3) Å3, Z = 8 (two independent molecules per asymmetric unit), density (calculated) = 1.710 Mg/m3, absorption coefficient = 0.815 mm−1. (II) crystallizes in the orthorhombic space group Iba2 with unit cell parameters a = 20.117(4), b = 23.449(5), c = 7.852(2) Å, V = 3703.9(12) Å3, Z = 24 (three independent molecules per asymmetric unit), density (calculated) = 1.584 Mg/m3, absorption coefficient = 0.755 mm−1. For (I), in the final refinement cycle the data/restraints/parameter ratios were 2639/0/161, goodness-of-fit on F2 = 0.934, final R indices [I > 2σ(I)] were R1 = 0.0299, wR2 = 0.0545 and R indices (all data) were R1 = 0.0399, wR2 = 0.0568. The largest difference peak and hole were 0.402 and −0.259 e Å−3. For (II), in the final refinement cycle the data/restraints/parameter ratios were 3372/1/221, goodness-of-fit on F2 = 0.950, final R indices [I > 2σ(I)] were R1 = 0.0407, wR2 = 0.1048 and R indices (all data) were R1 = 0.0450, wR2 = 0.1088. The absolute structure parameter = 0.19(9) and the largest difference peak and hole were 0.934 and −0.301 e Å−3. Details of the geometry of the five molecules (two for (I) and three for (II)) and the crystal structures are fully discussed. Corresponding features of the molecular geometry are highly consistent and firmly establish the geometry of the rhodanine
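For readers outside crystallography, the agreement indices quoted in the crystallographic abstracts here follow the standard definitions (as used, for example, by SHELX-type refinement programs), where Fo and Fc are the observed and calculated structure factors, w is the weight assigned to each reflection, n is the number of reflections and p the number of refined parameters:

\[
R_1 = \frac{\sum\bigl|\,|F_o| - |F_c|\,\bigr|}{\sum |F_o|},
\qquad
wR_2 = \left[\frac{\sum w\,(F_o^2 - F_c^2)^2}{\sum w\,(F_o^2)^2}\right]^{1/2},
\qquad
\mathrm{GooF} = \left[\frac{\sum w\,(F_o^2 - F_c^2)^2}{n - p}\right]^{1/2}.
\]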
Abstract:
Zinkin's lucid challenge to Jung makes perfect sense. Indeed, it is the implications of this `making sense' that this paper addresses. For Zinkin's characterization of the `self' takes it as a `concept' requiring coherence; a variety of abstract non-contextual knowledge that itself has a mythical heritage. Moreover, Zinkin's refinement of Jung seeks to make his work fit for the scientific paradigm of modernity. In turn, modernity's paradigm owes much to Newton's notion of knowledge via reductionism. Here knowledge or investigation is divided up into the smallest possible units with the aim of eventually putting it all together into `one' picture of scientific truth. Unfortunately, `reductionism' does not do justice to the resonant possibilities of Jung's writing. These look forward to a new scientific paradigm of the twenty-first century, of the interactive `field', emergence and complexity theory. The paper works paradoxically by discovering Zinkin's `intersubjective self' after all, in two undervalued narratives by Jung, his doctoral thesis and a short late ghost story. However, in the ambivalences and radical fictional experimentation of these fascinating texts can be discerned an-Other self, one both created and found. [From the Publisher]
Abstract:
N-acetyl-L-glutamic acid crystallizes in the orthorhombic space group P2(1)2(1)2(1) with unit cell parameters a = 4.747(3), b = 12.852(7), c = 13.906(7) Å, V = 848.5(8) Å3, Z = 4, density (calculated) = 1.481 Mg/m3, linear absorption coefficient = 0.127 mm−1. The crystal structure determination was carried out with Mo Kα X-ray data measured with liquid nitrogen cooling at a temperature of 100(2) K. In the final refinement cycle the data/restraints/parameter ratios were 1691/0/131 and the goodness-of-fit on F2 was 1.122. Final R indices [I > 2σ(I)] were R1 = 0.0430, wR2 = 0.0878 and R indices (all data) were R1 = 0.0473, wR2 = 0.0894. The largest electron density difference peak and hole were 0.207 and −0.154 e Å−3. Details of the molecular geometry are discussed and compared with a model DFT structure calculated using Gaussian 98.