11 results for storing
in Aston University Research Archive
Abstract:
Error rates of a Boolean perceptron with threshold and either spherical or Ising constraint on the weight vector are calculated for storing patterns from biased input and output distributions derived within a one-step replica symmetry breaking (RSB) treatment. For unbiased output distribution and non-zero stability of the patterns, we find a critical load, α_p, above which two solutions to the saddle-point equations appear: one with higher free energy and zero threshold, and a dominant solution with non-zero threshold. We examine this second-order phase transition and the dependence of α_p on the required pattern stability, κ, for both one-step RSB and replica symmetry (RS) in the spherical case and for one-step RSB in the Ising case.
Abstract:
A formalism for modelling the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics, originally due to Prugel-Bennett and Shapiro, is reviewed, generalized and improved upon. This formalism can be used to predict the averaged trajectory of macroscopic statistics describing the GA's population. These macroscopics are chosen to average well between runs, so that fluctuations from mean behaviour can often be neglected. Where necessary, non-trivial terms are determined by assuming maximum entropy with constraints on known macroscopics. Problems of realistic size are described in compact form and finite population effects are included, often proving to be of fundamental importance. The macroscopics used here are cumulants of an appropriate quantity within the population and the mean correlation (Hamming distance) within the population. Including the correlation as an explicit macroscopic provides a significant improvement over the original formulation. The formalism is applied to a number of simple optimization problems in order to determine its predictive power and to gain insight into GA dynamics. Problems which are most amenable to analysis come from the class where alleles within the genotype contribute additively to the phenotype. This class can be treated with some generality, including problems with inhomogeneous contributions from each site, non-linear or noisy fitness measures, simple diploid representations and temporally varying fitness. The results can also be applied to a simple learning problem, generalization in a binary perceptron, and a limit is identified for which the optimal training batch size can be determined for this problem. The theory is compared to averaged results from a real GA in each case, showing excellent agreement if the maximum entropy principle holds. Some situations where this approximation breaks down are identified.
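The macroscopics described above, cumulants of a population quantity and the mean pairwise Hamming distance, can be sketched for a toy binary-genotype population. This is a minimal illustration under assumed population sizes and an additive (one-max style) fitness, not the thesis's formalism:

```python
import itertools
import random

random.seed(0)

# Toy population of binary genotypes (illustrative sizes, not from the thesis).
POP_SIZE, LENGTH = 20, 16
population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]

# Additive phenotype: alleles contribute additively to fitness.
fitness = [sum(g) for g in population]

# First two cumulants of the fitness distribution within the population.
mean_fitness = sum(fitness) / POP_SIZE
var_fitness = sum((f - mean_fitness) ** 2 for f in fitness) / POP_SIZE

# Mean pairwise Hamming distance (the correlation macroscopic).
pairs = list(itertools.combinations(population, 2))
mean_hamming = sum(
    sum(a != b for a, b in zip(g1, g2)) for g1, g2 in pairs
) / len(pairs)

print(mean_fitness, var_fitness, mean_hamming)
```

Tracking how such quantities evolve over generations, averaged across runs, is the kind of trajectory the formalism aims to predict.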
In order to fully test the formalism, an attempt is made on the strongly NP-hard problem of storing random patterns in a binary perceptron. Here, the relationship between the genotype and phenotype (training error) is strongly non-linear. Mutation is modelled under the assumption that perceptron configurations are typical of perceptrons with a given training error. Unfortunately, this assumption does not provide a good approximation in general. It is conjectured that perceptron configurations would have to be constrained by other statistics in order to accurately model mutation for this problem. Issues arising from this study are discussed in conclusion and some possible areas of further research are outlined.
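The phenotype in this storage problem, the training error of a binary-weight perceptron on random patterns, can be sketched as follows. The sizes and random targets are assumptions for illustration; the thesis's GA operators are not reproduced:

```python
import random

random.seed(1)

N, P = 21, 30  # input dimension and number of random patterns (arbitrary choices)

# Random ±1 patterns with random ±1 target outputs.
patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(P)]
targets = [random.choice([-1, 1]) for _ in range(P)]

def training_error(weights):
    """Number of patterns the perceptron misclassifies (the phenotype)."""
    errors = 0
    for x, t in zip(patterns, targets):
        activation = sum(w * xi for w, xi in zip(weights, x))
        output = 1 if activation > 0 else -1
        errors += output != t
    return errors

# A candidate genotype: binary (Ising) weight vector.
w = [random.choice([-1, 1]) for _ in range(N)]
print(training_error(w))
```

Flipping a single weight can change the activation sign on many patterns at once, which is one way to see the strongly non-linear genotype-phenotype relationship the abstract mentions.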
Abstract:
This paper formulates several mathematical models for determining the optimal sequence of component placements and assignment of component types to feeders simultaneously, i.e. the integrated scheduling problem for a type of surface mount technology placement machine called the sequential pick-and-place (PAP) machine. A PAP machine has multiple stationary feeders storing components, a stationary working table holding a printed circuit board (PCB), and a movable placement head to pick up components from feeders and place them on the board. The objective of the integrated problem is to minimize the total distance traveled by the placement head. Two integer nonlinear programming models are formulated first. Then, each of them is equivalently converted into an integer linear model. The models for the integrated problem are verified by two commercial packages. In addition, a hybrid genetic algorithm previously developed by the authors is adopted to solve the models. The algorithm not only generates the optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total traveling distance.
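The objective being minimized, total head travel for a given feeder assignment and placement sequence, can be evaluated with a simple sketch. All coordinates and component names here are hypothetical, and the head is assumed to visit the feeder before each placement; this is an illustration of the cost function, not the paper's models:

```python
import math

# Hypothetical feeder slots (component type -> position) and placement sequence.
feeders = {"R1": (0.0, 0.0), "C1": (10.0, 0.0)}
board = [("R1", (4.0, 6.0)), ("C1", (8.0, 5.0)), ("R1", (2.0, 7.0))]

def dist(p, q):
    """Euclidean distance between two points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def head_travel(sequence, feeder_pos, start=(0.0, 0.0)):
    """Total head travel: for each placement, go to the feeder, then to the board."""
    total, pos = 0.0, start
    for ctype, target in sequence:
        total += dist(pos, feeder_pos[ctype])       # travel to feeder to pick
        total += dist(feeder_pos[ctype], target)    # travel to board to place
        pos = target
    return total

print(head_travel(board, feeders))
```

Both the placement order in `board` and the type-to-slot mapping in `feeders` change this total, which is why the paper treats sequencing and feeder assignment as one integrated problem.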
Abstract:
The nasal absorption of larger peptide and protein drugs is generally low. The importance of the mucus layer and enzymic degradation in reducing absorption was investigated. Reversed-phase high-performance liquid chromatographic (HPLC) methods were developed to assay a variety of compounds. Pig gastric mucus (PGM) was selected to investigate the importance of the mucus layer. A method of treating and storing PGM was developed and evaluated which was representative of the gel in vivo. The nature of the mucus barrier was evaluated in vitro with three-compartment diffusion cells and a series of compounds with differing physicochemical properties. Mucus retarded the diffusion of all the compounds, with molecular weight and charge exerting a marked effect. Binding to mucus was investigated by a centrifugation method. All of the compounds tested were found to bind to mucus with the exception of the negatively charged molecule benzoic acid. The small peptides did not demonstrate greater binding to mucus than any of the other compounds evaluated. The effect of some absorption enhancers upon the rate of diffusion of tryptophan through mucus was determined in vitro. At the concentrations employed, the enhancers EDTA, N-acetylcysteine and taurodeoxycholic acid exerted no effect, whilst taurocholic acid and cholic acid were found to slightly reduce the rate of diffusion. The intracellular and luminal proteolytic activity of the nose was investigated in the sheep animal model with a nasal mucosal homogenate and a nasal wash preparation respectively and a series of chemically similar peptides. Hydrolysis was also investigated with the proteolytic enzymes carboxypeptidase A, cytosolic leucine aminopeptidase and microsomal leucine aminopeptidase. Sheep nasal mucosa possesses significant peptide hydrolase activity capable of degrading all the substrates tested. Considerable variation in susceptibility was observed.
Degradation occurred exclusively at the peptide bond between the aromatic amino acid and glycine, indicating some specificity for aromatic amino acids. Hydrolysis profiles indicated the presence of both aminopeptidase and carboxypeptidase enzymes. The specific activity of the microsomal fraction was found to be greater than that of the cytosolic fraction. Hydrolysis in the nasal wash indicated the presence of either luminal or loosely-bound proteases, which can degrade peptide substrates. The same specificity for aromatic amino acids was observed, and aminopeptidase activity was demonstrated. The specific activity of the nasal wash was smaller than that of the homogenate.
Abstract:
SPOT simulation imagery was acquired for a test site in the Forest of Dean in Gloucestershire, U.K. These data were qualitatively and quantitatively evaluated for their potential application in forest resource mapping and management. A variety of techniques are described for enhancing the image with the aim of providing species-level discrimination within the forest. Visual interpretation of the imagery was more successful than automated classification. The heterogeneity within the forest classes, and in particular between the forest and urban class, resulted in poor discrimination using traditional `per-pixel' automated methods of classification. Different means of assessing classification accuracy are proposed. Two techniques for measuring textural variation were investigated in an attempt to improve classification accuracy. The first of these, a sequential segmentation method, was found to be beneficial. The second, a parallel segmentation method, resulted in little improvement, though this may be related to a combination of the resolution and the size of the texture extraction area. The effect on classification accuracy of combining the SPOT simulation imagery with other data types is investigated. A grid cell encoding technique was selected as most appropriate for storing digitised topographic (elevation, slope) and ground truth data. Topographic data were shown to improve species-level classification, though with sixteen classes overall accuracies were consistently below 50%. Neither sub-division into age groups nor the incorporation of principal components and a band ratio significantly improved classification accuracy. It is concluded that SPOT imagery will not permit species-level classification within forested areas as diverse as the Forest of Dean. The imagery will be most useful as part of a multi-stage sampling scheme. The use of texture analysis is highly recommended for extracting maximum information content from the data.
Incorporation of the imagery into a GIS will both aid discrimination and provide a useful management tool.
Abstract:
This paper explores the role of transactive memory in enabling knowledge transfer between globally distributed teams. While the information systems literature has recently acknowledged the role transactive memory plays in improving knowledge processes and performance in colocated teams, little is known about its contribution to distributed teams. To contribute to filling this gap, knowledge-transfer challenges and processes between onsite and offshore teams were studied at TATA Consultancy Services. In particular, the paper describes the transfer of knowledge between onsite and offshore teams through encoding, storing and retrieving processes. An in-depth case study of globally distributed software development projects was carried out, and a qualitative, interpretive approach was adopted. The analysis of the case suggests that in order to overcome differences derived from the local contexts of the onsite and offshore teams (e.g. different work routines, methodologies and skills), some specific mechanisms supporting the development of codified and personalized ‘directories’ were introduced. These include the standardization of templates and methodologies across the remote sites as well as frequent teleconferencing sessions and occasional short visits. These mechanisms contributed to the development of the notion of ‘who knows what’ across onsite and offshore teams despite the challenges associated with globally distributed teams, and supported the transfer of knowledge between onsite and offshore teams. The paper concludes by offering theoretical and practical implications.
Abstract:
Current models of word production assume that words are stored as linear sequences of phonemes which are structured into syllables only at the moment of production. This is because syllable structure is always recoverable from the sequence of phonemes. In contrast, we present theoretical and empirical evidence that syllable structure is lexically represented. Storing syllable structure would have the advantage of making representations more stable and resistant to damage. On the other hand, re-syllabifications affect only a minimal part of phonological representations and occur only in some languages and depending on speech register. Evidence for these claims comes from analyses of aphasic errors which not only respect phonotactic constraints, but also avoid transformations which move the syllabic structure of the word further away from the original structure, even when equating for segmental complexity. This is true across tasks, types of errors, and, crucially, types of patients. The same syllabic effects are shown by apraxic patients and by phonological patients who have more central difficulties in retrieving phonological representations. If syllable structure were only computed after phoneme retrieval, it would have no way to influence the errors of phonological patients. Our results have implications for psycholinguistic and computational models of language as well as for clinical and educational practices.
Abstract:
Extensive loss of adipose tissue is a hallmark of cancer cachexia, but the cellular and molecular basis remains unclear. This study has examined morphologic and molecular characteristics of white adipose tissue in mice bearing a cachexia-inducing tumour, MAC16. Adipose tissue from tumour-bearing mice contained shrunken adipocytes that were heterogeneous in size. Increased fibrosis was evident by strong collagen-fibril staining in the tissue matrix. Ultrastructure of 'slimmed' adipocytes revealed severe delipidation and modifications in cell membrane conformation. There were major reductions in mRNA levels of adipogenic transcription factors including CCAAT/enhancer binding protein alpha (C/EBPα), CCAAT/enhancer binding protein beta, peroxisome proliferator-activated receptor gamma, and sterol regulatory element binding protein-1c (SREBP-1c) in adipose tissue, which were accompanied by reduced protein content of C/EBPα and SREBP-1. mRNA levels of SREBP-1c targets, fatty acid synthase, acetyl CoA carboxylase, stearoyl CoA desaturase 1 and glycerol-3-phosphate acyl transferase, also fell, as did glucose transporter-4 and leptin. In contrast, mRNA levels of peroxisome proliferator-activated receptor gamma coactivator-1alpha and uncoupling protein-2 were increased in white fat of tumour-bearing mice. These results suggest that a tumour-induced impairment in the formation and lipid-storing capacity of adipose tissue occurs in mice with cancer cachexia. © 2006 Cancer Research UK.
Abstract:
AIM: To determine the force needed to extract a drop from a range of current prostaglandin monotherapy eye droppers and how this related to the comfortable and maximum pressure subjects could exert. METHODS: The comfortable and maximum pressure subjects could apply to an eye dropper constructed around a set of cantilevered pressure sensors and mounted above their eye was assessed in 102 subjects (mean 51.2±18.7 years), repeated three times. A load cell amplifier, mounted on a stepper motor controlled linear slide, was constructed and calibrated to test the force required to extract the first three drops from 13 multidose or unidose latanoprost medication eye droppers. RESULTS: The pressure that could be exerted on a dropper comfortably (25.9±17.7 Newtons, range 1.2-87.4) could be exceeded with effort (to 64.8±27.1 Newtons, range 19.9-157.8; F=19.045, p<0.001), and did not differ between repeats (F=0.609, p=0.545). Comfortable and maximum pressures exerted were correlated (r=0.618, p<0.001); neither was influenced strongly by age (r=0.138, p=0.168; r=-0.118, p=0.237, respectively), but both were lower in women than in men (F=12.757, p=0.001). The force required to expel a drop differed between dropper designs (F=22.528, p<0.001), ranging from 6.4 Newtons to 23.4 Newtons. The force needed to expel successive drops increased (F=36.373, p<0.001), and storing droppers in the fridge further increased the force required (F=7.987, p=0.009). CONCLUSIONS: Prostaglandin monotherapy droppers for glaucoma treatment vary in their resistance to extracting a drop, and with some, a drop could not be comfortably achieved by half the population, which may affect compliance and efficacy.
Abstract:
This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content, and is not considered to formally represent knowledge. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems, by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. That algorithm further identified stems, the shortest text string found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps, to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is 'XML-centric', in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base.
The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
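The Edit Distance used above for matching misspellings to suggested corrections is, in its standard form, the Levenshtein distance. A minimal dynamic-programming sketch follows; the candidate words are hypothetical, and GRiST's actual refinement pipeline is not reproduced:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum insertions, deletions and substitutions
    needed to turn string a into string b."""
    prev = list(range(len(b) + 1))  # distances from a[:0] to each prefix of b
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,         # deletion from a
                            curr[j - 1] + 1,     # insertion into a
                            prev[j - 1] + cost)) # substitution (or match)
        prev = curr
    return prev[-1]

# Ranking hypothetical candidate corrections for a misspelling:
candidates = ["suicide", "suitable", "subside"]
print(sorted(candidates, key=lambda w: edit_distance("suiside", w)))
```

Ranking candidates by this distance is the basic mechanism for suggesting corrections; comparing distances between related word-forms similarly supports stem identification.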
Abstract:
One of the current challenges in model-driven engineering is enabling effective collaborative modelling. Two common approaches are either storing the models in a central repository, or keeping them under a traditional file-based version control system and building a centralized index for model-wide queries. Either way, special attention must be paid to the nature of these repositories and indexes as networked services: they should remain responsive even with an increasing number of concurrent clients. This paper presents an empirical study on the impact of certain key decisions on the scalability of concurrent model queries, using an Eclipse Connected Data Objects model repository and a Hawk model index. The study evaluates the impact of the network protocol, the API design and the internal caching mechanisms, and analyzes the reasons for their varying performance.