707 results for Integrable hierarchies
Abstract:
G.R. BURTON and R.J. DOUGLAS, Uniqueness of the polar factorisation and projection of a vector-valued mapping. Ann. I.H. Poincaré - A.N. 20 (2003), 405-418.
Abstract:
Taylor, L. (2004). Client-ship and Citizenship in Latin America. Bulletin of Latin American Research, 23(2), pp. 213-227.
Abstract:
BACKGROUND: In the current climate of high-throughput computational biology, the inference of a protein's function from related measurements, such as protein-protein interaction relations, has become a canonical task. Most existing technologies pursue this task as a classification problem, on a term-by-term basis, for each term in a database, such as the Gene Ontology (GO) database, a popular rigorous vocabulary for biological functions. However, ontology structures are essentially hierarchies, with top-to-bottom annotation rules that protein function predictions should in principle follow. Currently, the most common approach to imposing these hierarchical constraints on network-based classifiers is to apply transitive closure to their predictions. RESULTS: We propose a probabilistic framework to integrate information in relational data, in the form of a protein-protein interaction network, and a hierarchically structured database of terms, in the form of the GO database, for the purpose of protein function prediction. At the heart of our framework is a factorization of local neighborhood information in the protein-protein interaction network across successive ancestral terms in the GO hierarchy. We introduce a classifier within this framework, with a computationally efficient implementation, that produces GO-term predictions that naturally obey a hierarchical 'true-path' consistency from root to leaves, without the need for further post-processing. CONCLUSION: A cross-validation study, using data from the yeast Saccharomyces cerevisiae, shows that our method offers substantial improvements over both standard 'guilt-by-association' (i.e., Nearest-Neighbor) and more refined Markov random field methods, whether in their original form or when post-processed to artificially impose 'true-path' consistency.
Further analysis of the results indicates that these improvements are associated with increased predictive capability (i.e., increased positive predictive value), and that this increase holds uniformly across GO-term depths. Additional in silico validation on a collection of new annotations recently added to GO confirms the advantages suggested by the cross-validation study. Taken as a whole, our results show that a hierarchical approach to network-based protein function prediction, one that exploits the ontological structure of protein annotation databases in a principled manner, can offer substantial advantages over the successive application of 'flat' network-based methods.
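The 'true-path' rule this abstract describes (a prediction score for a GO term should never exceed the score of any of its ancestors) can be sketched as a small post-hoc check and repair pass; this is a toy illustration with invented term IDs, not the paper's classifier, which satisfies the rule by construction rather than by post-processing:

```python
# Toy GO fragment: each term maps to its parent terms (hypothetical IDs).
PARENTS = {
    "GO:child": ["GO:mid"],
    "GO:mid": ["GO:root"],
    "GO:root": [],
}

def true_path_consistent(scores, parents):
    """True if no term scores higher than any of its parents."""
    return all(
        scores[term] <= scores[p]
        for term, ps in parents.items()
        for p in ps
    )

def enforce_true_path(scores, parents):
    """Clip each term's score to its parents' scores via a simple
    fixed-point pass (the 'transitive closure' style repair that the
    paper's method avoids needing)."""
    out = dict(scores)
    changed = True
    while changed:
        changed = False
        for term, ps in parents.items():
            for p in ps:
                if out[term] > out[p]:
                    out[term] = out[p]
                    changed = True
    return out

raw = {"GO:root": 0.9, "GO:mid": 0.4, "GO:child": 0.7}  # violates the rule
fixed = enforce_true_path(raw, PARENTS)
```

After the repair pass, the child term's score is clipped to its ancestor's, restoring root-to-leaf consistency.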
Abstract:
The Science of Network Service Composition has clearly emerged as one of the grand themes driving many of our research questions in the networking field today [NeXtworking 2003]. This driving force stems from the rise of sophisticated applications and new networking paradigms. By "service composition" we mean that the performance and correctness properties local to the various constituent components of a service can be readily composed into global (end-to-end) properties without re-analyzing any of the constituent components in isolation, or as part of the whole composite service. The set of laws that would govern such composition is what will constitute that new science of composition. The combined heterogeneity and dynamic open nature of network systems makes composition quite challenging, and thus programming network services has been largely inaccessible to the average user. We identify (and outline) a research agenda in which we aim to develop a specification language that is expressive enough to describe different components of a network service, and that will include type hierarchies inspired by type systems in general programming languages that enable the safe composition of software components. We envision this new science of composition to be built upon several theories (e.g., control theory, game theory, network calculus, percolation theory, economics, queuing theory). In essence, different theories may provide different languages by which certain properties of system components can be expressed and composed into larger systems. We then seek to lift these lower-level specifications to a higher level by abstracting away details that are irrelevant for safe composition at the higher level, thus making theories scalable and useful to the average user. In this paper we focus on services built upon an overlay management architecture, and we use control theory and QoS theory as example theories from which we lift up compositional specifications.
Abstract:
With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly pressing and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches which are already stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
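The staleness check at the heart of the BTC idea can be caricatured as follows; the class, field names, and (resource, version) token format are invented for illustration, and the actual annotation scheme is defined in the paper:

```python
class BTCClient:
    """Toy client cache illustrating the basis-token staleness check.
    All names and the token format here are hypothetical."""

    def __init__(self):
        self.latest = {}  # resource id -> highest version token seen

    def is_stale(self, tokens):
        """A response is stale if any of its basis tokens is older than
        a version this client has already observed for that resource."""
        return any(v < self.latest.get(r, v) for r, v in tokens)

    def observe(self, tokens):
        """Record the newest version seen for each annotated resource."""
        for r, v in tokens:
            self.latest[r] = max(self.latest.get(r, v), v)

c = BTCClient()
c.observe([("price-list", 3), ("banner", 1)])
# A response assembled from an older price-list fragment, perhaps served
# by a non-conforming intermediary cache, is recognized as stale:
stale = c.is_stale([("price-list", 2), ("banner", 1)])
```

Note that the server only annotates responses; all comparison state lives at the client, which matches the abstract's claim that servers keep no per-client state.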
Abstract:
Consideration of how people respond to the question What is this? has suggested new problem frontiers for pattern recognition and information fusion, as well as neural systems that embody the cognitive transformation of declarative information into relational knowledge. In contrast to traditional classification methods, which aim to find the single correct label for each exemplar (This is a car), the new approach discovers rules that embody coherent relationships among labels which would otherwise appear contradictory to a learning system (This is a car, that is a vehicle, over there is a sedan). This talk will describe how an individual who experiences exemplars in real time, with each exemplar trained on at most one category label, can autonomously discover a hierarchy of cognitive rules, thereby converting local information into global knowledge. Computational examples are based on the observation that sensors working at different times, locations, and spatial scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels, which are reconciled by implicit underlying relationships that the network’s learning process discovers. The ARTMAP information fusion system can, moreover, integrate multiple separate knowledge hierarchies, by fusing independent domains into a unified structure. In the process, the system discovers cross-domain rules, inferring multilevel relationships among groups of output classes, without any supervised labeling of these relationships. In order to self-organize its expert system, the ARTMAP information fusion network features distributed code representations which exploit the model’s intrinsic capacity for one-to-many learning (This is a car and a vehicle and a sedan) as well as many-to-one learning (Each of those vehicles is a car). Fusion system software, testbed datasets, and articles are available from http://cns.bu.edu/techlab.
Abstract:
Avalanche Photodiodes (APDs) have been used in a wide range of low light sensing applications such as DNA sequencing, quantum key distribution, LIDAR and medical imaging. To operate the APDs, control circuits are required to achieve the desired performance characteristics. This thesis presents the development of three control circuits, comprising a bias circuit, an active quench and reset circuit and a gain control circuit, all of which are used for control and performance enhancement of the APDs. The bias circuit is used to bias planar APDs for operation in both linear and Geiger modes. The circuit is based on a dual charge pumps configuration and operates from a 5 V supply. It is capable of providing milliamp load currents for shallow-junction planar APDs that operate up to 40 V. With novel voltage regulators, the bias voltage provided by the circuit can be accurately controlled and easily adjusted by the end user. The circuit is highly integrable and provides an attractive solution for applications requiring a compact integrated APD device. The active quench and reset circuit is designed for APDs that operate in Geiger-mode and are required for photon counting. The circuit enables linear changes in the hold-off time of the Geiger-mode APD (GM-APD) from several nanoseconds to microseconds with a stable setting step of 6.5 ns. This facilitates setting the optimal `afterpulse-free' hold-off time for any GM-APD via user-controlled digital inputs. In addition, this circuit doesn’t require an additional monostable or pulse generator to reset the detector, thus simplifying the circuit. Compared to existing solutions, this circuit provides more accurate and simpler control of the hold-off time while maintaining a comparable maximum count-rate of 35.2 Mcounts/s. The third circuit designed is a gain control circuit. This circuit is based on the idea of using two matched APDs to set and stabilize the gain.
The circuit can provide high bias voltage for operating the planar APD, precisely set the APD’s gain (with errors of less than 3%) and compensate for changes in temperature to maintain a more stable gain. The circuit operates without the need for external temperature sensing and control electronics, thus lowering the system cost and complexity. It also provides a simpler and more compact solution compared to previous designs. The three circuits designed in this project were developed independently of each other and are used for improving different performance characteristics of the APD. Further research on the combination of the three circuits will produce a more compact APD-based solution for a wide range of applications.
Abstract:
This thesis presents research theorising the use of social network sites (SNS) for the consumption of cultural goods. SNS are Internet-based applications that enable people to connect, interact, discover, and share user-generated content. They have transformed communication practices and enable users to present their identity online through the disclosure of information on a profile. SNS are especially effective for propagating content far and wide within a network of connections. Cultural goods constitute hedonic experiential goods with cultural, artistic, and entertainment value, such as music, books, films, and fashion. Their consumption is culturally dependent and they have unique characteristics that distinguish them from utilitarian products. The way in which users express their identity on SNS is through the sharing of cultural interests and tastes. This makes cultural good consumption susceptible to the exchange of content and ideas that occurs across an expansive network of connections within these social systems. This study proposes the lens of affordances to theorise the use of social network sites for the consumption of cultural goods. Qualitative case study research using two phases of data collection is proposed in the application of affordances to the research topic. The interaction between task, technology, and user characteristics is investigated by examining each characteristic in detail, before investigating the actual interaction between the user and the artifact for a particular purpose.
The study contributes to knowledge by (i) improving our understanding of the affordances of social network sites for the consumption of cultural goods, (ii) demonstrating the role of task, technology and user characteristics in mediating user behaviour for user-artifact interactions, (iii) explaining the technical features and user activities important to the process of consuming cultural goods using social network sites, and (iv) theorising the consumption of cultural goods using SNS by presenting a theoretical research model which identifies empirical indicators of model constructs and maps out affordance dependencies and hierarchies. The study also provides a systematic research process for applying the concept of affordances to the study of system use.
Abstract:
BACKGROUND: Serotonin is a neurotransmitter that has been linked to a wide variety of behaviors including feeding and body-weight regulation, social hierarchies, aggression and suicidality, obsessive compulsive disorder, alcoholism, anxiety, and affective disorders. Full understanding of serotonergic systems in the central nervous system involves genomics, neurochemistry, electrophysiology, and behavior. Though associations have been found between functions at these different levels, in most cases the causal mechanisms are unknown. The scientific issues are daunting but important for human health because of the use of selective serotonin reuptake inhibitors and other pharmacological agents to treat disorders in the serotonergic signaling system. METHODS: We construct a mathematical model of serotonin synthesis, release, and reuptake in a single serotonergic neuron terminal. The model includes the effects of autoreceptors, the transport of tryptophan into the terminal, and the metabolism of serotonin, as well as the dependence of release on the firing rate. The model is based on real physiology determined experimentally and is compared to experimental data. RESULTS: We compare the variations in serotonin and dopamine synthesis due to meals and find that dopamine synthesis is insensitive to the availability of tyrosine but serotonin synthesis is sensitive to the availability of tryptophan. We conduct in silico experiments on the clearance of extracellular serotonin, normally and in the presence of fluoxetine, and compare to experimental data. We study the effects of various polymorphisms in the genes for the serotonin transporter and for tryptophan hydroxylase on synthesis, release, and reuptake. We find that, because of the homeostatic feedback mechanisms of the autoreceptors, the polymorphisms have smaller effects than one expects. We compute the expected steady-state concentrations in serotonin transporter knockout mice and compare to experimental data.
Finally, we study how the properties of the serotonin transporter and the autoreceptors give rise to the time courses of extracellular serotonin in various projection regions after a dose of fluoxetine. CONCLUSIONS: Serotonergic systems must respond robustly to important biological signals, while at the same time maintaining homeostasis in the face of normal biological fluctuations in inputs, expression levels, and firing rates. This is accomplished through the cooperative effect of many different homeostatic mechanisms including special properties of the serotonin transporters and the serotonin autoreceptors. Many difficult questions remain in order to fully understand how serotonin biochemistry affects serotonin electrophysiology and vice versa, and how both are changed in the presence of selective serotonin reuptake inhibitors. Mathematical models are useful tools for investigating some of these questions.
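The homeostatic damping provided by autoreceptor feedback, the reason polymorphisms have smaller effects than expected, can be illustrated with a deliberately minimal one-compartment balance (all rate constants invented, nothing fitted to this paper's data): halving the transporter's reuptake rate raises extracellular serotonin by well under the naive factor of two.

```python
def steady_state_5ht(reuptake_rate, feedback_gain=2.0,
                     steps=20000, dt=0.01):
    """Euler-integrate a toy extracellular-serotonin balance:
    release is damped by an autoreceptor term 1/(1 + g*s), and
    clearance is proportional to the transporter's reuptake rate.
    All constants are illustrative, not physiological."""
    s = 0.0
    for _ in range(steps):
        release = 1.0 / (1.0 + feedback_gain * s)  # autoreceptor inhibition
        s += dt * (release - reuptake_rate * s)
    return s

normal = steady_state_5ht(reuptake_rate=1.0)
half_sert = steady_state_5ht(reuptake_rate=0.5)  # transporter polymorphism
```

Without the feedback term, halving reuptake would exactly double the steady-state level; with it, the increase is markedly smaller, which is the qualitative point the abstract makes about autoreceptor homeostasis.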
Abstract:
Social and ecological factors are important in shaping sexual dimorphism in Anthropoidea, but there is also a tendency for body-size dimorphism and canine dimorphism to increase with increased body size (Rensch's rule) (Rensch: Evolution Above the Species Level. London: Methuen, 1959). Most ecologists interpret Rensch's rule to be a consequence of social and ecological selective factors that covary with body size, but recent claims have been advanced that dimorphism is principally a consequence of selection for increased body size alone. Here we assess the effects of body size, body-size dimorphism, and social structure on canine dimorphism among platyrrhine monkeys. Platyrrhine species examined are classified into four behavioral groups reflecting the intensity of intermale competition for access to females or to limiting resources. As canine dimorphism increases, so does the level of intermale competition. Those species with monogamous and polyandrous social structures have the lowest canine dimorphism, while those with dominance rank hierarchies of males have the most canine dimorphism. Species with fission-fusion social structures and transitory intermale breeding-season competition fall between these extremes. Among platyrrhines there is a significant positive correlation between body size and canine dimorphism. However, within levels of competition, no significant correlation was found between the two. Also, with increased body size, body-size dimorphism tends to increase, and this correlation holds in some cases within competition levels. In an analysis of covariance, once the level of intermale competition is controlled for, neither molar size nor molar-size dimorphism accounts for a significant part of the variance in canine dimorphism. A similar analysis using body weight as a measure of size and dimorphism yields a less clear-cut picture: body weight contributes significantly to the model when the effects of the other factors are controlled.
Finally, in a model using head and body length as a measure of size and dimorphism, all factors and the interactions between them are significant. We conclude that intermale competition among platyrrhine species is the most important factor explaining variations in canine dimorphism. The significant effects of size and size dimorphism in some models may be evidence that natural (as opposed to sexual) selection also plays a role in the evolution of increased canine dimorphism.
Abstract:
This study, "Civil Rights on the Cell Block: Race, Reform, and Violence in Texas Prisons and the Nation, 1945-1990," offers a new perspective on the historical origins of the modern prison industrial complex, sexual violence in working-class culture, and the ways in which race shaped the prison experience. This study joins new scholarship that reperiodizes the Civil Rights era while also considering how violence and radicalism shaped the civil rights struggle. It places the criminal justice system at the heart of both an older racial order and within a prison-made civil rights movement that confronted the prison's power to deny citizenship and enforce racial hierarchies. By charting the trajectory of the civil rights movement in Texas prisons, my dissertation demonstrates how the internal struggle over rehabilitation and punishment shaped civil rights, racial formation, and the political contest between liberalism and conservatism. This dissertation offers a close case study of Texas, where the state prison system emerged as a national model for penal management. The dissertation begins with a hopeful story of reform marked by an apparently successful effort by the State of Texas to replace its notorious 1940s plantation/prison farm system with an efficient, business-oriented agricultural enterprise system. When this new system was fully operational in the 1960s, Texas garnered plaudits as a pioneering, modern, efficient, and business-oriented Sun Belt state. But this reputation of competence and efficiency obfuscated the reality of a brutal system of internal prison management in which inmates acted as guards, employing coercive means to maintain control over the prisoner population. The inmates whom the prison system placed in charge also ran an internal prison economy in which money, food, human beings, reputations, favors, and sex all became commodities to be bought and sold.
I analyze both how the Texas prison system managed to maintain its high external reputation for so long in the face of the internal reality and how that reputation collapsed when inmates, inspired by the Civil Rights Movement, revolted. My dissertation shows that this inmate Civil Rights rebellion was a success in forcing an end to the existing system but a failure in its attempts to make conditions in Texas prisons more humane. The new Texas prison regime, I conclude, utilized paramilitary practices, privatized prisons, and gang-related warfare to establish a new system that focused much more on law and order in the prisons than on the legal and human rights of prisoners. Placing the inmates and their struggle at the heart of the national debate over rights and "law and order" politics reveals an inter-racial social justice movement that asked the courts to reconsider how the state punished those who committed a crime while also reminding the public of the inmates' humanity and their constitutional rights.
Abstract:
BACKGROUND: A hierarchical taxonomy of organisms is a prerequisite for semantic integration of biodiversity data. Ideally, there would be a single, expansive, authoritative taxonomy that includes extinct and extant taxa, information on synonyms and common names, and monophyletic supraspecific taxa that reflect our current understanding of phylogenetic relationships. DESCRIPTION: As a step towards development of such a resource, and to enable large-scale integration of phenotypic data across vertebrates, we created the Vertebrate Taxonomy Ontology (VTO), a semantically defined taxonomic resource derived from the integration of existing taxonomic compilations, and freely distributed under a Creative Commons Zero (CC0) public domain waiver. The VTO includes both extant and extinct vertebrates and currently contains 106,947 taxonomic terms, 22 taxonomic ranks, 104,736 synonyms, and 162,400 cross-references to other taxonomic resources. Key challenges in constructing the VTO included (1) extracting and merging names, synonyms, and identifiers from heterogeneous sources; (2) structuring hierarchies of terms based on evolutionary relationships and the principle of monophyly; and (3) automating this process as much as possible to accommodate updates in source taxonomies. CONCLUSIONS: The VTO is the primary source of taxonomic information used by the Phenoscape Knowledgebase (http://phenoscape.org/), which integrates genetic and evolutionary phenotype data across both model and non-model vertebrates. The VTO is useful for inferring phenotypic changes on the vertebrate tree of life, which enables queries for candidate genes for various episodes in vertebrate evolution.
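Challenge (1) above, merging names, synonyms, and identifiers from heterogeneous sources, can be sketched in miniature; the record fields and source labels here are hypothetical, and real reconciliation must also handle homonyms, misspellings, and conflicting ranks:

```python
def merge_taxonomies(sources):
    """Merge taxonomic records keyed by canonical name, pooling
    synonyms and cross-references from each source. A toy version of
    the VTO's name-merging step, not its actual pipeline."""
    merged = {}
    for source_id, records in sources.items():
        for rec in records:
            entry = merged.setdefault(
                rec["name"], {"synonyms": set(), "xrefs": set()})
            entry["synonyms"].update(rec.get("synonyms", []))
            # Keep a cross-reference back to the source identifier.
            entry["xrefs"].add(f'{source_id}:{rec["id"]}')
    return merged

vto_toy = merge_taxonomies({
    "NCBI": [{"id": "7955", "name": "Danio rerio",
              "synonyms": ["zebrafish"]}],
    "PBDB": [{"id": "34986", "name": "Danio rerio",
              "synonyms": ["Brachydanio rerio"]}],
})
```

The two source records collapse into a single term carrying both synonym sets and a cross-reference per source, which is the shape of data the abstract reports (terms, synonyms, cross-references).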
Abstract:
In recent history, a number of tragic events have borne a consistent message: the social structures that existed prior to and during the evacuation significantly affected the decisions made and the actions adopted by the evacuating population in response to the emergency. This type of influence over behaviour has long been neglected in the modelling community. This paper is an attempt to introduce some of these considerations into evacuation models and to demonstrate their impact. To capture this type of behaviour within evacuation models, a mechanism representing membership and position within social hierarchies is established. In addition, individuals within the social groupings are given the capacity to communicate relevant pieces of data such as the need to evacuate—impacting the response time—and the location of viable exits—impacting route selection. Furthermore, the perception and response to this information is also affected by the social circumstances in which individuals find themselves. Copyright © 2005 John Wiley & Sons, Ltd.
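The group-communication mechanism described above can be sketched as a toy agent model; all names and parameters (response times, the speedup factor) are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Toy evacuee with a social group and a base response time."""
    name: str
    group: str
    response_time: float = 60.0   # seconds until reacting, if uninformed
    known_exits: set = field(default_factory=set)

def share_within_group(agents, informed_name, exit_id, speedup=0.5):
    """Once one member learns of the need to evacuate and a viable exit,
    fellow group members learn the exit too (affecting route selection)
    and react sooner (their response time is scaled by `speedup`)."""
    informed = next(a for a in agents if a.name == informed_name)
    informed.known_exits.add(exit_id)
    for a in agents:
        if a.group == informed.group and a.name != informed.name:
            a.known_exits.add(exit_id)
            a.response_time *= speedup
    return agents

crowd = [Agent("ana", "family"), Agent("ben", "family"),
         Agent("eve", "staff")]
share_within_group(crowd, "ana", "exit-B")
```

Agents outside the informed agent's social grouping are unaffected, which is the point of modelling social structure rather than treating the crowd as homogeneous.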
Abstract:
This study seeks to clarify the physical and socioeconomic transformations of the rural settlements of the Spanish region of Castilla y León during the second half of the twentieth century. The temporal evolution of urban form is analysed through a Geographic Information System (GIS), computing metric indices and comparing them with historical demographic information. The results aim to show the effects of economic functional specialisation, caused by integration into global production hierarchies, on urban structure. The gradual loss of the traditional characteristics of Castilian villages, such as compactness and integration with the surroundings, owing to the loss or degradation of vernacular architecture and the construction of new industrial buildings, poses a risk to future local development policies. Preserving the identity of the landscape and avoiding the destruction of cultural heritage are considered necessary in order to revitalise these territories.
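One example of the kind of metric index such a GIS analysis might compute for settlement compactness is the Polsby-Popper shape index; this is a standard compactness formula offered as an illustration, since the abstract does not specify which indices the study actually uses:

```python
import math

def polsby_popper(area, perimeter):
    """Compactness index 4*pi*A / P**2: equals 1.0 for a circle and
    falls toward 0 for elongated or sprawling footprints, the kind of
    loss of compactness the study associates with new industrial
    construction on village edges."""
    return 4 * math.pi * area / perimeter ** 2

# A circular footprint of radius 100 m is maximally compact:
circle = polsby_popper(math.pi * 100**2, 2 * math.pi * 100)
# A 1000 m x 50 m strip of roadside industrial sheds is far less so:
elongated = polsby_popper(1000 * 50, 2 * (1000 + 50))
```

Computing such an index per settlement and per survey year gives a time series that can be compared against historical demographic data, as the abstract describes.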