948 results for knowledge structure
Abstract:
In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community. In particular, it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax both for queries and for query results; a rigorously exposed formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results having a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionality for the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
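The constraint-composition idea in this abstract can be illustrated with a minimal sketch: RDF-style triples held as a Python set, with Boolean operators realized as set operations over the matching subjects. The triples, predicate names, and `helm:` prefixes below are invented for illustration and are not taken from the HELM metadata schema or the thesis's actual query language.

```python
# Toy RDF-style triple store: (subject, predicate, object) tuples.
# All identifiers here are hypothetical examples.
triples = {
    ("doc1", "dc:title", "Group Theory Notes"),
    ("doc1", "rdf:type", "helm:Theorem"),
    ("doc2", "dc:title", "Ring Homomorphisms"),
    ("doc2", "rdf:type", "helm:Definition"),
}

def match(subject=None, predicate=None, obj=None):
    """Return subjects of triples matching the pattern (None = wildcard)."""
    return {s for (s, p, o) in triples
            if (subject is None or s == subject)
            and (predicate is None or p == predicate)
            and (obj is None or o == obj)}

# Boolean composition of constraints maps onto set operators:
# AND = intersection, OR = union, NOT = difference.
theorems = match(predicate="rdf:type", obj="helm:Theorem")
titled = match(predicate="dc:title")
print(sorted(theorems & titled))  # subjects satisfying both constraints
```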
Abstract:
Macromolecular drug-delivery systems are of strong interest for the clinical application of chemotherapeutic agents. To investigate their clinical potential, it is particularly important to determine the pharmacokinetic profile in vivo. Every change in polymer structure influences the body distribution of the corresponding macromolecule. Detailed knowledge of structure-property relationships in the living organism is therefore required in order to tune a nanocarrier system for future applications. In this respect, preclinical screening by radioactive labeling and positron emission tomography is a useful method for the fast, quantitative monitoring of drug-carrier candidates. Poly(HPMA) and PEG in particular are widespread in the field of polymer-based therapeutics, and structures derived from them could offer new generations in this research area. The present work describes the successful synthesis of various HPMA- and PEG-based polymer architectures (homopolymers, random and block copolymers), carried out by RAFT polymerization and reactive ester chemistry. These polymers were then radiolabeled with fluorine-18 and iodine-131 and biologically evaluated by microPET and ex vivo biodistribution studies in tumor-bearing rats. Varying the polymer architecture and analyzing the polymers in vivo led to important conclusions. The hydrophilic/lipophilic balance had a significant influence on the pharmacokinetic profile, with the best in vivo properties (low uptake in liver and spleen and prolonged blood circulation time) observed for random HPMA-LMA copolymers with increasing hydrophobic content. In addition, long-term studies with iodine-131 showed enhanced retention of high-molecular-weight, HPMA-based random copolymers in tumor tissue.
This observation confirmed the well-known EPR effect. Furthermore, superstructure formation, and hence polymer size, is a key factor for efficient tumor targeting, since polymer structures above 200 nm in diameter are quickly recognized by the MPS and eliminated from the bloodstream. The HPMA block copolymers synthesized here were therefore chemically modified with PEG side chains to induce a decrease in size and a reduction in blood clearance. This approach led to increased tumor accumulation in the Walker 256 carcinoma model. In general, the body distribution of HPMA- and PEG-based polymers is strongly influenced by polymer architecture and molecular weight. Moreover, their efficiency in tumor treatment depends markedly on the individual characteristics of the particular tumor. Based on these observations, the thesis presented here emphasizes the need for detailed polymer characterization, combined with preclinical screening, in order to tailor polymeric drug-delivery systems for individualized patient therapy in the future.
Abstract:
The first outcome of this project was a synchronic description of the most widely spoken Romani dialect in the Czech and Slovak Republics, aimed at teachers and lecturers of the Romani language. This is intended to serve as a methodological guide for the demonstration of various grammatical phenomena, but may also assist people who want a basic knowledge of the linguistic structure of this neo-Indian language. The grammatical material is divided into 23 chapters, in a sequence which may be followed in teaching or studying. The book includes examples of the grammatical elements, but not exercises or articles. The second work produced was a textbook of Slovak Romani, the most detailed in the Czech or Slovak Republics to date. It is aimed at all those interested in active use of the Romani language: high school and university students, people working with the Roma, and Roma who speak little or nothing of the language of their forebears. The book includes 34 lessons, each containing relevant Romani texts (articles and dialogues), a short vocabulary list, grammatical explanations, exercises and examples of Romani written or oral expression. The textbook also contains a considerable amount of ethno-cultural information and notes on the life and traditions of the Roma, as well as pointing out some differences between dialects. A brief Romani-Czech phrase book is included as an appendix.
Abstract:
The Austrian philosopher Ludwig Wittgenstein famously proposed a style of philosophy directed against certain pictures [Bild] that tacitly direct our language and forms of life. His aim was to show the fly the way out of the fly-bottle and to fight against the bewitchment of our intelligence by means of language: “A picture held us captive. And we could not get outside it, for it lay in our language and language seemed to repeat it to us inexorably” (Wittgenstein 1953, 115). In this context Wittgenstein is talking of philosophical pictures, deep metaphors that have structured our language, but he also uses the term picture in other contexts (see Owen 2003, 83). I want to appeal to Wittgenstein in my use of the term ideology to refer to the way in which powerful underlying metaphors in neoclassical economics have a strong rhetorical and constitutive force at the level of public policy. Indeed, I am specifically speaking of the notion of ‘the performative’ in Wittgenstein and Austin. The notion of the knowledge economy has a prehistory in Hayek (1937; 1945), who founded the economics of knowledge in the 1930s; in Machlup (1962; 1970), who mapped the emerging employment shift to the US service economy in the early 1960s; and in the sociologists Bell (1973) and Touraine (1974), who began to tease out the consequences of these changes for social structure in the post-industrial society in the early 1970s. The term has since been taken up by economists, sociologists, futurists and policy experts to explain the transition to the so-called ‘new economy’. It is not just a matter of noting these discursive strands in the genealogy of the ‘knowledge economy’ and related or cognate terms. We can also make a number of observations on the basis of this brief analysis.
First, there has been a succession of terms like ‘postindustrial economy’, ‘information economy’, ‘knowledge economy’ and ‘learning economy’, each with a set of related concepts emphasising its social, political, management or educational aspects. Often these literatures do not cross-reference one another and tend to focus on only one aspect of the phenomena, leading to classic dichotomies such as that between economy and society, or knowledge and information. Second, these terms and their family concepts are discursive, historical and ideological products in the sense that they create their own meanings and often lead to constitutive effects at the level of policy. Third, while there is some empirical evidence to support claims concerning these terms, at the level of public policy these claims are empirically underdetermined and contain an integrating, visionary or futures component, which necessarily remains untested and is, perhaps, in principle untestable.
Abstract:
The evolution of pharmaceutical care is identified through a complete review of the literature published in the American Journal of Health-System Pharmacy, the sole comprehensive publication of institutional pharmacy practice. The evolution is categorized according to characteristics of structure (organizational structure, the role of the pharmacist), process (drug delivery systems, formulary management, acquiring drug products, methods to impact drug therapy decisions), and outcomes (cost of drug delivery, cost of drug acquisition and use, improved safety, improved health outcomes) recorded from the 1950s through the 1990s. While significant progress has been made in implementing basic drug distribution systems, levels of pharmacy involvement with direct patient care are still limited. A new practice framework suggests enhanced direct patient care involvement through increases in the efficiency and effectiveness of traditional pharmacy services. Recommendations advance internal and external organizational structure relationships that position pharmacists to fully use their unique skills and knowledge to impact drug therapy decisions and outcomes. Specific strategies facilitate expansion of the breadth and scope of each process component in order to deepen the integration of pharmacy and pharmaceutical care within the broad healthcare environment. Economic evaluation methods formally evaluate the impact of both operational and clinical interventions. Outcome measurements include specific recommendations and methods to increase the efficiency of drug acquisition, emphasizing pharmacists' roles that impact physician prescribing decisions.
Effectiveness measures include those that improve the safety of drug distribution systems, decrease the potential for adverse drug therapy events, and demonstrate that pharmaceutical care can significantly contribute to improvement in overall health status. The implementation of the new framework is modeled on a case study at the M.D. Anderson Cancer Center. The implementation of several new drug distribution methods facilitated the redeployment of personnel from distributive functions to direct patient care activities, with significant personnel and drug cost reductions. A cost-benefit analysis illustrates that the framework process enhancements produced a benefit-to-cost ratio of 7.9. In addition, measures of effectiveness demonstrated significant levels of safety and enhanced drug therapy outcomes.
Abstract:
Our research project develops an intranet search engine with concept-browsing functionality, where the user is able to navigate the conceptual level in an interactive, automatically generated knowledge map. This knowledge map visualizes tacit, implicit knowledge, extracted from the intranet, as a network of semantic concepts. Inductive and deductive methods are combined; a text analytics engine extracts knowledge structures from data inductively, and the enterprise ontology provides a backbone structure to the process deductively. In addition to performing conventional keyword search, the user can browse the semantic network of concepts and associations to find documents and data records. Also, the user can expand and edit the knowledge network directly. As a vision, we propose a knowledge-management system that provides concept browsing, based on a knowledge warehouse layer on top of a heterogeneous knowledge base with various system interfaces. Such a concept browser will empower knowledge workers to interact with knowledge structures.
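The concept-browsing idea can be sketched as a small semantic network: concepts linked by associations, with documents attached to concepts. All concept and document names below are invented for illustration; a real deployment would populate these maps from the text analytics engine and the enterprise ontology.

```python
# Hypothetical concept network: concept -> directly associated concepts.
concept_links = {
    "invoice": {"payment", "customer"},
    "payment": {"invoice", "account"},
    "customer": {"invoice"},
    "account": {"payment"},
}
# Hypothetical index: concept -> documents mentioning it.
concept_docs = {
    "invoice": {"doc_17"},
    "payment": {"doc_17", "doc_42"},
    "account": {"doc_42"},
}

def browse(concept):
    """Documents reachable from a concept and its direct associations."""
    related = {concept} | concept_links.get(concept, set())
    return sorted(set().union(*(concept_docs.get(c, set()) for c in related)))

print(browse("invoice"))  # follows invoice -> {payment, customer} one hop out
```

Browsing thus surfaces documents that a plain keyword search for "invoice" would miss, which is the point of navigating concept associations rather than terms.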
Abstract:
The validation of rodent models for restless legs syndrome (Willis-Ekbom disease) and periodic limb movements during sleep requires knowledge of physiological limb motor activity during sleep in rodents. This study aimed to determine the physiological time structure of tibialis anterior activity during sleep in mice and rats, and compare it with that of healthy humans. Wild-type mice (n = 9) and rats (n = 8) were instrumented with electrodes for recording the electroencephalogram and the electromyogram of neck muscles and both tibialis anterior muscles. Healthy human subjects (31 ± 1 years, n = 21) underwent overnight polysomnography. An algorithm for automatic scoring of tibialis anterior electromyogram events of mice and rats during non-rapid eye movement sleep was developed and validated. Visual scoring assisted by this algorithm had inter-rater sensitivity of 92-95% and false-positive rates of 13-19% in mice and rats. The distribution of the time intervals between consecutive tibialis anterior electromyogram events during non-rapid eye movement sleep had a single peak extending up to 10 s in mice, rats and human subjects. The tibialis anterior electromyogram events separated by intervals <10 s mainly occurred in series of two to three events, their occurrence rate in humans being lower than in mice and similar to that in rats. In conclusion, this study proposes reliable rules for scoring tibialis anterior electromyogram events during non-rapid eye movement sleep in mice and rats, demonstrating that their physiological time structure is similar to that of healthy young human subjects. These results strengthen the basis for translational rodent models of periodic limb movements during sleep and restless legs syndrome/Willis-Ekbom disease.
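The interval analysis described above can be sketched as follows: given event onset times, compute inter-event intervals and group events separated by less than 10 s into series. The onset times are made-up illustrative values, not recordings from the study, and the grouping rule is a simplification of the validated scoring algorithm.

```python
# Hypothetical EMG event onset times in seconds during NREM sleep.
event_times = [3.0, 7.5, 12.0, 40.0, 46.0, 120.0]

# Inter-event intervals between consecutive events.
intervals = [b - a for a, b in zip(event_times, event_times[1:])]

# Group events into series: a new series starts when the gap is >= 10 s.
series, current = [], [event_times[0]]
for t, gap in zip(event_times[1:], intervals):
    if gap < 10.0:
        current.append(t)
    else:
        series.append(current)
        current = [t]
series.append(current)

print([len(s) for s in series])  # series lengths, e.g. runs of 2-3 events
```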
Abstract:
In most habitats, vegetation provides the main structure of the environment. This complexity can facilitate biodiversity and ecosystem services. Therefore, measures of vegetation structure can serve as indicators in ecosystem management. However, many structural measures are laborious and require expert knowledge. Here, we used consistent and convenient measures to assess vegetation structure over an exceptionally broad elevation gradient of 866–4550 m above sea level at Mount Kilimanjaro, Tanzania. Additionally, we compared human-modified habitats, including maize fields, traditionally managed home gardens, grasslands, commercial coffee farms and logged and burned forests, with natural habitats along this elevation gradient. We distinguished vertical and horizontal vegetation structure to account for habitat complexity and heterogeneity. Vertical vegetation structure (assessed as number, width and density of vegetation layers, maximum canopy height, leaf area index and vegetation cover) displayed a unimodal elevation pattern, peaking at intermediate elevations in montane forests, whereas horizontal structure (assessed as the coefficient of variation of number, width and density of vegetation layers, maximum canopy height, leaf area index and vegetation cover) was lowest at intermediate altitudes. Overall, vertical structure was consistently lower in modified than in natural habitat types, whereas horizontal structure differed inconsistently between modified and natural habitat types, depending on the specific structural measure and habitat type. Our study shows how vertical and horizontal vegetation structure can be assessed efficiently in various habitat types in tropical mountain regions, and we suggest applying this as a tool for informing future biodiversity and ecosystem service studies.
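The horizontal-structure measure named above, the coefficient of variation (standard deviation divided by the mean) of a structural variable across subsamples, can be sketched in a few lines. The canopy heights are invented subsample values, not data from the Kilimanjaro plots.

```python
from statistics import mean, stdev

# Hypothetical maximum canopy heights (m) from five subsamples of one plot.
canopy_heights = [18.0, 22.0, 20.0, 19.0, 21.0]

def coefficient_of_variation(xs):
    """CV = sample standard deviation / mean; unitless heterogeneity measure."""
    return stdev(xs) / mean(xs)

# Low CV means the canopy is horizontally homogeneous within the plot.
print(round(coefficient_of_variation(canopy_heights), 3))
```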
Abstract:
Open innovation is increasingly being adopted in business and describes a situation in which firms exchange ideas and knowledge with external participants, such as customers, suppliers, partner firms, and universities. This article extends the concept of open innovation with a push model of open innovation: knowledge is voluntarily created outside a firm by individuals and organisations who proceed to push knowledge into a firm’s open innovation project. For empirical analysis, we examine source code and newsgroup data on the Eclipse Development Platform. We find that outsiders invest as much in the firm’s project as the founding firm itself. Based on the insights from Eclipse, we develop four propositions: ‘preemptive generosity’ of a firm, ‘continuous commitment’, ‘adaptive governance structure’, and ‘low entry barrier’ are contexts that enable the push model of open innovation.
Abstract:
One of the key factors behind the growth in global trade in recent decades is an increase in trade in intermediate inputs as a result of the development of vertical production networks (Feenstra, 1998). It is widely recognized that the formation of production networks is due to the expansion of multinational enterprises' (MNEs) activities. MNEs have been differentiated into two types according to their production structure: horizontal and vertical foreign direct investment (FDI). In this paper, we extend the model presented by Zhang and Markusen (1999) to include horizontal and vertical FDI in a model with traded intermediates, using numerical general equilibrium analysis. The simulation results show that horizontal MNEs are more likely to exist when countries are similar in size and in relative factor endowments. Vertical MNEs are more likely to exist when countries differ in relative factor endowments and trade costs are positive. From the results of the simulation, lower trade costs for final goods and differences in factor intensity are conditions for attracting vertical MNEs.
Abstract:
In parallel to the effort of creating Linked Open Data for the World Wide Web, there are a number of projects aimed at developing the same technologies for use in closed environments such as private enterprises. In this paper, we present results of research on interlinking structured data for use in Idea Management Systems, a still rare breed of knowledge management systems dedicated to innovation management. In our study, we show the process of extending an ontology that initially covers only the Idea Management System structure towards linking with distributed enterprise data and public data using Semantic Web technologies. Furthermore, we point out how the established links can help to solve the key problems of contemporary Idea Management Systems.
Abstract:
Diffusion controls the gaseous transport process in soils when advective transport is almost null. Knowledge of the soil structure and pore connectivity is critical to understanding and modelling soil aeration, the sequestration or emission of greenhouse gases, and the volatilization of volatile organic chemicals, among other phenomena. In recent decades these issues have attracted increasing attention, as scientists have realized that soil is one of the most complex materials on Earth, within which many biological, physical and chemical processes that support life and affect climate change take place. A quantitative and explicit characterization of soil structure is difficult because of the complexity of the pore space; this is the main reason why most theoretical approaches to soil porosity are idealizations that simplify the system. In this work, we propose a more realistic attempt to capture the complexity of the system by developing a model that considers the size and location of pores in order to relate them in a network. In the model, we interpret porous soils as heterogeneous networks where pores are represented by nodes, characterized by their size and spatial location, and links represent flows between them. We perform an analysis of the community structure of soil porous media represented as networks. For different real soil samples, modelled as heterogeneous complex networks, spatial communities of pores have been detected depending on the values of the parameters of the porous soil model used. These models are known as Heterogeneous Preferential Attachment (HPA) models. Through an exhaustive analysis of the model, analytical solutions are obtained for the degree densities and degree distribution of the pore networks generated in the thermodynamic limit, and the networks are shown to exhibit properties similar to those observed in other complex networks.
To study the topological properties of these networks in more detail, the presence of soil pore community structures is examined. The detection of communities of pores, as groups densely connected internally with only sparser connections between groups, could contribute to understanding the mechanisms of diffusion phenomena in soils.
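The notion of pore communities, groups of nodes densely connected internally with only sparse links between groups, can be illustrated with a toy adjacency set. The network and the candidate grouping below are invented, not one of the soil samples analysed, and the internal/external edge count is only a crude stand-in for a real community-detection criterion such as modularity.

```python
# Toy pore network: each edge is a flow link between two pores (node ids).
edges = {(1, 2), (2, 3), (1, 3),   # candidate community A: pores 1-3
         (4, 5), (5, 6), (4, 6),   # candidate community B: pores 4-6
         (3, 4)}                   # single bridging link between A and B

def edge_counts(group_a, group_b):
    """Count edges internal to either group vs. edges crossing between them.

    Assumes the two groups partition the node set (true for this toy graph).
    """
    internal = sum((u in g and v in g)
                   for u, v in edges for g in (group_a, group_b))
    external = sum((u in group_a) != (v in group_a) for u, v in edges)
    return internal, external

# A good community split has many internal and few external edges.
print(edge_counts({1, 2, 3}, {4, 5, 6}))
```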
Abstract:
In computer science, different types of reusable components for building software applications have been proposed as a direct consequence of the emergence of new software programming paradigms. The success of these components for building applications depends on factors such as the flexibility of their combination or the ease of their selection in centralised or distributed environments such as the Internet. In this article, we propose a general type of reusable component, called a primitive of representation, inspired by a knowledge-based approach that can promote reusability. The proposal can be understood as a generalisation of existing partial solutions that is applicable to both software and knowledge engineering for the development of hybrid applications that integrate conventional and knowledge-based techniques. The article presents the structure and use of the component and describes our recent experience in the development of real-world applications based on this approach.
Abstract:
Many macroscopic properties (hardness, corrosion, catalytic activity, etc.) are directly related to the surface structure, that is, to the position and chemical identity of the outermost atoms of the material. Current experimental techniques for its determination produce a “signature” from which the structure must be inferred by solving an inverse problem: a solution is proposed, its corresponding signature computed and then compared to the experiment. This is a challenging optimization problem in which the search space and the number of local minima grow exponentially with the number of atoms, so its solution cannot be achieved for arbitrarily large structures. Nowadays, it is solved using a mixture of human knowledge and local search techniques: an expert proposes a solution that is refined using a local minimizer. If the outcome does not fit the experiment, a new solution must be proposed. Solving a small surface can take from days to weeks of this trial-and-error method. Here we describe our ongoing work on its solution. We use a hybrid algorithm that mixes evolutionary techniques with trust-region methods and reuses knowledge gained during execution to avoid repeated searches of the same structures. Its parallelization produces good results even without requiring the gathering of the full population, so it can be used in loosely coupled environments such as grids. With this algorithm, test cases that previously took weeks of expert time can be solved automatically in a day or two of uniprocessor time.
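A minimal sketch of the hybrid strategy, an evolutionary outer loop with a local refinement step and a cache of already-evaluated candidates, is shown below on a one-dimensional toy objective. The quadratic "signature misfit", all parameters, and the simple hill-climb standing in for a genuine trust-region method are illustrative assumptions, not the paper's algorithm.

```python
import random

random.seed(0)
seen = {}                                  # reuse knowledge: candidate -> cost

def misfit(x):
    """Toy signature-misfit objective with its minimum at x = 2.0."""
    key = round(x, 3)                      # cache to avoid re-evaluating
    if key not in seen:
        seen[key] = (x - 2.0) ** 2
    return seen[key]

def refine(x, step=0.05, iters=50):
    """Crude local search standing in for a trust-region refinement."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if misfit(cand) < misfit(x):
                x = cand
    return x

# Evolutionary loop: mutate each candidate, refine it locally, keep the best.
population = [random.uniform(-5, 5) for _ in range(6)]
for _ in range(10):
    children = [refine(x + random.gauss(0, 0.5)) for x in population]
    population = sorted(population + children, key=misfit)[:6]

best = population[0]
print(best)
```

The cache mirrors the paper's reuse of knowledge gained during execution; in the real problem the cached objects would be surface structures and their computed signatures rather than scalars.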
Abstract:
In the last decade, complex networks have been widely applied to the study of many natural and man-made systems, and to the extraction of meaningful information from the interaction structures created by genes and proteins. Nevertheless, less attention has been devoted to metabonomics, due to the lack of a natural network representation of spectral data. Here we define a technique for reconstructing networks from spectral data sets, where nodes represent spectral bins, and pairs of them are connected when their intensities follow a pattern associated with a disease. The structural analysis of the resulting network can then be used to feed standard data-mining algorithms, for instance for the classification of new (unlabeled) subjects. Furthermore, we show how the structure of the network is resilient to the presence of external additive noise, and how it can be used to extract relevant knowledge about the development of the disease.
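The network construction described here can be sketched directly: spectral bins become nodes, and two bins are linked when their intensities co-vary across subjects beyond a threshold. The intensities, the Pearson-correlation criterion, and the 0.9 cutoff are illustrative assumptions, not the paper's actual disease-pattern definition.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical spectral bins: bin -> intensity per subject.
bins = {
    "b1": [1.0, 2.0, 3.0, 4.0],
    "b2": [2.1, 3.9, 6.2, 7.8],   # tracks b1 across subjects
    "b3": [5.0, 1.0, 4.0, 2.0],   # unrelated
}

# Link two bins (nodes) when their intensities strongly co-vary.
edges = [(a, b) for a in bins for b in bins
         if a < b and abs(pearson(bins[a], bins[b])) > 0.9]
print(edges)
```

The resulting edge list can then be fed to standard graph-analysis or data-mining routines, as the abstract suggests.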