900 results for Knowledge Structure


Relevance: 30.00%

Abstract:

In the collective imaginary, a robot is a human-like machine, like the androids of science fiction. However, the robots encountered most frequently are machines that do work that is too dangerous, boring or onerous for humans. Most of the robots in the world are of this type; they can be found in the automotive, medical, manufacturing and space industries. A robot is therefore a system that contains sensors, control systems, manipulators, power supplies and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, i.e. the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, finding a grasp for an object using information about already known objects. But humans can select the best grasp from a vast repertoire not only by considering the physical attributes of the object but also in order to obtain a certain effect. This is why, in our case, the study of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This representation has several advantages: it takes into account the uncertainty of the real world, allowing it to deal with sensor noise; it encodes a notion of causality; and it provides a unified framework for learning. Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn its structure, as more tasks and object features may be introduced in the future, and a complex network design based only on human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modelled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert in order to possibly enhance it.
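To illustrate the kind of score-based structure learning the thesis compares against the expert-designed network, here is a minimal sketch using pgmpy's hill-climbing search with a BIC score (assuming a recent pgmpy version). The column names, i.e. the discretized object features and the symbolic task label, are hypothetical placeholders and not the variables actually used in the thesis.

```python
# Minimal sketch of score-based Bayesian network structure learning,
# assuming discretized sensor features and a symbolic task label.
# Column names below are hypothetical, not the thesis variables.
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

# Toy data: each row is one observed grasp attempt.
data = pd.DataFrame({
    "object_size":  ["small", "large", "small", "large", "small", "large"],
    "object_shape": ["box", "cylinder", "box", "box", "cylinder", "cylinder"],
    "grasp_type":   ["pinch", "power", "pinch", "power", "pinch", "power"],
    "task":         ["pour", "move", "pour", "move", "pour", "move"],
})

# Hill climbing over candidate DAGs, scored with BIC.
search = HillClimbSearch(data)
learned_model = search.estimate(scoring_method=BicScore(data))

# The learned edges can then be compared with the expert-designed network.
print(sorted(learned_model.edges()))
```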

Relevance: 30.00%

Abstract:

The population genetics and phylogeography of two common Mediterranean species were studied in 10 localities on the coasts of Toscana, Puglia and Calabria. The aim of the study was to verify the extent of genetic breaks in areas recognized as boundaries between Mediterranean biogeographic sectors. From about 100 sequences of the mitochondrial Cytochrome Oxidase subunit I (COI) gene obtained from Halocynthia papillosa and Hexaplex trunculus, the genetic diversity, the genetic structure at small and large distances, and the demographic history of both species were analyzed. No evidence of genetic breaks was found for the two species in Toscana and Puglia. The genetic structure of H. trunculus indicates a barrier to gene flow located in Calabria, which could be represented by the Siculo-Tunisian Strait and the Strait of Messina. The observed patterns showed similar levels of gene flow at small distances in both species, although the two species have different larval ecology. These results suggest that other factors, such as currents, local dynamics and seasonal temperatures, influence connectivity along the Italian peninsula. The geographic distribution of the haplotypes shows that H. papillosa could represent a single genetic pool in expansion, whereas H. trunculus has two distinct genetic pools in expansion. The demographic pattern of the two species suggests that Pleistocene sea-level oscillations, in particular those of the LGM, may have played a key role in shaping the genetic structure of the two species. This knowledge provides basic information useful for the definition of management plans or for the design of a network of marine protected areas along the Italian peninsula.
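As an illustration of the kind of sequence-based diversity measure referred to above, the following sketch computes pairwise nucleotide diversity for a set of aligned COI sequences. The sequences shown are invented placeholders; the study itself may well have used dedicated population genetics software rather than code like this.

```python
# Minimal sketch: nucleotide diversity (pi) as the mean proportion of
# differing sites over all pairs of aligned sequences.
# The sequences below are invented placeholders, not study data.
from itertools import combinations

def nucleotide_diversity(aligned_seqs):
    """Average pairwise proportion of nucleotide differences."""
    pairs = list(combinations(aligned_seqs, 2))
    diffs = []
    for a, b in pairs:
        mismatches = sum(1 for x, y in zip(a, b) if x != y)
        diffs.append(mismatches / len(a))
    return sum(diffs) / len(pairs)

coi_alignment = [
    "ATGCGTACGTTAGC",
    "ATGCGTACGTTAGC",
    "ATGCATACGTTAGC",
    "ATGCATACGCTAGC",
]
print(f"pi = {nucleotide_diversity(coi_alignment):.4f}")
```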

Relevance: 30.00%

Abstract:

From the perspective of a new generation of optoelectronic technology based on organic semiconductors, a major objective is to achieve a deep and detailed knowledge of structure-property relationships, in order to optimize the electronic, optical, and charge transport properties by tuning the chemical-physical characteristics of the compounds. The purpose of this dissertation is to contribute to such understanding through suitable theoretical and computational studies. Specifically, the structural, electronic, optical, and charge transport characteristics of several promising, recently synthesized organic materials are investigated by means of an integrated approach encompassing quantum-chemical calculations, molecular dynamics, and kinetic Monte Carlo simulations. Particular care is devoted to the rationalization of optical and charge transport properties in terms of both intra- and intermolecular features. Moreover, a considerable part of this project involves the development of an in-house set of procedures and software code required to assist the modeling of charge transport properties in the framework of the non-adiabatic hopping mechanism applied to organic crystalline materials. In the first part of my investigations, I mainly discuss the optical, electronic, and structural properties of several core-extended rylene derivatives, which can be regarded as model compounds for graphene nanoribbons. Two families have been studied, consisting of bay-linked perylene bisimide oligomers and N-annulated rylenes. Besides rylene derivatives, my studies also concerned the electronic and spectroscopic properties of tetracene diimides, quinoidal oligothiophenes, and oxygen-doped picene. As an example of device application, I studied the structural characteristics governing the efficiency of resistive molecular memories based on a benzoquinone derivative. Finally, in the second part of my investigations, I concentrate on the charge transport properties of perylene bisimide derivatives. In particular, a comprehensive study of the structural and thermal effects on the charge transport of several core-twisted chlorinated and fluoro-alkylated perylene bisimide n-type semiconductors is presented.
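As a sketch of the non-adiabatic hopping picture mentioned above, the snippet below evaluates a Marcus-type hopping rate from an electronic coupling and a reorganization energy, and turns it into a rough one-dimensional mobility estimate via the Einstein relation. In the dissertation such rates feed kinetic Monte Carlo simulations, which are not reproduced here; all numerical values are illustrative placeholders, not parameters from the work.

```python
# Marcus hopping rate and a rough 1D mobility estimate.
# All numerical parameters are illustrative placeholders.
import math

HBAR = 6.582119569e-16   # eV*s
KB_T = 0.025852          # eV at 300 K

def marcus_rate(J, lam, dG=0.0, kT=KB_T):
    """Non-adiabatic Marcus hopping rate (1/s) for electronic coupling J
    and reorganization energy lam, both in eV."""
    prefactor = (2.0 * math.pi / HBAR) * J**2
    gaussian = math.exp(-(dG + lam)**2 / (4.0 * lam * kT))
    return prefactor * gaussian / math.sqrt(4.0 * math.pi * lam * kT)

k = marcus_rate(J=0.05, lam=0.2)   # placeholder coupling and reorganization energy
a = 0.5e-7                         # cm, ~0.5 nm intermolecular spacing (placeholder)
D = k * a**2                       # 1D diffusion coefficient for hopping rate k per neighbor
mu = D / KB_T                      # Einstein relation, mobility in cm^2/(V*s)
print(f"k = {k:.3e} 1/s, D = {D:.3e} cm^2/s, mu = {mu:.3e} cm^2/(V s)")
```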

Relevance: 30.00%

Abstract:

Functional materials are of great importance due to their many applications. The characterization of supramolecular architectures, which are held together by non-covalent interactions, is essential to understand their properties. Solid-state NMR methods have recently been proven able to unravel such structure-property relations with the help of fast magic-angle spinning and advanced pulse sequences. The aim of the current work is to understand the structure and dynamics of functional supramolecular materials which are potentially important for fuel-cell applications (proton-conducting membrane materials) and for solar-cell or plastic-electronic applications (photo-reactive aromatic materials). In particular, hydrogen-bonding networks, local proton mobility, molecular packing arrangements, and local dynamics will be studied by the use of advanced solid-state NMR methods. The first class of materials studied in this work is proton-conducting polymers, which also form hydrogen-bonding networks. Different materials, prepared for high 1H conduction by different approaches, are studied: PAA-P4VP, PVPA-ABPBI, Tz5Si, and triazole-functional systems. The materials are examples of the following major groups:
- Homopolymers with specific functional groups (triazole-functional polysiloxanes).
- Acid-base polymer blends (PAA-P4VP, PVPA-ABPBI).
- Acid-base copolymers (Triazole-PVPA).
- Acid-doped polymers (triazole-functional polymer doped with H3PO4).
Perylene bisimide (PBI) derivatives, a second type of important functional supramolecular material with potential applications in plastic electronics, were also investigated by means of solid-state NMR. The preparation of conducting nanoscopic fibers based on self-assembling functional units is an appealing aim, as such fibers may be incorporated in molecular electronic devices. In this category, perylene derivatives have attracted great attention due to their high charge-carrier mobility. Detailed knowledge of their supramolecular structure and molecular dynamics is crucial for understanding their electronic properties. The aim is to understand the structure, dynamics and packing arrangements which lead to high electron conductivity in PBI derivatives.

Relevance: 30.00%

Abstract:

The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four different models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on previous efforts of the community in the field of legal knowledge representation and rule interchange for applications in the legal domain, applying the theory to a set of real legal documents and stressing OWL axiom definitions as much as possible, so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning features for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.
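To give a flavour of the document metadata layer described above, here is a minimal sketch that annotates a judgement fragment with RDF triples linking a text part to a legal concept, using rdflib. The namespace, class and property names are invented for illustration and are not the vocabularies actually defined in the thesis.

```python
# Minimal sketch: RDF metadata bridging a judgement's text parts and
# legal concepts. Namespaces, classes and properties are invented
# placeholders, not the ontologies defined in the framework.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/legal#")

g = Graph()
g.bind("ex", EX)

# Document structure: a judgement with one motivation paragraph.
g.add((EX.judgement42, RDF.type, EX.Judgement))
g.add((EX.paragraph7, RDF.type, EX.MotivationParagraph))
g.add((EX.judgement42, EX.hasPart, EX.paragraph7))
g.add((EX.paragraph7, EX.textSpan, Literal("The defendant acted in good faith...")))

# Semantic annotation: the paragraph expresses a domain concept.
g.add((EX.GoodFaith, RDF.type, RDFS.Class))
g.add((EX.paragraph7, EX.expressesConcept, EX.GoodFaith))

# Traverse the graph: which concepts are expressed by parts of judgement42?
for part in g.objects(EX.judgement42, EX.hasPart):
    for concept in g.objects(part, EX.expressesConcept):
        print(part, "expresses", concept)
```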

Relevance: 30.00%

Abstract:

Our research asked the following main questions: how do the characteristics of professional service firms allow them to innovate successfully, exploiting through exploring, by combining internal and external factors of innovation, and how do these ambidextrous organisations perceive these factors; and how do successful innovators in professional service firms use corporate entrepreneurship models in their new service development processes? With the goal of shedding light on innovation in professional knowledge-intensive business service firms (PKIBS), we conducted a qualitative analysis of ten globally acting law firms providing business legal services. We analyse the internal and external factors of innovation that are critical for PKIBS' innovation, and we suggest how these firms become ambidextrous in a changing environment. Our findings show that this kind of firm has a particular type of ambidexterity due to its specific characteristics. As PKIBS are very dependent on their human capital, their governance structure, and the high expectations of their clients, their ambidexterity is structural but also contextual at the same time. In addition, we suggest three types of corporate entrepreneurship models that international PKIBS use to enhance innovation in turbulent environments. We looked at how law firms going through turbulent environments were using corporate entrepreneurship activities as part of their strategies to be more innovative. Using a visual mapping methodology, we developed three types of innovation patterns in the law firms. We suggest that corporate entrepreneurship models depend on the successful application of mainly three elements: who participates in corporate entrepreneurship initiatives; what formal processes enhance these initiatives; and what policies are applied to this type of behaviour.

Relevance: 30.00%

Abstract:

Laser Shock Peening (LSP) is a surface enhancement treatment which induces a significant layer of beneficial compressive residual stresses, up to several millimetres deep, underneath the surface of metal components, in order to mitigate the detrimental crack growth behavior in them. The aim of this thesis is to predict the crack growth behavior in metallic specimens with one or more stripes defining the compressive residual stress area induced by the Laser Shock Peening treatment. The treatment was applied as crack retardation stripes perpendicular to the crack propagation direction, with the objective of slowing down the crack as it approaches the peened stripes. The finite element method has been applied to simulate the redistribution of stresses in a cracked model subjected to a tension load and to a compressive residual stress field, and to evaluate the Stress Intensity Factor (SIF) in this condition. Finally, the Afgrow software is used to predict the crack growth behavior of the component following the Laser Shock Peening treatment and to quantify the improvement in fatigue life compared with the baseline specimen. An educational internship at the "Research & Technologies Germany - Hamburg" department of AIRBUS helped to acquire the knowledge and experience needed to write this thesis. The main tasks of the thesis are the following:
• To provide an up-to-date literature survey related to Laser Shock Peening in metallic structures
• To validate the FE model developed against experimental measurements at coupon level
• To develop a design for crack growth slowdown in Centered Cracked Tension specimens, based on a residual stress engineering approach using a laser-peened stripe transverse to the crack path
• To evaluate the Stress Intensity Factor values for Centered Cracked Tension specimens after the Laser Shock Peening treatment via Finite Element Analysis
• To predict the crack growth behavior in Centered Cracked Tension specimens using as input the SIF values evaluated with the FE simulations
• To validate the results by means of experimental tests
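As a rough illustration of the kind of prediction the thesis performs with Afgrow, the sketch below integrates a Paris-law crack growth curve for a centre-cracked tension geometry and models a compressive residual-stress stripe as a local reduction of the effective stress intensity factor range. The geometry, material constants and residual-stress SIF are invented placeholders, not the thesis data or the Afgrow retardation model.

```python
# Sketch: Paris-law crack growth for a centre-cracked tension specimen,
# with a peened stripe modelled as a local compressive residual-stress
# SIF that reduces the effective Delta-K. All values are placeholders.
import math

C, m = 5.0e-11, 3.0          # Paris constants (da/dN in m/cycle, Delta-K in MPa*sqrt(m)), placeholders
W = 0.1                      # specimen width, m
delta_sigma = 80.0           # applied stress range, MPa
stripe = (0.020, 0.025)      # peened stripe (half-crack length range, m)
K_res = -8.0                 # residual-stress SIF inside the stripe, MPa*sqrt(m)

def delta_K(a):
    """Stress intensity factor range for a CCT specimen (secant correction)."""
    beta = math.sqrt(1.0 / math.cos(math.pi * a / W))
    dk = delta_sigma * math.sqrt(math.pi * a) * beta
    if stripe[0] <= a <= stripe[1]:
        dk = max(dk + K_res, 0.0)   # superpose compressive residual-stress SIF
    return dk

a, a_final, N = 0.005, 0.035, 0
while a < a_final:
    a += C * delta_K(a) ** m        # crack growth per cycle, m
    N += 1
print(f"predicted life: {N} cycles to a = {a_final * 1000:.0f} mm half-crack")
```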

Relevance: 30.00%

Abstract:

The availability of a high-intensity antiproton beam with momentum up to 15 GeV/c at the future FAIR facility will open a unique opportunity to investigate wide areas of nuclear physics with the $\overline{P}$ANDA (antiProton ANnihilations at DArmstadt) detector. Part of these investigations concerns the electromagnetic form factors of the proton in the time-like region and the study of the Transition Distribution Amplitudes, for which feasibility studies have been performed in this thesis.

Moreover, simulations to study the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter of PANDA are presented. This detector is crucial especially for the reconstruction of processes like $\bar{p}p \rightarrow e^+ e^- \pi^0$, investigated in this work. Different arrangements of dead material were studied. The results show that both the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter fulfill the requirements for the detection of backward particles, and that this detector is necessary for the reconstruction of the channels of interest.

The study of the annihilation channel $\bar{p}p \rightarrow e^+ e^-$ will improve the knowledge of the electromagnetic form factors in the time-like region and will help to understand their connection with the electromagnetic form factors in the space-like region. In this thesis the feasibility of a measurement of the $\bar{p}p \rightarrow e^+ e^-$ cross section with PANDA is studied using Monte Carlo simulations. The major background channel $\bar{p}p \rightarrow \pi^+ \pi^-$ is taken into account. The results show a $10^9$ background suppression factor, which ensures a sufficiently clean signal with less than 0.1% background contamination. The signal can be measured with an efficiency greater than 30% up to $s = 14\,(\mathrm{GeV}/c)^2$. The electromagnetic form factors are extracted from the reconstructed signal and the corrected angular distribution. Above this $s$ limit, the low cross section will not allow the direct extraction of the electromagnetic form factors. However, the total cross section can still be measured, and an extraction of the electromagnetic form factors is possible under certain assumptions on the ratio between the electric and magnetic contributions.

The Transition Distribution Amplitudes are new non-perturbative objects describing the transition between a baryon and a meson. They are accessible in hard exclusive processes like $\bar{p}p \rightarrow e^+ e^- \pi^0$. The study of this process with PANDA will test the Transition Distribution Amplitude approach. This work includes a feasibility study for measuring this channel with PANDA. The main background reaction here is $\bar{p}p \rightarrow \pi^+ \pi^- \pi^0$. A background suppression factor of $10^8$ has been achieved while keeping a signal efficiency above 20%.

Part of this work has been published in the European Physical Journal A 44, 373-384 (2010).
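To make the form factor extraction concrete, the sketch below uses the standard one-photon-exchange angular distribution commonly written for $\bar{p}p \rightarrow e^+ e^-$ to recover the ratio $R = |G_E|/|G_M|$ from a toy angular distribution by least squares. This is a generic textbook form written down for illustration, not the analysis code of the thesis, and all numbers are placeholders.

```python
# Sketch: one-photon-exchange angular distribution for pbar p -> e+ e-
# and a toy extraction of R = |GE|/|GM| by least squares.
# Placeholder numbers; not the thesis analysis.
import numpy as np

M_P = 0.938272  # proton mass, GeV/c^2

def angular_shape(cos_theta, R, s):
    """Shape of dsigma/dcos(theta) up to a normalization constant:
    (1 + cos^2) + (R^2 / tau) * sin^2, with tau = s / (4 m_p^2)."""
    tau = s / (4.0 * M_P**2)
    return (1.0 + cos_theta**2) + (R**2 / tau) * (1.0 - cos_theta**2)

# Toy "data": angular distribution for a true R, with a little noise.
s = 8.0                        # (GeV/c)^2, placeholder
cos_t = np.linspace(-0.8, 0.8, 17)
true_R = 1.2
rng = np.random.default_rng(1)
data = angular_shape(cos_t, true_R, s) * (1.0 + 0.02 * rng.standard_normal(cos_t.size))

# Brute-force least squares over R and an overall normalization.
best = min(
    ((np.sum((norm * angular_shape(cos_t, R, s) - data) ** 2), R)
     for R in np.linspace(0.5, 2.0, 301)
     for norm in np.linspace(0.9, 1.1, 41)),
    key=lambda t: t[0],
)
print(f"extracted R = |GE|/|GM| ~ {best[1]:.2f} (true {true_R})")
```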

Relevance: 30.00%

Abstract:

In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community. In particular it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously defined formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results having a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities allowing the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
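For readers unfamiliar with graph-oriented querying over RDF metadata, here is a small analogue using rdflib and SPARQL, which is not the query language defined in the thesis but a widely available stand-in: it stores toy HELM-style metadata about two mathematical documents and retrieves those that reference a given theorem. All URIs and property names are invented placeholders.

```python
# Analogue of content-aware retrieval over RDF metadata, using rdflib
# and SPARQL as a stand-in for the thesis's own query language.
# URIs and property names are invented placeholders.
from rdflib import Graph, Namespace, Literal

HELM = Namespace("http://example.org/helm#")

g = Graph()
g.bind("helm", HELM)
g.add((HELM.doc1, HELM.title, Literal("A proof of the fundamental theorem of arithmetic")))
g.add((HELM.doc1, HELM.refersTo, HELM.euclid_lemma))
g.add((HELM.doc2, HELM.title, Literal("Notes on group actions")))
g.add((HELM.doc2, HELM.refersTo, HELM.orbit_stabilizer))

# Retrieve every document whose metadata references a given lemma.
query = """
SELECT ?doc ?title WHERE {
  ?doc helm:refersTo helm:euclid_lemma .
  ?doc helm:title ?title .
}
"""
for doc, title in g.query(query, initNs={"helm": HELM}):
    print(doc, title)
```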

Relevance: 30.00%

Abstract:

Macromolecular drug delivery systems are of great interest for the clinical application of chemotherapeutic agents. To investigate their clinical potential, it is of particular importance to determine their pharmacokinetic profile in vivo. Every change in the polymer structure influences the body distribution of the corresponding macromolecule. Detailed knowledge of structure-property relationships in the living organism is therefore required in order to tune the nanocarrier system for future applications. In this respect, preclinical screening by means of radioactive labelling and positron emission tomography is a useful method for fast and quantitative monitoring of drug carrier candidates. Poly(HPMA) and PEG in particular are widely used in the field of polymer-based therapeutics, and structures derived from them could provide new generations in this research area. The present work describes the successful synthesis of various HPMA- and PEG-based polymer architectures (homopolymers, random and block copolymers) carried out by means of RAFT polymerization and reactive ester chemistry. Furthermore, these polymers were radioactively labelled with fluorine-18 and iodine-131 and biologically evaluated in tumour-bearing rats with the help of microPET and ex vivo biodistribution studies. The variation in polymer architecture and the subsequent in vivo analysis led to important conclusions. The hydrophilic/lipophilic balance had a significant influence on the pharmacokinetic profile, with the best in vivo properties (low uptake in liver and spleen as well as prolonged blood circulation time) found for random HPMA-LMA copolymers with increasing hydrophobic content. In addition, long-term studies with iodine-131 showed an enhanced retention of high-molecular-weight, HPMA-based random copolymers in tumour tissue. This observation confirmed the well-known EPR effect. Furthermore, superstructure formation, and thus polymer size, is a key factor for efficient tumour targeting, since polymer structures above 200 nm in diameter are rapidly recognized by the MPS and eliminated from the bloodstream. For this reason, the HPMA block copolymers synthesized here were chemically modified with PEG side groups in order to induce a decrease in size as well as a reduction in blood clearance. This approach led to increased tumour accumulation in the Walker 256 carcinoma model. In general, the body distribution of HPMA- and PEG-based polymers is strongly influenced by the polymer architecture and the molecular weight. Moreover, their efficiency with respect to tumour treatment depends markedly on the individual characteristics of the particular tumour. Based on these observations, the dissertation presented here emphasizes the need for detailed polymer characterization, combined with preclinical screening, in order to tailor polymeric drug delivery systems for individualized patient therapy in the future.

Relevance: 30.00%

Abstract:

The first outcome of this project was a synchronic description of the most widely spoken Romani dialect in the Czech and Slovak Republics, aimed at teachers and lecturers of the Romani language. It is intended to serve as a methodological guide for the demonstration of various grammatical phenomena, but may also assist people who want a basic knowledge of the linguistic structure of this neo-Indian language. The grammatical material is divided into 23 chapters, in a sequence which may be followed in teaching or studying. The book includes examples of the grammatical elements, but not exercises or articles. The second work produced was a textbook of Slovak Romani, the most detailed in the Czech and Slovak Republics to date. It is aimed at all those interested in active use of the Romani language: high school and university students, people working with the Roma, and Roma who speak little or none of the language of their forebears. The book includes 34 lessons, each containing relevant Romani texts (articles and dialogues), a short vocabulary list, grammatical explanations, exercises and examples of Romani written or oral expression. The textbook also contains a considerable amount of ethno-cultural information and notes on the life and traditions of the Roma, as well as pointing out some differences between dialects. A brief Romani-Czech phrase book is included as an appendix.

Relevance: 30.00%

Abstract:

The Austrian philosopher Ludwig Wittgenstein famously proposed a style of philosophy that was directed against certain pictures [Bild] that tacitly direct our language and forms of life. His aim was to show the fly the way out of the fly bottle and to fight against the bewitchment of our intelligence by means of language: "A picture held us captive. And we could not get outside it, for it lay in our language and language seemed to repeat it to us inexorably" (Wittgenstein 1953, 115). In this context Wittgenstein is talking of philosophical pictures, deep metaphors that have structured our language, but he also uses the term 'picture' in other contexts (see Owen 2003, 83). I want to appeal to Wittgenstein in my use of the term ideology to refer to the way in which powerful underlying metaphors in neoclassical economics have a strong rhetorical and constitutive force at the level of public policy. Indeed, I am specifically speaking of the notion of 'the performative' in Wittgenstein and Austin. The notion of the knowledge economy has a prehistory in Hayek (1937; 1945), who founded the economics of knowledge in the 1930s; in Machlup (1962; 1970), who mapped the emerging employment shift to the US service economy in the early 1960s; and in the sociologists Bell (1973) and Touraine (1974), who began to tease out the consequences of these changes for social structure in the post-industrial society in the early 1970s. The term has since been taken up by economists, sociologists, futurists and policy experts to explain the transition to the so-called 'new economy'. It is not just a matter of noting these discursive strands in the genealogy of the 'knowledge economy' and related or cognate terms. We can also make a number of observations on the basis of this brief analysis. First, there has been a succession of terms like 'post-industrial economy', 'information economy', 'knowledge economy' and 'learning economy', each with a set of related concepts emphasising its social, political, management or educational aspects. Often these literatures do not cross-reference one another and tend to focus on only one aspect of the phenomena, leading to classic dichotomies such as those between economy and society, or knowledge and information. Second, these terms and their family concepts are discursive, historical and ideological products in the sense that they create their own meanings and often lead to constitutive effects at the level of policy. Third, while there is some empirical evidence to support claims concerning these terms, at the level of public policy these claims are empirically underdetermined and contain an integrating, visionary or futures component, which necessarily remains untested and is, perhaps, in principle untestable.

Relevance: 30.00%

Abstract:

The evolution of pharmaceutical care is identified through a complete review of the literature published in the American Journal of Health-System Pharmacy, the sole comprehensive publication of institutional pharmacy practice. The evolution is categorized according to characteristics of structure (organizational structure, the role of the pharmacist), process (drug delivery systems, formulary management, acquiring drug products, methods to impact drug therapy decisions), and outcomes (cost of drug delivery, cost of drug acquisition and use, improved safety, improved health outcomes) recorded from the 1950s through the 1990s. While significant progress has been made in implementing basic drug distribution systems, the level of pharmacy involvement in direct patient care is still limited. A new practice framework suggests enhanced direct patient care involvement through increases in the efficiency and effectiveness of traditional pharmacy services. Recommendations advance internal and external organizational structure relationships that position pharmacists to fully use their unique skills and knowledge to impact drug therapy decisions and outcomes. Specific strategies facilitate expansion of the breadth and scope of each process component in order to expand the depth of integration of pharmacy and pharmaceutical care within the broad healthcare environment. Economic evaluation methods formally evaluate the impact of both operational and clinical interventions. Outcome measurements include specific recommendations and methods to increase the efficiency of drug acquisition, emphasizing pharmacists' roles that impact physician prescribing decisions. Effectiveness measures include those that improve the safety of drug distribution systems, decrease the potential for adverse drug therapy events, and demonstrate that pharmaceutical care can significantly contribute to improvement in overall health status. The implementation of the new framework is modelled on a case study at the M.D. Anderson Cancer Center. The implementation of several new drug distribution methods facilitated the redeployment of personnel from distributive functions to direct patient care activities, with significant personnel and drug cost reductions. A cost-benefit analysis illustrates that framework process enhancements produced a benefit-to-cost ratio of 7.9. In addition, measures of effectiveness demonstrated significant levels of safety and enhanced drug therapy outcomes.

Relevance: 30.00%

Abstract:

Our research project develops an intranet search engine with concept-browsing functionality, where the user is able to navigate the conceptual level in an interactive, automatically generated knowledge map. This knowledge map visualizes tacit, implicit knowledge, extracted from the intranet, as a network of semantic concepts. Inductive and deductive methods are combined: a text analytics engine extracts knowledge structures from data inductively, and the enterprise ontology provides a backbone structure to the process deductively. In addition to performing conventional keyword search, the user can browse the semantic network of concepts and associations to find documents and data records. The user can also expand and edit the knowledge network directly. As a vision, we propose a knowledge-management system that provides concept browsing, based on a knowledge warehouse layer on top of a heterogeneous knowledge base with various system interfaces. Such a concept browser will empower knowledge workers to interact with knowledge structures.
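As a minimal illustration of the inductive side described above, the sketch below builds a tiny concept network from term co-occurrence within documents. Real text analytics engines use far richer extraction than this, and the documents and vocabulary here are invented placeholders.

```python
# Sketch: induce a simple concept network from term co-occurrence in
# documents. Documents and concept vocabulary are invented placeholders.
from collections import Counter
from itertools import combinations

documents = [
    "invoice approval workflow for procurement",
    "procurement policy and supplier onboarding",
    "supplier onboarding checklist and invoice template",
]
vocabulary = {"invoice", "procurement", "supplier", "onboarding", "workflow"}

# Count how often two concepts appear in the same document.
edge_counts = Counter()
for doc in documents:
    present = sorted({t for t in doc.split() if t in vocabulary})
    for a, b in combinations(present, 2):
        edge_counts[(a, b)] += 1

# The weighted edges form the induced concept network; weights could
# later be thresholded or merged with an enterprise-ontology backbone.
for (a, b), weight in edge_counts.most_common():
    print(f"{a} -- {b} (weight {weight})")
```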

Relevance: 30.00%

Abstract:

The validation of rodent models for restless legs syndrome (Willis-Ekbom disease) and periodic limb movements during sleep requires knowledge of physiological limb motor activity during sleep in rodents. This study aimed to determine the physiological time structure of tibialis anterior activity during sleep in mice and rats, and compare it with that of healthy humans. Wild-type mice (n = 9) and rats (n = 8) were instrumented with electrodes for recording the electroencephalogram and electromyogram of neck muscles and both tibialis anterior muscles. Healthy human subjects (31 ± 1 years, n = 21) underwent overnight polysomnography. An algorithm for automatic scoring of tibialis anterior electromyogram events of mice and rats during non-rapid eye movement sleep was developed and validated. Visual scoring assisted by this algorithm had inter-rater sensitivity of 92-95% and false-positive rates of 13-19% in mice and rats. The distribution of the time intervals between consecutive tibialis anterior electromyogram events during non-rapid eye movement sleep had a single peak extending up to 10 s in mice, rats and human subjects. The tibialis anterior electromyogram events separated by intervals <10 s mainly occurred in series of two-three events, their occurrence rate in humans being lower than in mice and similar to that in rats. In conclusion, this study proposes reliable rules for scoring tibialis anterior electromyogram events during non-rapid eye movement sleep in mice and rats, demonstrating that their physiological time structure is similar to that of healthy young human subjects. These results strengthen the basis for translational rodent models of periodic limb movements during sleep and restless legs syndrome/Willis-Ekbom disease.
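To illustrate the kind of scoring rules the study validated, here is a toy sketch that detects tibialis anterior EMG events as supra-threshold bursts within NREM sleep and computes the inter-event intervals, for example the fraction of intervals below 10 s. The threshold, sampling rate and signals are invented placeholders, not the algorithm parameters of the study.

```python
# Toy sketch: detect supra-threshold EMG bursts during NREM sleep and
# compute inter-event intervals. Parameters and data are placeholders,
# not the scoring rules validated in the study.
import numpy as np

FS = 100.0          # sampling rate, Hz (placeholder)
THRESHOLD = 3.0     # threshold in baseline standard deviations (placeholder)

def detect_events(emg, nrem_mask, fs=FS, threshold=THRESHOLD):
    """Return event onset times (s) where the smoothed, rectified EMG
    exceeds baseline mean + threshold*SD within NREM epochs."""
    rectified = np.abs(emg)
    win = int(0.1 * fs)  # 100 ms moving average to suppress isolated spikes
    smoothed = np.convolve(rectified, np.ones(win) / win, mode="same")
    baseline = smoothed[nrem_mask]
    thr = baseline.mean() + threshold * baseline.std()
    above = (smoothed > thr) & nrem_mask
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
    return onsets / fs

# Synthetic example: noise with a few injected bursts during "NREM".
rng = np.random.default_rng(0)
emg = rng.normal(0, 1, 60000)                 # 10 min at 100 Hz
nrem = np.ones(emg.size, dtype=bool)
for start in (5000, 5400, 20000, 20300, 20900):
    emg[start:start + 50] += 8.0              # injected bursts

onsets = detect_events(emg, nrem)
intervals = np.diff(onsets)
short = np.mean(intervals < 10.0) if intervals.size else 0.0
print(f"{onsets.size} events; fraction of intervals < 10 s: {short:.2f}")
```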