942 results for Sufism--Terminology


Relevance:

10.00%

Publisher:

Abstract:

Despite years of effort in building organisational taxonomies, the potential of ontologies to support knowledge management in complex technical domains remains under-exploited. The authors of this chapter present an approach to using rich domain ontologies to support sense-making tasks associated with resolving mechanical issues. Using Semantic Web technologies, the authors have built a framework and a suite of tools that support the whole semantic knowledge lifecycle. These are presented by describing the process of issue resolution for a simulated investigation into the failure of bicycle brakes. Foci of the work have included ensuring that semantic tasks fit in with users' everyday tasks, to achieve user acceptability, and supporting the flexibility required by communities of practice with differing local sub-domains, tasks and terminology.

Relevance:

10.00%

Publisher:

Abstract:

Background - Limiting the amount of alcohol in children's medicines is advisable, but as alcohol is the second most common solvent used in liquid preparations, paediatric patients with increased medication intake may be exposed to considerable alcohol intake. Few medicines are specifically designed for children in Paediatric Intensive Care (PICU), and therefore adult formulations are frequently administered, with high medication use further exposing a PICU patient to undesired alcohol intake. Aims - This small pilot study aimed to examine the alcohol intake of a sample of PICU patients, highlight common medicines used on PICU that contain alcohol, provide alternatives where possible and, where alternatives are not possible, provide the prescriber with a list of the higher alcohol-containing medicines. Method - A retrospective medication chart review was undertaken as a two-point snapshot. Data collected included age, weight, medications prescribed and the formulations used at the time of the study. The patients' sedation score was recorded. The electronic medicines compendium (EMC) was consulted for the ethanol content of the commercially available products. The manufacturer was contacted for the ethanol content of all ‘specials’ and of any commercial products found from the EMC to contain ethanol. Each PICU patient's daily intake of ethanol was calculated and converted to an adult-equivalent alcohol unit intake; although this method of conversion is crude and does not take the physiological differences between adults and children into account, it was done in order to provide the clinician with commonly used terminology for deciding the risk to the patient. Results - Twenty-eight patients were prescribed a range of 69 different medications. Of the 69 medicines, 12 products were found to contain ethanol. Patient ages ranged from a 26-week premature infant to 15 years old, and weights ranged from 0.7 kg to 45 kg.
Only 2 of the 28 patients did not receive ethanol-containing medications, and most patients were prescribed at least two medicines containing ethanol. Daily ethanol intake, uncorrected for weight, ranged from 0.006 ml to 2.18 ml (median 0.26 ml). Converted to adult units per week, alcohol intake ranged from 0.07 to 15.2 units (median 1.4 units). The two patients receiving above 15 units/week adult equivalent were prescribed an oral morphine weaning regimen, so the high alcohol exposure was short term. The most commonly prescribed drugs containing alcohol were nystatin, ranitidine, furosemide and morphine. No commercially available alcohol-free oral liquid preparations were found for ranitidine, furosemide or morphine at the time of the study. Correlation of the sedation score against ethanol intake was difficult to analyse, as most patients were actively sedated. Conclusions - Polypharmacy in PICU patients increases exposure to alcohol. Some commercially available medicines provide excessive ethanol intake, presenting the clinician with ethical, and potentially economic, dilemmas in prescribing an unlicensed medicine to minimise ethanol exposure. Further research is required to evaluate the scope of the problem, the effects of exposure and the provision of alcohol-free formulations.

Relevance:

10.00%

Publisher:

Abstract:

Uncertainty can be defined as the difference between the information represented in an executing system and the information that is both measurable and available about the system at a certain point in its lifetime. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce suitable terminology for models, runtime models and uncertainty, and present a state-of-the-art summary of model-based techniques for addressing uncertainty at both development time and runtime. Using a case study about robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and by identifying closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
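The MAPE-K loop mentioned in this abstract (Monitor, Analyze, Plan, Execute over a shared Knowledge base) can be sketched minimally as follows. This is an illustrative Python sketch, not code from the chapter; the class and attribute names (RuntimeModel, battery, return_to_dock) are assumptions chosen to echo the robot case study, with the runtime model playing the role of the knowledge base K:

```python
class RuntimeModel:
    """Knowledge base (the K in MAPE-K): abstracts the running system's state."""
    def __init__(self):
        self.state = {"battery": 1.0}

    def update(self, readings):
        self.state.update(readings)

def monitor(sensors, model):
    # Monitor: push fresh sensor readings into the runtime model.
    model.update(sensors())

def analyze(model):
    # Analyze: detect deviation between monitored state and stakeholder needs.
    return model.state["battery"] < 0.2

def plan(needs_adaptation):
    # Plan: choose adaptation actions when analysis flags a problem.
    return ["return_to_dock"] if needs_adaptation else []

def execute(actions, actuator):
    # Execute: apply the planned actions to the managed system.
    for action in actions:
        actuator(action)

def mape_k_step(sensors, actuator, model):
    # One iteration of the loop; all four steps share the runtime model.
    monitor(sensors, model)
    execute(plan(analyze(model)), actuator)
```

In this sketch, a robot reporting a low battery reading would be steered back to its dock on the next loop iteration; augmenting the runtime model with design-time information (e.g. expected battery drain) is where the chapter's proposed extensions would slot in.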

Relevance:

10.00%

Publisher:

Abstract:

This is the first of two linked papers exploring decision making in nursing which integrate research evidence from different clinical and academic disciplines. Currently there are many decision-making theories, each with their own distinctive concepts and terminology, and there is a tendency for separate disciplines to view their own decision-making processes as unique. Identifying good nursing decisions and where improvements can be made is therefore problematic, and this can undermine clinical and organizational effectiveness, as well as nurses' professional status. Within the unifying framework of psychological classification, the overall aim of the two papers is to clarify and compare terms, concepts and processes identified in a diversity of decision-making theories, and to demonstrate their underlying similarities. It is argued that the range of explanations used across disciplines can usefully be re-conceptualized as classification behaviour. This paper explores problems arising from multiple theories of decision making being applied to separate clinical disciplines. Attention is given to detrimental effects on nursing practice within the context of multidisciplinary health-care organizations and the changing role of nurses. The different theories are outlined and difficulties in applying them to nursing decisions highlighted. An alternative approach based on a general model of classification is then presented in detail to introduce its terminology and the unifying framework for interpreting all types of decisions. The classification model is used to provide the context for relating alternative philosophical approaches and to define decision-making activities common to all clinical domains. This may benefit nurses by improving multidisciplinary collaboration and weakening clinical elitism.

Relevance:

10.00%

Publisher:

Abstract:

This is the second of two linked papers exploring decision making in nursing. The first paper, 'Classifying clinical decision making: a unifying approach' investigated difficulties with applying a range of decision-making theories to nursing practice. This is due to the diversity of terminology and theoretical concepts used, which militate against nurses being able to compare the outcomes of decisions analysed within different frameworks. It is therefore problematic for nurses to assess how good their decisions are, and where improvements can be made. However, despite the range of nomenclature, it was argued that there are underlying similarities between all theories of decision processes and that these should be exposed through integration within a single explanatory framework. A proposed solution was to use a general model of psychological classification to clarify and compare terms, concepts and processes identified across the different theories. The unifying framework of classification was described and this paper operationalizes it to demonstrate how different approaches to clinical decision making can be re-interpreted as classification behaviour. Particular attention is focused on classification in nursing, and on re-evaluating heuristic reasoning, which has been particularly prone to theoretical and terminological confusion. Demonstrating similarities in how different disciplines make decisions should promote improved multidisciplinary collaboration and a weakening of clinical elitism, thereby enhancing organizational effectiveness in health care and nurses' professional status. This is particularly important as nurses' roles continue to expand to embrace elements of managerial, medical and therapeutic work. Analysing nurses' decisions as classification behaviour will also enhance clinical effectiveness, and assist in making nurses' expertise more visible. 
In addition, the classification framework explodes the myth that intuition, traditionally associated with nurses' decision making, is less rational and scientific than other approaches.

Relevance:

10.00%

Publisher:

Abstract:

Interest in bioenergy as a viable alternative to fossil fuels is increasing. This emergent sector is subject to a range of ambitious initiatives promoted by national governments to generate energy from renewable sources. The transition to energy production from biomass still lacks a feasible infrastructure, particularly from a supply chain and business perspective. Supply chain integration has not been studied widely, leaving a deficit in the literature and in practice. This paper presents results from a pilot study designed to identify attributes that help optimise such supply chains. To address this challenge it is important to identify the characteristics that integrate bioenergy supply chains and to ascertain whether they are distinct from those found in conventional energy models. In general terms, a supply chain is bounded upstream by the point of origin of raw materials and downstream by the point of distribution to the final customer. It remains to be seen whether this holds for bioenergy supply chains, as there is an imbalance between knowledge and practice, even in understanding the terminology. The initial pilot study results presented in the paper help in understanding the gap between general supply chain knowledge and what is practised within bioenergy organisations. © 2014 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Background: 'Neuromarketing' is a term that has often been used in the media in recent years. These public discussions have generally centered on potential ethical aspects and the public fear of negative consequences for society in general, and consumers in particular. However, positive contributions to the scientific discourse from developing a biological model that tries to explain context-situated human behavior such as consumption have often been neglected. We argue for a differentiated terminology, naming commercial applications of neuroscientific methods 'neuromarketing' and scientific ones 'consumer neuroscience'. While marketing scholars have eagerly integrated neuroscientific evidence into their theoretical framework, neurology has only recently started to draw its attention to the results of consumer neuroscience. Discussion: In this paper we address key research topics of consumer neuroscience that we think are of interest to neurologists, namely the reward system, trust and ethical issues. We argue that there are overlapping research topics in neurology and consumer neuroscience where both sides can profit from collaboration. Further, neurologists joining the public discussion of ethical issues surrounding neuromarketing and consumer neuroscience could contribute standards and experience gained in clinical research. Summary: We identify the following areas where consumer neuroscience could contribute to the field of neurology. First, studies using game paradigms could help to gain further insights into the underlying pathophysiology of pathological gambling in Parkinson's disease, frontotemporal dementia, epilepsy and Huntington's disease. Second, we identify compulsive buying as a common interest in neurology and consumer neuroscience. Paradigms commonly used in consumer neuroscience could be applied to patients suffering from Parkinson's disease and frontotemporal dementia to advance knowledge of this important behavioral symptom. Third, trust research in the medical context lacks empirical behavioral and neuroscientific evidence. Neurologists entering this field of research could profit from the extensive knowledge of the biological foundation of trust that scientists in economically oriented neurosciences have gained. Fourth, neurologists could contribute significantly to the ethical debate about invasive methods in neuromarketing and consumer neuroscience. Further, neurologists should investigate biological and behavioral reactions of neurological patients to marketing and advertising measures, as such patients could show special consumer vulnerability and be subject to target marketing. © 2013 Javor et al.; licensee BioMed Central Ltd.

Relevance:

10.00%

Publisher:

Abstract:

Networked Learning, e-Learning and Technology Enhanced Learning have each been defined in different ways, as people's understanding of technology in education has developed. Yet each could also be considered a terminology competing for a contested conceptual space. Theoretically this can be a ‘fertile trans-disciplinary ground for represented disciplines to affect and potentially be re-orientated by others’ (Parchoma and Keefer, 2012), as differing perspectives on terminology and subject disciplines yield new understandings. Yet when used in government policy texts to describe connections between humans, learning and technology, terms tend to become fixed in less fertile positions linguistically. A deceptively spacious policy discourse that suggests people are free to make choices conceals an economically-based assumption that implementing new technologies, in themselves, determines learning. It actually narrows the choices open to people, as one route is repeatedly in the foreground and humans are not visibly involved in it. An impression that the effective use of technology for endless improvement is inevitable cuts off critical social interactions and new knowledge for multiple understandings of technology in people's lives. This paper explores some findings from a corpus-based Critical Discourse Analysis of UK policy for educational technology during the last 15 years, to help to illuminate the choices made. This is important when, through political economy, hierarchical or dominant neoliberal logic promotes a single ‘universal model’ of technology in education, without reference to a wider social context (Rustin, 2013). Discourse matters, because it can ‘mould identities’ (Massey, 2013) in narrow, objective economically-based terms which 'colonise discourses of democracy and student-centredness' (Greener and Perriton, 2005:67).
This undermines subjective social, political, material and relational (Jones, 2012: 3) contexts for those learning when humans are omitted. Critically confronting these structures is not considered a negative activity. Whilst deterministic discourse for educational technology may leave people unconsciously restricted, I argue that, through a close analysis, it offers a deceptively spacious theoretical tool for debate about the wider social and economic context of educational technology. Methodologically it provides insights about ways technology, language and learning intersect across disciplinary borders (Giroux, 1992), as powerful, mutually constitutive elements, ever-present in networked learning situations. In sharing a replicable approach for linguistic analysis of policy discourse I hope to contribute to visions others have for a broader theoretical underpinning for educational technology, as a developing field of networked knowledge and research (Conole and Oliver, 2002; Andrews, 2011).

Relevance:

10.00%

Publisher:

Abstract:

A systematized analysis of trends towards integration and hybridization in contemporary expert systems is conducted, and a particular class of applied expert systems, integrated expert systems, is considered. For this purpose, the terminology, classification and models proposed by the author are employed. As examples of integrated expert systems, Russian systems designed in this field and available to the majority of specialists are analyzed.

Relevance:

10.00%

Publisher:

Abstract:

Multiscale systems that are characterized by a great range of spatial–temporal scales arise widely in many scientific domains. These range from the study of protein conformational dynamics to multiphase processes in, for example, granular media or haemodynamics, and from nuclear reactor physics to astrophysics. Despite the diversity in subject areas and terminology, there are many common challenges in multiscale modelling, including validation and design of tools for programming and executing multiscale simulations. This Theme Issue seeks to establish common frameworks for theoretical modelling, computing and validation, and to help practical applications to benefit from the modelling results. This Theme Issue has been inspired by discussions held during two recent workshops in 2013: ‘Multiscale modelling and simulation’ at the Lorentz Center, Leiden (http://www.lorentzcenter.nl/lc/web/2013/569/info.php3?wsid=569&venue=Snellius), and ‘Multiscale systems: linking quantum chemistry, molecular dynamics and microfluidic hydrodynamics’ at the Royal Society Kavli Centre. The objective of both meetings was to identify common approaches for dealing with multiscale problems across different applications in fluid and soft matter systems. This was achieved by bringing together experts from several diverse communities.

Relevance:

10.00%

Publisher:

Abstract:

Ironically, the “learning of percent” is one of the most problematic aspects of school mathematics. In our view, these difficulties are not associated with the arithmetic aspects of “percent problems”, but mostly with two methodological issues: firstly, providing students with a simple and accurate understanding of the rationale behind the use of percent, and secondly, overcoming the psychological complexities of a fluent and comprehensive understanding by the students of the sometimes specific wordings of “percent problems”. Before we talk about percent, it is necessary to acquaint students with the much more fundamental and important (regrettably, not covered by the school syllabus) classical concepts of quantitative and qualitative comparison of values, to give students the opportunity to learn the relevant standard terminology and become accustomed to conventional turns of speech. Further, it makes sense to briefly touch on the issue (important in its own right) of different representations of numbers. Percent is just one of the technical, but common, forms of data representation: p% = p × % = p × 0.01 = p × 1/100 = p/100 = p × 10^(-2). “Percent problems” involve just two cases: I. the ratio of a variation m to the standard M; II. the relative deviation of a variation m from the standard M. The hardest and most essential part of each specific “percent problem” is not the routine arithmetic involved, but the ability to figure out, and clearly understand, which of the variables in the problem statement is the standard and which is the variation. This, in the first place, is what teachers need to patiently and persistently teach their students. As a matter of fact, most primary school pupils are not yet quite ready for the lexical specificity of “percent problems”. Math teachers should closely, hand in hand with their students, carry out a linguistic analysis of the wording of each problem. Schoolchildren must firmly understand that a comparison of objects is only meaningful when we speak about properties which can be objectively expressed in terms of actual numerical characteristics. In our opinion, an adequate acquisition of the teaching unit on percent cannot be achieved in primary school, due to objective psychological specificities related to this age and the level of general training of students. Yet, if we want to make this topic truly accessible and practically useful, it should be taught in high school. A final question to the reader (quickly, please): which is greater, π% of e or e% of π?

Relevance:

10.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 41A25, 41A27, 41A36.

Relevance:

10.00%

Publisher:

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we first give an overview of the French heritage project PATRIMA, launched in 2011 as one of the Projets d'investissement pour l'avenir, a French funding programme intended to run for the next ten years. The overall purpose of the PATRIMA project is to promote and fund research on various aspects of heritage presentation and preservation. Such research being interdisciplinary, research groups in history, physics, chemistry, biology and computer science are involved in the project. The PATRIMA consortium involves research groups from universities and from the main museums and cultural heritage institutions in and around Paris. More specifically, the main members of the consortium are the two universities of Cergy-Pontoise and Versailles Saint-Quentin and the following famous museums and cultural institutions: Musée du Louvre, Château de Versailles, Bibliothèque nationale de France, Musée du Quai Branly and Musée Rodin. In the second part of the paper, we focus on two projects funded by PATRIMA, named EDOP and Parcours, which deal with data integration. The goal of the EDOP project is to provide users with a data space for the integration of heterogeneous information about heritage; Linked Open Data are considered for effective access to the corresponding data sources. The Parcours project, on the other hand, aims at building an ontology of the terminology of restoration and conservation techniques. Such an ontology is meant to provide a common terminology to researchers using different databases and different vocabularies.

Relevance:

10.00%

Publisher:

Abstract:

This paper explores the sharing of value in business transactions. Although the terminology of value is increasingly used in marketing (in concepts such as value-based selling and pricing) as well as in purchasing (value-based purchasing), the definition of the term is still vague. To better understand the definition of value, the authors argue, it is important to understand the sharing of value in general, and the role of power in the sharing of value in particular. The aim of this paper is to add to this debate, and this requires us to critique the current models. The key process that the analysis of power will help to explain is the division of the available revenue stream flowing up the chain from the buyer's customers. If the buyer and supplier do not cooperate, then power will be key in the sharing of that money flow. If buyers and suppliers fully cooperate, they may be able to reduce their costs and/or increase the quality of the sales offering the buyer makes to their customer.