Abstract:
The Herglotz problem is a generalization of the fundamental problem of the calculus of variations. In this paper, we consider it for a class of non-differentiable functions, where the dynamics are described by a scale derivative. Necessary conditions are derived that determine the optimal solution of the problem. Several related problems are also considered: transversality conditions, the multi-dimensional case, higher-order derivatives, and the case of several independent variables.
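For context, the classical (differentiable) Herglotz problem that the abstract generalizes can be stated as follows; this is the standard textbook formulation for a generic Lagrangian L, not the scale-derivative version developed in the paper.

```latex
% Classical Herglotz problem: extremize the terminal value z(b), where z is
% defined by a differential equation rather than by an integral functional.
\[
  \text{extremize } z(b), \qquad
  \dot z(t) = L\bigl(t, x(t), \dot x(t), z(t)\bigr), \qquad z(a) = z_a .
\]
% Necessary optimality condition: the generalized Euler--Lagrange equation of
% Herglotz,
\[
  \frac{\partial L}{\partial x}
  - \frac{d}{dt}\frac{\partial L}{\partial \dot x}
  + \frac{\partial L}{\partial z}\,\frac{\partial L}{\partial \dot x} = 0 ,
\]
% which reduces to the classical Euler--Lagrange equation when L does not
% depend on z, i.e. to the fundamental problem of the calculus of variations.
```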
Abstract:
The thesis is an investigation of the principle of least effort (Zipf 1949 [1972]). The principle is simple (all effort should be least) and universal (it governs the totality of human behavior). Since the principle is also functional, the thesis adopts a functional theory of language as its theoretical framework, namely Natural Linguistics. The explanatory system of Natural Linguistics posits that higher principles govern preferences, which, in turn, manifest themselves as concrete, specific processes in a given language. Therefore, the aim of the thesis is to investigate the principle of least effort on the basis of external evidence from English. The investigation falls into the following three strands: the investigation of the principle itself, the investigation of its application to articulatory effort, and the investigation of its application to phonological processes. The structure of the thesis reflects the division of its broad aims. The first part of the thesis presents its theoretical background (Chapters One and Two), the second part deals with the application of least effort to articulatory effort (Chapters Three and Four), and the third part discusses the principle of least effort in phonological processes (Chapters Five and Six). Chapter One serves as an introduction, examining various aspects of the principle of least effort such as its history, literature, operation and motivation. It overviews the various names which denote least effort, explains the origins of the principle and reviews the literature devoted to the principle of least effort in chronological order. The chapter also discusses the nature and operation of the principle, providing numerous examples of the principle at work. It emphasizes the universal character of the principle with examples from the linguistic field (low-level phonetic processes and language universals) and from non-linguistic fields (physics, biology, psychology and cognitive science), proving that the principle governs human behavior and choices. Chapter Two provides the theoretical background of the thesis in terms of its theoretical framework and discusses the terms used in the thesis' title, i.e. hierarchy and preference. It justifies the selection of Natural Linguistics as the theoretical framework by outlining its major assumptions and demonstrating its explanatory power. As far as the concepts of hierarchy and preference are concerned, the chapter provides their definitions and reviews their various understandings in decision theories and in linguistic preference-based theories. Since the thesis investigates the principle of least effort in language and speech, Chapter Three considers the articulatory aspect of effort. It reviews the notion of easy and difficult sounds and discusses the concept of articulatory effort, surveying its literature and its various understandings in chronological fashion. The chapter also presents the concept of articulatory gestures within the framework of Articulatory Phonology. Because the aim of the thesis is to investigate the principle of least effort on the basis of external evidence, Chapters Four and Six provide that evidence in the form of three experiments and text-message studies (Chapter Four) and phonological processes in English (Chapter Six). Chapter Four contains evidence for the principle of least effort in articulation on the basis of the experiments. It describes the experiments in terms of their predictions and methodology.
In particular, it discusses the adopted measure of effort, established by means of the effort parameters, as well as their status. The statistical methods of the experiments are also clarified. The chapter reports the results of the experiments, presenting them graphically, and discusses their relation to the tested predictions. Chapter Four establishes a hierarchy of speakers' preferences with reference to articulatory effort (Figures 30, 31). The thesis investigates the principle of least effort in phonological processes; thus, Chapter Five is devoted to the discussion of phonological processes in Natural Phonology. The chapter explains the general nature and motivation of processes as well as the development of processes in child language. It also discusses the organization of processes in terms of their typology as well as the order in which processes apply. The chapter characterizes the semantic properties of processes and overviews Luschützky's (1997) contribution to Natural Phonology with respect to processes, in terms of their typology and the incorporation of articulatory gestures into the concept of a process. Chapter Six investigates phonological processes. In particular, it addresses the issues of lenition/fortition definition and process typology by presenting the current approaches to process definitions and their typology. Since the chapter concludes that no coherent definition of lenition/fortition exists, it develops alternative lenition/fortition definitions. The chapter also revises the typology of phonological processes under effort management, which is an extended version of the principle of least effort. Chapter Seven concludes the thesis with a list of the concepts discussed, enumerates the proposals made in discussing those concepts, and presents some questions for future research which emerged in the course of the investigation. The chapter also specifies the extent to which the investigation of the principle of least effort is a meaningful contribution to phonology.
Abstract:
Background: Falls are common events in older people and cause considerable morbidity and mortality. Non-pharmacological interventions are an important approach to preventing falls. There is a large number of systematic reviews of non-pharmacological interventions, whose evidence needs to be synthesized in order to facilitate evidence-based clinical decision making. Objectives: To systematically examine reviews and meta-analyses that evaluated non-pharmacological interventions to prevent falls in older adults in the community, care facilities and hospitals. Methods: We searched the electronic databases PubMed, the Cochrane Database of Systematic Reviews, EMBASE, CINAHL, PsycINFO, PEDro and TRIP from January 2009 to March 2015 for systematic reviews that included at least one comparative study evaluating any non-pharmacological intervention to prevent falls amongst older adults. The quality of the reviews was assessed using AMSTAR, and the ProFaNE taxonomy was used to organize the interventions. Results: Fifty-nine systematic reviews were identified, covering single, multiple and multifactorial non-pharmacological interventions to prevent falls in older people. The most frequent ProFaNE-defined interventions were exercises, either alone or combined with other interventions, followed by environment/assistive technology interventions comprising environmental modifications, assistive and protective aids, staff education and vision assessment/correction. Knowledge interventions, in the form of patient education, were the third principal class. Exercise and multifactorial interventions were the most effective treatments to reduce falls in older adults, although not all types of exercise were equally effective in all subjects and in all settings. Effective exercise programs combined balance and strength training. Reviews with a higher AMSTAR score were more likely to contain more primary studies, to be updated and to perform meta-analysis. Conclusions: This overview of reviews of non-pharmacological interventions to prevent falls in older people in different settings aims to support clinicians and other healthcare workers in clinical decision-making by providing a comprehensive perspective of the findings.
Abstract:
With progressive climate change, the preservation of biodiversity is becoming increasingly important. Only if the gene pool is large enough and the requirements of species are sufficiently diverse will there be species that can adapt to the changing circumstances. To maintain biodiversity, we must understand the consequences of the various strategies. Mathematical models of population dynamics could provide prognoses. However, a model that reproduces and explains the mechanisms behind the diversity of species that we observe experimentally and in nature is still needed. A combination of theoretical models with detailed experiments is required to test biological processes in models and to compare predictions with outcomes in reality. In this thesis, several food webs are modeled and analyzed. Among others, models are formulated of laboratory experiments performed at the Zoological Institute of the University of Cologne. Numerical data from the simulations are in good agreement with the experimental results. Numerical simulations demonstrate that few assumptions are necessary to reproduce in a model the sustained oscillations of population size that the experiments show. However, the analysis indicates that species "thrown together by chance" are not very likely to survive together over long periods. Even larger food webs do not show significantly different outcomes, underlining how extraordinary and complicated natural diversity is. In order to produce such a coexistence of randomly selected species, as observed in the experiments, models require additional information about biological processes or restrictions on the assumptions. Another explanation for the observed coexistence is a slow extinction that takes longer than the observation time; simulated species survive a comparable period of time before they eventually die out. Interestingly, the same models allow the survival of several species in equilibrium and thus do not follow the so-called competitive exclusion principle. This state of equilibrium is, however, more fragile to changes in nutrient supply than the oscillating coexistence. Overall, the studies show that a diverse system probably has oscillating population numbers and, conversely, that oscillating population numbers stabilize a food web both against demographic noise and against changes of the habitat. Model predictions certainly cannot be converted at face value into policies for real ecosystems, but the stabilizing character of fluctuations should be considered in the regulation of animal populations.
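As a purely illustrative sketch of the kind of model described above (not the thesis's actual food-web models or the Cologne experimental system), the minimal predator-prey simulation below, in Rosenzweig-MacArthur form, shows how a few standard assumptions already yield sustained population oscillations; all parameter values are arbitrary choices made only to land in the oscillatory regime.

```python
# Minimal predator-prey sketch (Rosenzweig-MacArthur form) producing sustained
# population oscillations; parameters are illustrative assumptions, not values
# from the thesis or the experiments it models.
import numpy as np
from scipy.integrate import solve_ivp

def rosenzweig_macarthur(t, y, r=1.0, K=3.0, a=1.0, h=1.0, e=0.5, m=0.2):
    """Prey n grows logistically; predator p feeds via a Holling type-II response."""
    n, p = y
    feeding = a * n / (1.0 + a * h * n)       # per-predator consumption rate
    dn = r * n * (1.0 - n / K) - feeding * p  # prey dynamics
    dp = e * feeding * p - m * p              # predator conversion minus mortality
    return [dn, dp]

# Integrate long enough for transients to die out; the parameters above are
# chosen so the coexistence equilibrium is unstable and a limit cycle appears.
sol = solve_ivp(rosenzweig_macarthur, (0.0, 300.0), [1.0, 0.5],
                dense_output=True, max_step=0.1)

# Inspect the tail of the trajectory: the amplitudes no longer shrink, i.e. the
# populations settle on sustained oscillations rather than a fixed point.
t_tail = np.linspace(250.0, 300.0, 500)
n_tail, p_tail = sol.sol(t_tail)
print("prey range:    ", n_tail.min(), "-", n_tail.max())
print("predator range:", p_tail.min(), "-", p_tail.max())
```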
Abstract:
Practice placement education has been recognised as an integral and critical component of the training of occupational therapy students. Although there is an extensive body of literature on clinical education and traditional practice placement education models, there has been limited research on alternative placements.

This paper reviews the literature on various practice placement education models and presents a contemporary view on how it is currently delivered. The literature is examined with a particular focus on the increasing range of practice placement education opportunities, such as project and role-emerging placements. The drivers for non-traditional practice placement education include shortages of traditional placement options, health reform and changing work practices, potential for role development and influence on practice choice. The benefits and challenges of non-traditional practice placement education are discussed, including supervision issues, student evaluation, professional and personal development and the opportunity to practise clinical skills.

Further research is recommended to investigate occupational therapy graduates' perceptions of role-emerging and project placements in order to identify the benefits or otherwise of these placements and to contribute to the limited body of knowledge of emerging education opportunities.
Abstract:
Extended-spectrum β-lactamases (ESBLs), which are derived from non-ESBL precursors by point mutation of β-lactamase genes (bla), are spreading rapidly all over the world and have caused considerable problems in the treatment of infections caused by the bacteria which harbour them. The mechanism of this resistance is not fully understood, and a better understanding of these mechanisms might significantly impact the choice of proper diagnostic and treatment strategies. Previous work on the SHV β-lactamase gene, blaSHV, has shown that only Klebsiella pneumoniae strains which contain plasmid-borne blaSHV are able to mutate to phenotypically ESBL-positive strains, and there was also evidence of an increase in blaSHV copy number. Therefore, it was hypothesised that although a specific point mutation is essential for acquisition of ESBL activity, it is not sufficient on its own: blaSHV copy-number amplification is also essential for an ESBL-positive phenotype, with homologous recombination being the likely mechanism of blaSHV copy-number expansion. In this study, we investigated the mutation rate of non-ESBL-expressing K. pneumoniae isolates to an ESBL-positive status using the MSS maximum-likelihood method. Our data showed that the blaSHV mutation rate of a non-ESBL-expressing isolate is lower than the mutation rate of other single-base changes on the chromosome, even with a plasmid-borne blaSHV gene. In contrast, the mutation rate from a low-MIC ESBL-positive phenotype (≤ 8 µg/mL for cefotaxime) to a high-MIC ESBL-positive phenotype (≥ 16 µg/mL for cefotaxime) is very high, because only an increase in gene copy number is needed, which is probably mediated by homologous recombination and typically takes place at much higher frequencies than point mutation. Using a subinhibitory concentration of novobiocin as a homologous recombination inhibitor revealed that this is indeed the case.
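For readers unfamiliar with the MSS (Ma-Sandri-Sarkar) maximum-likelihood method mentioned above, the sketch below estimates the expected number of mutations per culture, m, from fluctuation-assay mutant counts using the standard Luria-Delbrück (Lea-Coulson) recursion, and converts it to an approximate mutation rate; the mutant counts and final population size are made-up placeholders, not data from this study.

```python
# Sketch of the MSS (Ma-Sandri-Sarkar) maximum-likelihood estimator used in
# fluctuation analysis. Mutant counts and final cell number are placeholders.
import numpy as np
from scipy.optimize import minimize_scalar

def luria_delbruck_pmf(m, r_max):
    """P(r mutants | m expected mutations), via the MSS/Lea-Coulson recursion."""
    p = np.zeros(r_max + 1)
    p[0] = np.exp(-m)
    for r in range(1, r_max + 1):
        # p_r = (m / r) * sum_{i=0}^{r-1} p_i / (r - i + 1)
        p[r] = (m / r) * np.sum(p[:r] / (r - np.arange(r) + 1))
    return p

def neg_log_likelihood(m, counts):
    p = luria_delbruck_pmf(m, int(max(counts)))
    return -np.sum(np.log(p[counts] + 1e-300))  # guard against log(0)

# Placeholder fluctuation-assay data: mutant colonies per parallel culture,
# and the final number of cells per culture.
counts = np.array([0, 1, 0, 3, 0, 7, 1, 0, 2, 25, 0, 4, 1, 0, 9])
n_final = 2e8

fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0),
                      args=(counts,), method="bounded")
m_hat = fit.x
print(f"estimated mutations per culture m = {m_hat:.3f}")
print(f"mutation rate per cell per generation ~ {m_hat / n_final:.2e}")
```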
Abstract:
This article is an abbreviated version of a debate between two economists holding somewhat different perspectives on the nature of non-market production in the space of new digital media. While the ostensible focus here is on the role of markets in the innovation of new technologies to create new economic value, this context also serves to highlight the private and public value of digital literacy.
Abstract:
Patent systems around the world are being pressed to recognise and protect challengingly new and exciting subject matter in order to keep pace with the rapid technological advancement of our age and the fact that we are moving into the era of the ‘knowledge economy’. This rapid development, and the pressure to expand the bounds of what has traditionally been recognised as patentable subject matter, have created uncertainty regarding what it is that the patent system is actually supposed to protect. Among other things, the patent system has had to contend with uncertainty surrounding claims to horticultural and agricultural methods, artificial living micro-organisms, methods of treating the human body, computer software and business methods. The contentious issue of the moment is one at whose heart lies the important distinction between what is a mere abstract idea and what is properly an invention deserving of the monopoly protection afforded by a patent. That question is whether purely intangible inventions, being methods that do not involve a physical aspect or effect or cause a physical transformation of matter, constitute patentable subject matter. This paper goes some way towards addressing these uncertainties by considering how the Australian approach to the question can be informed by developments in the United States of America, and by canvassing some of the lessons Australia might learn from the approaches taken thus far in the United States.