970 results for Basic Research, Methodology
Abstract:
Workshops are an important part of the IFPA annual meeting, as they allow for discussion of specialized topics. At the 2011 IFPA meeting there were twelve themed workshops, four of which are summarized in this report. These workshops covered both basic science and clinical research into placental growth and nutrient sensing and were divided into: 1) placenta: predicting future health; 2) roles of lipids in the growth and development of the feto-placental unit; 3) placental nutrient sensing; 4) placental research to solve clinical problems: a translational approach.
Abstract:
In ecological economics the terms sustainable development and transdisciplinarity are closely related. This close relation arises because research for sustainable development has to be issue oriented and reflect the diversity, complexity and dynamics of the processes involved, as well as their variability between specific problem situations. Furthermore, the knowledge of the people involved, and their needs and interests at stake, have to be taken into account. There are three basic and interrelated questions about the issues to be addressed in sustainability research: (1) In what way do processes constitute a problem field, and where are the needs for change? (2) What are more sustainable practices? (3) How can existing practices be transformed? To treat these questions properly, transdisciplinary research is needed. The emergence of transdisciplinary research in the North and the South is described. By distinguishing analytically among basic, applied and transdisciplinary research, the challenges that have to be tackled in transdisciplinary projects are analyzed.
Abstract:
While scientific research and the methodologies involved have gone through substantial technological evolution, the technology involved in publishing the results of these endeavors has remained relatively stagnant. Publication is largely done in the same manner today as it was fifty years ago. Many journals have adopted electronic formats; however, their orientation and style differ little from a printed document. The documents tend to be static and take little advantage of the computational resources that might be available. Recent work (Gentleman and Temple Lang, 2004) suggests a methodology and basic infrastructure that can be used to publish documents in a substantially different way. Their approach is suitable for the publication of papers whose message relies on computation. Stated quite simply, Gentleman and Temple Lang propose a paradigm where documents are mixtures of code and text. Such documents may be self-contained, or they may be a component of a compendium which provides the infrastructure needed to give access to data and supporting software. These documents, or compendiums, can be processed in a number of different ways. One transformation is to replace the code with its output, thereby producing the familiar, but limited, static document. In this paper we apply these concepts to a seminal paper in bioinformatics, The Molecular Classification of Cancer (Golub et al., 1999). The authors of that paper have generously provided data and other information that have allowed us to largely reproduce their results. Rather than reproduce the paper exactly, we demonstrate that such a reproduction is possible and concentrate instead on demonstrating the usefulness of the compendium concept itself.
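The transformation the abstract describes, replacing embedded code with its output to obtain a static document, can be sketched in a few lines. The `<<...>>` chunk syntax and the `weave` helper below are illustrative assumptions, not the actual compendium machinery of Gentleman and Temple Lang:

```python
import contextlib
import io
import re

def weave(doc: str) -> str:
    """Replace each <<...>> code chunk with the output of running it,
    turning a dynamic code-and-text document into a static one."""
    def run(match):
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            exec(match.group(1), {})       # run the chunk in a fresh namespace
        return buf.getvalue().rstrip("\n")  # splice its printed output back in
    return re.sub(r"<<(.*?)>>", run, doc, flags=re.DOTALL)

doc = "The mean expression value is <<print((2 + 4 + 9) / 3)>>."
print(weave(doc))  # → The mean expression value is 5.0.
```

A real compendium would additionally bundle the data and supporting software the chunks depend on, so that the same source can be re-woven by any reader.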
Abstract:
A recent article in this journal (Ioannidis JP (2005) Why most published research findings are false. PLoS Med 2: e124) argued that more than half of published research findings in the medical literature are false. In this commentary, we examine the structure of that argument and show that it has three basic components: 1) an assumption that the prior probability of most hypotheses explored in medical research is below 50%; 2) dichotomization of P-values at the 0.05 level and introduction of a “bias” factor (produced by significance-seeking), the combination of which severely weakens the evidence provided by every design; 3) use of Bayes’ theorem to show that, in the face of weak evidence, hypotheses with low prior probabilities cannot have posterior probabilities over 50%. Thus, the claim is based on a priori assumptions that most tested hypotheses are likely to be false, and the inferential model used then makes it impossible for evidence from any study to overcome this handicap. We focus largely on step (2), explaining how the combination of dichotomization and “bias” dilutes experimental evidence, and showing how this dilution leads inevitably to the stated conclusion. We also demonstrate a fallacy in another important component of the argument: that papers in “hot” fields are more likely to produce false findings. We agree with the paper’s conclusions and recommendations that many medical research findings are less definitive than readers suspect, that P-values are widely misinterpreted, that bias of various forms is widespread, that multiple approaches are needed to prevent the literature from being systematically biased, and that more data are needed on the prevalence of false claims. But calculating the unreliability of the medical research literature, in whole or in part, requires more empirical evidence and different inferential models than were used.
The claim that “most research findings are false for most research designs and for most fields” must be considered as yet unproven.
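The Bayesian arithmetic behind steps (1)-(3) can be made concrete with a small sketch. The numbers below (a 10% prior, power 0.5, α = 0.05, and a diluted likelihood ratio standing in for the “bias” factor) are illustrative assumptions, not values taken from either paper:

```python
def posterior_prob(prior, likelihood_ratio):
    """Posterior probability of a hypothesis via Bayes' theorem in odds form."""
    prior_odds = prior / (1 - prior)
    post_odds = likelihood_ratio * prior_odds
    return post_odds / (1 + post_odds)

# A dichotomized "significant" result with power 0.5 and alpha 0.05 carries
# a likelihood ratio of 0.5 / 0.05 = 10.
lr_clean = 0.5 / 0.05
# Significance-seeking "bias" dilutes the evidence; the diluted value here
# is an assumed illustration of step (2).
lr_biased = 2.0

prior = 0.10  # assumed prior probability that the tested hypothesis is true
print(round(posterior_prob(prior, lr_clean), 3))   # → 0.526
print(round(posterior_prob(prior, lr_biased), 3))  # → 0.182
```

The sketch shows the mechanics of the argument: with the undiluted likelihood ratio the posterior just clears 50%, while the “bias” dilution pins it well below, regardless of what any single study finds.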
Abstract:
Dynamic models for electrophoresis are based upon model equations derived from transport concepts in solution, together with user-specified conditions. They can predict the movement of ions theoretically and are as such the most versatile tool for exploring the fundamentals of electrokinetic separations. Since its inception three decades ago, dynamic computer simulation software and its use have progressed significantly, and Electrophoresis played a pivotal role in that endeavor, as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations, under almost exactly the same conditions used in the laboratory. It has been employed to reveal the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and in any instrumental format, including free-fluid preparative, gel, capillary and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples, and a discussion of the applications and achievements of dynamic simulation.
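A dynamic simulator of this kind integrates the transport equations numerically in time. The following is a minimal sketch, assuming a single ion species, a constant electric field, and periodic boundaries; real simulators couple many species and compute the field self-consistently from the local conductivity, and all parameter values here are illustrative assumptions chosen for numerical stability:

```python
def step(c, D, mu, E, dx, dt):
    """One explicit finite-difference step of the 1-D electromigration-
    diffusion equation  dc/dt = D*d2c/dx2 - mu*E*dc/dx  (periodic grid)."""
    n = len(c)
    out = []
    for i in range(n):
        left, right = c[(i - 1) % n], c[(i + 1) % n]
        diffusion = D * (right - 2 * c[i] + left) / dx**2
        migration = mu * E * (right - left) / (2 * dx)
        out.append(c[i] + dt * (diffusion - migration))
    return out

# Illustrative parameters (assumptions, not from any particular simulator):
dx, dt = 1e-4, 1e-3        # grid spacing (m), time step (s)
D = 1e-9                   # diffusion coefficient (m^2/s)
mu, E = 5e-8, 1e2          # mobility (m^2/(V*s)), field (V/m)

c = [0.0] * 100
c[50] = 1.0                # initial concentration spike
for _ in range(1000):
    c = step(c, D, mu, E, dx, dt)
```

After the run, the spike has spread diffusively while drifting with the field, and total mass is conserved; production codes replace this explicit scheme with stabilized solvers so that steep ITP and IEF boundaries can be resolved.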
Abstract:
Practice is under increasing pressure to demonstrate its ability to achieve the outcomes required by public policy makers. As part of this process, social work practice has to engage with issues around advancing knowledge-based learning processes in close collaboration with education- and research-based perspectives. This has given rise to approaches seeking to combine research methodology, field research and practical experience. Practice research is connected both to “the science of the concrete”, a field of research oriented towards subjects more than objects, and to “mode 2 knowledge production”, an application-oriented mode of research whose frameworks and findings are discussed by a number of partners. Practice research is divided into two approaches: practice research, collaboration between practice and research, and practitioner research, processes controlled and accomplished by practitioners. The basic stakeholders in practice research are social workers, service users, administrators, management, organisations, politicians and researchers. Accordingly, practice research is necessarily collaborative: a meeting point for different views, interests and needs, where complexity and dilemmas are inherent. Instead of attempting to balance or reconcile these differences, it is important to respect them if collaboration is to be established. The strength of both practice and research in practice research lies in addressing these difficult challenges; the danger for both fields lies in avoiding and rejecting them.
Abstract:
Cognitive-perceptive 'basic symptoms' are used as a complement to ultra-high-risk criteria in order to predict the onset of psychosis in the pre-psychotic phase. The aim was to investigate the prevalence of a broad selection of 'basic symptoms' in a representative general adolescent population sample (GPS; N=96) and to compare it with that in adolescents first admitted for early onset psychosis (EOP; N=87) or non-psychotic psychiatric disorders (NP; N=137).
Abstract:
We developed a small version of the Caltech active strand cloud water collector (CASCC) for biogeochemical investigations in ecological applications. The device is battery powered and thus allows operation at locations where mains power is not available. The collector is designed for sampling periods of up to one week, depending on fog frequency. Our new device is equipped with standard sensors for air temperature, relative humidity and wind, and with a low-cost optical sensor measuring horizontal visibility for fog detection. In mountainous areas and at times when clouds are thin, the installation of the visibility sensor proved to be a key issue, which limits the potential to estimate the liquid water content of the sampled fog. Field tests with five devices at three different sites in the Swiss Alps (Niesen) and the Jura Mountains (Lägeren, Switzerland) during two extended summer seasons in 2006 and 2007 showed that in almost all cases it was possible to obtain sample volumes large enough for the examination of the basic inorganic chemistry of the collected cloud water. Collection rates typically varied from 12 to 30 mL h−1. The fog droplet cutoff diameter is ≈6 μm, which is low enough for the collected samples to include all droplet sizes relevant for the liquid water content of typical fog types. From theoretical assumptions about the collection efficiency and theoretical droplet spectra it is possible to estimate the liquid water content of the sampled fog or cloud. Our new fog collector can be constructed and operated at relatively low cost. In combination with chemical and isotopic analyses of the sampled water, this makes it possible to quantify nutrient and pollutant fluxes, as is typically needed in ecosystem biogeochemistry studies.
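The liquid-water-content estimate mentioned above amounts to a simple mass balance: the mass of water collected per unit time, divided by the collection efficiency and the volume of air sampled in that time. The function and all numbers below are illustrative assumptions, not specifications of the mini-CASCC:

```python
def liquid_water_content(rate_ml_per_h, efficiency, airflow_m3_per_h):
    """Estimate fog liquid water content (g of water per m^3 of air)
    from a collector's volumetric sampling rate (1 mL of water ~ 1 g)."""
    return rate_ml_per_h / (efficiency * airflow_m3_per_h)

# Assumed values: 20 mL/h collected, 80% collection efficiency for the
# sampled droplet spectrum, 300 m^3 of air drawn through per hour.
lwc = liquid_water_content(20.0, 0.8, 300.0)
print(round(lwc, 3))  # → 0.083  (g/m^3)
```

The result falls in the range commonly reported for fog, which is why a working visibility sensor matters: without an independent handle on the droplet spectrum, the efficiency term in the denominator remains an assumption.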
Abstract:
This introduction and translation is part of the research project International Constitutional Law. All amendments up to and including the 59th Amendment of 11th July 2012 have been translated and incorporated into a consolidated edition. There have been no further amendments to date (8th October 2013).
Abstract:
Self-Determination Theory (Deci and Ryan in Intrinsic motivation and self-determination in human behavior. Plenum Press, New York, 1985) suggests that certain experiences, such as competence, are equally beneficial to everyone’s well-being (universal hypothesis), whereas Motive Disposition Theory (McClelland in Human motivation. Scott, Foresman, Glenview, IL, 1985) predicts that some people, such as those with a high achievement motive, should benefit particularly from such experiences (matching hypothesis). Existing research on motives as moderators of the relationship between basic need satisfaction and positive outcomes supports both of these seemingly inconsistent views. Focusing on the achievement motive, we sought to resolve this inconsistency by considering the specificity of the outcome variables. When predicting domain-specific well-being and flow, the achievement motive should interact with felt competence. However, when it comes to predicting general well-being and flow, felt competence should unfold its effects without being moderated by the achievement motive. Two studies confirmed these assumptions, indicating that the universal and matching hypotheses are complementary rather than mutually exclusive.
Abstract:
Textbooks across all disciplines are prone to contain errors: grammatical, editorial, factual, or judgemental. The following is an account of one of the possible effects of such errors: how an error becomes entrenched, and even exaggerated, as later textbooks fail to correct the original. The example considered here concerns the origins of one of the most basic and important tools of today's medical research, the randomised controlled trial. It is the result of a systematic study of 26 British, French and German history of medicine textbooks published since 1996.
Abstract:
Introduction: A need exists for baccalaureate-prepared nurses to find and use evidence in practice. Whereas using this evidence in practice may be a master's-level expectation, current practice demands that baccalaureate-prepared nurses acquire a basic understanding of how to use evidence in practice. Nursing students at the senior level have had exposure to critiquing research; however, they have difficulty translating evidence to practice. [See PDF for complete abstract]
Abstract:
Research on pre-service teacher internships has become a dynamic area of investigation in teacher education, one whose growth seems to correspond with increased activity at the institutional level over the past two decades. Introducing or expanding field experiences has been a common strategy in nearly all teacher education programs for the last twenty years, and reforming teacher education with a focus on its practical aspects still ranks near the top of education policy agendas. This article provides an introduction to the research field, addressing five basic issues: (1) precision in the definition of the construct, (2) the main sources of research literature, (3) elaboration of the construct in terms of effects and mediating variables, (4) the methodological challenges of empirical research, and (5) major areas of future research. Emphasis is placed on the often ignored fact that internships elicit both intended and unintended effects, including not only positive but also adverse side effects.