966 results for Methodological problems
Abstract:
A numerical procedure, based on parametric differentiation and an implicit finite-difference scheme, has been developed for a class of problems in boundary-layer theory for saddle-point regions. Here, results are presented for the case of a three-dimensional stagnation-point flow with massive blowing. The method compares very well with other methods for particular cases (zero or small mass blowing). The results emphasize that the present numerical procedure is well suited for the solution of saddle-point flows with massive blowing, which could not be solved by other methods.
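The parametric-differentiation idea underlying such procedures can be illustrated on a toy problem. The sketch below is our own hedged illustration, not the paper's formulation (the function `f` and all names are assumptions): differentiating f(u, s) = 0 with respect to the parameter s gives the ODE du/ds = -f_s/f_u, which is marched from a known solution at s = 0; the paper combines this idea with an implicit finite-difference solver for the boundary-layer equations.

```python
# Illustrative model problem: f(u, s) = u^3 + u - s, with the known
# solution u = 0 at s = 0. We integrate the Davidenko ODE
# du/ds = -f_s / f_u in the parameter s to reach the target value.

def f(u, s):
    return u**3 + u - s

def f_u(u, s):          # partial derivative of f with respect to u
    return 3 * u**2 + 1

def f_s(u, s):          # partial derivative of f with respect to s
    return -1.0

def solve_by_parametric_differentiation(s_target, steps=1000):
    """March the solution u(s) of f(u, s) = 0 from s = 0 to s_target."""
    u, s = 0.0, 0.0     # f(0, 0) = 0 is the known starting solution
    ds = s_target / steps
    for _ in range(steps):
        # Davidenko ODE advanced with a simple explicit Euler step
        u += ds * (-f_s(u, s) / f_u(u, s))
        s += ds
    return u

u = solve_by_parametric_differentiation(2.0)
print(u, f(u, 2.0))   # u ≈ 1, residual ≈ 0, since 1^3 + 1 - 2 = 0
```

The appeal of the approach, as in the abstract, is that the solution is continued smoothly in a parameter (there, the blowing rate), so that strongly blown cases are reached via a chain of easier problems.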
Abstract:
The analysis of sequential data is required in many diverse areas such as telecommunications, stock market analysis, and bioinformatics. A basic problem related to the analysis of sequential data is the sequence segmentation problem. A sequence segmentation is a partition of the sequence into a number of non-overlapping segments that cover all data points, such that each segment is as homogeneous as possible. This problem can be solved optimally using a standard dynamic programming algorithm. In the first part of the thesis, we present a new approximation algorithm for the sequence segmentation problem. This algorithm has a smaller running time than the optimal dynamic programming algorithm, while having a bounded approximation ratio. The basic idea is to divide the input sequence into subsequences, solve the problem optimally in each subsequence, and then appropriately combine the solutions to the subproblems into one final solution. In the second part of the thesis, we study alternative segmentation models that are devised to better fit the data. More specifically, we focus on clustered segmentations and segmentations with rearrangements. While in the standard segmentation of a multidimensional sequence all dimensions share the same segment boundaries, in a clustered segmentation the multidimensional sequence is segmented in such a way that the dimensions are allowed to form clusters. Each cluster of dimensions is then segmented separately. We formally define the problem of clustered segmentation and show experimentally that segmenting sequences using this model leads to solutions with smaller error for the same model cost. Segmentation with rearrangements is a novel variation of the segmentation problem: in addition to partitioning the sequence, we also seek to apply a limited amount of reordering so that the overall representation error is minimized.
We formulate the problem of segmentation with rearrangements and show that it is NP-hard to solve, and even to approximate. We devise effective algorithms for the proposed problem, combining ideas from dynamic programming and from outlier-detection algorithms for sequences. In the final part of the thesis, we discuss the problem of aggregating the results of segmentation algorithms on the same set of data points. In this case, we are interested in producing a partitioning of the data that agrees as much as possible with the input partitions. We show that this problem can be solved optimally in polynomial time using dynamic programming. Furthermore, we show that not all data points are candidates for segment boundaries in the optimal solution.
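The standard dynamic programming algorithm that solves the segmentation problem optimally can be sketched as follows; this is a minimal Python version of the classical O(n²k) squared-error k-segmentation (our own illustration, not code from the thesis):

```python
# Optimal k-segmentation of a 1-D sequence under squared error.
# dp[m][j] is the best cost of splitting the first j points into
# m segments; segment costs come from prefix sums in O(1).

def segment_cost(prefix, prefix_sq, i, j):
    """Squared-error cost of representing points i..j-1 by their mean."""
    n = j - i
    s = prefix[j] - prefix[i]
    sq = prefix_sq[j] - prefix_sq[i]
    return sq - s * s / n

def optimal_segmentation(x, k):
    n = len(x)
    prefix = [0.0] * (n + 1)
    prefix_sq = [0.0] * (n + 1)
    for i, v in enumerate(x):
        prefix[i + 1] = prefix[i] + v
        prefix_sq[i + 1] = prefix_sq[i] + v * v
    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = dp[m - 1][i] + segment_cost(prefix, prefix_sq, i, j)
                if c < dp[m][j]:
                    dp[m][j], cut[m][j] = c, i
    # Recover segment boundaries by backtracking through the cut table.
    bounds, j = [], n
    for m in range(k, 0, -1):
        bounds.append((cut[m][j], j))
        j = cut[m][j]
    return dp[k][n], bounds[::-1]

cost, segs = optimal_segmentation([1, 1, 1, 5, 5, 5, 9, 9], 3)
print(cost, segs)   # 0.0 [(0, 3), (3, 6), (6, 8)]
```

The divide-and-combine approximation described in the abstract would run this exact routine on each subsequence and then combine the partial solutions, trading a bounded loss in quality for a smaller running time.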
Abstract:
This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network, simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are interested not only in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating dominating codes are more appropriate.
This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs: these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
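For context, the minimum dominating set problem behind task (iii) admits a simple greedy approximation with a logarithmic ratio; the sketch below shows the classical centralised greedy rule (our own illustration — the thesis itself studies local, constant-time distributed algorithms and approximation schemes in local graphs instead):

```python
# Greedy set-cover-style approximation for minimum dominating set:
# repeatedly add the vertex whose closed neighbourhood covers the
# largest number of still-uncovered vertices.

def greedy_dominating_set(adj):
    """adj: dict mapping each vertex to the set of its neighbours."""
    uncovered = set(adj)
    dom = set()
    while uncovered:
        # Pick the vertex covering the most still-uncovered vertices.
        v = max(adj, key=lambda u: len(({u} | adj[u]) & uncovered))
        dom.add(v)
        uncovered -= {v} | adj[v]
    return dom

# A star graph: the centre 0 dominates every vertex.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
print(greedy_dominating_set(star))   # {0}
```

Identifying codes and locating dominating codes strengthen this requirement: the chosen set must not merely cover every vertex but also give each vertex a distinct "signature" of nearby code vertices, which is what makes locating an intruder possible.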
Abstract:
The idea of ‘wicked’ problems has made a valuable contribution to recognising the complexity and challenges of contemporary planning. However, some wicked policy problems are further complicated by a significant moral, psychological, religious or cultural dimension. This is particularly the case for problems that possess strong elements of abjection and symbolic pollution and high degrees of psychosocial sensitivity. Because this affects the way these problems are framed and discussed they are also characterised by high levels of verbal proscription. As a result, they are not discussed in the rational and emotion-free way that conventional planning demands and can become obscured or inadequately acknowledged in planning processes. This further contributes to their wickedness and intractability. Through paradigmatic urban planning examples, we argue that placing their unspeakable nature at the forefront of enquiry will enable planners to advocate for a more contextually and culturally situated approach to planning, which accommodates both emotional and embodied talk alongside more technical policy contributions. Re-imagining wicked problems in this way has the potential to enhance policy and plan-making and to disrupt norms, expose their contingency, and open new ways of planning for both the unspeakable and the merely wicked.
Abstract:
The aim of the present study is to analyze Confucian understandings of the Christian doctrine of salvation in order to find the basic problems in the Confucian-Christian dialogue. I will approach the task via a systematic theological analysis of four issues in order to limit the thesis to an appropriate size. They are analyzed in three chapters as follows: 1. The Confucian concept concerning the existence of God. Here I discuss mainly the issue of assimilation of the Christian concept of God to the concepts of Sovereign on High (Shangdi) and Heaven (Tian) in Confucianism. 2. The Confucian understanding of the object of salvation and its status in Christianity. 3. The Confucian understanding of the means of salvation in Christianity. Before beginning this analysis it is necessary to clarify the vast variety of controversies, arguments, ideas, opinions and comments expressed in the name of Confucianism; thus, clear distinctions among different schools of Confucianism are given in chapter 2. In the last chapter I will discuss the results of my research in this study by pointing out the basic problems that will appear in the analysis. The results of the present study provide conclusions in three related areas: the tacit differences in the ways of thinking between Confucians and Christians, the basic problems of the Confucian-Christian dialogue, and the affirmative elements in the dialogue. In addition to a summary, a bibliography and an index, there are also eight appendices, where I have introduced important background information for readers to understand the present study.
Abstract:
This study examines philosophically the main theories and methodological assumptions of the field known as the cognitive science of religion (CSR). The study makes a philosophically informed reconstruction of the methodological principles of the CSR, indicates problems with them, and examines possible solutions to these problems. The study focuses on several different CSR writers, namely, Scott Atran, Justin Barrett, Pascal Boyer and Dan Sperber. CSR theorising is done at the intersection of the cognitive sciences, anthropology and evolutionary psychology. This multidisciplinary nature makes CSR a fertile ground for philosophical considerations coming from philosophy of psychology, philosophy of mind and philosophy of science. The study begins by spelling out the methodological assumptions and auxiliary theories of CSR writers by situating these theories and assumptions in the nexus of existing approaches to religion. The distinctive feature of CSR is its emphasis on information processing: CSR writers claim that contemporary cognitive sciences can inform anthropological theorising about the human mind and offer tools for producing causal explanations. Further, they claim to explain the prevalence and persistence of religion by the cognitive systems that undergird religious thinking. I also examine the core theoretical contributions of the field, focusing mainly on (1) the “minimal counter-intuitiveness” hypothesis and (2) the different ways in which supernatural agent representations activate our cognitive systems. Generally speaking, CSR writers argue for the naturalness of religion: religious ideas and practices are widespread and pervasive because human cognition operates in such a way that religious ideas are easy to acquire and transmit. The study raises two philosophical problems, namely, the “problem of scope” and the “problem of religious relevance”.
The problem of scope is created by the insistence of several critics of the CSR that CSR explanations are mostly irrelevant for explaining religion. Most CSR writers themselves hold that cognitive explanations can answer most of our questions about religion. I argue that the problem of scope is created by differences in explanation-begging questions: the former group is interested in explaining different things than the latter group. I propose that we should not stick too rigidly to one set of methodological assumptions, but rather acknowledge that different assumptions might help us to answer different questions about religion. Instead of adhering to some robust metaphysics as some strongly naturalistic writers argue, we should adopt a pragmatic and explanatorily pluralist approach which would allow different kinds of methodological presuppositions in the study of religion, provided that they attempt to answer different kinds of why-questions, since religion appears to be a multi-faceted phenomenon that spans a variety of fields of the special sciences. The problem of religious relevance is created by the insistence of some writers that CSR theories show religious beliefs to be false or irrational, whereas others invoke CSR theories to defend certain religious ideas. The problem is interesting because it reveals the more general philosophical assumptions of those who make such interpretations. CSR theories can be (and have been) interpreted in terms of three different philosophical frameworks: strict naturalism, broad naturalism and theism. I argue that CSR theories can be interpreted inside all three frameworks without doing violence to the theories, and that these frameworks give different kinds of results regarding the religious relevance of CSR theories.
Abstract:
In this 'Summary Guidance for Daily Practice', we describe the basic principles of prevention and management of foot problems in persons with diabetes. This summary is based on the International Working Group on the Diabetic Foot (IWGDF) Guidance 2015. There are five key elements that underpin prevention of foot problems: (1) identification of the at-risk foot; (2) regular inspection and examination of the at-risk foot; (3) education of patient, family and healthcare providers; (4) routine wearing of appropriate footwear; and (5) treatment of pre-ulcerative signs. Healthcare providers should follow a standardized and consistent strategy for evaluating a foot wound, as this will guide further evaluation and therapy. The following items must be addressed: type, cause, site and depth, and signs of infection. There are seven key elements that underpin ulcer treatment: (1) relief of pressure and protection of the ulcer; (2) restoration of skin perfusion; (3) treatment of infection; (4) metabolic control and treatment of co-morbidity; (5) local wound care; (6) education for patient and relatives; and (7) prevention of recurrence. Finally, successful efforts to prevent and manage foot problems in diabetes depend upon a well-organized team, using a holistic approach in which the ulcer is seen as a sign of multi-organ disease, and integrating the various disciplines involved.
Abstract:
Foot problems complicating diabetes are a source of major patient suffering and societal costs. Investing in evidence-based, internationally appropriate diabetic foot care guidance is likely among the most cost-effective forms of healthcare expenditure, provided it is goal-focused and properly implemented. The International Working Group on the Diabetic Foot (IWGDF) has been publishing and updating international Practical Guidelines since 1999. The 2015 updates are based on systematic reviews of the literature, and recommendations are formulated using the Grading of Recommendations Assessment, Development and Evaluation system. As such, we changed the name from 'Practical Guidelines' to 'Guidance'. In this article we describe the development of the 2015 IWGDF Guidance documents on prevention and management of foot problems in diabetes. This Guidance consists of five documents, prepared by five working groups of international experts. These documents provide guidance related to foot complications in persons with diabetes on: prevention; footwear and offloading; peripheral artery disease; infections; and wound healing interventions. Based on these five documents, the IWGDF Editorial Board produced a summary guidance for daily practice. The result of this process, after review of all the documents by the Editorial Board and by international IWGDF members, is an evidence-based global consensus on prevention and management of foot problems in diabetes. Plans are already under way to implement this Guidance. We believe that following the recommendations of the 2015 IWGDF Guidance will almost certainly result in improved management of foot problems in persons with diabetes and a subsequent worldwide reduction in the tragedies caused by these foot problems.
Abstract:
This thesis is an assessment of the hoax hypothesis, mainly propagated in Stephen C. Carlson's 2005 monograph "The Gospel Hoax: Morton Smith's Invention of Secret Mark", which suggests that professor Morton Smith (1915-1991) forged Clement of Alexandria's letter to Theodore. Smith claimed to have discovered this letter as an 18th-century copy in the monastery of Mar Saba in 1958. The Introduction narrates Morton Smith's discovery story and traces the manuscript's whereabouts up to its apparent disappearance in 1990, followed by a brief history of scholarship on the MS and some methodological considerations. Chapters 2 and 3 deal with the arguments for the hoax (mainly by Stephen C. Carlson) and against it (mainly by Scott G. Brown). Chapter 2 looks at the MS in its physical aspects, and chapter 3 assesses its subject matter. I conclude that some of the details fit reasonably well with the hoax hypothesis, but on the whole the arguments against it are more persuasive. In particular, Carlson's use of QDE (Questioned Document Examination) analysis has many problems. Comparing the handwriting of Clement's letter to Morton Smith's handwriting, I conclude that there are some "repeated differences" between them, suggesting that Smith is not the writer of the disputed letter. Clement's letter to Theodore most likely derives from antiquity, though the exact details of its character are not discussed at length in this thesis. In Chapter 4 I take a special look at Stephen C. Carlson's arguments which propose that Morton Smith hid clues to his identity in the MS and the materials surrounding it. Comparing these alleged clues to known pseudoscientific works, I conclude that Carlson here utilizes methods normally reserved for building a conspiracy theory; thus Carlson's hoax hypothesis has serious methodological flaws with respect to these hidden clues.
I construct a model of these questionable methods, titled "a boisterous pseudohistorical method", that contains three parts: 1) beginning with a question that from the start implicitly contains the answer, 2) considering that anything will do as evidence for the conspiracy theory, and 3) abandoning probability and thinking literally that everything is connected. I propose that Stephen C. Carlson utilizes these pseudoscientific methods in his unearthing of Morton Smith's "clues". Chapter 5 looks briefly at the literary genre I title the "textual puzzle thriller". Because even biblical scholarship follows the signs of the times, I propose that Carlson's hoax hypothesis has its literary equivalents in fiction in titles like Dan Brown's "Da Vinci Code" and in academic works in titles like John Dart's "Decoding Mark". All of these are interested in solving textual puzzles, even though the methodological choices are not acceptable for scholarship. Thus the hoax hypothesis as a whole is either unpersuasive or plain bad science.
Abstract:
An error-free computational approach is employed for finding the integer solution to a system of linear equations, using finite-field arithmetic. This approach is also extended to find the optimum solution for linear inequalities such as those arising in interval linear programming problems.
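As a hedged illustration of the finite-field idea (our own sketch, not the paper's exact procedure): if a prime p exceeds the magnitude of every entry of the integer solution, Gaussian elimination over GF(p) finds the solution using only error-free modular arithmetic, and the residues can then be lifted back to signed integers.

```python
# Exact solution of an integer linear system via arithmetic mod a prime:
# no floating-point rounding occurs anywhere in the elimination.

P = 10**9 + 7   # a prime assumed larger than any entry of the solution

def solve_mod_p(A, b, p=P):
    """Gauss-Jordan elimination of Ax = b over GF(p); assumes A is invertible mod p."""
    n = len(A)
    M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
    for col in range(n):
        # Find a pivot row with a nonzero entry in this column.
        piv = next(r for r in range(col, n) if M[r][col])
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)      # Fermat inverse, exact
        M[col] = [x * inv % p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
    # Lift residues to the symmetric range to read off signed integers.
    return [x - p if x > p // 2 else x for x in (row[n] for row in M)]

# 2x - y = 1 and x + y = 5 have the integer solution x = 2, y = 3.
print(solve_mod_p([[2, -1], [1, 1]], [1, 5]))   # [2, 3]
```

In practice such methods combine residues from several primes (via the Chinese remainder theorem) or use rational reconstruction when the solution entries are not known to be small; the single-prime version above only shows the core error-free arithmetic.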
Abstract:
As more is known about contemporary cultural shifts and the effect this has on the young, research must consider how children operate as global citizens. Children are innocent and vulnerable, but also actively participate in the world; research into early childhood must therefore refine ideas and conceptions and develop research methodologies that see children as superdiverse young citizens. Intergenerational collaborative drawing, which involves adult researchers and children drawing together, is a method that supports superdimensions. A group of researchers tested the method to consider the politics of research, particularly when researcher neutrality and the conventions around gathering ‘unsullied’ data are challenged.
Abstract:
The remarkable advances made in recombinant DNA technology over the last two decades have paved the way for the use of gene transfer to treat human diseases. Several protocols have been developed for the introduction and expression of genes in humans, but clinical efficacy has not been conclusively demonstrated in any of them. The eventual success of gene therapy for genetic and acquired disorders depends on the development of better gene transfer vectors for sustained, long-term expression of foreign genes, as well as on a better understanding of the pathophysiology of human diseases. It is heartening to note that some of the gene therapy protocols have found other applications, such as genetic immunization or DNA vaccines, which is being heralded as the third vaccine revolution. Gene therapy is yet to become a dream come true, but the light can be seen at the end of the tunnel.
Abstract:
Understanding the ways in which teachers make sense of what they do and why is critical to a broader understanding of pedagogy. Historically, teachers have been understood through the thematic and content analysis of their beliefs or philosophies. In this paper, we argue that discourse analysis (DA) involves a much finer-grained analysis of the ‘lifeworlds’ of teachers and, in our view, provides a more detailed canvas from which inferences can be made. Our argument is structured in four parts. We begin by locating DA within the physical education (PE) literature and discuss what others have referred to as its relatively modest use. Following our location of DA, we outline a conceptual framework that we regard as useful, which contains six interrelated principles. We then introduce the idea of interpretive repertoires, which we consider to have particular explanatory power as well as being a sophisticated way to represent the subjectivities of PE teachers. Finally, we discuss the methodological strengths of interpretive repertoires. The paper concludes with a discussion on the theoretical and practical merits of adopting DA to analyse problems within PE.