912 results for Truth And Method
Abstract:
The questions of whether science pursues truth as correspondence to reality, and whether science in fact progresses towards a truthful understanding of physical reality, are fundamental and contested in the philosophy of science. On one side of the debate stands Popper, who argues that science is objective, necessarily assumes a correspondence theory of truth, and inevitably progresses toward truth as physical theories develop, gaining a more truthful understanding of reality through progressively more sophisticated empirical analysis. Conversely, Kuhn, influenced by postmodern philosophy, argues that ultimate truth cannot be attained since there is no objective metaphysical reality that can be known, and consequently that scientific objectivity and "progress" are a myth, marred by philosophical and ideological value judgments. Ultimately, Kuhn reduces so-called scientific progress through the adoption of successive paradigms to leaps of "faith". This paper seeks a reconciliation of the two extremes, arguing that Popper is correct in the sense that science assumes a correspondence theory of truth and may progress toward truth as physical theories develop, while simultaneously acknowledging with Kuhn that science is not purely objective and free of value judgments. The notion of faith is also critical, for it was the acknowledgement of God's existence as the creator and instituter of observable natural laws that allowed the development of science and the scientific method in the first place. Therefore, accepting and synthesising the contentions that science is to some extent founded on faith, assumes and progresses toward truth, and is subject to value judgments is necessary for the progress of science.
Abstract:
This research project involves a comparative, cross-national study of truth and reconciliation commissions (TRCs) in countries around the world that have used these extra-judicial institutions to pursue justice and promote national reconciliation during periods of democratic transition or following a civil conflict marked by intense violence and severe human rights abuses. An important objective of truth and reconciliation commissions involves instituting measures to address serious human rights abuses that have occurred as a result of discrimination, ethnocentrism and racism. In recent years, rather than solely utilizing traditional methods of conflict resolution and criminal prosecution, transitional governments have established truth and reconciliation commissions as part of efforts to foster psychological, social and political healing.
The primary objective of this research project is to determine why there has been a proliferation of truth and reconciliation commissions around the world in recent decades, and to assess whether the perceived effectiveness of these commissions is real and substantial. In this work, using a multi-method approach that combines quantitative and qualitative analysis, I consider the institutional design and structural composition of truth and reconciliation commissions, as well as the roles that these commissions play in the democratic transformation of nations with a history of civil conflict and human rights violations.
In addition to a focus on the institutional design of truth and reconciliation commissions, I use a group identity framework grounded in social identity theory to examine the historical background and sociopolitical context in which truth commissions have been adopted in countries around the world. This group identity framework serves as an invaluable lens through which questions related to truth and reconciliation commissions and other transitional justice mechanisms can be explored. I also present a unique theoretical framework, the reconciliatory democratization paradigm, which is especially useful for examining the complex interactions between the various political elements that directly affect the processes of democratic consolidation and reconciliation in countries in which truth and reconciliation commissions have been established. Finally, I tackle the question of whether successor regimes that institute truth and reconciliation commissions can effectively address the human rights violations that occurred in the past and prevent the recurrence of these abuses.
Abstract:
This exegesis examines how a writer can effectively negotiate the relationship between author, character, fact and truth in a work of Creative Nonfiction. It was found that individual truths in a work of Creative Nonfiction are not necessarily universal truths, owing to individual, cultural, historical and religious circumstances. The examination of published Creative Nonfiction also identified the need to ensure clear demarcation lines between authorial truth and fiction. The Creative Nonfiction works examined, which established this framework for the reader, ensured an ethical relationship between author and audience. These strategies and frameworks were then applied to my own Creative Nonfiction.
Abstract:
This article sets out to interpret the construction of truth discourse in the War of Canudos through the classic Os sertões (Rebellion in the Backlands) by Euclides da Cunha. To enrich the research, the articles written by Cunha while he was a war correspondent for the Estado de São Paulo newspaper are also analyzed. Throughout the text, the expression “truth-effects”, coined by the French philosopher Michel Foucault, is used; it refers to the idea that discourses are neither true nor false in themselves. In Os sertões, the effects of truth emerge from strategic power disputes among the Church, landowners, politicians and a seaside ruling elite that ignores the reality of the poor and forsaken hinterlands. Keywords: discourse, power, truth.
Abstract:
Largely as a result of mass unemployment problems in many European countries, the dynamics of job creation has in recent years attracted increased interest on the part of academics as well as policy-makers. In connection with this, a large number of studies carried out in various countries have concluded that SMEs play a very large and/or growing role as job creators (Birch, 1979; Baldwin and Picot, 1995; Davidsson, 1995a; Davidsson, Lindmark and Olofsson, 1993; 1994; 1995; 1997a; 1997b; Fumagelli and Mussati, 1993; Kirchhoff and Phillips, 1988; Spilling, 1995; for further reference to studies carried out in a large number of countries see also Aiginger and Tichy, 1991; ENSR, 1994; Loveman and Sengenberger, 1991; OECD, 1987; Storey and Johnson, 1987). While most researchers agree on the importance of SMEs, there is some controversy over whether this is mainly the result of many small start-ups and incremental expansions, or whether a small minority of high-growth SMEs contributes the lion’s share of new employment. This is known as the ‘mice vs. gazelles’ or ‘flyers vs. trundlers’ debate. Storey strongly advocates the position that the small group of high-growth SMEs are the ‘real’ job creators (Storey, 1994; Storey & Johnson, 1987), whereas, e.g., the Davidsson et al. research in Sweden (cf. above) gives more support to the ‘mice’ hypothesis.
Abstract:
In this paper, my aim is to address the twin concerns raised in this session - models of practice and geographies or spaces of practice - by considering a selection of works and processes that have arisen from my recent research. To set up this discussion, I first present a short critique of the idea of models of creative practice, recognising possible problems with the attempt to generalise or abstract its complexities. Working through a series of portraits of my working environment, I will draw on Lefebvre’s Rhythmanalysis as a way of understanding an art practice both spatially and temporally, suggesting that changes and adjustments can occur through attending to both intuitions and observations of the complex of rhythmic layers constantly at play in any event. Reflecting on my recent studio practice, I explore these rhythms through the evocation of a twin axis, the horizontal and the vertical, and the arcs of difference or change that occur between them, in both spatial and temporal senses. What this analysis suggests is that understanding does not emerge only from the construction of general principles derived from observation of the particular; the study of rhythms allows us to maintain the primacy of the particular. This makes it well suited to a study of creative methods and objects, since it is to the encounter with and expression of the particular that art practices, most certainly my own, are frequently directed.
Abstract:
Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model, based on the convective Exner equation, that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. To couple the velocity statistics with the fluvial process model, the perturbation method is employed together with a non-stationary spectral approach to decompose the Exner equation into two separate equations: a mean equation, which yields the mean sediment thickness, and a perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers resulting from fluvial processes, one that allows the stochasticity of the paleoflow velocity to be incorporated.
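As an illustration of the kind of decomposition described above, the following is a minimal sketch, assuming a one-dimensional convective Exner equation whose celerity is proportional to the river velocity u; the symbols (eta for sediment thickness, a for the proportionality constant) are illustrative and not taken from the paper.

% Illustrative 1-D convective Exner equation; eta is sediment thickness, u is river velocity
\partial_t \eta + a\,u\,\partial_x \eta = 0
% First-order decomposition into ensemble means and zero-mean perturbations
u = \bar{u} + u', \qquad \eta = \bar{\eta} + \eta'
% Taking the expectation gives the mean equation (mean sediment thickness)
\partial_t \bar{\eta} + a\,\bar{u}\,\partial_x \bar{\eta} + a\,\mathrm{E}[u'\,\partial_x \eta'] = 0
% Subtracting the mean equation and dropping products of perturbations gives the perturbation equation
\partial_t \eta' + a\,\bar{u}\,\partial_x \eta' + a\,u'\,\partial_x \bar{\eta} \approx 0
% The variance of sediment thickness follows from the perturbation field
\sigma_\eta^2 = \mathrm{E}[(\eta')^2]

The cross term E[u' \partial_x \eta'] is where a closure such as the non-stationary spectral approach mentioned in the abstract would enter.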
Abstract:
Introduction
QC, EQA and method evaluation are integral to the delivery of quality patient results. To ensure QUT graduates have a solid grounding in these key areas of practice, a theory-to-practice approach is used to progressively develop and consolidate these skills.
Methods
Using a BCG assay for serum albumin, each student undertakes an eight-week project analysing two levels of QC alongside ‘patient’ samples. Results are assessed using both single rules and multirules. Concomitantly with the QC analyses, an EQA project is undertaken; students analyse two EQA samples twice in the semester. Results are submitted using cloud software, and data for the full ‘peer group’ are returned to students in spreadsheets and incomplete Youden plots. Youden plots are completed with target values and calculated ALP values and analysed for ‘lab’ and method performance. The method has a low-level positive bias, which leads to the need to investigate an alternative method. Building directly on the EQA of the first project and using the scenario of a lab that services renal patients, students undertake a method validation comparing BCP and BCG assays in another eight-week project. Precision and patient-comparison studies allow students to assess whether the BCP method addresses the proportional bias of the BCG method and is, overall, a ‘better’ alternative method for analysing serum albumin, accounting for pragmatic factors, such as cost, as well as performance characteristics.
Results
Students develop an understanding of the purpose and importance of QC and EQA in delivering quality results, the need to optimise testing to deliver quality results and, importantly, a working knowledge of the analyses that go into ensuring this quality. In parallel with developing these key workplace competencies, students become confident, competent practitioners, able to pipette accurately and precisely and to organise themselves in a busy, time-pressured work environment.
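As a concrete illustration of the single-rule and multirule QC assessment described above, here is a minimal sketch in Python using a small subset of the common Westgard rules (1_3s and 2_2s) on hypothetical control values; the data, target mean/SD and function names are illustrative and not taken from the teaching project.

# Illustrative QC check: flag a run using the 1_3s single rule and the 2_2s multirule.
# Control values, target mean and SD are hypothetical.

def z_scores(values, mean, sd):
    """Convert control observations to z-scores against the assigned mean and SD."""
    return [(v - mean) / sd for v in values]

def rule_1_3s(z):
    """Reject if any single observation exceeds +/- 3 SD."""
    return any(abs(x) > 3 for x in z)

def rule_2_2s(z):
    """Reject if two consecutive observations exceed 2 SD on the same side of the mean."""
    return any((z[i] > 2 and z[i + 1] > 2) or (z[i] < -2 and z[i + 1] < -2)
               for i in range(len(z) - 1))

if __name__ == "__main__":
    # Hypothetical level-1 albumin control: target 35 g/L, SD 1 g/L
    control = [35.4, 34.8, 37.3, 37.5, 34.9]
    z = z_scores(control, mean=35.0, sd=1.0)
    print("1_3s violation:", rule_1_3s(z))   # False: no single value beyond 3 SD
    print("2_2s violation:", rule_2_2s(z))   # True: 37.3 and 37.5 are both above +2 SD

In practice the full Westgard scheme adds further rules (for example R_4s and 4_1s), but the structure of the check is the same.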
Abstract:
A system for temporal data mining includes a computer readable medium having an application configured to receive at an input module a temporal data series and a threshold frequency. The system is further configured to identify, using a candidate identification and tracking module, one or more occurrences in the temporal data series of a candidate episode and increment a count for each identified occurrence. The system is also configured to produce at an output module an output for those episodes whose count of occurrences results in a frequency exceeding the threshold frequency.
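As a sketch of the kind of counting this abstract describes, the following Python fragment scans a temporal data series for non-overlapped occurrences of candidate serial episodes and reports those whose frequency exceeds a threshold; the episode representation (an ordered tuple of event types), the greedy counting strategy and the frequency definition (occurrences per event) are illustrative assumptions, not details of the patented system.

# Count non-overlapped occurrences of a candidate serial episode in a temporal data
# series, then keep the episodes whose frequency exceeds the threshold frequency.

def count_non_overlapped(series, episode):
    """series: list of (event_type, time) pairs sorted by time.
    episode: ordered tuple of event types. Greedy single-pass occurrence count."""
    count, k = 0, 0
    for event_type, _time in series:
        if event_type == episode[k]:
            k += 1
            if k == len(episode):   # a full occurrence was completed
                count += 1
                k = 0               # restart matching for the next occurrence
    return count

def frequent_episodes(series, candidates, threshold):
    """Return candidate episodes whose frequency (occurrences per event) exceeds the threshold."""
    n = len(series)
    result = {}
    for episode in candidates:
        freq = count_non_overlapped(series, episode) / n
        if freq > threshold:
            result[episode] = freq
    return result

if __name__ == "__main__":
    data = [("A", 1), ("B", 2), ("A", 4), ("C", 5), ("B", 7), ("A", 8), ("B", 9)]
    print(frequent_episodes(data, [("A", "B"), ("A", "C")], threshold=0.2))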
Abstract:
A system for temporal data mining includes a computer readable medium having an application configured to receive at an input module a temporal data series having events with start times and end times, a set of allowed dwelling times and a threshold frequency. The system is further configured to identify, using a candidate identification and tracking module, one or more occurrences in the temporal data series of a candidate episode and increment a count for each identified occurrence. The system is also configured to produce at an output module an output for those episodes whose count of occurrences results in a frequency exceeding the threshold frequency.
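Extending the previous sketch to the variant described here, events carry start and end times and are first filtered against a set of allowed dwelling times before the same occurrence counting is applied; the interval-based representation of allowed dwelling times and the function name are illustrative assumptions.

# Keep only events whose dwelling time (end - start) falls inside an allowed interval,
# then the filtered (event_type, time) series can be counted exactly as before.

def filter_by_dwell(series, allowed):
    """series: list of (event_type, start, end); allowed: list of (low, high) dwell ranges."""
    kept = []
    for event_type, start, end in series:
        dwell = end - start
        if any(low <= dwell <= high for low, high in allowed):
            kept.append((event_type, start))   # reuse the (event_type, time) form downstream
    return kept

if __name__ == "__main__":
    timed = [("A", 0, 1), ("B", 2, 6), ("A", 7, 8), ("B", 9, 10)]
    # Only dwell times between 0 and 2 time units are allowed; ("B", 2, 6) is dropped.
    print(filter_by_dwell(timed, allowed=[(0, 2)]))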
Abstract:
Before installation, a voltage source converter is usually subjected to a heat-run test to verify its thermal design and performance under load. For the heat-run test, the converter needs to be operated at rated voltage and rated current for a substantial length of time. Hence, such tests consume a huge amount of energy in the case of high-power converters. Also, the capacities of the source and loads available in the research and development (R&D) centre or the production facility could be inadequate to conduct such tests. This paper proposes a method to conduct heat-run tests on high-power, pulse width modulated (PWM) converters with low energy consumption. The experimental set-up consists of the converter under test and another converter (of similar or higher rating), both connected in parallel on the ac side and left open on the dc side. Vector control, or synchronous reference frame control, is employed to control the converters such that one draws a certain amount of reactive power and the other supplies the same; only the system losses are drawn from the mains. The performance of the controller is validated through simulation and experiments. Experimental results pertaining to heat-run tests on a high-power PWM converter are presented at power levels of 25 kVA to 150 kVA.
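A minimal sketch of the power balance behind the back-to-back arrangement described above, with illustrative symbols (Q_1, Q_2 for the converter reactive powers, S_rated for the rated apparent power):

% One converter absorbs the reactive power that the other supplies
Q_1 = +Q_{\mathrm{rated}}, \qquad Q_2 = -Q_{\mathrm{rated}}, \qquad Q_1 + Q_2 = 0
% Both converters therefore carry rated current, while the mains supplies only the losses
P_{\mathrm{grid}} = P_{\mathrm{loss,1}} + P_{\mathrm{loss,2}} \ll S_{\mathrm{rated}}

Thus the energy drawn during a long heat-run is set by the converter losses rather than the rated throughput, which is why the scheme keeps energy consumption low.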