687 results for scientific method
at Queensland University of Technology - ePrints Archive
Abstract:
Separability is a concept that is very difficult to define, and yet much of our scientific method is implicitly based upon the assumption that systems can sensibly be reduced to a set of interacting components. This paper examines the notion of separability in the creation of bi-ambiguous compounds, based upon the CHSH and CH inequalities. It reports results of an experiment showing that violations of the CHSH and CH inequalities can occur in human conceptual combination.
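The separability test the abstract refers to can be made concrete. The CHSH statistic combines four correlations E(x, y) between paired ±1 outcomes under four measurement settings; any separable (local) model must satisfy |S| ≤ 2. The sketch below is purely illustrative, with invented outcome data, not the paper's experimental results:

```python
# Illustrative CHSH sketch (invented data, not the paper's experiment).
# Each setting yields pairs of +/-1 outcomes; a separable system obeys |S| <= 2.

def correlation(pairs):
    """Mean product of paired +/-1 outcomes, i.e. E(x, y)."""
    return sum(a * b for a, b in pairs) / len(pairs)

def chsh(e_ab, e_ab2, e_a2b, e_a2b2):
    """CHSH statistic S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    return e_ab + e_ab2 + e_a2b - e_a2b2

# Hypothetical outcome data for the four measurement settings:
settings = {
    "ab":   [(1, 1), (1, 1), (-1, -1), (1, 1)],
    "ab'":  [(1, 1), (-1, -1), (1, 1), (1, 1)],
    "a'b":  [(1, 1), (1, 1), (1, 1), (-1, -1)],
    "a'b'": [(1, -1), (-1, 1), (1, -1), (-1, 1)],
}
S = chsh(*(correlation(settings[k]) for k in ("ab", "ab'", "a'b", "a'b'")))
print(S)  # 4.0 for this contrived data: a violation of the bound |S| <= 2
```

A reported |S| greater than 2 is what "violation of the CHSH inequality" means operationally; the paper's claim is that suitably designed conceptual-combination experiments can produce such values.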
Abstract:
The Dark Ages are generally held to be a time of technological and intellectual stagnation in western development. But that is not necessarily the case. Indeed, from a certain perspective, nothing could be further from the truth. In this paper we draw historical comparisons, focusing especially on the thirteenth and fourteenth centuries, between the technological and intellectual ruptures in Europe during the Dark Ages, and those of our current period. Our analysis is framed in part by Harold Innis's notion of "knowledge monopolies". We give an overview of how these were affected by new media, new power struggles, and new intellectual debates that emerged in thirteenth- and fourteenth-century Europe. The historical salience of our focus may seem elusive. Our world has changed so much, and history seems to be an increasingly far-from-favoured method for understanding our own period and its future potentials. Yet our seemingly distant historical focus provides some surprising insights into the social dynamics that are at work today: the fracturing of established knowledge and power bases; the democratisation of certain "sacred" forms of communication and knowledge, and, conversely, the "sacrosanct" appropriation of certain vernacular forms; challenges and innovations in social and scientific method and thought; the emergence of social world-shattering media practices; struggles over control of vast networks of media and knowledge monopolies; and the enclosure of public discursive and social spaces for singular, manipulative purposes. The period between the eleventh and fourteenth centuries in Europe prefigured what we now call the Enlightenment, perhaps more so than any other period before or after; it shaped what the Enlightenment was to become. We claim no knowledge of the future here.
But in the "post-everything" society, where history is as much up for sale as it is for argument, we argue that our historical perspective provides a useful analogy for grasping the wider trends in the political economy of media, and for recognising clear and actual threats to the future of the public sphere in supposedly democratic societies.
Abstract:
Table of Contents
Timeline of Thinkers
Timeline of Thoughts
Evolution of Science
Chapter 1. Introduction
Chapter 2. Humans: the measure of all things
Chapter 3. Men with beards: long beards
Chapter 4. I doubt it
Chapter 5. With good reason
Chapter 6. Here be dragons
Chapter 7. Stirrings of science
Chapter 8. Degrees of separation
Chapter 9. The Greek legacy
Chapter 10. A scientific focus
Chapter 11. Questions of science
Chapter 12. Creatures of habit
Chapter 13. A scientific method
Chapter 14. Outside the square
Chapter 15. Probably
Chapter 16. Human, all too human
Chapter 17. Cultures of science
Chapter 18. 21st Century Science
Chapter 19. Science in question
Chapter 20. How do we know?
Chapter 21. Sources
Abstract:
Molecular biology is a scientific discipline whose character has changed fundamentally over the past decade, coming to rely on large-scale datasets (public and locally generated) and their computational analysis and annotation. Undergraduate education of biologists must increasingly couple this domain context with a data-driven computational scientific method. Yet modern programming and scripting languages and rich computational environments such as R and MATLAB present significant barriers to those with limited exposure to computer science, and may require substantial tutorial assistance over an extended period if progress is to be made. In this paper we report our experience of undergraduate bioinformatics education using the familiar, ubiquitous spreadsheet environment of Microsoft Excel. We describe QUT.Bio.Excel, a configurable extension in the form of a custom ribbon that supports a rich set of data sources, external tools and interactive processing within the spreadsheet, and present a range of problems that demonstrate its utility and success in addressing the needs of students over their studies.
Abstract:
Legal Theories: Contexts and Practices presents legal theory as a living and evolving entity. The reader is brought into its story as an active participant who is challenged to think about where they sit within the history and traditions of legal theory and jurisprudence. This second edition explores how lawyers and the courts adopt theoretical and jurisprudential positions and how they are influenced by the historical, social, cultural, and legal conditions characteristic of the time in which they live. It considers how legal theories, too, are influenced by those conditions, and how these combined forces influence and continue to affect contemporary legal thinking and legal interpretation.
Abstract:
A recurring question for cognitive science is whether functional neuroimaging data can provide evidence for or against psychological theories. As posed, the question reflects an adherence to a popular scientific method known as 'strong inference'. The method entails constructing multiple hypotheses (Hs) and designing experiments so that alternative possible outcomes will refute at least one (i.e., 'falsify' it). In this article, after first delineating some well-documented limitations of strong inference, I provide examples of functional neuroimaging data being used to test Hs from rival modular information-processing models of spoken word production. 'Strong inference' for neuroimaging involves first establishing a systematic mapping of 'processes to processors' for a common modular architecture. Alternate Hs are then constructed from psychological theories that attribute the outcome of manipulating an experimental factor to two or more distinct processing stages within this architecture. Hs are then refutable by a finding of activity differentiated spatially and chronometrically by experimental condition. When employed in this manner, the data offered by functional neuroimaging may be more useful for adjudicating between accounts of processing loci than behavioural measures.
Abstract:
Engaging middle-school students in science continues to be a challenge in Australian schools. One initiative that has been tried in the senior years but is a more recent development in the middle years is the context-based approach. In this ethnographic study, we researched the teaching and learning transactions that occurred in one 9th grade science class studying a context-based Environmental Science unit that included visits to the local creek over 11 weeks. Data were derived from field notes, audio and video recorded conversations, interviews, student journals and classroom documents, with a particular focus on two selected groups of students. This paper presents two assertions that highlight pedagogical approaches that contributed to learning. Firstly, spontaneous teaching episodes created opportunities for in-the-moment questioning by the teacher that led to students' awareness of environmental issues and the scientific method; secondly, group work using flip cameras afforded opportunities for students to connect the science concepts with the context. Furthermore, students reported positively about the unit and expressed their appreciation for the opportunity to visit the creek frequently. The findings from this study should encourage teachers to take students into the real-world field for valuable teaching and learning experiences that are not available in the formal classroom.
Abstract:
The questions of whether science pursues truth as correspondence to reality and whether science in fact progresses towards attaining a truthful understanding of physical reality are fundamental and contested in the philosophy of science. On one side of the debate stands Popper, who argues that science is objective, necessarily assumes a correspondence theory of truth, and inevitably progresses toward truth as physical theories develop, gaining a more truthful understanding of reality through progressively more sophisticated empirical analysis. Conversely Kuhn, influenced by postmodern philosophy, argues that ultimate truth cannot be attained since no objective metaphysical reality exists and it cannot be known, and consequently the notion of scientific objectivity and "progress" is a myth, marred by philosophical and ideological value judgments. Ultimately, Kuhn reduces so-called scientific progress through the adoption of successive paradigms to leaps of "faith". This paper seeks a reconciliation of the two extremes, arguing that Popper is correct in the sense that science assumes a correspondence theory of truth and may progress toward truth as physical theories develop, while simultaneously acknowledging with Kuhn that science is not purely objective and free of value judgments. The notion of faith is also critical, for it was the acknowledgement of God's existence as the creator and instituter of observable natural laws which allowed the development of science and the scientific method in the first place. Therefore, accepting and synthesising the contentions that science is to some extent founded on faith, assumes and progresses toward truth, and is subject to value judgments is necessary for the progress of science.
Abstract:
Purpose: The cornea is known to be susceptible to forces exerted by eyelids. There have been previous attempts to quantify eyelid pressure but the reliability of the results is unclear. The purpose of this study was to develop a technique using piezoresistive pressure sensors to measure upper eyelid pressure on the cornea. Methods: The technique was based on the use of thin (0.18 mm) tactile piezoresistive pressure sensors, which generate a signal related to the applied pressure. A range of factors that influence the response of this pressure sensor were investigated along with the optimal method of placing the sensor in the eye. Results: Curvature of the pressure sensor was found to impart force, so the sensor needed to remain flat during measurements. A large rigid contact lens was designed to have a flat region to which the sensor was attached. To stabilise the contact lens during measurement, an apparatus was designed to hold and position the sensor and contact lens combination on the eye. A calibration system was designed to apply even pressure to the sensor when attached to the contact lens, so the raw digital output could be converted to actual pressure units. Conclusions: Several novel procedures were developed to use tactile sensors to measure eyelid pressure. The quantification of eyelid pressure has a number of applications including eyelid reconstructive surgery and the design of soft and rigid contact lenses.
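The calibration step described in the Results (applying known, even pressures to the attached sensor so that raw digital output can be converted to pressure units) is essentially a fitted calibration curve. The sketch below is a minimal illustration of that idea with invented calibration points and a simple linear model; the authors' actual calibration system and units are not specified here:

```python
# Minimal calibration sketch (invented numbers, assuming a linear sensor
# response): known pressures are applied to the mounted sensor, the raw
# digital readings recorded, and a least-squares line fitted so that later
# raw readings can be converted to pressure units.

def fit_linear(raw, pressure):
    """Least-squares fit of pressure = slope * raw + intercept."""
    n = len(raw)
    mx, my = sum(raw) / n, sum(pressure) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(raw, pressure))
             / sum((x - mx) ** 2 for x in raw))
    return slope, my - slope * mx

# Hypothetical calibration points: (raw digital counts, applied pressure in mmHg)
raw_counts = [100, 200, 300, 400]
applied_mmHg = [2.0, 4.0, 6.0, 8.0]
slope, intercept = fit_linear(raw_counts, applied_mmHg)

def to_pressure(raw_value):
    """Convert a raw digital reading to pressure using the fitted curve."""
    return slope * raw_value + intercept

print(to_pressure(250))  # -> 5.0 mmHg for this synthetic calibration
```

A real piezoresistive sensor may respond nonlinearly, in which case the same procedure applies with a higher-order fit.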
Abstract:
Purpose – The growing debate in the literature indicates that the initiative to implement Knowledge Based Urban Development (KBUD) approaches in the urban development process is neither simple nor quick. Many research efforts have therefore been put toward the development of appropriate KBUD frameworks and practical KBUD approaches, but this has led to a fragmented and incoherent methodological approach. This paper outlines and compares a few of the most popular KBUD frameworks selected from the literature. It aims to identify some key and common features in the effort to achieve a unified KBUD framework.
Design/methodology/approach – This paper reviews, examines and identifies various popular KBUD frameworks discussed in the literature from urban planners' viewpoint. It employs content analysis, i.e. a research technique used to determine the presence of certain words or concepts within texts or sets of texts.
Originality/value – The paper reports on the key and common features of a few selected popular KBUD frameworks. The synthesis of the results is from the perspective of urban planners. The findings, which encompass a new KBUD framework incorporating the key and common features, will be valuable in setting a platform to achieve a unified method of KBUD.
Practical implications – The discussion and results presented in this paper should be significant to researchers and practitioners, and to any cities and countries that are aiming for KBUD.
Keywords – Knowledge based urban development, Knowledge based urban development framework, Urban development and knowledge economy
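The content-analysis technique the abstract names (determining the presence of certain words or concepts within texts) reduces, at its simplest, to counting concept terms across documents. The terms and texts below are invented for illustration only, not the paper's actual coding scheme:

```python
# Minimal content-analysis sketch: count occurrences of chosen concept
# terms in each text. Terms and texts here are invented for illustration.
from collections import Counter
import re

concept_terms = {"knowledge", "infrastructure", "governance"}

def concept_counts(text):
    """Count how often each concept term appears in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w in concept_terms)

framework_texts = {
    "Framework A": "Knowledge infrastructure supports knowledge workers.",
    "Framework B": "Governance of urban infrastructure and knowledge assets.",
}
for name, text in framework_texts.items():
    print(name, dict(concept_counts(text)))
```

Real content analysis adds a coding frame, synonym handling and inter-coder checks on top of this counting step.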
A Modified inverse integer Cholesky decorrelation method and the performance on ambiguity resolution
Abstract:
One of the research focuses in the integer least squares problem is the decorrelation technique, which reduces the number of integer parameter search candidates and improves the efficiency of the integer parameter search method. It remains a challenging issue for determining carrier-phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proven and demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be directly utilised as a performance measure, as it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always represent a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves the decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated based on the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate the performance of ambiguity validation. The performance of the different decorrelation methods was tested and compared using both simulated and real data. The simulation scenarios employ the isotropic probabilistic model using a predetermined eigenvalue and without any geometry or weighting system constraints.
The MIICD method outperformed the other three methods, with conditioning improvements over the LAMBDA method of 78.33% and 81.67% without and with the eigenvalue constraint respectively. The real-data scenarios involve both a single-constellation case and a dual-constellation case. Experimental results demonstrate that, compared with LAMBDA, the MIICD method can significantly improve the efficiency of reducing the condition number, by 78.65% and 97.78% in the single-constellation and dual-constellation cases respectively. It also shows improvements in the number of search candidate points of 98.92% and 100% in the single-constellation and dual-constellation cases.
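The role of decorrelation and the condition-number criterion can be illustrated in a few lines. A unimodular integer transformation Z (so Z and Z⁻¹ are both integer matrices, preserving the integer nature of the ambiguities) is applied to the ambiguity covariance matrix as Qz = ZᵀQZ, lowering its condition number and shrinking the search. The sketch below shows a single integer Gaussian decorrelation step on an invented 2×2 matrix; it is not the paper's MIICD algorithm:

```python
# Illustrative decorrelation step (integer Gaussian, not the paper's MIICD
# algorithm) on an invented ambiguity covariance matrix Q. The unimodular
# transform Z reduces the off-diagonal correlation and the condition number.
import numpy as np

def condition_number(Q):
    """Ratio of largest to smallest eigenvalue of a symmetric matrix."""
    eig = np.linalg.eigvalsh(Q)
    return eig.max() / eig.min()

Q = np.array([[4.0, 3.8],
              [3.8, 4.0]])          # strongly correlated ambiguities

# One integer Gaussian step: subtract the nearest-integer multiple of the
# second ambiguity from the first. det(Z) = 1, so Z^-1 is also integer.
mu = int(round(Q[0, 1] / Q[1, 1]))  # nearest integer to the correlation ratio
Z = np.array([[1, 0],
              [-mu, 1]])
Qz = Z.T @ Q @ Z

print(condition_number(Q), condition_number(Qz))  # condition number drops
```

Methods such as LAMBDA, LLL, IICD and the proposed MIICD differ in how they choose and iterate such transformations; the condition number of the result is the comparison criterion the abstract describes.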
Abstract:
A recent advance in biosecurity surveillance design aims to benefit island conservation through early and improved detection of incursions by non-indigenous species. The novel aspects of the design are that it achieves a specified power of detection in a cost-managed system, while acknowledging heterogeneity of risk in the study area and stratifying the area to target surveillance deployment. The design also utilises a variety of surveillance system components, such as formal scientific surveys, trapping methods, and incidental sightings by non-biologist observers. These advances in design were applied to black rats (Rattus rattus), representing the group of invasive rats including R. norvegicus and R. exulans, which are potential threats to Barrow Island, Australia, a high-value conservation nature reserve where a proposed liquefied natural gas development is a potential source of incursions. Rats are important to consider as they are prevalent invaders worldwide, difficult to detect early when present in low numbers, and able to spread and establish relatively quickly after arrival. The 'exemplar' design for the black rat is then applied in a manner that enables the detection of a range of non-indigenous rat species that could potentially be introduced. Many of the design decisions were based on expert opinion, as gaps exist in the empirical data. The surveillance system was able to take into account factors such as collateral effects on native species, the availability of limited resources on an offshore island, financial costs, demands on expertise and other logistical constraints. We demonstrate the flexibility and robustness of the surveillance system and discuss how it could be updated as empirical data are collected to supplement expert opinion and provide a basis for adaptive management.
Overall, the surveillance system promotes an efficient use of resources while providing defined power to detect early rat incursions, translating to reduced environmental, resourcing and financial costs.
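The "specified power of detection" idea can be sized with elementary probability. If each deployed surveillance unit independently detects a rat present in its stratum with probability p per period, the system-level power is 1 − (1 − p)ⁿ, and n can be chosen per stratum to meet a target power, with high-risk strata weighted more heavily. The probabilities and strata below are invented, not the Barrow Island design parameters:

```python
# Minimal power-of-detection sketch (invented numbers, not the Barrow Island
# design): size the number of surveillance units n per stratum so that
# system power 1 - (1 - p)^n reaches a target, assuming independent units.
import math

def units_needed(p_single, target_power):
    """Smallest n with 1 - (1 - p_single)^n >= target_power."""
    return math.ceil(math.log(1 - target_power) / math.log(1 - p_single))

# Hypothetical strata with differing per-unit detection probabilities:
strata = {"supply port (high risk)": 0.10, "interior (low risk)": 0.05}
for name, p in strata.items():
    print(f"{name}: {units_needed(p, 0.95)} units for 95% power")
```

Risk stratification enters through the differing per-unit probabilities and target powers assigned to each stratum, which is how the design concentrates effort where incursions are most likely while managing total cost.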