849 results for Science (General)
Abstract:
This project developed a visual strategy and graphic outcomes to communicate the results of a scientific collaborative project to the Mackay community. During 2013 and 2014 a team from CSIRO engaged with the community in Mackay to collaboratively develop a set of strategies to improve the management of the Great Barrier Reef. The result of this work was a 300+ page scientific report that needed to be translated and summarised for the general community. The aim of this project was to strategically synthesise the information contained in the report and to design and produce an outcome to be distributed to the participant community. Working with the CSIRO researchers, an action toolkit was developed, consisting of twelve cards and a booklet. Each card represented the story behind a particular local management issue and the actions that the participants suggested should be taken in order to improve management of the Reef. During the design synthesis it was identified that every management issue referred to the need for some sort of "educational campaign" in the area. That need was then translated into an underlying action supporting all other actions proposed in the toolkit.
Abstract:
Although kimberlite pipes/bodies are usually the remains of volcanic vents, in-vent deposits, and subvolcanic intrusions, the terminology used for kimberlite rocks has largely developed independently of that used in mainstream volcanology. Existing kimberlite terminology is not descriptive and includes terms that are rarely used, used differently, or not used at all in mainstream volcanology. In addition, kimberlite bodies are altered to varying degrees, making application of genetic terminology difficult because original components and depositional textures are commonly masked by alteration. This paper recommends an approach to the terminology for kimberlite rocks that is consistent with usage for other volcanic successions. In modern terrains the eruption and emplacement origins of deposits can often be readily deduced, but this is often not the case for old, variably altered and deformed rock successions. A staged approach is required whereby descriptive terminology is developed first, followed by application of genetic terminology once all features, including the effects of alteration on original texture and depositional features, together with contact relationships and setting, have been evaluated. Because many volcanic successions consist of both primary volcanic deposits and volcanic sediments, terminology must account for both possibilities.
Abstract:
Membrane proteins play important roles in many biochemical processes and are also attractive targets of drug discovery for various diseases. The elucidation of membrane protein types provides clues for understanding the structure and function of proteins. Recently we developed a novel system for predicting protein subnuclear localizations. In this paper, we propose a simplified version of our system for predicting membrane protein types directly from primary protein structures, which incorporates amino acid classifications and physicochemical properties into a general form of pseudo-amino acid composition. In this simplified system, we design a two-stage multi-class support vector machine combined with a two-step optimal feature selection process, which proves very effective in our experiments. The performance of the present method is evaluated on two benchmark datasets consisting of five types of membrane proteins. The overall prediction accuracies for the five types are 93.25% and 96.61% via the jackknife test and the independent dataset test, respectively. These results indicate that our method is effective and valuable for predicting membrane protein types. A web server for the proposed method is available at http://www.juemengt.com/jcc/memty_page.php
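The pipeline below is a minimal sketch of this kind of system, not the authors' implementation: synthetic 20-dimensional vectors stand in for pseudo-amino acid composition features, a simple filter-style selection step replaces the paper's two-step optimal feature selection, and scikit-learn's SVC handles the multi-class case internally; the jackknife test corresponds to leave-one-out cross-validation.

```python
# Minimal sketch (not the authors' system): feature selection + multi-class SVM,
# evaluated with the jackknife (leave-one-out) test. The 20-dimensional vectors
# below are synthetic stand-ins for pseudo-amino acid composition features.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_class, n_features, n_types = 20, 20, 5          # five membrane protein types

# Each type gets a slightly shifted "composition" profile (purely illustrative).
X = np.vstack([rng.normal(loc=0.4 * t, scale=1.0, size=(n_per_class, n_features))
               for t in range(n_types)])
y = np.repeat(np.arange(n_types), n_per_class)

# Filter-style feature selection (a simplification of the paper's two-step
# optimal selection), then an RBF-kernel SVM; scikit-learn applies a
# one-vs-one scheme for the multi-class case.
model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=12),
                      SVC(kernel="rbf", C=10.0, gamma="scale"))

# Jackknife test = leave-one-out cross-validation over all 100 samples.
accuracy = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
print(f"jackknife accuracy on synthetic data: {accuracy:.3f}")
```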
Abstract:
In treatment comparison experiments, the treatment responses are often correlated with some concomitant variables which can be measured before or at the beginning of the experiments. In this article, we propose schemes for the assignment of experimental units that may greatly improve the efficiency of the comparison in such situations. The proposed schemes are based on general ranked set sampling. The relative efficiency and cost-effectiveness of the proposed schemes are studied and compared. It is found that some proposed schemes are always more efficient than the traditional simple random assignment scheme when the total cost is the same. Numerical studies show promising results using the proposed schemes.
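As a rough illustration of why concomitant-based ranking can help, the simulation below (my own construction, not the paper's schemes, with assumed parameters) pairs units, ranks each pair on a concomitant x, and alternates which rank receives each treatment so both groups see a balanced spread of x; the Monte Carlo variance of the estimated treatment effect is compared against simple random assignment.

```python
# Toy simulation (not the paper's schemes): ranked assignment of paired units
# versus simple random assignment, comparing the variance of the estimated
# treatment effect. The response depends on a concomitant x known in advance.
import numpy as np

rng = np.random.default_rng(1)
n_pairs, effect, beta = 20, 1.0, 2.0          # hypothetical design parameters

def trial(scheme: str) -> float:
    """Run one experiment and return the estimated treatment effect."""
    x = rng.normal(0.0, 1.0, size=(n_pairs, 2))              # concomitant variable
    if scheme == "ranked":
        x = np.sort(x, axis=1)                                # rank within each pair
        treat = np.tile([[0, 1], [1, 0]], (n_pairs // 2, 1))  # alternate the ranks
    else:                                                     # simple random assignment
        flat = np.zeros(2 * n_pairs, dtype=int)
        flat[rng.choice(2 * n_pairs, n_pairs, replace=False)] = 1
        treat = flat.reshape(n_pairs, 2)
    y = effect * treat + beta * x + rng.normal(0.0, 1.0, size=x.shape)
    return y[treat == 1].mean() - y[treat == 0].mean()

reps = 3000
var_ranked = np.var([trial("ranked") for _ in range(reps)])
var_random = np.var([trial("random") for _ in range(reps)])
print(f"effect-estimate variance  ranked: {var_ranked:.4f}  random: {var_random:.4f}")
```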
Abstract:
This paper deals with new results obtained in regard to the reconstruction properties of side-band Fresnel holograms (SBFH) of self-imaging-type objects (for example, gratings) as compared with those of general objects. The major finding is that a distribution I2, which appears on the real-image plane along with the conventional real image I1, remains a 2Z distribution (where 2Z is the axial distance between the object and its self-imaging plane) under a variety of situations, while its nature and focusing properties differ from one situation to another. It is demonstrated that the two distributions I1 and I2 can be used in the development of a novel technique for image subtraction.
Abstract:
Consider a general regression model with an arbitrary and unknown link function and a stochastic selection variable that determines whether the outcome variable is observable or missing. The paper proposes U-statistics that are based on kernel functions as estimators for the directions of the parameter vectors in the link function and the selection equation, and shows that these estimators are consistent and asymptotically normal.
Abstract:
Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971) considered optimal set size for ranked set sampling (RSS) with fixed operational costs. This framework can be very useful in practice for determining whether RSS is beneficial and for obtaining the optimal set size that minimizes the variance of the population estimator for a fixed total cost. In this article, we propose a scheme of general RSS in which more than one observation can be taken from each ranked set. This is shown to be more cost-effective in some cases when the cost of ranking is not so small. Using the example in Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971), we demonstrate that taking two or more observations from each set, even with the optimal set size from the RSS design, can be more beneficial.
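The following sketch (my own construction, using assumed parameters rather than anything from the paper) contrasts simple random sampling, classical RSS, and a general RSS in which two observations are measured per ranked set, holding the number of measured observations fixed; with fewer sets to rank, the general scheme trades a little efficiency for a lower ranking cost.

```python
# Illustrative Monte Carlo comparison (not the paper's computation): SRS,
# classical RSS, and general RSS with two measured observations per ranked set.
# Ranking is assumed perfect; the total number of measured observations is fixed.
import numpy as np

rng = np.random.default_rng(2)
draw = lambda n: rng.normal(0.0, 1.0, n)      # hypothetical response distribution

def general_rss(set_size, n_sets, obs_per_set):
    """Measure `obs_per_set` units from each of `n_sets` ranked sets,
    cycling the measured ranks so every rank is used equally often."""
    measured = []
    for i in range(n_sets):
        ranked_set = np.sort(draw(set_size))                 # perfect ranking
        ranks = [(i * obs_per_set + j) % set_size for j in range(obs_per_set)]
        measured.extend(ranked_set[ranks])
    return np.array(measured)

def mc_variance(sample_fn, reps=4000):
    """Monte Carlo variance of the sample mean under a given sampling scheme."""
    return np.var([sample_fn().mean() for _ in range(reps)])

k, n_measured = 4, 20
var_srs  = mc_variance(lambda: draw(n_measured))                    # 20 units, no ranking
var_rss  = mc_variance(lambda: general_rss(k, n_measured, 1))       # 20 sets ranked
var_grss = mc_variance(lambda: general_rss(k, n_measured // 2, 2))  # only 10 sets ranked
print(f"variance of mean  SRS: {var_srs:.4f}  RSS: {var_rss:.4f}  general RSS: {var_grss:.4f}")
```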
Abstract:
This doctoral thesis in theoretical philosophy is a systematic analysis of Karl Popper's philosophy of science and its relation to his theory of three worlds. The general aim is to study Popper's philosophy of science and to show that Popper's theory of three worlds was a restatement of his earlier positions. As a result, a new reading of Popper's philosophy and development is offered, and the theory of three worlds is analysed in a new manner. It is suggested that the theory of three worlds is not a purely ontological theory, but has a profound epistemological motivation. In Part One, Popper's epistemology and philosophy of science are analysed. It is claimed that Popper's thinking was bifurcated: he held two profound positions without noticing the tension between them. Popper adopted the position called here the theorist around 1930 and focused on the logical structure of scientific theories. In Logik der Forschung (1935), he attempted to build a logic of science on the grounds that scientific theories may be regarded as universal statements which are not verifiable but can be falsified. Later, Popper emphasized another position, called here the processionalist. Popper focused on the study of science as a process and held that a) philosophy of science should study the growth of knowledge and that b) all cognitive processes are constitutive. Moreover, the constitutive idea that we see the world in the searchlight of our theories was combined with the biological insight that knowledge grows by trial and error. In Part Two, the theory of three worlds is analysed systematically. The theory is discussed as a cluster of theories which originate from Popper's attempt to solve some internal problems in his thinking. Popper adhered to realism and wished to reconcile the theorist and the processionalist. He also stressed the real and active nature of the human mind, and the possibility of objective knowledge. Finally, he wished to create a scientific world view.
Abstract:
Bertrand Russell (1872–1970) introduced the English-speaking philosophical world to modern, mathematical logic and the foundational study of mathematics. The present study concerns the conception of logic that underlies his early logicist philosophy of mathematics, formulated in The Principles of Mathematics (1903). In 1967, Jean van Heijenoort published a paper, "Logic as Language and Logic as Calculus", in which he argued that the early development of modern logic (roughly the period 1879–1930) can be understood when considered in the light of a distinction between two essentially different perspectives on logic. According to the view of logic as language, logic constitutes the general framework for all rational discourse, or meaningful use of language, whereas the conception of logic as calculus regards logic more as a symbolism which is subject to reinterpretation. The calculus view paves the way for systematic metatheory, where logic itself becomes a subject of mathematical study (model theory). Several scholars have interpreted Russell's views on logic with the help of the interpretative tool introduced by van Heijenoort. They have commonly argued that Russell's is a clear-cut case of the view of logic as language. In the present study a detailed reconstruction of this view and its implications is provided, and it is argued that the interpretation is seriously misleading as to what Russell really thought about logic. I argue that Russell's conception is best understood by setting it in its proper philosophical context, which is constituted by Immanuel Kant's theory of mathematics. Kant had argued that purely conceptual thought, basically the logical forms recognised in Aristotelian logic, cannot capture the content of mathematical judgments and reasonings. Mathematical cognition is not grounded in logic but in space and time as the pure forms of intuition. Against this view, Russell argued that once logic is developed into a proper tool which can be applied to mathematical theories, Kant's views turn out to be completely wrong. In the present work the view is defended that Russell's logicist philosophy of mathematics, or the view that mathematics is really only logic, is based on what I term the "Bolzanian account of logic". According to this conception, (i) the distinction between form and content is not explanatory in logic; (ii) the propositions of logic have genuine content; (iii) this content is conferred upon them by special entities, "logical constants". The Bolzanian account, it is argued, is both historically important and throws genuine light on Russell's conception of logic.
Abstract:
This article discusses the design and development of GRDB (General Purpose Relational Data Base System), which has been implemented on a DEC-1090 system in Pascal. GRDB is a general-purpose database system designed to be completely independent of the nature of the data to be handled, since it is not tailored to the specific requirements of any particular enterprise. It can handle different types of data such as variable-length records and textual data. Apart from the usual database facilities such as data definition and data manipulation, GRDB supports a User Definition Language (UDL) and a security definition language. These facilities are provided through a SEQUEL-like General Purpose Query Language (GQL). GRDB provides adequate protection facilities up to the relation level. The concept of a "security matrix" is used to provide database protection, and a Unique IDentification number (UID) and password are used to ensure user identification and authentication. Static integrity constraints are used to ensure data integrity. Considerable effort has been made to improve response time through indexing on the data files and query optimisation. GRDB is designed for interactive use, but provision has also been made for its use in batch mode. A typical Air Force application (consisting of data about personnel, inventory control, and maintenance planning) has been used to test GRDB, and it has been found to perform satisfactorily.
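As a toy illustration of the relation-level security matrix idea (my own construction, not GRDB code or GQL), the sketch below keys each cell of the matrix by an authenticated UID and a relation name and stores the operations that user may perform; the relation names echo the Air Force test application.

```python
# Toy illustration of a relation-level "security matrix" with UID/password
# authentication (my own construction, not GRDB code). Passwords are kept in
# plain text purely for brevity.
from dataclasses import dataclass, field

@dataclass
class SecurityMatrix:
    users: dict = field(default_factory=dict)    # UID -> password
    grants: dict = field(default_factory=dict)   # (UID, relation) -> permitted operations

    def authenticate(self, uid: str, password: str) -> bool:
        return self.users.get(uid) == password

    def grant(self, uid: str, relation: str, *ops: str) -> None:
        self.grants.setdefault((uid, relation), set()).update(ops)

    def check(self, uid: str, relation: str, op: str) -> bool:
        return op in self.grants.get((uid, relation), set())

# Hypothetical relations mirroring the Air Force test application described above.
sm = SecurityMatrix(users={"u042": "secret"})
sm.grant("u042", "PERSONNEL", "SELECT")
sm.grant("u042", "INVENTORY", "SELECT", "UPDATE")

if sm.authenticate("u042", "secret"):
    print(sm.check("u042", "PERSONNEL", "UPDATE"))   # False: not granted
    print(sm.check("u042", "INVENTORY", "UPDATE"))   # True
```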
Abstract:
The use of capacitors for electrical energy storage actually predates the invention of the battery. Alessandro Volta is credited with the invention of the battery in 1800, when he first described a battery as an assembly of plates of two different materials (such as copper and zinc) placed in an alternating stack and separated by paper soaked in brine or vinegar [1]. Accordingly, this device was referred to as Volta's pile and formed the basis of subsequent revolutionary research and discoveries on the chemical origin of electricity. Before the advent of Volta's pile, however, eighteenth-century researchers relied on the use of Leyden jars as a source of electrical energy. Built in the mid-1700s at the University of Leyden in Holland, a Leyden jar is an early capacitor consisting of a glass jar coated inside and outside with a thin layer of silver foil [2, 3]. With the outer foil grounded, the inner foil could be charged with an electrostatic generator, or a source of static electricity, and could produce a strong electrical discharge from a small and comparatively simple device.
Abstract:
Australia’s and New Zealand’s major agricultural manure management emission sources are reported to be, in descending order of magnitude: (1) methane (CH4) from dairy farms in both countries; (2) CH4 from pig farms in Australia; and nitrous oxide (N2O) from (3) beef feedlots and (4) poultry sheds in Australia. We used the literature to critically review these inventory estimates. Alarmingly for dairy farm CH4 (1), our review revealed assumptions and omissions that, when addressed, could dramatically increase this emission estimate. The estimate of CH4 from Australian pig farms (2) appears to be accurate, according to industry data and field measurements. The N2O emission estimates for beef feedlots (3) and poultry sheds (4) are based on northern hemisphere default factors whose appropriateness for Australia is questionable and unverified. Therefore, most of Australasia’s key livestock manure management greenhouse gas (GHG) emission profiles are either questionable or unsubstantiated by region-specific research. Encouragingly, GHG emissions from dairy shed manure are relatively easy to mitigate because they come from a point source which can be managed by several ‘close-to-market’ abatement solutions. Reducing these manure emissions therefore constitutes an opportunity for meaningful action sooner, compared with the more difficult-to-implement and long-term strategies that currently dominate agricultural GHG mitigation research. At an international level, our review highlights the critical need to carefully reassess GHG emission profiles, particularly if such assessments have not been made since the compilation of the original inventories. Failure to act in this regard presents the very real risk of missing the ‘low-hanging fruit’ in the rush towards a meaningful response to climate change.
Abstract:
This study examines philosophically the main theories and methodological assumptions of the field known as the cognitive science of religion (CSR). The study makes a philosophically informed reconstruction of the methodological principles of the CSR, indicates problems with them, and examines possible solutions to these problems. The study focuses on several different CSR writers, namely, Scott Atran, Justin Barrett, Pascal Boyer and Dan Sperber. CSR theorising is done at the intersection of the cognitive sciences, anthropology and evolutionary psychology. This multidisciplinary nature makes CSR a fertile ground for philosophical considerations coming from philosophy of psychology, philosophy of mind and philosophy of science. The study begins by spelling out the methodological assumptions and auxiliary theories of CSR writers by situating these theories and assumptions in the nexus of existing approaches to religion. The distinctive feature of CSR is its emphasis on information processing: CSR writers claim that contemporary cognitive sciences can inform anthropological theorising about the human mind and offer tools for producing causal explanations. Further, they claim to explain the prevalence and persistence of religion by the cognitive systems that undergird religious thinking. I also examine the core theoretical contributions of the field, focusing mainly on (1) the "minimal counter-intuitiveness" hypothesis and (2) the different ways in which supernatural agent representations activate our cognitive systems. Generally speaking, CSR writers argue for the naturalness of religion: religious ideas and practices are widespread and pervasive because human cognition operates in such a way that religious ideas are easy to acquire and transmit. The study raises two philosophical problems, namely, the "problem of scope" and the "problem of religious relevance". The problem of scope is created by the insistence of several critics of the CSR that CSR explanations are mostly irrelevant for explaining religion, whereas most CSR writers themselves hold that cognitive explanations can answer most of our questions about religion. I argue that the problem of scope is created by differences in explanation-begging questions: the former group is interested in explaining different things than the latter group. I propose that we should not stick too rigidly to one set of methodological assumptions, but rather acknowledge that different assumptions might help us to answer different questions about religion. Instead of adhering to some robust metaphysics, as some strongly naturalistic writers argue, we should adopt a pragmatic and explanatory pluralist approach which would allow different kinds of methodological presuppositions in the study of religion, provided that they attempt to answer different kinds of why-questions, since religion appears to be a multi-faceted phenomenon that spans a variety of fields of the special sciences. The problem of religious relevance is created by the insistence of some writers that CSR theories show religious beliefs to be false or irrational, whereas others invoke CSR theories to defend certain religious ideas. The problem is interesting because it reveals the more general philosophical assumptions of those who make such interpretations. CSR theories can be (and have been) interpreted in terms of three different philosophical frameworks: strict naturalism, broad naturalism and theism. I argue that CSR theories can be interpreted within all three frameworks without doing violence to the theories, and that these frameworks give different kinds of results regarding the religious relevance of CSR theories.
Abstract:
A graphical method is presented for synthesis of the general, seven-link, two-degree-of-freedom plane linkage to generate functions of two variables. The method is based on point position reduction and permits synthesis of the linkage to satisfy up to six arbitrarily selected precision positions.