869 results for Limit of meaning of a Scientific Theory
Abstract:
This thesis traces a genealogy of the discourse of mathematics education reform in Ireland at the beginning of the twenty-first century, at a time when the hegemonic political discourse is that of neoliberalism. It draws on the work of Michel Foucault to identify the network of power relations involved in the development of a single case of curriculum reform – in this case Project Maths. It identifies the construction of an apparatus within the fields of politics, economics and education, the elements of which include institutions like the OECD and the Government, the bureaucracy, expert groups and special interest groups, the media, the school, the State, state assessment and international assessment. Five major themes in educational reform emerge from the analysis: the arrival of neoliberal governance in Ireland; the triumph of human capital theory as the hegemonic educational philosophy here; the dominant role of OECD/PISA and its values in the mathematics education discourse in Ireland; the fetishisation of western scientific knowledge and knowledge as commodity; and the formation of a new kind of subjectivity, namely the subjectivity of the young person as a form of human-capital-to-be. In particular, it provides a critical analysis of the influence of OECD/PISA on the development of mathematics education policy here – especially on Project Maths curriculum, assessment and pedagogy. It unpacks the arguments in favour of curriculum change and lays bare their ideological foundations. This discourse contextualises educational change as occurring within a rapidly changing economic environment, where the State's economic aspirations and developments in science, technology and communications are reshaping both the focus of business and the demands being put on education.
Within this discourse, education is to be repurposed and its consequences measured against the paradigm of the Knowledge Economy – usually characterised as the inevitable or necessary future of a carefully defined present.
Abstract:
Tony Mann provides a review of the book: Theory of Games and Economic Behavior, John von Neumann and Oskar Morgenstern, Princeton University Press, 1944.
Abstract:
1. Barnacles are a good model organism for the study of open populations with space-limited recruitment. These models are applicable to other species with an open supply of new individuals and resource limitation. The inclusion of space in models leads to reductions in recruitment with increasing density, and thus predictions of population size and stability are possible.
2. Despite the potential generality of a demographic theory for open space-limited populations, the models currently have a narrow empirical base. In this study, a model for an open population with space-limited recruitment was extended to include size-specific survival and promotions to any size class. The assumptions of this model were tested using data from a pan-European study of the barnacle Chthamalus montagui Southward. Two models were constructed: a 6-month model and a periodic annual model. Predicted equilibria and their stabilities were compared between shores.
3. Tests of model assumptions supported the extension of the theory to include promotions to any size class. Mortality was found to be size-specific and density independent. Studied populations were open, with recruitment proportional to free space.
4. The 6-month model showed a significant interaction between time and location for equilibrium free space. This may have been due to contrasts in the timing of structuring processes (i.e. creating and filling space) between Mediterranean and Atlantic systems. Integration of the 6-month models into a periodic annual model removed the differences in equilibrium free space between locations.
5. Model predictions show a remarkable similarity between shores at a European scale. Populations were persistent and all solutions were stable. This reflects the apparent absence of density-dependent mortality and a high adult survivorship in C. montagui. As populations are intrinsically stable, observations of fluctuations in density are directly attributable to variations in the environmental forcing of recruitment or mortality.
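The core mechanism described above – recruitment proportional to free space with density-independent survival – can be sketched in a minimal single-size-class recursion. This is an illustrative toy, not the paper's size-structured model, and all parameter values are assumptions:

```python
# Toy open population with space-limited recruitment (illustrative sketch):
# settlers arrive in proportion to free space; survival is density independent.
A = 100.0    # total substrate area (assumed units)
area = 1.0   # area occupied per individual (single size class, for brevity)
s = 0.9      # per-step survival probability (density independent)
c = 0.4      # settlement rate per unit of free space

n = 0.0
for _ in range(200):
    free_space = max(A - area * n, 0.0)
    n = s * n + c * free_space

# Analytical equilibrium of the linear recursion: n* = cA / (1 - s + c*area)
n_star = c * A / (1 - s + c * area)
```

Because survival and settlement are density independent, the recursion is linear with slope s − c·area (here 0.5), so the equilibrium is stable and the iterate converges to n*; fluctuations can then only come from external forcing of recruitment or mortality, as the abstract notes.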
Abstract:
A combined experimental and theoretical investigation of the nature of the active form of gold in oxide-supported gold catalysts for the water gas shift reaction has been performed. In situ extended X-ray absorption fine structure (EXAFS) and X-ray absorption near-edge structure (XANES) experiments have shown that in the fresh catalysts the gold is in the form of highly dispersed gold ions. However, under water gas shift reaction conditions, even at temperatures as low as 100 degrees C, the evidence from EXAFS and XANES is only consistent with rapid, and essentially complete, reduction of the gold to form metallic clusters containing about 50 atoms. The presence of Au-Ce distances in the EXAFS spectra, and the fact that about 15% of the gold atoms can be reoxidized after exposure to air at 150 degrees C, are indicative of a close interaction between a fraction (ca. 15%) of the gold atoms and the oxide support. Density functional theory (DFT) calculations are entirely consistent with this model and suggest that an important aspect of the active and stable form of gold under water gas shift reaction conditions is the location of a partially oxidized gold (Au^δ+) species at a cerium cation vacancy in the surface of the oxide support. It is found that even with low-loading gold catalysts (0.2%) the fraction of ionic gold under water gas shift conditions is below the limit of detection by XANES (<5%). It is concluded that under water gas shift reaction conditions the active form of gold comprises small metallic gold clusters in intimate contact with the oxide support.
Abstract:
Transparency in the nonprofit sector and in foundations, as an element that enhances stakeholders' confidence in the organization, is a fact shown by several studies in recent decades. Transparency can be considered in various fields and through different channels. In our study we focus on the analysis of the organizational and economic transparency of foundations, as shown through the voluntary information on their websites. We review previous theoretical studies in order to place foundations within the framework of the social economy. This theoretical framework focuses on the accountability that foundations provide in relation to their social function and their management, especially given the recent emphasis on information transparency through the website. Within this framework, an index was constructed to quantify the voluntary information shown on a foundation's website. This index was developed ad hoc for this study and applied to a group of large corporate foundations. The application of these data yields two kinds of results, at a descriptive level and at an inferential level. We analyze the statistical correlation between the economic transparency and the organizational transparency offered on the website, using quantified variables in a multiple linear regression. This empirical analysis allows us to draw conclusions about the level of transparency offered by these organizations in relation to their organizational and financial information, as well as to explain the relation between them.
Abstract:
This paper presents an overview of R-matrix theory of electron scattering by diatomic and polyatomic molecules. The paper commences with a detailed discussion of the fixed-nuclei approximation which in recent years has been used as the basis of the most accurate ab initio calculations. This discussion includes an overview of the computer codes which enable electron collisions with both diatomic and polyatomic molecules to be calculated. Nuclear motion including rotational and vibrational excitation and dissociation is then discussed. In non-resonant energy regions, or when the scattered electron energy is not close to thresholds, the adiabatic-nuclei approximation can be successfully used. However, when these conditions are not applicable, non-adiabatic R-matrix theory must be used and a detailed discussion of this theory is given. Finally, recent applications of the theory to treat electron scattering by polyatomic molecules are reviewed and a detailed comparison of R-matrix calculations and experimental measurements for water is presented.
Abstract:
Alloying metals is often used as an effective way to enhance the reactivity of surfaces. Aiming to shed light on the effect of alloying on reaction mechanisms, we carry out a comparative study of CO oxidation on Cu3Pt(111), Pt(111), and Cu(111) by means of density functional theory calculations. Alloying effects on the bonding sites and bonding energies of adsorbates, and on the reaction pathways, are investigated. It is shown that CO preferentially adsorbs on an atop site of Pt and O preferentially adsorbs on an fcc hollow site of three Cu atoms on Cu3Pt(111). It is also found that the adsorption energies of CO (or O(a)) decrease on Pt (or Cu) on the alloy surface with respect to those on the pure metals. More importantly, having identified the transition states for CO oxidation on these three surfaces, we find an interesting trend in the reaction barrier: as with the adsorption energies, the barrier on Cu3Pt takes a value intermediate between those on pure Pt and Cu. The physical origin of these results is analyzed in detail. (C) 2001 American Institute of Physics.
Abstract:
Dealing with uncertainty problems in intelligent systems has attracted a lot of attention in the AI community. Quite a few techniques have been proposed. Among them, the Dempster-Shafer theory of evidence (DS theory) has been widely appreciated. In DS theory, Dempster's combination rule plays a major role. However, it has been pointed out that the application domains of the rule are rather limited and the application of the theory sometimes gives unexpected results. We have previously explored the problem with Dempster's combination rule and proposed an alternative combination mechanism in generalized incidence calculus. In this paper we give a comprehensive comparison between generalized incidence calculus and the Dempster-Shafer theory of evidence. We first prove that these two theories have the same ability in representing evidence and combining DS-independent evidence. We then show that the new approach can deal with some dependent situations while Dempster's combination rule cannot. Various examples in the paper show the ways of using generalized incidence calculus in expert systems.
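Since the abstract above turns on Dempster's combination rule, a minimal sketch of the rule may help: masses on intersecting focal elements are multiplied, conflicting mass is discarded, and the remainder is renormalized. The frame of discernment and the mass values below are hypothetical illustrations:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule, normalising away conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Intersecting focal elements: accumulate the product mass.
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            # Empty intersection: this mass is conflict, to be normalised away.
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; rule undefined")
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

# Hypothetical frame of discernment {rain, sun}; two DS-independent sources.
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"sun"}): 0.5, frozenset({"rain", "sun"}): 0.5}
result = dempster_combine(m1, m2)
```

The renormalization step (dividing by 1 − conflict) is precisely what the paper identifies as problematic for dependent or highly conflicting evidence, which motivates the alternative combination mechanism in generalized incidence calculus.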
Abstract:
Masses and progenitor evolutionary states of Type II supernovae remain almost unconstrained by direct observations. Only one robust observation of a progenitor (SN 1987A) and one plausible observation (SN 1993J) are available. Neither matched theoretical predictions, and in this Letter we report limits on a third progenitor (SN 1999gi). The Hubble Space Telescope (HST) has imaged the site of the Type II-P supernova SN 1999gi with the Wide Field Planetary Camera 2 (WFPC2) in two filters (F606W and F300W) prior to explosion. The distance to the host galaxy (NGC 3184) of 7.9 Mpc means that the most luminous, massive stars are resolved as single objects in the archive images. The supernova occurred in a resolved, young OB association 2.3 kpc from the center of NGC 3184, with an association age of about 4 Myr. Follow-up images of SN 1999gi with WFPC2 taken 14 months after discovery determine the precise position of the supernova on the preexplosion frames. An upper limit on the absolute magnitude of the progenitor is estimated (M_V ≥ −5.1). By comparison with stellar evolutionary tracks, this can be interpreted as a stellar mass, and we determine an upper mass limit of 9 (+3/−2) M☉. We discuss the possibility of determining the masses or mass limits for numerous nearby core-collapse supernovae using the HST archive enhanced by our current SNAP program.
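The absolute-magnitude limit quoted above follows from the standard distance modulus, m − M = 5 log10(d/10 pc). As a back-of-envelope check at the stated 7.9 Mpc (our own arithmetic, not taken from the Letter):

```python
import math

# Distance to NGC 3184 as given in the abstract, converted to parsecs.
d_pc = 7.9e6

# Distance modulus: mu = m - M = 5 * log10(d / 10 pc)
mu = 5 * math.log10(d_pc / 10)

# Apparent V magnitude corresponding to the progenitor limit M_V = -5.1
# (extinction neglected in this rough sketch).
m_limit = mu - 5.1
```

This places the detection threshold near apparent magnitude 24.4, a depth WFPC2 archive frames can reach, which is why the non-detection translates into a meaningful progenitor mass limit.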
Abstract:
Microbial ecology is currently undergoing a revolution, with repercussions spreading throughout microbiology, ecology and ecosystem science. The rapid accumulation of molecular data is uncovering vast diversity, abundant uncultivated microbial groups and novel microbial functions. This accumulation of data requires the application of theory to provide organization, structure, mechanistic insight and, ultimately, predictive power that is of practical value, but the application of theory in microbial ecology is currently very limited. Here we argue that the full potential of the ongoing revolution will not be realized if research is not directed and driven by theory, and that the generality of established ecological theory must be tested using microbial systems.
Abstract:
In two experiments, we tested some of the central claims of the empathizing-systemizing (E-S) theory. Experiment 1 showed that the systemizing quotient (SQ) was unrelated to performance on a mathematics test, although it was correlated with statistics-related attitudes, self-efficacy, and anxiety. In Experiment 2, systemizing skills, and gender differences in these skills, were more strongly related to spatial thinking styles than to SQ. In fact, when we partialled the effect of spatial thinking styles, SQ was no longer related to systemizing skills. Additionally, there was no relationship between the Autism Spectrum Quotient (AQ) and the SQ, or skills and interest in mathematics and mechanical reasoning. We discuss the implications of our findings for the E-S theory, and for understanding the autistic cognitive profile.
Abstract:
Often the modification and enhancement of large scientific software systems are severely hampered because many components of the system are written in an implementation-dependent fashion, are inadequately documented, and have functionalities that are not precisely known. In this paper we consider how mathematics may be employed to alleviate some of these problems. In particular, we illustrate how the formal specification notation VDM-SL is being used to precisely specify abstract data types for use in the development of scientific software.
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Relationships between pairs of factors have also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification, and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.