900 results for Which-way experiments
Abstract:
Modern toxicology investigates a wide array of both old and new health hazards. Priority setting is needed to select agents for research from the plethora of exposure circumstances. Changing societies and a growing proportion of elderly people have to be taken into consideration. Precise exposure assessment is important for risk estimation and regulation. Toxicology contributes to the exploration of pathomechanisms in order to specify the exposure metrics used for risk estimation. The combined effects of co-existing agents are not yet sufficiently understood. Animal experiments allow the separate administration of agents that cannot be disentangled by epidemiological means, but their value is limited at the low exposure levels typical of many of today's settings. As an experimental science, toxicology has to keep pace with the rapidly growing knowledge about the language of the genome and the changing paradigms of cancer development. Toxicogenomics was developed during the pioneering era in which a working draft of the human genome was assembled. Gene and pathway complexity have to be considered when investigating gene-environment interactions. To conduct studies well, modern toxicology needs a close liaison with many other disciplines, such as epidemiology and bioinformatics.
Abstract:
Processing language is postulated to involve a mental simulation, or re-enactment, of perceptual, motor, and introspective states that were acquired experientially (Barsalou, 1999, 2008). One such aspect that is mentally simulated during the processing of certain concepts is spatial location. For example, upon processing the word "moon", the prominent spatial location of the concept (e.g. 'upward') is mentally simulated. In six eye-tracking experiments, we investigate how mental simulations of spatial location affect processing. We first address a conflict in the previous literature, in which processing has been shown to be affected in both facilitatory and inhibitory ways. Two of our experiments showed that mental simulations of spatial association facilitate saccades launched toward compatible locations; a third experiment, however, showed an inhibitory effect on saccades launched toward incompatible locations. We investigated these differences in further experiments, which led us to conclude that the nature of the effect (facilitatory or inhibitory) depends on the demands of the task and, in keeping with the theory of Grounded Cognition (Barsalou, 2008), that mental simulations affect processing in a dynamic way. Three further experiments explored the nature of verticality: specifically, whether 'up' is perceived as away from gravity or as above our head. Using similar eye-tracking methods, and by manipulating the position of participants, we were able to dissociate these two possibilities. The results showed that mental simulations of spatial location facilitated saccades to compatible locations, but only when verticality was dissociated from gravity (i.e. 'up' was above the participant's head). We conclude that this is not due to an 'embodied' mental simulation, but is rather the result of a heavily ingrained visuo-motor association between vertical space and eye movements.
Abstract:
Mapping the physical world, the arrangement of continents and oceans, cities and villages, mountains and deserts, while not without its own contentious aspects, can at least draw upon centuries of previous work in cartography and discovery. To map virtual spaces is another challenge altogether. Are cartographic conventions applicable to depictions of the blogosphere, or of the internet in general? Is a more mathematical approach required even to start making sense of the shape of the blogosphere, to understand the network created by and between blogs? In my research comparing information flows in Australian and French political blogs, visualising the data obtained is important, as it can demonstrate the spread of ideas and topics across blogs. However, how best to depict the flows, the links, and the spaces between them is still unclear. Are network theory and systems of hubs and nodes more relevant to the research at hand than mass communication theories, influencing the nature of any map produced? Is it even a good idea to try to apply boundaries like 'Australian' and 'French' to parts of a map that does not reflect international borders or the Mercator projection? While drawing upon some of my work-in-progress, this paper will also evaluate previous maps of the blogosphere and approaches to depicting networks of blogs. As such, the paper will provide a greater awareness of the tools available and of the strengths and limitations of mapping methodologies, helping to shape the direction of my research in a field still very much under development.
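As an illustration of the hub-and-node view the abstract mentions (an editorial addition, not part of the paper; the blog names and link data are hypothetical placeholders), a blogosphere can be treated as a directed graph whose heavily linked-to nodes are candidate hubs:

```python
# Sketch: a blogosphere as a directed graph of inter-blog links.
# Blog names and edges are hypothetical placeholders.
import networkx as nx

# Each edge (a, b) means blog a links to blog b.
links = [
    ("blog_au_1", "blog_au_2"),
    ("blog_au_2", "blog_fr_1"),
    ("blog_fr_1", "blog_fr_2"),
    ("blog_fr_2", "blog_au_1"),
    ("blog_au_1", "blog_fr_1"),
]
g = nx.DiGraph(links)

# In-degree as a crude proxy for prominence: heavily linked-to blogs.
for blog, indeg in sorted(g.in_degree, key=lambda pair: -pair[1]):
    print(f"{blog}: {indeg} inbound links")

# HITS distinguishes hubs (blogs linking out to well-linked sources)
# from authorities (blogs that many hubs link to).
hubs, authorities = nx.hits(g)
print("strongest authority:", max(authorities, key=authorities.get))
```

Whether a force-directed layout of such a graph counts as a 'map' in the cartographic sense is, of course, exactly the question the paper raises.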
Abstract:
Players' experience of spatiality in first-person, single-player games is informed by the maps and navigational aids the games provide. This project uses textual analysis to examine the ways these maps and navigational aids inform the experience of spatiality in Fallout 3, BioShock and BioShock 2. Spatiality is understood as trialectic, incorporating perceived, conceived and lived space, drawing on the work of Henri Lefebvre and Edward Soja. The most prominent elements of the games' maps and navigational aids are analysed in terms of how they inform players' experience of the games' spaces. In particular, this project examines the in-game maps these games incorporate, the waypoint navigation and fast-travel systems of Fallout 3, and the guide arrow and environmental cues of the BioShock games.
Abstract:
Three ants try to nudge a piece of cake perched at the top of a very peculiar hill.
Abstract:
Poverty is a global problem that affects people in different ways. The purpose of this article is to explore two major theories that address poverty and the possibility of overcoming it: the human capital and human capabilities approaches. The human capital approach focuses exclusively on the economic facet of poverty; from this perspective, poverty is defined as a lack of money and can be addressed by increasing the financial income of people living in poverty. The human capabilities approach views poverty as a multidimensional problem that extends beyond economics into areas such as health, education, and freedom. This approach is oriented toward social change and toward helping people living in poverty to discover and develop their potential. The author considers that the human capabilities approach more accurately captures the scope of poverty and the people affected by it, although its breadth has made it difficult to design and implement effective policies that address all facets of poverty.
Abstract:
We experimentally determine weak values for a single photon's polarization, obtained via a weak measurement that employs a two-photon entangling operation, and postselection. The weak values cannot be explained by a semiclassical wave theory, due to the two-photon entanglement. We observe the variation in the size of the weak value with measurement strength, obtaining an average measurement of the S₁ Stokes parameter more than an order of magnitude outside the operator's spectrum for the smallest measurement strengths.
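For context, the textbook definition of a weak value (standard background, not quoted from the abstract): for an observable A with a preselected state |ψ⟩ and a postselected state |φ⟩,

```latex
A_w \;=\; \frac{\langle \phi \,|\, \hat{A} \,|\, \psi \rangle}{\langle \phi \,|\, \psi \rangle}
```

Because the denominator ⟨φ|ψ⟩ can be made arbitrarily small by choosing nearly orthogonal pre- and postselection, A_w is not confined to the eigenvalue range of the operator; this is how an average measurement of S₁, whose eigenvalues are ±1, can land more than an order of magnitude outside the spectrum.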
Abstract:
Although the Standard Model of particle physics (SM) provides an extremely successful description of ordinary matter, astronomical observations show that it accounts for only around 5% of the total energy density of the Universe, with around 30% contributed by dark matter. Motivated by anomalies in cosmic-ray observations and by attempts to resolve open questions of the SM, such as the muon (g-2) discrepancy, proposed U(1) extensions of the SM gauge group have attracted attention in recent years. In the U(1) extensions considered here, a new, light messenger particle, the hidden photon, couples to the hidden sector as well as to the electromagnetic current of the SM through kinetic mixing. This allows a search for the particle in laboratory experiments exploring the electromagnetic interaction. Various experimental programs have been started to search for hidden photons, for instance in electron-scattering experiments, which are a versatile tool for exploring a variety of physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies, as performed at MAMI or at JLAB. These experiments investigate the scattering of an electron beam off a hadronic target, e + (A,Z) → e + (A,Z) + l⁺l⁻, and search for a very narrow resonance in the invariant mass distribution of the lepton pair. This requires an accurate understanding of the theoretical basis of the underlying processes. To that end, the first part of this work demonstrates how the hidden photon can be motivated by existing puzzles at the precision frontier of the SM. The main part of the thesis analyses the theoretical framework for electron-scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for bremsstrahlung emission of hidden photons in such experiments is studied. Based on these results, the applicability of the Weizsäcker-Williams approximation, widely used in the design of such experimental setups, to calculating the signal cross section is investigated. In a next step, the reaction e + (A,Z) → e + (A,Z) + l⁺l⁻ is analysed as both signal and background process in order to describe existing data from the A1 experiment at MAMI, with the aim of giving accurate predictions of exclusion limits for the hidden-photon parameter space. Finally, the methods derived are used to make predictions for future experiments, e.g., at MESA or at JLAB, allowing a comprehensive study of the discovery potential of these complementary experiments. In the last part, a feasibility study of probing the hidden-photon model with rare kaon decays is performed. For this purpose, both invisible and visible decays of the hidden photon are considered within different classes of models. This makes it possible to derive bounds on the parameter space from existing data and to estimate the reach of future experiments.
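For orientation, the kinetic-mixing Lagrangian in the standard form used throughout the hidden-photon literature (background material, not quoted from the thesis itself):

```latex
% Kinetic mixing of the hidden photon A'_\mu with the SM photon,
% with mixing parameter \varepsilon and hidden-photon mass m_{\gamma'}:
\mathcal{L} \supset
  -\tfrac{1}{4}\, F_{\mu\nu} F^{\mu\nu}
  -\tfrac{1}{4}\, F'_{\mu\nu} F'^{\mu\nu}
  -\tfrac{\varepsilon}{2}\, F_{\mu\nu} F'^{\mu\nu}
  +\tfrac{1}{2}\, m_{\gamma'}^{2}\, A'_{\mu} A'^{\mu}
```

Here F'_{μν} is the field strength of the hidden-photon field A'_μ. After the kinetic term is diagonalized, the hidden photon couples to the electromagnetic current with the suppressed strength εe, which is why it would appear as a narrow l⁺l⁻ resonance at the mass m_{γ'} in the invariant mass spectrum.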
Abstract:
This is a study of the use of metaphors in the construction of models by the Scottish physicist James Clerk Maxwell. The aim of the research was to understand in what way the use of metaphors and models is legitimate in science and to what extent it contributes to its success. It also seeks to understand to what extent devices such as models and analogies between distinct branches of science drove the explanatory and predictive success of Maxwell's theory. It explores the theological and philosophical beliefs of the author, who saw the world as a unity, permitting analogy between distinct branches of physics. His work on theories of heat, colours, optics, magnetism, and electricity makes this vision evident throughout his oeuvre. Maxwell is considered the inaugurator of a new methodology based on the use of models and metaphors. The study examines the development of the theory of colours, the mathematical description of the stability of Saturn's rings, and the development of the theory of gases as a preamble to the discussion of the theory of electromagnetism. It describes the theoretical development of electromagnetism at its various stages. The construction of the theory of electromagnetism shows a gradual abandonment of mechanism, an intense use of temporary models and metaphors, and an emphasis on quantification and on the use of experiments. It discusses Maxwell's relationship with the philosophical, social, and theological debates of his time, his engagement in practical activities in this regard, and his scientific and philosophical influences. It describes and discusses the scientist's philosophical texts, which reveal his ontology, his theological beliefs, and his conception of analogies. It discusses the use of analogies in science and compares several authors who address the topic. The methodology used was a bibliographic survey with critical analysis of the author's writings and those of his commentators, together with critical commentary on the primary and secondary texts. The study concludes that Maxwell's scientific success is due to his commitment to a unity of the world guaranteed by God, as well as to the unity between the world and the human mind, stances that proved successful when applied to scientific methodology. It also concludes that the use of metaphors and models in the scientific enterprise is legitimate and necessary.