Abstract:
In the present chapter, some prototype gas-phase and gas-surface processes occurring within the hypersonic flow layer surrounding spacecraft at planetary entry are discussed. The discussion is based on microscopic dynamical calculations of the detailed cross sections and rate coefficients, performed using classical mechanics treatments of atoms, molecules and surfaces. Such a treatment allows the evaluation of the efficiency of thermal processes (at both equilibrium and nonequilibrium distributions) based on state-to-state and state-specific calculations properly averaged over the population of the initial states. The dependence of the efficiency of the considered processes on the initial partitioning of energy among the various degrees of freedom is discussed.
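As a hedged illustration of the averaging step mentioned above (not the authors' code), the sketch below computes a thermal rate coefficient from hypothetical state-specific rates k_i weighted by a Boltzmann population over the initial internal states; the energies and rates are placeholder values.

    import numpy as np

    def boltzmann_average(k_state, E_state, T, kB=8.617e-5):
        """Average state-specific rate coefficients k_i over a Boltzmann
        population of initial internal states. E_state in eV, T in K,
        kB in eV/K."""
        w = np.exp(-np.asarray(E_state) / (kB * T))
        w /= w.sum()                      # normalized populations P_i(T)
        return float(np.dot(w, k_state))  # k(T) = sum_i P_i(T) * k_i(T)

    # Placeholder state-specific rates (cm^3/s) and internal energies (eV)
    k_i = [1e-12, 5e-12, 2e-11]
    E_i = [0.0, 0.3, 0.6]
    print(boltzmann_average(k_i, E_i, T=5000.0))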
Role of the environmental spectrum in the decoherence and dissipation of a quantum Brownian particle
Abstract:
Because of the source contamination and instrument component wear caused by the direct insertion probe technique, a new way of introducing low-volatility compounds into the mass spectrometer was tested. The new scheme introduces solutions of the low-volatility compounds via a six-port valve connected to a particle beam interface. Solutions of isatin were injected into this system, and the best results were obtained with CH2Cl2, CH3OH and CH3CN. The solution inlet system proved advantageous over conventional direct insertion probe introduction.
Abstract:
The Large Hadron Collider, constructed at the European Organization for Nuclear Research (CERN), is the world's largest single measuring instrument and currently the most powerful particle accelerator in existence. The Large Hadron Collider includes six different experiment stations, one of which is the Compact Muon Solenoid (CMS). The main purpose of the CMS is to track and study residue particles from proton-proton collisions. Among the detectors utilized in the CMS are resistive plate chambers (RPCs). To obtain data from these detectors, a link system has been designed. The main idea of the link system is to receive data from the detector front-end electronics in parallel form and to transmit it onwards in serial form via an optical fiber. The system is mostly ready and in place. However, a problem has occurred with the innermost RPC detectors, located in the sector labeled RE1/1: the transmission lines for parallel data suffer from signal integrity issues over long distances. As a solution, a new version of the link system has been devised, one that fits in a smaller space and can be located within the CMS, closer to the detectors. So far this RE1/1 link system has been only partially completed, with just the mechanical design and casing done. In this thesis, the link system electronics for the RE1/1 sector have been designed by modifying the existing link system concept to better meet the requirements of the RE1/1 sector. In addition to completing the prototype of the RE1/1 link system electronics, some testing of the system has been done to ensure the functionality of the design.
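To make the parallel-to-serial idea concrete, here is a minimal, hypothetical software sketch of the packing step (the actual link system is implemented in hardware, and the word width and bit order below are invented for illustration): parallel front-end words are flattened into a single serial bitstream that could then drive an optical transmitter.

    def serialize_words(words, width=8):
        """Pack parallel data words (LSB first) into one serial bit list.
        'width' is an illustrative parallel bus width, not the real one."""
        bits = []
        for w in words:
            for i in range(width):
                bits.append((w >> i) & 1)
        return bits

    # Example: three 8-bit words as they might arrive from the front-end
    print(serialize_words([0xA5, 0x3C, 0xFF]))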
Abstract:
The singular properties of hydrogenated amorphous carbon (a-C:H) thin films deposited by pulsed-DC plasma-enhanced chemical vapor deposition (PECVD), such as hardness and wear resistance, make them suitable as protective coatings with low surface energy for self-assembly applications. In this paper, we designed fluorine-containing a-C:H (a-C:H:F) nanostructured surfaces and characterized them for self-assembly applications. Sub-micron patterns were generated on silicon through laser lithography, while contact angle measurements, a nanotribometer, atomic force microscopy (AFM), and scanning electron microscopy (SEM) were used to characterize the surfaces. The a-C:H:F properties of the lithographed surfaces, such as hydrophobicity and friction, were improved with the proper relative quantity of CH4 and CHF3 during deposition, resulting in ultrahydrophobic samples and low friction coefficients. Furthermore, these properties were enhanced along the direction of the lithography patterns (in-plane anisotropy). Finally, the self-assembly properties were tested with silica nanoparticles, which were successfully assembled into linear arrays following the generated patterns. Among the main applications, these surfaces could be suitable as particle filter selectors and cell colony substrates.
Abstract:
In the last two decades of study of the Solar Energetic Particle (SEP) phenomenon, intensive emphasis has been put on how, when and where these SEPs are injected into interplanetary space. It is well known that SEPs are related to solar flares and CMEs. However, the role of each in the acceleration of SEPs has been under debate ever since the major role was reattributed, step by step, from flares to CMEs after the Skylab mission, which started the era of spaceborne CME observations. Since then, the shock wave generated by powerful CMEs at 2-5 solar radii has been considered the major accelerator. The current paradigm interprets the prolonged proton intensity-time profile in gradual SEP events as a direct effect of SEPs accelerated by the shock wave propagating in the interplanetary medium. Thus the powerful CME is thought of as a starter for the acceleration, and its shock wave as a continuing accelerator that results in such an intensity-time profile. It is generally believed that a single powerful CME, which might or might not be associated with a flare, is always the reason behind such gradual events.
In this work we use the Energetic and Relativistic Nuclei and Electron (ERNE) instrument on board the Solar and Heliospheric Observatory (SOHO) to present an empirical study showing the possibility of multiple accelerations in SEP events. We first found 18 double-peaked SEP events by examining 88 SEP events; the peaks in the intensity-time profile were separated by 3-24 hours. We divided the SEP events according to possible multiple acceleration into four groups, and in one of these groups we find evidence for multiple acceleration in the velocity dispersion and in a change in the abundance ratio associated with the transition to the second peak. We then explored the intensity-time profiles of all SEP events during solar cycle 23 and found that most SEP events are associated with multiple eruptions at the Sun; we call these Multi-Eruption Solar Energetic Particle (MESEP) events. We use the data available from the Large Angle and Spectrometric Coronagraph (LASCO) on board SOHO to determine the CME associated with such events, and YOHKOH and GOES satellite data to determine the associated flare. We found four types of MESEP according to the appearance of the peaks in the intensity-time profile over a wide range of energy levels. We found that it is not possible to determine whether the peaks are related to an eruption at the Sun by examining only the anisotropy flux, the He/p ratio and the velocity dispersion. We then chose a rare event in which there is evidence of SEP acceleration from behind a previous CME; this work resulted in a conclusion which is inconsistent with the current SEP paradigm. Next, by examining another MESEP event, we discovered that energetic particles accelerated by a second CME can penetrate a previous CME-driven decelerating shock. Finally, we report the previous two MESEP events together with two new events and find a common basis for second-CME SEPs penetrating previous decelerating shocks. This phenomenon is reported for the first time and is expected to have a significant impact on the modification of the current paradigm of solar energetic particle events.
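The velocity dispersion analysis invoked above tests whether a peak corresponds to a fresh injection at the Sun: onset time is fit linearly against inverse particle speed, t_onset = t_inj + L/v, so the slope gives the path length and the intercept the injection time. The sketch below is a schematic example with made-up onset times, not the ERNE analysis itself.

    import numpy as np

    # Hypothetical proton channel energies (MeV) and observed onset times (s)
    E_MeV = np.array([15.0, 25.0, 40.0, 60.0, 90.0])
    t_onset = np.array([4200.0, 3650.0, 3250.0, 2980.0, 2760.0])

    # Relativistic proton speed as a fraction of c (rest mass 938.272 MeV)
    gamma = 1.0 + E_MeV / 938.272
    beta = np.sqrt(1.0 - 1.0 / gamma**2)

    c_AU_per_s = 1.0 / 499.0           # light travels 1 AU in about 499 s
    inv_v = 1.0 / (beta * c_AU_per_s)  # inverse speed in seconds per AU

    # Linear fit t_onset = t_inj + L * (1/v): slope L is the path length (AU),
    # intercept t_inj the injection time at the Sun
    L, t_inj = np.polyfit(inv_v, t_onset, 1)
    print(f"path length ~ {L:.2f} AU, injection time ~ {t_inj:.0f} s")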
Abstract:
The high sensitivity and excellent timing accuracy of Geiger-mode avalanche photodiodes make them ideal sensors for pixel detectors for particle tracking in high energy physics experiments to be performed in future linear colliders. Nevertheless, it is well known that these sensors suffer from dark counts and afterpulsing noise, which induce false hits (indistinguishable from event detection) as well as an increase in the necessary area of the readout system. In this work, we present a comparison between APDs fabricated in two commercially available CMOS technologies, a high-voltage 0.35 µm process and a high-integration 0.13 µm process, performed to determine which of them best fits the particle collider requirements. In addition, a readout circuit that allows low noise operation is introduced. An experimental characterization of the proposed pixel is also presented.
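As a rough illustration of why dark counts induce false hits, the probability that a pixel registers at least one dark count within a gated window follows Poisson statistics, p = 1 - exp(-DCR * t_gate). The figures below are placeholders, not measured values for the reported devices.

    import math

    def false_hit_probability(dark_count_rate_hz, gate_window_s):
        """Probability of at least one dark count in one gate window,
        assuming Poisson-distributed dark counts."""
        return 1.0 - math.exp(-dark_count_rate_hz * gate_window_s)

    # Placeholder figures: 50 kHz dark count rate, 100 ns gated window
    print(false_hit_probability(50e3, 100e-9))  # about 0.5% per gate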
Abstract:
Aerosol size distributions from 6 to 700 nm were measured simultaneously at an urban background site and a roadside station in Oporto. The particle number concentration was higher at the traffic-exposed site, where up to 90% of the size spectrum was dominated by the nucleation mode. Larger aerosol mode diameters were observed at the urban background site, possibly due to coagulation processes or the uptake of gases during transport. Factor analysis has shown that road traffic and the neighbouring stationary sources located upwind affect the urban area through intra-regional pollutant transport.
Abstract:
This study evaluates the application of an intelligent hybrid system to time-series forecasting of atmospheric pollutant concentration levels. The proposed method consists of an artificial neural network combined with a particle swarm optimization algorithm. The method not only searches for the relevant time lags for the correct characterization of the time series, but also determines the best neural network architecture. An experimental analysis is performed using four real time series, and the results are reported in terms of six performance measures. The experimental results demonstrate that the proposed methodology achieves a fair prediction of the presented pollutant time series using compact networks.
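The hybrid idea, a particle swarm searching over candidate time lags while a predictor is refit for each candidate, can be sketched as below. This is a schematic reconstruction under stated assumptions (a continuous PSO over a lag-selection mask, and a least-squares predictor standing in for the neural network), not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    MAX_LAG = 8

    def make_dataset(series, lags):
        """Build regressors from the selected time lags."""
        X = np.column_stack([series[MAX_LAG - l: len(series) - l] for l in lags])
        return X, series[MAX_LAG:]

    def fitness(position, series):
        """Validation MSE of a least-squares predictor on the lags
        selected by thresholding the particle position at 0.5."""
        lags = [l + 1 for l, p in enumerate(position) if p > 0.5]
        if not lags:
            return np.inf
        X, y = make_dataset(series, lags)
        split = int(0.8 * len(y))
        coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
        resid = y[split:] - X[split:] @ coef
        return float(np.mean(resid ** 2))

    # Synthetic pollutant-like series (AR(2) plus noise), purely illustrative
    series = np.zeros(400)
    for t in range(2, 400):
        series[t] = 0.6 * series[t - 1] - 0.3 * series[t - 2] + rng.normal()

    # Minimal PSO over the lag-selection mask
    n_particles, iters = 12, 30
    pos = rng.random((n_particles, MAX_LAG))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, series) for p in pos])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        f = np.array([fitness(p, series) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]
    print("selected lags:", [l + 1 for l, p in enumerate(gbest) if p > 0.5])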
Abstract:
This work is devoted to the development of a numerical method to deal with convection-diffusion-dominated problems with a reaction term, covering both non-stiff and stiff chemical reactions. The technique is based on unifying Eulerian-Lagrangian schemes (the particle transport method) within the framework of the operator splitting method. In the computational domain, a particle set is assigned to solve the convection-reaction subproblem along the characteristic curves created by the convective velocity. At each time step, the convection, diffusion and reaction terms are solved separately, by assuming that each phenomenon occurs in a sequential fashion. Moreover, adaptivity and projection techniques are used, respectively, to add particles in regions of high gradients (steep fronts) and discontinuities, and to transfer the solution from the particle set onto grid points. The numerical results show that the particle transport method improves the solutions of CDR problems. Nevertheless, the method is time-consuming compared with other classical techniques, e.g., the method of lines. Apart from this drawback, the particle transport method can be used to simulate problems that involve moving steep/smooth fronts, such as the separation of two or more elements in a system.
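A hedged, one-dimensional sketch of the sequential operator-splitting idea described above: each time step advances convection, then diffusion, then reaction, one after the other. Here the convection step traces grid points back along the characteristics (a semi-Lagrangian simplification standing in for the thesis's particle set and projection); all parameters are illustrative.

    import numpy as np

    # 1-D convection-diffusion-reaction: u_t + a u_x = D u_xx - k u,
    # split per time step as convect -> diffuse -> react.
    a, D, k = 1.0, 0.005, 0.5          # velocity, diffusivity, reaction rate
    dx, dt, steps = 0.01, 0.005, 100
    x = np.arange(0.0, 1.0, dx)
    u = np.exp(-((x - 0.2) ** 2) / 0.002)   # initial steep front (Gaussian)

    for _ in range(steps):
        # 1) Convection: resample along characteristics x <- x - a*dt
        u = np.interp(x - a * dt, x, u, left=0.0)
        # 2) Diffusion: explicit finite difference (D*dt/dx^2 = 0.25, stable)
        u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        # 3) Reaction: exact integration of the non-stiff linear decay
        u *= np.exp(-k * dt)

    print("front position ~", x[np.argmax(u)])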
Abstract:
The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: how is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises the extended character of representation. The human mind is not a passive receiver of external information, but actively construes intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.
Abstract:
New luminometric particle-based methods were developed to quantify protein and to count cells. The developed methods rely on the interaction of the sample with nano- or microparticles and on different principles of detection. In the fluorescence quenching, time-resolved luminescence resonance energy transfer (TR-LRET), and two-photon excitation fluorescence (TPX) methods, the sample prevents the adsorption of labeled protein to the particles; depending on the system, the addition of the analyte increases or decreases the luminescence. In the dissociation method, the adsorbed protein protects the Eu(III) chelate on the surface of the particles from dissociation at low pH. The experimental setups are user-friendly and rapid and do not require hazardous test compounds or elevated temperatures. The sensitivity of the protein quantification (from 40 to 500 pg of bovine serum albumin in a sample) was 20-500-fold better than that of the most sensitive commercial methods. The quenching method exhibited low protein-to-protein variability, and the dissociation method insensitivity to the assay contaminants commonly found in biological samples. Fewer than ten eukaryotic cells were detected and quantified with all the developed methods under optimized assay conditions. Furthermore, two applications, a method for detecting protein aggregation and a cell viability test, were developed utilizing the TR-LRET method. The detection of protein aggregation was possible at a more than 10,000 times lower concentration, 30 μg/L, compared to the known methods of UV240 absorbance and dynamic light scattering. The TR-LRET method was combined with a nucleic acid assay using a cell-impermeable dye to measure the percentage of dead cells in a single-tube test with cell counts below 1000 cells/tube.