939 results for GENERATOR
Abstract:
The meccano method is a novel and promising mesh generation method for simultaneously creating adaptive tetrahedral meshes and volume parametrizations of a complex solid. We highlight the fact that the method requires minimal user intervention and has a low computational cost. The method builds a 3-D triangulation of the solid as a deformation of an appropriate tetrahedral mesh of the meccano. The new mesh generator combines an automatic parametrization of surface triangulations, a local refinement algorithm for 3-D nested triangulations, and a simultaneous untangling and smoothing procedure. At present, the procedure is fully automatic for a genus-zero solid; in this case, the meccano can be a single cube. The efficiency of the proposed technique is shown with several applications...
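The abstract above mentions a simultaneous untangling and smoothing procedure for the deformed mesh. That algorithm is not reproduced here; as a loose illustration of mesh smoothing in general, the following is a minimal plain Laplacian smoothing pass over hypothetical node data (node numbering and data layout are invented for the example):

```python
# Minimal illustration of mesh smoothing (plain Laplacian averaging).
# NOTE: this is NOT the simultaneous untangling-and-smoothing procedure
# of the meccano method; it only conveys the general idea of moving each
# interior node toward the centroid of its neighbours.

def laplacian_smooth(coords, neighbours, fixed, iterations=10):
    """coords: {node: (x, y, z)}, neighbours: {node: [nodes]},
    fixed: set of boundary nodes that must not move."""
    for _ in range(iterations):
        new = {}
        for n, p in coords.items():
            if n in fixed or not neighbours.get(n):
                new[n] = p
                continue
            nbrs = [coords[m] for m in neighbours[n]]
            new[n] = tuple(sum(c) / len(nbrs) for c in zip(*nbrs))
        coords = new
    return coords

# Toy example: one free node between two fixed ones on a line.
coords = {0: (0.0, 0.0, 0.0), 1: (0.9, 0.0, 0.0), 2: (2.0, 0.0, 0.0)}
neighbours = {1: [0, 2]}
smoothed = laplacian_smooth(coords, neighbours, fixed={0, 2})
print(smoothed[1])  # free node moves to the midpoint (1.0, 0.0, 0.0)
```

Real untangling additionally has to handle inverted elements, for which simple averaging is not sufficient.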
Abstract:
A three-dimensional finite element model for pollutant dispersion is presented. In these environmental processes over complex terrain, a mesh generator capable of adapting itself to the topographic characteristics is essential. The first stage of the model consists of the construction of an adaptive tetrahedral mesh of a rectangular region bounded below by the terrain and above by a horizontal plane. Once the mesh is constructed, an adaptive local refinement of tetrahedra is used in order to capture the plume rise. Wind measurements are used to compute an interpolated wind field, which is then modified by using a mass-consistent model and perturbing its vertical component to introduce the plume rise effect...
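The first step of the wind model described above interpolates scattered station measurements onto the mesh. As a hypothetical sketch of that interpolation step only (inverse-distance weighting; the mass-consistent adjustment and vertical perturbation are not shown), one might write:

```python
# Hypothetical sketch: interpolating scattered wind measurements onto a
# point by inverse-distance weighting. The actual model additionally
# enforces mass consistency and perturbs the vertical component.
import math

def idw_wind(stations, point, power=2.0):
    """stations: list of ((x, y, z), (u, v, w)) measurements."""
    num = [0.0, 0.0, 0.0]
    den = 0.0
    for pos, vel in stations:
        d = math.dist(pos, point)
        if d == 0.0:
            return vel            # exactly at a station: use its value
        weight = 1.0 / d ** power
        den += weight
        num = [n + weight * c for n, c in zip(num, vel)]
    return tuple(n / den for n in num)

stations = [((0, 0, 10), (4.0, 0.0, 0.0)),
            ((100, 0, 10), (2.0, 0.0, 0.0))]
u, v, w = idw_wind(stations, (50, 0, 10))  # midpoint: equal weights
print(u, v, w)  # -> 3.0 0.0 0.0
```

The station coordinates and velocities are invented toy values, not data from the thesis.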
Abstract:
A three-dimensional air pollution model for the short-term simulation of emission, transport and reaction of pollutants is presented. In the finite element simulation of these environmental processes over complex terrain, a mesh generator capable of adapting itself to the topographic characteristics is essential. A local refinement of tetrahedra is used in order to capture the plume rise. Then a wind field is computed by using a mass-consistent model and perturbing its vertical component to introduce the plume rise effect. Finally, an Eulerian convection-diffusion-reaction model is used to simulate the pollutant dispersion...
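The Eulerian convection-diffusion model mentioned above can be illustrated in one dimension. This is a hypothetical finite-difference sketch (upwind convection, central diffusion, explicit time stepping); the thesis solves the full 3-D problem with finite elements on adaptive tetrahedral meshes, and the reaction term is omitted here:

```python
# Hypothetical 1-D explicit convection-diffusion step (upwind convection,
# central diffusion). Boundary values are held fixed for simplicity.
def step(c, u, D, dx, dt):
    """Advance concentrations c one time step; wind speed u > 0 assumed."""
    new = c[:]
    for i in range(1, len(c) - 1):
        conv = -u * (c[i] - c[i - 1]) / dx                 # upwind
        diff = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (conv + diff)
    return new

c = [0.0, 0.0, 1.0, 0.0, 0.0]     # a pollutant "puff" in the middle
c1 = step(c, u=1.0, D=0.1, dx=1.0, dt=0.1)
print(c1)  # the puff diffuses and drifts downwind
```

Explicit schemes like this are only stable for sufficiently small dt; implicit finite element formulations avoid that restriction.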
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition allows multiple channels of the PMT to be sampled simultaneously at different gain factors in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature will probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the FPGA configuration memory implied the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip has been investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
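The multi-gain sampling idea above (the same PMT pulse acquired at several gain factors to widen the dynamic range) can be sketched as a channel-selection rule. This is purely illustrative pseudologic, not the LIRA06 firmware; the ADC width and gain values are invented:

```python
# Illustrative sketch (not the LIRA06 firmware): sampling the same PMT
# pulse at several gain factors and keeping the highest-gain channel
# that is not saturated extends the usable dynamic range.

ADC_FULL_SCALE = 4095          # hypothetical 12-bit ADC

def best_channel(samples_by_gain):
    """samples_by_gain: {gain: adc_code}; prefer the highest gain whose
    sample is below full scale, falling back to the lowest gain."""
    for gain in sorted(samples_by_gain, reverse=True):
        code = samples_by_gain[gain]
        if code < ADC_FULL_SCALE:
            return gain, code / gain   # amplitude referred to unit gain
    lowest = min(samples_by_gain)
    return lowest, samples_by_gain[lowest] / lowest

# A large pulse saturates the x16 and x4 channels but not the x1 channel.
gain, amplitude = best_channel({16: 4095, 4: 4095, 1: 2000})
print(gain, amplitude)  # -> 1 2000.0
```

Small pulses thus keep the resolution of the high-gain channel, while large pulses remain measurable on the low-gain one.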
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from Roma University and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which was able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH).
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of live time of the beam. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 µm presented an overall efficiency of about 90% at a threshold of 450 electrons. The test beam also allowed an estimate of the resolution of the pixel sensor, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking into account the multiple scattering effect.
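The pitch/sqrt(12) formula quoted above is the standard deviation of a uniform distribution of hit positions across one pixel pitch, i.e. the resolution of a binary-readout sensor. As a quick numerical check (the 50 µm pitch below is purely an example value, not a figure from the thesis):

```python
# Binary-readout position resolution: sigma = pitch / sqrt(12),
# the standard deviation of a uniform distribution over one pitch.
import math

def binary_resolution(pitch_um):
    return pitch_um / math.sqrt(12)

print(round(binary_resolution(50.0), 2))  # e.g. 50 um pitch -> 14.43 um
```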
Abstract:
In order to improve animal welfare, Council Directive 1999/74/EC (defining minimum standards for the welfare of laying hens) will ban conventional cage systems from 2012, in favour of enriched cages or floor systems. As a consequence, an increased risk of bacterial contamination of the eggshell is expected (EFSA, 2005). Furthermore, egg-associated salmonellosis is an important public health problem throughout the world (Roberts et al., 1994). In this regard, the introduction of efficient measures to reduce eggshell contamination by S. Enteritidis or other bacterial pathogens, and thus to prevent any potential or additional food safety risk for human health, may be envisaged. Hot air pasteurization can be a viable alternative for the decontamination of the eggshell surface. Few studies have been performed on the decontamination power of this technique on table eggs (Hou et al., 1996; James et al., 2002). The aim of this study was to develop innovative techniques to remove surface contamination of shell eggs by hot air under natural or forced convection. Initially, two simplified finite element models describing the thermal interaction between the air and the egg were developed, for natural and forced convection respectively. The numerical models were validated using an egg simulant equipped with a type-K thermocouple (Chromel/Alumel). Once validated, the models allowed the selection of a thermal cycle with an inner temperature always lower than 55°C. Subsequently, a specific apparatus composed of two hot air generators, one cold air generator and a rolling cylinder support was built to physically condition the eggs. The decontamination power of the thermal treatments was evaluated on shell eggs experimentally inoculated with either Salmonella Enteritidis, Escherichia coli or Listeria monocytogenes, and on shell eggs containing only the indigenous microflora.
The applicability of the treatments was further evaluated by comparing quality traits of treated and untreated eggs immediately after the treatment and after 28 days of storage at 20°C. The results showed that the treatment characterized by two shots of hot air at 350°C for 8 sec, spaced by a cooling interval of 32 sec (forced convection), reduced the bacterial population by more than 90% (Salmonella Enteritidis and Listeria monocytogenes). No statistically significant results were obtained comparing treated and untreated eggs for E. coli, nor for the indigenous microflora. A reduction of 2.6 log was observed in the Salmonella Enteritidis load of eggs immediately after the treatment in an oven at 200°C for 200 minutes (natural convection). Furthermore, no detrimental effects on the quality traits of treated eggs were recorded. These results support the hot air techniques for the surface decontamination of table eggs as an effective industrial process.
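The ">90%" and "2.6 log" figures above are two ways of expressing the same kind of quantity. A log reduction of n corresponds to a surviving fraction of 10^-n, so the conversion is a one-liner:

```python
# Converting a log reduction into a percentage kill, to relate the
# ">90%" and "2.6 log" figures used for microbial decontamination.
def log_to_percent(log_red):
    return 100.0 * (1.0 - 10.0 ** (-log_red))

print(log_to_percent(1.0))            # 1 log   -> 90.0 %
print(round(log_to_percent(2.6), 2))  # 2.6 log -> 99.75 %
```

So a 2.6 log reduction is a considerably stronger effect than the ">90%" (about 1 log) threshold.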
Abstract:
This thesis is about three major aspects of the identification of top quarks. First comes the understanding of their production mechanism, their decay channels and how to translate theoretical formulae into programs that can simulate such physical processes using Monte Carlo techniques. In particular, the author has been involved in the introduction of the POWHEG generator in the framework of the ATLAS experiment. POWHEG is now fully used as the benchmark program for the simulation of ttbar pair production and decay, along with MC@NLO and AcerMC: this will be shown in chapter one. The second chapter illustrates the ATLAS detector and its sub-units, such as the calorimeters and muon chambers. It is very important to evaluate their efficiency in order to fully understand what happens during the passage of radiation through the detector and to use this knowledge in the calculation of final quantities such as the ttbar production cross section. The last part of this thesis concerns the evaluation of this quantity using the so-called "golden channel" of ttbar decays, yielding one energetic charged lepton, four particle jets and a significant amount of missing transverse energy due to the neutrino. The most important systematic errors arising from the various parts of the calculation are studied in detail. Jet energy scale, trigger efficiency, Monte Carlo models, reconstruction algorithms and luminosity measurement are examples of what can contribute to the uncertainty on the cross section.
Abstract:
In the present work, the ability to localize painful stimuli on the skin and to discriminate their intensity was investigated. During these discrimination tasks, the electrical activity of the brain was recorded. Traditionally, only poor discrimination abilities have been attributed to the nociceptive system. In a first series of experiments, the spatial discrimination abilities for nociceptive and tactile stimuli were therefore compared. On the back of the hand, painful laser heat stimuli could be localized just as well as tactile stimuli (von Frey hair). Only a mechanical needle stimulus, which co-activated the tactile and nociceptive systems, could be localized even better. In the second series of experiments, laser-evoked potentials were recorded from the scalp during various discrimination tasks (spatial discrimination, intensity discrimination) and a distraction task (mental arithmetic). A dipole source analysis showed first an activation of the frontal operculum, corresponding to a currently still disputed projection area of a nociceptive thalamic nucleus (VMpo), followed by the primary somatosensory cortex (SI) and the cingulate gyrus. In contrast to the tactile system, SI was activated significantly later than SII (or the operculum, respectively). The discrimination tasks increased the activity of all sources compared with the distraction condition. This could be shown even for the earliest source in the operculum. The early sensory-discriminative component of pain processing in the operculum showed a hemispheric asymmetry, with stronger activation of the left hemisphere independent of the side of stimulation.
Abstract:
The increasing precision of current and future experiments in high-energy physics requires a likewise increase in the accuracy of the calculation of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process in higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems.
Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
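GiNaC itself is a C++ library; as a loose, language-shifted sketch of the core idea described above (symbolic expressions as a tree of objects that know how to evaluate and differentiate themselves), here is a tiny Python analogue. The class and method names are invented for illustration and do not mirror GiNaC's actual API:

```python
# Toy object-oriented symbolic expression tree: each node evaluates and
# differentiates itself, illustrating (not reproducing) GiNaC's design.

class Expr:
    def __add__(self, other): return Add(self, other)
    def __mul__(self, other): return Mul(self, other)

class Num(Expr):
    def __init__(self, v): self.v = v
    def eval(self, env): return self.v
    def diff(self, x): return Num(0)

class Sym(Expr):
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]
    def diff(self, x): return Num(1 if x is self else 0)

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)
    def diff(self, x): return Add(self.a.diff(x), self.b.diff(x))

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) * self.b.eval(env)
    def diff(self, x):                      # product rule
        return Add(Mul(self.a.diff(x), self.b),
                   Mul(self.a, self.b.diff(x)))

x = Sym("x")
e = x * x + Num(3) * x           # e = x^2 + 3x
print(e.eval({"x": 2}))          # -> 10
print(e.diff(x).eval({"x": 2}))  # d/dx = 2x + 3 -> 7
```

The fine granularity of objects mentioned above corresponds to exactly this kind of design, where even a single number or symbol is a first-class object in the expression tree.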