19 results for 240302 Nuclear and Particle Physics

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

Syksy Räsänen's presentation at Kirjastoverkkopäivät, Helsinki, 21 October 2015.

Relevance:

100.00%

Publisher:

Abstract:

This thesis reviews the role of nuclear and conventional power plants in the future energy system. The review is based on freely accessible publications, supplemented by load duration and ramping curves generated for the Nordic energy system. As the aim of the future energy system is to reduce GHG emissions and avoid further global warming, the need for flexible power generation increases with the growing share of intermittent renewables. The goal of this thesis is to offer an extensive understanding of the possibilities and restrictions that nuclear power and conventional power plants have regarding flexible and sustainable generation. In conclusion, nuclear power is the only technology able to provide large-scale GHG-free power output variations with good ramping values. Most currently operating plants are able to take part in load following, as this capability is already required to be included in the plant design. The load duration and ramping curves produced show that nuclear power is able to cover most of the annual generation variation and ramping needs in the Nordic energy system. Of the conventional power generation methods, only biomass combustion can be considered GHG-free, because biomass is considered carbon neutral. Biomass combusted in a CFB boiler has good load-following capability, with good ramping and turndown ratios. All other conventional power generation technologies produce GHG emissions, and their use should therefore be reduced.
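
As an illustration of how load duration and ramping curves of the kind used in the thesis can be constructed from hourly load data, the following sketch sorts one year of hourly loads in descending order and takes hour-to-hour differences. The synthetic data and parameter values are placeholders, not figures from the thesis.

```python
import numpy as np

def load_duration_curve(hourly_load_mw):
    """Load duration curve: loads sorted in descending order, paired with
    the number of hours during which each level is met or exceeded."""
    sorted_load = np.sort(np.asarray(hourly_load_mw))[::-1]
    hours = np.arange(1, sorted_load.size + 1)
    return hours, sorted_load

def hourly_ramps(hourly_load_mw):
    """Hour-to-hour ramping need (MW/h); positive values are upward ramps."""
    return np.diff(np.asarray(hourly_load_mw))

# Synthetic stand-in for one year of Nordic hourly net load (8760 hours).
rng = np.random.default_rng(0)
t = np.arange(8760)
load = 40000 + 8000 * np.cos(2 * np.pi * t / 8760) + rng.normal(0, 1500, t.size)

hours, ldc = load_duration_curve(load)
ramps = hourly_ramps(load)
print(f"Peak load {ldc[0]:.0f} MW, maximum upward ramp {ramps.max():.0f} MW/h")
```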

Relevance:

100.00%

Publisher:

Abstract:

The Standard Model of particle physics is currently the best description of fundamental particles and their interactions. All particles save the Higgs boson have been observed in particle accelerator experiments over the years. Despite the predictive power of the Standard Model, there are many phenomena that it does not predict or explain. Among the most prominent dilemmas is the matter-antimatter asymmetry, and much effort has been made in formulating scenarios that accurately predict the correct amount of matter-antimatter asymmetry in the universe. One of the most appealing explanations is baryogenesis via leptogenesis, which not only serves as a mechanism for producing an excess of matter over antimatter but can also explain why neutrinos have very small non-zero masses. Interesting leptogenesis scenarios arise when other candidate theories beyond the Standard Model are brought into the picture. In this thesis, we have studied leptogenesis in an extra-dimensional framework and in a modified version of the supersymmetric Standard Model. The first chapters of this thesis introduce the standard cosmological model, observations of the photon-to-baryon ratio, and the necessary preconditions for successful baryogenesis. Baryogenesis via leptogenesis is then introduced and its connection to neutrino physics is illuminated. The final chapters concentrate on extra-dimensional theories and supersymmetric models and their ability to accommodate leptogenesis; there, the results of our research are also presented.
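
For orientation, the asymmetry that such scenarios aim to reproduce is usually quantified by the baryon-to-photon ratio; the expression and approximate observed value below are standard reference quantities rather than results quoted from the thesis.

```latex
\eta \;\equiv\; \frac{n_B - n_{\bar{B}}}{n_\gamma} \;\approx\; 6 \times 10^{-10}
```

Here n_B, n_B̄ and n_γ are the number densities of baryons, antibaryons and photons; a successful baryogenesis scenario must explain why η is small but non-zero.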

Relevance:

100.00%

Publisher:

Abstract:

This report summarizes the work done by a consortium consisting of Lappeenranta University of Technology, Aalto University and VTT Technical Research Centre of Finland in the New Type Nuclear Reactors (NETNUC) project during 2008–2011. The project was part of the Sustainable Energy (SusEn) research programme of the Academy of Finland. A wide range of Generation IV nuclear technologies was studied during the project, and the research consisted of multiple tasks. This report contains short articles summarizing the results of the individual tasks. In addition, the publications produced and the persons involved in the project are listed in the appendices.

Relevance:

100.00%

Publisher:

Abstract:

The European Organization for Nuclear Research (CERN) operates the largest particle collider in the world. This particle collider is called the Large Hadron Collider (LHC), and it will undergo a maintenance break sometime in 2017 or 2018. During the break, the particle detectors that operate around the collider will be serviced and upgraded. With the improved performance of the collider, the requirements for the detector electronics will become more demanding. In particular, the high amount of radiation during collider operation sets requirements for the electronics that are uncommon in commercial electronics. Electronics built to function in this challenging environment have been designed at CERN. To meet the future challenges of data transmission, a GigaBit Transceiver data transmission module and an E-Link data bus have been developed. The next generation of readout electronics is designed to benefit from these technologies. However, the current readout chips are not compatible with them. As a result, in addition to new Gas Electron Multiplier (GEM) detectors and other technology, a new compatible chip is being developed for the GEMs of the Compact Muon Solenoid (CMS) project. The objective of this thesis was to study a data transmission interface located on the readout chip between the E-Link bus and the control logic of the chip. The function of the module is to handle data transmission between the chip and the E-Link. In the study, a model of the interface was implemented in the Verilog hardware description language, and the design was simulated with chip design software from Cadence. State machines, operating principles and alternative implementation options are introduced as part of the E-Link interface design procedure. The functionality of the designed logic is demonstrated with simulation results, which show that the implemented model is suitable for its task. Finally, suggestions for improving the design are presented.
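
The thesis model itself is written in Verilog and targets the actual E-Link protocol; purely to illustrate the kind of receive-side state machine such an interface involves, the sketch below decodes a minimal, invented frame format (idle / header / payload) from a byte stream. The start byte, length field and class names are hypothetical and do not reflect the real E-Link specification.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    HEADER = auto()
    PAYLOAD = auto()

class ToyLinkReceiver:
    """Toy receive-side state machine: wait for a start byte, read a length
    byte, then collect that many payload bytes into a frame. The frame
    format is invented for illustration only."""
    START_BYTE = 0xA5  # hypothetical frame delimiter

    def __init__(self):
        self.state = State.IDLE
        self.expected = 0
        self.buffer = []
        self.frames = []

    def clock_in(self, byte):
        if self.state is State.IDLE:
            if byte == self.START_BYTE:
                self.state = State.HEADER
        elif self.state is State.HEADER:
            self.expected = byte
            self.buffer = []
            self.state = State.PAYLOAD if byte > 0 else State.IDLE
        elif self.state is State.PAYLOAD:
            self.buffer.append(byte)
            if len(self.buffer) == self.expected:
                self.frames.append(bytes(self.buffer))
                self.state = State.IDLE

# One three-byte frame embedded in an otherwise idle byte stream.
rx = ToyLinkReceiver()
for b in [0x00, 0xA5, 0x03, 0x11, 0x22, 0x33, 0x00]:
    rx.clock_in(b)
print(rx.frames)  # [b'\x11"3']
```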

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) will have a Long Shutdown sometime during 2017 or 2018. During this time, maintenance will be carried out and new detectors can be installed. After the shutdown, the LHC will have a higher luminosity. A promising new type of detector for this high-luminosity phase is the Triple-GEM detector. During the shutdown, these detectors will be installed at the Compact Muon Solenoid (CMS) experiment. The Triple-GEM detectors are now being developed at CERN, along with a readout ASIC for the detector. In this thesis, a simulation model was developed for the ASIC's analog front end. The model helps to carry out more extensive simulations and to simulate the whole chip before the design is finished. The proper functioning of the model was tested with simulations, which are also presented in the thesis.
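
Behavioral models of detector front ends of this kind are often approximated as a charge-sensitive preamplifier followed by a CR-RC shaper; the sketch below evaluates the textbook CR-RC impulse response for an injected charge. The peaking time, gain and input charge are arbitrary placeholder values, not parameters of the actual chip.

```python
import numpy as np

def cr_rc_response(t, q_in, tau, gain):
    """Output of an idealized charge-sensitive preamplifier followed by a
    single CR-RC shaper: v(t) = gain * q_in * (t/tau) * exp(-t/tau)."""
    t = np.asarray(t, dtype=float)
    return gain * q_in * (t / tau) * np.exp(-t / tau) * (t >= 0)

# Placeholder parameters: 3 fC input charge, 50 ns peaking time, 10 mV/fC gain.
t = np.linspace(0.0, 500e-9, 1000)                      # time axis in seconds
v = cr_rc_response(t, q_in=3.0, tau=50e-9, gain=10e-3)  # charge in fC, gain in V/fC
peak_idx = int(np.argmax(v))
print(f"Peak amplitude {v[peak_idx]*1e3:.1f} mV at t = {t[peak_idx]*1e9:.0f} ns")
```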

Relevance:

100.00%

Publisher:

Abstract:

Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials behave in unusual ways compared to normal solids or fluids. Although some of their characteristics are not yet well known, one thing is confirmed: particle-particle interaction plays a key role in the dynamics of granular materials, especially dense ones. The thesis begins with a detailed account of the development of two models describing this interaction, based on the results of finite-element simulation, dimensional analysis and numerical simulation. The first model describes the normal collision of viscoelastic particles. Building on existing models, more parameters are added, which makes the model predict experimental results more accurately. The second model is used for oblique collisions and includes the effects of tangential velocity, angular velocity and surface friction based on Coulomb's law. The theoretical predictions of this model agree with those of finite-element simulation. In the latter chapters of the thesis, the models are used to predict industrial granular flows, and the agreement between simulations and experiments further validates the new models. The first case presents the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how particle characteristics, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the previous models, the simulation reproduces experimentally observed phenomena, such as a depression in the center during high-frequency rotation. The third application concerns gas-solid mixed flow in a vertically vibrated device. Gas-phase motion is coupled to the particle motion; the governing equations of the gas phase are solved using large eddy simulation (LES), and the particle motion is predicted with the Lagrangian method. The simulation reproduces some of the pattern formation reported in experiments.
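
The thesis develops its own viscoelastic contact models; purely as a point of reference, the sketch below integrates a head-on particle-wall collision with the widely used linear spring-dashpot normal contact force from discrete element modelling. The stiffness, damping, mass and time-step values are placeholders chosen for illustration.

```python
def normal_contact_force(overlap, overlap_rate, k_n=1.0e4, c_n=0.5):
    """Linear spring-dashpot normal force (a standard DEM model, not the
    thesis's viscoelastic model). Clipped at zero so the contact cannot
    transmit a tensile (adhesive) force."""
    if overlap <= 0.0:
        return 0.0
    return max(k_n * overlap + c_n * overlap_rate, 0.0)

# Head-on collision of a particle with a wall at x = 0 (semi-implicit Euler).
m, r = 1.0e-3, 1.0e-3      # particle mass [kg] and radius [m]
x, v = -1.5e-3, 1.0        # centre position [m] and velocity [m/s] towards the wall
dt = 1.0e-6                # time step [s]
for _ in range(5000):
    overlap = x + r                          # penetration depth into the wall
    f = -normal_contact_force(overlap, v)    # contact force opposes penetration
    v += f / m * dt
    x += v * dt
print(f"Rebound velocity {v:.3f} m/s, apparent restitution {abs(v):.2f}")
```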

Relevance:

100.00%

Publisher:

Abstract:

One purpose of the hadron collider under construction at the CERN research centre is to prove the existence of the Higgs boson. The discovery of the Higgs boson would unify the current theory of particle physics and explain how particles acquire their mass. The collider's CMS experiment is designed especially for the detection of muons. This work concerns the link system of the RPC detector type of the CMS experiment, whose purpose is to process the muon-induced signals coming from the detector and to send the data on collision events deemed important to storage for later analysis. In this work, a test environment was implemented for the control and link boards of the link system, with which the mutual compatibility and functionality of the different parts of the system can be verified. The first part of the work presents the different parts of the detector link system and their roles. The final part reviews the various test methods and analyses the results they produced.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to investigate some important features of granular flows and suspension flows by computational simulation methods. Granular materials have been considered an independent state of matter because of their complex behavior. They sometimes behave like a solid, sometimes like a fluid, and sometimes contain both phases in equilibrium. Computer simulation of dense shear granular flows of monodisperse, spherical particles shows that the collisional contact model yields the coexistence of solid and fluid phases, while the frictional model represents a uniform flow of the fluid phase. A comparison between the stress signals from the simulations and experiments revealed, however, that the collisional model gives the better match with the experimental evidence. Although the effect of gravity is found to be important in the sedimentation of the solid part, the stick-slip behavior associated with the collisional model is more similar to that observed in experiments. The mathematical formulations based on kinetic theory have been derived for moderate solid volume fractions under the assumption of flow homogeneity. In order to produce simulations that provide such an ideal flow, unbounded granular shear flows were simulated, so that homogeneous flow properties could be achieved at moderate solid volume fractions. A new algorithm, the nonequilibrium approach, was introduced to study self-diffusion in granular flows. Using this algorithm, a one-way flow can be extracted from the entire flow, which not only provides a straightforward calculation of the self-diffusion coefficient but can also qualitatively determine the deviation of self-diffusion from the linear law in regions near the wall in bounded flows. The average lateral self-diffusion coefficient calculated by this method showed good agreement with the predictions of the kinetic theory formulation. Continuing from the computer simulation of shear granular flows, numerical and theoretical investigations were carried out on mass transfer and particle interactions in particulate flows. In this context, the boundary element method, and its combination with the spectral method using the special capabilities of wavelets, are introduced as efficient numerical methods for solving the governing equations of mass transfer in particulate flows. A theoretical formulation of fluid dispersivity in suspension flows revealed that the fluid dispersivity depends on the fluid properties and particle parameters as well as on the fluid-particle and particle-particle interactions.
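
The thesis introduces its own nonequilibrium approach to self-diffusion; for orientation, the sketch below estimates a self-diffusion coefficient in the conventional way, from the slope of the mean-square displacement of particle trajectories (Einstein relation). The synthetic Brownian trajectories and parameter values merely stand in for actual simulation output.

```python
import numpy as np

def self_diffusion_coefficient(positions, dt, dim=3):
    """Estimate D from the Einstein relation MSD(t) ~ 2*dim*D*t.
    positions has shape (n_steps, n_particles, dim)."""
    displacements = positions - positions[0]              # relative to the start
    msd = (displacements ** 2).sum(axis=2).mean(axis=1)   # average over particles
    t = np.arange(len(msd)) * dt
    half = len(t) // 2
    slope = np.polyfit(t[half:], msd[half:], 1)[0]        # late-time linear fit
    return slope / (2 * dim)

# Synthetic Brownian trajectories with a known diffusion coefficient.
rng = np.random.default_rng(1)
n_steps, n_particles, dt, d_true = 2000, 200, 1.0e-3, 0.5
steps = rng.normal(0.0, np.sqrt(2.0 * d_true * dt), size=(n_steps, n_particles, 3))
trajectories = np.cumsum(steps, axis=0)
d_est = self_diffusion_coefficient(trajectories, dt)
print(f"Recovered D = {d_est:.3f} (true value {d_true})")
```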

Relevance:

100.00%

Publisher:

Abstract:

Particulate nanostructures are increasingly used for analytical purposes. Such particles are often generated by chemical synthesis from non-renewable raw materials. Generation of uniform nanoscale particles is challenging, and particle surfaces must be modified to make the particles biocompatible and water-soluble. Usually nanoparticles are functionalized with binding molecules (e.g., antibodies or their fragments) and, if needed, a label substance. Overall, producing nanoparticles for use in bioaffinity assays is a multistep process requiring several manufacturing and purification steps. This study describes a biological method of generating functionalized protein-based nanoparticles with specific binding activity on the particle surface and label activity inside the particles. Traditional chemical bioconjugation of the particle and the specific binding molecules is replaced with genetic fusion of the binding molecule gene and the particle backbone gene. The particle shell and binding moieties are synthesized as one entity from generic raw materials by bacteria, and fermentation is combined with a simple purification method based on inclusion bodies. The label activity is introduced during purification. The process results in particles that are ready to use as reagents in bioaffinity assays. Apoferritin was used as the particle body, and the system was demonstrated using three different binding moieties: a small protein, a peptide, and a single-chain Fv antibody fragment, which represents a complex protein including a disulfide bridge. When needed, Eu3+ was used as the label substance. The results showed that the production system yielded pure protein preparations, and the particles were of homogeneous size when visualized with transmission electron microscopy. The passively introduced label was stably associated with the particles, and the binding molecules genetically fused to the particle specifically bound their target molecules. The functionality of the particles in bioaffinity assays was successfully demonstrated with two types of assays: as labels and in a particle-enhanced agglutination assay. This biological production procedure has many advantages that make it especially suited to applications with frequent and recurring requirements for homogeneous functional particles. The production process of ready, functional and water-soluble particles follows the principles of "green chemistry" and is upscalable, fast and cost-effective.

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

This report presents the results of a commercialization project called Container logistic services for forest bioenergy. The project promotes the new business that is emerging around overall container logistic services in the bioenergy sector. The results assess the European market for container logistics for biomass, the enablers for new business creation, and the service bundles required for the concept. We also demonstrate the customer value of container logistic services for different market segments. The concept analysis is based on concept mapping, the quality function deployment (QFD) process and business network analysis. The business network analysis assesses key shareholders and their mutual connections. The performance of the roadside chipping chain is analysed with a logistic cost simulation, an RFID system demonstration and freezing tests. The EU has set a renewable energy target of 20 % for 2020, of which biomass could account for two-thirds. In Europe, the production of wood fuels was 132.9 million solid m³ in 2012, and the production of wood chips and particles was 69.0 million solid m³. The wood-based chip and particle flows are suitable for container transportation, providing a market of 180.6 million loose m³, which means 4.5 million container loads per year. Intermodal truck-train logistics is promising for composite containers because the biomass does not freeze onto the inner surfaces during unloading. The overall service concept includes several packages: container rental, container maintenance, terminal services, an RFID tracking service, and a simulation and ERP integration service. Container rental and maintenance would give transportation entrepreneurs a way to increase capacity without high investment costs. The RFID concept would lead to better work planning, improving profitability throughout the logistics chain, and the simulation supports fuel supply optimization.

Relevance:

100.00%

Publisher:

Abstract:

This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to computation of the posterior probability density function. Except for a very restricted class of models, it is impossible to compute this density function in closed form; hence, approximation methods are needed. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on the available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; an inappropriate choice can even prevent the particle filter algorithm from converging. In this thesis, we analyze the theoretical Lᵖ convergence of the particle filter with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be carried out with Markov chain Monte Carlo (MCMC) methods. In its operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution. In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods in which the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends highly on the chosen proposal distribution. A commonly used proposal distribution is Gaussian, in which case the covariance matrix must be well tuned; adaptive MCMC methods can be used for this. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
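
To make the particle filtering idea concrete, the sketch below implements a bootstrap particle filter, the simplest case in which the state transition density is used as the importance distribution. The non-linear benchmark-style model, noise levels and particle count are illustrative choices, not the models studied in the thesis.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles, rng):
    """Bootstrap particle filter for an illustrative non-linear model:
        x_k = 0.5*x_{k-1} + 25*x_{k-1}/(1 + x_{k-1}**2) + process noise
        y_k = x_k**2 / 20 + measurement noise
    The transition density is used as the importance distribution."""
    q_std, r_std = 1.0, 1.0                        # noise standard deviations
    particles = rng.normal(0.0, 2.0, n_particles)  # sample from the prior of x_0
    means = []
    for y in observations:
        # Propagate particles through the state dynamics (sampling step).
        particles = (0.5 * particles + 25.0 * particles / (1.0 + particles**2)
                     + rng.normal(0.0, q_std, n_particles))
        # Weight each particle by the measurement likelihood p(y | x).
        log_w = -0.5 * ((y - particles**2 / 20.0) / r_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.sum(w * particles))
        # Multinomial resampling to counteract weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(means)

# Simulate data from the same model and run the filter on it.
rng = np.random.default_rng(42)
x, true_states, observations = 0.0, [], []
for _ in range(50):
    x = 0.5 * x + 25.0 * x / (1.0 + x**2) + rng.normal(0.0, 1.0)
    true_states.append(x)
    observations.append(x**2 / 20.0 + rng.normal(0.0, 1.0))
estimates = bootstrap_particle_filter(observations, n_particles=2000, rng=rng)
rmse = np.sqrt(np.mean((estimates - np.array(true_states)) ** 2))
print(f"RMSE of the filtered posterior mean: {rmse:.2f}")
```

Resampling at every step is the simplest choice; in practice, as the thesis notes for importance distributions in general, the design of these steps strongly affects filter performance.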