813 results for UPM
Abstract:
Object Kinetic Monte Carlo models allow for the study of the evolution of the damage created by irradiation up to time scales comparable to those achieved experimentally. Therefore, the essential Object Kinetic Monte Carlo parameters can be validated through comparison with experiments. However, this validation is not trivial, since a large number of parameters is necessary, including migration energies of point defects and their clusters, binding energies of point defects in clusters, as well as the interaction radii. This is particularly cumbersome when describing an alloy such as the Fe–Cr system, which is of interest for fusion energy applications. In this work we describe an Object Kinetic Monte Carlo model for Fe–Cr alloys in the dilute limit. The parameters used in the model come either from density functional theory calculations or from empirical interatomic potentials. This model is used to reproduce isochronal resistivity recovery experiments of electron-irradiated dilute Fe–Cr alloys performed by Abe and Kuramoto. The comparison between the calculated results and the experiments reveals that an important parameter is the capture radius between substitutional Cr and self-interstitial Fe atoms. A parametric study is presented on the effect of the capture radius on the simulated recovery curves.
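To make the kind of simulation described above concrete, the following is a minimal sketch of the residence-time (rejection-free) kinetic Monte Carlo step on which Object Kinetic Monte Carlo codes are built; the defect species, migration barriers and attempt frequency are illustrative assumptions, not the Fe–Cr parameters of the model described in the abstract.

```python
import math
import random

# Illustrative, assumed parameters (NOT the fitted values of the model above)
K_B = 8.617e-5           # Boltzmann constant, eV/K
NU0 = 1.0e13             # attempt frequency, 1/s (assumed)
MIGRATION_ENERGY = {     # hypothetical migration barriers, eV
    "SIA": 0.30,         # self-interstitial atom
    "V": 0.65,           # vacancy
}

def jump_rate(species, temperature):
    """Arrhenius jump rate for a mobile defect."""
    return NU0 * math.exp(-MIGRATION_ENERGY[species] / (K_B * temperature))

def kmc_step(defects, temperature):
    """One residence-time KMC step: pick an event with probability proportional
    to its rate, then advance the clock by an exponentially distributed time."""
    rates = [jump_rate(d, temperature) for d in defects]
    total = sum(rates)
    r = random.random() * total
    cumulative, chosen = 0.0, 0
    for i, rate in enumerate(rates):
        cumulative += rate
        if r <= cumulative:
            chosen = i
            break
    dt = -math.log(1.0 - random.random()) / total   # time advance, avoids log(0)
    return chosen, dt

# Example: one step for a small population of mobile defects at 300 K
population = ["SIA", "SIA", "V"]
event, dt = kmc_step(population, 300.0)
print(f"defect {event} ({population[event]}) jumps; clock advances {dt:.3e} s")
```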
Abstract:
Since the early days of logic programming, researchers in the field realized the potential for exploitation of parallelism present in the execution of logic programs. Their high-level nature, the presence of nondeterminism, and their referential transparency, among other characteristics, make logic programs interesting candidates for obtaining speedups through parallel execution. At the same time, the fact that the typical applications of logic programming frequently involve irregular computations, make heavy use of dynamic data structures with logical variables, and involve search and speculation, makes the techniques used in the corresponding parallelizing compilers and run-time systems potentially interesting even outside the field. The objective of this article is to provide a comprehensive survey of the issues arising in parallel execution of logic programming languages along with the most relevant approaches explored to date in the field. Focus is mostly given to the challenges emerging from the parallel execution of Prolog programs. The article describes the major techniques used for shared memory implementation of Or-parallelism, And-parallelism, and combinations of the two. We also explore some related issues, such as memory management, compile-time analysis, and execution visualization.
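As an illustration of the Or-parallelism discussed in the survey, here is a minimal sketch (in Python rather than Prolog, and unrelated to any of the systems surveyed) in which the alternatives of the first choice point of a nondeterministic search are explored by separate worker processes.

```python
from concurrent.futures import ProcessPoolExecutor

def safe(placed, col):
    """Check that a queen placed in the next row at `col` attacks none already placed."""
    row = len(placed)
    return all(col != c and abs(col - c) != row - r for r, c in enumerate(placed))

def solve(placed, n):
    """Sequential depth-first search: return all completions of the partial solution."""
    if len(placed) == n:
        return [placed]
    solutions = []
    for col in range(n):
        if safe(placed, col):
            solutions.extend(solve(placed + [col], n))
    return solutions

def or_parallel_solve(n):
    """Or-parallelism in miniature: the alternatives of the FIRST choice point
    (the column of the first queen) are explored by separate workers."""
    with ProcessPoolExecutor() as pool:
        branches = pool.map(solve, [[c] for c in range(n)], [n] * n)
    return [s for branch in branches for s in branch]

if __name__ == "__main__":
    print(len(or_parallel_solve(8)), "solutions")   # 92 solutions for 8 queens
```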
Abstract:
We will present calculations of opacities for matter under LTE conditions. Opacities are needed in radiation transport codes to study processes such as Inertial Confinement Fusion and plasma amplifiers for secondary X-ray sources. For the calculations we use the code BiGBART, with either a hydrogenic approximation with j-splitting or self-consistent data generated with the atomic physics code FAC. We calculate the atomic structure, oscillator strengths, radiative transition energies, including UTA computations, and photoionization cross-sections. A DCA model determines the configurations considered in the computation of the opacities. The opacities obtained with these two models are compared with experimental measurements.
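As background to the quantities involved, the following sketch shows how a frequency-dependent opacity is reduced to Planck and Rosseland mean opacities; the spectral opacity used here is a made-up placeholder (a smooth continuum plus one line), not output from BiGBART or FAC.

```python
import numpy as np

def planck_weight(u):
    """Planck-function weighting, u = h*nu / (k_B*T): u^3 / (e^u - 1)."""
    return u**3 / np.expm1(u)

def rosseland_weight(u):
    """dB/dT weighting: u^4 * e^u / (e^u - 1)^2."""
    return u**4 * np.exp(u) / np.expm1(u)**2

def planck_mean(u, kappa):
    """Planck mean: opacity averaged with the Planck function as weight."""
    w = planck_weight(u)
    return np.trapz(kappa * w, u) / np.trapz(w, u)

def rosseland_mean(u, kappa):
    """Rosseland mean: harmonic average weighted by dB/dT."""
    w = rosseland_weight(u)
    return np.trapz(w, u) / np.trapz(w / kappa, u)

# Example with an assumed spectral opacity (continuum ~ u^-3 plus one Gaussian line)
u = np.linspace(0.05, 20.0, 4000)                               # photon energy in units of k_B*T
kappa = 1.0 / u**3 + 50.0 * np.exp(-((u - 3.0) / 0.05)**2)      # cm^2/g, placeholder
print(f"Planck mean    = {planck_mean(u, kappa):.3f} cm^2/g")
print(f"Rosseland mean = {rosseland_mean(u, kappa):.3f} cm^2/g")
```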
Abstract:
Seeded plasma-based soft X-ray lasers (SXRL) have demonstrated diffraction-limited beams that are fully coherent in space and in time, but with energies not exceeding 1 μJ per pulse. Quasi-steady-state (QSS) plasmas have been shown to store a large amount of energy and to amplify incoherent SXRL emission up to several mJ. Using a 1D time-dependent Bloch–Maxwell model including amplification of noise, we demonstrate that femtosecond HHG pulses cannot be efficiently amplified in QSS plasmas. However, applying the Chirped Pulse Amplification concept to the HHG seed allows most of the stored energy to be extracted, reaching up to 5 mJ in fully coherent pulses that can be compressed down to 130 fs.
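For reference, a schematic form of a 1D time-dependent Maxwell–Bloch system of the type mentioned above, written for a two-level transition in the slowly varying envelope approximation; the exact coefficients, signs and the form of the noise term depend on the chosen normalization conventions and are not taken from the work itself:

\[
\frac{\partial E}{\partial z} + \frac{1}{c}\frac{\partial E}{\partial t} = \frac{i\,\omega_0}{2\,\varepsilon_0 c}\,P,
\qquad
\frac{\partial P}{\partial t} = -\Big(\frac{1}{T_2} + i\Delta\Big)P + \frac{i\,|d|^2}{\hbar}\,E\,\Delta N + \Gamma(z,t),
\qquad
\frac{\partial \Delta N}{\partial t} = \frac{\Delta N_{\mathrm{pump}} - \Delta N}{T_1} + \frac{i}{2\hbar}\big(E^{*}P - E\,P^{*}\big),
\]

with \(E\) the field envelope, \(P\) the polarization envelope, \(\Delta N\) the population inversion density, \(T_1\) and \(T_2\) the population and polarization relaxation times, \(d\) the dipole matrix element, \(\Delta\) the detuning, and \(\Gamma\) the stochastic source that seeds amplified spontaneous emission.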
Study of rapid ionisation for simulation of soft X-ray lasers with the 2D hydro-radiative code ARWEN
Abstract:
We present our fast ionisation routine used to study transient soft X-ray lasers with ARWEN, a two-dimensional hydrodynamic code incorporating adaptive mesh refinement (AMR) and radiative transport. We compute global rates between ion stages assuming an effective temperature between singly-excited levels of each ion. A two-step method is used to obtain in a straightforward manner the variation of the ion populations over long hydrodynamic time steps. We compare our model with existing theoretical results, both stationary and transient, finding that the discrepancies are moderate except at large densities. We simulate an existing Ni-like molybdenum transient soft X-ray laser with ARWEN. Use of the fast ionisation routine leads to a larger increase in temperature and a larger gain zone than when LTE data tables are used.
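To illustrate the underlying rate-equation problem (though not the two-step scheme itself, which is specific to this work), here is a minimal sketch that advances ion-stage populations over a long time step with assumed, constant ionization and recombination rates using an implicit update.

```python
import numpy as np

def rate_matrix(ionization, recombination):
    """Tridiagonal rate matrix for a chain of ion stages.

    ionization[i]   : rate (1/s) from stage i to stage i+1
    recombination[i]: rate (1/s) from stage i+1 back to stage i
    """
    n = len(ionization) + 1
    A = np.zeros((n, n))
    for i, (s, r) in enumerate(zip(ionization, recombination)):
        A[i, i] -= s          # loss of stage i by ionization
        A[i + 1, i] += s      # gain of stage i+1
        A[i + 1, i + 1] -= r  # loss of stage i+1 by recombination
        A[i, i + 1] += r      # gain of stage i
    return A

def advance(populations, A, dt):
    """Implicit (backward Euler) update N(t+dt) = (I - dt*A)^-1 N(t),
    stable even when dt is much longer than the fastest rate."""
    n = len(populations)
    return np.linalg.solve(np.eye(n) - dt * A, populations)

# Example: three ion stages with assumed rates (1/s)
ionization = np.array([1.0e10, 2.0e9])
recombination = np.array([5.0e9, 1.0e10])
A = rate_matrix(ionization, recombination)
N = np.array([1.0, 0.0, 0.0])        # start entirely in the lowest stage
N = advance(N, A, dt=1.0e-9)         # one long hydrodynamic time step
print(N, "sum =", N.sum())           # column sums of A are zero, so the total is conserved
```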
Abstract:
A new Relativistic Screened Hydrogenic Model has been developed to calculate the atomic data needed to compute the optical and thermodynamic properties of high energy density plasmas. The model is based on a new set of universal screening constants, including nlj-splitting, that has been obtained by fitting to a large database of ionization potentials and excitation energies. This database was built with energies compiled from the National Institute of Standards and Technology (NIST) database of experimental atomic energy levels, and energies calculated with the Flexible Atomic Code (FAC). The screening constants have been computed up to the 5p3/2 subshell using a Genetic Algorithm technique with an objective function designed to minimize both the relative error and the maximum error. To select the best set of screening constants, additional physical criteria have been applied, based on the reproduction of the filling order of the shells and on obtaining the best ground state configuration. A statistical error analysis has been performed to test the model, indicating that approximately 88% of the data lie within a ±10% error interval. We validate the model by comparing the results with ionization energies, transition energies, and wave functions computed using sophisticated self-consistent codes, and with experimental data.
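As a toy illustration of the screened hydrogenic idea (non-relativistic and without nlj-splitting), the following sketch builds effective charges from a screening matrix and evaluates hydrogen-like level energies; the screening constants and the configuration used are placeholders, not the fitted set described above.

```python
import numpy as np

RYDBERG_EV = 13.6057   # Rydberg energy, eV

def screened_energies(Z, n_shells, populations, sigma):
    """Hydrogen-like level energies with screened effective charges.

    Q_k = Z - sum_j sigma[k, j] * N_j   (self-screening folded into sigma[k, k])
    E_k = -Ry * Q_k^2 / n_k^2
    The screening matrix `sigma` here is a toy placeholder, not a fitted set.
    """
    Q = Z - sigma @ populations
    return -RYDBERG_EV * Q**2 / n_shells**2, Q

# Toy example: a carbon-like configuration lumped by principal quantum number
Z = 6
n_shells = np.array([1.0, 2.0])          # principal quantum numbers
populations = np.array([2.0, 4.0])       # electrons per shell
sigma = np.array([[0.31, 0.00],          # inner shell barely screened by the outer one
                  [0.85, 0.35]])         # outer shell screened by the inner shell and itself
E, Q = screened_energies(Z, n_shells, populations, sigma)
for n, e, q in zip(n_shells, E, Q):
    print(f"n = {n:.0f}: Q_eff = {q:.2f}, E = {e:.1f} eV")
```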
Abstract:
The engineering design of fission chambers as on-line radiation detectors for IFMIF is being performed in the framework of the IFMIF-EVEDA works. In this paper the results of the experiments performed in the BR2 reactor during phase 2 of the foreseen validation activities are addressed. Two detectors have been tested in a mixed neutron–gamma field with high neutron fluence and gamma absorbed dose rates, comparable with the values expected in the HFTM in IFMIF. Since the neutron spectra in all BR2 channels are dominated by the thermal neutron component, the detectors have been surrounded by a cylindrical gadolinium screen to cut the thermal neutron component, in order to obtain a more representative test of IFMIF conditions. The integrated gamma absorbed dose was about 4 × 10¹⁰ Gy and the fast neutron fluence (E > 0.1 MeV) 4 × 10²⁰ n/cm². The fission chambers were calibrated in three BR2 channels with different neutron-to-gamma ratios, and the long-term evolution of the signals was studied and compared with theoretical calculations.
Abstract:
An analysis of a stretch of coastline shows multiple alterations caused by environmental climate actions. The narrow, fragile shoreline displays singularities arising from three basic causes. The first is a discontinuity in feed or a localised loss of solid coastal material; called massic singularities, their simplest examples are deltas and undersea canyons. The second is a brusque change in the alignment of the shoreline's edge: headlands, groins, harbour and defence works. Given the name of geometric singularities, their simplest examples are artificial beaches in the shelter of a straight groin or spits. The third is due to littoral dynamics: emerged or submerged obstacles which diffract and refract wave action, causing a change in the sea level's super-elevation in breaker areas. Called dynamic singularities, their simplest examples are salients, tombolos and shells. Discussion of the causes giving rise to variations in the coastline and to the formation of singularities is the raison d'être of this investigation, which uses actual cases to check the suitability of the proposed classification, the tangential or differential action of waves on the coastal landscape, and the simple, compound and complex shapes detected in nature, in both erosion and deposition processes.
Abstract:
Radiative shock waves play a pivotal role in the transport of energy into the stellar medium. This fact has led to many efforts to scale the astrophysical phenomena down to accessible laboratory conditions, and their study has been highlighted as an area requiring further experimental investigation. Low-density material with a high atomic mass is suitable for achieving the radiative regime, and, therefore, low-density xenon gas is commonly used as the medium in which the radiative shock propagates. In this work the average ionization and the thermodynamic regimes of xenon plasmas are determined as functions of the matter density and temperature over a wide range of plasma conditions. The results obtained will be applied to characterize blast waves launched in xenon clusters.
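As a rough illustration of how an average ionization can be estimated as a function of density and temperature, here is a crude Saha–Boltzmann sketch with unit partition-function ratios and approximate xenon ionization potentials; it is not the model used in the work above.

```python
import numpy as np

# Physical constants in CGS, with temperatures expressed in eV
M_E = 9.109e-28          # electron mass, g
EV_TO_ERG = 1.602e-12    # 1 eV in erg
H = 6.626e-27            # Planck constant, erg s

def saha_ratios(T_eV, n_e, chi_eV):
    """n_{i+1}/n_i for each ionization stage from the Saha equation,
    taking all partition-function ratios as 1 (a crude assumption)."""
    theta = (2.0 * np.pi * M_E * EV_TO_ERG * T_eV / H**2) ** 1.5   # cm^-3
    return 2.0 * theta / n_e * np.exp(-np.asarray(chi_eV) / T_eV)

def zbar(T_eV, n_ion, chi_eV, iterations=200):
    """Average ionization from a Saha chain, iterating n_e = Zbar * n_ion
    to self-consistency with a damped fixed-point update."""
    n_e = 0.1 * n_ion                                  # initial guess
    for _ in range(iterations):
        r = saha_ratios(T_eV, n_e, chi_eV)
        rel = np.concatenate(([1.0], np.cumprod(r)))   # populations relative to the neutral stage
        frac = rel / rel.sum()
        z = np.dot(np.arange(len(frac)), frac)
        n_e = 0.5 * n_e + 0.5 * max(z, 1e-6) * n_ion   # damped update
    return z

# Illustrative run: first few ionization potentials of xenon (eV, approximate)
chi_xe = [12.1, 20.9, 31.1, 42.0, 54.1]
print(f"Zbar ~ {zbar(10.0, 1.0e19, chi_xe):.2f} at T = 10 eV, n_ion = 1e19 cm^-3")
```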
Abstract:
In this work we propose a method for cleaving silicon-based photonic chips using a laser-based micromachining system, consisting of a Nd:YVO4 laser emitting at 355 nm in the nanosecond pulse regime and a micropositioning system. The laser makes grooved marks at the desired locations and directions where cleaves have to be initiated, and after several processing steps a crack appears and propagates along the crystallographic planes of the silicon wafer. This allows the chips to be cleaved automatically and with high positioning accuracy, and provides polished vertical facets of better quality than those obtained with other cleaving processes, which eases the optical characterization of photonic devices. The method has been found to be particularly useful when cleaving small-sized chips, where manual cleaving is hard to perform, and also for polymeric waveguides, whose facets get damaged or even destroyed by polishing or manual cleaving. The influence of the length of the grooved line and of the processing speed is studied for a variety of silicon chips. An application to cleaving and characterizing sol–gel waveguides is presented; the total amount of light coupled is higher than with any other procedure.
Abstract:
The research work presented in this article covers the design of detached breakwaters, since they constitute a type of coastal defence work with which to combat many of the erosion problems found on beaches in a stable, sustainable fashion. The main aim of this work is to formulate a functional and environmental (but not structural) design method, enabling the fundamental characteristics of a detached breakwater to be defined as a function of the effect it is intended to induce on the coast, taking into account variables of different natures (climate, geomorphology and geometry) that influence the changes the shoreline undergoes after its construction. This article presents the final result of the investigation undertaken, applying the detached breakwater design method developed to a practical case. It thus shows how the method enables a detached breakwater's geometric pre-sizing to be tackled at a place on the coast with given climate, geomorphology and littoral dynamic characteristics, first setting the final state of equilibrium it is desired to obtain there after construction.
Abstract:
Although still at an early stage, offshore wind development is now characterized by a boom process. This leads to the need for an integral management model for the design of offshore wind facilities, the purpose of which is to achieve technical, economic and environmental viability within a sustainable development framework. The foregoing led to the research project presented in this paper, which consists of drawing up a methodological proposal for offshore wind farms; the methodology takes a global, general point of view, seeking to optimize the overall process of operations leading to the design of this type of installation and to establish collated theoretical bases for the further development of management tools. The methodological proposal follows a classical engineering thought scheme: it begins with the study of alternatives and ends with the detailed design. With this in mind, the paper includes the following sections: introduction, methodology used for the research project, conditioning factors, methodological proposal for the design of offshore wind farms, checking of the methodological proposal, and conclusions.
Abstract:
Offshore wind farms are beginning to form part of coastal and marine landscapes located in dynamic surroundings. An integral management model must therefore be applied to achieve not only the technical and economic viability of the project but also respect for the environment. Amongst other aspects, the latter calls for an analysis of the possible impact these facilities may have on littoral processes, which requires knowing the differences between littoral processes before and after the facility's construction. The maritime climate, the composition of the coast, and the lay-out and characteristics of the facility's components need to be known, particularly the foundations, as they are the main obstacles that waves and currents meet. This article first addresses the different aspects of an offshore wind farm that bear on the analysis of how it affects littoral dynamics and, because of their importance in this study, pays special attention to foundations. Coastal erosion due to this type of facility is then examined. The main conclusion is that, whilst some opinions claim the coast is not affected by the presence of this kind of facility because the distances from the site to the coast and between the wind turbine generators themselves are large, the impact must be analysed in each specific case, at least until experience proves otherwise and criteria are adopted in this respect.