967 results for Palaeomagnetism Applied to Tectonics
Abstract:
Due to the particular characteristics of the fusion products, i.e. very short pulses (less than a few μs long for ions when arriving at the walls; less than 1 ns long for X-rays), very high fluences (10¹³ particles/cm² for both ions and X-ray photons) and broad particle energy spectra (up to 10 MeV ions and 100 keV photons), the laser fusion community lacks facilities to accurately test plasma facing materials under those conditions. In the present work, the ability of ultraintense lasers to create short pulses of energetic particles at high fluences is addressed as a solution to reproduce those ion and X-ray bursts. Based on those parameters, a comparison between fusion ions and laser-driven ion beams is presented and discussed, describing a possible experimental set-up to generate the appropriate ion pulses with lasers. At the same time, the possibility of generating X-ray or neutron beams which simulate those of laser fusion environments is also indicated and assessed under current laser intensities. It is concluded that ultraintense lasers should play a relevant role in the validation of materials for laser fusion facilities.
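As a rough illustration of why these conditions are hard to reproduce, the instantaneous flux implied by the quoted fluence and pulse lengths can be estimated directly. The sketch below uses only the figures given in the abstract; the exact pulse durations are placeholders within the stated ranges, not values from the paper.

```python
# Back-of-the-envelope flux estimate from the fluence and pulse durations
# quoted in the abstract (representative values, not the paper's detailed
# calculations).

FLUENCE = 1e13            # particles/cm^2, ions and X-ray photons alike

ion_pulse = 3e-6          # s, "a few microseconds" for ions at the wall
xray_pulse = 1e-9         # s, "< 1 ns" for X-rays

ion_flux = FLUENCE / ion_pulse    # ~3e18 particles/cm^2/s
xray_flux = FLUENCE / xray_pulse  # ~1e22 photons/cm^2/s

print(f"ion flux   ~ {ion_flux:.1e} particles/cm^2/s")
print(f"X-ray flux ~ {xray_flux:.1e} photons/cm^2/s")
```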
Abstract:
The ability of ultraintense lasers to create short pulses of energetic particles at high fluences is addressed as a solution to reproduce the ion and X-ray ICF bursts for the characterization and validation of plasma facing components. The possibility of using a laser-driven neutron source for material testing will also be discussed.
Abstract:
Fractal and multifractal are concepts that have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This shouldn't pose a special problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we don't have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution, so it does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points of the experimental data to use, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology that involves two basic steps:
- Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, thereby considering the implications of reducing the number of points.
- Evaluation of the significance of the difference between the slope fitted with the two extreme points and the slope fitted with the available points.
We compare the results of applying this methodology with those of the commonly used least squares one, as illustrated in the sketch below. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology.
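To make the contrast concrete, the following minimal sketch compares an ordinary least squares slope with a robust (Huber) slope on a synthetic log-log scaling line containing one aberrant point. It illustrates the general idea, not the paper's actual point-selection procedure; all data and parameters are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor

rng = np.random.default_rng(0)

# Synthetic log-log scaling data: true slope 0.7 plus noise, with one
# outlier at the largest scale (the kind of point a researcher might
# otherwise discard by hand).
x = np.linspace(0, 3, 20).reshape(-1, 1)
y = 0.7 * x.ravel() + rng.normal(0, 0.02, 20)
y[-1] += 1.0

ols = LinearRegression().fit(x, y)
robust = HuberRegressor().fit(x, y)

# The OLS slope is dragged by the outlier; the Huber slope stays near 0.7,
# which is why robust fits can reduce subjective range selection.
print(f"OLS slope:    {ols.coef_[0]:.3f}")
print(f"Robust slope: {robust.coef_[0]:.3f}")
```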
Abstract:
Salamanca, situated in the center of Mexico, is among the cities that suffer most from air pollution in Mexico. The vehicle fleet and industry, together with the orography and climatic characteristics, have favoured the increase in the concentration of the pollutant sulphur dioxide (SO2). In this work, a Multilayer Perceptron Neural Network has been used to predict the pollutant concentration one hour ahead. The database used to train the Neural Network corresponds to historical time series of meteorological variables and SO2 air pollutant concentrations. Before the prediction, Fuzzy c-Means and K-means clustering algorithms were implemented in order to find relationships among the pollutant and meteorological variables. Our experiments with the proposed system show the importance of this set of meteorological variables for the prediction of SO2 pollutant concentrations, as well as the efficiency of the neural network. The performance is estimated using the Root Mean Square Error (RMSE) and the Mean Absolute Error (MAE). The results showed that the information obtained in the clustering step allows a one-hour-ahead prediction using data from the past 2 hours.
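A minimal sketch of this kind of pipeline is shown below: K-means labels derived from meteorological variables are appended to two hours of lagged measurements before training an MLP. The feature layout, network size and data are placeholders, not the configuration used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(1)

# Placeholder hourly series: SO2 plus two meteorological variables
# (e.g. wind speed and temperature); real data would come from stations.
n = 500
met = rng.normal(size=(n, 2))
so2 = 0.5 * met[:, 0] - 0.3 * met[:, 1] + rng.normal(0, 0.1, n)

# Step 1: cluster the meteorological conditions (the study uses both
# K-means and Fuzzy c-Means; only K-means is sketched here).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(met)

# Step 2: features = past 2 hours of SO2 and meteorology + cluster label;
# target = SO2 one hour ahead.
X = np.column_stack([so2[:-2], so2[1:-1], met[:-2], met[1:-1], labels[1:-1]])
y = so2[2:]

split = int(0.8 * len(y))
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X[:split], y[:split])
pred = mlp.predict(X[split:])

rmse = np.sqrt(mean_squared_error(y[split:], pred))
mae = mean_absolute_error(y[split:], pred)
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}")
```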
Abstract:
Enhanced learning environments are arising with great success within the field of cognitive skills training in minimally invasive surgery (MIS), because they provide multiple benefits: they avoid time, spatial and cost constraints. TELMA [1,2] is a new technology-enhanced learning platform that promotes collaborative and ubiquitous training of surgeons. This platform is based on four main modules: an authoring tool, a learning content and knowledge management system, an evaluation module and a professional network. TELMA has been designed and developed with a focus on the user; therefore it is necessary to carry out a user validation as the final stage of the development. For this purpose, e-MIS validity [3] has been defined. This validation includes usability, contents and functionality validities, both for the development and production stages of any e-Learning web platform. Using e-MIS validity, the e-Learning platform is fully validated, since it includes subjective and objective metrics. The purpose of this study is to specify and apply a set of objective and subjective metrics using e-MIS validity to test the usability, contents and functionality of the TELMA environment within the development stage.
Abstract:
In this paper we present a new tool to perform guided HAZOP analyses. This tool uses a functional model of the process that merges its functional and structural information in a natural way. The functional modeling technique used is called D-higraphs. This tool solves some of the problems and drawbacks of other existing methodologies for the automation of HAZOPs. The applicability and ease of understanding of the proposed methodology are shown in an industrial case.
Abstract:
To perform advanced manipulation of remote environments, such as grasping, more than one finger is required, implying higher requirements for the control architecture. This paper presents the design and control of a modular 3-finger haptic device that can be used to interact with virtual scenarios or to teleoperate dexterous remote hands. In a modular haptic device, each module allows interaction with a scenario using a single finger; hence, multi-finger interaction can be achieved by adding more modules. Control requirements for a multi-finger haptic device are analyzed, and a new hardware/software architecture for these kinds of devices is proposed. The software architecture described in this paper is distributed, and the different modules communicate to allow remote manipulation. Moreover, an application in which this haptic device is used to interact with a virtual scenario is shown.
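The abstract does not spell out the software interfaces, but the modular idea can be sketched roughly as below: each finger module runs its own control step and reports its state to a coordinator, so adding a finger is just registering another module. All class names and the message layout are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FingerState:
    """State one finger module reports each control cycle (hypothetical layout)."""
    position: tuple[float, float, float]   # fingertip position (m)
    force: tuple[float, float, float]      # rendered force (N)

class FingerModule:
    """One single-finger haptic module with its own control loop."""
    def __init__(self, name: str):
        self.name = name

    def step(self) -> FingerState:
        # A real module would read encoders and render contact forces here.
        return FingerState((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

class HapticCoordinator:
    """Aggregates any number of modules: multi-finger = more modules."""
    def __init__(self):
        self.modules: list[FingerModule] = []

    def add_module(self, module: FingerModule) -> None:
        self.modules.append(module)

    def cycle(self) -> dict[str, FingerState]:
        # In the distributed architecture each module would run remotely and
        # communicate over the network; this local loop stands in for that.
        return {m.name: m.step() for m in self.modules}

coordinator = HapticCoordinator()
for finger in ("thumb", "index", "middle"):     # a 3-finger device
    coordinator.add_module(FingerModule(finger))
print(coordinator.cycle())
```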
Abstract:
A global Lagrangian descriptor applied to the Kuroshio current
Abstract:
Minor actinides (MAs) transmutation is a main design objective of advanced nuclear systems such as Generation IV Sodium Fast Reactors (SFRs). In advanced fuel cycles, the MA content of the final high-level waste packages is a main contributor to short-term heat production as well as to long-term radiotoxicity. Therefore, MA transmutation would have an impact on repository designs and would reduce the environmental burden of nuclear energy. In order to predict such consequences, Monte Carlo (MC) transport codes are used in reactor design tasks, and they are important complements and references for routinely used deterministic computational tools. In this paper two promising Monte Carlo transport-coupled depletion codes, EVOLCODE and SERPENT, are used to examine the impact of MA burning strategies in a 3600 MWth SFR core. The core concept proposed for MA loading in two configurations is the result of an optimization effort upon a preliminary reference design to reduce the reactivity insertion caused by sodium voiding, one of the main concerns of this technology. The objective of this paper is twofold. Firstly, the efficiencies of the two core configurations for MA transmutation are addressed and evaluated in terms of actinide mass changes and reactivity coefficients; results are compared with those without MA loading. Secondly, a comparison of the two codes is provided, and the discrepancies in the results are quantified and discussed.
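As a worked illustration of the "actinide mass changes" metric, a transmutation efficiency can be expressed as the relative MA mass destroyed over a cycle. The sketch below does exactly that with invented beginning- and end-of-cycle inventories, not values from the EVOLCODE/SERPENT calculations.

```python
# Transmutation efficiency as relative minor-actinide mass change over a
# cycle. All inventories below are invented placeholders.

ma_boc_kg = {"Am241": 820.0, "Am243": 310.0, "Np237": 450.0, "Cm244": 95.0}
ma_eoc_kg = {"Am241": 700.0, "Am243": 285.0, "Np237": 400.0, "Cm244": 110.0}

boc = sum(ma_boc_kg.values())   # beginning of cycle
eoc = sum(ma_eoc_kg.values())   # end of cycle

destroyed = boc - eoc
efficiency = destroyed / boc
print(f"MA destroyed: {destroyed:.1f} kg ({efficiency:.1%} of initial load)")

# Per-isotope breakdown: some isotopes (e.g. Cm244 here) can build up even
# while the total MA inventory decreases.
for iso in ma_boc_kg:
    delta = ma_eoc_kg[iso] - ma_boc_kg[iso]
    print(f"{iso}: {delta:+.1f} kg")
```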
Abstract:
This work arises in response to the increasing need of biologists and doctors for tools for the visual analysis of data. When dealing with multidimensional data, such as medical data, traditional data mining techniques can be tedious and complex, even for some medical experts. Therefore, it is necessary to develop useful visualization techniques that can complement the expert's criterion and, at the same time, visually stimulate and ease the process of obtaining knowledge from a dataset. Thus, the process of interpreting and understanding the data can be greatly enriched. Multidimensionality is inherent to any medical data, requiring a time-consuming effort to get a clinically useful outcome. Unfortunately, neither clinicians nor biologists are trained in managing more than four dimensions. Specifically, we aimed to design a 3D visual interface for gene profile analysis that is easy to use for both medical and biology experts. In this way, a new analysis method is proposed: MedVir. This is a simple and intuitive analysis mechanism based on the visualization of any multidimensional medical data in a three-dimensional space, which allows interaction with experts in order to collaborate on and enrich this representation. In other words, MedVir performs a powerful reduction in data dimensionality in order to represent the original information in a three-dimensional environment. The experts can interact with the data and draw conclusions in a visual and quick way.
Abstract:
This paper refers to the numerical solution of the classical Darcy problem of plane fluid flow through isotropic media. Regarding the numerical procedure, the Laplace equation is a classical one in mathematical physics, and several procedures have been devised in order to solve it. So as to show the capability of the method, the paper presents some examples.
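For plane Darcy flow in an isotropic medium, the hydraulic head h satisfies Laplace's equation, ∇²h = 0, and the simplest numerical treatment is a finite-difference relaxation on a grid. The sketch below is a generic Jacobi iteration under assumed boundary conditions, not the particular numerical procedure of the paper.

```python
import numpy as np

# Solve Laplace's equation  d2h/dx2 + d2h/dy2 = 0  on a unit square with
# Jacobi relaxation. Boundary heads are assumed: h = 1 on the left edge,
# h = 0 on the right edge, no-flow on top and bottom.

n = 51
h = np.zeros((n, n))
h[:, 0] = 1.0                      # upstream head
h[:, -1] = 0.0                     # downstream head

for _ in range(5000):
    h_new = h.copy()
    # Average of the four neighbours at every interior node.
    h_new[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                h[1:-1, :-2] + h[1:-1, 2:])
    # No-flow boundaries: mirror the adjacent interior row.
    h_new[0, 1:-1] = h_new[1, 1:-1]
    h_new[-1, 1:-1] = h_new[-2, 1:-1]
    if np.max(np.abs(h_new - h)) < 1e-6:
        h = h_new
        break
    h = h_new

# Darcy velocity from the head gradient, v = -K grad(h), with K taken as 1.
vy, vx = np.gradient(-h)
print(f"head at centre: {h[n // 2, n // 2]:.3f}")
print(f"max |v|: {np.hypot(vx, vy).max():.3f}")
```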
Abstract:
The use of data mining techniques for the discovery of gene profiles of diseases, such as cancer, is becoming usual in many research efforts. These techniques do not usually analyze in depth the relationships between genes, which depend on the different manifestations of the disease (related to patients). This kind of analysis takes a considerable amount of time and is not always the focus of the research. However, it is crucial in order to generate personalized treatments to fight the disease. Thus, this research focuses on finding a mechanism for gene profile analysis to be used by medical and biology experts. Results: In this research, the MedVir framework is proposed. It is an intuitive mechanism based on the visualization of medical data such as gene profiles, patients, clinical data, etc. MedVir, which is based on an Evolutionary Optimization technique, is a Dimensionality Reduction (DR) approach that presents the data in a three-dimensional space. Furthermore, thanks to Virtual Reality technology, MedVir allows the expert to interact with the data in order to tailor it to the experience and knowledge of the expert.
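MedVir's own projection is driven by evolutionary optimization, which is not reproduced here; the sketch below instead uses PCA as a stand-in to show the general shape of such a pipeline, reducing a synthetic gene-expression matrix to three coordinates per patient, ready for a 3D viewer.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Synthetic gene-expression matrix: 60 patients x 500 genes, with two
# classes shifted apart on a subset of genes (placeholder data only).
X = rng.normal(size=(60, 500))
X[30:, :25] += 1.5                       # second group over-expresses 25 genes
labels = np.array([0] * 30 + [1] * 30)

# Reduce to 3 dimensions. MedVir uses evolutionary optimization for this
# step; PCA is used here only as a simple, well-known stand-in.
coords3d = PCA(n_components=3).fit_transform(X)

# Each patient now has (x, y, z) coordinates for a 3D interactive view.
for group in (0, 1):
    centre = coords3d[labels == group].mean(axis=0)
    print(f"group {group} centroid: {np.round(centre, 2)}")
```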
Abstract:
This paper tackles the optimization of applications in multi-provider hybrid cloud scenarios from an economic point of view. In these scenarios, the great majority of solutions offer the automatic allocation of resources on different cloud providers based on their current prices. Our approach, however, is intended to introduce a novel solution by making maximum use of divide and conquer. This paper describes a methodology to create cost-aware cloud applications that can be broken down into the three most important components in cloud infrastructures: computation, network and storage. A real videoconference system has been modified in order to evaluate this idea with both theoretical and empirical experiments; this system has become a widely used tool in several national and European projects for e-learning and collaboration purposes.
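A toy version of such a cost breakdown is sketched below: the application's cost is the sum of computation, network and storage terms, each priced independently so that every component can be placed with the cheapest provider. Providers, prices and demands are invented, not taken from the paper.

```python
# Toy cost-aware placement: price each component (computation, network,
# storage) independently and pick the cheapest provider per component.
# All providers and prices below are invented placeholders.

prices = {
    # provider: ($ per vCPU-hour, $ per GB transferred, $ per GB-month stored)
    "provider_a": (0.050, 0.090, 0.023),
    "provider_b": (0.046, 0.120, 0.020),
    "provider_c": (0.055, 0.080, 0.025),
}

# Hypothetical monthly demand of the videoconference system.
vcpu_hours, gb_transfer, gb_stored = 2000.0, 5000.0, 300.0
demand = (vcpu_hours, gb_transfer, gb_stored)

# Monolithic deployment: one provider carries everything.
monolithic = min(sum(p * d for p, d in zip(prices[name], demand))
                 for name in prices)

# Split deployment: each component goes where it is cheapest.
split = sum(min(prices[name][i] for name in prices) * demand[i]
            for i in range(3))

print(f"cheapest single provider: {monolithic:8.2f} $/month")
print(f"component-wise placement: {split:8.2f} $/month")
```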
Abstract:
Handling security intrusions in large systems is a problem due to the lack of scalability of current IDS-based approaches. This paper describes the RECLAMO project, in which an architecture for an Automated Intrusion Response System (AIRS) is being proposed. This system will infer the most appropriate response for a given attack, taking into account the attack type, context information, and the trust and reputation of the reporting IDSs. RECLAMO proposes a novel approach: diverting the attack to a specific honeynet that has been dynamically built based on the attack information. Among all the components forming RECLAMO's architecture, this paper mainly focuses on defining a trust and reputation management model, essential to recognize whether IDSs are exhibiting honest behavior in order to accept their alerts as true. Experimental results confirm that our model helps to encourage or discourage the launch of the automatic reaction process.
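The abstract does not give the concrete model; as a generic illustration of IDS trust scoring, the sketch below maintains a Beta-reputation score per IDS, updated from how often its alerts were later confirmed, and gates the automatic response on that score. The update rule and threshold are assumptions, not RECLAMO's actual model.

```python
# Generic Beta-reputation score for reporting IDSs (an illustration of the
# idea, not RECLAMO's actual model). Each confirmed alert counts as honest
# evidence, each false alert as dishonest evidence.

class IdsReputation:
    def __init__(self):
        self.honest = 1.0       # Beta prior: alpha = beta = 1 (no history)
        self.dishonest = 1.0

    def record_alert(self, confirmed: bool) -> None:
        if confirmed:
            self.honest += 1.0
        else:
            self.dishonest += 1.0

    @property
    def trust(self) -> float:
        # Expected value of the Beta distribution.
        return self.honest / (self.honest + self.dishonest)

TRUST_THRESHOLD = 0.7           # assumed cut-off for reacting automatically

ids = IdsReputation()
for confirmed in (True, True, False, True, True, True):
    ids.record_alert(confirmed)

if ids.trust >= TRUST_THRESHOLD:
    print(f"trust={ids.trust:.2f}: accept alert, launch automatic response")
else:
    print(f"trust={ids.trust:.2f}: distrust alert, require manual review")
```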
Abstract:
This work studies the most beneficial way of allocating water in an irrigation community in water shortage situations. To this end, it proposes that the irrigation surface area be divided into homogeneous zones, each with a benefit relationship with respect to the water applied. The mathematical formula that enables one to obtain the optimal quota for the users or for the irrigation community as a whole has been found for individual relations of a quadratic or power type, and these have yielded different and complementary characteristics. Dimensionless variables have been used to display the results and to compare them with alternative allocation rules, such as the proportional rule, referencing the situation without water restrictions. As a result, for each water shortage situation, the water that is allocated to each user is obtained, together with the losses in individual income and the losses for the community as a whole. Furthermore, a proposal is put forth for establishing the marginal benefit of the available water, which could be of interest in enabling each community to analyze whether it is in its best interest to invest in increasing the resource, or to sell the resource to other users. Finally, an example is given to demonstrate how the method works and to show that, when the differences between the production schemes are considered, the differences in benefit reduction between the proportional allocation and the optimal allocation are also sizeable.
Source: http://ascelibrary.org/doi/abs/10.1061/(ASCE)IR.1943-4774.0000667
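For quadratic benefit relations the optimal quota has a simple structure: with zone benefits b_i(w) = a_i·w − c_i·w² and total supply W, maximizing the total benefit equalizes marginal benefits across zones, a_i − 2·c_i·w_i = λ. The sketch below solves this in closed form with invented coefficients and compares the result with the proportional rule; all numbers are illustrative, not the paper's data.

```python
import numpy as np

# Quadratic zone benefits b_i(w) = a_i*w - c_i*w**2 (invented coefficients).
a = np.array([10.0, 8.0, 6.0])     # marginal benefit at w = 0
c = np.array([0.05, 0.04, 0.03])   # curvature of each zone's benefit

full_demand = a / (2 * c)          # unrestricted optimum per zone
W = 0.6 * full_demand.sum()        # shortage: only 60% of demand available

# Equal-marginal-benefit condition a_i - 2*c_i*w_i = lambda with
# sum(w_i) = W gives a closed form for lambda (interior solution assumed).
lam = (np.sum(a / (2 * c)) - W) / np.sum(1 / (2 * c))
w_opt = (a - lam) / (2 * c)

# Proportional rule for comparison: every zone cut by the same fraction.
w_prop = full_demand * W / full_demand.sum()

def total_benefit(w):
    return float(np.sum(a * w - c * w**2))

# lambda is the marginal benefit of one extra unit of water, the quantity
# the community could weigh against the cost of acquiring (or selling) it.
print(f"marginal benefit of water (lambda): {lam:.2f}")
print(f"optimal quotas:      {np.round(w_opt, 1)}  benefit={total_benefit(w_opt):.1f}")
print(f"proportional quotas: {np.round(w_prop, 1)}  benefit={total_benefit(w_prop):.1f}")
```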