Abstract:
One challenge in data assimilation (DA) methods is how the error covariance for the model state is computed. Ensemble methods produce error covariance estimates by propagating the error in time with the non-linear model. Variational methods, on the other hand, use concepts from control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied which avoid the memory storage and huge matrix inversions required by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble and variational methods. It avoids the filter inbreeding problems which emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements are available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate into the model state vector of dimension 30 171, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we present a non-intrusive approach to coupling the model and a DA scheme: an external program sends and receives information between the model and the DA procedure through files.
The advantage of this method is that the changes needed in the model code are minimal: only a few lines which handle input and output. Apart from being simple to couple, the approach can be employed even if the two codes are written in different programming languages, because the communication does not go through code. The non-intrusive approach also accommodates parallel computing, by simply telling the control program to wait until all the processes have ended before the DA procedure is invoked. It is worth mentioning the overhead introduced by the approach, as at every assimilation cycle both the model and the DA procedure have to be initialized. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in Lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images were available for 7 days between May 16 and July 6, 2009. The effect of the organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The results of the VEnKF were compared with the measurements recorded at an automatic station located in the north-western part of the lake. However, due to the sparsity of the TSM data in both time and space, the measurements could not be matched well. The use of multiple automatic stations with real-time data is important to avoid the time-sparsity problem; with DA, this would help, for instance, in better understanding environmental hazard variables. We found that using a very large ensemble size does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance.
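The file-based coupling pattern described above can be sketched in a few lines. In the following Python toy, the file names, the stand-in model dynamics and the nudge-style update are all illustrative assumptions, not the thesis code; the point is only the control-program structure, in which the model and the DA procedure exchange information exclusively through files.

```python
import numpy as np

# Hypothetical file names; the thesis does not prescribe them.
STATE_FILE = "state.txt"
OBS_FILE = "obs.txt"

def run_model_step():
    # Stands in for launching the external model executable
    # (e.g. subprocess.run(["./model"])); the model itself only needs
    # the few added lines that read and write STATE_FILE.
    state = np.loadtxt(STATE_FILE)
    np.savetxt(STATE_FILE, 0.9 * state + 1.0)   # toy dynamics

def assimilate(state, obs):
    # Stand-in for the VEnKF analysis step: nudge the forecast toward
    # the observations where they exist (NaN marks unobserved entries).
    analysis = state.copy()
    mask = ~np.isnan(obs)
    analysis[mask] += 0.5 * (obs[mask] - state[mask])
    return analysis

# Control program: alternate model runs and DA, communicating via files.
np.savetxt(STATE_FILE, np.zeros(4))                  # initial state
np.savetxt(OBS_FILE, [10.0, np.nan, 10.0, np.nan])   # sparse observations

for cycle in range(5):
    run_model_step()                     # model writes its forecast
    forecast = np.loadtxt(STATE_FILE)    # control program reads it back
    analysis = assimilate(forecast, np.loadtxt(OBS_FILE))
    np.savetxt(STATE_FILE, analysis)     # analysis seeds the next cycle

print(np.loadtxt(STATE_FILE))
```

Because both sides only touch the files, either program could be replaced by an implementation in another language without changing the other, which is the portability argument made above.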
The successful implementation of the non-intrusive VEnKF, together with the ensemble-size limit on performance, points to the emerging area of Reduced Order Modeling (ROM). To save computational resources, ROM avoids running the full-blown model. Applying ROM together with the non-intrusive DA approach may result in a cheaper algorithm that relaxes the computational challenges existing in the field of modelling and DA.
Abstract:
The project aims to build an understanding of additive manufacturing (AM) and other Manufacturing 4.0 techniques with an eye toward industrialization. First, the internal material anisotropy of elements created with the most economically feasible FDM technique was established. The main drivers of variability in AM were characterized, with the focus on achieving internal material isotropy. Subsequently, a technique for deposition parameter optimization was presented, and the procedure was further tested on other polymeric materials and composites. A replicability assessment by means of Technology 4.0 was proposed, and subsequent industry findings revealed the need to develop a process that demonstrates how to re-engineer designs in order to obtain the best results with AM processing. The final study applies the Industrial Design and Structure Method (IDES) and all the knowledge previously accumulated to fully re-engineer a product using tools from the 4.0 era, from product feasibility studies to CAE (FEM analysis) and CAM (DfAM). These results should help make AM and FDM processes, combined with composites technologies, a reliable, cost-effective manufacturing method that could also be used for mass-market industry applications.
Abstract:
This postdoctoral study on the application of the RIME intervention in women who had undergone mastectomy and were in treatment aimed to promote psychospiritual and social transformations to improve quality of life, self-esteem and hope. A total of 28 women participated and were randomized into two groups. Brief Psychotherapy (BP) (an average of six sessions) was administered in the Control Group, while RIME (three sessions) plus BP (an average of five sessions) was applied in the RIME Group. The quantitative results indicated a significant improvement (38.3%) in the perception of quality of life after RIME according to the WHOQOL, compared both to the BP of the Control Group (12.5%) and to the BP of the RIME Group (16.2%). There was a significant improvement in self-esteem (Rosenberg) after RIME (14.6%) compared to the BP of the Control Group (which worsened by 35.9%) and to the BP of the RIME Group (8.3%). The improvement in well-being with respect to the focus worked on (Visual Analogue Scale) was significant in the RIME Group (from bad to good), as well as in the Control Group (from unpleasant to good). The qualitative results indicated that RIME promotes creative transformations in the intrapsychic and interpersonal dimensions, so that new meanings and/or new attitudes emerge into consciousness. It was observed that RIME has greater power of psychic structuring and ego strengthening, and provides a faster transformation than BP; it can therefore be indicated for crisis treatment in the hospital environment.
Abstract:
The present review addresses important aspects of nanoparticles and the environment, with an emphasis on plant science. The production and characterization of nanoparticles (NPs) are the focus of this review, giving an idea of the range and consolidation of these aspects in the literature, covering modifications of the synthesis routes and the application of analytical techniques for the characterization of NPs. Additionally, aspects related to the interaction between NPs and plants, their toxicities, and the phytoremediation process, among others, are discussed. Future trends are also presented, supplying evidence for possibilities regarding new research involving nanoparticles and plants.
Abstract:
This work presents a geochemical study of the Pitinga cryolite mineralization through REE and Y analyses of the disseminated and massive cryolite ore deposits, as well as of fluorite occurrences. The REE signatures in fluorite and cryolite are similar to those in the Madeira albite granite. The highest ΣREE values are found in magmatic cryolite (677 to 1345 ppm); ΣREE is lower in massive cryolite, with average values of 10.3 ppm, 6.66 ppm and 8.38 ppm for the nucleated, caramel and white types, respectively. Disseminated fluorite displays higher ΣREE values (1708 and 1526 ppm) than fluorite in late veins (34.81 ppm). Yttrium concentrations are higher in disseminated fluorite and in magmatic cryolite. The evolution of several parameters (total REE, LREE/HREE, Y) was followed throughout the successive stages of evolution of the albite granites and the associated mineralization. At the end of the process, late cryolite with low total REE content was formed. The REE data indicate that the massive cryolite deposit (MCD) was formed by hydrothermal fluids residual from the albite granite, and that these same fluids enriched the disseminated ore through additional formation of hydrothermal disseminated cryolite. The presence of tetrads is poorly defined, although the nucleated, caramel and white cryolite types show evidence of a tetrad effect.
Abstract:
According to some estimates, the world's population is expected to grow by about 50% over the next 50 years. Thus, one of the greatest challenges faced by engineering is to find effective options for food storage and conservation, and some researchers have investigated how to design durable buildings for storing and conserving food. Nowadays, developing concrete with adequate mechanical resistance at room temperature is easily achieved. On the other hand, combining it with low temperatures of approximately −35 °C demands less empiricism: a suitable dosage method and a careful selection of the constituent materials are necessary. This ongoing study involves these parameters. The concrete presented here was analyzed through non-destructive tests that examine the material properties periodically and verify its physical integrity. Concretes with and without incorporated air were studied. The results demonstrated that both are resistant to freezing.
Abstract:
Shot peening is a cold-working mechanical process in which a shot stream is propelled against a component's surface. Its purpose is to introduce compressive residual stresses on component surfaces in order to increase fatigue resistance. The process is widely applied to springs because of their cyclic loading requirements. This paper presents a numerical model of the shot peening process using the finite element method. The results are compared with experimental measurements of the residual stresses, obtained by the X-ray diffraction technique, in leaf springs submitted to this process. Furthermore, the results are compared with empirical and numerical correlations developed by other authors.
Abstract:
This work deals with an improved plane frame formulation whose exact dynamic stiffness matrix (DSM) presents a null determinant uniquely at the natural frequencies. In comparison with the classical DSM, the formulation presented here has some major advantages: local mode shapes are preserved in the formulation, so that for any positive frequency the DSM will never be ill-conditioned, and, in the absence of poles, it is possible to employ the secant method to obtain a more computationally efficient eigenvalue extraction procedure. Applying the procedure to the more general case of Timoshenko beams, we introduce a new technique, named "power deflation", that makes the secant method suitable for the transcendental nonlinear eigenvalue problems based on the improved DSM. In order to avoid overflow occurrences that can hinder the secant method iterations, limiting frequencies are formulated, and scaling is also applied to the eigenvalue problem. Comparisons with results available in the literature demonstrate the strength of the proposed method. Computational efficiency is compared with solutions obtained both by FEM and by the Wittrick-Williams algorithm.
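As a minimal illustration of the root-finding idea (a plain secant iteration, not the thesis's "power deflation" scheme), the secant method can be applied to a transcendental characteristic function. Here the fixed-free axial bar, whose frequencies satisfy cos(βL) = 0, stands in for det(DSM(ω)) = 0.

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant iteration for f(x) = 0, suitable for transcendental
    characteristic equations such as det(DSM(omega)) = 0."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

# Toy characteristic function: an axially vibrating fixed-free bar has
# characteristic equation cos(beta * L) = 0, with non-dimensional roots
# beta*L = pi/2, 3*pi/2, ...  The improved DSM determinant plays the
# same role for plane frames.
root = secant(math.cos, 1.4, 1.6)
print(root)   # close to pi/2 ≈ 1.5708
```

Note that the secant method needs no derivative of the determinant, which is what makes it attractive for a transcendental DSM; the limiting-frequency and scaling devices mentioned above exist precisely to keep such iterations from overflowing.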
Abstract:
The implementation of confidential contracts between a container liner carrier and its customers, following the Ocean Shipping Reform Act (OSRA) of 1998, demands a revision of the methodology applied in the carrier's marketing and sales planning. The marketing and sales planning process should be more scientific and make better use of operational research tools, since the selection of the customers under contract, the duration of the contracts, the freight rates, and the container imbalances of these contracts are basic factors for the carrier's yield. This work aims to develop a decision support system, based on a linear programming model, to generate the business plan for a container liner carrier, maximizing the contribution margin of its freight.
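A toy version of such a linear programming model can be sketched with `scipy.optimize.linprog`. All contract data below (margins, slot capacity, imbalance bounds, demand caps) are illustrative assumptions, not the thesis's model; the structure shows how contract selection, capacity and container imbalance enter as LP constraints.

```python
from scipy.optimize import linprog

# Hypothetical toy instance: three candidate contracts, decision
# variables x_i = contracted volume (TEU/week).
margin = [120.0, 95.0, 140.0]          # contribution margin per TEU
capacity = 1000.0                      # vessel slot capacity (TEU/week)
imbalance = [1.0, -1.0, 1.0]           # +1 head-haul leg, -1 back-haul leg
max_imbalance = 200.0                  # tolerated net container imbalance

res = linprog(
    c=[-m for m in margin],            # linprog minimizes, so negate margins
    A_ub=[[1, 1, 1],                   # total volume within slot capacity
          imbalance,                   # net imbalance bounded above ...
          [-v for v in imbalance]],    # ... and below
    b_ub=[capacity, max_imbalance, max_imbalance],
    bounds=[(0, 600), (0, 600), (0, 600)],   # per-contract demand caps
)
print(res.x, -res.fun)                 # optimal volumes and total margin
```

In this toy instance the optimum fills the ship (1000 TEU) while holding the head-haul/back-haul imbalance at its 200 TEU limit, which is exactly the trade-off the abstract identifies between freight margin and container repositioning.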
Abstract:
Background: The present work applies decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision was obtained from a determined set of variables of the selected films. Methods: Based on the routine of a radiology service, a decision probability function was determined for each considered combination of characteristics. These characteristics were related to film quality control. The parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function for assessing the decision risk, we used a single parameter called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results: Depending on the value of r, more or less risk is taken in the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made using the opinions of patients or administrators from a radiology service center. Conclusion: The model is a formal quantitative approach to decisions about medical imaging quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
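The combinatorics above can be reproduced directly: three binary payoffs give 2³ = 8 outcome combinations, and a decision rule assigns accept/reject to each of them, giving 2⁸ = 256 rules. The r-based utility shown here is a stand-in form (an exponential risk attitude over an additive payoff score), since the abstract does not give the exact function.

```python
import math
from itertools import product

# Three binary payoff attributes (diagnosis correct?, cost low?,
# patient satisfied?) give 2**3 = 8 outcome combinations; a decision
# rule maps each combination to accept/reject, hence 2**8 = 256 rules.
outcomes = list(product([0, 1], repeat=3))
rules = list(product([0, 1], repeat=len(outcomes)))

def utility(outcome, r):
    # Assumed single-parameter utility: exponential risk attitude over
    # a simple additive payoff score (0 = worst, 3 = best); r controls
    # how much risk the decision-maker accepts.
    score = sum(outcome)
    return score if r == 0 else (1 - math.exp(-r * score)) / r

# Expected utility of accepting, under uniform outcome probabilities.
p = 1 / len(outcomes)
for r in (0.1, 1.0):
    eu = sum(p * utility(o, r) for o in outcomes)
    print(f"r={r}: expected utility {eu:.3f}")

print(len(outcomes), len(rules))   # 8 256
```

Ranking the 256 rules by expected utility for a given r is then a direct enumeration, which is how a small decision table like this one can be searched exhaustively.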
Abstract:
Background: Cerebral palsy (CP) patients have motor limitations that can affect functionality and abilities for activities of daily living (ADL). Health-related quality of life and health status instruments validated for these patients do not directly approach the concepts of functionality or ADL. The Child Health Assessment Questionnaire (CHAQ) seems to be a good instrument to approach this dimension, but it had never been used for CP patients. The purpose of the study was to verify the psychometric properties of the CHAQ applied to children and adolescents with CP. Methods: Parents or guardians of children and adolescents with CP, aged 5 to 18 years, answered the CHAQ. A healthy group of 314 children and adolescents had been recruited during the validation of the Brazilian version of the CHAQ. Data quality, reliability and validity were studied. Motor function was evaluated by the Gross Motor Function Measure (GMFM). Results: Ninety-six parents/guardians answered the questionnaire. The age of the patients ranged from 5 to 17.9 years (average: 9.3). The rate of missing data was low (<9.3%). A floor effect was observed in two domains, being higher only in the visual analogue scales (<=35.5%). The ceiling effect was significant in all domains and particularly high in patients with quadriplegia (81.8 to 90.9%) and extrapyramidal CP (45.4 to 91.0%). The Cronbach alpha coefficient ranged from 0.85 to 0.95.
The validity was appropriate: for discriminant validity, the correlation of the disability index with the visual analogue scales was not significant; for convergent validity, the CHAQ disability index had a strong correlation with the GMFM (0.77); for divergent validity, there was no correlation between the GMFM and the pain and overall evaluation scales; for criterion validity, the GMFM as well as the CHAQ detected differences in the scores among the clinical types of CP (p < 0.01); for construct validity, the patients' disability index score (mean: 2.16; SD: 0.72) was higher than that of the healthy group (mean: 0.12; SD: 0.23) (p < 0.01). Conclusion: The reliability and validity of the CHAQ were adequate for this population. However, further studies are necessary to verify the influence of the ceiling effect on the responsiveness of the instrument.
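The internal-consistency coefficient reported above (Cronbach's alpha) can be computed from an item-score matrix as shown below; the respondent data are illustrative, not the study's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of row totals
    return k / (k - 1) * (1 - item_var / total_var)

# Illustrative data (not the study's): four items, five respondents.
scores = [[3, 3, 3, 2],
          [2, 2, 3, 3],
          [3, 3, 2, 3],
          [1, 1, 1, 1],
          [2, 1, 2, 2]]
print(round(cronbach_alpha(scores), 2))   # → 0.89
```

Values in the 0.85 to 0.95 range reported for the CHAQ domains indicate that the items within each domain covary strongly, i.e. high internal consistency.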
Abstract:
The tomato culture demands large quantities of mineral nutrients, which are supplied by synthetic fertilizers in the conventional cultivation system. In the organic cultivation system, only alternative fertilizers allowed by the certifiers and accepted as safe for humans and the environment are used. The chemical compositions of rice bran, oyster flour, cattle manure and ground charcoal, as well as of soils and tomato fruits, were evaluated by instrumental neutron activation analysis (INAA). The potential contribution of organic fertilizers to the enrichment of chemical elements in the soil, and their transfer to the fruits, was investigated using concentration ratios for fertilizer and soil samples, and also for soil and tomato. The results showed that these alternative fertilizers can be taken as important sources of Br, Ca, Ce, K, Na and Zn for the organic tomato culture.
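The concentration ratios mentioned above are simple quotients of measured concentrations; the element values below are illustrative, not the paper's INAA results.

```python
# Concentration ratios as used in the study: fertilizer/soil indicates
# potential soil enrichment, fruit/soil indicates transfer to the crop.
# Concentrations (mg/kg) are illustrative, not measured values.
fertilizer = {"Zn": 150.0, "K": 18000.0, "Br": 12.0}
soil       = {"Zn": 60.0,  "K": 9000.0,  "Br": 4.0}
fruit      = {"Zn": 25.0,  "K": 24000.0, "Br": 1.5}

for el in fertilizer:
    cr_enrich  = fertilizer[el] / soil[el]   # > 1: fertilizer can enrich soil
    cr_transfer = fruit[el] / soil[el]       # > 1: element accumulates in fruit
    print(f"{el}: fertilizer/soil={cr_enrich:.2f}, fruit/soil={cr_transfer:.2f}")
```

A ratio above 1 in the first column flags an element the fertilizer can add to the soil; the second column then shows whether the soil pool actually reaches the fruit.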
Abstract:
The agricultural supplies used in the organic system to control pests and diseases, as well as to fertilize the soil, are claimed to be beneficial to plants and innocuous to human health and the environment. The chemical composition of six agricultural supplies commonly used in the organic tomato culture was evaluated by instrumental neutron activation analysis (INAA). The results were compared to the maximum limits established by the Environment Control Agency of the Sao Paulo State (CETESB) and the Guidelines for the Organic Quality Standard of the Instituto Biodinamico (IBD). Concentrations above the reference values were found for Co, Cr and Zn in compost, for Cr and Zn in cattle manure, and for Zn in rice bran.
Abstract:
Nb3Sn is one of the most widely used superconducting materials for applications in high magnetic fields. The improvement of the critical current density (Jc) is important and must be analyzed together with the optimization of the flux pinning acting in the material. For Nb3Sn, it is known that grain boundaries are the most effective pinning centers. However, the introduction of artificial pinning centers (APCs) with different superconducting properties has been shown to be beneficial for Jc. As these APCs are normally on the nanometric scale, the conventional heat treatment profiles used for Nb3Sn wires cannot be applied directly, as they lead to excessive grain growth and/or an increase in the APC cross sections. In this work, the heat treatment profiles for Nb3Sn superconductor wires with nanometric-scale Cu(Sn) artificial pinning centers were analyzed in an attempt to improve Jc. A methodology is described to optimize the heat treatment profiles with respect to the diffusion, reaction and formation of the superconducting phases. Microstructural, transport and magnetic characterizations were performed in an attempt to identify the pinning mechanisms acting in the samples. It was concluded that the maximum current densities were obtained when the normal phases (due to the introduction of the APCs) act as the main pinning centers in the global behavior of the Nb3Sn superconducting wire.
Abstract:
The power loss reduction in distribution systems (DSs) is a nonlinear multiobjective problem. Service restoration in DSs is even harder computationally, since it additionally requires a solution in real time. Both DS problems are computationally complex, and for large-scale networks the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. In addition, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results have shown that the MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN has shown a running time that is sublinear in the system size: tests with networks ranging from 632 to 5166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
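A minimal sketch of the representation behind the NDE, assuming the standard DFS-ordered list of (node, depth) pairs (the paper's specific genetic operators are not reproduced here): each tree of the radial network is stored in depth-first order, so radiality holds by construction and constraint equations checking it can be dropped, which is the simplification the abstract describes.

```python
def nde_from_tree(adj, root):
    """DFS a tree given as an adjacency dict; return the node-depth
    encoding [(node, depth), ...] in depth-first order.  Subtree
    transfers between feeders then become simple list operations."""
    encoding, stack, seen = [], [(root, 0)], set()
    while stack:
        node, depth = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        encoding.append((node, depth))
        for nxt in sorted(adj.get(node, ()), reverse=True):
            if nxt not in seen:
                stack.append((nxt, depth + 1))
    return encoding

# Toy feeder: bus 0 is the substation, edges are closed switches.
adj = {0: [1, 4], 1: [0, 2, 3], 2: [1], 3: [1], 4: [0]}
print(nde_from_tree(adj, 0))
# → [(0, 0), (1, 1), (2, 2), (3, 2), (4, 1)]
```

Because any (node, depth) list produced this way describes a connected radial tree, an EA that mutates such lists (rather than raw switch states) never generates the infeasible meshed or islanded configurations that would otherwise need explicit constraints.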