894 results for Computational Mechanics, Numerical Analysis, Meshfree Method, Meshless Method, Time Dependent, MEMS
Abstract:
This work is concerned with the design and analysis of hp-version discontinuous Galerkin (DG) finite element methods for boundary-value problems involving the biharmonic operator. The first part extends the unified approach of Arnold, Brezzi, Cockburn & Marini (SIAM J. Numer. Anal. 39, 5 (2001/02), 1749-1779), developed for the Poisson problem, to the design of DG methods via an appropriate choice of numerical flux functions for fourth-order problems; as an example, we retrieve the interior penalty DG method developed by Süli & Mozolevski (Comput. Methods Appl. Mech. Engrg. 196, 13-16 (2007), 1851-1863). The second part of this work is concerned with a new a-priori error analysis of the hp-version interior penalty DG method, when the error is measured in terms of both the energy-norm and the L2-norm, as well as certain linear functionals of the solution, for elemental polynomial degrees $p\ge 2$. Moreover, provided that the solution is piecewise analytic in an open neighbourhood of each element, exponential convergence is proven for the p-version of the DG method. The sharpness of the theoretical developments is illustrated by numerical experiments.
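As an illustration of the kind of scheme this abstract refers to, the symmetric interior penalty DG bilinear form for the biharmonic problem $\Delta^2 u = f$ can be sketched as follows. This is the standard form from the hp-IP-DG literature, written in generic notation (mesh $\mathcal{T}$, face set $\mathcal{F}$, averages $\{\!\{\cdot\}\!\}$, jumps $[\![\cdot]\!]$), not necessarily the exact form analysed in the thesis:

```latex
B_{\mathrm{DG}}(u,v)
= \sum_{\kappa \in \mathcal{T}} \int_{\kappa} \Delta u \,\Delta v \,\mathrm{d}x
+ \sum_{F \in \mathcal{F}} \int_{F} \Big(
    \{\!\{\Delta u\}\!\}\,[\![\nabla v \cdot \mathbf{n}]\!]
  + \{\!\{\Delta v\}\!\}\,[\![\nabla u \cdot \mathbf{n}]\!]
  - \{\!\{\nabla \Delta u \cdot \mathbf{n}\}\!\}\,[\![v]\!]
  - \{\!\{\nabla \Delta v \cdot \mathbf{n}\}\!\}\,[\![u]\!]
  \Big)\,\mathrm{d}s
+ \sum_{F \in \mathcal{F}} \int_{F} \big(
    \sigma\,[\![u]\!]\,[\![v]\!]
  + \tau\,[\![\nabla u \cdot \mathbf{n}]\!]\,[\![\nabla v \cdot \mathbf{n}]\!]
  \big)\,\mathrm{d}s
```

In the hp-version literature the discontinuity-penalisation parameters are typically chosen as $\sigma \sim p^{6}/h^{3}$ and $\tau \sim p^{2}/h$ to ensure stability for polynomial degree $p$ on elements of size $h$.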
Abstract:
Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by nondestructive Falling Weight Deflectometer (FWD) tests are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as numerical analysis techniques to properly simulate the geomechanical system. The widely used layered pavement analysis program ILLI-PAVE was employed to analyze flexible pavements of various types, including full-depth asphalt and conventional flexible pavements, built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster approximations of the solutions of the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop SOFTSYS models.
The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performances. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results. Thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from SOFTSYS models. The differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability from FWD tests. The backcalculated asphalt concrete layer thickness results matched better in the case of full-depth asphalt flexible pavements built on lime-stabilized soils compared to conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
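The matching idea described above can be sketched with a toy example: a simple evolutionary (GA-style) search adjusts layer parameters until a forward model reproduces a measured deflection basin. Everything here is hypothetical, stated as an assumption: the two-parameter forward model `deflection_basin` merely stands in for ILLI-PAVE (or an ANN surrogate trained on it), and the parameter ranges and units are illustrative only.

```python
import math
import random

random.seed(0)

BOUNDS = ((100.0, 1000.0), (5.0, 50.0))  # hypothetical (HMA, subgrade) moduli ranges, ksi

def deflection_basin(e_hma, e_subgrade, offsets=(0, 300, 600, 900)):
    # Toy forward model standing in for ILLI-PAVE: deflections (mils)
    # decrease with layer stiffness and with sensor offset (mm).
    return [1000.0 / (e_hma + e_subgrade * (1 + r / 300.0)) for r in offsets]

def misfit(params, target):
    # RMS mismatch between predicted and "measured" deflection basins.
    pred = deflection_basin(*params)
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target))

def evolve(target, pop_size=40, gens=60):
    # Minimal GA: elitist selection, uniform crossover, Gaussian mutation.
    pop = [[random.uniform(*b) for b in BOUNDS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: misfit(ind, target))
        elite = pop[: pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < 0.3:                            # mutation
                i = random.randrange(len(child))
                lo, hi = BOUNDS[i]
                child[i] = min(hi, max(lo, child[i] + random.gauss(0, (hi - lo) * 0.05)))
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda ind: misfit(ind, target))

# "Measured" basin generated from known parameters, then backcalculated.
true_params = (400.0, 20.0)
measured = deflection_basin(*true_params)
best = evolve(measured)
```

In the dissertation's setting the expensive finite element run is replaced by an ANN surrogate precisely so that a population-based search like this remains affordable.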
Abstract:
Master's dissertation—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2016.
Abstract:
We shall consider the weak formulation of a linear elliptic model problem with discontinuous Dirichlet boundary conditions. Since such problems are typically not well-defined in the standard H^1-H^1 setting, we will introduce a suitable saddle point formulation in terms of weighted Sobolev spaces. Furthermore, we will discuss the numerical solution of such problems. Specifically, we employ an hp-discontinuous Galerkin method and derive an L^2-norm a posteriori error estimate. Numerical experiments demonstrate the effectiveness of the proposed error indicator in both the h- and hp-version setting. Indeed, in the latter case exponential convergence of the error is attained as the mesh is adaptively refined.
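The solve–estimate–mark–refine cycle behind such adaptive results can be sketched in a toy 1D setting. This is only an illustration under stated assumptions: the "estimator" below is a crude measure of local deviation from a piecewise-constant fit to `math.sqrt` (whose derivative blows up at 0), standing in for the residual-based L2-norm a posteriori indicator of the abstract, and Dörfler (bulk) marking with parameter `theta` drives h-refinement by bisection.

```python
import math

def indicator(f, a, b):
    # Toy local error indicator: deviation of f from its midpoint value
    # on [a, b], scaled by the element size.
    mid = f((a + b) / 2)
    xs = [a + (b - a) * i / 10 for i in range(11)]
    return max(abs(f(x) - mid) for x in xs) * (b - a)

def adapt(f, mesh, theta=0.5, steps=10):
    # Solve -> estimate -> mark -> refine loop (no "solve" needed here,
    # since we approximate a known function).
    for _ in range(steps):
        etas = [indicator(f, a, b) for a, b in mesh]
        order = sorted(range(len(mesh)), key=lambda i: -etas[i])
        total, marked, acc = sum(etas), set(), 0.0
        for i in order:                     # Dörfler (bulk) marking
            marked.add(i)
            acc += etas[i]
            if acc >= theta * total:
                break
        new_mesh = []
        for i, (a, b) in enumerate(mesh):
            if i in marked:
                m = (a + b) / 2
                new_mesh += [(a, m), (m, b)]  # h-refine by bisection
            else:
                new_mesh.append((a, b))
        mesh = new_mesh
    return mesh

mesh = adapt(math.sqrt, [(0.0, 1.0)])
```

As expected, the refinement concentrates near x = 0, where the function is least regular; in the hp-version the marked elements near a singularity would be h-refined while smooth regions get p-enrichment instead.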
Abstract:
The behavior of fluid flow in oil fields is influenced by different factors and has a significant impact on the recovery of hydrocarbons. There is a need to evaluate and adapt current technology to the reality of reservoirs worldwide, not only in exploration (the discovery of reservoirs) but also in the development of those already discovered but not yet produced. In-situ combustion (ISC) is a suitable technique for the recovery of hydrocarbons, although it remains complex to implement. The main objective of this research was to study the application of ISC as an advanced oil recovery technique through a parametric analysis of the process, using vertical wells within a semi-synthetic reservoir with characteristics of the Brazilian Northwest, in order to determine which parameters could influence the process and to verify the technical and economic viability of the method in the oil industry. For this analysis, a commercial reservoir simulation program for thermal processes was used: the Steam, Thermal, and Advanced Processes Reservoir Simulator (STARS) from the Computer Modelling Group (CMG). Through numerical analysis, this study aims to find results that help improve the interpretation and comprehension of the main problems related to the ISC method, which are not yet well understood. From the results obtained, it was shown that the effect of the ISC thermal process on oil recovery is very important, with production rates and cumulative production positively influenced by the application of the method. The application of the method improves oil mobility as a function of the heating produced when the combustion front forms inside the reservoir. Among all the analyzed parameters, the activation energy presented the greatest influence: the lower the activation energy, the larger the fraction of recovered oil, owing to the resulting rise in chemical reaction rates.
It was also verified that the higher the enthalpy of the reaction, the larger the fraction of recovered oil, due to the larger amount of energy released inside the system, which assists the ISC. The reservoir parameters porosity and permeability showed little influence on the ISC. Among the operational parameters analyzed, the injection rate showed the strongest influence on the ISC method: the higher the injection rate, the better the result obtained, mainly because it sustains the combustion front. Regarding the oxygen concentration, an increase in this parameter translates into a higher fraction of recovered oil, because it increases the quantity of fuel burned, helping the advance and maintenance of the combustion front for a longer period of time. Regarding the economic analysis, the ISC method proved to be economically feasible when evaluated through the net present value (NPV) at the injection rates considered: the higher the injection rate, the higher the financial returns of the final project.
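The activation-energy sensitivity reported above follows directly from Arrhenius kinetics, the rate law used for combustion reactions in thermal simulators such as STARS. A minimal sketch, with purely illustrative (hypothetical) values for the pre-exponential factor, activation energies, and front temperature:

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius(a_factor, e_a, temp_k):
    # Reaction-rate constant k = A * exp(-Ea / (R * T)).
    # Lower activation energy Ea, or higher temperature T, gives a
    # faster reaction -- consistent with the parametric trends above.
    return a_factor * math.exp(-e_a / (R * temp_k))

# Same pre-exponential factor, two hypothetical activation energies,
# at an assumed combustion-front temperature of 700 K.
k_low_ea  = arrhenius(1e5, 8.0e4, 700.0)   # lower Ea -> faster reaction
k_high_ea = arrhenius(1e5, 1.2e5, 700.0)   # higher Ea -> slower reaction
```

This is why the study finds activation energy to be the most influential parameter: it enters the rate constant exponentially, while porosity and permeability enter the flow equations only algebraically.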
Abstract:
In this work, simulations of deep excavations in soils of alluvial origin in the city of Sabaneta are carried out using finite element models implemented in the software PLAXIS®. The computed horizontal displacements are compared with measurements from inclinometers installed behind the anchored diaphragm wall of the Centro Comercial Mayorca Fase III project, located in the municipality of Sabaneta, Antioquia. Finally, conclusions are drawn about the sensitivity of the most relevant parameters according to the constitutive model employed and the feasibility of its application to the solution of the problem evaluated.
Abstract:
Michigan depends heavily on fossil fuels to generate electricity. Compared with fossil fuels, electricity generation from renewable energy produces fewer pollutant emissions. A Renewable Portfolio Standard (RPS) is a mandate that requires electric utilities to generate a certain amount of electricity from renewable energy sources. This thesis applies the Cost-Benefit Analysis (CBA) method to investigate the impacts of implementing a 25% RPS in Michigan by 2025. It is found that a 25% RPS would create about $20.12 billion in net benefits to the State. Moreover, if current tax credit policies do not change until 2025, its net present value will increase to about $26.59 billion. Based on the results of this CBA, a 25% RPS should be approved. The results of future studies on the same issue could be improved if more state-specific data become available.
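The net-present-value criterion used in this CBA can be stated in a few lines. The discount rate and cash flows below are hypothetical placeholders, not the thesis's actual figures; the point is only the mechanics of discounting and the accept-if-NPV-positive decision rule:

```python
def npv(rate, cash_flows):
    # Net present value: cash_flows[0] is the initial (usually negative)
    # outlay at t = 0; later entries are discounted by (1 + rate)^t.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical policy with a $10M upfront cost and five years of
# $3M annual net benefits, discounted at an assumed 10% rate.
project = [-10e6, 3e6, 3e6, 3e6, 3e6, 3e6]
accept = npv(0.10, project) > 0  # positive NPV -> the policy passes the CBA test
```

A result like the $20.12 billion figure above is, in these terms, the discounted sum of the RPS's projected benefits minus its discounted costs over the policy horizon.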
Abstract:
Thanks to the advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications in various areas so that people can easily access and manage multimedia data. Towards such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. Then, the TMCA algorithm is proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis.
In this framework, an affinity propagation-based summarization method is also proposed to transform the unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
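The sampling-based ensemble idea for imbalanced data (rare events are, by definition, heavily outnumbered) can be sketched as follows. This is a generic illustration, not the dissertation's actual mechanism: each ensemble member sees the full minority class plus a random undersample of the majority class, here paired with a deliberately trivial nearest-centroid classifier on synthetic 2D data, and predictions are made by majority vote.

```python
import random

random.seed(1)

def centroid(rows):
    # Component-wise mean of a list of feature vectors.
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train_member(majority, minority):
    # Undersample the majority class to the minority size, then fit a
    # nearest-centroid classifier on the balanced subset.
    sample = random.sample(majority, len(minority))
    return centroid(sample), centroid(minority)

def predict(members, x):
    # Majority vote across ensemble members; 1 = rare event.
    votes = sum(1 for c_maj, c_min in members if dist2(x, c_min) < dist2(x, c_maj))
    return 1 if votes > len(members) / 2 else 0

# Synthetic imbalanced data: 200 background points near (0, 0) and
# only 12 rare-event points near (3, 3).
majority = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
minority = [[random.gauss(3, 1), random.gauss(3, 1)] for _ in range(12)]
members = [train_member(majority, minority) for _ in range(9)]
```

Because each member is trained on a balanced subset, the ensemble avoids the trivial "always predict background" solution that a single classifier trained on the raw 200:12 split tends toward.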