8 results for Offshore systems modelling and control

at Universidad de Alicante


Relevance:

100.00%

Publisher:

Abstract:

Numerical modelling methodologies are important for their application to engineering and scientific problems, because many processes cannot be described by closed-form analytical expressions. When the only available information is a set of experimental values of the variables that determine the state of the system, the modelling problem is equivalent to determining the hyper-surface that best fits the data. This paper presents a methodology, based on the Galerkin formulation of the finite element method, to obtain representations of relationships defined a priori between a set of variables: y = z(x1, x2, ..., xd). These representations are generated from the values of the variables in the experimental data. The piecewise approximation is an element of a Sobolev space and has derivatives defined, in a generalized sense, within this space. The approach leads to a linear system whose structure admits a fast solver algorithm, so the method can be used in a variety of fields, making it a multidisciplinary tool. The validity of the methodology is studied in two real applications: a problem in hydrodynamics and an engineering problem involving fluids, heat and transport in an energy generation plant. The predictive capacity of the methodology is also tested using cross-validation.
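The paper's formulation is multidimensional and set in a Sobolev space; as a minimal one-dimensional sketch of the underlying idea only — least-squares fitting of scattered data with a piecewise-linear (hat-function) Galerkin basis, which leads to a small banded linear system — one might write something like the following (the function name and the fixed node grid are illustrative, not taken from the paper):

```python
import numpy as np

def fit_piecewise_linear(x, y, nodes):
    """Least-squares fit of scattered data (x, y) by a piecewise-linear
    function on a fixed grid of nodes, using a hat-function basis.
    Returns the nodal values of the fitted curve."""
    n = len(nodes)
    A = np.zeros((n, n))   # normal-equations (Gram) matrix, tridiagonal
    b = np.zeros(n)
    # locate the element containing each sample and evaluate both hats
    idx = np.clip(np.searchsorted(nodes, x) - 1, 0, n - 2)
    h = nodes[idx + 1] - nodes[idx]
    t = (x - nodes[idx]) / h          # local coordinate in [0, 1]
    phi = np.stack([1 - t, t])        # the two basis values per sample
    for a_loc, i in [(0, idx), (1, idx + 1)]:
        np.add.at(b, i, phi[a_loc] * y)
        for b_loc, j in [(0, idx), (1, idx + 1)]:
            np.add.at(A, (i, j), phi[a_loc] * phi[b_loc])
    # requires data in every element so that A is nonsingular
    return np.linalg.solve(A, b)
```

Because the Gram matrix couples only neighbouring nodes, the system is tridiagonal here (banded in higher dimensions), which is what makes a fast solver possible.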

Relevance:

100.00%

Publisher:

Abstract:

The use of microprocessor-based systems is gaining importance in application domains where safety is a must. For this reason, there is growing concern about the mitigation of SEU (single-event upset) and SET (single-event transient) effects. This paper presents a new hybrid technique aimed at protecting both the data and the control flow of embedded applications running on microprocessors. On one hand, the approach relies on software redundancy techniques to correct errors produced in the data. On the other hand, control-flow errors are detected by reusing the on-chip debug interface present in most modern microprocessors. Experimental results show an important increase in system reliability, exceeding two orders of magnitude in the mitigation of both SEUs and SETs. Furthermore, the overheads incurred by the technique are perfectly affordable for low-cost systems.
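The abstract describes the data-hardening scheme only at a high level. A generic illustration of the software-redundancy idea it builds on — keeping several copies of a datum and correcting a corrupted copy by majority vote (triple modular redundancy, a standard technique, not necessarily the authors' exact scheme) — could look like this:

```python
def tmr_store(value):
    """Keep three copies of a value (triple modular redundancy)."""
    return [value, value, value]

def tmr_read(copies):
    """Majority vote over the three copies.
    Corrects any single corrupted copy; a double fault is detected
    as uncorrectable."""
    a, b, c = copies
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("uncorrectable: all three copies disagree")
```

A single SEU that flips one copy is masked by the vote; the cost is the tripled storage and the comparison on every read, which is the kind of overhead the paper weighs against reliability.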

Relevance:

100.00%

Publisher:

Abstract:

Phase equilibrium data regression is an unavoidable task in obtaining appropriate parameter values for any model used in separation equipment design, chemical process simulation and optimization. The accuracy of this process depends on several factors, such as the quality of the experimental data, the selected model and the calculation algorithm. This paper summarizes the results and conclusions of our research on the capabilities and limitations of existing GE (excess Gibbs energy) models, and on strategies that can be included in correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper presents this work as a whole in order to show the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
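The modified NRTL equation used in the paper is not reproduced in the abstract. For reference, the standard binary-mixture NRTL activity-coefficient expressions — the baseline the modification starts from — can be coded as follows (variable names and the default non-randomness parameter α = 0.3 are the conventional choices, not values from the paper):

```python
import math

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Activity coefficients (gamma1, gamma2) of a binary mixture from
    the standard NRTL local-composition model.
    tau12, tau21: dimensionless interaction parameters;
    alpha: non-randomness parameter (commonly 0.2-0.47)."""
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return math.exp(ln_g1), math.exp(ln_g2)
```

With both τ parameters zero the model reduces to an ideal mixture (γ = 1), and each γ tends to 1 as its component approaches purity — two limits that are useful sanity checks in any regression code.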

Relevance:

100.00%

Publisher:

Abstract:

Subsidence is a natural hazard that affects wide areas of the world, causing significant economic losses every year. This phenomenon has occurred in the metropolitan area of Murcia City (SE Spain) as a result of groundwater overexploitation. In this work, aquifer-system subsidence is investigated using an advanced differential SAR interferometry (A-DInSAR) remote sensing technique called Stable Point Network (SPN). The SPN-derived displacement results, mainly the velocity displacement maps and the displacement time series, reveal that in the period 2004–2008 the rate of subsidence in the Murcia metropolitan area doubled with respect to the previous period from 1995 to 2005. The acceleration of the deformation is explained by the drought period that started in 2006. Comparison of the temporal evolution of the displacements measured with extensometers and with the SPN technique shows an average absolute error of 3.9±3.8 mm. Finally, results from a finite element model, developed to simulate the recorded subsidence time history from known water-table height changes, compare well with the SPN displacement time-series estimates. This demonstrates the potential of A-DInSAR techniques for validating subsidence prediction models, as an alternative to instrumental ground-based techniques.

Relevance:

100.00%

Publisher:

Abstract:

The treatment of factual data has been widely studied in different areas of Natural Language Processing (NLP). However, processing subjective information still poses important challenges. This paper presents research aimed at assessing techniques that have been suggested as appropriate for a subjective task: Opinion Question Answering (OQA). We evaluate the performance of an OQA system with these new components and propose methods to tackle the issues encountered. We assess the impact of including additional resources and processes, with the purpose of improving system performance, on two distinct blog datasets. The improvements obtained for the different combinations of tools are statistically significant. We thus conclude that the proposed approach is adequate for the OQA task, offering a good strategy for dealing with opinionated questions.

Relevance:

100.00%

Publisher:

Abstract:

Science has developed from rational-empirical methods, with the consequence that existing phenomena are represented without their root causes being understood. The question currently at issue is the sense of being; put simply, one can say that while dogmatic religion leads to misinterpretations, the empirical sciences contain exact rational representations of phenomena. Science has thus been able to free itself from dogmatic religion. The project for the sciences of being seeks to restore to reality its essential foundations; under the plan of the theory of systems, this necessarily involves a search for the meaning of Reality.

Relevance:

100.00%

Publisher:

Abstract:

Model Hamiltonians have been, and still are, a valuable tool for investigating the electronic structure of systems for which mean field theories work poorly. This review will concentrate on the application of Pariser–Parr–Pople (PPP) and Hubbard Hamiltonians to investigate some relevant properties of polycyclic aromatic hydrocarbons (PAH) and graphene. When presenting these two Hamiltonians we will resort to second quantisation which, although not the formalism chosen in the original proposal of the former, is much clearer. We will not attempt to be comprehensive; rather, our objective is to provide the reader with information on what kinds of problems they will encounter and what tools they will need to solve them. One of the key issues concerning model Hamiltonians that will be treated in detail is the choice of model parameters. Although model Hamiltonians reduce the complexity of the original Hamiltonian, in most cases they cannot be solved exactly. So, we shall first consider the Hartree–Fock approximation, still the only tool for handling large systems besides density functional theory (DFT) approaches. We proceed by discussing to what extent one may solve model Hamiltonians exactly, and describe the Lanczos approach. We shall describe the configuration interaction (CI) method, a common technology in quantum chemistry but one rarely used to solve model Hamiltonians. In particular, we propose a variant of the Lanczos method, inspired by CI, whose novelty is using as the seed of the Lanczos process a mean field (Hartree–Fock) determinant (the method will be named LCI). Two questions of interest related to model Hamiltonians will be discussed: (i) when long-range interactions are included, how crucial is it to include in the Hamiltonian the electronic charge that compensates the ion charges? (ii) Is it possible to reduce a Hamiltonian incorporating Coulomb interactions (PPP) to an 'effective' Hamiltonian including only on-site interactions (Hubbard)?
The performance of CI will be checked on small molecules. The electronic structure of azulene and fused azulene will be used to illustrate several aspects of the method. As regards graphene, several questions will be considered: (i) paramagnetic versus antiferromagnetic solutions, (ii) forbidden gap versus dot size, (iii) graphene nano-ribbons, and (iv) optical properties.
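The LCI variant mentioned above seeds the Lanczos recursion with a Hartree–Fock determinant. As a generic sketch of the plain Lanczos step itself — applied here to an arbitrary real symmetric matrix, with an arbitrary seed vector standing in for the HF determinant, and with full reorthogonalization added for numerical robustness — one might write:

```python
import numpy as np

def lanczos_ground_state(H, v0, k=50):
    """Approximate the lowest eigenvalue of a real symmetric matrix H
    using up to k steps of the Lanczos recursion seeded with v0."""
    Q = [v0 / np.linalg.norm(v0)]     # orthonormal Krylov basis
    alphas, betas = [], []
    for _ in range(min(k, H.shape[0])):
        w = H @ Q[-1]
        alphas.append(Q[-1] @ w)
        # textbook Lanczos orthogonalizes only against the last two
        # vectors; reorthogonalizing against the whole basis avoids
        # the well-known loss of orthogonality in floating point
        for qi in Q:
            w -= (qi @ w) * qi
        beta = np.linalg.norm(w)
        if beta < 1e-12:              # invariant subspace reached
            break
        betas.append(beta)
        Q.append(w / beta)
    # lowest Ritz value = smallest eigenvalue of the tridiagonal matrix
    T = np.diag(alphas)
    if betas:
        m = len(alphas) - 1
        T += np.diag(betas[:m], 1) + np.diag(betas[:m], -1)
    return np.linalg.eigvalsh(T)[0]
```

The point of the recursion is that only matrix-vector products H @ q are needed, so H never has to be stored densely; for a Hubbard or PPP Hamiltonian that product can be applied directly in the occupation-number basis.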

Relevance:

100.00%

Publisher:

Abstract:

Despite the proliferation of academic research on information systems outsourcing, few studies analyze the characteristics of outsourcing contracts. This research aims to provide an in-depth description of these characteristics of information systems outsourcing. An additional objective is to examine how they evolve over time. Finally, the study reports on the usefulness of measuring such characteristics over time to assess the maturity level of information systems outsourcing. The data were gathered from the responses of the information systems managers of the largest Spanish firms to a questionnaire. This longitudinal study covers 12 years of research and compares the authors' previous results with those of the present study.