812 results for 519 Probability and applied mathematics


Relevance: 100.00%

Abstract:

The microbiota of multi-pond solar salterns around the world has been analyzed using a variety of culture-dependent and molecular techniques. However, studies addressing the dynamic nature of these systems are very scarce. Here we have characterized the temporal variation, over one year, of the microbiota of five ponds with increasing salinity (from 18% to >40%) by means of CARD-FISH and DGGE. Microbial community structure was statistically correlated with several environmental parameters, including ionic composition and meteorological factors, indicating that the microbial community was dynamic, as specific phylotypes appeared only at certain times of the year. In addition to total salinity, microbial composition was strongly influenced by temperature and specific ionic composition. Remarkably, DGGE analyses unveiled the presence of most phylotypes previously detected in hypersaline systems using metagenomics and other molecular techniques, such as the very abundant Haloquadratum and Salinibacter representatives or the recently described low-GC Actinobacteria and Nanohaloarchaeota. In addition, an uncultured group of Bacteroidetes was present along the whole range of salinity. Database searches indicated a previously unrecognized widespread distribution of this phylotype. Single-cell genome analysis of five members of this group suggested a set of metabolic characteristics that could provide competitive advantages in hypersaline environments, such as polymer degradation capabilities, the presence of retinal-binding light-activated proton pumps and arsenate reduction potential. In addition, the fairly high metagenomic fragment recruitment obtained for these single cells in both the intermediate and hypersaline ponds further confirms the DGGE data and points to the generalist lifestyle of this new Bacteroidetes group.

Relevance: 100.00%

Abstract:

A new classification of microtidal sand and gravel beaches with very different morphologies is presented. Fourteen variables were measured on 557 studied transects; one variable worth highlighting is the depth of the Posidonia oceanica meadow. The classification distinguishes 9 types of beaches: Type 1: sand and gravel beaches; Type 2: sand and gravel separated beaches; Type 3: gravel and sand beaches; Type 4: gravel and sand separated beaches; Type 5: pure gravel beaches; Type 6: open sand beaches; Type 7: supported sand beaches; Type 8: bisupported sand beaches; and Type 9: enclosed beaches. Several tools were used for the classification: discriminant analysis, artificial neural networks (ANN) and Support Vector Machines (SVM), and their results were compared. As there is no theory for deciding which neural network architecture is the most suitable for a particular data set, an experimental study was performed with different numbers of neurons in the hidden layer; an architecture with 30 neurons was finally chosen. Different kernels were employed for the SVM (linear, polynomial, radial basis function and sigmoid). The results obtained with discriminant analysis were not as good as those obtained with the other two methods (ANN and SVM), which showed similar success rates.
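The four SVM kernels named above can be written down directly. The sketch below shows how each one scores the similarity of two feature vectors; the parameter values (gamma, degree, coef0) and the toy transect vectors are illustrative assumptions, since the paper does not report its settings.

```python
import math

# The four SVM kernels compared in the study, with assumed parameter values.
def linear(x, y):
    return sum(a * b for a, b in zip(x, y))

def polynomial(x, y, degree=3, gamma=1.0, coef0=1.0):
    return (gamma * linear(x, y) + coef0) ** degree

def rbf(x, y, gamma=1.0):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def sigmoid(x, y, gamma=1.0, coef0=0.0):
    return math.tanh(gamma * linear(x, y) + coef0)

# Two hypothetical beach-transect feature vectors (already scaled).
u, v = [1.0, 0.5], [0.8, 0.7]
print(round(linear(u, v), 2))  # 1.15
print(rbf(u, u))               # identical points -> 1.0
```

In practice these kernels would be passed to an SVM trainer rather than evaluated by hand; the point is that each induces a different notion of similarity between transects.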

Relevance: 100.00%

Abstract:

In order to build dynamic models for the prediction and management of degraded Mediterranean forest areas, it was necessary to develop the MARIOLA model, a computational calculation program. This model includes the following subprograms: 1) bioshrub, which calculates total, green and woody shrub biomass and establishes the time differences needed to calculate growth; 2) selego, which builds the flow equations from the experimental data, based on advanced multiple-regression statistical procedures; and 3) VEGETATION, which solves the state equations with Euler or Runge-Kutta integration methods. Each of these subprograms can run independently or linked to the others.
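A minimal sketch of the integration step used by a subprogram like VEGETATION: advancing a state equation dB/dt = f(t, B) with the classical fourth-order Runge-Kutta method. The logistic flow below is a hypothetical stand-in; in MARIOLA the flow equations are fitted from field data by selego.

```python
# Classical RK4 step for dy/dt = f(t, y).
def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Illustrative flow equation only: logistic shrub biomass growth with
# assumed rate r and carrying capacity K.
def logistic(t, B, r=0.8, K=100.0):
    return r * B * (1 - B / K)

B, t, h = 5.0, 0.0, 0.1
for _ in range(200):        # integrate 20 time units
    B = rk4_step(logistic, t, B, h)
    t += h
print(round(B, 2))          # approaches the carrying capacity K = 100
```

Swapping `rk4_step` for a single Euler update (`y + h * f(t, y)`) reproduces the other integration option the model offers, at lower accuracy per step.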

Relevance: 100.00%

Abstract:

In this paper we present a generalization of a new systemic approach to abstract fuzzy systems. Using a structure of fuzzy relations retains the information provided by the degrees of membership. In addition, to better suit the situation being modelled, it is advisable to use a T-norm or T-conorm distinct from the minimum and the maximum, respectively. This gain in generality comes from carrying out the work at a higher level of abstraction. Previously obtained results cannot always be reproduced, and sometimes different definitions, reflecting different viewpoints, are obtained. In any case, this approach proves much more effective when modelling reality.
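An illustrative sketch of the point about T-norms: composing fuzzy relations with min/max gives the classical max-min composition, while swapping in the product T-norm and probabilistic-sum T-conorm changes the resulting degrees of membership. The relation matrices below are made-up examples, not from the paper.

```python
# Two common T-norm / T-conorm pairs.
def t_min(a, b): return min(a, b)
def t_prod(a, b): return a * b
def s_max(a, b): return max(a, b)
def s_prob(a, b): return a + b - a * b

def compose(R, S, t_norm, s_conorm):
    """Sup-T composition of fuzzy relations given as membership matrices."""
    n, m, p = len(R), len(S), len(S[0])
    out = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            acc = 0.0
            for k in range(m):
                acc = s_conorm(acc, t_norm(R[i][k], S[k][j]))
            out[i][j] = acc
    return out

R = [[0.8, 0.3]]
S = [[0.5], [0.9]]
print(compose(R, S, t_min, s_max))    # max-min composition: [[0.5]]
print(compose(R, S, t_prod, s_prob))  # product / probabilistic sum: ~0.562
```

The two compositions disagree on the same inputs, which is exactly why the choice of T-norm matters for the modelled situation.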

Relevance: 100.00%

Abstract:

This paper studies the use of directories of open access repositories worldwide (DOARW) to search for Spanish repositories containing learning objects (LOs) in the field of building engineering (BE). Results show that DOARW are powerful tools, but the deficiencies indicated in this study have to be solved in order to obtain more accurate searches and to make repositories easier to find for users seeking LOs for reuse. Aiming to promote the reuse of Spanish LOs, this study presents to the academic community all existing Spanish repositories with LOs, and in particular the repositories that contain LOs in the field of BE. The paper also studies the critical mass of available content (LOs) in the field of BE in Spain, which has been found to be low.

Relevance: 100.00%

Abstract:

Mathematical models used for understanding coastal seabed morphology play a key role in beach nourishment projects, which have become the fundamental strategy for coastal maintenance in recent years. Accordingly, the accuracy of these models is vital to optimizing the costs of coastal regeneration projects, and planning such interventions requires methodologies that do not generate uncertainties in their interpretation. This paper carries out a study and comparison of mathematical simulation models of the coastline, as well as of the elements of these models that are sources of uncertainty. The equilibrium profile (EP) and the offshore limit corresponding to the depth of closure (DoC) have been analyzed over different timescale ranges, and the results compared using data sets from three different periods, identified as present, past and future. The two main factors considered in this paper are the accuracy of data collection for the beach profiles and the definition of the median grain size calculated from the collected samples: these data can carry high uncertainties and can produce a lack of accuracy in nourishment projects, generating excessive costs due to a possible excess or shortage of the sand used for nourishment. The main goal of this paper is the development of a new methodology that increases the accuracy of existing equilibrium beach profile models by improving the inputs used in such models and the fitting of the formulae used to obtain the seabed shape. This new methodology has been applied and tested on Valencia's beaches.
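A hedged sketch of the two quantities the abstract analyzes, under the common textbook formulations: Dean's equilibrium profile h(x) = A·x^(2/3), with the sediment scale parameter A derived from the median grain size, and Hallermeier's depth-of-closure formula. The paper does not state that it uses exactly these formulae, and the wave inputs and A value below are illustrative, not data from the Valencia beaches.

```python
def dean_profile_depth(x, A):
    """Depth (m) at distance x (m) offshore, Dean profile h = A * x**(2/3)."""
    return A * x ** (2.0 / 3.0)

def hallermeier_doc(Hs, Ts, g=9.81):
    """Hallermeier depth of closure from the effective significant wave
    height Hs (m, exceeded 12 h per year) and its associated period Ts (s)."""
    return 2.28 * Hs - 68.5 * Hs ** 2 / (g * Ts ** 2)

A = 0.1   # assumed order of magnitude for medium sand
print(round(dean_profile_depth(100.0, A), 2))  # ~2.15 m depth at 100 m offshore
print(round(hallermeier_doc(2.0, 8.0), 2))     # ~4.12 m depth of closure
```

Both expressions make the sensitivity argument of the abstract concrete: A depends on the measured median grain size, so errors in sampling propagate directly into the predicted profile and the computed sand volumes.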

Relevance: 100.00%

Abstract:

A suitable knowledge of the orientation and motion of the Earth in space is a common need in various fields. That knowledge has always been necessary to carry out astronomical observations but, with the advent of the space age, it became essential for observing satellites and predicting and determining their orbits, and for observing the Earth from space as well. Given the relevant role it plays in Space Geodesy, Earth rotation is considered one of the three pillars of Geodesy, the other two being geometry and gravity. Besides, research on Earth rotation has fostered advances in many fields, such as Mathematics, Astronomy and Geophysics, for centuries. One remarkable feature of the problem lies in the extreme accuracy requirements to be fulfilled in the near future: roughly speaking, about a millimetre on the tangent plane to the planet's surface. That challenges all the theories devised and used to date; the paper gives a short review of some of the most relevant methods, which can be viewed as milestones in Earth rotation research, emphasizing the Hamiltonian approach developed by the authors. Some contemporary problems are presented, as well as the main lines of future research prospected by the International Astronomical Union / International Association of Geodesy Joint Working Group on Theory of Earth Rotation, created in 2013.

Relevance: 100.00%

Abstract:

The Free Core Nutation (FCN) is a free mode of the Earth's rotation caused by the different material characteristics of the Earth's core and mantle, which make the rotational axes of those layers slightly diverge from each other, resulting in a wobble of the Earth's rotation axis comparable to the nutations. In this paper we focus on estimating empirical FCN models from the observed nutations derived from the VLBI sessions between 1993 and 2013. Assuming a fixed value for the oscillation period, the time-variable amplitudes and phases are estimated by means of multiple sliding-window analyses. The effects of using different a priori Earth Rotation Parameters (ERP) in the derivation of the models are also addressed. The optimal choice of the fundamental parameters of the model, namely the window width and the step size of its shift, is sought through a thorough experimental analysis using real data. These analyses lead to the derivation of a model with a temporal resolution higher than that of the models currently available, with the sliding window reduced to 400 days and a day-by-day shift. It is shown that this new model increases the accuracy of the modeling of the observed Earth rotation. Besides, according to our computations, empirical models determined using USNO Finals as a priori ERP present a slightly lower Weighted Root Mean Square (WRMS) of residuals than those using IERS 08 C04 over the whole period of VLBI observations. The model is also validated through comparisons with other recognized models, and the level of agreement among them is satisfactory; our estimates give rise to the lowest residuals and seem to reproduce the FCN signal in more detail.
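A minimal sketch of the estimation step described above: inside each 400-day window, the amplitude and phase of an oscillation of fixed period are obtained by a least-squares fit of a·cos + b·sin terms. The period value and the synthetic series below are assumptions standing in for the VLBI-derived celestial pole offsets.

```python
import math

P = 430.0   # assumed FCN period in days (sign convention simplified)

def fit_window(t, y):
    """Least squares for y ~ a*cos(2*pi*t/P) + b*sin(2*pi*t/P);
    returns (amplitude, phase) via the 2x2 normal equations."""
    c = [math.cos(2 * math.pi * ti / P) for ti in t]
    s = [math.sin(2 * math.pi * ti / P) for ti in t]
    Scc = sum(x * x for x in c)
    Sss = sum(x * x for x in s)
    Scs = sum(x * z for x, z in zip(c, s))
    Scy = sum(x * z for x, z in zip(c, y))
    Ssy = sum(x * z for x, z in zip(s, y))
    det = Scc * Sss - Scs * Scs
    a = (Scy * Sss - Ssy * Scs) / det
    b = (Ssy * Scc - Scy * Scs) / det
    return math.hypot(a, b), math.atan2(-b, a)

# Synthetic noiseless signal with amplitude 0.2 and phase 0.5, one 400-day window.
t = list(range(400))
y = [0.2 * math.cos(2 * math.pi * ti / P + 0.5) for ti in t]
amp, phase = fit_window(t, y)
print(round(amp, 3), round(phase, 3))
```

Repeating this fit while sliding the window forward one day at a time yields the day-by-day amplitude and phase series the model is built from.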

Relevance: 100.00%

Abstract:

It was seen previously that the diversity indices, IT and Shannon's, make it possible to characterize globally, with a single number, fundamental aspects of the structure of a text. However, a more precise knowledge of this structure requires specific abundance distributions and, to represent them, a suitable mathematical model. Among the numerous models that could be proposed, the only ones of real practical interest are the simplest. We limit ourselves to applying three of them to the language L(MT): the log-linear, the log-normal and MacArthur's models, widely used for calculating the diversity of species in ecosystems and used here, we believe for the first time, to calculate the diversity of a text written in a given language, in our case L(MT). We show the advantages and drawbacks of each of these types of model, the methods that allow them to be fitted to text data and, briefly, the tests that decide whether the fit is acceptable.
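A sketch of the Shannon diversity index mentioned above, applied to a hypothetical abundance distribution (word counts standing in for species abundances in a text of L(MT)).

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum p_i * ln(p_i) over relative abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

counts = [50, 25, 15, 10]   # illustrative abundances
print(round(shannon_index(counts), 3))
# A perfectly even distribution maximizes H' at ln(S) for S categories:
print(round(shannon_index([10, 10, 10, 10]), 3))  # equals ln(4) ~ 1.386
```

The fitted models (log-linear, log-normal, MacArthur) then describe how such abundances fall off across ranks, rather than summarizing them in one number.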

Relevance: 100.00%

Abstract:

Numerical modelling methodologies are important for their application to engineering and scientific problems, because there are processes for which no analytical mathematical expression can be obtained as a model. When the only available information is a set of experimental values of the variables that determine the state of the system, the modelling problem is equivalent to determining the hyper-surface that best fits the data. This paper presents a methodology, based on the Galerkin formulation of the finite element method, to obtain representations of relationships defined a priori between a set of variables: y = z(x1, x2, ..., xd). These representations are generated from the values of the variables in the experimental data. The piecewise approximation is an element of a Sobolev space and has derivatives defined, in a generalized sense, in this space. Using this approach requires inverting a linear system whose structure allows a fast solver algorithm, and the algorithm can be used in a variety of fields, making it a multidisciplinary tool. The validity of the methodology is studied on two real applications: a problem in hydrodynamics and an engineering problem involving fluids, heat and transport in an energy generation plant. A test of the predictive capacity of the methodology is also performed using a cross-validation method.
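A one-dimensional sketch of the core idea, on made-up data: represent y = z(x) as a piecewise-linear finite element expansion z_h(x) = sum_j c_j·phi_j(x) over hat functions phi_j and determine the coefficients by least squares from scattered experimental points. The normal equations here are tridiagonal, the kind of banded structure that admits a fast solver (a naive dense elimination is used below for brevity).

```python
def hat(j, x, nodes):
    """Piecewise-linear ("hat") basis function centred at nodes[j]."""
    if j > 0 and nodes[j - 1] < x <= nodes[j]:
        return (x - nodes[j - 1]) / (nodes[j] - nodes[j - 1])
    if j < len(nodes) - 1 and nodes[j] <= x < nodes[j + 1]:
        return (nodes[j + 1] - x) / (nodes[j + 1] - nodes[j])
    return 0.0

def fit(xs, ys, nodes):
    """Least-squares coefficients c solving the normal equations (B^T B) c = B^T y."""
    n = len(nodes)
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for x, y in zip(xs, ys):
        phis = [hat(j, x, nodes) for j in range(n)]
        for j in range(n):
            if phis[j] == 0.0:
                continue
            b[j] += phis[j] * y
            for k in range(n):
                A[j][k] += phis[j] * phis[k]
    # Gaussian elimination with partial pivoting (dense, for clarity only).
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for k in range(i, n):
                A[r][k] -= f * A[i][k]
            b[r] -= f * b[i]
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = (b[i] - sum(A[i][k] * c[k] for k in range(i + 1, n))) / A[i][i]
    return c

nodes = [0.0, 0.25, 0.5, 0.75, 1.0]
xs = [i / 100.0 for i in range(101)]
ys = [x * x for x in xs]        # hypothetical "experimental" data from z(x) = x^2

c = fit(xs, ys, nodes)

def z_h(x):
    return sum(cj * hat(j, x, nodes) for j, cj in enumerate(c))

print(round(z_h(0.5), 3))       # close to the true value 0.25
```

The same assembly generalizes to y = z(x1, ..., xd) with multilinear elements; only the basis functions and the band structure of the system change.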

Relevance: 100.00%

Abstract:

This university teaching research network has focused its objectives on the coordination of cross-curricular activities in the basic module of the degrees in Biology and in Marine Sciences. In addition, among the network's main interests has been the evaluation of, and proposals for improving, the planning of activities with a non-classroom workload. The successive meetings included the participation of the student delegates of the different groups, and their contributions made it possible to incorporate the students' views. A first analysis was thus established of the problems to be addressed in the coming years in order to improve transversal and vertical coordination in both degrees, and solutions were sought for the difficulties that arose over the semester. An improvement in the distribution of the weekly workload is also proposed, to avoid weeks with a heavy load of non-classroom hours. Finally, the cross-curricular competences have been coordinated across the subjects that share these objectives.

Relevance: 100.00%

Abstract:

This bibliographic article offers an overview of MonTI (Monografías de Traducción e Interpretación), an annual academic journal founded in 2009 and published jointly by the three public universities that offer a degree in Translation and Interpreting in the Autonomous Community of Valencia.

Relevance: 100.00%

Abstract:

An investigation was carried out into the use of computers and of the learning objects employed by students in a Technical Architecture course at the University of Alicante. To this end, an instrument was created that analyses the perceived usefulness of the learning objects in the acquisition of competences, together with students' attitudes towards computer use. The analyses performed indicate that the measurement instrument developed is reliable and valid. The content validity of the instrument was analysed through expert judgement (overall validity of the questionnaire = .912, p-value = .000). Construct validity was studied by analysing its internal structure, submitting the items of the final version of the questionnaire to a factor analysis (four factors were identified which together explained 45.65% of the variance). The reliability of the instrument was analysed by computing its internal consistency using Cronbach's alpha coefficient (α for the total scale = .90).
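An illustrative computation of the reliability coefficient reported above, Cronbach's alpha: α = k/(k−1)·(1 − Σ var(item)/var(total)). The item scores below are made up, not the actual questionnaire data.

```python
def cronbach_alpha(items):
    """items: one inner list of scores per item, all of equal length."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Hypothetical scores: 3 items answered by 5 respondents.
items = [[3, 4, 5, 2, 4], [3, 5, 4, 2, 4], [2, 4, 5, 1, 5]]
print(round(cronbach_alpha(items), 2))  # 0.92
```

Values near or above .90, as in the study, indicate high internal consistency of the scale.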

Relevance: 100.00%

Abstract:

The support vector machine classifier is used in several problems in various areas of knowledge. Basically, the method used by this classifier is to find the hyperplane that maximizes the distance between the groups, in order to increase the generalization of the classifier. In this work we treated some problems of binary classification of data obtained by electroencephalography (EEG) and electromyography (EMG) using Support Vector Machines together with some complementary techniques, such as Principal Component Analysis to identify the active regions of the brain, the periodogram method, obtained by Fourier analysis, to help discriminate between groups, and the Simple Moving Average to eliminate some of the noise present in the data. Two functions were developed in the R software to carry out the training and classification tasks. Two weighting systems and a summary measure were also proposed to support the group classification decision. The application of these techniques, weights and the summary measure to the classifier showed quite satisfactory results: the best were an average rate of 95.31% for visual stimuli data, 100% correct classification for epilepsy data, and rates of 91.22% and 96.89% for object motion data for two subjects.
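A sketch (in Python rather than the R used in the work) of the simple moving average named above as a denoising step for an EEG/EMG channel. The signal and window length are illustrative assumptions.

```python
def moving_average(signal, window):
    """Simple moving average: each output is the mean of `window` consecutive
    samples, so the smoothed series is window-1 samples shorter."""
    out = []
    for i in range(len(signal) - window + 1):
        out.append(sum(signal[i:i + window]) / window)
    return out

# Hypothetical noisy channel; a 3-sample window damps the alternation.
noisy = [1, 9, 2, 8, 3, 7, 4, 6, 5]
print(moving_average(noisy, 3))  # smoothed series, 2 samples shorter
```

In the pipeline described above, the smoothed channels would then feed PCA and the periodogram features before SVM training.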