19 results for data gathering algorithm

at Universidad de Alicante


Relevance:

90.00%

Publisher:

Abstract:

Paper presented at the XVI Jornadas de Ingeniería del Software y Bases de Datos (JISBD 2011), A Coruña, 5-7 September 2011.

Relevance:

80.00%

Publisher:

Abstract:

This paper details a descriptive study of digital competences among students of the Early Childhood Education teaching degree at the Universidad de Alicante. The objective of the study was to determine these future teachers' digital competences in terms of their use and command of ICT. Data were gathered with the questionnaire validated by Guzmán (2008), Utilización de las Tecnologías de la Información y la Comunicación [TIC] en estudiantes universitarios. The results show a good level of use and command of ICT tools, as well as a good level of digital competence.

Relevance:

80.00%

Publisher:

Abstract:

The study of scientific disciplines becomes more attractive when accompanied by practical activities. This paper proposes a workshop intended to introduce students to the study of microfossils and palaeoenvironmental reconstructions, applied to one of the most significant events in the Mediterranean area: the desiccation and subsequent reflooding of the entire basin approximately five million years ago. The workshop consists of three sessions: a theoretical session introducing the content needed for the activity; a practical session devoted to data collection; and a final session devoted to interpreting the environmental changes and presenting the results as a scientific article, followed by a classroom debate. All the data needed to carry out the activity are provided in this article. In addition, a series of easily accessible bibliographic and audiovisual resources is proposed for introducing the theoretical concepts.

Relevance:

80.00%

Publisher:

Abstract:

The study of scientific disciplines becomes more attractive when accompanied by practical activities. This paper proposes a workshop intended to introduce students to the scientific work of geologists and palaeontologists through the palaeoenvironmental and biostratigraphic information provided by microfossils, applied to the Messinian Salinity Crisis. This period is regarded as one of the most significant events in the geological history of the Mediterranean and is characterized by a massive accumulation of evaporites on the basin floor, associated with the desiccation and subsequent reflooding of the Mediterranean approximately five million years ago. The workshop consists of three sessions: a theoretical session introducing the content needed for the activity, for which a series of freely available online bibliographic and audiovisual resources is proposed; a practical session devoted to data collection; and a final session devoted to interpreting the palaeoenvironmental changes, in which the results are presented as a scientific article and then debated in the classroom. All the data needed to carry out the activity are provided in this article, although the proposal remains open to any modifications and improvements teachers may consider appropriate. As a worked example, the workshop has been incorporated into the syllabus of the subject Biología y Geología (4º ESO). Trials of the workshop show that it is well suited to group work in the classroom, allowing students to feel involved in every phase of a scientific investigation.

Relevance:

80.00%

Publisher:

Abstract:

Different types of land use are usually present in the areas adjacent to many shallow karst cavities. Over time, the increasing amount of potentially harmful matter and energy, of mainly anthropic origin or influence, that reaches the interior of a shallow karst cavity can modify the hypogeal ecosystem and increase the risk of damage to the Palaeolithic rock art often preserved within the cavity. This study proposes a new Protected Area status based on the geological processes that control these matter and energy fluxes into the Altamira cave karst system. Analysis of the geological characteristics of the shallow karst system shows that direct and lateral infiltration, internal water circulation, ventilation, gas exchange and transmission of vibrations are the processes that control these matter and energy fluxes into the cave. This study applies a comprehensive methodological approach based on Geographic Information Systems (GIS) to establish the area of influence of each transfer process. The stratigraphic and structural characteristics of the interior of the cave were determined using 3D Laser Scanning topography combined with classical field work, data gathering, cartography and a porosity–permeability analysis of host rock samples. As a result, it was possible to determine the hydrogeological behavior of the cave. In addition, by mapping and modeling the surface parameters it was possible to identify the main features restricting hydrological behavior and hence direct and lateral infiltration into the cave. These surface parameters included the shape of the drainage network and a geomorphological and structural characterization via digital terrain models. Geological and geomorphological maps and models integrated into the GIS environment defined the areas involved in gas exchange and ventilation processes. Likewise, areas that could potentially transmit vibrations directly into the cave were identified. 
This study shows that it is possible to define a Protected Area by quantifying the area of influence related to each transfer process. The combined maximum area of influence of all the processes will result in the new Protected Area. This area will thus encompass all the processes that account for most of the matter and energy carried into the cave and will fulfill the criteria used to define the Protected Area. This methodology is based on the spatial quantification of processes and entities of geological origin and can therefore be applied to any shallow karst system that requires protection.
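
The combined maximum area of influence described above reduces to a spatial union of the per-process footprints. A minimal sketch in Python, assuming each process has already been rasterized to a set of grid cells; the footprints below are invented for illustration:

```python
# Hypothetical influence footprints (grid-cell coordinates) for three of
# the transfer processes named in the abstract; in the study these would
# come from rasterized GIS polygon layers.
infiltration = {(0, 0), (0, 1), (1, 0), (1, 1)}
ventilation = {(1, 1), (1, 2), (2, 2)}
vibration = {(2, 2), (3, 3)}

# The Protected Area is the combined maximum area of influence:
# the union of every per-process footprint.
protected_area = infiltration | ventilation | vibration
```

Any real implementation would operate on polygon geometries rather than cells, but the defining property is the same: every individual area of influence is contained in the resulting Protected Area.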

Relevance:

40.00%

Publisher:

Abstract:

Numerical modelling methodologies are important in engineering and scientific applications because many processes cannot be modelled by closed-form analytical expressions. When the only available information is a set of experimental values of the variables that determine the state of the system, the modelling problem is equivalent to determining the hyper-surface that best fits the data. This paper presents a methodology based on the Galerkin formulation of the finite element method to obtain representations of relationships, defined a priori, between a set of variables: y = z(x1, x2, ..., xd). These representations are generated from the values of the variables in the experimental data. The piecewise approximation is an element of a Sobolev space and has derivatives defined in a generalized sense in this space. This approach requires inverting a linear system whose structure admits a fast solver algorithm, making the method a multidisciplinary tool applicable in a variety of fields. The validity of the methodology is assessed on two real applications: a problem in hydrodynamics and an engineering problem involving fluids, heat and transport in an energy generation plant. The predictive capacity of the methodology is also tested by cross-validation.
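
As a rough illustration of the approach (not the paper's implementation), the one-dimensional case y = z(x) can be fitted with piecewise-linear finite elements by least squares; the overlap of neighbouring hat functions makes the normal-equation matrix banded, which is the structure that admits a fast solver. The mesh size, the synthetic data and the target function below are invented:

```python
import numpy as np

nodes = np.linspace(0.0, 1.0, 6)                  # uniform 1D mesh
x = np.random.default_rng(0).uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)                         # stand-in "experimental" data

def hat(i, t):
    """Piecewise-linear (hat) basis function of node i evaluated at t."""
    h = nodes[1] - nodes[0]
    return np.clip(1.0 - np.abs(t - nodes[i]) / h, 0.0, None)

# Assemble the normal equations A c = b; only neighbouring hats overlap,
# so A is tridiagonal (banded), admitting a fast solver.
n = len(nodes)
Phi = np.stack([hat(i, x) for i in range(n)], axis=1)   # 200 x n design matrix
A = Phi.T @ Phi
b = Phi.T @ y
coeffs = np.linalg.solve(A, b)

def z(t):
    """Evaluate the fitted piecewise-linear approximation at t."""
    return sum(coeffs[i] * hat(i, t) for i in range(n))
```

A dense solver is used here for brevity; exploiting the tridiagonal structure (e.g. the Thomas algorithm) is what makes the method fast at scale.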

Relevance:

30.00%

Publisher:

Abstract:

Phase equilibrium data regression is an unavoidable task, necessary to obtain appropriate parameter values for any model used in separation equipment design for chemical process simulation and optimization. The accuracy of this process depends on several factors, such as the quality of the experimental data, the selected model and the calculation algorithm. The present paper summarizes the results and conclusions of our research on the capabilities and limitations of existing GE models and on strategies that can be included in correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed-phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper presents this work as a whole in order to reveal the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
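
For reference, the standard NRTL activity-coefficient expressions for a binary mixture can be sketched as follows; the parameter values are made up for illustration and are not fitted to any data set discussed in the paper:

```python
import math

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """Activity coefficients (gamma1, gamma2) of a binary mixture (NRTL)."""
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Illustrative evaluation at an equimolar-ish composition.
g1, g2 = nrtl_gamma(0.4, tau12=1.5, tau21=0.8)
```

At infinite dilution this reduces to ln(gamma1_inf) = tau21 + tau12 * exp(-alpha * tau12), a convenient check on any implementation.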

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to develop a method for hiding information inside a binary image. An algorithm is proposed to embed data in scanned text or figures, based on the detection of suitable pixels that satisfy certain conditions so that the embedding remains undetected. In broad terms, the algorithm locates pixels placed at the contours of the figures or in areas where some scattering of the two colors can be found. The hidden information is independent of the values of the pixels in which it is embedded; notice that, depending on the sequence of bits to be hidden, around half of the pixels used to store data bits will not be modified. The other basic characteristic of the proposed scheme is that the modified bits must be taken into account in the recovery process, which consists of reading the sequence of bits placed in the proper positions. An application to the banking sector is proposed for hiding information in signatures.
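
A toy sketch of the embedding idea follows; it is not the paper's exact scheme. The contour rule and the assumption that sender and receiver share the list of used positions are simplifications introduced here for illustration:

```python
def contour_positions(img):
    """Pixels with at least one 4-neighbour of the opposite colour."""
    h, w = len(img), len(img[0])
    pos = []
    for r in range(h):
        for c in range(w):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w and img[rr][cc] != img[r][c]:
                    pos.append((r, c))
                    break
    return pos

def embed(img, bits):
    """Write each bit into a contour pixel; pixels that already match stay unchanged."""
    pos = contour_positions(img)
    out = [row[:] for row in img]
    for (r, c), b in zip(pos, bits):
        out[r][c] = b
    return out, pos[:len(bits)]

def extract(img, pos):
    """Read the hidden bits back from the agreed positions."""
    return [img[r][c] for r, c in pos]

image = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
payload = [1, 0, 1, 1]
stego, used = embed(image, payload)
```

Because the payload bits are independent of the cover pixels, on average about half of the used pixels already hold the right value and need no modification, which matches the property highlighted in the abstract.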

Relevance:

30.00%

Publisher:

Abstract:

The thermodynamic consistency of almost 90 VLE data series, including isothermal and isobaric conditions for systems of both total and partial miscibility in the liquid phase, has been examined by means of the area and point-to-point tests. In addition, the Gibbs energy of mixing function calculated from these experimental data has been inspected, with some rather surprising results: certain data sets exhibiting high dispersion, or leading to Gibbs energy of mixing curves inconsistent with the total or partial miscibility of the liquid phase, nevertheless pass the tests. Several possible inconsistencies in the tests themselves, or in their application, are discussed. Related to this is a very interesting and ambitious initiative that arose within NIST: the development of an algorithm to assess the quality of experimental VLE data. The present paper questions the applicability of two of the five tests combined in that algorithm. It further shows that the deviation of the experimental VLE data from the correlation obtained with a given model, the basis of some point-to-point tests, should not be used to evaluate the quality of those data.
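
The area test referred to above can be sketched numerically: for consistent isothermal data the integral of ln(gamma1/gamma2) over x1 should vanish. The "data" below are generated from a symmetric two-suffix Margules model, so they pass by construction:

```python
# Two-suffix Margules model (illustrative parameter, not from the paper):
#   ln g1 = A * x2**2,  ln g2 = A * x1**2  ->  ln(g1/g2) = A * (1 - 2*x1)
A = 0.9
x1s = [i / 100 for i in range(101)]
lnratio = [A * ((1 - x) ** 2 - x ** 2) for x in x1s]

# Trapezoidal integration of ln(gamma1/gamma2) dx1 over the full range;
# a consistent data set gives a net area close to zero.
area = sum((lnratio[i] + lnratio[i + 1]) / 2 * (x1s[i + 1] - x1s[i])
           for i in range(len(x1s) - 1))
```

As the abstract argues, passing such a test is a necessary but not sufficient condition: noisy or physically implausible data can still yield a near-zero net area.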

Relevance:

30.00%

Publisher:

Abstract:

We propose and discuss a new centrality index for urban street patterns represented as networks in geographical space. This centrality measure, which we call ranking-betweenness centrality, combines the idea behind the random-walk betweenness centrality measure with a ranking of the nodes produced by an adapted PageRank algorithm. We first apply a PageRank algorithm adapted so that information about the network under analysis can be transformed into numerical values; these values, summarizing the information, are associated with each node by means of a data matrix. Running the adapted PageRank algorithm yields a ranking of the nodes according to their importance in the network. This classification is the starting point for an algorithm based on random-walk betweenness centrality. A detailed example of a real urban street network is discussed to illustrate how the proposed ranking-betweenness centrality is evaluated, including comparisons with other classical centrality measures.
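
The first stage of the method, a PageRank adapted so that node-level data bias the teleportation vector, can be sketched as follows; the street graph and the per-node data values are invented for illustration:

```python
# Toy street graph (adjacency lists) and a hypothetical numeric attribute
# per intersection, standing in for the data matrix of the abstract.
edges = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
data = {0: 1.0, 1: 2.0, 2: 4.0, 3: 1.0}

# Data-driven teleportation: jumps land on nodes in proportion to their data.
total = sum(data.values())
teleport = {v: data[v] / total for v in edges}

d = 0.85                                   # damping factor
rank = {v: 1.0 / len(edges) for v in edges}
for _ in range(100):
    new = {}
    for v in edges:
        inflow = sum(rank[u] / len(edges[u]) for u in edges if v in edges[u])
        new[v] = (1 - d) * teleport[v] + d * inflow
    rank = new

# Nodes ordered by importance in the network.
ranking = sorted(rank, key=rank.get, reverse=True)
```

The resulting ranking would then seed the random-walk-betweenness stage of the method, which is not reproduced here.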

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To calculate theoretically the errors in the estimation of corneal power when using the keratometric index (nk) in eyes that underwent laser refractive surgery for the correction of myopia and to define and validate clinically an algorithm for minimizing such errors. Methods: Differences between corneal power estimation by using the classical nk and by using the Gaussian equation in eyes that underwent laser myopic refractive surgery were simulated and evaluated theoretically. Additionally, an adjusted keratometric index (nkadj) model dependent on r1c was developed for minimizing these differences. The model was validated clinically by retrospectively using the data from 32 myopic eyes [range, −1.00 to −6.00 diopters (D)] that had undergone laser in situ keratomileusis using a solid-state laser platform. The agreement between Gaussian (PGaussc) and adjusted keratometric (Pkadj) corneal powers in such eyes was evaluated. Results: It was found that overestimations of corneal power up to 3.5 D were possible for nk = 1.3375 according to our simulations. The nk value to avoid the keratometric error ranged between 1.2984 and 1.3297. The following nkadj models were obtained: nkadj = −0.0064286·r1c + 1.37688 (Gullstrand eye model) and nkadj = −0.0063804·r1c + 1.37806 (Le Grand eye model). The mean difference between Pkadj and PGaussc was 0.00 D, with limits of agreement of −0.45 and +0.46 D. This difference correlated significantly with the posterior corneal radius (r = −0.94, P < 0.01). Conclusions: The use of a single nk for estimating the corneal power in eyes that underwent a laser myopic refractive surgery can lead to significant errors. These errors can be minimized by using a variable nk dependent on r1c.
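
A worked sketch of the quantities involved, using the Gullstrand-model nkadj reported in the abstract; the radius value is only an example, and corneal power is computed with the single-surface formula P = (n − 1)/r:

```python
def keratometric_power(r1c_mm, nk=1.3375):
    """Corneal power in diopters from the anterior corneal radius (mm)."""
    return (nk - 1.0) / (r1c_mm / 1000.0)

def adjusted_index_gullstrand(r1c_mm):
    """Adjusted keratometric index from the abstract (Gullstrand eye model)."""
    return -0.0064286 * r1c_mm + 1.37688

# Example: a flatter-than-average anterior radius, as is typical after
# myopic laser refractive surgery (illustrative value, not from the study).
r1c = 8.5
p_classic = keratometric_power(r1c)                                  # nk = 1.3375
p_adjusted = keratometric_power(r1c, nk=adjusted_index_gullstrand(r1c))
```

The adjusted index yields a lower power than the classical nk = 1.3375, consistent with the overestimation the paper reports for post-surgical eyes.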

Relevance:

30.00%

Publisher:

Abstract:

Thermodynamics Conference 2013 (Statistical Mechanics and Thermodynamics Group of the Royal Society of Chemistry), The University of Manchester, 3-6 September 2013.

Relevance:

30.00%

Publisher:

Abstract:

Moderate resolution remote sensing data, as provided by MODIS, can be used to detect and map active or past wildfires from daily records of suitable combinations of reflectance bands. The objective of the present work was to develop and test simple algorithms and variations for automatic or semiautomatic detection of burnt areas from time series data of MODIS biweekly vegetation indices for a Mediterranean region. MODIS-derived NDVI 250 m time series data for the Valencia region, East Spain, were subjected to a two-step process for the detection of candidate burnt areas, and the results compared with available fire event records from the Valencia Regional Government. For each pixel and date in the data series, a model was fitted to both the previous and posterior time series data. Combining drops between two consecutive points and 1-year average drops, we used discrepancies or jumps between the pre and post models to identify seed pixels, and then delimited the fire scar of each potential wildfire using an extension algorithm from the seed pixels. The resulting maps of the detected burnt areas showed a very good agreement with the perimeters registered in the database of fire records used as reference. Overall accuracies and indices of agreement were very high, and omission and commission errors were similar to or lower than in previous studies that used automatic or semiautomatic fire scar detection based on remote sensing. This supports the effectiveness of the method for detecting and mapping burnt areas in the Mediterranean region.
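
The seed-detection logic can be sketched for a single pixel's NDVI series; the thresholds, the series values and the number of composites per year below are invented and are not the study's calibrated settings:

```python
# Synthetic biweekly NDVI series with an abrupt fire-like drop at index 12.
ndvi = [0.62, 0.60, 0.63, 0.61, 0.64, 0.62, 0.63, 0.61,
        0.60, 0.62, 0.63, 0.61, 0.25, 0.28, 0.30, 0.33]

DROP_STEP = 0.2   # minimum drop between two consecutive composites
DROP_YEAR = 0.2   # minimum drop below the preceding-year mean
PER_YEAR = 12     # composites used for the running pre-fire average

# Flag a date as a candidate (seed) when both conditions hold: a sharp drop
# from the previous composite AND a value well below the previous-year mean.
seeds = []
for t in range(PER_YEAR, len(ndvi)):
    step_drop = ndvi[t - 1] - ndvi[t]
    year_mean = sum(ndvi[t - PER_YEAR:t]) / PER_YEAR
    if step_drop >= DROP_STEP and year_mean - ndvi[t] >= DROP_YEAR:
        seeds.append(t)
```

In the full method, each seed pixel would then be grown into a fire scar by the spatial extension algorithm, which this per-pixel sketch omits.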

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to propose a mathematical model to determine invariant sets, set covering, orbits and, in particular, attractors in the set of tourism variables. Analysis was carried out based on a pre-designed algorithm and applying our interpretation of chaos theory developed in the context of General Systems Theory. This article sets out the causal relationships associated with tourist flows in order to enable the formulation of appropriate strategies. Our results can be applied to numerous cases. For example, in the analysis of tourist flows, these findings can be used to determine whether the behaviour of certain groups affects that of other groups and to analyse tourist behaviour in terms of the most relevant variables. Unlike statistical analyses that merely provide information on current data, our method uses orbit analysis to forecast, if attractors are found, the behaviour of tourist variables in the immediate future.
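
As a generic illustration of the orbit analysis mentioned above (not the paper's tourism model), one can iterate a discrete map and test whether the orbit settles onto an attractor; the logistic map below is a standard stand-in:

```python
def orbit(f, x0, n):
    """Return the first n states of the orbit of x0 under the map f."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(f(xs[-1]))
    return xs

# Logistic map with r = 2.8, which has a stable fixed-point attractor.
f = lambda x: 2.8 * x * (1.0 - x)
xs = orbit(f, 0.2, 400)

# A fixed-point attractor is detected when late iterates stop moving.
attractor_found = abs(xs[-1] - xs[-2]) < 1e-9
```

When an attractor is found, as here, the late orbit forecasts the variable's behaviour in the immediate future, which is the use the abstract proposes for tourism variables.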

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a simple algorithm for assessing the validity of the RVoG model for PolInSAR-based inversion techniques. This approach makes use of two important features characterizing a homogeneous random volume over a ground surface: the polarization independence of wave propagation through the volume, and the structure of the polarimetric interferometric coherency matrix. These two features have led to two different methods proposed in the literature for retrieving the topographic phase within natural covers: the well-known line-fitting procedure and the observation of the (1, 2) element of the polarimetric interferometric coherency matrix. We show that differences between the outputs of the two approaches can be interpreted in terms of PolInSAR modeling based on the Freeman-Durden concept, which leads to the definition of a RVoG/non-RVoG test. The algorithm is tested with both indoor and airborne data over agricultural and tropical forest areas.
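
The line-fitting procedure, one of the two methods named above, can be sketched on synthetic, noise-free coherences placed on a line through a known ground point; the values and the ambiguity-resolution rule below are illustrative simplifications:

```python
import cmath
import math

# Synthetic complex coherences at three polarisations, lying exactly on a
# line through the true ground point exp(0.5j), as the RVoG model predicts.
true_ground_phase = 0.5
ground = cmath.exp(1j * true_ground_phase)
direction = cmath.exp(1.2j)                      # arbitrary line direction
cohs = [ground + t * direction for t in (-0.15, -0.35, -0.55)]

# With noise-free data the fitted line is spanned by any coherence pair.
d = (cohs[0] - cohs[2]) / abs(cohs[0] - cohs[2])
p = cohs[0]

# Intersect p + t*d with the unit circle: t^2 + 2*t*Re(p*conj(d)) + |p|^2 - 1 = 0.
b = (p * d.conjugate()).real
disc = math.sqrt(b * b - (abs(p) ** 2 - 1.0))
candidates = [p + (-b + s * disc) * d for s in (1.0, -1.0)]

# Ambiguity resolution (simplified): keep the intersection nearest the
# coherence with the strongest ground contribution (here cohs[0]).
ground_est = min(candidates, key=lambda z: abs(z - cohs[0]))
est_phase = cmath.phase(ground_est)
```

Comparing this line-fit phase with the phase of the (1, 2) coherency-matrix element is the discrepancy the proposed RVoG/non-RVoG test exploits.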