1000 results for "Tecnología informática y de comunicaciones"

Relevance: 100.00%

Publisher:

Abstract:

The sustainability strategy in urban spaces arises from reflecting on how to achieve a more habitable city, and it materializes in a series of sustainable transformations aimed at humanizing different environments so that they can be used and enjoyed by everyone without exception, regardless of ability. Modern communication technologies open up new opportunities to analyze the efficiency of urban space usage from several points of view: adequacy of facilities, usability, and social integration capabilities. The research presented in this paper proposes a method to analyze movement accessibility in sustainable cities based on radio frequency technologies and the ubiquitous computing possibilities of the new Internet of Things paradigm. The proposal can be deployed in both indoor and outdoor environments to check specific locations of a city. Finally, a case study in a controlled context has been simulated to validate the proposal as a pre-deployment step for urban environments.

The Iterative Closest Point (ICP) algorithm is commonly used in engineering applications to solve the rigid registration problem of partially overlapped point sets that are pre-aligned with a coarse estimate of their relative positions. This iterative algorithm is applied in many areas, such as medicine for volumetric reconstruction of tomography data, robotics for reconstructing surfaces or scenes from range sensor information, industrial systems for quality control of manufactured objects, and even biology for studying the structure and folding of proteins. One of the algorithm's main problems is its high computational complexity (quadratic in the number of points for the non-optimized original variant) in a context where high-density point sets, acquired by high-resolution scanners, are processed. Many variants have been proposed in the literature that aim to improve performance by reducing the number of points or the required iterations, or by lowering the cost of the most expensive phase: the closest-neighbor search. Despite decreasing the complexity, some of these variants tend to have a negative impact on the final registration precision or on the convergence domain, thus limiting the possible application scenarios. The goal of this work is to improve the algorithm's computational cost so that a wider range of computationally demanding problems among those described above can be addressed. For that purpose, an experimental and mathematical convergence analysis and validation of point-to-point distance metrics has been performed, considering distances with lower computational cost than the Euclidean one, which is the de facto standard in the algorithm's implementations in the literature.
In that analysis, the behavior of the algorithm in diverse topological spaces, characterized by different metrics, has been studied to check the convergence, efficacy, and cost of the method, in order to determine which metric offers the best results. Given that the distance calculation represents a significant part of the computations performed by the algorithm, any reduction in the cost of that operation can be expected to affect the overall performance of the method significantly and positively. As a result, a performance improvement has been achieved by applying those reduced-cost metrics, whose quality in terms of convergence and error has been analyzed and experimentally validated as comparable to the Euclidean distance, using a heterogeneous set of objects, scenarios, and initial situations.
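
The role the metric plays can be illustrated with a minimal, translation-only ICP sketch (an assumption of this example; the full algorithm also estimates rotation). The nearest-neighbor search is parameterized by the point-to-point metric, so swapping the Euclidean distance for a cheaper one, such as the Manhattan distance with no square root, changes only the matching cost, not the update step:

```python
import math

def icp_translation(src, dst, metric, iters=50):
    """Translation-only ICP sketch: at each iteration, match every source
    point to its nearest destination point under `metric`, then shift the
    source by the mean residual of the matches."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in src]
        dx_sum = dy_sum = 0.0
        for mx, my in moved:
            # Closest-neighbor search: the phase whose cost the metric controls.
            qx, qy = min(dst, key=lambda p: metric(mx, my, p[0], p[1]))
            dx_sum += qx - mx
            dy_sum += qy - my
        tx += dx_sum / len(src)
        ty += dy_sum / len(src)
    return tx, ty

euclidean = lambda ax, ay, bx, by: math.hypot(ax - bx, ay - by)
manhattan = lambda ax, ay, bx, by: abs(ax - bx) + abs(ay - by)  # cheaper: no sqrt

src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
dst = [(x + 0.3, y - 0.2) for x, y in src]  # ground-truth shift (0.3, -0.2)
tx_est, ty_est = icp_translation(src, dst, manhattan)
```

On this toy instance both metrics recover the same shift; the experimental question studied in the work is whether that equivalence of convergence and error holds across realistic objects and initial situations.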

Paper presented at the V Jornadas de Computación Empotrada (5th Workshop on Embedded Computing), Valladolid, 17-19 September 2014.

In this work, we propose the use of the neural gas (NG), a neural network that uses an unsupervised Competitive Hebbian Learning (CHL) rule, to develop a reverse engineering process. This is a simple and accurate method to reconstruct objects from point clouds obtained from multiple overlapping views using low-cost sensors. In contrast to other methods that may need several stages, including downsampling, noise filtering and many other tasks, the NG automatically obtains the 3D model of the scanned objects. To demonstrate the validity of our proposal, we tested the method with several models, studied the neural network parameterization by computing the quality of representation, and compared the results with other neural methods, like the growing neural gas and Kohonen maps, and with classical methods, like Voxel Grid. We also reconstructed models acquired by low-cost sensors that can be used in virtual and augmented reality environments for redesign or manipulation purposes. Since the NG algorithm has a high computational cost, we also propose its acceleration: we have redesigned and implemented the NG learning algorithm to fit it onto Graphics Processing Units using CUDA, obtaining a speed-up of 180× over the sequential CPU version.
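
The NG learning rule behind this process can be sketched compactly: for each input sample, all units are ranked by distance to it, and every unit moves toward the sample with a strength that decays exponentially with its rank and over training time. The sketch below uses illustrative parameter values (unit count, annealing schedule and epochs are assumptions of this example, not the paper's settings):

```python
import math, random

def neural_gas(points, n_units=8, epochs=200, eps_i=0.5, eps_f=0.01,
               lam_i=4.0, lam_f=0.1):
    """Neural gas sketch: rank-based competitive learning. Units quantize
    the input point cloud, adapting toward each sample proportionally to
    exp(-rank / lambda), with step size and neighborhood range annealed."""
    rng = random.Random(0)
    units = [list(rng.choice(points)) for _ in range(n_units)]
    t, t_max = 0, epochs * len(points)
    for _ in range(epochs):
        for x, y in points:
            frac = t / t_max
            eps = eps_i * (eps_f / eps_i) ** frac   # annealed step size
            lam = lam_i * (lam_f / lam_i) ** frac   # annealed neighborhood range
            ranked = sorted(range(n_units),
                            key=lambda i: (units[i][0] - x) ** 2
                                          + (units[i][1] - y) ** 2)
            for rank, i in enumerate(ranked):
                h = math.exp(-rank / lam)           # rank-based adaptation
                units[i][0] += eps * h * (x - units[i][0])
                units[i][1] += eps * h * (y - units[i][1])
            t += 1
    return units

# Toy "scan": points sampled on the unit circle; the units settle on it.
circle = [(math.cos(2 * math.pi * i / 20), math.sin(2 * math.pi * i / 20))
          for i in range(20)]
units = neural_gas(circle)
```

The rank-based update is what lets NG adapt to a surface without the downsampling and filtering stages mentioned above; it is also the sorting step that makes the sequential algorithm costly and worth porting to the GPU.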

The research developed in this work proposes a set of techniques for managing social networks and integrating them into the educational process. The proposals are based on assumptions that have been tested with simple examples in a real university teaching scenario. The results show that social networks have more capacity to spread information than educational web platforms. Moreover, educational social networks develop in a context of freedom of expression intrinsically linked to Internet freedom: users may write opinions or comments that the staff of schools dislike. However, this feature can be exploited to enrich the educational process and improve its quality. The network has covered existing needs and created new ones. Thus, the figure of the Community Manager is proposed as an agent in the educational context who monitors the network, channels opinions, and provides a rapid response to academic problems.

The use of 3D data in mobile robotics applications provides valuable information about the robot's environment. However, the huge amount of 3D information is usually difficult to manage because the robot's storage and computing capabilities are insufficient. Therefore, a data compression method is necessary to store and process this information while preserving as much of it as possible. A few methods have been proposed to compress 3D information, but there is no consistent public benchmark for comparing the results (compression level, reconstruction distance error, etc.) obtained with different methods. In this paper, we propose a dataset composed of a set of 3D point clouds with different structure and texture variability to evaluate the results obtained from 3D data compression methods. We also provide useful tools for comparing compression methods, using as a baseline the results obtained by existing relevant compression methods.
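
To make the two quantities such a benchmark reports concrete, here is a minimal voxel-grid compressor together with a mean nearest-neighbor reconstruction error, written as a simplified stand-in (not the benchmark's actual code or metrics):

```python
import math

def voxel_grid_compress(cloud, leaf=0.5):
    """Voxel-grid baseline: bucket points into cubic cells of side `leaf`
    and keep a single centroid per occupied cell."""
    cells = {}
    for p in cloud:
        key = tuple(int(math.floor(c / leaf)) for c in p)
        cells.setdefault(key, []).append(p)
    return [tuple(sum(axis) / len(pts) for axis in zip(*pts))
            for pts in cells.values()]

def mean_nn_error(original, compressed):
    """Reconstruction distance error: mean distance from every original
    point to its nearest surviving point."""
    return sum(min(math.dist(p, q) for q in compressed)
               for p in original) / len(original)

cloud = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (1.6, 1.6, 1.6)]
compressed = voxel_grid_compress(cloud)
ratio = len(compressed) / len(cloud)   # compression level
error = mean_nn_error(cloud, compressed)
```

The leaf size trades the two numbers against each other, which is exactly why a common dataset is needed to compare methods fairly.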

Automated human behaviour analysis has been, and still remains, a challenging problem. It has been addressed from different points of view, from primitive actions to human interaction recognition. This paper focuses on trajectory analysis, which allows a simple high-level understanding of complex human behaviour. We propose a novel representation of trajectory data, called the Activity Description Vector (ADV), based on the number of times a person occupies a specific point of the scenario and the local movements performed there. The ADV is calculated for each cell into which the scenario is spatially sampled, providing a cue for different clustering methods. The ADV representation has been tested as the input of several classic classifiers and compared to other approaches using CAVIAR dataset sequences, obtaining great accuracy in recognizing the behaviour of people in a shopping centre.
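
A per-cell descriptor of this kind can be sketched as follows. This is an illustrative reading of the idea (grid size, bounds and the five movement categories are assumptions of this example, not the paper's exact definition):

```python
def activity_description_vector(trajectory, grid=(4, 4),
                                bounds=(0.0, 0.0, 1.0, 1.0)):
    """ADV-style sketch: for every cell of the spatially sampled scenario,
    count how often the person is inside it and which local movement they
    perform there.  Returns {cell: [occupancy, up, down, left, right, stay]}."""
    x0, y0, x1, y1 = bounds
    gw, gh = grid

    def cell(p):
        cx = min(gw - 1, max(0, int((p[0] - x0) / (x1 - x0) * gw)))
        cy = min(gh - 1, max(0, int((p[1] - y0) / (y1 - y0) * gh)))
        return cx, cy

    adv = {}
    for i, p in enumerate(trajectory):
        v = adv.setdefault(cell(p), [0] * 6)
        v[0] += 1                              # occupancy count
        if i + 1 < len(trajectory):
            dx = trajectory[i + 1][0] - p[0]
            dy = trajectory[i + 1][1] - p[1]
            if dx == 0 and dy == 0:
                v[5] += 1                      # stay
            elif abs(dx) >= abs(dy):
                v[4 if dx > 0 else 3] += 1     # right / left
            else:
                v[1 if dy > 0 else 2] += 1     # up / down
    return adv

# A person walking left-to-right across a 4x4 sampled scene.
walk = [(0.1, 0.1), (0.3, 0.1), (0.6, 0.1), (0.9, 0.1)]
adv = activity_description_vector(walk)
```

Concatenating the per-cell vectors yields the fixed-length cue that clustering methods or classifiers can consume.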

Software-based techniques offer several advantages for increasing the reliability of processor-based systems at very low cost, but they cause performance degradation and an increase in code size. To meet performance and memory constraints, we propose SETA, a new control-flow software-only technique that uses assertions to detect errors affecting the program flow. SETA is an independent technique, but it was conceived to work together with previously proposed data-flow techniques that aim to reduce performance and memory overheads. Thus, SETA is combined with such data-flow techniques and submitted to a fault injection campaign. Simulation and neutron-induced SEE tests show high fault coverage with performance and memory overheads lower than the state of the art.
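
The principle behind assertion-based control-flow checking can be simulated in a few lines. This is an illustrative simplification, not SETA's actual signature scheme (which instruments the compiled program itself); the block graph below is hypothetical:

```python
# Hypothetical control-flow graph of an instrumented program: for each
# basic block, the set of successors a legal execution may jump to.
LEGAL_SUCCESSORS = {"A": {"B", "C"}, "B": {"D"}, "C": {"D"}, "D": set()}

def check_flow(executed_blocks):
    """Assertion-based control-flow check (simplified): every observed
    transition is verified against the legal edges, so an illegal jump,
    e.g. one caused by a radiation-induced bit flip in the program
    counter, is flagged as a control-flow error."""
    previous = None
    for block in executed_blocks:
        if previous is not None and block not in LEGAL_SUCCESSORS[previous]:
            return False          # control-flow error detected
        previous = block
    return True
```

A real technique embeds the equivalent of these checks as in-program assertions, which is where the performance and code-size overheads discussed above come from.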

Background and objective: In this paper, we have tested the suitability of different artificial intelligence-based algorithms for decision support when classifying the risk of congenital heart surgery. Classifying these surgical risks provides enormous benefits, such as the a priori estimation of surgical outcomes depending on the type of disease, the type of repair, and other elements that influence the final result. This preventive estimation may help to avoid future complications, or even death. Methods: We have evaluated four machine learning algorithms: multilayer perceptron, self-organizing map, radial basis function networks, and decision trees. The implemented architectures aim to classify among three types of surgical risk: low complexity, medium complexity, and high complexity. Results: Accuracy outcomes range between 80% and 99%, with the multilayer perceptron offering the highest hit ratio. Conclusions: According to the results, it is feasible to develop a clinical decision support system using the evaluated algorithms. Such a system would help cardiology specialists, paediatricians, and surgeons to forecast the level of risk related to a congenital heart disease surgery.
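
The shape of such a three-class risk classifier can be sketched with a minimal multinomial logistic model trained by gradient descent. This is a toy stand-in for the simplest member of the evaluated family (the paper uses MLPs, SOMs, RBF networks and decision trees on clinical data); the single "complexity score" feature and all samples below are invented for illustration:

```python
import math

def train_softmax(samples, labels, n_classes=3, epochs=2000, lr=0.1):
    """Minimal multinomial logistic classifier: one linear score per risk
    class plus a softmax, fitted by stochastic gradient descent."""
    n_feat = len(samples[0])
    w = [[0.0] * (n_feat + 1) for _ in range(n_classes)]   # weights + bias
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            xe = list(x) + [1.0]
            scores = [sum(wi * xi for wi, xi in zip(row, xe)) for row in w]
            m = max(scores)
            exps = [math.exp(s - m) for s in scores]       # stable softmax
            z = sum(exps)
            for c in range(n_classes):
                err = exps[c] / z - (1.0 if c == y else 0.0)
                for j in range(n_feat + 1):
                    w[c][j] -= lr * err * xe[j]
    return w

def predict(w, x):
    xe = list(x) + [1.0]
    scores = [sum(wi * xi for wi, xi in zip(row, xe)) for row in w]
    return scores.index(max(scores))

# Invented toy data: one "surgical complexity" score mapped to
# low (0) / medium (1) / high (2) risk.
X = [(0.1,), (0.3,), (1.2,), (1.5,), (2.6,), (3.0,)]
Y = [0, 0, 1, 1, 2, 2]
W = train_softmax(X, Y)
```

A clinical system would replace the toy feature with the disease/repair descriptors mentioned in the abstract and would be validated on held-out patients rather than on its training data.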

Human tremor can be defined as a rapid and somewhat rhythmic movement of one or more parts of the body. In some people, this movement can be a symptom of a neurological disorder. From a mathematical point of view, human tremor can be defined as a weighted sum of different sinusoidal signals that cause oscillations of some parts of the body. This sinusoid repeats over time, but its amplitude and frequency change slowly. For this reason, amplitude and frequency are considered important factors in the classification of tremor and are therefore useful for its diagnosis. This paper presents a tool to aid the diagnosis of human tremor. The tool uses a low-cost hardware device (<$40) and accurately computes the main components of the sinusoid associated with the tremor. As case studies, its application to two real cases is presented to test the soundness of the developed algorithms. The cases involve patients who suffered tremors of different severity and who performed a series of tests with the device so that the system could compute the main components of the tremor. The measurements provided by the system would, in the future, help experts make more precise decisions, allowing them to focus on specific phases of a test or to run more specific tests to better evaluate the characteristics of each patient's tremor. From the experiments performed, we can state that not all tests are valid for diagnosing all patients; ultimately, the professional's experience will decide which test or set of tests is most appropriate for each patient.
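
Extracting the dominant amplitude and frequency of a slowly varying sinusoid is a standard signal-processing task. The sketch below scans the discrete Fourier transform of a sampled signal for its strongest component; it is a generic illustration, not the tool's actual algorithm, and the 100 Hz sampling rate and 5 Hz test tone are assumptions of the example:

```python
import math

def dominant_component(signal, fs):
    """Return (frequency_hz, amplitude) of the strongest sinusoidal
    component of `signal`, via a brute-force DFT scan over the positive
    frequency bins."""
    n = len(signal)
    best = (0.0, 0.0)
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                 for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                  for t in range(n))
        amp = 2 * math.hypot(re, im) / n    # bin magnitude -> sine amplitude
        if amp > best[1]:
            best = (k * fs / n, amp)
    return best

fs = 100                                    # assumed sampling rate, Hz
# Synthetic tremor-like tone: 5 Hz, amplitude 1.5, one second of samples.
sig = [1.5 * math.sin(2 * math.pi * 5 * t / fs) for t in range(fs)]
freq, amp = dominant_component(sig, fs)
```

For real recordings, the slow drift of amplitude and frequency mentioned above is usually handled by running such an analysis over short sliding windows.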

The university open data platform (OpenData4U) enables the publication of a university's open data and provides access in a way that fosters reuse (through an open data portal and a developers' API), while also offering a transparency portal for easy access to the data in a form understandable by anyone. This is the platform used by the Universidad de Alicante in its open data and transparency project. The code is available at https://github.com/UAdatos

The book "Ecosistema de Datos Abiertos de la Universidad de Alicante" (Open Data Ecosystem of the University of Alicante) aims to be useful for universities interested in developing transparency and open data policies. It details the experience of the Universidad de Alicante in deploying its open data ecosystem, covering regulatory, procedural, and technological aspects. The book links to the software "Plataforma Tecnológica de Datos Abiertos Universitarios (OpenData4U)", which seeks to provide an environment for technological collaboration among universities, thereby creating the embryo of a network of university open data technological ecosystems.

Human behaviour recognition has been, and still remains, a challenging problem that involves different areas of computational intelligence. The automated understanding of people's activities from video sequences is an open research topic in which the computer vision and pattern recognition areas have made big efforts. In this paper, the problem is studied from a prediction point of view. We propose a novel method able to detect behaviour early using a small portion of the input, in addition to predicting behaviour from new inputs. Specifically, we propose a predictive method based on a simple representation of the trajectories of a person in the scene, which allows a high-level understanding of global human behaviour. The trajectory representation is used as a descriptor of the individual's activity, and the descriptors feed a classification stage for pattern recognition purposes. Classifiers are trained using the trajectory representation of the complete sequence, but partial sequences are processed to evaluate the early prediction capabilities for a given observation time of the scene. The experiments have been carried out using three different datasets of the CAVIAR database, considering the behaviour of an individual. Additionally, several classic classifiers have been used in the experimentation to evaluate the robustness of the proposal. Results confirm the high accuracy of the proposal in the early recognition of people's behaviours.
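
The train-on-complete, test-on-partial protocol can be sketched with a toy trajectory descriptor and a nearest-centroid classifier. Both the direction-histogram descriptor and the two behaviour classes are illustrative simplifications of this example, not the paper's representation or the CAVIAR classes:

```python
import math

def direction_histogram(traj, n_bins=4):
    """Toy trajectory descriptor: normalized histogram of movement
    directions between consecutive positions."""
    h = [0.0] * n_bins
    for (x0, y0), (x1, y1) in zip(traj, traj[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        h[int(angle / (2 * math.pi) * n_bins) % n_bins] += 1
    total = sum(h) or 1.0
    return [v / total for v in h]

def centroids(train):
    """Nearest-centroid training: average descriptor per behaviour label."""
    return {label: [sum(col) / len(col)
                    for col in zip(*(direction_histogram(t) for t in trajs))]
            for label, trajs in train.items()}

def classify(cents, traj):
    """Assign the label whose centroid is closest to the descriptor of
    `traj`, which may be only a prefix of the full trajectory."""
    d = direction_histogram(traj)
    return min(cents, key=lambda l: sum((a - b) ** 2
                                        for a, b in zip(cents[l], d)))

# Train on complete trajectories, then classify an early 30% prefix.
train = {"walk_right": [[(i * 0.1, 0.0) for i in range(10)]],
         "walk_up":    [[(0.0, i * 0.1) for i in range(10)]]}
cents = centroids(train)
early = [(i * 0.1, 0.0) for i in range(3)]   # only the first observations
```

Because the descriptor is a normalized histogram, a short prefix can already resemble the full-sequence descriptor, which is what makes early recognition possible at all.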

The SmartUA Dashboard (Cuadro de Mando SmartUA) is a software application that makes it easy to locate and visualize, at any time and from anywhere, all the information gathered from the various data sources and sensor networks generated by the Smart University project of the Universidad de Alicante; to represent it as maps and charts; to search and filter that information; and to show the university community in particular, and citizens in general, in an objective and intelligible way, the phenomena occurring on campus, interconnecting systems and people for better use of resources, efficient management, and continuous innovation.

In recent years, systems that draw on renewable resources have become an attractive alternative for energy production. Among the available sources, wind energy has established itself as one of the fastest-growing renewables. This work proposes harnessing the wind currents circulating over the earth's surface to produce electricity, with the aim of covering a large part of the demand of isolated dwellings, small agricultural facilities, service equipment located in remote places, etc. These breezes generally have a low energy density, so we propose a mechanical interface that concentrates the air masses, accelerating their circulation and achieving significant increases in the driving speed. The first part focuses on developing a characterization procedure, based on scientific methodology, with which to model a wind-flow concentrating structure valid for a vertical-axis wind turbine. This method addresses the design of an accelerating element capable of optimizing the exploitation of these breezes regardless of their direction. Its design follows from meeting a set of fundamental objectives that give the system particular capabilities regarding its architecture and operation:
- Operation under any wind direction
- Increase of the potential performance of the vertical-axis turbine
- Minimization of turbulent effects around the integrated system
- Ability to cope with strong winds
- Structural stability
- Compatibility with the installations of the architectural volume
- Global control of system performance

The second part addresses the modelling of the prototype and the analysis of its behaviour through computational fluid dynamics simulations. The result is a prototype whose architecture sectorizes the wind inlet into different sections, injecting the wind flow strategically. Placing the interface around the rotor increases the wind-capture surface, easing the entry of the flow through the different openings and concentrating it as it advances along the circulation sections. At the end of this advance, the flow is injected within a defined angular range by the high lift force that can be generated thanks to the incidence of the wind flow, exploiting the particular features offered by the rotation of this type of rotor. The result of the sectorized injection is an inner vortical circulation that permanently impinges on the lift range characteristic of the aerodynamic profile defining the geometry of the rotating blade, allowing nominal operation to be reached at reduced wind speeds. The process includes the actions needed to respond efficiently to any wind condition. At moderately high speeds, the concentrating interface adapts its architecture to regulate the inflow, delaying the activation of the turbine's own regulating devices. Under strong winds, the interface has the mechanisms needed to close the openings and stop the rotor. The prototypes have been validated through CFD simulation. The results confirm nominal operation at lower wind speeds, a larger capture surface, and a longer effective operating period compared with conventional turbines.

For the modelled case study, the results improve the generated power by more than 2.5 times and quadruple the force exerted on the turbine blades. The development of a precise method for designing this kind of concentrating structure makes it possible to tailor a design to the user's needs or to the wind conditions of a given site. Moreover, the system is compatible, in use and assembly, with solar capture systems, forming a hybrid system capable of exploiting both solar and wind energy for autonomous supply; this widens the range of geographical areas where it can be deployed. The final pages outline future lines of development, both in terms of production efficiency and of incorporation into new architectural volumes and civil structures in general.
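
A power gain of this magnitude is consistent with the cubic dependence of wind power on speed, P = ½ρAv³Cp: a modest acceleration of the inflow multiplies the available power. A quick numeric check, where the speeds and the power coefficient are illustrative assumptions of this example and not values from the thesis:

```python
def wind_power(v, area=1.0, rho=1.225, cp=0.35):
    """P = 0.5 * rho * A * v**3 * Cp, with rho the air density (kg/m^3),
    A the swept area (m^2), and Cp an assumed power coefficient."""
    return 0.5 * rho * area * v ** 3 * cp

# Cube law: accelerating a 4 m/s breeze to about 5.5 m/s already yields
# more than 2.5 times the power, since (5.5 / 4)**3 is roughly 2.6.
gain = wind_power(5.5) / wind_power(4.0)
```

Because density, area, and Cp cancel in the ratio, the gain depends only on the speed increase, which is why concentrating the flow is such an effective lever for low-energy-density breezes.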