24 results for Analyzing human behavior
at Universidad Politécnica de Madrid
Abstract:
The aim of this Thesis is to develop a methodology for the automatic detection of anomalies from hyperspectral data, or imaging spectrometry, and for mapping them under different surface and terrain conditions. Hyperspectral technology offers the potential to characterize precisely the state of the materials that make up a surface on the basis of their spectral response. This state is usually variable, while observations are available only in limited numbers and under particular illumination conditions. As the number of spectral bands grows, so does the number of samples needed to define the classes spectrally, a problem known as the Curse of Dimensionality or Hughes effect (Bellman, 1957); such samples are usually unavailable and costly to obtain, as the case of planetary exploration makes plain. Defining an anomaly, in the spectral sense, as the response of an image pixel that differs significantly from its surroundings, the Thesis addresses two central questions: first, how to reduce the dimensionality of the hyperspectral data while retaining the information most significant for detecting anomalous responses; and second, how to relate the detected spectral anomalies to what we have called informational anomalies, that is, anomalies that convey real information about the surfaces or materials that produce them. Anomaly detection assumes no prior knowledge of the targets: pixels are separated automatically according to spectral information that differs significantly from a background estimated either globally, for the whole scene, or locally, by image segmentation.
The methodology developed focuses on the statistical definition of the spectral background, proposing a new approach that discriminates anomalies against backgrounds segmented into different groups of wavelengths, exploiting the separation between the reflective and emissive parts of the electromagnetic spectrum. The efficiency of the main anomaly detection algorithms has been studied, contrasting the results of the RX algorithm (Reed and Xiaoli, 1990), adopted as a standard by the scientific community, with the UTD (Uniform Targets Detector) method and its RXD-UTD variant, subspace-based methods such as SSRX (Subspace RX), and methods based on image subspace projections such as OSPRX (Orthogonal Subspace Projection RX) and PP (Projection Pursuit). A new method has been developed and evaluated against the above: a variant of PP that describes the spectral background through discriminant analysis of bands of the electromagnetic spectrum and separates the anomalies with an algorithm named DAFT (Detector de Anomalías de Fondo Térmico, Thermal Background Anomaly Detector), applicable to sensors that record data in the emissive spectrum. The detectors have been evaluated across the visible and near-infrared (VNIR), shortwave infrared (SWIR), mid-infrared (MIR) and thermal infrared (TIR) ranges, using hyperspectral data cubes from airborne sensors whose image-forming strategies and designs differ: AHS (Airborne Hyperspectral System), HyMAP Imaging Spectrometer, CASI (Compact Airborne Spectrographic Imager), AVIRIS (Airborne Visible Infrared Imaging Spectrometer), HYDICE (Hyperspectral Digital Imagery Collection Experiment) and MASTER (MODIS/ASTER Simulator). Experiments were designed over natural, urban and semi-urban settings of varying complexity: 23 tests over 15 study areas grouped into 6 scenarios, Urban (E1), Semi-urban/Industrial/Urban periphery (E2), Forest (E3), Agricultural (E4), Geological/Volcanic (E5) and Other spaces: water, clouds and shadows (E6). The measuring instrument itself was also a factor in the research: to use the measured data quantitatively, the spatial and spectral relationships between the sensor, the observed surface and the potential anomalies had to be defined, and the influence of sensor configuration and design on anomaly detection was analyzed for the two main sensor types studied, whiskbroom (rotating-mirror) and pushbroom scanners. The scenarios span a wide variability of geomorphological environments and cover types, in Mediterranean, mid-latitude and tropical settings.
In summary, this Thesis presents a new anomaly detection technique for hyperspectral data, DAFT, a PP variant based on dimensionality reduction that projects the background onto a range of thermal wavelengths distinct from the projection of the anomalies or targets with unknown spectral signature. The proposed methodology has been tested with real hyperspectral images from different imaging spectrometers over different scenarios, and therefore different spectral backgrounds; the results show the benefits of the approach in detecting a wide variety of targets whose spectral signatures deviate sufficiently from the background. The technique is automatic in the sense that no parameter tuning is required, and it yields significant results in all cases. Even subpixel targets, which cannot be distinguished by the human eye in the original image, can be detected as anomalies, owing to the projection of VNIR endmembers with a very strong thermal contrast. A comparison between the proposed approach, the popular RX technique and other detectors, in both their global and local modes, shows that the proposed method outperforms the others in particular scenarios, demonstrating its capacity to reduce the false alarm rate. The results of the automatic DAFT algorithm demonstrate an improvement in the qualitative definition of the spectral anomalies that identify distinct entities on or beneath the surface, replacing the classical normal-distribution model with a robust method that considers different alternatives from the moment the hyperspectral data are acquired. Achieving this required analyzing the relationship between biophysical parameters, such as the reflectance and emissivity of materials, and the spatial distribution of the detected entities with respect to their environment, for example buried or semi-buried materials, or building covers of asbestos, cellular polycarbonate-PVC or metal composites. Finally, DAFT has been selected as the most suitable algorithm for sensors that acquire data in the TIR, since it shows the best agreement with the reference data and a computational efficiency that facilitates its implementation in a mapping system that automatically projects the detected anomalies onto a geographic reference frame, a significant step toward what is called Real-Time Mapping.
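For reference, the sketch below shows a minimal global RX detector of the kind the Thesis benchmarks against: each pixel is scored by its Mahalanobis distance to the background mean and covariance estimated from the whole scene. This is an illustrative implementation of the standard algorithm, not the Thesis's DAFT code; the cube shape and the detection threshold are assumptions.

```python
import numpy as np

def rx_global(cube):
    """Global RX anomaly scores for a hyperspectral cube.

    cube: array of shape (rows, cols, bands).
    Returns an (rows, cols) array of Mahalanobis distances to the
    scene-wide background statistics.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    mu = X.mean(axis=0)                      # background mean spectrum
    cov = np.cov(X, rowvar=False)            # background covariance
    cov_inv = np.linalg.pinv(cov)            # pseudo-inverse for stability
    D = X - mu
    # Squared Mahalanobis distance per pixel: d^T C^-1 d
    scores = np.einsum('ij,jk,ik->i', D, cov_inv, D)
    return scores.reshape(rows, cols)

# Usage: flag the pixels whose score exceeds a chosen quantile.
cube = np.random.rand(64, 64, 50)            # synthetic stand-in for a real scene
scores = rx_global(cube)
anomalies = scores > np.quantile(scores, 0.995)
```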
Abstract:
During our daily routines, we constantly interact with electronic devices and telecommunication services, such as the telephone, e-mail, bank transactions or online social networks. Without realizing it, we massively leave traces of our activity in the databases of service providers. These new data sources have the dimensions required to observe patterns of human behavior at large scales. As a result, there has been an unprecedented explosion of data-driven social research. In this thesis, we develop computational and mathematical methods to analyze social systems through the combined study of human activity data and the theory of complex networks. Our goal is to characterize and understand the systems of social interactions that emerge in the new technological spaces, such as the online social network Twitter and mobile telephony. We analyze these systems by constructing complex networks and time series, studying their structure, functioning and evolution in time. We also investigate the nature of the observed patterns through the mechanisms that govern the interactions among individuals, and we measure the impact of critical events on the system's behavior. To this end, we have proposed models that explain the global structures and the emergent dynamics with which information flows through the system.
For the studies of the online social network Twitter, we based our analysis on specific conversations, such as political protests, major events or electoral processes. From the messages in each conversation we identify the participating users and build networks of interactions among them: one network representing who receives whose messages and another representing who propagates whose messages. In general, we found that these structures have complex properties, such as explosive growth and scale-free degree distributions. Based on the topology of these networks, we identified three types of users that determine the information flow according to their activity and influence. To measure the users' influence on a conversation, we introduced a new measure called user efficiency, defined as the number of retransmissions obtained per message posted; it measures the effect of individual efforts on the collective reaction. We observed that the distribution of this property is ubiquitous across several Twitter conversations, regardless of their size or context, which suggests a universal relationship between individual efforts and collective reactions on Twitter. To explain the factors that determine the emergence of the efficiency distribution, we developed a computational model that simulates the propagation of messages on the Twitter social network, based on the independent cascade mechanism. The model lets us measure the effect that both the topology of the underlying social network and the way users post messages have on the efficiency distribution. The results indicate that the emergence of a select group of highly efficient users depends on the heterogeneity of the underlying network rather than on individual behavior.
We have also developed techniques to infer the degree of political polarization in social networks. We propose a methodology to estimate opinions in social networks and to measure the degree of polarization in the opinions obtained. We designed a model to study the effect that the opinion of a small group of influential users, called the elite, has on the opinions of the majority of users. The model yields an opinion distribution over which we measure the degree of polarization. We applied our methodology to measure polarization in message diffusion networks during a Twitter conversation in a politically polarized society; the results show very good agreement with offline and contextual data. With this study, we demonstrated that the proposed methodology is able to detect different degrees of polarization depending on the structure of the network.
Finally, we have studied human behavior from mobile phone data. On the one hand, we characterized the impact of natural disasters, such as floods, on collective behavior. We found that communication patterns are abruptly altered in the areas affected by the catastrophe, which shows that the impact on a region could be measured almost in real time, without deploying efforts on the ground. On the other hand, we studied human activity and mobility patterns to characterize the interactions between regions of a developing country, finding that the networks of calls and human trajectories have community structures associated with regions and urban centers. In summary, we have shown that it is possible to understand complex social processes by analyzing human activity data with the theory of complex networks. Throughout the thesis, we have verified that social phenomena such as influence, political polarization and reaction to critical events are reflected in the structural and dynamical patterns of the networks built from data on conversations in online social networks and mobile telephony.
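To make the independent cascade mechanism concrete, here is a minimal sketch of how one might simulate message propagation on a follower graph and compute the user efficiency (retransmissions per message posted) described above. The toy graph, the per-edge retransmission probability and the posting scheme are illustrative assumptions, not the thesis's actual model.

```python
import random
from collections import defaultdict, deque

def independent_cascade(followers, seed, p=0.05):
    """Propagate one message from `seed` through a follower graph.

    followers: dict mapping user -> list of users who follow them.
    Each newly exposed follower retransmits independently with probability p.
    Returns the number of retransmissions obtained by the message.
    """
    retransmitted = {seed}
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        for v in followers.get(u, []):
            if v not in retransmitted and random.random() < p:
                retransmitted.add(v)
                queue.append(v)
    return len(retransmitted) - 1  # do not count the original post

def user_efficiency(followers, posts, p=0.05):
    """Efficiency = total retransmissions / messages posted, per user."""
    totals = defaultdict(int)
    for user, n_messages in posts.items():
        for _ in range(n_messages):
            totals[user] += independent_cascade(followers, user, p)
    return {u: totals[u] / posts[u] for u in posts}

# Toy usage: a small follower graph and heterogeneous posting activity.
followers = {0: [1, 2, 3], 1: [2], 2: [0, 3], 3: []}
posts = {0: 10, 1: 5, 2: 2}
print(user_efficiency(followers, posts))
```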
Abstract:
Services in smart environments aim to increase the quality of people's lives. One of the most important issues when developing such environments is testing and validating the services, tasks that usually imply high costs and annoying, or even unfeasible, real-world testing. In such cases, artificial societies may be used to simulate the smart environment (i.e. the physical environment, the equipment and the humans). With this aim, the CHROMUBE methodology guides test engineers when modeling human beings; the resulting models reproduce behaviors that are highly similar to the real ones. Originally, these models are based on automata whose transitions are governed by random variables, and the automaton's structure and the probability distribution function of each random variable are determined by a manual trial-and-error process. This paper presents an alternative extension of the methodology that avoids this manual process: human behavior patterns are learned automatically from sensor data using machine learning techniques. The presented approach has been tested on a real scenario, where this extension has yielded highly accurate human behavior models.
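As an illustration of the kind of automatic learning step described above, the sketch below estimates the transition probabilities of a behavior automaton from sequences of sensor-derived states. The state labels and the maximum-likelihood estimation are assumptions for illustration; the paper's actual machine learning pipeline may differ.

```python
from collections import Counter, defaultdict

def learn_transitions(sequences):
    """Maximum-likelihood transition probabilities from observed state sequences.

    sequences: list of lists of states, e.g. [["sleep", "cook", "eat"], ...].
    Returns dict: state -> {next_state: probability}.
    """
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):   # count each observed transition
            counts[a][b] += 1
    return {s: {t: n / sum(c.values()) for t, n in c.items()}
            for s, c in counts.items()}

# Toy usage with hypothetical daily-activity sequences from sensor data.
days = [["sleep", "cook", "eat", "work"],
        ["sleep", "eat", "work"],
        ["sleep", "cook", "eat", "relax"]]
model = learn_transitions(days)
print(model["eat"])   # e.g. {'work': 0.67, 'relax': 0.33}
```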
Abstract:
Organizations are social systems or units composed of people who interact with each other to achieve common goals. One of those goals is productivity, a multidimensional construct influenced by technological, economic, organizational and human aspects. Several studies support the influence on productivity of people's motivation, of the skills and abilities of individuals, of their talent for the job, and of the work environment present in the organization. The overall objective of this research is therefore to analyze the influence between human factors and productivity, with emphasis on the person as a key productive factor. The research questions concern which human variables affect productivity, whether a productivity model can be proposed that considers the impact of the human factor, and whether a measurement method can be found that includes the perception of the human factor. To address these questions, this research seeks to establish the relationships between human variables and productivity as seen from three different units of analysis, the individual, the group and the organization, in order to formulate a model of human productivity and design an instrument for its measurement. A major source for choosing the human variables, formulating the model and defining the method for measuring productivity was the review of the available literature on productivity and the human factor in organizations, which guided the theoretical and conceptual framework. Another source was the opinion of experts and specialists directly involved in the Venezuelan electricity sector, which helped obtain a model whose variables reflect the reality of the field under study.
To provide an explanatory interpretation of the phenomenon, the Human Factors vs. Productivity Model (HFPM) was proposed. Analyzed from the perspective of causal analysis, it comprises three exogenous latent variables, denominated individual factors, group factors and organizational factors, related to an endogenous latent variable denominated productivity. The HFPM was formulated using the methodology of structural equation modeling (SEM). The relationships initially proposed between the latent variables were corroborated by the global fit of the model, and the relationships between the latent variables and their associated indicators were confirmed, supporting the statement of 26 hypotheses, of which 24 were confirmed. The model was validated through the rival-models strategy, used to compare several SEM models and select the one with the best fit and theoretical support; its acceptance was based on the joint evaluation of the global goodness-of-fit indices. Additionally, for the development of the productivity measurement instrument (IMPH), an exploratory factor analysis was performed prior to the confirmatory factor analysis, applying SEM. The review of the concepts of productivity, the influence of the human factor and the available measurement methods led to subjective methods that incorporate the perception of the main actors of the production process, both for the selection of variables and for the formulation of the productivity model and the design of the measurement instrument. The methodological contribution of this research is the use of SEM to relate variables concerning human behavior in organizations to productivity, which opens new possibilities for research in this area.
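To make the SEM formulation concrete, here is a minimal sketch of how the structure described above (three exogenous latent factors predicting a latent productivity variable) could be specified, for instance with the Python package semopy. The indicator names are hypothetical placeholders, not the actual items of the IMPH instrument.

```python
import pandas as pd
from semopy import Model

# Structural equation model: three exogenous latent factors
# (individual, group, organizational) explain latent productivity.
MODEL_DESC = """
# measurement model (hypothetical indicators)
individual     =~ motivation + skills + talent
group          =~ cohesion + communication
organizational =~ climate + leadership
productivity   =~ output_quality + goal_attainment
# structural model
productivity ~ individual + group + organizational
"""

data = pd.read_csv("survey_responses.csv")   # one column per indicator
model = Model(MODEL_DESC)
model.fit(data)
print(model.inspect())                       # parameter estimates and p-values
```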
Abstract:
Due to recent scientific and technological advances in information systems, it is now possible to perform almost every application on a mobile device, and the need to make such devices more intelligent opens an opportunity to design data mining algorithms able to execute autonomously on local devices and provide the device with knowledge. The problem behind autonomous mining is the proper configuration of the algorithm so that it produces the most appropriate results. Contextual information, together with information about the device's resources, has a strong impact both on the feasibility of a particular execution and on the production of the proper patterns. On the other hand, the performance of the algorithm, expressed in terms of efficacy and efficiency, depends heavily on the features of the dataset to be analyzed and on the parameter values of a particular implementation. However, few existing approaches deal with the autonomous configuration of data mining algorithms, and in any case they do not deal with contextual or resource information. Both issues are particularly significant for social network applications: the widespread use of social networks, and consequently the amount of information shared, has made modeling context in social applications a priority, and resource consumption plays a crucial role on such platforms because users access social networks mainly from their mobile devices. This PhD thesis addresses these open issues, focusing on i) analyzing the behavior of algorithms, ii) mapping contextual and resource information to find the most appropriate configuration, and iii) applying the model to the case of a social recommender. Four main contributions are presented:
- The EE-Model: predicts the behavior of a data mining algorithm in terms of the resources consumed and the accuracy of the mining model it will obtain.
- The SC-Mapper: maps a situation, defined by the context and resource state, to a data mining configuration.
- SOMAR: a social activity (events and informal ongoings) recommender for mobile devices.
- D-SOMAR: an evolution of SOMAR which incorporates the configurator in order to provide updated recommendations.
Finally, the experimental validation of the proposed contributions using synthetic and real datasets allows us to achieve the objectives and answer the research questions posed in this dissertation.
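As a sketch of what an EE-Model-style predictor could look like, the snippet below trains a regressor that maps dataset features and algorithm parameters to expected accuracy and resource consumption. The feature set and the use of a random forest are illustrative assumptions; the thesis's actual model is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Each row: dataset/algorithm descriptors observed in past runs.
# Features (hypothetical): n_instances, n_attributes, param_k, battery_level
X = np.array([[1000, 10, 3, 0.9],
              [5000, 25, 5, 0.5],
              [200,  8,  2, 0.8],
              [8000, 40, 8, 0.3]])
# Targets: [accuracy, runtime_seconds, memory_mb] measured in those runs
y = np.array([[0.81, 1.2, 35],
              [0.77, 9.8, 160],
              [0.85, 0.3, 12],
              [0.70, 25.4, 410]])

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Predict the behavior of a candidate configuration before running it,
# so the device can pick one that fits its current resource state.
candidate = np.array([[3000, 15, 4, 0.6]])
accuracy, runtime, memory = model.predict(candidate)[0]
print(accuracy, runtime, memory)
```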
Abstract:
Understanding the structure and dynamics of the intricate network of connections among people who consume products through the Internet is an extremely useful asset for studying emergent properties related to social behavior. This knowledge could be used, for example, to improve the performance of personal recommendation algorithms. In this contribution, we analyzed five years of movie-rating transactions provided by Netflix, a movie rental platform where users rate movies from an online catalog. This dataset can be studied as a bipartite user-item network whose structure evolves in time. Even though several topological properties of subsets of this bipartite network have been reported with a model that combines random and preferential attachment mechanisms [Beguerisse Díaz et al., 2010], many aspects worth exploring remain, as they are connected to relevant phenomena underlying the evolution of the network. In this work, we test the hypothesis that bursty human behavior is essential to describe how a bipartite user-item network evolves in time. To that end, we propose a novel model in which, for user nodes, network growth follows a preferential attachment mechanism acting not only in the topological domain (i.e. based on node degrees) but also in the time domain, while for items the model mixes degree-based preferential attachment and random selection. With these ingredients, the model is not only able to reproduce the asymptotic degree distribution, but also shows an excellent agreement with the Netflix data in several time-dependent topological properties.
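A minimal sketch of this kind of growth rule is shown below: when a new rating arrives, the active user is chosen with probability proportional to its degree times a decaying function of the time since its last activity (a stand-in for burstiness), and the item is chosen by mixing degree-preferential and uniform selection. The specific kernels and mixing weight are assumptions for illustration, not the parameters fitted in the paper.

```python
import math
import random

def pick_user(degree, last_active, t, tau=50.0):
    """Preferential in degree AND in time: recently active, well-connected
    users are more likely to rate again (bursty behavior)."""
    users = list(degree)
    weights = [(degree[u] + 1) * math.exp(-(t - last_active[u]) / tau)
               for u in users]
    return random.choices(users, weights=weights)[0]

def pick_item(degree, q=0.7):
    """With probability q choose an item preferentially by degree,
    otherwise uniformly at random."""
    items = list(degree)
    if random.random() < q:
        return random.choices(items, weights=[degree[i] + 1 for i in items])[0]
    return random.choice(items)

# Grow a toy bipartite user-item network over 1000 rating events.
user_deg = {u: 0 for u in range(20)}
item_deg = {i: 0 for i in range(50)}
last = {u: 0 for u in user_deg}
edges = set()
for t in range(1000):
    u, i = pick_user(user_deg, last, t), pick_item(item_deg)
    if (u, i) not in edges:
        edges.add((u, i))
        user_deg[u] += 1
        item_deg[i] += 1
    last[u] = t
```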
Abstract:
Intelligent Transportation Systems (ITS) cover a broad range of methods and technologies that provide answers to many transportation problems, and unmanned control of the steering wheel is one of the most important challenges facing researchers in this area. This paper presents a method to adjust automatically a fuzzy controller that manages the steering wheel of a mass-produced vehicle so as to reproduce the steering of a human driver. To this end, information about the car's state is recorded while human drivers drive it and is used to obtain, via genetic algorithms, fuzzy controllers that can drive the car the way humans do. These controllers must satisfy two main objectives: to reproduce the human behavior, and to provide smooth actions that ensure comfortable driving. Finally, the results of automated driving on a test circuit are presented, showing both good route tracking (similar to the performance obtained by persons in the same task) and smooth driving.
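The snippet below sketches the core of such a tuning loop: a candidate fuzzy controller, encoded as a parameter vector, is scored against recorded human driving by combining a reproduction error with a smoothness penalty, and a genetic algorithm searches for the best parameters. The encoding, the fitness weights and the simple selection/mutation scheme are illustrative assumptions, not the paper's actual setup.

```python
import random

def fitness(params, recorded, controller):
    """Score a candidate controller against recorded human driving.

    recorded: list of (car_state, human_steering) pairs.
    Lower is better: tracking error plus a penalty on abrupt actions.
    """
    outputs = [controller(params, state) for state, _ in recorded]
    error = sum((o - h) ** 2 for o, (_, h) in zip(outputs, recorded))
    smoothness = sum((b - a) ** 2 for a, b in zip(outputs, outputs[1:]))
    return error + 0.1 * smoothness   # 0.1: assumed comfort weight

def evolve(recorded, controller, n_params, pop=30, gens=100):
    """Minimal genetic algorithm: truncation selection + Gaussian mutation."""
    population = [[random.uniform(-1, 1) for _ in range(n_params)]
                  for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda p: fitness(p, recorded, controller))
        parents = ranked[: pop // 2]             # keep the better half
        children = [[g + random.gauss(0, 0.05) for g in random.choice(parents)]
                    for _ in range(pop - len(parents))]
        population = parents + children
    return min(population, key=lambda p: fitness(p, recorded, controller))

# Usage (with a stand-in linear "controller" and synthetic data):
ctrl = lambda p, s: p[0] * s
data = [(s, 0.5 * s) for s in range(10)]
print(evolve(data, ctrl, n_params=1))   # should approach [0.5]
```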
Abstract:
This project covers the study of different mobile communication simulators that analyze the behavior of the UMTS (Universal Mobile Telecommunications System, 3G) and LTE (Long Term Evolution, 3.9G) technologies, focusing mainly on LTE simulators, since that is the technology currently being deployed. Before analyzing the characteristics of the most important radio interface of this generation, 3.9G, the project gives a general overview of how mobile communications have evolved throughout history and analyzes the characteristics of the current mobile technology, 3.9G, to then focus on a pair of simulators that demonstrate these characteristics through graphical results. Today, the use of such simulators is absolutely necessary: mobile communications advance at a dizzying pace, and it is therefore necessary to know the performance that the different mobile technologies in use can deliver. The simulators used in this project make it possible to analyze the behavior of several scenarios, since different types of simulators exist, both at the link level and at the system level. A number of simulators for third-generation UMTS are mentioned, but the simulators studied and analyzed in depth in this final-year project are the Link-Level and System-Level simulators developed by the Institute of Communications and Radio-Frequency Engineering of the University of Vienna. These simulators support different kinds of simulations, such as analyzing the behavior between a base station and a single user, in the case of the link-level simulators, or analyzing the performance of a whole network, in the case of the system-level simulators. Based on the results that can be obtained from both simulators, a series of questions, both theoretical and practical, is posed, drawing on the lab exercise developed by Pedro García del Pino, professor at the Universidad Politécnica de Madrid (UPM), to check that the simulators analyzed have been understood. Finally, the conclusions drawn from this project are presented, together with future lines of work.
Abstract:
Dormancy is an adaptive mechanism that allows woody plants to survive low winter temperatures. Disruption of circadian clock genes in winter, or under low temperatures in both long and short days, was described by our group a few years ago (Ramos et al., 2005). The basic mechanisms of circadian clock function are similar in herbaceous and woody plants, although their responses to low temperatures differ (Bieniawska et al., 2008). Woody plants growing under natural daylight conditions should have a specific transcriptional control over the circadian clock genes that is responsible for the constitutive transcriptional activation observed under low-temperature conditions. In order to understand this regulatory process, we are analyzing the behavior of a circadian clock gene in poplar. To this aim, we have isolated its promoter region and fused it to the luciferase reporter gene. This construct has been transformed into the Populus tremula x P. alba 717-1B4 INRA clone. Here we present the characterization of these transgenic lines under different conditions of light and temperature.
Abstract:
This Final Degree Project analyzes explanatory texts about quantitative data in order to identify, on the basis of Rhetorical Structure Theory, the relations most commonly holding between the sentences of journalistic texts concerning human behavior and people's use of social networks. In addition, a set of 20 texts (about 1,200 pages) was analyzed, from which typical sentences related to the topic were obtained; these served as the basis for building a model comprising a total of 101 patterns. This work can be continued in the future along the following lines: extending the set of patterns provided; building an automatic text generation system based on the patterns collected in this study; and broadening the study and extrapolating it to other topics.
Abstract:
This work aims to fill an existing gap regarding a general methodology for archaeoacoustic studies in the Mesoamerican region. Its most important result is the proposal of a procedure and method detailing the set of operations used in the individual measurements involved, the relevant parameters and analyses (notably energy decay, reverberation times, direct and total sound intensity, clarity and intelligibility), the types of simulations and the treatment of uncertainties, together with the technical activities of acoustical engineering proposed for the effective and useful integration of the sound dimension into archaeological work. An important methodological consideration of the proposal presented here is the comparison and contrast between, on the one hand, the results obtained experimentally in situ at each archaeological site of interest and, on the other, the results of computer simulations and models of the same sites. The proposed systematization is divided into three investigative moments: coordination and logistics at the archaeological sites, including their reconnaissance and acoustic prospection; fieldwork; and analysis of the results obtained. The proposal is likewise divided into three fundamental objects of study in archaeoacoustics: the sound phenomena found at archaeological sites; the sound-generating objects, whether their applications are musical or of another kind; and the architectural spaces and enclosures, whether closed or open, and the study of their functionality. Each of these branches is described through case studies, contrasting two representative and interrelated cultural areas of Mesoamerica, the Bajío region and the Maya region, by means of the acoustic analysis of four archaeological sites: Plazuelas, Peralta and Cañada de la Virgen, in the Bajío, and Chichen Itzá, in the Maya area. Of particular relevance to this work is the functionality of the spaces studied, specifically the sunken patios characteristic of the architecture of the Bajío and the public plazas that constitute an integral structure at many archaeological sites, central to understanding the connections between sound phenomena and the behavior of a particular culture. Likewise, the analysis of the sound-generating instruments has allowed inferences about the role of sound in the human behavior of the cultures in question, completing the acoustic models and making it possible to situate the characteristics of the sound sources within the resonant spaces. The results of this thesis have made it possible to establish the acoustic characteristics of these spaces and instruments, and to formulate and validate hypotheses about their uses and functions as venues for public and social events and for mass cultural representations.
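Among the parameters listed above, reverberation time is typically obtained from a measured impulse response via Schroeder backward integration. The sketch below is a minimal illustration of that standard computation (a T30 estimate extrapolated to 60 dB); the sampling rate and evaluation range are assumptions, and it is not the thesis's measurement code.

```python
import numpy as np

def rt60_from_impulse_response(ir, fs):
    """Estimate RT60 from an impulse response via Schroeder integration.

    ir: impulse response samples; fs: sampling rate in Hz.
    Fits the -5 dB to -35 dB decay slope (T30) and extrapolates to 60 dB.
    """
    energy = ir.astype(np.float64) ** 2
    edc = np.cumsum(energy[::-1])[::-1]          # Schroeder energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0])
    t = np.arange(len(ir)) / fs
    mask = (edc_db <= -5.0) & (edc_db >= -35.0)  # T30 evaluation range
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)   # decay rate, dB per second
    return -60.0 / slope

# Toy usage: exponentially decaying noise as a stand-in for a measured IR.
fs = 44100
t = np.arange(fs) / fs
ir = np.random.randn(fs) * np.exp(-3.0 * t)
print(rt60_from_impulse_response(ir, fs))
```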
Abstract:
Natural disasters affect hundreds of millions of people worldwide every year. Emergency response efforts depend upon the availability of timely information, such as information concerning the movements of affected populations. The analysis of aggregated and anonymized Call Detail Records (CDR) captured from the mobile phone infrastructure provides new possibilities to characterize human behavior during critical events. In this work, we investigate the viability of using CDR data combined with other sources of information to characterize the floods that occurred in Tabasco, Mexico in 2009. An impact map has been reconstructed using Landsat-7 images to identify the floods. Within this frame, the underlying communication activity signals in the CDR data have been analyzed and compared against rainfall levels extracted from data of the NASA-TRMM project. The variations in the number of active phones connected to each cell tower reveal abnormal activity patterns in the most affected locations during and after the floods that could be used as signatures of the floods, both in terms of infrastructure impact assessment and population information awareness. The representativeness of the analysis has been assessed using census data and civil protection records. While a more extensive validation is required, these early results suggest high potential in using cell tower activity information to improve early warning and emergency management mechanisms.
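A simple way to surface the abnormal activity patterns mentioned above is to compare each tower's activity against its own typical level for the same hour of the week. The sketch below flags hours whose robust z-score exceeds a threshold; the threshold and the hour-of-week baseline are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def abnormal_hours(active_phones, threshold=3.5):
    """Flag anomalous hours in a tower's hourly activity time series.

    active_phones: hourly counts of phones connected to one tower over
    several weeks. Each hour is compared against the same hour-of-week
    across weeks using a robust (median/MAD) z-score.
    """
    hours_per_week = 24 * 7
    n_weeks = len(active_phones) // hours_per_week
    x = np.asarray(active_phones[: n_weeks * hours_per_week], dtype=float)
    weekly = x.reshape(n_weeks, hours_per_week)
    med = np.median(weekly, axis=0)                    # typical hour-of-week level
    mad = np.median(np.abs(weekly - med), axis=0) + 1e-9
    z = 0.6745 * (weekly - med) / mad                  # robust z-score
    return (np.abs(z) > threshold).ravel()

# Toy usage: eight weeks of synthetic counts with a drop on the final day.
rng = np.random.default_rng(0)
series = rng.poisson(100, size=24 * 7 * 8).astype(float)
series[-24:] *= 0.2                                    # sharp drop, e.g. a flood day
print(abnormal_hours(series).sum(), "anomalous hours flagged")
```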
Abstract:
In the past decades, online learning has transformed the educational landscape with the emergence of new ways to learn. This fact, together with recent changes in European educational policy aiming to facilitate the incorporation of graduates into the labor market, has provoked a shift in the delivery of instruction and in the roles played by teachers and students, stressing the need to develop both basic and cross-curricular competencies. In parallel, recent years have witnessed the emergence of new educational disciplines, such as learning analytics, that can take advantage of the information retrieved by technology-based online education in order to improve instruction. This study explores the applicability of learning analytics to predicting the development of two cross-curricular competencies, teamwork and commitment, based on the analysis of Moodle interaction data logs in a Master's Degree program at the Universidad a Distancia de Madrid (UDIMA) in which the students were education professionals. The results of the study question the suitability of a general interaction-based approach and show no relation between online activity indicators and the acquisition of teamwork and commitment. The discussion of results includes multiple recommendations for further research on this topic.
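As an illustration of the kind of analysis involved, the sketch below computes simple interaction indicators from a Moodle log export and correlates them with competency scores. The file names, column names and the use of Spearman correlation are hypothetical assumptions; the study's actual indicators and tests are not reproduced here.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical Moodle log export: one row per logged event.
logs = pd.read_csv("moodle_logs.csv")        # columns: user_id, event, timestamp
scores = pd.read_csv("competencies.csv")     # columns: user_id, teamwork, commitment

# Interaction indicators per student: total events and forum posts.
indicators = logs.groupby("user_id").agg(
    n_events=("event", "size"),
    n_forum_posts=("event", lambda e: (e == "forum_post").sum()),
)
merged = indicators.join(scores.set_index("user_id"))

# Rank correlation between each indicator and each competency score.
for ind in ["n_events", "n_forum_posts"]:
    for comp in ["teamwork", "commitment"]:
        rho, p = spearmanr(merged[ind], merged[comp])
        print(f"{ind} vs {comp}: rho={rho:.2f}, p={p:.3f}")
```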
Abstract:
For safety barriers, the load-bearing capacity of glass subjected to soft body impact must be verified, and the soft body pendulum test has become the testing standard for classifying safety glass plates. This classification, however, does not consider the structural behavior once one sheet of a laminated glass plate is broken; in situations where replacement of the plate is not urgent, that residual structural behavior should be evaluated. The main objective of this paper is to present the structural behavior of laminated glass plates, through modal tests and human impact tests, including the post-fracture behavior of the laminated cases. Good reproducibility and repeatability were obtained. Two main aspects of the structural behavior can be observed: the increase in rupture load of laminated plates after failure of the first sheet, and some similarities with the behavior of a tempered monolithic plate of equivalent thickness.