921 results for "Processing wikipedia data"
Abstract:
In this Thesis we have addressed the difficulties related to the development of context-aware human-machine interaction systems. This problem lies at the intersection of two research fields: interactive systems and contextual information sources. Traditionally, both fields have been integrated through domain-specific vertical solutions that allow interactive systems to access contextual information without having to deal with low-level procedures, but that restrict their interoperability with other applications and heterogeneous data sources. It is therefore essential to promote interoperable solutions that provide access to real-world information through homogeneous procedures. This issue fits squarely within the scenarios of "Ubiquitous Computing" and the "Internet of Things", which point toward a future in which many of the objects around us will be able to acquire meaningful information about their environment and communicate it to other objects and to people. Since interactive systems are able to obtain information about their environment through interaction with the user, they can play an important role in this scenario, both as consumers of real-world data and as producers of enriched information. This Thesis addresses the integration of both fields within this technological scenario. To this end, we first analysed the most important initiatives for the definition and design of interactive systems, together with the main infrastructures for providing information. On the basis of this study, the W3C SCXML language is proposed both for the design of interactive systems and for the processing of data provided by different context sources. This work shows how the SCXML capabilities for combining information from different modalities can also be used to process and integrate contextual information from heterogeneous sensor sources, and therefore to develop context-aware interaction systems. Likewise, the Sensor Web initiative, and its semantic extension, the Semantic Sensor Web, is presented as a suitable approach for providing uniform access to, and delivery of, information for context-aware interactive systems. We then analysed the challenges involved in integrating both types of initiatives, SCXML and the (Semantic) Sensor Web, and identified a set of functionalities that must be implemented to carry out this integration. Using technologies that bring flexibility to the implementation process and that build on current recommendations and standards, we implemented a series of experimental developments integrating the identified functionalities. Finally, in order to validate our approach, we conducted a set of experiments in a testing environment that simulates a driving scenario. In this environment, an interactive system communicates with a semantic extension of a Telco platform, based on the Sensor Web standards, in order to acquire contextual information and to publish the observations that the user provides to the system.
The results demonstrate the feasibility of using the SCXML language to design context-aware interactive systems that need to access advanced sensor platforms to consume and publish information while interacting with the user. They also show how the use of semantic technologies in the querying and publishing of sensor data can facilitate the reuse, by any type of application, of the information published in Sensor Web infrastructures, and thus contribute to realizing the future scenario of the "Internet of Things".
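As a rough illustration of the consume/publish loop described above (this is not the thesis implementation; the endpoint URL, event names and procedure identifiers are purely hypothetical), a minimal Python sketch of an SCXML-style state machine that both queries and publishes contextual observations might look like this:

```python
# Minimal sketch (illustrative assumptions only): an SCXML-style state machine,
# modelled here in Python, that consumes user events, queries a hypothetical
# Sensor Web style endpoint, and publishes observations back to it.
import json
import urllib.request

SOS_ENDPOINT = "http://example.org/sos"  # hypothetical observation service URL

def get_observation(procedure: str) -> dict:
    """Query the (hypothetical) observation endpoint for the latest value."""
    req = urllib.request.Request(f"{SOS_ENDPOINT}/observations?procedure={procedure}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def publish_observation(procedure: str, value: float) -> None:
    """Publish an observation produced by the interactive system."""
    body = json.dumps({"procedure": procedure, "result": value}).encode()
    req = urllib.request.Request(f"{SOS_ENDPOINT}/observations", data=body, method="POST")
    urllib.request.urlopen(req)

# A tiny state machine in the spirit of SCXML: states, transitions on events,
# and actions that consume or produce contextual information.
TRANSITIONS = {
    ("idle", "user.asks_weather"): ("answering", lambda: get_observation("air_temperature")),
    ("answering", "user.reports_rain"): ("idle", lambda: publish_observation("user_report_rain", 1.0)),
}

def step(state: str, event: str) -> str:
    """Apply one transition: run its action (if any) and return the next state."""
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    if action:
        action()
    return next_state
```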
Abstract:
Background Gray-scale images make up the bulk of the data in biomedical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is to develop new algorithms in a language like C++, which gives the developer full control over how memory is handled; however, the resulting workflow for prototyping new algorithms is rather time-intensive, and it is also not appropriate for a researcher with little or no knowledge of software development. Another alternative is to use command-line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only a few tools exist that provide this kind of processing interface; they are usually quite task-specific, and they do not offer a clear path for turning a prototype shell script into a new command-line tool. Results The proposed framework, MIA, provides a combination of command-line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype by using the corresponding shell scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command-line tools makes it easy to extend MIA, usually without the need to touch or recompile existing code. Conclusion In this article, we describe the general design of MIA, a general-purpose framework for gray-scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command-line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
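To make the described workflow concrete (small single-task command-line tools chained together, with intermediate results kept on disk so memory stays small), here is a minimal sketch written with Python's subprocess module instead of a shell script; the tool names "mia-filter" and "mia-register" and their options are hypothetical placeholders, not MIA's actual command-line interface:

```python
# Illustrative sketch only: chaining hypothetical single-task command-line tools,
# with every intermediate result written to disk rather than held in memory.
import subprocess

def run(cmd):
    """Run one command-line step and fail loudly if it does not succeed."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Each step reads its input from disk and writes its output back to disk.
run(["mia-filter", "--in", "input.png", "--out", "denoised.png",
     "--filter", "median:w=2"])          # string-based filter description (assumed syntax)
run(["mia-register", "--ref", "reference.png", "--src", "denoised.png",
     "--out", "registered.png"])         # hypothetical registration tool
```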
Abstract:
Collaborative hardening and hardware redundancy are nowadays the most interesting solutions in terms of the fault tolerance achieved and the low extra cost imposed on the project budget. Thanks to the powerful and cheap digital devices available on the market, extra processing capabilities can be used for redundant tasks, not only in early data processing (sensed data) but also in routing and interfacing.
Abstract:
Glacier thermo-mechanical models are defined by systems of partial differential equations stating the basic principles of conservation of mass, momentum and energy, accompanied by a constitutive law that defines the relationship between the stresses acting on the glacier ice and the resulting deformations. The solution of these equations requires an accurate definition of the model domain (the geometry of the glacier, obtained from topographic and ground-penetrating radar measurements), as well as a set of boundary conditions, which are obtained from field measurements of the variables involved and constitute a set of geospatial data. The main objective of this thesis is to develop a set of tools that provide an accurate definition of the glacier geometry and a proper set of values for the variables to be used as boundary conditions of the problem.
With this aim, the thesis addresses the collection, integration and study, within a geographic information system, of the geospatial data existing for the Hurd Peninsula on Livingston Island, Antarctica, generated from 1957 to the present. The correct handling and processing of these data yields a new collection of elements that allow us to numerically simulate the present thermo-mechanical regime of the Hurd Peninsula glaciers, as well as their future evolution. First, a complete inventory of geospatial data is compiled and the data captured in the field are processed in order to establish a reference system common to all of them. All data are also stored under a common standard format for information storage and exchange, and the corresponding metadata are generated. Techniques are likewise developed to improve the data capture and processing procedures, so that errors are minimized and reliable error estimates become available. Integrating all the information into a geographic information system (once it has been standardized and inventoried) allows it to be consulted quickly and easily by third parties, and makes it possible to carry out a series of operations leading to new layers of information. The analysis of these new data helps to explain the past behaviour of the glaciers under study and provides essential elements for simulating their future behaviour.
Abstract:
The aim of this final-year project is to study tropospheric scintillation in the presence of rain in the Ka band. For this purpose, the data obtained from an Earth-satellite link by a receiving station located at the ETSIT UPM have been used. In the first part of the project, a brief theoretical study of the main phenomena affecting this type of link is presented, commenting on some prediction models and looking in more depth at tropospheric scintillation, since this is the main topic of the project. The second part describes the experiment. First, it is verified that the -20 dB/dec spectral slope due to rain is indeed observed, by processing the data recorded at the ETSIT with Matlab. Then, different types of events are analysed individually in order to determine a suitable cut-off frequency for separating, by filtering, the rain components from the scintillation components, and to verify that the -26 dB/dec spectral slope due to scintillation is also satisfied. Once a suitable cut-off frequency for the filter has been obtained, the spectral components due to scintillation are separated from those due to rain. The variance of a series of events is then computed, since it is the parameter that best characterizes the scintillation intensity, and it is compared across the different months to observe its variation, as summer rains are associated with stronger turbulence than winter rains. Finally, the conclusions reached in the project are presented, together with possible future lines of research.
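As a rough illustration of the separation step described above (high-pass filtering the received level at a cut-off frequency so that the slow rain attenuation is removed and the scintillation component remains, whose variance is then computed), here is a minimal Python sketch; the sampling rate and cut-off value are illustrative assumptions, not the ones determined in the project, which used Matlab:

```python
# Illustrative sketch: separate scintillation from rain attenuation by high-pass
# filtering, then compute the scintillation variance. Parameters are assumed.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0       # samples per second (assumed)
f_cut = 0.02   # Hz, illustrative cut-off between rain and scintillation components

def scintillation_variance(signal_db: np.ndarray) -> float:
    """High-pass filter the beacon level (in dB) and return the scintillation variance."""
    b, a = butter(4, f_cut / (fs / 2), btype="highpass")
    scint = filtfilt(b, a, signal_db)
    return float(np.var(scint))

# Example with synthetic data: a slow rain fade plus fast fluctuations.
t = np.arange(0, 3600, 1 / fs)
beacon = -0.5 * np.exp(-((t - 1800) / 600) ** 2) + 0.05 * np.random.randn(t.size)
print(scintillation_variance(beacon))
```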
Abstract:
Embryogenesis is the process by which a single cell turns into a living organism. Through several stages of development, the cell population proliferates while the embryo takes shape and the organs develop and gain their functionality. This is possible thanks to genetic, biochemical and mechanical factors that are involved in a complex interaction of processes organized at different levels and at different spatio-temporal scales. Despite this complexity, embryogenesis proceeds in a robust and reproducible way, yet with a degree of variability that makes the diversity of individuals of the same species possible. Advances in the optics of microscopes and the appearance of fluorescent proteins that can be attached to expression chains, reporting on structural and functional elements of the cell, have enabled the in-vivo observation of embryogenesis. The imaging process yields sequences of 3D+time data of high spatio-temporal resolution, a digital representation of the embryo that can be further analysed, provided new image processing and data analysis techniques are developed. One of the most relevant and challenging lines of research in the field is the quantification of the mechanical factors and processes involved in the shaping of the embryo, and of their interactions with other embryogenesis factors such as gene expression. Due to the complexity of these processes, studies have typically focused on specific problems and scales controlled in the experiments, posing and testing hypotheses to gain new biological insight; however, the resulting methodologies are often difficult to export to other biological phenomena or specimens. This PhD Thesis is framed within this paradigm of research and proposes a systematic methodology to quantify the deformation patterns that emerge from the organized motion of cells during different stages of embryo development, either globally or in specific tissues, as estimated from in-vivo images. Thanks to this strategy it becomes possible to quantify not only local mechanisms, but also to discover and characterize the scales of mechanical organization within the embryo. The framework focuses on the quantification of the motion kinematics (deformation and strain) from images in a non-invasive way, setting aside the causes of the motion (forces), since experimental and methodological challenges still hamper the quantification of exerted forces and of the mechanical properties of tissues.
However, a descriptive framework of deformation patterns provides valuable insight into the organization and scales of the mechanical interactions along embryo development, and such a characterization helps to improve mechanical models and to progressively understand the complexity of embryogenesis. The framework relies on a Lagrangian representation of the cell dynamics, based on the trajectories of material points moving with the deformation rather than on fixed spatial points, estimated from the images by cell tracking or image registration techniques. This approach enables the reconstruction of the mechanical patterning as experienced by the cells and tissues, so that temporal profiles of deformation can be built along the stages of development, comprising both the instantaneous events and the cumulative deformation history with respect to an initial state, for any domain of the embryo. The application of this framework to 3D+time data of zebrafish embryogenesis revealed mechanical profiles that stabilize over time, forming structures that organize at a scale comparable to that of the cell differentiation map (fate map) and that show hints of correlation with genetic expression patterns. The framework was also applied to the analysis of the amnioserosa tissue during Drosophila dorsal closure, revealing that the oscillatory contraction triggered by the acto-myosin network is organized through a complex coupling of different scales: local force-generation foci, cellular morphology control mechanisms, and tissue geometrical constraints. In summary, this PhD Thesis proposes a novel framework for the analysis of multi-scale cell dynamics that makes it possible to quantify mechanical patterns automatically, and that offers a representation of the embryo dynamics as experienced by the cells rather than as instantaneously captured by the microscope. This framework therefore enables new strategies for the quantitative analysis and comparison of embryos and tissues during embryogenesis from in-vivo images.
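To make the Lagrangian description of the kinematics concrete, the standard continuum-mechanics quantities that such a framework can compute from tracked trajectories are, in the usual textbook notation (not necessarily the exact discretization used in the thesis):

```latex
% Material points X are followed along their trajectories x = \varphi(X, t).
% Deformation gradient and cumulative Green-Lagrange strain:
F(X,t) = \frac{\partial \varphi(X,t)}{\partial X}, \qquad
E(X,t) = \tfrac{1}{2}\bigl(F^{\mathsf{T}} F - I\bigr).
% Instantaneous deformation: rate-of-deformation tensor built from the velocity
% field v estimated by cell tracking or image registration:
D = \tfrac{1}{2}\bigl(\nabla v + (\nabla v)^{\mathsf{T}}\bigr).
```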
Abstract:
This dissertation presents the development of an autonomous inertial platform with three degrees of freedom for the stabilization of sensors, for example stationary and on-board gravimetric sensors, which can also be used for camera stabilization. The system comprises an Inertial Measurement Unit (IMU), built around a micro-electromechanical (MEMS) sensor containing a three-axis accelerometer, gyroscope and magnetometer, and a microcontroller for acquiring, processing and sending the data to the control and data acquisition system. To control the tilt and orientation angles of the platform, a digital PID controller was implemented on a microcontroller, which receives the IMU data and generates the control signals through PWM outputs that drive the motors controlling the platform position. To monitor the platform, a real-time data acquisition program was developed in the Matlab environment, through which the IMU signals, the tilt angles and the angular velocity can be visualized and recorded. A radio-frequency data transmission link between the IMU and the data acquisition and control system was tested, in order to evaluate the possibility of dispensing with slip rings or wires between the rotation axis and the platform frames. However, this transmission proved unfeasible because of the low transmission rate and the noise picked up by the radio-frequency receiver during the movements of the platform. Two twisted-wire pairs were therefore used to connect the inertial sensor to the acquisition and processing system.
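A minimal sketch of the control loop described above is given below: a discrete PID controller that reads a tilt angle from the IMU and writes a PWM duty cycle driving the motor. The I/O functions, gains and control period are illustrative placeholders, not the dissertation's firmware (which runs on a microcontroller rather than in Python):

```python
# Illustrative discrete PID loop, assuming stubbed IMU and PWM interfaces.
import time

KP, KI, KD = 2.0, 0.5, 0.1   # assumed gains; tuned per axis in practice
DT = 0.01                    # control period in seconds (100 Hz, assumed)

def read_tilt_deg() -> float:
    """Stub for the tilt angle estimated from the MEMS accelerometer/gyroscope."""
    return 0.0  # replace with the real IMU reading

def write_pwm(duty: float) -> None:
    """Stub for the PWM output driving the motor (duty in [-1, 1])."""
    print(f"PWM duty: {duty:+.3f}")  # replace with the real PWM write

def pid_loop(setpoint_deg: float = 0.0, steps: int = 100) -> None:
    integral, prev_error = 0.0, 0.0
    for _ in range(steps):
        error = setpoint_deg - read_tilt_deg()
        integral += error * DT
        derivative = (error - prev_error) / DT
        prev_error = error
        duty = KP * error + KI * integral + KD * derivative
        write_pwm(max(-1.0, min(1.0, duty)))   # saturate the command
        time.sleep(DT)

pid_loop()
```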
Abstract:
In this paper, the model of an Innovative Monitoring Network of properly connected nodes is proposed, intended to develop an Information and Communication Technology (ICT) solution for the preventive maintenance of historical centres based on early warnings. It is well known that the protection of historical centres generally ranges from large-scale monitoring down to local monitoring, and it could be supported by a single ICT solution. In more detail, the model of a virtually organized monitoring system could enable the implementation of automated analyses presenting various alert levels. An adequate ICT tool would make it possible to define a monitoring network for the shared processing of data and results. Thus, a possible retrofit solution could be planned for pilot cases shared among the nodes of the network, on the basis of a suitable procedure using a retrofit catalogue. The final objective is to provide a model of an innovative tool to identify hazards, damage and possible retrofit solutions for historical centres, ensuring easy early-warning support for stakeholders. The action could proactively target the needs and requirements of users, such as decision makers responsible for damage mitigation and for safeguarding cultural heritage assets.
Abstract:
Mode of access: Internet.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
We present an application of Mathematical Morphology (MM) to the classification of astronomical objects, both for star/galaxy differentiation and for galaxy morphology classification. We demonstrate that, for CCD images, 99.3 +/- 3.8% of galaxies can be separated from stars using MM, with 19.4 +/- 7.9% of the stars being misclassified. We demonstrate that, for photographic plate images, the number of galaxies correctly separated from the stars can be increased using our MM diffraction spike tool, which allows 51.0 +/- 6.0% of the high-brightness galaxies that are inseparable with current techniques to be correctly classified, with only 1.4 +/- 0.5% of the high-brightness stars contaminating the population. We demonstrate that elliptical (E) and late-type spiral (Sc-Sd) galaxies can be classified using MM with an accuracy of 91.4 +/- 7.8%. The method involves fewer 'free parameters' than current techniques, especially automated machine learning algorithms. The limitations of MM galaxy morphology classification imposed by seeing and distance are also presented. We examine various star/galaxy differentiation and galaxy morphology classification techniques commonly used today, and show that our MM techniques compare very favourably.
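As a rough illustration of the kind of morphological operation involved (not the paper's actual pipeline or parameters), a grey-scale opening with a small structuring element suppresses point-like sources such as stars while extended sources such as galaxies survive; the threshold and structuring-element size below are arbitrary assumptions:

```python
# Illustrative sketch, not the paper's method: separate compact (star-like) from
# extended (galaxy-like) sources with a grey-scale morphological opening.
import numpy as np
from skimage.morphology import opening, disk

def extended_source_mask(image: np.ndarray, radius: int = 3, thresh: float = 0.1) -> np.ndarray:
    """Return a boolean mask of pixels belonging to extended sources.

    Opening with a disk of the given radius removes structures smaller than the
    disk (point-like stars); what remains above the threshold is treated as an
    extended source.
    """
    opened = opening(image, disk(radius))
    return opened > thresh

# Synthetic example: a broad Gaussian "galaxy" plus a single bright "star" pixel.
yy, xx = np.mgrid[0:64, 0:64]
galaxy = np.exp(-(((xx - 20) ** 2 + (yy - 20) ** 2) / 50.0))
image = galaxy.copy()
image[45, 45] = 5.0                    # point source
mask = extended_source_mask(image)
print(mask[20, 20], mask[45, 45])      # True (galaxy core survives), False (star removed)
```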
Abstract:
The population and school communities have been pressing the authorities of the Justice and Education systems for a coherent and effective intervention in the resolution of conflicts arising in the school environment, since the current, retributive model of justice has not remedied the situation. Restorative justice thus emerges as a new way of facing this problem, and one of its strategies is the restorative circle, characterized as a group for restoring relationships and resolving conflicts. This research aims to describe and analyse the structural elements of restorative circles and the phenomena of the group field in restorative processes carried out in the school environment to intervene in conflict situations. The sample consisted of five restorative practices involving a pre-circle, a circle and a post-circle mediated by a facilitator and two co-facilitators. The data were analysed on the basis of the structural elements of restorative justice (opening and closing ceremony, talking piece, and consensual decision-making process), considering these elements equivalent to the psychoanalytic setting, since in this specific case their purpose is to make clear to the participants of the group how the group is intended to work; and on the basis of a content analysis organized around pre-defined categories drawn from psychoanalytic concepts (resistance, acting out, and insight/working-through). The results showed that structural elements (setting) favourable to the participants' encounter were established and that positive aspects predominated in the group field, which resulted in a good re-establishment of coexistence in all the cases analysed. The structural elements established for the restorative circle created a safe space in which the participants bonded in a positive way, even in a conflict situation. The figure of the facilitator (a psychologist) is considered an important part of achieving the resolution of the conflict. It is concluded that the containing function, together with the handling and understanding of resistances, acting out and insights, contributed to the group field configuring itself as cohesion rather than disintegration. Finally, it is worth adding that the experience showed that children and adolescents respond very well when invited to take part in a restorative circle, where they learn to act in accordance with the values experienced, as in an educational process.
Abstract:
Recently there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, there is no general consensus as to how best to process sequences using topographic maps, and this topic remains an active focus of neurocomputational research. The representational capabilities and internal representations of the models are not well understood. Here, we rigorously analyze a generalization of the self-organizing map (SOM) for processing sequential data, the recursive SOM (RecSOM) (Voegtlin, 2002), as a nonautonomous dynamical system consisting of a set of fixed input maps. We argue that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed-input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (nonadaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g., SOM). However, by allowing trainable feedback connections, one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate on the importance of non-Markovian organizations in topographic maps of sequential data. © 2006 Massachusetts Institute of Technology.
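For reference, the fixed-input map analysed in such studies has the standard RecSOM form introduced by Voegtlin (2002); the specific bound on β derived in the paper is not reproduced here:

```latex
% RecSOM activation of unit i at time t: s(t) is the current input, y(t-1) the
% previous activation vector, w_i and c_i the input and context weight vectors.
y_i(t) = \exp\!\left( -\alpha \,\lVert s(t) - w_i \rVert^{2}
                      \;-\; \beta \,\lVert y(t-1) - c_i \rVert^{2} \right)
% For a fixed input s this defines a map F_s : y(t-1) \mapsto y(t); the analysis
% asks when F_s is a contraction (Lipschitz constant below one), which yields an
% upper bound on \beta guaranteeing Markovian organization of receptive fields.
```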
Abstract:
Recently, there has been considerable research activity in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, the representational capabilities and internal representations of the models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, the Recursive SOM (RecSOM [1]), as a non-autonomous dynamical system consisting of a set of fixed input maps. We show that contractive fixed input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on the parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed input maps is guaranteed.
Abstract:
The internal quantum efficiency (IQE) of a blue high-brightness InGaN/GaN light-emitting diode (LED) was evaluated from the external quantum efficiency measured as a function of current at various temperatures ranging between 13 and 440 K. Processing the data with a novel evaluation procedure based on the ABC-model, we have determined the temperature-dependent IQE of the LED structure and the light extraction efficiency of the LED chip. Separate evaluation of these parameters is helpful for further optimization of the heterostructure and chip designs. The data obtained enable an estimate of the temperature dependence of the radiative and Auger recombination coefficients, which may be important for identifying the dominant mechanisms responsible for the efficiency droop in III-nitride LEDs. Thermal degradation of the LED performance in terms of emission efficiency is also considered.
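For context, evaluations of this kind build on the standard ABC recombination model; the paper's specific fitting procedure is more elaborate, but the basic relations are:

```latex
% Total recombination rate as a function of carrier density n:
R(n) = A n + B n^{2} + C n^{3}
% A: Shockley-Read-Hall, B: radiative, C: Auger recombination coefficients.
% Internal quantum efficiency is the radiative fraction:
\mathrm{IQE}(n) = \frac{B n^{2}}{A n + B n^{2} + C n^{3}}
% The measured external quantum efficiency relates to it (neglecting injection
% losses) through the light extraction efficiency \eta_{\mathrm{extr}}:
\mathrm{EQE} \approx \eta_{\mathrm{extr}} \cdot \mathrm{IQE}
```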