920 results for Data-representation


Relevance:

60.00%

Publisher:

Abstract:

Embedded context management in resource-constrained devices (e.g. mobile phones, autonomous sensors or smart objects) imposes special requirements in terms of lightness for data modelling and reasoning. In this paper, we explore the state of the art in data representation and reasoning tools for embedded mobile reasoning and propose a light inference system (LIS) aimed at simplifying embedded inference processes by offering a set of functionalities that avoid redundancy in context management operations. The system is part of a service-oriented mobile software framework, conceived to facilitate the creation of context-aware applications: it decouples sensor data acquisition and context processing from the application logic. LIS, composed of several modules, encapsulates existing lightweight tools for ontology data management and rule-based reasoning, and it is ready to run on Java-enabled handheld devices. Data management and reasoning processes are designed to handle a general ontology that enables communication among framework components. Both the applications running on top of the framework and the framework components themselves can configure the rule and query sets in order to retrieve the information they need from LIS. To test LIS features in a real application scenario, an "Activity Monitor" has been designed and implemented: a personal health-persuasive application that provides feedback on the user's lifestyle, combining data from physical and virtual sensors. In this use case, LIS is used to evaluate the user's activity level in a timely manner, to decide on the convenience of triggering notifications and to determine the best interface or channel to deliver these context-aware alerts.
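As a hedged illustration only (not LIS itself, whose rule engine and ontology tooling the abstract does not detail), the following sketch shows the general shape of the configurable rule-based context reasoning described above; the rule contents and context keys are invented for the example:

```python
# Minimal sketch of rule-based context reasoning: configurable
# condition/action rules evaluated against a context dictionary.
# Rules and context keys are hypothetical, not taken from LIS.
def evaluate_rules(context, rules):
    """Return the actions of all rules whose condition holds."""
    actions = []
    for condition, action in rules:
        if condition(context):
            actions.append(action)
    return actions

rules = [
    (lambda c: c["activity_level"] < 0.3, "trigger_notification"),
    (lambda c: c["screen_on"], "deliver_via_screen"),
]
decided = evaluate_rules({"activity_level": 0.2, "screen_on": True}, rules)
```

Applications configuring different rule sets would simply pass a different `rules` list, mirroring how the abstract describes framework components retrieving what they need from the inference system.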

Relevance:

60.00%

Publisher:

Abstract:

Large-scale transport infrastructure projects such as high-speed rail (HSR) produce significant effects on the spatial distribution of accessibility. These effects, commonly known as territorial cohesion effects, are receiving increasing attention in the research literature. However, there is little empirical research into the sensitivity of these cohesion results to methodological issues such as the definition of the limits of the study area or the zoning system. In a previous paper (Ortega et al., 2012), we investigated the influence of scale issues, comparing the cohesion results obtained at four different planning levels. This paper makes an additional contribution to our research with the investigation of the influence of zoning issues. We analyze the extent to which changes in the size of the units of analysis influence the measurement of spatial inequalities. The methodology is tested by application to the Galician (north-western) HSR corridor, with a length of nearly 670 km, included in the Spanish PEIT (Strategic Transport and Infrastructure Plan) 2005-2020. We calculated the accessibility indicators for the Galician HSR corridor and assessed their corresponding territorial distribution. We used five alternative zoning systems depending on the method of data representation used (vector or raster), and the level of detail (cartographic accuracy or cell size). Our results suggest that the choice between a vector-based and raster-based system has important implications. The vector system produces a higher mean accessibility value and a more polarized accessibility distribution than raster systems. The increased pixel size of raster-based systems tends to give rise to higher mean accessibility values and a more balanced accessibility distribution. Our findings strongly encourage spatial analysts to acknowledge that the results of their analyses may vary widely according to the definition of the units of analysis.
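The zoning sensitivity discussed above (a form of the modifiable areal unit problem) can be illustrated with a minimal synthetic sketch: aggregating fine-grained accessibility values into coarser cells preserves the overall mean here but flattens the spatial spread, which is the kind of distributional shift the paper measures. The values and the 2x2 block-aggregation scheme are invented for the example:

```python
# Synthetic illustration of the zoning effect: coarser cells smooth
# spatial inequality in accessibility values. Data are invented.
import numpy as np

fine = np.array([[10., 90., 20., 80.],
                 [15., 85., 25., 75.],
                 [12., 88., 22., 78.],
                 [18., 82., 28., 72.]])   # fine-grained accessibility grid

# Aggregate each 2x2 block into one coarse cell (block mean).
coarse = fine.reshape(2, 2, 2, 2).mean(axis=(1, 3))
```

With equal-sized cells the area-weighted mean is preserved, but the dispersion (and hence any cohesion indicator built on it) changes sharply; with irregular zones the mean shifts as well.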

Relevance:

60.00%

Publisher:

Abstract:

This project aims to study the behaviour of light as it crosses media made of various materials, both isotropic and uniaxial anisotropic. This requires a preliminary study of the boundary conditions applicable to Maxwell's equations at the interface between two media, which may be isotropic or anisotropic. For two isotropic materials, the solution to the problem is given by the well-known Fresnel reflection and transmission coefficients. This work aims to generalize the study to the case of light passing from an isotropic medium to a uniaxial anisotropic one (with its optic axis in an arbitrary orientation) and vice versa, and to the case of two uniaxial anisotropic materials with optic axes in arbitrary orientations. Of particular interest is the case of a single uniaxial material in which the two parts have their optic axes in different orientations. Once the specific boundary conditions for each case are established, a set of algebraic equations is obtained whose solution yields the desired reflection and transmission coefficients. To formulate the appropriate system of equations, a description of the optical characteristics of the materials used, the orientation of the optic axes in each case, and the possible angles of incidence will be needed. A matrix treatment will be used so that the MATLAB package can invert the system immediately. A simple interface, built with MATLAB, will be developed to let the user easily enter the data for the materials of the incident and transmitted media, the spatial orientation of the optic axis or axes, the working wavelength and the angle of incidence of the light beam, with which the application will perform the calculations. The reflection and refraction coefficients obtained will be plotted as a function of the angle of incidence.
The transmitted and reflected angles will likewise be plotted as a function of the angle of incidence, all presented so that the user can interpret the data easily. ABSTRACT. The purpose of this project is to study the behavior of light when it crosses media of different materials, both isotropic and uniaxial anisotropic. For this, a preliminary study is necessary in which the boundary conditions are applied to Maxwell's equations at the interface between two media, which can be isotropic or anisotropic. If both materials are isotropic, the Fresnel coefficients of reflection and refraction solve the problem. The aim of this work is to generalize the study to light crossing from an isotropic medium to a uniaxial anisotropic medium, whose optic axis has an arbitrary direction, and vice versa. The system consisting of two anisotropic materials with axes in arbitrary directions is also studied. Once the specific boundary conditions are known in each case, a set of algebraic equations is obtained whose solution yields the reflection and refraction coefficients. It is necessary to have a description of the optical characteristics of the materials used, of the axis directions in each case and of the possible angles of incidence. A matrix formulation is proposed for treatment in MATLAB, which allows immediate inversion. A simple interface will be developed in MATLAB that allows the user to easily enter the data corresponding to the incident and transmission media, the spatial directions of the optic axes, the wavelength and the angle of incidence of the light beam. This data is used by the application to perform the calculations needed to solve the problem. Once the reflection and refraction coefficients are obtained, the application plots them as a function of the angle of incidence. The transmitted and reflected angles are also represented as a function of the angle of incidence, giving a data representation that the user can interpret easily.

Relevance:

60.00%

Publisher:

Abstract:

This paper reports further results of ongoing research analyzing the impact of a range of commonly used statistical and semantic features in the context of extractive text summarization. The features experimented with include word frequency, inverse sentence and term frequencies, stopword filtering, word senses, resolved anaphora and textual entailment. The results obtained demonstrate the relative importance of each feature and the limitations of the tools available. It has been shown that the inverse sentence frequency combined with the term frequency yields almost the same results as the latter combined with stopword filtering, which in turn proved to be a highly competitive baseline. To improve the suboptimal results of anaphora resolution, the system was extended with a second anaphora resolution module. The present paper also describes the first attempts at an internal document data representation.
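A minimal sketch, with an invented toy corpus and stopword list, of the kind of term-frequency sentence scoring with stopword filtering that the abstract reports as a competitive baseline:

```python
# Toy extractive-summarization baseline: score each sentence by the
# average corpus term frequency of its non-stopword tokens.
# Corpus and stopword list are invented for illustration.
from collections import Counter

STOPWORDS = {"the", "a", "of", "is", "in", "on", "and", "to", "it"}

def tokens(sentence):
    return [w.strip(".,").lower() for w in sentence.split()
            if w.strip(".,").lower() not in STOPWORDS]

def score_sentences(sentences):
    tf = Counter(w for s in sentences for w in tokens(s))
    return [sum(tf[w] for w in tokens(s)) / max(len(tokens(s)), 1)
            for s in sentences]

doc = ["The cat sat on the mat.", "Dogs bark.", "The cat chased the cat."]
scores = score_sentences(doc)   # highest-scoring sentences form the summary
```

An extractive summarizer would then emit the top-scoring sentences; the inverse sentence frequency variant the paper compares against would reweight each `tf[w]` by how few sentences contain `w`.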

Relevance:

60.00%

Publisher:

Abstract:

MULTIPRED is a web-based computational system for the prediction of peptide binding to multiple molecules (proteins) belonging to human leukocyte antigen (HLA) class I A2, A3 and class II DR supertypes. It uses hidden Markov models and artificial neural network methods as predictive engines. A novel data representation method enables MULTIPRED to predict peptides that promiscuously bind multiple HLA alleles within one HLA supertype. Extensive testing was performed for validation of the prediction models. Testing results show that MULTIPRED is both sensitive and specific and has good predictive ability (area under the receiver operating characteristic curve, AROC > 0.80). MULTIPRED can be used for the mapping of promiscuous T-cell epitopes as well as regions with a high concentration of these targets, termed T-cell epitope hotspots. MULTIPRED is available at http://antigen.i2r.a-star.edu.sg/multipred/.

Relevance:

60.00%

Publisher:

Abstract:

In recent years, interest in digital watermarking has grown significantly. Indeed, the use of digital watermarking techniques is seen as a promising means to protect the intellectual property rights of digital data and to ensure its authentication. Thus, a significant research effort has been devoted to the study of practical watermarking systems, in particular for digital images. In this thesis, a practical and principled approach to the problem is adopted. Several aspects of practical watermarking schemes are investigated. First, a power-constraint formulation of the problem is presented. Then, a new analysis of quantisation effects on the information rate of digital watermarking schemes is proposed and compared to other approaches suggested in the literature. Subsequently, a new information embedding technique, based on quantisation, is put forward and its performance evaluated. Finally, the influence of image data representation on the performance of practical schemes is studied, along with a new representation based on independent component analysis.

Relevance:

60.00%

Publisher:

Abstract:

Ironically, the "learning of percent" is one of the most problematic aspects of school mathematics. In our view, these difficulties are not associated with the arithmetic aspects of "percent problems", but mostly with two methodological issues: firstly, providing students with a simple and accurate understanding of the rationale behind the use of percent, and secondly, overcoming the psychological complexities of a fluent and comprehensive understanding of the sometimes specific wordings of "percent problems". Before we talk about percent, it is necessary to acquaint students with the much more fundamental and important (regrettably, not covered by the school syllabus) classical concepts of quantitative and qualitative comparison of values, to give students the opportunity to learn the relevant standard terminology and become accustomed to conventional turns of speech. Further, it makes sense to briefly touch on the issue (important in its own right) of different representations of numbers. Percent is just one of the technical, but common, forms of data representation:

p% = p × % = p × 0.01 = p × 1/100 = p/100 = p × 10⁻²

"Percent problems" involve just two cases:

I. The ratio of a variation m to the standard M
II. The relative deviation of a variation m from the standard M

The hardest and most essential part of each specific "percent problem" is not the routine arithmetic involved, but the ability to figure out, and clearly understand, which of the variables in the problem statement is the standard and which is the variation. This, first and foremost, is what teachers need to patiently and persistently teach their students. As a matter of fact, most primary school pupils are not yet quite ready for the lexical specificity of "percent problems". Math teachers should closely, hand in hand with their students, carry out a linguistic analysis of the wording of each problem.
Schoolchildren must firmly understand that a comparison of objects is only meaningful when we speak about properties which can be objectively expressed in terms of actual numerical characteristics. In our opinion, an adequate acquisition of the teaching unit on percent cannot be achieved in primary school, due to objective psychological specificities related to this age and to the level of general training of students. Yet if we want to make this topic truly accessible and practically useful, it should be taught in high school. A final question to the reader (quickly, please): what is greater, π% of e or e% of π?
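The closing teaser can be settled with a few lines applying the representation p% = p/100 given above: the two quantities are equal, since both reduce to πe/100.

```python
# Check the closing question: is π% of e greater than e% of π?
import math

def percent_of(p, x):
    # p% of x, using the representation p% = p/100 from the text.
    return p / 100 * x

a = percent_of(math.pi, math.e)   # π% of e
b = percent_of(math.e, math.pi)   # e% of π
# Both equal π·e/100, so neither is greater.
```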

Relevance:

60.00%

Publisher:

Abstract:

Temporal dynamics of Raman fibre lasers tend to have a very complex nature, owing to great cavity lengths and high nonlinearity, being stochastic on short time scales and quasi-continuous on longer time scales. Generally, fibre laser intensity dynamics is represented by a one-dimensional time series, which in the case of quasi-continuous-wave generation in Raman fibre lasers gives little insight into the processes underlying the operation of the laser. New methods of analysis and data representation could help to uncover the underlying physical processes, understand the dynamics or improve the performance of the system. Using the intrinsic periodicity of laser radiation, the one-dimensional intensity time series of a Raman fibre laser was analysed over fast and slow variation times. This made it possible to experimentally observe various spatio-temporal regimes of generation, such as laminar, turbulent and partial mode-locking, as well as transitions between them, and to identify the mechanisms responsible for the transitions. Great cavity length and high nonlinearity also make it difficult to achieve stable high-repetition-rate mode-locking in Raman fibre lasers. Using Faraday parametric instability in an extremely simple linear-cavity experimental configuration, very high order harmonic mode-locking was achieved in a ò.ò km long Raman fibre laser. The maximum achieved pulse repetition rate was 12 GHz, with 7.3 ps long Gaussian-shaped pulses. There is a new type of random laser, the random distributed feedback Raman fibre laser, whose temporal properties cannot be controlled by conventional mode-locking or Q-switching techniques and mechanisms. By adjusting the pump configuration, very stable pulsed operation of a random distributed feedback Raman fibre laser was achieved. Pulse duration varied in the range from 50 to 200 μs depending on the pump power and the cavity length. The scaling of the pulse repetition rate with the parameters of the system was experimentally identified.
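The fast/slow time analysis mentioned above amounts to folding the one-dimensional intensity trace by the cavity round-trip period, so each row of the resulting 2-D array is one round trip (fast time) and the row index is slow time. A minimal sketch with synthetic data (the trace and period are invented):

```python
# Fold a 1-D intensity time series into a 2-D (slow time x fast time)
# representation using the round-trip period. Synthetic data only.
import numpy as np

def fold_time_series(intensity, samples_per_round_trip):
    """Reshape a 1-D trace into one row per cavity round trip."""
    n_trips = len(intensity) // samples_per_round_trip
    trimmed = intensity[: n_trips * samples_per_round_trip]
    return trimmed.reshape(n_trips, samples_per_round_trip)

rng = np.random.default_rng(0)
trace = rng.random(1050)              # synthetic intensity samples
grid = fold_time_series(trace, 100)   # 100 samples per round trip (assumed)
```

Plotted as an image, such a grid makes laminar stripes, turbulent patches and mode-locked pulse trains visually distinguishable, which is what the one-dimensional trace hides.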

Relevance:

60.00%

Publisher:

Abstract:

Over the past five years, XML has been embraced by both the research and industrial communities due to its promising prospects as a new data representation and exchange format on the Internet. The widespread popularity of XML creates an increasing need to store XML data in persistent storage systems and to enable sophisticated XML queries over the data. Currently available approaches to the XML storage and retrieval issue have limitations: native approaches are not yet mature, while non-native approaches (such as the relational database approach) cause inflexibility, heavy fragmentation and excessive join operations. In this dissertation, I studied the issue of storing and retrieving XML data using the Semantic Binary Object-Oriented Database System (Sem-ODB) to leverage the advanced Sem-ODB technology with the emerging XML data model. First, a meta-schema based approach was implemented to address the data model mismatch issue that is inherent in the non-native approaches. The meta-schema based approach captures the meta-data of both Document Type Definitions (DTDs) and Sem-ODB Semantic Schemas, and thus enables a dynamic and flexible mapping scheme. Second, a formal framework was presented to ensure precise and concise mappings. In this framework, both schemas and the conversions between them are formally defined and described. Third, after the major features of an XML query language, XQuery, were analyzed, a high-level XQuery to Semantic SQL (Sem-SQL) query translation scheme was described. This translation scheme takes advantage of the navigation-oriented query paradigm of Sem-SQL, and thus avoids the excessive-join problem of relational approaches. Finally, the modeling capability of the Semantic Binary Object-Oriented Data Model (Sem-ODM) was explored from the perspective of conceptually modeling an XML Schema using a Semantic Schema.
It was revealed that the advanced features of the Sem-ODB, such as multi-valued attributes, surrogates and the navigation-oriented query paradigm, among others, are indeed beneficial in coping with the XML storage and retrieval issue using a non-XML approach. Furthermore, extensions to the Sem-ODB to make it work more effectively with XML data were also proposed.
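For contrast, here is a minimal sketch of the relational "shredding" that the dissertation identifies as the source of fragmentation and join-heavy queries in non-native approaches: every XML node becomes a row in an edge relation, so reassembling even a small document requires joins. The document and the relation layout are invented for the example:

```python
# Shred an XML document into a flat edge relation
# (node_id, parent_id, tag, text) -- the generic relational storage
# scheme whose fragmentation the dissertation critiques.
import itertools
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<library><book id='1'><title>XML</title></book>"
    "<book id='2'><title>Sem-ODB</title></book></library>")

rows = []                     # the edge relation, one row per node
counter = itertools.count(1)  # surrogate node ids

def shred(node, parent_id):
    node_id = next(counter)
    rows.append((node_id, parent_id, node.tag, (node.text or "").strip()))
    for child in node:
        shred(child, node_id)

shred(doc, None)
```

Retrieving each `book` with its `title` from `rows` already needs a self-join on `parent_id`; a navigation-oriented model like Sem-SQL traverses such parent-child links directly instead.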

Relevance:

60.00%

Publisher:

Abstract:

To maintain the pace of development set by Moore's law, production processes in semiconductor manufacturing are becoming more and more complex. The development of efficient and interpretable anomaly detection systems is fundamental to keeping production costs low. As the dimension of process monitoring data can become extremely high, anomaly detection systems are impacted by the curse of dimensionality; hence dimensionality reduction plays an important role. Classical dimensionality reduction approaches, such as Principal Component Analysis, generally involve transformations that seek to maximize the explained variance. In datasets with several clusters of correlated variables, the contributions of isolated variables to the explained variance may be insignificant, with the result that they may not be included in the reduced data representation. It is then not possible to detect an anomaly if it is only reflected in such isolated variables. In this paper we present a new dimensionality reduction technique that takes account of such isolated variables and demonstrate how it can be used to build an interpretable and robust anomaly detection system for Optical Emission Spectroscopy data.
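A small synthetic sketch of the failure mode described above: with five strongly correlated variables and one isolated variable, the leading principal component captures the correlated cluster while giving the isolated variable a near-zero loading, so an anomaly confined to that variable would be invisible in the reduced representation. The data and dimensions are invented, and plain SVD stands in for a full PCA pipeline:

```python
# Demonstrate why variance-maximizing PCA can drop isolated variables:
# PC1 aligns with a cluster of correlated columns and nearly ignores
# an independent column. Synthetic data only.
import numpy as np

rng = np.random.default_rng(42)
base = rng.normal(size=(500, 1))
cluster = base + 0.05 * rng.normal(size=(500, 5))   # five correlated variables
isolated = rng.normal(size=(500, 1))                # one independent variable
X = np.hstack([cluster, isolated])
X = X - X.mean(axis=0)

# PCA via SVD: rows of Vt are principal directions, ordered by variance.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
loadings = np.abs(Vt[0])   # |weight| of each variable in the first component
```

Keeping only the first component here reconstructs the cluster well but discards almost all information about the last column, which is exactly the blind spot the paper's technique is designed to remove.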

Relevance:

60.00%

Publisher:

Abstract:

This dissertation is part of a set of ongoing projects at the Instituto de Telecomunicações in Aveiro aimed at developing a communication system for a UAV. In this context, it presents the implementation and validation of an open and flexible baseband modem implemented on an FPGA, based on an SDR approach, with the possibility of integration into the UAV communication system. Over the course of this dissertation, an adaptive-modulation modem was implemented in MATLAB, into which synchronization and phase-correction algorithms were integrated. In this way it was possible to analyse the behavioural model of the modem's various components while abstracting away from processing delays and from the precision of the data representation, thereby simplifying its hardware implementation. Once the behavioural model of the modem developed in MATLAB had been analysed, it was implemented in hardware for QPSK modulation. Prototyping was carried out with the Vivado Design Suite 2014.2 tool, using the ZedBoard development kit and the AD-FMCOMMS1-EBZ front end. The correct operation of the modules implemented in hardware was then evaluated through an interface between MATLAB and the ZedBoard, with the results obtained from the MATLAB model serving as the basis for comparison. Through this interface it is possible to validate part of the modem implemented on the FPGA while keeping the remaining processing in MATLAB, thereby validating the FPGA modules in isolation.

Relevance:

60.00%

Publisher:

Abstract:

Geographic Information Systems (GIS) are a technology that deals with location to support better representations and decision making. They have a long tradition in several planning areas, such as urbanism, environment, risk, transportation, archaeology and tourism. In the academic context, higher education has followed that evolution. Despite their potential in education, GIS technologies have been underused at the elementary and secondary levels. Empowering graduates to learn with GIS and to manipulate spatial data can effectively facilitate the teaching of critical thinking. Likewise, it has been recognized that GIS tools can be incorporated as an interdisciplinary pedagogical tool. Nevertheless, more practical examples are needed of how GIS tools can enhance the teaching and learning process, namely by promoting interdisciplinary approaches. This paper presents some results obtained from the project "Each thing in its place: the science in time and space". The project results from the effort of three professors of Geography, History and Natural Sciences, in the context of the Didactics of World Knowledge curricular unit, to enhance interdisciplinarity through Geographic Information Technologies (GIT). Implemented during the last three years, this action-research project used GIS to create an interdisciplinary attitude in future primary education teachers. More than teaching GIS, the authors focused on teaching with GIS to create an integrated vision in which spatial data representation linked space, time and the natural sciences. Accumulated experience reveals that those technologies can motivate students to learn and facilitate teachers' interdisciplinary work.

Relevance:

60.00%

Publisher:

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, the increasing complexity and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. These collectively imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and an expression of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity, a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risk can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and mitigation. They have the capacity to be deconstructed into their atomic units and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and as the focus of our analysis risk safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk-factors, through the risks themselves or associated system elements. To do so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. 
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.

Relevance:

40.00%

Publisher:

Abstract:

Bioacoustic data can provide an important basis for environmental monitoring. To explore the large amount of field recordings collected, an automated similarity search algorithm is presented in this paper. A region of an audio recording defined by frequency and time bounds is provided by a user; the content of the region is used to construct a query. In the retrieval process, our algorithm automatically scans through recordings to search for similar regions. In detail, we present a feature extraction approach based on the visual content of vocalisations (in this case, ridges) and develop a generic regional representation of vocalisations for indexing. Our feature extraction method works best for bird vocalisations showing ridge characteristics. The regional representation method allows the content of an arbitrary region of a continuous recording to be described in a compressed format.
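A hedged sketch of the retrieval loop described above, with a mean-spectral-profile descriptor standing in for the paper's ridge-based features; the spectrogram, region bounds and distance measure are synthetic assumptions:

```python
# Region-based similarity search over a spectrogram: summarise the
# user-selected query region with a compact descriptor and slide a
# same-sized window through the recording. The descriptor here is a
# simple stand-in, not the paper's ridge features.
import numpy as np

def descriptor(region):
    # Compact summary: mean spectral profile of the region over time.
    return region.mean(axis=1)

def search(spectrogram, query, f_lo, f_hi):
    """Return the start time of the window most similar to the query."""
    q = descriptor(query)
    width = query.shape[1]
    dists = [np.linalg.norm(descriptor(spectrogram[f_lo:f_hi, t:t + width]) - q)
             for t in range(spectrogram.shape[1] - width + 1)]
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
spec = rng.random((64, 300))          # synthetic spectrogram (freq x time)
query = spec[10:20, 120:140].copy()   # user-selected frequency/time region
best = search(spec, query, 10, 20)
```

In practice the per-region descriptors would be precomputed and indexed so that long continuous recordings need not be rescanned for every query.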

Relevance:

40.00%

Publisher:

Abstract:

The applicability of a formalism involving an exponential function of composition x₁ in interpreting the thermodynamic properties of alloys has been studied. The excess integral and partial molar free energies of mixing are expressed as:

$$\Delta F^{xs} = a_0 x_1 (1 - x_1)\, e^{b x_1}$$
$$RT \ln \gamma_1 = a_0 (1 - x_1)^2 (1 + b x_1)\, e^{b x_1}$$
$$RT \ln \gamma_2 = a_0 x_1^2 (1 - b + b x_1)\, e^{b x_1}$$

The equations are used in interpreting experimental data for several relatively weakly interacting binary systems. For comparison, activity coefficients obtained with the subregular model and Krupkowski's formalism have also been computed. The present equations may be considered convenient for describing the thermodynamic behavior of metallic solutions.
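The three expressions are mutually consistent through the relation ΔF^xs = x₁·RT ln γ₁ + x₂·RT ln γ₂ (with x₂ = 1 − x₁), since (1 − x₁)(1 + bx₁) + x₁(1 − b + bx₁) = 1. A short numerical check confirms this; the values of a₀ and b are arbitrary, not taken from the paper:

```python
# Verify numerically that the partial molar expressions recombine into
# the integral excess free energy. Parameter values are arbitrary.
import math

def excess_free_energy(x1, a0, b):
    return a0 * x1 * (1 - x1) * math.exp(b * x1)

def rt_ln_gamma1(x1, a0, b):
    return a0 * (1 - x1) ** 2 * (1 + b * x1) * math.exp(b * x1)

def rt_ln_gamma2(x1, a0, b):
    return a0 * x1 ** 2 * (1 - b + b * x1) * math.exp(b * x1)

a0, b = -5000.0, 0.8   # arbitrary interaction parameters (J/mol, dimensionless)
for x1 in (0.1, 0.5, 0.9):
    lhs = excess_free_energy(x1, a0, b)
    rhs = x1 * rt_ln_gamma1(x1, a0, b) + (1 - x1) * rt_ln_gamma2(x1, a0, b)
    assert abs(lhs - rhs) < 1e-9 * abs(lhs)
```

Note also the limiting behaviour: at x₁ → 0 the formalism gives RT ln γ₁ → a₀ (the Henrian limit) and RT ln γ₂ → 0, as required for a Raoultian solvent.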