887 results for Web 2.0 Applications in Enterprises


Relevance:

100.00%

Publisher:

Abstract:

In recent years, the continuous incorporation of new technologies into the learning process has been an important factor in education [1]. The Technical University of Madrid (UPM) promotes educational innovation and develops projects aimed at improving the quality of education. The experience we present here fits into the Educational Innovation Project (EIP) of the E.U. of Agricultural Engineering of Madrid. One of the main objectives of the EIP is to "Take advantage of the new opportunities offered by the Learning and Knowledge Technologies in order to enrich the educational processes and teaching management" [2].

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a CMOS temperature sensor, targeting the 65 nm node, based on the thermal dependence of leakage currents. To compensate for the effect of process fluctuations, the proposed sensor takes the ratio of two measurements of the time a capacitor needs to discharge through a transistor in the subthreshold regime. Furthermore, a novel charging mechanism for the capacitor is proposed to further increase robustness against fabrication variability. The sensor, including digitization and interfacing, occupies 0.0016 mm² and consumes 47.7–633 pJ per sample. Its resolution is 0.28 °C, and its 3σ inaccuracy over the range 40–110 °C is 1.17 °C.
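
A minimal numerical sketch of why such a ratio cancels process variation, assuming a first-order subthreshold model with constant discharge current (all device values below are illustrative assumptions, not taken from the paper):

```python
# Hypothetical first-order model: discharge time of a capacitor through a
# subthreshold transistor. A shared process factor scales the current, so
# the ratio of two discharge times at different gate biases cancels it.
import numpy as np

K_B_Q = 8.617e-5  # Boltzmann constant / electron charge [V/K]

def discharge_time(temp_k, v_gs, process_factor, c=1e-12, dv=0.5,
                   i0=1e-9, v_th=0.4, n=1.3):
    """First-order estimate of the time to discharge the capacitor by dv."""
    ut = K_B_Q * temp_k                    # thermal voltage kT/q
    i_leak = process_factor * i0 * np.exp((v_gs - v_th) / (n * ut))
    return c * dv / i_leak                 # constant-current approximation

for p in (0.5, 1.0, 2.0):                  # assumed global process spread
    t1 = discharge_time(330.0, v_gs=0.0, process_factor=p)
    t2 = discharge_time(330.0, v_gs=0.1, process_factor=p)
    # The ratio depends on temperature (through ut) but not on p.
    print(f"process factor {p:.1f}: t1/t2 = {t1 / t2:.4f}")
```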

Relevance:

100.00%

Publisher:

Abstract:

The Schizosaccharomyces strains consumed less primary amino nitrogen and produced less urea and more pyruvic acid than the Saccharomyces strains. Further, three of the four Schizosaccharomyces strains completed the breakdown of malic acid by day 4 of fermentation. The main negative effect of the use of Schizosaccharomyces was strong acetic acid production. The Schizosaccharomyces strains that produced the most pyruvic acid (938 and 936) were associated with better 'wine' colour than the remaining yeasts. The studied Schizosaccharomyces strains could therefore be of oenological interest.

Relevance:

100.00%

Publisher:

Abstract:

A spatial-color-based non-parametric background-foreground modeling strategy implemented on a GPGPU using CUDA is proposed. The strategy is suitable for augmented-reality applications, providing real-time, high-quality results in a wide variety of scenarios.

Relevance:

100.00%

Publisher:

Abstract:

The latest generation of consumer electronic devices is endowed with Augmented Reality (AR) tools. These tools require moving-object detection strategies that are fast and efficient, so that higher-level object analysis tasks can be carried out. We propose a lightweight spatio-temporal non-parametric background-foreground modeling strategy on a General-Purpose Graphics Processing Unit (GPGPU) that provides real-time, high-quality results in a wide variety of scenarios and is suitable for AR applications.
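
As a rough illustration of the underlying technique (non-parametric, sample-based background modeling), here is a minimal CPU sketch in NumPy; the authors' GPGPU implementation, feature set, and thresholds are not reproduced, and the bandwidth and threshold values below are assumptions:

```python
# Minimal CPU sketch of non-parametric (kernel density estimate) background
# modeling: a pixel is foreground when its color is unlikely under a Gaussian
# KDE built from recent background samples.
import numpy as np

def foreground_mask(frame, samples, bandwidth=15.0, threshold=1e-4):
    """frame: (H, W, 3) current frame; samples: (N, H, W, 3) recent frames."""
    diff = samples - frame[None, ...]                 # (N, H, W, 3)
    sq = np.sum(diff * diff, axis=-1)                 # squared color distance
    kernel = np.exp(-sq / (2.0 * bandwidth ** 2))     # per-sample kernel value
    density = kernel.mean(axis=0)                     # KDE over the N samples
    return density < threshold                        # True -> foreground

# Usage: keep a rolling buffer of N past frames and call foreground_mask
# on each new frame; update the buffer with pixels classified as background.
```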

Relevance:

100.00%

Publisher:

Abstract:

One of the challenges facing the current web is the efficient use of all the available information. The Web 2.0 phenomenon has favored the creation of content by average users, and thus the amount of information that can be found on diverse topics has grown exponentially in recent years. Initiatives such as Linked Data are helping to build the Semantic Web, in which a set of standards is proposed for the exchange of data among heterogeneous systems. However, these standards are sometimes not used, and there are still plenty of websites that require naive techniques to discover their contents and services. This paper proposes an integrated framework for content and service discovery and extraction. The framework is divided into several layers, where the discovery of contents and services is performed in a Representational State Transfer (REST) system such as the web. It employs several web mining techniques as well as feature-oriented modeling for the discovery of cross-cutting features in web resources. The framework is applied to a scenario of electronic newspapers: an intelligent agent crawls the web for related news, and uses services and visits links automatically according to its goal. This scenario illustrates how discovery is performed at different levels and how the use of semantics helps implement an agent that performs high-level tasks.
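
As a hedged sketch of the kind of naive, crawling-based discovery the paper builds on (not the paper's framework itself), the following standard-library Python crawler fetches pages and keeps links whose anchor text matches an agent's goal; the start URL and goal pattern are hypothetical:

```python
# Toy breadth-first crawler: discovers links whose anchor text matches a goal.
import re
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []                      # (href, anchor text) pairs
        self._href, self._text = None, []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href, self._text = dict(attrs).get("href"), []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links.append((self._href, " ".join(self._text).strip()))
            self._href = None

def discover(start_url, goal_pattern, max_pages=20):
    """Crawl from start_url, returning links whose text matches the goal."""
    goal = re.compile(goal_pattern, re.IGNORECASE)
    queue, seen, hits = [start_url], set(), []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):        # unreachable or malformed URL
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href, text in parser.links:
            absolute = urljoin(url, href)
            if goal.search(text):
                hits.append((absolute, text))
            queue.append(absolute)
    return hits

# e.g. discover("https://example.org/news", r"economy|markets")
```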

Relevance:

100.00%

Publisher:

Abstract:

A validation of the burn-up simulation system EVOLCODE 2.0 is presented here, involving the experimental measurement of U and Pu isotopes and of some fission-fragment production ratios after a burn-up of around 30 GWd/tU in a pressurized light water reactor (PWR). This work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross-section uncertainties. An improvement of the classical sensitivity/uncertainty (S/U) model has been developed to take into account the implicit dependence on the neutron flux normalization, that is, the effect of the constant power of the reactor. The improved S/U methodology, usually neglected in this kind of study, has proven to be an important contribution to explaining some simulation-experiment discrepancies: for the most relevant actinides, the cross-section uncertainties are an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for the improvement of the JEFF-3.1.1 fission yield library and for the correction of some errata in the experimental data are presented.
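
For readers unfamiliar with S/U propagation, a minimal sketch of the sandwich rule it rests on is given below; the sensitivity coefficients and covariance values are invented for illustration and are not EVOLCODE results:

```python
# Sandwich rule for uncertainty propagation: the relative variance of a
# response (e.g. an isotope's inventory) is S C S^T, where S holds the
# sensitivities to each cross section and C is their covariance matrix.
import numpy as np

# Relative sensitivities dN/N per dsigma/sigma for three cross sections
# (assumed values).
S = np.array([[0.8, -0.3, 0.1]])

# Relative covariance matrix of those cross sections (assumed, 5-10% errors
# with one weak correlation).
C = np.array([
    [0.0025, 0.0005, 0.0],
    [0.0005, 0.0100, 0.0],
    [0.0,    0.0,    0.0064],
])

var = S @ C @ S.T
print(f"relative 1-sigma uncertainty: {np.sqrt(var[0, 0]):.3%}")
```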

Relevance:

100.00%

Publisher:

Abstract:

The semiconductor market is saturated with similar products and with distributors offering similar services. Co-creation processes, in which the customer collaborates in the definition and development of the product and provides information about its utility, performance, and perceived value, resulting in a product that solves the customer's real needs, are becoming a step forward in the differentiation and expansion of the value chain. The design and manufacturing of semiconductors is quite complex, requires increasingly large investments, and demands complete solutions; an ecosystem is needed to support the development of the electronic equipment based on those semiconductors. The ease of dialogue and information sharing provided by the Internet, Web 2.0 tools, and cloud services and applications favors the generation of ideas and the development and evaluation of products, and enables interaction among the various co-creators. Starting a co-creation process requires adequate methods and tools for interacting with the participants and exchanging experiences, processes for integrating co-creation into the company's operations, and an organization and culture that support and promote it. Among the most effective methods are netnography, which studies the conversations of communities on the Internet; collaboration with lead users, who are ahead of the market and expect a great benefit from the satisfaction of their needs or desires; innovation studies, which allow users to define and often create their own solutions; and crowdsourcing, an open call to the community to solve challenges in exchange for some kind of reward. The specialization of subcontractors in the development and manufacture of semiconductors facilitates open innovation through collaboration with different entities in the different phases of the development of the semiconductor and its ecosystem. Co-creation is currently used in the semiconductor sector to discover design and application ideas, often through innovation contests. Technical support and the evaluation of semiconductors are frequently the result of collaboration among members of a community fostered and supported by the manufacturers of the product. The EBVchips program gives small and medium-sized companies access to the co-creation of semiconductors with manufacturers, in a process coordinated and sponsored by the distributor EBV. Configurable semiconductors such as FPGAs are another example of co-creation, whereby the manufacturer provides the integrated circuit and the development environment and customers create the final product by defining its features and functionality. This process is enriched with IP cores, design blocks that are often created by the user community.

Relevance:

100.00%

Publisher:

Abstract:

Linked Data assets (RDF triples, graphs, datasets, mappings...) can be protected by intellectual property law or database law, or their access or publication may be restricted for other legal reasons (personal data protection, security, etc.). Publishing a rights expression along with the digital asset allows the rightsholder to waive some or all of the IP and database rights (leaving the work in the public domain), to permit some operations if certain conditions are satisfied (such as giving attribution to the author), or simply to remind the audience that some rights are reserved.
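
As a minimal sketch of attaching such a rights expression to an asset (using rdflib and Dublin Core terms; the dataset URI, rights holder, and license choice are hypothetical examples, not a prescription from the paper):

```python
# Attach a license and rights holder to a Linked Data asset with rdflib.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS

g = Graph()
dataset = URIRef("http://example.org/dataset/geo-names")   # hypothetical asset

# CC BY 4.0: some rights waived, attribution required.
g.add((dataset, DCTERMS.license,
       URIRef("https://creativecommons.org/licenses/by/4.0/")))
g.add((dataset, DCTERMS.rightsHolder, URIRef("http://example.org/org/acme")))
g.add((dataset, DCTERMS.title, Literal("Geographic names dataset")))

# Publish the rights expression alongside the data, e.g. as Turtle.
print(g.serialize(format="turtle"))
```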

Relevance:

100.00%

Publisher:

Abstract:

Recent commentaries have proposed the advantages of the open exchange of data and informatics resources for improving health-related policies and patient care in Africa. Yet, in many African regions, both private medical and public health information systems are still unaffordable. Open exchange over the social Web 2.0 could encourage more altruistic support of medical initiatives. We have carried out several experiments to demonstrate the feasibility of this approach for disseminating open data and informatics resources in Africa. After the experiments, we developed the AFRICA BUILD Portal, the first social network for African biomedical researchers. Through the AFRICA BUILD Portal, users can transparently access several resources. Currently, over 600 researchers are using distributed and open resources through this platform, which is designed to work over low-bandwidth connections.

Relevance:

100.00%

Publisher:

Abstract:

The concept of the IoT (Internet of Things) is accepted in academia and industry as the future direction of the Internet. It will enable people and things to be connected at any time and any place, with anything and anyone. The IoT has been proposed for application in many areas, such as healthcare, transportation, logistics, and smart environments. This thesis, however, focuses on home healthcare, a promising healthcare model that can address problems the traditional model cannot, such as limited medical resources and the increasing demand for healthcare from elderly and chronic patients. A remarkable feature of the semantics-oriented vision of the IoT is that vast numbers of sensors and devices are involved, generating enormous amounts of data; methods to manage these data, including acquiring, interpreting, processing, and storing them, need to be implemented. Beyond this, other capabilities that the IoT currently lacks are identified, namely interoperation, context awareness, and security and privacy. Context awareness is an emerging technology for managing and taking advantage of context to enable any type of system to provide personalized services. The aim of this thesis is to explore ways of facilitating context awareness in the IoT, and a preliminary study is carried out to that end. The most basic premise for realizing context awareness is to collect, model, understand, reason about, and make use of context. A complete literature review of the existing context modelling and context reasoning techniques is conducted; the conclusion is that ontology-based context modelling and ontology-based context reasoning are the most promising and efficient techniques for managing context. In order to bring ontologies into the IoT, a specific ontology-based context awareness framework is proposed for IoT applications. The framework is composed of eight components: hardware, UI (user interface), context modelling, context fusion, context reasoning, context repository, security unit, and context dissemination. Moreover, on the basis of TOVE (Toronto Virtual Enterprise), a formal ontology development methodology is proposed and illustrated, consisting of four stages: specification and conceptualization, competency formulation, implementation, and validation and documentation. In addition, a home healthcare scenario is elaborated by listing its well-defined functionalities. To represent this specific scenario, the proposed ontology development methodology is applied and the ontology-based model is developed in Protégé, a free and open-source ontology editor. Finally, the accuracy and completeness of the proposed ontology are validated to show that it can accurately represent the scenario of interest.
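
A small sketch of what the implementation stage of such a methodology might look like in code (here with the owlready2 Python library rather than Protégé; the class and property names are invented for a home healthcare scenario and are not the thesis's ontology):

```python
# Sketch of an ontology fragment for a home healthcare scenario, built with
# owlready2. All names below are illustrative assumptions.
from owlready2 import DataProperty, ObjectProperty, Thing, get_ontology

onto = get_ontology("http://example.org/home-healthcare.owl")

with onto:
    class Patient(Thing): pass
    class Sensor(Thing): pass
    class ContextReading(Thing): pass

    class monitoredBy(ObjectProperty):      # Patient -> Sensor
        domain = [Patient]
        range = [Sensor]

    class producesReading(ObjectProperty):  # Sensor -> ContextReading
        domain = [Sensor]
        range = [ContextReading]

    class hasValue(DataProperty):           # numeric reading value
        domain = [ContextReading]
        range = [float]

# Populate and persist a tiny instance of the model.
maria = onto.Patient("maria")
hr_sensor = onto.Sensor("heart_rate_sensor")
maria.monitoredBy = [hr_sensor]
onto.save(file="home-healthcare.owl")
```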

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, the web accommodates products developed both by professional developers and by end users with more limited knowledge. Despite the difference in quality one may assume between the products of the two, both kinds of solutions can be recognized and employed in an application. In Web 2.0, this behavior is observed in the development of web components. The goal of this work is to develop a persistence model that, supported by a server side and a client side, collects quality metrics for components as users interact with them. From these metrics, it is possible to improve the quality of the components. The metrics are collected through PicBit, an application developed so that users can interconnect different components without restrictions and, after interacting with them, express their degree of satisfaction, which is recorded for the quality evaluation. Metrics intrinsic to the component, not determined by the user, are also defined and serve as a reference for the evaluation. Once both the intrinsic and the user-provided metrics are available, a correlation between them is computed, which makes it possible to analyze deviations between them and determine the component's own quality. The conclusion that can be drawn from this work is that when users can carry out usability tests freely, without restrictions, the chance of obtaining favorable results is higher, because those results show how an end user will actually use the application. This way of working is favored by the number of tools available today for monitoring user flow through the service.
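
A toy sketch of the correlation step between intrinsic metrics and user satisfaction (the metric names and all values are invented for illustration; they are not PicBit data):

```python
# Correlate intrinsic component metrics with user-reported satisfaction.
import numpy as np

# One row per web component: [load time (ms), error count] (assumed metrics).
intrinsic = np.array([[120, 0], [480, 3], [250, 1], [900, 7], [60, 0]])
# User satisfaction for the same components, 1-5 scale (assumed).
satisfaction = np.array([4.6, 2.8, 4.1, 1.9, 4.8])

for name, column in zip(["load time", "error count"], intrinsic.T):
    r = np.corrcoef(column, satisfaction)[0, 1]       # Pearson correlation
    print(f"{name} vs satisfaction: r = {r:+.2f}")    # large |r| -> strong link
```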

Relevance:

100.00%

Publisher:

Abstract:

The web is in a process of constant change, driven by greater user interaction. Out of the current wave of paradigms and technologies associated with Web 2.0, a series of very useful standards has emerged that covers the needs of today's web development. Among these are web components, user-defined HTML tags that fulfill a specific function within a page. There is a need to measure the quality of such developments, in order to determine whether the concept of web components represents a revolutionary change in Web 2.0 development. This requires an exploitation of web components, understood as quality measurement based on metrics and the definition of a component interconnection model. The PicBit platform arises as an answer to these questions. It is a social profile-building platform based on these elements. From the end user's perspective, it is a tool for creating profiles and social communities; from an academic perspective, the platform is a testing environment, or sandbox, for web components. To this end, the server side of the platform must be implemented, focused on the exploitation task, by defining a REST interface of operations and a system for collecting user events on the platform. Thanks to this platform, it will be possible to determine which parameters positively influence the user experience of a component, as well as to discover the future potential of this type of development.
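
A hedged sketch of what one operation of such a REST interface plus event collection could look like (using Flask; the routes and payload fields are assumptions, not PicBit's actual API):

```python
# Minimal event-collection endpoint: records user events per web component.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
events = []                              # in-memory store; a DB in practice

@app.post("/api/components/<component_id>/events")
def record_event(component_id):
    payload = request.get_json(force=True)
    event = {
        "component": component_id,
        "type": payload.get("type", "interaction"),   # e.g. click, drag
        "satisfaction": payload.get("satisfaction"),  # optional 1-5 score
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    events.append(event)
    return jsonify(event), 201

@app.get("/api/components/<component_id>/events")
def list_events(component_id):
    return jsonify([e for e in events if e["component"] == component_id])
```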

Relevance:

100.00%

Publisher:

Abstract:

This thesis develops the theoretical foundations of, and designs, an open collection of C++ classes called VBF (Vector Boolean Functions) for analyzing vector Boolean functions (functions that map a Boolean vector to another Boolean vector) from a cryptographic perspective. This new implementation uses Victor Shoup's NTL library, adding new modules that complement the existing ones and make VBF better suited to cryptographic analysis. The fundamental class representing a vector Boolean function can be initialized in a flexible way via several alternative data structures, such as the truth table, the trace representation, and the algebraic normal form (ANF), among others. In this way, VBF allows the evaluation of the most relevant cryptographic criteria for block and stream ciphers as well as for hash functions: for instance, it provides the nonlinearity, the linearity distance, the algebraic degree, the linear structures, and the frequency distribution of the absolute values of the Walsh spectrum or the autocorrelation spectrum, among other criteria. In addition, VBF can perform operations on vector Boolean functions such as equality testing, composition, inversion, sum, direct sum, bricklayering (the parallel application of vector Boolean functions, as employed in the Rijndael cipher), and the addition of the coordinate functions of two vector Boolean functions. The thesis also illustrates the use of VBF in two practical applications. On the one hand, the most relevant properties of existing block ciphers have been analyzed. On the other hand, by combining VBF with optimization algorithms, Boolean functions have been designed whose cryptographic properties are the best known to date.
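
As an illustration of one of the criteria VBF computes (in Python here for brevity; VBF itself is C++), the nonlinearity of a single-output Boolean function can be obtained from its Walsh spectrum via the fast Walsh–Hadamard transform:

```python
# Nonlinearity of a Boolean function via the fast Walsh-Hadamard transform.
def walsh_hadamard(truth_table):
    """Return the Walsh spectrum of f given its truth table of 0/1 values."""
    w = [1 - 2 * bit for bit in truth_table]          # 0 -> +1, 1 -> -1
    h = 1
    while h < len(w):                                 # in-place butterfly
        for i in range(0, len(w), 2 * h):
            for j in range(i, i + h):
                w[j], w[j + h] = w[j] + w[j + h], w[j] - w[j + h]
        h *= 2
    return w

def nonlinearity(truth_table):
    """NL(f) = 2^(n-1) - max|W_f| / 2 for an n-variable Boolean function."""
    spectrum = walsh_hadamard(truth_table)
    return len(truth_table) // 2 - max(abs(v) for v in spectrum) // 2

# Example: f(x1, x2, x3) = x1*x2 XOR x3, truth table over all 8 inputs.
tt = [((x >> 2) & 1) * ((x >> 1) & 1) ^ (x & 1) for x in range(8)]
print(nonlinearity(tt))                               # -> 2
```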