21 results for number of patent applications

at Universidad Politécnica de Madrid


Relevance: 100.00%

Abstract:

Given the significant impact of Web 2.0-related innovations on new Internet-based initiatives, this paper seeks to identify to what extent the main developments are protected by patents and whether patents have had a leading role in the advent of Web 2.0. The article shows that the number of patent applications filed is not that important for many of the Web 2.0 technologies in frequent use and that, of those filed, even fewer have been granted. The conclusion is that patents do not seem to be a relevant factor in the development of Web 2.0 (and, more generally, of dynamic markets) where there is a high degree of innovation and low entry barriers for newcomers.

Relevance: 100.00%

Abstract:

Patent and trademark offices which run according to principles of new management have an inherent need for dependable forecasting data when planning capacity and service levels. For the Spanish Office of Patents and Trademarks to carry out efficient planning of its resource needs, it requires methods which allow it to predict changes in the number of patent and trademark applications at different time horizons. The approach to predicting the time series of Spanish patent and trademark applications (1979-2009) was based on the use of different short-term time-series prediction techniques. The methods used can be grouped into two specific areas: regression models of trends and time series models. The results of this study show that it is possible to model the series of patent and trademark applications with different models, especially ARIMA, with satisfactory model fit and relatively low error.
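The short-horizon time-series modelling described above can be illustrated with a minimal autoregressive fit. This is a numpy-only sketch on synthetic data, assuming a plain AR(p) least-squares form rather than the full ARIMA specifications used in the study:

```python
import numpy as np

def fit_ar(series, p):
    """Fit y_t = c + a_1*y_{t-1} + ... + a_p*y_{t-p} by least squares."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    # Design matrix: intercept column plus the p lagged columns.
    X = np.column_stack([np.ones(n - p)] + [y[p - j : n - j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef  # [c, a_1, ..., a_p]

def forecast(series, coef, p, steps):
    """Iterate the fitted recursion forward from the end of the series."""
    hist = list(series)
    preds = []
    for _ in range(steps):
        lags = hist[-p:][::-1]  # most recent first: y_{t-1}, ..., y_{t-p}
        nxt = coef[0] + sum(a * l for a, l in zip(coef[1:], lags))
        hist.append(nxt)
        preds.append(nxt)
    return preds

# Synthetic AR(1) series (not the Spanish application data): y_t = 2 + 0.8*y_{t-1}
series = [0.0]
for _ in range(24):
    series.append(2.0 + 0.8 * series[-1])

coef = fit_ar(series, p=1)          # coef ≈ [2.0, 0.8]
preds = forecast(series, coef, p=1, steps=3)
```

In practice one would use a dedicated ARIMA implementation with differencing and model-order selection; the point here is only the lag-regression structure behind such models.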

Relevance: 100.00%

Abstract:

In this paper, we describe a complete development platform that features different innovative acceleration strategies, not included in any other current platform, which simplify and speed up the definition of the different elements required to design a spoken dialog service. The proposed accelerations are mainly based on using the information from the backend database schema and contents, as well as cumulative information produced throughout the different steps of the design. Thanks to these accelerations, the interaction between the designer and the platform is improved, and in most cases the design is reduced to simple confirmations of the "proposals" that the platform dynamically provides at each step. In addition, the platform provides several other accelerations: configurable templates that can be used to define the different tasks in the service or the dialogs to obtain information from, or show it to, the user; automatic proposals for the best way to request slot contents from the user (i.e. using mixed-initiative or directed forms); an assistant that offers the most probable actions required to complete the definition of the different tasks in the application; and another assistant for solving modality-specific details such as confirming user answers or presenting the lists of results retrieved after querying the backend database. Additionally, the platform allows the creation of speech grammars and prompts and of database access functions, and supports mixed-initiative and over-answering dialogs. We also describe each assistant in the platform in detail, emphasizing the methodologies followed to facilitate the design process in each one. Finally, we describe the results obtained in both a subjective and an objective evaluation with different designers, which confirm the viability, usefulness, and functionality of the proposed accelerations: thanks to them, design time is reduced by more than 56% and the number of keystrokes by 84%.

Relevance: 100.00%

Abstract:

A high productivity rate in Engineering is related to efficient management of the flow of large quantities of information and of the associated decision-making activities that are consubstantial to Engineering processes in both design and production contexts. Dealing with such problems from an integrated point of view, mimicking real scenarios, is not given much attention in Engineering degrees. In the context of Engineering Education, there are a number of courses designed for developing specific competencies, as required by the academic curricula, but not that many in which integration competencies are the main target. In this paper, a course devoted to that aim is discussed. The course is taught in a Marine Engineering degree, but the philosophy could be used in any Engineering field. All the lessons are given in a computer room in which every student can use all of the software applications covered. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using Ms-PROJECT, the work breakdown structure (WBS) and the organization breakdown structure (OBS) of Engineering projects, through a series of examples of increasing complexity, ending with the case of vessel construction. The second part of the course is dedicated to the use of a database manager, Ms-ACCESS, for managing production-related information. A series of examples of increasing complexity is treated, ending with the management of the pipe database of a real vessel. This database consists of a few thousand pipes, for which a production timing frame is defined, which connects this part of the course with the first one. Finally, the third part of the course is devoted to working with FORAN, an Engineering Production package in widespread use in the shipbuilding industry. With this package, the frames and plates where all the outfitting will be carried out are defined through cooperative work by the students, working simultaneously on the same 3D model. In the paper, specific details about the learning process are given. Surveys were given to the students in order to get feedback on their experience as well as to assess their satisfaction with the learning process. Results from these surveys are discussed in the paper.
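The pipe-database part of the course lends itself to a small worked example. The sketch below uses SQLite in place of Ms-ACCESS, and the table layout, column names and figures are illustrative assumptions, not the course's actual schema:

```python
import sqlite3

# Illustrative schema: each pipe belongs to a hull block and has a planned
# production week -- the timing frame that links the database part of the
# course with the project-planning part.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE pipes (id INTEGER PRIMARY KEY, block TEXT,"
    " diameter_mm REAL, prod_week INTEGER)"
)
rows = [
    (1, "B-10", 50.0, 12),
    (2, "B-10", 80.0, 12),
    (3, "B-11", 50.0, 13),
    (4, "B-11", 100.0, 14),
]
conn.executemany("INSERT INTO pipes VALUES (?, ?, ?, ?)", rows)

# How many pipes must be fabricated in each production week.
per_week = conn.execute(
    "SELECT prod_week, COUNT(*) FROM pipes GROUP BY prod_week ORDER BY prod_week"
).fetchall()
print(per_week)  # [(12, 2), (13, 1), (14, 1)]
```

A real vessel database would add tables for materials, spools and work orders, but the grouping query above is the kind of production-timing question the course builds up to.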

Relevance: 100.00%

Abstract:

Due to its small band-gap and high mobility, InN is a promising material for a large number of key applications such as band-gap engineering for high-efficiency solar cells, light-emitting diodes, and high-speed devices. Unfortunately, it has been reported that this material exhibits strong surface charge accumulation, which may depend on the type of surface. Current investigations aim to explain the mechanisms that govern this behavior and to look for ways of avoiding it and/or for applications that may exploit it. In this framework, low-frequency noise measurements have been performed at different temperatures on patterned MBE-grown InN layers. The evolution of the 1/f noise level with temperature in the 77 K-300 K range is consistent with carrier number fluctuations, thus indicating surface mechanisms: the surface charge accumulation is confirmed by the noise measurements.
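The carrier-number-fluctuation reading of the 1/f data is usually expressed in the standard McWhorter-type form below, a textbook relation added here for context rather than an equation from the article; $N$ is the total carrier number and $N_t(E_F)$ the density of traps near the Fermi level:

```latex
\frac{S_I(f)}{I^2} \;=\; \frac{S_N(f)}{N^2},
\qquad
S_N(f) \;\propto\; \frac{k_B T \, N_t(E_F)}{f}
```

A 1/f spectrum whose level scales with temperature in this way is commonly read as a signature of trap-mediated fluctuations of the carrier number, as opposed to bulk mobility fluctuations.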

Relevance: 100.00%

Abstract:

Since the 1980s, the literature on economic development has paid attention to the cases of countries which industrialized after the first industrial revolution. One of the most relevant aspects analyzed has been the role of technology as a factor which promotes or delays the process of catching up with technology leaders. As a result of this interest, new and more adequate indicators were identified to provide a coherent explanation of technological activities and their relationship with economic efficiency. Although the earliest studies focused on analyzing research and development (R&D) activities, the focus of analysis has recently shifted to other types of variables, more oriented towards the processes of innovation and the accumulation of knowledge and capabilities, for which patents provide relevant information.

Relevance: 100.00%

Abstract:

We develop general closed-form expressions for the mutual gravitational potential, resultant force and torque acting upon a rigid tethered system moving in the non-uniform gravity field produced by an attracting body with revolution symmetry, such that an arbitrary number of zonal harmonics is considered. The final expressions are series expansions in two small parameters related to the reference radius of the primary and the length of the tether, respectively, each scaled by the mutual distance between the centers of mass. A few numerical experiments are performed to study the convergence behavior of the final expressions; we conclude that for high-precision applications it might be necessary to take into account additional perturbation terms, which come from the mutual two-body interaction.
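For reference, the axisymmetric (zonal) field of the primary that the abstract refers to has the standard expansion below; this is the textbook form, with $R_e$ the primary's reference radius, $P_n$ the Legendre polynomials and $\phi$ the latitude, not the paper's tether-specific series:

```latex
U(r,\phi) \;=\; -\frac{\mu}{r}\left[\,1 \;-\; \sum_{n=2}^{N} J_n \left(\frac{R_e}{r}\right)^{\!n} P_n(\sin\phi)\right]
```

The closed-form expressions of the paper then expand the mutual potential of the tethered system in this field in powers of the two small parameters, $R_e$ and the tether length, each divided by the mutual distance between the centers of mass.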

Relevance: 100.00%

Abstract:

Independent Component Analysis (ICA) is a blind source separation method that aims to find the pure source signals mixed together in unknown proportions in the observed signals under study. It does this by searching for factors which are mutually statistically independent, and can thus be classified among the latent-variable-based methods. Like other methods based on latent variables, a careful investigation has to be carried out to find out which factors are significant and which are not, so it is important to have a validation procedure for deciding on the optimal number of independent components to include in the final model. This can be complicated by the fact that two consecutive models may differ in the order and signs of similarly-indexed ICs, and that the structure of the extracted sources can change as a function of the number of factors calculated. Two methods for determining the optimal number of ICs are proposed in this article and applied to simulated and real datasets to demonstrate their performance.
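The order/sign indeterminacy mentioned above can be seen in a toy separation. The sketch below is not either of the article's two proposed validation methods; it is a minimal two-source ICA in numpy (whitening plus a grid search for the rotation maximising |excess kurtosis|) on synthetic sources:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources, mixed linearly.
n = 2000
S = np.vstack([rng.uniform(-1, 1, n),            # sub-Gaussian source
               np.sign(rng.standard_normal(n))]) # binary +/-1 source
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # "unknown" mixing matrix
X = A @ S

# Whiten the observations: zero mean, identity covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = (E / np.sqrt(d)) @ E.T @ Xc

def rotation(theta):
    return np.array([[np.cos(theta), np.sin(theta)],
                     [-np.sin(theta), np.cos(theta)]])

def nongaussianity(theta):
    # After whitening, ICA in 2-D reduces to finding the rotation whose
    # outputs are maximally non-Gaussian; |excess kurtosis| is the proxy here.
    Y = rotation(theta) @ Z
    return np.abs((Y**4).mean(axis=1) - 3.0).sum()

thetas = np.linspace(0.0, np.pi / 2, 181)
best = max(thetas, key=nongaussianity)
Y = rotation(best) @ Z   # recovered sources, up to order, sign and scale
```

Recovered components come out in arbitrary order, sign and scale, which is precisely why comparing models with different numbers of ICs requires a validation procedure of the kind the article proposes.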

Relevance: 100.00%

Abstract:

Single-core processors have reached their maximum practical clock speeds; new multicore architectures provide an alternative way to tackle this issue. Designing decoding applications that run on top of these multicore platforms, and optimizing them to exploit all of the system's computational power, is crucial to obtaining the best results. Since integration at the printed-circuit-board level is increasingly difficult to optimize due to physical constraints and the inherent increase in power consumption, the development of multiprocessor architectures has become the new Holy Grail. In this sense, it is crucial to develop applications that can run on the new multicore architectures and to find distributions that maximize the potential use of the system. Today most commercial electronic devices on the market are built around embedded systems, and these devices have recently begun to incorporate multicore processors. Task management across multiple cores/processors is not a trivial issue, and good task/actor scheduling can yield significant improvements in efficiency and in processor power consumption. Scheduling the data flows between the actors that implement an application aims to harness multicore architectures for more types of applications, with parallelism expressed explicitly in the application. On the other hand, the recent development of the MPEG Reconfigurable Video Coding (RVC) standard allows the reconfiguration of video decoders. RVC is a flexible standard compatible with MPEG-developed codecs, making it the ideal tool to integrate into new multimedia terminals for decoding video sequences. With the new versions of the Open RVC-CAL Compiler (Orcc), a static mapping of the actors that implement the application's functionality can be done once the application executable has been generated. This static mapping must be done for each of the cores available on the working platform. An embedded system with a dual-core ARMv7 processor was chosen. This platform allows us to run the desired tests, measure the improvement over execution on a single core, and contrast both with a PC-based multiprocessor system.
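The static actor-to-core mapping mentioned for Orcc can be illustrated with a toy load-balancing heuristic. The actor names and cost figures below are invented for the example; a real mapping would come from profiling the decoder's actors on the target platform:

```python
# Toy static mapping of dataflow actors to cores: greedy longest-processing-time
# (LPT) assignment, a common baseline for balancing independent workloads.
def map_actors(actor_costs, n_cores):
    """Assign each actor (heaviest first) to the currently least-loaded core."""
    loads = [0.0] * n_cores
    mapping = {}
    for actor, cost in sorted(actor_costs.items(), key=lambda kv: -kv[1]):
        core = loads.index(min(loads))  # least-loaded core so far
        mapping[actor] = core
        loads[core] += cost
    return mapping, loads

# Hypothetical per-frame costs for four decoder actors (illustrative only).
actors = {"parser": 4.0, "idct": 7.0, "motion_comp": 6.0, "deblock": 3.0}
mapping, loads = map_actors(actors, n_cores=2)
print(mapping, loads)  # both cores end up with a load of 10.0
```

LPT ignores the communication between actors; a scheduler for a real RVC-CAL decoder would also weigh the data-flow edges between actors when deciding which ones to co-locate on a core.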

Relevance: 100.00%

Abstract:

The study of k-sets is a very relevant topic in computational geometry. In particular, the maximum and minimum numbers of k-sets in sets of points of the plane in general position have been studied at great length in the literature. For the maximum number of k-sets, lower bounds have been provided by Erdős et al., by Edelsbrunner and Welzl, and later by Tóth; Dey stated an upper bound on this maximum. The minimum number of k-sets was established by Erdős et al. and, independently, by Lovász et al. In this paper the authors give an example of a set of n points in the plane in general position (no three collinear) in which the minimum number of points that can take part in at least one k-set is attained for every k with 1 ≤ k < n/2. The authors also extend the result of Erdős et al. (1973) on the minimum number of points in general position that can take part in a k-set to sets of n points not necessarily in general position. In this way the work complements the classic papers mentioned above.
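A k-set of a planar point set is a subset of k points that can be cut off by a halfplane. The brute-force enumerator below is a small illustration (not the paper's construction): it relies on the fact that every k-set appears as the k first points in sorted order along some direction, and that this order only changes at directions perpendicular to a line through two points, so sampling just either side of each critical direction finds them all.

```python
import math

def k_sets(points, k):
    """Enumerate all k-sets of a small point set in general position."""
    eps = 1e-4
    dirs = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[j][0] - points[i][0]
            dy = points[j][1] - points[i][1]
            base = math.atan2(-dx, dy)  # direction perpendicular to segment p_i p_j
            # Sample just before/after each critical direction, both ways round.
            for a in (base - eps, base + eps, base + math.pi - eps, base + math.pi + eps):
                dirs.append((math.cos(a), math.sin(a)))
    found = set()
    for ux, uy in dirs:
        order = sorted(range(len(points)), key=lambda t: points[t][0] * ux + points[t][1] * uy)
        found.add(frozenset(order[:k]))  # the k extreme points form a k-set
    return found

# Five points in convex position (a regular pentagon): every k-set is a run of
# consecutive hull vertices, so there are exactly 5 of them for each k < 5.
pentagon = [(math.cos(2 * math.pi * t / 5), math.sin(2 * math.pi * t / 5)) for t in range(5)]
```

This quadratic-in-pairs sweep is only practical for small n; the interest of the results summarized above is precisely in the asymptotic bounds on how many such sets can exist.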

Relevance: 100.00%

Abstract:

This paper analyzes the relationship among research collaboration, number of documents and number of citations in computer science research activity. It analyzes how the number of documents and citations varies with the number of authors, and how they vary (according to author-set cardinality) under different circumstances: when documents are written in different types of collaboration, when documents are published in different document types, when documents are published in different computer science subdisciplines, and when documents are published in journals of different impact factor quartiles. To investigate these relationships, the paper analyzes the publications listed in the Web of Science and produced between 2000 and 2009 by active Spanish university professors working in the computer science field. Analyzing all documents, we show that the highest percentage of documents is published by three authors, whereas single-authored documents account for the lowest percentage. In terms of citations, there is no positive association between author cardinality and citation impact. Statistical tests show that documents written by two authors receive more citations per document and year than documents published by more authors; in contrast, results do not show statistically significant differences between documents published by two authors and by one. The research findings suggest that international collaboration results, on average, in publications with higher citation rates than national and institutional collaborations. We also find the expected differences in citation rates between journals and conferences, across computer science subdisciplines, and across journal quartiles. Finally, our impression is that the collaborative level (number of authors per document) will increase in the coming years, and documents published by three or four authors will be the trend in the computer science literature.

Relevance: 100.00%

Abstract:

Personalized health (p-health) systems can contribute significantly to the sustainability of healthcare systems, though their feasibility is yet to be proven. One of the problems related to their development is the lack of well-established development tools for this domain. As the p-health paradigm is focused on patient self-management, big challenges arise around the design and implementation of patient systems. This paper presents a reference platform created for the development of these applications and shows the advantages of its adoption in a complex project dealing with cardiovascular diseases.

Relevance: 100.00%

Abstract:

In this paper we investigate the effect of biasing the axonal connection delay values on the number of polychronous groups produced in a spiking neural network model. We use an estimation of distribution algorithm (EDA) that learns tree models to search for optimal delay configurations. Our results indicate that the introduced approach can considerably increase the number of such groups.

Relevance: 100.00%

Abstract:

For the decades to come, it can be foreseen that electricity and water will keep playing a key role in countries' development; both can be considered among the most important energy vectors, and their control can be crucial for governments, companies and leaders in general. Energy is essential for all human activities, and its availability is critical to economic and social development. In particular, electricity, a form of energy, is required to produce goods, to provide medical assistance and basic civic services in education, to assure the availability of clean water, to create an environment conducive to prosperity and improvement, and to maintain an acceptable quality of life. The way in which electricity is generated from different resources varies from country to country. Nuclear energy controlled within reactors for steam production; gas, fuel-oil and coal fired in power stations; and water, solar and wind energy, among others, are employed, sometimes not very efficiently, to produce electricity. The so-called energy mix of an individual country is formed by the contribution of each resource or form of energy to that country's electricity generation market. During the last decade the establishment of proper energy mixes has gained much importance, and energy drivers should enforce long-term plans and policies. Hints, reports and guides on the contribution of energy resources are being developed by notable organisations such as the IEA (International Energy Agency), the IAEA (International Atomic Energy Agency) and the WEC (World Energy Council). This paper evaluates the energy issues that markets and countries face today regarding energy mix scheduling and outlook, and it revises and seeks to improve the available methodologies applicable to energy mix planning. Key Factors are identified, established and assessed throughout the paper for common implementation, as the themes driving the future energy mix methodology proposal. These have a clear influence on, and are closely related to, future environmental policies. The Key Factors take into consideration sustainability, energy security, social and economic growth, climate change, air quality and social stability. The strength of applying the Key Factors to energy system planning in different countries is contingent on country resources, location, electricity demand, the electricity generation industry, available technology, the economic situation and prospects, and energy policy and regulation.

Relevance: 100.00%

Abstract:

The control of carbon nanotube conductivity is generating interest in several fields, since it may be relevant for a number of applications. The self-organizing properties of liquid crystals may be used to impose alignment on dispersed carbon nanotubes, thus controlling their conductivity and its anisotropy. This leads to a number of possible applications in photonic and electronic devices, such as electrically controlled carbon nanotube switches and crossboards. In this work, cells of liquid crystals doped with multi-walled nanotubes have been prepared in different configurations, and their conductivity variations upon switching have been investigated. It turns out that the conductivity evolution depends on the initial configuration (homogeneous, homeotropic or in-plane switching), the cell thickness and the switching record. The control of these manufacturing parameters allows the modulation of the electrical behavior of the carbon nanotubes.