961 results for "number of patent applications"


Relevance: 100.00%

Abstract:

Given the significant impact of Web 2.0-related innovations on new Internet-based initiatives, this paper seeks to identify to what extent the main developments are protected by patents and whether patents have had a leading role in the advent of Web 2.0. The article shows that the number of patent applications filed is low for many of the Web 2.0 technologies in frequent use and that, of those filed, even fewer have been granted. The conclusion is that patents do not seem to be a relevant factor in the development of Web 2.0 (and, more generally, in dynamic markets) where there is a high degree of innovation and entry barriers for newcomers are low.

Relevance: 100.00%

Abstract:

Nanotechnology developments continue to be produced at exponential rates for a wide and diverse range of applications. This paper presents a technological forecasting study of nanotechnology applied to health, based on information collected in Brazil from 1991 to 2010. The longitudinal evolution of the number of patent applications, their topics, and their respective patent families was evaluated against total global activity. A total of 1352 patent applications were obtained for this period and analyzed with respect to the legal nature of the depositors, the year of deposit, the depositors' home countries, and the processes involved. The goal was to provide policy-makers with input to adapt and modernize the regulatory framework on nanotechnology and health-related risks as a strategic area in science policy.
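
The longitudinal counts described above amount to simple tallies over a table of patent records. Below is a minimal sketch, assuming a hypothetical CSV with columns named year, depositor_type and country (file and column names are illustrative, not the study's):

```python
# Minimal sketch of the longitudinal tally described above, using a hypothetical
# CSV of patent records with columns: year, depositor_type, country.
import pandas as pd

records = pd.read_csv("patent_applications.csv")  # hypothetical file name

# Applications filed per year: the longitudinal evolution, 1991-2010.
per_year = records.groupby("year").size()

# Breakdowns by the legal nature of the depositor and by home country.
by_depositor = records["depositor_type"].value_counts()
by_country = records["country"].value_counts()

print(per_year)
print(by_depositor)
print(by_country.head(10))
```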

Relevance: 100.00%

Abstract:

The profile analysis of CNPq Research Productivity (PQ) Fellows in the four subfields of chemistry and in their respective specialties highlighted particularities with regard to the indicators related to the judging criteria established by the Chemistry Advisory Committee. The curricula of all 727 PQ fellows with grants active on 15 March 2013 were analyzed, spanning the previous 10 years (2003-2013). Among PQ-1 fellows, researchers in the subfield of Organic Chemistry had the highest median number of articles published per year. The subfield of Analytical Chemistry trains a higher number of postgraduate students than the other chemistry subfields, and it also had the highest average Hirsch index among PQ-1A and PQ-1B fellows. Inorganic Chemistry, on the other hand, had the highest average number of patent applications per researcher, while Physical Chemistry had the specialties with the highest citation rates per paper and the highest average impact factors per journal. In all subfields, women made up a low proportion of fellows, especially at the highest levels of the PQ fellowships. Although quantitative differences in scientific output were observed among the subfields, a qualitative evaluation of scientific output was not carried out.
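
The Hirsch index compared above has a simple operational definition: a researcher has index h if h of their papers have at least h citations each. A minimal sketch:

```python
# Minimal sketch of the Hirsch index (h-index): a researcher has index h if
# h of their papers have at least h citations each.
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: only three papers have at least three citations each, so h = 3.
assert h_index([10, 8, 5, 3, 2, 1]) == 3
```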

Relevance: 100.00%

Abstract:

Intellectual production and technological development can differentiate countries and regions in the process of socioeconomic development. In Brazil's case, the energy role of fuel ethanol for light motor vehicles stands out as an important result of the country's technological progress, one that goes beyond its agro-climatic suitability. The continued pursuit of vertically integrated technological specialization in the sugar-energy sector could lead Brazil to a more comfortable, if not autonomous, position: not only as a producer of raw material, but as a holder of value-adding processes for the production of second-generation ethanol from lignocellulosic biomass. The objective of this dissertation is to analyze the R&D efforts that resulted in patent filings and publications at official bodies such as the United States Patent and Trademark Office (USPTO), the European Patent Office (EPO), and the Instituto Nacional de Propriedade Industrial (INPI) on the subject of second-generation ethanol, and to verify whether these efforts affect the competitive power of the countries and firms filing the patents. In addition to collecting and examining data from the bodies mentioned above, the Herfindahl-Hirschman Index (HHI) and the four-firm concentration ratio (CR4) were calculated for the filing and publication data on lignocellulosic bioethanol; these measures are traditionally used by antitrust and consumer-protection regulators when authorizing mergers and acquisitions among participants in a given market. This method makes it possible to observe the degree of competition among the firms filing patents on the subject, the possible trend toward control of the field in the near future, and the race to sell royalties on processes developed in different technological areas to increase the industrial production of advanced ethanol. The results indicate a high concentration of research efforts, measured by patent filings related to second-generation ethanol, in a very small number of North American companies when the US database is analyzed. The success of these efforts, measured by patent publications, however, is concentrated neither in the US nor in the European Union. In Brazil's case, no patent publications on lignocellulosic bioethanol have yet been found, and only one Brazilian company holds a patent published in the United States. These results suggest that investment in scientific research in Brazil may produce more published articles and academic/scientific degrees than patent registrations at the bodies specialized in qualifying the invention of methods, processes, or formulas, inside or outside the country. This may indicate either a low research effort on the subject or the loss, by the author and/or their institution, of the opportunity to have their research effort rewarded with royalties as compensation for creativity, intellectual dedication, and economic resources. The results of this study contribute to the debate on the growing need for the production and supply of renewable energy sources, such as advanced ethanol biofuel based on sugarcane bagasse, at more competitive costs, as an additional feedstock for incremental ethanol production in the near future.
The study's conclusions indicate the need to increase the production of applied knowledge and the efforts to secure its intellectual property, allowing financial returns through royalties.
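
The two concentration measures used in the dissertation have standard closed forms: the HHI is the sum of the squared market shares (in percent), and CR4 is the combined share of the four largest firms. A minimal sketch over hypothetical per-firm filing counts:

```python
# Minimal sketch of the HHI and CR4 concentration measures named above, applied
# to hypothetical per-firm counts of patent filings on lignocellulosic bioethanol.
def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared shares in percent.
    Ranges from near 0 (atomistic market) to 10000 (monopoly)."""
    return sum(s ** 2 for s in shares)

def cr4(shares):
    """Four-firm concentration ratio: combined share of the four largest firms."""
    return sum(sorted(shares, reverse=True)[:4])

# Hypothetical example: filings per firm, converted to percentage shares.
filings = {"FirmA": 40, "FirmB": 25, "FirmC": 15, "FirmD": 10, "FirmE": 10}
total = sum(filings.values())
shares = [100 * n / total for n in filings.values()]

print(f"HHI = {hhi(shares):.0f}")   # 2650 here; above 2500 is usually deemed highly concentrated
print(f"CR4 = {cr4(shares):.0f}%")  # 90% held by the four largest depositors
```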

Relevance: 100.00%

Abstract:

Patent and trademark offices that run according to principles of new management have an inherent need for dependable forecasting data when planning capacity and service levels. For the Spanish Office of Patents and Trademarks to plan its resource needs efficiently, it requires methods that can predict changes in the number of patent and trademark applications at different time horizons. The prediction of the time series of Spanish patent and trademark applications (1979-2009) was based on different techniques of short-term time series prediction. The methods used can be grouped into two specific areas: regression models of trends and time series models. The results of this study show that it is possible to model the series of patent and trademark applications with different models, especially ARIMA, with satisfactory model fit and relatively low error.
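
The ARIMA modelling reported above can be sketched with statsmodels. The series below is synthetic and the (p, d, q) order is illustrative, not the paper's fitted specification:

```python
# Minimal sketch of ARIMA forecasting on an annual application series
# (1979-2009); the data and the model order are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

years = pd.period_range("1979", "2009", freq="Y")
rng = np.random.default_rng(0)
counts = pd.Series(1000 + np.cumsum(rng.normal(50, 20, len(years))), index=years)

model = ARIMA(counts, order=(1, 1, 1)).fit()  # illustrative order
print(model.summary())

# Short-term horizon, as in the paper's prediction setting.
print(model.forecast(steps=3))
```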

Relevance: 100.00%

Abstract:

Synchronization plays an important role in telecommunication systems, integrated circuits, and automation systems. Formerly, the master-slave synchronization strategy was used in the great majority of cases due to its reliability and simplicity. Recently, with the development of wireless networks and the increase of the operating frequency of integrated circuits, decentralized clock distribution strategies have been gaining importance. Consequently, fully connected clock distribution systems with nodes composed of phase-locked loops (PLLs) appear as a convenient engineering solution. In this work, the stability of the synchronous state of these networks is studied in two relevant situations: when the node filters are first-order lag-lead low-pass filters, and when they are second-order low-pass filters. For first-order filters, the synchronous state of the network is shown to be stable for any number of nodes. For second-order filters, there is an upper limit on the number of nodes, depending on the PLL parameters. Copyright (C) 2009 Atila Madureira Bueno et al.
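
The synchronous state can be illustrated with a deliberately simplified phase model of a fully connected network: each node nudges its phase toward the others through a sinusoidal phase detector. This is an illustration of the phenomenon only, not the paper's PLL equations (which include the filter dynamics that produce the node-number limit):

```python
# Illustrative sketch only: fully connected phase oscillators as a crude
# stand-in for mutually coupled PLL nodes; not the paper's model.
import numpy as np

def simulate(n_nodes=6, gain=2.0, t_end=10.0, dt=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, n_nodes)  # random initial phases
    for _ in range(int(t_end / dt)):
        # Each node integrates the mean phase-detector output against the others.
        coupling = np.mean(np.sin(phase[None, :] - phase[:, None]), axis=1)
        phase += dt * gain * coupling
    return phase

final = simulate()
# In the synchronous state the pairwise phase differences collapse toward zero.
print(np.ptp(final))  # peak-to-peak spread of the final phases
```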

Relevance: 100.00%

Abstract:

This paper is an elaboration of the DECA algorithm [1] for blindly unmixing hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data in which geometric approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We therefore resort to a statistical framework in which the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
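
The linear mixing model at the core of the method is compact: each observed pixel is y = Ma + n, where the columns of M hold the endmember signatures and the abundance vector a is non-negative and sums to one. A minimal sketch of generating such data (the Dirichlet draw below only illustrates the abundance constraints, not the full DECA mixture):

```python
# Minimal sketch of the linear mixing model: Y = M @ A + noise, with each column
# of A (the abundances of one pixel) non-negative and summing to one.
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

M = rng.uniform(0.0, 1.0, (n_bands, n_endmembers))         # endmember signatures
A = rng.dirichlet(np.ones(n_endmembers), size=n_pixels).T  # abundance fractions
noise = rng.normal(0.0, 0.01, (n_bands, n_pixels))

Y = M @ A + noise                  # observed spectra, one column per pixel
print(Y.shape, A.sum(axis=0)[:5])  # columns of A sum to one by construction
```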

Relevance: 100.00%

Abstract:

Dissertation presented at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia, in fulfilment of the requirements for the Master's degree in Mathematics and Applications, specialization in Actuarial Sciences, Statistics and Operations Research

Relevance: 100.00%

Abstract:

The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto

Relevance: 100.00%

Abstract:

The aim of this paper is to suggest a method to find, endogenously, the points that group the individuals of a given distribution into k clusters, where k is endogenously determined. These points are the cut-points. Thus, we need to determine a partition of the N individuals into a number k of groups, in such a way that individuals in the same group are as alike as possible, but as distinct as possible from individuals in other groups. This method can be applied to endogenously identify k groups in income distributions; possible applications include poverty analysis.
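
For one-dimensional data such as incomes, the partition minimizing within-group squared error for a given k can be found exactly by dynamic programming (Fisher's method, also known as Jenks natural breaks). The sketch below uses that classical approach as an illustration, with the endogenous choice of k (e.g. by a penalty or elbow rule) left open as in the paper:

```python
# Illustrative sketch: exact 1D partition into k contiguous groups minimizing
# within-group squared error (Fisher / Jenks); cut-points fall at group borders.
import numpy as np

def within_sse(prefix, prefix_sq, i, j):
    """Sum of squared deviations of sorted x[i:j] around its mean."""
    n = j - i
    s = prefix[j] - prefix[i]
    return (prefix_sq[j] - prefix_sq[i]) - s * s / n

def best_partition(values, k):
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    prefix = np.concatenate([[0.0], np.cumsum(x)])
    prefix_sq = np.concatenate([[0.0], np.cumsum(x * x)])
    # cost[g][j]: minimal SSE for splitting x[:j] into g groups.
    cost = np.full((k + 1, n + 1), np.inf)
    back = np.zeros((k + 1, n + 1), dtype=int)
    cost[0][0] = 0.0
    for g in range(1, k + 1):
        for j in range(g, n + 1):
            for i in range(g - 1, j):
                c = cost[g - 1][i] + within_sse(prefix, prefix_sq, i, j)
                if c < cost[g][j]:
                    cost[g][j], back[g][j] = c, i
    cuts, j = [], n            # walk back to recover the cut-points
    for g in range(k, 1, -1):
        j = back[g][j]
        cuts.append(x[j - 1])  # last value of the preceding group
    return sorted(cuts), cost[k][n]

incomes = [1, 2, 2, 3, 10, 11, 12, 30, 31]
print(best_partition(incomes, k=3))  # cut-points at the natural gaps: [3.0, 12.0]
```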

Relevance: 100.00%

Abstract:

PURPOSE: To introduce a new k-space traversal strategy for segmented three-dimensional echo planar imaging (3D EPI) that encodes two partitions per radiofrequency excitation, effectively reducing the number of excitations used to acquire a 3D EPI dataset by half. METHODS: The strategy was evaluated in the context of functional MRI applications for: image quality, compared with segmented 3D EPI; temporal signal-to-noise ratio (tSNR) and the ability to detect resting state networks, compared with multislice two-dimensional (2D) EPI and segmented 3D EPI; and temporal resolution (the ability to separate cardiac- and respiration-related fluctuations from the desired blood oxygen level-dependent signal of interest). RESULTS: Whole brain images with a nominal voxel size of 2 mm isotropic could be acquired with a temporal resolution under half a second, using traditional parallel imaging acceleration up to 4× in the partition-encode direction and the novel data acquisition speed-up of 2× with a 32-channel coil. With 8× data acquisition speed-up in the partition-encode direction, 3D reduced-excitations (RE)-EPI produced acceptable image quality without introducing noticeable additional artifacts. Owing to increased tSNR and better characterization of physiological fluctuations, the new strategy allowed detection of more resting state networks than multislice 2D-EPI and segmented 3D EPI. CONCLUSION: 3D RE-EPI resulted in significant increases in temporal resolution for whole brain acquisitions and in improved physiological noise characterization compared with 2D-EPI and segmented 3D EPI. Magn Reson Med 72:786-792, 2014. © 2013 Wiley Periodicals, Inc.
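
The tSNR figure of merit used in the comparison is the voxelwise temporal mean divided by the temporal standard deviation of the fMRI series. A minimal sketch on synthetic data:

```python
# Minimal sketch of a voxelwise temporal SNR (tSNR) map: mean over time divided
# by standard deviation over time.
import numpy as np

def tsnr(timeseries):
    """timeseries: 4D array (x, y, z, t). Returns a 3D tSNR map."""
    mean = timeseries.mean(axis=-1)
    std = timeseries.std(axis=-1)
    return mean / np.maximum(std, 1e-12)  # guard against zero variance

# Synthetic volume with 200 time points; expected tSNR about 1000/10 = 100.
rng = np.random.default_rng(2)
data = 1000 + rng.normal(0, 10, (16, 16, 8, 200))
print(tsnr(data).mean())
```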

Relevance: 100.00%

Abstract:

Nanoparticles <100 nanometres are being introduced into industrial processes, but they are suspected of causing negative health effects similar to those of ambient particles. Poor knowledge of the scale of their introduction has so far prevented a global risk analysis. In 2006, a targeted telephone survey among Swiss companies (1) showed the use of nanoparticles in a few selected companies but did not provide data from which to extrapolate to the full Swiss workforce. The purpose of the study presented here was to provide a quantitative estimate of the potential occupational exposure to nanoparticles in Swiss industry. Method: A stratified representative questionnaire survey of 1626 Swiss companies in the production sector was conducted in 2007. The written questionnaire collected data on the nanoparticles used, the number of potentially exposed persons in each company, and the companies' protection strategies. Results: The response rate of the study was 58.3%. The number of companies estimated to be using nanoparticles in Switzerland was 586 (95% confidence interval 145 to 1027). An estimated 1309 workers (95% CI 1073 to 1545) do their job in the same room as a nanoparticle application. Personal protection was shown to be the predominant means of protection. Such information is valuable for risk evaluation. The low number of companies dealing with nanoparticles in Switzerland suggests that policy makers, as well as health, safety and environmental officers within companies, can focus their efforts on a relatively small number of companies or workers. The collected data on particle types and applications may be used for research on prevention strategies and adapted protection means. However, to reflect the most recent trends, the information presented here has to be continuously updated, and a large-scale inventory of usage should be considered.
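
The extrapolated company count with its 95% confidence interval follows the usual survey logic: scale the observed proportion of respondents reporting nanoparticle use to the full population of companies, with a normal-approximation interval. A minimal sketch with placeholder numbers (not the study's raw data):

```python
# Minimal sketch of survey extrapolation with a normal-approximation 95% CI.
# All counts below are hypothetical placeholders.
import math

n_respondents = 948    # e.g. 58.3% of 1626 companies surveyed
n_using = 34           # hypothetical respondents reporting nanoparticle use
population = 16000     # hypothetical total companies in the production sector

p = n_using / n_respondents
se = math.sqrt(p * (1 - p) / n_respondents)
low, high = p - 1.96 * se, p + 1.96 * se

print(f"estimated companies: {p * population:.0f} "
      f"(95% CI {low * population:.0f} to {high * population:.0f})")
```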

Relevance: 100.00%

Abstract:

Analysis of variance is commonly used in morphometry to ascertain differences in parameters between several populations. Failure to detect significant differences between populations (type II error) may be due to suboptimal sampling and lead to erroneous conclusions; the concept of statistical power allows one to avoid such failures through adequate sampling. Several examples from the morphometry of the nervous system show the use of the power of a hierarchical analysis of variance test for choosing appropriate sample and subsample sizes. In the first case, neuronal densities in the human visual cortex, we find the number of observations to have little effect. For dendritic spine densities in the visual cortex of mice and humans, the effect is somewhat larger. A substantial effect is shown in our last example, dendritic segmental lengths in the monkey lateral geniculate nucleus. It is in the nature of the hierarchical model that sample size is always more important than subsample size. The relative weight to be attributed to subsample size thus depends on the magnitude of the between-observations variance relative to the between-individuals variance.
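
Power in such a hierarchical design can also be approximated by simulation: draw individuals with a between-individuals variance, draw observations around each individual with a between-observations variance, average to individual means, and test the group difference. The sketch below uses a t-test on individual means and arbitrary placeholder variance components; it reproduces the qualitative point that sample size outweighs subsample size:

```python
# Illustrative sketch only: simulation-based power for a two-population
# hierarchical (individuals / observations) design with placeholder parameters.
import numpy as np
from scipy import stats

def power(n_individuals, n_obs, effect=1.0, sd_between=1.0, sd_within=2.0,
          n_sim=2000, alpha=0.05, seed=3):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        mu_a = rng.normal(0.0, sd_between, n_individuals)  # individual means
        mu_b = rng.normal(effect, sd_between, n_individuals)
        a = rng.normal(mu_a[:, None], sd_within, (n_individuals, n_obs)).mean(axis=1)
        b = rng.normal(mu_b[:, None], sd_within, (n_individuals, n_obs)).mean(axis=1)
        hits += stats.ttest_ind(a, b).pvalue < alpha
    return hits / n_sim

# More individuals beats more observations per individual:
print(power(n_individuals=5, n_obs=20))
print(power(n_individuals=20, n_obs=5))
```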

Relevance: 100.00%

Abstract:

Application of semi-distributed hydrological models to large, heterogeneous watersheds poses several problems. On one hand, the spatial and temporal variability of catchment features should be adequately represented in the model parameterization, while keeping model complexity at an acceptable level so as to take advantage of state-of-the-art calibration techniques. On the other hand, model complexity increases the uncertainty in adjusted model parameter values, and therefore in the water routing across the watershed. This is critical for water quality applications, where not only streamflow but also a reliable estimate of the surface versus subsurface contributions to runoff is needed. In this study, we show how a regularized inversion procedure combined with a multiobjective calibration strategy successfully solves the parameterization of a complex application of a water quality-oriented hydrological model. The final values of several optimized parameters showed significant and consistent differences across geological and landscape features. Although the number of optimized parameters was significantly increased by the spatial and temporal discretization of adjustable parameters, the uncertainty in the water routing results remained reasonable. In addition, a stepwise numerical analysis showed that the effects on calibration performance of including different data types in the objective function could be inextricably linked. Thus, caution should be taken when adding or removing data from an aggregated objective function.
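
The aggregated objective function discussed above has a generic shape worth making explicit: weighted sums of squared misfits for each data type, plus a regularization term pulling parameters toward preferred values. A minimal sketch with a hypothetical stand-in for the hydrological model (the paper's actual inversion uses a dedicated regularized parameter-estimation framework):

```python
# Illustrative sketch only: a regularized, multi-data-type calibration target.
import numpy as np
from scipy.optimize import minimize

def simulate(params):
    """Hypothetical stand-in for the hydrological model's outputs."""
    flow = params[0] * np.arange(10.0)          # e.g. a streamflow series
    fraction = np.full(5, np.tanh(params[1]))   # e.g. surface-runoff fractions
    return flow, fraction

obs_flow = 2.0 * np.arange(10.0)
obs_fraction = np.full(5, 0.4)
prior = np.array([1.0, 0.0])       # preferred (regularized) parameter values
w_flow, w_frac, lam = 1.0, 10.0, 0.1

def objective(params):
    flow, fraction = simulate(params)
    # Weighted data misfits for both data types, plus Tikhonov regularization.
    return (w_flow * np.sum((obs_flow - flow) ** 2)
            + w_frac * np.sum((obs_fraction - fraction) ** 2)
            + lam * np.sum((params - prior) ** 2))

result = minimize(objective, prior, method="Nelder-Mead")
print(result.x)  # parameters balancing both data types and the prior
```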