68 results for Coffee certification schemes
Abstract:
The Keller-Segel system has been widely proposed as a model for bacterial waves driven by chemotactic processes. Recent experiments on E. coli have shown the precise structure of traveling pulses. We present here an alternative mathematical description of traveling pulses at a macroscopic scale. This modeling task is complemented with numerical simulations in accordance with the experimental observations. Our model is derived from an accurate kinetic description of the mesoscopic run-and-tumble process performed by bacteria, and it can account for recent experimental observations with E. coli. Qualitative agreements include the asymmetry of the pulse and the transition in collective behaviour (clustered motion versus dispersion). In addition, we can capture quantitatively the main characteristics of the pulse, such as its speed and the relative size of its tails. This work opens several experimental and theoretical perspectives. Coefficients at the macroscopic level are derived from considerations at the cellular scale; for instance, the stiffness of the signal integration process turns out to have a strong effect on collective motion. Furthermore, the bottom-up scaling allows us to perform preliminary mathematical analysis and to write efficient numerical schemes. This model is intended as a predictive tool for the investigation of bacterial collective motion.
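For reference, the classical Keller-Segel model referred to in the opening sentence can be written, in one common parabolic form, for the cell density ρ and the chemoattractant concentration c (the notation below is generic and not taken from the paper):

```latex
\partial_t \rho = \nabla\!\cdot\!\bigl(D_\rho\,\nabla\rho - \chi\,\rho\,\nabla c\bigr),
\qquad
\partial_t c = D_c\,\Delta c + \alpha\,\rho - \beta\,c ,
```

with diffusivities D_ρ and D_c, chemotactic sensitivity χ, and production and degradation rates α and β.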
Abstract:
In this paper we describe an open learning object repository on Statistics based on DSpace, which contains true learning objects, that is, exercises, equations, data sets, etc. This repository is part of a larger project intended to promote the use of learning object repositories as part of the learning process in virtual learning environments. This involves the creation of a new user interface that provides users with additional services such as resource rating, commenting and so on. Both aspects render traditional metadata schemes such as Dublin Core inadequate: there are, for instance, resources with no title or author, as those fields are not used by learners to browse and search for learning resources in the repository. Exporting OAI-PMH compliant records using OAI-DC is therefore not possible, which limits the visibility of the repository's learning objects outside the institution. We propose an architecture based on ontologies and the use of extended metadata records for both storing and refactoring such descriptions.
Abstract:
The aim of this paper is to verify, for the Spanish case, whether the internal democracy of the major political parties (PSOE, AP/PP, PCE/IU, PNV and CDC) increased between 1977 and 2008. To do this, we focus on their leadership selection processes, one of the key elements associated with intra-party democracy. The paper introduces data on four different dimensions of leadership selection: the certification process, the voting procedure, the inclusiveness of the selectorate and, finally, the degree of competitiveness. The results show that there have been few changes in the leadership selection processes of the Spanish political parties since 1977. However, the results of the Spanish case are also used to suggest some preliminary links between the four dimensions.
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods, for the convection–diffusion–reaction (CDR) equation and the Helmholtz equation respectively. The work begins with an a priori analysis of consistency recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency recovery methods are employed; hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The method follows earlier approaches that may be viewed as upwinding plus a discontinuity-capturing operator, and some remarks are made on its extension to multiple dimensions. Finally, we present a new numerical scheme for the Helmholtz equation that yields quasi-exact solutions. The focus is on approximating the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is a mesh/grid resolution of at least eight elements per wavelength, and no stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analysis in 1D and 2D illustrates the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
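As a one-dimensional sketch of the averaging idea (assuming linear elements, a uniform grid of size h and a lumped right-hand side, details not spelled out in the abstract), the Galerkin and finite difference stencils for u'' + k²u = f at an interior node j, and their average, read:

```latex
% Galerkin FEM with linear elements (consistent mass matrix):
\frac{u_{j-1}-2u_j+u_{j+1}}{h^2} + k^2\,\frac{u_{j-1}+4u_j+u_{j+1}}{6} = f_j,
% classical second-order finite differences:
\frac{u_{j-1}-2u_j+u_{j+1}}{h^2} + k^2\,u_j = f_j,
% and their average, i.e. the stencil described above:
\frac{u_{j-1}-2u_j+u_{j+1}}{h^2} + k^2\,\frac{u_{j-1}+10u_j+u_{j+1}}{12} = f_j.
```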
Abstract:
The campaign efforts made by NGOs brought conflict diamonds (diamonds linked to armed conflicts) to the international forefront in the late 1990s. In response, the Kimberley Process (KP), a negotiation forum between states, NGOs and industry, was formed to debate possible solutions to curb the trade in conflict diamonds. Less than three years later, a voluntary international certification system called the Kimberley Process Certification Scheme (KPCS) was adopted. The KPCS regulates the trade in rough diamonds by certifying all legal diamonds. This article reviews the problem of conflict diamonds, explains how an international campaign raised awareness of the problem, and how the search for solutions culminated in the KP. The analysis focuses on the different actors involved (NGOs, states and industry) and their changing interactions over the course of the campaign and of the efforts made to create international regulation. By way of conclusion, some key lessons drawn from this case study, both analytical and practical, are highlighted.
Abstract:
Campaign efforts by NGOs initially put conflict diamonds on the global radar screen in the late 1990s. In response, the Kimberley Process (KP), a negotiation forum between states, NGOs, and industry, was formed to discuss possible solutions to curb the trade in conflict diamonds. Less than three years later, a voluntary, global certification named the Kimberley Process Certification Scheme (KPCS) was adopted. The KPCS regulates the trade of rough diamonds by certifying all legitimate diamonds. This paper outlines the problem of conflict diamonds, how a global campaign raised awareness about the issue, and how the process of solution building unfolded in the KP. My analysis focuses on the diverse set of actors (NGOs, states, and industry) and their changing interactions over the course of the campaign and global regulation efforts. I conclude with several key lessons that capture important elements observed in this case study.
Abstract:
We introduce and analyze two new semi-discrete numerical methods for the multi-dimensional Vlasov-Poisson system. The schemes are constructed by combining a discontinuous Galerkin approximation of the Vlasov equation with a mixed finite element method for the Poisson problem. We show optimal error estimates in the case of smooth, compactly supported initial data, and we propose a scheme that preserves the total energy of the system.
Abstract:
We consider linear optimization over a nonempty convex semi-algebraic feasible region F. Semidefinite programming is an example. If F is compact, then for almost every linear objective there is a unique optimal solution, lying on a unique "active" manifold, around which F is "partly smooth", and the second-order sufficient conditions hold. Perturbing the objective results in smooth variation of the optimal solution. The active manifold consists, locally, of these perturbed optimal solutions; it is independent of the representation of F, and is eventually identified by a variety of iterative algorithms such as proximal and projected gradient schemes. These results extend to unbounded sets F.
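A minimal numerical illustration of the identification property (a toy example, not taken from the paper): projected gradient applied to a linear objective over the unit Euclidean ball lands on the boundary sphere, which plays the role of the active manifold, after finitely many iterations.

```python
import numpy as np

c = np.array([1.0, -2.0, 0.5])        # linear objective: minimize c.x over the unit ball

def project(x):
    """Euclidean projection onto the unit ball {x : ||x|| <= 1}."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = np.zeros(3)
for _ in range(20):
    x = project(x - 0.5 * c)          # projected gradient step (the gradient of c.x is c)

print(np.linalg.norm(x))                        # ~1.0: the iterate sits on the active manifold
print(np.allclose(x, -c / np.linalg.norm(c)))   # and coincides with the unique minimizer
```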
Abstract:
An increasing number of studies have sprung up in recent years seeking to identify individual inventors from patent data. Different heuristics have been suggested for using their names and other information disclosed in patent documents in order to find out “who is who” in patents. This paper contributes to this literature by setting forth a methodology to identify inventors in applications filed at the European Patent Office (EPO hereafter). As in much of this literature, we follow a three-step procedure: (1) the parsing stage, aimed at reducing the noise in the inventor's name and other fields of the patent; (2) the matching stage, where name matching algorithms are used to group possible similar names; (3) the filtering stage, where additional information and different scoring schemes are used to decide which of these potential matches correspond to the same inventor. The paper includes some figures resulting from applying the algorithms to the set of European inventors filing at the EPO over a long period of time.
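A toy sketch of the three stages for a pair of inventor records, using only the Python standard library; the field names, the applicant bonus and the threshold are illustrative assumptions, not the scoring scheme actually used in the paper.

```python
import re
from difflib import SequenceMatcher

def parse(name: str) -> str:
    """Parsing stage: lower-case, strip punctuation and collapse whitespace."""
    name = re.sub(r"[^\w\s]", " ", name.lower())
    return re.sub(r"\s+", " ", name).strip()

def name_score(a: str, b: str) -> float:
    """Matching stage: string similarity between the two parsed names."""
    return SequenceMatcher(None, parse(a), parse(b)).ratio()

def same_inventor(rec_a: dict, rec_b: dict, threshold: float = 0.9) -> bool:
    """Filtering stage: combine the name score with additional patent fields
    (here only the applicant, as an illustration) into a simple decision rule."""
    score = name_score(rec_a["name"], rec_b["name"])
    if rec_a.get("applicant") and rec_a.get("applicant") == rec_b.get("applicant"):
        score += 0.05                  # a shared applicant makes a match more plausible
    return score >= threshold

print(same_inventor({"name": "Muller, Hans-J.", "applicant": "ACME GmbH"},
                    {"name": "Mueller, Hans J", "applicant": "ACME GmbH"}))
```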
Abstract:
The effectiveness of R&D subsidies can vary substantially depending on their characteristics. Specifically, the amount and intensity of such subsidies are crucial issues in the design of public schemes supporting private R&D. Public agencies determine the intensities of R&D subsidies for firms in line with their eligibility criteria, although assessing the effects of R&D projects accurately is far from straightforward. The main aim of this paper is to examine whether there is an optimal intensity for R&D subsidies through an analysis of their impact on private R&D effort. We examine the decisions of a public agency to grant subsidies taking into account not only the characteristics of the firms but also, as few previous studies have done to date, those of the R&D projects. In determining the optimal subsidy we use both parametric and nonparametric techniques. The results show a non-linear relationship between the percentage of subsidy received and the firms’ R&D effort. These results have implications for technology policy, particularly for the design of R&D subsidies that ensure enhanced effectiveness.
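As a schematic, parametric version of the exercise on synthetic data (the functional form and all numbers are illustrative, not the paper's estimates): fit a quadratic relationship between subsidy intensity and private R&D effort and read the estimated optimal intensity off its vertex.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: subsidy intensity (share of project cost) and induced private R&D effort,
# generated with an inverted-U shape to mimic the non-linear relationship reported above.
intensity = rng.uniform(0.1, 0.8, 200)
effort = 1.0 + 3.0 * intensity - 2.5 * intensity**2 + rng.normal(0.0, 0.1, 200)

# Parametric fit: a quadratic in intensity; its vertex is the estimated optimal intensity.
b2, b1, b0 = np.polyfit(intensity, effort, 2)
print(f"estimated optimal subsidy intensity: {-b1 / (2 * b2):.2f}")
```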
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with that of a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
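A simplified sketch of the idea behind FMG/OPT on a one-dimensional grid: the prolonged coarse-grid correction is used as a search direction and scaled with a crude backtracking line search. The grid-transfer operators, the coarse solver and the omission of a first-order consistency term are simplifying assumptions, not details taken from the paper.

```python
import numpy as np

def prolong(xc):
    """Linear interpolation from a coarse grid (n points) to a fine grid (2n - 1 points)."""
    xf = np.zeros(2 * len(xc) - 1)
    xf[::2] = xc
    xf[1::2] = 0.5 * (xc[:-1] + xc[1:])
    return xf

def restrict(xf):
    """Injection from the fine grid back to the coarse grid."""
    return xf[::2]

def fmg_opt_step(f, x, coarse_min, steps=(1.0, 0.5, 0.25, 0.125)):
    """One simplified FMG/OPT-style step on the fine level: approximately solve a
    coarse-level problem, prolong the correction, and use it as a search direction."""
    xc0 = restrict(x)
    d = prolong(coarse_min(xc0) - xc0)   # coarse-grid correction as a search direction
    fx = f(x)
    for a in steps:                      # backtracking line search along the correction
        if f(x + a * d) < fx:
            return x + a * d
    return x
```

In a full MG/OPT-style implementation the coarse objective would also carry a first-order consistency correction so that the prolonged step is guaranteed to be a descent direction.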
Abstract:
During the recent period of economic crisis, many countries have introduced scrappage schemes to boost the sale and production of vehicles, particularly of vehicles designed to pollute less. In this paper, we analyze the impact of a particular scheme in Spain (Plan2000E) on vehicle prices and sales figures as well as on the reduction of polluting emissions from vehicles on the road. We considered the introduction of this scheme an exogenous policy change, and because we could distinguish a control group (non-subsidized vehicles) and a treatment group (subsidized vehicles) before and after the introduction of the Plan, we were able to carry out our analysis as a quasi-natural experiment. Our study reveals that manufacturers increased vehicle prices by the same amount they were granted through the Plan (€1,000). In terms of sales, econometric estimations revealed an increase of almost 5% as a result of the implementation of the Plan. With regard to environmental efficiency, we compared the costs (the amount of money invested) and the benefits of the program (reductions in polluting emissions and additional fiscal revenues) and found that the Plan would only be beneficial if it boosted demand by at least 30%.
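The quasi-natural-experiment design described above lends itself to a difference-in-differences regression; below is a hedged sketch using statsmodels, where the input file and the column names (price, subsidized, post) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel of vehicle models observed before/after the introduction of Plan2000E:
# 'price', 'subsidized' (1 = eligible for the scheme) and 'post' (1 = after introduction).
df = pd.read_csv("vehicle_panel.csv")

# Difference-in-differences: under the parallel-trends assumption, the coefficient on
# subsidized:post estimates the effect of the scheme on prices (sales can be handled analogously).
did = smf.ols("price ~ subsidized * post", data=df).fit()
print(did.summary())
```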
Abstract:
The Great Tohoku-Kanto earthquake and the resulting tsunami have brought considerable attention to the issue of the construction of new power plants. We argue in this paper that nuclear power is not a sustainable solution to energy problems. First, we explore the stock of uranium-235 and the different schemes developed by the nuclear power industry to exploit this resource. Second, we show that these methods, fast breeder and MOX fuel reactors, are not feasible. Third, we show that the argument that nuclear energy can be used to reduce CO2 emissions is false: the emissions from the increased water evaporation caused by nuclear power generation must be accounted for. In the case of Japan, water from nuclear power plants is drained into the surrounding sea, raising the water temperature, which has an adverse effect on the immediate ecosystem as well as increasing CO2 emissions through increased water evaporation from the sea. Next, a short exercise is used to show that nuclear power is not even needed to meet consumer demand in Japan; such an exercise should be performed for any country considering the construction of additional nuclear power plants. Lastly, the paper concludes with a discussion of the implications of our findings.
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines - SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric, and particularly SVM, classifiers outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of the SVM algorithms. We observed that the improvements in producer’s and user’s accuracies brought by the homogeneity index differed depending on the land cover class. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer’s and user’s accuracies of around 90%. Our approach seems very well suited to accurately map land cover in tropical regions, thus having the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
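A schematic scikit-learn version of the SVM classifications described above; the input arrays, the feature layout (reflectance bands plus a homogeneity index) and the hyperparameters are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Per-pixel features: Landsat reflectance bands stacked with a GLCM homogeneity index;
# labels: land cover classes from training polygons (file names are hypothetical).
X = np.load("bands_plus_homogeneity.npy")
y = np.load("landcover_labels.npy")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Two of the four SVM variants compared in the study: radial basis function and sigmoid kernels.
for kernel in ("rbf", "sigmoid"):
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=10, gamma="scale"))
    clf.fit(X_tr, y_tr)
    print(kernel, accuracy_score(y_te, clf.predict(X_te)))
```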
Abstract:
This report aims to show that XML technology is the best alternative for meeting the technological challenge faced by the information extraction systems of new-generation applications. These systems must, on the one hand, guarantee their independence from the schemas of the databases that feed them and, on the other, be capable of displaying the information in multiple formats.