961 results for metadata schemes
Abstract:
We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule) after allowing for Markov switching in policy maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro-area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro-area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro-area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.
Abstract:
Cooperation between libraries is a universal language spoken in different dialects. In 1996 the libraries of the state-funded universities and the National Library of Catalonia (Spain) formed the Consortium of Academic Libraries of Catalonia (CBUC) to act as a channel for cooperation. The organization and activities of CBUC are an example of how this universal language has been adapted to the specific characteristics of the libraries of Catalonia. Catalonia is an autonomous region of Spain with 7 million inhabitants, with its own language, history and traditions, and with a strong sense of identity that facilitates cooperation. Thanks to this (and also to the hard work of the member libraries), CBUC has since created a union catalogue, an interlibrary lending programme, the Digital Library of Catalonia, a cooperative store, several cooperative repositories and other cooperation programmes. One of these cooperative repositories is RACO (Catalan Journals in Open Access, www.raco.cat), where the full-text articles of Catalan scientific, cultural and scholarly journals can be consulted in open access. The main purpose of RACO is to increase the visibility and use of the journals included and to disseminate the scientific and academic production published in Catalonia. This purpose translates into three specific aims: to encourage the electronic edition of Catalan journals, to provide an interface that allows searching across all the journals, and to provide the instruments for their preservation. There are currently 244 journals in RACO, comprising more than 85,000 articles (80% in OA) from 50 publishing institutions. Since it came into operation it has received more than 4 million queries. These 244 journals offer the full text of all published issues, although some journals may have a delay between the introduction of the table of contents and the full text of recent issues.
Since 2005 a retrospective digitization plan has allowed more than 350,000 pages of back issues to be digitized. The RACO repository runs on the open-source program OJS (Open Journal Systems, http://pkp.sfu.ca/ojs/) and uses Dublin Core metadata and the interoperability protocol created by the Open Archives Initiative (OAI), which increases the visibility of the published articles by exposing them alongside other international repositories.
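The harvesting route the abstract describes rests on OAI-PMH responses carrying Dublin Core (oai_dc) records. As a minimal sketch, the snippet below extracts Dublin Core fields from one such record; the sample XML content and the `extract_dc` helper are illustrative assumptions (only the OAI-PMH and Dublin Core namespaces are standard), not taken from RACO's actual feed.

```python
import xml.etree.ElementTree as ET

# Standard namespaces for OAI-PMH responses carrying Dublin Core metadata.
OAI = "http://www.openarchives.org/OAI/2.0/"
DC = "http://purl.org/dc/elements/1.1/"

# Illustrative record; a real harvest would request something like
# <base-url>?verb=ListRecords&metadataPrefix=oai_dc over HTTP.
SAMPLE = """<record xmlns="http://www.openarchives.org/OAI/2.0/">
  <metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>Example article title</dc:title>
      <dc:identifier>https://example.org/article/1</dc:identifier>
    </oai_dc:dc>
  </metadata>
</record>"""

def extract_dc(record_xml: str) -> dict:
    """Pull all Dublin Core elements out of one OAI-PMH record."""
    root = ET.fromstring(record_xml)
    fields = {}
    for el in root.iter():
        if el.tag.startswith("{" + DC + "}"):
            fields[el.tag.split("}")[1]] = el.text
    return fields

print(extract_dc(SAMPLE))
```

Because every OAI-compliant repository exposes the same oai_dc envelope, the same parser works against any harvesting source, which is what lets aggregators present RACO articles alongside other international repositories.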
Abstract:
We construct a new family of semi-discrete numerical schemes for the approximation of the one-dimensional periodic Vlasov-Poisson system. The methods are based on the coupling of a discontinuous Galerkin approximation to the Vlasov equation with several finite element (conforming, non-conforming and mixed) approximations for the Poisson problem. We show optimal error estimates for all the proposed methods in the case of smooth compactly supported initial data. The issue of energy conservation is also analyzed for some of the methods.
Abstract:
We characterize the class of strategy-proof social choice functions on the domain of symmetric single-peaked preferences. This class is strictly larger than the set of generalized median voter schemes (the class of strategy-proof and tops-only social choice functions on the domain of single-peaked preferences characterized by Moulin (1980)) since, on the domain of symmetric single-peaked preferences, generalized median voter schemes can be disturbed by discontinuity points and remain strategy-proof on the smaller domain. Our result identifies the specific nature of these discontinuities, which make it possible to design non-onto social choice functions that deal with feasibility constraints.
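As background to the abstract, a textbook generalized median voter scheme can be sketched and its strategy-proofness checked by brute force. This is a minimal illustration, not the paper's characterization: the phantom position 0.5, the two-voter setting, and the coarse grid of alternatives are arbitrary choices, and symmetric single-peaked utilities are taken as u_i(x) = -|x - peak_i|.

```python
from itertools import product

def median_voter(peaks, phantoms=(0.5,)):
    """Generalized median voter scheme: the outcome is the median of the
    reported peaks together with fixed 'phantom voter' positions."""
    vals = sorted(list(peaks) + list(phantoms))
    return vals[len(vals) // 2]

# Brute-force check of strategy-proofness for symmetric single-peaked
# preferences u_i(x) = -|x - peak_i| on a coarse grid of alternatives.
GRID = [i / 10 for i in range(11)]

def truthful_reporting_is_optimal(scheme):
    for p1, p2 in product(GRID, repeat=2):
        honest = scheme((p1, p2))
        for lie in GRID:
            if abs(scheme((lie, p2)) - p1) < abs(honest - p1) - 1e-12:
                return False  # voter 1 gained by misreporting
    return True

print(truthful_reporting_is_optimal(median_voter))  # → True
```

For example, with peaks 0.2 and 0.8 and the phantom at 0.5, the outcome is 0.5, and neither voter can pull the median toward their peak by misreporting.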
Abstract:
The Keller-Segel system has been widely proposed as a model for bacterial waves driven by chemotactic processes. Current experiments on E. coli have shown precise structure of traveling pulses. We present here an alternative mathematical description of traveling pulses at a macroscopic scale. This modeling task is complemented with numerical simulations in accordance with the experimental observations. Our model is derived from an accurate kinetic description of the mesoscopic run-and-tumble process performed by bacteria. This model can account for recent experimental observations with E. coli. Qualitative agreements include the asymmetry of the pulse and a transition in the collective behaviour (clustered motion versus dispersion). In addition we can capture quantitatively the main characteristics of the pulse, such as the speed and the relative size of the tails. This work opens several experimental and theoretical perspectives. Coefficients at the macroscopic level are derived from considerations at the cellular scale. For instance, the stiffness of the signal integration process turns out to have a strong effect on collective motion. Furthermore, the bottom-up scaling allows us to perform preliminary mathematical analysis and to write efficient numerical schemes. This model is intended as a predictive tool for the investigation of bacterial collective motion.
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods for the convection–diffusion–reaction (CDR) and the Helmholtz equations, respectively. The work embarks upon an a priori analysis of some consistency recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrices theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications on the formulation of the maximum principle, M-matrices theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The method follows the line of earlier methods that may be viewed as an upwinding operator plus a discontinuity-capturing operator. Finally, some remarks are made on the extension of the HRPG method to multiple dimensions. Next, we present a new numerical scheme for the Helmholtz equation resulting in quasi-exact solutions. The focus is on the approximation of the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is to provide a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analyses in 1D and 2D illustrate the quasi-exact properties of this scheme.
Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
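The stencil-averaging idea can be sketched in 1D. Under my assumptions (not the abstract's multidimensional setting): for u'' + k²u = 0 on a unit-spacing grid, the Galerkin FEM stencil carries the consistent mass weights (1/6, 2/3, 1/6), the finite difference stencil the lumped weights (0, 1, 0), and their average gives (1/12, 5/6, 1/12). Plugging the plane wave exp(i·k·x_j) into each stencil measures the dispersion residual at eight points per wavelength.

```python
from math import cos, pi

def residual(kh, w0, w1):
    """Residual of a compact stencil applied to exp(i*k*x_j) for
    u'' + k^2 u = 0, unit grid spacing; the mass term uses the
    symmetric weights (w1, w0, w1). Zero residual = exact dispersion."""
    laplacian = 2.0 - 2.0 * cos(kh)    # from the (-1, 2, -1) stencil
    mass = w0 + 2.0 * w1 * cos(kh)     # symmetric three-point mass stencil
    return laplacian - kh**2 * mass

kh = 2 * pi / 8                        # eight points per wavelength
fd  = residual(kh, 1.0, 0.0)           # finite differences (lumped mass)
fem = residual(kh, 2/3, 1/6)           # Galerkin FEM (consistent mass)
avg = residual(kh, 5/6, 1/12)          # average of the two stencils
print(fd, fem, avg)
```

At this resolution the FD and FEM residuals have comparable magnitude and opposite signs, so their average is roughly thirty times smaller, which is consistent with the quasi-exact dispersion behaviour the abstract claims.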
Abstract:
1. Landscape modification is often considered the principal cause of population decline in many bat species. Thus, schemes for bat conservation rely heavily on knowledge about species-landscape relationships. So far, however, few studies have quantified the possible influence of landscape structure on large-scale spatial patterns in bat communities. 2. This study presents quantitative models that use landscape structure to predict (i) spatial patterns in overall community composition and (ii) individual species' distributions, through canonical correspondence analysis and generalized linear models, respectively. A geographical information system (GIS) was then used to draw up maps of (i) overall community patterns and (ii) distribution of potential species' habitats. These models relied on field data from the Swiss Jura mountains. 3. Eight descriptors of landscape structure accounted for 30% of the variation in bat community composition. For some species, more than 60% of the variance in distribution could be explained by landscape structure. Elevation, forest or woodland cover, lakes and suburbs were the most frequent predictors. 4. This study shows that community composition in bats is related to landscape structure through species-specific relationships to resources. Due to their nocturnal activities and the difficulties of remote identification, a comprehensive bat census is rarely possible, and we suggest that predictive modelling of the type described here provides an indispensable conservation tool.
Abstract:
Metabolic labeling techniques have recently become popular tools for the quantitative profiling of proteomes. Classical stable isotope labeling with amino acids in cell cultures (SILAC) uses pairs of heavy/light isotopic forms of amino acids to introduce predictable mass differences in protein samples to be compared. After proteolysis, pairs of cognate precursor peptides can be correlated, and their intensities can be used for mass spectrometry-based relative protein quantification. We present an alternative SILAC approach by which two cell cultures are grown in media containing isobaric forms of amino acids, labeled either with 13C on the carbonyl (C-1) carbon or 15N on the backbone nitrogen. Labeled peptides from both samples have the same nominal mass and nearly identical MS/MS spectra, but upon fragmentation generate distinct immonium ions separated by 1 amu. When labeled protein samples are mixed, the intensities of these immonium ions can be used for the relative quantification of the parent proteins. We validated the labeling of cellular proteins with valine, isoleucine, and leucine with coverage of 97% of all tryptic peptides. We improved the sensitivity for the detection of the quantification ions on a pulsing instrument by using a specific fast scan event. The analysis of a protein mixture with a known heavy/light ratio showed reliable quantification. Finally, the application of the technique to the analysis of two melanoma cell lines yielded quantitative data consistent with those obtained by a classical two-dimensional DIGE analysis of the same samples. Our method combines the features of the SILAC technique with the advantages of isobaric labeling schemes like iTRAQ. We discuss advantages and disadvantages of isobaric SILAC with immonium ion splitting as well as possible ways to improve it.
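The quantification step the abstract describes reduces to comparing paired immonium-ion intensities across peptides of a protein. The sketch below (a toy illustration with made-up intensities, not the authors' pipeline) aggregates per-peptide light/heavy intensity pairs into one protein-level ratio via the median of log2 ratios, a common way to damp outlier peptides.

```python
from math import log2
from statistics import median

def protein_log2_ratio(peptide_pairs):
    """Relative quantification from immonium-ion intensity pairs
    (light, heavy), one pair per quantified peptide: take the median
    of the per-peptide log2 ratios to damp outliers."""
    return median(log2(heavy / light) for light, heavy in peptide_pairs)

# Hypothetical intensities for three peptides of one protein.
pairs = [(1.0e5, 2.1e5), (8.0e4, 1.6e5), (5.0e4, 0.9e5)]
print(protein_log2_ratio(pairs))  # → 1.0, i.e. a two-fold change
```

A log2 ratio of 1.0 corresponds to the heavy-labeled sample expressing the protein at twice the level of the light-labeled one.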
Abstract:
We introduce and analyze two new semi-discrete numerical methods for the multi-dimensional Vlasov-Poisson system. The schemes are constructed by combining a discontinuous Galerkin approximation to the Vlasov equation with a mixed finite element method for the Poisson problem. We show optimal error estimates in the case of smooth compactly supported initial data. We propose a scheme that preserves the total energy of the system.
Abstract:
We consider linear optimization over a nonempty convex semi-algebraic feasible region F. Semidefinite programming is an example. If F is compact, then for almost every linear objective there is a unique optimal solution, lying on a unique “active” manifold, around which F is “partly smooth”, and the second-order sufficient conditions hold. Perturbing the objective results in smooth variation of the optimal solution. The active manifold consists, locally, of these perturbed optimal solutions; it is independent of the representation of F, and is eventually identified by a variety of iterative algorithms such as proximal and projected gradient schemes. These results extend to unbounded sets F.
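The finite-identification phenomenon can be seen on a toy example (my illustration, not the paper's setting): minimizing a linear objective over the box [0,1]², projected gradient iterates land exactly on the optimal vertex after finitely many steps, and the active constraints never change again.

```python
def project(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n."""
    return [min(max(v, lo), hi) for v in x]

# Linear objective c.x over the compact convex set F = [0,1]^2
# (a simple semi-algebraic region); the unique optimum is a vertex.
c = [1.0, -0.5]
x = [0.5, 0.5]
step = 0.1
for _ in range(50):
    x = project([xi - step * ci for xi, ci in zip(x, c)])
print(x)  # → [0.0, 1.0]: the active manifold (here a vertex) is identified
```

After identification, the iterates sit exactly on the active manifold rather than merely converging toward it, which is the behaviour the abstract attributes to proximal and projected gradient schemes.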
Abstract:
Behavioural variation in the South American malaria vector Anopheles darlingi is described. At the centre of its distribution, in forest areas close to the city of Manaus, Brazil, it is primarily exophagic and exophilic. Mosquitoes from this area are chromosomally diverse. Towards the northern edge of its distribution (in Guyana and Venezuela) it is more endophagic and less diverse chromosomally. Similarly in the south (in the state of Minas Gerais) it is less polymorphic. In this area, however, it is primarily zoophilic and exophagic. Evidence is presented that female wing size may vary between populations. The possibility that this widely distributed species may be a complex could have important implications for future malaria control schemes.
Abstract:
Introduction: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Switzerland. Methods: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Lausanne, Switzerland. We compared the 10-year CHD risk using three scoring schemes, i.e., the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology guidelines (ESC). High risk was defined as a 10-year risk of fatal or non-fatal CHD >20% with FRS and PROCAM, and as a 10-year risk of fatal CVD >=5% with SCORE. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increased statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. Results: Participants classified at high risk (both genders) were 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% at high risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III, but moderate with ESC (Figure).
Using a population perspective, full compliance with ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). Conclusion: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Switzerland.
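The population-level arithmetic in the Results can be made explicit. Using only figures stated in the abstract (24,310 expected CHD deaths over 10 years and the avertable fractions per guideline), the absolute number of deaths averted under full compliance works out as:

```python
# Figures taken from the abstract: expected CHD deaths over 10 years in
# Switzerland, and the fraction avertable under full compliance per guideline.
expected_chd_deaths = 24_310
reduction = {
    "ATP III": 0.179,
    "IAS": 0.173,
    "ESC": 0.108,
    "ESC (extrapolated)": 0.115,
}

averted = {g: round(f * expected_chd_deaths) for g, f in reduction.items()}
for guideline, deaths in averted.items():
    print(f"{guideline}: ~{deaths} CHD deaths averted over 10 years")
```

This puts the ATP III figure at roughly 4,350 deaths averted versus roughly 2,600 under ESC, making the practical gap between the guidelines concrete.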
Abstract:
An increasing number of studies have sprung up in recent years seeking to identify individual inventors from patent data. Different heuristics have been suggested for using their names and other information disclosed in patent documents in order to find out “who is who” in patents. This paper contributes to this literature by setting forth a methodology to identify inventors using patent applications filed with the European Patent Office (hereafter EPO). As in much of this literature, we follow a three-step procedure: (1) the parsing stage, aimed at reducing the noise in the inventor's name and other fields of the patent; (2) the matching stage, where name matching algorithms are used to group possible similar names; (3) the filtering stage, where additional information and different scoring schemes are used to confirm or reject these potential matches. The paper reports figures resulting from applying the algorithms to the set of European inventors filing with the EPO over a long period of time.
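The three-step procedure can be sketched in miniature. Everything below is a toy illustration under my own assumptions (the title list, the token-set matching rule, the scoring weights, and the sample records are all hypothetical), not the paper's actual algorithms:

```python
import re

def parse(name: str) -> str:
    """Step 1 - parsing: lowercase, strip titles and punctuation noise."""
    name = name.lower()
    name = re.sub(r"\b(dr|prof|ing|mr|mrs)\b\.?", " ", name)  # toy title list
    name = re.sub(r"[^a-z]+", " ", name)
    return " ".join(name.split())

def candidate_match(a: str, b: str) -> bool:
    """Step 2 - matching: group names with identical token sets, so that
    'Smith, John A.' and 'John A. Smith' fall into the same group."""
    return set(parse(a).split()) == set(parse(b).split())

def same_inventor(rec_a: dict, rec_b: dict) -> bool:
    """Step 3 - filtering: a toy score over shared applicant and
    technology (IPC) class decides whether a candidate pair is kept."""
    score = 2 * (rec_a["applicant"] == rec_b["applicant"])
    score += rec_a["ipc"] == rec_b["ipc"]
    return score >= 2

a = {"name": "Dr. John A. Smith", "applicant": "ACME GmbH", "ipc": "H01L"}
b = {"name": "Smith, John A.",    "applicant": "ACME GmbH", "ipc": "G06F"}
print(candidate_match(a["name"], b["name"]) and same_inventor(a, b))  # → True
```

The division of labour matters: parsing and matching cast a wide net over name variants, while the filtering score uses non-name patent fields to keep precision, which is why the literature treats the three stages separately.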
Abstract:
The effectiveness of R&D subsidies can vary substantially depending on their characteristics. Specifically, the amount and intensity of such subsidies are crucial issues in the design of public schemes supporting private R&D. Public agencies determine the intensities of R&D subsidies for firms in line with their eligibility criteria, although assessing the effects of R&D projects accurately is far from straightforward. The main aim of this paper is to examine whether there is an optimal intensity for R&D subsidies through an analysis of their impact on private R&D effort. We examine the decisions of a public agency to grant subsidies taking into account not only the characteristics of the firms but also, as few previous studies have done to date, those of the R&D projects. In determining the optimal subsidy we use both parametric and nonparametric techniques. The results show a non-linear relationship between the percentage of subsidy received and the firms’ R&D effort. These results have implications for technology policy, particularly for the design of R&D subsidies that ensure enhanced effectiveness.