916 results for AFAS (ASEAN Framework Agreement on Services)
Abstract:
This paper presents a model of electoral competition focusing on the formation of the public agenda. An incumbent government and a challenger party in opposition compete in elections by choosing the issues on which to key their campaigns. Giving salience to an issue implies putting forward an innovative policy proposal as an alternative to the status quo. Parties trade off issues that are highly salient in voters' concerns against those on which there is broad agreement on some alternative policy proposal. Each party expects a higher probability of victory if the issue it chooses becomes salient in the voters' decision. But remarkably, the issues considered the most important by a majority of voters may not be given salience during the electoral campaign. An incumbent government may survive in spite of its bad policy performance if there is no sufficiently broad agreement on a policy alternative. We illustrate the analytical potential of the model with the case of the United States presidential election in 2004.
Abstract:
Plutonium and (90)Sr are considered to be among the most radiotoxic nuclides produced by the nuclear fission process. In spite of numerous studies on mammals and humans, there is still no general agreement on the retention half-time of either radionuclide in the skeleton in the general population. Here we determined plutonium and (90)Sr in human vertebrae of individuals deceased between 1960 and 2004 in Switzerland. Plutonium was measured by sensitive SF-ICP-MS techniques and (90)Sr by radiometric methods. We compared our results to those obtained for other environmental compartments to reveal the retention half-time of NBT fallout (239)Pu and (90)Sr in trabecular bones of the Swiss population. Results show that plutonium has a retention half-time of 40+/-14 years. In contrast, (90)Sr has a shorter retention half-time of 13.5+/-1.0 years. Moreover, the (90)Sr retention half-time in vertebrae is shown to be linked to the retention half-time in food and other environmental compartments. These findings demonstrate that the renewal of the vertebrae through calcium homeostatic control is faster for (90)Sr excretion than for plutonium excretion. The precise determination of the retention half-time of plutonium in the skeleton will improve the biokinetic model of plutonium metabolism in humans.
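The reported half-times imply, to first order, single-exponential retention; the Python sketch below is only an illustration of that arithmetic (not the paper's biokinetic model), computing the fraction of an initial skeletal deposit remaining after t years.

```python
def fraction_retained(t_years: float, half_time_years: float) -> float:
    """Fraction of an initial deposit remaining after t_years,
    assuming single-exponential retention with the given half-time."""
    return 0.5 ** (t_years / half_time_years)

# Illustrative values from the abstract: ~40 y for plutonium, ~13.5 y for (90)Sr.
for nuclide, half_time in [("Pu", 40.0), ("(90)Sr", 13.5)]:
    print(nuclide, round(fraction_retained(27.0, half_time), 2))  # Pu 0.63, (90)Sr 0.25
```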
Abstract:
In a closed-economy context there is common agreement that price-inflation stabilization is one of the objectives of monetary policy. Moving to an open-economy context gives rise to the coexistence of two measures of inflation: domestic inflation (DI) and consumer price inflation (CPI). Which of the two measures should be the target variable? This is the question addressed in this paper. In particular, I use a small open economy model to show that once sticky wages indexed to past CPI inflation are introduced, a completely inward-looking monetary policy is no longer optimal. I first derive a loss function from a second-order approximation of the utility function and then compute the fully optimal monetary policy under commitment. I then use the optimal monetary policy as a benchmark to compare the performance of different monetary policy rules. The main result is that once a positive degree of indexation is introduced in the model, the best-performing rule (among the Taylor-type rules considered) is the one targeting wage inflation and CPI inflation. Moreover, this rule delivers results very close to those obtained under the fully optimal monetary policy with commitment.
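As an illustration only (the coefficients and functional form are assumptions, not the paper's calibration), a Taylor-type rule of the kind compared here, responding to both wage inflation and CPI inflation with interest-rate smoothing, can be written as:

```latex
% Illustrative Taylor-type rule; \rho, \phi_w and \phi_c are assumed parameters.
% i_t: nominal interest rate, \pi^{w}_t: wage inflation, \pi^{CPI}_t: CPI inflation.
\[
  i_t = \rho\, i_{t-1} + (1 - \rho)\left( \phi_w \pi^{w}_t + \phi_c \pi^{CPI}_t \right)
\]
```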
Abstract:
The ill effects of second-hand smoke are now well documented. To protect the population from exposure to tobacco smoke, comprehensive smoking bans are necessary, as expressed in the WHO Framework Convention on Tobacco Control and its guidelines. Switzerland has only a partial smoking ban, full of exceptions and modelled on the so-called Spanish model, which has been in effect since 2010. In September 2012, Swiss citizens rejected a proposal for a more comprehensive ban. This case study examines the reasons behind this rejection and draws some lessons that can be learnt from it.
Abstract:
The development of statistical models for forensic fingerprint identification purposes has been the subject of increasing research attention in recent years. This can be seen partly as a response to a number of commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. In addition, key forensic identification bodies such as ENFSI [1] and IAI [2] have recently endorsed and acknowledged the potential benefits of using statistical models as an important tool in support of the fingerprint identification process within the ACE-V framework. In this paper, we introduce a new Likelihood Ratio (LR) model based on Support Vector Machines (SVMs) trained with features discovered via morphometric and spatial analyses of corresponding minutiae configurations for both match and close non-match populations often found in AFIS candidate lists. Computed LR values are derived from a probabilistic framework based on SVMs that discover the intrinsic spatial differences of match and close non-match populations. Lastly, we present experiments performed on a set of over 120,000 publicly available fingerprint images (mostly sourced from the National Institute of Standards and Technology (NIST) datasets) and a distortion set of approximately 40,000 images, illustrating that the proposed LR model reliably guides towards the right proposition in the identification assessment of match and close non-match populations. Results further indicate that the proposed model is a promising tool for fingerprint practitioners to use for analysing the spatial consistency of corresponding minutiae configurations.
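A minimal sketch of the general score-to-likelihood-ratio idea using an SVM is shown below; the feature vectors are synthetic placeholders, not the morphometric and spatial minutiae features used in the paper, and with balanced classes the posterior odds returned here coincide with a likelihood ratio.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical feature vectors summarizing corresponding minutiae configurations:
# rows = comparisons, columns = assumed spatial/morphometric summary features.
rng = np.random.default_rng(0)
X_match = rng.normal(loc=0.0, scale=1.0, size=(200, 5))
X_close_nonmatch = rng.normal(loc=1.5, scale=1.0, size=(200, 5))
X = np.vstack([X_match, X_close_nonmatch])
y = np.array([1] * 200 + [0] * 200)  # 1 = match, 0 = close non-match

# SVM with probability calibration; the LR is taken as P(match)/P(non-match),
# which equals the likelihood ratio when the two training classes are balanced.
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)

def likelihood_ratio(features: np.ndarray) -> float:
    p_nonmatch, p_match = svm.predict_proba(features.reshape(1, -1))[0]
    return p_match / max(p_nonmatch, 1e-12)

print(likelihood_ratio(X_match[0]))  # LR > 1 supports the match proposition
```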
Abstract:
It is a well-appreciated fact that in many organisms the process of ageing reacts highly plastically, so that lifespan increases or decreases when the environment changes. Perhaps the best-known example of such lifespan plasticity is dietary restriction (DR), a phenomenon whereby reduced food intake without malnutrition extends lifespan (typically at the expense of reduced fecundity) and which has been documented in numerous species, from invertebrates to mammals. For the evolutionary biologist, DR and other cases of lifespan plasticity are examples of a more general phenomenon called phenotypic plasticity, the ability of a single genotype to produce different phenotypes (e.g. lifespan) in response to changes in the environment (e.g. changes in diet). To analyse phenotypic plasticity, evolutionary biologists (and epidemiologists) often use a conceptual and statistical framework based on reaction norms (genotype-specific response curves) and genotype × environment interactions (G × E; differences in the plastic response among genotypes), concepts with which biologists working on molecular aspects of ageing are usually not familiar. Here I briefly discuss what has been learned about lifespan plasticity or, more generally, about plasticity of somatic maintenance and survival ability. In particular, I argue that adopting the conceptual framework of reaction norms and G × E interactions, as used by evolutionary biologists, is crucially important for our understanding of the mechanisms underlying DR and other forms of lifespan or survival plasticity.
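For readers unfamiliar with the reaction-norm framework, the sketch below (simulated data, not results from any study) shows how a genotype × environment interaction on lifespan is commonly tested with a linear model; the genotype:diet term captures genotype-specific plasticity, i.e. reaction norms with different slopes.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: two genotypes, two dietary environments (ad libitum vs. restricted).
rng = np.random.default_rng(1)
n = 50
rows = []
for genotype, dr_effect in [("A", 10.0), ("B", 2.0)]:  # genotype-specific DR response
    for diet, is_dr in [("ad_lib", 0), ("restricted", 1)]:
        lifespan = 40 + dr_effect * is_dr + rng.normal(0, 5, n)
        rows.append(pd.DataFrame({"genotype": genotype, "diet": diet, "lifespan": lifespan}))
df = pd.concat(rows, ignore_index=True)

# The genotype:diet interaction term is the G x E test.
model = smf.ols("lifespan ~ genotype * diet", data=df).fit()
print(model.summary().tables[1])
```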
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to risk assessment of ENMs, which encompass the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) or risk assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques should be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics which are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization should be initiated and that exchange of protocols should take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
The pace of on-going climate change calls for reliable plant biodiversity scenarios. Traditional dynamic vegetation models use plant functional types that are summarized to such an extent that they become meaningless for biodiversity scenarios. Hybrid dynamic vegetation models of intermediate complexity (hybrid-DVMs) have recently been developed to address this issue. These models, at the crossroads between phenomenological and process-based models, are able to involve an intermediate number of well-chosen plant functional groups (PFGs). The challenge is to build meaningful PFGs that are representative of plant biodiversity and consistent with the parameters and processes of hybrid-DVMs. Here, we propose and test a framework based on a few selected traits to define a limited number of PFGs, which are both representative of the diversity (functional and taxonomic) of the flora in the Ecrins National Park and adapted to hybrid-DVMs. This new classification scheme, together with recent advances in vegetation modeling, constitutes a step forward for mechanistic biodiversity modeling.
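A minimal sketch of the general idea of grouping species into PFGs by clustering on a few selected traits is shown below; the trait values, trait choice, and number of groups are placeholders, not the classification actually built for the Ecrins National Park flora.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical species-by-trait matrix: height (m), specific leaf area, seed mass (mg).
traits = np.array([
    [0.2, 25.0, 0.5],
    [0.3, 22.0, 0.8],
    [1.5, 15.0, 4.0],
    [2.0, 14.0, 5.5],
    [20.0, 8.0, 50.0],
    [25.0, 7.0, 60.0],
])

# Standardize the traits, then cluster species into a small number of candidate PFGs.
X = StandardScaler().fit_transform(traits)
pfg_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(pfg_labels)  # species sharing a label form one candidate PFG
```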
Abstract:
The aim of this paper is to analyse how learning assessment, particularly the Continuous Assessment (CA) system, has been defined in the Public Administration and Management Diploma Course of the University of Barcelona (Spain). This course was a pioneering experiment at this university in implementing the guidelines of the European Higher Education Area (EHEA), and thus represents a good case study for verifying whether one of the cornerstones of the EHEA has been accomplished with success. Using data obtained from the Teaching Plans elaborated by the lecturers of each subject, we are able to establish that the CA system has been progressively accepted to such an extent that it is now the assessment formula used by practically all of the lecturers, conforming in this way to the protocols laid down by the Faculty of Law in which this diploma course is taught. Nevertheless, we find that high dispersion exists in how Continuous Assessment is actually defined. Indeed, it seems that there is no unified view of how Continuous Assessment should be performed. This dispersion, however, seems to diminish over time and raises some questions about the advisability of agreement on criteria, considering the potential which CA has as a pedagogical tool. Moreover, we find that the Unique Assessment system, which students may also apply for, is an option chosen only by a minority, with lecturers usually defining it as merely a theoretical and/or practical test, with little innovation relative to traditional tests.
Abstract:
Transfer of tumor antigen-specific T-cell receptors (TCRs) into human T cells aims at redirecting their cytotoxicity toward tumors. Efficacy and safety may be affected by pairing of natural and introduced TCRα/β chains, potentially leading to autoimmunity. We hypothesized that a novel single-chain (sc)TCR framework relying on the coexpression of the TCRα constant (Cα) domain would prevent undesired pairing while preserving structural and functional similarity to a fully assembled double-chain (dc)TCR/CD3 complex. We confirmed this hypothesis for a murine p53-specific scTCR. Substantial effector function was observed only in the presence of a murine Cα domain preceded by a TCRα signal peptide for shuttling to the cell membrane. The generalization to a human gp100-specific TCR required the murinization of both C domains. Structural and functional T-cell avidities of an accessory disulfide-linked scTCR gp100/Cα were higher than those of a dcTCR. Antigen-dependent phosphorylation of the proximal effector ζ-chain-associated protein kinase 70 at tyrosine 319 was not impaired, reflecting its molecular integrity in signaling. In melanoma-engrafted nonobese diabetic/severe combined immunodeficient mice, adoptive transfer of scTCR gp100/Cα-transduced T cells conferred superior delay in tumor growth among primary and long-term secondary tumor challenges. We conclude that the novel scTCR constitutes a reliable means to immunotherapeutically target hematologic malignancies.
Abstract:
The goal of this study is to develop a usable sufficiency rating system for secondary roads. Several assumptions were made at the outset: 1. County engineers currently use at least a limited set of decision criteria to make decisions regarding project priorities. 2. Some degree of consensus exists among county engineers as to which criteria are the most important and on their relative importance. Accordingly, a questionnaire was developed to be used as a survey tool. The results of the survey were used to develop a final list of weighted rating elements, which were used as part of the proposed sufficiency rating system. State and local jurisdictions from other states were also surveyed to determine the status of the use of sufficiency rating systems for secondary roads outside of Iowa and to gather some applicable data.
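A minimal sketch of how a weighted sufficiency rating of this kind could be computed is shown below; the criteria names, weights, and ratings are placeholders, not the survey-derived elements from the study.

```python
# Placeholder criteria weights (survey-derived in the study) and a road segment's
# ratings on a 0-100 scale for each rating element.
weights = {"structural_adequacy": 0.40, "safety": 0.35, "service_level": 0.25}
ratings = {"structural_adequacy": 70, "safety": 85, "service_level": 60}

def sufficiency_rating(weights: dict, ratings: dict) -> float:
    """Weighted average of element ratings; weights are normalized to sum to 1."""
    total_weight = sum(weights.values())
    return sum(weights[k] * ratings[k] for k in weights) / total_weight

print(round(sufficiency_rating(weights, ratings), 1))  # 72.8 for these placeholders
```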
Abstract:
Increasingly, patients with unhealthy alcohol and other drug use are being seen in primary care and other non-specialty addiction settings. Primary care providers are well positioned to screen, assess, and treat patients with alcohol and other drug use because this use, and substance use disorders, may contribute to a host of medical and mental health harms. We sought to identify and examine important recent advances in addiction medicine in the medical literature that have implications for the care of patients in primary care or other generalist settings. To accomplish this aim, we selected articles in the field of addiction medicine, critically appraised and summarized the manuscripts, and highlighted their implications for generalist practice. During an initial review, we identified articles through an electronic Medline search (limited to human studies and in English) using search terms for alcohol and other drugs of abuse published from January 2010 to January 2012. After this initial review, we searched other web-based and journal resources for potential articles of interest. From the list of articles identified in these initial reviews, each of the six authors independently selected articles for more intensive review and identified the ones they found to have a potential impact on generalist practice. The identified articles were then ranked by the number of authors who selected each article. Through a consensus process over four meetings, the authors reached agreement on the articles with implications for practice for generalist clinicians that warranted inclusion for discussion. The authors then grouped the articles into five categories: 1) screening and brief interventions in outpatient settings, 2) identification and management of substance use among inpatients, 3) medical complications of substance use, 4) use of pharmacotherapy for addiction treatment in primary care and its complications, and 5) integration of addiction treatment and medical care. The authors discuss each selected article's merits, limitations, conclusions, and implications for advancing addiction screening, assessment, and treatment in generalist physician practice environments.