Abstract:
This paper applies a policy analysis approach to the question of how to effectively regulate micropollution in a sustainable manner. Micropollution is a complex policy problem characterized by a huge number and diversity of chemical substances, as well as various entry paths into the aquatic environment. It challenges traditional water quality management by calling for new technologies in wastewater treatment and behavioral changes in industry, agriculture and civil society. In light of such challenges, two questions arise: How can such a complex phenomenon be regulated to ensure that water quality is maintained in the future? And what can we learn from past experiences in water quality regulation? To answer these questions, policy analysis focuses strongly on the design and choice of policy instruments and the mix of such measures. In this paper, we review instruments commonly used in past water quality regulation. We evaluate their ability to respond to the characteristics of a more recent water quality problem, i.e., micropollution, in a sustainable way. In this way, we develop a new framework that integrates both the problem dimension (i.e., causes and effects of a problem) and the sustainability dimension (e.g., long-term, cross-sectoral and multi-level) to assess which policy instruments are best suited to regulating micropollution. We conclude that sustainability criteria help to identify an appropriate instrument mix of end-of-pipe and source-directed measures to reduce aquatic micropollution.
Abstract:
The University of Bern has set up the new Laboratory for the Analysis of Radiocarbon with AMS (LARA), equipped with a MICADAS (MIni CArbon Dating System) accelerator mass spectrometer (AMS), to continue its long history of 14C analysis based on conventional counting. The new laboratory is intended to provide routine 14C dating for archaeology, climate research, and other disciplines at the University of Bern, and to develop new analytical systems coupled to the gas ion source for 14C analysis of specific compounds or compound classes with specific physical properties. Measurements of reference standards and wood samples dated by dendrochronology demonstrate the quality of the 14C analyses performed at the new laboratory.
Abstract:
OBJECTIVE To analyze the types of articles and authorship characteristics of three orthodontic journals--American Journal of Orthodontics and Dentofacial Orthopedics (AJODO), The Angle Orthodontist (AO), and European Journal of Orthodontics (EJO)--published between 2008 and 2012, and to assess differences in content between this period and an earlier period, 1998 to 2002. MATERIALS AND METHODS Each journal's content was accessed through the web edition. From each article, the following parameters were recorded: article type, number of authors, number of affiliations, source of article (referring to the first author's affiliation), and geographic origin. Descriptive statistics were computed, and selected parameters were analyzed with the Pearson chi-square or Fisher exact test for independence at the .05 level of significance. RESULTS Comparison of the two periods showed that the number of publications had almost doubled and that the percentage of multi-authored articles had increased. Fewer studies derived from the United States/Canada and European Union countries, while increases were found for articles from non-European Union countries, Asia, and other countries. Within the second period, the EJO and AO published more research articles, whereas the AJODO regularly published case reports and other articles. Approximately 75% of all studies derived from orthodontic departments. CONCLUSIONS The publications from 1998-2002 and 2008-2012 differed significantly both in numbers and in characteristics. Within 2008-2012 there were notable differences between the three journals concerning the type and origin of the publications.
Abstract:
BACKGROUND Data on the association between subclinical thyroid dysfunction and fractures conflict. PURPOSE To assess the risk for hip and nonspine fractures associated with subclinical thyroid dysfunction among prospective cohorts. DATA SOURCES Search of MEDLINE and EMBASE (1946 to 16 March 2014) and reference lists of retrieved articles, without language restriction. STUDY SELECTION Two physicians screened and identified prospective cohorts that measured thyroid function and followed participants to assess fracture outcomes. DATA EXTRACTION One reviewer extracted data using a standardized protocol, and another verified the data. Both reviewers independently assessed the methodological quality of the studies. DATA SYNTHESIS The 7 population-based cohorts of heterogeneous quality included 50,245 participants with 1966 hip and 3281 nonspine fractures. In random-effects models that included the 5 higher-quality studies, the pooled adjusted hazard ratios (HRs) for participants with subclinical hyperthyroidism versus euthyroidism were 1.38 (95% CI, 0.92 to 2.07) for hip fractures and 1.20 (CI, 0.83 to 1.72) for nonspine fractures, without statistical heterogeneity (P = 0.82 and 0.52, respectively; I2 = 0%). Pooled estimates for the 7 cohorts were 1.26 (CI, 0.96 to 1.65) for hip fractures and 1.16 (CI, 0.95 to 1.42) for nonspine fractures. When thyroxine recipients were excluded, the HRs for participants with subclinical hyperthyroidism were 2.16 (CI, 0.87 to 5.37) for hip fractures and 1.43 (CI, 0.73 to 2.78) for nonspine fractures. For participants with subclinical hypothyroidism, HRs from higher-quality studies were 1.12 (CI, 0.83 to 1.51) for hip fractures and 1.04 (CI, 0.76 to 1.42) for nonspine fractures (P for heterogeneity = 0.69 and 0.88, respectively; I2 = 0%). LIMITATIONS Selective reporting cannot be excluded. Adjustment for potential common confounders varied and was not adequately done across all studies.
CONCLUSION Subclinical hyperthyroidism might be associated with an increased risk for hip and nonspine fractures, but additional large, high-quality studies are needed. PRIMARY FUNDING SOURCE Swiss National Science Foundation.
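The pooled estimates above come from inverse-variance random-effects models. As a minimal sketch of how such pooling works (a generic DerSimonian-Laird estimator; the function name and the example numbers in the usage note are illustrative, not the meta-analysis's actual data or software):

```python
import math

def pool_hazard_ratios(hrs, ci_los, ci_his):
    """Pool hazard ratios with a DerSimonian-Laird random-effects model.

    hrs, ci_los, ci_his: per-study HRs and their 95% CI bounds.
    Returns (pooled HR, lower 95% bound, upper 95% bound).
    """
    y = [math.log(h) for h in hrs]
    # Recover each study's standard error from the 95% CI width on the log scale.
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
          for lo, hi in zip(ci_los, ci_his)]
    w = [1.0 / s ** 2 for s in se]                      # inverse-variance weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)             # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]          # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    return (math.exp(y_re),
            math.exp(y_re - 1.96 * se_re),
            math.exp(y_re + 1.96 * se_re))
```

For two illustrative studies with HRs of 1.2 (0.9 to 1.6) and 1.5 (1.1 to 2.0), the pooled HR falls between the two study estimates, and the between-study variance tau2 shrinks toward zero as the studies agree (I2 = 0% in the text corresponds to tau2 = 0).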
Abstract:
Software developers are often unsure of the exact name of the method they need to invoke the desired behavior in a given context. This results in a process of searching for the correct method name in documentation, which can be lengthy and distracting to the developer. We can decrease the method search time by enhancing the documentation of a class with its most frequently used methods. Usage frequency data for methods is gathered by analyzing other projects from the same ecosystem: projects written in the same language and sharing dependencies. We implemented a proof of concept of the approach for Pharo Smalltalk and Java. In Pharo Smalltalk, methods are commonly searched for using a code browser tool called "Nautilus"; in Java, using a web browser displaying HTML-based documentation (Javadoc). We developed plugins for both browsers and gathered method usage data from open source projects, in order to increase developer productivity by reducing method search time. A small initial evaluation showed promising results in improving developer productivity.
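The usage-mining step can be sketched as follows — a Python stand-in for the Pharo/Java tooling described above (the function name and the use of Python's `ast` module are my own choices, not the paper's implementation):

```python
import ast
from collections import Counter

def method_usage_frequencies(source_texts):
    """Count how often each method name is invoked across a corpus of sources.

    This mirrors the ecosystem-wide usage mining described above: the more
    often a method is called in other projects, the higher it should rank
    in the enhanced documentation of its class.
    """
    counts = Counter()
    for text in source_texts:
        tree = ast.parse(text)
        for node in ast.walk(tree):
            # A call whose target is an attribute access, e.g. obj.method(...)
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
                counts[node.func.attr] += 1
    return counts
```

Ranking the resulting counter (e.g. with `most_common()`) gives the ordering a documentation browser plugin could display alongside a class.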
Abstract:
Eurasian fall snow cover changes have been suggested as a driver of changes in the Arctic Oscillation and might provide a link between sea-ice decline in the Arctic during summer and atmospheric circulation in the following winter. However, the mechanism connecting snow cover in Eurasia to sea-ice decline in autumn is still under debate. Our analysis is based on snow observations from 820 Russian land stations and on moisture transport derived from meteorological re-analyses using a Lagrangian approach. We show that declining sea ice in the Barents and Kara Seas (BKS) acts as a moisture source for the enhanced Western Siberian snow depth as a result of changed tropospheric moisture transport. Transient disturbances related to anomalies in the planetary wave pattern enter the continent from the BKS region and move southward along the Ural mountains, where they merge into the extension of the Mediterranean storm track.
Abstract:
Urban agriculture is a phenomenon that can be observed world-wide, particularly in cities of developing countries. It contributes significantly to food security and food safety and has sustained the livelihoods of urban and peri-urban low-income dwellers in developing countries for many years. Population increase due to rural-urban migration and natural growth, coupled with formal as well as informal urbanization, competes with urban farming for available space and scarce water resources. A multi-temporal, multi-sensor urban change analysis over a period of 25 years (1982-2007) was performed in order to measure and visualize the urban expansion along the Kizinga and Mzinga valleys in the south of Dar es Salaam. Airphotos and VHR satellite data were analyzed using a combination of anisotropic textural measures and spectral information. The study revealed that unplanned built-up area is expanding continuously while vegetation cover and agricultural land decline at a fast rate. The validation showed that the overall classification accuracy varied depending on the database. The extracted built-up areas were used for visual interpretation and mapping purposes and served as an information source for another research project. The maps visualize an urban congestion and expansion of nearly 18% of the total analyzed area in the Kizinga valley between 1982 and 2007. The same development can be observed in the less developed and more remote Mzinga valley between 1981 and 2002. Both areas underwent fast changes in which land prices still tend to go up and an influx of people from both rural and urban areas continuously increases density, with the consequence of intensifying multiple land use interests.
Abstract:
BACKGROUND Gametogenesis and fertilization play crucial roles in malaria transmission. While male gametes are thought to be amongst the simplest eukaryotic cells and are proven targets of transmission-blocking immunity, little is known about their molecular organization. For example, the pathway of energy metabolism that powers motility, a feature that facilitates gamete encounter and fertilization, is unknown. METHODS Plasmodium berghei microgametes were purified and analysed by whole-cell proteomic analysis for the first time. Data are available via ProteomeXchange with identifier PXD001163. RESULTS 615 proteins were recovered; they included all male gamete proteins described thus far. Amongst them were the 11 enzymes of the glycolytic pathway. The hexose transporter was localized to the gamete plasma membrane, and it was shown that microgamete motility can be suppressed effectively by inhibitors of this transporter and of the glycolytic pathway. CONCLUSIONS This study describes the first whole-cell proteomic analysis of the malaria male gamete. It identifies glycolysis as the likely exclusive source of energy for flagellar beat, and provides new insights into original features of Plasmodium flagellar organization.
Abstract:
In the Lower Mekong Basin the extraordinary pace of economic development and growth conflicts with environmental protection. On the basis of the Watershed Classification Project (WSCP) and the inclusion of a DTM for the entire LMB, the potential degradation risk was derived for each land unit. The risks were grouped into five classes, of which classes one and two are considered critical with regard to soil erosion when the land is cleared of natural resources. For practical use, the database has enormous potential for further spatial analysis in combination with other datasets; for example, the NCCR North-South uses the WSCP within two research projects.
Abstract:
AMS-14C applications often require the analysis of small samples. Such is the case for atmospheric aerosols, where frequently only a small amount of sample is available. The ion beam physics group at ETH Zurich has designed an Automated Graphitization Equipment (AGE III) for routine graphite production for AMS analysis from organic samples of approximately 1 mg. In this study, we explore the potential use of the AGE III for graphitization of particulate carbon collected on quartz filters. In order to test the methodology, samples of reference materials and blanks of different sizes were prepared in the AGE III and the graphite was analyzed in a MICADAS AMS (ETH) system. The graphite samples prepared in the AGE III showed recovery yields higher than 80% and reproducible 14C values for masses ranging from 50 to 300 µg. Reproducible radiocarbon values were also obtained for aerosol filters of small sizes that had been graphitized in the AGE III. As a case study, the tested methodology was applied to PM10 samples collected in two urban cities in Mexico in order to compare the source apportionment of biomass and fossil fuel combustion. The obtained 14C data showed that carbonaceous aerosols from Mexico City have a much lower biogenic signature than those from the smaller city of Cuernavaca.
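The source apportionment step rests on the fact that fossil carbon is 14C-free, so the measured fraction modern scales linearly with the biogenic share of total carbon. A minimal sketch (the function name is my own, and the default biogenic reference value is an assumed illustrative number; real studies derive it from the atmospheric bomb-peak record for the sampling period):

```python
def fossil_fraction(f14c_sample, f14c_biogenic=1.04):
    """Estimate the fossil-carbon fraction of an aerosol sample from its 14C content.

    Fossil fuel carbon contains no 14C, so the measured fraction modern (F14C)
    of total carbon is the biogenic fraction times the reference F14C of
    contemporary biomass (f14c_biogenic).
    """
    biogenic = f14c_sample / f14c_biogenic   # biogenic share of total carbon
    return 1.0 - biogenic                    # remainder attributed to fossil fuel
```

Under this model, a lower measured F14C (as for the Mexico City filters) translates directly into a higher fossil-fuel contribution.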
Abstract:
OBJECTIVES The SOURCE XT Registry (Edwards SAPIEN XT Aortic Bioprosthesis Multi-Region Outcome Registry) assessed the use and clinical outcomes of the SAPIEN XT (Edwards Lifesciences, Irvine, California) valve in the real-world setting. BACKGROUND Transcatheter aortic valve replacement is an established treatment for high-risk/inoperable patients with severe aortic stenosis. The SAPIEN XT is a balloon-expandable valve with enhanced features allowing delivery via a lower-profile sheath. METHODS The SOURCE XT Registry is a prospective, multicenter, post-approval study. Data from 2,688 patients at 99 sites were analyzed. The main outcome measures were all-cause mortality, stroke, major vascular complications, bleeding, and pacemaker implantation at 30 days and 1 year post-procedure. RESULTS The mean age was 81.4 ± 6.6 years, 42.3% of patients were male, and the mean logistic EuroSCORE (European System for Cardiac Operative Risk Evaluation) was 20.4 ± 12.4%. Patients had a high burden of coronary disease (44.2%), diabetes (29.4%), renal insufficiency (28.9%), atrial fibrillation (25.6%), and peripheral vascular disease (21.2%). Survival was 93.7% at 30 days and 80.6% at 1 year. At 30-day follow-up, the stroke rate was 3.6%, the rate of major vascular complications 6.5%, the rate of life-threatening bleeding 5.5%, the rate of new pacemakers 9.5%, and the rate of moderate/severe paravalvular leak 5.5%. Multivariable analysis identified nontransfemoral approach (hazard ratio [HR]: 1.84; p < 0.0001), renal insufficiency (HR: 1.53; p < 0.0001), liver disease (HR: 1.67; p = 0.0453), moderate/severe tricuspid regurgitation (HR: 1.47; p = 0.0019), porcelain aorta (HR: 1.47; p = 0.0352), and atrial fibrillation (HR: 1.41; p = 0.0014) as the factors with the highest HRs for 1-year mortality. Major vascular complications and major/life-threatening bleeding were the complications most frequently associated with a significant increase in 1-year mortality.
CONCLUSIONS The SOURCE XT Registry demonstrated appropriate use of the SAPIEN XT THV in the first year post-commercialization in Europe. The safety profile is sustained, and clinical benefits have been established in the real-world setting. (SOURCE XT Registry; NCT01238497).
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
BACKGROUND Listeria (L.) monocytogenes causes fatal infections in many species including ruminants and humans. In ruminants, rhombencephalitis is the most prevalent form of listeriosis. Using multilocus variable number tandem repeat analysis (MLVA) we recently showed that L. monocytogenes isolates from ruminant rhombencephalitis cases are distributed over three genetic complexes (designated A, B and C). However, the majority of rhombencephalitis strains, and virtually all those isolated from cattle, cluster in MLVA complex A, indicating that strains of this complex may have increased neurotropism and neurovirulence. The aim of this study was to investigate whether ruminant rhombencephalitis strains have an increased ability to propagate in the bovine hippocampal brain-slice model and can be discriminated from strains of other sources. For this study, forty-seven strains were selected and assayed on brain-slice cultures, a bovine macrophage cell line (BoMac) and a human colorectal adenocarcinoma cell line (Caco-2). They were isolated from ruminant rhombencephalitis cases (n = 21) and other sources including the environment, food, human neurolisteriosis cases and ruminant/human non-encephalitic infection cases (n = 26). RESULTS All but one L. monocytogenes strain replicated in brain slices, irrespective of the source of the isolate or MLVA complex. The replication of strains from MLVA complex A was increased in hippocampal brain-slice cultures compared with complex C. Immunofluorescence revealed that microglia are the main target cells for L. monocytogenes and that strains from MLVA complex A caused larger infection foci than strains from MLVA complex C. They also caused larger plaques in BoMac cells, but not in Caco-2 cells. CONCLUSIONS Our brain-slice model data show that all L. monocytogenes strains should be considered potentially neurovirulent.
Secondly, encephalitis strains cannot be conclusively discriminated from non-encephalitis strains with the bovine organotypic brain-slice model. The data indicate that MLVA complex A strains are particularly adept at establishing encephalitis, possibly by virtue of a higher resistance to antibacterial defense mechanisms in microglia, the main target cells of L. monocytogenes.
Abstract:
The search for translation universals has been an important topic in translation studies over the past decades. In this paper, we focus on the notion of explicitation through a multifaceted study of causal connectives, integrating four different variables: the role of the source and the target languages, the influence of specific connectives and the role of the discourse relation they convey. Our results indicate that while source and target languages do not globally influence explicitation, specific connectives have a significant impact on this phenomenon. We also show that in English and French, the most frequently used connectives for explicitation share a similar semantic profile. Finally, we demonstrate that explicitation also varies across different discourse relations, even when they are conveyed by a single connective.
Verification of DNA-Predicted Protein Sequences by Enzyme Hydrolysis and Mass Spectrometric Analysis
Abstract:
The focus of this thesis lies in the development of a sensitive method for the analysis of protein primary structure which can easily be used to confirm the DNA sequence of a protein's gene and determine the modifications made after translation. This technique involves the use of dipeptidyl aminopeptidase (DAP) and dipeptidyl carboxypeptidase (DCP) to hydrolyze the protein, followed by mass spectrometric analysis of the dipeptide products. Dipeptidyl carboxypeptidase was purified from human lung tissue and characterized with respect to its proteolytic activity. The results showed that the enzyme has a relatively unrestricted specificity, making it useful for the analysis of the C-termini of proteins. Most of the dipeptide products were identified using gas chromatography/mass spectrometry (GC/MS). In order to analyze the peptides not hydrolyzed by DCP and DAP, as well as the dipeptides not identified by GC/MS, a FAB ion source was installed on a quadrupole mass spectrometer and its performance evaluated with a variety of compounds. Using these techniques, the sequences of the N-terminal and C-terminal regions and seven fragments of bacteriophage P22 tail protein have been verified. All of the dipeptides identified in these analyses were in the same DNA reading frame, thus ruling out the possibility of a single base having been inserted into or deleted from the DNA sequence. The verification of small sequences throughout the protein also indicates that no large portions of the protein have been removed after translation.