17 results for Forward looking models
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The global ocean is a significant sink for anthropogenic carbon (Cant), absorbing roughly a third of human CO2 emitted over the industrial period. Robust estimates of the magnitude and variability of the storage and distribution of Cant in the ocean are therefore important for understanding the human impact on climate. In this synthesis we review observational and model-based estimates of the storage and transport of Cant in the ocean. We pay particular attention to the uncertainties and potential biases inherent in different inference schemes. On a global scale, three data-based estimates of the distribution and inventory of Cant are now available. While the inventories are found to agree within their uncertainty, there are considerable differences in the spatial distribution. We also present a review of the progress made in the application of inverse and data assimilation techniques which combine ocean interior estimates of Cant with numerical ocean circulation models. Such methods are especially useful for estimating the air–sea flux and interior transport of Cant, quantities that are otherwise difficult to observe directly. However, the results are found to be highly dependent on modeled circulation, with the spread due to different ocean models at least as large as that from the different observational methods used to estimate Cant. Our review also highlights the importance of repeat measurements of hydrographic and biogeochemical parameters to estimate the storage of Cant on decadal timescales in the presence of the variability in circulation that is neglected by other approaches. Data-based Cant estimates provide important constraints on forward ocean models, which exhibit both broad similarities and regional errors relative to the observational fields. A compilation of inventories of Cant gives us a "best" estimate of the global ocean inventory of anthropogenic carbon in 2010 of 155 ± 31 PgC (±20% uncertainty). This estimate includes a broad range of values, suggesting that a combination of approaches is necessary in order to achieve a robust quantification of the ocean sink of anthropogenic CO2.
Abstract:
Cultural protectionism has been an element of national and foreign policies, as an extension of state sovereignty expressed both in a defensive and offensive manner. While the generic protectionist formula in the sense of restraining trade between states through measures such as import tariffs or quotas and through privileging domestic production has somewhat disintegrated over time under the rationale for free trade and the strong practical evidence of its benefits, the particular case of cultural protectionism has persevered. As we reveal in this paper, however, it has been modified, or at least its rhetoric has changed. The enquiry into the notion of cultural protectionism or cultural diversity, as the current political jargon would have it, is but one of the paper’s objectives. Its second and certainly more ambitious goal is the search for the normative dimensions of cultural diversity policies in the global digital space, asking what adjustments are needed and how feasible the entire project of diversity regulation in this environment may be. Taking into account the specificities of cyberspace and in a forward-looking manner, we propose some adjustments to current media policy practices that could better serve the goal of a sustainably diverse cultural environment.
Abstract:
This chapter explores cultural protectionism 2.0, i.e. the normative dimensions of cultural diversity policies in the global digital space, asking what adjustments are needed and, indeed, how feasible the entire project of diversity regulation in this environment may be. The complexities of the shift from offline to online and from analogue to digital, and the inherent policy challenges, are illustrated with some (positive and negative) instances of existing media initiatives. Taking into account the specificities of cyberspace and in a forward-looking manner, the chapter suggests some adjustments to current media policy practices in order to better serve the goal of a sustainably diverse cultural environment.
Abstract:
The development of new digital technologies has resulted in significant transformations in daily life, from the arrival of online shopping to more fundamental changes in the ways we work and communicate. Many of these changes raise questions that transcend market access and liberalisation and demand cooperation and coherent regulatory design. International trade regulation has hitherto not reacted in a forward-looking manner to the digital revolution; particularly at the multilateral level, legal engineering has yielded few tangible results. This book examines whether WTO laws possess the necessary flexibility and resilience to accommodate the changes brought about by burgeoning digital trade. By revealing both the potential and the limitations of the WTO framework, it provides a broad picture of the interaction between digital technologies and trade regulation, links the often disconnected discourses of international trade law, intellectual property and cyberlaw, and explores discrete problems in different domains of global trade regulation.
Abstract:
This briefing note was prepared by invitation from the European Parliament's Committee on Culture and Education. It provides an analysis of how existing EU internal policies reflect the spirit and the letter of the UNESCO Convention on Cultural Diversity. The note suggests further ideas on how the EU may calibrate current practices and explores in a forward-looking manner the possibilities for the Convention's implementation in future internal policies, understood as both hard and soft EU legal instruments. Particular attention is paid to digital media and their regulatory implications.
Abstract:
This paper takes the recent abdication of multiculturalism by the leaders of Europe's most powerful nations (Germany, France, and Britain) as the hub for a reflection on common themes in Europe's crisis of multiculturalism. The most obvious common theme in this crisis is Islam and the problems of Muslim integration. Accordingly, this paper addresses the role of religion and Islam in Europe's multiculturalism crisis, and elaborates on the "muscular liberalism" or "civic integration" policies that have appeared in lieu of a discarded multiculturalism. In a final step, I tackle, in a forward-looking mode, some "critical issues" that will shape European immigrant integration after multiculturalism: the need to fight discrimination despite multiculturalism's ebb, a greater concern for majority culture, the importance of robust debate and democracy as a medium of integration, the often-neglected factor of immigrant selection, and a recognition that institutions matter more than policy in the process of integration.
Abstract:
Risk management today has moved from being the topic of top-level conferences and media discussions to being a permanent item on the board and top-management agenda. Several new directives and regulations in Switzerland, Germany and the EU make it obligatory for firms to have a risk management strategy and to disclose the risk management process transparently to their stakeholders. Shareholders, insurance providers, banks, media, analysts, employees, suppliers and other stakeholders expect board members to be proactive in knowing the critical risks facing their organization and to provide them with reasonable assurance regarding the management of those risks. In this environment, however, the lack of standards and training opportunities makes this task difficult for board members. With the help of real-life examples, an analysis of drivers, an interpretation of the Swiss legal requirements, and information based on international benchmarks, this book tries to reach out to the forward-looking leaders of today's businesses. The authors have brought together their years of scientific and practical experience in risk management, Swiss law and board membership to provide board members with practical solutions in risk management. The hope is that this book will dispel the fear surrounding risk management in the minds of company leadership and help them make risk-savvy decisions in the quest to achieve their strategic objectives.
Abstract:
When considering data from many trials, it is likely that some of them present a markedly different intervention effect or exert an undue influence on the summary results. We develop a forward search algorithm for identifying outlying and influential studies in meta-analysis models. The forward search algorithm starts by fitting the hypothesized model to a small subset of likely outlier-free studies and proceeds by adding, one by one, the studies that are closest to the model fitted to the existing set. As each study is added, plots of the estimated parameters and measures of fit are monitored, and outliers are identified by sharp changes in these forward plots. We apply the proposed outlier detection method to two real data sets: a meta-analysis of 26 studies that examines the effect of writing-to-learn interventions on academic achievement, adjusting for three possible effect modifiers, and a meta-analysis of 70 studies that compares a fluoride toothpaste treatment to placebo for preventing dental caries in children. A simple simulated example is used to illustrate the steps of the proposed methodology, and a small-scale simulation study is conducted to evaluate the performance of the proposed method.
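To make the forward search concrete, here is a minimal sketch under simplifying assumptions: a common-effect (inverse-variance) model rather than the hypothesized meta-regression or random-effects model, a starting set chosen as the studies nearest the median effect, and standardized residuals as the closeness measure. The function names and synthetic data are illustrative, not the authors' implementation.

```python
import numpy as np

def pooled_estimate(y, v):
    """Inverse-variance (common-effect) pooled estimate of the effect."""
    w = 1.0 / v
    return np.sum(w * y) / np.sum(w)

def forward_search(y, v, n_start=3):
    """Add studies one by one, always taking the study closest (in standardized
    distance) to the model fitted to the current set, and record the pooled
    estimate at each step; sharp changes in this path flag potential outliers."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    n = len(y)
    # Start from the studies nearest the median effect (a likely outlier-free core).
    in_set = [int(i) for i in np.argsort(np.abs(y - np.median(y)))[:n_start]]
    estimates = []
    while True:
        theta = pooled_estimate(y[in_set], v[in_set])
        estimates.append(theta)
        outside = [i for i in range(n) if i not in in_set]
        if not outside:
            break
        dist = np.abs(y[outside] - theta) / np.sqrt(v[outside])
        in_set.append(outside[int(np.argmin(dist))])
    return in_set, estimates

# Synthetic example: ten well-behaved studies plus one outlying effect (study 10).
rng = np.random.default_rng(1)
v = rng.uniform(0.02, 0.1, size=11)
y = rng.normal(0.3, np.sqrt(v))
y[10] += 1.5
order, est = forward_search(y, v)
print("entry order:", order)                 # the outlier should enter last
print("estimate path:", np.round(est, 3))    # a jump at the final step flags it
```

In practice one would plot the estimate path (and the other monitored diagnostics) against the entry order and look for sharp changes near the end of the search.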
Abstract:
We describe the steady-state function of the ubiquitous mammalian Na/H exchanger (NHE)1 isoform in voltage-clamped Chinese hamster ovary cells, as well as other cells, using oscillating pH-sensitive microelectrodes to quantify proton fluxes via extracellular pH gradients. Giant excised patches could not be used because gigaseal formation disrupts NHE activity within the patch. We first analyzed forward transport at an extracellular pH of 8.2 with no cytoplasmic Na (i.e., nearly zero-trans). The extracellular Na concentration dependence is sigmoidal at a cytoplasmic pH of 6.8, with a Hill coefficient of 1.8. In contrast, at a cytoplasmic pH of 6.0, the Hill coefficient is <1, and the Na dependence often appears biphasic. Results are similar for mouse skin fibroblasts and for an opossum kidney cell line that expresses the NHE3 isoform, whereas NHE1(-/-) skin fibroblasts generate no proton fluxes in equivalent experiments. As proton flux is decreased by increasing cytoplasmic pH, the half-maximal concentration (K(1/2)) of extracellular Na decreases less than expected for simple consecutive ion exchange models. The K(1/2) for cytoplasmic protons decreases with increasing extracellular Na, opposite to the predictions of consecutive exchange models. For reverse transport, which is robust at a cytoplasmic pH of 7.6, the K(1/2) for extracellular protons decreases by only a factor of 0.4 when maximal activity is decreased fivefold by reducing cytoplasmic Na. With 140 mM extracellular Na and no cytoplasmic Na, the K(1/2) for cytoplasmic protons is 50 nM (pH 7.3; Hill coefficient, 1.5), and activity decreases by only 25% with extracellular acidification from 8.5 to 7.2. Most data can be reconstructed with two very different coupled dimer models. In one model, monomers operate independently at low cytoplasmic pH but couple to translocate two ions in "parallel" at alkaline pH. In the second, "serial" model, each monomer transports two ions, and translocation by one monomer allosterically promotes translocation by the paired monomer in the opposite direction. We conclude that a large fraction of mammalian Na/H activity may occur with a 2Na/2H stoichiometry.
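As a worked illustration of the sigmoidal dependence summarized by these Hill coefficients, the sketch below evaluates the standard phenomenological Hill equation, flux = Vmax·c^n / (K(1/2)^n + c^n). The maximal flux, the K(1/2) value and the concentration grid are assumed for illustration and are not fitted to the data above.

```python
import numpy as np

def hill_flux(conc, vmax, k_half, n_hill):
    """Phenomenological Hill equation: flux = vmax * c**n / (k_half**n + c**n)."""
    c = np.asarray(conc, float)
    return vmax * c**n_hill / (k_half**n_hill + c**n_hill)

# Hypothetical parameters echoing the abstract: a sigmoidal extracellular-Na
# dependence with Hill coefficient ~1.8 (cytoplasmic pH 6.8) versus a
# near-hyperbolic dependence with Hill coefficient <1 (cytoplasmic pH 6.0).
na_out = np.linspace(1.0, 140.0, 8)   # extracellular Na, mM (assumed grid)
vmax = 1.0                            # normalized maximal proton flux (assumed)
for n_hill, label in [(1.8, "pHi 6.8"), (0.8, "pHi 6.0")]:
    flux = hill_flux(na_out, vmax, k_half=30.0, n_hill=n_hill)  # K(1/2) assumed
    print(label, np.round(flux, 2))
```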
Abstract:
In this paper, two models for the simulation of glucose-insulin metabolism in children with Type 1 diabetes are presented. The models are based on the combined use of Compartmental Models (CMs) and artificial Neural Networks (NNs). Data from children with Type 1 diabetes, stored in a database, have been used as input to the models. The data are taken from four children with Type 1 diabetes and contain information about glucose levels obtained from a continuous glucose monitoring system, insulin intake and food intake, along with the corresponding times. The influence of administered insulin on plasma insulin concentration, as well as the effect of food intake on glucose input into the blood from the gut, is estimated from the CMs. The outputs of the CMs, along with previous glucose measurements, are fed to a NN, which provides short-term prediction of glucose values. For comparison, two different NN architectures have been tested: a Feed-Forward NN (FFNN) trained with the back-propagation algorithm with adaptive learning rate and momentum, and a Recurrent NN (RNN) trained with the Real Time Recurrent Learning (RTRL) algorithm. The results indicate that the best prediction performance is achieved by the RNN.
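For illustration, the sketch below implements a small feed-forward network trained with plain back-propagation (without the adaptive learning rate and momentum used in the paper) on synthetic stand-ins for the model inputs; the feature choice, network size and data are assumptions, not the paper's models or patient data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the inputs: [previous glucose (scaled), CM-estimated
# plasma insulin, CM-estimated gut glucose appearance]; the target is glucose
# a short horizon ahead. All values are made up for illustration.
n_samples = 200
X = rng.normal(size=(n_samples, 3))
true_w = np.array([0.7, -0.4, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=n_samples)   # synthetic "future glucose"

# One-hidden-layer feed-forward network trained with plain back-propagation.
n_hidden = 8
W1 = rng.normal(scale=0.5, size=(3, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden);      b2 = 0.0
lr = 0.05

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # predicted future glucose
    err = pred - y
    loss = np.mean(err**2)
    # Back-propagate the mean-squared-error gradient.
    g_pred = 2 * err / n_samples
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum()
    g_h = np.outer(g_pred, W2) * (1 - h**2)
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"final training MSE: {loss:.4f}")
```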
Abstract:
Cataract is a known condition leading to opacification of the eye lens, causing partial or total blindness. Mutations are known to cause autosomal dominant or recessive inherited forms of cataract in humans, mice, rats, guinea pigs and dogs. The use of large animal models instead of mouse models for the study of this condition has been discussed, owing to the small size of rodent lenses. Four juvenile-onset cases of bilateral incomplete immature nuclear cataract were recently observed in Romagnola cattle. Pedigree analysis suggested a monogenic autosomal recessive inheritance. In addition to the cataract, one of the cases displayed abnormal head movements. Genome-wide association and homozygosity mapping, and subsequent whole genome sequencing of a single case, identified two perfectly associated sequence variants in a critical interval of 7.2 Mb on cattle chromosome 28: a missense point mutation located in an uncharacterized locus and an 855 bp deletion across the exon 19/intron 19 border of the bovine nidogen 1 (NID1) gene (c.3579_3604+829del). RT-PCR showed that NID1 is expressed in bovine lenses, while the transcript of the second locus was absent. The NID1 deletion leads to the skipping of exon 19 during transcription and is therefore predicted to cause a frameshift and premature stop codon (p.1164fs27X). The truncated protein lacks a C-terminal domain essential for binding with matrix assembly complexes. Nidogen 1-deficient mice show neurological abnormalities and highly irregular crystalline lens alterations. This study adds NID1 to the list of candidate genes for inherited cataract in humans, and this first report of a naturally occurring mutation leading to non-syndromic cataract in cattle provides a potential large animal model for human cataract.
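The frameshift prediction follows from the skipped exon's length not being a multiple of three. The toy example below uses a made-up mini-gene (not the NID1 sequence, whose exon length is not given here) purely to illustrate that principle.

```python
# A toy illustration (not the actual NID1 sequence) of why skipping an exon
# whose length is not a multiple of three shifts the downstream reading frame
# and typically brings a premature stop codon into frame.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def translate_frame(cds):
    """Return the codons read from the start of cds up to the first stop codon."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    read = []
    for codon in codons:
        read.append(codon)
        if codon in STOP_CODONS:
            break
    return read

# Hypothetical mini-gene: exon B is 8 nt long (not divisible by 3), so its loss
# shifts the frame of exon C and exposes an out-of-frame TGA as a premature stop.
exon_a = "ATGGCTGCA"      # 9 nt, in frame
exon_b = "GCTGCAGC"       # 8 nt -- the "skipped" exon
exon_c = "AGCTGAGCTTTAA"  # ends with the natural TAA stop in the normal frame

print("normal :", translate_frame(exon_a + exon_b + exon_c))
print("skipped:", translate_frame(exon_a + exon_c))
```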
Abstract:
Building on theories of impression formation based on faces, this research investigates the impact of job candidates’ facial age appearance on hiring as well as the underlying mechanism. In an experiment, participants decided whether to hire a fictitious candidate aged 50 years, 30 years or without age information. The candidate’s age was signaled either via chronological information (varied by date of birth) or via facial age appearance (varied by a photograph on the résumé). Findings showed that candidates with older-appearing faces—but not chronologically older candidates—triggered impressions of low health and fitness, compared to younger-appearing candidates. These impressions reduced perceptions of person-job fit, which lowered hiring probabilities for older-appearing candidates. These findings provide the first evidence that trait impressions from faces are a determinant of age discrimination in personnel selection. They call for an extension of current models of age discrimination by integrating the effects of face-based trait impressions, particularly with respect to health and fitness.
Abstract:
Vestibular cognition has recently gained attention. Despite numerous experimental and clinical demonstrations, it is not yet clear what vestibular cognition really is. For future research in vestibular cognition, adopting a computational approach will make it easier to explore the underlying mechanisms. Indeed, most modeling approaches in vestibular science include a top-down or a priori component. We review recent Bayesian optimal observer models, and discuss in detail the conceptual value of prior assumptions, likelihood and posterior estimates for research in vestibular cognition. We then consider forward models in vestibular processing, which are required in order to distinguish between sensory input that is induced by active self-motion, and sensory input that is due to passive self-motion. We suggest that forward models are used not only in the service of estimating sensory states but they can also be drawn upon in an offline mode (e.g., spatial perspective transformations), in which interaction with sensory input is not desired. A computational approach to vestibular cognition will help to discover connections across studies, and it will provide a more coherent framework for investigating vestibular cognition.
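As a concrete illustration of the Bayesian optimal-observer idea, the sketch below combines a Gaussian prior over self-motion velocity with a Gaussian vestibular likelihood into a posterior estimate. The zero-velocity prior, noise levels and measurement are illustrative assumptions, not parameters from any particular study.

```python
import numpy as np

def gaussian_posterior(prior_mean, prior_sd, meas, meas_sd):
    """Conjugate Gaussian update: precision-weighted combination of prior and likelihood."""
    prior_prec = 1.0 / prior_sd**2
    meas_prec = 1.0 / meas_sd**2
    post_prec = prior_prec + meas_prec
    post_mean = (prior_prec * prior_mean + meas_prec * meas) / post_prec
    return post_mean, np.sqrt(1.0 / post_prec)

# Prior: the head is probably stationary (mean 0 deg/s, sd 5 deg/s) -- assumed.
# Likelihood: a noisy vestibular measurement of 20 deg/s with sd 10 deg/s -- assumed.
post_mean, post_sd = gaussian_posterior(0.0, 5.0, 20.0, 10.0)
print(f"posterior estimate: {post_mean:.1f} +/- {post_sd:.1f} deg/s")
# The estimate is pulled toward the prior, illustrating how a stationarity
# prior shrinks the perceived self-motion velocity relative to the raw signal.
```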
Abstract:
We report quantitative results from three brittle thrust wedge experiments, comparing numerical results directly with each other and with corresponding analogue results. We first test whether the participating codes reproduce predictions from analytical critical taper theory. Eleven codes pass the stable wedge test, showing negligible internal deformation and maintaining the initial surface slope upon horizontal translation over a frictional interface. Eight codes participated in the unstable wedge test that examines the evolution of a wedge by thrust formation from a subcritical state to the critical taper geometry. The critical taper is recovered, but the models show two deformation modes characterised by either mainly forward dipping thrusts or a series of thrust pop-ups. We speculate that the two modes are caused by differences in effective basal boundary friction related to different algorithms for modelling boundary friction. The third experiment examines stacking of forward thrusts that are translated upward along a backward thrust. The results of the seven codes that run this experiment show variability in deformation style, number of thrusts, thrust dip angles and surface slope. Overall, our experiments show that numerical models run with different numerical techniques can successfully simulate laboratory brittle thrust wedge models at the cm-scale. In more detail, however, we find that it is challenging to reproduce sandbox-type setups numerically, because of frictional boundary conditions and velocity discontinuities. We recommend that future numerical-analogue comparisons use simple boundary conditions and that the numerical Earth Science community defines a plasticity test to resolve the variability in model shear zones.
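For reference, the stable wedge test builds on critical taper theory. The sketch below evaluates one common form of the exact solution for a dry, cohesionless Coulomb wedge (after Dahlen, 1984); the friction angles used are placeholder values, not the benchmark's prescribed material properties.

```python
import numpy as np

def critical_basal_dip(alpha_deg, phi_deg, phi_b_deg):
    """Critical basal dip beta for a dry, cohesionless Coulomb wedge.

    Uses alpha + beta = psi_b - psi_0 with
      psi_0 = 0.5 * (arcsin(sin(alpha) / sin(phi)) - alpha)
      psi_b = 0.5 * (arcsin(sin(phi_b) / sin(phi)) - phi_b)
    where alpha is the surface slope, phi the internal friction angle and
    phi_b the basal friction angle (all input values below are placeholders).
    """
    a, p, pb = np.radians([alpha_deg, phi_deg, phi_b_deg])
    psi_0 = 0.5 * (np.arcsin(np.sin(a) / np.sin(p)) - a)
    psi_b = 0.5 * (np.arcsin(np.sin(pb) / np.sin(p)) - pb)
    return np.degrees(psi_b - psi_0) - alpha_deg

# Placeholder sand-like properties: internal friction 36 deg, basal friction 16 deg.
print(f"critical basal dip, flat-topped wedge: {critical_basal_dip(0.0, 36.0, 16.0):.1f} deg")
print(f"critical basal dip, 10-deg surface slope: {critical_basal_dip(10.0, 36.0, 16.0):.1f} deg")
```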