986 results for specification


Relevance: 10.00%

Abstract:

The assumption that ‘states' primary goal is survival’ lies at the heart of the neorealist paradigm. A careful examination of the assumption, however, reveals that neorealists draw upon a number of distinct interpretations of the ‘survival assumption’ that are then treated as if they were the same, pointing towards conceptual problems that surround the treatment of state preferences. This article offers a specification that focuses on two questions that highlight the role and function of the survival assumption in the neorealist logic: (i) what do states have to lose if they fail to adopt self-help strategies? and (ii) how does concern for relevant losses motivate state behaviour and affect international outcomes? Answering these questions through an exploration of governing elites' sensitivity towards the regime stability and territorial integrity of the state, in turn, addresses the aforementioned conceptual problems. This specification has further implications for the debates between defensive and offensive realists, potential extensions of the neorealist logic beyond Westphalian states, and the relationship between neorealist theory and policy analysis.

Relevance: 10.00%

Abstract:

This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCMs) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models; others are more novel and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization package to an idealization of the planet Earth with a greatly simplified lower boundary that consists of ocean only: there is no land, no orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SSTs), which are specified everywhere with simple, idealized distributions. Thus in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings, such as those proposed by Held and Suarez (1994) and Boer and Denis (1997), and the Earth-like simulations of the Atmospheric Modeling Intercomparison Project (AMIP, Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the cause of inter-model differences. APE is sponsored by the World Meteorological Organization (WMO) joint Commission on Atmospheric Science (CAS), World Climate Research Program (WCRP) Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each.
Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada in the AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations, but it does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations: one (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE database holds a wealth of data that is now publicly available from the APE web site: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.
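
The statistics in such an atlas are dominated by zonal and time means. As a minimal illustration, the sketch below computes a time-zonal mean and an eddy variance with NumPy; the grid sizes, temperature profile, and noise level are invented stand-ins for an archived APE field, not actual experiment data.

```python
import numpy as np

# Hypothetical model output: temperature on a (time, lat, lon) grid.
ntime, nlat, nlon = 120, 64, 128
lats = np.linspace(-90.0, 90.0, nlat)
rng = np.random.default_rng(0)
temp = (300.0 - 40.0 * np.sin(np.deg2rad(lats))[None, :, None] ** 2
        + rng.normal(0.0, 1.0, (ntime, nlat, nlon)))

# Typical atlas statistics: the time-zonal mean and the eddy variance
# (deviation from the zonal mean), both as functions of latitude.
zonal_time_mean = temp.mean(axis=(0, 2))        # shape (nlat,)
eddy = temp - temp.mean(axis=2, keepdims=True)  # deviation from zonal mean
eddy_variance = (eddy ** 2).mean(axis=(0, 2))   # shape (nlat,)
```

On an aqua-planet with a zonally symmetric SST, the zonal mean carries most of the signal, which is why such profiles dominate the intercomparison figures.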

Relevance: 10.00%

Abstract:

We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing timeseries and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
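
As an illustration of the WMS interface that ncWMS implements, the sketch below assembles a standard WMS 1.3.0 GetMap request as a plain query string. The server URL, layer name, and dimension values are invented placeholders, not a real ncWMS deployment.

```python
from urllib.parse import urlencode

# Hypothetical ncWMS endpoint -- substitute a real server URL.
base_url = "http://example.org/ncWMS/wms"

# Standard WMS 1.3.0 GetMap parameters; TIME and ELEVATION are the optional
# dimension parameters used to select a slice of a 4-D NetCDF variable.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean/sea_water_temperature",  # assumed layer id
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",  # WMS 1.3.0 axis order for EPSG:4326: lat,lon
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2013-06-01T00:00:00Z",
    "ELEVATION": "-5.0",
}
getmap_url = base_url + "?" + urlencode(params)
```

Note the WMS 1.3.0 quirk that for EPSG:4326 the BBOX axis order is latitude first; this is one of the places where the mapping between the WMS data model and gridded scientific data can trip up clients.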

Relevance: 10.00%

Abstract:

Document design and typeface design: a typographic specification for a new Intermediate Greek-English Lexicon by CUP, accompanied by typefaces modified for the specific typographic requirements of the text. The Lexicon is a substantial (over 1400 pages) publication for HE students and academics intended to complement Liddell-Scott (the standard reference for classical Greek since the 1850s), and has been in preparation for over a decade. The typographic appearance of such works has changed very little since the original editions, largely due to the lack of suitable typefaces: early digital proofs of the Lexicon utilised directly digitised versions of historical typefaces, making the entries difficult to navigate and the document uneven in typographic texture. Close collaboration with the editors of the Lexicon, and discussion of the historical precedents for such documents, informed the design at all typographic levels to achieve a highly reader-friendly result that proposes a model for this kind of typography. Uniquely for a work of this kind, typeface design decisions were integrated into the wider document design specification. A rethinking of the complex typography for Greek and English, based on historical editions as well as equivalent bilingual reference works at this level (from OUP, CUP, Brill, Mondadori, and other publishers), led to a redefinition of multi-script typeface pairing for the specific context, taking into account recent developments in typeface design. Specifically, the relative weighting of elements within each entry was redefined, as was the typographic texture of type styles across the two scripts. In detail, Greek typefaces were modified to emphasise clarity and readability, particularly of diacritics, at very small sizes. The relative weights of typefaces typeset side by side were fine-tuned so that the visual hierarchy of the entries was unambiguous despite the dense typesetting.

Relevance: 10.00%

Abstract:

The paper develops a more precise specification and understanding of the process of national-level knowledge accumulation and absorptive capabilities by applying the reasoning and evidence from the firm-level analysis pioneered by Cohen and Levinthal (1989, 1990). In doing so, we acknowledge that significant cross-border effects due to the role of both inward and outward FDI exist and that assimilation of foreign knowledge is not only confined to catching-up economies but is also carried out by countries at the frontier-sharing phase. We postulate a non-linear relationship between national absorptive capacity and the technological gap, due to the effects of the cumulative nature of the learning process and the increase in complexity of external knowledge as the country approaches the technological frontier. We argue that national absorptive capacity and the accumulation of knowledge stock are simultaneously determined. This implies that different phases of technological development require different strategies. During the catching-up phase, knowledge accumulation occurs predominantly through the absorption of trade and/or inward FDI-related R&D spillovers. From the pre-frontier-sharing phase onwards, increases in the knowledge base occur largely through independent knowledge creation and actively accessing foreign-located technological spillovers, inter alia through outward FDI-related R&D, joint ventures and strategic alliances.

Relevance: 10.00%

Abstract:

The assimilation of measurements from the stratosphere and mesosphere is becoming increasingly common as the lids of weather prediction and climate models rise into the mesosphere and thermosphere. However, the dynamics of the middle atmosphere pose specific challenges to the assimilation of measurements from this region. Forecast-error variances can be very large in the mesosphere, and this can render assimilation schemes very sensitive to the details of the specification of forecast-error correlations. An example is shown where observations in the stratosphere are able to produce increments in the mesosphere. Such sensitivity of the assimilation scheme to misspecification of covariances can also amplify any existing biases in measurements or forecasts. Since both models and measurements of the middle atmosphere are known to have biases, the separation of these sources of bias remains an issue. Finally, well-known deficiencies of assimilation schemes, such as the production of imbalanced states or the assumption of zero bias, are proposed as explanations for the inaccurate transport resulting from assimilated winds. The inability of assimilated winds to accurately transport constituents in the middle atmosphere remains a fundamental issue limiting the use of assimilated products for applications involving longer time-scales.
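
The mechanism described above, where an observation in one region produces increments elsewhere through the specified background-error correlations, can be seen in a minimal two-level optimal-interpolation sketch. All numbers below are illustrative, not taken from any operational system.

```python
import numpy as np

# Two-level state: x[0] = stratospheric value, x[1] = mesospheric value.
xb = np.array([210.0, 190.0])     # background (illustrative temperatures, K)

# Background-error covariance B with a large mesospheric variance and a
# specified inter-level correlation; R is the observation-error variance.
sigma = np.array([2.0, 10.0])     # background standard deviation per level
rho = 0.6                         # assumed stratosphere-mesosphere correlation
B = np.array([[sigma[0]**2,           rho*sigma[0]*sigma[1]],
              [rho*sigma[0]*sigma[1], sigma[1]**2          ]])
H = np.array([[1.0, 0.0]])        # observe the stratospheric level only
R = np.array([[1.0]])
y = np.array([213.0])             # stratospheric observation

# Optimal-interpolation / Kalman analysis step: xa = xb + K (y - H xb)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
increment = (K @ (y - H @ xb)).ravel()
# Although only the stratosphere is observed, increment[1] is nonzero:
# the off-diagonal term of B carries the correction into the mesosphere,
# and the large mesospheric variance amplifies it.
```

With these numbers the mesospheric increment (7.2 K) is three times the stratospheric one (2.4 K), which is exactly the kind of sensitivity to the correlation specification the abstract describes.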

Relevance: 10.00%

Abstract:

This paper investigates the acquisition of syntax in L2 grammars. We tested adult L2 speakers of Spanish (English L1) on the feature specification of T(ense), which is different in English and Spanish in so-called subject-to-subject raising structures. We present experimental results with the verb parecer “to seem/to appear” in different tenses, with and without experiencers, and with Tense Phrase (TP), verb phrase (vP) and Adjectival Phrase (AP) complements. The results show that advanced L2 learners can perform just like native Spanish speakers regarding grammatical knowledge in this domain, although the subtle differences between both languages are not explicitly taught. We argue that these results support Full Access approaches to Universal Grammar (UG) in L2 acquisition, by providing evidence that uninterpretable syntactic features can be learned in adult L2, even when such features are not directly instantiated in the same grammatical domain in the L1 grammar.

Relevance: 10.00%

Abstract:

Many macroeconomic series, such as U.S. real output growth, are sampled quarterly, although potentially useful predictors are often observed at a higher frequency. We look at whether a mixed-data sampling (MIDAS) approach can improve forecasts of output growth. The MIDAS specification used in the comparison uses a novel way of including an autoregressive term. We find that the use of monthly data on the current quarter leads to significant improvement in forecasting current and next quarter output growth, and that MIDAS is an effective way to exploit monthly data compared with alternative methods.
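
A MIDAS regression replaces an unrestricted distributed lag on the monthly indicator with a parsimonious weight function. The sketch below uses the common exponential Almon weighting and a plain autoregressive term (the paper's own AR treatment is more subtle); all coefficients and data are invented for illustration, not estimates from the paper.

```python
import numpy as np

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag weights, normalized to sum to one."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_forecast(y_lag, x_monthly, beta0, beta1, lam, theta1, theta2):
    """One-step MIDAS forecast of quarterly growth with an AR(1) term:
       y_t = beta0 + lam * y_{t-1} + beta1 * sum_k w_k(theta) * x_{t,k}."""
    w = exp_almon_weights(theta1, theta2, len(x_monthly))
    return beta0 + lam * y_lag + beta1 * (w @ x_monthly)

# Illustrative numbers only: three monthly observations within the quarter.
x = np.array([0.3, 0.1, 0.4])
yhat = midas_forecast(y_lag=0.5, x_monthly=x,
                      beta0=0.2, beta1=0.8, lam=0.4, theta1=0.1, theta2=-0.05)
```

The point of the weight function is that however many high-frequency lags enter, only the two theta parameters (plus the slope) need to be estimated, which is what makes mixed-frequency regressions feasible.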

Relevance: 10.00%

Abstract:

Proneural genes such as Ascl1 are known to promote cell cycle exit and neuronal differentiation when expressed in neural progenitor cells. However, the mechanisms by which proneural genes activate neurogenesis, and in particular the genes that they regulate, are mostly unknown. We performed a genome-wide characterization of the transcriptional targets of Ascl1 in the embryonic brain and in neural stem cell cultures by location analysis and expression profiling of embryos overexpressing or mutant for Ascl1. The wide range of molecular and cellular functions represented among these targets suggests that Ascl1 directly controls the specification of neural progenitors as well as the later steps of neuronal differentiation and neurite outgrowth. Surprisingly, Ascl1 also regulates the expression of a large number of genes involved in cell cycle progression, including canonical cell cycle regulators and oncogenic transcription factors. Mutational analysis in the embryonic brain and manipulation of Ascl1 activity in neural stem cell cultures revealed that Ascl1 is indeed required for normal proliferation of neural progenitors. This study identified a novel and unexpected activity of the proneural gene Ascl1, and revealed a direct molecular link between the phase of expansion of neural progenitors and the subsequent phases of cell cycle exit and neuronal differentiation.

Relevance: 10.00%

Abstract:

The past few years have seen major advances in the field of NSC (neural stem cell) research with increasing emphasis towards its application in cell-replacement therapy for neurological disorders. However, the clinical application of NSCs will remain largely unfeasible until a comprehensive understanding of the cellular and molecular mechanisms of NSC fate specification is achieved. With this understanding will come an increased possibility to exploit the potential of stem cells in order to manufacture transplantable NSCs able to provide a safe and effective therapy for previously untreatable neurological disorders. Since the pathology of each of these disorders is determined by the loss or damage of a specific neural cell population, it may be necessary to generate a range of NSCs able to replace specific neurons or glia rather than generating a generic NSC population. Currently, a diverse range of strategies is being investigated with this goal in mind. In this review, we focus on the relationship between NSC specification and differentiation and discuss how this information may be used to direct NSCs towards a particular fate.

Relevance: 10.00%

Abstract:

We model strategic interaction in a differentiated input market as a game among two suppliers and n retailers. Each of the upstream firms chooses the specification of the input which it will offer. Then, retailers choose their type from a continuum of possibilities. The decisions made in these two first stages affect the degree of compatibility between each retailer's ideal input specification and that of the inputs offered by the two upstream firms. In a third stage, upstream firms compete by setting input prices. Equilibrium may be of the two-vendor-policy or of the technological-monopoly type.

Relevance: 10.00%

Abstract:

In a symmetric differentiated experimental oligopoly with multiproduct firms, we test the predictive power of the corresponding Bertrand-Nash equilibria. Subjects are not informed of the specification of the underlying demand model. In the presence of intense multiproduct activity, and provided that a parallel pricing rule is imposed on multiproduct firms, strategies tend to confirm the non-cooperative multiproduct solution.

Relevance: 10.00%

Abstract:

We propose, first, a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using Principal Component Analysis, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e. subjects' average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency toward risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over- (under-) weighting of small (large) probabilities predicted in PT; and gender differences, i.e. males being consistently less risk averse than females, but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to the increase in return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635).
Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking: in all treatments, females prefer safer lotteries compared to males. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, by contrast, we find that the effect of incorporating losses into the outcomes is less clear.
At the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that compared to gains-only treatments, sensitivity is lower in the mixed-lottery treatments (SL and LL). In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories like PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to creating large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust context, compatible with present and even future, more complex descriptions of human attitudes towards risk.
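
The two-dimensional structure that Principal Component Analysis extracts from panel-by-panel choices can be sketched on synthetic data. The simulated subjects below are built from two latent traits, average risk taking and sensitivity to the risk premium; this is invented data, not the SGG dataset.

```python
import numpy as np

# Synthetic stand-in for lottery-panel data: each row is a subject, each
# column the riskiness of the option chosen in one panel.
rng = np.random.default_rng(42)
n_subjects, n_panels = 200, 6
premium = np.linspace(0.0, 1.0, n_panels)            # risk premium per panel

avg_risk = rng.normal(0.5, 0.15, (n_subjects, 1))    # trait 1: average risk taking
sensitivity = rng.normal(0.3, 0.10, (n_subjects, 1)) # trait 2: premium sensitivity
noise = rng.normal(0.0, 0.02, (n_subjects, n_panels))
choices = avg_risk + sensitivity * premium + noise

# PCA via SVD of the centered data matrix.
centered = choices - choices.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()
# Because the data were generated from two traits, the first two principal
# components should carry almost all the variance.
```

In the real task the interpretation runs in the opposite direction: PCA is applied to observed choices, and the two recovered components are then read as the two dimensions of risk attitude.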

Relevance: 10.00%

Abstract:

Often, firms have no information on the specification of the true demand model they face. It is, however, well established that trial-and-error algorithms may be used by them in order to learn how to make optimal decisions. Using experimental methods, we identify a property of the information on past actions which helps the seller of two asymmetric demand substitutes to reach the optimal prices more precisely and faster. The property concerns the possibility of disaggregating changes in each product's demand into client exit/entry and shifts from one product to the other.
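
The trial-and-error learning referred to above can be sketched as derivative-free hill climbing on profit: perturb one price, keep the change only if profit rises. The linear substitute demand system and all numbers below are invented for illustration; they are not the demand model or design used in the experiment.

```python
import random

# Hypothetical linear demand for two substitute products (invented numbers):
# q1 = 10 - 2*p1 + p2,  q2 = 8 - 2*p2 + p1;  zero costs for simplicity.
def profit(p1, p2):
    q1 = max(0.0, 10 - 2 * p1 + p2)
    q2 = max(0.0, 8 - 2 * p2 + p1)
    return p1 * q1 + p2 * q2

def learn_prices(steps=5000, delta=0.05, seed=1):
    """Trial-and-error: perturb one price at a time, keep improvements."""
    rng = random.Random(seed)
    p = [1.0, 1.0]
    for _ in range(steps):
        i = rng.randrange(2)
        trial = p[:]
        trial[i] += rng.choice((-delta, delta))
        if trial[i] > 0 and profit(*trial) > profit(*p):
            p = trial
    return p

p1, p2 = learn_prices()
```

For this demand system the profit function is concave with optimum (p1, p2) = (14/3, 13/3), so the hill climber settles within a step size of the optimum; the seller never needs to know the demand specification, only to observe realized profit.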

Relevance: 10.00%

Abstract:

We present a new Bayesian econometric specification for a hypothetical Discrete Choice Experiment (DCE) incorporating respondent ranking information about attribute importance. Our results indicate that a DCE debriefing question that asks respondents to rank the importance of attributes helps to explain the resulting choices. We also examine how the mode of survey delivery (online or mail) impacts model performance, finding that results are not substantively affected by the mode of delivery. We conclude that the ranking data are a complementary source of information about respondent utility functions within hypothetical DCEs.
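
Choices in a DCE are commonly modeled with a multinomial (conditional) logit kernel, on which Bayesian specifications such as this one build. The sketch below computes the choice probabilities for a single choice set; the attributes and taste coefficients are invented for illustration, not estimates from the paper.

```python
import numpy as np

# Hypothetical choice set: each row is an alternative described by two
# attribute dummies plus a cost. Coefficients are illustrative only.
X = np.array([[1.0, 0.0, 5.0],     # alternative A
              [0.0, 1.0, 3.0],     # alternative B
              [0.0, 0.0, 0.0]])    # status quo
beta = np.array([0.8, 0.5, -0.2])  # attribute tastes and cost sensitivity

# Conditional logit: P(i) = exp(V_i) / sum_j exp(V_j), with V = X @ beta.
V = X @ beta
V -= V.max()                       # stabilize the exponentials
probs = np.exp(V) / np.exp(V).sum()
```

In a Bayesian treatment, beta would carry a prior and be sampled rather than fixed, and the ranking information from the debriefing question would enter as an additional component of the likelihood or the prior.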