Abstract:
This paper models the determinants of integration in global real estate security markets. Using both local-currency and U.S. dollar denominated returns, we model conditional correlations across listed real estate sectors and with the global stock market. The empirical results show that financial factors, such as the relationship with the respective equity market, volatility, the relative size of the real estate sector and trading turnover, all play an important role in the degree of integration present. The results also highlight the importance of macro-economic variables: all four of the macro-economic variables modeled provide at least one significant result across the specifications estimated. Factors such as financial and trade openness, monetary independence and the stability of a country’s currency all contribute to the degree of integration reported.
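The time-varying ("conditional") correlations that the paper models can be illustrated with a minimal exponentially weighted (EWMA) estimator on simulated returns. This is a generic sketch of the concept, not the authors' econometric specification; all names and figures are illustrative:

```python
import numpy as np

def ewma_corr(x, y, lam=0.94):
    """EWMA-style conditional correlation for mean-zero return series."""
    cov = np.cov(x, y, bias=True)[0, 1]   # initialise with full-sample moments
    var_x, var_y = np.var(x), np.var(y)
    corr = np.empty(len(x))
    for t in range(len(x)):
        cov = lam * cov + (1 - lam) * x[t] * y[t]
        var_x = lam * var_x + (1 - lam) * x[t] ** 2
        var_y = lam * var_y + (1 - lam) * y[t] ** 2
        corr[t] = cov / np.sqrt(var_x * var_y)
    return corr

# Simulated returns: a listed real estate sector correlated with a global market factor
rng = np.random.default_rng(0)
mkt = rng.standard_normal(500)                    # global stock market returns
re = 0.8 * mkt + 0.6 * rng.standard_normal(500)   # real estate sector returns
rho = ewma_corr(re, mkt)                          # time-varying correlation path
```

The resulting path rises and falls with recent co-movement, which is the kind of variation that can then be regressed on financial and macro-economic factors.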
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, a development viability appraisal consists of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against threshold land value; the basis on which this threshold is established and the level at which it is set are therefore critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, threshold land value is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value, testing them against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high value/low cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
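The residual valuation logic described above can be sketched in a few lines of code. The figures, cost categories and threshold are hypothetical, for illustration only, and are not drawn from the paper:

```python
def residual_land_value(gdv, build_costs, fees, developer_profit, planning_obligations):
    """Residual land value: gross development value less all costs, profit and obligations."""
    return gdv - (build_costs + fees + developer_profit + planning_obligations)

def is_viable(residual, threshold_land_value):
    """A policy target is viable on a site if the residual value clears the threshold."""
    return residual >= threshold_land_value

# Hypothetical site: GDV 10.0m, build costs 6.0m, fees 0.5m, profit 1.5m, obligations 0.8m
rlv = residual_land_value(10.0e6, 6.0e6, 0.5e6, 1.5e6, 0.8e6)   # 1.2m residual
viable = is_viable(rlv, threshold_land_value=1.0e6)             # viable at a 1.0m threshold
```

Because the residual is the small difference between two large numbers, modest shifts in development value swing it sharply, which is the volatility the paper highlights.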
Abstract:
The cell walls of wheat (Triticum aestivum) starchy endosperm are dominated by arabinoxylan (AX), accounting for 65% to 70% of the polysaccharide content. Genes within two glycosyl transferase (GT) families, GT43 (IRREGULAR XYLEM9 [IRX9] and IRX14) and GT47 (IRX10), have previously been shown to be involved in the synthesis of the xylan backbone in Arabidopsis, and close homologs of these have been implicated in the synthesis of xylan in other species. Here, homologs of IRX10 (TaGT47_2) and IRX9 (TaGT43_2), which are highly expressed in wheat starchy endosperm cells, were suppressed by RNA interference (RNAi) constructs driven by a starchy endosperm-specific promoter. The total amount of AX was decreased by 40% to 50% and the degree of arabinosylation was increased by 25% to 30% in transgenic lines carrying either of the transgenes. The cell walls of starchy endosperm in sections of grain from TaGT43_2 and TaGT47_2 RNAi transgenics showed decreased immunolabeling for xylan and arabinoxylan epitopes and approximately 50% decreased cell wall thickness compared with controls. The proportion of AX that was water soluble was not significantly affected, but average AX polymer chain length was decreased in both TaGT43_2 and TaGT47_2 RNAi transgenics. However, the long AX chains seen in controls were absent in TaGT43_2 RNAi transgenics but still present in TaGT47_2 RNAi transgenics. The results support an emerging picture of IRX9-like and IRX10-like proteins acting as key components of the xylan synthesis machinery in both dicots and grasses. Since AX is the main component of dietary fiber in wheat foods, the TaGT43_2 and TaGT47_2 genes are of major importance to human nutrition.
Abstract:
Based on theoretical arguments we propose a possible route for controlling the band-gap in the promising photovoltaic material CdIn2S4. Our ab initio calculations show that the experimental degree of inversion in this spinel (fraction of tetrahedral sites occupied by In) corresponds approximately to the equilibrium value given by the minimum of the theoretical inversion free energy at a typical synthesis temperature. Modification of this temperature, or of the cooling rate after synthesis, is then expected to change the inversion degree, which in turn sensitively tunes the electronic band-gap of the solid, as shown here by Heyd-Scuseria-Ernzerhof screened hybrid functional calculations.
Abstract:
Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that the identification and correction of short-term climate model errors have the potential to improve the modeled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal to decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model's long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilized simulation, (2) determine the time scale of the onset of the bias and its propagation, (3) find the geographical origin of the bias, (4) evaluate the degree of coupling in the development of the bias, (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations and exploring various degrees of coupling. In particular, hindcasts give the time scale of bias onset, regionally restored experiments show the geographical origin, and ocean-only simulations isolate the field responsible for the bias and evaluate the degree of coupling in the bias development. This strategy is applied to four prominent SST biases of the IPSLCM5A-LR coupled model in the tropical Pacific that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears within a few months and is caused by a lack of upwelling due to too-weak meridional coastal winds off Peru.
The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows only modest improvement. The strategy proposed in this study is a further step toward moving from the current ad hoc approach to a bias-targeted, priority-setting, systematic model development approach.
Abstract:
The encoding of goal-oriented motion events varies across different languages. Speakers of languages without grammatical aspect (e.g., Swedish) tend to mention motion endpoints when describing events, e.g., “two nuns walk to a house”, and attach importance to event endpoints when matching scenes from memory. Speakers of aspect languages (e.g., English), on the other hand, are more prone to direct attention to the ongoingness of motion events, which is reflected both in their event descriptions, e.g., “two nuns are walking”, and in their non-verbal similarity judgements. This study examines to what extent native speakers of Swedish (n = 82) with English as a foreign language (FL) restructure their categorisation of goal-oriented motion as a function of their English proficiency and experience with the English language (e.g., exposure, learning). Seventeen monolingual native English speakers from the United Kingdom (UK) were included for comparison purposes. Data on motion event cognition were collected through a memory-based triads matching task, in which a target scene with an intermediate degree of endpoint orientation was matched with two alternative scenes with low and high degrees of endpoint orientation, respectively. Results showed that the preference among the Swedish speakers of L2 English to base their similarity judgements on ongoingness rather than event endpoints was correlated with their use of English in their everyday lives, such that those who often watched television in English approximated the ongoingness preference of the English native speakers. These findings suggest that event cognition patterns may be restructured through exposure to FL audio-visual media. The results thus add to the emerging picture that learning a new language entails learning new ways of observing and reasoning about reality.
Abstract:
Claviceps purpurea is a biotrophic fungal pathogen of grasses that causes ergot disease. The infection process of C. purpurea on rye flowers is accompanied by pectin degradation, and polygalacturonase (PG) activity represents a pathogenicity factor. Wheat is also infected by C. purpurea, and we tested whether the presence of a polygalacturonase-inhibiting protein (PGIP) can affect pathogen infection and ergot disease development. Transgenic wheat plants expressing the bean PvPGIP2 did not show a clear reduction of disease symptoms when infected with C. purpurea. To ascertain the possible cause of this lack of improved resistance in PvPGIP2 plants, we expressed both polygalacturonases present in the C. purpurea genome, cppg1 and cppg2, in Pichia pastoris. In vitro assays using the heterologously expressed PGs and PvPGIP2 showed that neither PG is inhibited by this inhibitor. To further investigate the role of PG in the C. purpurea/wheat system, we demonstrated that the activity of both PGs of C. purpurea is reduced on highly methyl-esterified pectin. Finally, we showed that this reduction in PG activity is relevant in planta, by inoculating C. purpurea onto transgenic wheat plants that overexpress a pectin methyl esterase inhibitor (PMEI) and show a high degree of pectin methyl esterification. We observed reduced disease symptoms in the transgenic line compared with null controls. Together, these results highlight the importance of pectin degradation for ergot disease development in wheat and support the notion that inhibition of pectin degradation may represent a possible route to the control of ergot in cereals.
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, results suggest that there is an equilibrium between the numbers of empirical and conceptual papers on multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. In addition, this study found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end. Organizations are made of interacting layers: between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For example, an organization's reward system may influence employees’ intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher-level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press).
For example, the affectivity of individual employees may influence their team’s interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understanding real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in the practice of multilevel research is less clear. In fact, how much is known about the quantity and quality of multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has actually been empirically studied and published. First, this article outlines a review of multilevel theory, followed by what has been theoretically “put forward” by researchers. Second, it presents what has actually been “practiced”, based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research by describing the last 10 years of research: it quantitatively depicts the types of articles being written and where the majority of empirical and conceptual publications related to multilevel thinking can be found.
Abstract:
Lipid cubic phase films are of increasingly widespread importance, both in the analysis of the cubic phases themselves by techniques including microscopy and X-ray scattering, and in their applications, especially as electrode coatings for electrochemical sensors and as templates for the electrodeposition of nanostructured metal. In this work we demonstrate that the crystallographic orientation adopted by these films is governed by minimization of interfacial energy. This is shown by the agreement between experimental data obtained using grazing-incidence small-angle X-ray scattering (GI-SAXS) and the predicted lowest-energy orientation determined using a theoretical approach we have recently developed. GI-SAXS data show a high degree of orientation for films of both the double diamond phase and the gyroid phase, with the [111] and [110] directions respectively perpendicular to the planar substrate. In each case, this matches the lowest-energy facet calculated for that particular phase.
Abstract:
Introduction: Resistance to anticoagulants in Norway rats (Rattus norvegicus) and house mice (Mus domesticus) has been studied in the UK since the early 1960s. In no other country in the world is our understanding of resistance phenomena so extensive and profound. Almost every aspect of resistance in the key rodent target species has been examined in laboratory and field trials, and results obtained by independent researchers have been published. It is the principal purpose of this document to present a short synopsis of this information. More recently, the development of genetic techniques has provided a definitive means of detecting resistant genotypes among pest rodent populations. Preliminary information from a number of such surveys will also be presented. Resistance in Norway rats: A total of nine different anticoagulant resistance mutations (single nucleotide polymorphisms, or SNPs) are found among Norway rats in the UK. In no other country worldwide are so many different forms of Norway rat resistance present. Among these nine SNPs, five are known to confer on rats that carry them a significant degree of resistance to anticoagulant rodenticides. These mutations are: L128Q, Y139S, L120Q, Y139C and Y139F. The latter three mutations confer, to varying degrees, practical resistance to bromadiolone and difenacoum, the two second-generation anticoagulants in predominant use in the UK. It is the recommendation of RRAG that bromadiolone and difenacoum should not be used against rats carrying the L120Q, Y139C and Y139F mutations, because this will promote the spread of resistance and jeopardise the long-term efficacy of anticoagulants. Brodifacoum, flocoumafen and difethialone are effective against these three genotypes but cannot presently be used because of the regulatory restriction that they may only be applied against rats that are living and feeding predominantly indoors.
Our understanding of the geographical distribution of Norway rat resistance is incomplete but rapidly increasing. In particular, the mapping of the focus of L120Q Norway rat resistance in central-southern England by DNA sequencing is well advanced. We now know that rats carrying this resistance mutation are present across a large part of the counties of Hampshire, Berkshire and Wiltshire, and the resistance spreads into Avon, Oxfordshire and Surrey. It is also found, perhaps as outlier foci, in south-west Scotland and East Sussex. L120Q is currently the most severe form of anticoagulant resistance found in Norway rats and is prevalent over a considerable part of central-southern England. A second form of advanced Norway rat resistance is conferred by the Y139C mutation. This is noteworthy because it occurs in at least four widely dispersed foci, namely in Dumfries and Galloway, Gloucestershire, Yorkshire and Norfolk. Once again, bromadiolone and difenacoum are not recommended for use against rats carrying this genotype, and a concern of RRAG is that continued applications of resisted active substances may result in Y139C becoming more or less ubiquitous across much of the UK. Another type of advanced resistance, the Y139F mutation, is present in Kent and Sussex. This means that Norway rats carrying some degree of resistance to bromadiolone and difenacoum are now found from the south coast of Kent, west into the city of Bristol, to Yorkshire in the north-east and to the south-west of Scotland. This difficult situation can only deteriorate further where these three genotypes exist and resisted anticoagulants are predominantly used against them. Resistance in house mice: Resistance in house mice is not so well understood, but the presence in the UK of two resistant genotypes, L128S and Y139C, is confirmed.
House mice are naturally tolerant of anticoagulants, and such is the nature of this tolerance, and the presence of genetic resistance, that house mice resistant to the first-generation anticoagulants are considered to be widespread in the UK. Consequently, baits containing warfarin, sodium warfarin, chlorophacinone and coumatetralyl are not approved for use against mice. This regulatory position is endorsed by RRAG. Baits containing brodifacoum, flocoumafen and difethialone are effective against house mice and may be applied in practice because house mouse infestations are predominantly indoors. There are some reports of resistance among mice in some areas to the second-generation anticoagulant bromadiolone, while difenacoum remains largely efficacious. Alternatives to anticoagulants: The use of habitat manipulation, that is, the removal of harbourage, denial of the availability of food and the prevention of ingress to structures, is an essential component of sustainable rodent pest management. All are of importance in the management of resistant rodents and have the advantage of not selecting for resistant genotypes. These techniques may be particularly valuable in preventing the build-up of rat infestations. However, none can be used to remove any sizeable extant rat infestation, and for practical reasons their use against house mice is problematic. Few alternative chemical interventions are available in the European Union because of the removal from the market of zinc phosphide, calciferol and bromethalin. Our virtually complete reliance on anticoagulants for the chemical control of rodents in the UK, and more widely in the EU, calls for improved schemes for resistance management. These might involve the use of alternatives to anticoagulant rodenticides. Also important are an increasing knowledge of the distribution of resistance mutations in rats and mice and the use of only fully effective anticoagulants against them.
Abstract:
The international appeal of Hollywood films through the twentieth century has been a subject of interest to economic and film historians alike. This paper employs some of the methods of the economic historian to evaluate key arguments within the film history literature explaining the global success of American films. Through careful analysis of both existing and newly constructed data sets, the paper examines the extent to which Hollywood's foreign earnings were affected by: film production costs; the extent of global distribution networks; and also the international orientation of the films themselves. The paper finds that these factors influenced foreign earnings in quite distinct ways, and that their relative importance changed over time. The evidence presented here suggests a degree of interaction between the production and distribution arms of the major US film companies in their pursuit of foreign markets that would benefit from further archival-based investigation.
Abstract:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis, the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function which, upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter in iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data. The condition number can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that both formulations' sensitivities are related to the balance of error variances, the assimilation window length and correlation length-scales.
We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
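The link between the Hessian, its condition number and correlation length-scales can be illustrated on a toy quadratic problem. For a linear observation operator G, background-error covariance B and observation-error covariance R, the Hessian of a strong-constraint quadratic objective is B⁻¹ + GᵀR⁻¹G. The covariance model below (AR(1)-type correlations) is an assumption chosen for illustration, not the thesis's experimental setup:

```python
import numpy as np

def ar1_cov(n, rho, sigma2=1.0):
    """Background covariance with AR(1)-type correlations; rho sets the length-scale."""
    idx = np.arange(n)
    return sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])

def hessian(B, R, G):
    """Hessian of the quadratic strong-constraint objective: B^-1 + G^T R^-1 G."""
    return np.linalg.inv(B) + G.T @ np.linalg.inv(R) @ G

n = 20
G = np.eye(n)        # observe every state component
R = 0.1 * np.eye(n)  # observation-error covariance

kappa_short = np.linalg.cond(hessian(ar1_cov(n, rho=0.5), R, G))
kappa_long = np.linalg.cond(hessian(ar1_cov(n, rho=0.9), R, G))
# Longer correlation length-scales in B worsen the conditioning of the Hessian
```

A larger condition number signals slower convergence and greater sensitivity to perturbations in the input data, which is why bounding it is informative.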
Abstract:
This special issue is focused on the assessment of algorithms for the observation of Earth’s climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth’s climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
Abstract:
Purpose – The purpose of this paper is to explore the role of the housing market in the transmission of monetary policy to consumption among euro area member states. It has been argued that the housing market in a country matters for this transmission mainly when its mortgage market is well developed. The countries in the euro area follow a single monetary policy; however, their housing and mortgage markets show some heterogeneity, which may lead to different policy effects on aggregate consumption through the housing market. Design/methodology/approach – The housing market can act as a channel of monetary policy shocks to household consumption through changes in house prices and residential investment – the housing market channel. We estimate vector autoregressive models for each country and conduct a counterfactual analysis in order to disentangle the housing market channel and assess its importance across the euro area member states. Findings – We find little evidence of heterogeneity in monetary policy transmission through house prices across the euro area countries. Housing market variations in the euro area seem to be better captured by changes in residential investment than by changes in house prices. As a result we do not find significantly large house price channels. For some of the countries, however, we observe a monetary policy channel through residential investment. The existence of a housing channel may depend on institutional features of the labour market and on factors capturing the degree of household debt, such as the loan-to-value (LTV) ratio. Originality/value – The study contributes to the existing literature by assessing whether a single monetary policy has a different impact on consumption across the euro area countries through their housing and mortgage markets. We disentangle monetary-policy-induced effects on consumption associated with variations in the housing markets due to either house price variations or residential investment changes.
We show that the housing market can play a role in the monetary transmission mechanism even in countries with less developed mortgage markets through variations in residential investment.
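The counterfactual exercise described above can be sketched on simulated data: estimate a VAR, then shut down one transmission channel and compare impulse responses. The variable ordering, coefficients and channel names below are hypothetical, for illustration only, and do not reproduce the paper's identification scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data ordered as [house_prices, residential_investment, consumption]
A_true = np.array([[0.5, 0.0, 0.0],
                   [0.2, 0.4, 0.0],
                   [0.3, 0.2, 0.5]])
T, k = 400, 3
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.standard_normal(k)

# Estimate the VAR(1) coefficient matrix by OLS: y_t = A y_{t-1} + e_t
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

def irf(A, shock_idx, resp_idx, horizon=10):
    """Impulse response of variable resp_idx to a unit shock in shock_idx."""
    e = np.zeros(k)
    e[shock_idx] = 1.0
    return [float((np.linalg.matrix_power(A, h) @ e)[resp_idx]) for h in range(horizon)]

baseline = irf(A_hat, 0, 2)        # consumption response to a house-price shock
A_cf = A_hat.copy()
A_cf[2, 0] = 0.0                   # counterfactual: shut the direct house-price -> consumption link
counterfactual = irf(A_cf, 0, 2)
```

The gap between the baseline and counterfactual responses measures the contribution of the shut-down channel, which is the logic of disentangling the housing market channel country by country.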
Abstract:
Bilingualism is reported to restructure executive control networks, but it remains unknown which aspects of the bilingual experience cause this modulation. This study explores the impact of three code-switching types on executive functions: (1) alternation of languages, (2) insertion of lexicon of one language into the grammar of another, and (3) dense code-switching with co-activation of lexicon and grammar. Current models hypothesise that these types challenge different aspects of the executive system because they vary in the extent and scope of language separation. Two groups of German-English bilinguals differing in dense code-switching frequency participated in a flanker task under conditions varying in the degree of trial mixing and the resulting demands on conflict monitoring. Bilinguals engaging in more dense code-switching showed inhibitory advantages in the condition requiring most conflict monitoring. Moreover, dense code-switching frequency correlated positively with monitoring skills. This suggests that the management of co-activated languages during dense code-switching engages conflict monitoring and that the consolidation processes taking place within co-activated linguistic systems involve local inhibition. Code-switching types requiring greater degrees of language separation may involve more global forms of inhibition. This study shows that dense code-switching is a key experience shaping bilinguals’ executive functioning and highlights the importance of controlling for participants’ code-switching habits in bilingualism research.