70 results for Complexity syntactic


Relevance:

20.00%

Publisher:

Abstract:

Developing models to predict the effects of social and economic change on agricultural landscapes is an important challenge. Model development often involves making decisions about which aspects of the system require detailed description and which are reasonably insensitive to the assumptions. However, important components of the system are often left out because parameter estimates are unavailable. In particular, the relative influence of different objectives, such as risk and environmental management, on farmer decision making has proven difficult to quantify. We describe a model that can make predictions of land use on the basis of profit alone or with the inclusion of explicit additional objectives. Importantly, our model is specifically designed to use parameter estimates for additional objectives obtained via farmer interviews. By statistically comparing the outputs of this model with a large farm-level land-use data set, we show that cropping patterns in the United Kingdom contain a significant contribution from farmers' preferences for objectives other than profit. In particular, we found that risk aversion had an effect on the accuracy of model predictions, whereas preference for a particular number of crops grown was less important. While non-profit objectives have frequently been identified as factors in farmers' decision making, our results take this analysis further by demonstrating the relationship between these preferences and actual cropping patterns.
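
A minimal sketch of the general idea, assuming a simple additive objective function: candidate cropping plans are scored either by profit alone or by profit combined with weighted risk and crop-number terms. All crop names, prices, variances and weights below are hypothetical and are not the authors' interview-derived estimates.

```python
# Hypothetical sketch (not the authors' model): score candidate cropping plans by
# expected profit minus weighted penalties for risk and for deviation from a
# preferred number of crops, the kind of additional objectives described above.
from itertools import combinations_with_replacement

CROPS = ["wheat", "barley", "oilseed", "fallow"]                         # illustrative
PROFIT = {"wheat": 820, "barley": 650, "oilseed": 700, "fallow": 150}    # GBP/ha, made up
PROFIT_VAR = {"wheat": 300, "barley": 90, "oilseed": 220, "fallow": 10}  # made-up variances

def plan_score(plan, w_risk=1.0, w_ncrops=20.0, preferred_n=3):
    """Profit minus weighted risk and a penalty for deviating from a preferred crop count."""
    profit = sum(PROFIT[c] for c in plan) / len(plan)
    risk = sum(PROFIT_VAR[c] for c in plan) / len(plan)   # crude variance-based risk proxy
    ncrop_penalty = abs(len(set(plan)) - preferred_n)     # crop-number preference
    return profit - w_risk * risk - w_ncrops * ncrop_penalty

# Enumerate simple four-field plans and pick the best under each objective set.
plans = list(combinations_with_replacement(CROPS, 4))
print("profit only:          ", max(plans, key=lambda p: plan_score(p, w_risk=0, w_ncrops=0)))
print("with extra objectives:", max(plans, key=plan_score))
```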

Relevance:

20.00%

Publisher:

Abstract:

A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn using inhibitory and excitatory neuronal populations (Liley et al. 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features, such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007), and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and character of this complex behaviour, and its relevance for EEG activity, will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.
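
The Liley et al. equations form a substantial system of coupled differential equations and are not reproduced here. As a generic illustration only of how interacting excitatory and inhibitory populations can produce oscillatory activity, a Wilson-Cowan-style rate model can be sketched as follows; all parameter values are illustrative and are not those of the Liley model.

```python
# Generic illustration (a Wilson-Cowan-style rate model, not the Liley et al.
# mean-field equations): two coupled excitatory/inhibitory populations whose
# interaction can produce oscillations for suitable parameter values.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(T=2.0, dt=1e-4, tau_e=0.010, tau_i=0.020,
             w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0, P=1.25, Q=0.0):
    """Euler integration of the excitatory (E) and inhibitory (I) rate equations."""
    n = int(T / dt)
    E, I = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        dE = (-E[t-1] + sigmoid(w_ee * E[t-1] - w_ei * I[t-1] + P)) / tau_e
        dI = (-I[t-1] + sigmoid(w_ie * E[t-1] - w_ii * I[t-1] + Q)) / tau_i
        E[t] = E[t-1] + dt * dE
        I[t] = I[t-1] + dt * dI
    return E, I

E, I = simulate()
print("mean excitatory activity over the run:", round(float(E.mean()), 3))
```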

Relevance:

20.00%

Publisher:

Abstract:

This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.

Relevance:

20.00%

Publisher:

Abstract:

The present article addresses the following question: what variables condition syntactic transfer? Evidence is provided in support of the position that third language (L3) transfer is selective, whereby, at least under certain conditions, it is driven by the typological proximity of the target L3 measured against the other previously acquired linguistic systems (cf. Rothman and Cabrelli Amaro, 2007, 2010; Rothman, 2010; Montrul et al., 2011). To show this, we compare data in the domain of adjectival interpretation between successful first language (L1) Italian learners of English as a second language (L2) at the low to intermediate proficiency level of L3 Spanish, and successful L1 English learners of L2 Spanish at the same levels for L3 Brazilian Portuguese. The data show that, irrespective of the L1 or the L2, these L3 learners demonstrate target knowledge of subtle adjectival semantic nuances obtained via noun-raising, which English lacks and the other languages share. We maintain that such knowledge is transferred to the L3 from Italian (L1) and Spanish (L2) respectively in light of important differences between the L3 learners herein compared to what is known of the L2 Spanish performance of L1 English speakers at the same level of proficiency (see, for example, Judy et al., 2008; Rothman et al., 2010). While the present data are consistent with Flynn et al.’s (2004) Cumulative Enhancement Model, we discuss why a coupling of these data with evidence from other recent L3 studies suggests necessary modifications to this model, offering in its stead the Typological Primacy Model (TPM) for multilingual transfer.

Relevance:

20.00%

Publisher:

Abstract:

One central question in the formal linguistic study of adult multilingual morphosyntax (i.e., L3/Ln acquisition) involves determining the role(s) the L1 and/or the L2 play(s) at the L3 initial state (e.g., Bardel & Falk, Second Language Research 23: 459–484, 2007; Falk & Bardel, Second Language Research: forthcoming; Flynn et al., The International Journal of Multilingualism 8: 3–16, 2004; Rothman, Second Language Research: forthcoming; Rothman & Cabrelli, On the initial state of L3 (Ln) acquisition: Selective or absolute transfer?: 2007; Rothman & Cabrelli Amaro, Second Language Research 26: 219–289, 2010). The present article adds to this general program, testing Rothman's (Second Language Research: forthcoming) model for L3 initial state transfer, which, when relevant in light of specific language pairings, maintains that typological proximity between the languages is the variable that most strongly determines the selection of syntactic transfer. Herein, I present empirical evidence from the later part of the beginning stages of L3 Brazilian Portuguese (BP) by native speakers of English and Spanish, who have attained an advanced level of proficiency in either English or Spanish as an L2. Examining the related domains of syntactic word order and relative clause attachment preference in L3 BP, the data clearly indicate that Spanish is transferred for both experimental groups irrespective of whether it was the L1 or the L2. These results are expected by Rothman's (Second Language Research: forthcoming) model, but not necessarily predicted by other current hypotheses of multilingual syntactic transfer; the implications of this are discussed.

Relevance:

20.00%

Publisher:

Abstract:

This study investigates transfer at the third-language (L3) initial state, testing between the following possibilities: (1) the first language (L1) transfer hypothesis (an L1 effect for all adult acquisition), (2) the second language (L2) transfer hypothesis, where the L2 blocks L1 transfer (often referred to in the recent literature as the ‘L2 status factor’; Williams and Hammarberg, 1998), and (3) the Cumulative Enhancement Model (Flynn et al., 2004), which proposes selective transfer from all previous linguistic knowledge. We provide data from successful English-speaking learners of L2 Spanish at the initial state of acquiring L3 French and L3 Italian relating to properties of the Null-Subject Parameter (e.g. Chomsky, 1981; Rizzi, 1982). We compare these groups to each other, as well as to groups of English learners of L2 French and L2 Italian at the initial state, and conclude that the data are consistent with the predictions of the ‘L2 status factor’. However, we discuss an alternative possible interpretation based on (psycho)typologically-motivated transfer (borrowing from Kellerman, 1983), providing a methodology for future research in this domain to meaningfully tease apart the ‘L2 status factor’ from this alternative account.

Relevance:

20.00%

Publisher:

Abstract:

Contemporary acquisition theorizing has placed a considerable amount of attention on interfaces, points at which different linguistic modules interact. The claim is that vulnerable interfaces cause particular difficulties in L1, bilingual and adult L2 acquisition (e.g. Platzack, 2001; Montrul, 2004; Müller and Hulk, 2001; Sorace, 2000, 2003, 2004, 2005). Accordingly, it is possible that deficits at the syntax–pragmatics interface cause what appears to be particular non-target-like syntactic behavior in L2 performance. This syntax-before-discourse hypothesis is examined in the present study by analyzing null vs. overt subject pronoun distribution in the L2 Spanish of English L1 learners. As ultimately determined by L2 knowledge of the Overt Pronoun Constraint (OPC) (Montalbetti, 1984), the data indicate that L2 learners at the intermediate and advanced levels reset the Null Subject Parameter (NSP), but only advanced learners have acquired a more or less target-like null/overt subject distribution. Against the predictions of Sorace (2004) and in line with Montrul and Rodríguez-Louro (2006), the data indicate an overuse of both overt and null subject pronouns. As a result, this behavior cannot stem from L1 interference alone, suggesting that interface-conditioned properties are simply more complex and therefore harder to acquire. Furthermore, the data from the advanced learners demonstrate that the syntax–pragmatics interface is not a predetermined locus for fossilization (contra, e.g., Valenzuela, 2006).

Relevance:

20.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several thousand-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
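
A minimal sketch, with entirely made-up numbers, of the additivity check described above: summing the surface air temperature responses of the seven single-forcing runs and comparing the total with the response of the all-forcing run.

```python
# Made-up numbers for illustration: is the all-forcing temperature response
# approximately the sum of the single-forcing responses? (The carbon cycle
# response need not behave this way, as noted in the abstract.)
single_forcing_dT = {    # illustrative 20th-century warming contributions (K)
    "CO2": 0.55, "other_GHG": 0.25, "solar": 0.05, "volcanic": -0.10,
    "land_use": -0.05, "sulphate": -0.20, "orbital": 0.00,
}
all_forcing_dT = 0.52    # illustrative all-forcing response (K)

linear_sum = sum(single_forcing_dT.values())
print(f"sum of single-forcing responses: {linear_sum:+.2f} K")
print(f"all-forcing response:            {all_forcing_dT:+.2f} K")
print(f"non-additive residual:           {all_forcing_dT - linear_sum:+.2f} K")
```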

Relevance:

20.00%

Publisher:

Abstract:

Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy compared with decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is that of overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms, and proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms, while better avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
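
A minimal sketch of the J-measure itself, assuming the standard Smyth and Goodman formulation from rule coverage counts; the Prism induction loop and the three pruning strategies are not reproduced here.

```python
# J-measure of a rule "IF condition THEN class", computed from coverage counts.
# This is the quantity the J-pruning family monitors; the pruning logic itself
# is not shown.
from math import log2

def j_measure(n_total, n_cover, n_correct, n_class):
    """n_total: all examples; n_cover: examples matching the rule body;
    n_correct: covered examples of the target class; n_class: target-class examples."""
    p_y = n_cover / n_total        # P(condition)
    p_x = n_class / n_total        # prior P(class)
    p_x_y = n_correct / n_cover    # posterior P(class | condition)

    def term(p, q):                # p * log2(p/q), with 0 * log(0) taken as 0
        return 0.0 if p == 0 else p * log2(p / q)

    # p_y times the cross-entropy between posterior and prior class distributions
    return p_y * (term(p_x_y, p_x) + term(1 - p_x_y, 1 - p_x))

# Example: a rule covering 40 of 200 examples, 35 of them correctly,
# for a class with 80 examples overall.
print(f"J = {j_measure(200, 40, 35, 80):.4f} bits")
```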

Relevance:

20.00%

Publisher:

Abstract:

An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.
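
A minimal sketch, assuming paired time series of observed and modelled fluxes, of the RMSE and bias statistics of the kind reported per flux, site and season in such evaluations; it is not the actual Noah/SLUCM or BUBBLE processing.

```python
# Minimal sketch (not the evaluation code used in the study): RMSE and mean bias
# for modelled vs. observed surface energy fluxes, skipping observational gaps.
import numpy as np

def flux_stats(observed, modelled):
    obs = np.asarray(observed, dtype=float)
    mod = np.asarray(modelled, dtype=float)
    ok = ~np.isnan(obs) & ~np.isnan(mod)      # ignore missing observations
    err = mod[ok] - obs[ok]
    return {"rmse": float(np.sqrt(np.mean(err ** 2))),
            "bias": float(np.mean(err)),
            "n": int(ok.sum())}

# Illustrative hourly sensible heat flux values (W m-2), entirely made up.
obs = [35, 80, 150, 210, np.nan, 190, 120, 60]
mod = [40, 95, 140, 230, 250, 175, 110, 55]
print(flux_stats(obs, mod))
```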

Relevance:

20.00%

Publisher:

Abstract:

It is widely assumed that the British are poorer modern foreign language (MFL) learners than their fellow Europeans. Motivation has often been seen as the main cause of this perceived disparity in language learning success. However, there have also been suggestions that curricular and pedagogical factors may play a part. This article reports a research project investigating how German and English 14- to 16-year-old learners of French as a first foreign language compare to one another in their vocabulary knowledge and in the lexical diversity, accuracy and syntactic complexity of their writing. Students from comparable schools in Germany and England were set two writing tasks which were marked by three French native speakers using standardised criteria aligned to the Common European Framework of Reference (CEF). Receptive vocabulary size and lexical diversity were established by the X_lex test and a verb types measure respectively. Syntactic complexity and formal accuracy were respectively assessed using the mean length of T-units (MLTU) and words/error metrics. Students' and teachers' questionnaires and semi-structured interviews were used to provide information and participants' views on classroom practices, while typical textbooks and feedback samples were analysed to establish differences in materials-related input and feedback in the two countries. The German groups were found to be superior in vocabulary size, and in the accuracy, lexical diversity and overall quality – but not the syntactic complexity – of their writing. The differences in performance outcomes are analysed and discussed with regard to variables related to the educational contexts (e.g. curriculum design and methodology).
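
A minimal sketch of how the two writing measures named above are derived once T-units and errors have been identified by hand; the counts below are purely illustrative.

```python
# Illustrative only: MLTU and words/error from manually obtained counts.
# T-unit segmentation and error identification are done by human markers in
# studies of this kind; no automatic parsing is implied here.
def mltu(total_words, n_t_units):
    """Mean length of T-units: words per T-unit, a syntactic complexity proxy."""
    return total_words / n_t_units

def words_per_error(total_words, n_errors):
    """Words/error: higher values indicate greater formal accuracy."""
    return total_words / n_errors if n_errors else float("inf")

# Hypothetical essay: 164 words, 19 T-units, 12 errors.
print(f"MLTU        = {mltu(164, 19):.2f}")
print(f"words/error = {words_per_error(164, 12):.2f}")
```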

Relevance:

20.00%

Publisher:

Abstract:

Greater self-complexity has been suggested as a protective factor for people under stress (Linville, 1985). Two different measures have been proposed to assess individual self-complexity: Attneave’s H statistic (1959) and a composite index of two components of self-complexity (SC; Rafaeli-Mor et al., 1999). Using mood-incongruent recall, i.e., recalling positive events while in a negative mood, the present study compared the validity of the two measures through a reanalysis of Sakaki’s (2004) data. Results indicated that the H statistic did not predict performance in mood-incongruent recall. In contrast, greater SC was associated with better mood-incongruent recall even when the effect of the H statistic was controlled for.
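
A minimal sketch of the H statistic, assuming the entropy formula commonly used for self-complexity, H = log2(n) - sum_i(n_i * log2(n_i)) / n, where n is the number of attributes sorted and n_i is the number of attributes sharing the i-th distinct combination of self-aspect groups; the trait sort below is hypothetical.

```python
# Hypothetical trait sort illustrating the H statistic used as a self-complexity
# measure; the SC composite index of Rafaeli-Mor et al. is not reproduced here.
from collections import Counter
from math import log2

def h_statistic(trait_sort):
    """trait_sort maps each trait to the set of self-aspects it was assigned to."""
    n = len(trait_sort)
    combo_sizes = Counter(frozenset(aspects) for aspects in trait_sort.values())
    return log2(n) - sum(n_i * log2(n_i) for n_i in combo_sizes.values()) / n

traits = {  # made-up participant data
    "outgoing": {"work self", "social self"},
    "organised": {"work self"},
    "anxious": {"work self"},
    "warm": {"social self", "family self"},
    "patient": {"family self"},
    "curious": {"student self"},
}
print(f"H = {h_statistic(traits):.3f}")
```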

Relevance:

20.00%

Publisher:

Abstract:

The synthesis and characterization of five new indium selenides, [C9H17N2]3[In5Se8+x(Se2)1−x] (1–2), [C6H12N2]4[C6H14N2]3[In10Se15(Se2)3] (3), [C6H14N2][(C6H12N2)2NaIn5Se9] (4) and [enH2][NH4][In7Se12] (5), are described. These materials were prepared under solvothermal conditions, using 1,8-diazabicyclo[5.4.0]undec-7-ene (DBU) and 1,4-diazabicyclo[2.2.2]octane (DABCO) as structure-directing agents. Compounds 1–4 represent the first examples of ribbons in indium selenides, and 4 is the first example of incorporation of an alkali metal complex. Compounds 1, 2 and 4 contain closely related [In5Se8+x(Se2)1−x]3− ribbons which differ only in their content of (Se2)2− anions. These ribbons are interspaced by organic countercations in 1 and 2, while in 4 they are linked by highly unusual [Na(DABCO)2]+ units into a three-dimensional framework. Compound 3 contains complex ribbons, with a long repeating sequence of ca. 36 Å, and 5 is a non-centrosymmetric three-dimensional framework, formed as a consequence of the decomposition of DABCO into ethylenediamine (en) and ammonia.

Relevance:

20.00%

Publisher:

Abstract:

Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.
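
A minimal sketch of the multi-lognormal representation of the particle number size distribution that underlies the Aitken- and accumulation-mode comparisons discussed above; the mode parameters are illustrative and are not taken from any of the twelve models.

```python
# Sum of lognormal modes, dN/dlog10(D), with illustrative Aitken and accumulation
# mode parameters; not the output of any particular aerosol microphysics model.
import numpy as np

def lognormal_mode(D, N, D_g, sigma_g):
    """dN/dlog10(D) for one mode: number N (cm-3), median diameter D_g (nm),
    geometric standard deviation sigma_g."""
    return (N / (np.sqrt(2 * np.pi) * np.log10(sigma_g))
            * np.exp(-(np.log10(D / D_g) ** 2) / (2 * np.log10(sigma_g) ** 2)))

D = np.logspace(0, 3, 200)                      # diameters from 1 nm to 1 micron
modes = [dict(N=1500, D_g=40, sigma_g=1.7),     # Aitken mode
         dict(N=800, D_g=150, sigma_g=1.5)]     # accumulation mode
dN_dlogD = sum(lognormal_mode(D, **m) for m in modes)
print("peak dN/dlog10(D):", round(float(dN_dlogD.max())), "cm-3")
```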

Relevance:

20.00%

Publisher:

Abstract:

The inclusion of the direct and indirect radiative effects of aerosols in high-resolution global numerical weather prediction (NWP) models is being increasingly recognised as important for the improved accuracy of short-range weather forecasts. In this study the impacts of increasing the aerosol complexity in the global NWP configuration of the Met Office Unified Model (MetUM) are investigated. A hierarchy of aerosol representations is evaluated, including three-dimensional monthly mean speciated aerosol climatologies, fully prognostic aerosols modelled using the CLASSIC aerosol scheme and, finally, initialised aerosols using assimilated aerosol fields from the GEMS project. The prognostic aerosol schemes are better able to predict the temporal and spatial variation of atmospheric aerosol optical depth, which is particularly important in cases of large sporadic aerosol events such as large dust storms or forest fires. Including the direct effect of aerosols improves model biases in outgoing long-wave radiation over West Africa due to a better representation of dust. However, uncertainties in dust optical properties propagate to its direct effect and the subsequent model response. Inclusion of the indirect aerosol effects improves surface radiation biases at the North Slope of Alaska ARM site due to lower cloud amounts in high-latitude clean-air regions. This leads to improved temperature and height forecasts in this region. Impacts on the global mean model precipitation and large-scale circulation fields were found to be generally small in the short-range forecasts. However, the indirect aerosol effect leads to a strengthening of the low-level monsoon flow over the Arabian Sea and Bay of Bengal and an increase in precipitation over Southeast Asia. Regional impacts on the African Easterly Jet (AEJ) are also presented, with the large dust loading in the aerosol climatology enhancing the heat low over West Africa and weakening the AEJ. This study highlights the importance of including a more realistic treatment of aerosol–cloud interactions in global NWP models and the potential for improved global environmental prediction systems through the incorporation of more complex aerosol schemes.