893 results for The big one (film)
Abstract:
There is disagreement about the routes taken by populations speaking Bantu languages as they expanded to cover much of sub-Saharan Africa. Here, we build phylogenetic trees of Bantu languages and map them onto geographical space in order to assess the likely pathway of expansion and to test between dispersal scenarios. The results clearly support a scenario in which groups first moved south through the rainforest from a homeland somewhere near the Nigeria–Cameroon border. Emerging on the south side of the rainforest, one branch moved south and west. Another branch moved towards the Great Lakes, eventually giving rise to the monophyletic clade of East Bantu languages spoken across East and Southeastern Africa. These phylogenies also reveal information about more general processes involved in the diversification of human populations into distinct ethnolinguistic groups. Our study reveals that Bantu languages show a latitudinal gradient, covering greater areas with increasing distance from the equator. Analyses suggest that this pattern reflects a true ecological relationship rather than merely an artefact of shared history. The study shows how a phylogeographic approach can address questions relating to the specific histories of particular groups, as well as to general cultural evolutionary processes.
Abstract:
This chapter explores the politics around the role of agency in the UK climate change debate. Government interventions on the demand side of consumption have increasingly involved attempts to gain greater traction with the values, attitudes and beliefs of citizens in relation to climate change, and to influence consumer behaviour at an individual level. With figures showing that approximately 40% of the UK’s carbon emissions are attributable to household and transport behaviour, policy initiatives have progressively focused on the facilitation of “sustainable behaviours”. Evidence suggests, however, that mobilisation of pro-environmental attitudes to address the perceived “value-action gap” has so far had limited success. Research in this field suggests that there is a more significant and nuanced “gap” between context and behaviour, a relationship that perhaps better reflects why people do not react in the way that policy-makers anticipate. Tracing the development of the UK Government’s behaviour change agenda over the last decade, we posit that a core reason for the limitations of this programme is an excessively narrow focus on the individual, which has served to obscure some of the wider political and economic aspects of the debate in favour of a more simplified discussion. The second part of the chapter reports findings from a series of focus groups exploring some of the wider political views that people hold around household energy habits, the purchase and use of domestic appliances, and transport behaviour, and discusses these insights in relation to the literature on the agenda’s apparent limitations. The chapter concludes by considering whether the aims of the Big Society approach (recently established by the UK’s Coalition Government) hold the potential to engage more directly with some of these issues, or whether they merely constitute a “repackaging” of the individualism agenda.
Abstract:
This study examines the evolution of prices in markets with Internet price-comparison search engines. The empirical study analyzes laboratory data on prices available to informed consumers, for two industry sizes and two sampling conditions (complete and incomplete). The resulting price distributions are typically bimodal. One of the two modes, corresponding to monopoly pricing, attracts an increasing share of pricing strategies over time; the second, corresponding to interior pricing, follows a decreasing trend. Monopoly pricing can serve as a means of insurance against more competitive (but riskier) behavior. In fact, experimental subjects who initially earn low profits due to interior pricing are more likely to switch to monopoly pricing than subjects who experience good returns from the start.
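The switching dynamic described above lends itself to a small simulation. The sketch below is purely illustrative: it assumes sellers who share a captive consumer segment and either post a monopoly price or a random interior price, with low earners switching to the safe monopoly mode. The parameters and the profit rule are hypothetical, not the paper's experimental design.

# Illustrative simulation of the switching dynamic (hypothetical
# parameters; not the study's experimental design).
import random

MONOPOLY_PRICE = 100          # reservation price of captive consumers
N_SELLERS, N_ROUNDS = 4, 50   # "industry size" and repetitions
CAPTIVE_SHARE = 0.3           # demand that ignores the search engine

def profit(price, prices):
    # Captive demand is split evenly; informed demand goes to the cheapest.
    captive = CAPTIVE_SHARE / len(prices) * price
    informed = (1 - CAPTIVE_SHARE) * price if price == min(prices) else 0.0
    return captive + informed

strategies = ["interior"] * N_SELLERS
for rnd in range(N_ROUNDS):
    prices = [MONOPOLY_PRICE if s == "monopoly" else random.uniform(20, 80)
              for s in strategies]
    for i, s in enumerate(strategies):
        # Sellers earning little from interior pricing insure themselves
        # by switching to the monopoly mode, mirroring the observed pattern.
        if s == "interior" and profit(prices[i], prices) < 15:
            strategies[i] = "monopoly"
print(strategies)

Run repeatedly, the population drifts towards the monopoly mode, which is the qualitative bimodality the abstract reports.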
Abstract:
Enterprise Architecture (EA) has been recognised as an important tool in modern business management for closing the gap between strategy and its execution. The current literature implies that for EA to be successful, it should have clearly defined goals. However, the goals of different stakeholders are found to be different, even contradictory. In our explorative research, we seek answers to the questions: What kind of goals are set for the EA implementation? How do the goals evolve over time? Are the goals different among stakeholders? How do they affect the success of EA? We analysed an EA pilot conducted among eleven Finnish Higher Education Institutions (HEIs) in 2011. The goals of the pilot were gathered at three different stages: before the pilot from the project plan, during the pilot through interviews, and after the pilot via a questionnaire. The data were analysed using qualitative and quantitative methods. Eight distinct goals were identified through coding: Adopt EA Method, Build Information Systems, Business Development, Improve Reporting, Process Improvement, Quality Assurance, Reduce Complexity, and Understand the Big Picture. The success of the pilot was analysed statistically on a scale of 1-5. Results revealed that goals set before the pilot were very different from those mentioned during or after the pilot. Goals before the pilot were mostly related to expected benefits, whereas the most important result was adoption of the EA method. These results can be explained by the possibly different roles of respondents, which in turn were most likely caused by poor communication. Interestingly, goals mentioned by different stakeholders were not limited to their traditional areas of responsibility. For example, in some cases Chief Information Officers' goals were Quality Assurance and Process Improvement, whereas managers' goals were Build Information Systems and Adopt EA Method. This could result from a good understanding of the meaning of EA, or from stakeholders not regarding EA as their concern at all. It is also interesting to note that, regardless of the different perceptions of goals among stakeholders, all HEIs considered the pilot successful. Thus the research does not provide support for the link between clear goals and success.
Abstract:
Most prominent models of bilingual representation assume a degree of interconnection or shared representation at the conceptual level. However, given the linguistic and cultural specificity of human concepts, and recent findings revealing a considerable amount of bidirectional conceptual transfer and conceptual change in bilinguals, a particular challenge for bilingual models is to account for the non-equivalence or partial equivalence of L1- and L2-specific concepts in the bilingual conceptual store. The aim of the current paper is to provide a state-of-the-art review of the available empirical evidence from the fields of psycholinguistics and cognitive, experimental, and cross-cultural psychology, and to discuss how these findings may inform and further develop traditional and more recent accounts of bilingual conceptual representation. Based on a synthesis of the available evidence against the theoretical postulates of existing models, I argue that the most coherent account of bilingual conceptual representation combines three fundamental assumptions. The first is the distributed, multi-modal nature of representation. The second concerns cross-linguistic and cross-cultural variation of concepts. The third makes assumptions about the development of concepts and the emergent links between those concepts and their linguistic instantiations.
Abstract:
By the mid-1930s the major Hollywood studios had developed extensive networks of distribution subsidiaries across five continents. This article focuses on the operation of American film distributors in Australia – one of Hollywood's largest foreign markets. Drawing on two unique primary datasets, the article compares and investigates film distribution in Sydney's first-run and suburban-run markets. It finds that the subsidiaries of US film companies faced a greater liability of foreignness in the city centre market than in the suburban one. Our data support the argument that film audiences in local or suburban cinema markets were more receptive to Hollywood entertainment than those in metropolitan centres.
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK’s BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s⁻¹ crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage effectively fits a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, and yields very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
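As a rough illustration of the greedy selection loop described above (a sketch, not the paper's method): each candidate RBF centre is scored by the mutual information between predicted and true labels, and the best one is added to the model. Here sklearn's mutual_info_score stands in for the derived LOOMI criterion, and the Bayesian regularisation step and automatic termination are replaced by a plain least-squares fit with a fixed term budget.

# Schematic greedy forward selection of RBF centres. mutual_info_score
# is a stand-in for the paper's leave-one-out mutual information (LOOMI);
# the Bayesian regularisation step is not reproduced.
import numpy as np
from sklearn.metrics import mutual_info_score

def rbf(X, centre, width=1.0):
    # Gaussian basis function centred on one training sample.
    return np.exp(-np.sum((X - centre) ** 2, axis=1) / (2 * width ** 2))

def ofs_select(X, y, max_terms=10):
    # X: (n, d) array; y: (n,) array of 0/1 labels.
    selected, design = [], []
    for _ in range(max_terms):
        best_gain, best_idx = -np.inf, None
        for i in range(len(X)):          # every sample is a candidate centre
            if i in selected:
                continue
            Phi = np.column_stack(design + [rbf(X, X[i])])
            w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            pred = (Phi @ w > 0.5).astype(int)
            gain = mutual_info_score(y, pred)
            if gain > best_gain:
                best_gain, best_idx = gain, i
        selected.append(best_idx)
        design.append(rbf(X, X[best_idx]))
    return selected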
Abstract:
Sea level change predicted by the CMIP5 atmosphere–ocean general circulation models (AOGCMs) is not spatially homogeneous. In particular, the sea level change in the North Atlantic is usually characterised by a meridional dipole pattern with higher sea level rise north of 40°N and lower to the south. The spread among models is also high in that region. Here we evaluate the role of surface buoyancy fluxes by carrying out simulations with the FAMOUS low-resolution AOGCM forced by surface freshwater and heat flux changes from CO2-forced climate change experiments with CMIP5 AOGCMs, and by a standard idealised surface freshwater flux applied in the North Atlantic. Both kinds of buoyancy flux change lead to the formation of the sea level dipole pattern, although the effect of the heat flux has a greater magnitude, and is the main cause of the spread of results among the CMIP5 models. By using passive tracers in FAMOUS to distinguish between additional and redistributed buoyancy, we show that the enhanced sea level rise north of 40°N is mainly due to the direct steric effect (the reduction of sea water density) caused by adding heat or freshwater locally. The surface buoyancy forcing also causes a weakening of the Atlantic meridional overturning circulation, and the consequent reduction of the northward ocean heat transport imposes a negative tendency on sea level rise, producing the reduced rise south of 40°N. However, unlike previous authors, we find that this indirect effect of buoyancy forcing is generally less important than the direct one, except in a narrow band along the east coast of the US, where it plays a major role and leads to sea level rise, as found by previous authors.
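The direct steric effect invoked above can be made concrete with a standard back-of-the-envelope calculation: the local steric sea level change is the depth integral of the fractional density anomaly, Δh ≈ -∫ (Δρ/ρ₀) dz. The numbers in the sketch below are illustrative only and are not taken from the FAMOUS or CMIP5 experiments.

# Direct steric effect: sea level rise from a local density decrease,
# dh = -integral(d_rho / rho0) dz. All values below are illustrative.
import numpy as np

rho0 = 1027.0                          # reference density, kg m^-3
z = np.linspace(0.0, 1000.0, 101)      # depth levels, m
d_rho = -0.05 * np.exp(-z / 300.0)     # assumed warming/freshening anomaly
dz = z[1] - z[0]
dh = -np.sum(d_rho / rho0) * dz        # steric height change, m
print(f"steric sea level rise: {dh * 100:.1f} cm")   # about 1.4 cm

A negative density anomaly (lighter water) integrates to a positive Δh, which is why adding heat or freshwater locally raises sea level north of 40°N in the experiments described.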
Abstract:
The analysis step of the (ensemble) Kalman filter is optimal when (1) the distribution of the background is Gaussian, (2) state variables and observations are related via a linear operator, and (3) the observational error is of additive nature and has a Gaussian distribution. When these conditions are strongly violated, a pre-processing step known as Gaussian anamorphosis (GA) can be applied. The objective of this procedure is to obtain state variables and observations that better fulfil the Gaussianity conditions in some sense. In this work we analyse GA from a joint perspective, paying attention to the effects of transformations in the joint state variable/observation space. First, we study transformations for state variables and observations that are independent of each other. Then, we introduce a targeted joint transformation with the objective of obtaining joint Gaussianity in the transformed space. We focus primarily on the univariate case, and briefly comment on the multivariate one. A key point of this paper is that, when (1)-(3) are violated, the analysis step of the EnKF will not recover the exact posterior density regardless of any transformations one may perform. These transformations, however, provide approximations of different quality to the Bayesian solution of the problem. Using an example in which the Bayesian posterior can be computed analytically, we assess the quality of the analysis distributions generated after applying the EnKF analysis step in conjunction with different GA options. The value of the targeted joint transformation is particularly clear when the prior is Gaussian, the marginal density of the observations is close to Gaussian, and the likelihood is a Gaussian mixture.
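For the independent, variable-by-variable transformations discussed above, a common implementation of Gaussian anamorphosis maps a sample through its empirical CDF and then through the standard normal quantile function. The sketch below illustrates that univariate construction; it does not implement the paper's targeted joint transformation.

# Minimal univariate Gaussian anamorphosis: empirical CDF followed by
# the standard normal quantile function, giving an approximately
# N(0, 1) transformed sample.
import numpy as np
from scipy import stats

def anamorphosis(x):
    n = len(x)
    ranks = stats.rankdata(x)          # empirical CDF via ranks
    u = ranks / (n + 1)                # keep u strictly inside (0, 1)
    return stats.norm.ppf(u)           # Gaussian quantiles

x = np.random.gamma(shape=2.0, size=1000)   # skewed, non-Gaussian sample
z = anamorphosis(x)
print(np.mean(z), np.std(z))                # approximately 0 and 1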
Abstract:
For much of lowland Britain during the Holocene, one important factor in determining environmental change was sea level fluctuation. A net rise of circa 20 m, within an oscillating short-term picture of transgression and regression, posed significant short- to medium-term challenges for the people exploiting lowland resources. During transgression phases estuarine creek systems extended landwards, and during the final transgression phase widespread sedimentation took place, allowing for the development of saltmarshes on tidal flats. In later prehistory the exploitation of lowlands and estuarine wetlands was predominantly for fishing, waterfowling and pastoral use, and this paper explores the human ecodynamics of the intertidal zone in the Humber estuary during the Bronze Age. Results of the Humber Wetlands Project's recent estuarine survey are used to argue that, following a marine transgression circa 1500 cal BC, the foreshore was fully exploited for food procurement. Furthermore, the construction of hurdle trackways allowed access across expanding tidal creek systems to be maintained. This not only shows continued use of the most productive environments and provides evidence for the selective use of woodland; it also suggests that the continued exploitation of the intertidal zone may have played a role in the evolution of social and political structures in this area during the Bronze Age.
Abstract:
Of all the various definitions of the polar cap boundary that have been used in the past, the most physically meaningful and significant is the boundary between open and closed field lines. Locating this boundary is very important, as it defines which regions and phenomena are on open field lines and which are on closed ones; this usually has fundamental implications for the mechanisms invoked. Unfortunately, the open-closed boundary is usually very difficult to identify, particularly where it maps to an active reconnection site. This paper examines the topological reconnection classes that can take place, both at the magnetopause and in the cross-tail current sheet, and discusses the implications for identifying the open-closed boundary when reconnection is producing velocity-filter dispersion of signatures. On the dayside, it is shown that the dayside boundary plasma sheet and low-latitude boundary layer precipitations are well explained as being on open field lines, energetic ions being present because of reflection of central plasma sheet ions off the two Alfvén waves launched by the reconnection site (the outer one of which is the magnetopause). This also explains otherwise anomalous features of the dayside convection pattern in the cusp region. On the nightside, similar considerations place the open-closed boundary somewhat poleward of the velocity-dispersed ion structures that are a signature of the plasma sheet boundary layer ion flows in the tail.
Abstract:
Numerical simulations are presented of the ion distribution functions seen by middle-altitude spacecraft in the low-latitude boundary layer (LLBL) and cusp regions when reconnection is, or has recently been, taking place at the equatorial magnetopause. From the evolution of the distribution function with time elapsed since the field line was opened, both the observed energy/observation-time and pitch-angle/energy dispersions are well reproduced. Distribution functions showing a mixture of magnetosheath and magnetospheric ions, often thought to be a signature of the LLBL, are found on newly opened field lines as a natural consequence of the magnetopause effects on the ions and their flight times. In addition, it is shown that the extent of the source region of the magnetosheath ions that are detected by a satellite is a function of the sensitivity of the ion instrument. If the instrument one-count level is high (and/or solar-wind densities are low), the cusp ion precipitation detected comes from a localised region of the mid-latitude magnetopause (around the magnetic cusp), even though the reconnection takes place at the equatorial magnetopause. However, if the instrument sensitivity is high enough, then ions injected from a large segment of the dayside magnetosphere (in the relevant hemisphere) will be detected in the cusp. Ion precipitation classed as LLBL is shown to arise from the low-latitude magnetopause, irrespective of the instrument sensitivity. Adoption of threshold flux definitions has the same effect as instrument sensitivity in artificially restricting the apparent source region.
Abstract:
When the solar wind blows: The northern lights are a sign of the awesome power that the Earth receives from the solar wind. The big puzzle is how …
Abstract:
Factor forecasting models are shown to deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
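As a schematic of the model comparison described above, the sketch below fits an AR(1) benchmark and a factor-augmented regression on a synthetic panel. The data, the three-factor choice, and the in-sample R² comparison are illustrative stand-ins; the paper's real-time vintages and out-of-sample design are not reproduced here.

# Sketch: AR(1) forecast vs a forecast augmented with principal-component
# factors extracted from a panel of predictors. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, N = 200, 50
panel = rng.standard_normal((T, N))            # stand-in predictor panel
y = 0.5 * np.roll(panel[:, 0], 1) + rng.standard_normal(T)

factors = PCA(n_components=3).fit_transform(panel)

# AR(1) benchmark: regress y_t on y_{t-1}.
ar = LinearRegression().fit(y[:-1].reshape(-1, 1), y[1:])
# Factor model: regress y_t on y_{t-1} and the factors at t-1.
X = np.column_stack([y[:-1], factors[:-1]])
fm = LinearRegression().fit(X, y[1:])

print("AR in-sample R2:    ", ar.score(y[:-1].reshape(-1, 1), y[1:]))
print("Factor in-sample R2:", fm.score(X, y[1:]))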