156 results for Narrow Polydispersity
Abstract:
We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent both with measured carbon fluxes and states and with a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover or to the temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to, and turnover of, the fine root and wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from the synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals of <110 gC m−2 year−1. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data, and the estimates of ecosystem respiration and GPP from MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared with the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
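As a rough illustration of the Metropolis-type estimation used by several participants, the sketch below fits a deliberately simplified carbon model to synthetic NEE data in a twin experiment. The three-parameter model, the Gaussian likelihood, the step size and all parameter names are illustrative assumptions, not the REFLEX model or any participant's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

def nee_model(params, temperature):
    """Toy stand-in for a simple C model: NEE = respiration - GPP.
    The parameters (base_resp, q10, gpp_max) are illustrative only."""
    base_resp, q10, gpp_max = params
    respiration = base_resp * q10 ** (temperature / 10.0)
    gpp = gpp_max * np.clip(temperature / 25.0, 0.0, 1.0)
    return respiration - gpp

def log_likelihood(params, temperature, nee_obs, sigma=0.5):
    """Gaussian misfit between modelled and observed NEE (flat prior implied)."""
    resid = nee_model(params, temperature) - nee_obs
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(nee_obs, temperature, n_iter=20000, step=0.05):
    """Random-walk Metropolis over the three toy parameters."""
    current = np.array([1.0, 2.0, 5.0])              # initial guess
    ll_curr = log_likelihood(current, temperature, nee_obs)
    chain = []
    for _ in range(n_iter):
        proposal = current + step * rng.standard_normal(3)
        ll_prop = log_likelihood(proposal, temperature, nee_obs)
        # accept with probability min(1, exp(log-likelihood difference))
        if np.log(rng.uniform()) < ll_prop - ll_curr:
            current, ll_curr = proposal, ll_prop
        chain.append(current.copy())
    return np.array(chain)

# twin experiment: synthesise data from known "true" parameters, then recover them
temperature = rng.uniform(0.0, 25.0, size=200)
true_params = np.array([1.2, 2.1, 6.0])
nee_obs = nee_model(true_params, temperature) + 0.5 * rng.standard_normal(200)

chain = metropolis(nee_obs, temperature)
posterior = chain[len(chain) // 2:]                  # discard burn-in
print("posterior means:", posterior.mean(axis=0))
print("90% intervals:\n", np.percentile(posterior, [5, 95], axis=0).T)
```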
Abstract:
Recent work on a sample of the chipped stone from one of the late Mesolithic shell midden sites on the Inner Hebrides island of Oronsay has shown that the assemblage was created using bipolar and platform technologies, with an unexpected element of narrow blade technology evident in cores and core trimming elements, suggesting that there may have been links between this classic ‘Obanian’ assemblage and the more typical Mesolithic narrow blade technology. The findings raise questions about the relationship of settlement on the island to Mesolithic settlement rounds and to the transition to the Neolithic.
Abstract:
Recent UK changes in the number of students entering higher education, and in the nature of financial support, highlight the complexity of students’ choices about human capital investments. Today’s students have to focus not on the relatively narrow issue of how much academic effort to invest, but instead on the more complicated issue of how to invest effort in pursuit of ‘employability skills’, and how to signal such acquisitions in the context of a highly competitive graduate jobs market. We propose a framework aimed specifically at students’ investment decisions, which encompasses corner solutions for both borrowing and employment while studying.
Abstract:
There has been considerable interest in the climate impact of trends in stratospheric water vapor (SWV). However, the representation of the radiative properties of water vapor under stratospheric conditions remains poorly constrained across different radiation codes. This study examines the sensitivity of a detailed line-by-line (LBL) code, a Malkmus narrow-band model and two broadband GCM radiation codes to a uniform perturbation in SWV in the longwave spectral region. The choice of sampling rate in wave number space (Δν) in the LBL code is shown to be important for calculations of the instantaneous change in heating rate (ΔQ) and the instantaneous longwave radiative forcing (ΔFtrop). ΔQ varies by up to 50% for values of Δν spanning 5 orders of magnitude, and ΔFtrop varies by up to 10%. In the three less detailed codes, ΔQ differs by up to 45% at 100 hPa and 50% at 1 hPa compared to a LBL calculation. This causes differences of up to 70% in the equilibrium fixed dynamical heating temperature change due to the SWV perturbation. The stratosphere-adjusted radiative forcing differs by up to 96% across the less detailed codes. The results highlight an important source of uncertainty in quantifying and modeling the links between SWV trends and climate.
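A toy illustration of why the wavenumber sampling rate Δν matters: because transmission depends exponentially on absorption, coarse sampling across a narrow line misestimates band-integrated absorption. The Lorentzian line shape, the line parameters and the absorber path below are invented for illustration and are not the paper's LBL configuration.

```python
import numpy as np

def lorentzian(nu, nu0=1000.0, half_width=0.07, strength=1.0):
    """Idealised pressure-broadened absorption line (illustrative parameters)."""
    return strength * (half_width / np.pi) / ((nu - nu0) ** 2 + half_width ** 2)

def band_absorption(dnu, path=5.0):
    """Band-integrated absorptance 1 - exp(-k u), evaluated on a grid of
    spacing dnu; the exponential makes the result grid-dependent."""
    nu = np.arange(990.0, 1010.0, dnu)
    k = lorentzian(nu)
    return np.sum(1.0 - np.exp(-k * path)) * dnu

for dnu in [1.0, 0.1, 0.01, 0.001]:
    print(f"dnu = {dnu:6.3f} cm-1 -> band absorption = {band_absorption(dnu):.4f} cm-1")
```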
Abstract:
A set of random variables is exchangeable if its joint distribution function is invariant under permutation of the arguments. The concept of exchangeability is discussed, with a view towards potential application in evaluating ensemble forecasts. It is argued that the paradigm of ensembles being an independent draw from an underlying distribution function is probably too narrow; allowing ensemble members to be merely exchangeable might be a more versatile model. The question is discussed whether established methods of ensemble evaluation need alteration under this model, with reliability being given particular attention. It turns out that the standard methodology of rank histograms can still be applied. As a first application of the exchangeability concept, it is shown that the method of minimum spanning trees to evaluate the reliability of high dimensional ensembles is mathematically sound.
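A minimal sketch of the rank-histogram check that the abstract says remains valid under exchangeability. The toy data and the tie-handling simplification are assumptions; the point is only that exchangeable observation-ensemble pairs yield a flat histogram, while a mis-calibrated ensemble does not.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_histogram(observations, ensembles):
    """Count, for each case, how many ensemble members fall below the
    observation. If the observation and members are exchangeable, each of
    the n_members + 1 possible ranks is equally likely, so the histogram
    is flat up to sampling noise (ties ignored; data assumed continuous)."""
    ranks = (ensembles < observations[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=ensembles.shape[1] + 1)

# Exchangeable toy case: observation and 9 members drawn from one distribution.
obs = rng.standard_normal(5000)
ens = rng.standard_normal((5000, 9))
print(rank_histogram(obs, ens))        # ~500 expected in each of the 10 bins

# Non-exchangeable case: under-dispersive ensemble gives a U-shaped histogram.
ens_narrow = 0.5 * rng.standard_normal((5000, 9))
print(rank_histogram(obs, ens_narrow))
```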
Abstract:
In situ high-resolution aircraft measurements of cloud microphysical properties were made in coordination with ground-based remote sensing observations (radar and lidar) of a line of small cumulus clouds, as part of the Aerosol Properties, PRocesses And InfluenceS on the Earth's climate (APPRAISE) project. A narrow but extensive line (~100 km long) of shallow convective clouds over the southern UK was studied. Cloud top temperatures were observed to be higher than −8 °C, but the clouds were seen to consist of supercooled droplets and varying concentrations of ice particles. No ice particles were observed to be falling into the cloud tops from above. Current parameterisations of ice nuclei (IN) numbers predict that too few particles will be active as ice nuclei to account for the ice particle concentrations at the observed, near-cloud-top temperatures (−7.5 °C). The role of mineral dust particles, at concentrations consistent with those observed near the surface, acting as high-temperature IN is considered important in this case. It was found that very high concentrations of ice particles (up to 100 L−1) could be produced by secondary ice particle production, provided that the observed small amount of primary ice (about 0.01 L−1) was present to initiate it. This emphasises the need to understand primary ice formation in slightly supercooled clouds. It is shown using simple calculations that the Hallett-Mossop (HM) process is the likely source of the secondary ice. Model simulations of the case study were performed with the Aerosol Cloud and Precipitation Interactions Model (ACPIM). These parcel-model investigations confirmed the HM process to be a very important mechanism for producing the observed high ice concentrations. A key step in generating the high concentrations was the collision and coalescence of rain drops, which, once formed, fell rapidly through the cloud, collecting ice particles that caused them to freeze and instantly form large rimed particles. The broadening of the droplet size distribution by collision-coalescence was therefore a vital step in this process, as it was required to generate the large number of ice crystals observed in the time available. Simulations were also performed with the WRF (Weather Research and Forecasting) model. The results showed that, while HM does act to increase the mass and number concentration of ice particles in these simulations, it was not found to be critical for the formation of precipitation. However, the WRF simulations produced a cloud top that was too cold, and this, combined with the assumption of continual replenishment of ice nuclei removed by ice crystal formation, resulted in too many ice crystals forming by primary nucleation compared with the observations and the parcel modelling.
Abstract:
Idealised convection-permitting simulations are used to quantify the impact of embedded convection on the precipitation generated by moist flow over midlatitude mountain ridges. A broad range of mountain dimensions and moist stabilities are considered to encompass a spectrum of physically plausible flows. The simulations reveal that convection only enhances orographic precipitation in cap clouds that are otherwise unable to efficiently convert cloud condensate into precipitate. For tall and wide mountains (e.g. the Washington Cascades or the southern Andes), precipitate forms efficiently through vapour deposition and collection, even in the absence of embedded convection. When embedded convection develops in such clouds, it produces competing effects (enhanced condensation in updraughts and enhanced evaporation through turbulent mixing and compensating subsidence) that cancel to yield little net change in precipitation. By contrast, convection strongly enhances precipitation over short and narrow mountains (e.g. the UK Pennines or the Oregon Coastal Range) where precipitation formation is otherwise highly inefficient. Although cancellation between increased condensation and evaporation still occurs, the enhanced precipitation formation within the convective updraughts leads to a net increase in precipitation efficiency. The simulations are physically interpreted through non-dimensional diagnostics and relevant time-scales that govern advective, microphysical, and convective processes.
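A back-of-envelope version of the time-scale reasoning described above, assuming an advective time-scale of the form a/U (ridge half-width over cross-barrier wind) compared against an assumed fixed microphysical conversion time-scale. The numbers and the specific functional form are illustrative, not the paper's diagnostics.

```python
# Back-of-envelope time-scale comparison in the spirit of the diagnostics above.
# The tau_adv = a / U form and all numerical values are assumptions.

def advective_timescale(halfwidth_m: float, wind_ms: float) -> float:
    """Time an air parcel spends crossing the windward slope."""
    return halfwidth_m / wind_ms

TAU_MICRO = 1500.0   # assumed time (s) to convert cloud condensate to precipitate

for name, halfwidth_km in [("wide ridge (Cascades-like)", 40.0),
                           ("narrow ridge (Pennines-like)", 8.0)]:
    tau_adv = advective_timescale(halfwidth_km * 1e3, wind_ms=15.0)
    ratio = tau_adv / TAU_MICRO
    regime = ("precipitates efficiently even without convection" if ratio > 1.0
              else "inefficient; embedded convection can enhance precipitation")
    print(f"{name}: tau_adv = {tau_adv:.0f} s, ratio = {ratio:.2f} -> {regime}")
```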
Abstract:
This paper explores the politics around the role of agency in the UK climate change debate. Government interventions on the demand side of consumption have increasingly involved attempts to gain greater traction with the values, attitudes and beliefs of citizens in relation to climate change, and to influence consumer behaviour at an individual level. With figures showing that approximately 40% of the UK’s carbon emissions are attributable to household and transport behaviour, policy initiatives have progressively focused on the facilitation of “sustainable behaviours”. Evidence suggests, however, that mobilisation of pro-environmental attitudes in addressing the perceived “value-action gap” has so far had limited success. Research in this field suggests that there is a more significant and nuanced “gap” between context and behaviour; a relationship that perhaps better reflects why people do not necessarily react in the way that policy-makers anticipate. Tracing the development of the UK Government’s behaviour change agenda over the last decade, we posit that a core reason for the limitations of this programme is an excessively narrow focus on the individual, which has served to obscure some of the wider political and economic aspects of the debate in favour of a more simplified discussion. The second part of the paper reports findings from a series of focus groups exploring some of the wider political views that people hold around household energy habits, the purchase and use of domestic appliances, and transport behaviour, and discusses these insights in relation to the literature on the agenda’s apparent limitations. The paper concludes by considering whether the aims of the Big Society approach (recently established by the UK’s Coalition Government) hold the potential to engage more directly with some of these issues, or whether they merely constitute a “repackaging” of the individualism agenda.
Abstract:
In 'Avalanche', an object is lowered by a group of players who must all stay in contact with it throughout. Normally the task is easily accomplished; however, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory of the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game: each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity, a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The behaviours are seen to arise only when the geometric effects apply (number of players greater than the degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various other situations: conflicting strategic objectives in organizations, the Prisoners' Dilemma, and integrated bargaining. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
Abstract:
We investigate the behavior of single-cell protozoa in a narrow tubular ring. This environment forces the cells to swim under a one-dimensional periodic boundary condition. Above a critical density, the protozoa aggregate spontaneously without external stimulation. The high-density zone of swimming cells exhibits characteristic collective dynamics, including translation and boundary fluctuation. We analyzed the velocity distribution and turn rate of the swimming cells and found that regulation of the turn rate leads to stable aggregation, while acceleration of velocity triggers instability of the aggregation. These two opposing effects may help to explain the spontaneous dynamics of the collective behavior. We also propose a stochastic model for the mechanism underlying the collective behavior of swimming cells.
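A minimal sketch in the spirit of the proposed stochastic model: a run-and-tumble walk on a periodic ring in which the turn rate falls with local crowding, so dense zones can trap swimmers. All parameters, the density-sensing rule and the 1/(1 + density) form are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy run-and-tumble model on a periodic ring: the reversal (turn) rate
# drops with local crowding, so a dense zone traps swimmers and persists.
N, L = 200, 1.0                  # number of cells, ring circumference
DT, STEPS = 0.01, 3000           # time step (s) and number of steps
SPEED = 0.05                     # swimming speed (ring lengths per s)
BASE_TURN = 5.0                  # reversal rate of an isolated cell (1/s)
SENSE = 0.05                     # sensing radius for local density

pos = rng.uniform(0.0, L, N)
direction = rng.choice(np.array([-1.0, 1.0]), N)

for _ in range(STEPS):
    # pairwise distances on the ring (periodic wrap-around)
    dist = np.abs(pos[:, None] - pos[None, :])
    dist = np.minimum(dist, L - dist)
    density = (dist < SENSE).sum(axis=1) - 1     # neighbours, excluding self
    # crowded cells reverse less often and so linger inside aggregates
    turn_rate = BASE_TURN / (1.0 + density)
    flip = rng.uniform(size=N) < turn_rate * DT
    direction[flip] *= -1.0
    pos = (pos + direction * SPEED * DT) % L

# a uniform profile would give about N * 2 * SENSE / L = 20 neighbours each
print("mean local crowding:", density.mean(), "max:", density.max())
```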
Abstract:
Atmospheric Rivers (ARs), narrow plumes of enhanced moisture transport in the lower troposphere, are a key synoptic feature behind winter flooding in midlatitude regions. This article develops an algorithm which uses the spatial and temporal extent of the vertically integrated horizontal water vapor transport for the detection of persistent ARs (lasting 18 h or longer) in five atmospheric reanalysis products. Applying the algorithm to the different reanalyses in the vicinity of Great Britain during the winter half-years of 1980–2010 (31 years) demonstrates generally good agreement of AR occurrence between the products. The relationship between persistent AR occurrences and winter floods is demonstrated using winter peaks-over-threshold (POT) floods (with on average one flood peak per winter). In the nine study basins, the number of winter POT-1 floods associated with persistent ARs ranged from approximately 40 to 80%. A Poisson regression model was used to describe the relationship between the number of ARs in the winter half-years and the large-scale climate variability. A significant negative dependence was found between AR totals and the Scandinavian Pattern (SCP), with a greater frequency of ARs associated with lower SCP values.
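A minimal sketch of the kind of Poisson regression described, using the statsmodels GLM interface. The synthetic AR counts and SCP values below are stand-ins (with the reported negative dependence built in), not the study's reanalysis data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Illustrative stand-in data: 31 winters of AR counts and an SCP-like index.
# The negative coefficient is built in purely to mirror the reported sign.
scp = rng.standard_normal(31)
ar_counts = rng.poisson(np.exp(1.5 - 0.4 * scp))

X = sm.add_constant(scp)                      # intercept + SCP predictor
model = sm.GLM(ar_counts, X, family=sm.families.Poisson())
result = model.fit()
print(result.summary())
# A negative SCP coefficient implies more ARs in winters with low SCP values.
```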
Abstract:
In a number of works, Jerry Fodor has defended a reductive, causal and referential theory of cognitive content. I argue against this view, defending a quasi-Fregean notion of cognitive content and arguing also that the cognitive content of non-singular concepts is narrow rather than wide.
Synergetic effect of carbon nanopore size and surface oxidation on CO2 capture from CO2/CH4 mixtures
Abstract:
We have studied the synergetic effect of confinement (carbon nanopore size) and surface chemistry (the number of carbonyl groups) on CO2 capture from its mixtures with CH4 at typical operating conditions for industrial adsorptive separation (298 K and compressed CO2/CH4 mixtures). Although both confinement and surface oxidation affect the efficiency of CO2/CH4 adsorptive separation at thermodynamic equilibrium, we show that surface functionalization is the more important factor in designing an efficient adsorbent for CO2 capture. Systematic Monte Carlo simulations revealed that adsorption of CH4, either pure or mixed with CO2, on oxidized nanoporous carbons is only slightly increased by the presence of functional groups (surface dipoles). In contrast, adsorption of CO2 is very sensitive to the number of carbonyl groups, which can be explained by the strong electric quadrupole moment of CO2. Interestingly, the adsorbed amount of CH4 is strongly affected by the presence of co-adsorbed CO2, whereas the CO2 uptake does not depend on the molar ratio of CH4 in the bulk mixture. The optimal carbonaceous porous adsorbent for CO2 capture near ambient conditions should consist of narrow carbon nanopores with oxidized pore walls. Furthermore, the equilibrium separation factor was greatest for CO2/CH4 mixtures with a low CO2 concentration: a maximum equilibrium separation factor of CO2 over CH4 of ∼18–20 is theoretically predicted for strongly oxidized nanoporous carbons. Our findings call for a review of the standard uncharged model of carbonaceous materials used for modeling the adsorptive separation of gas mixtures containing CO2 (and other molecules with a strong electric quadrupole or dipole moment).
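For reference, the equilibrium separation factor quoted above is conventionally defined from adsorbed-phase (x) and bulk-phase (y) mole fractions; the short sketch below evaluates it for invented illustrative compositions, not the paper's simulation output.

```python
def separation_factor(x_co2, x_ch4, y_co2, y_ch4):
    """Equilibrium separation factor S = (x_CO2 / x_CH4) / (y_CO2 / y_CH4),
    with x the adsorbed-phase and y the bulk-phase mole fractions."""
    return (x_co2 / x_ch4) / (y_co2 / y_ch4)

# Illustrative numbers only: a dilute-CO2 bulk mixture whose adsorbed phase
# is strongly enriched in CO2 gives S in the reported ~18-20 range.
print(separation_factor(x_co2=0.50, x_ch4=0.50, y_co2=0.05, y_ch4=0.95))  # 19.0
```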
Abstract:
Societal concern is growing about the consequences of climate change for food systems and, in a number of regions, for food security. There is also concern that meeting the rising demand for food is leading to environmental degradation, thereby exacerbating factors in part responsible for climate change and further undermining the food systems upon which food security is based. A major emphasis of climate change/food security research over recent years has addressed the agronomic aspects of climate change, and particularly crop yield. This has provided an excellent foundation for assessments of how climate change may affect crop productivity, but the connectivity between these results and the broader issues of food security at large is relatively poorly explored; too often, discussions of food security policy appear to be based on a relatively narrow agronomic perspective. To overcome the limitations of current agronomic research outputs, there are several scientific challenges where further agronomic effort is necessary and where agronomic research results can effectively contribute to the broader issues underlying food security. First is the need to better understand how climate change will affect cropping systems, including both direct effects on the crops themselves and indirect effects resulting from changed pest and weed dynamics and altered soil and water conditions. Second is the need to assess technical and policy options for either reducing the deleterious impacts or enhancing the benefits of climate change on cropping systems while minimising further environmental degradation. Third is the need to understand how best to address the information needs of policy makers and to report and communicate agronomic research results in a manner that will assist the development of food systems adapted to climate change. There are, however, two important considerations regarding these agronomic research contributions to the food security/climate change debate. The first concerns scale. Agronomic research has traditionally been conducted at plot scale over a growing season or perhaps a few years, but many of the issues related to food security operate at larger spatial and temporal scales. Over the last decade, agronomists have begun to establish trials at landscape scale, but there are a number of methodological challenges to be overcome at such scales. The second concerns the position of crop production (which is a primary focus of agronomic research) in the broader context of food security. Production is clearly important, but food distribution and exchange also determine food availability, while access to food and food utilisation are other important components of food security. Therefore, while agronomic research alone cannot address all food security/climate change issues (and hence the balance of investment in research and development for crop production vis-à-vis other aspects of food security needs to be assessed), it will nevertheless continue to have an important role to play: it both improves understanding of the impacts of climate change on crop production and helps to develop adaptation options; and, crucially, it also improves understanding of the consequences of different adaptation options for further climate forcing. This role can be further strengthened if agronomists work alongside other scientists to develop adaptation options that are not only effective in terms of crop production but are also environmentally and economically robust at landscape and regional scales. Furthermore, such integrated approaches to adaptation research are much more likely to address the information needs of policy makers. The potential for stronger linkages between the results of agronomic research in the context of climate change and the policy environment will thus be enhanced.
Abstract:
Within generative L2 acquisition research there is a longstanding debate as to what underlies observable differences in L1/L2 knowledge/performance. On the one hand, Full Accessibility approaches maintain that target L2 syntactic representations (new functional categories and features) are acquirable (e.g., Schwartz & Sprouse, 1996). Conversely, Partial Accessibility approaches claim that L2 variability and/or optionality, even at advanced levels, obtains as a result of inevitable deficits in L2 narrow syntax and is conditioned upon a maturational failure in adulthood to acquire (some) new functional features (e.g., Beck, 1998; Hawkins & Chan, 1997; Hawkins & Hattori, 2006; Tsimpli & Dimitrakopoulou, 2007). The present study tests the predictions of these two sets of approaches with advanced English learners of L2 Brazilian Portuguese (n = 21) in the domain of inflected infinitives. These advanced L2 learners reliably differentiate syntactically between finite verbs, uninflected infinitives and inflected infinitives, which, as argued, supports only Full Accessibility approaches. Moreover, we discuss how testing the domain of inflected infinitives is especially interesting in light of recent proposals that colloquial dialects of Brazilian Portuguese no longer actively instantiate them (Lightfoot, 1991; Pires, 2002, 2006; Pires & Rothman, 2009; Rothman, 2007).