886 results for bottom-simulating reflection
Abstract:
This reflection argues that, despite various good reasons for approaching the notion of the ‘universal’ with caution, cultural theorists should give up their resistance to the universal. The prominence of formats in today’s television suggests that the time is ripe to do so. Intentionally or not, accounts of difference often also implicitly reveal sameness; the more we probe heterogeneity, the more likely we are to encounter something that remains consistent and similar. Thus, it is time to collaborate with scholars from the numerous disciplines for which the universal has long had validity and pertinence.
Abstract:
Lake Bysjön, southern Sweden, has experienced major lake-level lowerings during the Holocene, including one interval about 9000 ¹⁴C yr B.P. when the water level dropped ca. 7 m and the lake became closed. These changes were not solely due to known changes in radiation budgets or seasonal temperatures. Simulations with a lake-catchment model indicate that, given the actual changes in radiation and temperatures, all the observed lake-level lowerings (including the major lowering at 9000 ¹⁴C yr B.P.) could have occurred in response to precipitation changes of <75 mm/yr when winter temperatures were warmer than today. In these circumstances, the reduction of runoff into the lake caused by increased evapotranspiration during the late winter and spring, combined with relatively small changes in precipitation, was sufficient for the lake to become closed. When winter temperatures were colder than today, the reduction in winter runoff related to reduced precipitation was only very slight and insufficient to lower the lake below the threshold. In such circumstances, changes in outflow were sufficient to compensate for the combined changes in precipitation and runoff, and the lake level therefore remained unchanged.
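As a rough illustration of the water-balance reasoning in this abstract (not the authors' lake-catchment model), the following sketch contrasts a hypothetical modern balance with a warm-winter scenario in which precipitation falls by less than 75 mm/yr and late-winter/spring evapotranspiration rises; all numerical values are assumed for illustration only.

```python
# Minimal annual lake water-balance sketch (not the authors' lake-catchment model).
# All numbers are illustrative assumptions, chosen only to show how a modest drop in
# precipitation plus warmer winters (more evapotranspiration, less runoff) can shut off
# a lake's outflow and lower its level.

def net_balance_mm(precip, runoff, evap, outflow):
    """Net change in lake level (mm/yr): inputs minus losses."""
    return precip + runoff - evap - outflow

# Hypothetical modern case: inputs exceed evaporation and the surplus leaves as outflow.
modern = net_balance_mm(precip=650, runoff=300, evap=550, outflow=400)

# Hypothetical warm-winter case: precipitation reduced by <75 mm/yr, runoff strongly cut
# by higher evapotranspiration; the outflow ceases (closed lake) and the remaining
# deficit is drawn from storage, lowering the lake level.
warm_winter = net_balance_mm(precip=650 - 75, runoff=100, evap=720, outflow=0)

print(f"modern net balance:      {modern:+d} mm/yr")
print(f"warm-winter net balance: {warm_winter:+d} mm/yr (level falls)")
```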
Abstract:
In this article the author reflects on the experiences of three Massive Open Online Courses (MOOCs): Edfuture (CHFE), Learning Design for 21st Century Curriculum (OLDS-MOOC), and Open Education (H817). Discussion draws on the perceived differences between OERs and MOOCs and questions the definitions of 'success', 'engagement', 'completion', and 'drop out' in a MOOC. Some lessons learnt as a participant are also discussed.
Abstract:
A lattice Boltzmann method for simulating the viscous flow in large distensible blood vessels is presented by introducing a boundary condition for elastic and moving boundaries. Mass conservation for the boundary condition is tested in detail. The viscous flow in elastic vessels is simulated with a pressure-radius relationship similar to that of pulmonary blood vessels. The numerical results for steady flow agree with the analytical prediction to very high accuracy, and the simulation results for pulsatile flow are comparable with those of aortic flows observed experimentally. The model is expected to find many applications for studying blood flows in large distensible arteries, especially in those suffering from atherosclerosis, stenosis, aneurysm, etc.
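The elastic, moving-wall coupling described here can be illustrated with a minimal tube-law sketch. The linear pressure-radius relation below is an assumption chosen for simplicity (the paper uses a relation similar to that of pulmonary vessels), and the pressure profile is imposed rather than computed by a lattice Boltzmann solver.

```python
import numpy as np

# Sketch of the elastic-wall coupling used conceptually in such simulations:
# a tube-law pressure-radius relation, here a simple linear compliance model.

def radius_from_pressure(p, p0=0.0, r0=1.0, stiffness=10.0):
    """Invert a linear tube law p - p0 = stiffness * (r/r0 - 1) for the local radius."""
    return r0 * (1.0 + (p - p0) / stiffness)

# In an LBM solver the macroscopic pressure at each axial node would come from the
# lattice densities (p ~ c_s^2 * rho); here we simply impose an illustrative profile.
axial_pressure = np.linspace(0.5, -0.5, 11)      # pressure drop along the vessel
wall_radius = radius_from_pressure(axial_pressure)

for x, (p, r) in enumerate(zip(axial_pressure, wall_radius)):
    print(f"node {x:2d}: p = {p:+.2f}, wall radius = {r:.3f}")
```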
Abstract:
Start-up shear rheology is a standard experiment used to characterize polymer flow and to test various models of polymer dynamics. A rich phenomenology has been developed for the behavior of entangled monodisperse linear polymers in such tests, documenting shear stress overshoots as a function of shear rate and molecular weight. The tube theory does a reasonable qualitative job of describing these phenomena, although it involves several drastic approximations and the agreement can be fortuitous. Recently, Lu and coworkers published several papers [e.g. Lu et al., ACS Macro Lett. 2014, 3, 569-573] reporting results from molecular dynamics simulations of linear entangled polymers which contradict both theory and experiment. Based on these observations, they drew far-reaching conclusions about the tube theory, which seem premature. In this letter, we repeat the simulations of Lu et al. and systematically show that neither their simulation results nor their comparison with theory is confirmed.
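A minimal post-processing sketch of how a start-up shear stress overshoot is typically characterized (overshoot strain, peak stress, peak-to-steady-state ratio) is shown below; the stress curve is synthetic and purely illustrative, standing in for output from an MD or tube-model calculation.

```python
import numpy as np

# Post-processing sketch: locating the shear-stress overshoot in start-up shear data.
# The stress curve below is synthetic (an assumption), standing in for sigma_xy(strain).

strain = np.linspace(0.0, 20.0, 2001)
sigma_ss = 1.0                                   # arbitrary steady-state stress units
sigma = sigma_ss * (1.0 - np.exp(-strain / 0.5)) \
        + 0.4 * sigma_ss * (strain / 2.0) * np.exp(1.0 - strain / 2.0)

peak_idx = np.argmax(sigma)
steady = sigma[-200:].mean()                     # average over the late-strain plateau

print(f"overshoot strain:              {strain[peak_idx]:.2f}")
print(f"peak stress:                   {sigma[peak_idx]:.3f}")
print(f"steady-state stress:           {steady:.3f}")
print(f"overshoot ratio (peak/steady): {sigma[peak_idx] / steady:.2f}")
```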
Abstract:
The concentrations of sulfate, black carbon (BC) and other aerosols in the Arctic are characterized by high values in late winter and spring (so-called Arctic Haze) and low values in summer. Models have long been struggling to capture this seasonality and especially the high concentrations associated with Arctic Haze. In this study, we evaluate sulfate and BC concentrations from eleven different models driven with the same emission inventory against a comprehensive pan-Arctic measurement data set over a time period of 2 years (2008–2009). The set of models consisted of one Lagrangian particle dispersion model, four chemistry transport models (CTMs), one atmospheric chemistry-weather forecast model and five chemistry climate models (CCMs), of which two were nudged to meteorological analyses and three were running freely. The measurement data set consisted of surface measurements of equivalent BC (eBC) from five stations (Alert, Barrow, Pallas, Tiksi and Zeppelin), elemental carbon (EC) from Station Nord and Alert and aircraft measurements of refractory BC (rBC) from six different campaigns. We find that the models generally captured the measured eBC or rBC and sulfate concentrations quite well, compared to previous comparisons. However, the aerosol seasonality at the surface is still too weak in most models. Concentrations of eBC and sulfate averaged over three surface sites are underestimated in winter/spring in all but one model (model means for January–March underestimated by 59 and 37 % for BC and sulfate, respectively), whereas concentrations in summer are overestimated in the model mean (by 88 and 44 % for July–September), but with overestimates as well as underestimates present in individual models. The most pronounced eBC underestimates, not included in the above multi-site average, are found for the station Tiksi in Siberia where the measured annual mean eBC concentration is 3 times higher than the average annual mean for all other stations. This suggests an underestimate of BC sources in Russia in the emission inventory used. Based on the campaign data, biomass burning was identified as another cause of the modeling problems. For sulfate, very large differences were found in the model ensemble, with an apparent anti-correlation between modeled surface concentrations and total atmospheric columns. There is a strong correlation between observed sulfate and eBC concentrations with consistent sulfate/eBC slopes found for all Arctic stations, indicating that the sources contributing to sulfate and BC are similar throughout the Arctic and that the aerosols are internally mixed and undergo similar removal. However, only three models reproduced this finding, whereas sulfate and BC are weakly correlated in the other models. Overall, no class of models (e.g., CTMs, CCMs) performed better than the others and differences are independent of model resolution.
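A sketch of the kind of seasonal bias statistic reported above (model-minus-observation percent bias for January-March and July-September) is given below, computed on synthetic monthly surface concentrations; the numbers are illustrative assumptions, not the study's data.

```python
import numpy as np

# Sketch of a seasonal percent-bias statistic, computed on synthetic monthly surface
# concentrations (illustrative values only): a strong observed spring haze peak versus
# a model with too flat a seasonal cycle.

rng = np.random.default_rng(0)
months = np.arange(1, 13)
obs = np.where(np.isin(months, [1, 2, 3]), 80.0, 20.0)     # ng/m3, spring haze maximum
model = np.full(12, 35.0) + rng.normal(0.0, 2.0, size=12)  # weak seasonality

def percent_bias(mod, ob, mask):
    return 100.0 * (mod[mask].mean() - ob[mask].mean()) / ob[mask].mean()

winter_spring = np.isin(months, [1, 2, 3])
summer = np.isin(months, [7, 8, 9])
print(f"Jan-Mar bias: {percent_bias(model, obs, winter_spring):+.0f} %")
print(f"Jul-Sep bias: {percent_bias(model, obs, summer):+.0f} %")
```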
Abstract:
In this work, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) is expanded to include scattering and emission effects of clouds and precipitation at passive microwave frequencies. This represents an advancement over the official version of COSP (version 1.4.0), in which only clear-sky brightness temperatures are simulated. To highlight the potential utility of this new microwave simulator, COSP results generated using version 3 of the climate model EC-Earth's atmosphere as input are compared with Microwave Humidity Sounder (MHS) channel (190.311 GHz) observations. Specifically, simulated seasonal brightness temperatures (TB) are contrasted with MHS observations for the period December 2005 to November 2006 to identify possible biases in EC-Earth's cloud and atmosphere fields. EC-Earth's atmosphere closely reproduces the microwave signature of many of the major large-scale and regional-scale features of the atmosphere and surface. Moreover, greater than 60 % of the simulated TB are within 3 K of the NOAA-18 observations. However, COSP is unable to simulate sufficiently low TB in areas of frequent deep convection. Within the Tropics, the model's atmosphere can yield an underestimation of TB by nearly 30 K for cloudy areas in the ITCZ. Possible reasons for this discrepancy include both an incorrect amount of cloud ice water in the model simulations and incorrect ice particle scattering assumptions used in the COSP microwave forward model. These multiple sources of error highlight the non-unique nature of the simulated satellite measurements, a problem exacerbated by the fact that EC-Earth lacks the detailed microphysical parameters necessary for accurate forward model calculations. Such issues limit the robustness of our evaluation and suggest a general note of caution when making COSP-satellite observation evaluations.
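The simple agreement metric quoted above (the fraction of simulated brightness temperatures within 3 K of the observations) can be sketched as follows; the TB fields are synthetic placeholders, not MHS or COSP output.

```python
import numpy as np

# Sketch of the agreement metric quoted in the abstract, using synthetic TB fields.

rng = np.random.default_rng(1)
tb_obs = rng.normal(260.0, 15.0, size=10_000)              # synthetic observed TB, in K
tb_sim = tb_obs + rng.normal(0.0, 2.5, size=tb_obs.size)   # simulator with modest noise
tb_sim[:500] -= 25.0                                        # a subset with a large (~25 K)
                                                            # discrepancy, standing in for
                                                            # problem regions such as deep
                                                            # convection

within_3k = np.abs(tb_sim - tb_obs) <= 3.0
print(f"fraction within 3 K:         {100.0 * within_3k.mean():.1f} %")
print(f"largest absolute difference: {np.max(np.abs(tb_sim - tb_obs)):.1f} K")
```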
Abstract:
This paper presents an integrative and spatially explicit modeling approach for analyzing human and environmental exposure from pesticide application by smallholders in the potato-producing Andean region of Colombia. The modeling approach fulfills the following criteria: (i) it includes environmental and human compartments; (ii) it contains a behavioral decision-making model for estimating the effect of policies on pesticide flows to humans and the environment; (iii) it is spatially explicit; and (iv) it is modular and easily expandable to include additional modules, crops or technologies. The model was calibrated and validated for the Vereda La Hoya and used to explore the effect of different policy measures in the region. The model has moderate data requirements and can be adapted relatively easily to other regions in developing countries with similar conditions.
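The modular structure described in criterion (iv) might look roughly like the sketch below, in which compartment modules share a common interface and a behavioral decision module sets the application rate in response to a policy lever; all class names and response coefficients are hypothetical, not the authors' code.

```python
# Sketch of a modular compartment design: each module exposes the same step() interface,
# so new crops or technologies can be added as extra modules. All names and coefficients
# are illustrative assumptions.

class Module:
    def step(self, pesticide_applied_kg):
        raise NotImplementedError

class EnvironmentalCompartment(Module):
    def step(self, pesticide_applied_kg):
        return {"soil_load_kg": 0.6 * pesticide_applied_kg}     # assumed partitioning

class HumanExposure(Module):
    def step(self, pesticide_applied_kg):
        return {"operator_dose_mg": 2.0 * pesticide_applied_kg} # assumed 2 mg per kg applied

class FarmerDecision:
    """Behavioural module: application rate responds to a policy lever (e.g. a tax)."""
    def choose_rate(self, base_rate_kg, tax_fraction):
        return base_rate_kg * (1.0 - 0.5 * tax_fraction)        # assumed response

modules = [EnvironmentalCompartment(), HumanExposure()]
rate = FarmerDecision().choose_rate(base_rate_kg=2.0, tax_fraction=0.2)
for m in modules:
    print(type(m).__name__, m.step(rate))
```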
Abstract:
Rising greenhouse gas emissions (GHGEs) have implications for health, and up to 30 % of emissions globally are thought to arise from agriculture. Synergies exist between diets low in GHGEs and health; however, some foods have the opposite relationship, such as sugar production being a relatively low source of GHGEs. In order to address this, and to further characterise a healthy sustainable diet, we model the effect on UK non-communicable disease mortality and GHGEs of internalising the social cost of carbon into the price of food alongside a 20 % tax on sugar-sweetened beverages (SSBs). Developing previously published work, we simulate four tax scenarios: (A) a GHGEs tax of £2.86/tonne of CO2 equivalents (tCO2e)/100 g product on all products with emissions greater than the mean across all food groups (0.36 kgCO2e/100 g); (B) scenario A but with subsidies on foods with emissions lower than 0.36 kgCO2e/100 g such that the effect is revenue neutral; (C) scenario A but with a 20 % sales tax on SSBs; (D) scenario B but with a 20 % sales tax on SSBs. An almost ideal demand system is used to estimate price elasticities, and a comparative risk assessment model is used to estimate changes to non-communicable disease mortality. We estimate that scenario A would lead to 300 deaths delayed or averted, 18,900 ktCO2e fewer GHGEs, and £3.0 billion tax revenue; scenario B, 90 deaths delayed or averted and 17,100 ktCO2e fewer GHGEs; scenario C, 1,200 deaths delayed or averted, 18,500 ktCO2e fewer GHGEs, and £3.4 billion revenue; and scenario D, 2,000 deaths delayed or averted and 16,500 ktCO2e fewer GHGEs. Deaths averted are mainly due to increased fibre and reduced fat consumption; an SSB tax reduces SSB and sugar consumption. Incorporating the social cost of carbon into the price of food has the potential to improve health, reduce GHGEs, and raise revenue. The simple addition of a tax on SSBs can mitigate negative health consequences arising from sugar being low in GHGEs. Further conflicts remain, including increased consumption of unhealthy foods such as cakes and nutrients such as salt.
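A back-of-envelope sketch of the scenario A tax rule, under one plausible reading of the £2.86/tCO2e per 100 g formulation, is given below; the product price, emissions intensity and own-price elasticity are assumed for illustration and are not figures from the study.

```python
# Back-of-envelope sketch of the scenario A tax mechanism. The tax rate (GBP 2.86 per
# tCO2e) and the 0.36 kgCO2e/100 g threshold come from the abstract; the product price,
# emissions intensity and own-price elasticity below are illustrative assumptions.

SCC_PER_TONNE = 2.86          # GBP per tonne CO2e (from the abstract)
THRESHOLD = 0.36              # kgCO2e per 100 g; only higher-emission foods are taxed

def taxed_price(base_price, emissions_per_100g):
    """Price per 100 g after the emissions tax, under one reading of the scenario A rule."""
    if emissions_per_100g <= THRESHOLD:
        return base_price
    tax = SCC_PER_TONNE * emissions_per_100g / 1000.0   # convert kg CO2e to tonnes
    return base_price + tax

base_price = 0.80             # GBP per 100 g (assumed)
emissions = 6.8               # kgCO2e per 100 g (assumed, a high-emission product)
elasticity = -0.7             # own-price elasticity of demand (assumed)

new_price = taxed_price(base_price, emissions)
pct_price_change = 100.0 * (new_price - base_price) / base_price
pct_demand_change = elasticity * pct_price_change

print(f"price per 100 g: {base_price:.3f} -> {new_price:.3f} GBP")
print(f"price change: {pct_price_change:+.1f} %, implied demand change: {pct_demand_change:+.1f} %")
```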
Abstract:
We present a modelling study of processes controlling the summer melt of the Arctic sea ice cover. We perform a sensitivity study and focus our interest on the thermodynamics at the ice–atmosphere and ice–ocean interfaces. We use the Los Alamos community sea ice model CICE, and additionally implement and test three new parametrization schemes: (i) a prognostic mixed layer; (ii) a three equation boundary condition for the salt and heat flux at the ice–ocean interface; and (iii) a new lateral melt parametrization. Recent additions to the CICE model are also tested, including explicit melt ponds, a form drag parametrization and a halodynamic brine drainage scheme. The various sea ice parametrizations tested in this sensitivity study introduce a wide spread in the simulated sea ice characteristics. For each simulation, the total melt is decomposed into its surface, bottom and lateral melt components to assess the processes driving melt and how this varies regionally and temporally. Because this study quantifies the relative importance of several processes in driving the summer melt of sea ice, this work can serve as a guide for future research priorities.
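The melt decomposition mentioned above can be sketched as a simple bookkeeping step, splitting total melt into surface, bottom and lateral contributions and reporting each term's share; the values below are placeholders rather than CICE output.

```python
# Sketch of the melt decomposition used to interpret the sensitivity runs. The numbers
# are illustrative placeholders, not CICE output.

melt_components_m = {          # cumulative summer melt, metres of ice equivalent (assumed)
    "surface": 0.55,
    "bottom": 0.80,
    "lateral": 0.10,
}

total = sum(melt_components_m.values())
for name, value in melt_components_m.items():
    print(f"{name:8s}: {value:.2f} m ({100.0 * value / total:.0f} % of total melt)")
print(f"total   : {total:.2f} m")
```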
Abstract:
Observations and climate models suggest significant decadal variability within the North Atlantic subpolar gyre (NA SPG), though observations are sparse and models disagree on the details of this variability. Therefore, it is important to understand 1) the mechanisms of simulated decadal variability, 2) which parts of the simulated variability are more faithful representations of reality, and 3) the implications for climate predictions. Here, we investigate decadal variability in the NA SPG in the state-of-the-art, high-resolution (0.25° ocean resolution) climate model HadGEM3. We find a decadal mode with a period of 17 years that explains 30% of the annual variance in related indices. The mode arises due to the advection of heat content anomalies and shows asymmetries in the timescale of phase reversal between positive and negative phases. A negative feedback from temperature-driven density anomalies in the Labrador Sea (LS) allows for the phase reversal. The North Atlantic Oscillation (NAO), which exhibits the same periodicity, amplifies the mode. The atmosphere-ocean coupling is stronger during positive rather than negative NAO states, explaining the asymmetry. Within the NA SPG, there is potential predictability arising partly from this mode for up to 5 years. There are important similarities between observed and simulated variability, such as the apparent role of the propagation of heat content anomalies. However, observations suggest interannual LS density anomalies are salinity-driven. Salinity control of density would change the temperature feedback to the south, possibly limiting real-world predictive skill in the southern NA SPG with this model. Finally, to understand the diversity of behaviours, we analyse 42 present-generation climate models. Temperature and salinity biases are found to systematically influence the driver of density variability in the LS. Resolution is a good predictor of these biases. The dependence of variability on the background state has important implications for decadal predictions.
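A sketch of how a dominant decadal period and its explained variance might be estimated from an annual-mean index is given below; the time series is synthetic (a 17-year cycle plus noise) and stands in for model output rather than HadGEM3 data.

```python
import numpy as np

# Sketch: estimate the dominant period and its share of the variance from an annual index.
# The series is synthetic, standing in for e.g. NA SPG upper-ocean heat content.

rng = np.random.default_rng(2)
years = np.arange(340)                               # 340 years of annual means
index = np.sin(2.0 * np.pi * years / 17.0) + rng.normal(0.0, 1.0, size=years.size)

freqs = np.fft.rfftfreq(years.size, d=1.0)           # cycles per year
power = np.abs(np.fft.rfft(index - index.mean())) ** 2

peak = np.argmax(power[1:]) + 1                      # skip the zero-frequency bin
print(f"dominant period: {1.0 / freqs[peak]:.1f} years")
print(f"variance explained by that bin: {100.0 * power[peak] / power[1:].sum():.0f} %")
```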
Abstract:
What explains cross-national variation in wage inequality? Research in comparative political economy stresses the importance of the welfare state and wage coordination in reducing not only disposable income inequality but also gross earnings inequality. However, the cross-national variation in gross earnings inequality between median- and low-income workers is at odds with this conventional wisdom: the German coordinated market economy is now more unequal on this measure than the UK, a liberal market economy. To solve this puzzle, I argue that non-inclusive coordination benefits median- but not bottom-income workers and is, as a result, associated with higher rather than lower wage inequality. I find support for this argument using a large-N quantitative analysis of wage inequality in a panel of Western European countries. Results are robust to the inclusion of numerous controls and country fixed effects, and also hold for a sample of OECD countries. Taken together, these findings force us to reconsider the relationship between coordination and wage inequality at the bottom of the income distribution.
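The within (country fixed-effects) estimator underlying such a panel analysis can be sketched as follows: demean each variable by country, then regress on the demeaned data; the panel below is synthetic and only illustrates the mechanics, not the study's data.

```python
import numpy as np

# Sketch of the within (country fixed-effects) estimator: demean each variable by country,
# then run OLS on the demeaned data. The panel is synthetic (assumed values).

rng = np.random.default_rng(3)
n_countries, n_years = 15, 20
country = np.repeat(np.arange(n_countries), n_years)

coordination = rng.normal(0.0, 1.0, size=country.size)            # explanatory variable
country_effect = rng.normal(0.0, 2.0, size=n_countries)[country]  # unobserved heterogeneity
inequality = 0.5 * coordination + country_effect + rng.normal(0.0, 0.5, size=country.size)

def demean_by_group(x, groups):
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, x)
    counts = np.bincount(groups)
    return x - (sums / counts)[groups]

y = demean_by_group(inequality, country)
x = demean_by_group(coordination, country)
beta = (x @ y) / (x @ x)                                           # within-estimator slope
print(f"estimated effect of coordination on inequality: {beta:.3f} (true value 0.5)")
```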
Abstract:
This paper analyzes the changes that ways of organizing memory have undergone since ancient times, leading to today's artificial memory systems. It aims to draw a parallel between the art of memory (which associates images with specific texts) and hypertext (which also uses associations, but in a non-linear way). Our methodology consisted of a qualitative approach involving the collection of texts about the art of memory and hypertext; this enabled us to recover the historical-cultural changes which have modified the form and use of the art of memory and allowed the creation of hypertext. The paper also analyzes the similarities among artificial memory systems created by different cultures in order to prevent the loss of knowledge produced by society.