Abstract:
Mixing layer height (MLH) is one of the key parameters describing lower-tropospheric dynamics, and capturing its diurnal variability is crucial, especially for interpreting surface observations. In this paper we introduce a method for identifying MLH below the minimum range of a scanning Doppler lidar operated in vertically pointing mode. The method we propose is based on velocity variance in low-elevation-angle conical scanning and is applied to measurements in two very different coastal environments: Limassol, Cyprus, during summer and Loviisa, Finland, during winter. At both locations, the new method agrees well with MLH derived from turbulent kinetic energy dissipation rate profiles obtained from vertically pointing measurements. The low-level scanning routine frequently indicated a non-zero MLH less than 100 m above the surface. Such low MLHs were more common in wintertime Loviisa on the Baltic Sea coast than in summertime Mediterranean Limassol.
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. 
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
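The subsampling exercise described above can be sketched in a few lines. The following is a toy illustration only: the synthetic hourly series, the sampling intervals, and the nearest-rank percentile rule are assumptions for demonstration, not the paper's data or exact method.

```python
import math
import random
import statistics

# Toy stand-in for one year of hourly data (seasonal cycle + noise;
# synthetic, not the rivers studied in the paper).
random.seed(42)
hourly = [12 + 6 * math.sin(2 * math.pi * h / 8760) + random.gauss(0, 1)
          for h in range(8760)]

def p98(values):
    """Nearest-rank 98th percentile (high percentiles are used for
    UK water-temperature standards)."""
    s = sorted(values)
    return s[min(len(s) - 1, math.ceil(0.98 * len(s)) - 1)]

true_value = p98(hourly)

def estimates(step_hours):
    # One estimate per possible sampling phase: take every
    # `step_hours`-th observation starting from each offset.
    return [p98(hourly[offset::step_hours]) for offset in range(step_hours)]

monthly = estimates(730)   # ~12 samples per year
weekly = estimates(168)    # ~52 samples per year

# The spread of estimates across phases mimics the classification
# uncertainty: the wider the spread, the more classes a single water
# body could be assigned to purely by chance of sampling times.
print(true_value, min(monthly), max(monthly), min(weekly), max(weekly))
```

Note that the 98th percentile of only 12 monthly samples is effectively their maximum, which is why low-frequency sampling handles high-percentile standards so poorly.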
Abstract:
Real estate securities have a number of distinct characteristics that differentiate them from stocks generally. Key among them is that the firms are underpinned by both real and investment assets. The connection between the underlying macro-economy and listed real estate firms is therefore clearly demonstrated and of heightened importance. To consider the linkages with the underlying macro-economic fundamentals, we extract the ‘low-frequency’ volatility component from aggregate volatility shocks in 11 international markets over the 1990-2014 period. This is achieved using Engle and Rangel’s (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effect pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong, positive association with most of the macroeconomic risk proxies examined, including interest rates, inflation, GDP and foreign exchange rates.
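In the Spline-GARCH decomposition (sketched here from the commonly cited form of Engle and Rangel's model; the paper's exact notation may differ), the conditional variance of returns is split into a high-frequency GARCH component g_t and a smooth low-frequency spline component τ_t:

```latex
\epsilon_t = \sqrt{\tau_t \, g_t}\; z_t, \qquad z_t \sim \mathcal{N}(0,1), \\
g_t = (1 - \alpha - \beta) + \alpha \, \frac{\epsilon_{t-1}^{2}}{\tau_{t-1}} + \beta \, g_{t-1}, \\
\tau_t = c \, \exp\!\left( w_0 t + \sum_{k=1}^{K} w_k \big( (t - t_{k-1})_{+} \big)^{2} \right)
```

It is the smooth component τ_t that is extracted as the ‘low-frequency’ volatility and related to the macroeconomic variables.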
Abstract:
Sponge cakes have traditionally been manufactured using multistage mixing methods to enhance potential foam formation by the eggs. Today, use of all-in (single-stage) mixing methods is superseding multistage methods for large-scale batter preparation to reduce costs and production time. In this study, multistage and all-in mixing procedures and three final high-speed mixing times (3, 5, and 15 min) for sponge cake production were tested to optimize a mixing method for pilot-scale research. Mixing for 3 min produced batters with higher relative density values than did longer mixing times. These batters generated well-aerated cakes with high volume and low hardness. In contrast, after 5 and 15 min of high-speed mixing, batters with lower relative density and higher viscosity values were produced. Although higher bubble incorporation and retention were observed, longer mixing times produced better developed gluten networks, which stiffened the batters and inhibited bubble expansion during mixing. As a result, these batters did not expand properly and produced cakes with low volume, dense crumb, and high hardness values. Results for all-in mixing were similar to those for the multistage mixing procedure in terms of the physical properties of batters and cakes (i.e., relative density, elastic moduli, volume, total cell area, hardness, etc.). These results suggest the all-in mixing procedure with a final high-speed mixing time of 3 min is an appropriate mixing method for pilot-scale sponge cake production. The advantages of this method are reduced energy costs and production time.
Abstract:
Animal studies find that prenatal stress is associated with increased physiological and emotional reactivity later in life, mediated via fetal programming of the HPA axis through decreased glucocorticoid receptor (GR) gene expression. Post-natal behaviours, notably licking and grooming in rats, cause decreased behavioural indices of fear and reduced HPA axis reactivity mediated via increased GR gene expression. Post-natal maternal behaviours may therefore be expected to modify prenatal effects, but this has not previously been examined in humans. We examined whether, according to self-report, maternal stroking over the first weeks of life modified associations between prenatal depression and physiological and behavioral outcomes in infancy, hence mimicking effects of rodent licking and grooming. From a general population sample of 1233 first-time mothers recruited at 20 weeks gestation, we drew a stratified random sample of 316 for assessment at 32 weeks based on reported inter-partner psychological abuse, a risk to child development. Of these, 271 provided data at 5, 9 and 29 weeks post-delivery. Mothers reported how often they stroked their babies at 5 and 9 weeks. At 29 weeks, vagal withdrawal to a stressor (a measure of physiological adaptability) and maternally reported negative emotionality were assessed. There was a significant interaction between prenatal depression and maternal stroking in the prediction of vagal reactivity to a stressor (p = .01), and maternal reports of infant anger proneness (p = .007) and fear (p = .043). Increasing maternal depression was associated with decreasing physiological adaptability, and with increasing negative emotionality, only in the presence of low maternal stroking.
These initial findings in humans indicate that maternal stroking in infancy, as reported by mothers, has effects strongly resembling the effects of observed maternal behaviours in animals, pointing to future studies of the epigenetic, physiological and behavioral effects of maternal stroking.
Abstract:
While eye movements have been used widely to investigate how skilled adult readers process written language, relatively little research has used this methodology with children. This is unfortunate because, as we discuss here, eye-movement studies have significant potential to inform our understanding of children’s reading development. We consider some of the empirical and theoretical issues that arise when using this methodology with children, illustrating our points with data from an experiment examining word frequency effects in 8-year-old children’s sentence reading. Children showed significantly longer gaze durations for low-frequency than for high-frequency words, demonstrating that linguistic characteristics of text drive children’s eye movements as they read. We discuss these findings within the broader context of how eye-movement studies can inform our understanding of children’s reading, and can assist with the development of appropriately targeted interventions to support children as they learn to read.
Abstract:
We have used a novel knockin mouse to investigate the effect of disruption of phosphotyrosine binding of the N-terminal SH2 domain of Syk on platelet activation by GPVI, CLEC-2, and integrin αIIbβ3. The Syk(R41Afl/fl) mouse was crossed to a PF4-Cre(+) mouse to induce expression of the Syk mutant in the megakaryocyte/platelet lineage. Syk(R41Afl/fl;PF4-Cre) mice are born at approximately 50% of the expected frequency and have a similar phenotype to Syk(fl/fl;PF4-Cre) mice, including blood-lymphatic mixing and chyloascites. Anastomosis of the venous and lymphatic vasculatures can be seen in the mesenteric circulation accounting for rapid and continuous mixing of the 2 vasculatures. Platelet activation by CLEC-2 and GPVI is abolished in Syk(R41Afl/fl;PF4-Cre) platelets. Syk phosphorylation on Tyr519/20 is blocked in CLEC-2-stimulated platelets, suggesting a model in which binding of Syk via its N-terminal SH2 domain regulates autophosphorylation. In contrast, outside-in signaling by integrin αIIbβ3 is not altered, but it is inhibited in the presence of inhibitors of Src and Syk tyrosine kinases. These results demonstrate that αIIbβ3 regulates Syk through an ITAM-independent pathway in mice and provide novel insight into the course of events underlying Syk activation and hemITAM phosphorylation by CLEC-2.
Abstract:
In this article we assess the abilities of a new electromagnetic (EM) system, the CMD Mini-Explorer, for prospecting of archaeological features in Ireland and the UK. The Mini-Explorer is an EM probe which is primarily aimed at the environmental/geological prospecting market for the detection of pipes and geology. It has long been evident from the use of other EM devices that such an instrument might be suitable for shallow soil studies and applicable for archaeological prospecting. Of particular interest for the archaeological surveyor is the fact that the Mini-Explorer simultaneously obtains both quadrature (‘conductivity’) and in-phase (relative to ‘magnetic susceptibility’) data from three depth levels. As the maximum depth range is probably about 1.5 m, a comprehensive analysis of the subsoil within that range is possible. As with all EM devices the measurements require no contact with the ground, thereby negating the problem of high contact resistance that often besets earth resistance data during dry spells. The use of the CMD Mini-Explorer at a number of sites has demonstrated that it has the potential to detect a range of archaeological features and produces high-quality data that are comparable in quality to those obtained from standard earth resistance and magnetometer techniques. In theory the ability to measure two phenomena at three depths suggests that this type of instrument could reduce the number of poor outcomes that are the result of single measurement surveys. The high success rate reported here in the identification of buried archaeology using a multi-depth device that responds to the two most commonly mapped geophysical phenomena has implications for evaluation style surveys. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
This paper shows that radiometer channel radiances for cloudy atmospheric conditions can be simulated with an optimised frequency grid derived under clear-sky conditions. A new clear-sky optimised grid is derived for AVHRR channel 5 (12 μm, 833 cm⁻¹). For HIRS channel 11 (7.33 μm, 1364 cm⁻¹) and AVHRR channel 5, radiative transfer simulations using an optimised frequency grid are compared with simulations using a reference grid, where the optimised grid has roughly 100–1000 times fewer frequencies than the full grid. The root mean square error between the optimised and the reference simulation is found to be less than 0.3 K for both comparisons, with the magnitude of the bias less than 0.03 K. The simulations have been carried out with the radiative transfer model Atmospheric Radiative Transfer Simulator (ARTS), version 2, using a backward Monte Carlo module for the treatment of clouds. With this module, the optimised simulations are more than 10 times faster than the reference simulations. Although the number of photons is the same, the smaller number of frequencies reduces the overhead of preparing the optical properties for each frequency. With deterministic scattering solvers, the relative decrease in runtime would be even larger. The results allow for new radiative transfer applications, such as the development of new retrievals, because it becomes much quicker to carry out a large number of simulations. The conclusions are applicable to any down-looking infrared radiometer.
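The idea of replacing a fine frequency grid by a small weighted set of representative frequencies can be illustrated with a toy calculation. Everything below is an assumption for illustration: smooth synthetic "spectra" stand in for monochromatic radiances of different atmospheric profiles, and a plain least-squares fit stands in for the actual grid-optimisation procedure used with ARTS.

```python
import numpy as np

rng = np.random.default_rng(0)
n_freq, n_train, n_test = 1000, 40, 20

def make_spectra(n):
    # Smooth synthetic "spectra": random low-order Fourier series
    # (toy stand-ins for monochromatic radiances across profiles).
    nu = np.linspace(0.0, 1.0, n_freq)
    coeffs = rng.normal(size=(n, 6))
    basis = np.stack([np.cos(np.pi * k * nu) for k in range(6)])
    return coeffs @ basis

train, test = make_spectra(n_train), make_spectra(n_test)

# "Reference" channel radiance: average over the full fine grid.
y_train, y_test = train.mean(axis=1), test.mean(axis=1)

# Optimised grid: a handful of representative frequencies, with
# weights fitted on the training set by least squares.
sel = np.linspace(0, n_freq - 1, 8, dtype=int)
w, *_ = np.linalg.lstsq(train[:, sel], y_train, rcond=None)

# Evaluate on independent test spectra: 8 frequencies instead of 1000.
pred = test[:, sel] @ w
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
bias = float(np.mean(pred - y_test))
```

Because each spectrum here is evaluated at 8 frequencies instead of 1000, the per-spectrum cost drops by the same large factor reported for the Monte Carlo simulations, while the fitted weights keep the channel value accurate.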
Abstract:
The El Niño/Southern Oscillation is Earth’s most prominent source of interannual climate variability, alternating irregularly between El Niño and La Niña, and resulting in global disruption of weather patterns, ecosystems, fisheries and agriculture [1-5]. The 1998–1999 extreme La Niña event that followed the 1997–1998 extreme El Niño event [6] switched extreme El Niño-induced severe droughts to devastating floods in western Pacific countries, and vice versa in the southwestern United States [4,7]. During extreme La Niña events, cold sea surface conditions develop in the central Pacific [8,9], creating an enhanced temperature gradient from the Maritime continent to the central Pacific. Recent studies have revealed robust changes in El Niño characteristics in response to simulated future greenhouse warming [10-12], but how La Niña will change remains unclear. Here we present climate modelling evidence, from simulations conducted for the Coupled Model Intercomparison Project phase 5 (ref. 13), for a near doubling in the frequency of future extreme La Niña events, from one in every 23 years to one in every 13 years. This occurs because projected faster mean warming of the Maritime continent than of the central Pacific, enhanced upper-ocean vertical temperature gradients, and an increased frequency of extreme El Niño events are conducive to the development of extreme La Niña events. Approximately 75% of the increase occurs in years following extreme El Niño events, thus projecting more frequent swings between opposite extremes from one year to the next.
Abstract:
El Niño events are a prominent feature of climate variability with global climatic impacts. The 1997/98 episode, often referred to as ‘the climate event of the twentieth century’ [1,2], and the 1982/83 extreme El Niño [3] featured a pronounced eastward extension of the west Pacific warm pool and development of atmospheric convection, and hence a huge rainfall increase, in the usually cold and dry equatorial eastern Pacific. Such a massive reorganization of atmospheric convection, which we define as an extreme El Niño, severely disrupted global weather patterns, affecting ecosystems [4,5], agriculture [6], tropical cyclones, drought, bushfires, floods and other extreme weather events worldwide [3,7-9]. Potential future changes in such extreme El Niño occurrences could have profound socio-economic consequences. Here we present climate modelling evidence for a doubling in the occurrences in the future in response to greenhouse warming. We estimate the change by aggregating results from climate models in the Coupled Model Intercomparison Project phases 3 (CMIP3; ref. 10) and 5 (CMIP5; ref. 11) multi-model databases, and a perturbed physics ensemble [12]. The increased frequency arises from a projected surface warming over the eastern equatorial Pacific that occurs faster than in the surrounding ocean waters [13,14], facilitating more occurrences of atmospheric convection in the eastern equatorial region.
Abstract:
Observed and predicted changes in the strength of the westerly winds blowing over the Southern Ocean have motivated a number of studies of the response of the Antarctic Circumpolar Current and Southern Ocean Meridional Overturning Circulation (MOC) to wind perturbations, and led to the discovery of the "eddy-compensation" regime, wherein the MOC becomes insensitive to wind changes. In addition to the MOC, tracer transport also depends on mixing processes. Here we show, in a high-resolution process model, that isopycnal mixing by mesoscale eddies is strongly dependent on the wind strength. This dependence can be explained by mixing-length theory and is driven by increases in eddy kinetic energy; the mixing length does not change strongly in our simulation. Simulation of a passive ventilation tracer (analogous to CFCs or anthropogenic CO2) demonstrates that variations in tracer uptake across experiments are dominated by changes in isopycnal mixing, rather than changes in the MOC. We argue that, to properly understand tracer uptake under different wind-forcing scenarios, the sensitivity of isopycnal mixing to winds must be accounted for.
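The mixing-length scaling invoked here is the standard one (the symbols below are generic, not necessarily the paper's notation): the isopycnal eddy diffusivity scales as

```latex
K \;\approx\; \Gamma \, u_{\mathrm{rms}} \, L_{\mathrm{mix}},
\qquad u_{\mathrm{rms}} \sim \sqrt{2\,\mathrm{EKE}}
```

With the mixing efficiency Γ and the mixing length L_mix roughly constant, as found in the simulations, K grows with the square root of the eddy kinetic energy, so wind-driven increases in EKE translate directly into stronger isopycnal mixing.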
Abstract:
There are no standardised serving/portion sizes defined for foods consumed in the European Union (EU). Typical serving sizes can deviate significantly from the 100 g/100 ml labelling specification required by EU legislation. Where the nutritional value of a portion is specified, the portion size is determined by the manufacturers. Our objective was to investigate the potential for standardising portion sizes for specific foods, thereby ensuring complementarity across countries. We compared portion size for 156 food items measured using a food frequency questionnaire across the seven countries participating in the Food4me study. The probability of consuming a food and the frequency of consumption differed across countries for 93% and 58% of the foods, respectively. However, the individual country mean portion size differed from the average across countries in only 16% of comparisons. Thus, although dietary choices vary markedly across countries, there is much less variation in portion sizes. Our results highlight the potential for standardisation of portion sizes on nutrition labels in the EU.
Abstract:
1. Species’ distributions are likely to be affected by a combination of environmental drivers. We used a data set of 11 million species occurrence records over the period 1970–2010 to assess changes in the frequency of occurrence of 673 macro-moth species in Great Britain. Groups of species with different predicted sensitivities showed divergent trends, which we interpret in the context of land-use and climatic changes. 2. A diversity of responses was revealed: 260 moth species declined significantly, whereas 160 increased significantly. Overall, frequencies of occurrence declined, mirroring trends in less species-rich, yet more intensively studied taxa. 3. Geographically widespread species, which were predicted to be more sensitive to land use than to climate change, declined significantly in southern Britain, where the cover of urban and arable land has increased. 4. Moths associated with low nitrogen and open environments (based on their larval host plant characteristics) declined most strongly, which is also consistent with a land-use change explanation. 5. Some moths that reach their northern (leading edge) range limit in southern Britain increased, whereas species restricted to northern Britain (trailing edge) declined significantly, consistent with a climate change explanation. 6. Not all species of a given type behaved similarly, suggesting that complex interactions between species’ attributes and different combinations of environmental drivers determine frequency of occurrence changes. 7. Synthesis and applications. Our findings are consistent with large-scale responses to climatic and land-use changes, with some species increasing and others decreasing. We suggest that land-use change (e.g. habitat loss, nitrogen deposition) and climate change are both major drivers of moth biodiversity change, acting independently and in combination. 
Importantly, the diverse responses revealed in this species-rich taxon show that multifaceted conservation strategies are needed to minimize negative biodiversity impacts of multiple environmental changes. We suggest that habitat protection, management and ecological restoration can mitigate combined impacts of land-use change and climate change by providing environments that are suitable for existing populations and also enable species to shift their ranges.
Abstract:
Despite the importance of dust aerosol in the Earth system, state-of-the-art models show a wide spread in North African dust emission. This study presents a systematic evaluation of dust-emitting winds in 30 years of the historical simulation with the UK Met Office Earth-system model HadGEM2-ES for the Coupled Model Intercomparison Project Phase 5. Isolating the effect of winds on dust emission and using an automated detection for nocturnal low-level jets (NLLJs) allow an in-depth evaluation of the model performance for dust emission from a meteorological perspective. The findings highlight that NLLJs are a key driver for dust emission in HadGEM2-ES in terms of occurrence frequency and strength. The annually and spatially averaged occurrence frequency of NLLJs is similar in HadGEM2-ES and ERA-Interim from the European Centre for Medium-Range Weather Forecasts. Compared to ERA-Interim, a stronger pressure ridge over northern Africa in winter and a southward-displaced heat low in summer result in differences in the location and strength of NLLJs. In particular, the larger geostrophic winds associated with the stronger ridge strengthen NLLJs over parts of West Africa in winter. Stronger NLLJs in summer may instead result from the artificially increased mixing coefficient under stable stratification, an enhancement that is weaker in HadGEM2-ES. NLLJs in the Bodélé Depression are affected by stronger synoptic-scale pressure gradients in HadGEM2-ES. Wintertime geostrophic winds can even be so strong that the associated vertical wind shear prevents the formation of NLLJs. These results call for further model improvements in the synoptic-scale dynamics and the physical parametrization of the nocturnal stable boundary layer to better represent dust-emitting processes in the atmospheric model. The new approach could be used to identify systematic behaviour in other models with respect to meteorological processes for dust emission.
This would help to improve dust emission simulations and contribute to decreasing the currently large uncertainty in climate change projections with respect to dust aerosol.