958 results for Eguchi-Hanson Metric
Abstract:
Purpose: Prenatal undernutrition followed by postweaning feeding of a high-fat diet results in obesity in the adult offspring. In this study, we investigated whether diet-induced thermogenesis is altered as a result of such nutritional mismatch. Methods: Female MF-1 mice were fed a normal protein (NP, 18 % casein) or a protein-restricted (PR, 9 % casein) diet throughout pregnancy and lactation. After weaning, male offspring of both groups were fed either a high-fat diet (HF; 45 % kcal fat) or standard chow (C, 7 % kcal fat) to generate the NP/C, NP/HF, PR/C and PR/HF adult offspring groups (n = 7–11 per group). Results: PR/C and NP/C offspring had similar body weights at 30 weeks of age. Postweaning HF feeding resulted in significantly heavier NP/HF offspring (P < 0.01), but not in PR/HF offspring, compared with their chow-fed counterparts. However, the PR/HF offspring exhibited greater adiposity (P < 0.01) than the NP/HF group. The NP/HF offspring had increased energy expenditure and increased mRNA expression of uncoupling protein-1 and β-3 adrenergic receptor in the interscapular brown adipose tissue (iBAT) compared with the NP/C mice (both P < 0.01). No such differences in energy expenditure and iBAT gene expression were observed between the PR/HF and PR/C offspring. Conclusions: These data suggest that a mismatch between maternal diet during pregnancy and lactation, and the postweaning diet of the offspring, can attenuate diet-induced thermogenesis in the iBAT, resulting in the development of obesity in adulthood.
Abstract:
The subject of climate feedbacks focuses attention on global mean surface air temperature (GMST) as the key metric of climate change. But what does knowledge of past and future GMST tell us about the climate of specific regions? In the context of the ongoing UNFCCC process, this is an important question for policy-makers as well as for scientists. The answer depends on many factors, including the mechanisms causing changes, the timescale of the changes, and the variables and regions of interest. This paper provides a review and analysis of the relationship between changes in GMST and changes in local climate, first in observational records and then in a range of climate model simulations, which are used to interpret the observations. The focus is on decadal timescales, which are of particular interest in relation to recent and near-future anthropogenic climate change. It is shown that GMST primarily provides information about forced responses, but that understanding and quantifying internal variability is essential to projecting climate and climate impacts on regional-to-local scales. The relationship between local forced responses and GMST is often linear but may be nonlinear, and can be greatly complicated by competition between different forcing factors. Climate projections are limited not only by uncertainties in the signal of climate change but also by uncertainties in the characteristics of real-world internal variability. Finally, it is shown that the relationship between GMST and local climate provides a simple approach to climate change detection, and a useful guide to attribution studies.
Abstract:
We develop a new measurement scale to assess consumers’ brand likeability in firm-level brands. We present brand likeability as a multidimensional construct. In the context of service experience purchases, we find that increased likeability in brands results in: (1) greater amount of positive association; (2) increased interaction interest; (3) more personified quality; and (4) increased brand contentment. The four-dimensional multiple-item scale demonstrates good psychometric properties, showing strong evidence of reliability as well as convergent, discriminant and nomological validity. Our findings reveal that brand likeability is positively associated with satisfaction and positive word of mouth. The scale extends existing branding research, providing brand managers with a metric so that likeability can be managed strategically. It addresses the need for firms to act more likeably in an interaction-dominated economy. Focusing on likeability acts as a differentiator and encourages likeable brand personality traits. We present theoretical implications and future research directions on the holistic brand likeability concept.
Abstract:
Recent work in animals suggests that the extent of early tactile stimulation by parents of offspring is an important element in early caregiving. We evaluate the psychometric properties of a new parent-report measure designed to assess the frequency of tactile stimulation across multiple caregiving domains in infancy. We describe the full item set of the Parent-Infant Caregiving Touch Scale (PICTS) and, using data from a UK longitudinal Child Health and Development Study, its response frequencies and factor structure, and whether that structure was invariant over two time points in early development (5 and 9 weeks). When their infant was 9 weeks old, 838 mothers responded on the PICTS, while a stratified subsample of 268 mothers completed the PICTS at an earlier 5-week assessment (229 responded on both occasions). Three PICTS factors were identified, reflecting stroking, holding and affective communication. These were moderately to strongly correlated at each of the two time points of interest and were unrelated to, and therefore distinct from, a traditional measure of maternal sensitivity at 7 months. A wholly stable psychometric structure across the 5- and 9-week assessments was not identified, which suggests that behavior profiles differ slightly for younger and older infants. Tests of measurement invariance demonstrated that all three factors are characterized by full configural and metric invariance, as well as a moderate degree of evidence of scalar invariance for the stroking factor. We propose the PICTS as a valuable new measure of important aspects of caregiving in infancy.
Abstract:
A data insertion method, where a dispersion model is initialized from ash properties derived from a series of satellite observations, is used to model the 8 May 2010 Eyjafjallajökull volcanic ash cloud, which extended from Iceland to northern Spain. We also briefly discuss the application of this method to the April 2010 phase of the Eyjafjallajökull eruption and the May 2011 Grímsvötn eruption. An advantage of this method is that very little knowledge about the eruption itself is required, because some of the usual eruption source parameters are not used. The method may therefore be useful for remote volcanoes where good satellite observations of the erupted material are available, but little is known about the properties of the actual eruption. It does, however, have a number of limitations related to the quality and availability of the observations. We demonstrate that, using certain configurations, the data insertion method is able to capture the structure of a thin filament of ash extending over northern Spain that is not fully captured by other modeling methods. It also verifies well against the satellite observations according to the quantitative object-based quality metric SAL (structure, amplitude, location) and the spatial coverage metric, the Figure of Merit in Space.
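The abstract does not spell out how the Figure of Merit in Space is computed; it is conventionally taken as the area of overlap between observed and modelled ash divided by the area of their union. The sketch below illustrates only that conventional calculation; the function name, grid and masks are hypothetical and are not taken from the paper.

```python
import numpy as np

def figure_of_merit_in_space(model_mask, obs_mask, cell_area=1.0):
    """Spatial-coverage score: overlap area divided by union area.

    model_mask, obs_mask : boolean arrays on a common grid, True where ash is
                           modelled / observed above a chosen threshold.
    cell_area            : grid-cell area (scalar or array of the same shape);
                           assumes at least one cell contains ash in either field.
    """
    overlap = np.logical_and(model_mask, obs_mask) * cell_area
    union = np.logical_or(model_mask, obs_mask) * cell_area
    return overlap.sum() / union.sum()  # 1 = perfect overlap, 0 = no overlap

# Toy example: two partially overlapping ash "clouds" on a 4x4 grid
model = np.zeros((4, 4), dtype=bool); model[1:3, 0:3] = True
obs = np.zeros((4, 4), dtype=bool); obs[1:3, 1:4] = True
print(figure_of_merit_in_space(model, obs))  # 0.5
```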
Abstract:
Group Exhibition curated by Cedar Lewisohn. Contributors: Caroline Achaintre, Edwin Burdis, Slawomir Czajkowski (Zbiok), Shaun Doyle and Mally Mallinson, Ruth Ewan, Andrew Gilbert, Joel Gray, Tod Hanson, Geoffrey Ireland, kennardphillipps, Cedar Lewisohn (curator), Kieron Livingstone and Ian Allison, Alexis Milne, Laura Oldfield Ford, Max Reeves, Clunie Reid, John Russell, Francis Thorburn, Vicky Wright.
Abstract:
Debate about the definition of "small state" has produced more fragmentation than consensus, even as the literature has demonstrated its subjects' roles in joining international organizations, propagating norms, executing creative diplomacy, influencing allies, avoiding and joining conflicts, and building peace. However, work on small states has struggled to identify commonalities in these states' international relations, to cumulate knowledge, or to impact broader IR theory. This paper advocates a changed conceptual and definitional framework. Analysis of "small states" should pivot to examine the dynamics of the asymmetrical relationships in which these states are engaged. Instead of seeking an overall metric for size as the relevant variable (falling victim in a different way to Dahl's "lump-of-power fallacy"), we can recognize the multifaceted, variegated nature of power, whether in war or peacetime.
Abstract:
Rates of phenotypic evolution vary widely in nature and these rates may often reflect the intensity of natural selection. Here we outline an approach for detecting exceptional shifts in the rate of phenotypic evolution across phylogenies. We introduce a simple new branch-specific metric, ∆V/∆B, that divides observed phenotypic change along a branch into two components: (1) that attributable to the background rate (∆B), and (2) that attributable to departures from the background rate (∆V). Where the expected change derived from variation in the rate of morphological evolution is more than double that explained by the background rate (∆V/∆B > 2), we identify this as positive phenotypic selection. We apply our approach to six datasets, finding multiple instances of positive selection in each. Our results support the growing appreciation that the traditional gradual view of phenotypic evolution is rarely upheld, with a more episodic view taking its place. This moves focus away from viewing phenotypic evolution as a simple homogeneous process and facilitates reconciliation with macroevolutionary interpretations from a genetic perspective, paving the way to novel insights into the link between genotype and phenotype. The ability to detect positive selection when genetic data are unavailable or unobtainable represents an attractive prospect for extant species, but when applied to fossil data it can reveal patterns of natural selection in deep time that would otherwise be impossible to detect.
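As an illustration of the decision rule described above, the sketch below flags branches with ∆V/∆B > 2, assuming the per-branch total and background-rate changes have already been estimated (for example, from a variable-rates phylogenetic model). The split ∆V = total change − background change, the function name and the example values are assumptions for illustration, not the paper's implementation.

```python
def positive_selection_branches(branch_changes, threshold=2.0):
    """Flag branches where rate-driven change exceeds `threshold` times the
    background-rate change, i.e. delta_V / delta_B > 2 in the abstract's notation.

    branch_changes : dict mapping branch id -> (total_change, background_change),
                     both expressed in the same units of expected phenotypic change.
    """
    flagged = {}
    for branch, (total, background) in branch_changes.items():
        delta_B = background                 # change attributable to the background rate
        delta_V = total - background         # change attributable to rate departures
        ratio = delta_V / delta_B if delta_B > 0 else float("inf")
        if ratio > threshold:                # branches below threshold count as background evolution
            flagged[branch] = ratio
    return flagged

# Hypothetical per-branch values: (total expected change, background-only change)
example = {"branch_A": (5.0, 1.0), "branch_B": (1.2, 1.0)}
print(positive_selection_branches(example))  # {'branch_A': 4.0}
```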
Abstract:
Observers generally fail to recover three-dimensional shape accurately from binocular disparity. Typically, depth is overestimated at near distances and underestimated at far distances [Johnston, E. B. (1991). Systematic distortions of shape from stereopsis. Vision Research, 31, 1351–1360]. A simple prediction from this is that disparity-defined objects should appear to expand in depth when moving towards the observer, and compress in depth when moving away. However, additional information is provided when an object moves from which 3D Euclidean shape can be recovered, be this through the addition of structure-from-motion information [Richards, W. (1985). Structure from stereo and motion. Journal of the Optical Society of America A, 2, 343–349], or the use of non-generic strategies [Todd, J. T., & Norman, J. F. (2003). The visual perception of 3-D shape from multiple cues: Are observers capable of perceiving metric structure? Perception and Psychophysics, 65, 31–47]. Here, we investigated shape constancy for objects moving in depth. We found that, to be perceived as constant in shape, objects needed to contract in depth when moving toward the observer, and expand in depth when moving away, countering the effects of incorrect distance scaling (Johnston, 1991). This is a striking example of the failure of shape constancy, but one that is predicted if observers neither accurately estimate object distance in order to recover Euclidean shape, nor are able to base their responses on a simpler processing strategy.
Abstract:
It has been suggested that few students graduate with the skills required for many ecological careers, as field-based learning is said to be in decline in academic institutions. Here, we asked whether mobile technology could improve field-based learning, using the ability to identify birds as the study metric. We divided a class of ninety-one undergraduate students into two groups for field-based sessions where they were taught bird identification skills. The first group had access to a traditional identification book and the second group was provided with an identification app. We found no difference between the groups in the ability of students to identify birds after three field sessions. Furthermore, we found that students using the traditional book were significantly more likely to identify novel species. Therefore, we find no evidence that mobile technology improved students' ability to retain what they experienced in the field; indeed, there is evidence that traditional field guides were more useful to students as they attempted to identify new species. Nevertheless, students felt positively about using their own smartphone devices for learning, highlighting that while apps did not lead to an improvement in bird identification ability, they gave greater accessibility to relevant information outside allocated teaching times.
Abstract:
Drought events are projected to increase in frequency and magnitude, which may alter the composition of ecological communities. Using a functional community metric that describes abundance, life history traits and conservation status, based upon Grime's CSR (Competitive-Stress tolerant-Ruderal) scheme, we investigated how British butterfly communities changed during an extreme drought in 1995. Throughout Britain, the total abundance of these insects had a significant tendency to increase, accompanied by substantial changes in community composition, particularly in more northerly, wetter sites. Communities tended to shift away from specialist, vulnerable species and towards generalist, widespread species, and in the year following the drought communities had yet to return to equilibrium. Importantly, heterogeneity in the surrounding landscapes mediated community responses to the drought event. Contrary to expectation, however, community shifts were more extreme in areas of greater topographic diversity, whilst land-cover diversity buffered community changes and limited declines in vulnerable specialist butterflies.
Abstract:
Improved understanding and prediction of the fundamental environmental controls on ecosystem service supply across the landscape will help to inform decisions made by policy makers and land-water managers. To evaluate this issue for a local catchment case study, we explored metrics and spatial patterns of service supply for water quality regulation, agricultural production, carbon storage and biodiversity for the Macronutrient Conwy catchment. Methods included the use of ecosystem models such as LUCI and JULES, and the integration of national-scale field survey datasets, earth observation products and plant trait databases, to produce finely resolved maps of species richness and primary production. Analyses were done with both 1 × 1 km gridded and subcatchment data. A single common gradient characterised catchment-scale ecosystem service supply, with agricultural production and carbon storage at opposing ends of the gradient, as reported for a national-scale assessment. Species diversity was positively related to production, owing to the below-national-average productivity levels in the Conwy combined with the unimodal relationship between biodiversity and productivity at the national scale. In contrast to the national-scale assessment, a strong reduction in water quality as production increased was observed in these low-productivity systems. Various soil variables were tested for their power to predict ecosystem service supply. Soil carbon, nitrogen, their ratio and soil pH all had double the predictive power of rainfall and altitude, each explaining around 45% of the variation. Soil pH is proposed as a potential indicator of ecosystem service supply because it is a simple and practical measurement that can be carried out in the field with the crowd-sourcing technologies now available. The study emphasises the importance of considering multiple ecosystem services together, owing to the complexity of covariation at local and national scales, and the benefits of exploiting a wide range of metrics for each service to enhance data robustness.
Abstract:
Parties to the United Nations Framework Convention on Climate Change (UNFCCC) have requested guidance on common greenhouse gas metrics in accounting for nationally determined contributions (NDCs) to emission reductions [1]. Metric choice can affect the relative emphasis placed on reductions of 'cumulative climate pollutants' such as carbon dioxide versus 'short-lived climate pollutants' (SLCPs), including methane and black carbon [2-6]. Here we show that the widely used 100-year global warming potential (GWP100) effectively measures the relative impact of both cumulative pollutants and SLCPs on realized warming 20–40 years after the time of emission. If the overall goal of climate policy is to limit peak warming, GWP100 therefore overstates the importance of current SLCP emissions unless stringent and immediate reductions of all climate pollutants result in temperatures nearing their peak soon after mid-century [7-10], which may be necessary to limit warming to "well below 2 °C" [1]. The GWP100 can be used to approximately equate a one-off pulse emission of a cumulative pollutant and an indefinitely sustained change in the rate of emission of an SLCP [11-13]. The climate implications of traditional CO2-equivalent targets are ambiguous unless contributions from cumulative pollutants and SLCPs are specified separately.
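As a rough illustration of the pulse-versus-sustained equivalence mentioned in the closing sentences, a commonly used approximation (not stated explicitly in the abstract) is that a sustained change ΔE in the emission rate of an SLCP has a similar long-term temperature effect to a one-off CO2 pulse of roughly GWP_H × H × ΔE. The numbers, names and the relation itself below are indicative assumptions, not values taken from the paper.

```python
# Illustrative arithmetic only; the equivalence relation and the GWP value
# used here are assumptions for the sketch, not results quoted from the paper.
GWP100_CH4 = 28.0   # indicative 100-year GWP for methane (dimensionless)
H = 100.0           # time horizon in years

def co2_pulse_equivalent(delta_rate_slcp_t_per_yr):
    """Approximate one-off CO2 pulse (tonnes) with a long-term warming effect
    similar to a sustained change in SLCP emission rate (tonnes per year)."""
    return GWP100_CH4 * H * delta_rate_slcp_t_per_yr

# A sustained increase of 1 t CH4/yr ~ a single pulse of ~2800 t CO2
print(co2_pulse_equivalent(1.0))  # 2800.0
```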
Abstract:
Skillful sea ice forecasts from days to years ahead are becoming increasingly important for the operation and planning of human activities in the Arctic. Here we analyze the potential predictability of the Arctic sea ice edge in six climate models. We introduce the integrated ice-edge error (IIEE), a user-relevant verification metric defined as the area where the forecast and the “truth” disagree on the ice concentration being above or below 15%. The IIEE lends itself to decomposition into an absolute extent error, corresponding to the common sea ice extent error, and a misplacement error. We find that the often-neglected misplacement error makes up more than half of the climatological IIEE. In idealized forecast ensembles initialized on 1 July, the IIEE grows faster than the absolute extent error. This means that the Arctic sea ice edge is less predictable than sea ice extent, particularly in September, with implications for the potential skill of end-user relevant forecasts.
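The IIEE and its decomposition can be written down directly from the definitions in the abstract above. The sketch below is a minimal illustration; the function name, the 15% threshold expressed as a fraction, and the cell-area inputs are assumptions about the data layout rather than the authors' code.

```python
import numpy as np

def iiee_decomposition(forecast_conc, truth_conc, cell_area, threshold=0.15):
    """Integrated ice-edge error and its decomposition (all in area units).

    forecast_conc, truth_conc : sea-ice concentration fields (0..1) on a common grid
    cell_area                 : grid-cell areas, array of the same shape as the fields
    """
    f_ice = forecast_conc >= threshold
    t_ice = truth_conc >= threshold

    overestimate = np.sum(cell_area[f_ice & ~t_ice])    # forecast ice where the truth has none
    underestimate = np.sum(cell_area[~f_ice & t_ice])   # true ice that the forecast misses

    iiee = overestimate + underestimate                          # total area of disagreement
    absolute_extent_error = abs(overestimate - underestimate)    # |forecast extent - true extent|
    misplacement_error = iiee - absolute_extent_error            # = 2 * min(over, under)
    return iiee, absolute_extent_error, misplacement_error
```

Written this way, the misplacement error is twice the smaller of the over- and under-estimated areas, which is why it can dominate the IIEE even when the total forecast and observed extents nearly agree.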
Abstract:
The evolution of the mass of a black hole embedded in a universe filled with dark energy and cold dark matter is calculated in closed form within a test-fluid model in a Schwarzschild metric, taking into account the cosmological evolution of both fluids. The result describes exactly how accretion asymptotically switches from the matter-dominated to the Lambda-dominated regime. At early epochs the black hole mass increases due to dark matter accretion, and at later epochs the increase in mass stops as dark energy accretion takes over. Thus, the unphysical behaviour of previous analyses is improved upon in this simple exact model.