Abstract:
Although the role of the academic head of department (HoD) has always been important to university management and performance, the increasing significance given to bureaucracy, academic performance and productivity, and government accountability has greatly elevated the importance of this position. Previous research and anecdotal evidence suggest that as academics move into HoD roles, usually with little or no training, they struggle to manage key aspects of their role adequately. It is this problem – and its manifestations – that forms the research focus of this study. Based on the research question, “What are the career trajectories of academics who become HoDs in a selected post-1992 university?”, the study aimed to achieve greater understanding of why academics become HoDs, what it is like being a HoD, and how the experience influences their future career plans. The study adopts an interpretive approach, in line with social constructivism. Edited topical life history interviews were undertaken with 17 male and female HoDs, from a range of disciplines, in a post-1992 UK university. These data were analysed using coding, categorisation and theme-formation techniques, and by developing profiles of each respondent. The findings from this study suggest that academics who become HoDs not only need the capacity to assume a range of personal and professional identities, but also need to adopt and switch between them regularly. Whether individuals can successfully balance and manage these multiple identities, or whether they experience major conflicts and difficulties within or between them, greatly affects their experiences of being a HoD and may influence their subsequent career decisions. It is claimed that the focus, approach and analytical framework - based on the interrelationships between the concepts of socialisation, identity and career trajectory - provide a distinct and original contribution to knowledge in this area. Although the results of this study cannot be generalised, the findings may help other individuals and institutions move towards a firmer understanding of the academic who becomes HoD - in relation to theory, practice and future research.
Abstract:
A number of recent articles emphasize the fundamental importance of taphonomy and formation processes to the interpretation of plant remains assemblages, as well as the value of interdisciplinary approaches to studies of environmental change and ecological and social practices. This paper examines ways in which micromorphology can contribute to integrating geoarchaeology and archaeobotany in the analysis of the taphonomy and context of plant remains and of ecological and social practices. Micromorphology enables simultaneous in situ study of diverse plant materials, and thereby of traces of a range of depositional pathways and histories. In addition to charred plant remains, plant impressions, phytoliths and calcitic ashes are also often preserved in semi-arid environments. With other analytical techniques, these diverse plant remains are routinely separated and extracted from their depositional context, or lost altogether, destroying crucial evidence on taphonomy, formation processes and contextual associations, which are fundamental to all subsequent interpretations. Although micromorphological samples are small in comparison to bulk flotation samples of charred plant remains, their size is similar to that of phytolith and pollen samples. In this paper, key taphonomic issues are examined in the study of: fuel; animal dung, animal management and penning; building materials; and specific activities, including food storage and preparation, and ritual, using selected case studies from early urban settlements in the Ancient Near East. Microarchaeological residues and experimental archaeology are also briefly examined.
Abstract:
Background and aims: GP-TCM is the first EU-funded Coordination Action consortium dedicated to traditional Chinese medicine (TCM) research. This paper aims to summarise the objectives, structure and activities of the consortium and introduces the position of the consortium regarding good practice, priorities, challenges and opportunities in TCM research. Serving as the introductory paper for the GP-TCM Journal of Ethnopharmacology special issue, this paper describes the roadmap of this special issue and reports how the main outputs of the ten GP-TCM work packages are integrated and have led to consortium-wide conclusions. Materials and methods: Literature studies, opinion polls and discussions among consortium members and stakeholders. Results: By January 2012, through 3 years of team building, the GP-TCM consortium had grown into a large collaborative network involving ∼200 scientists from 24 countries and 107 institutions. Consortium members had worked closely to address good practice issues related to various aspects of Chinese herbal medicine (CHM) and acupuncture research, the focus of this Journal of Ethnopharmacology special issue, leading to state-of-the-art reports, guidelines and consensus on the application of omics technologies in TCM research. In addition, through an online survey open to GP-TCM members and non-members, we polled opinions on grand priorities, challenges and opportunities in TCM research. Based on the poll, although consortium members and non-members had diverse opinions on the major challenges in the field, both groups agreed that high-quality efficacy/effectiveness and mechanistic studies are grand priorities and that the TCM legacy in general and its management of chronic diseases in particular represent grand opportunities. Consortium members cast their votes of confidence in omics and systems biology approaches to TCM research and believed that quality and pharmacovigilance of TCM products are not only grand priorities, but also grand challenges. Non-members, however, gave priority to integrative medicine, expressed concern about the impact of regulation of TCM practitioners, and emphasised intersectoral collaborations in funding TCM research, especially clinical trials. Conclusions: The GP-TCM consortium made great efforts to address some fundamental issues in TCM research, including developing guidelines, as well as identifying priorities, challenges and opportunities. These consortium guidelines and consensus will need dissemination, validation and further development through continued interregional, interdisciplinary and intersectoral collaborations. To promote this, a new consortium, known as the GP-TCM Research Association, is being established to succeed the 3-year fixed-term FP7 GP-TCM consortium and will be officially launched at the Final GP-TCM Congress in Leiden, the Netherlands, in April 2012.
Abstract:
Acrylamide, a chemical that is probably carcinogenic in humans and has neurological and reproductive effects, forms from free asparagine and reducing sugars during high-temperature cooking and processing of common foods. Potato and cereal products are major contributors to dietary exposure to acrylamide, and while the food industry reacted rapidly to the discovery of acrylamide in some of the most popular foods, the issue remains a difficult one for many sectors. Efforts to reduce acrylamide formation would be greatly facilitated by the development of crop varieties with lower concentrations of free asparagine and/or reducing sugars, and of best agronomic practice to ensure that concentrations are kept as low as possible. This review describes how acrylamide is formed, the factors affecting free asparagine and sugar concentrations in crop plants, and the sometimes complex relationship between precursor concentration and acrylamide-forming potential. It covers some of the strategies being used to reduce free asparagine and sugar concentrations through genetic modification and other genetic techniques, such as the identification of quantitative trait loci. The link between acrylamide formation, flavour, and colour is discussed, as well as the difficulty of balancing the unknown risk of exposure to acrylamide at the levels present in foods with the well-established health benefits of some of the foods concerned. Key words: Amino acids, asparagine, cereals, crop quality, food safety, Maillard reaction, potato, rye, sugars, wheat.
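For orientation, the formation route described in this review can be summarised as a schematic reaction (a restatement of the chemistry named above and in the keywords, not a stoichiometric equation):

```latex
% Schematic of acrylamide formation during high-temperature cooking/processing
\text{free asparagine} \;+\; \text{reducing sugars (e.g. glucose, fructose)}
\;\xrightarrow[\;T \gtrsim 120\,^{\circ}\mathrm{C}\;]{\text{Maillard reaction}}\;
\text{acrylamide} \;+\; \text{other Maillard products (colour, flavour)}
```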
Abstract:
Sustained hypoxia alters the expression of numerous proteins and predisposes individuals to Alzheimer's disease (AD). We have previously shown that hypoxia in vitro alters Ca2+ homeostasis in astrocytes and promotes increased production of the amyloid beta peptides (Abeta) of AD. Indeed, alteration of Ca2+ homeostasis requires amyloid formation. Here, we show that electrogenic glutamate uptake by astrocytes is suppressed by hypoxia (1% O2, 24 h) in a manner that is independent of amyloid beta peptide formation. The hypoxic suppression of glutamate uptake and of the expression levels of the glutamate transporter proteins EAAT1 and EAAT2 was not mimicked by exogenous application of amyloid beta peptide, or by prevention of endogenous amyloid peptide formation (using inhibitors of either beta or gamma secretase). Thus, dysfunction in glutamate homeostasis under hypoxic conditions is independent of Abeta production, but will likely contribute to the neuronal damage and death associated with AD following hypoxic events.
Abstract:
This article discusses the aesthetic and spatial representational strategies of the popular studio-based musical television drama serials Rock Follies and Rock Follies of ’77. It analyses how the texts’ themes relating to women and the entertainment industry are mediated through their postmodern ironic mode and representation of fantastic spaces. Rock Follies’ distinctive stylised aesthetic and mode of caricature are analysed with reference to the visual intentions and ‘voice’ of the writer, Howard Schuman. Through considering the programmes’ various spatial strategies, the article draws attention to the importance of visual and performance style in their postmodern discourse on culture, fantasy, gender and subjectivity. Analysis of the spaces of musical performance, characters’ domestic environments and simulated entertainment spaces reveals how a dialectic is established between the escapist imaginative pleasures of fantasy and the manipulative and exploitative practices of the culture industry. The shift from the optimism of the first series, when the LittleLadies first form, to the darker mood of the second series, in which they are increasingly divided by industry pressures, is traced through changes in the aesthetics of space and characterisation. This case study thus reveals the unique aesthetic strengths of the studio as a space of artifice, performance and electronic visual manipulation that facilitates the texts’ reflexive representation of culture and feminised fantasy.
Abstract:
While a quantitative climate theory of tropical cyclone formation remains elusive, considerable progress has been made recently in our ability to simulate tropical cyclone climatologies and understand the relationship between climate and tropical cyclone formation. Climate models are now able to simulate a realistic rate of global tropical cyclone formation, although simulation of the Atlantic tropical cyclone climatology remains challenging unless horizontal resolutions finer than 50 km are employed. This article summarizes published research from the idealized experiments of the Hurricane Working Group of U.S. CLIVAR (CLImate VARiability and predictability of the ocean-atmosphere system). This work, combined with results from other model simulations, has strengthened relationships between tropical cyclone formation rates and climate variables such as mid-tropospheric vertical velocity, with decreased climatological vertical velocities leading to decreased tropical cyclone formation. Systematic differences are shown between experiments in which only sea surface temperature is increased versus experiments where only atmospheric carbon dioxide is increased, with the carbon dioxide experiments more likely to demonstrate the decrease in tropical cyclone numbers previously shown to be a common response of climate models in a warmer climate. Experiments where the two effects are combined also show decreases in numbers, but these tend to be less for models that demonstrate a strong tropical cyclone response to increased sea surface temperatures. Further experiments are proposed that may improve our understanding of the relationship between climate and tropical cyclone formation, including experiments with two-way interaction between the ocean and the atmosphere and variations in atmospheric aerosols.
Abstract:
ESA’s first multi-satellite mission Cluster is unique in its concept of four satellites orbiting in controlled formations. This will give an unprecedented opportunity to study the structure and dynamics of the magnetosphere. In this paper we discuss ways in which ground-based remote-sensing observations of the ionosphere can be used to support the multipoint in-situ satellite measurements. There are a very large number of potentially useful configurations between the satellites and any one ground-based observatory; however, the number of ideal occurrences for any one configuration is low. Many of the ground-based instruments cannot operate continuously, and Cluster will take data only for a part of each orbit, depending on how much high-resolution (‘burst-mode’) data are acquired. In addition, there are a great many instrument modes and the formation, size and shape of the cluster of the four satellites to consider. These circumstances create a clear and pressing need for careful planning to ensure that the scientific return from Cluster is maximised by additional coordinated ground-based observations. For this reason, ESA established a working group to coordinate the observations on the ground with Cluster. We give a number of examples of how the combined spacecraft and ground-based observations can address outstanding questions in magnetospheric physics. An online computer tool has been prepared to allow for the planning of conjunctions and advantageous constellations between the Cluster spacecraft and individual or combined ground-based systems. During the mission, a ground-based database containing index and summary data will help to identify interesting datasets and allow intervals to be selected for coordinated studies. We illustrate the philosophy of our approach using a few important examples of the many possible configurations between the satellites and the ground-based instruments.
Abstract:
We study the scaling properties and Kraichnan–Leith–Batchelor (KLB) theory of forced inverse cascades in generalized two-dimensional (2D) fluids (α-turbulence models) simulated at a resolution of 8192 × 8192. We consider α = 1 (surface quasigeostrophic flow), α = 2 (2D Euler flow) and α = 3. The forcing scale is well resolved, a direct cascade is present and there is no large-scale dissipation. Coherent vortices spanning a range of sizes, most larger than the forcing scale, are present for both α = 1 and α = 2. The active scalar field for α = 3 contains comparatively few and small vortices. The energy spectral slopes in the inverse cascade are steeper than the KLB prediction −(7−α)/3 in all three systems. Since we stop the simulations well before the cascades have reached the domain scale, vortex formation and spectral steepening are not due to condensation effects; nor are they caused by large-scale dissipation, which is absent. One- and two-point p.d.f.s, hyperflatness factors and structure functions indicate that the inverse cascades are intermittent and non-Gaussian over much of the inertial range for α = 1 and α = 2, while the α = 3 inverse cascade is much closer to Gaussian and non-intermittent. For α = 3 the steep spectrum is close to that associated with enstrophy equipartition. Continuous wavelet analysis shows approximate KLB scaling ℰ(k) ∝ k^{−2} (α = 1) and ℰ(k) ∝ k^{−5/3} (α = 2) in the interstitial regions between the coherent vortices. Our results demonstrate that coherent vortex formation (α = 1 and α = 2) and non-realizability (α = 3) cause 2D inverse cascades to deviate from the KLB predictions, but that the flow between the vortices exhibits KLB scaling and non-intermittent statistics for α = 1 and α = 2.
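For reference, the KLB prediction quoted above can be written out explicitly for the three values of α studied (a restatement of the formula in the abstract, not an additional result):

```latex
% KLB inverse-cascade energy spectrum prediction for alpha-turbulence
\mathcal{E}(k) \;\propto\; k^{-(7-\alpha)/3}
\quad\Longrightarrow\quad
\begin{cases}
k^{-2},   & \alpha = 1 \ (\text{surface quasigeostrophic flow}),\\
k^{-5/3}, & \alpha = 2 \ (\text{2D Euler flow}),\\
k^{-4/3}, & \alpha = 3.
\end{cases}
```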
Abstract:
Twenty varieties of field-grown potato were stored for 2 months and 6 months at 8 °C. Mean acrylamide contents in crisps prepared from all varieties at both storage times ranged from 131 μg per kg in Verdi to 5360 μg per kg in Pentland Dell. In contrast to previous studies, the longer storage period did not affect acrylamide formation significantly for most varieties, the exceptions being Innovator, where acrylamide formation increased, and Saturna, where it decreased. Four of the five varieties designated as suitable for crisping produced crisps with acrylamide levels below the European Commission indicative value of 1000 μg per kg (Saturna, Lady Rosetta, Lady Claire, and Verdi); the exception was Hermes. Two varieties more often used for French fries, Markies and Fontane, also produced crisps with less than 1000 μg per kg acrylamide. Correlations between acrylamide, its precursors and crisp colour are described, and the implications of the results for production of potato crisps are discussed.
Abstract:
Considerable progress has been made in understanding the present and future regional and global sea level in the 2 years since the publication of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. Here, we evaluate how the new results affect the AR5’s assessment of (i) historical sea level rise, including attribution of that rise and implications for the sea level budget, (ii) projections of the components and of total global mean sea level (GMSL), and (iii) projections of regional variability and emergence of the anthropogenic signal. In each of these cases, new work largely provides additional evidence in support of the AR5 assessment, providing greater confidence in those findings. Recent analyses confirm the twentieth century sea level rise, with some analyses showing a slightly smaller rate before 1990 and some a slightly larger value than reported in the AR5. There is now more evidence of an acceleration in the rate of rise. Ongoing ocean heat uptake and associated thermal expansion have continued since 2000, and are consistent with ocean thermal expansion reported in the AR5. A significant amount of heat is being stored deeper in the water column, with a larger rate of heat uptake since 2000 compared to the previous decades and with the largest storage in the Southern Ocean. The first formal detection studies for ocean thermal expansion and glacier mass loss since the AR5 have confirmed the AR5 finding of a significant anthropogenic contribution to sea level rise over the last 50 years. New projections of glacier loss from two regions suggest smaller contributions to GMSL rise from these regions than in studies assessed by the AR5; additional regional studies are required to further assess whether there are broader implications of these results. Mass loss from the Greenland Ice Sheet, primarily as a result of increased surface melting, and from the Antarctic Ice Sheet, primarily as a result of increased ice discharge, has accelerated. The largest estimates of acceleration in mass loss from the two ice sheets for 2003–2013 equal or exceed the acceleration of GMSL rise calculated from the satellite altimeter sea level record over the longer period of 1993–2014. However, when increased mass gain in land water storage and parts of East Antarctica, and decreased mass loss from glaciers in Alaska and some other regions are taken into account, the net acceleration in the ocean mass gain is consistent with the satellite altimeter record. New studies suggest that a marine ice sheet instability (MISI) may have been initiated in parts of the West Antarctic Ice Sheet (WAIS), but that it will affect only a limited number of ice streams in the twenty-first century. New projections of mass loss from the Greenland and Antarctic Ice Sheets by 2100, including a contribution from parts of WAIS undergoing unstable retreat, suggest a contribution that falls largely within the likely range (i.e., two thirds probability) of the AR5. These new results increase confidence in the AR5 likely range, indicating that there is a greater probability that sea level rise by 2100 will lie in this range with a corresponding decrease in the likelihood of an additional contribution of several tens of centimeters above the likely range. In view of the comparatively limited state of knowledge and understanding of rapid ice sheet dynamics, we continue to think that it is not yet possible to make reliable quantitative estimates of future GMSL rise outside the likely range. 
Projections of twenty-first century GMSL rise published since the AR5 depend on results from expert elicitation, but we have low confidence in conclusions based on these approaches. New work on regional projections and emergence of the anthropogenic signal suggests that the two commonly predicted features of future regional sea level change (the increasing tilt across the Antarctic Circumpolar Current and the dipole in the North Atlantic) are related to regional changes in wind stress and surface heat flux. Moreover, it is expected that sea level change in response to anthropogenic forcing, particularly in regions of relatively low unforced variability such as the low-latitude Atlantic, will be detectable over most of the ocean by 2040. The east-west contrast of sea level trends in the Pacific observed since the early 1990s cannot be satisfactorily accounted for by climate models, nor yet definitively attributed either to unforced variability or forced climate change.
Abstract:
Field observations of new particle formation and the subsequent particle growth are typically only possible at a fixed measurement location, and hence do not follow the temporal evolution of an air parcel in a Lagrangian sense. Standard analysis for determining formation and growth rates requires that the time-dependent formation rate and growth rate of the particles are spatially invariant; air parcel advection means that the observed temporal evolution of the particle size distribution at a fixed measurement location may not represent the true evolution if there are spatial variations in the formation and growth rates. Here we present a zero-dimensional aerosol box model coupled with one-dimensional atmospheric flow to describe the impact of advection on the evolution of simulated new particle formation events. Wind speed, particle formation rates and growth rates are input parameters that can vary as a function of time and location, using wind speed to connect location to time. The output simulates measurements at a fixed location; formation and growth rates of the particle mode can then be calculated from the simulated observations at a stationary point for different scenarios and compared with the ‘true’ input parameters. Hence, we can investigate how spatial variations in the formation and growth rates of new particles would appear in observations of particle number size distributions at a fixed measurement site. We show that the particle size distribution and growth rate at a fixed location depend on the formation and growth parameters upwind, even if local conditions do not vary. We also show that different input parameters may result in very similar simulated measurements. Erroneous interpretation of observations in terms of particle formation and growth rates, and of the time span and areal extent of new particle formation, is possible if the spatial effects are not accounted for.
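As a concrete illustration of the approach described above, the sketch below follows a single air parcel advected towards a fixed site by a one-dimensional wind, accumulating new particles and growth along the way. All function names, parameter values and the Gaussian shape of the upwind formation burst are illustrative assumptions, not the model configuration or values used in the study.

```python
import numpy as np

# Minimal sketch (hypothetical, not the authors' code) of a zero-dimensional
# particle population advected through a one-dimensional domain: an air parcel
# arriving at a fixed site is tracked back upwind, and the formation rate
# J(x, t) and growth rate GR(x, t) it experienced along the way determine what
# the site "observes".

def J(x, t):
    """Particle formation rate (cm^-3 s^-1), varying in space and time."""
    return 0.1 * np.exp(-((x - 20e3) / 10e3) ** 2)  # burst centred 20 km upwind

def GR(x, t):
    """Particle growth rate (nm h^-1), here spatially uniform."""
    return 3.0

def observed_at_site(t_obs, wind_speed=5.0, dt=60.0, d0=1.5):
    """Diameter (nm) and cumulative number formed in the parcel arriving at
    the fixed site (x = 0) at time t_obs, integrated along its back-trajectory."""
    n_steps = int(t_obs / dt)
    d, n = d0, 0.0
    for i in range(n_steps):
        t = i * dt
        x = wind_speed * (t_obs - t)      # parcel position upwind of the site
        n += J(x, t) * dt                 # particles formed within the parcel
        d += GR(x, t) * dt / 3600.0       # growth (nm h^-1 -> nm per time step)
    return d, n

# Simulated "measurements" at the stationary site:
for hours in (1, 2, 4):
    d, n = observed_at_site(hours * 3600.0)
    print(f"after {hours} h of transport: diameter ~{d:.1f} nm, number ~{n:.0f} cm^-3")
```

With an upwind formation burst like this, the fixed site records growing particles and rising number concentrations even when nothing is forming locally, which illustrates the kind of ambiguity in interpreting fixed-point observations that the abstract describes.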
Abstract:
An overview of organization in the construction industry is derived from plans of work published in the UK. This provides a basis for identifying the essential steps through which any construction project must pass. It is shown that all construction projects pass through a set of work stages consisting of inception, feasibility, scheme design, detail design, contract formation, construction and commissioning. Although the sequence and relative importance of these stages may vary, their identification helps in making judgements about organizational structure on construction projects.
Abstract:
A large ensemble of general circulation model (GCM) integrations coupled to a fully interactive sulfur cycle scheme was run on the climateprediction.net platform to investigate the uncertainty in the climate response to sulfate aerosol and carbon dioxide (CO2) forcing. The sulfate burden within the model (and the atmosphere) depends on the balance between formation processes and deposition (wet and dry). The wet removal processes for sulfate aerosol are much faster than dry removal, so any changes in atmospheric circulation, cloud cover, and precipitation will feed back on the sulfate burden. When CO2 was doubled in the Hadley Centre Slab Ocean Model (HadSM3), global mean precipitation increased by 5%; however, the global mean sulfate burden increased by 10%. Despite the global mean increase in precipitation, there were large areas of the model showing decreases in precipitation (and cloud cover) in the Northern Hemisphere during June–August, which reduced wet deposition and allowed the sulfate burden to increase. Further experiments were also undertaken with and without doubling CO2 while including a future anthropogenic sulfur emissions scenario. Doubling CO2 further enhanced the increases in sulfate burden associated with increased anthropogenic sulfur emissions, as observed in the doubled-CO2-only experiment. The implications are that the climate response to doubling CO2 can influence the amount of sulfate within the atmosphere and, despite increases in global mean precipitation, may act to increase it.
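One way to see why weaker wet removal raises the burden is a simple steady-state budget (an illustrative simplification, not the model's interactive sulfur scheme; the rate coefficients are hypothetical):

```latex
% Illustrative sulfate burden budget: source S, wet and dry removal rates
\frac{dB}{dt} = S - \left(k_{\mathrm{wet}} + k_{\mathrm{dry}}\right) B
\quad\Longrightarrow\quad
B_{\mathrm{ss}} = \frac{S}{k_{\mathrm{wet}} + k_{\mathrm{dry}}},
\qquad k_{\mathrm{wet}} \gg k_{\mathrm{dry}}
```

In this picture, a regional decrease in precipitation lowers k_wet and raises the steady-state burden B_ss even when the source S and the global mean precipitation both increase, consistent with the behaviour described above.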