825 results for Range-finding
Abstract:
The aim of this paper is to develop a comprehensive taxonomy of green supply chain management (GSCM) practices and a structural equation modelling-driven decision support system, following the GSCM taxonomy, to provide managers with a better understanding of the complex relationships between external and internal factors and GSCM operational practices. Typology and/or taxonomy play a key role in the development of social science theories. Current taxonomies focus on a single or limited component of the supply chain. Furthermore, they have not been tested using different sample compositions and contexts, yet replication is a prerequisite for developing robust concepts and theories. In this paper, we empirically replicate one such taxonomy, extending the original study by (a) developing a broad taxonomy containing the key components of the supply chain; (b) broadening the sample to include a wider range of sectors and organisational sizes; and (c) broadening the geographic scope of the previous studies. Moreover, we include both objective measures and subjective attitudinal measurements. We use a robust two-stage cluster analysis to develop our GSCM taxonomy. The main finding validates the previously proposed taxonomy and identifies size, attitude and level of environmental risk and impact as key mediators between internal drivers, external drivers and GSCM operational practices.
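The two-stage cluster analysis mentioned above can be sketched as follows: a hierarchical stage suggests the number of clusters and seeds a k-means stage. This is a minimal illustration on synthetic data — the study's actual survey items, sample and cluster solution are not reproduced here.

```python
# Illustrative two-stage cluster analysis (synthetic stand-in data;
# the study's actual measures and firms are not reproduced here).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
# Stand-in for standardised GSCM practice scores (firms x items).
X = rng.normal(size=(120, 6))

# Stage 1: Ward's hierarchical clustering to suggest the number of clusters.
Z = linkage(X, method="ward")
k = 3  # in practice, read off the dendrogram / agglomeration schedule
initial = fcluster(Z, t=k, criterion="maxclust")

# Stage 2: k-means, seeded with the hierarchical cluster centroids.
seeds = np.vstack([X[initial == c].mean(axis=0) for c in range(1, k + 1)])
centroids, labels = kmeans2(X, seeds, minit="matrix")
print(np.bincount(labels))
```

Seeding k-means with the hierarchical centroids is what makes the procedure "two-stage": the final partition is stable against the arbitrary starting points that plague plain k-means.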
Abstract:
This paper is an attempt to explore the challenges of defining intangible heritage and ‘community’ in England. It uses as its case study the Museum of English Rural Life, University of Reading: an urban museum with a rural theme. The paper examines current theoretical discourse around the concept of the ‘first voice’ and debate about the role of museums in the preservation of intangible heritage. It then examines the relevance of these concepts to the identification of ‘rural’ intangible heritage stakeholders in England. In this way, it shows the potential for concepts of intangible heritage to influence national museums. However, by applying theory and practice which is designed to support work with well-defined ‘originating communities’ to a national museum, it also highlights the challenges of initiating community engagement in a multicultural society.
Abstract:
Incomplete understanding of three aspects of the climate system—equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing—and the physical processes underlying them leads to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century1,2. Explorations of these uncertainties have so far relied on scaling approaches3,4, large ensembles of simplified climate models1,2, or small ensembles of complex coupled atmosphere–ocean general circulation models5,6 which under-represent uncertainties in key climate system properties derived from independent sources7–9. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report10, but extends towards larger warming than observed in ensembles-of-opportunity5 typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.
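The observational-constraint step described above — retaining only ensemble members that reproduce the recent temperature record, then reporting the projected range from the survivors — can be sketched in a few lines. All numbers below are synthetic placeholders, not values from the actual perturbed-physics ensemble.

```python
# Minimal sketch of an observationally constrained projection range
# (synthetic numbers; not the actual ensemble or observations).
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Simulated 50-year historical trend (K/decade) for each ensemble member,
# and an (assumed) linear relation between past trend and 2050 warming.
hist_trend = rng.normal(0.13, 0.05, n)
warming_2050 = 1.0 + 12.0 * hist_trend + rng.normal(0.0, 0.3, n)

obs_trend, obs_err = 0.13, 0.03          # illustrative observed trend +/- error
keep = np.abs(hist_trend - obs_trend) < 2 * obs_err

lo, hi = np.percentile(warming_2050[keep], [5, 95])
print(f"5-95% warming range by 2050: {lo:.2f}-{hi:.2f} K")
```

The point of the screening is that the surviving members still span a wide warming range: consistency with the observed record constrains, but does not eliminate, the projection uncertainty.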
Abstract:
Wall plaster sequences from the Neolithic town of Çatalhöyük have been analysed and compared to three types of natural sediment found in the vicinity of the site, using a range of analytical techniques. Block samples containing the plaster sequences were removed from the walls of several different buildings on the East Mound. Sub-samples were examined by IR spectroscopy, X-ray diffraction and X-ray fluorescence to determine the overall mineralogical and elemental composition, whilst thin sections were studied using optical polarising microscopy, IR Microscopy and Environmental Scanning Electron Microscopy with Energy Dispersive X-ray analysis. The results of this study have shown that there are two types of wall plaster found in the sequences and that the sediments used to produce these were obtained from at least two distinct sources. In particular, the presence of clay, calcite and magnesian calcite in the foundation plasters suggested that these were prepared predominantly from a marl source. On the other hand, the finishing plasters were found to contain dolomite with a small amount of clay and no calcite, revealing that softlime was used in their preparation. Whilst marl is located directly below and around Çatalhöyük, the nearest source of softlime is 6.5 km away, an indication that the latter was important to the Neolithic people, possibly due to the whiter colour (5Y 8/1) of this sediment. Furthermore, the same two plaster types were found on each wall of Building 49, the main building studied in this research, and in all five buildings investigated, suggesting that the use of these sources was an established practice for the inhabitants of several different households across the site.
Abstract:
Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. 
Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.
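The multi-model-mean evaluation described above boils down to averaging the model ensemble at each site and month, then comparing against observations to expose common seasonal biases and inter-model diversity. A minimal sketch with random stand-in values (a real evaluation would use size-resolved number concentrations at each supersite):

```python
# Sketch of a multi-model-mean comparison against site observations
# (random stand-in values; not the actual 12-model dataset).
import numpy as np

rng = np.random.default_rng(2)
n_months, n_models = 12, 12
# Observed monthly mean number concentration with a seasonal cycle (cm^-3).
obs = 800 + 300 * np.sin(np.linspace(0, 2 * np.pi, n_months))
# Each model over- or under-estimates by its own factor.
models = obs * rng.uniform(0.5, 1.6, size=(n_models, 1))

mmm = models.mean(axis=0)                 # multi-model mean, per month
bias = mmm / obs                          # normalised bias (1.0 = perfect)
diversity = models.std(axis=0) / mmm      # relative inter-model spread
print(bias.round(2), diversity.round(2))
```

A per-month bias factor near one with large `diversity` is exactly the situation the abstract describes: a skilful mean masking individual models in poor agreement with the observations.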
Abstract:
Among the range of materials used in bioengineering, parylene-C has been used in combination with silicon oxide, and in the presence of serum proteins, for cell patterning. However, the structural properties of adsorbed serum proteins on these substrates still remain elusive. In this study, we use an optical biosensing technique to decipher the properties of fibronectin (Fn) and serum albumin adsorbed on parylene-C and silicon oxide substrates. Our results show the formation of layers with distinct structural and adhesive properties. Thin, dense layers are formed on parylene-C, whereas thicker, more diffuse layers are formed on silicon oxide. These results suggest that Fn acquires a compact structure on parylene-C and a more extended structure on silicon oxide. Nonetheless, parylene-C and silicon oxide substrates coated with Fn host cell populations that exhibit focal adhesion complexes and good cell attachment. Albumin adopts a deformed structure on parylene-C and a globular structure on silicon oxide, and does not support significant cell attachment on either surface. Interestingly, the co-incubation of Fn and albumin at the ratio found in serum results in the preferential adsorption of albumin on parylene-C and Fn on silicon oxide. This finding is supported by the exclusive formation of focal adhesion complexes in differentiated mouse embryonic stem cells (CGR8) cultured on Fn/albumin-coated silicon oxide, but not on parylene-C. The detailed information provided in this study on the distinct properties of layers of serum proteins on substrates such as parylene-C and silicon oxide is highly significant in developing methods for cell patterning.
Abstract:
Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends this research to all of the available issues of two leading IS journals with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt and represented as a mind-map. In the period 1990-2009, 681 incidents of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than prior research. In addition, codes were assigned to all issues of MIS Quarterly from commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective or a finding; this represented approximately 34 per cent of codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. The possibly most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications.
It provides potential benefits to practitioners by providing insight into the use of intuition in IS management, for example, emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.
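The first mechanical step of the method described above — locating every incident of "intuition" and related terms in a text corpus before human coding — is simple to illustrate. The snippets below are hypothetical; the actual corpus was the full searchable text of MISQ and ISR, 1990-2009.

```python
# Toy illustration of the term-finding step that precedes manual coding
# (hypothetical article snippets, not the actual corpus).
import re
from collections import Counter

articles = [
    "The result is intuitive and matches our intuition about adoption.",
    "Intuitively, managers rely on intuition when data are scarce.",
]
# Match "intuition", "intuitive", "intuitively", etc.
pattern = re.compile(r"\bintuit\w*", flags=re.IGNORECASE)

counts = Counter()
for text in articles:
    for match in pattern.findall(text):
        counts[match.lower()] += 1
print(counts)
```

Each located incident would then be read in context and assigned a Grounded Theory code (e.g. "Intuition as Authority"); the counting itself only establishes the population of incidents to code.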
Abstract:
Historical, artefactual and place-name evidence indicates that Scandinavian migrants moved to eastern England in the ninth century AD, settling in the Danelaw. However, only a handful of characteristically Scandinavian burials have been found in the region. One widely held explanation is that most of these Scandinavian settlers quickly adopted local Christian burial customs, thus leaving Scandinavians indistinguishable from the Anglo-Saxon population. We undertook osteological and isotopic analysis to investigate the presence of first-generation Scandinavian migrants. Burials from Masham were typical of the later Anglo-Saxon period and included men, women and children. The location and positioning of the four adult burials from Coppergate, however, are unusual for Anglo-Scandinavian York. None of the skeletons showed evidence of interpersonal violence. Isotopic evidence did not suggest a marine component in the diet of either group, but revealed migration on a regional, and possibly an international, scale. Combined strontium and oxygen isotope analysis should be used to investigate further both regional and Scandinavian migration in the later Anglo-Saxon period.
Abstract:
Current UK intake of non-milk extrinsic sugars (NMES) is above recommendations. Reducing the sugar content of processed high sugar foods through reformulation is one option for reducing consumption of NMES at a population level. However, reformulation can alter the sensory attributes of food products and influence consumer liking. This study evaluated consumer acceptance of a selection of products that are commercially available in the UK; these included regular and sugar-reduced baked beans, strawberry jam, milk chocolate, cola and cranberry & raspberry juice. Sweeteners were present in the reformulated chocolate (maltitol), cola (aspartame and acesulfame-K) and juice (sucralose) samples. Healthy, non-smoking consumers (n = 116; 55 men, 61 women, age: 33 ± 9 years; BMI: 25.7 ± 4.6 kg/m2) rated the products for overall liking and on liking of appearance, flavor and texture using a nine-point hedonic scale. There were significant differences between standard and reduced sugar products in consumers’ overall liking and on liking of each modality (appearance, flavor and texture; all P < 0.0001). For overall liking, only the regular beans and cola were significantly more liked than their reformulated counterparts (P < 0.0001). Cluster analysis identified three consumer clusters that were representative of different patterns of consumer liking. For the largest cluster (cluster 3: 45%), there was a significant difference in mean liking scores across all products, except jam. Differences in liking were predominantly driven by sweet taste in 2 out of 3 clusters. The current research has demonstrated that a high proportion of consumers prefer conventional products over sugar-reduced products across a wide range of product types (45%) or across selected products (27%), when tasted unbranded, and so there is room for further optimization of the commercial reduced sugar products that were evaluated in the current study.
Future work should evaluate strategies to facilitate compliance to dietary recommendations on NMES and free sugars, such as the impact of sugar-reduced food exposure on their acceptance.
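The consumer segmentation step described above — clustering panellists by their liking scores to reveal distinct preference patterns — can be sketched as follows. The ratings are synthetic; `k=3` simply mirrors the three clusters the study reports.

```python
# Sketch of segmenting consumers by 9-point hedonic liking scores
# (synthetic ratings; not the study's panel data).
import numpy as np
from scipy.cluster.vq import kmeans2

np.random.seed(3)
rng = np.random.default_rng(3)
# 116 consumers x 10 products (regular and reduced-sugar versions).
ratings = rng.integers(1, 10, size=(116, 10)).astype(float)

centroids, labels = kmeans2(ratings, 3, minit="++")
for c in range(3):
    share = (labels == c).mean()
    print(f"cluster {c}: {share:.0%} of consumers")
```

Comparing each cluster's centroid for a regular product against its reduced-sugar counterpart is then what identifies segments whose liking is driven by sweet taste.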
Abstract:
Many theories for the Madden-Julian oscillation (MJO) focus on diabatic processes, particularly the evolution of vertical heating and moistening. Poor MJO performance in weather and climate models is often blamed on biases in these processes and their interactions with the large-scale circulation. We introduce one of three components of a model-evaluation project, which aims to connect MJO fidelity in models to their representations of several physical processes, focusing on diabatic heating and moistening. This component consists of 20-day hindcasts, initialised daily during two MJO events in winter 2009-10. The 13 models exhibit a range of skill: several have accurate forecasts to 20 days' lead, while others perform similarly to statistical models (8-11 days). Models that maintain the observed MJO amplitude accurately predict propagation, but not vice versa. We find no link between hindcast fidelity and the precipitation-moisture relationship, in contrast to other recent studies. There is also no relationship between models' performance and the evolution of their diabatic-heating profiles with rain rate. A more robust association emerges between models' fidelity and net moistening: the highest-skill models show a clear transition from low-level moistening for light rainfall to mid-level moistening at moderate rainfall and upper-level moistening for heavy rainfall. The mid-level moistening, arising from both dynamics and physics, may be most important. Accurately representing many processes may be necessary, but not sufficient for capturing the MJO, which suggests that models fail to predict the MJO for a broad range of reasons and limits the possibility of finding a panacea.
Abstract:
An analysis of diabatic heating and moistening processes from 12-36 hour lead time forecasts from 12 general circulation models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12-36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up of the models as they adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, moistening and heating profiles have large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time step behaviour shows that some models exhibit strong intermittency in rainfall, and that the relationship between precipitation and dynamics differs between models. The wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process model experiments, and inform the priorities for field experiments and future observing systems.
Abstract:
Subdermal magnetic implants originated as an art form in the world of body modification. To date, an in-depth scientific analysis of the benefits of this implant has yet to be established. This research explores the concept of sensory extension of the tactile sense utilising this form of implantation. This relatively simple procedure enables the tactile sense to respond to static and alternating magnetic fields. This is not to say that the underlying biology of the system has changed; the concept does not increase our tactile frequency response range or sensitivity to pressure, but it does invoke a perceptual response to a stimulus that is not innately available to humans. Within this research, two social surveys were conducted to ascertain, first, the social acceptance of the general notion of human enhancement and, second, the perceptual experiences of individuals with the magnetic implants themselves. In terms of acceptance of the notion of sensory improvement (via implantation), ~39% of the general population questioned responded positively, with a further ~25% of respondents giving an indecisive response. Thus, with careful dissemination, a large proportion of individuals may adopt a technology such as this if it were to become available to consumers. Interestingly, of the responses collected in the magnetic implant survey, ~60% of respondents actually underwent the implant for magnetic vision purposes. The main contribution of this research, however, comes from a series of psychophysical tests in which 7 subjects with subdermal magnetic implants were compared with 7 subjects who had similar magnets superficially attached to their dermis. The experiments examined multiple psychometric thresholds, including intensity, frequency and temporal thresholds.
Whilst relatively simple, the experimental setup for the perceptual experimentation was novel in that custom hardware and protocols were created to determine the subjective thresholds of the individuals. The overall purpose of this research is to utilise this concept in high-stress scenarios, such as driving or piloting, whereby alerts and warnings could be relayed to an operator without intruding upon their other (typically overloaded) exterior senses (i.e. the auditory and visual senses). Hence, each of the thresholding experiments was designed with the intention of utilising the results in the design of signals for information transfer. The findings from the study show that the implanted group significantly outperformed the superficial group in the absolute intensity threshold experiment, i.e. the implanted group required significantly less force than the superficial group in order to perceive the stimulus. The results for the frequency difference threshold showed no significant difference between the two groups. Interestingly, however, at low frequencies (20 and 50 Hz), the subjects' ability to discriminate frequencies significantly increased with more complex waveforms (square and sawtooth) compared with the typically used sine wave. Furthermore, a novel protocol for establishing the temporal gap detection threshold during a temporal numerosity study has been established in this thesis. This experiment measured the subjects' capability to correctly determine the number of concatenated signals presented to them whilst the time between the signals, referred to as pulses, tended to zero. A significant finding was that when the length, frequency and number of cycles of the pulses were altered, the time between pulses required for correct recognition also altered. This finding will ultimately aid the design of the tactile alerts for this method of information transfer.
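Absolute thresholds of the kind measured here are commonly estimated with an adaptive staircase. The sketch below is a generic 1-up/2-down staircase against a simulated noisy observer — it illustrates the class of procedure, not the thesis's actual custom protocol, and all parameter values are hypothetical.

```python
# Generic 1-up/2-down adaptive staircase (simulated observer; illustrative
# parameters, not the thesis's actual hardware or protocol).
import random

def staircase(true_threshold, start=10.0, step=0.5, n_reversals=8):
    """Estimate a detection threshold from staircase reversal levels."""
    random.seed(0)                       # reproducible simulated observer
    level, run, direction, reversals = start, 0, 0, []
    while len(reversals) < n_reversals:
        # The simulated observer detects when the level exceeds a noisy
        # internal threshold.
        detected = level >= true_threshold + random.gauss(0.0, 0.3)
        if detected:
            run += 1
            if run == 2:                 # two detections in a row: step down
                run = 0
                if direction == +1:
                    reversals.append(level)
                direction, level = -1, level - step
        else:
            run = 0                      # a miss: step up
            if direction == -1:
                reversals.append(level)
            direction, level = +1, level + step
    # Discard the first reversals (approach phase), average the rest.
    return sum(reversals[2:]) / len(reversals[2:])

print(round(staircase(4.0), 2))
```

The 2-down/1-up rule converges on the stimulus level detected about 71% of the time, which is why it is a standard choice for absolute intensity thresholds.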
Preliminary development work on the use of this method of input to the body in an automotive scenario is also presented in this thesis, in the form of a driving simulation whose overall goal is to present warning alerts to a driver, such as rear-end collision or excessive speed warnings, in order to prevent incidents and penalties. Discussion of the broader utility of this implant is also presented, reflecting on its potential use as a basis for vibrotactile and sensory substitution devices. The discussion concludes with postulations on its use as a human-machine interface, as well as how a similar implant could be used within the ear as a hearing aid device.
Abstract:
Free range egg producers face continuing problems from injurious pecking (IP), which has financial consequences for farmers and poor welfare implications for birds. Beak trimming has been practised for many years to limit the damage caused by IP, but with the UK Government giving notification that they intend to ban beak trimming in 2016, considerable efforts have been made to devise feasible housing, range and management strategies to reduce IP. A recent research project investigated the efficacy of a range of IP-reducing management strategies, the mean costs of which came to around 5 pence per bird. Here, the results of the above project’s consumer survey are presented: consumers’ attitudes to free range egg production are detailed showing that, whilst consumers had a very positive attitude towards free range eggs, they were poorly informed about some aspects of free range egg production. The contingent valuation technique was used to estimate the price premium consumers would be prepared to pay to ensure that hens do not suffer from IP: this was calculated as just over 3% on top of the prevailing retail price of free range eggs. These findings reinforce other studies that have found that whilst consumers are not generally well-informed about certain specific welfare problems faced by animals under free range conditions, they are prepared to pay to improve animal welfare. Indeed, the study findings suggest that producers could obtain an additional price premium if they demonstrate the welfare provenance of their eggs, perhaps through marketing the eggs as coming from birds with intact beaks. This welfare provenance issue could usefully be assured to consumers by the introduction of a mandatory, single, accredited EU-wide welfare-standards labelling scheme.
Abstract:
Considerable progress has been made in understanding the present and future regional and global sea level in the 2 years since the publication of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. Here, we evaluate how the new results affect the AR5’s assessment of (i) historical sea level rise, including attribution of that rise and implications for the sea level budget, (ii) projections of the components and of total global mean sea level (GMSL), and (iii) projections of regional variability and emergence of the anthropogenic signal. In each of these cases, new work largely provides additional evidence in support of the AR5 assessment, providing greater confidence in those findings. Recent analyses confirm the twentieth century sea level rise, with some analyses showing a slightly smaller rate before 1990 and some a slightly larger value than reported in the AR5. There is now more evidence of an acceleration in the rate of rise. Ongoing ocean heat uptake and associated thermal expansion have continued since 2000, and are consistent with ocean thermal expansion reported in the AR5. A significant amount of heat is being stored deeper in the water column, with a larger rate of heat uptake since 2000 compared to the previous decades and with the largest storage in the Southern Ocean. The first formal detection studies for ocean thermal expansion and glacier mass loss since the AR5 have confirmed the AR5 finding of a significant anthropogenic contribution to sea level rise over the last 50 years. New projections of glacier loss from two regions suggest smaller contributions to GMSL rise from these regions than in studies assessed by the AR5; additional regional studies are required to further assess whether there are broader implications of these results. 
Mass loss from the Greenland Ice Sheet, primarily as a result of increased surface melting, and from the Antarctic Ice Sheet, primarily as a result of increased ice discharge, has accelerated. The largest estimates of acceleration in mass loss from the two ice sheets for 2003–2013 equal or exceed the acceleration of GMSL rise calculated from the satellite altimeter sea level record over the longer period of 1993–2014. However, when increased mass gain in land water storage and parts of East Antarctica, and decreased mass loss from glaciers in Alaska and some other regions are taken into account, the net acceleration in the ocean mass gain is consistent with the satellite altimeter record. New studies suggest that a marine ice sheet instability (MISI) may have been initiated in parts of the West Antarctic Ice Sheet (WAIS), but that it will affect only a limited number of ice streams in the twenty-first century. New projections of mass loss from the Greenland and Antarctic Ice Sheets by 2100, including a contribution from parts of WAIS undergoing unstable retreat, suggest a contribution that falls largely within the likely range (i.e., two thirds probability) of the AR5. These new results increase confidence in the AR5 likely range, indicating that there is a greater probability that sea level rise by 2100 will lie in this range with a corresponding decrease in the likelihood of an additional contribution of several tens of centimeters above the likely range. In view of the comparatively limited state of knowledge and understanding of rapid ice sheet dynamics, we continue to think that it is not yet possible to make reliable quantitative estimates of future GMSL rise outside the likely range. Projections of twenty-first century GMSL rise published since the AR5 depend on results from expert elicitation, but we have low confidence in conclusions based on these approaches. 
New work on regional projections and emergence of the anthropogenic signal suggests that the two commonly predicted features of future regional sea level change (the increasing tilt across the Antarctic Circumpolar Current and the dipole in the North Atlantic) are related to regional changes in wind stress and surface heat flux. Moreover, it is expected that sea level change in response to anthropogenic forcing, particularly in regions of relatively low unforced variability such as the low-latitude Atlantic, will be detectable over most of the ocean by 2040. The east-west contrast of sea level trends in the Pacific observed since the early 1990s cannot be satisfactorily accounted for by climate models, nor yet definitively attributed either to unforced variability or forced climate change.