27 results for Cant.
in CentAUR: Central Archive University of Reading - UK
Abstract:
This mixed-method study tracked social interaction and adaptation among 20 international postgraduates on a 1-year programme in the UK, examining assumptions that language proficiency and interactional engagement directly underpin sociocultural adaptation. Participants remained frustrated by a perceived ‘threshold’ barring successful interaction with English speakers, while reporting reluctance to take up available opportunities, independent of language proficiency and sociocultural adaptation. We challenge linear models of adaptation and call for assistance to international students in crossing the threshold to successful interaction.
Abstract:
This study examined the development of EFL proficiency in an immersion environment. Adult Chinese speakers of English were tested at the beginning and end of a ten-month period of immersion in the UK on their acquisition of English question forms using a timed grammaticality judgement task. Participants showed significantly faster response times after ten months, but no significant difference in accuracy of target-like judgements, suggesting that immersion benefits fluency more than accuracy.
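A minimal sketch of the pre/post comparison this abstract describes, assuming one response-time measurement per participant at the start and end of the immersion period; the data and sample size below are hypothetical, not the study's:

```python
# Paired t-test on response times (RTs) from a timed grammaticality
# judgement task, measured at the start and end of immersion.
# All values are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rt_start = rng.normal(1800.0, 250.0, 30)         # RTs (ms) on arrival
rt_end = rt_start - rng.normal(200.0, 80.0, 30)  # faster after ten months

t_stat, p_value = stats.ttest_rel(rt_start, rt_end)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4g}")
```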
Abstract:
A large number of urban surface energy balance models now exist with different assumptions about the important features of the surface and exchange processes that need to be incorporated. To date, no comparison of these models has been conducted; in contrast, models for natural surfaces have been compared extensively as part of the Project for Intercomparison of Land-surface Parameterization Schemes. Here, the methods and first results from an extensive international comparison of 33 models are presented. The aim of the comparison overall is to understand the complexity required to model energy and water exchanges in urban areas. The degree of complexity included in the models is outlined and impacts on model performance are discussed. During the comparison there have been significant developments in the models with resulting improvements in performance (root-mean-square error falling by up to two-thirds). Evaluation is based on a dataset containing net all-wave radiation, sensible heat, and latent heat flux observations for an industrial area in Vancouver, British Columbia, Canada. The aim of the comparison is twofold: to identify those modeling approaches that minimize the errors in the simulated fluxes of the urban energy balance and to determine the degree of model complexity required for accurate simulations. There is evidence that some classes of models perform better for individual fluxes but no model performs best or worst for all fluxes. In general, the simpler models perform as well as the more complex models based on all statistical measures. Generally the schemes have best overall capability to model net all-wave radiation and least capability to model latent heat flux.
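The evaluation statistic the abstract cites (root-mean-square error between simulated and observed fluxes) is simple to state; here is a minimal sketch, with hypothetical flux arrays standing in for the Vancouver observations and any one model's output:

```python
# RMSE between simulated and observed urban energy-balance fluxes.
# Flux names follow the abstract (net all-wave radiation Q*, sensible
# heat QH, latent heat QE); all numbers are illustrative placeholders.
import numpy as np

def rmse(simulated, observed):
    """Root-mean-square error between two flux series (W m^-2)."""
    diff = np.asarray(simulated) - np.asarray(observed)
    return float(np.sqrt(np.mean(diff ** 2)))

obs = {"Q*": [420.0, 380.0, 150.0], "QH": [180.0, 160.0, 60.0], "QE": [90.0, 85.0, 30.0]}
sim = {"Q*": [400.0, 390.0, 140.0], "QH": [200.0, 150.0, 70.0], "QE": [60.0, 70.0, 20.0]}

for flux in obs:
    print(f"{flux}: RMSE = {rmse(sim[flux], obs[flux]):.1f} W m^-2")
```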
Abstract:
Dense deployments of wireless local area networks (WLANs) are becoming a norm in many cities around the world. However, increased interference and traffic demands can severely limit the aggregate throughput achievable unless an effective channel assignment scheme is used. In this work, a simple and effective distributed channel assignment (DCA) scheme is proposed. It is shown that in order to maximise throughput, each access point (AP) simply chooses the channel with the minimum number of active neighbour nodes (i.e. nodes associated with neighbouring APs that have packets to send). However, the practical application of such a scheme depends critically on its ability to estimate the number of neighbour nodes in each channel, for which no practical estimator has been proposed before. In view of this, an extended Kalman filter (EKF) estimator and a per-AP estimate of the number of active nodes are proposed. These not only provide fast and accurate estimates but can also exploit channel switching information of neighbouring APs. Extensive packet level simulation results show that the proposed minimum neighbour and EKF estimator (MINEK) scheme is highly scalable and can provide significant throughput improvement over other channel assignment schemes.
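The core selection rule is easy to sketch: each AP keeps a per-channel estimate of active neighbour nodes and switches to the channel with the smallest estimate. The exponential smoother below is a crude stand-in for the paper's EKF estimator, and all names and numbers are illustrative:

```python
# Channel selection by minimum estimated active-neighbour count.
# A simple exponential smoother stands in for the EKF estimator.

def select_channel(estimates):
    """Pick the channel with the fewest estimated active neighbour nodes."""
    return min(estimates, key=estimates.get)

def update_estimate(prev, observed, gain=0.3):
    """One smoothing step (a crude stand-in for an EKF measurement update)."""
    return prev + gain * (observed - prev)

estimates = {1: 4.0, 6: 2.5, 11: 3.2}   # non-overlapping channels 1/6/11
estimates[6] = update_estimate(estimates[6], observed=5.0)
print("AP selects channel", select_channel(estimates))
```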
Abstract:
The vertical structure of the relationship between water vapor and precipitation is analyzed in 5 yr of radiosonde and precipitation gauge data from the Nauru Atmospheric Radiation Measurement (ARM) site. The first vertical principal component of specific humidity is very highly correlated with column water vapor (CWV) and has a maximum of both total and fractional variance captured in the lower free troposphere (around 800 hPa). Moisture profiles conditionally averaged on precipitation show a strong association between rainfall and moisture variability in the free troposphere and little boundary layer variability. A sharp pickup in precipitation occurs near a critical value of CWV, confirming satellite-based studies. A lag–lead analysis suggests it is unlikely that the increase in water vapor is just a result of the falling precipitation. To investigate mechanisms for the CWV–precipitation relationship, entraining plume buoyancy is examined in sonde data and simplified cases. For several different mixing schemes, higher CWV results in progressively greater plume buoyancies, particularly in the upper troposphere, indicating conditions favorable for deep convection. All other things being equal, higher values of lower-tropospheric humidity, via entrainment, play a major role in this buoyancy increase. A small but significant increase in subcloud layer moisture with increasing CWV also contributes to buoyancy. Entrainment coefficients inversely proportional to distance from the surface, associated with mass flux increase through a deep lower-tropospheric layer, appear promising. These yield a relatively even weighting through the lower troposphere for the contribution of environmental water vapor to midtropospheric buoyancy, explaining the association of CWV and buoyancy available for deep convection.
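The "sharp pickup near a critical value of CWV" can be illustrated by conditionally averaging precipitation in CWV bins, the same style of analysis the abstract describes; the data below are synthetic and merely built to show the shape, not the Nauru ARM observations:

```python
# Bin precipitation by column water vapor (CWV) and print the conditional
# mean per bin, exposing a sharp pickup above an assumed critical value.
import numpy as np

rng = np.random.default_rng(0)
cwv = rng.uniform(30.0, 70.0, 5000)              # CWV (mm), synthetic
critical = 60.0                                  # assumed critical value
rain = np.where(cwv > critical, 2.0 * (cwv - critical), 0.0)
rain += rng.exponential(0.05, cwv.size)          # weak background rain

bins = np.arange(30.0, 72.0, 4.0)
idx = np.digitize(cwv, bins)
for i in range(1, len(bins)):
    sel = idx == i
    if sel.any():
        print(f"CWV {bins[i-1]:.0f}-{bins[i]:.0f} mm: mean rain = {rain[sel].mean():.2f} mm/h")
```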
Abstract:
I argue for the following claims: [1] all uses of I (the word ‘I’ or thought-element I) are absolutely immune to error through misidentification relative to I. [2] no genuine use of I can fail to refer. Nevertheless [3] I isn’t univocal: it doesn’t always refer to the same thing, or kind of thing, even in the thought or speech of a single person. This is so even though [4] I always refers to its user, the subject of experience who speaks or thinks, and although [5] if I’m thinking about something specifically as myself, I can’t fail to be thinking of myself, and although [6] a genuine understanding use of I always involves the subject thinking of itself as itself, whatever else it does or doesn’t involve, and although [7] if I take myself to be thinking about myself, then I am thinking about myself.
Abstract:
A new tropopause definition involving a flow-dependent blending of the traditional thermal tropopause with one based on potential vorticity has been developed and applied to the European Centre for Medium-Range Weather Forecasts (ECMWF) reanalyses (ERA), ERA-40 and ERA-Interim. Global and regional trends in tropopause characteristics for annual and solsticial seasonal means are presented here, with emphasis on significant results for the newer ERA-Interim data for 1989-2007. The global-mean tropopause is rising at a rate of 47 m decade⁻¹, with pressure falling at 1.0 hPa decade⁻¹ and temperature falling at 0.18 K decade⁻¹. The Antarctic tropopause shows decreasing heights, warming, and increasing westerly winds. The Arctic tropopause also shows a warming, but with decreasing westerly winds. In the tropics the trends are small, but at the latitudes of the sub-tropical jets they are almost double the global values. It is found that these changes are mainly concentrated in the eastern hemisphere. Previous and new metrics for the rate of broadening of the tropics, based on both height and wind, give trends in the range 0.9° decade⁻¹ to 2.2° decade⁻¹. For ERA-40 the global height and pressure trends for the period 1979-2001 are similar: 39 m decade⁻¹ and −0.8 hPa decade⁻¹. These values are smaller than those found from the thermal tropopause definition with this data set, as was used in most previous studies.
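The per-decade trends quoted above come from fitting time series of tropopause properties; a minimal sketch of such a fit, on a synthetic height series rather than the ERA-Interim data, looks like this:

```python
# Least-squares trend of annual-mean tropopause height, reported per decade.
# The series below is synthetic, constructed with a ~47 m/decade rise.
import numpy as np

years = np.arange(1989, 2008)        # 1989-2007, as in the abstract
rng = np.random.default_rng(1)
height = 11000.0 + 4.7 * (years - years[0]) + rng.normal(0.0, 20.0, years.size)

slope_per_year = np.polyfit(years, height, deg=1)[0]
print(f"trend: {slope_per_year * 10:.0f} m per decade")
```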
Abstract:
Food security is one of this century’s key global challenges. By 2050 the world will require increased crop production in order to feed its predicted 9 billion people. This must be done in the face of changing consumption patterns, the impacts of climate change and the growing scarcity of water and land. Crop production methods will also have to sustain the environment, preserve natural resources and support livelihoods of farmers and rural populations around the world. There is a pressing need for the ‘sustainable intensification’ of global agriculture in which yields are increased without adverse environmental impact and without the cultivation of more land. Addressing the need to secure a food supply for the whole world requires an urgent international effort with a clear sense of long-term challenges and possibilities. Biological science, especially publicly funded science, must play a vital role in the sustainable intensification of food crop production. The UK has a responsibility and the capacity to take a leading role in providing a range of scientific solutions to mitigate potential food shortages. This will require significant funding of cross-disciplinary science for food security. The constraints on food crop production are well understood, but differ widely across regions. The availability of water and good soils are major limiting factors. Significant losses in crop yields occur due to pests, diseases and weed competition. The effects of climate change will further exacerbate the stresses on crop plants, potentially leading to dramatic yield reductions. Maintaining and enhancing the diversity of crop genetic resources is vital to facilitate crop breeding and thereby enhance the resilience of food crop production. Addressing these constraints requires technologies and approaches that are underpinned by good science. Some of these technologies build on existing knowledge, while others are completely radical approaches, drawing on genomics and high-throughput analysis. Novel research methods have the potential to contribute to food crop production through both genetic improvement of crops and new crop and soil management practices. Genetic improvements to crops can occur through breeding or genetic modification to introduce a range of desirable traits. The application of genetic methods has the potential to refine existing crops and provide incremental improvements. These methods also have the potential to introduce radical and highly significant improvements to crops by increasing photosynthetic efficiency, reducing the need for nitrogen or other fertilisers and unlocking some of the unrealised potential of crop genomes. The science of crop management and agricultural practice also needs to be given particular emphasis as part of a food security grand challenge. These approaches can address key constraints in existing crop varieties and can be applied widely. Current approaches to maximising production within agricultural systems are unsustainable; new methodologies that utilise all elements of the agricultural system are needed, including better soil management and enhancement and exploitation of populations of beneficial soil microbes. Agronomy, soil science and agroecology, the relevant sciences, have been neglected in recent years. Past debates about the use of new technologies for agriculture have tended to adopt an either/or approach, emphasising the merits of particular agricultural systems or technological approaches and the downsides of others.
This has been seen most obviously with respect to genetically modified (GM) crops, the use of pesticides and the arguments for and against organic modes of production. These debates have failed to acknowledge that there is no technological panacea for the global challenge of sustainable and secure global food production. There will always be trade-offs and local complexities. This report considers both new crop varieties and appropriate agroecological crop and soil management practices and adopts an inclusive approach. No techniques or technologies should be ruled out. Global agriculture demands a diversity of approaches, specific to crops, localities, cultures and other circumstances. Such diversity demands that the breadth of relevant scientific enquiry is equally diverse, and that science needs to be combined with social, economic and political perspectives. In addition to supporting high-quality science, the UK needs to maintain and build its capacity to innovate, in collaboration with international and national research centres. UK scientists and agronomists have in the past played a leading role in disciplines relevant to agriculture, but training in agricultural sciences and related topics has recently suffered from a lack of policy attention and support. Agricultural extension services, connecting farmers with new innovations, have been similarly neglected in the UK and elsewhere. There is a major need to review the support for and provision of extension services, particularly in developing countries. The governance of innovation for agriculture needs to maximise opportunities for increasing production, while at the same time protecting societies, economies and the environment from negative side effects. Regulatory systems need to improve their assessment of benefits. Horizon scanning will ensure proactive consideration of technological options by governments. Assessment of benefits, risks and uncertainties should be seen broadly, and should include the wider impacts of new technologies and practices on economies and societies. Public and stakeholder dialogue, with NGOs, scientists and farmers in particular, needs to be a part of all governance frameworks.
Abstract:
Mannitol is a polymorphic excipient which is usually used in pharmaceutical products as the beta form, although other polymorphs (alpha and delta) are common contaminants. Binary mixtures containing beta and delta mannitol were prepared to quantify the concentration of the beta form using FT-Raman spectroscopy. Spectral regions characteristic of each form were selected and peak intensity ratios of beta peaks to delta peaks were calculated. Using these ratios, a correlation curve was established which was then validated by analysing further samples of known composition. The results indicate that levels down to 2% beta could be quantified using this novel, non-destructive approach. Potential errors associated with quantitative studies using FT-Raman spectroscopy were also investigated. The principal source of variability arose from inhomogeneities on mixing of the samples; a significant reduction of these errors was observed by reducing and controlling the particle size range. The results show that FT-Raman spectroscopy can be used to rapidly and accurately quantify polymorphic mixtures.
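A minimal sketch of the calibration logic described: fit the peak-intensity ratio against known beta content, then invert the line for unknown samples. The compositions and ratios below are hypothetical, not the study's measurements:

```python
# Linear calibration of beta/delta FT-Raman peak-intensity ratio
# against known % beta, inverted to quantify an unknown mixture.
import numpy as np

beta_pct = np.array([2.0, 10.0, 25.0, 50.0, 75.0, 90.0])      # known % beta
peak_ratio = np.array([0.05, 0.22, 0.55, 1.10, 1.68, 2.00])   # I(beta)/I(delta)

slope, intercept = np.polyfit(beta_pct, peak_ratio, deg=1)

def predict_beta(ratio):
    """Estimate % beta from a measured peak-intensity ratio."""
    return (ratio - intercept) / slope

print(f"ratio 0.40 -> {predict_beta(0.40):.1f}% beta")
```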
Abstract:
The assimilation of observations with a forecast is often heavily influenced by the description of the error covariances associated with the forecast. When a temperature inversion is present at the top of the boundary layer (BL), a significant part of the forecast error may be described as a vertical positional error (as opposed to the amplitude error normally dealt with in data assimilation). In these cases, failing to account for positional error explicitly is shown to result in an analysis in which the inversion structure is erroneously weakened and degraded. In this article, a new assimilation scheme is proposed to explicitly include the positional error associated with an inversion. This is done through the introduction of an extra control variable to allow position errors in the a priori to be treated simultaneously with the usual amplitude errors. This new scheme, referred to as the ‘floating BL scheme’, is applied to the one-dimensional (vertical) variational assimilation of temperature. The floating BL scheme is tested with a series of idealised experiments and with real data from radiosondes. For each idealised experiment, the floating BL scheme gives an analysis in which the inversion structure and position agree with the truth, and outperforms the assimilation which accounts only for forecast amplitude error. When the floating BL scheme is used to assimilate a large sample of radiosonde data, its ability to give an analysis with an inversion height in better agreement with that observed is confirmed. However, it is found that Gaussian statistics are an inappropriate description of the error statistics of the extra control variable. This problem is alleviated by incorporating a non-Gaussian description of the new control variable in the scheme. Anticipated challenges in implementing the scheme operationally are discussed towards the end of the article.
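The central idea, an extra control variable for inversion position alongside the usual amplitude errors, can be caricatured in a few lines of 1-D variational code. Everything below (background profile, error variances, observations) is hypothetical, and the scheme is reduced to a single vertical shift plus a single amplitude offset:

```python
# Toy 1-D variational analysis with a positional control variable:
# the background inversion may be shifted vertically (shift) as well as
# adjusted in amplitude (amp). The synthetic observations are consistent
# with an inversion ~200 m higher than the background's.
import numpy as np
from scipy.optimize import minimize

z = np.linspace(0.0, 3000.0, 301)                    # height grid (m)
t_b = 288.0 - 0.0065 * z                             # background lapse rate (K)
t_b += 3.0 / (1.0 + np.exp(-(z - 1000.0) / 40.0))    # inversion near 1000 m

z_obs = np.array([800.0, 1100.0, 1400.0])            # sonde levels (m)
t_obs = np.array([284.10, 282.38, 283.18])           # imply inversion near 1200 m

sigma_o, sigma_a, sigma_s = 0.2, 1.0, 300.0          # obs / amplitude / position std devs

def cost(ctrl):
    shift, amp = ctrl
    t_at_obs = np.interp(z_obs - shift, z, t_b) + amp  # profile moved up by `shift`
    j_obs = np.sum(((t_at_obs - t_obs) / sigma_o) ** 2)
    j_bkg = (amp / sigma_a) ** 2 + (shift / sigma_s) ** 2
    return j_obs + j_bkg

result = minimize(cost, x0=[0.0, 0.0])
print(f"analysed shift: {result.x[0]:.0f} m, amplitude offset: {result.x[1]:.2f} K")
```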
Abstract:
We perform a multimodel detection and attribution study with climate model simulation output and satellite-based measurements of tropospheric and stratospheric temperature change. We use simulation output from 20 climate models participating in phase 5 of the Coupled Model Intercomparison Project. This multimodel archive provides estimates of the signal pattern in response to combined anthropogenic and natural external forcing (the fingerprint) and the noise of internally generated variability. Using these estimates, we calculate signal-to-noise (S/N) ratios to quantify the strength of the fingerprint in the observations relative to fingerprint strength in natural climate noise. For changes in lower stratospheric temperature between 1979 and 2011, S/N ratios vary from 26 to 36, depending on the choice of observational dataset. In the lower troposphere, the fingerprint strength in observations is smaller, but S/N ratios are still significant at the 1% level or better, and range from three to eight. We find no evidence that these ratios are spuriously inflated by model variability errors. After removing all global mean signals, model fingerprints remain identifiable in 70% of the tests involving tropospheric temperature changes. Despite such agreement in the large-scale features of model and observed geographical patterns of atmospheric temperature change, most models do not replicate the size of the observed changes. On average, the models analyzed underestimate the observed cooling of the lower stratosphere and overestimate the warming of the troposphere. Although the precise causes of such differences are unclear, model biases in lower stratospheric temperature trends are likely to be reduced by more realistic treatment of stratospheric ozone depletion and volcanic aerosol forcing.
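The S/N calculation sketched below follows the generic fingerprinting recipe the abstract outlines: project the observed change and unforced control-run segments onto the model fingerprint, then compare. All fields here are synthetic placeholders:

```python
# Project observations and control-run noise onto a unit-norm fingerprint
# pattern; S/N is the observed projection over the noise-projection spread.
import numpy as np

rng = np.random.default_rng(2)
npts = 500                                        # spatial grid points
fingerprint = rng.normal(size=npts)
fingerprint /= np.linalg.norm(fingerprint)        # unit-norm signal pattern

observed = 1.5 * fingerprint + rng.normal(0.0, 0.3, npts)   # forced signal + noise
noise_segments = rng.normal(0.0, 0.3, (200, npts))          # control-run chunks

sn_ratio = (observed @ fingerprint) / (noise_segments @ fingerprint).std()
print(f"S/N ratio = {sn_ratio:.1f}")
```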
Abstract:
In Part I of this study it was shown that moving from a moisture-convergent to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden–Julian Oscillation (MJO) in the ECMWF model. However, the application of traditional MJO diagnostics was not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics is applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid troposphere. Due to the modified precipitation–moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between the convective heating and the large-scale wave forcing associated with the MJO.
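As a purely illustrative gloss on "relative-humidity-dependent organized entrainment", the toy rate below entrains more strongly into drier environments, so plumes in low-humidity air terminate lower; the functional form and constants are hypothetical, not the ECMWF formulation:

```python
# Toy RH-dependent fractional entrainment rate: drier air -> more entrainment.
def entrainment_rate(rh, eps0=1.0e-3, c=2.0):
    """Fractional entrainment per metre; eps0 and c are illustrative constants."""
    return eps0 * (1.0 + c * (1.0 - rh))

for rh in (0.9, 0.7, 0.5):
    print(f"RH = {rh:.1f}: entrainment = {entrainment_rate(rh):.2e} per metre")
```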
Abstract:
Many reasons are being advanced for the current ‘food crisis’, including financial speculation, increased demand for grains, export bans on selected foodstuffs, inadequate grain stocks, higher oil prices, poor harvests and the use of crop lands for the production of biofuels. This paper reviews the present knowledge of recorded impacts of climate change and variability on crop production, in order to estimate its contribution to the current situation. Many studies demonstrate increased regional temperatures over the last 40 years (often through greater increases in minimum rather than maximum temperatures), but effects on crop yields are mixed. Distinguishing climate effects from changes in yield resulting from improved crop management and genotypes is difficult, but phenological changes affecting sowing, maturity and disease incidence are emerging. Anthropogenic factors appear to be a significant contributor to the observed decline in rainfall in southwestern and southeastern Australia, which reduced the amount of tradable wheat grain during 2007. Indirect effects of climate change, through actions to mitigate or adapt to anticipated changes in climate, are also evident. The amount of land diverted from crop production to biofuel production is small but has had a disproportionate effect on tradable grains from the USA. Adaptation of crop production practices and other components of the food system contributing to food security in response to variable and changing climates has occurred, but those households without adequate livelihoods are most in danger of becoming food insecure. Overall, we conclude that changing climate is a small contributor to the current food crisis but cannot be ignored.
Abstract:
Power delivery for biomedical implants is a major consideration in their design for both measurement and stimulation. When performed by a wireless technique, transmission efficiency is critically important, not only because of the costs associated with any losses but also because of the nature of those losses; for example, excessive heat can be uncomfortable for the individual involved. In this study, a method and means of wireless power transmission suitable for biomedical implants are discussed and experimentally evaluated. The procedure described is comparable in size and simplicity to methods already employed; however, some of Tesla’s fundamental ideas have been incorporated in order to obtain a significant improvement in efficiency. This study contains a theoretical basis for the approach taken; however, the emphasis here is on practical experimental analysis.
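The abstract does not give its analysis, but a standard figure of merit for two-coil resonant inductive links shows why resonance (high coil quality factors) raises efficiency: maximum link efficiency grows with k²Q₁Q₂. The sketch below uses that textbook expression with hypothetical values; it is not necessarily the authors' own treatment:

```python
# Maximum efficiency of a two-coil resonant inductive link as a function
# of coupling coefficient k and coil quality factors Q1, Q2.
import math

def max_link_efficiency(k, q1, q2):
    """eta_max = x / (1 + sqrt(1 + x))^2, with x = k^2 * Q1 * Q2."""
    x = (k ** 2) * q1 * q2
    return x / (1.0 + math.sqrt(1.0 + x)) ** 2

for k in (0.01, 0.05, 0.2):                 # loose to moderate coupling
    print(f"k = {k:.2f}: eta_max = {max_link_efficiency(k, 100.0, 100.0):.1%}")
```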
Abstract:
Organizations introduce acceptable use policies to deter employee computer misuse. Despite the controlling, monitoring and other forms of intervention employed, some employees misuse organizational computers to carry out personal work such as sending emails, surfing the internet, chatting and playing games. These activities not only waste employees’ productive time but also pose a risk to the organization. A questionnaire was administered to a random sample of employees selected from large and medium scale software development organizations, which measured work computer misuse levels and the factors that influence such behavior. The presence of guidelines provided no evidence of a significant effect on the level of employee computer misuse. Not having access to the Internet/email away from work, and organizational settings, were identified as the most significant influences on work computer misuse.