789 results for Return to work
Abstract:
This paper deals with second-generation Barbadians, or 'Bajan-Brits', who have decided to 'return' to the birthplace of their parents, focusing on their reactions to matters relating to race relations and racialised identities. The importance of race and the operation of the 'colour-class' system in the Caribbean are established at the outset. Based on fifty-two qualitative in-depth interviews, the paper initially considers the positive things that the second-generation migrants report about living in a majority black country and the salience of such racial affirmation as part of their migration process. The paper then presents an analysis of the narratives provided by the Bajan-Brits concerning their reactions to issues relating to race relations in Barbadian society. The impressions of the young returnees provide clear commentaries on what are regarded as (i) the 'acceptance of white hegemony' within Barbadian society, (ii) the occurrence of de facto 'racial segregation', (iii) perceptions of the 'existence of apartheid', and (iv) 'the continuation of slavery'. The account then turns to the contemporary operation of the colour-class system. It is concluded that, despite academic arguments that the colour-class dimension has to be put to one side as the principal dimension of social stratification in the contemporary Caribbean, the second-generation migrants are acutely aware of the continued existence and salience of such gradations within society. Thus, the analysis not only serves to emphasise the continued importance of race-based stratification in the contemporary Caribbean, but also speaks of the 'hybrid' and 'in-between' racialised identities of the second-generation migrants.
Abstract:
Future stratospheric ozone concentrations will be determined both by changes in the concentration of ozone depleting substances (ODSs) and by changes in stratospheric and tropospheric climate, including those caused by changes in anthropogenic greenhouse gases (GHGs). Since future economic development pathways and resultant emissions of GHGs are uncertain, anthropogenic climate change could be a significant source of uncertainty for future projections of stratospheric ozone. In this pilot study, using an "ensemble of opportunity" of chemistry-climate model (CCM) simulations, the contribution of scenario uncertainty from different plausible emissions pathways for ODSs and GHGs to future ozone projections is quantified relative to the contribution from model uncertainty and internal variability of the chemistry-climate system. For both the global, annual mean ozone concentration and for ozone in specific geographical regions, differences between CCMs are the dominant source of uncertainty for the first two-thirds of the 21st century, up to and after the time when ozone concentrations return to 1980 values. In the last third of the 21st century, depending upon the set of greenhouse gas scenarios used, scenario uncertainty can be the dominant contributor. This result suggests that investment in chemistry-climate modelling is likely to continue to refine projections of stratospheric ozone and estimates of the return of stratospheric ozone concentrations to pre-1980 levels.
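As an illustration of the kind of uncertainty partitioning described above (a minimal sketch under assumed inputs, not the study's own analysis), the following Python fragment splits the spread of a hypothetical ensemble of ozone projections into model uncertainty, scenario uncertainty and internal variability; the array `ozone` and all of its dimensions are invented for the example.

```python
# Illustrative sketch only: partition the spread of an ensemble of ozone
# projections into model uncertainty, scenario uncertainty and internal
# variability. The data here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_scenarios, n_runs, n_years = 5, 3, 4, 140  # invented dimensions
ozone = rng.normal(300.0, 5.0, size=(n_models, n_scenarios, n_runs, n_years))

# Model uncertainty: variance across model means (scenarios and runs averaged out).
model_means = ozone.mean(axis=(1, 2))            # shape (n_models, n_years)
model_unc = model_means.var(axis=0)              # shape (n_years,)

# Scenario uncertainty: variance across scenario means.
scenario_means = ozone.mean(axis=(0, 2))         # shape (n_scenarios, n_years)
scenario_unc = scenario_means.var(axis=0)

# Internal variability: average within-ensemble variance across runs.
internal_var = ozone.var(axis=2).mean(axis=(0, 1))

total = model_unc + scenario_unc + internal_var
# The abstract reports that the scenario term can dominate late in the century.
fraction_scenario = scenario_unc / total
```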
Abstract:
Projections of stratospheric ozone from a suite of chemistry-climate models (CCMs) have been analyzed. In addition to a reference simulation where anthropogenic halogenated ozone depleting substances (ODSs) and greenhouse gases (GHGs) vary with time, sensitivity simulations with either ODS or GHG concentrations fixed at 1960 levels were performed to disaggregate the drivers of projected ozone changes. These simulations were also used to assess the two distinct milestones of ozone returning to historical values (ozone return dates) and ozone no longer being influenced by ODSs (full ozone recovery). The date of ozone returning to historical values does not indicate complete recovery from ODSs in most cases, because GHG-induced changes accelerate or decelerate ozone changes in many regions. In the upper stratosphere, where CO2-induced stratospheric cooling increases ozone, it is not likely that full ozone recovery will have occurred by 2100, even though ozone returns to its 1980 and even 1960 levels well before then (~2025 and ~2040, respectively). In contrast, in the tropical lower stratosphere ozone decreases continuously from 1960 to 2100 due to projected increases in tropical upwelling, while by around 2040 it is already very likely that full recovery from the effects of ODSs has occurred, although ODS concentrations are still elevated by this date. In the midlatitude lower stratosphere the evolution differs from that in the tropics: rather than decreasing steadily, ozone first decreases from 1960 to 2000 and then increases steadily through the 21st century. Ozone in the midlatitude lower stratosphere returns to 1980 levels by ~2045 in the Northern Hemisphere (NH) and by ~2055 in the Southern Hemisphere (SH), and full ozone recovery is likely reached by 2100 in both hemispheres. Overall, in all regions except the tropical lower stratosphere, full ozone recovery from ODSs occurs significantly later than the return of total column ozone to its 1980 level. The latest return of total column ozone is projected to occur over Antarctica (~2045–2060), whereas it is not likely that full ozone recovery is reached by the end of the 21st century in this region. Arctic total column ozone is projected to return to 1980 levels well before polar stratospheric halogen loading does so (~2025–2030 for total column ozone, cf. 2050–2070 for Cly+60×Bry), and it is likely that full recovery of total column ozone from the effects of ODSs has occurred by ~2035. In contrast to the Antarctic, by 2100 Arctic total column ozone is projected to be above 1960 levels, but not in the fixed GHG simulation, indicating that climate change plays a significant role.
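For readers unfamiliar with the "return date" diagnostic, the following is a minimal sketch (assumed inputs and an assumed smoothing choice, not the multi-model analysis used in the paper) of how such a date can be estimated from a single ozone time series.

```python
# Hedged sketch: estimate an ozone "return date" as the first year, after the
# minimum, at which a smoothed time series climbs back to its 1980 value.
# `years` and `column_ozone` are hypothetical inputs.
import numpy as np

def return_date(years, column_ozone, baseline_year=1980, window=11):
    """Return the first year after the minimum where smoothed ozone >= baseline."""
    years = np.asarray(years)
    ozone = np.asarray(column_ozone, dtype=float)
    # Simple running mean to suppress interannual variability (edge effects ignored).
    kernel = np.ones(window) / window
    smooth = np.convolve(ozone, kernel, mode="same")
    baseline = smooth[years == baseline_year][0]
    i_min = smooth.argmin()
    after = np.nonzero(smooth[i_min:] >= baseline)[0]
    return None if after.size == 0 else int(years[i_min + after[0]])
```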
Abstract:
This paper assesses the impact of the 'decoupling' reform of the Common Agricultural Policy on the labour allocation decisions of Irish farmers. The agricultural household decision-making model provides the conceptual and theoretical framework to examine the interaction between government subsidies and farmers' time allocation decisions. The relationship postulated is that the 'decoupling' of agricultural support from production would probably result in a decline in the return to farm labour, but would also lead to an increase in household wealth. The effect of these factors on how farmers allocate their time is tested empirically using labour participation and labour supply models. The models developed are sufficiently general for application elsewhere. The main findings for the Irish situation are that the decoupling of direct payments is likely to increase the probability of farmers participating in the off-farm employment market and that the amount of time allocated to off-farm work will increase.
Abstract:
The shamba system involves farmers tending tree saplings on state-owned forest land in return for being permitted to intercrop perennial food crops until canopy closure. At one time the system was used throughout all state-owned forest lands in Kenya, accounting for a large proportion of some 160,000 ha. The system should theoretically be mutually beneficial to both local people and the government. However, the system has had a chequered past in Kenya due to widespread malpractice and associated environmental degradation. It was last banned in 2003, but in early 2008 field trials were initiated for its reintroduction. This study aimed to: assess the benefits and limitations of the shamba system in Kenya; assess the main influences on the extent to which the limitations and benefits are realised; and consider the management and policy requirements for the system's successful and sustainable operation. Information was obtained from 133 questionnaires using mainly open-ended questions and six participatory workshops carried out in forest-adjacent communities on the western slopes of Mount Kenya in Nyeri district. In addition, interviews were conducted with key informants from communities and organisations. There was a strong desire amongst local people for the system's reintroduction, given that it had provided significant food, income and employment. Local perceptions of the failings of the system included, firstly, mismanagement by government or forest authorities and, secondly, abuse of the system by shamba farmers and outsiders. Improvements local people considered necessary for the shamba system to work included more accountability and transparency in administration and better rules with respect to plot allocation and stewardship. Ninety-seven percent of respondents said they would like to be more involved in management of the forest and 80% that they were willing to pay for the use of a plot. The study concludes that the structural framework laid down by the 2005 Forests Act, which includes provision for the reimplementation of the shamba system under the new plantation establishment and livelihood improvement scheme (PELIS) [it should be noted that whilst the shamba system was re-branded in 2008 under the acronym PELIS, for the sake of simplicity the authors continue to refer to the 'shamba system' and 'shamba farmers' throughout this paper], is weakened because insufficient power is likely to be devolved to local people, casting them merely as 'forest users' and the shamba system as a 'forest user right'. In so doing, the system's potential to both facilitate and embody the participation of local people in forest management is limited and the long-term sustainability of the new system is questionable. Suggested instruments to address this include some degree of sharing of profits from forest timber, performance-related guarantees for farmers to gain a new plot, and the use of joint committees consisting of local people and the forest authorities for the long-term management of forests.
Abstract:
Estimation of population size with a missing zero-class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimating the population size based on this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) has proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable for count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and thereby using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use all clusters with exactly one case, those clusters with exactly two cases and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice considering the estimates from the three models, robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
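To make the basic mechanics concrete, here is an illustrative Python sketch of the step the abstract describes explicitly, fitting a zero-truncated homogeneous Poisson model by maximum likelihood and forming a Horvitz-Thompson estimate of the number of clusters including the missing zero-class; it is not the authors' clustered or robust extension, and the example counts are invented.

```python
# Illustrative sketch: zero-truncated Poisson fit plus Horvitz-Thompson
# estimate of the total number of clusters (observed + unobserved zero-class).
import numpy as np
from scipy.optimize import brentq

def zt_poisson_population_size(counts):
    """counts: positive case counts of the observed (non-zero) clusters.
    Requires the sample mean to exceed 1 for the truncated MLE to exist."""
    counts = np.asarray(counts, dtype=float)
    n_obs, xbar = counts.size, counts.mean()
    # The zero-truncated Poisson MLE solves  lambda / (1 - exp(-lambda)) = xbar.
    lam = brentq(lambda l: l / (1.0 - np.exp(-l)) - xbar, 1e-8, 10.0 * xbar)
    p_observed = 1.0 - np.exp(-lam)        # P(a cluster has at least one case)
    n_hat = n_obs / p_observed             # Horvitz-Thompson estimator
    return lam, n_hat

# Invented example: 40 clusters with one case, 12 with two, 3 with three.
counts = np.repeat([1, 2, 3], [40, 12, 3])
lam_hat, n_hat = zt_poisson_population_size(counts)
```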
Abstract:
Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in the context of architectural design practice, i.e., to relate design by nature to design by humans. In a design-by-human environment, design synthesis can be performed with the use of rapid prototyping techniques that make it possible to transform almost instantaneously any 2D design representation into a physical three-dimensional model using a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be established with design by nature, where the natural laying down of earth layers shapes the earth's surface, a natural process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hardcopy and use it for the validation of their design ideas. Concurrent design is a systematic approach aiming to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and a quality improvement of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent to which design practice applies rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though they acknowledge a lack of knowledge in relation to rapid prototyping.
Abstract:
Purpose – The main aim of this paper is to present the results of a study examining managers' attitudes towards the deployment and use of information and communications technology (ICT) in their organisations. The study comes at a time when ICT is being recognised as a major enabler of innovation and new business models, which have the potential to have a major impact on western economies and jobs. Design/methodology/approach – A questionnaire was specially designed to collect data relating to three research questions. The questionnaire also included a number of open-ended questions. A total of 181 managers from a wide range of industries across a number of countries participated in the electronic survey. The quantitative responses to the survey were analysed using SPSS: exploratory factor analysis with Varimax rotation was used, and ANOVA was used to compare responses across different groups. Findings – The survey showed that many of the respondents appeared equipped to work "any place, any time". However, it also highlighted the challenges managers face in working in a connected operation. Also, the data suggested that many managers were less than confident about their companies' policies and practices in relation to information management. Originality/value – A next step from this exploratory research could be the development of a model exploring the impact of ICT on management and organisational performance in terms of the personal characteristics of the manager, the role performed, the context and the ICT provision. Further research could also focus on examining in more detail differences between management levels.
Abstract:
We present a conceptual architecture for a Group Support System (GSS) to facilitate Multi-Organisational Collaborative Groups (MOCGs) initiated by local government and including external organisations of various types. MOCGs consist of individuals from several organisations which have agreed to work together to solve a problem, the expectation being that more can be achieved working in harmony than separately. Work is done interdependently, rather than independently in diverse directions. Local governments, faced with solving complex social problems, deploy MOCGs to enable solutions across organisational, functional, professional and juridical boundaries by involving statutory, voluntary, community, not-for-profit and private organisations. This is not a silver bullet, as it introduces new pressures. Each member organisation has its own goals, operating context and particular approaches, which can be expressed as its norms and business processes. Organisations working together must find ways of eliminating differences or mitigating their impact in order to reduce the risks of collaborative inertia and conflict. A GSS is an electronic collaboration system that facilitates group working and can offer assistance to MOCGs. Many existing GSSs have been developed primarily for single-organisation collaborative groups, and although there are some common issues, there are some difficulties peculiar to MOCGs, and others that they experience to a greater extent: a diversity of primary organisational goals among members; different funding models and other pressures; more significant differences in other information systems, both technologically and in their use, than in single organisations; and greater variation in acceptable approaches to solving problems. In this paper, we analyse the requirements of MOCGs led by local government agencies, leading to a conceptual architecture for an e-government GSS that captures the relationships between 'goal', 'context', 'norm' and 'business process'. Our models capture the dynamics of the circumstances surrounding each individual representing an organisation in a MOCG, along with the dynamics of the MOCG itself as a separate community.
Abstract:
There are three key driving forces behind the development of Internet Content Management Systems (CMS): a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content, or invoking services) and there is no centralised administrative or coordinating authority. P2P systems are inherently more scalable than equivalent client-server implementations, as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
Abstract:
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for that risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
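As a hedged illustration of the approach described (a toy Python/numpy sketch with invented figures, standing in for the Crystal Ball/Excel set-up the authors use), correlated input distributions can be sampled and propagated through a simple residual appraisal as follows.

```python
# Illustrative Monte Carlo residual appraisal with correlated inputs.
# All figures and the correlation are invented for the example.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

mean_gdv, sd_gdv = 12_000_000, 1_200_000   # gross development value (sale price)
mean_cost, sd_cost = 7_000_000, 500_000    # construction cost
land_and_fees = 3_000_000                  # treated as fixed here
rho = 0.4                                  # assumed correlation: costs and values move together

cov = [[sd_gdv**2, rho * sd_gdv * sd_cost],
       [rho * sd_gdv * sd_cost, sd_cost**2]]
gdv, cost = rng.multivariate_normal([mean_gdv, mean_cost], cov, size=n).T

profit = gdv - cost - land_and_fees
print("expected profit:", profit.mean())
print("5th / 95th percentiles:", np.percentile(profit, [5, 95]))
print("probability of loss:", (profit < 0).mean())
```

The point of the exercise, as in the abstract, is that the full distribution of outcomes, not just best and worst cases, informs the level of return sought to compensate for risk.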
Abstract:
This edited collection provides ideas and support for ways of 'bringing poetry alive' in the classroom at Key Stages 1, 2 and 3, drawing on what is known to work and also exploring fresh thinking. It is designed to help both new and experienced teachers approach poetry teaching with greater imagination and confidence. The book is edited and introduced by Michael Lockwood and features chapters by experts who have taught poetry in different settings for many years, including contributions from the poets Michael Rosen and James Carter. Professor Morag Styles of Cambridge University has provided a Preface. All the contributors have a connection with the University of Reading as lecturers, external examiners, or current or former graduate students. The book includes the following sections: Introduction: Developments in Poetry Teaching; 1: Reflections on Being Children’s Laureate - Michael Rosen; 2: Teaching Poetry in the Early Years - Margaret Perkins; 3: Actual Poems, Possible Responses - Prue Goodwin; 4: Making Poetry - Catriona Nicholson; 5: The Role of the Poet in Primary Schools - James Carter; 6: Cross-Curricular Poetry Writing - Eileen Hyder; 7: Teaching Poetry to Teenagers - Lionel Warner; 8: Watching the Words: Drama and Poems - Andy Kempe; 9: Literary Reading - Andy Goodwyn. The book is intended for teacher educators, teachers and trainee teachers working with children aged 5 to 14 years.
Abstract:
Cleft lip and palate is the most common of the congenital conditions affecting the face and cranial bones and is associated with a raised risk of difficulties in infant-caregiver interaction; the reasons for such difficulties are not fully understood. Here, we report two experiments designed to explore how adults respond to infant faces with and without cleft lip, using behavioural measures of attractiveness appraisal (‘liking’) and willingness to work to view or remove the images (‘wanting’). We found that infants with cleft lip were rated as less attractive and were viewed for shorter durations than healthy infants, an effect that was particularly apparent where the cleft lip was severe. Women rated the infant faces as more attractive than men did, but there were no differences in men's and women's viewing times of these faces. In a second experiment, we found that the presence of a cleft lip in domestic animals affected adults' ‘liking’ and ‘wanting’ responses in a way comparable to that seen for human infants. Adults' responses were also remarkably similar for images of infants and animals with cleft lip, although no gender difference in attractiveness ratings or viewing times emerged for animals. We suggest that the presence of a cleft lip can substantially change the way in which adults respond to human and animal faces. Furthermore, women may respond differently from men when asked to appraise infant attractiveness, despite the fact that men and women ‘want’ to view images of infants for similar durations.
Abstract:
In this article Geoff Tennant and Dave Harries report on the early stages of a research project looking to examine the transition from Key Stage (KS) 2 to 3 of children deemed Gifted and Talented (G&T) in mathematics. An examination of relevant literature points towards variation in definition of key terms and underlying rationale for activities. Preliminary fieldwork points towards a lack of meaningful communication between schools, with primary school teachers in particular left to themselves to decide how to work with children deemed G&T. Some pointers for action are given, along with ideas for future research and a request for colleagues interested in working with us to get in touch.
Abstract:
We present molecular dynamics (MD) and slip-springs model simulations of the chain segmental dynamics in entangled linear polymer melts. The time-dependent behavior of the segmental orientation autocorrelation functions and mean-square segmental displacements is analyzed for both flexible and semiflexible chains, with particular attention paid to the scaling relations among these dynamic quantities. Effective combination of the two simulation methods at different coarse-graining levels allows us to explore the chain dynamics for chain lengths ranging from Z ≈ 2 to 90 entanglements. For a given chain length of Z ≈ 15, the time scales accessed span more than 10 decades, covering all of the interesting relaxation regimes. The obtained time dependence of the monomer mean-square displacement, g1(t), is in good agreement with the tube theory predictions. Results on the first- and second-order segmental orientation autocorrelation functions, C1(t) and C2(t), demonstrate a clear power-law relationship of C2(t) ∝ C1(t)^m with m = 3, 2, and 1 in the initial, free Rouse, and entangled (constrained Rouse) regimes, respectively. The return-to-origin hypothesis, which leads to inverse proportionality between the segmental orientation autocorrelation functions and g1(t) in the entangled regime, is convincingly verified by the simulation result of C1(t) ∝ g1(t)^(-1) ∝ t^(-1/4) in the constrained Rouse regime, where for well-entangled chains both C1(t) and g1(t) are rather insensitive to constraint release effects. However, the second-order correlation function, C2(t), shows much stronger sensitivity to constraint release effects and experiences a protracted crossover from the free Rouse to the entangled regime. This crossover region extends for at least one decade in time longer than that of C1(t). The predicted time scaling behavior of C2(t) ∝ t^(-1/4) is observed in slip-springs simulations only at a chain length of 90 entanglements, whereas shorter chains show higher scaling exponents. The reported simulation work can be applied to understand observations from NMR experiments.
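For orientation, the quantities discussed above can be computed from trajectory data roughly as follows; this is an illustrative Python sketch with hypothetical input arrays, not the analysis code used for the MD and slip-springs simulations. C1 and C2 are the averages of the first and second Legendre polynomials, P1(x) = x and P2(x) = (3x^2 - 1)/2, of the cosine of the angle between a segment's orientation at two times separated by lag t, and g1(t) is the monomer mean-square displacement.

```python
# Illustrative sketch: segmental orientation autocorrelation functions and
# monomer mean-square displacement from (hypothetical) trajectory arrays.
import numpy as np

def orientation_acfs(u, max_lag):
    """u: unit segment vectors, shape (n_frames, n_segments, 3).
    Returns C1(t) and C2(t) for lags 1..max_lag (max_lag < n_frames)."""
    c1, c2 = [], []
    for lag in range(1, max_lag + 1):
        cos = np.sum(u[:-lag] * u[lag:], axis=-1)   # cos(theta) per segment and time origin
        c1.append(cos.mean())                       # <P1(cos theta)>
        c2.append((1.5 * cos**2 - 0.5).mean())      # <P2(cos theta)>
    return np.array(c1), np.array(c2)

def msd(r, max_lag):
    """r: monomer positions, shape (n_frames, n_monomers, 3); returns g1(t)."""
    return np.array([np.mean(np.sum((r[lag:] - r[:-lag])**2, axis=-1))
                     for lag in range(1, max_lag + 1)])
```

Plotting C1, C2 and g1 on log-log axes against the lag time is then enough to read off the power-law exponents referred to in the abstract.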