844 results for helping
Abstract:
This Policy Contribution assesses the broad obstacles hampering ICT-led growth in Europe and identifies the main areas in which policy could unlock the greatest value. We review estimates of the value that could be generated through the take-up of various technologies and broadly match these estimates with policy areas. According to the literature survey and the collected estimates, the areas in which the right policies could unlock the greatest ICT-led growth are product and labour market regulations and the European Single Market. These areas should be reformed to make European markets more flexible and competitive. This would promote wider adoption of modern data-driven organisational and management practices, thereby helping to close the productivity gap between the United States and the European Union. Gains could also be made in the areas of privacy, data security, intellectual property and liability pertaining to the digital economy, especially cloud computing, and in next-generation network infrastructure investment. Standardisation and spectrum allocation issues are found to be important, though to a lesser degree. Strong complementarities between the analysed technologies suggest, however, that policymakers need to address all of the identified obstacles in order to fully realise the potential of ICT to spur long-term growth beyond the partial gains that we report.
Abstract:
Greece, Portugal and Spain face a serious risk of external insolvency: their net international investment positions are close to minus 100 percent of GDP and are largely composed of debt. The perceived inability of these countries to rebalance their external positions is a major root cause of the euro crisis. Intra-euro rebalancing through declines in unit labour costs (ULC) in southern Europe and ULC increases in northern Europe should continue, but has limits because:
- The share of intra-euro trade has declined.
- Intra-euro trade balances have already adjusted to a great extent.
- The intra-euro real exchange rates of Greece, Portugal and Spain have either already adjusted or do not indicate significant appreciation since 2000.
- There are only two main current account surplus countries, Germany and the Netherlands.
A purely intra-euro adjustment strategy would require wage increases in northern countries and wage declines in southern countries too large to be feasible. Before the crisis, the euro was significantly overvalued despite the close-to-balanced current account position. The euro has depreciated recently, but more depreciation is needed to support the extra-euro trade of southern euro-area members. A weaker euro would also boost exports, growth, inflation and wage increases in Germany, thereby helping further intra-euro adjustment and the survival of the euro.
Abstract:
Would a research assistant - one that can search for ideas related to those you are working on, network with others (but share only the things you have chosen to share), doesn't need coffee and might even, one day, appear to be conscious - help you get your work done? Would it help your students learn? There is a body of work showing that digital learning assistants can benefit learners, and it has been suggested that adaptive, caring agents are more beneficial still. Would a conscious agent be more caring, more adaptive, and better able to deal with changes in its learning partner's life? Allow the system to model the user dynamically, so that it can predict what is needed next and how effective a particular intervention will be. Now, given that the system is essentially doing the same things as the user, why not design the system so that it can model itself in the same way? This should mimic a primitive self-awareness. People develop their personalities, their identities, through interacting with others. It takes years for a human to develop a full sense of self, and nobody should expect a prototypical conscious computer system to develop any faster. How can we provide a computer system with enough social contact to enable it to learn about itself and others? We can make it part of a network: not just chatting with other computers about computer 'stuff', but involved in real human activity, exposed to 'raw meaning' - the developing folksonomies coming out of the learning activities of humans, whether they are traditional students or lifelong learners (a term which should encompass everyone). Humans have complex psyches, composed of multiple strands of identity which manifest as different roles in the communities of which they are part - so why not design our system the same way? With multiple internal modes of operation, each capable of being reflected onto the outside world in the form of roles - as a mentor, a research assistant, maybe even as a friend. But in order to work with a human for long enough to have a chance of developing the sort of rich behaviours we associate with people, the system needs to function in a practical and helpful role. It is unlikely to get a free ride from many people (other than its developer!) - so it needs to perform a useful role, and do so securely, respecting the privacy of its partner. Can we create a system which learns to be more human whilst helping people learn?
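As a loose illustration of the idea above - model the user, then point the same machinery at the system itself - here is a minimal Python sketch; every name in it (PartnerModel, record, predict_best) is hypothetical and not from the original work.

```python
# Minimal sketch of a symmetric user/self model; all names are hypothetical.
from collections import defaultdict

class PartnerModel:
    """Learns which intervention tends to work best in a given context."""
    def __init__(self, name):
        self.name = name
        self.outcomes = defaultdict(list)  # (context, intervention) -> scores

    def record(self, context, intervention, score):
        self.outcomes[(context, intervention)].append(score)

    def predict_best(self, context, interventions):
        # Choose the intervention with the highest observed mean score.
        def mean_score(i):
            scores = self.outcomes.get((context, i), [0.0])
            return sum(scores) / len(scores)
        return max(interventions, key=mean_score)

user_model = PartnerModel("learner")    # the system models its human partner
self_model = PartnerModel("assistant")  # ...and models itself the same way
```

The only point of the sketch is the symmetry: the same predictive machinery is applied to the learner and to the system itself, a crude stand-in for the primitive self-awareness described above.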
Abstract:
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice.
Abstract:
Construction materials and equipment are essential building blocks of every construction project and may account for 50-60 per cent of the total cost of construction. The rate at which they are utilised, on the other hand, is the element that relates most directly to project progress. A growing concern in the industry that inadequate efficiency hinders its success could thus be addressed by treating construction as a logistic process. Recent attempts and studies, although mostly limited, show that Radio Frequency IDentification (RFID) applications have significant potential in construction. The aim of this research, however, is to show that the technology should not be used only for automation and tracking to overcome supply chain complexity, but also as a tool to generate, record and exchange process-related knowledge among supply chain stakeholders. This would enable all involved parties to identify and understand the consequences of any forthcoming difficulties and react accordingly before they cause major disruptions in the construction process. To achieve this aim, the study proceeds in stages. First, it develops a generic understanding of how RFID technology has been used in logistic processes in industrial supply chain management. Second, it investigates recent applications of RFID as an information and communication technology support facility in construction logistics for the management of the construction supply chain. Based on these, the study develops an improved concept of a construction logistics architecture that explicitly relies on integrating RFID with the Global Positioning System (GPS). The developed conceptual architecture shows that the categorisation provided by RFID, and the traceability that results from RFID/GPS integration, could be used as tools to identify, record and share potential problems and thus vastly improve knowledge management processes within the entire supply chain. The findings clearly show a need for future research in this area.
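To make the RFID/GPS integration idea more concrete, here is a hedged Python sketch of a shared, annotatable tracking event; the schema (TrackingEvent, flag_if_late and all field names) is an assumption made for illustration, not the architecture developed in the study.

```python
# Illustrative only: an RFID read enriched with a GPS fix becomes a traceable
# event that supply chain stakeholders can annotate with process knowledge.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TrackingEvent:
    tag_id: str    # RFID tag on a material batch or item of equipment
    lat: float     # GPS fix at the time of the read
    lon: float
    read_at: datetime
    notes: list[str] = field(default_factory=list)  # shared process knowledge

def flag_if_late(event: TrackingEvent, due: datetime) -> None:
    """Record a problem on the event itself so that all parties can see it."""
    if event.read_at > due:
        event.notes.append(f"Late arrival: read at {event.read_at}, due {due}")
```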
Abstract:
There is ongoing debate concerning the possible environmental and human health impacts of growing genetically modified (GM) crops. Here, we report the results of a life-cycle assessment (LCA) comparing the environmental and human health impacts of conventional sugar beet growing regimes in the UK and Germany with those that might be expected if GM herbicide-tolerant (glyphosate-tolerant) sugar beet were commercialized. The results presented for a number of environmental and human health impact categories suggest that growing the GM herbicide-tolerant crop would be less harmful to the environment and human health than growing the conventional crop, largely due to lower emissions from herbicide manufacture, transport and field operations. Emissions contributing to negative environmental impacts, such as global warming, ozone depletion, ecotoxicity of water and acidification and nutrification of soil and water, were much lower for the herbicide-tolerant crop than for the conventional crop. Emissions contributing to summer smog, toxic particulate matter and carcinogenicity, which have negative human health impacts, were also substantially lower for the herbicide-tolerant crop. The environmental and human health impacts of growing GM crops need to be assessed on a case-by-case basis using a holistic approach. LCA is a valuable technique for helping to undertake such assessments.
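For readers unfamiliar with the mechanics of LCA, the sketch below shows the characterization step in miniature: emission amounts are weighted by category-specific factors and summed per impact category. The inventories are invented for illustration and are not the study's data; the N2O factor of 265 kg CO2-eq/kg is the IPCC AR5 100-year value.

```python
# Toy LCA characterization step; inventories are invented, not the study's data.
inventory_conventional = {"CO2": 120.0, "N2O": 0.4}  # kg per functional unit
inventory_gm_tolerant = {"CO2": 80.0, "N2O": 0.2}

# Characterization factors: kg CO2-equivalent per kg of emission.
factors = {"global_warming": {"CO2": 1.0, "N2O": 265.0}}

def impact(inventory, category):
    """Sum emission amounts weighted by the category's characterization factors."""
    return sum(amount * factors[category].get(gas, 0.0)
               for gas, amount in inventory.items())

for name, inv in [("conventional", inventory_conventional),
                  ("GM herbicide-tolerant", inventory_gm_tolerant)]:
    print(f"{name}: {impact(inv, 'global_warming'):.1f} kg CO2-eq")
```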
Abstract:
Mathematical modeling of bacterial chemotaxis systems has been influential and insightful in helping to understand experimental observations. We provide here a comprehensive overview of the range of mathematical approaches used for modeling, within a single bacterium, chemotactic processes caused by changes to external gradients in its environment. Specific areas of the bacterial system which have been studied and modeled are discussed in detail, including the modeling of adaptation in response to attractant gradients, the intracellular phosphorylation cascade, membrane receptor clustering, and spatial modeling of intracellular protein signal transduction. The importance of producing robust models that address adaptation, gain, and sensitivity is also discussed. This review highlights that while mathematical modeling has aided in understanding bacterial chemotaxis on the individual cell scale and guided experimental design, no single model succeeds in robustly describing all of the basic elements of the cell. We conclude by discussing the importance of this and the future of modeling in this area.
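To give a flavour of the adaptation models such reviews survey, here is a toy integral-feedback simulation in Python: receptor activity falls when attractant rises, methylation slowly integrates the deviation of activity from its set point, and activity therefore returns to kR/kB after a step in attractant. The functional forms and rate constants are invented for illustration and do not reproduce any specific published model.

```python
# Toy integral-feedback adaptation: activity returns to kR/kB after a step in L.
import math

def activity(m, L):
    # Activity rises with methylation m and falls with attractant level L.
    return 1.0 / (1.0 + math.exp(-(m - math.log(1.0 + L))))

kR, kB, dt = 0.1, 0.2, 0.01   # methylation/demethylation rates, time step
m, L = 0.0, 0.0
for step in range(200_000):
    if step == 100_000:
        L = 10.0                   # attractant step halfway through the run
    a = activity(m, L)
    m += (kR - kB * a) * dt        # methylation integrates the activity error

print(f"activity after adaptation: {activity(m, L):.3f} (set point {kR / kB})")
```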
Abstract:
Counting elephants ought to be easy. It is not. Counting elephants lost to poaching is even harder. But, without knowledge of both, how can we know whether banning the sale of ivory is helping to save elephants? Bob Burn explains the problems.
Abstract:
It has long been suggested that the overall shape of the antigen combining site (ACS) of antibodies is correlated with the nature of the antigen. For example, deep pockets are characteristic of antibodies that bind haptens, grooves indicate peptide binders, while antibodies that bind to proteins have relatively flat combining sites. In 1996, MacCallum, Martin and Thornton used a fractal shape descriptor and showed a strong correlation of the shape of the binding region with the general nature of the antigen. However, the shape of the ACS is determined primarily by the lengths of the six complementarity-determining regions (CDRs). Here, we make a direct correlation between the lengths of the CDRs and the nature of the antigen. In addition, we show significant differences in the residue composition of the CDRs of antibodies that bind to different antigen classes. As well as helping us to understand the process of antigen recognition, autoimmune disease and cross-reactivity, these results are of direct application in the design of antibody phage libraries and the modification of affinity.
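The kind of length/antigen correlation the abstract describes can be computed very simply once CDR lengths are annotated by antigen class; the sketch below uses invented lengths purely to show the shape of the calculation, not the paper's data.

```python
# Invented CDR-H3 lengths grouped by antigen class; real values would come
# from an annotated antibody structure set.
cdr_h3_lengths = {
    "hapten": [7, 8, 9, 8],       # deep pockets tend to go with short loops
    "peptide": [10, 11, 12, 11],
    "protein": [13, 15, 14, 16],
}

for antigen_class, lengths in cdr_h3_lengths.items():
    mean_length = sum(lengths) / len(lengths)
    print(f"{antigen_class}: mean CDR-H3 length {mean_length:.1f}")
```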
Abstract:
13C-2H correlation NMR spectroscopy (13C-2H COSY) permits the identification of 13C and 2H nuclei which are connected to one another by a single chemical bond, via the sizeable 1JCD coupling constant. The practical development of this technique is described using a 13C-2H COSY pulse sequence derived from the classical 13C-1H correlation experiment. An example is given of the application of 13C-2H COSY to the study of the biogenesis of natural products from the anti-malarial plant Artemisia annua, using a doubly-labelled precursor molecule. Although the biogenesis of artemisinin, the anti-malarial principle from this species, has been extensively studied over the past twenty years, there is still no consensus as to the true biosynthetic route to this important natural product - indeed, some published experimental results are directly contradictory. One possible reason for this confusion may be the ease with which some of the metabolites from A. annua undergo spontaneous autoxidation, as exemplified by our recent in vitro studies of the spontaneous autoxidation of dihydroartemisinic acid; the application of 13C-2H COSY to this biosynthetic problem has been important in helping to guard against such processes. In this in vivo application of 13C-2H COSY, [15-13C2H3]-dihydroartemisinic acid (the doubly-labelled analogue of the natural product from this species, obtained through synthesis) was fed to A. annua plants and was shown to be converted into several natural products which have been described previously, including artemisinin. It is proposed that all of these transformations occurred via a tertiary hydroperoxide intermediate derived from dihydroartemisinic acid. This intermediate was observed directly in the feeding experiment by the 13C-2H COSY technique; its observation by more traditional procedures (e.g., chromatographic separation followed by spectroscopic analysis of the purified product) would have been difficult owing to the instability of the hydroperoxide group, as had been established previously by our in vitro studies of the spontaneous autoxidation of dihydroartemisinic acid. This same hydroperoxide has been reported as the initial product of that spontaneous autoxidation. Its observation in this feeding experiment by the 13C-2H COSY technique, a procedure which requires the minimum of sample manipulation in order to achieve a reliable identification of metabolites (based on both 13C and 2H chemical shifts at the 15-position), provides the best possible evidence for its status as a genuine biosynthetic intermediate, rather than merely an artifact of the experimental procedure.
Abstract:
This paper extends the build-operate-transfer (BOT) concession model (BOTCcM) into a new method for identifying a concession period by using bargaining-game theory. The concession period is one of the most important decision variables in arranging a BOT-type contract, yet few methodologies are available for helping to determine its value. BOTCcM offers an alternative method by which a group of concession period solutions can be produced. Nevertheless, a typical weakness of BOTCcM is that the model cannot recommend a specific concession period. This paper introduces a new method called the BOT bargaining concession model (BOTBaC) to enable the identification of a specific concession period, taking into account the bargaining behavior of the two parties to a BOT contract, namely the investor and the government. The application of BOTBaC is demonstrated through an example case.
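As a hedged illustration of how bargaining-game theory can pin down one period inside a feasible range (this is a generic Rubinstein alternating-offers split, not the BOTBaC model itself), the sketch below lets each party's patience determine how much of a BOTCcM-style interval it captures; all numbers are assumed.

```python
# Generic Rubinstein split over a feasible concession-period interval.
def rubinstein_share(delta_proposer, delta_responder):
    """First mover's equilibrium share of the surplus under alternating offers."""
    return (1 - delta_responder) / (1 - delta_proposer * delta_responder)

t_min, t_max = 15.0, 25.0                      # feasible periods in years (assumed)
delta_investor, delta_government = 0.95, 0.90  # discount factors (assumed)

# The investor, who wants a longer period, is taken to propose first.
share = rubinstein_share(delta_investor, delta_government)
concession_period = t_min + share * (t_max - t_min)
print(f"agreed concession period: {concession_period:.1f} years")
```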
Abstract:
A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping to meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data are not yet available or are incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow it to meet its emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants' conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.
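A minimal Monte Carlo version of the aggregation idea is sketched below: each instrument's emissions are drawn from its own distribution (one calibrated, one expert-elicited, echoing the Bayesian use of expert opinion), the draws are summed to approximate the scheme-wide joint distribution, and the probability of exceeding a national target is read off. The distributions and the cap are assumptions for illustration.

```python
# Monte Carlo aggregation of per-instrument emissions; all numbers are assumed.
import random

random.seed(1)

def instrument_a():
    return random.gauss(mu=50.0, sigma=5.0)   # e.g. calibrated from measured data

def instrument_b():
    return random.triangular(low=20.0, high=40.0, mode=25.0)  # e.g. expert-elicited

target = 90.0  # scheme-wide cap (illustrative units, e.g. MtCO2)
draws = [instrument_a() + instrument_b() for _ in range(100_000)]
p_exceed = sum(d > target for d in draws) / len(draws)
print(f"P(aggregate emissions exceed target) = {p_exceed:.3f}")
```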
Abstract:
This article reports on part of a larger study of the impact of strategy training in listening on learners of French aged 16 to 17. One aim of the project was to investigate whether such training might have a positive effect on the self-efficacy of learners, by helping them see the relationship between the strategies they employed and what they achieved. One group of learners received strategy training together with detailed feedback on their listening strategy use and on the reflective diaries they were asked to keep, in order to draw their attention to the relationship between strategies and learning outcomes. Another group received strategy training without feedback or reflective diaries, while a comparison group received neither strategy training nor feedback. As a result of the training, there was some evidence that students who had received feedback made the biggest gains in certain aspects of self-efficacy for listening, although their gains compared with the non-feedback group were not as great as had been anticipated. Reasons for this are discussed. The article concludes by suggesting changes in how teachers approach listening comprehension that may improve learners' view of themselves as listeners.
Abstract:
This paper explores how the concept of 'social capital' relates to the teaching of speaking and listening. The argument draws on Bourdieu's notion that a common language is an illusion, but posits that an understanding of the grammar of speech can be productive in developing both an understanding of what constitutes effective speech and competence in speaking. It is argued that applying structuralist notions of written grammar is an inadequate approach to understanding speech acts or enhancing the creative use of speech. An analysis is made of how typical features of speech relate to dramatic dialogue, and of how the meaning of what is said is contingent upon aural and visual signifiers. On this basis, a competent speaker is seen as one who produces expressions appropriate to a range of situations by intentionally employing such signifiers. The paper draws on research into the way drama teachers make explicit reference to and use of semiotics and dramatic effectiveness in order to improve students' performance and, by so doing, empower them to increase their social capital. Ultimately, it is concluded that helping students identify, analyse and employ the aural, visual and verbal grammar of spoken English is not an adjunct to the subject of drama, but an intrinsic part of understanding the art form. What is called for is a re-appraisal by drama teachers of their own understanding of concepts relating to speech acts, in order to enhance this area of their work.