58 results for Challenges of VET


Relevance: 100.00%

Abstract:

Launched with considerable fanfare in 1969, the Committee on the Challenges of Modern Society (CCMS) was supposed to bring new life to NATO by both re-energising public support and engaging with a variety of themes, issues and partners well beyond the alliance’s traditional scope. The first aim of this article is to go beyond the careful media operation that surrounded the launch of the CCMS and to examine the scepticism and resistance of some European partners, particularly the British. The second aim is to demonstrate that NATO started to think in terms of crisis management, disaster relief and environmental disasters well before 1989. The sheer military strength of the alliance and of its partners did remain central – and notably came back to the forefront in 1979 – but the alliance did start to see itself as a geopolitical player and to consider engagement beyond its strictly defined geographical area as early as 1969.

Relevance: 100.00%

Abstract:

Lack of access to insurance exacerbates the impact of climate variability on smallholder farmers in Africa. Unlike traditional insurance, which compensates proven agricultural losses, weather index insurance (WII) pays out in the event that a weather index is breached. In principle, WII could be provided to farmers throughout Africa. There are two data-related hurdles to this. First, most farmers do not live close enough to a rain gauge with a sufficiently long record of observations. Second, mismatches between weather indices and yield may expose farmers to uncompensated losses, and insurers to unfair payouts – a phenomenon known as basis risk. In essence, basis risk results from complexities in the progression from meteorological drought (rainfall deficit) to agricultural drought (low soil moisture). In this study, we use a land-surface model to describe the transition from meteorological to agricultural drought. We demonstrate that spatial and temporal aggregation of rainfall results in a clearer link with soil moisture, and hence a reduction in basis risk. We then use an advanced statistical method to show how optimal aggregation of satellite-based rainfall estimates can reduce basis risk, enabling remotely sensed data to be utilized robustly for WII.
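The strike-based payout rule and the aggregation step described above can be sketched as follows. This is a minimal illustration only: the function names, the linear payout shape, and all parameter values are assumptions for demonstration, not the contract design or the statistical method used in the study.

```python
# Sketch of a weather index insurance (WII) payout rule: aggregate rainfall
# over a season (temporal aggregation) and, optionally, over several satellite
# pixels (spatial aggregation), then pay out when the index breaches a strike.

def wii_payout(daily_rainfall_mm, strike_mm, max_payout, exit_mm=0.0):
    """Linear payout between strike (payout starts) and exit (full payout)."""
    index = sum(daily_rainfall_mm)          # temporal aggregation of rainfall
    if index >= strike_mm:
        return 0.0                          # index not breached: no payout
    shortfall = (strike_mm - index) / (strike_mm - exit_mm)
    return max_payout * min(shortfall, 1.0)

def spatial_aggregate(pixel_series):
    """Average rainfall across satellite pixels at each time step -- the kind
    of spatial aggregation the study finds links more cleanly to soil moisture."""
    n_steps = len(pixel_series[0])
    return [sum(s[t] for s in pixel_series) / len(pixel_series)
            for t in range(n_steps)]
```

For example, a season totalling 80 mm against a 100 mm strike and a 1000-unit maximum payout yields a payout of 200; the broader the aggregation, the less the index depends on any single gauge or pixel, which is one way the basis-risk reduction described above manifests.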

Relevance: 90.00%

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.
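The adaptive mesh refinement idea advocated above can be illustrated with a toy 1-D sketch: flag cells where the solution gradient is steep, then insert midpoints so that resolution follows the flow features. This is purely illustrative; operational AMR dynamical cores involve block-structured grids, conservative remapping, and far more besides.

```python
# Toy 1-D adaptive mesh refinement (AMR): refine the grid only where the
# field varies rapidly, instead of using uniform high resolution everywhere.

def flag_cells_for_refinement(field, dx, threshold):
    """Return indices of interior cells whose centred gradient exceeds threshold."""
    flagged = []
    for i in range(1, len(field) - 1):
        grad = abs(field[i + 1] - field[i - 1]) / (2.0 * dx)
        if grad > threshold:
            flagged.append(i)
    return flagged

def refine(x, field, flagged):
    """Halve the cell spacing around flagged cells by inserting midpoints,
    filling new points by linear interpolation."""
    new_x, new_f = [x[0]], [field[0]]
    for i in range(1, len(x)):
        if i in flagged or (i - 1) in flagged:
            new_x.append(0.5 * (x[i - 1] + x[i]))
            new_f.append(0.5 * (field[i - 1] + field[i]))
        new_x.append(x[i])
        new_f.append(field[i])
    return new_x, new_f
```

Applied to a step-like field, only the cells straddling the step are refined, which is the scaling advantage the paper points to: computational effort concentrates on the multiscale features rather than the whole domain.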

Relevance: 90.00%

Abstract:

Existing data on animal health and welfare in organic livestock production systems in the European Community countries are reviewed in the light of the demands and challenges of the recently implemented EU regulation on organic livestock production. The main conclusions and recommendations of a three-year networking project on organic livestock production are summarised and the future challenges to organic livestock production in terms of welfare and health management are discussed. The authors conclude that, whilst the available data are limited and the implementation of the EC regulation is relatively recent, there is little evidence to suggest that organic livestock management causes major threats to animal health and welfare in comparison with conventional systems. There are, however, some well-identified areas, like parasite control and balanced ration formulation, where efforts are needed to find solutions that meet organic standard requirements and guarantee high levels of health and welfare. It is suggested that, whilst organic standards offer an implicit framework for animal health and welfare management, there is a need to solve apparent conflicts between the organic farming objectives in regard to environment, public health, farmer income and animal health and welfare. The key challenges for the future of organic livestock production in Europe are related to the feasibility of implementing improved husbandry inputs and the development of evidence-based decision support systems for health and feeding management.

Relevance: 90.00%

Abstract:

This paper introduces the findings of a recent study on the use of information technology (IT) among quantity surveying (QS) organisations in Hong Kong. The study was conducted through a structured questionnaire survey of 18 QS organisations registered in Hong Kong, representing around 53% of the total number of organisations in the profession. The data set generated from this study provided rich information about what information technology the QS profession used and what benefits and barriers users in the industry perceived. The survey concluded that although IT is widely used in QS organisations in Hong Kong, it is mainly used to support various individual tasks of QS services at a basic level, rather than to streamline the production of QS services as a whole through automation. Most of the respondents agreed that IT plays an important role in the QS profession, but they had not fully taken advantage of IT to improve their competitive edge in the market. They usually adopted a more passive “wait and see” approach. In addition, very few QS organisations in Hong Kong have a comprehensive policy for promoting the use of IT within the organisation. It is recommended that the QS profession recognise the importance of IT and take appropriate actions to meet the challenges of an ever-changing and competitive marketplace.

Relevance: 90.00%

Abstract:

The addition of small quantities of nanoparticles to conventional and sustainable thermoplastics leads to property enhancements with considerable potential in many areas of application, including food packaging [1], lightweight composites and high-performance materials [2]. In the case of sustainable polymers [3], the addition of nanoparticles may well enhance properties sufficiently that the portfolio of possible applications is greatly increased. Most engineered nanoparticles are highly stable and exist as nanoparticles prior to compounding with the polymer resin. They remain as nanoparticles during the active use of the packaging material as well as in the subsequent waste and recycling streams. It is also possible to construct the nanoparticles within the polymer films during processing from organic compounds selected to present minimal or no potential health hazards [4]. In both cases the characterisation of the resultant nanostructured polymers presents a number of challenges. Foremost amongst these are the coupled challenges of the nanoscale of the particles and the low fraction present in the polymer matrix. Very low fractions of nanoparticles are only effective if the dispersion of the particles is good. This continues to be an issue in process engineering, but of course bad dispersion is much easier to see than good dispersion. In this presentation we show the merits of a combined scattering (neutron and X-ray) and microscopy (SEM, TEM, AFM) approach. We explore this methodology using rod-like, plate-like and spheroidal particles, including metallic particles, plate-like and rod-like clay dispersions, and carbon-based nanoscale particles such as nanotubes and graphene flakes. We will draw on a range of material systems, many explored in partnership with other members of Napolynet. The value of adding nanoscale particles is that their scale matches that of the structure in the polymer matrix. Although this can lead to difficulties in separating the effects in scattering experiments, in morphological studies it means that both the nanoparticles and the polymer morphology are revealed.

Relevance: 90.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will enable investigation of what resolution, both horizontal and vertical, in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limits in computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance: 90.00%

Abstract:

Following trends in operational weather forecasting, where ensemble prediction systems (EPS) are now increasingly the norm, flood forecasters are beginning to experiment with similar ensemble methods. Most of the effort to date has focused on the substantial technical challenges of developing coupled rainfall-runoff systems to represent the full cascade of uncertainties involved in predicting future flooding. As a consequence, much less attention has been given to the communication and eventual use of EPS flood forecasts. Drawing on interviews and other research with operational flood forecasters from across Europe, this paper highlights a number of challenges to communicating and using ensemble flood forecasts operationally. It is shown that operational flood forecasters understand the skill, operational limitations, and informational value of EPS products in a variety of different and sometimes contradictory ways. Despite the efforts of forecasting agencies to design effective ways to communicate EPS forecasts to non-experts, operational flood forecasters were often sceptical about the ability of forecast recipients to understand or use them appropriately. It is argued that better training and closer contacts between operational flood forecasters and EPS system designers can help ensure the uncertainty represented by EPS forecasts is communicated in ways that are most appropriate and meaningful for their intended consumers, but some fundamental political and institutional challenges to using ensembles, such as differing attitudes to false alarms and to responsibility for the management of blame in the event of poor or mistaken forecasts, are also highlighted. Copyright © 2010 Royal Meteorological Society.

Relevance: 90.00%

Abstract:

Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attack, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants. As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.

Relevance: 90.00%

Abstract:

Purpose of review: There is growing interest in applying metabolic profiling technologies to food science, as this approach is now embedded in the foodomics toolbox. This review aims at exploring how metabolic profiling can be applied to the development of functional foods.

Recent findings: One of the biggest challenges of modern nutrition is to propose a healthy diet to populations worldwide that must suit high inter-individual variability driven by complex gene–nutrient–environment interactions. Although a number of functional foods are now proposed in support of a healthy diet, a one-size-fits-all approach to nutrition is inappropriate and new personalised functional foods are necessary. Metabolic profiling technologies can assist at various levels of the development of functional foods, from screening for food composition to identification of new biomarkers of food intake to support diet intervention and epidemiological studies.

Summary: Modern ‘omics’ technologies, including metabolic profiling, will support the development of new personalised functional foods of high relevance to twenty-first-century medical challenges such as controlling the worldwide spread of metabolic disorders and ensuring healthy ageing.

Relevance: 90.00%

Abstract:

Overcoming the natural defensive barrier functions of the eye remains one of the greatest challenges of ocular drug delivery. The cornea is a chemical and mechanical barrier preventing the passage of foreign bodies, including drugs, into the eye, but the factors limiting penetration of permeants and nanoparticulate drug delivery systems through the cornea are still not fully understood. In this study, we investigate these barrier properties of the cornea using thiolated and PEGylated (750 and 5000 Da) nanoparticles, sodium fluorescein, and two linear polymers (dextran and polyethylene glycol). Experiments used intact bovine cornea, de-epithelialized bovine cornea, and tissues pretreated with cyclodextrin. It was shown that the corneal epithelium is the major barrier to permeation; pretreatment of the cornea with β-cyclodextrin provides higher permeation of low molecular weight compounds, such as sodium fluorescein, but does not enhance penetration of nanoparticles and larger molecules. Studying penetration of thiolated and PEGylated (750 and 5000 Da) nanoparticles into the de-epithelialized ocular tissue revealed that interactions between the corneal surface and thiol groups of nanoparticles were more significant determinants of penetration than particle size (for the sizes used here). PEGylation with polyethylene glycol of a higher molecular weight (5000 Da) allows penetration of nanoparticles into the stroma, which proceeds gradually, after an initial 1 h lag phase.

Relevance: 90.00%

Abstract:

We present a new parameterisation that relates surface mass balance (SMB: the sum of surface accumulation and surface ablation) to changes in surface elevation of the Greenland ice sheet (GrIS) for the MAR (Modèle Atmosphérique Régional: Fettweis, 2007) regional climate model. The motivation is to dynamically adjust SMB as the GrIS evolves, allowing us to force ice sheet models with SMB simulated by MAR while incorporating the SMB–elevation feedback, without the substantial technical challenges of coupling ice sheet and climate models. This also allows us to assess the effect of elevation feedback uncertainty on the GrIS contribution to sea level, using multiple global climate and ice sheet models, without the need for additional, expensive MAR simulations. We estimate this relationship separately below and above the equilibrium line altitude (ELA, separating negative and positive SMB) and for regions north and south of 77° N, from a set of MAR simulations in which we alter the ice sheet surface elevation. These give four “SMB lapse rates”, gradients that relate SMB changes to elevation changes. We assess uncertainties within a Bayesian framework, estimating probability distributions for each gradient from which we present best estimates and credibility intervals (CI) that bound 95% of the probability. Below the ELA our gradient estimates are mostly positive, because SMB usually increases with elevation: 0.56 (95% CI: −0.22 to 1.33) kg m−3 a−1 for the north, and 1.91 (1.03 to 2.61) kg m−3 a−1 for the south. Above the ELA, the gradients are much smaller in magnitude: 0.09 (−0.03 to 0.23) kg m−3 a−1 in the north, and 0.07 (−0.07 to 0.59) kg m−3 a−1 in the south, because SMB can either increase or decrease in response to increased elevation. Our statistically founded approach allows us to make probabilistic assessments for the effect of elevation feedback uncertainty on sea level projections (Edwards et al., 2014).
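A minimal sketch of how such an SMB lapse-rate parameterisation might be applied, using the best-estimate gradients quoted above. The function itself, its names, and the simple linear correction are illustrative assumptions for demonstration, not the study's actual implementation.

```python
# Adjust MAR-simulated surface mass balance (SMB) for a change in ice-sheet
# surface elevation, picking the gradient by region (north/south of 77 N)
# and by sign of SMB as a crude proxy for being below/above the ELA.
# Best-estimate gradients from the abstract, in kg m^-3 a^-1.
SMB_GRADIENTS = {
    ("north", "below_ela"): 0.56,
    ("south", "below_ela"): 1.91,
    ("north", "above_ela"): 0.09,
    ("south", "above_ela"): 0.07,
}

def adjust_smb(smb, delta_elevation_m, latitude_deg):
    """Return SMB (kg m^-2 a^-1) corrected for an elevation change (m)."""
    region = "north" if latitude_deg >= 77.0 else "south"
    zone = "below_ela" if smb < 0.0 else "above_ela"  # negative SMB: ablation zone
    gradient = SMB_GRADIENTS[(region, zone)]          # kg m^-3 a^-1
    return smb + gradient * delta_elevation_m
```

For a southern ablation-zone point with SMB of −500 kg m−2 a−1, a 100 m surface lowering gives −500 + 1.91 × (−100) = −691, i.e. more negative SMB at lower elevation, which is the positive SMB–elevation feedback the parameterisation is designed to capture without re-running MAR.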