918 results for "Generation of test processes"
Abstract:
A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
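The merging step described above amounts to a mean-preserving spatial scaling: each fine cell receives the coarse-cell intensity weighted by its climatological share. The following sketch is illustrative only; the function name, cell counts and numbers are invented, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): disaggregate a coarse-cell
# rainfall intensity over its sub-grid 1-km cells using a fine-scale
# climatology, preserving the coarse-cell mean.

def disaggregate(coarse_intensity, fine_climatology):
    """Scale a coarse-grid rainfall intensity onto sub-grid cells.

    Each fine cell gets the coarse intensity weighted by the ratio of its
    climatological mean to the average climatology of the coarse cell, so
    the mean over the fine cells equals the coarse value.
    """
    mean_clim = sum(fine_climatology) / len(fine_climatology)
    if mean_clim == 0:          # dry climatology: spread uniformly
        return [coarse_intensity] * len(fine_climatology)
    return [coarse_intensity * c / mean_clim for c in fine_climatology]

# Example: a 4 mm/h coarse value over four 1-km cells with uneven climatology
cells = disaggregate(4.0, [1.0, 2.0, 3.0, 2.0])
print(cells)  # wetter climatological cells receive higher intensities
```

Because the scaling is mean-preserving, aggregating the fine cells recovers the coarse observation exactly, which is the property that keeps the disaggregated series consistent with the TRMM 3B42 input.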
Abstract:
Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, second-generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first-generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first- and second-generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.
Abstract:
The African Technology Policy Studies Network (ATPS) is a multidisciplinary network of researchers, private sector actors, policymakers and civil society. ATPS has the vision to become the leading international centre of excellence and reference in science, technology and innovation (STI) systems research, training and capacity building, communication and sensitization, knowledge brokerage, policy advocacy and outreach in Africa. It has a Regional Secretariat in Nairobi, Kenya, and operates through national chapters in 29 countries (including 27 in Africa and two chapters in the United Kingdom and the USA for Africans in the Diaspora), with an expansion plan to cover the entire continent by 2015. The ATPS Phase VI Strategic Plan aims to improve the understanding and functioning of STI processes and systems to strengthen the learning capacity, social responses, and governance of STI for addressing Africa's development challenges, with a specific focus on the Millennium Development Goals (MDGs). A team of external evaluators carried out a midterm review to assess the effectiveness and efficiency of the implementation of the Strategic Plan for the period January 1, 2009 to December 31, 2010. The evaluation methodology involved multiple quantitative and qualitative methods to assess the inputs (human resources, financial resources, time, etc.) into ATPS activities (both thematic and facilitative) and their tangible and intangible outputs, outcomes and impacts. Methods included a questionnaire survey of ATPS members and stakeholders, key informant interviews, and focus group discussions (FGDs) with members in six countries.

Effectiveness of Programmes
Under all six strategic goals, very good progress has been made towards planned outputs and outcomes. This is evidenced by key performance indicators (KPIs) generated from desk review, ratings from the survey respondents, and the themes that run through the FGDs.
Institutional and Programme Cost Effectiveness
Institutional Effectiveness: assessment of institutional effectiveness suggests that adequate management frameworks are in place and are being used effectively and transparently. Technical and financial accounting mechanisms are also being followed in accordance with grant agreements and with global good practice. This is evidenced by KPIs generated from desk review.
Programme Cost Effectiveness: assessment of the cost-effectiveness of programme execution shows that the organisational structure is efficient, delivering high-quality, relevant research at relatively low cost by international standards. The evidence includes KPIs from desk review: the ratio of administrative costs to programme costs has fallen steadily, to around 10%, and the average size of research grants is modest, without compromising quality. There is also a high level of pro bono input by ATPS members.

ATPS Programmes Strategic Evaluation
ATPS research and STI-related activities are unique and well aligned with the STI issues and needs facing Africa and the world. The multi-disciplinary and trans-boundary nature of the research activities is creating a unique group of research scientists, and the ATPS approach to research and STI issues is paving the way for the so-called Third Generation University (3GU). Recognising this unique positioning, an increasing number of international multilateral agencies are seeking partnership with ATPS, and ATPS is seeing an increasing level of funding commitments from donor partners.

Recommendations for ATPS Continued Growth and Effectiveness
On-going reform of the ATPS administrative structure should continue. The reforms that have taken place within the Board, the Regional Secretariat, and at the National Chapter coordination level are welcomed. Such reform should continue until fully functional corporate governance policies and practices are established and implemented across the ATPS governance structures.
This will further strengthen ATPS to achieve its vision of being the leading STI policy brokerage organization in Africa. Although training in corporate governance has recently been carried out for all sectors of the ATPS leadership structure, there is some evidence that these systems have not yet been fully and effectively implemented within all the governance structures of the organization, especially at the Board and National Chapter levels. Future training should emphasize practical application, with exercises relevant to the ATPS leadership structure from the Board to the National Chapter levels.

Training on Transformational Leadership: Leading a Change
Though a subject of intense debate amongst economists and social scientists, it is generally agreed that cultural mindsets and attitudes can enhance or hinder organizational progress. The ATPS vision demands transformational leadership skills amongst its leaders, from the Board members to the National Chapter Coordinators. To lead such a change, ATPS leaders must understand and avoid the personal and cultural mindsets and value systems that hinder change, while embracing those that enhance it. This requires deliberate assessment of the cultural and behavioural patterns that could hinder progress, and the willingness to recast them into cultural and personal habits that make for progress.

Improvement of Relationships amongst the Board, Secretariat, and National Chapters
A large number of ATPS members and stakeholders feel they do not have effective communication with, or access to, Board, National Chapter Coordinator and Regional Secretariat activities. Effort should be made to improve the implementation of the ATPS communication strategy so that information flows better between ATPS management and the members.
The results of the survey and the FGDs suggest that progress has been made in this direction during the past two years, but more could be done to ensure the effective flow of pertinent information to members through ATPS communication channels.

Strategies for Increased Funding for National Chapters
There is a large gap between the fundraising skills of the Regional Secretariat and those of the National Coordinators. In some cases, funds successfully raised by the Secretariat and disbursed to national chapters were not followed up with timely progress and financial reports by some national chapters. Adequate training in the skills required for effective interaction with key STI policy players should be conducted regularly for National Chapter Coordinators and ATPS members. The ongoing training in grant writing should continue and be made continent-wide if funding permits. Funding of National Chapters should be strategic, building capacity in a specific area of research which, with time, will not only lead to strong research capacity in that area but also strengthen academic programmes. For example, a strong climate change programme is emerging at the University of Nigeria, Nsukka (UNN), with strong collaborations with universities in neighbouring states.

Strategies to Increase National Government Buy-in and Support for STI
Translating STI research outcomes into policies requires a great deal of emotional intelligence, a skill often lacking in the first and second generation universities. In the epoch of the science-based, or second generation, universities (2GUs), governments were content with universities carrying out scientific research and providing scientific education. Now they desire to see universities as incubators of new science- or technology-based commercial activities, whether by existing firms or start-ups.
Hence, governments demand that universities take an active and leading role in the exploitation of their knowledge, and they are willing to make funds available to support such activities. Thus, to gain the attention of national leadership, universities must become centres of excellence and explicit instruments of economic development in the knowledge-based economy. They must do this while working collaboratively with government departments, parastatals, institutions and dedicated research establishments. ATPS should anticipate these shifts and devise programmes to assist both governments and universities to relate effectively.

New Administrative Structures in Member Organizations to Sustain and Manage the Emerging STI Multidisciplinary Teams
Second Generation Universities (2GUs) tend to focus on pure science and often do not regard the application of their know-how as their task. In contrast, Third Generation Universities (3GUs) actively stimulate techno-starters – students or academics – to pursue the exploitation or commercialisation of the knowledge they generate, viewing this as equal in importance to the objectives of scientific research and education. Administratively, research in the 2GU era was mainly monodisciplinary, and departments were structured along disciplinary lines. Emerging interdisciplinary scientific teams focused on specific research areas work against this mono-disciplinary, faculty-based administrative structure: for interdisciplinary teams, the current faculty system is an obstacle. New organisational forms of university management are needed that can assign responsibility for the task of know-how exploitation. ATPS must anticipate this and begin to strategize solutions for its member institutions to transition to a 3GU administrative structure; otherwise ATPS growth will plateau, and the progress achieved so far may be stunted.
Abstract:
Systematic climate shifts have been linked to multidecadal variability in observed sea surface temperatures in the North Atlantic Ocean [1]. These links are extensive, influencing a range of climate processes such as hurricane activity [2] and African Sahel [3-5] and Amazonian [5] droughts. The variability is distinct from historical global-mean temperature changes and is commonly attributed to natural ocean oscillations [6-10]. A number of studies have provided evidence that aerosols can influence long-term changes in sea surface temperatures [11, 12], but climate models have so far failed to reproduce these interactions [6, 9] and the role of aerosols in decadal variability remains unclear. Here we use a state-of-the-art Earth system climate model to show that aerosol emissions and periods of volcanic activity explain 76 per cent of the simulated multidecadal variance in detrended 1860–2005 North Atlantic sea surface temperatures. After 1950, simulated variability is within observational estimates; our estimates for 1910–1940 capture twice the warming of previous generation models but do not explain the entire observed trend. Other processes, such as ocean circulation, may also have contributed to variability in the early twentieth century. Mechanistically, we find that inclusion of aerosol–cloud microphysical effects, which were included in few previous multimodel ensembles, dominates the magnitude (80 per cent) and the spatial pattern of the total surface aerosol forcing in the North Atlantic. Our findings suggest that anthropogenic aerosol emissions influenced a range of societally important historical climate events such as peaks in hurricane activity and Sahel drought. Decadal-scale model predictions of regional Atlantic climate will probably be improved by incorporating aerosol–cloud microphysical interactions and estimates of future concentrations of aerosols, emissions of which are directly addressable by policy actions.
Abstract:
This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed—optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and Light Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed. Techniques involved in DEM generation are presented together with accuracy evaluation. Results of DEMs reconstructed from remotely sensed data are illustrated. While the processes of DEM generation from satellite stereoscopic imagery represent a good example of passive, multi-view imaging technology, discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information. With regard to InSAR and LIDAR, detailed discussions are conducted in order to convey the fundamentals of both technologies.
Abstract:
During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases.
Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). It is therefore a useful new method for simulating air mass evolution and variability, and their sensitivity to process parameters.
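The deconstruction of ΔO3 into chemical and physical components can be illustrated with a toy budget calculation. The function and the tendency values below are invented for illustration; the paper's framework additionally propagates measurement uncertainty and initial air mass variability.

```python
# Minimal sketch of the budget bookkeeping described above (illustrative,
# not the paper's model): the net change in O3 mixing ratio along an air
# mass is partitioned into chemical and physical (mixing/deposition) terms.

def partition_o3_change(chem_tendencies, phys_tendencies):
    """Integrate per-timestep tendencies (ppbv per step) into net chemical
    and physical contributions; their sum is the total change."""
    d_chem = sum(chem_tendencies)
    d_phys = sum(phys_tendencies)
    return d_chem, d_phys, d_chem + d_phys

# Hypothetical lower-tropospheric export case: steady photochemical
# destruction partly offset by mixing-in of O3-richer background air.
d_chem, d_phys, d_total = partition_o3_change(
    [-5.0, -5.0, -5.0],   # ~ -5 ppbv/day over three days
    [+2.0, +1.0, +1.5],
)
print(d_chem, d_phys, d_total)
```

The sketch shows why |ΔO3chem| can exceed |ΔO3phys| even when the observed net change is modest: opposing terms partially cancel in the total.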
Abstract:
This paper assesses the performance of a vocabulary test designed to measure second language productive vocabulary knowledge. The test, Lex30, uses a word association task to elicit vocabulary, and uses word frequency data to measure the vocabulary produced. Here we report firstly on the reliability of the test as measured by a test-retest study, a parallel test forms experiment and an internal consistency measure. We then investigate the construct validity of the test by looking at changes in test performance over time, analysing correlations with scores on similar tests, and comparing spoken and written test performance. Last, we examine the theoretical bases of the two main test components: eliciting vocabulary and measuring vocabulary. Interpretations of our findings are discussed in the context of the test validation research literature. We conclude that the findings reported here present a robust argument for the validity of the test as a research tool, and encourage further investigation of its validity in an instructional context.
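Test-retest reliability of the kind reported above is commonly summarised by a correlation between the scores from the two administrations. The sketch below uses a hand-rolled Pearson correlation on invented Lex30-style scores; it is not the study's data or analysis.

```python
# Hedged sketch: quantify test-retest reliability with a Pearson
# correlation between two administrations of the same test.
# All scores below are invented for illustration.
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

test1 = [22, 30, 18, 27, 35, 24]   # hypothetical scores, administration 1
test2 = [24, 29, 17, 30, 33, 25]   # same learners, administration 2
r = pearson(test1, test2)
print(round(r, 3))                  # high r suggests stable rank ordering
```

A high correlation indicates that the test ranks learners consistently across sessions, which is one strand of the reliability argument the paper makes.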
Abstract:
Leucine-Rich Repeat Kinase 2 (LRRK2) is one of the most important genetic contributors to Parkinson's disease. LRRK2 has been implicated in a number of cellular processes, including macroautophagy. To test whether LRRK2 has a role in regulating autophagy, a specific inhibitor of the kinase activity of LRRK2 was applied to human neuroglioma cells and downstream readouts of autophagy were examined. The resulting data demonstrate that inhibition of LRRK2 kinase activity stimulates macroautophagy in the absence of any alteration in the translational targets of mTORC1, suggesting that LRRK2 regulates autophagic vesicle formation independently of canonical mTORC1 signaling. This study represents the first pharmacological dissection of the role LRRK2 plays in the autophagy/lysosomal pathway, emphasizing the importance of this pathway as a marker for LRRK2 physiological function. Moreover, it highlights the need to dissect autophagy and lysosomal activities in the context of LRRK2-related pathologies, with the final aim of understanding their aetiology and identifying specific targets for disease-modifying therapies in patients.
Abstract:
The flow patterns generated by a pulsating jet used to study hydrodynamic modulated voltammetry (HMV) are investigated. It is shown that the pronounced edge effect reported previously is the result of the generation of a vortex ring from the pulsating jet. This vortex behaviour of the pulsating jet system is imaged using a number of visualisation techniques. These include a dye system and an electrochemically generated bubble stream. In each case a toroidal vortex ring was observed. Image analysis revealed that the velocity of this motion was of the order of 250 mm s⁻¹ with a corresponding Reynolds number of the order of 1200. This motion, in conjunction with the electrode structure, is used to explain the strong ‘ring and halo’ features detected by electrochemical mapping of the system reported previously.
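The quoted velocity and Reynolds number can be related by the standard definition Re = U·L/ν. The sketch below is a back-of-envelope check only: the characteristic length of ~5 mm is an assumed value (the paper does not state it here), and water's kinematic viscosity is taken as the default for this aqueous electrochemical system.

```python
# Back-of-envelope check of the reported Reynolds number (illustrative;
# the characteristic length below is an assumed value, not from the paper).

def reynolds(velocity_m_s, length_m, kinematic_viscosity_m2_s=1.0e-6):
    """Re = U * L / nu, with water's kinematic viscosity by default."""
    return velocity_m_s * length_m / kinematic_viscosity_m2_s

# 250 mm/s ring velocity with an assumed characteristic length of ~5 mm
# gives a Reynolds number of the same order (~1200) as reported.
re = reynolds(0.25, 0.005)
print(re)
```

The order-of-magnitude agreement supports the interpretation of the observed motion as a moderately inertial toroidal vortex ring.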
Abstract:
The parameterisation of diabatic processes in numerical models is critical for the accuracy of weather forecasts and for climate projections. A novel approach to the evaluation of these processes in models is introduced in this contribution. The approach combines a suite of on-line tracer diagnostics with off-line trajectory calculations. Each tracer tracks the cumulative changes in potential temperature associated with a particular parameterised diabatic process in the model. A comparison of tracers therefore allows the identification of the most active diabatic processes and their downstream impacts. The tracers are combined with trajectories computed using model-resolved winds, allowing the various diabatic contributions to be tracked back to their time and location of occurrence. We have used this approach to investigate diabatic processes within a simulated extratropical cyclone. We focus on the warm conveyor belt, in which the dominant diabatic contributions come from large-scale latent heating and parameterised convection. By contrasting two simulations, one with standard convection parameterisation settings and another with reduced parameterised convection, the effects of parameterised convection on the structure of the cyclone have been determined. Under reduced parameterised convection conditions, the large-scale latent heating is forced to release convective instability that would otherwise have been released by the convection parameterisation. Although the spatial distribution of precipitation depends on the details of the split between parameterised convection and large-scale latent heating, the total precipitation amount associated with the cyclone remains largely unchanged. For reduced parameterised convection, a more rapid and stronger latent heating episode takes place as air ascends within the warm conveyor belt.
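The tracer bookkeeping described above can be sketched as a running sum of per-process potential-temperature increments along a trajectory. The process names and numbers below are invented for illustration; the real diagnostics are computed on-line inside the model.

```python
# Illustrative sketch of the tracer bookkeeping: each tracer accumulates
# the potential-temperature increments from one parameterised process, so
# comparing tracer totals identifies the dominant diabatic process.
# Process names and values are invented for illustration.

def accumulate_theta_tracers(increments_per_step):
    """increments_per_step: list of dicts {process: d_theta in K} per
    timestep along a trajectory. Returns the accumulated change per process."""
    totals = {}
    for step in increments_per_step:
        for process, d_theta in step.items():
            totals[process] = totals.get(process, 0.0) + d_theta
    return totals

steps = [
    {"large_scale_latent_heating": 1.2, "convection": 0.4, "radiation": -0.1},
    {"large_scale_latent_heating": 2.0, "convection": 0.9, "radiation": -0.1},
]
totals = accumulate_theta_tracers(steps)
dominant = max(totals, key=totals.get)
print(totals, dominant)
```

Summing the tracer totals recovers the total diabatic heating along the trajectory, which is what makes the per-process attribution self-consistent.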
Abstract:
Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that the identification and correction of short-term climate model errors have the potential to improve the modeled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal to decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model's long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilized simulation, (2) determine the time scale of the advent of the bias and its propagation, (3) find the geographical origin of the bias, (4) evaluate the degree of coupling in the development of the bias, (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations and exploring various degrees of coupling. In particular, hindcasts give the time scale of bias advent, regionally restored experiments show the geographical origin, and ocean-only simulations isolate the field responsible for the bias and evaluate the degree of coupling in the bias development. This strategy is applied to four prominent SST biases of the IPSLCM5A-LR coupled model in the tropical Pacific that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears in a few months and is caused by a lack of upwelling due to too-weak meridional coastal winds off Peru.
The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of an equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows little improvement. The strategy proposed in this study is a further step to move from the current ad hoc approach to a bias-targeted, priority-setting, systematic model development approach.
Abstract:
Various complex oscillatory processes are involved in the generation of the motor command. The temporal dynamics of these processes were studied for movement detection from single-trial electroencephalogram (EEG) recordings. Autocorrelation analysis was performed on the EEG signals to find robust markers of movement detection. The evolution of the autocorrelation function was characterised via its relaxation time, obtained by exponential curve fitting. It was observed that the decay constant of the exponential curve increased during movement, indicating that the autocorrelation function decays more slowly during motor execution. Significant differences were observed between movement and no-movement tasks. Additionally, a linear discriminant analysis (LDA) classifier was used to identify movement trials with a peak accuracy of 74%.
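The core of the analysis above (autocorrelation followed by an exponential fit to extract a relaxation time) can be sketched with stdlib tools. This is not the authors' code: the signal is a synthetic AR(1) process whose theoretical autocorrelation is ρ^lag, and the fit uses a simple log-linear least squares over the positive part of the curve.

```python
# Hedged sketch of the analysis: estimate the autocorrelation of a signal,
# then fit acf[lag] ~ exp(-lag / tau) to recover the relaxation time tau.
# The AR(1) signal below is synthetic, not EEG data.
from math import log
import random

def autocorrelation(x, max_lag):
    """Sample autocorrelation of x at lags 0..max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    return [
        sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / var
        for lag in range(max_lag + 1)
    ]

def decay_constant(acf):
    """Least-squares fit of log(acf) vs lag over the initial positive
    part of the curve; returns tau in acf[lag] ~ exp(-lag / tau)."""
    pts = []
    for lag, a in enumerate(acf):
        if a <= 0:
            break
        pts.append((lag, log(a)))
    n = len(pts)
    sl = sum(l for l, _ in pts)
    sy = sum(y for _, y in pts)
    sll = sum(l * l for l, _ in pts)
    sly = sum(l * y for l, y in pts)
    slope = (n * sly - sl * sy) / (n * sll - sl * sl)
    return -1.0 / slope

random.seed(0)
rho = 0.9                       # slow decay, as reported during movement
x, v = [0.0], 0.0
for _ in range(5000):
    v = rho * v + random.gauss(0.0, 1.0)
    x.append(v)
acf = autocorrelation(x, 20)
tau = decay_constant(acf)
print(round(tau, 2))            # near the theoretical -1/ln(0.9), about 9.5
```

A larger tau (slower decay) would serve as the movement marker described above; tau per trial could then feed an LDA classifier.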
Abstract:
Flash floods pose a significant danger to life and property. Unfortunately, in arid and semiarid environments runoff generation shows complex non-linear behavior with strong spatial and temporal non-uniformity. As a result, the predictions made by physically-based simulations in semiarid areas are subject to great uncertainty, and a failure in the predictive behavior of existing models is common. Better descriptions of physical processes at the watershed scale therefore need to be incorporated into hydrological model structures. For example, terrain relief has been systematically considered static in flood modelling at the watershed scale. Here, we show that the integrated effect of small, distributed relief variations, originating through concurrent hydrological processes within a storm event, was significant for the watershed-scale hydrograph. We model these observations by introducing dynamic formulations of two relief-related parameters at diverse scales: maximum depression storage, and the roughness coefficient in channels. In the final (a posteriori) model structure these parameters are allowed to be either time-constant or time-varying. The case under study is a convective storm in a semiarid Mediterranean watershed with ephemeral channels and high agricultural pressures (the Rambla del Albujón watershed; 556 km²), which showed a complex multi-peak response. First, to obtain quasi-sensible simulations in the (a priori) model with time-constant relief-related parameters, a spatially distributed parameterization was strictly required. Second, a generalized likelihood uncertainty estimation (GLUE) inference applied to the improved model structure, and conditioned on observed nested hydrographs, showed that accounting for dynamic relief-related parameters led to improved simulations.
The discussion is finally broadened by considering the use of the calibrated model both to analyze the sensitivity of the watershed to storm motion and to attempt flood forecasting of a stratiform event with markedly different behavior.
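The GLUE inference mentioned above can be sketched in a few lines: sample parameter sets, score each against observations with a likelihood measure, keep the "behavioural" sets above a threshold, and weight predictions by likelihood. Everything below (the one-parameter runoff model, the threshold, the numbers) is a toy invented for illustration, not the study's setup.

```python
# Minimal GLUE-style sketch (illustrative): Monte Carlo parameter sets are
# scored with a Nash-Sutcliffe likelihood; behavioural sets are retained
# and likelihood-weighted.
import random

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

def toy_model(storage, rainfall):
    """Hypothetical one-parameter runoff model: rainfall above a
    depression-storage threshold (mm) becomes runoff."""
    return [max(0.0, r - storage) for r in rainfall]

rain = [0.0, 5.0, 12.0, 8.0, 2.0]
obs = toy_model(3.0, rain)           # pretend the "true" storage is 3 mm

random.seed(1)
behavioural = []
for s in (random.uniform(0.0, 10.0) for _ in range(200)):
    score = nse(obs, toy_model(s, rain))
    if score > 0.7:                  # behavioural threshold (assumed)
        behavioural.append((s, score))

total = sum(w for _, w in behavioural)
estimate = sum(s * w for s, w in behavioural) / total
print(round(estimate, 1))            # close to the true value of 3 mm
```

The spread of the behavioural set, not just the weighted estimate, is what GLUE reports as predictive uncertainty.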
Abstract:
Spatially dense observations of gust speeds are necessary for various applications, but their availability is limited in space and time. This work presents an approach to help overcome this problem. The main objective is the generation of synthetic wind gust velocities. With this aim, theoretical wind and gust distributions are estimated from 10 yr of hourly observations collected at 123 synoptic weather stations provided by the German Weather Service. As pre-processing, an exposure correction is applied to measurements of the mean wind velocity to reduce the influence of local urban and topographic effects. The wind gust model is built as a transfer function between distribution parameters of wind and gust velocities. The aim of this procedure is to estimate the parameters of gusts at stations where only wind speed data are available. These parameters can be used to generate synthetic gusts, which can improve the accuracy of return periods at test sites with a lack of observations. The second objective is to determine return periods much longer than the nominal length of the original time series by means of extreme value statistics. Estimates for both local maximum return periods and average return periods for single historical events are provided. The comparison of maximum and average return periods shows that even storms with short average return periods may lead to local wind gusts with return periods of several decades. Despite uncertainties caused by the short length of the observational records, the method leads to consistent results, enabling a wide range of possible applications.
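The extreme-value step above can be illustrated with the standard return-period relation T = 1 / (1 − F(x)) for an annual-maximum distribution. The Gumbel family and the parameter values below are assumptions for demonstration; the paper's actual model is a transfer function between fitted wind and gust distribution parameters.

```python
# Illustrative extreme-value sketch (assumed Gumbel annual-maximum fit,
# assumed parameters; not the paper's fitted model).
from math import exp

def gumbel_cdf(x, mu, beta):
    """CDF of the Gumbel distribution with location mu and scale beta."""
    return exp(-exp(-(x - mu) / beta))

def return_period_years(x, mu, beta):
    """Return period of an annual-maximum gust exceeding x: 1/(1 - F(x))."""
    return 1.0 / (1.0 - gumbel_cdf(x, mu, beta))

# Assumed location mu = 25 m/s and scale beta = 3 m/s for annual maxima
for gust in (30.0, 35.0, 40.0):
    print(gust, round(return_period_years(gust, 25.0, 3.0), 1))
```

The steep growth of T with gust speed is what lets synthetic gust parameters extend return-period estimates well beyond the 10-yr record length, at the cost of sensitivity to the fitted tail.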
Abstract:
This article describes a case study involving information technology managers and their new programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and of model conceptualization are described. Early use of “magnetic hexagons” allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches should be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.