149 results for Farmer, Doug
Abstract:
In this paper we consider the structure of dynamically evolving networks modelling information and activity moving across a large set of vertices. We adopt the communicability concept, which generalizes that of centrality as defined for static networks. We define the primary network structure within the whole as comprising the most influential vertices (both as senders and receivers of dynamically sequenced activity). We present a methodology based on successive vertex knockouts, up to a very small fraction of the whole primary network, that can characterize the nature of the primary network as being either relatively robust and lattice-like (with redundancies built in) or relatively fragile and tree-like (with sensitivities and few redundancies). We apply these ideas to the analysis of evolving networks derived from fMRI scans of resting human brains. We show that the estimation of performance parameters via the structure tests of the corresponding primary networks is subject to less variability than that observed across a very large population of such scans. Hence the differences within the population are significant.
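The knockout test described above can be illustrated on a static network, where communicability reduces to the matrix exponential of the adjacency matrix (the paper's dynamic generalization instead works on temporally ordered sequences of adjacency matrices). A minimal sketch on a hypothetical 5-vertex graph:

```python
# Static communicability C = exp(A) (Estrada-Hatano) on a toy graph,
# followed by a single vertex knockout. The graph and the single-knockout
# step are illustrative assumptions, not the paper's fMRI-derived networks.
import numpy as np
from scipy.linalg import expm

# Hypothetical 5-vertex path 0-1-2-3-4 with one shortcut edge (1, 3).
A = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (1, 3)]:
    A[i, j] = A[j, i] = 1.0

C = expm(A)                    # communicability between every vertex pair
broadcast = C.sum(axis=1)      # influence as a sender (row sums)
receive = C.sum(axis=0)        # influence as a receiver (column sums)

# Knock out the most influential vertex and measure the communicability loss.
k = int(np.argmax(broadcast))
A_ko = np.delete(np.delete(A, k, axis=0), k, axis=1)
drop = C.sum() - expm(A_ko).sum()
print(f"knockout of vertex {k} removes {drop:.2f} of total communicability")
```

Under this reading, a fragile, tree-like primary network loses most of its communicability after a few knockouts, while a lattice-like network with redundant paths degrades gradually.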
Abstract:
Andalusia, located in southern Spain, is the major olive production area worldwide. Given the relevance of this agricultural sector to the regional income, this article investigates olive farmers' perspectives on olive production after their retirement and the potential factors affecting these views, including economic, social, environmental and spatial factors. We use data from a survey of 431 olive farmers conducted in Andalusia in 2010. Our findings show spatial dependence, at relatively small distances, in farmers' views on the future of olive farming. In addition, other factors such as poor economic performance, erosion and olive diseases affect farmers' perceptions. We make propositions on which elements should be taken into account when designing agricultural policies aimed at guaranteeing the sustainability of olive farming in the future.
Observations of the eruption of the Sarychev volcano and simulations using the HadGEM2 climate model
Abstract:
In June 2009 the Sarychev volcano, located in the Kuril Islands to the northeast of Japan, erupted explosively, injecting ash and an estimated 1.2 ± 0.2 Tg of sulfur dioxide into the upper troposphere and lower stratosphere, making it arguably one of the 10 largest stratospheric injections in the last 50 years. During the period immediately after the eruption, we show that the sulfur dioxide (SO2) cloud was clearly detected by retrievals developed for the Infrared Atmospheric Sounding Interferometer (IASI) satellite instrument and that the resultant stratospheric sulfate aerosol was detected by the Optical Spectrograph and Infrared Imaging System (OSIRIS) limb sounder and the CALIPSO lidar. Additional surface-based instrumentation allows assessment of the impact of the eruption on the stratospheric aerosol optical depth. We use a nudged version of the HadGEM2 climate model to investigate how well this state-of-the-science climate model can replicate the distributions of SO2 and sulfate aerosol. The model simulations and OSIRIS measurements suggest that in the Northern Hemisphere the stratospheric aerosol optical depth was enhanced by around a factor of 3 (0.01 at 550 nm), with resultant impacts upon the radiation budget. The simulations indicate that, in the Northern Hemisphere for July 2009, the magnitude of the mean radiative impact from the volcanic aerosols is more than 60% of the direct radiative forcing of all anthropogenic aerosols put together. While the cooling induced by the eruption will likely not be detectable in the observational record, the combination of modeling and measurements would provide an ideal framework for simulating future larger volcanic eruptions.
Abstract:
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles, for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and therefore unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
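The spread-error condition above can be checked directly: average the intra-ensemble variance across start dates and compare its square root with the RMSE of the ensemble mean. A minimal sketch on synthetic hindcasts (all numbers hypothetical, not DePreSys output); the ensemble is constructed to be statistically reliable, so the ratio comes out close to one:

```python
# Spread-error ratio check on a synthetic, statistically reliable ensemble.
# Members and verifying truth are drawn from the same distribution around
# a shared predictable signal (a hypothetical construction for illustration).
import numpy as np

rng = np.random.default_rng(0)
n_starts, n_members = 200, 9
signal = rng.normal(0.0, 1.0, n_starts)          # predictable component
truth = signal + rng.normal(0.0, 1.0, n_starts)  # verifying observations
ens = signal[:, None] + rng.normal(0.0, 1.0, (n_starts, n_members))

# Mean intra-ensemble spread and error of the ensemble mean.
spread = np.sqrt(ens.var(axis=1, ddof=1).mean())
rmse = np.sqrt(np.mean((ens.mean(axis=1) - truth) ** 2))
ratio = spread / rmse
print(f"spread-error ratio: {ratio:.2f}")
```

A ratio well below one signals under-dispersion (overconfident forecasts); a ratio well above one signals the over-dispersion described for the longer lead times.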
Abstract:
This thesis is concerned with the development of improved management practices in indigenous chicken production systems, in a research process that includes participatory approaches with smallholder farmers and other stakeholders in Kenya. The research process involved a wide range of activities, including on-station experiments, field surveys, stakeholder consultations in workshops, seminars and visits, and on-farm farmer participatory research to evaluate the effect of some improved management interventions on the production performance of indigenous chickens. The participatory research was greatly informed by the collective experiences and lessons of the previous activities. The on-station studies focused on the hatching, growth and nutritional characteristics of the indigenous chickens. Four research publications from these studies are included in this thesis. Quantitative statistical analyses were applied, involving growth models estimated with non-linear regressions for the growth characteristics, chi-square determinations to investigate differences among different reciprocal crosses of indigenous chickens, and general linear models and covariance determination for the nutrition study. The on-station studies brought greater understanding of the performance and production characteristics of indigenous chickens and of the influence of management practices on these characteristics. The field surveys and stakeholder consultations helped in understanding the overarching issues affecting the productivity of indigenous chicken systems and their place in the livelihoods of smallholder farmers. These activities created strong networking opportunities with stakeholders from a wide spectrum. The on-farm farmer participatory research involved the selection of 200 farmers in five regions, followed by training and the introduction of interventions on improved management practices, which included housing, vaccination, deworming and feed supplementation.
Implementation and monitoring were mainly done by individual farmers continuously for close to one and a half years. Six quarterly visits to the farms were made by the research team to monitor and provide support for ongoing project activities. The data collected were analysed for 5 consecutive 3-monthly periods. Descriptive and inferential statistics were applied to analyse the data on treatment applications, production characteristics and flock demography characteristics. Of the 200 farmers initially selected, 173 had records on treatment applications and flock demography characteristics, while 127 farmers had records on production characteristics. The demographic analysis, using a dissimilarity index of flock size, produced 7 distinct farm groups from among the 173 farms. Two of these farm groups were represented in similar numbers in each of the five regions. The research process also involved a number of dissemination and communication strategies that have made the process and project outcomes accessible to a wider readership locally and globally. These include workshops, seminars, field visits and consultations, local and international conferences, electronic conferencing, publications, and personal communication via email and conventional post. A number of research and development proposals were also developed based on the knowledge and experience gained from the research process. The thesis captures the research process activities and outcomes in 8 chapters, comprising, in order: introduction, theoretical concepts underpinning FPR, research methodology and process, on-station research output, FPR descriptive statistical analysis, FPR inferential statistical analysis on production characteristics, FPR demographic analysis, and conclusions.
Various research approaches, both quantitative and qualitative, have been applied in the research process, indicating the possibilities and importance of combining both systems for a greater understanding of the issues being studied. In our case, participatory studies of the improved management of indigenous chickens indicate their potential importance as livelihood assets for poor people.
Abstract:
In the 1960s North Atlantic sea surface temperatures (SST) cooled rapidly. The magnitude of the cooling was largest in the North Atlantic subpolar gyre (SPG), and was coincident with a rapid freshening of the SPG. Here we analyze hindcasts of the 1960s North Atlantic cooling made with the UK Met Office's decadal prediction system (DePreSys), which is initialised using observations. It is shown that DePreSys captures, with a lead time of several years, the observed cooling and freshening of the North Atlantic SPG. DePreSys also captures changes in SST over the wider North Atlantic and surface climate impacts over the wider region, such as changes in atmospheric circulation in winter and in sea ice extent. We show that initialisation of an anomalously weak Atlantic Meridional Overturning Circulation (AMOC), and hence weak northward heat transport, is crucial for DePreSys to predict the magnitude of the observed cooling. Such an anomalously weak AMOC is not captured when ocean observations are not assimilated (i.e. it is not a forced response in this model). The freshening of the SPG is also dominated by ocean salt transport changes in DePreSys; in particular, the simulation of advective freshwater anomalies analogous to the Great Salinity Anomaly was key. Therefore, DePreSys suggests that ocean dynamics played an important role in the cooling of the North Atlantic in the 1960s, and that this event was predictable.
Abstract:
Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
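The lead-time-dependent bias tendency can be estimated by averaging the hindcast-minus-truth bias at each lead time across start dates and fitting a linear trend in lead time. A toy sketch with an imposed drift (the drift rate, noise levels and array sizes are hypothetical, not the HadCM3 analysis):

```python
# Toy illustration of a lead-time-dependent bias ("bias tendency"):
# synthetic hindcasts drift away from the truth at a fixed rate per lead year,
# and we recover that rate as the slope of the mean bias against lead time.
import numpy as np

rng = np.random.default_rng(1)
n_starts, n_leads = 30, 10
true_drift = 0.05                          # hypothetical drift per lead year
leads = np.arange(1, n_leads + 1)

truth = rng.normal(0.0, 0.2, (n_starts, n_leads))
hindcast = truth + true_drift * leads + rng.normal(0.0, 0.2, (n_starts, n_leads))

bias = (hindcast - truth).mean(axis=0)     # mean bias at each lead time
tendency = np.polyfit(leads, bias, 1)[0]   # slope = bias tendency
print(f"estimated bias tendency: {tendency:.3f} per lead year")
```

In the paper's framing, contributions from internal-variability sampling and start-time-dependent forcing bias would first be estimated and removed, so that the remaining slope reflects the true bias tendency of the underlying model.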
Abstract:
The recent slowdown (or 'pause') in global surface temperature rise is a hot topic for climate scientists and the wider public. We discuss how climate scientists have tried to communicate the pause and suggest that 'many-to-many' communication offers a key opportunity to directly engage with the public.
Abstract:
Recent studies have shown that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem via the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves the accuracy of binary classification.
Abstract:
We present an intuitive geometric approach for analysing the structure and fragility of T1-weighted structural MRI scans of human brains. Apart from computing characteristics like the surface area and volume of regions of the brain that consist of highly active voxels, we also employ Network Theory in order to test how close these regions are to breaking apart. This analysis is used in an attempt to automatically classify subjects into three categories: Alzheimer’s disease, mild cognitive impairment and healthy controls, for the CADDementia Challenge.
Abstract:
Combining satellite data, atmospheric reanalyses and climate model simulations, variability in the net downward radiative flux imbalance at the top of Earth's atmosphere (N) is reconstructed and linked to recent climate change. Over the 1985-1999 period mean N (0.34 ± 0.67 W m⁻²) is lower than for the 2000-2012 period (0.62 ± 0.43 W m⁻², uncertainties at the 90% confidence level) despite the slower rate of surface temperature rise since 2000. While the precise magnitude of N remains uncertain, the reconstruction captures interannual variability, which is dominated by the eruption of Mt. Pinatubo in 1991 and the El Niño–Southern Oscillation. Monthly deseasonalized interannual variability in N generated by an ensemble of 9 climate model simulations using prescribed sea surface temperatures and radiative forcings and from the satellite-based reconstruction is significantly correlated (r ∼ 0.6) over the 1985-2012 period.
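The comparison step, removing each calendar month's climatology ("deseasonalizing") before correlating the monthly anomalies, can be sketched on synthetic series standing in for the model ensemble and the reconstruction (the signal and noise levels below are hypothetical):

```python
# Deseasonalize two monthly series by subtracting each calendar month's mean,
# then correlate the resulting interannual anomalies.
import numpy as np

rng = np.random.default_rng(2)
n_years = 28                                   # roughly the 1985-2012 span
months = np.tile(np.arange(12), n_years)
common = rng.normal(0.0, 0.5, 12 * n_years)    # shared interannual signal
seasonal = np.sin(2 * np.pi * months / 12)     # annual cycle

model = seasonal + common + rng.normal(0.0, 0.4, 12 * n_years)
recon = seasonal + common + rng.normal(0.0, 0.4, 12 * n_years)

def deseasonalize(x, months):
    """Subtract the mean of each calendar month (the monthly climatology)."""
    clim = np.array([x[months == m].mean() for m in range(12)])
    return x - clim[months]

a, b = deseasonalize(model, months), deseasonalize(recon, months)
r = np.corrcoef(a, b)[0, 1]
print(f"anomaly correlation r = {r:.2f}")
```

Without the deseasonalizing step, the shared annual cycle would inflate the correlation; removing it isolates the interannual variability that the abstract compares.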