Abstract:
In the 1960s North Atlantic sea surface temperatures (SST) cooled rapidly. The magnitude of the cooling was largest in the North Atlantic subpolar gyre (SPG), and was coincident with a rapid freshening of the SPG. Here we analyze hindcasts of the 1960s North Atlantic cooling made with the UK Met Office’s decadal prediction system (DePreSys), which is initialised using observations. It is shown that DePreSys captures—with a lead time of several years—the observed cooling and freshening of the North Atlantic SPG. DePreSys also captures changes in SST over the wider North Atlantic and surface climate impacts over the wider region, such as changes in atmospheric circulation in winter and sea ice extent. We show that initialisation of an anomalously weak Atlantic Meridional Overturning Circulation (AMOC), and hence weak northward heat transport, is crucial for DePreSys to predict the magnitude of the observed cooling. Such an anomalously weak AMOC is not captured when ocean observations are not assimilated (i.e. it is not a forced response in this model). The freshening of the SPG is also dominated by ocean salt transport changes in DePreSys; in particular, the simulation of advective freshwater anomalies analogous to the Great Salinity Anomaly was key. Therefore, DePreSys suggests that ocean dynamics played an important role in the cooling of the North Atlantic in the 1960s, and that this event was predictable.
Abstract:
Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
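The lead-time-dependent bias tendency described above can be illustrated with a minimal numpy sketch (this is our own toy construction with made-up numbers, not the paper's actual "toy" model): hindcasts from many start dates drift away from the verifying observations as lead time grows, and the tendency is the slope of the mean bias against lead time.

```python
import numpy as np

# Toy illustration (assumed setup, not the paper's model): hindcasts drift
# linearly with lead time at rate b_true; we recover that rate from the
# lead-time-dependent mean bias.
rng = np.random.default_rng(42)
n_starts, n_leads = 40, 10
b_true = 0.05                       # assumed drift per unit lead time

leads = np.arange(1, n_leads + 1)
verif = rng.normal(size=(n_starts, n_leads))      # verifying observations
hindcast = verif + b_true * leads + rng.normal(scale=0.2,
                                               size=(n_starts, n_leads))

bias = (hindcast - verif).mean(axis=0)            # mean bias per lead time
tendency = np.polyfit(leads, bias, 1)[0]          # linear bias tendency
```

Averaging over many start dates suppresses the internal-variability contribution, which is one of the separations the paper's decomposition formalises.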
Abstract:
The recent slowdown (or 'pause') in global surface temperature rise is a hot topic for climate scientists and the wider public. We discuss how climate scientists have tried to communicate the pause and suggest that 'many-to-many' communication offers a key opportunity to directly engage with the public.
Abstract:
Recent studies showed that features extracted from brain MRIs can discriminate well between Alzheimer’s disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem via the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves the accuracy of binary classification.
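The one-versus-one strategy mentioned above can be sketched compactly: train one binary classifier per pair of classes, then predict by majority vote over all pairs. The sketch below uses a trivial nearest-centroid rule on synthetic data as the binary classifier, purely for illustration; the paper's actual classifiers (SVM-based) and features are not reproduced here.

```python
import numpy as np
from itertools import combinations

# One-versus-one multiclass via pairwise binary classifiers (illustrative:
# the binary rule here is nearest-centroid, not the paper's SVM).
def fit_ovo(X, y):
    classes = np.unique(y)
    models = {}
    for a, b in combinations(classes, 2):
        models[(a, b)] = (X[y == a].mean(axis=0), X[y == b].mean(axis=0))
    return classes, models

def predict_ovo(classes, models, X):
    votes = np.zeros((len(X), len(classes)), dtype=int)
    idx = {c: i for i, c in enumerate(classes)}
    for (a, b), (ca, cb) in models.items():
        da = np.linalg.norm(X - ca, axis=1)
        db = np.linalg.norm(X - cb, axis=1)
        winner = np.where(da < db, a, b)
        for c in (a, b):
            votes[winner == c, idx[c]] += 1
    return classes[votes.argmax(axis=1)]      # majority vote over pairs

# Three well-separated synthetic groups standing in for AD / MCI / controls.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(20, 2)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 20)
classes, models = fit_ovo(X, y)
acc = (predict_ovo(classes, models, X) == y).mean()
```

With k classes this trains k(k−1)/2 binary models, which is why strong binary classifiers transfer directly to the multi-class setting.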
Abstract:
We present an intuitive geometric approach for analysing the structure and fragility of T1-weighted structural MRI scans of human brains. Apart from computing characteristics like the surface area and volume of regions of the brain that consist of highly active voxels, we also employ Network Theory in order to test how close these regions are to breaking apart. This analysis is used in an attempt to automatically classify subjects into three categories: Alzheimer’s disease, mild cognitive impairment and healthy controls, for the CADDementia Challenge.
Abstract:
Combining satellite data, atmospheric reanalyses and climate model simulations, variability in the net downward radiative flux imbalance at the top of Earth's atmosphere (N) is reconstructed and linked to recent climate change. Over the 1985-1999 period mean N (0.34 ± 0.67 Wm–2) is lower than for the 2000-2012 period (0.62 ± 0.43 Wm–2, uncertainties at 90% confidence level) despite the slower rate of surface temperature rise since 2000. While the precise magnitude of N remains uncertain, the reconstruction captures interannual variability which is dominated by the eruption of Mt. Pinatubo in 1991 and the El Niño Southern Oscillation. Monthly deseasonalized interannual variability in N generated by an ensemble of 9 climate model simulations using prescribed sea surface temperature and radiative forcings and from the satellite-based reconstruction is significantly correlated (r ∼ 0.6) over the 1985-2012 period.
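The "monthly deseasonalized" interannual variability referred to above is obtained by removing the mean seasonal cycle from a monthly series. A minimal sketch on synthetic data (not the actual satellite reconstruction or model output):

```python
import numpy as np

# Deseasonalize a monthly series: subtract the per-calendar-month mean
# (climatology). Data here are synthetic stand-ins for N.
rng = np.random.default_rng(1)
n_years = 28                                   # e.g. 1985-2012
months = np.tile(np.arange(12), n_years)
seasonal = 2.0 * np.sin(2 * np.pi * months / 12)
series = seasonal + rng.normal(scale=0.3, size=months.size)

climatology = np.array([series[months == m].mean() for m in range(12)])
anomaly = series - climatology[months]         # deseasonalized anomalies
```

Correlations such as the r ∼ 0.6 quoted in the abstract are then computed between two such anomaly series.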
Abstract:
One of the prerequisites for achieving skill in decadal climate prediction is to initialize and predict the circulation in the Atlantic Ocean successfully. The RAPID array measures the Atlantic Meridional Overturning Circulation (MOC) at 26°N. Here we develop a method to include these observations in the Met Office Decadal Prediction System (DePreSys). The proposed method uses covariances of overturning transport anomalies at 26°N with ocean temperature and salinity anomalies throughout the ocean to create the density structure necessary to reproduce the observed transport anomaly. Assimilating transport alone in this way effectively reproduces the observed transport anomalies at 26°N and is better than using basin-wide temperature and salinity observations alone. However, when the transport observations are combined with in situ temperature and salinity observations in the analysis, the transport is not currently reproduced so well. The reasons for this are investigated using pseudo-observations in a twin experiment framework. Sensitivity experiments show that the MOC on monthly time-scales, at least in the HadCM3 model, is modulated by a mechanism where non-local density anomalies appear to be more important for transport variability at 26°N than local density gradients.
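The covariance step above, projecting an observed transport anomaly onto temperature and salinity anomalies, amounts to a regression of the T/S field on the transport time series. A schematic numpy sketch with synthetic data (the dimensions, noise levels, and the 1.5 Sv anomaly are all our own illustrative choices):

```python
import numpy as np

# Schematic of the covariance approach: estimate the regression pattern of
# a (flattened) T/S anomaly field on the 26N transport anomaly, then scale
# it by an observed transport anomaly to form an assimilation increment.
rng = np.random.default_rng(2)
n_time, n_points = 200, 50
pattern = rng.normal(size=n_points)            # "true" regression pattern
moc = rng.normal(size=n_time)                  # transport anomaly series
field = np.outer(moc, pattern) + rng.normal(scale=0.1,
                                            size=(n_time, n_points))

# Regression = covariance(transport, field) / variance(transport)
cov = (moc[:, None] * field).mean(axis=0) - moc.mean() * field.mean(axis=0)
regr = cov / moc.var()

obs_anom = 1.5                                 # hypothetical observed anomaly (Sv)
increment = regr * obs_anom                    # T/S increment to assimilate
```

The increment creates the density structure consistent (in the model's own covariances) with the observed transport anomaly, which is the idea behind assimilating the RAPID transport alone.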
Abstract:
Observational analyses of running 5-year ocean heat content trends (Ht) and net downward top of atmosphere radiation (N) are significantly correlated (r~0.6) from 1960 to 1999, but a spike in Ht in the early 2000s is likely spurious since it is inconsistent with estimates of N from both satellite observations and climate model simulations. Variations in N between 1960 and 2000 were dominated by volcanic eruptions, and are well simulated by the ensemble mean of coupled models from the Fifth Coupled Model Intercomparison Project (CMIP5). We find an observation-based reduction in N of -0.31±0.21 Wm-2 between 1999 and 2005 that potentially contributed to the recent warming slowdown, but the relative roles of external forcing and internal variability remain unclear. While present-day anomalies of N in the CMIP5 ensemble mean and observations agree, this may be due to a cancellation of errors in outgoing longwave and absorbed solar radiation.
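The running 5-year trends compared above can be sketched as a sliding-window least-squares slope. The example below uses two synthetic annual series sharing a common low-frequency signal, loosely mimicking the Ht-versus-N comparison (all data and noise levels are invented for illustration):

```python
import numpy as np

# Running 5-year trend: least-squares slope in a sliding 5-year window.
def running_trend(x, window=5):
    t = np.arange(window)
    return np.array([np.polyfit(t, x[i:i + window], 1)[0]
                     for i in range(len(x) - window + 1)])

rng = np.random.default_rng(3)
years = 40                                     # e.g. 1960-1999
common = np.cumsum(rng.normal(size=years))     # shared low-frequency signal
ht = running_trend(common + rng.normal(scale=0.5, size=years))
n = running_trend(common + rng.normal(scale=0.5, size=years))
r = np.corrcoef(ht, n)[0, 1]
```

Because the 5-year windows overlap, neighbouring trend values are strongly autocorrelated, which reduces the effective sample size when assessing the significance of such correlations.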
Abstract:
Dynamic electricity pricing can produce efficiency gains in the electricity sector and help achieve energy policy goals such as increasing electric system reliability and supporting renewable energy deployment. Retail electric companies can offer dynamic pricing to residential electricity customers via smart meter-enabled tariffs that proxy the cost to procure electricity on the wholesale market. Current investments in the smart metering necessary to implement dynamic tariffs show policy makers’ resolve for enabling responsive demand and realizing its benefits. However, despite these benefits and the potential bill savings these tariffs can offer, adoption among residential customers remains at low levels. Using a choice experiment approach, this paper seeks to determine whether disclosing the environmental and system benefits of dynamic tariffs to residential customers can increase adoption. Although sampling and design issues preclude wide generalization, we found that our environmentally conscious respondents reduced their required discount to switch to dynamic tariffs by around 10% in response to higher awareness of environmental and system benefits. The perception that shifting usage is easy to do also had a significant impact, indicating the potential importance of enabling technology. Perhaps the targeted communication strategy employed by this study is one way to increase adoption and achieve policy goals.
Abstract:
Background: The validity of ensemble averaging on event-related potential (ERP) data has been questioned, due to its assumption that the ERP is identical across trials. Thus, there is a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single-trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments – a P300 speller paradigm on a single subject and a language processing study on 25 subjects. Results revealed evidence for the existence of 6 clusters in one experimental condition from the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
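The seeded k-means step of the pipeline can be sketched as follows. In this simplified stand-in the "GA output" is just a set of hand-picked starting points on synthetic 2-D data; the EMD denoising, Stability Index, and GA search themselves are not reproduced.

```python
import numpy as np

# Minimal k-means accepting externally supplied initial centroids, standing
# in for the GA-selected seeds of the pipeline (seeds here are hand-picked,
# data are synthetic).
def kmeans(X, centroids, n_iter=50):
    c = centroids.copy()
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=2)
        labels = d.argmin(axis=1)              # assign to nearest centroid
        for k in range(len(c)):
            if np.any(labels == k):
                c[k] = X[labels == k].mean(axis=0)   # recompute centroids
    return labels, c

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(m, 0.2, size=(30, 2)) for m in (0.0, 2.0, 4.0)])
seeds = np.array([[0.2, 0.1], [1.8, 2.1], [4.1, 3.9]])   # stand-in "GA" seeds
labels, centroids = kmeans(X, seeds)
```

Because k-means only converges to a local optimum of its objective, the quality of the initial centroids matters, which is the motivation for searching for them with a GA rather than seeding at random.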
Abstract:
Two methods are developed to estimate net surface energy fluxes based upon satellite-based reconstructions of radiative fluxes at the top of atmosphere and the atmospheric energy tendencies and transports from the ERA-Interim reanalysis. Method 1 applies the mass adjusted energy divergence from ERA-Interim while method 2 estimates energy divergence based upon the net energy difference at the top of atmosphere and the surface from ERA-Interim. To optimise the surface flux and its variability over ocean, the divergences over land are constrained to match the monthly area mean surface net energy flux variability derived from a simple relationship between the surface net energy flux and the surface temperature change. The energy divergences over the oceans are then adjusted to remove an unphysical residual global mean atmospheric energy divergence. The estimated net surface energy fluxes are compared with other data sets from reanalysis and atmospheric model simulations. The spatial correlation coefficients of multi-annual means between the estimations made here and other data sets are all around 0.9. There are good agreements in area mean anomaly variability over the global ocean, but discrepancies in the trend over the eastern Pacific are apparent.
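The final adjustment step above, removing an unphysical residual global-mean energy divergence by modifying only the ocean points, can be sketched with area weights on a synthetic grid (the grid, values, and land-sea mask below are all invented for illustration):

```python
import numpy as np

# Remove a spurious nonzero global area-mean divergence by an offset
# applied over ocean points only, so the global mean becomes zero while
# land values are left untouched.
lat = np.linspace(-89.5, 89.5, 180)
w = np.cos(np.deg2rad(lat))[:, None] * np.ones((180, 360))   # area weights

rng = np.random.default_rng(5)
div = rng.normal(size=(180, 360)) + 0.3        # divergence with spurious mean
ocean = rng.random((180, 360)) < 0.7           # mock land-sea mask

residual = np.sum(div * w) / np.sum(w)         # global area-mean residual
div_adj = div.copy()
div_adj[ocean] -= residual * np.sum(w) / np.sum(w[ocean])

new_mean = np.sum(div_adj * w) / np.sum(w)     # ~0 after adjustment
```

Scaling the removed offset by the ratio of total to ocean area weight is what guarantees the global integral vanishes even though only ocean points are changed.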