977 results for "Feature space"
Abstract:
In this paper we explore the importance of emotionally inter-dependent relationships to the functioning of embodied social capital and habitus. Drawing upon the experiences of young people with socio-emotional differences, we demonstrate how emotionally inter-dependent and relatively nurturing relationships are integral to the acquisition of social capital and to the co-construction and embodiment of habitus. The young people presented in this paper often had difficulties in forging social relationships and in acquiring symbolic and cultural capital in school spaces. However, we outline how these young people (re)produce and embody alternative kinds of habitus, based on emotionally reciprocal relationships forged through formal and informal leisure activities and familial and fraternal social relationships. These alternative forms of habitus provide sites of subjection, scope for acquiring social and cultural capital, and a positive sense of identity in the face of problematic relations and experiences in school spaces.
Abstract:
Recent studies have shown that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem via the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machine-based wrapper improves the accuracy of binary classification.
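The filter-plus-wrapper approach described above can be sketched with scikit-learn. This is a hedged, generic illustration of the technique (a Random Forest importance filter, an SVM-driven wrapper, and a one-versus-one SVM classifier), not the authors' pipeline; the feature matrix below is a random placeholder standing in for ROI-based MRI features.

```python
# Minimal sketch: Random Forest filter + SVM wrapper + one-vs-one SVM.
# X stands in for ROI-based MRI features; parameters are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel, RFE
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X = np.random.rand(120, 200)          # placeholder: 120 subjects x 200 ROI features
y = np.random.randint(0, 2, 120)      # placeholder: binary labels (e.g. AD vs MCI)

pipe = Pipeline([
    # filter step: keep features ranked highly by Random Forest importance
    ("rf_filter", SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0))),
    # wrapper step: recursive feature elimination driven by a linear SVM
    ("svm_wrapper", RFE(SVC(kernel="linear"), n_features_to_select=20)),
    # final classifier; scikit-learn's SVC handles multi-class via one-vs-one
    ("svm_clf", SVC(kernel="linear")),
])

print(cross_val_score(pipe, X, y, cv=5).mean())
```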
Abstract:
Recent gravity missions have produced a dramatic improvement in our ability to measure the ocean's mean dynamic topography (MDT) from space. To fully exploit this oceanic observation, however, we must quantify its error. To establish a baseline, we first assess the error budget for an MDT calculated using a 3rd-generation GOCE geoid and the CLS01 mean sea surface (MSS). With these products, we can resolve MDT spatial scales down to 250 km with an accuracy of 1.7 cm, with the MSS and geoid making similar contributions to the total error. For spatial scales within the range 133–250 km the error is 3.0 cm, with the geoid making the greatest contribution. For the smallest resolvable spatial scales (80–133 km) the total error is 16.4 cm, with geoid error accounting for almost all of this. Relative to this baseline, the most recent versions of the geoid and MSS fields reduce the long- and short-wavelength errors by 0.9 and 3.2 cm, respectively, but they have little impact in the medium-wavelength band. The newer MSS is responsible for most of the long-wavelength improvement, while for the short-wavelength component it is the geoid. We find that while the formal geoid errors have reasonable global mean values, they fail to capture the regional variations in error magnitude, which depend on the steepness of the sea floor topography.
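As context for the error budget above: the MDT is obtained by subtracting the geoid from the mean sea surface, so if the two error sources are treated as independent, their band-wise contributions combine in quadrature. The snippet below is a minimal illustration of that bookkeeping; the component values are made up for illustration and are not taken from the paper.

```python
# Quadrature combination of independent MSS and geoid errors per spectral band.
# Component values below are illustrative only.
import math

def mdt_band_error(mss_err_cm: float, geoid_err_cm: float) -> float:
    """Total MDT error (cm) in a band, assuming independent MSS and geoid errors."""
    return math.hypot(mss_err_cm, geoid_err_cm)

# A band in which MSS and geoid contribute roughly equally:
print(round(mdt_band_error(1.2, 1.2), 1))    # ~1.7 cm
# A band dominated by geoid error:
print(round(mdt_band_error(2.0, 16.3), 1))   # ~16.4 cm
```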
Abstract:
Sea surface temperature (SST) datasets have been generated from satellite observations for the period 1991–2010, intended for use in climate science applications. Attributes of the datasets specifically relevant to climate applications are: first, independence from in situ observations; second, effort to ensure homogeneity and stability through the time series; third, context-specific uncertainty estimates attached to each SST value; and, fourth, provision of estimates of both skin SST (the fundamental measurement, relevant to air-sea fluxes) and SST at standard depth and local time (partly model mediated, enabling comparison with historical in situ datasets). These attributes in part reflect requirements solicited from climate data users prior to and during the project. Datasets consisting of SSTs on satellite swaths are derived from the Along-Track Scanning Radiometers (ATSRs) and Advanced Very High Resolution Radiometers (AVHRRs). These are then used as the sole SST inputs to a daily, spatially complete, analysis SST product, with a latitude-longitude resolution of 0.05° and good discrimination of ocean surface thermal features. A product user guide is available, linking to reports describing the datasets' algorithmic basis, validation results, format, uncertainty information and experimental use in trial climate applications. Future versions of the datasets will span at least 1982–2015, better addressing the need in many climate applications for stable records of global SST that are at least 30 years in length.
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for establishing complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
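A complex-valued extreme learning machine of the kind referred to above can be sketched in a few lines: random complex hidden weights and biases, a complex activation, and output weights obtained by a single least-squares solve. This is an assumed, generic formulation for illustration, not the authors' implementation; the feature vectors are assumed to encode amplitude and phase as complex numbers.

```python
# Generic complex-valued extreme learning machine sketch (assumed formulation).
import numpy as np

rng = np.random.default_rng(0)

def celm_train(X, T, n_hidden=200):
    """X: (n_samples, n_features) complex feature vectors (amplitude and phase);
    T: (n_samples, n_classes) one-hot target matrix."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden)) + 1j * rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)          # complex hidden-layer responses
    beta = np.linalg.pinv(H) @ T    # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def celm_predict(X, W, b, beta):
    H = np.tanh(X @ W + b)
    return np.argmax((H @ beta).real, axis=1)   # class with the largest real-valued output
```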
Abstract:
This paper considers the dynamics of deposition around and across the causewayed enclosure at Etton, Cambridgeshire. As a result of detailed re-analysis (particularly refitting) of the pottery and flint assemblages from the site, it proved possible to shed new light both on the temporality of occupation and the character of deposition there. Certain aspects of our work challenge previous interpretations of the site, and of causewayed enclosures in general; but, just as importantly, others confirm materially what has previously been suggested. The quantities of material deposited at Etton reveal that the enclosure was occupied only very intermittently and certainly less regularly than other contemporary sites in the region. The spatial distribution of material suggests that the enclosure ditch lay open for the entirety of the monument's life, but that acts of deposition generally focused on a specific part of the monument at any one time. As well as enhancing our knowledge of one particular causewayed enclosure, it is hoped that this paper – in combination with our earlier analysis of the pit site at Kilverstone – makes clear the potential that detailed material analysis has to offer in relation to our understanding of the temporality of occupation on prehistoric sites in general.
Abstract:
Solar Stormwatch was the first space weather citizen science project, the aim of which was to identify and track coronal mass ejections (CMEs) observed by the Heliospheric Imagers aboard the STEREO satellites. The project has now been running for approximately 4 years, with input from >16,000 citizen scientists, resulting in a dataset of >38,000 time-elongation profiles of CME trajectories, observed over 18 pre-selected position angles. We present our method for reducing this data set into a CME catalogue. The resulting catalogue consists of 144 CMEs over the period January 2007 to February 2010, of which 110 were observed by STEREO-A and 77 were observed by STEREO-B. For each CME, the time-elongation profiles generated by the citizen scientists are averaged into a consensus profile along each position angle at which the event was tracked. We consider this catalogue to be unique, being at present the only citizen-science-generated CME catalogue, tracking CMEs over an elongation range from 4 degrees out to a maximum of approximately 70 degrees. Using single-spacecraft fitting techniques, we estimate the speed, direction, solar source region and latitudinal width of each CME. This shows that, at present, the Solar Stormwatch catalogue (which covers only solar minimum years) contains almost exclusively slow CMEs, with a mean speed of approximately 350 km s−1. The full catalogue is available for public access at www.met.reading.ac.uk/spate/stormwatch. This includes, for each event, the unprocessed time-elongation profiles generated by Solar Stormwatch, the consensus time-elongation profiles and a set of summary plots, as well as the estimated CME properties.
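For context, single-spacecraft fitting of a time-elongation profile is often done with the "fixed-phi" approximation, in which a point-like CME travels radially at constant speed at a fixed angle from the observer-Sun line. The sketch below shows that geometry; it is a generic illustration and not necessarily the exact fitting procedure used to build the catalogue.

```python
# Fixed-phi geometry for single-spacecraft fitting of a time-elongation profile.
# Illustrative only; the catalogue's exact fitting procedure may differ.
import numpy as np
from scipy.optimize import curve_fit

AU_KM = 1.496e8  # kilometres per astronomical unit

def fixed_phi_elongation(t_sec, speed_kms, phi_rad, r_obs_au=0.96):
    """Elongation (rad) at time t of a point CME moving radially at constant
    speed, seen at angle phi from the observer-Sun line by an observer at
    heliocentric distance r_obs_au (roughly STEREO's orbit)."""
    r_cme_au = speed_kms * t_sec / AU_KM
    return np.arctan2(r_cme_au * np.sin(phi_rad),
                      r_obs_au - r_cme_au * np.cos(phi_rad))

# Fitting an observed consensus profile (t_obs in seconds, elong_obs in radians):
# popt, _ = curve_fit(fixed_phi_elongation, t_obs, elong_obs,
#                     p0=[350.0, np.radians(60.0)])
```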
Abstract:
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
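The test setup described above (8 h smoothing of observations as a stand-in for solar wind model output, plus small-scale noise added back in) can be sketched as follows. This is a simplified, assumed illustration: it resamples noise from the empirical residual distribution only and ignores the spectral/temporal correlation structure the authors also consider. A 1-minute solar wind time series in a pandas Series is assumed.

```python
# Simplified downscaling sketch: smooth 1-minute observations with an 8 h window
# (a proxy for solar wind model output), then build an ensemble by adding noise
# resampled from the observed residuals. Temporal correlation is ignored here.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def downscale(obs: pd.Series, window_minutes=480, n_members=10):
    smooth = obs.rolling(window_minutes, center=True, min_periods=1).mean()
    residuals = (obs - smooth).dropna().to_numpy()        # observed small-scale "noise"
    members = []
    for _ in range(n_members):
        noise = rng.choice(residuals, size=len(smooth))   # draw from the empirical PDF
        members.append(smooth + noise)
    return members                                        # ensemble of downscaled series
```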
Abstract:
We report on the first real-time ionospheric predictions network and its capabilities to ingest a global database and forecast F-layer characteristics and "in situ" electron densities along the track of an orbiting spacecraft. A global network of ionosonde stations reported around-the-clock observations of F-region heights and densities, and an on-line library of models provided forecasting capabilities. Each model was tested against the incoming data; relative accuracies were intercompared to determine the best overall fit to the prevailing conditions; and the best-fit model was used to predict ionospheric conditions on an orbit-to-orbit basis for the 12-hour period following a twice-daily model test and validation procedure. It was found that the best-fit model often provided averaged (i.e., climatologically based) accuracies better than 5% in predicting the heights and critical frequencies of the F-region peaks in the latitudinal domain of the TSS-1R flight path. There was a sharp contrast, however, in model-measurement comparisons involving predictions of actual, unaveraged, along-track densities at the 295 km orbital altitude of TSS-1R. In this case, extrema in the first-principles models varied by as much as an order of magnitude in density predictions, and the best-fit models were found to disagree with the "in situ" observations of Ne by as much as 140%. The discrepancies are interpreted as a manifestation of the difficulties in accurately and self-consistently modeling the external controls of solar and magnetospheric inputs and the spatial and temporal variabilities in electric fields, thermospheric winds, plasmaspheric fluxes, and chemistry.
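The twice-daily test-and-select step described above amounts to scoring each candidate model against the most recent ionosonde observations and using the winner for the following 12 hours of predictions. The sketch below uses hypothetical interfaces (model callables returning foF2 and hmF2, and a list of observation tuples) purely to illustrate the selection logic; it is not the network's actual software.

```python
# Hypothetical sketch of the best-fit model selection step: score each model
# against recent ionosonde observations and pick the one with the smallest
# mean relative error. All interfaces are invented for illustration.
def select_best_model(models, observations):
    """models: dict mapping name -> callable(station, time) returning (foF2, hmF2);
    observations: list of (station, time, foF2_obs, hmF2_obs) tuples."""
    def mean_relative_error(predict):
        errors = []
        for station, time, foF2_obs, hmF2_obs in observations:
            foF2_pred, hmF2_pred = predict(station, time)
            errors.append(abs(foF2_pred - foF2_obs) / foF2_obs)
            errors.append(abs(hmF2_pred - hmF2_obs) / hmF2_obs)
        return sum(errors) / len(errors)
    best_name = min(models, key=lambda name: mean_relative_error(models[name]))
    return best_name   # use this model for the next 12-hour prediction window
```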
Abstract:
Traditionally, the cusp has been described in terms of a time-stationary feature of the magnetosphere which allows access of magnetosheath-like plasma to low altitudes. Statistical surveys of data from low-altitude spacecraft have shown the average characteristics and position of the cusp. Recently, however, it has been suggested that the ionospheric footprint of flux transfer events (FTEs) may be identified as variations of the “cusp” on timescales of a few minutes. In this model, the cusp can vary in form between a steady-state feature in one limit and a series of discrete ionospheric FTE signatures in the other limit. If this time-dependent cusp scenario is correct, then the signatures of the transient reconnection events must be able, on average, to reproduce the statistical cusp occurrence previously determined from the satellite observations. In this paper, we predict the precipitation signatures which are associated with transient magnetopause reconnection, following recent observations of the dependence of dayside ionospheric convection on the orientation of the IMF. We then employ a simple model of the longitudinal motion of FTE signatures to show how such events can easily reproduce the local time distribution of cusp occurrence probabilities, as observed by low-altitude satellites. This is true even in the limit where the cusp is a series of discrete events. Furthermore, we investigate the existence of double cusp patches predicted by the simple model and show how these events may be identified in the data.
Abstract:
The impact on the dynamics of the stratosphere of three approaches to geoengineering by solar radiation management is investigated using idealized simulations of a global climate model. The approaches are geoengineering with sulfate aerosols, titania aerosols, and reduction in total solar irradiance (representing mirrors placed in space). If it were possible to use stratospheric aerosols to counterbalance the surface warming produced by a quadrupling of atmospheric carbon dioxide concentrations, tropical lower stratospheric radiative heating would drive a thermal wind response which would intensify the stratospheric polar vortices. In the Northern Hemisphere this intensification results in strong dynamical cooling of the polar stratosphere. Northern Hemisphere stratospheric sudden warming events become rare (one and two in 65 years for sulfate and titania, respectively). The intensification of the polar vortices results in a poleward shift of the tropospheric midlatitude jets in winter. The aerosol radiative heating enhances the tropical upwelling in the lower stratosphere, influencing the strength of the Brewer-Dobson circulation. In contrast, solar dimming does not produce heating of the tropical lower stratosphere, and so there is little intensification of the polar vortex and no enhanced tropical upwelling. The dynamical response to titania aerosol is qualitatively similar to the response to sulfate.