61 results for Multicriteria degree constrained
Broadly speaking: vocabulary in semantic dementia shifts towards general, semantically diverse words
Abstract:
One of the cardinal features of semantic dementia (SD) is a steady reduction in expressive vocabulary. We investigated the nature of this breakdown by assessing the psycholinguistic characteristics of words produced spontaneously by SD patients during an autobiographical memory interview. Speech was analysed with respect to frequency and imageability, and a recently developed measure called semantic diversity. This measure quantifies the degree to which a word can be used in a broad range of different linguistic contexts. We used this measure in a formal exploration of the tendency for SD patients to replace specific terms with more vague and general words, on the assumption that more specific words are used in a more constrained set of contexts. Relative to healthy controls, patients were less likely to produce low-frequency, high-imageability words, and more likely to produce highly frequent, abstract words. These changes in the lexical-semantic landscape were related to semantic diversity: the highly frequent and abstract words most prevalent in the patients' speech were also the most semantically diverse. In fact, when the speech samples of healthy controls were artificially engineered such that low semantic diversity words (e.g., garage, spanner) were replaced with broader terms (e.g., place, thing), the characteristics of their speech production came to closely resemble those of SD patients. A similar simulation in which low-frequency words were replaced was less successful in replicating the patient data. These findings indicate systematic biases in the deterioration of lexical-semantic space in SD. As conceptual knowledge degrades, speech increasingly consists of general terms that can be applied in a broad range of linguistic contexts and convey less specific information.
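To make the word-replacement simulation concrete, here is a minimal sketch. The semantic-diversity scores, the threshold, and the single generic replacement term are all invented for illustration; the paper's actual measure is derived from corpus co-occurrence statistics, not a hand-built lookup table.

```python
# Hypothetical semantic-diversity scores (higher = usable in more contexts).
semantic_diversity = {"garage": 0.9, "spanner": 0.7, "place": 2.1,
                      "thing": 2.3, "fixed": 1.8, "the": 2.5}
GENERAL_TERM = "thing"  # assumed broad replacement, as in the paper's example
THRESHOLD = 1.0         # assumed cut-off separating specific from general words

def degrade(tokens):
    """Replace low-diversity (context-specific) words with a broad general term,
    mimicking the engineered control speech samples described above."""
    return [GENERAL_TERM if semantic_diversity.get(t, THRESHOLD) < THRESHOLD else t
            for t in tokens]

sample = "the spanner fixed the garage".split()
print(" ".join(degrade(sample)))  # -> "the thing fixed the thing"
```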
Abstract:
Based on theoretical arguments we propose a possible route for controlling the band-gap in the promising photovoltaic material CdIn2S4. Our ab initio calculations show that the experimental degree of inversion in this spinel (fraction of tetrahedral sites occupied by In) corresponds approximately to the equilibrium value given by the minimum of the theoretical inversion free energy at a typical synthesis temperature. Modification of this temperature, or of the cooling rate after synthesis, is then expected to change the inversion degree, which in turn sensitively tunes the electronic band-gap of the solid, as shown here by Heyd-Scuseria-Ernzerhof screened hybrid functional calculations.
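The equilibrium-inversion argument can be sketched numerically: minimise an inversion free energy F(x, T) = E(x) − T·S(x) over the inversion degree x, where S(x) is the ideal configurational entropy of cation mixing in a 2-3 spinel. The energy coefficients and synthesis temperature below are placeholders, not values computed in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

KB = 8.617e-5  # Boltzmann constant, eV/K

def config_entropy(x):
    """Ideal configurational entropy per AB2X4 formula unit (eV/K) at
    inversion degree x: one tetrahedral site with fractions (1-x, x) and
    two octahedral sites with fractions (x/2, 1-x/2)."""
    xlnx = lambda p: p * np.log(p) if p > 0 else 0.0
    return -KB * (xlnx(x) + xlnx(1 - x)
                  + 2 * (xlnx(x / 2) + xlnx(1 - x / 2)))

# Hypothetical Landau-type expansion of the inversion energy, E(x) = a*x + b*x**2.
# The coefficients are illustrative, not the paper's ab initio values.
a, b = 0.20, -0.05  # eV per formula unit

def free_energy(x, T):
    """Inversion free energy F(x, T) = E(x) - T*S(x)."""
    return a * x + b * x**2 - T * config_entropy(x)

T_synth = 1100.0  # K; an assumed, typical synthesis temperature
res = minimize_scalar(free_energy, bounds=(1e-6, 1 - 1e-6),
                      args=(T_synth,), method="bounded")
print(f"Equilibrium inversion degree at {T_synth:.0f} K: x = {res.x:.3f}")
```

Raising T deepens the entropy term and shifts the minimum to larger x, which is the proposed handle on the band-gap.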
Abstract:
Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes, and a slab ocean, to model the effects of climate change mechanisms on the atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T-21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing only applied at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30%, for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
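The concept being tested reduces to the linear relation ΔT = λ·ΔF between global-mean surface temperature response and radiative forcing. A minimal sketch, using the standard simplified CO2 forcing expression of Myhre et al. (1998) and an illustrative sensitivity parameter λ (not the IGCM's diagnosed value):

```python
import numpy as np

def co2_forcing(c, c0=280.0):
    """Simplified CO2 radiative forcing (Myhre et al., 1998), in W m^-2."""
    return 5.35 * np.log(c / c0)

lam = 0.6  # climate sensitivity parameter, K per (W m^-2); illustrative only

d_f = co2_forcing(560.0)  # forcing for doubled CO2: 5.35*ln(2) ~ 3.7 W m^-2
d_t = lam * d_f           # the forcing concept: dT = lambda * dF
print(f"dF = {d_f:.2f} W m^-2 -> predicted warming dT = {d_t:.2f} K")
```

The abstract's finding is that λ stays roughly constant (to within ~30%) across differently distributed CO2 and solar perturbations, which is what makes this one-line prediction useful.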
Abstract:
Incomplete understanding of three aspects of the climate system—equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing—and the physical processes underlying them leads to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century1,2. Explorations of these uncertainties have so far relied on scaling approaches3,4, large ensembles of simplified climate models1,2, or small ensembles of complex coupled atmosphere–ocean general circulation models5,6 which under-represent uncertainties in key climate system properties derived from independent sources7–9. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report10, but extends towards larger warming than observed in ensembles of opportunity5 typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.
Abstract:
In recent years several methodologies have been developed to combine and interpret ensembles of climate models with the aim of quantifying uncertainties in climate projections. Constrained climate model forecasts have been generated by combining various choices of metrics used to weight individual ensemble members, with diverse approaches to sampling the ensemble. The forecasts obtained are often significantly different, even when based on the same model output. Therefore, a climate model forecast classification system can serve two roles: to provide a way for forecast producers to self-classify their forecasts; and to provide information on the methodological assumptions underlying the forecast generation and its uncertainty when forecasts are used for impacts studies. In this review we propose a possible classification system based on choices of metrics and sampling strategies. We illustrate the impact of some of the possible choices in the uncertainty quantification of large-scale projections of temperature and precipitation changes, and briefly discuss possible connections between climate forecast uncertainty quantification and decision-making approaches in the climate change context.
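One concrete instance of the choices such a classification would record is a metric-weighted ensemble: weight each member by its skill against observations, then form a weighted projection. The sketch below uses synthetic numbers and an inverse-mean-squared-error weighting as an assumed metric; real studies differ in both the metric and the sampling strategy, which is precisely the point of the proposed classification.

```python
import numpy as np

rng = np.random.default_rng(0)
obs = np.array([14.2, 14.3, 14.5])              # observed climatology (synthetic)
hindcasts = rng.normal(14.4, 0.3, size=(8, 3))  # 8 members' hindcasts of the same fields
projections = rng.normal(2.5, 0.8, size=8)      # each member's projected warming (K)

# Assumed metric: inverse mean squared error of the hindcast against observations.
mse = ((hindcasts - obs) ** 2).mean(axis=1)
weights = (1.0 / mse) / (1.0 / mse).sum()

print(f"Weighted projection: {weights @ projections:.2f} K "
      f"(unweighted mean: {projections.mean():.2f} K)")
```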
Abstract:
Palaeoclimates across Europe for 6000 y BP were estimated from pollen data using the modern pollen analogue technique constrained with lake-level data. The constraint consists of restricting the set of modern pollen samples considered as analogues of the fossil samples to those locations where the implied change in annual precipitation minus evapotranspiration (P–E) is consistent with the regional change in moisture balance as indicated by lakes. An artificial neural network was used for the spatial interpolation of lake-level changes to the pollen sites, and for mapping palaeoclimate anomalies. The climate variables reconstructed were mean temperature of the coldest month (Tc), growing degree days above 5 °C (GDD), moisture availability expressed as the ratio of actual to equilibrium evapotranspiration (α), and P–E. The constraint improved the spatial coherency of the reconstructed palaeoclimate anomalies, especially for P–E. The reconstructions indicate clear spatial and seasonal patterns of Holocene climate change, which can provide a quantitative benchmark for the evaluation of palaeoclimate model simulations. Winter temperatures (Tc) were 1–3 K greater than present in the far N and NE of Europe, but 2–4 K less than present in the Mediterranean region. Summer warmth (GDD) was greater than present in NW Europe (by 400–800 K day at the highest elevations) and in the Alps, but >400 K day less than present at lower elevations in S Europe. P–E was 50–250 mm less than present in NW Europe and the Alps, but α was 10–15% greater than present in S Europe and P–E was 50–200 mm greater than present in S and E Europe.
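The constrained analogue step can be sketched as follows: rank modern pollen samples by squared-chord distance to the fossil spectrum, but admit only candidates whose implied P–E anomaly agrees in sign with the lake-level evidence. All numbers below are invented for illustration.

```python
import numpy as np

def squared_chord(p, q):
    """Squared-chord distance between two pollen spectra (proportions)."""
    return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

fossil = np.array([0.40, 0.35, 0.25])         # fossil pollen proportions (3 taxa)
modern = np.array([[0.42, 0.33, 0.25],        # candidate modern samples
                   [0.10, 0.60, 0.30],
                   [0.38, 0.37, 0.25]])
implied_pe = np.array([-80.0, 120.0, -60.0])  # each candidate's implied P-E anomaly (mm)
lake_sign = -1.0                              # lake levels indicate drier than present

dists = np.array([squared_chord(fossil, m) for m in modern])
dists[np.sign(implied_pe) != lake_sign] = np.inf  # reject inconsistent analogues
best = int(np.argmin(dists))
print(f"Best constrained analogue: sample {best}, distance {dists[best]:.4f}")
```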
Abstract:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis, the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function, which upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter in iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure for the sensitivity of the problem to input data. The condition number can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using these bounds, we show that both formulations' sensitivities are related to the balance of error variances, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
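For orientation, the standard forms of the two objective functions in the usual notation (x_b the background state; B, R_i and Q_i the background, observation and model error covariances; H_i the observation operators; M_i the model); the thesis' exact formulation may differ in detail:

```latex
% Strong-constraint 4DVAR: x_0 is the only control variable and the
% trajectory satisfies x_i = M_i(x_{i-1}) exactly.
J_{\mathrm{sc}}(x_0) = \tfrac{1}{2}(x_0 - x_b)^{\top} B^{-1}(x_0 - x_b)
  + \tfrac{1}{2}\sum_{i=0}^{N} \bigl(y_i - H_i(x_i)\bigr)^{\top} R_i^{-1} \bigl(y_i - H_i(x_i)\bigr)

% Weak-constraint state estimation formulation: the full trajectory
% (x_0, \dots, x_N) is controlled; model error x_i - M_i(x_{i-1}) is
% penalised through its covariance Q_i, adding degrees of freedom.
J_{\mathrm{wc}}(x_0, \dots, x_N) = \tfrac{1}{2}(x_0 - x_b)^{\top} B^{-1}(x_0 - x_b)
  + \tfrac{1}{2}\sum_{i=0}^{N} \bigl(y_i - H_i(x_i)\bigr)^{\top} R_i^{-1} \bigl(y_i - H_i(x_i)\bigr)
  + \tfrac{1}{2}\sum_{i=1}^{N} \bigl(x_i - M_i(x_{i-1})\bigr)^{\top} Q_i^{-1} \bigl(x_i - M_i(x_{i-1})\bigr)
```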
Abstract:
Periocular recognition has recently become an active topic in biometrics. Typically, it uses 2D image data of the periocular region. This paper is the first description of combining 3D shape structure with 2D texture. A simple and effective technique using iterative closest point (ICP) was applied for 3D periocular region matching. It proved its strength for relatively unconstrained eye region capture, and requires no training. Local binary patterns (LBP) were applied for 2D image based periocular matching. The two modalities were combined at the score-level. This approach was evaluated using the Bosphorus 3D face database, which contains large variations in facial expressions, head poses and occlusions. The rank-1 accuracy achieved from the 3D data (80%) was better than that for 2D (58%), and the best accuracy (83%) was achieved by fusing the two types of data. This suggests that significant improvements to periocular recognition systems could be achieved using the 3D structure information that is now available from small and inexpensive sensors.
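Score-level fusion of this kind can be sketched in a few lines: normalise the ICP and LBP match scores to a common range, then combine them with a weighted sum. The weights and score conventions below are assumptions for illustration, not the paper's tuned values.

```python
import numpy as np

def min_max(scores):
    """Min-max normalise scores to [0, 1]."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

# Per-gallery-entry scores for one probe (synthetic). ICP yields a distance,
# so it is negated to make "larger = better" before normalisation.
icp_scores = -np.array([0.8, 1.6, 0.5, 2.1])     # negated 3D ICP registration errors
lbp_scores = np.array([0.72, 0.40, 0.81, 0.35])  # 2D LBP histogram similarities

w3d = 0.6  # assumed fusion weight favouring the stronger 3D modality
fused = w3d * min_max(icp_scores) + (1 - w3d) * min_max(lbp_scores)
print("Best match: gallery entry", int(np.argmax(fused)))
```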
Abstract:
Background: Ageing increases risk of respiratory infections and impairs the response to influenza vaccination. Pre- and probiotics offer an opportunity to modulate anti-viral defenses and the response to vaccination via alteration of the gut microbiota. This study investigated the effect of a novel probiotic, Bifidobacterium longum bv. infantis CCUG 52486, combined with a prebiotic, gluco-oligosaccharide (B. longum + Gl-OS), on the response to seasonal influenza vaccination in young and older subjects in a double-blind, randomized controlled trial, taking into account the influence of immunosenescence markers at baseline. Results: Vaccination resulted in a significant increase in total antibody titres, vaccine-specific IgA, IgM and IgG and seroprotection to all three subunits of the vaccine in both young and older subjects, and in general, the increases in young subjects were greater. There was little effect of the synbiotic, although it tended to reduce seroconversion to the Brisbane subunit of the vaccine and the vaccine-specific IgG response in older subjects. Immunological characterization revealed that older subjects randomized to the synbiotic had a significantly higher number of senescent (CD28-CD57+) helper T cells at baseline compared with those randomized to the placebo, and they also had significantly higher plasma levels of anti-CMV IgG and a greater tendency for CMV seropositivity. Moreover, higher numbers of CD28-CD57+ helper T cells were associated with failure to seroconvert to Brisbane, strongly suggesting that the subjects randomized to the synbiotic were already at a significant disadvantage in terms of likely ability to respond to the vaccine compared with those randomized to the placebo. Conclusions: Ageing was associated with marked impairment of the antibody response to influenza vaccination in older subjects and the synbiotic failed to reverse this impairment. However, the older subjects randomized to the synbiotic were at a significant disadvantage due to a greater degree of immunosenescence at baseline compared with those randomized to the placebo. Thus, baseline differences in immunosenescence between the randomized groups are likely to have influenced the outcome of the intervention, highlighting the need for detailed immunological characterization of subjects prior to interventions.