141 results for Manipulation techniques
Abstract:
The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applies to children. In an experimental design, 81 children aged 9–12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups, children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms, or inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, the time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations, and checks: as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children.
Abstract:
This paper presents findings from our study of peer-reviewed papers published at the International Conference on Persuasive Technology from 2006 to 2010. Of the 44 systems reviewed, 23 were reported to be successful, 2 to be unsuccessful, and 19 did not specify whether or not they were successful. A total of 56 different techniques were mentioned, and we observed that most designers use ad hoc definitions for the techniques or methods used in design. Hence we propose the need for research to establish unambiguous definitions of techniques and methods in the field.
Abstract:
Scope: Our aim was to determine the effects of chronic dietary fat manipulation on postprandial lipaemia according to apolipoprotein (APO)E genotype. Methods and results: Men (mean age 53 (SD 9) years), prospectively recruited according to APOE genotype (n = 12 E3/E3, n = 11 E3/E4), were assigned to a low-fat (LF) diet, a high-fat high-saturated-fat (HSF) diet, and an HSF diet with 3.45 g/day docosahexaenoic acid (HSF-DHA), each for an 8-week period, in the same order. At the end of each dietary period, a postprandial assessment was performed using a test meal with a macronutrient profile representative of that dietary intervention. A variable postprandial plasma triacylglycerol (TAG) response according to APOE genotype was evident, with a greater sensitivity to the TAG-lowering effects of DHA in APOE4 carriers (p ≤ 0.005). There was no independent genotype effect on any of the lipid measures. In the groups combined, dietary fat manipulation had a significant impact on lipids in plasma and in the Svedberg flotation rate (Sf) 60–400 TAG-rich lipoprotein fraction, with lower responses following the HSF-DHA intervention than the HSF intervention (p < 0.05). Conclusion: Although a modest impact of APOE genotype was observed on the plasma TAG profile, dietary fat manipulation emerged as the greater modulator of the postprandial lipid response in normolipidaemic men.
Abstract:
Advances in hardware and software technology enable us to collect, store, and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence but is also used in many other application areas, such as research, marketing, and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; and finance experts are interested in patterns that forecast the movement of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth, and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
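To make the data-parallel approach concrete, here is a minimal sketch, assuming a simple item-frequency mining task: the dataset is partitioned, each worker mines its partition independently, and the partial results are merged. The chunking scheme and the counting kernel are illustrative, not taken from the chapter.

```python
from collections import Counter
from multiprocessing import Pool

def local_counts(chunk):
    """Mine one data partition independently: count item occurrences."""
    counts = Counter()
    for transaction in chunk:
        counts.update(transaction)
    return counts

def parallel_item_counts(transactions, n_workers=4):
    """Split the data, mine the partitions in parallel, merge partial results."""
    size = max(1, len(transactions) // n_workers)
    chunks = [transactions[i:i + size] for i in range(0, len(transactions), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(local_counts, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)       # merge step is a simple reduction
    return total

if __name__ == "__main__":
    data = [["bread", "milk"], ["bread", "beer"], ["milk", "beer", "bread"]]
    print(parallel_item_counts(data, n_workers=2))
```

The same split/mine/merge structure carries over to distributed (Grid or Cloud) settings, where the partitions live on different nodes and only the compact partial results travel over the network.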
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers the author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters' News Corpus. The findings show that authors of Reuters' news articles provide good keyphrases when they supply them, but that more often than not they do not provide any keyphrases at all.
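For reference, the two strongest methods in the comparison, Term Frequency and Inverse Document Frequency, reduce to simple corpus statistics. The sketch below uses the standard textbook formulations; the toy corpus and the combined TF×IDF ranking are illustrative assumptions, not the paper's exact pipeline.

```python
import math
from collections import Counter

def term_frequency(doc_tokens):
    """Score candidate terms by raw frequency within one document."""
    return Counter(doc_tokens)

def inverse_document_frequency(corpus):
    """idf(t) = log(N / df(t)), where df(t) is the number of documents containing t."""
    n_docs = len(corpus)
    df = Counter()
    for doc in corpus:
        df.update(set(doc))         # count each term once per document
    return {t: math.log(n_docs / df[t]) for t in df}

# Toy corpus of pre-tokenised documents (illustrative only).
corpus = [["ocean", "temperature", "retrieval", "ocean"],
          ["ocean", "model", "climate"],
          ["keyphrase", "extraction", "corpus"]]

idf = inverse_document_frequency(corpus)
tf = term_frequency(corpus[0])
ranked = sorted(tf, key=lambda t: tf[t] * idf.get(t, 0.0), reverse=True)
print(ranked)   # terms of document 0, best keyphrase candidates first
```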
Abstract:
Khartoum, like many cities in the least developed countries (LDCs), still witnesses a huge influx of people. Accommodating the newcomers leads to encroachment on cultivation land and to the sprawling expansion of Greater Khartoum. The city's area expanded from 16.8 km² in 1955 to 802.5 km² in 1998, and most of this horizontal expansion was residential. In 2008 Khartoum accommodated 29% of the urban population of Sudan, and today it is considered one of the 43 major cities in Africa that accommodate more than 1 million inhabitants. Most newcomers live on the outskirts of the city, e.g. in the Dar El-Salam and Mayo neighbourhoods, and the majority build their houses, especially the walls, from mud, wood, straw, and sacks. Building materials are usually selected on price, regardless of environmental impact, quality, thermal performance, or the life of the material. Most of the time this increases costs and produces varying impacts on the environment over the life of the building. Therefore, consideration of the environmental, social, and economic impacts is crucial in the selection of any building material, and decreasing such impacts could lead to more sustainable housing. The sustainability of the wall building materials available for low-cost housing in Khartoum is compared through the life cycle assessment (LCA) technique. The purpose of this paper is to compare the most widely available local wall building materials for the urban poor of Khartoum from a sustainability point of view, following the materials through their manufacture, their use, and their disposal once their life comes to an end. Findings reveal that traditional red bricks cannot be considered a sustainable wall building material on which to base the future of low-cost housing in Greater Khartoum. On the other hand, the results of the comparison draw attention to the wide range of soil-based techniques and to their potential as a promising sustainable wall material for urban low-cost housing in Khartoum.
Abstract:
In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and with the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other; their common goal is the extraction of meaningful information from complex, and possibly large, data. However, whereas data mining relies on the processing power of silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques, and surveys existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.
Abstract:
Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the "traditional" wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique uses SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. However, it also includes an "atmospheric correction smoothing" step, which improves its noise performance and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly: the robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average, meaning that diurnal temperature variability and ocean frontal gradients are more faithfully estimated and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
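For readers unfamiliar with OE, a single-pixel retrieval of this kind typically takes the standard linearised (Rodgers-style) form below; the notation is the conventional one and is an assumption rather than the paper's exact formulation. The proposed smoothing analogue simply appends the neighbourhood-averaged clear-sky BTs to the observation vector and the locally averaged total column water vapour to the state vector.

```latex
\hat{\mathbf{x}} = \mathbf{x}_a
  + \left(\mathbf{K}^{\mathsf{T}}\mathbf{S}_{\epsilon}^{-1}\mathbf{K}
        + \mathbf{S}_a^{-1}\right)^{-1}
    \mathbf{K}^{\mathsf{T}}\mathbf{S}_{\epsilon}^{-1}
    \left(\mathbf{y} - F(\mathbf{x}_a)\right)
```

Here x_a is the prior state (SST and total column water vapour), F is the radiative transfer forward model, K its Jacobian, y the observed BTs, and S_ε and S_a the observation and prior error covariances.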
Abstract:
The effects of state and trait anxiety on observer ratings of social skill, and on negatively biased self-perception of social skill, were examined. Participants were aged between 7 and 13 years (mean = 9.65; SD = 1.77; N = 102); 47 had a current anxiety diagnosis and 55 were non-anxious controls. Participants were randomly allocated to a high- or low-anxiety condition and asked to complete social tasks, with task instructions adjusted across conditions to manipulate participants' state anxiety. Observers rated anxious participants as having poorer social skills than non-anxious controls, but there was no evidence that anxious participants exhibited a negative self-perception bias relative to controls. However, as participants' ratings of state anxiety increased, their perception of their performance became more negatively biased. The results suggest that anxious children may exhibit real impairments in social skill and that high levels of state anxiety can lead to biased judgements of social skills in anxious and non-anxious children alike.
Abstract:
Cities, which are now inhabited by a majority of the world's population, are not only an important source of global environmental and resource-depletion problems but can also act as important centres of technological innovation and social learning in the continuing quest for a low-carbon future. Planning and managing large-scale transitions in cities to deal with these pressures requires an understanding of urban retrofitting at the city scale. In this context, performative techniques (such as backcasting and roadmapping) can provide valuable tools for helping cities develop a strategic view of the future. However, it is also important to identify 'disruptive' and 'sustaining' technologies that may contribute to city-based sustainability transitions. This paper presents research findings from the EPSRC Retrofit 2050 project and explores the relationship between the technology roadmapping and transition theory literatures, highlighting the research gaps at the urban/city level. The paper develops a research methodology to describe the development of three guiding visions for city-regional retrofit futures, and identifies key sustaining and disruptive technologies at the city scale within these visions using foresight (horizon-scanning) techniques. The implications of the research for city-based transition studies and related methodologies are discussed.
Abstract:
BACKGROUND: Reduction of vegetation height is recommended as a management strategy for controlling rodent pests of rice in South-east Asia, but there are limited field data to assess its effectiveness. The breeding biology of the main pest species of rodent in the Philippines, Rattus tanezumi, suggests that habitat manipulation in irrigated rice–coconut cropping systems may be an effective strategy to limit the quality and availability of their nesting habitat. The authors imposed a replicated manipulation of vegetation cover in adjacent coconut groves during a single rice-cropping season, and added artificial nest sites to facilitate capture and culling of young. RESULTS: Three trapping sessions in four rice fields (two treatments, two controls) adjacent to coconut groves led to the capture of 176 R. tanezumi, 12 Rattus exulans and seven Chrotomys mindorensis individuals. There was no significant difference in overall abundance between crop stages or between treatments, and there was no treatment effect on damage to tillers or rice yield. Only two R. tanezumi were caught at the artificial nest sites. CONCLUSION: Habitat manipulation to reduce the quality of R. tanezumi nesting habitat adjacent to rice fields is not effective as a lone rodent management tool in rice–coconut cropping systems.
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
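A minimal sketch of the scheduling pattern described above, in Python rather than the authors' C: worker threads pull tasks from a shared queue, and columns are packed four at a time into a single array so each task can be computed with vectorised operations (standing in for the SIMD step). The column data and the "radiation" kernel are placeholders, not the FAMOUS algorithm.

```python
import queue
import threading
import numpy as np

NUM_WORKERS = 4
PACK = 4  # columns computed together, mimicking 4-wide SIMD lanes

def radiation_kernel(packed_cols):
    """Placeholder for the per-column radiation computation, applied to
    PACK columns at once via vectorised NumPy operations."""
    return np.sqrt(packed_cols) + 1.0

def worker(tasks, results):
    """Pull packed-column tasks from the queue until a sentinel arrives."""
    while True:
        item = tasks.get()
        if item is None:            # sentinel: no more work
            tasks.task_done()
            return
        idx, packed = item
        results[idx] = radiation_kernel(packed)
        tasks.task_done()

def compute_columns(columns):
    """columns: array of shape (n_columns, n_levels)."""
    tasks, results = queue.Queue(), {}
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(NUM_WORKERS)]
    for t in threads:
        t.start()
    for i in range(0, len(columns), PACK):
        tasks.put((i, columns[i:i + PACK]))   # one task = PACK packed columns
    for _ in threads:
        tasks.put(None)                       # one sentinel per worker
    for t in threads:
        t.join()
    return np.concatenate([results[i] for i in sorted(results)])

print(compute_columns(np.arange(32.0).reshape(8, 4)).shape)   # -> (8, 4)
```

The task queue naturally load-balances columns of varying cost across processors, which is one reason such a design can outperform a static MPI domain decomposition for this kind of workload.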
Abstract:
Apolipoprotein E (APOE) genotype is believed to play an important role in cardiovascular risk. APOE4 carriers have been associated with higher blood lipid levels and a more pro-inflammatory state compared with APOE3/E3 individuals. Although dietary fat composition has been considered to modulate the inflammatory state in humans, very little is known about how APOE genotype can impact this response. In a follow-up to the main SATgene study, we aimed to explore the effects of APOE genotype, as well as dietary fat manipulation, on ex vivo cytokine production. Blood samples were collected from a subset of SATgene participants (n = 52/88), prospectively recruited according to APOE genotype (n = 26 E3/E3 and n = 26 E3/E4), after low-fat (LF), high-saturated-fat (HSF) and HSF with 3.45 g/day docosahexaenoic acid (DHA) dietary periods (each diet eight weeks in duration, assigned in the same order) for the measurement of ex vivo cytokine production using whole blood culture (WBC). Concentrations of IL-1β, IL-6, IL-8, IL-10 and TNF-α were measured in WBC supernatant samples after stimulation for 24 h with either 0.05 or 1 μg/ml of bacterial lipopolysaccharide (LPS). Cytokine levels were not influenced by genotype, whereas dietary fat manipulation had a significant impact on TNF-α and IL-10 production; TNF-α concentration was higher after consumption of the HSF diet compared with baseline and the LF diet (P < 0.05), whereas IL-10 concentration was higher after the LF diet compared with baseline (P < 0.05). In conclusion, our study has revealed that the amount and type of dietary fat can significantly modulate the production of TNF-α and IL-10 by ex vivo LPS-stimulated WBC samples obtained from normolipidaemic subjects.
Abstract:
We present five new cloud detection algorithms over land, based on dynamic-threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9% over SADIST (maximum score of 77.93% compared to 69.27%). We present an assessment of the impact of imperfect cloud masking, relative to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7% in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02% compared to 85.69%). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, a result applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
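Bayesian cloud detection of the kind referred to above rests on a standard two-class posterior. The sketch below shows that rule for a single brightness-temperature (BT) observation; the Gaussian class-conditional statistics and prior are made up for illustration and are not the paper's trained distributions.

```python
import numpy as np

P_CLEAR = 0.7   # assumed prior probability of clear sky over land (illustrative)

def gauss_pdf(x, mu, sigma):
    """Gaussian probability density, used as a toy class-conditional model."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def prob_clear(bt):
    """Two-class Bayes rule: P(clear | BT) from clear and cloudy likelihoods."""
    like_clear = gauss_pdf(bt, mu=290.0, sigma=3.0)    # assumed clear-sky BT model (K)
    like_cloud = gauss_pdf(bt, mu=270.0, sigma=10.0)   # assumed cloudy BT model (K)
    num = like_clear * P_CLEAR
    return num / (num + like_cloud * (1.0 - P_CLEAR))

bts = np.array([292.0, 285.0, 268.0])
print(prob_clear(bts))          # per-pixel posterior probability of clear sky
print(prob_clear(bts) > 0.9)    # conservative clear-sky mask for LST retrieval
```

Unlike a fixed threshold test, the posterior degrades gracefully for ambiguous pixels, which is consistent with the improved true skill scores reported for the Bayesian schemes.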