815 results for Symbolic utility
Abstract:
This thesis examines the manufacture, use, exchange (including gift exchange), collecting and commodification of German medals and badges from the early 18th century until the present day, with particular attention given to the symbols deployed by the National Socialist German Workers’ Party (NSDAP) between 1919 and 1945. It does so by focusing in particular on the construction of value through insignia, and on how such badges and their symbolic and monetary value changed over time. In order to achieve this, the thesis adopts a chronological structure, encompassing the creation of Prussia in 1701, the Napoleonic wars and the increased democratisation of military awards such as the Iron Cross during the Great War. The collapse of the Kaiserreich in 1918 was the major factor that led to the creation of the NSDAP, a fundamentally racist and anti-Semitic movement that eventually came under Hitler’s stranglehold and continued the German tradition of awarding and wearing badges. The traditional symbols of Imperial Germany, such as the eagle, were then infused with the swastika, an emblem meant to signify anti-Semitism, thus creating a hybrid identity. This combination was then replicated en masse, and eventually eclipsed all the symbols that had possessed symbolic significance in Germany’s past. After Hitler was appointed Chancellor in 1933, millions of medals and badges were produced in an effort to create a racially based “People’s Community”, but because steel and iron were required for munitions, substitute materials were eventually developed and utilised to manufacture millions of politically oriented badges. The Second World War unleashed Nazi terror across Europe, and the conscripts and volunteers who took part in this fight for living space were rewarded with medals modelled on those instituted during Imperial times. The colonial conquest and occupation of the East by the Wehrmacht, the Order Police and the Waffen-SS surpassed the brutality of former wars and culminated in the Holocaust, and the perpetrators of some of these horrific crimes were perversely rewarded with medals and badges. Despite Nazism being thoroughly discredited, many of the Allied soldiers who occupied Germany took part in the age-old practice of obtaining trophies of war, which reconfigured the meaning of Nazi badges as souvenirs and began the process of their increased commodification on an emerging secondary collectors’ market. In order to analyse the dynamics of this market, a “basket” of badges is examined, enabling a discussion of the role that aesthetics, scarcity and authenticity play in determining the price of the artefacts. In summary, this thesis demonstrates how the symbolic, socio-economic and exchange value of German military and political medals and badges has changed substantially over time, provides a stimulus for scholars to conduct research in this under-developed area, and encourages collectors to investigate the artefacts that they collect in a more historically contextualised manner.
Abstract:
With the growing interest in reducing power consumption in electrical applications, energy-saving devices and the problems of their operation are gaining more attention. The increasing use of energy-saving lamps, especially LED lamps, has generated wide interest in their influence on the power system. The impact of these devices on the voltage and current in the grid must be thoroughly studied before they are launched on the market. Studies of LED lamps address in depth the issue of harmonic emission from the lamps and its influence on power quality. Owing to the many manufacturing technologies, different LED devices can have various effects on the utility. It is important to carry out research on the adverse effects of energy-saving lamps in order to ensure that the devices in use meet the requirements set by international commissions.
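A common figure of merit for the harmonic emission discussed above is the total harmonic distortion (THD) of the lamp's input current. The sketch below is illustrative only; it is not drawn from the study, and the current spectrum is hypothetical.

import numpy as np

def thd(harmonics):
    """Total harmonic distortion of a current waveform.

    harmonics[0] is the RMS amplitude of the fundamental I1; the
    remaining entries are the higher harmonics I2, I3, ...
    """
    fundamental = harmonics[0]
    higher = np.asarray(harmonics[1:])
    return np.sqrt(np.sum(higher ** 2)) / fundamental

# Hypothetical current spectrum of a low-cost LED driver (RMS amperes);
# odd harmonics dominate, as is typical of rectifier front ends.
currents = [0.120, 0.002, 0.085, 0.001, 0.055]  # I1..I5
print(f"THD_I = {thd(currents):.1%}")  # about 84% for this spectrum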
Abstract:
Within the last few years, disabled people have become the target of government austerity measures through drastic cuts to welfare, justified through the portrayal of benefit claimants as inactive, problem citizens who are wilfully unemployed. For all that is wrong with these cuts, they are only one of the many forms of exclusion that disabled people face. Attitudes towards disability are deteriorating (Scope, 2011) and disabled people are devalued and negatively positioned in a myriad of ways, which makes an understanding of the perceptions and positioning of disability, and of the power of disabling practices, critical. This thesis will examine how Bourdieu’s theoretical repertoire may be applied to the area of Disability Studies in order to discern how society produces oppressive and exclusionary systems of classification which structure the social position and perceptions of disability. The composite nature of disability, and the multiple forms of exclusion and inequality associated with it, benefits from a multipronged approach which acknowledges personal, embodied and psychological aspects of disability alongside socio-political and cultural conceptualisations. Bourdieu’s approach brings the micro and macro aspects of social life together through their meso-level interplay and provides a thorough analysis of the many aspects of disability.
Abstract:
Historically, the imaginary and hegemonic thinking of the Western global North have been marked by capitalist epistemologies and archetypes. Design may seem to be a practice and discipline shielded by a simplistic discourse of functional/communicative efficiency, wandering through multiple aestheticisms apparently neutral in relation to the symbolic; but in fact it never is, because the aesthetic appearance of the generated forms is always a reflection of the ruling powers. We start from the understanding that the act of creating an aesthetic artifact is also a movement of inscription on a discursive platform (one that precedes it); it is in itself a narrative act, and as such it represents a taking of position in relation to a certain symbolic reality. This reflection sees design as a discipline and/or an instrument of action whose operational relevance tends to question and simultaneously rehearse a response, one in which the answer to the ‘why’ matters more than the ‘how’. Design is apparently a mediator of content, but it is also structure, body and idea. We think of design praxis as a discipline and as a tool for the inscription of critical thought and social transformation, and thus as an agent of social change. To guide the research in this text, we propose the following question: can design claim for itself an engagement with the symbolic, so as to be an active part in the production of critical thinking in the place to which it belongs? Methodologically, our argument is presented in two distinct moments: 1. a first, exploratory in nature, in which we recover the issues of drawing within the practice of design; and 2. a second, analytical in nature, concerning how (graphic and/or utilitarian) design incorporates the formal rites, political events and social practices of contemporary everyday life. With this study we seek to contribute to a phenomenology of design by studying the configuration of artifacts, the messages they may convey and the impact they may have on the social fabric.
Abstract:
Poster presented at the 25th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID). Copenhagen, Denmark, 25–28 April 2015
Abstract:
Several factors have recently converged, elevating the need for highly parallel diagnostic platforms that have the ability to detect many known, novel, and emerging pathogenic agents simultaneously. Panviral DNA microarrays represent the most robust approach for massively parallel viral surveillance and detection. The Virochip is a panviral DNA microarray that is capable of detecting all known viruses, as well as novel viruses related to known viral families, in a single assay and has been used to successfully identify known and novel viral agents in clinical human specimens. However, the usefulness and the sensitivity of the Virochip platform have not been tested on a set of clinical veterinary specimens with the high degree of genetic variance that is frequently observed with swine virus field isolates. In this report, we investigate the utility and sensitivity of the Virochip to positively detect swine viruses in both cell culture-derived samples and clinical swine samples. The Virochip successfully detected porcine reproductive and respiratory syndrome virus (PRRSV) in serum containing 6.10 × 10² viral copies per microliter and influenza A virus in lung lavage fluid containing 2.08 × 10⁶ viral copies per microliter. The Virochip also successfully detected porcine circovirus type 2 (PCV2) in serum containing 2.50 × 10⁸ viral copies per microliter and porcine respiratory coronavirus (PRCV) in turbinate tissue homogenate. Collectively, the data in this report demonstrate that the Virochip can successfully detect pathogenic viruses frequently found in swine in a variety of solid and liquid specimens, such as turbinate tissue homogenate and lung lavage fluid, as well as antemortem samples, such as serum.
Abstract:
Mathematical skills that we acquire during formal education mostly entail exact numerical processing. Besides this specifically human faculty, an additional system exists to represent and manipulate quantities in an approximate manner. We share this innate approximate number system (ANS) with other nonhuman animals and are able to use it to process large numerosities long before we can master the formal algorithms taught in school. Dehaene's (1992) Triple Code Model (TCM) states that even after the onset of formal education, approximate processing is carried out in this analogue magnitude code, regardless of whether the original problem was presented nonsymbolically or symbolically. Despite the wide acceptance of the model, most research uses only nonsymbolic tasks to assess ANS acuity. Because of this silent assumption that genuine approximation can only be tested with nonsymbolic presentations, important implications in research domains of high practical relevance have so far remained unclear, and existing potential is not fully exploited. For instance, it has been found that nonsymbolic approximation can predict math achievement one year later (Gilmore, McCarthy, & Spelke, 2010), that it is robust against the detrimental influence of learners' socioeconomic status (SES), and that it is suited to fostering performance in exact arithmetic in the short term (Hyde, Khanum, & Spelke, 2014). We provided evidence that symbolic approximation might be equally and in some cases even better suited to generating predictions and fostering more formal math skills independently of SES. In two longitudinal studies, we realised exact and approximate arithmetic tasks in both a nonsymbolic and a symbolic format. With first graders, we demonstrated that performance in symbolic approximation at the beginning of term was the only measure consistently not varying according to children's SES, and of the two approximate tasks it was the better predictor of math achievement at the end of first grade. In part, this strong connection seems to arise from mediation through ordinal skills. In two further experiments, we tested the suitability of both approximation formats for inducing an arithmetic principle in elementary school children. We found that symbolic approximation was as effective as direct instruction in making children exploit the additive law of commutativity in a subsequent formal task. Nonsymbolic approximation, on the other hand, had no beneficial effect. The positive influence of the symbolic approximate induction was strongest in children just starting school and decreased with age; however, even third graders still profited from the induction. The results show that symbolic problems, too, can be processed as genuine approximation, and that beyond this they have their own specific value with regard to didactic-educational concerns. Our findings furthermore demonstrate that the two often confounded factors 'format' and 'demanded accuracy' cannot easily be disentangled in first graders' numerical understanding, and that children's SES also influences the existing interrelations between the different abilities tested here.
Abstract:
A prospective randomised controlled clinical trial of treatment decisions informed by invasive functional testing of coronary artery disease severity, compared with standard angiography-guided management, was implemented in 350 patients with a recent non-ST elevation myocardial infarction (NSTEMI) admitted to 6 hospitals in the National Health Service. The main aims of this study were to examine the utility of both invasive fractional flow reserve (FFR) and non-invasive cardiac magnetic resonance imaging (MRI) amongst patients with a recent diagnosis of NSTEMI. In summary, the findings of this thesis are: (1) the use of FFR combined with intravenous adenosine was feasible and safe amongst patients with NSTEMI and has clinical utility; (2) there was discordance between the visual, angiographic estimation of lesion significance and FFR; (3) the use of FFR led to changes in treatment strategy and an increase in the prescription of medical therapy in the short term compared with an angiographically guided strategy; (4) the incidence of major adverse cardiac events (MACE) at 12 months of follow-up was similar in the two groups. Cardiac MRI was used in a subset of patients enrolled in two hospitals in the West of Scotland. T1 and T2 mapping methods were used to delineate territories of acute myocardial injury. T1 and T2 mapping were superior to conventional T2-weighted dark blood imaging for estimation of the ischaemic area-at-risk (AAR), with less artefact, in NSTEMI. There was poor correlation between the angiographic AAR and the MRI methods of AAR estimation in patients with NSTEMI. FFR had a high accuracy for predicting inducible perfusion defects demonstrated on stress perfusion MRI. This thesis describes the largest randomised trial published to date specifically examining the clinical utility of FFR in the NSTEMI population. We have provided evidence of the diagnostic and clinical utility of FFR in this group of patients and provide evidence to inform larger studies. This thesis also describes the largest MRI cohort to date, including myocardial stress perfusion assessments, specifically examining the NSTEMI population. We have demonstrated the diagnostic accuracy of FFR in predicting reversible ischaemia, referenced to a non-invasive gold standard with MRI. This thesis has also shown the futility of using dark blood oedema imaging amongst all-comer NSTEMI patients when compared with novel T1 and T2 mapping methods.
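For context, fractional flow reserve is a standard pressure-wire index (the definition below is general knowledge, not specific to this thesis): the ratio of mean distal coronary pressure to mean aortic pressure during adenosine-induced maximal hyperaemia, with values at or below roughly 0.80 conventionally taken as haemodynamically significant.

\[
\mathrm{FFR} \;=\; \frac{P_{d}}{P_{a}} \quad \text{(during maximal hyperaemia)},
\qquad \mathrm{FFR} \le 0.80 \;\Rightarrow\; \text{lesion considered significant.}
\]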
Abstract:
Background/Aims: The Mini Addenbrooke’s Cognitive Examination (M-ACE) is the abbreviated version of the widely used Addenbrooke’s Cognitive Examination (ACE-III), a cognitive screening tool used internationally in the assessment of mild cognitive impairment (MCI) and dementia. The objectives of this study were to investigate the diagnostic accuracy of the M-ACE in individuals aged 75 and over, to distinguish between those who do and do not have dementia or MCI, and to establish whether the cut-off scores recommended by Hsieh et al. (2014) [9] in the original validation study for the M-ACE are optimal for this age group. Methods: The M-ACE was administered to 58 participants (24 with a diagnosis of dementia, 17 with a diagnosis of MCI and 17 healthy controls). The extent to which scores distinguished between groups (dementia, MCI or no diagnosis) was explored using receiver operating characteristic curve analysis. Results: The optimal cut-off for detecting dementia was ≤ 21/30 (a score of ≤ 21/30 indicating dementia, with a sensitivity of 0.95, a specificity of 1 and a positive predictive value of 1), compared with the original, higher published cut-off of ≤ 25/30 (sensitivity of 0.95, specificity of 0.70 and a positive predictive value of 0.82 in this sample). Conclusions: The M-ACE has excellent diagnostic accuracy for the detection of dementia in a UK clinical sample. It may be necessary to consider lower cut-offs than those given in the original validation study.
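As a worked illustration of how the accuracy figures above are obtained (the 2 × 2 tallies below are hypothetical, chosen only to be roughly consistent with the values reported at the ≤ 21/30 cut-off):

def diagnostic_accuracy(tp, fp, tn, fn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)  # cases correctly flagged as dementia
    specificity = tn / (tn + fp)  # non-cases correctly cleared
    ppv = tp / (tp + fp)          # flagged results that are true cases
    return sensitivity, specificity, ppv

# Hypothetical tallies: 24 dementia cases, 17 controls, cut-off <= 21/30
sens, spec, ppv = diagnostic_accuracy(tp=23, fp=0, tn=17, fn=1)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")
# -> sensitivity=0.96 (about 0.95 as reported), specificity=1.00, PPV=1.00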
Abstract:
We consider a system described by the linear heat equation with adiabatic boundary conditions which is perturbed periodically. This perturbation is nonlinear and is characterized by a one-parameter family of quadratic maps. The system, depending on the parameters, presents very complex behaviour. We introduce a symbolic framework to analyze the system and summarize its most important features.
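The abstract does not state the equations, so the following is a minimal sketch of a system of the kind described, assuming the quadratic family is the logistic family: the heat equation with adiabatic (homogeneous Neumann) boundary conditions, with the state reset at each period T by a map f_mu.

\[
\begin{aligned}
&u_{t} = \kappa\,u_{xx}, && x\in(0,L),\ t\neq nT,\\
&u_{x}(0,t) = u_{x}(L,t) = 0, && \text{(adiabatic boundaries)},\\
&u(x,nT^{+}) = f_{\mu}\!\left(u(x,nT^{-})\right), && f_{\mu}(u) = \mu\,u(1-u).
\end{aligned}
\]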
Abstract:
We consider piecewise-defined differential dynamical systems which can be analysed through symbolic dynamics and transition matrices. There is a continuous regime, where the time flow is characterized by an ordinary differential equation (ODE) with explicit solutions, and a singular regime, where the time flow is characterized by an appropriate transformation. The symbolic codification is given by associating a symbol with each distinct regular system and singular system. The transition matrices are then determined as linear approximations to the symbolic dynamics. We analyse the dependence on initial conditions, parameter variation and the occurrence of global strange attractors.
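A minimal sketch of the codification just described (the regime names and the itinerary below are hypothetical): each regular or singular regime is assigned a symbol, an observed orbit yields a symbolic itinerary, and a 0/1 transition matrix records which regime-to-regime transitions occur; the log of its spectral radius is a standard estimate of the topological entropy.

import numpy as np

def transition_matrix(itinerary, symbols):
    """0/1 matrix of the regime-to-regime transitions observed in an itinerary."""
    index = {s: i for i, s in enumerate(symbols)}
    A = np.zeros((len(symbols), len(symbols)), dtype=int)
    for a, b in zip(itinerary, itinerary[1:]):
        A[index[a], index[b]] = 1  # transition a -> b occurs
    return A

symbols = ["R1", "R2", "S"]  # two regular regimes and one singular regime
itinerary = ["R1", "S", "R2", "R1", "S", "R1", "R2", "S"]  # hypothetical orbit
A = transition_matrix(itinerary, symbols)
h_top = np.log(max(abs(np.linalg.eigvals(A))))  # topological entropy estimate
print(A)
print(f"h_top = {h_top:.3f}")  # ln 2, about 0.693, for this itinerary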
Abstract:
The demand to implement routine outcome assessment in mental health care services calls for measures with clinical utility, i.e., feasible for therapists, acceptable to clients and generalizable across settings. This research aims to explore the clinical utility of a patient-generated measure, the Personal Questionnaire (PQ). An on-line survey was designed (study I) and administered to an international sample of 25 therapists with experience using the PQ (study II). Results suggest that the PQ is perceived as a clinically significant and fairly practical measure, useful not only in assessing outcome but also in various clinical tasks. Furthermore, it is relatively well accepted by clients and extremely generalizable to different clients, clinical approaches and settings. Specific suggestions to increase the PQ’s clinical utility are provided. Exploring therapists’ perspectives and practices will improve the appropriateness of measures to real-world clinical settings.
Abstract:
Process modeling can be regarded as the currently most popular form of conceptual modeling. Research evidence illustrates how process modeling is applied across the different information system life cycle phases for a range of applications, such as the configuration of Enterprise Systems, workflow management, or software development. However, a detailed discussion of the critical factors of process model quality is still missing. This paper proposes a framework consisting of six quality factors, derived from a comprehensive literature review. It then presents a case study of a utility provider that had designed various business process models for the selection of an Enterprise System. The paper summarizes potential means of conducting a successful process modeling initiative and evaluates the described modeling approach within the Guidelines of Modeling (GoM) framework. An outlook presents the potential lessons learnt and concludes with insights into the next phases of this study.