873 results for Embedded and embodied cognition
Abstract:
The fabrication of in-fibre Bragg gratings, and the application of arrays of such gratings as strain sensors and as true time delay elements for the control of phased array antennas, is reported. Chirped period Bragg gratings were produced using the fibre deformation fabrication technique, with chirps of between 2.9nm and 17.3nm achieved. Arrays of 5mm and 2mm long uniform period Bragg gratings were fabricated using the dissimilar wavefronts inscription method, for use as true time delay elements, and their spectral characteristics recorded. The uniform period grating arrays were used to create minimum time delays of 9.09ps, 19.02ps and 31ps, making them suitable for controlling phased array antennas operating at RF frequencies of up to 3GHz with 10° phase resolution. Four 4mm long chirped gratings were produced using the dissimilar wavefronts fabrication method, having chirps of 7nm, 12nm, 20nm and 30nm, and were used to create time delays of between 0.3ps and 59ps; they are hence suitable for controlling phased array antennas at RF frequencies of up to 48GHz. The application of in-fibre Bragg gratings as strain sensors within smart structure materials was investigated, with their sensitivity to applied strain and compression measured for both embedded and surface mounted uniform period and fibre Fabry-Perot filter gratings. A fibre Bragg grating sensor demultiplexing scheme based on a liquid crystal filled Fabry-Perot (LCFP) etalon tuneable transmission filter was proposed, successfully constructed and fully characterised. Three characteristics of the LCFP etalon were found to pose operational limitations to its application in a Bragg grating sensor system; most significantly, the resonance peak wavelength was highly temperature dependent (-2.77nm/°C). Several methods for minimising this temperature sensitivity were investigated, but enjoyed only limited success.
It was therefore concluded that this type (E7 filled) of LCFP etalon is unsuitable for use as a Bragg grating sensor demultiplexing element.
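As a rough check on the quoted figures, the phase shift contributed by a true time delay Δt at an RF carrier frequency f is φ = 360°·f·Δt, so the 9.09ps minimum delay corresponds to just under 10° at 3GHz. A minimal sketch (the function name is ours):

```python
def phase_step_deg(delay_s, rf_hz):
    # Phase shift (in degrees) produced by a true time delay at a given
    # RF carrier frequency: phi = 360 * f * dt
    return 360.0 * rf_hz * delay_s

# Minimum uniform-period grating delay of 9.09 ps at a 3 GHz carrier:
print(round(phase_step_deg(9.09e-12, 3e9), 1))  # -> 9.8 (degrees), i.e. ~10° resolution
```

The same relation shows why the much finer 0.3ps delay of the chirped gratings supports carriers up to 48GHz at comparable phase resolution.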
Abstract:
In recent years there has been growing concern about the emission trade balances of countries, since countries with open economies are active players in international trade. Trade is not only a major factor in forging a country’s economic structure, but also contributes to the movement of embodied emissions beyond country borders. This issue is especially relevant from the perspectives of carbon accounting policy and domestic production, as the production-based principle is the one employed in the Kyoto agreement. The research described herein was designed to reveal the interdependence of countries through international trade and the corresponding embodied emissions, both at the national and at the sectoral level, and to illustrate the significance of consumption-based emission accounting. We show to what extent consumption-based accounting would change the present system, which is based on production-based accounting and allocation. The relationship between CO2 emissions embodied in exports and those embodied in imports is analysed. International trade can blur the responsibility for the ecological effects of production and consumption, and it can lengthen the link between consumption and its consequences. Input-output models are used in the methodology, as they provide an appropriate framework for climate change accounting. The analysis comprises an international comparative study of four European countries (Germany, the United Kingdom, the Netherlands, and Hungary) with extensive trading activities and carbon emissions. Moving from a production-based approach in climate policy to a consumption-based principle and allocation approach would help to increase the efficiency of emission reductions and would force countries to rethink their trading activities in order to decrease the environmental load of production activities.
The results of this study show that it is important to distinguish between the two emission accounting approaches, both at the global and at the local level.
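The input-output accounting this kind of study relies on can be illustrated with the standard Leontief model: consumption-based (embodied) emissions are obtained as f(I−A)⁻¹y, where A is the matrix of technical coefficients, f the vector of direct emission intensities, and y final demand. A toy two-sector sketch, with made-up numbers purely for illustration:

```python
import numpy as np

# Hypothetical 2-sector economy (illustrative numbers only).
A = np.array([[0.2, 0.3],    # technical coefficients: input from i per unit output of j
              [0.1, 0.4]])
f = np.array([0.5, 1.2])     # direct CO2 intensity per unit of sectoral output
y = np.array([100.0, 50.0])  # final demand served by each sector

L = np.linalg.inv(np.eye(2) - A)  # Leontief inverse: gross output per unit final demand
x = L @ y                         # gross output needed to satisfy final demand
m = f @ L                         # emission multipliers per unit of final demand

total = f @ x                     # consumption-based (embodied) emissions
print(round(total, 2))            # equals (m * y).sum(): emissions allocated to demand
```

Replacing the domestic final demand vector with an export or import vector attributes emissions to trade flows, which is the comparison the study performs across the four countries.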
Abstract:
This thesis extends previous research on critical decision making and problem solving by refining and validating a self-report measure designed to assess the use of critical decision making and problem solving in making life choices. Two studies were performed, yielding two sets of data on the psychometric properties of the measure. Psychometric analyses included item analysis, internal consistency reliability, interrater reliability, and an exploratory factor analysis. The study also included a regression analysis with the Wonderlic, an established measure of general intelligence, to provide preliminary evidence for the construct validity of the measure.
Abstract:
The coupling of mechanical stress fields in polymers to covalent chemistry (polymer mechanochemistry) has provided access to previously unattainable chemical reactions and polymer transformations. In the bulk, mechanochemical activation has been used as the basis for new classes of stress-responsive polymers that demonstrate stress/strain sensing, shear-induced intermolecular reactivity for molecular level remodeling and self-strengthening, and the release of acids and other small molecules that are potentially capable of triggering further chemical response. The potential utility of polymer mechanochemistry in functional materials is limited, however, by the fact that to date, all reported covalent activation in the bulk occurs in concert with plastic yield and deformation, so that the structure of the activated object is vastly different from its nascent form. Mechanochemically activated materials have thus been limited to “single use” demonstrations, rather than as multi-functional materials for structural and/or device applications. Here, we report that filled polydimethylsiloxane (PDMS) elastomers provide a robust elastic substrate into which mechanophores can be embedded and activated under conditions from which the sample regains its original shape and properties. Fabrication is straightforward and easily accessible, providing access for the first time to objects and devices that either release or reversibly activate chemical functionality over hundreds of loading cycles.
While the mechanically accelerated ring-opening reaction of spiropyran to merocyanine and the associated color change provide a useful method by which to image the molecular-scale stress/strain distribution within a polymer, the magnitude of the forces necessary for activation had yet to be quantified. Here, we report single molecule force spectroscopy studies of two spiropyran isomers. Ring opening on the timescale of tens of milliseconds is found to require forces of ~240 pN, well below that of previously characterized covalent mechanophores. The lower threshold force is a combination of a low force-free activation energy and the fact that the change in rate with force (activation length) of each isomer is greater than that inferred in other systems. Importantly, quantifying the magnitude of forces required to activate individual spiropyran-based force-probes enables the probe to behave as a “scout” of molecular forces in materials, whose observed behavior can be extrapolated to predict the reactivity of potential mechanophores within a given material and deformation.
We subsequently translated the design platform to existing dynamic soft technologies to fabricate the first mechanochemically responsive devices; first, by remotely inducing dielectric patterning of an elastic substrate to produce assorted fluorescent patterns in concert with topological changes; and second, by adopting a soft robotic platform to produce a color change from the strains inherent to pneumatically actuated robotic motion. Shown herein, covalent polymer mechanochemistry provides a viable mechanism to convert the same mechanical potential energy used for actuation into value-added, constructive covalent chemical responses. The color change associated with actuation suggests opportunities for not only new color changing or camouflaging strategies, but also the possibility for simultaneous activation of latent chemistry (e.g., release of small molecules, change in mechanical properties, activation of catalysts, etc.) in soft robots. In addition, mechanochromic stress mapping in a functional actuating device might provide a useful design and optimization tool, revealing spatial and temporal force evolution within the actuator in a way that might also be coupled to feedback loops that allow autonomous, self-regulation of activity.
In the future, both the specific material and the general approach should be useful in enriching the responsive functionality of soft elastomeric materials and devices. We anticipate the development of new mechanophores that, like the materials, are reversibly and repeatedly activated, expanding the capabilities of soft, active devices and further permitting dynamic control over chemical reactivity that is otherwise inaccessible, each in response to a single remote signal.
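The force dependence described above, a rate that grows with applied force through an activation length, is conventionally modelled with the Bell relation k(F) = k₀·exp(F·Δx/k_BT). The zero-force rate and activation length below are hypothetical illustration values, not the fitted parameters of this work; they are chosen only to show how a ~240 pN force can push a reaction onto a tens-of-milliseconds timescale:

```python
import math

KB_T = 4.114  # thermal energy at ~298 K, in pN*nm

def bell_rate(k0_per_s, force_pN, dx_nm):
    # Bell model: k(F) = k0 * exp(F * dx / kB*T)
    return k0_per_s * math.exp(force_pN * dx_nm / KB_T)

# Hypothetical zero-force rate (1e-6 1/s) and activation length (0.3 nm):
k = bell_rate(1e-6, 240.0, 0.3)
print(f"{1e3 / k:.0f} ms")  # characteristic ring-opening time at 240 pN
```

A larger activation length steepens the force dependence, which is why the spiropyran probes activate at forces well below other covalent mechanophores.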
Abstract:
The present thesis examines the representation of the impotent body and mind in a selection of Samuel Beckett’s dramatic and prose works. Aiming to show that the body-mind relation is represented as one of co-implication and co-constitution, this thesis also takes the representation of memory in Beckett’s work as a key site for examining this relation. The thesis seeks to address the centrality of the body and embodied subjectivity in the experience of memory and indeed in signification and experience more generally. In these terms, Chapter 1 analyzes the representation of the figure of the couple in Beckett’s drama of the 1950s – as a metaphor of the body-mind relation – and, in light of Jacques Derrida’s theory of the supplement and Bernard Stiegler’s theory of technics, it discusses how the relationship between physical body and mind is defined by an essential supplementarity that is revealed even (or especially) in their apparent separation. Furthermore, the impotence that marks both elements in Beckett’s writings, when it is seen to lay bare this intrication, can be viewed, in important respects, as enabling rather than merely privative. Chapter 2 discusses the somatic structure of memory as represented in four of Beckett’s later dramatic works composed in the 1970s and 1980s. Similarly to Chapter 1, the second chapter focuses on the more “extreme” representation of bodily impotence in Beckett and demonstrates that rather than a merely “mental” recollection, memory in the work of Beckett is presented as necessarily experienced through, and shaped by, the body itself. In this light, then, it is shown that despite the impotence that marks the body in Beckett’s work of the 1970s and 1980s, the body is a necessary site of memory and retains or discovers a kind of activity in this impotence. 
Finally, Chapter 3 shifts its attention to Beckett’s prose works in order to explore how such works, reliant on language rather than the physical performance of actors onstage, sustain questions of embodied subjectivity at their heart. Specifically, the chapter argues that, on closer inspection, Beckett’s “literature of the unword” is not an abstention from meaning and its materialization, but one that paradoxically foregrounds that “something” which remains an essential part of it, that is, an embodied subjectivity.
Abstract:
Research on women prisoners and drug use is scarce in our context and needs theoretical tools to understand their life paths. In this article, I introduce an intersectional perspective on the experiences of women in prison, with particular focus on drug use. To illustrate this, I draw on the life story of one of the women interviewed in prison, in order to explore the axes of inequality in the lives of women in prison. These are usually presented as accumulated and articulated in complex and diverse ways. The theoretical tool of intersectionality allows us to gain an understanding of the phenomenon of women prisoners who have used drugs. This includes both the structural constraints in which they were embedded and the decisions they made, considering the circumstances of disadvantage in which they were immersed. This is a perspective which has already been intuitively present since the dawn of feminist criminology in the English-speaking world and can now be developed further due to new contributions in this field of gender studies.
Abstract:
This study examines the role of visual literacy in learning biology. Biology teachers promote the use of digital images as a learning tool for two reasons: because biology is the most visual of the sciences, and the use of imagery is becoming increasingly important with the advent of bioinformatics; and because studies indicate that this current generation of teenagers have a cognitive structure that is formed through exposure to digital media. On the other hand, there is concern that students are not being exposed enough to the traditional methods of processing biological information - thought to encourage left-brain sequential thinking patterns. Theories of Embodied Cognition point to the importance of hand-drawing for proper assimilation of knowledge, and theories of Multiple Intelligences suggest that some students may learn more easily using traditional pedagogical tools. To test the claim that digital learning tools enhance the acquisition of visual literacy in this generation of biology students, a learning intervention was carried out with 33 students enrolled in an introductory college biology course. The study compared learning outcomes following two types of learning tools. One learning tool was a traditional drawing activity, and the other was an interactive digital activity carried out on a computer. The sample was divided into two random groups, and a crossover design was implemented with two separate interventions. In the first intervention students learned how to draw and label a cell. Group 1 learned the material by computer and Group 2 learned the material by hand-drawing. In the second intervention, students learned how to draw the phases of mitosis, and the two groups were inverted. After each learning activity, students were given a quiz on the material they had learned. Students were also asked to self-evaluate their performance on each quiz, in an attempt to measure their level of metacognition. 
At the end of the study, they were asked to fill out a questionnaire that was used to measure the level of task engagement the students felt towards the two types of learning activities. In this study, following the first testing phase, the students who learned the material by drawing had a significantly higher average grade on the associated quiz compared to that of those who learned the material by computer. The difference was lost with the second “cross-over” trial. There was no correlation for either group between the grade the students thought they had earned through self-evaluation, and the grade that they received. In terms of different measures of task engagement, there were no significant differences between the two groups. One finding from the study showed a positive correlation between grade and self-reported time spent playing video games, and a negative correlation between grade and self-reported interest in drawing. This study provides little evidence to support claims that the use of digital tools enhances learning, but does provide evidence to support claims that drawing by hand is beneficial for learning biological images. However, the small sample size, limited number and type of learning tasks, and the indirect means of measuring levels of metacognition and task engagement restrict generalisation of these conclusions. Nevertheless, this study indicates that teachers should not use digital learning tools to the exclusion of traditional drawing activities: further studies on the effectiveness of these tools are warranted. Students in this study commented that the computer tool seemed more accurate and detailed - even though the two learning tools carried identical information. Thus there was a mismatch between the perception of the usefulness of computers as a learning tool and the reality, which again points to the need for an objective assessment of their usefulness. 
Students should be given the opportunity to try out a variety of traditional and digital learning tools in order to address their different learning preferences.
Abstract:
Background: The natural history of Myotonic Dystrophy type 1 (DM1) is largely unclear, and longitudinal studies are lacking. Objectives: To collect clinical and laboratory data and to evaluate sleep disorders, somatic and autonomic skin fibres, and neuropsychological and neuroradiological aspects in DM1 patients. Methods: 72 DM1 patients underwent a standardized clinical and neuroradiological evaluation performed by a multidisciplinary team during 3 years of follow-up. Results: Longer disease duration was associated with a higher incidence of conduction disorders and a lower ejection fraction; higher forced vital capacity (FVC) values were predictors of a reduced risk of cardiopathy. Lower pulmonary function values were associated with expansion class and were negatively associated with disease duration; arterial blood gas parameters were not associated with expansion size, disease duration, or respiratory function tests. Excessive daytime sleepiness was not associated with expansion class or with any of the clinical parameters examined. We detected apnoea in a large percentage of patients, without differences between the 3 genetic classes; higher FVC values were predictors of a reduced risk of apnoea. Skin biopsies demonstrated the presence of a subclinical small fibre neuropathy with involvement of the somatic fibres. The pupillometry study showed a lower pupil size at baseline and a lower constriction response to light. The most affected neuropsychological domains were executive functions, visuoconstructional, attention, and visuospatial tasks, with a worse performance of E1 patients in the visuoperceptual ability and social cognition tasks. The MRI study demonstrated a decrease in the volumes of the frontal, parietal, temporal and occipital cortices and of the accumbens and putamen nuclei, and a more severe volume reduction of the isthmus cingulate, transverse temporal, superior parietal and temporal gyri in E2 patients.
Discussion: Only some clinical parameters could predict the risk of cardiopathy, pulmonary syndrome and sleep disorders, while other clinical aspects proved to be unpredictable, confirming the importance of periodic clinical follow-up of these patients.
Abstract:
Spectral sensors are a wide class of devices that are extremely useful for detecting essential information about the environment and materials with a high degree of selectivity. Recently, they have achieved degrees of integration and implementation costs that suit them for fast, small, and non-invasive monitoring systems. However, the useful information is hidden in the spectra and is difficult to decode, so mathematical algorithms are needed to infer the values of the variables of interest from the acquired data. Among the different families of predictive modeling, Principal Component Analysis and the techniques stemming from it can provide very good performance together with small computational and memory requirements. For these reasons, they allow the prediction to be implemented even in embedded and autonomous devices. In this thesis, I present 4 practical applications of these algorithms to the prediction of different variables: moisture of soil, moisture of concrete, freshness of anchovies/sardines, and concentration of gases. In all of these cases, the workflow is the same. Initially, an acquisition campaign was performed to acquire both spectra and the variables of interest from samples. These data were then used as input for the creation of the prediction models, to solve both classification and regression problems. From these models, an array of calibration coefficients is derived and used for the implementation of the prediction in an embedded system. The presented results show that this workflow was successfully applied to very different scientific fields, yielding autonomous and non-invasive devices able to predict the value of physical parameters of choice from new spectral acquisitions.
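The workflow described, a PCA-derived model collapsed into an array of calibration coefficients that the embedded device applies to each new spectrum, can be sketched as a principal component regression on synthetic data; the sizes and names here are our illustrative assumptions, not the thesis's actual calibrations:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "campaign": 50 spectra of 64 bands plus the measured target variable.
X = rng.normal(size=(50, 64))
y = X[:, :4].sum(axis=1) + 0.01 * rng.normal(size=50)

# --- Offline calibration ---
mu = X.mean(axis=0)
Xc = X - mu
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:3].T                               # loadings of the 3 leading components
T = Xc @ P                                 # scores
b, *_ = np.linalg.lstsq(T, y, rcond=None)  # regression in component space

# Fold centring + PCA + regression into one coefficient array for the device:
coef = P @ b
offset = -float(mu @ coef)

# --- On-device prediction: a single dot product per new spectrum ---
def predict(spectrum):
    return float(spectrum @ coef + offset)
```

Folding the pipeline into one vector is what makes the on-device step cheap: the microcontroller only stores `coef` and `offset` and computes a dot product, with no PCA at run time.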
Abstract:
The Corumbau Marine Extractive Reserve (RESEXMAR) was created in 2000 out of a collective action begun in 1997 by the leaders of local fishers, who sought a legal instrument that would guarantee exclusive access to fishery resources against the commercial seabob shrimp (sete-barbas) fishery that had established itself in the 1980s. During the creation of the Corumbau RESEXMAR, the fishers obtained support from government bodies, such as the National Coordination of Traditional Populations (CNPT), and from third-sector environmental organisations: the Associação Pradense de Proteção Ambiental (APPA) and, later, Conservation International do Brasil (CI-Brasil). After the creation of the Corumbau RESEXMAR, between 2000 and 2002, the Management Plan that would guide the administration of the Conservation Unit (UC) was drawn up. The document was led by the technical and scientific team linked to CI-Brasil, its most salient feature being the creation of areas of total fishing exclusion through the Marine Protection Zone (ZPM). For CI-Brasil, the idea behind the ZPM was that, indirectly and over the medium and long term, the fishers would benefit from a possible increase in fish production, provided that 30% of the reef cover had some form of protection for ecological processes such as species reproduction and growth. During the Management Plan discussions, and still today, a portion of local fishers contested the boundaries of the ZPM, since it would restrict access to fishery resources. This contestation, however, was suppressed through the informal relations that CI-Brasil members maintained with the main family nucleus of Vila do Corumbau, forcing the others into a temporary formal agreement.
This challenge exposed a conflict between distinct sets of norms held by artisanal fishers on one side and CI-Brasil and IBAMA on the other: artisanal fishing, a type of action that follows specific norms in which human and non-human elements interact together, evidencing a practical, embodied knowledge that constitutes a comprehensive model of the world and of nature; versus modern, globalising concepts of a nature entirely detached from local artisanal practices, strongly articulated by an environmental organisation of international reach, driven by the emergence of environmental issues, imprinting on the place (the site of traditional fishing practice) the idea of a space (Marine Protected Areas) disembedded from specific forms of nature/cultures.
Abstract:
This article presents part of a study grounded in the problem of proof in school mathematics. It describes how four 9th-grade students explored a task involving the discovery of axes of symmetry in several geometric figures. The proof they constructed had an essentially explanatory function. The teacher's role in negotiating the meaning of proof, and of its necessity, is also analysed. The students first develop a practical understanding, without awareness of the reasons underlying the mathematical statements, and only later a theoretical understanding that leads them to the construction of a proof.
Abstract:
There are several hazards in histopathology laboratories, and their staff must ensure that their professional activity is held to the highest standards while complying with the best safety procedures. Formalin is one of the chemical hazards to which such professionals are routinely exposed. To decrease this contact, it has been suggested that 10% neutral buffered liquid formalin (FL) be replaced by 10% formalin gel (FG), given that the latter reduces the likelihood of spills and splashes and releases lower fume levels during handling, making it less harmful. However, it is mandatory to assess the effectiveness of FG as a fixative and to ensure that subsequent complementary techniques, such as immunohistochemistry (IHC), are not compromised. Two groups of 30 samples from human placenta were fixed with the FG and FL fixatives for different periods of time (12, 24, and 48 hours) and thereafter processed, embedded, and sectioned. IHC for six different antibodies was performed, and the results were scored (0–100) using an algorithm that took into account immunostaining intensity, percentage of stained structures, non-specific immunostaining, contrast, and morphological preservation. Parametric and non-parametric statistical tests were used (alpha = 0.05). All results were similar for both fixatives, with global score means of 95.36±6.65 for FL and 96.06±5.80 for FG, without any statistical difference (P>0.05). The duration of fixation also had no statistical relevance (P>0.05). It is thus shown that FG can be an effective alternative to FL.
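The abstract does not publish the exact weighting of the 0–100 scoring algorithm, so the sketch below is only a hypothetical weighted composite of the five criteria it lists (immunostaining intensity, percentage of stained structures, non-specific immunostaining, contrast, morphological preservation); the weights and the penalty form are our assumptions:

```python
def ihc_score(intensity, pct_stained, nonspecific, contrast, morphology):
    """Composite 0-100 immunohistochemistry score.

    All inputs are on 0-1 scales. The weights below are illustrative
    assumptions, not the algorithm actually used in the study.
    """
    positive = (0.30 * intensity + 0.30 * pct_stained
                + 0.20 * contrast + 0.20 * morphology)
    # Non-specific (background) staining penalises the composite:
    return round(100.0 * positive * (1.0 - 0.5 * nonspecific), 1)

print(ihc_score(0.95, 0.98, 0.05, 0.90, 0.95))  # a well-fixed sample scores ~92.5
```

A composite of this shape makes the reported group means (both around 95-96 out of 100) easy to interpret: high positive-staining criteria with little background staining.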
Abstract:
Preemptions account for a non-negligible overhead during system execution. There has been a substantial amount of research on estimating the delay incurred due to the loss of working sets in the processor state (caches, registers, TLBs), and some on avoiding preemptions or limiting the preemption cost. We present an algorithm to reduce preemptions by further delaying the start of execution of high priority tasks in fixed priority scheduling. Our approaches take advantage of the floating non-preemptive regions model and exploit the fact that, during the schedule, the relative task phasing will differ from the worst-case scenario in terms of admissible preemption deferral. Furthermore, approximations to reduce the complexity of the proposed approach are presented. A substantial set of experiments demonstrates that the approach and the approximations improve over existing work, in particular for high-utilisation systems, where savings of up to 22% in the number of preemptions are attained.
Abstract:
Consider the problem of scheduling a set of sporadic tasks on a multiprocessor system to meet deadlines using a task-splitting scheduling algorithm. Task-splitting (also called semi-partitioning) scheduling algorithms assign most tasks to just one processor, but a few tasks are assigned to two or more processors, and these are dispatched in a way that ensures that a task never executes on two or more processors simultaneously. A certain type of task-splitting algorithm, called slot-based task-splitting, is of particular interest because of its ability to schedule tasks at high processor utilizations. We present a new schedulability analysis for slot-based task-splitting scheduling algorithms that takes overheads into account, and also a new task assignment algorithm.