839 results for Embodied embedded cognition
Abstract:
There is considerable evidence from animal studies that gonadal steroid hormones modulate neuronal activity and affect behavior. To study this in humans directly, we used H₂¹⁵O positron-emission tomography to measure regional cerebral blood flow (rCBF) in young women during three pharmacologically controlled hormonal conditions spanning 4–5 months: ovarian suppression induced by the gonadotropin-releasing hormone agonist leuprolide acetate (Lupron), Lupron plus estradiol replacement, and Lupron plus progesterone replacement. Estradiol and progesterone were administered in a double-blind cross-over design. On each occasion positron-emission tomography scans were performed during (i) the Wisconsin Card Sorting Test, a neuropsychological test that physiologically activates prefrontal cortex (PFC) and an associated cortical network including inferior parietal lobule and posterior inferolateral temporal gyrus, and (ii) a no-delay matching-to-sample sensorimotor control task. During treatment with Lupron alone (i.e., with virtual absence of gonadal steroid hormones), there was marked attenuation of the typical Wisconsin Card Sorting Test activation pattern even though task performance did not change. Most strikingly, there was no rCBF increase in PFC. When either progesterone or estrogen was added to the Lupron regimen, there was normalization of the rCBF activation pattern with augmentation of the parietal and temporal foci and return of the dorsolateral PFC activation. These data directly demonstrate that the hormonal milieu modulates cognition-related neural activity in humans.
Abstract:
We present evidence that the sporulation protein SpoIVFB of Bacillus subtilis is a member of a newly recognized family of metalloproteases that have catalytic centers adjacent to or within the membrane. SpoIVFB is required for converting the membrane-associated precursor protein, pro-σK, to the mature and active transcription factor σK by proteolytic removal of an N-terminal extension of 20 amino acids. SpoIVFB and other family members share the conserved sequence HEXXH, a hallmark of metalloproteases, as well as a second conserved motif NPDG, which is unique to the family. Both motifs, which are expected to form the catalytic center of the protease, overlap hydrophobic segments that are predicted to be separate transmembrane domains. The only other characterized member of this family of membrane-embedded metalloproteases is the mammalian Site-2 protease (S2P), which is required for the intramembrane cleavage of the eukaryotic transcription factor sterol regulatory element binding protein (SREBP). We report that amino acid substitutions in the two conserved motifs of SpoIVFB impair pro-σK processing and σK-directed gene expression during sporulation. These results and those from a similar analysis of S2P support the interpretation that both proteins are founding members of a family of metalloproteases involved in the activation of membrane-associated transcription factors. Thus, the pathways that govern the activation of the prokaryotic transcription factor pro-σK and the mammalian transcription factor SREBP not only are analogous but also use processing enzymes with strikingly homologous features.
Abstract:
In an unprecedented finding, Davis et al. [Davis, R. E., Miller, S., Herrnstadt, C., Ghosh, S. S., Fahy, E., Shinobu, L. A., Galasko, D., Thal, L. J., Beal, M. F., Howell, N. & Parker, W. D., Jr. (1997) Proc. Natl. Acad. Sci. USA 94, 4526–4531] used an unusual DNA isolation method to show that healthy adults harbor a specific population of mutated mitochondrial cytochrome c oxidase (COX) genes that coexist with normal mtDNAs. They reported that this heteroplasmic population was present at a level of 10–15% in the blood of normal individuals and at a significantly higher level (20–30%) in patients with sporadic Alzheimer’s disease. We provide compelling evidence that the DNA isolation method employed resulted in the coamplification of authentic mtDNA-encoded COX genes together with highly similar COX-like sequences embedded in nuclear DNA (“mtDNA pseudogenes”). We conclude that the observed heteroplasmy is an artifact.
Abstract:
The Vernacular Discourse of the "Arab Spring" is a project that bridges the divide between the East and the West by offering new readings of Arab subjectivities. By analyzing the "Arab Spring" through the lens of vernacular discourse, it challenges the Euro-Americo-centric legacies of Orientalism in Western academia and the new wave of extremism in the Arab world, offering alternative representations of Arab bodies and subjectivities. To offer this new reading of the "Arab Spring," it explores the foundations of critical rhetoric as a theory and a practice and argues for a turn towards a critical vernacular discourse. This turn is important because it urges the analysis of artifacts produced by marginalized groups in order to understand perspectives that have largely been foreclosed in traditional cultural studies research. Building on embodied/performative critical rhetoric, the study of the vernacular discourses of the Arab revolutionary body examines forms of knowledge production that are not merely textual, drawing specifically on data gathered in the Lhbib Bourguiba, Tunisia. This analysis of the political revolutionary body unveils the complexity underlying discussions of identity, agency, and representation in the Middle East and North Africa, and calls for a critical study of these issues in the region beyond the binary approach practiced by academics and media analysts. By analyzing vernacular discourse, this research locates a method for examining and theorizing the dialectic between agency, citizenry, and subjectivity: it studies how power structures are recreated and challenged through the use of the vernacular in revolutionary movements, and how marginalized groups construct their own subjectivities through vernacular discourse. Evaluating the Arab Spring as a vernacular discourse is therefore politically significant, as it creates new ways of understanding communication in postcolonial/neocolonial settings.
Abstract:
High-quality software, delivered on time and budget, constitutes a critical part of most products and services in modern society. Our government has invested billions of dollars to develop software assets, often redeveloping the same capability many times. Recognizing the waste involved in redeveloping these assets, in 1992 the Department of Defense issued the Software Reuse Initiative. The vision of the Software Reuse Initiative was "to drive the DoD software community from its current 're-invent the software' cycle to a process-driven, domain-specific, architecture-centric, library-based way of constructing software." Twenty years after issuing this initiative, there is evidence of this vision beginning to be realized in nonembedded systems. However, virtually every large embedded system undertaken has incurred large cost and schedule overruns. Investigations into the root cause of these overruns implicate reuse. Why are we seeing improvements in the outcomes of these large-scale nonembedded systems and worse outcomes in embedded systems? This question is the foundation for this research. The experiences of the aerospace industry have led to a number of questions about reuse and how the industry is employing reuse in embedded systems. For example, does reuse in embedded systems yield the same outcomes as in nonembedded systems? Are the outcomes positive? If the outcomes are different, it may indicate that embedded systems should not use data from nonembedded systems for estimation. Are embedded systems using the same development approaches as nonembedded systems? Does the development approach make a difference? If embedded systems develop software differently from nonembedded systems, it may mean that the same processes do not apply to both types of systems. What about the reuse of different artifacts? Perhaps there are certain artifacts that, when reused, contribute more or are more difficult to use in embedded systems. Finally, what are the success factors and obstacles to reuse? Are they the same in embedded systems as in nonembedded systems? The research in this dissertation comprises a series of empirical studies using professionals in the aerospace and defense industry as its subjects. The main focus has been to investigate the reuse practices of embedded systems professionals and nonembedded systems professionals and compare the methods and artifacts used against the outcomes. The research has followed a combined qualitative and quantitative design approach. The qualitative data were collected by surveying software and systems engineers, interviewing senior developers, and reading numerous documents and other studies. Quantitative data were derived from converting survey and interview respondents' answers into coding that could be counted and measured. From the search of the existing empirical literature, we learned that reuse in embedded systems is in fact significantly different from reuse in nonembedded systems, particularly in effort under a model-based development approach and in quality where the development approach was not specified. The questionnaire showed differences in the development approach used in embedded projects versus nonembedded projects; in particular, embedded systems were significantly more likely to use a heritage/legacy development approach. There was also a difference in the artifacts used, with embedded systems more likely to reuse hardware, test products, and test clusters.
Nearly all the projects reported using code, but the questionnaire showed that the reuse of code brought mixed results. One of the differences expressed by the respondents to the questionnaire was the difficulty of reusing code in embedded systems when the platform changed. The semistructured interviews were performed to explain why the phenomena observed in the literature review and the questionnaire occurred. We asked respected industry professionals, such as senior fellows, fellows, and distinguished members of technical staff, about their experiences with reuse. We learned that many embedded systems used heritage/legacy development approaches because their systems had been around for many years, before models and modeling tools became available. We learned that reuse of code is beneficial primarily when the code does not require modification, but, especially in embedded systems, once it has to be changed, reuse of code yields few benefits. Finally, while platform independence is a goal for many in nonembedded systems, it is certainly not a goal for embedded systems professionals, and in many cases it is a detriment. However, both embedded and nonembedded systems professionals endorsed the idea of platform standardization. We conclude that while reuse in embedded and nonembedded systems differs today, the two are converging. As heritage embedded systems are phased out, models become more robust, and platforms are standardized, reuse in embedded systems will become more like reuse in nonembedded systems.
Abstract:
Results of neuropsychological examinations depend on valid data. Whereas clinicians previously believed that clinical skill was sufficient to identify non-credible performance by examinees on standard tests, research demonstrates otherwise. Consequently, studies on measures to detect suspect effort in adults have received tremendous attention in the previous twenty years, and incorporation of validity indicators into neuropsychological examinations is now seen as integral. Few studies exist that validate methods appropriate for the measurement of effort in pediatric populations. Of extant studies, most evaluate standalone measures originally developed for use with adults. The present study examined the utility of indices from the California Verbal Learning Test – Children's Version (CVLT-C) as embedded validity indicators in a pediatric sample. Participants were 225 outpatients aged 8 to 16 years old referred for clinical assessment after mild traumatic brain injury (mTBI). Non-credible performance (n = 39) was defined as failure of the Medical Symptom Validity Test (MSVT). Logistic regression demonstrated that only the Recognition Discriminability index was predictive of MSVT failure (OR = 2.88, p < .001). A cutoff of z ≤ -1.0 was associated with sensitivity of 51% and specificity of 91%. In the current study, CVLT-C Recognition Discriminability was useful in the identification of non-credible performance in a sample of relatively high-functioning pediatric outpatients with mTBI. Thus, this index can be added to the short list of embedded validity indicators appropriate for pediatric neuropsychological assessment.
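As a rough illustration of how an embedded-validity cutoff of this kind is evaluated against a criterion measure, the following sketch assumes per-examinee z-scores on Recognition Discriminability and pass/fail labels from the MSVT; the numbers are illustrative only and are not the study data.

def classification_rates(z_scores, msvt_failed, cutoff=-1.0):
    """Flag examinees with z <= cutoff and compare the flags with MSVT failure."""
    flagged = [z <= cutoff for z in z_scores]
    tp = sum(f and fail for f, fail in zip(flagged, msvt_failed))
    fn = sum((not f) and fail for f, fail in zip(flagged, msvt_failed))
    tn = sum((not f) and (not fail) for f, fail in zip(flagged, msvt_failed))
    fp = sum(f and (not fail) for f, fail in zip(flagged, msvt_failed))
    sensitivity = tp / (tp + fn)   # proportion of MSVT failures correctly flagged
    specificity = tn / (tn + fp)   # proportion of MSVT passes correctly left unflagged
    return sensitivity, specificity

# Illustrative data: 4 MSVT failures and 6 passes
z = [-1.4, -0.2, -1.1, 0.3, -0.8, -1.6, 0.9, -0.5, -1.2, 0.1]
failed = [True, False, True, False, True, True, False, False, False, False]
print(classification_rates(z, failed))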
Abstract:
Online education is a new teaching and learning medium with few current guidelines for faculty, administrators or students. Its rapid growth over the last decade has challenged academic institutions to keep up with the demand, while also providing a quality education. Our understanding of the factors that determine quality and effective online learning experiences that lead to student learning outcomes is still evolving. There is a lack of consensus on the effectiveness of online versus face-to-face education in the current research. The U.S. Department of Education conducted a meta-analysis in 2009 and concluded that student-learning outcomes in online courses were equal to and, often times, better than face-to-face traditional courses. Subsequent research has found contradictory findings, and further inquiry is necessary. The purpose of this embedded mixed methods design research study is to further our understanding of the factors that create quality and successful educational outcomes in an online course. To achieve this, the first phase of this study measured and compared learning outcomes in an online and in class graduate-level legal administration course. The second phase of the study entailed interviews with those students in both the online and face-to-face sections to understand their perspectives on the factors contributing to learning outcomes. Six themes emerged from the qualitative findings: convenience, higher order thinking, discussions, professor engagement, professor and student interaction, and face-to-face interaction. Findings from this study indicate the factors students perceive as contributing to learning outcomes in an online course are consistent among all students and are supported in the existing literature. Higher order thinking, however, emerged as a stronger theme than indicated in the current research, and the face-to-face nature of the traditional classroom may be more an issue of familiarity than a factor contributing to learning outcomes. As education continues to reach new heights and developments in technology advance, the factors found to contribute to student learning outcomes will be refined and enhanced. These developments will continue to transform the ways in which we deliver and receive knowledge in both traditional and online classrooms. While there is a growing body of research on online education, the field’s evolution has unsettled earlier findings and posed new areas to investigate.
Abstract:
Hardware/Software partitioning (HSP) is a key task in embedded system co-design. The main goal of this task is to decide which components of an application are to be executed on a general-purpose processor (software) and which ones on dedicated hardware, taking into account a set of constraints expressed by metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of models and metrics used, the choice of the best-suited algorithm remains an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not appear good. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm, and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The obtained results show that Restart Hill Climbing is the best-performing algorithm in most cases.
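A minimal sketch of the kind of search evaluated here, assuming a toy additive cost model in which each component has a software execution time, a hardware execution time, and a hardware area, with both objectives folded into one weighted cost; the component data, weights, and single-flip neighborhood are illustrative and not taken from the article.

import random

# Toy component model: (software_time, hardware_time, hardware_area)
COMPONENTS = [(8.0, 2.0, 5.0), (6.0, 1.5, 4.0), (4.0, 1.0, 6.0), (9.0, 3.0, 2.0), (5.0, 2.5, 3.0)]

def cost(partition, w_time=0.5, w_area=0.5):
    """Weighted cost of a partition: True = hardware, False = software."""
    time = sum(c[1] if hw else c[0] for hw, c in zip(partition, COMPONENTS))
    area = sum(c[2] for hw, c in zip(partition, COMPONENTS) if hw)
    return w_time * time + w_area * area

def restart_hill_climb(restarts=20, iters=200):
    best, best_cost = None, float("inf")
    for _ in range(restarts):
        current = [random.random() < 0.5 for _ in COMPONENTS]
        current_cost = cost(current)
        for _ in range(iters):
            neighbor = current[:]
            flip = random.randrange(len(neighbor))
            neighbor[flip] = not neighbor[flip]   # move one component across the HW/SW boundary
            neighbor_cost = cost(neighbor)
            if neighbor_cost <= current_cost:     # accept equally good moves to escape plateaus
                current, current_cost = neighbor, neighbor_cost
        if current_cost < best_cost:
            best, best_cost = current, current_cost
    return best, best_cost

if __name__ == "__main__":
    partition, value = restart_hill_climb()
    print("best partition (True = hardware):", partition, "cost:", value)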
Abstract:
Commercial off-the-shelf microprocessors are the core of low-cost embedded systems due to their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Therefore, soft error mitigation has become a compulsory requirement for an increasing number of applications, which operate from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims to design reduced-overhead, flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different register subsets from the microprocessor register file to be protected in software. Thus, the design space is enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers. This permits finding the best trade-offs between performance, code size, and fault coverage. Three case studies have been developed to show the applicability and flexibility of the proposal.
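SWIFT-R-style recovery protects selected registers by triplicating their values and majority-voting before they are used. The sketch below shows that idea at the source level in Python, with a hypothetical Protected wrapper standing in for the compiler transformation the paper applies to machine registers; it is a conceptual illustration, not the paper's implementation.

def vote(a, b, c):
    """Majority vote over three replicas; recovers from one corrupted copy."""
    if a == b or a == c:
        return a
    return b  # a disagrees with both others, so b == c under a single-fault assumption

class Protected:
    """Keeps three replicas of a value, mimicking register triplication in software."""
    def __init__(self, value):
        self.r1 = self.r2 = self.r3 = value

    def read(self):
        value = vote(self.r1, self.r2, self.r3)
        self.r1 = self.r2 = self.r3 = value  # re-synchronize replicas after recovery
        return value

    def write(self, value):
        self.r1 = self.r2 = self.r3 = value

# Only values selected for protection pay the overhead; the rest stay unhardened.
acc = Protected(0)
for sample in [3, 5, 7]:
    acc.write(acc.read() + sample)
print(acc.read())  # 15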
Abstract:
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can simultaneously satisfy many design goals thanks to the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
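The exploration described here searches for the best trade-offs among fault coverage, code size, and performance. The sketch below shows the Pareto-dominance filtering at the heart of such a flow, with made-up candidate figures standing in for the overhead reports and fault-coverage estimations the tools would produce; the actual framework drives the search with NSGA-II rather than exhaustive filtering.

# Each candidate hardened version: (fault_coverage, code_size_overhead, execution_time_overhead)
# Coverage is maximized, both overheads minimized; the numbers are illustrative only.
candidates = {
    "none":       (0.20, 0.00, 0.00),
    "regs_r1_r2": (0.55, 0.18, 0.12),
    "regs_r1_r4": (0.60, 0.25, 0.20),
    "all_regs":   (0.92, 0.60, 0.45),
}

def dominates(a, b):
    """True if version a is at least as good as b in every objective and strictly better in one."""
    cov_a, size_a, time_a = a
    cov_b, size_b, time_b = b
    no_worse = cov_a >= cov_b and size_a <= size_b and time_a <= time_b
    strictly_better = cov_a > cov_b or size_a < size_b or time_a < time_b
    return no_worse and strictly_better

def pareto_front(versions):
    return {name: objs for name, objs in versions.items()
            if not any(dominates(other, objs) for other in versions.values() if other is not objs)}

print(pareto_front(candidates))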
Abstract:
The development of applications and services for mobile systems faces a wide range of devices with very heterogeneous capabilities whose response times are difficult to predict. The research described in this work aims to address this issue by developing a computational model that formalizes the problem and defines methods for adjusting the computation. The proposal combines imprecise computation strategies with cloud computing paradigms in order to provide flexible implementation frameworks for embedded or mobile devices. As a result, scheduling imprecise computations over the embedded system's workload becomes the mechanism for moving computation to the cloud according to the priority and response time of the tasks to be executed, thereby meeting the desired productivity and quality of service. A technique to estimate network delays and to schedule tasks more accurately is illustrated in this paper. An application example is described in which this technique is tested in execution contexts with heterogeneous workloads to check the validity of the proposed model.
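A minimal sketch of the offloading decision such a model formalizes, assuming each task has a mandatory part that must run locally and an optional part that can be computed in the cloud when the estimated network delay still allows the deadline to be met; the task parameters, field names, and delay estimate are hypothetical.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    mandatory_local_time: float   # seconds the required part needs on the device
    optional_local_time: float    # seconds the refinement would need locally
    optional_cloud_time: float    # seconds the refinement needs on a cloud node
    deadline: float               # seconds from release

def schedule(tasks, estimated_network_delay):
    """Decide, per task, whether to run the optional part locally, in the cloud, or drop it."""
    plan = {}
    for t in tasks:
        slack = t.deadline - t.mandatory_local_time
        if slack >= t.optional_local_time:
            plan[t.name] = "optional part local"
        elif slack >= estimated_network_delay + t.optional_cloud_time:
            plan[t.name] = "optional part offloaded to cloud"
        else:
            plan[t.name] = "imprecise result: optional part dropped"
    return plan

tasks = [
    Task("video_frame", 0.020, 0.060, 0.010, 0.050),
    Task("sensor_fusion", 0.005, 0.004, 0.002, 0.012),
    Task("route_update", 0.030, 0.200, 0.040, 0.080),
]
print(schedule(tasks, estimated_network_delay=0.015))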
Abstract:
Paper presented at the V Jornadas de Computación Empotrada (5th Workshop on Embedded Computing), Valladolid, 17-19 September 2014
Abstract:
Information technologies (IT) currently account for 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed that focus on increasing the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision making. However, today’s monitoring systems (MSs) are partial, specific, and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy saving systems, offered as a value-added service embedded in a low-cost, low-power device. The proposal is general in nature, comprehensive, scalable, and focused on heterogeneous environments, and it allows quick adaptation to the needs of changing and dynamic environments. Further, a prototype of the system has been implemented on several devices, which has allowed validation of the proposal as well as identification of the minimum hardware profile required to support the model.