921 results for Embedded Constructions
Abstract:
In the 29 years since the United Nations' report "Our Common Future", there has been considerable debate among governments, civil society, interest groups and business organisations about what constitutes sustainable development, which constitutes evidence of a contested discourse concerning sustainability. The purpose of this study is to understand this debate in the developing economic context of Brazil and, in particular, to understand and critique the social and environmental accounting [SEA] discursive constructions relating to the state-owned Petrobras, as well as to understand the Brazilian literature on SEA. The discourse theory [DT]-based analysis employs rhetorical redescription to analyse twenty-two Petrobras reports from 2004-2013. I investigate the political notions by employing the methodological framework of the Logics of Critical Explanation [LCE]. LCE comprises five methodological steps: problematisation, retroduction, logics (social, political and fantasmatic), articulation and critique. The empirical discussion suggests that the hegemony of economic development operates to obfuscate, rhetorically, the development of sustainability, so as to maintain the core business of Petrobras conceived as capital accumulation. Equally, these articulations also illustrate how the constructions of SEA operate to serve the company's purpose with few, if any, profound changes in the integration of sustainability. The Brazilian literature on SEA sustains the status quo of neo-liberal market policies that operate to protect the dominant business-case approach and to maintain an agenda of wealth creation in relation to social and environmental needs. The articulations of the case were manifested in policies regarding, for example, corruption involving over-payments for contracts, and in unsustainable practices relating to the use of fossil fuels, and they demonstrated an antagonism between action and disclosure. The corruption scandal that emerged after SEA disclosures highlighted the rhetorical nature of disclosure, as financial resources were diverted from the company to political parties and engineering contractors hid facts through incomplete disclosures. The articulations of SEA misrepresent the broader context of meanings associated with sustainability, restricting the constructions of SEA to principally serving and representing the intentions of the most powerful groups. The significance of SEA, then, is narrowed to represent particular interests. The study argues for more critical studies, as the limited Brazilian literature concerning SEA has kept a 'safe distance' from substantively critiquing the constructions of SEA and its articulations in the Brazilian context. The literature review and the Petrobras case illustrate a variety of naming, instituting and articulatory practices that endeavour to maintain the current hegemony of development in an emerging economy, which allows Petrobras to continue to extract significant profit at the expense of the social and the environmental. The constructed idea of development in Petrobras' discourses emphasises a rhetoric of wider development, but, in reality, these discourses were the antithesis of political, social and ethical developmental issues. These constructions aim to hide struggles over social inequalities and the exploitation of natural resources, and they constitute excuses built on a fanciful notion of rhetorical and hegemonic neo-liberal development.
In summary, this thesis contributes to the prior literature in five ways: (i) the addition of DT to the analysis of SEA enhances the discussion of political elements such as hegemony, antagonism, the logic of equivalence/difference, ideology and articulation; (ii) the analysis of an emerging economy such as Brazil incorporates a new perspective into the discussion of the discourses of SEA and development; (iii) this thesis includes a focus on rhetoric to discuss the maintenance of the status quo; (iv) the holistic structure of the LCE approach enlarges the understanding of the social, political and fantasmatic logics in SEA studies; and (v) this thesis combines an analysis of the literature and the case of Petrobras to characterise and critique the state of the Brazilian academy and its impacts and reflections on the significance of SEA. This thesis, therefore, argues for more critical studies in the Brazilian academy, given the persistence of an idea of SEA and development that takes for granted deep exclusions and contradictions and provides little space for critique.
Abstract:
This study was a qualitative examination of Black college students' experiences with the "acting white" label. In conducting this study, two gaps in the literature were addressed: (1) the lack of literature on Black college students and the acting white label, and (2) the lack of attention to US racial history and the current structures that allow a label such as "acting white" to exist. Thus, the purpose of this study was to call attention to the experiences of Black college students as they relate to the acting white label. Additionally, the study calls attention to the social constructions that allow the acting white label to exist and to be sustained. Data were collected from 14 Black college students at a predominantly white, private, liberal arts university in the West. Based on responses from students in the study, Black college students do hear that they are acting white. Yet their reaction to hearing the label does not cause them to underachieve academically, although it does have an impact on their social actions. Students in the study were labeled as acting white on the basis of academic pursuits, speech patterns, dress, and hobbies. Student reactions to the label ranged from ignoring the label to challenging the accuser. With regard to how the acting white label is sustained, students in the study expressed that they learned what it meant to act white or Black from family interactions, from social interactions and observations of family and friends, and from media sources. It was concluded that Black college students, whatever their reactions, do hear the label and that the label seems to be used as a means to attack Black students and their identity.
Abstract:
High-quality software, delivered on time and on budget, constitutes a critical part of most products and services in modern society. The U.S. government has invested billions of dollars to develop software assets, often redeveloping the same capability many times. Recognizing the waste involved in redeveloping these assets, in 1992 the Department of Defense issued the Software Reuse Initiative. The vision of the Software Reuse Initiative was "to drive the DoD software community from its current 're-invent the software' cycle to a process-driven, domain-specific, architecture-centric, library-based way of constructing software." Twenty years after the initiative was issued, there is evidence of this vision beginning to be realized in nonembedded systems. However, virtually every large embedded system undertaken has incurred large cost and schedule overruns. Investigations into the root causes of these overruns implicate reuse. Why are we seeing improvements in the outcomes of these large-scale nonembedded systems and worse outcomes in embedded systems? This question is the foundation for this research. The experiences of the aerospace industry have led to a number of questions about reuse and how the industry employs reuse in embedded systems. For example, does reuse in embedded systems yield the same outcomes as in nonembedded systems? Are the outcomes positive? If the outcomes are different, it may indicate that embedded systems should not use data from nonembedded systems for estimation. Are embedded systems using the same development approaches as nonembedded systems? Does the development approach make a difference? If embedded systems develop software differently from nonembedded systems, it may mean that the same processes do not apply to both types of systems. What about the reuse of different artifacts? Perhaps there are certain artifacts that, when reused, contribute more or are more difficult to use in embedded systems. Finally, what are the success factors and obstacles to reuse? Are they the same in embedded systems as in nonembedded systems? The research in this dissertation comprises a series of empirical studies using professionals in the aerospace and defense industry as its subjects. The main focus has been to investigate the reuse practices of embedded systems professionals and nonembedded systems professionals and to compare the methods and artifacts used against the outcomes. The research has followed a combined qualitative and quantitative design approach. The qualitative data were collected by surveying software and systems engineers, interviewing senior developers, and reading numerous documents and other studies. Quantitative data were derived by converting survey and interview respondents' answers into codes that could be counted and measured. From the search of the existing empirical literature, we learned that reuse outcomes in embedded systems are in fact significantly different from those in nonembedded systems, particularly in effort under a model-based development approach and in quality where the development approach was not specified. The questionnaire showed differences between the development approaches used in embedded projects and those used in nonembedded projects; in particular, embedded systems were significantly more likely to use a heritage/legacy development approach. There was also a difference in the artifacts reused, with embedded systems more likely to reuse hardware, test products, and test clusters.
Nearly all of the projects reported reusing code, but the questionnaire showed that the reuse of code brought mixed results. One of the differences expressed by the respondents to the questionnaire was the difficulty of reusing code in embedded systems when the platform changed. Semistructured interviews were then performed to explain why the phenomena observed in the literature review and the questionnaire occurred. We asked respected industry professionals, such as senior fellows, fellows, and distinguished members of technical staff, about their experiences with reuse. We learned that many embedded systems used heritage/legacy development approaches because their systems had been around for many years, before models and modeling tools became available. We learned that reuse of code is beneficial primarily when the code does not require modification but that, especially in embedded systems, once the code has to be changed, its reuse yields few benefits. Finally, while platform independence is a goal for many in nonembedded systems, it is certainly not a goal for embedded systems professionals, and in many cases it is a detriment. However, both embedded and nonembedded systems professionals endorsed the idea of platform standardization. Overall, we conclude that while reuse in embedded systems and nonembedded systems differs today, the two are converging. As heritage embedded systems are phased out, models become more robust, and platforms are standardized, reuse in embedded systems will become more like reuse in nonembedded systems.
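The abstract does not describe the statistical machinery behind the embedded-versus-nonembedded comparison, but a comparison of coded survey responses between the two groups could be carried out along the following lines. The counts, the category labels, and the use of a chi-square test of independence are illustrative assumptions, not the dissertation's actual analysis.

```python
# Illustrative sketch (not the dissertation's actual analysis): comparing coded
# survey responses from embedded vs. nonembedded projects with a chi-square test.
# The counts below are made up purely to show the mechanics.
from scipy.stats import chi2_contingency

# Rows: project type; columns: development approach coded from survey answers
# (hypothetical categories: heritage/legacy, model-based, other).
observed = [
    [34, 12, 9],   # embedded projects
    [15, 28, 14],  # nonembedded projects
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Development approach appears to differ between the two groups.")
```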
Abstract:
Results of neuropsychological examinations depend on valid data. Whereas clinicians previously believed that clinical skill was sufficient to identify non-credible performance by examinees on standard tests, research demonstrates otherwise. Consequently, studies on measures to detect suspect effort in adults have received tremendous attention in the previous twenty years, and incorporation of validity indicators into neuropsychological examinations is now seen as integral. Few studies exist that validate methods appropriate for the measurement of effort in pediatric populations. Of extant studies, most evaluate standalone measures originally developed for use with adults. The present study examined the utility of indices from the California Verbal Learning Test – Children's Version (CVLT-C) as embedded validity indicators in a pediatric sample. Participants were 225 outpatients aged 8 to 16 years old referred for clinical assessment after mild traumatic brain injury (mTBI). Non-credible performance (n = 39) was defined as failure of the Medical Symptom Validity Test (MSVT). Logistic regression demonstrated that only the Recognition Discriminability index was predictive of MSVT failure (OR = 2.88, p < .001). A cutoff of z ≤ -1.0 was associated with sensitivity of 51% and specificity of 91%. In the current study, CVLT-C Recognition Discriminability was useful in the identification of non-credible performance in a sample of relatively high-functioning pediatric outpatients with mTBI. Thus, this index can be added to the short list of embedded validity indicators appropriate for pediatric neuropsychological assessment.
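For readers unfamiliar with embedded validity indicators, the cutoff evaluation the abstract reports can be illustrated as follows. This is a minimal sketch with randomly generated scores, so the resulting figures will not match the study's sensitivity and specificity; it only shows how a z ≤ -1.0 cutoff on a score such as CVLT-C Recognition Discriminability would be scored against a criterion such as MSVT failure.

```python
# Minimal sketch of evaluating a z-score cutoff against a criterion measure.
# The arrays below are hypothetical random data, not study data.
import numpy as np

rng = np.random.default_rng(0)
recog_z = rng.normal(0.0, 1.0, size=225)   # Recognition Discriminability z-scores
msvt_fail = rng.random(225) < 0.17         # criterion: MSVT failure (non-credible)

flagged = recog_z <= -1.0                  # embedded-validity cutoff under evaluation

sensitivity = (flagged & msvt_fail).sum() / msvt_fail.sum()
specificity = (~flagged & ~msvt_fail).sum() / (~msvt_fail).sum()
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```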
Abstract:
Online education is a new teaching and learning medium with few current guidelines for faculty, administrators, or students. Its rapid growth over the last decade has challenged academic institutions to keep up with the demand while also providing a quality education. Our understanding of the factors that determine quality and effective online learning experiences leading to student learning outcomes is still evolving. There is a lack of consensus in the current research on the effectiveness of online versus face-to-face education. The U.S. Department of Education conducted a meta-analysis in 2009 and concluded that student learning outcomes in online courses were equal to, and often better than, those in traditional face-to-face courses. Subsequent research has produced contradictory findings, and further inquiry is necessary. The purpose of this embedded mixed-methods design research study is to further our understanding of the factors that create quality and successful educational outcomes in an online course. To achieve this, the first phase of this study measured and compared learning outcomes in the online and in-class sections of a graduate-level legal administration course. The second phase entailed interviews with students in both the online and face-to-face sections to understand their perspectives on the factors contributing to learning outcomes. Six themes emerged from the qualitative findings: convenience, higher-order thinking, discussions, professor engagement, professor and student interaction, and face-to-face interaction. Findings from this study indicate that the factors students perceive as contributing to learning outcomes in an online course are consistent among all students and are supported in the existing literature. Higher-order thinking, however, emerged as a stronger theme than indicated in the current research, and the face-to-face nature of the traditional classroom may be more an issue of familiarity than a factor contributing to learning outcomes. As education continues to reach new heights and developments in technology advance, the factors found to contribute to student learning outcomes will be refined and enhanced. These developments will continue to transform the ways in which we deliver and receive knowledge in both traditional and online classrooms. While there is a growing body of research on online education, the field's evolution has unsettled earlier findings and posed new areas to investigate.
Abstract:
Hardware/software partitioning (HSP) is a key task in embedded system co-design. The main goal of this task is to decide which components of an application are to be executed on a general-purpose processor (software) and which ones on specific hardware, taking into account a set of restrictions expressed by metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of the models and metrics used, the choice of the best-suited algorithm is still an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it makes it possible to accept reasonably good solutions and to reject those that do not seem good. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm, and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The obtained results show that Restart Hill Climbing is the best-performing algorithm in most cases.
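To make the partitioning problem concrete, the sketch below shows restart hill climbing over binary hardware/software assignments, minimizing a weighted sum of hardware area and software execution time. The component costs, weights, and scalar objective are invented for illustration; the paper's fuzzy model and benchmark instances are not reproduced here.

```python
# Illustrative restart hill climbing for hardware/software partitioning (HSP).
# Costs and weights are invented; True = map the component to hardware.
import random

# Per-component (hw_area, sw_time) costs: assigning a component to hardware
# adds its area, keeping it in software adds its execution time.
COMPONENTS = [(4, 9), (7, 3), (2, 6), (5, 5), (8, 2), (3, 7)]
W_AREA, W_TIME = 0.5, 0.5

def cost(partition):
    area = sum(a for (a, _), hw in zip(COMPONENTS, partition) if hw)
    time = sum(t for (_, t), hw in zip(COMPONENTS, partition) if not hw)
    return W_AREA * area + W_TIME * time

def restart_hill_climb(restarts=20):
    best, best_cost = None, float("inf")
    for _ in range(restarts):
        sol = [random.random() < 0.5 for _ in COMPONENTS]  # random start point
        improved = True
        while improved:
            improved = False
            for i in range(len(sol)):                      # try flipping each assignment
                neighbour = sol[:]
                neighbour[i] = not neighbour[i]
                if cost(neighbour) < cost(sol):
                    sol, improved = neighbour, True
        if cost(sol) < best_cost:
            best, best_cost = sol, cost(sol)
    return best, best_cost

print(restart_hill_climb())
```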
Abstract:
Commercial off-the-shelf microprocessors are the core of low-cost embedded systems because of their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Therefore, soft error mitigation has become a compulsory requirement for an increasing number of applications, which operate from the space level down to ground level. In this context, this paper uses the concept of selective hardening, which aims at designing reduced-overhead, flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different register subsets from the microprocessor register file to be protected in software. Thus, the design space is enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers. This makes it possible to find the best trade-offs between performance, code size, and fault coverage. Three case studies have been developed to show the applicability and flexibility of the proposal.
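The design space the abstract mentions can be pictured as the set of all register subsets one might choose to protect. The sketch below enumerates that space and reports a trade-off for each subset; the register names, the per-register coverage and overhead figures, and the assumption that they add up linearly are all illustrative simplifications, not output of the SWIFT-R tooling.

```python
# Sketch of the selective-hardening design space: choosing which registers to
# protect (e.g., by SWIFT-R-style software replication) trades fault coverage
# against code-size and runtime overhead. All per-register figures are invented.
from itertools import combinations

# register name -> (estimated coverage gain, code-size overhead, runtime overhead)
REGISTERS = {
    "r0": (0.20, 0.10, 0.08),
    "r1": (0.15, 0.08, 0.06),
    "r2": (0.12, 0.07, 0.05),
    "r3": (0.08, 0.05, 0.04),
}

def evaluate(subset):
    # Naive additive model, purely for illustration.
    cov = sum(REGISTERS[r][0] for r in subset)
    size = sum(REGISTERS[r][1] for r in subset)
    time = sum(REGISTERS[r][2] for r in subset)
    return cov, size, time

# Enumerate every partially protected version and report its trade-off.
for k in range(len(REGISTERS) + 1):
    for subset in combinations(REGISTERS, k):
        cov, size, time = evaluate(subset)
        print(f"protect {subset or '()'}: coverage~{cov:.2f}, "
              f"code overhead~{size:.2f}, runtime overhead~{time:.2f}")
```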
Abstract:
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process required to assess their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
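The core of such a multi-objective exploration is Pareto dominance: a hardened version is kept only if no other version is at least as good on every objective and strictly better on one. The sketch below shows that selection step in isolation; the candidate names and their coverage/overhead values are invented, and the full NSGA-II machinery (crossover, mutation, crowding distance) is deliberately omitted.

```python
# Minimal sketch of the non-dominated filtering that underlies an NSGA-II-style
# design space exploration over hardened software versions. Each candidate is
# (fault coverage to maximise, code-size overhead, runtime overhead), both to
# minimise. Numbers are invented, not tool output.
candidates = {
    "protect-none":  (0.40, 0.00, 0.00),
    "protect-r0":    (0.62, 0.10, 0.08),
    "protect-r0-r1": (0.75, 0.18, 0.14),
    "protect-all":   (0.93, 0.35, 0.30),
    "protect-r3":    (0.45, 0.12, 0.10),  # dominated: worse than protect-r0 on every axis
}

def dominates(a, b):
    """a dominates b if it is no worse on every objective and better on at least one."""
    cov_a, size_a, time_a = a
    cov_b, size_b, time_b = b
    no_worse = cov_a >= cov_b and size_a <= size_b and time_a <= time_b
    better = cov_a > cov_b or size_a < size_b or time_a < time_b
    return no_worse and better

pareto_front = [
    name for name, obj in candidates.items()
    if not any(dominates(other, obj)
               for other_name, other in candidates.items() if other_name != name)
]
print("non-dominated versions:", pareto_front)
```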
Abstract:
The development of applications and services for mobile systems faces a varied range of devices with very heterogeneous capabilities whose response times are difficult to predict. The research described in this work aims to address this issue by developing a computational model that formalizes the problem and defines adaptive computing methods. The proposal combines imprecise computation strategies with cloud computing paradigms in order to provide flexible implementation frameworks for embedded or mobile devices. As a result, imprecise computation scheduling of the embedded system's workload is used to decide when to move computation to the cloud, according to the priority and response time of the tasks to be executed, and thereby meet the desired productivity and quality of service. A technique for estimating network delays and scheduling tasks more accurately is illustrated in this paper. An application example is described in which this technique is tested in running contexts with heterogeneous workloads to check the validity of the proposed model.
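The decision the abstract describes can be sketched as follows: estimate the network delay, then, per task, either offload the full computation to the cloud if its deadline can still be met, or fall back to the mandatory part of the imprecise computation locally. The exponentially weighted moving average estimator, the speedup factor, and all timing values are assumptions for illustration; they are not the paper's model.

```python
# Sketch of an offload-or-degrade decision combining imprecise computation with
# cloud offloading. All timings, the EWMA weight, and the speedup are illustrative.

class DelayEstimator:
    """Simple EWMA estimate of the network round-trip delay (seconds)."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.estimate = 0.0

    def update(self, measured_delay):
        self.estimate = (1 - self.alpha) * self.estimate + self.alpha * measured_delay
        return self.estimate

def schedule(task, net, cloud_speedup=4.0):
    """task: dict with local_time, mandatory_time, deadline (seconds)."""
    remote_time = task["local_time"] / cloud_speedup + 2 * net.estimate  # send + receive
    if remote_time <= task["deadline"]:
        return "offload to cloud (full-quality result)"
    if task["mandatory_time"] <= task["deadline"]:
        return "run locally, mandatory part only (imprecise result)"
    return "deadline cannot be met"

net = DelayEstimator()
for sample in (0.05, 0.08, 0.30):   # observed round-trip delays in seconds
    net.update(sample)

task = {"local_time": 1.2, "mandatory_time": 0.4, "deadline": 0.6}
print(schedule(task, net))
```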
Abstract:
Paper presented at the V Jornadas de Computación Empotrada, Valladolid, 17-19 September 2014.
Abstract:
Given the importance of the rehabilitation and recovery of architectural heritage in people's lives, this paper aims to strengthen the traditional methods of stone vault calculation by taking advantage of the technological capabilities of the powerful ANSYS Workbench program. As an example, this makes it possible to identify pathologies that may have arisen during the construction history of the building. To limit the scope of this research, the upper vault of the main chapel of the Santiago parish church in Orihuela (Alicante) is selected as a reference; it is an important sixteenth-century Renaissance work by Jerónimo Quijano. Moreover, it is an innovative stone masonry vault consisting of eight double arches intercrossed with one another and braced by severies. During the seventeenth century there was a lantern in the central cap, and it is unknown why it was removed. Its construction could justify the original constructive solution with intercrossed arches, which freed the centre to create a more illuminated and comfortable presbytery. By similarity with other works by Quijano, a small lantern piercing the central spherical cap is considered. A comparative study is proposed between this vault and different architectural solutions from the same period, based on several common parameters: a square-plan vault with a spherical envelope, intercrossed arches, a possible lantern, the dimensions of the spanned space, similar load states, and compact limestone masonry. The three solutions differ mainly in their size and the type of lantern, and their comparison shows which one is the most resistant and stable. The other two buildings maintain some connection with Quijano's professional sphere. The particular case of the Communion chapel of the Basilica in Elche (a large prismatic lantern with a large cylindrical drum springing from the arches themselves and an upper hemispherical dome) was selected for its state of conservation, its proximity to Orihuela, and its construction during the eighteenth century. Finally, a significant Spanish Renaissance dome completes the selection: the cross vault of the Benavides Chapel of the San Francisco Convent in Baeza (Jaén), designed by Andrés de Vandelvira in the sixteenth century (a large hemispherical dome springing from the arches themselves). To simplify the calculation and standardize the works to be compared, all of them were considered with similar characteristics: a constant thickness of 30 cm, with the intercrossed arches specifically analysed under identical loads and with identical Young's modulus and Poisson's ratio. Regarding the calculation results, in general terms compressive stresses predominate, influenced by the joint contribution of the fill material over the vault, the vault itself, the thick side walls, the buttresses, and the weight of the roof above. In addition, all three solutions are adequate, with the Orihuela vault being the safest and the Baeza dome the riskiest because of its large dimensions. Thus, the idea of intercrossed arches of suitable thickness would allow the heaviest lantern to be built, confirming it as a Renaissance architectural typology built in stone.
Abstract:
Information technologies (IT) currently account for 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed that focus on increasing the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision making. However, today's monitoring systems (MSs) are partial, specific, and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy-saving systems and is offered as a value-added service embedded in a device with low cost and low power consumption. The proposal is general in nature, comprehensive, scalable, and focused on heterogeneous environments, and it allows quick adaptation to the needs of changing and dynamic environments. Further, a prototype of the system has been implemented on several devices, which has allowed the proposal to be validated and the minimum hardware profile required to support the model to be identified.