780 results for embedded computing
Abstract:
The goal of this thesis is to present a case study of the profitability of test automation in the development of embedded software in a real industrial setting. The cost-benefit analysis considers the costs and benefits that test automation brings to software development before the software is released to customers; the potential benefits of test automation for software quality after customer release were not estimated. Test automation is a significant investment which often requires dedicated resources. When done properly, the investment in test automation can produce major cost savings by reducing the need for manual testing effort, especially if the software is developed with an agile development framework. It can reduce the cost of avoidable rework in software development, as test automation enables the detection of construction-time defects at the earliest possible moment. Test automation also has many pitfalls, such as test maintainability and the testability of the software; if those areas are neglected, the investment in test automation may become worthless or may even produce negative results. The results of this thesis suggest that test automation is very profitable at the company under study.
Abstract:
Agile methods have become increasingly popular in the field of software engineering. While agile methods are now generally considered applicable to software projects of many different kinds, they have not been widely adopted in embedded systems development. This is partly due to the natural constraints present in embedded systems development (e.g. hardware–software interdependencies) that challenge the utilization of agile values, principles and practices. Research on agile embedded systems development has been very limited, and this thesis tackles an even less researched theme related to it: the suitability of different project management tools in agile embedded systems development. The thesis covers the basic aspects of many different agile tool types, from physical tools, such as task boards and cards, to web-based agile tools that offer all-round solutions for application lifecycle management. In addition to these two extremes, there is also a wide range of lighter agile tools that focus on the core agile practices, such as backlog management. Other non-agile tools, such as bug trackers, can also be used to support agile development, for instance with plug-ins. To investigate the special tool requirements in agile embedded development, the author observed tool-related issues and solutions in a case study involving three different companies operating in the field of embedded systems development. All three companies started from a distinct situation, and thus the tool solutions varied from a backlog spreadsheet built from scratch to plug-in development for an existing agile software tool. Detailed reports are presented of all three tool cases. Based on the knowledge gathered from agile tools and the case study experiences, it is concluded that there are tool-related issues in the pilot phase, such as backlog management and user motivation. These can be overcome in various ways depending on the type of team in question.
Finally, five principles are formed to give guidelines for tool selection and usage in agile embedded systems development.
Abstract:
Smartphones have become part and parcel of our life, where mobility provides the freedom of not being bound by time and space. In addition, the number of smartphones produced each year is skyrocketing. However, this has also created discrepancies, or fragmentation, among devices and operating systems, which in turn has made it exceedingly hard for developers to deliver hundreds of similarly featured applications in various versions for market consumption. This thesis is an attempt to investigate whether cloud-based mobile development platforms can mitigate and eventually eliminate fragmentation challenges. During this research, we selected and analyzed the most popular cloud-based development platforms and tested their integrated cloud features. The research showed that cloud-based mobile development platforms may be able to reduce mobile fragmentation and enable the use of a single codebase to deliver a mobile application for different platforms.
Abstract:
Traction motor design differs significantly from industrial machine design. The starting point is the load cycle instead of the steady-state rated operating point. The speed of the motor varies from zero to very high speeds. At low speeds, heavy overloading is used for starting, and the field-weakening region also plays an important role. Finding a suitable field-weakening point is one of the important design targets. At the lowest speeds, a high torque output is desired, and all current reserves of the supplying converter unit are used to achieve the torque. In this paper, a 110-kW permanent-magnet traction motor with a 2.5-p.u. starting torque and a maximum speed of 2.5 p.u. is studied. The field-weakening point is altered by varying the number of winding turns of the machine. One design is selected for prototyping. The theoretical results are verified by measurements.
Abstract:
Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of a chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the utilization of the resources has to be efficient in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but becomes an issue also at ground level, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level.
The design of the system proceeds according to a formal refinement approach which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
Abstract:
Smart home implementation in residential buildings promises to optimize energy usage and save a significant amount of energy simply through a better understanding of the user's energy usage profile. Apart from the energy optimisation prospects of this technology, it also aims to guarantee occupants a significant amount of comfort and remote control over home appliances, both at home and at remote locations. However, a smart home investment, just like any other kind of investment, requires an adequate measurement and justification of the economic gains it could proffer before its realization. These economic gains could differ between occupants due to their inherent behaviours and tendencies. It is thus pertinent to investigate the various behaviours and tendencies of occupants in different domains of interest, and to measure the value of the energy savings accrued by smart home implementations in these domains, in order to justify such economic gains. This thesis investigates two domains of interest (the rented apartment and the owned apartment) for primarily two behavioural tendencies (Finland and Germany), obtained from observation and corroborated by conducted interviews, to measure the payback time and Return on Investment (ROI) of their smart home implementations. Similar measures are also obtained for an identified Australian use case. The research findings reveal that building automation for the Finnish behavioural tendencies seems to proffer a better ROI and payback time for smart home implementations.
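The payback time and ROI measures used in the thesis reduce to simple arithmetic. A minimal sketch follows, using hypothetical smart-home figures rather than the thesis's data:

```python
def payback_years(investment, annual_savings):
    """Simple payback time: years until cumulative savings cover the investment."""
    return investment / annual_savings

def roi(total_gain, investment):
    """Return on Investment expressed as a fraction of the initial outlay."""
    return (total_gain - investment) / investment

# Hypothetical figures (not from the thesis):
inv = 6000.0      # installation cost, EUR
savings = 500.0   # energy cost savings per year, EUR
horizon = 15      # appraisal period, years

print(payback_years(inv, savings))   # 12.0 years
print(roi(savings * horizon, inv))   # 0.25, i.e. 25% over the horizon
```

Behavioural differences between occupant groups enter such a model through the annual-savings figure, which is why the measured payback times differ between the Finnish and German tendencies.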
Abstract:
Power consumption is still an issue today in wearable computing applications. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, so that energy-efficient wireless sensors for context recognition in wearable computing applications can be designed in the future. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios such as Display, Speaker, Camera and microphone, Transfer by Wi-Fi, Monitoring outdoor physical activity, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses and the SimValley Smartwatch AW-420.RX are the three devices representative of their form factors. The power consumption is measured using PowerTutor, an Android energy profiler application with a logging option; since its internal parameters are unknown, it is calibrated against a USB power meter. The results show that the screen size is the main parameter influencing the power consumption. The power consumption for an identical scenario varies between the wearable devices, meaning that other components, parameters or processes might affect the power consumption, and further study is needed to explain these variations. This paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (a speaker is more efficient than a display) affect the energy consumption in different ways. The paper gives recommendations, based on the energy model, for reducing the energy consumption of healthcare wearable computing applications.
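A scenario-based energy model of the kind described can be sketched as a sum of per-component power draws weighted by their active time in the scenario. The component names and wattages below are hypothetical placeholders, not the paper's measurements:

```python
# Hypothetical average power draws per component, in watts:
SCENARIO_POWER_W = {
    "display": 0.9,
    "cpu": 0.6,
    "wifi": 0.7,
}

def scenario_energy_joules(active_seconds):
    """Energy [J] = sum over components of power [W] * active time [s].

    active_seconds maps a component name to its active time in the scenario.
    """
    return sum(SCENARIO_POWER_W[c] * t for c, t in active_seconds.items())

# e.g. a 60 s "Transfer by Wi-Fi" scenario with the screen and CPU active:
print(scenario_energy_joules({"display": 60, "cpu": 60, "wifi": 60}))  # 132.0
```

Comparing devices then amounts to swapping in each device's calibrated power table and evaluating the same scenario timings.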
Abstract:
This guide summarizes useful information about the European Space Agency (ESA), the European space industry, the ECSS standards and product assurance for small and medium enterprises that are aiming to enter the industry. Additionally, the applicability of agile development in space projects is discussed.
Abstract:
The manufacturing industry has always faced the challenge of improving production efficiency, product quality and innovation ability, while struggling to adopt cost-effective manufacturing systems. In recent years, cloud computing has emerged as one of the major enablers for the manufacturing industry. By combining cloud computing and other advanced manufacturing technologies, such as the Internet of Things, service-oriented architecture (SOA), networked manufacturing (NM) and manufacturing grid (MGrid), with existing manufacturing models and enterprise information technologies, a new paradigm called cloud manufacturing has been proposed in the recent literature. This study presents the concepts and ideas of cloud computing and cloud manufacturing. The concept, architecture, core enabling technologies and typical characteristics of cloud manufacturing are discussed, as well as the difference and relationship between cloud computing and cloud manufacturing. The research is based on mixed qualitative and quantitative methods, and a case study. The case is a prototype cloud manufacturing solution: a software platform developed in cooperation between ATR Soft Oy and the SW Company China office. The study tries to understand the practical impacts and challenges that derive from cloud manufacturing. Its main conclusion is that cloud manufacturing is an approach to achieving the transformation from traditional production-oriented manufacturing to next-generation service-oriented manufacturing. Many manufacturing enterprises are already using a form of cloud computing in their existing network infrastructure to increase the flexibility of their supply chains and reduce resource consumption, and the study finds that the shift from cloud computing to cloud manufacturing is feasible. Meanwhile, the study points out that the related theory, methodology and applications of cloud manufacturing systems are far from mature; it is still an open field in which many new technologies need to be studied.
Abstract:
Globalization and interconnectedness in the worldwide sphere have changed the existing and prevailing modus operandi of organizations around the globe and have challenged existing practices along with the business-as-usual mindset. There are no rules in terms of creating a competitive advantage and positioning within an unstable, constantly changing and volatile globalized business environment. The financial industry, the locomotive or flagship industry of the global economy, especially in the aftermath of the financial crisis, has reached a point of trying to recover and redefine its strategic orientation and positioning within the global business arena. Innovation has always been a trend and a buzzword, and by many has been considered the ultimate answer to any kind of problem. The mantra Innovate or Die has prevailed in organizational entities in a, sometimes, ruthless endeavour to develop cutting-edge products and services and capture a landmark position in the market. The emerging shift from a closed to an open innovation paradigm has been considered a new operational mechanism within the management and leadership of the company of the future. In that respect, open innovation has experienced a tremendous research growth trajectory by putting forward a new way of exchanging and using surplus knowledge in order to sustain innovation within organizations and at the level of the industry. In the abovementioned reality, there seems to be something missing: the human element. This research, by going beyond the traditional narratives of open innovation, aims at making an innovative theoretical and managerial contribution developed and grounded on the ongoing discussion regarding the individual and organizational barriers to open innovation within the financial industry.
By functioning across disciplines and reaching out to primary data, it debunks the myth that open innovation is solely a knowledge inflow and outflow mechanism and sheds light on why and how organizational open innovation works by illuminating the broader dynamics and underlying principles of this fascinating paradigm. Little attention has been given to the role of the human element, the foundational prerequisite of trust encapsulated within the precise and fundamental nature of organizing for open innovation, the organizational capabilities, the individual profiles of open innovation leaders, the definition of open innovation in the realms of the financial industry, the strategic intent of the financial industry, and the need for nurturing a societal impact for human development. In that respect, this research introduces the trust-embedded approach to open innovation as a new, insightful way of organizing for open innovation. It unveils the peculiarities of the corporate and individual spheres that act as a catalyst towards the creation of productive open innovation activities. The incentive of this research captures the fundamental question revolving around the need for financial institutions to recognise the importance of organizing for open innovation. The overarching question is why and how to create a corporate culture of openness in the financial industry, an organizational environment that can help open innovation excel. This research shares novel, cutting-edge outcomes and propositions under the prisms of both theory and practice. The trust-embedded open innovation paradigm captures the norms and narratives around the way of leading open innovation within the 21st century by cultivating a human-centricity mindset that leads to the creation of human organizations, leaving behind the dehumanization mindset currently prevailing within the financial industry.
Abstract:
By employing the embedded-atom potentials of Mei et al. [1], we have calculated the dynamical matrices and phonon dispersion curves for six fcc metals (Cu, Ag, Au, Ni, Pd and Pt). We have also investigated, within the quasiharmonic approximation, some other thermal properties of these metals which depend on the phonon density of states, such as the temperature dependence of the lattice constant, the coefficient of linear thermal expansion, the isothermal and adiabatic bulk moduli, the heat capacities at constant volume and constant pressure, the Grüneisen parameter and the Debye temperature. The computed results are compared with experimental findings wherever possible. The comparison shows generally good agreement between the theoretical values and the experimental data for all properties, except for discrepancies in the phonon frequencies and Debye temperature for Pd, Pt and Au. Further, we modify the parameters of this model for Pd and Pt and obtain phonon dispersion curves which are in good agreement with the experimental data.
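The DOS-dependent quantities listed above follow from integrals over the phonon density of states g(ω). For example, the constant-volume heat capacity in the quasiharmonic approximation is (a standard textbook result, not a formula specific to this paper):

```latex
C_V(T) = k_B \int_0^{\infty} g(\omega)
         \left( \frac{\hbar\omega}{k_B T} \right)^{2}
         \frac{e^{\hbar\omega / k_B T}}
              {\left( e^{\hbar\omega / k_B T} - 1 \right)^{2}} \, d\omega
```

The other properties (thermal expansion, Grüneisen parameter, Debye temperature) follow from similar weighted integrals of g(ω) and its volume dependence.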
Abstract:
Now, more than ever, sponsors of athletic events demand to see evidence of a commercial return, such as enhanced brand awareness, for their investment of cash or non-cash resources (Lough et al., 2000). The most common way to measure the impact of perimeter signage (i.e., any billboard or sign that displays a company's brand name and/or logo and which surrounds the playing area) on spectators' awareness of event sponsors has been through the use of brand name recall and recognition tests (Shilbury & Berriman, 1996). Recall testing requires spectators to list all of the sponsors they can remember seeing at, for example, an athletic event, strictly from memory and without any help (Cuneen & Hannan, 1993). With recognition testing, spectators are required to identify sponsors from a prepared list which includes "dummy" brand names (i.e., sponsors that are present in the list but which do not actually sponsor the event). In order to determine whether sponsors' brand awareness objectives are being met, it is important for sport and recreation marketers to understand what influences a spectator's ability to remember (i.e., recall and/or recognize) the brand names of companies who advertise on perimeter signage. The purpose of this study was to examine the factors that influence spectators' recall and recognition of embedded sponsorship stimuli (i.e., company brand names on perimeter signage surrounding the play area) at a Canadian university's men's basketball game and football game. These factors included the number of games spectators attended over the course of the season (i.e., repeated exposure to sponsorship stimuli), spectators' level of involvement with the event, and spectators' level of involvement with the advertisements (i.e., perimeter signage).
This study also examined the differences between recall and recognition as a means of measuring spectators' awareness of sponsors, and attempted to determine if there are sport differences in spectators' recall and recognition of perimeter signage. Upon leaving the football stadium or gymnasium, spectators were approached, at random, by trained research assistants located at each exit and asked to complete a brief survey questionnaire. Respondents completed the survey on-site. A total of 358 completed surveys were collected from spectators who attended the football (N = 277) and basketball (N = 81) games. The data suggest that football and basketball respondents recognized more sponsors' brand names than they recalled. In addition, football respondents who were highly involved with the event (i.e., those individuals who viewed attending the events as fun, interesting and exciting) attended more games over the course of the season and had significantly higher brand name recognition of sponsors who advertised on perimeter signage than those individuals with low involvement with the athletic event. Football respondents who were highly involved with the sponsors' advertisements (i.e., those individuals who viewed sponsors' perimeter signage as appealing, valuable and important) had significantly higher brand name recall of event sponsors than those individuals with low involvement with these sponsors' advertisements. Repeated exposure to perimeter signage did not have a significant influence on football or basketball respondents' recall or recognition of sponsors. Finally, the data revealed that football respondents had significantly higher recall of sponsors' brand names than basketball respondents. Conversely, basketball respondents had significantly higher recognition of sponsors' brand names than did football respondents.
Abstract:
Variations in different types of genomes have been found to be responsible for a large degree of physical diversity, such as appearance and susceptibility to disease. Identification of genomic variations is difficult and can be facilitated through computational analysis of DNA sequences. Newly available technologies are able to sequence billions of DNA base pairs relatively quickly. These sequences can be used to identify variations within a specific genome but must first be mapped to a reference sequence. In order to align these sequences to a reference sequence, we require mapping algorithms that make use of approximate string matching and string indexing methods. To date, few mapping algorithms have been tailored to handle the massive amounts of output generated by newly available sequencing technologies. In order to handle this large amount of data, we modified the popular mapping software BWA to run in parallel using OpenMPI. Parallel BWA matches the efficiency of multithreaded BWA functions while providing efficient parallelism for BWA functions that do not currently support multithreading. Parallel BWA shows significant wall-time speedup compared to multithreaded BWA on high-performance computing clusters, and will thus facilitate the analysis of genome sequencing data.
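The abstract does not detail Parallel BWA's MPI implementation; the sketch below only illustrates the underlying data-parallel idea: partition the read set across processes, map each chunk independently, and gather the hits. A toy exact-match mapper stands in for BWA's FM-index alignment, and the read/reference strings are invented examples:

```python
def partition(items, nranks):
    """Split items into nranks contiguous chunks (block distribution),
    as an MPI scatter over batches of reads would."""
    base, extra = divmod(len(items), nranks)
    chunks, start = [], 0
    for r in range(nranks):
        size = base + (1 if r < extra else 0)
        chunks.append(items[start:start + size])
        start += size
    return chunks

def map_reads(reads, reference):
    """Toy stand-in for alignment: report the first exact-match offset
    of each read in the reference (-1 if unmapped)."""
    return [(read, reference.find(read)) for read in reads]

reference = "ACGTACGTTAGC"
reads = ["ACGT", "GTTA", "TAGC", "CGTA", "AAAA"]

chunks = partition(reads, 3)  # one chunk per "rank"
# Each rank would map its chunk independently; a gather step merges results:
results = [hit for chunk in chunks for hit in map_reads(chunk, reference)]
print(results)  # [('ACGT', 0), ('GTTA', 6), ('TAGC', 8), ('CGTA', 1), ('AAAA', -1)]
```

Because each read maps independently against the shared reference index, this distribution is embarrassingly parallel, which is why a wall-time speedup close to the process count is achievable on a cluster.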
Abstract:
The potential of formative assessment (FA) for informing learning in classroom-based nursing courses is clearly established in the literature; however, research on FA in clinical courses remains scarce. This inquiry explored the lived experience of nursing students using transcendental phenomenology and described the phenomenon of being assessed in clinical courses. The research question guiding the study was: How is the phenomenon of assessment experienced by nursing students when FA is formally embedded in clinical courses? Inherent in this question were the following issues: (a) the meaning of clinical experiences for nursing students, (b) the meaning of being assessed through FA, and (c) what it is like to be assessed when FA is formally embedded within clinical experiences. The noematic themes that illuminated the whatness of the participants’ experience were (a) enabled cognitive activity, (b) useful feedback, (c) freedom to be, (d) enhanced focus, (e) stress moderator, and (f) respectful mentorship. The noetic themes associated with how the phenomenon was experienced were related to bodyhood, temporality, spatiality, and relationship to others. The results suggest a fundamental paradigm shift from traditional nursing education to a more pervasive integration of FA in clinical courses, so that students have time to learn before being graded on their practice. Furthermore, this inquiry and the literature consulted provide evidence that using cognitive science theory to inform and reform clinical nursing education is a timely option to address the repeated calls from nursing leaders to modernize nursing education. This inquiry contributes to reducing our reliance on assumptions derived from research on FA in nursing classrooms and provides evidence based on the reality of using formative assessment in clinical courses. Recommendations for future research are presented.