68 results for support systems
Abstract:
This thesis deals with a hardware accelerated Java virtual machine, named REALJava. The REALJava virtual machine is targeted for resource constrained embedded systems. The goal is to attain increased computational performance with reduced power consumption. While these objectives are often seen as trade-offs, in this context both of them can be attained simultaneously by using dedicated hardware. The target level of the computational performance of the REALJava virtual machine is initially set to be as fast as the currently available full custom ASIC Java processors. As a secondary goal all of the components of the virtual machine are designed so that the resulting system can be scaled to support multiple co-processor cores. The virtual machine is designed using the hardware/software co-design paradigm. The partitioning between the two domains is flexible, allowing customizations to the resulting system, for instance the floating point support can be omitted from the hardware in order to decrease the size of the co-processor core. The communication between the hardware and the software domains is encapsulated into modules. This allows the REALJava virtual machine to be easily integrated into any system, simply by redesigning the communication modules. Besides the virtual machine and the related co-processor architecture, several performance enhancing techniques are presented. These include techniques related to instruction folding, stack handling, method invocation, constant loading and control in time domain. The REALJava virtual machine is prototyped using three different FPGA platforms. The original pipeline structure is modified to suit the FPGA environment. The performance of the resulting Java virtual machine is evaluated against existing Java solutions in the embedded systems field. The results show that the goals are attained, both in terms of computational performance and power consumption. 
The computational performance in particular is evaluated thoroughly, and the results show that REALJava is more than twice as fast as the fastest full custom ASIC Java processor. In addition to standard Java virtual machine benchmarks, several new Java applications are designed both to verify the results and to broaden the spectrum of the tests.
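One of the performance techniques listed above, instruction folding, collapses a common run of stack-machine bytecodes into a single fused operation. The following Python sketch is a hedged, simplified model of the idea only: the pattern, opcode names, and fused operation are illustrative and are not taken from the REALJava hardware.

```python
# A stack machine executes "iload a; iload b; iadd; istore c" as four steps.
# A folding unit recognises the run and emits one fused register-style op.

FOLD_PATTERN = ("iload", "iload", "iadd", "istore")

def fold(instructions):
    """Replace load-load-add-store runs with a single fused operation."""
    out, i = [], 0
    while i < len(instructions):
        window = tuple(op for op, _ in instructions[i:i + 4])
        if window == FOLD_PATTERN:
            a = instructions[i][1]       # first operand slot
            b = instructions[i + 1][1]   # second operand slot
            c = instructions[i + 3][1]   # destination slot
            out.append(("fused_add", (a, b, c)))  # c = a + b in one step
            i += 4
        else:
            out.append(instructions[i])
            i += 1
    return out

program = [("iload", 0), ("iload", 1), ("iadd", None), ("istore", 2),
           ("iload", 2), ("ireturn", None)]
folded = fold(program)
```

In hardware the same recognition happens in the fetch/decode stage rather than as a pass over a list, but the payoff is the same: four stack operations become one datapath operation.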
Abstract:
The objective of the thesis is to enhance the understanding of the management of the front end phases of the innovation process in a networked environment. The thesis approaches the front end of innovation from three perspectives: the strategy, processes and systems of innovation. The purpose of using different perspectives is to provide an extensive systemic view of the front end and to uncover the complex nature of innovation management. The context of the research is the networked operating environment of firms. The unit of analysis is the firm itself or its innovation processes, which means that this research approaches innovation networks from the point of view of a firm. The strategy perspective of the thesis emphasises the importance of purposeful innovation management, the innovation strategy of firms. The role of innovation processes is critical in carrying out innovation strategies in practice, supporting the development of organizational routines for innovation, and driving the strategic renewal of companies. From the systems perspective, the primary focus of the thesis is on idea management systems, which are defined as a part of innovation management systems, and for this thesis as any working combination of methodology and tools (manual or IT-supported) that enhances the management of innovations within their early phases. The main contributions of the thesis are the managerial frameworks developed for managing the front end of innovation, which purposefully “wire” the front end of innovation into the strategy and business processes of a firm. The thesis contributes to modern innovation management by connecting the internal and external collaboration networks as foundational elements for successful management of the early phases of innovation processes in a dynamic environment.
The innovation capability of a firm is largely defined by its ability to rely on and make use of internal and external collaboration already during the front end activities, which by definition include opportunity identification and analysis, idea generation, proliferation and selection, and concept definition. More specifically, coordination of the interfaces between these activities, and between the internal and external innovation environments of a firm, is emphasised. The role of information systems, in particular idea management systems, is to support and delineate the innovation-oriented behaviour and interaction of individuals and organizations during front end activities. The findings and frameworks developed in the thesis can be used by companies for purposeful promotion of their front end processes. The thesis provides a systemic strategy framework for managing the front end of innovation – not as a separate process, but as an elemental bundle of activities that is closely linked to the overall innovation process and strategy of a firm in a distributed environment. The theoretical contribution of the thesis relies on the advancement of the open innovation paradigm in the strategic context of a firm within its internal and external innovation environments. This thesis applies the constructive research approach and case study methodology to provide theoretically significant results, which are also practically beneficial.
Abstract:
Sales configurators are essential tools for companies that offer complex, case-specifically crafted products to their customers. The most sophisticated of them are able to design an entire end product on the fly according to given constraints, calculate a price for the offer, and move the order into production. This thesis covers a sales configurator acquisition project in a large industrial company that offers cranes to its customers. The study spans the preliminary stages of a large-scale software purchase project, starting from the specification of the problem domain and ending with the presentation of the most viable software solution that fulfils the requirements for the new system. The project consists of mapping the usage environment and use cases, and collecting the requirements expected of the new system. The collected requirements involve fitting the new sales system into the enterprise application infrastructure, mitigating the risks involved in the project, and specifying new features for the application while preserving all of the admired features of the old sales system currently used in the company. The collected requirements were presented to a number of different sales software vendors, who were asked to provide solution proposals that would fulfil all the demands. All of the received proposals were evaluated to determine the most feasible solutions, and the construction of the evaluation criteria was itself a part of the study. The final outcome of this study is a short-list of the most feasible sales configurator solutions, together with a description of how the software purchase process in large enterprises works and which aspects deserve attention in large projects of a similar kind.
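At its core, a sales configurator of the kind described above validates a customer's selections against compatibility constraints and prices the valid result. The Python sketch below illustrates only that core idea; all component names, constraints, and prices are hypothetical and unrelated to the crane systems or vendor solutions evaluated in the thesis.

```python
# Hypothetical component catalogue: part -> option -> unit price.
COMPONENTS = {
    "hoist": {"standard": 12000, "heavy_duty": 19000},
    "control": {"pendant": 800, "radio": 2500},
    "span_m": {10: 5000, 20: 9000},
}

# Each constraint is (description, predicate over the configuration).
CONSTRAINTS = [
    ("heavy-duty hoist requires radio control",
     lambda c: not (c["hoist"] == "heavy_duty" and c["control"] != "radio")),
]

def configure(choice):
    """Return the total price, or raise ValueError on an invalid combination."""
    for description, ok in CONSTRAINTS:
        if not ok(choice):
            raise ValueError(f"invalid configuration: {description}")
    return sum(COMPONENTS[part][option] for part, option in choice.items())

price = configure({"hoist": "standard", "control": "pendant", "span_m": 10})
```

A production configurator replaces the predicate list with a full constraint solver and adds engineering rules and order handoff, but the validate-then-price loop is the same shape.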
Abstract:
A fast-changing environment puts pressure on firms to share large amounts of information with their customers and suppliers. The terms information integration and information sharing are essential for facilitating a smooth flow of information throughout the supply chain, and the terms are used interchangeably in the research literature. By integrating and sharing information, firms want to improve their logistics performance. Firms share information with their suppliers and customers by using traditional communication methods (telephone, fax, e-mail, written and face-to-face contacts) and by using advanced or modern communication methods such as electronic data interchange (EDI), enterprise resource planning (ERP), web-based procurement systems, electronic trading systems and web portals. Adopting new ways of using IT is one important resource for staying competitive in a rapidly changing market (Saeed et al. 2005, 387), and an information system that provides people the information they need for performing their work will support company performance (Boddy et al. 2005, 26). The purpose of this research has been to test and understand the relationship between information integration with key suppliers and/or customers and a firm’s logistics performance, especially when information technology (IT) and information systems (IS) are used for integrating information. Quantitative and qualitative research methods have been used to perform the research. Special attention has been paid to the scope, level and direction of information integration (Van Donk & van der Vaart 2005a). In addition, the four elements of integration (Jahre & Fabbe-Costes 2008) are closely tied to the frame of reference. The elements are integration of flows, integration of processes and activities, integration of information technologies and systems, and integration of actors.
The study found that information integration has a low positive relationship to operational performance and a medium positive relationship to strategic performance. The potential performance improvements found in this study vary from efficiency, delivery and quality improvements (operational) to profit, profitability or customer satisfaction improvements (strategic). The results indicate that although information integration has an impact on a firm’s logistics performance, not all performance improvements have been achieved. This study also found that the use of IT and IS has a medium positive relationship to information integration. Almost all case companies agreed that the use of IT and IS could facilitate information integration and improve their logistics performance. The case companies felt that an implementation of a web portal or a data bank would benefit them by enhancing their performance and increasing information integration.
Abstract:
Energy efficiency is one of the major objectives that should be achieved in order to use the world's limited energy resources in a sustainable way. Since radiative heat transfer is the dominant heat transfer mechanism in most fossil fuel combustion systems, more accurate insight and models may improve the energy efficiency of newly designed combustion systems. The radiative properties of combustion gases are highly wavelength dependent. Better models for calculating the radiative properties of combustion gases are needed for the modeling of large-scale industrial combustion systems. With detailed knowledge of the spectral radiative properties of gases, the modeling of combustion processes in different applications can be more accurate. In order to propose a new method for effective non-gray modeling of radiative heat transfer in combustion systems, different models for the spectral properties of gases, including the SNBM, EWBM, and WSGGM, have been studied in this research. Using this detailed analysis of different approaches, the thesis presents new methods for gray and non-gray radiative heat transfer modeling in homogeneous and inhomogeneous H2O–CO2 mixtures at atmospheric pressure. The proposed method is able to support the modeling of a wide range of combustion systems, including the oxy-fired combustion scenario. The new methods are based on implementing pre-obtained correlations for the total emissivity and band absorption coefficient of H2O–CO2 mixtures at different temperatures, gas compositions, and optical path lengths. They can easily be used within any commercial CFD software for radiative heat transfer modeling, resulting in more accurate, simple, and fast calculations. The new methods were successfully used in CFD modeling by applying them to an industrial-scale backpass channel under oxy-fired conditions.
The developed approaches are more accurate than other methods; moreover, they can provide a complete explanation and detailed analysis of radiative heat transfer in different systems under different combustion conditions. The methods were verified by applying them to several benchmarks, and they showed a good level of accuracy and computational speed compared to other methods. Furthermore, the implementation of the suggested banded approach in CFD software is straightforward.
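The weighted-sum-of-gray-gases idea underlying the WSGGM mentioned above replaces the wavelength-dependent spectrum with a few gray gases plus a transparent gas, so total emissivity becomes a short sum of exponential terms. A minimal Python sketch of that form follows; the weights and absorption coefficients are illustrative placeholders, not the correlations developed in the thesis (in a real correlation the weights are temperature-dependent polynomials).

```python
import math

def wsgg_emissivity(weights, kappas, pL):
    """Total emissivity: epsilon = sum_i a_i * (1 - exp(-k_i * pL)),
    where pL is the partial-pressure path length (atm*m)."""
    return sum(a * (1.0 - math.exp(-k * pL)) for a, k in zip(weights, kappas))

# Three gray gases; the remaining weight (0.2) is the transparent gas.
a = [0.4, 0.3, 0.1]     # illustrative weights, temperature-dependent in practice
k = [0.5, 5.0, 50.0]    # illustrative absorption coefficients, 1/(atm*m)

eps = wsgg_emissivity(a, k, pL=1.0)
```

For large path lengths the exponentials vanish and the emissivity saturates at the sum of the gray-gas weights, which is the behaviour a correct set of weights must reproduce.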
Abstract:
As one of the leading high-technology industries, aerospace represents one of the most complex fields of study. While the competitiveness of aircraft systems’ manufacturers attracts a significant number of researchers, some issues remain a blank spot. One of those is after-sale modernization. This master’s thesis investigates how this concept is related to the theory of competitive advantage. Tracing its roots in the framework of the lifecycle of complex technological systems, the key drivers of the aircraft modernization market are revealed. The competitive positioning of players is defined through multiple case studies in the form of several in-depth interviews. The key result of the research is the conclusion that modernization should be considered an inherent component of the strategy of any aircraft systems’ manufacturer, and the thesis thereby aims to support managerial decision making.
Abstract:
Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution for the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes provide the possibility of designing a high performance system in a limited chip area. The major advantages of 3D NoCs are the considerable reductions in average latency and power consumption. There are several factors degrading the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption. Thus, we propose three different approaches for alleviating such congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing the information over the network, and utilizing this information when making a routing decision. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique to perform better routing decisions when traffic information of different routes is available. Faults affect performance significantly, as packets must then take longer paths in order to be routed around the faults, which in turn increases congestion around the faulty regions.
We propose four methods to tolerate faults at the link and switch level by using only the shortest paths as long as such a path exists. The unique characteristic of these methods is that they tolerate faults while also maintaining the performance of NoCs. To the best of our knowledge, these algorithms are the first approaches to bypass faults prior to reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic. This is due to the fact that the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be efficiently routed within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with their analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
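The first congestion-aware approach described above — gathering congestion information from regions of the network and using it when making a routing decision — can be sketched roughly as follows. This is a hedged, toy 2D-mesh model in Python: the port names, congestion table, and selection rule are illustrative only, not the thesis's actual algorithms.

```python
def minimal_ports(src, dst):
    """Output ports that keep a packet on a shortest path in a 2D mesh."""
    (sx, sy), (dx, dy) = src, dst
    ports = []
    if dx > sx: ports.append("east")
    if dx < sx: ports.append("west")
    if dy > sy: ports.append("north")
    if dy < sy: ports.append("south")
    return ports

def route(src, dst, congestion):
    """Among the minimal-path ports, pick the least congested one
    (an adaptive, congestion-aware routing decision)."""
    candidates = minimal_ports(src, dst)
    return min(candidates, key=lambda p: congestion[p])

# Congestion values propagated from neighbouring regions (lower is better):
# east is busy, so the packet heading to (3, 2) is deflected north first.
port = route((0, 0), (3, 2), {"east": 7, "north": 2, "west": 0, "south": 0})
```

Because only minimal-path ports are ever considered, the adaptivity costs nothing in hop count; the congestion table is what the distribution mechanism in the first approach keeps up to date.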
Abstract:
Today's networked systems are becoming increasingly complex and diverse. Current simulation and runtime verification techniques do not provide support for developing such systems efficiently; moreover, the reliability of the simulated/verified systems is not thoroughly ensured. To address these challenges, the use of formal techniques to reason about network system development is growing, while at the same time the mathematical background necessary for using formal techniques is a barrier that prevents network designers from employing them efficiently. Thus, these techniques are not widely used for developing networked systems. The objective of this thesis is to propose formal approaches for the development of reliable networked systems, taking efficiency into account. With respect to reliability, we propose the architectural development of correct-by-construction networked system models. With respect to efficiency, we propose reusable network architectures as well as network development. At the core of our development methodology, we employ abstraction and refinement techniques for the development and analysis of networked systems. We evaluate our proposal by applying the proposed architectures to a pervasive class of dynamic networks, i.e., wireless sensor network architectures, as well as to a pervasive class of static networks, i.e., network-on-chip architectures. The ultimate goal of our research is to put forward the idea of building libraries of pre-proved rules for the efficient modelling, development, and analysis of networked systems. We take into account both qualitative and quantitative analysis of networks via varied formal tool support, using a theorem prover (the Rodin platform) and a statistical model checker (SMC-Uppaal).
Abstract:
Pumping processes requiring a wide range of flow are often equipped with parallel-connected centrifugal pumps. In parallel pumping systems, the use of variable speed control allows the required output for the process to be delivered with a varying number of operated pump units and selected rotational speed references. However, the optimization of parallel-connected, rotational speed controlled pump units often requires adaptive modelling of both the parallel pump characteristics and the surrounding system in varying operating conditions. The information required for system modelling in typical parallel pumping applications, such as waste water treatment and various cooling and water delivery pumping tasks, can be limited, and the lack of real-time operation point monitoring often sets limits for accurate energy efficiency optimization. Hence, alternatives in the form of easily implementable control strategies that can be adopted with minimum system data are necessary. This doctoral thesis concentrates on methods that allow the energy efficient use of variable speed controlled parallel pumps in system scenarios in which the parallel pump units consist of a centrifugal pump, an electric motor, and a frequency converter. Firstly, the suitable operating conditions for variable speed controlled parallel pumps are studied. Secondly, methods for determining the output of each parallel pump unit using characteristic curve-based operation point estimation with a frequency converter are discussed. Thirdly, the implementation of a control strategy based on real-time pump operation point estimation and sub-optimization of each parallel pump unit is studied. The findings of the thesis support the idea that the energy efficiency of pumping can be increased without installing new, more efficient components in the systems, simply by adopting suitable control strategies.
An easily implementable and adaptive control strategy for variable speed controlled parallel pumping systems can be created by utilizing the pump operation point estimation available in modern frequency converters. Hence, additional real-time flow metering, start-up measurements, and a detailed system model are unnecessary, and the pumping task can be fulfilled by determining a speed reference for each parallel pump unit that promotes the energy efficient operation of the pumping system.
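Characteristic curve-based operation point estimation of the kind described above typically rests on the pump affinity laws: flow scales with rotational speed and head with speed squared, so the nominal QH curve can be rescaled to any speed and then inverted. The Python sketch below illustrates that step only; the curve coefficients and nominal speed are placeholders, not values from the thesis.

```python
import math

# Illustrative QH curve at nominal speed n0: H(Q) = a - b*Q^2
# (H in m, Q in m^3/h, n in rpm). A frequency converter knows n directly.

def pump_head(q, n, n0=1450.0, a=40.0, b=0.002):
    """Head on the nominal curve rescaled to speed n via the affinity laws:
    Q ~ n, H ~ n^2  =>  H(Q, n) = a*(n/n0)^2 - b*Q^2."""
    return a * (n / n0) ** 2 - b * q ** 2

def estimate_flow(head, n, n0=1450.0, a=40.0, b=0.002):
    """Invert the rescaled curve: estimate flow from measured head and speed."""
    shutoff = a * (n / n0) ** 2          # head at zero flow for this speed
    if head >= shutoff:
        return 0.0                        # operating at or beyond shutoff
    return math.sqrt((shutoff - head) / b)

q = estimate_flow(head=30.0, n=1450.0)   # at nominal speed
```

In practice the converter estimates head (or shaft power) from electrical quantities and the motor model; the point of the approach is that no external flow meter is needed.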
Abstract:
The Swedish public health care organisation could very well be undergoing its most significant change since its specialisation during the late 19th and early 20th century. At the heart of this change is a move from manual patient journals to electronic health records (EHR). EHR are complex, integrated, organisation-wide information systems (IS) that promise great benefits and value while also presenting great challenges to the organisation. The Swedish public health care is not the first organisation to implement integrated IS, and it is by no means alone in its quest to realise the potential benefits and value that they have to offer. As organisations invest in IS they embark on a journey of value creation and capture. A journey where a cost-based approach towards their IS investments is replaced with a value-centric focus, and where the main challenges lie in the practical day-to-day task of finding ways to intertwine technology, people and business processes. This has however proven to be a problematic task. The problematic situation arises from a shift of perspective regarding how to manage IS in order to gain value. This is a shift from technology delivery to benefits delivery; from an IS-implementation plan to a change management plan. The shift gives rise to challenges related to the intangibility of IS and the elusiveness of value. As a response to these challenges the field of IS-benefits management has emerged, offering a framework and a process in order to better understand and formalise benefits realisation activities. In this thesis the benefits realisation efforts of three Swedish hospitals within the same county council are studied. The thesis focuses on the participants of benefits analysis projects; their perceptions, judgments, negotiations and descriptions of potential benefits.
The purpose is to address the process whereby organisations seek to identify which potential IS-benefits to pursue and realise, in order to better understand what affects the process, so that realisation of potential IS-benefits can be supported. A qualitative case study research design is adopted and provides a framework for sample selection, data collection, and data analysis. It also provides a framework for discussions of validity, reliability and generalizability. The findings displayed a benefits fluctuation, which showed that participants’ perception of what constituted potential benefits and value changed throughout the formal benefits management process. Issues like structure, knowledge, expectation and experience affected perception differently, and this in the end changed the amount and composition of potential benefits and value. Five dimensions of benefits judgment were identified and used by participants when finding accommodations of potential benefits and value to pursue. The identified dimensions affected participants’ perceptions, which in turn affected the amount and composition of potential benefits. During the formal benefits management process participants shifted between judgment dimensions. These movements emerged through debates and interactions between participants. Judgments based on what was perceived as expected due to one’s role, and on what was perceived as best for the organisation as a whole, were the two dominant benefits judgment dimensions. A benefits negotiation was identified. Negotiations were divided into two main categories, rational and irrational, depending on participants’ drive when initiating and participating in negotiations. In each category three different types of negotiations were identified, having different characteristics and generating different outcomes. A benefits negotiation process was also identified that displayed management challenges corresponding to its five phases.
A discrepancy was also found between how IS-benefits are spoken of and how actions of IS-benefits realisation are understood. This was a discrepancy between an evaluation focus and a realisation focus towards IS value creation. An evaluation focus described IS-benefits as well-defined and measurable effects, while a realisation focus spoke of establishing and managing an on-going place of value creation. The notion of valuescape was introduced in order to describe and support the understanding of IS value creation. Valuescape corresponded to a realisation focus and outlined a value configuration consisting of activities, logic, structure, drivers and the role of IS.
Abstract:
A growing concern for organisations is how they should deal with increasing amounts of collected data. With fierce competition and smaller margins, organisations that are able to fully realize the potential of the data they collect can gain an advantage over their competitors. It is almost impossible to avoid imprecision when processing large amounts of data. Still, many of the available information systems are not capable of handling imprecise data, even though doing so can offer various advantages. Expert knowledge stored as linguistic expressions is a good example of imprecise but valuable data, i.e. data that is hard to pinpoint to a definitive value. There is an obvious concern among organisations about how this problem should be handled; finding new methods for processing and storing imprecise data is therefore a key issue. Additionally, it is equally important to show that tacit knowledge and imprecise data can be used with success, which encourages organisations to analyse their imprecise data. The objective of the research conducted was therefore to explore how fuzzy ontologies could facilitate the exploitation and mobilisation of tacit knowledge and imprecise data in organisational and operational decision making processes. The thesis introduces both practical and theoretical advances on how fuzzy logic, ontologies (fuzzy ontologies) and OWA operators can be utilized for different decision making problems. It is demonstrated how a fuzzy ontology can model tacit knowledge collected from wine connoisseurs. The approach can be generalised and applied to other practically important problems, such as intrusion detection. Additionally, a fuzzy ontology is applied in a novel consensus model for group decision making. By combining the fuzzy ontology with Semantic Web affiliated techniques, novel applications have been designed. These applications show how the mobilisation of knowledge can successfully utilize even imprecise data.
An important part of decision making processes is undeniably aggregation, which in combination with a fuzzy ontology provides a promising basis for demonstrating the benefits that one can gain from handling imprecise data. The new aggregation operators defined in the thesis often provide new possibilities to handle imprecision and expert opinions. This is demonstrated through both theoretical examples and practical implementations. This thesis shows the benefits of utilizing all the available data one possesses, including imprecise data. By combining the concept of fuzzy ontology with the Semantic Web movement, it aspires to show the corporate world and industry the benefits of embracing fuzzy ontologies and imprecision.
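An ordered weighted averaging (OWA) operator, the aggregation family referred to above, applies its weights to the inputs after sorting them, so the weights express an attitude (optimistic, pessimistic, mean-like) rather than attaching to particular experts. A minimal Python sketch with illustrative weights — the operators defined in the thesis are its own, not this one:

```python
def owa(values, weights):
    """OWA aggregation: weights are applied to the values sorted in
    descending order; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Four expert scores for one alternative, aggregated with an "or-like"
# weight vector that emphasises the more favourable opinions.
score = owa([0.6, 0.9, 0.3, 0.7], [0.4, 0.3, 0.2, 0.1])
```

With uniform weights OWA reduces to the plain mean; weight vectors like (1, 0, …, 0) or (0, …, 0, 1) recover max and min, which is why the family is a natural bridge between fuzzy connectives and averaging.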
Abstract:
Agile methods have become increasingly popular in the field of software engineering. While agile methods are now generally considered applicable to software projects of many different kinds, they have not been widely adopted in embedded systems development. This is partly due to the natural constraints present in embedded systems development (e.g. hardware–software interdependencies) that challenge the utilization of agile values, principles and practices. Research on agile embedded systems development has been very limited, and this thesis tackles an even less researched theme related to it: the suitability of different project management tools in agile embedded systems development. The thesis covers the basic aspects of many different agile tool types, from physical tools, such as task boards and cards, to web-based agile tools that offer all-round solutions for application lifecycle management. Between these two extremes there is also a wide range of lighter agile tools that focus on the core agile practices, such as backlog management. Other non-agile tools, such as bug trackers, can also be used to support agile development, for instance through plug-ins. To investigate the special tool requirements in agile embedded development, the author observed tool-related issues and solutions in a case study involving three different companies operating in the field of embedded systems development. All three companies had a distinct situation at the beginning of the case, and thus the tool solutions varied from a backlog spreadsheet built from scratch to plug-in development for an already existing agile software tool. Detailed reports are presented of all three tool cases. Based on the knowledge gathered from agile tools and the case study experiences, it is concluded that there are tool-related issues in the pilot phase, such as backlog management and user motivation. These can be overcome in various ways depending on the type of team in question.
Finally, five principles are formed to give guidelines for tool selection and usage in agile embedded systems development.
Abstract:
Mammalian spermatozoa gain their fertilizing ability during maturation in the epididymis. Proteins and lipids secreted into the epididymal lumen remodel the sperm membrane, thereby providing the structure necessary for progressive motility and oocyte interaction. In the current study, genetically modified mouse models were utilized to determine the role of novel genes and regulatory systems in the postnatal development and function of the epididymis. Ablation of the mouse β-defensin, Defb41, altered the flagellar movement of sperm and reduced the ability of sperm to bind to the oocyte in vitro. The Defb41-deficient iCre knock-in mouse model was furthermore utilized to generate Dicer1 conditional knock-out (cKO) mice. DICER1 is required for the production of mature microRNAs in the regulation of gene expression by RNA interference. Dicer1 cKO gave rise to dedifferentiation of the epididymal epithelium and altered expression of genes involved in lipid synthesis. As a consequence, the cholesterol:polyunsaturated fatty acid ratio of the Dicer1 cKO sperm membrane was increased, which resulted in membrane instability and infertility. In conclusion, the results of the Defb41 study further support the important role of β-defensin family members in sperm maturation. The regulatory role of Dicer1 was also shown to be required for epididymal development. In addition, the study is the first to show a clear connection between lipid homeostasis in the epididymis and sperm membrane integrity. Taken together, the results provide important new evidence on the regulatory system guiding epididymal development and function.
Abstract:
The study examined the marketing of automatic fire suppression systems from the perspectives of customer value and institutions. The objective of the study was to research the special features of the sales and marketing of fire suppression systems, and to find practical applications for sales and for the lobbying of a new fire suppression technology. The theoretical background of the study lies in the customer value literature and the theoretical concept of institutional entrepreneurship. The research was conducted as an electronic survey of three different groups of respondents: end customers, solution integrators, and re-sellers. From the answers, generalisations were drawn about customer value assessment and the communication of value in the sales and marketing processes of fire suppression systems. In addition, the ways the parties receive information about the systems were observed, as well as the effects institutions have on the decision making of the different parties involved. The findings of the study help companies launching a new safety technology onto the market to focus their marketing, and help them understand the institutional forces that affect a safety-related product.
Abstract:
With small and medium-sized enterprises (SMEs) making up the majority of global businesses, it is important that they act in an environmentally responsible manner. Environmental management systems (EMS) help companies evaluate and improve their environmental impact, but they often require human, financial, and time resources that not all SMEs can provide. This research encompasses interviews with representatives of two small enterprises in Germany to provide insights into their understanding and knowledge of an EMS and how they perceive their responsibility towards the environment. Furthermore, it presents a toolkit created especially for small and medium-sized enterprises that serves as a simplified version of an EMS based on the ISO 14001 standard and is evaluated by the representatives of the SMEs. Among the findings: while being open to the idea of improving their environmental impact, SMEs do not always feel it is their responsibility to do so, and they seem to lack the means to fully implement an EMS. The developed toolkit is considered useful and usable, and recommendations are drawn for its future enhancement.