923 results for Selecting Point-of-Sale Systems for Table Service Restaurants
Abstract:
The purpose of this thesis is to study factors that affect a company's ability to identify and analyze the value of digitalizing services during the early stages of the service development process, and to evaluate these factors from the perspective of a case company. The research problem was defined as: "How does the digitalization of services affect the delivery of the services of the future?" The research method was a qualitative case study that examined both the company's and the customers' sets of values. The study included a literature review and a development study. The empirical part consisted of analyzing three existing services, specifying a new digital service concept, and analyzing its feasibility as part of a business requirements phase. To understand the sets of values, 10 stakeholder interviews were conducted and earlier customer surveys were utilized; additionally, a number of meetings were held with the case company's representatives to develop the service concept and evaluate the findings. The impact of the early stages of the service development process was found to be reflected directly in the case company's ability to identify and create customer value, in line with the themes presented in the literature review. To specify the value achieved from digitalization, the following strategic background areas were examined in depth during the study: innovation, customer understanding, and business services. Based on the findings, the study aims to enhance the case company's ability to identify and evaluate the impact of digitalization on delivering the services of the future. Recognizing the value of a digital service before a development project begins is important for the businesses of both the customer and the provider. By exploring the various levels of digitalization, one can form an overall picture of the value gained from utilizing digital opportunities.
From the development perspective, the process of reviewing and discovering the most promising opportunities and solutions is the key step in delivering superior services. Ultimately, a company should understand how the value outcome of each individual service, as well as of its digital counterpart, is determined.
Abstract:
In recent years, the analysis and synthesis of (mechanical) control systems in descriptor form has been established. This general description of dynamical systems is important for many applications in mechanics and mechatronics, in electrical and electronic engineering, and in chemical engineering as well. This contribution deals with linear mechanical descriptor systems and their control design with respect to a quadratic performance criterion. Here, the notion of properness plays an important role in whether the standard Riccati approach can be applied as usual or not. Properness and non-properness distinguish between the cases in which the descriptor system is governed exclusively by the control input and those in which its higher-order time derivatives enter as well. In the unusual case of non-proper systems, a quite different optimal control design problem has to be considered. Both cases are solved completely.
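As a sketch of the setting described above, using generic symbols not taken from the contribution itself, a linear descriptor system with a quadratic performance criterion can be written as:

```latex
% Linear time-invariant descriptor system; E may be singular
% (for regular E, premultiplying by E^{-1} recovers a standard state-space model):
E \dot{x}(t) = A x(t) + B u(t), \qquad x(0) = x_0 ,
% Quadratic performance criterion to be minimized over the control input u:
J(u) = \int_0^{\infty} \left( x^{\mathsf{T}} Q x + u^{\mathsf{T}} R u \right) \mathrm{d}t ,
\qquad Q \succeq 0, \quad R \succ 0 .
```

In the standard case \(E = I\), the optimal feedback is the familiar LQR law \(u = -R^{-1} B^{\mathsf{T}} P x\), with \(P\) solving the algebraic Riccati equation \(A^{\mathsf{T}} P + P A - P B R^{-1} B^{\mathsf{T}} P + Q = 0\); the non-proper case discussed in the abstract is precisely the one this standard approach does not cover.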
Abstract:
The increasing complexity of controller systems applied in modern passenger cars requires adequate simulation tools. The toolset FASIM_C++, described in the following, uses complex vehicle models for three-dimensional vehicle dynamics simulation. The structure of the implemented dynamic models and the generation of the equations of motion using the method of kinematic differentials are explained briefly. After a short introduction to methods of event handling, several vehicle models and applications such as controller development, roll-over simulation, and real-time simulation are explained. Finally, some simulation results are presented.
Abstract:
The dynamics of flexible systems, such as robot manipulators, mechanical chains, or multibody systems in general, is becoming increasingly important in engineering. This article deals with some nonlinearities that arise in the study of the dynamics and control of multibody systems in connection with large rotations. Specifically, a numerical scheme that addresses the conservation of fundamental constants is presented in order to analyse control-structure interaction problems.
Abstract:
Presentation at the Nordic Perspectives on Open Access and Open Science seminar, Helsinki, October 15, 2013
Abstract:
The Swedish public health care organisation may well be undergoing its most significant change since its specialisation during the late 19th and early 20th centuries. At the heart of this change is a move from manual patient journals to electronic health records (EHR). EHR are complex, integrated, organisation-wide information systems (IS) that promise great benefits and value while presenting great challenges to the organisation. Swedish public health care is not the first organisation to implement integrated IS, and it is by no means alone in its quest to realise the potential benefits and value they have to offer. As organisations invest in IS, they embark on a journey of value creation and capture: a journey where a cost-based approach towards IS investments is replaced with a value-centric focus, and where the main challenges lie in the practical day-to-day task of finding ways to intertwine technology, people, and business processes. This has, however, proven to be a problematic task. The problems arise from a shift of perspective regarding how to manage IS in order to gain value: a shift from technology delivery to benefits delivery, and from an IS implementation plan to a change management plan. The shift gives rise to challenges related to the inability of IS alone to deliver value and to the elusiveness of value itself. In response to these challenges, the field of IS-benefits management has emerged, offering a framework and a process for better understanding and formalising benefits realisation activities. In this thesis, the benefits realisation efforts of three Swedish hospitals within the same county council are studied. The thesis focuses on the participants of benefits analysis projects: their perceptions, judgments, negotiations, and descriptions of potential benefits.
The purpose is to address the process in which organisations seek to identify which potential IS-benefits to pursue and realise, in order to better understand what affects this process, so that the realisation of potential IS-benefits can be supported. A qualitative case study research design is adopted, providing a framework for sample selection, data collection, and data analysis, as well as for discussions of validity, reliability, and generalisability. The findings revealed a benefits fluctuation: participants' perceptions of what constituted potential benefits and value changed throughout the formal benefits management process. Issues such as structure, knowledge, expectations, and experience affected perceptions differently, and this in turn changed the amount and composition of potential benefits and value. Five dimensions of benefits judgment were identified, which participants used when deciding which potential benefits and value to pursue. The identified dimensions affected participants' perceptions, which in turn affected the amount and composition of potential benefits. During the formal benefits management process, participants shifted between judgment dimensions; these movements emerged through debates and interactions between participants. Judgments based on what was perceived as expected due to one's role, and on what was perceived as best for the organisation as a whole, were the two dominant benefits judgment dimensions. A benefits negotiation was also identified. Negotiations were divided into two main categories, rational and irrational, depending on participants' motives when initiating and participating in negotiations. In each category, three different types of negotiation were identified, each with different characteristics and outcomes. A benefits negotiation process was also identified, displaying management challenges corresponding to its five phases.
A discrepancy was also found between how IS-benefits are spoken of and how actions of IS-benefits realisation are understood: a discrepancy between an evaluation focus and a realisation focus towards IS value creation. An evaluation focus described IS-benefits as well-defined and measurable effects, whereas a realisation focus spoke of establishing and managing an ongoing place of value creation. The notion of a valuescape was introduced in order to describe and support the understanding of IS value creation. The valuescape corresponded to a realisation focus and outlined a value configuration consisting of activities, logic, structure, drivers, and the role of IS.
Abstract:
This postgraduate thesis focuses on two documentary films, the Renaud Brothers' Dope Sick Love (2005) and Joonas Neuvonen's Reindeerspotting (2010). In both films, addicts are in the limelight: the directors follow addicts and their everyday lives on the streets with handheld cameras. This thesis analyzes and compares the narratives of these addicts. Do these documentaries conform to the existing conventions, according to which representations of addicts and addiction are always either miraculous survival stories or stories that end in utter decadence? The main questions this thesis poses are how the addicts and their narratives are handled in the two case study films, and how the drug cultures depicted in the films differ from each other: what changes between New York City and Rovaniemi in Northern Finland, and what remains the same? The academic and theoretical framework of this thesis draws on Susanna Helke's doctoral thesis Nanookin jälki, which discusses the history of documentary film methods and approaches. Additionally, the thinking is heavily informed by such poststructuralist writers as Roland Barthes, Michel Foucault, Louis Althusser, Stuart Hall, and Chris Weedon.
Abstract:
Cardiac troponins (cTn) I and T are the current gold standard biochemical markers in the diagnosis and risk stratification of patients with suspected acute coronary syndrome. During the past few years, novel assays capable of detecting cTn concentrations in >50% of apparently healthy individuals have become readily available. With the emergence of these high-sensitivity cTn assays, reductions in assay specificity have caused elevations in measured cTn levels that do not correlate with the clinical picture of the patient. The increased assay sensitivity may also reveal various analytical interference mechanisms. This doctoral thesis focused on developing nanoparticle-assisted immunometric assays that could be applied to an automated point-of-care system. The main objective was to develop minimally interference-prone assays for cTnI by employing recombinant antibody fragments. Fast 5- and 15-minute assays for cTnI and for D-dimer, a degradation product of fibrin, based on intrinsically fluorescent nanoparticles were introduced, highlighting the versatility of nanoparticles as universally applicable labels. The utilization of antibody fragments in different versions of the developed cTnI assay enabled reductions in the antibody amounts used without sacrificing assay sensitivity. In addition, the utilization of recombinant antibody fragments was shown to significantly decrease the measured cTnI concentrations in an apparently healthy population, as well as in samples containing known amounts of potentially interfering factors: triglycerides, bilirubin, rheumatoid factors, or human anti-mouse antibodies. When the specificity of four commercially available antibodies for cTnI was determined, two of the four cross-reacted with skeletal troponin I, but caused cross-reactivity issues in patient samples only when paired together.
In conclusion, the results of this thesis emphasize the importance of careful antibody selection when developing cTnI assays. The results with different recombinant antibody fragments suggest that the use of antibody fragments should be strongly encouraged in the immunoassay field, especially with analytes such as cTnI that require highly sensitive assay approaches.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Furthermore, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models in order to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools: we offer independent tools, tools integrated with other industry-leading tools, and complete tool chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
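To illustrate the general idea of generating tests from a behavioral model, a minimal sketch follows. This is illustrative only: the thesis works from UML models with industrial tool support, and the vending-machine model, its states, and its actions below are hypothetical examples, not taken from the thesis.

```python
# Sketch: model-based test generation from a behavioral model expressed
# as a labeled transition system (state -> {action: next_state}).
# The model below is a hypothetical vending machine, for illustration only.
MODEL = {
    "idle":       {"insert_coin": "paid"},
    "paid":       {"press_button": "dispensing", "refund": "idle"},
    "dispensing": {"take_item": "idle"},
}

def generate_tests(model, start, max_depth):
    """Enumerate all non-empty action sequences (test cases) of length
    <= max_depth that the model allows from the start state."""
    tests = []
    frontier = [(start, [])]          # (current state, path taken so far)
    while frontier:
        state, path = frontier.pop()
        if path:                      # every allowed prefix is a test case
            tests.append(path)
        if len(path) < max_depth:     # extend the path by each enabled action
            for action, nxt in model.get(state, {}).items():
                frontier.append((nxt, path + [action]))
    return tests

tests = generate_tests(MODEL, "idle", 3)
```

Each generated sequence would then be executed against the implementation, checking that it accepts exactly the action sequences the model allows; real model-based testing tools add coverage criteria and oracles on top of this kind of enumeration.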
Abstract:
In recent years, technological advancements in microelectronics and sensor technologies have revolutionized the field of electrical engineering. New manufacturing techniques have enabled a higher level of integration that combines sensors and electronics into compact and inexpensive systems. Previously, the challenge in measurements was to understand the operation of the electronics and sensors, but this has now changed: the challenge in measurement instrumentation lies in mastering the whole system, not just the electronics. To address this issue, this doctoral dissertation studies whether it is beneficial to consider a measurement system as a whole, from the physical phenomenon to the digital recording device, where each part of the measurement system affects overall performance, rather than as a system consisting of small independent parts, such as a sensor or an amplifier, that could be designed separately. The objective of this doctoral dissertation is to describe in depth the development of such measurement systems, taking into account the challenges posed by the electrical and mechanical requirements and by the measurement environment. The work is carried out as an empirical case study of two example applications, both intended for scientific studies: a light-sensitive biological sensor used in imaging, and a gas electron multiplier detector for particle physics. The study showed that in these two cases a number of different parts of the measurement system interacted with each other. Without considering these interactions, the reliability of the measurement may be compromised, which may lead to wrong conclusions about the measurement. For this reason, it is beneficial to conceptualize the measurement system as a whole, from the physical phenomenon to the digital recording device, where each part of the measurement system affects the system's performance.
The results serve as examples of how a measurement system can be successfully constructed to support the study of sensors and electronics.
Abstract:
Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages offered by time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC has been scarce due to the limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with advances in research on lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays as well. The aim of this thesis was to explore ways of utilizing time-resolved or anti-Stokes detection in POC applications. The long-lived fluorescence needed for time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis, the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending into the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation, thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for time-resolved fluorescence detection. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored.
The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to a current standard method based on measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefitting from the non-linear relationship between the excitation power and the emission intensity of the UCPs, and enabled amplification of the signal response from the indicator dye.
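The non-linear relationship between excitation power and emission intensity referred to above is commonly approximated by a power law; this is a textbook relation for multiphoton upconversion below saturation, not a result specific to this thesis:

```latex
% Emission intensity vs. excitation power density for an n-photon
% upconversion process (n absorbed NIR photons per emitted photon,
% typically n = 2 or 3, valid below saturation):
I_{\mathrm{em}} \propto P_{\mathrm{exc}}^{\,n}, \qquad n \geq 2 .
```

Because of this power law, attenuating the excitation reaching the phosphor by a factor \(a\) reduces the emission by roughly \(a^{n}\), which is why an absorbing indicator dye placed in the optical path can yield an amplified signal response.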
Abstract:
The literature on agency suggests different implications for the use of export intermediaries. However, only a few studies provide a view on import intermediaries. This thesis tries, for its part, to fill this research gap by studying import intermediaries in EU–Russia trade from a Russian industrial company's point of view. The aim is to describe import intermediation and explain the need for import intermediary companies in EU–Russia trade. The theoretical framework of this thesis originates from an article by Peng and York (2001), in which they study the performance of export intermediaries. Following their idea, this thesis applies resource-based theory, transaction cost theory, and agency cost theory. The resource-based approach is utilised to describe an ideal import intermediary company; transaction cost theory provides a basis for understanding the benefits of using the services of import intermediary companies; and agency cost theory is applied in order to understand the risks the Russian industrial company faces when it decides to use the services of import intermediaries. The study is performed in the form of a case interview with a representative of a major Russian metallurgy company. The results suggest that an ideal intermediary has the skills required specifically for the import process, which saves the principal company time and money. The intermediary helps reduce the amount of time the managers and staff of the principal company spend making imports possible, thus reducing salary costs and allowing the company to concentrate on its core competencies. The benefits of using the services of import intermediary companies are reduced transaction costs, especially salary costs, which are minimised because of the effectiveness and specialisation of import intermediaries.
Intermediaries are specialised in the import process and thus need less time and fewer resources to organise imports. They also help reduce fixed salary costs, because their services can be used only when needed. The risk of being misled by intermediaries is minimised by competition in the import intermediary market: if an intermediary attempts fraud, it is replaced by a rival.