992 results for automated dose dispensing service
Mapping the market potential of a company offering maintenance services, case: ABB Service Oy
Abstract:
The subject of this study is the mapping of market potential in the Satakunta region. The example company is ABB Service Oy, an international provider of maintenance services. The aim of the work is to determine, by means of market research, the company's market potential in the target area. The theoretical part covers the theory of how market potential is determined and presents the stages of marketing research and market research. The empirical part introduces both the target company and its industry. The qualitative telephone interview study conducted to determine the example company's market potential is reviewed in outline. As companies concentrate on their core competences and as industrial machines and equipment become increasingly automated, companies buy more and more maintenance services from outside providers. A maintenance company that exploits its specialized expertise therefore faces a growing market. The results of the study show that the example company is quite well known in the target area. Potential customers in the market must have three characteristics: interest, income, and the ability to buy the product. On these grounds, potential new customers fitting the target company's segment were identified. The companies that will succeed best in the Satakunta market are those willing to commit to that market.
Abstract:
Objective: The present study was aimed at evaluating the viability of replacing 18F with 99mTc in dose calibrator linearity testing. Materials and Methods: The test was performed with sources of 99mTc (62 GBq) and 18F (12 GBq) whose activities were measured down to values lower than 1 MBq. Ratios and deviations between experimental and theoretical 99mTc and 18F source activities were calculated and compared. Results: Mean deviations between experimental and theoretical 99mTc and 18F source activities were 0.56 (± 1.79)% and 0.92 (± 1.19)%, respectively. The mean ratio between the activities indicated by the device for the 99mTc source, as measured with the equipment pre-calibrated to measure 99mTc and 18F, was 3.42 (± 0.06), and for the 18F source this ratio was 3.39 (± 0.05); both values were constant over the measurement time. Conclusion: The results of the linearity test using 99mTc were compatible with those obtained with the 18F source, indicating the viability of utilizing either radioisotope in dose calibrator linearity testing. This, together with the high potential for radiation exposure and the costs involved in 18F acquisition, suggests 99mTc as the element of choice for dose calibrator linearity tests in centers that use 18F, without any detriment to the procedure or to the quality of the nuclear medicine service.
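Linearity testing compares calibrator readings against the activities predicted by radioactive decay. A minimal sketch of that comparison in Python (the half-lives are standard physical constants; function names are illustrative, not from the study):

```python
import math

# Physical half-lives in hours: Tc-99m ~ 6.01 h, F-18 ~ 1.83 h
HALF_LIFE_H = {"99mTc": 6.01, "18F": 1.83}

def theoretical_activity(a0_mbq, elapsed_h, isotope):
    """Decay-corrected activity A(t) = A0 * exp(-ln2 * t / T_half)."""
    t_half = HALF_LIFE_H[isotope]
    return a0_mbq * math.exp(-math.log(2) * elapsed_h / t_half)

def percent_deviation(measured_mbq, theoretical_mbq):
    """Deviation of a calibrator reading from the decay-corrected expectation."""
    return 100.0 * (measured_mbq - theoretical_mbq) / theoretical_mbq

# After one 99mTc half-life, a 62 GBq source should read half its initial activity
expected = theoretical_activity(62_000.0, 6.01, "99mTc")
```

The linearity test then repeats this comparison as the source decays over several orders of magnitude of activity.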
Abstract:
The acute administration of an indirect activator of the enzyme pyruvate dehydrogenase (PDH) in human athletes causes a reduction in blood lactate level during and after exercise. A single IV dose (2.5 m.kg-1) of dichloroacetate (DCA) was administered before a submaximal incremental exercise test (IET) with five velocity steps, from 5.0 m.s-1 for 1 min to 6.0, 6.5, 7.0 and 7.5 m.s-1 every 30 s, in four untrained mares. Blood samples were collected after exercise at 1, 3, 5, 10, 15 and 20 min. Blood lactate and glucose (mM) were determined electro-enzymatically using a YSI 2300 automated analyzer. There was a 15.3% decrease in mean total blood lactate, determined from the values obtained at all assessment times in both trials after the exercise. Blood lactate was lower at 1, 3, 5, 10, 15 and 20 min after exercise in the mares that received prior DCA treatment, with respective mean values of 6.31±0.90 vs 5.81±0.50, 6.45±1.19 vs 5.58±1.06, 6.07±1.56 vs 5.26±1.12, 4.88±1.61 vs 3.95±1.00, 3.66±1.41 vs 2.86±0.75 and 2.75±0.51 vs 2.04±0.30. There was no difference in glucose concentrations. By means of linear regression analysis, V140, V160, V180 and V200 (the velocities at which the heart rate is 140, 160, 180 and 200 beats/min, respectively) were determined. The velocities related to heart rate did not differ, indicating that there was no ergogenic effect, but prior administration of a relatively low dose of DCA in mares reduced lactatemia after an IET.
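The V140–V200 values come from fitting a line to heart rate versus velocity and inverting it at the target heart rate. A minimal sketch under that reading of the method (the numbers below are synthetic, not the study's data):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def velocity_at_hr(velocities, heart_rates, target_hr):
    """Invert the fitted HR = a*v + b line at a target heart rate (e.g. V140)."""
    a, b = fit_line(velocities, heart_rates)
    return (target_hr - b) / a

# Synthetic example: HR rises linearly as HR = 20*v + 40
v = [5.0, 6.0, 6.5, 7.0, 7.5]
hr = [140.0, 160.0, 170.0, 180.0, 190.0]
v140 = velocity_at_hr(v, hr, 140.0)
```

With real data the fit is not exact, but the inversion step is the same.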
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus give companies better readiness to face the challenges of cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data were collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (post entry into the invoice processing solution). Other objectives included looking into the longest lead times between process steps and the impact of manual process steps on cycle time. Processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices.
It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps and leveraging value-added services such as an invoice validation service, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
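The lead-time analysis described above amounts to computing, per invoice, the elapsed time between consecutive process steps and averaging per transition. A minimal process-mining-style sketch (the event-log shape and step names are illustrative assumptions, not the organizations' actual schemas):

```python
from collections import defaultdict
from datetime import datetime

def mean_transition_lead_times(events):
    """events: iterable of (invoice_id, step_name, timestamp).
    Returns the mean lead time in hours for each observed step-to-step transition."""
    traces = defaultdict(list)
    for invoice_id, step, ts in events:
        traces[invoice_id].append((ts, step))
    durations = defaultdict(list)
    for trace in traces.values():
        trace.sort()  # order each invoice's events by timestamp
        for (t0, s0), (t1, s1) in zip(trace, trace[1:]):
            durations[(s0, s1)].append((t1 - t0).total_seconds() / 3600.0)
    return {t: sum(d) / len(d) for t, d in durations.items()}

log = [
    ("INV-1", "received", datetime(2012, 3, 1, 9, 0)),
    ("INV-1", "matched",  datetime(2012, 3, 1, 9, 30)),
    ("INV-1", "approved", datetime(2012, 3, 2, 9, 30)),
]
lead_times = mean_transition_lead_times(log)
```

Sorting transitions by mean lead time then surfaces the bottleneck steps the research discusses.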
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an ever more competitive and efficient service provider while still providing unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not merely modern buzzwords; they describe urgent requirements for new kinds of competences and solutions. When the amount of information increases and the systems processing it become more efficient and intelligent, it is human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: What kind of information is created, possessed and utilized in the service context, and, even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship means viewing the service system from the architectural perspective. The selected perspective allows the relationship between design activities and operational activities to be analysed as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations and the testing of the design artifact's components in real service contexts.
The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value creation, working or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution centres on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. At the architectural level, service system-specific elements seem to disappear, while access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also have very practical implications for service providers. Personal, visual and hidden service activities, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to open new opportunities to improve activities and provide a new type of service potential for customers.
Abstract:
Digitalization has been predicted to change the future as a growing range of non-routine tasks is automated, offering new kinds of business models for enterprises. Service-oriented architecture (SOA) provides a basis for designing and implementing well-defined problems as reusable services, allowing computers to execute them. Service-oriented design has the potential to act as a mediator between IT and human resources, but enterprises struggle with their SOA adoption and lack a linkage between the benefits and costs of services. This thesis studies the phenomenon of service reuse in enterprises, proposing an ontology that conceptually links different kinds of services with their role as part of the business model. The proposed ontology was created on the basis of qualitative research conducted in three large enterprises. Service reuse has two roles in enterprises: it enables automated data sharing among human and IT resources, and it may provide cost savings in service development and operations. From a technical viewpoint, the ability to define a business problem as a service is one of the key enablers of service reuse. The research proposes two service identification methods: the first identifies prospective services in the existing documentation of the enterprise, and the second models the services from a functional viewpoint, supporting service identification sessions with business stakeholders.
Abstract:
To reduce costs, many companies outsource services that do not belong to their core competence to external service providers. This process is known as outsourcing. The resulting dependencies on the external providers are contractually regulated by means of Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Since the market has produced a multitude of different SLM tools, practical problems arise from proprietary SLA formats and missing specification methods. The result is tool dependence and limited reusability of already specified SLAs. This thesis develops an approach for platform-independent service level management. The goal is a unification of modeling, so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. Platform independence also gives the created models high long-term stability. Further goals of the thesis are to guarantee the reusability of modeled SLAs and to provide a process-oriented modeling methodology. Automated establishment of modeled SLAs is of decisive relevance for practical use. To achieve these goals, the principles of Model Driven Architecture (MDA) are applied to the problem domain of service level management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of Service Level Agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of MDA.
A suitable model transformation generates from an SLA pattern an SLA instance that contains all the necessary configuration information and is already in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of MDA. The establishment of the SLA instances, and the resulting configuration of the management system, corresponds to the Platform Specific Code (PSC) of MDA. After this step, the management system is able to monitor the quality-of-service parameters agreed in the SLA autonomously. As part of the thesis, a UML extension was defined that enables SLA patterns to be modeled with a UML tool; the modeling can be done purely graphically or with the inclusion of the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated by means of a case study.
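The PIM-to-PSM step can be pictured as binding a configuration-independent SLA pattern to concrete values, yielding an SLA instance in the target platform's format. A deliberately tiny Python sketch of that idea (the template syntax and field names are illustrative, not the thesis's notation):

```python
from string import Template

def instantiate_sla(pattern, config):
    """Bind an SLA pattern (PIM) to configuration values, producing an
    SLA instance (PSM). Raises KeyError if a parameter is left unbound."""
    return Template(pattern).substitute(config)

# A pattern for an availability guarantee; the placeholders are its configuration points
pattern = "availability >= ${availability}% measured over ${period}"
instance = instantiate_sla(pattern, {"availability": "99.5", "period": "1 month"})
```

In the thesis the transformation additionally targets a specific management platform's SLA format, which this string-level sketch glosses over.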
Abstract:
Objectives: To assess the impact of a closed-loop electronic prescribing, automated dispensing, barcode patient identification and electronic medication administration record (EMAR) system on prescribing and administration errors, confirmation of patient identity before administration, and staff time. Design, setting and participants: Before-and-after study in a surgical ward of a teaching hospital, involving patients and staff of that ward. Intervention: Closed-loop electronic prescribing, automated dispensing, barcode patient identification and EMAR system. Main outcome measures: Percentage of new medication orders with a prescribing error, percentage of doses with medication administration errors (MAEs) and percentage given without checking patient identity. Time spent prescribing and providing a ward pharmacy service. Nursing time on medication tasks. Results: Prescribing errors were identified in 3.8% of 2450 medication orders pre-intervention and 2.0% of 2353 orders afterwards (p<0.001; χ2 test). MAEs occurred in 7.0% of 1473 non-intravenous doses pre-intervention and 4.3% of 1139 afterwards (p = 0.005; χ2 test). Patient identity was not checked for 82.6% of 1344 doses pre-intervention and 18.9% of 1291 afterwards (p<0.001; χ2 test). Medical staff required 15 s to prescribe a regular inpatient drug pre-intervention and 39 s afterwards (p = 0.03; t test). Time spent providing a ward pharmacy service increased from 68 min to 98 min each weekday (p = 0.001; t test); 22% of drug charts were unavailable pre-intervention. Time per drug administration round decreased from 50 min to 40 min (p = 0.006; t test); nursing time on medication tasks outside of drug rounds increased from 21.1% to 28.7% (p = 0.006; χ2 test). Conclusions: A closed-loop electronic prescribing, dispensing and barcode patient identification system reduced prescribing errors and MAEs, and increased confirmation of patient identity before administration. 
Time spent on medication-related tasks increased.
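The before-and-after error rates above are compared with Pearson χ² tests on 2×2 tables. A minimal sketch of that computation (the counts are reconstructed from the reported percentages and denominators, so they are approximate):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]
    (e.g. rows = before/after, columns = error / no error)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Prescribing errors: ~3.8% of 2450 orders before, ~2.0% of 2353 after
stat = chi2_2x2(93, 2357, 47, 2306)
# A statistic above 3.84 (the 5% critical value at 1 df) indicates significance;
# here it is far above, consistent with the reported p < 0.001
```

The same table shape applies to the MAE and patient-identity comparisons in the study.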
Abstract:
Notification Services mediate between information publishers and consumers that wish to subscribe to periodic updates. In many cases, however, there is a mismatch between the dissemination of these updates and the delivery preferences of the consumer, often in terms of frequency of delivery, quality, etc. In this paper, we present an automated negotiation engine that identifies mutually acceptable terms; we study its performance, and discuss its application to a Grid Notification Service. We also demonstrate how the negotiation engine enables users to control the Quality of Service levels they require.
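The negotiation engine's core idea, finding mutually acceptable delivery terms, can be caricatured as a concession protocol over a single numeric term such as the update interval. A toy sketch only (the real engine negotiates multiple QoS terms; all names here are illustrative):

```python
def negotiate(consumer_offer, publisher_offer, tolerance, max_rounds=50):
    """Alternating concessions: each side moves halfway toward the other's
    latest offer; agreement is reached when the gap is within tolerance."""
    a, b = consumer_offer, publisher_offer
    for _ in range(max_rounds):
        if abs(a - b) <= tolerance:
            return (a + b) / 2  # mutually acceptable term
        a += (b - a) / 2  # consumer concedes
        b += (a - b) / 2  # publisher concedes
    return None  # no agreement within the round limit

# Consumer wants updates every 10 s; publisher offers every 60 s
agreed = negotiate(10.0, 60.0, tolerance=1.0)
```

A real engine would also weigh utility functions per term rather than conceding mechanically.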
Abstract:
Service provisioning is a challenging research area for the design and implementation of autonomic service-oriented software systems. It includes automated QoS management for such systems and their applications. Monitoring, diagnosis and repair are three key features of QoS management. This work presents a self-healing Web service-based framework that manages QoS degradation at runtime. Our approach is based on proxies. Proxies act on meta-level communications and extend the HTTP envelope of the exchanged messages with QoS-related parameter values. QoS data are filtered over time and analysed using statistical functions and a Hidden Markov Model. Detected QoS degradations are handled by proxies. We evaluated our framework using an orchestrated electronic shop application (FoodShop).
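As a stand-in for the statistical filtering the framework applies to QoS data, here is a minimal sliding-window monitor that flags degradation when the windowed mean response time exceeds an agreed threshold (the paper's HMM-based analysis is more sophisticated; this only illustrates the monitoring step, and all names are illustrative):

```python
from collections import deque

class QosMonitor:
    """Flags QoS degradation when the mean of the last `window` response
    times exceeds the agreed threshold."""
    def __init__(self, threshold_ms, window=5):
        self.threshold_ms = threshold_ms
        self.samples = deque(maxlen=window)

    def observe(self, response_ms):
        """Record one measurement; return True if degradation is detected."""
        self.samples.append(response_ms)
        return sum(self.samples) / len(self.samples) > self.threshold_ms

monitor = QosMonitor(threshold_ms=100, window=3)
# Healthy readings stay below threshold; a sustained spike trips the monitor
```

A proxy embedding such a monitor could then trigger the repair actions the framework describes.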
Abstract:
NEWEST (Neoadjuvant Endocrine Therapy for Women with Estrogen-Sensitive Tumors) is the first study to compare biological and clinical activity of fulvestrant 500 versus 250 mg in the neoadjuvant breast cancer setting. We hypothesized that fulvestrant 500 mg may be superior to 250 mg in blocking estrogen receptor (ER) signaling and growth. A multicenter, randomized, open-label, Phase II study was performed to compare fulvestrant 500 mg (500 mg/month plus 500 mg on day 14 of month 1) versus fulvestrant 250 mg/month for 16 weeks prior to surgery in postmenopausal women with ER+ locally advanced breast cancer. Core biopsies at baseline, week 4, and surgery were assessed for biomarker changes. Primary endpoint: change in Ki67 labeling index (LI) from baseline to week 4 determined by automated computer imaging system (ACIS). Secondary endpoints: ER protein expression and function; progesterone receptor (PgR) expression; tumor response; tolerability. ER and PgR were examined retrospectively using the H score method. A total of 211 patients were randomized (fulvestrant 500 mg: n = 109; 250 mg: n = 102). At week 4, fulvestrant 500 mg resulted in greater reduction of Ki67 LI and ER expression versus 250 mg (-78.8 vs. -47.4% [p < 0.0001] and -25.0 vs. -13.5% [p = 0.0002], respectively [ACIS]); PgR suppression was not significantly different (-22.7 vs. -17.6; p = 0.5677). However, H score detected even greater suppression of ER (-50.3 vs. -13.7%; p < 0.0001) and greater PgR suppression (-80.5 vs. -46.3%; p = 0.0018) for fulvestrant 500 versus 250 mg. At week 16, tumor response rates were 22.9 and 20.6% for fulvestrant 500 and 250 mg, respectively, with considerable decline in all markers by both ACIS and H score. No detrimental effects on endometrial thickness or bone markers and no new safety concerns were identified. This provides the first evidence of greater biological activity for fulvestrant 500 versus 250 mg in depleting ER expression, function, and growth.
Abstract:
The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware. Pervasive environments exhibit a degree of heterogeneity, variability, and dynamicity that conventional middleware solutions cannot adequately manage. Originally designed for use in relatively static contexts, such middleware systems tend to hide low-level details to give applications a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in adapting their behavior to frequent changes in the execution context; that is, it should become context-aware. In particular, this thesis has identified the following key requirements for novel context-aware middleware that existing solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that can describe interacting entities, their operating conditions and the surrounding world, i.e., their context, according to an unambiguous semantics. (ii) Middleware solutions should support distributed applications in reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices.
Semantic metadata provide powerful knowledge representation means to model even complex context information, and enable automated reasoning to infer additional and/or more complex knowledge from available context data. In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to differentiated users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of them, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
Abstract:
This paper presents an automated solution for precise detection of fiducial screws in three-dimensional (3D) Computed Tomography (CT)/Digital Volume Tomography (DVT) data for image-guided ENT surgery. Unlike previously published solutions, we regard the detection of the fiducial screws in the CT/DVT volume data as a pose estimation problem and have developed a model-based solution. Starting from a user-supplied initialization, our solution detects the fiducial screws by iteratively matching a computer-aided design (CAD) model of the fiducial screw to features extracted from the CT/DVT data. We validated our solution on one conventional CT dataset and on five DVT volume datasets, detecting a total of 24 fiducial screws. Our experimental results indicate that the proposed solution achieves much higher reproducibility and precision than manual detection. Further comparison shows that the proposed solution produces better results on the DVT datasets than on the conventional CT dataset.
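The iterative model-to-features matching can be illustrated, in a drastically simplified form, by a translation-only ICP-style loop in 2D: repeatedly pair each model point with its nearest feature point and shift the model by the mean residual. This is a caricature of the CAD-model matching described above, not the paper's algorithm:

```python
def icp_translation(model, features, iterations=20):
    """Estimate the 2D translation aligning `model` points to `features`
    by iterated nearest-neighbour matching and mean-residual updates."""
    tx, ty = 0.0, 0.0  # a user-supplied initialization would seed these
    for _ in range(iterations):
        dxs, dys = [], []
        for mx, my in model:
            px, py = mx + tx, my + ty  # model point under current estimate
            nx, ny = min(features, key=lambda f: (f[0] - px) ** 2 + (f[1] - py) ** 2)
            dxs.append(nx - px)
            dys.append(ny - py)
        tx += sum(dxs) / len(dxs)  # shift by the mean residual
        ty += sum(dys) / len(dys)
    return tx, ty

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
features = [(2.0, 3.0), (3.0, 3.0), (2.0, 4.0)]  # model shifted by (2, 3)
tx, ty = icp_translation(model, features)
```

The real problem is a full 3D pose (rotation plus translation) matched against image features, but the iterate-match-update structure is the same.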