890 results for product service systems
Abstract:
Objective: To identify the main causes of underreporting of adverse drug reactions (ADR) by health professionals. Method: A systematic review was carried out in the LILACS, PAHO, SciELO, EMBASE and PubMed databases covering the period from 1992 to 2012. Descriptors were used in the article search, and the identified causes of underreporting were analyzed according to Inman's classification. Results: In total, 149 articles were identified, of which 29 were selected. Most studies were carried out in hospitals (24/29) and targeted physicians (22/29) and pharmacists (10/29). The main causes of underreporting were ignorance (24/29), insecurity (24/29) and indifference (23/29). Conclusion: The data point to an eighth sin of underreporting, namely the lack of training in pharmacovigilance. Continuing education can therefore increase professionals' adherence to reporting and improve knowledge and communication of risks associated with drug use.
Abstract:
This study aimed to identify pharmacoeconomic studies in pharmacovigilance and to examine the economic outcomes of post-marketing surveillance. A bibliographic survey was performed in the Lilacs and PubMed/Bireme databases. The search strategy used the health science descriptors ["adverse drug reaction reporting systems" OR "medication errors" OR "product surveillance, postmarketing" OR "sentinel surveillance"] AND ["cost-benefit analysis" OR "cost efficiency analysis" OR "costs and cost analysis" OR "hospital costs" OR "cost-effectiveness" OR "cost-effectiveness evaluation" OR "drug costs"]. Manuscripts published in the last 10 years were selected. Thirteen articles were chosen, of which 12 corresponded to cost-benefit analyses and only one to a cost-effectiveness assessment. Only one study reported no savings; all the others generated savings, ranging from 13.7% to 30% of the expenditure of the service assessed. The surveillance actions were: continuing education; active search through tracking devices and/or the implementation of rounds; teamwork and multidisciplinary deployment; and computerized management of safety services, enabling traceability of information and alerts. The proposed actions led to the prevention of adverse drug reactions, a reduction of risks to the patient, a reduction of inappropriate prescriptions, and shorter hospital stays.
Abstract:
The aim of this paper is to propose a classification of reverse logistics systems based on the activities used to recover value from returned products. Case studies were carried out in three Brazilian companies. The results show that Company 1 uses a 'disposal logistics system', with 'end of life' as the main reason for returns and 'legislation' as the main motivation; Company 2 uses a 'recycling logistics system', with 'products not sold' as the main reason for returns and 'recovery of assets and value' as the main motivation; finally, Company 3 uses a 'product reprocessing logistics system', with 'end of life' as the main reason for returns and 'social and environmental responsibility' as the main motivation.
Abstract:
Software product line (SPL) engineering offers several advantages in the development of families of software products, such as reduced costs, high quality and short time to market. A software product line is a set of software-intensive systems, each of which shares a common core set of functionalities but also differs from the other products through customization tailored to the needs of individual groups of customers. The differences between products within the family are well understood and organized into a feature model that represents the variability of the SPL. Products can then be built by generating and composing features described in the feature model. Testing of software product lines has become a bottleneck in the SPL development lifecycle, since many of the techniques used in their testing have been borrowed from traditional software testing and do not directly take advantage of the similarities between products. This limits the overall gains that can be achieved in SPL engineering. Recent work proposed by both industry and the research community for improving SPL testing has begun to consider this problem, but there is still a need for better testing techniques tailored to SPL development. In this thesis, I make two primary contributions to software product line testing. First, I propose a new definition of testability for SPLs, based on the ability to re-use test cases between products without a loss of fault-detection effectiveness. I build on this idea to identify elements of the feature model that contribute positively and/or negatively to SPL testability. Second, I provide a graph-based testing approach called the FIG Basis Path method, which selects products and features for testing based on a feature dependency graph. This method should increase our ability to re-use test case results across successive products in the family and reduce testing effort. I report the results of a case study involving several non-trivial SPLs and show that for these objects the FIG Basis Path method is as effective as testing all products, but requires testing no more than 24% of the products in the SPL.
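The abstract does not detail how the FIG Basis Path method traverses the feature dependency graph, so the following is only a minimal sketch of the underlying idea: cover the paths of a feature dependency graph instead of enumerating every product. The graph, feature names and product-derivation rule below are hypothetical.

```python
# Minimal sketch (not the thesis's actual algorithm): derive a small set of
# products to test by covering root-to-leaf paths of a feature dependency
# graph. Graph, feature names and derivation rule are hypothetical.

# Edges point from a feature to the features that depend on it.
DEPENDS = {
    "base":        ["network", "storage"],
    "network":     ["encryption"],
    "storage":     ["compression", "encryption"],
    "encryption":  [],
    "compression": [],
}

def root_to_leaf_paths(graph, root="base"):
    """Enumerate simple paths from the root feature to each leaf feature."""
    paths, stack = [], [(root, [root])]
    while stack:
        node, path = stack.pop()
        if not graph[node]:                 # leaf feature: path is complete
            paths.append(path)
        for child in graph[node]:
            if child not in path:           # guard against cycles
                stack.append((child, path + [child]))
    return paths

def products_for_testing(graph, root="base"):
    """Each covered path induces one product: the set of features on it."""
    return [set(p) for p in root_to_leaf_paths(graph, root)]

for product in products_for_testing(DEPENDS):
    print(sorted(product))
```

Covering the three root-to-leaf paths exercises every feature and every dependency while testing far fewer configurations than the full product space, which is the kind of saving the 24% figure above refers to.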
Abstract:
The service sector has acquired growing importance in every national economy, which has stimulated research in the field of service innovation, a new area of management studies. This text aimed to set out a research agenda on service innovation, based on an articulated discussion of the results of the articles that make up the state of the art of this concept. 73 empirical articles were analyzed: 33% of them explore innovation strategies and technology; 18% describe research on economic performance and enterprise productivity; 16% relate to antecedents and determinants of innovation; another 16% address network capacity development, alliances and collaboration among organizations; 9% explore service quality, innovation taxonomy, flexible systems and regional innovation systems; and the remaining 8% relate to themes such as knowledge intensity and research and development. The studies were concentrated in the Engineering & Technology and Hospitality industries, which accounted for 31% and 24% of the texts, respectively. The remaining 45% of the articles referred to sectors such as Telecommunications, Health, Retail, Financial & Insurance and Public Services. The main gaps identified in these texts concern the difficulty of measuring service innovation and the small number of studies on the public sector. Finally, a research agenda on the subject is presented, including the development of a scale for orienting innovation and the identification of the determining factors of innovation in the public sphere.
Abstract:
This study evaluated the five-year clinical performance of ceramic inlays and onlays made with two systems: sintered Duceram (Dentsply-Degussa) and pressable IPS Empress (Ivoclar Vivadent). Eighty-six restorations were placed by a single operator in 35 patients with a median age of 33 years. The restorations were cemented with dual-cured resin cement (Variolink II, Ivoclar Vivadent) and Syntac Classic adhesive under rubber dam. Evaluations were conducted by two independent investigators at baseline and at one, two, three, and five years using the modified United States Public Health Service (USPHS) criteria. At the five-year recall, 26 patients (74.28%) were evaluated, totalling 62 restorations (72.09%). Four IPS Empress restorations had fractured, two restorations presented secondary caries (one IPS Empress and one Duceram), and two restorations showed unacceptable defects at the restoration margin and needed replacement (one from each ceramic system). An overall success rate of 87% was recorded. The Fisher exact test revealed no significant difference between the Duceram and IPS Empress systems for any aspect evaluated at the different recall appointments (p>0.05). The McNemar chi-square test showed significant differences in marginal discoloration, marginal integrity, and surface texture between baseline and the five-year recall for both systems (p<0.001), with an increased percentage of Bravo scores. However, few Charlie or Delta scores were attributed to these restorations. In conclusion, both ceramic materials demonstrated acceptable clinical performance after five years.
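As a hedged illustration of the statistics reported above, the paired McNemar comparison (baseline versus five-year scores, dichotomized as Alfa versus Bravo-or-worse) and the between-system Fisher exact test could be run as follows; all counts are invented for illustration, not taken from the study.

```python
# Illustrative sketch: paired comparison of dichotomized USPHS scores
# (Alfa vs. Bravo-or-worse) at baseline and at the 5-year recall.
# All 2x2 counts below are invented, NOT the study's data.
from scipy.stats import fisher_exact
from statsmodels.stats.contingency_tables import mcnemar

# Rows: baseline (Alfa, Bravo+); columns: 5-year recall (Alfa, Bravo+).
paired_counts = [[30, 25],   # Alfa at baseline
                 [ 2,  5]]   # Bravo+ at baseline
print(mcnemar(paired_counts, exact=False, correction=True))

# Unpaired comparison between the two ceramic systems at one recall:
# rows = system (Duceram, IPS Empress); columns = (Alfa, Bravo+).
odds_ratio, p_value = fisher_exact([[28, 4], [24, 6]])
print(f"Fisher exact: OR={odds_ratio:.2f}, p={p_value:.3f}")
```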
Abstract:
The basic needs of companies tend to be the same whether the company is large or small: the infrastructure on which they build their business processes and the applications that manage them are usually almost identical. If we divide a company's ICT infrastructure into hardware, system and applications, we can see that in most companies the system layer is nearly the same. Moreover, thanks to virtualization, which has swept into the world of computing, we can completely decouple software from hardware, gaining enormous flexibility when planning infrastructure deployments. This final-year project (TFG) builds on these two ideas: system uniformity and hardware independence. For the first, the basic infrastructure (system) that any company typically has is studied; a solution valid for a large number of companies in our environment is proposed and its design is carried out. For the second, a service-based system is developed that is complete enough to meet the identified needs yet flexible enough that capacity or services can grow easily without modifying the system structure or its modules. The design is therefore integral and complete, covering both hardware and software, with emphasis on the integration of the systems and the interrelation between their elements; an economic assessment of the design is also provided. Finally, as an example of the flexibility of the chosen design, two modifications of the original design are presented. The first is an extension providing greater security through storage redundancy and, as a final step, a remote data center (CPD). The second is a low-cost design that keeps the same services while lowering the cost with somewhat less capable products, yet maintains high overall levels of quality and service.
Abstract:
Service Oriented Computing is a new programming paradigm for addressing distributed system design issues. Services are autonomous computational entities that can be dynamically discovered and composed to form more complex systems able to achieve different kinds of tasks. E-government, e-business and e-science are some examples of the IT areas where Service Oriented Computing will be exploited in the coming years. At present, the most established Service Oriented Computing technology is Web Services, whose specifications are enriched day by day by industrial consortia without following a precise and rigorous approach. This PhD thesis aims, on the one hand, to model Service Oriented Computing formally in order to precisely define the main concepts it is based upon and, on the other hand, to define a new approach, called the bipolar approach, for addressing system design issues by synergically exploiting choreography and orchestration languages related by means of a mathematical relation called conformance. Choreography allows us to describe systems of services from a global viewpoint, whereas orchestration addresses the same issue from a local perspective. In this work we present SOCK, a process-algebra-based language inspired by the Web Service orchestration language WS-BPEL, which captures the essentials of Service Oriented Computing. From the definition of SOCK we are able to define a general model for Service Oriented Computing in which services and systems of services correspond to finite state automata and process-algebra concurrent systems, respectively. Furthermore, we introduce a formal language for choreography, equipped with a formal semantics, which forms, together with a subset of the SOCK calculus, the bipolar framework. Finally, we present JOLIE, a Java implementation of a subset of the SOCK calculus that is part of the bipolar framework we intend to promote.
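The choreography/orchestration distinction and the conformance relation can be illustrated with a toy model; this is a sketch of the general idea only, not SOCK's or JOLIE's actual semantics. A choreography is a global sequence of interactions, each orchestrator sees only its own sends and receives, and a naive conformance check compares the choreography's projection onto each role with that role's declared local behaviour.

```python
# Toy illustration of choreography vs. orchestration (NOT SOCK/JOLIE semantics).
# A choreography is a global list of interactions (sender, receiver, message);
# an orchestration is, per role, the local list of send/receive actions.

choreography = [
    ("client", "shop",   "order"),
    ("shop",   "bank",   "charge"),
    ("bank",   "shop",   "ok"),
    ("shop",   "client", "receipt"),
]

def project(chor, role):
    """Project the global choreography onto one role's local actions."""
    local = []
    for sender, receiver, msg in chor:
        if sender == role:
            local.append(("send", receiver, msg))
        elif receiver == role:
            local.append(("recv", sender, msg))
    return local

# Local behaviour declared by one orchestrator (hypothetical).
orchestrations = {
    "shop": [("recv", "client", "order"), ("send", "bank", "charge"),
             ("recv", "bank", "ok"), ("send", "client", "receipt")],
}

# Naive conformance: the declared local behaviour must equal the projection.
for role, local in orchestrations.items():
    assert project(choreography, role) == local, f"{role} does not conform"
print("all declared orchestrators conform to the choreography")
```

A real conformance relation works over branching and concurrent behaviour rather than a single fixed trace, but the projection step is the same intuition.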
Abstract:
The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware. Pervasive environments exhibit a significant degree of heterogeneity, variability, and dynamicity that conventional middleware solutions are not able to manage adequately. Originally designed for use in a relatively static context, such middleware systems tend to hide low-level details in order to provide applications with a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in adapting their behavior to frequent changes in the execution context; that is, it should become context-aware. In particular, this thesis has identified the following key requirements for novel context-aware middleware that existing solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that can describe interacting entities, their operating conditions and the surrounding world, i.e., their context, according to an unambiguous semantics. (ii) Middleware solutions should support distributed applications in reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices. Semantic metadata provide powerful knowledge-representation means to model even complex context information, and allow automated reasoning to infer additional and/or more complex knowledge from available context data. In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to different users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all these applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of them, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
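The core mechanism claimed above, representing context as semantic metadata and inferring new knowledge from it, can be sketched minimally as follows. The triple vocabulary and the single inference rule are hypothetical illustrations, not the thesis's actual metadata model.

```python
# Minimal sketch of semantic context metadata with rule-based inference
# (vocabulary and rule are hypothetical, not the thesis's actual model).

# Context represented as (subject, predicate, object) triples.
context = {
    ("alice",    "locatedIn", "room42"),
    ("room42",   "partOf",    "building7"),
    ("printer1", "locatedIn", "building7"),
}

def infer_located_in(triples):
    """Rule: locatedIn(x, y) and partOf(y, z) => locatedIn(x, z)."""
    derived = set(triples)
    changed = True
    while changed:                          # iterate to a fixed point
        changed = False
        new = {(x, "locatedIn", z)
               for (x, p1, y1) in derived if p1 == "locatedIn"
               for (y2, p2, z) in derived if p2 == "partOf" and y1 == y2}
        if not new <= derived:
            derived |= new
            changed = True
    return derived

facts = infer_located_in(context)
# Context-aware adaptation: offer Alice only devices in her inferred location.
nearby = [s for (s, p, o) in facts
          if p == "locatedIn" and o == "building7" and s != "alice"]
print(nearby)   # ['printer1'], because alice's building was inferred
```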
Abstract:
A prevalent claim is that we live in a knowledge economy. When we talk about the knowledge economy, we generally mean the concept of a "knowledge-based economy", indicating the use of knowledge and technologies to produce economic benefits. Knowledge is thus both a tool and a raw material (people's skills) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes: authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only few constraints on the set of coordination mechanisms. Hence what characterizes a knowledge economy is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the problem of the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the work and assigning it to the most appropriate individuals becomes arduous, a "supervisory problem" (Hodgson, 1999) emerges, and traditional hierarchical control may become increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because 'the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better' (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, in particular Peer-to-Peer (P2P) networks, and organization theories. We think that the P2P paradigm fits well with organization problems arising in all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. There are three main characteristics we think P2P networks have in common with firms involved in the knowledge economy:
- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of individual peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them.
- Self-organization: the process in a system leading to the emergence of global order within the system, without the presence of another system dictating this order.
These characteristics are also present in the kind of firm we address, which is why we have transferred the techniques we adopted in computer science studies (Marcozzi et al., 2005; Hales et al., 2007 [39]) to management science.
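The P2P methodology referred to above is typically simulation-based. Below is a deliberately small, hypothetical sketch, not the model of the cited papers, of what self-organization without central authority looks like in code: peers on a ring imitate the strategy of a better-performing neighbour, and the global behaviour emerges from purely local decisions.

```python
# Hypothetical sketch of decentralized self-organization (not the cited model):
# peers repeatedly imitate the strategy of a better-performing neighbour.
import random

random.seed(1)
N, ROUNDS = 100, 50
peers = [{"coop": random.random() < 0.5, "payoff": 0.0} for _ in range(N)]
neighbours = lambda i: [(i - 1) % N, (i + 1) % N]      # ring topology

for _ in range(ROUNDS):
    # 1. Local interaction: cooperators pay a cost to benefit each neighbour.
    for p in peers:
        p["payoff"] = 0.0
    for i, p in enumerate(peers):
        if p["coop"]:
            p["payoff"] -= 0.1                          # cost of helping
            for j in neighbours(i):
                peers[j]["payoff"] += 0.3               # benefit received
    # 2. Local adaptation: imitate a random neighbour if it did better.
    for i, p in enumerate(peers):
        j = random.choice(neighbours(i))
        if peers[j]["payoff"] > p["payoff"]:
            p["coop"] = peers[j]["coop"]

print("fraction of cooperators:", sum(p["coop"] for p in peers) / N)
```

No peer sees more than its two neighbours, yet a population-level pattern of cooperation or defection emerges, which is the sense of "self-organization" the abstract invokes.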
Abstract:
In recent years, due to the rapid convergence of multimedia services, the Internet and wireless communications, there has been a growing trend of heterogeneity (in terms of channel bandwidths, mobility levels of terminals, and end-user quality-of-service (QoS) requirements) in emerging integrated wired/wireless networks. Moreover, in today's systems a multitude of users coexists within the same network, each with his own QoS requirement and bandwidth availability. In this framework, embedded source coding allowing partial decoding at various resolutions is an appealing technique for multimedia transmission. This dissertation covers my PhD research, mainly devoted to the study of embedded multimedia bitstreams in heterogeneous networks, developed at the University of Bologna, advised by Prof. O. Andrisano and Prof. A. Conti, and at the University of California, San Diego (UCSD), where I spent eighteen months as a visiting scholar, advised by Prof. L. B. Milstein and Prof. P. C. Cosman. To improve multimedia transmission quality over wireless channels, joint source and channel coding optimization is investigated in a 2D time-frequency resource block for an OFDM system. We show that knowing the order of diversity in the time and/or frequency domain can assist image (video) coding in selecting optimal channel code rates (source and channel code rates). Adaptive modulation techniques, aimed at maximizing spectral efficiency, are then investigated as another possible solution for improving multimedia transmission. For both slow and fast adaptive modulation, the effects of imperfect channel estimation are evaluated, showing that the fast technique, optimal in ideal systems, might be outperformed by slow adaptive modulation when a realistic test case is considered. Finally, the effects of co-channel interference and approximated bit error probability (BEP) are evaluated for adaptive modulation techniques, providing new decision-region concepts and showing how the widely used BEP approximations lead to a substantial loss in overall performance.
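Adaptive modulation of the kind discussed above picks, for each measured SNR, the largest constellation whose predicted BEP stays below a target. The sketch below uses a standard textbook approximation for M-QAM BEP, roughly 0.2·exp(−1.5·γ/(M−1)); this formula is an assumption for illustration, and not necessarily the approximation analyzed in the dissertation.

```python
# Sketch of slow adaptive modulation: choose the largest M-QAM constellation
# whose approximate BEP meets a target, using the textbook approximation
# BEP ~ 0.2*exp(-1.5*snr/(M-1)) (an assumption here, not the thesis's model).
import math

TARGET_BEP = 1e-3
CONSTELLATIONS = [4, 16, 64, 256]          # M-QAM orders considered

def approx_bep(snr_linear, m):
    return 0.2 * math.exp(-1.5 * snr_linear / (m - 1))

def select_modulation(snr_db):
    """Return (M, bits/symbol) for the highest-rate mode meeting the target."""
    snr = 10 ** (snr_db / 10)
    best = None
    for m in CONSTELLATIONS:
        if approx_bep(snr, m) <= TARGET_BEP:
            best = m
    return (best, int(math.log2(best))) if best else (None, 0)

for snr_db in (8, 12, 18, 24, 30):
    m, bits = select_modulation(snr_db)
    label = "no transmission" if m is None else f"{m}-QAM ({bits} bit/symbol)"
    print(f"SNR {snr_db:2d} dB -> {label}")
```

The dissertation's point about imperfect channel estimation can be read directly off this loop: if the measured SNR is wrong, the selected mode violates the BEP target, and a scheme that adapts more slowly to a more reliable estimate can win.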
Abstract:
Global expression-profile analyses at the transcriptome, proteome or metabolome level can improve the understanding of biotechnological production processes, and the insights can be used for the targeted, rational optimization of expression systems. In this work, the overexpression in Escherichia coli of a glucose dehydrogenase (EC 1.1.5.2) that had been optimized by Roche Diagnostics GmbH for diagnostic applications was investigated. The enzyme variant differs from the wild-type enzyme in seven of its 455 amino acids and is produced in significantly lower amounts (a factor of 5) in an otherwise isogenic host/vector system. The prokaryotic expression system was characterized at the proteome level. Two-dimensional difference gel electrophoresis (DIGE) was first examined from a statistical point of view. Taking into account technical and biological variation, false-positive (α) and false-negative (β) errors, and an experimental design derived from them, expression differences could be quantified as significant when they differed by a factor of ≥ 1.4. A principal component analysis showed that the DIGE technology is suitable for expression-profile analysis of the model system. The expression strain for the enzyme variant was characterized by a higher variability of enzymes for sugar degradation and nucleic acid synthesis. In the expression system for the wild-type enzyme, an unexpectedly increased plasmid copy number was detected. Solubility mediation was identified as a potential bottleneck in the expression of the recombinant glucose dehydrogenase. In the expression strain for the wild-type enzyme, many proteins involved in the biogenesis of the outer membrane were more strongly expressed; as a consequence, so-called envelope stress was triggered and the cells entered the stationary growth phase. The results of the proteome analysis were then used to improve the production of the enzyme variant. By exchanging the origin of replication in the expression vector, the plasmid copy number was increased and the cellular expression yield for the diagnostically more interesting enzyme variant was raised by a factor of 7-9. To improve solubility mediation during expression, the plasmid copy number was lowered and the co-expression of chaperones was initiated. The yield of active glucose dehydrogenase was increased overall by a factor of 4.5 through renaturation of inactive product from the optimized expression system. Thus, in this work, a proteome-based expression-profile analysis led to the targeted, rational optimization of expression in a prokaryotic model system.
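The ≥ 1.4 fold-change cut-off above comes from a power calculation balancing α and β errors against the measured variation. A hedged sketch of such a calculation, with invented variance and replicate numbers rather than the study's values, might look like this.

```python
# Sketch of a DIGE-style power calculation: what fold change is detectable
# given alpha, power (1 - beta), replicate number and spot variability?
# The SD and replicate numbers below are invented, not the study's values.
from statsmodels.stats.power import TTestIndPower

alpha, power, n_per_group = 0.05, 0.8, 4
sd_log2 = 0.25                      # spot-wise SD on the log2 scale (assumed)

# Smallest standardized effect (Cohen's d) detectable with this design...
d = TTestIndPower().solve_power(effect_size=None, nobs1=n_per_group,
                                alpha=alpha, power=power, ratio=1.0)
# ...translated back into a minimal detectable fold change.
min_fold_change = 2 ** (d * sd_log2)
print(f"minimal detectable fold change ~ {min_fold_change:.2f}")
```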
Abstract:
During the last decade, peach and nectarine fruit have lost considerable market share due to increased consumer dissatisfaction with quality at retail markets. This is mainly due to the harvesting of too-immature fruit and to high ripening heterogeneity. The main problem is that the traditional maturity indexes can objectively detect neither the fruit maturity stage nor the variability present in the field, leading to difficult post-harvest management of the product and to high fruit losses. Other techniques and devices can be used to assess fruit ripening more precisely. Recently, a new non-destructive maturity index based on vis-NIR technology, the Index of Absorbance Difference (IAD), which correlates with fruit degreening and ethylene production, was introduced, and the IAD was used to study peach and nectarine fruit ripening from "field to fork". In order to choose the best techniques to improve fruit quality, a detailed description of the tree structure, fruit distribution and ripening evolution on the tree was undertaken. More specifically, an architectural model (PlantToon®) was used to map the tree structure, and the IAD was applied to characterize the maturity stage of each fruit. Their combined use provided an objective and precise evaluation of the variability in fruit ripening related to different training systems, crop load, fruit exposure and internal temperature. Based on simple field assessments of fruit maturity (as IAD) and growth, a model for early prediction of harvest date and yield was developed and validated. The relationship between the non-destructive maturity index IAD and fruit shelf-life was also confirmed. Finally, the results were validated by consumer tests: fruit sorted into different maturity classes received different consumer acceptance. The improved knowledge led to an innovative management of peach and nectarine fruit, from "field to market".
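The IAD used above is commonly computed as the difference between absorbance readings at two wavelengths around the chlorophyll-a absorption peak (about 670 and 720 nm). The sketch below shows that computation together with a maturity-class sort like the one used in the consumer test; the class thresholds are hypothetical.

```python
# Sketch of the Index of Absorbance Difference and a maturity-class sort.
# IAD is commonly defined as A(670 nm) - A(720 nm); the class thresholds
# below are hypothetical, chosen only to illustrate the sorting step.

def iad(absorbance_670: float, absorbance_720: float) -> float:
    """Index of Absorbance Difference: higher values = greener, less ripe."""
    return absorbance_670 - absorbance_720

def maturity_class(iad_value: float) -> str:
    if iad_value > 1.5:
        return "immature"
    if iad_value > 0.8:
        return "commercial harvest"
    return "ready to eat"

# Hypothetical readings from three fruit (670 nm, 720 nm absorbance).
for a670, a720 in [(2.1, 0.4), (1.6, 0.5), (0.9, 0.3)]:
    v = iad(a670, a720)
    print(f"IAD = {v:.2f} -> {maturity_class(v)}")
```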
Abstract:
The tomato is one of the main crops in the Italian agri-food landscape and a basic ingredient of the national culinary tradition. Tomato processed by the canning industry can be transformed into several product types, which differ in the processing techniques employed and in the characteristics of the finished product. The share of total food expenditure devoted to eating out is increasing globally, and the food industry's interest in this sales channel is therefore growing. While there are numerous studies in the literature on the purchasing processes of final consumers, there is no evidence of similar studies conducted on Food Service operators. The main objective of the research is to assess the preferences of Food Service purchasing managers for different types of processed tomato, in relation to a range of relevant product attributes and customer characteristics. Data were collected through a hypothetical choice experiment carried out in Italy and some foreign markets. The results show that peeled tomatoes (Pelati) are the processed-tomato category preferred by the Food Service purchasing managers interviewed, with 35% of the stated preferences across the proposed choice contexts, followed by pulp (Polpa, 25%), passata (20%) and concentrate (15%). The estimation of a random-parameters Logit model showed that some credence quality attributes, often employed by the food industry in differentiation and positioning strategies in the retail market, can also play an important role in influencing the preferences of Food Service operators. This could therefore be an interesting line of research to develop in the future, possibly with the joint use of analysis methods based on hypothetical and non-hypothetical choice experiments.
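For readers unfamiliar with the random-parameters (mixed) Logit mentioned above, choice probabilities are simulated by averaging standard Logit probabilities over draws of the random coefficients. The sketch below uses invented attributes and coefficient distributions, not the study's data.

```python
# Sketch of simulated mixed (random-parameters) Logit choice probabilities.
# Attributes, coefficient means and spreads are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
R = 5000                                   # simulation draws

# Alternatives: Pelati, Polpa, Passata, Concentrato; columns: price, organic.
X = np.array([[1.2, 1],
              [1.0, 0],
              [0.9, 0],
              [0.8, 0]], dtype=float)

# Random coefficients: beta_r ~ Normal(mean, sd), one draw per simulation.
beta_mean = np.array([-1.0, 0.8])          # price disliked, organic liked
beta_sd   = np.array([ 0.5, 0.4])
betas = rng.normal(beta_mean, beta_sd, size=(R, 2))

V = betas @ X.T                            # systematic utilities, shape (R, 4)
P = np.exp(V) / np.exp(V).sum(axis=1, keepdims=True)   # Logit per draw
choice_probs = P.mean(axis=0)              # average over draws

for name, p in zip(["Pelati", "Polpa", "Passata", "Concentrato"], choice_probs):
    print(f"{name:12s} {p:.3f}")
```

Estimation inverts this simulation, searching for the coefficient means and spreads that make the simulated probabilities best match the observed choices.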