12 results for Process Analytical Technology (PAT)
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Advances in food transformation have dramatically increased the diversity of products on the market and, consequently, exposed consumers to a complex spectrum of bioactive nutrients whose potential risks and benefits have mostly not been confidently demonstrated. Therefore, tools are needed to efficiently screen products for selected physiological properties before they enter the market. NutriChip is an interdisciplinary modular project funded by the Swiss programme Nano-Tera, which groups scientists from several areas of research with the aim of developing analytical strategies that will enable functional screening of foods. The project focuses on postprandial inflammatory stress, which potentially contributes to the development of chronic inflammatory diseases. The first module of the NutriChip project is composed of three in vitro biochemical steps that mimic the digestion process, intestinal absorption, and subsequent modulation of immune cells by the bioavailable nutrients. The second module is a miniaturised form of the first module (gut-on-a-chip) that integrates a microfluidic-based cell co-culture system and super-resolution imaging technologies to provide a physiologically relevant fluid flow environment and allows sensitive real-time analysis of the products screened in vitro. The third module aims at validating the in vitro screening model by assessing the nutritional properties of selected food products in humans. Because of the immunomodulatory properties of milk as well as its amenability to technological transformation, dairy products have been selected as model foods. The NutriChip project reflects the opening of food and nutrition sciences to state-of-the-art technologies, a key step in the translation of transdisciplinary knowledge into nutritional advice.
Abstract:
This paper introduces a mobile application (app) as the first part of an interactive framework. The framework enhances the interaction between cities and their citizens, introducing the Fuzzy Analytical Hierarchy Process (FAHP) as a potential information acquisition method to improve existing citizen management endeavors for cognitive cities. Citizen management is enhanced by advanced visualization using Fuzzy Cognitive Maps (FCM). The presented app takes fuzziness into account in the constant interaction and continuous development of communication between cities, or certain of their entities (e.g., the tax authority), and their citizens. A transportation use case is implemented for didactic purposes.
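For readers unfamiliar with FAHP, the following is a minimal sketch of how fuzzy priorities can be derived from pairwise comparisons, here using triangular fuzzy numbers and Buckley's geometric-mean method with centroid defuzzification; the paper may use a different FAHP variant. The transport criteria and judgments are hypothetical illustrations, not data from the paper.

```python
# Minimal FAHP sketch: Buckley's geometric-mean method over triangular
# fuzzy numbers (l, m, u). Criteria and judgments are hypothetical.

# Pairwise comparison matrix over three hypothetical transport criteria
# (cost, travel time, comfort); entry (i, j) answers "how much more
# important is criterion i than criterion j?" as a triangular fuzzy number.
M = [
    [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

def geo_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    prod = [1.0, 1.0, 1.0]
    for (l, m, u) in row:
        prod[0] *= l
        prod[1] *= m
        prod[2] *= u
    return tuple(p ** (1.0 / n) for p in prod)

# Fuzzy weight of criterion i: r_i multiplied by the inverse of the fuzzy
# sum of all r_j; inversion flips the (l, m, u) bounds.
r = [geo_mean(row) for row in M]
total = (sum(x[0] for x in r), sum(x[1] for x in r), sum(x[2] for x in r))
fuzzy_w = [(l / total[2], m / total[1], u / total[0]) for (l, m, u) in r]

# Defuzzify by the centroid (l + m + u) / 3 and normalize to sum to 1.
crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy_w]
s = sum(crisp)
weights = [c / s for c in crisp]
print(dict(zip(["cost", "time", "comfort"], (round(w, 3) for w in weights))))
```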
Abstract:
The fuzzy analytical network process (FANP) is introduced as a potential multi-criteria decision-making (MCDM) method to improve digital marketing management endeavors. Today's information overload makes digital marketing optimization, which is needed to continuously improve one's business, increasingly difficult. The proposed FANP framework is a method for enhancing the interaction between customers and marketers (i.e., the involved stakeholders) and thus for reducing the challenges of big data. The presented implementation takes the fuzziness of reality into account to manage the constant interaction and continuous development of communication between marketers and customers on the Web. Using this FANP framework, marketers are better able to meet the varying requirements of their customers. To improve the understanding of the implementation, advanced visualization methods (e.g., wireframes) are used.
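As background, the step that distinguishes an analytical network process from a simple hierarchy is the limit supermatrix, which propagates interdependencies between network elements until global priorities stabilize. The sketch below assumes crisp (already defuzzified) influence weights and hypothetical marketing elements; it illustrates the generic ANP computation, not the paper's specific FANP framework.

```python
# Minimal ANP core step: raise a column-stochastic (weighted) supermatrix
# to successive powers until it converges, so that each column holds the
# limiting global priorities. Elements and weights are hypothetical.
import numpy as np

labels = ["search ads", "social ads", "segment A", "segment B"]
# Column j distributes element j's influence over all elements (sums to 1).
W = np.array([
    [0.0, 0.4, 0.6, 0.5],
    [0.5, 0.0, 0.4, 0.5],
    [0.3, 0.3, 0.0, 0.0],
    [0.2, 0.3, 0.0, 0.0],
])
assert np.allclose(W.sum(axis=0), 1.0)  # column-stochastic check

# Limit supermatrix: multiply by W until successive powers agree.
L = W.copy()
for _ in range(200):
    nxt = L @ W
    if np.allclose(nxt, L, atol=1e-10):
        break
    L = nxt

# Every column of the limit matrix now carries the global priorities.
for name, p in zip(labels, L[:, 0]):
    print(f"{name}: {p:.3f}")
```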
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. Based on their resulting risk ratings, process parameters can be ranked by importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
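To make the ranking step concrete, the following is a minimal sketch of how failure modes scored on 1-to-10 scales for severity, occurrence, and detectability can be ranked by the conventional risk priority number (RPN = severity * occurrence * detectability). The process steps, failure modes, and scores are hypothetical illustrations and are not taken from the paper's rating table.

```python
# Minimal FMEA ranking sketch; all failure modes and scores are hypothetical.
from dataclasses import dataclass

@dataclass
class FailureMode:
    process_step: str
    failure: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (almost certain)
    detectability: int  # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Conventional risk priority number: S * O * D.
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("cell culture", "bioreactor pH excursion", 7, 4, 3),
    FailureMode("chromatography", "column fouling", 5, 6, 2),
    FailureMode("filtration", "filter integrity failure", 9, 2, 5),
]

# Rank failure modes from highest to lowest risk.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.process_step}: {fm.failure}")
```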
Abstract:
Lipoproteins are a heterogeneous population of blood plasma particles composed of apolipoproteins and lipids. Lipoproteins transport exogenous and endogenous triglycerides and cholesterol from sites of absorption and formation to sites of storage and usage. Three major classes of lipoproteins are distinguished according to their density: high-density (HDL), low-density (LDL) and very low-density lipoproteins (VLDL). While HDLs contain mainly apolipoproteins of lower molecular weight, the two other classes contain apolipoprotein B and apolipoprotein (a) together with triglycerides and cholesterol. HDL concentrations were found to be inversely related to coronary heart disease, and LDL/VLDL concentrations directly related. Although many studies have been published in this area, few have concentrated on the exact protein composition of lipoprotein particles. Lipoproteins were separated by density gradient ultracentrifugation into different subclasses. Native gel electrophoresis revealed different gel migration behaviour of the particles, with less dense particles having higher apparent hydrodynamic radii than denser particles. Apolipoprotein composition profiles were measured by matrix-assisted laser desorption/ionization mass spectrometry on a macromizer instrument equipped with the recently introduced cryodetector technology, and revealed differences in apolipoprotein composition between HDL subclasses. By combining these profiles with protein identifications from native and denaturing polyacrylamide gels by liquid chromatography-tandem mass spectrometry, we comprehensively characterized the exact protein composition of different lipoprotein particles. We concluded that the differential display of protein weight information acquired by macromizer mass spectrometry is an excellent tool for revealing structural variations of different lipoprotein particles, and hence lays the foundation for the screening of cardiovascular disease risk factors associated with lipoproteins.
Abstract:
The notion of outsourcing – making arrangements with an external entity for the provision of goods or services to supplement or replace internal efforts – has been around for centuries. The outsourcing of information systems (IS), however, is a much newer concept, but one that has been growing dramatically. This book attempts to synthesize what is known about IS outsourcing by dividing the subject into three interrelated parts: (1) Traditional Information Technology Outsourcing, (2) Information Technology Offshoring, and (3) Business Process Outsourcing. The book should be of interest to all academics and students in the field of Information Systems, as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.
Abstract:
OBJECTIVE: To compare the vascular healing process between the sirolimus-eluting NEVO and the everolimus-eluting Xience stent by optical coherence tomography (OCT) at 1-year follow-up. BACKGROUND: The presence of a durable polymer on a drug-eluting metallic stent may be the basis of an inflammatory reaction with an abnormal healing response. The NEVO stent, with a bioresorbable polymer eluted by reservoir technology, may overcome this problem. METHODS: All consecutive patients who received NEVO or Xience stent implantation between September 2010 and October 2010 in our institution were included. Vascular healing was assessed at 1 year as the percentage of uncovered struts, neointimal thickness (NIT), in-stent/stent area obstruction, and pattern of neointima. RESULTS: A total of 47 patients (2:1 randomization; n = 32 NEVO, n = 15 Xience) were included. Eighteen patients underwent angiographic follow-up (8 patients with 9 lesions for NEVO vs. 10 patients with 11 lesions for Xience). The angiographic late loss was numerically higher but not statistically different in NEVO-treated compared with Xience-treated lesions (0.38 ± 0.47 mm vs. 0.18 ± 0.27 mm; P = 0.171). OCT analysis of 4,912 struts demonstrated similar rates of uncovered struts (0.5% vs. 0.7%; P = 0.462) and numerically higher mean NIT (177.76 ± 87.76 µm vs. 132.22 ± 30.91 µm; P = 0.170) and in-stent/stent area obstruction (23.02 ± 14.74% vs. 14.17 ± 5.94%; P = 0.120) in NEVO compared with Xience. CONCLUSION: The NEVO stent, with its reservoir technology, seems to exhibit more neointimal proliferation than the Xience stent. The findings of our study, which currently represent the only existing data on this reservoir technology, would need to be confirmed in a larger population.
Abstract:
Ultraviolet-ozone treatment is used as a standard surface cleaning procedure for the removal of molecular organic contamination from analytical and sensing devices. Here, it is applied to injection-molded polymer microcantilevers before characterization and sensing experiments. This article examines the effects of this surface cleaning process, performed with commercial equipment, in particular on the performance and mechanical properties of the cantilevers. It can be shown that the first chemical aging process essentially consists of the cross-linking of the polymer chains together with a physical aging of the material. For longer exposures, the expected thermo-oxidative formation of carbonyl groups sets in, and an exposure-dependent chemical degradation can be detected. A process time of 20 min was found to be a suitable trade-off between cleaning and stability.
Abstract:
Many technological developments of the past two decades come with the promise of greater IT flexibility, i.e., a greater capacity to adapt IT. These technologies are increasingly used to improve organizational routines that are not affected by large, hard-to-change IT such as ERP. Yet most findings on the interaction of routines and IT stem from contexts where IT is hard to change. Our research explores how routines and IT co-evolve when IT is flexible. We review the literature on routines to suggest that IT may act as a boundary object that mediates the learning process unfolding between the ostensive and the performative aspects of the routine. Although prior work has concluded from such conceptualizations that IT stabilizes routines, we qualify that flexible IT can also stimulate change because it enables learning in short feedback cycles. However, we suggest that such change might not always materialize because it is contingent on governance choices and technical knowledge. We describe the case-study method used to explore how routines and flexible IT co-evolve and how governance and technical knowledge influence this process. We expect to contribute towards a stronger theory of routines and to develop recommendations for the effective implementation of flexible IT in loosely coupled routines.
Abstract:
How do developers and designers of a new technology make sense of intended users? The critical groundwork for user-centred technology development begins not with actual users' exposure to the technological artefact but much earlier, with designers' and developers' visions of future users. Thus, anticipating intended users is critical to technology uptake. We conceptualise the anticipation of intended users as a form of prospective sensemaking in technology development. Employing a narrative analytical approach and drawing on four key communities in the development of Grid computing, we reconstruct how each community anticipated the intended Grid user. Based on our findings, we conceptualise user anticipation in terms of two key dimensions: the intended possibility to inscribe user needs into the technological artefact and the intended scope of the application domain. These dimensions in turn allow us to develop an initial typology of intended user concepts that might provide a key building block towards a generic typology of intended users.