919 results for Modeling of purification operations in biotechnology
Abstract:
Forward modeling is commonly applied to gravity field data of impact structures to determine the main gravity anomaly sources. In this context, we have developed 2.5-D gravity models of the Serra da Cangalha impact structure to investigate geological bodies and structures underneath the crater. Interpretation of the models was supported by ground magnetic data acquired along profiles, as well as by high-resolution aeromagnetic data. Ground magnetic data reveal the presence of short-wavelength anomalies, probably related to shallow magnetic sources that could have been emplaced during the cratering process. Aeromagnetic data show that the basement underneath the crater lies at an average depth of about 1.9 km, whereas beneath the central uplift it is raised to 0.51 km below the current surface. These depths are also supported by 2.5-D gravity models showing a gentle relief for the basement beneath the central uplift area. The geophysical data provided further constraints for numerical modeling of crater formation, which yielded important information on the structural modification that affected the rocks underneath the crater, as well as on shock-induced modifications of the target rocks. The results showed that the modeled morphology is consistent with current observations of the crater and that Serra da Cangalha was formed by a meteorite of approximately 1.4 km diameter striking at 12 km s⁻¹.
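As a minimal, hedged illustration of the forward-modeling idea (not the 2.5-D models used in the study), the sketch below computes the textbook vertical gravity anomaly of a buried sphere along a profile; the depth echoes the 0.51 km basement figure, but all other numbers are hypothetical.

```python
# Forward gravity model sketch: vertical anomaly of a buried sphere,
# a first-order stand-in for testing anomaly sources such as an uplifted
# basement block. Parameters are illustrative only.
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly (mGal) along profile x (m) over a sphere
    with centre depth (m), radius (m) and density contrast drho (kg/m^3)."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    gz = G * mass * depth / (x**2 + depth**2) ** 1.5  # in m/s^2
    return gz * 1e5                                   # 1 mGal = 1e-5 m/s^2

profile = np.linspace(-5000.0, 5000.0, 201)           # profile across the structure
anomaly = sphere_gz(profile, depth=510.0, radius=400.0, drho=150.0)
print(f"peak anomaly: {anomaly.max():.3f} mGal")
```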
Abstract:
Consistent in silico models for ADME properties are useful tools in early drug discovery. Here, we report the hologram QSAR modeling of human intestinal absorption using a dataset of 638 compounds with associated experimental data. The final validated models are consistent and robust for the consensus prediction of this important pharmacokinetic property and are suitable for virtual screening applications.
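As a sketch of the consensus-prediction step, the following averages two regressors over a placeholder descriptor matrix; the paper's hologram QSAR descriptors and validated models are not reproduced here.

```python
# Consensus QSAR sketch: average the predictions of independently trained
# models. X and y are synthetic placeholders for 638 compounds.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(638, 50))             # hypothetical molecular descriptors
y = 2.0 * X[:, 0] + rng.normal(size=638)   # hypothetical absorption response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = [RandomForestRegressor(random_state=0).fit(X_tr, y_tr),
          Ridge().fit(X_tr, y_tr)]

# Consensus = mean of the individual model predictions.
consensus = np.mean([m.predict(X_te) for m in models], axis=0)
print("consensus prediction for first test compound:", round(consensus[0], 3))
```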
Abstract:
Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapt a healthy brain atlas to MR images of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling is employed in combination with registration algorithms. In a first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model including growth simulation from the cellular level up to the biomechanical level, accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and the patient image is established using nonrigid registration. The method offers opportunities in atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.
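A cell-proliferation component of such a simulation can be sketched with a Fisher-Kolmogorov reaction-diffusion model on the voxel grid; this is a generic stand-in for the paper's multi-scale, multi-physics model, with illustrative parameters and no biomechanical coupling.

```python
# Tumor growth sketch: dc/dt = D * laplacian(c) + rho * c * (1 - c),
# evolved with explicit finite differences directly on a 2-D voxel grid.
import numpy as np

def grow(c, D=0.1, rho=0.05, dt=0.1, steps=100):
    """Evolve tumor cell density c in [0, 1] on a periodic voxel grid."""
    for _ in range(steps):
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c)
        c = c + dt * (D * lap + rho * c * (1.0 - c))
    return c

seed = np.zeros((64, 64))
seed[32, 32] = 1.0                   # tumor seeded at a single voxel
print("total tumor mass after growth:", grow(seed).sum())
```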
Abstract:
Dimensional modeling, GT-Power in particular, has been used for two related purposes: to quantify and understand the inaccuracies of transient engine flow estimates that cause transient smoke spikes, and to improve empirical models of opacity or particulate matter used for engine calibration. Dimensional modeling suggested that the exhaust gas recirculation flow rate was significantly underestimated and the volumetric efficiency overestimated by the electronic control module during the turbocharger lag period of an electronically controlled heavy-duty diesel engine. Factoring in cylinder-to-cylinder variation, it was shown that the electronic control module's estimated fuel-oxygen ratio was lower than actual by up to 35% during the turbocharger lag period but within 2% of actual elsewhere, thus hindering fuel-oxygen ratio limit-based smoke control. The dimensional modeling of transient flow was enabled by a new method of simulating transient data in which the manifold pressures and the exhaust gas recirculation system flow resistance, characterized as a function of exhaust gas recirculation valve position at each measured transient data point, were replicated by quasi-static or transient simulation to predict engine flows. Dimensional modeling was also used to transform the engine operating parameter model input space into a more fundamental, lower-dimensional space so that a nearest-neighbor approach could be used to predict smoke emissions. This new approach, intended for engine calibration and control modeling, was termed the "nonparametric reduced dimensionality" approach. It was used to predict federal test procedure cumulative particulate matter within 7% of the measured value, based solely on steady-state training data. Very little correlation between the model inputs was observed in the transformed space as compared to the engine operating parameter space. This more uniform, smaller model input space might explain how the nonparametric reduced dimensionality approach could successfully predict federal test procedure emissions even though roughly 40% of all transient points were classified as outliers relative to the steady-state training data.
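The nearest-neighbor step can be sketched as below, assuming a placeholder transform (PCA) in place of the paper's physics-based mapping, and synthetic steady-state data throughout.

```python
# "Nonparametric reduced dimensionality" sketch: map operating parameters to
# a lower-dimensional space, then predict smoke as a nearest-neighbor average
# over steady-state training points. All data and the transform are stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
X_steady = rng.normal(size=(500, 8))   # hypothetical steady-state operating points
y_smoke = rng.random(500)              # hypothetical opacity / PM values

pca = PCA(n_components=3).fit(X_steady)        # stand-in for the fundamental transform
knn = KNeighborsRegressor(n_neighbors=5).fit(pca.transform(X_steady), y_smoke)

X_transient = rng.normal(size=(3, 8))  # hypothetical transient points
print(knn.predict(pca.transform(X_transient)))
```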
Abstract:
To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization and to tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass, based on a set of evaluation criteria such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions simultaneously. A toy version of this optimization is sketched after this abstract. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.

Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of producing biofuel. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help to ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level.

Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not the same, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and through presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A).
There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
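A toy version of the proposed optimization can be written as a facility-location MILP; the sites, harvest areas, and costs below are hypothetical, and the real model also weights energy consumption and GHG emissions over the 20-year horizon.

```python
# Facility-location sketch with PuLP: choose which biofuel facilities to open
# and how much biomass to ship so that delivered cost is minimized.
import pulp

sites = ["S1", "S2"]
areas = ["A1", "A2", "A3"]
supply = {"A1": 40, "A2": 60, "A3": 50}        # available biomass (kt/yr)
need = 90                                      # biomass required per open site
fixed = {"S1": 500, "S2": 450}                 # annualized facility cost
haul = {("A1", "S1"): 3, ("A1", "S2"): 5, ("A2", "S1"): 4,
        ("A2", "S2"): 2, ("A3", "S1"): 6, ("A3", "S2"): 3}  # transport $/t

m = pulp.LpProblem("biofuel_location", pulp.LpMinimize)
open_ = pulp.LpVariable.dicts("open", sites, cat="Binary")
flow = pulp.LpVariable.dicts("flow", list(haul), lowBound=0)

# Objective: fixed facility costs plus transport costs.
m += (pulp.lpSum(fixed[s] * open_[s] for s in sites) +
      pulp.lpSum(haul[a, s] * flow[a, s] for (a, s) in haul))
for a in areas:   # cannot ship more biomass than is harvested
    m += pulp.lpSum(flow[a, s] for s in sites if (a, s) in haul) <= supply[a]
for s in sites:   # every open facility must receive its feedstock
    m += pulp.lpSum(flow[a, s] for a in areas if (a, s) in haul) >= need * open_[s]
m += pulp.lpSum(open_[s] for s in sites) >= 1   # build at least one facility

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({s: int(open_[s].value()) for s in sites})
```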
Abstract:
Consecrated in 1297 as the church of St. Catherine's monastery, which had been founded four years earlier, the Gothic Church of St. Catherine was largely destroyed in a devastating bombing raid on January 2nd, 1945. To counteract the process of disintegration, the geo-information department and the lower monument protection authority of the City of Nuremberg decided to commission a three-dimensional building model of the Church of St. Catherine. A heterogeneous set of data was used for the preparation of a parametric architectural model. In effect, the modeling of historic buildings can profit from the so-called BIM method (Building Information Modeling), as the necessary structuring of the basic data renders it into very sustainable information. The resulting model is perfectly suited to deliver a vivid impression of the interior and exterior of this former mendicant order's church to present-day observers.
Abstract:
Both climate change and socio-economic development will significantly modify the supply and consumption of water in the future. Consequently, regional development has to face the aggravation of existing conflicts of interest or the emergence of new ones. In this context, transdisciplinary co-production of knowledge is considered an important means for coping with these challenges. Accordingly, the MontanAqua project aims at developing strategies for more sustainable water management in the study area Crans-Montana-Sierre (Switzerland) in a transdisciplinary way. It strives to co-produce system, target, and transformation knowledge among researchers, policy makers, public administration, and civil society organizations. The research process consisted of the following steps: First, the current water situation in the study region was investigated. How much water is available? How much water is being used? How are decisions on water distribution and use taken? Second, participatory scenario workshops were conducted in order to identify the stakeholders' visions of regional development. Third, the water situation in 2050 was simulated by modeling the evolution of water resources and water use and by reflecting on the institutional aspects. These steps laid the ground for jointly assessing the consequences of the stakeholders' visions of development in view of scientific data regarding governance, availability, and use of water in the region, as well as for developing the necessary transformation knowledge. During all of these steps the researchers collaborated with stakeholders in the support group RegiEau. The RegiEau group consists of key representatives of owners, managers, users, and pressure groups related to water and landscape: representatives of the communes (mostly the presidents), the canton (administration and parliament), water management associations, agriculture, viticulture, hydropower, tourism, and landscape protection. The aim of the talk is to explore potentials and constraints of scientific modeling of water availability and use within the transdisciplinary co-production of strategies for more sustainable water governance.
Abstract:
Many biological processes depend on the sequential assembly of protein complexes. However, studying the kinetics of such processes by direct methods is often not feasible. As an important class of such protein complexes, pore-forming toxins start their journey as soluble monomeric proteins, and oligomerize into transmembrane complexes to eventually form pores in the target cell membrane. Here, we monitored pore formation kinetics for the well-characterized bacterial pore-forming toxin aerolysin in single cells in real time to determine the lag times leading to the formation of the first functional pores per cell. Probabilistic modeling of these lag times revealed that one slow and seven equally fast rate-limiting reactions best explain the overall pore formation kinetics. The model predicted that monomer activation is the rate-limiting step for the entire pore formation process. We hypothesized that this could be through release of a propeptide and indeed found that peptide removal abolished these steps. This study illustrates how stochasticity in the kinetics of a complex process can be exploited to identify rate-limiting mechanisms underlying multistep biomolecular assembly pathways.
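The lag-time model lends itself to a short simulation: the waiting time to the first functional pore per cell is the sum of one slow and seven equally fast sequential exponential steps (a hypoexponential distribution). The rate constants below are illustrative, not the fitted values from the study.

```python
# Simulate per-cell lag times as one slow step plus seven fast steps,
# each exponentially distributed, and compare with the analytic mean.
import numpy as np

rng = np.random.default_rng(42)

def lag_times(n_cells, k_slow=0.02, k_fast=0.5):
    """Lag time per cell = Exp(k_slow) + sum of seven Exp(k_fast) steps."""
    slow = rng.exponential(1.0 / k_slow, size=n_cells)
    fast = rng.exponential(1.0 / k_fast, size=(n_cells, 7)).sum(axis=1)
    return slow + fast

t = lag_times(10_000)
print(f"mean lag {t.mean():.1f} s (theory {1 / 0.02 + 7 / 0.5:.1f} s)")
```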
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, or the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds such as tubular features, which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system (see manuscript III). Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions as coarse as 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. At resolutions up to 12 Å, the method achieves sensitivities between 70% and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha helices were correctly predicted in an automatic manner from the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
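A minimal sketch of the scoring step these fitting methods share is given below: a candidate placement of a component is evaluated by the normalized voxel cross-correlation between its simulated density and the cryo-EM map. The search strategy itself (evolutionary tabu search with bidirectional expansion) is not reproduced, and the maps are random placeholders.

```python
# Cross-correlation scoring sketch for rigid-body fitting into a cryo-EM map.
import numpy as np

def correlation_score(component_map, em_map):
    """Normalized cross-correlation between two density maps of equal shape."""
    a = component_map - component_map.mean()
    b = em_map - em_map.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
em = rng.random((32, 32, 32))                    # placeholder cryo-EM map
candidate = em + 0.1 * rng.random((32, 32, 32))  # placeholder simulated density
print(f"placement score: {correlation_score(candidate, em):.3f}")
```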
Abstract:
The 20th Annual Biochemical Engineering Symposium was held at Kansas State University on April 21, 1990. The objectives of the symposium were to provide: (i) a forum for informal discussion of biochemical engineering research being conducted at the participating institutions and (ii) an opportunity for students to present and publish their work. Twenty-eight papers presented at the symposium are included in this proceedings. Some of the papers describe the progress of ongoing projects, and others contain the results of completed projects. Only brief summaries are given of the papers that will be published in full elsewhere. The program of the symposium and a list of the participants are included in the proceedings.
Contents:
Cell Separations and Recycle Using an Inclined Settler, Ching-Yuan Lee, Robert H. Davis and Robert A. Sclafani
Micromixing and Metabolism in Bioreactors: Characterization of a 14 L Fermenter, K.S. Wenger and E.H. Dunlop
Production, Purification, and Hydrolysis Kinetics of Wild-Type and Mutant Glucoamylases from Aspergillus awamori, Ufuk Bakir, Paul D. Oates, Hsiu-Mei Chen and Peter J. Reilly
Dynamic Modeling of the Immune System, Barry Vant-Hull and Dhinakar S. Kompala
Dynamic Modeling of Active Transport Across a Biological Cell: A Stochastic Approach, B.C. Shen, S.T. Chou, Y.Y. Chiu and L.T. Fan
Electrokinetic Isolation of Bacterial Vesicles and Ribosomes, Debra T.L. Hawker, Robert H. Davis, Paul W. Todd, and Robert Lawson
Application of Dynamic Programming for Fermentative Ethanol Production by Zymomonas mobilis, Sheyla L. Rivera and M. Nazmul Karim
Biodegradation of PCP by Pseudomonas cepacia, R. Rayavarapu, S.K. Banerji, and R.K. Bajpai
Modeling the Bioremediation of Contaminated Soil Aggregates: A Phenomenological Approach, S. Dhawan, L.E. Erickson and L.T. Fan
Biospecific Adsorption of Glucoamylase-I from Aspergillus niger on Raw Starch, Bipin K. Dalmia and Zivko L. Nikolov
Overexpression in Recombinant Mammalian Cells: Effect on Growth Rate and Genetic Instability, Jeffrey A. Kern and Dhinakar S. Kompala
Structured Mathematical Modeling of Xylose Fermentation, A.K. Hilaly, M.N. Karim, J.C. Linden and S. Lastick
A New Culture Medium for Carbon-Limited Growth of Bacillus thuringiensis, W.-M. Liu and R.K. Bajpai
Determination of Sugars and Sugar Alcohols by High Performance Ion Chromatography, T.J. Paskach, H.-P. Lieker, P.J. Reilly, and K. Thielecke
Characterization of Poly-Asp Tailed β-Galactosidase, M.Q. Niederauer, C.E. Glatz, I.A. Suominen, C.F. Ford, and M.A. Rougvie
Computation of Conformations and Energies of α-Glucosyl Disaccharides, Jing Zeng, Michael K. Dowd, and Peter J. Reilly
Pentachlorophenol Interactions with Soil, Shein-Ming Wei, Shankha K. Banerji, and Rakesh K. Bajpai
Oxygen Transfer to Viscous Liquid Media in Three-Phase Fluidized Beds of Floating Bubble Breakers, Y. Kang, L.T. Fan, B.T. Min and S.D. Kim
Studies on the In Vitro Development of Chick Embryo, A. Venkatraman and T. Panda
The Evolution of a Silicone-Based Phase-Separated Gravity-Independent Bioreactor, Peter E. Villeneuve and Eric H. Dunlop
Biodegradation of Diethyl Phthalate, Guorong Zhang, Kenneth F. Reardon and Vincent G. Murphy
Microcosm Treatability of Soil Contaminated with Petroleum Hydrocarbons, P. Tuitemwong, S. Dhawan, B.M. Sly, L.E. Erickson and J.R. Schlup
Abstract:
This is the twenty-second of a series of symposia devoted to talks and posters by students about their biochemical engineering research. The first, third, fifth, ninth, twelfth, sixteenth, and twentieth were hosted by Kansas State University; the second and fourth by the University of Nebraska-Lincoln; the sixth, seventh, tenth, thirteenth, seventeenth, and twenty-second by Iowa State University; the eighth, fourteenth, and nineteenth by the University of Missouri-Columbia; the eleventh, fifteenth, and twenty-first by Colorado State University; and the eighteenth by the University of Colorado. Next year's symposium will be at the University of Oklahoma. Symposium proceedings are edited and issued by faculty of the host institution. Because final publication usually takes place in refereed journals, articles included here are brief and often cover work in progress.
Contents:
C.A. Baldwin, J.P. McDonald, and L.E. Erickson, Kansas State University. Effect of Hydrocarbon Phase on Kinetic and Transport Limitations for Bioremediation of Microporous Soil
J.C. Wang, S.K. Banerji, and Rakesh Bajpai, University of Missouri-Columbia. Migration of PCP in Soil-Columns in Presence of a Second Organic Phase
Cheng-Hsien Hsu and Roger G. Harrison, University of Oklahoma. Bacterial Leaching of Zinc and Copper from Mining Wastes
James A. Searles, Paul Todd, and Dhinakar S. Kompala, University of Colorado. Suspension Culture of Chinese Hamster Ovary Cells Utilizing Inclined Sedimentation
Ron Beyerinck and Eric H. Dunlop, Colorado State University. The Effect of Feed Zone Turbulence as Measured by Laser Doppler Velocimetry on Baker's Yeast Metabolism in a Chemostat
Paul Li-Hong Yeh, Grace Y. Sun, Gary A. Weisman, and Rakesh Bajpai, University of Missouri-Columbia. Effect of Medium Constituents upon Membrane Composition of Insect Cells
R. Shane Gold, M.M. Meagher, R. Hutkins, and T. Conway, University of Nebraska-Lincoln. Ethanol Tolerance and Carbohydrate Metabolism in Lactobacilli
John Sargantanis and M.N. Karim, Colorado State University. Application of Kalman Filter and Adaptive Control in Solid Substrate Fermentation
D. Vrana, M. Meagher, and R. Hutkins, University of Nebraska-Lincoln. Product Recovery Optimization in the ABE Fermentation
Kalyan R. Tadikonda and Robert H. Davis, University of Colorado. Cell Separations Using Targeted Monoclonal Antibodies Against Surface Proteins
Meng H. Heng and Charles E. Glatz, Iowa State University. Charged Fusion for Selective Recovery of β-Galactosidase from Cell Extract Using Hollow Fiber Ion-Exchange Membrane Adsorption
Hsiu-Mei Chen, Peter J. Reilly, and Clark Ford, Iowa State University. Site-Directed Mutagenesis to Enhance Thermostability of Glucoamylase from Aspergillus: A Rational Approach
P. Tuitemwong, L.E. Erickson, and D.Y.C. Fung, Kansas State University. Applications of Enzymatic Hydrolysis and Fermentation on the Reduction of Flatulent Sugars in the Rapid Hydration Hydrothermal Cooked Soy Milk
Sanjeev Redkar and Robert H. Davis, University of Colorado. Crossflow Microfiltration of Yeast Suspensions
Linda Henk and James C. Linden, Colorado State University, and Irving C. Anderson, Iowa State University. Evaluation of Sorghum Ensilage as an Ethanol Feedstock
Marc Lipovitch and James C. Linden, Colorado State University. Stability and Biomass Feedstock Pretreatability for Simultaneous Saccharification and Fermentation
Ali Demirci, Anthony L. Pometto III, and Kenneth E. Johnson, Iowa State University. Application of Biofilm Reactors in Lactic Acid Fermentation
Michael K. Dowd, Peter J. Reilly, and Walter S. Trahanovsky, Iowa State University. Low Molecular-Weight Organic Composition of Ethanol Stillage from Corn
Craig E. Forney, Meng H. Heng, John R. Luther, Mark Q. Niederauer, and Charles E. Glatz, Iowa State University. Enhancement of Protein Separation Using Genetic Engineering
J.F. Shimp, J.C. Tracy, E. Lee, L.C. Davis, and L.E. Erickson, Kansas State University. Modeling Contaminant Transport, Biodegradation and Uptake by Plants in the Rhizosphere
Xiaoqing Yang, L.E. Erickson, and L.T. Fan, Kansas State University. Modeling of Dispersive-Convective Characteristics in Bioremediation of Contaminated Soil
Jan Johansson and Rakesh Bajpai, University of Missouri-Columbia. Fouling of Membranes
J.M. Wang, S.K. Banerji, and R.K. Bajpai, University of Missouri-Columbia. Migration of Sodium-Pentachlorophenol (Na-PCP) in Unsaturated and Saturated Soil-Columns
J. Sweeney and M. Meagher, University of Nebraska-Lincoln. The Purification of Alpha-D-Glucuronidase from Trichoderma reesei
Abstract:
Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions.
In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical for the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to simultaneously take into account both the control and the data structures, dependencies, and operations in a composition. Computation cost analysis for service composition can support predictive monitoring and proactive adaptation by automatically inferring computation cost in the form of upper- and lower-bound functions of the value or size of input messages. These cost functions can be used for adaptation by selecting service candidates that minimize the total cost of the composition, based on the actual data that is passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of input data that can be used to predict potential or imminent Service Level Agreement (SLA) violations at the moment of invocation. In mission-critical applications, effective and accurate continuous QoS prediction can be achieved by constraint modeling of composition QoS based on its structure, data known at runtime, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants and complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and the information content of the composition messages, internal data, and activities, in the presence of complex control constructs, such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
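One way to picture the structural QoS bound computation is as interval arithmetic over (lower, upper) cost functions of the input size n; the combinators below are an illustration of the idea, not the analysis framework developed in the thesis.

```python
# Compose (lower, upper) cost bound functions along the structure of a
# service composition: sequence, conditional branch, and bounded loop.
def seq(f, g):          # sequential composition: costs add
    return lambda n: tuple(a + b for a, b in zip(f(n), g(n)))

def branch(f, g):       # conditional: best lower bound, worst upper bound
    return lambda n: (min(f(n)[0], g(n)[0]), max(f(n)[1], g(n)[1]))

def loop(f, min_it, max_it):   # loop with known iteration bounds
    return lambda n: (min_it * f(n)[0], max_it * f(n)[1])

# Hypothetical per-service (lower, upper) cost bounds as functions of n:
parse = lambda n: (2 * n, 3 * n)
score = lambda n: (n, n * n)

composition = seq(parse, loop(branch(parse, score), 1, 5))
print(composition(10))  # (lower, upper) cost bound for input size n = 10
```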
Abstract:
This PhD dissertation is framed in the emergent fields of Reverse Logistics and Closed-Loop Supply Chain (CLSC) management. This subarea of supply chain management has gained researchers' and practitioners' attention over the last 15 years to become a fully recognized subdiscipline of the Operations Management field. More specifically, among all the activities included within the CLSC area, this dissertation focuses on direct reuse. Its main contribution to current knowledge is twofold. First, a framework for the so-called reuse CLSC is developed. This conceptual model is grounded in a set of six case studies conducted by the author in real industrial settings. The model has also been contrasted with the existing literature as well as with academic and professional experts on the topic. The framework encompasses four building blocks. In the first block, a typology for reusable articles is put forward, distinguishing between Returnable Transport Items (RTI), Reusable Packaging Materials (RPM), and Reusable Products (RP). In the second block, the common characteristics that render reuse CLSCs difficult to manage from a logistical standpoint are identified, namely: fleet shrinkage, significant investment, and limited visibility. In the third block, the main problems arising in the management of reuse CLSCs are analyzed: (1) defining the fleet size, (2) controlling cycle time and promoting article rotation, (3) controlling the return rate and preventing shrinkage, (4) defining purchase policies for new articles, (5) planning and controlling reconditioning activities, and (6) balancing inventory between depots. Finally, in the fourth block, solutions to these issues are developed. First, problems (2) and (3) are addressed through a comparative analysis of alternative strategies for controlling cycle time and return rate. Second, a methodology for calculating the required fleet size is elaborated (problem (1)). This methodology is valid for different configurations of the physical flows in the reuse CLSC. Likewise, some directions are pointed out for the further development of a similar method for defining purchase policies for new articles (problem (4)). The second main contribution of this dissertation is embedded in the solutions part (block 4) of the conceptual framework and comprises a two-level decision problem integrating two mixed integer linear programming (MILP) models, formulated and solved to optimality using AIMMS as the modeling language, CPLEX as the solver, and Excel spreadsheets for data input and output presentation. The results obtained are analyzed in order to measure, in a client-supplier system, the economic impact of two alternative control strategies (recovery policies) in the context of reuse. In addition, the models support decision-making regarding the selection of the appropriate recovery policy given the characteristics of the demand pattern and the structure of the relevant costs in the system. The triangulation of methods used in this thesis has made it possible to address the same research topic with different approaches, thus strengthening the robustness of the results obtained.
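As a back-of-the-envelope companion to the fleet-sizing methodology, the sketch below estimates the number of returnable articles needed from shipment rate, cycle time, and return rate; the formula and numbers are illustrative constructions, not the thesis's models.

```python
# Rough fleet-size estimate for returnable articles (RTI/RPM): cover the
# articles in circulation during one cycle plus a buffer for shrinkage.
def required_fleet(daily_shipments, cycle_time_days, return_rate, safety=1.1):
    """daily_shipments: articles issued per day
    cycle_time_days: average days until an article returns
    return_rate: fraction of issued articles that actually return (0..1)"""
    in_circulation = daily_shipments * cycle_time_days
    shrinkage_buffer = in_circulation * (1.0 - return_rate)
    return int(safety * (in_circulation + shrinkage_buffer))

print(required_fleet(daily_shipments=200, cycle_time_days=12, return_rate=0.95))
```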
Abstract:
The bryostatins are a unique family of emerging cancer chemotherapeutic candidates isolated from marine bryozoa. Although the biochemical basis for their therapeutic activity is not known, these macrolactones exhibit high affinities for protein kinase C (PKC) isozymes, compete for the phorbol ester binding site on PKC, and stimulate kinase activity in vitro and in vivo. Unlike the phorbol esters, they are not first-stage tumor promoters. The design, computer modeling, NMR solution structure, PKC binding, and functional assays of a unique class of synthetic bryostatin analogs are described. These analogs (7b, 7c, and 8) retain the putative recognition domain of the bryostatins but are simplified through deletions and modifications in the C4-C14 spacer domain. Computer modeling of an analog prototype (7a) indicates that it exists preferentially in two distinct conformational classes, one in close agreement with the crystal structure of bryostatin 1. The solution structure of synthetic analog 7c was determined by NMR spectroscopy and found to be very similar to the previously reported structures of bryostatins 1 and 10. Analogs 7b, 7c, and 8 bound strongly to PKC isozymes with Ki = 297, 3.4, and 8.3 nM, respectively. Control 7d, like the corresponding bryostatin derivative, exhibited weak PKC affinity, as did the derivative, 9, lacking the spacer domain. Like bryostatin, acetal 7c exhibited significant levels of in vitro growth inhibitory activity (1.8–170 ng/ml) against several human cancer cell lines, providing an important step toward the development of simplified, synthetically accessible analogs of the bryostatins.
Abstract:
The extent to which new technological knowledge flows across institutional and national boundaries is a question of great importance for public policy and the modeling of economic growth. In this paper we develop a model of the process generating subsequent citations to patents as a lens for viewing knowledge diffusion. We find that the probability of patent citation over time after a patent is granted fits well to a double-exponential function that can be interpreted as the mixture of diffusion and obsolescence functions. The results indicate that diffusion is geographically localized. Controlling for other factors, within-country citations are more numerous and come more quickly than those that cross country boundaries.
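One common parameterization of such a double-exponential citation-lag curve is p(t) = a(e^(-b1*t) - e^(-b2*t)), which rises as knowledge diffuses and falls as it becomes obsolete; the sketch below fits this form to synthetic data, since the paper's estimates are not reproduced here.

```python
# Fit a double-exponential citation-lag function to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def p_cite(t, a, b1, b2):
    """Citation probability t years after grant: diffusion minus obsolescence."""
    return a * (np.exp(-b1 * t) - np.exp(-b2 * t))

t = np.arange(1.0, 25.0)   # years since grant
rng = np.random.default_rng(3)
obs = p_cite(t, 0.05, 0.08, 0.6) + 0.0005 * rng.normal(size=t.size)

params, _ = curve_fit(p_cite, t, obs, p0=(0.05, 0.1, 0.5))
print("fitted (a, b1, b2):", np.round(params, 3))
```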