823 results for blended workflow
Abstract:
Expanded products have been developed by extrusion of non-conventional, highly nutritious raw materials such as amaranth and chickpea blended with bovine lung. As sensory acceptance of these snacks is limited, this study aimed at improving their texture by adding the flavor enhancers monosodium glutamate (MSG) and disodium inosinate (IMP) either to the feeding material or to the flavoring applied after extrusion. Sensory and mechanical analyses showed that both enhancers affected texture, as assessed by sensory and instrumental methods. Adding IMP together with MSG to the chickpea-based snacks gave the best results. This beneficial effect was not observed in the amaranth-based snack, suggesting that IMP and MSG can favorably affect the texture of extruded products depending on the amount and type of protein present.
Abstract:
Montage was one of the great innovations of avant-garde literature. As time went by, the technique lost some of its shock effect and increasingly became a method for structuring the text. After the avant-garde movement, collage began to be mentioned as a concept alongside montage, often as a complement, but the two concepts are frequently blended into each other. This article attempts to delimit what is montage and what is collage through an exposition of some theories and an examination of how they are applied in the work of a contemporary writer, the German Walter Kempowski.
Abstract:
Biopulping fundamentals, technology, and mechanisms are reviewed in this article. Mill evaluation of Eucalyptus grandis wood chips biotreated by Ceriporiopsis subvermispora in a 50-tonne pilot plant demonstrated that equivalent energy savings can be obtained in lab- and mill-scale biopulping. Some drawbacks concerning limited improvements in pulp strength and contamination of the chip pile with opportunistic fungi have been observed. The use of pre-cultured wood chips as inoculum seed for the biotreatment process minimized contamination problems related to the use of blended mycelium and corn-steep liquor in the inoculation step. Alkaline washing restored part of the brightness in biopulps, and marketable brightness values were obtained by one-stage bleaching with 5% H2O2 when bio-TMP pulps were under evaluation. Considering the current scenario, the understanding of biopulping mechanisms has gained renewed attention, because more resistant and competitive fungal species could be selected on the basis of a function-directed screening project. A series of studies aimed at elucidating structural changes in lignin during wood biodegradation by C. subvermispora indicated that lignin depolymerization occurs during the initial stages of wood biotreatment. Aromatic hydroxyls did not increase with the splitting of aryl-ether linkages, suggesting that the ether-cleavage products remain as quinone-type structures. On the other hand, cellulose is more resistant to attack by C. subvermispora. MnP-initiated lipid peroxidation reactions have been proposed to explain the degradation of non-phenolic lignin substructures by C. subvermispora, while the lack of cellobiohydrolases and the occurrence of systems able to suppress Fenton's reaction in the cultures have explained the inefficient cellulose degradation by this biopulping fungus. (C) 2007 Elsevier Inc. All rights reserved.
Abstract:
The compositions of canola, soybean, corn, cottonseed, and sunflower oils suggest that they exhibit substantially different propensities for oxidation, following the order canola < corn < cottonseed < sunflower ≈ soybean. These data suggest that any of the vegetable oils evaluated could be blended with minimal impact on viscosity, although compositional differences would surely affect oxidative stability. Cooling curve analysis showed that similar cooling profiles were obtained for the different vegetable oils. Interestingly, no film boiling or transition nucleate boiling was observed with any of the vegetable oils, and heat transfer occurs only by pure nucleate boiling and convection. High-temperature cooling of vegetable oils is considerably faster than that observed for petroleum oil-based quenchants. (C) 2010 Journal of Mechanical Engineering. All rights reserved.
Abstract:
Five vegetable oils (canola, soybean, corn, cottonseed, and sunflower) were characterized with respect to their composition, by gas chromatography, and their viscosity. The compositions of the vegetable oils suggest that they exhibit substantially different propensities for oxidation, following the order canola < corn < cottonseed < sunflower ≈ soybean. Viscosities at 40 °C and 100 °C and the viscosity index (VI) values were determined for the vegetable oils and two petroleum oil quenchants: Microtemp 157 (a conventional slow oil) and Microtemp 153B (an accelerated, or fast, oil). The kinematic viscosities of the different vegetable and petroleum oils at 40 °C were similar. The VI values for the different vegetable oils were very close, varying between 209 and 220, and were all much higher than the VI values obtained for Microtemp 157 (96) and Microtemp 153B (121). These data indicate that the viscosities of these vegetable oils are substantially less sensitive to temperature variation than those of the paraffinic-oil-based Microtemp 157 and Microtemp 153B. Although these data suggest that any of the vegetable oils evaluated could be blended with minimal impact on viscosity, the oxidative stability would surely be substantially impacted. Cooling curve analysis was performed on these vegetable oils at 60 °C under non-agitated conditions. These results were compared with cooling curves obtained under the same conditions for Microtemp 157, a conventional, unaccelerated petroleum oil, and Microtemp 153B, an accelerated petroleum oil. The results showed that the cooling profiles of the different vegetable oils were similar, as expected from the VI values. However, no boiling was observed with any of the vegetable oils, and heat transfer occurs only by convection, since there is no full-film boiling or nucleate boiling process as typically observed for petroleum oil quenchants, including those of this study.
Therefore, high-temperature cooling is considerably faster for vegetable oils as a class. The cooling properties obtained suggest that vegetable oils would be especially suitable for quenching low-hardenability steels such as carbon steels.
Abstract:
Ternary compatible blends of chitosan, poly(vinyl alcohol), and poly(lactic acid) were prepared by an oil-in-water (O/W) emulsion process. Solutions of chitosan in aqueous acetic acid, poly(vinyl alcohol) (PVA) in water, and poly(lactic acid) (PLA) in chloroform were blended with a high-shear mixer. PVA was used as an emulsifier to stabilize the emulsion and to reduce the interfacial tension between the solid polymers in the blends produced. It proved to work very well: the emulsions were stable for periods of days or weeks, and compatible blends were obtained when PVA was added. This effect was attributed to a synergy between PVA and chitosan, because the binary blends PVA/PLA and chitosan/PLA were completely incompatible. The blends were characterized by scanning electron microscopy (SEM), differential scanning calorimetry (DSC), thermal mechanical analysis (TMA), stress-strain tests, and Fourier transform infrared spectroscopy (FTIR). The results indicated that, despite the fact that the system contained distinct phases, some degree of molecular miscibility occurred when the three components were present in the blend.
Abstract:
At present, the cement industry generates approximately 5% of the world's anthropogenic CO₂ emissions. This share is expected to increase, since demand for cement-based products is forecast to multiply by a factor of 2.5 within the next 40 years, and the traditional strategies to mitigate emissions, focused on the production of cement, will not be capable of compensating for such growth. Therefore, additional mitigation strategies are needed, including an increase in the efficiency of cement use. This paper proposes indicators for measuring cement use efficiency, presents a benchmark based on literature data, and discusses potential gains in efficiency. The binder intensity (bi) index measures the amount of binder (kg m⁻³) necessary to deliver 1 MPa of mechanical strength, and consequently expresses the efficiency of using binder materials. The CO₂ intensity index (ci) allows estimating the global warming potential of concrete formulations. Research benchmarks show that bi ≈ 5 kg m⁻³ MPa⁻¹ is feasible and has already been achieved for concretes >50 MPa. However, concretes with lower compressive strengths have binder intensities varying between 10 and 20 kg m⁻³ MPa⁻¹. These values can be a result of the minimum cement content established in many standards and reveal a significant potential for performance gains. In addition, combinations of low bi and ci are shown to be feasible. (c) 2010 Elsevier Ltd. All rights reserved.
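Both indices described above reduce to simple ratios over a mix design. The sketch below uses hypothetical mix values (250 kg of binder, 200 kg of embodied CO2), not figures taken from the paper:

```python
def binder_intensity(binder_kg_per_m3, strength_mpa):
    """bi index: kg of binder per m^3 of concrete needed to
    deliver 1 MPa of compressive strength (kg m^-3 MPa^-1)."""
    return binder_kg_per_m3 / strength_mpa

def co2_intensity(co2_kg_per_m3, strength_mpa):
    """ci index: kg of embodied CO2 per m^3 of concrete per MPa,
    an estimate of the global warming potential of the mix."""
    return co2_kg_per_m3 / strength_mpa

# Hypothetical 50 MPa mix containing 250 kg of binder per m^3:
bi = binder_intensity(250, 50)   # 5.0, matching the feasible benchmark
ci = co2_intensity(200, 50)      # 4.0 kg CO2 m^-3 MPa^-1
```

A low-strength mix with the same binder content, say 20 MPa, would score bi = 12.5, inside the 10-20 range the benchmark reports for low-strength concretes.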
Abstract:
This document records the process of migrating eprints.org data to a Fez repository. Fez is a Web-based digital repository and workflow management system based on Fedora (http://www.fedora.info/). At the time of migration, the University of Queensland Library was using EPrints 2.2.1 [pepper] for its ePrintsUQ repository. Once we began to develop Fez, we did not upgrade to later versions of the eprints.org software, since we knew we would be migrating data from ePrintsUQ to the Fez-based UQ eSpace. Since this document records our experience of migration from an earlier version of eprints.org, anyone seeking to migrate eprints.org data into a Fez repository might encounter some small differences. Moving UQ publication data from an eprints.org repository into a Fez repository (hereafter called UQ eSpace, http://espace.uq.edu.au/) was part of a plan to integrate metadata (and, in some cases, full texts) about all UQ research outputs, including theses, images, multimedia, and datasets, in a single repository. This tied in with the plan to identify and capture the research output of a single institution, the main task of the eScholarshipUQ testbed for the Australian Partnership for Sustainable Repositories project (http://www.apsr.edu.au/). The migration could not occur at UQ until the functionality in Fez was at least equal to that of the existing ePrintsUQ repository. Accordingly, as Fez development proceeded throughout 2006, a list of eprints.org functionality not yet supported in Fez was created so that such development could be planned for and implemented.
Abstract:
The aim of this study was to evaluate the response of osteoblastic cells to a composite of Ricinus communis polyurethane (RCP) and alkaline phosphatase (ALP) incubated in synthetic body fluid (SBF). Pure RCP (RCPp) and RCP blended with ALP at 6 mg/mL of polymer (RCP+ALP) were incubated in SBF for 17 days. Four groups of RCP were tested: RCPp, RCP+ALP, and RCPp and RCP+ALP incubated in SBF (RCPp/SBF and RCP+ALP/SBF). Stem cells from rat bone marrow were cultured under conditions that allowed osteoblastic differentiation on RCP discs, and the following were evaluated: cell adhesion, culture growth, cell viability, total protein content, ALP activity, and bone-like nodule formation. Data were compared by ANOVA or the Kruskal-Wallis test. The RCP+ALP group was highly cytotoxic and, therefore, was not considered here. Cell adhesion (p = 0.14), culture growth (p = 0.39), viability (p = 0.46), and total protein content (p = 0.12) were not affected by either RCP composition or incubation in SBF. ALP activity was affected (p = 0.0001) as follows: RCPp < RCPp/SBF < RCP+ALP/SBF. Bone-like nodule formation was not observed in any of the evaluated groups. The composite RCP+ALP prior to SBF incubation is cytotoxic and must not be considered as a biomaterial, but the incorporation of ALP into RCP followed by SBF incubation could be a useful alternative to improve the biological properties of RCP. (c) 2007 Wiley Periodicals, Inc.
Abstract:
The Ministry of Education in Singapore has embarked on the ambitious project of introducing IT in schools. The IT Masterplan, budgeted at a cost of $2 billion, aims to wire up all schools by the year 2002. While the well-funded IT Masterplan is seeing the project in its final phase of implementation, this paper argues for a "critical cyber pedagogy" along with the acquisition of the functional and operational skills of technology. Drawing on theories of critical multiliteracies (Burbules & Callister, 2000; Luke, 2000b; New London Group, 1996), this paper explores and suggests how an instructional design of two classroom activities can be utilized as new forms of cyber and technoliteracies. Through the critical evaluation of websites and hypertext construction, students will be equipped with a new literacy that extends reading and writing by incorporating new blended forms of hybrid textualities. This technology-assisted pedagogy can achieve the desired outcome of self-directed learning, teamwork, critical thinking and problem solving strategies necessary for a knowledge-based society.
Abstract:
This paper examines the development of starch-based plastics for use as biodegradable mulch film. A variety of starch-based polymers are blended with high performance biodegradable polyester polymers in order to determine the applicability of films to be processed on a film blowing line and to perform well in mulch film field trials. The process of material formulation, film blowing processing and scale-up and performance properties are highlighted for a successful material. Insights into future developments of starch-derived biodegradable polymers are given.
Abstract:
In recent years, it has become increasingly clear that neurodegenerative diseases involve protein aggregation, a process often used as a readout of disease progression and to develop therapeutic strategies. This work presents an image processing tool to automatically segment, classify, and quantify these aggregates and the whole 3D body of the nematode Caenorhabditis elegans. A total of 150 image data sets, containing different slices, were captured with a confocal microscope from animals of distinct genetic conditions. Because of the animals' transparency, most of the slice pixels appeared dark, hampering direct reconstruction of the body volume. Therefore, for each data set, all slices were stacked into a single 2D image in order to determine a volume approximation. The gradient of this image was input to an anisotropic diffusion algorithm that uses Tukey's biweight as the edge-stopping function. The median of the histogram of this outcome was used to dynamically determine a thresholding level, which allows the determination of a smoothed exterior contour of the worm; the medial axis of the worm body was obtained by thinning its skeleton. Based on this exterior contour diameter and the medial axis, random 3D points were then calculated to produce a volume mesh approximation. The protein aggregations were subsequently segmented based on an iso-value and blended with the resulting volume mesh. The results obtained were consistent with qualitative observations in the literature, allowing unbiased, reliable, and high-throughput quantification of protein aggregates. This may lead to significant improvements in treatment planning and preventive intervention for neurodegenerative diseases.
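The smoothing-and-thresholding step described here, anisotropic diffusion with Tukey's biweight as the edge-stopping function followed by a dynamic threshold at the histogram median, can be sketched as follows. This is a minimal Perona-Malik-style illustration; the parameter values (sigma, lam, n_iter) and the toy image are assumptions, not taken from the paper:

```python
import numpy as np

def tukey_g(grad, sigma):
    """Tukey's biweight edge-stopping function: 1 at zero gradient,
    falling to 0 for |grad| > sigma, so diffusion halts across edges."""
    return np.where(np.abs(grad) <= sigma,
                    (1.0 - (grad / sigma) ** 2) ** 2, 0.0)

def anisotropic_diffusion(img, n_iter=20, sigma=0.5, lam=0.15):
    """Perona-Malik-style diffusion with Tukey's biweight:
    smooths homogeneous regions while preserving strong edges."""
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences toward the four neighbours (wrap-around borders)
        north = np.roll(img, -1, axis=0) - img
        south = np.roll(img, 1, axis=0) - img
        east = np.roll(img, -1, axis=1) - img
        west = np.roll(img, 1, axis=1) - img
        img += lam * (tukey_g(north, sigma) * north + tukey_g(south, sigma) * south
                      + tukey_g(east, sigma) * east + tukey_g(west, sigma) * west)
    return img

# Toy image standing in for the stacked confocal slices:
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
smoothed = anisotropic_diffusion(img, sigma=0.1)
threshold = np.median(smoothed)   # dynamic level from the histogram median
mask = smoothed > threshold       # binary exterior contour of the "body"
```

The contour and medial-axis extraction that follow in the pipeline would operate on `mask`; a skeletonization routine (e.g. morphological thinning) is assumed but not shown.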
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfillment of the requirements for the master's degree in Audiovisual and Multimedia.
Abstract:
The current development of parallel, compute-intensive applications (HPC, High Performance Computing) for deployment on clusters of computers relies heavily on the message-passing model, for which the standardization efforts, e.g. MPI (Message Passing Interface), are worth highlighting. On the other hand, with the spread of the object-oriented programming paradigm to distributed environments (Java RMI, .NET Remoting), it becomes possible to consider that the execution of a parallel, compute-intensive application can be decomposed into several parallel execution flows, each flow consisting of one or more tasks executed in the context of distributed objects. Usually, in environments based on distributed objects, the specification, control, and synchronization of the various parallel execution flows is carried out explicitly and hard-coded into a main program, making later (and often necessary) modifications difficult. In this context, however, there are works proposing a decomposition approach that follows the workflow paradigm, with interactions between tasks through, among others, data-flow, control-flow, and finite-state-machine mechanisms. This work proposes and explores a model for the execution, synchronization, and control of multiple tasks that makes it possible to design compute-intensive applications flexibly, taking advantage of the parallel execution of tasks on different machines. The proposed model and its implementation in an experimental prototype make it possible to: specify applications using execution flows; submit flows for execution; and control and monitor the execution of those flows. The tasks involved in the execution flows can run on a set of distributed resources.
The main characteristics to highlight in the proposed model are its extensibility and the decoupling between the different components involved in running the execution flows. Test cases that validated the model and the implemented prototype are also described. Aware of the need to continue this line of research in the future, this work is a contribution toward demonstrating that the workflow paradigm is suitable for expressing and executing complex, compute-intensive applications in a parallel and distributed way.
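The decomposition of an application into flows of tasks with data-flow interactions can be illustrated with a minimal executor sketch. The task graph and names below are hypothetical, and a local thread pool stands in for the distributed resources the prototype targets:

```python
from concurrent.futures import ThreadPoolExecutor

def run_workflow(tasks, deps):
    """Minimal data-flow executor: a task runs as soon as all of its
    dependencies have produced results; independent tasks of the same
    wave run in parallel on the thread pool. `tasks` maps a name to a
    callable taking its dependencies' results; `deps` maps a name to
    the list of names it depends on."""
    results = {}
    pending = dict(deps)
    with ThreadPoolExecutor() as pool:
        while pending:
            # tasks whose inputs are all available form the next parallel wave
            ready = [name for name, d in pending.items()
                     if all(dep in results for dep in d)]
            if not ready:
                raise ValueError("cyclic or unsatisfiable dependencies")
            futures = {name: pool.submit(tasks[name],
                                         *(results[dep] for dep in pending[name]))
                       for name in ready}
            for name, fut in futures.items():
                results[name] = fut.result()
                del pending[name]
    return results

# Hypothetical flow: two independent tasks feed a combining task.
tasks = {"a": lambda: 2, "b": lambda: 3, "combine": lambda x, y: x + y}
deps = {"a": [], "b": [], "combine": ["a", "b"]}
results = run_workflow(tasks, deps)   # results["combine"] == 5
```

Replacing the thread pool with remote invocations (RMI, .NET Remoting, or message passing) changes only where `pool.submit` dispatches the task, which is the kind of decoupling between flow control and task execution the abstract argues for.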
Abstract:
We present a novel data analysis strategy which, combined with subcellular fractionation and liquid chromatography-mass spectrometry (LC-MS) based proteomics, provides a simple and effective workflow for global drug profiling. Five subcellular fractions were obtained by differential centrifugation, followed by high-resolution LC-MS and complete functional regulation analysis. The methodology combines functional regulation and enrichment analysis into a single visual summary. The workflow enables improved insight into perturbations caused by drugs. We provide a statistical argument to demonstrate that even crude subcellular fractions lead to improved functional characterization. We demonstrate this data analysis strategy on data obtained in an MS-based global drug profiling study; however, it can also be applied to other types of large-scale biological data.