36 results for Multi-Photon Processes

at Universidade do Minho


Relevance:

80.00%

Abstract:

Fiber membranes prepared from jute fragments can be valuable, low cost, and renewable. They have broad application prospects in packing bags, geotextiles, filters, and composite reinforcements. Traditionally, chemical adhesives have been used to improve the properties of jute fiber membranes. Here, a series of new laccase, laccase/mediator, and multi-enzyme synergistic treatments was attempted. After the laccase treatment of jute fragments, the mechanical properties and surface hydrophobicity of the produced fiber membranes increased because of the laccase-mediated cross-coupling of lignins through ether bonds. The optimum conditions were a buffer pH of 4.5 and an incubation temperature of 60 °C with 0.92 U/mL laccase for 3 h. Laccase/guaiacol and laccase/alkali lignin treatments resulted in remarkable increases in the mechanical properties; in contrast, the laccase/2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonate) (ABTS) and laccase/2,6-dimethoxyphenol treatments led to a decrease. The laccase/guaiacol system was favorable to the surface hydrophobicity of jute fiber membranes, whereas the laccase/alkali lignin system had the opposite effect. Xylanase/laccase and cellulase/laccase combined treatments were able to enhance both the mechanical properties and the surface hydrophobicity of jute fiber membranes; among these, the cellulase/laccase treatment performed better. Compared to the mechanical properties, the surface hydrophobicity of the jute fiber membranes showed only a slight increase after the multi-step enzymatic processes.

Relevance:

30.00%

Abstract:

Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Earthworks are often the most costly and time-consuming component of infrastructure construction (e.g., roads, railways and airports), and current pressure for higher productivity and safety highlights the need to optimize them, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of a multi-objective, global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (costs and duration), and focuses the evolutionary search (non-dominated sorting genetic algorithm II, NSGA-II) on compaction equipment allocation, using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were carried out using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
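
As a rough illustration of the selection principle behind the NSGA-II search over the two criteria, the sketch below implements the Pareto-dominance filter for candidate (cost, duration) pairs; the helper names and candidate values are hypothetical, not taken from the paper.

    # Minimal sketch of the Pareto-dominance filter used by NSGA-II-style
    # selection on the two earthwork objectives (cost, duration).
    # Candidate solutions and their objective values are hypothetical.

    def dominates(a, b):
        """True if solution a is no worse than b in both objectives
        (cost, duration) and strictly better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(objectives):
        """Return the non-dominated subset of a list of (cost, duration) pairs."""
        return [p for p in objectives
                if not any(dominates(q, p) for q in objectives if q != p)]

    # Example: (cost in k-euro, duration in days) for four candidate allocations.
    candidates = [(120.0, 30.0), (100.0, 35.0), (150.0, 25.0), (130.0, 32.0)]
    print(pareto_front(candidates))   # (130, 32) is dominated by (120, 30)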

Relevance:

30.00%

Abstract:

Earthworks tasks aim at levelling the ground surface of a target construction area and precede any kind of structural construction (e.g., road and railway construction). Earthworks comprise sequential tasks, such as excavation, transportation, spreading and compaction, and are strongly based on heavy mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of earthwork projects. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments carried out using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork design.
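
A minimal sketch of the coupling described above: a data-driven productivity model (here a stand-in random forest; the abstract does not commit to a particular algorithm) feeding the duration and cost objectives that the evolutionary optimizer would score. All feature names and figures are hypothetical.

    # Sketch: a data-driven productivity model feeding the two earthwork
    # objectives (duration, cost). Model choice and all figures are hypothetical.
    from sklearn.ensemble import RandomForestRegressor
    import numpy as np

    # Hypothetical training data: equipment features -> productivity (m3/h).
    X = np.array([[250, 0.8], [320, 0.9], [180, 0.7], [300, 0.85]])  # [power kW, operator skill]
    y = np.array([95.0, 130.0, 70.0, 120.0])
    productivity_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

    def objectives(volume_m3, equipment, hourly_costs):
        """Return (duration in h, cost in euro) for a candidate allocation."""
        rates = productivity_model.predict(np.asarray(equipment))   # m3/h per machine
        duration = volume_m3 / rates.sum()                          # machines work in parallel
        cost = duration * np.asarray(hourly_costs).sum()
        return duration, cost

    print(objectives(5000.0, [[300, 0.85], [250, 0.8]], [80.0, 65.0]))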

Relevance:

30.00%

Abstract:

Kinetic models have great potential for metabolic engineering applications. They can be used to test which genetic and regulatory modifications can increase the production of metabolites of interest, while simultaneously monitoring other key functions of the host organism. This work presents a methodology for increasing productivity in biotechnological processes by exploiting dynamic models. It uses multi-objective dynamic optimization to identify the combination of targets (enzymatic modifications) and the degree of up- or down-regulation that must be performed in order to optimize a set of pre-defined performance metrics subject to process constraints. The capabilities of the approach are demonstrated on a realistic and computationally challenging application: a large-scale metabolic model of Chinese Hamster Ovary (CHO) cells, which are used for antibody production in a fed-batch process. The proposed methodology provides sustained and robust growth in CHO cells, increasing productivity while simultaneously increasing biomass production and product titer and keeping the concentrations of lactate and ammonia at low values. The approach presented here can be used for optimizing metabolic models by finding the best combination of targets and their optimal level of up/down-regulation. Furthermore, it can accommodate additional trade-offs and constraints with great flexibility.
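
To make the idea of a "degree of up- or down-regulation" concrete, here is a minimal sketch in which fold-change factors scale the maximal rates of a toy two-reaction kinetic model and the response is simulated; the rate laws, parameters and species are illustrative stand-ins, not the CHO model used in the paper.

    # Sketch: up/down-regulating enzymes by fold-change factors in a toy
    # kinetic model and simulating the effect. Rate laws and numbers are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    def make_rhs(fold):
        """fold[i] scales the Vmax of enzyme i (1.0 = wild type)."""
        vmax = np.array([10.0, 6.0]) * np.asarray(fold)
        km = np.array([0.5, 1.2])
        def rhs(t, y):
            s, p = y                                   # substrate, product
            v1 = vmax[0] * s / (km[0] + s)             # uptake / first step
            v2 = vmax[1] * p / (km[1] + p)             # product consumption
            return [-v1, v1 - v2]
        return rhs

    def simulate(fold, s0=10.0, t_end=5.0):
        sol = solve_ivp(make_rhs(fold), (0.0, t_end), [s0, 0.0])
        return sol.y[:, -1]                            # final (substrate, product)

    print("wild type      :", simulate([1.0, 1.0]))
    print("enzyme 2 at 50%:", simulate([1.0, 0.5]))   # down-regulation raises the product pool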

Relevance:

30.00%

Abstract:

Doctoral Dissertation for PhD degree in Chemical and Biological Engineering

Relevance:

20.00%

Abstract:

This paper assesses land-use changes related to naturbanization processes in three biosphere reserves in Southern Europe. A comparative analysis was carried out of the National Parks of Peneda-Gerês in northern Portugal, Cévennes in southern France and Sierra Nevada in southern Spain, using Corine Land Cover data from 1990 to 2006. Results indicate that a process of land-use intensification is taking place within the frame of naturbanization dynamics, which could jeopardize the role of Protected Areas. Focusing on the trends faced by National Parks and their surrounding territories, the analysis demonstrates, both in quantitative and spatial terms, the intensification of land-use changes and how important it is to understand them in order to cope with increasing threats. The article concludes that, in the current context of increasing stresses, a broader focus on nature protection, encompassing the wider countryside, is needed if initiatives for biodiversity protection are to be effective.

Relevance:

20.00%

Abstract:

Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, the task of setting appropriate weights to the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, one single-objective and two multi-objective EAs, in two tasks related to weight setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices and the second considers the possibility of link failures. The methods thus need to simultaneously optimize for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs, such as SPEA2 and NSGA-II, comes naturally, and these are compared with a single-objective EA. The results show a remarkable behavior of NSGA-II in all proposed tasks, scaling well to harder instances, and thus it presents itself as the most promising option for TE in these scenarios.
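
As an illustration of the kind of fitness evaluation such weight-setting EAs rely on, the sketch below routes each demand on its weighted shortest path and reports the worst link utilization; the topology, capacities and demands are hypothetical, and the scoring is deliberately simplified (no ECMP traffic splitting and no congestion penalty function such as those typically used in this line of work).

    # Sketch: scoring one candidate link-weight setting by routing every demand
    # on its weighted shortest path and reporting the worst link utilization.
    # Topology, capacities, and demands are illustrative; ECMP splitting is ignored.
    import networkx as nx

    def congestion(graph, weights, demands):
        """weights: {(u, v): w}; demands: {(src, dst): Mbps}. Returns max utilization."""
        g = graph.copy()
        for (u, v), w in weights.items():
            g[u][v]["weight"] = w
        load = {e: 0.0 for e in g.edges()}
        for (src, dst), traffic in demands.items():
            path = nx.shortest_path(g, src, dst, weight="weight")
            for u, v in zip(path, path[1:]):
                load[(u, v) if (u, v) in load else (v, u)] += traffic
        return max(load[e] / g.edges[e]["capacity"] for e in load)

    g = nx.Graph()
    g.add_edge("a", "b", capacity=100.0)
    g.add_edge("b", "c", capacity=100.0)
    g.add_edge("a", "c", capacity=100.0)
    print(congestion(g, {("a", "b"): 1, ("b", "c"): 1, ("a", "c"): 3},
                     demands={("a", "c"): 60.0, ("a", "b"): 20.0}))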

Relevance:

20.00%

Abstract:

Modeling Extract-Transform-Load (ETL) processes of a Data Warehousing System has always been a challenge. The heterogeneity of the sources, the quality of the data obtained and the conciliation process are some of the issues that must be addressed in the design phase of this critical component. Commercial ETL tools often provide proprietary diagrammatic components and modeling languages that are not standard, thus not providing the ideal separation between a modeling platform and an execution platform. This separation, in conjunction with the use of standard notations and languages, is critical in a system that tends to evolve over time and that cannot afford to be undermined by a typically expensive tool that turns out to be an unsatisfactory component. In this paper we demonstrate the application of Relational Algebra as a modeling language for an ETL system, in an effort to standardize operations and provide a basis for uncommon ETL execution platforms.
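
As a hypothetical illustration of the mapping (the concrete tables and operations below are ours, not the paper's), one conciliation step can be written as a relational algebra expression and then executed by any engine that implements selection, projection and join; the pandas sketch mirrors the expression shown in the comment.

    # Sketch: one hypothetical ETL step written first as a relational algebra
    # expression and then executed with pandas.
    #   pi_{customer_id, amount} ( sigma_{year = 2023} (sales)  join  customers )
    import pandas as pd

    sales = pd.DataFrame({"customer_id": [1, 1, 2], "year": [2022, 2023, 2023],
                          "amount": [50.0, 80.0, 30.0]})
    customers = pd.DataFrame({"customer_id": [1, 2], "country": ["PT", "ES"]})

    selected = sales[sales["year"] == 2023]                      # sigma (selection)
    joined = selected.merge(customers, on="customer_id")         # natural join
    projected = joined[["customer_id", "amount"]]                # pi (projection)
    print(projected)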

Relevance:

20.00%

Abstract:

PhD thesis in Bioengineering

Relevance:

20.00%

Abstract:

The usually high cost of commercial codes, together with some technical limitations, clearly limits the use of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on his or her specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly accelerating the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. Concerning the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified by comparing the predicted results with analytical solutions and results published in the literature, and by using the Method of Manufactured Solutions.
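
The Method of Manufactured Solutions mentioned above can be illustrated with a small symbolic sketch: pick an analytical field, apply a model operator to it, and keep the residual as a source term, so the chosen field becomes the exact solution of the forced problem and discretization errors can be measured against it. The 1-D convection-diffusion operator below is only a stand-in for the integral viscoelastic equations actually solved in the paper.

    # Sketch of the Method of Manufactured Solutions (MMS): choose an analytical
    # field, apply the governing operator symbolically, and keep the residual as a
    # source term, so the chosen field is the exact solution of the forced problem.
    # The 1-D convection-diffusion operator is a stand-in for the viscoelastic model.
    import sympy as sp

    x, t = sp.symbols("x t")
    u, nu = sp.Rational(1), sp.Rational(1, 10)          # convection speed, diffusivity

    phi = sp.sin(sp.pi * x) * sp.exp(-t)                # manufactured solution
    residual = sp.diff(phi, t) + u * sp.diff(phi, x) - nu * sp.diff(phi, x, 2)
    source = sp.simplify(residual)                      # add this as a source term

    print(source)   # the solver's numerical phi should converge to sin(pi x) exp(-t)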

Relevance:

20.00%

Abstract:

The usually high cost of commercial codes, together with some technical limitations, clearly limits the use of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on his or her specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly accelerating the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. Concerning the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified by comparing the predicted results with analytical solutions and results published in the literature, and by using the Method of Manufactured Solutions.

Relevance:

20.00%

Abstract:

In several industrial applications, materials with highly complex behaviour are processed through intricate mixing processes, which makes it difficult to achieve the desired properties of the produced materials. This is the case of the well-known dispersion of nano-sized fillers in a polymer melt matrix, used to improve the mechanical and/or electrical properties of the nanocomposite. This mixing is usually performed in twin-screw extruders, which promote complex flow patterns, and, since in situ analysis of the material evolution and mixing is difficult to perform, numerical tools can be very useful to predict the evolution and behaviour of the material. This work presents a numerically based study to improve the understanding of mixing processes. Initial numerical studies were performed with generalized Newtonian fluids, but, due to the null relaxation time that characterizes this type of fluid, the assumption of viscoelastic behaviour was required. Therefore, the polymer melt was rheologically characterized, and six-mode Phan-Thien-Tanner and Giesekus models were used to fit the rheological data. These viscoelastic rheological models were then used to model the process. The conclusions obtained in this work provide additional and useful data to correlate the type and intensity of the deformation history imposed on the polymer nanocomposite with the quality of the mixing obtained.
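
In the small-strain limit, both the Phan-Thien-Tanner and Giesekus models reduce to a multi-mode Maxwell spectrum, so the fitting step mentioned above can be illustrated by fitting mode strengths and relaxation times to measured storage and loss moduli; the frequencies, moduli values, mode count and starting guesses below are hypothetical.

    # Sketch: fitting a multi-mode Maxwell spectrum (the linear viscoelastic
    # envelope shared by PTT and Giesekus models) to storage/loss moduli data.
    # Frequencies, moduli, and starting values are hypothetical.
    import numpy as np
    from scipy.optimize import least_squares

    def maxwell_moduli(params, omega):
        """params = [g1, lam1, g2, lam2, ...]; returns (G', G'')."""
        g, lam = params[0::2], params[1::2]
        wl = np.outer(omega, lam)
        gp = (g * wl**2 / (1.0 + wl**2)).sum(axis=1)
        gpp = (g * wl / (1.0 + wl**2)).sum(axis=1)
        return gp, gpp

    def residuals(params, omega, gp_meas, gpp_meas):
        gp, gpp = maxwell_moduli(params, omega)
        return np.concatenate([(gp - gp_meas) / gp_meas, (gpp - gpp_meas) / gpp_meas])

    omega = np.array([0.1, 1.0, 10.0, 100.0])            # rad/s
    gp_meas = np.array([12.0, 600.0, 6500.0, 17000.0])   # Pa (hypothetical)
    gpp_meas = np.array([350.0, 2900.0, 9000.0, 12000.0])

    x0 = [1e4, 1.0, 2e4, 0.01]                           # two modes: [g, lambda] pairs
    fit = least_squares(residuals, x0, args=(omega, gp_meas, gpp_meas),
                        bounds=(0.0, np.inf))
    print(fit.x)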

Relevance:

20.00%

Abstract:

A search is performed for top-quark pairs (tt̄) produced together with a photon (γ) with transverse momentum > 20 GeV, using a sample of tt̄ candidate events in final states with jets, missing transverse momentum, and one isolated electron or muon. The dataset used corresponds to an integrated luminosity of 4.59 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 7 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. In total, 140 and 222 tt̄γ candidate events are observed in the electron and muon channels, to be compared to the expectation of 79 ± 26 and 120 ± 39 non-tt̄γ background events, respectively. The production of tt̄γ events is observed with a significance of 5.3 standard deviations with respect to the null hypothesis. The tt̄γ production cross section times the branching ratio (BR) of the single-lepton decay channel is measured in a fiducial kinematic region within the ATLAS acceptance. The measured value is σ_fid(tt̄γ) = 63 ± 8 (stat.) +17/−13 (syst.) ± 1 (lumi.) fb per lepton flavor, in good agreement with the leading-order theoretical calculation normalized to the next-to-leading-order theoretical prediction of 48 ± 10 fb.
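
For orientation only, the sketch below applies the textbook Asimov counting-experiment significance to the muon-channel numbers quoted above; because it ignores the ± 39 background uncertainty and the systematics handled by the profile-likelihood fit in the actual analysis, it overestimates the published 5.3 standard deviations.

    # Sketch: textbook counting-experiment significance (Asimov approximation,
    # no background or systematic uncertainties) for numbers of the size quoted
    # in the abstract. Not the profile-likelihood procedure used by ATLAS, so it
    # comes out larger than the published 5.3 sigma.
    import math

    def asimov_significance(n_obs, b):
        """Z = sqrt(2 * (n*ln(n/b) - (n - b))) for observed n and expected background b."""
        return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))

    # Muon channel numbers from the abstract: 222 observed, 120 expected background.
    print(round(asimov_significance(222, 120), 1))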

Relevance:

20.00%

Abstract:

Results of a search for new phenomena in events with an energetic photon and large missing transverse momentum with the ATLAS experiment at the LHC are reported. Data were collected in proton-proton collisions at a center-of-mass energy of 8 TeV and correspond to an integrated luminosity of 20.3 fb⁻¹. The observed data are well described by the expected Standard Model backgrounds. The expected (observed) upper limit on the fiducial cross section for the production of such events is 6.1 (5.3) fb at 95% confidence level. Exclusion limits are presented on models of new phenomena with large extra spatial dimensions, supersymmetric quarks, and direct pair production of dark-matter candidates.
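
A small bookkeeping sketch relating the quoted fiducial cross-section limits to the number of particle-level events they correspond to at the quoted luminosity; this is simple arithmetic, not the CLs statistical procedure used to derive the limits themselves.

    # Sketch: bookkeeping between a fiducial cross-section limit and the number of
    # particle-level events it corresponds to at the quoted integrated luminosity.
    # This is not the statistical procedure used to set the limit itself.

    def fiducial_events(sigma_fid_fb, lumi_fb_inv):
        """N = sigma_fid * L (before detector-level efficiency corrections)."""
        return sigma_fid_fb * lumi_fb_inv

    observed_limit_fb = 5.3        # observed 95% CL upper limit from the abstract
    expected_limit_fb = 6.1        # expected 95% CL upper limit from the abstract
    lumi = 20.3                    # fb^-1

    print(fiducial_events(observed_limit_fb, lumi))   # ~108 particle-level events
    print(fiducial_events(expected_limit_fb, lumi))   # ~124 particle-level events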