Abstract:
A protein extract containing a plant lipase from the oleaginous seeds of Pachira aquatica was tested using soybean oil, wastewater from a poultry processing plant, and beef fat particles as substrates. The hydrolysis experiments were carried out at a temperature of 40 °C, an incubation time of 90 minutes, and pH 8.0-9.0. The enzyme was most stable at pH 9.0 and showed good stability across the alkaline range. P. aquatica lipase was found to be stable in the presence of some commercial laundry detergent formulations, and it retained full activity in hydrogen peroxide at concentrations up to 0.35%, losing activity at higher concentrations. In the wastewater, the lipase increased free fatty acid release 7.4-fold and hydrolysed approximately 10% of the fats, suggesting that it could be included in a pretreatment stage, especially for vegetable oil degradation.
Abstract:
There is currently a growing need for software tailored to the customer, capable of adapting quickly to the constant changes in the customer's business area. Each customer has concrete problems to solve and often cannot afford to dedicate a large amount of resources to achieve the intended goals. To address these problems, several software development architectures and methodologies have emerged that enable the agile development of highly configurable applications, which can be customised by any of their users. This dynamism, brought to applications in the form of models that are customised by users and interpreted by a generic platform, creates greater challenges when it comes to testing, since there is a considerably larger number of variables than in an application with a traditional architecture. It is necessary, at all times, to guarantee the integrity of all models, as well as of the platform responsible for interpreting them, without constantly having to develop applications to support testing over the different models. This thesis focuses on one such application, the myMIS platform, which interprets management-oriented models written in a domain-specific language; its current state is assessed and a proposal of testing practices to apply in its development is defined. The proposal resulting from this thesis showed that, despite the difficulties inherent to the application's architecture, developing tests in a generic way is possible, and the same test logic can be reused to test several distinct models.
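The idea of one generic test reused across distinct models can be sketched as follows. This is a hypothetical illustration, not the myMIS platform: the model names, fields, and validation rules are invented.

```python
# Hypothetical sketch, not the myMIS codebase: a single generic
# integrity check applied to several distinct user-defined models.

def validate_model(model: dict) -> list[str]:
    """Return integrity errors against rules shared by all models."""
    errors = []
    if not model.get("name"):
        errors.append("model must have a name")
    for field in model.get("fields", []):
        if "type" not in field:
            errors.append(f"field {field.get('id', '?')} lacks a type")
    return errors

# The same test logic is reused for different models, instead of
# writing one bespoke test application per model.
models = [
    {"name": "Invoice", "fields": [{"id": "total", "type": "decimal"}]},
    {"name": "Customer", "fields": [{"id": "email"}]},  # missing type
]
reports = {m["name"]: validate_model(m) for m in models}
```

The key design point is that the validation routine depends only on the model metadata, so adding a new model requires no new test code.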
Abstract:
Universidade Estadual de Campinas, Faculdade de Educação Física
Abstract:
Scientific dissertation submitted to obtain the degree of Master in Informatics and Computer Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Industrial Engineering and Management
Abstract:
Organisms that attach to ship hulls in water bodies are both an economic and a cosmetic problem. Uncontrolled fouling causes friction, which in turn slows the ship and increases fuel consumption. Fouling is usually prevented with antifouling paints. Their action is based on the dissolution of biocides, which creates a high biocide concentration at the interface between the water and the coating, preventing organisms from attaching to the surface. A worldwide ban on organotin compounds in antifouling paints comes into force at the beginning of 2003. At present, 70% of the world's fleets are protected with antifouling paint containing an organotin compound. There is therefore a growing need to develop new, more environmentally friendly antifouling coatings. Organotin compounds will most likely be replaced by synthetic organic compounds used together with copper. The aim of this work was to prepare a more environmentally friendly unsaturated polyester coating that would itself prevent the attachment of organisms. The literature part reviewed the biocides on the market, their toxicity and environmental effects, and the changing legislation. The coatings currently on the market and their mechanisms of action were also examined, along with non-toxic alternative antifouling coatings. The experimental part consisted of two stages. In the first stage, the suitability of biocides for use together with unsaturated polyester was studied. Compatibility was determined through application tests and on the basis of the coating's behaviour. The second stage was to determine the effectiveness of the coating against algal attachment. An unsaturated polyester gel coat with antifouling properties was prepared by dispersing biocides into an unsaturated polyester gel.
Based on the results of the compatibility tests, it was found that adding biocides to the gel does not noticeably impair the application properties. The stability of the Brookfield viscosity even improves, and one of the biocides used in this work improves the weathering resistance of the coating. It was not possible in this work to determine differences between the biocides in their effectiveness against algae.
Abstract:
Georeferencing is one of the major tasks of satellite-borne remote sensing. Compared to traditional indirect methods, direct georeferencing through a Global Positioning System/inertial navigation system requires fewer and simpler steps to obtain the exterior orientation parameters of remotely sensed images. However, the pixel shift caused by geographic positioning error, which generally derives from the boresight angle as well as terrain topography variation, can have a great impact on the precision of georeferencing. The distribution of pixel shifts introduced by positioning error on a satellite linear push-broom image is quantitatively analyzed. We use the variation of the object space coordinates to simulate different kinds of positioning errors and terrain topography. A total differential method is then applied to establish a rigorous sensor model and mathematically obtain the relationship between pixel shift and positioning error. Finally, two simulation experiments are conducted using the imaging parameters of the Chang'E-1 satellite to evaluate two different kinds of positioning errors. The experimental results show that, with the experimental parameters, the maximum pixel shift can reach 1.74 pixels. The proposed approach can be extended to a generic application for imaging error modeling in remote sensing with terrain variation.
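As a first-order illustration of how an angular positioning error translates into a pixel shift, consider the flat-terrain geometry below. This is a hedged sketch only: the orbit height, look angle, boresight error, and ground sample distance are invented values, not the Chang'E-1 parameters used in the paper.

```python
import math

def pixel_shift(height_m: float, look_angle_rad: float,
                boresight_err_rad: float, gsd_m: float) -> float:
    """Ground displacement caused by a small boresight angle error,
    expressed in pixels (flat-terrain, first-order approximation)."""
    ground_true = height_m * math.tan(look_angle_rad)
    ground_err = height_m * math.tan(look_angle_rad + boresight_err_rad)
    return (ground_err - ground_true) / gsd_m

# Hypothetical numbers: 200 km altitude, nadir view, 0.001 deg
# boresight error, 120 m ground sample distance.
shift = pixel_shift(200_000.0, 0.0, math.radians(0.001), 120.0)
```

Even this toy geometry shows why sub-millidegree boresight accuracy matters: the shift scales linearly with both flying height and the angular error.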
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
A control-oriented model of a Dual Clutch Transmission (DCT) was developed for real-time Hardware In the Loop (HIL) applications, to support model-based development of the DCT controller. The model is an innovative attempt to reproduce the fast dynamics of the actuation system while maintaining a step size large enough for real-time applications. The model comprises a detailed physical description of the hydraulic circuit, clutches, synchronizers and gears, and simplified vehicle and internal combustion engine sub-models. As the oil circulating in the system has a large bulk modulus, the pressure dynamics are very fast, possibly causing instability in a real-time simulation; the same challenge affects the servo valve dynamics, due to the very small masses of the moving elements. Therefore, the hydraulic circuit model has been modified and simplified without losing physical validity, in order to adapt it to the real-time simulation requirements. The results of offline simulations have been compared to on-board measurements to verify the validity of the developed model, which was then implemented in a HIL system and connected to the TCU (Transmission Control Unit). Several tests have been performed: electrical failure tests on sensors and actuators, hydraulic and mechanical failure tests on hydraulic valves, clutches and synchronizers, and application tests covering all the main features of the control performed by the TCU. Being based on physical laws, the model simulates a plausible reaction of the system in every condition. The first intensive use of the HIL application led to the validation of the new safety strategies implemented inside the TCU software. A test automation procedure has been developed to permit the execution of a pattern of tests without user interaction; fully repeatable tests can be performed for non-regression verification, allowing new software releases to be tested in fully automatic mode.
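The stiffness issue caused by the large bulk modulus can be illustrated with a toy chamber-pressure model (all constants below are hypothetical, not taken from the actual DCT model): for dP/dt = (β/V)(Q_in − k·P), the time constant τ = V/(β·k) falls well below a typical real-time step, so an explicit solver diverges unless the hydraulic model is simplified.

```python
# Toy example of stiff hydraulic pressure dynamics; the constants are
# invented for illustration, not taken from the DCT model in the text.
BETA = 1.5e9      # oil bulk modulus [Pa]
VOLUME = 2.0e-4   # chamber volume [m^3]
K_LEAK = 1.0e-9   # leakage coefficient [m^3/(s*Pa)]
Q_IN = 2.0e-5     # supply flow [m^3/s]

def simulate(dt: float, steps: int) -> float:
    """Explicit-Euler integration of dP/dt = (BETA/VOLUME)*(Q_IN - K_LEAK*P)."""
    p = 0.0
    for _ in range(steps):
        p += dt * (BETA / VOLUME) * (Q_IN - K_LEAK * p)
    return p

# Time constant tau = VOLUME / (BETA * K_LEAK) ~ 0.13 ms, so explicit
# Euler is only stable for dt < 2*tau ~ 0.27 ms.
p_fine = simulate(1.0e-5, 1000)   # dt = 0.01 ms: converges to Q_IN/K_LEAK
p_coarse = simulate(1.0e-3, 10)   # dt = 1 ms: numerically diverges
```

This is exactly the trade-off the abstract describes: either the step size shrinks below real-time feasibility, or the fast hydraulic dynamics must be simplified away while preserving physical validity.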
Abstract:
Internship report submitted in fulfilment of the requirements for the degree of Master in Organisational Information Systems
Abstract:
Emulsions and microcapsules are typical structures in various dispersion formulations for pharmaceutical, food, personal care and household care applications. Precise control over the size and size distribution of emulsion droplets and microcapsules is important for the effective use and delivery of active components and for better product quality. Many emulsification technologies have been developed to meet different formulation and processing requirements. Among them, membrane and microfluidic emulsification are emerging technologies able to manufacture droplets precisely, drop by drop, with prescribed sizes and size distributions at lower energy consumption. This paper reviews the fundamental science and engineering aspects of emulsification, membrane and microfluidic emulsification technologies, and their use for the precision manufacture of emulsions for intensified processing. Generic application examples are given for single and double emulsions and for microcapsules with different structural features. © 2013 The Society of Powder Technology Japan. Published by Elsevier B.V.
Abstract:
The Toledo Gate of Ciudad Real, Spain, constructed between the late 13th and early 14th centuries, is the last remaining portion of a once complete medieval city wall. It represents the long history of the city and constitutes its main heritage symbol, dividing the historic city centre from the later 19th- and 20th-century expansions. In October 2012, the Town Hall and the Montemadrid Foundation started the conservation works to preserve this important monument. The preliminary phase of this project included an in-depth series of scientific studies carried out by a multidisciplinary team, focusing on archival research, historic investigations and archaeological excavations as well as material composition analysis and main treatment application tests. As a result of these studies, a series of virtual 3D models were created to inform, discuss and study the monument. A first digital model permitted visualization of the gate in the 19th century and of how the main entrance to the city was integrated as a fundamental part of the city walls. This virtual reconstruction also became an important part of the campaign to raise awareness among citizens of a monument that had remained in the shadows for the last century, isolated in a roundabout after the systematic demolition of the city walls in the late 19th century. Over the last three years, as a result of these archaeological and historic investigations and the subsequent virtual models, surprisingly new and interesting data were brought to light, permitting the establishment and corroboration of a new and updated hypothesis of the Toledo Gate that goes beyond previous ideas. As a result of these studies, a new architectural typology, together with its construction techniques, has been suggested. This paper describes how the results of this continuous and interdisciplinary documentation process have benefitted from a computer graphic reconstruction of the gate.
It highlights how virtual reconstruction can be a powerful tool for conservation decision-making and awareness-raising. Furthermore, the interesting results of the final reconstruction hypothesis convinced the technical team responsible for the conservation to alter some aspects of the final project's physical interventions, in order to focus on some of the features and conclusions discovered through the virtual model study.
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. Through the application of land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition. On the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be accessed and manipulated by modeling applications. The scripting language interpreter is embedded in SITE.
The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model testing and the analysis of simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, for which the respective reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
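A figure-of-merit objective of the kind described can be sketched as a stand-alone Python function (Python being the scripting language SITE embeds). This is an illustrative, simplified version over invented land-use codes, not SITE's actual implementation.

```python
def figure_of_merit(initial, reference, simulated):
    """Simplified figure of merit: hits / (hits + misses + false alarms),
    counted over cells whose observed or simulated state changed."""
    hits = misses = false_alarms = 0
    for ini, ref, sim in zip(initial, reference, simulated):
        observed_change = ref != ini
        simulated_change = sim != ini
        if observed_change and sim == ref:
            hits += 1            # change simulated correctly
        elif observed_change:
            misses += 1          # change missed (or wrong category)
        elif simulated_change:
            false_alarms += 1    # change simulated where none occurred
    denom = hits + misses + false_alarms
    return hits / denom if denom else 1.0

# Invented 4-cell maps: one hit, one miss, one false alarm -> 1/3.
initial   = ["forest", "forest", "crop", "forest"]
reference = ["crop",   "forest", "crop", "crop"]
simulated = ["crop",   "crop",   "crop", "forest"]
fom = figure_of_merit(initial, reference, simulated)
```

A genetic algorithm can then maximise this score over the selected model parameters, which is the calibration loop the thesis describes.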
Abstract:
The competence evaluation promoted by the European Higher Education Area entails a very important methodological change that requires guiding support to help teachers carry out this new and complex task. In this regard, the Technical University of Madrid (UPM, by its Spanish acronym) has financed a series of coordinated projects with a two-fold objective: a) to develop a model for teaching and evaluating core competences that is useful and easily applicable to its different degrees, and b) to provide support to teachers by creating an area within the Website for Educational Innovation where they can search for information on the model corresponding to each core competence approved by UPM. The information available on each competence includes its definition, the formulation of indicators providing evidence of the level of acquisition, the recommended teaching and evaluation methodology, examples of evaluation rules for the different levels of competence acquisition, and descriptions of best practices. These best practices correspond to pilot tests applied to several academic subjects taught at UPM in order to validate the model. This work describes the general procedure that was used and presents the model developed specifically for the problem-solving competence. Some of the pilot experiences are also summarised and their results analysed.