Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers to produce models that describe the probabilistic distribution of species. Generating an ecological niche model is a complex process: it involves handling large amounts of data and using different software packages for data conversion, model generation and various kinds of processing and analysis, among other functionalities. A software platform that integrates all of these requirements under a single, seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements in terms of functions, algorithms and data formats are continually being added, and any software intended for use in this area must keep pace with this evolution. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, a reference business process must be designed before the architecture is defined. The purpose is to understand the complexities of the process (business process here refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a larger effort to define a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps performed while developing a model are described, highlighting important aspects based on the knowledge of modelling experts. To illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. As a consequence of the knowledge gained with this work, many desirable improvements to modelling software packages have been identified and are presented. A discussion of the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results are valuable to those involved in developing modelling tools and systems, both for requirements analysis and for insight into new features and trends in this category of systems. They can also be very helpful to beginners in modelling research, who can use the process and the example experiment as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
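To make the basic modelling steps concrete, the sketch below walks through a stripped-down version of the workflow the abstract describes: sample environmental layers at occurrence points, fit a statistical model, and project it back over the study area. It is only an illustration under assumed data; the grids, occurrence cells, pseudo-absence strategy and the choice of logistic regression are placeholders, not openModeller's actual algorithms or API.

```python
# Minimal, illustrative niche-modelling sketch: occurrence points +
# environmental layers -> model -> suitability map.  All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Two hypothetical environmental raster layers on a 100x100 grid
# (e.g. mean temperature and precipitation, already on a common extent).
temperature = rng.normal(22.0, 3.0, size=(100, 100))
precipitation = rng.normal(1300.0, 200.0, size=(100, 100))

# Hypothetical occurrence points given as (row, col) grid cells.
occurrences = np.array([[10, 12], [11, 15], [40, 42], [41, 45], [70, 72]])

# Pseudo-absences drawn at random from the study area (a common simple choice).
absences = rng.integers(0, 100, size=(50, 2))

def extract(cells):
    """Sample every environmental layer at the given grid cells."""
    rows, cols = cells[:, 0], cells[:, 1]
    return np.column_stack([temperature[rows, cols], precipitation[rows, cols]])

X = np.vstack([extract(occurrences), extract(absences)])
y = np.concatenate([np.ones(len(occurrences)), np.zeros(len(absences))])

# Fit the model, then project it onto the whole grid to obtain a
# habitat-suitability map.
model = LogisticRegression().fit(X, y)
grid = np.column_stack([temperature.ravel(), precipitation.ravel()])
suitability = model.predict_proba(grid)[:, 1].reshape(temperature.shape)
print("max suitability:", suitability.max())
```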
Abstract:
Rupture of a light cellophane diaphragm in an expansion tube has been studied by an optical method. The influence of the light diaphragm on test flow generation has long been recognised; however, the diaphragm rupture mechanism is less well understood. It has previously been postulated that the diaphragm ruptures around its periphery due to the dynamic pressure loading of the shock wave, with the diaphragm material at some stage being removed from the flow to allow the shock to accelerate to the speeds measured downstream. The images obtained in this series of experiments are the first to show the mechanism of diaphragm rupture and mass removal in an expansion tube. A light diaphragm was impulsively loaded by a shock wave and a series of images was recorded holographically throughout the rupture process, showing gradual destruction of the diaphragm. Features such as the diaphragm material, the interface between gases, and a reflected shock were clearly visualised. Both qualitative and quantitative aspects of the rupture dynamics were derived from the images and compared with existing one-dimensional theory.
Abstract:
We have grown surfactant-templated silicate films at the air-water interface using n-alkyltrimethylammonium bromide and chloride in an acid synthesis with tetraethyl orthosilicate as the silicate source. The films were grown with and without added salt (sodium chloride, sodium bromide) and with n-alkyl chain lengths from 12 to 18, the growth process being monitored by X-ray reflectometry. Glassy, hexagonal, and lamellar structures have been produced in ways that are predictable from the pure surfactant-water phase diagrams. The synthesis appears to proceed initially through an induction period characterized by the accumulation of silica-coated spherical micelles near the surface. All syntheses, except those involving C12TACl, show a sudden transformation of the spherical micellar phase to a hexagonal phase. This occurs when the gradually increasing ionic strength and/or changing ethanol concentration is sufficient to shift the phase boundaries within the phase diagram. A possible mechanism is a sphere-to-rod transition in the micellar structure. This transformation, as predicted from the surfactant-water phase diagram, can be induced by the addition of salts and is slower for chloride than for bromide counteranions. The cell dimension of the hexagonal materials changes with chain length in a way consistent with theoretical model predictions. All the materials have sufficiently flexible silica frameworks that phase interconversion is observed, both from glassy to hexagonal and from hexagonal to lamellar and vice versa, in those surfactant systems where multiple phases exist.
Abstract:
Designing the steady-state and dynamic performance of a process simultaneously makes it possible to satisfy much more demanding dynamic performance criteria than designing the dynamics only through the connection of a control system. A method for designing process dynamics based on the eigenvalues of the linearised system has been developed. The eigenvalues are associated with system states using the unit perturbation spectral resolution (UPSR), characterising the dynamics of each state. The design method uses a homotopy approach to determine a final design that satisfies both steady-state and dynamic performance criteria. A highly interacting single-stage forced-circulation evaporator system, including control loops, was designed by this method with the goal of reducing the time taken for the liquid composition to reach steady state. Initially the system was successfully redesigned to speed up the eigenvalue associated with the liquid composition state, but this did not improve the startup performance. Further analysis showed that the integral action of the composition controller was the source of the limiting eigenvalue. Design changes made to speed up this eigenvalue did improve the startup performance. The proposed approach provides a structured way to address the design-control interface, giving significant insight into the dynamic behaviour of the system, so that a systematic design or redesign of an existing system can be undertaken with confidence.
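The core step the abstract describes, linking the eigenvalues of a linearised model to individual process states, can be illustrated with a toy calculation. The sketch below uses an invented 3x3 state matrix and associates each eigenvalue with the state whose eigenvector component dominates; this is a simplified stand-in for the unit perturbation spectral resolution, not the paper's actual formulation.

```python
# Toy example: associate each eigenvalue of a linearised model dx/dt = A x
# with the state whose eigenvector component is largest.  The matrix and
# state names are invented for illustration only.
import numpy as np

A = np.array([
    [-0.50,  0.05,  0.00],
    [ 0.02, -0.01,  0.03],   # deliberately slow composition dynamics
    [ 0.00,  0.10, -2.00],
])
state_names = ["level", "composition", "temperature"]

eigvals, eigvecs = np.linalg.eig(A)

for k, lam in enumerate(eigvals):
    dominant = int(np.argmax(np.abs(eigvecs[:, k])))   # column k pairs with eigvals[k]
    tau = -1.0 / lam.real                               # approximate time constant
    print(f"lambda = {lam.real:.3f}  ->  mostly '{state_names[dominant]}' "
          f"(time constant ~ {tau:.1f} time units)")
```

In this toy case the slowest eigenvalue is dominated by the composition state, which is the kind of diagnosis the design method then acts on by modifying the flowsheet or controller parameters.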
Abstract:
We use the finite element method to simulate the rock alteration and metamorphic process in hydrothermal systems. In particular, we consider fluid-rock interaction problems in pore-fluid saturated porous rocks. Since fluid-rock interaction takes place at the contact interface between the pore fluid and the solid minerals, it is governed by a chemical reaction that, from the geochemical point of view, usually proceeds very slowly at this interface. Because the rate of this chemical reaction is slow relative to the velocity of the pore-fluid flow in the hydrothermal system considered, there exists a retardation zone in which the conventional static theory of geochemistry does not hold. Since this issue is often overlooked by purely numerical modellers, it is emphasized in this paper. The results from a typical rock alteration and metamorphic problem in a hydrothermal system show not only the detailed rock alteration and metamorphic process, but also the size of the retardation zone in the hydrothermal system. Copyright (C) 2001 John Wiley & Sons, Ltd.
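The competition between reaction rate and flow velocity that produces a retardation zone can be sketched with a much simpler model than the paper's finite element simulation: steady one-dimensional advection with a slow first-order reaction toward equilibrium, v dc/dx = -k (c - c_eq), whose solution decays over the characteristic length v/k. The parameter values below are invented purely for illustration.

```python
# Back-of-the-envelope illustration (not the paper's finite element model):
# steady 1-D advection with a slow first-order reaction toward equilibrium.
import numpy as np

v = 1.0e-7       # pore-fluid velocity (m/s), assumed
k = 1.0e-10      # first-order reaction rate constant (1/s), assumed slow
c0, c_eq = 1.0, 0.0          # inlet and equilibrium concentrations (normalised)

x = np.linspace(0.0, 5.0 * v / k, 200)        # distance along the flow path (m)
c = c_eq + (c0 - c_eq) * np.exp(-k * x / v)   # analytical solution of v dc/dx = -k (c - c_eq)

print(f"characteristic retardation length v/k = {v / k:.0f} m")
print(f"fluid is within ~5% of equilibrium only beyond {3.0 * v / k:.0f} m")
```

The slower the reaction relative to the flow, the longer this out-of-equilibrium stretch becomes, which is why a static equilibrium treatment breaks down inside the retardation zone.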
Abstract:
The Jacobsen catalyst, Mn(salen), was immobilized in a chitosan membrane. The resulting Mn(salen)-Chit was characterized by thermogravimetric analysis (TG), differential thermal analysis (DTA), differential scanning calorimetry (DSC), infrared spectroscopy (FT-IR), degree of N-acetylation by 1H NMR, and UV-vis spectroscopy. The UV-vis absorption spectrum of the encapsulated catalyst displayed the typical bands of the Jacobsen catalyst, and the FT-IR spectrum presented an absorption band characteristic of the imines present in the Jacobsen catalyst. The chitosan membranes were used, in a biphasic system, as a catalytic barrier between two different phases: an organic substrate phase (cyclooctene or styrene) and an aqueous solution of either m-CPBA, t-BuOOH or H2O2, dispensing with the need for phase-transfer agents and leading to better product yields than the catalyst in homogeneous medium. This new catalyst did not leach from the support and was reused many times, giving high turnover frequencies. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Purpose: The aim of this study was to perform qualitative and quantitative analyses of the effect of nicotine on autogenous bone block grafts and to describe events in the initial healing phase and the differences in the repair processes between animals exposed to nicotine and controls. Materials and Methods: Forty-eight female Wistar rats were randomly divided into 2 groups, the nicotine group and the saline group. All animals received either nicotine (3 mg/kg) or saline beginning 4 weeks before the surgical procedure and continued to receive nicotine from surgery until sacrifice at 7, 14, or 28 days. The autogenous bone block graft was harvested from the calvaria and stabilized on the external cortical area near the angle of the mandible. Results: The histologic analyses of the nicotine group showed a delay in osteogenic activity at the bed-graft interface, as well as impaired organization of the granulation tissue that developed in place of the blood clot. Nicotine-group specimens exhibited less bone neoformation, and the newly formed bone was poorly cellularized and vascularized. The histometric analysis revealed significantly less bone formation in the nicotine group at both 14 days (23.75% +/- 6.18% versus 51.31% +/- 8.31%) and 28 days (42.44% +/- 8.70% versus 73.00% +/- 4.99%). Conclusion: Nicotine jeopardized the early healing process of autogenous bone block grafts in rats but did not prevent it.
Abstract:
The generalization of the quasi mode theory of macroscopic quantization in quantum optics and cavity QED presented in the previous paper is applied to provide a fully quantum-theoretic derivation of the laws of reflection and refraction at a boundary. In the quasi mode picture of this process, a photon travelling in the incident-region quasi mode is annihilated and a photon is subsequently created in either the incident-region or the transmitted-region quasi modes. The derivation of the laws of reflection and refraction is achieved through the dual application of the quasi mode theory and a quantum scattering theory based on the Heisenberg picture. Formal expressions from scattering theory are given for the reflection and transmission coefficients. The behaviour of the intensity for a localized one-photon wave packet coming in at time minus infinity from the incident direction is examined, and it is shown that at time plus infinity the light intensity is only significant where the classical laws of reflection and refraction predict. The occurrence of both refraction and reflection depends on the quasi mode theory coupling constants between the incident-region and transmitted-region quasi modes being nonzero, and the contributions to these coupling constants come from the overlap of the mode functions in the boundary layer region, as might be expected from a microscopic theory.
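For reference, the classical limits that the quantum derivation recovers are the familiar laws of reflection and refraction at a plane boundary between media of refractive indices n1 (incident) and n2 (transmitted):

```latex
% Law of reflection and Snell's law of refraction at a plane boundary.
\begin{align}
  \theta_{\text{refl}} &= \theta_{\text{inc}}, \\
  n_1 \sin\theta_{\text{inc}} &= n_2 \sin\theta_{\text{trans}}.
\end{align}
```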
Abstract:
This dissertation presents the development of a multimodal signal acquisition and processing platform. The proposed project fits within the context of developing multimodal interfaces for robotic devices intended for motor rehabilitation, adapting the control of these devices according to the user's intention. The developed interface acquires, synchronizes and processes electroencephalographic (EEG) and electromyographic (EMG) signals, as well as signals from inertial measurement units (IMUs). Data acquisition is carried out in experiments with healthy subjects performing lower-limb motor tasks. The goal is to analyse movement intention, muscle activation and the effective onset of the executed movements through the EEG, EMG and IMU signals, respectively. To this end, an offline analysis was performed, using processing techniques for biological signals as well as techniques for processing signals from inertial sensors. From the latter, the knee joint angles are also measured throughout the movements. An experimental test protocol was proposed for the tasks performed. The results showed that the proposed system was able to acquire, synchronize, process and classify the signals in combination. Analyses of the accuracy of the classifiers showed that the interface was able to identify movement intention in 76.0 ± 18.2% of the movements. The largest mean movement anticipation time, 716.0 ± 546.1 milliseconds, was obtained from the EEG signal analysis; from the EMG signal alone, this value was 88.34 ± 67.28 milliseconds. The results of the biological signal processing stages, the joint angle measurements, and the accuracy and movement anticipation time values were consistent with the current related literature.
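As one concrete example of the kind of processing step described above, the sketch below detects movement onset from a rectified, low-pass-filtered EMG envelope by simple thresholding. This is not the dissertation's actual pipeline; the synthetic signal, sampling rate, filter settings and the mean-plus-three-standard-deviations threshold are all assumptions made for illustration.

```python
# Illustrative EMG onset detection: rectify, low-pass filter to get an
# envelope, then threshold against the baseline statistics.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                          # sampling rate (Hz), assumed
t = np.arange(0, 5.0, 1.0 / fs)

# Synthetic EMG: baseline noise with a burst of activity starting at t = 2 s.
rng = np.random.default_rng(0)
emg = 0.02 * rng.standard_normal(t.size)
emg[t >= 2.0] += 0.3 * rng.standard_normal((t >= 2.0).sum())

# Envelope: full-wave rectification followed by a 5 Hz low-pass filter.
b, a = butter(4, 5.0 / (fs / 2.0), btype="low")
envelope = filtfilt(b, a, np.abs(emg))

# Onset: first sample where the envelope exceeds baseline mean + 3 SD.
baseline = envelope[t < 1.0]
threshold = baseline.mean() + 3.0 * baseline.std()
onset_idx = int(np.argmax(envelope > threshold))
print(f"detected EMG onset at t = {t[onset_idx]:.3f} s")
```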
Abstract:
Master's degree in Informatics Engineering, Area of Specialization in Knowledge and Decision Technologies.
Abstract:
Dissertation to obtain the degree of Master in Music - Artistic Interpretation
Abstract:
A new interaction paradigm, the Natural User Interface (NUI), is currently emerging for recognizing gestures performed with the user's body. The Microsoft Kinect interaction device was initially designed for controlling video games on the Xbox 360 console, but it has proved a viable option for exploring other areas, such as support for the teaching and learning process of primary-school children. The prototype developed here aims to define an interaction mode based on drawing letters in the air and to interpret the drawn symbols using the Kernel Discriminant Analysis (KDA), Support Vector Machines (SVM) and $N pattern recognizers. The development of this project was based on a study of the different NUI devices available on the market, the NUI development libraries for this type of device, and pattern recognition algorithms. From the first two elements it was possible to form a concrete view of which available hardware and software were suited to the intended objective. Pattern recognition is a very broad and complex topic, so a limited set of algorithms had to be selected and tested to determine which one best suited the intended goal. Applying the same conditions to the three pattern recognition algorithms made it possible to evaluate their capabilities and identify $N as the one with the highest recognition accuracy. Finally, the viability of the developed prototype was assessed by testing it with participants from two age groups to determine their capacity for adaptation and learning. In this study, the older group initially performed better with the interaction mode; however, the younger group showed a growing ability to adapt to this mode of interaction, progressively improving its results.
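The idea behind the $-family recognizers compared above can be sketched in a few lines: resample each stroke to a fixed number of points, normalise translation and scale, and pick the template with the smallest average point-to-point distance. The code below is a deliberately simplified, single-stroke illustration of that idea ($N extends it to multi-stroke gestures and also handles rotation, both omitted here); the templates and the candidate stroke are invented, and this is not the thesis code.

```python
# Simplified single-stroke template matcher in the spirit of the $-family
# recognizers: resample -> normalise -> nearest template by mean distance.
import numpy as np

N_POINTS = 64

def resample(points, n=N_POINTS):
    """Resample a stroke (sequence of (x, y)) to n points equally spaced along its path."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n)
    x = np.interp(targets, cum, pts[:, 0])
    y = np.interp(targets, cum, pts[:, 1])
    return np.column_stack([x, y])

def normalise(stroke):
    """Translate to the centroid and scale by the bounding-box size."""
    s = stroke - stroke.mean(axis=0)
    extent = s.max(axis=0) - s.min(axis=0)
    return s / max(extent.max(), 1e-9)

def recognise(candidate, templates):
    """Return the template name with the smallest mean point-to-point distance."""
    c = normalise(resample(candidate))
    scores = {name: np.linalg.norm(c - normalise(resample(t)), axis=1).mean()
              for name, t in templates.items()}
    return min(scores, key=scores.get)

# Hypothetical letter templates: a vertical bar ('I') and an 'L' shape.
templates = {
    "I": [(0, 0), (0, 1), (0, 2)],
    "L": [(0, 2), (0, 0), (1, 0)],
}
print(recognise([(0.1, 1.9), (0.0, 1.0), (0.05, 0.0)], templates))  # -> "I"
```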
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Mechanical Engineering.
Abstract:
The purpose of this study was to evaluate the determinism of the AS-Interface network and the three main families of control systems that may use it, namely PLC, PC and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system, so care must be exercised if the determinism of a control system is not to be compromised. This study introduces a new concept of using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems that were tested demonstrated extremely deterministic responses, but when a large number of iterations were introduced in the user program, the mean control system latency was much too great for an AS-I network. Thus the PLC was found to be unsuitable for an AS-I network if a large, complex user program is required. The PC systems that were tested were non-deterministic and had latencies of variable duration. These latencies became greatly exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control system latencies, and as such are unsuitable for use with an AS-I network. The RTOS system that was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations were introduced in the user program. The RTOS system that was tested is capable of providing a suitably deterministic control system response, even when an extremely large, complex user program is required.
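The process-capability idea the study applies to control-system latency can be illustrated with the standard formula Cpk = min((USL - mean) / (3 sigma), (mean - LSL) / (3 sigma)); for a latency measurement only the upper specification limit usually matters. The latency samples and the 5 ms specification limit in the sketch below are invented for illustration and are not the study's measurements.

```python
# Process capability (Cpk) applied to control-system latency samples.
# Synthetic, RTOS-like latencies and an assumed upper spec limit of 5 ms.
import numpy as np

rng = np.random.default_rng(7)
latencies_ms = rng.normal(loc=1.8, scale=0.3, size=10_000)   # hypothetical measurements

usl_ms = 5.0                                   # assumed upper specification limit
mean, sigma = latencies_ms.mean(), latencies_ms.std(ddof=1)
cpk = (usl_ms - mean) / (3.0 * sigma)          # one-sided Cpk (no lower limit for latency)

print(f"mean = {mean:.2f} ms, sigma = {sigma:.2f} ms, Cpk = {cpk:.2f}")
# A Cpk well above 1.33 indicates the control system comfortably meets the
# latency specification; values near or below 1 would flag it as unsuitable.
```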