987 results for Unified Modelling Language
Abstract:
With a growing world population and an increasing number of developed countries, the rising demand for basic chemical building blocks such as ethylene and propylene must be properly addressed in the coming decades. The methanol-to-olefins (MTO) reaction is an attractive route to produce these alkenes from coal, gas or alternative sources such as biomass, via syngas as the feedstock for methanol production. This technology has been widely applied since 1985, and most processes use zeolites as catalysts, particularly ZSM-5. Although its selectivity is not especially biased towards light olefins, ZSM-5 resists rapid deactivation by coke deposition, making it quite attractive in industrial environments; nevertheless, the reaction is highly exothermic, which makes it hard to control and to anticipate problems, such as temperature runaways or hot spots, inside the catalytic bed. The main focus of this project is to study these temperature effects on two fronts: an experimental one, in which the catalytic performance and the temperature profiles are studied, and a modelling one, consisting of a five-step strategy to predict weight fractions and activity. The mindset of catalytic testing is present in all the assays developed. The selectivity towards light olefins was found to increase with temperature, although this also leads to much faster catalyst deactivation. To counter this effect, experiments were carried out using a diluted bed, which increased the catalyst lifetime by 32% to 47%. Additionally, experiments with three thermocouples placed inside the catalytic bed were performed, analysing the deactivation wave and the temperature peaks along the bed. Regeneration was performed between consecutive runs, and it was concluded that this can be a powerful means of increasing the catalyst lifetime while maintaining a constant selectivity towards light olefins, through the loss of acid strength in a steam-stabilised zeolitic structure. On the modelling front, a basic preliminary model was constructed, able to predict weight fractions, which should be tuned into a tool for predicting deactivation and temperature profiles.
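The abstract does not give the model equations, so as a minimal illustration of the kind of activity term such a five-step strategy might include, the sketch below assumes simple first-order coke-induced deactivation with an Arrhenius-type rate constant; all parameter values are hypothetical.

```python
import numpy as np

# Hypothetical first-order deactivation: da/dt = -k_d(T) * a,
# with an Arrhenius temperature dependence for k_d.
R = 8.314      # gas constant, J/(mol K)
A_d = 5.0e3    # pre-exponential factor, 1/h (hypothetical)
Ea_d = 80.0e3  # deactivation activation energy, J/mol (hypothetical)

def activity(t_hours, T_kelvin):
    """Catalyst activity a(t) = exp(-k_d t) for first-order deactivation."""
    k_d = A_d * np.exp(-Ea_d / (R * T_kelvin))
    return np.exp(-k_d * t_hours)

# Illustrative "lifetime": time until activity halves; a higher bed
# temperature shortens it, consistent with the trend reported above.
for T in (673.0, 723.0, 773.0):  # 400, 450 and 500 degC
    k_d = A_d * np.exp(-Ea_d / (R * T))
    print(f"T = {T - 273.15:.0f} degC -> half-life {np.log(2.0) / k_d:.1f} h")
```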
Abstract:
This paper aims to provide a model that allows BPI to measure the credit risk, through its rating scale, of the subsidiaries included in the corporate groups that are its clients. This model should be simple enough to be applied in practice, accurate, and consistent with the ratings historically assigned by the bank. The proposed model includes operational, strategic, and financial factors and yields one of three results: no support, partial support, or full support from the holding to the subsidiary, each of which translates into adjustments to the subsidiary's credit rating. As would be expected, most subsidiaries should have the same credit rating as their parent company.
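The abstract does not detail how the three support outcomes map onto rating adjustments; the sketch below illustrates one plausible notching rule (the rating scale direction and all notch arithmetic are hypothetical, not BPI's actual methodology).

```python
from enum import Enum

class Support(Enum):
    NONE = "no support"
    PARTIAL = "partial support"
    FULL = "full support"

def adjusted_rating(standalone: int, parent: int, support: Support) -> int:
    """Notch a subsidiary's standalone rating (1 = best) towards the parent's.

    Hypothetical rule: full support aligns the subsidiary with the parent,
    partial support moves it halfway there, no support leaves it unchanged.
    """
    if support is Support.FULL:
        return parent
    if support is Support.PARTIAL:
        return standalone + (parent - standalone) // 2
    return standalone

# A weaker subsidiary (rating 9) of a stronger parent (rating 5):
print(adjusted_rating(9, 5, Support.PARTIAL))  # -> 7
```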
Abstract:
Hand gestures are a powerful means of human communication, with many potential applications in the area of human-computer interaction. Vision-based hand gesture recognition techniques have many proven advantages over traditional devices, giving users a simpler and more natural way to communicate with electronic devices. This work proposes a generic system architecture based on computer vision and machine learning, able to be used with any interface for human-computer interaction. The proposed solution is composed of three main modules: a pre-processing and hand segmentation module, a static gesture interface module and a dynamic gesture interface module. The experiments showed that the core of vision-based interaction systems can be the same for all applications, thus facilitating implementation. For hand posture recognition, an SVM (Support Vector Machine) model was trained and used, achieving a final accuracy of 99.4%. For dynamic gestures, an HMM (Hidden Markov Model) was trained for each gesture that the system could recognize, achieving a final average accuracy of 93.7%. The proposed solution has the advantage of being generic, with the trained models able to work in real time, allowing its application in a wide range of human-machine applications. To validate the proposed framework, two applications were implemented. The first is a real-time system able to interpret the Portuguese Sign Language. The second is an online system able to help a robotic soccer referee judge a game in real time.
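The abstract names the models but not the training pipeline; the sketch below shows a minimal SVM posture classifier in the spirit described, using scikit-learn on placeholder feature vectors (the hand segmentation and feature extraction steps are assumed and not shown).

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: each row is a hand-shape feature vector, each label a posture.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))    # hypothetical 32-dimensional features
y = rng.integers(0, 5, size=500)  # hypothetical 5 posture classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RBF-kernel SVM with feature scaling, a common choice for posture data.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```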
Abstract:
Vision-based hand gesture recognition is an area of active current research in computer vision and machine learning. Being a natural form of human interaction, it is an area many researchers are working on, with the goal of making human-computer interaction (HCI) easier and more natural, without the need for any extra devices. The primary goal of gesture recognition research is thus to create systems that can identify specific human gestures and use them, for example, to convey information. For that, vision-based hand gesture interfaces require fast and extremely robust hand detection and gesture recognition in real time. Hand gestures are a powerful human communication modality with many potential applications; in this context we have sign language recognition, the communication method of deaf people. Sign languages are not standard and universal, and their grammars differ from country to country. In this paper, a real-time system able to interpret the Portuguese Sign Language is presented and described. Experiments showed that the system was able to reliably recognize the vowels in real time, with an accuracy of 99.4% with one dataset of features and an accuracy of 99.6% with a second dataset of features. Although the implemented solution was only trained to recognize the vowels, it is easily extended to recognize the rest of the alphabet, being a solid foundation for the development of any vision-based sign language recognition user interface system.
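Since the paper reports two feature datasets with slightly different accuracies (99.4% vs 99.6%), a natural way to compare them is cross-validation of the same classifier on each; the sketch below shows that comparison on placeholder feature matrices (dimensions and class count are hypothetical).

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
y = rng.integers(0, 5, size=400)              # 5 vowel classes (hypothetical)
feature_sets = {
    "dataset 1": rng.normal(size=(400, 20)),  # e.g. geometric hand features
    "dataset 2": rng.normal(size=(400, 64)),  # e.g. orientation histograms
}

# Same classifier, same folds: only the feature representation changes.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
for name, X in feature_sets.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```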
Abstract:
Today it is easy to find many tools for defining data migration schemas among different types of information systems. Data migration processes are usually implemented on a very diverse range of applications, ranging from conventional operational systems to data warehousing platforms. The implementation of a data migration process often involves serious planning, considering the development of conceptual migration schemas at early stages. Such schemas help architects and engineers to plan and discuss the most adequate way to migrate data between two different systems. In this paper we present and discuss a way of enriching data migration conceptual schemas in BPMN using a domain-specific language, demonstrating how to convert such enriched schemas into a first corresponding physical representation (a skeleton) in a conventional ETL implementation tool like Kettle.
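To make the "skeleton" idea concrete, the sketch below emits a heavily simplified Kettle transformation (.ktr) file with two steps and one hop; the element subset shown is minimal, and a real conversion from an enriched BPMN schema would also fill in database connections, field mappings and step settings.

```python
import xml.etree.ElementTree as ET

def ktr_skeleton(name, steps, hops):
    """Emit a minimal Kettle transformation (.ktr) XML skeleton.

    steps: list of (step_name, step_type) pairs, e.g. ("read", "TableInput");
    hops:  list of (from_step, to_step) pairs wiring the steps together.
    """
    root = ET.Element("transformation")
    info = ET.SubElement(root, "info")
    ET.SubElement(info, "name").text = name
    order = ET.SubElement(root, "order")
    for src, dst in hops:
        hop = ET.SubElement(order, "hop")
        ET.SubElement(hop, "from").text = src
        ET.SubElement(hop, "to").text = dst
        ET.SubElement(hop, "enabled").text = "Y"
    for step_name, step_type in steps:
        step = ET.SubElement(root, "step")
        ET.SubElement(step, "name").text = step_name
        ET.SubElement(step, "type").text = step_type
    return ET.tostring(root, encoding="unicode")

print(ktr_skeleton(
    "customers_migration",
    steps=[("read customers", "TableInput"), ("write customers", "TableOutput")],
    hops=[("read customers", "write customers")],
))
```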
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
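The abstract enumerates the thermal phenomena without giving equations; as a minimal sketch of the dominant environment terms, the code below integrates a lumped-capacitance energy balance for a single deposited filament cooling by convection and radiation (material properties and process values are hypothetical, and conduction to the support and between filaments is omitted).

```python
# Lumped energy balance for a filament cross-section:
#   rho*cp*(V/A) dT/dt = -h (T - T_env) - eps*sigma (T^4 - T_env^4),
# with A/V = 4/d for a cylinder of diameter d.
rho, cp = 1050.0, 2000.0    # density (kg/m^3), specific heat (J/(kg K))
d = 0.3e-3                  # filament diameter (m)
h = 60.0                    # convection coefficient (W/(m^2 K))
eps, sigma = 0.9, 5.670e-8  # emissivity, Stefan-Boltzmann constant
T_env, T = 323.15, 543.15   # environment and extrusion temperatures (K)

a = 4.0 / (d * rho * cp)    # (A/V) / (rho*cp)
dt, t = 1e-3, 0.0
while T > T_env + 5.0:      # explicit Euler until near-ambient
    T += -a * (h * (T - T_env) + eps * sigma * (T**4 - T_env**4)) * dt
    t += dt
print(f"cooled to {T - 273.15:.1f} degC after {t:.2f} s")
```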
Abstract:
The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto
Abstract:
In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches to calibration are used: scaling up of canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. Validation of the models is achieved by using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of the residual response to different environmental variables) the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit and fails to respond to variations in the diffuse fraction, also having skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, in combination with the separation of the incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
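The abstract contrasts the two canopy schemes without giving their equations; the toy sketch below shows the core of the sun/shade idea, partitioning leaf area index into sunlit and shaded classes under a beam extinction coefficient and applying a simple rectangular-hyperbola light response to each, next to a crude big-leaf analogue (all parameter values are hypothetical).

```python
import math

def gpp_sun_shade(L, I_beam, I_diff, k_b=0.5, alpha=0.05, p_max=20.0):
    """Toy sun/shade canopy photosynthesis (per unit ground area).

    Sunlit LAI follows the standard beam-penetration result
    L_sun = (1 - exp(-k_b L)) / k_b; shaded leaves receive diffuse light
    only, sunlit leaves receive beam plus diffuse. Each class uses a
    rectangular hyperbola A(I) = alpha I p_max / (alpha I + p_max).
    """
    L_sun = (1.0 - math.exp(-k_b * L)) / k_b
    L_shade = L - L_sun
    leaf = lambda I: alpha * I * p_max / (alpha * I + p_max)
    return L_sun * leaf(I_beam + I_diff) + L_shade * leaf(I_diff)

def gpp_big_leaf(L, I_total, alpha=0.05, p_max=20.0):
    """Crude big-leaf analogue: every leaf sees the canopy-mean irradiance,
    so the diffuse fraction has no effect on the prediction."""
    I_mean = I_total / L
    return L * alpha * I_mean * p_max / (alpha * I_mean + p_max)

# Same total light, different diffuse fractions: only sun/shade responds.
print(gpp_sun_shade(L=5.0, I_beam=1200.0, I_diff=300.0))
print(gpp_sun_shade(L=5.0, I_beam=600.0, I_diff=900.0))
print(gpp_big_leaf(L=5.0, I_total=1500.0))
```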
Abstract:
The usual high cost of commercial codes, together with some technical limitations, clearly limits the employment of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house, academia-based codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly accelerating the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. In the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified by comparing the predicted results with analytical solutions and results published in the literature, and by using the Method of Manufactured Solutions.
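The abstract mentions verification via the Method of Manufactured Solutions; the sketch below shows the standard post-processing step of such a study, estimating the observed order of accuracy from discretization errors on successively refined meshes (the error values are placeholders, not the solver's actual results).

```python
import math

# Pairs of (mesh spacing h, discrete L2 error against the manufactured
# solution); the values below are placeholders for illustration.
results = [(0.100, 4.1e-3), (0.050, 1.05e-3), (0.025, 2.7e-4)]

# Observed order p = log(e1/e2) / log(h1/h2) for consecutive refinements;
# for a formally second-order scheme p should approach 2.
for (h1, e1), (h2, e2) in zip(results, results[1:]):
    p = math.log(e1 / e2) / math.log(h1 / h2)
    print(f"h: {h1:.3f} -> {h2:.3f}, observed order p = {p:.2f}")
```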
Abstract:
[Excerpt] A large number of constitutive equations have been developed for viscoelastic fluids, some empirical and others with strong physical foundations. The currently available macroscopic constitutive equations can be divided into two main types: differential and integral. Some constitutive equations, e.g. the Maxwell model, are available in both differential and integral forms. However, relevant integral models, like K-BKZ, possess only the integral form. (...)
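For reference, one commonly used factorized form of the K-BKZ integral model (the PSM-type damping-function variant; notation follows standard usage rather than the excerpt's source) is:

```latex
% Factorized K-BKZ: the extra stress at time t is a memory-weighted integral
% over the deformation history, where C_t^{-1}(t') is the relative Finger
% strain tensor and h a damping function of its invariants I_1, I_2.
\[
  \boldsymbol{\tau}(t)
    = \int_{-\infty}^{t} M(t - t')\,
      h(I_1, I_2)\,
      \mathbf{C}_t^{-1}(t')\,\mathrm{d}t',
  \qquad
  M(s) = \sum_{k} \frac{a_k}{\lambda_k}\, e^{-s/\lambda_k}.
\]
```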
Abstract:
Integrated master's dissertation in Psychology
Abstract:
Doctoral Programme in Mathematics and Applications.
Abstract:
The study reported here aims at contributing to a deeper understanding of the educational possibilities offered by digital manipulatives in preschool contexts. It presents a study carried out with a digital manipulative to enhance the development of lexical knowledge and language awareness, which are relevant language abilities for formal literacy learning. The study took place in a Portuguese preschool, with a class of 20 five-year-olds, in collaboration with the teacher. The digital manipulative supported the construction of multiple fictional worlds, motivating children's verbal interactions and the playing of word and sound games, thus contextualizing the learning of an extensive vocabulary and of language awareness abilities. The degree of engagement and involvement that the manipulative provided in supporting children's imaginative play, as well as the children's imitation, in their own play, of the playful pedagogical interventions that the teacher had designed, shows the importance of well-designed materials that support a child-centered learning model. As such, it supports a discussion on the potential of digital manipulatives to enhance fundamental language development in the preschool years. Further, the study highlights the importance of multidisciplinary teams in the creation of innovative pedagogical materials.