986 results for sistemi integrati, CAT tools, machine translation
Abstract:
One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms to ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in an increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem that we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications.
The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses Ant Colony System to minimize the under-utilization of virtualized application servers.
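The reactive side of such a provisioning scheme can be pictured as a simple control loop. The sketch below is a hypothetical illustration, not the thesis's actual ARVUE algorithm: the utilization thresholds, step size and VM limits are invented for the example.

```python
# Hypothetical reactive VM provisioning rule for one control period.
# Thresholds and limits are illustrative, not the thesis's parameters.

def provision(vm_count, avg_utilization, min_vms=1, max_vms=100,
              scale_up_at=0.80, scale_down_at=0.30):
    """Return the new VM count for the next control period.

    Scale up when average utilization is high (over-provisioning costs
    money, but under-provisioning degrades the service), and scale down
    when utilization is low.
    """
    if avg_utilization > scale_up_at and vm_count < max_vms:
        return vm_count + 1
    if avg_utilization < scale_down_at and vm_count > min_vms:
        return vm_count - 1
    return vm_count
```

A proactive variant would feed a predicted utilization for the next period into the same rule, and admission control would reject new sessions once the tier is at `max_vms` and still overloaded.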
Abstract:
Observing the execution of JavaScript applications is usually done by instrumenting a production virtual machine (VM) or by performing a complex, ad hoc source-to-source translation. This thesis presents an alternative based on layering virtual machines. Our approach performs a source-to-source translation of a program during its execution in order to expose its low-level operations through a flexible object model. These low-level operations can then be redefined at run time to observe them. To limit the performance penalty introduced, our approach exploits the fast original operations of the underlying VM whenever possible and applies just-in-time compilation techniques in the layered VM. Our implementation, Photon, is on average 19% faster than a modern interpreter and between 19× and 56× slower on average than the just-in-time compilers used in popular web browsers. This thesis therefore shows that VM layering is a competitive alternative to modifying a modern JavaScript interpreter when applied to the run-time observation of object operations and function calls.
Abstract:
Machining processes based on chip removal consist of continuously removing small fragments of material, by means of cutting tools, from the workpiece being shaped. When working with metals, the cutting forces involved are significant and cause tool wear as well as considerable heating of both the tool and the workpiece. Cutting fluids are therefore introduced, with the main functions of cooling and lubricating. Traditionally, these cutting fluids have been supplied by flooding, leaving the work zone inundated. In recent years, thanks to technological improvements and in an attempt to save cutting fluid, a new system called minimum quantity lubrication has been devised, which supplies small droplets of oil, not soluble in water, within a flow of pressurized air. Moreover, this system injects the fluid close to the work zone, where it is strictly necessary. It considerably reduces cutting fluid consumption and increases the efficiency of lubrication. It also eliminates the subsequent treatments needed to recycle the cutting fluid once it loses its properties. For this reason, this work studies the use of a method similar to minimum quantity lubrication, supplying the same cutting fluid that is used conventionally, but atomized so as to convert it into an aerosol.
Abstract:
Medical image processing is an important research area. The development of new techniques that assist and improve the visual interpretation of images quickly and precisely is fundamental in real clinical environments. Most of the contributions of this thesis are based on Information Theory. This theory deals with the transmission, storage and processing of information and is used in fields such as physics, computer science, mathematics, statistics, biology, computer graphics, etc. In this thesis, numerous tools based on Information Theory are presented that improve existing methods in the area of image processing, in particular in the fields of image registration and image segmentation. Finally, two specialized applications for medical assessment, developed within the framework of this thesis, are presented.
Abstract:
The selection of high-speed machining centres is a complex process that requires considerable experience, since a large number of variables are involved, both technological and economic. Existing methodologies for selecting the optimal machining centre consider only one of these two types of variables; this thesis, however, proposes a methodology that takes both into account. To this end, the variables with the greatest influence on the results of the machining process are identified, both from the point of view of the quality of the manufactured parts and from that of manufacturing economics, and a selection model is proposed based on the results of experimental work carried out on aluminium parts. This model is implemented by means of neural networks, trained on the results of the aforementioned experimental work.
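The thesis's actual networks and training data are not reproduced here; as a minimal stand-in for the idea, the sketch below trains a single logistic neuron to score candidates from one technological input and one economic input. All data and names are invented for the illustration.

```python
import math

# Toy stand-in for a neural selection model: one logistic neuron trained
# by stochastic gradient descent on invented, normalized inputs
# (surface_quality, cost_efficiency) -> suitable (1) / unsuitable (0).

def train(samples, lr=0.5, epochs=2000):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - y                # gradient of the log-loss w.r.t. z
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def score(model, x):
    """Suitability score in (0, 1) for a candidate machining centre."""
    w, b = model
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.2, 0.3), 0), ((0.3, 0.1), 0)]
model = train(data)
```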
Abstract:
Improving productivity and quality are undoubtedly two of the main demands of the modern production sector and key factors for competitiveness and survival. Within this sector, manufacturing by material removal still plays a leading role today, despite the appearance of new additive forming techniques. Industries such as aeronautics, the automotive industry, mould making and energy depend to a large extent on the performance of machine tools. This thesis addresses two aspects that are relevant to improving the productivity and quality of the production sector: the problem of self-excited vibration, better known by its English name, chatter, and the monitoring of surface roughness in high-speed machining.
Abstract:
The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks, avoiding interface problems among tools, data flow and management. The approach is intended to assist both control and process engineers in their tasks. The use of AI technologies to diagnose and manage control loops and, of course, to assist process supervisory tasks such as fault detection and diagnosis is within the scope of this work. Special effort has been put into the integration of tools for assisting the design of expert supervisory systems. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems (CASSD) framework, in which some basic facilities are required to be available.
Abstract:
Many evolutionary algorithm applications involve either fitness functions with high time complexity or large dimensionality (hence very many fitness evaluations will typically be needed) or both. In such circumstances, there is a dire need to tune various features of the algorithm well so that performance and time savings are optimized. However, these are precisely the circumstances in which prior tuning is very costly in time and resources. There is hence a need for methods which enable fast prior tuning in such cases. We describe a candidate technique for this purpose, in which we model a landscape as a finite state machine, inferred from preliminary sampling runs. In prior algorithm-tuning trials, we can replace the 'real' landscape with the model, enabling extremely fast tuning, saving far more time than was required to infer the model. Preliminary results indicate much promise, though much work needs to be done to establish various aspects of the conditions under which it can be most beneficially used. A main limitation of the method as described here is a restriction to mutation-only algorithms, but there are various ways to address this and other limitations.
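The core trick can be made concrete with a toy example. Below, a small finite state machine stands in for the real landscape: states are abstract fitness levels and each transition carries the probability that a single mutation moves the search between them. Everything here, the states, the probabilities and the simple mutation-only search, is invented for illustration and is not the paper's inferred model.

```python
import random

# Toy FSM landscape model: replace expensive real fitness evaluations
# with transitions inferred (in the paper) from preliminary sampling runs.
# State 2 plays the role of the modelled optimum and is absorbing.
transitions = {
    0: [(0, 0.7), (1, 0.3)],
    1: [(1, 0.8), (2, 0.2)],
    2: [(2, 1.0)],
}

def step(state, rng):
    """Apply one modelled mutation by sampling the outgoing transitions."""
    r, acc = rng.random(), 0.0
    options = transitions[state]
    for nxt, p in options:
        acc += p
        if r < acc:
            return nxt
    return options[-1][0]  # guard against float rounding in the cumulative sum

def simulated_run(start=0, steps=50, seed=0):
    """A mutation-only search run on the model instead of the landscape.

    Tuning trials can call this thousands of times at negligible cost.
    """
    rng = random.Random(seed)
    state = start
    for _ in range(steps):
        state = step(state, rng)
    return state

best = simulated_run()
```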
Abstract:
The Perspex Machine arose from the unification of computation with geometry. We now report significant redevelopment of both a partial C compiler that generates perspex programs and of a Graphical User Interface (GUI). The compiler is constructed with standard compiler-generator tools and produces both an explicit parse tree for C and an Abstract Syntax Tree (AST) that is better suited to code generation. The GUI uses a hash table and a simpler software architecture to achieve an order of magnitude speed up in processing and, consequently, an order of magnitude increase in the number of perspexes that can be manipulated in real time (now 6,000). Two perspex-machine simulators are provided, one using trans-floating-point arithmetic and the other using transrational arithmetic. All of the software described here is available on the world wide web. The compiler generates code in the neural model of the perspex. At each branch point it uses a jumper to return control to the main fibre. This has the effect of pruning out an exponentially increasing number of branching fibres, thereby greatly increasing the efficiency of perspex programs as measured by the number of neurons required to implement an algorithm. The jumpers are placed at unit distance from the main fibre and form a geometrical structure analogous to a myelin sheath in a biological neuron. Both the perspex jumper-sheath and the biological myelin-sheath share the computational function of preventing cross-over of signals to neurons that lie close to an axon. This is an example of convergence driven by similar geometrical and computational constraints in perspex and biological neurons.
Abstract:
Virtual reality has the potential to improve visualisation of building design and construction, but its implementation in the industry has yet to reach maturity. Present-day translation of building data to virtual reality is often unidirectional and unsatisfactory. Three different approaches to the creation of models are identified and described in this paper. Consideration is given to the potential of both advances in computer-aided design and the emerging standards for data exchange to facilitate an integrated use of virtual reality. Commonalities and differences between computer-aided design and virtual reality packages are reviewed, and trials of current systems are described. The trials have been conducted to explore the technical issues related to the integrated use of CAD and virtual environments within the house building sector of the construction industry and to investigate the practical use of the new technology.
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples which, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should be very useful in other applications requiring the classification of very large datasets.
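The key representational idea, carrying amplitude and phase jointly in one complex-valued feature vector, can be shown without the full extreme learning machine. The sketch below substitutes a much simpler nearest-centroid rule in complex feature space; the two-bin "spectra" are invented toy data, not the paper's measurements.

```python
import cmath

# Each spectral bin is stored as amplitude * exp(i * phase), so a single
# complex feature vector carries both signatures. A nearest-centroid rule
# then classifies in complex feature space (a stand-in for the paper's
# complex-valued extreme learning machine).

def to_complex(amplitudes, phases):
    return [a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases)]

def centroid(vectors):
    n = len(vectors)
    return [sum(v[k] for v in vectors) / n for k in range(len(vectors[0]))]

def distance(u, v):
    return sum(abs(a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def classify(sample, centroids):
    return min(centroids, key=lambda label: distance(sample, centroids[label]))

# Invented two-bin training spectra for the two RNA classes.
train = {
    "poly-A": [to_complex([1.0, 0.2], [0.1, 0.0]),
               to_complex([0.9, 0.3], [0.2, 0.1])],
    "poly-C": [to_complex([0.2, 1.0], [1.5, 1.4]),
               to_complex([0.3, 0.9], [1.4, 1.6])],
}
centroids = {label: centroid(vs) for label, vs in train.items()}
label = classify(to_complex([0.95, 0.25], [0.15, 0.05]), centroids)
```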
Abstract:
1. Understanding the behaviour and ecology of large carnivores is becoming increasingly important as the list of endangered species grows, with felids such as Panthera leo in some locations heading dangerously close to extinction in the wild. In order to have more reliable and effective tools to understand animal behaviour, movement and diet, we need to develop novel, integrated approaches and effective techniques to capture a detailed profile of animal foraging and movement patterns. 2. Ecological studies have shown considerable interest in using stable isotope methods, both to investigate the nature of animal feeding habits, and to map their geographical location. However, recent work has suggested that stable isotope analyses of felid fur and bone are very complex and do not correlate directly with the isotopic composition of precipitation (and hence geographical location). 3. We present new data that suggest these previous findings may be atypical, and demonstrate that isotope analyses of Felidae are suitable for both evaluating dietary inputs and establishing geo-location, as they have strong environmental referents to both food and water. These data provide new evidence of an important methodology that can be applied to the family Felidae for future research in ecology, conservation, wildlife forensics and archaeological science.
Abstract:
Species' potential distribution modelling consists of building a representation of the fundamental ecological requirements of a species from the biotic and abiotic conditions where the species is known to occur. Such models can be valuable tools for understanding the biogeography of species and for supporting the prediction of their presence/absence under a particular environmental scenario. This paper investigates the use of different supervised machine learning techniques to model the potential distribution of 35 plant species from Latin America. Each technique was able to extract a different representation of the relations between the environmental conditions and the distribution profile of the species. The experimental results highlight the good performance of random tree classifiers, indicating this particular technique as a promising candidate for modelling species' potential distribution.
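The supervised set-up can be illustrated in miniature. The paper itself found random tree classifiers to perform best; as a minimal stand-in, the sketch below trains a one-level decision tree (a stump) on invented climate variables to predict presence/absence.

```python
# Minimal supervised species-distribution stand-in: a decision stump.
# samples: list of (features, label), label 1 = present, 0 = absent.
# The climate values below are invented for the example.

def train_stump(samples):
    """Pick the (feature, threshold, orientation) split that makes the
    fewest misclassifications over the training samples."""
    best = None
    n_features = len(samples[0][0])
    for f in range(n_features):
        for x, _ in samples:
            t = x[f]                      # candidate threshold
            for below in (0, 1):          # label predicted when value <= t
                errors = sum(
                    1 for feats, label in samples
                    if (below if feats[f] <= t else 1 - below) != label
                )
                if best is None or errors < best[0]:
                    best = (errors, f, t, below)
    return best[1:]

def predict(stump, features):
    f, t, below = stump
    return below if features[f] <= t else 1 - below

# (annual_temp_C, annual_precip_mm) -> species present (1) or absent (0)
data = [((24, 2000), 1), ((26, 1800), 1), ((25, 2200), 1),
        ((10, 600), 0), ((12, 800), 0), ((8, 500), 0)]
stump = train_stump(data)
```

A random tree ensemble repeats this kind of split recursively on random feature subsets and averages many such trees; the single stump only shows the basic presence/absence learning step.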
Abstract:
The purpose of this study is to explore the strategies and attitudes of students towards translation in the context of language learning. The informants come from two different classes at an upper secondary vocational program. The study grew out of discussions among English teachers representing different theories of translation and language learning, encounters with students pursuing language learning beyond the confines of the classroom, and personal experiences of translation in language learning. The curriculum and course plan for English at the vocational program emphasize two things of particular interest to our study: integration of the program outcomes and vocational language into the English course (so-called meshed learning) and student awareness of their own learning processes. A background is presented of different contrasting methods in translation and language learning that are relevant to our discussion. Focus is given to contemporary research on reforms within Comparative Theory, as expressed in Translation in Language Teaching (TILT), Contrastive Analysis and "The Third Space". The students' reflections are presented alongside their attempts to translate two different texts: one lyric and one technical vocational text. The results show a pragmatic attitude among the students toward tools like dictionaries or Google Translate, but also a critical awareness of their use and limits. The students appeared to prefer using the first language rather than the target language when discussing the correct translation, as they sought accuracy over meaning. For them, translation was a natural, problem-solving activity deserving a rightful place in language teaching.
Abstract:
There has been great interest in improving the machining of cast iron materials in the automotive and other industries. Comparative studies of tools used to dry-machine grey cast iron (CI) and compacted graphite iron (CGI) have been performed in order to find out why, in the case of CGI, tool lifetime is not significantly higher. Machining these materials by turning with traditional high-speed steel and carbide cutting tools presents several disadvantages. One of them is that all traditional machining processes involve cooling fluid to remove the heat generated in the workpiece by friction during cutting. This paper presents a new generation of ceramic cutting tools exhibiting improved properties and important advances in machining CI and CGI. Tool performance was analysed as a function of flank wear, temperature and roughness. The main wear mechanisms observed were abrasion for CI, and inter-diffusion of constituent elements between the tool and CGI, causing cratering. The difference in tool lifetime can be explained by the formation of a MnS layer on the tool surface in the case of grey CI; this layer is missing in the case of CGI.