259 results for Automate
Abstract:
This thesis presents a study, spanning several areas of theoretical computer science, of models of computation that combine finite automata with arithmetic constraints. We focus on questions of decidability, expressiveness and closure, while extending the study to complexity, logic, algebra and applications. The study is presented through four research articles. The first article, Affine Parikh Automata, continues Klaedtke and Ruess's study of Parikh automata and defines generalizations and restrictions of them. The Parikh automaton is a starting point of this thesis; we show that this model of computation is equivalent to the constrained automaton, which we define as an automaton that accepts a word only if the number of times each transition is taken satisfies an arithmetic constraint. This model extends naturally to the affine Parikh automaton, which applies an affine transformation to a set of registers each time a transition is taken. We also study the Parikh automaton on letters: an automaton that accepts a word only if the number of occurrences of each letter satisfies an arithmetic constraint. The second article, Bounded Parikh Automata, studies the bounded languages of Parikh automata. A language is bounded if there exist words w_1, w_2, ..., w_k such that every word of the language can be written as w_1...w_1 w_2...w_2 ... w_k...w_k. These languages are important in application areas and usually enjoy good theoretical properties. We show that, in the context of bounded languages, determinism does not affect the expressiveness of Parikh automata. The third article, Unambiguous Constrained Automata, introduces unambiguous constrained automata, that is, constrained automata with only one accepting path per word recognized by the automaton. We show that this model combines better expressiveness and better closure properties than the deterministic constrained automaton. The problem of deciding whether the language of an unambiguous constrained automaton is regular is shown to be decidable. The fourth article, Algebra and Complexity Meet Constrained Automata, presents a study of the algebraic representations admitted by constrained automata and affine Parikh automata. From these characterizations we derive expressiveness and complexity results. We also show that certain classical hypotheses in computational complexity are related to separation and non-closure results for affine Parikh automata. The thesis concludes by pointing to possible further work, through a number of open problems.
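By way of illustration only (this example is not taken from the thesis), the following Python sketch shows the letter-counting variant described above: a word is accepted only if an underlying finite automaton accepts it and the letter counts satisfy an arithmetic constraint. The automaton and the constraint #a = #b are invented for the example.

    from collections import Counter

    # Toy "Parikh automaton on letters": a word over {a, b} is accepted iff the
    # underlying DFA accepts it AND the letter counts satisfy an arithmetic
    # constraint (here: #a == #b). Both the DFA and the constraint are
    # illustrative assumptions, not taken from the thesis.
    DFA = {                       # DFA rejecting any word with two consecutive b's
        ('q0', 'a'): 'q0',
        ('q0', 'b'): 'q1',
        ('q1', 'a'): 'q0',
    }
    START, ACCEPTING = 'q0', {'q0', 'q1'}

    def constraint(counts: Counter) -> bool:
        """Arithmetic (Presburger-style) constraint on the Parikh image."""
        return counts['a'] == counts['b']

    def accepts(word: str) -> bool:
        state = START
        for letter in word:
            key = (state, letter)
            if key not in DFA:    # missing transition: reject
                return False
            state = DFA[key]
        return state in ACCEPTING and constraint(Counter(word))

    print(accepts('abab'))   # True: DFA accepts and #a == #b
    print(accepts('aab'))    # False: counts violate the constraint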
Abstract:
Completed under joint supervision (cotutelle) with the Université de Grenoble.
Abstract:
Climate change is becoming increasingly important in the study of large-scale spatial phenomena. Many experts state that climate change will be one of the main drivers of ecological change in the coming decades and that its consequences will be unavoidable. These changes will affect the physical environment through the melting of ice caps, the thawing of permafrost, the instability of mountain slopes in permafrost zones, and an increase in the intensity, severity and frequency of extreme climatic events such as forest fires. Climate change will also affect the biological environment, for example through changes in the length of the growing season, an increase in invasive exotic species, and shifts in the distribution of living species. This study covers two aspects: 1) changes in the spatial distribution of 39 bird species and 2) changes in the spatial patterns of fires in the Quebec boreal forest, both over the climate horizon to 2100. A statistical modelling approach shows that the spatial distribution of boreal forest birds is strongly related to bioclimatic variables (adjusted R² = 0.53). These results make it possible to carry out bioclimatic modelling for the evening grosbeak and the black-capped chickadee, which see the northern limit of their distribution shift northward as climate warming intensifies. Finally, spatially explicit cellular-automaton modelling shows how climate change will lead to an increase in forest-fire frequency and in the area burned in the Quebec boreal forest.
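As a purely illustrative sketch of the kind of spatially explicit cellular-automaton model mentioned above (not the model used in the study), the following Python fragment spreads fire on a grid of forest cells; the grid size and the ignition and spread probabilities are assumed values, and a warming scenario would simply be expressed by raising them.

    import random

    # Minimal cellular-automaton sketch of fire spread (illustrative only).
    # States: 0 = empty/burnt, 1 = forest, 2 = burning. A warmer climate scenario
    # would be represented here by larger ignition/spread probabilities (assumed).
    N = 50
    P_IGNITE = 0.0005   # spontaneous ignition per forest cell per step (assumption)
    P_SPREAD = 0.4      # chance fire spreads to a forested neighbour (assumption)

    def step(grid):
        new = [row[:] for row in grid]
        for i in range(N):
            for j in range(N):
                if grid[i][j] == 2:
                    new[i][j] = 0                  # burning cell burns out
                elif grid[i][j] == 1:
                    neighbours = [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                    near_fire = any(0 <= x < N and 0 <= y < N and grid[x][y] == 2
                                    for x, y in neighbours)
                    if (near_fire and random.random() < P_SPREAD) \
                            or random.random() < P_IGNITE:
                        new[i][j] = 2              # cell catches fire
        return new

    grid = [[1] * N for _ in range(N)]             # start fully forested
    for _ in range(100):
        grid = step(grid)
    burned = sum(row.count(0) for row in grid)
    print(f"cells burned after 100 steps: {burned}")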
Abstract:
The aim of the present work was to automate the CSP process, to deposit and characterize CuInS2/In2S3 layers using this system, and to fabricate devices using these films. An automated spray system for the deposition of compound semiconductor thin films was designed and developed so as to eliminate the manual labour involved in spraying and to facilitate standardization of the method. The system was designed such that parameters like spray rate, movement of the spray head, duration of spray, temperature of the substrate, pressure of the carrier gas and height of the spray head above the substrate could be varied. Using this system, binary, ternary as well as quaternary films could be successfully deposited. The second part of the work deals with the deposition and characterization of CuInS2 and In2S3 layers respectively. In the case of CuInS2 absorbers, the effects of different preparation conditions and post-deposition treatments on the optoelectronic, morphological and structural properties were investigated. It was observed that preparation conditions and post-deposition treatments played a crucial role in controlling the properties of the films. The studies in this direction were useful in understanding how the variation in spray parameters tailored the properties of the absorber layer. These results were subsequently made use of in the device fabrication process. The effects of copper incorporation in In2S3 films were investigated to find out how the diffusion of Cu from CuInS2 to In2S3 affects the properties at the junction. It was noticed that there was a regular variation in the optoelectronic properties with increasing copper concentration. Devices were fabricated on ITO-coated glass using CuInS2 as the absorber and In2S3 as the buffer layer, with silver as the top electrode. Stable devices could be deposited over an area of 0.25 cm2, even though the efficiency obtained was not high. Using the manual spray system, we could achieve devices of area 0.01 cm2 only. Thus automation helped in obtaining repeatable results over larger areas than those obtained with the manual unit. Silver diffusion into the cells before coating the electrodes resulted in better collection of carriers. From this work it was seen that the CuInS2/In2S3 junction deposited through the automated spray process has the potential to achieve high efficiencies.
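Since the abstract enumerates the controllable deposition parameters, a minimal Python sketch of how one run could be captured as a repeatable, machine-readable recipe is given below; the field names, units and values are assumptions made for illustration and do not describe the actual system's interface.

    from dataclasses import dataclass

    # Illustrative deposition recipe for an automated spray run; field names,
    # units and values are assumptions for this sketch, not the real system's API.
    @dataclass
    class SprayRecipe:
        spray_rate_ml_per_min: float
        head_speed_mm_per_s: float
        spray_duration_s: float
        substrate_temp_c: float
        carrier_gas_pressure_kpa: float
        head_height_mm: float

    cuins2_absorber_run = SprayRecipe(
        spray_rate_ml_per_min=2.0,
        head_speed_mm_per_s=10.0,
        spray_duration_s=600.0,
        substrate_temp_c=300.0,
        carrier_gas_pressure_kpa=120.0,
        head_height_mm=150.0,
    )
    print(cuins2_absorber_run)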
Abstract:
Embedded systems, especially wireless sensor nodes, are highly prone to type-safety and memory-safety issues. Contiki, a prominent operating system in this domain, is even more affected by the problem since it makes extensive use of type casts and pointers. This work is an attempt to eliminate the possibility of safety violations in Contiki. We use Deputy, a powerful yet efficient tool, to achieve this, and we also try to automate the process.
Abstract:
A novel technique for estimating the rank of the trajectory matrix in the local subspace affinity (LSA) motion segmentation framework is presented. This new rank estimation is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built with LSA. The result is an enhanced model-selection technique for trajectory-matrix rank estimation that makes it possible to automate LSA, without requiring any a priori knowledge, and to improve the final segmentation.
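To make the rank-estimation step concrete, here is a hedged numpy sketch of the generic idea (not the criterion proposed in the paper): estimate the rank of a trajectory matrix from its singular-value spectrum, then build an affinity matrix from trajectories projected onto that subspace. The energy threshold and matrix sizes are assumptions.

    import numpy as np

    # Generic sketch of subspace-based rank estimation for a trajectory matrix
    # W (2F x P: F frames, P tracked points). The energy threshold is an assumed
    # model-selection parameter, not the criterion proposed in the paper.
    def estimate_rank(W: np.ndarray, energy: float = 0.99) -> int:
        s = np.linalg.svd(W, compute_uv=False)
        cumulative = np.cumsum(s**2) / np.sum(s**2)
        return int(np.searchsorted(cumulative, energy) + 1)

    def affinity_matrix(W: np.ndarray, rank: int) -> np.ndarray:
        # Project point trajectories onto the rank-dimensional subspace and use
        # the angle between projected trajectories as a simple affinity.
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        V = Vt[:rank].T                              # one row per tracked point
        V = V / np.linalg.norm(V, axis=1, keepdims=True)
        return np.abs(V @ V.T)                       # cosine-style affinity in [0, 1]

    W = np.random.randn(2 * 30, 3) @ np.random.randn(3, 50)   # synthetic rank-3 data
    r = estimate_rank(W)
    A = affinity_matrix(W, r)
    print(r, A.shape)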
Abstract:
Expert supervision systems are software applications specially designed to automate process monitoring. The goal is to reduce the dependency on human operators in assuring the correct operation of a process, including faulty situations. Building this kind of application involves a significant design and development effort in order to represent and manipulate process data and behaviour at different degrees of abstraction, and to interface with the data acquisition systems connected to the process. This is an open problem that becomes more complex as the number of variables, parameters and relations needed to capture the complexity of the process grows. Multiple specialised modules, each tuned to a simpler task and operating under coordination, provide a solution. A modular architecture based on the concept of software agents, taking advantage of the integration of diverse knowledge-based techniques, is proposed for this purpose. The components (software agents, communication mechanisms and perception/action mechanisms) are based on ICa (Intelligent Control architecture), a software middleware supporting the construction of applications with software-agent features.
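A minimal Python sketch of the modular, agent-based organisation described above follows; the agent types, monitored variables and thresholds are invented for the example, and this is not the ICa interface.

    # Minimal sketch of coordinated supervision agents (illustrative only; the
    # agent names, variables and thresholds are assumptions, not the ICa API).
    class ThresholdAgent:
        def __init__(self, variable: str, limit: float):
            self.variable, self.limit = variable, limit

        def perceive(self, sample: dict) -> list:
            value = sample.get(self.variable)
            if value is not None and value > self.limit:
                return [f"{self.variable} above {self.limit}: {value}"]
            return []

    class Coordinator:
        def __init__(self, agents):
            self.agents = agents

        def supervise(self, sample: dict) -> list:
            # Collect the alarms raised by every specialised agent for this sample.
            return [alarm for agent in self.agents for alarm in agent.perceive(sample)]

    coordinator = Coordinator([ThresholdAgent("temperature", 80.0),
                               ThresholdAgent("pressure", 5.0)])
    print(coordinator.supervise({"temperature": 92.3, "pressure": 4.1}))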
Abstract:
First BIREDIAL conference: Bibliotecas y Repositorios Digitales: Gestión del Conocimiento, Acceso Abierto y Visibilidad Latinoamericana (Digital Libraries and Repositories: Knowledge Management, Open Access and Latin American Visibility). May 9 to 11, 2011. Bogotá, Colombia.
Abstract:
This 10-minute video shows you how you can include image files in your thesis. By using the University template and special styles, you will be able to automate their numbering and references to them in the text, as well as generate tables of figures.
Abstract:
Wednesday 23rd April 2014. Speaker(s): Willi Hasselbring. Organiser: Leslie Carr. Time: 23/04/2014 11:00-11:50. Location: B32/3077. File size: 669 MB. Abstract: For good scientific practice, it is important that research results can be properly checked by reviewers and possibly repeated and extended by other researchers. This is of particular interest for "digital science", i.e. for in-silico experiments. In this talk, I'll discuss some ways in which software systems and services may contribute to good scientific practice. In particular, I'll present our PubFlow approach to automating publication workflows for scientific data. The PubFlow workflow management system is based on established technology. We integrate institutional repository systems (based on EPrints) and world data centers (in marine science). PubFlow collects provenance data automatically via our monitoring framework Kieker. Provenance information describes the origins and history of scientific data over its life cycle, and the process by which it arrived. Thus, provenance information is highly relevant to the repeatability and trustworthiness of scientific results. For our evaluation in marine science, we collaborate with the GEOMAR Helmholtz Centre for Ocean Research Kiel.
Abstract:
We describe a simple method to automate the geometric optimization of molecular orbital calculations of supermolecules on potential surfaces that are corrected for basis set superposition error using the counterpoise (CP) method. This method is applied to the H-bonding complexes HF/HCN, HF/H2O, and HCCH/H2O using the 6-31G(d,p) and D95++(d,p) basis sets at both the Hartree-Fock and second-order Møller-Plesset levels. We report the interaction energies, geometries, and vibrational frequencies of these complexes on the CP-optimized surfaces, and compare them with similar values calculated using traditional methods, including the (more traditional) single-point CP correction. Upon optimization on the CP-corrected surface, the interaction energies become more negative (before vibrational corrections) and the H-bond stretching vibrations decrease in all cases. The extent of these effects varies from extremely small to quite large depending on the complex and the calculational method. The relative magnitudes of the vibrational corrections cannot be predicted from the H-bond stretching frequencies alone.
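For reference, the counterpoise-corrected interaction energy that such CP-corrected surfaces are built on is the standard Boys-Bernardi expression (a textbook formula, not one quoted from the paper), with each monomer evaluated in the full dimer basis at its geometry within the complex:

    \Delta E_{\mathrm{int}}^{\mathrm{CP}} = E_{AB}^{AB}(G_{AB}) - E_{A}^{AB}(G_{AB}) - E_{B}^{AB}(G_{AB})

Here the superscript denotes the basis set used (the full AB dimer basis throughout) and G_{AB} the geometry each fragment adopts in the complex; CP-corrected optimization means minimizing this quantity over geometries rather than applying it once at the uncorrected minimum.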
Abstract:
ABSTRACT: Knowledge has always existed, even in a latent state, conditioned somewhere and merely waiting for a means (an opportunity) to manifest itself. Knowledge is doubly a phenomenon of consciousness: because it proceeds from consciousness at a given moment of its life and history, and because it ends only in consciousness, perfecting and enriching it. Knowledge is therefore in constant change. Only relatively recently did people begin to speak of Knowledge Management, and at the time it was strongly associated with Information Technology, as a means of collecting, processing and storing ever larger amounts of information. Information Technology has played an extremely important role in organizations for some years now; it was initially adopted to automate the operational processes that support organizations' daily activities, and in recent times Information Technology within organizations has evolved rapidly. All knowledge, even the least relevant to a particular business area, is fundamental to supporting the decision-making process. To achieve better performance and exceed the goals they initially set themselves, organizations tend to equip themselves with more and better Information Systems, as well as with the various methodologies and technologies available today. Consequently, in recent years many organizations have shown a crucial need to integrate all of their information, which is dispersed across their various departments. For top managers (but also other employees) to have pertinent, truthful and reliable information about the business of the organization they represent available in good time, they need access to good Information Technology systems, so that they can act more effectively and efficiently in decision making, having extracted the maximum possible value from the information, and thus achieve better levels of organizational success. Business Intelligence systems and their associated Information Technologies likewise use the data existing in organizations to provide relevant information for decision making. But to reach such satisfactory levels, organizations need human resources, for how can they be competitive without skilled workers? Hence the need for organizations to recruit what are nowadays called "Knowledge Workers": individuals qualified to interpret information within a specific domain. They detect problems and identify alternatives and, with their knowledge and discernment, work to solve those problems, helping considerably the organizations they represent. Using Knowledge Engineering methodologies and technologies such as modelling, they create and manage a record of knowledge, including tacit knowledge, about various business areas of the organization, which can be made explicit in abstract models that can be easily understood and interpreted by other workers with equivalent levels of competence.
Abstract:
The smooth running of a company depends on the coordination of its various elements, the fluidity of its daily operations, the performance of its resources, both human and material, and the interaction of the various systems that make it up. Enterprise technologies have developed continuously since their appearance: from basic processes, to business process management (BPM), to modern enterprise resource planning (ERP) platforms such as the proprietary SAP and Oracle systems, to more general concepts such as SOA and cloud, based on open standards. New technologies offer faster and more efficient channels for moving information, ways of automating and monitoring business processes, and various types of infrastructure that can be used to make the company more productive and flexible. Existing commercial solutions make it possible to achieve these goals, but their acquisition costs can prove too high for some companies or organizations, which then risk failing to adapt to changes in the business. At the same time, open-source software is gaining popularity, but there are still preconceptions about the quality and maturity of this type of software. The aim of this work is to present SOA and the main commercial and open-source SOA products, and to compare the two categories in order to assess the level of maturity of open-source SOA relative to proprietary SOA solutions.
Abstract:
This paper addresses the requirements for a Workflow Management System intended to automate the production and distribution chain for cross-media content, which is by nature multi-partner and multi-site. It advocates requirements for ontology-based object lifecycle tracking within workflow integration, identifying the various types of interfaces, object life cycles and workflow interaction environments within the AXMEDIS Framework.
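To make the object-lifecycle idea concrete, a small hedged Python sketch of lifecycle tracking for a cross-media object moving between partner sites follows; the states, transitions and identifiers are invented for the example and are not the AXMEDIS lifecycle model.

    # Hedged sketch of object lifecycle tracking in a cross-media workflow.
    # The states and transitions below are illustrative assumptions only.
    LIFECYCLE = {
        "created":     {"edited"},
        "edited":      {"edited", "approved"},
        "approved":    {"distributed"},
        "distributed": set(),
    }

    class TrackedObject:
        def __init__(self, object_id: str):
            self.object_id = object_id
            self.state = "created"
            self.history = [("created", None)]       # (state, partner/site)

        def transition(self, new_state: str, site: str):
            if new_state not in LIFECYCLE[self.state]:
                raise ValueError(f"{self.state} -> {new_state} not allowed")
            self.state = new_state
            self.history.append((new_state, site))   # record who did what, where

    obj = TrackedObject("clip-042")
    obj.transition("edited", "partner-A")
    obj.transition("approved", "partner-B")
    print(obj.history)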