994 results for continuous integration
Abstract:
Nowadays, numerous systems (financial, industrial production, basic services infrastructure, etc.) depend on software. According to the definition of Software Engineering given by I. Sommerville, “Software engineering is an engineering discipline that is concerned with all aspects of software production from the early stages of system specification through to maintaining the system after it has gone into use.” “Software engineering is not just concerned with the technical processes of software development. It also includes activities such as software project management and the development of tools, methods, and theories to support software production.” Software development process models establish a set of guidelines for carrying out a software project successfully. Since these process models emerged, new ways of managing projects and producing quality software have been investigated. First came the so-called heavyweight or traditional methodologies; later, with time and technological progress, the so-called agile methodologies emerged. Among agile practices, continuous integration stands out. This practice was introduced by Martin Fowler with the aim of making teamwork easier and automating integration tasks. Continuous integration is based on building projects automatically and frequently, so that errors are detected early and their correction can be prioritized. Nevertheless, one of the keys to success in any software project is a working environment that facilitates, systematizes, and supports an efficient development process. This Final Degree Project (FDP) aims to analyze different tools in order to configure a working environment that allows projects to be developed with agile methodologies and continuous integration in an easy and efficient way. Once the tools were analyzed, a working environment was proposed and configured for deployment and use. A noteworthy feature of this FDP is that the analyzed tools share a common and highly valued characteristic: they are open-source. The proposed environment has a client-server architecture, since most software projects are developed by teams; the server gives the clients/developers access to the set of tools that make up the environment. The server side supports continuous integration through tools for version control, user story management, software metrics analysis, and build automation. The client side only requires an integrated development environment (IDE) that supports the Java programming language and a network connection to the server.
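As an illustration of the kind of automation described above (frequent, automatic builds driven by changes in version control), the following is a minimal sketch in Python. The use of Git, the Maven command and the repository path are assumptions made only for this example; the abstract itself does not name the specific open-source tools that were analyzed.

    # Minimal continuous-integration polling loop (illustrative sketch only).
    import subprocess
    import time

    REPO_DIR = "/srv/ci/workdir"          # hypothetical server-side checkout
    BUILD_CMD = ["mvn", "-B", "verify"]   # hypothetical build/test command for a Java project

    def current_commit():
        # Commit hash of the local checkout after the latest pull.
        return subprocess.check_output(["git", "rev-parse", "HEAD"], cwd=REPO_DIR).decode().strip()

    def poll_and_build(interval_seconds=60):
        last_built = None
        while True:
            subprocess.run(["git", "pull", "--ff-only"], cwd=REPO_DIR, check=True)
            head = current_commit()
            if head != last_built:        # new commits arrived: integrate them now
                result = subprocess.run(BUILD_CMD, cwd=REPO_DIR)
                status = "SUCCESS" if result.returncode == 0 else "FAILURE"
                print(f"build of {head[:8]}: {status}")   # early feedback to the team
                last_built = head
            time.sleep(interval_seconds)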
Abstract:
Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These infeasible test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier. To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
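A sketch of the feasibility classifier described above is given below. The one-hot/count encoding of event IDs and the use of scikit-learn's LogisticRegression are assumptions made for illustration; they are not necessarily the exact feature construction or implementation used in the study.

    # Hedged sketch: binary test-case feasibility classifier over event-ID features.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    def train_feasibility_classifier(test_cases, labels):
        # test_cases: list of event-ID sequences, e.g. [["e12", "e7", "e12"], ...]
        # labels: 1 = feasible (executable), 0 = infeasible.
        docs = [" ".join(case) for case in test_cases]
        vectorizer = CountVectorizer(token_pattern=r"\S+")   # one feature per unique event ID
        X = vectorizer.fit_transform(docs)
        X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)
        classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        print("held-out accuracy:", classifier.score(X_test, y_test))
        return vectorizer, classifier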
Abstract:
In this thesis we show the impact of neutron-capture cross sections (n, γ) averaged over the Maxwellian energy distribution (MACS) on the evolution of red giant stars. To achieve this goal, an automated procedure was developed to compute the MACS starting from evaluated nuclear data libraries. The MACS obtained in this way were then used as input parameters for the FUNS code, which implements stellar evolution models. Results are presented for the isotopic abundances of the elements synthesized in the s-process, obtained using different libraries. Finally, an example is shown of the impact of the data obtained at the n_TOF experiment.
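For context, the Maxwellian-averaged cross section at thermal energy kT is conventionally written as MACS(kT) = (2/√π) (kT)^(-2) ∫ σ(E) E exp(−E/kT) dE. The sketch below shows one way such an average could be computed numerically from a pointwise σ(E) table; the trapezoidal scheme and the made-up 1/v cross section are illustrative assumptions, not the actual procedure used to prepare the FUNS inputs.

    # Hedged sketch: Maxwellian-averaged cross section from a tabulated sigma(E).
    import numpy as np

    def macs(energy_eV, sigma_barn, kT_eV=30e3):
        # energy_eV: ascending neutron energy grid [eV]
        # sigma_barn: capture cross section on that grid [barn]
        # kT_eV: thermal energy; 30 keV is the classical s-process reference point
        E = np.asarray(energy_eV, dtype=float)
        s = np.asarray(sigma_barn, dtype=float)
        f = s * E * np.exp(-E / kT_eV)                       # sigma(E) * E * exp(-E/kT)
        integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))   # trapezoidal rule
        return (2.0 / np.sqrt(np.pi)) * integral / kT_eV**2      # MACS in barn

    # Example with a fictitious 1/v cross section (illustration only):
    E = np.logspace(1, 6, 2000)                              # 10 eV to 1 MeV
    print(macs(E, 1.0 / np.sqrt(E)))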
Abstract:
Gamma zero-lag phase synchronization has been measured in the animal brain during visual binding. Human scalp EEG studies have used a phase-locking factor (trial-to-trial phase-shift consistency) or gamma amplitude to measure binding, but have so far not analyzed common-phase signals. This study introduces a method to identify networks oscillating with near zero-lag phase synchronization in human subjects.
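The abstract does not detail the method, but the basic quantity involved in detecting near zero-lag phase synchronization can be illustrated as follows. The Hilbert-transform phase estimate and the phase-locking value below are a generic sketch, assuming the signals have already been band-pass filtered in the gamma range; they are not necessarily the procedure introduced in the study.

    # Hedged sketch: phase-locking value and mean phase lag between two filtered signals.
    import numpy as np
    from scipy.signal import hilbert

    def zero_lag_sync(x, y):
        # x, y: 1-D arrays, already band-pass filtered in the gamma band.
        phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))   # instantaneous phase difference
        mean_vector = np.mean(np.exp(1j * phase_diff))
        plv = np.abs(mean_vector)        # 1.0 = perfectly phase locked across time
        lag = np.angle(mean_vector)      # close to 0 rad = near zero-lag synchronization
        return plv, lag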
Abstract:
Oxidation processes can be used to treat industrial wastewater containing non-biodegradable organic compounds. However, the presence of dissolved salts may inhibit or retard the treatment process. In this study, wastewater desalination by electrodialysis (ED) combined with an advanced oxidation process (photo-Fenton) was applied to an aqueous NaCl solution containing phenol. The influence of the process variables on the demineralization factor was investigated for ED at pilot scale, and a correlation was obtained between the phenol, salt, and water fluxes and the driving force. The oxidation process was investigated in a laboratory batch reactor, and a model based on artificial neural networks was developed by fitting the experimental data, describing the reaction rate as a function of the input variables. With the experimental parameters of both processes, a dynamic model was developed for ED and a continuous model, using a plug-flow reactor approach, for the oxidation process. Finally, the hybrid model simulation was able to validate different scenarios of the integrated system and can be used for process optimization.
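As a rough illustration of the neural-network rate model mentioned above, the sketch below fits a small multilayer perceptron to batch-reactor data. The choice of input variables, network size and scikit-learn implementation are assumptions for the example, not the model actually developed in the study.

    # Hedged sketch: reaction rate as a function of operating variables via a small ANN.
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def fit_rate_model(X, y):
        # X rows (assumed for illustration): [phenol conc., NaCl conc., H2O2 dose, irradiance]
        # y: reaction rate observed in the laboratory batch runs.
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0),
        )
        return model.fit(X, y)

The fitted rate model could then be evaluated along the reactor coordinate inside a plug-flow mass balance, but that integration step is omitted here.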
Abstract:
Most definitions of virtual enterprise (VE) incorporate the idea of extended and collaborative outsourcing to suppliers and subcontractors in order to achieve a competitive response to market demands (Webster, Sugden, & Tayles, 2004). As suggested by several authors (Browne & Zhang, 1999; Byrne, 1993; Camarinha-Matos & Afsarmanesh, 1999; Cunha, Putnik, & Ávila, 2000; Davidow & Malone, 1992; Preiss, Goldman, & Nagel, 1996), a VE consists of a network of independent enterprises (resource providers) with reconfiguration capability in useful time, permanently aligned with market requirements, created to exploit a specific market opportunity, and where each participant contributes its best practices and core competencies to the success and competitiveness of the structure as a whole. Even during the operation phase of the VE, the configuration can change to keep the business aligned with market demands; this is reflected in the identification of reconfiguration opportunities and the continuous readjustment or reconfiguration of the VE network to meet unexpected situations or to maintain permanent competitiveness and maximum performance (Cunha & Putnik, 2002, 2005a, 2005b).
Abstract:
Lean Thinking is an important pillar in the success of any continuous improvement program. Its tools are useful for analyzing, controlling and organizing the data needed for sound decision making in organizations. The main objective of this project was to design a quality improvement program at Eurico Ferreira, S.A., based on the evaluation of customer satisfaction and the implementation of 5S. We first selected the business area of the company to address. After that selection, an initial diagnosis was carried out, identifying several points for improvement to which some Lean Thinking tools were applied, in particular Value Stream Mapping and the 5S methodology. With the former, we were able to map the current state of the process, representing all stakeholders as well as the flow of materials and information throughout the process. The 5S methodology made it possible to act on waste, identifying and implementing various process improvements.
Abstract:
Maintenance is an extremely important area, especially in industry. Properly organized, it allows production to be planned and executed as intended, enabling a company to keep the desired turnover level and the delivery dates agreed with its customers; otherwise, it can lead to chaos. However, the most current production management challenges, namely those posed by Lean Manufacturing, now demand somewhat more than simple maintenance. It becomes mandatory to carry out economic analyses that determine when a given piece of equipment starts to require excessive maintenance costs, which may call for a deeper reconditioning of the equipment, possibly including an improvement of its performance. In such cases, there must be close cooperation between Production Management and Maintenance in order to determine the best moment to improve the equipment, from the perspective of its overall operation in the production line, adapting it to the performance that will be demanded of the line as a whole. In this domain, Design (Projecto) comes to provide an extremely valuable service to the company, joining the Production + Maintenance pair and creating value in the intervention, by developing work that not only restores the normal state of production but also promotes its sustained improvement. This work aims to reflect on and assess the relevance of Design in this type of operation, contributing in a systematic and sustained way to the continuous improvement of manufacturing processes. A case study is presented to validate the development described above.
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially designed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach in order to cope with this problem and provide this functionality in real time.
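The following sketch illustrates the idea of combining the enclosures produced by a branch-and-prune solver with Monte Carlo quadrature to estimate the probability of the consistent scenarios. The annulus constraint, the uniform prior and the single coarse box are placeholders chosen only to make the example self-contained; they are not the problems treated in the thesis.

    # Hedged sketch: Monte Carlo estimate of the probability mass of consistent scenarios.
    import random

    def satisfies(x, y):
        # Placeholder constraint: consistent scenarios lie on the annulus 1 <= x^2 + y^2 <= 2.
        return 1.0 <= x * x + y * y <= 2.0

    def box_probability(box, prior_density, n_samples=10000):
        # box: ((x_lo, x_hi), (y_lo, y_hi)), an enclosure returned by branch-and-prune.
        (x_lo, x_hi), (y_lo, y_hi) = box
        volume = (x_hi - x_lo) * (y_hi - y_lo)
        total = 0.0
        for _ in range(n_samples):
            x = random.uniform(x_lo, x_hi)
            y = random.uniform(y_lo, y_hi)
            if satisfies(x, y):
                total += prior_density(x, y)
        return volume * total / n_samples        # quadrature estimate over this box

    # Total probability = sum over (assumed disjoint) enclosing boxes.
    boxes = [((-2.0, 2.0), (-2.0, 2.0))]          # one coarse enclosure, for the demo only
    uniform = lambda x, y: 1.0 / 16.0             # uniform prior density on [-2, 2] x [-2, 2]
    print(sum(box_probability(b, uniform) for b in boxes))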
Abstract:
Hippocampal adult neurogenesis results in the continuous formation of new neurons in the adult hippocampus, which participate in learning and memory. Manipulations increasing adult neurogenesis have huge clinical potential in pathologies involving memory loss. Intriguingly, most of the newborn neurons die during their maturation. Thus, increasing newborn neuron survival during their maturation may be a powerful way to increase overall adult neurogenesis. The factors governing this neuronal death are still poorly known. In my PhD project, we made the hypothesis that synaptogenesis and synaptic activity play a role in the survival of newborn hippocampal neurons. We studied three factors potentially involved in the regulation of the synaptic integration of adult-born neurons. First, we used propofol anesthesia to provoke a global increase in the GABAergic activity of the network, and we evaluated the outcome on newborn neuron synaptic integration, morphological development and survival. Propofol anesthesia impaired the dendritic maturation and survival of adult-born neurons in an age-dependent manner. Next, we examined the development of astrocytic ensheathment of the synapses formed by newborn neurons, as we hypothesized that astrocytes are involved in their synaptic integration. Astrocytic processes ensheathed the synapses of newborn neurons very early in their development, and these processes modulated synaptic transmission onto these cells. Finally, we studied the cell-autonomous effects of the overexpression of synaptic adhesion molecules on the development, synaptic integration and survival of newborn neurons, and we found that manipulating a single adhesion molecule was sufficient to modify synaptogenesis and/or synapse function, and to modify newborn neuron survival. Together, these results suggest that the activity of the neuronal network, the modulation of glutamate transport by astrocytes, and the synapse formation and activity of the neuron itself may regulate the survival of newborn neurons. Thus, the survival of newborn neurons may depend on their ability to communicate with the network. This knowledge is crucial for finding ways to increase neurogenesis in patients. More generally, understanding how the neurogenic niche works and which factors are important for the generation, maturation and survival of neurons is fundamental if we are, one day, to replace neurons in any region of the brain.
Abstract:
Torrefaction is one of the pretreatment technologies used to enhance the fuel characteristics of biomass. The efficient and continuous operation of a torrefaction reactor at commercial scale demands a secure biomass supply as well as an adequate source of heat. Biorefinery plants or biomass-fuelled steam power plants have the potential to be integrated with the torrefaction reactor to exchange heat and mass, using available infrastructure and energy sources. The technical feasibility of this integration is examined in this study. A new model for the torrefaction process is introduced and verified against the available experimental data. The torrefaction model is then integrated into different steam power plants to simulate possible mass and energy exchange between the reactor and the plants. The performance of the integrated plant is investigated for different configurations and the results are compared.
Integration of marketing research data in new product development. Case study: Food industry company
Abstract:
The aim of this master’s thesis is to provide a real-life example of how marketing research data is used by different functions in the NPD process. In order to achieve this goal, a case study was carried out in a company, in which the gathering, analysis, distribution and synthesis of marketing research data in NPD were studied. The main research question was formulated as follows: How is marketing research data integrated and used by different company functions in the NPD process? The theory part of the thesis focuses on the role of the marketing function in NPD, the use of marketing research particularly in the food industry, and issues related to the marketing/R&D interface during the NPD process. The empirical part is based on qualitative explanatory case study research. Individual in-depth interviews with company representatives, company documents and online research were used for data collection and analyzed through triangulation. The empirical findings suggest that the most important marketing data sources at the concept generation stage of NPD are global trend monitoring, retail audits and consumer insights. These data sets are crucial for establishing the potential of the product on the market and defining the desired features of the new product to be developed. The findings also provide an example of successful cross-functional communication during the NPD process, with both formal and informal communication patterns. General managerial recommendations are given on integrating strategy, process, continuous improvement, and motivated cross-functional product development teams in NPD.
Abstract:
This thesis studies the properties and applications of four families of special functions associated with Weyl groups, denoted $C$, $S$, $S^s$ and $S^l$. These functions can be seen as generalizations of the Chebyshev polynomials. They are related to multivariate orthogonal polynomials associated with simple Lie algebras, for example the Jacobi and Macdonald polynomials. They enjoy several remarkable properties, including continuous and discrete orthogonality. In particular, it is proved in this thesis that the functions $S^s$ and $S^l$ characterized by certain parameters are mutually orthogonal with respect to a discrete measure. Their discrete orthogonality allows two types of discrete transforms, analogous to Fourier transforms, to be derived for each simple Lie algebra with roots of different lengths. Like the Chebyshev polynomials, these four families of functions have applications in numerical analysis. In this thesis we obtain some formulas of <
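For orientation, the one-variable prototype of the discrete orthogonality mentioned above is the classical relation for Chebyshev polynomials of the first kind evaluated at the Chebyshev nodes; it is quoted here only as the special case that the $C$, $S$, $S^s$ and $S^l$ families generalize to Weyl groups:

$$\sum_{j=0}^{N-1} T_m(x_j)\,T_n(x_j)=\begin{cases}N, & m=n=0,\\ N/2, & m=n\neq 0,\\ 0, & m\neq n,\end{cases}\qquad x_j=\cos\frac{\pi\left(j+\tfrac12\right)}{N},\quad 0\le m,n<N.$$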
Abstract:
Studies of construction labour productivity have revealed that limited predictability and multi-agent social complexity make long-range planning of construction projects extremely inaccurate. Fire-fighting, a cultural feature of construction project management, the social and structural diversity of the permanent organizations involved, and structural temporality all contribute to relational failures and frequent changes. The main purpose of this paper is therefore to demonstrate that appropriate construction planning may have a profound synergistic effect on the structural integration of a project organization. From a general systems theory perspective, a further specific objective is to investigate and evaluate the organizational effects of changes in planning and the potential for achieving continuous project-organizational synergy. The newly developed methodology recognises that planning should also act as a continuous, improvement-leading driving force throughout a project. The synergistic effect of dual membership in process planning fostered project-wide integration, eliminated internal boundaries, and created a pool of constantly upgraded knowledge. It maintained a creative environment that resulted in a number of process-related improvements from all parts of the organization. As a result, labour productivity increased by more than 30%, profits rose from an average of 12% to more than 18%, and project durations were reduced by several days.
Abstract:
Evolutionary novelties in the skeleton are usually expressed as changes in the timing of growth of features intrinsically integrated at different hierarchical levels of development (1). As a consequence, most of the shape traits observed across species vary quantitatively rather than qualitatively (2), in a multivariate space (3) and in a modularized way (4,5). Because most phylogenetic analyses normally use discrete, hypothetically independent characters (6), previous attempts have disregarded the phylogenetic signal potentially enclosed in the shape of morphological structures. When analysing low taxonomic levels, where most variation is quantitative in nature, meeting basic requirements such as the choice of characters and the capacity to use continuous, integrated traits is of crucial importance for recovering wider phylogenetic information. This is particularly relevant when analysing extinct lineages, where the available data are limited to fossilized structures. Here we show that when continuous, multivariate and modularized characters are treated as such, cladistic analysis successfully resolves relationships among the main Homo taxa. Our approach is based on a combination of cladistics, evolutionary-development-derived selection of characters, and geometric morphometric methods. In contrast with previous cladistic analyses of hominid phylogeny, our method accounts for the quantitative nature of the traits and respects their morphological integration patterns. Because complex phenotypes are observable across different taxonomic groups and are potentially informative about phylogenetic relationships, future analyses should strongly consider incorporating these types of traits.