948 results for bug life-cycle management
Abstract:
This thesis proposes a Model for the Evaluation and Selection of commercial Initiatives, developed during an internship at the company Astaldi. The opening part of the work presents the theoretical concepts that constitute the premise and the basis for building the Model. The reference background is that of a company operating in the construction industry, developing complex projects at an international level. The first chapter, after a brief introduction to Project Management, examines the multi-project environment, in which most companies, Astaldi included, find themselves operating. Starting from these considerations on the multi-project environment, the chapter then describes the potential and the advantages of a Project Portfolio Management approach, seen as the natural evolution of Project Management. Within Project Portfolio Management, the importance of the project selection process for building the portfolio is analysed, together with the related selection techniques. The second chapter presents the performance Measures and Indicators defined in the literature. It also introduces possible Evaluation Indicators whose measurement is closely tied to the Project Life Cycle: these are incorporated into the proposed Model, whose output is precisely an ex-ante Evaluation Indicator for the project. The third chapter illustrates the company Astaldi, in order to understand as fully as possible the context in which, and for which, the Model was built. The fourth and final chapter analyses the Model itself, describing how it was developed, the operating procedures for its application, and some examples illustrating its use.
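The project selection techniques mentioned above are often multi-criteria scoring methods. As a purely illustrative sketch, and not the Model developed in the thesis, the snippet below ranks candidate initiatives with a simple weighted-scoring approach; the criteria, weights and scores are invented for the example.

# Illustrative weighted-scoring ranking of candidate initiatives.
# Criteria, weights and scores are hypothetical, not taken from the thesis.

CRITERIA_WEIGHTS = {"strategic_fit": 0.4, "expected_margin": 0.35, "risk": 0.25}

candidates = {
    "Initiative A": {"strategic_fit": 8, "expected_margin": 6, "risk": 7},
    "Initiative B": {"strategic_fit": 5, "expected_margin": 9, "risk": 4},
}

def ex_ante_score(scores: dict) -> float:
    """Weighted sum of criterion scores on a 1-10 scale."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

ranking = sorted(candidates, key=lambda name: ex_ante_score(candidates[name]), reverse=True)
for name in ranking:
    print(f"{name}: {ex_ante_score(candidates[name]):.2f}")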
Abstract:
In recent years, European countries have paid increasing attention to renewable sources and greenhouse gas emissions. The Council of the European Union and the European Parliament have set ambitious targets for the coming years. In this scenario, biomass plays a prominent role, since its life cycle produces zero net carbon dioxide emissions. In addition, biomass can ensure continuity of plant operation thanks to its availability and storability. Several conventional biomass-fired systems are currently available, but most of them perform well either at large scale or in the small power range. The absence of an efficient system at small-to-medium scale motivated this thesis project. Its subject is an innovative plant based on a wet indirectly fired gas turbine (WIFGT) integrated with an organic Rankine cycle (ORC) unit for combined heat and power production. The WIFGT performs well in the small-to-medium power range, while the ORC is able to exploit low-temperature heat sources. Their integration is investigated in this thesis with the aim of carrying out a preliminary design of the components. The targeted plant output is around 200 kW, so that neither a large cultivation area nor biomass shipping is needed. Existing in-house simulation tools are used and adapted to this purpose. First, the WIFGT + ORC model is built using zero-dimensional models of the heat exchangers, compressor, turbines, furnace, dryer and pump. Different working fluids are screened, and toluene and benzene turn out to be the most suitable. In the indirectly fired gas turbine, a pressure ratio of around 4 leads to the highest efficiency. The thermodynamic analysis shows an electric efficiency of 38%, exceeding that of other conventional plants in the same power range. The combined plant is designed to recover thermal energy: water is used as coolant in the condenser and is heated from 60°C to 90°C, making space heating possible. One-dimensional models are used to design the heat-exchange equipment, with different types of heat exchanger chosen depending on the working temperature. A finned-plate heat exchanger is selected for the WIFGT heat-transfer equipment because of the high-temperature, oxidizing and corrosive environment; a once-through boiler with finned tubes is chosen to vaporize the organic fluid in the ORC; and a plate heat exchanger is chosen for the condenser and the recuperator. A quasi-one-dimensional model of a single-stage axial turbine is implemented to design both the WIFGT and the ORC turbines. After the component design, the system simulation shows an electric efficiency of around 34%, a 10% decrease compared with the zero-dimensional analysis. The work highlights the potential of the system compared with existing plants from both a technical and an economic point of view.
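As a back-of-the-envelope illustration of how a bottoming ORC raises the electric efficiency of a topping gas turbine, the sketch below applies the usual topping/bottoming relation; the efficiencies and the recoverable-heat fraction are assumed round numbers, not the values computed in the thesis.

# Simple topping/bottoming combined-cycle estimate (illustrative values only).

def combined_electric_efficiency(eta_top: float, eta_orc: float, heat_recovery: float) -> float:
    """eta_top: topping (gas turbine) electric efficiency w.r.t. fuel input,
    eta_orc: ORC electric efficiency w.r.t. the heat it receives,
    heat_recovery: fraction of the topping cycle's rejected heat routed to the ORC."""
    return eta_top + (1.0 - eta_top) * heat_recovery * eta_orc

# Assumed figures, chosen only to show the order of magnitude:
eta = combined_electric_efficiency(eta_top=0.30, eta_orc=0.18, heat_recovery=0.70)
print(f"combined electric efficiency ~ {eta:.1%}")  # ~ 38.8%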
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. Process parameters can be ranked for importance according to their resulting risk ratings, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, which can help pharmaceutical companies identify aspects with high potential risks and react accordingly to improve the safety of medicines.
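FMEA ratings of this kind are conventionally combined into a risk priority number (RPN = occurrence x severity x detectability) to rank failure modes. The sketch below shows that standard calculation on invented example data; the failure modes and their 1-to-10 ratings are hypothetical and are not taken from the proposed rating table.

# Standard FMEA risk priority number (RPN) ranking; example data is hypothetical.

failure_modes = [
    # (process step, failure mode, occurrence, severity, detectability), ratings on a 1-10 scale
    ("Cell culture",   "Contamination of bioreactor",       3, 9, 4),
    ("Chromatography", "Wrong elution buffer conductivity", 5, 6, 2),
    ("Filtration",     "Filter integrity failure",          2, 8, 3),
]

def rpn(occurrence: int, severity: int, detectability: int) -> int:
    """Risk priority number: higher means higher priority for risk reduction."""
    return occurrence * severity * detectability

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[2:]), reverse=True)
for step, mode, o, s, d in ranked:
    print(f"{step:15s} {mode:35s} RPN={rpn(o, s, d)}")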
Abstract:
Green roof mitigation of the volume and peak flow rate of stormwater runoff has been studied extensively. However, because green roofs are commonly fertilized, there is the potential for introducing nutrients into local bodies of water. This study therefore compares green roof runoff quality with the water quality of precipitation and of runoff from a bare shingle roof. The runoff from a demonstration-scale extensive green roof was analyzed during the summer of 2011 for its effect on runoff volume, and during eleven storm events in the fall and winter for concentrations of copper, cadmium, zinc, lead, nitrogen species, total nitrogen, total organic carbon, sulfate, orthophosphate, and other monovalent and divalent ions. The green roof reduced the overall volume of runoff and served as a sink for NO₃⁻ and NH₄⁺. However, the green roof was also a source of the pollutants PO₄³⁻, SO₄²⁻, TOC, cations, and total nitrogen. Metals such as zinc and lead showed trends of higher mass loads in the bare-roof runoff than in precipitation and green-roof runoff, although the results were not statistically significant. The green roof also showed trends, likewise not statistically significant, of retaining cadmium and copper. With the green roof serving as a source of phosphorus species and a sink for nitrogen species, and appearing to retain metals and reduce total runoff volume, the life cycle impact analysis shows minimal impacts from the green roof, compared with precipitation and bare-roof runoff, in all categories except freshwater eutrophication. Therefore, the best locations for installing a green roof may be coastal environments.
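The sink/source classification above follows from comparing event mass loads (concentration times runoff volume) in precipitation and in roof runoff. A minimal sketch of that bookkeeping is shown below; the concentrations and volumes are made-up illustrative numbers, not measurements from the study.

# Event mass load = concentration (mg/L) * volume (L); classify the roof as source or sink.
# All numbers below are illustrative, not data from the study.

def mass_load(concentration_mg_per_l: float, volume_l: float) -> float:
    return concentration_mg_per_l * volume_l  # mg per event

precip = {"NO3-": (1.2, 1000.0), "PO4_3-": (0.02, 1000.0)}   # (conc, volume onto roof)
runoff = {"NO3-": (0.4,  600.0), "PO4_3-": (0.30,  600.0)}   # (conc, volume off roof)

for analyte in precip:
    load_in = mass_load(*precip[analyte])
    load_out = mass_load(*runoff[analyte])
    role = "sink" if load_out < load_in else "source"
    print(f"{analyte}: in={load_in:.0f} mg, out={load_out:.0f} mg -> {role}")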
Abstract:
Monitoring of herbaceous plants on the Ottawa National Forest (ONF) is used to understand the impact of forest management on understory composition and site conditions. In their planning, national forests are required to take into account management impacts on diversity and ecosystem health. The effect of management on understory species is dependent on various factors, including the intensity of disturbance and the biology of the plant. In the first study in this report, a population of Carex assiniboinensis, a Michigan state threatened species, was monitored for seven seasons including before logging commenced, in order to determine the sedge’s response to a single-tree selection harvest. Analyses provided insights for management of C. assiniboinensis at the stand level over the short-term. In the second study in this report, the use of the cutleaf toothwort (Cardamine concatenata) as a Management Indicator Species on the ONF was reviewed. Data were analyzed to determine the suitability of using C. concatenata to monitor impacts of forest management on site conditions. The various factors that affect understory species population dynamics illuminated the challenges of using indicator species to monitor site conditions. Insights from the study provide a greater understanding of management impacts on understory species across the Ottawa National Forest.
Abstract:
The article deals with designing the product-service combination from the standpoint of information integration. The author explains fundamental differences between the traditional and the modern operations-management concept. In addition, the role of logistic support analysis is considered. The article presents the concept of CALS (Continuous Acquisition and Life Cycle Support) as an environment that enables data sharing between the business partners involved in the development process.
Abstract:
The contribution focuses on the development, deployment and use of innovative technologies to support educational scenarios in schools, higher education and continuing education. Starting from the different phases of Corporate Learning, Social Learning, Mobile Learning and Intelligent Learning, a first section examines how technologies are used by children, adolescents and (young) adults in school, university study and teaching. This is followed by an account of technological developments based on the Technology Life Cycle, and of the consequences that the different stages of development and maturity of technologies (such as content and learning management, social networks, mobile devices, multidimensional and multimodal spaces, up to augmented-reality applications and the Internet of Things, Services and Data) have for their deployment and use in educational scenarios. After presenting requirements for digital technologies with respect to content, didactics and methodology, for instance concerning the creation of content, its reuse, digitization and findability, as well as standards, methodological guidance is given on using digital technologies for interaction among learners and among teachers, social interaction, collaborative authoring, commenting, evaluation and review. Finally, findings on framework conditions, influencing factors, inhibiting and supporting factors, and challenges in the introduction and sustainable implementation of digital technologies in school teaching, in higher-education teaching and study, and in continuing education are summarized, differentiated for schools and higher education.
Abstract:
IT has turned out to be a key factor for the purposes of gaining maturity in Business Process Management (BPM). This book presents a worldwide investigation that was conducted among companies from the ‘Forbes Global 2000’ list to explore the current usage of software throughout the BPM life cycle and to identify the companies’ requirements concerning process modelling. The responses from 130 companies indicate that, at the present time, it is mainly software for process description and analysis that is required, while process execution is supported by general software such as databases, ERP systems and office tools. The resulting complex system landscapes give rise to distinct requirements for BPM software, while the process modelling requirements can be equally satisfied by the most common languages (BPMN, UML, EPC).
Abstract:
Currently, dramatic changes are happening in the IS development industry. The incumbent system developers (hubs) are embracing partnerships with less well established companies (spokes), acting in specific niches. This paper seeks to establish a better understanding of the motives for this strategy. Relying on existing work on strategic alliance formation, it is argued that partnering is particularly attractive, if these small companies possess certain capabilities that are difficult to obtain through other arrangements than partnering. Again drawing on the literature, three categories of capabilities are identified: the capability to innovate within their niche, the capability to provide a specific functionality that can be integrated with the incumbents’ systems, and the capability to address novel markets. These factors are analyzed through a case study. The case represents a market leader in the global IS development industry, which fosters a network of smaller partner firms. The study reveals that temporal dynamics between the identified factors are playing a dominant role in these networks. A cyclical partnership model is developed that attempts to explain the life cycle of a partnership within such a network.
Abstract:
Content Distribution Networks are mandatory components of modern web architectures, with plenty of vendors offering their services. Despite the maturity of the field, new paradigms and architecture models are still being developed in this area. Cloud Computing, on the other hand, is a more recent concept which has expanded extremely quickly, with new services being regularly added to cloud management software suites such as OpenStack. The main contribution of this paper is the architecture and the development of an open source CDN that can be provisioned in an on-demand, pay-as-you-go model, thereby enabling the CDN as a Service paradigm. We describe our experience with the integration of the CDNaaS framework in a cloud environment, as a service for enterprise users. We emphasize the flexibility and elasticity of such a model, with each CDN instance being delivered on demand and associated with personalized caching policies as well as an optimized choice of Points of Presence based on the exact requirements of an enterprise customer. Our development is based on the framework developed in the Mobile Cloud Networking EU FP7 project, which offers its enterprise users a common framework to instantiate and control services. CDNaaS is one of the core support components in this project, as it is tasked with delivering different types of multimedia content to several thousand geographically distributed users. It integrates seamlessly into the MCN service life cycle and as such enjoys all the benefits of a common design environment, allowing for improved interoperability with the rest of the services within the MCN ecosystem.
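To make the on-demand, per-tenant idea concrete, here is a small hypothetical sketch of what a CDN instance specification and a naive Point-of-Presence selection might look like. It is not the MCN or OpenStack interface; all names, fields and data are invented for illustration.

# Hypothetical CDNaaS instance description and naive PoP selection (illustration only;
# not the actual MCN / OpenStack interfaces).
from dataclasses import dataclass

@dataclass
class CdnInstanceSpec:
    tenant: str
    cache_ttl_seconds: int = 3600                       # personalized caching policy
    regions_required: tuple = ("eu-west", "eu-central")
    max_pops: int = 3

AVAILABLE_POPS = {                                      # region -> (PoP name, capacity in Gbps), invented data
    "eu-west":    [("pop-dub", 40), ("pop-par", 20)],
    "eu-central": [("pop-fra", 60)],
    "us-east":    [("pop-nyc", 80)],
}

def select_pops(spec: CdnInstanceSpec) -> list:
    """Greedy choice: in each required region pick the highest-capacity PoP, up to max_pops."""
    chosen = []
    for region in spec.regions_required:
        pops = sorted(AVAILABLE_POPS.get(region, []), key=lambda p: p[1], reverse=True)
        if pops:
            chosen.append(pops[0][0])
    return chosen[: spec.max_pops]

spec = CdnInstanceSpec(tenant="enterprise-42", cache_ttl_seconds=600)
print(select_pops(spec))  # e.g. ['pop-dub', 'pop-fra']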
Abstract:
Transaction costs, one often hears, are the economic equivalent of friction in physical systems. Like physicists, economists can sometimes neglect friction in formulating theories; but like engineers, they can never neglect friction in studying how the system actually does (let alone should) work. Interestingly, however, the present-day economics of organization also ignores friction. That is, almost single-mindedly, the literature analyzes transactions from the point of view of misaligned incentives and (especially) transaction-specific assets. The costs involved are certainly costs of running the economic system in some sense, but they are not obviously frictions. Stories about frictions in trade are not nearly as intriguing as stories about guileful trading partners and expensive assets placed at risk. But I will argue that these seemingly dull categories of cost, what Baldwin and Clark (2003) call mundane transaction costs, actually have a secret life. They are at least as important as, and quite probably far more important than, the more glamorous costs of asset specificity in explaining the partition between firm and market. These costs also have a secret life in another sense: they have a secret life cycle. I will argue that these mundane transaction costs provide much better material for helping us understand how the boundaries among firms, markets, and hybrid forms change over time.
Abstract:
There is a long tradition of river monitoring using macroinvertebrate communities to assess environmental quality in Europe. A promising alternative is the use of species life-history traits. Both methods, however, have relied on the time-consuming identification of taxa. River biotopes, 1-100 m² 'habitats' with associated species assemblages, have long been seen as a useful and meaningful way of linking the ecology of macroinvertebrates and river hydro-morphology and can be used to assess hydro-morphological degradation in rivers. Taxonomic differences, however, between different rivers had prevented a general test of this concept until now. The species trait approach may overcome this obstacle across broad geographical areas, using biotopes as the hydro-morphological units which have characteristic species trait assemblages. We collected macroinvertebrate data from 512 discrete patches, comprising 13 river biotopes, from seven rivers in England and Wales. The aim was to test whether river biotopes were better predictors of macroinvertebrate trait profiles than taxonomic composition (genera, families, orders) in rivers, independently of the phylogenetic effects and catchment scale characteristics (i.e. hydrology, geography and land cover). We also tested whether species richness and diversity were better related to biotopes than to rivers. River biotopes explained 40% of the variance in macroinvertebrate trait profiles across the rivers, largely independently of catchment characteristics. There was a strong phylogenetic signature, however. River biotopes were about 50% better at predicting macroinvertebrate trait profiles than taxonomic composition across rivers, no matter which taxonomic resolution was used. River biotopes were better than river identity at explaining the variability in taxonomic richness and diversity (40% and ≤10%, respectively). Detailed trait-biotope associations agreed with independent a priori predictions relating trait categories to near river bed flows. Hence, species traits provided a much needed mechanistic understanding and predictive ability across a broad geographical area. We show that integration of the multiple biological trait approach with river biotopes at the interface between ecology and hydro-morphology provides a wealth of new information and potential applications for river science and management.
Abstract:
Surface-initiated cracking of asphalt pavements constitutes one of the most frequent and important types of distress occurring in flexible bituminous pavements, as the theoretical and experimental studies of the past decade have clearly demonstrated. However, this failure mechanism has not been taken into consideration by traditional methods of flexible pavement design. The concept of long-lasting pavements is based on adequate monitoring of the depth and extent of these deteriorations and on intervention at the most appropriate moment, so as to contain them in the surface layer in the form of easily accessible and repairable partial-depth top-down cracks, thereby prolonging the durability and serviceability of the pavement and reducing the overall cost of its life cycle. Therefore, to select the optimal maintenance strategy for such pavements, it is essential to have methodologies that enable precise on-site identification, monitoring and control of top-down cracks and that also permit a reliable, high-performance determination of their extent and depth. This PhD thesis presents the results of systematic laboratory and in situ research carried out to obtain data on top-down cracking in asphalt pavements and to study procedures for evaluating the depth of this type of crack using ultrasonic techniques. These results demonstrate that the proposed non-destructive methodology, which is fast, low-cost and easy to implement (and has so far been used mainly on concrete and metal structures, because of the difficulties introduced by the viscoelastic nature of bituminous materials), can be applied with sufficient reliability and repeatability to asphalt pavements. The measurements are also independent of the total pavement thickness. Furthermore, the method resolves some of the common drawbacks of other techniques used to diagnose pavement cracking, such as core extraction (a destructive, expensive procedure that requires prolonged traffic interruptions) and other non-destructive techniques, such as those based on deflection measurements or ground-penetrating radar, which are not sufficiently precise for investigating surface cracks. Several test campaigns were performed on laboratory specimens covering different empirical conditions, such as various types of hot bituminous mixtures (AC, SMA and PA), different pavement thicknesses and interlayer adhesions, temperatures, surface textures, filling materials and water inside the cracks, different sensor positions, and a wide range of possible crack depths. The methods employed are based on taking several measurements of ultrasonic pulse velocity or transmission time on a single accessible face or surface of the material, so that a signal transmission coefficient can be obtained (relative or self-compensated readings). The measurements were taken at low excitation frequencies with two different ultrasonic devices: one equipped with dry point contact (DPC) transducers and the other with flat contact (CPC) transducers coupled through a specially selected material. This made it possible to overcome some of the traditional drawbacks of conventional transducers and to avoid any prior preparation of the surfaces. The self-calibration technique eliminates systematic errors and the need for prior local calibration, demonstrating the potential of this technology. The experimental results were compared with simplified theoretical models that simulate ultrasonic wave propagation in cracked bituminous materials; these models had previously been derived analytically and allowed the correct interpretation of the empirical data. The models were then calibrated against the laboratory results, and generalized mathematical expressions and charts were provided for routine use in practical applications. The proposed models were evaluated through on-site ultrasonic test campaigns accompanied by the extraction of pavement cores. Over all the tests performed, the maximum average relative error in the estimated crack depth did not exceed 13%, with a confidence level of 95%. The in situ verification of the models made it possible to establish the criteria and recommendations needed for their use on in-service pavements. The experience gained allows this methodology to be integrated among the auscultation techniques used in pavement maintenance management.
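A minimal sketch of the kind of self-compensated measurement described above is given below. It assumes a simple exponential calibration curve linking the transmission coefficient to crack depth; the actual calibrated expressions are those derived in the thesis, and the functional form and constants here are invented purely for illustration.

# Self-compensated ultrasonic transmission coefficient and a toy crack-depth inversion.
# The exponential calibration form and its constants are assumptions for illustration only.
import math

def transmission_coefficient(t_reference_us: float, t_across_crack_us: float) -> float:
    """Ratio of pulse transit times measured on the same surface: reference path (no crack)
    over the path crossing the crack. Being a ratio, device offsets cancel out."""
    return t_reference_us / t_across_crack_us

def crack_depth_mm(coeff: float, d0_mm: float = 40.0) -> float:
    """Invert an assumed calibration C = exp(-d / d0); valid only for 0 < C <= 1."""
    return -d0_mm * math.log(coeff)

c = transmission_coefficient(t_reference_us=25.0, t_across_crack_us=34.0)
print(f"transmission coefficient = {c:.2f}, estimated depth ~ {crack_depth_mm(c):.0f} mm")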
Abstract:
This communication, presented at the 12th International Conference on Building Materials and Components, falls within the field of managing and maintaining the assets that constitute the Spanish Cultural Heritage, assets that share an artistic or historical background. Their conservation and maintenance are a social demand necessary for the preservation of public values and require the investment of the corresponding resources. Legal protection entails a number of obligations and rights intended to ensure conservation and heritage protection. The duty of maintenance and upkeep extends beyond the useful life of the property, which must endure above all for its cultural value rather than for its usability. The owner must establish the conditions needed to prevent deterioration and to allow the asset to fulfil its social function, seeking to prolong its life while preserving its physical integrity and its capacity to convey the protected values. This obligation implies a substantial financial effort for the holder of the property, whether a public or a private entity, and raises a problem of economic sustainability. Economic exploitation, aimed at contributing to keeping the asset well maintained, is sometimes the best way to obtain resources. The work includes different lines of research with the following objectives: establishing processes for assessing total costs over the building life cycle (LCC), during the planning stages or when preparing maintenance budgets, in order to determine the most advantageous operating system; and relating the value of the property to its maintenance costs, together with a sensitivity analysis.
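Life-cycle cost (LCC) assessments of this kind usually discount all expected outlays over a planning horizon to present value so that operating alternatives can be compared. The sketch below shows that standard calculation; the cost figures, discount rate and horizon are invented example values, not results from the work described above.

# Present value of life-cycle costs for a heritage building (illustrative numbers only).

def life_cycle_cost(initial_cost: float, annual_costs: dict, discount_rate: float, years: int) -> float:
    """annual_costs: mapping year -> cost incurred that year (maintenance, restoration, operation)."""
    npv = initial_cost
    for year in range(1, years + 1):
        npv += annual_costs.get(year, 0.0) / (1.0 + discount_rate) ** year
    return npv

# Hypothetical scenario: routine maintenance every year, plus a major restoration in year 15.
costs = {year: 20_000.0 for year in range(1, 31)}
costs[15] += 400_000.0

print(f"LCC over 30 years: {life_cycle_cost(0.0, costs, 0.03, 30):,.0f} EUR")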
Abstract:
At the beginning of the 1990s, ontology development resembled an art: ontology developers did not have clear guidelines on how to build ontologies, only some design criteria to be followed. Work on principles, methods and methodologies, together with supporting technologies and languages, turned ontology development into an engineering discipline, the so-called Ontology Engineering. Ontology Engineering refers to the set of activities that concern the ontology development process and the ontology life cycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them. Thanks to the work done in the Ontology Engineering field, the development of ontologies within and between teams has increased and improved, as has the possibility of reusing ontologies in other developments and in final applications. Currently, ontologies are widely used in (a) Knowledge Engineering, Artificial Intelligence and Computer Science, (b) applications related to knowledge management, natural language processing, e-commerce, intelligent information integration, information retrieval, database design and integration, bio-informatics, and education, and (c) the Semantic Web, the Semantic Grid, and the Linked Data initiative. In this paper, we provide an overview of Ontology Engineering, mentioning the most prominent and widely used methodologies, languages, and tools for building ontologies. In addition, we include some words on how all these elements can be used in the Linked Data initiative.
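As a small, concrete illustration of the kind of artefact these methodologies and languages produce, the sketch below builds a two-class ontology fragment with the rdflib Python library and serializes it as Turtle; the namespace and class names are invented for the example.

# Tiny ontology fragment built with rdflib and serialized to Turtle (illustrative names only).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/ontology#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Process, RDF.type, OWL.Class))
g.add((EX.BusinessProcess, RDF.type, OWL.Class))
g.add((EX.BusinessProcess, RDFS.subClassOf, EX.Process))
g.add((EX.BusinessProcess, RDFS.label, Literal("Business process", lang="en")))

print(g.serialize(format="turtle"))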