860 results for Engineering, Civil|Engineering, Industrial|Computer Science
Abstract:
Occupational safety is crucial for the development of the construction industry, given the need to protect workers' health, which is pursued through legislation and production management. Thus, among various other regulations, NR-18 was enacted in order to guarantee workers minimum conditions for carrying out their work. Despite legislative progress on the subject, these rules have proved ineffective against the excessive number of accidents in the construction industry, placing greater responsibility on companies to ensure the health and safety of their workers. In view of this need to improve the working environment in general, both to ensure compliance with the law and to provide comfort for workers and quality for the organization, the Occupational Health and Safety Management System (OHSMS) is a valid tool that demonstrates the evolution of business management, as is OHSAS 18001, which proposes to ensure the efficiency and integration of a system geared to safety and health at work through its implementation and adaptation, in order to bring significant improvements to working conditions, especially in the form of a new culture to be adopted by the company. Addressing this problem, this paper aims to develop a management system based on OHSAS 18001 that is consistent with the terms of NR-18, integrating the OHSMS into the company's management system as a usual practice aimed at improving work safety in the building construction business.
Abstract:
Work based on the report for the course "Sociology of New Information Technologies" within the Integrated Master's in Industrial Engineering and Management at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in 2015-16. The work was supervised by Prof. António Brandão Moniz of the Department of Applied Social Sciences (DCSA) at the same Faculty.
Abstract:
Part 14: Interoperability and Integration
Abstract:
Part 13: Virtual Reality and Simulation
Abstract:
Part 6: Engineering and Implementation of Collaborative Networks
Abstract:
Part 5: Service Orientation in Collaborative Networks
Abstract:
Meso-/microporous zeolites combine the characteristics of the well-defined micropores of zeolites with the efficient mass transfer afforded by mesopores to increase the efficiency of catalysts in reactions involving bulky molecules. Different methods, such as demetallation and templating, have been explored for the synthesis of meso-/microporous zeolites. However, they all have limitations in producing meso-/microporous zeolites with tunable textural and catalytic properties in few synthesis steps. To address this challenge, a simple one-step dual-template synthesis approach has been developed in this work to engineer lamellar meso-/microporous zeolite structures with tunable textural and catalytic properties. First, one-step dual-template synthesis of meso-/microporous mordenite framework inverted (MFI) zeolite structures was investigated. Tetrapropylammonium hydroxide (TPAOH) and a diquaternary ammonium surfactant ([C22H45-N+(CH3)2-C6H12-N+(CH3)2-C6H13]Br2, C22-6-6) were used as templates to produce micropores and mesopores, respectively. Varying the concentration ratios of the dual templates and the hydrothermal synthesis conditions resulted in the production of multi-lamellar MFI and hybrid lamellar-bulk MFI (HLBM) zeolite structures. The relationship between the morphology, porosity, acidity, and catalytic properties of these catalysts was systematically studied. Then, the validity of the proposed synthesis approach for the production of other types of zeolite composites was examined by creating a meso-/microporous bulk polymorph A (BEA)-lamellar MFI (BBLM) composite. The resulting composite samples showed higher catalytic stability than their single-component zeolites. These studies demonstrate the high potential of the one-step dual-template synthesis procedure for engineering the textural and catalytic properties of the synthesized zeolites.
Abstract:
A large percentage of Vanier College's technology students do not attain their College degrees within the scheduled three years of their program. A closer investigation of the problem revealed that in many of these cases the students had completed all of their program's professional courses but not all of the required English and/or Humanities courses. Fortunately, most of these students do extend their stay at the college for the one or more semesters required for graduation, although some choose to go on into the workforce without returning to complete the missing English and/or Humanities courses and without their College degrees. The purpose of this research was to discover whether there was any significant measure of association between a student's family linguistic background, family cultural background, high school average, and/or College English Placement Test results and his or her likelihood of succeeding in his or her English and/or Humanities courses within the scheduled three years of the program. Because of demographic differences between the 'hard' and 'soft' technologies (in student population, more specifically gender ratios and average student ages in specific programs) and program differences (in writing requirements and the types of practical skill activities required), the research was limited to the hard technologies, where students work hands-on with hardware and/or computers and tend to have low overall research and writing requirements, in order to obtain a more uniform sample. Based on a review of current literature and observations made in one of the hard technology programs at Vanier College, eight research questions were developed.
These questions were designed to examine different aspects of success in the English and Humanities courses, such as failure and completion rates and the number of courses remaining after the end of the fifth semester, as well as how the students assessed their ability to communicate in English. The eight research questions were broken down into a total of 54 hypotheses. The high number of hypotheses was required to address a total of seven independent variables: primary home language, high school language of instruction, student's place of birth (Canada, not Canada), student's parents' place of birth (both born in Canada, not both born in Canada), high school average, and English placement level (as determined by the College English Entry Test); and eleven dependent variables: number of English courses completed, number of English courses failed, whether all English courses were completed by the end of the 5th semester (yes/no), number of Humanities courses completed, number of Humanities courses failed, whether all Humanities courses were completed by the end of the 5th semester (yes/no), the total number of English and Humanities courses left, and the students' assessments of their ability to speak, read, and write in English. The data required to address the hypotheses were collected from two sources: the students themselves and the College. Fifth- and sixth-semester students from the Building Engineering Systems, Computer and Digital Systems, Computer Science, and Industrial Electronics Technology programs were surveyed to collect personal information, including family cultural and linguistic history and current language usage, high school language of instruction, perceived fluency in speaking, reading, and writing in English, and perceived difficulty in completing English and Humanities courses.
The College was able to provide current academic information on each of the students, including copies of college program planners and transcripts, and high school transcripts for students who had attended a high school in Quebec. Quantitative analyses were performed on the data using the SPSS statistical analysis program. Of the fifty-four hypotheses analysed, the results supported the research hypotheses in fourteen cases; in the other forty cases the null hypotheses had to be accepted. One of the findings was a strong significant association between a student's primary home language and place of birth and his or her perception of his or her ability to communicate in English (speak, read, and write), signifying that both students whose primary home language was not English and students who were not born in Canada considered themselves, on average, to be weaker in these skills than did students whose primary home language was English. Although this finding was noteworthy, the two most significant findings were the association between a student's English entry placement level and the number of English courses failed, and the association between the parents' place of birth and the student's likelihood of succeeding in both his or her English and Humanities courses. According to the research results, the mean number of English courses failed by students placed in the lowest entry level of College English was significantly different from the number failed by students placed in any of the other entry-level English courses. In this sample, students placed in the lowest entry level of College English failed, on average, at least three times as many English courses as those placed in any of the other English entry-level courses. These results are significant enough that they will be brought to the attention of the appropriate College administration.
The results of this research also appeared to indicate that the most significant determining factor in a student's likelihood of completing his or her English and Humanities courses is his or her parents' place of birth (both born in Canada or not both born in Canada). Students who had at least one parent not born in Canada would, on average, fail a significantly higher number of English courses, be significantly more likely to still have at least one English course left to complete by the end of the 5th semester, fail a significantly higher number of Humanities courses, be significantly more likely to still have at least one Humanities course left to complete by the end of the 5th semester, and have significantly more combined English and Humanities courses left to complete at the end of their 5th semester than students with both parents born in Canada. This strong association between students' parents' place of birth and their likelihood of succeeding in their English and Humanities courses within the three years of their program appears to indicate that acculturation may be a more significant factor than either language or high school average, for which no significant association was found with any of the English- and Humanities-related dependent variables. Although the sample size for this research was only 60 students, and more research needs to be conducted in this area to see whether these results are supported in other groups within the College, these results are still significant. If the College can identify, at admission, the students who are more likely to have difficulty completing their English and Humanities courses, it will have the opportunity to intercede during or before the first semester and offer these students the support they require in order to increase their chances of success in their education, whether that be classes or courses designed to meet their specific needs, special mentoring, tutoring, or other forms of support.
With the necessary support, the identified students will have a greater opportunity of successfully completing their programs within the scheduled three years, while at the same time the College will have improved its capacity to meet the needs of its students.
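The abstract does not specify which SPSS procedures were used, but a chi-square test of independence on a contingency table is a standard way to measure this kind of association between two categorical variables (e.g., parents' place of birth vs. on-time completion). A minimal sketch in plain Python, with entirely hypothetical counts:

```python
# Chi-square test of independence on a 2x2 contingency table.
# Rows: parents' place of birth; columns: completed all courses on time?
# The counts below are hypothetical, for illustration only.
table = [[20, 10],   # both parents born in Canada: [yes, no]
         [5, 25]]    # not both born in Canada:     [yes, no]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
n = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        # Expected count under independence: (row total * column total) / n
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

# For a 2x2 table, df = 1; the 5% critical value is about 3.841.
# A statistic above that threshold suggests a significant association.
print(round(chi2, 3))
```

In practice a library routine (e.g., `scipy.stats.chi2_contingency`) would also apply continuity corrections and return a p-value; the loop above only shows the core computation.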
Abstract:
Organizations and their environments are complex systems. Such systems are difficult to understand and predict. Nevertheless, prediction is a fundamental task for business management and for decision-making, which always involves risk. Classical prediction methods (among them linear regression, the autoregressive moving average, and exponential smoothing) rely on assumptions such as linearity and stability in order to be mathematically and computationally tractable. By various means, however, the limitations of such methods have been demonstrated. In recent decades, new prediction methods have emerged that aim to embrace the complexity of organizational systems and their environments rather than avoid it. Among them, the most promising are bio-inspired prediction methods (e.g., neural networks, genetic/evolutionary algorithms, and artificial immune systems). This article aims to establish a state of the art of current and potential applications of bio-inspired prediction methods in management.
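Of the classical methods the article lists, simple exponential smoothing is the easiest to state: each smoothed value is a weighted average of the latest observation and the previous smoothed value, s_t = α·x_t + (1−α)·s_{t−1}. A minimal sketch (the demand series and α value are illustrative, not from the article):

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing: s_t = alpha*x_t + (1 - alpha)*s_{t-1}."""
    smoothed = [series[0]]  # initialize with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical monthly demand figures; the last smoothed value serves as
# the one-step-ahead forecast.
demand = [100, 110, 105, 120, 115]
s = exponential_smoothing(demand, alpha=0.5)
print(round(s[-1], 2))
```

A higher α weights recent observations more heavily and reacts faster to change; a lower α smooths out noise, which is exactly the stability assumption the article says such methods rely on.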
Abstract:
The multifaceted evolution of network technologies ranges from big data centers to specialized network infrastructures and protocols for mission-critical operations. For instance, technologies such as Software Defined Networking (SDN) revolutionized the world of static network configuration by removing the distributed and proprietary configuration of switched networks and centralizing the control plane. While this disruptive approach is interesting from different points of view, it can introduce new, unforeseen classes of vulnerabilities. One topic of particular interest in recent years is industrial network security, an interest that began to rise in 2016 with the introduction of the Industry 4.0 (I4.0) movement. Networks that were essentially isolated by design are now connected to the internet to collect, archive, and analyze data. While this approach has gained considerable momentum thanks to its predictive maintenance capabilities, these network technologies can be exploited in various ways from a cybersecurity perspective. Some of these technologies lack security measures and can introduce new families of vulnerabilities. On the other hand, these networks can be used to enable accurate monitoring, formal verification, or defenses that were not practical before. This thesis explores both directions: by introducing monitoring, protection, and detection mechanisms where the new network technologies make them feasible, and by demonstrating attacks on practical scenarios related to emerging network infrastructures that are insufficiently protected. The goal of this thesis is to highlight this lack of protection in terms of attacks on, and possible defenses enabled by, emerging technologies. We pursue this goal by analyzing the aforementioned technologies and by presenting three years of contributions to this field. In conclusion, we recapitulate the research questions and give answers to them.
Abstract:
In the framework of industrial problems, Constrained Optimization is known to have very good overall modeling capability and performance, and stands as one of the most powerful, explored, and exploited tools for addressing prescriptive tasks. The number of applications is huge, ranging from logistics to transportation, packing, production, telecommunications, scheduling, and much more. The main reason behind this success is the remarkable effort put in over the last decades by the OR community to develop realistic models and devise exact or approximate methods for solving the largest variety of constrained or combinatorial optimization problems (COPs), together with the spread of computational power and easily accessible OR software and resources. On the other hand, technological advancements have led to a wealth of data never seen before and increasingly push towards methods able to extract useful knowledge from it; among data-driven methods, Machine Learning techniques appear to be among the most promising, thanks to their successes in domains like Image Recognition, Natural Language Processing, and game playing, as well as the amount of research devoted to them. The purpose of the present research is to study how Machine Learning and Constrained Optimization can be used together to build systems able to leverage the strengths of both methods: this would open the way to exploiting decades of research on resolution techniques for COPs while constructing models able to adapt and learn from available data. In the first part of this work, we survey the existing techniques and classify them according to the type, method, or scope of the integration; subsequently, we introduce Moving Target, a novel and general algorithm devised to inject knowledge into learning models through constraints. In the last part of the thesis, two applications stemming from real-world projects carried out in collaboration with Optit are presented.
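The thesis itself is not quoted in enough detail here to reproduce its methods, but one canonical instance of the combinatorial problems it mentions (packing) is the 0/1 knapsack, solvable exactly by dynamic programming. A minimal sketch with an illustrative instance:

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack via dynamic programming over capacities.

    Returns the best total value achievable within the weight capacity,
    with each item used at most once.
    """
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downwards so each item is taken at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Illustrative instance: three items, capacity 50.
print(knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))  # 220
```

Exact methods like this scale poorly as instances grow, which is one motivation the abstract gives for combining OR resolution techniques with data-driven models.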
Abstract:
One of the most visionary goals of Artificial Intelligence is to create a system able to mimic, and eventually surpass, the intelligence observed in biological systems, including, ambitiously, that observed in humans. The main distinctive strength of humans is their ability to build a deep understanding of the world by learning continuously and drawing on their experiences. This ability, which is found in varying degrees in all intelligent biological beings, allows them to adapt and react properly to changes by incrementally expanding and refining their knowledge. Arguably, achieving this ability is one of the main goals of Artificial Intelligence and a cornerstone towards the creation of intelligent artificial agents. Modern Deep Learning approaches have allowed researchers and industries to achieve great advances towards the resolution of many long-standing problems in areas like Computer Vision and Natural Language Processing. However, while this current age of renewed interest in AI has allowed for the creation of extremely useful applications, concerningly limited effort is being directed towards the design of systems able to learn continuously. The biggest problem that hinders an AI system from learning incrementally is the catastrophic forgetting phenomenon. This phenomenon, discovered in the 1990s, naturally occurs in Deep Learning architectures when classic learning paradigms are applied to learning incrementally from a stream of experiences. This dissertation revolves around the Continual Learning field, a sub-field of Machine Learning research that has recently made a comeback following the renewed interest in Deep Learning approaches. This work focuses on a comprehensive view of continual learning, considering algorithmic, benchmarking, and applicative aspects of the field.
This dissertation will also touch on community aspects such as the design and creation of research tools aimed at supporting Continual Learning research, and the theoretical and practical aspects concerning public competitions in this field.
Abstract:
The fourth industrial revolution, also known as Industry 4.0, has rapidly gained traction in businesses across Europe and the world, becoming a central theme in small, medium, and large enterprises alike. This new paradigm shifts the focus from locally based, barely automated firms to a globally interconnected industrial sector, stimulating economic growth and productivity and supporting the upskilling and reskilling of employees. However, despite the maturity and scalability of information and cloud technologies, the support systems already present at the machine level are often outdated and lack the necessary security, access control, and advanced communication capabilities. This dissertation proposes architectures and technologies designed to bridge the gap between Operational and Information Technology in a manner that is non-disruptive, efficient, and scalable. The proposal presents cloud-enabled data-gathering architectures that make use of the newest IT and networking technologies to achieve the desired quality of service and non-functional properties. By harnessing industrial and business data, processes can be optimized even before product sale, while the integrated environment enhances data exchange for post-sale support. The architectures have been tested and have shown encouraging performance results, providing a promising solution for companies looking to embrace Industry 4.0, enhance their operational capabilities, and prepare for the upcoming fifth, human-centric revolution.
Abstract:
In recent years, IoT technology has radically transformed many crucial industrial and service sectors, such as healthcare. The multi-faceted heterogeneity of the devices and of the collected information provides important opportunities to develop innovative systems and services. However, the ubiquitous presence of data silos and the poor semantic interoperability in the IoT landscape constitute a significant obstacle to this goal. Moreover, deriving actionable knowledge from the collected data requires IoT information sources to be analysed using appropriate artificial intelligence techniques such as automated reasoning. In this thesis, Semantic Web technologies have been investigated as an approach to address both the data integration and the reasoning aspects of modern IoT systems. In particular, the contributions presented in this thesis are the following: (1) the IoT Fitness Ontology, an OWL ontology developed to overcome the issue of data silos and enable semantic interoperability in the IoT fitness domain; (2) a Linked Open Data web portal for collecting and sharing IoT health datasets with the research community; (3) a novel methodology for embedding knowledge in rule-defined IoT smart home scenarios; and (4) a knowledge-based IoT home automation system that supports seamless integration of heterogeneous devices and data sources.
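The abstract does not spell out what a rule-defined smart home scenario looks like; as a hedged illustration of the general idea (conditions over heterogeneous sensor readings firing actions), here is a minimal sketch in which the device names, topics, and thresholds are entirely hypothetical:

```python
# Hypothetical rule-based home automation: each rule pairs a condition over
# current sensor readings with an action to fire. Device names, topics, and
# thresholds below are illustrative only, not from the thesis.
readings = {
    "livingroom/temperature": 27.5,  # degrees Celsius
    "livingroom/presence": True,     # occupancy sensor
}

rules = [
    # (condition over readings, action identifier)
    (lambda r: r["livingroom/temperature"] > 26 and r["livingroom/presence"],
     "livingroom/fan:on"),
    (lambda r: r["livingroom/temperature"] < 18,
     "livingroom/heater:on"),
]

# Evaluate every rule against the current readings and collect fired actions.
actions = [action for condition, action in rules if condition(readings)]
print(actions)
```

In an ontology-backed system such as the one described, the conditions would instead be expressed over semantically annotated data (e.g., OWL individuals and SWRL-style rules), so that rules written against concepts rather than device names apply to any conforming device.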
Abstract:
The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics, and electronics are all key areas that depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. It becomes even more complex when dealing with advanced functional materials: their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously being developed to enable new possibilities, in both the experimental and computational realms. Scientists strive to employ cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and by the proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret, and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above.
Specifically, it covers developing features for specific classes of advanced materials and using them to train machine learning models that accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the process of using device simulations to train machine learning models; and dealing with scattered experimental data, using it to discover new patterns.
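The feature-to-property pipeline described above (train a model on computed descriptors, then predict properties for new compounds instead of running a full simulation) can be sketched in its simplest form as a least-squares fit of a single descriptor to a target property. The descriptor and property values below are made up purely for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with a single descriptor."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var            # slope: property change per unit of descriptor
    b = mean_y - a * mean_x  # intercept
    return a, b

# Hypothetical computed molecular descriptor vs. a measured property;
# the numbers are illustrative only.
feature = [1.0, 2.0, 3.0, 4.0]
prop = [2.1, 3.9, 6.2, 7.8]
a, b = fit_line(feature, prop)

# Surrogate prediction for a new compound, cheaper than re-running the
# underlying simulation.
predicted = a * 5.0 + b
print(round(predicted, 2))
```

Real materials-informatics models use many descriptors and nonlinear learners, but the workflow is the same: fit on available data, then query the surrogate in place of the expensive computation.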