911 results for pacs: knowledge engineering tools
Abstract:
In this study we identified key genes that are critical in the development of astrocytic tumors. A meta-analysis of microarray studies comparing normal tissue to astrocytoma revealed a set of 646 genes differentially expressed in the majority of astrocytomas. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grade I–IV), and ‘key genes’ within each grade were identified. The genes found to be most influential in the development of the highest grade of astrocytoma, glioblastoma multiforme, were COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated except MPP2, which was down-regulated. These 10 genes were able to predict tumor status with 96–100% confidence using logistic regression, cross-validation, and support vector machine analysis. The Markov blanket genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone, and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes (EGFR, COL4A1, and CDK4) in particular appear to be potential ‘hubs of activity’. Modified expression of these 10 Markov blanket genes increases the lifetime risk of developing glioblastoma compared to the normal population, and the risk estimates increased dramatically with joint effects of 4 or more Markov blanket genes: joint interaction effects of 4, 5, 6, 7, 8, 9 or 10 Markov blanket genes produced increases of 9, 13, 20.9, 26.7, 52.8, 53.2, 78.1 or 85.9%, respectively, in the lifetime risk of developing glioblastoma relative to the normal population. In summary, it appears that modified expression of several ‘key genes’ may be required for the development of glioblastoma. Further studies are needed to validate these ‘key genes’ as useful tools for early detection and novel therapeutic options for these tumors.
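As a rough illustration of the classification step described above, the sketch below fits logistic regression and a support vector machine to a gene-expression matrix restricted to the 10 listed genes and scores them by cross-validation. The data are random placeholders, not the study's, and scikit-learn is assumed.

```python
# Minimal sketch of tumor-status prediction from the 10 Markov blanket
# genes, assuming scikit-learn; the expression matrix and labels below
# are random placeholders standing in for the study's microarray data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

genes = ["COL4A1", "EGFR", "BTF3", "MPP2", "RAB31",
         "CDK4", "CD99", "ANXA2", "TOP2A", "SERBP1"]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, len(genes)))  # samples x genes expression matrix
y = rng.integers(0, 2, size=100)        # 0 = normal tissue, 1 = tumor

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("linear SVM", SVC(kernel="linear"))]:
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```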
Abstract:
This research aimed to develop a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. The study views an enterprise as a system that creates value for its customers; accordingly, the framework was developed using systems theory and IDEF methodologies. The study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks an enterprise system down into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. The framework is intended to identify research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, harnesses the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic: it consists of a hierarchy of engineering activities presented in an IDEF0 model, with each activity defined by its input, output, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The proposed ESE process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
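The 4 × 4 scheme is easy to picture as a data structure. The sketch below is illustrative only, not part of the original study: it enumerates the element-facet combinations and attaches the engineering process and performance measures named in the abstract.

```python
# Illustrative sketch of the proposed classification scheme: 4 enterprise
# elements x 4 system facets, each cell subject to the same engineering
# process and judged by the same performance measures (all names come
# from the abstract; the dict structure itself is an assumption).
from itertools import product

elements = ["work", "resources", "decision", "information"]
facets   = ["strategy", "competency", "capacity", "structure"]
process  = ["specification", "analysis", "design", "implementation"]
measures = ["cost", "time", "quality", "benefit"]

scheme = {(e, f): {"process": process, "measures": measures}
          for e, f in product(elements, facets)}

print(len(scheme), "element-facet combinations")  # 16 cells
print(scheme[("work", "strategy")]["process"])
```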
Abstract:
In 1979, the Florida State Board of Education approved the teaching of global education in the state of Florida. The purpose of this study was to examine the factors that contributed to teachers' global knowledge, global mindedness, and pedagogy in global education. The Hanvey model of teaching from a global perspective was the theoretical framework for the study. A total of 90 secondary teachers from Miami-Dade County Public Schools were randomly selected and placed in three groups: Globally Oriented Social Studies Program (GOSSE), Non-Globally Oriented Social Studies Program (non-GOSSE), and Teachers Who Teach Other Subjects (TWTOS). Seven teachers, two of whom team-taught a class, were selected for classroom observations and interviews. A mixed-methods design combining quantitative and qualitative data was used. ANOVA and chi-square techniques were used to determine whether the factors that contributed to teachers' global knowledge and global mindedness differed among the groups, and classroom observations and interviews were conducted to determine whether instructional strategies differed among the seven selected teachers. The findings show that teachers trained in teaching from a global perspective differed in their global knowledge and used more appropriate instructional strategies than teachers who were not so trained. There was no significant difference in the combined global knowledge of the non-GOSSE and TWTOS groups when compared with the GOSSE group, nor in the combined global knowledge of the GOSSE and non-GOSSE groups when compared with the TWTOS group. There was no significant difference among the teachers in their global mindedness. Observation and interview data indicate that current events, role-playing, simulations, open-ended discussion, debates, and projects were the predominant instructional strategies used by globally trained teachers, and that cable networks, the Internet, magazines, and newspapers were the dominant tools for teaching global education. The study concluded that teachers trained in globally oriented programs had more global knowledge than teachers who were not. It is recommended that teacher education programs incorporate a global perspective in the preparation of social studies teachers, with particular attention to developing their global attitudes.
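For readers unfamiliar with the group-comparison technique the study relies on, the sketch below runs a one-way ANOVA across three groups with SciPy. The scores are invented placeholders, not the study's data.

```python
# Minimal sketch of the one-way ANOVA used to compare the three teacher
# groups; the global-knowledge scores below are invented placeholders.
from scipy import stats

gosse     = [78, 82, 85, 90, 76]  # globally oriented program
non_gosse = [70, 68, 75, 72, 71]  # non-globally oriented program
twtos     = [65, 69, 70, 66, 72]  # teachers of other subjects

f_stat, p_value = stats.f_oneway(gosse, non_gosse, twtos)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # small p -> group means differ
```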
Abstract:
Motion capture is a primary tool for quantitative motion analysis. Since the nineteenth century, several motion capture systems have been developed for biomechanics studies, animation, games, and movies. Biomechanics and kinesiology involve and depend on knowledge from distinct fields: engineering and the health sciences. Precise human motion analysis requires knowledge from both, so didactic tools and methods are needed to support research, teaching, and learning. The analysis and motion capture devices currently found on the market and in educational institutions present difficulties for didactic practice: they are hard to transport, expensive, and give the user limited freedom in data acquisition. As a result, motion analysis is either performed qualitatively or performed quantitatively only in highly complex laboratories. Motivated by these problems, this work presents the development of a motion capture system for didactic use: a cheap, light, portable, and easy-to-use device with free software. The design includes the selection of the device, the development of software for it, and tests. The developed system uses Microsoft's Kinect for its low cost, low weight, portability, and ease of use, and because it delivers three-dimensional data with a single peripheral. The proposed programs use the hardware to capture motions, store and replay them, process the motion data, and present the data graphically.
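A minimal sketch of such a capture-store-process-plot pipeline is shown below. The read_kinect_frame function is a hypothetical stand-in for the device call (the real acquisition layer depends on the Kinect driver or SDK used), and NumPy/Matplotlib are assumed for processing and plotting.

```python
# Sketch of a capture -> store -> process -> plot pipeline of the kind the
# abstract describes. read_kinect_frame is a hypothetical placeholder for
# the actual Kinect acquisition call; it returns random joint positions.
import json
import numpy as np
import matplotlib.pyplot as plt

JOINTS = ["head", "shoulder_l", "shoulder_r", "hip", "knee_l", "knee_r"]

def read_kinect_frame(rng):
    """Placeholder for the device call: one (x, y, z) triple per joint."""
    return {joint: rng.normal(size=3).tolist() for joint in JOINTS}

rng = np.random.default_rng(1)
frames = [read_kinect_frame(rng) for _ in range(300)]  # ~10 s at 30 fps

with open("capture.json", "w") as f:  # store the capture for later replay
    json.dump(frames, f)

knee = np.array([frame["knee_l"] for frame in frames])  # process one joint
plt.plot(knee[:, 1])  # vertical coordinate over time
plt.xlabel("frame")
plt.ylabel("knee_l y (m)")
plt.show()
```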
Abstract:
Wireless Sensor and Actuator Networks (WSAN) are a key component in ubiquitous computing systems and have many applications in different knowledge domains. Programming such networks is very hard and requires developers to know the specificities of the available sensor platforms, steepening the learning curve for developing WSAN applications. In this work, an MDA (Model-Driven Architecture) approach for WSAN application development called ArchWiSeN is proposed. The goal of the approach is to facilitate the development task by providing: (i) a WSAN domain-specific language; (ii) a methodology for WSAN application development; and (iii) an MDA infrastructure composed of several software artifacts (PIM, PSMs and transformations). ArchWiSeN allows domain experts to contribute directly to WSAN application development without specialized knowledge of WSAN platforms and, at the same time, allows network experts to manage the application requirements without specific knowledge of the application domain. Furthermore, the approach aims to enable developers to express and validate functional and non-functional requirements of the application, incorporate services offered by WSAN middleware platforms, and promote reuse of the developed software artifacts. In this sense, this thesis proposes an approach that includes all WSAN development stages for current and emerging scenarios through the proposed MDA infrastructure. The proposal was evaluated by: (i) a proof of concept encompassing three different scenarios, in which the MDA infrastructure was used to describe the WSAN development process following the application engineering process; (ii) a controlled experiment assessing the use of the proposed approach against a traditional method of WSAN application development; (iii) an analysis of ArchWiSeN's support for middleware services, to ensure that WSAN applications using such services can achieve their requirements; and (iv) a systematic analysis of ArchWiSeN in terms of the desired characteristics of an MDA tool, compared with other existing MDA tools for WSAN.
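To make the PIM-to-PSM idea concrete, the sketch below transforms a toy platform-independent sensing model into two platform-specific text skeletons. The miniature "model" and both transformations are illustrative inventions, not the actual ArchWiSeN metamodel or generators.

```python
# Illustrative model-to-text sketch of the MDA idea behind ArchWiSeN: one
# platform-independent model (PIM), several platform-specific outputs.
# The model fields and both target "platforms" are hypothetical.
pim = {"app": "TempMonitor", "sensor": "temperature",
       "period_s": 60, "sink": "gateway-1"}

def to_tinyos_skeleton(m):  # hypothetical transformation #1
    return (f"// {m['app']} (TinyOS-style skeleton)\n"
            f"read {m['sensor']} every {m['period_s']} s -> send to {m['sink']}")

def to_contiki_skeleton(m):  # hypothetical transformation #2
    return (f"/* {m['app']} (Contiki-style skeleton) */\n"
            f"timer = {m['period_s']} s; sample = {m['sensor']}; dest = {m['sink']}")

for transform in (to_tinyos_skeleton, to_contiki_skeleton):
    print(transform(pim), end="\n\n")  # same PIM, two platform-specific views
```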
Abstract:
The integration of architectural design and structural systems is, in academic education, one of the main challenges for architectural design teaching. Recent studies point to the use of computational tools in academic settings as an important strategy for such integration. Although in recent years teaching experiences using BIM (Building Information Modeling) have been incorporated by architecture schools, further didactic and pedagogical practices that promote the integrated teaching of architectural design and structure are still needed. This paper analyzes experiences developed at UFRN and UFPB, seeking to identify the tools, processes, and products used, and pointing out limitations and potentials in the subjects taught at these institutions. The research begins with a literature review on BIM teaching and on aspects related to the integration of architectural design and structure. Data collection techniques in the studio included direct observation, questionnaires, and interviews with students and teachers, with a mixed qualitative and quantitative method of analysis. At UFRN, the Integrated Workshop, a compulsory subject in the curriculum, favors the integration of the disciplines studied here, since it brings teachers from different disciplines into the same project studio. Regarding the use of BIM, the students become initial users, BIM modelers able to extract quantities automatically and speed up production, gaining quality in the products; however, learning the tool and designing in parallel causes some difficulties. At UFPB, the lack of required courses on BIM leaves most students without knowledge of, or confidence in, the tool and its processes; greater efforts by the school to adopt BIM skills and training are therefore needed. There is a particular need to teach BIM as a concept, in order to promote the BIM process and a consequently better use of the tools, avoiding the reduction of the technology to merely a tool. The inclusion of specific subjects with more advanced BIM skills is also considered, through partnerships with engineering degree programs and the promotion of transdisciplinary integration favoring the exchange of different cultures within the academic environment.
Abstract:
The oil industry's need to ensure the safety of its facilities, employees, and the environment, together with the search for maximum efficiency of its facilities, drives it to pursue a high level of excellence at all stages of its production processes in order to obtain the required quality in the final product. Knowing the reliability of equipment, and what it represents for a system, is of fundamental importance for ensuring operational safety. Reliability analysis has been increasingly applied in the oil industry as a tool for predicting faults and undesirable events that can affect business continuity. It is an applied scientific methodology that combines engineering and statistical knowledge to measure and analyze the performance of components, equipment, and systems, in order to ensure that they perform their function without failure, for a period of time and under specified conditions. The results of reliability analyses support decisions about the best maintenance strategy for petrochemical plants. In this work, reliability analysis was applied to a motor-driven centrifugal fan over the period 2010-2014 at the Petrobras Guamaré industrial complex, located in the rural area of the municipality of Guamaré, in the state of Rio Grande do Norte; field data were collected, the equipment history was analyzed, and the behavior of faults and their impacts was observed. The data were processed in the commercial reliability software ReliaSoft BlockSim 9, and the results were compared with a study conducted by experts in the field in order to arrive at the best maintenance strategy for the system studied. The results obtained with the reliability analysis tools made it possible to determine the availability of the motor-fan set and its impact on the safety of the process units in the event of failure. A new maintenance strategy was established to improve the reliability, availability, and maintainability of the motor-driven centrifugal fan and to decrease the likelihood of its failure: a series of actions promoting increased system reliability and a consequent increase in the life cycle of the asset. The strategy thus sets out preventive measures to reduce the probability of failure and mitigating measures aimed at minimizing its consequences.
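As a rough illustration of the kind of calculation involved (the study itself used ReliaSoft BlockSim 9 on real field data), the sketch below fits a two-parameter Weibull model to made-up times between failures with SciPy, then derives reliability at a mission time and a steady-state availability. The failure times and the repair-time figure are assumptions.

```python
# Illustrative reliability calculation, not the study's: fit a 2-parameter
# Weibull to made-up times between failures, then compute R(t) and a
# steady-state availability under an assumed mean time to repair (MTTR).
import math
import numpy as np
from scipy import stats

tbf = np.array([410.0, 520.0, 700.0, 950.0, 1200.0, 1500.0])  # hours (made up)
shape, _, scale = stats.weibull_min.fit(tbf, floc=0)  # beta, (loc=0), eta

t = 500.0                                        # mission time, hours
reliability = math.exp(-((t / scale) ** shape))  # R(t) = exp(-(t/eta)^beta)
mttf = scale * math.gamma(1 + 1 / shape)         # mean time to failure
mttr = 24.0                                      # assumed mean time to repair, h
availability = mttf / (mttf + mttr)              # steady-state availability

print(f"beta = {shape:.2f}, eta = {scale:.0f} h")
print(f"R({t:.0f} h) = {reliability:.2f}, availability = {availability:.3f}")
```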
Abstract:
Advanced therapies for acute and chronic skin wounds are likely to be brought about by coupling our knowledge of regenerative medicine with appropriately tissue-engineered skin substitutes. At present, no artificial skin model completely replicates normal uninjured skin, and current substitutes are usually accompanied by fibrotic reactions that result in the production of a scar. Natural biopolymers such as collagen have been widely investigated as a potential source of biomaterial for skin replacement in tissue engineering. Collagens are the most abundant high-molecular-weight proteins in both invertebrate and vertebrate organisms, including mammals, and play a mainly structural role in connective tissues. For this reason, they have been elected as one of the key biological materials in tissue regeneration approaches such as skin tissue engineering. In addition, industry is constantly searching for new natural sources of collagen and improved methodologies for its production. The most common sources are skin and bone of bovine and porcine origin; however, these carry a high risk of bovine spongiform encephalopathy or transmissible spongiform encephalopathy, as well as of immunogenic responses. On the other hand, the increase in jellyfish populations has led us to consider this marine organism as a potential collagen source for tissue engineering applications. In the present study, novel forms of acid-soluble and pepsin-soluble collagen were extracted from the dried jellyfish Rhopilema hispidum in an effort to obtain an alternative, safer collagen. We studied different methods of collagen purification (tissues and experimental procedures). The best collagen yield was obtained using the pepsin extraction method (34.16 mg collagen/g of tissue). The isolated collagen was characterized by SDS-polyacrylamide gel electrophoresis and circular dichroism spectroscopy.
Abstract:
We present an innovation value chain analysis for a representative sample of new technology-based firms (NTBFs) in the UK. This involves determining which factors lead to the usage of different knowledge sources and the relationships that exist between those sources of knowledge; the effect that each knowledge source has on innovative activity; and how innovation outputs affect the performance of NTBFs. We find that internal (i.e. R&D) and external knowledge sources are complementary for NTBFs, and that supply chain linkages have both a direct and an indirect effect on innovation. NTBFs' skill resources matter throughout the innovation value chain, being positively associated with external knowledge linkages and innovation success, and also having a direct effect on growth independent of the effect on innovation. © 2010 IEEE.
Abstract:
In common with most universities teaching electronic engineering in the UK, Aston University has seen a shift in the profile of its incoming students in recent years. The educational background of students has moved away from traditional A-level maths and science, and if anything this variation is set to increase with the introduction of engineering diplomas. Another major change to the circumstances of undergraduate students is the introduction of tuition fees in 1998, which has increased the likelihood of students working during term time; this may have led students to concentrate on elements of the course that directly provide marks contributing to the degree classification. In the light of these factors, a root-and-branch rethink of the electronic engineering degree programme structures at Aston was required. The factors taken into account during the course revision were:
- changes to the qualifications of incoming students;
- changes to the background and experience of incoming students;
- an increase in overseas students, some with very limited practical experience;
- student focus on work directly leading to marks;
- modular compartmentalisation of knowledge;
- the need for provision of continuous feedback on performance.
We discuss these issues with specific reference to a 40-credit first-year electronic engineering course, detail the new course structure, and evaluate the effectiveness of the changes. The new approach appears to have been successful both educationally and with regard to student satisfaction. The first cohort of students from the new course will graduate in 2010, and results from student surveys relating particularly to project and design work will be presented at the conference. © 2009 K Sugden, D J Webb and R P Reeves.
Abstract:
Human activities represent a significant burden on the global water cycle, with large and increasing demands placed on limited water resources by manufacturing, energy production, and domestic water use. In addition to changing the quantity of available water resources, human activities change water quality by introducing a large and often poorly characterized array of chemical pollutants, which may negatively impact biodiversity in aquatic ecosystems, leading to impairment of valuable ecosystem functions and services. Domestic and industrial wastewaters represent a significant source of pollution to the aquatic environment due to inadequate or incomplete removal of chemicals introduced into waters by human activities. Currently, incomplete chemical characterization of treated wastewaters limits comprehensive risk assessment of this ubiquitous impact on water. In particular, a significant fraction of the organic chemical composition of treated industrial and domestic wastewaters remains uncharacterized at the molecular level. Efforts aimed at reducing the impacts of water pollution on aquatic ecosystems critically require knowledge of the composition of wastewaters in order to develop interventions capable of protecting our precious natural water resources.
The goal of this dissertation was to develop a robust, extensible, and high-throughput framework for the comprehensive characterization of organic micropollutants in wastewaters by high-resolution, accurate-mass mass spectrometry. High-resolution mass spectrometry is the most powerful analytical technique available for assessing the occurrence and fate of organic pollutants in the water cycle; however, significant limitations in data processing, analysis, and interpretation have prevented it from achieving comprehensive characterization of the organic pollutants occurring in natural and built environments. My work aimed to address these challenges by developing automated workflows for the structural characterization of organic pollutants in wastewater and wastewater-impacted environments by high-resolution mass spectrometry, and by applying these methods in combination with novel data handling routines to conduct detailed fate studies of wastewater-derived organic micropollutants in the aquatic environment.
In Chapter 2, chemoinformatic tools were implemented along with novel non-targeted mass spectrometric analytical methods to characterize, map, and explore an environmentally relevant “chemical space” in municipal wastewater. This was accomplished by characterizing the molecular composition of known wastewater-derived organic pollutants and of substances prioritized as potential wastewater contaminants, and then using these databases to evaluate the pollutant-likeness of structures postulated for unknown organic compounds that I detected in wastewater extracts using high-resolution mass spectrometry approaches. Results showed that applying multiple computational mass spectrometric tools to the structural elucidation of unknown organic pollutants arising in wastewaters improved the efficiency and veracity of screening approaches based on high-resolution mass spectrometry. Furthermore, structural similarity searching was essential for prioritizing substances sharing structural features with known organic pollutants or with industrial and consumer chemicals that could enter the environment through use or disposal.
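The structural-similarity step can be pictured with a short fingerprint comparison. The sketch below uses RDKit Morgan fingerprints and Tanimoto similarity, with illustrative SMILES standing in for a postulated unknown and a small database of known pollutants; none of this is the dissertation's actual code or compound set.

```python
# Illustrative similarity screen, assuming RDKit: compare a candidate
# structure against known pollutants via Morgan fingerprints and
# Tanimoto similarity. SMILES strings are stand-ins, not study data.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

known_pollutants = {
    "caffeine":  "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
}
candidate = Chem.MolFromSmiles("CC(C)Cc1ccc(cc1)C(C)C(=O)OC")  # postulated unknown

def fingerprint(mol):
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

cand_fp = fingerprint(candidate)
for name, smiles in known_pollutants.items():
    sim = DataStructs.TanimotoSimilarity(cand_fp,
                                         fingerprint(Chem.MolFromSmiles(smiles)))
    print(f"{name}: Tanimoto = {sim:.2f}")  # high similarity -> prioritize
```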
I then applied this comprehensive methodological and computational non-targeted analysis workflow to micropollutant fate analysis in domestic wastewaters (Chapter 3), surface waters impacted by water reuse activities (Chapter 4), and effluents of wastewater treatment facilities receiving wastewater from oil and gas extraction activities (Chapter 5). In Chapter 3, I showed that chemometric tools aided in the prioritization of non-targeted compounds arising at various stages of conventional wastewater treatment by partitioning high-dimensional data into rational chemical categories based on knowledge of organic chemical fate processes, resulting in the classification of organic micropollutants based on their occurrence and/or removal during treatment. Similarly, in Chapter 4, high-resolution sampling and broad-spectrum targeted and non-targeted chemical analysis were applied to assess the occurrence and fate of organic micropollutants in a water reuse application in which reclaimed wastewater was used to irrigate turf grass. Results showed that the organic micropollutant composition of surface waters receiving runoff from wastewater-irrigated areas appeared to be minimally impacted by wastewater-derived organic micropollutants. Finally, Chapter 5 presents the comprehensive organic chemical composition of oil and gas wastewaters treated for surface water discharge. Concurrent analysis of effluent samples by complementary, broad-spectrum analytical techniques revealed low levels of hydrophobic organic contaminants but elevated concentrations of polymeric surfactants, which may affect the fate and analysis of contaminants of concern in oil and gas wastewaters.
Taken together, my work represents significant progress in the characterization of polar organic chemical pollutants associated with wastewater-impacted environments by high-resolution mass spectrometry. Application of these comprehensive methods to examine micropollutant fate processes in wastewater treatment systems, water reuse environments, and water applications in oil/gas exploration yielded new insights into the factors that influence transport, transformation, and persistence of organic micropollutants in these systems across an unprecedented breadth of chemical space.
Abstract:
In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The model is intended to model, analyze, and improve the supply chain management operations of an enterprise. It provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and provides tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated through a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain that integrates multiple perspectives and provides a systematic procedure for improving a company's supply chain operations.
Abstract:
This thesis presents details of the design and development of novel tools and instruments for scanning tunneling microscopy (STM), and may be considered as a repository for several years' worth of development work. The author presents design goals and implementations for two microscopes. First, a novel Pan-type STM was built that could be operated in an ambient environment as a liquid-phase STM. Unique features of this microscope include a unibody frame, for increased microscope rigidity, a novel slider component with large Z-range, a unique wiring scheme and damping mechanism, and a removable liquid cell. The microscope exhibits a high level of mechanical isolation at the tunnel junction, and operates excellently as an ambient tool. Experiments in liquid are ongoing. Simultaneously, the author worked on designs for a novel low temperature, ultra-high vacuum (LT-UHV) instrument, and these are presented as well. A novel stick-slip vertical coarse approach motor was designed and built. To gauge the performance of the motor, an in situ motion sensing apparatus was implemented, which could measure the step size of the motor to high precision. A new driving circuit for stick-slip inertial motors is also presented, which offers improved performance over our previous driving circuit at a fraction of the cost. The circuit was shown to increase step size performance by 25%. Finally, a horizontal sample stage was implemented in this microscope. The build of this UHV instrument is currently being finalized. In conjunction with the above design projects, the author was involved in a collaborative project characterizing N-heterocyclic carbene (NHC) self-assembled monolayers (SAMs) on Au(111) films. STM was used to characterize Au substrate quality, for both commercial substrates and those manufactured via a unique atomic layer deposition (ALD) process by collaborators. Ambient and UHV STM was then also used to characterize the NHC/Au(111) films themselves, and several key properties of these films are discussed. During this study, the author discovered an unexpected surface contaminant, and details of this are also presented. Finally, two models are presented for the nature of the NHC-Au(111) surface interaction based on the observed film properties, and some preliminary theoretical work by collaborators is presented.