907 results for Computer aided design tool
Resumo:
Traditional engineering design methods are built on Simon's (1969) concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.
Resumo:
Conventional rockmass characterization and analysis methods for geotechnical assessment in mining, civil tunnelling, and other excavations consider only the intact rock properties and the discrete fractures that are present and form blocks within rockmasses. Field logging and classification protocols are based on historically useful but highly simplified design techniques, including direct empirical design and empirical strength assessment for simplified ground reaction and support analysis. As modern underground excavations go deeper into higher-stress environments with complex excavation geometries and associated stress paths, healed structures within initially intact rock blocks, such as sedimentary nodule boundaries and hydrothermal veins, veinlets and stockwork (termed intrablock structure), have an increasing influence on rockmass behaviour and should be included in modern geotechnical design. Because conventional design relies on geotechnical classification methods that predate computer-aided analysis, these complexities are ignored. Given the comparatively complex, sophisticated and powerful numerical simulation and analysis techniques now practically available to the geotechnical engineer, this research is driven by the need for enhanced characterization of intrablock structure for application to numerical methods. Intrablock structure governs stress-driven behaviour at depth and gravity-driven disintegration for large shallow spans, and controls ultimate fragmentation. This research addresses the characterization of intrablock structure and the understanding of its behaviour at laboratory and excavation scales, and presents new methodologies and tools to incorporate intrablock structure into geotechnical design practice.
A new field characterization tool, the Composite Geological Strength Index, is used for outcrop or excavation face evaluation and provides direct input to continuum numerical models with implicit rockmass structure. A brittle overbreak estimation tool for complex rockmasses is developed using field observations. New methods to evaluate geometrical and mechanical properties of intrablock structure are developed. Finally, laboratory direct shear testing protocols for interblock structure are critically evaluated and extended to intrablock structure for the purpose of determining input parameters for numerical models with explicit structure.
Resumo:
In 2017, chronic respiratory diseases accounted for almost four million deaths worldwide. Unfortunately, current treatments for such diseases are not definitive. This unmet medical need forces the scientific community to increase efforts in the identification of new therapeutic solutions. PI3K delta plays a key role in the mechanisms that promote the chronic airway inflammation underlying asthma and COPD. The first part of this project was dedicated to the identification of novel PI3K delta inhibitors. A first SAR expansion of a hit, previously identified by an HTS campaign, was carried out. A library of 43 analogues was synthesised taking advantage of an efficient synthetic approach. This allowed the identification of an improved hit with nanomolar enzymatic potency and moderate selectivity for PI3K delta over the other PI3K isoforms. However, this compound exhibited low potency in cell-based assays, which was related to suboptimal physicochemical and ADME properties. Analysis of the X-ray crystal structure of this compound in human PI3K delta guided a second, tailored SAR expansion that led to improved cellular potency and solubility. The second part of the thesis focused on the rational design and synthesis of new macrocyclic Rho-associated protein kinase (ROCK) inhibitors. Inhibition of these kinases has been associated with vasodilating effects; ROCKs could therefore represent attractive targets for the treatment of pulmonary arterial hypertension (PAH). Known ROCK inhibitors suffer from low selectivity across the kinome, and the design of macrocyclic inhibitors was considered a promising strategy to obtain improved selectivity. Known inhibitors from the literature were evaluated for macrocyclisation opportunities using a knowledge-based approach supported by Computer-Aided Drug Design (CADD).
The identification of a macrocyclic ROCK inhibitor with enzymatic activity in the low micromolar range against ROCK II was a promising result that validated this innovative approach to the design of new ROCK inhibitors.
Resumo:
Recent research trends in computer-aided drug design have shown an increasing interest in advanced approaches able to deal with large amounts of data. This demand arose from the awareness of the complexity of biological systems and from the availability of data provided by high-throughput technologies. As a consequence, drug research has embraced this paradigm shift, exploiting approaches such as those based on networks. Indeed, the process of drug discovery can benefit from the implementation of network-based methods at different steps, from target identification to drug repurposing. From this broad range of opportunities, this thesis focuses on three main topics: (i) chemical space networks (CSNs), which are designed to represent and characterize bioactive compound data sets; (ii) drug-target interaction (DTI) prediction through a network-based algorithm that predicts missing links; and (iii) COVID-19 drug research, explored by implementing COVIDrugNet, a network-based tool for COVID-19-related drugs. The main highlight emerging from this thesis is that network-based approaches are useful methodologies for tackling different issues in drug research. In detail, CSNs are valuable coordinate-free, graphically accessible representations of the structure-activity relationships of bioactive compound data sets, especially for medium-to-large libraries of molecules. DTI prediction through the random walk with restart algorithm on heterogeneous networks can be a helpful method for target identification. COVIDrugNet is an example of the usefulness of network-based approaches for studying drugs related to a specific condition, i.e., COVID-19, and the same ‘systems-based’ approaches can be used for other diseases.
To conclude, network-based tools are proving suitable for many applications in drug research, providing the opportunity to model and analyse diverse drug-related data sets, even large ones, while integrating multi-domain information.
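The random walk with restart mentioned above has a compact formulation: relevance scores propagate from seed nodes (e.g., a drug's known targets) over the network, while a restart probability keeps pulling the walker back to the seeds. The sketch below is a generic illustration, not the thesis's actual implementation; function name and parameters are illustrative:

```python
import numpy as np

def random_walk_with_restart(adj, seeds, restart=0.15, tol=1e-8, max_iter=1000):
    """Score nodes of a graph by a random walk with restart.

    adj: symmetric (n, n) adjacency matrix; seeds: indices of known
    nodes (e.g., a drug's known targets). Returns a stationary
    relevance score per node; high-scoring non-seed nodes are
    candidate missing links.
    """
    n = adj.shape[0]
    col_sums = adj.sum(axis=0).astype(float)
    col_sums[col_sums == 0] = 1.0           # avoid division by zero
    W = adj / col_sums                      # column-stochastic transitions
    p0 = np.zeros(n)
    p0[list(seeds)] = 1.0 / len(seeds)      # restart distribution
    p_next = p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * (W @ p) + restart * p0
        if np.abs(p_next - p).sum() < tol:  # L1 convergence check
            break
        p = p_next
    return p_next
```

On a small path graph 0-1-2-3 seeded at node 0, scores decay with distance from the seed's neighbourhood, so distant nodes rank lowest.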
Resumo:
The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Tolerance-cost optimization thus becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by connecting product tolerances with the associated manufacturing costs. However, despite the growing interest in this topic, the profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element for improving design optimization, enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided Integrated framework for tolerance-cost optimization is proposed that integrates the DfT and DtC approaches and applies them directly to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, achieving both functional targets and cost reduction. From a scientific point of view, the proposed methodology advances the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs.
The case studies proved the methodology suitable for application in the industrial field and identified further areas for improvement and refinement.
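The core trade-off behind tolerance-cost optimization can be illustrated with the classic reciprocal cost model, where tightening a tolerance t raises manufacturing cost as C(t) = A + B/t; under a worst-case stack-up budget, minimising total cost has a closed form. This is a textbook sketch under that assumed cost model, not the framework developed in the thesis; names and coefficients are illustrative:

```python
import math

def allocate_tolerances(B, T_assembly):
    """Minimise total cost sum_i (A_i + B_i / t_i) subject to a
    worst-case stack-up sum_i t_i = T_assembly.

    The Lagrange conditions give t_i proportional to sqrt(B_i),
    so the fixed costs A_i drop out of the allocation.
    """
    roots = [math.sqrt(b) for b in B]
    s = sum(roots)
    return [T_assembly * r / s for r in roots]
```

The feature whose cost rises fastest as its tolerance shrinks (largest B) receives the larger share of the budget; for B = [4, 1] and a 0.3 mm budget the split is 0.2 mm and 0.1 mm.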
Resumo:
Natural products have widespread biological activities, including inhibition of mitochondrial enzyme systems. Some of these activities, for example cytotoxicity, may result from alteration of cellular bioenergetics. Based on previous computer-aided drug design (CADD) studies, and considering reported structure-activity relationship (SAR) data, one assumption regarding the mechanism of action of natural products against parasitic infections involves NADH-oxidase inhibition. In this study, chemometric tools such as Principal Component Analysis (PCA), Consensus PCA (CPCA), and partial least squares (PLS) regression were applied to a set of forty natural compounds acting as NADH-oxidase inhibitors. The calculations were performed using the VolSurf+ program. The formalisms employed generated good exploratory and predictive results, and the independent variables (descriptors) with a hydrophobic profile were strongly correlated with the biological data.
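As an illustration of the exploratory step, PCA on an autoscaled descriptor matrix can be computed directly by SVD. This is a generic chemometrics sketch, not the VolSurf+ implementation; the data shapes are assumptions:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA of a descriptor matrix X (compounds x descriptors).

    Columns are autoscaled (mean-centred, unit variance) before the
    SVD, the usual pre-treatment in chemometrics. Returns the score
    matrix and the fraction of variance explained per component.
    """
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = (S ** 2) / (S ** 2).sum()
    return scores, explained[:n_components]
```

For strongly correlated descriptors, most of the variance collapses onto the first component, which is what makes the score plot a useful exploratory map.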
Resumo:
Tuberculosis (TB) is the primary cause of mortality among infectious diseases. Mycobacterium tuberculosis thymidine monophosphate kinase (TMPKmt) is essential to DNA replication; this enzyme therefore represents a promising target for developing new drugs against TB. In the present study, the receptor-independent (RI) 4D-QSAR method has been used to develop QSAR models, and corresponding 3D-pharmacophores, for a set of 81 thymidine analogues, and two corresponding subsets, reported as inhibitors of TMPKmt. The resulting optimized models are not only statistically significant, with r² ranging from 0.83 to 0.92 and q² from 0.78 to 0.88, but are also robustly predictive based on test set predictions. The most and least potent inhibitors, in their respective postulated active conformations derived from each of the models, were docked in the active site of the TMPKmt crystal structure. There is a solid consistency between the 3D-pharmacophore sites defined by the QSAR models and interactions with binding site residues. Moreover, the QSAR models provide insights regarding a probable mechanism of action of the analogues.
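The q² statistic quoted above is the leave-one-out cross-validated analogue of r²: each compound is predicted by a model trained without it, and the pooled prediction error (PRESS) is compared with the total variance. A minimal sketch with ordinary least squares standing in for the 4D-QSAR model (illustrative only, not the RI-4D-QSAR code):

```python
import numpy as np

def q2_loo(X, y):
    """Leave-one-out cross-validated q^2 for a linear model:
    q2 = 1 - PRESS / sum((y - mean(y))**2).

    X: (n, p) descriptor matrix, y: (n,) activities. An intercept
    column is added before each least-squares fit.
    """
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        Xtr = np.column_stack([np.ones(n - 1), X[mask]])
        beta, *_ = np.linalg.lstsq(Xtr, y[mask], rcond=None)
        pred = np.concatenate(([1.0], X[i])) @ beta
        press += (y[i] - pred) ** 2
    return 1.0 - press / ((y - y.mean()) ** 2).sum()
```

Unlike r², q² penalises overfitting: a model that merely memorises the training compounds predicts the held-out one poorly and its q² drops.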
Resumo:
Purpose of review: To identify and discuss recent research studies that propose innovative psychosocial interventions in old age psychiatry. Recent findings: Studies have shown that cognitive training research for the healthy elderly has advanced in several ways, particularly in the refinement of study design and methodology. Studies have included larger samples and longer training protocols. Interestingly, new research has shown changes in biological markers associated with learning and memory after cognitive training. Among patients with mild cognitive impairment, results have demonstrated that they benefit from interventions, displaying cognitive plasticity. Rehabilitation studies involving dementia patients have suggested the efficacy of combined treatment approaches, and light and music therapies have shown promising effects. For psychiatric disorders, innovations have included improvements in well-known techniques such as cognitive behavior therapy, studies in subpopulations with comorbidities, and the use of new computer-aided resources. Summary: Research evidence on innovative interventions in old age psychiatry suggests that this exciting field is moving forward by means of methodological refinements and the testing of creative new ideas.
Resumo:
This text serves as a sequel to 'Computational and Constructive Design Theory' (c1996), containing research papers and surveys of recent work on design construction and the computer-aided study of designs. Intended for researchers in the theory of computational designs.
Resumo:
The Sciatic Functional Index (SFI) is quite a useful tool for the evaluation of functional recovery of the sciatic nerve of rats in a number of experimental injuries and treatments. Although it is an objective method, it depends on the examiner's ability to adequately recognize and mark the previously established footprint key points, an entirely subjective step that can interfere with the calculations according to the mathematical formulae proposed by different authors. Thus, an interpersonal evaluation of the reproducibility of a computer-aided SFI method was carried out here to study data variability. A severe crush injury was produced on a 5 mm-long segment of the right sciatic nerve of 20 Wistar rats (a 5000 g load applied directly for 10 min), and the SFI was measured by four different examiners (one experienced and three newcomers) preoperatively and at weekly intervals from the 1st to the 8th postoperative week. Three measurements were made for each print, and the average was calculated and used for statistical analysis. The results showed that interpersonal correlation was high (0.82) in the 3rd, 4th, 5th, 7th and 8th weeks, with an unexpected but significant (p < 0.01) drop in the 6th week. There was virtually no interpersonal correlation (correlation index close to 0) in the 1st and 2nd weeks, a period during which the variability between animals and examiners (p = 0.24 and 0.32, respectively) was similar, certainly due to poor definition of the footprints. The authors conclude that the SFI method studied here is only reliable from the 3rd week onwards after a severe lesion of the sciatic nerve of rats. (C) 2008 Elsevier B.V. All rights reserved.
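For reference, the SFI is commonly computed from three footprint measurements compared between the injured (E) and normal (N) sides. The sketch below uses the widely cited Bain-Mackinnon-Hunter weighting; as the abstract notes, different authors propose different formulae, so the variant used in this study may differ:

```python
def sfi(epl, npl, ets, nts, eit, nit):
    """Sciatic Functional Index (Bain-Mackinnon-Hunter variant).

    PL: print length, TS: toe spread (1st-5th toe), IT: intermediary
    toe spread (2nd-4th toe); E = experimental side, N = normal side.
    Values near 0 indicate normal function; near -100, complete loss.
    """
    return (-38.3 * (epl - npl) / npl
            + 109.5 * (ets - nts) / nts
            + 13.3 * (eit - nit) / nit
            - 8.8)
```

A typical injured pattern (longer print, narrower toe spread) drives the index strongly negative, which is why accurate marking of the key points matters so much.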
Resumo:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
Resumo:
The design of electrical installations must guarantee safe conditions for people and equipment. To this end, regulations and standards require the installation of devices that detect and protect against the most common faults in electrical installations, such as overcurrents and overvoltages. Besides being capable of creating dangerous overvoltages in electrical installations, lightning discharges can also cause severe structural damage, which in some economic activities makes the implementation of protection measures against this natural phenomenon essential. Protection against direct lightning strikes consists of identifying the vulnerable points of structures and installing, at those locations, devices that capture, direct and discharge the lightning current safely to earth. The present work, developed within the scope of a Master's dissertation in Electrical Engineering, aims to develop and implement a computational tool, based on computer-aided design (CAD) programs in common use in architectural and engineering design, that allows the analysis and implementation of lightning protection systems for buildings, within the framework of international standards, in a fast and expedient way. Built on a 3D CAD program that allows three-dimensional modelling of the structures to be protected, the tool attempts to identify their vulnerabilities to direct lightning strikes, with the aim of implementing the most appropriate protection measures from a techno-economic point of view.
The tool resulting from this study, the Electrogeometric Model Simulator (SIMODEL), is expected to enable designers, and in particular students of the electrical installation design courses of the Departmental Area of Systems, Power and Automation Engineering (ADESPA) at ISEL, to study and implement lightning protection systems (LPS) based on the international standards of CENELEC and the IEC, namely the 62305 series.
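The electrogeometric model behind such a tool reduces to simple geometry: the last-step striking distance sets a rolling-sphere radius, and rolling that sphere over the 3D model reveals which points an air-termination rod protects. A hedged sketch using the common r = 10·I^0.65 relation and the IEC 62305-1 class radii (function names are illustrative, not part of SIMODEL):

```python
import math

# Rolling-sphere radii (m) per lightning protection level, IEC 62305-1
LPL_RADIUS = {"I": 20.0, "II": 30.0, "III": 45.0, "IV": 60.0}

def striking_distance(peak_current_ka):
    """Electrogeometric model: last-step striking distance in metres
    for a stroke of given peak current (kA), r = 10 * I**0.65."""
    return 10.0 * peak_current_ka ** 0.65

def rod_ground_protection_radius(h, sphere_radius):
    """Horizontal distance at ground level protected by a vertical
    air-termination rod of height h under the rolling-sphere method
    (h is capped at the sphere radius)."""
    h = min(h, sphere_radius)
    return math.sqrt(2.0 * sphere_radius * h - h * h)
```

Smaller sphere radii (higher protection levels) correspond to weaker strokes the system must still intercept, so the protected zone around each rod shrinks.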
Resumo:
A new approach is proposed, based on a methodology assisted by a tool, to create new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is centred on a component-based architecture; and it uses generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy, with some specific characteristics. We propose a repository that keeps related information, such as models, applications, design information, generated artifacts, and even information concerning the development process itself (e.g., generation steps, tests and integration milestones). Generically, this methodology receives the users' requirements for a new product (e.g., functional and non-functional requirements, product specification) as its main inputs and produces a set of artifacts (e.g., design parts, process validation output) as its main outputs, which will be integrated into the engineering design tool (e.g., a CAD system), facilitating the work.
Resumo:
This work addresses the effects of catalyst deactivation and investigates methods to reduce their impact on the performance of reactive distillation columns. The use of variable feed quality and reboil ratio is investigated using a rigorous dynamic model developed in gPROMS and applied to an illustrative example, the olefin metathesis system, wherein 2-pentene reacts to form 2-butene and 3-hexene. Three designs and different strategies for column energy supply to tackle catalyst deactivation are investigated, and the results are compared.
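As a back-of-the-envelope check on the reaction side of this system, the metathesis 2 C5H10 ⇌ C4H8 + C6H12 conserves moles, so equilibrium conversion from pure 2-pentene follows in closed form from the equilibrium constant. The sketch below assumes an ideal mixture and the near-statistical K ≈ 0.25 often quoted for ideal olefin metathesis; it is an illustration only, far simpler than the rigorous gPROMS model in the work:

```python
import math

def metathesis_equilibrium_conversion(K):
    """Equilibrium conversion of 2-pentene for 2 A <-> B + C starting
    from pure A (total moles conserved).

    With extent e per mole of feed: x_A = 1 - 2e, x_B = x_C = e, and
    K = x_B * x_C / x_A**2, giving e = sqrt(K) / (1 + 2*sqrt(K)).
    Conversion of A is 2e.
    """
    s = math.sqrt(K)
    e = s / (1.0 + 2.0 * s)
    return 2.0 * e
```

With K = 0.25 this yields the familiar 50% single-pass equilibrium conversion, which is why reactive distillation, by continuously removing the lighter product, can push conversion beyond this limit.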
Resumo:
This chapter presents some of the issues with holonic manufacturing systems. It starts by presenting the current manufacturing scenario and trends, and then provides background information on the holonic concept and its application to manufacturing. The current limitations and future trends of manufacturing suggest more autonomous and distributed organisations for manufacturing systems; holonic manufacturing systems are proposed as a way to achieve such autonomy and decentralisation. After a brief literature survey, a specific research work is presented to handle scheduling in holonic manufacturing systems. This work is based on task and resource holons that cooperate with each other through a variant of the contract net protocol that allows the propagation of constraints between operations in the execution plan. The chapter ends by presenting some challenges and future opportunities for research.
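The cooperation scheme described, with task holons announcing operations and resource holons bidding, can be sketched as a minimal contract-net round. This is an illustrative toy (class and function names are assumptions) that ignores the constraint-propagation extension the chapter discusses:

```python
from dataclasses import dataclass

@dataclass
class ResourceHolon:
    name: str
    skills: set
    busy_until: float = 0.0  # time at which the resource becomes free

    def bid(self, operation, duration):
        """Answer an announcement with a completion-time offer,
        or None if the resource cannot perform the operation."""
        if operation not in self.skills:
            return None
        return self.busy_until + duration

def award(resources, operation, duration):
    """One contract-net round: announce the operation, collect bids,
    award to the earliest completion time, update that schedule."""
    bids = [(r.bid(operation, duration), r) for r in resources]
    bids = [(t, r) for t, r in bids if t is not None]
    if not bids:
        return None
    t, best = min(bids, key=lambda b: b[0])
    best.busy_until = t
    return best.name, t
```

Repeated rounds naturally load-balance: once the cheapest resource fills up, its bids worsen and awards shift to the alternatives.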