827 results for computational creativity
Abstract:
The present work was carried out with the aim of providing a complete overview of leadership theories, conceiving leadership as a process, and of examining the various ways it is applied in contemporary organizations. The topic is approached from the organizational perspective, an equally complex world, without dismissing its importance in other spheres such as education, politics, or the direction of the state. Its focus follows the academic study of which it is the culmination and is framed within the constitutional perspective of the Colombian Political Charter, which recognizes the capital importance of economic activity and private initiative in the creation of enterprises. The various views of leadership have been applied in different ways in contemporary organizations and have produced diverse results. Today it is not possible to conceive of an organization that has not defined its form of leadership; consequently, a multitude of theories converge in the business field, and it cannot be claimed that any single one of them guarantees adequate management and the fulfilment of an organization's mission objectives. For this reason leadership has come to be conceived as a complex function, in a world where organizations themselves are characterized not only by the complexity of their actions and of their composition, but also because this characteristic belongs to the world of globalization. Organizations, conceived metaphorically as machines, manage to reconstitute their structures as they interact with others in the globalized world. Adapting to changing circumstances makes organizations conglomerates in permanent dynamism and evolution. In this context it can be said that leadership is also complex, and that transformational leadership is the one that comes closest to the sense of complexity.
Abstract:
This thesis introduces a new innovation methodology called IDEAS(R)EVOLUTION, developed within an ongoing experimental research project started in 2007. This new approach to innovation was initially based on Design Thinking for innovation theory and practice. The concept of design thinking for innovation has received much attention in recent years. This innovation approach has spread from the design and designers' knowledge field towards other knowledge areas, mainly business management and marketing. A human-centred approach, radical collaboration, creativity and breakthrough thinking are the main founding principles of Design Thinking that were adopted by those knowledge areas due to their assertiveness and fit with the business context and the evolving complexity of the market. Open Innovation, user-centred innovation and, later on, Living Labs models also emerged as answers to market and consumer pressure and the desire for new products, new services or new business models. Innovation became the principal business management focus and strategic orientation. All these changes also had an impact on marketing theory. It is now possible to have better strategies, communication plans and continuous dialogue systems with the target audience, incorporating their insights and promoting them into the main ambassadors for disseminating our innovations in the market. Drawing upon data from five case studies, the empirical findings in this dissertation suggest that companies need to shift from a Design Thinking for innovation approach to a holistic, multidimensional and integrated innovation system. The innovation context is complex; companies need deeper systems than the success formulas that "commercial" Design Thinking for innovation preaches. They need to learn how to change their organizational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders in their innovation processes, how to measure and create key performance indicators throughout the innovation process to give them better decision-making data, and how to integrate meaning and purpose in their innovation philosophy. Finally, they need to understand that the strategic innovation effort is not a "one-shot" story: it is about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset.
ABSTRACT: Co-creation methodology for a product/brand crossing Marketing, Design Thinking, Creativity and Management - IDEAS(R)EVOLUTION. This dissertation presents a new innovation methodology called IDEAS(R)EVOLUTION, developed within an ongoing experimental research project that began in 2007. This new approach was initially based on the theory and practice of Design Thinking for innovation. The concept of Design Thinking for innovation has now moved beyond the knowledge domain of design and designers and has attracted great interest in other areas such as management and marketing. A person-centred approach, radical collaboration, creativity and disruptive thinking are founding principles of the Design Thinking movement that have been adapted by these new knowledge areas owing to their assertiveness and adaptability to the business context and to the evolution and complexity of the market. Open Innovation models, user-centred innovation and, later, Living Labs also emerged as possible answers to the market and to consumers' pressure and desire for new products, services or business models. Innovation became the main focus and strategic orientation in management. All these changes also had an impact on marketing theory. Today it is possible to create better strategies, communication plans and continuous dialogue systems with the target audience, incorporating their insights and promoting consumers into ambassadors for disseminating companies' innovation in the market. The empirical results of this thesis, built on the information obtained in the five case studies carried out, suggest that companies need to reorient themselves from the Design Thinking for innovation paradigm towards a more holistic, multidimensional and integrated innovation system. The innovation context is complex, so companies need deeper systems and not only the "commercial formulas" that Design Thinking for innovation advocates. Companies need to learn how to change their organizational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders in the innovation process, how to measure the innovation process by creating key performance indicators and obtaining data for more informed decision making, and how to integrate meaning and purpose into their innovation philosophy. Finally, they need to understand that an innovation strategy is not about succeeding once, but about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset.
Abstract:
This paper focuses on the development of computational models and their application to demand response within the smart grid scope. A prosumer model is presented and the solution of the corresponding economic dispatch problem is analyzed. The prosumer's production from solar radiation and its energy consumption are forecast by artificial neural networks. Existing demand response models are studied, a computational tool based on a fuzzy clustering algorithm is developed, and the results are discussed. Consumer energy management applications within the InovGrid pilot project are presented. Computational systems are developed for the acquisition, monitoring, control and supervision of consumption data provided by smart meters, allowing consumer actions to be incorporated into their electrical energy management. An energy management system with integration of smart meters for energy consumers in a smart grid is developed.
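As a rough illustration of the kind of grouping a fuzzy clustering tool can perform on consumption data (a minimal sketch with synthetic profiles and assumed parameters, not the tool developed in this work), the following fuzzy c-means implementation assigns each consumer's daily load profile a degree of membership in every cluster:

```python
# Minimal fuzzy c-means sketch on synthetic daily load profiles (illustrative only;
# the data, cluster count and parameters are assumptions, not the thesis tool).
import numpy as np

def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """X: (n_samples, n_features) load profiles; m: fuzzification exponent (>1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)                    # membership rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted centroids
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = dist ** (-2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)        # renormalize memberships
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

rng = np.random.default_rng(1)
profiles = np.abs(rng.normal(1.0, 0.3, size=(8, 24)))    # 8 consumers, hourly kW over a day
centers, memberships = fuzzy_c_means(profiles)
print(memberships.round(2))                              # degree of membership per cluster
```

Soft memberships are what make fuzzy clustering attractive for demand response: a consumer whose behaviour sits between two typical profiles can be targeted by the programs of both clusters, weighted by its membership degrees.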
Abstract:
Catalysis plays a vital role in modern synthetic chemistry. However, even though conventional catalysis (organo-catalysis, metal-catalysis and enzyme-catalysis) has provided outstanding results, various unconventional ways of making chemical reactions more effective now appear very promising. Computational methods can be of great help in reaching a deeper comprehension of these chemical processes. The methodologies employed in this thesis are Quantum-Mechanical (QM), Molecular Mechanics (MM) and hybrid Quantum-Mechanical/Molecular Mechanics (QM/MM) methods. In this abstract the results are briefly summarised. The first unconventional catalysis investigated consists in the application of Oriented External Electric Fields (OEEFs) to SN2 and 4e-electrocyclic reactions. SN2 reactions proceeding through the back-side mechanism can be catalysed or inhibited by the presence of an OEEF. Moreover, OEEFs can inhibit the back-side mechanism (Walden inversion of configuration) and promote the naturally unfavoured front-side mechanism (retention of configuration). The electrocyclic ring-opening reaction of 3-substituted cyclobutene molecules can occur with inward or outward mechanisms depending on the nature of the substituent groups on the cyclobutene structure (torquoselectivity principle). OEEFs can catalyse the naturally favoured pathway or circumvent the torquoselectivity principle, leading to different stereoisomers. The second case study is based on Carbon Nanotubes (CNTs) working as nano-reactors: the reaction of ethyl chloride with chloride anion inside CNTs was investigated. In addition to the SN2 mechanism, syn- and anti-E2 reactions are possible. These reactions inside CNTs of different radii were examined with hybrid QM/MM methods, finding that these processes can be either catalysed or inhibited depending on the CNT diameter. The results suggest that electrostatic effects govern the variations in activation energy inside CNTs. Finally, a new biochemical approach, based on the use of a DNA catalyst, was investigated at the QM level. Deoxyribozyme 9DB1 catalyses RNA ligation, allowing the regioselective formation of the 3'-5' bond and following an addition-elimination two-step mechanism.
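As general background (a standard first-order relation from the OEEF literature, not a result quoted from this thesis), the catalytic or inhibitory effect of an oriented field is often rationalised through its coupling with the dipole moment developed along the reaction axis:

$$\Delta E^{\ddagger}(\vec{F}) \;\approx\; \Delta E^{\ddagger}(0) \;-\; \left(\vec{\mu}_{\mathrm{TS}} - \vec{\mu}_{\mathrm{R}}\right)\cdot\vec{F}$$

where $\vec{\mu}_{\mathrm{TS}}$ and $\vec{\mu}_{\mathrm{R}}$ are the dipole moments of the transition state and the reactant: a field orientation that stabilises the more polar transition state lowers the barrier, while the reversed orientation raises it, which is the qualitative basis for switching between catalysed and inhibited pathways.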
Abstract:
Prokaryotic organisms are one of the most successful forms of life and are present in all known ecosystems. The enormous diversity of bacteria reflects their ability to colonise every environment. Human beings, too, host trillions of microorganisms in their body sites, including skin, mucosae, and gut. This symbiosis also holds for all other terrestrial and marine animals, as well as plants. The term holobiont refers, with a single word, to the system comprising both the host and its symbiotic microbial species. The coevolution of bacteria within their ecological niches reflects the adaptation of both host and guest species, and it is shaped by complex interactions that are pivotal for determining the host state. Nowadays, thanks to current sequencing technologies (Next Generation Sequencing, NGS), we have unprecedented tools for investigating bacterial life by studying prokaryotic genome sequences. The NGS revolution has been sustained by advancements in computational performance, in terms of speed, storage capacity and algorithm development, and by hardware costs decreasing in line with Moore's Law. Bioinformaticians and computational biologists design and implement ad hoc tools able to analyse high-throughput data and extract valuable biological information. Metagenomics requires the integration of life and computational sciences and is uncovering the vast diversity of the bacterial world. The present thesis work focuses mainly on the analysis of prokaryotic genomes under different aspects. Supervised by two groups at the University of Bologna, the Biocomputing group and the group of Microbial Ecology of Health, I investigated three different topics: i) antimicrobial resistance, particularly with respect to missense point mutations involved in the resistant phenotype; ii) bacterial mechanisms involved in xenobiotic degradation, via the computational analysis of metagenomic samples; and iii) the variation of the human gut microbiota through ageing, in elderly and longevous individuals.
Abstract:
The technology of Organic Light-Emitting Diodes (OLEDs) has reached such a high level of reliability that it can be used in various applications. The required light-emission efficiency can be achieved by transforming triplet excitons into singlet states through Reverse InterSystem Crossing (RISC), which is the main process of a general mechanism called thermally activated delayed fluorescence (TADF). In this thesis, we theoretically analyzed two carbazole-benzonitrile (donor-acceptor) derivatives, 2,5-di(9H-carbazol-9-yl)benzonitrile (p-2CzBN) and 2,3,4,5,6-penta(9H-carbazol-9-yl)benzonitrile (5CzBN), and addressed the problem of how donor-acceptor (D-A) or donor-acceptor-donor (D-A-D) flexible molecular architectures influence the nature of the excited states and the emission intensity. Furthermore, we analyzed the RISC rates as a function of the conformation of the carbazole lateral groups, considering the first electronic states, S0, S1, T1 and T2, involved in the TADF process. The two prototype molecules, p-2CzBN and 5CzBN, have a similar energy gap between the first singlet and triplet states (∆EST, a key parameter in the RISC rate) but different TADF performance. Therefore, other parameters must be considered to explain their different behavior. The oscillator strength of p-2CzBN, never tested as an emitter in OLEDs, is similar to that of 5CzBN, which is an active TADF molecule. We also note that the presence of a second triplet state, T2, lower in energy than S1 only in 5CzBN, and the reorganization energies associated with the RISC processes involving T1 and T2, are important factors in differentiating the rates in p-2CzBN and 5CzBN. For p-2CzBN, the RISC rate from T2 to S1 is surprisingly higher than that from T1 to S1, in disagreement with El-Sayed rules, because of the large reorganization energy associated with the T1 to S1 process, while the contrary occurs for 5CzBN. These insights are important for designing new TADF emitters based on the benzo-carbazole architecture.
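For context, RISC rates in TADF emitters are commonly estimated with a semiclassical Marcus-type expression (a standard form in the TADF literature, given here as background rather than as the exact model used in the thesis), which makes explicit why both $\Delta E_{\mathrm{ST}}$ and the reorganization energy $\lambda$ control the rate:

$$k_{\mathrm{RISC}} \;\approx\; \frac{|H_{\mathrm{SO}}|^{2}}{\hbar}\sqrt{\frac{\pi}{\lambda k_{B}T}}\;\exp\!\left[-\frac{\left(\Delta E_{\mathrm{ST}}+\lambda\right)^{2}}{4\lambda k_{B}T}\right]$$

with $H_{\mathrm{SO}}$ the spin-orbit coupling between the triplet and singlet states; two molecules with similar $\Delta E_{\mathrm{ST}}$ can therefore still show very different rates if their couplings or reorganization energies differ, as argued above for p-2CzBN and 5CzBN.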
Abstract:
Asymmetric organocatalysed reactions are one of the most fascinating synthetic strategies that can be adopted to induce a desired chirality in a reaction product. Of all the possible practical applications of small organic molecules in catalytic reactions, amine-based catalysis has attracted a great deal of attention during the past two decades. The high interest in asymmetric aminocatalytic pathways is due to the huge variety of carbonyl compounds that can be functionalized through many different reactions of their corresponding chiral enamine or iminium ion, acting as activated nucleophile and electrophile, respectively. Starting from the employment of L-Proline, many useful substrates have been proposed to further enhance the catalytic performance of these reactions in terms of enantiomeric excess, yield, substrate conversion and turnover number. In particular, in the last decade the use of chiral and quasi-enantiomeric primary amine species has attracted considerable attention in the field. At the same time, many studies have been carried out to elucidate the mechanism through which these kinds of catalysts induce chirality in the desired products. In this scenario, computational chemistry has played a crucial role, thanks to the possibility of simulating and studying virtually any kind of reaction and the transition state structures involved. In the present work, the transition state geometries of the primary amine-catalysed Michael addition of cyclohexanone to trans-β-nitrostyrene with different organic acid co-catalysts have been studied through different computational techniques, such as density functional theory (DFT) quantum mechanics calculations and force-field-directed molecular simulations.
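As a reminder of how computed transition-state energies translate into selectivity (a textbook Boltzmann estimate, not a value taken from this work), the enantiomeric excess follows from the free-energy difference $\Delta\Delta G^{\ddagger}$ between the competing diastereomeric transition states:

$$ee \;=\; \frac{e^{-\Delta G^{\ddagger}_{R}/RT} - e^{-\Delta G^{\ddagger}_{S}/RT}}{e^{-\Delta G^{\ddagger}_{R}/RT} + e^{-\Delta G^{\ddagger}_{S}/RT}} \;=\; \tanh\!\left(\frac{\Delta\Delta G^{\ddagger}}{2RT}\right), \qquad \Delta\Delta G^{\ddagger} = \Delta G^{\ddagger}_{S} - \Delta G^{\ddagger}_{R}$$

so a $\Delta\Delta G^{\ddagger}$ of about 2 kcal/mol at 298 K already corresponds to roughly 93% ee, which is why small errors in the computed transition-state energies matter so much for predicting selectivity.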
Abstract:
Polymerases and nucleases are enzymes that process DNA and RNA. They are involved in processes crucial for cell life, performing the extension and the cleavage of nucleic acid chains during genome replication and maintenance. Additionally, both enzymes are often associated with several diseases, including cancer. In order to catalyze the reaction, most of them operate via the two-metal-ion mechanism. For this, despite showing relevant differences in structure, function and catalytic properties, they share common catalytic elements, which comprise the two catalytic ions and their first-shell acidic residues. Notably, recent studies of different metalloenzymes revealed the recurrent presence of additional elements surrounding the active site, suggesting an extended two-metal-ion-centered architecture. However, whether these elements have a catalytic function, and what their role is, remains unclear. In this work, using state-of-the-art computational techniques, second- and third-shell elements are shown to act in metallonucleases by favoring substrate positioning and leaving-group release. In particular, in hExo1 a transient third metal ion is recruited and positioned near the two-metal-ion site by a structurally conserved acidic residue to assist the departure of the leaving group. Similarly, in hFEN1 second- and third-shell Arg/Lys residues operate the phosphate steering mechanism through (i) substrate recruitment, (ii) precise cleavage localization, and (iii) leaving-group release. Importantly, structural comparisons of hExo1, hFEN1 and other metallonucleases suggest that similar catalytic mechanisms may be shared by other enzymes. Overall, the results obtained provide an extended vision of the parallel strategies adopted by metalloenzymes, which employ divalent metal ions or positively charged residues to ensure efficient and specific catalysis. Furthermore, these outcomes may have implications for de novo enzyme engineering and/or drug design to modulate nucleic acid processing.
Abstract:
In this thesis we discuss in what ways computational logic (CL) and data science (DS) can jointly contribute to the management of knowledge within the scope of modern and future artificial intelligence (AI), and how technically sound software technologies can be realised along the path. An agent-oriented mindset permeates the whole discussion, stressing the pivotal role of autonomous agents in exploiting both means to reach higher degrees of intelligence. Accordingly, the goals of this thesis are manifold. First, we elicit the analogies and differences between CL and DS, looking for possible synergies and complementarities along four major knowledge-related dimensions, namely representation, acquisition (a.k.a. learning), inference (a.k.a. reasoning), and explanation. In this regard, we propose a conceptual framework through which bridges between these disciplines can be described and designed. We then survey the current state of the art of AI technologies with respect to their capability to support bridging CL and DS in practice. After detecting gaps and opportunities, we propose the notion of logic ecosystem as a new conceptual, architectural, and technological solution supporting the incremental integration of symbolic and sub-symbolic AI. Finally, we discuss how our notion of logic ecosystem can be reified into actual software technology and extended towards many DS-related directions.
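Purely as an illustration of what integrating symbolic and sub-symbolic AI can look like in code (a toy sketch with hypothetical names, not the logic-ecosystem technology developed in the thesis), a learned model can emit symbolic facts that a small rule engine then reasons over:

```python
# Toy bridge between a sub-symbolic scorer and a symbolic rule layer.
# All names, thresholds, and rules are hypothetical, for illustration only.

def learned_risk_score(features):
    """Stand-in for a trained sub-symbolic model (e.g. a classifier)."""
    return 0.9 if features["amount"] > 1000 else 0.2

def forward_chain(facts, rules):
    """Naive forward chaining: apply rules until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, consequence in rules:
            if condition(facts) and consequence not in facts:
                facts.add(consequence)
                changed = True
    return facts

# Sub-symbolic step: turn a numeric prediction into a symbolic fact.
transaction = {"amount": 2500}
facts = {"transaction"}
if learned_risk_score(transaction) > 0.5:
    facts.add("high_risk")

# Symbolic step: knowledge expressed as rules over facts.
rules = [
    (lambda f: "high_risk" in f, "needs_review"),
    (lambda f: "needs_review" in f, "explanation_required"),
]
print(sorted(forward_chain(facts, rules)))
```

The point of such a bridge is that the symbolic side keeps the derivation inspectable, which connects directly to the explanation dimension discussed above.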
Abstract:
Cancer is a challenging disease that involves multiple types of biological interactions across different time and space scales. Computational modelling often faces problems that, at the current technology level, are impracticable to represent in a single space-time continuum. To handle this sort of problem, complex orchestrations of multiscale models are frequently used. PRIMAGE is a large EU project that aims to support personalized childhood cancer diagnosis and prognosis. The goal is to do so by predicting the growth of the solid tumour using multiscale in silico technologies. The project proposes an open cloud-based platform to support decision making in the clinical management of paediatric cancers. The orchestration of predictive models is in general complex and requires a software framework that supports and facilitates this task. The present work proposes the development of an updated framework, referred to herein as VPH-HFv3, as part of the PRIMAGE project. This framework, a complete rewrite with respect to the previous versions, aims to orchestrate several models, which are under concurrent development, using an architecture that is as simple as possible, easy to maintain, and highly reusable. This sort of problem generally requires unfeasible execution times. To overcome this, a strategy was developed based on particularisation, which maps the upper-scale model results onto a smaller number of representative lower-scale inputs, and homogenisation, which performs the inverse mapping; the accuracy of this approach was then analysed.
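To make the particularisation/homogenisation idea concrete (an illustrative interpretation under the assumption that particularisation selects a few representative upper-scale states for the expensive lower-scale model and homogenisation maps the responses back; this is not the VPH-HFv3 code), a minimal sketch could look like this:

```python
# Toy sketch of particularisation/homogenisation between two scales.
# Model, field, and response curve are hypothetical stand-ins.
import numpy as np

def expensive_cell_scale_model(oxygen_level):
    """Stand-in for a costly lower-scale simulation (hypothetical response curve)."""
    return 1.0 / (1.0 + np.exp(-10 * (oxygen_level - 0.5)))   # growth factor

def particularise(upper_scale_field, n_representatives=5):
    """Pick a few representative values (here: quantiles) from the upper-scale field."""
    return np.quantile(upper_scale_field, np.linspace(0, 1, n_representatives))

def homogenise(upper_scale_field, representatives, responses):
    """Map lower-scale responses back onto every upper-scale location by interpolation."""
    return np.interp(upper_scale_field, representatives, responses)

oxygen_field = np.random.default_rng(0).uniform(0.0, 1.0, size=10_000)  # e.g. per-voxel oxygen
reps = particularise(oxygen_field)
responses = np.array([expensive_cell_scale_model(r) for r in reps])      # 5 runs instead of 10,000
growth_field = homogenise(oxygen_field, reps, responses)
print(growth_field.mean())
```

The saving comes from running the expensive lower-scale model a handful of times instead of once per upper-scale location, at the cost of the interpolation error that the accuracy analysis has to quantify.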
Abstract:
The weight-transfer effect, consisting of the change in dynamic load distribution between the front and the rear tractor axles, is one of the most impairing phenomena for the performance, comfort, and safety of agricultural operations. Excessive weight transfer from the front to the rear tractor axle can occur during operation or maneuvering of implements connected to the tractor through the three-point hitch (TPH). In this respect, an optimal design of the TPH can ensure better dynamic load distribution and ultimately improve operational performance, comfort, and safety. In this study, a computational design tool (The Optimizer) for determining a TPH geometry that minimizes the weight-transfer effect is developed. The Optimizer is based on a constrained minimization algorithm. The objective function to be minimized is related to the tractor front-to-rear axle load transfer during a simulated reference maneuver performed with a reference implement on a reference soil. Simulations are based on a 3-degrees-of-freedom (DOF) dynamic model of the tractor-TPH-implement aggregate. The inertial, elastic, and viscous parameters of the dynamic model were successfully determined through a parameter identification algorithm. The geometry determined by the Optimizer complies with the ISO-730 Standard functional requirements and other design requirements. The interaction between the soil and the implement during the simulated reference maneuver was successfully validated against experimental data. Simulation results show that the adopted reference maneuver is effective in triggering the weight-transfer effect, with the front axle load exhibiting a peak-to-peak value of 27.1 kN during the maneuver. A benchmark test was conducted starting from four geometries of a commercially available TPH. As a result, all the configurations were improved by more than 10%. After 36 iterations, the Optimizer was able to find an optimized TPH geometry that reduces the weight-transfer effect by 14.9%.
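A minimal sketch of the kind of constrained minimization the Optimizer performs follows (the objective, design variables, bounds, and toy dynamics are hypothetical stand-ins, not the actual tractor-TPH-implement model or the ISO-730 constraints):

```python
# Hedged sketch: minimize a weight-transfer metric over TPH hitch-point coordinates.
import numpy as np
from scipy.optimize import minimize

def simulate_front_axle_load(geometry):
    """Stand-in for the 3-DOF tractor-TPH-implement simulation: returns a
    front-axle load history (kN) over the reference maneuver (toy dynamics)."""
    t = np.linspace(0.0, 10.0, 500)
    amplitude = 10.0 + 5.0 * np.sum((geometry - 0.5) ** 2)   # hypothetical dependence on geometry
    return 30.0 + amplitude * np.sin(2 * np.pi * 0.4 * t)

def weight_transfer_metric(geometry):
    """Objective: peak-to-peak front-axle load variation during the maneuver."""
    load = simulate_front_axle_load(geometry)
    return load.max() - load.min()

x0 = np.array([0.3, 0.7, 0.4, 0.6])         # initial hitch-point coordinates (m), illustrative
bounds = [(0.1, 1.0)] * len(x0)             # stand-in for geometric/design limits
result = minimize(weight_transfer_metric, x0, method="SLSQP", bounds=bounds)
print(result.x, weight_transfer_metric(result.x))
```

In the real tool the objective would call the identified 3-DOF dynamic model for the reference maneuver, and the constraints would encode the ISO-730 functional requirements and the other design requirements mentioned above.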
Abstract:
Spectral sensors are a wide class of devices that are extremely useful for detecting essential information about the environment and materials with a high degree of selectivity. Recently, they have achieved high degrees of integration and low implementation costs, making them suited to fast, small, and non-invasive monitoring systems. However, the useful information is hidden in the spectra and is difficult to decode, so mathematical algorithms are needed to infer the value of the variables of interest from the acquired data. Among the different families of predictive modeling, Principal Component Analysis and the techniques stemming from it can provide very good performance together with small computational and memory requirements. For these reasons, they allow the prediction to be implemented even in embedded and autonomous devices. In this thesis, I present four practical applications of these algorithms to the prediction of different variables: moisture of soil, moisture of concrete, freshness of anchovies/sardines, and concentration of gases. In all of these cases, the workflow was the same. Initially, an acquisition campaign was performed to acquire both the spectra and the variables of interest from samples. These data were then used as input for the creation of the prediction models, to solve both classification and regression problems. From these models, an array of calibration coefficients was derived and used to implement the prediction in an embedded system. The presented results show that this workflow was successfully applied to very different scientific fields, obtaining autonomous and non-invasive devices able to predict the value of the physical parameters of choice from new spectral acquisitions.
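A hedged sketch of this workflow on synthetic data follows (illustrative variable names, dimensions, and targets, not the thesis code): PCA plus linear regression are fitted on training spectra and then collapsed into a single coefficient array, so the embedded device only needs one dot product per new spectrum.

```python
# Illustrative PCA-based calibration on synthetic spectra (assumed data and sizes).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 256))                                # 120 training spectra, 256 wavelengths
moisture = spectra[:, 40] * 2.0 + rng.normal(scale=0.1, size=120)    # synthetic target variable

pca = PCA(n_components=5).fit(spectra)
scores = pca.transform(spectra)
reg = LinearRegression().fit(scores, moisture)

# Fold PCA + regression into one linear form y = w . x + b (the "calibration coefficients").
w = pca.components_.T @ reg.coef_                                    # (256,) coefficient array
b = reg.intercept_ - pca.mean_ @ w

new_spectrum = rng.normal(size=256)
embedded_prediction = new_spectrum @ w + b                           # what the embedded device computes
full_prediction = reg.predict(pca.transform(new_spectrum[None, :]))[0]
print(np.isclose(embedded_prediction, full_prediction))              # True: same result, tiny footprint
```

Folding the model into a flat coefficient vector is what keeps the memory and computational requirements small enough for embedded and autonomous devices.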
Abstract:
The development of Next Generation Sequencing has pushed Biology into the Big Data era. The ever-increasing gap between proteins with known sequences and those with a complete functional annotation requires computational methods for automatic structural and functional annotation. My research has focused on proteins and has so far led to the development of three novel tools, DeepREx, E-SNPs&GO and ISPRED-SEQ, based on Machine and Deep Learning approaches. DeepREx computes the solvent exposure of residues in a protein chain. This problem is relevant for the definition of structural constraints on the possible folding of the protein. DeepREx exploits Long Short-Term Memory layers to capture residue-level interactions between positions distant in the sequence, achieving state-of-the-art performance. With DeepREx, I conducted a large-scale analysis investigating the relationship between the solvent exposure of a residue and its probability of being pathogenic upon mutation. E-SNPs&GO predicts the pathogenicity of a Single Residue Variation. Variations occurring in a protein sequence can have different effects, possibly leading to the onset of diseases. E-SNPs&GO exploits protein embeddings generated by two novel Protein Language Models (PLMs), as well as a new way of representing functional information coming from the Gene Ontology. The method achieves state-of-the-art performance and is extremely time-efficient compared to traditional approaches. ISPRED-SEQ predicts the presence of Protein-Protein Interaction sites in a protein sequence. Knowing how a protein interacts with other molecules is crucial for accurate functional characterization. ISPRED-SEQ exploits a convolutional layer to parse local context after embedding the protein sequence with two novel PLMs, greatly surpassing the current state of the art. All methods are published in international journals and are available as user-friendly web servers. They have been developed keeping in mind standard guidelines for FAIRness (FAIR: Findable, Accessible, Interoperable, Reusable) and are integrated into the public collection of tools provided by ELIXIR, the European infrastructure for Bioinformatics.
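As a rough illustration of how an LSTM-based model can label every residue in a chain (layer sizes, input features, and labels here are hypothetical; this is not the published DeepREx architecture), consider the following sketch:

```python
# Illustrative per-residue tagger: a bidirectional LSTM over per-residue features,
# producing a buried/exposed label for each position (all dimensions are assumptions).
import torch
import torch.nn as nn

class ResidueExposureTagger(nn.Module):
    def __init__(self, n_features=20, hidden=64, n_classes=2):
        super().__init__()
        # Bidirectional LSTM lets each position see both directions of the sequence.
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)   # per-residue classification head

    def forward(self, x):                # x: (batch, length, n_features)
        out, _ = self.lstm(x)
        return self.head(out)            # (batch, length, n_classes) logits

model = ResidueExposureTagger()
batch = torch.randn(4, 150, 20)          # 4 protein chains, 150 residues, 20-dim features each
logits = model(batch)
labels = logits.argmax(dim=-1)           # predicted label per residue
print(labels.shape)                      # torch.Size([4, 150])
```

A bidirectional recurrent layer is one simple way to let the prediction at each position depend on residues that are distant in the sequence, which is the property emphasised above.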