920 results for INTELLIGENCE SYSTEMS METHODOLOGY


Relevance: 30.00%

Abstract:

Surgical interventions are usually performed in an operating room; however, the medical team's access to information during the intervention is limited. In conversations with the medical staff, we observed that they attach significant importance to improving direct access to information and communication through real-time queries during the process, since current procedures are rather slow and there is a lack of interaction with the systems in the operating room. These systems can be integrated in the Cloud, adding new functionalities to the existing systems in which medical records are processed. Such a communication system therefore needs to be built upon information and interaction access specifically designed and developed to aid medical specialists. Copyright 2014 ACM.

Relevance: 30.00%

Abstract:

Many years have passed since Berners-Lee envisioned the Web as it should be (1999), but many information professionals still do not know their precise role in its development, especially concerning ontologies, considered one of its main elements. Why? Might it still be a lack of understanding between the different academic communities involved (namely, Computer Science, Linguistics, and Library and Information Science), as reported by Soergel (1999)? The idea behind the Semantic Web is that of several technologies working together to achieve optimum information retrieval performance, which is based on proper resource description in a machine-understandable way, by means of metadata and vocabularies (Greenberg, Sutton and Campbell, 2003). This is obviously something that Library and Information Science professionals can do very well, but are we doing enough? When computer scientists put the ontology paradigm on stage, they were asking for semantically richer vocabularies that could support logical inferences in artificial intelligence as a way to improve information retrieval systems. Which direction should vocabulary development take to contribute better to that common goal? The main objective of this paper is twofold: 1) to identify the main trends, issues and problems concerning ontology research, and 2) to identify possible contributions from the Library and Information Science area to the development of ontologies for the Semantic Web. To that end, the paper is structured as follows. First, the methodology followed in the paper is reported, based on a thorough literature review in which the main contributions are analysed. The paper then presents a discussion of the main trends, issues and problems concerning ontology research identified in the literature review. Finally, recommendations of possible contributions from the Library and Information Science area to the development of ontologies for the Semantic Web are presented.
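The "machine-understandable resource description" the abstract refers to can be pictured as metadata triples over a shared vocabulary. A minimal sketch, assuming hypothetical records and identifiers (the predicate names merely echo Dublin Core; none of this is from the paper):

```python
# Minimal sketch of machine-understandable resource description:
# metadata stored as subject-predicate-object triples, as in RDF.
# The documents and values below are hypothetical examples.

triples = {
    ("doc:123", "dc:creator", "Berners-Lee, T."),
    ("doc:123", "dc:subject", "Semantic Web"),
    ("doc:456", "dc:subject", "Ontologies"),
    ("doc:456", "dc:subject", "Semantic Web"),
}

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None = wildcard)."""
    return [
        (s, p, o) for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# All resources described as being about the Semantic Web:
docs = sorted(s for s, _, _ in query(predicate="dc:subject", obj="Semantic Web"))
print(docs)  # ['doc:123', 'doc:456']
```

Pattern matching over such triples is the retrieval primitive that richer ontology languages (and their inference rules) build upon.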

Relevance: 30.00%

Abstract:

Population growth, as well as nutrient mining, has contributed to low agricultural productivity in Sub-Saharan Africa (SSA). A plethora of technologies to boost agricultural production have been developed, but the dissemination of these agricultural innovations and their subsequent uptake by smallholder farmers has remained a challenge. Scientists and philanthropists have adopted the Integrated Soil Fertility Management (ISFM) paradigm as a means to promote sustainable intensification of African farming systems. This comparative study aimed: 1) to assess the efficacy of Agricultural Knowledge and Innovation Systems (AKIS) in East (Kenya) and West (Ghana) Africa in the communication and dissemination of ISFM (Study I); 2) to investigate how soil quality specifically, and socio-economic status and institutional factors more broadly, influence farmer adoption of ISFM (Study II); and 3) to assess the effect of ISFM on maize yield and total household income of smallholder farmers (Study III). To address these aims, a mixed-methodology approach was employed for Study I: AKIS actors were subjected to social network analysis methods and in-depth interviews, and structured questionnaires were administered to 285 farming households in Tamale and 300 households in Kakamega, selected using a stratified random sampling approach. There was a positive relationship between complete ISFM awareness among farmers and weak knowledge ties to both formal and informal actors at both research locations. The Kakamega AKIS also revealed a relationship between complete ISFM awareness among farmers and strong knowledge ties to formal actors, implying that further integration of formal actors with farmers' local knowledge is crucial for agricultural development progress. The structured questionnaire was also utilized for Study II.
Soil samples (0-20 cm depth) were drawn from 322 (Tamale, Ghana) and 459 (Kakamega, Kenya) maize plots and analysed non-destructively for various soil fertility indicators. Ordinal regression modelling was applied to assess the cumulative adoption of ISFM. According to the model estimates, soil carbon seemed to preclude farmers from intensifying input use in Tamale, whereas in Kakamega it spurred complete adoption. This varied response by farmers to soil quality conditions is multifaceted: from the Tamale perspective, it is consistent with farmers' tendency to judiciously allocate scarce resources; from the Kakamega perspective, it points to a need for farmers there to intensify agricultural production in order to foster food security. In Kakamega, farmers with more acidic soils were more likely to adopt ISFM. Other household- and farm-level factors necessary for ISFM adoption included off-farm income, livestock ownership, farmer associations, and market inter-linkages. Finally, in Study III a counterfactual model was used to calculate the difference in outcomes (yield and household income) between adopters and non-adopters in order to estimate the causal effects of ISFM adoption. Adoption of ISFM contributed to a yield increase of 16% in both Tamale and Kakamega. The innovation affected total household income only in Tamale, where ISFM adopters had an income gain of 20%; this may be attributable to the different policy contexts under which the two sets of farmers operate. The main recommendations underscored the need to: (1) improve the functioning of AKIS, (2) enhance farmer access to hybrid maize seed and credit, and (3) conduct additional multi-locational studies, as farmers operate under varying contexts.
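The stratified random sampling design mentioned above can be sketched as follows; the stratum variable, sampling fraction and household records are invented for illustration and do not reproduce the study's actual sampling frame:

```python
import random
from collections import defaultdict

def stratified_sample(households, key, fraction, seed=42):
    """Simple stratified random sample: group units by stratum,
    then draw the same fraction from each group."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for h in households:
        strata[key(h)].append(h)
    sample = []
    for units in strata.values():
        k = max(1, round(fraction * len(units)))
        sample.extend(rng.sample(units, k))
    return sample

# Hypothetical example: 1000 households across 4 sub-locations,
# sampled at 30% per stratum (the survey's actual design may differ).
households = [{"id": i, "sublocation": i % 4} for i in range(1000)]
sample = stratified_sample(households, key=lambda h: h["sublocation"], fraction=0.3)
print(len(sample))  # 300
```

Sampling the same fraction per stratum keeps each sub-location's share of the sample proportional to its share of the population.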

Relevance: 30.00%

Abstract:

This thesis introduces a new innovation methodology called IDEAS(R)EVOLUTION, developed through an ongoing experimental research project started in 2007. The approach was initially based on the theory and practice of Design Thinking for innovation. The concept of Design Thinking for innovation has received much attention in recent years, spreading from the design and designers' knowledge field towards other knowledge areas, mainly business management and marketing. A human-centered approach, radical collaboration, creativity and breakthrough thinking are the main founding principles of Design Thinking that were adopted by those knowledge areas, owing to their assertiveness and fitness to the business context and the evolving complexity of the market. Open Innovation, user-centered innovation and, later on, Living Labs models also emerged as answers to market and consumer pressure and the desire for new products, new services or new business models. Innovation became the principal business management focus and strategic orientation. All these changes also had an impact on marketing theory: it is now possible to have better strategies, communication plans and continuous dialogue systems with the target audience, incorporating their insights and promoting them to the main dissemination ambassadors of a company's innovations in the market. Drawing upon data from five case studies, the empirical findings in this dissertation suggest that companies need to shift from a Design Thinking for innovation approach to a holistic, multidimensional and integrated innovation system. The innovation context is complex; companies need deeper systems than the success formulas that "commercial" Design Thinking for innovation preaches.
They need to learn how to change their organizational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders in their innovation processes, how to measure and create key performance indicators throughout the innovation process to obtain better decision-making data, and how to integrate meaning and purpose into their innovation philosophy. Finally, they need to understand that the strategic innovation effort is not a "one-shot" story: it is about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset.

Relevance: 30.00%

Abstract:

Business Intelligence applied to fleet management systems: Technologies and Methodologies Analysis. The object of study of this master's thesis arose from the need to respond to a proposal for a business intelligence solution requested by a client of the company where, to date, I have been working as a junior analyst-programmer. The project consisted of building a system for event monitoring and analysis of operations: an integrated fleet management system with a business intelligence module. During the course of the project it was necessary to analyse development methodologies and to learn new languages and tools, such as C#, JasperReports, Visual Studio and Microsoft SQL Server, among others.
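As a rough illustration of the kind of query such a fleet BI module would answer (the thesis used C#, JasperReports and SQL Server; this Python sketch with invented event records only conveys the idea):

```python
# Hedged sketch of a fleet BI aggregation: summarising monitored
# events per vehicle. Event names and fields are hypothetical.
from collections import Counter

events = [
    {"vehicle": "V1", "type": "speeding"},
    {"vehicle": "V1", "type": "idle"},
    {"vehicle": "V2", "type": "speeding"},
    {"vehicle": "V1", "type": "speeding"},
]

def events_per_vehicle(events, event_type):
    """Count occurrences of one event type for each vehicle."""
    return Counter(e["vehicle"] for e in events if e["type"] == event_type)

print(events_per_vehicle(events, "speeding"))  # Counter({'V1': 2, 'V2': 1})
```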

Relevance: 30.00%

Abstract:

A link between patterns of pelvic growth and human life history is supported by the finding that, cross-culturally, variation in the maturation rate of the female pelvis is correlated with variation in the ages of menarche and first reproduction; i.e., it is well known that the dimensions of the human pelvic bones depend on sex and vary with age. Indeed, one feature in which humans appear to be unique is the prolonged growth of the pelvis after the age of sexual maturity. Both the total superoinferior length and the mediolateral breadth of the pelvis continue to grow markedly after puberty, and do not reach adult proportions until the late teens. This continuation of growth is accomplished by the relatively late fusion of the separate centers of ossification that form the bones of the pelvis. Hence, in this work we focus on the development of an intelligent decision support system to predict an individual's age based on pelvic dimensions. Basic image processing techniques were applied to extract the relevant features from pelvic X-rays, with the computational framework built on top of a Logic Programming approach to Knowledge Representation and Reasoning that caters for the handling of incomplete, unknown, or even self-contradictory information, complemented with a Case-Based approach to computing.
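The case-based component can be pictured as nearest-neighbour retrieval over stored pelvic measurements. The cases, dimensions and ages below are invented for illustration; the thesis's actual framework additionally handles incomplete and contradictory information through logic programming:

```python
import math

# Toy sketch of the case-based step: predict age from pelvic dimensions
# by retrieving the most similar stored case. All values are invented.
case_base = [
    {"length_cm": 18.0, "breadth_cm": 24.0, "age": 14},
    {"length_cm": 20.5, "breadth_cm": 26.5, "age": 18},
    {"length_cm": 21.0, "breadth_cm": 27.5, "age": 25},
]

def predict_age(length_cm, breadth_cm):
    """Return the age of the nearest case in (length, breadth) space."""
    def distance(case):
        return math.hypot(case["length_cm"] - length_cm,
                          case["breadth_cm"] - breadth_cm)
    return min(case_base, key=distance)["age"]

print(predict_age(20.0, 26.0))  # 18
```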

Relevance: 30.00%

Abstract:

In this thesis, a series of numerical models for the evaluation of the seasonal performance of reversible air-to-water heat pump systems coupled to residential and non-residential buildings is presented. Exploiting the energy-saving potential linked to the adoption of heat pumps is a hard task for designers, because several factors influence their energy performance: the variability of the external climate, the heat pump's modulation capacity, the system control strategy and the hydronic loop configuration. The aim of this work is to study all these aspects in detail. In the first part of this thesis, a series of models that use a temperature-class approach for the prediction of the seasonal performance of reversible air-source heat pumps is presented. An innovative methodology for the calculation of the seasonal performance of an air-to-water heat pump is proposed as an extension of the procedure reported in the European standard EN 14825. This methodology can be applied not only to air-to-water single-stage heat pumps (On-off HPs) but also to multi-stage (MSHPs) and inverter-driven units (IDHPs). In the second part, dynamic simulation is used with the aim of optimizing the control systems of the heat pump and of the HVAC plant. A series of dynamic models, developed by means of TRNSYS, is presented to study the behavior of On-off HPs, MSHPs and IDHPs. The main goal of these dynamic simulations is to show the influence of the heat pump control strategies, and of the layout of the hydronic loop used to couple the heat pump to the emitters, on the seasonal performance of the system. A particular focus is given to the modeling of the energy losses linked to on-off cycling.
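The temperature-class (bin) approach behind EN 14825 can be sketched as a weighted ratio of heat delivered to electricity drawn across outdoor-temperature bins. The bin hours, loads and COPs below are illustrative assumptions, not values from the standard or the thesis:

```python
# Bin-method sketch of a seasonal COP: weight the building load in each
# outdoor-temperature bin by its hours, then divide total heat delivered
# by total electricity consumed. All numbers are illustrative.

bins = [
    # (outdoor T [degC], hours, building load [kW], heat-pump COP)
    (-7, 100, 8.0, 2.1),
    (2, 500, 5.0, 3.0),
    (7, 900, 3.0, 3.9),
    (12, 600, 1.5, 4.8),
]

heat = sum(hours * load for _, hours, load, _ in bins)                  # kWh delivered
electricity = sum(hours * load / cop for _, hours, load, cop in bins)   # kWh drawn
scop = heat / electricity
print(round(scop, 2))  # seasonal COP over the heating season
```

The full EN 14825 procedure additionally prescribes reference climates, part-load conditions and degradation coefficients; the snippet only shows the weighting principle.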

Relevance: 30.00%

Abstract:

Despite its great potential as a low-to-medium-temperature waste heat recovery (WHR) solution, the ORC technology presents open challenges that still prevent its diffusion in the market, and these differ depending on the application and the size at stake. In the micro power range with low-temperature heat sources, ORC technology is still not mature, owing to the lack of appropriate machines and working fluids. In the medium-to-large size range, the technology is already available, but the investment is still risky. The intention of this thesis is to address some of the topical themes in the ORC field, paying special attention to the development of reliable models based on realistic data and accounting for the off-design performance of the ORC system and of each of its components. Concerning the "Micro-generation" application, this work: i) explores the modelling methodology, the performance and the optimal parameters of reciprocating piston expanders; ii) investigates the performance of such an expander and of the whole micro-ORC system when using hydrofluorocarbons as working fluids, or their new low-GWP alternatives and mixtures; iii) analyzes the innovative reversible ORC architecture (conceived for energy storage), its optimal regulation strategy and its potential when inserted in typical small industrial frameworks. Regarding the "Industrial WHR" sector, this thesis examines the WHR opportunity of ORCs, with a focus on the natural gas compressor station application. This work provides information about the parameters that can influence the optimal sizing, the performance and thus the feasibility of installing an ORC system. New WHR configurations are explored: i) one relying on the replacement of a compressor prime mover with an ORC; ii) another consisting in the use of a supercritical CO2 cycle as the heat recovery system.
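A back-of-the-envelope version of the WHR sizing logic discussed above: estimating the power an ORC might recover from a compressor-station exhaust stream. All figures (mass flow, temperatures, assumed cycle efficiency) are illustrative assumptions, not the thesis's data:

```python
# First-cut WHR estimate: recoverable heat from an exhaust stream,
# converted at an assumed ORC thermal efficiency. Illustrative only.

m_dot = 50.0                  # exhaust mass flow [kg/s] (assumed)
cp = 1.1                      # exhaust specific heat [kJ/(kg K)] (assumed)
t_in, t_out = 450.0, 180.0    # exhaust in/out temperatures [degC] (assumed)
eta_orc = 0.18                # assumed ORC thermal efficiency [-]

q_recovered = m_dot * cp * (t_in - t_out)   # recovered heat [kW]
w_net = eta_orc * q_recovered               # ORC net power [kW]
print(round(q_recovered), round(w_net))  # 14850 2673
```

A detailed model, as developed in the thesis, would replace the fixed efficiency with component-level off-design behaviour.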

Relevance: 30.00%

Abstract:

Embedding intelligence in extreme edge devices allows distilling raw sensor data into actionable information directly on IoT end-nodes. This computing paradigm, in which end-nodes no longer depend entirely on the Cloud, offers undeniable benefits and drives a large research area (TinyML) aiming to deploy leading Machine Learning (ML) algorithms on microcontroller-class devices. To fit the limited memory storage capacity of these tiny platforms, full-precision Deep Neural Networks (DNNs) are compressed by representing their data in byte and sub-byte integer formats, yielding Quantized Neural Networks (QNNs). However, the current generation of microcontroller systems can barely cope with the computing requirements of QNNs. This thesis tackles the challenge from many perspectives, presenting solutions at both the software and hardware levels, exploiting parallelism, heterogeneity and software programmability to guarantee high flexibility and high energy-performance proportionality. The first contribution, PULP-NN, is an optimized software computing library for QNN inference on parallel ultra-low-power (PULP) clusters of RISC-V processors, showing one order of magnitude improvement in performance and energy efficiency compared to current state-of-the-art (SoA) STM32 microcontroller units (MCUs) based on ARM Cortex-M cores. The second contribution is XpulpNN, a set of RISC-V domain-specific instruction set architecture (ISA) extensions to deal with sub-byte integer arithmetic. The solution, including the ISA extensions and the micro-architecture to support them, achieves energy efficiency comparable with dedicated DNN accelerators and surpasses the efficiency of SoA ARM Cortex-M-based MCUs, such as the low-end STM32F4 and the high-end STM32H7 devices, by up to three orders of magnitude.
To overcome the Von Neumann bottleneck while guaranteeing the highest flexibility, the final contribution integrates an Analog In-Memory Computing accelerator into the PULP cluster, creating a fully programmable heterogeneous fabric that demonstrates end-to-end inference capabilities of SoA MobileNetV2 models, showing two orders of magnitude performance improvements over current SoA analog/digital solutions.
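The core integer arithmetic that libraries like PULP-NN accelerate can be illustrated by an int8 dot product with a wide accumulator followed by requantisation; the scale, zero-point and operand values here are invented for illustration, not taken from the thesis:

```python
# Minimal sketch of QNN integer arithmetic: int8 x int8 products
# accumulated in int32, then requantised and saturated back to int8.
# Scale and zero-point are illustrative.

def quantized_dot(a, b, scale=0.05, zero_point=0):
    """int8 x int8 -> int32 accumulate -> requantise to int8."""
    acc = sum(x * y for x, y in zip(a, b))   # wide (int32) accumulator
    q = round(acc * scale) + zero_point      # requantise to output scale
    return max(-128, min(127, q))            # saturate to int8 range

activations = [50, -20, 30, 10]
weights = [3, 2, 1, 4]
print(quantized_dot(activations, weights))  # 9
```

On real hardware the multiply-accumulate runs as packed SIMD operations; the Python version only shows the numerics.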

Relevance: 30.00%

Abstract:

Aedes albopictus is a vector able to transmit several arboviruses. Given its high impact on human health, it is important to develop an efficient control strategy for this pest. Nowadays, control based on chemical insecticides is limited by the number of available active principles and the occurrence of resistance. A valuable alternative to conventional control strategies is the sterile insect technique (SIT), which relies on releasing sterile males of the target insect: mating between wild females and sterile males results in no viable offspring. A crucial aspect of SIT is the production of a large number of sterile males with a low presence of females, which can bite and transmit viruses. The present thesis aimed to find, implement and study the most reliable mechanical sex sorter and protocol to improve male productivity and reduce female contamination. In addition, I evaluated different variables and sorting protocols to enable female recovery for breeding purposes. Furthermore, I studied the creation of a hyper-protandric strain potentially able to produce only males, and assessed the integration of artificial intelligence with an optical unit to identify sexes at the adult stage. All these applications helped to realise a mass-production model in Italy with a potential weekly production of 1 million males. Moreover, I studied and applied aerial sterile-male release in an urban environment. This technology could allow the release of males over a wide area, overcoming environmental and urban obstacles; however, the development and application of drone technologies in a metropolitan area close to airports, such as the Bologna area, must meet specific requirements. Lastly, on Réunion Island (Indian Ocean, France), during a Short Term Scientific Mission (AIM-COST Action), I studied the Boosted SIT application, in which coating sterile males with pyriproxyfen may help spread the insecticide into larval breeding sites.

Relevance: 30.00%

Abstract:

This work deals with the development of calibration procedures and control systems to improve the performance and efficiency of modern spark-ignition turbocharged engines. The algorithms developed are used to optimize and manage the spark advance and the air-to-fuel ratio, in order to control knock and the exhaust gas temperature at the turbine inlet. The described work falls within the activity that the research group started in previous years with the industrial partner Ferrari S.p.A. The first chapter deals with the development of a control-oriented engine simulator based on a neural network approach, with which the main combustion indexes can be simulated. The second chapter deals with the development of a procedure to calibrate, offline, the spark advance and the air-to-fuel ratio to run the engine under knock-limited conditions and with the maximum admissible exhaust gas temperature at the turbine inlet. This procedure is then converted into a model-based control system and validated with a Software-in-the-Loop approach using the engine simulator developed in the first chapter. Finally, it is implemented in rapid-control-prototyping hardware to manage the combustion in steady-state and transient operating conditions at the test bench. The third chapter deals with the study of an innovative and cheap sensor for in-cylinder pressure measurement: a piezoelectric washer that can be installed between the spark plug and the engine head. The signal generated by this kind of sensor is studied, and a specific algorithm is developed to adjust the value of the knock index in real time. Finally, with the engine simulator developed in the first chapter, it is demonstrated that the innovative sensor can be coupled with the control system described in the second chapter and that the performance obtained could match that reachable with standard in-cylinder pressure sensors.
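A minimal sketch of a knock-limited spark-advance strategy of the general kind described above (fast retard when the knock index exceeds its threshold, slow re-advance otherwise); the gains, threshold and knock readings are invented, not the thesis's calibration values:

```python
# Hedged sketch of knock-limited spark-advance control: retard ignition
# quickly on knock, then creep back toward the calibrated maximum.
# All gains and thresholds are illustrative assumptions.

def update_spark_advance(sa_deg, knock_index, threshold=1.0,
                         retard_step=2.0, advance_step=0.25, sa_max=30.0):
    """Return the next spark-advance value [deg BTDC]."""
    if knock_index > threshold:
        return sa_deg - retard_step            # fast retard on knock
    return min(sa_max, sa_deg + advance_step)  # slow re-advance

sa = 28.0
for knock in [0.2, 0.3, 1.4, 0.5, 0.4]:        # simulated knock readings
    sa = update_spark_advance(sa, knock)
print(sa)  # 27.0
```

The thesis's model-based controller replaces such fixed steps with predictions from the neural-network engine simulator; the snippet only conveys the retard/advance asymmetry.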

Relevance: 30.00%

Abstract:

This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices, and the explainability of AI systems. It consists of two main parts. In the initial section, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today, and evaluate the GDPR's data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU's High-Level Expert Group on AI (AI HLEG) is examined in detail, and the ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. We then list the main big data challenges identified by European researchers and institutions and provide a literature review of the technical and organizational measures to address them. A quantitative analysis is conducted on the identified big data challenges and the corresponding measures, which leads to practical recommendations for better data processing and AI practices in the EU. In the subsequent part, we concentrate on the explainability of AI systems. We clarify the terminology, list the goals of making AI systems explainable, identify the reasons for the explainability-accuracy trade-off, and discuss how it can be addressed. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to the explainability problem: the GDPR's right-to-explanation framework and safeguards are analyzed in depth, together with their contribution to the realization of Trustworthy AI. Finally, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, for developing GDPR-compliant and Trustworthy XAI systems.

Relevance: 30.00%

Abstract:

The Internet of Vehicles (IoV) paradigm has emerged in recent times, whereby, with the support of technologies like the Internet of Things and V2X, Vehicular Users (VUs) can access different services through internet connectivity. With the support of 6G technology, the IoV paradigm will evolve further and converge into a fully connected and intelligent vehicular system. However, this brings new challenges for dynamic and resource-constrained vehicular systems, and advanced solutions are demanded. This dissertation analyzes the demands of future 6G-enabled IoV systems and the corresponding challenges, and provides various solutions to address them. Vehicular services and application requests demand proper data processing solutions supported by distributed computing environments such as Vehicular Edge Computing (VEC). When analyzing the performance of VEC systems, it is important to take the limited resources, the coverage and the vehicular mobility into account. Recently, Non-Terrestrial Networks (NTN) have gained huge popularity for boosting the coverage and capacity of terrestrial wireless networks; integrating such NTN facilities into the terrestrial VEC system can address the above-mentioned challenges. Additionally, such integrated Terrestrial and Non-Terrestrial Networks (T-NTN) can also provide advanced intelligent solutions with the support of the edge intelligence paradigm. In this dissertation, we propose an edge-computing-enabled joint T-NTN-based vehicular system architecture to serve VUs. We then analyze the terrestrial VEC system's performance on VUs' data processing problems and propose solutions to improve the performance in terms of latency and energy costs. Next, we extend the scenario to the joint T-NTN system and address the problem of distributed data processing through ML-based solutions.
We also propose advanced distributed learning frameworks supported by a joint T-NTN framework with edge computing facilities. Finally, concluding remarks and several future directions are provided for the proposed solutions.
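The latency/energy trade-off analysed for VEC offloading can be sketched as a weighted cost comparison between local and edge execution; all parameters (CPU frequencies, data rate, powers, cost weight) are illustrative assumptions rather than the dissertation's models:

```python
# Toy offloading decision for a vehicular user: execute a task locally,
# or ship it to an edge server when the combined latency/energy cost of
# transmission plus remote execution is lower. Parameters are assumed.

def should_offload(cycles, data_bits, f_local=1e9, f_edge=10e9,
                   rate_bps=20e6, p_tx=0.5, p_local=0.9, w_latency=0.7):
    local_t = cycles / f_local                 # local execution time [s]
    local_e = p_local * local_t                # local energy [J]
    edge_t = data_bits / rate_bps + cycles / f_edge   # uplink + remote exec
    edge_e = p_tx * (data_bits / rate_bps)     # VU pays only for transmission

    def cost(t, e):
        return w_latency * t + (1 - w_latency) * e

    return cost(edge_t, edge_e) < cost(local_t, local_e)

print(should_offload(cycles=5e8, data_bits=1e6))  # heavy task, small payload
print(should_offload(cycles=1e6, data_bits=5e7))  # light task, large payload
```

Compute-heavy tasks with small payloads favour offloading; chatty, lightweight tasks stay local, which is the qualitative behaviour the dissertation's optimisation formalises.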

Relevance: 30.00%

Abstract:

In highly urbanized coastal lowlands, effective site characterization is crucial for assessing seismic risk. It requires a comprehensive stratigraphic analysis of the shallow subsurface, coupled with the precise assessment of the geophysical properties of buried deposits. In this context, late Quaternary paleovalley systems, shallowly buried fluvial incisions formed during the Late Pleistocene sea-level fall and filled during the Holocene sea-level rise, are crucial for understanding seismic amplification due to their soft sediment infill and sharp lithologic contrasts. In this research, we conducted high-resolution stratigraphic analyses of two regions, the Pescara and Manfredonia areas along the Adriatic coastline of Italy, to delineate the geometries and facies architecture of two paleovalley systems. Furthermore, we carried out geophysical investigations to characterize the study areas and perform seismic response analyses. We tested the microtremor-based horizontal-to-vertical spectral ratio as a mapping tool to reconstruct the buried paleovalley geometries. We evaluated the relationship between geological and geophysical data and identified the stratigraphic surfaces responsible for the observed resonances. To perform seismic response analysis of the Pescara paleovalley system, we integrated the stratigraphic framework with microtremor and shear wave velocity measurements. The seismic response analysis highlights strong seismic amplifications in frequency ranges that can interact with a wide variety of building types. Additionally, we explored the applicability of artificial intelligence in performing facies analysis from borehole images. We used a robust dataset of high-resolution digital images from continuous sediment cores of Holocene age to outline a novel, deep-learning-based approach for performing automatic semantic segmentation directly on core images, leveraging the power of convolutional neural networks. 
We propose an automated model to rapidly characterize sediment cores, reproducing the sedimentologist's interpretation, and providing guidance for stratigraphic correlation and subsurface reconstructions.
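The microtremor horizontal-to-vertical spectral ratio (HVSR) used above as a mapping tool can be sketched as the ratio of horizontal to vertical amplitude spectra; the synthetic signals and the naive DFT below are purely illustrative, since real processing adds windowing, smoothing and averaging over many time windows:

```python
import math

# Minimal HVSR sketch: amplitude spectra of the two horizontal
# components, combined geometrically and divided by the vertical
# spectrum. Signals are synthetic; all parameters are illustrative.

def amplitude_spectrum(x):
    """Magnitude of the DFT of a real signal (naive O(n^2) version)."""
    n = len(x)
    return [
        abs(sum(x[t] * complex(math.cos(2 * math.pi * k * t / n),
                               -math.sin(2 * math.pi * k * t / n))
                for t in range(n)))
        for k in range(n // 2)
    ]

def hvsr(north, east, vertical):
    hn, he, hv = map(amplitude_spectrum, (north, east, vertical))
    return [math.sqrt(a * b) / max(v, 1e-12) for a, b, v in zip(hn, he, hv)]

# Synthetic test: horizontals resonate at frequency bin 8, vertical is weak.
n = 64
north = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
east = list(north)
vertical = [0.2 * math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
ratio = hvsr(north, east, vertical)
print(max(range(len(ratio)), key=ratio.__getitem__))  # peak at bin 8
```

The frequency of the HVSR peak is what gets mapped across the paleovalley: softer, thicker infill shifts the resonance to lower frequency.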