822 results for Information technology and communication (ITC)
Abstract:
Collective intelligence is an interdisciplinary subject that has been explored in many different knowledge areas. Because the concept is closely tied to information and to information and communication technologies, discussing it within the scope of Information Science is considered relevant. A descriptive and exploratory study was therefore carried out based on Pierre Lévy's work, identifying the precepts of collective intelligence and its environments and implications. The research is documentary, focused on establishing the state of the art of the literature on collective intelligence and on examining what Pierre Lévy and other authors have produced on the subject, in order to point out possible contributions of Information Science to studies of collective intelligence. The research showed that, within the field of Information Science, there is little theoretical research on collective intelligence. Nevertheless, discussions about the representation and organization of collective intelligence in digital environments have become recurrent, thus opening new points of contact between Information Science and conceptual and practical research on collective intelligence.
Abstract:
Organizational environments are related to the hierarchical levels of a given organization, and they influence the origin of formal and informal flows and their monitoring and/or extinction. Informational environments result from organizational environments and have information and knowledge as their focus. Information flows are a fundamental element of informational environments: there is no informational environment without information flows. Information flows are natural reflections of their environments, both in their content and in the way they occur. This qualitative and quantitative research was developed in three stages in order to allow an understanding of the phenomena related to information and knowledge environments and to the information flows that occur in the meat sector of the Province of Salamanca, Spain. For data analysis we used Laurence Bardin's content analysis, more specifically the categorical analysis technique. As the data collection procedure we carried out field research, applying a questionnaire to an intentional sample of the meat-industry segment of the Province of Salamanca, Spain. From the tabulation and analysis of the data, we infer that information environments and flows are relevant to these companies' business development, and we emphasize the need to implement information and knowledge management in order to ensure the quality of organizational processes, the production of the industrial chain and the companies' competitiveness in conquering potential markets.
Abstract:
The physics of plasmas encompasses basic problems of the universe and holds promise for diverse applications across a wide range of scientific and engineering domains, linked to many fundamental problems, both long-standing and emerging. A substantial part of this domain can be described by reaction–diffusion (R–D) mechanisms involving two or more species, which can further account for the simultaneous non-linear effects of heating, diffusion and other related losses. We note that in laboratory-scale experiments a suitable combination of these processes is of vital importance and largely decides the net behaviour of the plasmas under consideration. Since plasmas are being used in the revolution of information processing, in this technical note we consider a simple framework to discuss and pave the way for better formalisms and informatics, dealing with diverse domains of science and technology. A challenging and fascinating aspect of plasma physics is that it requires a great deal of insight in formulating the relevant design problems, which in turn requires ingenuity and flexibility in choosing a particular set of mathematical (and/or experimental) tools to implement them.
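As an illustration of the kind of model referred to above, a generic two-species reaction–diffusion system with source and loss terms can be written as follows; the coefficients and reaction terms here are placeholders, not the specific formulation adopted in the note:

\[
\frac{\partial n_i}{\partial t} = D_i \nabla^2 n_i + R_i(n_1, n_2, T) - L_i(n_i, T), \qquad i = 1, 2,
\]

where \(n_i\) are the species densities, \(D_i\) their diffusion coefficients, \(R_i\) the reaction (source) terms, \(L_i\) the loss terms, and \(T\) a temperature that would be governed by a coupled heating equation.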
Abstract:
Graduate program in Psychology - FCLAS
Abstract:
Graduate program in Psychology - FCLAS
Abstract:
OBJECTIVE: Due to their toxicity, diesel emissions have been subjected to progressively more restrictive regulations in developed countries. However, in Brazil, the implementation of the Cleaner Diesel Technologies policy (Euro IV standards for vehicles produced in 2009 and low-sulfur diesel with 50 ppm of sulfur) was postponed until 2012 without a comprehensive analysis of the effect of this delay on public health parameters. We aimed to evaluate the impact of the delay in implementing the Cleaner Diesel Technologies policy on health indicators and monetary health costs in Brazil. METHODS: The primary estimator of exposure to air pollution was the concentration of ambient fine particulate matter (particles with aerodynamic diameters <2.5 µm [PM2.5]). This parameter was measured daily in six Brazilian metropolitan areas during 2007-2008. We calculated 1) the projected reduction in PM2.5 that would have been achieved if the Euro IV standards had been implemented in 2009 and 2) the expected reduction after implementation in 2012. The difference between these two time curves was transformed into health outcomes using previously published dose-response curves. The economic valuation was performed using the DALY (disability-adjusted life years) method. RESULTS: The delay in implementing the Cleaner Diesel Technologies policy will result in an estimated excess of 13,984 deaths up to 2040. Health expenditures are projected to increase by nearly US$ 11.5 billion over the same period. CONCLUSIONS: The present results indicate that a significant health burden will occur because of the postponement in implementing the Cleaner Diesel Technologies policy. These results also reinforce the concept that health effects must be considered when revising fuel and emission policies.
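A common way to convert such an exposure difference into attributable deaths, given here only in generic form (the specific dose-response curves and baseline rates used in the study are not reproduced in the abstract), is the log-linear concentration-response function

\[
\Delta \mathrm{Deaths} = y_0 \left(1 - e^{-\beta\, \Delta \mathrm{PM}_{2.5}}\right) \mathrm{Pop},
\]

where \(y_0\) is the baseline mortality rate, \(\beta\) the concentration-response coefficient, \(\Delta \mathrm{PM}_{2.5}\) the difference between the two exposure curves, and Pop the exposed population; the monetary estimate then follows from multiplying the resulting DALYs by a value per DALY.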
Abstract:
The Grupo de Estudos e Pesquisas de Tecnologia da Informacao nos Processos de Trabalho em Enfermagem (Study and Research Group for Information Technology in the Nursing Working Processes, GEPETE) has the purpose of producing and socializing knowledge on information technology in health and nursing communication, building links with research groups in this field and promoting student participation. This study was performed by the group tutors with the objective of reporting on the development of the virtual learning environment (VLE) and on the tutors' experience as mediators of a research group using the Moodle platform. To this end, a VLE was developed and pedagogical mediation was carried out around the theme of mentoring. An initial diagnosis was made of the difficulties in using this technology for interaction and communication, which led to the proposal of continuing to use the platform as a resource to support research activities, offering lead researchers mechanisms to socialize their projects, and providing the possibility of advising at a distance.
Abstract:
The amount of information exchanged per unit of time between two nodes in a dynamical network or between two data sets is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (not fully deterministic), and to calculate its upper and lower bounds without having to calculate probabilities, but rather in terms of well-known and well-defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
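For reference, the standard definitions underlying this discussion, written here in generic form (the paper's actual bounds, expressed through dynamical-systems quantities, are not reproduced here), are

\[
I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}, \qquad
\mathrm{MIR} = \lim_{T \to \infty} \frac{I_T(X;Y)}{T},
\]

where \(p\) denotes the probabilities of the significant events mentioned above and \(I_T\) is the mutual information between trajectory (or time-series) segments of length \(T\).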
Abstract:
Master's degree in Intelligent Systems and Numerical Applications in Engineering (SIANI)
Abstract:
Green information technology refers to a new line of research focused on ecological, or "green", technologies aimed at respecting the environment. At first one might ask what the real motivations are for studying green technologies in the information technology sector: are computers really that polluting? Don't cars, industries, airplanes and landfills have a greater polluting impact on the environment? Certainly they do, but the polluting footprint of the IT sector should not be underestimated: according to a survey conducted in 2007 by the US research firm Gartner, IT systems are among the major sources of CO2 and other greenhouse-gas emissions, accounting for 2% of the planet's total emissions and equalling the pollution rate of the aviation sector. The enormous number of computers spread across the world absorbs large amounts of electricity, and the power plants that supply them emit tonnes of carbon dioxide, polluting the atmosphere. This thesis aims to highlight the environmental impact of the sector by verifying, through the analysis of social and environmental reports, which measures have been adopted by the leaders of the IT industry. The research sets out to show that the largest IT multinationals are aware of the pollution they produce, yet do not adopt enough solutions to limit emissions, setting futile future targets.
Abstract:
Synchronization is a key issue in any communication system, but it becomes fundamental in navigation systems, which are entirely based on estimating the time delay of the signals coming from the satellites. Thus, even though synchronization has been a well-known topic for many years, the introduction of new modulations and new physical-layer techniques in modern standards makes traditional synchronization strategies completely ineffective. For this reason, the design of advanced and innovative synchronization techniques for modern communication systems, such as DVB-SH, DVB-T2, DVB-RCS, WiMAX and LTE, and for modern navigation systems such as Galileo, has been the topic of this activity. Recent years have seen the consolidation of two different trends: the introduction of Orthogonal Frequency Division Multiplexing (OFDM) in communication systems, and of the Binary Offset Carrier (BOC) modulation in modern Global Navigation Satellite Systems (GNSS). Particular attention has therefore been given to the investigation of synchronization algorithms in these areas.
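For context, the two signal models mentioned above take the following standard textbook forms (generic expressions, not the specific parameterisations studied in this work): an OFDM symbol over N subcarriers and a BOC spreading waveform,

\[
s_{\mathrm{OFDM}}(t) = \sum_{k=0}^{N-1} X_k\, e^{\,j 2\pi k \Delta f\, t}, \quad 0 \le t < T_s,\ \Delta f = \tfrac{1}{T_s};
\qquad
s_{\mathrm{BOC}}(t) = c(t)\,\operatorname{sign}\!\left[\sin(2\pi f_{sc} t)\right],
\]

where \(X_k\) are the data symbols, \(T_s\) the useful symbol duration, \(c(t)\) the spreading (PRN) code and \(f_{sc}\) the subcarrier frequency; in both cases synchronization amounts to estimating the timing (and frequency) offsets of these waveforms at the receiver.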
Abstract:
Modern embedded systems embrace many-core shared-memory designs. Owing to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmers' responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before the parallel potential of the platform can be turned into actual performance. To support this, programming languages have been proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de facto standard for shared-memory programming. In the first part, the cost of algorithms for synchronization and data partitioning is analyzed, and these algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. The second part of the thesis focuses on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders of magnitude of speedup and energy efficiency compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, together with a template for a scalable architecture that integrates them through the shared-memory system. A full software stack and toolchain are then developed to support platform design and to let programmers exploit the accelerators of the platform, and the OpenMP frontend is extended to interact with it.
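As a minimal illustration of the multi-level (nested) parallelism that such a runtime has to support efficiently, consider the following generic OpenMP sketch in C; it is an illustrative example, not code taken from the dissertation or its runtime library.

#include <stdio.h>
#include <omp.h>

/* Minimal sketch of two-level (nested) OpenMP parallelism: an outer team of
 * threads, each of which spawns an inner team. This is the kind of
 * multi-level structure an embedded OpenMP runtime must handle cheaply. */
int main(void)
{
    omp_set_nested(1);            /* enable nested parallel regions */
    omp_set_max_active_levels(2); /* allow two active levels of parallelism */

    #pragma omp parallel num_threads(4)
    {
        int outer = omp_get_thread_num();

        #pragma omp parallel num_threads(2)
        {
            int inner = omp_get_thread_num();
            printf("outer thread %d, inner thread %d\n", outer, inner);
        }
    }
    return 0;
}

In a heavyweight runtime the overhead of creating the inner teams can easily outweigh their benefit, which is why the abstract stresses a lightweight, embedded-specific runtime design for such patterns.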
Abstract:
In order to improve in vitro cell culture techniques, bioreactor systems are increasingly being used, e.g. in bone tissue engineering. Spinner flasks, rotating bioreactors and flow-perfusion systems are in use today, and each system has advantages and disadvantages. This work describes the development of a simple perfusion bioreactor and the results of the evaluation methodology employed, based on X-ray µCT analysis and 3D modelling techniques. A simple bioreactor with a propeller-driven flow generator was designed and built with the aim of improving the differentiation of mesenchymal stem cells derived from human embryos (HES-MP); the cells were seeded onto porous titanium scaffolds, which ensure better adhesion of the mineralized matrix. Through a microcontroller and a graphical interface, the bioreactor generates three types of flow: forward (clockwise), backward (counter-clockwise) and a pulsed mode (forward and backward). A simple model was built to estimate the pressure generated by the flow in the scaffolds (3×10⁻² Pa). Three scaffolds in static culture were compared with three inside the bioreactor. These were incubated for 21 days, fixed in paraformaldehyde (4% w/v) and imaged by X-ray µCT. The resulting images were then processed with 3D imaging software; a "virtual" sectioning of the scaffolds was performed in order to obtain the distribution of the gray-value gradients of samples extracted from their surface and their interior. This distribution serves to distinguish the different components present in the images, in this case the scaffolds from the putative cell matrix. The results show that, both on the surface and inside the scaffolds kept in the bioreactor, there is a higher density of gray-value gradients, which suggests better deposition of the mineralized matrix. The lessons learned from building this bioreactor will be used to design a new version that will make it possible to analyse more than 20 scaffolds simultaneously, allowing further analysis of differentiation quality using molecular and histochemical methods.
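For reference, the gray-value gradient whose distribution is analysed can be written per voxel in the usual form (a generic definition; the exact operator applied by the 3D imaging software is not specified in the abstract):

\[
|\nabla I(x,y,z)| = \sqrt{\left(\frac{\partial I}{\partial x}\right)^{2} + \left(\frac{\partial I}{\partial y}\right)^{2} + \left(\frac{\partial I}{\partial z}\right)^{2}},
\]

where \(I\) is the voxel gray value; the histogram of \(|\nabla I|\) over surface and interior sub-volumes is then compared between static and bioreactor cultures to distinguish scaffold from mineralized matrix.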