899 results for Computer Networks and Communications
Abstract:
Business angels are natural persons who provide equity financing to young enterprises and gain ownership in them. They are usually anonymous investors operating in the background of the companies. An important feature is that, beyond funding the enterprises, they can draw on their business experience to contribute to the companies' success with specialised expertise and strategic support. Because of the asymmetric information between angels and companies, matching the two is difficult (Becsky-Nagy – Fazekas 2015), and the fact that angel investors prefer anonymity makes it harder for entrepreneurs to obtain informal venture capital. The primary aim of the various business angel organizations and networks is to ease this matching process by intermediating between the two parties. The role of these organizations in the informal venture capital market is growing relative to individually operating angels. Recognition of their economic importance has led many governments to support them, and there have also been public initiatives aimed at establishing such intermediary organizations, which led to the institutionalization of business angels. Through a characterization of business angels, this study focuses on the progress of these informational intermediaries and their paths of development, with regard to international trends and the current situation of Hungarian business angels and angel networks.
Abstract:
Solving the microkinetics of catalytic systems, which bridges microscopic processes and macroscopic reaction rates, is currently vital for understanding catalysis in silico. However, traditional microkinetic solvers possess several drawbacks that make the process slow and unreliable for complicated catalytic systems. In this paper, a new approach, the reversibility iteration method (RIM), is developed to solve the microkinetics of catalytic systems. Using the chemical potential notation we previously proposed to simplify the kinetic framework, a catalytic system can be shown analytically to be logically equivalent to an electric circuit, and the reaction rate and coverage can be calculated by iteratively updating the reversibilities. Compared to the traditional modified Newton iteration method (NIM), our method is not sensitive to the initial guess of the solution and typically requires fewer iteration steps. Moreover, the method does not require arbitrary-precision arithmetic and has a higher probability of successfully solving the system. These features make it ∼1000 times faster than the modified Newton iteration method for the systems we tested. Beyond performance, the concepts and mathematical framework presented in this work may provide new insight into catalytic reaction networks.
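The abstract does not spell out the RIM update itself, but the general idea of iterating a kinetic balance until it is self-consistent can be illustrated on a toy system. Below is a minimal sketch, assuming a hypothetical two-step cycle (adsorption A + * ⇌ A*, then reaction A* → B + *) with arbitrary rate constants; it uses plain fixed-point iteration on the coverage, not the authors' actual method.

```python
def coverage_fixed_point(k_ads, p_A, k_des, k_rxn,
                         theta0=0.5, tol=1e-12, max_iter=10000):
    """Steady state of A + * <-> A* (rates k_ads*p_A, k_des), A* -> B + * (k_rxn).

    Balance: k_ads*p_A*(1 - theta) = (k_des + k_rxn)*theta.
    We iterate theta <- k_ads*p_A*(1 - theta)/(k_des + k_rxn), which is a
    contraction (and hence converges) whenever k_ads*p_A < k_des + k_rxn.
    """
    theta = theta0
    for _ in range(max_iter):
        new = k_ads * p_A * (1 - theta) / (k_des + k_rxn)
        if abs(new - theta) < tol:
            return new
        theta = new
    return theta
```

With k_ads·p_A = 1 and k_des + k_rxn = 4, the iteration converges to θ = 0.2, matching the analytic steady state, and the turnover rate is simply k_rxn·θ.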
Abstract:
Background
It is generally acknowledged that a functional understanding of a biological system can only be obtained by understanding its collective of molecular interactions in the form of biological networks. Protein networks are one network type of special importance, because proteins form the functional base units of every biological cell. On the mesoscopic level of protein networks, modules are of particular importance because these building blocks may form the next elementary functional level above individual proteins, allowing us to gain insight into the fundamental organizational principles of biological cells.
Results
In this paper, we provide a comparative analysis of five popular and four novel module detection algorithms. We study these module prediction methods on simulated benchmark networks as well as on 10 biological protein interaction networks (PINs). A particular focus of our analysis is the biological meaning of the predicted modules, assessed by using the Gene Ontology (GO) database as a gold standard for the definition of biological processes. Furthermore, we investigate the robustness of the results by perturbing the PINs, in this way simulating our incomplete knowledge of protein networks.
Conclusions
Overall, our study reveals that there is large heterogeneity among the different module prediction algorithms once one zooms in on the level of biological processes in the form of GO terms, and that all methods are severely affected by even a slight perturbation of the networks. However, we also find pathways that are enriched in multiple modules, which could provide important information about the hierarchical organization of the system.
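Many popular module detection algorithms score candidate partitions by modularity. As a generic illustration (not one of the nine algorithms benchmarked in this study, and on an invented toy network rather than a PIN), the Newman–Girvan modularity of a partition can be computed directly:

```python
def modularity(edges, communities):
    """Newman-Girvan modularity Q = sum_c [ e_c/m - (d_c/(2m))^2 ],
    where e_c is the number of intra-community edges, d_c the total
    degree inside community c, and m the total number of edges."""
    m = len(edges)
    label = {}
    for c, nodes in enumerate(communities):
        for v in nodes:
            label[v] = c
    intra = [0] * len(communities)
    deg = [0] * len(communities)
    for u, v in edges:
        deg[label[u]] += 1
        deg[label[v]] += 1
        if label[u] == label[v]:
            intra[label[u]] += 1
    return sum(intra[c] / m - (deg[c] / (2 * m)) ** 2
               for c in range(len(communities)))
```

On two triangles joined by a single bridge edge, splitting at the bridge gives Q = 5/14 ≈ 0.357, while lumping all nodes into one module gives Q = 0, reflecting why such partitions are preferred.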
Abstract:
Computer game technology produces compelling ‘immersive environments’ where our digitally-native youth play and explore. Players absorb visual, auditory and other signs and process these in real time, making rapid choices on how to move through the game-space to experience ‘meaningful play’. How can immersive environments be designed to elicit perception and understanding of signs? In this paper we explore game design and gameplay from a semiotic perspective, focusing on the creation of meaning for players as they play the game. We propose a theory of game design based on semiotics.
Abstract:
This chapter examines community media projects in Scotland as social processes that nurture knowledge through participation in production. A visual and media anthropology framework (Ginsburg, 2005) with an emphasis on the social context of media production informs the analysis of community media. Drawing on community media projects in the Govan area of Glasgow and the Isle of Bute, the techniques of production foreground “the relational aspects of filmmaking” (Grimshaw and Ravetz, 2005: 7) and act as a catalyst for knowledge and networks of relations embedded in time and place. Community media is defined here as a creative social process, characterised by an approach to production that is multi-authored, collaborative and informed by the lives of participants, and which recognises the relevance of networks of relations to that practice (Caines, 2007: 2). As a networked process, community media production is recognised as existing in collaboration between a director or producer, such as myself, and organisations, institutions and participants, who are connected through a range of identities, practices and place. These relations born of the production process reflect a complex area of practice and participation that brings together “parallel and overlapping public spheres” (Meadows et al., 2002: 3). This relates to broader concerns with networks (Carpentier, Servaes and Lie, 2003; Rodríguez, 2001), both revealed during the process of production and enhanced by it, and how they can be described with reference to the knowledge practice of community media.
Abstract:
This article presents an interdisciplinary experience that brings together two areas of computer science: didactics and philosophy. As such, the article introduces a relatively unexplored area of research, not only in Uruguay but in the whole Latin American region. Reflection on the ontological status of computer science, its epistemic and educational problems, and its relationship with technology allows us to develop a critical analysis of the discipline and of its social perception as a basic science.
Abstract:
We can state that information is the main ingredient of the culture of a people. We define information as a set of processed data that is needed and that has value and meaning for individuals and for developing countries, where data is any fact. Information can also be considered an indispensable international resource: scientific and technological development is a consequence of how relevant information is handled. The "heap" is a source of information that libraries must consider and integrate, because it gives rise to more information and new technology. In recent years the need for improved information systems has become critical, because information grows in quantity and in organizational complexity. Consequently, some libraries have tried to coordinate functions, redistribute resources, identify needs and work together to provide easier access to information.
Abstract:
The present investigation analyses the relationship between knowledge sharing behaviours and performance. These behaviours were studied using Social Network Analysis in order to characterise knowledge sharing networks. Having identified the central individuals in these networks, we analysed the association between their centrality and individual performance. A questionnaire was developed and applied to a sample of workers in a Portuguese organisation (N=244). The final conclusions point to a positive association between these behaviours and individual performance.
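Degree centrality, one of the standard Social Network Analysis measures for identifying central individuals, is straightforward to compute. The sketch below is a generic illustration on an invented toy network (the node names and edges are hypothetical, not the study's questionnaire data):

```python
def degree_centrality(edges, nodes):
    """Freeman degree centrality: each node's degree divided by the
    maximum possible degree (n - 1), so values lie in [0, 1]."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}
```

In a five-person network where A shares knowledge with B, C and D, the centrality of A is 3/4 = 0.75, marking A as the most central individual, i.e. the kind of actor whose centrality the study correlates with performance.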
Abstract:
The main objectives of this thesis are to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition and image detection using a principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. MATLAB® software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because graphical representation of the data is impossible; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The joint transform correlator (JTC) is an optical correlator used to synthesize a frequency plane filter for coherent optical systems. The IPCA algorithm generally behaves better than the PCA algorithm in most applications. It outperforms PCA in image compression, obtaining higher compression, more accurate reconstruction, and faster processing with acceptable errors; it is also better in real-time image detection, achieving the smallest error rate as well as remarkable speed.
On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition, offering an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model outperforms the optical model.
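PCA-based compression, as described here, amounts to projecting data onto its leading eigenvectors and reconstructing from the retained components. A minimal two-dimensional sketch (toy data, not the thesis's MATLAB code) keeps only the first principal component:

```python
import math

def pca_1d(points):
    """Compress 2-D points to 1-D by projecting onto the top eigenvector
    of the (population) covariance matrix, then reconstruct.
    Returns (reconstructed points, largest eigenvalue, smallest eigenvalue)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    t, d = sxx + syy, sxx * syy - sxy * sxy
    lam1 = (t + math.sqrt(t * t - 4 * d)) / 2   # top eigenvalue (2x2 closed form)
    lam2 = t - lam1
    if abs(sxy) > 1e-12:                        # eigenvector for lam1
        vx, vy = sxy, lam1 - sxx
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    recon = []
    for x, y in points:
        s = (x - mx) * vx + (y - my) * vy       # 1-D compressed score
        recon.append((mx + s * vx, my + s * vy))
    return recon, lam1, lam2
```

The mean squared reconstruction error equals the discarded eigenvalue λ₂, which is exactly the "acceptable error" trade-off the thesis measures when comparing compression quality.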
Abstract:
The very nature of computer science, with its constant change, forces those who wish to keep up to adapt and react quickly. Large companies invest in staying up to date in order to generate revenue and remain active on the market. Universities, in turn, need to apply the same practice of staying current with industry needs in order to produce industry-ready engineers. By interviewing former students, now engineers in industry, and current university staff, this thesis aims to learn whether there is room to enhance the education through different lecturing approaches and/or curriculum adaptation and development. To address these concerns, qualitative research was conducted, focusing on data collected through semi-structured life-world interviews. The method follows the seven stages of research interviewing introduced by Kvale and focuses on collecting and preparing relevant data for analysis. The collected data is transcribed, refined, and then analyzed in the "Findings and analysis" chapter. The analysis focused on answering the three research questions: how higher education affects a Computer Science and Informatics engineer's job, how to better manage the transition from studies to working in industry, and how to develop a curriculum that supports the previous two. Unaltered quoted extracts are presented and individually analyzed. To paint a fuller picture, a theme-wise analysis is presented, summarising themes that recurred throughout the interviewing phase. The findings imply that several factors directly influence the quality of education: on the student side, mostly expectations of and dedication to the studies; on the university side, commitment to the curriculum development process.
Due to time and resource limitations this research offers findings from a narrowed scope, but it can serve as a foundation for further development, possibly as PhD research.
Abstract:
This thesis examines the development of state-narco networks in post-transition Bolivia. Mainstream discourses of drugs tend to undertheorise such relationships, holding illicit economies, weak states and violence as synergistic phenomena. Such assumptions fail to capture the nuanced relations that emerge between the state and the drug trade in different contexts, their underlying logics and diverse effects. As an understudied case, Bolivia offers novel insights into these dynamics. Bolivian military authoritarian governments (1964-1982), for example, integrated drug rents into clientelistic systems of governance, helping to establish factional coalitions and reinforce regime authority. Following democratic transition in 1982 and the escalation of US counterdrug efforts, these stable modes of exchange between the state and the coca-cocaine economy fragmented. Bolivia, though, continued to experience lower levels of drug-related violence than its Andean neighbours, and sustained democratisation despite being a major drug producer. Focusing on the introduction of the Andean Initiative (1989-1993), I explore state-narco interactions during this period of flux: from authoritarianism to (formal) democracy, and from Cold War to Drug War. As such, the thesis transcends the conventional analyses of the drugs literature and orthodox readings of Latin American narco-violence, providing insights into the relationship between illicit economies and democratic transition, the regional role of the US, and the (unintended) consequences of drug policy interventions. I utilise a mixed methods approach to offer discrete perspectives on the object of study. Drawing on documentary and secondary sources, I argue that state-narco networks were interwoven with Bolivia’s post-transition political settlement. Uneven democratisation ensured pockets of informalism, as clientelistic and authoritarian practices continued. 
This included police and military autonomy, and tolerance of drug corruption within both institutions. Non-enforcement of democratic norms of accountability and transparency was linked to the maintenance of fragile political equilibrium. Interviews with key US and Bolivian elite actors also revealed differing interpretations of state-narco interactions. These exposed competing agendas, and were folded into alternative paradigms and narratives of the ‘war on drugs’. The extension of US Drug War goals and the targeting of ‘corrupt’ local power structures, clashed with local ambivalence towards the drug trade, opposition to destabilising, ‘Colombianised’ policies and the claimed ‘democratising mission’ of the Bolivian government. In contrasting these US and Bolivian accounts, the thesis shows how real and perceived state-narco webs were understood and navigated by different actors in distinct ways. ‘Drug corruption’ held significance beyond simple economic transaction or institutional failure. Contestation around state-narco interactions was enmeshed in US-Bolivian relations of power and control.
Abstract:
In this thesis, we investigate the role of applied physics in epidemiological surveillance through the application of mathematical models, network science and machine learning. The spread of a communicable disease depends on many biological, social, and health factors. The large masses of data available make it possible, on the one hand, to monitor the evolution and spread of pathogenic organisms; on the other hand, to study the behavior of people, their opinions and habits. Presented here are three lines of research in which an attempt was made to solve real epidemiological problems through data analysis and the use of statistical and mathematical models. In Chapter 1, we applied language-inspired Deep Learning models to transform influenza protein sequences into vectors encoding their information content. We then attempted to reconstruct the antigenic properties of different viral strains using regression models and to identify the mutations responsible for vaccine escape. In Chapter 2, we constructed a compartmental model to describe the spread of a bacterium within a hospital ward. The model was informed and validated on time series of clinical measurements, and a sensitivity analysis was used to assess the impact of different control measures. Finally (Chapter 3) we reconstructed the network of retweets among COVID-19 themed Twitter users in the early months of the SARS-CoV-2 pandemic. By means of community detection algorithms and centrality measures, we characterized users’ attention shifts in the network, showing that scientific communities, initially the most retweeted, lost influence over time to national political communities. In the Conclusion, we highlighted the importance of the work done in light of the main contemporary challenges for epidemiological surveillance. 
In particular, we presented reflections on the importance of nowcasting and forecasting, the relationship between data and scientific research, and the need to unite the different scales of epidemiological surveillance.
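The abstract does not specify the structure of the compartmental model in Chapter 2; as a minimal sketch of the general technique, here is a forward-Euler integration of the classic SIR model with arbitrary illustrative parameters (not the thesis's hospital-ward model):

```python
def sir_euler(beta, gamma, s0, i0, r0, dt, steps):
    """Forward-Euler integration of the SIR compartmental model:
        dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I,
    with S, I, R as population fractions (S + I + R = 1 is conserved).
    Returns the trajectory as a list of (S, I, R) tuples."""
    s, i, r = s0, i0, r0
    traj = [(s, i, r)]
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        traj.append((s, i, r))
    return traj
```

With β = 0.3 and γ = 0.1 (basic reproduction number R₀ = 3), a small initial seed of infections grows into an outbreak that eventually moves most of the population into the recovered compartment; in the thesis, such a model is informed and validated against clinical time series.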