883 results for Information theory in aesthetics


Relevance: 100.00%

Abstract:

The problems and methods of adaptive control and multi-agent processing of information in global telecommunication and computer networks (TCN) are discussed. Criteria for the controllability and communication (routing) ability of dataflows are described. A multi-agent model for the exchange of distributed information resources in a global TCN is suggested, and the peculiarities of adaptive and intelligent dataflow control under uncertainty and network collisions are analyzed.
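The abstract's routing-ability criteria ultimately rest on computing feasible paths over link costs. As a minimal, hedged baseline (not the paper's multi-agent model; the toy graph `net` and its weights are invented for illustration), a dataflow route can be found with Dijkstra's algorithm:

```python
import heapq

def shortest_route(graph, src, dst):
    """Dijkstra's shortest path on a weighted digraph given as {u: {v: cost}}.
    Returns (total_cost, node_path), or (inf, []) if dst is unreachable."""
    pq = [(0, src, [src])]            # (cost so far, node, path taken)
    visited = set()
    while pq:
        cost, u, path = heapq.heappop(pq)
        if u == dst:
            return cost, path
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph.get(u, {}).items():
            if v not in visited:
                heapq.heappush(pq, (cost + w, v, path + [v]))
    return float('inf'), []

# toy network: four nodes, link costs as edge weights (illustrative only)
net = {'a': {'b': 1, 'c': 4}, 'b': {'c': 1, 'd': 5}, 'c': {'d': 1}}
```

Adaptive control would then amount to re-running such a computation as link costs change, which the paper's multi-agent model presumably distributes among agents.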

Relevance: 100.00%

Abstract:

This work presents a model for the development of project proposals by students as an approach to teaching information technology while promoting entrepreneurship and reflection. In teams of 3 to 5 participants, students elaborate a project proposal on a topic they have negotiated with each other and with the teacher. The project domain is related to the practical application of state-of-the-art information technology in areas of substantial public interest or of immediate interest to the participants. This gives them ample opportunities for reflection not only on the technical but also on the social, economic, environmental and other dimensions of information technology. This approach has long been used with students of different years and programs of study at the Faculty of Mathematics and Informatics, Plovdiv University “Paisiy Hilendarski”. It has been found to develop all eight key competences for lifelong learning set forth in the Reference Framework, as well as procedural skills required in real life.

Relevance: 100.00%

Abstract:

In this study, the authors investigate the outage-optimal relay strategy under outdated channel state information (CSI) in a decode-and-forward cooperative communication system. They first confirm mathematically that minimising the outage probability under outdated CSI is equivalent to minimising the outage probability conditioned on the outdated CSI of all the decodable relays' links. They then propose a multiple-relay strategy with optimised transmitting power allocation (MRS-OTPA) that minimises this conditional outage probability, and show that this MRS is a generalised relay approach that achieves outage optimality under outdated CSI. To reduce complexity, they also propose an MRS with equal transmitting power allocation (MRS-ETPA) that achieves near-optimal outage performance. It is proved that full spatial diversity, achievable under ideal CSI, can still be achieved under outdated CSI through MRS-OTPA and MRS-ETPA. Finally, the outage performance and diversity order of MRS-OTPA and MRS-ETPA are evaluated by simulation.
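The optimisation target here is the outage probability. As a hedged point of reference (this is the outage criterion for a single Rayleigh-fading link, not the authors' MRS-OTPA/MRS-ETPA schemes; the SNR and rate values are illustrative), it can be estimated by Monte Carlo and checked against the closed form:

```python
import math
import random

def outage_probability(snr_db, rate, trials=200_000, seed=1):
    """Monte Carlo outage probability of a single Rayleigh-fading link:
    outage occurs when the channel cannot support `rate` b/s/Hz, i.e. when
    log2(1 + SNR * |h|^2) < rate.  Under Rayleigh fading, |h|^2 ~ Exp(1)."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    gain_threshold = (2 ** rate - 1) / snr   # |h|^2 below this value => outage
    outages = sum(rng.expovariate(1.0) < gain_threshold for _ in range(trials))
    return outages / trials

# closed form for the same link: P_out = 1 - exp(-(2^rate - 1)/SNR)
analytic = 1 - math.exp(-((2 ** 1 - 1) / 10))
```

At 10 dB SNR and a 1 b/s/Hz target rate the empirical estimate converges to the analytic value, about 0.095; relay selection schemes like those in the paper aim to drive this probability down via spatial diversity.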

Relevance: 100.00%

Abstract:

The study provides an overview of the possibilities for applying game theory to climate change. The characteristics of games are adapted to the topics of climate and carbon, and the importance of uncertainty, probability, the marginal value of adaptation, common-pool resources and related concepts is tailored to the context of international relations and the challenge of global warming.
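The common-pool-resource setting mentioned above is often modelled as a prisoner's dilemma between emitters. A hedged sketch (the payoff numbers are invented for illustration, not taken from the study) that finds the pure-strategy Nash equilibria by checking mutual best responses:

```python
# Two-country emissions game with a prisoner's dilemma structure.
# (row action, column action) -> (row payoff, column payoff); numbers illustrative.
payoff = {
    ('abate', 'abate'): (3, 3),
    ('abate', 'pollute'): (0, 4),
    ('pollute', 'abate'): (4, 0),
    ('pollute', 'pollute'): (1, 1),
}

def nash_equilibria(payoff):
    """Return action profiles where each player's action is a best response
    to the other's (pure-strategy Nash equilibria)."""
    actions = {a for a, _ in payoff}
    eq = []
    for r in actions:
        for c in actions:
            best_r = all(payoff[(r, c)][0] >= payoff[(r2, c)][0] for r2 in actions)
            best_c = all(payoff[(r, c)][1] >= payoff[(r, c2)][1] for c2 in actions)
            if best_r and best_c:
                eq.append((r, c))
    return eq
```

With this payoff structure, both countries polluting is the unique equilibrium even though mutual abatement would leave both better off, which is exactly the tension international climate agreements try to resolve.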

Relevance: 100.00%

Abstract:

With the advent of peer-to-peer networks and, more importantly, sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system, establishing that segregating the data adaptation and prediction processes augments the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-convergent adaptation process is deployed, data reduction rates are significantly improved.
Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
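As a point of reference only (the dissertation's spatio-temporal validation scheme is far more elaborate), a minimal stream-cleaning baseline replaces readings that deviate sharply from a sliding-window median; the window size and deviation bound below are illustrative:

```python
from collections import deque
from statistics import median

def clean_stream(readings, window=5, max_dev=3.0):
    """Replace readings that deviate from the sliding-window median by more
    than max_dev with that median; pass other readings through unchanged."""
    buf = deque(maxlen=window)
    out = []
    for x in readings:
        if len(buf) >= 3 and abs(x - median(buf)) > max_dev:
            x = median(buf)        # treat as corrupted; substitute local estimate
        buf.append(x)
        out.append(x)
    return out
```

A spike of 99 in an otherwise smooth temperature trace around 20-22 would be replaced by the local median, while genuine gradual drift passes through; the dissertation's contribution is deciding such substitutions using validated readings from spatially associated neighbour nodes rather than a single node's history.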

Relevance: 100.00%

Abstract:

Construction organizations typically deal with large volumes of project data containing valuable information, yet it is found that these organizations do not use these data effectively for planning and decision-making. There are two reasons. First, the information systems in construction organizations are designed to support day-to-day construction operations. The data stored in these systems are often non-validated and non-integrated, and are available in a format that makes it difficult for decision makers to use them to make timely decisions. Second, the organizational structure and the IT infrastructure are often not compatible with the information systems, resulting in higher operational costs and lower productivity. These two issues were investigated in this research with the objective of developing systems that are structured for effective decision-making. A framework was developed to guide the storage and retrieval of validated and integrated data for timely decision-making and to enable construction organizations to redesign their organizational structure and IT infrastructure to match information system capabilities. The research focused on construction owner organizations continuously involved in multiple construction projects. Action research and data warehousing techniques were used to develop the framework. One hundred and sixty-three construction owner organizations were surveyed to assess their data needs, data management practices and extent of use of information systems in planning and decision-making. For in-depth analysis, Miami-Dade Transit (MDT), which is in charge of all transportation-related construction projects in Miami-Dade County, was selected. A functional model and a prototype system were developed to test the framework.
The results revealed significant improvements in data management and decision-support operations, examined through various qualitative (ease of data access, data quality, response time, productivity improvement, etc.) and quantitative (time savings and operational cost savings) measures. The research results were validated first by MDT and then by a representative group of twenty construction owner organizations involved in various types of construction projects.
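The framework itself is not detailed in the abstract. As a hedged, minimal illustration of the validate-then-integrate idea behind it (the field names `project_id` and `cost` are invented for this sketch), raw project records can be filtered and aggregated before reaching the decision-support store:

```python
def load_validated(records, required=('project_id', 'cost')):
    """Validate raw project records, then integrate them by project:
    rows missing a required field or carrying a non-positive cost
    never reach the decision-support store."""
    totals = {}
    for rec in records:
        if any(k not in rec for k in required) or rec['cost'] <= 0:
            continue                              # non-validated data is rejected
        totals[rec['project_id']] = totals.get(rec['project_id'], 0.0) + rec['cost']
    return totals

# illustrative raw feed mixing valid and invalid rows
raw = [
    {'project_id': 'P1', 'cost': 120.0},
    {'project_id': 'P1', 'cost': -5.0},   # invalid: negative cost
    {'project_id': 'P2'},                 # invalid: missing cost
    {'project_id': 'P2', 'cost': 40.0},
]
```

In a real data warehouse this validation would run in the ETL layer, so that decision makers query only integrated, cleansed figures.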


Relevance: 100.00%

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slower speeds when compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
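The dynamic data collection referred to above can be sketched as Bayesian evidence accumulation with a stopping rule: stimulus presentations continue only until one character's posterior crosses a confidence threshold, so easy selections use fewer flashes. This is a hedged illustration (the likelihood values and threshold are invented; the study's actual estimator and language-model integration are more sophisticated):

```python
def bayes_step(prior, likelihood):
    """One Bayesian update: multiply the prior by per-character likelihoods
    and renormalize to a proper distribution."""
    post = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(post.values())
    return {c: v / z for c, v in post.items()}

def select_character(evidence_stream, threshold=0.95):
    """Accumulate evidence flash by flash; stop early once one character's
    posterior exceeds the confidence threshold (dynamic data collection)."""
    chars = evidence_stream[0].keys()
    posterior = {c: 1.0 / len(chars) for c in chars}  # uniform prior; a language model could replace this
    best = None
    for likelihood in evidence_stream:
        posterior = bayes_step(posterior, likelihood)
        best = max(posterior, key=posterior.get)
        if posterior[best] >= threshold:
            break
    return best, posterior[best]
```

With modestly informative evidence favouring one character, the posterior concentrates within a handful of updates, illustrating why adaptive stopping can raise communication rates over a fixed number of repetitions.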

Relevance: 100.00%

Abstract:

Two concepts in rural economic development policy have been the focus of much research and policy action: the identification and support of clusters or networks of firms and the availability and adoption by rural businesses of Information and Communication Technologies (ICT). From a theoretical viewpoint these policies are based on two contrasting models, with clustering seen as a process of economic agglomeration, and ICT-mediated communication as a means of facilitating economic dispersion. The study’s conceptual framework is based on four interrelated elements: location, interaction, knowledge, and advantage, together with the concept of networks which is employed as an operationally and theoretically unifying concept. The research questions are developed in four successive categories: Policy, Theory, Networks, and Method. The questions are approached using a study of two contrasting groups of rural small businesses in West Cork, Ireland: (a) Speciality Foods, and (b) firms in Digital Products and Services. The study combines Social Network Analysis (SNA) with Qualitative Thematic Analysis, using data collected from semi-structured interviews with 58 owners or managers of these businesses. Data comprise relational network data on the firms’ connections to suppliers, customers, allies and competitors, together with linked qualitative data on how the firms established connections, and how tacit and codified knowledge was sourced and utilised. The research finds that the key characteristics identified in the cluster literature are evident in the sample of Speciality Food businesses, in relation to flows of tacit knowledge, social embedding, and the development of forms of social capital. In particular the research identified the presence of two distinct forms of collective social capital in this network, termed “community” and “reputation”. 
By contrast the sample of Digital Products and Services businesses does not have the form of a cluster, but matches more closely to dispersive models, or “chain” structures. Much of the economic and social structure of this set of firms is best explained in terms of “project organisation”, and by the operation of an individual rather than collective form of “reputation”. The rural setting in which these firms are located has resulted in their being service-centric, and consequently they rely on ICT-mediated communication in order to exchange tacit knowledge “at a distance”. It is this factor, rather than inputs of codified knowledge, that most strongly influences their operation and their need for availability and adoption of high quality communication technologies. Thus the findings have applicability in relation to theory in Economic Geography and to policy and practice in Rural Development. In addition the research contributes to methodological questions in SNA, and to methodological questions about the combination or mixing of quantitative and qualitative methods.
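SNA quantifies positions such as the cluster "hub" role described above with centrality measures. A minimal sketch of normalized degree centrality over an undirected edge list (the toy firm network is invented for illustration, not drawn from the West Cork data):

```python
def degree_centrality(edges):
    """Normalized degree centrality from an undirected edge list:
    a node's degree divided by the maximum possible degree (n - 1)."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

# toy supplier/customer network: a 'hub' firm tied to three others, plus one lateral tie
edges = [('hub', 'a'), ('hub', 'b'), ('hub', 'c'), ('a', 'b')]
```

In a clustered network like the speciality food group, many firms would show comparably high centrality; in the dispersed "chain" structures of the digital firms, centrality concentrates in a few brokers.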

Relevance: 100.00%

Abstract:

The extractive industry is characterized by high levels of risk and uncertainty. These attributes create challenges when applying traditional accounting concepts (such as the revenue recognition and matching concepts) to the preparation of financial statements in the industry. The International Accounting Standards Board (2010) states that the objective of general purpose financial statements is to provide useful financial information to assist the capital allocation decisions of existing and potential providers of capital. The usefulness of information is defined as being relevant and faithfully represented so as to best aid the investment decisions of capital providers. Value relevance research utilizes adaptations of the Ohlson (1995) model to assess value relevance, one of the attributes that make information useful. This study first examines the value relevance of the financial information disclosed in the financial reports of extractive firms. The findings reveal that the value relevance of information disclosed in the financial reports depends on the circumstances of the firm, including sector, size and profitability. Traditional accounting concepts such as the matching concept can be ineffective when applied to small firms primarily engaged in non-production activities that involve significant levels of uncertainty, such as exploration activities or the development of sites. Standard-setting bodies such as the International Accounting Standards Board and the Financial Accounting Standards Board have addressed the financial reporting challenges in the extractive industry by allowing a significant amount of accounting flexibility in industry-specific accounting standards, particularly in relation to the accounting treatment of exploration and evaluation expenditure.
Second, therefore, this study examines whether the choice of exploration accounting policy has an effect on the value relevance of information disclosed in the financial reports. The findings show that, in general, the Successful Efforts method produces value relevant information in the financial reports of profitable extractive firms. However, specifically in the oil & gas sector, the Full Cost method produces value relevant asset disclosures if the firm is loss-making. This indicates that investors in production- and non-production-oriented firms have different information needs, and these needs cannot be simultaneously fulfilled by a single accounting policy. In the mining sector, a preference by large profitable mining companies for a policy more conservative than either the Full Cost or Successful Efforts method does not result in more value relevant information being disclosed in the financial reports. This finding supports the view that the qualitative characteristic of prudence is a form of bias which has a downward effect on asset values. The third aspect of this study is an examination of the effect of corporate governance on the value relevance of disclosures made in the financial reports of extractive firms. The findings show that the key factor influencing the value relevance of financial information is the ability of the directors to select accounting policies which effectively reflect the economic substance of the particular circumstances facing the firm. Corporate governance is found to have an effect on value relevance, particularly in the oil & gas sector. However, there is no significant difference between the exploration accounting policy choices made by directors of firms with good systems of corporate governance and those with weak systems of corporate governance.
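Value relevance studies of this kind regress market value on accounting numbers, adapting the Ohlson (1995) model in which price is explained by book value and earnings. A hedged sketch of such a regression in pure Python (the firm-level numbers are synthetic and constructed so the fit is exact; the study's actual specification and sample differ):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting (pure Python)."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)] for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))       # pivot row
        A[c], A[p], b[c], b[p] = A[p], A[c], b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [arj - f * acj for arj, acj in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):                              # back-substitution
        beta[r] = (b[r] - sum(A[r][j] * beta[j] for j in range(r + 1, k))) / A[r][r]
    return beta

# synthetic firm data built so that price = 1 + 0.8*book + 3*eps exactly
book = [10.0, 12.0, 8.0, 15.0, 9.0]
eps = [1.0, 2.0, 0.5, 3.0, 1.5]
price = [1 + 0.8 * b + 3 * e for b, e in zip(book, eps)]
X = [[1.0, b, e] for b, e in zip(book, eps)]
beta = ols(X, price)
```

In the empirical literature, the explanatory power and coefficient significance of such regressions, estimated on real market and accounting data, are what "value relevance" is measured by.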

Relevance: 100.00%

Abstract:

The Mariological thesis of the conceptio per aurem, according to which the Virgin Mary conceived Jesus Christ through her ear at the moment she heard from the angel the celestial message announcing that, without losing her virginity, she would become the mother of the incarnate Son of God, has so far received very few academic studies rigorously grounded in primary sources. In fact, references to this theory are very scarce in the specialized literature, and when a scholar does evoke it, he or she is almost always content to allude to it without providing documentary evidence. Yet, as the nine Italian paintings analyzed here reveal, this theory was illustrated through subtle visual metaphors in many medieval pictorial works, which drew on a solid literary tradition. Moreover, a host of Church Fathers and medieval theologians attest, through explicit statements, that this theory enjoyed notable acceptance among the masters of Christian thought. Drawing on numerous patristic and theological texts, this article pursues two essential objectives: first, to set out the various theoretical formulations proposed by these thinkers; and second, to shed light on the dogmatic meanings underlying this surprising thesis.

Relevance: 100.00%

Abstract:

This paper focuses on two basic issues: the anxiety-generating nature of the interpreting task and the relevance of interpreter trainees' academic self-concept. The first has already been acknowledged, although not extensively researched, in several papers; the second has only been mentioned briefly in the interpreting literature. This study seeks to examine the relationship between the anxiety and academic self-concept constructs among interpreter trainees. An adapted version of the Foreign Language Anxiety Scale (Horwitz et al., 1986), the Academic Autoconcept Scale (Schmidt, Messoulam & Molina, 2008) and a background information questionnaire were used to collect data. Student's t-test results indicated that female students reported experiencing significantly higher levels of anxiety than male students. No significant gender difference in self-concept levels was found. Correlation analysis results suggested, on the one hand, that younger would-be interpreters suffered from higher anxiety levels and that students with higher marks tended to have lower anxiety levels; and, on the other hand, that younger students had lower self-concept levels and higher-ability students held higher self-concept levels. In addition, the results revealed that students with higher anxiety levels tended to have lower self-concept levels. Based on these findings, recommendations for interpreting pedagogy are discussed.
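The gender comparison above rests on a two-sample t-test. A minimal sketch of the statistic in the Welch form, which is robust to unequal group variances (shown here without the accompanying degrees-of-freedom and p-value lookup; the sample values used to exercise it are invented):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic: difference of means divided by the
    standard error computed from each group's own variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

A significant negative statistic for (male scores, female scores) would correspond to the paper's finding that female trainees reported higher anxiety.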

Relevance: 100.00%

Abstract:

The business model of an organization is an important strategic tool for its success and should therefore be understood by both business professionals and information technology professionals. In this context, and considering the importance of information technology in contemporary business models, this article aims to examine the use of business model components in the information technology (IT) project management process in enterprises. To achieve this goal, this exploratory study investigated the use of the business model concept in IT project management through a survey of 327 professionals conducted from February to April 2012. It was observed that the business model concept, as well as its practices and building blocks, is not explored to its full potential, possibly because it is relatively new. One of the benefits of this conceptual tool is that it provides different areas with an understanding of the core business, giving enterprise IT professionals and the business area a higher level of knowledge of the essential activities of the enterprise.

Relevance: 100.00%

Abstract:

Centrality is one of the fundamental notions in graph theory and has established close connections with various other areas such as social networks, flow networks and facility location problems. Even though a plethora of centrality measures has been introduced over time in response to changing demands, the term is not well defined, and we can only state some common qualities that a centrality measure is expected to have. Nodes with high centrality scores are often more powerful, indispensable, influential, effective propagators of information and significant in maintaining the cohesion of the group, and they are also more susceptible to anything that disseminates through the network.
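One standard measure of this kind is closeness centrality: the inverse of a node's average shortest-path distance to all other nodes. A minimal sketch for unweighted graphs, using breadth-first search from each node (the star graph `star` is an illustrative example, where the centre scores highest, matching the "easy propagator" intuition above):

```python
from collections import deque

def closeness(adj):
    """Closeness centrality for a connected unweighted graph given as an
    adjacency dict: (n - 1) divided by the sum of BFS distances to all others."""
    scores = {}
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:                       # breadth-first search from s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        scores[s] = (len(adj) - 1) / sum(dist[v] for v in adj if v != s)
    return scores

# star graph: centre 'c' adjacent to three leaves
star = {'c': ['a', 'b', 'd'], 'a': ['c'], 'b': ['c'], 'd': ['c']}
```

Degree, betweenness and eigenvector centrality capture different "common qualities" of importance, which is precisely why no single definition of centrality has prevailed.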