888 results for Knowledge representation (Information theory)


Relevance:

100.00%

Publisher:

Abstract:

Why is Hungary's economy still in a critical state similar to the one before the change of system? Why is indebtedness not being halted, the economy consolidated and set on a sustainable growth path? The authors see the direct cause in lopsided development and a dual economic structure. The introduction of market-economy institutions was not followed by rapid development of the whole economy (or at least a large part of it) towards a knowledge-driven information economy. Several indicators show that Hungary has fallen behind in the international innovation competition. This inadequacy of innovation is not a fortuitous, exogenous attribute, but is explicable in terms of the real institutional setup underlying the formal market institutions and of the actual behaviour patterns observed in the economy. Only in part have the institutions and legal frameworks of a market economy, which in theory provide free and open access to resources, been filled with substance. The scope for successful entry into the market or political arena remains very narrow, so that in Douglass North's terms, Hungary's is still "a society based on restricted access." Innovation is stifled and development repeatedly impeded by the fact that market players' chances differ widely depending on their connections with the state and its institutions, and that such connections are regularly used for rent-seeking.

Relevance:

100.00%

Publisher:

Abstract:

Global problems and rapid, massive regional changes in the 21st century call for genuine long-term awareness, planning and well-focused action from both national governments and international organizations. This book aims to contribute to building an innovative path of strategic views for handling these diverse challenges and, more emphatically, the economic impacts of climate change. Although the contributors to this volume represent several approaches, they all rely on common ground, such as the cost-benefit analysis of mitigation and adaptation, and on the need to present an in-depth theoretical and practical dimension. The research reported in this book seeks to integrate and confront various economic approaches and methods, as well as knowledge ranging from game theory to country surveys, from agricultural adaptation to weather bonds, and from green taxes to the historical experience of human adaptation. The various themes and points of view deserve the attention of the serious academic reader interested in the economics of climate change. We hope to enhance the spread of good solutions resulting from worldwide debates and tested strategic decisions. WAKE UP! It is not just the polar bears' habitat that is endangered, but the entire human form of life.

Relevance:

100.00%

Publisher:

Abstract:

With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometrical space and identifies the sensor nodes of potential interest. Additionally, this dissertation models a wireless sensor network data reduction system, showing that separating the data adaptation and prediction processes increases data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-converging adaptation process is deployed, data reduction rates are significantly improved. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.

Relevance:

100.00%

Publisher:

Abstract:

Coding is a fundamental aspect of cerebral functioning. The transformation of sensory stimuli into neurophysiological responses has been a research theme in several areas of Neuroscience. One of the most widely used ways to measure the efficiency of a neural code is through Information Theory measures, such as mutual information. Using these tools, recent studies show that in the auditory cortex both local field potentials (LFPs) and action potential spiking times code information about sound stimuli. However, there are no studies applying Information Theory tools to investigate the efficiency of codes that use postsynaptic potentials (PSPs), alone or in association with LFP analysis. These signals are related in the sense that LFPs are partly created by the joint action of several PSPs. The present dissertation reports information measures between auditory stimuli of distinct frequencies and the PSP and LFP responses obtained in the primary auditory cortex of anaesthetized rats. Our results show that PSP responses hold information about sound stimuli at levels comparable to, and even greater than, those of LFP responses. We also found that PSPs and LFPs code sound information independently, since the joint analysis of these signals showed neither synergy nor redundancy.
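
A minimal sketch of the kind of mutual-information estimate described above, including a synergy/redundancy check that compares the joint (PSP, LFP) code with the individual codes. The binning, toy data and function names are illustrative assumptions, not the authors' recordings or implementation.

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in estimate of I(S;R) in bits from paired discrete samples."""
    s_vals, s_idx = np.unique(np.asarray(stim), return_inverse=True)
    r_vals, r_idx = np.unique(np.asarray(resp), return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1)      # joint histogram of (stimulus, response)
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)    # P(S)
    pr = joint.sum(axis=0, keepdims=True)    # P(R)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

# Toy data: four tone frequencies and noisy, binned "PSP" and "LFP" amplitudes.
rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=5000)
psp = np.digitize(stim + rng.normal(0, 1.0, 5000), bins=[0.5, 1.5, 2.5])
lfp = np.digitize(stim + rng.normal(0, 1.5, 5000), bins=[0.5, 1.5, 2.5])

i_psp = mutual_information(stim, psp)
i_lfp = mutual_information(stim, lfp)
i_joint = mutual_information(stim, psp * 4 + lfp)   # joint (PSP, LFP) code

# Positive values suggest synergy, negative values redundancy, ~0 independence.
print(f"I(S;PSP)={i_psp:.3f}  I(S;LFP)={i_lfp:.3f}  "
      f"synergy={i_joint - i_psp - i_lfp:+.3f} bits")
```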

Relevance:

100.00%

Publisher:

Abstract:

Information extraction is a frequent and relevant problem in digital signal processing. In the past few years, different methods have been used for signal parameterization and for obtaining efficient descriptors. When the signals possess statistical cyclostationary properties, the Cyclic Autocorrelation Function (CAF) and the Spectral Cyclic Density (SCD) can be used to extract second-order cyclostationary information. However, second-order cyclostationary information is poor in non-Gaussian signals, since cyclostationary analysis in this case should also comprise higher-order statistical information. This paper proposes a new mathematical tool for higher-order cyclostationary analysis based on the correntropy function. Specifically, cyclostationary analysis is revisited from an information-theoretic viewpoint, and the Cyclic Correntropy Function (CCF) and the Cyclic Correntropy Spectral Density (CCSD) are defined. Moreover, it is analytically proven that the CCF contains information on second- and higher-order cyclostationary moments, and is thus a generalization of the CAF. The performance of these new functions in the extraction of higher-order cyclostationary characteristics is analyzed in a wireless communication system subject to non-Gaussian noise.
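
A rough sketch of how a cyclic correntropy estimate might be computed, using the common Gaussian-kernel definition of correntropy; the kernel bandwidth, toy signal and cycle-frequency grid are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def cyclic_correntropy(x, lag, alphas, sigma):
    """Estimate V(alpha, lag) = <k_sigma(x[n+lag] - x[n]) * exp(-j*2*pi*alpha*n)>_n,
    with the Gaussian kernel k_sigma(u) = exp(-u^2 / (2*sigma^2))."""
    x = np.asarray(x, dtype=float)
    n = np.arange(len(x) - lag)
    k = np.exp(-(x[n + lag] - x[n]) ** 2 / (2.0 * sigma**2))
    return np.array([np.mean(k * np.exp(-2j * np.pi * a * n)) for a in alphas])

# Toy cyclostationary signal: random +/-1 symbols on a carrier, plus heavy-tailed noise.
rng = np.random.default_rng(1)
N, f0 = 4096, 0.05
x = rng.choice([-1.0, 1.0], size=N) * np.cos(2 * np.pi * f0 * np.arange(N))
x += 0.5 * rng.standard_t(df=2, size=N)          # impulsive, non-Gaussian noise

alphas = np.linspace(0.0, 0.25, 501)
ccf = np.abs(cyclic_correntropy(x, lag=1, alphas=alphas, sigma=np.std(x)))
# The strongest spectral line should appear near twice the carrier frequency (alpha ~ 2*f0).
print("strongest nonzero cyclic frequency:", alphas[np.argmax(ccf[1:]) + 1])
```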

Relevance:

100.00%

Publisher:

Abstract:

This research aims to analyze the meaning making by Elementary Education students from public schools of Uberlândia (Minas Gerais, Brazil) about the social and environmental problems of their surroundings, through audiovisual reading and expression skills. The theoretical framework is based on constructivism, drawing on contributions about cognitive development; meaningful learning; cognitive processes and types of knowledge; principles of learning with technology; educommunication focused on critical media literacy; and critical environmental education. The study object is a video production workshop organized in nine meetings, from September to November 2015, attended by 15 students. The following data collection instruments were used: the materials produced by the participants, specifically the guided critical media literacy activity, agenda, script and final video; the researcher's observations from his role as mediator; and focal interviews. The analysis was divided into two axes: procedural knowledge – technical skills of critical media literacy and production; and conceptual and metacognitive knowledge – representation of social and environmental problems and metacognitive skills of critical media literacy. Data were coded in the form of a skill evaluation rubric and also in the form of graphs. Despite the time constraints, it is inferred that the workshop helped students deepen their understanding of the content discussed, which is reinforced by the graphs, which show the constant progressive differentiation of more inclusive concepts over the course of the meetings. It is further considered that the workshop helped students reflect on their way of learning through the critical use of media literacy and production techniques, which can be seen from the learners' satisfactory performance in most elements evaluated by the rubrics, as well as from their success in identifying interlocutors, values and actions in the texts they read and produced, as revealed by the graphs.

Relevance:

100.00%

Publisher:

Abstract:

Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies, forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ t_e/n, and by k(1-p)/(n-t_e) when p > t_e/n, where t_e is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-t_e)/(n-t_e) for t_e/n < p ≤ 1. In particular, if the code used is maximum distance separable, the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of the different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of a scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Comparisons between the two erasure control schemes exhibit their advantages as well as their disadvantages in the role of delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study demonstrates how erasure control coding can be used to maximize the performance of practical systems. © 2010 IEEE.
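
A small numeric illustration of the throughput and residual-loss bounds quoted above for an (n, k) forward erasure control code with erasure control capability t_e; the function names and example parameters are illustrative only.

```python
def throughput_cap(n: int, k: int, t_e: int, p: float) -> float:
    """Upper bound on normalized throughput at packet loss rate p (from the abstract)."""
    return k / n if p <= t_e / n else k * (1 - p) / (n - t_e)

def residual_loss_lower_bound(n: int, k: int, t_e: int, p: float) -> float:
    """Lower bound on residual loss rate; zero while the loss rate stays within t_e/n."""
    return 0.0 if p <= t_e / n else (n * p - t_e) / (n - t_e)

if __name__ == "__main__":
    n, k = 10, 8
    t_e = n - k     # a maximum distance separable code corrects up to n-k erasures
    for p in (0.05, 0.2, 0.4):
        print(f"p={p:.2f}  throughput <= {throughput_cap(n, k, t_e, p):.3f}  "
              f"residual loss >= {residual_loss_lower_bound(n, k, t_e, p):.3f}")
```

For a maximum distance separable code (t_e = n - k), the throughput cap k(1-p)/(n-t_e) reduces to 1-p, the erasure-channel capacity, and the residual-loss bound reduces to (p+r-1)/r, matching the figures quoted in the abstract.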

Relevance:

100.00%

Publisher:

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively low communication rates compared with other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
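
A minimal sketch of why repeated stimulus presentations help: averaging N noisy epochs of the same ERP improves the amplitude signal-to-noise ratio roughly by √N. The synthetic P300-like template and noise level are invented for illustration, not real EEG.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 0.8, 200)                               # 800 ms epoch
erp = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05**2))        # ~5 uV P300-like peak
noise_sd = 20.0                                              # background EEG dwarfs the ERP

for n_trials in (1, 4, 16, 64):
    epochs = erp + rng.normal(0.0, noise_sd, size=(n_trials, t.size))
    averaged = epochs.mean(axis=0)
    expected_snr = erp.max() / (noise_sd / np.sqrt(n_trials))   # grows like sqrt(N)
    print(f"{n_trials:3d} trials: expected amplitude SNR ~ {expected_snr:.2f}, "
          f"averaged value at the true peak = {averaged[np.argmax(erp)]:.1f} uV")
```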

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.

Relevance:

100.00%

Publisher:

Abstract:

The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in U.S. Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.

The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique. Model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those previously published. Results favored a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.
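
A hypothetical sketch of the information-theoretic model selection step: two candidate dose-response models are fitted by maximum likelihood to simulated binary DCS outcomes and compared via AIC. The exposure metric, model forms and data are invented for illustration and are not the dissertation's actual models.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
exposure = rng.uniform(0.5, 3.0, size=400)          # arbitrary "decompression stress" metric
p_true = 1 - np.exp(-0.02 * exposure**2.0)          # hidden data-generating risk
dcs = rng.random(400) < p_true                      # observed DCS outcomes (True/False)

def neg_log_lik(params, fit_exponent):
    scale = np.exp(params[0])                       # keep the scale term positive
    exponent = np.exp(params[1]) if fit_exponent else 1.0
    p = np.clip(1 - np.exp(-scale * exposure**exponent), 1e-12, 1 - 1e-12)
    return -np.sum(dcs * np.log(p) + (~dcs) * np.log(1 - p))

fits = {
    "A: linear hazard (1 param)": minimize(neg_log_lik, [np.log(0.05)], args=(False,)),
    "B: power hazard (2 params)": minimize(neg_log_lik, [np.log(0.05), 0.0], args=(True,)),
}
for name, res in fits.items():
    aic = 2 * len(res.x) + 2 * res.fun              # AIC = 2k - 2*lnL; lower is better
    print(f"{name}: AIC = {aic:.1f}")
```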

We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine whether predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that, for many of our models, symptoms are often reported after risk accumulation begins, we hypothesized that the inclusion of delays might improve correlation between the model predictions and the observed data. Model selection techniques identified two models as having the best overall performance, but comparison with the best-performing model without delay, and model selection using our best no-delay pharmacokinetic model, both indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.

Our final investigation explored parameter bounding techniques to identify parameter regions in which statistical model failure will not occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we successfully identify regions where model failure will not occur and locate the boundaries of these regions using a root-bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.

Relevance:

100.00%

Publisher:

Abstract:

It has been years since the introduction of the Dynamic Network Optimization (DNO) concept, yet DNO development is still in its infancy, largely due to the lack of a breakthrough in reducing the lengthy optimization runtime. Our previous work, a distributed parallel solution, achieved a significant speed gain. To cater for the increased optimization complexity driven by the uptake of smartphones and tablets, however, this paper examines the potential areas for further improvement and presents a novel asynchronous distributed parallel design that minimizes inter-process communication. The new approach is implemented and applied to real-life projects, and the results demonstrate a speed-up of 7.5 times on a 16-core distributed system, compared with 6.1 times for our previous solution. Moreover, there is no degradation in the optimization outcome. This is a solid sprint towards the realization of DNO.

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates the achievable sum-rate of uplink massive multiple-input multiple-output (MIMO) systems under a practical channel impairment, namely aged channel state information (CSI). Considering both maximum ratio combining (MRC) and zero-forcing (ZF) receivers at the base station, we present tight closed-form lower bounds on the sum-rate for both receivers, which provide an efficient means of evaluating the sum-rate of the system. More importantly, we characterize the impact of channel aging on the power scaling law. Specifically, we show that the transmit power of each user can be scaled down by 1/√M, where M is the number of base station antennas, which indicates that aged CSI does not affect the power scaling law; instead, it causes only a reduction in the sum-rate by reducing the effective signal-to-interference-plus-noise ratio (SINR).
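
A rough Monte-Carlo sketch of the power-scaling idea for the MRC receiver with imperfect (here, pilot-estimated rather than aged) CSI: the per-user rate stays bounded away from zero when power is cut as 1/√M but collapses under 1/M scaling. The system model, pilot scheme and all parameters are generic textbook assumptions, not the paper's aged-CSI model or its closed-form bounds.

```python
import numpy as np

rng = np.random.default_rng(4)
K, E_u, trials = 4, 10.0, 400            # users, nominal per-user SNR, Monte-Carlo runs

def avg_rate(M, p_u):
    """Average log2(1+SINR) of user 0 under MRC with one-shot MMSE channel estimates."""
    rates = []
    for _ in range(trials):
        G = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)
        Np = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)
        Yp = np.sqrt(p_u) * G + Np                     # orthogonal pilots at power p_u
        Ghat = (np.sqrt(p_u) / (1 + p_u)) * Yp         # per-entry MMSE channel estimate
        g0, ghat0 = G[:, 0], Ghat[:, 0]
        sig = p_u * np.abs(ghat0.conj() @ g0) ** 2
        intf = p_u * sum(np.abs(ghat0.conj() @ G[:, i]) ** 2 for i in range(1, K))
        noise = np.linalg.norm(ghat0) ** 2
        rates.append(np.log2(1 + sig / (intf + noise)))
    return float(np.mean(rates))

for M in (16, 64, 256, 1024):
    print(f"M={M:5d}  rate with p_u=E_u/sqrt(M): {avg_rate(M, E_u / np.sqrt(M)):5.2f}  "
          f"with p_u=E_u/M: {avg_rate(M, E_u / M):5.2f}  bit/s/Hz")
```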

Relevance:

100.00%

Publisher:

Abstract:

Background: Many adults with ADHD discontinue their treatment for unknown reasons. It is therefore important to examine how adults with ADHD experience their opportunities to participate in treatment and in what way the nurse can support them. Aim: To describe adult ADHD patients' experience of participation and their need for nursing support during treatment in outpatient psychiatric care. Method: A qualitative study based on eight semi-structured interviews. The interview material was analysed using content analysis. Results: Many participants felt that they lacked information about the diagnosis and the treatment and therefore could not take part in or influence it. The nurse was perceived as easier to reach than the physician and could offer more frequent follow-ups. The nurse was also perceived as having a coordinating function between different professional groups. Being seen as a person, and not reduced to a patient with ADHD, was important. Finally, the importance of also involving relatives in the treatment emerged. Conclusion: Part of the nurse's work is to convey knowledge and information to patients in a way they understand, based on each person's specific needs, in order to increase patients' opportunities for participation. The specialist nurse can also provide support through the possibility of offering closer contact and more follow-up, as well as coordination with other professional groups and community services. Support can also mean seeing the person behind the diagnosis.

Relevance:

100.00%

Publisher:

Abstract:

Ankle sprains are the most common injuries in sports, usually causing damage to the lateral ligaments. Recurrence commonly results in permanent instability and thus in loss of proprioception. This, together with residual symptoms, is what is known as chronic ankle instability (CAI), or functional ankle instability (FAI) when it is functional. Attempts to address this problem focus on improving musculoskeletal stability and proprioception through the application of bandages and the performance of exercises. The aim of this study has been to review articles (meta-analyses, systematic reviews and other reviews) published between 2009 and 2015 in PubMed, Medline, ENFISPO and BUCea, using keywords such as “sprain instability”, “sprain proprioception” and “chronic ankle instability”. The evidence confirms that proprioception is decreased in patients who suffer from CAI. A rehabilitation exercise regimen is indicated as treatment because it generates a subjective improvement reported by the patient, and the application of bandages works as a sprain-prevention method by limiting the range of motion, reducing joint instability and increasing confidence during exercise. As podiatrists we should recommend proprioception exercises to all athletes as a preventive measure, and to those with CAI or FAI as part of a rehabilitation programme, together with the application of bandages. However, further studies should be conducted focusing on ways of improving proprioception and on the exercise patterns that provide the maximum benefit.