924 results for Interval signals and systems
Abstract:
Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.
This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data and thereby improve the accuracy of the target character estimation process. As a result, BCIs are relatively slow compared with other commercial assistive communication devices, which limits their adoption by the target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
In this work, it is hypothesised that building adaptive capabilities into the BCI framework can give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events, and the parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed to maximise the information content presented to the user by tuning stimulus paradigm parameters that positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of the stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in these three areas have the potential to significantly improve BCI communication rates, and that the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
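A minimal simulation (not from the thesis; the template shape, amplitude and noise level are invented for illustration) shows why the repeated measurements described above help: averaging N epochs of a fixed ERP buried in zero-mean noise suppresses the noise roughly by a factor of sqrt(N).

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative P300-like template: a positive deflection near 300 ms
    # post-stimulus with a ~5 microvolt peak (values invented).
    t = np.linspace(0.0, 0.8, 200)                            # seconds
    erp = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

    def average_trials(n_trials, noise_sd=20e-6):
        # Average n_trials epochs; the noise is zero-mean and independent
        # across trials, so it cancels while the ERP adds coherently.
        trials = erp + rng.normal(0.0, noise_sd, size=(n_trials, t.size))
        return trials.mean(axis=0)

    for n in (1, 4, 16, 64):
        residual = average_trials(n) - erp
        snr = erp.std() / residual.std()   # grows roughly as sqrt(n)
        print(f"{n:3d} trials -> empirical SNR ~ {snr:.2f}")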
Abstract:
This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code-generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open-source tools to increase impact. A preliminary prototype implementation is described.
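The paper leaves the configuration specification abstract; as a purely hypothetical sketch (every key name below is invented, not taken from the paper), such a specification might bind component inputs, name the observable run-time events, and point them at an animation tool:

    # Hypothetical configuration for the customizable code-generation process.
    # Every key below is invented for illustration; the paper does not publish
    # a concrete schema.
    component_config = {
        "component": "SpeedController",
        "inputs": {
            "setpoint": {"source": "test_script", "type": "float"},
            "sensor": {"source": "recorded_trace", "file": "run1.csv"},
        },
        "observations": {
            "events": ["state_entry", "transition_fired"],
            "variables": ["current_speed", "error_term"],
            "sampling_period_ms": 10,
        },
        "animation": {
            "tool": "statechart_viewer",  # run-time observations drive this view
            "highlight": "active_state",
        },
    }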
Abstract:
PURPOSE: Conventional staging methods are inadequate to identify patients with stage II colon cancer (CC) who are at high risk of recurrence after surgery with curative intent. ColDx is a gene expression, microarray-based assay shown to be independently prognostic for recurrence-free interval (RFI) and overall survival in CC. The objective of this study was to further validate ColDx using formalin-fixed, paraffin-embedded specimens collected as part of the Alliance phase III trial, C9581.
PATIENTS AND METHODS: C9581 evaluated edrecolomab versus observation in patients with stage II CC and reported no survival benefit. Under an initial case-cohort sampling design, a randomly selected subcohort (RS) comprised 514 patients from 901 eligible patients with available tissue. Forty-nine additional patients with recurrence events were included in the analysis. Final analysis comprised 393 patients: 360 RS (58 events) and 33 non-RS events. Risk status was determined for each patient by ColDx. The Self-Prentice method was used to test the association between the resulting ColDx risk score and RFI adjusting for standard prognostic variables.
RESULTS: Fifty-five percent of patients (216 of 393) were classified as high risk. After adjustment for prognostic variables that included mismatch repair (MMR) deficiency, ColDx high-risk patients exhibited significantly worse RFI (multivariable hazard ratio, 2.13; 95% CI, 1.3 to 3.5; P < .01). Age and MMR status were marginally significant. RFI at 5 years for patients classified as high risk was 82% (95% CI, 79% to 85%), compared with 91% (95% CI, 89% to 93%) for patients classified as low risk.
CONCLUSION: ColDx is associated with RFI in the C9581 subsample in the presence of other prognostic factors, including MMR deficiency. ColDx could be incorporated with the traditional clinical markers of risk to refine patient prognosis.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This paper presents a new formalism for reasoning about change over time. The formalism draws a clean separation between the notions of states and situations. It allows more flexible temporal causal relationships than other formalisms for reasoning about causal change, such as the situation calculus and the event calculus: it can express effects that start during, immediately after, or some time after their causes, and that end before, simultaneously with, or after their causes. A formal distinction between actions, action-types and events is proposed, which allows common-sense causal laws to be expressed at a high level. It is shown how these laws can be used to deduce state change over time at a low level when events occur while certain preconditions hold. Two problems that beset most interval-based temporal systems, the so-called dividing-instant problem and the intermingling problem, are absent from the formalism.
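As an illustrative encoding only (the predicate names below are ours, not the paper's notation), the cause/effect timing relationships the formalism admits can be stated over simple numeric intervals:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        start: float
        end: float  # assumed start < end

    # Timing relationships between a cause interval and its effect interval.
    def starts_during(effect, cause):
        return cause.start < effect.start < cause.end

    def starts_immediately_after(effect, cause):
        return effect.start == cause.end

    def starts_some_time_after(effect, cause):
        return effect.start > cause.end

    # Example: flipping a switch (cause) lights a lamp immediately afterwards,
    # and the effect outlasts its cause.
    flip = Interval(0.0, 1.0)
    lit = Interval(1.0, 10.0)
    assert starts_immediately_after(lit, flip) and lit.end > flip.end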
Abstract:
Cognitive radio (CR) was developed to utilize spectrum bands efficiently. Spectrum sensing and awareness are the main tasks of a CR, providing the possibility of exploiting unused bands. In this thesis, we investigate the detection and classification of Long Term Evolution (LTE) single carrier-frequency division multiple access (SC-FDMA) signals, which are used in the LTE uplink, with applications to cognitive radio. We explore the second-order cyclostationarity of LTE SC-FDMA signals and apply results obtained for the cyclic autocorrelation function to signal detection and classification (in other words, to spectrum sensing and awareness). The proposed detection and classification algorithms provide very good performance under various channel conditions, with a short observation time, at low signal-to-noise ratios, and with reduced complexity. The validity of the proposed algorithms is verified using signals generated and acquired by laboratory instrumentation, and the experimental results show a good match with computer simulation results.
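The cyclic autocorrelation function (CAF) at cyclic frequency alpha and delay tau is conventionally estimated from N samples as (1/N) * sum_n x[n] * conj(x[n + tau]) * exp(-2j*pi*alpha*n). A minimal sketch of this standard estimator follows; the test waveform is a stand-in BPSK-like signal with 8 samples per symbol, not an LTE SC-FDMA signal.

    import numpy as np

    def cyclic_autocorrelation(x, alpha, tau):
        # Standard CAF estimate at cyclic frequency alpha (cycles/sample)
        # and non-negative integer delay tau:
        #   (1/N) * sum_n x[n] * conj(x[n + tau]) * exp(-2j*pi*alpha*n)
        n_total = len(x) - tau
        n = np.arange(n_total)
        prod = x[:n_total] * np.conj(x[tau:tau + n_total])
        return np.sum(prod * np.exp(-2j * np.pi * alpha * n)) / n_total

    # Stand-in cyclostationary signal: BPSK-like symbols with a rectangular
    # pulse of 8 samples, giving cyclic features at multiples of 1/8.
    rng = np.random.default_rng(1)
    symbols = rng.choice([-1.0, 1.0], size=512)
    x = np.repeat(symbols, 8) + 0.5 * rng.standard_normal(symbols.size * 8)

    for a in (0.0, 1.0 / 8.0, 0.17):   # 0.17 is not a cyclic frequency here
        print(f"alpha={a:.3f}  |CAF(tau=1)| = {abs(cyclic_autocorrelation(x, a, 1)):.3f}")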
Abstract:
We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
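As a toy illustration of the 'grounding' idea (all signal names, hosts and the threshold below are invented; this is not the project's design), an alert fires only when several distinct danger signals from the same source correlate within a short time window:

    from collections import defaultdict

    WINDOW_S = 5.0   # correlation window (seconds); illustrative
    THRESHOLD = 3    # distinct danger signals needed to alert; illustrative

    events = [
        # (time_s, source_host, danger_signal) -- all values invented
        (0.5, "10.0.0.7", "cpu_spike"),
        (1.2, "10.0.0.7", "unusual_syscall"),
        (2.0, "10.0.0.9", "cpu_spike"),
        (3.1, "10.0.0.7", "outbound_flood"),
    ]

    def correlate(events):
        # Group danger signals per source; alert only when enough distinct
        # signals co-occur in one window, tying the response to a source.
        by_source = defaultdict(list)
        for t, src, sig in sorted(events):
            by_source[src].append((t, sig))
        for src, sigs in by_source.items():
            for i, (t0, _) in enumerate(sigs):
                window = {s for t, s in sigs[i:] if t - t0 <= WINDOW_S}
                if len(window) >= THRESHOLD:
                    print(f"ALERT: {len(window)} correlated danger signals from {src}")
                    break

    correlate(events)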
Abstract:
Awnless barnyard grass, feathertop Rhodes grass, and windmill grass are important weeds in Australian cotton systems. In October 2014, an experiment was established to investigate the phenological plasticity of these species. Seeds of each species were planted in a glasshouse every four weeks, and each cohort was grown for six months. A developmental response to day length was observed in barnyard grass but not in the other species. Days to maturity increased with each planting for feathertop Rhodes grass and windmill grass for the first six cohorts. Barnyard grass showed a similar pattern for seeds planted from October to December, with the onset of maturity increasing from 51 to 58 days; however, the onset of maturity for cohorts planted between January and March decreased to between 50 and 52 days. All species showed a decrease in the total number of panicles produced across the first four plantings: on average, feathertop Rhodes grass planted in October produced 41 panicles compared with 30 for plants sown at the end of December, barnyard grass decreased from 99 to 47 panicles, and windmill grass from 37 to 15. Comparing the development of these key weed species over 12 months yields detailed information on their phenological plasticity. This information will contribute to more informed management decisions by improving our understanding of appropriate weed-control timings and herbicide rates depending on weed emergence and development.
Abstract:
Development of no-tillage (NT) farming has revolutionized agricultural systems by allowing growers to manage greater areas of land with reduced energy, labour and machinery inputs while controlling erosion, improving soil health and reducing greenhouse gas emissions. However, NT farming systems have resulted in a build-up of herbicide-resistant weeds, an increased incidence of soil- and stubble-borne diseases, and enrichment of nutrients and carbon near the soil surface. Consequently, there is increased interest in the use of occasional tillage (termed strategic tillage, ST) to address such emerging constraints in otherwise-NT farming systems. Decisions around ST use will depend upon the specific issues present in the individual field or farm, and the profitability and effectiveness of the available management options. This paper explores some of the issues with the implementation of ST in NT farming systems. Contrasting soil properties, the timing of the tillage and the prevailing climate exert a strong influence on the success of ST. Decisions around the timing of tillage are complex and depend on the interactions between soil water content and the purpose for which the ST is intended: the soil needs to be at the right water content before executing any tillage, while the objective of the ST will influence the frequency and type of tillage implement used. The use of ST in long-term NT systems will depend on factors associated with system costs and profitability, soil health and environmental impacts. For many farmers maintaining farm profitability is a priority, so economic considerations are likely to be the primary factor dictating adoption. However, impacts on soil health and the environment, especially the risk of erosion and the loss of soil carbon, will also influence a grower's choice to adopt ST, as will the impact on soil moisture reserves in rainfed cropping systems. © 2015 Elsevier B.V.
Abstract:
In a global society, all educational sectors need to recognise internationalism as a core, foundational principle. Whilst most educational sectors are taking up that challenge, vocational education and training (VET) is still being pulled towards the national agenda in terms of its structures and systems, and the policies driving it, disadvantaging those who graduate from VET, those who teach in it, and the businesses and countries that connect with it. This paper poses questions about the future of internationalisation in the sector. It examines whether there is a way to create a VET system that meets its primary point of value, to produce skilled workers for the local labour market, while still benefitting those graduates by providing international skills and knowledge, gained from VET institutions that are international in their outlook. The paper examines some of the key barriers created by systems and structures in VET to internationalisation and suggests that the efforts which have been made to address the problem have had limited success. It suggests that only a model which gives freedom to those with a direct vested interest, students, teachers, trainers and employers, to pursue international co-operation and liaison will have the opportunity to succeed. (DIPF/Orig.)
Abstract:
161 p.
Abstract:
In energy harvesting communications, users transmit messages using energy harvested from nature. In such systems, transmission policies of the users need to be carefully designed according to the energy arrival profiles. When the energy management policies are optimized, the resulting performance of the system depends only on the energy arrival profiles. In this dissertation, we introduce and analyze the notion of energy cooperation in energy harvesting communications where users can share a portion of their harvested energy with the other users via wireless energy transfer. This energy cooperation enables us to control and optimize the energy arrivals at users to the extent possible. In the classical setting of cooperation, users help each other in the transmission of their data by exploiting the broadcast nature of wireless communications and the resulting overheard information. In contrast to the usual notion of cooperation, which is at the signal level, the energy cooperation we introduce here is at the battery energy level. In a multi-user setting, energy may be abundant in one user, in which case the loss incurred by transferring it to another user may be less than the gain it yields for the other user. It is this cooperation that we explore in this dissertation for several multi-user scenarios, where energy can be transferred from one user to another through a separate wireless energy transfer unit.

We first consider the offline optimal energy management problem for several basic multi-user network structures with energy harvesting transmitters and one-way wireless energy transfer. In energy harvesting transmitters, energy arrivals in time impose energy causality constraints on the transmission policies of the users. In the presence of wireless energy transfer, energy causality constraints take a new form: energy can flow in time from the past to the future for each user, and from one user to the other at each time. This requires a careful joint management of energy flow in two separate dimensions, and different management policies are required depending on how users share the common wireless medium and interact over it. In this context, we analyze several basic multi-user energy harvesting network structures with wireless energy transfer. To capture the main trade-offs and insights that arise due to wireless energy transfer, we focus our attention on simple two- and three-user communication systems, such as the relay channel, the multiple access channel and the two-way channel.

Next, we focus on the delay minimization problem for networks. We consider a general network topology of energy harvesting and energy cooperating nodes. Each node harvests energy from nature and all nodes may share a portion of their harvested energies with neighboring nodes through energy cooperation. We consider the joint data routing and capacity assignment problem for this setting under fixed data and energy routing topologies. We determine the joint routing of energy and data in a general multi-user scenario with data and energy transfer.

Next, we consider the cooperative energy harvesting diamond channel, where the source and two relays harvest energy from nature and the physical layer is modeled as a concatenation of a broadcast and a multiple access channel. Since the broadcast channel is degraded, one of the relays has the message of the other relay. Therefore, the multiple access channel is an extended multiple access channel with common data. We determine the optimum power and rate allocation policies of the users in order to maximize the end-to-end throughput of this system.

Finally, we consider the two-user cooperative multiple access channel with energy harvesting users. The users cooperate at the physical layer (data cooperation) by establishing common messages through overheard signals and then cooperatively sending them. For this channel model, we investigate the effect of intermittent data arrivals to the users. We find the optimal offline transmit power and rate allocation policy that maximizes the departure region. When the users can further cooperate at the battery level (energy cooperation), we find the jointly optimal offline transmit power and rate allocation policy, together with the energy transfer policy, that maximizes the departure region.
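The two dimensions of energy flow described above can be made concrete with a small feasibility check: under energy causality, each user's cumulative spending (plus energy it transfers out, minus the fraction of received energy that survives transfer) may never exceed its cumulative harvest. The slot values and transfer efficiency below are illustrative, not taken from the dissertation.

    # Energy-causality feasibility check for two energy-cooperating users.
    # delta[k] is energy sent from user 1 to user 2 in slot k; only a
    # fraction eta survives the one-way wireless energy transfer.
    def feasible(harvest1, harvest2, use1, use2, delta, eta=0.6):
        e1 = e2 = 0.0
        for h1, h2, u1, u2, d in zip(harvest1, harvest2, use1, use2, delta):
            e1 += h1 - u1 - d        # user 1: harvest, spend, transfer out
            e2 += h2 - u2 + eta * d  # user 2: harvest, spend, receive
            if d < 0 or e1 < 0 or e2 < 0:
                return False         # energy causality (or direction) violated
        return True

    h1, h2 = [4, 0, 4], [0, 1, 0]
    print(feasible(h1, h2, use1=[2, 0, 2], use2=[1, 1, 0.5], delta=[2, 0, 1]))  # True
    print(feasible(h1, h2, use1=[2, 0, 2], use2=[1, 1, 0.5], delta=[3, 0, 1]))  # False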
Abstract:
Background: Diagnostic decision-making draws on a combination of System 1 (intuitive, pattern-recognition) and System 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the level of System 1 and System 2 thinking among medical students in pre-clinical and clinical programs. Methods: The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 & 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree were studied. Results: Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking), while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3-13% gave answers that were neither the analytical nor the intuitive responses. Non-native English-speaking students (n = 11) had a lower mean number of correct answers than native English speakers (n = 117; 1.0 vs 2.12, respectively; p < 0.01). As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students. Conclusions: Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While CRT performance makes no claims about students' future expertise as clinicians, the test may help students understand the importance of awareness and regulation of their thinking processes in clinical practice.
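The System 1/System 2 tension the CRT exploits is easiest to see in its best-known item from Frederick's original test, the bat-and-ball problem (the abstract does not reproduce the three questions it administered). The intuitive answer, 10 cents, violates the stated constraint; solving the two equations gives 5 cents:

    # Bat-and-ball item: a bat and a ball cost $1.10 together, and the bat
    # costs $1.00 more than the ball. System 1 suggests ball = $0.10;
    # System 2 solves bat + ball = 1.10 with bat = ball + 1.00.
    ball = (1.10 - 1.00) / 2   # 2*ball + 1.00 = 1.10  ->  ball = 0.05
    bat = ball + 1.00          # bat = 1.05
    assert abs((bat + ball) - 1.10) < 1e-9   # analytic answer satisfies the total
    intuitive_total = 0.10 + (0.10 + 1.00)   # = 1.20, not 1.10: the check fails
    print(f"ball = ${ball:.2f}, bat = ${bat:.2f}; intuitive total = ${intuitive_total:.2f}")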