818 results for Functorial Embedding
Abstract:
Acute life-threatening events are mostly predictable in adults and children. Despite real-time monitoring, these events still occur at a rate of 4%. This paper describes an automated prediction system based on feature space embedding and time series forecasting of the SpO2 signal, a pulsatile signal synchronised with the heart beat. We develop an age-independent index of abnormality that distinguishes patient-specific normal-to-abnormal physiology transitions. Two different methods were used to distinguish between normal and abnormal physiological trends based on SpO2 behaviour. The abnormality index derived by each method is compared against the current gold standard of clinical prediction of critical deterioration. Copyright © 2013 Inderscience Enterprises Ltd.
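The abstract does not spell out the implementation, but a minimal sketch of a delay (feature-space) embedding of an SpO2 trace with a simple distance-based abnormality score is shown below; the synthetic signal, embedding parameters and baseline-centroid scoring rule are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def delay_embed(x, dim=3, lag=5):
    """Stack delayed copies of a 1-D signal into feature-space vectors (Takens-style embedding)."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])

# Hypothetical SpO2 trace sampled once per second (values in %).
rng = np.random.default_rng(0)
spo2 = 97.0 + np.cumsum(rng.normal(0.0, 0.05, size=600))

# Embed, estimate a per-patient "normal" centroid from the first five minutes,
# and score each later window by its distance from that baseline.
X = delay_embed(spo2, dim=3, lag=5)
baseline = X[:300].mean(axis=0)
abnormality = np.linalg.norm(X - baseline, axis=1)
print(abnormality[-5:])
```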
Abstract:
We investigated the nature of resource limitations during visual target processing by imposing high temporal processing demands on the cognitive system. This was achieved by embedding target stimuli into rapid serial visual presentation (RSVP) streams. In RSVP streams, it is difficult to report the second of two targets (T2) if it follows the first (T1) within 500 ms. This effect is known as the attentional blink (AB). For the AB to occur, it is essential that T1 is followed by a mask, as without such a stimulus the AB is significantly attenuated. Usually, it is thought that T1 processing is delayed by the mask, which in turn delays T2 processing, increasing the likelihood of T2 failures (the AB). Predictions regarding amplitudes and latencies of cortical responses (the M300, the magnetic counterpart to the P300) to targets were tested by investigating the neurophysiological effects of the post-T1 item (the mask) by means of magnetoencephalography (MEG). Cortical M300 responses to targets drawn from prefrontal sources (areas associated with working memory) revealed accelerated T1 yet delayed T2 processing with an intervening mask. The explanation we propose assumes that “protection” of ongoing T1 processing, necessitated by the occurrence of the mask, suppresses other activation patterns, which boosts T1 yet also hinders further processing. Our data shed light on the mechanisms employed by the human brain to ensure visual target processing under high temporal processing demands, which is hypothesized to occur at the expense of subsequently presented information.
Abstract:
These case studies from CIMA highlight the need to embed risk management within more easily understood behaviours, consistent with the overall organisational culture. In each case, some form of internal audit team provides either an oversight function or acts as an expert link in that feedback loop. Frontline staff, managers and specialists should be completely aligned on risk, in part just to ensure that there is a consistency of approach. They should understand instinctively that good performance includes good risk management. Tesco has continued to thrive during the recession and remains a robust and efficient group of businesses despite the emergence of potential threats around consumer spending and the supply chain. RBS, by contrast, has suffered catastrophic and very public failures of risk management despite a large in-house function and stiff regulation of risk controls. Birmingham City Council, like all local authorities, is adapting to more commercial modes of operation and is facing diverse threats and opportunities emerging as a result of social change. And DCMS, like many other public sector organisations, has to handle an incredibly complex network of delivery partners within the context of a relatively recent overhaul of central government risk management processes.
Key Findings:
• Risk management is no longer solely a financial discipline, nor is it simply a concern for the internal control function.
• Where organisations retain a discrete risk management cadre (often specialists at monitoring and evaluating a range of risks), their success is dependent on embedding risk awareness in the wider culture of the enterprise.
• Risk management is most successful when it is explicitly linked to operational performance.
• Clear leadership, specific goals, excellent influencing skills and open-mindedness to potential threats and opportunities are essential for effective risk management.
• Bureaucratic processes and systems can hamper good risk management, either as a result of a ‘box-ticking mentality’ or because managers and staff believe they do not need to consider risk themselves.
Abstract:
An original method and technology of systemological «Unit-Function-Object» (UFO) analysis for solving complex, ill-structured problems is proposed. This visual grapho-analytical UFO technology combines, for the first time, the capabilities and advantages of the system and object approaches, and can be used for business reengineering and for information systems design. The UFO-technology procedures are formalized by pattern-theory methods and developed by embedding systemological conceptual classification models into system-object analysis and software tools. The technology is based on natural classification and helps to investigate deep semantic regularities of the subject domain and to take proper account of the essential properties of system classes as objectively as possible. The systemological knowledge models are based on a method that, for the first time, synthesizes system and classification analysis. This allows the creation of a new generation of CASE toolkits for organizational modelling, supporting companies' sustainable development and competitive advantage.
Abstract:
We report poorly fluorinated graphene sheets, produced by thermal exfoliation and embedded in a carboxymethylcellulose polymer composite (GCMC), as an efficient mode locker for an erbium-doped fiber laser. Two GCMC mode lockers with different concentrations have been fabricated. The GCMC-based mode-locked fiber laser shows stable soliton pulse shaping with a repetition rate of 28.5 MHz; an output power of 5.5 mW was achieved with the high-concentration GCMC, while a slightly higher output power of 6.9 mW was obtained using the low-concentration GCMC mode locker.
Abstract:
Given a differentiable action of a compact Lie group G on a compact smooth manifold V , there exists [3] a closed embedding of V into a finite-dimensional real vector space E so that the action of G on V may be extended to a differentiable linear action (a linear representation) of G on E. We prove an analogous equivariant embedding theorem for compact differentiable spaces (∞-standard in the sense of [6, 7, 8]).
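One way to phrase the equivariance in the classical statement recalled here: there exist a closed smooth embedding $i\colon V \hookrightarrow E$ and a linear representation $\rho\colon G \to \mathrm{GL}(E)$ such that
$$ i(g \cdot v) = \rho(g)\, i(v) \qquad \text{for all } g \in G,\ v \in V, $$
and the paper proves the analogous statement with the compact smooth manifold $V$ replaced by a compact differentiable space.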
Abstract:
AMS Subj. Classification: 68U05, 68P30
Abstract:
2000 Mathematics Subject Classification: 12F12
Abstract:
Acute life-threatening events such as cardiac/respiratory arrests are often predictable in adults and children. However, critical events such as unplanned extubations are considered not predictable. This paper seeks to evaluate the ability of automated prediction systems based on feature space embedding and time series methods to predict unplanned extubations in paediatric intensive care patients. We try to exploit the trends in physiological signals such as heart rate, respiratory rate, systolic blood pressure and blood oxygen saturation, using a signal-processing, frame-based approach that expands the signals over a nonorthogonal basis derived from the data. We investigate the significance of these trends in a computerised prediction system. The results are compared with clinical observations of predictability. We conclude by investigating whether the prediction capability of the system could be exploited to prevent future unplanned extubations. © 2014 IEEE.
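As an illustration of the frame-based idea mentioned above, the sketch below expands a single vital-sign trend onto a deliberately nonorthogonal set of smooth atoms via least squares; the atoms, window length and random stand-in signal are assumptions made to keep the example self-contained, not the data-derived basis used in the paper.

```python
import numpy as np

# One illustrative vital-sign trend (e.g. a z-scored heart-rate window at 1 Hz).
rng = np.random.default_rng(1)
signal = rng.standard_normal(120)

# Nonorthogonal set of smooth trend atoms: a constant plus decaying exponentials
# at several rates; deliberately not orthogonal, in the spirit of a frame expansion.
t = np.linspace(0.0, 1.0, signal.size)
atoms = [np.ones_like(t)] + [np.exp(-r * t) for r in (1.0, 2.0, 4.0, 8.0, 16.0)]
Phi = np.column_stack(atoms)                     # 120 x 6 synthesis matrix

# Expansion coefficients by least squares (the pseudoinverse plays the role of a dual frame).
coeffs, *_ = np.linalg.lstsq(Phi, signal, rcond=None)
trend = Phi @ coeffs                             # smoothed trend; coeffs serve as the feature vector
print(np.round(coeffs, 3))
```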
Abstract:
Report published in the Proceedings of the National Conference on "Education in the Information Society", Plovdiv, May, 2012
Abstract:
Shield UI’s advanced framework for creating rich charts and graphs is the first of a line of data visualization components, giving web developers the power to embed rich graphics in their web projects with minimum effort. Built with HTML and CSS3 and packaged as a jQuery plugin, the library has full support for legacy and modern desktop web browsers, as well as the latest mobile devices.
Abstract:
AMS subject classification: 52A01, 13C99.
Abstract:
In this paper, we investigate the use of manifold learning techniques to enhance the separation properties of standard graph kernels. The idea stems from the observation that when we perform multidimensional scaling on the distance matrices extracted from the kernels, the resulting data tends to be clustered along a curve that wraps around the embedding space, a behavior that suggests that long-range distances are not estimated accurately, resulting in an increased curvature of the embedding space. Hence, we propose to use a number of manifold learning techniques to compute a low-dimensional embedding of the graphs in an attempt to unfold the embedding manifold and increase the class separation. We perform an extensive experimental evaluation on a number of standard graph datasets using the shortest-path (Borgwardt and Kriegel, 2005), graphlet (Shervashidze et al., 2009), random walk (Kashima et al., 2003) and Weisfeiler-Lehman (Shervashidze et al., 2011) kernels. We observe the most significant improvement in the case of the graphlet kernel, which fits with the observation that neglecting the locational information of the substructures leads to a stronger curvature of the embedding manifold. On the other hand, the Weisfeiler-Lehman kernel partially mitigates the locality problem by using the node label information, and thus does not clearly benefit from the manifold learning. Interestingly, our experiments also show that the unfolding of the space seems to reduce the performance gap between the examined kernels.
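A minimal sketch of the pipeline described above (kernel matrix, kernel-induced distances, a low-dimensional MDS embedding, then a classifier) is given below; the random positive-semidefinite stand-in for a graph-kernel Gram matrix, the embedding dimension and the k-NN classifier are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# K is assumed to be a precomputed graph-kernel Gram matrix (n x n, PSD) and y the
# class labels; a random PSD stand-in keeps the snippet self-contained.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 10))
K = A @ A.T
y = rng.integers(0, 2, size=60)

# Kernel-induced distances: d(i, j)^2 = K_ii + K_jj - 2 K_ij.
diag = np.diag(K)
D = np.sqrt(np.maximum(diag[:, None] + diag[None, :] - 2.0 * K, 0.0))

# Low-dimensional embedding of the distance matrix (MDS here; the paper compares
# several manifold learners), followed by a simple classifier on the unfolded points.
Z = MDS(n_components=5, dissimilarity="precomputed", random_state=0).fit_transform(D)
print(cross_val_score(KNeighborsClassifier(n_neighbors=3), Z, y, cv=5).mean())
```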
Abstract:
Kernel methods provide a convenient way to apply a wide range of learning techniques to complex and structured data by shifting the representational problem from one of finding an embedding of the data to that of defining a positive semidefinite kernel. One problem with the most widely used kernels is that they neglect the locational information within the structures, resulting in less discrimination. Correspondence-based kernels, on the other hand, are in general more discriminating, at the cost of sacrificing positive definiteness due to their inability to guarantee transitivity of the correspondences between multiple graphs. In this paper we generalize a recent structural kernel based on the Jensen-Shannon divergence between quantum walks over the structures by introducing a novel alignment step which, rather than permuting the nodes of the structures, aligns the quantum states of their walks. This results in a novel kernel that maintains localization within the structures, but still guarantees positive definiteness. Experimental evaluation validates the effectiveness of the kernel for several structural classification tasks. © 2014 Springer-Verlag Berlin Heidelberg.
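For reference, a small sketch of how a quantum Jensen-Shannon divergence between two density operators can be computed from von Neumann entropies, and turned into a similarity value, is given below; the random density matrices and the particular normalisation into a kernel value are illustrative assumptions, and the paper's alignment step over quantum states is not reproduced.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i w_i log w_i over the eigenvalues of a density matrix."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def quantum_jsd(rho, sigma):
    """Quantum Jensen-Shannon divergence between two density operators."""
    return von_neumann_entropy(0.5 * (rho + sigma)) - 0.5 * (
        von_neumann_entropy(rho) + von_neumann_entropy(sigma)
    )

def random_density(n, seed):
    """Illustrative density operator; in the paper these come from (aligned) quantum walks."""
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    rho = M @ M.conj().T
    return rho / np.trace(rho).real

rho, sigma = random_density(6, 1), random_density(6, 2)
k = 1.0 - quantum_jsd(rho, sigma) / np.log(2)   # one common way to map the divergence to a similarity
print(k)
```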
Abstract:
Kernel methods provide a way to apply a wide range of learning techniques to complex and structured data by shifting the representational problem from one of finding an embedding of the data to that of defining a positive semidefinite kernel. In this paper, we propose a novel kernel on unattributed graphs where the structure is characterized through the evolution of a continuous-time quantum walk. More precisely, given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic. With this new graph to hand, we compute the density operators of the quantum systems representing the evolutions of two suitably defined quantum walks. Finally, we define the kernel between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators. The experimental evaluation shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
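The sketch below illustrates the continuous-time quantum walk ingredient: evolving a uniform initial state under the graph Laplacian and time-averaging the resulting pure states into a density operator, to which a quantum Jensen-Shannon divergence (as in the previous sketch) could then be applied. The small example graph, the choice of the Laplacian as Hamiltonian and the uniform time averaging are assumptions; the paper's merged-structure construction is not reproduced here.

```python
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of an illustrative graph; in the paper the walk runs on a
# derived structure built from the pair of input graphs.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                  # graph Laplacian as walk Hamiltonian

# Continuous-time quantum walk: |psi(t)> = exp(-i L t) |psi(0)>, starting uniform.
psi0 = np.ones(A.shape[0], dtype=complex) / np.sqrt(A.shape[0])
times = np.linspace(0.0, 10.0, 50)

# Time-averaged density operator rho = mean_t |psi(t)><psi(t)|.
rho = np.zeros((A.shape[0], A.shape[0]), dtype=complex)
for t in times:
    psi = expm(-1j * L * t) @ psi0
    rho += np.outer(psi, psi.conj())
rho /= len(times)
print(np.trace(rho).real)   # should be ~1 for a valid density operator
```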