943 results for framework-intensive applications


Relevance: 30.00%

Publisher:

Abstract:

The retina is a highly complex neural structure that performs spatial, temporal, and chromatic processing on visual information and converts it into a compact ‘digital’ format composed of neural impulses. This paper presents a new compiler-based framework able to describe, simulate and validate custom retina models. The framework is compatible with the most common neural recording and analysis tools, taking advantage of its interoperability with such applications. Furthermore, the code can be compiled to generate accelerated versions of the visual processing models for COTS microprocessors, FPGAs or GPUs. The whole system represents ongoing work towards the design and development of a functional visual neuroprosthesis. Several case studies are described to assess the effectiveness and usefulness of the framework.
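As a concrete illustration of the kind of model such a framework describes (a generic textbook centre-surround stage, not the paper's compiler or its models; all parameters here are invented):

```python
import math

# A minimal retina-like stage: a 1-D difference-of-Gaussians (DoG)
# centre-surround spatial filter followed by a crude rate-to-spike rule.

def dog_kernel(radius, sigma_c=1.0, sigma_s=3.0):
    """Narrow excitatory Gaussian minus wide inhibitory Gaussian, each
    normalised over the truncated support so flat inputs give zero output."""
    xs = range(-radius, radius + 1)
    def gauss(sigma):
        w = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
        total = sum(w)
        return [v / total for v in w]
    centre, surround = gauss(sigma_c), gauss(sigma_s)
    return [c - s for c, s in zip(centre, surround)]

def filter_signal(luminance, kernel):
    """Valid-mode 1-D convolution of a luminance profile with the kernel."""
    r = len(kernel) // 2
    return [sum(kernel[j + r] * luminance[i + j] for j in range(-r, r + 1))
            for i in range(r, len(luminance) - r)]

def to_spikes(response, threshold=0.05):
    """Emit a spike (1) wherever the filtered response exceeds the threshold."""
    return [1 if v > threshold else 0 for v in response]

# A step edge in luminance: the centre-surround filter fires only near the edge.
edge = [0.0] * 10 + [1.0] * 10
spikes = to_spikes(filter_signal(edge, dog_kernel(3)))
```

In the paper's setting, an accelerated version of a model would be a hardware- or GPU-specific equivalent of this kind of per-pixel filtering pipeline.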

Relevance: 30.00%

Publisher:

Abstract:

The development of applications and services for mobile systems must cope with a wide range of devices with very heterogeneous capabilities whose response times are difficult to predict. The research described in this work addresses this issue by developing a computational model that formalizes the problem and defines adaptive computing methods. The proposal combines imprecise-computation strategies with cloud computing paradigms in order to provide flexible implementation frameworks for embedded or mobile devices. As a result, imprecise-computation scheduling of the embedded system's workload decides when to move computation to the cloud, according to the priority and response time of the tasks to be executed, and thereby meets the desired productivity and quality of service. A technique to estimate network delays and schedule tasks more accurately is illustrated in this paper. An application example in which this technique is evaluated in running contexts with heterogeneous workloads, in order to check the validity of the proposed model, is described.
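The offloading decision described above can be sketched as follows (a hypothetical sketch: the task model, thresholds and the "degrade" fallback are illustrative, not the paper's method):

```python
# Decide where to run a task: locally if the device meets the deadline,
# in the cloud if the estimated network delay still leaves enough time,
# otherwise fall back to an imprecise (mandatory-part-only) computation.

def estimate_network_delay(rtt_samples):
    """Simple moving-average estimate of the round-trip delay, in seconds."""
    return sum(rtt_samples) / len(rtt_samples)

def schedule(task, rtt_samples, cloud_speedup=10.0):
    """Return 'local', 'cloud' or 'degrade' for one task."""
    local_time = task["cycles"] / task["device_speed"]
    cloud_time = estimate_network_delay(rtt_samples) + local_time / cloud_speedup
    if local_time <= task["deadline"]:
        return "local"
    if cloud_time <= task["deadline"]:
        return "cloud"
    return "degrade"   # imprecise computation: run only the mandatory part

task = {"cycles": 2e9, "device_speed": 1e9, "deadline": 1.0}  # ~2 s locally
decision = schedule(task, rtt_samples=[0.08, 0.12, 0.10])     # cloud: ~0.3 s
```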

Relevance: 30.00%

Publisher:

Abstract:

In this work we present a semantic framework suitable for use as a support tool for recommender systems. Our purpose is to use the semantic information provided by a set of integrated resources to enrich texts by conducting different NLP tasks: WSD, domain classification, semantic similarity and sentiment analysis. After obtaining this textual semantic enrichment, we are able to recommend similar content or even to rate texts along different dimensions. First, we describe the main characteristics of the integrated semantic resources together with an exhaustive evaluation. Next, we demonstrate the usefulness of our resource in different NLP tasks and campaigns. Moreover, we present a combination of different NLP approaches that provides enough knowledge to be used as a support tool for recommender systems. Finally, we illustrate a case study with information related to movies and TV series to demonstrate that our framework works properly.
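One of the listed tasks, semantic similarity, can be illustrated with a toy bag-of-words cosine measure (this stands in for the paper's integrated semantic resources, which it does not reproduce):

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Cosine of the bag-of-words vectors of two texts (0.0 .. 1.0)."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)       # Counter returns 0 for missing words
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# A recommender could rank candidate items by similarity to a liked one:
sim = cosine_similarity("a space opera movie", "a space western series")
```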

Relevance: 30.00%

Publisher:

Abstract:

Dissertation presented to the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Master in Software Development and Interactive Systems, carried out under the scientific supervision of Doctor Osvaldo Arede dos Santos, Adjunct Professor in the Technical-Scientific Unit of Informatics of the Escola Superior de Tecnologia of the Instituto Politécnico de Castelo Branco.

Relevance: 30.00%

Publisher:

Abstract:

In today's internet world, web browsers are an integral part of our day-to-day activities, so browser security is a serious concern for all of us. Browsers can be breached in different ways, and because of their over-privileged access, extensions are responsible for many security issues. Browser vendors try to keep safe extensions in their official extension galleries, but their security control measures are not always effective and adequate, and the distribution of unsafe extensions through social engineering techniques is also a very common practice. Therefore, before installation, users should thoroughly analyze the security of browser extensions. Extensions are not only available for desktop browsers: many mobile browsers, for example Firefox for Android and UC Browser for Android, also support extensions. Mobile devices have various resource constraints in terms of computational capability, power, network bandwidth, etc. Hence, conventional extension security analysis techniques cannot be used efficiently by end users to examine mobile browser extension security. To overcome the inadequacies of existing approaches, we propose CLOUBEX, a CLOUd-based security analysis framework for both desktop and mobile Browser EXtensions. The framework uses a client-server architecture in which compute-intensive security analysis tasks are generally executed on a high-speed computing server hosted in a cloud environment. CLOUBEX is also enriched with a number of essential features, such as client-side analysis, requirements-driven analysis, high performance, and dynamic decision making. At present, the Firefox extension ecosystem is the most susceptible to security attacks, so the framework is implemented for the security analysis of Firefox desktop and Firefox for Android extensions.
A static taint analysis is used to identify malicious information flows in Firefox extensions. CLOUBEX offers three analysis modes, and a dynamic decision-making algorithm selects the best one based on parameters such as the processing speed of the client device and the network connection speed. Using the best analysis mode, performance and power consumption improve significantly. In the future, this framework can also be leveraged for the security analysis of other desktop and mobile browser extensions.
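The dynamic decision-making step can be sketched as a simple rule (the mode names, parameters and thresholds here are invented for illustration; the paper's algorithm uses its own measured parameters):

```python
# Pick an analysis mode from the client's processing speed and the
# network speed, in the spirit of CLOUBEX's three analysis modes.

def pick_mode(client_speed_mips, network_mbps):
    if network_mbps < 1.0:
        return "client-side"   # link too slow to ship the extension out
    if client_speed_mips < 2000:
        return "cloud-only"    # weak device, usable link: analyse remotely
    return "hybrid"            # fast device and link: split the work

mode = pick_mode(client_speed_mips=1500, network_mbps=20.0)
```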

Relevance: 30.00%

Publisher:

Abstract:

The emergence of widespread offshoring of information-intensive services is arguably one of the more impactful phenomena to transform business in the last ten years. A growing body of research has examined the firm-level drivers and location factors (i.e., the whys and wheres) of services offshoring. However, little empirical research has examined the maturation sequencing (or whens) of services offshoring. Adopting industry life cycle theory as a framework, the key research questions examined in the paper are: when do different categories of offshoring services provision change from being emergent sectors to more mature ones, and how does the timing of this sequence relate to the type of service offshored? Using a database of 1420 offshore services FDI projects, we find that the value-add as well as the information sensitivity of a service category are related to when it progresses through the industry life cycle. Implications for future waves of service offshoring are discussed.

Relevance: 30.00%

Publisher:

Abstract:

In recent decades, computing technologies and products have become pervasive and are now an essential part of our lives. Every day they influence us more or less explicitly, changing the way we live and our behaviours more or less intentionally. However, computers were not originally created to persuade: they were built to manage, compute, store and retrieve data. As soon as computers moved from research laboratories into everyday life, though, they became increasingly persuasive. This area of research is called persuasive technology or captology, also defined as the study of interactive computer systems designed to change people's attitudes and habits. Despite the growing success of persuasive technologies, there seems to be a lack of both theoretical and practical frameworks that can help mobile application developers build applications capable of actually persuading end users. However, the work carried out by Professor Helal and Professor Lee at the Persuasive Laboratory at the University of Florida attempts to fill this gap. They have proposed a simple but effective persuasion model which can be used intuitively by engineers or computing specialists. Moreover, Professor Helal and Professor Lee have also developed Cicero, a middleware for Android devices based on their earlier model, which developers can use very simply and quickly to create persuasive applications. My work at the core of this project thesis focuses on the analysis of this middleware and, subsequently, on the improvements and extensions made to it. The most important are a new sensing architecture, a new cloud-based structure and a new protocol that makes it possible to create applications specifically for smartwatches.

Relevance: 30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 30.00%

Publisher:

Abstract:

The cyclotides are a family of disulfide-rich proteins from plants. They have the characteristic structural features of a circular protein backbone and a knotted arrangement of disulfide bonds. Structural and biochemical studies of the cyclotides suggest that their unique physiological stability can be lent to bioactive peptide fragments for pharmaceutical and agricultural development. In particular, the cyclotides incorporate a number of solvent-exposed loops that are potentially suitable for epitope grafting applications. Here, we determine the structure of the largest known cyclotide, palicourein, which has an atypical size and composition in one of its surface-exposed loops. The structural data show that an increase in the size of a palicourein loop does not perturb the core fold, to which its thermodynamic and chemical stability has been attributed. The cyclotide core fold can thus, in principle, be used as a framework for the development of useful pharmaceutical and agricultural bioactivities.

Relevance: 30.00%

Publisher:

Abstract:

In this paper, a new control design method is proposed for stable processes that can be described by Hammerstein-Wiener models. The internal model control (IMC) framework is extended to accommodate multiple IMC controllers, one for each subsystem. The concept of passive systems is used to construct the IMC controllers, which approximate the inverses of the subsystems to achieve dynamic control performance, and the Passivity Theorem is used to ensure closed-loop stability.
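The block structure being controlled can be made concrete with a toy Hammerstein-Wiener simulation (the nonlinearities and first-order linear block are invented examples, not the paper's process; the IMC scheme would build one controller per block by approximately inverting it):

```python
import math

def input_nl(u):
    """Hammerstein block: a static saturation-like input nonlinearity."""
    return math.tanh(u)

def output_nl(x):
    """Wiener block: a static, monotone output distortion."""
    return x + 0.1 * x ** 3

def simulate(inputs, a=0.9, b=0.1):
    """First-order linear dynamics x[k+1] = a*x[k] + b*v[k] sandwiched
    between the two static nonlinearities; returns the measured outputs."""
    x, outputs = 0.0, []
    for u in inputs:
        v = input_nl(u)        # u -> v through the input nonlinearity
        x = a * x + b * v      # linear dynamic subsystem
        outputs.append(output_nl(x))
    return outputs

ys = simulate([1.0] * 50)  # unit-step response, settling near output_nl(tanh(1))
```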

Relevance: 30.00%

Publisher:

Abstract:

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero-non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ-patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.
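For reference, the full-order model the abstract contrasts with can be written in the textbook I(1) form (notation assumed here rather than taken from the paper, whose order-two framework involves additional terms):

```latex
% Full-order VECM: every entry of \alpha, \beta and the \Gamma_i is
% unrestricted. A ZNZ-patterned VECM instead allows individual entries
% to be fixed at zero, e.g. to encode Granger non-causality.
\Delta y_t = \alpha \beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \, \Delta y_{t-i} + \varepsilon_t
```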

Relevance: 30.00%

Publisher:

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. Lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries.
National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
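The arithmetic behind the common metric can be sketched as follows (the standard YLL + YLD decomposition with made-up numbers; the Study's actual computations also involved age weighting and discounting, omitted here):

```python
# DALY = YLL + YLD: years of life lost to premature mortality plus
# years lived with disability, weighted by severity.

def yll(deaths, life_expectancy_at_death):
    """Years of Life Lost to premature mortality."""
    return deaths * life_expectancy_at_death

def yld(incident_cases, disability_weight, avg_duration_years):
    """Years Lived with Disability (weight: 0 = perfect health, 1 = death)."""
    return incident_cases * disability_weight * avg_duration_years

# Why non-fatal outcomes matter: a mostly fatal condition vs. a rarely
# fatal but chronic, disabling one (all figures illustrative).
fatal_condition = yll(1000, 30) + yld(1000, 0.1, 1)       # 30,100 DALYs
disabling_condition = yll(10, 30) + yld(5000, 0.35, 10)   # 17,800 DALYs
```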

Relevance: 30.00%

Publisher:

Abstract:

Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach are either limited by the minimal assumptions they make or, when more specific assumptions are made, computationally intensive. Results: By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
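The two-component mixture can be written out directly; with the mixture parameters fixed to plausible made-up values (the paper estimates them from the data), the posterior null probability for each gene's z-score is a one-line ratio:

```python
import math

def normal_pdf(z, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at z."""
    return math.exp(-((z - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior_null(z, pi0=0.9, mu1=2.5, sigma1=1.2):
    """P(gene is null | z) = pi0 * f0(z) / f(z) under the two-component
    normal mixture f(z) = pi0 * N(0, 1) + (1 - pi0) * N(mu1, sigma1^2)."""
    f0 = normal_pdf(z)                     # null component: standard normal
    f1 = normal_pdf(z, mu1, sigma1)        # non-null (differentially expressed)
    return pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)

p_typical = posterior_null(0.3)   # unremarkable z-score: almost surely null
p_extreme = posterior_null(4.0)   # extreme z-score: likely non-null
```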

Relevance: 30.00%

Publisher:

Abstract:

Objective: Transcranial Doppler (TCD) ultrasonography is a technique that uses a hand-held Doppler transducer, placed on the surface of the cranial skin, to measure the velocity and pulsatility of blood flow within the intracranial and extracranial arteries. This review critically evaluates the evidence for the use of TCD in the critical care population. Discussion: TCD has frequently been employed for the clinical evaluation of cerebral vasospasm following subarachnoid haemorrhage (SAH). To a lesser degree, TCD has also been used to evaluate cerebral autoregulatory capacity, to monitor the cerebral circulation during cardiopulmonary bypass and carotid endarterectomy, and to diagnose brain death. Technological advances such as M-mode, colour Doppler and three-dimensional power Doppler ultrasonography have extended the scope of TCD to other, non-critical-care applications, including the assessment of cerebral emboli, functional TCD and the management of sickle cell disease. Conclusions: Despite publications suggesting concordance between TCD velocity measurements and cerebral blood flow, there are few randomized controlled studies demonstrating an improved outcome with the use of TCD monitoring in neurocritical care. Newer developments in this technology include venous Doppler, functional Doppler and the use of ultrasound contrast agents.
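The "velocity and pulsatility" that TCD reports are conventionally summarised by the mean flow velocity and Gosling's pulsatility index; a minimal sketch using the standard definition (the example velocities are illustrative, not taken from the review):

```python
def pulsatility_index(v_systolic, v_diastolic, v_mean):
    """Gosling's PI = (Vs - Vd) / Vmean; higher values suggest increased
    distal vascular resistance."""
    return (v_systolic - v_diastolic) / v_mean

# Illustrative middle cerebral artery velocities in cm/s:
pi = pulsatility_index(v_systolic=90.0, v_diastolic=40.0, v_mean=60.0)  # ≈ 0.83
```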