15 results for Regularization Methods
Abstract:
This paper reviews the methods for measuring the economic cost of conflict. Estimating the economic costs of conflict requires a counterfactual calculation, which makes this a very difficult task. Social researchers have resorted to different estimation methods depending on the particular effect in question. The method used in each case depends on the units being analyzed (firms, sectors, regions or countries), the outcome variable under study (aggregate output, market valuation of firms, market shares, etc.) and data availability (a single cross-section, time series or panel data). This paper reviews existing methods used in the literature to assess the economic impact of conflict: cost accounting, cross-section methods, time series methods, panel data methods, gravity models, event studies, natural experiments and comparative case studies. The paper ends with a discussion of cost estimates and directions for further research.
Abstract:
Methods for generating a new population are a fundamental component of estimation of distribution algorithms (EDAs). They serve to transfer the information contained in the probabilistic model to the newly generated population. In EDAs based on Markov networks, methods for generating new populations usually discard information contained in the model in order to gain efficiency. Other methods, such as Gibbs sampling, use information about all interactions in the model but are computationally very costly. In this paper we propose new methods for generating solutions in EDAs based on Markov networks. We introduce approaches based on inference methods for computing the most probable configurations and on model-based template recombination. We show that the application of different variants of inference methods can increase the EDAs' convergence rate and reduce the number of function evaluations needed to find the optimum of binary and non-binary discrete functions.
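The EDA loop the abstract refers to (sample a population from a probabilistic model, select the fittest, re-estimate the model) can be sketched with its simplest instance, a univariate marginal EDA (UMDA) on the OneMax toy function. This is an illustrative baseline only: the paper's methods instead sample from Markov-network models, using inference of most probable configurations and template recombination, none of which appears here. The function name and all parameters are hypothetical.

```python
import random

def umda_onemax(n=16, pop_size=80, elite=40, generations=60, seed=1):
    """Minimal univariate EDA (UMDA) on the OneMax toy problem.

    Keeps only univariate marginals; Markov-network EDAs, as in the
    paper, additionally model interactions between variables.
    """
    rng = random.Random(seed)
    probs = [0.5] * n          # initial marginal model: each bit is 50/50
    fitness = sum              # OneMax: fitness = number of ones
    best = None
    for _ in range(generations):
        # 1. Sample a new population from the probabilistic model.
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        # 2. Truncation selection: keep the fittest individuals.
        pop.sort(key=fitness, reverse=True)
        selected = pop[:elite]
        if best is None or fitness(selected[0]) > fitness(best):
            best = selected[0]
        # 3. Re-estimate the marginals from the selected set,
        #    clamped away from 0/1 to avoid premature fixation.
        probs = [min(1 - 1 / n,
                     max(1 / n, sum(ind[i] for ind in selected) / elite))
                 for i in range(n)]
    return best
```

Replacing step 3's product of marginals with a Markov-network model, and step 1 with inference-based sampling, yields the family of methods the paper studies.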
Abstract:
Functional Electrical Stimulation (FES) is a technique that consists of applying electrical current pulses to artificially activate motor nerve fibers and produce muscle contractions that achieve functional movements. The main applications of FES are in the rehabilitation field, where the technique is used to aid recovery or to restore lost motor functions. People who benefit from FES are usually patients with neurological disorders that result in motor dysfunction; the most common are stroke and spinal cord injury (SCI) patients. Neuroprostheses are devices based on the FES technique, and their aim is to bridge interrupted or damaged neural paths between the brain and the upper or lower limbs. One of the aims of neuroprostheses is to artificially generate muscle contractions that produce functional movements and thereby assist impaired people, enabling them to perform activities of daily living (ADL). FES applies current pulses and stimulates nerve fibers by means of electrodes, which can be either implanted or surface electrodes. Both have advantages and disadvantages. Implanted electrodes require open surgery to place them next to the nerve root, so they carry the many disadvantages of invasive techniques. In return, because the electrodes are attached to the nerve, they make it easier to achieve selective functional movements. In contrast, surface electrodes are not invasive and are easily attached to or detached from the skin. Their main disadvantages are the difficulty of selectively stimulating nerve fibers and the uncomfortable sensation perceived by users, caused by the sensory nerves located in the skin. Electrical stimulation surface electrode technology has improved significantly over the years, and multi-field electrodes have recently been proposed.
This multi-field or matrix electrode approach brings many advantages to FES; among them is the possibility of easily applying different stimulation methods and techniques. The main goal of this thesis is, therefore, to test two stimulation methods, asynchronous and synchronous stimulation, in the upper limb with multi-field electrodes. To this end, a purpose-built wrist torque measuring system and a graphical user interface were developed to measure the wrist torque produced with each method and to carry out the experiments efficiently. Both methods were then tested on 15 healthy subjects, and sensitivity results were analyzed for different cases. The results show that in some cases there are significant differences between the methods regarding sensation, which can affect the effectiveness of FES.
Abstract:
Accurate and fast decoding of speech imagery from electroencephalographic (EEG) data could serve as a basis for a new generation of brain computer interfaces (BCIs), more portable and easier to use. However, decoding speech imagery from EEG is a hard problem, for many reasons. In this paper we focus on the analysis of the classification step of speech imagery decoding for a three-class vowel speech imagery recognition problem. We empirically show that different classification subtasks may require different classifiers for accurate decoding, and we obtain a classification accuracy that improves on the best previously published results. We further investigate the relationship between the classifiers and different sets of features selected by the common spatial patterns method. Our results indicate that further improvement in BCIs based on speech imagery could be achieved by carefully selecting an appropriate combination of classifiers for the subtasks involved.
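The idea that different subtasks may call for different classifiers can be illustrated with a pairwise (one-vs-one) decomposition of a three-class problem, where each binary subtask gets its own classifier and the decisions are combined by voting. The toy 1-D data and midpoint-threshold classifiers below are illustrative stand-ins (not the paper's EEG features, common spatial patterns pipeline, or classifier families); this is a minimal sketch of the subtask-decomposition idea only.

```python
from statistics import mean
from collections import Counter

def train_pairwise(data):
    """Train one classifier per pair of classes (one-vs-one).

    `data` maps a class label to a list of 1-D feature values. Each
    binary subtask gets its own (here trivially simple) threshold
    classifier; in the paper's setting each subtask could instead be
    assigned a different classifier family chosen for that subtask.
    """
    labels = sorted(data)
    classifiers = {}
    for i, a in enumerate(labels):
        for b in labels[i + 1:]:
            # Decision threshold at the midpoint of the class means.
            thr = (mean(data[a]) + mean(data[b])) / 2
            lo, hi = (a, b) if mean(data[a]) < mean(data[b]) else (b, a)
            classifiers[(a, b)] = (thr, lo, hi)
    return classifiers

def predict(classifiers, x):
    """Majority vote over the pairwise subtask decisions."""
    votes = Counter()
    for thr, lo, hi in classifiers.values():
        votes[lo if x < thr else hi] += 1
    return votes.most_common(1)[0][0]
```

For example, with hypothetical vowel classes `{"a": [0.9, 1.1], "e": [1.9, 2.1], "i": [3.0, 3.2]}`, `predict(train_pairwise(data), 2.0)` returns `"e"`.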
Abstract:
The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer's disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to improving the diagnosis of AD and of its degree of severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects of human behavior have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
Abstract:
We study quantum state tomography, entanglement detection and channel noise reconstruction of propagating quantum microwaves via dual-path methods. The presented schemes make use of the following key elements: propagation channels, beam splitters, linear amplifiers and field quadrature detectors. Remarkably, our methods are tolerant to the ubiquitous noise added to the signals by phase-insensitive microwave amplifiers. Furthermore, we analyse our techniques with numerical examples and experimental data, and compare them with the scheme developed in Eichler et al (2011 Phys. Rev. Lett. 106 220503; 2011 Phys. Rev. Lett. 107 113601), based on a single path. Our methods provide key toolbox components that may pave the way towards quantum microwave teleportation and communication protocols.
Abstract:
Singular Value Decomposition (SVD) is a key linear algebraic operation in many scientific and engineering applications. In particular, many computational intelligence systems rely on machine learning methods involving high-dimensionality datasets that have to be processed quickly for real-time adaptability. In this paper we describe a practical FPGA (Field Programmable Gate Array) implementation of an SVD processor for accelerating the solution of large LSE problems. The design approach has been comprehensive, from algorithmic refinement through numerical analysis to customization for an efficient hardware realization. The processing scheme rests on an adaptive vector rotation evaluator for error regularization that enhances convergence speed with no penalty on solution accuracy. The proposed architecture, which follows a data-transfer scheme, is scalable and based on the interconnection of simple rotation units, allowing a trade-off between occupied area and processing acceleration in the final implementation. This permits the SVD processor to be implemented on both low-cost and high-end FPGAs, according to the final application requirements.
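Rotation-based SVD schemes of the kind the abstract builds on can be sketched in software with a one-sided Jacobi iteration: plane rotations repeatedly orthogonalize pairs of matrix columns, and once all columns are mutually orthogonal their norms are the singular values. This is a plain illustrative sketch under that assumption, not the paper's FPGA architecture or its adaptive rotation evaluator; the function name and parameters are hypothetical.

```python
import math

def jacobi_singular_values(a, sweeps=30, tol=1e-12):
    """Singular values of a small matrix via one-sided Jacobi rotations.

    Arrays of simple, independent rotation units map naturally onto
    hardware, which is the spirit (not the detail) of the rotation-unit
    architecture described above.
    """
    m, n = len(a), len(a[0])
    # Work column-wise: cols[j] is the j-th column of the matrix.
    cols = [[a[r][c] for r in range(m)] for c in range(n)]
    for _ in range(sweeps):
        off = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                ci, cj = cols[i], cols[j]
                alpha = sum(x * x for x in ci)          # ||c_i||^2
                beta = sum(x * x for x in cj)           # ||c_j||^2
                gamma = sum(x * y for x, y in zip(ci, cj))  # c_i . c_j
                off = max(off, abs(gamma))
                if abs(gamma) < tol:
                    continue
                # Plane rotation chosen to zero the inner product gamma.
                theta = 0.5 * math.atan2(2 * gamma, alpha - beta)
                c, s = math.cos(theta), math.sin(theta)
                cols[i] = [c * x + s * y for x, y in zip(ci, cj)]
                cols[j] = [-s * x + c * y for x, y in zip(ci, cj)]
        if off < tol:   # all columns orthogonal: converged
            break
    return sorted((math.sqrt(sum(x * x for x in col)) for col in cols),
                  reverse=True)
```

Since rotations preserve the Frobenius norm and the determinant's magnitude, the result can be sanity-checked against both, e.g. for [[1, 2], [3, 4]] the singular values satisfy s1*s2 = 2 and s1^2 + s2^2 = 30.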
Abstract:
IDOKI SCF Technologies S.L. is a technology-based company, set up in September 2006 in Derio (Biscay), whose main purpose is to develop extraction and purification processes based on supercritical fluid extraction (SFE) technology for food processing, the extraction of natural products and the production of personal care products. IDOKI's researchers have so far worked on many different R&D projects, most of them using this technology. However, the optimization of an SFE method for the different matrices cannot be performed without an analytical method for characterising the extracts obtained in each experiment. Analytical methods are also essential for the quality control of the raw materials to be used and of the final product. This PhD thesis was conceived to tackle this problem; it is therefore based on the development of different analytical methods for the characterisation of the extracts and products. The projects included in this thesis were the following: the extraction of propolis, the recovery of agroindustrial residues (soy and wine) and the dealcoholisation of wine. On the one hand, for the extraction of propolis, several UV-Vis spectroscopic methods were used to measure the antioxidant capacity and the total polyphenol and flavonoid content of the extracts. An SFC method was also developed to measure more specific phenolic compounds. On the other hand, for the recovery of agroindustrial residues, UV-Vis spectroscopy was used to determine the total polyphenol content and two SFC methods were developed to analyse different phenolic compounds.
Extraction methods such as MAE, FUSE and rotary agitation were also evaluated for the characterisation of the raw materials. Finally, for the dealcoholisation of wine, it was necessary to develop SBSE-TD-GC-MS and DHS-TD-GC-MS methods for the analysis of aromas, and a NIR spectroscopic method, aided by chemometrics, for the determination of ethanol content. Most of these methods are now routinely used in IDOKI's lab, together with others not included in this PhD thesis.
Abstract:
Journalism on digital networks, and specifically on the Internet, is a relatively recent phenomenon, whose spread began around 1994, in parallel with that of the World Wide Web. The study of this new communicative phenomenon began simultaneously in several countries. This, in turn, was helped by the new possibilities for communication among academics: electronic mail, predating the WWW, was, and is, one of the most widely used tools of the university community. The spread of these new forms of global communication helped to raise mutual awareness among research groups, making it possible to form increasingly broad and cohesive networks.
Abstract:
221 p. + annexes
Abstract:
306 p.
Abstract:
236 p.
Abstract:
In the problem of one-class classification (OCC), one of the classes, the target class, has to be distinguished from all other possible objects, considered as nontargets. This situation arises in many biomedical problems, for example in diagnosis, image-based tumor recognition, or the analysis of electrocardiogram data. In this paper an approach to OCC based on a typicality test is experimentally compared with reference state-of-the-art OCC techniques (Gaussian, mixture of Gaussians, naive Parzen, Parzen, and support vector data description) using biomedical data sets. We evaluate the ability of the procedures using twelve experimental data sets with not necessarily continuous data. As there are few benchmark data sets for one-class classification, all data sets considered in the evaluation have multiple classes. Each class in turn is considered as the target class, and the units in the other classes are considered as new units to be classified. The results of the comparison show the good performance of the typicality approach, which remains applicable to high-dimensional data; it is worth mentioning that it can be used for any kind of data (continuous, discrete, or nominal), whereas the application of state-of-the-art approaches is not straightforward when nominal variables are present.
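The evaluation protocol described above (each class in turn taken as the target, all other classes supplying the nontarget test points) can be sketched as follows, with a deliberately simple distance-to-centroid acceptance rule standing in for the typicality test and the reference OCC techniques. The data, names and threshold below are hypothetical.

```python
from statistics import mean

def centroid_occ(train, threshold):
    """One-class scorer: accept a point if it lies within `threshold`
    of the centroid of the target-class training data. A deliberately
    simple stand-in for the OCC methods compared in the paper."""
    center = [mean(dim) for dim in zip(*train)]
    def accepts(x):
        dist = sum((a - b) ** 2 for a, b in zip(x, center)) ** 0.5
        return dist <= threshold
    return accepts

def one_class_protocol(classes, threshold=1.5):
    """Each class in turn is the target; all other classes supply the
    nontarget points, mirroring the evaluation protocol above.

    Returns, per class, the target acceptance rate and the nontarget
    (false) acceptance rate of the one-class scorer."""
    results = {}
    for label, points in classes.items():
        accepts = centroid_occ(points, threshold)
        others = [p for lab, pts in classes.items() if lab != label
                  for p in pts]
        tpr = sum(accepts(p) for p in points) / len(points)
        fpr = sum(accepts(p) for p in others) / len(others)
        results[label] = (tpr, fpr)
    return results
```

On two well-separated toy clusters, e.g. `{"A": [(0, 0), (1, 0), (0, 1)], "B": [(5, 5), (6, 5), (5, 6)]}`, this scorer accepts every target point and rejects every nontarget, giving (1.0, 0.0) for each class in turn.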
Abstract:
In recent decades, the creation of new Environmental Specimen Banks (ESBs) has been increasing, owing to the necessity of knowing the effects of pollutants on both the environment and human populations. ESBs analyze and store samples in order to understand the effects of chemicals, emerging substances and environmental changes on biota. For a correct analysis of the effects induced by these variables, biological endpoints, such as biomarkers, need to be added to the endpoints based on the chemical approaches that have been used until now. It is essential to adapt ESBs' sampling strategies in order to enable scientists to apply new biological methods. The present study was performed to obtain biochemical endpoints from samples stored in the BBEBB (Biscay Bay Environmental Biospecimen Bank) of the Marine Station of Plentzia (PIE - UPV/EHU). The main objective of the present work was to study the variability caused in biochemical biomarkers by different processing methods in mussels (Mytilus galloprovincialis) from two localities (Plentzia and Arriluze) with different pollution histories. It can be concluded that the selected biomarkers (glutathione S-transferase and acetylcholinesterase) can be accurately measured in samples stored for years in ESBs. The results also allowed discrimination between the two sampling sites. However, in a further step, threshold levels and baseline values should be characterized for a correct interpretation of the results in relation to the assessment of ecosystem health status.
Abstract:
4th International Workshop on Transverse Polarization Phenomena in Hard Processes (TRANSVERSITY 2014)