14 results for Precipitation methods
in Archivo Digital para la Docencia y la Investigación - Institutional Repository of the Universidad del País Vasco
Abstract:
This paper reviews the methods for measuring the economic cost of conflict. Estimating the economic costs of conflict requires a counterfactual calculation, which makes this a very difficult task. Social researchers have resorted to different estimation methods depending on the particular effect in question. The method used in each case depends on the units being analyzed (firms, sectors, regions or countries), the outcome variable under study (aggregate output, market valuation of firms, market shares, etc.) and data availability (a single cross-section, time series or panel data). This paper reviews existing methods used in the literature to assess the economic impact of conflict: cost accounting, cross-section methods, time series methods, panel data methods, gravity models, event studies, natural experiments and comparative case studies. The paper ends with a discussion of cost estimates and directions for further research.
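The counterfactual logic behind several of the methods this abstract lists (panel data methods, natural experiments) can be illustrated with a minimal difference-in-differences calculation; the function name and the numbers below are illustrative, not taken from the paper.

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the control group's change proxies the
    counterfactual path of the conflict-affected (treated) unit, so the
    estimated effect is the excess change of the treated unit."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Toy example: regional GDP indices before/after a conflict.
effect = diff_in_diff(treated_pre=100.0, treated_post=95.0,
                      control_pre=100.0, control_post=110.0)
# effect = -15.0: the conflict cost about 15 index points
# relative to the counterfactual no-conflict path.
```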
Abstract:
Methods for generating a new population are a fundamental component of estimation of distribution algorithms (EDAs). They serve to transfer the information contained in the probabilistic model to the newly generated population. In EDAs based on Markov networks, methods for generating new populations usually discard information contained in the model in order to gain efficiency. Other methods, like Gibbs sampling, use information about all interactions in the model but are computationally very costly. In this paper we propose new methods for generating new solutions in EDAs based on Markov networks. We introduce approaches based on inference methods for computing the most probable configurations and on model-based template recombination. We show that the application of different variants of inference methods can increase the EDAs' convergence rate and reduce the number of function evaluations needed to find the optimum of binary and non-binary discrete functions.
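As a minimal illustration of the model-sample loop that EDA population-generation methods implement, the sketch below uses a univariate marginal model (UMDA) on the OneMax function; the paper's methods use Markov-network models and inference-based sampling, which this toy deliberately omits.

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, generations=30, seed=0):
    """Minimal univariate EDA (UMDA) maximizing OneMax.

    Each generation: sample a population from the probabilistic model,
    select the best half, and re-estimate per-bit marginal probabilities
    from the selected individuals.
    """
    rng = random.Random(seed)
    probs = [0.5] * n_bits                       # initial model: uniform
    for _ in range(generations):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)          # OneMax fitness = sum of bits
        selected = pop[:n_select]
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]         # model estimation step
        if sum(pop[0]) == n_bits:                # optimum found
            break
    return pop[0]

best = umda_onemax()
```

The generation step (`pop = ...`) is where the paper's proposals differ: instead of sampling each bit independently, they propagate the Markov network's most probable configurations or recombine templates drawn from the model.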
Abstract:
Functional Electrical Stimulation (FES) is a technique that consists of applying electrical current pulses to artificially activate motor nerve fibers and produce muscle contractions in order to achieve functional movements. The main applications of FES are in the rehabilitation field, where the technique is used to aid recovery or to restore lost motor functions. People who benefit from FES are usually patients with neurological disorders that result in motor dysfunction; the most common are stroke and spinal cord injury (SCI) patients. Neuroprostheses are devices based on the FES technique, and their aim is to bridge interrupted or damaged neural paths between the brain and the upper or lower limbs. One of the aims of neuroprostheses is to artificially generate muscle contractions that produce functional movements, and thereby assist impaired people by enabling them to perform activities of daily living (ADL). FES applies current pulses and stimulates nerve fibers by means of electrodes, which can be either implanted or surface electrodes. Both have advantages and disadvantages. Implanted electrodes require open surgery to place them next to the nerve root, so they carry the many disadvantages associated with invasive techniques. In return, as the electrodes are attached to the nerve, they make it easier to achieve selective functional movements. In contrast, surface electrodes are not invasive and are easily attached to or detached from the skin. The main disadvantages of surface electrodes are the difficulty of selectively stimulating nerve fibers and the uncomfortable sensation perceived by users, caused by the sensory nerves located in the skin. Electrical stimulation surface electrode technology has improved significantly over the years and, recently, multi-field electrodes have been proposed.
This multi-field or matrix electrode approach brings many advantages to FES; among them is the possibility of easily applying different stimulation methods and techniques. The main goal of this thesis is, therefore, to test two stimulation methods, asynchronous and synchronous stimulation, in the upper limb with multi-field electrodes. To this end, a purpose-built wrist torque measuring system and a graphical user interface were developed to measure the wrist torque produced with each of the methods and to carry out the experiments efficiently. Both methods were then tested on 15 healthy subjects and sensitivity results were analyzed for different cases. Results show that in some cases there are significant differences between the methods regarding sensation, which can affect the effectiveness or success of FES.
Abstract:
Accurate and fast decoding of speech imagery from electroencephalographic (EEG) data could serve as a basis for a new generation of brain-computer interfaces (BCIs) that are more portable and easier to use. However, decoding speech imagery from EEG is a hard problem due to many factors. In this paper we focus on the analysis of the classification step of speech imagery decoding for a three-class vowel speech imagery recognition problem. We empirically show that different classification subtasks may require different classifiers for accurate decoding, and we obtain a classification accuracy that improves on the best previously published results. We further investigate the relationship between the classifiers and different sets of features selected by the common spatial patterns method. Our results indicate that further improvement in BCIs based on speech imagery could be achieved by carefully selecting an appropriate combination of classifiers for the subtasks involved.
Abstract:
The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer's disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to improving the diagnosis of AD and of its degree of severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human dimensions have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
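The abstract mentions Fractal Dimension as a non-linear speech feature. One common estimator for 1-D signals is Higuchi's method, sketched below; the thesis does not specify which estimator was used, so treating it as Higuchi's is an assumption.

```python
import math

def higuchi_fd(signal, k_max=8):
    """Higuchi fractal dimension of a 1-D signal: compute average curve
    lengths L(k) at time scales k = 1..k_max and fit the slope of
    log L(k) versus log(1/k), which estimates the fractal dimension
    (about 1 for a smooth trend, approaching 2 for white noise)."""
    n = len(signal)
    log_k, log_l = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                       # one sub-series per offset
            pts = [signal[i] for i in range(m, n, k)]
            if len(pts) < 2:
                continue
            dist = sum(abs(pts[i + 1] - pts[i]) for i in range(len(pts) - 1))
            norm = (n - 1) / (len(pts) - 1) / k  # Higuchi normalization
            lengths.append(dist * norm / k)
        log_k.append(math.log(1.0 / k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # Least-squares slope of log L(k) against log(1/k).
    mk = sum(log_k) / len(log_k)
    ml = sum(log_l) / len(log_l)
    return (sum((x - mk) * (y - ml) for x, y in zip(log_k, log_l))
            / sum((x - mk) ** 2 for x in log_k))
```

For a pure linear ramp the estimator returns exactly 1, which is a convenient sanity check before applying it to real speech frames.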
Abstract:
In this paper, reanalysis fields from the ECMWF have been statistically downscaled to predict surface moisture flux and daily precipitation from large-scale atmospheric fields at two observatories (Zaragoza and Tortosa, Ebro Valley, Spain) during the 1961-2001 period. Three types of downscaling models have been built: (i) analogues, (ii) analogues followed by random forests and (iii) analogues followed by multiple linear regression. The inputs consist of data (predictor fields) taken from the ERA-40 reanalysis. The predicted fields are precipitation and surface moisture flux as measured at the two observatories. With the aim of reducing the dimensionality of the problem, the ERA-40 fields have been decomposed using empirical orthogonal functions. The available daily data have been divided into two parts: a training period (1961-1996), used to find a group of about 300 analogues to build the downscaling model, and a test period (1997-2001), in which the models' performance has been assessed using independent data. In the case of surface moisture flux, the models based on analogues followed by random forests do not clearly outperform those built on analogues plus multiple linear regression, while simple averages calculated from the nearest analogues found in the training period yielded only slightly worse results. In the case of precipitation, the three types of model performed equally. These results suggest that most of the models' downscaling capability can be attributed to the analogue-calculation stage.
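The analogue stage that the abstract credits with most of the downscaling skill amounts to a nearest-neighbour average in the space of large-scale predictors (here, EOF coefficients). The sketch below shows that idea only; variable names and numbers are illustrative, not the paper's data.

```python
import math

def analogue_predict(train_X, train_y, x_new, k=3):
    """Analogue downscaling: average the observed predictand (e.g. daily
    precipitation) over the k training days whose large-scale predictor
    patterns (e.g. leading EOF coefficients) are closest to x_new in
    Euclidean distance."""
    dists = sorted((math.dist(x, x_new), y) for x, y in zip(train_X, train_y))
    return sum(y for _, y in dists[:k]) / k

# Toy example: 2-D "EOF coefficients" -> observed precipitation (mm).
train_X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
train_y = [0.0, 2.0, 4.0, 20.0]
print(analogue_predict(train_X, train_y, (0.2, 0.1), k=3))  # prints 2.0
```

The paper's variants (ii) and (iii) would replace the plain average with a random forest or a multiple linear regression fitted on the selected analogues.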
Abstract:
We study quantum state tomography, entanglement detection and channel noise reconstruction of propagating quantum microwaves via dual-path methods. The presented schemes make use of the following key elements: propagation channels, beam splitters, linear amplifiers and field quadrature detectors. Remarkably, our methods are tolerant to the ubiquitous noise added to the signals by phase-insensitive microwave amplifiers. Furthermore, we analyse our techniques with numerical examples and experimental data, and compare them with the scheme developed in Eichler et al (2011 Phys. Rev. Lett. 106 220503; 2011 Phys. Rev. Lett. 107 113601), based on a single path. Our methods provide key toolbox components that may pave the way towards quantum microwave teleportation and communication protocols.
Abstract:
280 p. : ill.
Abstract:
Lignosulphonates (LS) and fermentable sugars are the main components of the spent sulphite liquors (SSL) produced in acid sulphite pulping. Although different methods, such as precipitation or vaporization, have been used for spent liquor fractionation, membrane technology allows the separation of these components from the SSL on the basis of their different molecular weights, offering great advantages over the traditional methods (lower energy consumption, highly selective separation, and many others). In the present study, ceramic membranes with different cut-offs (15 kDa, 5 kDa and 1 kDa) were used to achieve sugar purification and LS concentration. The membranes were evaluated according to their efficacy and efficiency. Different series configurations were tested in order to improve on the capabilities of a single membrane. The system with the three membranes in series (15, 5 and 1 kDa, respectively) produced the most purified permeate stream with regard to sugar content. In addition, a characterisation of the LS contained in the different streams produced by this system was carried out in order to determine more precisely the valorisation potential of these components by means of biorefinery processes.
Abstract:
IDOKI SCF Technologies S.L. is a technology-based company, set up in September 2006 in Derio (Biscay), whose main scope is the development of extraction and purification processes based on supercritical fluid extraction (SFE) technology for food processing, the extraction of natural products and the production of personal care products. IDOKI's researchers have so far worked on many different R&D projects, most of them using this technology. However, the optimization of an SFE method for the different matrices cannot be performed without an analytical method for the characterisation of the extracts obtained in each experiment. Analytical methods are also essential for the quality control of the raw materials to be used and of the final product. This PhD thesis was born to tackle this problem and is therefore based on the development of different analytical methods for the characterisation of the extracts and products. The projects included in this thesis were the following: the extraction of propolis, the recovery of agroindustrial residues (soy and wine) and the dealcoholisation of wine. On the one hand, for the extraction of propolis, several UV-Vis spectroscopic methods were used to measure the antioxidant capacity and the total polyphenol and flavonoid content of the extracts. An SFC method was also developed to measure more specific phenolic compounds. On the other hand, for the recovery of agroindustrial residues, UV-Vis spectroscopy was used to determine the total polyphenol content and two SFC methods were developed to analyse different phenolic compounds.
Extraction methods such as MAE, FUSE and rotary agitation were also evaluated for the characterisation of the raw materials. Finally, for the dealcoholisation of wine, it was necessary to develop SBSE-TD-GC-MS and DHS-TD-GC-MS methods for the analysis of aromas, as well as an NIR spectroscopic method, aided by chemometrics, for the determination of ethanol content. Most of these methods are now routinely used in IDOKI's lab, along with others not included in this PhD thesis.
Abstract:
Journalism on digital networks, and specifically on the Internet, is a relatively recent phenomenon whose spread began around 1994, in parallel with that of the World Wide Web. The study of this new communicative phenomenon began simultaneously in several countries. This, in turn, was helped by the new possibilities for communication among academics: electronic mail, predating the WWW, was, and is, one of the most widely used tools of the university community. The spread of these new forms of global communication helped to raise mutual awareness among research groups, making it possible to form increasingly broad and cohesive networks.
Abstract:
221 p. + annexes
Abstract:
306 p.
Abstract:
236 p.