928 results for Weights initialization
Abstract:
We consider the problem of reconstructing multiple correlated sparse signals and propose a new implementation of structured sparsity through a reweighting scheme. We present a particular application to diffusion Magnetic Resonance Imaging data and show how this procedure can be used for fibre orientation reconstruction in the white matter of the brain. In that framework, our structured sparsity prior can be used to exploit the fundamental coherence between fibre directions in neighbouring voxels. Our method approaches the ℓ0 minimisation through a reweighted ℓ1-minimisation scheme. The weights are defined in such a way as to promote correlated sparsity between neighbouring signals.
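As a rough illustration of this kind of scheme (not the paper's exact formulation), the sketch below runs a reweighted ℓ1 loop over several signals that share a dictionary, with per-coefficient weights driven by the joint magnitude of each coefficient across the neighbouring signals; the dictionary `A`, the regularisation parameter `lam` and the specific coupling are illustrative assumptions.

```python
import numpy as np

def weighted_ista(A, y, w, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||Ax - y||^2 + lam * sum_i w_i*|x_i| by ISTA."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)  # weighted soft-threshold
    return x

def reweighted_correlated_l1(A, Y, lam=0.1, eps=1e-3, n_rounds=5):
    """Reweighted l1 over several correlated signals (columns of Y)."""
    n_atoms, n_signals = A.shape[1], Y.shape[1]
    X = np.zeros((n_atoms, n_signals))
    w = np.ones(n_atoms)                     # first round: plain l1
    for _ in range(n_rounds):
        for k in range(n_signals):
            X[:, k] = weighted_ista(A, Y[:, k], w, lam)
        # small joint support across the signals -> large penalty in the next round
        w = 1.0 / (np.sqrt((X ** 2).sum(axis=1)) + eps)
    return X
```

Coefficients whose supports agree across the signals receive small weights and are penalised less in the next round, which is how correlated sparsity would be promoted in such a scheme.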
Abstract:
We propose a task for eliciting attitudes toward risk that is close to real-world risky decisions, which typically involve gains and losses. The task consists of accepting or rejecting gambles that provide a gain with probability p and a loss with probability 1−p. We employ finite mixture models to uncover heterogeneity in risk preferences and find that (i) behavior is heterogeneous, with one half of the subjects behaving as expected utility maximizers, (ii) for the others, reference-dependent models perform better than those where subjects derive utility from final outcomes, (iii) models with sign-dependent decision weights perform better than those without, and (iv) there is no evidence for loss aversion. The procedure is sufficiently simple that it can easily be used in field or lab experiments where risk elicitation is not the main focus.
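As a sketch of how such a gamble could be valued under a reference-dependent model with sign-dependent decision weights, the snippet below uses standard prospect-theory-style functional forms; the parameter values and the specific weighting function are illustrative assumptions, not the models estimated in the paper.

```python
def prob_weight(p, gamma=0.65):
    """Inverse-S probability weighting (Tversky-Kahneman form, illustrative gamma)."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def value(x, alpha=0.88, beta=0.88, lam=1.0):
    """Reference-dependent value function; lam > 1 would encode loss aversion."""
    return x**alpha if x >= 0 else -lam * (-x)**beta

def accept_gamble(gain, loss, p):
    """Accept the gamble (gain with prob. p, loss with prob. 1-p) if its
    sign-dependent weighted valuation at the zero reference point is positive."""
    v = prob_weight(p) * value(gain) + prob_weight(1.0 - p) * value(-abs(loss))
    return v > 0

print(accept_gamble(gain=20.0, loss=10.0, p=0.5))   # -> True with these parameters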
Abstract:
Satellite transmitters and geographic-positioning-system devices often add substantial mass to birds to which they are attached. Studies on the effects of such instruments have focused on indirect measures, whereas the direct influence of extra mass on pelagic behavior is poorly known. We used 2.5-g geolocators to investigate the effect of extra mass on the pelagic behavior of Cory's Shearwaters (Calonectris diomedea) by comparing the traits of a single foraging trip among a group carrying 30-g weights, a group carrying 60-g weights, and a control group. The weights were attached to the birds' backs using typical techniques for attaching satellite transmitters to seabirds. The extra mass increased the duration of the birds' trips and decreased their foraging efficiency and mass gained at sea. These indirect effects may be related to foraging traits: weighted birds showed a greater search effort than control birds, traveled greater distances, covered a greater foraging area, and increased the maximum foraging range. Furthermore, the time spent on the sea surface at night was greater for weighted than for control groups, which showed that the extra mass also affected activity patterns. Our results underline the need to quantify the effects of monitoring equipment commonly used to study the pelagic behavior of seabirds. We suggest that geolocators can be used to obtain control data on foraging-trip movements and activity patterns.
Abstract:
This thesis studies the properties and usability of operators called t-norms, t-conorms and uninorms, as well as many-valued implications and equivalences. Weights and a generalized mean are embedded into these operators for aggregation, and because the resulting operators are used for comparison tasks they are referred to as comparison measures. The thesis illustrates how these operators can be weighted with differential evolution and aggregated with a generalized mean, and what kinds of comparison measures can be obtained from this procedure. New operators suitable for comparison measures are suggested: combination measures based on the use of t-norms and t-conorms, the generalized 3_-uninorm, and pseudo-equivalence measures based on S-type implications. The empirical part of the thesis demonstrates how these new comparison measures work in the field of classification, for example in the classification of medical data. The second application area is in sports medicine, where an expert system is used to determine an athlete's aerobic and anaerobic thresholds. The core of the thesis offers definitions for comparison measures and shows that there is no actual difference between the results achieved in comparison tasks by comparison measures based on distance and by those based on many-valued logical structures. The approach in this thesis has been highly practical, and all use of the measures has been validated mainly by practical testing. In general, many different types of operators suitable for comparison tasks have been presented in the fuzzy logic literature, but there has been little or no experimental work with these operators.
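A minimal sketch of what a weighted, generalized-mean-aggregated comparison measure can look like is given below; the componentwise equivalence, the example weights and the exponent m are illustrative choices (in the thesis the weights would be tuned by differential evolution), not the operators actually proposed.

```python
import numpy as np

def pseudo_equivalence(a, b):
    """A simple many-valued equivalence on [0, 1] (illustrative choice)."""
    return 1.0 - np.abs(a - b)

def weighted_generalized_mean(values, weights, m=1.0):
    """(sum_i w_i * v_i^m)^(1/m) with normalised weights; m must be non-zero."""
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()
    return (weights * np.asarray(values, float) ** m).sum() ** (1.0 / m)

def comparison_measure(x, y, weights, m=1.0):
    """Compare two feature vectors in [0, 1]^n: apply the equivalence
    componentwise, then aggregate with a weighted generalized mean."""
    eq = pseudo_equivalence(np.asarray(x, float), np.asarray(y, float))
    return weighted_generalized_mean(eq, weights, m)

# A classifier could assign a sample to the class whose ideal vector
# maximises this measure.
print(comparison_measure([0.2, 0.8, 0.5], [0.3, 0.7, 0.5], weights=[1, 2, 1], m=2.0))
```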
Abstract:
In addition to good runnability, gloss and smoothness, LWC rotogravure paper is required to have good opacity. This has posed challenges for LWC paper manufacturers as paper basis weights have decreased. This master's thesis sought ways to improve the opacity of lightweight LWC rotogravure grades without substantially impairing other important paper properties. The aim was to raise the opacity of the CR48 grade to the target value of 90%. The literature part of the work reviewed the theory of the optical properties of paper as well as the raw materials and process stages that affect paper opacity. The experimental part examined, on the basis of existing data, the factors believed to influence the opacity of the CR48 grade. Based on this study and the literature, mill trial runs were carried out with the aim of improving paper opacity. The opacity target of the CR48 grade was achieved in three different ways. The target was reached when the brightness of the paper was adjusted to its target value with pigment dye instead of dark mechanical pulp; in this case the amount of color pigment in the coating paste was increased by 0.01 parts and bleached mechanical pulp made up 100% of the total mechanical pulp. In practice, adjusting brightness with the coating color was slow and difficult. The opacity target was also reached when the mechanical pulp was refined entirely with trial refiner plates. Refining with the trial plates was harsher and cut the fibres more than refining with conventional plates, so the increase in fines and the shortening of the fibres improved paper opacity, but strength properties deteriorated. In addition, the target opacity was reached when the share of chemical pulp was reduced by 8 percentage points. In terms of preserving strength, reducing the chemical pulp content was a better way to improve opacity than refining the mechanical pulp with the trial plates. Based on the trial runs, raising the ash content of the base paper and lowering the CSF value of the mechanical pulp had no effect on paper opacity. Furthermore, mechanical pulp made entirely from sawmill chips and refined with the trial plates gave the paper a lower opacity than pulp of which half had been refined with the trial plates and whose raw material contained 25% sawmill chips.
Abstract:
Perfluoro and sulfonated ion-exchange polymers are recognized as very useful materials for various mechanistic studies and applications in electrochemistry. These polymers are characterized by high equivalent weights and by a low number of ion-exchange sites interposed between long organic chains. Their solubility enables the preparation of stable polyelectrolyte films on the electrode surface. Examples of the determination of trace metals and organic compounds in real environmental samples are presented.
Abstract:
The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for machine design variables. The goals of minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, and of maximizing the momentary heat recovery output with the given constraints satisfied, while taking into account the uncertainty in the models, were successfully met. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of the system is presented and implemented in the Matlab environment. Markov Chain Monte Carlo (MCMC) methods are also used to take the uncertainty in the models into account. Results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger even though its construction is more expensive (160 EUR/m2) than that of a dry heat exchanger (50 EUR/m2). It was found that a longer lifetime gives more weight to higher CAPEX and lower OPEX, and vice versa, and the effect of the uncertainty in the models was identified in a simplified case of minimizing the area of a dry heat exchanger.
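To make the two-objective setup concrete, the toy sketch below packages a life-cycle-cost objective (the price of saved energy) and a heat-recovery objective in the minimization form an NSGA-II implementation expects. The construction costs are the figures quoted in the abstract, but the heat-recovery and maintenance models and all other numbers are placeholders, not the thesis's models.

```python
def objectives(area_m2, lifetime_yr, wet=True):
    """Return the two objectives to be minimised for one candidate design."""
    capex = (160.0 if wet else 50.0) * area_m2             # construction cost, EUR (abstract)
    opex = 0.10 * capex * lifetime_yr                      # placeholder maintenance model
    recovered_kwh = 8760.0 * lifetime_yr * 0.4 * area_m2   # placeholder heat-recovery model
    lcc_per_kwh = (capex + opex) / recovered_kwh           # "price of the saved energy"
    return lcc_per_kwh, -recovered_kwh                     # NSGA-II minimises both entries

print(objectives(area_m2=12.0, lifetime_yr=15))
```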
Abstract:
Calculation of the uncertainty of results represents the new paradigm in the area of the quality of measurements in laboratories. The ISO (International Organization for Standardization) Guide to the Expression of Uncertainty in Measurement assumes that the analyst is asked to give a parameter that characterizes the range of values that could reasonably be associated with the result of the measurement. In practice, the uncertainty of the analytical result may arise from many possible sources: sampling, sample preparation, matrix effects, equipment, standards and reference materials, among others. This paper suggests a procedure for calculating the uncertainty components of an analytical result due to sample preparation (uncertainty of weights and volumetric equipment) and the instrument analytical signal (calibration uncertainty). A numerical example is carefully explained, based on measurements obtained for cadmium determination by flame atomic absorption spectrophotometry. The results obtained for the components of the total uncertainty showed that the main contribution to the analytical result came from the calibration procedure.
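A minimal sketch of this kind of uncertainty budget is shown below: each component (weighing, volumetric equipment, calibration) is expressed as a relative standard uncertainty and combined in quadrature, GUM-style. All numerical values are invented placeholders, not the paper's cadmium data; with these placeholders the calibration term happens to dominate, which is consistent with the paper's conclusion.

```python
import math

m_sample_g   = 0.5000   # sample mass
u_weighing_g = 0.0002   # balance / weights contribution
v_flask_ml   = 50.0     # volumetric flask
u_flask_ml   = 0.04     # volumetric-equipment contribution
c_read       = 0.120    # concentration read from the calibration curve, mg/L
u_cal        = 0.004    # calibration (regression) uncertainty, mg/L

# Express each component as a relative standard uncertainty and combine in quadrature
rel = [u_weighing_g / m_sample_g, u_flask_ml / v_flask_ml, u_cal / c_read]
u_c_rel = math.sqrt(sum(r ** 2 for r in rel))

result = c_read * v_flask_ml / (m_sample_g * 1000.0)   # analyte content, mg per g of sample
print(f"result = {result:.4f} mg/g, expanded uncertainty (k=2) = {2 * u_c_rel * result:.4f} mg/g")
```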
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray level images are presented, and as a new application for distance transforms they are applied to gray level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-valued distance transform (EDTOCS) on gray level images. Both distance transforms, the DTOCS and EDTOCS, require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map in which the weights are not constant but are the gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way; it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally, and it is shown to be independent of the number of control points, i.e. the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
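The sketch below is a rough two-pass implementation in the spirit of the DTOCS as described above, assuming the local step cost between neighbouring pixels is their gray value difference plus one chessboard step; this is a reading of the abstract, not the thesis's reference implementation, and the kernel and iteration handling are simplifications.

```python
import numpy as np

def dtocs(gray, calc_region, n_rounds=3):
    """Two-pass DTOCS-style transform (sketch).

    gray        : 2-D array of gray levels.
    calc_region : boolean mask; True where distances are computed,
                  False pixels act as zero-distance seed points.
    """
    big = 10 ** 9
    dist = np.where(calc_region, big, 0).astype(np.int64)
    h, w = gray.shape
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # neighbours already visited top-down
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # neighbours already visited bottom-up

    def sweep(ys, xs, offsets):
        for y in ys:
            for x in xs:
                if not calc_region[y, x]:
                    continue
                best = dist[y, x]
                for dy, dx in offsets:
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        # local cost: gray value difference plus one chessboard step
                        step = abs(int(gray[y, x]) - int(gray[ny, nx])) + 1
                        best = min(best, dist[ny, nx] + step)
                dist[y, x] = best

    for _ in range(n_rounds):   # complicated images may need several rounds
        sweep(range(h), range(w), fwd)
        sweep(range(h - 1, -1, -1), range(w - 1, -1, -1), bwd)
    return dist
```

Under this sketch, selecting local maxima of the resulting distance image would give candidate control points of the kind the compression methods use.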
Abstract:
Objectives: The objective of this study is to review the set of criteria of the Institute of Medicine (IOM) for priority-setting in research, with the addition of new criteria if necessary, and to develop and evaluate the reliability and validity of the final priority score. Methods: Based on the evaluation of 199 research topics, forty-five experts identified additional criteria for priority-setting, rated their relevance, and ranked and weighted them in a three-round modified Delphi technique. A final priority score was developed and evaluated. Internal consistency, test–retest and inter-rater reliability were assessed. Correlation with experts’ overall qualitative topic ratings was assessed as an approximation to validity. Results: All seven original IOM criteria were considered relevant and two new criteria were added (“potential for translation into practice” and “need for knowledge”). Final ranks and relative weights differed from those of the original IOM criteria: “research impact on health outcomes” was considered the most important criterion (4.23), as opposed to “burden of disease” (3.92). Cronbach’s alpha (0.75) and test–retest stability (intraclass correlation coefficient = 0.66) for the final set of criteria were acceptable. The area under the receiver operating characteristic curve for the overall assessment of priority was 0.66. Conclusions: A reliable instrument for prioritizing topics in clinical and health services research has been developed. Further evaluation of its validity and impact on selecting research topics is required.
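As a small illustration of how such a weighted priority score can be computed, the snippet below takes per-criterion ratings for one topic and returns their weighted average. Only the two weights quoted in the abstract are real; the remaining criteria weights, the rating scale and the example topic are placeholders.

```python
# Criteria and weights; 4.23 and 3.92 come from the abstract, the rest are placeholders.
criteria_weights = {
    "research impact on health outcomes": 4.23,
    "burden of disease": 3.92,
    "potential for translation into practice": 3.5,   # placeholder weight
    "need for knowledge": 3.4,                        # placeholder weight
}

def priority_score(ratings):
    """Weighted average of per-criterion ratings (e.g. on a 1-5 scale)."""
    total_w = sum(criteria_weights.values())
    return sum(criteria_weights[c] * ratings[c] for c in criteria_weights) / total_w

topic = {"research impact on health outcomes": 4,
         "burden of disease": 3,
         "potential for translation into practice": 5,
         "need for knowledge": 2}
print(round(priority_score(topic), 2))   # -> 3.52 with these placeholder ratings
```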
Abstract:
Nowadays, the high competitiveness of the market forces companies to make the most of their capabilities so as not to fall behind. One of the processes where this is most apparent is production. The company JCM Technologies also works in this field, and it is in one of its production processes that this project takes part. The objective of this final degree project has been to develop a system for marking boxes using a CO2 laser and an automated box-handling mechanism. In this way the production process takes much less time than the old process, which consisted of sticking a label in the place that is now marked by the laser. To meet these objectives, a Windows application has been created that, through a graphical interface, allows the user to carry out the steps needed for marking. First, the data from the order are collected; then the data to be marked on the boxes are selected and sent to the laser via serial communication; once this initialization has completed successfully, the box-marking sequence is started, which marks the indicated quantity of boxes. This marking process consists of monitoring the state of certain signals coming from the automation and the laser and, depending on these, generating others, thereby carrying out the marking of each box. In conclusion, the objectives have been met, since a fast and robust marking process has been achieved. The configuration of the application and of the boxes has also been made easy to handle. Therefore, by fulfilling the objectives of this project, we complete the marking system and allow the boxes to be marked correctly, quickly and efficiently.
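A very rough sketch of the sequence described above (serial initialization followed by a signal-supervised marking loop) is given below. The port name, the serial message format and all signal names are hypothetical, since the abstract does not describe the real laser protocol or the automation I/O.

```python
import serial   # pyserial

def read_signal(name):
    """Placeholder for the digital-input layer of the box-handling automation."""
    return True

def write_signal(name, value):
    """Placeholder for the digital-output layer (e.g. the marking trigger)."""
    pass

def mark_boxes(fields, quantity, port="COM3"):
    laser = serial.Serial(port, 9600, timeout=1)    # initialisation over the serial line
    for field in fields:                            # send the order data to be engraved
        laser.write((field + "\r\n").encode())
    marked = 0
    while marked < quantity:                        # marking sequence, one box at a time
        if read_signal("BOX_IN_POSITION") and read_signal("LASER_READY"):
            write_signal("START_MARK", True)
            while not read_signal("MARK_DONE"):     # wait for the laser to finish
                pass
            write_signal("START_MARK", False)
            marked += 1
    laser.close()
```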
Abstract:
This GUIXDOS is the continuation of two earlier monographs that presented games for working on other thematic blocks of mathematics: logical-mathematical reasoning (Alsina, 2002a) and calculation (Alsina, 2002b). On this occasion, a small range of playful resources is offered that can facilitate, and make more motivating, the learning of various continuous magnitudes at the primary education stage: length, area, capacity, mass, money and time. First, a brief approach to the concept of measurement is presented, together with the main competences that, in the author's view, should be progressively attained at the primary education stage, and some fundamental methodological guidelines. Second, and as is usual in GUIXDOS, several resources are presented to be applied directly in the classroom.
Abstract:
The present work discusses the emergence of the concepts of valence and molecular structure, and describes the appropriation and evolution of the concept of molecule in the period following the publication of Avogadro's Hypothesis. The point of reference is the development of what became known as Organic Chemistry, which encompassed Pharmacy, Physiological Chemistry, Animal and Plant Chemistry, Chemistry of Dyestuffs, Agricultural Chemistry, and the fledgling Organic Synthesis industry in the early 19th century. The theories formulated in these areas and the quest for accurate atomic weights led to the concepts of valence and molecular structure and to a precise differentiation between atom and molecule.
Abstract:
This paper describes the development and validation of a simple and selective analytical method for the determination of 3,4-methylenedioxymethamphetamine (MDMA) in Ecstasy tablets, using high performance liquid chromatography with fluorescence detection. Analysis was performed on a reversed-phase column (LiChrospher 100 C18, 150 x 4.6 mm, 5 µm) with isocratic elution using 25 mmol/L phosphate buffer pH 3.0 and acetonitrile (95:5, v/v). The method presents adequate linearity, selectivity, precision and accuracy. The MDMA concentration in the analyzed tablets showed remarkable variability (from 8.5 to 59.5 mg/tablet) even though the tablet weights were uniform, indicating poor manufacturing control and thus imposing additional health risks on users.
Abstract:
Raw measurement data does not always immediately convey useful information, but applying mathematical and statistical analysis tools to the measurement data can improve the situation. Data analysis can offer benefits such as acquiring meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Later on, linear regression is used to build a model based on a subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both SB and linear regression models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, it is concluded that no single model can fit the whole available dataset well, and it is therefore proposed as future work to build piecewise nonlinear regression models if the same dataset is used, or for the plant to provide another dataset, collected in a more systematic fashion than the present data, for further analysis.
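A minimal sketch of the two-stage modelling idea (sparse Bayesian weights used to pick variables, then an ordinary linear regression on that subset) is shown below. scikit-learn's ARDRegression is used here as a stand-in for the thesis's SB method, and the data are synthetic placeholders, not the plant data.

```python
import numpy as np
from sklearn.linear_model import ARDRegression, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))                                        # process variables
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + rng.normal(scale=0.5, size=200)  # final-product quality

sb = ARDRegression().fit(X, y)                 # sparse Bayesian-style fit (ARD prior)

top = np.argsort(np.abs(sb.coef_))[-3:]        # variables with the most significant SB weights

lin = LinearRegression().fit(X[:, top], y)     # plain linear model on the selected subset
print("selected variables:", top, "R^2:", round(lin.score(X[:, top], y), 3))
```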