974 results for Binary
Abstract:
The size of online image datasets is constantly increasing. For a dataset with millions of images, image retrieval becomes seemingly intractable for exhaustive similarity-search algorithms. Hashing methods, which encode high-dimensional descriptors into compact binary strings, have become very popular because of their efficiency in both search and storage. In the first part, we propose a multimodal retrieval method based on latent feature models. The procedure consists of a nonparametric Bayesian framework for learning semantically meaningful abstract features underlying a multimodal dataset, a probabilistic retrieval model that allows cross-modal queries, and an extension for relevance feedback. In the second part, we focus on supervised hashing with kernels. We describe a flexible hashing procedure that treats binary codes and pairwise semantic similarity as latent and observed variables, respectively, in a probabilistic model based on Gaussian processes for binary classification. We present a scalable inference algorithm using the sparse pseudo-input Gaussian process (SPGP) model and distributed computing. In the last part, we define an incremental hashing strategy for dynamic databases to which new images are added frequently. The method is based on a two-stage classification framework using binary and multi-class SVMs, and it enforces balance in the binary codes through an imbalance penalty to obtain higher-quality codes. We learn hash functions with an efficient algorithm in which the NP-hard problem of finding optimal binary codes is solved via cyclic coordinate descent and the SVMs are trained in a parallelized, incremental manner. For modifications such as adding images from an unseen class, we propose an incremental procedure for effective and efficient updates to the previous hash functions. Experiments on three large-scale image datasets demonstrate that the incremental strategy efficiently updates hash functions to the same retrieval performance as hashing from scratch.
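As a toy illustration of the general hashing-for-retrieval idea described above (not the thesis's GP- or SVM-based method; random hyperplanes stand in for the learned hash functions, and all sizes and data below are invented):

```python
# Illustrative sketch only: encode descriptors to binary codes with random
# hyperplanes and retrieve neighbors by Hamming distance. The thesis learns
# its hash functions (GPs, SVMs); random projections are a stand-in here.
import numpy as np

rng = np.random.default_rng(0)

def hash_codes(X, W):
    """Encode descriptors X (n x d) into n x b binary codes: sign of X @ W."""
    return (X @ W > 0).astype(np.uint8)

def hamming_search(query_code, db_codes, k=5):
    """Indices of the k database codes closest to query_code in Hamming distance."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists)[:k]

d, b = 128, 32                           # descriptor and code lengths (assumed)
W = rng.standard_normal((d, b))          # random projection hyperplanes
X_db = rng.standard_normal((10_000, d))  # stand-in image descriptors
db_codes = hash_codes(X_db, W)
query = rng.standard_normal((1, d))
print(hamming_search(hash_codes(query, W)[0], db_codes))
```

Because codes are compact bit strings, the Hamming scan touches far less memory than exhaustive comparison of the raw descriptors, which is the efficiency argument the abstract makes.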
Abstract:
Background: Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods that take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods: Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome, measured on the Rankin Scale and the Barthel Index, were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes obtained from each method were compared using Friedman two-way ANOVA. Results: Fifty-five comparisons (54 173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P < 0.0001). The ordering of the methods showed that Whitehead's ordinal method and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal method reduced sample size by 28% on average (inter-quartile range 14–53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. Payne's comparison-of-medians method gave the largest sample sizes. Conclusions: Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, approximately 28% smaller for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions, such as thrombolysis, that improve functional outcome while also increasing early hazard.
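For concreteness, a minimal sketch of Whitehead's (1993) ordinal sample-size formula under a proportional-odds assumption, N = 6(z_{1-a/2} + z_{1-b})^2 / [(ln OR)^2 (1 - sum of cubed mean category proportions)]; the category proportions and odds ratio below are invented, not values from the trials analyzed:

```python
# Hedged sketch of Whitehead's sample-size formula for an ordinal outcome
# under proportional odds; all inputs below are illustrative only.
import numpy as np
from scipy.stats import norm

def whitehead_total_n(p_mean, odds_ratio, alpha=0.05, power=0.90):
    """Total sample size across both arms.

    p_mean     : mean proportion expected in each outcome category
                 (averaged over both arms); must sum to 1
    odds_ratio : anticipated common odds ratio between arms
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    p = np.asarray(p_mean, dtype=float)
    return 6 * z**2 / (np.log(odds_ratio) ** 2 * (1 - np.sum(p**3)))

# e.g. a 3-level outcome (dead / dependent / independent), assumed OR = 1.5
print(round(whitehead_total_n([0.2, 0.4, 0.4], 1.5)))  # ~444 patients
```

The (1 - sum of p^3) factor is why finer outcome categories buy efficiency: spreading patients over more levels shrinks the cubed-proportion sum and hence the required N.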
Abstract:
Background and Purpose—Vascular prevention trials mostly count "yes/no" (binary) outcome events, e.g., stroke/no stroke. Analysis of ordered categorical vascular events (e.g., fatal stroke/nonfatal stroke/no stroke) is clinically relevant and could be more powerful statistically. Although this is not a novel idea in the statistical community, ordinal outcomes have not been applied to stroke prevention trials in the past. Methods—Summary data on stroke, myocardial infarction, combined vascular events, and bleeding were obtained by treatment group from published vascular prevention trials. Data were analyzed using 10 statistical approaches that allow comparison of 2 ordinal or binary treatment groups. The results of each statistical test for each trial were then compared using Friedman 2-way analysis of variance with multiple comparison procedures. Results—Across 85 trials (335 305 subjects) the test results differed substantially, such that approaches which used the ordinal nature of stroke events (fatal/nonfatal/no stroke) were more efficient than those which combined the data to form 2 groups (P < 0.0001). The most efficient tests were bootstrapping the difference in mean rank, the Mann–Whitney U test, and ordinal logistic regression; 4- and 5-level data were more efficient still. Similar findings were obtained for myocardial infarction, combined vascular outcomes, and bleeding. The findings were consistent across different types, designs, and sizes of trial, and for the different types of intervention. Conclusions—When analyzing vascular events from prevention trials, statistical tests which use ordered categorical data are more efficient and are more likely to yield reliable results than binary tests. This approach gives additional information on treatment effects by severity of event and will allow trials to be smaller. (Stroke. 2008;39:000-000.)
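As a sketch of one of the listed approaches, the Mann–Whitney U test can be run directly on 3-level ordinal outcome codes; the arm counts below are invented for illustration:

```python
# Mann-Whitney U test on ordinal outcome codes; hypothetical counts only.
import numpy as np
from scipy.stats import mannwhitneyu

# 0 = no stroke, 1 = nonfatal stroke, 2 = fatal stroke
treated = np.repeat([0, 1, 2], [900, 80, 20])   # invented treatment arm
control = np.repeat([0, 1, 2], [870, 95, 35])   # invented control arm

u, p = mannwhitneyu(treated, control, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.4g}")
```

Unlike collapsing to stroke/no stroke, the rank-based test keeps the fatal/nonfatal distinction, which is exactly the extra information the abstract credits for the efficiency gain.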
Abstract:
Isobaric vapor-liquid equilibria of binary mixtures of isopropyl acetate plus an alkanol (1-propanol, 2-propanol, 1-butanol, or 2-butanol) were measured at 101.32 kPa using a dynamic recirculating still. Azeotropic behavior was observed only in the mixtures isopropyl acetate + 2-propanol and isopropyl acetate + 1-propanol. The application of four thermodynamic consistency tests (the Herington test, the Van Ness test, the infinite dilution test, and the pure component test) confirmed the high quality of the experimental data. Finally, both the NRTL and UNIQUAC activity coefficient models were successfully applied in the correlation of the measured data, with average absolute deviations in vapor-phase composition and temperature of 0.01 (mole fraction) and 0.16 K, respectively.
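For illustration, a sketch of the binary NRTL activity-coefficient model used in such correlations; the tau parameters and alpha below are placeholders, not the values fitted in this work:

```python
# Binary NRTL model: ln(gamma) from interaction parameters tau12, tau21
# and non-randomness factor alpha. Parameter values here are invented.
import numpy as np

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """Return (gamma1, gamma2) for a binary mixture at liquid mole fraction x1."""
    x2 = 1.0 - x1
    G12, G21 = np.exp(-alpha * tau12), np.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return np.exp(ln_g1), np.exp(ln_g2)

print(nrtl_gamma(0.4, tau12=0.5, tau21=0.8))  # placeholder parameters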
Abstract:
Doctorate in Mathematics.
Abstract:
Model misspecification affects the classical test statistics used to assess the fit of Item Response Theory (IRT) models. Robust tests, such as the Generalized Lagrange Multiplier and Generalized Hausman tests, have been derived under model misspecification, but their use has not been widely explored in the IRT framework. In the first part of the thesis, we introduce the Generalized Lagrange Multiplier test to detect differential item functioning in IRT models for binary data under model misspecification. By means of a simulation study and a real data analysis, we compare its performance with the classical Lagrange Multiplier test, computed using the Hessian and the cross-product matrix, and with the Generalized Jackknife Score test. The power of these tests is computed both empirically and asymptotically. The misspecifications considered are local dependence among items and a non-normal distribution of the latent variable. The results highlight that all tests perform well under mild model misspecification, while under strong misspecification their performance deteriorates; none of the tests shows overall superior performance. In the second part of the thesis, we extend the Generalized Hausman test to detect non-normality of the latent variable distribution. To build the test, we consider a semi-nonparametric IRT model that assumes a more flexible latent variable distribution. By means of a simulation study and two real applications, we compare the performance of the Generalized Hausman test with the M2 limited-information goodness-of-fit test and the Likelihood-Ratio test; information criteria are also computed. The Generalized Hausman test outperforms the Likelihood-Ratio test in terms of Type I error rates and the M2 test in terms of power. The performance of the Generalized Hausman test and of the information criteria deteriorates when the sample size is small and the number of items is low.
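For background, a minimal sketch of the two-parameter logistic (2PL) model, a standard IRT model for binary data of the kind whose misspecification these tests target; the parameter values are arbitrary:

```python
# 2PL item response function: probability of a positive response given the
# latent trait theta, item discrimination a and item difficulty b.
import numpy as np

def irf_2pl(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

print(irf_2pl(theta=0.5, a=1.2, b=-0.3))  # arbitrary illustrative values
```

Local dependence among items and a non-normal theta distribution both violate the assumptions under which such a model's classical fit statistics are derived, which is why the robust tests above are of interest.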
(In)Visible: Difficulties, Choices, and Implications in the Italian Adaptation of Non-Binary Characters
Abstract:
As the number of non-binary characters in English-language television series steadily grows, the Italian adaptation of these series presents an interesting field of study, shaped by the clash between these characters' preference for neutral pronouns and expressions and the grammatical structure of Italian, which rests on an exclusive opposition between masculine and feminine. Through a selection of case studies, this work aims to identify the difficulties that inevitably arise in producing the Italian edition of these series, the strategies adopted to render Italian gender-neutral, and the implications of those strategies for non-binary characters. The method consists of comparing the original version and the Italian edition (both dubbing and subtitles) of the lines involving the relevant non-binary characters, followed by an analysis of the differences and similarities found. The three cases considered (Sex Education, One Day at a Time, and Grey's Anatomy) ultimately point to a primary risk: the invisibility of gender non-binarism to the eyes and ears of Italian viewers and, with it, the distortion of the intent and value of the original edition.
Abstract:
Lipidic mixtures present a particular phase-change profile strongly affected by their unique crystalline structure. However, classical solid-liquid equilibrium (SLE) thermodynamic modeling approaches, which assume the solid phase to be a pure component, sometimes fail to describe the phase behavior correctly, and this inadequacy grows with the complexity of the system. To overcome some of these problems, this study describes a new procedure, the Crystal-T algorithm, to depict the SLE of fatty binary mixtures presenting solid solutions. Considering the non-ideality of both the liquid and solid phases, the algorithm determines the temperatures at which the first and the last crystal of the mixture melt. The evaluation focuses on experimental data measured and reported in this work for systems composed of triacylglycerols and fatty alcohols. The liquidus and solidus lines of the SLE phase diagrams were described using excess-Gibbs-energy-based equations, with the group-contribution UNIFAC model used to calculate the activity coefficients of both the liquid and solid phases. Very low deviations between calculated and experimental data evidence the strength of the algorithm, broadening the scope of SLE modeling.
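For contrast with the Crystal-T approach, a sketch of the classical liquidus relation the abstract says can fail for lipidic mixtures: a pure solid phase is assumed and heat-capacity terms are neglected, giving ln(x_i gamma_i) = -(dH_fus,i/R)(1/T - 1/T_m,i). The property values below are invented:

```python
# Classical pure-solid SLE liquidus (heat-capacity terms neglected);
# all property values are illustrative, not data from this work.
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def liquidus_T(x, gamma, dH_fus, T_m):
    """Liquidus temperature from ln(x*gamma) = -(dH_fus/R)(1/T - 1/T_m)."""
    return 1.0 / (1.0 / T_m - R * np.log(x * gamma) / dH_fus)

# e.g. a component with T_m = 330 K and dH_fus = 100 kJ/mol (assumed),
# at liquid mole fraction 0.8, ideal liquid (gamma = 1):
print(liquidus_T(0.8, 1.0, 100e3, 330.0))  # ~328 K, below the pure T_m
```

When the solid is actually a solid solution, the right-hand side must also carry the solid-phase composition and activity coefficient, which is the situation the Crystal-T algorithm addresses.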
Abstract:
Objective: to analyze the effect of treatment approach on the outcomes of newborns (birth weight [BW] < 1,000 g) with patent ductus arteriosus (PDA) from the Brazilian Neonatal Research Network (BNRN): death, bronchopulmonary dysplasia (BPD), severe intraventricular hemorrhage (IVH III/IV), retinopathy of prematurity requiring surgery (ROPsur), necrotizing enterocolitis requiring surgery (NECsur), and death/BPD. Methods: this was a multicenter cohort study with retrospective data collection, including newborns (BW < 1,000 g) with gestational age (GA) < 33 weeks and an echocardiographic diagnosis of PDA, from 16 neonatal units of the BNRN, from January 1, 2010 to December 31, 2011. Newborns who died or were transferred within the first three days of life, and those with congenital malformation or infection, were excluded. Groups: G1 - conservative approach (no treatment); G2 - pharmacologic treatment (indomethacin or ibuprofen); G3 - surgical ligation (regardless of previous treatment). Factors analyzed: antenatal corticosteroid, cesarean section, BW, GA, 5-min Apgar score < 4, male gender, Score for Neonatal Acute Physiology Perinatal Extension (SNAPPE II), respiratory distress syndrome (RDS), late sepsis (LS), mechanical ventilation (MV), surfactant (< 2 h of life), and duration of MV. Outcomes: death, O2 dependence at 36 weeks (BPD36wks), IVH III/IV, ROPsur, NECsur, and death/BPD36wks. Statistical analysis: Student's t-test, chi-squared test, or Fisher's exact test; odds ratios (95% CI); binary logistic regression and backward stepwise multiple regression, using MedCalc (Medical Calculator) software, version 12.1.4.0; p-values < 0.05 were considered statistically significant. Results: 1,097 newborns were screened and 494 were included: G1 - 187 (37.8%), G2 - 205 (41.5%), and G3 - 102 (20.6%). Mortality was highest in G1 (51.3%) and lowest in G3 (14.7%). The highest frequencies of BPD36wks (70.6%) and ROPsur (23.5%) were observed in G3. The lowest occurrence of death/BPD36wks was in G2 (58.0%). Pharmacologic (OR 0.29; 95% CI: 0.14-0.62) and conservative (OR 0.34; 95% CI: 0.14-0.79) treatment were protective for the outcome death/BPD36wks. Conclusions: the conservative approach to PDA was associated with high mortality, the surgical approach with BPD36wks and ROPsur, and pharmacologic treatment was protective for the outcome death/BPD36wks.
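A sketch of the standard odds-ratio-with-95%-CI arithmetic behind figures such as "OR 0.29; 95% CI: 0.14-0.62"; the 2x2 counts below are invented, and the study's actual estimates came from logistic regression:

```python
# Odds ratio with Wald 95% CI from a 2x2 table; counts are hypothetical.
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    rows = treated/untreated, columns = event/no event (a,b / c,d)."""
    log_or = np.log((a * d) / (b * c))
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    return np.exp(log_or), np.exp(log_or - z*se), np.exp(log_or + z*se)

print(odds_ratio_ci(20, 185, 50, 150))  # invented counts; OR < 1 = protective
```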
Abstract:
We report the first measurement of charmed-hadron (D0) production via the hadronic decay channel D0 → K− + π+ in Au+Au collisions at √s_NN = 200 GeV with the STAR experiment. The charm production cross section per nucleon-nucleon collision at midrapidity scales with the number of binary collisions, N_bin, from p+p to central Au+Au collisions. The D0 meson yields in central Au+Au collisions are strongly suppressed compared to those in p+p scaled by N_bin for transverse momenta p_T > 3 GeV/c, demonstrating significant energy loss of charm quarks in the hot and dense medium. An enhancement at intermediate p_T is also observed. Model calculations including strong charm-medium interactions and coalescence hadronization describe our measurements.
Abstract:
The lateral pterygoid muscle (LPM) plays an important role in jaw movement and has been implicated in temporomandibular disorders (TMDs). Migraine has been described as a common symptom in patients with TMDs and may be related to muscle hyperactivity. This study aimed to compare LPM volume in individuals with and without migraine, using segmentation of the LPM in magnetic resonance (MR) imaging of the temporomandibular joint (TMJ). Twenty patients with migraine and 20 volunteers without migraine underwent a clinical examination of the TMJ according to the Research Diagnostic Criteria for TMDs. MR imaging was performed and the LPM was segmented using the ITK-SNAP 1.4.1 software, which calculates the volume of each segmented structure in voxels per cubic millimeter. The chi-squared test and Fisher's exact test were used to relate the TMD variables obtained from the MR images and clinical examinations to the presence of migraine. Binary logistic regression was used to determine the importance of each factor for predicting the presence of migraine headache. Patients with TMDs and migraine tended to have hypertrophy of the LPM (58.7%); abnormal mandibular movements (61.2%) and disc displacement (70.0%) were the most common signs in these patients. In patients with TMDs and concurrent migraine, the LPM tends to be hypertrophic. LPM segmentation on MR imaging may be an alternative method to study this muscle in such patients, because a hypertrophic LPM is not always palpable.
Abstract:
The high cost of sensitive commercial calorimeters may represent an obstacle for many calorimetric research groups. This work describes the construction and calibration of a batch differential heat-conduction calorimeter with sample cell volumes of about 400 μL. The calorimeter was built using two small, high-sensitivity square Peltier thermoelectric sensors, at a total cost estimated at about US$ 500. The calorimeter was used to study the excess enthalpy of binary liquid mixtures, as a function of composition, for the following binary solvent systems: water + 1,4-dioxane and water + dimethylsulfoxide at 298.2 ± 0.5 K.
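For illustration, a sketch of the Redlich-Kister polynomial commonly used to correlate excess enthalpies of binary mixtures against composition; the coefficients below are placeholders, not values fitted to the reported systems:

```python
# Redlich-Kister correlation of excess enthalpy for a binary mixture;
# the A_k coefficients are invented for illustration.
import numpy as np

def excess_enthalpy(x1, A):
    """H^E = x1*x2 * sum_k A_k * (x1 - x2)^k, in J/mol."""
    x2 = 1.0 - x1
    return x1 * x2 * sum(a_k * (x1 - x2)**k for k, a_k in enumerate(A))

print(excess_enthalpy(0.5, A=[-500.0, 120.0, 30.0]))  # equimolar composition
```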
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física