958 results for Calibration curve
Abstract:
Current interest in measuring quality of life is generating interest in the construction of computerized adaptive tests (CATs) with Likert-type items. Calibration of an item bank for use in CAT requires collecting responses to a large number of candidate items. However, the number is usually too large to administer to each subject in the calibration sample. The concurrent anchor-item design solves this problem by splitting the items into separate subtests, with some common items across subtests; then administering each subtest to a different sample; and finally running estimation algorithms once on the aggregated data array, from which a substantial number of responses are then missing. Although the use of anchor-item designs is widespread, the consequences of several configuration decisions on the accuracy of parameter estimates have never been studied in the polytomous case. The present study addresses this question by simulation, comparing the outcomes of several alternatives on the configuration of the anchor-item design. The factors defining variants of the anchor-item design are (a) subtest size, (b) balance of common and unique items per subtest, (c) characteristics of the common items, and (d) criteria for the distribution of unique items across subtests. The results of this study indicate that maximizing accuracy in item parameter recovery requires subtests of the largest possible number of items and the smallest possible number of common items; the characteristics of the common items and the criterion for distribution of unique items do not affect accuracy.
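The aggregated data array with planned missingness that the anchor-item design produces can be sketched as follows; the sizes and responses below are hypothetical and synthetic, not the study's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 2 subtests, 4 common anchor items, 8 unique items
# per subtest, 100 respondents per subtest.
n_common, n_unique, n_resp = 4, 8, 100
n_items = n_common + 2 * n_unique          # 20 items in the bank

# Simulated Likert responses (1..5): each sample answers only its own subtest.
subtest_a = rng.integers(1, 6, size=(n_resp, n_common + n_unique))
subtest_b = rng.integers(1, 6, size=(n_resp, n_common + n_unique))

# Aggregated array: rows = all respondents, columns = all items;
# unadministered cells stay missing (NaN), as in the anchor-item design.
data = np.full((2 * n_resp, n_items), np.nan)
data[:n_resp, :n_common + n_unique] = subtest_a                  # sample A
data[n_resp:, :n_common] = subtest_b[:, :n_common]               # anchors
data[n_resp:, n_common + n_unique:] = subtest_b[:, n_common:]    # unique B

# Fraction of responses missing from the aggregated array
missing_rate = np.isnan(data).mean()
```

Estimation algorithms are then run once on `data`; each sample contributes to the common items, which link the separately administered unique items onto one scale.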
Abstract:
We provide a new multivariate calibration function based on South Atlantic modern assemblages of planktonic foraminifera and atlas water-column parameters, spanning the Antarctic Circumpolar Current to the Subtropical Gyre and tropical warm waters (i.e., 60°S to 0°S). To this end, we used a dataset with the abundance pattern of 35 taxonomic groups of planktonic foraminifera in 141 surface sediment samples. Five factors, which account for 93% of the total variance of the original data and represent the main regional oceanographic fronts, were retained for the analysis. The new calibration function F141-35-5 enables the reconstruction of Late Quaternary summer and winter sea-surface temperatures with a statistical error of ~0.5°C. The function was verified by applying it to a sediment core from the western South Atlantic. The downcore reconstruction shows negative sea-surface temperature anomalies during the early-mid Holocene and temperatures within the range of modern values during the late Holocene, a pattern consistent with available reconstructions.
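The factor-plus-regression style of transfer function described above can be sketched with synthetic stand-ins; the sample counts match the abstract, but the factor scores, coefficients, and noise level below are hypothetical assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in: 141 surface samples, scores on 5 assemblage factors,
# and known modern sea-surface temperatures (all values hypothetical).
n_samples, n_factors = 141, 5
F = rng.random((n_samples, n_factors))                  # factor scores
true_coef = np.array([4.0, -2.0, 6.0, 1.0, -3.0])       # assumed loadings
sst = 15.0 + F @ true_coef + rng.normal(0.0, 0.5, n_samples)

# Regress SST on the factor scores (plus intercept); the residual RMSE
# plays the role of the ~0.5 degC statistical error quoted above.
X = np.column_stack([np.ones(n_samples), F])
coef, *_ = np.linalg.lstsq(X, sst, rcond=None)
rmse = float(np.sqrt(np.mean((X @ coef - sst) ** 2)))
```

Applying `coef` to factor scores computed from downcore assemblages then yields the paleotemperature estimates.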
Abstract:
Understanding the overall catalytic activity trend for rational catalyst design is one of the core goals in heterogeneous catalysis. Over the past two decades, the development of density functional theory (DFT) and surface kinetics has made it feasible to theoretically evaluate and predict the variation in catalytic activity within a descriptor-based framework. Within this framework, the concept of the volcano curve, which reveals the general activity trend, usually constitutes the basic foundation of catalyst screening. However, although it is a widely accepted concept in heterogeneous catalysis, its origin lacks a clear physical picture and a definite interpretation. Herein, starting with a brief review of the development of the catalyst-screening framework, we use a two-step kinetic model to refine and clarify the origin of the volcano curve through a fully analytical treatment that integrates surface kinetics with the results of first-principles calculations. It is mathematically demonstrated that the volcano curve is an essential property of catalysis, resulting from the self-poisoning effect that accompanies the catalytic adsorption process. Specifically, when adsorption is strong, it is the rapid decrease of free surface sites, rather than the increase of energy barriers, that inhibits the overall reaction rate and produces the volcano curve. Some interesting points and implications for catalyst screening are also discussed on the basis of the kinetic derivation. Moreover, recent applications of the volcano curve to catalyst design in two important photoelectrocatalytic processes (the hydrogen evolution reaction and dye-sensitized solar cells) are briefly discussed.
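A generic two-step kinetic picture of this kind can be sketched numerically; the BEP slopes, barriers, and temperature below are illustrative assumptions, not values from the paper:

```python
import numpy as np

kT = 0.0862                        # k_B*T in eV at ~1000 K (illustrative)
dE = np.linspace(-1.5, 1.5, 301)   # descriptor: adsorption energy (eV)

# Hypothetical BEP (Bronsted-Evans-Polanyi) lines: stronger binding
# (more negative dE) lowers the adsorption barrier but raises the
# barrier for removing the intermediate.
Ea1 = np.clip(0.5 * dE + 0.6, 0.0, None)    # A + * -> A*
Ea2 = np.clip(-0.5 * dE + 0.6, 0.0, None)   # A* -> P + *
k1, k2 = np.exp(-Ea1 / kT), np.exp(-Ea2 / kT)

# Steady state of the two-step cycle: k1*theta_free = k2*theta_ads
theta_free = k2 / (k1 + k2)
rate = k1 * theta_free             # = k1*k2/(k1+k2), overall turnover rate

i_max = int(np.argmax(rate))       # maximum at intermediate binding
```

On the strong-binding side the free-site fraction `theta_free` collapses toward zero and throttles the rate, which is the self-poisoning effect the abstract identifies as the origin of the volcano shape.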
Abstract:
PEDRINI, Aldomar; WESTPHAL, F. S.; LAMBERT, R. A methodology for building energy modelling and calibration in warm climates. Building and Environment, Australia, n. 37, p. 903-912, 2002. Available at:
Abstract:
The work presented in this paper relates to depth recovery from focus. The approach starts by calibrating the focal length of the camera using the Gaussian lens law for the thin-lens camera model. Two approaches are presented, based on the availability of the internal distance of the lens.
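A minimal sketch of the thin-lens relation underlying this approach (the Gaussian lens law 1/f = 1/u + 1/v, with u the object depth and v the lens-to-sensor distance); the numbers are hypothetical:

```python
def focal_length(u, v):
    """Thin-lens calibration: solve 1/f = 1/u + 1/v for f, given a known
    object distance u and a measured image distance v."""
    return (u * v) / (u + v)

def depth_from_focus(f, v):
    """Invert the lens law to recover object depth u once the camera is
    focused (image distance v) and f has been calibrated. Requires v > f."""
    return (f * v) / (v - f)

# Hypothetical calibration target: object at 1000 mm, sensor at 50 mm
f = focal_length(1000.0, 50.0)   # ~47.62 mm
u = depth_from_focus(f, 50.0)    # recovers the 1000 mm depth
```

Depth-from-focus methods sweep the focus setting, detect where the object is sharpest, and apply the inversion above at that setting.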
Abstract:
Resources created at the University of Southampton for the module Remote Sensing for Earth Observation
Abstract:
We present the market practice for constructing interest rate yield curves and pricing interest rate derivatives. We then give a brief description of the Vasicek and Hull-White models, with an example of calibration to market data. We generalize the classical Black-Scholes-Merton pricing formulas to more general cases such as perfect or partial collateral, derivatives on a dividend-paying asset subject to repo funding, and multiple currencies. Finally, we derive generic pricing formulae for different combinations of cash-flow and collateral currencies, apply the results to the pricing of FX swaps and cross-currency swaps (CCS), and discuss curve bootstrapping.
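As one concrete piece of the curve-construction machinery mentioned above, a minimal bootstrap of discount factors from par swap rates (annual fixed payments, unit notional, hypothetical quotes) might look like:

```python
import math

# Hypothetical par swap rates by maturity in years (not market data)
par_rates = {1: 0.020, 2: 0.023, 3: 0.025, 4: 0.026, 5: 0.027}

# Bootstrap discount factors D_n from the par condition
#   S_n * sum_{i=1..n} D_i + D_n = 1,
# solving maturities in increasing order so the annuity is already known.
discounts, annuity = {}, 0.0
for n in sorted(par_rates):
    s = par_rates[n]
    d = (1.0 - s * annuity) / (1.0 + s)
    discounts[n], annuity = d, annuity + d

# Continuously compounded zero rates implied by the bootstrapped curve
zeros = {n: -math.log(d) / n for n, d in discounts.items()}
```

In a multi-curve, collateralized setting the same idea is applied per curve (e.g., discounting off the collateral rate while projecting forwards off a separate curve), which is where the paper's generalized formulas come in.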
Abstract:
This thesis aims to define the intersection multiplicity of two plane algebraic curves. The treatment is developed in algebraic terms, through the study of local rings. Some properties are then discussed and a few worked examples of the computation are given. In the third chapter, attention turns to the intersection of a variety with a hypersurface in n-dimensional projective space. A further notion of intersection multiplicity is defined, which generalizes the one introduced in the first two chapters; starting from this definition, an extended version of Bezout's theorem can be stated. The last chapter focuses again on plane curves, with the aim of studying their topology in a neighbourhood of a singular point. In particular, the important notion of the link of a singular point is introduced.
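The two central notions in this abstract admit compact statements; a sketch, assuming an algebraically closed base field:

```latex
% Local intersection multiplicity of plane curves F, G at a point p,
% via the local ring \mathcal{O}_p of the plane at p:
\[
  I_p(F,G) \;=\; \dim_k \mathcal{O}_p/(F,G)
\]
% Bezout's theorem (the plane-curve case of the extended version):
% for projective plane curves F, G of degrees m, n with no common component,
\[
  \sum_{p \,\in\, F \cap G} I_p(F,G) \;=\; m\,n .
\]
```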
Abstract:
In urban areas, interchange spacing and the adequacy of design for weaving, merge, and diverge areas can significantly influence available capacity. Traffic microsimulation tools allow detailed analyses of these critical areas in complex locations that often yield results that differ from the generalized approach of the Highway Capacity Manual. In order to obtain valid results, various inputs should be calibrated to local conditions. This project investigated basic calibration factors for the simulation of traffic conditions within an urban freeway merge/diverge environment. By collecting and analyzing urban freeway traffic data from multiple sources, specific Iowa-based calibration factors for use in VISSIM were developed. In particular, a repeatable methodology for collecting standstill distance and headway/time gap data on urban freeways was applied to locations throughout the state of Iowa. This collection process relies on the manual processing of video for standstill distances and individual vehicle data from radar detectors to measure the headways/time gaps. By comparing the data collected from different locations, it was found that standstill distances vary by location and lead-follow vehicle types. Headways and time gaps were found to be consistent within the same driver population and across different driver populations when the conditions were similar. Both standstill distance and headway/time gap were found to follow fairly dispersed and skewed distributions. Therefore, it is recommended that microsimulation models be modified to include the option for standstill distance and headway/time gap to follow distributions as well as be set separately for different vehicle classes. In addition, for the driving behavior parameters that cannot be easily collected, a sensitivity analysis was conducted to examine the impact of these parameters on the capacity of the facility. 
The sensitivity analysis results can be used as a reference to manually adjust parameters to match the simulation results to the observed traffic conditions. A well-calibrated microsimulation model can enable a higher level of fidelity in modeling traffic behavior and serve to improve decision making in balancing need with investment.
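The recommendation above, letting standstill distance and headway/time gap follow distributions rather than single values, could be implemented by fitting a skewed distribution to field data. A minimal method-of-moments sketch on synthetic gaps (hypothetical parameters, not the Iowa measurements):

```python
import math
import random
import statistics

random.seed(1)
# Synthetic time gaps (seconds): skewed, like the field data described above
gaps = [random.lognormvariate(mu=0.3, sigma=0.4) for _ in range(2000)]

# Method-of-moments fit of a lognormal to the observed gaps; a simulator
# could then sample gaps from this distribution per vehicle class instead
# of using one fixed value.
m = statistics.fmean(gaps)
v = statistics.pvariance(gaps)
sigma2 = math.log(1.0 + v / m**2)
mu_hat, sigma_hat = math.log(m) - 0.5 * sigma2, math.sqrt(sigma2)
```

The same fit, repeated per lead-follow vehicle-type pair, would reflect the finding that standstill distances vary by location and vehicle types.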
Abstract:
This work studies elliptic curves viewed as plane algebraic curves, more precisely as smooth cubics in the complex projective plane. The first part introduces the notions of compact orientable surfaces and algebraic curves; via the classification theorem for compact surfaces, a preliminary classification is given, based on the genus of the surface and of the curve, respectively. From there follow the definition of elliptic curves and a more detailed study of their main properties, such as the possibility of defining them by an affine equation known as the Weierstrass equation, and their intrinsic structure as an abelian group. A further classification of smooth cubics is then provided, entirely different from the previous one, based instead on the modulus of the cubic, which is invariant under projective transformations. Finally, a computational aspect of elliptic curves is considered: their application to cryptography. Thanks to the structure they carry over finite fields, and under suitable hypotheses, public-key cryptosystems based on the discrete logarithm problem on elliptic curves achieve the same security as classical cryptosystems with shorter, and hence computationally cheaper, keys. The classical and elliptic-curve discrete logarithm problems are defined, together with some examples of classical cryptographic algorithms defined over elliptic curves.
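A minimal sketch of the abelian group law and scalar multiplication on an elliptic curve over a finite field, as discussed in this abstract; the curve below is a toy example, far below cryptographic sizes:

```python
# Toy curve y^2 = x^3 + 2x + 3 over F_97 (illustrative, not secure)
p, a, b = 97, 2, 3

def ec_add(P, Q):
    """Chord-and-tangent group law; None plays the role of the identity O."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return x3, (lam * (x1 - x3) - y1) % p

def ec_mul(k, P):
    """Scalar multiple k*P by double-and-add; the hardness of inverting
    this map is the elliptic-curve discrete logarithm problem."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P, k = ec_add(P, P), k >> 1
    return R
```

For example, `(0, 10)` lies on this curve (since 10² ≡ 3 mod 97), and repeated `ec_mul` calls from a public base point are exactly the operation underlying elliptic-curve Diffie-Hellman-style schemes.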
Abstract:
After a brief account of how electrical machines are classified according to whether or not they contain moving parts, this work first examines the theory of round-rotor and salient-pole synchronous machines, including the measures needed to reduce the contribution of higher-order harmonic fields. For machines of this kind, often used in power stations for electricity generation, the "V" curves and "capability" curves are of fundamental importance: they allow the performance of a machine to be assessed once its nameplate data are known. The aim of the thesis is therefore to develop Matlab software for the automatic, parametric computation of these curves, so that the choice of a machine can be optimized for the requirements at hand. The work also compares how these curves, and hence the operating limits associated with them, change with some fundamental parameters such as the power factor, the synchronous reactance or, for salient-pole machines, the reluctance ratio. The curves are constructed from considerations on the Behn-Eschenburg diagram for isotropic machines and on the Arnold-Blondel diagram for anisotropic machines.
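For the round-rotor (Behn-Eschenburg) case, the main capability limits are circles in the P-Q plane; a minimal sketch with illustrative per-unit data (assumed values, not from the thesis):

```python
import numpy as np

# Illustrative per-unit data for a round-rotor synchronous machine
V, Xs = 1.0, 1.2          # terminal voltage, synchronous reactance (p.u.)
Ia_max, E_max = 1.0, 2.2  # stator current limit, max excitation EMF (p.u.)

# Field (excitation) limit: a circle of radius V*E_max/Xs centred at
# (P, Q) = (0, -V^2/Xs), traced by the load angle delta
delta = np.linspace(0.0, np.pi / 2, 200)
P_field = V * E_max / Xs * np.sin(delta)
Q_field = V * E_max / Xs * np.cos(delta) - V**2 / Xs

# Armature (stator current) limit: |S| = V*Ia_max, a circle about the origin
phi = np.linspace(-np.pi / 2, np.pi / 2, 200)
P_arm = V * Ia_max * np.cos(phi)
Q_arm = V * Ia_max * np.sin(phi)
```

The feasible operating region is the intersection of the areas inside both circles (plus prime-mover and stability limits); sweeping parameters such as `Xs` reproduces the kind of comparison the thesis software automates.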