989 results for OptiMAL dipstick test


Relevance: 30.00%

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility we use nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation conducted to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose yields a portfolio returns distribution that second-order stochastically dominates those from virtually all individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in attempting to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
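
As an illustration of the dominance checks used in Chapter 2, the following sketch (with synthetic return series, not the thesis data) shows a Kolmogorov-Smirnov test, a pointwise CDF comparison for first-order stochastic dominance, and a comparison of absolute Lorenz curves (cumulative expected shortfalls) for second-order dominance:

```python
# Sketch: dominance checks on two synthetic return series (not the thesis data).
import numpy as np
from scipy import stats

def first_order_dominates(x, y, grid_size=200):
    """True if the empirical CDF of x lies at or below that of y everywhere."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    Fx = np.searchsorted(np.sort(x), grid, side="right") / x.size
    Fy = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return bool(np.all(Fx <= Fy))

def absolute_lorenz(x):
    """Empirical absolute Lorenz curve: L(k/n) = (sum of the k smallest)/n."""
    return np.cumsum(np.sort(x)) / x.size

def second_order_dominates(x, y):
    """True if the absolute Lorenz curve of x lies at or above that of y."""
    return bool(np.all(absolute_lorenz(x) >= absolute_lorenz(y)))

rng = np.random.default_rng(0)
agg = rng.normal(0.06, 0.10, 1000)     # stand-in for aggregated-measure returns
single = rng.normal(0.04, 0.12, 1000)  # stand-in for single-measure returns

print(stats.ks_2samp(agg, single))     # are the two distributions different?
print(first_order_dominates(agg, single), second_order_dominates(agg, single))
```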

Relevance: 30.00%

Abstract:

OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required, owing to the low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital, and a hypothetical screen for very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random sampling and VBS, each with and without constraints on the size of the population to be screened. RESULTS: Across sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce the sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimum sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the sample size required for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
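
A back-of-the-envelope sketch of the arithmetic driving the conclusion (a standard binomial sample-size bound, not the authors' design formulas): to estimate sensitivity Se within half-width d one needs roughly z^2 * Se(1-Se) / d^2 verified true events, and under simple random sampling about that number divided by the PSI prevalence must be screened:

```python
# Sketch: records to screen so that sensitivity is estimated with half-width d.
import math

def screened_needed(se, d, prevalence, z=1.96):
    n_events = z**2 * se * (1 - se) / d**2   # verified true events needed
    return math.ceil(n_events / prevalence)  # records screened to find them

for prev in (0.2, 0.02, 0.01):
    print(f"prevalence {prev:>5}: screen ~{screened_needed(0.5, 0.1, prev)} records")
```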

Relevance: 30.00%

Abstract:

Even though patients who develop ischemic stroke despite taking antiplatelet drugs represent a considerable proportion of stroke hospital admissions, there is a paucity of data from investigational studies regarding the most suitable therapeutic intervention. There have been no clinical trials testing whether increasing the dose or switching antiplatelet agents reduces the risk of subsequent events. Certain issues have to be considered in patients managed for a first or recurrent stroke while receiving antiplatelet agents. Therapeutic failure may be due to poor adherence to treatment, associated co-morbid conditions, or diminished antiplatelet effects (resistance to treatment). A diagnostic work-up is warranted to identify the etiology and underlying mechanism of the stroke, thereby guiding further management. Risk factors (including hypertension, dyslipidemia and diabetes) should be treated according to current guidelines. Aspirin or aspirin plus clopidogrel may be used in the acute and early phase of ischemic stroke, whereas in the long term antiplatelet treatment should be continued with aspirin, aspirin/extended-release dipyridamole or clopidogrel monotherapy, taking into account tolerance, safety, adherence and cost issues. Secondary measures should also be implemented: educating patients about stroke and the importance of adherence to medication, and behavioral modification relating to tobacco use, physical activity, alcohol consumption and diet to control excess weight.

Relevance: 30.00%

Abstract:

N = 1 designs involve repeated registrations of the behaviour of the same experimental unit; the measurements obtained are often few, due to time limitations, and are also likely to be sequentially dependent. The analytical techniques needed to enhance statistical and clinical decision making have to deal with both problems. Different procedures for analysing data from single-case AB designs are discussed, presenting their main features and reviewing the results reported by previous studies. Randomization tests represent one of the statistical methods that seemed to perform well in terms of controlling false-alarm rates. In the experimental part of the study, a new simulation approach is used to test the performance of randomization tests; the results suggest that the technique is not always robust against violation of the independence assumption. Moreover, sensitivity proved to be generally unacceptably low for series lengths of 30 and 40. Considering the evidence available, there does not seem to be an optimal technique for single-case data analysis.
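
A minimal sketch of the kind of randomization test discussed above for an AB design, assuming (as such tests require) that the intervention point was randomly chosen among admissible positions; the data are toy values, and the simulation study's conditions are not reproduced:

```python
# Sketch: randomization test for a single AB series (toy data).
import numpy as np

def ab_randomization_test(y, b_start, min_phase=3):
    """One-sided test of a level increase in phase B of an AB design."""
    y = np.asarray(y, dtype=float)
    observed = y[b_start:].mean() - y[:b_start].mean()
    # Reference distribution: the statistic under every admissible start point,
    # valid when the actual start was randomly chosen among these positions.
    starts = range(min_phase, y.size - min_phase + 1)
    ref = np.array([y[s:].mean() - y[:s].mean() for s in starts])
    return observed, float(np.mean(ref >= observed))

rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0, 1, 12), rng.normal(1, 1, 18)])  # toy AB data
print(ab_randomization_test(series, b_start=12))
```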

Relevance: 30.00%

Abstract:

OBJECTIVE: To review the available knowledge on the epidemiology and diagnosis of acute infections in children aged 2 to 59 months in primary care settings, and to develop an electronic algorithm for the Integrated Management of Childhood Illness (IMCI) that achieves optimal clinical outcomes and rational use of medicines. METHODS: A structured literature review in Medline, Embase and the Cochrane Database of Systematic Reviews (CDSR) looked for available estimates of disease prevalence in outpatients aged 2-59 months, and for available evidence on i) the accuracy of clinical predictors and ii) the performance of point-of-care tests for the targeted diseases. A new algorithm for the management of childhood illness (ALMANACH) was designed based on the evidence retrieved and on the results of a study on the etiologies of fever in Tanzanian child outpatients. FINDINGS: The major changes in ALMANACH compared with IMCI (2008 version) are: i) assessment of 10 danger signs; ii) classification of non-severe children into febrile and non-febrile illness, the latter receiving no antibiotics; iii) classification of pneumonia based on a respiratory rate threshold of 50, assessed twice, for febrile children aged 12-59 months; iv) a malaria rapid diagnostic test performed for all febrile children; and, in the absence of an identified source of fever at the end of the assessment, v) a urine dipstick performed for febrile children <2 years to consider urinary tract infection, vi) classification of 'possible typhoid' for febrile children >2 years with abdominal tenderness, and lastly vii) classification of 'likely viral infection' in case of negative results. CONCLUSION: This smartphone-run algorithm, based on new evidence and two point-of-care tests, should improve the quality of care of children under 5 years and lead to more rational use of antimicrobials.
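
A condensed sketch of the branching just described; the field names are illustrative, many IMCI branches are omitted, and this is not the validated algorithm:

```python
# Sketch: condensed ALMANACH-style branching (illustrative field names only).
def classify(child):
    if child["danger_signs"]:                          # any of the 10 danger signs
        return "severe illness: refer"
    if not child["fever"]:
        return "non-febrile illness: no antibiotics"
    if child["age_months"] >= 12 and child["resp_rate_twice"] >= 50:
        return "pneumonia"                             # RR threshold of 50, assessed twice
    if child["malaria_rdt_positive"]:                  # RDT done for all febrile children
        return "malaria"
    # No source of fever identified at the end of the assessment:
    if child["age_months"] < 24 and child["urine_dipstick_positive"]:
        return "urinary tract infection"
    if child["age_months"] >= 24 and child["abdominal_tenderness"]:
        return "possible typhoid"
    return "likely viral infection"

print(classify({"danger_signs": False, "fever": True, "age_months": 30,
                "resp_rate_twice": 44, "malaria_rdt_positive": False,
                "urine_dipstick_positive": False, "abdominal_tenderness": False}))
```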

Relevance: 30.00%

Abstract:

The purpose of this study was to estimate the energy cost of linear (EC) and vertical (ECvert) displacement, mechanical efficiency, and the main stride parameters during simulated ski mountaineering at different speeds and gradients, in order to identify an optimal speed and gradient that maximize performance. Twelve subjects roller-skied on a treadmill at three different inclines (10, 17 and 24%) and three different speeds (approximately 70, 80 and 85% of estimated peak heart rate). Energy expenditure was calculated by indirect calorimetry, while biomechanical parameters were measured with an inertial sensor-based system. At 10% there was no significant change with speed in EC, ECvert or mechanical efficiency. At 17 and 24% the fastest speed was significantly more economical. There was a significant effect of gradient on EC, ECvert and mechanical efficiency: the most economical gradient was the steepest one. Stride frequency increased significantly with speed. At steep gradients only, relative thrust-phase duration decreased significantly, while stride length increased significantly with speed. There was a significant effect of gradient on stride length (decrease with steepness) and relative thrust-phase duration (increase with steepness). A combination of decreased relative thrust-phase duration with increased stride length and frequency decreases ECvert. To minimize the energy expended to reach the top of a mountain and to optimize performance, ski mountaineers should choose a steep gradient (~24%) and, provided they possess sufficient metabolic scope, combine it with a fast speed (~6 km h⁻¹).
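
For readers unfamiliar with the two cost measures, a worked toy example with illustrative numbers (not the study's data), assuming EC is energy per metre travelled along the slope and ECvert is energy per metre climbed:

```python
# Sketch: EC and ECvert from one bout (illustrative numbers, not study data).
import math

def energy_costs(energy_j, distance_m, gradient):
    """EC in J per metre travelled; ECvert in J per metre climbed."""
    vertical_m = distance_m * math.sin(math.atan(gradient))  # climb on a treadmill slope
    return energy_j / distance_m, energy_j / vertical_m

ec, ec_vert = energy_costs(energy_j=50_000, distance_m=1_000, gradient=0.24)
print(f"EC = {ec:.1f} J/m, ECvert = {ec_vert:.1f} J/m")
```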

Relevance: 30.00%

Abstract:

The aim of this work was to develop and validate a dissolution test for glibenclamide tablets. The optimal conditions for the dissolution test are 500 mL of phosphate buffer at pH 8.0, paddle stirring at 75 rpm, a test time of 60 min, and equipment with six vessels. A derivative UV spectrophotometric (UVDS) method for determining the glibenclamide released was developed, validated and compared with an HPLC method. The UVDS method shows linearity (r² = 0.9999) over the concentration range of 5-14 µg/mL. Precision and recovery were 0.42% and 100.25%, respectively. The method was applied to three products commercially available on the Brazilian market.
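
A minimal sketch of the linearity check behind a figure like r² = 0.9999: an ordinary least-squares calibration fit over the 5-14 µg/mL range, with hypothetical signal values:

```python
# Sketch: calibration-line fit over the 5-14 ug/mL range (hypothetical signals).
import numpy as np
from scipy import stats

conc = np.array([5.0, 8.0, 11.0, 14.0])          # ug/mL standards
signal = np.array([0.151, 0.242, 0.333, 0.421])  # hypothetical derivative-UV readings
fit = stats.linregress(conc, signal)
print(f"slope={fit.slope:.4f}, intercept={fit.intercept:.4f}, r2={fit.rvalue**2:.5f}")
```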

Relevance: 30.00%

Abstract:

CHARGE syndrome, Sotos syndrome and 3p deletion syndrome are examples of rare inherited syndromes that have been recognized for decades but for which molecular diagnostics have only been made possible by recent advances in genomic research. Despite these advances, the development of diagnostic tests for rare syndromes has been hindered by diagnostic laboratories having limited funds for test development and by their prioritization of tests for which a (relatively) high demand can be expected. In this study, molecular diagnostic tests for CHARGE syndrome and Sotos syndrome were developed, resulting in their successful translation into routine diagnostic testing in the Laboratory of Medical Genetics (UTUlab). A mutation was identified in 40.5% of patients in the CHARGE syndrome group and in 34% in the Sotos syndrome group, reflecting the tests' use in routine differential diagnostics. In CHARGE syndrome, the low prevalence of structural aberrations was also confirmed. In 3p deletion syndrome, it was shown that small terminal deletions are not causative for the syndrome and that array-based analysis provides a reliable estimate of deletion size, although benign copy number variants complicate the interpretation of results. During the development of the tests, it became clear that an optimal molecular diagnostic strategy for a given syndrome is always a compromise between the sensitivity, specificity and feasibility of applying a new method. In addition, the clinical utility of a test should be considered prior to its development: sometimes a test performing well in the laboratory has limited utility for the patient, whereas a test performing poorly in the laboratory may have a great impact on the patient and their family. At present, the development of next-generation sequencing methods is shifting the molecular diagnostics of rare diseases from single tests towards whole-genome analysis.

Relevance: 30.00%

Abstract:

Optimal challenge occurs when an individual perceives the challenge of the task to be matched by his or her own skill level (Csikszentmihalyi, 1990). The purpose of this study was to test the impact of the OPTIMAL model on physical education students' motivation and perceptions of optimal challenge across four games categories (i.e. target, batting/fielding, net/wall, invasion). Enjoyment, competence, student goal orientation and activity level were examined in relation to the OPTIMAL model. A total of 22 students (17 M; 5 F) and their parents provided informed consent to take part in the study; the students were taught four OPTIMAL lessons and four non-OPTIMAL lessons spanning the four different games categories by their own teacher. All students completed the Task and Ego Orientation in Sport Questionnaire (TEOSQ; Duda & Whitehead, 1998), the Intrinsic Motivation Inventory (IMI; McAuley, Duncan, & Tammen, 1987) and the Children's Perception of Optimal Challenge Instrument (CPOCI; Mandigo, 2001). Sixteen students (two per lesson) were observed using the System for Observing Fitness Instruction Time tool (SOFIT; McKenzie, 2002), and they also participated in a structured interview after each lesson was completed. Quantitative results showed no overall significant difference in motivational outcomes between OPTIMAL and non-OPTIMAL lessons. However, when the lessons were broken down into games categories, significant differences emerged. Levels of perceived competence were higher in non-OPTIMAL batting/fielding lessons than in OPTIMAL ones, whereas levels of enjoyment and perceived competence were higher in OPTIMAL invasion lessons than in non-OPTIMAL invasion lessons. Qualitative results revealed significant feelings of skill/challenge balance, enjoyment and competence in the OPTIMAL lessons. Moreover, the percentage of active movement time in OPTIMAL lessons was practically twice that in non-OPTIMAL lessons.

Relevance: 30.00%

Abstract:

Accelerated life testing (ALT) is widely used to obtain reliability information about a product within a limited time frame. The Cox proportional hazards (PH) model is often utilized for reliability prediction. My master's thesis research focuses on designing accelerated life testing experiments for reliability estimation. We consider multiple step-stress ALT plans with censoring. The optimal stress levels and the times of changing the stress levels are investigated. We discuss optimal designs under three optimality criteria: D-, A- and Q-optimality. We note that the classical designs are optimal only if the assumed model is correct. Because predictions from ALT experimental data are obtained under stress levels higher than the normal condition, extrapolation is encountered, and in that case the assumed model cannot be tested. Therefore, to allow for possible imprecision in the assumed PH model, a method for constructing robust designs is also explored.
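
A generic sketch of how two of the named criteria rank candidate plans, assuming each plan is summarized by its Fisher information matrix M (toy matrices; the thesis derives M from the step-stress PH model and censoring scheme):

```python
# Sketch: ranking candidate plans by D- and A-optimality (toy information matrices).
import numpy as np

def d_value(M):  # D-optimality: maximize log det M
    return np.linalg.slogdet(M)[1]

def a_value(M):  # A-optimality: minimize trace of M^-1
    return np.trace(np.linalg.inv(M))

# Q-optimality (minimizing prediction variance at the use condition) also needs
# the regression vector at that condition and is omitted here.
plans = {
    "plan A": np.array([[4.0, 0.5], [0.5, 1.0]]),
    "plan B": np.array([[3.0, 0.2], [0.2, 1.5]]),
}
print("D-optimal:", max(plans, key=lambda k: d_value(plans[k])))
print("A-optimal:", min(plans, key=lambda k: a_value(plans[k])))
```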

Relevance: 30.00%

Abstract:

PAMPA and Caco-2 assays are in vitro tests for evaluating the intestinal permeability of drugs, performed during the drug discovery phase. PAMPA assays are not biologically representative of the intestinal wall, but they are fast and inexpensive. Caco-2 assays require more than 21 days of cell culture and dedicated facilities; they consist of a confluent enterocyte monolayer and are therefore more biologically representative. There is a need for an assay that is biologically representative of the human intestinal membrane, fast, and inexpensive. The first goal of this project was to develop an analytical method allowing the simultaneous evaluation of the eight control drugs used to validate the permeability assay. The second goal was to improve the PAMPA membrane in order to propose a new assay: the néoPAMPA. Unlike the traditional PAMPA membrane, this membrane consists of three components: (1) a porous filter acting as a support, (2) a negatively charged polydopamine cushion serving as an anchor and ensuring the fluidity of the bilayer, and (3) a lipid bilayer formed by vesicle fusion. An HPLC-MS/MS analytical method was validated according to FDA and EMA specifications; it allowed the simultaneous quantification of the eight standard drugs used for the néoPAMPA test. The traditional PAMPA test was set up as a control assay, and the permeability coefficients measured for the eight drugs across the PAMPA membrane compared favorably with literature results. The components of the néoPAMPA membrane were then optimized. The optimal conditions retained were hydrophilic polycarbonate filters with 15 nm pores, Costar 12-well plates as the permeability test device, a lipid bilayer composed of 70% DOPC and 30% cationic cholesterol, and liposome deposition in the presence of 150 mM NaCl followed by a 1 h equilibration in a DOPC-saturated solution. The stabilities of the drug cassette and of the liposomes are insufficient for commercial packaging of the membranes. The various optimizations improved the néoPAMPA membrane without, however, making it functional: the néoPAMPA membrane is still unable to discriminate molecules according to their expected permeability.
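
For context, permeability assays of this kind report an apparent permeability coefficient; a minimal sketch under the common formulation Papp = (dQ/dt) / (A·C0), which is an assumption here since the thesis's exact formula variant is not given in the abstract:

```python
# Sketch: apparent permeability, Papp = (dQ/dt) / (A * C0), in cm/s.
def papp(dq_dt_umol_per_s, area_cm2, c0_umol_per_cm3):
    """Steady-state flux dQ/dt normalized by filter area and donor concentration."""
    return dq_dt_umol_per_s / (area_cm2 * c0_umol_per_cm3)

print(f"{papp(dq_dt_umol_per_s=2.0e-6, area_cm2=1.12, c0_umol_per_cm3=0.5):.2e} cm/s")
```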

Relevance: 30.00%

Abstract:

In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his or her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with Bayesian paradigms obtained from optimal design (MacKay, 1992).
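
A minimal sketch in the spirit of the paper's strategy for the monotone case: query the point inside the interval whose remaining uncertainty (width times possible rise) is largest; the precise optimal-recovery criterion used in the paper is an assumption here:

```python
# Sketch: active sampling of a monotonically increasing function on [a, b].
import numpy as np

def active_sample(f, a, b, n):
    """Greedily query where width x possible rise is largest (an uncertainty
    bound for monotone functions); a uniform grid would ignore where f varies."""
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(n - 2):
        order = np.argsort(xs)
        x, y = np.array(xs)[order], np.array(ys)[order]
        gaps = (x[1:] - x[:-1]) * (y[1:] - y[:-1])   # per-interval uncertainty
        i = int(np.argmax(gaps))
        mid = 0.5 * (x[i] + x[i + 1])                # bisect the worst interval
        xs.append(mid)
        ys.append(f(mid))
    return np.sort(xs)

print(active_sample(lambda t: t**3, 0.0, 1.0, n=8))  # samples crowd where f rises fast
```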

Relevance: 30.00%

Abstract:

The relationships between wheat protein quality and the baking properties of 20 flour samples were studied for two breadmaking processes: a hearth bread test and the Chorleywood Bread Process (CBP). The strain hardening index obtained from dough inflation measurements, the proportion of unextractable polymeric protein, and mixing properties were among the variables found to be good indicators of protein quality and suitable for predicting the potential baking quality of wheat flours. By partial least squares regression, flour and dough test variables were able to account for 71-93% of the variation in crumb texture, form ratio and volume of hearth loaves made using optimal mixing and fixed proving times. These protein quality variables were, however, not related to the volume of loaves produced by the CBP using mixing to constant work input and proving to constant height. On the other hand, part of the variation in crumb texture of CBP loaves (54-55%) could be explained by protein quality. The results underline that the choice of baking procedure and loaf characteristics is vital in assessing the protein quality of flours.
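
A minimal sketch of the partial least squares step on synthetic data (not the study's 20 flours), regressing a loaf outcome on flour/dough test variables:

```python
# Sketch: PLS regression of a loaf outcome on flour/dough variables (synthetic).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(20, 6))   # e.g. strain hardening index, %UPP, mixing traits
y = X @ rng.normal(size=6) + rng.normal(scale=0.5, size=20)  # e.g. loaf volume
pls = PLSRegression(n_components=2).fit(X, y)
print(f"R^2 = {pls.score(X, y):.2f}")  # cf. the 71-93% explained variation above
```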

Relevance: 30.00%

Abstract:

We describe, and make publicly available, two problem instance generators for a multiobjective version of the well-known quadratic assignment problem (QAP). The generators allow a number of instance parameters to be set, including those controlling epistasis and inter-objective correlations. Based on these generators, several initial test suites are provided and described. For each test instance we measure some global properties and, for the smallest ones, make some initial observations of the Pareto optimal sets/fronts. Our purpose in providing these tools is to facilitate the ongoing study of problem structure in multiobjective (combinatorial) optimization, and its effects on search landscape and algorithm performance.
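
To make the problem concrete, a minimal sketch of evaluating one solution of a multiobjective QAP, where each objective has its own flow matrix over a shared distance matrix (the generators' epistasis and correlation controls are not reproduced here):

```python
# Sketch: one solution of a multiobjective QAP, one flow matrix per objective.
import numpy as np

def mqap_objectives(perm, D, flows):
    """Cost per objective: sum_{i,j} F_k[i,j] * D[perm[i], perm[j]]."""
    P = D[np.ix_(perm, perm)]                     # distances under this assignment
    return [float(np.sum(F * P)) for F in flows]

rng = np.random.default_rng(3)
n = 6
D = rng.integers(1, 10, (n, n))
np.fill_diagonal(D, 0)
flows = [rng.integers(0, 5, (n, n)) for _ in range(2)]  # two objectives
print(mqap_objectives(rng.permutation(n), D, flows))
```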

Relevance: 30.00%

Abstract:

The coarse spacing of automatic rain gauges complicates near-real-time spatial analyses of precipitation. We test the possibility of improving such analyses by considering, in addition to the in situ measurements, the spatial covariance structure inferred from past observations with a denser network. To this end, a statistical reconstruction technique, reduced space optimal interpolation (RSOI), is applied over Switzerland, a region of complex topography. RSOI consists of two main parts. First, principal component analysis (PCA) is applied to obtain a reduced-space representation of gridded high-resolution precipitation fields available for a multiyear calibration period in the past. Second, sparse real-time rain gauge observations are used to estimate the principal component scores and to reconstruct the precipitation field. In this way, climatological information at higher resolution than the near-real-time measurements is incorporated into the spatial analysis. PCA is found to efficiently reduce the dimensionality of the calibration fields, and RSOI is successful despite the difficulties associated with the statistical distribution of daily precipitation (skewness, dry days). Examples and a systematic evaluation show substantial added value over a simple interpolation technique that uses near-real-time observations only. The benefit is particularly strong for larger-scale precipitation and prominent topographic effects. Small-scale precipitation features are reconstructed with a skill comparable to that of the simple technique. Stratifying the reconstruction by weather type classifications yields little added skill. Apart from application in near real time, RSOI may also be valuable for enhancing instrumental precipitation analyses for the historic past, when direct observations were sparse.
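
A minimal sketch of the two RSOI steps on synthetic data: PCA (via SVD) on a calibration archive of gridded fields, then least-squares estimation of the leading PC scores from a few station values. The paper's handling of skewness and dry days is not reproduced:

```python
# Sketch: reduced space optimal interpolation on synthetic fields.
import numpy as np

rng = np.random.default_rng(4)
archive = rng.gamma(2.0, 1.0, size=(365, 500))   # days x grid cells (calibration)
mean = archive.mean(axis=0)
U, s, Vt = np.linalg.svd(archive - mean, full_matrices=False)
E = Vt[:10]                                      # leading spatial patterns (EOFs)

# Synthetic "today": a combination of the leading patterns plus noise.
truth = mean + rng.normal(size=10) @ E + rng.normal(scale=0.05, size=500)
stations = rng.choice(500, size=30, replace=False)  # sparse real-time gauges

# Estimate the PC scores from station values only, then reconstruct everywhere.
scores, *_ = np.linalg.lstsq(E[:, stations].T, truth[stations] - mean[stations],
                             rcond=None)
recon = mean + scores @ E
print(f"correlation with truth: {np.corrcoef(recon, truth)[0, 1]:.3f}")
```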