987 results for fractional predictor-corrector method
Abstract:
BACKGROUND: Straylight gives the appearance of a veil of light thrown over a person's retinal image when a strong light source is present. We examined the reproducibility of measurements by C-Quant and assessed their correlation with characteristics of the eye and subjects' age. PARTICIPANTS AND METHODS: Five repeated straylight measurements were taken using the dominant eye of 45 healthy subjects (age 21-59) with a BCVA of 20/20: 14 emmetropic, 16 myopic, eight hyperopic and seven with astigmatism. We assessed the extent of reproducibility of straylight measures using the intraclass correlation coefficient. RESULTS: The mean straylight value of all measurements was 1.01 (SD 0.23, median 0.97, interquartile range 0.85-1.1). Per 10 years of age, straylight increased on average by 0.10 (95%CI 0.04 to 0.16, p < 0.01). We found no independent association of refraction (range -5.25 dpt to +2 dpt) with straylight values (0.001; 95%CI -0.022 to 0.024, p = 0.92). Compared to emmetropic subjects, myopia reduced straylight (-0.011; -0.024 to 0.02, p = 0.11), whereas higher straylight values (0.09; -0.01 to 0.20, p = 0.09) were observed in subjects with blue irises as compared to dark-colored irises when correcting for age. The intraclass correlation coefficient (ICC) of repeated measurements was 0.83 (95%CI 0.76 to 0.90). CONCLUSIONS: Our study showed that straylight measurements with the C-Quant had high reproducibility, i.e. no large intra-observer variability, making it appropriate for long-term follow-up studies assessing the effect of surgical procedures on the quality of vision.
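The reproducibility statistic reported above can be illustrated with a short sketch: a minimal one-way random-effects ICC(1) computed from a subjects-by-repeats matrix. The simulated data below (subject count, repeat count, noise levels) are illustrative assumptions, not the study's measurements:

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1) for an (n_subjects, k_repeats) array."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand_mean = scores.mean()
    subject_means = scores.mean(axis=1)
    # Between-subject and within-subject mean squares from one-way ANOVA
    ss_between = k * np.sum((subject_means - grand_mean) ** 2)
    ss_within = np.sum((scores - subject_means[:, None]) ** 2)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Illustrative data: 45 subjects, 5 repeated straylight readings each
rng = np.random.default_rng(0)
true_level = rng.normal(1.0, 0.2, size=(45, 1))        # subject-level value
obs = true_level + rng.normal(0, 0.08, size=(45, 5))   # repeats with noise
print(round(icc_oneway(obs), 2))
```

An ICC near 1 means most variance lies between subjects rather than between repeated measurements of the same subject, which is what "high reproducibility" refers to here.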
Abstract:
The aim of this study was to evaluate the efficacy of a polymerase chain reaction (PCR)-based method to detect Schistosoma mansoni DNA in stool samples from individuals living in a low-endemicity area in Brazil. Of the 125 initial stool samples, 80 were ELISA reactive and eggs were identified in 19 of the samples by parasitological examination. For the PCR evaluations, 56 stool samples were selected and divided into five groups. Groups I-IV were scored negative for S. mansoni eggs by parasitological examination. Groups I and II were ELISA reactive, whereas Groups III and IV were ELISA nonreactive. Groups II and III were positive for other intestinal parasites. PCR testing scored eight samples as positive from these four groups. Group V represented the S. mansoni-positive group and it included ELISA-reactive samples that were scored positive for S. mansoni by one or more parasitological examinations (6/19 were positive by the Kato-Katz method, 9/17 by saline gradient and 10/13 by Helmintex®). PCR scored 13 of these 19 samples as positive for S. mansoni. We conclude that while none of these methods yielded 100% sensitivity, a combination of techniques should be effective for improving the detection of S. mansoni infection in low-endemicity areas.
Abstract:
The aim of this study was to investigate the performance of a new and accurate method for the detection of isoniazid (INH) and rifampicin (RIF) resistance among Mycobacterium tuberculosis isolates using a crystal violet decolourisation assay (CVDA). Fifty-five M. tuberculosis isolates obtained from culture stocks stored at -80°C were tested. After bacterial inoculation, the samples were incubated at 37°C for seven days and 100 µL of CV (25 mg/L stock solution) was then added to the control and sample tubes. The tubes were incubated for an additional 24-48 h. CV (blue/purple) was decolourised in the presence of bacterial growth; thus, if CV lost its colour in a sample containing a drug, the tested isolate was reported as resistant. The sensitivity, specificity, positive predictive value, negative predictive value and agreement for INH were 92.5%, 96.4%, 96.1%, 93.1% and 94.5%, respectively, and 88.8%, 100%, 100%, 94.8% and 96.3%, respectively, for RIF. The results were obtained within eight to nine days. This study shows that CVDA is an effective method to detect M. tuberculosis resistance to INH and RIF in developing countries. The method is rapid, simple and inexpensive. Nonetheless, further studies are necessary before routine laboratory implementation.
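The accuracy figures above follow from the standard 2x2 contingency-table formulas. A minimal sketch (the example counts are hypothetical, since the abstract reports only the derived percentages):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and overall agreement (in %)
    from a 2x2 table of test results vs. the reference standard."""
    sens = 100 * tp / (tp + fn)
    spec = 100 * tn / (tn + fp)
    ppv = 100 * tp / (tp + fp)
    npv = 100 * tn / (tn + fn)
    agreement = 100 * (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, ppv, npv, agreement

# Hypothetical counts for illustration only
print([round(v, 1) for v in diagnostic_metrics(37, 1, 3, 27)])
```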
Abstract:
The identification of mycobacteria is essential because tuberculosis (TB) and mycobacteriosis are clinically indistinguishable and require different therapeutic regimens. The traditional phenotypic method is time consuming and may last up to 60 days. Indeed, rapid, affordable, specific and easy-to-perform identification methods are needed. We have previously described a polymerase chain reaction-based method called a mycobacteria mobility shift assay (MMSA) that was designed for Mycobacterium tuberculosis complex (MTC) and nontuberculous mycobacteria (NTM) species identification. The aim of this study was to assess the MMSA for the identification of MTC and NTM clinical isolates and to compare its performance with that of the PRA-hsp65 method. A total of 204 clinical isolates (102 NTM and 102 MTC) were identified by the MMSA and PRA-hsp65. For isolates for which these methods gave discordant results, definitive species identification was obtained by sequencing fragments of the 16S rRNA and hsp65 genes. Both methods correctly identified all MTC isolates. Among the NTM isolates, the MMSA alone assigned 94 (92.2%) to a complex or species, whereas the PRA-hsp65 method assigned 100% to a species. A 91.5% agreement was observed for the 94 NTM isolates identified by both methods. The MMSA provided correct identification for 96.8% of the NTM isolates compared with 94.7% for PRA-hsp65. The MMSA is a suitable auxiliary method for routine use for the rapid identification of mycobacteria.
Abstract:
Distribution, abundance, feeding behaviour, host preference, parity status and human-biting and infection rates are among the medical entomological parameters evaluated when determining the vector capacity of mosquito species. To evaluate these parameters, mosquitoes must be collected using an appropriate method. Malaria is primarily transmitted by anthropophilic and synanthropic anophelines. Thus, collection methods must result in the identification of the anthropophilic species and efficiently evaluate the parameters involved in malaria transmission dynamics. Consequently, human landing catches would be the most appropriate method if not for their inherent risk. The choice of alternative anopheline collection methods, such as traps, must consider their effectiveness in reproducing the efficiency of human attraction. Collection methods lure mosquitoes by using a mixture of olfactory, visual and thermal cues. Here, we reviewed, classified and compared the efficiency of anopheline collection methods, with an emphasis on Neotropical anthropophilic species, especially Anopheles darlingi, in distinct malaria epidemiological conditions in Brazil.
Abstract:
We consider an exponentially fitted discontinuous Galerkin method for advection dominated problems and propose a block solver for the resulting linear systems. In the case of strong advection the solver is robust with respect to the advection direction and the number of unknowns.
Abstract:
We construct and analyze non-overlapping Schwarz methods for a preconditioned weakly over-penalized symmetric interior penalty (WOPSIP) method for elliptic problems.
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to models involving stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented from scratch in MATLAB as a part of this thesis. No other mathematical or statistical software was used.
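The mean-reversion side of the strategies described above can be sketched in a few lines (here in Python rather than the project's MATLAB). The sketch simulates an Ornstein-Uhlenbeck spread and trades a rolling z-score; note the decision at each step uses only data up to that step, so the entry and exit events are Markov times. All parameters (kappa, sigma, window, thresholds) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck spread:
# dX_t = kappa * (mu - X_t) dt + sigma dW_t
kappa, mu, sigma, dt, n = 5.0, 0.0, 0.3, 1.0 / 252, 2520
x = np.zeros(n)
for t in range(1, n):
    x[t] = x[t - 1] + kappa * (mu - x[t - 1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# Pairs-trading style signal: enter when the rolling z-score is
# stretched, hold until the spread reverts toward its mean.
window, entry, exit_ = 60, 1.5, 0.25
pos = np.zeros(n)
for t in range(window, n):
    hist = x[t - window:t]
    z = (x[t] - hist.mean()) / hist.std()
    if z > entry:
        pos[t] = -1.0          # short the spread
    elif z < -entry:
        pos[t] = 1.0           # long the spread
    elif abs(z) > exit_:
        pos[t] = pos[t - 1]    # hold the open position
    # else: flat (pos[t] stays 0)

pnl = pos[:-1] * np.diff(x)    # naive one-step P&L on the spread
print(f"cumulative spread P&L: {pnl.sum():.3f}")
```

A realistic backtest would additionally recalibrate the parameters on a rolling window and account for transaction costs, as the thesis discusses.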
Abstract:
RATIONALE: The aim of the work was to develop and validate a method for the quantification of vitamin D metabolites in serum using ultra-high-pressure liquid chromatography coupled to mass spectrometry (LC/MS), and to validate a high-resolution mass spectrometry (LC/HRMS) approach against a tandem mass spectrometry (LC/MS/MS) approach using a large clinical sample set. METHODS: A fast, accurate and reliable method for the quantification of the vitamin D metabolites, 25-hydroxyvitamin D2 (25OH-D2) and 25-hydroxyvitamin D3 (25OH-D3), in human serum was developed and validated. The C3 epimer of 25OH-D3 (3-epi-25OH-D3) was also separated from 25OH-D3. The samples were rapidly prepared via a protein precipitation step followed by solid-phase extraction (SPE) using an HLB μelution plate. Quantification was performed using both LC/MS/MS and LC/HRMS systems. RESULTS: Recovery, matrix effect, and inter- and intra-day reproducibility were assessed. Lower limits of quantification (LLOQs) were determined for both 25OH-D2 and 25OH-D3 for the LC/MS/MS approach (6.2 and 3.4 µg/L, respectively) and the LC/HRMS approach (2.1 and 1.7 µg/L, respectively). A Passing & Bablok fit was determined between both approaches for 25OH-D3 on 662 clinical samples (1.11 + 1.06x). It was also shown that results can be affected by the inclusion of the isomer 3-epi-25OH-D3. CONCLUSIONS: Quantification of the relevant vitamin D metabolites was successfully developed and validated here. It was shown that LC/HRMS is an accurate, powerful and easy-to-use approach for quantification within clinical laboratories. Finally, the results here suggest that it is important to separate 3-epi-25OH-D3 from 25OH-D3. Copyright © 2012 John Wiley & Sons, Ltd.
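A Passing & Bablok fit, as used above, is a robust regression between two methods' paired results. As a simplified sketch, the Theil-Sen-style estimate below takes the median of all pairwise slopes (the full Passing & Bablok procedure adds a shift correction and confidence intervals for the intercept and slope). The paired measurements are hypothetical, not the study's data:

```python
import numpy as np
from itertools import combinations

def median_slope_fit(x, y):
    """Theil-Sen-style robust line: slope = median of pairwise slopes,
    intercept = median residual. A simplified stand-in for a
    Passing & Bablok method-comparison regression."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2) if x[j] != x[i]]
    b = float(np.median(slopes))
    a = float(np.median(y - b * x))
    return a, b

# Hypothetical paired 25OH-D3 results from two instruments (ug/L)
rng = np.random.default_rng(1)
lcmsms = rng.uniform(5, 60, 100)
lchrms = 1.1 + 1.05 * lcmsms + rng.normal(0, 1.0, 100)
print(median_slope_fit(lcmsms, lchrms))
```

A slope near 1 and intercept near 0 indicate the two approaches agree, which is how a fit like 1.11 + 1.06x is read.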
Abstract:
The objective of traffic engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization, few have focused on LSR (label switched router) label space. This paper proposes an algorithm that takes advantage of the MPLS label stack features in order to reduce the number of labels used in LSPs. Some tunnelling methods and their MPLS implementation drawbacks are also discussed. The described algorithm sets up NHLFE (next hop label forwarding entry) tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the described algorithm achieves a great reduction factor in the label space. The presented work applies to both types of connections: P2MP (point-to-multipoint) and P2P (point-to-point).
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent – essential zeros – or because it is below detection limit – rounded zeros. Because the second kind of zeros is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts – and thus the metric properties – should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
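The multiplicative replacement described above can be sketched as follows, assuming compositions closed to a constant total; the δ value below is illustrative. Each rounded zero becomes δ and the non-zero parts are rescaled, so ratios among non-zero parts (and hence the covariance structure of zero-free subcompositions) are untouched:

```python
import numpy as np

def multiplicative_replacement(x, delta, total=1.0):
    """Multiplicative rounded-zero replacement: zeros become delta and
    non-zero parts are scaled by (1 - n_zeros * delta / total), so the
    composition still sums to `total` while all ratios between
    non-zero parts are preserved."""
    x = np.asarray(x, dtype=float)
    zeros = x == 0
    return np.where(zeros, delta, x * (1 - zeros.sum() * delta / total))

comp = np.array([0.60, 0.25, 0.15, 0.0])   # one rounded zero
r = multiplicative_replacement(comp, delta=0.005)
print(r, r.sum())
```

If a replacement value happens to equal the true unobserved value, rescaling by the same factor recovers the true composition up to closure, which is the "natural" property mentioned above.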
Abstract:
The recommended treatment for latent tuberculosis (TB) infection in adults is a daily dose of isoniazid (INH) 300 mg for six months. In Brazil, INH was formulated as 100 mg tablets. The treatment duration and the high pill burden compromised patient adherence to the treatment. The Brazilian National Programme for Tuberculosis requested a new 300 mg INH formulation. The aim of our study was to compare the bioavailability of the new INH 300 mg formulation and three 100 mg tablets of the reference formulation. We conducted a randomised, single-dose, open-label, two-phase crossover bioequivalence study in 28 healthy human volunteers. The 90% confidence intervals for the INH maximum concentration observed in plasma and for the area under the plasma concentration vs. time curve from time zero to the last measurable concentration (time t) were 89.61-115.92 and 94.82-119.44, respectively. The main limitation of our study was that neither adherence nor the safety profile of multiple doses was evaluated. To determine the level of INH in human plasma, we developed and validated a sensitive, simple and rapid high-performance liquid chromatography-tandem mass spectrometry method. Our results showed that the new formulation was bioequivalent to the 100 mg reference product. This finding supports the use of a single 300 mg tablet daily strategy to treat latent TB. This new formulation may increase patients' adherence to the treatment and quality of life.
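Bioequivalence intervals like those above are computed on the log scale of the pharmacokinetic ratios. A simplified paired-sample sketch (the study itself would use a crossover ANOVA; the data and the hardcoded t quantile below are illustrative assumptions):

```python
import math
import numpy as np

def be_90ci(test, ref, tcrit=1.703):
    """90% CI for the test/reference geometric mean ratio (in %) from
    paired log-scale differences; `tcrit` is the one-sided 95%
    Student-t quantile for n-1 degrees of freedom (~1.703 for 27 df)."""
    d = np.log(np.asarray(test, dtype=float) / np.asarray(ref, dtype=float))
    se = d.std(ddof=1) / math.sqrt(len(d))
    lo, hi = d.mean() - tcrit * se, d.mean() + tcrit * se
    return 100 * math.exp(lo), 100 * math.exp(hi)

# Hypothetical Cmax pairs for 28 volunteers (test vs. reference product)
rng = np.random.default_rng(7)
ref = rng.lognormal(1.0, 0.3, 28)
test = ref * rng.lognormal(0.02, 0.1, 28)
print(be_90ci(test, ref))
```

Bioequivalence is typically declared when such a 90% CI falls entirely within the conventional 80-125% acceptance range, as the reported intervals do.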
Abstract:
BACKGROUND: Prognosis of status epilepticus (SE) depends on its cause, but there is uncertainty as to whether SE represents an independent outcome predictor for a given etiology. Cerebral anoxia is a relatively homogenous severe encephalopathy. Postanoxic SE is associated with nearly 100% mortality in this setting; however, it is still unclear whether this is a severity marker of the underlying encephalopathy or an independent factor influencing outcome. The goal of this study was to assess whether postanoxic SE is independently associated with mortality after cerebral anoxia. METHODS: This was a retrospective observation of consecutive comatose survivors of cardiac arrest, including subjects treated with hypothermia. In the subgroup with EEG recordings in the first hospitalization days, univariate and multivariate analyses were applied to potential determinants of in-hospital mortality, including the following variables: age, gender, type and length of cardiac arrest, occurrence of circulatory shock, presence of therapeutic hypothermia, and electrographic SE. RESULTS: Out of 166 postanoxic patients, 107 (64%) had an EEG (median latency from admission, 2 days); in this group, therapeutic hypothermia was administered in 59%. Death occurred in 71 (67%) patients. Postanoxic SE was associated with mortality regardless of the type of acute cardiac rhythm and the administration of hypothermic treatment. CONCLUSION: In this hospital-based cohort, postanoxic status epilepticus (SE) seems to be independently related to death in cardiac arrest survivors, suggesting that SE might determine a bad prognosis for a given etiology. Confirmation of these results in a prospective assessment is needed.