928 results for Non-perturbative methods
Abstract:
We evaluated the expression of 10 adhesion molecules on peripheral blood tumor cells of 17 patients with chronic lymphocytic leukemia, 17 with mantle-cell lymphoma, and 13 with nodal or splenic marginal B-cell lymphoma, all in the leukemic phase and before the beginning of any therapy. The diagnosis of B-cell non-Hodgkin's lymphomas was based on cytological, histological, immunophenotypic, and molecular biology methods. The mean fluorescence intensity of the adhesion molecules in tumor cells was measured by flow cytometry of CD19-positive cells and differed amongst the types of lymphomas. Comparison of chronic lymphocytic leukemia and mantle-cell lymphoma showed that the former presented a higher expression of CD11c and CD49c, and a lower expression of CD11b and CD49d adhesion molecules. Comparison of chronic lymphocytic leukemia and marginal B-cell lymphoma showed that the former presented a higher expression of CD49c and a lower expression of CD11a, CD11b, CD18, CD49d, CD29, and CD54. Finally, comparison of mantle-cell lymphoma and marginal B-cell lymphoma showed that marginal B-cell lymphoma had a higher expression of CD11a, CD11c, CD18, CD29, and CD54. Thus, the CD49c/CD49d pair consistently demonstrated a distinct pattern of expression in chronic lymphocytic leukemia compared with mantle-cell lymphoma and marginal B-cell lymphoma, which could be helpful for the differential diagnosis. Moreover, the distinct profiles of adhesion molecules in these diseases may be responsible for their different capacities to invade the blood stream.
Abstract:
Several methods are used to estimate the anaerobic threshold (AT) during exercise. The aim of the present study was to compare AT obtained by a graphic visual method for the estimation of ventilatory and metabolic variables (gold standard) to a bi-segmental linear regression mathematical model based on Hinkley's algorithm applied to heart rate (HR) and carbon dioxide output (VCO2) data. Thirteen young (24 ± 2.63 years old) and 16 postmenopausal (57 ± 4.79 years old) healthy and sedentary women were submitted to a continuous ergospirometric incremental test on an electromagnetically braked cycloergometer with 10 to 20 W/min increases until physical exhaustion. The ventilatory variables were recorded breath-to-breath and HR was obtained beat-to-beat in real time. Data were analyzed by the nonparametric Friedman test and the Spearman correlation test, with the level of significance set at 5%. Power output (W), HR (bpm), oxygen uptake (VO2; mL kg-1 min-1), VO2 (mL/min), VCO2 (mL/min), and minute ventilation (VE; L/min) data observed at the AT level were similar for both methods and both groups studied (P > 0.05). The VO2 (mL kg-1 min-1) data showed a significant correlation (P < 0.05) between the gold standard method and the mathematical model when applied to HR (r_s = 0.75) and VCO2 (r_s = 0.78) data for the subjects as a whole (N = 29). The proposed mathematical method for the detection of changes in the response patterns of VCO2 and HR was adequate and promising for AT detection in young and middle-aged women, representing a semi-automatic, non-invasive and objective AT measurement.
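The bi-segmental regression idea described above can be sketched compactly: for each candidate breakpoint, fit one least-squares line to each side of the split and keep the split that minimizes the total residual sum of squares. This is a minimal illustration of the concept, not Hinkley's actual algorithm; the function name and the synthetic VCO2-like series are assumptions.

```python
import numpy as np

def bisegmental_breakpoint(x, y):
    """Find the index that best splits (x, y) into two linear segments.

    For each candidate breakpoint, fits one least-squares line to the
    left part and one to the right, and returns the split minimizing
    the total residual sum of squares -- a simple stand-in for the
    bi-segmental regression used to locate the anaerobic threshold.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    best_i, best_sse = None, np.inf
    for i in range(2, len(x) - 2):  # keep at least 2 points per segment
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coef = np.polyfit(xs, ys, 1)
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best_sse:
            best_i, best_sse = i, sse
    return best_i

# Synthetic series: flat response, then a linear rise starting at sample 10.
x = np.arange(20.0)
y = np.where(x < 10, 1.0, 1.0 + 0.5 * (x - 10))
print(bisegmental_breakpoint(x, y))
```

On this toy series the detected breakpoint falls at the start of the rising segment, which is the behaviour the AT-detection model relies on.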
Abstract:
The fractal dimension has been employed as a useful parameter in the diagnosis of retinal disease. Avakian et al. (Curr Eye Res 2002; 24: 274-280), comparing the vascular pattern of normal patients with that of patients with mild to moderate non-proliferative diabetic retinopathy (NPDR), found a significant difference between them only in the macular region. This significant difference in the box-counting fractal dimension of the macular region between normal and mild NPDR patients has been proposed as a method for the early diagnosis of NPDR. The aim of the present study was to determine whether fractal dimensions can really be used as a parameter for the early diagnosis of NPDR. Box-counting and information fractal dimensions were used to parameterize the vascular pattern of the human retina. The two methods were applied to the whole retina and to nine anatomical regions of the retina in 5 individuals with mild NPDR and in 28 diabetic but ophthalmologically normal individuals (controls), aged between 31 and 86 years. All retinal images were obtained from the Digital Retinal Images for Vessel Extraction (DRIVE) database. The results showed that the fractal dimension parameter was not sensitive enough to be of use for an early diagnosis of NPDR.
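The box-counting dimension used in the study above has a compact implementation: cover a binary image (here, a segmented vessel map) with boxes of decreasing size, count the boxes containing foreground, and read the dimension off the slope of a log-log fit. A minimal sketch, assuming a binary NumPy array as input; the function name, box sizes, and test image are illustrative, and no DRIVE data are used.

```python
import numpy as np

def box_counting_dimension(image, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary 2-D array.

    For each box size s, counts the number N(s) of s-by-s boxes containing
    at least one foreground pixel, then fits log N(s) against log(1/s);
    the slope of the fit is the dimension estimate.
    """
    img = np.asarray(image, dtype=bool)
    counts = []
    for s in box_sizes:
        # Trim so the image tiles evenly into s-by-s boxes.
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square is (approximately) 2-dimensional.
square = np.ones((64, 64))
print(round(box_counting_dimension(square), 2))  # 2.0
```

A single filled row yields a dimension near 1, so the estimator distinguishes line-like from area-filling patterns, which is the property the retinal studies exploit.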
Abstract:
Leptospirosis is a reemerging infectious disease and the most disseminated zoonosis worldwide. A leptospiral surface protein, LipL32, occurs only in pathogenic Leptospira, and is the most abundant protein on the bacterial surface, being described as an important factor in the host immunogenic response and also in bacterial infection. We describe here an alternative and simple purification protocol for non-tagged recombinant LipL32. The recombinant LipL32(21-272) was expressed in Escherichia coli without a His-tag or any other tag used to facilitate recombinant protein purification. The recombinant protein was expressed in soluble form, and the purification was based on ion exchange (anionic and cationic) and hydrophobic interactions. The final purification yielded 3 mg of soluble LipL32(21-272) per liter of induced culture. Antiserum produced against the recombinant protein effectively detected native LipL32 in cell extracts of several Leptospira serovars. The purified recombinant LipL32(21-272) produced by this protocol can be used for structural, biochemical and functional studies and avoids the risk of possible interactions and interferences of the commonly used tags, as well as the time-consuming and often inefficient methods needed to cleave these tags when a tag-free LipL32 is required. Non-tagged LipL32 may represent an alternative antigen for biochemical studies, for serodiagnosis and for the development of a vaccine against leptospirosis.
Abstract:
The aim of this thesis is to propose a novel control method for teleoperated electrohydraulic servo systems that implements a reliable haptic sense between the human and manipulator interaction, and an ideal position control between the manipulator and the task environment interaction. The proposed method has the characteristics of a universal technique independent of the actual control algorithm and it can be applied with other suitable control methods as a real-time control strategy. The motivation to develop this control method is the necessity for a reliable real-time controller for teleoperated electrohydraulic servo systems that provides highly accurate position control based on joystick inputs with haptic capabilities. The contribution of the research is that the proposed control method combines a directed random search method and a real-time simulation to develop an intelligent controller in which each generation of parameters is tested on-line by the real-time simulator before being applied to the real process. The controller was evaluated on a hydraulic position servo system. The simulator of the hydraulic system was built based on the Markov chain Monte Carlo (MCMC) method. A Particle Swarm Optimization algorithm combined with the foraging behavior of E. coli bacteria was utilized as the directed random search engine. The control strategy allows the operator to be plugged into the work environment dynamically and kinetically. This helps to ensure the system has haptic sense with high stability, without abstracting away the dynamics of the hydraulic system. The new control algorithm provides asymptotically exact tracking of both the position and the contact force. In addition, this research proposes a novel method for the re-calibration of multi-axis force/torque sensors. The method makes several improvements to traditional methods.
It can be used without dismantling the sensor from its application and it requires a smaller number of standard loads for calibration. It is also more cost-efficient and faster in comparison to traditional calibration methods. The proposed method was developed in response to re-calibration issues with the force sensors utilized in teleoperated systems. The new approach aims to avoid dismantling the sensors from their applications for calibration. A major complication with many manipulators is the difficulty of accessing them when they operate inside an inaccessible environment, especially if that environment is harsh, such as a radioactive area. The proposed technique is based on design-of-experiments methodology. It has been successfully applied to different force/torque sensors, and this research presents experimental validation of the calibration method with one of the force sensors to which it has been applied.
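The linear core of such a re-calibration, estimating a calibration matrix from a small set of standard loads by least squares, can be sketched as below. The thesis's design-of-experiments load selection is not reproduced here; the function name, the 2-axis sensor, and all numeric values are illustrative assumptions.

```python
import numpy as np

def fit_calibration_matrix(raw_readings, applied_loads):
    """Least-squares estimate of the calibration matrix C such that
    applied_loads ~= raw_readings @ C.T.

    raw_readings:  (m, k) array of raw sensor channel outputs,
    applied_loads: (m, k) array of the known standard loads applied.
    This is only the linear-calibration step, not the full method.
    """
    C_T, *_ = np.linalg.lstsq(raw_readings, applied_loads, rcond=None)
    return C_T.T

# Synthetic 2-axis sensor whose true calibration mixes the channels.
true_C = np.array([[2.0, 0.1],
                   [-0.1, 3.0]])
raw = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
loads = raw @ true_C.T          # noise-free standard loads
C = fit_calibration_matrix(raw, loads)
print(np.allclose(C, true_C))   # True
```

With noise-free data and more load cases than axes, the matrix is recovered exactly; with real measurements, extra well-chosen loads (the design-of-experiments part) control the estimation error.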
Abstract:
Seed dormancy is a frequent phenomenon in tropical species, causing slow and non-uniform germination. To overcome it, treatments such as scarification on an abrasive surface and immersion in hot water are efficient. The objective of this study was to quantify seed germination with no treatment (Experiment 1) and to identify an efficient method of breaking dormancy in Schizolobium amazonicum Huber ex Ducke seeds (Experiment 2). The effects of manual scarification on electric emery, water at 80ºC and 100ºC, and manual scarification on wood sandpaper were studied. Seeds were sown in a sand and sawdust mixture either immediately after scarification or after immersion in water for 24 h. Germination percentage, hard seed percentage and germination speed were recorded and analyzed in a completely randomized design. Analysis of germination was carried out at 6, 9, 12, 15, 18, 21 and 24 days after sowing as a 4x2 factorial design and through regression analysis. Treatment means of the remaining variables were compared by the Tukey test. Seed germination with no treatment started on the 7th day after sowing and reached 90% on the 2310th day (Experiment 1). A significant interaction between treatments to overcome dormancy and time of immersion in water was observed (Experiment 2). In general, immersion in water increased germination in most evaluations. The regression analyses were significant for all treatments with the exception of the control treatment and immersion in water at 80ºC. Germination speed was higher when seeds were scarified on an abrasive surface (emery and sandpaper) and, in these treatments, germination ranged from 87% to 96%, with no hard seeds. S. amazonicum seed coats are impermeable to water, which hinders quick and uniform germination. Scarification on electric emery followed by immediate sowing, scarification on sandpaper followed by immediate sowing, and sowing after 24 h were the most efficient treatments for overcoming dormancy in S. amazonicum seeds.
Abstract:
Difficulty of identification, the lack of segregation systems, and the absence of suitable standards for the coexistence of non-transgenic and transgenic soybean contribute to the contaminations that occur throughout the production system. The objective of this study was to evaluate the efficiency of two methods for detecting mixtures of genetically modified (GM) seeds in samples of non-GM soybean, so that seed lots can be assessed within the standards established by seed legislation. Two sizes of soybean samples (200 and 400 seeds), cv. BRSMG 810C (non-GM) and BRSMG 850GRR (GM), were assessed with four contamination levels (addition of GM seeds to obtain 0.0%, 0.5%, 1.0%, and 1.5% contamination) and two detection methods: lateral flow immunoassay (ILF) and bioassay (pre-imbibition in 0.6% herbicide solution; 25 ºC; 16 h). The bioassay is efficient in detecting the presence of GM seeds in seed samples of non-GM soybean, even at contamination levels lower than 1.0%, provided that the seeds have high physiological quality. The ILF was positive, detecting the presence of the target protein in contaminated samples, indicating the effectiveness of the test. There was a significant correlation between the two detection methods (r = 0.82; p < 0.0001). Sample size did not influence the efficiency of the two methods in detecting the presence of GM seeds.
Abstract:
The aim of this Master’s thesis is to find a method for classifying spare part criticality in the case company. Several approaches exist for the criticality classification of spare parts. The practical problem in this thesis is the lack of a generic analysis method for classifying spare parts of the case company's proprietary equipment. In order to find a classification method, a literature review of various analysis methods is required. The requirements of the case company also have to be recognized; this is achieved by consulting professionals in the company. The literature review shows that the analytic hierarchy process (AHP) combined with decision tree models is a common method for classifying spare parts in the academic literature. Most of the literature discusses spare part criticality from a stock-holding perspective. This perspective is also relevant for a customer-orientated original equipment manufacturer (OEM) such as the case company. A decision tree model is developed for classifying spare parts. The decision tree classifies spare parts into five criticality classes according to five criteria: safety risk, availability risk, functional criticality, predictability of failure, and probability of failure. The criticality classes describe the level of criticality from non-critical to highly critical. The method is verified by classifying the spare parts of a full deposit stripping machine. The classification can be utilized as a generic model for recognizing critical spare parts of other similar equipment, according to which spare part recommendations can be created. The purchase price of an item and equipment criticality were found to have no effect on spare part criticality in this context. The decision tree is recognized as the most suitable method for classifying spare part criticality in the company.
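A decision tree over the five named criteria can be sketched as a simple cascade of checks, each criterion examined in priority order. The branch order, thresholds, 0-1 scoring, and numeric class labels below are illustrative assumptions, not the thesis's actual tree.

```python
def criticality_class(safety_risk, availability_risk, functional_criticality,
                      predictability, failure_probability):
    """Toy decision tree mapping five criteria (each scored 0..1) to a
    criticality class from 1 (non-critical) to 5 (highly critical).
    Thresholds and branch order are illustrative, not from the thesis."""
    if safety_risk > 0.5:
        return 5  # any significant safety risk dominates everything else
    if availability_risk > 0.5:
        return 4  # part shortage would halt equipment availability
    if functional_criticality > 0.5:
        return 3  # part is essential to the equipment's main function
    if predictability < 0.5 and failure_probability > 0.5:
        return 2  # failures are likely and hard to foresee
    return 1      # non-critical

print(criticality_class(0.9, 0.1, 0.1, 0.9, 0.1))  # safety-driven -> 5
print(criticality_class(0.0, 0.0, 0.0, 0.9, 0.1))  # benign spare  -> 1
```

Note that neither purchase price nor equipment criticality appears as an input, mirroring the thesis's finding that they do not affect spare part criticality in this context.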
Abstract:
BACKGROUND: Dyslipidemia is recognized as a major cause of coronary heart disease (CHD). Emerging evidence suggests that the combination of triglycerides (TG) and waist circumference can be used to predict the risk of CHD. However, considering the known limitations of TG, a model combining non-high-density lipoprotein cholesterol (non-HDL = total cholesterol - HDL cholesterol) and waist circumference may be a better predictor of CHD. PURPOSE: The Framingham Offspring Study data were used to determine whether combined non-HDL cholesterol and waist circumference is equivalent to or better than TG and waist circumference (the hypertriglyceridemic waist phenotype) in predicting the risk of CHD. METHODS: A total of 3,196 individuals from the Framingham Offspring Study, aged ≥40 years, who had fasted overnight for ≥9 hours and had no missing information on non-HDL cholesterol, TG levels, and waist circumference measurements, were included in the analysis. The Receiver Operating Characteristic (ROC) Area Under the Curve (AUC) was used to compare the predictive ability of non-HDL cholesterol and waist circumference versus TG and waist circumference. Cox proportional-hazards models were used to examine the association between the joint distributions of non-HDL cholesterol, waist circumference, and non-fatal CHD; TG, waist circumference, and non-fatal CHD; and the joint distribution of non-HDL cholesterol and TG by waist circumference strata, after adjusting for age, gender, smoking, alcohol consumption, diabetes, and hypertension status. RESULTS: The ROC AUCs associated with non-HDL cholesterol and waist circumference and with TG and waist circumference were 0.6428 (CI: 0.6183, 0.6673) and 0.6299 (CI: 0.6049, 0.6548), respectively. The difference in the ROC AUCs is 1.29%. The p-value for testing whether the difference in the ROC AUCs between the two models is zero is 0.10.
There was a stronger positive association between non-HDL cholesterol and the risk of non-fatal CHD within each TG level than for TG levels within each level of non-HDL cholesterol, especially in individuals with high waist circumference. CONCLUSION: The results suggest that the model including non-HDL cholesterol and waist circumference may be superior at predicting CHD compared to the model including TG and waist circumference.
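The two quantities compared in this study are straightforward to compute: non-HDL cholesterol is total cholesterol minus HDL, and the ROC AUC can be estimated as the probability that a randomly chosen case outscores a randomly chosen control (the Mann-Whitney interpretation). A toy sketch with made-up values; none of the numbers below are Framingham data.

```python
def roc_auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the ROC AUC: the probability that a
    randomly chosen case scores higher than a randomly chosen control,
    counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))

# Non-HDL cholesterol = total cholesterol - HDL cholesterol (mg/dL);
# illustrative values only.
total_chol = [240, 210, 180, 160]
hdl = [40, 50, 60, 55]
non_hdl = [t - h for t, h in zip(total_chol, hdl)]
print(non_hdl)                             # [200, 160, 120, 105]

# Toy comparison: suppose the first two subjects had CHD events.
print(roc_auc(non_hdl[:2], non_hdl[2:]))   # 1.0
```

On real data the AUC falls well below 1, as in the study's reported 0.64; the point of the sketch is only the construction of the two quantities.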
Abstract:
This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single-determinant trial function built from a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method in the calculation of non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry in its target density's Green's functions. Breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut down the computational time dramatically.
Abstract:
This thesis focuses on developing an evolutionary art system using genetic programming (GP). The main goal is to produce new forms of evolutionary art that filter existing images into new non-photorealistic (NPR) styles, obtaining images that look like traditional media such as watercolor or pencil, as well as brand new effects. The approach permits GP to generate creative forms of NPR results. The GP language is extended with different techniques and methods inspired by NPR research, such as colour-mixing expressions, image processing filters and painting algorithms. Colour mixing is a major new contribution, as it enables many familiar and innovative NPR effects to arise. Another major innovation is that many GP functions process the canvas (the rendered image) while it is dynamically changing. Automatic fitness scoring uses aesthetic evaluation models and statistical analysis, and multi-objective fitness evaluation is used. Results showed a variety of NPR effects, as well as new, creative possibilities.
Abstract:
It is well known that non-cooperative and cooperative game theory may yield different solutions to games. These differences are particularly dramatic in the case of truels, or three-person duels, in which the players may fire sequentially or simultaneously, and the games may be one-round or n-round. Thus, it is never a Nash equilibrium for all players to hold their fire in any of these games, whereas in simultaneous one-round and n-round truels such cooperation, wherein everybody survives, is in both the α-core and the β-core. On the other hand, both cores may be empty, indicating a lack of stability, when the unique Nash equilibrium is one survivor. Conditions under which each approach seems most applicable are discussed. Although it might be desirable to subsume the two approaches within a unified framework, such unification seems unlikely since the two approaches are grounded in fundamentally different notions of stability.
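The claim that universal holding of fire is never a Nash equilibrium can be checked mechanically for a one-round simultaneous truel with perfect marksmanship, under an assumed lexicographic preference: survive first, then prefer fewer surviving rivals. The model below is an illustration of that single claim, not the paper's full analysis.

```python
PLAYERS = (0, 1, 2)
ACTIONS = (None, 0, 1, 2)  # hold fire, or shoot the named player

def outcome(profile):
    """One simultaneous round with perfect marksmanship: player i
    survives iff no other player targeted i this round."""
    return tuple(all(profile[j] != i for j in PLAYERS if j != i)
                 for i in PLAYERS)

def payoff(profile, i):
    """Assumed lexicographic preference: survival first, then fewer
    surviving rivals (so eliminating a rival is strictly better)."""
    alive = outcome(profile)
    rivals = sum(1 for j in PLAYERS if j != i and alive[j])
    return (1 if alive[i] else 0, -rivals)

def is_nash(profile):
    """True iff no player strictly gains by unilaterally deviating."""
    for i in PLAYERS:
        for a in ACTIONS:
            if a == i:
                continue  # self-shooting is not allowed
            dev = list(profile)
            dev[i] = a
            if payoff(tuple(dev), i) > payoff(profile, i):
                return False
    return True

print(is_nash((None, None, None)))  # False: shooting a rival strictly helps
print(is_nash((1, 0, None)))        # True: 0 and 1 shoot each other, 2 holds
```

All three holding fire fails as an equilibrium because a deviator still survives the round while eliminating a rival, exactly the instability the abstract contrasts with the cooperative cores.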
Abstract:
Moulin (1999) characterizes the fixed-path rationing methods by efficiency, strategy-proofness, consistency, and resource-monotonicity. In this note, we give a straightforward proof of his result.
Abstract:
We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general, possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin’s q and to a model of academic performance.
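The instrument substitution idea can be illustrated with a simple Anderson-Rubin-type statistic: impose the hypothesized structural value, regress the implied residual directly on the instruments, and test their joint significance, which remains valid however weak the instruments are. This is a sketch under the Gaussian-error assumptions mentioned above; the function name and the simulated data are illustrative, not the paper's exact procedure.

```python
import numpy as np

def anderson_rubin_stat(y, x, Z, beta0):
    """AR-type F statistic for H0: beta = beta0 in y = x*beta + u,
    with instrument matrix Z (n x k): regress (y - beta0*x) on a
    constant and Z, and return the F statistic for the joint
    significance of the instruments."""
    n, k = Z.shape
    r = y - beta0 * x
    W = np.column_stack([np.ones(n), Z])
    resid_u = r - W @ np.linalg.lstsq(W, r, rcond=None)[0]  # unrestricted
    resid_r = r - r.mean()                                  # instruments excluded
    ssr_u, ssr_r = resid_u @ resid_u, resid_r @ resid_r
    return ((ssr_r - ssr_u) / k) / (ssr_u / (n - k - 1))

# Simulated IV model with true beta = 1.5.
rng = np.random.default_rng(0)
n = 300
Z = rng.normal(size=(n, 2))
x = Z @ np.array([1.0, 1.0]) + rng.normal(size=n)
y = 1.5 * x + rng.normal(size=n)
print(anderson_rubin_stat(y, x, Z, 1.5) < anderson_rubin_stat(y, x, Z, 0.0))  # True
```

Under the true value the residual is uncorrelated with the instruments and the statistic is small; under a false value the statistic blows up, which is what lets inverting the test yield valid confidence sets.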
Abstract:
In the context of multivariate linear regression (MLR) and seemingly unrelated regressions (SURE) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose finite- and large-sample likelihood-based test procedures for possibly non-linear hypotheses on the coefficients of MLR and SURE systems.