960 results for Modified algorithms
Abstract:
Cholesterol (Chol) is an important lipid in cellular membranes, functioning as a regulator of membrane fluidity and permeability and as a co-factor for some membrane proteins, e.g. G-protein coupled receptors. It also participates in the formation of signaling platforms and gives the membrane more mechanical strength to prevent osmotic lysis of the cell. The sterol structure is highly conserved, and even minor structural modifications can completely abolish its membrane functions. Proper interaction with adjacent lipids and the preference for certain lipid structures over others are also key factors in determining the membrane properties of cholesterol. Because of these many important properties, it is valuable to understand the forces and structural properties that govern the membrane behavior of this sterol. In this thesis we used established fluorescence spectroscopy methods to study the membrane behavior of both cholesterol and some of its 3β-modified analogs. Using several fluorescent probes, we established how the acyl chain order of the two main lipid species, sphingomyelin (SM) and phosphatidylcholine (PC), affects sterol partitioning, and we characterized the membrane properties of 3β-aminocholesterol and cholesteryl phosphocholine. We concluded that cholesterol prefers SM over PC at equal acyl chain order, indicating that structural properties beyond acyl chain order are important for sphingomyelin-sterol interactions. A positive charge at the 3β position caused only minor changes in sterol membrane behavior compared to cholesterol. A large phosphocholine head group disrupted membrane packing together with other membrane lipids with large head groups, but was also able to form stable fluid bilayers together with ceramide and cholesterol. The ability of the large head group sterol to form bilayers with ceramide was further explored in the last paper, where cholesteryl phosphocholine/ceramide (Chol-PC/Cer) complexes were successfully used to transfer ceramide into cultured cells.
Abstract:
Identification of low-dimensional structures and of the main sources of variation in multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. The objective of this thesis is therefore to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered the relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most earlier approaches are inadequate; examples include the identification of faults from seismic data and of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but it also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, a typical problem in visual object tracking.
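To make the ridge-projection idea concrete, here is a minimal Python sketch of one subspace-constrained mean-shift step toward a ridge of a Gaussian kernel density estimate. This is a simpler, first-order relative of the trust region Newton method developed in the thesis, not the thesis method itself; the function name, the default one-dimensional ridge, and the step form are illustrative assumptions.

```python
import numpy as np

def scms_step(x, data, h, ridge_dim=1):
    """One subspace-constrained mean-shift step toward a density ridge.

    x: current point, shape (d,); data: samples, shape (n, d);
    h: Gaussian kernel bandwidth. Illustrative sketch only; the thesis
    develops a convergent trust region Newton method for this projection.
    """
    d = x.size
    diff = data - x                                     # (n, d)
    w = np.exp(-0.5 * np.sum(diff**2, axis=1) / h**2)   # kernel weights
    w /= w.sum()
    g = diff.T @ w / h**2                               # grad p / p at x
    # Hessian of the log-density: E_w[dd^T]/h^4 - I/h^2 - g g^T
    H = (diff.T * w) @ diff / h**4 - np.eye(d) / h**2 - np.outer(g, g)
    evals, evecs = np.linalg.eigh(H)                    # ascending eigenvalues
    V = evecs[:, :d - ridge_dim]        # directions transverse to the ridge
    return x + V @ (V.T @ (diff.T @ w))   # mean-shift step projected onto V
```

Iterating this step from points near the data converges to the ridge set under mild conditions; the thesis's trust region Newton method plays the role of a faster, provably convergent second-order alternative to this projected first-order step.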
Abstract:
Oligonucleotides have a wide range of applications in fields such as biotechnology, molecular biology, diagnosis and therapy. However, the spectrum of uses can be broadened by introducing chemical modifications into their structures. The most prolific field in the search for new oligonucleotide analogs is the antisense strategy, where chemical modifications confer appropriate characteristics such as hybridization, resistance to nucleases, cellular uptake and selectivity, in short, good pharmacokinetic and pharmacodynamic properties. Combinatorial technology is another research area where oligonucleotides and their analogs are extensively employed. Aptamers and new catalytic ribozymes and deoxyribozymes are RNA or DNA molecules isolated from a randomly synthesized library on the basis of a particular property. They are identified by repeated cycles of selection and amplification using PCR technologies. Modified nucleotides can be introduced either during the amplification procedure or after selection.
Abstract:
The introduction of highly active antiretroviral therapy (HAART) for patients infected with HIV has significantly prolonged life expectancy and, to some extent, has restored a functional immune response. However, the premature introduction of HAART has led to a significant and alarming increase in cardiovascular complications, including myocardial infarction and the abnormal distribution of body fat seen as lipodystrophy. One key element in the development of ischemic coronary artery disease is the presence of circulating and tissue-fixed modified low density lipoprotein (mLDL), which contributes to the initiation and progression of arterial lesions and to the formation of foam cells. Although not completely elucidated, the most likely mechanism involves mLDL in the inflammatory response and in the induction of a specific immune response against mLDL. Circulating antibodies against mLDL can serve as an indirect marker of the presence of circulating and vessel-fixed mLDL. In the present study, we measured antibodies to mLDL and correlated them with immune status (i.e., number of CD4+ T cells) in 59 HIV patients and with the clinical manifestation of lipodystrophy in 10 patients. We observed a significant reduction in anti-mLDL antibody levels related both to lipodystrophy and to an immunocompromised state in HIV patients. We speculate that these antibodies may explain in part the rapid development of ischemic coronary artery disease in some patients.
Abstract:
The world's population is growing at a rapid rate, and one of the primary problems of a growing population is food supply. To ensure food supply and security, the biggest companies in the agricultural sector of the United States and around the world have collaborated to produce genetically modified organisms, including crops, that tend to increase yields and are claimed to reduce pesticide use. It is a technology declared to have a multitude of benefits. During the same period, another set of practices has appeared on the horizon under the name of agroecology. It spans many different sectors, such as politics, sociology, environment and health. Moreover, it involves traditional organic techniques that can be applied at the farm level to enhance the performance of an ecosystem, effectively decreasing the negative effects on the environment and on individual health while producing good-quality food. Since both approaches proclaim sustainable development, a natural question arises: which one is more favorable? In this study, genetically modified organisms (GMOs) and agroecology are compared with respect to social, environmental and health aspects. The results, derived from a comparative analysis of the scientific literature, tend to show that GMOs pose a greater threat to the environment, the health of individuals and the general social balance in the United States than agroecological practices do. Economic indicators were not included in the study, and further studies may be needed to obtain a broader view of the subject.
Abstract:
The objective of the present study was to determine whether the acute behavioral effects of cocaine administered intraperitoneally (ip) at doses of 5, 10 and 20 mg/kg to white male CF1 mice, 90 days of age, would be influenced by leptin acutely administered ip (at doses of 5, 10 and 20 µg/kg) or by endogenous leptin production enhanced by a high-fat diet. The acute behavioral effects of cocaine were evaluated in open-field, elevated plus-maze and forced swimming tests. Results were compared between a group of 80 mice consuming either a balanced diet or a high-fat diet, and a group of 80 mice fed a commercially available rodent chow formula (Ralston Purina) but receiving recombinant leptin (rLeptin) or saline ip. Both the high-fat-fed and rLeptin-treated mice showed decreased locomotion in the open-field test, spent more time in the open arms of the elevated plus-maze and showed less immobility time in the forced swimming test (F(1,68) = 7.834, P = 0.007). There was an interaction between diets and cocaine/saline treatments in locomotion (F(3,34) = 3.751, P = 0.020) and exploration (F(3,34) = 3.581, P = 0.024). These results suggest that leptin induced anxiolytic effects and increased general activity in cocaine-treated mice and that low leptin levels are associated with behavioral depression. Chronic changes in diet composition producing high leptin levels, or rLeptin treatment, may result in an altered response to cocaine in ethologic tests that measure degrees of anxiety and depression, which could be attributed to an antagonistic effect of leptin.
Abstract:
A method for the screening of tetanus and diphtheria antibodies in serum using anatoxin (inactivated toxin) instead of toxin, based on the toxin-binding inhibition test (TOBI test), was developed as an alternative to the in vivo toxin neutralization assay. In this study, the serum titers (values between 1.0 and 19.5 IU) measured by a modified TOBI test (Modi-TOBI test) and by toxin neutralization assays were correlated (P < 0.0001). Titers of tetanus or diphtheria antibodies were evaluated in serum samples from guinea pigs immunized with tetanus toxoid, diphtheria-tetanus vaccine or triple vaccine. For the Modi-TOBI test, after blocking the microtiter plates, standard tetanus or diphtheria antitoxin and different concentrations of guinea pig sera were incubated with the respective anatoxin. Twelve hours later, these samples were transferred to a plate previously coated with tetanus or diphtheria antitoxin to bind the remaining anatoxin. The anatoxin was then detected with a peroxidase-labeled tetanus or diphtheria antitoxin. Serum titers were calculated from a linear regression plot of the results for the corresponding standard antitoxin. For the toxin neutralization assay, L+/10/50 doses of either toxin combined with different concentrations of the serum samples were inoculated into mice for anti-tetanus detection or into guinea pigs for anti-diphtheria detection. Both assays were suitable for determining wide ranges of antitoxin levels. The linear regression plots showed high correlation coefficients between the in vitro and the in vivo assays for tetanus (r² = 0.95, P < 0.0001) and for diphtheria (r² = 0.93, P < 0.0001). The standardized method is appropriate for evaluating titers of neutralizing antibodies, thus permitting the in vitro control of serum antitoxin levels.
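As an illustration of the titer-calculation step, the sketch below fits a linear regression to a standard antitoxin dilution series and reads unknown serum titers off the fitted line, as the abstract describes. The numeric values are invented placeholders and the log-linear calibration model is an assumption for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical standard antitoxin dilution series (IU) and assay readouts;
# placeholder values for illustration only.
std_titer = np.array([1.0, 2.5, 5.0, 10.0, 19.5])
std_signal = np.array([0.15, 0.34, 0.52, 0.71, 0.90])

# Fit the readout as a linear function of log10(titer), a common
# calibration choice for this kind of standard curve.
slope, intercept = np.polyfit(np.log10(std_titer), std_signal, 1)

def titer_from_signal(signal):
    """Invert the fitted standard curve to estimate a serum titer (IU)."""
    return 10 ** ((signal - intercept) / slope)

print(round(titer_from_signal(0.60), 2))  # interpolated titer for one sample
```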
Abstract:
We compared the cost-benefit of two algorithms recently proposed by the Centers for Disease Control and Prevention, USA, with that of the conventional algorithm, the most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of anti-HCV ELISA samples, using an s/co ratio that shows ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, and IB for PCR-negative samples. For algorithm C, all ELISA-positive or -inconclusive samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287 and 285 for A, B and C, respectively, of which 283 were concordant among all three. Indeterminate results from algorithms A and C were resolved by PCR (expanded algorithm), which detected two more positive samples. The estimated costs of algorithms A and B were US$21,299.39 and US$32,397.40, respectively, making them 43.5 and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples, while algorithm B provides earlier information about the presence of viremia.
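The percentage savings quoted above follow directly from the listed costs; the short Python check below (hypothetical variable names) reproduces them from the abstract's figures.

```python
# Reproduce the reported savings of algorithms A and B relative to C.
costs = {"A": 21_299.39, "B": 32_397.40, "C": 37_673.79}  # US$, from the abstract
for alg in ("A", "B"):
    saving = 100 * (costs["C"] - costs[alg]) / costs["C"]
    print(f"Algorithm {alg}: {saving:.1f}% more economical than C")
# Algorithm A: 43.5% more economical than C
# Algorithm B: 14.0% more economical than C
```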
Abstract:
Permanent bilateral occlusion of the common carotid arteries (2VO) in the rat is an established experimental model for investigating the effects of chronic cerebral hypoperfusion on cognitive function and neurodegenerative processes. Our aim was to compare the cognitive and morphological outcomes of the standard 2VO procedure, in which both arteries are ligated concomitantly, with those of a modified protocol with a 1-week interval between artery occlusions to avoid an abrupt reduction of cerebral blood flow, as assessed by animal performance in the water maze and by the extent of damage to the hippocampus and striatum. Male Wistar rats (N = 47) aged 3 months were subjected to chronic hypoperfusion by permanent bilateral ligation of the common carotid arteries using either the standard or the modified protocol, with the right carotid occluded first. Three months after the surgical procedure, performance in the water maze was assessed to investigate long-term effects on spatial learning and memory, and the brains were processed to estimate hippocampal volume and striatal area. Both groups of hypoperfused rats showed deficits in reference memory (F(8,172) = 7.0951, P < 0.00001) and in working spatial memory [2nd (F(2,44) = 7.6884, P < 0.001), 3rd (F(2,44) = 21.481, P < 0.00001) and 4th trials (F(2,44) = 28.620, P < 0.0001)]; however, no evidence of tissue atrophy was found in the brain structures studied. Despite similar behavioral and morphological outcomes, the rats submitted to the modified protocol showed a significantly higher survival rate during the 3 months of the experiment (P < 0.02).
Abstract:
The purpose of the present study was to explore the usefulness of the Mexican sequential organ failure assessment (MEXSOFA) score for assessing the risk of mortality in critically ill patients in the ICU. A total of 232 consecutive patients admitted to an ICU were included in the study. The MEXSOFA was calculated using the original SOFA scoring system with two modifications: the PaO2/FiO2 ratio was replaced with the SpO2/FiO2 ratio, and the evaluation of neurologic dysfunction was excluded. The ICU mortality rate was 20.2%. Patients with an initial MEXSOFA score of 9 points or less, calculated during the first 24 h after admission to the ICU, had a mortality rate of 14.8%, while those with an initial MEXSOFA score of 10 points or more had a mortality rate of 40%. The MEXSOFA score at 48 h was also associated with mortality: patients with a score of 9 points or less had a mortality rate of 14.1%, while those with a score of 10 points or more had a mortality rate of 50%. In a multivariate analysis, only the MEXSOFA score at 48 h was an independent predictor of in-ICU death, with an OR = 1.35 (95%CI = 1.14-1.59, P < 0.001). The SOFA and MEXSOFA scores calculated 24 h after admission to the ICU demonstrated good discrimination for predicting in-ICU mortality risk in critically ill patients. The MEXSOFA score at 48 h was an independent predictor of death; with each 1-point increase, the odds of death increased by 35%.
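The final sentence is the standard reading of an odds ratio from a logistic regression. The short sketch below, an illustration rather than the study's own analysis, shows how the 35%-per-point figure scales multiplicatively with larger score increases.

```python
import math

# OR = 1.35 per 1-point MEXSOFA increase implies a logistic coefficient:
beta = math.log(1.35)

for delta in (1, 2, 5):
    # Odds multiplier for a `delta`-point increase in the 48-h score.
    print(delta, round(math.exp(beta * delta), 2))
# 1 -> 1.35 (the reported +35%), 2 -> 1.82, 5 -> 4.48
```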
Abstract:
Our objective was to evaluate the accuracy of three algorithms in differentiating the origins of outflow tract ventricular arrhythmias (OTVAs). This study involved 110 consecutive patients with OTVAs in whom a standard 12-lead surface electrocardiogram (ECG) showed a typical left bundle branch block morphology with an inferior axis. All ECG tracings were retrospectively analyzed using the following three recently published ECG algorithms: 1) the transitional zone (TZ) index, 2) the V2 transition ratio, and 3) the V2 R wave duration and R/S wave amplitude indices. Considering all patients, the V2 transition ratio had the highest sensitivity (92.3%), while the V2 R wave duration and R/S wave amplitude indices had the highest specificity (93.9%); the latter also had the maximal area under the ROC curve, 0.925. In patients with left ventricular (LV) rotation, the V2 transition ratio had the highest sensitivity (94.1%), while the V2 R wave duration and R/S wave amplitude indices had the highest specificity (87.5%); here the former had the maximal area under the ROC curve, 0.892. All three published ECG algorithms are effective in differentiating the origin of OTVAs, with the V2 transition ratio and the V2 R wave duration and R/S wave amplitude indices being the most sensitive and the most specific algorithms, respectively. Among all patients, the V2 R wave duration and R/S wave amplitude algorithm had the maximal area under the ROC curve, but in patients with LV rotation the V2 transition ratio algorithm had the maximal area under the ROC curve.
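As a concrete example of one of these criteria, the sketch below computes the V2 transition ratio as commonly defined in the literature: the percentage R wave in lead V2 during the arrhythmia divided by that during sinus rhythm. Both the definition and the ≥0.6 cutoff for a left ventricular origin come from the original publication of the algorithm, not from this abstract, and should be treated as assumptions here.

```python
def v2_transition_ratio(r_vt, s_vt, r_sinus, s_sinus):
    """V2 transition ratio for outflow tract arrhythmias.

    Inputs are R- and S-wave amplitudes (e.g., in mV) in lead V2 during
    the arrhythmia (vt) and during sinus rhythm. As commonly published,
    a ratio >= 0.6 favors a left ventricular outflow tract origin
    (assumed cutoff; verify against the original algorithm paper).
    """
    pct_r_vt = r_vt / (r_vt + s_vt)              # percentage R wave during VT
    pct_r_sinus = r_sinus / (r_sinus + s_sinus)  # percentage R wave in sinus
    return pct_r_vt / pct_r_sinus

# Example: small R with deep S during VT vs. sinus rhythm in V2
print(round(v2_transition_ratio(0.3, 1.2, 0.5, 1.0), 2))  # -> 0.6
```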
Abstract:
Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive, yet real-time performance, high accuracy and low power consumption are essential requirements for such systems. When all these requirements are combined, hardware acceleration of these algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of such hardware acceleration solutions: which algorithms have been implemented in hardware, and what modifications have been made to adapt these algorithms to hardware.
Abstract:
Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably represented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
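As background for the kind of algorithm such surveys cover, the sketch below illustrates the core of the widely used quadric error metric (Garland-Heckbert) approach to mesh simplification: accumulating plane quadrics per vertex and costing a candidate edge collapse. This is a generic textbook illustration, not one of the algorithms the paper tests.

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """4x4 fundamental error quadric for the plane of triangle (p0, p1, p2)."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)          # unit normal (a, b, c)
    plane = np.append(n, -n @ p0)      # plane [a, b, c, d] with ax+by+cz+d = 0
    return np.outer(plane, plane)

def collapse_cost(Q, v):
    """Sum of squared distances from point v to the planes accumulated in Q."""
    vh = np.append(v, 1.0)             # homogeneous coordinates
    return float(vh @ Q @ vh)

# Cost of moving a vertex shared by two faces to a candidate position:
a, b, c, d = (np.array(p, float) for p in
              [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0.2)])
Q = plane_quadric(a, b, c) + plane_quadric(b, d, c)
print(collapse_cost(Q, np.array([0.5, 0.5, 0.1])))
```

In a full simplifier, every edge gets such a cost, the cheapest collapse is applied, and the endpoint quadrics are summed; that greedy loop is what most CAD-capable decimation tools build on.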
Abstract:
Wind turbines based on doubly fed induction generators (DFIGs) have become the most popular solution in the high-power wind generation industry. While this topology provides great performance with a reduced power rating of the power converter, it has a more complicated structure than full-rated topologies, which leads to more complex control algorithms and electromechanical processes in the system. The purpose of the present study is to develop a proper vector control scheme for the DFIG and an overall control for the wind turbine (WT), and to investigate its behavior at different wind speeds and under different grid voltage conditions: voltage sags and magnitude and frequency variations. The key principles of the variable-speed wind turbine were implemented in a simulation model and demonstrated during the study. Then, based on the developed control scheme and mathematical model, a set of simulations was run to analyze the reactive power capabilities of the DFIG wind turbine. Furthermore, the rating of the rotor-side converter was modified not only to generate rated active power, but also to fulfill the Grid Codes. Results of modeling and analyzing the behavior of the DFIG WT at different speeds and under different voltage conditions are presented in the work.
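Vector control of a DFIG operates in a rotating dq reference frame; the sketch below shows the standard amplitude-invariant Park transform that such a scheme applies to measured stator or rotor currents before the current controllers act. It is a generic building block of any such scheme, not the specific controller developed in this work.

```python
import math

def abc_to_dq(ia, ib, ic, theta):
    """Amplitude-invariant Park transform of three-phase quantities.

    theta is the electrical angle of the rotating reference frame (rad),
    e.g. the stator flux angle in flux-oriented DFIG vector control.
    """
    two_thirds = 2.0 / 3.0
    d = two_thirds * (ia * math.cos(theta)
                      + ib * math.cos(theta - 2.0 * math.pi / 3.0)
                      + ic * math.cos(theta + 2.0 * math.pi / 3.0))
    q = -two_thirds * (ia * math.sin(theta)
                       + ib * math.sin(theta - 2.0 * math.pi / 3.0)
                       + ic * math.sin(theta + 2.0 * math.pi / 3.0))
    return d, q

# A balanced unit-amplitude current aligned with the frame maps to d=1, q=0:
print(abc_to_dq(math.cos(0.0),
                math.cos(-2.0 * math.pi / 3.0),
                math.cos(2.0 * math.pi / 3.0), 0.0))
```

In the dq frame the three-phase currents become near-DC signals, so the active and reactive power loops mentioned above reduce to ordinary PI current control on the d and q axes.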
Abstract:
The aim of this experiment was to evaluate how susceptible spores become to mechanical damage during food extrusion after being exposed to CO2. B. stearothermophilus spores added to a corn and soy mix were held under 99% CO2 for 10 days and extruded in a single-screw extruder. The treatments were: T1, spore-containing samples extruded at a screw rotational speed of 65 rpm and a barrel wall temperature of 80 °C; T2, as T1 except for a screw rotational speed of 150 rpm; and T3, as T2 except that the samples had been submitted to the modified atmosphere. The results for cell viability, minimum and maximum residence times, and static pressure were: T1, 19.90 ± 3.24%, 123.3 ± 14.50 s, 203.3 ± 14.05 s and 2,217 ± 62 kPa; T2, 21.42 ± 8.24%, 70.00 ± 5.77 s, 170.00 ± 4.67 s and 2,310 ± 107 kPa; and T3, 11.06 ± 2.46%, 86.00 ± 7.23 s, 186.00 ± 7.50 s and 2,403 ± 93 kPa. It was concluded that the extrusion process did reduce the cell count; however, neither the variation in screw rotational speed nor the CO2 pre-treatment affected cell viability.