969 results for Lipids - reference interval


Relevance:

30.00%

Publisher:

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which designers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. In this Ph.D. Thesis we explore different aspects of the quantization problem and present new methodologies for each of them.
Techniques based on interval extensions have made it possible to obtain accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art methodology based on statistical Modified Affine Arithmetic (MAA) in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that exercise each of them, and extracts the statistical moments of the system from these partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% with respect to simulation-based reference values. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the targeted systems grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group independently, and then combines the partial results. In this way, the number of noise sources present at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, in order to keep the results as accurate as possible.
This Thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We present two novel techniques that approach the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a given confidence interval for the final results of the search, we can use more relaxed confidence levels, which imply considerably fewer simulation samples, in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be accelerated by factors of up to ×240 for small/medium-sized problems. Finally, this book introduces HOPLITE, an automated, flexible and modular quantization framework that includes the implementation of the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new quantization methodologies. We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its operation. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
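The following is a minimal sketch of the kind of Monte-Carlo-driven greedy word-length search described above, including an "incremental" sample schedule that uses few samples early and more near convergence. It is not the HOPLITE implementation; the toy data path, the error threshold and the sample schedule are illustrative assumptions.

```python
import numpy as np

def quantize(x, frac_bits):
    """Round x onto a fixed-point grid with the given number of fractional bits."""
    step = 2.0 ** -frac_bits
    return np.round(x / step) * step

def system(a, b):
    """Toy non-linear data path whose output error we want to bound."""
    return a * b + np.sin(a)

def mc_error(frac_bits, n_samples, rng):
    """Monte-Carlo estimate of the mean absolute output error for a
    word-length assignment frac_bits = (bits_a, bits_b)."""
    a = rng.uniform(-1.0, 1.0, n_samples)
    b = rng.uniform(-1.0, 1.0, n_samples)
    ref = system(a, b)
    approx = system(quantize(a, frac_bits[0]), quantize(b, frac_bits[1]))
    return np.mean(np.abs(ref - approx))

def greedy_wordlength_search(max_error=1e-3, start_bits=16, seed=0):
    """Greedily shave fractional bits, one signal at a time, while the
    Monte-Carlo error estimate stays below max_error."""
    rng = np.random.default_rng(seed)
    bits = [start_bits, start_bits]
    n_samples = 200                       # relaxed accuracy far from the solution
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):        # try reducing each signal's width
            trial = list(bits)
            trial[i] -= 1
            if trial[i] < 1:
                continue
            if mc_error(trial, n_samples, rng) <= max_error:
                bits = trial
                improved = True
        n_samples = min(n_samples * 2, 20000)  # tighten confidence as we converge
    return bits

if __name__ == "__main__":
    print("selected fractional bits:", greedy_wordlength_search())
```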

Relevance:

30.00%

Publisher:

Abstract:

Bibliographical footnotes.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation presents a unique research opportunity by using recordings that provide an electrocardiogram (ECG) plus a reference breathing signal (RBS). ECG-derived breathing (EDR) is measured and correlated against the RBS. Standard deviations of multiresolution wavelet analysis coefficients (SDMW) are obtained from heart rate and classified using the RBS. Prior work by others used selected patients for sleep apnea scoring with EDR but no RBS; another prior study classified selected heart disease patients with SDMW but no RBS. This study used randomly chosen sleep disorder patient recordings covering central and obstructive apneas, with and without heart disease. Implementation required creating an application because existing systems were limited in power and scope. A review survey was created to choose a development environment; the survey is presented as a learning tool and teaching resource. The development objectives were rapid development using limited resources (manpower and money), and Open Source resources were used exclusively for the implementation. Results show: (1) Three groups of patients exist in the study. Grouping the RBS correlations shows a response with either ECG interval or amplitude variation; in a third group, neither ECG intervals nor amplitude variation correlate with breathing. (2) Previous work by other groups analyzed SDMW. Similar results were found in this study, but some subjects had higher SDMW, attributed to a large number of apneas, arousals and/or disconnects. SDMW does not need the RBS to show that apneic conditions exist within ECG recordings. (3) The results of this study support the assertion that autonomic nervous system variation was measured with SDMW. Measurements using the RBS are not corrupted by breathing even though respiration overlaps the same frequency band. Overall, this work becomes an Open Source resource which can be reused, modified and/or expanded, and it might fast-track additional research. In the future the system could also be used with public domain data: prerecorded data exist in similar formats in public databases, which could provide additional research opportunities.
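A minimal sketch of the SDMW idea described above, assuming an evenly resampled heart-rate (RR-interval) series is already available; the wavelet family, decomposition depth and variable names are illustrative choices, not the dissertation's exact settings.

```python
import numpy as np
import pywt  # PyWavelets

def sdmw(heart_rate, wavelet="db4", levels=5):
    """Standard deviation of the detail coefficients at each wavelet scale."""
    coeffs = pywt.wavedec(heart_rate, wavelet, level=levels)
    details = coeffs[1:]                 # coeffs[0] is the approximation band
    return [float(np.std(d)) for d in details]

if __name__ == "__main__":
    # Synthetic heart-rate series: a baseline plus a respiration-like oscillation.
    t = np.arange(0, 300, 0.5)           # series resampled at 2 Hz
    hr = 70 + 3 * np.sin(2 * np.pi * 0.25 * t) + np.random.normal(0, 0.5, t.size)
    print(sdmw(hr))
```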

Relevance:

30.00%

Publisher:

Abstract:

Our aim was to determine the normative reference values of cardiorespiratory fitness (CRF) and to establish the proportion of subjects with low CRF suggestive of future cardio-metabolic risk.

Relevance:

20.00%

Publisher:

Abstract:

The raft hypothesis proposes that microdomains enriched in sphingolipids, cholesterol, and specific proteins are transiently formed to accomplish important cellular tasks. Detergent-resistant membranes were initially, and equivocally, assumed to be identical to membrane rafts because of similarities between their compositions; in fact, the impact of detergents on membrane organization is still controversial. Here, we use phase contrast and fluorescence microscopy to observe giant unilamellar vesicles (GUVs) made of erythrocyte membrane lipids (erythro-GUVs) when exposed to the detergent Triton X-100 (TX-100). We clearly show that TX-100 has a restructuring action on biomembranes. Contact with TX-100 readily induces domain formation on the previously homogeneous membrane of erythro-GUVs at physiological and room temperatures. The shape and dynamics of the formed domains point to liquid-ordered/liquid-disordered (Lo/Ld) phase separation, typically found in raft-like ternary lipid mixtures. The Ld domains are then separated from the original vesicle and completely solubilized by TX-100. The remaining insoluble vesicle, in the Lo phase, represents around 2/3 of the original vesicle surface at room temperature and decreases to almost 1/2 at physiological temperature. This chain of events could be entirely reproduced with biomimetic GUVs made of a simple 2:1:2 POPC/SM/chol (phosphatidylcholine/sphingomyelin/cholesterol) ternary lipid mixture, showing that this behavior arises from fundamental physicochemical properties of simple lipid mixtures. This work provides direct visualization of TX-100-induced domain formation followed by selective (Ld phase) solubilization in a model system with a complex biological lipid composition.

Relevance:

20.00%

Publisher:

Abstract:

Lawsonia inermis-mediated synthesis of silver nanoparticles (Ag-NPs) and their efficacy against Candida albicans, Microsporum canis, Propionibacterium acnes and Trichophyton mentagrophytes are reported. A two-step mechanism, involving bioreduction and the formation of an intermediate complex that leads to the synthesis of capped nanoparticles, is proposed. In addition, an antimicrobial gel against M. canis and T. mentagrophytes was also formulated. The Ag-NPs were synthesized by challenging the leaf extract of L. inermis with 1 mM AgNO₃ and were characterized by ultraviolet-visible (UV-Vis) spectrophotometry and Fourier transform infrared spectroscopy (FTIR). Transmission electron microscopy (TEM), nanoparticle tracking analysis (NTA) and zeta potential measurements were used to determine the size of the Ag-NPs. The antimicrobial activity of the Ag-NPs was evaluated by the disc diffusion method against the test organisms. These Ag-NPs may prove to be a better candidate drug owing to their biogenic nature; moreover, Ag-NPs may be an answer to drug-resistant microorganisms.

Relevance:

20.00%

Publisher:

Abstract:

A rapid, sensitive and specific method for quantifying propylthiouracil in human plasma using methylthiouracil as the internal standard (IS) is described. The analyte and the IS were extracted from plasma by liquid-liquid extraction with an organic solvent (ethyl acetate). The extracts were analyzed by high performance liquid chromatography coupled with electrospray tandem mass spectrometry (HPLC-MS/MS) in negative mode (ES-). Chromatography was performed on a Phenomenex Gemini C18 5 μm analytical column (4.6 mm × 150 mm i.d.) with a mobile phase consisting of methanol/water/acetonitrile (40/40/20, v/v/v) + 0.1% formic acid. For propylthiouracil and the IS, the optimized declustering potential, collision energy and collision exit potential were -60 V, -26 eV and -5 V, respectively. The method had a chromatographic run time of 2.5 min and a linear calibration curve over the range 20-5000 ng/mL; the limit of quantification was 20 ng/mL. Stability tests indicated no significant degradation. This HPLC-MS/MS procedure was used to assess the bioequivalence of two propylthiouracil 100 mg tablet formulations in healthy volunteers of both sexes in the fasted and fed states. The geometric means and 90% confidence intervals (CI) of the Test/Reference percent ratios were, without and with food, respectively: 109.28% (103.63-115.25%) and 115.60% (109.03-122.58%) for Cmax, and 103.31% (100.74-105.96%) and 103.40% (101.03-105.84%) for AUClast. This method offers advantages over those previously reported, both in terms of a simple liquid-liquid extraction without clean-up procedures and a faster run time (2.5 min). The LOQ of 20 ng/mL is well suited for pharmacokinetic studies, and the assay performance results indicate that the method is precise and accurate enough for the routine determination of propylthiouracil in human plasma. The test formulation, with and without food, was bioequivalent to the reference formulation. Food administration increased Tmax and decreased the bioavailability (Cmax and AUC).
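An illustrative sketch of the bioequivalence statistic reported above: the geometric-mean Test/Reference ratio with its 90% confidence interval, computed from log-transformed Cmax values under a simplified paired (within-subject) model rather than the full crossover ANOVA. The data below are synthetic, not the study's values.

```python
import numpy as np
from scipy import stats

def ratio_and_90ci(test, reference):
    """Geometric mean ratio (%) and 90% CI from paired measurements."""
    diff = np.log(test) - np.log(reference)          # within-subject log ratios
    n = diff.size
    mean, se = diff.mean(), diff.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(0.95, df=n - 1)             # two-sided 90% interval
    lo, hi = mean - t_crit * se, mean + t_crit * se
    return tuple(100 * np.exp(v) for v in (mean, lo, hi))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref_cmax = rng.lognormal(mean=7.0, sigma=0.3, size=24)           # hypothetical subjects
    test_cmax = ref_cmax * rng.lognormal(mean=0.08, sigma=0.1, size=24)
    print("GMR%% (90%% CI): %.2f (%.2f-%.2f)" % ratio_and_90ci(test_cmax, ref_cmax))
```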

Relevance:

20.00%

Publisher:

Abstract:

The genera Cochliomyia and Chrysomya contain both obligate and saprophagous flies, which allows the comparison of different feeding habits between closely related species. Among the different strategies for comparing these habits is the use of qPCR to investigate the expression levels of candidate genes involved in feeding behavior. To ensure an accurate measure of the levels of gene expression, it is necessary to normalize the amount of the target gene with the amount of a reference gene having a stable expression across the compared species. Since there is no universal gene that can be used as a reference in functional studies, candidate genes for qPCR data normalization were selected and validated in three Calliphoridae (Diptera) species: Cochliomyia hominivorax Coquerel, Cochliomyia macellaria Fabricius, and Chrysomya albiceps Wiedemann. The expression stability of six genes (Actin, Gapdh, Rp49, Rps17, α-tubulin, and GstD1) was evaluated among species within the same life stage and between life stages within each species. The expression levels of Actin, Gapdh, and Rp49 were the most stable among the selected genes. These genes can be used as reliable reference genes for functional studies in Calliphoridae using similar experimental settings.
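To illustrate how reference-gene stability can be ranked, the sketch below computes a geNorm-style stability measure (the pairwise-variation M value of Vandesompele et al., 2002); it is not necessarily the exact procedure used in the study, and the expression matrix is synthetic.

```python
import numpy as np

def genorm_m(expression, gene_names):
    """Average pairwise-variation M value per gene: lower M means more stable.
    `expression` has one row per candidate gene and one column per sample."""
    log_expr = np.log2(expression)
    n_genes = log_expr.shape[0]
    m_values = {}
    for j in range(n_genes):
        variations = [np.std(log_expr[j] - log_expr[k], ddof=1)
                      for k in range(n_genes) if k != j]
        m_values[gene_names[j]] = float(np.mean(variations))
    return m_values

if __name__ == "__main__":
    genes = ["Actin", "Gapdh", "Rp49", "Rps17", "alpha-tubulin", "GstD1"]
    rng = np.random.default_rng(0)
    data = rng.lognormal(mean=5, sigma=0.2, size=(6, 9))   # synthetic 6 genes x 9 samples
    for gene, m in sorted(genorm_m(data, genes).items(), key=lambda kv: kv[1]):
        print(f"{gene}: M = {m:.3f}")
```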

Relevance:

20.00%

Publisher:

Abstract:

By comparing the SEED and Pfam functional profiles of metagenomes from two Brazilian coral species with 29 publicly available datasets, we were able to identify some functions, such as protein secretion systems, that are overrepresented in coral metagenomes and may play a role in the establishment and maintenance of bacteria-coral associations. However, only a small percentage of the reads of these metagenomes could be annotated against these reference databases, which may lead to a strong bias in comparative studies. For this reason, we searched for identical sequences (99% nucleotide identity) among these metagenomes in order to perform a reference-independent comparative analysis, and we were able to identify groups of microbial communities that may be under similar selective pressures. The identification of sequences shared among the metagenomes proved even better at identifying groups of communities with similar niche requirements than the traditional analysis of functional profiles. This approach is not only helpful for investigating similarities between microbial communities with a high proportion of unknown reads, but also enables an indirect overview of gene exchange between communities.
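A minimal sketch in the spirit of the reference-independent comparison described above: each metagenome is represented by the set of (near-)identical sequence clusters it contains, and pairs of communities are compared by how many clusters they share. The dataset names and sequence identifiers below are hypothetical; in practice the sets would come from an all-vs-all clustering of reads at 99% nucleotide identity.

```python
from itertools import combinations

def jaccard(a, b):
    """Fraction of shared sequence clusters between two metagenomes."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical cluster memberships per metagenome.
metagenomes = {
    "coral_A": {"seq1", "seq2", "seq3", "seq7"},
    "coral_B": {"seq1", "seq2", "seq4", "seq7"},
    "seawater": {"seq5", "seq6", "seq8"},
}

for (name1, s1), (name2, s2) in combinations(metagenomes.items(), 2):
    print(f"{name1} vs {name2}: shared = {len(s1 & s2)}, Jaccard = {jaccard(s1, s2):.2f}")
```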

Relevance:

20.00%

Publisher:

Abstract:

Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker for cardiovascular disease risk. We established reference values of mean HDL size and volume in an asymptomatic representative Brazilian population sample (n=590) and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethyleneglycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C (total population); non-white ethnicity and CETP inversely (females); HDL-C and PLTP mass (males). On the other hand, HDL volume was determined only by HDL-C (total population and in both genders) and by PLTP mass (males). The reference values for mean HDL size and volume using the DLS technique were established in an asymptomatic and representative Brazilian population sample, as well as their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and in males.
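A minimal illustration of how a mean particle volume follows from a mean hydrodynamic diameter when HDL particles are modeled as spheres; the diameter value below is hypothetical, not the reference value established in the study, and the spherical model is an assumption rather than the authors' stated method.

```python
import math

def sphere_volume_nm3(diameter_nm):
    """Volume of a sphere (nm^3) from its diameter (nm): V = pi * d^3 / 6."""
    return math.pi * diameter_nm ** 3 / 6.0

if __name__ == "__main__":
    d = 9.0                               # hypothetical mean HDL diameter, in nm
    print(f"diameter {d} nm -> volume {sphere_volume_nm3(d):.1f} nm^3")
```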

Relevance:

20.00%

Publisher:

Abstract:

To describe maternal and neonatal outcomes in pregnant women undergoing hemodialysis in a referral center in Southeast Brazil. This was a retrospective, descriptive study, with a chart review of all pregnancies on hemodialysis followed up at a high-risk prenatal outpatient clinic in Southeast Brazil. Among the 16 women identified, 2 were excluded due to loss to follow-up. Among the 14 women described, hypertension was the most frequent cause of chronic renal failure (half of the cases). The majority (71.4%) had been on hemodialysis for more than one year, and all of them underwent 5 to 6 hemodialysis sessions per week. Eleven participants had chronic hypertension, one of whom was also diabetic, and 6 of them were smokers. Regarding pregnancy complications, 1 of the hypertensive women developed malignant hypertension (with fetal growth restriction and preterm delivery at 29 weeks), 2 had acute pulmonary edema and 2 had placental abruption. The mode of delivery was cesarean section in 9 women (64.3%). All neonates had five-minute Apgar scores above 7. To improve perinatal and maternal outcomes in women undergoing hemodialysis, it is important to ensure a multidisciplinary approach in a referral center, strict control of serum urea, hemoglobin and maternal blood pressure, and close monitoring of fetal well-being and maternal morbidities. Another important strategy is suitable contraception guidance for these women.