945 results for DENSITY ANALYSIS


Relevance: 30.00%

Abstract:

The oxidation of lipids is important in many pathological conditions and lipid peroxidation products such as 4-hydroxynonenal (HNE) and other aldehydes are commonly measured as biomarkers of oxidative stress. However, it is often useful to complement this with analysis of the original oxidized phospholipid. Electrospray mass spectrometry (ESMS) provides an informative method for detecting oxidative alterations to phospholipids, and has been used to investigate oxidative damage to cells, and low-density lipoprotein, as well as for the analysis of oxidized phosphatidylcholines present in atherosclerotic plaque material. There is increasing evidence that intact oxidized phospholipids have biological effects; in particular, oxidation products of 1-palmitoyl-2-arachidonoyl-sn-glycerophosphocholine (PAPC) have been found to cause inflammatory responses, which could be potentially important in the progression of atherosclerosis. The effects of chlorohydrin derivatives of lipids have been much less studied, but it is clear that free fatty acid chlorohydrins and phosphatidylcholine chlorohydrins are toxic to cells at concentrations above 10 micromolar, a range comparable to that of HNE and oxidized PAPC. There is some evidence that chlorohydrins have biological effects that may be relevant to atherosclerosis, but further work is needed to elucidate their pro-inflammatory properties, and to understand the mechanisms and balance of biological effects that could result from oxidation of complex mixtures of lipids in a pathophysiological situation.

Relevance: 30.00%

Abstract:

We determine the critical noise level for decoding low-density parity check error-correcting codes based on the magnetization enumerator (M), rather than on the weight enumerator (W) employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between the different decoding schemes such as typical pairs decoding, MAP, and finite temperature decoding (MPM) becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived using the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.
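
The parity-check structure the analysis builds on can be made concrete with a toy example. A minimal sketch (the matrix and codeword below are invented for illustration, not drawn from the paper): a low-density parity-check code is defined by a sparse matrix H, and decoding amounts to correcting received words whose mod-2 syndrome is nonzero.

```python
# Toy low-density parity-check (LDPC) code: a sparse parity-check
# matrix H defines the code, and a received word is a valid codeword
# iff its syndrome H.r (mod 2) is zero. Illustrative example only.

H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
]

def syndrome(H, word):
    """Return the mod-2 syndrome of `word` under parity-check matrix H."""
    return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

codeword = [1, 0, 1, 1, 1, 0]          # satisfies all three parity checks
assert syndrome(H, codeword) == [0, 0, 0]

# A single bit flip (channel noise) produces a nonzero syndrome, which
# is what a decoder (MAP, typical pairs, MPM, ...) must correct.
noisy = codeword[:]
noisy[2] ^= 1
assert any(syndrome(H, noisy))
```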

Relevance: 30.00%

Abstract:

This study compared the molecular lipidomic profile of LDL in patients with nondiabetic advanced renal disease and no evidence of CVD to that of age-matched controls, with the hypothesis that it would reveal proatherogenic lipid alterations. LDL was isolated from 10 normocholesterolemic patients with stage 4/5 renal disease and 10 controls, and lipids were analyzed by accurate mass LC/MS. Top-down lipidomics analysis and manual examination of the data identified 352 lipid species, and automated comparative analysis demonstrated alterations in lipid profile in disease. The total lipid and cholesterol content was unchanged, but levels of triacylglycerides and N-acyltaurines were significantly increased, while phosphatidylcholines, plasmenyl ethanolamines, sulfatides, ceramides, and cholesterol sulfate were significantly decreased in chronic kidney disease (CKD) patients. Chemometric analysis of individual lipid species showed very good discrimination of control and disease samples despite the small cohorts and identified individual unsaturated phospholipids and triglycerides mainly responsible for the discrimination. These findings illustrate the point that although the clinical biochemistry parameters may not appear abnormal, there may be important underlying lipidomic changes that contribute to disease pathology. The lipidomic profile of CKD LDL offers potential for new biomarkers and novel insights into lipid metabolism and cardiovascular risk in this disease. - Reis, A., A. Rudnitskaya, P. Chariyavilaskul, N. Dhaun, V. Melville, J. Goddard, D. J. Webb, A. R. Pitt, and C. M. Spickett. Top-down lipidomics of low density lipoprotein reveal altered lipid profiles in advanced chronic kidney disease. J. Lipid Res. 2015.

Relevance: 30.00%

Abstract:

We report results of an experimental study, complemented by detailed statistical analysis of the experimental data, on the development of a more effective control method of drug delivery using a pH-sensitive acrylic polymer. A new copolymer based on acrylic acid and a fatty acid derived from dodecyl castor oil, and a tercopolymer based on methyl methacrylate, acrylic acid, and acrylamide, were prepared using this new approach. The water-swelling characteristics of the fatty acid-acrylic acid copolymer and the tercopolymer in acid and alkali solutions have been studied by a step-change method. The antibiotic drug cephalosporin and paracetamol have also been incorporated into the polymer blend through dissolution, with the release of the antibiotic drug being evaluated in bacterial strain media and buffer solution. Our results show that the rate of release of paracetamol is affected by the pH factor and also by the nature of the polymer blend. Our experimental data have later been statistically analyzed to quantify the precise dependence of the polymer decay rates on the pH of the relevant polymer solvents. The time evolution of the polymer decay rates indicates a marked transition from a linear to a strictly non-linear regime depending on whether the chosen sample is a general copolymer (linear) or a tercopolymer (non-linear). Non-linear data extrapolation techniques have been used to make probabilistic predictions about the variation in weight percentages of retained polymers at all future times, thereby quantifying the degree of efficacy of the new method of drug delivery.
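
The linear-regime decay fitting and extrapolation described above can be sketched as a log-linear least-squares fit. The data are synthetic and the first-order model is an assumption for illustration, not the study's actual kinetics:

```python
import math

# Hedged sketch: fitting a first-order decay W(t) = W0 * exp(-k * t)
# to synthetic retained-weight data by log-linear least squares, then
# extrapolating to a future time. The numbers below are invented for
# illustration; they are not measurements from the study.

t = [0.0, 1.0, 2.0, 3.0, 4.0]                  # time (illustrative units)
W = [100.0 * math.exp(-0.3 * ti) for ti in t]  # retained weight, %

# Least-squares line through (t, ln W): slope = -k, intercept = ln W0.
n = len(t)
mt = sum(t) / n
my = sum(math.log(w) for w in W) / n
slope = sum((ti - mt) * (math.log(wi) - my) for ti, wi in zip(t, W)) \
        / sum((ti - mt) ** 2 for ti in t)
k = -slope
W0 = math.exp(my - slope * mt)

def retained(time):
    """Extrapolated retained weight (%) at `time` under the fitted model."""
    return W0 * math.exp(-k * time)

print(round(k, 3), round(retained(10.0), 1))
```

A strictly non-linear regime (as reported for the tercopolymer) would require replacing the single-exponential model with a non-linear fit before extrapolating.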

Relevance: 30.00%

Abstract:

A set of 38 epitopes and 183 non-epitopes, which bind to alleles of the HLA-A3 supertype, was subjected to a combination of comparative molecular similarity indices analysis (CoMSIA) and soft independent modeling of class analogy (SIMCA). During the process of T cell recognition, T cell receptors (TCR) interact with the central section of the bound nonamer peptide; thus only positions 4−8 were considered in the study. The derived model distinguished 82% of the epitopes and 73% of the non-epitopes after cross-validation in five groups. The overall preference from the model is for polar amino acids with high electron density and the ability to form hydrogen bonds. These so-called “aggressive” amino acids are flanked by small-sized residues, which enable such residues to protrude from the binding cleft and take an active role in TCR-mediated T cell recognition. Combinations of “aggressive” and “passive” amino acids in the middle part of epitopes constitute a putative TCR binding motif.

Relevance: 30.00%

Abstract:

Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramer-Rao Bound (CRB) based analytical approach for two centralized multi-hop localization algorithms to get insights into the error performance and its sensitivity to the distance measurement error, anchor node density and placement. The location estimation performance is compared with four distributed multi-hop localization algorithms by simulation to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex tradeoff between the centralized and distributed localization algorithms on accuracy, complexity and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
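
The Cramer-Rao bound underlying the analytical approach can be illustrated for the simplest single-hop case. A hedged sketch (the anchor layout and noise level are invented, and this is plain range-based positioning rather than the paper's multi-hop algorithms):

```python
import math

# Hedged sketch: the Cramer-Rao bound for 2-D position estimation from
# range measurements to known anchors with i.i.d. Gaussian errors of
# standard deviation sigma. The Fisher information is
#   J = (1/sigma^2) * sum_i u_i u_i^T,
# where u_i is the unit vector from the target to anchor i, and the
# bound on total position-error variance is trace(J^-1). The anchor
# placement and sigma below are illustrative, not from the paper.

def crb_2d(target, anchors, sigma):
    x, y = target
    j11 = j12 = j22 = 0.0
    for ax, ay in anchors:
        d = math.hypot(ax - x, ay - y)
        ux, uy = (ax - x) / d, (ay - y) / d
        j11 += ux * ux
        j12 += ux * uy
        j22 += uy * uy
    j11 /= sigma ** 2; j12 /= sigma ** 2; j22 /= sigma ** 2
    det = j11 * j22 - j12 * j12
    return (j11 + j22) / det          # trace of the 2x2 inverse

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
print(crb_2d((5.0, 5.0), anchors, sigma=1.0))
```

Sweeping `sigma`, the anchor count, or the anchor placement in this sketch shows the same sensitivities the analytical approach is designed to expose.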

Relevance: 30.00%

Abstract:

Background: Electrosurgery units are widely employed in modern surgery. Advances in technology have enhanced the safety of these devices; nevertheless, accidental burns are still regularly reported. This study focuses on possible causes of sacral burns as a complication of the use of electrosurgery. Burns are caused by local densifications of the current, but the actual pathway of the current within the patient's body is unknown. Numerical electromagnetic analysis can help in understanding the issue. Methods: To this aim, an accurate heterogeneous model of the human body (including seventy-seven different tissues), electrosurgery electrodes, operating table, and mattress was built to resemble a typical surgery condition. The patient lies supine on the mattress with the active electrode placed on the thorax and the return electrode on the back. Common operating frequencies of electrosurgery units were considered. Finite-Difference Time-Domain electromagnetic analysis was carried out to compute the spatial distribution of current density within the patient's body. A differential analysis, changing the electrical properties of the operating table from a conductor to an insulator, was also performed. Results: Results revealed that distributed capacitive coupling between the patient's body and the conductive operating table offers an alternative path for the electrosurgery current. The patient's anatomy, the positioning, and the different electromagnetic properties of tissues promote a densification of the current at the head and sacral region. In particular, high values of current density were located behind the sacral bone and beneath the skin. This did not occur in the case of a non-conductive operating table. Conclusion: Results of the simulation highlight the role played by capacitive coupling between the return electrode and the conductive operating table.
The concentration of current density may result in an undesired rise in temperature, causing burns in body regions far from the electrodes. This outcome is concordant with the type of surgery-related sacral burns reported in the literature. Such burns cannot be immediately detected after surgery, but appear later and can be confused with bedsores. In addition, the dosimetric analysis suggests that reducing the capacitive coupling between the return electrode and the operating table can decrease or avoid this problem. © 2013 Bifulco et al.; licensee BioMed Central Ltd.
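
Why a capacitive path that is negligible at mains frequency becomes conductive at electrosurgery frequencies follows from the impedance of a capacitor. A back-of-envelope sketch (the stray capacitance value is assumed for illustration, not taken from the simulation):

```python
import math

# Back-of-envelope illustration (values assumed, not from the study):
# the impedance magnitude of a stray capacitance C is
#   |Z| = 1 / (2 * pi * f * C),
# so a patient-to-table coupling of ~200 pF that blocks 50 Hz mains
# current becomes a low-impedance return path at electrosurgery
# frequencies of hundreds of kHz.

def cap_impedance(f_hz, c_farad):
    """Impedance magnitude (ohm) of capacitance c_farad at frequency f_hz."""
    return 1.0 / (2.0 * math.pi * f_hz * c_farad)

C = 200e-12                            # assumed stray capacitance, farad
print(cap_impedance(50.0, C))          # tens of megohms at mains frequency
print(cap_impedance(500e3, C))         # low kilohms at 500 kHz
```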

Relevance: 30.00%

Abstract:

Петър Господинов, Добри Данков, Владимир Русинов, Стефан Стефанов - The cylindrical Couette flow of a rarefied gas is studied for the case of two coaxial cylinders rotating with speeds equal in magnitude but opposite in direction. The aim of the study is to establish the influence of small rotation speeds on the macro-characteristics (ρ, V, ...). The numerical results were obtained using DSMC and a numerical solution of the Navier-Stokes equations for relatively small (subsonic) rotation speeds. Good agreement between the results of the two methods was established for Kn = 0.02. A "stationary" point was found to exist for the density and the velocity. The results obtained are important for solving non-planar microfluidics problems that account for curvature effects. Keywords: Fluid mechanics, Kinetic theory, Rarefied gas, DSMC

Relevance: 30.00%

Abstract:

2000 Mathematics Subject Classification: 35Q02, 35Q05, 35Q10, 35B40.

Relevance: 30.00%

Abstract:

A highly sensitive liquid level monitoring system based on microstructured polymer optical fiber Bragg grating (mPOFBG) array sensors is reported for the first time. The configuration is based on five mPOFBGs inscribed in the same fiber in the 850 nm spectral region, showing the potential to interrogate liquid level by measuring the strain induced in each mPOFBG embedded in a silicone rubber (SR) diaphragm, which deforms due to hydrostatic pressure variations. The sensor exhibits a highly linear response over the sensing range, good repeatability, and high resolution. The sensitivity of the sensor is found to be 98 pm/cm of water, enhanced by more than a factor of 9 when compared to an equivalent sensor based on a silica fiber around 1550 nm. The temperature sensitivity is studied and a multi-sensor arrangement is proposed, which has the potential to provide level readings independent of temperature and the liquid density.
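
The reported 98 pm/cm sensitivity implies a simple linear readout. A minimal sketch of that arithmetic (the differential compensation below is a generic scheme, assumed here as one way to realize the proposed temperature-independent arrangement):

```python
# Sketch of the readout arithmetic implied by the reported sensitivity:
# with 98 pm of Bragg-wavelength shift per cm of water, the level
# follows from the measured shift by a linear calibration.

SENSITIVITY_PM_PER_CM = 98.0      # from the reported sensor response

def level_cm(shift_pm):
    """Water level (cm) from Bragg-wavelength shift (pm), linear model."""
    return shift_pm / SENSITIVITY_PM_PER_CM

def level_cm_compensated(shift_pm, ref_shift_pm):
    """Subtract the shift of a reference (pressure-isolated) grating to
    remove the common temperature response - an assumed realization of
    the multi-sensor arrangement, not the paper's exact scheme."""
    return (shift_pm - ref_shift_pm) / SENSITIVITY_PM_PER_CM

print(level_cm(490.0))                    # 490 pm shift -> 5.0 cm of water
print(level_cm_compensated(588.0, 98.0))  # same level after removing drift
```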

Relevance: 30.00%

Abstract:

To determine the factors influencing the distribution of β-amyloid (Aβ) deposits in Alzheimer's disease (AD), the spatial patterns of the diffuse, primitive, and classic Aβ deposits were studied from the superior temporal gyrus (STG) to sector CA4 of the hippocampus in six sporadic cases of the disease. In cortical gyri and in the CA sectors of the hippocampus, the Aβ deposits were distributed either in clusters 200-6400 μm in diameter that were regularly distributed parallel to the tissue boundary or in larger clusters greater than 6400 μm in diameter. In some regions, smaller clusters of Aβ deposits were aggregated into larger 'superclusters'. In many cortical gyri, the density of Aβ deposits was positively correlated with distance below the gyral crest. In the majority of regions, clusters of the diffuse, primitive, and classic deposits were not spatially correlated with each other. In two cases, double immunolabelled to reveal the Aβ deposits and blood vessels, the classic Aβ deposits were clustered around the larger diameter vessels. These results suggest a complex pattern of Aβ deposition in the temporal lobe in sporadic AD. A regular distribution of Aβ deposit clusters may reflect the degeneration of specific cortico-cortical and cortico-hippocampal pathways and the influence of the cerebral blood vessels. Large-scale clustering may reflect the aggregation of deposits in the depths of the sulci and the coalescence of smaller clusters.

Relevance: 30.00%

Abstract:

This work is the first using patterned soft underlayers in multilevel three-dimensional vertical magnetic data storage systems. The motivation stems from an exponentially growing information stockpile, and a corresponding need for more efficient storage devices with higher density. The world information stockpile currently exceeds 150 EB (ExaByte = 1×10^18 bytes), most of which is in analog form. Among the storage technologies (semiconductor, optical, and magnetic), magnetic hard disk drives are poised to occupy a big role in personal, network, as well as corporate storage. However, this mode suffers from a limit known as the superparamagnetic limit, which limits achievable areal density due to fundamental quantum mechanical stability requirements. There are many viable techniques considered to defer superparamagnetism into the 100s of Gbit/in², such as patterned media, Heat-Assisted Magnetic Recording (HAMR), Self-Organized Magnetic Arrays (SOMA), antiferromagnetically coupled structures (AFC), and perpendicular magnetic recording. Nonetheless, these techniques utilize a single magnetic layer, and can thus be viewed as two-dimensional in nature. In this work a novel three-dimensional vertical magnetic recording approach is proposed. This approach utilizes the entire thickness of a magnetic multilayer structure to store information, with potential areal density well into the Tbit/in² regime. There are several possible implementations for 3D magnetic recording, each presenting its own set of requirements, merits, and challenges. The issues and considerations pertaining to the development of such systems will be examined and analyzed using empirical and numerical analysis techniques.
Two novel key approaches are proposed and developed: (1) a patterned soft underlayer (SUL), which allows for enhanced recording on thicker media; (2) a combinatorial approach to 3D media development that facilitates concurrent investigation of the effect of various film parameters on a predefined performance metric. A case study is presented using combinatorial overcoats of tantalum and zirconium oxides for corrosion protection in magnetic media. The feasibility of 3D recording is demonstrated, and 3D media development is identified as a key prerequisite. The patterned SUL shows significant enhancement over a conventional un-patterned SUL, and shows that geometry can be used as a design tool to achieve a favorable field distribution wherever magnetic storage and magnetic phenomena are involved.
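
The superparamagnetic limit invoked above can be quantified with the standard thermal-stability ratio. A hedged sketch (the anisotropy constant, grain geometry, and the commonly cited 40-60 retention criterion are typical textbook figures, not values from this work):

```python
import math

# Hedged sketch of the superparamagnetic limit: a magnetic grain of
# volume V and anisotropy energy density Ku is thermally stable only
# if the energy barrier Ku*V is large relative to kB*T; a ratio of
# roughly 40-60 is the commonly cited long-term retention criterion.
# The material numbers below are illustrative, not from this work.

KB = 1.380649e-23                      # Boltzmann constant, J/K

def stability_ratio(ku_j_per_m3, diameter_m, height_m, temp_k=300.0):
    """Energy-barrier-to-thermal-energy ratio Ku*V / (kB*T) for a
    cylindrical grain of the given diameter and height."""
    volume = math.pi * (diameter_m / 2.0) ** 2 * height_m
    return ku_j_per_m3 * volume / (KB * temp_k)

# Shrinking the grain diameter from 10 nm to 6 nm (to raise areal
# density) scales the barrier by (6/10)^2 = 0.36, pushing the grain
# toward superparamagnetic instability:
print(stability_ratio(2e5, 10e-9, 12e-9))
print(stability_ratio(2e5, 6e-9, 12e-9))
```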

Relevance: 30.00%

Abstract:

Currently the data storage industry is facing huge challenges with respect to the conventional method of recording data, known as longitudinal magnetic recording. This technology is fast approaching a fundamental physical limit, known as the superparamagnetic limit. A unique way of deferring the superparamagnetic limit incorporates the patterning of magnetic media. This method exploits the use of lithography tools to predetermine the areal density. Nanofabrication schemes employed to pattern the magnetic material include Focused Ion Beam (FIB), E-beam Lithography (EBL), UV-Optical Lithography (UVL), Self-assembled Media Synthesis, and Nanoimprint Lithography (NIL). Although there are many challenges to manufacturing patterned media, the large potential gains offered in terms of areal density make it one of the most promising new technologies on the horizon for future hard disk drives. Thus, this dissertation contributes to the development of future alternative data storage devices, deferring the superparamagnetic limit by designing and characterizing patterned magnetic media using a novel nanoimprint replication process called "Step and Flash Imprint Lithography" (SFIL). As opposed to hot embossing and other high-temperature, low-pressure processes, SFIL can be performed at low pressure and room temperature. Initial experiments consisted of process-flow design for the patterned structures on sputtered Ni-Fe thin films, the main one being a defectivity analysis of the SFIL process, conducted by fabricating devices of varying feature sizes (50 nm to 1 μm), inspecting them optically, and testing them electrically. Once the SFIL process was optimized, a number of Ni-Fe coated wafers were imprinted with a template having the patterned topography. A minimum feature size of 40 nm was obtained with varying pitch (1:1, 1:1.5, 1:2, and 1:3).
The characterization steps involved extensive SEM study at each processing step, as well as Atomic Force Microscopy (AFM) and Magnetic Force Microscopy (MFM) analysis.

Relevance: 30.00%

Abstract:

This study examined how the themes of environmental sustainability are evident in the national, state, and local standards that guide K-12 science curriculum. The study applied the principles of content analysis within the framework of an ecological paradigm. In education, an ecological paradigm focuses on students' use of a holistic lens to view and understand material. The intent of this study was to analyze the seventh grade science content standards at the national, state, and local textbook levels to determine how, and the extent to which, each of the five themes of environmental sustainability is presented in the language of each text. The themes are: (a) Climate Change Indicators, (b) Biodiversity, (c) Human Population Density, (d) Impact and Presence of Environmental Pollution, and (e) Earth as a Closed System. The research study offers practical insight on using a method of content analysis to locate keywords of environmental sustainability in the three texts and determine whether the context of each term relates to this ecological paradigm. Using a concordance program, the researcher identified the frequency and context of each vocabulary item associated with these themes. Nine chi-square tests were run to determine if there were differences in content between the national and state standards and the textbook. Within each level, chi-square tests were also run to determine if there were differences between the appearance of content-knowledge and skill words. Results indicate a lack of agreement between levels that is significant at p < .01. A discussion of these results in relation to curriculum development and standardized assessments follows. The study found that at the national and state levels, there is a lack of articulation of the goals of environmental sustainability or an ecological paradigm.
With respect to the science textbook, a greater number of keywords were present; however, the context of many of these keywords did not align with the discourse of an ecological paradigm. Further, the environmental sustainability themes present in the textbook were limited to the last four chapters of the text. Additional research is recommended to determine whether this situation also exists in other settings.
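
The chi-square comparisons described above can be sketched with a toy contingency table. The counts below are invented for illustration; only the test mechanics mirror the study:

```python
# Hedged sketch of the study's statistical step: a chi-square test of
# independence on keyword counts. The 2x2 table (content-knowledge vs.
# skill words at two document levels) is invented for illustration;
# with 1 degree of freedom, chi2 > 6.635 corresponds to p < .01.

def chi_square(table):
    """Chi-square statistic for an r x c table of observed counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / grand
            chi2 += (obs - expected) ** 2 / expected
    return chi2

observed = [[90, 30],    # e.g. national standards: content vs. skill words
            [45, 60]]    # e.g. state standards (counts are illustrative)
stat = chi_square(observed)
print(round(stat, 2), stat > 6.635)   # True means significant at p < .01
```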

Relevance: 30.00%

Abstract:

Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology.
The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to be used to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.
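
The linear data-worth idea in the first analysis can be reduced to a scalar caricature. A hedged sketch (one lumped parameter with invented sensitivities and variances; the actual study used a full variable-density flow and transport model):

```python
# Hedged sketch of a linear data-worth calculation: for a Gaussian
# prior on a parameter and a linear observation y = g * p + noise,
# the posterior parameter variance is
#   1 / (1/var_prior + g^2 / var_obs),
# so a candidate measurement can be scored by how much it shrinks the
# variance of a prediction that depends on the parameter. All numbers
# below are illustrative, not model values from the study.

def posterior_var(var_prior, g, var_obs):
    """Posterior parameter variance after one linear observation."""
    return 1.0 / (1.0 / var_prior + g * g / var_obs)

var_prior = 4.0
pred_sens = 2.0                        # prediction sensitivity to parameter

def prediction_var(var_param):
    return pred_sens ** 2 * var_param

base = prediction_var(var_prior)
worth_conc = base - prediction_var(posterior_var(var_prior, g=1.5, var_obs=1.0))
worth_temp = base - prediction_var(posterior_var(var_prior, g=0.4, var_obs=1.0))

# The data type to which the model is more sensitive (here the
# 'concentration-like' one) buys a larger reduction in predictive
# uncertainty, mirroring the study's ranking of data worth:
print(worth_conc, worth_temp)
```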