992 results for Aggregation methods


Relevance: 30.00%

Abstract:

AIMS: The relationship between the activity of eosinophils and platelets has been observed over recent decades by many scientists. These observations include increased numbers of eosinophils associated with platelet disorders, including changes in the coagulation cascade and platelet aggregation. Based on these observations, the interaction between eosinophils and platelets in platelet aggregation was analyzed. MAIN METHODS: Human platelets were incubated with eosinophil cytosolic fraction, the promyelocytic human HL-60 clone 15 cell lineage, and eosinophil cationic protein (ECP). Platelet-rich plasma (PRP) aggregation was induced by adenosine diphosphate, platelet activating factor, arachidonic acid, and collagen, and washed platelets (WP) were activated by thrombin. KEY FINDINGS: Aggregation induced by all agonists was dose-dependently inhibited by the eosinophil cytosolic fraction. This inhibition was only partially reversed by prior incubation of the eosinophils with L-nitro-arginine methyl ester (L-NAME). Prior incubation with indomethacin did not prevent the cytosolic fraction-induced inhibition. Separation of the eosinophil cytosolic fraction by gel filtration on Sephadex G-75 showed that the inhibitory activity was concentrated in the lower-molecular-weight fraction. HL-60 clone 15 cells differentiated into eosinophils for 5 and 7 days were able to inhibit platelet aggregation. ECP inhibited platelet aggregation in both PRP and WP. This inhibition was more evident in WP, and the MTT cytotoxicity assay confirmed the viability of the tested platelets, showing that the inhibition observed with ECP is not simply due to cell death. SIGNIFICANCE: Our results indicate that eosinophils play a fundamental role in the inhibition of platelet aggregation.

Relevance: 30.00%

Abstract:

Nano(bio)science and nano(bio)technology attract tremendous and growing interest in both academia and industry. They are undergoing rapid development on many fronts, such as genomics, proteomics, systems biology, and medical applications. However, the lack of characterization tools for nano(bio)systems is currently considered a major factor limiting the final establishment of nano(bio)technologies. Flow Field-Flow Fractionation (FlFFF) is a separation technique that is definitely emerging in the bioanalytical field, and the number of applications to nano(bio)analytes such as high-molar-mass proteins and protein complexes, sub-cellular units, viruses, and functionalized nanoparticles is constantly increasing. This can be ascribed to the intrinsic advantages of FlFFF for the separation of nano(bio)analytes. FlFFF is ideally suited to separating particles over a broad size range (1 nm-1 μm) according to their hydrodynamic radius (rh). The fractionation is carried out in an empty channel by a flow stream of a mobile phase of any composition. For these reasons, fractionation proceeds without surface interaction of the analyte with packing or gel media, and there is no stationary phase that could induce mechanical or shear stress on nanosized analytes, which are therefore kept in their native state. Characterization of nano(bio)analytes is made possible after fractionation by interfacing the FlFFF system with detection techniques for morphological, optical or mass characterization. For instance, coupling FlFFF with multi-angle light scattering (MALS) detection allows absolute molecular weight and size determination, and mass spectrometry has brought FlFFF into the field of proteomics. The potential of FlFFF couplings with multi-detection systems is discussed in the first section of this dissertation.
The second and third sections are dedicated to new methods that have been developed for the analysis and characterization of different samples of interest in the fields of diagnostics, pharmaceutics, and nanomedicine. The second section focuses on biological samples such as protein complexes and protein aggregates. In particular, it focuses on FlFFF methods developed to give new insights into: a) the chemical composition and morphological features of blood serum lipoprotein classes, b) the time-dependent aggregation pattern of the amyloid protein Aβ1-42, and c) the aggregation state of antibody therapeutics in their formulation buffers. The third section is dedicated to the analysis and characterization of structured nanoparticles designed for nanomedicine applications. The discussed results indicate that FlFFF with on-line MALS and fluorescence detection (FD) may become an unparalleled methodology for the analysis and characterization of new, structured, fluorescent nanomaterials.
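The hydrodynamic radius that FlFFF sorts by is conventionally obtained from the measured diffusion coefficient through the Stokes-Einstein relation. A minimal sketch, not taken from the dissertation (the numeric diffusion coefficient below is an invented, protein-like value):

```python
import math

def hydrodynamic_radius(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein: r_h = k_B * T / (6 * pi * eta * D).

    D   -- translational diffusion coefficient in m^2/s
    T   -- absolute temperature in K
    eta -- dynamic viscosity of the carrier in Pa*s (water at 25 C)
    """
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (6 * math.pi * eta * D)

# An analyte with D ~ 6e-11 m^2/s has an r_h of about 4 nm,
# well inside the ~1 nm - 1 um range accessible to FlFFF.
r = hydrodynamic_radius(6e-11)
print(f"r_h = {r * 1e9:.2f} nm")
```

In FFF practice the diffusion coefficient itself is inferred from retention time, so this conversion is the last step of size determination.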

Relevance: 30.00%

Abstract:

This thesis work involves various aspects of crystal engineering. Chapter 1 focuses on crystals containing crown ether complexes. Aspects such as the possibility of preparing these materials by non-solution methods, i.e. by direct reaction of the solid components, their thermal behavior, and isomorphism and interconversion between hydrates are taken into account. In chapter 2 a study is presented aimed at understanding the relationship between the hydrogen bonding capability and the shape of the building blocks chosen to construct crystals. The focus is on the control exerted by shape on the organization of sandwich cations such as cobalticinium, decamethylcobalticinium and bisbenzenechromium(I), and on the aggregation of monoanions, all containing carboxylic and carboxylate groups, into 0-D, 1-D, 2-D and 3-D networks. Reactions conducted in multi-component molecular assemblies, or co-crystals, have been recognized as a way to control reactivity in the solid state. The [2+2] photodimerization of olefins is a successful demonstration of how templated solid-state synthesis can efficiently produce unique materials with remarkable stereoselectivity under environment-friendly conditions. A demonstration of this synthetic strategy is given in chapter 3. The combination of various types of intermolecular linkages, which can lead either to highly ordered aggregation into crystalline materials or to random aggregation into an amorphous precipitate, may not go to completion. In such rare cases an aggregation process intermediate between crystalline and amorphous is observed, resulting in the formation of a gel, i.e. a viscoelastic solid-like or liquid-like material. In chapter 4 the design of new Low Molecular Weight Gelators is presented. Aspects such as the relationships between molecular structure, crystal packing and gelation properties, and the application of this kind of gel as a medium for the crystal growth of organic molecules, such as APIs, are also discussed.

Relevance: 30.00%

Abstract:

Recent advances in the fast-growing area of therapeutic/diagnostic proteins and antibodies - novel and highly specific drugs - as well as progress in the field of functional proteomics regarding the correlation between the aggregation of damaged proteins and (immuno)senescence or aging-related pathologies, underline the need for adequate analytical methods for the detection, separation, characterization and quantification of protein aggregates, regardless of their origin or formation mechanism. Hollow fiber flow field-flow fractionation (HF5), the miniaturized version of FlowFFF and an integral part of the Eclipse DUALTEC FFF separation system, was the focus of this research; this flow-based separation technique proved to be uniquely suited to the hydrodynamic size-based separation of proteins and protein aggregates over a very broad size and molecular weight (MW) range, even when they are present at trace levels. HF5 was shown to be (a) highly selective in terms of protein diffusion coefficients, (b) versatile in terms of the choice of bio-compatible carrier solution, (c) able to preserve the biophysical properties/molecular conformation of the proteins/protein aggregates and (d) able to discriminate between different types of protein aggregates. Thanks to the advantages of miniaturization and the online coupling with highly sensitive detection techniques (UV/Vis, intrinsic fluorescence and multi-angle light scattering), HF5 achieved very low detection/quantification limits for protein aggregates. Compared to size-exclusion chromatography (SEC), HF5 demonstrated superior selectivity and potential as an orthogonal analytical method in the extended characterization assays often required for therapeutic protein formulations. In addition, the developed HF5 methods proved to be rapid, highly selective, sensitive and repeatable.
HF5 proved ideally suited as the first dimension of separation of aging-related protein aggregates from whole cell lysates (a proteome pre-fractionation method) and, through HF5-(UV)-MALS online coupling, important biophysical information on the fractionated proteins and protein aggregates was gathered: size (rms radius and hydrodynamic radius), absolute MW and conformation.

Relevance: 30.00%

Abstract:

Decomposition-based approaches are recalled from both primal and dual points of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required per iteration. This trade-off is explored for several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. In order to provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed. Here, an undirected graph with edge costs is given together with a discrete set of balance matrices, representing different supply/demand scenarios. The goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented.
The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
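The Frank-Wolfe iteration mentioned above can be sketched generically: at each step a linearized subproblem is solved (in route assignment this is an all-or-nothing shortest-path assignment) and the current point moves toward its solution. A minimal sketch, not the thesis's implementation, assuming a generic linear-minimization oracle and the classic 2/(k+2) step size; the toy objective and feasible set below are invented for illustration:

```python
import numpy as np

def frank_wolfe(grad, lin_oracle, x0, iters=500):
    """Generic Frank-Wolfe loop over a convex feasible set.

    grad       -- gradient of the convex objective at x
    lin_oracle -- solves the linearized subproblem: argmin_s <g, s>
                  over the feasible set (in traffic assignment, an
                  all-or-nothing shortest-path load)
    x0         -- feasible starting point (e.g. a warm start from a
                  linear min-cost flow solution, as described above)
    """
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        s = lin_oracle(grad(x))     # extreme point of the feasible set
        step = 2.0 / (k + 2.0)      # classic diminishing step size
        x = (1 - step) * x + step * s
    return x

# Toy example: minimize ||x - c||^2 over the probability simplex,
# whose extreme points are the unit vectors.
c = np.array([0.7, 0.2, 0.1])
x = frank_wolfe(lambda x: 2 * (x - c),
                lambda g: np.eye(3)[np.argmin(g)],
                np.ones(3) / 3)
```

Since every iterate is a convex combination of feasible points, feasibility is preserved automatically, which is why the method suits flow polytopes so well.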

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVE: To investigate whether preemptively administered lornoxicam changes perioperative platelet function during thoracic surgery. METHODS: A total of 20 patients scheduled for elective thoracic surgery were randomly assigned to receive either lornoxicam (16 mg, i.v.; n = 10) or placebo (n = 10) preoperatively. All patients underwent treatment of a solitary lung metastasis and denied any antiplatelet medication within the past 2 weeks. Blood samples were drawn via an arterial catheter directly into silicone-coated Vacutainer tubes containing 0.5 mL of 0.129 M buffered sodium citrate 3.8% before, and 15 min, 4 h and 8 h after, the study medication was administered. Platelet aggregation curves were obtained by whole blood electrical impedance aggregometry (Chrono-Log). RESULTS: Platelet aggregation was significantly reduced 15 min, 4 h and 8 h after lornoxicam administration compared to placebo (P < 0.05) with collagen, adenosine diphosphate and arachidonic acid as trigger substances. Adenosine diphosphate-induced platelet aggregation decreased by 85% 15 min after lornoxicam administration, and remained impaired for 8 h. CONCLUSION: Platelet aggregation is impaired for at least 8 h after lornoxicam administration. Therefore, perioperative analgesia with lornoxicam should be administered with caution, taking the subsequent platelet dysfunction into consideration.

Relevance: 30.00%

Abstract:

Alzheimer's disease (AD) is characterized by the cerebral accumulation of misfolded and aggregated amyloid-beta protein (Abeta). Disease symptoms can be alleviated, in vitro and in vivo, by 'beta-sheet breaker' pentapeptides that reduce plaque load. However, the peptide nature of these compounds makes them biologically unstable and unable to penetrate membranes with high efficiency. The main goal of this study was to use computational methods to identify small-molecule mimetics with better drug-like properties. For this purpose, the docked conformations of the active peptides were used to identify compounds with similar activities. A series of related beta-sheet breaker peptides were docked to solid-state NMR structures of a fibrillar form of Abeta. The lowest-energy conformations of the active peptides were used to design three-dimensional (3D) pharmacophores, suitable for screening the NCI database with Unity. Small-molecular-weight compounds with physicochemical features and a conformation similar to the active peptides were selected and ranked by docking and biochemical parameters. Of 16 diverse compounds selected for experimental screening, 2 prevented and reversed Abeta aggregation at 2-3 μM concentration, as measured by Thioflavin T (ThT) fluorescence and ELISA assays. They also prevented the toxic effects of aggregated Abeta on neuroblastoma cells. Their low molecular weight and aqueous solubility make them promising lead compounds for treating AD.

Relevance: 30.00%

Abstract:

RATIONALE Platelets are known to play a crucial role in hemostasis. Sphingosine kinases (Sphk) 1 and 2 catalyze the conversion of sphingosine to the bioactive metabolite sphingosine 1-phosphate (S1P). Although platelets are able to secrete S1P on activation, little is known about a potential intrinsic effect of S1P on platelet function. OBJECTIVE To investigate the role of Sphk1- and Sphk2-derived S1P in the regulation of platelet function. METHODS AND RESULTS We found a 100-fold reduction in intracellular S1P levels in platelets derived from Sphk2(-/-) mutants compared with Sphk1(-/-) or wild-type mice, as analyzed by mass spectrometry. Sphk2(-/-) platelets also failed to secrete S1P on stimulation. Blood from Sphk2-deficient mice showed decreased aggregation after protease-activated receptor 4-peptide and adenosine diphosphate stimulation in vitro, as assessed by whole blood impedance aggregometry. We revealed that S1P controls platelet aggregation via the sphingosine 1-phosphate receptor 1 through modulation of protease-activated receptor 4-peptide and adenosine diphosphate-induced platelet activation. Finally, we show by intravital microscopy that defective platelet aggregation in Sphk2-deficient mice translates into reduced arterial thrombus stability in vivo. CONCLUSIONS We demonstrate that Sphk2 is the major Sphk isoform responsible for the generation of S1P in platelets and plays a pivotal intrinsic role in the control of platelet activation. Correspondingly, Sphk2-deficient mice are protected from arterial thrombosis after vascular injury, but have normal bleeding times. Targeting this pathway could therefore present a new therapeutic strategy to prevent thrombosis.

Relevance: 30.00%

Abstract:

Governance of food systems is a poorly understood determinant of food security. Much scholarship on food systems governance is non-empirical, while existing research is often case study-based and theoretically and methodologically incommensurable. This frustrates aggregation of evidence and generalisation. We undertook a systematic review of methods used in food systems governance research with a view to identifying a core set of indicators for future research. We gathered literature through a structured consultation and sampling from recent reviews. Indicators were identified and classified according to the levels and sectors they investigate. We found a concentration of indicators in food production at local to national levels and a sparseness in distribution and consumption. Unsurprisingly, many indicators of institutional structure were found, while agency-related indicators are moderately represented. We call for piloting and validation of these indicators and for methodological development to fill gaps identified. These efforts are expected to support a more consolidated future evidence base and eventual meta-analysis.

Relevance: 30.00%

Abstract:

Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering (SE) community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and the effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and the number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable at other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
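Two of the aggregation methods compared above can be illustrated with a minimal fixed-effect pooling sketch. The study data are invented, and the response ratio is pooled unweighted here for simplicity, so this is only a caricature of the simulated procedures:

```python
import math

def weighted_mean_difference(studies):
    """Pool raw mean differences with inverse-variance weights (fixed effect).

    studies -- list of (mean_e, mean_c, var_e, var_c, n_e, n_c) tuples
               for experimental and control groups.
    """
    num = den = 0.0
    for me, mc, ve, vc, ne, nc in studies:
        var_d = ve / ne + vc / nc   # variance of the difference of means
        w = 1.0 / var_d             # inverse-variance weight
        num += w * (me - mc)
        den += w
    return num / den

def pooled_response_ratio(studies):
    """Parametric response ratio: average ln(mean_e / mean_c), exponentiate."""
    logs = [math.log(me / mc) for me, mc, *_ in studies]
    return math.exp(sum(logs) / len(logs))

# Two invented studies: (mean_e, mean_c, var_e, var_c, n_e, n_c)
studies = [
    (12.0, 10.0, 4.0, 4.0, 30, 30),
    (11.0, 10.0, 9.0, 9.0, 20, 20),
]
print(weighted_mean_difference(studies))  # pooled raw difference, ~1.77
print(pooled_response_ratio(studies))     # pooled ratio, ~1.15
```

Note how WMD needs the variances while the response ratio needs only the group means, which is exactly the trade-off the abstract points to when variances go unreported.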

Relevance: 30.00%

Abstract:

New trends in biometrics are oriented toward mobile devices in order to increase overall security in daily actions like bank account access, e-commerce or even document protection within the mobile device. However, applying biometrics to mobile devices implies challenges in biometric data acquisition, feature extraction and private data storage. Concretely, this paper deals with the problem of hand segmentation given a picture of the hand against an unknown background, requiring an accurate result in terms of hand isolation. For the sake of user acceptability, no restrictions are placed on the background, and therefore hand images can be taken without any constraint, making segmentation a demanding task. Multiscale aggregation strategies are proposed to solve this problem, owing to their accurate results in unconstrained and complicated scenarios, together with their time-performance properties. The method is evaluated on a public synthetic database with 480,000 images covering different backgrounds and illumination environments. The results obtained in terms of accuracy and time performance highlight its capability as a suitable solution for the problem of hand segmentation in contact-less environments, outperforming competitive methods in the literature such as Lossy Data Compression (LDC) image segmentation.

Relevance: 30.00%

Abstract:

This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out using a publicly available synthetic database with 408,000 hand images in different backgrounds, comparing the performance in terms of accuracy and computational cost to two competitive segmentation methods in the literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms current competitive segmentation methods with regard to computational cost, time performance, accuracy and memory usage.
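The flavor of a Gaussian multiscale approach can be conveyed with a toy sketch: build a Gaussian pyramid, make a segmentation decision at the coarsest level, and propagate it back to full resolution. This is a much-simplified stand-in for the paper's aggregation algorithm; the 1-2-1 blur kernel, the mean-threshold rule and the nearest-neighbour upsampling are all illustrative choices:

```python
import numpy as np

def blur_downsample(img):
    """One Gaussian-pyramid level: separable 1-2-1 blur, then 2x decimation."""
    k = np.array([0.25, 0.5, 0.25])
    pad = np.pad(img, 1, mode="edge")
    rows = k[0] * pad[:-2, 1:-1] + k[1] * pad[1:-1, 1:-1] + k[2] * pad[2:, 1:-1]
    pad = np.pad(rows, ((0, 0), (1, 1)), mode="edge")
    cols = k[0] * pad[:, :-2] + k[1] * pad[:, 1:-1] + k[2] * pad[:, 2:]
    return cols[::2, ::2]

def multiscale_segment(img, levels=3):
    """Threshold at the coarsest scale, then propagate the mask back up."""
    pyramid = [img]
    for _ in range(levels):
        pyramid.append(blur_downsample(pyramid[-1]))
    coarse = pyramid[-1]
    mask = coarse > coarse.mean()           # crude foreground/background split
    for lvl in reversed(pyramid[:-1]):      # nearest-neighbour upsampling
        mask = np.repeat(np.repeat(mask, 2, axis=0), 2, axis=1)
        mask = mask[:lvl.shape[0], :lvl.shape[1]]
    return mask

# Synthetic test image: a bright "hand" square on a dark background.
img = np.full((32, 32), 0.1)
img[8:24, 8:24] = 0.9
mask = multiscale_segment(img)
```

Deciding at a coarse scale suppresses fine background texture, which is the intuition behind aggregating evidence across scales rather than thresholding the full-resolution image directly.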

Relevance: 30.00%

Abstract:

Background: One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. Results: We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as “which particular data was input to a particular workflow to test a particular hypothesis?”, and “which particular conclusions were drawn from a particular workflow?”. Conclusions: Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment, allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well.
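The idea of a Research Object that aggregates resources and answers provenance-style questions can be caricatured with a tiny in-memory triple store. The actual RO model is RDF-based and uses dedicated vocabularies, so the identifiers, predicate names and API below are purely illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    uri: str
    kind: str          # e.g. "dataset", "workflow", "conclusion"

@dataclass
class ResearchObject:
    """A resource that aggregates other resources plus annotations on them."""
    resources: list = field(default_factory=list)
    annotations: list = field(default_factory=list)  # (subject, predicate, object)

    def query(self, predicate, obj):
        """All subjects linked to `obj` by `predicate`."""
        return [s for s, p, o in self.annotations if p == predicate and o == obj]

ro = ResearchObject()
ro.resources += [
    Resource("data/metabolites.csv", "dataset"),
    Resource("wf/variation-analysis", "workflow"),
    Resource("conclusions/c1", "conclusion"),
]
ro.annotations += [
    ("data/metabolites.csv", "inputTo", "wf/variation-analysis"),
    ("conclusions/c1", "drawnFrom", "wf/variation-analysis"),
]

# "Which particular data was input to this workflow?"
print(ro.query("inputTo", "wf/variation-analysis"))
# "Which conclusions were drawn from it?"
print(ro.query("drawnFrom", "wf/variation-analysis"))
```

The point of the sketch is only that structured aggregation plus annotation makes such questions mechanically answerable, which is what the RO model provides at full scale with semantic web tooling.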

Relevance: 30.00%

Abstract:

We quantitatively analyzed, using laser scanning confocal microscopy, the three-dimensional structure of individual senile plaques in Alzheimer disease. We carried out the quantitative analysis using statistical methods to gain insights about the processes that govern Aβ peptide deposition. Our results show that plaques are complex porous structures with characteristic pore sizes. We interpret plaque morphology in the context of a new dynamical model based on competing aggregation and disaggregation processes in kinetic steady-state equilibrium with an additional diffusion process allowing Aβ deposits to diffuse over the surface of plaques.
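The competing aggregation-disaggregation picture can be illustrated with the simplest possible rate model, dn/dt = k_agg - k_dis * n, whose kinetic steady state is n* = k_agg / k_dis. This is a deliberately crude caricature of the plaque model (it ignores the spatial and diffusion aspects entirely), with invented rate constants:

```python
def steady_state_size(k_agg, k_dis, n0=0.0, dt=0.01, steps=10000):
    """Euler integration of dn/dt = k_agg - k_dis * n.

    Aggregation adds mass at a constant rate; disaggregation removes it
    in proportion to the current size, so n relaxes to k_agg / k_dis
    regardless of the starting size n0.
    """
    n = n0
    for _ in range(steps):
        n += dt * (k_agg - k_dis * n)
    return n

# With invented rates k_agg = 2.0, k_dis = 0.5 the deposit size settles
# at the kinetic steady state k_agg / k_dis = 4.0.
print(steady_state_size(2.0, 0.5))
```

A steady state set by the ratio of the two rates, rather than a runaway growth, is the qualitative behavior the dynamical model above invokes to explain characteristic plaque morphology.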

Relevance: 30.00%

Abstract:

This study has three main objectives. First, it develops a generalization of the commonly used EKS method for multilateral price comparisons. It is shown that the EKS system can be generalized so that weights can be attached to each of the link comparisons used in the EKS computations. These weights can account for differing levels of reliability of the underlying binary comparisons. Second, various reliability measures and corresponding weighting schemes are presented and their merits discussed. Third, these new methods are applied to an international data set of manufacturing prices from the ICOP project. Although theoretically superior, the weighted EKS method turns out to have a generally small empirical impact compared to the unweighted EKS. This impact is found to be larger when the method is applied at lower levels of aggregation. Finally, the importance of using sector-specific PPPs in assessing relative levels of manufacturing productivity is indicated.
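The EKS idea can be sketched as follows: each multilateral parity is a geometric mean of the binary parities chained through every bridge country, and a weighted variant down-weights less reliable links. For simplicity this sketch attaches one reliability weight per bridge country rather than per link comparison, so it only approximates the generalization described above; the price data are invented:

```python
import numpy as np

def eks(binary, w=None):
    """Weighted EKS transitivization of a matrix of binary parities.

    binary -- M x M matrix, binary[j, k] = parity of country k relative to j
    w      -- optional reliability weight per bridge country (normalized
              internally); None gives the classic unweighted EKS.
    Returns P with P[j, k] = prod_l (binary[j, l] * binary[l, k]) ** w_l.
    """
    M = binary.shape[0]
    w = np.ones(M) if w is None else np.asarray(w, dtype=float)
    w = w / w.sum()
    logB = np.log(binary)
    # log P[j, k] = sum_l w_l * (log binary[j, l] + log binary[l, k])
    a = logB @ w        # row part:    sum_l w_l * log binary[j, l]
    b = w @ logB        # column part: sum_l w_l * log binary[l, k]
    return np.exp(a[:, None] + b[None, :])

prices = np.array([1.0, 2.0, 4.0])           # invented price levels
binary = prices[None, :] / prices[:, None]   # a consistent binary matrix
P = eks(binary, w=[3.0, 1.0, 1.0])           # illustrative reliability weights
```

When the binary parities are already transitive, as in this toy matrix, EKS (weighted or not) returns them unchanged; the unweighted EKS is known to be the least-squares transitive approximation of inconsistent parities in log space.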