874 results for "Filmic approach methods"


Relevance:

30.00%

Publisher:

Abstract:

A generic LC-MS approach for the absolute quantification of undigested peptides in plasma at mid-picomolar levels is described. Nine human peptides were targeted, namely brain natriuretic peptide (BNP), substance P (SubP), parathyroid hormone 1-34 (PTH), C-peptide, orexins A and B (Orex-A and -B), oxytocin (Oxy), gonadoliberin-1 (gonadotropin-releasing hormone or luteinizing hormone-releasing hormone, LHRH) and α-melanotropin (α-MSH). Plasma samples were extracted via a two-step procedure: protein precipitation with one volume of acetonitrile, followed by ultrafiltration of the supernatants on membranes with a molecular-weight cut-off of 30 kDa. Using a dedicated LC-MS setup, large volumes of filtrate (e.g., 2 × 750 μL) were injected and the peptides were trapped on a 1 mm i.d. × 10 mm C8 column with a 10× on-line dilution. The peptides were then back-flushed, with a second on-line dilution (2×) applied during the transfer step, and the refocused peptides were resolved on a 0.3 mm i.d. C18 analytical column. Extraction recovery, matrix effects and limits of detection were evaluated. Our comprehensive protocol demonstrates a simple and efficient sample preparation procedure followed by analysis of the peptides with limits of detection in the mid-picomolar range. This generic approach can be applied to the determination of most therapeutic peptides and, with the latest state-of-the-art instruments, possibly to endogenous peptides.


Today, cocaine use is a public health issue. Cocaine is a powerfully addictive stimulant drug whose use is increasing in some parts of the population. After a brief description of the physical and psychological effects of cocaine use, the article presents a motivational approach that general practitioners can use to address risk-reduction issues. Based on the Transtheoretical Model of human behavior change and illustrated with clinical examples, the article focuses particularly on the two earliest stages of change: "pre-contemplation" and "contemplation".


A fundamental question in developmental biology is how tissues are patterned to give rise to differentiated body structures with distinct morphologies. The Drosophila wing disc offers an accessible model for understanding epithelial spatial patterning and has been studied extensively using genetic and molecular approaches. Bristle patterns on the thorax, which arise from the medial part of the wing disc, are a classical model of pattern formation, dependent on a pre-pattern of trans-activators and trans-repressors. Despite decades of molecular studies, we still know only a subset of the factors that determine the pre-pattern. We are applying a novel and interdisciplinary approach to predict regulatory interactions in this system, based on the description of expression patterns by simple logical relations (addition, subtraction, intersection and union) between simple shapes (graphical primitives). Similarities and relations between primitives have been shown to be predictive of regulatory relationships between the corresponding regulatory factors in other systems, such as the Drosophila egg. Furthermore, they provide the basis for dynamical models of the bristle-patterning network, which enable us to make even more detailed predictions on gene regulation and expression dynamics. We have obtained a dataset of wing disc expression patterns which we are now processing to obtain average expression patterns for each gene. Through triangulation of the images we can transform the expression patterns into vectors that can easily be analysed by standard clustering methods. These analyses will allow us to identify primitives and regulatory interactions. We expect to identify new regulatory interactions and to understand the basic dynamics of the regulatory network responsible for thorax patterning. These results will provide us with a better understanding of the rules governing gene regulatory networks in general, and provide the basis for future studies of the evolution of the thorax-patterning network in particular.
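The vectorisation-and-clustering step outlined above can be sketched roughly as follows; this is a toy illustration with synthetic expression vectors, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(vectors, k, n_iter=20):
    """Plain k-means: group expression vectors (one row per gene, one
    column per triangulated position) into k clusters of similar patterns."""
    # initialise centres from evenly spaced rows (simple and deterministic)
    centers = vectors[:: max(len(vectors) // k, 1)][:k].copy()
    for _ in range(n_iter):
        # assign each gene to the nearest centre (Euclidean distance)
        d = np.linalg.norm(vectors[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = vectors[labels == j].mean(axis=0)
    return labels

# two synthetic groups of "genes" with clearly distinct expression patterns
group_a = rng.normal(loc=0.0, scale=0.1, size=(5, 20))
group_b = rng.normal(loc=1.0, scale=0.1, size=(5, 20))
labels = kmeans(np.vstack([group_a, group_b]), k=2)
```

With well-separated patterns, genes sharing a pattern end up in the same cluster, which is the grouping from which primitives and candidate regulatory interactions would be read off.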


Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
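As a concrete illustration, the Poisson τ-leap step described in the abstract can be sketched for the simplest possible system, a single decay channel (a toy example, not the authors' RK extension):

```python
import numpy as np

rng = np.random.default_rng(0)

def tau_leap(x0, c, tau, n_steps):
    """Poisson tau-leap for the single-channel decay reaction A -> 0.
    Each step fires k ~ Poisson(a(x)*tau) reactions at once, where
    a(x) = c*x is the propensity, instead of simulating every event."""
    x, traj = x0, [x0]
    for _ in range(n_steps):
        a = c * x                 # propensity of the decay channel
        k = rng.poisson(a * tau)  # number of firings in one leap
        x = max(x - k, 0)         # state update, clamped at zero
        traj.append(x)
    return traj

# 1000 molecules, rate 0.5 /s, tau = 0.01 s, simulated for 4 s (400 leaps);
# the mean trajectory should track the deterministic decay 1000*exp(-0.5*t)
traj = tau_leap(x0=1000, c=0.5, tau=0.01, n_steps=400)
```

The exact SSA would instead draw one waiting time per individual firing event, which is exactly the per-event cost the τ-leap family avoids.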


This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs compared to conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken, in which body concentrations in organisms are related to effects and toxicity thresholds are derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new test(s) are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.
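The simple settling estimates mentioned above ultimately rest on Stokes' law; a back-of-the-envelope sketch (the particle properties below are assumed for illustration):

```python
def stokes_velocity(radius_m, rho_particle, rho_fluid=1000.0,
                    viscosity=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere:
    v = 2 r^2 (rho_p - rho_f) g / (9 mu).  Valid only for isolated
    spheres at low Reynolds number; aggregation (as predicted by DLVO
    theory) changes the effective size and density of what settles."""
    return 2.0 * radius_m ** 2 * (rho_particle - rho_fluid) * g / (9.0 * viscosity)

# e.g. a 100 nm TiO2 particle (density ~4230 kg/m^3) in water
v = stokes_velocity(50e-9, 4230.0)
mm_per_day = v * 86400 * 1000  # distance settled per day, in millimetres
```

For an isolated 100 nm particle the settled distance is on the order of a millimetre per day, which is why maintaining exposure over a multi-day test is non-trivial once particles aggregate into much faster-settling clusters.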


This study investigates the harmonisation of analytical results as an alternative to the more restrictive harmonisation of analytical methods, which is currently recommended to enable information exchange and thereby support the fight against illicit drug trafficking. Indeed, the main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, regardless of differences in their analytical parameters. For this purpose, a methodology was developed that makes it possible to estimate, and even optimise, the similarity of results obtained with different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results from different analytical methods can be objectively assessed, and the practical utility of database sharing between these methods can be evaluated, depending on the profiling purpose (evidential vs. operational tool). This methodology can be regarded as a relevant approach to feeding a database from different analytical methods, and it calls into question the necessity of analysing all illicit drug seizures in a single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
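Inter-method similarity assessments of this kind are typically based on a correlation or distance measure between normalised peak profiles; a minimal sketch (the peak areas below are invented for illustration):

```python
import math

def normalise(profile):
    """Scale a profile of peak areas so it sums to 1 (relative areas)."""
    total = sum(profile)
    return [p / total for p in profile]

def cosine_similarity(a, b):
    """Cosine similarity between two chemical profiles; values near 1
    mean the two methods rank the target compounds almost identically."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# the same seizure profiled by two methods (hypothetical peak areas)
gc_ms   = normalise([120.0, 45.0, 80.0, 10.0, 30.0])
fast_gc = normalise([118.0, 47.0, 76.0, 11.0, 33.0])
sim = cosine_similarity(gc_ms, fast_gc)
```

A similarity close to 1 for profiles of the same seizure, and clearly lower values for unrelated seizures, is the behaviour that would justify feeding both methods into one shared database.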


AIMS/HYPOTHESIS: Epidemiological and experimental evidence suggests that uric acid has a role in the aetiology of type 2 diabetes. Using a Mendelian randomisation approach, we investigated whether there is evidence for a causal role of serum uric acid in the development of type 2 diabetes. METHODS: We examined the associations with type 2 diabetes of serum-uric-acid-raising alleles of eight common variants recently identified in genome-wide association studies, summarised in a genetic score, in case-control studies including 7,504 diabetes patients and 8,560 non-diabetic controls. We compared the observed effect size to that expected based on: (1) the association between the genetic score and uric acid levels in non-diabetic controls; and (2) the meta-analysed association between uric acid level and diabetes. RESULTS: The genetic score showed a linear association with uric acid levels, with a difference of 12.2 μmol/l (95% CI 9.3, 15.1) per score tertile. No significant associations were observed between the genetic score and potential confounders. No association was observed between the genetic score and type 2 diabetes, with an OR of 0.99 (95% CI 0.94, 1.04) per score tertile, significantly different (p = 0.046) from that expected (1.04 [95% CI 1.03, 1.05]) based on the observed uric acid difference per score tertile and the uric acid to diabetes association of 1.21 (95% CI 1.14, 1.29) per 60 μmol/l. CONCLUSIONS/INTERPRETATION: Our results do not support a causal role of serum uric acid in the development of type 2 diabetes and limit the expectation that uric-acid-lowering drugs will be effective in the prevention of type 2 diabetes.
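The expected per-tertile OR of 1.04 quoted in the Results follows from scaling the observed uric acid to diabetes association down to the per-tertile uric acid difference; the arithmetic can be checked directly:

```python
import math

or_per_60_umol = 1.21      # uric acid -> diabetes association, per 60 umol/l
diff_per_tertile = 12.2    # uric acid difference per genetic score tertile, umol/l

# expected OR per score tertile = exp(ln(1.21) * 12.2 / 60)
expected_or = math.exp(math.log(or_per_60_umol) * diff_per_tertile / 60)
```

Rounding `expected_or` to two decimals reproduces the 1.04 against which the genetic score's null result (OR 0.99) was tested.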


Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study. Location: World-wide. Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years. Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) in the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information; and (2) in the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node. Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.


Testosterone abuse is conventionally assessed by the urinary testosterone/epitestosterone (T/E) ratio, levels above 4.0 being considered suspicious. A deletion polymorphism in the gene coding for UGT2B17 is strongly associated with reduced testosterone glucuronide (TG) levels in urine, and many individuals devoid of the gene would not reach a T/E ratio of 4.0 after testosterone intake. Future test programs will most likely shift from population-based to individual-based T/E cut-off ratios using Bayesian inference. A longitudinal analysis depends on an individual's true negative baseline T/E ratio. The aim was to investigate whether it is possible to increase the sensitivity and specificity of the T/E test by adding UGT2B17 genotype information in a Bayesian framework. A single intramuscular dose of 500 mg testosterone enanthate was given to 55 healthy male volunteers with either two, one or no allele (ins/ins, ins/del or del/del) of the UGT2B17 gene. Urinary excretion of TG and the T/E ratio were measured for 15 days. A Bayesian analysis was conducted to calculate each individual's T/E cut-off ratio. When the genotype information was added, the program returned lower individual cut-off ratios in all del/del subjects, increasing the sensitivity of the test considerably. It will be difficult, if not impossible, to discriminate between a true negative baseline T/E value and a false negative one without knowledge of the UGT2B17 genotype. UGT2B17 genotype information is therefore crucial, both for deciding which initial cut-off ratio to use for an individual and for increasing the sensitivity of the Bayesian analysis.
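The longitudinal idea, shifting from the population cut-off of 4.0 to an individual one, can be illustrated with a minimal conjugate-normal sketch. All numbers here are hypothetical, and the operational Bayesian model used in doping control is considerably more elaborate:

```python
import math

def update(mu, var, obs_var, observations):
    """Conjugate normal update of an athlete's baseline mean log(T/E)
    after a series of presumed-negative test results."""
    for y in observations:
        post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
        mu = post_var * (mu / var + y / obs_var)
        var = post_var
    return mu, var

# genotype-specific prior (hypothetical numbers): del/del athletes have a
# much lower baseline T/E, so their posterior settles well below average
obs_var = 0.3 ** 2
mu, var = update(mu=math.log(1.0), var=0.5 ** 2, obs_var=obs_var,
                 observations=[math.log(0.2), math.log(0.25)])

# individual cut-off = 99th percentile of the predictive distribution
cutoff = math.exp(mu + 2.326 * math.sqrt(var + obs_var))
```

For this del/del-like athlete the individual cut-off comes out far below the population threshold of 4.0, which is precisely why the test's sensitivity increases once genotype is taken into account.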


Deterioration in portland cement concrete (PCC) pavements can occur due to distresses caused by a combination of traffic loads and weather conditions. Hot mix asphalt (HMA) overlay is the most commonly used rehabilitation technique for such deteriorated PCC pavements. However, the performance of these HMA-overlaid pavements is hindered by the occurrence of reflective cracking, resulting in a significant reduction of pavement serviceability. Various fractured slab techniques, including rubblization, crack and seat, and break and seat, are used to minimize reflective cracking by reducing the slab action. However, the design of structural overlay thickness for cracked and seated and rubblized pavements is difficult, as the resulting structure is neither a “true” rigid pavement nor a “true” flexible pavement. Existing design methodologies use empirical procedures based on the AASHO Road Test conducted in 1961, but the AASHO Road Test did not employ any fractured slab technique, and there are numerous limitations in extrapolating its results to HMA overlay thickness design for fractured PCC pavements. The main objective of this project is to develop a mechanistic-empirical (ME) design approach for HMA overlay thickness design for fractured PCC pavements. In this design procedure, failure criteria such as the tensile strain at the bottom of the HMA layer and the vertical compressive strain on the surface of the subgrade are used to consider HMA fatigue and subgrade rutting, respectively. The developed ME design system is also implemented in a Visual Basic computer program. A partial validation of the design method with reference to an instrumented trial project (IA-141, Polk County) in Iowa is provided in this report. Tensile strain values at the bottom of the HMA layer collected from FWD testing at this project site are in agreement with the results obtained from the developed computer program.
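The two failure criteria mentioned above are usually expressed as power-law transfer functions relating allowable load repetitions to the critical strains. A sketch using Asphalt Institute-style coefficients, which are commonly cited values assumed here for illustration and not taken from this report:

```python
def fatigue_repetitions(eps_t, e_hma):
    """Allowable load repetitions against HMA fatigue, from the tensile
    strain eps_t at the bottom of the HMA layer and the HMA modulus
    e_hma (psi).  Coefficients follow the Asphalt Institute form."""
    return 0.0796 * eps_t ** -3.291 * e_hma ** -0.854

def rutting_repetitions(eps_v):
    """Allowable load repetitions against subgrade rutting, from the
    vertical compressive strain eps_v at the top of the subgrade."""
    return 1.365e-9 * eps_v ** -4.477

# e.g. 200 microstrain tensile, 400,000 psi HMA, 500 microstrain vertical
n_fatigue = fatigue_repetitions(200e-6, 400_000)
n_rutting = rutting_repetitions(500e-6)
# the design life is governed by the smaller of the two allowable values
n_design = min(n_fatigue, n_rutting)
```

In an ME thickness design, the overlay thickness is increased until both allowable repetition counts exceed the design traffic.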


Using data from the Spanish household budget survey, we investigate life-cycle effects on several product expenditures. A latent-variable model approach is adopted to evaluate the impact of income on expenditures, controlling for the number of members in the family. Two latent factors underlying repeated measures of monetary and non-monetary income are used as explanatory variables in the expenditure regression equations, thus avoiding possible bias associated with measurement error in income. The proposed methodology also handles the case in which product expenditures exhibit a pattern of infrequent purchases. Multiple-group analysis is used to assess the variation of key parameters of the model across various household life-cycle typologies. The analysis discloses significant life-cycle effects on the mean levels of expenditures; it also detects significant life-cycle effects on the way expenditures are affected by income and family size. Asymptotically robust methods are used to account for possible non-normality of the data.


This study aimed to analyze nipple trauma resulting from breastfeeding from a dermatological perspective. Two integrative literature reviews were conducted: the first on definitions, classifications and evaluation methods of nipple trauma, and the second on validation studies related to this theme. The first review included 20 studies; only one third of them defined nipple trauma, more than half did not define the nipple injuries they reported, and each author assessed the injuries in a particular way, without consensus. In the second integrative review, no validation study or algorithm related to nipple trauma resulting from breastfeeding was found. This demonstrates that the nipple injuries mentioned in the first review have not undergone validation studies, which explains the lack of consensus identified in the definitions, classifications and assessment methods of nipple trauma.



Radioimmunodetection of tumours with monoclonal antibodies is becoming an established procedure. Positron emission tomography (PET) offers better resolution than conventional gamma camera single photon emission tomography and can provide more precise quantitative data. In the present study, these powerful methods were therefore combined to perform radioimmuno-PET (RI-PET). Monoclonal antibodies directed against carcinoembryonic antigen (CEA), namely an IgG, its F(ab')2 fragment and a mouse-human chimeric IgG derived from it, were labelled with 124I, a positron-emitting radionuclide with a convenient physical half-life of four days. Mice xenografted with a CEA-producing human colon carcinoma were injected with the 124I-MAb and the tumours were visualized using PET. The concentrations of 124I in tumour and normal tissue were determined both by PET and by direct radioactivity counting of the dissected animals, with very good agreement. To allow PET quantification, a procedure was established to account for the presence of radioactivity during the absorption correction measurement (transmission scan). The comparison of PET and tissue counting indicates that this novel combination of radioimmunolocalization and PET (RI-PET) will provide, in addition to more precise diagnosis, more accurate radiation dosimetry for radioimmunotherapy.


Objective: To analyze the methodological aspects used in the preparation of terminology subsets of the International Classification for Nursing Practice (ICNP®) in Brazilian nursing dissertations and theses. Method: This is an integrative review of Brazilian dissertations and theses defended between 2007 and 2013, of which seven dissertations were included. Results: The growing production of studies on this theme by Brazilian nurses shows a concern for a unified language for the profession. However, the results demonstrate a lack of uniformity in the conduct of the studies, especially in relation to the stages of content validation. The initiatives of some authors to systematize alternative methods for creating these subsets also stood out. Conclusion: We suggest the development of new terminology subsets following standards of methodological rigor, as well as their application and validation with the selected clientele, to ensure greater reliability of results and the desired changes for the profession.


BACKGROUND AND OBJECTIVES: Microparticles (MPs) are small phospholipid vesicles of less than 1 μm, shed into the blood flow by various cell types. These MPs are involved in several biological processes and diseases. MPs have also been detected in blood products; however, their role in transfused patients is unknown. The purpose of this study was to characterize these MPs under blood bank conditions. MATERIALS AND METHODS: Qualitative and quantitative experiments using flow cytometry or proteomic techniques were performed on MPs derived from erythrocyte concentrates. To count the MPs, they were either isolated by various centrifugation procedures or counted directly in the erythrocyte concentrates. RESULTS: A 20-fold increase over 50 days of storage at 4 °C was observed (from 3,370 +/- 1,180 MPs/μl at day 5 to 64,850 +/- 37,800 MPs/μl at day 50). Proteomic analysis revealed changes in protein expression when comparing MPs to erythrocyte membranes. Finally, the expression of Rh blood group antigens was shown on MPs generated during erythrocyte storage. CONCLUSIONS: Our work provides evidence that storage of red blood cells is associated with the generation of MPs characterized by particular proteomic profiles. These results contribute to fundamental knowledge of transfused blood products.