130 results for simple algorithms
Abstract:
Many research projects in the life sciences require purified, biologically active recombinant protein. In addition, different formats of a given protein may be needed at different steps of experimental studies. Thus, the number of protein variants to be expressed and purified within short periods of time can expand very quickly. We have therefore developed a rapid and flexible expression system based on previously described episomal vector replication to generate semi-stable cell pools that secrete recombinant proteins. We cultured these pools in serum-containing medium to avoid time-consuming adaptation of cells to serum-free conditions, maintain cell viability and reuse the cultures for multiple rounds of protein production. Accordingly, an efficient single-step affinity process to purify recombinant proteins from serum-containing medium was optimized. Furthermore, a series of multi-cistronic vectors was designed to enable simultaneous expression of proteins and their biotinylation in vivo, as well as fast selection of protein-expressing cell pools. Combining these improved procedures and innovative steps, exemplified with seven cytokines and cytokine receptors, we were able to produce biologically active, endotoxin-free recombinant protein at the milligram scale in 4-6 weeks from molecular cloning to protein purification.
Abstract:
BACKGROUND: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without likelihood computations. RESULTS: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computation (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population-genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates, and to find that males show smaller population sizes but much higher levels of migration than females. CONCLUSION: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from their prior distributions, through data simulation, computation of summary statistics, estimation of posterior distributions, model choice and validation of the estimation procedure, to visualization of the results.
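Of the algorithms listed above, rejection sampling is the simplest to illustrate. The following is a minimal sketch of ABC rejection sampling on a toy normal-mean problem; it is not ABCtoolbox code, and the model, prior range and tolerance are assumptions chosen for illustration.

```python
# Minimal ABC rejection sampler (an illustrative sketch, not ABCtoolbox code).
# Toy problem: infer the mean of a normal distribution from its sample mean.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(loc=2.0, scale=1.0, size=100)
s_obs = observed.mean()                       # observed summary statistic

def simulate_summary(theta, n=100):
    """Simulate a dataset under theta and return its summary statistic."""
    return rng.normal(loc=theta, scale=1.0, size=n).mean()

n_draws, epsilon = 50_000, 0.05               # tolerance chosen for illustration
theta = rng.uniform(-5.0, 5.0, size=n_draws)  # draws from a uniform prior
s_sim = np.array([simulate_summary(t) for t in theta])
accepted = theta[np.abs(s_sim - s_obs) <= epsilon]   # keep near-matches

print(f"accepted {accepted.size} of {n_draws} draws; "
      f"posterior mean = {accepted.mean():.2f}")
```

The same skeleton underlies the more elaborate samplers named above: MCMC without likelihood replaces the independent prior draws with a Markov chain over theta, and ABC-GLM post-processes the accepted draws with a regression adjustment.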
Abstract:
We present a combined shape and mechanical anisotropy evolution model for a two-phase, inclusion-bearing rock subject to large deformation. A single elliptical inclusion embedded in a homogeneous but anisotropic matrix is used to represent a simplified shape evolution enforced on all inclusions. The mechanical anisotropy develops due to the alignment of elongated inclusions. The effective anisotropy is quantified using the differential effective medium (DEM) approach. The model can be run for any deformation path and an arbitrary viscosity ratio between the inclusion and host phases. We focus on the case of simple shear and weak inclusions. The shape evolution of the representative inclusion is largely insensitive to the anisotropy development and to parameter variations in the studied range. An initial hardening stage is observed up to a shear strain of γ = 1, irrespective of the inclusion fraction. The hardening is followed by a softening stage related to the developing anisotropy and its progressive rotation toward the shear direction. The traction needed to maintain a constant shear rate exhibits a fivefold drop at γ = 5 in the limiting case of an inviscid inclusion. Numerical simulations show that our analytical model provides a good approximation to the actual evolution of a two-phase inclusion-host composite. However, the inclusions develop complex sigmoidal shapes, resulting in the formation of an S-C fabric. We attribute the observed drop in the effective normal viscosity to this structural development. We study the localization potential in a rock column bearing a varying fraction of inclusions. In the inviscid-inclusion case, a strain jump from γ = 3 to γ = 100 is observed when the inclusion fraction changes from 20% to 33%.
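For intuition about the shape evolution described above, the passive-marker limit (viscosity ratio of 1) has a closed-form solution via the simple-shear deformation gradient. The sketch below computes the aspect ratio and long-axis orientation of an initially circular marker; it is an illustrative limit case only, not the paper's full model, which additionally tracks the viscosity contrast and the developing matrix anisotropy.

```python
# Shape evolution of a passive elliptical marker under simple shear
# (passive limit; the paper's model also tracks the inclusion/host
# viscosity ratio and the developing matrix anisotropy).
import numpy as np

def ellipse_under_simple_shear(gamma):
    """Aspect ratio and long-axis orientation (degrees from the shear
    plane) of an initially circular marker after shear strain gamma."""
    F = np.array([[1.0, gamma],
                  [0.0, 1.0]])           # simple-shear deformation gradient
    B = F @ F.T                          # left Cauchy-Green (finite strain) tensor
    eigval, eigvec = np.linalg.eigh(B)   # eigenvalues in ascending order
    aspect = np.sqrt(eigval[1] / eigval[0])
    theta = np.degrees(np.arctan2(eigvec[1, 1], eigvec[0, 1]))
    return aspect, theta

for g in (1.0, 3.0, 5.0):
    r, th = ellipse_under_simple_shear(g)
    print(f"gamma = {g:3.0f}: aspect ratio {r:7.2f}, long axis at {th:5.1f} deg")
```

As γ grows, the marker elongates and its long axis rotates toward the shear plane, consistent with the progressive rotation of the anisotropy described above.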
Abstract:
Management of musculoskeletal tumours usually begins with the appearance of a lump or bump, or the onset of nonspecific symptoms. A poor initial work-up, a faulty biopsy or an inadequate resection may have a severe impact on the prognosis, leading to re-interventions, amputation, local recurrence or systemic spread of the disease. A patient with a suspicious lesion should be referred to a sarcoma center, where a planned and well-performed diagnostic work-up will allow a precise diagnosis in terms of histology and staging. After a multidisciplinary discussion of the case, an accurate treatment plan is established. Such an approach allows adequate patient management, often with a positive impact on survival and functional outcome.
Abstract:
A simple method for determining airborne monoethanolamine has been developed. Monoethanolamine determination has traditionally been difficult due to analytical separation problems. Even in recent, sophisticated methods, this difficulty remains the major issue, often resulting in time-consuming sample preparation. Impregnated glass-fiber filters were used for sampling. Desorption of monoethanolamine was followed by capillary GC analysis with nitrogen-phosphorus selective detection. Separation was achieved using a column suited to monoethanolamine (35% diphenyl and 65% dimethyl polysiloxane). The internal standard was quinoline. No derivatization steps were needed. The calibration range was 0.5-80 μg/mL with good correlation (R² = 0.996). Averaged overall precisions and accuracies were 4.8% and -7.8% for intraday (n = 30), and 10.5% and -5.9% for interday (n = 72) measurements. Mean recovery from spiked filters was 92.8% for the intraday variation and 94.1% for the interday variation. Monoethanolamine on stored spiked filters was stable for at least 4 weeks at 5°C. The newly developed method was applied among professional cleaners; air concentrations (n = 4) were 0.42 and 0.17 mg/m³ for personal measurements and 0.23 and 0.43 mg/m³ for stationary measurements. The method described here is simple, sensitive, and convenient in terms of both sampling and analysis.
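As a minimal sketch of the calibration step reported above (linear response over 0.5-80 μg/mL), the snippet below fits a calibration line and inverts it for an unknown sample. The response ratios are invented purely for illustration; only the concentration levels follow the stated range.

```python
# Minimal calibration sketch: linear fit of detector response
# (analyte/internal-standard peak-area ratio) vs concentration.
# The response ratios below are hypothetical illustration values.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 20, 40, 80])       # standards, ug/mL
ratio = np.array([0.011, 0.022, 0.045, 0.11,
                  0.21, 0.43, 0.86, 1.70])            # hypothetical ratios

slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)
print(f"ratio = {slope:.4f} * conc + {intercept:.4f},  R^2 = {r2:.4f}")

# For an unknown sample, invert the calibration line:
unknown_ratio = 0.30
print(f"estimated conc = {(unknown_ratio - intercept) / slope:.2f} ug/mL")
```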
Abstract:
Background: Modelling epidemiological knowledge in validated clinical scores is a practical means of integrating EBM into usual care. Existing scores for cardiovascular disease have largely been developed in emergency settings, but few in primary care. Such a tool is needed for general practitioners (GPs) to evaluate the probability of ischemic heart disease (IHD) in patients with non-traumatic chest pain. Objective: To develop a predictive model to use as a clinical score for detecting IHD in patients with non-traumatic chest pain in primary care. Methods: A post-hoc secondary analysis of data from an observational study including 672 patients with chest pain, of whom 85 had IHD diagnosed by their GP during the year following inclusion. A best-subset method was used to select 8 predictive variables from the univariate analysis, which were then fitted in a multivariate logistic regression model to define the score. The reliability of the model was assessed using the split-group method. Results: Significant predictors were: age (0-3 points), gender (1 point), having at least one cardiovascular risk factor (hypertension, dyslipidemia, diabetes, smoking, family history of CVD; 3 points), personal history of cardiovascular disease (1 point), duration of chest pain from 1 to 60 minutes (2 points), substernal chest pain (1 point), pain increasing with exertion (1 point) and absence of tenderness at palpation (1 point). The area under the ROC curve for the score was 0.95 (95% CI 0.93-0.97). Patients were categorised into three groups: low risk of IHD (score under 6; n = 360), moderate risk of IHD (score from 6 to 8; n = 187) and high risk of IHD (score from 9 to 13; n = 125). The prevalence of IHD in each group was 0%, 6.7% and 58.5%, respectively. The reliability of the model appears satisfactory, as the model developed from the derivation set accurately predicted (p = 0.948) the number of patients in each group in the validation set. Conclusion: This clinical score, based only on history and physical examination, can be an important tool for the general practitioner in predicting ischemic heart disease in patients complaining of chest pain. A score below 6 points (found in more than half of our population) can spare selected patients complementary exams (ECG, laboratory tests) because of the very low risk of IHD. A score of 6 points or above warrants investigation to detect or rule out IHD. Further external validation is required in ambulatory settings.
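The published point allocation transcribes directly into a score function, as sketched below. Two details are assumptions: the abstract does not give the age cut-offs behind the 0-3 age points (so they are left as an input), and it does not state which gender receives the 1 point (male is assumed here).

```python
# Sketch of the IHD clinical score, following the point allocation in the
# abstract. age_points (0-3) is left as an input because the age cut-offs
# are not given there; male = 1 point is an assumption.
def ihd_score(age_points, male, any_cv_risk_factor, personal_cv_history,
              pain_1_to_60_min, substernal, worse_on_exertion,
              no_tenderness_on_palpation):
    """Return (score, risk category) for non-traumatic chest pain."""
    score = age_points                       # 0-3 points
    score += 1 if male else 0                # gender: 1 point
    score += 3 if any_cv_risk_factor else 0  # HTN/dyslipidemia/diabetes/
                                             # smoking/family history: 3 points
    score += 1 if personal_cv_history else 0
    score += 2 if pain_1_to_60_min else 0
    score += 1 if substernal else 0
    score += 1 if worse_on_exertion else 0
    score += 1 if no_tenderness_on_palpation else 0

    if score < 6:
        risk = "low (IHD prevalence 0%)"
    elif score <= 8:
        risk = "moderate (IHD prevalence 6.7%)"
    else:
        risk = "high (IHD prevalence 58.5%)"
    return score, risk

print(ihd_score(age_points=2, male=True, any_cv_risk_factor=True,
                personal_cv_history=False, pain_1_to_60_min=True,
                substernal=True, worse_on_exertion=True,
                no_tenderness_on_palpation=True))   # -> (11, high risk)
```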
Abstract:
Because data on rare species are usually sparse, it is important to have efficient ways to sample additional data. Traditional sampling approaches are of limited value for rare species because a very large proportion of randomly chosen sampling sites are unlikely to shelter the species. For these species, spatial predictions from niche-based distribution models can be used to stratify the sampling and increase sampling efficiency. The newly sampled data are then used to improve the initial model. Applying this approach repeatedly is an adaptive process that can increase the number of new occurrences found. We illustrate the approach with a case study of a rare and endangered plant species in Switzerland and with a simulation experiment. Our field survey confirmed that the method helps in the discovery of new populations of the target species in remote areas where the predicted habitat suitability is high. In our simulations, the model-based approach provided a significant improvement (by a factor of 1.8 to 4, depending on the measure) over simple random sampling. In terms of cost, this approach may save up to 70% of the time spent in the field.
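The adaptive loop described above is straightforward to sketch: fit a niche model to the data in hand, predict habitat suitability everywhere, survey the most suitable unvisited sites, and refit. The snippet below uses logistic regression as a stand-in niche model on synthetic data; the landscape, niche coefficients and batch sizes are invented for illustration.

```python
# A minimal sketch of adaptive, model-based sampling for a rare species.
# All covariates, coefficients and batch sizes are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_sites = 2000
env = rng.normal(size=(n_sites, 3))              # environmental covariates
logit = 2.0 * env[:, 0] - env[:, 1] - 3.0        # hypothetical niche (rare species)
occupied = rng.random(n_sites) < 1.0 / (1.0 + np.exp(-logit))  # ground truth

surveyed = list(rng.choice(n_sites, size=100, replace=False))  # random start

for rnd in range(4):
    y = occupied[surveyed]
    if y.all() or not y.any():                   # model needs both classes
        surveyed.extend(rng.choice(n_sites, size=50, replace=False))
        continue
    model = LogisticRegression().fit(env[surveyed], y)
    suitability = model.predict_proba(env)[:, 1] # predicted habitat suitability
    suitability[surveyed] = -1.0                 # exclude visited sites
    surveyed.extend(np.argsort(suitability)[-25:])  # survey the most suitable
    print(f"round {rnd}: {int(occupied[surveyed].sum())} occurrences "
          f"after {len(surveyed)} sites surveyed")
```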
Abstract:
The noise power spectrum (NPS) is the reference metric for understanding the noise content of computed tomography (CT) images. To evaluate the noise properties of clinical multidetector CT (MDCT) scanners, local 2D and 3D NPSs were computed for different acquisition and reconstruction parameters. A 64-slice and a 128-slice MDCT scanner were employed. Measurements were performed on a water phantom in axial and helical acquisition modes. The CT dose index was identical for both installations. The influence of parameters such as the pitch, the reconstruction filter (soft, standard and bone) and the reconstruction algorithm (filtered back-projection (FBP), adaptive statistical iterative reconstruction (ASIR)) was investigated. Images were also reconstructed in the coronal plane using a reformat process, and 2D and 3D NPSs were then computed. In axial acquisition mode, the 2D axial NPS showed an important variation in magnitude along the z-direction when measured at the phantom center. In helical mode, a directional dependency with a lobular shape was observed, while the magnitude of the NPS remained constant. Important effects of the reconstruction filter, pitch and reconstruction algorithm were observed in the 3D NPS results for both MDCTs. With ASIR, a reduction of the NPS magnitude and a shift of the NPS peak toward the low-frequency range were visible. The 2D coronal NPS obtained from the reformatted images was affected by the interpolation when compared to the 2D coronal NPS obtained from 3D measurements. The noise properties of volumes measured on last-generation MDCTs were thus studied using the local 3D NPS metric; however, the impact of noise non-stationarity may need further investigation.
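A standard estimator of the local 2D NPS, of the kind used above, averages the squared DFT of mean-subtracted ROIs from the uniform phantom. The sketch below implements this estimator and checks it on synthetic white noise; the pixel size and ROI dimensions are illustrative, and a real measurement would usually detrend each ROI (e.g. with a low-order polynomial fit) rather than only subtracting the mean.

```python
# Minimal local 2D NPS estimator: ensemble-average the squared DFT of
# mean-subtracted ROIs from a uniform water phantom. Pixel size and ROI
# dimensions are illustrative values.
import numpy as np

def nps_2d(rois, pixel_mm=0.5):
    """rois: array of shape (n_roi, N, N) in HU. Returns the 2D NPS
    (HU^2 mm^2) and the spatial-frequency axis (mm^-1), both fftshifted."""
    n_roi, N, _ = rois.shape
    acc = np.zeros((N, N))
    for roi in rois:
        acc += np.abs(np.fft.fft2(roi - roi.mean())) ** 2
    nps = (pixel_mm ** 2) / (N * N) * acc / n_roi  # dxdy/(NxNy) * <|DFT|^2>
    freqs = np.fft.fftfreq(N, d=pixel_mm)
    return np.fft.fftshift(nps), np.fft.fftshift(freqs)

# Sanity check on synthetic white noise: the NPS should integrate back to
# the noise variance (here sigma = 10 HU, so ~100 HU^2).
rng = np.random.default_rng(0)
nps, f = nps_2d(rng.normal(0.0, 10.0, size=(64, 128, 128)))
df = f[1] - f[0]
print("recovered variance:", round(float(nps.sum() * df * df), 1))
```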
Abstract:
The state of the art for describing image quality in medical imaging is to assess the performance of an observer conducting a task of clinical interest. This can be done by using a model observer, leading to a figure of merit such as the signal-to-noise ratio (SNR). Using the non-prewhitening (NPW) model observer, we objectively characterised the evolution of its figure of merit under various acquisition conditions. The NPW model observer usually requires the modulation transfer function (MTF) as well as noise power spectra. However, although the computation of the MTF poses no problem with the traditional filtered back-projection (FBP) algorithm, this is not the case with iterative reconstruction (IR) algorithms such as adaptive statistical iterative reconstruction (ASIR) or model-based iterative reconstruction (MBIR). Given that the target transfer function (TTF) had already been shown to accurately express the system resolution even with non-linear algorithms, we tuned the NPW model observer by replacing the standard MTF with the TTF. The TTF was estimated using a custom-made phantom containing cylindrical inserts surrounded by water. The contrast differences between the inserts and water were plotted for each acquisition condition, and mathematical transformations were then performed to yield the TTF. As expected, the first results showed a dependency of the TTF on the image contrast and noise levels for both ASIR and MBIR. Moreover, FBP also proved to be dependent on contrast and noise when using the lung kernel. These results were then introduced into the NPW model observer. We observed an enhancement of the SNR every time we switched from FBP to ASIR to MBIR. IR algorithms greatly improve image quality, especially in low-dose conditions. Based on our results, the use of MBIR could lead to further dose reduction in several clinical applications.
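For reference, the NPW figure of merit combines the task function, the system transfer (here the TTF in place of the MTF) and the NPS as SNR² = [∫ |ΔS(f)·TTF(f)|² 2πf df]² / ∫ |ΔS(f)·TTF(f)|² NPS(f) 2πf df. The sketch below evaluates this radially; the task spectrum, TTF and NPS are hypothetical smooth curves chosen for illustration, not measured data.

```python
# Radial evaluation of the NPW SNR with the TTF substituted for the MTF.
# Task spectrum, TTF and NPS below are hypothetical curves, not measured data.
import numpy as np

f = np.linspace(1e-3, 1.0, 500)              # radial spatial frequency (mm^-1)
df = f[1] - f[0]

task = 20.0 * np.exp(-((f / 0.15) ** 2))     # deltaS(f): low-contrast task (assumed)
ttf = np.exp(-((f / 0.35) ** 2))             # target transfer function (assumed)
nps = 40.0 * f * np.exp(-((f / 0.30) ** 2))  # ramp-like CT NPS (assumed)

w2 = (task * ttf) ** 2                       # |expected imaged signal|^2
num = (np.sum(w2 * 2.0 * np.pi * f) * df) ** 2   # polar (radial) integration
den = np.sum(w2 * nps * 2.0 * np.pi * f) * df
print(f"NPW SNR = {np.sqrt(num / den):.2f}")
```

Recomputing this figure of merit with the TTF and NPS measured for each algorithm is what allows the FBP/ASIR/MBIR comparison described above.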
Abstract:
Is it possible to perfectly simulate a signature, in the particular and challenging case where the signature is simple? A set of signatures from six writers, considered simple on the basis of highlighted criteria, was sampled. These signatures were given to forgers who were asked to produce freehand simulations. Among these simulations, those capable of reproducing the features of the reference signatures were submitted for evaluation to forensic document examiners through proficiency testing. The results suggest that there is no perfect simulation. With the supplementary aim of assessing the influence of forgers' skills on the results, forgers were selected from three distinct populations differing according to professional criteria. The results indicate some differences in graphical capabilities between individuals. However, no trend could be established regarding age, degrees, years of practice or time dedicated to the exercise. The findings show that simulation is easier when graphical compatibility exists between the forger's own handwriting and the signature to be reproduced. Moreover, a general difficulty in preserving proportions and slant, as well as the shape of capital letters and initials, was noticed.