15 results for Non-optimal Codon
at BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Although there is dissimilarity in theoretical research approaches to subjective well-being and to assessments of well-being, there is agreement regarding the value of well-being, especially among student populations. In the highly structured, achievement-oriented, non-optimal context of a classroom, individual well-being is a necessary pre-condition for learning. Among student populations, well-being should not be construed as an achievement enhancer but, rather, recognized and measured as an educational value of its own. It is, however, necessary for a positive attitude towards learning, at least in highly structured, achievement-orientated, non-optional learning contexts like school [cf. Hascher, T. (2004). Wohlbefinden in der Schule. Münster: Waxmann]. How can it be measured? Since different research approaches lead to a variety of instruments, the following paper will focus on two ways of assessing well-being in school: a questionnaire on student well-being (N = 2014) and a semi-structured daily diary about relevant emotional situations in school (N = 58, period 3 × 2 weeks). Both methods are introduced, and their methodological quality is discussed in terms of reliability and validity and in terms of their usefulness for improving school practice. Furthermore, the research potential of combining quantitative and qualitative data on students' well-being is addressed.
Abstract:
Systematic differences in circadian rhythmicity are thought to be a substantial factor determining inter-individual differences in fatigue and cognitive performance. The synchronicity effect (when time of testing coincides with the respective circadian peak period) seems to play an important role. Eye movements have been shown to be a reliable indicator of fatigue due to sleep deprivation or time spent on cognitive tasks. However, eye movements have not been used so far to investigate the circadian synchronicity effect and the resulting differences in fatigue. The aim of the present study was to assess how different oculomotor parameters in a free visual exploration task are influenced by: a) fatigue due to chronotypical factors (being a 'morning type' or an 'evening type'); b) fatigue due to the time spent on task. Eighteen healthy participants performed a free visual exploration task of naturalistic pictures while their eye movements were recorded. The task was performed twice, once at their optimal and once at their non-optimal time of the day. Moreover, participants rated their subjective fatigue. The non-optimal time of the day triggered a significant and stable increase in the mean visual fixation duration during the free visual exploration task for both chronotypes. The increase in the mean visual fixation duration correlated with the difference in subjectively perceived fatigue at optimal and non-optimal times of the day. Conversely, the mean saccadic speed significantly and progressively decreased throughout the duration of the task, but was not influenced by the optimal or non-optimal time of the day for either chronotype. The results suggest that different oculomotor parameters are discriminative for fatigue due to different sources. A decrease in saccadic speed seems to reflect fatigue due to time spent on task, whereas an increase in mean fixation duration reflects a lack of synchronicity between chronotype and time of the day.
Abstract:
REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) regulation requires that all chemicals produced in or imported into Europe above 1 tonne/year be registered. To register a chemical, physicochemical, toxicological and ecotoxicological information needs to be reported in a dossier. REACH promotes the use of alternative methods to replace, refine and reduce the use of animal (eco)toxicity testing. Within the EU OSIRIS project, integrated testing strategies (ITSs) have been developed for the rational use of non-animal testing approaches in chemical hazard assessment. Here we present an ITS for evaluating the bioaccumulation potential of organic chemicals. The scheme includes the use of all available data (including non-optimal ones), waiving schemes, analysis of physicochemical properties related to the end point, and alternative methods (both in silico and in vitro). In vivo methods are used only as a last resort. Using the ITS, in vivo testing could be waived for about 67% of the examined compounds, as their bioaccumulation potential could be estimated on the basis of non-animal methods. The presented ITS is freely available through a web tool.
Abstract:
The goals of any treatment of cervical spine injuries are: return to maximum functional ability, a minimum of residual pain, decrease of any neurological deficit, a minimum of residual deformity, and prevention of further disability. The advantages of surgical treatment are the ability to achieve optimal reduction, immediate stability, direct decompression of the cord and the exiting roots, the need for only minimal external fixation, the possibility of early mobilisation and clearly decreased nursing problems. There are several reasons why these goals can be reached better by anterior surgery. Usually the bony compression of the cord and roots comes from the front; therefore, anterior decompression is usually the procedure of choice. Also, anterior stabilisation with a plate is usually simpler than posterior instrumentation. It needs to be stressed that closed reduction by traction can align the fractured spine and indirectly decompress the neural structures in about 70% of cases. The necessary weight is 2.5 kg per level of injury. In the upper cervical spine, the type 2 odontoid fracture is an indication for anterior surgery by direct screw fixation. C1/C2 joint dislocations or fractures, or certain odontoid fractures, can be treated with a fusion of the C1/C2 joint by anterior transarticular screw fixation. In the lower and middle cervical spine, anterior plating combined with an iliac crest or fibular strut graft is the procedure of choice; however, a solid graft can also be replaced by filled solid or expandable vertebral cages. The complication rate of this surgery is low when properly executed, and anterior surgery may only be contra-indicated in the case of a significant lesion or locked joints.
Abstract:
The optimal temporal window of intravenous (IV) computed tomography (CT) cholangiography was prospectively determined. Fifteen volunteers (eight women, seven men; mean age, 38 years) underwent dynamic CT cholangiography. Two unenhanced images were acquired at the porta hepatis. Starting 5 min after initiation of IV contrast infusion (20 ml iodipamide meglumine 52%), 15 pairs of images were obtained at 5-min intervals. Attenuation of the extrahepatic bile duct (EBD) and the liver parenchyma was measured. Two readers graded visualization of the higher-order biliary branches. The first biliary opacification in the EBD occurred between 15 and 25 min (mean, 22.3 ± 3.2 min) after initiation of the contrast agent. Biliary attenuation plateaued between the 35- and 75-min time points. Maximum hepatic parenchymal enhancement was 18.5 ± 2.7 HU. Twelve subjects demonstrated poor or non-visualization of the higher-order biliary branches; three showed good or excellent visualization. Body weight correlated significantly with both biliary attenuation and visualization of the higher-order biliary branches (P<0.05). For peak enhancement of the biliary tree, CT cholangiography should be performed no earlier than 35 min after initiation of IV infusion. For a fixed contrast dose, superior visualization of the biliary system is achieved in subjects with lower body weight.
Abstract:
OBJECTIVE: To investigate predictors of continued HIV RNA viral load suppression in individuals switched to abacavir (ABC), lamivudine (3TC) and zidovudine (ZDV) after successful previous treatment with a protease inhibitor or non-nucleoside reverse transcriptase inhibitor-based combination antiretroviral therapy. DESIGN AND METHODS: An observational cohort study, which included individuals in the Swiss HIV Cohort Study switching to ABC/3TC/ZDV following successful suppression of viral load. The primary endpoint was time to treatment failure, defined as the first of the following events: two consecutive viral load measurements > 400 copies/ml under ABC/3TC/ZDV, one viral load measurement > 400 copies/ml with subsequent discontinuation of ABC/3TC/ZDV within 3 months, AIDS, or death. RESULTS: We included 495 individuals; 47 experienced treatment failure in 1459 person-years of follow-up [rate = 3.22 events/100 person-years; 95% confidence interval (95% CI), 2.30-4.14]. Of all failures, 62% occurred in the first year after switching to ABC/3TC/ZDV. In a Cox regression analysis, treatment failure was independently associated with earlier exposure to nucleoside reverse transcriptase inhibitor (NRTI) mono or dual therapy [hazard ratio (HR), 8.02; 95% CI, 4.19-15.35] and low CD4 cell count at the time of the switch (HR, 0.66; 95% CI, 0.51-0.87 per +100 cells/microl up to 500 cells/microl). In patients without earlier exposure to mono or dual therapy, AIDS prior to the switch to simplified maintenance therapy was an additional risk factor. CONCLUSIONS: The failure rate was low in patients with suppressed viral load who switched to ABC/3TC/ZDV treatment. Patients with earlier exposure to mono or dual NRTI therapy, a low CD4 cell count at the time of switch, or AIDS are at increased risk of treatment failure, limiting the use of ABC/3TC/ZDV in these patient groups.
Abstract:
Fish behaviourists are increasingly turning to non-invasive measurement of steroid hormones in holding water, as opposed to blood plasma. When some of us met at a workshop in Faro, Portugal, in September, 2007, we realised that there were still many issues concerning the application of this procedure that needed resolution, including: Why do we measure release rates rather than just concentrations of steroids in the water? How does one interpret steroid release rates when dealing with fish of different sizes? What are the merits of measuring conjugated as well as free steroids in water? In the ‘static’ sampling procedure, where fish are placed in a separate container for a short period of time, does this affect steroid release—and, if so, how can it be minimised? After exposing a fish to a behavioural stimulus, when is the optimal time to sample? What is the minimum amount of validation when applying the procedure to a new species? The purpose of this review is to attempt to answer these questions and, in doing so, to emphasize that application of the non-invasive procedure requires more planning and validation than conventional plasma sampling. However, we consider that the rewards justify the extra effort.
Abstract:
OBJECTIVE Standard stroke CT protocols start with non-enhanced CT (NECT) followed by perfusion-CT (PCT) and end with CTA. We aimed to evaluate the influence of the sequence of PCT and CTA on quantitative perfusion parameters, venous contrast enhancement and examination time, in order to save critical time in the therapeutic window in stroke patients. METHODS AND MATERIALS Stroke CT data sets of 85 patients, 47 with CTA before PCT (group A) and 38 with CTA after PCT (group B), were retrospectively analyzed by two experienced neuroradiologists. Parameter maps of cerebral blood flow, cerebral blood volume, time to peak and mean transit time, as well as arterial and venous contrast enhancement, were compared. RESULTS Both readers rated contrast of brain-supplying arteries to be equal in both groups (p=0.55, intracranial; p=0.73, extracranial). Quantitative perfusion parameters did not significantly differ between the groups (all p>0.18), while the extent of venous superimposition of the ICA was rated higher in group B (p=0.04). The time to complete the diagnostic CT examination was significantly shorter for group A (p<0.01). CONCLUSION Performing CTA directly after NECT has no significant effect on PCT parameters, avoids venous preloading in CTA, and significantly shortens the examination time.
Abstract:
Infrared thermography (IRT) was used to detect digital dermatitis (DD) prior to routine claw trimming. A total of 1192 IRT observations were collected from 149 cows on eight farms. All cows were housed in tie-stalls. The maximal surface temperatures of the coronary band (CB) region and skin (S) of the fore and rear feet (mean value of the maximal surface temperatures of both digits for each foot separately, CBmax and Smax) were assessed. Grouping was performed at the foot level (presence of DD, n=99; absence, n=304) or at the cow level (all four feet healthy, n=24; at least one DD lesion on the rear feet, n=37). For individual cows (n=61), the IRT temperature difference was determined by subtracting the mean sum of CBmax and Smax of the fore feet from that of the rear feet. Feet with DD had higher CBmax and Smax (P<0.001) than healthy feet. Smax was significantly higher in feet with infectious DD lesions (M-stages M2+M4; n=15) than in those with non-infectious M-lesions (M1+M3; n=84) (P=0.03), but this was not the case for CBmax (P=0.12). At the cow level, an optimal cut-off value for detecting DD of 0.99°C (IRT temperature difference between rear and front feet) yielded a sensitivity of 89.1% and a specificity of 66.6%. The results indicate that IRT may be a useful non-invasive diagnostic tool to screen for the presence of DD in dairy cows by measuring CBmax and Smax.
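The cut-off evaluation reported above uses the standard definitions of sensitivity and specificity. A minimal sketch of that computation, with invented data (the function name, temperature differences and disease labels below are illustrative, not the study's measurements):

```python
# Illustrative sketch only: evaluating a diagnostic cut-off on a
# rear-minus-front temperature difference, as in the IRT screening above.

def sensitivity_specificity(values, diseased, cutoff):
    """Classify a cow as DD-positive when its value >= cutoff."""
    tp = sum(1 for v, d in zip(values, diseased) if d and v >= cutoff)
    fn = sum(1 for v, d in zip(values, diseased) if d and v < cutoff)
    tn = sum(1 for v, d in zip(values, diseased) if not d and v < cutoff)
    fp = sum(1 for v, d in zip(values, diseased) if not d and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical temperature differences (degrees C) and true DD status per cow
diffs = [1.5, 0.8, 0.4, 1.2, 1.1, 0.2, 1.8, 0.9]
status = [True, True, False, True, False, False, True, False]

sens, spec = sensitivity_specificity(diffs, status, cutoff=0.99)
```

In practice an "optimal" cut-off such as the 0.99°C reported above would be chosen by sweeping candidate cut-offs and trading off sensitivity against specificity (e.g. on an ROC curve).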
Abstract:
BACKGROUND Vitamin D deficiency is prevalent in HIV-infected individuals, and vitamin D supplementation is proposed according to standard care. This study aimed to characterize the kinetics of 25(OH)D in a cohort of HIV-infected individuals of European ancestry, to better define the influence of genetic and non-genetic factors on 25(OH)D levels. These data were used to optimize vitamin D supplementation in order to reach therapeutic targets. METHODS 1,397 25(OH)D plasma levels and relevant clinical information were collected from 664 participants during routine medical follow-up visits. Participants were genotyped for 7 SNPs in 4 genes known to be associated with 25(OH)D levels. 25(OH)D concentrations were analyzed using a population pharmacokinetic approach. The percentage of individuals with 25(OH)D concentrations within the recommended range of 20-40 ng/ml during 12 months of follow-up was evaluated by simulation for several dosage regimens. RESULTS A one-compartment model with linear absorption and elimination, integrating endogenous baseline plasma concentrations, was used to describe 25(OH)D pharmacokinetics. Covariate analyses confirmed the effect of seasonality, body mass index, smoking habits, the analytical method, darunavir/r and the genetic variant in GC (rs2282679) on 25(OH)D concentrations. 11% of the interindividual variability in 25(OH)D levels was explained by seasonality and other non-genetic covariates, and 1% by genetics. The optimal supplementation for severely vitamin D deficient patients was 300,000 IU twice per year. CONCLUSIONS This analysis identified factors associated with 25(OH)D plasma levels in HIV-infected individuals. Improvements to the dosage regimen and timing of vitamin D supplementation are proposed based on these results.
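A one-compartment model with linear (first-order) absorption and elimination plus an endogenous baseline, as described above, corresponds to the classical Bateman equation shifted by the pre-dose concentration. A minimal illustrative sketch, with invented parameter values rather than the study's population estimates:

```python
import math

def conc_25ohd(t, dose, ka, ke, vd, baseline):
    """One-compartment model, first-order absorption (ka) and elimination
    (ke), apparent volume vd, plus an endogenous baseline concentration.
    All parameter values used below are illustrative assumptions."""
    bateman = dose * ka / (vd * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))
    return baseline + bateman

# Hypothetical simulation: t in days after a single 300,000 IU dose; vd is
# an apparent volume chosen so values come out in concentration-like units.
profile = [conc_25ohd(t, dose=300_000, ka=0.5, ke=0.01, vd=5_000, baseline=15.0)
           for t in (0, 30, 60, 90, 120, 150)]
```

With a slow elimination rate, such a profile rises from baseline after the dose and then declines over months, which is the rationale for the infrequent high-dose regimen evaluated in the study.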
Abstract:
PURPOSE Therapeutic drug monitoring of patients receiving once-daily aminoglycoside therapy can be performed using pharmacokinetic (PK) formulas or Bayesian calculations. While these methods produce comparable results, their performance has never been checked against full PK profiles. We performed a PK study in order to compare both methods and to determine the best time-points for estimating AUC0-24 and peak concentrations (Cmax). METHODS We obtained full PK profiles in 14 patients receiving once-daily aminoglycoside therapy. PK parameters were calculated with PKSolver using non-compartmental methods. The calculated PK parameters were then compared with parameters estimated using an algorithm based on two serum concentrations (two-point method) or the software TCIWorks (Bayesian method). RESULTS For tobramycin and gentamicin, AUC0-24 and Cmax could be reliably estimated using a first serum concentration obtained at 1 h and a second one between 8 and 10 h after the start of the infusion. The two-point and the Bayesian method produced similar results. For amikacin, AUC0-24 could be reliably estimated by both methods; Cmax was underestimated by 10-20% by the two-point method and by up to 30%, with large variation, by the Bayesian method. CONCLUSIONS The ideal time-points for therapeutic drug monitoring of once-daily administered aminoglycosides are 1 h after the start of a 30-min infusion for the first time-point and 8-10 h after the start of the infusion for the second. The duration of the infusion and accurate registration of the time-points of blood drawing are essential for obtaining precise predictions.
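A two-point method of the kind described above rests on assuming mono-exponential (first-order) decline between the two samples. A simplified sketch of that idea — not the published algorithm; the function and the serum levels below are invented for illustration:

```python
import math

def two_point_pk(c1, t1, c2, t2, tau=24.0):
    """Estimate the elimination rate constant, a back-extrapolated peak and
    the AUC over one dosing interval (tau, hours) from two post-infusion
    serum levels, assuming mono-exponential decline."""
    ke = math.log(c1 / c2) / (t2 - t1)           # first-order rate (1/h)
    cmax = c1 * math.exp(ke * t1)                # back-extrapolate to t = 0
    auc = cmax / ke * (1 - math.exp(-ke * tau))  # mg*h/l over the interval
    return ke, cmax, auc

# Hypothetical gentamicin levels: 18 mg/l at 1 h and 2.5 mg/l at 9 h
ke, cmax, auc = two_point_pk(18.0, 1.0, 2.5, 9.0)
half_life = math.log(2) / ke  # elimination half-life in hours
```

The sensitivity of such estimates to the recorded sampling times t1 and t2 is why the abstract stresses accurate registration of the blood-drawing time-points.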
Abstract:
The consumption of immunoglobulins (Ig) is increasing due to better recognition of antibody deficiencies, an aging population, and new indications. This review aims to examine the various dosing regimens and research developments in the established and in some of the relevant off-label indications in Europe. The background to the current regulatory settings in Europe is provided as a backdrop for the latest developments in primary and secondary immunodeficiencies and in immunomodulatory indications. In these heterogeneous areas, clinical trials encompassing different routes of administration, varying intervals, and infusion rates are paving the way toward more individualized therapy regimens. In primary antibody deficiencies, adjustments in dosing and intervals will depend on the clinical presentation, effective IgG trough levels and IgG metabolism. Ideally, individual pharmacokinetic profiles in conjunction with the clinical phenotype could lead to highly tailored treatment. In practice, incremental dosage increases are necessary to titrate the optimal dose for more severely ill patients. Higher intravenous doses in these patients also have beneficial immunomodulatory effects beyond mere IgG replacement. Better understanding of the pharmacokinetics of Ig therapy is leading to a move away from simplistic "per kg" dosing. Defective antibody production is common in many secondary immunodeficiencies irrespective of whether the causative factor was lymphoid malignancies (established indications), certain autoimmune disorders, immunosuppressive agents, or biologics. This antibody failure, as shown by test immunization, may be amenable to treatment with replacement Ig therapy. In certain immunomodulatory settings [e.g., idiopathic thrombocytopenic purpura (ITP)], selection of patients for Ig therapy may be enhanced by relevant biomarkers in order to exclude non-responders and thus obtain higher response rates. 
In this review, the developments in dosing of therapeutic immunoglobulins have been limited to high- and some medium-priority indications such as ITP, Kawasaki disease, Guillain-Barré syndrome, chronic inflammatory demyelinating polyradiculoneuropathy, myasthenia gravis, multifocal motor neuropathy, fetal alloimmune thrombocytopenia, fetal hemolytic anemia, and dermatological diseases.
Abstract:
Antibody-drug conjugates (ADCs) have emerged as a promising class of anticancer agents, combining the specificity of antibodies for tumor targeting and the destructive potential of highly potent drugs as payload. An essential component of these immunoconjugates is a bifunctional linker capable of reacting with the antibody and the payload to assemble a functional entity. Linker design is fundamental, as it must provide high stability in the circulation to prevent premature drug release, but be capable of releasing the active drug inside the target cell upon receptor-mediated endocytosis. Although ADCs have demonstrated an increased therapeutic window compared to conventional chemotherapy in recent clinical trials, therapeutic success rates are still far from optimal. To explore other regimes of half-life variation and drug conjugation stoichiometries, it is necessary to investigate additional binding proteins which offer access to a wide range of formats, all with molecularly defined drug conjugation. Here, we delineate recent progress with site-specific and bioorthogonal conjugation chemistries, and discuss alternative, biophysically more stable protein scaffolds like Designed Ankyrin Repeat Proteins (DARPins), which may provide such additional engineering opportunities for drug conjugates with improved pharmacological performance.
Abstract:
The aim of this study was to compare different bacterial models for in vitro induction of non-cavitated enamel caries-like lesions by microhardness and polarized light microscopy analyses. One hundred blocks of bovine enamel were randomly divided into four groups (n = 25) according to the bacterial model for caries induction: (A) Streptococcus mutans, (B) S. mutans and Lactobacillus acidophilus, (C) S. mutans and L. casei, and (D) S. mutans, L. acidophilus, and L. casei. Within each group, the blocks were randomly divided into five subgroups according to the duration of the period of caries induction (4-20 days). The enamel blocks were immersed in cariogenic solution containing the microorganisms, which was changed every 48 h. Groups C and D presented lower surface hardness values (SMH) and higher area of hardness loss (ΔS) after the cariogenic challenge than groups A and B (P < 0.05). As regards lesion depth, under polarized light microscopy, group A presented significantly lower values, and groups C and D the highest values. Group B showed a higher value than group A (P < 0.05). Groups A and B exhibited subsurface caries lesions after all treatment durations, while groups C and D presented erosion-type lesions with surface softening. The model using S. mutans, whether or not it was associated with L. acidophilus, was less aggressive and may be used for the induction of non-cavitated enamel caries-like lesions. The optimal period for inducing caries-like lesions was 8 days.
Abstract:
BACKGROUND Endometriosis, the growth of endometrial tissue outside the uterine cavity, is associated with chronic pelvic pain, subfertility and an increased risk of ovarian cancer. Current treatments include the surgical removal of the lesions or the induction of a hypoestrogenic state. However, reappearance of the lesions after surgery is common, and a hypoestrogenic state is less than optimal for women of reproductive age. Additional approaches are required. Endometriosis lesions exist in a unique microenvironment characterized by increased concentrations of hormones, inflammation, oxidative stress and iron. This environment influences cell survival through the binding of membrane receptors and a subsequent cascading activation of intracellular kinases that stimulate a cellular response. Many of these kinase signalling pathways are constitutively activated in endometriosis. These pathways are being investigated as therapeutic targets in other diseases and thus may also represent a target for endometriosis treatment. METHODS To identify relevant English-language studies published up to 2015 on kinase signalling pathways in endometriosis, we searched the PubMed database using the following search terms in various combinations: 'endometriosis', 'inflammation', 'oxidative stress', 'iron', 'kinase', 'NF kappa', 'mTOR', 'MAPK', 'p38', 'JNK', 'ERK', 'estrogen' and 'progesterone'. Further citing references were identified using the Scopus database, and current clinical trials were searched on the clinicaltrials.gov trial registry. RESULTS The current literature on intracellular kinases activated by the endometriotic environment can be summarized into three main pathways that could be targeted for treatment: the canonical IKKβ/NFκB pathway, the MAPK pathways (ERK1/2, p38 and JNK) and the PI3K/AKT/mTOR pathway.
A number of pharmaceutical compounds that target these pathways have been successfully trialled in in vitro and animal models of endometriosis, although they have not yet proceeded to clinical trials. The current generation of kinase inhibitors carries a potential for adverse side effects. CONCLUSIONS Kinase signalling pathways represent viable targets for endometriosis treatment. At present, however, further improvements in clinical efficacy and in the profile of adverse effects are required before these compounds can be useful for long-term endometriosis treatment. A better understanding of the molecular activity of these kinases, including the specific extracellular signals that lead to their activation in endometriotic cells, should facilitate their improvement and could potentially lead to new, non-hormonal treatments of endometriosis.