29 results for Time-Driven Activity-Based Costing (TDABC)
Abstract:
An assay for the simultaneous determination of the enantiomers of hydroxymebendazole (OH-MBZ) and hydroxyaminomebendazole (OH-AMBZ), together with aminomebendazole (AMBZ), in human plasma is described for the first time. It is based upon liquid-liquid extraction at alkaline pH from 0.5 mL plasma, followed by analysis of the reconstituted extract by CE with reversed polarity in the presence of a 50 mM, pH 4.2 acetate buffer containing 15 mg/mL sulfated beta-CD as chiral selector. For all compounds, detection limits are between 0.01 and 0.04 microg/mL, and intraday and interday precisions evaluated from peak area ratios are <6.9 and <8.5%, respectively. Analysis of 39 samples from echinococcosis patients undergoing pharmacotherapy with mebendazole (MBZ) revealed that the ketoreduction of MBZ and AMBZ is highly stereoselective: only one enantiomer of each metabolite (the first-detected peak in both cases) could be detected. The CE data revealed that OH-MBZ (mean: 0.715 microg/mL) is the major metabolite, followed by AMBZ (mean: 0.165 microg/mL) and OH-AMBZ (mean: 0.055 microg/mL), whereas the MBZ plasma levels (mean: 0.096 microg/mL, determined by HPLC) were between those of AMBZ and OH-AMBZ.
Abstract:
The demands of developing modern, highly dynamic applications have led to an increasing interest in dynamic programming languages and mechanisms. Not only must applications evolve over time, but the object models themselves may need to be adapted to the requirements of different run-time contexts. Class-based models and prototype-based models, for example, may need to co-exist to meet the demands of dynamically evolving applications. Multi-dimensional dispatch, fine-grained and dynamic software composition, and run-time evolution of behaviour are further examples of diverse mechanisms which may need to co-exist in a dynamically evolving run-time environment. How can we model the semantics of these highly dynamic features, yet still offer some reasonable safety guarantees? To this end we present an original calculus in which objects can adapt their behaviour at run-time. Both objects and environments are represented by first-class mappings between variables and values. Message sends are dynamically resolved to method calls. Variables may be dynamically bound, making it possible to model a variety of dynamic mechanisms within the same calculus. Despite the highly dynamic nature of the calculus, safety properties are assured by a type assignment system.
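The core ideas of the calculus — objects and environments as first-class mappings from variables to values, with message sends resolved dynamically at call time — can be sketched informally as follows. This is an illustration in Python, not the paper's formal calculus, and the names used are hypothetical:

```python
# Objects and environments as first-class mappings from names to values;
# a message send is resolved to a method by dynamic lookup in the receiver.

def send(obj, msg, *args):
    """Resolve a message to a method by dynamic lookup at call time."""
    method = obj[msg]              # the lookup result may change over time
    return method(obj, *args)

# A prototype-style object is just a mapping.
point = {
    "x": 1,
    "getX": lambda self: self["x"],
}
print(send(point, "getX"))         # 1

# Run-time adaptation: rebind behaviour in place, then resend.
point["getX"] = lambda self: self["x"] * 10
print(send(point, "getX"))         # 10
```

Because lookup happens at each send, rebinding a variable changes the behaviour of subsequent sends, which is the kind of run-time adaptation the calculus is designed to model.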
Abstract:
A lack of quantitative high resolution paleoclimate data from the Southern Hemisphere limits the ability to examine current trends within the context of long-term natural climate variability. This study presents a temperature reconstruction for southern Tasmania based on analyses of a sediment core from Duckhole Lake (43.365°S, 146.875°E). The relationship between non-destructive whole-core scanning reflectance spectroscopy measurements in the visible spectrum (380–730 nm) and the instrumental temperature record (ad 1911–2000) was used to develop a calibration-in-time reflectance spectroscopy-based temperature model. Results showed that a trough in reflectance from 650 to 700 nm, which represents chlorophyll and its derivatives, was significantly correlated with annual mean temperature. A calibration model was developed (R = 0.56, p < 0.05, root mean squared error of prediction (RMSEP) = 0.21°C, five-year filtered data, calibration period 1911–2000) and applied down-core to reconstruct annual mean temperatures in southern Tasmania over the last c. 950 years. This indicated that temperatures were initially cool c. ad 1050, but steadily increased until the late ad 1100s. After a brief cool period in the ad 1200s, temperatures again increased. Temperatures steadily decreased during the ad 1600s and remained relatively stable until the start of the 20th century, when they rapidly decreased before increasing from the ad 1960s onwards. Comparisons with high resolution temperature records from western Tasmania, New Zealand and South America revealed some similarities, but also highlighted differences in temperature variability across the mid-latitudes of the Southern Hemisphere. These are likely due to a combination of factors, including the spatial variability in climate between and within regions, and differences between records that document seasonal (i.e. warm season/late summer) versus annual temperature variability.
This highlights the need for further records from the mid-latitudes of the Southern Hemisphere in order to constrain past natural spatial and seasonal/annual temperature variability in the region, and to accurately identify and attribute changes to natural variability and/or anthropogenic activities.
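The calibration-in-time approach described above amounts to regressing instrumental temperature on a reflectance index over the overlap period, then applying the fitted model down-core. A minimal sketch, using synthetic stand-in data rather than the Duckhole Lake measurements:

```python
import numpy as np

# Synthetic stand-in data: a reflectance index (e.g. depth of the
# 650-700 nm trough) and an instrumental annual mean temperature series.
rng = np.random.default_rng(0)
reflectance_index = rng.uniform(0.2, 0.8, 90)
temperature = 10.0 + 2.5 * reflectance_index + rng.normal(0.0, 0.2, 90)

# Calibration in time: fit T = a * index + b over the instrumental period.
a, b = np.polyfit(reflectance_index, temperature, 1)
predicted = a * reflectance_index + b
rmse = np.sqrt(np.mean((temperature - predicted) ** 2))  # calibration error
r = np.corrcoef(reflectance_index, temperature)[0, 1]

# Down-core application: reconstruct past temperature from older layers.
downcore_index = np.array([0.3, 0.5, 0.7])
reconstruction = a * downcore_index + b
```

In practice the published RMSEP would come from cross-validation rather than the in-sample residuals computed here.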
Abstract:
High-content screening led to the identification of the N-isobutylamide guineensine from Piper nigrum as a novel nanomolar inhibitor (EC50 = 290 nM) of cellular uptake of the endocannabinoid anandamide (AEA). Notably, guineensine did not inhibit the endocannabinoid-degrading enzymes fatty acid amide hydrolase (FAAH) or monoacylglycerol lipase (MAGL), nor did it interact with cannabinoid receptors or fatty acid binding protein 5 (FABP5), a major cytoplasmic AEA carrier. Activity-based protein profiling showed no inhibition of serine hydrolases. Guineensine also inhibited the cellular uptake of 2-arachidonoylglycerol (2-AG). Preliminary structure–activity relationships between natural guineensine analogs indicate the importance of the alkyl chain length interconnecting the pharmacophoric isobutylamide and benzodioxol moieties for AEA cellular uptake inhibition. Guineensine dose-dependently induced cannabimimetic effects in BALB/c mice, shown by strong catalepsy, hypothermia, reduced locomotion and analgesia. The catalepsy and analgesia were blocked by the CB1 receptor antagonist rimonabant (SR141716A). Guineensine is a novel plant natural product which specifically inhibits endocannabinoid uptake in different cell lines independently of FAAH. Its scaffold may be useful to identify as yet unknown targets involved in endocannabinoid transport.
Abstract:
Three field isolates of small ruminant lentiviruses (SRLVs) were derived from a mixed flock of goats and sheep certified for many years as free of caprine arthritis encephalitis virus (CAEV). Phylogenetic analysis of pol sequences permitted these isolates to be classified as subtype A4. None of the animals showed clinical signs of SRLV infection, confirming previous observations suggesting that this particular subtype is highly attenuated, at least for goats. A quantitative real-time PCR strategy based on primers and probes derived from a highly variable env region permitted us to classify the animals as uninfected, singly or doubly infected. The performance of different serological tools evaluated against this classification revealed their profound inadequacy in monitoring animals infected with this particular SRLV subtype. In vitro, the isolates showed differences in their cytopathicity and a tendency to replicate more efficiently in goat than in sheep cells, especially in goat macrophages. By contrast, in vivo, these viruses reached significantly higher viral loads in sheep than in goats. Both env subtypes infected goats and sheep with equal efficiency; one of them, however, reached significantly higher viral loads in both species. In conclusion, we characterized three isolates of the SRLV subtype A4 that circulate efficiently in a mixed herd of goats and sheep in spite of their apparent attenuation and a strict physical separation between goats and sheep. The poor performance of the serological tools applied indicates that, to support an SRLV eradication campaign, it is imperative to develop novel, subtype-specific tools.
Abstract:
We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the e⁺e⁻ → 3π cross section, generalizing previous studies on ω, ϕ → 3π decays and γπ → ππ scattering, and verify our result by comparing to e⁺e⁻ → π⁰γ data. We perform the analytic continuation to the space-like region, predicting the poorly-constrained space-like transition form factor below 1 GeV, and extract the slope of the form factor at vanishing momentum transfer, a_π = (30.7 ± 0.6) × 10⁻³. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon.
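For clarity, the slope parameter quoted above is conventionally defined via the low-energy expansion of the transition form factor in the momentum transfer, normalized to the neutral pion mass (this is the standard convention; the abstract itself does not spell it out):

```latex
F_{\pi^0\gamma^*\gamma}(q^2) \;=\;
F_{\pi^0\gamma^*\gamma}(0)\,
\Bigl(1 \;+\; a_\pi\,\frac{q^2}{M_{\pi^0}^2} \;+\; \cdots\Bigr)
```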
Abstract:
The field of animal syndromic surveillance (SyS) is growing, with many systems being developed worldwide. Now is an appropriate time to share ideas and lessons learned from early SyS design and implementation. Based on our practical experience in animal health SyS, with additions from the public health and animal health SyS literature, we put forward for discussion a 6-step approach to designing SyS systems for livestock and poultry. The first step is to formalise policy and surveillance goals that take stakeholder expectations into account and reflect priority issues (1). Next, it is important to find consensus on national priority diseases and identify current surveillance gaps; the geographic, demographic, and temporal coverage of the system must be carefully assessed (2). A minimum dataset for SyS should be defined that includes the essential data to achieve all surveillance objectives while minimizing the amount of data collected. One can then compile an inventory of the data sources available and evaluate each using the criteria developed (3). A list of syndromes should then be produced for all data sources; cases can be classified into syndrome classes and the data converted into time series (4). Based on the characteristics of the syndrome time series, the length of historic data available and the type of outbreaks the system must detect, different aberration detection algorithms can be tested (5). Finally, it is essential to develop a minimally acceptable response protocol for each statistical signal produced (6). Important outcomes of this pre-operational phase should be the building of a national network of experts and collective action and evaluation plans. While some of the more applied steps (4 and 5) are currently receiving consideration, more emphasis should be put on the earlier conceptual steps (1-3) by decision makers and surveillance developers.
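Steps (4) and (5) — classifying cases into syndrome classes, converting them into time series, and testing aberration detection — can be sketched as follows. The syndrome mapping and the threshold rule here are illustrative assumptions, not the authors' specification:

```python
import statistics
from collections import Counter

# Hypothetical mapping from reported clinical signs to syndrome classes.
SYNDROME_MAP = {
    "cough": "respiratory",
    "nasal discharge": "respiratory",
    "diarrhoea": "digestive",
    "lameness": "locomotor",
}

def to_time_series(cases):
    """cases: iterable of (day, sign) -> {syndrome: Counter of daily counts}."""
    series = {}
    for day, sign in cases:
        syndrome = SYNDROME_MAP.get(sign, "other")
        series.setdefault(syndrome, Counter())[day] += 1
    return series

def flag_aberrations(counts, window=7, k=3.0):
    """Flag day t when its count exceeds mean + k*sd of the preceding window."""
    flags = []
    for t in range(window, len(counts)):
        history = counts[t - window:t]
        mu = statistics.mean(history)
        sd = statistics.pstdev(history) or 1.0  # guard against flat baselines
        if counts[t] > mu + k * sd:
            flags.append(t)
    return flags

# A quiet baseline week followed by a spike on day 7 is flagged.
print(flag_aberrations([3, 4, 2, 3, 4, 3, 2, 15]))  # [7]
```

Real systems would test several algorithms (e.g. regression- or control-chart-based) against the properties of each syndrome series, as the abstract notes.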
Abstract:
Consumers are often less satisfied with a product chosen from a large assortment than from a limited one. Experienced choice difficulty presumably causes this, as consumers must engage in a great number of individual comparisons. In two studies, we tested whether partitioning the choice task, so that consumers decide sequentially on each individual attribute, may provide a solution. In a Starbucks coffee house, consumers who chose from the menu rated the coffee as less tasty when it was chosen from a large rather than a small assortment. However, when consumers chose it by deciding sequentially about one attribute at a time, the effect reversed. In a tailored-suit customization task, consumers who chose multiple attributes at a time were less satisfied with their suit compared with those who chose one attribute at a time. Sequential attribute-based processing thus proves to be an effective strategy to reap the benefits of a large assortment.
Abstract:
Intracellular schizonts of the apicomplexans Theileria annulata and Theileria parva immortalize bovine leucocytes, thereby causing fatal immunoproliferative diseases. Buparvaquone, a hydroxynaphthoquinone related to parvaquone, is the only drug available against Theileria. The drug is only effective at the onset of infection, and emerging resistance underlines the need to identify alternative compounds. Current drug assays monitor the proliferation of infected cells, with apoptosis of the infected host cell as a read-out, but it is often unclear whether active compounds directly impair the viability of the parasite or primarily induce host cell death. We here report on the development of a quantitative reverse transcriptase real-time PCR method based on two Theileria genes, tasp and tap104, which are both expressed in schizonts. Upon in vitro treatment of T. annulata-infected bovine monocytes with buparvaquone, TaSP and Tap104 mRNA expression levels decreased significantly in relation to host cell actin within as little as 4 h of drug exposure, while significant differences in host cell proliferation were detectable only after 48-72 h. TEM revealed marked alterations of the schizont ultrastructure after only 2 h of buparvaquone treatment, while the host cell remained unaffected. Expression of TaSP and Tap104 proteins showed a marked decrease only after 24 h. Therefore, analysis of the expression levels of mRNA coding for TaSP and Tap104 allows direct measurement of the impairment of parasite viability. We subsequently applied this method to a series of compounds affecting different targets in other apicomplexan parasites, and show that monitoring of TaSP and Tap104 mRNA levels constitutes a suitable tool for anti-theilerial drug development.
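The normalization of parasite transcripts to host cell actin can be illustrated with the standard 2^-ΔΔCt relative-quantification method (Livak); the abstract does not state that exactly this formula was used, and the Ct values below are invented for illustration:

```python
# Standard 2^-ΔΔCt relative quantification: a parasite target transcript
# (e.g. tasp) normalized to a host reference gene (actin), comparing a
# drug-treated sample against an untreated control.

def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Fold change of the target transcript, treated vs control."""
    delta_treated = ct_target_treated - ct_ref_treated
    delta_control = ct_target_control - ct_ref_control
    return 2.0 ** -(delta_treated - delta_control)

# Target Ct rises by 2 cycles under drug while actin stays flat:
# ~4-fold less target mRNA relative to the host reference.
print(relative_expression(24.0, 18.0, 22.0, 18.0))  # 0.25
```

A falling ratio like this, appearing within hours of treatment, is the signal the authors interpret as direct impairment of parasite viability.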
Abstract:
The paralysis-by-analysis phenomenon, i.e., that attending to the execution of one's movement impairs performance, has attracted considerable attention over recent years (see Wulf, 2007, for a review). Explanations of this phenomenon, e.g., the hypotheses of constrained action (Wulf et al., 2001) or of step-by-step execution (Masters, 1992; Beilock et al., 2002), do not, however, address the underlying mechanisms at the level of sensorimotor control. For this purpose, a “nodal-point hypothesis” is presented here, with the core assumptions that skilled motor behavior is internally based on sensorimotor chains of nodal points, that attending to intermediate nodal points leads to a muscular re-freezing of the motor system at exactly and exclusively these points in time, and that this re-freezing is accompanied by the disruption of compensatory processes, resulting in an overall decrease of motor performance. Two experiments, on lever sequencing and basketball free throws, respectively, are reported that successfully tested these time-referenced predictions, i.e., showing that muscular activity is selectively increased and compensatory variability selectively decreased at movement-related nodal points if these points are in the focus of attention.
Abstract:
The promise of search-driven development is that developers will save time and resources by reusing external code in their local projects. To integrate this code efficiently, users must be able to trust it; the trustability of code search results is thus just as important as their relevance. In this paper, we introduce a trustability metric to help users assess the quality of code search results and thereby ease the cost-benefit analysis they undertake when trying to find suitable integration candidates. The proposed trustability metric incorporates both user votes and the cross-project activity of developers to calculate a "karma" value for each developer. Through the karma values of all its developers, a project is ranked on a trustability scale. We present JBENDER, a proof-of-concept code search engine which implements our trustability metric, and we discuss preliminary results from an evaluation of the prototype.
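The kind of metric described — per-developer karma from user votes and cross-project activity, aggregated into a project-level trustability score — might be sketched as follows. The weights and the mean aggregation are assumptions for illustration, not JBENDER's actual formula:

```python
# Hypothetical karma/trustability sketch; weights are assumed, not JBENDER's.

def karma(upvotes, downvotes, projects_contributed,
          vote_weight=1.0, activity_weight=0.5):
    """Per-developer karma from user votes and cross-project activity."""
    return (vote_weight * (upvotes - downvotes)
            + activity_weight * projects_contributed)

def project_trustability(developers):
    """developers: list of (upvotes, downvotes, projects) tuples."""
    scores = [karma(u, d, p) for u, d, p in developers]
    return sum(scores) / len(scores)

# Two developers with karma 10.0 and 3.5; project score is their mean.
print(project_trustability([(10, 2, 4), (3, 0, 1)]))  # 6.75
```

Ranking search results by such a score alongside relevance would let users weigh trust against fit during integration.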
Abstract:
OBJECTIVES: The disease alveolar echinococcosis (AE), caused by the larval stage of the cestode Echinococcus multilocularis, is fatal if treatment is unsuccessful. Current treatment options are, at best, parasitostatic, and involve taking benzimidazoles (albendazole, mebendazole) for the whole of a patient's life. In conjunction with the recent development of optimized procedures for E. multilocularis metacestode cultivation, we aimed to develop a rapid and reliable drug screening test, which enables efficient screening of a large number of compounds in a relatively short time frame. METHODS: Metacestodes were treated in vitro with albendazole, the nitro-thiazole nitazoxanide and 29 nitazoxanide derivatives. The resulting leakage of phosphoglucose isomerase (PGI) activity into the medium supernatant was measured and provided an indication of compound efficacy. RESULTS: We show that upon in vitro culture of E. multilocularis metacestodes in the presence of active drugs such as albendazole, the nitro-thiazole nitazoxanide and 30 different nitazoxanide derivatives, the activity of PGI in culture supernatants increased. The increase in PGI activity correlated with the progressive degeneration and destruction of metacestode tissue in a time- and concentration-dependent manner, which allowed us to perform a structure-activity relationship analysis on the thiazolide compounds used in this study. CONCLUSIONS: The assay presented here is inexpensive, rapid, can be used in 24- and 96-well formats and will serve as an ideal tool for first-round in vitro tests on the efficacy of large numbers of antiparasitic compounds.
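The PGI-leakage read-out lends itself to a simple screening rule: flag compounds whose supernatant PGI activity exceeds the untreated control by some factor. The threshold, the activity values, and the compound names below are invented for illustration and are not from the study:

```python
# Illustrative screen on a PGI-release read-out; values are invented.

def screen(pgi_activity, control, threshold=2.0):
    """Return compounds whose supernatant PGI activity is at least
    `threshold`-fold above the untreated control."""
    return [name for name, activity in pgi_activity.items()
            if activity / control >= threshold]

hits = screen({"albendazole": 5.1, "nitazoxanide": 6.3, "compound-X": 1.2},
              control=2.0)
print(hits)  # ['albendazole', 'nitazoxanide']
```

Because the read-out is a plate-compatible enzyme assay, such a rule scales naturally to the 24- and 96-well formats the authors mention.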
Abstract:
BACKGROUND Patients with downbeat nystagmus syndrome suffer from oscillopsia, which leads to an unstable visual perception and therefore impaired visual acuity. The aim of this study was to use real-time computer-based visual feedback to compensate for the destabilizing slow-phase eye movements. METHODS The patients sat in front of a computer screen with the head fixed on a chin rest. Eye movements were recorded by an eye tracking system (EyeSeeCam®). We tested visual acuity with a fixed Landolt C (static condition) and during a real-time feedback-driven condition (dynamic) in straight-ahead gaze and in 20° sideward gaze. In the dynamic condition, the Landolt C moved according to the slow-phase eye velocity of the downbeat nystagmus. The Shapiro-Wilk test was used to test for normal distribution, and one-way ANOVA for comparisons. RESULTS Ten patients with downbeat nystagmus were included in the study. Median age was 76 years and the median duration of symptoms was 6.3 years (SD +/- 3.1 y). The mean slow-phase velocity was moderate during straight-ahead gaze (1.44°/s, SD +/- 1.18°/s) and increased significantly in sideward gaze (mean left 3.36°/s; right 3.58°/s). In straight-ahead gaze, we found no difference between the static and the feedback-driven condition. In sideward gaze, visual acuity improved in five out of ten subjects during the feedback-driven condition (p = 0.043). CONCLUSIONS This study provides proof of concept that non-invasive real-time computer-based visual feedback compensates for the slow-phase velocity (SPV) in downbeat nystagmus (DBN). Real-time visual feedback may therefore be a promising aid for patients suffering from oscillopsia and impaired text reading on screen. Recent technological advances in the area of virtual reality displays might soon render this approach feasible in fully mobile settings.
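The feedback principle — shifting the optotype each display frame by the eye's measured slow-phase displacement so that its retinal image stays stable — can be sketched as follows. The frame rate and velocity values are illustrative, not the study's parameters:

```python
# Each display frame, move the Landolt C with the measured slow-phase
# eye velocity so the stimulus tracks the drifting eye.

def update_stimulus(position_deg, slow_phase_velocity_deg_s, dt_s):
    """Shift the optotype by the eye's slow-phase displacement this frame."""
    return position_deg + slow_phase_velocity_deg_s * dt_s

dt = 1.0 / 60.0            # 60 Hz display refresh
position = 0.0
for _ in range(60):        # one second of feedback at 3 deg/s slow phase
    position = update_stimulus(position, 3.0, dt)
print(round(position, 6))  # 3.0
```

In the real system the velocity input would come from the eye tracker each frame, and quick phases (the fast resets of the nystagmus) would not be fed back.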