90 results for Pseudo Andócides


Relevance:

10.00%

Publisher:

Abstract:

The interaction of quercetin, a bioflavonoid, with bovine serum albumin (BSA) was investigated under pseudo-physiological conditions by UV-vis spectrometry, spectrofluorimetry and cyclic voltammetry (CV). These studies indicated a cooperative interaction between the quercetin-BSA complex and warfarin, which produced a ternary complex, quercetin-BSA-warfarin. Both quercetin and warfarin were found to be located in site I. However, the spectra of the three components overlapped, and a chemometrics method, multivariate curve resolution-alternating least squares (MCR-ALS), was applied to resolve them. The resolved spectra of quercetin-BSA and warfarin agreed well with their measured spectra, and, importantly, the spectrum of the quercetin-BSA-warfarin complex was extracted. These results allowed the behaviour of the overlapping spectra to be rationalized. At lower concentrations ([warfarin] < 1 × 10^-5 mol L^-1), most of the site marker reacted with quercetin-BSA, but free warfarin was present at higher concentrations. Interestingly, the ratio between quercetin-BSA and warfarin was found to be 1:2, suggesting a quercetin-BSA-(warfarin)2 complex, with an estimated equilibrium constant of 1.4 × 10^11 M^-2. The results suggest that at low concentrations warfarin binds at the high-affinity sites (HAS), while the low-affinity sites (LAS) are occupied at higher concentrations.
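The reported 1:2 stoichiometry and overall constant K = 1.4 × 10^11 M^-2 imply a simple saturation relationship for the ternary complex. A minimal sketch, assuming a single overall equilibrium and a known free warfarin concentration (simplifications, not details taken from the abstract):

```python
# Fraction of quercetin-BSA converted to quercetin-BSA-(warfarin)2,
# assuming one overall equilibrium with the reported constant.
K = 1.4e11  # M^-2, overall equilibrium constant reported in the study

def fraction_ternary(warfarin_free):
    """Fraction of quercetin-BSA present as the ternary complex
    at a given free warfarin concentration (mol/L)."""
    x = K * warfarin_free ** 2
    return x / (1.0 + x)

# Near the reported threshold of 1e-5 mol/L the complex dominates:
print(f"{fraction_ternary(1e-5):.2f}")  # ~0.93
```

The computed fraction rises steeply around 10^-5 mol/L, matching the concentration regime discussed in the abstract.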

Relevance:

10.00%

Publisher:

Abstract:

Using an in situ-generated calcium-based initiating species derived from pentaerythritol, the bulk synthesis of well-defined 4-arm star poly(L-lactide) oligomers has been studied in detail. Substituting the traditional initiator, stannous octoate, with calcium hydride allowed the synthesis of oligomers that had both low PDIs and a number of polymeric arms (3.7-3.9) comparable to oligomers of similar molecular weight. Investigations into the degree of control observed during the polymerization found that the insolubility of pentaerythritol in molten L-lactide resulted in an uncontrolled polymerization only when the feed mole ratio of L-lactide to pentaerythritol was 13. At feed ratios of 40 and greater, a pseudo-living polymerization was observed. As part of this study, in situ FT-Raman spectroscopy was demonstrated to be a suitable method for monitoring the kinetics of the ring-opening polymerization (ROP) of lactide. The advantages of this technique over FT-IR-ATR and 1H NMR for monitoring L-lactide consumption during polymerization are discussed.
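A pseudo-living polymerization implies first-order consumption of monomer, so ln([M]0/[M]) grows linearly with time; this is the kind of check that in situ kinetic data would support. A minimal sketch with a purely illustrative apparent rate constant (the value is an assumption, not from the study):

```python
import math

# For a pseudo-living ROP, monomer consumption is first order, so
# ln([M]0/[M]) = k_app * t should be linear in time. k_app is hypothetical.
k_app = 0.05  # min^-1, illustrative apparent rate constant (assumption)

times = [0, 10, 20, 30, 40]                      # min
monomer = [math.exp(-k_app * t) for t in times]  # [M]/[M]0, ideal data

# First-order check: over equal time steps, successive differences of
# ln([M]0/[M]) should be constant (= k_app * step).
lnvals = [math.log(1.0 / m) for m in monomer]
diffs = [b - a for a, b in zip(lnvals, lnvals[1:])]
print(diffs)  # each ~0.5 for a 10 min step
```

With real Raman-derived conversion data, curvature in this plot would flag loss of control of the polymerization.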

Relevance:

10.00%

Publisher:

Abstract:

There is little evidence that workshops alone have a lasting impact on the day-to-day practice of participants. The current paper examined a strategy to increase the generalization and maintenance of skills in the natural environment, using pseudo-patients and immediate performance feedback to reinforce skills acquisition. A random half of pharmacies (N = 30) took part in workshop training aimed at optimizing consumers' use of nonprescription analgesic products. Pharmacies in the training group also received performance feedback on their adherence to the recommended protocol. Feedback occurred immediately after a pseudo-patient visit in which confederates posed as purchasers of analgesics, and combined positive and corrective elements. Trained pharmacists were significantly more accurate at identifying people who misused the medication (P < 0.001). The trained pharmacists were more likely than controls to use open-ended questions (P < 0.001), to assess readiness to change problematic use (P < 0.001), and to deliver a brief intervention tailored to the person's commitment to alter his or her usage (P < 0.001). Participants responded positively to the feedback. Results were consistent with the hypothesis that combining workshop training with on-site performance feedback enhances practitioners' adherence to protocols in the natural setting.

Relevance:

10.00%

Publisher:

Abstract:

This paper examines performances that defy established representations of disease, deformity and bodily difference. Historically, the ‘deformed’ body has been cast – onstage and in sideshows – as flawed, an object of pity, or an example of the human capacity to overcome. Such representations define the boundaries of the ‘normal’ body by displaying its Other. They bracket the ‘abnormal’ body off as an example of deviance from the ‘norm’, thus, paradoxically, decreasing the social and symbolic visibility (and agency) of disabled people. Yet, in contemporary theory and culture, these representations are reappropriated – by disabled artists, certainly, but also as what Carrie Sandahl has called a ‘master trope’ for representing a range of bodily differences. In this paper, I investigate this phenomenon. I analyse French Canadian choreographer Marie Chouinard’s bODY rEMIX/gOLDBERG vARIATIONS, in which 10 able-bodied dancers are reborn as bizarre biotechnical mutants via the use of crutches, walkers, ballet shoes and barres as prosthetic pseudo-organs. These bodies defy boundaries, defy expectations, develop new modes of expression, and celebrate bodily difference. The self-inflicted pain dancers experience during training is cast as a ‘disablement’ that is ultimately ‘enabling’. I ask what effect encountering able bodies celebrating ‘dis’ or ‘diff’ ability has on audiences. Do we see the emergence of a once-repressed Other, no longer silenced, censored or negated? Or does using ‘disability’ to express the dancers’ difference and self-determination usurp a ‘trope’ by which disabled people themselves might speak back to the dominant culture, creating further censorship?

Relevance:

10.00%

Publisher:

Abstract:

Industrial applications of simulated-moving-bed (SMB) chromatographic technology have brought an emergent demand to improve SMB process operation for higher efficiency and better robustness. Improved process modelling and more efficient model computation will pave a path to meet this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One significant problem is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of the quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of continuous switchings. An innovative QE computation scheme is then developed to quickly obtain the steady-state solution of an SMB model from any arbitrary initial condition. The QE computation scheme allows larger steps to be taken to predict the slow change of the starting state within each switching. Combined with a wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar-separation process. Investigations are also carried out on when the computation scheme should be activated and how its convergence is affected by a variable step size.
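The quasi-envelope idea can be illustrated on a toy cycle map: the state observed after each switching evolves by x_{k+1} = f(x_k), and its slow drift approximately obeys the envelope ODE dy/dk = f(y) - y, which can be integrated with steps spanning many switchings. The contraction map below is a stand-in for one SMB switch, not an SMB model:

```python
# Toy sketch of the quasi-envelope (QE) idea. The per-switch map f is a
# slow linear contraction toward x* = 5 (illustrative, not an SMB model).

def f(x):
    return 0.98 * x + 0.02 * 5.0  # one "switching" of the process

# Conventional approach: iterate cycle by cycle to the steady state.
x, cycles = 0.0, 0
while abs(f(x) - x) > 1e-8:
    x, cycles = f(x), cycles + 1

# QE approach: explicit Euler on dy/dk = f(y) - y with step H >> 1 cycles.
y, H, qe_steps = 0.0, 25.0, 0
while abs(f(y) - y) > 1e-8:
    y, qe_steps = y + H * (f(y) - y), qe_steps + 1

print(cycles, qe_steps)  # same steady state, far fewer QE steps
```

Both iterations converge to the same fixed point, but the QE stepping reaches it in a small fraction of the cycle count, which is the efficiency argument made for the scheme.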

Relevance:

10.00%

Publisher:

Abstract:

The crystal structure of the modified, unsymmetrically N,N′-substituted viologen chromophore N-ethyl-N′-(2-phosphonoethyl)-4,4′-bipyridinium dichloride 0.75-hydrate (1) has been determined. Crystals are triclinic, space group P-1, with Z = 2 in a cell with a = 7.2550(1), b = 13.2038(5), c = 18.5752(7) Å, α = 86.495(3), β = 83.527(2), γ = 88.921(2)°. The two independent but pseudo-symmetrically related cations in the asymmetric unit form one-dimensional hydrogen-bonded chains through short homomeric phosphonic acid O-H...O links [2.455(4), 2.464(4) Å], while two of the chloride anions are similarly strongly linked to phosphonic acid groups [O-H...Cl, 2.889(4), 2.896(4) Å]. The other two chloride anions, together with the two water molecules of solvation (one with partial occupancy), form unusual cyclic hydrogen-bonded bis(Cl...water) dianion units which lie between the layers of bipyridylium rings of the cation chain structures, with which they are weakly associated.
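As a quick consistency check, the triclinic cell volume follows directly from the reported parameters via the standard metric formula:

```python
import math

# Triclinic unit-cell volume from the reported lattice parameters:
# V = abc * sqrt(1 - cos^2(a) - cos^2(b) - cos^2(g) + 2 cos(a)cos(b)cos(g))
a, b, c = 7.2550, 13.2038, 18.5752           # Angstrom
alpha, beta, gamma = 86.495, 83.527, 88.921  # degrees

ca, cb, cg = (math.cos(math.radians(x)) for x in (alpha, beta, gamma))
V = a * b * c * math.sqrt(1 - ca**2 - cb**2 - cg**2 + 2 * ca * cb * cg)
print(f"V = {V:.1f} A^3")  # ~1765 A^3
```

This gives roughly 882 Å^3 per formula unit for Z = 2, a plausible volume for a dication, two chlorides and partial hydration.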

Relevance:

10.00%

Publisher:

Abstract:

During wound repair, the balance between matrix metalloproteinases (MMPs) and their natural inhibitors (the TIMPs) is crucial for normal extracellular matrix turnover. However, the overexpression of several MMPs, including MMP-1, 2, 3, 8, 9 and 10, combined with abnormally high levels of activation or low expression of TIMPs, may contribute to excessive degradation of connective tissue and the formation of chronic ulcers. Many groups are exploring strategies for promoting wound healing that involve the delivery of growth factors, cells, ECM components and small molecules. Our approach to improving the balance of MMPs is not to add anything more to the wound, but instead to neutralise the over-expressed MMPs using inhibitors tethered to a bandage-like hydrogel. In our in vitro experiments, designed synthetic pseudo-peptide inhibitors were demonstrated to inhibit MMP activity in standard solutions. These inhibitors were also tethered to polyethylene glycol hydrogels using a facile reaction between the linker unit on the inhibitor and the gel. After tethering, the inhibition of MMPs diminishes to some extent, and we postulate that this arises from poor diffusion of the MMPs into the gels. When the tethered inhibitors were tested against chronic wound fluid obtained from patients, we observed over 40% inhibition of proteolytic activity, suggesting our approach may prove useful in rebalancing MMPs within chronic wounds.

Relevance:

10.00%

Publisher:

Abstract:

During the past decade, a significant amount of research has been conducted internationally with the aim of developing, implementing, and verifying "advanced analysis" methods suitable for the non-linear analysis and design of steel frame structures. Application of these methods permits comprehensive assessment of the actual failure modes and ultimate strengths of structural systems in practical design situations, without resort to simplified elastic methods of analysis and semi-empirical specification equations. Advanced analysis has the potential to extend the creativity of structural engineers and simplify the design process, while ensuring greater economy and more uniform safety with respect to the ultimate limit state. The application of advanced analysis methods has previously been restricted to steel frames comprising only members with compact cross-sections, which are not subject to the effects of local buckling. This precluded the use of advanced analysis in the design of steel frames comprising a significant proportion of the most commonly used Australian sections, which are non-compact and subject to the effects of local buckling. This thesis contains a detailed description of research conducted over the past three years in an attempt to extend the scope of advanced analysis by developing methods that include the effects of local buckling in a non-linear analysis formulation suitable for the practical design of steel frames comprising non-compact sections. Two alternative concentrated plasticity formulations are presented in this thesis: the refined plastic hinge method and the pseudo plastic zone method. Both methods implicitly account for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling.
The accuracy and precision of the methods for the analysis of steel frames comprising non-compact sections have been established by comparison with a comprehensive range of analytical benchmark frame solutions. Both the refined plastic hinge and pseudo plastic zone methods are more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations. For example, the pseudo plastic zone method predicts the ultimate strength of the analytical benchmark frames with an average conservative error of less than one percent, and has an acceptable maximum unconservative error of less than five percent. The pseudo plastic zone model can allow the design capacity to be increased by up to 30 percent for simple frames, mainly due to the consideration of inelastic redistribution. The benefits may be even more significant for complex frames with significant redundancy, which provides greater scope for inelastic redistribution. The analytical benchmark frame solutions were obtained using a distributed plasticity shell finite element model. A detailed description of this model and the results of all 120 benchmark analyses are provided. The model explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. Its accuracy was verified by comparison with a variety of analytical solutions and the results of three large-scale experimental tests of steel frames comprising non-compact sections. A description of the experimental method and test results is also provided.

Relevance:

10.00%

Publisher:

Abstract:

The INEX 2010 Focused Relevance Feedback track offered a refined approach to the evaluation of focused relevance feedback algorithms through simulated exhaustive user feedback. As in traditional approaches, we simulated a user-in-the-loop by re-using the assessments of ad-hoc retrieval obtained from real users who assess focused ad-hoc retrieval submissions. The evaluation was extended in several ways: the use of exhaustive relevance feedback over entire runs; the evaluation of focused retrieval where both the retrieval results and the feedback are focused; evaluation over a closed set of documents and complete focused assessments; evaluation over executable implementations of relevance feedback algorithms; and, finally, a reusable evaluation platform. We present the evaluation methodology, its implementation, and experimental results obtained for nine submissions from three participating organisations.
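The simulated user-in-the-loop can be sketched as a loop in which stored judgments are revealed after each retrieval and fed back to the algorithm. The toy documents, judgments and Rocchio-style query update below are all illustrative assumptions, not the INEX protocol or data:

```python
# Toy sketch of a simulated user-in-the-loop: stored relevance judgments
# stand in for the assessor and are revealed after each retrieved item.
# The Rocchio-like re-weighting is illustrative, not the INEX protocol.

docs = {                                     # hypothetical term-weight vectors
    "d1": {"focused": 1.0, "feedback": 1.0},
    "d2": {"focused": 1.0, "evaluation": 1.0},
    "d3": {"ad-hoc": 1.0, "evaluation": 1.0},
}
judgments = {"d1": 1, "d2": 0, "d3": 1}      # re-used assessments (toy)

def score(query, doc):
    return sum(query.get(t, 0.0) * w for t, w in doc.items())

query = {"focused": 1.0, "evaluation": 1.0}
seen = set()
for _ in range(len(docs)):
    # retrieve the best unseen doc, then reveal its stored judgment
    best = max((d for d in docs if d not in seen),
               key=lambda d: score(query, docs[d]))
    seen.add(best)
    sign = 1.0 if judgments[best] else -0.5  # feedback update (assumption)
    for t, w in docs[best].items():
        query[t] = query.get(t, 0.0) + sign * w
print(query)
```

Because the judgments are fixed up front, the whole loop is deterministic and repeatable, which is what makes exhaustive simulated feedback a reusable evaluation.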

Relevance:

10.00%

Publisher:

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to portray accurately the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at these locations, and to produce outputs that could be validated with data acquired at these sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism underlying the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they might be applied, where possible, to other similar systems and facilities. It was not possible to produce a single stand-alone model applicable to all facilities and locations in this study; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models across a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all observed manoeuvres were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb-lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb-lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major-stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps downstream of signalised intersections and those downstream of unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited-priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor-stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor- and major-stream delays across all minor- and major-stream flows. Pseudo-empirical relationships were established to predict average delays. Major-stream average delays are limited to 0.5 s, insignificant compared with minor-stream delays, which reach infinity at capacity. Minor-stream delays were shown to be smaller when unsignalised rather than signalised intersections are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge-area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance and provide further insight into the nature of operations.
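The calibrated quantities above (Cowan M3 headways with a 1 s minimum headway, follow-on times of 1 to 1.2 s, and a critical gap between the two stated limits) can be combined in a standard gap-acceptance capacity expression for M3-distributed major-stream headways (after Troutbeck). A minimal sketch; the specific parameter values are illustrative picks from the calibrated ranges, not results from the thesis:

```python
import math

# Minor-stream (on-ramp) capacity under Cowan M3 major-stream headways,
# using a standard absorption-capacity expression (after Troutbeck).
delta = 1.0  # s, minimum headway
t_f = 1.1    # s, follow-on time (calibrated range: 1.0-1.2 s)
t_c = 1.6    # s, critical gap (between t_f and t_f + 1 s)

def m3_capacity(q, alpha):
    """Capacity (veh/s) for major-stream flow q (veh/s), where alpha is
    the proportion of free (non-bunched) vehicles, i.e. headways > 1 s."""
    lam = alpha * q / (1.0 - delta * q)  # decay rate of free headways
    return (alpha * q * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

# Merge capacity falls as the kerb-lane (major-stream) flow rises:
for q_vph in (600, 1200, 1800):
    print(q_vph, round(m3_capacity(q_vph / 3600.0, alpha=0.8) * 3600))
```

On-ramp metering changes alpha and the headway structure of the entering stream, which is how its effect on merge performance enters this kind of model.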

Relevance:

10.00%

Publisher:

Abstract:

The interaction of 10-hydroxycamptothecine (HCPT) with DNA under pseudo-physiological conditions (Tris-HCl buffer of pH 7.4), using ethidium bromide (EB) dye as a probe, was investigated by spectrofluorimetry, UV-vis spectrometry and viscosity measurements. The binding constant and binding number for HCPT with DNA were evaluated as (7.1 ± 0.5) × 10^4 M^-1 and 1.1, respectively, by multivariate curve resolution-alternating least squares (MCR-ALS). Moreover, parallel factor analysis (PARAFAC) was applied to resolve the three-way fluorescence data obtained from the interaction system, and the concentration information for the three components of the system at equilibrium was obtained simultaneously. It was found that there was a cooperative interaction between the HCPT-DNA complex and EB, which produced a ternary complex, HCPT-DNA-EB.
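With the reported binding constant K = (7.1 ± 0.5) × 10^4 M^-1 and a binding number near 1, the bound fraction of HCPT follows a simple 1:1 saturation curve. A minimal sketch, assuming ideal 1:1 binding and a hypothetical free DNA concentration (neither taken from the abstract):

```python
# Fraction of HCPT bound to DNA, assuming simple 1:1 binding with the
# reported constant. The DNA concentration below is hypothetical.
K = 7.1e4  # M^-1, binding constant from MCR-ALS

def fraction_bound(dna_free):
    """Bound fraction of HCPT at a given free DNA concentration (mol/L)."""
    x = K * dna_free
    return x / (1.0 + x)

print(f"{fraction_bound(1e-4):.2f}")  # ~0.88 at 1e-4 M free DNA
```

A binding number of 1.1 indicates the 1:1 picture is a close approximation for this system.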

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of the concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this paper is a density bound of n·C(n−1, ≤ d−1)/C(n, ≤ d) < d, where C(m, ≤ k) denotes the partial binomial sum over C(m, i) for i from 0 to k. This positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme, and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic-topological property of maximum classes of VC-dimension d: they are d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum-degree conjecture of Kuzmin and Warmuth, the second part of a conjectured proof of correctness for Peeling, that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
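The density bound can be checked numerically for small cases; a quick sketch, writing C(m, ≤ k) for the partial binomial sum:

```python
from math import comb

# Numerically check the density bound n*C(n-1, <=d-1) / C(n, <=d) < d,
# where C(m, <=k) is the partial binomial sum of C(m, i) for i = 0..k.

def binom_le(m, k):
    return sum(comb(m, i) for i in range(k + 1))

def density_bound(n, d):
    return n * binom_le(n - 1, d - 1) / binom_le(n, d)

for n in range(2, 40):
    for d in range(1, n):
        assert density_bound(n, d) < d  # the claimed strict inequality
print(density_bound(10, 3))  # e.g. 460/176 ~ 2.614 < 3
```

This only verifies small instances, of course; the paper's proof via VC-invariant shifting and symmetrization establishes the bound in general.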