962 results for Series Summation Method


Relevance: 30.00%

Publisher:

Abstract:

There is an urgent need for thorough analysis of Radix astragali, a widely used Chinese herb, for quality control purposes. This paper describes the development of a total analytical method for Radix astragali extract, a complex multi-component mixture. Twenty-four components were separated step by step from the extract using a series of isocratic isopropanol-methanol elutions, and a further 42 components were separated similarly using methanol-water elutions. Based on the log k_w and −S values of the 66 components obtained from this procedure and on the optimization software developed in our laboratory, an optimum elution program consisting of seven methanol-water segments and four isopropanol-methanol segments was developed to analyze all of the components in a single run. Under the optimized gradient conditions, the sample of Radix astragali extract was analyzed. As expected, most of the components were well separated and the experimental chromatogram was in good agreement with the predicted one.
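The gradient optimization rests on the linear solvent strength relationship between retention and organic modifier fraction. As a minimal sketch of that relationship, assuming the conventional form log k = log k_w − S·φ, the snippet below predicts isocratic retention times from fitted log k_w and S values; the component values, dead time and function names are hypothetical and are not taken from the paper or its software.

```python
# Minimal sketch of the linear solvent strength (LSS) retention model,
# log k = log k_w - S * phi, which underlies gradient optimization from
# isocratically fitted log k_w and S values. Component values are hypothetical.

import numpy as np

def isocratic_k(log_kw: float, S: float, phi: float) -> float:
    """Retention factor k at organic modifier volume fraction phi (0-1)."""
    return 10.0 ** (log_kw - S * phi)

def retention_time(log_kw: float, S: float, phi: float, t0: float = 2.0) -> float:
    """Isocratic retention time (minutes) for a column dead time t0."""
    return t0 * (1.0 + isocratic_k(log_kw, S, phi))

# Hypothetical components with (log_kw, S) fitted from a series of isocratic runs
components = {"comp_A": (2.8, 4.5), "comp_B": (3.1, 5.0), "comp_C": (1.9, 3.2)}

for phi in (0.30, 0.50, 0.70):          # candidate methanol fractions
    times = {name: round(retention_time(*p, phi), 2) for name, p in components.items()}
    print(f"phi = {phi:.2f}:", times)
```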

Relevance: 30.00%

Publisher:

Abstract:

Titania sols were prepared by acid hydrolysis of a TiCl4 precursor instead of titanium alkoxides. The effect of acid concentration on the particle size and stability of the sol was investigated. Stable titania sols with a mean particle size of 14 nm could be obtained when the H+/Ti molar ratio was 0.5. The titania sols were modified with Pt, SiO2, ZrO2, WO3 and MoO3 to prepare a series of modified catalysts, which were used for the photocatalytic oxidation of formaldehyde at 37 °C. They showed different photocatalytic activities due to the influence of the additives. Compared with pure TiO2, the addition of silica or zirconia increased the photocatalytic activity, the addition of Pt and MoO3 decreased the activity, and the addition of WO3 had little effect on the activity. Notably, the conversion of formaldehyde increased to 94% over the SiO2-TiO2 catalyst. The increased activity was partly due to higher surface area and porosity or smaller crystallite size. A comparison of our catalyst compositions with the literature in this field suggested that the difference in activity caused by the addition of a second metal oxide may be due to the surface chemistry of the catalysts, particularly their acidity. (C) 2001 Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be interpreted and exploited effectively in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
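The workflow described here, production, interpretation and consumption of data with a maintained provenance record, can be pictured with a small sketch. The class and function names below are hypothetical illustrations of that pattern and do not reflect the platform's actual API.

```python
# Toy sketch of a produce -> interpret -> consume workflow that keeps a
# provenance record alongside each derived result. All names are hypothetical.

from dataclasses import dataclass, field
from typing import Any, Callable, List
import time

@dataclass
class Record:
    value: Any
    provenance: List[str] = field(default_factory=list)

def produce(raw: Any, source: str) -> Record:
    return Record(raw, [f"{time.time():.0f} produced from {source}"])

def interpret(rec: Record, analysis: Callable[[Any], Any], name: str) -> Record:
    # Apply an analysis technique without discarding how the input was obtained.
    out = Record(analysis(rec.value), list(rec.provenance))
    out.provenance.append(f"{time.time():.0f} interpreted by {name}")
    return out

def consume(rec: Record) -> None:
    print("result:", rec.value)
    print("provenance:", *rec.provenance, sep="\n  ")

# The same raw stream interpreted by two independent techniques.
raw = produce([1, 4, 2, 8, 5], source="sensor_stream")
consume(interpret(raw, lambda xs: sum(xs) / len(xs), name="mean_filter"))
consume(interpret(raw, max, name="peak_detector"))
```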

Relevance: 30.00%

Publisher:

Abstract:

While genome-wide gene expression data are generated at an increasing rate, the repertoire of approaches for pattern discovery in these data is still limited. Identifying subtle patterns of interest in large amounts of data (tens of thousands of profiles) associated with a certain level of noise remains a challenge. A microarray time series was recently generated to study the transcriptional program of the mouse segmentation clock, a biological oscillator associated with the periodic formation of the segments of the body axis. A method related to Fourier analysis, the Lomb-Scargle periodogram, was used to detect periodic profiles in the dataset, leading to the identification of a novel set of cyclic genes associated with the segmentation clock. Here, we applied four distinct mathematical methods to the same microarray time series dataset to identify significant patterns in gene expression profiles. These methods, called Phase consistency, Address reduction, Cyclohedron test and Stable persistence, are based on different conceptual frameworks that are either hypothesis- or data-driven. Some of the methods, unlike Fourier transforms, do not depend on the assumption of periodicity of the pattern of interest. Remarkably, these methods blindly identified the expression profiles of known cyclic genes as the most significant patterns in the dataset. Many candidate genes predicted by more than one approach appeared to be true positive cyclic genes and will be of particular interest for future research. In addition, these methods predicted novel candidate cyclic genes that were consistent with previous biological knowledge and experimental validation in mouse embryos. Our results demonstrate the utility of these novel pattern detection strategies, notably for the detection of periodic profiles, and suggest that combining several distinct mathematical approaches to analyze microarray datasets is a valuable strategy for identifying genes that exhibit novel, interesting transcriptional patterns.
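For reference, the Lomb-Scargle periodogram mentioned above scores candidate frequencies against an unevenly sampled series. The snippet below is a minimal, generic sketch using scipy.signal.lombscargle on synthetic data; the profile, noise level and candidate periods are illustrative and are not taken from the segmentation clock dataset.

```python
# Minimal sketch: scoring an unevenly sampled expression profile for
# periodicity with the Lomb-Scargle periodogram (scipy.signal.lombscargle).
# The synthetic profile, noise level and candidate periods are illustrative.

import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 40))            # unevenly spaced time points (hours)
true_period = 2.0                              # hypothetical cyclic-gene period
y = np.sin(2 * np.pi * t / true_period) + 0.3 * rng.standard_normal(t.size)

periods = np.linspace(0.5, 5.0, 500)           # candidate periods to test (hours)
ang_freqs = 2 * np.pi / periods                # lombscargle expects angular frequencies
power = lombscargle(t, y - y.mean(), ang_freqs)

print(f"best period ~ {periods[np.argmax(power)]:.2f} h (true {true_period} h)")
```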

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Controversies exist regarding the indications for unicompartmental knee arthroplasty. The objective of this study is to report the mid-term results and examine predictors of failure in a metal-backed unicompartmental knee arthroplasty design. METHODS: At a mean follow-up of 60 months, 80 medial unicompartmental knee arthroplasties (68 patients) were evaluated. Implant survivorship was analyzed using the Kaplan-Meier method. The Knee Society objective and functional scores and radiographic characteristics were compared before surgery and at final follow-up. A Cox proportional hazards model was used to examine the association of patient age, gender, obesity (body mass index > 30 kg/m2), diagnosis, Knee Society scores and patella arthrosis with failure. RESULTS: There were 9 failures during the follow-up period. The mean Knee Society objective and functional scores were 49 and 48 points, respectively, preoperatively and 95 and 92 points postoperatively. The survival rate was 92% at 5 years and 84% at 10 years. The mean age was lower in the failure group than in the non-failure group (p < 0.01). However, none of the factors assessed was independently associated with failure based on the results from the Cox proportional hazards model. CONCLUSION: Gender, pre-operative diagnosis, preoperative objective and functional scores and patellar osteophytes were not independent predictors of failure of unicompartmental knee implants, although high body mass index trended toward significance. The findings suggest that the standard criteria for UKA may be expanded without compromising the outcomes, although caution may be warranted in patients with very high body mass index pending additional data to confirm our results. LEVEL OF EVIDENCE: IV.
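The survival analysis named in the methods, Kaplan-Meier survivorship together with a Cox proportional hazards model, could be reproduced along the following lines using the lifelines library. The data frame, covariate names and values below are synthetic placeholders, not the study's data.

```python
# Sketch of the survival analysis named in the abstract: Kaplan-Meier implant
# survivorship plus a Cox proportional hazards model (lifelines). The data
# frame is synthetic; columns and values are hypothetical.

import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 80
df = pd.DataFrame({
    "months": rng.uniform(12, 120, n),      # follow-up time per knee
    "failed": rng.integers(0, 2, n),        # 1 = revision / failure, 0 = censored
    "age": rng.normal(65, 9, n),
    "bmi_over_30": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["failed"])
print(kmf.survival_function_.tail())        # estimated implant survivorship

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
cph.print_summary()                         # hazard ratio per covariate
```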

Relevance: 30.00%

Publisher:

Abstract:

Purpose: Nicardipine is a member of a family of calcium channel blockers named dihydropyridines that are known to be photolabile and may cause phototoxicity. It is therefore vital to develop an analytical method that can be used to study the photodegradation of nicardipine. Method: Forced acid degradation of nicardipine was conducted by heating 12 ml of 1 mg/ml nicardipine with 3 ml of 2.5 M HCl for two hours. A gradient HPLC method was developed using an Agilent Technologies 1200 series quaternary system. Separation was achieved with a Hichrome (250 x 4.6 mm) 5 μm C18 reversed-phase column and a mobile phase composed of 70% v/v A (100% v/v water) and 30% v/v B (99% v/v acetonitrile + 1% v/v formic acid) at time zero; the composition of A and B was then changed to 60% v/v A / 40% v/v B at 10 minutes, 50% v/v A / 50% v/v B at 30 minutes, and 70% v/v A / 30% v/v B at 35 minutes. 20 μl of 0.8 mg/ml nicardipine degradation solution was injected at room temperature (25 °C). The gradient method was transferred onto an HPLC-ESI-MS system (HP 1050 series - AQUAMAX mass detector) and analysis was conducted with an acid degradation concentration of 0.25 mg/ml and a 20 μl injection volume. ESI spectra were acquired in positive ionisation mode with MRM over m/z 0-600. Results: Eleven nicardipine degradation products were detected in the HPLC analysis, and the resolutions (RS) between the respective degradants were 1.0, 1.2, 6.0, 0.4, 1.7, 3.7, 1.8, 1.0 and 1.7. Nine degradation products were identified in the ESI spectra, with respective m/z ratios of 171.0, 166.1, 441.2, 423.2, 455.2, 455.2, 331.1, 273.1 and 290.1. The possible molecular formulae for each degradant were unambiguously determined. Conclusion: A sensitive and specific method was developed for the analysis of nicardipine degradants. The method enables detection and quantification of nicardipine degradation products and can be used to study the kinetics of nicardipine degradation processes.

Relevance: 30.00%

Publisher:

Abstract:

This paper proposes a modification to the ACI 318-02 equivalent frame method of analysis for reinforced concrete flat plate exterior panels. Two existing code methods were examined: ACI 318 and BS 8110. The derivation of the torsional stiffness of the edge strip as proposed by ACI 318 is examined and a more accurate estimate of this value is proposed, based on both theoretical analysis and experimental results. A series of 1/3-scale models of flat plate exterior panels has been tested. Unique experimental results were obtained by measuring strains in reinforcing bars at approximately 200 selected locations in the plate panel throughout the entire loading history. The measured strains were used to calculate curvature and, hence, bending moments; these were used along with moments in the columns to assess the accuracy of the equivalent frame methods. The proposed method leads to a more accurate prediction of the moments in the plate at the column front face, at the panel midspan, and in the edge column.
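For context, the quantity under scrutiny is the torsional stiffness that ACI 318 assigns to the edge (torsional) member of the equivalent frame. The sketch below evaluates the standard ACI 318 expressions, C = Σ(1 − 0.63·x/y)·x³y/3 and Kt = Σ 9·Ecs·C / (l2·(1 − c2/l2)³), for a hypothetical slab strip; it illustrates the conventional calculation only and does not reproduce the paper's proposed modification.

```python
# Rough illustration of the standard ACI 318 equivalent-frame quantities the
# paper re-examines: the torsional constant C of the edge member and the
# torsional member stiffness Kt. All dimensions below are hypothetical.

def torsional_constant(rectangles):
    """C = sum over rectangles of (1 - 0.63*x/y) * x**3 * y / 3,
    where x <= y are each rectangle's cross-sectional dimensions (mm)."""
    total = 0.0
    for a, b in rectangles:
        x, y = min(a, b), max(a, b)
        total += (1.0 - 0.63 * x / y) * x**3 * y / 3.0
    return total

def torsional_stiffness(C, Ecs, l2, c2, sides=2):
    """Kt = sum over the torsional members framing into the column of
    9 * Ecs * C / (l2 * (1 - c2/l2)**3), one term per side."""
    return sides * 9.0 * Ecs * C / (l2 * (1.0 - c2 / l2) ** 3)

# Hypothetical edge strip: 200 mm slab depth over a 300 mm wide column strip,
# transverse span l2 = 4500 mm, column width c2 = 300 mm, Ecs = 25,000 MPa.
C = torsional_constant([(200.0, 300.0)])
Kt = torsional_stiffness(C, Ecs=25_000.0, l2=4500.0, c2=300.0)
print(f"C = {C:.3e} mm^4, Kt = {Kt:.3e} N*mm/rad")
```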

Relevance: 30.00%

Publisher:

Abstract:

A series of cis-dihydrodiol metabolites, available from the bacterial dioxygenase-catalysed oxidation of monosubstituted benzene substrates using Pseudomonas putida UV4, have been converted to the corresponding catechols using both a heterogeneous catalyst (Pd/C) and a naphthalene cis-diol dehydrogenase enzyme present in whole cells of the recombinant strain Escherichia coli DH5 alpha(pUC129: nar B). A comparative study of the merits of both routes to 3-substituted catechols has been carried out, and the two methods have been found to be complementary. A similarity in mechanism for catechol formation under both enzymatic and chemoenzymatic conditions, involving regioselective oxidation of the hydroxyl group at C-1, has been found using deuterium-labelled toluene cis-dihydrodiols. The potential of combining a biocatalytic step (dioxygenase-catalysed cis-dihydroxylation) with a chemocatalytic step (Pd/C-catalysed dehydrogenation) into a one-pot route to catechols from the parent substituted benzene substrates has been realised.

Relevance: 30.00%

Publisher:

Abstract:

A method is proposed to accelerate the evaluation of the Green's function of an infinite doubly periodic array of thin wire antennas. The method is based on the expansion of the Green's function into series corresponding to the propagating and evanescent waves and the use of Poisson and Kummer transformations enhanced with analytic summation of the slowly convergent asymptotic terms. Unlike existing techniques, the procedure reported here provides uniform convergence regardless of the geometrical parameters of the problem or the plane wave excitation wavelength. In addition, it is numerically stable and does not require numerical integration or internal tuning parameters, since all necessary series are calculated directly in terms of analytical functions. This means that, for nonlinear problem scenarios, the algorithm can be deployed within a harmonic balance engine without run-time intervention or recursive adjustment. Numerical examples are provided to illustrate the efficiency and accuracy of the developed approach as compared with the Ewald method, which for this class of problems requires run-time adaptation of the splitting parameter.
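The core device here, Kummer's transformation, subtracts from a slowly convergent series a comparison series with the same asymptotic tail and a known closed-form sum, leaving a rapidly convergent remainder. The snippet below demonstrates the idea on a generic textbook series, Σ 1/(n² + a²), with the comparison series Σ 1/n² = π²/6; it illustrates the technique only and is not the paper's periodic Green's function summation.

```python
# Kummer's transformation on a generic slowly convergent series,
# S = sum_{n>=1} 1/(n^2 + a^2): subtract the comparison series
# sum 1/n^2 = pi^2/6 (same 1/n^2 tail, known closed form), leaving a
# remainder that decays like 1/n^4. Textbook example, not the paper's
# periodic Green's function.

import math

def direct_sum(a, terms):
    return sum(1.0 / (n * n + a * a) for n in range(1, terms + 1))

def kummer_sum(a, terms):
    # S = pi^2/6 - sum_n [1/n^2 - 1/(n^2 + a^2)]
    #   = pi^2/6 - sum_n a^2 / (n^2 * (n^2 + a^2))
    remainder = sum(a * a / (n * n * (n * n + a * a)) for n in range(1, terms + 1))
    return math.pi ** 2 / 6.0 - remainder

a = 0.7
exact = (math.pi * a / math.tanh(math.pi * a) - 1.0) / (2.0 * a * a)  # closed form
for terms in (10, 100):
    print(f"{terms:>4} terms: direct error {abs(direct_sum(a, terms) - exact):.2e}, "
          f"Kummer error {abs(kummer_sum(a, terms) - exact):.2e}")
```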

Relevance: 30.00%

Publisher:

Abstract:

Within the marine environment, aerial surveys have historically centred on apex predators, such as pinnipeds, cetaceans and sea birds. However, it is becoming increasingly apparent that the utility of this technique may also extend to subsurface species such as pre-spawning fish stocks and aggregations of jellyfish that occur close to the surface. In light of this, we tested the utility of aerial surveys to provide baseline data for 3 poorly understood scyphozoan jellyfish found throughout British and Irish waters: Rhizostoma octopus, Cyanea capillata and Chrysaora hysoscella. Our principal objectives were to develop a simple sampling protocol to identify and quantify surface aggregations, assess their consistency in space and time, and consider the overall applicability of this technique to the study of gelatinous zooplankton. This approach provided a general understanding of range and relative abundance for each target species, with greatest suitability to the study of R. octopus. For this species it was possible to identify and monitor extensive, temporally consistent and previously undocumented aggregations throughout the Irish Sea, an area spanning thousands of square kilometres. This finding has pronounced implications for ecologists and fisheries managers alike and, moreover, draws attention to the broad utility of aerial surveys for the study of gelatinous aggregations beyond the range of conventional ship-based techniques.

Relevance: 30.00%

Publisher:

Abstract:

The stochastic nature of oil price fluctuations is investigated over a twelve-year period, using data from an existing database (the USA Energy Information Administration database, available online). We evaluate the scaling exponents of the fluctuations by employing different statistical analysis methods, namely rescaled range analysis (R/S), scaled windowed variance analysis (SWV) and the generalized Hurst exponent (GH) method. Relying on the scaling exponents obtained, we apply a rescaling procedure to investigate the complex characteristics of the probability density functions (PDFs) governing oil price fluctuations. It is found that the PDFs exhibit scale invariance and in fact collapse onto a single curve when increments are measured over microscales (typically less than 30 days). The time evolution of the distributions is well fitted by a Lévy-type stable distribution. The relevance of a Lévy distribution is made plausible by a simple model of nonlinear transfer. Our results also exhibit a degree of multifractality, as the PDFs change and converge toward a Gaussian distribution at the macroscales.
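The rescaled range (R/S) method named above estimates the Hurst exponent from how the range of cumulative mean-adjusted deviations scales with window size. The sketch below is a generic illustration of that estimator on synthetic increments; it is not the paper's implementation, and the other estimators (SWV, GH) are omitted.

```python
# Minimal rescaled-range (R/S) sketch: for several window sizes n, average the
# range of cumulative mean-adjusted deviations divided by the window standard
# deviation, then fit log(R/S) against log(n). The synthetic series stands in
# for the oil-price increments analysed in the paper.

import numpy as np

def rescaled_range(x):
    """R/S statistic of one window of increments x."""
    dev = np.cumsum(x - x.mean())
    r = dev.max() - dev.min()
    s = x.std(ddof=1)
    return r / s if s > 0 else np.nan

def hurst_rs(series, window_sizes):
    logs_n, logs_rs = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = np.nanmean([rescaled_range(np.asarray(c)) for c in chunks])
        logs_n.append(np.log(n))
        logs_rs.append(np.log(rs))
    slope, _ = np.polyfit(logs_n, logs_rs, 1)   # slope ~ Hurst exponent
    return slope

rng = np.random.default_rng(2)
increments = rng.standard_normal(4096)          # uncorrelated noise, theoretical H = 0.5
print(f"estimated H = {hurst_rs(increments, [16, 32, 64, 128, 256, 512]):.2f}")
```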