53 results for Assisted Vaginal Hysterectomy
in CentAUR: Central Archive University of Reading - UK
Abstract:
Objectives: To assess the potential source of variation that the surgeon may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy, involving 43 surgeons. The primary end point of the trial was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, yielding a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted by maximum likelihood in SAS. Results: There were many convergence problems. These were resolved using a variety of approaches, including treating all effects as fixed for the initial model building, modelling the variance of a parameter on a logarithmic scale, and centring continuous covariates. The initial model-building process indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power; the variance estimates were small with large standard errors, indicating that the precision of the variance estimates may be questionable.
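As a companion to the modelling approach described in this abstract, the sketch below shows the general shape of a random-intercept logistic model with surgeon fitted as a random effect, using simulated data. It is not the trial analysis: the trial used maximum-likelihood estimation in SAS, whereas this sketch uses the approximate (variational Bayes) mixed binomial GLM available in Python's statsmodels; the variable names (complication, operation, age_c, surgeon) and all numbers are illustrative assumptions.

```python
# Hypothetical sketch: random-intercept logistic regression with a surgeon effect.
# Not the trial analysis; it only illustrates the model structure described above.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n_surgeons, patients_per_surgeon = 43, 30

# Simulate a hierarchical data set: patients nested within surgeons.
surgeon = np.repeat(np.arange(n_surgeons), patients_per_surgeon)
operation = rng.integers(0, 2, surgeon.size)               # 0 = conventional, 1 = laparoscopic
age = rng.normal(45, 8, surgeon.size)
surgeon_effect = rng.normal(0, 0.3, n_surgeons)[surgeon]   # random intercept per surgeon

# Sparse binary outcome (roughly a 10% complication rate overall).
logit = -2.4 + 0.4 * operation + 0.02 * (age - age.mean()) + surgeon_effect
complication = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({
    "complication": complication,
    "operation": operation,
    "age_c": age - age.mean(),   # centring the continuous covariate, as in the abstract
    "surgeon": surgeon,
})

# Logit-link mixed model with surgeon as a random effect.
# (The trial used maximum likelihood in SAS; this is an approximate Bayesian fit.)
model = BinomialBayesMixedGLM.from_formula(
    "complication ~ operation + age_c",
    {"surgeon": "0 + C(surgeon)"},
    df,
)
result = model.fit_vb()
print(result.summary())
```

With real trial data, the centred covariate and the per-surgeon random intercept correspond directly to the convergence fixes and the hierarchical structure described in the abstract.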
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and set into perspective. The maximum likelihood estimator of the mixing distribution is constructed for any number of components, up to the global nonparametric maximum likelihood bound, using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, together with some generalisations of Zelterman's estimator. All computations are done with CAMCR, specialised software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems in using the mixture model-based estimators are highlighted.
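To make the Chao and Zelterman estimators mentioned above concrete, here is a small illustrative computation from frequency counts (f1 = units seen once, f2 = units seen twice). The frequency data are invented for illustration, and the CAMCR software referred to in the abstract is not involved; only the two closed-form estimators are shown.

```python
# Illustrative sketch of two population-size estimators mentioned above.
# freqs[k] = number of units observed exactly k times; f1, f2 are singletons and doubletons.
import math

def chao_estimator(n_observed: int, f1: int, f2: int) -> float:
    """Chao's lower-bound estimator: N_hat = n + f1^2 / (2 * f2)."""
    return n_observed + f1 ** 2 / (2 * f2)

def zelterman_estimator(n_observed: int, f1: int, f2: int) -> float:
    """Zelterman's estimator: lambda_hat = 2*f2/f1, N_hat = n / (1 - exp(-lambda_hat))."""
    lam = 2 * f2 / f1
    return n_observed / (1 - math.exp(-lam))

# Made-up frequency counts: 60 units seen once, 25 twice, 10 three times, 5 four times.
freqs = {1: 60, 2: 25, 3: 10, 4: 5}
n = sum(freqs.values())            # number of distinct units actually observed
print("observed:", n)
print("Chao:", round(chao_estimator(n, freqs[1], freqs[2]), 1))
print("Zelterman:", round(zelterman_estimator(n, freqs[1], freqs[2]), 1))
```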
Abstract:
It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One area of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review will provide some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry, one of the most commonly used analytical tools in proteomics, for high-throughput analyses.
Abstract:
We have combined several key sample preparation steps for the use of a liquid matrix system to provide high analytical sensitivity in automated ultraviolet matrix-assisted laser desorption/ionisation mass spectrometry (UV-MALDI-MS). This new sample preparation protocol employs a matrix mixture based on the glycerol matrix mixture described by Sze et al. The low-femtomole sensitivity achievable with this new preparation protocol enables proteomic analysis of protein digests comparable to solid-state matrix systems. For automated data acquisition and analysis, the MALDI performance of this liquid matrix surpasses the conventional solid-state MALDI matrices. Besides the inherent general advantages of liquid samples for automated sample preparation and data acquisition, the use of the presented liquid matrix significantly reduces the extent of unspecific ion signals in peptide mass fingerprints compared with typically used solid matrices, such as 2,5-dihydroxybenzoic acid (DHB) or alpha-cyano-hydroxycinnamic acid (CHCA). In particular, matrix and low-mass ion signals and ion signals resulting from cation adduct formation are dramatically reduced. Consequently, the confidence level of protein identification by peptide mass mapping of in-solution and in-gel digests is generally higher.
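The peptide mass fingerprinting (peptide mass mapping) step mentioned at the end of this abstract can be illustrated with a toy in-silico tryptic digest matched against a hypothetical MALDI peak list. The protein sequence, measured peak values and tolerance below are made-up assumptions; only the residue monoisotopic masses and the simple trypsin cleavage rule are standard.

```python
# Toy illustration of peptide mass fingerprinting ("peptide mass mapping"):
# an in-silico tryptic digest is matched against a hypothetical measured MALDI peak list.
# Residue masses are standard monoisotopic values; the sequence and peaks are invented.
MONO = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276, "V": 99.06841,
    "T": 101.04768, "C": 103.00919, "L": 113.08406, "I": 113.08406, "N": 114.04293,
    "D": 115.02694, "Q": 128.05858, "K": 128.09496, "E": 129.04259, "M": 131.04049,
    "H": 137.05891, "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER, PROTON = 18.01056, 1.00728

def tryptic_peptides(sequence: str):
    """Cleave after K or R, except when the next residue is P (simple trypsin rule)."""
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        if aa in "KR" and (i + 1 == len(sequence) or sequence[i + 1] != "P"):
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])
    return peptides

def mh_plus(peptide: str) -> float:
    """Monoisotopic [M+H]+ mass of a peptide."""
    return sum(MONO[aa] for aa in peptide) + WATER + PROTON

# Hypothetical protein fragment and hypothetical measured peak list (m/z values).
sequence = "MKWVTFISLLFLFSSAYSRGVFRRDAHK"
measured = [478.28, 888.40, 2037.08]
tolerance_ppm = 50.0

for pep in tryptic_peptides(sequence):
    theo = mh_plus(pep)
    hits = [m for m in measured if abs(m - theo) / theo * 1e6 <= tolerance_ppm]
    print(f"{pep:>18s}  {theo:9.3f}  matched: {hits}")
```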
Abstract:
Reaction of a group of N-(2'-hydroxyphenyl)benzaldimines, derived from 2-aminophenol and five para-substituted benzaldehydes (the para substituents are OCH3, CH3, H, Cl and NO2), with [Rh(PPh3)3Cl] in refluxing toluene in the presence of a base (NEt3) afforded a family of organometallic complexes of rhodium(III). The crystal structure of one complex has been determined by X-ray crystallography. In these complexes the benzaldimine ligands are coordinated to the metal center, via dissociation of the phenolic proton and the phenyl proton at the ortho position of the phenyl ring in the imine fragment, as dianionic tridentate C,N,O-donors, and the two PPh3 ligands are trans. The complexes are diamagnetic (low-spin d6, S = 0) and show intense MLCT transitions in the visible region. Cyclic voltammetry shows a Rh(III)-Rh(IV) oxidation within 0.63-0.93 V vs SCE, followed by an oxidation of the coordinated benzaldimine ligand. A reduction of the coordinated benzaldimine is also observed within -0.96 to -1.04 V vs SCE. The potential of the Rh(III)-Rh(IV) oxidation is found to be sensitive to the nature of the para substituent. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
We have combined several key sample preparation steps for the use of a liquid matrix system to provide high analytical sensitivity in automated ultraviolet matrix-assisted laser desorption/ionisation mass spectrometry (UV-MALDI-MS). This new sample preparation protocol employs a matrix mixture based on the glycerol matrix mixture described by Sze et al. (J. Am. Soc. Mass Spectrom. 1998, 9, 166-174). The low-femtomole sensitivity achievable with this new preparation protocol enables proteomic analysis of protein digests comparable to solid-state matrix systems. For automated data acquisition and analysis, the MALDI performance of this liquid matrix surpasses the conventional solid-state MALDI matrices. Besides the inherent general advantages of liquid samples for automated sample preparation and data acquisition, the use of the presented liquid matrix significantly reduces the extent of unspecific ion signals in peptide mass fingerprints compared with typically used solid matrices, such as 2,5-dihydroxybenzoic acid (DHB) or alpha-cyano-hydroxycinnamic acid (CHCA). In particular, matrix and low-mass ion signals and ion signals resulting from cation adduct formation are dramatically reduced. Consequently, the confidence level of protein identification by peptide mass mapping of in-solution and in-gel digests is generally higher.
Abstract:
Di-Grignard reagents of the form XMg(CH2)nMgX, where X = Br or I and n = 6, 8, 10 or 12, were allowed to react with PhSnCl3 to produce highly cross-linked Ph-Sn polymeric networks. The Sn-H moiety was incorporated into these insoluble network polymers by treatment with Br2 and NaBH4. Excellent accessibility of the Sn-H was displayed by these solvent-penetrable but insoluble networks, giving them higher Sn-H loadings than all previously reported supported reagents. These reagents were totally regenerable in NaBH4 for radical-assisted organic synthesis, and no detectable leaching of the Sn into solution was observed during these reactions.
Abstract:
A linear trinuclear Ni-Schiff base complex [Ni3(salpen)2(PhCH2COO)2(EtOH)] has been synthesized by combining Ni(ClO4)2·6H2O, phenylacetic acid (C6H5CH2COOH), and the Schiff base ligand N,N'-bis(salicylidene)-1,3-pentanediamine (H2salpen). This complex self-assembles through hydrogen bonding and C-H···π interactions in the solid state to generate a sheet-like architecture, while in organic solvent (CH2Cl2) it forms vesicles with a mean diameter of 290 nm, as well as fused vesicles, depending upon the concentration of the solution. These vesicles act as an excellent carrier of dye molecules in CH2Cl2. The morphology of the complex has been determined by scanning electron microscopy and transmission electron microscopy, and the encapsulation of dye has been examined by confocal microscopy imaging and electronic absorption spectra.
Light-assisted synthesis of a Ru(VI) nitrido complex by the reaction of azide with a Ru(III) complex
Abstract:
Reaction of Ru(III)(L)(dmf)Cl3 (1) (L = 4,4,4',4'-tetramethyl-2,2'-bisoxazoline, dmf = N,N-dimethylformamide) with an excess of sodium azide in a methanol-water mixture leads to the isolation of the sodium salt of a Ru(VI) nitrido complex of the tetraanion of N,N'-bis(2,2-dimethyl-1-hydroxyethyl)-1,2-ethanediamide (L'H4; H = dissociable proton), of the formulation Na[Ru(L')(N)(H2O)]·1.4H2O (2). Complex 2 is not generated in the absence of light. A tentative mechanism for the reaction is proposed, and a Ru(IV) intermediate, Na[Ru(L')(N3)(H2O)]·2CH3OH·2H2O (3), has been isolated.
Abstract:
Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in architectural design practice, e.g., to relate design by nature to design by humans. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that make it possible to transform almost any 2D design representation into a physical three-dimensional model nearly instantaneously, using a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be drawn with design by nature, where the natural deposition of earth layers shapes the earth's surface, a process that recurs over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hard copy and use it to validate their design ideas. Concurrent design is a systematic approach that aims to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and an improvement in the quality of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent to which design practices apply rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though they acknowledge a lack of knowledge in relation to rapid prototyping.