976 results for new procedure


Relevance:

60.00%

Publisher:

Abstract:

This work proposes the synthesis of zeolite A by IZA standard procedures starting from a natural clay. The clay was used both in its natural form and after calcination at 900 ºC. The resulting materials were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and porosity analysis by nitrogen adsorption. The results showed a low surface area for the zeolite in its sodium form (Na-A), but a higher one for the Ca-A form, based on nitrogen accessibility. Cubic crystals characteristic of the A phase were observed in the SEM micrographs. The new procedure favors the formation of sodalite when starting from the natural clay, whereas the calcined clay yields zeolite A.

Relevance:

60.00%

Publisher:

Abstract:

Herein, we report the synthesis of β-N-glycosylsulfonamide derivatives of D-glucose and N-acetylglucosamine using conventional methods. We also describe a procedure that allows these compounds to be prepared in good yields without anomerization of the intermediate glycosylamines. The method extends to the intermediates obtained from the less reactive 1- and 2-naphthalenesulfonyl chlorides.

Relevance:

60.00%

Publisher:

Abstract:

Various strength properties of paper are measured to indicate how well it resists breaks in a paper machine or in printing presses. The most commonly measured properties are dry tensile strength and dry tear strength. However, in many situations where paper breaks, it is not dry; in web breaks after wet pressing, for example, the dry matter content can be around 45%. Thus, wet-web strength is often a more critical paper property than dry strength. Both wet and dry strength properties of the samples were measured with an L&W tensile tester. This device was not originally designed for measuring wet-web tensile strength, so a new procedure for handling the wet samples was developed. The method was tested with never-dried pine kraft pulp. The effect of different strength additives on wet-web and dry paper tensile strength was studied. The polymers used in this experiment were an aqueous solution of a cationic polyamidoamine-epichlorohydrin resin (PAE), a cationic hydrophilised polyisocyanate and a cationic polyvinylamine (PVAm). Of the three chemicals used, only cationic PAE considerably increased the wet-web strength. However, at constant solids content all of the chemicals decreased the wet-web tensile strength; since all of them increased the solids content, it can be concluded that they act as drainage aids rather than as wet-web strength additives. Only PVAm increased the dry strength, while the other two chemicals actually decreased it. Because the chemicals were applied in highly diluted form and added to the pulp slurry rather than to the surface of the paper sheets, the sample densities did not change. It should also be noted that all of these chemicals are mainly used to improve wet strength after the drying of the web.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: A new procedure is presented for the treatment of esophageal fistula, mainly that associated with esophagojejunal reflux in patients submitted to total gastrectomy with Roux-en-Y jejunal loop reconstruction. METHODS: The method is based on the use of a standard enteral feeding tube extended with a laminar drain fitted to its tip, which allows advanced positioning inside the jejunum, making the administration of enteral nutrition possible and preventing esophagojejunal reflux. RESULTS: The authors discuss the theoretical advantages of the procedure and suggest that treating esophagojejunal fistula with the extended enteral tube would prevent esophagojejunal reflux, resulting in a shorter duration of the fistula and avoiding its high mortality rate. CONCLUSIONS: Preliminary studies showed this to be a technically easy, low-cost procedure performed with endoscopic assistance. A prospective evaluation of the morbidity and mortality related to the method is needed.

Relevance:

60.00%

Publisher:

Abstract:

The formal calibration procedure of a phase fraction meter is based on registering the outputs resulting from imposed phase fractions at known flow regimes. This can be done straightforwardly under laboratory conditions, but rarely under industrial conditions, particularly for on-site applications. Thus, there is a clear need for calibration methods that are less restrictive with regard to prior knowledge of the complete set of inlet conditions. A new procedure is proposed in this work for the on-site construction of the calibration curve from the total flowed mass of the homogeneous dispersed phase. The solution is obtained by minimizing a convenient error functional, assembled with data from redundant tests to handle the intrinsically ill-conditioned nature of the problem. Numerical simulations performed for increasing error levels demonstrate that acceptable calibration curves can be reconstructed even from total mass values measured to a precision of only 2%. Consequently, the method can readily be applied, especially in on-site calibration problems in which classical procedures fail due to the impossibility of strictly controlling all the input/output parameters.
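
As a rough illustration of the kind of error-functional minimization described above, the Python sketch below reconstructs a polynomial calibration curve from noisy total-mass values of the dispersed phase; the polynomial parameterization, the synthetic test data, the constant flow and density, and the regularization weight are all illustrative assumptions rather than the paper's actual formulation.

```python
# Minimal sketch (not the paper's formulation): reconstruct a calibration curve
# f(signal) -> phase fraction from total-mass data of the dispersed phase.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# "True" calibration curve used only to generate synthetic redundant tests.
true_curve = lambda s: 0.1 + 0.8 * s**1.3

n_tests, n_samples = 12, 200
signals = [rng.uniform(0.0, 1.0, n_samples) for _ in range(n_tests)]
flow = 2.0    # volumetric flow per sample interval (assumed constant)
rho = 1000.0  # dispersed-phase density, kg/m^3 (assumed)

# Measured total dispersed-phase mass per test, corrupted by ~2% noise.
total_mass = np.array([
    (true_curve(s) * flow * rho).sum() * (1.0 + 0.02 * rng.standard_normal())
    for s in signals
])

def residuals(coeffs, lam=1e-3):
    """Mass mismatch for each redundant test plus a Tikhonov-style penalty."""
    curve = np.polynomial.Polynomial(coeffs)
    predicted = np.array([(curve(s) * flow * rho).sum() for s in signals])
    mismatch = (predicted - total_mass) / total_mass
    return np.concatenate([mismatch, lam * np.asarray(coeffs)])

fit = least_squares(residuals, x0=np.zeros(4))
print("fitted calibration coefficients:", fit.x)
```

The small penalty term stands in for the treatment of ill-conditioning mentioned in the abstract: it keeps the inversion stable when the redundant tests alone do not constrain the curve well.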

Relevance:

60.00%

Publisher:

Abstract:

The objective of this prospective study was to evaluate the efficacy and complications of the use of an intraocular sustained-release ganciclovir implant for the treatment of active cytomegalovirus (CMV) retinitis in AIDS patients. Thirty-nine eyes of 26 patients underwent ocular surgery. All patients received a complete ocular examination before and after surgery. The surgical procedure was always done under local anesthesia using the same technique. The mean time for the surgical procedure was 20 min (range, 15 to 30 min). The average follow-up period was 3.7 months. Of all the patients, only 4 presented recurrence of retinitis, after 8, 8, 9 and 2 months, respectively. Three of them received a successful second implant. All 39 eyes of the 26 patients presented healing of retinitis as shown by clinical improvement evaluated by indirect binocular ophthalmoscopy and retinography. Retinitis healed within a period of 4 to 6 weeks in all patients, with clinical regression signs from the third week on. Six (15.4%) eyes developed retinal detachment. None of the patients developed CMV retinitis in the contralateral eye. The intraocular implant proved to be effective in controlling the progression of retinitis for a period of up to 8 months, even in patients for whom systemic therapy with ganciclovir, foscarnet or both had failed. The intraocular sustained-release ganciclovir implant proved to be a safe new procedure for the treatment of CMV retinitis, avoiding the systemic side effects caused by the intravenous medications and improving the quality of life of the patients.

Relevance:

60.00%

Publisher:

Abstract:

Cardiovascular diseases remain a major cause of mortality and morbidity in developed societies. The search for predictive determinants of vascular events remains a topical challenge given the growing cost of medical care and the broadening of the populations concerned, notably with the westernization of emerging countries such as India, Brazil and China. For thirty years, nuclear cardiology has held an essential place in the arsenal of diagnostic and prognostic methods for heart disease. Moreover, new advances will allow earlier and more precise detection of cardiac and peripheral atherosclerotic disease, both in affected populations and in primary prevention. In this thesis we present two new approaches in nuclear cardiology. Endothelial dysfunction is considered the earliest pathological signal of atherosclerosis. Traditional cardiovascular risk factors impair endothelial function and can initiate the atherosclerotic process even in the absence of a physical endothelial lesion. Quantification of coronary endothelial function is therefore of definite interest as an early biomarker of coronary disease. Isotopic plethysmography, a methodology developed during this doctoral work, quantifies peripheral endothelial function, which is correlated with coronary endothelial function. This methodology is presented in the first manuscript (Harel et al., Physiol Meas., 2007). Radiolabelling of erythrocytes allows the arterial flow in the upper limb to be measured during local reactive hyperaemia. This new procedure was validated against strain-gauge plethysmography in a cohort of 26 patients. It showed excellent reproducibility (intraclass correlation coefficient = 0.89). Moreover, the arterial flow measured during the hyperaemic reaction correlated with the measurements obtained by the reference method (r = 0.87). The second manuscript lays out the basis of infrared spectroscopy as a methodology for measuring arterial flow and quantifying the hyperaemic reaction (Harel et al., Physiol Meas., 2008). That study used a protocol of simultaneous triple measurements by strain-gauge plethysmography, the radioisotopic method and infrared spectroscopy. The spectroscopic technique was shown to be accurate and reproducible for measuring arterial flow in the forearm. This new procedure offered clear advantages in terms of reduced artefacts and ease of use. The second part of my thesis concerns the analysis of cardiac contraction synchrony. Indeed, more than 30% of patients receiving resynchronization therapy show no clinical improvement, and the non-response rate is even higher when morphological criteria of response to resynchronization (reduction of end-systolic volume) are used. There is therefore an urgent need for a reliable and precise methodology for measuring cardiac dynamics. The third manuscript lays out the basis of a new radioisotopic technique for quantifying the left ventricular ejection fraction (Harel et al., J Nucl Cardiol., 2007). The study of 202 patients showed an excellent correlation (r = 0.84) with the reference method (planar ventriculography). Comparison with the QBS software (Cedar-Sinai) showed a smaller standard deviation of the bias (7.44% vs 9.36%). Moreover, with our methodology the measurement bias showed no correlation with the magnitude of the parameter, unlike the alternative software. The fourth manuscript addressed the quantification of left intraventricular asynchrony (Harel et al., J Nucl Cardiol, 2008). A new three-dimensional parameter (CHI: contraction homogeneity index) (median 73.8%; IQR 58.7%–84.9%) integrates the amplitude and synchrony components of ventricular contraction. This parameter was validated by comparison with the standard deviation of the phase histogram (SDΦ) (median 28.2°; IQR 17.5°–46.8°) obtained by planar ventriculography in a study of 235 patients. These four manuscripts, already published in the specialized scientific literature, summarize a fraction of the research work we carried out over the last three years. This work falls within two major axes of development of 21st-century cardiology.

Relevance:

60.00%

Publisher:

Abstract:

In ecology, for example in studies of ecosystem services, descriptive, explanatory and predictive modelling each have their own distinct place. Specific situations call for one or another of these types of modelling, and the right choice must be made so that the model can be used in a way consistent with the objectives of the study. In this work we first explore the explanatory power of the multivariate regression tree (MRT). This modelling method is based on a recursive binary-partitioning algorithm and on a resampling method used to prune the final model, which is a tree, in order to obtain the model producing the best predictions. This asymmetric two-table analysis yields homogeneous groups of objects of the response table, with the splits between groups corresponding to cut-points of the variables of the explanatory table that mark the most abrupt changes in the response. We show that in order to compute the explanatory power of the MRT, one must define an adjusted coefficient of determination in which the degrees of freedom of the model are estimated by an algorithm. This estimate of the population coefficient of determination is practically unbiased. Since the MRT assumes discontinuities whereas canonical redundancy analysis (RDA) models continuous linear gradients, comparing their respective explanatory power makes it possible, among other things, to distinguish which type of pattern the response follows as a function of the explanatory variables. The comparison of explanatory power between RDA and MRT was motivated by the extensive use of RDA for studying beta diversity. Still from an explanatory perspective, we define a new procedure called the cascade multivariate regression tree (cascade MRT), which makes it possible to build a model while imposing a hierarchical order on the hypotheses under study. This new procedure allows the hierarchical effect of two sets of explanatory variables, a main set and a subordinate set, to be studied and their explanatory power to be computed. The final model is interpreted as in a nested MANOVA. The results of this analysis can provide additional information on the links between the response and the explanatory variables, for example interactions between the two explanatory sets that were not brought out by the usual MRT analysis. We also study the predictive power of generalized linear models by modelling the biomass of different tropical tree species as a function of some of their allometric measurements. More specifically, we examine the ability of Gaussian and gamma error structures to provide the most accurate predictions. We show that, for one particular species, the predictive power of a model using the gamma error structure is superior. This study is set in a practical framework and is intended as an example for managers wishing to estimate precisely the carbon capture of tropical tree plantations. Our conclusions could form an integral part of a programme for reducing carbon emissions through land-use change.
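
As a small illustration of the Gaussian-versus-gamma comparison mentioned at the end of this abstract, the sketch below fits both error structures with a log link to synthetic allometric data using statsmodels; the data, the allometric relation and the link choice are assumptions for illustration only, not the study's actual models or results.

```python
# Minimal sketch: compare Gaussian and gamma GLMs for biomass prediction from
# allometric measurements, on made-up data with a simple allometric mean.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 150
diameter = rng.uniform(5, 60, n)          # cm, hypothetical
height = rng.uniform(3, 35, n)            # m, hypothetical
mu = 0.05 * diameter**2 * height          # kg, a simple allometric mean
biomass = rng.gamma(shape=10.0, scale=mu / 10.0)  # positive, right-skewed response

X = sm.add_constant(np.column_stack([np.log(diameter), np.log(height)]))
y = biomass

gaussian_fit = sm.GLM(y, X, family=sm.families.Gaussian(sm.families.links.Log())).fit()
gamma_fit = sm.GLM(y, X, family=sm.families.Gamma(sm.families.links.Log())).fit()

# Compare predictive accuracy (here in-sample for brevity) and AIC.
for name, fit in [("gaussian", gaussian_fit), ("gamma", gamma_fit)]:
    rmse = np.sqrt(np.mean((fit.fittedvalues - y) ** 2))
    print(f"{name}: RMSE = {rmse:.1f}, AIC = {fit.aic:.1f}")
```

In practice the comparison would be made on held-out data rather than in-sample, but the point of the sketch is simply that the two error structures are interchangeable at the API level and can be ranked by their predictive accuracy.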

Relevance:

60.00%

Publisher:

Abstract:

A new procedure for the classification of lower-case English-language characters is presented in this work. The character image is binarised and the binary image is then divided into sixteen smaller areas, called cells. Each cell is assigned a name depending upon the contour present in the cell and the occupancy of the image contour in the cell. A data-reduction procedure called filtering is adopted to eliminate undesirable redundant information, reducing the complexity of the subsequent processing steps. The filtered data is fed into a primitive extractor, where the primitives are extracted. Syntactic methods are employed for the classification of the character. A decision tree is used for the interaction of the various components in the scheme, such as primitive extraction and character recognition. A character is recognized by the primitive-by-primitive construction of its description. Open-ended inventories are used for including variants of the characters and for adding new members to the general class. Computer implementation of the proposal is discussed at the end using handwritten character samples. Results are analyzed and suggestions for future studies are made. The advantages of the proposal are discussed in detail.
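
A minimal sketch of the first two steps described above (binarisation and the split into sixteen cells), with a toy occupancy-based cell naming rule; the threshold, the image size and the naming rule itself are illustrative assumptions, not the paper's actual scheme.

```python
# Minimal sketch: binarise a character image and split it into a 4x4 grid of cells.
import numpy as np

def binarise(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Turn a grayscale character image into a 0/1 array (1 = ink)."""
    return (gray < threshold).astype(np.uint8)

def split_into_cells(binary: np.ndarray, grid: int = 4) -> list[np.ndarray]:
    """Split the binary image into grid x grid cells (sixteen by default)."""
    rows = np.array_split(binary, grid, axis=0)
    return [cell for row in rows for cell in np.array_split(row, grid, axis=1)]

def cell_name(cell: np.ndarray) -> str:
    """Toy labelling by contour occupancy: empty, sparse or dense."""
    occupancy = cell.mean()
    if occupancy == 0.0:
        return "empty"
    return "sparse" if occupancy < 0.25 else "dense"

# Example with a random 32x32 "character" image standing in for a scanned sample.
gray = np.random.default_rng(2).integers(0, 256, size=(32, 32), dtype=np.uint8)
cells = split_into_cells(binarise(gray))
print([cell_name(c) for c in cells])
```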

Relevance:

60.00%

Publisher:

Abstract:

We have implemented our new procedure for computing Franck-Condon factors utilizing vibrational configuration interaction based on a vibrational self-consistent field reference. Both Duschinsky rotations and anharmonic three-mode coupling are taken into account. Simulations of the first ionization band of ClO2 and C4H4O (furan) using up to quadruple excitations in treating anharmonicity are reported and analyzed. A developer version of the MIDASCPP code was employed to obtain the required anharmonic vibrational integrals and transition frequencies.

Relevance:

60.00%

Publisher:

Abstract:

The Lisbon Treaty introduced changes to the ordinary revision procedure of the EU Treaties. Article 48(3) TEU (Treaty on European Union) included the possibility of the President of the European Council convening a Convention to examine the draft revisions and to adopt, by consensus, a recommendation to be put before the Intergovernmental Conference, which would define the changes to be introduced into the Treaties. The present work seeks to clarify the principal characteristics of this new stage in the revision process of the EU Treaties. The main objective of this study is to set out how and why this new procedure evolved, how it works and what is new in what it brings to the revision process of the European Union.

Relevance:

60.00%

Publisher:

Abstract:

The proportional odds model provides a powerful tool for analysing ordered categorical data and for setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models that includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds, where the resulting statistics define the Mann-Whitney test. A strategy is described in which one of these models is selected in advance; this requires assumptions as strong as those underlying proportional odds but allows a choice among such models. The accuracy of the new procedure and its power are evaluated.
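
As a small illustration of the proportional-odds special case mentioned above, in which the efficient score statistic reduces to the Mann-Whitney test, the sketch below applies the Mann-Whitney test to two hypothetical trial arms scored on a 5-point ordinal scale; the data are invented and the sketch does not implement the paper's constrained odds machinery.

```python
# Minimal sketch: Mann-Whitney test on ordered categorical responses.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)
# Ordinal outcomes 1 (worst) .. 5 (best) for control and treatment arms.
control = rng.choice([1, 2, 3, 4, 5], size=80, p=[0.25, 0.30, 0.25, 0.15, 0.05])
treated = rng.choice([1, 2, 3, 4, 5], size=80, p=[0.10, 0.20, 0.30, 0.25, 0.15])

# The asymptotic method handles the heavy ties typical of ordinal data.
stat, p_value = mannwhitneyu(treated, control, alternative="two-sided", method="asymptotic")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```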

Relevance:

60.00%

Publisher:

Abstract:

Data assimilation aims to incorporate measured observations into a dynamical system model in order to produce accurate estimates of all the current (and future) state variables of the system. The optimal estimates minimize a variational principle and can be found using adjoint methods. The model equations are treated as strong constraints on the problem. In reality, the model does not represent the system behaviour exactly and errors arise due to lack of resolution and inaccuracies in physical parameters, boundary conditions and forcing terms. A technique for estimating systematic and time-correlated errors as part of the variational assimilation procedure is described here. The modified method determines a correction term that compensates for model error and leads to improved predictions of the system states. The technique is illustrated in two test cases. Applications to the 1-D nonlinear shallow water equations demonstrate the effectiveness of the new procedure.
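
A minimal sketch, in the spirit of the technique described above, of a variational assimilation in which a constant model-error correction term is estimated together with the initial state; the scalar toy model, the error weights and the synthetic observations are all assumptions, whereas the actual method addresses time-correlated errors in systems such as the 1-D nonlinear shallow water equations.

```python
# Minimal sketch: variational cost function with a model-error correction term.
import numpy as np
from scipy.optimize import minimize

n_steps, a_true, bias_true = 20, 0.95, 0.3
a_model = 0.95                        # imperfect model: same dynamics, missing bias

# Synthetic truth and noisy observations of every state.
truth = np.empty(n_steps + 1)
truth[0] = 2.0
for k in range(n_steps):
    truth[k + 1] = a_true * truth[k] + bias_true
rng = np.random.default_rng(4)
obs = truth + 0.1 * rng.standard_normal(truth.shape)

x_b, sigma_b, sigma_o = 0.0, 1.0, 0.1  # background value and error standard deviations

def cost(params):
    """J = background term + observation misfits; eta is the model-error correction."""
    x0, eta = params
    x = x0
    j = ((x0 - x_b) / sigma_b) ** 2 + ((x - obs[0]) / sigma_o) ** 2
    for k in range(n_steps):
        x = a_model * x + eta          # model propagated with the correction term
        j += ((x - obs[k + 1]) / sigma_o) ** 2
    return j

result = minimize(cost, x0=np.array([0.0, 0.0]))
print("estimated initial state and model-error correction:", result.x)
```

Because the correction term absorbs the systematic bias missing from the model, the analysed trajectory tracks the observations much more closely than a strong-constraint fit of the initial state alone would.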

Relevance:

60.00%

Publisher:

Abstract:

This article describes a new application of key psychological concepts from the area of sociometry to the selection of workers within organizations in which projects are developed. The project manager can use a new procedure to determine which individuals should be chosen from a given pool of resources and how to combine them into one or several simultaneous groups/projects in order to ensure the highest possible overall work efficiency from the standpoint of social interaction. The optimization is carried out by means of matrix calculations, performed with a computer or even manually, based on a number of new ratios generated ad hoc and composed from indices frequently used in sociometry.
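
As a rough illustration of the kind of matrix calculation described above, the sketch below selects a project team from a sociometric choice matrix by maximizing a simple mutual-choice cohesion ratio; the matrix, the ratio and the brute-force search are illustrative assumptions, not the article's ad hoc ratios.

```python
# Minimal sketch: pick the most cohesive team from a sociometric choice matrix.
from itertools import combinations
import numpy as np

# choices[i, j] = 1 if worker i names worker j as a preferred teammate.
choices = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
])

def cohesion(group: tuple[int, ...]) -> float:
    """Mutual choices inside the group divided by the number of possible pairs."""
    pairs = list(combinations(group, 2))
    mutual = sum(int(choices[i, j] and choices[j, i]) for i, j in pairs)
    return mutual / len(pairs)

# Pick the 3-person project team with the highest cohesion ratio.
best = max(combinations(range(choices.shape[0]), 3), key=cohesion)
print("selected team:", best, "cohesion:", cohesion(best))
```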

Relevance:

60.00%

Publisher:

Abstract:

Many of the controversies around the concept of homology rest on the subjectivity inherent in primary homology propositions. Dynamic homology partially solves this problem, but it has so far seen scant application outside the molecular domain. This is probably because morphological and behavioural characters are rich in properties, connections and qualities, leaving less room for conflicting character delimitations. Here we present a new method for the direct optimization of behavioural data, a method that relies on the richness of this kind of data to delimit the characters, and on dynamic procedures to establish character-state identity. We use between-species congruence in the data matrix and topological stability to choose the best cladogram. We test the methodology using sequences of predatory behaviour in a group of spiders that evolved the highly modified predatory technique of spitting glue onto prey. The cladogram recovered is fully compatible with previous analyses in the literature, and thus the method seems consistent. Besides the advantage of enhanced objectivity in character proposition, the new procedure allows the use of complex, context-dependent behavioural characters in an evolutionary framework, an important step towards the practical integration of the evolutionary and ecological perspectives on diversity. (C) The Willi Hennig Society 2010.