980 results for Resolution of problems
Conventional and Reciprocal Approaches to the Forward and Inverse Problems of Electroencephalography
Abstract:
The inverse problem of electroencephalography (EEG) is the localization of current sources in the brain using the scalp surface potentials generated by those sources. An inverse solution typically involves multiple calculations of scalp surface potentials, i.e. the EEG forward problem. To solve the forward problem, models are required both for the underlying source configuration, the source model, and for the surrounding tissues, the head model. This thesis treats two quite distinct approaches to solving the EEG forward and inverse problems using the boundary element method (BEM): the conventional approach and the reciprocal approach. The conventional approach to the forward problem computes the surface potentials starting from dipolar current sources. The reciprocal approach, by contrast, first determines the electric field at the dipole source sites when the surface electrodes are used to inject and withdraw a unit current. The scalar product of this electric field with the dipole sources then yields the surface potentials. The reciprocal approach promises a number of advantages over the conventional approach, including the potential for increased accuracy of the surface potentials and reduced computational requirements for inverse solutions. In this thesis, the BEM equations for the conventional and reciprocal approaches are developed using a common formulation, the weighted residual method. The numerical implementation of both approaches to the forward problem is described for a single-dipole source model. A three-concentric-sphere head model, for which analytical solutions are available, is used. The surface potentials are computed at the centroids or at the vertices of the BEM discretization elements.
The performance of the conventional and reciprocal approaches to the forward problem is evaluated for radial and tangential dipoles of varying eccentricity and for two widely differing values of skull conductivity. It is then determined whether the potential advantages of the reciprocal approach suggested by the forward-problem simulations can be exploited to yield more accurate inverse solutions. Single-dipole inverse solutions are obtained using simplex minimization for both the conventional and reciprocal approaches, each with centroid and vertex variants. Again, the numerical simulations are performed on a three-concentric-sphere model for radial and tangential dipoles of varying eccentricity. The accuracy of the inverse solutions of the two approaches is compared for the two different skull conductivities, and their relative sensitivities to skull-conductivity errors and to noise are evaluated. While the conventional vertex approach gives the most accurate forward solutions for a presumably more realistic skull conductivity, both the conventional and reciprocal approaches produce large errors in the scalp potentials for highly eccentric dipoles. The reciprocal approaches show the least variation in forward-solution accuracy across the different skull-conductivity values. In terms of single-dipole inverse solutions, the conventional and reciprocal approaches are of comparable accuracy. Localization errors are small, even for highly eccentric dipoles that produce large errors in the scalp potentials, owing to the nonlinear nature of single-dipole inverse solutions. Both approaches proved equally robust to skull-conductivity errors in the presence of noise.
Finally, a more realistic head model is obtained using magnetic resonance images (MRI) from which the scalp, skull and brain/cerebrospinal fluid (CSF) surfaces are extracted. The two approaches are validated on this type of model using real somatosensory evoked potentials recorded following median-nerve stimulation in healthy subjects. The accuracy of the inverse solutions for the conventional and reciprocal approaches and their variants, compared against anatomical sites known from MRI, is again evaluated for the two different skull conductivities. Their advantages and disadvantages, including their computational requirements, are also assessed. Once again, the conventional and reciprocal approaches produce small dipole-position errors. Indeed, the position errors of single-dipole inverse solutions are inherently robust to inaccuracies in the forward solutions, but depend on the superimposed activity of other neural sources. Contrary to expectations, the reciprocal approaches do not improve dipole-position accuracy relative to the conventional approaches. However, reduced computational requirements in time and storage are the principal advantages of the reciprocal approaches. This type of localization is potentially useful in the planning of neurosurgical interventions, for example in patients with refractory focal epilepsy who have often already undergone EEG and MRI.
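The reciprocity relation at the heart of the approach described above can be sketched numerically (all values below are made up, purely for illustration): if E is the electric field at the dipole site produced by a unit current injected and withdrawn at a pair of scalp electrodes, the potential difference that a dipole p generates between those same electrodes is the scalar product of E with p.

```python
import numpy as np

def scalp_potential_reciprocal(E_at_source, dipole_moment):
    """Electrode-pair potential difference obtained by reciprocity: E . p."""
    return float(np.dot(E_at_source, dipole_moment))

# Illustrative values (arbitrary units, not from the thesis)
E = np.array([0.2, -0.1, 0.05])   # field at the dipole site for unit current
p = np.array([1e-8, 0.0, 2e-8])   # dipole moment
v = scalp_potential_reciprocal(E, p)   # -> 3e-9
```

In an inverse solution this is the computational win: the field E is computed once per electrode pair, after which the forward potential for any candidate dipole is a single dot product.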
Abstract:
Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts of the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extent of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5 and 15 vertical levels covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat, release of latent heat, etc., in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre's model.
Abstract:
Introduction Jatropha gossypifolia has been used quite extensively in traditional medicine for the treatment of several diseases in South America and Africa. This medicinal plant has therapeutic potential as a phytomedicine, and the establishment of innovative analytical methods to characterise its active components is therefore crucial to the future development of a quality product. Objective To enhance the chromatographic resolution of HPLC-UV-diode-array detector (DAD) experiments by applying chemometric tools. Methods Crude leaf extracts from J. gossypifolia were analysed by HPLC-DAD. A chromatographic band deconvolution method was designed and applied using interval multivariate curve resolution by alternating least squares (MCR-ALS). Results The MCR-ALS method allowed the deconvolution of up to 117% more bands compared with the original HPLC-DAD experiments, even in regions where the UV spectra showed high similarity. The method assisted in the dereplication of three pairs of C-glycosylflavone isomers: vitexin/isovitexin, orientin/homorientin and schaftoside/isoschaftoside. Conclusion The MCR-ALS method is shown to be a powerful tool for solving problems of chromatographic band overlapping in complex mixtures such as natural crude samples. Copyright © 2013 John Wiley & Sons, Ltd.
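The alternating-least-squares core of MCR-ALS can be sketched on synthetic data (all values below are invented, not from the study): a data matrix D of two overlapping chromatographic bands is modelled as D ≈ C Sᵀ, with non-negative concentration profiles C and spectra S refit in alternation until the fit residual is small.

```python
import numpy as np

# Synthetic HPLC-DAD-like data: 50 time points x 30 wavelengths,
# two overlapping elution bands with distinct (random) spectra.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
C_true = np.stack([np.exp(-((t - 0.4) / 0.08) ** 2),
                   np.exp(-((t - 0.5) / 0.08) ** 2)], axis=1)
S_true = np.abs(rng.normal(size=(30, 2)))
D = C_true @ S_true.T

# MCR-ALS loop: alternate least-squares fits of spectra and concentrations,
# clipping to zero to enforce the non-negativity constraint.
C = np.abs(rng.normal(size=C_true.shape))          # initial guess
for _ in range(200):
    S = np.linalg.lstsq(C, D, rcond=None)[0].T     # spectra given C
    S = np.clip(S, 0, None)
    C = np.linalg.lstsq(S, D.T, rcond=None)[0].T   # concentrations given S
    C = np.clip(C, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
```

On noiseless rank-2 data like this the relative residual drops to near zero, though the recovered profiles are subject to the usual rotational ambiguity of bilinear models.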
Abstract:
Many combinatorial problems coming from the real world may not have a clear and well-defined structure, typically being muddied by side constraints or being composed of two or more sub-problems that are usually not disjoint. Such problems are not suitable for pure approaches based on a single programming paradigm, because a paradigm that can effectively handle one problem characteristic may behave inefficiently when facing others. In these cases, modelling the problem using different programming techniques, trying to "take the best" from each technique, can produce solvers that largely dominate pure approaches. We demonstrate the effectiveness of hybridization and discuss different hybridization techniques by analyzing two classes of problems with particular structures, exploiting Constraint Programming and Integer Linear Programming solving tools, with Algorithm Portfolios and Logic-Based Benders Decomposition as integration and hybridization frameworks.
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds such as tubular features, which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III for an alternative. Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge regarding virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space.
Similar to the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions of up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70 and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
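The tabu-search ingredient used throughout the manuscripts above can be sketched on a toy problem (this is a generic illustration of the technique, not the dissertation's implementation): recently visited solutions are kept on a short "tabu" list and forbidden as moves, which forces the search past local optima instead of cycling back into them.

```python
# Minimal tabu search on a one-dimensional integer placement problem.
# The hybrid in the manuscripts couples this with a genetic algorithm;
# only the tabu ingredient is shown here.

def tabu_search(score, start, steps=100, tabu_size=5):
    current, best = start, start
    tabu = [start]                               # short-term memory
    for _ in range(steps):
        # candidate moves: small shifts that are not on the tabu list
        candidates = [current + d for d in (-2, -1, 1, 2)
                      if current + d not in tabu]
        if not candidates:
            break
        current = max(candidates, key=score)     # best admissible neighbour
        tabu = (tabu + [current])[-tabu_size:]   # forget old entries
        if score(current) > score(best):
            best = current
    return best

# Toy objective: a local optimum near 0 hides the global optimum at 10.
f = lambda x: -abs(x - 10) if x > 3 else -abs(x) - 1
best = tabu_search(f, 0)   # escapes the local optimum and finds 10
```

A plain hill-climber started at 0 would stall at the local optimum; the tabu list makes the intermediate, worse positions admissible exactly once, letting the search cross the valley.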
Abstract:
This thesis is concerned with various aspects of Air Pollution due to smell, the impact it has on communities exposed to it, the means by which it may be controlled, and the manner in which a local authority may investigate the problems it causes. The approach is a practical one drawing on examples occurring within a Local Authority's experience, and for that reason the research is anecdotal and is not a comprehensive treatise on the full range of options available. Odour Pollution is not yet a well-organised discipline and might be considered esoteric, as it is necessary to incorporate elements of both science and the humanities. It has been necessary to range widely across a number of aspects of the subject, so that discussion is often restricted, but many references have been included to enable a reader to pursue a particular point in greater depth. In a `fuzzy' subject there is often a yawning gap separating theory and practice, and case studies have therefore been used to illustrate the interplay of various disciplines in the resolution of a problem. The essence of any science is observation and measurement. Observations have been made of the spread of odour pollution through a community, and of relevant meteorological data, so that a mathematical model could be constructed and its predictions checked. The model has been used to explore the results of some options for odour control. Measurements of odour perception and human behaviour seldom have the precision and accuracy of the physical sciences. However, methods of social research enabled individual perception of odour pollution to be quantified and an insight to be gained into the reaction of a community exposed to it. Odours have four attributes that can be measured and that together provide a complete description of their perception.
No objective techniques of measurement have yet been developed, but in this thesis simple, structured procedures of subjective assessment have been improvised; their use enabled the functioning of the components of an odour control system to be assessed. Such data enabled the action of the system to be communicated in terms that are understood by a non-specialist audience.
Abstract:
Personal selling and sales management play a critical role in the short- and long-term success of the firm, and have thus received substantial academic interest since the 1970s. Sales research has examined the role of the sales manager in some depth, defining a number of key technical and interpersonal roles which sales managers have in influencing sales force effectiveness. However, one aspect of sales management which appears to remain unexplored is that of their resolution of salesperson-related problems. This study represents the first attempt to address this gap by reporting on the conceptual and empirical development of an instrument designed to measure sales managers' problem resolution styles. A comprehensive literature review and qualitative research study identified three key constructs relating to sales managers' problem resolution styles. The three constructs identified were termed: sales manager willingness to respond, sales manager caring, and sales manager aggressiveness. Building on this, existing literature was used to develop a conceptual model of salesperson-specific consequences of the three problem resolution style constructs. The quantitative phase of the study consisted of a mail survey of UK salespeople, achieving a total sample of 140 fully usable responses. Rigorous statistical assessment of the sales manager problem resolution style measures was undertaken, and construct validity examined. Following this, the conceptual model was tested using latent variable path analysis. The results for the model were encouraging overall, and also with regard to the individual hypotheses. Sales manager problem resolution styles were each found to have significant impacts on the salesperson-specific variables of role ambiguity, emotional exhaustion, job satisfaction, organisational commitment and organisational citizenship behaviours.
The findings, theoretical and managerial implications, limitations and directions for future research are discussed.
Abstract:
Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast-growing analytical methodology in the life sciences. This method provides a multitude of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12-160 µm. Selected applications in medical research require an improved lateral resolution of laser-induced mass spectrometric techniques at the low-micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology - laser microdissection combined with inductively coupled plasma mass spectrometry (LMD ICP-MS) - to obtain elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of uranium-stained and native mouse brain tissue samples are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.
Abstract:
Objective To evaluate the efficiency of pharmaceutical care in the control of clinical parameters, such as fasting glycaemia and glycosylated haemoglobin, in patients with Type 2 Diabetes mellitus. Setting This study was conducted at the Training and Community Health Centre of the College of Medicine of Ribeirao Preto, University of Sao Paulo, Brazil. Methods A prospective, experimental study was conducted with 71 participants divided into two groups: (i) the pharmaceutical care group (n=40) and (ii) the control group (n=31). Patients were randomly assigned to these groups and monitored for 12 months. Main outcome measure Values for fasting glycaemia and glycosylated haemoglobin were collected. Results Mean values of fasting glycaemia in the pharmaceutical care group were significantly reduced, whilst only a small reduction was detected in the control group over the same period. A significant reduction in the levels of glycosylated haemoglobin was detected in patients in the pharmaceutical care group, whereas an average increase was observed in the control group. Furthermore, the follow-up of the intervention group by a pharmacist contributed to the resolution of 62.7% of the 142 drug therapy problems identified. Conclusion In Brazil, the information provided by a pharmacist to patients with Type 2 Diabetes mellitus increases compliance with treatment, solving or reducing drug therapy problems and, consequently, improving glycaemic control.
Abstract:
Bulk density of undisturbed soil samples can be measured using computed tomography (CT) techniques with a spatial resolution of about 1 mm. However, this technique may not be readily accessible. On the other hand, x-ray radiographs have only been considered as qualitative images to describe morphological features. A calibration procedure was set up to generate two-dimensional, high-resolution bulk density images from x-ray radiographs made with a conventional x-ray diffraction apparatus. Test bricks were made to assess the accuracy of the method. Slices of impregnated soil samples were made using hardsetting seedbeds that had been gamma-scanned at 5-mm depth increments in a previous study. The calibration procedure involved three stages: (i) calibration of the image grey levels in terms of glass thickness using a staircase made from glass cover slips, (ii) measurement of the ratio between the soil and resin mass attenuation coefficients and the glass mass attenuation coefficient, using compacted bricks of known thickness and bulk density, and (iii) image correction accounting for the heterogeneity of the irradiation field. The procedure was simple and rapid, and the equipment was easily accessible. The accuracy of the bulk density determination was good (mean relative error 0.015). The bulk density images showed good spatial resolution, so that many structural details could be observed. The depth functions were consistent with both the global shrinkage and the gamma probe data previously obtained. The suggested method could be easily applied to the new fuzzy-set approach to soil structure, which requires the generation of bulk density images. It would also be an invaluable tool for studies requiring high-resolution bulk density measurement, such as studies on soil surface crusts.
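The calibration logic described above can be sketched with Beer-Lambert attenuation (all numbers below are assumed for illustration and are not from the study): a grey level calibrated as an "equivalent glass thickness" maps to soil bulk density by equating the attenuation of the soil slice with that of the glass staircase.

```python
# X-ray transmission: I = I0 * exp(-mu_m * rho * t), with mu_m the mass
# attenuation coefficient, rho the density and t the thickness.
# Equal grey level => equal attenuation:
#   mu_soil * rho_soil * t_sample == mu_glass * rho_glass * t_glass_equiv

def bulk_density(t_glass_equiv, t_sample, mu_glass, rho_glass, mu_soil):
    """Soil bulk density producing the same attenuation as t_glass_equiv."""
    return mu_glass * rho_glass * t_glass_equiv / (mu_soil * t_sample)

# Assumed example values (mm, g/cm3, compatible attenuation units)
rho_soil = bulk_density(t_glass_equiv=0.9, t_sample=1.5,
                        mu_glass=0.8, rho_glass=2.5, mu_soil=0.75)
# 0.8 * 2.5 * 0.9 / (0.75 * 1.5) = 1.6 g/cm3
```

Applied pixel by pixel, this turns the calibrated radiograph grey levels into the two-dimensional bulk density image the abstract describes.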
Abstract:
The moving finite element collocation method proposed by Kill et al. (1995, Chem. Engng Sci. 51(4), 2793-2799) for the solution of problems with steep gradients is further developed to solve transient problems arising in the field of adsorption. The technique is applied to a model of adsorption in solids with bidisperse pore structures. Numerical solutions were found to match the analytical solution where one exists (i.e. when the adsorption isotherm is linear). The method is simple yet sufficiently accurate for use in adsorption problems where global collocation methods fail. (C) 1998 Elsevier Science Ltd. All rights reserved.
Abstract:
Conotoxins are valuable probes of receptors and ion channels because of their small size and highly selective activity. alpha-Conotoxin EpI, a 16-residue peptide from the mollusk-hunting Conus episcopatus, has the amino acid sequence GCCSDPRCNMNNPDY(SO3H)C-NH2 and appears to be an extremely potent and selective inhibitor of the alpha 3 beta 2 and alpha 3 beta 4 neuronal subtypes of the nicotinic acetylcholine receptor (nAChR). The desulfated form of EpI ([Tyr(15)]EpI) has a potency and selectivity for the nAChR similar to those of EpI. Here we describe the crystal structure of [Tyr(15)]EpI solved at a resolution of 1.1 Angstrom using SnB. The asymmetric unit has a total of 284 non-hydrogen atoms, making this one of the largest structures solved de novo by direct methods. The [Tyr(15)]EpI structure brings to six the number of alpha-conotoxin structures that have been determined to date. Four of these, [Tyr(15)]EpI, PnIA, PnIB, and MII, have an alpha 4/7 cysteine framework and are selective for the neuronal subtype of the nAChR. The structure of [Tyr(15)]EpI has the same backbone fold as the other alpha 4/7-conotoxin structures, supporting the notion that this conotoxin cysteine framework and spacing give rise to a conserved fold. The surface charge distribution of [Tyr(15)]EpI is similar to that of PnIA and PnIB but is likely to be different from that of MII, suggesting that [Tyr(15)]EpI and MII may have different binding modes for the same receptor subtype.
Abstract:
An analysis of the relationships of the major arthropod groups was undertaken using mitochondrial genome data to examine the hypotheses that Hexapoda is polyphyletic and that Collembola is more closely related to branchiopod crustaceans than to insects. We sought to examine the sensitivity of this relationship to outgroup choice, data treatment, gene choice and the optimality criteria used in the phylogenetic analysis of mitochondrial genome data. Additionally, we sequenced the mitochondrial genome of an archaeognathan, Nesomachilis australica, to improve taxon selection in the apterygote insects, a group poorly represented in previous mitochondrial phylogenies. The sister group of the Collembola was rarely resolved in our analyses with a significant level of support. The use of different outgroups (myriapods, nematodes, or annelids + mollusks) resulted in many different placements of Collembola. The way in which the dataset was coded for analysis (DNA, DNA with the exclusion of third codon positions, and as amino acids) also had marked effects on tree topology. We found that nodal support was spread evenly throughout the 13 mitochondrial genes, and the exclusion of genes resulted in significantly less resolution in the inferred trees. Optimality criteria had a much smaller effect on topology than the preceding factors; parsimony and Bayesian trees for a given data set and treatment were quite similar. We therefore conclude that the relationships of the extant arthropod groups as inferred from mitochondrial genomes are highly vulnerable to outgroup choice, data treatment and gene choice, and no consistent alternative hypothesis of Collembola's relationships is supported. Pending the resolution of these identified problems with the application of mitogenomic data to basal arthropod relationships, it is difficult to justify the rejection of hexapod monophyly, which is well supported on morphological grounds. (c) The Willi Hennig Society 2004.
Abstract:
The aim of this study was to evaluate the mid-term outcomes of laparoscopic ileal interposition associated with a sleeve gastrectomy (LII-SG) for the treatment of morbid obesity. The procedure was performed in 120 patients: 71 women and 49 men with a mean age of 41.4 years. Mean body mass index (BMI) was 43.4 +/- 4.2 kg/m(2). Patients had to meet the requirements of the 1991 NIH conference criteria for bariatric operations. Associated comorbidities were observed in all patients, including dyslipidemia in 51.7%, hypertension in 35.8%, type 2 diabetes in 15.8%, degenerative joint disease in 55%, gastroesophageal reflux disease in 36.7%, sleep apnea in 10%, and cardiovascular problems in 5.8%. Mean follow-up was 38.4 +/- 10.2 months, range 25.2-61.1. There was no conversion to open surgery and no operative mortality. Early major complications were diagnosed in five patients (4.2%). Postoperatively, 118 patients were evaluated. Late major complications were observed in seven patients (5.9%). Reoperations were performed in six (5.1%). Mean postoperative BMI was 25.7 +/- 3.17 kg/m(2), and 86.4% of patients were no longer obese. Mean %EWL was 84.5 +/- 19.5%. Hypertension was resolved in 88.4% of the patients, dyslipidemia in 82.3%, and T2DM in 84.2%. The LII-SG provided adequate weight loss and resolution of associated diseases at mid-term evaluation, with acceptable morbidity and no operative mortality. It seems that chronic ileal brake activation determined sustained reduced food intake and increased satiety over time. LII-SG could be regularly used as a surgical alternative for the treatment of morbid obesity.
Abstract:
Fundoplication has been commonly performed in neurologically impaired and normal children with complicated gastroesophageal reflux disease. The relationship between gastroesophageal reflux disease and respiratory diseases is still unclear. We aimed to compare the results of open and laparoscopic procedures, as well as the impact of fundoplication on digestive and respiratory symptoms. From January 2000 to June 2007, 151 children underwent Nissen fundoplication. Data were prospectively collected regarding age at surgery, presence of neurologic handicap, symptoms related to reflux (digestive or respiratory, including recurrent lung infections and reactive airways disease), surgical approach, concomitant procedures, complications, and results. Mean age was 6 years and 9 months. Eighty-two children (54.3%) had neurological handicaps. The surgical approach was laparoscopy in 118 cases and laparotomy in 33. Dysphagia occurred in 23 patients submitted to the laparoscopic procedure and in none submitted to the open procedure (P = 0.01). A total of 86.6% of patients with digestive symptoms had complete resolution or significant improvement of their problems after surgery. A total of 62.2% of children with recurrent lung infections showed a reduction in the frequency of pneumonias. Only 45.2% of patients with reactive airway disease had relief from bronchospasm episodes after fundoplication. The comparisons demonstrated that Nissen fundoplication was more effective for the resolution of digestive symptoms than of respiratory manifestations (P = 0.04). Open and laparoscopic fundoplication are safe procedures with acceptable complication rates, and the results of surgery are better for digestive than for respiratory symptoms.