911 results for ENGINEERING ANALYSIS
Abstract:
This work investigates the performance of cardiorespiratory analysis in detecting periodic breathing (PB) in chest wall recordings from mountaineers climbing to extreme altitude. The breathing patterns of 34 mountaineers were monitored unobtrusively by inductance plethysmography, ECG and pulse oximetry using a portable recorder during climbs at altitudes between 4497 and 7546 m on Mt. Muztagh Ata. The minute ventilation (VE) and heart rate (HR) signals were studied to identify visually scored PB by applying time-varying spectral, coherence and entropy analysis. In 411 climbing periods, 30-120 min in duration, high values of the mean power (MP(VE)) and slope (MSlope(VE)) of the modulation frequency band of VE accurately identified PB, with areas under the ROC curve of 88% and 89%, respectively. Prolonged stay at altitude was associated with an increase in PB. During PB episodes, higher peak power of ventilatory (MP(VE)) and cardiac (MP(LF)(HR)) oscillations and higher cardiorespiratory coherence (MP(LF)(Coher)), but reduced ventilation entropy (SampEn(VE)), were observed. Therefore, the characterization of cardiorespiratory dynamics by the analysis of VE and HR signals accurately identifies PB and the effects of altitude acclimatization, providing promising tools for investigating the physiologic effects of environmental exposures and diseases.
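As a rough illustration of the entropy measure mentioned above, the sketch below implements sample entropy (SampEn) for a ventilation-like signal. This is a generic textbook-style implementation with assumed defaults (m = 2, tolerance 0.2·SD), not the authors' exact pipeline; lower values indicate the more regular waveform expected during PB episodes.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r) of a 1-D signal with tolerance r = r_frac * std(x).

    Counts pairs of templates of length m and m + 1 that match within r
    (Chebyshev distance), excluding self-matches, and returns
    -ln(matches_{m+1} / matches_m). Lower values mean more regularity.
    """
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    n = len(x)

    def matches(length):
        tmpl = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(tmpl) - 1):
            dist = np.max(np.abs(tmpl[i + 1:] - tmpl[i]), axis=1)
            total += int(np.sum(dist <= r))
        return total

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -np.log(a / b)

# Illustrative comparison: a PB-like oscillation vs. irregular breathing.
rng = np.random.default_rng(0)
t = np.arange(400)
ve_periodic = np.sin(2 * np.pi * t / 25)
ve_irregular = rng.normal(size=400)
# sample_entropy(ve_periodic) is much lower than sample_entropy(ve_irregular)
```

This matches the abstract's finding qualitatively: regular PB oscillations reduce SampEn(VE) relative to irregular breathing.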
Abstract:
Periacetabular Osteotomy (PAO) is a joint preserving surgical intervention intended to increase femoral head coverage and thereby to improve stability in young patients with hip dysplasia. Previously, we developed a CT-based, computer-assisted program for PAO diagnosis and planning, which allows for quantifying the 3D acetabular morphology with parameters such as acetabular version, inclination, lateral center edge (LCE) angle and femoral head coverage ratio (CO). In order to verify the hypothesis that our morphology-based planning strategy can improve biomechanical characteristics of dysplastic hips, we developed a 3D finite element model based on patient-specific geometry to predict cartilage contact stress change before and after morphology-based planning. Our experimental results demonstrated that the morphology-based planning strategy could reduce cartilage contact pressures and at the same time increase contact areas. In conclusion, our computer-assisted system is an efficient tool for PAO planning.
Abstract:
BACKGROUND: The robotics-assisted tilt table (RATT), including actuators for tilting and cyclical leg movement, is used for rehabilitation of severely disabled neurological patients. Following further engineering development of the system, i.e. the addition of force sensors and visual bio-feedback, patients can actively participate in exercise testing and training on the device. Peak cardiopulmonary performance parameters were previously investigated, but it is also important to compare submaximal parameters with standard devices. The aim of this study was to evaluate the feasibility of the RATT for estimation of submaximal exercise thresholds by comparison with a cycle ergometer and a treadmill. METHODS: 17 healthy subjects each performed six maximal individualized incremental exercise tests in randomized order, with two tests on each of the three exercise modalities. The ventilatory anaerobic threshold (VAT) and respiratory compensation point (RCP) were determined from breath-by-breath data. RESULTS: VAT and RCP on the RATT were lower than on the cycle ergometer and the treadmill: oxygen uptake (V'O2) at VAT was [mean (SD)] 1.2 (0.3), 1.5 (0.4) and 1.6 (0.5) L/min, respectively (p < 0.001); V'O2 at RCP was 1.7 (0.4), 2.3 (0.8) and 2.6 (0.9) L/min, respectively (p = 0.001). High correlations for VAT and RCP were found between the RATT and the cycle ergometer and between the RATT and the treadmill (R in the range 0.69-0.80). VAT and RCP demonstrated excellent test-retest reliability for all three devices (ICC from 0.81 to 0.98). Mean differences between the test and retest values on each device were close to zero. The ventilatory equivalent for O2 at VAT was similar for the RATT and the cycle ergometer, and both were higher than for the treadmill. The ventilatory equivalent for CO2 at RCP was similar for all devices. Ventilatory equivalent parameters demonstrated fair-to-excellent reliability and repeatability.
CONCLUSIONS: It is feasible to use the RATT for estimation of submaximal exercise thresholds: VAT and RCP on the RATT were lower than on the cycle ergometer and the treadmill, but there were high correlations between the RATT and both standard devices. Repeatability and test-retest reliability of all submaximal threshold parameters from the RATT were comparable to those of the standard devices.
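For readers unfamiliar with the reliability statistic used above, a minimal sketch of a one-way random-effects intraclass correlation, ICC(1,1), is shown below. The study likely used a two-way ICC model, which differs slightly; this version only illustrates how test-retest agreement across the 17 subjects and two trials could be quantified.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for test-retest data.

    ratings: (n_subjects, k_trials) array. ICC approaches 1 when
    between-subject variance dominates trial-to-trial variance.
    """
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    subj_means = y.mean(axis=1)
    # mean squares between subjects and within subjects (one-way ANOVA)
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((y - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

With test and retest VAT values that differ only slightly within each subject, this returns values near 1, consistent with the ICC range of 0.81-0.98 reported above.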
Abstract:
Through the correct implementation of lean manufacturing methods, a company can greatly improve its business. Over a period of three months at TTM Technologies, I utilized my knowledge to fix existing problems and streamline production. In addition, other trouble areas in their production process were discovered and proper lean methods were used to address them. TTM Technologies saw many changes in the right direction over this time period.
Abstract:
Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes to use a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function of recurrent events data and the associated risk factors. This method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One of the advantages of using a nonhomogeneous Yule process for recurrent events is the assumption that the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. One example, comparing recurrent myocardial infarction events between two distinct populations (Mexican-Americans and Non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
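A minimal simulation helps make the proportional-intensity idea concrete. The sketch below assumes a constant baseline λ_0 for simplicity (the thesis allows a time-varying λ_0(t)) and generates one subject's event times: after n events the next gap is exponential with rate (n + 1) · λ_0 · exp(x′β), so the recurrence rate grows with the event count, as in a Yule process.

```python
import numpy as np

def simulate_yule_recurrent(lam0, beta, x, t_max, rng):
    """Simulate one subject's recurrent-event times on [0, t_max].

    Given N(t) = n events so far, the waiting time to the next event is
    exponential with rate (n + 1) * lam0 * exp(x @ beta) -- a Yule-type
    process with covariate effect exp(x'beta). Constant baseline assumed.
    """
    rate_scale = lam0 * np.exp(np.dot(x, beta))
    times, t, n = [], 0.0, 0
    while True:
        t += rng.exponential(1.0 / ((n + 1) * rate_scale))
        if t > t_max:
            break
        times.append(t)
        n += 1
    return times
```

Averaged over many simulated subjects, a larger covariate value (with β > 0) yields more events per follow-up window, which is the behavior the regression model is designed to capture.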
Abstract:
Improvements in the analysis of microarray images are critical for accurately quantifying gene expression levels. The acquisition of accurate spot intensities directly influences the results and interpretation of statistical analyses. This dissertation discusses the implementation of a novel approach to the analysis of cDNA microarray images. We use a stellar photometric model, the Moffat function, to quantify microarray spots from nylon microarray images. The inherent flexibility of the Moffat shape model makes it ideal for quantifying microarray spots. We apply our novel approach to a Wilms' tumor microarray study and compare our results with a fixed-circle segmentation approach for spot quantification. Our results suggest that different spot feature extraction methods can have an impact on the ability of statistical methods to identify differentially expressed genes. We also used the Moffat function to simulate a series of microarray images under various experimental conditions. These simulations were used to validate the performance of various statistical methods for identifying differentially expressed genes. Our simulation results indicate that tests taking into account the dependency between mean spot intensity and variance estimation, such as the smoothened t-test, can better identify differentially expressed genes, especially when the number of replicates and mean fold change are low. The analysis of the simulations also showed that overall, a rank sum test (Mann-Whitney) performed well at identifying differentially expressed genes. Previous work has suggested the strengths of nonparametric approaches for identifying differentially expressed genes. We also show that multivariate approaches, such as hierarchical and k-means cluster analysis along with principal components analysis, are only effective at classifying samples when replicate numbers and mean fold change are high. 
Finally, we show how our stellar shape model approach can be extended to the analysis of 2D-gel images by adapting the Moffat function to take into account the elliptical nature of spots in such images. Our results indicate that stellar shape models offer a previously unexplored approach for the quantification of 2D-gel spots.
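The Moffat profile at the heart of this approach is simple to write down. The sketch below gives the circular form used in stellar photometry and an elliptical variant in the spirit of the 2D-gel extension described above; parameter names and values are illustrative, not taken from the dissertation.

```python
import numpy as np

def moffat(x, y, amp, x0, y0, alpha, beta):
    """Circular Moffat profile: I(r) = amp * (1 + (r/alpha)^2)^(-beta).

    alpha sets the core width and beta the wing steepness; large beta
    approaches a Gaussian, small beta gives heavier wings.
    """
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return amp * (1.0 + r2 / alpha ** 2) ** (-beta)

def moffat_elliptical(x, y, amp, x0, y0, ax, ay, beta):
    """Elliptical variant with separate half-widths along each axis."""
    r2 = ((x - x0) / ax) ** 2 + ((y - y0) / ay) ** 2
    return amp * (1.0 + r2) ** (-beta)

# Render a synthetic spot on a 21x21 pixel grid, as one might for the
# simulated microarray images described above.
yy, xx = np.mgrid[0:21, 0:21]
spot = moffat(xx, yy, amp=100.0, x0=10, y0=10, alpha=3.0, beta=2.5)
```

Fitting these profiles to observed spots (e.g. by nonlinear least squares) yields the amplitude and width parameters from which spot intensities are quantified.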
Abstract:
Decorin, a dermatan/chondroitin sulfate proteoglycan, is ubiquitously distributed in the extracellular matrix (ECM) of mammals. Decorin belongs to the small leucine rich proteoglycan (SLRP) family, a proteoglycan family characterized by a core protein dominated by Leucine Rich Repeat motifs. The decorin core protein appears to mediate the binding of decorin to ECM molecules, such as collagens and fibronectin. It is believed that the interactions of decorin with these ECM molecules contribute to the regulation of ECM assembly, cell adhesion, and cell proliferation. These basic biological processes play critical roles during embryonic development and wound healing and are altered in pathological conditions such as fibrosis and tumorigenesis. In this dissertation, we discover that the decorin core protein can bind Zn2+ ions with high affinity. Zinc is an essential trace element in mammals. Zn2+ ions play a catalytic role in the activation of many enzymes and a structural role in the stabilization of protein conformation. By examining purified recombinant decorin and its core protein fragments for Zn2+ binding activity using Zn2+-chelating column chromatography and Zn2+-equilibrium dialysis approaches, we have located the Zn2+ binding domain to the N-terminal sequence of the decorin core protein. The decorin N-terminal domain appears to contain two Zn2+ binding sites with similarly high binding affinity. The sequence of the decorin N-terminal domain does not resemble any other reported zinc-binding motifs and, therefore, represents a novel Zn2+ binding motif. By investigating the influence of Zn2+ ions on decorin binding interactions, we found a novel Zn2+ dependent interaction with fibrinogen, the major plasma protein in blood clots. Furthermore, a recombinant peptide (MD4) consisting of a 41 amino acid sequence of the mouse decorin N-terminal domain can prolong thrombin induced fibrinogen/fibrin clot formation.
This suggests that in the presence of Zn2+ the decorin N-terminal domain has anticoagulant activity. The changed Zn2+-binding activities of the truncated MD4 peptides and of mutant peptides generated by site-directed mutagenesis revealed that the functional MD4 peptide might contain both a structural zinc-binding site in the cysteine cluster region and a catalytic zinc site that could be created by the flanking sequences of the cysteine cluster region. A model of a loop-like structure for the MD4 peptide is proposed.
Abstract:
This is the tenth in a series of symposia devoted to talks by students on their biochemical engineering research. The first, third, fifth, and ninth were at Kansas State University in Manhattan, the second and fourth were at the University of Nebraska–Lincoln, the sixth was in Kansas City in conjunction with the 81st American Institute of Chemical Engineers National Meeting, the seventh was at Iowa State University in Ames, and the eighth was held at the University of Missouri–Columbia.
Contents:
"Combined Autohydrolysis-Organosolv Pretreatment of Lignocellulosic Materials," Robert A. Lewis, Colorado State University
"An Investigation of Cellulase Activity Assays," Minhhuong Nguyen, University of Missouri–Columbia
"Action Pattern of a Xylobiohydrolase from Aspergillus niger," Mary M. Frederick, Iowa State University
"Estimation of Heats of Combustion of Biomass from Elemental Analysis Using Available Electron Concepts," Snehal A. Patel, Kansas State University
"Design of a Wheat Straw to Ethanol Conversion Facility," Michael M. Meagher, Colorado State University
"Effects of Salt, Heat, and Physical Form on the Fermentation of Bananas," Carl Drewel, University of Missouri–Columbia
"Gas Hold-up in the Downflow Section of a Split Cylinder Airlift Column," Vasanti Deshpande, Kansas State University
"Measurement of Michaelis Constants for Soluble and Immobilized Glucoamylase," Robert A. Lesch, Iowa State University
"Kinetics of Alkaline Oxidation and Degradation of Sugars," Alfred R. Fratzke, Iowa State University
"Stability of Cereal Protein During Microbial Growth on Grain Dust," Bamidele O. Solomon, Kansas State University
Abstract:
This work represents the proceedings of the fifteenth symposium, which convened at Colorado State University on May 24, 1985. The two-day meeting was scheduled one month later than usual, i.e., after the spring semester, so that travelers from the Midwest (Iowa State University, Kansas State University and University of Missouri) could enjoy the unique mountain setting provided at Pingree Park. The background of the photograph on the cover depicts the beauty of the area.
Contents:
Greg Sinton and S.M. Leo, KSU. Models for the Biodegradation of 2,4-D and Related Xenobiotic Compounds.
V. Bringi, CSU. Intrinsic Kinetics from a Novel Immobilized Cell CSTR.
Steve Birdsell, CU. Novel Microbial Separation Techniques.
Mark Smith, MU. Kinetic Characterization of Growth of E. coli on Glucose.
Michael M. Meagher, ISU. Kinetic Parameters of Di- and Trisaccharide Hydrolysis by Glucoamylase II.
G.T. Jones and A.K. Ghosh Hajra, KSU. Modeling and Simulation of Legume Nodules with Reactive Cores and Inert Shells.
S.A. Patel and C.H. Lee, KSU. Energetic Analysis and Liquid Circulation in an Airlift Fermenter.
Rod R. Fisher, ISU. The Effects of Mixing during Acid Addition of Fractionally Precipitated Protein.
Mark M. Paige, CSU. Fed-batch Fermentations of Clostridium acetobutylicum.
Michael K. Dowd, ISU. A Nonequilibrium Thermodynamic Description of the Variation of Contractile Velocity and Energy Use in Muscle.
David D. Drury, CSU. Analysis of Hollow Fiber Bioreactor Performance for Mammalian Cells by On-Line NMR.
H.Y. Lee, KSU. Process Analysis of Photosynthetic Continuous Culture Systems.
C.J. Wang, MU. Kinetic Considerations in Fermentation of Cheese Whey to Ethanol.
Abstract:
The nineteenth symposium was held at the University of Missouri–Columbia on April 22, 1989. A total of eighteen papers were scheduled for presentation, of which nine were in the poster session. In the end, fifteen papers were presented and sixteen were submitted for these proceedings. It was attended by 53 participants from five institutions. A sixth group (from Colorado State University) was kept from attending the symposium by mechanical problems on the road, and we missed them. Since they had worked hard on their presentations, I asked the CSU group to submit their papers for the proceedings, and I am happy that they did.
Contents:
Mathematical modelling of a flour milling system. K. Takahashi, Y. Chen, J. Hosokoschi, and L. T. Fan. Kansas State University
A novel solution to the problem of plasmid segregation in continuous bacterial fermentations. K. L. Henry, R. H. Davis, and A. L. Taylor. University of Colorado
Modelling of embryonic growth in avian and reptile eggs. C. L. Krause, R. C. Seagrave, and R. A. Ackerman. Iowa State University
Mathematical modeling of in situ biodegradation processes. J. C. Wu, L. T. Fan, and L. E. Erickson. Kansas State University
Effect of molecular changes on starch viscosity. C. H. Rosane and V. G. Murphy. Colorado State University
Analysis of two stage recombinant bacterial fermentations using a structured kinetic model. F. Miao and D. S. Kampala. University of Colorado
Lactic acid fermentation from enzyme-thinned starch by Lactobacillus amylovorus. P. S. Cheng, E. L. Iannotti, R. K. Bajpai, R. Mueller, and S. Yaeger. University of Missouri–Columbia
Solubilization of preoxidized Texas lignite by cell-free broths of Penicillium strains. R. Moolick, M. N. Karim, J. C. Linden, and B. L. Burback. Colorado State University
Separation of proteins from polyelectrolytes by ultrafiltration. A. G. Bazzano and C. E. Glatz. Iowa State University
Growth estimation and modelling of Rhizopus oligosporus in solid state fermentations. D.-H. Ryoo, V. G. Murphy, M. N. Karim, and R. P. Tengerdy. Colorado State University
Simulation of ethanol fermentations from sugars in cheese whey. C. J. Wang and R. K. Bajpai. University of Missouri–Columbia
Studies on protoplast fusion of B. licheniformis. B. Shi. Kansas State University
Cell separations of non-dividing and dividing yeasts using an inclined settler. C.-Y. Lee, R. H. Davis, and R. A. Sclafani. University of Colorado
Effect of serum upon local hydrodynamics within an airlift column. G. T. Jones, L. E. Erickson, and L. A. Glasgow. Kansas State University
Optimization of heterologous protein secretion in continuous culture. A. Chatterjee, W. F. Remirez, and R. H. Davis. University of Colorado
An improved model for lactic acid fermentation. P. Yeh, R. K. Bajpai, and E. L. Iannotti. University of Missouri–Columbia
Abstract:
The hydraulic piston coring device (HPC-15) allows recovery of deep ocean sediments with minimal disturbance. The device was used during Leg 72 of the Deep Sea Drilling Project (DSDP) aboard the Glomar Challenger. Core samples were recovered from bore holes in the Rio Grande Rise in the southwest Atlantic Ocean. Relatively undisturbed sediment cores were obtained from Holes 515A, 516, 517, and 518. The results of shipboard physical property measurements and on-shore geotechnical laboratory tests on these cores are presented in this chapter. A limited number of 0.3 m cores were obtained and used in a series of geotechnical tests, including one-dimensional consolidation, direct shear, Atterberg limit, particle size analysis, and specific gravity tests. Throughout the testing program, attention was focused on assessment of the sample disturbance associated with the HPC-15 coring device. The HPC-15 device limits sample disturbance reasonably well in terrigenous muds (clays). However, the sample disturbance associated with coring calcareous sediments (nannofossil-foraminifer oozes) is severe. The noncohesive, granular calcareous sediments are vulnerable to severe disturbance because of the design of the sampling head on the device at the time of Leg 72. A number of modifications to the sampling head design are recommended and discussed in this chapter. The modifications will improve sample quality for testing purposes and provide longer unbroken core samples by reducing friction between the sediment column and the sampling tool.
Abstract:
Contaminated soil reuse was investigated most intensively in the early 1990s, coinciding with the 1991 Gulf War, when efforts to remediate large crude oil releases spurred the geotechnical assessment of contaminated soils. Isolated works on geotechnical testing with hydrocarbon ground contaminants are described in the state of the art, extended with references to other types of contaminated soil. The reduction in bearing capacity of soils contaminated by light non-aqueous phase liquids (LNAPLs) has previously been investigated from a forensic point of view. To date, all published research has been based on the assumption of a constant contaminant saturation for the entire soil mass. In contrast, actual LNAPL distribution plumes exhibit complex flow patterns that are subject to physical and chemical changes with time and with distance travelled from the release source. This aspect has been considered throughout the present text. A typical Madrid arkosic soil formation is commonly known as Miga sand. Geotechnical tests have been carried out with Miga sand specimens in incremental series of LNAPL concentrations in order to observe the variation in soil engineering properties due to increasing contamination. Results are discussed in relation to previous studies; soil mechanics parameters indeed change in the presence of LNAPL, showing different tendencies according to each test, the LNAPL content, and the specimen's initially planned relative density, dense or loose. Practical geotechnical implications are also commented on and analyzed. Variation in geotechnical properties may occur only within the external contour of the contamination distribution plume. This scope has motivated the author to develop a physical model based on transparent soil technology. The model aims to reproduce the distribution of LNAPL in the ground due to an accidental release from a storage facility.
Preliminary results indicate that the model is a potentially complementary tool for hydrogeological applications, site characterization and remediation treatment testing within the framework of soil pollution events. A description of the test setup of an innovative three-dimensional physical model for the flow of two or more phases in porous media is presented herein, along with a summary of the advantages, limitations and future applications of modeling with transparent material.
Abstract:
All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous, because of the low statistical power of the statistics used (usually the Q test). Objective: Determine a set of rules enabling software engineering (SE) researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not it is feasible to accurately detect heterogeneity. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small sample settings like SE. Random effects models should be used instead of fixed effects models, and caution should be exercised when applying Q test-mediated decomposition into subgroups.
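The power problem described above can be reproduced with a short Monte Carlo sketch of Cochran's Q test. The simulation design here (standardized-mean-difference-style variances, k = 5 studies, alpha = .05) is an illustrative assumption, not the paper's exact setup.

```python
import numpy as np

def cochran_q(effects, variances):
    """Cochran's Q heterogeneity statistic for a fixed-effect meta-analysis."""
    w = 1.0 / np.asarray(variances, dtype=float)
    effects = np.asarray(effects, dtype=float)
    pooled = np.sum(w * effects) / np.sum(w)
    return np.sum(w * (effects - pooled) ** 2)

def q_test_power(k=5, n_per_arm=10, tau=0.5, crit=9.488, n_sim=2000, seed=0):
    """Monte Carlo power of the Q test at alpha = .05.

    True study effects are drawn with between-study SD `tau`; each estimate
    gets sampling variance ~ 2 / n_per_arm (a rough standardized-mean-
    difference approximation). `crit` is the chi-square .95 quantile for
    df = k - 1 (9.488 for the default k = 5).
    """
    rng = np.random.default_rng(seed)
    var_i = 2.0 / n_per_arm
    hits = 0
    for _ in range(n_sim):
        true = rng.normal(0.0, tau, size=k)
        est = rng.normal(true, np.sqrt(var_i))
        hits += cochran_q(est, np.full(k, var_i)) > crit
    return hits / n_sim
```

Under homogeneity (tau = 0) the rejection rate stays near the nominal 5%, while only large heterogeneity combined with large samples pushes power toward 1, which is the small-sample weakness the abstract highlights.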
Abstract:
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it does require more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
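Two of the aggregation methods discussed are straightforward to sketch. Below are a fixed-effect weighted mean difference and an unweighted response ratio (the latter usable when the primary studies report means but not variances); these are generic textbook formulations, not necessarily the exact variants evaluated in the paper.

```python
import numpy as np

def weighted_mean_difference(m_e, m_c, sd_e, sd_c, n_e, n_c):
    """Fixed-effect weighted mean difference over a set of experiments.

    Each raw difference m_e - m_c is weighted by the inverse of its
    sampling variance sd_e^2/n_e + sd_c^2/n_c.
    """
    m_e, m_c = np.asarray(m_e, float), np.asarray(m_c, float)
    sd_e, sd_c = np.asarray(sd_e, float), np.asarray(sd_c, float)
    n_e, n_c = np.asarray(n_e, float), np.asarray(n_c, float)
    w = 1.0 / (sd_e ** 2 / n_e + sd_c ** 2 / n_c)
    est = np.sum(w * (m_e - m_c)) / np.sum(w)
    return est, np.sqrt(1.0 / np.sum(w))  # pooled estimate and its SE

def response_ratio(m_e, m_c):
    """Unweighted pooled response ratio exp(mean(ln(m_e / m_c))).

    Needs only group means, which is the strength noted above when
    variances are unreported.
    """
    lr = np.log(np.asarray(m_e, float) / np.asarray(m_c, float))
    return float(np.exp(np.mean(lr)))
```

With equal per-study variances the WMD reduces to the plain average of the raw differences, which makes the weighting behavior easy to verify by hand.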
Abstract:
In this work, we propose Seasonal Dynamic Factor Analysis (SeaDFA), an extension of Nonstationary Dynamic Factor Analysis through which one can perform dimensionality reduction on vectors of time series in such a way that both common and specific components are extracted. Furthermore, the common factors can capture not only regular dynamics (stationary or not) but also seasonal ones, because the common factors follow a multiplicative seasonal VARIMA(p, d, q) × (P, D, Q)s model. Additionally, a bootstrap procedure that does not need a backward representation of the model is proposed, enabling inference for all the parameters in the model. A bootstrap scheme developed for forecasting incorporates the uncertainty due to parameter estimation, improving the coverage of the forecasting intervals. A challenging application is provided: the new model and bootstrap scheme are applied to an innovative subject in electricity markets, the computation of long-term point forecasts and prediction intervals of electricity prices. Several appendices with technical details, an illustrative example, and an additional table are available online as Supplementary Materials.
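The flavor of the forecasting bootstrap can be conveyed with a drastically simplified univariate stand-in: bootstrap prediction intervals for an AR(1) fitted by ordinary least squares. Like the SeaDFA scheme, it needs no backward representation and re-estimates the parameter on each bootstrap replica, so the intervals reflect parameter-estimation uncertainty as well as innovation noise; the actual method operates on the full seasonal dynamic factor model.

```python
import numpy as np

def ar1_bootstrap_forecast(y, h=12, n_boot=500, seed=0):
    """Bootstrap h-step 90% forecast intervals for a zero-mean-adjusted AR(1)."""
    y = np.asarray(y, dtype=float)
    rng = np.random.default_rng(seed)
    mu = y.mean()

    def fit(series):
        x, z = series[:-1], series[1:]
        phi = np.sum(x * z) / np.sum(x * x)  # OLS slope, zero-mean AR(1)
        return phi, z - phi * x              # parameter and residuals

    phi_hat, resid = fit(y - mu)
    paths = np.empty((n_boot, h))
    for b in range(n_boot):
        # rebuild a bootstrap replica of the series and re-fit (parameter uncertainty)
        e = rng.choice(resid, size=len(resid))
        yb = np.empty(len(resid) + 1)
        yb[0] = y[0] - mu
        for t in range(len(resid)):
            yb[t + 1] = phi_hat * yb[t] + e[t]
        phi_b, _ = fit(yb)
        # simulate a future path with the re-estimated parameter
        level = y[-1] - mu
        for j in range(h):
            level = phi_b * level + rng.choice(resid)
            paths[b, j] = mu + level
    lo, hi = np.percentile(paths, [5, 95], axis=0)
    point = mu + phi_hat ** np.arange(1, h + 1) * (y[-1] - mu)
    return point, lo, hi
```

As with the electricity-price intervals described above, the bands widen with the forecast horizon as both innovation noise and parameter uncertainty accumulate.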