990 results for predictive modeling
Abstract:
Cotton is the most abundant natural fiber in the world. Many countries are involved in the growing, importation, exportation and production of this commodity. Paper documentation claiming geographic origin is the current method employed at U.S. ports for identifying cotton sources and enforcing tariffs. Because customs documentation can be easily falsified, it is necessary to develop a robust method for authenticating or refuting the claimed source of cotton commodities. This work presents, for the first time, a comprehensive approach to the chemical characterization of unprocessed cotton in order to provide an independent tool for establishing geographic origin. Elemental and stable isotope ratio analysis of unprocessed cotton provides a means to improve the ability to distinguish cotton, in addition to any physical and morphological examinations that could be, and currently are, performed. Elemental analysis was conducted using LA-ICP-MS, LA-ICP-OES and LIBS in order to offer a direct comparison of the analytical performance of each technique and to determine its utility for this purpose. Multivariate predictive modeling approaches are used to determine the potential of elemental and stable isotopic information to aid in the geographic provenancing of unprocessed cotton of both domestic and foreign origin. These approaches assess the stability of the profiles under temporal and spatial variation to determine the feasibility of this application. This dissertation also evaluates plasma conditions and ablation processes so as to improve the quality of analytical measurements made with atomic emission spectroscopy techniques. These interactions, in LIBS particularly, are assessed to determine any potential simplification of the instrument design and method development phases. This is accomplished through the analysis of several matrices representing different physical substrates, to determine the potential of adopting universal operating parameters for 532 nm and 1064 nm LIBS. A novel approach for evaluating both ablation processes and plasma conditions from a single measurement was developed and used to determine the "useful ablation efficiency" of different materials. The work presented here demonstrates the potential for a priori prediction of some probable laser parameters important in analytical LIBS measurements.
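As a sketch of what such a multivariate classification could look like (the classifier, feature layout and labels below are illustrative assumptions, not the workflow described in the abstract), a discriminant analysis of elemental profiles might be set up as follows:

```python
# Hedged sketch: classifying cotton samples by geographic origin from
# elemental profiles with linear discriminant analysis (illustrative only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are cotton samples, columns are element signals
# (e.g., normalized LA-ICP-MS intensities); y holds region-of-origin labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))              # 60 samples x 12 elements (dummy)
y = np.repeat(["US", "Brazil", "India"], 20)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated classification rate
print(f"Mean CV accuracy: {scores.mean():.2f}")
```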
Abstract:
Since the introduction of fiber-reinforced polymers (FRP) for the repair and retrofit of concrete structures in the 1980s, considerable research has been devoted to the feasibility of their application and to predictive modeling of their performance. However, the effects of flaws present in the constitutive components and of the practices used in substrate preparation and treatment have not yet been thoroughly studied. This research investigates the effect of surface preparation and treatment for pre-cured FRP systems and of groove size tolerance for near-surface-mounted (NSM) FRP systems, and sets thresholds for guaranteed system performance. The research included both analytical and experimental components. The experimental program for the pre-cured FRP systems consisted of a total of twenty-four (24) reinforced concrete (RC) T-beams with various surface preparation parameters and surface flaws, including roughness, flatness, voids and cracks (cuts). For the NSM FRP systems, a total of twelve (12) additional RC T-beams were tested with different groove sizes for FRP bars and strips. The analytical program included developing an elaborate nonlinear finite element model using the general-purpose software ANSYS. The model was subsequently used to extend the experimental range of parameters for surface flatness in pre-cured FRP systems and for the groove size study in the NSM FRP systems. Test results, confirmed by further analyses, indicated that, contrary to the general belief in the industry, the impact of surface roughness on the global performance of pre-cured FRP systems was negligible. The study also verified that threshold limits set for wet lay-up FRP systems can be extended to pre-cured systems. The study showed that larger surface voids and cracks (cuts) can adversely impact both the strength and ductility of pre-cured FRP systems. On the other hand, the frequency (or spacing) of surface cracks (cuts) may affect only system ductility rather than strength. Finally, within the range studied, a groove size tolerance of +1/8 in. does not appear to have an adverse effect on the performance of NSM FRP systems.
Abstract:
This thesis sought to demonstrate the valorization of fish-processing effluent through the incorporation of its nutrients into Aphanothece microscopica Nägeli at different temperatures. To this end, the work comprises five articles that evaluate effluent treatment by the cyanobacterium Aphanothece as well as the separation and characterization of the biomass generated. The first article, entitled "Influence of temperature on nutrient removal from fish-industry effluent by Aphanothece microscopica Nägeli," evaluated the influence of temperature (10, 20 and 30 °C) on the removal of organic matter, nitrogen and phosphorus from fish-processing effluent in a treatment system based on the cyanobacterium Aphanothece. The results showed that temperature significantly influenced the removal of COD, TKN, N-NH4+ and P-PO4-3. For the experiments at 20 and 30 °C, all limits established for the evaluated parameters were met. The second article, entitled "Effect of coagulants on fishing-industry effluent treated by cyanobacteria, aiming at biomass separation," evaluated the effect of the concentration and pH of two coagulants, ferric chloride (FeCl3) and aluminum sulfate (Al2(SO4)3), on the separation of Aphanothece microscopica Nägeli biomass grown in fishing-industry effluent, as well as on the removal of organic matter and nutrients from the effluent. The results indicated that FeCl3 was the more effective coagulant for all parameters tested. Regarding biomass separation, six washes removed about 97.6% of the FeCl3 initially added. The third article, "Characterization of Aphanothece microscopica Nägeli biomass produced in fishing-industry effluent at different cultivation temperatures," evaluated the chemical composition of the cyanobacterial biomass grown in standard BG11 culture medium and in fish-processing effluent. The fourth article, "Influence of culture medium and temperature on nitrogen compounds in the cyanobacterium Aphanothece microscopica Nägeli," assessed the content of nitrogen compounds in the biomass when the cyanobacterium was cultivated in standard medium and in fishing-industry effluent at different growth phases. For the study of chemical composition and nitrogen compounds in the effluent, experiments were carried out at 10, 20 and 30 °C. Protein, ash and pigment concentrations increased with temperature, whereas lipid and carbohydrate contents decreased. The ammonium ion, together with nucleic acids, represents an important fraction of the non-protein nitrogen present in the Aphanothece biomass. The influence of the culture medium on nitrogen concentration was demonstrated, as was the fact that protein determination by the Kjeldahl method overestimates the protein concentration in cyanobacteria. The fifth article, entitled "Single-cell protein production from fish-processing effluent: predictive modeling and simulation," evaluated single-cell protein production by cultivating Aphanothece microscopica Nägeli in fishing-industry effluent. The cell-growth kinetic data were fitted to four mathematical models (Logistic, Gompertz, Modified Gompertz and Baranyi). The results showed that the Logistic model best described biomass formation. The predictive analysis indicated that 1.66, 18.96 and 57.36 kg·m-3·d-1 of biomass per reactor volume could be obtained over 1000 h of continuous operation at 10, 20 and 30 °C, respectively.
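A minimal sketch of fitting the Logistic growth model mentioned above to biomass data (the data points and the particular parameterization below are illustrative assumptions, not values from the thesis):

```python
# Hedged sketch: fitting a logistic growth curve X(t) = Xmax / (1 + exp(-k*(t - tc)))
# to cell-growth data with scipy; the data points below are made up.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, x_max, k, t_c):
    return x_max / (1.0 + np.exp(-k * (t - t_c)))

t = np.array([0, 12, 24, 36, 48, 60, 72])          # time in h (hypothetical)
x = np.array([0.1, 0.2, 0.5, 1.1, 1.8, 2.1, 2.2])  # biomass in g/L (hypothetical)

params, _ = curve_fit(logistic, t, x, p0=[2.5, 0.1, 30.0])
print(dict(zip(["Xmax", "k", "tc"], params)))
```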
Abstract:
In economics of information theory, credence products are those whose quality is difficult or impossible for consumers to assess, even after they have consumed the product (Darby & Karni, 1973). This dissertation is focused on the content, consumer perception, and power of online reviews for credence services. Economics of information theory has long assumed, without empirical confirmation, that consumers will discount the credibility of claims about credence quality attributes. The same theories predict that because credence services are by definition obscure to the consumer, reviews of credence services are incapable of signaling quality. Our research aims to question these assumptions. In the first essay we examine how the content and structure of online reviews of credence services systematically differ from the content and structure of reviews of experience services and how consumers judge these differences. We have found that online reviews of credence services have either less important or less credible content than reviews of experience services and that consumers do discount the credibility of credence claims. However, while consumers rationally discount the credibility of simple credence claims in a review, more complex argument structure and the inclusion of evidence attenuate this effect. In the second essay we ask, “Can online reviews predict the worst doctors?” We examine the power of online reviews to detect low quality, as measured by state medical board sanctions. We find that online reviews are somewhat predictive of a doctor’s suitability to practice medicine; however, not all the data are useful. Numerical or star ratings provide the strongest quality signal; user-submitted text provides some signal but is subsumed almost completely by ratings. Of the ratings variables in our dataset, we find that punctuality, rather than knowledge, is the strongest predictor of medical board sanctions. These results challenge the definition of credence products, which is a long-standing construct in economics of information theory. Our results also have implications for online review users, review platforms, and for the use of predictive modeling in the context of information systems research.
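A sketch of the kind of predictive model implied above (ratings as features, a sanction indicator as the outcome); the column names and values are hypothetical, not the essay's actual dataset or specification:

```python
# Hedged sketch: predicting medical board sanctions from review ratings.
# Column names and data are hypothetical; the essay's actual models may differ.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical doctor-level dataset: star ratings plus a sanction flag.
df = pd.DataFrame({
    "overall_rating": [4.5, 2.0, 3.5, 1.5, 5.0, 2.5],
    "punctuality":    [4.0, 1.5, 3.0, 1.0, 5.0, 2.0],
    "knowledge":      [5.0, 3.0, 4.0, 2.5, 5.0, 3.5],
    "sanctioned":     [0,   1,   0,   1,   0,   1],
})

X = df[["overall_rating", "punctuality", "knowledge"]]
y = df["sanctioned"]

model = LogisticRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_[0])))  # sign/size of each rating's signal
```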
Abstract:
Antigen presentation by major histocompatibility complex class I (MHC I) molecules allows the adaptive immune system to detect and eliminate intracellular pathogens and abnormal cells. Immune surveillance is carried out by CD8 T lymphocytes, which interact with the repertoire of MHC I-associated peptides presented at the surface of all nucleated cells. The main human MHC I genes, HLA-A and HLA-B, are highly polymorphic and consequently differ in antigen presentation. We studied qualitative and quantitative differences in the expression and peptide binding of several HLA allotypes. Using quantitative flow cytometry, we established an expression hierarchy for the four HLA-A and HLA-B allotypes investigated. Our results are consistent with an inverse correlation between allotype expression and peptide diversity, although further studies are needed to consolidate this hypothesis. The global origins of the MHC I-associated peptide repertoire remain a central question, both fundamentally and in the search for immunotherapeutic targets. Using proteogenomic techniques, we identified and analyzed 25,172 MHC I peptides isolated from the B lymphocytes of 18 individuals who collectively express 27 HLA-A,B allotypes. While 58% of genes were the source of 1-64 MHC I peptides per gene, 42% of genes were not represented in the immunopeptidome. Overall, the immunopeptidome presented by the 27 HLA-A,B allotypes covered only 17% of the exomic sequences expressed in the subjects' cells. We identified several transcript and protein features that enhance the production of MHC I peptides. With these data, we built a logistic regression model that predicts with high accuracy whether a gene from our dataset, or from independent datasets, would generate MHC I peptides. Our results show the preferential selection of MHC I peptides from a limited repertoire of gene products with distinct features. The idea that the immune system can survey MHC I peptides covering only a fraction of the protein-coding genome has profound implications for autoimmunity and cancer immunology.
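A minimal sketch of the type of logistic regression described above, predicting whether a gene yields MHC I peptides from transcript and protein features; the feature names and data are placeholders, not the features identified in the study:

```python
# Hedged sketch: logistic regression predicting whether a gene generates
# MHC I peptides. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical per-gene features: e.g., transcript abundance, protein length, ...
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                                        # 500 genes x 3 features (dummy)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)) > 0  # dummy label

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
print("Training accuracy:", model.score(X, y))
```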
Abstract:
In this thesis, a machine learning approach was used to develop a predictive model for the residual methanol concentration in industrial formalin produced at the Akzo Nobel factory in Kristinehamn, Sweden. The MATLAB computational environment, supplemented with the Statistics and Machine Learning Toolbox from MathWorks, was used to test various machine learning algorithms on the formalin production data from Akzo Nobel. The Gaussian Process Regression algorithm was found to provide the best results and was used to create the predictive model. The model was compiled into a stand-alone application with a graphical user interface using the MATLAB Compiler.
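For readers without MATLAB, an analogous Gaussian Process Regression fit can be sketched with scikit-learn; this is an assumption-laden stand-in for the thesis's MATLAB-based workflow, with hypothetical process variables, not a reproduction of it:

```python
# Hedged sketch: Gaussian Process Regression for residual methanol prediction,
# using scikit-learn in place of the MATLAB toolbox used in the thesis.
# Feature columns and data are hypothetical process variables.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                  # e.g., temperatures, flows (dummy)
y = X @ np.array([0.5, -0.2, 0.1, 0.3]) + rng.normal(scale=0.05, size=100)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

y_pred, y_std = gpr.predict(X[:5], return_std=True)  # predictions + uncertainty
print(y_pred, y_std)
```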
Abstract:
In this thesis, a predictive analytical and numerical modeling approach for the orthogonal cutting process is proposed to calculate temperature distributions and, subsequently, forces and stress distributions. The models proposed include a constitutive model for the material being cut based on the work of Weber, a model for the shear plane based on Merchant's model, a model describing the contribution of friction based on Zorev's approach, a model for the effect of tool wear based on the work of Waldorf, and a thermal model based on the works of Komanduri and Hou, with a fractional heat partition for a non-uniform distribution of heat at the interfaces, extended to encompass a set of contributions to the global temperature rise of the chip, tool and workpiece. The models proposed in this work avoid experimentally based values or expressions and simplifying assumptions or suppositions as much as possible. From a thermo-physical point of view, the results were affected not only by the mechanical and cutting parameters chosen but also by their coupling effects, in contrast to the simplified modeling practice of considering only the direct effect of varying a single parameter. The implementation of these models was performed in the MATLAB environment. Since all the required parameters for AISI 1045 and AISI O2 could be found in the literature, these materials were used to run the simulations in order to avoid arbitrary assumptions.
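For orientation, the classical Merchant minimum-energy solution underlying the shear-plane model mentioned above can be written as follows; the extended formulation actually used in the thesis may differ:

```latex
% Merchant's shear-angle relation (classical form):
% \phi = shear angle, \alpha = tool rake angle, \beta = friction angle.
\[
  \phi \;=\; \frac{\pi}{4} \;-\; \frac{\beta - \alpha}{2}
\]
```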
Abstract:
Empirical evidence and theoretical studies suggest that the phenotype, i.e., the cellular- and molecular-scale dynamics (including proliferation rate and adhesiveness due to microenvironmental factors and gene expression) that govern tumor growth and invasiveness, also determines gross tumor-scale morphology. It has been difficult to quantify the relative effect of these links on disease progression and prognosis using conventional clinical and experimental methods and observables. As a result, successful individualized treatment of highly malignant and invasive cancers, such as glioblastoma, via surgical resection and chemotherapy cannot be offered, and outcomes are generally poor. What is needed is a deterministic, quantifiable method for understanding the connections between phenotype and tumor morphology. Here, we critically assess the advantages and disadvantages of recent computational modeling efforts (e.g., continuum, discrete, and cellular automata models) that have pursued this understanding. Based on this assessment, we review a multiscale (i.e., from the molecular to the gross tumor scale) mathematical and computational "first-principles" approach based on mass conservation and other physical laws, such as those employed in reaction-diffusion systems. Model variables describe known characteristics of tumor behavior, and parameters and functional relationships across scales are informed by in vitro, in vivo and ex vivo biology. We review the feasibility of this methodology which, once coupled to tumor imaging and tumor biopsy or cell culture data, should enable prediction of tumor growth and therapy outcome through quantification of the relation between the underlying dynamics and morphological characteristics. In particular, morphologic stability analysis of this mathematical model reveals that tumor cell patterning at the tumor-host interface is regulated by cell proliferation, adhesion and other phenotypic characteristics: histopathologic information on the tumor boundary can be input into the mathematical model and used as a phenotype-diagnostic tool to predict collective and individual tumor cell invasion of the surrounding tissue. This approach further provides a means to deterministically test the effects of novel and hypothetical therapy strategies on tumor behavior.
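As a concrete instance of the mass-conservation/reaction-diffusion formalism referred to above (a generic, widely used form, not necessarily the authors' multiscale model), the tumor cell density c is often evolved as:

```latex
% Generic reaction-diffusion (Fisher-KPP type) equation for tumor cell density c:
% D = cell motility, \rho = net proliferation rate, c_max = carrying capacity.
\[
  \frac{\partial c}{\partial t}
    \;=\; \nabla \cdot \bigl( D \,\nabla c \bigr)
    \;+\; \rho\, c \left( 1 - \frac{c}{c_{\max}} \right)
\]
```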
Abstract:
The usual way of modeling variability using threshold voltage shift and drain current amplification is becoming inaccurate as new sources of variability appear in sub-22 nm devices. In this work we apply the four-injector approach for variability modeling to the simulation of SRAMs with predictive technology models from the 20 nm down to the 7 nm node. We show that the SRAMs, designed following the ITRS roadmap, present stability metrics at least 20% higher than those obtained with a classical variability modeling approach. Speed estimation is also pessimistic, whereas leakage is underestimated if subthreshold slope and DIBL mismatch and their correlations with threshold voltage are not considered.
Abstract:
Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it cost-effectively aids in reducing the uncertainty of a model's predictions. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can give insight into submarine groundwater discharge and can be used to guide data collection.
Abstract:
This thesis aims to illustrate the construction of a mathematical model of a hydraulic system, oriented to the design of a model predictive control (MPC) algorithm. The modeling procedure starts with the basic formulation of a piston-servovalve system. The latter is a complex nonlinear system with unknown and unmeasurable effects that make the modeling procedure challenging. A first approximation of the system parameters is obtained from datasheet information, workbench tests provided by the company, and other company data. Then, to validate and refine the model, open-loop simulations were run and matched against characteristics obtained from real acquisitions. The final set of ODEs captures the main peculiarities of the system, apart from some characteristics due to highly varying and unknown hydraulic effects, such as the unmodeled resistive elements of the pipes. Since the model presents many internal complexities, a simplified version is derived after careful analysis and used to correctly linearize and discretize the nonlinear model. Based on this, an MPC algorithm for reference tracking with linear constraints is implemented. The results show the potential of MPC in this kind of industrial application, namely high-quality tracking performance while satisfying state and input constraints. The increased robustness and flexibility are evident compared with the standard control techniques, such as PID controllers, adopted for these systems. The simulations for model validation and for the controlled system were carried out in Python.
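As an illustration of the reference-tracking formulation described above, the sketch below sets up a linear-constrained MPC step for a generic linearized, discretized model; the matrices, weights and bounds are placeholders, not the thesis's identified piston-servovalve model.

```python
# Minimal linear MPC reference-tracking sketch (assumed matrices, not the
# thesis's identified hydraulic model). Requires numpy and cvxpy.
import numpy as np
import cvxpy as cp

def mpc_step(A, B, x0, x_ref, N=20, u_max=1.0):
    """Solve one finite-horizon MPC problem and return the first input."""
    nx, nu = B.shape
    Q = np.eye(nx)          # state-tracking weight (placeholder)
    R = 0.1 * np.eye(nu)    # input weight (placeholder)

    x = cp.Variable((nx, N + 1))
    u = cp.Variable((nu, N))

    cost = 0
    constraints = [x[:, 0] == x0]
    for k in range(N):
        cost += cp.quad_form(x[:, k] - x_ref, Q) + cp.quad_form(u[:, k], R)
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                        cp.norm(u[:, k], 'inf') <= u_max]   # linear input bound
    cost += cp.quad_form(x[:, N] - x_ref, Q)                # terminal cost

    cp.Problem(cp.Minimize(cost), constraints).solve()
    return u[:, 0].value

# Example with an arbitrary stable 2-state model
A = np.array([[0.9, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
print(mpc_step(A, B, x0=np.zeros(2), x_ref=np.array([1.0, 0.0])))
```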
Abstract:
Southeastern Brazil has seen dramatic landscape modifications in recent decades due to the expansion of agriculture and urban areas; these changes have influenced the distribution and abundance of vertebrates. We developed predictive models of the ecological and spatial distributions of capybaras (Hydrochoerus hydrochaeris) using ecological niche modeling. Most occurrences of capybaras were in flat areas with water bodies surrounded by sugarcane and pasture. More than 75% of the Piracicaba River basin was estimated as potentially habitable by capybaras. The models had low omission error (2.3-3.4%) but higher commission error (91.0-98.5%); these "model failures" seem to be related more to local habitat characteristics than to spatial ones. The potential distribution of capybaras in the basin is associated with anthropogenic habitats, particularly with intensive land use for agriculture.