968 results for Non-polarizable Water Models


Relevance: 100.00%

Abstract:

Background - Vaccine development in the post-genomic era often begins with the in silico screening of genome information, with the most probable protective antigens being predicted rather than requiring causative microorganisms to be grown. Despite the obvious advantages of this approach – such as speed and cost efficiency – its success remains dependent on the accuracy of antigen prediction. Most approaches use sequence alignment to identify antigens. This is problematic for several reasons. Some proteins lack obvious sequence similarity, although they may share similar structures and biological properties. The antigenicity of a sequence may be encoded in a subtle and recondite manner not amenable to direct identification by sequence alignment. The discovery of truly novel antigens will be frustrated by their lack of similarity to antigens of known provenance. To overcome the limitations of alignment-dependent methods, we propose a new alignment-free approach for antigen prediction, which is based on auto cross covariance (ACC) transformation of protein sequences into uniform vectors of principal amino acid properties. Results - Bacterial, viral and tumour protein datasets were used to derive models for prediction of whole protein antigenicity. Every set consisted of 100 known antigens and 100 non-antigens. The derived models were tested by internal leave-one-out cross-validation and external validation using test sets. An additional five training sets for each class of antigens were used to test the stability of the discrimination between antigens and non-antigens. The models performed well in both validations, showing prediction accuracies of 70% to 89%. The models were implemented in a server, which we call VaxiJen. Conclusion - VaxiJen is the first server for alignment-independent prediction of protective antigens. It was developed to allow antigen classification solely based on the physicochemical properties of proteins without recourse to sequence alignment.
The server can be used on its own or in combination with alignment-based prediction methods.
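
The ACC idea described above can be sketched as follows: lagged covariances between amino acid property scales yield a fixed-length vector regardless of sequence length. This is a minimal sketch, assuming two made-up property scales; VaxiJen uses z-scale descriptors of principal amino acid properties, and the exact normalization may differ from the published method.

```python
# Auto cross covariance (ACC) sketch: a variable-length protein sequence becomes
# a fixed-length vector of lagged property covariances.
# The descriptor values below are illustrative placeholders, not real z-scales.

PROPS = {  # two hypothetical amino acid property scales per residue
    "A": (0.07, -1.73), "C": (0.71, -0.97), "D": (3.64, 1.13),
    "E": (3.08, 0.39), "G": (2.23, -5.36), "K": (2.84, 1.41),
    "L": (-4.19, -1.03), "S": (1.96, -1.63),
}

def acc_vector(seq, max_lag=3):
    """Transform a sequence into a vector of length p*p*max_lag (p scales)."""
    n = len(seq)
    vals = [PROPS[aa] for aa in seq]
    p = len(vals[0])
    means = [sum(v[j] for v in vals) / n for j in range(p)]
    out = []
    for j in range(p):                  # scale of residue i
        for k in range(p):              # scale of residue i + lag
            for lag in range(1, max_lag + 1):
                s = sum((vals[i][j] - means[j]) * (vals[i + lag][k] - means[k])
                        for i in range(n - lag))
                out.append(s / (n - lag))
    return out

# Two sequences of different lengths map to vectors of identical dimension.
v1 = acc_vector("ACDKLSGE")
v2 = acc_vector("ACDKLSGEAL")
assert len(v1) == len(v2) == 2 * 2 * 3
```

Because the output dimension depends only on the number of scales and the maximum lag, such vectors can feed any standard classifier without sequence alignment.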

Relevance: 100.00%

Abstract:

This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid-2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. We use non-linear, artificial intelligence techniques, namely recurrent neural networks, evolution strategies and kernel methods, in our forecasting experiment. In the experiment, these three methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. There is evidence in the literature that evolutionary methods can be used to evolve kernels; our future work should therefore combine the evolutionary and kernel methods to gain the benefits of both.
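
The forecasting horse race described above can be sketched in miniature: a naive random-walk forecast against a non-linear autoregressive model fitted with kernel ridge regression (one member of the kernel methods family; the paper's exact estimator may differ). The series, kernel bandwidth and ridge penalty below are illustrative, not the paper's data.

```python
# Toy comparison: kernel ridge autoregression vs a naive random walk forecast
# on a simulated non-linear AR(1) series. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.zeros(n)
for t in range(1, n):                     # non-linear AR(1) process
    x[t] = 0.9 * np.tanh(2 * x[t - 1]) + 0.05 * rng.standard_normal()

X, y = x[:-1, None], x[1:]                # predict x[t+1] from x[t]
train = 150

def rbf(a, b, gamma=5.0):
    """Gaussian (RBF) kernel between two column vectors of inputs."""
    return np.exp(-gamma * (a - b.T) ** 2)

K = rbf(X[:train], X[:train])
alpha = np.linalg.solve(K + 1e-3 * np.eye(train), y[:train])  # kernel ridge fit
pred_kr = rbf(X[train:], X[:train]) @ alpha                   # out-of-sample forecast
pred_rw = X[train:, 0]                    # random walk: tomorrow equals today

mse_kr = np.mean((pred_kr - y[train:]) ** 2)
mse_rw = np.mean((pred_rw - y[train:]) ** 2)
```

On this simulated series the kernel model should beat the random walk because the data-generating process really is non-linear autoregressive; for real inflation series that outcome is an empirical question, which is the point of the paper.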

Relevance: 100.00%

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed and there is a discussion of implementation issues. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains. The LBM solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM’s flexibility as a solute transport solver. The LBM is applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (Darcy’s law-based) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D). However, the new LBM-based model retains the ability to solve inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches.
The analogy between Fick’s second law (diffusion equation) and the transient ground water flow equation is used to solve the transient head distribution. An altered-velocity flow solver with source/sink term is applied to simulate a drawdown curve. Hydraulic parameters like transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM’s effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
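
The verification strategy described above, comparing an LBM transport solution against an analytical result, can be illustrated with a minimal one-dimensional sketch: a D1Q3 lattice with BGK collision diffuses a solute pulse, and the simulated variance is checked against the analytical growth 2Dt. The lattice size, relaxation time and step count are illustrative choices, not the dissertation's models.

```python
# Minimal D1Q3 lattice Boltzmann sketch for pure diffusion of a solute pulse,
# verified against the analytical second-moment growth var = 2*D*t.
import numpy as np

nx, nt = 200, 300
tau = 1.0                               # BGK relaxation time
w = np.array([2/3, 1/6, 1/6])           # weights for lattice velocities 0, +1, -1
D = (tau - 0.5) / 3.0                   # lattice diffusion coefficient (c_s^2 = 1/3)

C0 = np.zeros(nx)
C0[nx // 2] = 1.0                       # unit solute pulse at the domain centre
f = w[:, None] * C0[None, :]            # initialize populations at equilibrium

for _ in range(nt):
    C = f.sum(axis=0)                   # macroscopic concentration
    feq = w[:, None] * C[None, :]       # equilibrium distribution (no advection)
    f += (feq - f) / tau                # BGK collision step
    f[1] = np.roll(f[1], 1)             # stream velocity +1
    f[2] = np.roll(f[2], -1)            # stream velocity -1

C = f.sum(axis=0)
x = np.arange(nx) - nx // 2
var_sim = (C * x**2).sum() / C.sum()    # second moment of the simulated pulse

assert abs(C.sum() - 1.0) < 1e-9        # mass is conserved
assert abs(var_sim - 2 * D * nt) < 1.0  # matches analytical diffusion
```

The same collide-and-stream loop generalizes to the two- and three-dimensional flow and transport solvers discussed in the dissertation; only the velocity set, weights and equilibrium change.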

Relevance: 100.00%

Abstract:

Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as lack of awareness of its immediate and long-term consequences. The objectives of this study were to: (1) using cross-sectional data, ascertain the association between suboptimal (Hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g. work characteristics); (2) assess whether aspects of poor health and well-being such as poor health-related quality of life (HRQOL), unemployment, and missed work are associated with glycemic control; and (3) using prospective data, assess the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazards models were used to assess effects of glycemic control on mortality risk. Using STATA software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. Having more days of poor mental health was associated with suboptimal, borderline, and poor glycemic control.
Having more days of inactivity was associated with poor glycemic control, while having more days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person’s ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
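
The dichotomous glycemic control indicators described above can be sketched directly from the HbA1c cut-points given in the abstract; the function name and the "protective" label are illustrative, not the study's variable names.

```python
# Sketch of the study-style glycemic control indicators, using the HbA1c
# cut-points stated in the abstract: suboptimal >= 7%, borderline 7-8.9%,
# poor >= 9%, and < 6.5% linked to lower mortality risk.

def glycemic_indicators(hba1c):
    """Return indicator variables for one HbA1c value (in %)."""
    return {
        "suboptimal": hba1c >= 7.0,     # includes borderline and poor ranges
        "borderline": 7.0 <= hba1c < 9.0,
        "poor": hba1c >= 9.0,
        "protective": hba1c < 6.5,      # threshold associated with lower mortality
    }

assert glycemic_indicators(6.2) == {"suboptimal": False, "borderline": False,
                                    "poor": False, "protective": True}
assert glycemic_indicators(8.0)["borderline"] and glycemic_indicators(8.0)["suboptimal"]
assert glycemic_indicators(9.4)["poor"]
```

Indicators like these are what the binary and multinomial logistic regression models in the study take as outcomes.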

Relevance: 100.00%

Abstract:

This paper demonstrates the usefulness of fluorescence techniques for long-term monitoring and assessment of the dynamics (sources, transport and fate) of chromophoric dissolved organic matter (CDOM) in highly compartmentalized estuarine regions with non-point water sources. Water samples were collected monthly from a total of 73 sampling stations in the Florida Coastal Everglades (FCE) estuaries during 2001 and 2002. Spatial and seasonal variability of CDOM characteristics was investigated for geomorphologically distinct sub-regions within Florida Bay (FB), the Ten Thousand Islands (TTI), and Whitewater Bay (WWB). These variations were observed in both quantity and quality of CDOM. TOC concentrations in the FCE estuaries were generally higher during the wet season (June–October), reflecting high freshwater loadings from the Everglades in TTI, and a high primary productivity of marine biomass in FB. Fluorescence parameters suggested that the CDOM in FB is mainly of marine/microbial origin, while for TTI and WWB a terrestrial origin from Everglades marsh plants and mangroves was evident. Variations in CDOM quality seemed mainly controlled by tidal exchange/mixing of Everglades freshwater with Florida Shelf waters, tidally controlled releases of CDOM from fringe mangroves, primary productivity of marine vegetation in FB and diagenetic processes such as photodegradation (particularly for WWB). The sources and dynamics of CDOM in these subtropical estuaries are complex and are influenced by many factors including hydrology, geomorphology, vegetation cover, land use and biogeochemical processes. Simple, easy-to-measure, high-sample-throughput fluorescence parameters for surface waters can add valuable information on CDOM dynamics to long-term water quality studies that cannot be obtained from quantitative determinations alone.

Relevance: 100.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing of the PEWMA model, a fault tree is developed based on the Texas City Refinery incident which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions.
Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
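
The conventional conjugate Poisson-Gamma updating that PEWMA is compared against can be sketched in a few lines: with a Gamma(a, b) prior on a failure rate and k failures observed over exposure time t, the posterior is Gamma(a + k, b + t). The prior and the failure records below are illustrative.

```python
# Conjugate Poisson-Gamma Bayesian updating of a failure rate (illustrative data).

def update_gamma(a, b, failures, exposure):
    """One conjugate update: Gamma(a, b) prior -> Gamma(a + k, b + t) posterior."""
    return a + failures, b + exposure

a, b = 1.0, 10.0                                  # prior mean rate a/b = 0.1 per unit time
for k, t in [(2, 50.0), (0, 30.0), (1, 40.0)]:    # sequential failure records (k, exposure)
    a, b = update_gamma(a, b, k, t)

posterior_mean = a / b                            # (1 + 3) / (10 + 120)
assert abs(posterior_mean - 4 / 130) < 1e-12
```

Note the difference the research exploits: this conjugate scheme weights all past data equally, whereas the PEWMA model exponentially discounts older observations, which is why it fares better on failure data collected over long time spans.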

Relevance: 100.00%

Abstract:

Bayesian methods offer a flexible and convenient probabilistic learning framework to extract interpretable knowledge from complex and structured data. Such methods can characterize dependencies among multiple levels of hidden variables and share statistical strength across heterogeneous sources. In the first part of this dissertation, we develop two dependent variational inference methods for full posterior approximation in non-conjugate Bayesian models through hierarchical mixture- and copula-based variational proposals, respectively. The proposed methods move beyond the widely used factorized approximation to the posterior and provide generic applicability to a broad class of probabilistic models with minimal model-specific derivations. In the second part of this dissertation, we design probabilistic graphical models to accommodate multimodal data, describe dynamical behaviors and account for task heterogeneity. In particular, the sparse latent factor model is able to reveal common low-dimensional structures from high-dimensional data. We demonstrate the effectiveness of the proposed statistical learning methods on both synthetic and real-world data.
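
The limitation of the factorized approximation that the first part of the dissertation moves beyond can be illustrated with a standard two-dimensional Gaussian example: the optimal mean-field Gaussian has precision equal to the diagonal of the true precision matrix, and therefore understates the marginal variances whenever the posterior is correlated. The covariance values below are illustrative.

```python
# Why factorized (mean-field) variational approximations understate uncertainty:
# for a correlated Gaussian posterior, the KL-optimal factorized Gaussian takes
# its precisions from the diagonal of the true precision matrix.
import numpy as np

Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])         # true posterior covariance (correlated)
Lambda = np.linalg.inv(Sigma)          # true precision matrix

var_true = np.diag(Sigma)              # true marginal variances: [1.0, 1.0]
var_mf = 1.0 / np.diag(Lambda)         # mean-field variances: 1 - 0.8**2 = 0.36

assert np.all(var_mf < var_true)       # mean-field underestimates the marginals
```

The mixture- and copula-based variational proposals developed in the dissertation are designed precisely to recover such posterior dependence that the factorized family cannot represent.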

Relevance: 100.00%

Abstract:

The aim of this study was to determine the prevalence and severity of dental fluorosis among 12-15-year-old students from João Pessoa, PB, Brazil before starting a program of artificial fluoridation of drinking water. The use of fluoridated dentifrice was also surveyed. A sample of 1,402 students was randomly selected. However, 31 students refused to participate and 257 were not permanent residents in João Pessoa, thus leaving a final sample of 1,114 students. Clinical exams were carried out by two calibrated dentists (Kappa = 0.78) under natural indirect light. Upper and lower front teeth were cleaned with gauze and dried, and then examined using the TF index for fluorosis. A questionnaire on dentifrice ingestion and oral hygiene habits was applied to the students. The results revealed that fluorosis prevalence in this age group was higher than expected (29.2%). Most fluorosis cases were TF = 1 (66.8%), and the most severe cases were TF = 4 (2.2%). The majority of the students reported that they had been using fluoridated dentifrices since childhood; 95% of the participants preferred brands with a 1,500 ppm F concentration, and 40% remembered that they usually ingested or still ingest dentifrice during brushing. It was concluded that dental fluorosis prevalence among students in João Pessoa is higher than expected for an area with non-fluoridated water. However, although most students use fluoridated dentifrices, and almost half ingest slurry while brushing, the majority of cases had little aesthetic relevance from the professionals' point of view, thus suggesting that fluorosis is not a public health problem in the locality.

Relevance: 100.00%

Abstract:

Solenoids for solenoid valves, together with other products and equipment, enable the automation of non-potable water distribution systems, for example in irrigation systems. In this way it is possible to control several parameters, such as the flow rate and the pressure of the water passing through the valve, and these can be monitored remotely. In this work, solenoids were developed to be coupled to valves intended for agricultural irrigation, marketed by the company JPrior, Fábrica de Plásticos, Lda. To this end, several processes required to create a pre-industrialization pilot line for these devices were defined. Steps such as device design, prototyping, temperature tests, current consumption measurements, electromechanical stability tests and performance tests at different operating pressures and flow rates were essential to this development, as were structural and morphological analyses of the material that forms the solenoid core. In addition, the work sought to meet market needs more completely than existing products do. To that end, two types of 24 V AC solenoids were produced, with maximum operating pressures of 4 bar and 12 bar.

Relevance: 100.00%

Abstract:

Master's dissertation, Universidade de Brasília, Departamento de Administração, Programa de Pós-graduação em Administração, 2016.

Relevance: 100.00%

Abstract:

[Linear and nonlinear models in genetic analyses of lamb survival in the Santa Inês hair sheep breed]. Abstract: Records of the survival from birth to weaning of 3,846 lambs of the Santa Inês hair sheep breed were analyzed with linear and non-linear (threshold) sire models to estimate variance components and heritability (h2). The models used to analyze survival, considered in this study as a trait of the lamb, included the fixed effects of sex of the lamb, combination of type of birth-rearing of the lamb, and age of the ewe at lambing, the birth weight of the lamb as a covariate, and random effects of sire, herd-year-season and residual. Variance components were estimated by restricted maximum likelihood (REML) for the linear model and by an approximation of marginal maximum likelihood (MML) for the threshold model, using the CMMAT2 program.
The heritability estimate (h2) obtained with the threshold model was 0.29, and with the linear model, 0.14. The Spearman rank correlation between the sires' transmitting abilities based on the two models was 0.96. The h2 estimates obtained indicate that it is possible to achieve genetic gain in survival through selection.
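
As a hedged sketch of how a heritability estimate falls out of a sire model's variance components: a sire accounts for a quarter of the additive genetic variance, so h2 = 4·σ²(sire) / (σ²(sire) + σ²(residual)). The component values below are chosen only to reproduce the reported threshold-model estimate of 0.29; they are not the study's actual REML/MML output, and the study's model also includes a herd-year-season component not shown here.

```python
# Heritability from sire-model variance components (illustrative values only):
# the sire variance is one quarter of the additive genetic variance.

def sire_model_h2(var_sire, var_residual):
    """h2 = 4 * sigma2_sire / (sigma2_sire + sigma2_residual)."""
    return 4.0 * var_sire / (var_sire + var_residual)

# Hypothetical components chosen so that h2 matches the reported 0.29.
h2 = sire_model_h2(0.0725, 0.9275)
assert abs(h2 - 0.29) < 1e-9
```

The factor of four is why sire models can estimate heritability from half-sib family resemblance alone, on either the observed (linear) or the liability (threshold) scale.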

Relevance: 100.00%

Abstract:

We survey articles covering how hedge fund returns are explained, using largely non-linear multifactor models that examine the non-linear pay-offs and exposures of hedge funds. We provide an integrated view of the implicit factor and statistical factor models that are largely able to explain the hedge fund return-generating process. We present their evolution through time by discussing pioneering studies that made a significant contribution to knowledge, and also recent innovative studies that examine hedge fund exposures using advanced econometric methods. This is the first review that analyzes very recent studies that explain a large part of hedge fund variation. We conclude by presenting some gaps for future research.
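
The non-linear exposures these studies estimate can be sketched with a toy multifactor regression: fund returns regressed on a market factor plus an option-like payoff factor max(r_m − k, 0). All series below are simulated, and the strike and loadings are arbitrary illustrative choices.

```python
# Toy non-linear multifactor model: OLS exposures of simulated hedge fund
# returns to a market factor and a call-like payoff factor. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
r_m = 0.04 * rng.standard_normal(500)               # market factor returns
payoff = np.maximum(r_m - 0.01, 0.0)                # option-like non-linear factor
r_fund = 0.002 + 0.3 * r_m + 0.8 * payoff + 0.001 * rng.standard_normal(500)

X = np.column_stack([np.ones_like(r_m), r_m, payoff])
beta, *_ = np.linalg.lstsq(X, r_fund, rcond=None)   # estimated alpha and exposures

assert abs(beta[1] - 0.3) < 0.1                     # recovers the linear exposure
assert abs(beta[2] - 0.8) < 0.2                     # recovers the non-linear exposure
```

A purely linear factor model would misattribute the option-like payoff to alpha or to the linear beta, which is the central motivation for the non-linear models surveyed here.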

Relevance: 100.00%

Abstract:

Seven years (2003–2010) of measured shortwave (SW) irradiances were used to obtain estimates of the 10 min averaged effective cloud optical thickness (ECOT) and of the shortwave cloud radiative effect (CRESW) at the surface at a mid-latitude site (Évora, southern Portugal), and their seasonal variability is presented. The ECOT, obtained using transmittance measurements at 415 nm, was compared with the corresponding MODIS cloud optical thickness (MODIS COT) for non-precipitating water clouds and cloud fractions higher than 0.25. This comparison showed that the ECOT represents the cloud optical thickness over the study area well. The CRESW, determined for two SW broadband ranges (300–1100 nm; 285–2800 nm), was normalized (NCRESW) and related to the obtained ECOT. A logarithmic relation between NCRESW and ECOT was found for both SW ranges, with lower dispersion for overcast-sky situations than for partially cloudy-sky situations. The NCRESW efficiency (NCRESW per unit of ECOT) was also related to the ECOT for overcast-sky conditions. The relation found is parameterized by a power-law function showing that the NCRESW efficiency decreases as the ECOT increases, approaching one for ECOT values higher than about 50.
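
The reported logarithmic relation, NCRESW = a·ln(ECOT) + b, can be sketched with a least-squares fit on synthetic data; the coefficients, noise level and generated points below are illustrative, not the study's measurements.

```python
# Sketch of fitting a logarithmic NCRE-ECOT relation and checking that the
# cloud radiative effect per unit optical thickness falls as ECOT grows.
# All values are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(2)
ecot = np.linspace(2, 80, 60)                        # effective cloud optical thickness
ncre = -0.2 * np.log(ecot) - 0.1 + 0.01 * rng.standard_normal(60)  # normalized CRE

a, b = np.polyfit(np.log(ecot), ncre, 1)             # linear least squares in ln(ECOT)
assert abs(a + 0.2) < 0.02 and abs(b + 0.1) < 0.05   # recovers the generating slope

efficiency = ncre / ecot                             # NCRE per unit of ECOT
assert abs(efficiency[-1]) < abs(efficiency[0])      # efficiency shrinks with ECOT
```

The shrinking per-unit efficiency follows directly from the logarithmic form: each additional unit of optical thickness adds progressively less to the surface radiative effect.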