929 results for Adsorption. Zeolite 13X. Langmuir model. Dynamic modeling. Pyrolysis of sewage sludge
Abstract:
The ecological validity of static, high-intensity facial expressions in emotion recognition research has been questioned. Recent studies have recommended the use of facial stimuli more compatible with the natural conditions of social interaction, which involve motion and variations in emotional intensity. In this study, we compared the recognition of static and dynamic facial expressions of happiness, fear, anger and sadness, presented at four emotional intensities (25%, 50%, 75% and 100%). Twenty volunteers (9 women and 11 men), aged between 19 and 31 years, took part in the study. The experiment consisted of two sessions in which participants had to identify the emotion of static (photographs) and dynamic (videos) displays of facial expressions on a computer screen. Mean accuracy was submitted to a repeated-measures ANOVA with the design 2 sexes x [2 conditions x 4 expressions x 4 intensities]. We observed an advantage for the recognition of dynamic expressions of happiness and fear compared with the static stimuli (p < .05). Analysis of the interactions showed that expressions at 25% intensity were better recognized in the dynamic condition (p < .05). The addition of motion improved recognition especially in male participants (p < .05). We conclude that the effect of motion varies as a function of the type of emotion, the intensity of the expression and the sex of the participant. These results support the hypothesis that dynamic stimuli have greater ecological validity and are more appropriate for research on emotions.
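The reported design (one between-subjects factor, three within-subject factors) can be sketched in Python. The abstract does not name the software used, so the synthetic data, column names and use of statsmodels below are illustrative assumptions, not the study's analysis code:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic stand-in data: 20 participants x 2 conditions x 4 expressions
# x 4 intensities, one mean accuracy score per cell.
rng = np.random.default_rng(0)
rows = [
    (p, c, e, i, rng.uniform(0.3, 1.0))
    for p in range(20)
    for c in ("static", "dynamic")
    for e in ("happiness", "fear", "anger", "sadness")
    for i in (25, 50, 75, 100)
]
df = pd.DataFrame(rows, columns=["participant", "condition",
                                 "expression", "intensity", "accuracy"])

# Repeated-measures ANOVA over the three within-subject factors.
res = AnovaRM(df, depvar="accuracy", subject="participant",
              within=["condition", "expression", "intensity"]).fit()
print(res)
# The between-subjects factor (sex) would require a mixed-design ANOVA;
# AnovaRM handles within-subject factors only.
```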
Abstract:
Introduction: Vaccines have classically been an economical and effective method for the control and prevention of many infectious diseases. In recent years, new pneumococcal vaccines have been introduced at high prices, and the various economic analyses of these vaccines worldwide show no clear trends. The objective of this work was to summarize the existing evidence across the economic studies evaluating the two second-generation pneumococcal vaccines in the at-risk population. Methodology: A systematic review of the literature was carried out in 8 databases located in different parts of the world, including sources of grey literature. Articles were first screened by title and abstract; those selected were then analyzed in full. Results: 404 articles were found, of which 20 were included in the final analysis. Most studies were conducted in areas where the disease burden is low, such as North America and Europe, while few studies were conducted in the parts of the world where the burden is highest. Likewise, most studies found the vaccines to be at least cost-effective relative to no vaccination, and all found the two second-generation vaccines to be cost-effective relative to vaccination with PCV-7. The results are highly heterogeneous, even within the same country, pointing to the need for guidelines on how to conduct this type of study. Moreover, most studies were funded by pharmaceutical companies, and only a very small number by government bodies. Conclusions: Most economic studies of the second-generation pneumococcal vaccines have been conducted in countries with a high level of economic development and sponsored by pharmaceutical companies. Given that most of the disease burden lies in regions with a lower level of economic development, more studies should be carried out in those areas. Likewise, since vaccination is a public health matter with a significant economic impact, governments should be more involved in these studies.
Abstract:
Pairs trading strategies exploit price deviations between pairs of correlated stocks and have been widely implemented by investment funds, which take long and short positions in the selected stocks when divergences arise and realize a profit by closing the position upon convergence. We describe a mean-reversion model for the dynamics of the price spread between ordinary and preferred shares of the same company trading in the same market. The long-run convergence mean is obtained with a moving-average filter; the parameters of the mean-reversion model are then estimated with a Kalman filter under a state-space formulation applied to the historical series. The algorithmic pairs trading strategy based on the proposed model is backtested, indicating potential profits in financial markets observed out of equilibrium. Applications of these results could reveal opportunities to improve portfolio performance, correct mispricing, and better weather periods of low returns.
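As a rough illustration of the pipeline described (moving-average estimate of the long-run mean, then a Kalman filter on the deviation under a mean-reverting state equation), here is a minimal Python sketch. The file name, parameter values and trading threshold are assumptions, not the paper's estimates:

```python
import numpy as np

def kalman_ar1(y, phi, q, r):
    """Filter y_t = x_t + v_t with x_t = phi * x_{t-1} + w_t (scalar),
    i.e. an AR(1) / discretized mean-reverting state equation."""
    x, p = 0.0, 1.0                    # state mean and variance
    xs = np.empty_like(y)
    for t, obs in enumerate(y):
        x, p = phi * x, phi**2 * p + q   # predict
        k = p / (p + r)                  # Kalman gain
        x, p = x + k * (obs - x), (1 - k) * p  # update
        xs[t] = x
    return xs

spread = np.loadtxt("spread.csv")      # hypothetical one-column spread series
window = 60
mean = np.convolve(spread, np.ones(window) / window, mode="same")
deviation = kalman_ar1(spread - mean, phi=0.97, q=1e-4, r=1e-2)

# Trading rule sketch: open a pair position when the filtered deviation
# is stretched, close it on reversion toward zero.
band = 2 * deviation.std()
signal = np.where(deviation > band, -1, np.where(deviation < -band, 1, 0))
```

In a fuller treatment phi, q and r would themselves be estimated, e.g. by maximum likelihood over the state-space model, rather than fixed as here.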
Current state of agent-based models and their impact on organizational research
Abstract:
In today's hyperconnected, dynamic, uncertainty-laden world, conventional analytical methods and models are showing their limitations. Organizations therefore need useful tools that employ information technology and computational simulation models as mechanisms for decision making and problem solving. One of the most recent, powerful and promising is agent-based modeling and simulation (ABMS). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies and solve problems of various kinds. Nevertheless, there is (to our knowledge) no state-of-the-art survey of ABMS and its application to organizational research. It is also worth noting that, given its novelty, the topic has not been sufficiently disseminated and developed in Latin America. This project therefore aims to produce a state-of-the-art survey of ABMS and its impact on organizational research.
Abstract:
In this paper, we employ techniques from artificial intelligence, such as reinforcement learning and agent-based modeling, as building blocks of a computational model for an economy based on conventions. First, we model the interaction among firms in the private sector. These firms behave in an information environment based on conventions, meaning that a firm is likely to behave as its neighbors do if it observes that their actions lead to a good payoff. We then propose reinforcement learning as a computational model for the role of the government in the economy, as the agent that determines fiscal policy with the objective of maximizing the growth of the economy. We present the implementation of a simulator of the proposed model based on SWARM, which employs the SARSA(λ) algorithm combined with a multilayer perceptron as the function approximator for the action-value function.
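A minimal tabular sketch of the SARSA(λ) update underlying the simulator is given below; the actual implementation replaces the table with a multilayer perceptron and runs inside SWARM, so the stand-in environment, sizes and hyperparameters here are illustrative assumptions:

```python
import numpy as np

n_states, n_actions = 50, 4
alpha, gamma, lam, epsilon = 0.1, 0.95, 0.9, 0.1
Q = np.zeros((n_states, n_actions))   # tabular action-value function
rng = np.random.default_rng(0)

def epsilon_greedy(s):
    if rng.random() < epsilon:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q[s]))

for episode in range(200):
    e = np.zeros_like(Q)               # eligibility traces
    s = int(rng.integers(n_states))
    a = epsilon_greedy(s)
    for step in range(100):
        # Stand-in environment: random transition, reward for action 0.
        s2 = int(rng.integers(n_states))
        r = 1.0 if a == 0 else 0.0
        a2 = epsilon_greedy(s2)
        delta = r + gamma * Q[s2, a2] - Q[s, a]  # TD error
        e[s, a] += 1.0                           # accumulating trace
        Q += alpha * delta * e                   # update all traced pairs
        e *= gamma * lam                         # decay traces
        s, a = s2, a2
```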
Abstract:
Many emerging Internet applications, such as Internet TV, Internet radio, and multipoint video streaming, have resource requirements including consumed bandwidth, end-to-end delay, and packet loss rate. It is therefore necessary to formulate a proposal that specifies and provides, for this class of applications, the resources they need to work well. In this thesis, we propose a multi-objective traffic engineering scheme that uses different distribution trees for multiple multicast flows. We use a multipath approach for each egress node to obtain a multitree approximation and thereby create different multicast trees. Our proposal also determines the fraction of traffic to split across the multiple trees. The scheme can be applied in MPLS networks by establishing explicit routes for multicast events. The first objective is to combine the following weighted objectives into a single aggregated metric: maximum link utilization, hop count, total consumed bandwidth, and total end-to-end delay. We formulated this multi-objective function (the MHDB-S model), and the results show that the weighted objectives are reduced and the maximum link utilization is minimized. The problem is NP-hard, so we propose an algorithm to optimize the different objectives; its behavior is similar to that of the model. Since egress nodes may join or leave the tree during a multicast transmission, we also propose a multi-objective traffic engineering scheme using different trees for dynamic multicast groups (in which the egress nodes can change during the lifetime of the connection). Recomputing a multicast tree from scratch can consume considerable CPU time, and all communications using the tree are temporarily interrupted. To alleviate these drawbacks, we propose an optimization model (the dynamic MHDB-D model) that reuses the multicast trees previously computed by the static MHDB-S model and adds the new egress nodes. Solving the analytical model with the weighted-sum method is not necessarily correct, because the solution space may be non-convex, so some solutions may not be found. In addition, other types of objectives appear in other research. For these reasons, we propose a new, generalized model called GMM, together with a new algorithm based on multi-objective evolutionary algorithms, inspired by the Strength Pareto Evolutionary Algorithm (SPEA). To handle the dynamic case with this generalized model, we propose a new dynamic model and a computational solution using probabilistic breadth-first search (BFS). Finally, to evaluate the proposed optimization scheme, we ran various tests and simulations.
The main contributions of this thesis are the taxonomy; the multi-objective optimization models for the static and dynamic multicast cases (MHDB-S and MHDB-D) and the algorithms that solve them computationally; and the generalized models, also for the static and dynamic cases (GMM and dynamic GMM), together with the computational proposals that solve them using MOEAs and probabilistic BFS.
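As an illustration of the weighted-sum aggregation used in the MHDB-S model, the sketch below combines maximum link utilization, hop count, total consumed bandwidth and total end-to-end delay into one cost. The weights, graph encoding and tree representation are assumptions for illustration, not the thesis's exact formulation:

```python
def tree_cost(trees, capacity, delay, weights=(0.4, 0.2, 0.2, 0.2)):
    """trees: list of multicast trees, each a list of (u, v, bw) links."""
    load = {}
    hops = total_bw = total_delay = 0.0
    for tree in trees:
        for u, v, bw in tree:
            load[(u, v)] = load.get((u, v), 0.0) + bw  # shared-link load
            hops += 1
            total_bw += bw
            total_delay += delay[(u, v)]
    max_util = max(load[e] / capacity[e] for e in load)
    w1, w2, w3, w4 = weights
    return w1 * max_util + w2 * hops + w3 * total_bw + w4 * total_delay

# Tiny usage example with a hypothetical two-link tree.
capacity = {("a", "b"): 10.0, ("b", "c"): 10.0}
delay = {("a", "b"): 1.0, ("b", "c"): 2.0}
tree = [("a", "b", 2.0), ("b", "c", 2.0)]
print(tree_cost([tree], capacity, delay))
```

Candidate tree sets produced by the multipath construction would be compared with such a cost; the SPEA-inspired evolutionary variant instead keeps a Pareto archive of non-dominated solutions rather than collapsing the objectives into one scalar, which avoids the non-convexity problem noted above.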
Abstract:
A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus on DO alone increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
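A generalized (regional) sensitivity analysis of the Hornberger-Spear type can be sketched as follows: sample parameter sets, split the runs into behavioral and non-behavioral by an output criterion (here DO-based), and rank parameters by the Kolmogorov-Smirnov distance between the two marginal parameter distributions. The model call, threshold and parameter count below are illustrative stand-ins, not the Q(2) interface:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
n_runs, n_params = 2000, 5
theta = rng.uniform(0.0, 1.0, size=(n_runs, n_params))  # scaled priors

def run_model(p):
    """Stand-in for a reach simulation returning a DO error metric."""
    return abs(p[0] - 0.5) + 0.2 * p[1] + 0.05 * rng.normal()

error = np.array([run_model(p) for p in theta])
behavioral = error < np.quantile(error, 0.2)  # best 20% of runs

for j in range(n_params):
    d, _ = ks_2samp(theta[behavioral, j], theta[~behavioral, j])
    print(f"parameter {j}: KS distance = {d:.3f}")  # large d = influential
```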
Abstract:
In this study, the processes affecting sea surface temperature variability over the 1992–98 period, encompassing the very strong 1997–98 El Niño event, are analyzed. A tropical Pacific Ocean general circulation model, forced by a combination of weekly ERS1–2 and TAO wind stresses, and climatological heat and freshwater fluxes, is first validated against observations. The model reproduces the main features of the tropical Pacific mean state, despite a weaker than observed thermal stratification, a 0.1 m s−1 too strong (weak) South Equatorial Current (North Equatorial Countercurrent), and a slight underestimate of the Equatorial Undercurrent. Good agreement is found between the model dynamic height and TOPEX/Poseidon sea level variability, with correlation/rms differences of 0.80/4.7 cm on average in the 10°N–10°S band. The model sea surface temperature variability is a bit weak, but reproduces the main features of interannual variability during the 1992–98 period. The model compares well with the TAO current variability at the equator, with correlation/rms differences of 0.81/0.23 m s−1 for surface currents. The model therefore reproduces well the observed interannual variability, with wind stress as the only interannually varying forcing. This good agreement with observations provides confidence in the comprehensive three-dimensional circulation and thermal structure of the model. A close examination of mixed layer heat balance is thus undertaken, contrasting the mean seasonal cycle of the 1993–96 period and the 1997–98 El Niño. In the eastern Pacific, cooling by exchanges with the subsurface (vertical advection, mixing, and entrainment), the atmospheric forcing, and the eddies (mainly the tropical instability waves) are the three main contributors to the heat budget. In the central–western Pacific, the zonal advection by low-frequency currents becomes the main contributor. Westerly wind bursts (in December 1996 and March and June 1997) were found to play a decisive role in the onset of the 1997–98 El Niño. They contributed to the early warming in the eastern Pacific because the downwelling Kelvin waves that they excited diminished subsurface cooling there. But it is mainly through eastward advection of the warm pool that they generated temperature anomalies in the central Pacific. The end of El Niño can be linked to the large-scale easterly anomalies that developed in the western Pacific and spread eastward, from the end of 1997 onward. In the far-western Pacific, because of the shallower than normal thermocline, these easterlies cooled the SST by vertical processes. In the central Pacific, easterlies pushed the warm pool back to the west. In the east, they led to a shallower thermocline, which ultimately allowed subsurface cooling to resume and to quickly cool the surface layer.
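The mixed layer heat balance decomposed in the analysis can be summarized by a standard budget equation; the grouping below mirrors the contributors named above (lateral advection, exchanges with the subsurface, atmospheric forcing), with notation that is illustrative rather than the paper's own:

```latex
\frac{\partial T}{\partial t}
  = \underbrace{-\,u\,\partial_x T - v\,\partial_y T}_{\text{horizontal advection (mean currents, eddies/TIWs)}}
  \;\underbrace{-\,w\,\partial_z T + D_z(T) - w_e\,\frac{T - T_h}{h}}_{\text{exchanges with the subsurface}}
  \;+\; \underbrace{\frac{Q_{\mathrm{net}}}{\rho\, c_p\, h}}_{\text{atmospheric forcing}}
```

where T is the mixed layer temperature, h the mixed layer depth, w_e the entrainment velocity, T_h the temperature just below the layer, and Q_net the net surface heat flux.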
Abstract:
Understanding how multiple signals are integrated in living cells to produce a balanced response is a major challenge in biology. Two-component signal transduction pathways, such as bacterial chemotaxis, comprise histidine protein kinases (HPKs) and response regulators (RRs). These are used to sense and respond to changes in the environment. Rhodobacter sphaeroides has a complex chemosensory network with two signaling clusters, each containing a HPK, CheA. Here we demonstrate, using a mathematical model, how the outputs of the two signaling clusters may be integrated. We use our mathematical model supported by experimental data to predict that: (1) the main RR controlling flagellar rotation, CheY6, aided by its specific phosphatase, the bifunctional kinase CheA3, acts as a phosphate sink for the other RRs; and (2) a phosphorelay pathway involving CheB2 connects the cytoplasmic cluster kinase CheA3 with the polar localised kinase CheA2, and allows CheA3-P to phosphorylate non-cognate chemotaxis RRs. These two mechanisms enable the bifunctional kinase/phosphatase activity of CheA3 to integrate and tune the sensory output of each signaling cluster to produce a balanced response. The signal integration mechanisms identified here may be widely used by other bacteria, since like R. sphaeroides, over 50% of chemotactic bacteria have multiple cheA homologues and need to integrate signals from different sources.
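The phosphate-sink prediction can be illustrated with a toy ODE system in which two response regulators compete for one phosphorylated kinase pool, and the sink regulator (standing in for CheY6 backed by CheA3's phosphatase activity) is dephosphorylated much faster. All species and rate constants are illustrative assumptions, not the paper's fitted model:

```python
from scipy.integrate import solve_ivp

k_auto = 2.0                          # kinase autophosphorylation
k_sink, k_other = 5.0, 1.0            # phosphotransfer to each RR
k_dep_sink, k_dep_other = 10.0, 0.1   # dephosphorylation rates
A_total, RR = 1.0, 1.0                # total kinase and RR pools

def rhs(t, y):
    a_p, sink_p, other_p = y
    auto = k_auto * (A_total - a_p)
    to_sink = k_sink * a_p * (RR - sink_p)    # competition for a_p ...
    to_other = k_other * a_p * (RR - other_p)  # ... couples the two RRs
    return [auto - to_sink - to_other,
            to_sink - k_dep_sink * sink_p,
            to_other - k_dep_other * other_p]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0, 0.0])
print(sol.y[:, -1])
# A fast-cycling, strongly dephosphorylated sink keeps the kinase-P pool
# drained, which limits the phosphorylation level of the competing RR.
```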
Abstract:
In the past decade, a number of mechanistic, dynamic simulation models of several components of the dairy production system have become available. However, their use has been limited by the detailed technical knowledge and special software required to run them, and by the lack of compatibility between models in predicting various metabolic processes in the animal. The first objective of the current study was to integrate the dynamic models of [Brit. J. Nutr. 72 (1994) 679] on rumen function, [J. Anim. Sci. 79 (2001) 1584] on methane production, and [J. Anim. Sci. 80 (2002) 2481] on N partition, together with a new model of P partition. The second objective was to construct a decision support system to analyse nutrient partition between animal and environment. The integrated model combines key environmental pollutants such as N, P and methane within a nutrient-based feed evaluation system. The model was run under different scenarios and the sensitivity of various parameters analysed. A comparison of predictions from the integrated model with the original simulation models showed an improvement in N excretion, since the integrated model uses the dynamic model of [Brit. J. Nutr. 72 (1994) 679] to predict microbial N, which was not represented in detail in the original model. The integrated model can be used to investigate the degree to which production and environmental objectives are antagonistic, and it may help to explain and understand the complex mechanisms involved at the ruminal and metabolic levels. Among the integrated model's outputs are the forms of N and P in excreta and methane, which can be used as indices of environmental pollution. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
The nicotinic Acetylcholine Receptor (nAChR) is the major class of neurotransmitter receptors involved in many neurodegenerative conditions such as schizophrenia, Alzheimer's and Parkinson's diseases. The N-terminal region or Ligand Binding Domain (LBD) of nAChR is located at the pre- and post-synaptic nervous system, where it mediates synaptic transmission. nAChR acts as the drug target for agonist and competitive antagonist molecules that modulate signal transmission at the nerve terminals. Using the Acetylcholine Binding Protein (AChBP) from Lymnaea stagnalis as the structural template, a homology modeling approach was carried out to build a three-dimensional model of the N-terminal region of human alpha(7)nAChR. This theoretical model is an assembly of five alpha(7) subunits with 5-fold axial symmetry, constituting a channel, with the binding pocket present at the interface region of the subunits. alpha-neurotoxin is a potent nAChR competitive antagonist that readily blocks the channel, resulting in paralysis. The molecular interaction of alpha-Bungarotoxin, a long-chain alpha-neurotoxin from Bungarus multicinctus, with human alpha(7)nAChR was studied. Agonists such as acetylcholine and nicotine, which are used in a diverse array of biological activities, such as enhancement of cognitive performance, were also docked with the theoretical model of human alpha(7)nAChR. These docked complexes were analyzed further to identify the crucial residues involved in the interaction. These results provide the details of the interaction of agonists and competitive antagonists with the three-dimensional model of the N-terminal region of human alpha(7)nAChR and thereby point to the design of novel lead compounds.
Abstract:
A mathematical model describing the uptake of low density lipoprotein (LDL) and very low density lipoprotein (VLDL) particles by a single hepatocyte cell is formulated and solved. The model includes a description of the dynamic change in receptor density on the surface of the cell due to the binding and dissociation of the lipoprotein particles, the subsequent internalisation of bound particles, receptors and unbound receptors, the recycling of receptors to the cell surface, cholesterol dependent de novo receptor formation by the cell and the effect that particle uptake has on the cell's overall cholesterol content. The effect that blocking access to LDL receptors by VLDL, or internalisation of VLDL particles containing different amounts of apolipoprotein E (we will refer to these particles as VLDL-2 and VLDL-3) has on LDL uptake is explored. By comparison with experimental data we find that measures of cell cholesterol content are important in differentiating between the mechanisms by which VLDL is thought to inhibit LDL uptake. We extend our work to show that in the presence of both types of VLDL particle (VLDL-2 and VLDL-3), measuring relative LDL uptake does not allow differentiation between the results of blocking and internalisation of each VLDL particle to be made. Instead by considering the intracellular cholesterol content it is found that internalisation of VLDL-2 and VLDL-3 leads to the highest intracellular cholesterol concentration. A sensitivity analysis of the model reveals that binding, unbinding and internalisation rates, the fraction of receptors recycled and the rate at which the cholesterol dependent free receptors are created by the cell have important implications for the overall uptake dynamics of either VLDL or LDL particles and subsequent intracellular cholesterol concentration. (C) 2008 Elsevier Ltd. All rights reserved.
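The receptor-cycling structure described (binding, internalisation, receptor recycling, and cholesterol-dependent de novo receptor synthesis) can be sketched as a small ODE system. Parameter values and the constant-extracellular-LDL simplification are illustrative assumptions, not the paper's fitted model:

```python
from scipy.integrate import solve_ivp

k_on, k_off, k_int = 0.2, 0.05, 0.1   # bind / unbind / internalise
f_rec, k_rec = 0.7, 0.05              # recycled fraction, recycling rate
k_syn, chol_half = 0.02, 1.0          # receptor synthesis, feedback scale
chol_per_particle, k_use = 0.5, 0.01  # cholesterol delivered / consumed
ldl = 1.0                             # extracellular LDL (held constant)

def rhs(t, y):
    r_free, r_bound, r_internal, chol = y
    bind = k_on * ldl * r_free - k_off * r_bound
    internalise = k_int * r_bound
    recycle = k_rec * r_internal       # (1 - f_rec) is degraded
    synthesis = k_syn * chol_half / (chol_half + chol)  # repressed by chol
    return [
        -bind + f_rec * recycle + synthesis,  # free surface receptors
        bind - internalise,                   # bound surface receptors
        internalise - recycle,                # internalised receptor pool
        chol_per_particle * internalise - k_use * chol,  # cell cholesterol
    ]

sol = solve_ivp(rhs, (0.0, 500.0), [1.0, 0.0, 0.0, 0.0])
print(sol.y[:, -1])  # approximate steady state of the four pools
```

Blocking by VLDL versus internalisation of VLDL would enter such a system as, respectively, a competing occupier of r_free that delivers no cholesterol, or a second internalised species that does, which is why the cholesterol pool discriminates between the two mechanisms.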
Abstract:
A model for the structure of amorphous molybdenum trisulfide, a-MoS3, has been created using reverse Monte Carlo methods. This model, which consists of chains of MoS6 units, each sharing three sulfurs with each of its two neighbors and forming alternating long (nonbonded) and short (bonded) Mo-Mo separations, is a good fit to the neutron diffraction data and is chemically and physically realistic. The paper identifies the limitations of previous models based on Mo3 triangular clusters in accounting for the available experimental data.
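The reverse Monte Carlo loop used to build such a model has a simple core: perturb one atom, recompute the fit to the measured pattern, and accept or reject the move with a Metropolis criterion on the change in chi-squared. The sketch below uses a crude stand-in for the neutron S(Q) and no chemical constraints, both of which a production RMC refinement would include:

```python
import numpy as np

rng = np.random.default_rng(2)
n_atoms, box = 60, 15.0
pos = rng.uniform(0.0, box, size=(n_atoms, 3))
q = np.linspace(1.0, 10.0, 64)
target = np.exp(-q / 5.0)          # stand-in for the measured S(Q)

def chi2(p):
    # Crude S(Q) proxy from pair distances (Debye-like sum); real RMC
    # uses proper partial structure factors and periodic boundaries.
    d = np.linalg.norm(p[:, None] - p[None, :], axis=-1)
    d = d[np.triu_indices(n_atoms, 1)]
    s = np.sinc(np.outer(q, d) / np.pi).mean(axis=1)  # sin(qd)/(qd)
    return float(np.sum((s - target) ** 2))

cost, sigma = chi2(pos), 0.01
for step in range(500):
    i = rng.integers(n_atoms)
    trial = pos.copy()
    trial[i] = (trial[i] + rng.normal(scale=0.3, size=3)) % box
    new = chi2(trial)
    # Metropolis acceptance on the change in fit to the data.
    if new < cost or rng.random() < np.exp((cost - new) / sigma):
        pos, cost = trial, new
```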
Abstract:
The surfactant properties of aqueous protein mixtures (ranaspumins) from the foam nests of the tropical frog Physalaemus pustulosus have been investigated by surface tension, two-photon excitation fluorescence microscopy, specular neutron reflection, and related biophysical techniques. Ranaspumins lower the surface tension of water more rapidly and more effectively than standard globular proteins under similar conditions. Two-photon excitation fluorescence microscopy of nest foams treated with a fluorescent marker (anilinonaphthalene sulfonic acid) shows partitioning of hydrophobic proteins into the air-water interface and allows imaging of the foam structure. The surface excess of the adsorbed protein layers, determined from measurements of neutron reflection from the surface of water utilizing H2O/D2O mixtures, shows a persistent increase of surface excess and layer thickness with bulk concentration. At the highest concentration studied (0.5 mg ml^-1), the adsorbed layer is characterized by three distinct regions: a protruding top layer of ~20 Å, a middle layer of ~30 Å, and a more diffuse submerged layer projecting some 25 Å into bulk solution. This suggests a model involving self-assembly of protein aggregates at the air-water interface in which initial foam formation is facilitated by specific surfactant proteins in the mixture, further stabilized by subsequent aggregation and cross-linking into a multilayer surface complex.