961 results for algebraic preservation theorem


Relevance: 20.00%

Abstract:

We show that a self-generated set of combinatorial games, S, may not be hereditarily closed, but that strong self-generation and hereditary closure are equivalent in the universe of short games. In [13], the question "Is there a set which will give a non-distributive but modular lattice?" appears. A useful necessary condition for the existence of a finite non-distributive modular L(S) is proved. We show the existence of an S such that L(S) is modular and not distributive, exhibiting the first known example. Moreover, we prove a Representation Theorem with Games that allows the generation of all finite lattices in a game context. Finally, a computational tool for drawing lattices of games is presented. (C) 2014 Elsevier B.V. All rights reserved.
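
The distinction the abstract turns on can be checked mechanically on small examples. Below is a minimal Python sketch, unrelated to the authors' computational tool and to their specific construction of L(S), that brute-force tests the modular and distributive laws on the five-element diamond lattice M3, the standard example of a lattice that is modular but not distributive; the element names and the meet/join definitions are illustrative assumptions.

    # Diamond lattice M3: bottom '0', top '1', three pairwise incomparable atoms.
    elements = ['0', 'a', 'b', 'c', '1']

    def meet(x, y):
        if x == y:
            return x
        if x == '0' or y == '0':
            return '0'
        if x == '1':
            return y
        if y == '1':
            return x
        return '0'  # two distinct atoms meet at the bottom

    def join(x, y):
        if x == y:
            return x
        if x == '1' or y == '1':
            return '1'
        if x == '0':
            return y
        if y == '0':
            return x
        return '1'  # two distinct atoms join at the top

    def leq(x, y):
        return meet(x, y) == x

    # Modular law: x <= z implies x v (y ^ z) == (x v y) ^ z.
    modular = all(join(x, meet(y, z)) == meet(join(x, y), z)
                  for x in elements for y in elements for z in elements if leq(x, z))

    # Distributive law: x ^ (y v z) == (x ^ y) v (x ^ z).
    distributive = all(meet(x, join(y, z)) == join(meet(x, y), meet(x, z))
                       for x in elements for y in elements for z in elements)

    print(modular, distributive)  # expected output: True False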

Relevance: 20.00%

Abstract:

Binary operations on commutative Jordan algebras (CJA) can be used to study interactions between sets of factors belonging to a pair of models in which one nests the other. It should be noted that, from two CJA, these binary operations allow us to build a new CJA. Thus, when we nest the treatments of one model inside each treatment of another model, we can study the interactions between the sets of factors of the first and second models.
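
For background, the product under which the symmetric matrices of a given order form a commutative Jordan algebra is the symmetrized matrix product; the abstract does not spell out which binary operations between two CJA it uses, so only this standard definition and the Jordan identity are recalled here, in LaTeX:

    A \circ B = \tfrac{1}{2}\,(AB + BA), \qquad
    A \circ B = B \circ A, \qquad
    (A \circ B) \circ A^{2} = A \circ \bigl(B \circ A^{2}\bigr),
    \quad \text{where } A^{2} = A \circ A .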

Relevance: 20.00%

Abstract:

This paper presents the applicability of a reinforcement learning algorithm based on the application of the Bayesian theorem of probability. The proposed reinforcement learning algorithm is an advantageous and indispensable tool for ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system that has the purpose of providing decision support to electricity market negotiating players. ALBidS uses a set of different strategies for providing decision support to market players. These strategies are used accordingly to their probability of success for each different context. The approach proposed in this paper uses a Bayesian network for deciding the most probably successful action at each time, depending on past events. The performance of the proposed methodology is tested using electricity market simulations in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets). MASCEM provides the means for simulating a real electricity market environment, based on real data from real electricity market operators.
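
As a rough illustration of the idea of choosing, per context, the strategy with the highest estimated probability of success given past outcomes, here is a minimal Python sketch using independent Beta-Bernoulli updates; it is not the ALBidS/MASCEM implementation, and the strategy names, context keys and notion of "success" are illustrative assumptions.

    from collections import defaultdict

    class StrategySelector:
        """Keeps a Beta(successes+1, failures+1) belief per (context, strategy)."""

        def __init__(self, strategies):
            self.strategies = strategies
            # counts[(context, strategy)] = [successes, failures]
            self.counts = defaultdict(lambda: [0, 0])

        def choose(self, context):
            # Pick the strategy with the highest posterior mean success probability.
            def posterior_mean(strategy):
                s, f = self.counts[(context, strategy)]
                return (s + 1) / (s + f + 2)
            return max(self.strategies, key=posterior_mean)

        def update(self, context, strategy, success):
            # Record the observed outcome of playing `strategy` in `context`.
            self.counts[(context, strategy)][0 if success else 1] += 1

    # Illustrative usage with made-up strategy names and a made-up context.
    selector = StrategySelector(["naive_average", "regression", "game_theory"])
    selector.update("peak_hours", "regression", success=True)
    print(selector.choose("peak_hours"))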

Relevance: 20.00%

Abstract:

We increasingly face conservative surgery for rectal cancer and even the so-called 'wait and see' approach, as 10–20% of patients can reach a complete pathological response at the time of surgery. But what can we say to our patients about risks? Standard surgery with mesorectal excision gives a <2% local recurrence rate with a postoperative death rate of 2–8% (which may reach 30% at 6 months in those over 85), but low anterior resection entails some deterioration in bowel function, and in low cancers a permanent stoma may be required. A long-term impact on urinary and sexual function is also possible. Distant metastasis rates seem to be identical in the standard and conservative approaches. It is difficult to evaluate the conservative approach because there is no clear standardization of surgery for low rectal cancer. Rullier et al. tried to clarify this and found identical results for recurrence (5–9%) and disease-free survival (70%) at 5 years for coloanal anastomosis and intersphincteric resection. Other series have found local recurrence higher than with the standard approach, functional results may be worse and, in some situations, salvage therapy is compromised or has more complications. In this context, functional outcomes are very important, but most studies are incomplete in measuring bowel function in the setting of the conservative approach. In 2005 Temple et al. surveyed 122/184 patients after sphincter-preserving surgery and found incomplete evacuation in 96.9%, clustering in 94.4%, food affecting frequency in 93.2% and gas incontinence in 91.8%, and proposed a systematic evaluation with a specific questionnaire. Concerning the 'wait and see' approach for complete clinical responders, it was first advocated by Habr-Gama for tumors up to 7 cm, with a low locoregional failure of 4.6%, 5-year overall survival of 96% and disease-free survival of 72%; one fifth of patients failed in the first year. A Dutch trial had identical results, but others had worse recurrence rates; in other series 25% of patients could not be salvaged even with APR, and 30% had subsequent metastatic disease, which seems equal for 'wait and see' and operated patients. In a recent review, Glynne-Jones considers that all the evaluated 'wait and see' studies are heterogeneous in staging, inclusion criteria, design and follow-up after chemoradiation, and that there is a suggestion that patients who progress while under observation fare worse than those resected. He proposes long-term observational studies with more uniform inclusion criteria. We are now at a moment where we may be more aggressive in early cancer and in neoadjuvant treatment in order to be more conservative in the subsequent treatment, but we need better stratification of patients, better evaluation of results and clearer prognostic markers.

Relevance: 20.00%

Abstract:

Dissertation presented to obtain the degree of Master in Molecular Genetics and Biomedicine

Relevance: 20.00%

Abstract:

Financial crises have happened in the past and will continue to do so in the future. In the most recent crisis, in 2008, global equities (as measured by the MSCI ACWI index) lost a staggering 54.2% in USD over the year. During such periods, wealth preservation rises to the top of most investors' concerns. The purpose of this paper is to develop a strategy that protects the investment during bear markets and significant market corrections, generates capital appreciation, and can support Millennium BCP's Wealth Management Unit in its asset allocation procedures. This strategy extends the Dual Momentum approach introduced by Gary Antonacci (2014) in two ways. First, the investable set of securities in the equities space increases from two to four: besides the US, it comprises the Japanese, European (excl. UK) and EM equity indices. Secondly, it adds a volatility filter as well as three indicators related to the business cycle and the state of the economy, which are relevant for deciding on the strategy's exposure to equities. Overall, the results attest to the resiliency of the strategy before, during and after historical financial crashes, as it drastically reduces downside exposure and consistently outperforms the benchmark index by providing higher mean returns with lower variance.
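
To make the mechanics concrete, here is a minimal Python sketch of a dual momentum rule extended to four regional equity indices; the 12-month lookback, the cash proxy, the column names and the simple volatility cap are illustrative assumptions, not the paper's exact specification (which also relies on business-cycle indicators).

    import pandas as pd

    def dual_momentum_signal(prices: pd.DataFrame, cash: pd.Series,
                             lookback: int = 252, vol_cap: float = 0.25) -> str:
        """Return the asset to hold for the next period.

        prices: daily index levels, e.g. columns ['US', 'Japan', 'Europe_exUK', 'EM']
        cash:   daily level of a cash/T-bill proxy (illustrative)
        """
        # Relative momentum: trailing return of each equity index over the lookback.
        equity_ret = prices.iloc[-1] / prices.iloc[-lookback] - 1.0
        best = equity_ret.idxmax()

        # Absolute momentum: only hold equities if the winner beats cash.
        cash_ret = cash.iloc[-1] / cash.iloc[-lookback] - 1.0
        if equity_ret[best] <= cash_ret:
            return "CASH"

        # Simple volatility filter (assumption): stand aside if the winner's
        # recent annualized volatility exceeds the cap.
        recent_vol = prices[best].pct_change().iloc[-63:].std() * (252 ** 0.5)
        if recent_vol > vol_cap:
            return "CASH"

        return best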

Relevance: 20.00%

Abstract:

This paper complements the information presented at CIAV2013 on vernacular buildings in northern Portugal, and addresses the topic of masonry walls in the rural areas of the northwestern Portuguese coastline. These walls are structural schist masonry constructions, built using ancient techniques and locally available resources. The result is a territory built for agricultural exploration, and a landscape imprinted with past social hierarchies and structures. Using the information gathered in the fieldwork study, the paper presents studies of masonry walls with different morphologies, construction materials and building techniques. The information presented aims to enlighten researchers and technicians about the specificities of these constructions, to increase the scarce available literature on schist's potential as a construction material, and to highlight the cultural value of this particular kind of heritage.

Relevance: 20.00%

Abstract:

Bioactive compounds are a large group of compounds (antimicrobials, antioxidants, nutrients, etc.), and their use in edible films and coatings for application on fruits and vegetables has become very important because consumers nowadays demand fruits and vegetables that are fresh, healthy, of high quality and easy to prepare. A number of investigations have shown that the use of additives in edible films and coatings improves their functionality and provides compounds beneficial to human health. However, it is necessary to continue research that can generate specific or tailor-made edible films and coatings for each product, with the best characteristics for preservation. In this review we present and analyze the concepts, progress and perspectives in the design and application of edible films and coatings for fruits and vegetables, in order to define the challenges and opportunities of this topic of study in the field of food science, technology and engineering.

Relevance: 20.00%

Abstract:

The general objective of this project is to contribute to the genetic and molecular biochemical characterization of mechanisms involved in the maintenance of genetic information, through the study of physiological systems involved in the prevention, repair and tolerance of mutations. These systems are evolutionarily conserved and widely distributed among living organisms. Their importance is reflected in the fact that their deficiency causes genetic diseases, apoptosis and cancer in humans, and so-called "hypermutator" cells in prokaryotic species. In recent years the study of hypermutability in bacteria has attracted great interest, since it is considered important in infectious processes and in basic aspects of evolution. Our study models are the bacteria Pseudomonas aeruginosa and Escherichia coli, the latter being not only a study model but also a reference species. P. aeruginosa is a gram-negative environmental bacterium and an important opportunistic human pathogen. Specifically, we propose to study in P. aeruginosa particular aspects of the Mismatch Repair System (MRS), of the system for the prevention/repair of oxidative lesions generated through 8-oxo-7,8-dihydroguanine (8-oxo-dG, or GO), and the role of low-fidelity DNA polymerases in modulating the mutation rate. We are also interested in studying, in E. coli strains deficient in the Dam system, the existence of subpopulations of high genetic stability due to the elimination of potential mutants through increased expression of the other MRS components. Methodologically, the biochemical characterization of protein factors will be carried out using purified recombinant proteins, analysis of protein-protein and protein-DNA interactions by gel electrophoresis and surface plasmon resonance (Biacore), in vitro site-directed mutagenesis, and complementation studies in specific mutant strains. Phenotypic and gene-regulation aspects in biofilm cultures and cells in suspension will be studied through the construction of mutant strains, transcriptional fusions, real-time PCR, western blotting and confocal fluorescence microscopy.

Relevance: 20.00%

Abstract:

The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases the use of random agents produces better and more efficient solutions; in others they provide solutions where it is impossible to find them by traditional methods. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them may end up multiplying their harmful effects in undesired ways. Currently, the greatest effort in the analysis of probabilistic programs is devoted to the study and development of tools known as probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically obtain various performance measures of it. Although this can be quite useful when verifying programs, for general-purpose systems it becomes necessary to check more complete specifications concerning the correctness of the algorithm. It would even be interesting to obtain the properties of the system automatically, in the form of invariants and counterexamples. This project intends to address the problem of static analysis of probabilistic programs by means of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of Abstract Interpretation, which provides an outline for our theoretical development. At the same time, we will put these foundations into practice through concrete implementations that use those tools.
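
For background on the kind of quantity a probabilistic model checker computes, here is a minimal Python sketch that estimates the probability of reaching a target state in a small discrete-time Markov chain by fixed-point iteration; the chain, the state names and the tolerance are made up for illustration, and the project itself aims at deductive and abstract-interpretation techniques rather than this kind of finite-state computation.

    # Transition probabilities of a small, made-up discrete-time Markov chain.
    P = {
        "init": {"init": 0.2, "work": 0.8},
        "work": {"work": 0.5, "done": 0.3, "fail": 0.2},
        "done": {"done": 1.0},
        "fail": {"fail": 1.0},
    }
    target = "done"

    # Fixed-point iteration for x_s = P(reach target from s):
    #   x_target = 1, and x_s = sum_t P[s][t] * x_t otherwise.
    x = {s: 1.0 if s == target else 0.0 for s in P}
    for _ in range(10_000):
        new_x = {s: 1.0 if s == target else sum(p * x[t] for t, p in P[s].items())
                 for s in P}
        if max(abs(new_x[s] - x[s]) for s in P) < 1e-12:
            x = new_x
            break
        x = new_x

    print(x["init"])  # probability of eventually reaching "done" from "init"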

Relevance: 20.00%

Abstract:

The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings. This includes, e.g., Banach space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds are established and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
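
For reference, the two classical statements the abstract contrasts with the non-classical approach can be written out in LaTeX; the notation (S_n for the partial sum, Phi for the standard normal distribution function, C for an absolute constant) is the usual one and is supplied here rather than taken from the abstract.

    % Central limit theorem for i.i.d. X_1, X_2, ... with mean \mu and variance \sigma^2 > 0:
    \sup_{x \in \mathbb{R}} \left| \mathbb{P}\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \longrightarrow 0,
    \qquad S_n = \sum_{i=1}^{n} X_i .

    % Berry--Esseen bound, assuming additionally \rho = \mathbb{E}|X_1 - \mu|^3 < \infty:
    \sup_{x \in \mathbb{R}} \left| \mathbb{P}\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \le \frac{C\,\rho}{\sigma^{3}\sqrt{n}} .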