944 results for Augmentative manipulation
Abstract:
Software reconfigurability has become increasingly relevant to the architectural process due to the growing dependency of modern societies on reliable and adaptable systems. Such systems are expected to adapt to changes in their surrounding environment with minimal, if any, service disruption. This paper introduces an engine that statically applies reconfigurations to (formal) models of software architectures. Reconfigurations are specified using a domain-specific language, ReCooPLa, which targets the manipulation of software coordination structures, typically used in service-oriented architectures (SOA). The engine is responsible for compiling ReCooPLa instances and applying them to the relevant coordination structures. The resulting configurations are amenable to formal analysis of qualitative and quantitative (probabilistic) properties.
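The abstract does not reproduce ReCooPLa syntax, so the following is only a loose illustration of the idea (all names are hypothetical and not ReCooPLa): a coordination structure pictured as a set of channels between ports, and a reconfiguration as a pure function that statically produces a new structure without mutating the old one.

```python
# Hypothetical miniature of static reconfiguration (not ReCooPLa syntax).
# A coordination structure is a frozenset of (source_port, channel_kind,
# sink_port) triples; a reconfiguration maps one structure to the next.

def replace_channel(structure, old, new):
    """Return a new coordination structure with `old` swapped for `new`.

    The input structure is left untouched, mirroring the static (offline)
    application of a reconfiguration to an architectural model.
    """
    if old not in structure:
        raise ValueError(f"channel {old} not in structure")
    return (structure - {old}) | {new}

soa = frozenset({("client", "sync", "broker"), ("broker", "sync", "service")})
# reconfigure the client-broker link from a synchronous to a buffered channel
soa2 = replace_channel(soa, ("client", "sync", "broker"),
                            ("client", "fifo", "broker"))
```

Because each reconfiguration step returns a fresh structure, intermediate configurations remain available for the kind of qualitative and quantitative analysis the paper describes.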
Abstract:
Stress exposure triggers cognitive and behavioral impairments that influence decision-making processes. Decisions under uncertainty require complex reward-prediction processes that are known to be mediated by the mesocorticolimbic dopamine (DA) system in brain areas sensitive to the deleterious effects of chronic stress, in particular the orbitofrontal cortex (OFC). Using a decision-making task, we show that chronic stress biases risk-based decision-making toward safer behaviors. This decision-making pattern is associated with increased activation of the lateral part of the OFC and with morphological changes in pyramidal neurons specifically recruited by this task. Additionally, stress exposure induces a hypodopaminergic status accompanied by increased mRNA levels of the dopamine receptor type 2 (Drd2) in the OFC; importantly, treatment with the D2/D3 agonist quinpirole reverses the stress-induced shift toward safer behaviors in risky decision-making. These results suggest that the brain mechanisms underlying risk-based decision-making are altered after chronic stress, but can be modulated by manipulation of dopaminergic transmission.
Abstract:
Master's dissertation in Applied Biochemistry (specialization in Biotechnology)
Abstract:
Master's dissertation in Special Education (specialization in Early Intervention)
Abstract:
Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. In pursuit of guarantees that software behaves correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. By their nature, formal methods demand considerable expertise and knowledge, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been confined to critical systems, i.e., systems whose malfunction can cause serious harm, even though the benefits of these techniques are relevant to every kind of software. Carrying the benefits of formal methods into development contexts broader than critical systems would have a strong impact on productivity in those contexts. Hypothesis. The availability of automated analysis tools is of great importance. Several powerful formal-methods-based analysis tools, aimed directly at source code, illustrate this. For the large majority of these tools, however, the gap between the notions developers are used to and those required to apply formal analysis remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits. Moreover, in many cases interpreting the tool's output requires some command of the underlying formal method. This problem can be alleviated by building suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability).
This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack the problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to quality, in the sense of functional correctness, of specifications, models, or code in the context of software development. More precisely, it seeks, on the one hand, to identify specific settings in which certain automated analysis techniques, such as SMT- or SAT-solving-based analysis or model checking, can be pushed to levels of scalability beyond those known for these techniques in general settings. The adaptations of the chosen techniques will be implemented in tools usable by developers familiar with the application context but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials employed will be literature relevant to the area and computing equipment. Methods: those of discrete mathematics, logic, and software engineering. Expected results. One expected result of the project is the identification of specific domains of application for formal analysis methods. The project is also expected to yield analysis tools usable enough to be applied by developers without specific training in the underlying formal methods. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness.
Traditionally, formal approaches to software development concentrate on functional correctness, and tackle this problem by building on well-defined notations founded on solid mathematical grounds. This makes formal methods well suited to analysis, thanks to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather limited, and applications of formal methods have been confined to critical systems. Nevertheless, the advantages that formal methods provide clearly apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases are far from simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and its techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes in which to apply automated formal analysis techniques. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses compared with the general case.
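The scalability concern can be made concrete with a toy example, sketched below in Python (this is not a tool from the project). A brute-force satisfiability check enumerates all 2^n truth assignments, which is exactly the exponential blowup that SAT/SMT solvers, and the domain-specific restrictions the project pursues, try to tame.

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Exhaustively search all 2^n assignments for one satisfying a CNF formula.

    `clauses` is a list of clauses; each clause is a list of non-zero ints,
    where k means variable k is true and -k means it is false (DIMACS style).
    Returns a satisfying assignment as a dict, or None if unsatisfiable.
    """
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            return assign
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
print(brute_force_sat(clauses, 3))
```

Each additional variable doubles the search space; real solvers prune it with unit propagation and learned clauses, but worst-case growth remains the core obstacle.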
Abstract:
The adoption of a sustainable approach to meeting the energy needs of society has recently taken on a more central and urgent place in the minds of many people, for ecological, environmental and economic reasons among others. One area where a sustainable approach has become especially relevant is the production of electricity. The contribution of renewable sources to the energy mix supplying the electricity grid is nothing new, but the focus has begun to move away from the more conventional renewable sources such as wind and hydro. Exploring new and innovative sources of renewable energy is now seen as imperative as the older forms (e.g. hydro) reach the saturation point of their possible exploitation. One such innovative source currently beginning to be utilised is tidal energy. The purpose of this thesis is to isolate one specific drawback of tidal energy that could be considered a roadblock to this energy source becoming a major contributor to the Irish national grid: the inconsistent way in which a tidal device generates energy over the course of a 24-hour period. This inconsistency of supply can force the cycling of conventional power plants to even out the supply, leading to additional costs. The thesis includes a review of literature on tidal and other marine energy sources, with an emphasis on the state-of-the-art devices currently in development or production. The research carried out included the analysis of tidal data and its manipulation into a model of the power-generating potential at specific sites. A solution to the inconsistency of supply is then proposed, which involves positioning tidal generation installations at specifically selected locations around the Irish coast.
The temporal shift achieved in the power supply profiles of the individual sites, by locating the installations correctly, produced an overall power supply profile with a smoother curve and a consistent base-load energy supply. Some limitations of the method employed were also outlined, and suggestions for further improvements were made.
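The thesis's site data and models are not reproduced here; the following toy Python sketch, assuming idealized sin² semi-diurnal output profiles (all numbers illustrative), shows why staggering the tidal phase across sites flattens the aggregate supply curve.

```python
import math

PERIOD_H = 12.42  # semi-diurnal lunar tidal period, in hours

def site_power(t, phase_h, peak=1.0):
    """Idealized power output of one tidal site at time t (hours).

    Power varies roughly with the square of current speed, so a rectified
    sin^2 profile peaking twice per tidal cycle is a common idealization.
    """
    return peak * math.sin(math.pi * (t - phase_h) / PERIOD_H) ** 2

def aggregate(ts, phases):
    """Total output of several sites whose tidal phases differ by `phases`."""
    return [sum(site_power(t, p) for p in phases) for t in ts]

def spread(xs):
    """Peak-to-trough variation of a supply profile."""
    return max(xs) - min(xs)

ts = [i * 0.25 for i in range(97)]   # 24 h sampled at 15-minute steps
single = aggregate(ts, [0.0])
# three sites with tidal phases staggered by a third of the cycle
staggered = aggregate(ts, [0.0, PERIOD_H / 3, 2 * PERIOD_H / 3])
print(spread(single), spread(staggered))
```

With phases a third of a cycle apart, the three sin² terms sum to a constant (3/2 of one site's peak), so the staggered profile's peak-to-trough spread collapses to essentially zero while a single site swings between zero and full output.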
Abstract:
It is well-known that couples that look jointly for jobs in the same centralized labor market may cause instabilities. We demonstrate that for a natural preference domain for couples, namely the domain of responsive preferences, the existence of stable matchings can easily be established. However, a small deviation from responsiveness in one couple's preference relation, modeling the wish of a couple to be closer together, may already cause instability. This demonstrates that the nonexistence of stable matchings in couples markets is not a singular theoretical irregularity. Our nonexistence result persists even when a weaker stability notion is used that excludes myopic blocking. Moreover, we show that even if preferences are responsive, problems arise that do not arise in singles markets. Even though for couples markets with responsive preferences the set of stable matchings is nonempty, the lattice structure that this set has for singles markets does not carry over. Furthermore, we demonstrate that the new algorithm adopted by the National Resident Matching Program to fill positions for physicians in the United States may cycle even when a stable matching does in fact exist, and may be prone to strategic manipulation if the members of a couple pretend to be single.
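For the singles-market baseline against which the paper contrasts couples, the classical doctor-proposing deferred-acceptance (Gale-Shapley) algorithm always yields a stable matching. A minimal Python sketch follows (illustrative agent names; this is not the NRMP couples algorithm, whose extra machinery is exactly what the paper scrutinizes); it assumes complete preference lists and equally many doctors and hospitals.

```python
def deferred_acceptance(doctor_prefs, hospital_prefs):
    """Doctor-proposing Gale-Shapley for a one-to-one singles market.

    Each prefs dict maps an agent to its full preference list (most
    preferred first). Returns a stable matching as {doctor: hospital}.
    """
    rank = {h: {d: i for i, d in enumerate(ps)}
            for h, ps in hospital_prefs.items()}
    free = list(doctor_prefs)            # doctors not yet held anywhere
    next_choice = {d: 0 for d in doctor_prefs}
    held = {}                            # hospital -> doctor currently held
    while free:
        d = free.pop()
        h = doctor_prefs[d][next_choice[d]]  # best hospital not yet tried
        next_choice[d] += 1
        cur = held.get(h)
        if cur is None:
            held[h] = d                  # hospital tentatively accepts
        elif rank[h][d] < rank[h][cur]:
            held[h] = d                  # hospital trades up, rejects cur
            free.append(cur)
        else:
            free.append(d)               # hospital rejects the proposal
    return {d: h for h, d in held.items()}

doctor_prefs = {"d1": ["h1", "h2"], "d2": ["h1", "h2"]}
hospital_prefs = {"h1": ["d2", "d1"], "h2": ["d1", "d2"]}
print(deferred_acceptance(doctor_prefs, hospital_prefs))
```

Both doctors want h1, but h1 prefers d2, so the run settles on d2-h1 and d1-h2; no doctor-hospital pair prefers each other to their assignments, which is the stability property that can fail once couples submit joint preferences.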
Abstract:
Study based on a stay at the Institut für Vogelforschung. The purpose of the stay was to take part in the field campaign at the common tern (Sterna hirundo) colony in Wilhelmshaven (Germany) between May and August 2005, under the supervision of Prof. Dr. Peter H. Becker and alongside his team. Participation covered the routine data collection at the colony as well as various techniques related to the present project, such as chick marking, direct observation of chicks from hides, and the collection of different biological samples. The main objective was to continue gathering data for the research project on the influence of parental quality and condition on adaptive sex-ratio manipulation and sex allocation. Data collection is based on implanting transponders in chicks, which allow each tern to be identified for life. Combining this information with direct observation of feedings makes the colony an exceptional site for identifying the factors behind any existing trends. The specific objective of the campaign, however, was to investigate individual variability in the immune response of tern chicks in relation to a number of attributes of the chicks themselves (sex, size, growth rate, plasma proteins, haematocrit, parasite load, plasma carotenoids, feather isotopes), of the parents (laying date and clutch size, parental quality), and of the rearing conditions (hatching order, sub-colony density). The results obtained during the campaign support an influence of nutritional condition and parental quality on the immune response of the chicks, probably due to differential reproductive effort.
Abstract:
From an initial double infection in mice, established by simultaneous and equivalent inocula of bloodstream forms of strains Y and F of Trypanosoma cruzi, two lines were derived by subinoculations: one (W) passaged every week, the other (M) every month. Through biological and biochemical methods only the Y strain was identified at the end of the 10th and 16th passages of line W and only the F strain at the 2nd and 4th passages of line M. The results illustrate strain selection through laboratory manipulation of initially mixed populations of T. cruzi.
Abstract:
JPEG 2000 is an image compression standard that uses state-of-the-art techniques based on the wavelet transform. Its main advantages are better compression, the ability to operate on compressed data, and support for both lossy and lossless compression within the same method. BOI is the JPEG 2000 implementation of the Grup de Compressió Interactiva d'Imatges of the Departament d'Enginyeria de la Informació i les Comunicacions, designed for understanding, critiquing, and improving JPEG 2000 technologies. The new version aims to cover the corners of the standard that the previous version did not reach.
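JPEG 2000's ability to compress losslessly with the same method rests on reversible integer wavelet transforms (the standard uses the 5/3 filter). The Python sketch below is not BOI code; it illustrates the same reversibility idea with the simpler integer Haar (S) transform.

```python
def haar_forward(signal):
    """One level of the reversible integer Haar (S) transform.

    Splits a list of even length into lowpass averages and highpass
    differences. Integer arithmetic makes the transform exactly
    invertible, the property lossless wavelet coding relies on.
    """
    low, high = [], []
    for a, b in zip(signal[::2], signal[1::2]):
        d = a - b                # highpass: difference
        s = b + (d >> 1)         # lowpass: integer average (floor)
        low.append(s)
        high.append(d)
    return low, high

def haar_inverse(low, high):
    """Exactly undo haar_forward."""
    out = []
    for s, d in zip(low, high):
        b = s - (d >> 1)
        out.extend([b + d, b])
    return out

pixels = [12, 10, 9, 9, 200, 3, 7, 8]
low, high = haar_forward(pixels)
assert haar_inverse(low, high) == pixels  # perfectly reversible
```

Smooth regions yield small highpass coefficients that code cheaply, while the arithmetic `>> 1` (floor division) guarantees bit-exact reconstruction; applying the transform recursively to the lowpass band gives the multi-level decomposition wavelet coders use.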
Abstract:
Resection of midline skull base lesions involves approaches that require extensive neurovascular manipulation. The transnasal endoscopic approach (TEA) is minimally invasive and ideal for certain selected lesions of the anterior skull base. A thorough knowledge of endonasal endoscopic anatomy is essential to be well versed in its surgical applications, and this is possible only through dedicated cadaveric dissections. The goal of this study was to understand the endoscopic anatomy of the orbital apex, petrous apex and pterygopalatine fossa. Six cadaveric heads (3 injected and 3 non-injected), 12 sides in all, were dissected using a TEA, systematically outlining the steps of surgical dissection and the landmarks encountered. Dissection by the "2 nostril, 4 hands" technique allows better transnasal instrumentation, with two surgeons working in unison. The main surgical landmarks for the orbital apex are the carotid artery protuberance in the lateral sphenoid wall, the optic nerve canal, the lateral opto-carotid recess, the optic strut and the V2 nerve. The orbital apex includes the structures passing through the superior and inferior orbital fissures and the optic nerve canal. The vidian nerve canal and V2 are important landmarks for the petrous apex. Identification of the sphenopalatine artery, V2 and the foramen rotundum is important during dissection of the pterygopalatine fossa. In conclusion, the major potential advantage of the TEA to the skull base is that it provides a direct anatomical route to the lesion without traversing any major neurovascular structures, unlike open transcranial approaches, which involve more neurovascular manipulation and brain retraction. These approaches require close cooperation and collaboration between otorhinolaryngologists and neurosurgeons.
Abstract:
Biological processes can be elucidated by investigating complex networks of relevant factors and genes. However, this is not possible in species for which dominant selectable markers for genetic studies are unavailable. To overcome the limitation in selectable markers for the dermatophyte Arthroderma vanbreuseghemii (anamorph: Trichophyton mentagrophytes), we adapted the flippase (FLP) recombinase-recombination target (FRT) site-specific recombination system from the yeast Saccharomyces cerevisiae as a selectable marker recycling system for this fungus. Taking into account practical applicability, we designed FLP/FRT modules carrying two FRT sequences as well as the flp gene adapted to the pathogenic yeast Candida albicans (caflp) or a synthetic codon-optimized flp (avflp) gene with neomycin resistance (nptII) cassette for one-step marker excision. Both flp genes were under control of the Trichophyton rubrum copper-repressible promoter (PCTR4). Molecular analyses of resultant transformants showed that only the avflp-harbouring module was functional in A. vanbreuseghemii. Applying this system, we successfully produced the Ku80 recessive mutant strain devoid of any selectable markers. This strain was subsequently used as the recipient for sequential multiple disruptions of secreted metalloprotease (fungalysin) (MEP) or serine protease (SUB) genes, producing mutant strains with double MEP or triple SUB gene deletions. These results confirmed the feasibility of this system for broad-scale genetic manipulation of dermatophytes, advancing our understanding of functions and networks of individual genes in these fungi.
Abstract:
This paper reviews the literature on clinical signs such as imitation behavior, grasp reaction, manipulation of tools, utilization behavior, environmental dependency, hyperlexia, hypergraphia and echolalia. Some aspects of this semiology are of special interest because they refer to essential notions such as free-will and autonomy.
Abstract:
Research project carried out during a stay at the Department of Biological Science at the University of Lincoln, Great Britain, between October and December 2006. The aim of the study was to describe the antioxidant stress responses of dogs undergoing elective surgery, under normal clinical practice conditions, during the preoperative and postoperative phases. Sixteen dogs underwent elective orchiectomy or ovariohysterectomy using a standard surgical protocol. During the preoperative and postoperative phases, each animal was confined to the Intensive Care Unit, during which time its antioxidant response was studied. The values obtained at different times were compared with the baseline value, which had been obtained from the same animal in its usual environment. No significant variations caused by perioperative stress were detected. Maximum values were observed during the preoperative phase, just after the animal was confined to the Intensive Care Unit, when the perceived stress was due to the psychological threats of a restricted area and handling by unfamiliar people. The absence of significant variations could be due to the sample storage system and storage time: in humans, alterations in serum antioxidant activity have been described after one month of storage. Further studies are needed to define the post-collection stability of antioxidant activity in canine serum.
Abstract:
Receptors for interleukin 2 (IL-2) exist in at least three forms, which differ in their subunit composition, their affinity for ligand and their ability to mediate a cellular response. Type I receptors occur following cellular activation and consist of the 55,000 m.w. glycoprotein Tac. These receptors bind IL-2 with low affinity, do not internalize ligand and have not been definitively associated with any response. Type II receptors, on the other hand, consist of one or more glycoproteins of 70,000 m.w. that have been termed "beta (β) chains." They bind IL-2 with intermediate affinity and rapidly internalize the ligand. β proteins mediate many IL-2-dependent cellular responses, including the short-term activation of natural killer cells and the induction of Tac protein expression. Type III receptors consist of a ternary complex of the Tac protein, the β chain(s) and IL-2. They are characterized by a particularly high affinity for ligand association. Type III receptors also internalize ligand and mediate IL-2-dependent responses at low factor concentrations. The identification of two independent IL-2-binding molecules, Tac and β, thus provides the elusive molecular explanation for the differences in IL-2 receptor affinity and suggests the potential for selective therapeutic manipulation of IL-2 responses.