903 results for Many-core
Abstract:
The reference to tradition as a force for unity, encompassing both continuity and change in expression regardless of the period or the techniques used, has always been a very important component of the artistic manifestations of Japan, and it is the subtle link that connects them from the past to the present. Tradition is understood here not simply as preservation, but as a transmission with a dual aspect, since it allows constant evolution without altering its basic essence. For this reason, the Japanese culture of the Edo era (1600-1868), with its high degree of innovation, wealth and sophistication, and also as the historical epilogue preceding the Meiji Restoration of 1868, has been the key reference for this look at Japanese tradition in the development of this thesis. From the second half of the 19th century onwards, all these genuine features of the Edo period were put on hold and, since then, Japan has followed the path of modernization (which in many respects has also been that of Westernization). The interweaving of other dualities brought about in Japan by the nuclear attacks of 1945 led to an inevitable fusion of the physical with the metaphysical world, of that terrible presence with a tremendous sense of absence and, ultimately, of East with West. The resulting impact on the nation's culture had a particular repercussion on architecture. In this regard, French thought has played a key role in the radical rethinking of many core assumptions, concepts and values of Western culture, including those inherited from the Enlightenment, in a contemporary world that is increasingly complex and plural. These and other facts give traditional Japanese culture a multidimensional quality (as opposed to the marked two-dimensionality that usually characterizes Western cultures) which, as this doctoral thesis aims to show, may prove to be a «synthesis of contradictions» in the work of the Japanese architects Tadao Ando (Osaka, 1941-) and Toyo Ito (Keijo, present-day Seoul, 1941-). In Eastern thought a duality is understood as the complementarity between two poles, only apparently opposed, which integrates two aspects of a single concept. As with the Buddhist notion of «couples», conceived in that doctrine as a unity between two inevitably interrelated extremes, the main objective is to show how the work of both architects attempts to resolve the same conflicts starting from polarized points of view.
Abstract:
We describe Janus, a massively parallel FPGA-based computer optimized for the simulation of spin glasses, theoretical models for the behavior of glassy materials. FPGAs (as compared to GPUs or many-core processors) provide a complementary approach to massively parallel computing. In particular, our model problem is formulated in terms of binary variables, and floating-point operations can be (almost) completely avoided. The FPGA architecture allows us to run many independent threads with almost no latencies in memory access, thus updating up to 1024 spins per cycle. We describe Janus in detail and we summarize the physics results obtained in four years of operation of this machine; we discuss two types of physics applications: long simulations on very large systems (which try to mimic, and provide understanding of, the experimental non-equilibrium dynamics), and low-temperature equilibrium simulations using an artificial parallel tempering dynamics. The time scale of our non-equilibrium simulations spans eleven orders of magnitude (from picoseconds to a tenth of a second). On the other hand, our equilibrium simulations are unprecedented both for the low temperatures reached and for the large systems that we have brought to equilibrium. A finite-time scaling ansatz emerges from the detailed comparison of the two sets of simulations. Janus has made it possible to perform spin glass simulations that would take several decades on more conventional architectures. The paper ends with an assessment of the potential of possible future versions of the Janus architecture, based on state-of-the-art technology.
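As the abstract notes, the spin-glass problem is formulated entirely in binary variables, so the energetics reduce to counting satisfied bonds and the Boltzmann factors can be read from a small precomputed table. The sketch below illustrates that bit-level formulation for a 2D ±J Edwards-Anderson model in plain Python; it is only an illustration of the idea, not the Janus firmware, and the lattice size and temperature are arbitrary choices.

```python
import math
import random

# Illustrative Metropolis sweep for a 2D +/-J Edwards-Anderson spin glass.
# Spins and couplings are single bits (0 -> -1, 1 -> +1); a bond (i,j) is
# "satisfied" when spin_i XOR spin_j XOR J_ij == 1. Flipping a spin with k
# satisfied bonds (out of 4) changes the energy by dE = 2*(2*k - 4), so the
# Boltzmann factors can be precomputed once and no exponential (and almost
# no floating point) is needed inside the sweep.

L = 32          # linear lattice size (arbitrary for the example)
BETA = 1.0      # inverse temperature (arbitrary for the example)

spin = [[random.getrandbits(1) for _ in range(L)] for _ in range(L)]
Jx = [[random.getrandbits(1) for _ in range(L)] for _ in range(L)]  # bond (i,j)-(i,j+1)
Jy = [[random.getrandbits(1) for _ in range(L)] for _ in range(L)]  # bond (i,j)-(i+1,j)

# Acceptance probability indexed by the number of satisfied bonds k = 0..4.
accept = [min(1.0, math.exp(-BETA * 2 * (2 * k - 4))) for k in range(5)]

def sweep():
    for i in range(L):
        for j in range(L):
            s = spin[i][j]
            k = ((s ^ spin[i][(j + 1) % L] ^ Jx[i][j]) +
                 (s ^ spin[i][(j - 1) % L] ^ Jx[i][(j - 1) % L]) +
                 (s ^ spin[(i + 1) % L][j] ^ Jy[i][j]) +
                 (s ^ spin[(i - 1) % L][j] ^ Jy[(i - 1) % L][j]))
            if random.random() < accept[k]:
                spin[i][j] ^= 1          # flip the spin with a single XOR

for _ in range(10):
    sweep()
```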
Abstract:
An abstract of a thesis devoted to using helix-coil models to study unfolded states.
Research on polypeptide unfolded states has received much more attention in the last decade or so than it has in the past. Unfolded states are thought to be implicated in various
misfolding diseases and likely play crucial roles in protein folding equilibria and folding rates. Structural characterization of unfolded states has proven to be
much more difficult than the now well established practice of determining the structures of folded proteins. This is largely because many core assumptions underlying
folded structure determination methods are invalid for unfolded states. This has led to a dearth of knowledge concerning the nature of unfolded state conformational
distributions. While many aspects of unfolded state structure are not well known, there does exist a significant body of work stretching back half a century that
has been focused on structural characterization of marginally stable polypeptide systems. This body of work represents an extensive collection of experimental
data and biophysical models associated with describing helix-coil equilibria in polypeptide systems. Much of the work on unfolded states in the last decade has not been devoted
specifically to the improvement of our understanding of helix-coil equilibria, which arguably is the most well characterized of the various conformational equilibria
that likely contribute to unfolded state conformational distributions. This thesis seeks to provide a deeper investigation of helix-coil equilibria using modern
statistical data analysis and biophysical modeling techniques. The studies contained within seek to provide deeper insights and new perspectives on what we presumably
know very well about protein unfolded states.
Chapter 1 gives an overview of recent and historical work on studying protein unfolded states. The study of helix-coil equilibria is placed in the context
of the general field of unfolded state research and the basics of helix-coil models are introduced.
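For orientation, the simplest of the helix-coil models mentioned here, the classic Zimm-Bragg model, can be written down in a few lines. The sketch below computes the mean fractional helicity of a homopolypeptide with a 2x2 transfer matrix; the propagation weight s and nucleation parameter sigma are illustrative values, and this is the generic textbook formulation rather than the model developed in this thesis.

```python
import numpy as np

# Zimm-Bragg helix-coil model: each residue is helix (h) or coil (c); a
# helical residue preceded by a helical one has weight s, a helical residue
# preceded by coil has weight sigma*s (nucleation penalty), and coil residues
# have weight 1. The 2x2 transfer matrix below implements exactly these rules.

def partition_function(n, s, sigma):
    # M[prev, curr] with index 0 = helix, 1 = coil.
    M = np.array([[s,         1.0],
                  [sigma * s, 1.0]])
    start = np.array([0.0, 1.0])          # fictitious residue 0 is coil
    end = np.array([1.0, 1.0])
    return start @ np.linalg.matrix_power(M, n) @ end

def mean_helicity(n, s, sigma, ds=1e-6):
    # Fractional helicity = (1/n) * d ln Z / d ln s (numerical derivative).
    lo, hi = partition_function(n, s, sigma), partition_function(n, s + ds, sigma)
    return (np.log(hi) - np.log(lo)) / (np.log(s + ds) - np.log(s)) / n

print(mean_helicity(20, s=1.3, sigma=0.003))   # partially helical 20-mer
```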
Chapter 2 introduces the newest incarnation of a sophisticated helix-coil model. State-of-the-art statistical techniques are employed to estimate the energies
of various physical interactions that serve to influence helix-coil equilibria. A new Bayesian model selection approach is utilized to test many long-standing
hypotheses concerning the physical nature of the helix-coil transition. Some assumptions made in previous models are shown to be invalid and the new model
exhibits greatly improved predictive performance relative to its predecessor.
Chapter 3 introduces a new statistical model that can be used to interpret amide exchange measurements. As amide exchange can serve as a probe for residue-specific
properties of helix-coil ensembles, the new model provides a novel and robust method to use these types of measurements to characterize helix-coil ensembles experimentally
and test the position-specific predictions of helix-coil models. The statistical model is shown to perform markedly better than the most commonly used
method for interpreting amide exchange data. The estimates of the model obtained from amide exchange measurements on an example helical peptide
also show a remarkable consistency with the predictions of the helix-coil model.
Chapter 4 involves a study of helix-coil ensembles through the enumeration of helix-coil configurations. Aside from providing new insights into helix-coil ensembles,
this chapter also introduces a new method by which helix-coil models can be extended to calculate new types of observables. Future work on this approach could potentially
allow helix-coil models to be used in domains that were previously inaccessible and reserved for the other types of unfolded state models introduced in Chapter 1.
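The enumeration idea of Chapter 4 can be illustrated in miniature: for a short chain one can simply list every helix/coil configuration, weight it with Zimm-Bragg-style statistical weights, and average observables that a transfer matrix does not hand over directly. The sketch below computes the ensemble-averaged length of the longest helical stretch; the weights and chain length are illustrative assumptions, not the method or model of the thesis itself.

```python
from itertools import product

# Brute-force enumeration of helix/coil configurations for a short chain,
# weighted with Zimm-Bragg statistical weights (illustrative s and sigma).

def weight(config, s=1.3, sigma=0.003):
    w, prev = 1.0, 'c'
    for state in config:
        if state == 'h':
            w *= s if prev == 'h' else sigma * s   # nucleation vs propagation
        prev = state
    return w

def longest_helix(config):
    best = run = 0
    for state in config:
        run = run + 1 if state == 'h' else 0
        best = max(best, run)
    return best

N = 12
Z = avg = 0.0
for config in product('hc', repeat=N):     # all 2**N configurations
    w = weight(config)
    Z += w
    avg += w * longest_helix(config)
print(avg / Z)   # ensemble-averaged longest helical segment
```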
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip based processors the network might get congested and the cores might work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values, which necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, in modern many-core systems cores have support for dynamic voltage and frequency scaling. Thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose, software-based auto-calibration approach is also proposed to calibrate thermal sensors across a range of voltage levels.
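The dynamic load-balancing idea described above can be sketched independently of the Single-Chip Cloud Computer: rather than statically splitting the fault list across cores, idle workers pull the next chunk of faults from a shared queue, so faster or less-congested cores naturally end up simulating more faults. The example below uses Python's multiprocessing as a stand-in for the SCC's message-passing environment, and simulate_fault() is a hypothetical placeholder for the actual fault simulation.

```python
import multiprocessing as mp

def simulate_fault(fault_id):
    # Hypothetical stand-in for injecting one fault and simulating the test set.
    return fault_id % 2 == 0          # pretend: even fault ids are detected

def worker(task_queue, result_queue):
    while True:
        chunk = task_queue.get()
        if chunk is None:             # sentinel: no more work
            break
        result_queue.put([f for f in chunk if simulate_fault(f)])

if __name__ == "__main__":
    n_workers, chunk_size = 8, 64
    faults = list(range(10_000))
    tasks, results = mp.Queue(), mp.Queue()

    procs = [mp.Process(target=worker, args=(tasks, results)) for _ in range(n_workers)]
    for p in procs:
        p.start()
    for i in range(0, len(faults), chunk_size):
        tasks.put(faults[i:i + chunk_size])   # small chunks -> dynamic balance
    for _ in procs:
        tasks.put(None)

    detected = []
    for _ in range(-(-len(faults) // chunk_size)):   # one result per chunk
        detected.extend(results.get())
    for p in procs:
        p.join()
    print(f"fault coverage: {len(detected) / len(faults):.1%}")
```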
Abstract:
Scientific applications rely heavily on floating-point data types. Floating-point operations are complex and require complicated hardware that is both area and power intensive. The emergence of massively parallel architectures like Rigel creates new challenges and poses new questions with respect to floating-point support. The massively parallel nature of Rigel places great emphasis on area-efficient, low-power designs. At the same time, Rigel is a general-purpose accelerator and must provide high performance for a wide class of applications. This thesis presents an analysis of various floating-point unit (FPU) components with respect to Rigel and proposes a candidate FPU design that balances performance, area, and power and is suitable for massively parallel architectures like Rigel.
Abstract:
The involvement of dopamine (DA) mechanisms in the nucleus accumbens (NAC) in fear conditioning has been proposed by many studies that have challenged the view that the NAC is solely involved in the modulation of appetitive processes. However, the role of the core and shell subregions of the NAC in aversive conditioning remains unclear. The present study examined DA release in these NAC subregions using microdialysis during the expression of fear memory. Guide cannulae were implanted in rats in the NAC core and shell. Five days later, the animals received 10 footshocks (0.6 mA, 1 s duration) in a distinctive cage A (same context). On the next day, dialysis probes were inserted through the guide cannulae into the NAC core and shell subregions, and the animals were behaviorally tested for fear behavior either in the same context (cage A) or in a novel context (cage B). Dialysates were collected every 5 min for 90 min and analyzed by high-performance liquid chromatography. The rats exhibited a significant fear response in cage A but not in cage B. Moreover, increased DA levels in both NAC subregions were observed 5-25 min after the beginning of the test when the animals were tested in the same context compared with accumbal DA levels from rats tested in the different context. These findings suggest that DA mechanisms in both the NAC core and shell may play an important role in the expression of contextual fear memory. (c) 2008 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Smoothing the potential energy surface for structure optimization is a general and commonly applied strategy. We propose a combination of soft-core potential energy functions and a variation of the diffusion equation method to smooth potential energy surfaces, which is applicable to complex systems such as protein structures. The performance of the method was demonstrated by comparison with simulated annealing, using the refinement of the undecapeptide Cyclosporin A as a test case. Simulations were repeated many times using different initial conditions and structures, since the methods are heuristic and the results are only meaningful in a statistical sense.
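As a concrete illustration of the soft-core idea, a common way to soften a pairwise term is to cap its short-range singularity so that the surface stays finite (and easier to smooth or deform) everywhere. The sketch below shows one such soft-core variant of a Lennard-Jones 12-6 potential; this is a generic example with illustrative parameters, not necessarily the exact functional form used in the work summarized above.

```python
import numpy as np

def lj_softcore(r, epsilon=1.0, sigma=1.0, alpha=0.5):
    """Soft-core 12-6 potential; alpha = 0 recovers the ordinary LJ form."""
    shifted = r**6 + alpha * sigma**6        # removes the r -> 0 divergence
    return 4.0 * epsilon * (sigma**12 / shifted**2 - sigma**6 / shifted)

r = np.linspace(0.0, 3.0, 7)
print(lj_softcore(r))                                     # finite even at r = 0
print(lj_softcore(np.linspace(0.8, 3.0, 7), alpha=0.0))   # ordinary LJ for comparison
```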
Abstract:
A conserved helical peptide vaccine candidate from the M protein of group A streptococci, p145, has been described. Minimal epitopes within p145 have been defined and an epitope recognized by protective antibodies, but not by autoreactive T cells, has been identified. When administered to mice, p145 has low immunogenicity: many boosts of peptide are required to achieve a high antibody titre (> 12 800). To attempt to overcome this low immunogenicity, lipid-core peptide technology was employed. Lipid-core peptides (LCP) consist of an oligomeric polylysine core, with multiple copies of the peptide of choice, conjugated to a series of lipoamino acids, which acts as an anchor for the antigen. Seven different LCP constructs based on the p145 peptide sequence were synthesized (LCP1-LCP7) and the immunogenicity of the compounds examined. The most immunogenic constructs contained the longest alkyl side-chains. The number of lipoamino acids in the constructs affected immunogenicity, and spacing between the alkyl side-chains increased immunogenicity. An increase in immunogenicity (enzyme-linked immunosorbent assay (ELISA) titres) of up to 100-fold was demonstrated using this technology, and some constructs without adjuvant were more immunogenic than p145 administered with complete Freund's adjuvant (CFA). The fine specificity of the induced antibody response differed between constructs, but one construct, LCP4, induced antibodies of identical fine specificity to those found in endemic human serum. Opsonic activity of LCP4 antisera was more than double that of p145 antisera. These data show the potential of LCP technology both to enhance the immunogenicity of complex peptides and to focus the immune response towards or away from critical epitopes.
Abstract:
Work presented as part of the Master's degree in Informatics Engineering (Mestrado em Engenharia Informática), as a partial requirement for obtaining the degree of Master in Informatics Engineering.
Abstract:
Resource management in multi-core processors has gained importance with the evolution of applications and architectures, but this management is very complex. For example, the same parallel application executed several times with the same input data on a single multi-core node can show highly variable execution times. Many hardware and software factors affect performance. The way hardware resources (compute and memory) are assigned to the processes or threads, possibly belonging to several competing applications, is fundamental in determining this performance. The gap between assigning resources without knowing the application's real needs and assigning them with a specific goal keeps growing. The best way to perform this assignment is automatically, with minimal programmer intervention. It is important to note that the way an application runs on an architecture is not necessarily the most suitable one, and this situation can be improved through proper management of the available resources. Appropriate resource management can benefit both the application developer and the computing environment in which the application runs, allowing a larger number of applications to run with the same amount of resources. Moreover, this resource management would not require changes to the application or to its operating strategy. In order to propose resource management policies, the behavior of compute-intensive and memory-intensive applications was analyzed. This analysis was carried out by studying placement parameters among the cores, the need to use shared memory, the size of the input workload, the distribution of data within the processor, and the granularity of work. Our goal is to identify how these parameters influence execution efficiency, identify bottlenecks, and propose possible improvements. Another proposal is to adapt the strategies already used by the scheduler in order to obtain better results.
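One of the placement parameters studied above, where processes run relative to the cores (and hence to shared caches and memory controllers), can be controlled explicitly from user space. The sketch below pins worker processes to specific cores using the Linux-only os.sched_setaffinity call; the core numbers and the synthetic workload are assumptions for illustration, and this is not the scheduler policy proposed in the thesis.

```python
import os
import multiprocessing as mp

def pinned_worker(core_id, n_iter=5_000_000):
    # Restrict this process to a single core; a placement study would compare
    # configurations that share or avoid sharing a socket and its cache.
    os.sched_setaffinity(0, {core_id})
    acc = 0
    for i in range(n_iter):                  # stand-in for compute-bound work
        acc += i * i
    return acc

if __name__ == "__main__":
    cores = [0, 1, 2, 3]                     # assumed core ids on one socket
    with mp.Pool(processes=len(cores)) as pool:
        pool.map(pinned_worker, cores)
```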
Abstract:
This second annual report of the Director of Public Health highlights the many public health challenges that affect people in Northern Ireland. It demonstrates how the public health team tackles this complex agenda by working with many statutory, community and voluntary partner organisations across health, local government, education, housing and other sectors. It shows a wealth of innovative work to address the main public health challenges facing communities: health inequality, preventing and protecting against ill-health, detecting illness early, and providing high quality services. Integral to the report are core tables for 2009, which provide key statistical data on population, birth and death rates, mortality by cause, life expectancy, immunisation and screening.
Abstract:
Too many children and young people are living in circumstances that make it difficult for them to thrive. That is the key message from the third Annual Report of the Director of Public Health (DPH) for Northern Ireland, which was published on 14th June 2012. This significant report highlights the many public health challenges that affect people in Northern Ireland. The report by Dr Carolyn Harper, Director of Public Health, describes the main public health challenges across Northern Ireland and details work being undertaken by the Public Health Agency (PHA) and its partners over the past year to improve the health and wellbeing of people here. A Core Tables report for 2010, available below, was produced by the PHA in support of the Director of Public Health's Annual Report for 2011-2012; it includes information such as estimated home population figures and projections, births information, fertility rates, death rates, information on mortality, life expectancy, immunisation rates and screening uptake rates.
Abstract:
Improving the health and wellbeing of the elderly is the theme of the fourth Director of Public Health annual report, launched on 12 June 2013. Northern Ireland's elderly population is growing and older people are living longer than ever before, which emphasises the importance of providing health and social care that allows them to live a productive life. This report highlights the many areas of public health work aimed at giving elderly people in Northern Ireland the best opportunity to live active and healthy lives in a safe and secure environment. An in-depth overview also provides statistics on many aspects of life as an elderly person here: life expectancy, mortality, mental wellbeing, lifestyle, social determinants of health, etc. Further, more detailed data is included in an accompanying report available as a separate document. The core tables for 2011, also available to download below, include information such as estimated home population figures and projections, birth rates, fertility rates, death rates, information on mortality, life expectancy, immunisation rates and screening uptake rates. The presentation slides from key speakers at the launch event on 12 June 2013 and all parallel sessions are also appended below. Please note: the PHA cannot be held responsible for any breach of copyright that may exist within individual presentations. Anyone wishing to get a copy of the presentation by Ron McDowell in the 'Identifying those at risk' category should contact him directly at mcdowell-R3@email.ulster.ac.uk.