916 results for transmission of data and images


Relevance: 100.00%

Publisher:

Abstract:

Data centers are easily found in every sector of the worldwide economy. They are composed of thousands of servers, serving millions of users globally, 24-7. In recent years, e-Science applications such as e-Health or Smart Cities have experienced significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for higher resources in traditional applications, has facilitated the rapid proliferation and growth of data centers. A drawback to this capacity growth has been the rapid increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all the electricity use in the world. In 2012 alone, global data center power demand grew 63% to 38 GW. A further rise of 17% to 43 GW was estimated for 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions.

Relevance: 100.00%

Publisher:

Abstract:

Data centers are easily found in every sector of the worldwide economy. They consist of tens of thousands of servers, serving millions of users globally, 24-7. In recent years, e-Science applications such as e-Health or Smart Cities have experienced significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for higher resources in traditional applications, has facilitated the rapid proliferation and growth of data centers. A drawback to this capacity growth has been the rapid increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all the electricity use in the world. In 2012 alone, global data center power demand grew 63% to 38 GW, and a further rise of 17% to 43 GW was estimated for 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions. This PhD thesis addresses the energy challenge by proposing proactive and reactive thermal- and energy-aware optimization techniques that contribute to placing data centers on a more scalable curve. This work develops energy models and uses knowledge about the energy demand of the workload to be executed and the computational and cooling resources available at the data center to optimize energy consumption. Moreover, data centers are considered as a crucial element within their application framework, optimizing not only the energy consumption of the facility but the global energy consumption of the application.
The main contributors to the energy consumption in a data center are the computing power drawn by IT equipment and the cooling power needed to keep the servers within a temperature range that ensures safe operation. Because of the cubic relation of fan power with fan speed, solutions based on over-provisioning cold air into the server usually lead to energy inefficiencies. On the other hand, higher chip temperatures lead to higher leakage power because of the exponential dependence of leakage on temperature. Moreover, workload characteristics as well as allocation policies also have an important impact on the leakage-cooling tradeoffs. The first key contribution of this work is the development of power and temperature models that accurately describe the leakage-cooling tradeoffs at the server level, and the proposal of strategies to minimize server energy via joint cooling and workload management from a multivariate perspective.
When scaling to the data center level, a similar behavior in terms of leakage-cooling tradeoffs can be observed. As room temperature rises, the efficiency of the data room cooling units improves; however, CPU temperature rises as well, and so does leakage power. Moreover, the thermal dynamics of a data room exhibit unbalanced patterns due to both workload allocation and the heterogeneity of the computing equipment. The second main contribution is the proposal of thermal- and heterogeneity-aware workload management techniques that jointly optimize the allocation of computation and cooling to servers. These strategies need to be backed up by flexible room-level models, able to work at runtime, that describe the system from a high-level perspective. Within the framework of next-generation applications, decisions taken at the application level can have a dramatic impact on the energy consumption of lower abstraction levels, i.e., the data center facility. It is important to consider the relationships between all the computational agents involved in the problem, so that they can cooperate to achieve the common goal of reducing energy in the overall system. The third main contribution is the energy optimization of the overall application by evaluating the energy costs of performing part of the processing in any of the different abstraction layers, from the node to the data center, via workload management and off-loading techniques. In summary, the work presented in this PhD thesis makes contributions to leakage- and cooling-aware server modeling and optimization, data center thermal modeling and heterogeneity-aware data center resource allocation, and develops mechanisms for the energy optimization of next-generation applications from a multi-layer perspective.
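The server-level tradeoff described above, where fan power grows with the cube of fan speed while leakage power grows exponentially with chip temperature, can be illustrated with a toy model. All constants and functional forms below are hypothetical stand-ins, not values from the thesis:

```python
import math

def fan_power(speed_rpm, k_fan=2.0e-11):
    """Fan power grows with the cube of fan speed (hypothetical constant)."""
    return k_fan * speed_rpm ** 3

def chip_temp(speed_rpm, load_w=80.0, t_ambient=25.0, k_cool=5.0e-4):
    """Faster fans remove heat more effectively, lowering chip temperature."""
    return t_ambient + load_w / (1.0 + k_cool * speed_rpm)

def leakage_power(temp_c, p0=10.0, alpha=0.04, t_ref=40.0):
    """Leakage power depends exponentially on chip temperature."""
    return p0 * math.exp(alpha * (temp_c - t_ref))

def total_power(speed_rpm):
    """Cooling-related server power: fan power plus temperature-driven leakage."""
    return fan_power(speed_rpm) + leakage_power(chip_temp(speed_rpm))

# Sweep fan speeds: too slow means hot chips and high leakage,
# too fast means cubic fan cost; the optimum lies in between.
best_speed = min(range(1000, 12001, 250), key=total_power)
```

Sweeping the speed exposes an interior optimum, which is the kind of joint cooling/leakage balance the thesis exploits; a real model would be fitted to measured server data.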

Relevance: 100.00%

Publisher:

Abstract:

Over the last few years, the data center market has increased exponentially, and this tendency continues today. As a direct consequence of this trend, the industry is pushing the development and implementation of new technologies that would improve the energy efficiency of data centers. An adaptive dashboard would allow the user to monitor the most important parameters of a data center in real time. For that reason, monitoring companies work with IoT big-data filtering tools and cloud computing systems to handle the amounts of data obtained from the sensors placed in a data center. Analyzing market trends in this field, we can affirm that the study of predictive algorithms has become an essential area for competitive IT companies. Complex algorithms are used to forecast risk situations based on historical data and warn the user in case of danger. Considering that several different users will interact with this dashboard, from IT experts or maintenance staff to accounting managers, it is vital to personalize it automatically. Following that line of thought, the dashboard should only show metrics relevant to the user, in different formats such as overlapped maps or representative graphs, among others. These maps will show all the information needed in a visual and easy-to-evaluate way. To sum up, this dashboard will allow the user to visualize and control a wide range of variables. Monitoring essential factors such as average temperature, gradients or hotspots, as well as energy and power consumption and savings by rack or building, would allow clients to understand how their equipment is behaving, helping them optimize the energy consumption and efficiency of the racks. It would also help them prevent possible damage to the equipment with predictive high-tech algorithms.

Relevance: 100.00%

Publisher:

Abstract:

For analyzing the mechanism of energy transduction in the “motor” protein myosin, it is opportune both to model the structural change in the hydrolytic transition, ATP (myosin-bound) + H2O → ADP⋅Pi (myosin-bound), and to check the plausibility of the model by appropriate site-directed mutations in the functional system. Here, we made a series of mutations to investigate the role of the salt bridge between Glu-470 and Arg-247 (of chicken smooth muscle myosin) that has been inferred from crystallography to be a central feature of the transition [Fisher, A. J., Smith, C. A., Thoden, J. B., Smith, R., Sutoh, K., Holden, H. M., & Rayment, I. (1995) Biochemistry 34, 8960–8972]. Our results suggest that an intact salt bridge, whether in the normal or the inverted direction, is necessary for ATP hydrolysis, but that when the salt bridge is in the inverted direction it does not support actin activation. Normally, fluorescence changes result from adding nucleotides to myosin; these signals are reported by Trp-512 (of chicken smooth muscle myosin). Our results also suggest that structural impairments in the 470–247 region interfere with the transmission of these signals to the responsive Trp.

Relevance: 100.00%

Publisher:

Abstract:

We dedicate this paper to the memory of Prof. Andres Perez Estaún, who was a great and committed scientist, a wonderful colleague and an even better friend. The datasets in this work have been funded by Fundación Ciudad de la Energía (Spanish Government, www.ciuden.es) and by the European Union through the “European Energy Programme for Recovery” and the Compostilla OXYCFB300 project. Dr. Juan Alcalde is currently funded by NERC grant NE/M007251/1. Simon Campbell and Samuel Cheyney are acknowledged for thoughtful comments on gravity inversion.

Relevance: 100.00%

Publisher:

Abstract:

In two experiments, electric brain waves of 14 subjects were recorded under several different conditions to study the invariance of brain-wave representations of simple patches of colors and simple visual shapes and their names, the words blue, circle, etc. As in our earlier work, the analysis consisted of averaging over trials to create prototypes and test samples, to both of which Fourier transforms were applied, followed by filtering and an inverse transformation to the time domain. A least-squares criterion of fit between prototypes and test samples was used for classification. The most significant results were these. By averaging over different subjects, as well as trials, we created prototypes from brain waves evoked by simple visual images and test samples from brain waves evoked by auditory or visual words naming the visual images. We correctly recognized from 60% to 75% of the test-sample brain waves. The general conclusion is that simple shapes such as circles and single-color displays generate brain waves surprisingly similar to those generated by their verbal names. These results, taken together with extensive psychological studies of auditory and visual memory, strongly support the solution proposed for visual shapes, by Bishop Berkeley and David Hume in the 18th century, to the long-standing problem of how the mind represents simple abstract ideas.
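The analysis pipeline described above (averaging trials into prototypes and test samples, Fourier filtering, and least-squares nearest-prototype classification) can be sketched on synthetic signals. The sine-wave "brain waves", their frequencies, the noise level, and the filter cutoff below are invented stand-ins for the real recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

def lowpass(x, keep=10):
    """Fourier low-pass filter: keep only the lowest `keep` frequency bins."""
    f = np.fft.rfft(x)
    f[keep:] = 0
    return np.fft.irfft(f, n=len(x))

# Synthetic "brain waves": two stimulus classes, each a noisy sinusoid.
t = np.linspace(0, 1, 256, endpoint=False)
def trials(freq, n=20):
    return np.sin(2 * np.pi * freq * t) + 0.8 * rng.normal(size=(n, t.size))

classes = {"circle": 4.0, "blue": 7.0}
# Prototypes: average over trials, then filter.
prototypes = {name: lowpass(trials(f).mean(axis=0)) for name, f in classes.items()}

def classify(sample):
    """Assign a test sample to the prototype with the least-squares best fit."""
    filtered = lowpass(sample)
    return min(prototypes, key=lambda k: np.sum((filtered - prototypes[k]) ** 2))

# Test samples: fresh averaged trials from each class.
correct = sum(classify(trials(f, n=10).mean(axis=0)) == name
              for name, f in classes.items())
```

Averaging suppresses the trial-to-trial noise while the low-pass filter removes what remains outside the signal band, which is why the least-squares match to the nearest prototype succeeds.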

Relevance: 100.00%

Publisher:

Abstract:

The observation of high frequencies of certain inherited disorders in the population of Saguenay–Lac-Saint-Jean can be explained in terms of the variance and the correlation of effective family size (EFS) from one generation to the next. We have shown this effect by using the branching process approach with real demographic data. When variance of EFS is included in the model, despite its profound effect on mutant allele frequency, any mutant introduced in the population never reaches the known carrier frequencies (between 0.035 and 0.05). It is only when the EFS correlation between generations is introduced into the model that we can explain the rise of the mutant alleles. This correlation is described by a c parameter that reflects the dependency of children’s EFS on their parents’ EFS. The c parameter can be considered to reflect social transmission of demographic behavior. We show that such social transmission dramatically reduces the effective population size. This could explain particular distributions in allele frequencies and the unusually high frequency of certain inherited disorders in some human populations.
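The effect of the EFS correlation parameter c can be sketched with a small branching-process simulation. The Poisson offspring draw and the linear form pulling a child's expected EFS toward its parent's realized EFS are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(42)

def final_efs_variance(c, generations=6, pop=5000, mean_efs=2.0):
    """Variance of realized family sizes after several generations when a
    child's expected EFS is pulled toward its parent's realized EFS with
    weight c (c = 0 means no social transmission of demographic behavior)."""
    parent_efs = np.full(pop, mean_efs)  # each individual's parent's EFS
    for _ in range(generations):
        expected = (1.0 - c) * mean_efs + c * parent_efs
        kids = rng.poisson(expected)     # realized family sizes
        # Every child records its parent's realized family size; draw a
        # fixed-size cohort of children to form the next generation.
        child_parent_efs = np.repeat(kids, kids).astype(float)
        parent_efs = rng.choice(child_parent_efs, size=pop)
    return kids.var()

var_no_transmission = final_efs_variance(0.0)    # plain Poisson reproduction
var_with_transmission = final_efs_variance(0.8)  # strong parent-child correlation
```

A positive c inflates the between-family variance of EFS generation after generation; in standard population-genetic terms, higher offspring-number variance means a smaller effective population size, which is the mechanism invoked for the drift-driven rise of mutant alleles.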

Relevance: 100.00%

Publisher:

Abstract:

We optically imaged a visual masking illusion in primary visual cortex (area V-1) of rhesus monkeys to ask whether activity in the early visual system more closely reflects the physical stimulus or the generated percept. Visual illusions can be a powerful way to address this question because they have the benefit of dissociating the stimulus from perception. We used an illusion in which a flickering target (a bar oriented in visual space) is rendered invisible by two counter-phase flickering bars, called masks, which flank and abut the target. The target and masks, when shown separately, each generated correlated activity on the surface of the cortex. During the illusory condition, however, optical signals generated in the cortex by the target disappeared although the image of the masks persisted. The optical image thus was correlated with perception but not with the physical stimulus.

Relevance: 100.00%

Publisher:

Abstract:

A simple mathematical model of bacterial transmission within a hospital was used to study the effects of measures to control nosocomial transmission of bacteria and reduce antimicrobial resistance in nosocomial pathogens. The model predicts that: (i) Use of an antibiotic for which resistance is not yet present in a hospital will be positively associated at the individual level (odds ratio) with carriage of bacteria resistant to other antibiotics, but negatively associated at the population level (prevalence). Thus inferences from individual risk factors can yield misleading conclusions about the effect of antibiotic use on resistance to another antibiotic. (ii) Nonspecific interventions that reduce transmission of all bacteria within a hospital will disproportionately reduce the prevalence of colonization with resistant bacteria. (iii) Changes in the prevalence of resistance after a successful intervention will occur on a time scale of weeks to months, considerably faster than in community-acquired infections. Moreover, resistance can decline rapidly in a hospital even if it does not carry a fitness cost. The predictions of the model are compared with those of other models and published data. The implications for resistance control and study design are discussed, along with the limitations and assumptions of the model.
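Predictions (ii) and (iii) can be reproduced with a minimal compartment model of this general kind. The structure (uncolonized, sensitive-colonized, and resistant-colonized patients with admission/discharge turnover) follows the usual style of such models, but all parameter values below are hypothetical:

```python
import numpy as np

def simulate_resistant(beta, days=365, dt=0.1, mu=0.1, tau=0.05,
                       s0=0.05, r0=1.0 / 3.0):
    """Euler integration of a toy hospital colonization model.
    x: uncolonized, s: colonized with sensitive bacteria, r: resistant.
    mu: patient turnover (admissions arrive uncolonized), tau: antibiotic
    clearance of sensitive carriage only, beta: transmission rate.
    Resistance carries no fitness cost. Returns resistant prevalence per step."""
    s, r = s0, r0
    history = []
    for _ in range(int(days / dt)):
        x = 1.0 - s - r
        s += (beta * x - mu - tau) * s * dt
        r += (beta * x - mu) * r * dt
        history.append(r)
    return np.array(history)

baseline = simulate_resistant(beta=0.15)
# A nonspecific intervention (e.g. hand hygiene) cutting ALL transmission
# by 20% disproportionately lowers resistant prevalence, and the change
# plays out over weeks to months despite the absence of a fitness cost.
intervened = simulate_resistant(beta=0.12)
```

In this sketch the resistant equilibrium is roughly 1 - mu/beta, so a modest cut in transmission near the epidemic threshold produces a much larger relative drop in resistance, with the decline driven by patient turnover rather than fitness cost.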

Relevance: 100.00%

Publisher:

Abstract:

Objective: To determine the risk factors for and timing of vertical transmission of hepatitis C virus in women who are not infected with HIV-1.

Relevance: 100.00%

Publisher:

Abstract:

The UCM Instrumentation Group (GUAIX) is currently developing Data Reduction Pipelines (DRPs) for four instruments of the GTC: EMIR, FRIDA, MEGARA and MIRADAS. The purpose of the DRPs is to provide astronomers with science-quality data, removing instrumental biases, calibrating the images in physical units and providing an estimate of the associated uncertainties.
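The reduction steps listed (removing instrumental signatures, calibrating, and estimating uncertainties) can be sketched for a generic CCD frame. This is an illustrative stand-in, not code from the GUAIX pipelines, and the gain and read-noise figures are invented:

```python
import numpy as np

def reduce_frame(raw, bias, flat, gain=1.5, read_noise=5.0):
    """Minimal CCD reduction sketch: subtract the bias frame, divide by a
    normalized flat field, and propagate a simple shot-plus-read-noise
    uncertainty estimate (gain in e-/ADU, read noise in ADU; hypothetical)."""
    science = (raw - bias) / flat
    # Variance: Poisson shot noise (counts / gain) plus read noise,
    # both scaled by the flat-field correction.
    variance = (np.clip(raw - bias, 0, None) / gain + read_noise ** 2) / flat ** 2
    return science, np.sqrt(variance)

rng = np.random.default_rng(0)
bias = np.full((64, 64), 300.0)
flat = rng.uniform(0.9, 1.1, size=(64, 64))
raw = bias + 1000.0 * flat  # a uniform source seen through the flat pattern
science, sigma = reduce_frame(raw, bias, flat)
```

After reduction the flat-field pattern divides out and the uniform source is recovered, with a per-pixel uncertainty map alongside it.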

Relevance: 100.00%

Publisher:

Abstract:

"From the 1859 gold rush through the early 1900s, popular press images linked Denver’s civic development, capitalist values and culture to the Rocky Mountains. These prints of a wilderness city sending pioneers and prospectors into the Rockies appeared in national newspapers, magazines, settlement manifestos, railroad guidebooks and tourist pamphlets. Readers were saturated with illustrations associating Denver with prosperity and rejuvenated health."

Relevance: 100.00%

Publisher:

Abstract:

The EPA promulgated the Exceptional Events Rule, codifying guidance regarding the exclusion of monitoring data from compliance decisions due to uncontrollable natural or exceptional events. This capstone examines documentation systems utilized by agencies requesting that data be excluded from compliance decisions due to exceptional events. A screening tool is developed to determine whether an event would meet exceptional-event criteria. New data sources are available to enhance analysis, but evaluation shows many are unusable in their current form. The EPA and the states must collaborate to develop consistent evaluation methodologies for documenting exceptional events, to improve the efficiency and effectiveness of the new rule. To utilize the newer, more sophisticated data, consistent, user-friendly translation systems must be developed.

Relevance: 100.00%

Publisher:

Abstract:

Exchange traffic was a practice that suited the modus vivendi of caravanners and traders in the Islamic world. The inclusion of the Iberian Peninsula in the most important commercial routes of the Muslim world, such as the silk route through North Africa, provides a very solid reason to deepen the study of the surviving sources on the role that certain credit instruments, which represented an alternative to coinage, might have played in trade centres.