750 results for Open clusters and associations: individual (Berkeley 55)
Abstract:
In this thesis we have studied the properties of high-redshift galaxy clusters through the X-ray emission from their intracluster gas. In particular, we have focused on the relation between concentration and mass, which is related to the density of the universe at the formation time of the clusters and is therefore a powerful cosmological probe. Concentration is expected to be a decreasing function of mass, but a complete characterization of this relation has not yet been achieved. We have analysed 22 high-redshift clusters observed with the Chandra satellite and investigated their concentration-mass relation.
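The concentration-mass relation is commonly parameterized as a power law in mass and redshift. The sketch below uses placeholder coefficients of a plausible order; the normalization, pivot mass, and exponents are assumptions for illustration, not the values constrained in this thesis:

```python
# Illustrative power-law concentration-mass relation:
#   c(M, z) = c0 * (M / M_pivot)**B * (1 + z)**C
# The coefficients below are placeholders, not fitted values from this work.
def concentration(mass_msun, z, c0=5.7, pivot=2e12, B=-0.084, C=-0.47):
    """Predicted halo concentration for a mass (in solar masses) and redshift z."""
    return c0 * (mass_msun / pivot) ** B * (1.0 + z) ** C

# Concentration decreases with mass (B < 0) and with redshift (C < 0):
low_mass = concentration(1e14, 0.8)
high_mass = concentration(1e15, 0.8)
```

The negative mass exponent encodes the expectation stated above that concentration is a decreasing function of mass.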
Abstract:
This work focuses on axions and axion-like particles (ALPs) and their possible relation to the 3.55 keV photon line detected in recent years from galaxy clusters and other astrophysical objects. We focus on axions that arise from string compactifications and study the vacuum structure of the resulting low-energy 4D N=1 supergravity effective field theory. We then provide a model that might explain the 3.55 keV line through the following process: a 7.1 keV dark matter axion decays into two light axions, which in turn are converted into photons via the Primakoff effect and a kinetic mixing between two U(1) gauge symmetries belonging to the hidden and visible sectors, respectively. We present two models: the first gives an outcome inconsistent with experimental data, while the second can yield the desired result.
Abstract:
The Gaussian-2, Gaussian-3, Complete Basis Set-QB3, and Complete Basis Set-APNO methods have been used to calculate geometries of neutral clusters of water, (H2O)n, where n = 2–6. The structures are in excellent agreement with those determined from experiment and those predicted from previous high-level calculations. These methods also provide excellent thermochemical predictions for water clusters, and thus can be used with confidence in evaluating the structures and thermochemistry of water clusters.
Abstract:
A mixed molecular dynamics/quantum mechanics model has been applied to the ammonium/water clustering system. The use of the high level MP2 calculation method and correlated basis sets, such as aug-cc-pVDZ and aug-cc-pVTZ, lends confidence in the accuracy of the extrapolated energies. These calculations provide electronic and free energies for the formation of clusters of ammonium and 1−10 water molecules at two different temperatures. Structures and thermodynamic values are in good agreement with previous experimental and theoretical results. The estimated concentration of these clusters in the troposphere was calculated using atmospheric amounts of ammonium and water. Results show the favorability of forming these clusters and implications for ion-induced nucleation in the atmosphere.
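Atmospheric cluster concentrations of the kind estimated above follow from the law of mass action applied to each hydration step. The sketch below illustrates one step with a placeholder free energy; the dG value and water partial pressure are assumptions for illustration, not the MP2 results of this work:

```python
import math

# Law-of-mass-action ratio for one hydration step:
#   NH4+.(H2O)_{n-1} + H2O <-> NH4+.(H2O)_n,  K = exp(-dG / (R T))
# The dG and partial-pressure values below are placeholders, not computed results.
R_KCAL = 1.987e-3  # gas constant, kcal mol^-1 K^-1

def step_ratio(dG_kcal, T, p_h2o_atm, p_ref_atm=1.0):
    """Ratio [cluster_n] / [cluster_{n-1}] for a single hydration step."""
    K = math.exp(-dG_kcal / (R_KCAL * T))
    return K * (p_h2o_atm / p_ref_atm)

# A favorable (negative) step free energy and ~2% water vapor:
ratio = step_ratio(dG_kcal=-5.0, T=298.15, p_h2o_atm=0.02)
```

A ratio well above one indicates that the hydrated cluster dominates the bare ion at equilibrium, which is the sense in which cluster formation is "favorable" in the abstract.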
Abstract:
To provide insight into the recently published cost comparisons in the context of open, laparoscopic, and robotic-assisted laparoscopic radical cystectomy and to demonstrate the complexity of such economic analyses.
Abstract:
BACKGROUND: Clustering ventricular arrhythmias are the consequence of acute ventricular electrical instability and represent a challenge in the management of the growing number of patients with an implantable cardioverter-defibrillator (ICD). Triggering factors can rarely be identified. OBJECTIVES: Several studies have revealed seasonal variations in the frequency of cardiovascular events and life-threatening arrhythmias, and we sought to establish whether seasonal factors may exacerbate ventricular electrical instability leading to arrhythmia clusters and electrical storm. METHODS: Two hundred and fourteen consecutive defibrillator recipients were followed up for 3.3 ± 2.2 years. An arrhythmia cluster was defined as the occurrence of three or more arrhythmic events triggering appropriate defibrillator therapies within 2 weeks. Time intervals between two clusters were calculated for each month and each season, and were compared using the Kruskal-Wallis test and the Wilcoxon-Mann-Whitney test with Bonferroni adjustment. RESULTS: During a follow-up of 698 patient-years, 98 arrhythmia clusters were observed in 51 patients. Clustering ventricular arrhythmias were associated with temporal variables; they occurred more frequently in the winter and spring months than during the summer and fall. Accordingly, the time intervals between two clusters were significantly shorter during winter and spring (median and 95% CI): winter 16 (5-19), spring 11.5 (7-25), summer 34.5 (15-55), fall 50.5 (19-65), P = 0.0041. CONCLUSION: There are important seasonal variations in the incidence of arrhythmia clusters in ICD recipients. Whether these variations are related to environmental factors, changes in physical activity, or psychological factors requires further study.
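The seasonal comparison in the methods can be sketched with a hand-rolled Kruskal-Wallis H statistic on synthetic inter-cluster intervals; the numbers below are invented for illustration and are not the study's data:

```python
# Kruskal-Wallis H statistic (no tied values), applied to synthetic
# inter-cluster intervals in days, grouped by season.
def kruskal_h(*groups):
    """H = 12 / (N (N+1)) * sum_i R_i^2 / n_i - 3 (N+1), assuming no ties."""
    pooled = sorted(x for g in groups for x in g)
    rank = {x: i + 1 for i, x in enumerate(pooled)}  # ranks 1..N
    n_total = len(pooled)
    s = sum(sum(rank[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n_total * (n_total + 1)) * s - 3.0 * (n_total + 1)

winter = [16, 5, 19, 12, 9]    # shorter intervals: clusters more frequent
spring = [11, 7, 25, 14, 10]
summer = [34, 15, 55, 40, 28]  # longer intervals: clusters rarer
fall = [50, 20, 65, 45, 38]
H = kruskal_h(winter, spring, summer, fall)
```

The resulting H is referred to a chi-squared distribution with k - 1 degrees of freedom (here 3, with a 5% critical value of about 7.81); post-hoc pairwise Wilcoxon-Mann-Whitney tests with Bonferroni adjustment would then localize the seasonal differences.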
Abstract:
From Bush’s September 20, 2001 “War on Terror” speech to Congress to President-Elect Barack Obama’s acceptance speech on November 4, 2008, the U.S. Army produced visual recruitment material that addressed the concerns of falling enlistment numbers—due to the prolonged and difficult war in Iraq—with quickly evolving and compelling rhetorical appeals: from the introduction of an “Army of One” (2001) to “Army Strong” (2006); from messages focused on education and individual identity to high-energy adventure and simulated combat scenarios, distributed through everything from printed posters and music videos to first-person tactical-shooter video games. These highly polished, professional visual appeals, introduced to the American public during a time of an unpopular war fought by volunteers, provide rich subject matter for research and analysis. This dissertation takes a multidisciplinary approach to the visual media utilized as part of the Army’s recruitment efforts during the War on Terror, focusing on American myths—as defined by Barthes—and how these myths are both revealed and reinforced through design across media platforms. Placing each selection in its historical context, this dissertation analyzes how printed materials changed as the War on Terror continued. It examines the television ad that introduced “Army Strong” to the American public, considering how the combination of moving image, text, and music structures the message and the way we receive it. This dissertation also analyzes the video game America’s Army, focusing on how the interaction of the human player and the computer-generated player combines to enhance the persuasive qualities of the recruitment message. Each chapter discusses how the design of the particular medium facilitates engagement/interactivity of the viewer.
The conclusion considers what recruitment material produced during this time period suggests about the persuasive strategies of different media and how they create distinct relationships with their spectators. It also addresses how theoretical frameworks and critical concepts used by a variety of disciplines can be combined to analyze recruitment media utilizing a Selber-inspired three-literacy framework (functional, critical, rhetorical), and how this framework can contribute to the multimodal classroom by allowing instructors and students to do a comparative analysis of multiple forms of visual media with similar content.
Abstract:
Large parts of the world are subject to one or more natural hazards, such as earthquakes, tsunamis, landslides, tropical storms (hurricanes, cyclones, and typhoons), coastal inundation, and flooding. Virtually the entire world is at risk of man-made hazards. In recent decades, rapid population growth and economic development in hazard-prone areas have greatly increased the potential of multiple hazards to damage and destroy buildings, bridges, power plants, and other infrastructure, posing a grave danger to communities and disrupting economic and societal activities. Although an individual hazard is significant in many parts of the United States (U.S.), in certain areas more than one hazard may pose a threat to the constructed environment. In such areas, structural design and construction practices should address multiple hazards in an integrated manner to achieve structural performance that is consistent with owner expectations and general societal objectives. The growing interest in and importance of multiple-hazard engineering has been recognized recently. This has spurred the evolution of multiple-hazard risk-assessment frameworks and the development of design approaches, paving the way for future research toward sustainable construction of new and improved structures and retrofitting of existing structures. This report provides a review of the literature and the current state of practice for assessment, design, and mitigation of the impact of multiple hazards on structural infrastructure. It also presents an overview of future research needs related to the multiple-hazard performance of constructed facilities.
Abstract:
The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption has posed a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of both reducing the energy consumption of server systems and reducing the cost for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30% of the total power consumption. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy-proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings. Meanwhile, corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC's cost management, helping OSPs conserve and manage their electricity costs and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively.
With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The VM/PM mapping probability matrix takes into account resource limitations, VM operation overheads, and server reliability as well as energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs. We also identify several potential areas for future research in each chapter.
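The interplay of DVFS and VOVF described above can be sketched with a toy cluster power model; all constants are illustrative placeholders, not the dissertation's measurements or proposed models:

```python
import math

# Toy web-cluster power model combining VOVF (vary the number of active
# servers) with DVFS (dynamic power grows roughly with the cube of frequency).
# All constants are illustrative, not measured values.
P_IDLE = 100.0           # static power per active server, watts
P_DYN_MAX = 200.0        # dynamic power per server at full frequency, watts
CAP_PER_SERVER = 1000.0  # requests/s one server sustains at full frequency

def cluster_power(load_rps, freq_frac):
    """Power draw when just enough servers run at the given frequency fraction."""
    per_server = CAP_PER_SERVER * freq_frac
    n_active = math.ceil(load_rps / per_server)  # VOVF: power off the rest
    return n_active * (P_IDLE + P_DYN_MAX * freq_frac ** 3)

full_speed = cluster_power(2500.0, 1.0)  # fewer servers at full frequency
scaled = cluster_power(2500.0, 0.7)      # more servers at reduced frequency
```

Because dynamic power falls roughly with the cube of frequency while capacity falls only linearly, running more servers at a lower frequency can draw less total power than fewer servers at full speed, which is the trade-off the combined DVFS/VOVF strategies exploit.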
Abstract:
The single electron transistor (SET) is a Coulomb blockade device whose operation is based on the controlled manipulation of individual electrons. Single electron transistors show immense potential for use in future ultra-low-power devices, high-density memory, and high-precision electrometry. Most SET devices operate at cryogenic temperatures, because their charging energy is much smaller than the thermal energy at room temperature. Room temperature operation of these devices is possible with sub-10 nm nano-islands due to the inverse dependence of charging energy on the radius of the conducting nano-island. The fabrication of sub-10 nm features with existing lithographic techniques is a technological challenge. Here we present results for the first room-temperature-operating SET device fabricated using Focused Ion Beam deposition technology. The SET device incorporates an array of tungsten nano-islands with an average diameter of 8 nm. The SET device shows clear Coulomb blockade for different gate voltages at room temperature. The charging energy of the device was calculated to be 160.0 meV, the capacitance per junction was found to be 0.94 aF, and the tunnel resistance per junction was calculated to be 1.26 GΩ. The tunnel resistance is five orders of magnitude larger than the resistance quantum (26 kΩ) and allows for the localization of electrons on the tungsten nano-island. The low capacitance of the device, combined with the high tunnel resistance, allows for the Coulomb blockade effects observed at room temperature. Different device configurations minimizing the total capacitance of the device have been explored. The effect of the geometry of the nano-electrodes on the device characteristics is presented. Simulated device characteristics, based on the soliton model, are discussed. The first application of an SET device as a gas sensor is demonstrated.
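The charging-energy scale quoted above can be checked against the standard orthodox-theory formula E_C = e^2 / (2C); the capacitance used below is chosen to illustrate the order of magnitude, not taken from the device:

```python
# Orthodox-theory single-electron charging energy, E_C = e^2 / (2C),
# compared with the thermal energy k_B * T. The capacitance is illustrative.
E = 1.602176634e-19   # elementary charge, coulombs
KB = 1.380649e-23     # Boltzmann constant, J/K

def charging_energy_mev(capacitance_farad):
    """Charging energy e^2 / (2C), expressed in meV."""
    return (E ** 2 / (2.0 * capacitance_farad)) / E * 1e3  # J -> eV -> meV

ec = charging_energy_mev(0.5e-18)  # ~0.5 aF total island capacitance
kt_room = KB * 300.0 / E * 1e3     # thermal energy at 300 K, in meV
```

A total island capacitance of about 0.5 aF yields a charging energy near 160 meV, several times the roughly 26 meV thermal energy at room temperature, which is why a sufficiently small nano-island permits Coulomb blockade without cryogenic cooling.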
Abstract:
Regional flood frequency techniques are commonly used to estimate flood quantiles when flood data is unavailable or the record length at an individual gauging station is insufficient for reliable analyses. These methods compensate for limited or unavailable data by pooling data from nearby gauged sites. This requires the delineation of hydrologically homogeneous regions in which the flood regime is sufficiently similar to allow the spatial transfer of information. It is generally accepted that hydrologic similarity results from similar physiographic characteristics, and thus these characteristics can be used to delineate regions and classify ungauged sites. However, as currently practiced, the delineation is highly subjective and dependent on the similarity measures and classification techniques employed. A standardized procedure for delineation of hydrologically homogeneous regions is presented herein. Key aspects are a new statistical metric to identify physically discordant sites, and the identification of an appropriate set of physically based measures of extreme hydrological similarity. A combination of multivariate statistical techniques applied to multiple flood statistics and basin characteristics for gauging stations in the Southeastern U.S. revealed that basin slope, elevation, and soil drainage largely determine the extreme hydrological behavior of a watershed. Use of these characteristics as similarity measures in the standardized approach for region delineation yields regions which are more homogeneous and more efficient for quantile estimation at ungauged sites than those delineated using alternative physically-based procedures typically employed in practice. The proposed methods and key physical characteristics are also shown to be efficient for region delineation and quantile development in alternative areas composed of watersheds with statistically different physical composition. 
In addition, the use of aggregated values of key watershed characteristics was found to be sufficient for the regionalization of flood data; the added time and computational effort required to derive spatially distributed watershed variables does not increase the accuracy of quantile estimators for ungauged sites. This dissertation also presents a methodology by which flood quantile estimates in Haiti can be derived using relationships developed for data-rich regions of the U.S. As currently practiced, regional flood frequency techniques can only be applied within the predefined area used for model development. However, results presented herein demonstrate that the regional flood distribution can successfully be extrapolated to areas of similar physical composition located beyond the extent of that used for model development, provided differences in precipitation are accounted for and the site in question can be appropriately classified within a delineated region.
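Screening for physically discordant sites, as described above, is often done with the classical Hosking-Wallis discordancy measure. The sketch below applies that classical measure to synthetic basin attributes (slope, elevation, soil drainage); it is not the new metric developed in this dissertation:

```python
import numpy as np

# Hosking-Wallis-style discordancy: flag sites whose attribute vector is far
# from the group mean in a Mahalanobis sense. Site values are synthetic.
def discordancy(u):
    """D_i = (n/3) * (u_i - ubar)^T A^{-1} (u_i - ubar), A = sum of outer products."""
    u = np.asarray(u, dtype=float)
    n = u.shape[0]
    d = u - u.mean(axis=0)
    A = d.T @ d
    return np.array([n / 3.0 * (di @ np.linalg.solve(A, di)) for di in d])

# Nine similar basins plus one outlier in (slope %, elevation m, drainage index):
sites = [[1.0, 200, 0.30], [1.2, 210, 0.35], [0.9, 190, 0.28], [1.1, 205, 0.32],
         [1.0, 195, 0.31], [1.3, 215, 0.36], [0.8, 185, 0.27], [1.1, 208, 0.33],
         [0.95, 198, 0.30], [5.0, 600, 0.90]]
D = discordancy(sites)
```

A useful sanity check is that the D_i always sum to n when there are three attributes (trace identity), and the steep, high-elevation outlier basin receives by far the largest value, so it would be flagged before region delineation.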
Abstract:
Students are now involved in a vastly different textual landscape than many English scholars, one that relies on the “reading” and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory, and when combined to account for the multiple modalities and complexities of gaming, can provide new insights about those theories and practices across all kinds of media, whether in written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. In order to foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses. 
This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can hopefully view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
With energy demands and costs growing every day, the need for improving energy efficiency in electrical devices has become very important. Research into various methods of improving efficiency for all electrical components will be key to meeting future energy needs. This report documents the design, construction, and testing of a research-quality electric machine dynamometer and test bed. This test cell system can be used for research in several areas, including electric drive systems, electric vehicle propulsion systems, power electronic converters, and load/source elements in an AC microgrid, among many others. The test cell design criteria and decisions will be discussed in reference to user functionality and flexibility. The individual power components will be discussed in detail, explaining how they relate to the project and highlighting any features used in operation of the test cell. A project timeline will be discussed, clearly stating the work done by the different individuals involved in the project. In addition, the system will be parameterized and benchmark data will be used to demonstrate the functional operation of the system.
Analysis of spring break-up and its effects on a biomass feedstock supply chain in northern Michigan
Abstract:
Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility in Kinross Township of Chippewa County, Michigan about to be built, models including a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it will impact the feasibility of harvesting activity and the time duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision making processes at the county level. The spring break-up model, based on the historical spring break-up data from 27 counties over the period of 2002-2010, starts by specifying the probability distribution of a particular county’s spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. 
Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For early spring break-up, which usually indicates a longer-than-average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
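The reduced major axis (RMA) regression used to relate counties' break-up timing can be sketched as follows; the day-of-year values below are synthetic, not the 2002-2010 county records:

```python
import math

# Reduced major axis (RMA) regression: the slope is sign(cov) * sd(y) / sd(x),
# which treats x and y symmetrically (neither is assumed error-free).
# The day-of-year data below are synthetic, for illustration only.
def rma_fit(x, y):
    """Return (slope, intercept) of the reduced major axis line through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x) / n)
    sy = math.sqrt(sum((yi - my) ** 2 for yi in y) / n)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    slope = math.copysign(sy / sx, cov)
    return slope, my - slope * mx

# Break-up start day in a reference county (x) vs. a neighboring county (y):
x = [60, 65, 70, 75, 80, 85, 90, 95]
y = [63, 66, 74, 76, 84, 86, 94, 97]
slope, intercept = rma_fit(x, y)
```

Unlike ordinary least squares, RMA does not assume the reference county's break-up day is measured without error, which suits the case where both counties' dates carry comparable uncertainty.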
Abstract:
BACKGROUND: Chronic pain is an important outcome variable after inguinal hernia repair that is generally not assessed by objective methods. The aim of this study was to objectively investigate chronic pain and hypoesthesia after inguinal hernia repair using three types of operation: open suture, open mesh, and laparoscopic. METHODS: A total of 96 patients were included in the study with a median follow-up of 4.7 years. Open suture repair was performed in 40 patients (group A), open mesh repair in 20 patients (group B), and laparoscopic repair in 36 patients (group C). Hypoesthesia and pain were assessed using von Frey monofilaments. Quality of life was investigated with Short Form 36. RESULTS: Pain occurring at least once a week was found in 7 (17.5%) patients of group A, in 5 (25%) patients of group B, and in 6 (16.6%) patients of group C. Area and intensity of hyposensibility were increased significantly after open nonmesh and mesh repair compared to those after laparoscopy (p = 0.01). Hyposensibility in patients who had laparoscopic hernia repair was significantly associated with postoperative pain (p = 0.03). Type of postoperative pain was somatic in 19 (61%), neuropathic in 9 (29%), and visceral in 3 (10%) patients without significant differences between the three groups. CONCLUSIONS: The incidence of hypoesthesia in patients who had laparoscopic hernia repair is significantly lower than in those who had open hernia repair. Hypoesthesia after laparoscopic but not after open repair is significantly associated with postoperative pain. Von Frey monofilaments are important tools for the assessment of inguinal hypoesthesia and pain in patients who had inguinal hernia repair allowing quantitative and qualitative comparison between various surgical techniques.