48 results for planets and satellites: individual: Uranus


Relevance: 100.00%

Abstract:

Detailed observations of the solar system planets reveal a wide variety of local atmospheric conditions. Astronomical observations have revealed a variety of extrasolar planets, none of which fully resembles any of the solar system planets. Instead, the most massive among the extrasolar planets, the gas giants, appear very similar to the class of (young) brown dwarfs, which are amongst the oldest objects in the universe. Despite this diversity, solar system planets, extrasolar planets and brown dwarfs have broadly similar global temperatures, between 300 K and 2500 K. In consequence, clouds of different chemical species form in their atmospheres. While the details of these clouds differ, the fundamental physical processes are the same. Furthermore, all of these objects have been observed to produce radio and X-ray emission. While both kinds of radiation are well studied on Earth, and to a lesser extent on the solar system planets, the occurrence of emission that potentially originates from accelerated electrons on brown dwarfs, extrasolar planets and protoplanetary disks is not yet well understood. This paper offers an interdisciplinary view of electrification processes, and their feedback on their hosting environment, in meteorology, volcanology, planetology and research on extrasolar planets and planet formation.

Relevance: 100.00%

Abstract:

It is now possible to assay a large number of genetic markers from patients in clinical trials in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular approach is to apply a univariate test to each genetic marker once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple-testing problems that arise from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and each individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We then extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible, as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
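The fixed-sample-size version of the idea above — combine per-SNP interaction statistics into one omnibus statistic, then permute treatment labels to get a global p-value that respects SNP–SNP correlation — can be sketched as follows. This is a minimal illustration, not the paper's method: the per-SNP statistic here is a simplified squared difference in treatment effect between carriers and non-carriers, standing in for the F-statistic, and all names are our own.

```python
import random
from statistics import mean

def interaction_stat(outcome, treat, geno):
    # Simplified stand-in for the per-SNP interaction F-statistic:
    # squared difference between the treatment effect in carriers
    # and non-carriers of the variant allele.
    def effect(mask):
        treated = [y for y, t, m in zip(outcome, treat, mask) if m and t == 1]
        control = [y for y, t, m in zip(outcome, treat, mask) if m and t == 0]
        if not treated or not control:
            return 0.0
        return mean(treated) - mean(control)
    carriers = [g > 0 for g in geno]
    non_carriers = [not c for c in carriers]
    return (effect(carriers) - effect(non_carriers)) ** 2

def omnibus_pvalue(outcome, treat, genotypes, n_perm=199, seed=7):
    """Global p-value for any gene-drug interaction across SNPs,
    combining per-SNP statistics by summation and permuting the
    treatment labels (which preserves SNP-SNP correlation)."""
    rng = random.Random(seed)
    observed = sum(interaction_stat(outcome, treat, g) for g in genotypes)
    exceed = 0
    for _ in range(n_perm):
        perm = treat[:]
        rng.shuffle(perm)
        if sum(interaction_stat(outcome, perm, g) for g in genotypes) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# Toy data with a strong gene-drug interaction at one SNP
treat = [1, 0] * 10
geno = [1] * 10 + [0] * 10
outcome = [2.0 if t == 1 and g == 1 else 0.0 for t, g in zip(treat, geno)]
p = omnibus_pvalue(outcome, treat, [geno])
```

Because the treatment labels (not the genotypes) are permuted, the joint correlation structure of the SNPs is left intact under the null, which is the key point the abstract makes.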

Relevance: 100.00%

Abstract:

Innovation continues to be high on the agenda in construction. It is widely considered to be an essential prerequisite of improved performance both for the sector at large and for individual firms. Success stories dominate the parts of the academic literature that rely heavily on the recollections of key individuals. A complementary interpretation focuses on the way innovation champions in hindsight interpret, justify and legitimize the diffusion of innovations. Emphasis is put on the temporal dimension of interpretation and how this links to rhetorical strategies and impression management tactics. Rhetorical theories are drawn upon to analyse the accounts given by innovation champions in seven facilities management organizations. In particular, the three persuasive appeals in classic rhetoric are used to highlight the rhetorical justifications mobilized in the descriptions of what took place. The findings demonstrate the usefulness of rhetorical theories in complementing studies of innovation.

Relevance: 100.00%

Abstract:

Dietary isoflavones are currently receiving much attention because of their potential role in preventing coronary artery disease and other chronic diseases. Accumulating evidence from cell culture and laboratory animal experiments indicates that isoflavones have the potential to prevent or delay atherogenesis. Suggested mechanisms of action include: (1) a reduction in low-density lipoprotein (LDL) cholesterol and a potential reduction in the susceptibility of the LDL particle to oxidation; (2) an improvement in vascular reactivity; (3) an inhibition of pro-inflammatory cytokines, cell adhesion proteins and nitric oxide (NO) production; and (4) an inhibition of platelet aggregation. These mechanisms are consistent with the epidemiological evidence that a high consumption of isoflavone-rich soy products is associated with a reduced incidence of coronary artery disease. Biological effects of isoflavones are dependent on many factors, including dose consumed, duration of use, protein-binding affinity, and an individual's metabolism or intrinsic oestrogenic state. Further clinical studies are necessary to determine the potential health effects of isoflavones in specific population groups, as we currently know little about age-related differences in exposure to these compounds and there are few guidelines on optimal dose for cardiovascular health benefits.

Relevance: 100.00%

Abstract:

This paper describes a method for dynamic data reconciliation of nonlinear systems that are simulated using the sequential modular approach, and where individual modules are represented by a class of differential algebraic equations. The estimation technique consists of a bank of extended Kalman filters that are integrated with the modules. The paper reports a study based on experimental data obtained from a pilot scale mixing process.
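The core building block described above — an extended Kalman filter attached to a single process module — can be sketched for a scalar mixing-tank model. This is a generic textbook EKF under assumed dynamics, noise levels and parameter names, not the paper's implementation or its pilot-plant model.

```python
def ekf_step(x, P, u, y, dt=0.1, tau=5.0, Q=1e-4, R=1e-2):
    """One predict/update cycle for a scalar state x (tank concentration).
    Assumed module dynamics: dx/dt = (u - x)/tau; measurement: y = x + noise."""
    # Predict (Euler discretisation; state Jacobian F = 1 - dt/tau)
    F = 1.0 - dt / tau
    x_pred = x + dt * (u - x) / tau
    P_pred = F * P * F + Q
    # Update (measurement Jacobian H = 1 for a direct measurement)
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (y - x_pred)  # reconciled estimate
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Reconcile noisy measurements of a tank settling towards inlet u = 1.0
x, P = 0.0, 1.0
for y in [0.15, 0.2, 0.3, 0.42, 0.5, 0.61, 0.7, 0.74, 0.82, 0.88]:
    x, P = ekf_step(x, P, u=1.0, y=y)
```

In the sequential modular setting the paper describes, one such filter would be integrated with each module and the bank of filters run alongside the flowsheet simulation.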

Relevance: 100.00%

Abstract:

We review the scientific literature since the 1960s to examine the evolution of modeling tools and observations that have advanced understanding of global stratospheric temperature changes. Observations show overall cooling of the stratosphere during the period for which they are available (since the late 1950s and late 1970s from radiosondes and satellites, respectively), interrupted by episodes of warming associated with volcanic eruptions, and superimposed on variations associated with the solar cycle. There has been little global mean temperature change since about 1995. The temporal and vertical structure of these variations is reasonably well explained by models that include changes in greenhouse gases, ozone, volcanic aerosols, and solar output, although there are significant uncertainties in the temperature observations and regarding the nature and influence of past changes in stratospheric water vapor. As a companion to a recent WIREs review of tropospheric temperature trends, this article identifies areas of commonality and contrast between the tropospheric and stratospheric trend literature. For example, the increased attention over time to radiosonde and satellite data quality has contributed to better characterization of uncertainty in observed trends both in the troposphere and in the lower stratosphere, and has highlighted the relative deficiency of attention to observations in the middle and upper stratosphere. In contrast to the relatively unchanging expectations of surface and tropospheric warming primarily induced by greenhouse gas increases, stratospheric temperature change expectations have arisen from experiments with a wider variety of model types, showing more complex trend patterns associated with a greater diversity of forcing agents.

Relevance: 100.00%

Abstract:

The National Grid Company plc owns and operates the electricity transmission network in England and Wales, the day-to-day running of the network being carried out by teams of engineers within the national control room. The task of monitoring and operating the transmission network involves the transfer of large amounts of data and a high degree of cooperation between these engineers. The purpose of the research detailed in this paper is to investigate the use of interfacing techniques within the control room scenario, in particular the development of an agent-based architecture for the support of cooperative tasks. The proposed architecture revolves around the use of interface and user supervisor agents. Primarily, these agents are responsible for the flow of information to and from individual users and user groups. The agents are also responsible for tackling the synchronisation and control issues arising during the completion of cooperative tasks. In this paper a novel approach to human-computer interaction (HCI) for power systems incorporating an embedded agent infrastructure is presented. The agent architectures used to form the base of the cooperative task support system are discussed, as is the nature of the support system and the tasks it is intended to support.

Relevance: 100.00%

Abstract:

Background: The effects of landscape modifications on the long-term persistence of wild animal populations are of crucial importance to wildlife managers and conservation biologists, but obtaining experimental evidence using real landscapes is usually impossible. To circumvent this problem we used individual-based models (IBMs) of interacting animals in experimental modifications of a real Danish landscape. The models incorporate as much as possible of the behaviour and ecology of four species with contrasting life-history characteristics: skylark (Alauda arvensis), vole (Microtus agrestis), a ground beetle (Bembidion lampros) and a linyphiid spider (Erigone atra). This allows us to quantify the population implications of experimental modifications of landscape configuration and composition. Methodology/Principal Findings: Starting with a real agricultural landscape, we progressively reduced landscape complexity by (i) homogenizing habitat patch shapes, (ii) randomizing the locations of the patches, and (iii) randomizing the size of the patches. The first two steps increased landscape fragmentation. We assessed the effects of these manipulations on the long-term persistence of animal populations by measuring equilibrium population sizes and time to recovery after disturbance. Patch rearrangement and the presence of corridors had a large effect on the population dynamics of species whose local success depends on the surrounding terrain. Landscape modifications that reduced population sizes increased recovery times in the short-dispersing species, making small populations vulnerable to increasing disturbance. The species that were most strongly affected by large disturbances fluctuated little in population sizes in years when no perturbations took place.
Significance: Traditional approaches to the management and conservation of populations use either classical methods of population analysis, which fail to adequately account for the spatial configurations of landscapes, or landscape ecology, which accounts for landscape structure but has difficulty predicting the dynamics of populations living in them. Here we show how realistic and replicable individual-based models can bridge the gap between non-spatial population theory and non-dynamic landscape ecology. A major strength of the approach is its ability to identify population vulnerabilities not detected by standard population viability analyses.

Relevance: 100.00%

Abstract:

In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. 
When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
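The headline numbers in the abstract above can be condensed into a schematic splitting (the notation here is ours, not necessarily the authors'):

```latex
\dot{S}_{\mathrm{mat}} \;=\; \dot{S}_{\mathrm{vert}} + \dot{S}_{\mathrm{hor}},
\qquad
\dot{S}_{\mathrm{vert}} \approx 0.9\,\dot{S}_{\mathrm{mat}},
\quad
\dot{S}_{\mathrm{hor}} \approx 0.1\,\dot{S}_{\mathrm{mat}},
\qquad
\dot{S}_{\mathrm{mat}} \approx 55\ \mathrm{mW\,m^{-2}\,K^{-1}},
```

with CMs' baroclinic efficiencies clustered around $0.055$ and the derived lower bound on the Lorenz energy cycle intensity around $1.0$-$1.5\ \mathrm{W\,m^{-2}}$.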

Relevance: 100.00%

Abstract:

A wild house mouse (Mus domesticus) population originally trapped near Reading, Berkshire, United Kingdom, and maintained as a colony in the laboratory, was subjected to the discriminating feeding period of the warfarin resistance test, as used by Wallace and MacSwiney (1976) and derived from the work of Rowe and Redfern (1964). Eighty percent of this heterogeneous population survived the resistance test. A similar proportion of the population was found to survive the normally lethal dose of bromadiolone administered by oral gavage. The majority of this population of mice were classified as "warfarin-resistant" and "bromadiolone-resistant." The dose of 10 mg kg-1 of bromadiolone administered by oral gavage appeared to give good discrimination between susceptible and resistant individuals. The results of breeding tests indicate a single dominant gene that confers both "warfarin-resistance" and "bromadiolone-resistance", with complete expression of the resistance genotype in both males and females. Individual mice were classified as to genotype by back-crossing to a homozygous-susceptible strain, and resistance-testing the F1 generation. Separate strains of homozygous-resistant and homozygous-susceptible house mice are now being established.

Relevance: 100.00%

Abstract:

The recession of mountain glaciers around the world has been linked to anthropogenic climate change, and small glaciers (e.g. < 2 km2) are thought to be particularly vulnerable, with reports of their disappearance from several regions. However, the response of small glaciers to climate change can be modulated by non-climatic factors such as topography and debris cover, and there remain a number of regions where their recent change has evaded scrutiny. This paper presents results of the first multi-year remote sensing survey of glaciers in the Kodar Mountains, the only glaciers in SE Siberia, which we compare to previous glacier inventories from this continental setting that reported total glacier areas of 18.8 km2 in ca. 1963 (12.6 km2 of exposed ice) and 15.5 km2 in 1974 (12 km2 of exposed ice). Mapping their debris-covered termini is difficult, but delineation of debris-free ice on Landsat imagery reveals 34 glaciers with a total area of 11.72 ± 0.72 km2 in 1995, followed by a reduction to 9.53 ± 0.29 km2 in 2001 and 7.01 ± 0.23 km2 in 2010. This represents a ~44% decrease in exposed glacier ice between ca. 1963 and 2010, but with 40% lost since 1995 and with individual glaciers losing as much as 93% of their exposed ice. Thus, although continental glaciers are generally thought to be less sensitive than their maritime counterparts, a recent acceleration in shrinkage of exposed ice has taken place, and we note its coincidence with a strong summer warming trend in the region initiated at the start of the 1980s. Whilst smaller and shorter glaciers have, proportionally, tended to shrink more rapidly, we find no statistically significant relationship between shrinkage and elevation characteristics, aspect or solar radiation. This is probably due to the small sample size, limited elevation range, and topographic setting of the glaciers in deep valley heads.
Furthermore, many of the glaciers possess debris-covered termini and it is likely that the ablation of buried ice is lagging the shrinkage of exposed ice, such that a growth in the proportion of debris cover is occurring, as observed elsewhere. If recent trends continue, we hypothesise that glaciers could evolve into a type of rock glacier within the next few decades, introducing additional complexity in their response and delaying their potential demise.
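The shrinkage percentages quoted above follow directly from the exposed-ice areas reported in the abstract, as a quick arithmetic check shows:

```python
# Exposed-ice areas from the abstract (km^2)
area_1963 = 12.6   # ca. 1963 inventory
area_1995 = 11.72  # Landsat, 1995
area_2010 = 7.01   # Landsat, 2010

# Fractional loss of exposed ice over each interval
loss_since_1963 = 1 - area_2010 / area_1963   # ~0.44, i.e. the ~44% quoted
loss_since_1995 = 1 - area_2010 / area_1995   # ~0.40, i.e. the 40% quoted
```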

Relevance: 100.00%

Abstract:

The substorm current wedge (SCW) is a fundamental component of geomagnetic substorms. Models tend to describe the SCW as a simple line current flowing into the ionosphere towards dawn and out of the ionosphere towards dusk, linked by a westward electrojet. We use multi-spacecraft observations from perigee passes of the Cluster 1 and 4 spacecraft during a substorm on 15 January 2010, in conjunction with ground-based observations, to examine the spatial structuring and temporal variability of the SCW. At this time, the spacecraft travelled east-west azimuthally above the auroral region. We show that the SCW has significant azimuthal sub-structure on scales of 100 km at altitudes of 4,000-7,000 km. We identify 26 individual current sheets in the Cluster 4 data and 34 individual current sheets in the Cluster 1 data, with Cluster 1 passing through the SCW 120-240 s after Cluster 4 at 1,300-2,000 km higher altitude. Both spacecraft observed large-scale regions of net upward and downward field-aligned current, consistent with the large-scale characteristics of the SCW, although sheets of oppositely directed currents were observed within both regions. We show that the majority of these current sheets were closely aligned to a north-south direction, in contrast to the expected east-west orientation of the pre-onset aurora. Comparing our results with observations of the field-aligned current associated with bursty bulk flows (BBFs), we conclude that significant questions remain for the explanation of SCW structuring by BBF-driven "wedgelets". Our results therefore represent constraints on future modelling and theoretical frameworks on the generation of the SCW.

Relevance: 100.00%

Abstract:

Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies.
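The hazard indicator used above, a T-year return level, is the flow magnitude exceeded on average once every T years. A generic way to estimate it from a series of annual maxima is a Gumbel fit by the method of moments; the sketch below is that textbook estimator with made-up numbers, not the procedure used by the impact models in the paper.

```python
import math
from statistics import mean, stdev

def gumbel_return_level(annual_maxima, T):
    """T-year return level from annual maxima via a Gumbel fit
    (method of moments): z_T = mu - beta * ln(-ln(1 - 1/T))."""
    gamma = 0.5772156649                                  # Euler-Mascheroni constant
    beta = stdev(annual_maxima) * math.sqrt(6) / math.pi  # scale parameter
    mu = mean(annual_maxima) - gamma * beta               # location parameter
    return mu - beta * math.log(-math.log(1 - 1 / T))

# 30-y return level from a short synthetic series of annual peak flows (m^3/s)
peaks = [310, 270, 420, 350, 290, 380, 330, 450, 300, 360]
level_30y = gumbel_return_level(peaks, 30)
```

A rarer event (larger T) gives a larger return level, which is why "the current 30-y flood peak occurring more often than 1 in 5 y" is a meaningful statement of increased hazard.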

Relevance: 100.00%

Abstract:

We are looking into variants of a dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problem on simple graphs are already present in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network using a measure of individuals' influence (Klout) as weights. We argue that the weights of vertices could be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost while ensuring that each individual in the network has at least an alpha fraction of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and thus can be used for real-life applications when creating social network based interventions, designing social media campaigns and potentially improving users' social media experience.
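The constraint described above — every individual must have at least an alpha fraction of its neighbours in the chosen set, at minimum total cost — can be illustrated with a small randomised heuristic: sample vertices, then greedily repair any deficient vertex with its cheapest missing neighbours. This is a simplified stand-in for intuition only, not the paper's randomised-rounding algorithm or its approximation bound.

```python
import math
import random

def weighted_alpha_rate_dominating_set(adj, weights, alpha, p=0.5, seed=1):
    """Randomised sample-and-repair heuristic for the minimum weighted
    alpha-rate dominating set. adj[v] is the set of neighbours of v."""
    rng = random.Random(seed)
    n = len(adj)
    # Step 1: include each vertex independently with probability p
    chosen = {v for v in range(n) if rng.random() < p}

    def deficient(v):
        need = math.ceil(alpha * len(adj[v]))
        return len(adj[v] & chosen) < need

    # Step 2: repair - give each deficient vertex its cheapest missing
    # neighbours until the alpha-rate constraint holds everywhere.
    for v in range(n):
        while deficient(v):
            cheapest = min(adj[v] - chosen, key=lambda u: weights[u])
            chosen.add(cheapest)
    return chosen

# Toy network of 5 users; weight = assumed cost of recruiting each user
adj = [{1, 2}, {0, 2, 3}, {0, 1, 4}, {1, 4}, {2, 3}]
weights = [3.0, 1.0, 2.0, 5.0, 4.0]
S = weighted_alpha_rate_dominating_set(adj, weights, alpha=0.5)
```

Since adding vertices can only help other vertices meet their quota, the repair pass terminates with a feasible (though not necessarily minimum-cost) set.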

Relevance: 100.00%

Abstract:

This chapter presents findings on English Language instruction at the lower primary level in the context of policies for curricular innovation at national, school and classroom levels. The focus is on policies which connect national and school levels, and on how they might be interpreted when implemented in multiple schools within Singapore's educational system. Referring to case studies in two schools and to individual lesson observations in 10 schools, we found much agreement with national policies in terms of curriculum (i.e. lesson content and activity selection), leading to great uniformity in the lessons taught by different teachers in different schools. In addition, we found that schools had an important mediating influence on the implementation of national policies. However, adoptions and adaptations of policy innovations at the classroom level were somewhat superficial, as they were more related to changes in educational facilities and procedures than in philosophies.