Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target as an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and can model technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
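The kind of agent-based model the abstract recommends can be illustrated with a minimal sketch. Everything below is an assumption made for illustration, not the authors' model: householder agents adopt a low-carbon measure only when its payback period falls within a personal tolerance, and that tolerance is relaxed as neighbours adopt (a crude rule set capturing limited economic rationality plus peer influence).

```python
import random

class Householder:
    """Agent with limited economic rationality: adopts a measure only if the
    payback period beats a personal threshold, nudged upward by peers."""
    def __init__(self, payback_threshold, peer_weight):
        self.payback_threshold = payback_threshold  # years the agent will tolerate
        self.peer_weight = peer_weight              # susceptibility to neighbours
        self.adopted = False

    def step(self, payback_years, neighbours):
        if self.adopted:
            return
        peer_share = sum(n.adopted for n in neighbours) / max(len(neighbours), 1)
        # Peer adoption effectively stretches the tolerated payback period.
        effective_threshold = self.payback_threshold * (1 + self.peer_weight * peer_share)
        if payback_years <= effective_threshold:
            self.adopted = True

def run(n_agents=100, steps=20, payback_years=8.0, seed=1):
    rng = random.Random(seed)
    agents = [Householder(rng.uniform(2, 12), rng.uniform(0, 1)) for _ in range(n_agents)]
    history = []
    for _ in range(steps):
        for a in agents:
            a.step(payback_years, agents)  # fully connected network for simplicity
        history.append(sum(a.adopted for a in agents))
    return history

history = run()
```

Even this toy rule set produces diffusion-like behaviour: an initial tranche of economically rational adopters, followed by peer-driven uptake among agents whose thresholds alone would have ruled the measure out.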
Abstract:
Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is the case because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. As an improvement on the original research, this paper describes the creation of an 'operational' algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and (3) transforming turbulent kinetic energy dissipation to eddy dissipation rate, the turbulence metric that is becoming the worldwide 'standard'. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence, ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of clear-air turbulence and may ultimately enhance its predictability.
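The third modification, converting turbulent kinetic energy dissipation ε to eddy dissipation rate, is commonly defined as EDR = ε^(1/3). A minimal sketch of that conversion follows; the intensity bands are illustrative assumptions for a mid-size aircraft, not the paper's calibration.

```python
def tke_dissipation_to_edr(epsilon):
    """Convert TKE dissipation rate epsilon (m^2 s^-3) to eddy dissipation
    rate EDR = epsilon**(1/3) (m^(2/3) s^-1), the standard aviation metric."""
    if epsilon < 0:
        raise ValueError("dissipation rate must be non-negative")
    return epsilon ** (1.0 / 3.0)

def classify_edr(edr):
    """Map an EDR value to a turbulence intensity label.
    Thresholds are assumed for illustration, not taken from the paper."""
    if edr < 0.1:
        return "smooth"
    if edr < 0.2:
        return "light"
    if edr < 0.45:
        return "moderate"
    return "severe"
```

The cube root compresses the enormous dynamic range of ε into a metric that scales roughly with the aircraft response, which is one reason it has been adopted as the reporting standard.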
Abstract:
The chapter reports on the 'This Is Me' project, which aimed to help students and the wider public become aware of the impact that online material, particularly that on the Internet, has on their identity and reputation. The chapter explores practical aspects of Digital Identity, relating to issues such as employability, relationships and even death: for example, understanding the impact a photograph posted on a social networking website might have for different groups of people, ranging from friends or parents to future employers. As part of the 'This Is Me' project, stories were collected from students and others about Digital Identity matters, and a grounded methodological approach based on action research was used to establish the issues related to Digital Identity that are particularly relevant to those in academia. Drawing on these issues, resources were developed to help inform and educate people about how they can understand and control their own Digital Identity. A number of these resources are presented here, along with reflections on how they are used and can be adapted.
Abstract:
The study explores what happens to teachers' practice and professional identity when they adopt a collaborative action research approach to teaching and involve external creative partners and a university mentor. The teachers aim to nurture and develop the creative potential of their learners by empowering them to make decisions for themselves about their own progress and learning directions. The teachers worked creatively and collaboratively, designing creative teaching and learning methods in support of pupils with language and communication difficulties. The respondents are from an English special school, a primary school and a girls' secondary school. A mixed-methods methodology is adopted. Gains in teacher confidence and capability were identified, in addition to shifts in values that impacted directly on their self-concept of what it is to be an effective teacher promoting effective learning. Developing their professional identities within a team ethos enabled them to make decisions about learning based on the educational potential of learners, an approach which they showed resulted in elevated standards achieved by this group of learners. They were able to justify their actions on established educational principles. Tensions were revealed, however, between what they perceived as the normal required professionalism imposed by external agencies and the enhanced professionalism experienced working through the project, where they were able to integrate theory and practice.
Abstract:
Background A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of a SNP-based linkage map of an apple rootstock progeny. Results Of the 7,867 Malus SNP markers on the array, 1,823 (23.2 %) were heterozygous in one of the two parents of the progeny, 1,007 (12.8 %) were heterozygous in both parental genotypes, whilst just 2.8 % of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the 'Golden Delicious' genome sequence. A total of 311 markers (13.7 % of all mapped markers) mapped to positions that conflicted with their predicted positions on the 'Golden Delicious' pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence. Conclusions We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites.
The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.
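The quoted percentages and the final marker density follow directly from the counts given in the abstract; a quick arithmetic check:

```python
# Counts as stated in the abstract.
snp_on_array = 7867
het_one_parent = 1823
het_both = 1007
mapped_snp = 2272
mapped_ssr = 306
map_length_cm = 1282.2

pct_one = 100 * het_one_parent / snp_on_array   # heterozygous in one parent
pct_both = 100 * het_both / snp_on_array        # heterozygous in both parents
total_mapped = mapped_snp + mapped_ssr + 1      # SNPs + SSRs + the S-locus
density = map_length_cm / total_mapped          # cM per marker
```

The ratios reproduce the reported 23.2 %, 12.8 % and 0.5 cM/marker figures.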
Abstract:
Lava dome eruptions are sometimes characterised by large periodic fluctuations in extrusion rate over periods of hours that may be accompanied by Vulcanian explosions and pyroclastic flows. We consider a simple system of nonlinear equations describing a 1D flow of lava extrusion through a deep elastic dyke feeding a shallower cylindrical conduit in order to simulate this short-period cyclicity. Stick-slip conditions depending on a critical shear stress are assumed at the wall boundary of the cylindrical conduit. By analogy with the behaviour of industrial polymers in a plastic extruder, the elastic dyke acts like a barrel and the shallower cylindrical portion of the conduit as a die for the flow of magma acting as a polymer. When we applied the model to the Soufrière Hills Volcano, Montserrat, for which the key parameters have been evaluated from previous studies, cyclic extrusions with periods from 3 to 30 h were readily simulated, matching observations. The model also reproduces the reduced period of cycles observed when a major unloading event occurs due to lava dome collapse.
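The short-period cyclicity can be caricatured as a relaxation oscillator: pressure builds elastically while the conduit sticks and drains while it slips. The sketch below is a deliberately simplified toy, not the authors' 1D flow equations, and all parameter values are invented for illustration.

```python
def stick_slip_cycles(q_in=1.0, k=0.5, p_stick=10.0, p_slip=6.0,
                      q_out=3.0, dt=0.01, t_max=100.0):
    """Toy relaxation oscillator for stick-slip extrusion (illustrative only).
    Elastic dyke pressure p rises with magma influx q_in; the conduit 'slips'
    (extrudes at rate q_out) once p exceeds p_stick, and 'sticks' again when
    p falls below the lower threshold p_slip. Returns slip-onset times."""
    p, slipping = 0.0, False
    onsets = []
    t = 0.0
    while t < t_max:
        if not slipping and p >= p_stick:
            slipping = True
            onsets.append(t)
        elif slipping and p <= p_slip:
            slipping = False
        outflow = q_out if slipping else 0.0
        p += k * (q_in - outflow) * dt   # elastic storage: dp/dt proportional to net flux
        t += dt
    return onsets

onsets = stick_slip_cycles()
periods = [b - a for a, b in zip(onsets, onsets[1:])]
```

The two-threshold (static versus dynamic friction) rule is what generates periodic rather than steady extrusion, echoing the plastic-extruder analogy in the abstract; the period scales with the elastic storage constant and the gap between the thresholds.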
Abstract:
Consent's capacity to legitimise actions and claims is limited by conditions such as coercion, which render consent ineffective. A better understanding of the limits to consent's capacity to legitimise can shed light on a variety of applied debates, in political philosophy, bioethics, economics and law. I show that traditional paternalist explanations for limits to consent's capacity to legitimise cannot explain the central intuition that consent is often rendered ineffective when brought about by a rights violation or threatened rights violation. I argue that this intuition is an expression of the same principles of corrective justice that underlie norms of compensation and rectification. I show how these principles can explain and clarify core intuitions about conditions which render consent ineffective, including those concerned with the consenting agent's option set, his mental competence, and available information.
Abstract:
Particulate matter generated during the cooking process has been identified as one of the major problems for indoor air quality and indoor environmental health. Reliable assessment of exposure to cooking-generated particles requires accurate information on emission characteristics, especially the size distribution. This study characterizes the volume/mass-based size distribution of the fume particles at the oil-heating stage for typical Chinese-style cooking in a laboratory kitchen. A laser-diffraction size analyzer is applied to measure the volume frequency of fume particles ranging from 0.1 to 10 μm, which contribute most of the mass in PM2.5 and PM10. Measurements show that particle emissions have little dependence on the type of vegetable oil used but have a close relationship with the heating temperature. It is found that the volume frequency of fume particles in the range of 1.0–4.0 μm accounts for nearly 100% of PM0.1–10, with a mode diameter of 2.7 μm, a median diameter of 2.6 μm, a Sauter mean diameter of 3.0 μm, a De Brouckere mean diameter of 3.2 μm, and a distribution span of 0.48. Such information on emission characteristics can be used to improve the assessment of indoor air quality due to PM0.1–10 in kitchens and residential flats.
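The reported statistics (median, Sauter mean D[3,2], De Brouckere mean D[4,3], span) are standard moments of a volume-based size distribution. A sketch using the usual volume-weighted definitions; the bin data here are hypothetical, chosen only to echo the reported 1.0–4.0 μm range:

```python
def distribution_stats(d, v):
    """Size statistics from a volume-frequency distribution.
    d: bin mid-diameters (um); v: volume fraction per bin.
    For volume-weighted fractions: D[4,3] = sum(v_i * d_i) and
    D[3,2] = 1 / sum(v_i / d_i)."""
    total = sum(v)
    v = [x / total for x in v]                      # normalise to sum to 1
    d43 = sum(vi * di for vi, di in zip(v, d))      # De Brouckere mean
    d32 = 1.0 / sum(vi / di for vi, di in zip(v, d))  # Sauter mean

    def percentile(p):
        """Diameter at cumulative volume fraction p (linear interpolation)."""
        cum, prev_cum, prev_d = 0.0, 0.0, d[0]
        for di, vi in zip(d, v):
            cum += vi
            if cum >= p:
                frac = (p - prev_cum) / (cum - prev_cum)
                return prev_d + frac * (di - prev_d)
            prev_cum, prev_d = cum, di
        return d[-1]

    d10, d50, d90 = percentile(0.10), percentile(0.50), percentile(0.90)
    return {"D43": d43, "D32": d32, "median": d50, "span": (d90 - d10) / d50}

# Hypothetical bins, for illustration only (not the paper's measurements):
diameters = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
volumes = [0.02, 0.08, 0.20, 0.30, 0.25, 0.10, 0.05]
stats = distribution_stats(diameters, volumes)
```

D[3,2] weights toward surface area and D[4,3] toward volume, so D[3,2] ≤ D[4,3] for any polydisperse sample, consistent with the 3.0 μm versus 3.2 μm values reported.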
Abstract:
The butanol-HCl spectrophotometric assay is widely used for quantifying extractable and insoluble condensed tannins (CT, syn. proanthocyanidins) in foods, feeds, and foliage of herbaceous and woody plants, but the method underestimates total CT content when applied directly to plant material. To improve CT quantitation, we tested various cosolvents with butanol-HCl and found that acetone increased anthocyanidin yields from two forage Lotus species having contrasting procyanidin and prodelphinidin compositions. A butanol-HCl-iron assay run with 50% (v/v) acetone gave linear responses with Lotus CT standards and increased estimates of total CT in Lotus herbage and leaves by up to 3.2-fold over the conventional method run without acetone. The use of thiolysis to determine the purity of CT standards further improved quantitation. Gel-state ¹³C and ¹H–¹³C HSQC NMR spectra of insoluble residues collected after butanol-HCl assays revealed that acetone increased anthocyanidin yields by facilitating complete solubilization of CT from tissue.
Abstract:
Contemporary artists exploring Jewish identity in the UK are caught, broadly speaking, between two exclusions: an art community that sees itself as 'post-identity' and a 'black' art scene that revolves around the organizations that emerged out of the identity debates of the 1980s and 1990s, namely Iniva, Third Text and Autograph. These organizations and those debates don't usually include Jewish identity within their remit, as Jewish artists are considered to be well represented in the British art scene and, in any case, white. Out of these assumptions, questions arise in relation to the position of Jews in Britain and what is at stake for an artist in exploring Jewish identity in their work. There is considerable scholarship, relatively speaking, on art and Jewish identity in the US (such as Lisa Bloom, Norman Kleeblatt and Catherine Sousslouf), which informs the debates on visual culture and Jews. In this chapter, I will be drawing out some of the distinctions between the US and the UK debates within my analysis, building on my own writing over the last ten years as well as the work of Juliet Steyn, Jon Stratton and Griselda Pollock. In short, this chapter aims to explore the problematic of what Jewish identity can offer the viewer as art, what place such art inhabits within a wider artistic context and how, if at all, it is received. There is a predominance of lens-based work exploring identity that arises out of the provenance of feminist practices and the politics of documentary, which will be important in the framing of the work. I do not aim to consider what constitutes a Jewish artist; that has been done elsewhere and is an inadequate and somewhat spurious conversation. I will also not be focusing on artists whose intention is to celebrate an unproblematised Jewishness (however that is constituted in any given work). Recent artworks and scholarship have in any case rendered the trumpeting of attachment to any singular identity anachronistic at best.
I will focus on artists working in the UK who incorporate questions of Jewishness into a larger visual enquiry that builds on Judith Butler's notion of identity as process or performative, as well as the more recent debates and artwork that consider the intersectionality of identifications that co-constitute provisional identities (Jones, Modood, Sara Ahmed, Braidotti/Nikki S Lee, Glenn Ligon). The case studies used to think through these questions of identity will be artworks by Susan Hiller, Doug Fishbone and Suzanne Triester. In thinking through works by these artists, I will also contextualise them, situating them briefly within the history of the landmark UK exhibition Rubies and Rebels and the work of Ruth Novaczek, Lily Markewitz, Oreet Ashery and myself.
Abstract:
In Part I of this study it was shown that moving from a moisture-convergent to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden–Julian Oscillation (MJO) in the ECMWF model. However, the application of traditional MJO diagnostics was not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics are applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid troposphere. Due to the modified precipitation–moisture relationship more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between the convective heating and the large-scale wave forcing associated with the MJO.
Abstract:
A deeper understanding of random markers is important if they are to be employed for a range of objectives. The sequence-specific amplified polymorphism (S-SAP) technique is a powerful genetic analysis tool which exploits the high copy number of retrotransposon long terminal repeats (LTRs) in the plant genome. The distribution and inheritance of S-SAP bands in the barley genome were studied using the Steptoe × Morex (S × M) doubled haploid (DH) population. Six S-SAP primer combinations generated 98 polymorphic bands, and map positions were assigned to all but one band. Eight putative co-dominant loci were detected, representing 16 of the mapped markers; thus at least 81 of the mapped S-SAP loci were dominant. The markers were distributed along all seven chromosomes, and a tendency to cluster was observed. The distribution of S-SAP markers over the barley genome concurs with the known high copy number of retrotransposons in plants. This experiment has demonstrated the potential for the S-SAP technique to be applied in a range of analyses such as genetic fingerprinting, marker-assisted breeding, biodiversity assessment and phylogenetic analyses.
Abstract:
Introgression in Festulolium is a potentially powerful tool to isolate genes for a large number of traits which differ between Festuca pratensis Huds. and Lolium perenne L. Not only are hybrids between the two species fertile, but the two genomes can be distinguished by genomic in situ hybridisation, and a high frequency of recombination occurs between homoeologous chromosomes and chromosome segments. By a programme of introgression and a series of backcrosses, L. perenne lines have been produced which contain small F. pratensis substitutions. This material is a rich source of polymorphic markers targeted towards any trait carried on the F. pratensis substitution not observed in the L. perenne background. We describe here the construction of an F. pratensis BAC library, which establishes the basis of a map-based cloning strategy in L. perenne. The library contains 49,152 clones, with an average insert size of 112 kbp, providing coverage of 2.5 haploid genome equivalents. We have screened the library for eight amplified fragment length polymorphism (AFLP) derived markers known to be linked to an F. pratensis gene introgressed into L. perenne and conferring a stay-green phenotype as a consequence of a mutation in primary chlorophyll catabolism. While for four of the markers it was possible to identify bacterial artificial chromosome (BAC) clones, the other four AFLPs were too repetitive to enable reliable identification of locus-specific BACs. Moreover, when the four BACs were partially sequenced, no obvious coding regions could be identified. This contrasted with BACs identified using cDNA sequences, when multiple genes were identified on the same BAC.
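The stated 2.5x coverage ties the clone count and mean insert size to the haploid genome size via the standard relation coverage = (clones × insert size) / genome size. Rearranging gives the genome size implied by the abstract's own numbers (a back-of-envelope consistency check, not a value stated in the text):

```python
# Figures as stated in the abstract.
clones = 49_152        # BAC clones in the library
insert_kbp = 112       # average insert size, kbp
coverage = 2.5         # haploid genome equivalents

# Implied haploid genome size, in Mbp, consistent with the stated coverage:
genome_mbp = clones * insert_kbp / 1000 / coverage
```

The implied value, roughly 2,200 Mbp, is what the three stated figures jointly require.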
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
Abstract:
A detailed spectrally-resolved extraterrestrial solar spectrum (ESS) is important for line-by-line radiative transfer modeling in the near-infrared (near-IR). Very few observationally based high-resolution ESS are available in this spectral region; consequently, the theoretically calculated ESS of Kurucz has been widely adopted. We present the CAVIAR (Continuum Absorption at Visible and Infrared Wavelengths and its Atmospheric Relevance) ESS, which is derived using the Langley technique applied to calibrated observations from a ground-based high-resolution Fourier transform spectrometer (FTS) in atmospheric windows from 2000–10000 cm⁻¹ (1–5 μm). There is good agreement in the strengths and positions of solar lines between the CAVIAR and the satellite-based ACE-FTS (Atmospheric Chemistry Experiment-FTS) ESS in the spectral region where they overlap, and good agreement with other ground-based FTS measurements in two near-IR windows. However, there are significant differences in structure between the CAVIAR ESS and spectra from semi-empirical models. In addition, we found a difference of up to 8 % in the absolute (and hence the wavelength-integrated) irradiance between the CAVIAR ESS and that of Thuillier et al., which was based on measurements from the Atmospheric Laboratory for Applications and Science satellite and other sources. In many spectral regions this difference is significant, as the coverage factor k = 2 (95 % confidence limit) uncertainties in the two sets of observations do not overlap. Since the total solar irradiance is relatively well constrained, if the CAVIAR ESS is correct then this would indicate an integrated 'loss' of solar irradiance of about 30 W m⁻² in the near-IR that would have to be compensated by an increase at other wavelengths.
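The Langley technique mentioned here fits the Beer–Lambert relation ln V(m) = ln V0 − τm over a range of airmasses m and extrapolates to m = 0 to recover the top-of-atmosphere signal V0. The sketch below shows the idea on synthetic data; it is a minimal illustration of the principle, not the CAVIAR calibration pipeline.

```python
import math

def langley_extrapolation(airmass, signal):
    """Langley plot: ln(V) = ln(V0) - tau * m. A least-squares line through
    (m, ln V) gives the extraterrestrial signal V0 at airmass m = 0 and the
    atmospheric optical depth tau. Assumes tau is constant over the series."""
    x = airmass
    y = [math.log(v) for v in signal]
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope   # (V0, tau)

# Synthetic clear-sky series with V0 = 100 and tau = 0.2 (invented values):
m = [1.0, 1.5, 2.0, 2.5, 3.0]
v = [100 * math.exp(-0.2 * mi) for mi in m]
v0, tau = langley_extrapolation(m, v)
```

The method's key assumption, a stable atmosphere (constant τ) during the observation sequence, is why such calibrations are restricted to atmospheric window regions and clear-sky conditions.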