966 results for room set up
Abstract:
As part of the Sentinel-3 mission and in order to ensure the highest quality of products, ESA, in cooperation with EUMETSAT, has set up the Sentinel-3 Mission Performance Centre (S-3 MPC). This facility is part of the Payload Data Ground Segment (PDGS) and aims to control the quality of all generated products, from L0 to L2. The S-3 MPC comprises a Coordinating Centre (CC), which hosts the core infrastructure and is in charge of the main routine activities (especially the quality control of data) and the overall service management. Expert Support Laboratories (ESLs) are involved in calibration and validation activities and provide specific assessments of the products (e.g., analysis of trends, ad hoc analysis of anomalies). The S-3 MPC interacts with the Processing and Archiving Centres (PACs) and the Marine Centre at EUMETSAT.
Abstract:
Measures of prevention and control against polycyclic aromatic hydrocarbons (PAHs) focus on official food control, a code of best practice to reduce PAH levels by controlling industry, and the development of a chemopreventive strategy. Regulation (EU) 835/2011 establishes maximum levels of PAHs for each food group. In addition, Regulations (EU) 333/2007 and 836/2011 set up the methods of sampling and analysis for their official control. Scientific studies prove that the chemopreventive strategy is effective against the effects of these genotoxic compounds. Most chemopreventive compounds studied with proven protective effects against PAHs are found in fruit and vegetables.
Abstract:
This study examined whether adding spin to a ball in the free kick situation in football affects a professional footballer's perception of the ball's future arrival position. Using a virtual reality set-up, participants observed the flight paths of aerodynamically realistic free kicks with (+/- 600 rpm) and without sidespin. With the viewpoint being fixed in the centre of the goal, participants had to judge whether the ball would have ended up in the goal or not. Results show that trajectories influenced by the Magnus force caused by sidespin gave rise to a significant shift in the percentage of goal responses. The resulting acceleration that causes the ball to continually change its heading direction as the trajectory unfolds does not seem to be taken into account by the participants when making goal judgments. We conclude that the visual system is not attuned to such accelerated motion, which may explain why goalkeepers appear to misjudge the future arrival point of such curved free kicks.
Abstract:
The corrosion of reinforcement in bridge deck slabs has been the cause of major deterioration and high costs in repair and maintenance. This problem could be overcome by reducing the amount of reinforcement and/or altering its location. This is possible because, in addition to the strength provided by the reinforcement, bridge deck slabs have an inherent strength due to the in-plane arching forces set up as a result of restraint provided by the slab boundary conditions. This is known as arching action or Compressive Membrane Action (CMA). It has been recognised for some time that laterally restrained slabs exhibit strengths far in excess of those predicted by most design codes, but the phenomenon has not been recognised by the majority of bridge design engineers. This paper presents the results of laboratory tests on fifteen reinforced concrete slab strips typical of a bridge deck slab and compares them to predicted strengths using the current codes and CMA theory. The tests showed that the strength of laterally restrained slabs is sensitive to both the degree of external lateral restraint and the concrete compressive strength. The tests particularly highlighted the benefits in strength obtained from very high strength concrete slabs. The theory extends the existing knowledge of CMA in slabs with concrete compressive strengths up to 100 N/mm² and promotes more economical and durable bridge deck construction by utilising the benefits of high strength concrete.
Abstract:
There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the ‘utilization’, which is defined as the percentage of available ‘seat-hours’ that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise on how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question ‘Can this given set of courses all be allocated in the available teaching space?’ we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is ‘almost always yes’ and those of ‘almost always no’.
Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints and when it is statistically likely that it will be possible for a set of courses to be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
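The ‘utilization’ measure and the allocation decision question above can be sketched in a few lines. This is an illustrative toy under stated assumptions: the greedy first-fit allocation rule and the example numbers are ours, not the authors' model.

```python
def utilization(occupied_seat_hours, available_seat_hours):
    """Utilization = occupied seat-hours / available seat-hours."""
    return occupied_seat_hours / available_seat_hours

def can_allocate(course_sizes, room_capacities):
    """Greedy first-fit check (an assumed rule, not the paper's method):
    can every course be given a room large enough to hold it?"""
    rooms = sorted(room_capacities)
    for size in sorted(course_sizes, reverse=True):
        # Give the largest remaining room to the largest remaining course;
        # fail as soon as the biggest room left is too small.
        if not rooms or rooms[-1] < size:
            return False
        rooms.pop()
    return True

# A room half full, half the time: 0.5 * 0.5 = 25% utilization.
print(utilization(occupied_seat_hours=10, available_seat_hours=40))  # 0.25
```

In this toy form, sweeping the total course demand upward and recording how often `can_allocate` succeeds is one way to observe the kind of sharp yes/no threshold the abstract describes.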
Abstract:
The aim of the 5-year European Union (EU)-Integrated Project GEnetics of Healthy Aging (GEHA), constituted by 25 partners (24 from Europe plus the Beijing Genomics Institute from China), is to identify genes involved in healthy aging and longevity, which allow individuals to survive to advanced old age in good cognitive and physical function and in the absence of major age-related diseases. To achieve this aim a coherent, tightly integrated program of research that unites demographers, geriatricians, geneticists, genetic epidemiologists, molecular biologists, bioinformaticians, and statisticians has been set up. The working plan is to: (a) collect DNA and information on the health status from an unprecedented number of long-lived 90+ sibpairs (n = 2650) and of younger ethnically matched controls (n = 2650) from 11 European countries; (b) perform a genome-wide linkage scanning in all the sibpairs (a total of 5300 individuals); this investigation will be followed by linkage disequilibrium mapping (LD mapping) of the candidate chromosomal regions; (c) study in cases (i.e., the 2650 probands of the sibpairs) and controls (2650 younger people), genomic regions (chromosome 4, D4S1564, chromosome 11, 11.p15.5) which were identified in previous studies as possible candidates to harbor longevity genes; (d) genotype all recruited subjects for apoE polymorphisms; and (e) genotype all recruited subjects for inherited as well as epigenetic variability of the mitochondrial DNA (mtDNA). The genetic analysis will be performed by 9 high-throughput platforms, within the framework of centralized databases for phenotypic, genetic, and mtDNA data. Additional advanced approaches (bioinformatics, advanced statistics, mathematical modeling, functional genomics and proteomics, molecular biology, molecular genetics) are envisaged to identify the gene variant(s) of interest.
The experimental design will also make it possible: (a) to identify gender-specific genes involved in healthy aging and longevity in women and men stratified for ethnic and geographic origin and apoE genotype; (b) to perform a longitudinal survival study to assess the impact of the identified genetic loci on the mortality of 90+ people; and (c) to develop mathematical and statistical models capable of combining genetic data with demographic characteristics, health status, socioeconomic factors, and lifestyle habits.
Abstract:
Background and purpose: Currently, optimal use of virtual simulation for all treatment sites is not entirely clear. This study presents data to identify specific patient groups for whom conventional simulation may be completely eliminated and replaced by virtual simulation. Sampling and method: Two hundred and sixty patients were recruited from four treatment sites (head and neck, breast, pelvis, and thorax). Patients were randomly assigned to be treated using the usual treatment process involving conventional simulation, or a treatment process differing only in the replacement of conventional plan verification with virtual verification. Data were collected on set-up accuracy at verification, and the number of unsatisfactory verifications requiring a return to the conventional simulator. A micro-economic costing analysis was also undertaken, whereby data for each treatment process episode were also collected: number and grade of staff present, and the time for each treatment episode. Results: The study shows no statistically significant difference in the number of returns to the conventional simulator for each site and study arm. Image registration data show similar quality of verification for each study arm. The micro-costing data show no statistical difference between the virtual and conventional simulation processes. Conclusions: At our institution, virtual simulation including virtual verification for the sites investigated presents no disadvantage compared to conventional simulation.
Abstract:
Key pre-distribution schemes have been proposed as a means to overcome Wireless Sensor Network constraints such as limited communication and processing power. Two sensor nodes can establish a secure link with some probability based on the information stored in their memories, but it is not always possible for two sensor nodes to set up a secure link. In this paper, we propose a new approach that elects trusted common nodes called "proxies", which reside on an existing secure path linking two sensor nodes. These proxies are used to send the generated key, which is divided into parts ("nuggets") according to the number of elected proxies. Our approach has been assessed against previously developed algorithms, and the results show that our algorithm more quickly discovers proxies that are closer to both end nodes, thus producing shorter path lengths. We have also assessed the impact of our algorithm on the average time to establish a secure link when the transmitter and receiver of the sensor nodes are "ON". The results show the superiority of our algorithm in this regard. Overall, the proposed algorithm is well suited for Wireless Sensor Networks.
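The "nuggets" idea, splitting a link key into as many parts as there are elected proxies so that no single proxy sees the whole key, can be illustrated with XOR secret sharing. The XOR scheme below is an assumption for illustration; the paper does not specify how the key is divided.

```python
import os

def split_key(key: bytes, n_proxies: int) -> list:
    """Split `key` into n_proxies XOR shares; every share is needed to rebuild it."""
    shares = [os.urandom(len(key)) for _ in range(n_proxies - 1)]
    # The final share is the key XORed with all the random shares,
    # so XORing every share back together recovers the key.
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def rebuild_key(shares: list) -> bytes:
    """XOR all shares together to recover the original key."""
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key
```

Each share ("nugget") would travel along a different proxy, so an attacker compromising fewer than all proxies learns nothing about the key.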
Abstract:
We introduce a novel scheme for one-way quantum computing (QC) based on the use of information-encoded qubits in an effective cluster-state resource. With the correct encoding structure, we show that it is possible to protect the entangled resource from phase-damping decoherence, where the effective cluster state can be described as residing in a decoherence-free subspace (DFS) of its supporting quantum system. One-way QC then requires either single- or two-qubit adaptive measurements. As an example of where this proposal can be realized, we describe an optical-lattice set-up where the scheme provides robust quantum information processing. We also outline how one can adapt the model to provide protection from other types of decoherence.
Abstract:
Anticoccidials are compounds that are widely used as feed additives to prevent and treat coccidiosis. They are licensed for use at a prescribed concentration and during a certain time interval for broilers and pullets, but not for laying hens. Carry-over at the feeding mill has previously been shown to be the main reason for the presence of residues in eggs. An animal experiment was set up to investigate the effect of carry-over at the feeding mill on the presence of residues of anticoccidials in eggs. For the compounds diclazuril, robenidine, halofuginone and nicarbazin in combination with narasin, two concentration levels were tested: the maximum allowed concentration for broilers (100%) and a concentration corresponding to 5% carry-over during feed preparation. Dimetridazole was also included in the experiment, but only at one concentration level. Eggs were sampled during treatment (14 days) and for a period of 30 days after withdrawal of the anticoccidial-containing feed. Residues were determined, and deposition and depletion curves were generated. Analyses were performed by ELISA and LC-MS/MS. For all compounds, substantial residues could be found in the 5% groups, which highlights the risk of carry-over at the feeding mill. The distribution of the residues between egg yolk and white was determined by analyzing both fractions.
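As a hedged illustration of what a depletion curve describes, a simple first-order decay model gives the residue concentration after withdrawal of the medicated feed. The exponential form and the half-life used below are assumptions for illustration; the paper's curves are derived empirically from the ELISA and LC-MS/MS measurements.

```python
import math

def residue(c0, k, t_days):
    """Residue concentration after t_days of withdrawal, assuming
    first-order depletion from an initial concentration c0."""
    return c0 * math.exp(-k * t_days)

# An assumed half-life of 5 days gives the rate constant k = ln(2)/5.
k = math.log(2) / 5
# After the 30-day withdrawal period (6 half-lives), ~1.6% of c0 remains.
print(round(residue(100.0, k, 30), 3))
```

Fitting such a model to the measured concentrations would yield one depletion curve per compound and egg fraction (yolk vs. white).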
Abstract:
We describe an experimental system designed for single-shot photoelectron spectroscopy on free atoms and molecules at the Free Electron Laser in Hamburg (FLASH at DESY). The combination of the extreme ultra-violet (EUV) Free Electron Laser and a temporally synchronized optical fs laser (Ti:Sapphire) enables a variety of two-color pump-probe experiments. The spectral, temporal and spatial characteristics of both the EUV FEL and the optical laser pulses, the experimental procedure to control their overlap, as well as the performance of an electron spectrometer used to obtain single-shot photoelectron spectra are discussed. As an illustration of the capabilities of this set-up, some results on two-photon two-color ionization of rare gases are presented.
Abstract:
Saturated output has been observed for both Ne- and Ni-like X-ray lasers when pumped in the transient mode. As these 'normal' transitions display very high gain, attempts have been made to observe a 2p → 2s inner-shell transition in Ne-like ions, which scales well towards the water window. Modelling of the pump conditions for Ge lasing at 6.2 nm is presented. As the predicted gain is low, the experiment was set up for 18 mm targets. Shots were taken on Ti, Fe, Ni and Ge. A ~1.5 ps travelling-wave pulse is applied at various times after the peak of a long, pre-forming pulse. Various pump conditions were attempted, but no inner-shell X-ray laser was detected.
Abstract:
H₃⁺ is the simplest triatomic molecule and plays an important role in laboratory and astrophysical plasmas. It is very stable both in terms of its electronic and nuclear degrees of freedom, but is difficult to study in depth in the laboratory due to its ionic nature. In this communication, experimental results are presented for the strong-field dissociation of the isotopic analogue D₃⁺, using 30 fs, 800 nm laser pulses with intensities up to 10¹⁶ W cm⁻². By employing a novel experimental set-up, ions were confined in an electrostatic ion trap so that dissociation of the molecule could be studied as it radiatively cools. It was determined that dissociation could only be observed for molecules in ro-vibrational states relatively close to the dissociation limit, while more tightly bound states demonstrated remarkable stability in even the strongest fields.
Abstract:
In the manufacture of granular NPK fertilizer, the product is cooled before packaging and storage in moisture-proof bags. It has been shown that the temperature of the fertilizer prior to packing is significant: at high temperatures, drying of the granules takes place in the bag, which causes an increase in the humidity of the air surrounding the granules and thus an increase in moisture content at the granule-granule interface. This surface moisture was shown to increase the likelihood of agglomeration in the fertilizer by a capillary adhesion/unconfined yield stress model. An iterative model was set up to establish conditions that would prevent drying from occurring, which takes into account the fertilizer drying rate, the fertilizer cooling rate, and the effect of coating oils on the drying mechanism.