18 results for "Chemical processes Data processing"

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial email (UCE) received by customers as a consequence of registering and submitting personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveal a significant gap between EU data protection law in theory and in practice. Although a wide majority of the websites in the sample (69%) have a system in place to ask for separate consent to engage in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The UCE test shows that only one in three websites (30.5%) respects the data subject's wish not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standards of implementation, information, and supervision applied by the UK authorities are inadequate, especially in light of the clarifications provided at EU level.

Relevance: 100.00%

Abstract:

Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison with expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
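
The paper's pipeline is implemented as an R function; purely as an illustration of the generic steps it describes (log transformation, per-array normalisation, replicate-based error estimates), here is a minimal Python sketch. The simulated data, the median-centring choice and the noise level are assumptions for illustration, not the paper's method or numbers.

```python
# Sketch of a generic extract/transform/normalise pipeline for replicate arrays.
# Not the paper's R implementation; the simulated inputs are assumptions.
import numpy as np

def normalise(raw):
    """log2-transform and median-centre each array (rows = arrays, cols = genes)."""
    logged = np.log2(raw)
    return logged - np.median(logged, axis=1, keepdims=True)

# Replicate measurements of the same reference sample on four arrays.
rng = np.random.default_rng(1)
true_signal = rng.uniform(6, 14, size=1000)                    # per-gene log2 level
replicates = 2.0 ** (true_signal + rng.normal(0, 0.5, size=(4, 1000)))

norm = normalise(replicates)
interarray_sd = norm.std(axis=0).mean()                        # SD across replicate arrays
print(f"inter-array SD ~ {interarray_sd:.2f} log2 units")
```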

Relevance: 100.00%

Abstract:

The motion of a car is described using a stochastic model in which the driving processes are the steering angle and the tangential acceleration. The model incorporates exactly the kinematic constraint that the wheels do not slip sideways. Two filters based on this model have been implemented, namely the standard extended Kalman filter (EKF) and a new filter (the CUF) in which the expectation and the covariance of the system state are propagated accurately. Experiments show that (i) the CUF is better than the EKF at predicting future positions of the car; and (ii) the filter outputs can be used to control the measurement process, leading to an improved ability to recover from errors in predictive tracking.
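
For reference, the no-side-slip constraint is commonly expressed through the standard kinematic "bicycle" car model; the form below (wheelbase $L$, heading $\theta$, speed $v$, steering angle $\phi$, tangential acceleration $a$) is a plausible reading of the model, though the paper's exact parametrisation may differ:

```latex
\dot{x} = v\cos\theta, \qquad
\dot{y} = v\sin\theta, \qquad
\dot{\theta} = \frac{v}{L}\tan\phi, \qquad
\dot{v} = a
```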

Relevance: 100.00%

Abstract:

A driver controls a car by turning the steering wheel or by pressing on the accelerator or the brake. These actions are modelled by Gaussian processes, leading to a stochastic model for the motion of the car. The stochastic model is the basis of a new filter for tracking and predicting the motion of the car, using measurements obtained by fitting a rigid 3D model to a monocular sequence of video images. Experiments show that the filter easily outperforms traditional filters.
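
As one plausible reading of how such a stochastic model supports prediction, the Python sketch below samples Gaussian steering and acceleration noise and propagates each sample through the kinematic no-side-slip model; all parameter values and the white-noise simplification are illustrative assumptions, not the paper's method.

```python
# Monte Carlo sketch of a stochastic car-motion model: steering angle and
# tangential acceleration are drawn from Gaussian noise, and the kinematic
# (no side-slip) model propagates each sample forward in time.
import numpy as np

rng = np.random.default_rng(0)

def propagate(state, phi, a, L=2.5, dt=0.04):
    """One step of the kinematic car model for state (x, y, heading, speed)."""
    x, y, theta, v = state
    x += v * np.cos(theta) * dt
    y += v * np.sin(theta) * dt
    theta += (v / L) * np.tan(phi) * dt
    v += a * dt
    return np.array([x, y, theta, v])

def predict(state, n_steps=25, n_samples=1000, sigma_phi=0.02, sigma_a=0.5):
    """Distribution of future positions under Gaussian steering/acceleration."""
    samples = np.tile(state, (n_samples, 1))
    for _ in range(n_steps):
        phi = rng.normal(0.0, sigma_phi, n_samples)   # steering noise
        a = rng.normal(0.0, sigma_a, n_samples)       # acceleration noise
        samples = np.array([propagate(s, p, ai)
                            for s, p, ai in zip(samples, phi, a)])
    return samples[:, :2]  # predicted (x, y) positions

positions = predict(np.array([0.0, 0.0, 0.0, 10.0]))
print("predicted position mean:", positions.mean(axis=0))
print("predicted position SD:  ", positions.std(axis=0))
```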

Relevance: 100.00%

Abstract:

Ozone profiles from the Microwave Limb Sounder (MLS) onboard the Aura satellite of NASA's Earth Observing System (EOS) were experimentally added to the European Centre for Medium-Range Weather Forecasts (ECMWF) four-dimensional variational (4D-Var) data assimilation system (version CY30R1), in which total ozone columns from the Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY (SCIAMACHY) onboard the Envisat satellite and partial profiles from the Solar Backscatter Ultraviolet (SBUV/2) instrument onboard the NOAA-16 satellite were already operationally assimilated. As shown by results for the autumn of 2005, the additional constraints from MLS data significantly improved the agreement of the analyzed ozone fields with independent observations throughout most of the stratosphere, owing to the daily near-global coverage and good vertical resolution of the MLS observations. The largest impacts were seen in the middle and lower stratosphere, where model deficiencies could not be effectively corrected by the operational observations without the additional information on the ozone vertical distribution provided by MLS. Even in the upper stratosphere, where ozone concentrations are mainly determined by rapid chemical processes, the dense and vertically resolved MLS data helped reduce the biases related to model deficiencies. These improvements resulted in a more realistic and consistent description of spatial and temporal variations in stratospheric ozone, as demonstrated by cases in dynamically and chemically active regions. However, combined assimilation of the often discrepant ozone observations might lead to underestimation of tropospheric ozone. In addition, model deficiencies induced large biases in the upper stratosphere in the medium-range (5-day) ozone forecasts.
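
For context, 4D-Var systems of this kind minimise a cost function of the standard form below; the notation is generic (background state $x_b$, background-error covariance $\mathbf{B}$, observations $y_i$, observation operators $H_i$, observation-error covariances $\mathbf{R}_i$, model propagation $M_{0 \to i}$) and is not quoted from the paper or the ECMWF documentation:

```latex
J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\mathrm{T}}\,\mathbf{B}^{-1}\,(x_0 - x_b)
       + \tfrac{1}{2}\sum_{i}\big(H_i(x_i) - y_i\big)^{\mathrm{T}}\,\mathbf{R}_i^{-1}\,\big(H_i(x_i) - y_i\big),
\qquad x_i = M_{0 \to i}(x_0)
```

Adding the MLS profiles enlarges the set of $(y_i, H_i)$ terms constraining the analysis, which is how the vertically resolved information enters.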

Relevance: 100.00%

Abstract:

The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) make it a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the "observed" climatology and the "true" climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (each contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data, and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
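
Taking the quoted contributions at face value and assuming the two error sources are independent (our assumption, not a statement from the paper), they combine in quadrature:

```latex
\sigma_{\mathrm{total}} = \sqrt{\sigma_{\mathrm{obs}}^{2} + \sigma_{\mathrm{samp}}^{2}}
\approx \sqrt{(0.2\,\mathrm{K})^{2} + (0.2\,\mathrm{K})^{2}} \approx 0.28\,\mathrm{K}
```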

Relevance: 100.00%

Abstract:

A recent area of investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the 'brain') connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and an increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish, a multi-electrode array (MEA); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.
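
As an illustration of the closed-loop idea only: the sketch below is not the MEABench API or the authors' tools; `read_spike_counts` and `stimulate` are hypothetical stand-ins for whatever acquisition and stimulation calls a real toolchain provides, here stubbed out so the sketch runs.

```python
# Hypothetical closed-loop stimulation sketch. Function names are illustrative
# placeholders, NOT the real MEABench interface.
import random
import time

def read_spike_counts(n_electrodes=60):
    """Stand-in for the acquisition side: simulated spike counts per electrode."""
    return [random.randint(0, 15) for _ in range(n_electrodes)]

def stimulate(electrode, waveform):
    """Stand-in for the stimulation side: would apply `waveform` on `electrode`."""
    print(f"stimulating electrode {electrode} with {waveform}")

def closed_loop(n_windows=5, threshold=10, waveform="custom biphasic pulse"):
    for _ in range(n_windows):
        counts = read_spike_counts()           # activity in the last window
        for electrode, n in enumerate(counts):
            if n > threshold:                  # respond to neuronal activity
                stimulate(electrode, waveform)
        time.sleep(0.01)                       # ~10 ms loop: near real time

closed_loop()
```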

Relevance: 100.00%

Abstract:

In biological processes, protein molecules and other small molecules interact to carry out their functions, forming transient macromolecular complexes. This interaction of two or more molecules can be described as a docking event. Docking is an important phase of structure-based drug design strategies, as it can be used to simulate protein-ligand interactions. Various docking programs allow automated docking, but most of them offer limited visualization and user interaction. It would be advantageous if scientists could visualize the molecules participating in the docking process, manipulate their structures and manually dock them in an immersive environment before submitting the new conformations to an automated docking process; this can help stimulate the design/docking process and could greatly reduce docking time and resource use. To achieve this, we propose a new virtual modelling/docking program that merges the advantages of virtual modelling programs with the efficiency of the algorithms in existing docking programs.

Relevance: 100.00%

Abstract:

GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.

Relevance: 100.00%

Abstract:

The ground-surface net solar radiation is the energy that drives physical and chemical processes at the ground surface. In this paper, multi-spectral data from the Landsat-5 TM, topographic data from a gridded digital elevation model, field measurements, and the atmosphere model LOWTRAN 7 are used to estimate surface net solar radiation over the FIFE site. First, an improved method is presented and used to calculate the total surface incoming radiation. Then, surface albedo is integrated from surface reflectance factors derived from the remotely sensed Landsat-5 TM data. Finally, surface net solar radiation is calculated by subtracting the surface upwelling radiation from the total surface incoming radiation.
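
In the shortwave, this final step reduces to the usual relation between net radiation, broadband albedo $\alpha$ and the incoming flux (a standard identity, stated here for clarity rather than quoted from the paper):

```latex
R_{\mathrm{net}} = R_{\downarrow} - R_{\uparrow} = (1 - \alpha)\,R_{\downarrow}
```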

Relevance: 100.00%

Abstract:

Following on from the companion study (Johnson et al., 2006), a photochemical trajectory model (PTM) has been used to simulate the chemical composition of organic aerosol for selected events during the 2003 TORCH (Tropospheric Organic Chemistry Experiment) field campaign. The PTM incorporates the speciated emissions of 124 non-methane anthropogenic volatile organic compounds (VOC) and three representative biogenic VOC, a highly detailed representation of the atmospheric degradation of these VOC, the emission of primary organic aerosol (POA) material and the formation of secondary organic aerosol (SOA) material. SOA formation was represented by the transfer of semi- and non-volatile oxidation products from the gas phase to a condensed organic aerosol phase, according to estimated thermodynamic equilibrium phase-partitioning characteristics for around 2000 reaction products. After significantly scaling all phase-partitioning coefficients, and assuming a persistent background organic aerosol (both required in order to match the observed organic aerosol loadings), the detailed chemical composition of the simulated SOA has been investigated in terms of intermediate oxygenated species in the Master Chemical Mechanism, version 3.1 (MCM v3.1). For the various case studies considered, 90% of the simulated SOA mass comprises between ca. 70 and 100 multifunctional oxygenated species derived, in varying amounts, from the photooxidation of VOC of anthropogenic and biogenic origin. The anthropogenic contribution is dominated by aromatic hydrocarbons and the biogenic contribution by alpha- and beta-pinene (which also serve as surrogates for other emitted monoterpene species). The sensitivity of the simulated SOA mass to changes in the emission rates of anthropogenic and biogenic VOC has also been investigated for 11 case study events, and the results have been compared with the detailed chemical composition data. The role of accretion chemistry in SOA formation, and its implications for the results of the present investigation, is discussed.
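
The absorptive gas/particle equilibrium underlying this treatment is conventionally written in Pankow's form; the expression below is that standard framework (our reading of the abstract's description, not an equation quoted from the paper). For a species $i$ with partitioning coefficient $K_{p,i}$ and total absorbing organic mass concentration $M_0$, the condensed-phase fraction is:

```latex
F_{i} = \frac{K_{p,i}\,M_{0}}{1 + K_{p,i}\,M_{0}}
```

Scaling all $K_{p,i}$ upward, as the study does, shifts every such fraction toward the aerosol phase.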

Relevance: 100.00%

Abstract:

A photochemical trajectory model has been used to simulate the chemical evolution of air masses arriving at the TORCH field campaign site in the southern UK during late July and August 2003, a period which included a widespread and prolonged photochemical pollution episode. The model incorporates speciated emissions of 124 non-methane anthropogenic VOC and three representative biogenic VOC, coupled with a comprehensive description of the chemistry of their degradation. A representation of the gas/aerosol absorptive partitioning of ca. 2000 oxygenated organic species generated in the Master Chemical Mechanism (MCM v3.1) has been implemented, allowing simulation of the contribution to organic aerosol (OA) made by semi- and non-volatile products of VOC oxidation; emissions of primary organic aerosol (POA) and elemental carbon (EC) are also represented. Simulations of total OA mass concentrations in nine case study events (optimised by comparison with observed hourly-mean mass loadings derived from aerosol mass spectrometry measurements) imply that the OA can be ascribed to three general sources: (i) POA emissions; (ii) a "ubiquitous" background concentration of 0.7 µg m(-3); and (iii) gas-to-aerosol transfer of lower-volatility products of VOC oxidation generated by the regional-scale processing of emitted VOC, but with all partitioning coefficients increased by a species-independent factor of 500. The requirement to scale the partitioning coefficients, and the implied background concentration, are both indicative of the occurrence of chemical processes within the aerosol which allow the oxidised organic species to react by association and/or accretion reactions that generate even lower-volatility products, leading to a persistent, non-volatile secondary organic aerosol (SOA). The contribution of secondary organic material to the simulated OA results in significant elevations in the simulated ratio of organic carbon (OC) to EC, compared with the ratio of 1.1 assigned to the emitted components. For the selected case study events, [OC]/[EC] is calculated to lie in the range 2.7-9.8, values which are comparable with the high end of the range reported in the literature.
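
Since emitted OC and EC are tied to a fixed ratio of 1.1, the simulated elevation can be read as secondary organic carbon added on top of that primary baseline; schematically (our restatement of the abstract's logic, not an equation from the paper):

```latex
\frac{[\mathrm{OC}]}{[\mathrm{EC}]}
= \frac{[\mathrm{OC}]_{\mathrm{prim}} + [\mathrm{OC}]_{\mathrm{sec}}}{[\mathrm{EC}]}
= 1.1 + \frac{[\mathrm{OC}]_{\mathrm{sec}}}{[\mathrm{EC}]}
```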

Relevance: 100.00%

Abstract:

Urban boundary layers (UBLs) can be highly complex due to the heterogeneous roughness and heating of the surface, particularly at night. Due to a general lack of observations, it is not clear whether canonical models of boundary-layer mixing are appropriate for modelling air quality in urban areas. This paper reports Doppler lidar observations of turbulence profiles in the centre of London, UK, made as part of the second REPARTEE campaign in autumn 2007. The lidar-measured standard deviation of vertical velocity, averaged over 30 min intervals, generally compared well with in situ sonic anemometer measurements at 190 m on the BT telecommunications tower. During calm, nocturnal periods, the lidar underestimated turbulent mixing, due mainly to its limited sampling rate. The mixing height derived from the turbulence, and the aerosol layer height from the backscatter profiles, showed similar diurnal cycles, ranging from c. 300 to 800 m, and from c. 200 to 850 m under clear skies. The aerosol layer height was sometimes significantly different from the mixing height, particularly at night under clear skies. For convective and neutral cases, the scaled turbulence profiles resembled canonical results; this was less clear for the stable case. The lidar observations clearly showed enhanced mixing beneath stratocumulus clouds, reaching down on occasion to approximately half the daytime boundary-layer depth. On one occasion the nocturnal turbulent structure was consistent with a nocturnal jet, suggesting a stable layer. Given the general agreement between the observations and canonical turbulence profiles, mixing timescales were calculated, using existing models of turbulent mixing, for passive scalars released at street level to reach the BT Tower. It was estimated to take c. 10 min for scalars to diffuse up to 190 m, rising to between 20 and 50 min at night, depending on stability. Determination of mixing timescales is important when comparing physico-chemical processes acting on pollutant species measured simultaneously at both the ground and the BT Tower during the campaign. The three-week autumnal dataset provides evidence for occasional stable layers in central London, effectively decoupling surface emissions from the air aloft.
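
As a rough order-of-magnitude check on those timescales (our own illustrative scaling, not necessarily the mixing model used in the study), treating vertical transport as eddy diffusion with diffusivity $K_z$ gives:

```latex
t \sim \frac{z^{2}}{K_z}
\quad\Rightarrow\quad
K_z \sim \frac{(190\,\mathrm{m})^{2}}{600\,\mathrm{s}} \approx 60\,\mathrm{m^{2}\,s^{-1}}
```

so the quoted daytime value of c. 10 min corresponds to an eddy diffusivity of order tens of m² s⁻¹, with the longer nocturnal times implying correspondingly weaker mixing.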

Relevance: 100.00%

Abstract:

Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. Finally, we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
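
The frictional PV source underlying both the Ekman and baroclinic generation terms can be written in the standard (Haynes-McIntyre type) form; this is the textbook expression, included as context rather than quoted from the paper. For a friction force per unit mass $\boldsymbol{F}$, density $\rho$ and potential temperature $\theta$, the material PV tendency due to friction alone is:

```latex
\left.\frac{DP}{Dt}\right|_{\mathrm{friction}}
= \frac{1}{\rho}\,\nabla\theta \cdot \left(\nabla \times \boldsymbol{F}\right)
```

The Ekman contribution comes from the vertical component of $\nabla \times \boldsymbol{F}$ acting on the static stability, while the baroclinic contribution comes from its horizontal components acting on low-level horizontal $\theta$ gradients.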