29 results for street-level drug problems

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Street-level mean flow and turbulence govern the dispersion of gases away from their sources in urban areas. A suitable reference measurement in the driving flow above the urban canopy is needed both to understand and to model complex street-level flow for pollutant dispersion or emergency response purposes. In vegetation canopies, a reference at mean canopy height is often used, but it is unclear whether this is suitable for urban canopies. This paper evaluates the quality of reference measurements at roof-top level (height = H) and at height z = 9H = 190 m, and their ability to explain mean and turbulent variations of street-level flow. Fast-response wind data were measured at street canyon and reference sites during the six-week DAPPLE project field campaign in central London, UK, in spring 2004, and an averaging time of 10 min was used to distinguish recirculation-type mean flow patterns from turbulence. Flow distortion at each reference site was assessed by considering turbulence intensity and streamline deflection. Each reference was then used as the dependent variable in the model of Dobre et al. (2005), which decomposes street-level flow into channelling and recirculating components. The high reference explained more of the variability of the mean flow. Coupling of turbulent kinetic energy between street level and the reference was also stronger for the high reference than for the roof-top. This coupling was weaker overnight when the flow was stratified and turbulence was suppressed at the high reference site; however, such events were rare (<1% of data) over the six-week period. The potential usefulness of a centralised, high reference site in London was thus demonstrated, with application to emergency response and air quality modelling.
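
As an illustration of the kind of decomposition referred to here, the sketch below splits a reference wind into along-street (channelling) and across-street (recirculating) components. It is a minimal sketch only, not the Dobre et al. (2005) implementation; the function name, angle convention and the omission of any empirical scaling coefficients are assumptions.

```python
import numpy as np

def decompose_street_flow(u_ref, wind_dir_deg, street_axis_deg):
    """Project a reference wind speed onto along-street (channelling)
    and across-street (recirculating) directions.

    u_ref           : reference wind speed (m s-1), e.g. roof-top or 190 m
    wind_dir_deg    : direction of the reference wind (degrees)
    street_axis_deg : orientation of the street axis (degrees)

    Illustrative only; the angle convention and any empirical scaling
    coefficients of the Dobre et al. (2005) model are not reproduced here.
    """
    theta = np.deg2rad(wind_dir_deg - street_axis_deg)
    channelling = u_ref * np.cos(theta)    # drives along-street mean flow
    recirculating = u_ref * np.sin(theta)  # drives cross-street recirculation
    return channelling, recirculating

# Example: a 5 m/s reference wind at 30 degrees to the street axis
u_par, u_perp = decompose_street_flow(5.0, 240.0, 210.0)
print(f"channelling {u_par:.2f} m/s, recirculating {u_perp:.2f} m/s")
```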

Relevance:

100.00%

Publisher:

Abstract:

In the event of a release of toxic gas in the center of London, the emergency services would need to determine quickly the extent of the area contaminated. The transport of pollutants by turbulent flow within the complex street and building architecture of cities is not straightforward, and we might wonder whether it is at all possible to make a scientifically-reasoned decision. Here we describe recent progress from a major UK project, ‘Dispersion of Air Pollution and its Penetration into the Local Environment’ (DAPPLE, www.dapple.org.uk). In DAPPLE, we focus on the movement of airborne pollutants in cities by developing a greater understanding of atmospheric flow and dispersion within urban street networks. In particular, we carried out full-scale dispersion experiments in central London (UK) during 2003, 2004, 2007, and 2008 to address the extent of the dispersion of tracers following their release at street level. These measurements complemented previous studies because (i) our focus was on dispersion within the first kilometer from the source, when most of the material was expected to remain within the street network rather than being mixed into the boundary layer aloft, (ii) measurements were made under a wide variety of meteorological conditions, and (iii) central London represents a European, rather than North American, city geometry. Interpretation of the results from the full-scale experiments was supported by extensive numerical and wind tunnel modeling, which allowed more detailed analysis under idealized and controlled conditions. In this article, we review the full-scale DAPPLE methodologies and show early results from the analysis of the 2007 field campaign data.

Relevance:

100.00%

Publisher:

Abstract:

Eddy-covariance measurements of carbon dioxide fluxes were taken semi-continuously between October 2006 and May 2008 at 190 m height in central London (UK) to quantify emissions and study their controls. Inner London, with a population of 8.2 million (~5000 inhabitants per km²), is heavily built up, with 8% vegetation cover within the central boroughs. CO2 emissions were found to be controlled mainly by fossil fuel combustion (e.g. traffic, commercial and domestic heating). The measurement period allowed investigation of both diurnal patterns and seasonal trends. Diurnal averages of CO2 fluxes were found to be highly correlated with traffic; however, changes in heating-related natural gas consumption and, to a lesser extent, photosynthetic activity controlled the seasonal variability. Despite measurements being taken at ca. 22 times the mean building height, coupling with street level was adequate, especially during daytime. Night-time saw a higher occurrence of stable or neutral stratification, especially in autumn and winter, which resulted in data loss in post-processing. No significant difference was found between the annual estimate of net exchange of CO2 for the expected measurement footprint and the values derived from the National Atmospheric Emissions Inventory (NAEI), with daytime fluxes differing by only 3%. This agreement with the NAEI data also supported the use of the simple flux footprint model applied to the London site, and suggests that individual roughness elements did not significantly affect the measurements, owing to the large ratio of measurement height to mean building height.
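
For orientation, the sketch below shows the core eddy-covariance calculation implied here: the flux is the mean covariance of vertical-velocity and scalar fluctuations over an averaging block. It is a simplified illustration with synthetic data; the variable names are assumptions, and the real processing chain (coordinate rotation, despiking, density corrections, gap filling) is omitted.

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Kinematic eddy-covariance flux: the mean covariance of the
    fluctuations of vertical wind speed w (m s-1) and a scalar c
    (e.g. CO2 density) over one averaging block.

    Sketch only; real flux processing involves coordinate rotation,
    despiking, density (WPL) corrections and gap filling.
    """
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    w_prime = w - w.mean()   # fluctuations about the block mean
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

# Synthetic 10 Hz data for a 30 min block (18000 samples)
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, 18000)
c = 16.0 + 0.2 * w + rng.normal(0.0, 0.1, 18000)  # weakly correlated scalar
print(eddy_covariance_flux(w, c))
```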

Relevance:

100.00%

Publisher:

Abstract:

Urban boundary layers (UBLs) can be highly complex due to the heterogeneous roughness and heating of the surface, particularly at night. Due to a general lack of observations, it is not clear whether canonical models of boundary layer mixing are appropriate for modelling air quality in urban areas. This paper reports Doppler lidar observations of turbulence profiles in the centre of London, UK, taken as part of the second REPARTEE campaign in autumn 2007. The lidar-measured standard deviation of vertical velocity averaged over 30 min intervals generally compared well with in situ sonic anemometer measurements at 190 m on the BT telecommunications tower. During calm, nocturnal periods, the lidar underestimated turbulent mixing, due mainly to its limited sampling rate. The mixing height derived from the turbulence, and the aerosol layer height from the backscatter profiles, showed similar diurnal cycles ranging from c. 300 to 800 m, increasing to c. 200 to 850 m under clear skies. The aerosol layer height was sometimes significantly different from the mixing height, particularly at night under clear skies. For convective and neutral cases, the scaled turbulence profiles resembled canonical results; this was less clear for the stable case. Lidar observations clearly showed enhanced mixing beneath stratocumulus clouds, reaching down on occasion to approximately half the daytime boundary layer depth. On one occasion the nocturnal turbulent structure was consistent with a nocturnal jet, suggesting a stable layer. Given the general agreement between observations and canonical turbulence profiles, mixing timescales were calculated, using existing models of turbulent mixing, for passive scalars released at street level to reach the BT Tower. It was estimated to take c. 10 min for material to diffuse up to 190 m, rising to between 20 and 50 min at night, depending on stability. Determination of mixing timescales is important when comparing with physico-chemical processes acting on pollutant species measured simultaneously at the ground and at the BT Tower during the campaign. From the three-week autumnal dataset there is evidence for occasional stable layers in central London, effectively decoupling surface emissions from the air aloft.
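
The abstract does not specify which mixing-time models were used, so the sketch below only illustrates an order-of-magnitude scaling, under the assumption that the time for a street-level release to reach height z is roughly z divided by the standard deviation of vertical velocity; the sigma_w values are illustrative assumptions chosen to be consistent with the quoted c. 10 min and 20-50 min ranges.

```python
def mixing_timescale(z, sigma_w):
    """Order-of-magnitude time (s) for a street-level release to reach
    height z (m), assuming vertical spread grows at roughly the standard
    deviation of vertical velocity sigma_w (m s-1).

    This scaling is an assumption for illustration; the paper's actual
    mixing-time models are not reproduced here.
    """
    return z / sigma_w

# Daytime example: sigma_w ~ 0.3 m/s gives roughly 10 min to reach 190 m
print(mixing_timescale(190.0, 0.3) / 60.0)
# Stable night-time example with suppressed turbulence: roughly 40 min
print(mixing_timescale(190.0, 0.08) / 60.0)
```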

Relevance:

100.00%

Publisher:

Abstract:

This article looks at the controversial music genre Oi! in relation to youth cultural identity in late 1970s and early 1980s Britain. By examining the six compilation albums released to promote Oi! as a distinct strand of punk, it seeks to challenge prevailing dismissals of the genre as inherently racist or bound to the politics of the far right. Rather, Oi! – like punk more generally – was a contested cultural form. It was, moreover, centred primarily on questions of class and locality. To this end, Oi! sought to realize the working-class rebellion of punk’s early aesthetic: to give substance to its street-level pretensions and offer a genuine ‘song from the streets’.

Relevance:

100.00%

Publisher:

Abstract:

Holm oak (Quercus ilex), a widespread urban street tree in the Mediterranean region, is widely used as a biomonitor of persistent atmospheric pollutants, especially particulate-bound metals. Using lab- and field-based experimental approaches, we compared the leaf-level capacity for particle capture and retention of Q. ilex with that of other common Mediterranean urban trees: Quercus cerris, Platanus × hispanica, Tilia cordata and Olea europaea. All the methods applied were effective in quantifying particulate capture and retention, although they did not yield a single consistent ranking of species performance. Distinctive morphological features of the leaves led to differences in the species' ability to trap and retain particles of different size classes and to accumulate metals after exposure to traffic in an urban street. Overall, P. × hispanica and T. cordata showed the largest capture potential per unit leaf area for most model particles (Na+ and powder particles) and for street-level Cu and Pb, while Q. ilex performed intermediately. After wash-off experiments, P. × hispanica leaves had the greatest retention capacity among the tested species and O. europaea the lowest. We conclude that planting Platanus could be considered in Mediterranean urban environments, given its efficiency in accumulating and retaining airborne particulates; however, since atmospheric pollution is typically higher in winter, the evergreen Q. ilex represents a better year-round choice for mitigating the impact of airborne particulate pollutants.

Relevance:

30.00%

Publisher:

Abstract:

Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25,000 plant species, half of which are endemic. This rich biodiversity and the region's complex biogeographical and political issues make conservation a difficult task. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat levels. Nevertheless, despite recent efforts, this information remains incomplete and fragmented, and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interactions with existing databases. Where limited information is available, it can be used for prediction when linked, directly or indirectly, to externally built models. As well as being a predictive tool, today's GIS incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, or provide insight into landscape change, such as 3D visualization. Where resources are limited, it can assist in identifying sites of conservation priority or in resolving environmental conflicts (scenario building). Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics, and for practical management in a region under increasing pressure from human impact.

Relevance:

30.00%

Publisher:

Abstract:

Farming systems research is a multi-disciplinary, holistic approach to solving the problems of small farms. Small and marginal farmers are the core of the Indian rural economy, constituting 0.80 of the total farming community but possessing only 0.36 of the total operational land. The declining trend in per capita land availability poses a serious challenge to the sustainability and profitability of farming. Under such conditions, it is appropriate to integrate land-based enterprises such as dairy, fishery, poultry, duckery, apiary, and field and horticultural cropping within the farm, with the objective of generating adequate income and employment for these small and marginal farmers under a set of farm constraints and varying levels of resource availability and opportunity. The integration of different farm enterprises can be achieved with the help of a linear programming model. For the current review, integrated farming systems models were developed, by way of illustration, for the marginal, small, medium and large farms of eastern India using linear programming. Risk analyses were carried out for different levels of income and enterprise combinations. The fishery enterprise was shown to be less risk-prone, whereas the crop enterprise involved greater risk. In general, the degree of risk increased with increasing level of income. With increases in farm income and risk level, resource use efficiency increased. Medium and large farms proved more profitable than small and marginal farms, with higher levels of resource use efficiency and return per Indian rupee (Rs) invested. Among the different enterprises of integrated farming systems, a chain of interaction and resource flow was observed. In order to make farming profitable and improve resource use efficiency at the farm level, the synergy among interacting components of farming systems should be exploited. In the process of technology generation, transfer and other developmental efforts at the farm level (contrary to discipline- and commodity-based approaches, which tend to be piecemeal and carried out in isolation), it is desirable to place a whole-farm scenario before farmers to enhance their farm income, thereby motivating them towards more efficient and sustainable farming.
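
As an illustration of the kind of linear programme involved, the sketch below allocates land on a hypothetical one-hectare marginal farm among three enterprises subject to land and labour constraints. All enterprise names, margins and coefficients are invented for illustration and are not values from the review.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: hectares allocated to [crop, dairy fodder, fishery pond].
# All margins (Rs/ha) and resource coefficients are hypothetical.
net_margin = np.array([30_000.0, 45_000.0, 60_000.0])  # Rs per ha per year

# Constraints (A_ub @ x <= b_ub): total land and family labour
A_ub = np.array([
    [1.0, 1.0, 1.0],        # land used per ha of each enterprise
    [120.0, 300.0, 200.0],  # labour-days required per ha
])
b_ub = np.array([1.0, 260.0])  # 1 ha marginal farm, 260 labour-days available

# linprog minimizes, so negate the margins to maximize farm income
res = linprog(-net_margin, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 3, method="highs")

print("allocation (ha):", res.x.round(3))
print("annual net income (Rs):", round(-res.fun))
```

A real whole-farm model would add many more enterprises, equality constraints (e.g. fodder balances between crop and dairy components) and, as in the review, a separate risk analysis across income levels.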

Relevance:

30.00%

Publisher:

Abstract:

Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25,000 plant species, half of which are endemic. This rich biodiversity and the region's complex biogeographical and political issues make conservation a difficult task. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat levels. Nevertheless, despite recent efforts, this information remains incomplete and fragmented, and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interactions with existing databases. Where limited information is available, it can be used for prediction when linked, directly or indirectly, to externally built models. As well as being a predictive tool, today's GIS incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, or provide insight into landscape change, such as 3D visualization. Where resources are limited, it can assist in identifying sites of conservation priority or in resolving environmental conflicts (scenario building). Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics, and for practical management in a region under increasing pressure from human impact.

Relevance:

30.00%

Publisher:

Abstract:

The overall immunopathogenesis relevant to a large series of disorders caused by a drug or its associated hyperimmune condition is discussed, based upon examining the genetics of severe drug-induced bullous skin problems (sporadic idiosyncratic adverse events including Stevens-Johnson syndrome and toxic epidermal necrolysis). New results are presented from an exemplar study on shared precipitating and perpetuating inner causes with other related disease phenotypes, including aphthous stomatitis, Behçet's disease, erythema multiforme, Hashimoto's thyroiditis, pemphigus, periodic fevers, Sweet's syndrome and drug-induced multisystem hypersensitivity. A call is made for collaborative, wider demographic profiling and deeper immunotyping in future work.

Relevance:

30.00%

Publisher:

Abstract:

The overall immunopathogenesis relevant to a large series of disorders caused by a drug or its associated hyperimmune condition is discussed, based upon examination of the genetics of severe drug-induced bullous skin problems (sporadic idiosyncratic adverse events, including Stevens-Johnson syndrome and toxic epidermal necrolysis). An overarching pharmacogenetic schema is proposed. Immune cognition and early-effector processes are focused upon, and a challenging synthesis around systems evolution is explained through a variety of projective analogies. Etiology, human leukocyte antigen-B, immune stability, dysregulation, pharmacomimicry, viruses and an aggressive, ethnically differentiated 'karmic' response are discussed.

Relevance:

30.00%

Publisher:

Abstract:

It is now possible to assay a large number of genetic markers from patients in clinical trials in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems that arise from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and each individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We then extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible, as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
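
A simplified sketch of the fixed-sample-size version of such an omnibus test is given below: per-SNP treatment-by-genotype interaction F-statistics are combined (here simply summed, which is an assumption about the combining rule) and referred to a permutation distribution obtained by shuffling treatment labels, which preserves the SNP-SNP correlation. The continuous outcome, the permutation scheme and all function names are illustrative choices; the paper's sequential stopping boundaries are not reproduced.

```python
import numpy as np

def interaction_F(y, treat, snp):
    """F-statistic for the treatment-by-SNP interaction in a linear model
    y ~ treatment + snp + treatment:snp, with the SNP coded 0/1/2."""
    n = len(y)
    ones = np.ones(n)
    X_red = np.column_stack([ones, treat, snp])
    X_full = np.column_stack([ones, treat, snp, treat * snp])
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss_red, rss_full = rss(X_red), rss(X_full)
    df_num, df_den = 1, n - X_full.shape[1]
    return (rss_red - rss_full) / df_num / (rss_full / df_den)

def omnibus_permutation_p(y, treat, snps, n_perm=999, seed=None):
    """Global p-value for 'any gene-drug interaction': the per-SNP F
    statistics are summed and the sum is compared with its permutation
    distribution obtained by shuffling the treatment labels."""
    rng = np.random.default_rng(seed)
    stat = lambda t: sum(interaction_F(y, t, snps[:, j]) for j in range(snps.shape[1]))
    observed = stat(treat)
    exceed = sum(stat(rng.permutation(treat)) >= observed for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)

# Tiny synthetic example: 200 patients, 20 SNPs, no true interaction
rng = np.random.default_rng(1)
snps = rng.integers(0, 3, size=(200, 20)).astype(float)
treat = rng.integers(0, 2, size=200).astype(float)
y = 0.5 * treat + rng.normal(size=200)
print(omnibus_permutation_p(y, treat, snps, n_perm=199, seed=2))
```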

Relevance:

30.00%

Publisher:

Abstract:

It is generally acknowledged that population-level assessments provide a better measure of response to toxicants than assessments of individual-level effects. Population-level assessments generally require the use of models to integrate potentially complex data about the effects of toxicants on life-history traits, and to provide a relevant measure of ecological impact. Building on excellent earlier reviews, we briefly outline the modelling options in population-level risk assessment. Modelling is used to calculate population endpoints from available data, which is often about individual life histories, the ways that individuals interact with each other, the environment and other species, and the ways individuals are affected by pesticides. As population endpoints, we recommend the use of population abundance, population growth rate, and the chance of population persistence. We recommend two types of model: simple life-history models distinguishing two life-history stages, juveniles and adults; and spatially explicit individual-based landscape models. Life-history models are very quick to set up and run, and they provide a great deal of insight. At the other extreme, individual-based landscape models provide the greatest verisimilitude, albeit at the cost of greatly increased complexity. We conclude with a discussion of the implications of the severe problems of parameterising models.
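
A minimal sketch of the "two life-history stages" idea mentioned above: a 2x2 juvenile/adult projection matrix whose dominant eigenvalue is the population growth rate, with a hypothetical toxicant effect applied to one vital rate. All vital rates below are invented for illustration and are not taken from the review.

```python
import numpy as np

def growth_rate(juv_survival, maturation, adult_survival, fecundity):
    """Asymptotic population growth rate (dominant eigenvalue) of a
    two-stage (juvenile/adult) projection matrix.

    Columns are the current stage, rows the stage at the next census;
    all vital rates are per time step and purely illustrative.
    """
    A = np.array([
        [juv_survival * (1 - maturation), fecundity],       # juveniles
        [juv_survival * maturation,       adult_survival],  # adults
    ])
    return max(abs(np.linalg.eigvals(A)))

# Control population versus a hypothetical 30% toxicant reduction in fecundity
print(growth_rate(0.5, 0.4, 0.8, 2.0))        # growth rate above 1
print(growth_rate(0.5, 0.4, 0.8, 2.0 * 0.7))  # reduced growth rate
```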

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To assess the potential source of variation that surgeons may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy, involving 43 surgeons. The primary end point of the trial was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, i.e. a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted using the method of maximum likelihood in SAS®. Results: There were many convergence problems. These were resolved using a variety of approaches, including treating all effects as fixed for the initial model building, modelling the variance of a parameter on a logarithmic scale, and centring continuous covariates. The initial model-building process indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power; the variance estimates were small, with large standard errors, indicating that the precision of the variance estimates may be questionable.
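
To illustrate why a surgeon-level variance component is hard to pin down with a sparse binary outcome, the sketch below simulates patients nested within surgeons at roughly the trial's scale and compares the observed between-surgeon variation in complication rates with the variation expected from binomial sampling noise alone. All parameter values (the between-surgeon standard deviation in particular) are assumptions; this is a simulation sketch, not the trial's mixed logistic regression analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

n_surgeons, patients_per_surgeon = 43, 32   # roughly 1380 patients in total
sigma_u = 0.4                                # assumed between-surgeon SD on the logit scale
baseline_logit = np.log(0.10 / 0.90)         # ~10% overall complication rate

# Simulate surgeon random effects and patient-level binary outcomes
u = rng.normal(0.0, sigma_u, n_surgeons)
p = 1.0 / (1.0 + np.exp(-(baseline_logit + u)))        # per-surgeon risk
complications = rng.binomial(patients_per_surgeon, p)  # complication counts
obs_rate = complications / patients_per_surgeon

# Compare the observed variance of surgeon rates with the variance expected
# from binomial noise alone: with a sparse outcome the excess (the signal of
# a genuine surgeon effect) is small relative to the sampling noise.
p_hat = obs_rate.mean()
binomial_var = p_hat * (1 - p_hat) / patients_per_surgeon
print("observed between-surgeon variance:", obs_rate.var(ddof=1).round(5))
print("variance expected from sampling noise alone:", round(binomial_var, 5))
```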