Abstract:
Although in several EU Member States many public interventions have been running for the prevention and/or management of obesity and other nutrition-related health conditions, few have yet been formally evaluated. The multidisciplinary team of the EATWELL project will gather benchmark data on healthy eating interventions in EU Member States and review existing information on the effectiveness of interventions using a three-stage procedure: (i) assessment of the intervention's impact on consumer attitudes, consumer behaviour and diets; (ii) the impact of the change in diets on obesity and health; and (iii) the value attached by society to these changes, measured in life years gained, cost savings and quality-adjusted life years. Where evaluations have been inadequate, EATWELL will gather secondary data and analyse them with a multidisciplinary approach incorporating models from the psychology and economics disciplines. Particular attention will be paid to lessons learned in the private sector that are transferable to healthy eating campaigns in the public sector. Through consumer surveys and workshops with other stakeholders, EATWELL will assess the acceptability of the range of potential interventions. Armed with scientific quantitative evaluations of policy interventions and their acceptability to stakeholders, EATWELL expects to recommend more appropriate interventions for Member States and the EU, provide a one-stop guide to methods and measures in intervention evaluation, and outline data collection priorities for the future.
Abstract:
The Prony fitting theory is applied in this paper to solve the deconvolution problem. There are two cases in which deconvolution is prone to unstable solutions: (1) the frequency band of the known kernel is narrower than that of the unknown kernel; (2) noise is present. These two cases are studied thoroughly and the effectiveness of the Prony fitting method is shown. Finally, the method is simulated on a computer, and the simulation results are compared with those obtained by using the FFT method directly.
Abstract:
Evidence increasingly suggests that sub-Saharan Africa is at the center of human evolution and understanding routes of dispersal “out of Africa” is thus becoming increasingly important. The Sahara Desert is considered by many to be an obstacle to these dispersals and a Nile corridor route has been proposed to cross it. Here we provide evidence that the Sahara was not an effective barrier and indicate how both animals and humans populated it during past humid phases. Analysis of the zoogeography of the Sahara shows that more animals crossed via this route than used the Nile corridor. Furthermore, many of these species are aquatic. This dispersal was possible because during the Holocene humid period the region contained a series of linked lakes, rivers, and inland deltas comprising a large interlinked waterway, channeling water and animals into and across the Sahara, thus facilitating these dispersals. This system was last active in the early Holocene when many species appear to have occupied the entire Sahara. However, species that require deep water did not reach northern regions because of weak hydrological connections. Human dispersals were influenced by this distribution; Nilo-Saharan speakers hunting aquatic fauna with barbed bone points occupied the southern Sahara, while people hunting savannah fauna with the bow and arrow spread southward. The dating of lacustrine sediments shows that the “green Sahara” also existed during the last interglacial (∼125 ka) and provided green corridors that could have formed dispersal routes at a likely time for the migration of modern humans out of Africa.
Abstract:
Geographic distributions of pathogens are the outcome of dynamic processes involving host availability, susceptibility and abundance, suitability of climate conditions, and historical contingency including evolutionary change. Distributions have changed fast and are changing fast in response to many factors, including climatic change. The response time of arable agriculture is intrinsically fast, but perennial crops and especially forests are unlikely to adapt easily. Predictions of many of the variables needed to predict changes in pathogen range are still rather uncertain, and their effects will be profoundly modified by changes elsewhere in the agricultural system, including both economic changes affecting growing systems and hosts and evolutionary changes in pathogens and hosts. Tools to predict changes based on environmental correlations depend on good primary data, which is often absent, and need to be checked against the historical record, which remains very poor for almost all pathogens. We argue that at present the uncertainty in predictions of change is so great that the important adaptive response is to monitor changes and to retain the capacity to innovate, both by access to economic capital with reasonably long-term rates of return and by retaining wide scientific expertise, including currently less fashionable specialisms.
Abstract:
The intensity and distribution of daily precipitation is predicted to change under scenarios of increased greenhouse gases (GHGs). In this paper, we analyse the ability of HadCM2, a general circulation model (GCM), and a high-resolution regional climate model (RCM), both developed at the Met Office's Hadley Centre, to simulate extreme daily precipitation by reference to observations. A detailed analysis of daily precipitation is made at two UK grid boxes, where probabilities of reaching daily thresholds in the GCM and RCM are compared with observations. We find that the RCM generally overpredicts probabilities of extreme daily precipitation but that, when the GCM and RCM simulated values are scaled to have the same mean as the observations, the RCM captures the upper-tail distribution more realistically. To compare regional changes in daily precipitation in the GHG-forced period 2080-2100 in the GCM and the RCM, we develop two methods. The first considers the fractional changes in probability of local daily precipitation reaching or exceeding a fixed 15 mm threshold in the anomaly climate compared with the control. The second method uses the upper one-percentile of the control at each point as the threshold. Agreement between the models is better in both seasons with the latter method, which we suggest may be more useful when considering larger scale spatial changes. On average, the probability of precipitation exceeding the 1% threshold increases by a factor of 2.5 (GCM and RCM) in winter and by 1.7 (GCM) or 1.3 (RCM) in summer.
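The two threshold methods described above can be illustrated numerically. The sketch below uses synthetic gamma-distributed daily precipitation as a stand-in for the HadCM2/RCM output (the distributions, sample sizes and scaling factor are all assumptions for illustration only), comparing exceedance ratios for a fixed 15 mm threshold against a control-percentile threshold:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily precipitation (mm): gamma-distributed amounts, with the
# "anomaly" climate scaled up to mimic GHG-forced intensification.
control = rng.gamma(shape=0.8, scale=5.0, size=20 * 365)
anomaly = rng.gamma(shape=0.8, scale=6.0, size=20 * 365)

# Method 1: fixed absolute threshold (15 mm/day).
p_ctl_fixed = np.mean(control >= 15.0)
p_anom_fixed = np.mean(anomaly >= 15.0)
ratio_fixed = p_anom_fixed / p_ctl_fixed

# Method 2: threshold set to the control climate's upper one-percentile.
thr = np.percentile(control, 99.0)
ratio_pct = np.mean(anomaly >= thr) / np.mean(control >= thr)
```

Both ratios exceed one for an intensified climate, but the percentile-based threshold adapts to each location's own control distribution, which is why it can be more comparable across regions with very different baseline rainfall.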
Abstract:
This paper addresses the commercial leases policy issue of how to deal with small business tenants. The UK has adopted a voluntary solution to commercial lease reform by using Codes of Practice, in contrast to the legislative approach adopted by Australia to attempt to solve its perceived problems with small business retail tenancies. The aim of the research was to examine the perceptions of the effectiveness of the legislation in Australia and discuss any implications for the UK policy debate. The research used a combination of literature and legislation review and a semi-structured interview survey to investigate the policy aims and objectives of Australian Federal and State Governments, identify the nature and scope of the Australian legislation and examine perceptions of effectiveness of the legislation in informing small business tenants. The situation is complicated in Australia because leases are a State rather than Federal responsibility; the main fieldwork was therefore carried out in one case study State, Victoria. The paper concludes that some aspects of the Australian system can inform the UK policy debate, including mandatory information provision at the commencement of negotiations and the use of lease registrars/commissioners. However, there are a number of issues that the Australian legislation does not appear to have successfully addressed, including the difficulties of legislating across partial segments of the commercial property market and the collection of data for enforcement purposes.
Abstract:
Milk oligosaccharides are believed to have beneficial biological properties. Caprine milk has a relatively high concentration of oligosaccharides in comparison to other ruminant milks and has the closest oligosaccharide profile to human milk. The first stage in recovering oligosaccharides from caprine milk whey, a by-product of cheese making, was accomplished by ultrafiltration to remove proteins and fat globules, leaving more than 97% of the initial carbohydrates, mainly lactose, in the permeate. The ultrafiltered permeate was further processed using a 1 kDa ‘tight’ ultrafiltration membrane, which retained less than 7% of the remaining lactose. The final retentate was fractionated by preparative scale molecular size exclusion chromatography to yield 28 fractions, of which oligosaccharide-rich fractions, suitable for functionality and gut health promotion testing, were detected in fractions 9/10 through 16/17. All fractions were evaluated for their oligosaccharide and carbohydrate profiles using three complementary analytical methods.
Abstract:
Otto Neurath (1882–1945) wrote From hieroglyphics to Isotype during the last two years of his life and this is the first publication of the text in full, carefully edited from the original manuscripts. He called it a 'visual autobiography', in which he documents the importance of visual material to him from his earliest years to his professional activity with the picture language of Isotype. Neurath draws clear links between the stimulus he received as a boy from illustrated books, toys and exhibitions to the considered work in visual education that occupied him for the last two decades of his life. This engaging and informal account gives a rich picture of Central European culture around the turn of the twentieth century, seen through the eyes of Neurath's insatiable intelligence, as well as a detailed exposition of the technique of Isotype, a milestone of modern graphic design. This edition includes the numerous illustrations intended by Neurath to accompany his text, and is completed by an extensive appendix showing examples from the rich variety of graphic material that he collected.
Abstract:
Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.
Abstract:
Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2, to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence-to-structure alignment methods tested. In addition, using accurate global model quality for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
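The re-rank-then-cluster idea can be sketched numerically. The following is a crude illustration with hypothetical scores and similarities, not the ModFOLDclust2 or IntFOLD-TS pipeline: candidates are re-ranked by a global quality score, and the consensus model is the member of the retained pool with the highest mean pairwise similarity to the others.

```python
import numpy as np

rng = np.random.default_rng(2)
n_models = 6
scores = rng.random(n_models)                  # stand-in global quality scores
sim = rng.random((n_models, n_models))
sim = (sim + sim.T) / 2.0                      # symmetric pairwise similarity matrix
np.fill_diagonal(sim, 0.0)

top = np.argsort(scores)[::-1][:4]             # re-rank, keep the four best-scoring models
mean_sim = sim[np.ix_(top, top)].mean(axis=1)  # mean similarity within the retained pool
consensus = int(top[np.argmax(mean_sim)])      # cluster-consensus selection
```

The intuition, as in consensus-based quality assessment generally, is that a model resembling many other plausible models is less likely to be an outlier than the single top-scoring model.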
Abstract:
Remote transient changes in the environment, such as the onset of visual distractors, impact on the execution of target-directed saccadic eye movements. Studies that have examined the latency of the saccade response have shown conflicting results. When there was an element of target selection, saccade latency increased as the distance between distractor and target increased. In contrast, when target selection is minimized by restricting the target to appear on one axis position, latency has been found to be slowest when the distractor is shown at fixation, decreasing as the distractor moves away from this position rather than from the target. Here we report four experiments examining saccade latency as target and distractor positions are varied. We find support for both a dependence of saccade latency on distractor distance from target and from fixation: saccade latency was longer when the distractor was shown close to fixation and longer still when it was shown in the location opposite (180°) to the target. We suggest that this is due to inhibitory interactions between the distractor, fixation and the target interfering with fixation disengagement and target selection.
Abstract:
Now that stratospheric ozone depletion has been controlled by the Montreal Protocol, interest has turned to the effects of climate change on the ozone layer. Climate models predict an accelerated stratospheric circulation, leading to changes in the spatial distribution of stratospheric ozone and an increased stratosphere-to-troposphere ozone flux. Here we use an atmospheric chemistry climate model to isolate the effects of climate change from those of ozone depletion and recovery on stratosphere-to-troposphere ozone flux and the clear-sky ultraviolet radiation index—a measure of potential human exposure to ultraviolet radiation. We show that under the Intergovernmental Panel on Climate Change moderate emissions scenario, global stratosphere-to-troposphere ozone flux increases by 23% between 1965 and 2095 as a result of climate change. During this time, the clear-sky ultraviolet radiation index decreases by 9% in northern high latitudes — a much larger effect than that of stratospheric ozone recovery — and increases by 4% in the tropics, and by up to 20% in southern high latitudes in late spring and early summer. The latter increase in the ultraviolet index is equivalent to nearly half of that generated by the Antarctic ‘ozone hole’ that was created by anthropogenic halogens. Our results suggest that climate change will alter the tropospheric ozone budget and the ultraviolet index, which would have consequences for tropospheric radiative forcing, air quality and human and ecosystem health.
Abstract:
Many studies warn that climate change may undermine global food security. Much work on this topic focuses on modelling crop-weather interactions, but these models do not generally account for the ways in which socio-economic factors influence how harvests are affected by weather. To address this gap, this paper uses a quantitative harvest vulnerability index based on annual soil moisture and grain production data as the dependent variable in a Linear Mixed Effects model with national scale socio-economic data as independent variables for the period 1990-2005. Results show that rice, wheat and maize production in middle income countries were especially vulnerable to droughts. By contrast, harvests in countries with higher investments in agriculture (e.g. higher amounts of fertilizer use) were less vulnerable to drought. In terms of differences between the world's major grain crops, the factors that made rice and wheat crops vulnerable to drought were quite consistent, whilst those of maize crops varied considerably depending on the type of region. This likely reflects the fact that maize is produced under very different conditions worldwide. One recommendation for reducing drought vulnerability risks is coordinated development and adaptation policies, including institutional support that enables farmers to take proactive action.
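The regression setup can be sketched as follows, with ordinary least squares standing in for the paper's Linear Mixed Effects model and fully synthetic stand-ins for the soil-moisture and investment variables (coefficients and noise levels are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
soil_moisture = rng.normal(0.0, 1.0, n)   # drought proxy (standardized; low = dry)
fertilizer = rng.normal(0.0, 1.0, n)      # agricultural-investment proxy

# Synthetic harvest vulnerability: worse under drought, buffered by investment.
vuln = 0.6 * (-soil_moisture) - 0.3 * fertilizer + rng.normal(0.0, 0.2, n)

# Design matrix: intercept, drought exposure, investment.
X = np.column_stack([np.ones(n), -soil_moisture, fertilizer])
beta, *_ = np.linalg.lstsq(X, vuln, rcond=None)
# beta[1] > 0: drier conditions raise vulnerability;
# beta[2] < 0: investment lowers it, matching the abstract's finding.
```

A mixed-effects model would add country-level random intercepts (and possibly slopes) on top of this fixed-effects structure, which is what lets the paper separate crop- and region-specific variation from the common drought signal.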
Abstract:
A two-phase system composed of a leach bed and a methanogenic reactor was modified for the first time to improve volumetric substrate degradation and methane yields from a complex substrate (maize; Zea mays). The system, which was operated for consecutive feed cycles of different durations for 120 days, was highly flexible and its performance improved by altering operational conditions. Daily substrate degradation was higher the shorter the feed cycle, reaching 8.5 g TS destroyed d⁻¹ (7-day feed cycle), but overall substrate degradation was higher by up to 55% when longer feed cycles (14 and 28 days) were applied. The same occurred with volumetric methane yields, reaching 0.839 m³ m⁻³ d⁻¹. The system performed better than others on specific methane yields, reaching 0.434 m³ kg⁻¹ TS added in the 14-day and 28-day systems. The UASB and AF designs performed similarly as second-stage reactors on methane yields, SCOD and VFA removal efficiencies.
Abstract:
There has been considerable interest in the climate impact of trends in stratospheric water vapor (SWV). However, the representation of the radiative properties of water vapor under stratospheric conditions remains poorly constrained across different radiation codes. This study examines the sensitivity of a detailed line-by-line (LBL) code, a Malkmus narrow-band model and two broadband GCM radiation codes to a uniform perturbation in SWV in the longwave spectral region. The choice of sampling rate in wave number space (Δν) in the LBL code is shown to be important for calculations of the instantaneous change in heating rate (ΔQ) and the instantaneous longwave radiative forcing (ΔFtrop). ΔQ varies by up to 50% for values of Δν spanning 5 orders of magnitude, and ΔFtrop varies by up to 10%. In the three less detailed codes, ΔQ differs by up to 45% at 100 hPa and 50% at 1 hPa compared to a LBL calculation. This causes differences of up to 70% in the equilibrium fixed dynamical heating temperature change due to the SWV perturbation. The stratosphere-adjusted radiative forcing differs by up to 96% across the less detailed codes. The results highlight an important source of uncertainty in quantifying and modeling the links between SWV trends and climate.
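The sensitivity to the wavenumber sampling rate Δν can be illustrated with a toy example, not the actual line-by-line code: numerically integrating a single narrow Lorentzian line with a grid coarser than the line half-width badly biases the band-integrated absorption (line centre, half-width and grid spacings below are arbitrary assumptions).

```python
import numpy as np

def lorentz(nu, nu0=1000.0, gamma=0.01):
    """Normalized Lorentzian line centred at nu0 with half-width gamma (cm^-1)."""
    return (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)

# Fine grid resolves the line; the coarse spacing (0.05 cm^-1) is 5x the half-width.
fine = np.linspace(999.0, 1001.0, 200001)
coarse = np.linspace(999.0, 1001.0, 41)

h_fine = fine[1] - fine[0]
h_coarse = coarse[1] - coarse[0]
i_fine = lorentz(fine).sum() * h_fine        # close to the analytic (2/pi)*arctan(100)
i_coarse = lorentz(coarse).sum() * h_coarse  # a grid point sits on the line centre
```

Here the coarse grid happens to place a point on the line centre and so overestimates the integral; shifting the grid by half a spacing would instead underestimate it. Either way the result depends strongly on Δν, mirroring the sampling-rate sensitivity reported above for ΔQ and ΔFtrop.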