44 results for LARGE NUMBERS


Relevance: 60.00%

Abstract:

Diverticular disease (DD) is an age-related disorder of the large bowel which may affect half of the population over the age of 65 in the UK. This high prevalence ranks it as one of the most common bowel disorders in western nations. The majority of patients remain asymptomatic, but there are associated life-threatening co-morbidities which, given the large numbers of people with DD, translate into a considerable number of deaths per annum. Despite this public health burden, relatively little seems to be known about either the mechanisms of development or causality. In the 1970s, a model of DD formulated the concept that diverticula occur as a consequence of pressure-induced damage to the colon wall amongst those with a low intake of dietary fiber. In this review, we have examined the evidence regarding the influence of ageing, diet, inflammation and genetics on DD development. We argue that the evidence supporting the barotrauma hypothesis is largely anecdotal. We have also identified several gaps in the knowledge base which need to be filled before we can complete

Relevance: 60.00%

Abstract:

The possibilities and need for adaptation and mitigation depend on uncertain future developments with respect to socio-economic factors and the climate system. Scenarios are used to explore the impacts of different strategies under uncertainty. In this chapter, we present some of the scenarios used in the ADAM project for this purpose. One scenario explores developments with no mitigation, and thus with high temperature increase and high reliance on adaptation (leading to a 4 °C increase by 2100 compared to pre-industrial levels). A second scenario explores an ambitious mitigation strategy (leading to a 2 °C increase by 2100 compared to pre-industrial levels). In the latter scenario, stringent mitigation strategies effectively reduce the risks of climate change, but, given uncertainties in the climate system, a temperature increase of 3 °C or more cannot be excluded. The analysis shows that, in many cases, adaptation and mitigation are not trade-offs but supplements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but even then adaptation will be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which adaptation is more cost-effective than mitigation, but mitigation can help reduce damages and the cost of adaptation. For agriculture, finally, only the scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.

Relevance: 60.00%

Abstract:

Scenarios are used to explore the consequences of different adaptation and mitigation strategies under uncertainty. In this paper, two scenarios are used to explore developments with (1) no mitigation leading to an increase of global mean temperature of 4 °C by 2100 and (2) an ambitious mitigation strategy leading to 2 °C increase by 2100. For the second scenario, uncertainties in the climate system imply that a global mean temperature increase of 3 °C or more cannot be ruled out. Our analysis shows that, in many cases, adaptation and mitigation are not trade-offs but supplements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but adaptation will still be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which, from a global and purely monetary perspective, adaptation (up to 2100) seems more effective than mitigation. From the perspective of poorer and small island countries, however, stringent mitigation is necessary to keep risks at manageable levels. For agriculture, only a scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.

Relevance: 60.00%

Abstract:

The reduction of portfolio risk is important to all investors, but it is particularly important to real estate investors, as most property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of under-performance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk diminishes with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
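The Monte Carlo exercise described in this abstract can be sketched along the following lines. This is a minimal illustration with synthetic, normally distributed property returns rather than the IPD data; the universe size of 1,728 is the only figure taken from the abstract, and the mean and volatility are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the universe of property returns
# (the paper uses 1,728 IPD properties held over 1995-2004).
n_properties = 1728
returns = rng.normal(loc=0.09, scale=0.12, size=n_properties)

def simulate_portfolio_returns(size, n_trials=2000):
    """Return of an equally weighted portfolio of `size` randomly drawn properties."""
    out = np.empty(n_trials)
    for i in range(n_trials):
        out[i] = rng.choice(returns, size=size, replace=False).mean()
    return out

for size in (1, 10, 50, 200):
    sims = simulate_portfolio_returns(size)
    # Dispersion shrinks as size grows: down-side risk falls,
    # but so does the chance of out-performing by a wide margin.
    print(f"size={size:4d}  mean={sims.mean():+.3f}  std={sims.std():.3f}")
```

The shrinking standard deviation across portfolio sizes is the down-side/up-side trade-off the abstract describes: both tails of the simulated return distribution contract together.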

Relevance: 60.00%

Abstract:

This study investigated the potential application of mid-infrared spectroscopy (MIR 4,000–900 cm−1) for the determination of milk coagulation properties (MCP), titratable acidity (TA), and pH in Brown Swiss milk samples (n = 1,064). Because MCP directly influence the efficiency of the cheese-making process, there is strong industrial interest in developing a rapid method for their assessment. Currently, the determination of MCP involves time-consuming laboratory-based measurements, and it is not feasible to carry out these measurements on the large numbers of milk samples associated with milk recording programs. Mid-infrared spectroscopy is an objective and nondestructive technique providing rapid real-time analysis of food compositional and quality parameters. Analysis of milk rennet coagulation time (RCT, min), curd firmness (a30, mm), TA (SH°/50 mL; SH° = Soxhlet-Henkel degree), and pH was carried out, and MIR data were recorded over the spectral range of 4,000 to 900 cm−1. Models were developed by partial least squares regression using untreated and pretreated spectra. The MCP, TA, and pH prediction models were improved by using the combined spectral ranges of 1,600 to 900 cm−1, 3,040 to 1,700 cm−1, and 4,000 to 3,470 cm−1. The root mean square errors of cross-validation for the developed models were 2.36 min (RCT, range 24.9 min), 6.86 mm (a30, range 58 mm), 0.25 SH°/50 mL (TA, range 3.58 SH°/50 mL), and 0.07 (pH, range 1.15). The most successfully predicted attributes were TA, RCT, and pH. The model for the prediction of TA provided approximate prediction (R2 = 0.66), whereas the predictive models developed for RCT and pH could discriminate between high and low values (R2 = 0.59 to 0.62). It was concluded that, although the models require further development to improve their accuracy before their application in industry, MIR spectroscopy has potential application for the assessment of RCT, TA, and pH during routine milk analysis in the dairy industry. 
The implementation of such models could be a means of improving MCP through phenotype-based selection programs and of amending milk payment systems to incorporate MCP into their payment criteria.
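A partial least squares calibration of the kind described above can be sketched with a minimal NIPALS PLS1 implementation. The "spectra" below are synthetic latent-variable data, not MIR measurements; the sample sizes, component count, and noise levels are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1; returns (coefficients, x_mean, y_mean)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)
        t = Xk @ w
        p = Xk.T @ t / (t @ t)
        W.append(w); P.append(p); q.append(yk @ t / (t @ t))
        Xk = Xk - np.outer(t, p)   # deflate X
        yk = yk - q[-1] * t        # deflate y
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression vector, centred space
    return B, x_mean, y_mean

def pls1_predict(model, X):
    B, x_mean, y_mean = model
    return (X - x_mean) @ B + y_mean

# Synthetic 'spectra': 200 samples x 300 wavenumbers, with the target
# (a stand-in for rennet coagulation time) driven by two latent factors.
n, p = 200, 300
scores = rng.standard_normal((n, 2))
X = scores @ rng.standard_normal((2, p)) + 0.1 * rng.standard_normal((n, p))
y = 25 + 3.0 * scores[:, 0] - 1.5 * scores[:, 1] + 0.3 * rng.standard_normal(n)

model = pls1_fit(X[:150], y[:150], n_components=4)
rmsep = np.sqrt(np.mean((pls1_predict(model, X[150:]) - y[150:]) ** 2))
print(f"RMSE of prediction: {rmsep:.2f}")
```

In practice one would choose the number of components by cross-validation, as the paper does when reporting root mean square errors of cross-validation.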

Relevance: 60.00%

Abstract:

New ways of combining observations with numerical models are discussed, in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We explore the choice of proposal density in a Particle Filter and show how the 'curse of dimensionality' might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further 'ensuring almost equal weight' we avoid performing model runs that are useless in the end. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.

Relevance: 60.00%

Abstract:

The water vapour continuum is characterised by absorption that varies smoothly with wavelength, from the visible to the microwave. It is present within the rotational and vibrational–rotational bands of water vapour, which consist of large numbers of narrow spectral lines, and in the many ‘windows’ between these bands. The continuum absorption in the window regions is of particular importance for the Earth’s radiation budget and for remote-sensing techniques that exploit these windows. Historically, most attention has focused on the 8–12 μm (mid-infrared) atmospheric window, where the continuum is relatively well-characterised, but there have been many fewer measurements within bands and in other window regions. In addition, the causes of the continuum remain a subject of controversy. This paper provides a brief historical overview of the development of understanding of the continuum and then reviews recent developments, with a focus on the near-infrared spectral region. Recent laboratory measurements in near-infrared windows, which reveal absorption typically an order of magnitude stronger than in widely used continuum models, are shown to have important consequences for remote-sensing techniques that use these windows for retrieving cloud properties.

Relevance: 60.00%

Abstract:

Reliable techniques for screening large numbers of plants for root traits are still being developed, but include aeroponic, hydroponic and agar plate systems. Coupled with digital cameras and image analysis software, these systems permit the rapid measurement of root numbers, length and diameter in moderate (typically <1,000) numbers of plants. Usually such systems are employed with relatively small seedlings, and information is recorded in 2D. Recent developments in X-ray microtomography have facilitated 3D non-invasive measurement of small root systems grown in solid media, allowing angular distributions to be obtained in addition to numbers and length. However, because of the time taken to scan samples, only a small number can be screened (typically <10 per day, not including analysis time for the large spatial datasets generated) and, depending on sample size, limited resolution may mean that fine roots remain unresolved. Although agar plates allow differences between lines and genotypes to be discerned in young seedlings, the rank order may not be the same when the same materials are grown in solid media. For example, root length of dwarfing wheat (Triticum aestivum L.) lines grown on agar plates was increased by ~40% relative to wild-type and semi-dwarfing lines, but in a sandy loam soil under well-watered conditions it was decreased by 24-33%. Such differences in ranking suggest that significant soil environment-genotype interactions are occurring. Developments in instruments and software mean that a combination of high-throughput simple screens and more in-depth examination of root-soil interactions is becoming viable.

Relevance: 60.00%

Abstract:

In the absence of a suitable method for routine analysis of large numbers of natural river water samples for organic nitrogen and phosphorus fractions, a new simultaneous digestion technique was developed, based on a standard persulphate digestion procedure. This allows rapid analysis of river, lake and groundwater samples from a range of environments for total nitrogen and phosphorus. The method was evaluated using a range of organic nitrogen and phosphorus structures tested at low, mid and high range concentrations, from 2 to 50 mg l-1 nitrogen and 0.2 to 10 mg l-1 phosphorus. Mean recoveries for nitrogen ranged from 94.5% (2 mg l-1) to 92.7% (50 mg l-1), and for phosphorus from 98.2% (0.2 mg l-1) to 100.2% (10 mg l-1). The method is precise in its ability to reproduce results from replicate digestions, and robust in its ability to handle a variety of natural water samples in the pH range 5-8.
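The recovery figures quoted in this abstract are simple ratios of measured to known concentration. A sketch, with hypothetical replicate values chosen to reproduce the reported 94.5% mean recovery at the 2 mg l-1 nitrogen level:

```python
import numpy as np

def percent_recovery(measured, known):
    """Recovery of a spiked standard: measured / known * 100."""
    return 100.0 * np.asarray(measured) / known

# Hypothetical replicate digestions of a 2 mg l-1 nitrogen standard;
# these measured values are illustrative, not the paper's raw data.
replicates = [1.91, 1.88, 1.87, 1.90]
mean_rec = percent_recovery(replicates, 2.0).mean()
print(round(mean_rec, 1))   # 94.5
```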

Relevance: 60.00%

Abstract:

Background: Association mapping, initially developed in human disease genetics, is now being applied to plant species. The model species Arabidopsis provided some of the first examples of association mapping in plants, identifying previously cloned flowering time genes, despite high population sub-structure. More recently, association genetics has been applied to barley, where breeding activity has resulted in a high degree of population sub-structure. A major genotypic division within barley is that between winter- and spring-sown varieties, which differ in their requirement for vernalization to promote subsequent flowering. To date, all attempts to validate association genetics in barley by identifying major flowering time loci that control vernalization requirement (VRN-H1 and VRN-H2) have failed. Here, we validate the use of association genetics in barley by identifying VRN-H1 and VRN-H2, despite their prominent role in determining population sub-structure. Results: By taking barley as a typical inbreeding crop, and seasonal growth habit as a major partitioning phenotype, we develop an association mapping approach which successfully identifies VRN-H1 and VRN-H2, the underlying loci largely responsible for this agronomic division. We find that a combination of Structured Association followed by Genomic Control, applied to correct for population structure and inflation of the test statistic, resolves significant associations only with VRN-H1 and the VRN-H2 candidate genes, as well as with two genes closely linked to VRN-H1 (HvCSFs1 and HvPHYC). Conclusion: We show that, after employing appropriate statistical methods to correct for population sub-structure, the genome-wide partitioning effect of allelic status at VRN-H1 and VRN-H2 does not result in the high levels of spurious association expected to occur in highly structured samples. Furthermore, we demonstrate that both VRN-H1 and the candidate VRN-H2 genes can be identified using association mapping.
Discrimination between intragenic VRN-H1 markers was achieved, indicating that candidate causative polymorphisms may be discerned and prioritised within a larger set of positive associations. This proof of concept study demonstrates the feasibility of association mapping in barley, even within highly structured populations. A major advantage of this method is that it does not require large numbers of genome-wide markers, and is therefore suitable for fine mapping and candidate gene evaluation, especially in species for which large numbers of genetic markers are either unavailable or too costly.

Relevance: 60.00%

Abstract:

A novel version of the classical surface pressure tendency equation (PTE) is applied to ERA-Interim reanalysis data to quantitatively assess the contribution of diabatic processes to the deepening of extratropical cyclones relative to effects of temperature advection and vertical motions. The five cyclone cases selected, Lothar and Martin in December 1999, Kyrill in January 2007, Klaus in January 2009, and Xynthia in February 2010, all showed explosive deepening and brought considerable damage to parts of Europe. For Xynthia, Klaus and Lothar diabatic processes contribute more to the observed surface pressure fall than horizontal temperature advection during their respective explosive deepening phases, while Kyrill and Martin appear to be more baroclinically driven storms. The powerful new diagnostic tool presented here can easily be applied to large numbers of cyclones and will help to better understand the role of diabatic processes in future changes in extratropical storminess.

Relevance: 60.00%

Abstract:

A potential problem with the Ensemble Kalman Filter is the implicit Gaussian assumption at analysis times. Here we explore the performance of a recently proposed fully nonlinear particle filter, in which the Gaussian assumption is not made, on a high-dimensional but simplified ocean model. The model simulates the evolution of the vorticity field in time, described by the barotropic vorticity equation, in a highly nonlinear flow regime. While common knowledge is that particle filters are inefficient and need large numbers of model runs to avoid degeneracy, the newly developed particle filter needs only of the order of 10-100 particles on large-scale problems. The crucial new ingredient is that the proposal density can be used not only to ensure that all particles end up in high-probability regions of state space as defined by the observations, but also to ensure that most of the particles have similar weights. Using identical-twin experiments, we found that the ensemble mean follows the truth reliably and that the difference from the truth is captured by the ensemble spread. A rank histogram is used to show that the truth run is indistinguishable from any of the particles, demonstrating the statistical consistency of the method.
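The proposal-density idea can be illustrated on a toy one-dimensional linear-Gaussian model; this is an assumption for the sketch, not the barotropic vorticity model of the paper. Particles are drawn from the locally optimal proposal, which is tractable in the linear-Gaussian case: they are steered towards the observation, and the incremental weight depends only on the prior particle, keeping weights comparable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model:  x_k = a x_{k-1} + N(0, q^2),   y_k = x_k + N(0, r^2)
a, q, r = 0.9, 0.5, 0.5
n_steps, n_particles = 50, 100   # only O(100) particles, as in the paper

# Identical-twin setup: simulate a truth run and observations of it.
x_true = np.zeros(n_steps)
y_obs = np.zeros(n_steps)
for k in range(1, n_steps):
    x_true[k] = a * x_true[k - 1] + q * rng.standard_normal()
    y_obs[k] = x_true[k] + r * rng.standard_normal()

x = np.zeros(n_particles)        # particle ensemble
est = np.zeros(n_steps)          # filter estimate
s2 = 1.0 / (1.0 / q**2 + 1.0 / r**2)   # optimal-proposal variance
for k in range(1, n_steps):
    # Incremental weight w ∝ p(y_k | x_{k-1}) = N(y_k; a x_{k-1}, q^2 + r^2)
    logw = -0.5 * (y_obs[k] - a * x) ** 2 / (q**2 + r**2)
    # Draw from the proposal p(x_k | x_{k-1}, y_k): steered towards y_k
    m = s2 * (a * x / q**2 + y_obs[k] / r**2)
    x = m + np.sqrt(s2) * rng.standard_normal(n_particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[k] = np.sum(w * x)
    x = x[rng.choice(n_particles, n_particles, p=w)]   # resample

rmse = np.sqrt(np.mean((est[1:] - x_true[1:]) ** 2))
print(f"filter RMSE: {rmse:.3f}")
```

In this linear-Gaussian setting the filter error stays close to the posterior spread even with a modest ensemble, which is the behaviour the paper reports for its much higher-dimensional model.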

Relevance: 60.00%

Abstract:

Massive Open Online Courses (MOOCs) open up learning opportunities to a large number of people. Only a small percentage (around 10%) of the large numbers of participants enrolling in MOOCs manage to finish the course by completing all parts. The term 'dropout' is commonly used to refer to 'all who failed to complete' a course, and is used in relation to MOOCs. Because MOOC students pay no enrolment or tuition fees, there is no direct financial cost incurred by a student. Therefore, it is debatable whether the traditional definition of dropout in higher education can be directly applied to MOOCs. This paper reports ongoing exploratory work on MOOC participants' perspectives based on six qualitative interviews. The findings show that MOOC participants are challenging the widely held view of dropout, suggesting that it is more about failing to achieve their personal aims.

Relevance: 60.00%

Abstract:

In this paper we develop and apply methods for the spectral analysis of non-selfadjoint tridiagonal infinite and finite random matrices, and for the spectral analysis of analogous deterministic matrices which are pseudo-ergodic in the sense of E. B. Davies (Commun. Math. Phys. 216 (2001), 687–704). As a major application to illustrate our methods we focus on the "hopping sign model" introduced by J. Feinberg and A. Zee (Phys. Rev. E 59 (1999), 6433–6443), in which the main objects of study are random tridiagonal matrices which have zeros on the main diagonal and random ±1's as the other entries. We explore the relationship between spectral sets in the finite and infinite matrix cases, and between the semi-infinite and bi-infinite matrix cases, for example showing that the numerical range and p-norm ε-pseudospectra (ε > 0, p ∈ [1, ∞]) of the random finite matrices converge almost surely to their infinite matrix counterparts, and that the finite matrix spectra are contained in the infinite matrix spectrum Σ. We also propose a sequence of inclusion sets for Σ which we show is convergent to Σ, with the nth element of the sequence computable by calculating smallest singular values of (large numbers of) n×n matrices. We propose similar convergent approximations for the 2-norm ε-pseudospectra of the infinite random matrices, these approximations sandwiching the infinite matrix pseudospectra from above and below.
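The inclusion-set computation, taking smallest singular values of shifted finite matrices, can be sketched as follows. The matrix size and the two test points are arbitrary choices for illustration; a real computation would scan λ over a grid in the complex plane and compare against ε.

```python
import numpy as np

rng = np.random.default_rng(1)

def hopping_sign_matrix(n):
    """Finite section of the hopping sign model: zero main diagonal,
    independent random ±1 entries on the two off-diagonals."""
    up = rng.choice([-1.0, 1.0], size=n - 1)
    down = rng.choice([-1.0, 1.0], size=n - 1)
    return np.diag(up, 1) + np.diag(down, -1)

def smallest_singular_value(A, lam):
    """sigma_min(A - lam*I); small values flag lam as lying in an
    epsilon-pseudospectrum for small epsilon, i.e. near the spectrum."""
    n = A.shape[0]
    return np.linalg.svd(A - lam * np.eye(n), compute_uv=False)[-1]

A = hopping_sign_matrix(40)
# ||A|| <= 2, so any lam with |lam| > 2 lies safely outside the spectrum:
inside = smallest_singular_value(A, 0.0)
outside = smallest_singular_value(A, 3.0)
print(f"sigma_min at lam=0: {inside:.3f},  at lam=3: {outside:.3f}")
```

Since sigma_min(A − 3I) ≥ 3 − ||A|| ≥ 1, the value at λ = 3 is bounded away from zero, while at λ = 0 it is strictly below 1; this is the contrast the inclusion-set construction exploits, at the cost of one SVD (or smallest-singular-value solve) per grid point.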

Relevance: 60.00%

Abstract:

Polyommatus bellargus is a priority species of butterfly in the UK as a result of its scarcity and the rate of population decline over the last few years. In the UK, the species is associated with chalk grassland on hot, south-facing slopes suitable for the growth of the food plant Hippocrepis comosa. Shooting game birds is a popular pastime in the UK. Over 40 million game birds, principally Phasianus colchicus and Alectoris rufa, are bred and released into the countryside each year for shooting interests. There is a concern that the release of such a large number of non-native birds has an adverse effect on native wildlife. A study was carried out over a period of 3 years to examine whether there was any evidence that A. rufa released into chalk grassland habitat negatively affects populations of P. bellargus. A comparison was made between sites where large numbers of A. rufa were released and sites where no, or few, birds were released. The study involved the construction of exclosures at these sites to allow an examination of the number of butterflies emerging from H. comosa when the birds were excluded versus when the birds had free range across the area. Where birds were present, the on-site vegetation was shorter than where they were absent, indicating that the birds were clearly influencing habitat structure. However, the evidence that A. rufa was negatively influencing the number of adult butterflies emerging was not strong, although there was a largely non-significant tendency for higher butterfly emergence when the birds were excluded or absent.