872 results for Complexity analyses


Relevance: 20.00%

Abstract:

Electrical methods of geophysical survey are known to produce results that are hard to predict at different times of the year and under differing weather conditions. This is a problem that can lead to misinterpretation of the archaeological features under investigation. The dynamic relationship between a ‘natural’ soil matrix and an archaeological feature is a complex one, and it greatly affects the success of the feature’s detection when using active electrical methods of geophysical survey. This study monitored the gradual variation of measured resistivity over a selection of study areas. By targeting difficult-to-find, and often ‘missing’, electrical anomalies of known archaeological features, this study has increased the understanding of both the detection and interpretation capabilities of such geophysical surveys. A 16-month time-lapse study over four archaeological features was undertaken to investigate the aforementioned detection problem across different soils and environments. In addition to the commonly used Twin-Probe earth resistance survey, electrical resistivity imaging (ERI) and quadrature electromagnetic induction (EMI) were also utilised to explore the problem. Statistical analyses have provided a novel interpretation, yielding new insights into how the detection of archaeological features is influenced by the relationship between the target feature and the surrounding ‘natural’ soils. The study has highlighted both the complexity of, and previous misconceptions around, the predictability of the electrical methods. The analysis has confirmed that each site presents an individual and nuanced situation, with the variation clearly relating to the composition of the soils (particularly pore size) and the local weather history. The wide range of reasons behind survey success at each specific study site has been revealed.
The outcomes have shown that a simplistic model of seasonality is not universally applicable to the electrical detection of archaeological features. This has led to the development of a method for quantifying survey success, enabling a deeper understanding of the unique way in which each site is affected by the interaction of local environmental and geological conditions.

Relevance: 20.00%

Abstract:

With the increase in e-commerce and the digitisation of design data and information, the construction sector has become reliant upon IT infrastructure and systems. The design and production process is more complex, more interconnected, and reliant upon greater information mobility, with seamless exchange of data and information in real time. Construction small and medium-sized enterprises (CSMEs), in particular the speciality contractors, can effectively utilise cost-effective collaboration-enabling technologies, such as cloud computing, to help in the effective transfer of information and data to improve productivity. The system dynamics (SD) approach offers a perspective and tools to enable a better understanding of the dynamics of complex systems. This research focuses upon system dynamics methodology as a modelling and analysis tool in order to understand and identify the key drivers in the absorption of cloud computing for CSMEs. The aim of this paper is to determine how the use of system dynamics (SD) can improve the management of information flow through collaborative technologies, leading to improved productivity. The data supporting the use of system dynamics were obtained through a pilot study consisting of questionnaires and interviews from five CSMEs in the UK house-building sector.
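
To illustrate what a system dynamics model looks like in practice, here is a minimal stock-and-flow sketch. The model, rates, and variable names are illustrative assumptions (a Bass-style diffusion of a technology through a population of firms), not the model developed in the paper.

```python
# Minimal stock-and-flow sketch in the spirit of system dynamics (SD).
# All parameters below are hypothetical, chosen only for illustration.

def simulate_adoption(population=100.0, initial_adopters=1.0,
                      contact_rate=0.3, dt=0.25, steps=120):
    """Euler-integrate a Bass-style diffusion model: one stock (adopters)
    and one flow (adoption driven by word-of-mouth contact)."""
    adopters = initial_adopters
    history = [adopters]
    for _ in range(steps):
        potential = population - adopters                 # remaining stock
        flow = contact_rate * adopters * potential / population
        adopters += flow * dt                             # integrate stock
        history.append(adopters)
    return history

trajectory = simulate_adoption()   # S-shaped adoption curve
```

The characteristic S-shaped curve emerges from the feedback loop between the stock and the flow, which is the kind of dynamic behaviour SD tools are designed to expose.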

Relevance: 20.00%

Abstract:

Let λ_1, …, λ_n be real numbers in (0, 1) and p_1, …, p_n be points in ℝ^d. Consider the collection of maps f_j : ℝ^d → ℝ^d given by f_j(x) = λ_j x + (1 − λ_j) p_j. It is a well-known result that there exists a unique non-empty compact set Λ ⊂ ℝ^d satisfying Λ = ⋃_{j=1}^{n} f_j(Λ). Each x ∈ Λ has at least one coding, that is, a sequence (ε_i)_{i=1}^{∞} ∈ {1, …, n}^ℕ that satisfies lim_{N→∞} f_{ε_1} ∘ ⋯ ∘ f_{ε_N}(0) = x. We study the size and complexity of the set of codings of a generic x ∈ Λ when Λ has positive Lebesgue measure. In particular, we show that under certain natural conditions almost every x ∈ Λ has a continuum of codings. We also show that almost every x ∈ Λ has a universal coding. Our work makes no assumptions on the existence of holes in Λ and improves upon existing results when it is assumed that Λ contains no holes.
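
The limit above can be approximated by evaluating a finite prefix of a coding. The sketch below does this for a one-dimensional instance of the IFS; the particular λ's and p's are illustrative, not from the paper.

```python
# Sketch: evaluate the point encoded by a finite prefix of a coding for the
# IFS f_j(x) = lambda_j * x + (1 - lambda_j) * p_j (1-D illustrative case).

def point_from_coding(coding, lambdas, ps, x0=0.0):
    """Return f_{eps_1} o ... o f_{eps_N}(x0) for a 1-based coding prefix."""
    x = x0
    for eps in reversed(coding):          # innermost map is applied first
        lam, p = lambdas[eps - 1], ps[eps - 1]
        x = lam * x + (1 - lam) * p
    return x

# Two maps with lambda = 1/2 and fixed points 0 and 1: the attractor is
# [0, 1], and codings correspond to binary expansions (digit 2 -> bit 1).
lambdas, ps = [0.5, 0.5], [0.0, 1.0]
x = point_from_coding([1, 2] * 25, lambdas, ps)   # the binary point 0.0101... = 1/3
```

In this binary-expansion example, points with two codings are exactly the dyadic rationals; the abstract's results concern the far richer coding sets that arise when the pieces f_j(Λ) overlap.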

Relevance: 20.00%

Abstract:

A causal explanation provides information about the causal history of whatever is being explained. However, most causal histories extend back almost infinitely and can be described in almost infinite detail. Causal explanations therefore involve choices about which elements of causal histories to pick out. These choices are pragmatic: they reflect our explanatory interests. When adjudicating between competing causal explanations, we must therefore consider not only questions of epistemic adequacy (whether we have good grounds for identifying certain factors as causes) but also questions of pragmatic adequacy (whether the aspects of the causal history picked out are salient to our explanatory interests). Recognizing that causal explanations differ pragmatically as well as epistemically is crucial for identifying what is at stake in competing explanations of the relative peacefulness of the nineteenth-century Concert system. It is also crucial for understanding how explanations of past events can inform policy prescription.

Relevance: 20.00%

Abstract:

The size and complexity of data sets generated within ecosystem-level programmes merit their capture, curation, storage, analysis, synthesis and visualisation using Big Data approaches. This review looks at previous attempts to organise and analyse such data through the International Biological Programme and draws on the mistakes made and the lessons learned for effective Big Data approaches to current Research Councils United Kingdom (RCUK) ecosystem-level programmes, using Biodiversity and Ecosystem Service Sustainability (BESS) and Environmental Virtual Observatory Pilot (EVOp) as exemplars. The challenges raised by such data are identified and explored, and suggestions are made for addressing the two major issues: extending analyses across different spatio-temporal scales and effectively integrating quantitative and qualitative data.

Relevance: 20.00%

Abstract:

This chapter analyses the major UK economic crises that have occurred since the speculative bubbles of the seventeenth century. It integrates insights from economic history and business history to analyse both the general economic conditions and the specific business and financial practices that led to these crises. The analysis suggests a significant reinterpretation of the evidence – one that questions economists’ conventional views.

Relevance: 20.00%

Abstract:

Demand for organic milk is partially driven by consumer perceptions that it is more nutritious. However, there is still considerable uncertainty over whether the use of organic production standards affects milk quality. Here we report results of meta-analyses based on 170 published studies comparing the nutrient content of organic and conventional bovine milk. There were no significant differences in total SFA and MUFA concentrations between organic and conventional milk. However, concentrations of total PUFA and n-3 PUFA were significantly higher in organic milk, by an estimated 7 (95 % CI −1, 15) % and 56 (95 % CI 38, 74) %, respectively. Concentrations of α-linolenic acid (ALA), very long-chain n-3 fatty acids (EPA+DPA+DHA) and conjugated linoleic acid were also significantly higher in organic milk, by an estimated 69 (95 % CI 53, 84) %, 57 (95 % CI 27, 87) % and 41 (95 % CI 14, 68) %, respectively. As there were no significant differences in total n-6 PUFA and linoleic acid (LA) concentrations, the n-6:n-3 and LA:ALA ratios were lower in organic milk, by an estimated 71 (95 % CI −122, −20) % and 93 (95 % CI −116, −70) %. It is concluded that organic bovine milk has a more desirable fatty acid composition than conventional milk. Meta-analyses also showed that organic milk has significantly higher α-tocopherol and Fe concentrations, but lower I and Se concentrations. Redundancy analysis of data from a large cross-European milk quality survey indicates that the higher grazing/conserved forage intakes in organic systems were the main reason for milk composition differences.
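
A basic ingredient of meta-analyses such as these is inverse-variance pooling of per-study effect sizes. The sketch below shows the fixed-effect version; the three effect sizes and standard errors are hypothetical, not the study's data.

```python
# Sketch of inverse-variance (fixed-effect) meta-analytic pooling.
# The input numbers are invented for illustration.

def pooled_estimate(effects, standard_errors):
    """Weight each study by 1/SE^2; return the pooled estimate and its SE."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return estimate, pooled_se

effects = [50.0, 60.0, 58.0]   # e.g. % difference in a nutrient, per study
ses = [10.0, 8.0, 12.0]
est, se = pooled_estimate(effects, ses)
ci95 = (est - 1.96 * se, est + 1.96 * se)   # normal-approximation 95% CI
```

Random-effects models (which add a between-study variance component) are usually preferred when, as here, studies span many countries and production systems.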

Relevance: 20.00%

Abstract:

1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species’ traits and/or extrinsic factors. This approach requires comprehensive species data, but information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall studied group. 2. Our study challenges this assumption and quantifies the taxonomic, spatial, and data type biases associated with the quantity of data available for 5415 mammalian species using the freely available life-history database PanTHERIA. 3. Moreover, we explore how existing biases influence results of comparative analyses of extinction risk by using subsets of data that attempt to correct for detected biases. In particular, we focus on links between four species’ traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions. 4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear to be not missing at random, occur frequently in all traits (14–99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and changing which traits are identified as important predictors. 5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction.
Missing data represent a prevalent problem in comparative analyses, and unfortunately, because data are not missing at random, conventional approaches to fill data gaps are not valid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches to effectively correct existing biases.
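
The core problem, data missing not at random, can be shown in a few lines. In this simulated sketch (not the study's data), small-bodied "species" are less likely to have a recorded body mass, so a complete-case mean is biased upward:

```python
import random

# Sketch: missing-not-at-random data bias complete-case analyses.
# Entirely simulated, for illustration only.

random.seed(1)
true_masses = [random.lognormvariate(0.0, 1.0) for _ in range(10_000)]

# Probability of being measured grows with body mass (capped at 1) --
# mirroring the finding that larger mammals are the best studied.
observed = [m for m in true_masses
            if random.random() < min(1.0, 0.2 + 0.2 * m)]

true_mean = sum(true_masses) / len(true_masses)
complete_case_mean = sum(observed) / len(observed)
# complete_case_mean > true_mean: the well-sampled species are the large ones
```

Because the missingness mechanism depends on the trait itself, no amount of extra complete-case data removes the bias, which is why the authors call for correction methods rather than larger convenience samples.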

Relevance: 20.00%

Abstract:

Recruitment of patients to a clinical trial usually occurs over a period of time, resulting in the steady accumulation of data throughout the trial's duration. Yet, according to traditional statistical methods, the sample size of the trial should be determined in advance, and data collected on all subjects before analysis proceeds. For ethical and economic reasons, the technique of sequential testing has been developed to enable the examination of data at a series of interim analyses. The aim is to stop recruitment to the study as soon as there is sufficient evidence to reach a firm conclusion. In this paper we present the advantages and disadvantages of conducting interim analyses in phase III clinical trials, together with the key steps to enable the successful implementation of sequential methods in this setting. Examples are given of completed trials that were carried out sequentially, and references to relevant literature and software are provided.
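
The classical building block behind such sequential designs is Wald's sequential probability ratio test, sketched below. This is a generic illustration of stopping boundaries, not the specific group-sequential methods discussed in the paper.

```python
from math import log

# Sketch of Wald's sequential probability ratio test (SPRT): after each
# observation the cumulative log likelihood ratio is compared to two bounds.

def sprt_bounds(alpha=0.05, beta=0.20):
    """Approximate stopping bounds for type I error alpha, type II beta."""
    upper = log((1 - beta) / alpha)    # cross above: stop, reject H0
    lower = log(beta / (1 - alpha))    # cross below: stop, accept H0
    return lower, upper

def sprt_decision(cum_log_lr, alpha=0.05, beta=0.20):
    lower, upper = sprt_bounds(alpha, beta)
    if cum_log_lr >= upper:
        return "reject H0"
    if cum_log_lr <= lower:
        return "accept H0"
    return "continue"                  # keep recruiting patients
```

Group-sequential methods used in phase III trials replace these open-ended bounds with a small, pre-planned number of interim looks whose critical values are adjusted to preserve the overall type I error rate.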

Relevance: 20.00%

Abstract:

Quantifying the effect of seawater density changes on sea level variability is of crucial importance for climate change studies, as cumulative sea level rise can be regarded as both an important climate change indicator and a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, the global and regional steric sea level changes are estimated and compared from an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are initially compared with a satellite-derived (altimetry minus gravimetry) dataset for a short period (2003–2010). The ensemble mean exhibits a significantly high correlation at both global and regional scale, and the ensemble of ocean reanalyses outperforms that of objective analyses, in particular in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain for the inter-annual trends. Within the extended intercomparison period that spans the altimetry era (1993–2010), we find that the ensembles of reanalyses and objective analyses are in good agreement, detecting global steric sea level trends of 1.0 and 1.1 ± 0.05 mm/year, respectively. However, the spread among the products of the halosteric component trend exceeds the mean trend itself, questioning the reliability of its estimate. This is related to the scarcity of salinity observations before the Argo era. Furthermore, the impact of deep ocean layers on the steric sea level variability is non-negligible (22 and 12 % for the layers below 700 and 1500 m depth, respectively), although the small deep ocean trends are not significant with respect to the products' spread.
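
Trend figures like the 1.0–1.1 mm/year above come from fitting a straight line to a monthly time series. A minimal least-squares sketch, on a synthetic series rather than reanalysis output:

```python
# Sketch: ordinary least-squares linear trend of a monthly sea level
# series, in mm/year. The series below is synthetic, for illustration.

def linear_trend(times_yr, values_mm):
    """Slope of the least-squares line through (time, value), in mm/year."""
    n = len(times_yr)
    t_mean = sum(times_yr) / n
    v_mean = sum(values_mm) / n
    num = sum((t - t_mean) * (v - v_mean)
              for t, v in zip(times_yr, values_mm))
    den = sum((t - t_mean) ** 2 for t in times_yr)
    return num / den

times = [1993 + m / 12 for m in range(216)]      # monthly, 1993-2010
series = [1.0 * (t - 1993) for t in times]       # synthetic 1.0 mm/yr rise

trend = linear_trend(times, series)
```

In real intercomparisons the seasonal cycle is removed first and trend uncertainties are estimated from the ensemble spread, as the abstract notes for the halosteric component.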

Relevance: 20.00%

Abstract:

We analysed Hordeum spontaneum accessions from 21 different locations to understand the genetic diversity of HsDhn3 alleles and the effects of single-base mutations on the intrinsically disordered structure of the resulting polypeptide (HsDHN3). HsDHN3 was found to be YSK2-type, with a low-frequency 6-aa deletion at the beginning of Exon 1. There is relatively high diversity in the intron region of HsDhn3 compared to the two exon regions. We found that subtle differences in the K segments led to changes in amino acid chemical properties. Predictions of protein interaction profiles suggest the presence of a protein-binding site in HsDHN3 that coincides with the K1 segment. Comparison of DHN3 to closely related cereals showed that all of them contain a nuclear localization signal sequence flanking the K1 segment and a novel conserved region located between the S and K1 segments [E(D/T)DGMGGR]. We found that H. vulgare, H. spontaneum, and Triticum urartu DHN3s have a greater number of phosphorylation sites for protein kinase C than other cereal species, which may be related to stress adaptation. Our results show that the nature and extent of mutations in the conserved K1 and K2 segments are likely to be key factors in the protection of cells.
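
A conserved region written as E(D/T)DGMGGR maps directly onto a regular-expression scan. The sketch below uses an invented toy sequence, not an actual DHN3 sequence:

```python
import re

# Sketch: scan a protein sequence for the conserved E(D/T)DGMGGR region
# reported between the S and K1 segments. Toy sequence, for illustration.

MOTIF = re.compile(r"E[DT]DGMGGR")

def find_motif(sequence):
    """Return the 0-based start positions of every motif occurrence."""
    return [m.start() for m in MOTIF.finditer(sequence)]

toy_seq = "MAGH" + "EDDGMGGR" + "KKK" + "ETDGMGGR" + "END"
hits = find_motif(toy_seq)   # one D-variant and one T-variant match
```

The character class `[DT]` encodes the (D/T) ambiguity at the second position; the same pattern style extends to other degenerate motifs such as phosphorylation site consensus sequences.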

Relevance: 20.00%

Abstract:

Approximate Bayesian computation (ABC) is a popular family of algorithms which perform approximate parameter inference when numerical evaluation of the likelihood function is not possible but data can be simulated from the model. They return a sample of parameter values which produce simulations close to the observed dataset. A standard approach is to reduce the simulated and observed datasets to vectors of summary statistics and accept when the difference between these is below a specified threshold. ABC can also be adapted to perform model choice. In this article, we present a new software package for R, abctools which provides methods for tuning ABC algorithms. This includes recent dimension reduction algorithms to tune the choice of summary statistics, and coverage methods to tune the choice of threshold. We provide several illustrations of these routines on applications taken from the ABC literature.
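
The core rejection-ABC loop described here is short enough to sketch. This toy is in Python for illustration only (abctools itself is an R package, and this sketch does not use its tuning methods): we infer the mean of a Normal(θ, 1) model from one summary statistic, the sample mean, accepting prior draws whose simulated summary lands within a fixed threshold of the observed one.

```python
import random

# Minimal rejection-ABC sketch (illustrative; not the abctools package).

random.seed(0)

def simulate(theta, n=50):
    """Simulate a dataset from the model given parameter theta."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    """Reduce a dataset to a summary statistic (here, the sample mean)."""
    return sum(data) / len(data)

observed_summary = 2.0              # pretend observed sample mean

accepted = []
while len(accepted) < 200:
    theta = random.uniform(-5.0, 5.0)          # draw from a flat prior
    sim_summary = summary(simulate(theta))
    if abs(sim_summary - observed_summary) < 0.2:   # acceptance threshold
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)   # approximate posterior mean
```

The two tuning choices this loop hard-codes, which summary statistic to use and how tight to set the threshold, are exactly what the dimension-reduction and coverage methods in abctools are designed to select.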

Relevance: 20.00%

Abstract:

1. The rapid expansion of systematic monitoring schemes necessitates robust methods to reliably assess species' status and trends. Insect monitoring poses a challenge where there are strong seasonal patterns, requiring repeated counts to reliably assess abundance. Butterfly monitoring schemes (BMSs) operate in an increasing number of countries with broadly the same methodology, yet they differ in their observation frequency and in the methods used to compute annual abundance indices. 2. Using simulated and observed data, we performed an extensive comparison of two approaches used to derive abundance indices from count data collected via BMS, under a range of sampling frequencies. Linear interpolation is most commonly used to estimate abundance indices from seasonal count series. A second method, hereafter the regional generalized additive model (GAM), fits a GAM to repeated counts within sites across a climatic region. For the two methods, we estimated bias in abundance indices and the statistical power for detecting trends, given different proportions of missing counts. We also compared the accuracy of trend estimates using systematically degraded observed counts of the Gatekeeper Pyronia tithonus (Linnaeus 1767). 3. The regional GAM method generally outperforms the linear interpolation method. When the proportion of missing counts increased beyond 50%, indices derived via the linear interpolation method showed substantially higher estimation error as well as clear biases, in comparison to the regional GAM method. The regional GAM method also showed higher power to detect trends when the proportion of missing counts was substantial. 4. Synthesis and applications. Monitoring offers invaluable data to support conservation policy and management, but requires robust analysis approaches and guidance for new and expanding schemes. 
Based on our findings, we recommend the regional generalized additive model approach when conducting integrative analyses across schemes, or when analysing scheme data with reduced sampling efforts. This method enables existing schemes to be expanded or new schemes to be developed with reduced within-year sampling frequency, as well as affording options to adapt protocols to more efficiently assess species status and trends across large geographical scales.
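
The linear-interpolation baseline compared above can be sketched in a few lines: missing weekly counts are filled by interpolating between the nearest observed weeks, then summed into an annual abundance index. Toy data; real BMS pipelines (and the regional GAM alternative) are considerably more involved, and this sketch assumes the first and last weeks were observed.

```python
# Sketch of the linear-interpolation abundance index (toy illustration).

def interpolate_counts(weeks, counts):
    """counts uses None for missing weeks; returns a fully filled list."""
    known = [(w, float(c)) for w, c in zip(weeks, counts) if c is not None]
    filled = []
    for w, c in zip(weeks, counts):
        if c is not None:
            filled.append(float(c))
            continue
        w0, c0 = max((p for p in known if p[0] < w), key=lambda p: p[0])
        w1, c1 = min((p for p in known if p[0] > w), key=lambda p: p[0])
        filled.append(c0 + (c1 - c0) * (w - w0) / (w1 - w0))
    return filled

weeks = [1, 2, 3, 4, 5]
counts = [2, None, 6, None, 10]            # two missing weekly counts
filled = interpolate_counts(weeks, counts)
annual_index = sum(filled)                 # annual abundance index
```

The fragility this exposes is the paper's point: with many consecutive missing weeks, straight-line fills badly misrepresent a strongly seasonal flight curve, whereas a regional GAM borrows the curve's shape from other sites in the same climatic region.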

Relevance: 20.00%

Abstract:

The ‘soft’ ionization technique matrix-assisted laser desorption/ionization (MALDI) is without doubt one of the great success stories of modern mass spectrometry (MS). In particular, the further development of MALDI and in general ‘soft’ laser ionization, focusing on their unique characteristics and advantages in areas such as speed, spatial resolution, sample preparation and low spectral complexity, have led to great advances in mass spectral profiling and imaging with an extremely auspicious future in (bio)medical analyses.