31 results for post-Newtonian approximation to general relativity
Abstract:
The impact that “Romanization” and the development of urban centers had on the health of the Romano-British population is little understood. A re-examination of the skeletal remains of 364 nonadults from the civitas capital at Roman Dorchester (Durnovaria) in Dorset was carried out to measure the health of the children living in this small urban area. The cemetery population was divided into two groups: the first buried their dead organized within an east–west alignment with possible Christian-style graves, and the second with more varied “pagan” graves, aligned north–south. A higher prevalence of malnutrition and trauma was evident in the children from Dorchester than in any other published Romano-British group, with levels similar to those seen in postmedieval industrial communities. Cribra orbitalia was present in 38.5% of the children, with rickets and/or scurvy at 11.2%. Twelve children displayed fractures of the ribs, with 50% of cases associated with rickets and/or scurvy, suggesting that rib fractures should be considered during the diagnosis of these conditions. The high prevalence of anemia, rickets, and scurvy in the Poundbury children, and especially the infants, indicates that this community may have adopted child-rearing practices that involved fasting the newborn, a poor-quality weaning diet, and swaddling, leading to general malnutrition and inadequate exposure to sunlight. The pagan group showed no evidence of scurvy or rib fractures, indicating differences in religious and child-rearing practices; that both burial groups were equally susceptible to rickets and anemia, however, suggests a shared poor standard of living in this urban environment.
Abstract:
Despite recent scholarship that has suggested that most if not all Athenian vases were created primarily for the symposium, vases associated with weddings constitute a distinct range of Athenian products that were used at Athens in the period of the Peloponnesian War and its immediate aftermath (430–390 BCE). Just as the subject matter of sympotic vases suggested stories or other messages to the hetaireia among whom they were used, so the wedding vases may have conveyed messages to audiences at weddings. This paper is an assessment of these wedding vases with particular attention to function: how the images reflect the use of vases in wedding rituals (as containers and/or gifts); how the images themselves were understood and interpreted in the context of weddings; and the post-nuptial uses to which the vases were put. The first part is an iconographic overview of how the Athenian painters depicted weddings, with an emphasis on the display of pottery to onlookers and guests during the public parts of weddings, important events in the life of the polis. The second part focuses on a large group of late fifth century vases that depict personifications of civic virtues, normally in the retinue of Aphrodite (Pandemos). The images would reinforce social expectations, as they advertised the virtues that would create a happy marriage—Peitho, Harmonia (Harmony), and Eukleia (Good Repute)—and promise the benefits that might result from adherence to these values—Eudaimonia and Eutychia (Prosperity), Hygieia (Health), and Paidia (Play or Childrearing). Civic personifications could be interpreted on the private level—as personal virtues—and on the public level—as civic virtues—especially when they appeared on vases that functioned both in public and private, at weddings, which were public acknowledgments of private changes in the lives of individuals within the demos.
Abstract:
Two errors in my paper “Wave functions for the methane molecule” [1] are corrected. They concern my f-harmonic approximation to the wave-function in the equilibrium configuration, for which the final expression for the wave function, the energy lowering, and the density function were all in error.
Abstract:
Sequential techniques can enhance the efficiency of the approximate Bayesian computation algorithm, as in Sisson et al.'s (2007) partial rejection control version. While this method is based upon the theoretical works of Del Moral et al. (2006), the application to approximate Bayesian computation results in a bias in the approximation to the posterior. An alternative version based on genuine importance sampling arguments bypasses this difficulty, in connection with the population Monte Carlo method of Cappé et al. (2004), and it includes an automatic scaling of the forward kernel. When applied to a population genetics example, it compares favourably with two other versions of the approximate algorithm.
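The population Monte Carlo flavour of ABC described above can be sketched in a few lines. The sketch below is a generic illustration on a toy one-parameter model, not the paper's implementation: all function names are assumptions, a normal perturbation kernel is assumed, and the kernel variance is adapted automatically as twice the weighted particle variance, with particles reweighted by genuine importance-sampling weights (prior over the particle-mixture proposal).

```python
import numpy as np

def abc_pmc(observed, simulate, prior_sample, prior_pdf,
            n_particles=500, epsilons=(2.0, 1.0, 0.5), rng=None):
    """Toy ABC population Monte Carlo sketch with an adaptive forward kernel."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Generation 0: plain rejection ABC at the loosest tolerance.
    theta = np.array([_rejection_draw(observed, simulate, prior_sample,
                                      epsilons[0], rng)
                      for _ in range(n_particles)])
    w = np.full(n_particles, 1.0 / n_particles)
    for eps in epsilons[1:]:
        # Automatic scaling of the forward kernel: twice the weighted variance.
        mean = np.average(theta, weights=w)
        tau2 = 2.0 * np.average((theta - mean) ** 2, weights=w)
        new_theta = np.empty(n_particles)
        new_w = np.empty(n_particles)
        for i in range(n_particles):
            while True:  # propose from the perturbed particle cloud until accepted
                j = rng.choice(n_particles, p=w)
                cand = theta[j] + rng.normal(0.0, np.sqrt(tau2))
                if abs(simulate(cand, rng) - observed) < eps:
                    break
            new_theta[i] = cand
            # Importance weight: prior density over the mixture proposal density
            # (the Gaussian normalising constant cancels after renormalisation).
            kernel = np.exp(-(cand - theta) ** 2 / (2.0 * tau2))
            new_w[i] = prior_pdf(cand) / np.sum(w * kernel)
        theta, w = new_theta, new_w / new_w.sum()
    return theta, w

def _rejection_draw(observed, simulate, prior_sample, eps, rng):
    while True:
        cand = prior_sample(rng)
        if abs(simulate(cand, rng) - observed) < eps:
            return cand
```

For instance, with observed summary 1.0, simulator `theta + N(0, 0.2)`, and a flat-ish N(0, 5) prior, the weighted particles concentrate near 1.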
Abstract:
Aim: The Mediterranean region is a species-rich area with a complex geographical history. Geographical barriers have been removed and restored due to sea level changes and local climatic change. Such barriers have been proposed as a plausible mechanism driving the high levels of speciation and endemism in the Mediterranean basin. This raises the fundamental question: is allopatric isolation the mechanism by which speciation occurs? This study explores the potential driving influence of palaeogeographical events on the speciation of Cyclamen (Myrsinaceae), a group with most species endemic to the Mediterranean region. Cyclamen species have been shown experimentally to have few genetic barriers to hybridization.
Location: The Mediterranean region, including northern Africa, extending eastwards to the Black Sea coast.
Methods: A generic-level molecular phylogeny of Myrsinaceae and Primulaceae is constructed, using Bayesian approximation, to produce a secondary age estimate for the stem lineage of Cyclamen. This estimate is used to calibrate temporally an infrageneric phylogeny of Cyclamen, built with nrDNA ITS, cpDNA trnL-F and cpDNA rps16 sequences. A biogeographical analysis of Cyclamen is performed using dispersal-vicariance analysis.
Results: The emergence of the Cyclamen stem lineage is estimated at 30.1-29.2 Ma, and the crown divergence at 12.9-12.2 Ma. The average age of Cyclamen species is 3.7 Myr. Every pair of sister species has mutually exclusive, allopatric distributions. This pattern appears typical of divergence events throughout the evolutionary history of the genus.
Main conclusions: Geographical barriers, such as the varying levels of the Mediterranean Sea, are the most plausible explanation for speciation events throughout the phylogenetic history of Cyclamen. The genus demonstrates distributional patterns congruent with the temporally reticulate palaeogeography of the Mediterranean region.
Abstract:
Stephens and Donnelly have introduced a simple yet powerful importance sampling scheme for computing the likelihood in population genetic models. Fundamental to the method is an approximation to the conditional probability of the allelic type of an additional gene, given those currently in the sample. As noted by Li and Stephens, the product of these conditional probabilities for a sequence of draws that gives the frequency of allelic types in a sample is an approximation to the likelihood, and can be used directly in inference. The aim of this note is to demonstrate the high level of accuracy of the "product of approximate conditionals" (PAC) likelihood when used with microsatellite data. Results obtained on simulated microsatellite data show that this strategy leads to a negligible bias over a wide range of the scaled mutation parameter theta. Furthermore, the sampling variance of the likelihood estimates, as well as the computation time, are lower than those obtained with importance sampling over the whole range of theta. It follows that this approach represents an efficient substitute for IS algorithms in computer-intensive (e.g. MCMC) inference methods in population genetics.
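The PAC construction itself is compact: the likelihood of an ordered sample is taken as the product of successive plug-in conditionals. The sketch below is illustrative only; it uses the exact Ewens-sampling-formula conditional for the infinite-alleles model as the plug-in, whereas the note above concerns microsatellite-specific approximate conditionals.

```python
from collections import Counter

def pac_likelihood(sample, theta):
    """Product-of-approximate-conditionals likelihood for an ordered sample.

    Plug-in conditional (infinite-alleles, Ewens): given n genes observed so
    far, the next gene copies an existing type a with probability
    n_a / (n + theta) and is a novel type with probability theta / (n + theta).
    """
    counts = Counter()
    n = 0
    lik = 1.0
    for allele in sample:
        if counts[allele] > 0:
            cond = counts[allele] / (n + theta)   # copy an existing type
        else:
            cond = theta / (n + theta)            # draw a novel allelic type
        lik *= cond
        counts[allele] += 1
        n += 1
    return lik
```

Since a PAC likelihood generally depends on the order of the draws, Li and Stephens' strategy of averaging over several random permutations of the sample would be applied on top of this.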
Abstract:
The inaugural meeting of the International Scientific Association for Probiotics and Prebiotics (ISAPP) was held May 3 to May 5, 2002, in London, Ontario, Canada. A group of 63 academic and industrial scientists from around the world convened to discuss current issues in the science of probiotics and prebiotics. ISAPP is a non-profit organization composed of international scientists whose intent is to strongly support and improve the levels of scientific integrity and due diligence associated with the study, use, and application of probiotics and prebiotics. In addition, ISAPP values its role in facilitating communication with the public and healthcare providers and among scientists in related fields on all topics pertinent to probiotics and prebiotics. It is anticipated that such efforts will lead to development of approaches and products that are optimally designed for the improvement of human and animal health and well-being. This article is a summary of the discussions, conclusions, and recommendations made by 8 working groups convened during the first ISAPP workshop focusing on the topics of: definitions, intestinal flora, extra-intestinal sites, immune function, intestinal disease, cancer, genetics and genomics, and second generation prebiotics. Humans have evolved in symbiosis with an estimated 10^14 resident microorganisms. However, as medicine has widely defined and explored the perpetrators of disease, including those of microbial origin, it has paid relatively little attention to the microbial cells that constitute the most abundant life forms associated with our body. Microbial metabolism in humans and animals constitutes an intense biochemical activity in the body, with profound repercussions for health and disease.
As understanding of the human genome constantly expands, an important opportunity will arise to better determine the relationship between microbial populations within the body and host factors (including gender, genetic background, and nutrition) and the concomitant implications for health and improved quality of life. Combined human and microbial genetic studies will determine how such interactions can affect human health and longevity, which communication systems are used, and how they can be influenced to benefit the host. Probiotics are defined as live microorganisms which, when administered in adequate amounts, confer a health benefit on the host.1 The probiotic concept dates back over 100 years, but only in recent times have the scientific knowledge and tools become available to properly evaluate their effects on normal health and well-being, and their potential in preventing and treating disease. A similar situation exists for prebiotics, defined by this group as non-digestible substances that provide a beneficial physiological effect on the host by selectively stimulating the favorable growth or activity of a limited number of indigenous bacteria. Prebiotics are complementary to, and possibly synergistic with, probiotics. Numerous studies are providing insights into the growth and metabolic influence of these microbial nutrients on health. Today, the science behind the function of probiotics and prebiotics still requires more stringent deciphering, both scientifically and mechanistically. The explosion of publications and interest in probiotics and prebiotics has resulted in a body of collective research that points toward great promise. However, this research is spread among such a diversity of organisms, delivery vehicles (foods, pills, and supplements), and potential health targets that general conclusions cannot easily be made. Nevertheless, this situation is rapidly changing on a number of important fronts.
With progress over the past decade on the genetics of lactic acid bacteria and the recent2,3 and pending4 release of complete genome sequences for major probiotic species, the field is now armed with detailed information and sophisticated microbiological and bioinformatic tools. Similarly, advances in biotechnology could yield new probiotics and prebiotics designed for enhanced or expanded functionality. The incorporation of genetic tools within a multidisciplinary scientific platform is expected to reveal the contributions of commensals, probiotics, and prebiotics to general health and well-being and explicitly identify the mechanisms and corresponding host responses that provide the basis for their positive roles and associated claims. In terms of human suffering, the need for effective new approaches to prevent and treat disease is paramount. The need exists not only to alleviate the significant mortality and morbidity caused by intestinal diseases worldwide (especially diarrheal diseases in children), but also for infections at non-intestinal sites. This is especially worthy of pursuit in developing nations, where mortality is too often the outcome of food- and water-borne infection. Inasmuch as probiotics and prebiotics are able to influence the populations or activities of commensal microflora, there is evidence that they can also play a role in mitigating some diseases.5,6 Preliminary support is emerging that probiotics and prebiotics may be useful as interventions in conditions including inflammatory bowel disease, irritable bowel syndrome, allergy, cancer (especially colorectal cancer, of which 75% are associated with diet), vaginal and urinary tract infections in women, kidney stone disease, mineral absorption, and infections caused by Helicobacter pylori.
Some metabolites of microbes in the gut may also impact systemic conditions ranging from coronary heart disease to cognitive function, suggesting the possibility that exogenously applied microbes in the form of probiotics, or alteration of gut microecology with prebiotics, may be useful interventions even in these apparently disparate conditions. Beyond these direct intervention targets, probiotic cultures can also serve in expanded roles as live vehicles to deliver biologic agents (vaccines, enzymes, and proteins) to targeted locations within the body. The economic impact of these disease conditions in terms of diagnosis, treatment, doctor and hospital visits, and time off work exceeds several hundred billion dollars. The quality-of-life impact is also of major concern. Probiotics and prebiotics offer plausible opportunities to reduce the morbidity associated with these conditions. The following addresses issues that emerged from 8 workshops (Definitions, Intestinal Flora, Extra-Intestinal Sites, Immune Function, Intestinal Disease, Cancer, Genomics, and Second Generation Prebiotics), reflecting the current scientific state of probiotics and prebiotics. This is not a comprehensive review; rather, it emphasizes pivotal knowledge gaps, and recommendations are made as to the underlying scientific and multidisciplinary studies that will be required to advance our understanding of the roles and impact of prebiotics, probiotics, and the commensal microflora upon health and disease management.
Abstract:
The transreal numbers are a total number system in which every arithmetical operation is well defined everywhere. This has many benefits over the real numbers as a basis for computation and, possibly, for physical theories. We define the topology of the transreal numbers and show that it gives a more coherent interpretation of two's complement arithmetic than the conventional integer model. Trans-two's-complement arithmetic handles the infinities and 0/0 more coherently, and with very much less circuitry, than floating-point arithmetic. This reduction in circuitry is especially beneficial in parallel computers, such as the Perspex machine, and the increase in functionality makes Digital Signal Processing chips better suited to general computation.
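Total division in the transreal spirit can be sketched with IEEE floats, using inf and NaN as stand-ins for transreal infinity and nullity. This is only an approximation of the semantics (one caveat: IEEE NaN is unequal to itself, whereas transreal nullity equals itself), and it is not the trans-two's-complement hardware scheme the abstract describes.

```python
import math

INF = float('inf')
NULLITY = float('nan')  # stand-in for transreal nullity, Phi = 0/0

def trans_div(a, b):
    """Total division: every quotient is defined, unlike real division,
    which leaves division by zero undefined."""
    if math.isnan(a) or math.isnan(b):
        return NULLITY              # nullity propagates through operations
    if b == 0:
        if a > 0:
            return INF              # a/0 = +infinity for a > 0
        if a < 0:
            return -INF             # a/0 = -infinity for a < 0
        return NULLITY              # 0/0 = nullity
    if math.isinf(a) and math.isinf(b):
        return NULLITY              # infinity/infinity = nullity
    return a / b                    # finite/infinite gives 0, as in transreal arithmetic
```

Every branch returns a value, so the function is total: no input raises an exception, which is the property the transreal system is designed to guarantee.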
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years research has been ongoing in the field of ontological engineering, with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system (Dynamic REtrieval Analysis and semantic metadata Management (DREAM)) designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper provides its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes.
Abstract:
A fundamental principle in practical nonlinear data modeling is parsimony: construct the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization error when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures. The proposed backward elimination procedures exploit an orthogonalization step to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods to prune a model, gaining extra sparsity and improved generalization.
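The key computational point, that LOO errors can be obtained analytically without refitting on each reduced data set, is visible already in the plain linear least-squares setting: the LOO residual is the ordinary residual scaled by 1/(1 - h_ii), where h_ii is the leverage. This standard identity is a stand-in for, not a reproduction of, the recursive formulae over pruned regressor subsets derived in the paper.

```python
import numpy as np

def loo_mse_linear(X, y):
    """Analytic leave-one-out mean squared error for least squares.

    One fit plus the hat-matrix diagonal replaces n separate refits:
    the LOO residual for point i is resid_i / (1 - h_ii).
    """
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # Leverages: diagonal of the hat matrix H = X (X'X)^{-1} X'.
    H = X @ np.linalg.solve(X.T @ X, X.T)
    h = np.diag(H)
    loo_resid = resid / (1.0 - h)
    return np.mean(loo_resid ** 2)
```

Checking against the naive procedure (delete one point, refit, predict it) confirms the formula is exact for ordinary least squares, which is why such criteria can drive model selection at negligible extra cost.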
Abstract:
This paper describes experiments relating to the perception of the roughness of simulated surfaces via the haptic and visual senses. Subjects used a magnitude estimation technique to judge the roughness of “virtual gratings” presented via a PHANToM haptic interface device, and a standard visual display unit. It was shown that under haptic perception, subjects tended to perceive roughness as decreasing with increased grating period, though this relationship was not always statistically significant. Under visual exploration, the exact relationship between spatial period and perceived roughness was less well defined, though linear regressions provided a reliable approximation to individual subjects’ estimates.
Abstract:
Texture and small-scale surface details are widely recognised as playing an important role in the haptic identification of objects. In order to simulate realistic textures in haptic virtual environments, it has become increasingly necessary to identify a robust technique for modelling surface profiles. This paper describes a method whereby Fourier series spectral analysis is employed to describe the measured surface profiles of several characteristic surfaces. The results presented suggest that a bandlimited Fourier series can provide a realistic approximation to surface amplitude profiles.
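The bandlimited-reconstruction idea can be sketched with a discrete Fourier transform: truncate the spectrum of a sampled height profile to its first few harmonics and resynthesise. This is an illustrative stand-in, not the paper's measurement pipeline, and the function name and harmonic counts are assumptions.

```python
import numpy as np

def fourier_profile(heights, n_terms):
    """Bandlimited Fourier approximation of a sampled surface profile.

    Keeps only the first n_terms harmonics of the real FFT and
    resynthesises the profile at the original sample points.
    """
    spectrum = np.fft.rfft(heights)
    spectrum[n_terms:] = 0.0          # bandlimit: discard higher harmonics
    return np.fft.irfft(spectrum, n=len(heights))
```

As more harmonics are retained the reconstruction error shrinks, and keeping the full spectrum recovers the profile exactly, which is the sense in which a bandlimited series "approximates" the measured surface.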
Abstract:
The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static, and hence poor, approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF, forecast error statistics are estimated in a subspace defined by an ensemble of states forecast by the dynamic model. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, producing accurate analyses and ultimately improved forecasts of extreme events.
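The merging step can be sketched as an orthogonal-projection blend: ensemble-derived statistics inside the subspace spanned by the centred ensemble anomalies, the static B in the orthogonal complement. This is a simplified illustration of the reduced-rank idea under those assumptions, not the formal EnRRKF construction.

```python
import numpy as np

def hybrid_covariance(B_static, ensemble):
    """Blend ensemble error statistics (inside the ensemble subspace)
    with a static covariance (in the orthogonal complement).

    ensemble: state_dim x n_members array of forecast states.
    """
    X = ensemble - ensemble.mean(axis=1, keepdims=True)  # centred anomalies
    P_ens = X @ X.T / (ensemble.shape[1] - 1)            # sample covariance
    # Orthonormal basis of the ensemble subspace via SVD.
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    Q = U[:, s > 1e-12 * s[0]]
    P_in = Q @ Q.T                                       # projector onto span(X)
    P_out = np.eye(B_static.shape[0]) - P_in             # complement projector
    return P_in @ P_ens @ P_in + P_out @ B_static @ P_out
```

Along any direction lying in the ensemble span the blended matrix reproduces the ensemble statistics exactly, while directions the small ensemble cannot see fall back on the static B.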
Abstract:
European economic and political integration has been recognised as having implications for patterns of performance in national real estate and capital markets and has generated a wide body of research and commentary. In 1999, progress towards monetary integration within the European Union culminated in the introduction of a common currency and monetary policy. This paper investigates the effects of this ‘event’ on the behaviour of stock returns in European real estate companies. A range of statistical tests is applied to the performance of European property companies to test for changes in segmentation, co-movement and causality. The results suggest that, relative to the wider equity markets, the dispersion of performance is higher, correlations are lower, a common contemporaneous factor has much lower explanatory power, whilst lead-lag relationships are stronger. Consequently, the evidence of transmission of monetary integration to real estate securities is less noticeable than to general securities. Less and slower integration is attributed to the relatively small size of the real estate securities market and the local and national nature of the majority of the companies’ portfolios.
Abstract:
Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised.
At 6 months’ follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0·58, 95% CI 0·38–0·89); a β blocker if they had asthma (0·73, 0·58–0·91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0·51, 0·34–0·78). PINCER has a 95% probability of being cost effective if the decision-maker’s ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.