892 results for Babylon revisited
Abstract:
Variational data assimilation in continuous time is revisited. The central techniques applied in this paper are in part adopted from the theory of optimal nonlinear control. Alternatively, the investigated approach can be considered a continuous-time generalization of what is known as weakly constrained four-dimensional variational assimilation (4D-Var) in the geosciences. The technique makes it possible to assimilate trajectories in the case of partial observations and in the presence of model error. Several mathematical aspects of the approach are studied. Computationally, it amounts to solving a two-point boundary value problem. For imperfect models, the trade-off between small dynamical error (i.e. the trajectory obeys the model dynamics) and small observational error (i.e. the trajectory closely follows the observations) is investigated. This trade-off turns out to be trivial if the model is perfect. However, even in this situation, allowing for minute deviations from the perfect model is shown to have a positive effect, namely to regularize the problem. The presented formalism is dynamical in character; no statistical assumptions on dynamical or observational noise are imposed.
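The trade-off between dynamical and observational error can be sketched numerically. The toy example below (a scalar linear model with invented parameter values, not the paper's formulation) minimizes a discrete cost that penalizes deviations from the model dynamics, weighted by 1/eps, together with deviations from the observations; a small eps drives the solution toward a trajectory that obeys the dynamics.

```python
import numpy as np
from scipy.optimize import minimize

# Toy weak-constraint assimilation: estimate a trajectory x_0..x_T for
# the scalar model x_{t+1} = a * x_t from noisy observations. All names
# and values (a, T, eps, noise level) are illustrative assumptions.
a, T, eps = 0.9, 20, 0.1
rng = np.random.default_rng(0)
truth = np.array([a**t for t in range(T + 1)])
obs = truth + 0.05 * rng.standard_normal(T + 1)

def cost(x):
    dyn = x[1:] - a * x[:-1]      # dynamical error (model residuals)
    obs_err = x - obs             # observational error
    return (1.0 / eps) * np.sum(dyn**2) + np.sum(obs_err**2)

res = minimize(cost, obs, method="L-BFGS-B")
x_hat = res.x                     # assimilated trajectory
```

With eps small, the dynamical residual of the assimilated trajectory is far smaller than that of the raw observations, at the price of a nonzero observational error.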
Abstract:
The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption not viable for most, if any, real-world ensembles. An alternative is to consider an ensemble merely a source of information rather than the possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called ‘affine kernel dressing’ (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness given operational constraints and statistical significance given a large sample.
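The ingredients named above can be combined in a minimal sketch: Gaussian kernels are placed on an affinely mapped ensemble and mixed with a climatological distribution. The parameter values here (a, b, sigma, w, and the climatology) are invented and held fixed, whereas in AKD they are fitted jointly from training data.

```python
import numpy as np

# Minimal sketch of affine kernel dressing: affine map a*x + b applied
# to the whole ensemble, Gaussian kernels on the mapped members, and a
# climatological mixture component with weight w. Illustrative only.
def akd_density(y, ensemble, a=1.0, b=0.0, sigma=1.0,
                w=0.1, clim_mean=0.0, clim_std=2.0):
    z = a * np.asarray(ensemble, dtype=float) + b   # mapped ensemble
    kernels = np.exp(-0.5 * ((y - z) / sigma) ** 2) / (
        sigma * np.sqrt(2 * np.pi))
    clim = np.exp(-0.5 * ((y - clim_mean) / clim_std) ** 2) / (
        clim_std * np.sqrt(2 * np.pi))
    return (1 - w) * kernels.mean() + w * clim      # mixture density

density = akd_density(0.3, [0.1, 0.4, 0.2, 0.5])
```

The result is a proper probability density (it integrates to one), regardless of how over- or underdispersive the raw ensemble is.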
Abstract:
Aerosol indirect effects continue to constitute one of the most important uncertainties for anthropogenic climate perturbations. Within the international AEROCOM initiative, the representation of aerosol-cloud-radiation interactions in ten different general circulation models (GCMs) is evaluated using three satellite datasets. The focus is on stratiform liquid water clouds, since most GCMs do not include ice nucleation effects, and none of the models explicitly parameterises aerosol effects on convective clouds. We compute statistical relationships between aerosol optical depth (τa) and various cloud and radiation quantities in a manner that is consistent between the models and the satellite data. It is found that the model-simulated influence of aerosols on cloud droplet number concentration (Nd) compares relatively well to the satellite data, at least over the ocean. The relationship between τa and liquid water path is simulated much too strongly by the models. This suggests that the implementation of the second aerosol indirect effect, mainly in terms of an autoconversion parameterisation, has to be revisited in the GCMs. A positive relationship between total cloud fraction (fcld) and τa, as found in the satellite data, is simulated by the majority of the models, albeit less strongly than in the satellite data in most of them. In a discussion of the hypotheses proposed in the literature to explain the satellite-derived strong fcld–τa relationship, our results indicate that none can be identified as a unique explanation. Relationships similar to the ones found in satellite data between τa and cloud top temperature or outgoing long-wave radiation (OLR) are simulated by only a few GCMs. The GCMs that simulate a negative OLR–τa relationship show a strong positive correlation between τa and fcld.
The short-wave total aerosol radiative forcing as simulated by the GCMs is strongly influenced by the simulated anthropogenic fraction of τa and by parameterisation assumptions such as a lower bound on Nd. Nevertheless, the strengths of the statistical relationships are good predictors for the aerosol forcings in the models. An estimate of the total short-wave aerosol forcing inferred from the combination of these predictors for the modelled forcings with the satellite-derived statistical relationships yields a global annual mean value of −1.5 ± 0.5 W m−2. In an alternative approach, the radiative flux perturbation due to anthropogenic aerosols can be broken down into a component over the cloud-free portion of the globe (approximately the aerosol direct effect) and a component over the cloudy portion of the globe (approximately the aerosol indirect effect). An estimate obtained by scaling these simulated clear- and cloudy-sky forcings with estimates of anthropogenic τa and satellite-retrieved Nd–τa regression slopes, respectively, yields a global, annual-mean aerosol direct effect estimate of −0.4 ± 0.2 W m−2 and a cloudy-sky (aerosol indirect effect) estimate of −0.7 ± 0.5 W m−2, with a total estimate of −1.2 ± 0.4 W m−2.
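The Nd–τa regression slopes referred to above are ordinary least-squares fits in log-log space, computed identically for models and satellite retrievals. The following sketch fits such a slope on synthetic data with a built-in slope of 0.4; the data-generating model and all numbers are invented for illustration.

```python
import numpy as np

# Synthetic tau_a / Nd pairs with a known log-log slope of 0.4, plus
# lognormal noise; np.polyfit recovers slope and intercept of
# ln(Nd) = slope * ln(tau_a) + intercept.
rng = np.random.default_rng(1)
tau_a = rng.uniform(0.05, 0.5, 500)               # aerosol optical depth
nd = 80.0 * tau_a ** 0.4 * np.exp(0.1 * rng.standard_normal(500))
slope, intercept = np.polyfit(np.log(tau_a), np.log(nd), 1)
```

On real data the same slope, computed from the satellite retrievals, is what scales the simulated cloudy-sky forcing in the estimate above.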
Abstract:
In recent years, research into the impact of genetic abnormalities on cognitive development, including language, has become recognized for its potential to make valuable contributions to our understanding of the brain–behaviour relationships underlying language acquisition as well as to understanding the cognitive architecture of the human mind. The publication of Fodor’s (1983) book The Modularity of Mind has had a profound impact on the study of language and the cognitive architecture of the human mind. Its central claim is that many of the processes involved in comprehension are undertaken by special brain systems termed ‘modules’. This domain specificity of language, or modularity, has become a fundamental feature that differentiates competing theories and accounts of language acquisition (Fodor 1983, 1985; Levy 1994; Karmiloff-Smith 1998). However, although the fact that the adult brain is modularized is hardly disputed, there are different views of how brain regions become specialized for specific functions. A question of some interest to theorists is whether the human brain is modularized from the outset (nativist view) or whether these distinct brain regions develop as a result of biological maturation and environmental input (neuroconstructivist view). One source of insight into these issues has been the study of developmental disorders, and in particular genetic syndromes, such as Williams syndrome (WS) and Down syndrome (DS). Because of their uneven profiles characterized by dissociations of different cognitive skills, these syndromes can help us address theoretically significant questions. Investigations into the linguistic and cognitive profiles of individuals with these genetic abnormalities have been used as evidence to advance theoretical views about innate modularity and the cognitive architecture of the human mind. The present chapter will be organized as follows.
To begin, two different theoretical proposals in the modularity debate will be presented. Then studies of linguistic abilities in WS and in DS will be reviewed. Here, the emphasis will be mainly on WS, because theoretical debates have focused primarily on WS, there is a larger body of literature on WS, and DS subjects have typically been used for the purposes of comparison. Finally, the modularity debate will be revisited in light of the literature review of both WS and DS. Conclusions will be drawn regarding the contribution of these two genetic syndromes to the issue of cognitive modularity, and in particular innate modularity.
Abstract:
Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as the woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without a flocking mechanism. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distribution of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce all three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence.
We conclude that with the memory-based foraging strategy including a flocking mechanism our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
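The memory-plus-flocking idea can be sketched in a few lines: each bird scores fields by its remembered intake plus an attraction to fields already holding other birds, with occasional exploration. This is a drastically simplified caricature of the IBM described above; the field-quality model, memory update rule, and all parameter values are invented for illustration.

```python
import random

# Toy memory-based foraging with flocking: birds pick the field with
# the best score (remembered intake + small flock-attraction bonus),
# occasionally exploring a random field, and update memory with the
# intake actually experienced there.
random.seed(42)
N_FIELDS, N_BIRDS, DAYS = 20, 30, 15
quality = [random.random() for _ in range(N_FIELDS)]     # food per field
memory = [[0.5] * N_FIELDS for _ in range(N_BIRDS)]      # remembered intake
location = [random.randrange(N_FIELDS) for _ in range(N_BIRDS)]

visited_per_day = []
for day in range(DAYS):
    counts = [location.count(f) for f in range(N_FIELDS)]
    for b in range(N_BIRDS):
        if random.random() < 0.1:                        # exploration
            location[b] = random.randrange(N_FIELDS)
        else:                                            # memory + flocking
            scores = [memory[b][f] + 0.005 * counts[f]
                      for f in range(N_FIELDS)]
            location[b] = max(range(N_FIELDS), key=scores.__getitem__)
        intake = quality[location[b]]
        memory[b][location[b]] = (0.7 * memory[b][location[b]]
                                  + 0.3 * intake)
    visited_per_day.append(len(set(location)))
```

Statistics such as the number of distinct fields used per day, or the proportion of fields revisited on subsequent days, can then be compared against field data in the pattern-oriented-modelling spirit.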
Abstract:
According to the principle of copyright exhaustion, once a copy of a work is placed on the market, the right holder’s control over further distribution of that copy is exhausted. Unlike the distribution of hard copies of copyright works, however, the electronic dissemination of content is not subject to the exhaustion principle. This means that second-hand markets of digital goods cannot exist. Traditionally, exhaustion is premised on four assumptions that cannot be taken for granted in the online context: it applies to tangible copies only; it covers goods and not services; the goods should be sold, not licensed; and the property entitlement should be alienated upon transfer. After long jurisprudential silence, courts worldwide have revisited these normative impediments to affirm that exhaustion can apply online in specific instances. The article discusses the doctrinal norms that underpin exhaustion and determines the conditions under which online copyright exhaustion can apply.
Abstract:
Many studies have widely accepted the assumption that learning is promoted when teaching styles and learning styles are well matched. In this study, the relationships between learning styles, learning patterns, gender as a selected demographic feature, and learners’ performance were quantitatively investigated in a blended learning setting. This environment adopts a traditional ‘one-size-fits-all’ teaching approach that does not consider individual users’ preferences and attitudes. Hence, it can provide evidence about the value of taking such factors into account in Adaptive Educational Hypermedia Systems (AEHSs). Felder and Soloman’s Index of Learning Styles (ILS) was used to identify the learning styles of 59 undergraduate students at the University of Babylon. Five hypotheses were investigated in the experiment. Our findings show that several of the assessed factors had no statistically significant effect. However, the processing dimension, the total number of hits on the course website, and gender had statistically significant effects on learners’ performance. These findings need further investigation to identify the factors affecting students’ achievement that should be considered in Adaptive Educational Hypermedia Systems (AEHSs).
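A significance test of the kind reported above can be illustrated briefly: does the number of hits on the course website relate significantly to performance? The data below are synthetic (59 students, matching the study's sample size); the variables, effect sizes, and analysis of the actual study differ.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic example: performance depends weakly on website hits plus
# noise; pearsonr gives the correlation and a p-value for the null
# hypothesis of no relation. All numbers are illustrative assumptions.
rng = np.random.default_rng(7)
hits = rng.poisson(120, 59)                       # hits on course website
performance = 40 + 0.3 * hits + 5 * rng.standard_normal(59)
r, p = pearsonr(hits, performance)                # significant if p < 0.05
```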
Abstract:
Lifestyle factors are responsible for a considerable portion of cancer incidence worldwide, but credible estimates from the World Health Organization and the International Agency for Research on Cancer (IARC) suggest that the fraction of cancers attributable to toxic environmental exposures is between 7% and 19%. To explore the hypothesis that low-dose exposures to mixtures of chemicals in the environment may combine to contribute to environmental carcinogenesis, we reviewed 11 hallmark phenotypes of cancer, multiple priority target sites for disruption in each area, and prototypical chemical disruptors for all targets; this included dose-response characterizations, evidence of low-dose effects and cross-hallmark effects for all targets and chemicals. In total, 85 examples of chemicals were reviewed for actions on key pathways/mechanisms related to carcinogenesis. Only 15% (13/85) were found to have evidence of a dose-response threshold, whereas 59% (50/85) exerted low-dose effects. No dose-response information was found for the remaining 26% (22/85). Our analysis suggests that the cumulative effects of individual (non-carcinogenic) chemicals acting on different pathways, and on a variety of related systems, organs, tissues and cells, could plausibly conspire to produce carcinogenic synergies. Additional basic research on carcinogenesis, and research focused on low-dose effects of chemical mixtures, needs to be rigorously pursued before the merits of this hypothesis can be further advanced. However, the structure of the World Health Organization International Programme on Chemical Safety 'Mode of Action' framework should be revisited, as it has inherent weaknesses that are not fully aligned with our current understanding of cancer biology.
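The percentages quoted above follow directly from the reported counts over the 85 reviewed chemicals:

```python
# Counts from the review: 13 chemicals with a dose-response threshold,
# 50 with low-dose effects, 22 with no dose-response information.
counts = {"threshold": 13, "low_dose": 50, "no_data": 22}
total = sum(counts.values())                       # 85 chemicals in total
shares = {k: round(100 * v / total) for k, v in counts.items()}
# shares -> {'threshold': 15, 'low_dose': 59, 'no_data': 26}
```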
Abstract:
Aims. Although the time of the Maunder minimum (1645–1715) is widely known as a period of extremely low solar activity, it is still debated whether solar activity during that period might have been moderate, or even higher than in the current solar cycle (number 24). We have revisited all existing evidence and datasets, both direct and indirect, to assess the level of solar activity during the Maunder minimum. Methods. We discuss the East Asian naked-eye sunspot observations, the telescopic solar observations, the fraction of sunspot active days, the latitudinal extent of sunspot positions, auroral sightings at high latitudes, and cosmogenic radionuclide data, as well as solar eclipse observations for that period. We also consider peculiar features of the Sun (very strong hemispheric asymmetry of the sunspot locations, unusual differential rotation and the lack of a K-corona) that imply a special mode of solar activity during the Maunder minimum. Results. The level of solar activity during the Maunder minimum is reassessed on the basis of all available datasets. Conclusions. We conclude that solar activity was indeed at an exceptionally low level during the Maunder minimum. Although the exact level is still unclear, it was definitely lower than during the Dalton minimum of around 1800 and significantly below that of the current solar cycle (number 24). Claims of a moderate-to-high level of solar activity during the Maunder minimum are rejected with a high confidence level.
Abstract:
The notions of resolution and discrimination of probability forecasts are revisited. It is argued that the common concept underlying both resolution and discrimination is the dependence (in the sense of probability theory) of forecasts and observations. More specifically, a forecast has no resolution if and only if it has no discrimination, which in turn holds if and only if forecast and observation are stochastically independent. A statistical test for independence is thus also a test for no resolution and, at the same time, for no discrimination. The resolution term in the decomposition of the logarithmic scoring rule and the area under the Receiver Operating Characteristic (ROC) curve are investigated in this light.
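The central point can be illustrated concretely: a standard test of forecast-observation independence (here a chi-squared test on a 2x2 contingency table, one reasonable choice among several) simultaneously serves as a test for no resolution and for no discrimination. The binary forecasts and observations below are synthetic.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
obs = rng.integers(0, 2, 1000)
# A forecast that depends on the observation (has resolution) ...
skilful = np.where(rng.random(1000) < 0.8, obs, 1 - obs)
# ... and one generated independently of it (no resolution).
useless = rng.integers(0, 2, 1000)

def independence_p(forecast, observation):
    """p-value of a chi-squared test on the 2x2 contingency table."""
    table = np.zeros((2, 2))
    for f, o in zip(forecast, observation):
        table[f, o] += 1
    return chi2_contingency(table)[1]

p_skilful = independence_p(skilful, obs)   # tiny: independence rejected
p_useless = independence_p(useless, obs)   # no evidence of dependence expected
```

Rejecting independence for the first forecast is exactly rejecting "no resolution" and "no discrimination" at once.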
Abstract:
Purpose – The purpose of this paper is to introduce the debate forum on internationalization motives of this special issue of Multinational Business Review. Design/methodology/approach – The authors reflect on the background and evolution of the internationalization motives over the past few decades, and then provide suggestions for how to use the motives in future analyses. The authors also reflect on the contributions of the accompanying articles of the forum to the debate. Findings – There continue to be new developments in the way in which firms organize themselves as multinational enterprises (MNEs), and this implies that the “classic” motives originally introduced by Dunning in 1993 need to be revisited. Dunning’s motives and arguments were deductive and atheoretical; they were intended as a toolkit, to be used in conjunction with other theories and frameworks, and they are not an alternative to a classification of possible MNE strategies. Originality/value – This paper and the ones that accompany it provide a deeper and more nuanced understanding of internationalization motives for future research to build on.
Abstract:
Quiescin Q6/sulfhydryl oxidases (QSOX) are thiol oxidases considered to be involved in oxidative protein folding, cell cycle control and extracellular matrix remodeling. They contain thioredoxin domains and introduce disulfide bonds into proteins and peptides, with concomitant hydrogen peroxide formation, likely altering the redox environment. Since it is known that several developmental processes are regulated by the redox state, here we assessed whether QSOX could have a role during mouse fetal development. For this purpose, an antibody against recombinant mouse QSOX was produced and characterized. In E13.5 and E16.5 fetal tissues, QSOX immunostaining was confined to mesoderm- and ectoderm-derived tissues, while in P1 neonatal tissues it extended slightly to some endoderm-derived tissues. QSOX expression, particularly by epithelial tissues, seemed to be developmentally regulated, increasing with tissue maturation. QSOX was observed in loose connective tissues at all stages analyzed, intracellularly and possibly extracellularly, in agreement with its putative roles in oxidative folding and extracellular matrix remodeling. In conclusion, QSOX is expressed in several tissues during mouse development, preferentially in those derived from mesoderm and ectoderm, suggesting it could be of relevance during developmental processes.
Abstract:
Intestinal ischemia-reperfusion (I/R) injury may cause acute systemic and lung inflammation. Here, we revisited the role of TNF-alpha in an intestinal I/R model in mice; using neutralizing TNF-alpha antibodies and TNF ligand-deficient mice, we show that this cytokine is not required for the local and remote inflammatory response upon intestinal I/R injury. We demonstrate increased neutrophil recruitment in the lung, as assessed by myeloperoxidase activity, and augmented IL-6, granulocyte colony-stimulating factor, and KC levels, whereas TNF-alpha levels were not increased in serum and only minimally elevated in intestine and lung upon intestinal I/R injury. Importantly, TNF-alpha antibody neutralization diminished neither neutrophil recruitment nor any of the cytokines and chemokines evaluated. In addition, the inflammatory response was not abrogated in mice deficient in TNF or in TNF receptors 1 and 2. However, in view of the damage to the intestinal barrier upon intestinal I/R, with systemic bacterial translocation, we asked whether Toll-like receptor (TLR) activation drives the inflammatory response. In fact, the inflammatory lung response is dramatically reduced in TLR2/4-deficient mice, confirming an important role of TLR signaling in causing the inflammatory lung response. In conclusion, endogenous TNF-alpha is not, or only minimally, elevated and plays no role as a mediator of the inflammatory response upon ischemic tissue injury. By contrast, TLR2/4 signaling induces an orchestrated cytokine/chemokine response leading to local and remote pulmonary inflammation; disruption of TLR signaling may therefore represent an alternative therapeutic target.