Abstract:
We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km−2 yr−1) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage areas of 475 to 70,000 km²), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to overpredict export in less agricultural, more forested watersheds and underpredict it in more agricultural basins. The magnitude of these biases differed appreciably among the models. Models with more detailed descriptions of nitrogen sources, land and water attenuation of nitrogen, and water flow paths had considerably lower bias and higher precision in their predictions of nitrogen export.
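A minimal sketch, not the study's code, of the kind of accuracy check the abstract describes: percent prediction error of modelled nitrogen export against measured export, and its correlation with cultivated land area. All values below are hypothetical.

```python
import numpy as np

# Hypothetical measured and modelled N export for a few watersheds (kg km-2 yr-1).
measured = np.array([520.0, 310.0, 880.0, 450.0])
predicted = np.array([610.0, 405.0, 790.0, 430.0])
cultivated_frac = np.array([0.05, 0.10, 0.35, 0.20])  # fraction of watershed cultivated

# Percent error; positive values indicate overprediction.
pct_error = 100.0 * (predicted - measured) / measured

# A negative correlation mirrors the reported tendency to overpredict in forested
# watersheds and underpredict in agricultural ones.
r = np.corrcoef(cultivated_frac, pct_error)[0, 1]
print(pct_error.round(1), round(r, 2))
```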
Landscape, regional and global estimates of nitrogen flux from land to sea: errors and uncertainties
Abstract:
Regional to global scale modelling of N flux from land to ocean has progressed to date through the development of simple empirical models representing bulk N flux rates from large watersheds, regions, or continents on the basis of a limited selection of model parameters. Watershed scale N flux modelling has produced a range of physically based approaches, from models in which N flux rates are predicted through a physical representation of the processes involved to catchment scale models that provide a simplified representation of true system behaviour. Generally, these watershed scale models describe within their structure the dominant process controls on N flux at the catchment or watershed scale, and take into account variations in the extent to which these processes control N flux rates as a function of landscape sensitivity to N cycling and export. This paper addresses the nature of the errors and uncertainties inherent in existing regional to global scale models, and the nature of error propagation associated with upscaling from small catchment to regional scale, through a suite of spatial aggregation and conceptual lumping experiments conducted on a validated watershed scale model, the export coefficient model. Results from the analysis support the findings of other researchers developing macroscale models in allied research fields. Conclusions from the study confirm that reliable and accurate regional scale N flux modelling needs to take account of the heterogeneity of landscapes and the impact that this has on N cycling processes within homogeneous landscape units.
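A minimal sketch of an export coefficient calculation, the class of model named above: total N flux is the sum over land-use classes of area times an export coefficient. The areas and coefficients here are hypothetical, and the published model also handles inputs (e.g. livestock, atmospheric deposition) not shown.

```python
# Hypothetical land-use areas (km2) and export coefficients (kg N km-2 yr-1).
land_use_km2 = {"arable": 120.0, "grassland": 200.0, "woodland": 80.0}
export_kg_per_km2 = {"arable": 1400.0, "grassland": 600.0, "woodland": 150.0}

# Export coefficient model: L = sum_i E_i * A_i over land-use classes i.
total_n_kg = sum(land_use_km2[c] * export_kg_per_km2[c] for c in land_use_km2)
print(f"Predicted N flux: {total_n_kg:.0f} kg yr-1")
```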
Abstract:
Sampling strategies for monitoring the status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, and so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information in the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance. Abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends and their variances are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than with a strategy in which all of the sample is retained or all of it is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
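A minimal sketch, not the authors' algorithm, of the mixed selection step described above: retain part of the previous sample by simple random sampling and draw the remainder with selection probability proportional to predicted abundance. Unit indices and predictions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
prev_sample = np.array([3, 7, 12, 18, 25])        # units surveyed last time (hypothetical)
pred_abundance = rng.gamma(2.0, 10.0, size=40)    # model-based predictions for all 40 units

n_retain, n_new = 3, 4
# Phase 1: simple random retention from the previous sample.
retained = rng.choice(prev_sample, size=n_retain, replace=False)

# Phase 2: probability-proportional-to-prediction draw over the remaining units.
candidates = np.setdiff1d(np.arange(40), retained)
p = pred_abundance[candidates] / pred_abundance[candidates].sum()
new_units = rng.choice(candidates, size=n_new, replace=False, p=p)

print(sorted(retained.tolist() + new_units.tolist()))
```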
Abstract:
An interdisciplinary theoretical framework is proposed for analysing justice in global working conditions. In addition to gender and race as popular criteria to identify disadvantaged groups in organizations, in multinational corporations (MNCs) local employees (i.e. host country nationals (HCNs) working in foreign subsidiaries) deserve special attention. Their working conditions are often substantially worse than those of expatriates (i.e. parent country nationals temporarily assigned to a foreign subsidiary). Although a number of reasons have been put forward to justify such inequalities—usually with efficiency goals in mind—recent studies have used equity theory to question the extent to which they are perceived as fair by HCNs. However, since perceptual equity theory has limitations, this study develops an alternative and non-perceptual framework for analysing such inequalities. Employment discrimination theory and elements of Rawls’s ‘Theory of Justice’ are the theoretical pillars of this framework. This article discusses the advantages of this approach for MNCs and identifies some expatriation practices that are fair according to our non-perceptual justice standards, whilst also reasonably (if not highly) efficient.
Abstract:
Placing a child in out-of-home care is one of the most important decisions made by professionals in the child care system, with substantial social, psychological, educational, medical and economic consequences. This paper considers the challenges and difficulties of building statistical models of this decision by reviewing the available international evidence. Despite the large number of empirical investigations over a 50-year period, a consensus on the variables associated with this decision is hard to identify. In addition, the individual models have low explanatory and predictive power and should not be relied on to make placement decisions. A number of reasons for this poor performance are offered, and some ways forward are suggested. This paper also aims to facilitate the emergence of a coherent and integrated international literature from the disconnected and fragmented empirical studies. Rather than one placement problem, there are many slightly different problems, and therefore it is expected that a number of related sub-literatures will emerge, each concentrating on a particular definition of the placement problem.
Abstract:
The policy context for mother-tongue educators at all levels in England has been dominated by a matrix with four key elements, running along two spectra: one of learning (content → assessment) and one of teaching (autonomy → accountability). In each case the trend has been towards increasing external control and decreasing professional autonomy. Whilst some imposed changes have been recognised as intrinsically valuable, the majority are viewed as detrimental to teachers' status and obstructive for students. The research community has been largely marginalised and has had little scope to influence proceedings. A rapidly developing crisis in teacher retention may yet reverse these trends as the government is forced to recognise the long-term implications of its treatment of the profession.
Abstract:
OBJECTIVES: The prediction of protein structure and the precise understanding of protein folding and unfolding processes remain among the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. METHODS: To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis. RESULTS: To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and metadata of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse. CONCLUSIONS: Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.
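A minimal sketch, assuming a simple relational layout, of the kind of trajectory metadata such a warehouse might hold; the table and column names are hypothetical, not the system's actual schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE simulation (
    sim_id      INTEGER PRIMARY KEY,
    protein     TEXT,      -- e.g. 'WT-TTR' or 'L55P-TTR'
    temperature REAL,      -- K
    length_ns   REAL
);
CREATE TABLE frame (
    sim_id  INTEGER REFERENCES simulation(sim_id),
    t_ps    REAL,
    rmsd_nm REAL           -- one derived property per stored conformation
);
""")
con.execute("INSERT INTO simulation VALUES (1, 'L55P-TTR', 498.0, 10.0)")
con.execute("INSERT INTO frame VALUES (1, 100.0, 0.35)")
# Example query: how many stored frames per protein variant.
print(con.execute(
    "SELECT protein, COUNT(*) FROM simulation JOIN frame USING (sim_id) GROUP BY protein"
).fetchall())
```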
Abstract:
The development of novel molecules for the creation of nanometer structures with specific properties has been the focus of this research. We have developed a set of molecules from hydrophobic omega- and alpha-amino acids by protecting the -NH₂ with the Boc (t-butyloxycarbonyl) group and the -CO₂H as a para-nitroanilide, such as BocHN-Xx-CONH-(p-NO₂)·C₆H₄, where Xx is gamma-aminobutyric acid (gamma-Abu), L-isoleucine, alpha-aminoisobutyric acid, proline, etc. These molecules generate various nanometer structures, such as nanofibrils, nanotubes and nanovesicles, in methanol/water through the self-assembly of bilayers in which the nitrobenzene moieties are stacked in the middle and the Boc-protected amino acid parts are packed on the outer surface. The bilayers can be further stacked one over the other through hydrophobic interactions to form multilayer structures, which helps to generate different kinds of nanoscopic structures. The formation of the nanostructures is facilitated by the participation of various noncovalent interactions, such as hydrophobic interactions, hydrogen bonding and aromatic π-stacking interactions. Fluorescence microscopy and UV studies reveal that the nanovesicles generated from the Pro-based molecule can encapsulate dye molecules, which can be released by addition of acid (at pH 2). These single amino acid based molecules are both easy to synthesize and cost-effective and therefore offer novel scaffolds for the future design of nanoscale structures.
Abstract:
A synthesis of global climate model results and inferences from proxy records suggests an increased sea surface temperature gradient between the tropical Indian and Pacific Oceans during medieval times.
The use of stalagmite geochemistry to detect past volcanic eruptions and their environmental impacts
Abstract:
We analyze the choice between the origin and destination principles of taxation when there is product differentiation and Bertrand competition. If taxes are redistributed to consumers and demand is linear, the origin principle dominates the destination principle whatever the degree of product differentiation and extent of economic integration. With nonlinear demand, the origin principle dominates if there is sufficient economic integration. When the social value assigned to tax revenue is higher than the private value, the destination principle dominates for intermediate values of product differentiation and economic integration. The same results are also shown to hold with Cournot competition.
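A hedged sketch, in our own notation rather than the authors', of the class of setup the abstract describes: two countries, one firm in each, differentiated products, Bertrand pricing, production costs normalised to zero, and a unit commodity tax t_i in country i. The two principles differ only in whose tax is netted out of the consumer price.

```latex
\begin{align*}
  \text{Demand for firm } i\text{'s variety in market } j:&\quad
      q_{ij} = a - p_{ij} + c\,p_{kj}, \qquad k \neq i,\ c \in [0,1),\\
  \text{Destination principle (market country's tax):}&\quad
      \pi_i = \sum_{j}\bigl(p_{ij} - t_j\bigr)\,q_{ij},\\
  \text{Origin principle (producing country's tax):}&\quad
      \pi_i = \sum_{j}\bigl(p_{ij} - t_i\bigr)\,q_{ij}.
\end{align*}
```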
Abstract:
Interest in the impacts of climate change is ever increasing. This is particularly true of the water sector, where understanding potential changes in the occurrence of both floods and droughts is important for strategic planning. Climate variability has been shown to have a significant impact on UK climate, and accounting for it in future climate change projections is essential to fully anticipate potential future impacts. In this paper a new resampling methodology is developed which includes the variability of both baseline and future precipitation. The resampling methodology is applied to 13 CMIP3 climate models for the 2080s, resulting in an ensemble of monthly precipitation change factors. The change factors are applied to the Eden catchment in eastern Scotland, with analysis undertaken of the sensitivity of future river flows to the changes in precipitation. Climate variability is shown to influence the magnitude and direction of change of both precipitation and, in turn, river flow, effects that are not apparent without the use of the resampling methodology. The transformation of precipitation changes to river flow changes displays a degree of non-linearity due to the catchment's role in buffering the response. The resampling methodology developed in this paper provides a new technique for creating climate change scenarios which incorporates the important issue of climate variability.
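A minimal sketch, not the paper's resampling method, of how an ensemble of monthly precipitation change factors could be formed by resampling both baseline and future series before taking the ratio of monthly means. The series below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.gamma(4.0, 20.0, size=(30, 12))   # 30 years x 12 months, mm (synthetic)
future = rng.gamma(4.0, 23.0, size=(30, 12))     # projected 2080s series (synthetic)

def change_factors(base, fut, rng):
    # Resample years with replacement from each period so climate variability
    # is folded into the factors, then take the ratio of monthly means.
    b = base[rng.integers(0, base.shape[0], base.shape[0])]
    f = fut[rng.integers(0, fut.shape[0], fut.shape[0])]
    return f.mean(axis=0) / b.mean(axis=0)

ensemble = np.array([change_factors(baseline, future, rng) for _ in range(1000)])
print(ensemble.mean(axis=0).round(2))   # one multiplicative factor per calendar month
```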
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous environmental pollutants that frequently accumulate in soils. There is therefore a requirement to determine their levels in contaminated environments for the purpose of assessing impacts on human health. PAHs are a suite of individual chemicals, and there is an ongoing debate as to the most appropriate method for assessing the risk to humans from them. Two methods predominate: the surrogate marker approach and the toxic equivalency factor. The former assumes that all chemicals in a mixture have an equivalent toxicity. The toxic equivalency approach estimates the potency of individual chemicals relative to benzo(a)pyrene (BaP), usually the most toxic. The surrogate marker approach is believed to overestimate risk and the toxic equivalency factor to underestimate risk. When analysing the risks from soils, the surrogate marker approach is preferred for its simplicity, but there are concerns because of the potential diversity of the PAH profile across the range of impacted soils. Using two independent data sets containing soils from 274 sites across a diverse range of locations, statistical analysis was undertaken to determine the differences in the composition of carcinogenic PAH between site locations, for example, rural versus industrial. Following principal components analysis, distinct population differences were not seen between site locations in spite of large differences in the total PAH burden between individual sites. Using all data, highly significant correlations were seen between BaP and other carcinogenic PAHs, with the majority of r² values > 0.8. Correlations with the European Food Safety Authority (EFSA) summed groups, that is, EFSA2, EFSA4 and EFSA8, were even higher (r² > 0.95). We therefore conclude that BaP is a suitable surrogate marker to represent mixtures of PAH in soil during risk assessments.
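A minimal sketch of the correlation check described above: compute r² between benzo(a)pyrene and the summed concentration of other carcinogenic PAHs. The data below are synthetic, not the study's 274 soils, and are correlated by construction purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
bap = rng.lognormal(mean=0.0, sigma=1.0, size=274)           # mg/kg, synthetic
other_pah_sum = 5.0 * bap * rng.lognormal(0.0, 0.15, 274)    # synthetic, correlated with BaP

# Correlate on log scale, as concentrations span orders of magnitude.
r = np.corrcoef(np.log(bap), np.log(other_pah_sum))[0, 1]
print(f"r^2 = {r**2:.2f}")   # values > 0.8 would support BaP as a surrogate marker
```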