967 results for Palaeomagnetism Applied to Tectonics
Abstract:
This article analyzes the different forms of library cooperation and the different types of relations that can be established for collaboration among libraries.
Abstract:
This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis); and, further, one that provides a statistical explanation of efficiency, as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as being dependent on capital investment in three major areas (viz. land, machinery and office appliances), where land is intended to proxy the effects of unproductive, speculative capital investment, and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive production in manufacturing from 1998 to 2002.
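For orientation, a minimal sketch of the kind of single-stage specification the abstract refers to (in the spirit of the Battese and Coelli formulation; the symbols here are illustrative, not the paper's notation):

\[
\ln y_i = \mathbf{x}_i'\boldsymbol{\beta} + v_i - u_i,
\qquad v_i \sim N(0,\sigma_v^2),
\qquad u_i \sim N^{+}\!\bigl(\mathbf{z}_i'\boldsymbol{\delta},\, \sigma_u^2\bigr),
\]

where y_i is output, x_i the production inputs, v_i symmetric noise, u_i >= 0 technical inefficiency, and z_i the inefficiency covariates (here, the land, machinery and office-appliance investment proxies). Estimating all parameters jointly by maximum likelihood, rather than regressing efficiency scores on covariates afterwards, is what makes the procedure "single stage".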
Abstract:
When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, where the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and results compared to the approximate solution. An ablation problem is also analysed and results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem where the boundary temperature increases exponentially is analysed. This highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed, without analytical or numerical results against which to assess the accuracy.
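For a concrete picture of the underlying idea (a minimal sketch, not the authors' code): take the standard thermal problem u_t = u_xx on a semi-infinite bar with a fixed boundary temperature, use the usual profile u = (1 - x/delta)^n, and choose n by minimising the integrated squared residual of the heat equation. The particular error measure, search bounds and SciPy calls below are assumptions for illustration only.

import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def residual_error(n, t=1.0):
    # Profile u(x,t) = (1 - x/delta)**n on 0 <= x <= delta, with
    # delta(t) = sqrt(2*n*(n+1)*t) from the heat balance integral (alpha = 1).
    delta = np.sqrt(2.0 * n * (n + 1.0) * t)
    ddelta_dt = n * (n + 1.0) / delta          # since delta * delta' = n(n+1)
    def residual_sq(x):
        s = 1.0 - x / delta
        u_t = n * s ** (n - 1.0) * x * ddelta_dt / delta ** 2
        u_xx = n * (n - 1.0) * s ** (n - 2.0) / delta ** 2
        return (u_t - u_xx) ** 2
    return quad(residual_sq, 0.0, delta)[0]

# Pick the exponent that minimises the residual at a representative time.
best = minimize_scalar(residual_error, bounds=(2.0, 4.0), method="bounded")
print(f"exponent minimising the residual: n = {best.x:.3f}")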
Abstract:
We evaluated the feasibility of using faeces as a non-invasively collected DNA source for the genetic study of an endangered bird population (capercaillie; Tetrao urogallus). We used a multitube approach, and for our panel of 11 microsatellites, genotyping reliability was estimated at 98% with five repetitions. Experiments showed that free DNases in faecal material were the major cause of DNA degradation. Our results demonstrate that, with avian faeces as a source of DNA, reliable microsatellite genotyping can be obtained with a reasonable number of PCR replicates.
Abstract:
In this paper the two main drawbacks of the heat balance integral methods are examined. Firstly, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods (the combined integral method, CIM) to determine the power of the highest-order term leads to the same, or more often greatly improved, accuracy compared with standard methods. Secondly, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the CIM, than the polynomial profile. Analysis primarily focuses on a specified constant boundary temperature and is then extended to constant flux, Newton cooling and time-dependent boundary conditions.
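For context, the two integral conditions being combined can be stated in a standard generic form (not a quotation from the paper), with alpha the thermal diffusivity, delta(t) the heat penetration depth, and u(delta,t) = du/dx(delta,t) = 0 imposed at the front:

\[
\frac{d}{dt}\int_0^{\delta} u\,dx \;=\; -\,\alpha \left.\frac{\partial u}{\partial x}\right|_{x=0},
\qquad
\frac{d}{dt}\int_0^{\delta}\!\!\int_x^{\delta} u\,dx'\,dx \;=\; \alpha\, u(0,t).
\]

Requiring a single approximating profile to satisfy both conditions simultaneously supplies the extra relation used to fix the exponent of the highest-order term.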
Abstract:
In this paper, we present a stochastic model for disability insurance contracts. The model is based on a discrete time non-homogeneous semi-Markov process (DTNHSMP) to which the backward recurrence time process is introduced. This permits a more exhaustive study of disability evolution and a more efficient approach to the duration problem. The use of semi-Markov reward processes makes it possible to derive the equations of the prospective and retrospective mathematical reserves. The model is applied to a sample of contracts drawn at random from a mutual insurance company.
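For reference, the evolution equation of a plain DTNHSMP (i.e., before the backward recurrence time extension developed in the paper), for the probability of being in state j at time t given entry into state i at time s, reads

\[
\phi_{ij}(s,t) \;=\; \delta_{ij}\bigl(1 - H_i(s,t)\bigr) \;+\; \sum_{k}\sum_{\tau = s+1}^{t} b_{ik}(s,\tau)\,\phi_{kj}(\tau,t),
\]

where b_ik(s,tau) is the probability that the process, having entered state i at time s, makes its next jump to state k exactly at time tau, and H_i(s,t) is the probability of having left state i by time t (the sum of b_ik(s,tau) over all k and all tau <= t).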
Abstract:
DREAM is an initiative that allows researchers to assess how well their methods or approaches can describe and predict networks of interacting molecules [1]. Each year, recently acquired datasets are released to predictors ahead of publication. Researchers typically have about three months to predict the masked data or network of interactions, using any predictive method. Predictions are assessed prior to an annual conference where the best predictions are unveiled and discussed. Here we present the strategy we used to make a winning prediction for the DREAM3 phosphoproteomics challenge. We used Amelia II, a multiple imputation software package developed by Gary King, James Honaker and Matthew Blackwell [2] in the context of the social sciences, to predict the 476 out of 4624 measurements that had been masked for the challenge. To choose the best possible multiple imputation parameters to apply for the challenge, we evaluated how transforming the data and varying the imputation parameters affected the ability to predict additionally masked data. We discuss the accuracy of our findings and show that multiple imputation applied to this dataset is a powerful method for accurately estimating the missing data. We postulate that multiple imputation methods might become an integral part of experimental design, as a means to achieve cost savings or to increase the number of samples that can be handled for a given cost.
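The core of that strategy (hide some extra known values, vary the imputation settings, and keep the settings that recover the hidden values best) can be sketched in a few lines. This is illustrative only: scikit-learn's IterativeImputer stands in for Amelia II, and the matrix, missing fractions and parameter grid are assumptions, not the challenge data.

# Tune an imputer by masking extra known entries and scoring how well they are recovered.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 20))                 # placeholder measurement matrix
data[rng.random(data.shape) < 0.10] = np.nan      # the "challenge" missing entries

# Hide an extra 5% of the known entries so the imputation can be scored.
known = ~np.isnan(data)
extra_mask = known & (rng.random(data.shape) < 0.05)
train = data.copy()
train[extra_mask] = np.nan

for max_iter in (5, 10, 20):                      # vary an imputation parameter
    imputer = IterativeImputer(max_iter=max_iter, random_state=0)
    filled = imputer.fit_transform(train)
    rmse = np.sqrt(np.mean((filled[extra_mask] - data[extra_mask]) ** 2))
    print(f"max_iter={max_iter:2d}  RMSE on extra-masked entries: {rmse:.3f}")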
Abstract:
The concept of ideal geometric configurations was recently applied to the classification and characterization of various knots. Different knots in their ideal form (i.e., the one requiring the shortest length of a constant-diameter tube to form a given knot) were shown to have an overall compactness proportional to the time-averaged compactness of thermally agitated knotted polymers forming corresponding knots. This was useful for predicting the relative speed of electrophoretic migration of different DNA knots. Here we characterize the ideal geometric configurations of catenanes (called links by mathematicians), i.e., closed curves in space that are topologically linked to each other. We demonstrate that the ideal configurations of different catenanes show interrelations very similar to those observed in the ideal configurations of knots. By analyzing literature data on electrophoretic separations of the torus-type of DNA catenanes with increasing complexity, we observed that their electrophoretic migration is roughly proportional to the overall compactness of ideal representations of the corresponding catenanes. This correlation does not apply, however, to electrophoretic migration of certain replication intermediates, believed up to now to represent the simplest torus-type catenanes. We propose, therefore, that freshly replicated circular DNA molecules, in addition to forming regular catenanes, may also form hemicatenanes.
Abstract:
A simple and time-efficient technique to illustrate specimens is described and demonstrated with Paleogene radiolarians. This method produces Scanning Electron Microscope (SEM) and composite focal-depth Transmitted Light Microscope (TLM) images for single radiolarian specimens. We propose the use of this technique to clarify radiolarian taxonomy. This technique has distinct advantages over previously published, time-consuming techniques that can also require toxic materials.
Abstract:
Three techniques to extract parasite remains from archaeological sediments were tested. The aim was to improve the sensitivity of the recommended paleoparasitological techniques applied to archaeological remains. Sediment collected from the pelvic girdle of a human body found in Cabo Vírgenes, Santa Cruz, Argentina, associated with a Spanish settlement founded in 1584 known as Nombre de Jesús, was used to search for parasites. Sediment close to the skull was used as a control. The techniques recommended by Jones, Reinhard, and Dittmar and Teejen were used and compared with the modified technique presented here, developed to improve the sensitivity of detecting parasite remains. Positive results were obtained only with the modified technique, resulting in the finding of Trichuris trichiura eggs in the sediment.
Abstract:
This article reports the effects of a pour-on formulation of cypermethrin (6% active ingredient) applied to chickens exposed to Triatoma infestans, the main vector of Chagas disease in rural houses of the Gran Chaco Region of South America. The study was designed as a completely randomized experiment with three experimental groups and five replicates. Third-instar nymphs were fed on chickens treated with 0, 1 and 2 cc of the formulation. Nymphs were allowed to feed on the chickens at different time intervals after the insecticide application. Third-instar nymphs fed on treated chickens showed higher mortality, took less blood during feeding and had a lower moulting rate. The mortality rate was highest seven days after application, and blood intake remained affected up to 30 days after application.
Abstract:
A study of how the machine learning technique known as GentleBoost could improve different digital watermarking methods such as LSB, DWT, DCT2 and histogram shifting.
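As a reminder of what the simplest listed method does on its own, here is a minimal LSB embed/extract sketch (illustrative only; it says nothing about the GentleBoost stage, and the array names and single bit plane used are assumptions):

# Hide a bit string in the least significant bit of each pixel and read it back.
import numpy as np

def embed_lsb(image: np.ndarray, bits: np.ndarray) -> np.ndarray:
    flat = image.astype(np.uint8).ravel().copy()
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits.astype(np.uint8)
    return flat.reshape(image.shape)

def extract_lsb(image: np.ndarray, n_bits: int) -> np.ndarray:
    return image.ravel()[:n_bits] & 1

cover = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
watermark = np.random.default_rng(2).integers(0, 2, size=256, dtype=np.uint8)
stego = embed_lsb(cover, watermark)
assert np.array_equal(extract_lsb(stego, watermark.size), watermark)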