996 results for Time ranges
Abstract:
There is overwhelming evidence for substantial genetic influences on individual differences in general and specific cognitive abilities, especially in adults. The actual localization and identification of genes underlying variation in cognitive abilities and intelligence has only just started, however. Successes are currently limited to neurological mutations with rather severe cognitive effects. Current approaches to tracing genes responsible for variation in the normal range of cognitive ability consist of large-scale linkage and association studies. These are hampered by the usual problems of low statistical power to detect quantitative trait loci (QTLs) of small effect. One strategy to boost the power of genomic searches is to employ endophenotypes of cognition derived from the booming field of cognitive neuroscience. This special issue of Behavior Genetics reports on one of the first genome-wide association studies for general IQ. A second paper summarizes candidate genes for cognition, based on animal studies. A series of papers then introduces two additional levels of analysis in the "black box" between genes and cognitive ability: (1) behavioral measures of information-processing speed (inspection time, reaction time, rapid naming) and working memory capacity (performance on single or dual tasks of verbal and spatio-visual working memory), and (2) electrophysiologically derived measures of brain function (e.g., event-related potentials). The obvious way to assess the reliability and validity of these endophenotypes, and their usefulness in the search for cognitive ability genes, is through the examination of their genetic architecture in twin family studies. Papers in this special issue show that much of the association between intelligence and speed of information processing/brain function is due to a common gene or set of genes, and thereby demonstrate the usefulness of considering these measures in gene-hunting studies for IQ.
Abstract:
Renal drug elimination is determined by glomerular filtration, tubular secretion, and tubular reabsorption. Changes in the integrity of these processes influence renal drug clearance, and these changes may not be detected by conventional measures of renal function such as creatinine clearance. The aim of the current study was to examine the analytic issues involved in developing a cocktail of marker drugs (fluconazole, rac-pindolol, para-aminohippuric acid, sinistrin) to measure simultaneously the mechanisms contributing to renal clearance. High-performance liquid chromatographic methods of analysis for fluconazole, pindolol, para-aminohippuric acid, and creatinine, and an enzymatic assay for sinistrin, were developed or modified and then validated to allow determination of each of the compounds in both plasma and urine in the presence of all other marker drugs. A pilot clinical study in one volunteer was conducted to ensure that the assays had the sensitivity and specificity needed to quantitate all the marker drugs and allow accurate determination of individual renal clearances. The performance of all assays (plasma and urine) complied with published validation criteria. All standard curves were linear over the concentration ranges required, with correlation coefficients greater than 0.99. Interday and intraday precision of quality controls in plasma and urine was better than 11.9% for each marker. Recoveries of markers (and internal standards) in plasma and urine were all at least 90%. All markers investigated were shown to be stable when plasma or urine was frozen and thawed. For all the assays developed, there were no interferences from other markers or endogenous substances. In the pilot clinical study, concentrations of all markers could be accurately and reproducibly determined for a sufficient duration after administration to calculate accurate renal clearance for each marker.
This article presents details of the analytic techniques developed for measuring concentrations of marker drugs for different renal elimination processes administered as a single dose to define the processes contributing to renal drug elimination.
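The validation criteria quoted above (standard-curve linearity with r > 0.99, quality-control precision within 11.9%) can be sketched as simple numeric checks. This is only an illustration of the acceptance tests, assuming ordinary linear-regression correlation and coefficient-of-variation definitions; all calibration and quality-control numbers below are hypothetical, not data from the study.

```python
# Sketch of the assay acceptance checks described above, applied to
# hypothetical calibration data (all numbers are made up for illustration).
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient of a standard curve."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def cv_percent(replicates):
    """Coefficient of variation (%) of quality-control replicates."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical standard curve: nominal concentration vs. detector response.
conc = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]
resp = [0.11, 0.21, 0.42, 1.04, 2.09, 4.15]

r = pearson_r(conc, resp)
assert r > 0.99              # linearity criterion from the study

qc = [4.9, 5.1, 5.0, 4.8, 5.2]   # hypothetical QC replicate measurements
assert cv_percent(qc) < 11.9     # precision criterion from the study
```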
Abstract:
Research on the stability of flavours during high-temperature extrusion cooking is reviewed. The important factors that affect flavour and aroma retention during extrusion are illustrated. A substantial number of flavour volatiles incorporated prior to extrusion are normally lost during expansion because of steam distillation. Therefore, a general practice has been to introduce a flavour mix after the extrusion process. This extra operation requires a binding agent (normally oil), and may also result in a non-uniform distribution of the flavour and low oxidative stability of the flavours exposed on the surface. The importance of encapsulated flavours, particularly the beta-cyclodextrin-flavour complex, is therefore highlighted in this paper.
Abstract:
It is not possible to make measurements of the phase of an optical mode using linear optics without introducing an extra phase uncertainty. This extra phase variance is quite large for heterodyne measurements, but it can be reduced to the theoretical limit of log(n̄)/(4n̄²) using adaptive measurements. These measurements are quite sensitive to experimental inaccuracies, especially time delays and inefficient detectors. Here it is shown that the minimum introduced phase variance when there is a time delay of τ is τ/(8n̄). This result is verified numerically, showing that the phase variance introduced approaches this limit for most of the adaptive schemes using the best final phase estimate. The main exception is the adaptive mark II scheme with simplified feedback, which is extremely sensitive to time delays. The extra phase variance due to time delays is considered for the mark I case with simplified feedback, verifying the τ/2 result obtained by Wiseman and Killip both by a more rigorous analytic technique and numerically.
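The two limits quoted in the abstract, the adaptive-measurement floor log(n̄)/(4n̄²) and the time-delay penalty τ/(8n̄), can be compared numerically. The mean photon number n̄ and delay τ below are illustrative values chosen only to show the scaling, not parameters from the paper.

```python
# Numerical comparison of the phase-variance limits quoted above.
import math

def adaptive_limit(nbar):
    """Theoretical minimum introduced phase variance: log(nbar)/(4 nbar^2)."""
    return math.log(nbar) / (4 * nbar ** 2)

def delay_penalty(tau, nbar):
    """Extra phase variance from a feedback time delay tau: tau/(8 nbar)."""
    return tau / (8 * nbar)

nbar = 100.0   # illustrative mean photon number
tau = 1e-3     # illustrative time delay (in the paper's dimensionless units)

print(adaptive_limit(nbar))        # ~1.15e-4
print(delay_penalty(tau, nbar))    # 1.25e-6
```

For these illustrative values the delay penalty is small compared with the intrinsic limit, but since it scales as 1/n̄ rather than log(n̄)/n̄², it dominates at large photon number for any fixed delay.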
Abstract:
New Zealand is generally thought to have been physically isolated from the rest of the world for over 60 million years. But physical isolation may not mean biotic isolation, at least on the time scale of millions of years. Is New Zealand's present complement of plants directly descended from what originally rafted from Gondwana? Or has there been total extinction of this initial flora, with replacement through long-distance dispersal (a complete biotic turnover)? These are two possible extremes which have come under recent discussion. Can the fossil record be used to decide the relative importance of the two endpoints, or is it simply too incomplete and too dependent on factors of chance? This paper suggests two approaches to the problem: the use of statistics to apply levels of confidence to first appearances in the fossil record, and the analysis of trends based on the entire palynorecord. Statistics can suggest that the first appearance of a taxon was after New Zealand broke away from Gondwana, as long as the first appearance in the record was not due to an increase in biomass from an initially rare state. Two observations can be drawn from the overall palynorecord that are independent of changes in biomass: (1) The first appearance of palynotaxa common to both Australia and New Zealand is decidedly non-random. Most taxa occur first in Australia. This suggests a bias in air or water transport from west to east. (2) The percentage of endemic palynospecies in New Zealand shows no simple correlation with the time New Zealand drifted into isolation. The conifer macrorecord also hints at complete turnover since the Cretaceous.
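The idea of putting confidence levels on first appearances can be sketched with the classical range-extension formula for fossil horizons assumed uniformly distributed through a taxon's true range (in the style of Strauss and Sadler): the observed range is extended by a factor alpha = (1 - C)^(-1/(n-1)) - 1 for confidence level C and n horizons. This is a generic illustration of the statistical approach, not the paper's specific analysis, and the numbers are hypothetical.

```python
# Confidence interval on a stratigraphic first appearance, assuming the
# n fossil horizons are uniformly distributed through the true range.
#   alpha = (1 - C) ** (-1 / (n - 1)) - 1
# alpha is the fraction of the observed range by which the range must be
# extended to bound the true first appearance with confidence C.

def range_extension(n_horizons, confidence):
    """Fractional range extension for a one-sided confidence interval."""
    if n_horizons < 2:
        raise ValueError("need at least two fossil horizons")
    return (1 - confidence) ** (-1 / (n_horizons - 1)) - 1

observed_range_myr = 20.0              # hypothetical observed range (Myr)
alpha = range_extension(10, 0.95)      # 10 horizons, 95% confidence
extension = alpha * observed_range_myr

print(round(alpha, 3), round(extension, 1))   # → 0.395 7.9
```

With only a handful of horizons the interval becomes very wide, which is exactly why the paper cautions that sparse records (or taxa that were initially rare) limit what such statistics can say about post-Gondwanan arrivals.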
Abstract:
Mass balance calculations were performed to model the effect of solution treatment time on A356 and A357 alloy microstructures. Image analysis and electron probe microanalysis were used to characterise microstructures and confirm model predictions. In as-cast microstructures, up to 8 times more Mg is tied up in the pi-phase than in Mg2Si. The dissolution of pi is accompanied by a corresponding increase in the amount of beta-phase. This causes the rate of pi dissolution to be limited by the rate of beta formation. It is predicted that solution treatments of the order of tens of minutes at 540°C produce near-maximum T6 yield strengths, and that Mg contents in excess of 0.52 wt% have no advantage.
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time, with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem differs fundamentally from view selection under a disk-space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient, heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
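The problem setting can be made concrete with a minimal greedy sketch: repeatedly pick the candidate view with the best query-time benefit per unit of maintenance cost until the maintenance budget is exhausted. This is a generic benefit-per-cost illustration of view selection under a maintenance-time constraint, not the specific heuristic functions or problem reductions proposed in the paper; the candidate views and costs are hypothetical.

```python
# Greedy view selection under a maintenance-time budget (illustrative only).
def select_views(views, budget):
    """views: dict name -> (query_time_saved, maintenance_cost).
    Picks views in order of benefit per unit maintenance cost,
    skipping any view that would exceed the maintenance budget."""
    chosen, used = [], 0.0
    remaining = dict(views)
    while remaining:
        name = max(remaining, key=lambda v: remaining[v][0] / remaining[v][1])
        saved, cost = remaining.pop(name)
        if used + cost <= budget:
            chosen.append(name)
            used += cost
    return chosen, used

views = {                 # hypothetical candidate materialized views
    "sales_by_region": (120.0, 30.0),
    "sales_by_month":  (80.0, 10.0),
    "top_customers":   (40.0, 25.0),
}
chosen, used = select_views(views, budget=40.0)
print(chosen, used)   # greedy keeps the best-ratio views within budget
```

A simple greedy like this carries no approximation guarantee, which is why the paper's contribution is to reduce the problem to well-solved optimization problems whose approximate solutions remain feasible for the original constraint.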
Abstract:
Applied econometricians often fail to impose economic regularity constraints in the exact form economic theory prescribes. We show how the Singular Value Decomposition (SVD) Theorem and Markov Chain Monte Carlo (MCMC) methods can be used to rigorously impose time- and firm-varying equality and inequality constraints. To illustrate the technique we estimate a system of translog input demand functions subject to all the constraints implied by economic theory, including observation-varying symmetry and concavity constraints. Results are presented in the form of characteristics of the estimated posterior distributions of functions of the parameters. Copyright (C) 2001 John Wiley & Sons, Ltd.
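The core of the SVD trick for exact equality constraints can be sketched as follows: to impose linear constraints R·beta = r, write beta = beta0 + N·gamma, where beta0 is any particular solution and the columns of N span the null space of R (obtained from the SVD). Every choice of the free parameters gamma then satisfies the constraints exactly, so a sampler can move unconstrained in gamma. This is a minimal sketch of the reparameterization idea only, not the paper's estimator, and the symmetry-style constraint below is illustrative.

```python
# Imposing exact linear equality constraints R @ beta = r via the SVD.
import numpy as np

def constrained_parameterization(R, r):
    """Return (beta0, N): a particular solution and a null-space basis,
    so that beta = beta0 + N @ gamma satisfies R @ beta = r for all gamma."""
    U, s, Vt = np.linalg.svd(R)
    rank = int(np.sum(s > 1e-10))
    beta0 = np.linalg.lstsq(R, r, rcond=None)[0]  # particular solution
    N = Vt[rank:].T                               # null-space basis columns
    return beta0, N

# Example: a symmetry-style constraint beta[1] - beta[2] = 0 on a 3-vector.
R = np.array([[0.0, 1.0, -1.0]])
r = np.array([0.0])
beta0, N = constrained_parameterization(R, r)

gamma = np.array([0.3, -1.2])        # any free parameters (e.g. MCMC draws)
beta = beta0 + N @ gamma
assert np.allclose(R @ beta, r)      # constraint holds exactly
```

Inequality constraints (such as concavity) cannot be absorbed this way and are instead handled within the MCMC step, e.g. by rejecting draws that violate them.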
Abstract:
A deterministic mathematical model which predicts the probability of developing a new drug-resistant parasite population within the human host is reported. The model incorporates the host's specific antibody response to PfEMP1, and also investigates the influence of chemotherapy on the probability of developing a viable drug-resistant parasite population within the host. Results indicate that early treatment, and a high antibody threshold coupled with a long lag time between antibody stimulation and activity, are risk factors which increase the likelihood of developing a viable drug-resistant parasite population. High parasite mutation rates and fast PfEMP1 var gene switching are also identified as risk factors. The model output allows the relative importance of the various risk factors, as well as the relationships between them, to be established, thereby increasing understanding of the conditions which favour the development of a new drug-resistant parasite population.