930 results for Multi rate processing


Relevance: 20.00%

Abstract:

A detailed analysis procedure is described for evaluating rates of volumetric change in brain structures based on structural magnetic resonance (MR) images. In this procedure, a series of image processing tools have been employed to address the problems encountered in measuring rates of change based on structural MR images. These tools include an algorithm for intensity non-uniformity correction, a robust algorithm for three-dimensional image registration with sub-voxel precision and an algorithm for brain tissue segmentation. A unique feature of the procedure is the use of a fractional volume model that has been developed to provide a quantitative measure of the partial volume effect. With this model, the fractional constituent tissue volumes are evaluated for voxels at the tissue boundary that manifest the partial volume effect, thus allowing tissue boundaries to be defined at a sub-voxel level and in an automated fashion. Validation studies are presented on key algorithms including segmentation and registration. An overall assessment of the method is provided through the evaluation of the rates of brain atrophy in a group of normal elderly subjects, for which the rate of brain atrophy due to normal aging is predictably small. An application of the method is given in Part II, where the rates of brain atrophy in various brain regions are studied in relation to normal aging and Alzheimer's disease. (C) 2002 Elsevier Science Inc. All rights reserved.
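The fractional volume idea above can be illustrated with a minimal sketch, assuming a simple linear two-tissue intensity mixing model; this is a simplification for illustration, not the paper's actual model, and the tissue means and sample intensities are hypothetical.

```python
# Sketch of a two-tissue fractional volume (partial volume) model.
# Assumes a voxel's intensity is a linear mix of two pure-tissue mean
# intensities -- an illustrative simplification, not the paper's model.

def fractional_volume(intensity, mean_a, mean_b):
    """Fraction of tissue A in a voxel, clamped to the physical range [0, 1]."""
    f = (intensity - mean_b) / (mean_a - mean_b)
    return min(1.0, max(0.0, f))

def region_volume(intensities, mean_a, mean_b, voxel_vol):
    """Sub-voxel estimate of the tissue A volume over a set of voxels."""
    return voxel_vol * sum(fractional_volume(i, mean_a, mean_b)
                           for i in intensities)
```

Summing fractional volumes over boundary voxels gives a sub-voxel volume estimate without ever drawing a hard tissue boundary.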

Relevance: 20.00%

Abstract:

The processing of lexical ambiguity in context was investigated in eight individuals with schizophrenia and a matched control group. Participants made speeded lexical decisions on the third word in auditory word triplets representing concordant (coin-bank-money), discordant (river-bank-money), neutral (day-bank-money), and unrelated (river-day-money) conditions. When the interstimulus interval (ISI) between the words was 100 ms, individuals with schizophrenia demonstrated priming consistent with selective, context-based lexical activation. At 1250 ms ISI a pattern of nonselective meaning facilitation was obtained. These results suggest an attentional breakdown in the sustained inhibition of meanings on the basis of lexical context. (C) 2002 Elsevier Science (USA).

Relevance: 20.00%

Abstract:

Primary vaccine strategies against group A streptococci (GAS) have focused on the M protein-the target of opsonic antibodies important for protective immunity. We have previously reported protection of mice against GAS infection following parenteral delivery of a multi-epitope vaccine construct, referred to as a heteropolymer. This current report has assessed mucosal (intranasal (i.n.) and oral) delivery of the heteropolymer in mice with regard to the induction and specificity of mucosal and systemic antibody responses, and compared this to parenteral delivery. GAS-specific IgA responses were detected in saliva and gut upon i.n. and oral delivery of the heteropolymer co-administered with cholera toxin B subunit, respectively. High titre serum IgG responses were elicited to the heteropolymer following all routes of delivery when administered with adjuvant. Moreover, as with parenteral delivery, serum IgG antibodies were detected to the individual heteropolymer peptides following i.n. but not oral delivery. These data support the potential of the i.n. route in the mucosal delivery of a GAS vaccine. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

In order to understand the earthquake nucleation process, we need to understand the effective frictional behavior of faults with complex geometry and fault gouge zones. One important aspect of this is the interaction between the friction law governing the behavior of the fault on the microscopic level and the resulting macroscopic behavior of the fault zone. Numerical simulations offer a possibility to investigate the behavior of faults on many different scales and thus provide a means to gain insight into fault zone dynamics on scales which are not accessible to laboratory experiments. Numerical experiments have been performed to investigate the influence of the geometric configuration of faults, with rate- and state-dependent friction at the particle contacts, on the effective frictional behavior of these faults. The numerical experiments are designed to be similar to the laboratory experiments of Dieterich and Kilgore (1994), in which a slide-hold-slide cycle was performed between two blocks of material and the resulting peak friction was plotted against holding time. Simulations with a flat fault without a fault gouge have been performed to verify the implementation. These have shown close agreement with comparable laboratory experiments. The simulations performed with a fault containing fault gouge have demonstrated a strong dependence of the critical slip distance D_c on the roughness of the fault surfaces and are in qualitative agreement with laboratory experiments.
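The slide-hold-slide behaviour referred to above can be sketched with the Dieterich "aging" form of rate-and-state friction, in which the state variable grows during a stationary hold and peak friction on re-slide rises roughly with the log of hold time. The parameter values (mu0, b, theta0) are illustrative, not taken from the simulations described.

```python
import math

# Dieterich "aging" law sketch: during a hold at zero slip velocity the
# state variable theta grows linearly, so peak friction on re-slide rises
# with the log of hold time. mu0, b and theta0 are illustrative values.

def peak_friction(hold_time, mu0=0.6, b=0.01, theta0=1.0):
    """Peak friction after a stationary hold of the given duration."""
    theta = theta0 + hold_time        # state evolution while V = 0
    return mu0 + b * math.log(theta / theta0)

# "Healing": peak friction rises roughly linearly per decade of hold time.
healing = [peak_friction(t) for t in (1.0, 10.0, 100.0)]
```

Plotting peak friction against log hold time gives the characteristic healing trend measured in slide-hold-slide tests.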

Relevance: 20.00%

Abstract:

This paper analyses the exchange rate exposure displayed by a sample of Australian international equity trusts (IETs). Exchange rate exposure is also examined in the context of differing economic climates, with particular emphasis on the Asian crisis in mid-1997. Evidence of exchange rate exposure is found, particularly in the context of a multiple exchange rate model. Exposure varies substantially across three alternative time periods, with different exposure apparent after the Asian crisis than before it.
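Exchange rate exposure of this kind is conventionally estimated with an augmented market model, regressing fund returns on market returns and exchange rate changes. A minimal sketch on synthetic, deterministic data; the coefficients 0.9 and 0.5 are assumptions chosen so the regression recovers them exactly, not estimates from the paper's sample.

```python
# Augmented market model sketch: r_fund = a + b*r_market + c*fx_change.
# Toy data: the two regressors are mean-zero and uncorrelated, so each
# OLS slope reduces to cov(y, x) / var(x). All numbers are illustrative.
r_market = [0.010, -0.010, 0.010, -0.010]
fx_change = [0.005, 0.005, -0.005, -0.005]
r_fund = [0.001 + 0.9 * m + 0.5 * f for m, f in zip(r_market, fx_change)]

def beta(y, x):
    """OLS slope for a mean-zero regressor uncorrelated with the others."""
    my = sum(y) / len(y)
    return (sum((yi - my) * xi for yi, xi in zip(y, x))
            / sum(xi * xi for xi in x))

fx_exposure = beta(r_fund, fx_change)   # the exchange rate exposure estimate
market_beta = beta(r_fund, r_market)    # the conventional market beta
```

A multiple exchange rate model simply adds one such regressor (and slope) per currency.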

Relevance: 20.00%

Abstract:

The effect of heating and cooling on heart rate in the estuarine crocodile Crocodylus porosus was studied in response to different heat transfer mechanisms and heat loads. Three heating treatments were investigated. C. porosus were: (1) exposed to a radiant heat source under dry conditions; (2) heated via radiant energy while half-submerged in flowing water at 23°C and (3) heated via convective transfer by increasing water temperature from 23°C to 35°C. Cooling was achieved in all treatments by removing the heat source and with C. porosus half-submerged in flowing water at 23°C. In all treatments, the heart rate of C. porosus increased markedly in response to heating and decreased rapidly with the removal of the heat source. Heart rate during heating was significantly faster than during cooling at any given body temperature, i.e. there was a significant heart rate hysteresis. There were two identifiable responses to heating and cooling. During the initial stages of applying or removing the heat source, there was a dramatic increase or decrease in heart rate ('rapid response'), respectively, indicating a possible cardiac reflex. This rapid change in heart rate with only a small change or no change in body temperature (

Relevance: 20.00%

Abstract:

This paper investigates the robustness of a range of short-term interest rate models. We examine the robustness of these models over different data sets, time periods, sampling frequencies, and estimation techniques. We examine a range of popular one-factor models that allow the conditional mean (drift) and conditional variance (diffusion) to be functions of the current short rate. We find that parameter estimates are highly sensitive to all of these factors in the eight countries that we examine. Since parameter estimates are not robust, these models should be used with caution in practice.
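The one-factor family described above is often written in the CKLS form dr = (alpha + beta*r) dt + sigma * r^gamma dW, in which both drift and diffusion depend on the current short rate. A minimal Euler-discretised simulation sketch; the parameter values are illustrative, not estimates from any data set in the paper.

```python
import math
import random

# CKLS-form short-rate sketch: dr = (alpha + beta*r) dt + sigma*r**gamma dW.
# Euler discretisation with illustrative parameters (not fitted values).

def simulate_short_rate(r0, alpha, beta, sigma, gamma, dt, n, seed=1):
    rng = random.Random(seed)
    r, path = r0, [r0]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment over dt
        r += (alpha + beta * r) * dt + sigma * max(r, 0.0) ** gamma * dw
        path.append(r)
    return path

# One year of daily steps, mean-reverting toward alpha / (-beta) = 5%:
path = simulate_short_rate(r0=0.05, alpha=0.01, beta=-0.2,
                           sigma=0.1, gamma=0.5, dt=1 / 252, n=252)
```

Restricting gamma recovers familiar special cases (gamma = 0 gives Vasicek, gamma = 0.5 gives CIR), which is one way the sensitivity of estimates across specifications shows up.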

Relevance: 20.00%

Abstract:

A literature review highlighted the need to measure flotation froth rheology in order to fully characterise the role of the froth in the flotation process. An initial investigation using a coaxial cylinder viscometer for froth rheology measurement led to the development of a new device employing a vane measuring head. The modified rheometer was used in industrial-scale flotation tests at the Mt. Isa Copper Concentrator. The measured froth rheograms show a non-Newtonian (pseudoplastic) nature for the flotation froths. This evidence of non-Newtonian flow calls into question the validity of applying the Laplace equation in froth motion modelling, as used by a number of researchers, since the assumption of irrotational flow is violated. Correlations have been found between the froth rheology and the froth retention time, the water hold-up in the froth and the concentrate grades. These correlations are independent of air flow rate (test data at various air flow rates fall on one similar trend line). This implies that froth rheology may be used as a lumped parameter for other operating variables in flotation modelling and scale-up. (C) 2003 Elsevier Science B.V. All rights reserved.
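Pseudoplastic rheograms of the kind described are commonly summarised with the power-law (Ostwald-de Waele) model tau = K * gamma_dot**n, where a flow index n < 1 indicates shear thinning. A minimal fitting sketch on synthetic data; K = 2.0 and n = 0.4 are assumed values for illustration, not measured froth properties.

```python
import math

# Power-law (Ostwald-de Waele) fit: tau = K * gamma_dot ** n.
# Fitted by least squares in log-log space; n < 1 means pseudoplastic.

def fit_power_law(shear_rates, stresses):
    """Fit log(tau) = log(K) + n*log(gamma_dot); return (K, n)."""
    xs = [math.log(g) for g in shear_rates]
    ys = [math.log(t) for t in stresses]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - n * mx)
    return k, n

# Synthetic shear-thinning rheogram: tau = 2.0 * gamma_dot ** 0.4
rates = [1.0, 10.0, 100.0]
stresses = [2.0 * g ** 0.4 for g in rates]
k, n = fit_power_law(rates, stresses)
```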

Relevance: 20.00%

Abstract:

Pulp lifters, also known as pan lifters, are an integral part of the majority of autogenous (AG), semi-autogenous (SAG) and grate discharge ball mills. The performance of the pulp lifters in conjunction with grate design determines the ultimate flow capacity of these mills. Although the function of the pulp lifters is simply to transport the slurry passed through the discharge grate into the discharge trunnion, their performance depends on their design as well as that of the grate and on operating conditions such as mill speed and charge level. However, little or no work has been reported on the performance of grate-pulp lifter assemblies and in particular the influence of pulp lifter design on slurry transport. Ideally, the discharge rate through a grate-pulp lifter assembly should be equal to the discharge rate through the grate alone at a given mill hold-up. However, the results obtained have shown that conventional pulp lifter designs cause considerable restrictions to flow, resulting in reduced flow capacity. In this second of a two-part series of papers the performance of conventional pulp lifters (radial and spiral designs) is described, based on extensive test work carried out in a 1 m diameter pilot SAG mill. (C) 2003 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties due to an inability to obtain the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density. However, the common name for the region through which concentrate is discharged (the squeeze pan gap) implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and in single variable experiments, was done using a purpose-built rig which featured a small industrial scale (700 mm lip length, 900 mm diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be able to be defined by an initial moisture, a drainage rate and a drainage time, the latter being defined by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to the data over concentrate solids content values from 40% solids to 80% solids and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
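The drainage concept above (an initial moisture, a drainage rate, and a drainage time equal to zone volume divided by volumetric flowrate) can be sketched as a first-order relaxation toward the ultimate drainage moisture. The first-order form and all numbers are illustrative assumptions, not the paper's fitted model.

```python
import math

# Drainage sketch: concentrate moisture relaxes from an initial value
# toward an ultimate drainage moisture over the residence time in the
# drainage zone, t = zone_volume / flowrate. First-order form assumed.

def concentrate_moisture(m_initial, m_ultimate, k_drain, zone_volume, flowrate):
    """Moisture fraction after draining for t = zone_volume / flowrate."""
    t = zone_volume / flowrate                       # drainage time
    return m_ultimate + (m_initial - m_ultimate) * math.exp(-k_drain * t)

# Longer residence time (lower flowrate) -> drier concentrate:
wet = concentrate_moisture(0.60, 0.20, 0.5, 0.01, 0.010)   # t = 1 s
dry = concentrate_moisture(0.60, 0.20, 0.5, 0.01, 0.001)   # t = 10 s
```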

Relevance: 20.00%

Abstract:

Loss of magnetic medium solids from dense medium circuits is a substantial contributor to operating cost. Much of this loss is by way of wet drum magnetic separator effluent. A model of the separator would be useful for process design, optimisation and control. A review of the literature established that, although various rules of thumb exist, largely based on empirical or anecdotal evidence, there is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and in single variable experiments, was therefore carried out using a purpose-built rig which featured a small industrial scale (700 mm lip length, 900 mm diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in the work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. Observations carried out as an adjunct to this work, as well as magnetic theory, suggest that the capture of magnetic particles in the wet drum magnetic separator is by a flocculation process. Such a process should be defined by a flocculation rate and a flocculation time, the latter being defined by the volumetric flowrate and the volume within the separation zone. A model based on this concept and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to provide a satisfactory fit to the data over three orders of magnitude of magnetics loss. (C) 2003 Elsevier Science B.V. All rights reserved.
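The flocculation concept above (a flocculation rate acting over a residence time equal to separation-zone volume divided by volumetric flowrate) can be sketched as a first-order capture process. The first-order form and the numbers are illustrative assumptions, not the paper's fitted model.

```python
import math

# Flocculation-capture sketch: magnetics recovery treated as first-order
# capture over the residence time t = zone_volume / flowrate. The form
# and all parameter values are illustrative assumptions.

def magnetics_recovery(k_floc, zone_volume, flowrate):
    """Fraction of magnetics captured after t = zone_volume / flowrate."""
    t = zone_volume / flowrate
    return 1.0 - math.exp(-k_floc * t)

def magnetics_loss(feed_rate, k_floc, zone_volume, flowrate):
    """Magnetics lost to effluent, in the same units as feed_rate."""
    return feed_rate * (1.0 - magnetics_recovery(k_floc, zone_volume, flowrate))
```

Higher flowrate shortens the residence time, lowering recovery and raising effluent loss, which is the qualitative behaviour the model form is meant to capture.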

Relevance: 20.00%

Abstract:

Measurement of exchange of substances between blood and tissue has been a long-lasting challenge to physiologists, and considerable theoretical and experimental accomplishments were achieved before the development of positron emission tomography (PET). Today, when modeling data from modern PET scanners, little use is made of earlier microvascular research in the compartmental models that have become the standard by which the vast majority of dynamic PET data are analysed. However, modern PET scanners provide data with a sufficient temporal resolution and good counting statistics to allow estimation of parameters in models with more physiological realism. We explore the standard compartmental model and find that incorporation of blood flow leads to paradoxes, such as kinetic rate constants being time-dependent, and tracers being cleared from a capillary faster than they can be supplied by blood flow. The inability of the standard model to incorporate blood flow consequently raises a need for models that include more physiology, and we develop microvascular models which remove the inconsistencies. The microvascular models can be regarded as a revision of the input function. Whereas the standard model uses the organ inlet concentration as the concentration throughout the vascular compartment, we consider models that make use of spatial averaging of the concentrations in the capillary volume, which is what the PET scanner actually registers. The microvascular models are developed for both single- and multi-capillary systems and include effects of non-exchanging vessels. They are suitable for analysing dynamic PET data from any capillary bed using either intravascular or diffusible tracers, in terms of physiological parameters which include regional blood flow. (C) 2003 Elsevier Ltd. All rights reserved.
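The standard compartmental model the paper critiques can be sketched, for the one-tissue case, as dC_t/dt = K1*C_a(t) - k2*C_t(t), with the organ inlet concentration C_a used as the input function throughout the vascular space. The input curve and the rate constants below are illustrative values, not fitted parameters.

```python
import math

# One-tissue compartmental model sketch: dC_t/dt = K1*C_a(t) - k2*C_t(t),
# integrated by the Euler method. Input function and constants are
# illustrative only; this is the standard model, not the paper's
# microvascular revision of it.

def tissue_curve(c_arterial, K1, k2, dt):
    """Euler integration of the one-tissue compartment model."""
    c_t, curve = 0.0, []
    for c_a in c_arterial:
        c_t += (K1 * c_a - k2 * c_t) * dt
        curve.append(c_t)
    return curve

# Simple decaying-exponential arterial input, sampled every second:
dt = 1.0
c_a = [math.exp(-0.05 * i) for i in range(300)]
c_t = tissue_curve(c_a, K1=0.1, k2=0.02, dt=dt)
```

The microvascular models described above would instead drive the exchange with a spatially averaged capillary concentration rather than the inlet value C_a.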

Relevance: 20.00%

Abstract:

The most widely used method for predicting the onset of continuous caving is Laubscher's caving chart. A detailed examination of this method concluded that it has limitations which may affect results, particularly when dealing with stronger rock masses that are outside current experience. These limitations relate to inadequate guidelines for the adjustment factors to the rock mass rating (RMR), concerns about the position on the chart of critical case history data, undocumented changes to the method and an inadequate number of data points to be confident of stability boundaries. A review was undertaken of the application and reliability of a numerical method of assessing cavability. The review highlighted a number of issues which, at this stage, make numerical continuum methods problematic for predicting cavability, in particular their sensitivity to input parameters that are difficult to determine accurately, and mesh dependency. An extended version of the Mathews method for open stope design was developed as an alternative method of predicting the onset of continuous caving. A number of caving case histories were collected and analysed, and a caving boundary was delineated statistically on the Mathews stability graph. The definition of the caving boundary was aided by the existence of a large and wide-ranging stability database from non-caving mines. A caving rate model was extrapolated from the extended Mathews stability graph but could only be partially validated due to a lack of reliable data.
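The Mathews method referred to above plots a stability number N = Q' * A * B * C against a shape factor (hydraulic radius) S = surface area / perimeter. A minimal sketch of those two inputs; the factor values below are illustrative, and the extended caving boundary itself is the statistical construct described in the abstract, not reproduced here.

```python
# Mathews stability graph inputs: stability number N = Q' * A * B * C
# (modified rock quality Q' times stress, joint-defect and gravity
# adjustment factors) and hydraulic radius S. Values are illustrative.

def stability_number(q_prime, a_stress, b_defect, c_gravity):
    """N = Q' * A * B * C for a stope (or cave back) surface."""
    return q_prime * a_stress * b_defect * c_gravity

def hydraulic_radius(width, height):
    """Shape factor S for a rectangular surface: area / perimeter."""
    return (width * height) / (2.0 * (width + height))

n = stability_number(q_prime=10.0, a_stress=0.8, b_defect=0.3, c_gravity=6.0)
s = hydraulic_radius(width=40.0, height=40.0)
```

Each case history becomes one (S, N) point on the graph, and the caving boundary is then delineated statistically through the cloud of stable, failed and caved points.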