807 results for Mathematical ability
Abstract:
1. Although population viability analysis (PVA) is widely employed, forecasts from PVA models are rarely tested. This study in a fragmented forest in southern Australia contrasted field data on patch occupancy and abundance for the arboreal marsupial greater glider Petauroides volans with predictions from a generic spatially explicit PVA model. This work represents one of the first landscape-scale tests of its type. 2. Initially we contrasted field data from a set of eucalypt forest patches totalling 437 ha with a naive null model in which forecasts of patch occupancy were made, assuming no fragmentation effects and based simply on remnant area and measured densities derived from nearby unfragmented forest. The naive null model predicted an average total of approximately 170 greater gliders, considerably greater than the true count (n = 81). 3. Congruence was examined between field data and predictions from PVA under several metapopulation modelling scenarios. The metapopulation models performed better than the naive null model. Logistic regression showed highly significant positive relationships between predicted and actual patch occupancy for the four scenarios (P = 0.001-0.006). When the model-derived probability of patch occupancy was high (0.50-0.75, 0.75-1.00), there was greater congruence between actual patch occupancy and the predicted probability of occupancy. 4. For many patches, probability distribution functions indicated that model predictions for animal abundance in a given patch were not outside those expected by chance. However, for some patches the model either substantially over-predicted or under-predicted actual abundance. Some important processes, such as inter-patch dispersal, that influence the distribution and abundance of the greater glider may not have been adequately modelled. 5. Additional landscape-scale tests of PVA models, on a wider range of species, are required to assess further predictions made using these tools. This will help determine those taxa for which predictions are and are not accurate and give insights for improving models for applied conservation management.
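A minimal sketch of the naive null model idea described above — predicted abundance as patch area times a density measured in nearby unfragmented forest — using hypothetical patch areas, counts, and density (not the study's data):

```python
# Naive null model sketch: predicted abundance = patch area x density from
# unfragmented forest. All numbers below are illustrative only.
density_per_ha = 0.39                        # assumed greater glider density (animals/ha)
patch_areas_ha = [12.0, 55.0, 8.5, 140.0]    # hypothetical remnant patch areas
field_counts = [2, 10, 0, 31]                # hypothetical field counts per patch

predicted = [a * density_per_ha for a in patch_areas_ha]
print("Total predicted:", round(sum(predicted), 1), "vs total counted:", sum(field_counts))
for a, p, c in zip(patch_areas_ha, predicted, field_counts):
    print(f"patch {a:6.1f} ha  predicted {p:5.1f}  counted {c}")
```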
Abstract:
A modelling framework is developed to determine the joint economic and environmental net benefits of alternative land allocation strategies. Estimates of community preferences for preservation of natural land, derived from a choice modelling study, are used as input to a model of agricultural production in an optimisation framework. The trade-offs between agricultural production and environmental protection are analysed using the sugar industry of the Herbert River district of north Queensland as an example. Spatially-differentiated resource attributes and the opportunity costs of natural land determine the optimal trade-offs between production and conservation for a range of sugar prices.
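The optimisation framework outlined above can be caricatured as a land-allocation programme that weighs agricultural returns against a non-market preservation value; the sketch below is a toy linear programme with hypothetical prices and areas, not the study's model:

```python
# Toy land-allocation trade-off: allocate hectares to cane (x) or preserved natural
# land (y) to maximise net benefit subject to a fixed land area. All values are hypothetical.
from scipy.optimize import linprog

total_land_ha = 1000.0
sugar_margin = 300.0        # assumed net return per cane hectare ($/ha)
preservation_value = 120.0  # assumed community willingness-to-pay per preserved hectare ($/ha)

# linprog minimises, so negate the objective to maximise total net benefit.
result = linprog(c=[-sugar_margin, -preservation_value],
                 A_ub=[[1.0, 1.0]], b_ub=[total_land_ha],
                 bounds=[(0, None), (0, None)])
cane_ha, preserved_ha = result.x
print(f"cane: {cane_ha:.0f} ha, preserved: {preserved_ha:.0f} ha, net benefit ${-result.fun:,.0f}")
```

With a purely linear objective the optimum sits at a corner (all land to the higher-valued use); the study's spatially-differentiated attributes and opportunity costs are what make interior trade-offs emerge.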
Abstract:
A number of mathematical models have been used to describe percutaneous absorption kinetics. In general, most of these models have used either diffusion-based or compartmental equations. The object of any mathematical model is to a) represent the processes associated with absorption accurately, b) describe/summarize experimental data with parametric equations or moments, and c) predict kinetics under varying conditions. However, in describing the processes involved, many of the models developed are too complex in form to be practically useful. In this chapter, we attempt to approach the issue of mathematical modeling in percutaneous absorption from four perspectives. These are to a) describe simple practical models, b) provide an overview of the more complex models, c) summarize some of the more important/useful models used to date, and d) examine some practical applications of the models. The range of processes involved in percutaneous absorption and considered in developing the mathematical models in this chapter is shown in Fig. 1. We initially address in vitro skin diffusion models and consider a) constant donor concentration and receptor conditions, b) the corresponding flux, donor, skin, and receptor amount-time profiles for solutions, and c) amount- and flux-time profiles when the donor phase is removed. More complex issues, such as finite-volume donor phase, finite-volume receptor phase, the presence of an efflux rate constant at the membrane-receptor interphase, and two-layer diffusion, are then considered. We then look at specific models and issues concerned with a) release from topical products, b) use of compartmental models as alternatives to diffusion models, c) concentration-dependent absorption, d) modeling of skin metabolism, e) role of solute-skin-vehicle interactions, f) effects of vehicle loss, g) shunt transport, and h) in vivo diffusion, compartmental, physiological, and deconvolution models. We conclude by examining topics such as a) deep tissue penetration, b) pharmacodynamics, c) iontophoresis, d) sonophoresis, and e) pitfalls in modeling.
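For the simplest in vitro case mentioned (constant donor concentration with sink receptor conditions), the diffusion treatment typically reduces to the standard steady-state flux and lag-time expressions; a generic sketch of those relations, not tied to this chapter's specific notation, is:

```latex
% Steady-state flux across a membrane of thickness h, diffusivity D,
% partition coefficient K and donor concentration C_v (standard form):
J_{ss} = \frac{K\,D\,C_v}{h} = k_p\,C_v, \qquad k_p = \frac{K D}{h}
% Cumulative amount permeated at long times, giving the lag time t_{lag}:
Q(t) \approx \frac{K D C_v}{h}\left(t - \frac{h^{2}}{6D}\right), \qquad t_{lag} = \frac{h^{2}}{6D}
```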
Abstract:
The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.
Abstract:
The aim of this work was to exemplify the specific contribution of both two- and three-dimensional (3D) X-ray computed tomography to characterise earthworm burrow systems. To achieve this purpose we used 3D mathematical morphology operators to characterise burrow systems resulting from the activity of an anecic species (Aporrectodea nocturna) and an endogeic species (Allolobophora chlorotica), when the two species were introduced either separately or together into artificial soil cores. Images of these soil cores were obtained using a medical X-ray tomography scanner. Three-dimensional reconstructions of burrow systems were obtained using a specifically developed segmentation algorithm. To study the differences between burrow systems, a set of classical tools of mathematical morphology (granulometries) was used. So-called granulometries based on different structuring elements clearly separated the different burrow systems. They enabled us to show that burrows made by the anecic species were fatter, longer, more vertical, more continuous but less sinuous than burrows of the endogeic species. The granulometry transform of the soil matrix showed that burrows made by A. nocturna were more evenly distributed than those of A. chlorotica. Although a good discrimination was possible when only one species was introduced into the soil cores, it was not possible to separate burrows of the two species from each other in cases where both species were introduced into the same soil core. This limitation, partly due to the insufficient spatial resolution of the medical scanner, precluded the use of the morphological operators to study putative interactions between the two species.
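A granulometry of the kind used here is commonly computed by opening the segmented binary volume with structuring elements of increasing size and recording how much burrow volume survives each opening; a minimal sketch with scipy, using a hypothetical 3D image rather than the scanner data:

```python
# Simple 3D granulometry sketch: open the binary burrow image with balls of
# increasing radius and record the remaining volume fraction. Data are illustrative.
import numpy as np
from scipy import ndimage

def ball(radius):
    # Spherical structuring element of the given radius (in voxels).
    z, y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1, -radius:radius + 1]
    return (x**2 + y**2 + z**2) <= radius**2

def granulometry(binary_volume, radii):
    total = binary_volume.sum()
    # Fraction of burrow voxels surviving an opening at each structuring-element radius.
    return [ndimage.binary_opening(binary_volume, structure=ball(r)).sum() / total
            for r in radii]

# Hypothetical segmented burrow system (random blobs, for illustration only).
rng = np.random.default_rng(0)
volume = ndimage.binary_dilation(rng.random((60, 60, 60)) > 0.995, iterations=3)
print(granulometry(volume, radii=[1, 2, 3, 4]))
```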
Abstract:
Recent advances in the control of molecular engineering architectures have allowed unprecedented ability of molecular recognition in biosensing, with a promising impact for clinical diagnosis and environmental control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples tested positive for Chagas disease and Leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data in the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, where distinction could be made of blood serum samples containing 10^-7 mg/mL of the antibody. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
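The best-performing combination reported here (feature standardization followed by Sammon's mapping) can be approximated with standard tools; the sketch below substitutes metric MDS for Sammon's mapping, since the two are related stress-minimising projections, and uses synthetic data in place of the impedance spectra:

```python
# Rough analogue of the pipeline above: standardize each feature of the spectra,
# then project to 2-D. Sammon's mapping is replaced by metric MDS here; data are synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.manifold import MDS

rng = np.random.default_rng(42)
# Hypothetical sensor array responses: 20 samples x 50 frequency points.
spectra = rng.normal(size=(20, 50)) + np.linspace(0, 1, 50)

standardized = StandardScaler().fit_transform(spectra)
projection = MDS(n_components=2, random_state=0).fit_transform(standardized)
print(projection[:5])   # 2-D coordinates that would feed the PEx-style visualization
```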
Abstract:
Mental rotation involves the creation and manipulation of internal images, with the latter being a particularly useful cognitive capacity when applied to high-level mathematical thinking and reasoning. Many neuroimaging studies have demonstrated mental rotation to be mediated primarily by the parietal lobes, particularly on the right side. Here, we use fMRI to show for the first time that when performing 3-dimensional mental rotations, mathematically gifted male adolescents engage a qualitatively different brain network than peers of average math ability, one that involves bilateral activation of the parietal lobes and frontal cortex, along with heightened activation of the anterior cingulate. Reliance on the processing characteristics of this uniquely bilateral system and the interplay of these anterior/posterior regions may be contributors to their mathematical precocity.
Abstract:
Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set theoretic notation, as it would have been difficult if not impossible to formulate such an objection at the algorithmic level.
Abstract:
An important consideration in the development of mathematical models for dynamic simulation is the identification of the appropriate mathematical structure. By building models with an efficient structure which is devoid of redundancy, it is possible to create simple, accurate and functional models. This leads not only to efficient simulation, but to a deeper understanding of the important dynamic relationships within the process. In this paper, a method is proposed for systematic model development for startup and shutdown simulation which is based on the identification of the essential process structure. The key tool in this analysis is the method of nonlinear perturbations for structural identification and model reduction. Starting from a detailed mathematical process description, both singular and regular structural perturbations are detected. These techniques are then used to give insight into the system structure and, where appropriate, to eliminate superfluous model equations or reduce them to other forms. This process retains the ability to interpret the reduced order model in terms of the physico-chemical phenomena. Using this model reduction technique it is possible to attribute observable dynamics to particular unit operations within the process. This relationship then highlights the unit operations which must be accurately modelled in order to develop a robust plant model. The technique generates detailed insight into the dynamic structure of the models, providing a basis for system re-design and dynamic analysis. The technique is illustrated on the modelling of an evaporator startup. Copyright (C) 1996 Elsevier Science Ltd
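The singular perturbation step referred to above follows the usual fast/slow pattern, in which a fast state's differential equation is replaced by an algebraic constraint; a generic sketch (not the paper's evaporator equations) is:

```latex
% Fast/slow decomposition with a small parameter \varepsilon:
\dot{x} = f(x, z), \qquad \varepsilon\,\dot{z} = g(x, z)
% In the singularly perturbed limit \varepsilon \to 0, the fast state is replaced by a
% quasi-steady-state constraint, reducing the model order:
0 = g(x, \bar{z}) \;\Rightarrow\; \bar{z} = \phi(x), \qquad \dot{x} = f\!\left(x, \phi(x)\right)
```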
Abstract:
Streptococcus pyogenes infections remain a health problem in several countries due to poststreptococcal sequelae. We developed a vaccine epitope (StreptInCor) composed of 55 amino acid residues of the C-terminal portion of the M protein that encompasses both T and B cell protective epitopes. The nuclear magnetic resonance (NMR) structure of the StreptInCor peptide showed that the structure was composed of two microdomains linked by an 18-residue alpha-helix. A chemical stability study of the StreptInCor folding/unfolding process using far-UV circular dichroism showed that the structure was chemically stable with respect to pH and the concentration of urea. The T cell epitope is located in the first microdomain and encompasses 11 out of the 18 alpha-helix residues, whereas the B cell epitope is in the second microdomain and showed no alpha-helical structure. The prediction of StreptInCor epitope binding to different HLA class II molecules was evaluated based on an analysis of the 55 residues and the theoretical possibilities for the processed peptides to fit into the P1, P4, P6, and P9 pockets in the groove of several HLA class II molecules. We observed 7 potential sites along the amino acid sequence of StreptInCor that were capable of recognizing HLA class II molecules (DRB1*, DRB3*, DRB4*, and DRB5*). StreptInCor overlapping peptides induced cellular and humoral immune responses of individuals bearing different HLA class II molecules and could be considered a universal vaccine epitope.
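The pocket-fitting analysis described above rests on enumerating candidate 9-residue binding cores along the 55-residue epitope; a minimal sliding-window sketch with a placeholder sequence (not the StreptInCor sequence) is:

```python
# Sliding-window sketch: every contiguous 9-residue stretch of the epitope is a
# candidate core that could occupy the P1-P9 pockets of an HLA class II groove.
sequence = "A" * 55   # placeholder 55-residue sequence (hypothetical, not StreptInCor)

def nine_mer_cores(seq):
    return [seq[i:i + 9] for i in range(len(seq) - 9 + 1)]

cores = nine_mer_cores(sequence)
print(len(cores), "candidate 9-mer cores")   # 47 windows for a 55-residue peptide
```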
Abstract:
Experimental data for E. coli debris size reduction during high-pressure homogenisation at 55 MPa are presented. A mathematical model based on grinding theory is developed to describe the data. The model is based on first-order breakage and compensation conditions. It does not require any assumption of a specified distribution for debris size and can be used given information on the initial size distribution of whole cells and the disruption efficiency during homogenisation. The number of homogeniser passes is incorporated into the model and used to describe the size reduction of non-induced stationary and induced E. coli cells during homogenisation. Regressing the results to the model equations gave an excellent fit to experimental data ( > 98.7% of variance explained for both fermentations), confirming the model's potential for predicting size reduction during high-pressure homogenisation. This study provides a means to optimise both homogenisation and disc-stack centrifugation conditions for recombinant product recovery. (C) 1997 Elsevier Science Ltd.
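A minimal numerical sketch of first-order breakage across successive homogeniser passes, with an illustrative breakage constant and size classes rather than the paper's fitted values:

```python
# First-order breakage sketch: each homogeniser pass moves a fixed fraction of the
# debris in each size class into the next smaller class. All numbers are illustrative.
import numpy as np

size_classes_um = np.array([1.0, 0.7, 0.5, 0.3, 0.1])   # hypothetical debris sizes (um)
counts = np.array([100.0, 0.0, 0.0, 0.0, 0.0])          # all debris starts in the largest class
k = 0.4                                                  # assumed breakage fraction per pass

for n_pass in range(1, 6):
    broken = k * counts[:-1]          # material leaving each class (except the smallest)
    counts[:-1] -= broken
    counts[1:] += broken              # material arriving in the next smaller class
    mean_size = np.average(size_classes_um, weights=counts)
    print(f"pass {n_pass}: mean debris size {mean_size:.2f} um")
```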
Abstract:
Aims: To compare the performance of schizophrenia, mania and well control groups on tests sensitive to impaired executive ability, and to assess the within-group stability of these measures across the acute and subacute phases of psychoses. Method: Recently admitted patients with schizophrenia (n=36), mania (n=18) and a well control group (n=20) were assessed on two occasions separated by 4 weeks. Tests included: the Controlled Oral Word Association Test, the Stroop Test, the Wisconsin Card Sort Test, and the Trail Making Test. Results: The two patient groups were significantly impaired on the Stroop Test at both time points compared to the control group. Significant group differences were also found for the Trail Making Test at Time 1 and for the Wisconsin Card Sort Test at Time 2. When controlled for practice effect, significant improvements over time were found on the Stroop and Trail Making tests in the schizophrenia group and on WCST Categories Achieved in the mania group. Discussion: Compared to controls, the patient groups were impaired on measures related to executive ability. The pattern of improvement on test scores between the acute and subacute phases differed between patients with schizophrenia versus patients with mania. (C) 1997 Elsevier Science B.V.
Abstract:
Rats exposed to a relatively high dose (7.5 g/kg body weight) of alcohol on either the fifth or tenth postnatal day of age have been reported to have long-lasting deficits in spatial learning ability as tested on the Morris water maze task. The question arises concerning the level of alcohol required to achieve this effect. Wistar rats were exposed to either 2, 4 or 6 g/kg body weight of ethanol administered as a 10% solution. This ethanol was given over an 8-h period on the fifth postnatal day of age by means of an intragastric cannula. Gastrostomy controls received a 5% sucrose solution substituted isocalorically for the ethanol. Another set of pups raised by their mother were used as suckle controls. All surgical procedures were carried out under halothane vapour anaesthesia. After the artificial feeding regimes all pups were returned to lactating dams and weaned at 21 days of age. The spatial learning ability of these rats was tested in the Morris water maze when they were between 61 and 64 days of age. This task requires the rats to swim in a pool of opaque water and to locate and climb onto a submerged platform. The time taken to accomplish this is known as the escape latency. Each rat was subjected to 24 trials over 3 days of the test period. Statistical analysis of the escape latency data revealed that the rats given 6 g/kg body weight of ethanol had significant deficits in their spatial learning ability compared with their control groups. However, there was no significant difference in spatial learning ability for the rats given either 2 or 4 g/kg body weight of ethanol compared with their respective gastrostomy or suckle control animals. We concluded that ethanol exposure greater than 4 g/kg over an 8-h period to 5-day-old rats is required for them to develop long-term deficits in spatial learning behaviour. (C) 1998 Elsevier Science Inc.