Abstract:
It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This ‘Cartesian’ description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise ‘blueprint’ of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of ‘fundamental’, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense.
If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the ‘computational neuroanatomy’ strategy for neuroscience databases.
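As a minimal sketch of the ‘Cartesian’ description above, a dendritic tree can be held as a list of cylinders (identifier, parent, coordinates, diameter), from which a classical statistic such as branching order is directly measurable. All names and the toy tree below are illustrative assumptions, not taken from L-NEURON or ARBORVITAE:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Compartment:
    """One cylinder of a traced dendritic tree."""
    ident: int
    parent: Optional[int]  # None for the root (soma)
    x: float
    y: float
    z: float
    diameter: float

def branch_orders(tree):
    """Branch order of each compartment: number of bifurcations
    encountered on the path from the root."""
    children = {}
    for c in tree:
        children.setdefault(c.parent, []).append(c.ident)
    root = next(c.ident for c in tree if c.parent is None)
    orders = {}
    stack = [(root, 0)]
    while stack:
        node, order = stack.pop()
        orders[node] = order
        kids = children.get(node, [])
        bump = 1 if len(kids) > 1 else 0  # order rises only at bifurcations
        for k in kids:
            stack.append((k, order + bump))
    return orders
```

A classical summary such as maximum branching order is then `max(branch_orders(tree).values())`, while the list itself remains a complete geometric record.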
Abstract:
Measured process data normally contain inaccuracies because the measurements are obtained using imperfect instruments. As well as random errors, one can expect systematic bias caused by miscalibrated instruments, or outliers caused by process peaks such as sudden power fluctuations. Data reconciliation is the adjustment of a set of process data, based on a model of the process, so that the derived estimates conform to natural laws. In this paper, techniques for the detection and identification of both systematic bias and outliers in dynamic process data are presented. A novel technique for the detection and identification of systematic bias is formulated, and the detection, identification and elimination of outliers is treated using a modified version of a previously available clustering technique. These techniques are then combined to provide a global dynamic data reconciliation (DDR) strategy. The algorithms presented are tested in isolation and in combination using dynamic simulations of two continuous stirred tank reactors (CSTRs).
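The reconciliation step itself can be made concrete with a small weighted least-squares sketch: adjust measurements as little as possible, in proportion to their uncertainty, so that a linear conservation law holds exactly. The single splitter balance, the flow values and the function name below are invented for illustration and are not the paper's algorithm:

```python
import numpy as np

def reconcile(y, A, sigma):
    """Adjust measurements y (standard deviations sigma) so that the
    linear conservation laws A @ x = 0 hold exactly, minimizing the
    weighted squared adjustment (x - y)^T W^-1 (x - y)."""
    W = np.diag(np.asarray(sigma, float) ** 2)   # measurement error covariance
    gain = W @ A.T @ np.linalg.inv(A @ W @ A.T)
    return y - gain @ (A @ y)

# One stream splitter: feed f1 should equal f2 + f3, but the raw
# measurements violate the balance by 0.5 units.
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([10.0, 6.0, 3.5])
x = reconcile(y, A, sigma=np.array([0.5, 0.5, 0.5]))
```

With equal uncertainties the 0.5-unit imbalance is spread evenly over the three flows; bias or outlier detection would examine the size of these adjustments against their expected statistics.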
Abstract:
The premotor theory of attention claims that attentional shifts are triggered during response programming, regardless of which response modality is involved. To investigate this claim, event-related brain potentials (ERPs) were recorded while participants covertly prepared a left or right response, as indicated by a precue presented at the beginning of each trial. Cues signalled a left or right eye movement in the saccade task, and a left or right manual response in the manual task. The cued response had to be executed or withheld following the presentation of a Go/Nogo stimulus. Although there were systematic differences between ERPs triggered during covert manual and saccade preparation, lateralised ERP components sensitive to the direction of a cued response were very similar for both tasks, and also similar to the components previously found during cued shifts of endogenous spatial attention. This is consistent with the claim that the control of attention and of covert response preparation are closely linked. N1 components triggered by task-irrelevant visual probes presented during the covert response preparation interval were enhanced when these probes were presented close to the cued response hand in the manual task, and at the saccade target location in the saccade task. This demonstrates that both manual and saccade preparation result in spatially specific modulations of visual processing, in line with the predictions of the premotor theory.
Abstract:
Numerical weather prediction (NWP) centres use numerical models of the atmospheric flow to forecast future weather states from an estimate of the current state. Variational data assimilation (VAR) is commonly used to determine an optimal state estimate that minimizes the errors between observations of the dynamical system and model predictions of the flow. The rate of convergence of the VAR scheme and the sensitivity of the solution to errors in the data are dependent on the condition number of the Hessian of the variational least-squares objective function. The traditional formulation of VAR is ill-conditioned and hence leads to slow convergence and an inaccurate solution. In practice, operational NWP centres precondition the system via a control variable transform to reduce the condition number of the Hessian. In this paper we investigate the conditioning of VAR for a single, periodic, spatially-distributed state variable. We present theoretical bounds on the condition number of the original and preconditioned Hessians and hence demonstrate the improvement produced by the preconditioning. We also investigate theoretically the effect of observation position and error variance on the preconditioned system and show that the problem becomes more ill-conditioned with increasingly dense and accurate observations. Finally, we confirm the theoretical results in an operational setting by giving experimental results from the Met Office variational system.
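The improvement from the control variable transform can be illustrated numerically. The VAR Hessian is S = B^-1 + H^T R^-1 H; writing B = L L^T and transforming x = L v gives the preconditioned Hessian I + L^T H^T R^-1 H L. The toy setup below (an AR(1)-style background covariance on 20 grid points with three point observations) is invented for illustration, not taken from the Met Office system:

```python
import numpy as np

def var_condition_numbers(B, H, R):
    """Condition numbers of the VAR Hessian S = B^-1 + H^T R^-1 H and
    of its preconditioned form I + L^T H^T R^-1 H L, where B = L L^T."""
    Rinv = np.linalg.inv(R)
    S = np.linalg.inv(B) + H.T @ Rinv @ H
    L = np.linalg.cholesky(B)                    # control variable transform x = L v
    S_pre = np.eye(B.shape[0]) + L.T @ H.T @ Rinv @ H @ L
    return np.linalg.cond(S), np.linalg.cond(S_pre)

# Correlated background errors on a 1-D grid, three point observations
# with error variance 0.04.
n = 20
idx = np.arange(n)
B = 0.9 ** np.abs(idx[:, None] - idx[None, :])   # AR(1)-like correlations
H = np.zeros((3, n))
H[0, 2] = H[1, 10] = H[2, 17] = 1.0
R = 0.04 * np.eye(3)
c_orig, c_pre = var_condition_numbers(B, H, R)
```

In this setting the preconditioned condition number is far smaller than the original; making the observations denser or more accurate (larger H, smaller R) pushes it back up, as the paper's bounds predict.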
Abstract:
Sub-seasonal variability including equatorial waves significantly influences the dehydration and transport processes in the tropical tropopause layer (TTL). This study investigates the wave activity in the TTL in 7 reanalysis data sets (RAs; NCEP1, NCEP2, ERA40, ERA-Interim, JRA25, MERRA, and CFSR) and 4 chemistry climate models (CCMs; CCSRNIES, CMAM, MRI, and WACCM) using the zonal wave number-frequency spectral analysis method with equatorially symmetric-antisymmetric decomposition. Analyses are made for temperature and horizontal winds at 100 hPa in the RAs and CCMs and for outgoing longwave radiation (OLR), which is a proxy for convective activity that generates tropopause-level disturbances, in satellite data and the CCMs. Particular focus is placed on equatorial Kelvin waves, mixed Rossby-gravity (MRG) waves, and the Madden-Julian Oscillation (MJO). The wave activity is defined as the variance, i.e., the power spectral density integrated in a particular zonal wave number-frequency region. It is found that the TTL wave activities show significant differences among the RAs, ranging from ∼0.7 (for NCEP1 and NCEP2) to ∼1.4 (for ERA-Interim, MERRA, and CFSR) with respect to the averages from the RAs. The TTL wave activities in the CCMs lie generally within the range of those in the RAs, with a few exceptions. However, the spectral features in OLR for all the CCMs are very different from those in the observations, and the OLR wave activities are too low for CCSRNIES, CMAM, and MRI. It is concluded that the broad range of wave activity found in the different RAs decreases our confidence in their validity and in particular their value for validation of CCM performance in the TTL, thereby limiting our quantitative understanding of the dehydration and transport processes in the TTL.
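The core of the analysis method can be sketched in a few lines: split the field into equatorially symmetric and antisymmetric parts, Fourier-transform in time and longitude, and integrate the power over a wavenumber-frequency box to obtain a wave activity. The array shapes and random test field below are illustrative; real analyses add windowing, detrending and background-spectrum removal, which are omitted here:

```python
import numpy as np

def symasym_power(field):
    """field: (time, lat, lon), latitudes ordered symmetrically about
    the equator. Returns wavenumber-frequency power spectra of the
    symmetric and antisymmetric components, averaged over latitude."""
    sym = 0.5 * (field + field[:, ::-1, :])
    asym = 0.5 * (field - field[:, ::-1, :])
    def power(x):
        spec = np.fft.fft2(x, axes=(0, 2))   # FFT in time and longitude
        return (np.abs(spec) ** 2).mean(axis=1)
    return power(sym), power(asym)

def wave_activity(power_spec, freq_slice, wavenum_slice):
    """Wave activity as defined above: power spectral density summed
    over a zonal wavenumber-frequency region (index ranges here)."""
    return power_spec[freq_slice, wavenum_slice].sum()
```

A useful check on the decomposition is that the symmetric and antisymmetric spectra sum, bin by bin, to the spectrum of the full field.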
Abstract:
Large changes in the extent of northern subtropical arid regions during the Holocene are attributed to orbitally forced variations in monsoon strength and have been implicated in the regulation of atmospheric trace gas concentrations on millennial timescales. Models that omit biogeophysical feedback, however, are unable to account for the full magnitude of African monsoon amplification and extension during the early to middle Holocene (~9500–5000 years B.P.). A data set describing land-surface conditions 6000 years B.P. on a 1° × 1° grid across northern Africa and the Arabian Peninsula has been prepared from published maps and other sources of palaeoenvironmental data, with the primary aim of providing a realistic lower boundary condition for atmospheric general circulation model experiments similar to those performed in the Palaeoclimate Modelling Intercomparison Project. The data set includes information on the percentage of each grid cell occupied by specific vegetation types (steppe, savanna, xerophytic woods/scrub, tropical deciduous forest, and tropical montane evergreen forest), open water (lakes), and wetlands, plus information on the flow direction of major drainage channels for use in large-scale palaeohydrological modeling.
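The per-cell content of such a data set can be pictured as a simple record with a consistency check on the cover fractions. The field names, type labels and validation rule below are illustrative assumptions about a format of this kind, not the published data set's actual schema:

```python
from dataclasses import dataclass
from typing import Dict, Optional

SURFACE_TYPES = ("steppe", "savanna", "xerophytic_woods_scrub",
                 "tropical_deciduous_forest",
                 "tropical_montane_evergreen_forest",
                 "lakes", "wetlands")

@dataclass
class GridCell:
    """One 1-degree x 1-degree cell: percent of the cell occupied by
    each surface type, plus the flow direction of any major channel."""
    lat: float
    lon: float
    cover: Dict[str, float]               # surface type -> percent of cell
    flow_direction: Optional[str] = None  # e.g. "N", "SE"; None if no channel

    def __post_init__(self):
        unknown = set(self.cover) - set(SURFACE_TYPES)
        if unknown:
            raise ValueError(f"unknown surface types: {sorted(unknown)}")
        total = sum(self.cover.values())
        if total > 100.0 + 1e-9:
            raise ValueError(f"cover sums to {total:.1f}% (> 100%)")
```

A consuming climate model would aggregate such records into gridded boundary-condition fields, with the validation catching transcription errors from the source maps.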
Abstract:
The term 'big data' has recently emerged to describe a range of technological and commercial trends enabling the storage and analysis of huge amounts of customer data, such as that generated by social networks and mobile devices. Much of the commercial promise of big data is in the ability to generate valuable insights from collecting new types and volumes of data in ways that were not previously economically viable. At the same time, a number of questions have been raised about the implications for individual privacy. This paper explores key perspectives underlying the emergence of big data, and considers both the opportunities and ethical challenges raised for market research.
Abstract:
Language processing plays a crucial role in language development, providing the ability to assign structural representations to input strings (e.g., Fodor, 1998). In this paper we aim to contribute to the study of children's processing routines, examining the operations underlying the auditory processing of relative clauses in children compared to adults. English-speaking children (6–8;11) and adults participated in the study, which employed a self-paced listening task with a final comprehension question. The aim was to determine (i) the role of number agreement in object relative clauses in which the subject and object NPs differ in terms of number properties, and (ii) the role of verb morphology (active vs. passive) in subject relative clauses. Even though children's off-line accuracy was not always comparable to that of adults, analyses of reaction time results support the view that children have the same structural processing reflexes observed in adults.
Abstract:
This study evaluates model-simulated dust aerosols over North Africa and the North Atlantic from five global models that participated in the Aerosol Comparison between Observations and Models phase II model experiments. The model results are compared with satellite aerosol optical depth (AOD) data from Moderate Resolution Imaging Spectroradiometer (MODIS), Multiangle Imaging Spectroradiometer (MISR), and Sea-viewing Wide Field-of-view Sensor, dust optical depth (DOD) derived from MODIS and MISR, AOD and coarse-mode AOD (as a proxy of DOD) from ground-based Aerosol Robotic Network Sun photometer measurements, and dust vertical distributions/centroid height from Cloud-Aerosol Lidar with Orthogonal Polarization and Atmospheric Infrared Sounder satellite AOD retrievals. We examine the following quantities of AOD and DOD: (1) the magnitudes over land and over ocean in our study domain, (2) the longitudinal gradient from the dust source region over North Africa to the western North Atlantic, (3) seasonal variations at different locations, and (4) the dust vertical profile shape and the AOD centroid height (altitude above or below which half of the AOD is located). The different satellite data show consistent features in most of these aspects; however, the models display large diversity in all of them, with significant differences among the models and between models and observations. By examining dust emission, removal, and mass extinction efficiency in the five models, we also find remarkable differences among the models that all contribute to the discrepancies of model-simulated dust amount and distribution. This study highlights the challenges in simulating the dust physical and optical processes, even in the best-known dust environment, and stresses the need for observable quantities to constrain the model processes.
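The AOD centroid height defined above has a direct computational reading: accumulate the per-layer contributions to the column AOD and interpolate to the altitude where the running sum reaches half the total. The layer values in this sketch are invented; real retrievals start from extinction profiles rather than ready-made layer AODs:

```python
import numpy as np

def aod_centroid_height(z, layer_aod):
    """Altitude above/below which half of the column AOD is located.
    z: layer altitudes in km, ascending; layer_aod: each layer's
    contribution to the total column AOD."""
    cum = np.cumsum(np.asarray(layer_aod, float))
    return float(np.interp(0.5 * cum[-1], cum, z))
```

For a vertically uniform profile the centroid sits mid-column; a bottom-heavy dust layer pulls it down, which is exactly the profile-shape diagnostic the study compares across models and lidar observations.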
Abstract:
Purpose: To investigate the relationship between research data management (RDM) and data sharing in the formulation of RDM policies and development of practices in higher education institutions (HEIs). Design/methodology/approach: Two strands of work were undertaken sequentially: firstly, content analysis of 37 RDM policies from UK HEIs; secondly, two detailed case studies of institutions with different approaches to RDM based on semi-structured interviews with staff involved in the development of RDM policy and services. The data are interpreted using insights from Actor Network Theory. Findings: RDM policy formation and service development has created a complex set of networks within and beyond institutions involving different professional groups with widely varying priorities shaping activities. Data sharing is considered an important activity in the policies and services of HEIs studied, but its prominence can in most cases be attributed to the positions adopted by large research funders. Research limitations/implications: The case studies, as research based on qualitative data, cannot be assumed to be universally applicable but do illustrate a variety of issues and challenges experienced more generally, particularly in the UK. Practical implications: The research may help to inform development of policy and practice in RDM in HEIs and funder organisations. Originality/value: This paper makes an early contribution to the RDM literature on the specific topic of the relationship between RDM policy and services, and openness – a topic which to date has received limited attention.