69 results for HIPPOCAMPAL SLICES
Abstract:
Introduction. Endoscopic anterior scoliosis correction has been employed recently as a less invasive and level-sparing approach compared with open surgical techniques. We have previously demonstrated that during the two-year post-operative period there was a mean loss of rib hump correction of 1.4 degrees. The purpose of this study was to determine whether intra- or inter-vertebral rotational deformity during the post-operative period could account for the loss of rib hump correction. Materials and Methods. Ten consecutive patients diagnosed with adolescent idiopathic scoliosis were treated with an endoscopic anterior scoliosis correction. Following institutional ethical approval and patient consent, low-dose computed tomography scans of the instrumented segment were obtained post-operatively at 6 and 24 months. Three-dimensional multi-planar reconstruction software (OsiriX Imaging Software, Pixmeo, Switzerland) was used to create axial slices of each vertebral level, corrected in both coronal and sagittal planes. Vertebral rotation was measured using Ho’s method for every available superior and inferior endplate at 6 and 24 months. Positive changes in rotation indicate a reduction in, and therefore an improvement of, vertebral rotation. Intra-observer variability analysis was performed on a subgroup of images. Results. Mean change in rotation for vertebral endplates between 6 and 24 months post-operatively was -0.26° (range -3.5 to 4.9°) within the fused segment and +1.26° (range -7.2 to 15.1°) for the un-instrumented vertebrae above and below the fusion. Mean change in clinically measured rib hump for the 10 patients was -1.6° (range -3 to 0°). The small change in rotation within the fused segment accounts for only 16.5% of the change in rib hump measured clinically, whereas the change in rotation between the un-instrumented vertebrae above and below the construct accounts for 78.8%. 
There was no clear association between rib hump recurrence and intra- or inter-vertebral rotation in individual patients. Intra-rater variability was ±3°. Conclusions. Intra- and inter-vertebral rotation continues post-operatively, both within the instrumented and un-instrumented segments of the immature spine. Rotation between the un-instrumented vertebrae above and below the fusion was +1.26°, suggesting that the un-instrumented vertebrae improved and de-rotated slightly after surgery. This may play a role in rib hump recurrence; however, the effect remains clinically insignificant.
Abstract:
In this study, x-ray CT was used to produce a 3D image of an irradiated PAGAT gel sample, with noise reduction achieved using the ‘zero-scan’ method. The gel was repeatedly CT scanned, and a linear fit to the varying Hounsfield unit of each pixel in the 3D volume was evaluated across the repeated scans, allowing a zero-scan extrapolation of the image to be obtained. To minimise heating of the CT scanner’s x-ray tube, this study used a large slice thickness (1 cm) to provide image slices across the irradiated region of the gel, and a relatively small number of CT scans (63) to extrapolate the zero-scan image. The resulting set of transverse images shows reduced noise compared to images from the initial CT scan of the gel, without being degraded by the additional radiation dose delivered to the gel during the repeated scanning. The full 3D image of the gel has a low spatial resolution in the longitudinal direction, due to the selected scan parameters. Nonetheless, important features of the dose distribution are apparent in the 3D x-ray CT scan of the gel. The results of this study demonstrate that the zero-scan extrapolation method can be applied to the reconstruction of multiple x-ray CT slices, providing useful 2D and 3D images of irradiated dosimetry gels.
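The per-pixel zero-scan extrapolation described above can be sketched as follows (a minimal NumPy illustration; the function name and array layout are assumptions, not the study's code):

```python
import numpy as np

def zero_scan_image(scans):
    """Extrapolate a noise-reduced 'zero-scan' image from repeated CT scans.

    scans: array of shape (n_scans, ny, nx) holding the Hounsfield unit of
    every pixel in each repeated scan. A straight line HU = slope*i + b is
    fitted per pixel across the scan index i; the intercept b estimates the
    pixel value at scan zero, before any imaging dose has been delivered.
    """
    n_scans = scans.shape[0]
    scan_index = np.arange(n_scans)
    flat = scans.reshape(n_scans, -1)               # (n_scans, n_pixels)
    slope, intercept = np.polyfit(scan_index, flat, 1)
    return intercept.reshape(scans.shape[1:])
```

Simply averaging the repeated scans would also reduce noise, but the average would include the dose-induced drift in pixel values; extrapolating the fit back to scan zero removes that drift as well.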
Abstract:
Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily based on the need to improve patient set-up accuracy. There has recently been an interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, utilising a conversion between CT number and electron density of various tissues. The use of MV CBCT has particular advantages compared to treatment planning with kilovoltage CT in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. Therefore, a study was undertaken to characterise the pixel value to electron density (ED) relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units used for acquisition. If a significant difference with the number of monitor units were seen, then separate pixel value to ED conversions might be required for each of the clinical settings. The calibration of the MV CBCT images for electron density offers the possibility of a daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, which ranged from 0.292 to 1.707 in electron density relative to the background solid water, was determined by taking the mean value from within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron density of each insert, and a linear least squares fit was performed. This procedure was repeated for images acquired with 5, 8, 15 and 60 monitor units. 
Results: A linear relationship between MV CBCT pixel value and ED was demonstrated for all monitor unit settings and over the full range of electron densities. The number of monitor units utilised was found to have no significant impact on this relationship. Discussion: It was found that the number of MU utilised does not significantly alter the pixel value obtained for different ED materials. However, to ensure the most accurate and reproducible pixel value to ED calibration, one MU setting should be chosen and used routinely, and to ensure accuracy for the clinical situation this MU setting should correspond to that used clinically. If more than one MU setting is used clinically, then an average of the CT values acquired with the different numbers of MU could be utilised without loss of accuracy. Conclusions: No significant differences have been shown in the pixel value to ED conversion for the Siemens MV CBCT unit with change in monitor units. Thus a single conversion curve could be utilised for MV CBCT treatment planning. To fully utilise MV CBCT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be either for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices. This will potentially allow the cumulative dose distribution to be determined through the patient’s multi-fraction treatment, and adaptive treatment strategies to be developed to optimise the tumour response.
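A calibration of this kind (a linear least squares fit of mean insert pixel value against relative electron density, then inverted to map image pixel values to ED for dose calculation) might be sketched as follows; the detector response numbers used in the example are hypothetical, not the study's data:

```python
import numpy as np

def fit_calibration(relative_ed, pixel_values):
    """Least squares linear fit: pixel_value ~= slope * ED + offset."""
    slope, offset = np.polyfit(relative_ed, pixel_values, 1)
    return slope, offset

def pixel_to_ed(pixel_value, slope, offset):
    """Invert the calibration curve to convert an image pixel value back
    to relative electron density for dose calculation."""
    return (pixel_value - offset) / slope
```

In practice one such curve would be fitted per MU setting; the study's finding is that the curves agree closely enough that a single one suffices.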
Abstract:
Osmotic treatments are often applied prior to the convective drying of foods to improve their sensory appeal. During this process a multicomponent mass flow, composed mainly of water and the osmotic agent, takes place. In this work, a heat and mass transfer model for the osmo-convective drying of yacon was developed and solved by the Finite Element Method using COMSOL Multiphysics®, considering a 2-D axisymmetric geometry and moisture-dependent thermophysical properties. Yacon slices were osmotically dehydrated for 2 hours in a sucralose solution and then dried in a tray dryer for 3 hours. The model was validated against experimental data for temperature, moisture content and sucralose uptake (R² > 0.90).
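The study solved a 2-D axisymmetric FEM model in COMSOL with moisture-dependent properties; as a much simpler illustration of the underlying physics, 1-D Fickian moisture diffusion in a drying slab can be stepped with an explicit finite-difference scheme in a few lines (all parameter values below are hypothetical, not the paper's):

```python
import numpy as np

def dry_slab(m0, diffusivity, thickness, m_eq, t_end, nx=21, safety=0.4):
    """Explicit finite-difference sketch of 1-D Fickian moisture diffusion
    in a drying slab. m0: initial moisture content; m_eq: equilibrium
    moisture at the air-exposed surface. Returns the moisture profile
    across the half-thickness (centre to surface) at time t_end.
    """
    dx = (thickness / 2.0) / (nx - 1)
    dt = safety * dx**2 / diffusivity     # below the explicit stability limit
    m = np.full(nx, m0, dtype=float)
    for _ in range(int(t_end / dt)):
        new = m.copy()
        new[1:-1] = m[1:-1] + diffusivity * dt / dx**2 * (m[2:] - 2.0 * m[1:-1] + m[:-2])
        new[-1] = m_eq                    # surface in equilibrium with drying air
        new[0] = new[1]                   # symmetry at the slab centre
        m = new
    return m
```

This captures only the moisture field; the actual model also couples heat transfer and sucralose uptake, which is why a multiphysics FEM solver was used.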
Abstract:
Lyngbya majuscula is a cyanobacterium (blue-green alga) occurring naturally in tropical and subtropical coastal areas worldwide. Deception Bay, in northern Moreton Bay, Queensland, has a history of Lyngbya blooms and forms a case study for this investigation. The South East Queensland (SEQ) Healthy Waterways Partnership, a collaboration between government, industry, research and the community, was formed to address issues affecting the health of the river catchments and waterways of South East Queensland. The Partnership coordinated the Lyngbya Research and Management Program (2005-2007), which culminated in a Coastal Algal Blooms (CAB) Action Plan for harmful and nuisance algal blooms such as Lyngbya majuscula. This first phase of the project was predominantly scientific in nature and also facilitated the collection of additional data to better understand Lyngbya blooms. The second phase of the project, the SEQ Healthy Waterways Strategy 2007-2012, is now underway to implement the CAB Action Plan and as such is more management focussed. As part of the first phase of the project, a Science model for the initiation of a Lyngbya bloom was built using Bayesian Networks (BNs). The structure of the Science Bayesian Network was built by the Lyngbya Science Working Group (LSWG), which was drawn from diverse disciplines. The BN was then quantified with annual data and expert knowledge. Scenario testing confirmed the expected temporal nature of bloom initiation, and it was recommended that the next version of the BN be extended to take this into account. Elicitation for this BN thus occurred at three levels: design, quantification and verification. The first level involved construction of the conceptual model itself, definition of the nodes within the model and identification of sources of information to quantify the nodes. The second level included elicitation of expert opinion and representation of this information in a form suitable for inclusion in the BN. 
The third and final level concerned the specification of scenarios used to verify the model. The second phase of the project provides the opportunity to update the network with the newly collected detailed data obtained during the previous phase. Specifically, the temporal nature of Lyngbya blooms is of interest: management efforts need to be directed to the periods when the Bay is most vulnerable to bloom initiation. To model the temporal aspects of Lyngbya, we are using Object Oriented Bayesian Networks (OOBNs) to create ‘time slices’ for each of the periods of interest during the summer. OOBNs provide a framework to simplify knowledge representation and facilitate reuse of nodes and network fragments. An OOBN is more hierarchical than a traditional BN, with any sub-network able to contain other sub-networks. Connectivity between OOBNs is an important feature and allows information to flow between the time slices. This study demonstrates a more sophisticated use of expert information within Bayesian networks, combining expert knowledge with data (categorised using expert-defined thresholds) within an expert-defined model structure. Based on the results of the verification process, the experts are able to target areas requiring greater precision and those exhibiting temporal behaviour. The time slices incorporate the data for each time period for each of the temporal nodes (instead of using the annual data from the previous static Science BN) and include lag effects that allow the effect from one time slice to flow to the next. We demonstrate a concurrent steady increase in the probability of initiation of a Lyngbya bloom and conclude that the inclusion of temporal aspects in the BN model is consistent with the perceptions of Lyngbya behaviour held by the stakeholders. 
This extended model provides a more accurate representation of the increased risk of algal blooms in the summer months and shows that the opinions elicited to inform a static BN can be readily extended to a dynamic OOBN, providing more comprehensive information for decision makers.
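The lag mechanism can be caricatured in a few lines: each summer time slice combines its own evidence-based probability of bloom initiation with a carry-over from the previous slice. This is only an illustration of how lag effects let one time slice influence the next; the probabilities and the mixing weight are invented, and the real OOBN propagates full conditional distributions rather than a single scalar:

```python
def propagate_slices(prior, slice_probs, lag_weight=0.3):
    """Forward pass over temporal 'time slices'.

    prior: probability of bloom initiation before summer.
    slice_probs: per-period initiation probabilities implied by the
    within-slice evidence alone (hypothetical numbers).
    lag_weight: fraction of each slice's value carried over from the
    previous slice, standing in for the lag effect between slices.
    """
    p = prior
    history = []
    for p_slice in slice_probs:
        p = lag_weight * p + (1.0 - lag_weight) * p_slice
        history.append(p)
    return history
```

With rising within-slice probabilities, the carried-over term produces the steady increase over the summer months described above.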
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or data which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and for reporting findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. 
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, while knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of them. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this sense, keyword creation is part strategy and part art. 
In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
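The content-analysis approach in the first paper (scoring aspects of a tweet for topical relevance, urgency and author authority, then ranking the stream so only the top items reach responders) could be sketched like this; all keyword lists, weights and user names below are invented for illustration, not taken from the panel:

```python
def score_tweet(tweet, topic_keywords, urgency_terms, authority_users):
    """Toy triage score combining relevance, urgency and authority cues.

    tweet: dict with 'text' and 'user' keys. Higher-scoring tweets are
    placed in front of responders first; the weights are placeholders.
    """
    text = tweet["text"].lower()
    relevance = sum(word in text for word in topic_keywords)
    urgency = sum(term in text for term in urgency_terms)
    authority = 2 if tweet.get("user") in authority_users else 0
    return relevance + 2 * urgency + authority

def triage(tweets, capacity, **score_args):
    """Keep only the top-scoring tweets, matched to responder capacity."""
    ranked = sorted(tweets, key=lambda t: score_tweet(t, **score_args), reverse=True)
    return ranked[:capacity]
```

In a live deployment the keyword and user lists would themselves be refined iteratively as the event evolves, which is the feedback loop the panel describes.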
Abstract:
This study extends the ‘zero scan’ method for CT imaging of polymer gel dosimeters to multi-slice acquisitions. Multi-slice CT images consisting of 24 slices of 1.2 mm thickness were acquired of an irradiated polymer gel dosimeter and processed with the zero-scan technique. The results demonstrate that zero-scan gel readout can be successfully applied to generate a three-dimensional image of the irradiated gel field. Compared to the raw CT images, the processed images and cross-gel profiles demonstrated reduced noise and clear visibility of the penumbral region. Moreover, these results further highlight the suitability of this method for volumetric reconstruction with reduced CT data acquisition per slice. This work shows that 3D volumes of irradiated polymer gel dosimeters can be acquired and processed with x-ray CT.
Abstract:
Olfactory ensheathing cells (OECs) are specialized glial cells in the mammalian olfactory system supporting growth of axons from the olfactory epithelium into the olfactory bulb. OECs in the olfactory bulb can be subdivided into OECs of the outer nerve layer and the inner nerve layer according to the expression of marker proteins and their location in the nerve layer. In the present study, we have used confocal calcium imaging of OECs in acute mouse brain slices and olfactory bulbs in toto to investigate physiological differences between OEC subpopulations. OECs in the outer nerve layer, but not the inner nerve layer, responded to glutamate, ATP, serotonin, dopamine, carbachol, and phenylephrine with increases in the cytosolic calcium concentration. The calcium responses consisted of a transient and a tonic component, the latter being mediated by store-operated calcium entry. Calcium measurements in OECs during the first three postnatal weeks revealed a downregulation of mGluR(1) and P2Y(1) receptor-mediated calcium signaling within the first 2 weeks, suggesting that the expression of these receptors is developmentally controlled. In addition, electrical stimulation of sensory axons evoked calcium signaling via mGluR(1) and P2Y(1) only in outer nerve layer OECs. Downregulation of the receptor-mediated calcium responses in postnatal animals is reflected by a decrease in amplitude of stimulation-evoked calcium transients in OECs from postnatal days 3 to 21. In summary, the results presented reveal striking differences in receptor responses during development and in axon-OEC communication between the two subpopulations of OECs in the olfactory bulb.
Abstract:
Insulated rail joints (IRJs) are an integral part of the rail track signalling system and pose significant maintenance and replacement costs due to their low and fluctuating service lives. Failure occurs mainly in the rail head region, the bolt-holes of fishplates and the web-holes of the rails. Propagation of cracks is influenced by the evolution of internal residual stresses in rails during rail manufacturing (hot-rolling, roller-straightening and head-hardening processes) and during service, particularly in heavy haul freight systems where loads are high. In this investigation, residual stresses accumulated in the rail head were analysed using neutron diffraction at the Australian Nuclear Science and Technology Organisation (ANSTO). Two ex-service head-hardened rail joints damaged under different loading were examined, and the results were compared with those obtained from an unused rail joint reference sample in order to differentiate the stresses developed during rail manufacturing from the stresses accumulated during rail service. Neutron diffraction analyses were carried out on the samples in the longitudinal, transverse and vertical directions, and on 5 mm thick sliced samples cut by Electric Discharge Machining (EDM). For the rail joints from the service line, irrespective of loading conditions and in-service times, the results revealed similar depth profiles of stress distribution. The evolution of residual stress fields in rails due to service was also accompanied by evidence of larger material flow, based on reflected light and scanning electron microscopy studies. Stress evolution in the vicinity of the rail ends was characterised by a compressive layer, approximately 5 mm deep, and a tension zone located approximately 5-15 mm below the surfaces. A significant variation of d0 with depth near the top surface was detected and was attributed to decarburisation in the top layer induced by cold work. 
Stress distributions observed in longitudinal slices of the two different deformed rail samples were found to be similar. For the undeformed rail, the stress distributions obtained could be attributed to variations associated with the thermo-mechanical history of the rail.
Abstract:
Adult neural stem cells (NSCs) play important roles in learning and memory and are negatively impacted by neurological disease. It is known that biochemical and genetic factors regulate self-renewal and differentiation, and it has recently been suggested that mechanical and solid-state cues, such as extracellular matrix (ECM) stiffness, can also regulate the functions of NSCs and other stem cell types. However, relatively little is known of the molecular mechanisms through which stem cells transduce mechanical inputs into fate decisions, the extent to which mechanical inputs instruct fate decisions versus select for or against lineage-committed blast populations, or the in vivo relevance of mechanotransductive signaling molecules in native stem cell niches. Here we demonstrate that ECM-derived mechanical signals act through Rho GTPases to activate the cellular contractility machinery in a key early window during differentiation to regulate NSC lineage commitment. Furthermore, culturing NSCs on increasingly stiff ECMs enhances RhoA and Cdc42 activation, increases NSC stiffness, and suppresses neurogenesis. Likewise, inhibiting RhoA and Cdc42 or downstream regulators of cellular contractility rescues NSCs from stiff matrix- and Rho GTPase-induced neurosuppression. Importantly, Rho GTPase expression and ECM stiffness do not alter proliferation or apoptosis rates, indicating that an instructive rather than selective mechanism modulates lineage distributions. Finally, in the adult brain, RhoA activation in hippocampal progenitors suppresses neurogenesis, analogous to its effect in vitro. These results establish Rho GTPase-based mechanotransduction and cellular stiffness as biophysical regulators of NSC fate in vitro and RhoA as an important regulatory protein in the hippocampal stem cell niche.
Abstract:
The lipid composition of the human lens is distinct from that of most other tissues in that it is high in dihydrosphingomyelin, and the most abundant glycerophospholipids in the lens are unusual 1-O-alkyl-ether linked phosphatidylethanolamines and phosphatidylserines. In this study, desorption electrospray ionization (DESI) mass spectrometry imaging was used to determine the distribution of these lipids in the human lens, along with other lipids including ceramides, ceramide-1-phosphates, and lyso 1-O-alkyl ethers. To achieve this, 25 μm lens slices were mounted onto glass slides and analyzed using a linear ion-trap mass spectrometer equipped with a custom-built, 2-D automated DESI source. In contrast to other tissues that have previously been analyzed by DESI, the presence of a strong acid in the spray solvent was required to desorb lipids directly from lens tissue. Distinctive distributions were observed for [M + H]+ ions arising from each lipid class. Of particular interest were the ionized 1-O-alkyl phosphatidylethanolamines and phosphatidylserines, PE (18:1e/18:1) and PS (18:1e/18:1), which were found in a thin ring in the outermost region of the lens. This distribution was confirmed by quantitative analysis of lenses that were sectioned into four distinct regions (outer, barrier, inner, and core), extracted and analyzed by electrospray ionization tandem mass spectrometry. DESI imaging also revealed a complementary distribution for the structurally related lyso 1-O-alkyl phosphatidylethanolamine, LPE (18:1e), which was localized closer to the centre of the lens. The data obtained in this study indicate that DESI imaging is a powerful tool for determining the spatial distribution of human lens lipids. © 2010 American Society for Mass Spectrometry.
Abstract:
Neuropsychological tests requiring patients to find a path through a maze can be used to assess visuospatial memory performance in temporal lobe pathology, particularly in the hippocampus. Alternatively, they have been used as a task sensitive to executive function in patients with frontal lobe damage. We measured performance on the Austin Maze in patients with unilateral left and right temporal lobe epilepsy (TLE), with and without hippocampal sclerosis, compared to healthy controls. Performance was correlated with a number of other neuropsychological tests to identify the cognitive components that may be associated with poor Austin Maze performance. Patients with right TLE were significantly impaired on the Austin Maze task relative to patients with left TLE and controls, and error scores correlated with their performance on the Block Design task. The performance of patients with left TLE was also impaired relative to controls; however, errors correlated with performance on tests of executive function and delayed recall. The presence of hippocampal sclerosis did not have an impact on maze performance. A discriminant function analysis indicated that the Austin Maze alone correctly classified 73.5% of patients as having right TLE. In summary, impaired performance on the Austin Maze task is more suggestive of right than left TLE; however, impaired performance on this visuospatial task does not necessarily involve the hippocampus. The relationship of the Austin Maze task with other neuropsychological tests suggests that differential cognitive components may underlie performance decrements in right versus left TLE.
Abstract:
An analysis of the emissions from 14 CNG and 5 diesel buses was conducted during April and May 2006. Studies were conducted under both steady-state and transient driving modes on a vehicle dynamometer utilising a CVS dilution system. This article focuses on the volatile properties of particles from 4 CNG and 4 diesel vehicles within this group, with priority given to the previously un-investigated CNG emissions produced at transient loads. Particle number concentration data were collected by three CPCs (TSI 3022, 3010 and 3782 WCPC) with D50 cut-offs of 5 nm, 10 nm and 20 nm respectively. Size distribution data were collected using a TSI 3080 SMPS with a 3025 CPC during the steady-state driving modes. During transient cycles, mono-disperse “slices” of between 5 nm and 25 nm were measured. The volatility of these particles was determined by placing a thermodenuder before the 3022 and the SMPS and measuring the reduction in particle number concentration as the temperature in the thermodenuder was increased. This was then normalised against the total particle count given by the 3010 CPC to provide high-resolution information on the reduction in particle concentration with respect to temperature.
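The normalisation step, dividing the denuded count by a parallel reference CPC count to remove fluctuations in the raw emission rate and then expressing the result relative to the lowest-temperature point, might be computed as follows (the counts in the example are illustrative only, not measured data):

```python
import numpy as np

def remaining_fraction(temperatures, denuded_counts, reference_counts):
    """Fraction of particles surviving the thermodenuder at each set
    temperature, normalised against the undenuded reference CPC count
    and expressed relative to the lowest-temperature measurement."""
    ratio = np.asarray(denuded_counts, float) / np.asarray(reference_counts, float)
    return ratio / ratio[np.argmin(temperatures)]
```

A steep drop in this fraction with temperature indicates a largely volatile particle population; a flat curve indicates a non-volatile (e.g. solid) core.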
Abstract:
It is well established that the coordinated regulation of activity-dependent gene expression by the histone acetyltransferase (HAT) family of transcriptional coactivators is crucial for the formation of contextual fear and spatial memory, and for hippocampal synaptic plasticity. However, no studies have examined the role of this epigenetic mechanism within the infralimbic prefrontal cortex (ILPFC), an area of the brain that is essential for the formation and consolidation of fear extinction memory. Here we report that a postextinction training infusion of a combined p300/CBP inhibitor (Lys-CoA-Tat), directly into the ILPFC, enhances fear extinction memory in mice. Our results also demonstrate that the HAT p300 is highly expressed within pyramidal neurons of the ILPFC and that the small-molecule p300-specific inhibitor (C646) infused into the ILPFC immediately after weak extinction training enhances the consolidation of fear extinction memory. C646 infused 6 h after extinction had no effect on fear extinction memory, nor did an immediate postextinction training infusion into the prelimbic prefrontal cortex. Consistent with the behavioral findings, inhibition of p300 activity within the ILPFC facilitated long-term potentiation (LTP) under stimulation conditions that do not evoke long-lasting LTP. These data suggest that one function of p300 activity within the ILPFC is to constrain synaptic plasticity, and that a reduction in the function of this HAT is required for the formation of fear extinction memory.
Abstract:
The therapeutic effects induced by serotonin-selective reuptake inhibitor (SSRI) antidepressants are initially triggered by blocking the serotonin transporter and rely on long-term adaptations of pre- and post-synaptic receptors. We show here that long-term behavioral and neurogenic SSRI effects are abolished after either genetic or pharmacological inactivation of 5-HT(2B) receptors. Conversely, direct agonist stimulation of 5-HT(2B) receptors induces an SSRI-like response in behavioral and neurogenic assays. Moreover, the observation that (i) this receptor is expressed by raphe serotonergic neurons, (ii) the SSRI-induced increase in hippocampal extracellular serotonin concentration is strongly reduced in the absence of functional 5-HT(2B) receptors and (iii) a selective 5-HT(2B) agonist mimics SSRI responses, supports a positive regulation of serotonergic neurons by 5-HT(2B) receptors. The 5-HT(2B) receptor appears, therefore, to positively modulate serotonergic activity and to be required for the therapeutic actions of SSRIs. Consequently, the 5-HT(2B) receptor should be considered as a new tractable target in the combat against depression.