984 results for Skinner, Thompson J.


Relevance: 10.00%

Publisher:

Abstract:

Several genetic variants are thought to influence white matter (WM) integrity, measured with diffusion tensor imaging (DTI). Voxel-based methods can test genetic associations, but heavy multiple-comparisons corrections are required to adjust for searching the whole brain and for all genetic variants analyzed. Thus, genetic associations are hard to detect even in large studies. Using a recently developed multi-SNP analysis, we examined the joint predictive power of a group of 18 cholesterol-related single nucleotide polymorphisms (SNPs) on WM integrity, measured by fractional anisotropy. To boost power, we limited the analysis to brain voxels that showed significant associations with total serum cholesterol levels. From this space, we identified two genes with effects that replicated in individual voxel-wise analyses of the whole brain. Multivariate analyses of genetic variants on a reduced anatomical search space may help to identify SNPs with the strongest effects on the brain from a broad panel of genes.
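The two-step design described above (restrict the voxel search space, then test all SNPs jointly) can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the toy data, the correlation threshold, and the function name `joint_snp_fstats` are all assumptions for the example.

```python
import numpy as np

def joint_snp_fstats(fa, chol, snps, r_thresh=0.2):
    """Sketch of a reduced-search-space multi-SNP test: keep only voxels
    whose FA correlates with cholesterol, then jointly regress FA on all
    SNP dosages within those voxels and return a per-voxel F statistic."""
    n_subj, n_vox = fa.shape
    n_snp = snps.shape[1]
    # Step 1: restrict the search space to cholesterol-associated voxels.
    r = np.array([np.corrcoef(chol, fa[:, v])[0, 1] for v in range(n_vox)])
    keep = np.flatnonzero(np.abs(r) > r_thresh)
    # Step 2: joint (multi-SNP) regression, only within the mask.
    X = np.column_stack([np.ones(n_subj), snps])
    fstats = {}
    for v in keep:
        beta, *_ = np.linalg.lstsq(X, fa[:, v], rcond=None)
        resid = fa[:, v] - X @ beta
        rss_full = resid @ resid
        rss_null = np.sum((fa[:, v] - fa[:, v].mean()) ** 2)
        fstats[v] = ((rss_null - rss_full) / n_snp) / (
            rss_full / (n_subj - n_snp - 1))
    return fstats

# toy data: 100 subjects, 500 voxels, 18 SNP dosages (0/1/2 alleles)
rng = np.random.default_rng(0)
fa = rng.normal(0.5, 0.05, size=(100, 500))
chol = rng.normal(5.0, 1.0, size=100)
snps = rng.integers(0, 3, size=(100, 18)).astype(float)
fstats = joint_snp_fstats(fa, chol, snps)
```

The payoff of the restriction is that the multiple-comparisons burden scales with the masked voxels rather than the whole brain.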

Relevance: 10.00%

Publisher:

Abstract:

As connectivity analyses become more popular, claims are often made about how the brain's anatomical networks depend on age, sex, or disease. It is unclear how results depend on the tractography methods used to compute fiber networks. We applied 11 tractography methods to high angular resolution diffusion images of the brain (4-Tesla 105-gradient HARDI) from 536 healthy young adults. We parcellated 70 cortical regions, yielding 70×70 connectivity matrices encoding fiber density. We computed popular graph theory metrics, including network efficiency and characteristic path length. Both metrics were robust to the number of spherical harmonics used to model diffusion (4th-8th order). Age effects were detected only for networks computed with the probabilistic Hough transform method, which excludes smaller fibers. Sex and total brain volume affected networks measured with deterministic, tensor-based fiber tracking but not with the Hough method. Each tractography method includes different fibers, which affects inferences made about the reconstructed networks.
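The two metrics named above have standard definitions on a weighted connectivity matrix: with each edge's path length taken as the reciprocal of its weight, characteristic path length is the mean shortest-path distance between node pairs, and global efficiency is the mean inverse distance. A minimal numpy sketch, assuming a toy random matrix in place of real tractography output:

```python
import numpy as np

def graph_metrics(conn):
    """Characteristic path length and global efficiency of a weighted,
    undirected connectivity matrix (edge weight = fiber density;
    the path 'length' of an edge is taken as 1/weight)."""
    n = conn.shape[0]
    with np.errstate(divide="ignore"):
        dist = np.where(conn > 0, 1.0 / conn, np.inf)  # missing edge = inf
    np.fill_diagonal(dist, 0.0)
    for k in range(n):                      # Floyd-Warshall shortest paths
        dist = np.minimum(dist, dist[:, k, None] + dist[None, k, :])
    off = ~np.eye(n, dtype=bool)
    char_path = dist[off].mean()            # characteristic path length
    glob_eff = (1.0 / dist[off]).mean()     # global efficiency
    return char_path, glob_eff

# toy symmetric 70x70 "connectivity matrix"
rng = np.random.default_rng(0)
w = rng.random((70, 70))
conn = (w + w.T) / 2
np.fill_diagonal(conn, 0.0)
cpl, eff = graph_metrics(conn)
```

Because both metrics are functions of the shortest-path matrix, any tractography choice that adds or removes fibers (and hence edges) shifts both together.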

Relevance: 10.00%

Publisher:

Abstract:

A key question in diffusion imaging is how many diffusion-weighted images suffice to provide adequate signal-to-noise ratio (SNR) for studies of fiber integrity. Motion, physiological effects, and scan duration all affect the achievable SNR in real brain images, making theoretical studies and simulations only partially useful. We therefore scanned 50 healthy adults with 105-gradient high-angular resolution diffusion imaging (HARDI) at 4T. From gradient image subsets of varying size (6 ≤ N ≤ 94) that optimized a spherical angular distribution energy, we created SNR plots (versus gradient numbers) for seven common diffusion anisotropy indices: fractional and relative anisotropy (FA, RA), mean diffusivity (MD), volume ratio (VR), geodesic anisotropy (GA), its hyperbolic tangent (tGA), and generalized fractional anisotropy (GFA). SNR, defined in a region of interest in the corpus callosum, was near-maximal with 58, 66, and 62 gradients for MD, FA, and RA, respectively, and with about 55 gradients for GA and tGA. For VR and GFA, SNR increased rapidly with more gradients. SNR was optimized when the ratio of diffusion-sensitized to non-sensitized images was 9.13 for GA and tGA, 10.57 for FA, 9.17 for RA, and 26 for MD and VR. In orientation density functions modeling the HARDI signal as a continuous mixture of tensors, the diffusion profile reconstruction accuracy rose rapidly with additional gradients. These plots may help in making trade-off decisions when designing diffusion imaging protocols.
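Most of the indices compared above have standard closed forms in the three eigenvalues of the diffusion tensor. A minimal sketch (GFA is omitted, since it is defined on the orientation density function rather than on a single tensor; note also that normalisation conventions for RA vary across the literature):

```python
import numpy as np

def dti_indices(evals):
    """Standard diffusivity/anisotropy indices from the three
    eigenvalues of a diffusion tensor."""
    l = np.asarray(evals, dtype=float)
    md = l.mean()                                    # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((l - md) ** 2) / np.sum(l ** 2))
    ra = np.sqrt(np.sum((l - md) ** 2) / 3.0) / md   # relative anisotropy
    vr = np.prod(l) / md ** 3                        # volume ratio
    logl = np.log(l)
    ga = np.sqrt(np.sum((logl - logl.mean()) ** 2))  # geodesic anisotropy
    return {"MD": md, "FA": fa, "RA": ra, "VR": vr,
            "GA": ga, "tGA": np.tanh(ga)}

# a typical prolate white-matter tensor (eigenvalues in mm^2/s)
wm = dti_indices([1.7e-3, 0.3e-3, 0.3e-3])
```

Because each index weights the eigenvalue spread differently, their SNR curves need not plateau at the same gradient count, which is what the study measures empirically.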

Relevance: 10.00%

Publisher:

Abstract:

Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept, as it is influenced by several quantities, including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel and their individual anisotropy measures by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy, and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
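The crossing-fiber confound can be demonstrated numerically: average two tensors with identical per-fiber anisotropy but perpendicular orientations, as a crude stand-in for a single-tensor fit to a crossing voxel, and the fitted FA collapses even though neither fiber's integrity changed. The specific eigenvalues below are illustrative, not from the study.

```python
import numpy as np

def fa_of(D):
    """Fractional anisotropy from a 3x3 symmetric diffusion tensor."""
    l = np.linalg.eigvalsh(D)
    md = l.mean()
    return np.sqrt(1.5 * np.sum((l - md) ** 2) / np.sum(l ** 2))

# two fibers with identical per-fiber anisotropy, oriented along x and y
d1 = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
d2 = np.diag([0.3e-3, 1.7e-3, 0.3e-3])
mixed = 0.5 * (d1 + d2)   # crude single-tensor fit to a crossing voxel

# each fiber alone has FA ~ 0.80; the mixed voxel drops to FA ~ 0.48
```

This is exactly why the abstract argues FA conflates fiber count with fiber integrity, while a mixture model like the TDF can keep the two apart.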

Relevance: 10.00%

Publisher:

Abstract:

High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and greater angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
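The Kullback-Leibler divergence used above as the accuracy criterion is straightforward to compute once the ODFs are sampled on a common set of directions. A minimal sketch, assuming toy one-dimensional angular profiles rather than true spherical ODFs:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two orientation density functions sampled
    on the same directions and renormalised to unit mass."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# toy two-fiber angular profile with peaks at 0 and pi/2
theta = np.linspace(0.0, np.pi, 64, endpoint=False)
truth = (np.exp(4 * np.cos(2 * theta))
         + np.exp(4 * np.cos(2 * (theta - np.pi / 2))))
rng = np.random.default_rng(0)
noisy = np.abs(truth + rng.normal(0.0, 5.0, size=theta.size))
```

KL divergence is zero only when the reconstructed profile matches the truth exactly, so plotting it against gradient count gives the accuracy curves the abstract describes.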

Relevance: 10.00%

Publisher:

Abstract:

Classifying each stage of a progressive disease such as Alzheimer's disease is a key issue for disease prevention and treatment. In this study, we derived structural brain networks from diffusion-weighted MRI using whole-brain tractography, since there is growing interest in relating connectivity measures to clinical, cognitive, and genetic data. Relatively little work has used machine learning to make inferences about variations in brain networks in the progression of Alzheimer's disease. Here we developed a framework that utilizes generalized low rank approximations of matrices (GLRAM) and a modified linear discriminant analysis for unsupervised feature learning and classification of connectivity matrices. We applied the methods to brain networks derived from DWI scans of 41 people with Alzheimer's disease, 73 people with early mild cognitive impairment (EMCI), 38 people with late MCI (LMCI), 47 elderly healthy controls and 221 young healthy controls. Our results show that this new framework can significantly improve classification accuracy when combining multiple datasets; this suggests the value of using data beyond the classification task at hand to model variations in brain connectivity.
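GLRAM compresses a set of matrices (here, connectivity matrices) with shared left and right factors rather than vectorizing each matrix, which preserves the row/column structure. A minimal numpy sketch of the standard alternating-update scheme; the dimensions and toy data are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def glram(As, r, c, n_iter=20):
    """Generalized low rank approximations of matrices: alternately
    update shared orthonormal factors L (n x r) and R (m x c) so that
    each A_i is approximated by L @ (L.T @ A_i @ R) @ R.T."""
    n, m = As[0].shape
    R = np.eye(m, c)                          # simple initialisation
    for _ in range(n_iter):
        ML = sum(A @ R @ R.T @ A.T for A in As)
        L = np.linalg.eigh(ML)[1][:, -r:]     # top-r eigenvectors
        MR = sum(A.T @ L @ L.T @ A for A in As)
        R = np.linalg.eigh(MR)[1][:, -c:]     # top-c eigenvectors
    Ms = [L.T @ A @ R for A in As]            # compact r x c features
    return L, R, Ms

# toy data with exact shared low-rank structure, so GLRAM can recover it
rng = np.random.default_rng(1)
L0 = np.linalg.qr(rng.normal(size=(10, 3)))[0]
R0 = np.linalg.qr(rng.normal(size=(8, 3)))[0]
As = [L0 @ rng.normal(size=(3, 3)) @ R0.T for _ in range(5)]
L, R, Ms = glram(As, 3, 3)
```

The compact `Ms` matrices would then be the features fed to the classifier, in place of the full connectivity matrices.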

Relevance: 10.00%

Publisher:

Abstract:

The paper critiques the focus of creative industries policy on capability development of small and medium-sized firms and the provision of regional incentives. It analyses factors affecting the competitiveness and sustainability of the games development industry and of visual effects suppliers to feature films. Interviews with participants in these industries highlight the need for policy instruments to take into consideration the structure and organization of global markets and the power of lead multinational corporations. We show that although forms of economic governance in these industries may allow sustainable value capture, they are interrupted by bottlenecks in which ferocious competition among suppliers is confronted by comparatively little competition among the lead firms. We argue that current approaches to creative industries policy aimed at building self-sustaining creative industries are unlikely to be sufficient because of the globalized nature of the industries. Rather, we argue that a more profitable approach is likely to require supporting diversification of the industries as ‘feeders’ into other areas of the economy.

Relevance: 10.00%

Publisher:

Abstract:

Everything revolves around desiring-machines and the production of desire… Schizoanalysis merely asks what are the machinic, social and technical indices on a socius that open to desiring-machines (Deleuze & Guattari, 1983, pp. 380-381). Achievement tests like NAPLAN are fairly recent, yet common, education policy initiatives in much of the Western world. They intersect with, use and change pre-existing logics of education, teaching and learning. There has been much written about the form and function of these tests, the ‘stakes’ involved and the effects of their practice. This paper adopts a different “angle of vision” to ask what ‘opens’ education to these regimes of testing (Roy, 2008). This paper builds on previous analyses of NAPLAN as a modulating machine, or a machine characterised by the increased intensity of connections and couplings. One affect can be “an existential disquiet” as “disciplinary subjects attempt to force coherence onto a disintegrating narrative of self” (Thompson & Cook, 2012, p. 576). Desire operates at all levels of the education assemblage; however, our argument is that achievement testing manifests desire as ‘lack’, seen in the desire for improved results, the desire for increased control, the desire for freedom and the desire for acceptance, to name a few. For Deleuze and Guattari, desire is irreducible to lack; instead, desire is productive. As a productive assemblage, education machines operationalise and produce through desire; “Desire is a machine, and the object of the desire is another machine connected to it” (Deleuze & Guattari, 1983, p. 26). This intersection is complexified by the strata at which they occur, the molar and molecular connections and flows they make possible. Our argument is that when attention is paid to the macro and micro connections, the machines built and disassembled as a result of high-stakes testing, a map is constructed that outlines possibilities, desires and blockages within the education assemblage.
This schizoanalytic cartography suggests a new analysis of these ‘axioms’ of testing and accountability. It follows the flows and disruptions made possible as different or altered connections are made and as new machines are brought online. Thinking of education machinically requires recognising that “every machine functions as a break in the flow in relation to the machine to which it is connected, but at the same time is also a flow itself, or the production of flow, in relation to the machine connected to it” (Deleuze & Guattari, 1983, p. 37). Through its potential to map desire, desire-production and the production of desire within those assemblages that have come to dominate our understanding of what is possible, Deleuze and Guattari’s method of schizoanalysis provides a provocative lens for grappling with the question of what one can do, and what lines of flight are possible.

Relevance: 10.00%

Publisher:

Abstract:

One of the aims of Deleuze. Guattari. Schizoanalysis. Education. is to focus on the radical reconfiguration that education is undergoing, impacting educator, administrator, institution and ‘sector’ alike. More to the point, it is the responses to that process of reconfiguration - this newly emerging assemblage - that are a key focal point in this issue. Essential to these responses, we propose, is Deleuze and Guattari’s method of schizoanalysis, which offers a way not only to understand the rules of this new game but also, hopefully, to find some escape from the promise of a brave new world of continuous education and motivation. A brave new world of digitised courses, impersonal and corporate expertise, updatable performance metrics, Massive Open Online Courses (MOOCs), learning analytics, transformative teaching and learning, and online high-stakes testing in the name of transforming and augmenting human capital overlays the corporeal practices of institutional surveillance, examination and categorical sorting. A brave new world, importantly, where people’s continuous education is instituted less, or not simply, through disciplinary practices, and increasingly through a constant and continuous sampling and profiling not simply of performance but of activity, measured against the profiled activity of a ‘like’ age group, person or institution. This continuous education, including the sampling that accompanies it, we are all informed through various information and marketing campaigns, is in our best interest. An interest that is driven and governed by an ever-increasing corporatisation and monetisation of ‘the knowledge sector’, as well as an interest that is sustained through an ever-increasing, and continuous, debt.

Relevance: 10.00%

Publisher:

Abstract:

To this point, the collection has provided research-based, empirical accounts of the various and multiple effects of the National Assessment Program – Literacy and Numeracy (NAPLAN) in Australian schooling as a specific example of the global phenomenon of national testing. In this chapter, we want to develop a more theoretical analysis of national testing systems, globalising education policy and the promise of national testing as adaptive, online tests. These future moves claim to provide faster feedback and more useful diagnostic help for teachers. There is a utopian testing dream that one day adaptive, online tests will be responsive in real time, providing integrated, personalised testing, pedagogy and intervention for each student. The moves towards these next generation assessments are well advanced, including the work of Pearson’s NextGen Learning and Assessment research group, the Organization for Economic Co-operation and Development’s (OECD) move into assessing affective skills and the Australian Curriculum, Assessment and Reporting Authority’s (ACARA) decision to phase in NAPLAN as an online, adaptive test from 2017...

Relevance: 10.00%

Publisher:

Abstract:

Introduction
This book examines a pressing educational issue: the global phenomenon of national testing in schooling and its vernacular development in Australia. The Australian National Assessment Program – Literacy and Numeracy (NAPLAN), introduced in 2008, involves annual census testing of students in Years 3, 5, 7 and 9 in nearly all Australian schools. In a variety of ways, NAPLAN affects the lives of Australia’s 3.5 million school students and their families, as well as more than 350,000 school staff and many other stakeholders in education. This book is organised in relation to a simple question: What are the effects of national testing for systems, schools and individuals? Of course, this simple question requires complex answers. The chapters in this edited collection consider issues relating to national testing policy, the construction of the test, usages of the testing data and various effects of testing in systems, schools and classrooms. Each chapter examines an aspect of national testing in Australia using evidence drawn from research. The final chapter by the editors of this collection provides a broader reflection on this phenomenon and situates developments in testing globally...

Relevance: 10.00%

Publisher:

Abstract:

Since 2008, Australian schoolchildren in Years 3, 5, 7 and 9 have sat a series of tests each May designed to assess their attainment of basic skills in literacy and numeracy. These tests are known as the National Assessment Program – Literacy and Numeracy (NAPLAN). In 2010, individual school NAPLAN data were first published on the MySchool website, which enables comparisons to be made between individual schools and statistically like schools across Australia. NAPLAN represents the increased centrality of the federal government in education, particularly with regard to education policy. One effect of this has been a recasting of education as an economic, rather than a democratic, good. As Reid (2009) suggests, this recasting of education within national productivity agendas mobilises commonsense discourses of accountability and transparency. These are common articles of faith for many involved in education administration and bureaucracy; more and better data, and holding people to account for that data, must improve education...

Relevance: 10.00%

Publisher:

Abstract:

This article uses topological approaches to suggest that education is becoming-topological. Analyses presented in a recent double-issue of Theory, Culture & Society are used to demonstrate the utility of topology for education. In particular, the article explains education's topological character by examining the global convergence of education policy, testing and the discursive ranking of systems, schools and individuals in the promise of reforming education through the proliferation of regimes of testing at local and global levels that constitute a new form of governance through data. In this conceptualisation of global education policy, changes in the form and nature of testing combine with the emergence of global policy networks to change the nature of the local (national, regional, school and classroom) forces that operate through the ‘system’. While these forces change, they work through a discursivity that produces disciplinary effects, but in a different way. This new–old disciplinarity, or ‘database effect’, is here represented through a topological approach because of its utility for conceiving education in an increasingly networked world.

Relevance: 10.00%

Publisher:

Abstract:

This paper examines the global policy convergence toward high-stakes testing in schools and the use of test results to ‘steer at a distance’, particularly as it applies to policy-makers’ promise to improve teacher quality. Using Deleuze’s three syntheses of time in the context of the Australian policy blueprint Quality Education, this paper argues that using test scores to discipline teaching repeats the past habit of policy-making as continuing the problem of the unaccountable teacher. This results in local policy-making enfolding test scores in a pure past where the teacher-as-problem is resolved through the use of data from testing to deliver accountability and transparency. This use of the database returns a digitised form of inspection that is a repetition of the habit of teacher-as-problem. While dystopian possibilities are available through the database, in what Deleuze refers to as a control society, for us the challenge is to consider policy-making as a step into an unknown future, to engage with producing policy that is not grounded on the unconscious interiority of solving the teacher problem, but of imagining new ways of conceiving the relationship between policy-making and teaching.