585 results for user-defined function (UDF)
Abstract:
We study the influence of the choice of template in tensor-based morphometry. Using 3D brain MR images from 10 monozygotic twin pairs, we defined a tensor-based distance in the log-Euclidean framework [1] between each image pair in the study. Relative to this metric, twin pairs were found to be closer to each other on average than random pairings, consistent with evidence that brain structure is under strong genetic control. We also computed the intraclass correlation and associated permutation p-value at each voxel for the determinant of the Jacobian matrix of the transformation. The cumulative distribution function (CDF) of the voxelwise p-values was computed for each of the templates and compared to the null distribution. Surprisingly, there was very little difference between the CDFs of statistics computed from analyses using different templates. As the brain with the least log-Euclidean deformation cost, the mean template defined here avoids the blurring caused by creating a synthetic image from a population. When selected from a large population, it also avoids bias by being geometrically centered, in a metric sensitive enough to anatomical similarity that it can even detect genetic affinity among anatomies.
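The log-Euclidean distance used above [1] reduces to the Frobenius norm of the difference of matrix logarithms. A minimal sketch, with an illustrative function name and a toy 3×3 example (the study applies the metric to full deformation tensor fields, not to single matrices):

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(A, B):
    """Log-Euclidean distance between two symmetric positive definite
    matrices: the Frobenius norm of the difference of their matrix logs."""
    return np.linalg.norm(logm(A) - logm(B), ord="fro")

# Toy example: the identity vs. a diagonal SPD matrix with one eigenvalue e.
A = np.eye(3)
B = np.diag([np.e, 1.0, 1.0])
d = log_euclidean_distance(A, B)  # ≈ 1.0, since logm(B) = diag(1, 0, 0)
```

Because distances are computed in the (flat) log-domain, the mean template can be chosen as the image minimizing total deformation cost without iterative averaging on the manifold.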
Abstract:
We propose in this paper a new method for mapping hippocampal (HC) surfaces to establish correspondences between points on HC surfaces and enable localized HC shape analysis. A novel geometric feature, the intrinsic shape context, is defined to capture the global characteristics of HC shapes. Based on this intrinsic feature, an automatic algorithm is developed to detect a set of landmark curves that are stable across the population. The direct map between a source and target HC surface is then solved as the minimizer of a harmonic energy function defined on the source surface with landmark constraints. For numerical solutions, we compute the map with the approach of solving partial differential equations on implicit surfaces. The direct mapping method has two key properties: (1) it is fully automatic; (2) it is invariant to the pose of HC shapes. In our experiments, we apply the direct mapping method to study temporal changes of HC asymmetry in Alzheimer's disease (AD) using HC surfaces from 12 AD patients and 14 normal controls. Our results show that the AD group has a different trend in temporal changes of HC asymmetry than the group of normal controls. We also demonstrate the flexibility of the direct mapping method by applying it to construct spherical maps of HC surfaces. Spherical harmonics (SPHARM) analysis is then applied and confirms our results on temporal changes of HC asymmetry in AD.
Abstract:
This paper describes algorithms that can identify patterns of brain structure and function associated with Alzheimer's disease, schizophrenia, normal aging, and abnormal brain development based on imaging data collected in large human populations. Extraordinary information can be discovered with these techniques: dynamic brain maps reveal how the brain grows in childhood, how it changes in disease, and how it responds to medication. Genetic brain maps can reveal genetic influences on brain structure, shedding light on the nature-nurture debate, and the mechanisms underlying inherited neurobehavioral disorders. Recently, we created time-lapse movies of brain structure for a variety of diseases. These identify complex, shifting patterns of brain structural deficits, revealing where, and at what rate, the path of brain deterioration in illness deviates from normal. Statistical criteria can then identify situations in which these changes are abnormally accelerated, or when medication or other interventions slow them. In this paper, we focus on describing our approaches to map structural changes in the cortex. These methods have already been used to reveal the profile of brain anomalies in studies of dementia, epilepsy, depression, childhood- and adult-onset schizophrenia, bipolar disorder, attention-deficit/hyperactivity disorder, fetal alcohol syndrome, Tourette syndrome, Williams syndrome, and in methamphetamine abusers. Specifically, we describe an image analysis pipeline known as cortical pattern matching that helps compare and pool cortical data over time and across subjects. Statistics are then defined to identify brain structural differences between groups, including localized alterations in cortical thickness, gray matter density (GMD), and asymmetries in cortical organization. Subtle features, not seen in individual brain scans, often emerge when population-based brain data are averaged in this way. 
Illustrative examples are presented to show the profound effects of development and various diseases on the human cortex. Dynamically spreading waves of gray matter loss are tracked in dementia and schizophrenia, and these sequences are related to normally occurring changes in healthy subjects of various ages.
Abstract:
High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and more angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs) which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
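The reconstruction-accuracy score named above, the Kullback-Leibler divergence, can be sketched for orientation density functions discretized on a shared set of sphere directions. The function names and the symmetrized variant are our own illustration, not necessarily the paper's exact formulation:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discretized orientation density functions
    sampled on the same sphere directions; eps guards against log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()   # renormalize to unit mass
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    """Symmetrized KL, a common choice when neither ODF is a 'true' reference."""
    return 0.5 * (kl_divergence(p, q) + kl_divergence(q, p))
```

Identical ODFs give a divergence of zero; the score grows as the reconstructed angular profile drifts from the ground-truth profile.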
Abstract:
We demonstrate a geometrically inspired technique for computing Evans functions for the linearised operators about travelling waves. Using the examples of the F-KPP equation and a Keller–Segel model of bacterial chemotaxis, we produce an Evans function which is computable through several orders of magnitude in the spectral parameter and show how such a function can naturally be extended into the continuous spectrum. In both examples, we use this function to numerically verify the absence of eigenvalues in a large region of the right half of the spectral plane. We also include a new proof of spectral stability in the appropriate weighted space of travelling waves of speed c ≥ √(2δ) in the F-KPP equation.
Abstract:
The first User-Focused Service Engineering, Consumption and Aggregation workshop (USECA) in 2011 was held in conjunction with the WISE 2011 conference in Sydney, Australia. Web services and related technology are a widely accepted standard architectural paradigm for application development. The idea of reusing existing software components to build new applications has been well documented and supported for the world of enterprise computing and professional developers. However, this powerful idea has not been transferred to end-users who have limited or no computing knowledge. The current methodologies, models, languages and tools developed for Web service composition are suited to IT professionals and people with years of training in computing technologies. It is still hard to imagine any of these technologies being used by business professionals, as opposed to computing professionals. © 2013 Springer-Verlag.
Abstract:
Service-oriented architecture is gaining momentum. However, in order for it to be successful, a proper and up-to-date description of services is required. Such a description may be provided by service profiling mechanisms, such as the one presented in this article. A service profile can be defined as an up-to-date description of a subset of the non-functional properties of a service. It allows services to be compared on the basis of non-functional parameters, and the service most suited to the needs of a user to be chosen. In this article the notion of a service profile is presented along with a service profiling mechanism and the architecture of a profiling system. © 2006 IEEE.
Abstract:
The size and arrangement of stromal collagen fibrils (CFs) influence the optical properties of the cornea and hence its function. How the collagen is spatially arranged in relation to collagen fibril diameter remains an open question. In the present study, we introduce a new parameter, the edge-fibrillar distance (EFD), to measure how two collagen fibrils are spaced with respect to their closest edges, and we characterize their spatial distribution through the normalized standard deviation of EFD (NSDEFD), assessed through the application of two commercially available multipurpose solutions (MPS): ReNu and Hippia. The corneal buttons were soaked separately in ReNu and Hippia MPS for five hours, fixed overnight in 2.5% glutaraldehyde containing cuprolinic blue, and processed for transmission electron microscopy. The electron micrographs were processed using a user-coded ImageJ plugin. Statistical analysis was performed to compare the image-processed equivalent diameter (ED), inter-fibrillar distance (IFD), and EFD of the CFs of treated versus normal corneas. The ReNu-soaked cornea showed a partly degenerated epithelium with loose hemidesmosomes and Bowman's collagen. In contrast, the epithelium of the cornea soaked in Hippia was degenerated or lost but showed closely packed Bowman's collagen. Soaking the corneas in both MPS caused a statistically significant decrease in anterior collagen fibril ED and significant changes in IFD and EFD relative to untreated corneas (p < 0.05 for all comparisons). The introduction of the EFD measurement directly provided a sense of the gap between the peripheries of the collagen bundles and of their spatial distribution; in combination with ED, it showed how the corneal collagen bundles are spaced in relation to their diameters. The spatial distribution parameter NSDEFD indicated that ReNu-treated corneal fibrils were the most uniformly distributed spatially, followed by normal and Hippia-treated fibrils.
The EFD measurement, with its relatively low standard deviation, and NSDEFD, a characteristic of uniform CF distribution, can serve as additional parameters for evaluating collagen organization and assessing the effects of various treatments on corneal health and transparency.
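Under the simplifying assumption of circular fibril cross-sections, EFD can be computed from centre positions and diameters as the nearest-neighbour centre distance minus the two radii, and NSDEFD as the standard deviation of EFD normalized by its mean. A hypothetical sketch (the study's plugin measures these quantities from segmented electron micrographs, not from coordinates supplied directly):

```python
import numpy as np

def edge_fibrillar_distances(centers, diameters):
    """Edge-to-edge distance from each fibril to its nearest neighbour,
    assuming circular cross-sections: centre distance minus both radii."""
    centers = np.asarray(centers, dtype=float)
    radii = np.asarray(diameters, dtype=float) / 2.0
    efd = []
    for i in range(len(centers)):
        d = np.linalg.norm(centers - centers[i], axis=1) - radii - radii[i]
        d[i] = np.inf  # exclude the fibril itself
        efd.append(d.min())
    return np.array(efd)

def nsd_efd(efd):
    """Normalized standard deviation of EFD: lower means more uniform spacing."""
    return float(np.std(efd) / np.mean(efd))

# Three collinear fibrils of 4-unit diameter with centres 10 units apart:
efd = edge_fibrillar_distances([[0, 0], [10, 0], [20, 0]], [4, 4, 4])
# every nearest edge-to-edge gap is 6 units, so NSDEFD is 0 (perfectly uniform)
```

A lower NSDEFD corresponds to the more uniform spatial distribution reported for the ReNu-treated fibrils.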
Abstract:
Section 180 of the Property Law Act 1974 (Qld) makes provision for an applicant to seek a statutory right of user over a neighbour’s property where such right of use is reasonably necessary in the interests of effective use in any reasonable manner of the dominant land. A key issue in an application under s 180 is compensation. Unfortunately, while s 180 expressly contemplates that an order for compensation will include provision for payment of compensation to the owner of the servient land, there are certain issues that are less clear. One of these is the basis for determination of the amount of compensation. In this regard, s 180(4)(a) provides that, in making an order for a statutory right of user, the court: (a) shall, except in special circumstances, include provision for payment by the applicant to such person or persons as may be specified in the order of such amount by way of compensation or consideration as in the circumstances appears to the court to be just. The operation of this statutory provision was considered by de Jersey CJ (as he then was) in Peulen v Agius [2015] QSC 137.
Abstract:
The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed where it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered for demonstrating the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
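As an illustration of utility estimation by simulation, the following sketch uses plain Monte Carlo (not the paper's sequential Monte Carlo particle algorithm) to estimate one ingredient of a dual-purpose utility: the expected information gain about a model indicator for a two-model binomial experiment. The function name and the binomial setting are illustrative assumptions:

```python
import numpy as np
from scipy.stats import binom

def expected_model_info_gain(theta_m1, theta_m2, n, n_sim=2000, seed=0):
    """Monte Carlo estimate of the expected entropy reduction of the model
    indicator for a binomial design with n trials. theta_m1 / theta_m2 are
    prior draws of the success probability under two equally weighted models."""
    rng = np.random.default_rng(seed)
    prior_entropy = np.log(2.0)  # two equally likely models a priori
    gains = []
    for _ in range(n_sim):
        # draw a dataset from the prior predictive (model picked at random)
        theta = rng.choice(theta_m1 if rng.random() < 0.5 else theta_m2)
        y = rng.binomial(n, theta)
        # marginal likelihood of y under each model, averaged over prior draws
        m1 = binom.pmf(y, n, theta_m1).mean()
        m2 = binom.pmf(y, n, theta_m2).mean()
        post = np.array([m1, m2]) / (m1 + m2)
        post_entropy = -np.sum(post * np.log(post + 1e-300))
        gains.append(prior_entropy - post_entropy)
    return float(np.mean(gains))
```

A design (here, the number of trials n) scoring closer to log 2 discriminates the two models almost perfectly; the total entropy utility additionally folds in a parameter-estimation term, which this sketch omits.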
Abstract:
The digital divide is the disparity in access to information, in the ability to communicate, and in the capacity to make information and communication serve full participation in the information society. Indeed, the conversation about the digital divide has developed over the last decade from a focus on connectivity and access to information and communication technologies, to a conversation that encompasses the ability to use them and the utility that usage provides (Wei et al., 2011). However, this conversation, while transitioning from technology to the skills of the people that use them and to the fruits of their use, is limited in its ability to take into account the social role of information and communication technologies (ICTs). One successful attempt at conceptualizing the social impact of the differences in access to and utilization of digital communication technologies was developed by van Dijk (2005), whose sequential model for analyzing the divide states that: “1. Categorical inequalities in society produce an unequal distribution of resources; 2. An unequal distribution of resources causes unequal access to digital technologies; 3. Unequal access to digital technologies also depends on the characteristics of these technologies; 4. Unequal access to digital technologies brings about unequal participation in society; 5. Unequal participation in society reinforces categorical inequalities and unequal distributions of resources.” (p. 15) As van Dijk’s model demonstrates, the divide’s impact is the exclusion of individuals from participation. Still left to be defined are the “categorical inequalities,” the “resources,” the “characteristics of digital technologies,” and the different levels of “access” that result in differentiated levels of participation, as these change over time due to the evolving nature of technology and the dynamics of society.
And most importantly, the meaning of “participation” in contemporary society needs to be determined, as it is differentiated levels of participation that are the result of the divide and the engine of the ever-growing disparities. Our argument is structured in the following manner: We first claim that contemporary digital media differ from the previous generation of ICTs along four dimensions: they offer an abundance of information resources and communication channels when compared to the relative paucity of both in the past; they offer mobility as opposed to the stationary nature of their predecessors; they are interactive in that they provide users with the capability to design their own media environments in contrast to the dictated environs of previous architectures; and they allow users to communicate utilizing multiple forms of mediation, unlike the uniformity of sound or word that limited users in the past. We then submit that involvement in the information society calls for egalitarian access to all four dimensions of the user experience that make contemporary media different from their predecessors, and that the ability to experience all four affects the levels at which humans partake in the shaping of society. The model being cyclical, we then discuss how lower levels of participation contribute to the enhancement of social inequalities. Finally, we discuss why participation is needed in order to achieve full membership in the information society and what political philosophy should govern policy solutions targeting the re-inclusion of those digitally excluded.
Abstract:
Light-emitting field-effect transistors (LEFETs) are an emerging class of multifunctional optoelectronic devices. They combine the light-emitting function of an OLED with the switching function of a transistor in a single device architecture; this dual functionality has potential applications in active-matrix displays. However, the key problems of existing LEFETs thus far have been their low EQEs at high brightness, poor ON/OFF ratios, and a poorly defined light-emitting area (a thin emissive zone at the edge of the electrodes). Here we report heterostructure LEFETs based on solution-processed unipolar charge-transport and emissive polymers that have an EQE of up to 1% at a brightness of 1350 cd/m², an ON/OFF ratio > 10⁴, and a well-defined light-emitting zone suitable for display pixel design. We show that a non-planar hole-injecting electrode combined with a semi-transparent electron-injecting electrode makes it possible to achieve high EQE at high brightness together with a high ON/OFF ratio. Furthermore, we demonstrate that heterostructure LEFETs have a better frequency response (f_cut-off = 2.6 kHz) than single-layer LEFETs. The results presented here are therefore a major step along the pathway towards the realization of LEFETs for display applications.
Abstract:
Interleukin-10 (IL-10) is an important immunoregulatory cytokine produced by various types of cells. Here we describe the isolation and characterization of olive flounder IL-10 (ofIL-10) cDNA and its genomic organization. The ofIL-10 gene encodes a 187 amino acid protein and is composed of a five-exon/four-intron structure, similar to other known IL-10 genes. Analysis of the ofIL-10 promoter sequence shows a high level of homology in putative binding sites for transcription factors, which are sufficient for the transcriptional regulation of ofIL-10. Important structural residues are maintained in the ofIL-10 protein, including the four cysteines responsible for the two intra-chain disulfide bridges reported for human IL-10 and two extra cysteine residues that exist only in fish species. The phylogenetic analysis clustered ofIL-10 with other fish IL-10s and apart from mammalian IL-10 molecules. Quantitative real-time polymerase chain reaction (PCR) analysis demonstrated ubiquitous ofIL-10 gene expression in the 13 tissues examined. Additionally, induction of ofIL-10 gene expression was observed in kidney tissue from olive flounder infected with bacteria (Edwardsiella tarda) or virus (Viral Hemorrhagic Septicemia Virus; VHSV). These data indicate that IL-10 is an important immune regulator whose genomic organization and function have been strictly conserved during the evolution of vertebrate immunity.
Abstract:
Analysing the engagement of students in university-based Facebook groups can shed light on the nature of their learning experience and highlight leverage points to build on student success. While post-semester surveys and demographic participation data can highlight who was involved and how they subsequently felt about the experience, these techniques do not necessarily reflect real-time engagement. One way to gain insight into in-situ student experiences is by categorising the original posts and comments into predetermined frameworks of learning. This paper offers a systematic method of coding Facebook contributions within various engagement categories: motivation, discourse, cognition and emotive responses.
Early mathematical learning: Number processing skills and executive function at 5 and 8 years of age
Abstract:
This research investigated differences and associations in performance in number processing and executive function for children attending primary school in a large Australian metropolitan city. In a cross-sectional study, 25 children in the first full-time year of school (Prep; mean age = 5.5 years) and 21 children in Year 3 (mean age = 8.5 years) completed three number processing tasks and three executive function tasks. Year 3 children consistently outperformed the Prep year children on measures of accuracy and reaction time on the tasks of number comparison, calculation, shifting, and inhibition, but not on number line estimation. The components of executive function (shifting, inhibition, and working memory) showed different patterns of correlation with performance on number processing tasks across the early years of school. Findings could be used to enhance teachers’ understanding of the role of the cognitive processes employed by children in numeracy learning, and so inform teachers’ classroom practices.