Abstract:
Genetic analysis of diffusion tensor images (DTI) shows great promise in revealing specific genetic variants that affect brain integrity and connectivity. Most genetic studies of DTI analyze voxel-based diffusivity indices in the image space (such as 3D maps of fractional anisotropy) and overlook tract geometry. Here we propose an automated workflow to cluster fibers using a white matter probabilistic atlas and perform genetic analysis on the shape characteristics of fiber tracts. We apply our approach to a large study of 4-Tesla high angular resolution diffusion imaging (HARDI) data from 198 healthy, young adult twins (age: 20-30). Illustrative results show the heritability of the shapes of several major tracts, presented as color-coded maps.
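As a rough illustration of the twin design mentioned above (and not the authors' actual analysis pipeline), heritability of a scalar tract-shape measure can be estimated with Falconer's formula, h^2 = 2(r_MZ - r_DZ). The sketch below uses simulated data; all names and values are hypothetical.

```python
import numpy as np

def falconer_heritability(mz_pairs, dz_pairs):
    """Falconer's estimate h^2 = 2 * (r_MZ - r_DZ).

    mz_pairs, dz_pairs: arrays of shape (n_pairs, 2) holding a scalar
    tract-shape measure (e.g. local curvature) for twin 1 and twin 2.
    """
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz)

# Simulated trait: MZ twins share the full additive factor, DZ twins half of it.
rng = np.random.default_rng(0)
a = rng.normal(size=50)
mz = np.c_[a, a] + 0.6 * rng.normal(size=(50, 2))
a1 = rng.normal(size=49)
a2 = 0.5 * a1 + np.sqrt(0.75) * rng.normal(size=49)
dz = np.c_[a1, a2] + 0.6 * rng.normal(size=(49, 2))
print(falconer_heritability(mz, dz))   # roughly 0.7 for this simulated trait
```

In a real tract-shape analysis this estimate would be computed pointwise along each tract and displayed as the color-coded maps the abstract refers to.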
Labeling white matter tracts in HARDI by fusing multiple tract atlases with applications to genetics
Abstract:
Accurate identification of white matter structures and segmentation of fibers into tracts is important in neuroimaging and has many potential applications. However, it is not trivial, because whole brain tractography generates hundreds of thousands of streamlines that include many false positive fibers. We developed and tested an automatic tract labeling algorithm to segment anatomically meaningful tracts from diffusion-weighted images. Our multi-atlas method incorporates information from multiple hand-labeled fiber tract atlases. In validations, we showed that the method outperformed standard ROI-based labeling using a deformable, parcellated atlas. Finally, we show a high-throughput application of the method to genetic population studies, using the sub-voxel diffusion information from fibers in the clustered tracts, based on 105-gradient HARDI scans of 86 young normal twins. The whole workflow shows promise for larger population studies in the future.
Abstract:
To understand factors that affect brain connectivity and integrity, it is beneficial to automatically cluster white matter (WM) fibers into anatomically recognizable tracts. Whole brain tractography, based on diffusion-weighted MRI, generates vast sets of fibers throughout the brain; clustering them into consistent and recognizable bundles can be difficult as there are wide individual variations in the trajectory and shape of WM pathways. Here we introduce a novel automated tract clustering algorithm based on label fusion - a concept from traditional intensity-based segmentation. Streamline tractography generates many incorrect fibers, so our top-down approach extracts tracts consistent with known anatomy, by mapping multiple hand-labeled atlases into a new dataset. We fuse clustering results from different atlases, using a mean distance fusion scheme. We reliably extracted the major tracts from 105-gradient high angular resolution diffusion images (HARDI) of 198 young normal twins. To compute population statistics, we use a pointwise correspondence method to match, compare, and average WM tracts across subjects. We illustrate our method in a genetic study of white matter tract heritability in twins.
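The label-fusion idea described above can be pictured with a small sketch: score each candidate tract by a streamline-to-tract distance, average that score across atlases, and assign the label with the smallest fused score. The data layout and function names below are hypothetical, not the paper's implementation, and the distance is a simple mean closest-point distance.

```python
import numpy as np

def mean_closest_point_distance(fiber, tract_fibers):
    """Mean, over the points of `fiber` (an (N, 3) array), of the distance
    to the closest point on any fiber of the candidate atlas tract."""
    atlas_points = np.vstack(tract_fibers)                       # (M, 3)
    d = np.linalg.norm(fiber[:, None, :] - atlas_points[None, :, :], axis=2)
    return d.min(axis=1).mean()

def fuse_labels(fiber, atlases):
    """`atlases` is a list of dicts: tract name -> list of (N_i, 3) fibers.
    A tract's fused score is its distance averaged across atlases; the
    fiber receives the label with the smallest fused score."""
    tract_names = atlases[0].keys()
    scores = {
        name: np.mean([mean_closest_point_distance(fiber, atlas[name])
                       for atlas in atlases])
        for name in tract_names
    }
    return min(scores, key=scores.get)
```

In practice one would also threshold the fused score so that spurious streamlines far from every atlas tract remain unlabeled.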
Abstract:
Automatic labeling of white matter fibres in diffusion-weighted brain MRI is vital for comparing brain integrity and connectivity across populations, but is challenging. Whole brain tractography generates a vast set of fibres throughout the brain, but it is hard to cluster them into anatomically meaningful tracts, due to wide individual variations in the trajectory and shape of white matter pathways. We propose a novel automatic tract labeling algorithm that fuses information from tractography and multiple hand-labeled fibre tract atlases. As streamline tractography can generate a large number of false positive fibres, we developed a top-down approach to extract tracts consistent with known anatomy, based on a distance metric to multiple hand-labeled atlases. Clustering results from different atlases were fused, using a multi-stage fusion scheme. Our "label fusion" method reliably extracted the major tracts from 105-gradient HARDI scans of 100 young normal adults.
Abstract:
We propose in this paper a new method for the mapping of hippocampal (HC) surfaces to establish correspondences between points on HC surfaces and enable localized HC shape analysis. A novel geometric feature, the intrinsic shape context, is defined to capture the global characteristics of the HC shapes. Based on this intrinsic feature, an automatic algorithm is developed to detect a set of landmark curves that are stable across the population. The direct map between a source and target HC surface is then solved as the minimizer of a harmonic energy function defined on the source surface with landmark constraints. For numerical solutions, we compute the map by solving partial differential equations on implicit surfaces. The direct mapping method has the following properties: (1) it is fully automatic; (2) it is invariant to the pose of the HC shapes. In our experiments, we apply the direct mapping method to study temporal changes of HC asymmetry in Alzheimer's disease (AD) using HC surfaces from 12 AD patients and 14 normal controls. Our results show that the AD group exhibits a different trend in the temporal change of HC asymmetry from that of the normal control group. We also demonstrate the flexibility of the direct mapping method by applying it to construct spherical maps of HC surfaces. Spherical harmonics (SPHARM) analysis is then applied and confirms our results on temporal changes of HC asymmetry in AD.
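A loose, simplified illustration of the landmark-constrained harmonic idea is given below. It works on a mesh adjacency structure rather than the implicit-surface PDE formulation used in the paper, and simply relaxes each free vertex toward the average of its neighbours while landmark vertices stay pinned; all names are hypothetical.

```python
import numpy as np

def constrained_harmonic_map(coords, neighbors, landmarks, n_iter=500):
    """Iteratively relax a map toward a discrete harmonic map:
    each free vertex is replaced by the average of its neighbours,
    while landmark vertices remain fixed at their target positions.

    coords:    (N, 3) initial map values for each vertex
    neighbors: list of index lists; neighbors[i] = vertices adjacent to i
    landmarks: dict vertex index -> fixed target position
    """
    f = coords.astype(float).copy()
    for idx, target in landmarks.items():
        f[idx] = target
    for _ in range(n_iter):
        new_f = f.copy()
        for i, nbrs in enumerate(neighbors):
            if i not in landmarks and nbrs:
                new_f[i] = f[nbrs].mean(axis=0)
        f = new_f
    return f
```

The fixed landmark positions play the role of the detected landmark curves; the relaxation converges to a discrete minimizer of the harmonic energy subject to those constraints.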
Abstract:
Developing nano/micro-structures that can effectively enhance the intriguing properties of electrode materials for energy storage devices is always a key research topic. Ultrathin nanosheets have proved to be one of the most promising nanostructures owing to their high specific surface area, good active contact areas and porous channels. Herein, we report a unique hierarchical micro-spherical morphology of well-stacked and completely miscible molybdenum disulfide (MoS2) nanosheets and graphene sheets, successfully synthesized via a simple, industrial-scale spray-drying technique to exploit the advantages of both MoS2 and graphene, namely their high practical capacity and high electronic conductivity, respectively. Computational studies were performed to understand the interfacial behaviour of MoS2 and graphene, confirming the high stability of the composite with a high interfacial binding energy (−2.02 eV) between the two components. Further, the lithium and sodium storage properties were tested and reveal excellent cyclic stability over 250 and 500 cycles, respectively, with initial capacities as high as 1300 mAh g−1 and 640 mAh g−1 at 0.1 A g−1.
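For orientation on the capacity figures quoted above, the theoretical specific capacity of an electrode material follows from Faraday's law, C = nF / (3.6 M) in mAh g−1. The snippet below is a generic textbook calculation, not taken from the paper; the 4-electron conversion assumed for MoS2 is a commonly cited assumption.

```python
F = 96485.0   # Faraday constant, C/mol

def theoretical_capacity_mah_g(n_electrons, molar_mass_g_mol):
    """Theoretical specific capacity in mAh/g: n * F / (3.6 * M)."""
    return n_electrons * F / (3.6 * molar_mass_g_mol)

# MoS2, assuming a 4-electron conversion reaction (textbook assumption):
print(theoretical_capacity_mah_g(4, 160.07))      # ~ 670 mAh/g
# Graphite (LiC6), for reference:
print(theoretical_capacity_mah_g(1, 6 * 12.011))  # ~ 372 mAh/g
```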
Abstract:
As a vital component of construction professional services (CPS), construction management consultancy is by nature knowledge-intensive and client-tailored. Although recent studies have acknowledged the increasing role of this subsector of CPS in the attainment of sustainable construction, little attention has been given to the education and training of its main body, namely construction management consultants (CMCs). This study investigated the competence and knowledge structure of CMCs, taking China as an example. Using interviews and a questionnaire survey, three key competences of CMCs and the underpinning knowledge structure were identified. The identified competences are personnel quality, on-site practical skills, and continuing professional learning. Underpinning these competences is a knowledge structure composed of a number of disciplines, including construction cost planning and control, civil engineering and construction, engineering contract and law, and construction project management. The research findings lay a solid foundation for future studies probing the role of construction management consultants in the area of sustainable construction.
Abstract:
The Australian housing sector contributes about a fifth of national greenhouse gas (GHG) emissions. GHG emissions contribute to climate change, which leads to an increase in the occurrence or intensity of natural disasters and to damage to houses. To ensure housing performance in the face of climate change, various rating tools for residential property have been introduced in different countries. The aim of this paper is to present a preliminary comparison between international and Australian rating tools in terms of purpose, use and sustainability elements for residential property. The methodology is to review, classify and compare the rating tools, identifying their similarities and differences. Two international tools, the Building Research Establishment Environmental Assessment Methodology (BREEAM) (UK) and Leadership in Energy and Environmental Design for Homes (LEED-Homes) (USA), are compared with two Australian tools, Green Star – Multi Unit Residential v1 and EnviroDevelopment. All four rating tools include management, energy, water and material aspects. The findings reveal thirteen elements that fall under three categories: spatial planning, occupants' health and comfort, and environmental conditions. The variations between tools may result from differences in the local prevailing climate. Not all sustainability elements covered by the international rating tools are included in the Australian tools. The voluntary nature of the tools implies that they are not broadly applied in their respective markets and that there is a policy implementation gap. A comprehensive rating tool could be developed in Australia to promote sustainable housing and lessen confusion about it, which would in turn assist in improving the supply of, and demand for, sustainable housing.
Abstract:
Light-emitting field-effect transistors (LEFETs) are emerging as a multi-functional class of optoelectronic devices. LEFETs can simultaneously execute light emission and the standard logic functions of a transistor in a single architecture. However, current LEFET architectures deliver either high brightness or high efficiency, but not both concurrently, limiting their use in technological applications. Here we show an LEFET device strategy that simultaneously improves brightness and efficiency. The key step change in LEFET performance arises from a bottom-gate, top-contact device architecture in which the source/drain electrodes are semitransparent and the active channel contains a bi-layer comprising a high-mobility charge-transporting polymer and a yellow-green emissive polymer. A record external quantum efficiency (EQE) of 2.1% at 1000 cd/m2 is demonstrated for polymer-based bilayer LEFETs.
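As a reminder of what the EQE figure above measures, external quantum efficiency is the ratio of emitted photons to injected charge carriers. The toy calculation below uses entirely hypothetical operating values (radiant flux and drive current), not the photometric measurement procedure such papers actually use.

```python
# Toy EQE estimate: emitted photons per injected electron.
# All operating values below are hypothetical, for illustration only.
h = 6.626e-34      # Planck constant, J s
c = 3.0e8          # speed of light, m/s
q = 1.602e-19      # elementary charge, C

wavelength = 550e-9          # yellow-green emission, m
optical_power = 1.0e-7       # detected radiant flux, W (hypothetical)
drive_current = 2.0e-6       # current through the channel, A (hypothetical)

photons_per_s = optical_power / (h * c / wavelength)
electrons_per_s = drive_current / q
eqe = photons_per_s / electrons_per_s
print(f"EQE = {100 * eqe:.2f} %")   # ~ 2.2 % for these made-up numbers
```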
Abstract:
A system for high utilization of LNG cold energy is proposed with the aid of a process simulator. The proposed design is a closed-loop system composed of a Hampson-type heat exchanger, turbines, pumps and an advanced humid air turbine (AHAT) or a gas turbine combined cycle (GTCC). Its heat sources are boil-off gas and the cooling water of the AHAT or GTCC. Cold exergy recovery to power can reach about 38 to 56%, compared with about 20% for existing cold power generation based on a single-component Rankine cycle. An advantage of the proposed system is that it reduces the number of heat exchangers. Furthermore, the environmental impact is minimized because the proposed design is a closed-loop system. A life-cycle comparative cost is calculated to demonstrate the feasibility of the proposed design. The development of Hampson-type exchangers is expected to meet the key functional requirements and to result in much higher LNG cold exergy recovery and improved overall system performance, i.e. re-gasification. Additionally, the proposed design is expected to provide flexibility to meet the different gas pressures suited to the deregulation of the energy system in Japan, and higher reliability for an integrated boil-off gas system.
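The 38-56% figure above is a ratio of net power output to the available cold exergy of the LNG stream, where the specific physical exergy relative to ambient is e = (h - h0) - T0(s - s0). The bookkeeping is sketched below with entirely hypothetical stream values; the order-of-magnitude cold exergy used is an assumption, not a value from the paper.

```python
def exergy_recovery_ratio(net_power_kw, lng_flow_kg_s, e_cold_kj_kg):
    """Fraction of the available LNG cold exergy converted to net power.

    net_power_kw:  net electrical output of the recovery cycle, kW
    lng_flow_kg_s: LNG send-out mass flow, kg/s
    e_cold_kj_kg:  specific cold exergy of the LNG stream, kJ/kg
    """
    return net_power_kw / (lng_flow_kg_s * e_cold_kj_kg)

# Hypothetical illustration: a specific cold exergy on the order of
# 1000 kJ/kg (assumption) and 45 MW net power from a 100 kg/s stream.
ratio = exergy_recovery_ratio(net_power_kw=45_000,
                              lng_flow_kg_s=100.0,
                              e_cold_kj_kg=1_000.0)
print(f"cold exergy recovery = {100 * ratio:.0f} %")   # ~ 45 %
```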
Abstract:
In the mining optimisation literature, most researchers have focused on two strategic- and tactical-level open-pit mine optimisation problems, termed the ultimate pit limit (UPIT) and the constrained pit limit (CPIT) problems, respectively. However, many researchers indicate that the substantial numbers of variables and constraints in real-world instances (e.g., with 50 to 1000 thousand blocks) make the CPIT mixed integer programming (MIP) model intractable in practice. It is therefore a considerable challenge to solve large-scale CPIT instances without relying on an exact MIP optimiser or on complicated MIP relaxation/decomposition methods. To address this challenge, two new graph-based algorithms, based on network flow graphs and conjunctive graph theory, are developed by exploiting problem properties. The performance of the proposed algorithms is validated on the large-scale benchmark UPIT and CPIT instances of the 2013 MineLib dataset. Compared with the best known results from MineLib, the proposed algorithms are shown to outperform the other CPIT solution approaches in the literature. The proposed graph-based algorithms lead to a more competent mine scheduling optimisation expert system, because a third-party MIP optimiser is no longer indispensable and random neighbourhood search is not necessary.
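For readers unfamiliar with the UPIT problem mentioned above, it has a classical reduction to a minimum s-t cut on the block precedence graph (this is background, not the new algorithms proposed in the paper). A minimal sketch with made-up block values and precedences:

```python
import networkx as nx

def ultimate_pit(block_value, precedence):
    """Classical max-flow/min-cut formulation of the ultimate pit limit.

    block_value: dict block -> economic value (positive ore, negative waste)
    precedence:  dict block -> list of blocks that must be mined first
    Returns the set of blocks in the optimal pit.
    """
    G = nx.DiGraph()
    for b, v in block_value.items():
        if v > 0:
            G.add_edge("s", b, capacity=v)        # source feeds profitable blocks
        elif v < 0:
            G.add_edge(b, "t", capacity=-v)       # waste blocks drain to the sink
    for b, preds in precedence.items():
        for p in preds:
            G.add_edge(b, p, capacity=float("inf"))  # precedence is uncuttable
    _, (source_side, _) = nx.minimum_cut(G, "s", "t")
    return source_side - {"s"}

# Tiny made-up instance: ore block 'o' requires waste blocks 'w1' and 'w2'.
values = {"o": 10.0, "w1": -3.0, "w2": -4.0}
preds = {"o": ["w1", "w2"]}
print(ultimate_pit(values, preds))   # {'o', 'w1', 'w2'}
```

The CPIT problem adds period-by-period resource constraints on top of this structure, which is what makes the exact MIP model so much harder.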
Abstract:
A highly extended dithienothiophene comonomer building block was used in combination with a highly fused, aromatic, furan-substituted diketopyrrolopyrrole for the synthesis of the novel donor–acceptor alternating copolymer PDPPF-DTT. Upon testing PDPPF-DTT as a channel semiconductor in top-contact, bottom-gate organic field-effect transistors (OFETs), it was found to exhibit p-channel behaviour. A highest hole mobility of 3.56 cm2 V−1 s−1 was recorded for PDPPF-DTT. To our knowledge, this is the highest mobility reported so far for the furan-flanked diketopyrrolopyrrole class of copolymers using a conventional device geometry with straightforward processing.
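OFET hole mobilities like the one quoted above are commonly extracted from the saturation-regime transfer curve via I_D = (W C_i / 2L) mu (V_G - V_T)^2; this is not necessarily the exact procedure of this paper. A minimal sketch with synthetic transfer data and hypothetical device parameters:

```python
import numpy as np

def saturation_mobility(vg, id_sat, W, L, Ci):
    """Saturation mobility from the slope of sqrt(I_D) versus V_G:
    I_D = (W*Ci/(2*L)) * mu * (V_G - V_T)^2  =>  mu = 2*L*slope^2 / (W*Ci).
    Units: vg in V, id_sat in A, W and L in m, Ci in F/m^2; mu in m^2/(V s)."""
    slope, _ = np.polyfit(vg, np.sqrt(np.abs(id_sat)), 1)
    return 2.0 * L * slope**2 / (W * Ci)

# Synthetic p-channel transfer sweep, plotted versus |V_G| (hypothetical):
vg = np.linspace(10, 60, 26)                 # |V_G| in V
id_sat = 3e-7 * (vg - 5.0) ** 2              # idealised quadratic curve, A
mu = saturation_mobility(vg, id_sat, W=1e-3, L=20e-6, Ci=1.2e-4)
print(f"mu = {mu * 1e4:.2f} cm^2 V^-1 s^-1")  # ~ 1.00 for these made-up numbers
```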
Abstract:
Distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Although Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, significantly affecting the overall computing performance. To address these problems, a scalable and efficient data and task distribution strategy is presented in this paper for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments on bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines can be achieved using the presented approach.
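To make the problem shape concrete (this is a toy sketch, not the distribution strategy proposed in the paper), an ATAC workload consists of all C(n, 2) pairwise comparisons, and each worker must locally hold the data items of the pairs assigned to it. A greedy assignment that balances load against heterogeneous capacities while preferring workers that already hold the data:

```python
from itertools import combinations

def distribute_atac(items, worker_capacity):
    """Greedy distribution of all-to-all comparison tasks.

    items:           list of data item identifiers (e.g. sequence names)
    worker_capacity: dict worker -> relative computing capacity
    Returns (tasks, data): tasks[w] is the list of pairs assigned to
    worker w and data[w] is the set of items w must store locally.
    """
    tasks = {w: [] for w in worker_capacity}
    data = {w: set() for w in worker_capacity}
    load = {w: 0.0 for w in worker_capacity}
    for a, b in combinations(items, 2):
        # Pick the worker with the lowest capacity-normalised load,
        # breaking ties toward workers that already hold the pair's data.
        w = min(worker_capacity,
                key=lambda wk: (load[wk] / worker_capacity[wk],
                                len({a, b} - data[wk])))
        tasks[w].append((a, b))
        data[w].update((a, b))
        load[w] += 1.0
    return tasks, data

tasks, data = distribute_atac([f"seq{i}" for i in range(6)],
                              {"fast": 2.0, "slow": 1.0})
print({w: len(t) for w, t in tasks.items()},
      {w: len(d) for w, d in data.items()})
```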
Abstract:
Workshops and seminars are widely used forms of doctoral training. However, research with a particular focus on these forms of doctoral training is sporadic in the literature. There is little, if any, such research concerning the international context and participants' own voices. Mindful of these lacunae in the literature, we write the current paper as a group of participants in one of a series of doctoral forums co-organised annually by Beijing Normal University, China, and Queensland University of Technology, Australia. The paper voices our own experiences of participation in the doctoral forum. Data were drawn from the reflections, journals and group discussions of all 12 student and academic participants. These qualitative data were organised and analysed through Bourdieu's notions of capital and field. Findings indicate that the doctoral forum created enabling and challenging social fields in which participants accrued and exchanged various forms of capital and negotiated transient and complex power relations. In this respect, the sociological framework used provides a distinctive theoretical tool to conceptualise and analyse the benefits and tensions of participation in the doctoral forum. The knowledge built and lessons learned through our paper provide implications and recommendations for future planning of, and participation in, the doctoral forum series and similar activities elsewhere.
Abstract:
This paper proposes a new multi-resource, multi-stage mine production timetabling problem for optimising open-pit drilling, blasting and excavating operations under equipment capacity constraints. The flow process is analysed based on real-life data from an Australian iron ore mine site. The objective of the model is to maximise throughput and minimise the total idle time of equipment at each stage. Comprehensive mining attributes and constraints are considered, including types of equipment, operating capacities of equipment, ready times of equipment, speeds of equipment, block-sequence-dependent movement times and equipment-assignment-dependent operational times. The model also provides the availability and usage of equipment units at multiple operational stages, such as the drilling, blasting and excavating stages. The problem is formulated as a mixed integer program and solved with the ILOG CPLEX optimiser. The proposed model is validated through extensive computational experiments and aims to improve mine production efficiency at the operational level.
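To show the general shape of such a formulation (a toy sketch only: hypothetical data, the open-source PuLP/CBC toolchain rather than the ILOG CPLEX model of the paper, and most of the constraints listed above omitted), a tiny block-to-equipment-and-period assignment maximising throughput under per-period capacities:

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

# Hypothetical toy data: four blocks, two excavators, three periods.
blocks = {"b1": 120, "b2": 90, "b3": 150, "b4": 60}   # tonnes per block
equipment = {"exc1": 200, "exc2": 150}                # tonnes per period
periods = [1, 2, 3]

m = LpProblem("mine_production_timetabling_toy", LpMaximize)
x = {(b, e, t): LpVariable(f"x_{b}_{e}_{t}", cat=LpBinary)
     for b in blocks for e in equipment for t in periods}

# Maximise total tonnes moved over the horizon.
m += lpSum(blocks[b] * x[b, e, t] for b in blocks for e in equipment for t in periods)

# Each block is excavated at most once, by one machine in one period.
for b in blocks:
    m += lpSum(x[b, e, t] for e in equipment for t in periods) <= 1

# Equipment capacity in each period.
for e in equipment:
    for t in periods:
        m += lpSum(blocks[b] * x[b, e, t] for b in blocks) <= equipment[e]

m.solve(PULP_CBC_CMD(msg=False))
print([(b, e, t) for (b, e, t), v in x.items() if v.value() == 1])
```

The full model in the paper additionally sequences drilling, blasting and excavating stages and accounts for equipment movement and ready times, which is what makes it a timetabling rather than a simple assignment problem.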