306 results for Individual-based modeling
Abstract:
Fractional anisotropy (FA), a widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept, as it is influenced by several quantities, including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. High-angular resolution diffusion imaging (HARDI) can resolve more complex diffusion geometries than standard DTI, including fibers that cross or mix. The tensor distribution function (TDF) can be used to reconstruct multiple underlying fibers per voxel, representing the diffusion profile as a probabilistic mixture of tensors. Here we found that DTI-derived mean diffusivity (MD) correlates well with actual individual fiber MD, but DTI-derived FA correlates poorly with actual individual fiber anisotropy and may be suboptimal when used to detect disease processes that affect myelination. Analysis of the TDFs revealed that almost 40% of white-matter voxels contained more than one dominant fiber. To assess fiber integrity more accurately in these cases, we propose the differential diffusivity (DD), which measures the average anisotropy based on all dominant directions in each voxel.
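For reference, both DTI-derived quantities compared above are standard functions of the eigenvalues λ₁, λ₂, λ₃ of the single fitted diffusion tensor:

```latex
\mathrm{MD} = \bar{\lambda} = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3},
\qquad
\mathrm{FA} = \sqrt{\tfrac{3}{2}}\;
\frac{\sqrt{(\lambda_1 - \bar{\lambda})^2 + (\lambda_2 - \bar{\lambda})^2 + (\lambda_3 - \bar{\lambda})^2}}
     {\sqrt{\lambda_1^2 + \lambda_2^2 + \lambda_3^2}}
```

Because FA is computed from one tensor per voxel, any mixture of crossing fibers is collapsed into a single ellipsoid before FA is evaluated, which is consistent with the poor per-fiber correlation reported above.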
Abstract:
We study the influence of the choice of template in tensor-based morphometry. Using 3D brain MR images from 10 monozygotic twin pairs, we defined a tensor-based distance in the log-Euclidean framework [1] between each image pair in the study. Relative to this metric, twin pairs were found to be closer to each other on average than random pairings, consistent with evidence that brain structure is under strong genetic control. We also computed the intraclass correlation and the associated permutation p-value at each voxel for the determinant of the Jacobian matrix of the transformation. The cumulative distribution function (CDF) of these voxelwise p-values was computed for each template and compared to the null distribution. Surprisingly, there was very little difference between the CDFs of statistics computed from analyses using different templates. As the brain with the least log-Euclidean deformation cost, the mean template defined here avoids the blurring caused by creating a synthetic image from a population and, when selected from a large population, avoids bias by being geometrically centered, in a metric sensitive enough to anatomical similarity that it can even detect genetic affinity among anatomies.
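A minimal sketch of the underlying per-tensor metric in the log-Euclidean framework [1], assuming symmetric positive-definite (SPD) inputs; the paper aggregates this distance over whole deformation fields, which is omitted here:

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(S1, S2):
    """Log-Euclidean distance between two symmetric positive-definite
    tensors: the Frobenius norm of the difference of their matrix logs."""
    return np.linalg.norm(logm(S1) - logm(S2), ord="fro")

# Illustrative example with two nearby SPD tensors.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.8, 0.2], [0.2, 1.1]])
print(log_euclidean_distance(A, B))
```

Because the metric is a norm taken after the matrix logarithm, distances can be averaged and compared across pairs, which is what allows twin pairs to be ranked against random pairings.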
Abstract:
Population-based brain mapping provides great insight into the trajectory of aging and dementia, as well as brain changes that normally occur over the human life span. We describe three novel brain mapping techniques, cortical thickness mapping, tensor-based morphometry (TBM), and hippocampal surface modeling, which offer enormous power for measuring disease progression in drug trials, and shed light on the neuroscience of brain degeneration in Alzheimer's disease (AD) and mild cognitive impairment (MCI). We report the first time-lapse maps of cortical atrophy spreading dynamically in the living brain, based on averaging data from populations of subjects with Alzheimer's disease and normal subjects imaged longitudinally with MRI. These dynamic sequences show a rapidly advancing wave of cortical atrophy sweeping from limbic and temporal cortices into higher-order association and ultimately primary sensorimotor areas, in a pattern that correlates with cognitive decline. A complementary technique, TBM, reveals the 3D profile of atrophic rates, at each point in the brain. A third technique, hippocampal surface modeling, plots the profile of shape alterations across the hippocampal surface. The three techniques provide moderate to highly automated analyses of images, have been validated on hundreds of scans, and are sensitive to clinically relevant changes in individual patients and groups undergoing different drug treatments. We compare time-lapse maps of AD, MCI, and other dementias, correlate these changes with cognition, and relate them to similar time-lapse maps of childhood development, schizophrenia, and HIV-associated brain degeneration. Strengths and weaknesses of these different imaging measures for basic neuroscience and drug trials are discussed.
Abstract:
Fractional anisotropy (FA), a widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept, as it is influenced by several quantities, including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel, along with their individual anisotropy measures, by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
Abstract:
Scenario planning is a method widely used by strategic planners to address uncertainty about the future. However, current methods either fail to address the future behaviour and impact of stakeholders or treat the role of stakeholders informally. We present a practical decision-analysis-based methodology for analysing stakeholder objectives and likely behaviour within contested unfolding futures. We address issues of power, interest, and commitment to achieve desired outcomes across a broad stakeholder constituency. Drawing on frameworks for corporate social responsibility (CSR), we provide an illustrative example of our approach to analyse a complex contested issue that crosses geographic, organisational and cultural boundaries. Whilst strategies can be developed by individual organisations that consider the interests of others, for example as part of an organisation's CSR agenda, we show that our augmentation of the scenario method provides a further, nuanced analysis of the power and objectives of all concerned stakeholders across a variety of unfolding futures. The resulting modelling framework is intended to yield insights and hence more informed decision making by individual stakeholders or regulators.
Abstract:
Deterrence-based initiatives form a cornerstone of many road safety countermeasures. This approach is informed by Classical Deterrence Theory, which proposes that individuals will be deterred from committing offences if they fear the perceived consequences of the act, especially the perceived certainty, severity and swiftness of sanctions. While deterrence-based countermeasures have proven effective in reducing a range of illegal driving behaviours known to cause crashes, such as speeding and drink driving, the exact level of exposure required, and how the process works, remain unknown. As a result, the current study involved a systematic review of the literature to identify theoretical advancements within deterrence theory that have informed evidence-based practice. Studies that reported on perceptual deterrence between 1950 and June 2015 were searched in electronic databases including PsycINFO and ScienceDirect, within both road safety and non-road safety fields. This review indicated that scientific efforts to understand deterrence processes for road safety were most intense during the 1970s and 1980s. This era produced competing theories postulating that both legal and non-legal factors can influence offending behaviours. Since then, little theoretical progression has been made in the road safety arena, apart from Stafford and Warr's (1993) reconceptualisation of deterrence, which illuminated the important issue of punishment avoidance. In contrast, the broader field of criminology has continued to advance theoretical knowledge by investigating a range of individual difference-based factors proposed to influence deterrent processes, including moral inhibition, social bonding, self-control, and tendencies to discount the future. However, this scientific knowledge has not been directed towards identifying how best to utilise deterrence mechanisms to improve road safety. This paper highlights the implications of this lack of progression and provides direction for future research.
Abstract:
Today, user-generated information such as online reviews has become increasingly significant for customers in the decision-making process. Meanwhile, as the volume of online reviews proliferates, there is a pressing demand to help users tackle the information overload problem. To extract useful information from overwhelming numbers of reviews, considerable work has been proposed, such as review summarization and review selection. In particular, to avoid redundant information, researchers attempt to select a small set of reviews to represent the entire review corpus by preserving its statistical properties (e.g., opinion distribution). However, one significant drawback of existing work is that it measures the utility of the extracted reviews only as a whole, without considering the quality of each individual review. As a result, the set of chosen reviews may contain low-quality ones even if its statistical properties are close to those of the original review corpus, which users do not prefer. In this paper, we propose a review selection method that takes review quality into consideration during the selection process. Specifically, we examine the relationships between product features based upon a domain ontology to capture review characteristics, and use them to select reviews that are of good quality and also preserve the opinion distribution. Our experimental results on real-world review datasets demonstrate that our proposed approach is feasible and effectively improves review selection performance.
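The abstract does not spell out the selection algorithm, but its stated goal (choose a small, high-quality subset whose opinion distribution stays close to the corpus) can be sketched as a greedy trade-off. All names, the L1 fidelity term, and the weight alpha below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def select_reviews(opinions, quality, k, alpha=0.5):
    """Greedy sketch: pick k reviews, trading off each review's quality
    score against how well the chosen set preserves the corpus-wide
    opinion distribution (L1 distance between feature-opinion vectors).
    opinions: (n, d) array, one opinion distribution per review;
    quality: (n,) quality scores. Illustrative assumption only."""
    corpus = opinions.mean(axis=0)  # target opinion distribution
    chosen = []
    for _ in range(k):
        best_i, best_score = None, -np.inf
        for i in range(len(opinions)):
            if i in chosen:
                continue
            cand = opinions[chosen + [i]].mean(axis=0)
            fidelity = -np.abs(cand - corpus).sum()  # closer is better
            score = alpha * quality[i] + (1 - alpha) * fidelity
            if score > best_score:
                best_i, best_score = i, score
        chosen.append(best_i)
    return chosen
```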
Abstract:
An innovative cement-based soft-hard-soft (SHS) multi-layer composite has been developed for protective infrastructure. The composite consists of three layers: asphalt concrete (AC), high-strength concrete (HSC), and engineered cementitious composites (ECC). A three-dimensional benchmark numerical model of this SHS composite as pavement under blast load was established using LS-DYNA and validated by a field blast test. Parametric studies were carried out to investigate the influence of several key parameters, including the thickness and strength of the HSC and ECC layers, interface properties, and soil conditions, on the blast resistance of the composite. The outcomes of this study also enabled the establishment of a damage pattern chart for protective pavement design and rapid repair after blast loading. Efficient methods to further improve the blast resistance of the SHS multi-layer pavement system were also recommended.
Abstract:
Employees’ safety climate perceptions dictate their safety behavior because individuals act based on their perceptions of reality. Extensive empirical research in applied psychology has confirmed this relationship. However, few efforts have been made in construction research to investigate the factors contributing to a favorable safety climate. As an initial effort to address this knowledge gap, this paper examines factors contributing to a psychological safety climate, an operationalization of safety climate at the individual level and, hence, the basic element of safety climate at higher levels. A multiperspective framework of contributors to a psychological safety climate is estimated with a structural equation modeling technique using individual questionnaire responses from a random sample of construction project personnel. The results inform management of three routes to a psychological safety climate: a client’s proactive involvement in safety management, a workforce-friendly workplace created by the project team, and transformational supervisors’ communication about safety matters with the workforce. This paper contributes to the field of construction engineering and management by highlighting a broader contextual influence on the systematic formation of psychological safety climate perceptions.
Abstract:
This paper proposes an analytical incident traffic management framework for freeway incident modeling and traffic re-routing. The proposed framework incorporates an econometric incident duration model and a traffic re-routing optimization module. The incident duration model is used to estimate the expected duration of the incident and thus determine the planning horizon for the re-routing module. The re-routing module is a CTM-based single-destination system-optimal dynamic traffic assignment model that generates optimal real-time strategies for re-routing freeway traffic to its adjacent arterial network during incidents. The proposed framework has been applied to a case study network comprising a freeway and its adjacent arterial network in South East Queensland, Australia. The results from different scenarios of freeway demand and incident blockage extent have been analyzed, and the advantages of the proposed framework are demonstrated.
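The re-routing module's traffic dynamics rest on the cell transmission model (CTM); a minimal single-corridor sketch of Daganzo's flow rule is given below, with the system-optimal assignment layer and the arterial network omitted and all parameter names chosen for illustration:

```python
def ctm_step(n, N, Q, wv):
    """One CTM time step on a corridor of cells (Daganzo's flow rule).
    n: vehicles currently in each cell; N: jam capacities;
    Q: max flows per step; wv: backward/forward wave-speed ratio w/v.
    Flow from cell i to i+1 is limited by upstream demand, downstream
    flow capacity, and downstream spare capacity."""
    flows = [min(n[i], Q[i + 1], wv * (N[i + 1] - n[i + 1]))
             for i in range(len(n) - 1)]
    out = list(n)
    for i, y in enumerate(flows):
        out[i] -= y
        out[i + 1] += y
    return out

# Example: a queue discharging into empty downstream cells.
print(ctm_step(n=[12, 0, 0], N=[20, 20, 20], Q=[6, 6, 6], wv=0.5))
```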
Abstract:
The construction industry is a knowledge-based industry in which various actors with diverse expertise create unique information within different phases of a project. The industry has been criticized by researchers and practitioners as being unable to apply newly created knowledge effectively to innovate. The fragmented nature of the construction industry reduces the opportunity for project participants to learn from each other and absorb knowledge. Building Information Modelling (BIM), referring to digital representations of constructed facilities, is a promising technological advance that has been proposed to assist in the sharing of knowledge and the creation of linkages between firms. Previous studies have mainly focused on the technical attributes of BIM, and there is little evidence on its capability to enhance learning in construction firms. This conceptual paper identifies six ‘functional attributes’ of BIM that act as triggers to stimulate learning: (1) comprehensibility; (2) predictability; (3) accuracy; (4) transparency; (5) mutual understanding; and (6) integration.
Abstract:
If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
Abstract:
Large power transformers are important parts of the power supply chain, and these critical networks of engineering assets form an essential base of a nation’s energy resource infrastructure. This research identifies the key factors influencing transformer normal operating conditions and predicts the asset management lifespan. Engineering asset research has developed few lifespan forecasting methods that combine real-time monitoring solutions for transformer maintenance and replacement. Utilizing the rich data source of a remote terminal unit (RTU) system for sensor-data-driven analysis, this research develops an innovative real-time lifespan forecasting approach applying logistic regression based on the Weibull distribution. The methodology and the implementation prototype are verified using a data series from 161 kV transformers to evaluate their efficiency and accuracy for energy sector applications. Asset stakeholders and suppliers benefit significantly from real-time power transformer lifespan evaluation for maintenance and replacement decision support.
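The abstract does not detail the estimator; one common way to obtain a real-time survival signal from a fitted Weibull model, sketched here with hypothetical failure ages and without the logistic-regression link to RTU sensor covariates that the paper adds, is the conditional survival probability:

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical failure ages (years) of retired transformers.
ages_at_failure = np.array([31.0, 28.5, 40.2, 35.1, 44.0, 38.7, 33.3])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = weibull_min.fit(ages_at_failure, floc=0)

def conditional_survival(age, horizon):
    """P(unit survives `horizon` more years | alive at `age`)
    = S(age + horizon) / S(age) under the fitted Weibull."""
    sf = weibull_min.sf  # survival function
    return sf(age + horizon, shape, loc=loc, scale=scale) / \
           sf(age, shape, loc=loc, scale=scale)

print(conditional_survival(age=30.0, horizon=5.0))
```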
Abstract:
Structural equation modeling (SEM) is a powerful statistical approach for testing networks of direct and indirect theoretical causal relationships in complex data sets with intercorrelated dependent and independent variables. SEM is commonly applied in ecology, but the spatial information typically present in ecological data remains difficult to model in an SEM framework. Here we propose a simple method for spatially explicit SEM (SE-SEM) based on the analysis of variance/covariance matrices calculated across a range of lag distances. This method provides readily interpretable plots of the change in path coefficients across scale and can be implemented using any standard SEM software package. We demonstrate the application of this method using three studies examining the relationships between environmental factors, plant community structure, nitrogen fixation, and plant competition. By design, these data sets had a spatial component, but they were previously analyzed using standard SEM models. Using these data sets, we demonstrate the application of SE-SEM to regularly spaced, irregularly spaced, and ad hoc spatial sampling designs and discuss the increased inferential capability of this approach compared with standard SEM. We provide an R package, sesem, to easily implement spatial structural equation modeling.
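A conceptual sketch of the SE-SEM building block described above: a separate cross-covariance matrix is estimated for each lag-distance bin, and the same SEM is then fitted to each matrix. The SEM fitting step lives in the sesem R package and is omitted here, and this estimator is an approximation for illustration, not sesem's exact implementation:

```python
import numpy as np
from scipy.spatial.distance import pdist

def lag_covariance_matrices(coords, X, bin_edges):
    """For each lag-distance bin, estimate a p x p cross-covariance
    matrix from all sample pairs whose separation falls in the bin.
    coords: (n, 2) sample locations; X: (n, p) variables."""
    sep = pdist(coords)                    # condensed pairwise distances
    iu, ju = np.triu_indices(len(X), k=1)  # pair indices, same order as pdist
    Xc = X - X.mean(axis=0)                # center the variables
    covs = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (sep > lo) & (sep <= hi)
        if not m.any():                    # skip empty bins
            covs.append(None)
            continue
        a, b = Xc[iu[m]], Xc[ju[m]]
        covs.append((a.T @ b + b.T @ a) / (2 * m.sum()))  # symmetrized
    return covs
```

Plotting the same path coefficient, refit to each bin's matrix, against lag distance yields the across-scale profiles the authors describe.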
Abstract:
A simulation model (PCPF-B) was developed, based on the PCPF-1 model, to predict the runoff of pesticides from paddy plots to a drainage canal in a paddy block. The block-scale model comprises three modules: (1) a module for pesticide application, (2) a module for pesticide behavior in paddy fields, and (3) a module for pesticide concentration in the drainage canal. The PCPF-B model was first evaluated against published data from a single plot and then applied to predict the concentration of bensulfuron-methyl in one paddy block in the Sakura river basin, Ibaraki, Japan, where a detailed field survey was conducted. The PCPF-B model simulated the behavior of bensulfuron-methyl in individual paddy plots well. It also reproduced the runoff pattern of bensulfuron-methyl at the block outlet, although bensulfuron-methyl concentrations were overestimated due to uncertainty in the water balance estimation. The application of water management practices such as water-holding periods and seepage control also affected the performance of the model. A probabilistic approach may be necessary for comprehensive risk assessment in large-scale paddy areas.