947 results for G-extremal processes
Abstract:
Purpose of this paper: This research examines the effects of inadequate documentation on the cost management and tendering processes in Managing Contractor Contracts, using Fixed Lump Sum contracts as a benchmark. Design/methodology/approach: A questionnaire survey was conducted with industry practitioners to solicit their views on documentation quality issues in the construction industry, followed by a series of semi-structured interviews to validate the survey findings. Findings and value: The results show that documentation quality remains a significant issue, contributing to the industry's inefficiency and poor reputation. The level of satisfaction with individual attributes of documentation quality varies. Attributes that appear to be affected by the choice of procurement method include coordination, buildability, efficiency, completeness and delivery time. Similarly, the use and effectiveness of risk mitigation techniques appears to vary between methods, based on a number of factors such as documentation completeness, early involvement and fast tracking. Originality/value of paper: This research addresses a gap in the existing body of knowledge, where few studies have examined whether the choice of a project procurement system influences documentation quality and the level of that impact. Conclusions: Ultimately, the research concludes that the entire project team, including the client and designers, should carefully consider the individual project's requirements and compare them to the trade-offs associated with documentation quality and the procurement method. While documentation quality is certainly an issue to be improved upon, by identifying a project's performance requirements a procurement method can be chosen to maximise the likelihood that those requirements will be met. This allows the aspects of documentation quality considered most important to the individual project to be managed appropriately.
Abstract:
The environment moderates behaviour using a subtle language of ‘affordances’ and ‘behaviour-settings’. Affordances are environmental offerings: objects that demand action; a cliff demands a leap and binoculars demand a peek. Behaviour-settings are ‘places’: spaces encoded with expectations and meanings. Behaviour-settings work in the opposite way to affordances; they demand inhibition, such as an introspective demeanour in a church or when under surveillance. Most affordances and behaviour-settings are designed, and as such, designers are effectively predicting brain reactions. • Affordances are nested within, and moderated by, behaviour-settings. Both trigger automatic neural responses (excitation and inhibition). These, for the most part, cancel each other out. This balancing enables object recognition and allows choice about what action should be taken (if any). But when excitation exceeds inhibition, instinctive action will automatically commence. In positive circumstances this may mean laughter or a smile; in negative circumstances, fleeing, screaming or other panic responses are likely. People with poor frontal function, whether due to immaturity (childhood or developmental disorders) or to hypofrontality (schizophrenia, brain damage or dementia), have a reduced capacity to balance excitatory and inhibitory impulses. For these people, environmental behavioural demands increase as frontal brain function declines. • The world around us is not only encoded with symbols and sensory information. Opportunities and restrictions work on a much more primal level. Person/space interactions constantly take place at a molecular scale. Every space we enter has its own special dynamic, where individualism vies for supremacy between the opposing forces of affordance-related excitation and the inhibition intrinsic to behaviour-settings. In this context, even a small change, such as the installation of a CCTV camera, can turn a circus into a prison.
• This paper draws on cutting-edge neurological theory to understand the psychological determinants of the everyday experience of the designed environment.
Abstract:
Both a systemic inflammatory response and DNA damage have been observed following exhaustive endurance exercise. Hypothetically, exercise-induced DNA damage might either be a consequence of inflammatory processes or be causally involved in inflammation and immunological alterations after strenuous prolonged exercise (e.g. by inducing lymphocyte apoptosis and lymphocytopenia). Nevertheless, up to now only a few studies have addressed this issue, and there is hardly any evidence of a direct relationship between DNA or chromosomal damage and inflammatory responses in the context of exercise. The most conclusive picture that emerges from the available data is that reactive oxygen and nitrogen species (RONS) appear to be the key effectors linking inflammation with DNA damage. Considering the time-courses of inflammatory and oxidative stress responses on the one hand and DNA effects on the other, the lack of correlations between these responses might also be explained by observation periods that were too short. This review summarizes and discusses recent findings on this topic. Furthermore, data from our own study are presented, which aimed to verify potential associations between several endpoints of genome stability and inflammatory, immune-endocrine and muscle damage parameters in competitors in an Ironman triathlon, followed up to 19 days into recovery. The current results indicate that DNA effects in lymphocytes are not responsible for exercise-induced inflammatory responses. Furthermore, this investigation shows that inflammatory processes, conversely, do not promote DNA damage, either directly or via an increased formation of RONS derived from inflammatory cells. Oxidative DNA damage might have been counteracted by training- and exercise-induced antioxidant responses. However, further studies are needed that combine advanced omics-based techniques (transcriptomics, proteomics) with state-of-the-art biochemical biomarkers to gain more insight into the underlying mechanisms.
Abstract:
Kimberlite terminology remains problematic because descriptive and genetic terms are mixed together in most existing terminology schemes. In addition, many terms used in existing kimberlite terminology schemes are not used in mainstream volcanology, even though kimberlite bodies are commonly the remains of kimberlite volcanic vents and edifices. We build on our own recently published approach to kimberlite facies terminology, which involves a systematic progression from descriptive to genetic terminology. The scheme can be used for both coherent kimberlite (i.e. kimberlite that was emplaced without undergoing any fragmentation processes, and therefore preserves coherent igneous textures) and fragmental kimberlites. The approach involves documenting components and textures, and assessing the degree and effects of alteration on both components and original emplacement textures. This allows a purely descriptive composite name for the rock or deposit (component, textural and compositional) to be constructed first, free of any biases about emplacement setting and processes. Important facies features such as depositional structures, contact relationships and setting are then assessed, leading to a composite descriptive and genetic name for the facies or rock unit that summarises key descriptive characteristics, emplacement processes and setting. Flow charts summarising the key steps in developing a progressive descriptive-to-genetic terminology are provided for both coherent and fragmental facies/deposits/rock units. These can be copied and used in the field, or in conjunction with field observations (e.g. drill core) and petrographic data. Because the approach depends heavily on field-scale observations, characteristics and process interpretations, only the first, descriptive part is appropriate where only petrographic observations are being made.
Where field-scale observations are available, the progression from descriptive to interpretative terminology can be used, especially where some petrographic data also become available.
Abstract:
The purpose of this study is to examine the current level of stakeholder involvement during a project's planning process. Stakeholders often provide needed resources and have the ability to control the interaction and resource flows in the network. They also ultimately have a strong impact on an organisation's survival, and therefore appropriate management and involvement of key stakeholders should be an important part of any project management plan. A series of literature reviews was conducted to identify and categorise the significant phases involved in planning. For data collection, a questionnaire survey was designed and distributed amongst nearly 200 companies involved in the residential building sector in Australia. Results of the analysis demonstrate the engagement levels of the four stakeholder groups involved in the planning process and establish a basis for further improvement of stakeholder involvement.
Abstract:
We commend Swanenburg et al. (2013) on the translation, development, and clinimetric analysis of the NDI-G. However, the dual-factor structure found with factor analysis and the high level of internal consistency (IC) highlighted in their discussion were not emphasized in the abstract or conclusion. These points may imply some inconsistencies with the final conclusions, since determining stable point estimates with the study's small sample is exceedingly difficult.
Abstract:
A recurring question for cognitive science is whether functional neuroimaging data can provide evidence for or against psychological theories. As posed, the question reflects an adherence to a popular scientific method known as 'strong inference'. The method entails constructing multiple hypotheses (Hs) and designing experiments so that alternative possible outcomes will refute at least one (i.e., 'falsify' it). In this article, after first delineating some well-documented limitations of strong inference, I provide examples of functional neuroimaging data being used to test Hs from rival modular information-processing models of spoken word production. 'Strong inference' for neuroimaging involves first establishing a systematic mapping of 'processes to processors' for a common modular architecture. Alternate Hs are then constructed from psychological theories that attribute the outcome of manipulating an experimental factor to two or more distinct processing stages within this architecture. Hs are then refutable by a finding of activity differentiated spatially and chronometrically by experimental condition. When employed in this manner, the data offered by functional neuroimaging may be more useful for adjudicating between accounts of processing loci than behavioural measures.
Abstract:
In the picture-word interference task, naming responses are facilitated when a distractor word is orthographically and phonologically related to the depicted object as compared to an unrelated word. We used event-related functional magnetic resonance imaging (fMRI) to investigate the cerebral hemodynamic responses associated with this priming effect. Serial (or independent-stage) and interactive models of word production that explicitly account for picture-word interference effects assume that the locus of the effect is at the level of retrieving phonological codes, a role attributed recently to the left posterior superior temporal cortex (Wernicke's area). This assumption was tested by randomly presenting participants with trials from orthographically related and unrelated distractor conditions and acquiring image volumes coincident with the estimated peak hemodynamic response for each trial. Overt naming responses occurred in the absence of scanner noise, allowing reaction time data to be recorded. Analysis of this data confirmed the priming effect. Analysis of the fMRI data revealed blood oxygen level-dependent signal decreases in Wernicke's area and the right anterior temporal cortex, whereas signal increases were observed in the anterior cingulate, the right orbitomedial prefrontal, somatosensory, and inferior parietal cortices, and the occipital lobe. The results are interpreted as supporting the locus for the facilitation effect as assumed by both classes of theoretical model of word production. In addition, our results raise the possibilities that, counterintuitively, picture-word interference might be increased by the presentation of orthographically related distractors, due to competition introduced by activation of phonologically related word forms, and that this competition requires inhibitory processes to be resolved. The priming effect is therefore viewed as being sufficient to offset the increased interference. 
We conclude that information from functional imaging studies might be useful for constraining theoretical models of word production.
Abstract:
In two fMRI experiments, participants named pictures with superimposed distractors that were high or low in frequency or varied in terms of age of acquisition. Pictures superimposed with low-frequency words were named more slowly than those superimposed with high-frequency words, and late-acquired words interfered with picture naming to a greater extent than early-acquired words. The distractor frequency effect (Experiment 1) was associated with increased activity in left premotor and posterior superior temporal cortices, consistent with the operation of an articulatory response buffer and verbal self-monitoring system. Conversely, the distractor age-of-acquisition effect (Experiment 2) was associated with increased activity in the left middle and posterior middle temporal cortex, consistent with the operation of lexical-level processes such as lemma and phonological word form retrieval. The spatially dissociated patterns of activity across the two experiments indicate that distractor effects in picture-word interference may occur at lexical or postlexical levels of processing in speech production.
Abstract:
Diffusion-weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional diffusion tensor imaging (DTI) is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with more than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once the optimal TDF is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
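The second-order tensor fit mentioned in this abstract is conventionally a linear least-squares inversion of the Stejskal-Tanner signal equation, S_i = S0 · exp(-b · g_iᵀ D g_i). The following sketch is illustrative only (function and variable names are not from the paper, and real pipelines would add noise handling and positivity constraints); it shows why at least 6 gradient directions are needed: the symmetric tensor D has exactly 6 unique elements.

```python
import numpy as np

def fit_diffusion_tensor(signals, s0, bvecs, bval):
    """Linear least-squares DTI fit from >= 6 diffusion-weighted signals.

    Linearizing Stejskal-Tanner gives -ln(S_i/S0)/b = g_i^T D g_i,
    a linear system in the 6 unique elements of the symmetric tensor D.
    """
    y = -np.log(np.asarray(signals, dtype=float) / s0) / bval
    g = np.asarray(bvecs, dtype=float)
    # Design matrix row per direction: [gx^2, gy^2, gz^2, 2gxgy, 2gxgz, 2gygz]
    A = np.column_stack([
        g[:, 0] ** 2, g[:, 1] ** 2, g[:, 2] ** 2,
        2 * g[:, 0] * g[:, 1],
        2 * g[:, 0] * g[:, 2],
        2 * g[:, 1] * g[:, 2],
    ])
    dxx, dyy, dzz, dxy, dxz, dyz = np.linalg.lstsq(A, y, rcond=None)[0]
    return np.array([[dxx, dxy, dxz],
                     [dxy, dyy, dyz],
                     [dxz, dyz, dzz]])
```

With noise-free signals simulated from a single Gaussian compartment, this recovers the generating tensor exactly; it is precisely this single-tensor assumption that fails at fiber crossings, motivating the TDF mixture described above.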
Abstract:
Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. High-angular resolution diffusion imaging (HARDI) can resolve more complex diffusion geometries than standard DTI, including fibers crossing or mixing. The tensor distribution function (TDF) can be used to reconstruct multiple underlying fibers per voxel, representing the diffusion profile as a probabilistic mixture of tensors. Here we found that DTI-derived mean diffusivity (MD) correlates well with actual individual fiber MD, but DTI-derived FA correlates poorly with actual individual fiber anisotropy, and may be suboptimal when used to detect disease processes that affect myelination. Analysis of the TDFs revealed that almost 40% of voxels in the white matter had more than one dominant fiber present. To more accurately assess fiber integrity in these cases, we propose the differential diffusivity (DD), which measures the average anisotropy based on all dominant directions in each voxel.
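The FA and MD scalars discussed here are the standard DTI definitions computed from the eigenvalues of a single fitted tensor; a minimal sketch (names are illustrative, not from the paper) makes the abstract's point concrete: because both scalars are derived from one tensor per voxel, FA necessarily conflates per-fiber anisotropy with the number and mixing of fiber populations.

```python
import numpy as np

def fa_md(tensor):
    """Standard DTI scalars from a symmetric 3x3 diffusion tensor.

    MD is the mean eigenvalue; FA is the normalized standard deviation
    of the eigenvalues, scaled so FA = 0 for isotropic diffusion and
    FA -> 1 for a single stick-like (one nonzero eigenvalue) profile.
    """
    lam = np.linalg.eigvalsh(tensor)          # eigenvalues of D
    md = lam.mean()                           # mean diffusivity
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    fa = np.sqrt(1.5) * num / den
    return fa, md
```

For a voxel containing two crossing fibers, the single best-fit tensor is more isotropic than either fiber alone, so this FA underestimates each fiber's true anisotropy; that is the limitation the TDF-derived measures are designed to address.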
Abstract:
Objects presented in categorically related contexts are typically named more slowly than objects presented in unrelated contexts, a phenomenon termed semantic interference. However, not all semantic relationships induce interference. In the present study, we investigated the influence of object part-relations in the blocked cyclic naming paradigm. In Experiment 1 we established that an object's parts do induce a semantic interference effect when named in context compared to unrelated parts (e.g., leaf, root, nut, bark; for tree). In Experiment 2 we replicated the effect during perfusion functional magnetic resonance imaging (fMRI) to identify the cerebral regions involved. The interference effect was associated with significant perfusion signal increases in the hippocampal formation and decreases in the dorsolateral prefrontal cortex. We failed to observe significant perfusion signal changes in the left lateral temporal lobe, a region that shows reliable activity for interference effects induced by categorical relations in the same paradigm and is proposed to mediate lexical-semantic processing. We interpret these results as supporting recent explanations of semantic interference in blocked cyclic naming that implicate working memory mechanisms. However, given the failure to observe significant perfusion signal changes in the left temporal lobe, the results provide only partial support for accounts that assume semantic interference in this paradigm arises solely due to lexical-level processes.
Abstract:
Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel, along with their individual anisotropy measures, by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy, and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
Abstract:
This study tested the utility of a stress and coping model of employee adjustment to a merger. Two hundred and twenty employees completed both questionnaires (Time 1: 3 months after merger implementation; Time 2: 2 years later). Structural equation modeling analyses revealed that positive event characteristics predicted greater appraisals of self-efficacy and less stress at Time 1. Self-efficacy, in turn, predicted greater use of problem-focused coping at Time 2, whereas stress predicted a greater use of problem-focused and avoidance coping. Finally, problem-focused coping predicted higher levels of job satisfaction and identification with the merged organization (Time 2), whereas avoidance coping predicted lower identification.
Abstract:
IODP Expedition 340 successfully drilled a series of sites offshore Montserrat, Martinique and Dominica in the Lesser Antilles from March to April 2012. These are among the few drill sites gathered around volcanic islands, and represent the first scientific drilling of large and likely tsunamigenic volcanic island-arc landslide deposits. These cores provide evidence for, and tests of, previous hypotheses on the composition and origin of those deposits. Sites U1394, U1399, and U1400, which penetrated landslide deposits, recovered exclusively seafloor sediment, comprising mainly turbidites and hemipelagic deposits, and lacked debris avalanche deposits. This supports the concepts that (i) volcanic debris avalanches tend to stop at the slope break, and (ii) widespread and voluminous failures of preexisting low-gradient seafloor sediment can be triggered by the initial emplacement of material from the volcano. Offshore Martinique (Sites U1399 and U1400), the landslide deposits comprised blocks of parallel strata that were tilted or microfaulted, sometimes separated by intervals of homogenized sediment (intense shearing), while Site U1394 offshore Montserrat penetrated a flat-lying block of intact strata. The most likely mechanism for generating these large-scale seafloor sediment failures appears to be propagation of a decollement from proximal areas loaded and incised by a volcanic debris avalanche. These results have implications for the magnitude of tsunami generation. Under some conditions, volcanic island landslide deposits composed mainly of seafloor sediment will tend to form smaller-magnitude tsunamis than equivalent volumes of subaerial block-rich mass flows rapidly entering water.
Expedition 340 also successfully drilled sites to access the undisturbed record of eruption fallout layers intercalated with marine sediment, which provides an outstanding high-resolution data set for analyzing eruption and landslide cycles and for improving understanding of magmatic evolution as well as offshore sedimentation processes.