17 results for Management Decisions
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Lead (Pb) is a non-threshold toxin capable of inducing toxic effects at any blood level, but the availability of soil screening criteria for assessing potential health risks is limited. The oral bioaccessibility of Pb in 163 soil samples was attributed to sources through solubility estimation and domain identification. Samples were extracted following the Unified BARGE Method. Urban, mineralisation, peat and granite domains accounted for elevated Pb concentrations compared to rural samples. High Pb solubility explained moderate-high gastric (G) bioaccessible fractions throughout the study area. Higher maximum G concentrations were measured in the urban (97.6 mg kg⁻¹) and mineralisation (199.8 mg kg⁻¹) domains. Higher average G concentrations occurred in the mineralisation (36.4 mg kg⁻¹) and granite (36.0 mg kg⁻¹) domains. Findings suggest that diffuse anthropogenic and widespread geogenic contamination could be capable of presenting health risks, with implications for land management decisions in jurisdictions where guidance advises that these forms of pollution should not be regarded as contaminated land.
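The bioaccessible fraction quoted in studies of this kind is normally the UBM gastric-phase extractable concentration expressed as a percentage of total soil Pb. A minimal sketch of that arithmetic, using made-up concentrations rather than values from this study:

```python
# Sketch: gastric (G) bioaccessible fraction of soil Pb.
# The concentrations below are illustrative, not data from the study.

def bioaccessible_fraction(gastric_mg_kg: float, total_mg_kg: float) -> float:
    """Bioaccessible fraction (%) = UBM gastric-extractable Pb / total soil Pb * 100."""
    return 100.0 * gastric_mg_kg / total_mg_kg

# e.g. a soil with 60 mg/kg total Pb and 36 mg/kg extracted in the gastric phase
print(bioaccessible_fraction(36.0, 60.0))  # -> 60.0 (%)
```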
Abstract:
A large hydrochemical data-set for the East Yorkshire Chalk has been assessed. Controls on the distribution of water qualities within this aquifer reflect: water-rock interactions (affecting especially the carbonate system and associated geochemistry); effects of land-use change (especially where the aquifer is unconfined); saline intrusion and aquifer refreshening (including ion exchange effects); and aquifer overexploitation (in the semi-confined and confined zones of the aquifer). Both Sr and I prove useful indicators of groundwater ages, with I/Cl ratios characterising two sources of saline waters. The hydrochemical evidence clearly reveals the importance of both recent management decisions and palaeohydrogeology in determining the evolution and distribution of groundwater salinity within the artesian and confined zones of the aquifer. Waters currently encountered in the aquifer are identified as complex (and potentially dynamic) mixtures between modern recharge waters, modern seawater, and old seawaters which entered the aquifer many millennia ago.
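The mixing interpretation above rests on conservative-tracer mass balance. As a generic illustration (not the authors' calculation), the seawater fraction in a binary freshwater/seawater mixture can be estimated from chloride; the endmember concentrations below are assumed for the example:

```python
# Sketch: two-endmember mixing fraction from a conservative tracer (Cl).
# Endmember concentrations are assumed for illustration only.

CL_FRESH = 25.0        # mg/L, assumed modern recharge endmember
CL_SEAWATER = 19000.0  # mg/L, typical seawater chloride

def seawater_fraction(cl_sample: float,
                      cl_fresh: float = CL_FRESH,
                      cl_sea: float = CL_SEAWATER) -> float:
    """Fraction of seawater in a fresh/seawater binary mixture."""
    return (cl_sample - cl_fresh) / (cl_sea - cl_fresh)

print(f"{seawater_fraction(975.0):.3f}")  # -> 0.050, i.e. about 5% seawater
```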
Abstract:
Accelerated soil erosion is an aspect of dryland degradation that is affected by repeated intense drought events and land management activities such as commercial livestock grazing. A soil stability index (SSI) that detects the erosion status and susceptibility of a landscape at the pixel level, i.e. stable, erosional or depositional pixels, was derived from the spectral properties of an archived time series (1972 to 1997) of Landsat satellite data covering a commercial ranch in northeastern Utah. The SSI was retrospectively validated with contemporary field measures of soil organic matter and of erosion status surveyed by US federal land management agencies. Catastrophe theory provided the conceptual framework for retrospective assessment of the impact of commercial grazing and soil water availability on the SSI. The overall SSI trend was from an eroding landscape in the drier early 1970s towards stable conditions in the wetter mid-1980s and late 1990s. The landscape catastrophically shifted towards an extreme eroding state coincident with the Great North American Drought of 1988. Periods of landscape stability, and trajectories toward stability, were coincident with extremely wet El Niño events. Commercial grazing had less correlation with soil stability than drought conditions did. However, the landscape became more susceptible to erosion events under the combination of repeated droughts and grazing. Land managers now have nearly a year's warning of El Niño and La Niña events and can adjust their management decisions according to predicted landscape erosion conditions.
Abstract:
The cryptic, subterranean ways of golden moles (Chrysochloridae) hamper studies of their biology in the field. Ten species appear on the IUCN Red List, but the dearth of information available for most inhibits effective conservation planning. New techniques are consequently required to further our understanding and facilitate informed conservation management decisions. We studied the endangered Juliana's golden mole Neamblysomus julianae and aimed to evaluate the feasibility of using implantable temperature-sensing transmitters to remotely acquire physiological and behavioural data. We also aimed to assess potential body temperature (Tb) fluctuations in relation to ambient soil temperature (Ta) in order to evaluate the potential use of torpor. Hourly observations revealed that Tb was remarkably changeable, ranging from 27 to 33 °C. In several instances Tb declined during periods of low Ta. Such 'shallow torpor' may result in a daily energy saving of c. 20%. Behavioural thermoregulation was used during periods of high Ta by selecting cooler microclimates, while passive heating was used to raise Tb in the early morning when Ta was increasing. In contrast to anecdotal reports of nocturnal patterns of activity, our results suggest that activity is flexible, being primarily dependent on Ta. These results exemplify how behavioural patterns and microclimatic conditions can be examined in this and other subterranean mammal species, the results of which can be used in the urgently required conservation planning of endangered chrysochlorid species.
Abstract:
BACKGROUND: PET/CT scanning can determine suitability for curative therapy and inform decision making when considering radical therapy in patients with non-small cell lung cancer (NSCLC). Metastases to central mediastinal lymph nodes (N2) may alter such management decisions. We report a 2-year retrospective series assessing the accuracy of N2 lymph node staging with PET/CT compared with pathological analysis at surgery.
METHODS: Patients with NSCLC attending our centre (excluding those who had induction chemotherapy) who had staging PET/CT scans and pathological nodal sampling between June 2006 and June 2008 were analysed. For each lymph node assessed pathologically, the corresponding PET/CT status was determined. 64 patients with 200 N2 lymph nodes were analysed.
RESULTS: Sensitivity of PET/CT scans for identifying involved N2 lymph nodes was 39%, specificity 96% and overall accuracy 90%. For individual lymph node analysis, logistic regression demonstrated a significant linear association between PET/CT sensitivity and time from scanning to surgery (p=0.031), but not for specificity or accuracy. Scans performed <9 weeks before pathological sampling were significantly more sensitive (64% <9 weeks vs 0% ≥9 weeks, p=0.013) and more accurate (94% <9 weeks vs 81% ≥9 weeks, p=0.007). Differences in specificity were not seen (97% <9 weeks vs 91% ≥9 weeks, p=0.228), and no significant difference in specificity was found at any time point.
CONCLUSIONS: We recommend that if a PET/CT scan is older than 9 weeks, and management would be altered by the presence of N2 nodes, re-staging of the mediastinum should be undertaken.
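The node-level figures above follow the standard confusion-matrix definitions. The sketch below shows the arithmetic; the counts are one hypothetical breakdown chosen to be consistent with the reported 39% sensitivity, 96% specificity and 90% accuracy over 200 nodes, since the full confusion matrix is not given in the abstract:

```python
# Sketch: sensitivity, specificity and accuracy from node-level counts.
# The counts are illustrative (chosen only to match the headline percentages).

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),            # involved nodes correctly called positive
        "specificity": tn / (tn + fp),            # uninvolved nodes correctly called negative
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# 200 N2 nodes in total
print(diagnostic_metrics(tp=9, fp=7, tn=170, fn=14))
# -> sensitivity ~0.39, specificity ~0.96, accuracy ~0.90
```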
Abstract:
Seafloor massive sulfides (SMS) contain commercially viable quantities of high-grade ores, making them attractive prospect sites for marine mining. SMS deposits may also host hydrothermal vent ecosystems populated by vent-endemic species of high conservation value. Responsible environmental management of these resources is best achieved by the adoption of a precautionary approach. Part of this precautionary approach involves the Environmental Impact Assessment (EIA) of exploration and exploitative activities at SMS deposits. The VentBase 2012 workshop provided a forum for stakeholders and scientists to discuss issues surrounding SMS exploration and exploitation. This forum recognised the requirement for a primer relating the concepts that underpin EIA at SMS deposits. The purpose of this primer is to inform policy makers about EIA at SMS deposits in order to aid management decisions. The primer offers a basic introduction to SMS deposits and their associated ecology, and to the basic requirements for EIA at SMS deposits, including initial data and information scoping, environmental survey, and ecological risk assessment.
Abstract:
This study integrates the concepts of value creation and value claiming into a theoretical framework that emphasizes the dependence of resource value maximization on value-claiming motivations in outsourcing decisions. To test this theoretical framework, it develops refutable implications to explain the firm's outsourcing decision, and it uses data from 178 firms in the publishing and printing industry on outsourcing of application services. The results show that in outsourcing decisions, resource value and transaction costs are simultaneously considered and that outsourcing decisions are dependent on alignment between resource and transaction attributes. The findings support a resource contingency view that highlights value-claiming mechanisms as resource contingency in interorganizational strategic decisions.
Abstract:
Task-based dataflow programming models and runtimes emerge as promising candidates for programming multicore and manycore architectures. These programming models analyze task dependencies dynamically at runtime and schedule independent tasks concurrently onto the processing elements. In such models, cache locality, which is critical for performance, becomes more challenging in the presence of fine-grain tasks and in architectures with many simple cores.
This paper presents a combined hardware-software approach to improve cache locality and offer better performance in terms of execution time and energy in the memory system. We propose the explicit bulk prefetcher (EBP) and epoch-based cache management (ECM) to help runtimes prefetch task data and guide the replacement decisions in caches. The runtime software can use this hardware support to expose its internal knowledge about the tasks to the architecture and achieve more efficient task-based execution. Our combined scheme outperforms hardware-only prefetchers and state-of-the-art replacement policies, improves performance by an average of 17%, generates on average 26% fewer L2 misses, and consumes on average 28% less energy in the components of the memory system.
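The division of labour described above (the runtime exposes each task's data footprint; the hardware prefetches it in bulk and biases cache replacement towards data from completed epochs) can be illustrated with a toy software model. The names (prefetch_bulk, Cache) and the eviction heuristic below are assumptions made for illustration, not the EBP/ECM hardware interface from the paper:

```python
# Toy model of runtime-guided bulk prefetch and epoch-aware cache replacement.
# Interface names and the eviction heuristic are illustrative assumptions.

from collections import OrderedDict

class Cache:
    def __init__(self, capacity_lines: int):
        self.capacity = capacity_lines
        self.lines = OrderedDict()   # address -> epoch tag, ordered by recency

    def insert(self, addr: int, epoch: int) -> None:
        if addr in self.lines:
            self.lines.move_to_end(addr)
            self.lines[addr] = epoch
            return
        if len(self.lines) >= self.capacity:
            self.evict(current_epoch=epoch)
        self.lines[addr] = epoch

    def evict(self, current_epoch: int) -> None:
        # Epoch-aware policy: prefer victims tagged with an older (completed) epoch,
        # falling back to plain LRU if every line belongs to the current epoch.
        for addr, tag in self.lines.items():
            if tag < current_epoch:
                del self.lines[addr]
                return
        self.lines.popitem(last=False)  # LRU fallback

def prefetch_bulk(cache: Cache, task_footprint: list, epoch: int) -> None:
    """The runtime hands the task's data addresses to the 'prefetcher' before the task runs."""
    for addr in task_footprint:
        cache.insert(addr, epoch)

cache = Cache(capacity_lines=4)
prefetch_bulk(cache, [0x100, 0x140, 0x180], epoch=1)  # task A's inputs
prefetch_bulk(cache, [0x200, 0x240], epoch=2)         # task B's inputs evict epoch-1 lines first
print(list(cache.lines.items()))
```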
Abstract:
Larsen and Toubro (L&T) Limited is India's largest construction conglomerate. L&T's expertise is harnessed to execute high-value projects that demand adherence to stringent timelines in a scenario where disparate disciplines of engineering must be coordinated on a critical path. However, no company can achieve such a feat without systematic management of its human resources. An investigation of the human resource management practices underpinning L&T's success can help to identify ethical human resource practices, especially in the context of the Indian market. Accordingly, a well-designed employee satisfaction survey was conducted to assess the HRM practices followed in L&T. Unlike other companies, L&T aims to meet the long-term needs of its employees rather than short-term needs. There were, however, a few areas of concern, such as the yearly appraisal system and the equal treatment of employees. It is postulated that the unequal treatment of male and female employees primarily reflects a stereotype arising from the fact that construction is conventionally regarded as a male-dominated activity. A periodic survey providing a 360° feedback system could help to avoid such irregularities. This study is thus expected to provide healthy HRM practices to nurture the young talent of India. This may help organizations to evaluate their decisions by analyzing the complex relationship between HRM practices and the output of an organization.
Abstract:
This paper presents the design and implementation of a measurement-based QoS and resource management framework, CNQF (Converged Networks' QoS Management Framework). CNQF is designed to provide unified, scalable QoS control and resource management through the use of a policy-based network management paradigm. It achieves this via distributed functional entities that are deployed to co-ordinate the resources of the transport network through centralized policy-driven decisions supported by a measurement-based control architecture. We present the CNQF architecture, the implementation of the prototype and the validation of various inbuilt QoS control mechanisms using real traffic flows on a Linux-based experimental test bed.
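As a rough illustration of the policy-based paradigm that CNQF follows, a policy engine maps measured network conditions to resource-management actions. The rule structure and thresholds below are invented for the example and are not CNQF's actual policy schema:

```python
# Sketch of a measurement-driven policy decision in the spirit of policy-based
# QoS management. Rule fields, thresholds and actions are illustrative only.

from dataclasses import dataclass

@dataclass
class Measurement:
    flow_class: str
    loss_pct: float
    delay_ms: float

# condition -> action rules; a real framework would distribute these to
# policy enforcement points in the transport network
POLICIES = [
    (lambda m: m.flow_class == "voice" and m.delay_ms > 150, "increase priority queue weight"),
    (lambda m: m.loss_pct > 1.0, "throttle best-effort traffic"),
]

def decide(measurement: Measurement) -> list:
    """Return the actions whose conditions match the current measurement."""
    return [action for cond, action in POLICIES if cond(measurement)]

print(decide(Measurement("voice", loss_pct=1.4, delay_ms=180)))
# -> ['increase priority queue weight', 'throttle best-effort traffic']
```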
Abstract:
Policy-based management is considered an effective approach to address the challenges of resource management in large complex networks. Within the IU-ATC QoS Frameworks project, a policy-based network management framework, CNQF (Converged Networks QoS Framework), is being developed with the aim of providing context-aware, end-to-end QoS control and resource management in converged next generation networks. CNQF is designed to provide homogeneous, transparent QoS control over heterogeneous access technologies by means of distributed functional entities that co-ordinate the resources of the transport network through policy-driven decisions. In this paper, we present a measurement-based evaluation of policy-driven QoS management based on the CNQF architecture, with real traffic flows on an experimental testbed. A Java-based implementation of the CNQF Resource Management Subsystem is deployed on the testbed, and the results of the experiments validate the framework's operation for policy-based QoS management of real traffic flows.
Abstract:
Waste management and sustainability are two core underlying philosophies that the construction sector must acknowledge and implement; however, this can prove difficult and time consuming. To this end, the aim of this paper is to examine waste management strategies and the possible benefits, advantages and disadvantages of their introduction and use, and to examine any inter-relationship with sustainability, particularly at the design stage. The purpose of this paper is to gather, examine and review published works and to investigate factors which influence economic decisions at the design phase of a construction project. In addressing this aim, a three-tiered sequential research approach is adopted: in-depth literature review, interviews/focus groups and qualitative analysis. The resulting data are analyzed and discussed, and potential conclusions identified, paying particular attention to implications for practice within architectural firms. This research is of importance, particularly to the architectural sector, as it can add to the industry's understanding of the design process, while also considering the application and integration of waste management into the design procedure. Results indicate that the researched strategies had many advantages but also inherent disadvantages. It was found that the potential advantages outweighed the disadvantages, but that uptake within industry was still slow and that better promotion of these strategies and of their benefits to sustainability, the environment, society and the industry was required.
Abstract:
Burial grounds are commonly surveyed and searched by both police/humanitarian search teams and archaeologists. One aspect of an efficient search is to establish areas free of recent interments to allow the concentration of assets in suspect terrain. While 100% surety in locating remains can never be achieved, the deployment of a red, amber, green (RAG) system for assessment has proven invaluable to our surveys. The RAG system is based on a desktop study (including burial ground records), visual inspection (mounding, collapses) and use of geophysics (in this case, ground penetrating radar or GPR) for a multi-proxy assessment that provides search authorities with an assessment of the state of inhumations and a level of legal backup for decisions they make on whether or not to excavate ('exit strategy'). The system is flexible and will be built upon as research continues.
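A crude sketch of how the three lines of evidence might be combined into a RAG rating is given below; the scoring rules are illustrative assumptions, not the authors' scheme:

```python
# Sketch: combining desktop study, visual inspection and GPR evidence into a
# red/amber/green (RAG) rating. Scoring rules are illustrative, not the authors' scheme.

def rag_rating(record_of_recent_burial: bool,
               surface_disturbance: bool,
               gpr_anomaly: bool) -> str:
    indicators = sum([record_of_recent_burial, surface_disturbance, gpr_anomaly])
    if indicators >= 2:
        return "red"    # likely recent interment: flag for search authorities
    if indicators == 1:
        return "amber"  # ambiguous: revisit or apply further survey
    return "green"      # no evidence of recent interment from any proxy

print(rag_rating(record_of_recent_burial=False,
                 surface_disturbance=True,
                 gpr_anomaly=False))   # -> 'amber'
```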