960 results for Multiple Sources
Abstract:
Soil contamination on the Elm Street site is located mainly underneath and near the building foundation. Groundwater contamination appears to extend beyond the property boundaries to the west, towards the Fox River, which is approximately 1100 feet west of the site. The groundwater contamination lies in a mixed industrial, commercial and residential area. It is not clear at this point whether there may be multiple sources of contamination in the area. Currently, the public water supply is available only to some properties along Route 120, where a water main is in place. Most of the homes and businesses in the area use private wells as their water source.
Abstract:
Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not aided by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed in order to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation.
Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
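To make the multi-connection acquisition idea concrete, the sketch below fetches an evidence image in fixed-size byte ranges, each range over its own TCP connection and in parallel. This is an illustrative sketch only, not the LEIA implementation: the host, port, request format and image size are hypothetical placeholders.

    # Illustrative sketch only: parallel, chunked evidence acquisition over
    # several TCP connections. Host/port, chunk protocol and image size are
    # hypothetical and are NOT taken from the LEIA architecture itself.
    import socket
    from concurrent.futures import ThreadPoolExecutor

    HOST, PORT = "192.0.2.10", 9000      # hypothetical acquisition endpoint
    CHUNK = 4 * 1024 * 1024              # request size per connection (4 MiB)
    IMAGE_SIZE = 64 * 1024 * 1024        # assumed size reported by the device

    def fetch_range(offset):
        """Open a dedicated TCP connection and request one byte range."""
        with socket.create_connection((HOST, PORT), timeout=30) as sock:
            sock.sendall(f"GET {offset} {CHUNK}\n".encode())  # toy request format
            buf = bytearray()
            while len(buf) < CHUNK:
                part = sock.recv(65536)
                if not part:             # server closed the connection early
                    break
                buf.extend(part)
            return bytes(buf)

    def acquire(path, workers=8):
        """Fetch all ranges in parallel and reassemble them into one image file."""
        offsets = range(0, IMAGE_SIZE, CHUNK)
        with ThreadPoolExecutor(max_workers=workers) as pool, open(path, "wb") as out:
            for offset, data in zip(offsets, pool.map(fetch_range, offsets)):
                out.seek(offset)
                out.write(data)

    if __name__ == "__main__":
        acquire("evidence.img")

Compared with streaming the whole image over a single connection, per-range connections let failed chunks be retried independently and spread the transfer across parallel streams, which is the scalability and reliability argument the experiments above point to.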
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The integration of geo-information from multiple sources and of diverse nature in developing mineral favourability indexes (MFIs) is a well-known problem in mineral exploration and mineral resource assessment. Fuzzy set theory provides a convenient framework to combine and analyse qualitative and quantitative data independently of their source or characteristics. A novel, data-driven formulation for calculating MFIs based on fuzzy analysis is developed in this paper. Different geo-variables are considered fuzzy sets and their appropriate membership functions are defined and modelled. A new weighted average-type aggregation operator is then introduced to generate a new fuzzy set representing mineral favourability. The membership grades of the new fuzzy set are considered as the MFI. The weights for the aggregation operation combine the individual membership functions of the geo-variables, and are derived using information from training areas and L1 regression. The technique is demonstrated in a case study of skarn tin deposits and is used to integrate geological, geochemical and magnetic data. The study area covers a total of 22.5 km² and is divided into 349 cells, which include nine control cells. Nine geo-variables are considered in this study. Depending on the nature of the various geo-variables, four different types of membership functions are used to model the fuzzy membership of the geo-variables involved. © 2002 Elsevier Science Ltd. All rights reserved.
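As a minimal sketch of the general approach described (one fuzzy membership function per geo-variable, combined by a weighted-average aggregation into a favourability grade per cell), the following illustrates the calculation. The membership functions, weights and cell values are invented for illustration; in the paper the weights are derived from the training cells and L1 regression.

    # Minimal sketch of a weighted-average fuzzy aggregation for a mineral
    # favourability index (MFI). Membership functions, weights and the sample
    # cell values are illustrative only, not the paper's calibrated values.
    import math

    def sigmoidal(x, midpoint, spread):
        """S-shaped membership: low values map near 0, high values near 1."""
        return 1.0 / (1.0 + math.exp(-(x - midpoint) / spread))

    def linear(x, lo, hi):
        """Piecewise-linear membership clamped to [0, 1]."""
        return min(1.0, max(0.0, (x - lo) / (hi - lo)))

    # One membership function per geo-variable (geochemical anomaly, magnetic
    # intensity, distance to a favourable contact, ...), plus a weight each.
    memberships = {
        "sn_geochem":   lambda v: sigmoidal(v, midpoint=50.0, spread=10.0),
        "mag_anomaly":  lambda v: linear(v, lo=0.0, hi=500.0),
        "dist_contact": lambda v: 1.0 - linear(v, lo=0.0, hi=2000.0),
    }
    weights = {"sn_geochem": 0.5, "mag_anomaly": 0.3, "dist_contact": 0.2}

    def mfi(cell):
        """Weighted average of membership grades = favourability of one cell."""
        total_w = sum(weights.values())
        return sum(weights[k] * memberships[k](cell[k]) for k in weights) / total_w

    print(mfi({"sn_geochem": 65.0, "mag_anomaly": 320.0, "dist_contact": 400.0}))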
Abstract:
Objective: To quantify the burden of disease and injury for the Aboriginal and non-Aboriginal populations in the Northern Territory. Design and setting: Analysis of Northern Territory data for 1 January 1994 to 30 December 1998 from multiple sources. Main outcome measures: Disability-adjusted life-years (DALYs), by age, sex, cause and Aboriginality. Results: Cardiovascular disease was the leading contributor (14.9%) to the total burden of disease and injury in the NT, followed by mental disorders (14.5%) and malignant neoplasms (11.2%). There was also a substantial contribution from unintentional injury (10.4%) and intentional injury (4.9%). Overall, the NT Aboriginal population had a rate of burden of disease 2.5 times higher than the non-Aboriginal population; in the 35-54-year age group their DALY rate was 4.1 times higher. The leading causes of disease burden were cardiovascular disease for both Aboriginal men (19.1%) and women (15.7%) and mental disorders for both non-Aboriginal men (16.7%) and women (22.3%). Conclusions: A comprehensive assessment of fatal and non-fatal conditions is important in describing differentials in health status of the NT population. Our study provides comparative data to identify health priorities and facilitate a more equitable distribution of health funding.
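For reference, the DALY figures reported above follow the standard burden-of-disease formulation, shown here in its simple undiscounted form (the NT study may additionally apply age weighting and discounting):

    DALY = YLL + YLD,
    YLL  = N \times L        (deaths \times standard life expectancy at age of death),
    YLD  = I \times DW \times L   (incident cases \times disability weight \times average duration).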
Abstract:
Multilevel theories integrate individual-level processes with those occurring at the level of the firm and above to generate richer and more complete explanations of IB phenomena than the traditional specification of IB relationships as single-level and parsimonious allows. Case study methods permit the timely collection of multiple sources of data, in context, from multiple individuals and multiple organizational units. Further, because the definitions for each level emerge from case data rather than being imposed a priori, case analysis promotes an understanding of deeper structures and cross-level processes. This paper considers the example of sport as an internationalized service to illustrate how the case method might be used to illuminate the multilevel phenomena of knowledge.
Abstract:
Analysing investments in ISs in order to maximise benefits has become a prime concern, especially for private corporations. No formula of equilibrium exists that could link the injected amounts and accrued returns; the relationship is simply not straightforward. This thesis is based upon empirical work which involved sketching organisational ethnographies (four organographies and a sectography) into the role and value of information systems in Jordanian financial organisations. Besides deciphering the map of impacts, it explains the attributions of the variations in the impacts of ISs, which were found to be related to the internal organisational processes: culturally and politically specific considerations, economically or technically rooted factors, and environmental factors. The research serves as an empirical attempt to test the applicability of adopting the interpretive paradigm to researching organisations in a developing country. The fieldwork comprised an exploratory stage, a detailed investigation of four case studies and a survey stage encompassing 16 organisations. Primary and secondary data were collected from multiple sources using a range of instruments. The evidence highlights the fact that little long-term strategic planning was pursued; the emphasis was focused more on short-term planning. There was no noticeable adoption of any strategic-fit principle linking IS strategy to corporate strategy. In addition, the benefits obtained were mostly intangible. Although ISs were central to the work of the organisations surveyed as the core technology, they were considered as tools or work enablers rather than weapons for competitive rivalry. The cultural specificity of IS impacts was evident, and the cultural and political considerations were key factors in explaining the attributions of the variations in the impacts of ISs in JFOs. The thesis confirms that measuring the benefits of ISs is problematic. However, in order to gain more insight, the phenomenon of "the use of ISs" has to be studied within its context.
Abstract:
Technological innovation has been widely studied; however, surprisingly little is known about the experience of managing the process. Most reports tend to be generalistic and/or prescriptive, whereas it is argued that multiple sources of variation in the process limit the value of these. A description of the innovation process is given, together with a presentation of what is known from existing studies. Gaps identified in this area suggest that a variety of organisational influences are important, and an attempt is made to identify some of these at individual, group and organisational level. A simple system model of the innovation management process is developed. Further investigation of the influence of these factors was made possible through an extended on-site case study. Methodology for this, based upon participant observation coupled with a wide and flexible range of techniques, is described. Evidence is presented about many aspects of the innovation process from a number of different levels and perspectives: the aim is to demonstrate the extent to which variation due to contingent influences takes place. It is argued that the problems identified all relate to the issue of integration. This theme is also developed from an analytical viewpoint, and it is suggested that the organisational response to increases in complexity in the external environment will be to match them with internal complexity. Differentiation of this kind will require extensive and flexible integration, especially in those inherently uncertain areas associated with innovation. Whilst traditionally a function of management, it is argued that integration needs have increased to the point where a new specialism is required. The concept of the integration specialist is developed from this analysis, and attempts at simple integrative change during the research are described. Finally, a strategy for integration - or rather for building in integrative capability - in the organisation studied is described.
Abstract:
The initial image-processing stages of visual cortex are well suited to a local (patchwise) analysis of the viewed scene. But the world's structures extend over space as textures and surfaces, suggesting the need for spatial integration. Most models of contrast vision fall shy of this process because (i) the weak area summation at detection threshold is attributed to probability summation (PS) and (ii) there is little or no advantage of area well above threshold. Both of these views are challenged here. First, it is shown that results at threshold are consistent with linear summation of contrast following retinal inhomogeneity, spatial filtering, nonlinear contrast transduction and multiple sources of additive Gaussian noise. We suggest that the suprathreshold loss of the area advantage in previous studies is due to a concomitant increase in suppression from the pedestal. To overcome this confound, a novel stimulus class is designed where: (i) the observer operates on a constant retinal area, (ii) the target area is controlled within this summation field, and (iii) the pedestal is fixed in size. Using this arrangement, substantial summation is found along the entire masking function, including the region of facilitation. Our analysis shows that PS and uncertainty cannot account for the results, and that suprathreshold summation of contrast extends over at least seven target cycles of grating. © 2007 The Royal Society.
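To make the distinction the abstract draws explicit, the two summation rules can be written in their standard textbook forms (the exponents and noise model below are the generic ones, not necessarily those fitted in this study). Probability summation over n independent detectors, each detecting with probability p_i, predicts

    P_{detect} = 1 - \prod_{i=1}^{n} (1 - p_i),

whereas pooling of the filter responses r_i can be expressed as a Minkowski sum,

    R = \left( \sum_{i=1}^{n} |r_i|^{\beta} \right)^{1/\beta},

where \beta = 1 corresponds to linear summation of contrast (with additive Gaussian noise limiting detection of R) and \beta \approx 3-4 closely mimics probability summation.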
Abstract:
The sources of ideas embodied within successful technological innovation have been a subject of interest in many studies since the 1950s. This research suggests that sources external to the innovating organisation account for between one-third and two-thirds of the inputs important to the innovation process. In addition, studies have long highlighted the important role played by the personal boundary-spanning relationships of engineers and scientists as a channel for the transference of such inputs. However, research concerning the role and nature of personal boundary-spanning links in the innovation process has either been primarily structurally orientated, seeking to map out the informal networks of scientists and engineers, or, more typically, anecdotal. The objective of this research was to reveal and build upon our knowledge of the role, nature and importance of informal exchange activity in the innovation process. In order to achieve this, an empirical study was undertaken to determine the informal sources, channels and mechanisms employed in the development of thirty-five award-winning innovations. Through the adoption of the network perspective, the multiple sources and pluralistic patterns of collaboration and communication in the innovation process were systematically explored. This approach provided a framework that allowed for the detailed study of both the individual dyadic links and the morphology of the innovation action-sets in which these dyads were embedded. The research found, for example, that the mobilisation of boundary-spanning links and networks was an important or critical factor in nineteen (54%) of the development projects. Of these, informal boundary-spanning exchange activity was considered to be important or critical in eight (23%).
Abstract:
Liberalisation has become an increasingly important policy trend, both in the private and public sectors of advanced industrial economies. This article eschews deterministic accounts of liberalisation by considering why government attempts to institute competition may be successful in some cases and not others. It considers the relative strength of explanations focusing on the institutional context, and on the volume and power of sectoral actors supporting liberalisation. These approaches are applied to two attempts to liberalise, one successful and one unsuccessful, within one sector in one nation – higher education in Britain. Each explanation is seen to have some explanatory power, but none is sufficient to explain why competition was generalised in the one case and not the other. The article counsels the need for scholars of liberalisation to be open to multiple explanations which may require the marshalling of multiple sources and types of evidence.
Abstract:
Linked Data semantic sources, in particular DBpedia, can be used to answer many user queries. PowerAqua is an open multi-ontology Question Answering (QA) system for the Semantic Web (SW). However, the emergence of Linked Data, characterized by its openness, heterogeneity and scale, introduces a new dimension to the Semantic Web scenario, in which exploiting the relevant information to extract answers for Natural Language (NL) user queries is a major challenge. In this paper we discuss the issues and lessons learned from our experience of integrating PowerAqua as a front-end for DBpedia and a subset of Linked Data sources. As such, we go one step beyond the state of the art on end-user interfaces for Linked Data by introducing mapping and fusion techniques needed to translate a user query by means of multiple sources. Our first informal experiments probe whether, in fact, it is feasible to obtain answers to user queries by composing information across semantic sources and Linked Data, even in its current form, where the strength of Linked Data is more a by-product of its size than its quality. We believe our experiences can be extrapolated to a variety of end-user applications that wish to scale, open up, exploit and re-use what possibly is the greatest wealth of data about everything in the history of Artificial Intelligence. © 2010 Springer-Verlag.
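As an illustration of the kind of structured access that makes this possible (and not PowerAqua's own mapping and fusion pipeline), a question such as "Who wrote Ulysses?", once mapped to DBpedia terms, can be answered directly against the public DBpedia SPARQL endpoint, for example with the SPARQLWrapper library:

    # Illustration only: a single, already-mapped question answered from DBpedia.
    # PowerAqua's contribution is producing and merging such mappings across many
    # heterogeneous sources automatically; this snippet shows just the final query.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")   # public DBpedia endpoint
    sparql.setQuery("""
        PREFIX dbo: <http://dbpedia.org/ontology/>
        SELECT ?author WHERE {
            <http://dbpedia.org/resource/Ulysses_(novel)> dbo:author ?author .
        }
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["author"]["value"])   # expected: the DBpedia resource for James Joyce

The exact resource and property names depend on the live DBpedia dataset; the point is that once the NL-to-ontology mapping is done, answering reduces to composing such queries across sources.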
Abstract:
Uncertainty can be defined as the difference between information that is represented in an executing system and the information that is both measurable and available about the system at a certain point in its lifetime. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce well-suited terminology about models, runtime models and uncertainty, and present a state-of-the-art summary of model-based techniques for addressing uncertainty both at development time and at runtime. Using a case study about robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and also identifies closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
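As a minimal sketch of how a runtime model can sit at the centre of a MAPE-K loop, the following shows one Monitor-Analyse-Plan-Execute pass over a shared knowledge object; the robot-domain attributes, threshold and adaptation actions are hypothetical, not those of the chapter's case study.

    # Minimal sketch of a MAPE-K loop built around a runtime model acting as the
    # shared knowledge base. Domain attributes and thresholds are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class RuntimeModel:                      # the "K" in MAPE-K
        battery_level: float = 1.0           # state monitored at runtime
        goal_reached: bool = False
        plan: list = field(default_factory=list)

    def monitor(model, sensors):
        """Refresh the runtime model with values observed at runtime."""
        model.battery_level = sensors["battery"]

    def analyse(model):
        """Decide whether the monitored state violates a requirement."""
        return model.battery_level < 0.2 and not model.goal_reached

    def plan(model):
        """Derive an adaptation plan from the (possibly uncertain) model."""
        model.plan = ["abort_task", "navigate_to_charger"]

    def execute(model, actuators):
        """Carry out the plan through the managed system's effectors."""
        for action in model.plan:
            actuators(action)

    if __name__ == "__main__":
        k = RuntimeModel()
        monitor(k, {"battery": 0.15})        # one pass through the loop
        if analyse(k):
            plan(k)
            execute(k, actuators=print)      # prints the planned adaptation actions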
Abstract:
Background - Problems of quality and safety persist in health systems worldwide. We conducted a large research programme to examine culture and behaviour in the English National Health Service (NHS). Methods - Mixed-methods study involving collection and triangulation of data from multiple sources, including interviews, surveys, ethnographic case studies, board minutes and publicly available datasets. We narratively synthesised data across the studies to produce a holistic picture and in this paper present a high-level summary. Results - We found an almost universal desire to provide the best quality of care. We identified many 'bright spots' of excellent caring and practice and high-quality innovation across the NHS, but also considerable inconsistency. Consistent achievement of high-quality care was challenged by unclear goals, overlapping priorities that distracted attention, and compliance-oriented bureaucratised management. The institutional and regulatory environment was populated by multiple external bodies serving different but overlapping functions. Some organisations found it difficult to obtain valid insights into the quality of the care they provided. Poor organisational and information systems sometimes left staff struggling to deliver care effectively and disempowered them from initiating improvement. Good staff support and management were also highly variable, though they were fundamental to culture and were directly related to patient experience, safety and quality of care. Conclusions - Our results highlight the importance of clear, challenging goals for high-quality care. Organisations need to put the patient at the centre of all they do, get smart intelligence, focus on improving organisational systems, and nurture caring cultures by ensuring that staff feel valued, respected, engaged and supported.