881 results for DDM Data Distribution Management testbed benchmark design implementation instance generator
Abstract:
Emergency management is one of the key aspects of day-to-day highway operations. An efficient overall response to an incident is paramount in reducing its consequences. However, highway operators' approach to incident management is still usually far from systematic and standardized. This paper addresses the issue, offering several hints as to why this happens and a proposal for how the situation could be overcome. A performance-based approach to general system specification is introduced and then applied to a particular road emergency management task. A real testbed has been implemented to show the validity of the proposed approach: ad-hoc sensors (one camera and one laser scanner) were efficiently deployed to acquire data, and advanced fusion techniques were applied at the processing stage to meet the specific user requirements in terms of functionality, flexibility and accuracy.
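The abstract does not detail the fusion techniques used; as a minimal, hypothetical sketch of one standard option, the Python snippet below fuses a camera-based and a laser-scanner-based estimate of an object's position by inverse-variance weighting (all sensor values and variances are invented for illustration, not taken from the paper).

```python
import numpy as np

def fuse_estimates(estimates):
    """Fuse independent position estimates by inverse-variance weighting.

    estimates: list of (position, variance) pairs, one per sensor.
    Returns the fused position and its (smaller) variance.
    """
    positions = np.array([p for p, _ in estimates], dtype=float)
    weights = np.array([1.0 / v for _, v in estimates])
    fused = np.sum(weights * positions) / np.sum(weights)
    return fused, 1.0 / np.sum(weights)

# Hypothetical 1-D positions (metres) reported for the same object:
camera = (12.4, 0.8)  # camera estimate, higher variance
laser = (12.1, 0.2)   # laser-scanner estimate, more precise
position, variance = fuse_estimates([camera, laser])
print(f"fused position: {position:.2f} m (variance {variance:.2f})")
```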
Abstract:
Over the last few years, the data center market has grown exponentially, and this tendency continues today. As a direct consequence, the industry is pushing the development and implementation of new technologies to improve the energy efficiency of data centers. An adaptive dashboard would allow the user to monitor the most important parameters of a data center in real time. For that reason, monitoring companies work with IoT big-data filtering tools and cloud computing systems to handle the volumes of data obtained from the sensors placed in a data center. Analyzing market trends in this field, we can affirm that the study of predictive algorithms has become an essential area for competitive IT companies. Complex algorithms are used to forecast risk situations based on historical data and warn the user in case of danger. Considering that several different users will interact with this dashboard, from IT experts and maintenance staff to accounting managers, it is vital to personalize it automatically. Following that line of thought, the dashboard should only show metrics relevant to the user, in different formats such as overlaid maps or representative graphs. These maps will show all the information needed in a visual, easy-to-evaluate way. To sum up, this dashboard will allow the user to visualize and control a wide range of variables. Monitoring essential factors such as average temperature, gradients or hotspots, as well as energy and power consumption and savings by rack or building, would allow clients to understand how their equipment is behaving, helping them to optimize the energy consumption and efficiency of the racks. It would also help them to prevent possible damage to the equipment with predictive high-tech algorithms.
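The abstract does not name the predictive algorithms; as a deliberately minimal sketch of the idea of warning the user before a risk situation, the snippet below fits a linear trend to recent (hypothetical) rack-temperature readings and raises a warning if the extrapolated value crosses a hypothetical alert threshold.

```python
import numpy as np

def projected_temperature(readings, horizon=3):
    """Fit a straight line to recent readings and extrapolate `horizon` steps ahead."""
    t = np.arange(len(readings))
    slope, intercept = np.polyfit(t, readings, 1)
    return intercept + slope * (len(readings) - 1 + horizon)

ALERT_THRESHOLD_C = 32.0  # hypothetical rack inlet-temperature limit

rack_readings = [26.1, 26.8, 27.9, 29.0, 30.2]  # hypothetical samples, one per minute
forecast = projected_temperature(rack_readings)
if forecast > ALERT_THRESHOLD_C:
    print(f"warning: projected {forecast:.1f} C exceeds {ALERT_THRESHOLD_C} C limit")
```

A production dashboard would of course use richer models over historical data, but the warn-on-forecast structure is the same.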
Abstract:
Background: Despite the existence of ample literature dealing, on the one hand, with the integration of innovations within health systems and team learning, and, on the other hand, with different aspects of the detection and management of intimate partner violence (IPV) within healthcare facilities, research that explores how health innovations that go beyond biomedical issues—such as IPV management—are integrated into health systems, and that focuses on healthcare teams' learning processes, is, to the best of our knowledge, very scarce if not absent. This realist evaluation protocol aims to ascertain: why, how, and under what circumstances primary healthcare teams engage (if at all) in a learning process to integrate IPV management into their practices; and why, how, and under what circumstances team learning processes lead to the development of organizational culture and values regarding IPV management, and to the delivery of IPV management services. Methods: This study will be conducted in Spain using a multiple-case study design. Data will be collected from the selected cases (primary healthcare teams) through different methods: individual and group interviews, routinely collected statistical data, documentary review, and observation. Cases will be purposively selected to enable testing of the initial middle-range theory (MRT). After in-depth exploration of a limited number of cases, additional cases will be chosen for their ability to contribute to refining the emerging MRT that explains how primary healthcare teams learn to integrate IPV management. Discussion: Evaluations of health sector responses to IPV are scarce, and even fewer focus on why, how, and when healthcare services integrate IPV management. There is a consensus that healthcare professionals and healthcare teams play a key role in this integration, and that training is important in order to realize changes. However, little is known about team learning of IPV management, both in terms of how to trigger such learning and how team learning is connected with changes in organizational culture and values, and in service delivery. This realist evaluation protocol aims to contribute to this knowledge by conducting the project in a country, Spain, where great endeavours have been made towards the integration of IPV management within the health system.
Abstract:
Thesis--University of Maryland.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another. Arbitrary, and often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including sampling sites, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and assessment of the scale of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly studied areas.
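The paper's own integration procedure is not reproduced here; as a minimal sketch of the general idea of compositing per-community models into a single map, the snippet below assigns each grid cell the community whose model predicts the highest probability (the three community names and probability surfaces are hypothetical stand-ins for the paper's 28 models).

```python
import numpy as np

# Hypothetical predicted-probability surfaces from three per-community
# models, each defined over the same 2x2 grid of cells.
communities = ["rainforest", "sclerophyll", "swamp"]
prob_surfaces = np.array([
    [[0.7, 0.2], [0.1, 0.5]],  # "rainforest" model
    [[0.2, 0.6], [0.3, 0.1]],  # "sclerophyll" model
    [[0.1, 0.2], [0.6, 0.4]],  # "swamp" model
])

# Composite map: for each cell, keep the community with the highest
# predicted probability.
winner = np.argmax(prob_surfaces, axis=0)
composite = np.array(communities)[winner]
print(composite)
```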
Abstract:
The Internet of Things (IoT) consists of a worldwide "network of networks," composed of billions of interconnected heterogeneous devices denoted as things or "Smart Objects" (SOs). Significant research efforts have been dedicated to porting the experience gained in the design of the Internet to the IoT, with the goal of maximizing interoperability, using the Internet Protocol (IP) and designing specific protocols like the Constrained Application Protocol (CoAP), which have been widely accepted as drivers for the effective evolution of the IoT. This first wave of standardization can be considered successfully concluded, and we can assume that communication with and between SOs is no longer an issue. At this point, to favor the widespread adoption of the IoT, it is crucial to provide mechanisms that facilitate IoT data management and the development of services enabling a real interaction with things. Several reference IoT scenarios have real-time or predictable-latency requirements and deal with billions of devices collecting and sending an enormous quantity of data. These features create a new need for architectures specifically designed to handle this scenario, here denoted as "Big Stream". In this thesis a new Big Stream Listener-based Graph architecture is proposed. Another important step is to build more applications around the Web model, bringing about the Web of Things (WoT). As several IoT testbeds have focused on evaluating lower-layer communication aspects, this thesis proposes a new WoT testbed aiming to allow developers to work at a high level of abstraction, without worrying about low-level details. Finally, an innovative SO-driven User Interface (UI) generation paradigm for mobile applications in heterogeneous IoT networks is proposed, to simplify interactions between users and things.
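The thesis does not prescribe a client library, but since CoAP is central to the discussion, here is a minimal sketch of reading a Smart Object's resource over CoAP using the third-party aiocoap Python library (the resource URI is hypothetical):

```python
import asyncio
from aiocoap import Context, Message, GET

async def main():
    # Create a CoAP client context and issue a GET request to the resource.
    protocol = await Context.create_client_context()
    request = Message(code=GET, uri="coap://example.org/sensors/temperature")
    response = await protocol.request(request).response
    print(f"{response.code}: {response.payload.decode()}")

asyncio.run(main())
```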
Abstract:
INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. Requirements were (i) using open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an integrated, open-source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
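A client talks to a Web Processing Service by POSTing an XML Execute request; the hypothetical sketch below shows the shape of such a call in Python (the endpoint URL and process identifier are invented, and a real INTAMAP request would embed the observations as an OGC O&M document in the DataInputs element):

```python
import requests

WPS_ENDPOINT = "http://example.org/wps"  # hypothetical service URL

execute_request = """<?xml version="1.0" encoding="UTF-8"?>
<wps:Execute service="WPS" version="1.0.0"
    xmlns:wps="http://www.opengis.net/wps/1.0.0"
    xmlns:ows="http://www.opengis.net/ows/1.1">
  <ows:Identifier>interpolation</ows:Identifier>
  <!-- DataInputs carrying the O&M-encoded observations would go here -->
</wps:Execute>"""

# POST the Execute request and inspect the service's response status.
response = requests.post(
    WPS_ENDPOINT,
    data=execute_request.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
)
print(response.status_code)
```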
Abstract:
Purpose: The purpose of this paper is to examine the effect of the quality of senior management leadership on social support and job design, whose main effects on strains, and moderating effects on stressor-to-strain relationships, were assessed. Design/methodology/approach: A survey involving the distribution of questionnaires was carried out on a random sample of healthcare employees in acute hospital practice in the UK. The sample comprised 65,142 respondents. The work stressors tested were quantitative overload and hostile environment, whereas strains were measured through job satisfaction and turnover intentions. Structural equation modelling and moderated regression analyses were used in the analysis. Findings: Quality of senior management leadership explained 75 per cent and 94 per cent of the variance of social support and job design respectively, whereas work stressors explained 51 per cent of the variance of strains. Social support and job design predicted job satisfaction and turnover intentions, and significantly moderated the relationships between quantitative workload/hostility and job satisfaction/turnover intentions. Research limitations/implications: The findings are useful to management and to health employees working in acute/specialist hospitals. Further research could be done in other countries to take into account cultural differences and variations in health systems. The limitations included self-reported data and percept-percept bias due to same-source data collection. Practical implications: The quality of senior management leaders in hospitals has an impact on the social environment, the support given to health employees, their job design, as well as the work stressors and strains perceived. Originality/value: The study argues in favour of effective senior management leadership of hospitals, as well as ensuring adequate support structures and job design. The findings may be useful to health policy makers and human resources managers.
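Moderated regression of the kind used in the paper tests whether an interaction term (stressor times moderator) is significant; the sketch below reproduces that structure on simulated data (the variable names echo the study, but the numbers are entirely synthetic):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data shaped like the study's variables: a work stressor
# (quantitative workload), a moderator (social support) and a strain
# (job satisfaction), all standardised.
rng = np.random.default_rng(0)
n = 500
workload = rng.normal(size=n)
support = rng.normal(size=n)
satisfaction = (-0.4 * workload + 0.3 * support
                + 0.2 * workload * support
                + rng.normal(scale=0.5, size=n))
df = pd.DataFrame({"workload": workload, "support": support,
                   "satisfaction": satisfaction})

# "workload * support" expands to both main effects plus the interaction;
# a significant interaction coefficient indicates moderation.
model = smf.ols("satisfaction ~ workload * support", data=df).fit()
print(model.summary().tables[1])
```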
Abstract:
Purpose: The purpose of the research described in this paper is to disentangle the rhetoric from the reality in relation to supply chain management (SCM) adoption in practice. There is significant evidence of a divergence between theory and practice in the field of SCM. Design/methodology/approach: Based on a review of extant theory, the authors posit a new definitional construct for SCM – the Four Fundamentals – and investigate four research questions (RQs) that emerged from the theoretical review. The empirical work comprised three main phases: focussed interviews, focus groups and a questionnaire survey. Each phase used the authors' definitional construct as its basis. While the context of the paper's empirical work is Ireland, the insights and results are generalisable to other geographical contexts. Findings: The data collected during the various stages of the empirical research supported the essence of the definitional construct and allowed it to be further developed and refined. In addition, the findings suggest that, while levels of SCM understanding are generally quite high, there is room for improvement in relation to how this understanding is translated into practice. Research limitations/implications: Expansion of the research design to incorporate case studies, grounded theory and action research has the potential to generate new SCM theory that builds on the Four Fundamentals construct, thus facilitating a deeper and richer understanding of SCM phenomena. The use of longitudinal studies would enable a barometer of progress to be developed over time. Practical implications: The authors' definitional construct supports improvement in the cohesion of SCM practices, thereby promoting the effective implementation of supply chain strategies. A number of critical success factors and/or barriers to implementation of SCM theory in practice are identified, as are a number of practical measures that could be implemented at policy/supply chain/firm level to improve the level of effective SCM adoption. Originality/value: The authors' robust definitional construct supports a more cohesive approach to the development of a unified theory of SCM. In addition to a profile of SCM understanding and adoption by firms in Ireland, the related critical success factors and/or inhibitors to success, as well as possible interventions, are identified.
Abstract:
An implementation of Sem-ODB—a database management system based on the Semantic Binary Model—is presented. A metaschema of a Sem-ODB database as well as the top-level architecture of the database engine is defined. A new benchmarking technique is proposed which allows databases built on different database models to compete fairly. This technique is applied to show that Sem-ODB has excellent efficiency compared with a relational database on a certain class of database applications. A new semantic benchmark is designed which allows evaluation of the performance of the features characteristic of semantic database applications. An application used in the benchmark represents a class of problems requiring databases with sparse data, complex inheritances and many-to-many relations. Such databases can be naturally accommodated by the semantic model. A fixed predefined implementation is not enforced, allowing the database designer to choose the most efficient structures available in the DBMS tested. The results of the benchmark are analyzed.

A new high-level querying model for semantic databases is defined. It is proven adequate to serve as an efficient native semantic database interface, and has several advantages over the existing interfaces. It is optimizable and parallelizable, and supports the definition of semantic user views and the interoperability of semantic databases with other data sources such as the World Wide Web, relational databases, and object-oriented databases. The query is structured as a semantic database schema graph with interlinking conditionals. The query result is a mini-database, accessible in the same way as the original database. The paradigm supports and utilizes the rich semantics and inherent ergonomics of semantic databases.

The analysis and high-level design of a system that exploits the superiority of the Semantic Database Model over other data models in expressive power and ease of use, to allow uniform access to heterogeneous data sources such as semantic databases, relational databases, web sites, ASCII files, and others via a common query interface, is presented. The Sem-ODB engine is used to control all the data sources combined under a unified semantic schema. A particular application of the system, providing an ODBC interface to the WWW as a data source, is discussed.
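The benchmark itself is specific to the dissertation, but its key design choice, fixing what each system must compute while leaving each DBMS free to use its own most efficient structures, can be illustrated with a small timing harness (everything below, including the two stand-in workloads, is hypothetical):

```python
import time
import statistics

def run_benchmark(task_implementations, repeats=5):
    """Time each system's own best implementation of the same logical task.

    task_implementations: mapping of system name -> zero-argument callable.
    Only the task's result contract is fixed; how each system computes it
    is up to its designer, mirroring the fairness idea described above.
    """
    results = {}
    for name, task in task_implementations.items():
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            task()
            timings.append(time.perf_counter() - start)
        results[name] = statistics.median(timings)
    return results

# Hypothetical stand-ins for each system's version of the same query:
implementations = {
    "Sem-ODB": lambda: sum(range(10_000)),
    "relational DBMS": lambda: sum(range(50_000)),
}
print(run_benchmark(implementations))
```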
Abstract:
Next-generation integrated wireless local area network (WLAN) and 3G cellular networks aim to combine the roaming ability of a cellular network with the high-data-rate services of a WLAN. To ensure successful implementation of an integrated network, many issues must be carefully addressed, including network architecture design, resource management, quality of service (QoS), call admission control (CAC) and mobility management.

This dissertation focuses on QoS provisioning, CAC, and network architecture design in the integration of WLANs and cellular networks. First, a new scheduling algorithm and a call admission control mechanism in IEEE 802.11 WLAN are presented to support multimedia services with QoS provisioning. The proposed scheduling algorithms make use of idle system time to reduce the average packet loss of real-time (RT) services. The admission control mechanism provides long-term transmission quality for both RT and non-real-time (NRT) services by ensuring the packet loss ratio for RT services and the throughput for NRT services.

A joint CAC scheme is proposed to efficiently balance traffic load in the integrated environment. A channel searching and replacement algorithm (CSR) is developed to relieve traffic congestion in the cellular network by using idle channels in the WLAN. The CSR is optimized to minimize the system cost in terms of the blocking probability in the interworking environment. Specifically, it is proved that there exists an optimal admission probability for passive handoffs that minimizes the total system cost, and a method of finding this probability is designed based on linear-programming techniques.

Finally, a new integration architecture, Hybrid Coupling with Radio Access System (HCRAS), is proposed to lower the average cost of intersystem communication (IC) and the vertical handoff latency. An analytical model is presented to evaluate the system performance of the HCRAS in terms of an intersystem communication cost function and a handoff cost function. Based on this model, an algorithm is designed to determine the optimal route for each intersystem communication. Additionally, a fast handoff algorithm is developed to reduce the vertical handoff latency.
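The dissertation's cost model is its own; as a small generic companion, the standard Erlang-B recurrence below computes the call-blocking probability of a cell from its channel count and offered load, the kind of quantity the CSR algorithm is optimizing against (the traffic figures are hypothetical):

```python
def erlang_b(offered_load_erlangs, channels):
    """Erlang-B blocking probability via the numerically stable recurrence
    B(E, 0) = 1;  B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1))."""
    b = 1.0
    for m in range(1, channels + 1):
        b = offered_load_erlangs * b / (m + offered_load_erlangs * b)
    return b

# Hypothetical cell with 30 channels carrying 25 Erlangs of offered traffic:
print(f"blocking probability: {erlang_b(25.0, 30):.4f}")
```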
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This innovative, fully operational web application allows users to upload and retrieve information through a unique human-computer graphical interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and PHP (Personal Home Page) scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI). What was unique in the approach considered is that all pertinent clinical information about patients is synthesized, with input from clinical experts, into four different forms: clinical, fMRI scoring, image information, and neuropsychological data entry. A first contribution of this dissertation was in proposing an integrated processing platform that was site- and scanner-independent, in order to uniformly process the varied fMRI datasets and to generate comparative brain activation patterns. The data collection from the consortium complied with IRB requirements and provides all the safeguards for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. The Lateralization Index (LI) of healthy control (HC) subjects was evaluated in contrast to that of localization-related epilepsy (LRE) subjects. Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%); (b) right lateralization (HC = 2%, LRE = 10%); (c) bilateral (HC = 20%, LRE = 15%); (d) left lateralization (HC = 42%, LRE = 26%); (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of these brain activations. The intent was not to seek the highest output measures given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
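The dissertation does not spell out its LI formula here, but the conventional definition is LI = (L - R) / (L + R) over left- and right-hemisphere activation; the sketch below computes it and bins the result into the five groups named above (the voxel counts and cut-offs are hypothetical, not the dissertation's):

```python
def lateralization_index(left_activation, right_activation):
    """Conventional LI = (L - R) / (L + R): positive means left-dominant,
    negative means right-dominant."""
    l, r = float(left_activation), float(right_activation)
    return (l - r) / (l + r)

def classify(li, strong=0.4, weak=0.1):
    # Hypothetical cut-offs; the dissertation's own thresholds are not given.
    if li >= strong:
        return "strong left lateralization"
    if li >= weak:
        return "left lateralization"
    if li > -weak:
        return "bilateral"
    if li > -strong:
        return "right lateralization"
    return "strong right lateralization"

# Hypothetical suprathreshold voxel counts in left/right language regions:
li = lateralization_index(1850, 950)
print(f"LI = {li:.2f} -> {classify(li)}")
```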