999 results for Zachman Framework


Relevance:

20.00%

Publisher:

Abstract:

We consider a mobile sensor network monitoring a spatio-temporal field. Given limited cache sizes at the sensor nodes, the goal is to develop a distributed cache management algorithm to efficiently answer queries with a known probability distribution over the spatial dimension. First, we propose a novel distributed information theoretic approach in which the nodes locally update their caches based on full knowledge of the space-time distribution of the monitored phenomenon. At each time instant, local decisions are made at the mobile nodes concerning which samples to keep and whether or not a new sample should be acquired at the current location. These decisions aim to minimize an entropic utility function that captures the average amount of uncertainty in queries given the probability distribution of query locations. Second, we propose a different correlation-based technique, which only requires knowledge of the second-order statistics, thus relaxing the stringent constraint of having a priori knowledge of the query distribution, while significantly reducing the computational overhead. It is shown that the proposed approaches considerably improve the average field estimation error by maintaining efficient cache content. It is further shown that the correlation-based technique is robust to model mismatch in case of imperfect knowledge of the underlying generative correlation structure.
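As a rough illustration of the correlation-based variant, the sketch below (Python, with invented names; not the authors' algorithm) assumes a Gaussian spatial correlation kernel and a discrete query distribution, and greedily keeps the cache subset that minimizes an average error proxy weighted by the query probabilities.

```python
import numpy as np
from itertools import combinations

# Illustrative sketch of a correlation-based cache update (hypothetical
# names; not the paper's algorithm). The error proxy at a query point is
# 1 - rho^2 for the best-correlated cached sample.

def rho(x, y, length_scale=1.0):
    """Spatial correlation between two locations (assumed Gaussian kernel)."""
    d2 = np.sum((np.asarray(x, float) - np.asarray(y, float)) ** 2)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

def expected_error(cache, query_locs, query_probs):
    """Average estimation-error proxy over the query distribution."""
    total = 0.0
    for q, p in zip(query_locs, query_probs):
        best = max((rho(q, s) for s in cache), default=0.0)
        total += p * (1.0 - best ** 2)  # unexplained variance at q
    return total

def update_cache(cache, new_sample, capacity, query_locs, query_probs):
    """Keep the subset of (cache + new sample) minimizing expected error."""
    candidates = list(cache) + [new_sample]
    if len(candidates) <= capacity:
        return candidates
    # Exhaustive over subsets: fine for the small caches assumed here.
    return min((list(c) for c in combinations(candidates, capacity)),
               key=lambda c: expected_error(c, query_locs, query_probs))

# Example: 1-D field, three query locations, cache capacity of two samples.
cache = update_cache([[0.0], [1.0]], [0.9], 2,
                     query_locs=[[0.0], [0.5], [1.0]],
                     query_probs=[0.2, 0.3, 0.5])
print(cache)
```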

Relevance:

20.00%

Publisher:

Abstract:

The advent of virtualization and cloud computing technologies necessitates the development of effective mechanisms for the estimation and reservation of resources needed by content providers to deliver large numbers of video-on-demand (VOD) streams through the cloud. Unfortunately, capacity planning for the QoS-constrained delivery of a large number of VOD streams is inherently difficult as VBR encoding schemes exhibit significant bandwidth variability. In this paper, we present a novel resource management scheme to make such allocation decisions using a mixture of per-stream reservations and an aggregate reservation, shared across all streams to accommodate peak demands. The shared reservation provides capacity slack that enables statistical multiplexing of peak rates, while assuring analytically bounded frame-drop probabilities, which can be adjusted by trading off buffer space (and consequently delay) and bandwidth. Our two-tiered bandwidth allocation scheme enables the delivery of any set of streams with less bandwidth (or equivalently with higher link utilization) than state-of-the-art deterministic smoothing approaches. The algorithm underlying our proposed framework uses three per-stream parameters and is linear in the number of servers, making it particularly well suited for use in an on-line setting. We present results from extensive trace-driven simulations, which confirm the efficiency of our scheme, especially for small buffer sizes and delay bounds, and which underscore the significant realizable bandwidth savings, typically yielding losses that are an order of magnitude or more below our analytically derived bounds.
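The two-tiered idea can be sketched as follows (a simplification under assumed inputs, not the paper's algorithm): reserve a per-stream rate quantile, then size a shared pool from the empirical distribution of the aggregate excess so that it is exceeded with probability at most the target drop probability.

```python
import numpy as np

# Minimal sketch of a two-tiered reservation (hypothetical parameters; not
# the paper's scheme). Per-stream traces are arrays of per-frame rates.

def two_tier_reservation(traces, per_stream_q=0.9, drop_prob=1e-3):
    """Return (per-stream reservations, shared pool size)."""
    traces = [np.asarray(t, dtype=float) for t in traces]
    # Tier 1: each stream reserves its own per_stream_q rate quantile.
    base = np.array([np.quantile(t, per_stream_q) for t in traces])
    # Per-frame excess demand of each stream beyond its own reservation.
    n = min(len(t) for t in traces)
    excess = sum(np.maximum(t[:n] - b, 0.0) for t, b in zip(traces, base))
    # Tier 2: shared slack sized so aggregate excess exceeds it with
    # probability at most drop_prob (estimated empirically from the traces).
    shared = np.quantile(excess, 1.0 - drop_prob)
    return base, shared

# Example: three synthetic VBR-like streams.
rng = np.random.default_rng(0)
streams = [rng.gamma(shape=2.0, scale=1.0 + i, size=10_000) for i in range(3)]
base, shared = two_tier_reservation(streams)
print(base.sum() + shared)  # total bandwidth to reserve
```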

Relevance:

20.00%

Publisher:

Abstract:

NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).
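To make the typing idea concrete, here is a toy interval-based contract for flow components (hypothetical names and semantics; not NetSketch's actual DSL): a component's type records the input rates it admits and the output rates it guarantees, and serial composition type-checks only when the upstream output range fits inside the downstream input range.

```python
from dataclasses import dataclass

# Illustrative sketch only: intervals as a coarse "sketchy" abstraction of
# a constrained-flow component, with a safety check at composition time.

@dataclass(frozen=True)
class FlowType:
    inp: tuple[float, float]   # admissible input-rate interval
    out: tuple[float, float]   # guaranteed output-rate interval

def compose(a: FlowType, b: FlowType) -> FlowType:
    """Serial composition a ; b, sound when a's outputs are valid b inputs."""
    lo, hi = a.out
    if not (b.inp[0] <= lo and hi <= b.inp[1]):
        raise TypeError("composition unsafe: output range escapes input range")
    return FlowType(inp=a.inp, out=b.out)

shaper = FlowType(inp=(0.0, 100.0), out=(0.0, 10.0))
link   = FlowType(inp=(0.0, 12.0),  out=(0.0, 12.0))
print(compose(shaper, link))   # safe: the shaper's output fits the link
```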

Relevance:

20.00%

Publisher:

Abstract:

Classifying novel terrain or objects from sparse, complex data may require the resolution of conflicting information from sensors working at different times, locations, and scales, and from sources with different goals and situations. Information fusion methods can help resolve inconsistencies, as when evidence variously suggests that an object's class is car, truck, or airplane. The methods described here address a complementary problem, supposing that information from sensors and experts is reliable though inconsistent, as when evidence suggests that an object's class is car, vehicle, and man-made. Underlying relationships among classes are assumed to be unknown to the automated system or the human user. The ARTMAP information fusion system uses distributed code representations that exploit the neural network's capacity for one-to-many learning in order to produce self-organizing expert systems that discover hierarchical knowledge structures. The fusion system infers multi-level relationships among groups of output classes, without any supervised labeling of these relationships. The procedure is illustrated with two image examples, but is not limited to the image domain.
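The hierarchy-inference step can be illustrated with a small sketch (invented data; not the ARTMAP implementation): if every item labeled with class A is also labeled with class B, infer that A is a kind of B, yielding a multi-level class hierarchy without any supervised labeling of the relationships themselves.

```python
from collections import defaultdict

# Toy hierarchy inference from multi-label items (not ARTMAP code).
items = [
    {"car", "vehicle", "man-made"},
    {"truck", "vehicle", "man-made"},
    {"airplane", "vehicle", "man-made"},
    {"building", "man-made"},
]

classes = set().union(*items)
support = {c: [it for it in items if c in it] for c in classes}

implies = defaultdict(set)
for a in classes:
    for b in classes - {a}:
        if all(b in it for it in support[a]):
            implies[a].add(b)   # every 'a'-labeled item is also 'b'-labeled

print(dict(implies))  # e.g. car -> {vehicle, man-made}, vehicle -> {man-made}
```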

Relevance:

20.00%

Publisher:

Abstract:

Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically-realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML based schema, and multiple levels of granularity within a modern object oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
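For a sense of the kind of model such a simulator integrates, the following is a minimal two-compartment passive-membrane sketch stepped by forward Euler (illustrative parameters and names; not KInNeSS code or its API).

```python
import numpy as np

# Two passively coupled compartments with a leak channel (illustrative only).
C, g_leak, E_leak, g_axial = 1.0, 0.1, -70.0, 0.05   # per-compartment constants
dt, steps = 0.1, 2000                                 # step (ms) and step count
V = np.array([-70.0, -70.0])                          # soma, dendrite (mV)
record = np.empty((steps, 2))

for t in range(steps):
    I_inj = np.array([1.0 if 500 <= t < 1500 else 0.0, 0.0])  # step current to soma
    I_leak = g_leak * (E_leak - V)
    I_axial = g_axial * (V[::-1] - V)   # symmetric coupling between compartments
    V = V + dt * (I_leak + I_axial + I_inj) / C
    record[t] = V

print(record[-1])  # terminal membrane voltages of both compartments
```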

Relevance:

20.00%

Publisher:

Abstract:

Evaluation of temperature distribution in cold rooms is an important consideration in the design of food storage solutions. Two common approaches used in both industry and academia to address this question are the deployment of wireless sensors, and modelling with Computational Fluid Dynamics (CFD). However, for a real-world evaluation of temperature distribution in a cold room, both approaches have their limitations. For wireless sensors, large-scale deployment (to obtain a high resolution of temperature distribution) is economically unfeasible, while CFD modelling alone is usually not accurate enough to give a reliable result. In this paper, we propose a model-based framework that combines wireless sensors with CFD modelling to achieve a satisfactory trade-off between the number of wireless sensors deployed and the accuracy of the temperature profile in cold rooms. A case study is presented to demonstrate the usability of the framework.
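One simple way such a combination could work (an assumption for illustration; the paper's exact method is not specified here) is to correct the CFD-predicted field with the sensor-versus-CFD residuals, interpolated over the room by inverse-distance weighting.

```python
import numpy as np

# Sketch: fuse a CFD temperature field with a few sensor readings by
# interpolating the residuals (synthetic data; hypothetical method).

def idw(points, values, grid, power=2.0, eps=1e-9):
    """Inverse-distance-weighted interpolation of residuals onto the grid."""
    d = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w * values).sum(axis=1) / w.sum(axis=1)

# CFD output: temperature on a coarse grid of (x, y) locations.
grid = np.array([[x, y] for x in range(5) for y in range(5)], float)
t_cfd = 2.0 + 0.1 * grid[:, 0]                     # synthetic CFD field (deg C)

# Three sensors: locations, readings, and the CFD prediction at each.
sensors = np.array([[0.0, 0.0], [4.0, 4.0], [2.0, 1.0]])
t_meas = np.array([2.4, 2.9, 2.1])
t_cfd_at_sensors = 2.0 + 0.1 * sensors[:, 0]

residual = t_meas - t_cfd_at_sensors
t_corrected = t_cfd + idw(sensors, residual, grid)
print(t_corrected.reshape(5, 5))
```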

Relevance:

20.00%

Publisher:

Abstract:

This thesis will analyse Anglo-Irish relations between 1969 and 1975, when two topics dominated the relationship: Northern Ireland and the entry of Britain and Ireland into the European Economic Community (hereafter EEC). In 1969 entry to the EEC was still only a possibility and awaited political developments, while the Northern Ireland problem had yet to escalate. 1975, on the other hand, confirmed that Ireland would remain in the EEC even if Britain left, while Direct Rule for Northern Ireland was confirmed as the British policy for the foreseeable future. These dates are significant because they encompass, firstly, the periods before and after entry to the EEC and how entry transformed Anglo-Irish relations; secondly, the commencement and subsequent deterioration of the Northern Ireland problem and the attempts to resolve it that finally led to direct rule by Westminster. The study will examine the fluctuating nature of the relationship between Britain and Ireland. Special regard will be devoted to the demands of internal British politics and how such demands affected the relationship. Overall, the study will demonstrate how the bilateral relationship evolved under the pressure of events in Northern Ireland and adapted to the multilateral context of the EEC. It will compare the dynamics of the states’ interactions in two extremely different areas. The thesis will demonstrate how entry to the EEC transformed the unequal Anglo-Irish economic relationship into one of partners within the EEC. It will also analyse how the developing Northern Ireland problem caused changes to British policy. In particular, it will examine how the British Government came to recognise the beneficial role that the Republic of Ireland might play in resolving the Troubles in Northern Ireland.

Relevance:

20.00%

Publisher:

Abstract:

Natural and human-made disasters cause on average 120,000 deaths and over US$140 billion in damage to property and infrastructure every year, with national, regional and international actors consistently responding to the humanitarian imperative to alleviate suffering wherever it may be found. Despite various attempts to codify international disaster laws since the 1920s, a right to humanitarian assistance remains contested, reflecting concerns regarding the relative importance of state sovereignty vis-à-vis individual rights under international law. However, the evolving acquis humanitaire of binding and non-binding normative standards for responses to humanitarian crises highlights the increasing focus on rights and responsibilities applicable in disasters; although the International Law Commission has also noted the difficulty of identifying lex lata and lex ferenda regarding the protection of persons in the event of disasters due to the “amorphous state of the law relating to international disaster response.” Therefore, using the conceptual framework of transnational legal process, this thesis analyses the evolving normative frameworks and standards for rights-holders and duty-bearers in disasters. Determining the process whereby rights are created and evolve, and their potential internalisation into domestic law and policy, provides a powerful analytical framework for examining the progress and challenges of developing accountable responses to major disasters.

Relevance:

20.00%

Publisher:

Abstract:

Existing work in Computer Science and Electronic Engineering demonstrates that Digital Signal Processing techniques can effectively identify the presence of stress in the speech signal. These techniques use datasets containing real, or actual, stress samples, i.e. real-life stress such as 911 calls. Studies that use simulated or laboratory-induced stress have been less successful and inconsistent. Pervasive, ubiquitous computing is increasingly moving towards voice-activated and voice-controlled systems and devices. Speech recognition and speaker identification algorithms will have to improve and take emotional speech into account. Modelling the influence of stress on speech and voice is of interest to researchers from many different disciplines, including security, telecommunications, psychology, speech science, forensics and Human Computer Interaction (HCI). The aim of this work is to assess the impact of moderate stress on the speech signal. In order to do this, a dataset of laboratory-induced stress is required. While attempting to build this dataset it became apparent that reliably inducing measurable stress in a controlled environment, when speech is a requirement, is a challenging task. This work focuses on the use of a variety of stressors to elicit a stress response during tasks that involve speech content. Biosignal analysis (commercial Brain Computer Interfaces, eye tracking and skin resistance) is used to verify and quantify the stress response, if any. This thesis explains the basis of the author’s hypotheses on the elicitation of affectively-toned speech and presents the results of several studies carried out throughout the PhD research period. These results show that the elicitation of stress, particularly the induction of affectively-toned speech, is not a simple matter and that many modulating factors influence the stress response process. A model is proposed to reflect the author’s hypothesis on the emotional response pathways relating to the elicitation of stress with a required speech content. Finally, the author provides guidelines and recommendations for future research on speech under stress. Further research paths are identified and a roadmap for future research in this area is defined.
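Typical low-level features used in this line of work can be sketched as follows (a generic example, not the thesis's feature set): short-time energy and zero-crossing rate per frame, whose elevated mean and variance are often associated with arousal.

```python
import numpy as np

# Generic prosodic-feature sketch (synthetic signal; not the thesis's code).
def frame_features(signal, sr=16_000, frame_ms=25, hop_ms=10):
    frame, hop = int(sr * frame_ms / 1000), int(sr * hop_ms / 1000)
    feats = []
    for start in range(0, len(signal) - frame, hop):
        x = signal[start:start + frame]
        energy = float(np.mean(x ** 2))                        # loudness proxy
        zcr = float(np.mean(np.abs(np.diff(np.sign(x))) > 0))  # noisiness proxy
        feats.append((energy, zcr))
    return np.array(feats)

sig = np.sin(2 * np.pi * 220 * np.arange(16_000) / 16_000)  # 1 s, 220 Hz tone
f = frame_features(sig)
print(f.mean(axis=0), f.std(axis=0))  # per-feature summary statistics
```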

Relevance:

20.00%

Publisher:

Abstract:

This thesis assesses the current regulatory framework regarding clinical trials with neonates in Ireland from a children’s rights perspective, as derived from the UN Convention on the Rights of the Child 1989 (UN CRC) and its supporting instruments. The focus on neonates in the thesis is due to the particular need for clinical research with this group of children, their dependency on others for their protection and the lack of attention which has been given to them in the regulatory framework. The importance of children’s rights in this area is linked to the role of human rights in the regulation of clinical research in general. A rights-based approach is of great practical relevance in reforming law, policy and practice. For example, the CRC contains a set of commonly agreed legal benchmarks which can be used to assess the current framework and shape recommendations for reform. In this way, it provides a set of binding norms under international law, which must be complied with by states and state actors in all law, policy and practice affecting children. However, the contribution which a children’s rights approach could make to the regulation of research with children has not, to date, been explored in detail. This thesis aims to address this gap by developing a set of children’s rights-based benchmarks, which are used to assess the Irish regulatory framework for clinical trials with neonates and to develop recommendations for reform. The purpose of the analysis and recommendations is to assess Ireland’s compliance with international children’s rights law in the area and to analyse the potential of children’s rights to effectively address inadequacies in the Irish framework. The recommendations ultimately aim to develop a framework which will enhance the protection of neonates’ rights in this important area of children’s lives.

Relevance:

20.00%

Publisher:

Abstract:

This paper develops a framework for estimating household preferences for school and neighborhood attributes in the presence of sorting. It embeds a boundary discontinuity design in a heterogeneous residential choice model, addressing the endogeneity of school and neighborhood characteristics. The model is estimated using restricted-access Census data from a large metropolitan area, yielding a number of new results. First, households are willing to pay less than 1 percent more in house prices - substantially lower than previous estimates - when the average performance of the local school increases by 5 percent. Second, much of the apparent willingness to pay for more educated and wealthier neighbors is explained by the correlation of these sociodemographic measures with unobserved neighborhood quality. Third, neighborhood race is not capitalized directly into housing prices; instead, the negative correlation of neighborhood percent black and housing prices is due entirely to the fact that blacks live in unobservably lower-quality neighborhoods. Finally, there is considerable heterogeneity in preferences for schools and neighbors, with households preferring to self-segregate on the basis of both race and education. © 2007 by The University of Chicago. All rights reserved.
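A stylized version of the boundary discontinuity design can be sketched as follows (simulated data; not the paper's full sorting model): a fixed effect for each boundary segment absorbs shared unobserved neighborhood quality, so school quality is identified from homes straddling the same boundary.

```python
import numpy as np

# Boundary-fixed-effects regression on simulated data (illustrative only).
rng = np.random.default_rng(1)
n_boundaries, per_side = 50, 20
rows = []
for b in range(n_boundaries):
    nbhd = rng.normal()                      # unobserved quality, shared by both sides
    for side in (0, 1):
        score = rng.normal(loc=side)         # better school on side 1
        for _ in range(per_side):
            price = 0.02 * score + nbhd + rng.normal(scale=0.1)
            rows.append((b, score, price))

b_idx = np.array([r[0] for r in rows])
score = np.array([r[1] for r in rows])
price = np.array([r[2] for r in rows])

# Design matrix: school score plus one dummy per boundary segment.
X = np.column_stack([score,
                     (b_idx[:, None] == np.arange(n_boundaries)).astype(float)])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta[0])  # estimated capitalization of school quality, ~0.02
```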

Relevance:

20.00%

Publisher:

Abstract:

A focus on ecosystem services (ES) is seen as a means for improving decisionmaking. In the research to date, the valuation of the material contributions of ecosystems to human well-being has been emphasized, with less attention to important cultural ES and nonmaterial values. This gap persists because there is no commonly accepted framework for eliciting less tangible values, characterizing their changes, and including them alongside other services in decisionmaking. Here, we develop such a framework for ES research and practice, addressing three challenges: (1) Nonmaterial values are ill suited to characterization using monetary methods; (2) it is difficult to unequivocally link particular changes in socioecological systems to particular changes in cultural benefits; and (3) cultural benefits are associated with many services, not just cultural ES. There is no magic bullet, but our framework may facilitate fuller and more socially acceptable integrations of ES information into planning and management. © 2012 by American Institute of Biological Sciences. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: A hierarchical taxonomy of organisms is a prerequisite for semantic integration of biodiversity data. Ideally, there would be a single, expansive, authoritative taxonomy that includes extinct and extant taxa, information on synonyms and common names, and monophyletic supraspecific taxa that reflect our current understanding of phylogenetic relationships. DESCRIPTION: As a step towards development of such a resource, and to enable large-scale integration of phenotypic data across vertebrates, we created the Vertebrate Taxonomy Ontology (VTO), a semantically defined taxonomic resource derived from the integration of existing taxonomic compilations, and freely distributed under a Creative Commons Zero (CC0) public domain waiver. The VTO includes both extant and extinct vertebrates and currently contains 106,947 taxonomic terms, 22 taxonomic ranks, 104,736 synonyms, and 162,400 cross-references to other taxonomic resources. Key challenges in constructing the VTO included (1) extracting and merging names, synonyms, and identifiers from heterogeneous sources; (2) structuring hierarchies of terms based on evolutionary relationships and the principle of monophyly; and (3) automating this process as much as possible to accommodate updates in source taxonomies. CONCLUSIONS: The VTO is the primary source of taxonomic information used by the Phenoscape Knowledgebase (http://phenoscape.org/), which integrates genetic and evolutionary phenotype data across both model and non-model vertebrates. The VTO is useful for inferring phenotypic changes on the vertebrate tree of life, which enables queries for candidate genes for various episodes in vertebrate evolution.
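A toy version of the name-merging challenge might look like this (invented records and field names; not the VTO build pipeline): synonyms are unioned across source taxonomies, and conflicting parent assignments are flagged for curator review.

```python
# Toy merge of two taxonomic sources (hypothetical data and schema).
source_a = {"Danio rerio": {"parent": "Danio", "synonyms": {"zebrafish"}}}
source_b = {"Danio rerio": {"parent": "Danio", "synonyms": {"zebra danio"}},
            "Danio": {"parent": "Danionidae", "synonyms": set()}}

merged, conflicts = {}, []
for source in (source_a, source_b):
    for name, rec in source.items():
        if name not in merged:
            merged[name] = {"parent": rec["parent"],
                            "synonyms": set(rec["synonyms"])}
        else:
            merged[name]["synonyms"] |= rec["synonyms"]
            if merged[name]["parent"] != rec["parent"]:
                conflicts.append(name)      # needs curator attention

print(merged["Danio rerio"])   # parent plus the union of both synonym sets
print(conflicts)               # empty here: the parents agree
```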

Relevance:

20.00%

Publisher:

Abstract:

Mozambique, with approximately 0.4 physicians and 4.1 nurses per 10,000 people, has one of the lowest ratios of health care providers to population in the world. To rapidly scale up health care coverage, the Mozambique Ministry of Health has pushed for greater investment in training nonphysician clinicians, Técnicos de Medicina (TM). Based on identified gaps in TM clinical performance, the Ministry of Health requested technical assistance from the International Training and Education Center for Health (I-TECH) to revise the two-and-a-half-year preservice curriculum. A six-step process was used to revise the curriculum: (i) Conducting a task analysis, (ii) defining a new curriculum approach and selecting an integrated model of subject and competency-based education, (iii) revising and restructuring the 30-month course schedule to emphasize clinical skills, (iv) developing a detailed syllabus for each course, (v) developing content for each lesson, and (vi) evaluating implementation and integrating feedback for ongoing improvement. In May 2010, the Mozambique Minister of Health approved the revised curriculum, which is currently being implemented in 10 training institutions around the country. Key lessons learned: (i) Detailed assessment of training institutions' strengths and weaknesses should inform curriculum revision. (ii) Establishing a Technical Working Group with respected and motivated clinicians is key to promoting local buy-in and ownership. (iii) Providing ready-to-use didactic material helps to address some challenges commonly found in resource-limited settings. (iv) Comprehensive curriculum revision is an important first step toward improving the quality of training provided to health care providers in developing countries. Other aspects of implementation at training institutions and health care facilities must also be addressed to ensure that providers are adequately trained and equipped to provide quality health care services. This approach to curriculum revision and implementation teaches several key lessons, which may be applicable to preservice training programs in other less developed countries.