944 results for Three Pillar Framework
Abstract:
Key management is a core mechanism for securing applications and network services in wireless sensor networks. It covers two aspects: key distribution and key revocation. Many key management protocols have been designed specifically for wireless sensor networks; however, most focus on establishing the required keys or removing compromised keys, without considering support for higher-level security applications. When such applications are integrated into sensor networks later, new mechanisms must be designed. In this paper, we propose uKeying, a security framework for wireless sensor networks that can easily be extended to support many security applications. It includes three components: a security mechanism providing secrecy for communications in sensor networks, an efficient session key distribution scheme, and a centralized key revocation scheme. The framework does not depend on a specific key distribution scheme and can support many security applications, such as secure group communications. Our analysis shows that the framework is secure, efficient, and extensible, and simulation results reveal for the first time that a centralized key revocation scheme can also attain high efficiency.
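The abstract describes uKeying's components only at a high level. As a purely illustrative sketch of the centralized key revocation idea, the toy Python fragment below has a base station authenticate a revocation broadcast with a pre-shared key, after which nodes purge pairwise keys shared with revoked neighbors; all names and message formats are hypothetical, not the paper's protocol.

```python
# Illustrative sketch only: a toy centralized key-revocation broadcast in the
# spirit of uKeying. Names and message formats are hypothetical.
import hmac
import hashlib
import json

REVOCATION_KEY = b"network-wide revocation key"  # hypothetical pre-shared key

def make_revocation_msg(compromised_ids, epoch):
    """Base station: sign a list of compromised node IDs."""
    body = json.dumps({"epoch": epoch, "revoked": sorted(compromised_ids)}).encode()
    tag = hmac.new(REVOCATION_KEY, body, hashlib.sha256).digest()
    return body, tag

def apply_revocation(body, tag, pairwise_keys):
    """Sensor node: verify the broadcast, then purge keys shared with revoked nodes."""
    expected = hmac.new(REVOCATION_KEY, body, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return False  # reject forged or corrupted broadcasts
    for node_id in json.loads(body)["revoked"]:
        pairwise_keys.pop(node_id, None)
    return True

# Example: node 7 holds pairwise keys with nodes 3 and 5; node 3 is revoked.
keys = {3: b"k3", 5: b"k5"}
msg, tag = make_revocation_msg([3], epoch=1)
assert apply_revocation(msg, tag, keys) and 3 not in keys
```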
Abstract:
In this paper, we propose the long-term complementary exponential geometric distribution, a new three-parameter long-term lifetime distribution induced by a latent complementary risk framework, with decreasing, increasing, and unimodal hazard functions. The new distribution arises from latent complementary risk scenarios in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survivors. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard, and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach. A simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.
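The abstract does not reproduce the distribution's functional form. Assuming the standard construction implied by the description (the maximum of a geometric number of exponential lifetimes, combined with a long-term survival fraction), the three-parameter model would take a form like the following sketch, which may differ from the paper's exact parametrization:

```latex
% Sketch under stated assumptions: baseline cdf of the maximum of M ~ Geometric(theta)
% i.i.d. Exp(lambda) lifetimes, with a long-term (cure) fraction p.
\[
  F_{\mathrm{CEG}}(t) \;=\;
  \frac{\theta\,\bigl(1 - e^{-\lambda t}\bigr)}
       {1 - (1-\theta)\bigl(1 - e^{-\lambda t}\bigr)},
  \qquad t > 0,\ \lambda > 0,\ 0 < \theta < 1,
\]
\[
  S_{\mathrm{pop}}(t) \;=\; p + (1-p)\,\bigl(1 - F_{\mathrm{CEG}}(t)\bigr),
  \qquad 0 < p < 1,
\]
```

so that the population survival levels off at the long-term fraction $p$ as $t \to \infty$.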
Abstract:
The existence and stability of three-dimensional (3D) solitons in cross-combined linear and nonlinear optical lattices are investigated. In particular, starting from an optical lattice (OL) configuration that is linear in the x-direction and nonlinear in the y-direction, we consider the z-direction either unconstrained (quasi-2D OL case) or with another linear OL (full 3D case). We perform this study both analytically and numerically: analytically by a variational approach based on a Gaussian ansatz for the soliton wavefunction, and numerically by relaxation methods and direct integration of the corresponding Gross-Pitaevskii equation. We conclude that, while 3D solitons in the quasi-2D OL case are always unstable, the addition of another linear OL in the z-direction allows us to stabilize 3D solitons for both attractive and repulsive mean-field interactions. From our results, we suggest the possible use of spatial modulations of the nonlinearity in one of the directions as a tool for the management of stable 3D solitons.
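As a sketch of the setup described, with notation and potential profiles assumed rather than taken from the paper, the governing mean-field model would be a Gross-Pitaevskii equation with a linear lattice along x (and optionally z) and a spatially modulated nonlinearity along y:

```latex
% Illustrative form only; lattice depths and notation are assumptions.
\[
  i\,\partial_t \psi \;=\;
  \Bigl[-\tfrac{1}{2}\nabla^2
        + V_1 \cos(2x) + V_3 \cos(2z)
        + g\bigl(1 + \gamma\cos(2y)\bigr)\,|\psi|^2\Bigr]\psi,
\]
```

with $V_3 = 0$ recovering the quasi-2D OL case; $g<0$ ($g>0$) corresponds to attractive (repulsive) mean-field interactions, and $\gamma$ sets the strength of the nonlinear lattice.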
Abstract:
Social businesses present a new paradigm for capitalism, in which private companies, non-profit organizations, and civil society create a new type of business with the main objective of solving social problems with financial sustainability and efficiency through market mechanisms. As with any new phenomenon, different authors conceptualize social businesses in distinct ways. This article aims to present and characterize three different perspectives on social business definitions: the European, the American, and that of the emerging countries. Each of these views is illustrated by a different Brazilian case. We conclude that all the cases have similar characteristics, but also relevant differences that are more than merely geographical. The perspectives analyzed in this paper provide an analytical framework for understanding the field of social businesses. Moreover, the cases demonstrate that in the Brazilian context the field of social business is under construction and, as such, draws on different conceptual influences to deal with a complex and challenging reality.
Abstract:
Over the last 60 years, computers and software have enabled incredible advances in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking that identifies any deviation from the desired behaviour as soon as possible and, where feasible, applies corrections. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy, and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to fill the gap between humans and increasingly complex technology.
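The dissertation's monitoring module is a Drools rule base; purely to illustrate the Event Calculus idea it implements, here is a minimal holds-at interpreter in Python (event and fluent names are invented):

```python
# Minimal Event Calculus sketch: initiates/terminates axioms over a timeline.
# Illustrative only; the dissertation's actual module is a Drools rule base.
INITIATES = {("login", "session_open"), ("open_door", "door_open")}
TERMINATES = {("logout", "session_open"), ("close_door", "door_open")}

def holds_at(fluent, t, events):
    """A fluent holds at time t if some earlier event initiated it and
    no later event before t terminated it (simple EC, no concurrency)."""
    state = False
    for time, action in sorted(events):
        if time >= t:
            break
        if (action, fluent) in INITIATES:
            state = True
        elif (action, fluent) in TERMINATES:
            state = False
    return state

events = [(1, "login"), (5, "logout")]
assert holds_at("session_open", 3, events) is True
assert holds_at("session_open", 6, events) is False
```

A conformance monitor in this style would compare such derived fluents against expectations and react as soon as a deviation appears, rather than after the full execution trace is available.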
Abstract:
Concern is growing in European society about the return of fascist and neo-Nazi tendencies and the spread of xenophobic and anti-Semitic ideologies, some of them fed by theses denying those tragic events of our recent history. The fight against denialist discourse has been carried beyond the social and academic spheres, and it has been proposed that European legal systems incorporate specific criminal offences that criminalize this type of speech: denying, trivializing, or justifying the Holocaust or other genocides or grave crimes against humanity. This legislation, which finds its fullest expression in Framework Decision 2008/913/JHA, although it punishes socially repugnant speech, nevertheless raises doubts as to its legitimacy within a system of liberties built on the pillar of pluralism characteristic of democratic states. The question thus arises of whether "new" crimes of opinion may be emerging, and this is the subject of the present thesis. The specific objective of this work is to analyze this criminal policy in order to propose a configuration of the crime of denialism compatible with freedom of expression, while questioning the advisability of punishing this type of conduct through a specific criminal offence. In particular, it seeks to answer three questions. First, should denialist discourse be protected prima facie by freedom of expression in an open, person-centered legal order, and what "rules" could serve as criteria for limiting this kind of expression? Second, how could a criminal offence that specifically criminalizes this kind of conduct be constructed in a manner respectful of constitutional and criminal-law principles? And finally, is a criminal policy that leads to the creation of a specific crime of denialism advisable or appropriate?
Abstract:
In this thesis we further develop the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for $k \rightarrow \infty$ and the standard effective action for $k \rightarrow 0$. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting in which to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations. In fact, the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way to iteratively solve the flow equation if the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NL$\sigma$M), and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge-invariant way. We propose a new truncation scheme in which the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis. This makes it possible to project the flow of all terms in the EAA which are analytic in the fields. As an application we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow becomes predominant. In the EAA framework this problem is dealt with through the background field formalism, at the cost of enlarging the theory space where the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space where the running of the cosmological constant and of Newton's constant is influenced by the fluctuation couplings.
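For reference, the exact flow equation satisfied by the EAA is the well-known Wetterich equation, written here in a common notation (conventions may differ slightly from the thesis):

```latex
\[
  \partial_t \Gamma_k[\varphi]
  \;=\;
  \frac{1}{2}\,\mathrm{Tr}\!\left[
    \left(\Gamma_k^{(2)}[\varphi] + R_k\right)^{-1} \partial_t R_k
  \right],
  \qquad t \equiv \ln k,
\]
```

where $\Gamma_k^{(2)}$ is the second functional derivative of the EAA with respect to the fields and $R_k$ is the IR cutoff (regulator) function.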
Abstract:
Three-dimensional electron microscopy (3-D EM) provides a framework for the analysis of large protein quaternary structures. The advantage over the generally higher-resolving method of X-ray crystallography is the embedding of the proteins in their physiological environment. However, results of the two methods can be combined to obtain superior structural information. In this work, three different protein types, (i) myriapod hemocyanin, (ii) vesicle-inducing protein in plastids 1 (Vipp1), and (iii) acetylcholine-binding protein (AChBP), were structurally analyzed by 2-D and 3-D EM and, where possible, functionally interpreted.

Myriapod hemocyanins have previously been shown to be 6x6-meric assemblies that, in the case of Scutigera coleoptrata hemocyanin (ScoHc), show two 3x6-mer planes with a staggering angle of approximately 60°. Here, previously observed structural differences between oxy- and deoxy-ScoHc could be substantiated. A 4° rotation between hexamers of two different 3x6-mer planes was measured, which originates at the most central inter-hexamer interface. Further information about allosteric behaviour in myriapod hemocyanin was gained by analyzing Polydesmus angustus hemocyanin (PanHc), which shows a stable 3x6-mer and divergent histidine patterns in the inter-hexamer interfaces when compared to ScoHc. Both findings would conclusively explain the very different oxygen-binding properties of chilopod and diplopod hemocyanin.

Vipp1 is a protein found in cyanobacteria and higher plants which is essential for thylakoid membrane function and forms highly variable ring-shaped structures. In the course of this study, the first 3-D analysis of Vipp1 was conducted and yielded reconstructions of six differently sized Vipp1 rings from negatively stained images at resolutions between 20 and 30 Å. Furthermore, mutational analyses identified specific N-terminal amino acids that are essential for ring formation. On the basis of these analyses and previously published results, a hypothetical model of the Vipp1 tertiary and quaternary structure was generated.

AChBP is a water-soluble protein in the hemolymph of mollusks. It is a structural and functional homologue of the ligand-binding domain of nicotinic acetylcholine receptors. For the freshwater snail Biomphalaria glabrata, we previously described two types of AChBP (BgAChBP1 and BgAChBP2). In this work, a 6 Å 3-D reconstruction of native BgAChBP is presented, which shows a dodecahedral assembly that is unprecedented for an AChBP. Single-particle analysis of recombinantly expressed BgAChBP types led to preliminary results showing a dodecahedral assembly of BgAChBP1 and a dipentameric assembly of BgAChBP2. This indicates divergent biological functions of the two types.
Abstract:
Emmanuel Levinas once stated that his "project" was "the deformalization of time." Jacques Derrida, too, laid out a framework of thinking about time that dismissed the relevance of the past and the future and even belittled the significance of, and our ability to know anything about, the "present." Both of these thinkers discussed such notions of time in the context of complex theories of representation, that is, of the "relationship" between signifier and signified. This thesis considers the connection between theories of time and conceptions of the "relationship" between signifier and signified to ask how Hamlet's role as the agent of the plot in Hamlet relates to his own consideration of his "relationship" to the ghost as a potentially empty signifier.
Abstract:
Metabolomics, one of the most rapidly growing technologies in the "-omics" field, denotes the comprehensive analysis of low-molecular-weight compounds and their pathways. Cancer-specific alterations of the metabolome can be detected by high-throughput mass-spectrometric metabolite profiling and serve as a considerable source of new markers for the early differentiation of malignant diseases as well as their distinction from benign states. However, a comprehensive framework for the statistical evaluation of marker panels in a multi-class setting has not yet been established. We collected serum samples of 40 pancreatic carcinoma patients, 40 controls, and 23 pancreatitis patients according to standard protocols and generated amino acid profiles by routine mass spectrometry. In an intrinsic three-class bioinformatic approach, we compared these profiles, evaluated their selectivity, and computed multi-marker panels combined with the conventional tumor marker CA 19-9. Additionally, we tested for non-inferiority and superiority to determine the diagnostic surplus value of our multi-metabolite marker panels. Compared to CA 19-9 alone, the combined amino acid-based metabolite panel had a superior selectivity for the discrimination of healthy controls, pancreatitis, and pancreatic carcinoma patients [Formula: see text]. We combined highly standardized samples, a three-class study design, a high-throughput mass-spectrometric technique, and a comprehensive bioinformatic framework to identify metabolite panels selective for all three groups in a single approach. Our results suggest that metabolomic profiling necessitates appropriate evaluation strategies and, despite all its current limitations, can deliver marker panels with high selectivity even in multi-class settings.
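The abstract does not detail the bioinformatic pipeline. As a hedged sketch of a three-class marker-panel evaluation of the general kind described, one could score out-of-fold class probabilities from a multinomial model with and without the amino acid features (all data and variable names below are synthetic stand-ins, not the study's data or method):

```python
# Illustrative three-class marker-panel evaluation; not the paper's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 103                              # 40 controls + 40 carcinoma + 23 pancreatitis
X_amino = rng.normal(size=(n, 5))    # synthetic stand-ins for amino acid levels
ca19_9 = rng.lognormal(size=(n, 1))  # synthetic stand-in for the CA 19-9 marker
y = np.repeat([0, 1, 2], [40, 40, 23])  # control / carcinoma / pancreatitis

for name, X in [("CA 19-9 alone", ca19_9),
                ("CA 19-9 + amino acid panel", np.hstack([ca19_9, X_amino]))]:
    # multinomial logistic regression, out-of-fold class probabilities
    clf = LogisticRegression(max_iter=1000)
    proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")
    # macro one-vs-rest AUC as a simple multi-class selectivity measure
    auc = roc_auc_score(y, proba, multi_class="ovr", average="macro")
    print(f"{name}: macro OvR AUC = {auc:.2f}")
```

On real profiles, the comparison of the two AUCs (plus formal non-inferiority and superiority tests) would quantify the diagnostic surplus value of the metabolite panel over CA 19-9 alone.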
Abstract:
Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. Automation is therefore needed for use in clinical routine. In addition, because of the long computing times in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework. By this means, appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers, and the patient. The source part includes the phase-space source, source models, and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of DICOM streams, is used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for the 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown, in which MC-calculated dose distributions are compared with those calculated by a pencil beam or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
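The abstract's key architectural point is that the modules pass particles in memory rather than through interface files. Schematically, such a pipeline could look like the following sketch; all class and method names are invented, and the real framework wraps full MC transport codes rather than these placeholders:

```python
# Schematic sketch of the modular, in-memory particle pipeline described:
# source -> beam modifiers -> patient, with a transport model chosen per module.
from dataclasses import dataclass

@dataclass
class Particle:
    energy_mev: float
    x: float
    y: float
    z: float

class PhaseSpaceSource:
    def particles(self):
        yield Particle(6.0, 0.0, 0.0, 0.0)   # e.g. one 6 MV beam sample

class BeamModifier:
    def __init__(self, geometry="simple"):    # "simple" or "exact" geometry
        self.geometry = geometry
    def transport_through(self, particle):
        particle.energy_mev *= 0.98            # placeholder attenuation
        return particle

class PatientDoseCalculator:
    def score(self, particle):
        return particle.energy_mev * 0.01      # placeholder dose contribution

def run_pipeline(source, modifiers, patient):
    """Pass each particle through every module in memory (no interface files)."""
    dose = 0.0
    for p in source.particles():
        for m in modifiers:
            p = m.transport_through(p)
        dose += patient.score(p)
    return dose

print(run_pipeline(PhaseSpaceSource(),
                   [BeamModifier(), BeamModifier("exact")],
                   PatientDoseCalculator()))
```

Keeping the inter-module interface at the level of particle objects is what lets each geometric region choose its own transport code and complexity level independently.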
Abstract:
In distribution system operations, dispatchers at the control center closely monitor system operating limits to ensure system reliability and adequacy. This reliability is partly due to the provision of remotely controllable tie and sectionalizing switches. While the stochastic nature of wind generation can impact the level of wind energy penetration in the network, an estimate of the hourly output from wind can be extremely useful. Under any operating condition, the switching actions require human intervention and can be an extremely stressful task. To date, no approach has handled sets of switching combinations with the uncertainty of distributed wind generation among the decision variables. This thesis proposes a three-fold online management framework: (1) prediction of wind speed, (2) estimation of wind generation capacity, and (3) enumeration of feasible switching combinations. The proposed methodology is evaluated on a 29-node test system with 8 remotely controllable switches and two wind farms of 18 MW and 9 MW nameplate capacity, respectively, generating the sequence of system reconfiguration states during normal and emergency conditions.
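As a hedged illustration of the proposed three-fold framework, the sketch below chains a naive wind speed forecast, a piecewise power-curve estimate, and a brute-force enumeration of switch states; the thesis's actual predictor and feasibility rules are not given in the abstract, so the models and limits here are hypothetical:

```python
# Sketch of the three-fold pipeline (hypothetical models and feasibility rule).
from itertools import product

def predict_wind_speed(history_ms):
    """Step 1: naive persistence forecast of next-hour wind speed (m/s)."""
    return history_ms[-1]

def wind_power_mw(speed_ms, nameplate_mw, cut_in=3.0, rated=12.0, cut_out=25.0):
    """Step 2: piecewise power-curve estimate of farm output."""
    if speed_ms < cut_in or speed_ms >= cut_out:
        return 0.0
    if speed_ms >= rated:
        return nameplate_mw
    return nameplate_mw * (speed_ms - cut_in) / (rated - cut_in)

def feasible_switchings(n_switches, is_feasible):
    """Step 3: enumerate open/closed states of remote switches, keep feasible ones."""
    return [s for s in product([0, 1], repeat=n_switches) if is_feasible(s)]

speed = predict_wind_speed([7.2, 7.8, 8.1])
generation = wind_power_mw(speed, 18.0) + wind_power_mw(speed, 9.0)
# toy feasibility rule: a radiality proxy requiring exactly 5 of 8 switches closed
combos = feasible_switchings(8, lambda s: sum(s) == 5)
print(f"forecast {speed} m/s, {generation:.1f} MW, {len(combos)} feasible states")
```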
Abstract:
Capital cities that are not the economic centers of their nations, so-called secondary capital cities (SCCs), tend to be overlooked in the field of political science. Consequently, there is a lack of research and resulting theory describing their local economies and public policies. This paper analyzes how SCCs try to develop and position themselves through the formulation of locational policies. By linking three different theoretical strands, the Regional Innovation System (RIS) approach, the concept of locational policies, and the regime perspective, this paper constructs a framework for studying the economic and political dynamics in SCCs.
Abstract:
Online reputation management deals with monitoring and influencing the online record of a person, an organization, or a product. The Social Web offers increasingly simple ways to publish and disseminate personal or opinionated information, which can rapidly have a disastrous influence on the online reputation of some of these entities. This dissertation can be split into three parts: in the first part, possible fuzzy clustering applications for the Social Semantic Web are investigated; the second part explores promising Social Semantic Web elements for organizational applications; in the third part, the former two parts are brought together and a fuzzy online reputation analysis framework is introduced and evaluated. The entire PhD thesis is based on literature reviews as well as on argumentative-deductive analyses. The possible applications of Social Semantic Web elements within organizations were researched using a scenario and an additional case study, together with two ancillary case studies based on qualitative interviews. For the conception and implementation of the online reputation analysis application, a conceptual framework was developed. Employing test installations and prototyping, the essential parts of the framework have been implemented. By following a design science research approach, this PhD thesis has created two artifacts: a framework and a prototype as proof of concept. Both artifacts hinge on two core elements: a (cluster analysis-based) translation of tags used in the Social Web into a computer-understandable fuzzy grassroots ontology for the Semantic Web, and a (Topic Maps-based) knowledge representation system, which facilitates natural interaction with the fuzzy grassroots ontology. This is beneficial for the identification of unknown but essential Web data that could not be realized through conventional online reputation analysis. The inherent structure of natural language supports humans not only in communication but also in the perception of the world. Fuzziness is a promising tool for transforming those human perceptions into computer artifacts. Through fuzzy grassroots ontologies, the Social Semantic Web becomes more natural and can thus streamline online reputation management.
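As an illustration of the cluster analysis-based translation of Social Web tags into a fuzzy grassroots ontology, the sketch below runs a minimal fuzzy c-means over toy tag embeddings, so that a tag can belong to several concepts with graded membership; embeddings, tags, and parameters are invented, not taken from the dissertation:

```python
# Minimal fuzzy c-means sketch for grouping Social Web tags into overlapping
# concepts, as a toy analogue of the "fuzzy grassroots ontology" step.
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Return membership matrix U (n x c): degree of each point in each cluster."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                  # standard FCM update
        U /= U.sum(axis=1, keepdims=True)               # rows sum to 1
    return U

# Toy 2-D "tag embeddings"; a tag near both groups gets split membership.
tags = ["cheap", "bargain", "slow", "laggy", "affordable-but-slow"]
X = np.array([[0, 0], [0.2, 0.1], [3, 3], [3.1, 2.9], [1.5, 1.5]], float)
U = fuzzy_cmeans(X, c=2)
for tag, u in zip(tags, U):
    print(f"{tag}: memberships {np.round(u, 2)}")
```

The graded memberships are what distinguish a fuzzy grassroots ontology from a hard clustering: an ambiguous tag contributes to several concepts at once instead of being forced into exactly one.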
Abstract:
The causes and contexts of food insecurity among children in the U.S. are poorly understood because the prevalence of food insecurity at the child level is low compared to the prevalence of household food insecurity. In addition, caregivers may be reluctant to admit their children may not be getting enough food, due to shame or fear that they might lose custody of their children. Based on our ongoing qualitative research with mothers of young children, we suggest that food insecurity among children is related to the adverse childhood experiences of caregivers. These experiences translate into poor mental and physical health in adolescence and adulthood, which can lead to an inability to secure and maintain meaningful employment that pays a living wage. In this paper we propose that researchers shift the framework for understanding food insecurity in the United States to adopt a life-course approach. This demands that we pay greater attention to the lifelong consequences of exposure to trauma or toxic stress (exposure to violence, rape, abuse and neglect, and to housing, food, and other forms of deprivation) during childhood. We then present three case studies of women from our ongoing study to illustrate a variety of toxic stress exposures and their impact on a woman's earning potential, her mental health, and her attitudes toward raising children. Each woman describes her exposure to violence and deprivation as a child and adolescent, describes experiences with child hunger, and explains how her experiences have shaped her ability to nourish her children. We describe ways in which the nature of research investigations on food insecurity can be shifted, and provide recommendations for policy-oriented solutions regarding income support programs, early intervention programs, child and adult mental health services, and violence prevention programs.