885 results for Self-organisation, Nature-inspired coordination, Bio pattern, Biochemical tuple spaces
Abstract:
Livelihood resilience draws attention to the factors and processes that keep livelihoods functioning despite change, and thus enriches the livelihood approach, which places people, their differential capabilities to cope with shocks, and ways to reduce poverty and improve adaptive capacity at the centre of analysis. However, the few studies addressing resilience from a livelihood perspective take different approaches and focus only on some dimensions of livelihoods. This paper presents a framework that can be used for a comprehensive empirical analysis of livelihood resilience. We use a concept of resilience that considers agency as well as structure. A review of both theoretical and empirical literature related to livelihoods and resilience served as the basis for integrating the perspectives. The paper identifies the attributes and indicators of the three dimensions of resilience, namely buffer capacity, self-organisation and capacity for learning. The framework has not yet been systematically tested; however, the potential and limitations of its components are explored and discussed by drawing on empirical examples from the literature on farming systems. Besides providing a basis for applying the resilience concept in livelihood-oriented research, the framework offers a way to communicate with practitioners on identifying and improving the factors that build resilience. It can thus serve as a tool for monitoring the effectiveness of policies and practices aimed at building livelihood resilience.
Abstract:
This paper analyses the adaptiveness of the Public Agricultural Extension Services (PAES) to climate change. Existing literature, interviews and group discussions among PAES actors in the larger Makueni district, Kenya, provided the data for the analyses. The findings show that the PAES already has various elements of adaptiveness in its policies, approaches and methods of extension provision. However, the hierarchical structure of the PAES does not augur well for self-organisation at local levels of extension provision, especially under the conditions of abrupt change which climate change might trigger. Most importantly, adaptiveness presupposes adaptive capacity, but the lack of funding for extension, the limited mobility of extension officers, the low extension staff/farmer ratio, the aging of extension staff and the significant dependence on donor funding limit the adaptiveness of the PAES. Accordingly, criteria and indicators were identified in the literature with which an initial assessment of the adaptiveness of the PAES was conducted. However, this assessment framework needs to be improved, and future steps will integrate more specific inputs from actors in the PAES in order to make the framework operational.
Abstract:
This project provides an in-depth analysis of the computer-network attack techniques known as APTs (Advanced Persistent Threats), examining the impact they can have on a company's hosts and the information theft and monetary loss that may come with them. To do so, we look at the techniques attackers use to introduce malware into the network and at how that malware escalates privileges, obtains privileged information and stays hidden. In addition, as the experimental part of this project, a platform has been developed to detect malware in a network on the basis of the websites, URLs and IPs visited by its nodes. This view is obtained by extracting the company's DNS query logs and records, on which an exhaustive analysis is performed. To correctly infer which hosts are infected and which are not, a self-developed algorithm inspired by the Belief Propagation technique has been used; this technique has previously been applied by developers such as those at Los Alamos, New Mexico (USA) for purposes similar to those presented here. Furthermore, to improve inference speed and system performance, an algorithm adapted to the Apache Hadoop platform is proposed, replacing the usual programming paradigm with a new one known as MapReduce, which consists of splitting the information into key-value pairs. On the one hand, the existing Belief Propagation-based algorithms for malware discovery are proprietary and have not been fully published to date; on the other hand, these algorithms have not yet been adapted to Hadoop or to any distributed programming model, an aspect addressed in this project. The purpose of this project is not to develop a commercial or functionally complete platform, but rather to study the APT problem and to provide an implementation demonstrating that the proposed platform is feasible. This project also opens a new research horizon in adapting Belief Propagation-type algorithms for malware detection via DNS records to the MapReduce model. ABSTRACT. This project presents an in-depth investigation of APT-related problems in today's computer networks, examining how much damage they can inflict on a company's hosts and how much monetary and information loss they may cause. In our investigation we identify the techniques attackers generally apply to inject malware into networks and how this malware escalates its privileges, extracts privileged information and stays hidden. As the main part of this project, this paper shows how to develop and configure a platform that can detect malware from the URLs and IPs visited by the hosts of the network. This information can be extracted from the company's logs and DNS query records, on which we perform an in-depth analysis. A self-developed algorithm inspired by the Belief Propagation technique has been used to infer which hosts are infected and which are not. This technique has been used before by developers at Los Alamos Lab (New Mexico, USA) for similar purposes. Moreover, this project proposes an algorithm adapted to the Apache Hadoop platform in order to improve inference speed and system performance.
This platform replaces the traditional programming paradigm with a new one called MapReduce, which splits and shares information among hosts as key-value pairs. On the one hand, existing algorithms based on Belief Propagation are proprietary software and have not yet been fully published, having been patented because of the considerable economic benefits they can bring. On the other hand, these algorithms have been adapted neither to Hadoop nor to other distributed programming paradigms. This turns the challenge into a complicated problem and could dramatically increase the difficulty of deploying such a system at a client corporation. The purpose of this project is not to develop a complete, fully functional commercial platform; rather, a short summary of the APT problem is presented and an effort is made to demonstrate the viability of an APT discovery platform. At the same time, this project opens up new lines of investigation into adapting Belief Propagation algorithms to the MapReduce model and into malware detection with DNS records.
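The project's own Belief Propagation algorithm and its Hadoop port are described above as proprietary and unpublished, so no faithful reproduction is possible here. Purely as an illustration of the general idea, the Python sketch below builds a host-domain graph from toy DNS records with a map step and a reduce step, then passes simple "belief" messages between seeded malicious domains and the hosts that queried them. Every name and constant (dns_log, malicious_seeds, the 0.5 damping factor, N_ITER) is hypothetical and not taken from the project.

```python
# Illustrative sketch only: a simplified, MapReduce-flavoured belief/label
# propagation over a host-domain graph built from DNS query logs.
from collections import defaultdict

# Toy DNS log: (host, queried_domain) pairs extracted from company DNS records.
dns_log = [
    ("host-a", "update.example.com"),
    ("host-a", "evil-c2.example.net"),
    ("host-b", "evil-c2.example.net"),
    ("host-c", "news.example.org"),
]
# Prior beliefs for a few domains assumed to be known malicious (seed knowledge).
malicious_seeds = {"evil-c2.example.net": 0.9}

def map_phase(records):
    """Map step: emit (domain, host) key-value pairs, as a MapReduce mapper would."""
    for host, domain in records:
        yield domain, host

def reduce_phase(pairs):
    """Reduce step: group hosts by domain key."""
    grouped = defaultdict(set)
    for domain, host in pairs:
        grouped[domain].add(host)
    return grouped

domain_to_hosts = reduce_phase(map_phase(dns_log))

# Iterative message passing: beliefs flow from seeded domains to hosts and back.
host_belief = defaultdict(float)
domain_belief = defaultdict(float, malicious_seeds)
N_ITER = 3
for _ in range(N_ITER):
    for domain, hosts in domain_to_hosts.items():
        for h in hosts:                                  # domain -> host messages
            host_belief[h] = max(host_belief[h], 0.5 * domain_belief[domain])
    for domain, hosts in domain_to_hosts.items():        # host -> domain messages
        incoming = max((host_belief[h] for h in hosts), default=0.0)
        domain_belief[domain] = max(domain_belief[domain], 0.5 * incoming)

# Hosts ranked by inferred likelihood of infection.
print(sorted(host_belief.items(), key=lambda kv: -kv[1]))
```

In a real Hadoop job the map and reduce steps would be distributed over the cluster and iterated; the single-process version here only shows how the key-value decomposition and the message passing fit together.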
Abstract:
Human WEE1 (WEE1Hu) was cloned on the basis of its ability to rescue wee1+ mutants in fission yeast [Igarashi, M., Nagata, A., Jinno, S., Suto, K. & Okayama, H. (1991) Nature (London) 353, 80-83]. Biochemical studies carried out in vitro with recombinant protein demonstrated that WEE1Hu encodes a tyrosine kinase of approximately 49 kDa that phosphorylates p34cdc2 on Tyr-15 [Parker, L. L. & Piwnica-Worms, H. (1992) Science 257, 1955-1957]. To study the regulation of WEE1Hu in human cells, two polyclonal antibodies to bacterially produced p49WEE1Hu were generated. In addition, a peptide antibody generated against amino acids 361-388 of p49WEE1Hu was also used. Unexpectedly, these antibodies recognized a protein with an apparent molecular mass of 95 kDa in HeLa cells, rather than one of 49 kDa. Immunoprecipitates of p95 phosphorylated p34cdc2 on Tyr-15, indicating that p95 is functionally related to p49WEE1Hu, and mapping studies demonstrated that p95 is structurally related to p49WEE1Hu. In addition, the substrate specificity of p95 was more similar to that of fission yeast p107wee1 than to that of human p49WEE1. Finally, the kinase activity of p95 toward p34cdc2/cyclin B was severely impaired during mitosis. Taken together, these results indicate that the original WEE1Hu clone isolated in genetic screens encodes only the catalytic domain of human WEE1 and that the authentic human WEE1 protein has an apparent molecular mass of approximately 95 kDa.
Abstract:
Different copper(II) complexes containing Schiff-base ligands and an imidazole group, of interest for bioinorganic chemistry, catalysis and new materials, were prepared as perchlorate, nitrate or chloride salts and characterised by different spectroscopic techniques (UV/Vis, IR, EPR, Raman) and tandem mass spectrometry (ESI-MS/MS), in addition to elemental analysis, molar conductivity and magnetic property measurements. Some of these compounds, obtained as suitable crystals, had their structures determined by X-ray crystallography. The di- and polynuclear species containing chloride bridges showed hyperfine splittings in the EPR spectra, related to an equilibrium with the corresponding mononuclear species owing to the lability of the chloride ions, depending on the counter-ion and the solvent used. Additionally, in alkaline solution these compounds are in equilibrium with the corresponding polynuclear species, in which the copper centres are linked through an imidazolate ligand. In alkaline medium, these imidazolate-bridged polynuclear compounds were also isolated and characterised by different spectroscopic and magnetic techniques. By varying the structure and the bridging ligand, it was possible to modulate the magnetic interaction between the copper ions in related di- and polynuclear structures. The corresponding magnetic parameters were obtained by fitting the experimental XM vs T curves and correlate very well with the geometry, bond angles and distances between the ions, when compared with other similar complexes described in the literature. Subsequently, the factors governing the reactivity of all these species as catalysts in the oxidation of substrates of interest (phenols and amines) were studied by varying the cavity size in the cyclic structures or the ligand coordinated around the metal ion. Several of them proved to be good mimics of tyrosinases and catechol oxidases. A new model complex of cytochrome c oxidase (CcO), using protoporphyrin IX condensed with the N,N-bis[2-(1,2-methylbenzimidazolyl)ethyl]amino chelate and with a glycyl-L-histidine residue, was synthesised and characterised by different spectroscopic techniques, especially EPR. Addition of H2O2 to the fully oxidised FeIII/CuII system at -55°C, or bubbling molecular oxygen through a CO-saturated solution of the complex in its reduced FeII/CuI form, resulted in the formation of low-spin O2 adducts that are stable at low temperatures.
Abstract:
The reduction of Greek sovereign debt by €106 billion, agreed in the second bailout package of February 2012, is the largest in history. Nevertheless, immediately after publishing the key terms of the package, doubts arose whether it would achieve its goals: to reduce the debt-to-GDP ratio to 120.5% in 2020 and to ensure the return of Greece to market financing by 2015. This Briefing gives a timely input to the debate as it develops an analytical framework through which the expected failure of the Greek debt reduction can be assessed. It surveys the economic literature to identify three groups of factors reducing the effectiveness of sovereign debt restructuring: (1) the sovereign's fundamentals, (2) inefficiencies inherent in the restructuring process and (3) costs of restructuring; and applies them to the case of Greece. Based on this analysis, three policy implications are formulated, with relevance to Greece and the wider eurozone. Firstly, the importance of increased policy effort by Greece to enact current structural and growth-enhancing reforms is underlined. Secondly, the introduction of uniform CACs is proposed, which will reduce market participants' uncertainty, discipline runs on government debt and address the holdout inefficiency. Finally, sovereign debt restructuring is not recommended as a universal solution for over-indebtedness in the EU, given the direct and reputation costs of sovereign debt restructuring and the self-fulfilling nature of sovereign debt crises.
Abstract:
The situation of the third sector in Russia, i.e. the civil society structures independent from the state, is worsening on a regular basis. The Kremlin’s actions aimed at paralysing and destroying the independent non-governmental sector seen over the past four years have been presented as part of a struggle for the country’s sovereignty. This is above all a consequence of the Russian government’s efforts to take full control of the socio-political situation in the country while it also needs to deal with the geopolitical confrontation with the West and the worsening economic crisis. The policy aimed against non-governmental organisations is depriving the public of structures for self-organisation, protection of civil rights and the means of controlling the ever more authoritarian government. At the same time, the Kremlin has been depriving itself of channels of co-operation and communication with the public and antagonising the most active citizens. The restrictive measures the Kremlin has taken over the past few years with regard to NGOs prove that Russian decision-makers believe that any social initiative independent of the government may give rise to unrest, which is dangerous for the regime, and – given the economic slump – any unrest brings unnecessary political risk.
Abstract:
The deposition and properties of electroless nickel composite coatings containing graphite, PTFE and chromium were investigated. Solutions were developed for the codeposition of graphite and chromium with electroless nickel. Solutions for the deposition of graphite contained heavy metal ions for stability, with non-ionic and anionic surfactants to provide wetting and dispersion of the particles. Stability for the codeposition of chromium particles was achieved by oxidation of the chromium. Thin oxide layers, 200 nm thick, prevented initiation of the electroless reaction on the chromium. A mechanism for the formation of electroless composite coatings was considered, based on the physical adsorption of particles and as a function of the adsorption of charged surfactants and metal cations from solution. The influence of variables such as particle concentration in solution, particle size, temperature, pH, and agitation on the volume percentage of particles codeposited was studied. The volume percentage of graphite codeposited was found to increase with concentration in solution and plating rate. An increase in particle size and agitation reduced the volume percentage codeposited. The hardness of nickel-graphite deposits was found to decrease with graphite content in both the as-deposited and heat-treated condition. The frictional and wear properties of electroless nickel-graphite were studied and compared to those of electroless nickel-PTFE. The self-lubricating nature of both coatings was found to depend on the ratio of coated area to uncoated area, the size and content of lubricating material in the deposit, and the load between contacting surfaces. The mechanism of self-lubrication was considered, concluding that graphite only produced an initial lubricating surface due to the orientation of flakes, unlike PTFE, which produced true self-lubrication throughout the coating life. Heat treatment of electroless nickel-chromium deposits at 850°C for 8 and 16 hours produced nickel-iron-chromium alloy deposits with a phosphorus-rich surface of high hardness. Coefficients of friction and wear rates were initially moderate for the phosphorus-rich layer but increased for the nickel-iron-chromium region of the coating.
Abstract:
This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process. These are work stages which have not been fully investigated in previous productivity related studies. The primary objective of the research is to promote an integrated design and construction led approach to traditional house building based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for, and employed by, the study has been able to describe why many quality and productivity related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality led management and determine to a large extent the overall success of this approach. Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.
Abstract:
Swarm intelligence is a popular paradigm for algorithm design. Frequently drawing inspiration from natural systems, it assigns simple rules to a set of agents with the aim that, through local interactions, they collectively solve some global problem. Current variants of a popular swarm based optimization algorithm, particle swarm optimization (PSO), are investigated with a focus on premature convergence. A novel variant, dispersive PSO, is proposed to address this problem and is shown to lead to increased robustness and performance compared to current PSO algorithms. A nature inspired decentralised multi-agent algorithm is proposed to solve a constrained problem of distributed task allocation. Agents must collect and process the mail batches, without global knowledge of their environment or communication between agents. New rules for specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. These new rules are compared with a market based approach to agent control. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. Evolutionary algorithms are employed, both to optimize parameters and to allow the various rules to evolve and compete. We also observe extinction and speciation. In order to interpret algorithm performance we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency as well as a complete theoretical description of a non-trivial case, and compare these with the experimental results. Motivated by this work we introduce agent "memory" (the possibility for agents to develop preferences for certain cities) and show that not only does it lead to emergent cooperation between agents, but also to a significant increase in efficiency.
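The abstract does not spell out the dispersive PSO update rules, so the sketch below is only a generic global-best PSO on a toy objective, with a hypothetical anti-stagnation step that re-scatters the swarm once it has collapsed onto the global best, illustrating the premature-convergence issue the thesis targets. The parameters (W, C1, C2, DISPERSION_RADIUS) are illustrative and are not taken from the thesis.

```python
# Minimal global-best PSO sketch with a hypothetical "dispersion" step.
import random

def sphere(x):
    """Toy objective: sum of squares, minimum at the origin."""
    return sum(xi * xi for xi in x)

DIM, N, ITERS = 5, 20, 200
W, C1, C2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients
DISPERSION_RADIUS = 1e-3             # hypothetical swarm-collapse threshold

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)[:]

for _ in range(ITERS):
    for i in range(N):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]
    # Hypothetical anti-stagnation step: if all particles sit on top of the
    # global best, re-scatter them to restore diversity.
    spread = max(max(abs(pos[i][d] - gbest[d]) for d in range(DIM)) for i in range(N))
    if spread < DISPERSION_RADIUS:
        pos = [[g + random.uniform(-1, 1) for g in gbest] for _ in range(N)]

print("best value found:", sphere(gbest))
```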
Abstract:
A nature inspired decentralised multi-agent algorithm is proposed to solve a problem of distributed task allocation in which cities produce and store batches of different mail types. Agents must collect and process the mail batches, without global knowledge of their environment or communication between agents. The problem is constrained so that agents are penalised for switching mail types. When an agent processes a mail batch of a different type to the previous one, it must undergo a change-over, with repeated change-overs rendering the agent inactive. The efficiency (average amount of mail retrieved) and the flexibility (ability of the agents to react to changes in the environment) are investigated both in static and dynamic environments and with respect to sudden changes. New rules for mail selection and specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. We employ an evolutionary algorithm which allows the various rules to evolve and compete. Apart from obtaining optimised parameters for the various rules for any environment, we also observe extinction and speciation.
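The specific selection and specialisation rules are not given in the abstract. The following sketch shows one conventional way such rules are often built: a response-threshold style choice of mail type combined with a change-over counter and threshold reinforcement. The thresholds, learning rates and demand values are hypothetical.

```python
# Illustrative response-threshold style mail selection with change-overs.
import random

MAIL_TYPES = ["A", "B", "C"]

class Agent:
    def __init__(self):
        # One threshold per mail type; a low threshold means specialisation.
        self.threshold = {m: 0.5 for m in MAIL_TYPES}
        self.current_type = None
        self.changeovers = 0

    def pick(self, demand):
        """High demand and a low threshold make a type more likely to be chosen."""
        def p(m):
            s, t = demand[m], self.threshold[m]
            return s * s / (s * s + t * t) if s > 0 else 0.0
        weights = [p(m) for m in MAIL_TYPES]
        total = sum(weights)
        if total == 0:
            return None
        r, acc = random.uniform(0, total), 0.0
        for m, w in zip(MAIL_TYPES, weights):
            acc += w
            if r <= acc:
                return m
        return MAIL_TYPES[-1]

    def work(self, demand):
        m = self.pick(demand)
        if m is None:
            return
        if self.current_type is not None and m != self.current_type:
            self.changeovers += 1            # switching type costs a change-over
        self.current_type = m
        demand[m] = max(0.0, demand[m] - 1)
        # Specialisation: doing a type lowers its threshold, raises the others.
        for t in MAIL_TYPES:
            delta = -0.1 if t == m else 0.05
            self.threshold[t] = min(1.0, max(0.01, self.threshold[t] + delta))

demand = {"A": 10.0, "B": 3.0, "C": 0.0}
agents = [Agent() for _ in range(5)]
for _ in range(20):
    for a in agents:
        a.work(demand)
print(demand, [a.changeovers for a in agents])
```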
Abstract:
Multi-agent algorithms inspired by the division of labour in social insects and by markets are applied to a constrained problem of distributed task allocation. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. We employ nature-inspired particle swarm optimisation to obtain optimised parameters for all algorithms in a range of representative environments. Although results are obtained for large population sizes to avoid finite size effects, the influence of population size on the performance is also analysed. From a theoretical point of view, we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency, and compare these with the experimental results.
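As a companion to the insect-inspired rules sketched earlier, the snippet below illustrates the market-based side of the comparison as a simple greedy sealed-bid auction in which an agent's bid grows if accepting the task would force a change-over. The cost model and penalty value are assumptions, not the paper's actual mechanism.

```python
# Minimal market-based allocation sketch: tasks are auctioned one by one and
# each task goes to the lowest-cost bidder; switching mail type is penalised.
def run_auction(tasks, agents, changeover_penalty=2.0):
    """Greedy sealed-bid auction over a list of (task_id, mail_type) tasks."""
    assignment = {}
    for task_id, mail_type in tasks:
        bids = []
        for name, current_type in agents.items():
            cost = 1.0
            if current_type not in (None, mail_type):
                cost += changeover_penalty    # bid reflects the change-over cost
            bids.append((cost, name))
        _, winner = min(bids)
        assignment[task_id] = winner
        agents[winner] = mail_type            # winner switches to the task's type
    return assignment

agents = {"agent-1": None, "agent-2": "A"}
tasks = [("t1", "A"), ("t2", "B"), ("t3", "A")]
print(run_auction(tasks, agents))
```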
Abstract:
Ant colony optimisation algorithms model the way ants use pheromones for marking paths to important locations in their environment. Pheromone traces are picked up, followed, and reinforced by other ants but also evaporate over time. Optimal paths attract more pheromone and less useful paths fade away. The main innovation of the proposed Multiple Pheromone Ant Clustering Algorithm (MPACA) is to mark objects using many pheromones, one for each value of each attribute describing the objects in multidimensional space. Every object has one or more ants assigned to each attribute value and the ants then try to find other objects with matching values, depositing pheromone traces that link them. Encounters between ants are used to determine when ants should combine their features to look for conjunctions and whether they should belong to the same colony. This paper explains the algorithm and explores its potential effectiveness for cluster analysis. © 2014 Springer International Publishing Switzerland.
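The MPACA itself assigns ants to individual attribute values and lets encounters between ants decide colony membership; the short sketch below captures only the core intuition: one pheromone contribution per matching attribute value on links between objects, with evaporation, after which clusters are read off as connected components of strong links. The deposit and evaporation rates and the link threshold are hypothetical, and the individual ant agents are abstracted away.

```python
# Minimal sketch in the spirit of the multiple-pheromone clustering idea.
from itertools import combinations

objects = {                      # toy data: object id -> attribute values
    "o1": {"colour": "red", "size": "small"},
    "o2": {"colour": "red", "size": "small"},
    "o3": {"colour": "blue", "size": "large"},
    "o4": {"colour": "blue", "size": "large"},
}
pheromone = {pair: 0.0 for pair in combinations(sorted(objects), 2)}
DEPOSIT, EVAPORATION, ROUNDS, THRESHOLD = 1.0, 0.1, 10, 1.0

for _ in range(ROUNDS):
    for (a, b) in pheromone:
        # Each matching attribute value contributes pheromone to the a-b link,
        # standing in for the per-value ants of the full algorithm.
        matches = sum(objects[a][k] == objects[b][k] for k in objects[a])
        pheromone[(a, b)] += DEPOSIT * matches
        pheromone[(a, b)] *= (1.0 - EVAPORATION)    # trails also evaporate

# Read clusters as connected components over links above the threshold.
clusters, seen = [], set()
for start in objects:
    if start in seen:
        continue
    stack, component = [start], set()
    while stack:
        node = stack.pop()
        if node in component:
            continue
        component.add(node)
        for (a, b), level in pheromone.items():
            if level >= THRESHOLD and node in (a, b):
                stack.append(b if node == a else a)
    seen |= component
    clusters.append(sorted(component))
print(clusters)
```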
Abstract:
This paper examines two concepts, social vulnerability and social resilience, often used to describe people and their relationship to a disaster. Social vulnerability is the exposure to harm resulting from demographic and socioeconomic factors that heighten the exposure to disaster. Social resilience is the ability to avoid disaster, cope with change and recover from disaster. Vulnerability to a space and social resilience through society are explored through a focus on the elderly, a group sometimes regarded as having low resilience while being particularly vulnerable. Our findings explore the degree to which an elderly group exposed to coastal flood risk exhibits social resilience through both cognitive strategies, such as risk perception and self-perception, as well as through coping mechanisms, such as accepting change and self-organisation. These attenuate and accentuate the resilience of individuals through their own preparations as well as their communities' preparations, and also contribute to the delusion of resilience, which leads individuals to act as if they are more resilient than they are in reality, which we call negative resilience. Thus, we draw attention to three main areas: the degree to which social vulnerability can disguise its social resilience; the role played by cognitive strategies and coping mechanisms on an individual's social resilience; and the high risk aspects of social resilience. © 2014 Elsevier Ltd. All rights reserved.
Abstract:
Over the past two years there have been several large-scale disasters (Haitian earthquake, Australian floods, UK riots, and the Japanese earthquake) that have seen wide use of social media for disaster response, often in innovative ways. This paper provides an analysis of the ways in which social media has been used in public-to-public communication and public-to-government organisation communication. It discusses four ways in which disaster response has been changed by social media: 1. Social media appears to be displacing the traditional media as a means of communication with the public during a crisis. In particular social media influences the way traditional media communication is received and distributed. 2. We propose that user-generated content may provide a new source of information for emergency management agencies during a disaster, but there is uncertainty with regards to the reliability and usefulness of this information. 3. There are also indications that social media provides a means for the public to self-organise in ways that were not previously possible. However, the type and usefulness of self-organisation sometimes works against efforts to mitigate the outcome of the disaster. 4. Social media seems to influence information flow during a disaster. In the past most information flowed in a single direction from government organisation to public, but social media negates this model. The public can diffuse information with ease, but also expect interaction with Government Organisations rather than a simple one-way information flow. These changes have implications for the way government organisations communicate with the public during a disaster. The predominant model for explaining this form of communication, the Crisis and Emergency Risk Communication (CERC), was developed in 2005 before social media achieved widespread popularity. We will present a modified form of the CERC model that integrates social media into the disaster communication cycle, and addresses the ways in which social media has changed communication between the public and government organisations during disasters.