603 results for Indigenous certification mark


Relevance:

20.00%

Publisher:

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing a program as a function of its input data sizes, without actually having to execute the program. Although powerful resource analysis frameworks for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, the accuracy and the reliability of the results still need to be investigated in depth. This thesis tackles this need from four different perspectives.

(1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. This thesis presents two extensions to that approach: the first considers array accesses in addition to object fields; the second handles cases in which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible.

(2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without re-analyzing fragments of code that are not affected by the changes. During software development programs are modified constantly, yet most analyzers still read and analyze the entire program at once, non-incrementally. This thesis presents an incremental resource usage analysis which, after a change in the program, reconstructs the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of the cost functions affected by the change.

(3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. This thesis instead develops a formal framework for verifying the resource guarantees obtained by the analyzers, rather than the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code: COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of this work is to show that the proposed tool cooperation can automatically produce verified resource guarantees.

(4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems; in this model, objects are the units of concurrency and communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the different distributed components separately. This thesis proposes a novel object-sensitive cost analysis which, using the results gathered by a points-to analysis, keeps the cost of the different distributed components separate.
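To make the incremental analysis in point (2) concrete, the sketch below is a minimal, hypothetical illustration (not COSTA's actual algorithm): a worklist fixed point over a call graph in which, after an edit, only the changed methods and their transitive callers are re-analyzed, while cached summaries of unaffected methods are reused. All names here (transfer, summaries, cost_of) are illustrative.

    # Minimal incremental fixed point over a call graph (illustrative sketch only).
    # After a change, only the edited methods and their transitive callers are
    # re-analyzed; cached summaries of unaffected methods are reused.
    def incremental_fixpoint(call_graph, transfer, summaries, changed):
        """call_graph: {method: set of callees}
        transfer:   transfer(method, callee_summaries) -> new summary
        summaries:  cached results from the previous analysis run
        changed:    set of methods whose code was edited"""
        # Invert the call graph so we know which callers depend on each method.
        callers = {}
        for m, callees in call_graph.items():
            for c in callees:
                callers.setdefault(c, set()).add(m)

        worklist = set(changed)
        while worklist:
            m = worklist.pop()
            callee_summaries = {c: summaries.get(c) for c in call_graph.get(m, ())}
            new = transfer(m, callee_summaries)
            if new != summaries.get(m):
                summaries[m] = new
                # Propagate only to methods whose result may depend on m.
                worklist |= callers.get(m, set())
        return summaries

    # Example: after editing "m2", only "m2" and its callers are re-analyzed.
    graph = {"main": {"m1", "m2"}, "m1": {"m2"}, "m2": set()}
    cached = {"main": 10, "m1": 4, "m2": 3}     # results of the previous run
    cost_of = {"main": 1, "m1": 1, "m2": 5}     # hypothetical per-method costs after the edit
    transfer = lambda m, cs: cost_of[m] + sum(v or 0 for v in cs.values())
    print(incremental_fixpoint(graph, transfer, cached, {"m2"}))

In a real analysis, termination of such a fixed point additionally relies on a finite (or widened) abstract domain; the sketch only illustrates how recomputation is restricted to the affected components.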

Relevance:

20.00%

Publisher:

Abstract:

The Safety Certification of Software-Intensive Systems with Reusable Components project, in short SafeCer (www.safecer.eu), targets increased efficiency and reduced time-to-market through composable safety certification of safety-relevant embedded systems. The industrial domains targeted are automotive and construction equipment, avionics, and rail. The companies involved include Volvo Technology, Thales, TTTech, and Intecs, among others. SafeCer includes more than 30 partners in six different countries and has a budget of €25.7 million. A primary objective is to provide support for system safety arguments based on arguments and properties of system components, as well as support for generating the corresponding evidence in a similarly compositional way. By providing support for efficient reuse of certification and stronger links between certification and development, component reuse will be facilitated, and by providing support for reuse across domains the number of components available for reuse will increase dramatically. The resulting efficiency and reduced time-to-market will, together with increased quality and reduced risk, increase competitiveness and pave the way for a cross-domain market for software components qualified for certification.

Relevance:

20.00%

Publisher:

Abstract:

The knowledge/action binomial, understood as a biunivocal (one-to-one) relationship between its two components, is the basis of planning from a postmodern approach. Within this binomial, social communication provides appropriate information, nurtures the knowledge that leads to transformative action, promotes participation, and enhances the community's self-esteem and recognition; deep reflection on action is a source of new knowledge; and communication fosters the community's adoption of the new knowledge through new actions that feed the knowledge/action process as a source of planning. From this approach the Radio Message project was born as a new communication channel, with the aim of offering the Andean indigenous communities of the Cayambe area (Ecuador) a series of multidisciplinary training programs that enable transformative action with a strong effect on the quality of life in these communities and on their importance as social actors. The contents are designed through participatory communication between the training authorities and the communities themselves, analyzing their opportunities and needs. This research analyzes the impact of social media on the development of more than 100 indigenous communities in Cayambe.

Relevance:

20.00%

Publisher:

Abstract:

Addressing the current challenges of many large cities, the institutional framework and territorial planning are two dimensions for improving metropolitan governance. Among the capital regions of southwest Europe, what are the innovations and differences in their ongoing models and processes? This paper proposes an applied investigation to present an analysis of metropolitan governance. Using a comparative case-study method, various elements and interviews are weighed qualitatively for the regions of Madrid, Barcelona, Paris and Lisbon. The conclusions find a tendency towards balance between the efforts in these two dimensions of metropolitan territorial governance, while still registering their different paths: Ile-de-France, for example, has developed good planning initiatives, which now call for some adjustments to the political framework, whereas Madrid has had “less activity” in recent years as a result of its great institutional stability. The Lisbon region perhaps remains in an “intermediate position”, with an evolution that is difficult to predict. According to this argument, however, its processes may lead to gradual improvements in its governance system, along its own path, implementing actions that must respect, in particular, the geography of the territory.

Relevance:

20.00%

Publisher:

Abstract:

This research aims to analyze the development of the Adventist mission in the city of São Paulo in search of a missiological model for urban centers. São Paulo, one of the largest metropolises in the world, has a plural cultural formation shaped not only by the forces of modernization, secularization, globalization and postmodernity: the composition of the city's population also has a plural ethnic genesis. Besides the native indigenous matrix, the white European colonizer and the African slaves, other immigrants, European and Asian, arrived from the beginning of the 19th century onwards. In the first decades of the 20th century, Brazil was the country that received the most immigrants in the world. It is estimated that in the 1920s only a third of the population of the city of São Paulo was Brazilian; the rest were immigrants. Adventism was introduced in São Paulo by immigrant missionaries who first worked with other immigrants before evangelizing and developing the Adventist mission among native Brazilians. In some way, this beginning left marks on the Adventist mission in São Paulo. São Paulo is today the city with the largest total number of Adventists in the world and the only one with ethnic Adventist churches serving five distinct ethnic groups: Japanese, Koreans, Jews, Arabs, and Bolivians/Peruvians. This research investigates the formation of a cultural sensitivity in São Paulo Adventism that has allowed it to dialogue with the cultural plurality of the São Paulo metropolis.

Relevance:

20.00%

Publisher:

Abstract:

Lateral transfer of bacterial plasmids is thought to play an important role in microbial evolution and population dynamics. However, this assumption is based primarily on investigations of medically or agriculturally important bacterial species. To explore the role of lateral transfer in the evolution of bacterial systems not under intensive, human-mediated selection, we examined the association of genotypes at plasmid-encoded and chromosomal loci of native Rhizobium, the nitrogen-fixing symbiont of legumes. To this end, Rhizobium leguminosarum strains nodulating sympatric species of native Trifolium were characterized genetically at plasmid-encoded symbiotic (sym) regions (nodulation AB and nodulation CIJT loci) and a repeated chromosomal locus not involved in the symbiosis with legumes. Restriction fragment length polymorphism analysis was used to distinguish genetic groups at plasmid and chromosomal loci. The correlation between major sym and chromosomal genotypes and the distribution of genotypes across host plant species and sampling location were determined using χ2 analysis. In contrast to findings of previous studies, a strict association existed between major sym plasmid and chromosomal genetic groups, suggesting a lack of successful sym plasmid transfer between major Rhizobium chromosomal types. These data indicate that previous observations of sym plasmid transfer in agricultural settings may seriously overestimate the rates of successful conjugation in systems not impacted by human activities. In addition, a nonrandom distribution of Rhizobium genotypes across host plant species and sampling site demonstrates the importance of both factors in shaping Rhizobium population dynamics.
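As a hypothetical illustration of the χ2 association test mentioned above (the counts below are invented, not data from this study), the sketch checks whether major sym plasmid genotypes are distributed independently of chromosomal genetic groups; a very small p-value rejects independence, consistent with the strict plasmid/chromosome association reported here.

    # Hypothetical chi-square test of independence between chromosomal
    # genetic groups (rows) and major sym plasmid genotypes (columns).
    # The counts are made up for illustration only.
    import numpy as np
    from scipy.stats import chi2_contingency

    counts = np.array([
        [34,  2,  1],
        [ 3, 28,  0],
        [ 1,  0, 21],
    ])

    chi2, p_value, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3g}")
    # A small p-value rejects independence, i.e. plasmid genotypes track
    # chromosomal types, consistent with little successful sym plasmid
    # transfer between chromosomal backgrounds.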

Relevance:

20.00%

Publisher:

Abstract:

Hepatitis B viruses (HBV) and related viruses, classified in the Hepadnaviridae family, are found in a wide variety of mammals and birds. Although the chimpanzee has been the primary experimental model of HBV infection, this species has not been considered a natural host for the virus. Retrospective analysis of 13 predominantly wild-caught chimpanzees with chronic HBV infection identified a unique chimpanzee HBV strain in 11 animals. Nucleotide and derived amino acid analysis of the complete HBV genome and the gene coding for the hepatitis B surface antigen (S gene) identified sequence patterns that could be used to reliably identify chimpanzee HBV. This analysis indicated that chimpanzee HBV is distinct from known human HBV genotypes and is closely related to HBVs previously isolated from a chimpanzee, gibbons, gorillas, and orangutans.