921 results for "meaning of a norm"
Abstract:
Free riders and holdouts are market failures that potentially impede the completion of otherwise beneficial transactions. The key difference is that the free rider problem is a demand-side externality that requires taxation to compel payment for a public good, while the holdout problem is a supply-side externality that requires eminent domain to force the sale of land for large-scale projects. This paper highlights the distinction between these two problems and uses the resulting insights to clarify the meaning of the public use requirement of the Fifth Amendment takings clause.
Abstract:
How organisms may adapt to rising global temperatures is uncertain, but concepts can emerge from studying adaptive physiological trait variations across existing spatial climate gradients. Many ectotherms, particularly fish, have evolved increasing genetic growth capacities with latitude (i.e. countergradient variation (CnGV) in growth), which are thought to be an adaptation primarily to strong gradients in seasonality. In contrast, evolutionary responses to gradients in mean temperature are often assumed to involve an alternative mode, 'thermal adaptation'. We measured thermal growth reaction norms in Pacific silverside populations (Atherinops affinis) occurring across a weak latitudinal temperature gradient with invariant seasonality along the North American Pacific coast. Instead of thermal adaptation, we found novel evidence for CnGV in growth, suggesting that CnGV is a ubiquitous mode of reaction-norm evolution in ectotherms even in response to weak spatial and, by inference, temporal climate gradients. A novel, large-scale comparison between ecologically equivalent Pacific versus Atlantic silversides (Menidia menidia) revealed how closely growth CnGV patterns reflect their respective climate gradients. While steep growth reaction norms and increasing growth plasticity with latitude in M. menidia mimicked the strong, highly seasonal Atlantic coastal gradient, shallow reaction norms and much smaller, latitude-independent growth plasticity in A. affinis resembled the weak Pacific latitudinal temperature gradient.
Abstract:
In an attempt to establish criteria for obtaining reliable K-Ar dates, conventional K-Ar studies of several Deep Sea Drilling Project sites were undertaken. K-Ar dates of these rocks may be subject to inaccuracies as the result of sea-water alteration. Inaccuracies may also result from the presence of excess radiogenic 40Ar trapped in rapidly cooled rocks at the time of their formation. The results obtained for DSDP Leg 34 basalts indicate that lowering of K-Ar dates, which is related to potassium addition by weathering, is a major cause of uncertainty in obtaining reliable K-Ar dates for deep-sea rocks. It could not be determined if the potassium addition to the basalts occurred at the time of formation, t_o, or continuously from t_o to the present. Calculations show that sediment cover is not a significant barrier to the diffusion of potassium into the basalt. 40Ar loss contributes, at least in part, to the lowering of the K-Ar date in rocks that have added potassium. The meaning of the K-Ar results obtained for DSDP Legs 35 and 2 basalts could not be unambiguously established. Because of the problems involved, caution must be used in interpreting the meaning of conventional K-Ar dates for deep-sea rocks.
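As background to why potassium addition lowers a K-Ar date, the conventional age equation can be sketched numerically. This is a minimal illustration, not a calculation from the paper: the decay constants are the standard values, while the 40 Myr age and the 50% potassium gain are assumed figures chosen only to show the direction and size of the effect.

import math

LAMBDA_TOTAL = 5.543e-10  # total 40K decay constant, 1/yr (standard value)
LAMBDA_EC = 0.581e-10     # decay constant of the 40K -> 40Ar branch, 1/yr

def k_ar_date(ar40_rad, k40):
    """Conventional K-Ar date from radiogenic 40Ar and 40K (same units)."""
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * (ar40_rad / k40)) / LAMBDA_TOTAL

t_true = 40e6  # assumed true eruption age, 40 Myr
k40 = 1.0      # arbitrary initial 40K abundance
ar40 = k40 * (LAMBDA_EC / LAMBDA_TOTAL) * (math.exp(LAMBDA_TOTAL * t_true) - 1.0)

print(k_ar_date(ar40, k40))        # ~40 Myr: recovers the true age
print(k_ar_date(ar40, k40 * 1.5))  # ~27 Myr: 50% K added by weathering lowers the date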
Abstract:
The “Innovatio Educativa Tertio Millennio” group has spent 10 years developing educational innovation techniques and has now reached the point of teaching the techniques it has developed and sharing them with other groups, which can implement them in their own teaching activities. The UNESCO Chair of Mining and Industrial Heritage has been working on heritage for years, on the one hand teaching the conservation and maintenance of heritage and, on the other, raising awareness of the meaning of heritage, its social value, and how it should be managed effectively. Recently these two groups have begun working together, and as a result the concepts of heritage, its meaning, its value, and how to manage and protect it effectively are being disseminated far more effectively. This work combines dissemination through the internet and radio broadcasting with teaching based on educational innovation: courses, conferences, and seminars, whether face-to-face or on distance platforms.
Abstract:
The well-known Noether theorem in Lagrangian and Hamiltonian mechanics associates symmetries in the evolution equations of a mechanical system with conserved quantities. In this work, we extend this classical idea to problems of non-equilibrium thermodynamics formulated within the GENERIC (General Equations for Non-Equilibrium Reversible-Irreversible Coupling) framework. The geometric meaning of symmetry is reviewed in this formal setting and then utilized to identify possible conserved quantities and the conditions that guarantee their strict conservation. Examples are provided that demonstrate the validity of the proposed definition in the context of finite and infinite dimensional thermoelastic problems.
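For readers unfamiliar with the framework, the GENERIC evolution equation and its degeneracy conditions take the following standard form; this is textbook background (with x the state variables, E the total energy and S the entropy), not the paper's new result:

\dot{x} \;=\; L(x)\,\frac{\delta E}{\delta x} \;+\; M(x)\,\frac{\delta S}{\delta x},
\qquad
L(x)\,\frac{\delta S}{\delta x} = 0,
\qquad
M(x)\,\frac{\delta E}{\delta x} = 0,

with L antisymmetric (reversible dynamics) and M symmetric, positive semi-definite (irreversible dynamics). An observable A(x) then evolves as \dot{A} = \langle \frac{\delta A}{\delta x},\, L\,\frac{\delta E}{\delta x} \rangle + \langle \frac{\delta A}{\delta x},\, M\,\frac{\delta S}{\delta x} \rangle, so A is strictly conserved when both contributions vanish along solutions; the symmetries studied in the paper serve to identify candidates for such A.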
Abstract:
The aim of this paper is to discuss the meaning of five neologisms in the domain of videogames in Spanish: título, aventura, personaje, plataforma, and rol. Our study focuses on a special type of neologism, since the Spanish terms we deal with here are not strictly new words; they are what have been called sense neologisms or neosemanticisms, that is, old words taking on a new sense in a different domain. These words were identified as new concepts after a process of analysis based on contextual evidence. This study of neology is based on the analysis of a corpus of press articles evaluating videogames published by the Spanish newspaper El País from 1998 to 2008. The analysis of the instances of use of domain-specific terms in the corpus revealed that they acquired new senses different from those they have in other domains where they are also used. The paper explains the process of discovering the specialized meaning these words have developed in the domain of videogames and how the analysis of collocational behavior helps both in discovering the new sense and in designing the definition provided.
Abstract:
No reference to domotics can be found in the first half of the 20th century. The best-known authors, and those who have documented this discipline, place its origin in the 1970s, when X10 technology began to be used, but it was not until 1988 that the Larousse Encyclopedia decided to include a definition of "Smart Building". Furthermore, even nowadays there is no single widely accepted definition, and for that reason many other expressions, namely "Intelligent Buildings", "Domotics", "Digital Home" or "Home Automation", have appeared to describe automated buildings and homes. The lack of a clear definition of "Smart Buildings" not only hinders the development of a common international framework for research in this field, but also creates insecurity in the potential users of these buildings. That is to say, users do not know what this kind of building offers, which hinders the dissemination of a culture of building automation in society. Thus, the main purpose of this paper is to propose a definition of the expression "Smart Buildings" that satisfactorily describes the meaning of this discipline. To achieve this aim, a thorough review of the origin of the term itself and of the historical background before the emergence of the phenomenon of domotics was conducted, followed by a critical discussion of existing definitions of "Smart Buildings" and other similar terms. The scope of each definition has been analyzed, inaccuracies have been discarded, and commonalities have been compared. Throughout the discussion, definitions have been found that bring the term "Smart Buildings" close to disciplines such as computer science, robotics and telecommunications. However, there are also many other definitions that emphasize, in a more abstract way, the role of these new buildings in society and the future of mankind.
Abstract:
Assuring the sustainability of quality in photovoltaic rural electrification programmes involves enhancing the reliability of the components of solar home systems as well as the characterization of the overall programme cost structure. Batteries and photovoltaic modules have a great impact on both the reliability and the cost assessment, the battery being the weakest component of the solar home system and consequently the most expensive element of the programme. The photovoltaic module, despite being the most reliable component, has a significant impact cost-wise on the initial investment, even at current market prices. This paper focuses on the in-field testing of both batteries and photovoltaic modules working under real operating conditions within a sample of 41 solar home systems belonging to a large photovoltaic rural electrification programme with more than 13,000 installed photovoltaic systems. Different reliability parameters such as lifetime have been evaluated, taking into account different factors, for example energy consumption rates or the manufacturing quality of batteries. A degradation model has been proposed relating loss of capacity to time of operation. The user–solar home system pairing is also analysed in order to understand the meaning of battery lifetime in rural electrification.
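To make the notion of battery lifetime concrete, here is a minimal sketch of a linear capacity-fade model. It is illustrative only: the paper's proposed degradation model is not reproduced here, and the 80% end-of-life threshold and 5% annual fade rate are assumed parameters.

def remaining_capacity(c_nominal, fade_per_year, years):
    """Linear capacity-fade sketch: C(t) = C0 * (1 - r*t)."""
    return c_nominal * max(0.0, 1.0 - fade_per_year * years)

def lifetime_years(fade_per_year, eol_fraction=0.8):
    """Years until capacity drops to the end-of-life fraction of nominal."""
    return (1.0 - eol_fraction) / fade_per_year

# e.g. a battery losing 5% of capacity per year reaches 80% after 4 years
print(lifetime_years(0.05))  # 4.0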
Abstract:
The calculation of the effective delayed neutron fraction, βeff, with Monte Carlo codes is a complex task due to the requirement of properly considering the adjoint weighting of delayed neutrons. Nevertheless, several techniques have been proposed to circumvent this difficulty and obtain accurate Monte Carlo results for βeff without the need to explicitly determine the adjoint flux. In this paper we review some of these techniques; namely, we have analyzed two variants of what we call the k-eigenvalue technique and other techniques based on different interpretations of the physical meaning of the adjoint weighting. To test the validity of all these techniques we have implemented them in the MCNPX code and benchmarked them against a range of critical and subcritical systems for which either experimental or deterministic values of βeff are available. Furthermore, several nuclear data libraries have been used in order to assess the impact of nuclear data uncertainty on the calculated value of βeff.
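As an illustration of one common variant of the k-eigenvalue technique (the so-called prompt method; the paper's exact variants may differ), βeff can be estimated from two eigenvalue calculations, one with all neutrons and one with delayed neutron production suppressed (in MCNPX this is typically done via the TOTNU card). The numbers below are assumed, illustrative values:

def beta_eff_prompt_method(k_total, k_prompt):
    """Estimate beta_eff from two k-eigenvalue runs: one with total nubar
    (prompt + delayed) and one with prompt nubar only."""
    return 1.0 - k_prompt / k_total

# e.g. a critical thermal system with k_prompt from a prompt-only run:
print(beta_eff_prompt_method(1.00000, 0.99320))  # ~0.0068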
Abstract:
Ecology shows us not only environmental problems; it shows that we need a new balance and harmony between individuals, beings, communities and Nature. We need a new contract with Nature, according to Serres, and a new ethics for our lives, according to Guattari. Environmental ethics has given us a universal and supra-generational vision of the management of Nature and, as a consequence, a new way to construct our 'second' nature, which is architecture. What does this new ethics demand as essential for architecture? This is a critical moment to reconsider the aims of architecture, because the 'eco' is producing significant changes in it. Does the post-ecological era imply a particular ethics, that is, one concerning its ends and means? Why, for what, for whom, and how should we make the architecture of our time? This is the time to approach the eco-architectural discourse critically and to question the current boundaries of architecture itself: where is eco-architecture going? The current development of environmental knowledge is essentially technical and utilitarian, but is the challenge only technical? Is the sum of environmental, social, economic and cultural aspects enough to define it? Are there clues that can give an ethical dimension to this technical-empirical approach? Do we know what we are doing when we apply this knowledge? And above all, what is the meaning of what we are doing?
Exploring this subject, this thesis makes a statement: in accordance with our current knowledge of Nature, the architecture of our time must reconsider its ends and means, since environmental ethics is defining new objectives. To support this, the thesis analyzes what the relationships between Ethics, Nature and Architecture are like nowadays, which will provide clues as to which ethical criteria (as to ends and means) must define the architecture of the age of ecology.
Abstract:
There are many open issues that must be addressed before the replication process can be successfully formalized in empirical software engineering research. We define replication as the deliberate repetition of the same empirical study for the purpose of determining whether the results of the first experiment can be reproduced. This definition would appear at first glance to be sound. However, it needs several clarifications that have not yet been forthcoming in software engineering: What is the exact meaning of "the same empirical study"? Namely, how similar must an experiment be to the baseline study for it to be considered a replication? What is the exact meaning of "a result being reproduced"? Namely, how similar must a result be to the result of the baseline study for it to be considered reproduced? These and other methodological questions need to be researched and tailored for empirical software engineering.
Abstract:
Directed hypergraphs are an intuitive modelling formalism that has been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. Directed hypergraphs have also been presented as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot naturally be modelled as binary relations; in this context they are known as hyper-networks. A directed hypergraph is a generalization of a directed graph suitable for representing many-to-many relationships: while an edge in a directed graph defines a relation between two nodes of the graph, a hyperedge in a directed hypergraph defines a relation between two sets of nodes. Strong connectivity is an equivalence relation that induces a partition of the set of nodes of a directed hypergraph into strongly-connected components. These components can be collapsed into single nodes, so the size of the original hypergraph can be significantly reduced if the strongly-connected components have many nodes. This approach can contribute to a better understanding of how the nodes of a hypergraph are connected, in particular when the hypergraphs are large. In the case of directed graphs, there are efficient algorithms for computing the strongly-connected components of large graphs. For instance, it has been shown that the macroscopic structure of the World Wide Web can be represented as a "bow-tie" diagram in which more than 70% of the nodes are distributed into three large sets, one of which is a large strongly-connected component. This particular structure has also been observed in complex networks in other fields, such as biology. Similar studies could not be conducted for directed hypergraphs because no algorithm existed for computing the strongly-connected components of this kind of hypergraph.
In this thesis, we investigate how to compute the strongly-connected components of directed hypergraphs. We present two new algorithms and we show their correctness and computational complexity. One of these algorithms is inspired by Tarjan's algorithm for directed graphs; the second follows a simple approach based on the fact that two strongly-connected nodes reach exactly the same set of nodes, i.e. their reachable sets coincide. Both algorithms were empirically evaluated to compare their running times. To this end, we produced a selection of randomly generated directed hypergraphs inspired by well-known random graph models such as Erdős–Rényi, Newman–Watts–Strogatz and Barabási–Albert. Several optimizations of both algorithms were implemented and analyzed in the thesis; in particular, collapsing the strongly-connected components of the directed graph obtained by removing certain complex hyperedges from the original directed hypergraph notably improves the running times of the algorithms on several of the hypergraphs used in the evaluation.
Besides the application examples mentioned earlier, directed hypergraphs have also been employed in the field of knowledge representation. In particular, they have been used to compute the modules of an ontology. An ontology is defined as a collection of axioms that provides a formal specification of a set of terms and their relationships, and a module is a subset of an ontology that completely captures the meaning of certain terms as defined in the ontology. We focus on modules computed using the notion of syntactic locality. As ontologies can be very large, the computation of modules facilitates the reuse and maintenance of these ontologies. Analysing all modules of an ontology, however, is in general not feasible, as the number of modules grows exponentially with the number of terms and axioms of the ontology. Nevertheless, the modules can be succinctly represented using the atomic decomposition of an ontology: an ontology can be partitioned into atoms, which are maximal sets of axioms that co-occur in every module, and the atomic decomposition is then defined as a directed graph in which each node corresponds to an atom and each edge represents a dependency relation between two atoms. In this thesis we introduce the notion of an axiom dependency hypergraph, which generalizes the atomic decomposition of an ontology: a module in the ontology corresponds to a connected component in the hypergraph, and the atoms of the ontology to its strongly-connected components. We apply our algorithms for directed hypergraphs to axiom dependency hypergraphs and in this manner compute the atoms of an ontology. To demonstrate the viability of this approach, we have implemented the algorithms in the application HyS, which computes the modules of ontologies and calculates their atomic decomposition. The thesis provides an experimental evaluation of HyS with a selection of large and prominent biomedical ontologies, most of which are available in the NCBO Bioportal; HyS outperforms state-of-the-art implementations in the tasks of extracting modules and computing the atomic decomposition of these ontologies.
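To illustrate the second, reachability-based approach, here is a minimal Python sketch. It assumes hyperedges whose entire tail must be reached before the head becomes reachable (B-connectivity), and it is the naive, quadratic formulation of the idea, not the optimized implementation in HyS:

def b_reach(source, hyperedges):
    """Forward closure from a single node: a hyperedge (tail, head)
    fires once every node in its tail has been reached."""
    reached = {source}
    changed = True
    while changed:
        changed = False
        for tail, head in hyperedges:
            if tail <= reached and not head <= reached:
                reached |= head
                changed = True
    return frozenset(reached)

def strongly_connected_components(nodes, hyperedges):
    """Group nodes by their forward closure: two nodes are strongly
    connected exactly when their closures coincide."""
    groups = {}
    for v in nodes:
        groups.setdefault(b_reach(v, hyperedges), set()).add(v)
    return list(groups.values())

# Toy hypergraph: the hyperedge ({a, b} -> {c}) needs both a and b.
nodes = {"a", "b", "c", "d"}
hyperedges = [
    (frozenset({"a"}), frozenset({"b"})),
    (frozenset({"b"}), frozenset({"a"})),
    (frozenset({"a", "b"}), frozenset({"c"})),
    (frozenset({"c"}), frozenset({"d"})),
]
print(strongly_connected_components(nodes, hyperedges))
# e.g. [{'a', 'b'}, {'c'}, {'d'}] (set order may vary)

The grouping step is justified because each closure is the smallest set containing its source that is closed under firing hyperedges, so mutual reachability between two nodes holds exactly when their closures are equal.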