922 results for Bull Run, 2nd Battle of, Va., 1862.
Abstract:
Purpose: Leadership positions are still stereotyped as male, especially in male-dominated fields such as STEM. Women in such positions therefore run the risk of being evaluated less favorably than men. Our study investigates how female and male leaders of existing teams (an engineering project) are evaluated, and how these evaluations change over time. Design/Methodology: Participants worked in 45 teams to develop specific engineering projects. Evaluations of 45 leaders (33% women) by 258 team members (39% women) were analyzed, that is, the leaders' self-evaluations and their evaluation by team members. Results: Although female and male leaders did not differ in their self-evaluations at the beginning of the project, female leaders evaluated themselves more favorably over time. Team members, by contrast, evaluated female leaders more favorably than male leaders at the beginning of the project; these gender differences disappeared over time. Limitations: The study should be replicated in a non-student sample. Implications: The results show that female leaders entering a male-dominated field (engineering) are evaluated more favorably by team members than male leaders at the beginning of the team work, in line with the 'shifting standards model' (Biernat & Fuegen, 2001). While the initial impression formation of female and male leaders is influenced by category membership, its impact decreases over time as a consequence of individualization (Fiske & Neuberg, 1990); this results in similar evaluations over time. Originality: To our knowledge, this is the first study to systematically examine change over time in the evaluation of female and male leaders in a natural setting.
Abstract:
Institutional Review Boards (IRBs) are the primary gatekeepers for the protection of ethical standards in federally regulated research on human subjects in this country. This paper focuses on the general, broad measures that may be instituted or enhanced to exemplify a "model IRB". It does so by examining the current regulatory standards of federally regulated IRBs (not private or commercial boards) and how many of those standards have been found either inadequate or not generally understood or followed. The analysis includes suggestions on how to bring about changes that would make the IRB process more efficient, less subject to litigation, and supported by standardized educational protocols for members. The paper also considers how to provide better oversight for multi-center research, increased centralization of IRBs, utilization of Data Safety Monitoring Boards when necessary, payment for research protocol review, voluntary accreditation, and the institution of evaluation/quality-assurance programs.

This is a policy study using secondary analysis of publicly available data. The research therefore draws on scholarly medical/legal journals; web information from the Department of Health and Human Services, the Food and Drug Administration, the Office of the Inspector General, and accreditation programs; law review articles; and the regulations applicable to the relevant portions of the paper.

Two issues are consistently cited in the literature as major concerns: first, the need for basic, standardized educational requirements across all IRBs and their members, and second, much stricter and better-informed management of continuing research. There is no federally regulated formal education system currently in place for IRB members, except for certain NIH-based trials. IRBs are also not keeping up with research once a study has begun; although they are required to do so, it does not appear to be a high priority. This is the area most in danger of increased litigation. Other measures, such as voluntary accreditation and outcomes evaluation, are slowly gaining steam as the processes become more available and more sought after, as with JCAHO accreditation of hospitals.

Adopting the principles discussed in this paper should promote better use of a local IRB's time, money, and expertise in protecting the vulnerable populations in their care. Without further improvements to the system, there is concern that private and commercial IRBs will attempt to create a monopoly on much of the clinical research in the future, as they are not as heavily regulated and can therefore offer companies quicker and more convenient reviews. IRBs need to consider the advantages of charging for their unique and important services as a cost of doing business. More importantly, there must be a minimum standard of education for all IRB members in the ethical standards of human research, and a greater emphasis placed on the follow-up of ongoing research, as this is the most critical time for study participants and may soon become the largest area of litigation. Additionally, there should be a centralized IRB for multi-site trials, or a study website carrying important information affecting the trial in real time. Standards and metrics should be developed to assess the performance of IRBs for quality assurance and outcome evaluation. The boards should not be content to run the business of human subjects' research without determining how well that function is actually being carried out. It is important that federally regulated IRBs provide excellence in human research and promote those values most important to the public at large.
Abstract:
This study demonstrated that accurate, short-term forecasts of Veterans Affairs (VA) hospital utilization can be made using the Patient Treatment File (PTF), the inpatient discharge database of the VA. Accurate short-term forecasts of two years or less can reduce required inventory levels, improve allocation of resources, and are essential for better financial management; all are necessary achievements in an era of cost containment. Six years of non-psychiatric discharge records were extracted from the PTF and used to calculate four indicators of VA hospital utilization: average length of stay, discharge rate, multi-stay rate (a measure of readmissions), and days of care provided. National and regional levels of these indicators were described and compared for fiscal year 1984 (FY84) through FY89 inclusive. Using the observed levels of utilization for the 48 months between FY84 and FY87, five techniques were used to forecast monthly levels of utilization for FY88 and FY89. Forecasts were compared to the observed levels of utilization for these years. Monthly forecasts were also produced for FY90 and FY91. Forecasts for days of care provided were not produced: current inpatients with very long lengths of stay contribute a substantial amount of this indicator, so it cannot be accurately calculated. During the six-year period between FY84 and FY89, average length of stay declined substantially, both nationally and regionally. The discharge rate was relatively stable, while the multi-stay rate increased slightly during this period. FY90 and FY91 forecasts show a continued decline in the average length of stay, while the discharge rate is forecast to decline slightly and the multi-stay rate to increase very slightly. Over a 24-month-ahead horizon, all three indicators were forecast within a 10 percent average monthly error; the 12-month-ahead forecast errors were slightly lower. Average length of stay was less easily forecast, while the multi-stay rate was the easiest indicator to forecast. No single technique performed significantly better than the others, as determined by the Mean Absolute Percent Error (MAPE), a standard measure of error. However, Autoregressive Integrated Moving Average (ARIMA) models performed well overall and are recommended for short-term forecasting of VA hospital utilization.
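As a minimal illustration of the kind of short-term forecasting the study recommends (not its actual PTF data, model order, or software), the sketch below fits an ARIMA model to a synthetic monthly utilization series and scores a 24-month-ahead forecast with the Mean Absolute Percent Error; the series, the (1, 1, 1) order, and the fiscal-year split are assumptions made only for the example.

```python
# Minimal sketch (not the study's actual PTF data or model specification):
# fit an ARIMA model to a monthly utilization indicator and score a
# 24-month-ahead forecast with the Mean Absolute Percent Error (MAPE).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
months = pd.date_range("1983-10-01", periods=72, freq="MS")   # FY84-FY89, illustrative
# Synthetic "average length of stay" series with a gentle downward trend.
alos = 20 - 0.05 * np.arange(72) + rng.normal(0, 0.4, 72)
series = pd.Series(alos, index=months)

train, test = series.iloc[:48], series.iloc[48:]              # FY84-FY87 vs FY88-FY89
model = ARIMA(train, order=(1, 1, 1))                         # order chosen for illustration only
fitted = model.fit()
forecast = fitted.forecast(steps=len(test))

mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
print(f"24-month-ahead MAPE: {mape:.1f}%")
```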
Abstract:
The aim of the present study was to evaluate the influence of different light qualities, especially ultraviolet radiation (UVR), on the dynamics of volatile halogenated organic compounds (VHOCs) at the sea surface. Short-term experiments were conducted with floating gas-tight mesocosms of different optical qualities. Six halocarbons (CH3I, CHCl3, CH2Br2, CH2ClI, CHBr3 and CH2I2), known to be produced by phytoplankton, were measured together with a variety of biological and environmental variables in the coastal southern Baltic Sea and in the Raunefjord (North Sea). These experiments showed that ambient levels of UVR have no significant influence on VHOC dynamics in these natural systems. We attribute this to the low radiation doses that phytoplankton cells receive in a typical turbulent surface mixed layer. The VHOC concentrations were influenced by their production and removal processes, but they were not correlated with the biological or environmental parameters investigated. Diatoms were most likely the dominant biogenic source of VHOCs in the Baltic Sea experiment, whereas in the Raunefjord experiment macroalgae probably contributed strongly to the production of VHOCs. The variable stable carbon isotope signatures (δ13C values) of bromoform (CHBr3) also indicate that different autotrophic organisms were responsible for CHBr3 production in the two coastal environments. In the Raunefjord, despite strong daily variations in CHBr3 concentration, the carbon isotopic ratio was fairly stable with a mean value of -26 per mil. During the declining spring phytoplankton bloom in the Baltic Sea, the δ13C values of CHBr3 were enriched in 13C and showed noticeable diurnal changes (-12 ± 4 per mil). These results show that isotope signature analysis is a useful tool for studying both the origin and the dynamics of VHOCs in natural systems.
Abstract:
In this paper, abstract interpretation algorithms are described for computing the sharing as well as the freeness information about the run-time instantiations of program variables. An abstract domain is proposed which accurately and concisely represents combined freeness and sharing information for program variables. Abstract unification and all other domain-specific functions for an abstract interpreter working on this domain are presented, and these functions are illustrated with an example. The importance of inferring freeness is stressed by showing (1) the central role it plays in non-strict goal independence, and (2) the improved accuracy it brings to the analysis of sharing information when both are computed together. Conversely, it is shown that keeping accurate track of sharing allows more precise inference of freeness, resulting in an overall much more powerful abstract interpreter.
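As a rough illustration of what a combined sharing-and-freeness abstraction can look like (a deliberately crude sketch, not the paper's abstract domain or its abstract unification algorithm), the snippet below represents sharing as a set of sharing groups and freeness as a set of variable names, and applies one conservative update rule for a binding X = f(Y).

```python
# Highly simplified sketch (illustration only): sharing is a set of "sharing groups"
# (sets of program variables that may be bound to terms containing a common run-time
# variable); freeness is the set of variables known to be unbound.

def abstract_unify_var_term(sharing, freeness, x, term_vars):
    """Conservatively update (sharing, freeness) for the binding X = f(...term_vars...)."""
    # If X was not known to be free before the binding, the term variables may get
    # further instantiated too, so they lose their freeness; X is now bound to a term.
    if x not in freeness:
        freeness = freeness - term_vars
    freeness = freeness - {x}
    # Any sharing group mentioning X may now share with any group mentioning a term
    # variable: join them pairwise (a coarse stand-in for a real amalgamation step).
    x_groups = {g for g in sharing if x in g}
    t_groups = {g for g in sharing if g & term_vars}
    joined = {frozenset(gx | gt) for gx in x_groups for gt in t_groups}
    untouched = sharing - x_groups - t_groups   # groups not involved keep their value
    return untouched | joined, freeness

sharing = {frozenset({"X"}), frozenset({"Y"}), frozenset({"Z"})}
freeness = {"X", "Y", "Z"}
sharing, freeness = abstract_unify_var_term(sharing, freeness, "X", {"Y"})
print(sharing)    # X and Y now share; Z stays independent (set print order may vary)
print(freeness)   # X is no longer free; Y stays free because X was free before the binding
```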
Abstract:
Although the sequential execution speed of logic programs has been greatly improved by the concepts introduced in the Warren Abstract Machine (WAM), parallel execution represents the only way to increase this speed beyond the natural limits of sequential systems. However, most proposed parallel logic programming execution models lack the performance optimizations and storage efficiency of sequential systems. This paper presents a parallel abstract machine which is an extension of the WAM and is thus capable of supporting AND-parallelism without giving up the optimizations present in sequential implementations. A suitable instruction set, which can be used as a target by a variety of logic programming languages, is also included. Special instructions are provided to support a generalized version of "Restricted AND-Parallelism" (RAP), a technique which reduces the overhead traditionally associated with the run-time management of variable binding conflicts to a series of simple run-time checks that select one of a set of compiled execution graphs.
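To make the idea of run-time checks selecting among compiled execution graphs concrete, here is a small sketch (illustration only; the term encoding, the ground/indep tests, and the conditional graph are assumptions, not the paper's instruction set): a compiled conditional graph expression reduces to cheap groundness and independence checks that choose between parallel and sequential execution of two goals.

```python
# Minimal sketch of the Restricted AND-Parallelism idea: the compiler emits a
# conditional execution graph, and at run time a couple of cheap checks pick a branch.

def vars_of(term):
    """Collect variable names of a tuple-encoded term, e.g. ('f', 'X', ('g', 'Y'))."""
    if isinstance(term, str) and term[:1].isupper():
        return {term}
    if isinstance(term, tuple):
        return set().union(*(vars_of(a) for a in term[1:])) if len(term) > 1 else set()
    return set()

def ground(term):            # ground(X): the term contains no unbound variables
    return not vars_of(term)

def indep(t1, t2):           # indep(X, Y): the two terms share no variables
    return not (vars_of(t1) & vars_of(t2))

def run_cge(x, y, z, run_parallel, run_sequential):
    # Compiled form of:  ( ground(X), indep(Y, Z) -> goal1 & goal2 ; goal1, goal2 )
    if ground(x) and indep(y, z):
        return run_parallel()      # bindings cannot conflict: safe to run goals in parallel
    return run_sequential()        # otherwise fall back to ordinary sequential execution

print(run_cge(("f", "a"), "Y", "Z",
              lambda: "parallel", lambda: "sequential"))   # distinct free Y, Z -> "parallel"
```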
Abstract:
The Internet of Things (IoT), as part of the Future Internet, has become one of the main research topics nowadays, partly because of the attention society is paying to the development of a particular kind of services (smart metering, smart grids, eHealth, etc.), and partly because of recent business forecasts that place some players, such as telecom operators (which are desperately seeking new opportunities), at the forefront pushing interrelated technologies like Machine-to-Machine (M2M) communications. In this context, a large number of research activities are taking place worldwide at different levels: sensor network communications, information processing, big-data storage, semantics, service-level architectures, etc. All of them, in isolation, are reaching a level of maturity that makes the realization of the Internet of Things look less like a dream and more like a tangible goal. However, the aforementioned services cannot wait to be developed until holistic research actions deliver complete solutions; it is important to produce intermediate results that avoid vertical solutions tailored to particular deployments. In the present work, we focus on the creation of a service-level platform intended to facilitate, on the one hand, the integration of heterogeneous and geographically dispersed Sensor and Actuator Networks (SANs) and, on the other, the development of horizontal services using those networks and the information they provide. This enabler will be used for horizontal service development and for IoT experimentation. Prior to the definition of the platform, we carried out an extensive study covering not only research works and projects but also standardization activities. The results can be summarized in the following assertions: a) the data models defined by the Open Geospatial Consortium (OGC®) Sensor Web Enablement (SWE™) group currently represent the most complete solution for describing SANs and their observations; b) the OGC interfaces, despite limitations that require changes and extensions, could be used as the basis for accessing sensors and data; c) Next Generation Networks (NGN) offer a good substrate that facilitates the integration of SANs and the development of services. Consequently, a new service-layer platform, called Ubiquitous Sensor Networks (USN), has been defined in this Thesis to help fill the gaps identified above. The main highlights of the proposed USN Platform are: a) from an architectural point of view, it follows a two-layer approach (Enabler and Gateway) similar to other enablers that run on top of NGN (such as OMA Presence); b) its data models and interfaces are based on the OGC SWE standards; c) it is integrated with NGN but can also be used without it over open IP infrastructures; d) its main functions are sensor discovery, observation storage, publish-subscribe-notify, homogeneous remote execution, security, data dictionary handling, monitoring facilities, authorization support, protocol conversion utilities, synchronous and asynchronous interactions, streaming support, and basic resource arbitration. To demonstrate the functionality that the proposed USN Platform can offer to future IoT scenarios, experimental results are presented from three real-life, small-scale proofs of concept (smart metering, smart places, and environmental monitoring) and a study on semantics (an in-vehicle information system). Furthermore, the proposed USN Platform is currently being used as an enabler to develop both experimentation and real services in the SmartSantander EU project (which aims at integrating around 20,000 IoT devices).
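As an illustration of the publish-subscribe-notify and observation-storage functions listed above (a minimal sketch under assumed names, not the USN Platform's actual API or the OGC SWE schemas), the snippet below models an observation loosely after the procedure/observed-property/time/result structure of OGC Observations & Measurements and routes published observations to subscribers.

```python
# Minimal sketch, not the USN Platform's actual API: it only illustrates the
# publish-subscribe-notify pattern over observations modelled loosely after OGC SWE.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, Dict, List

@dataclass
class Observation:
    procedure: str          # sensor (or SAN gateway) that produced the value
    observed_property: str  # e.g. "air_temperature"
    phenomenon_time: datetime
    result: float

class UsnEnablerSketch:
    def __init__(self):
        self._store: List[Observation] = []                              # observation storage
        self._subs: Dict[str, List[Callable[[Observation], None]]] = {}  # property -> callbacks

    def subscribe(self, observed_property: str, callback: Callable[[Observation], None]) -> None:
        self._subs.setdefault(observed_property, []).append(callback)

    def publish(self, obs: Observation) -> None:
        self._store.append(obs)                              # persist for later queries
        for notify in self._subs.get(obs.observed_property, []):
            notify(obs)                                      # a real enabler would deliver asynchronously

enabler = UsnEnablerSketch()
enabler.subscribe("air_temperature", lambda o: print(f"notify from {o.procedure}: {o.result} degC"))
enabler.publish(Observation("sensor-42", "air_temperature",
                            datetime.now(timezone.utc), 21.5))
```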
Abstract:
It is well known that winter chilling is necessary for the flowering of temperate trees, and the chilling requirement is a criterion for choosing a species or variety for a given location. Chemical products can also be used to reduce chilling-hour needs, but they make production more expensive. This study first analysed the observed values of chilling hours for some representative agricultural locations in Spain over the last three decades and their projected changes under climate change scenarios. Chilling is usually measured and calculated as chilling hours, and different methods have been used to calculate them (e.g. Richardson et al., 1974, among others) according to the species considered. For our purposes, the North Carolina method (Shaltout and Unrath, 1983) was applied to apples, the Utah method (Richardson et al., 1974) to peach and grapevine, and the approach of De Melo-Abreu et al. (2004) to olive trees. The influence of climate change on temperate trees was studied by calculating projections of chilling hours with climate data from Regional Climate Models (RCMs) at high resolution (25 km) from the European project ENSEMBLES (http://www.ensembles-eu.org/). These projections allow analysing the modelled variations of chilling hours between the second half of the 20th century and the first half of the 21st century at the study locations.
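To make the chilling-hours idea concrete, the sketch below counts accumulated chill from hourly temperatures using a classic chilling-hours rule (hours between 0 and 7.2 °C) and a Utah-style weighted accumulation. The band limits and weights in the Utah-style function are indicative placeholders only; the exact values must be taken from Richardson et al. (1974), and the species-specific methods cited above differ in their details.

```python
# Minimal sketch, for illustration only: accumulating chill from hourly temperatures.
from typing import Iterable

def chilling_hours(hourly_temps_c: Iterable[float]) -> int:
    """Classic chilling-hours count: one unit per hour with 0 degC < T <= 7.2 degC."""
    return sum(1 for t in hourly_temps_c if 0.0 < t <= 7.2)

def utah_chill_units(hourly_temps_c: Iterable[float]) -> float:
    """Utah-style accumulation: partial credit near an optimum band, negative credit for warm hours.
    Band limits below are illustrative; use the published values from Richardson et al. (1974)."""
    total = 0.0
    for t in hourly_temps_c:
        if 2.5 <= t <= 9.1:                       # most effective band (illustrative limits)
            total += 1.0
        elif 1.5 <= t < 2.5 or 9.1 < t <= 12.4:   # partially effective bands
            total += 0.5
        elif 16.0 < t <= 18.0:                    # warm hours partially cancel accumulated chill
            total -= 0.5
        elif t > 18.0:
            total -= 1.0
    return total

december_day = [3.0, 2.0, 1.0, 0.5, 4.0, 6.0, 8.0, 10.0, 13.0, 15.0, 12.0, 9.0,
                7.0, 6.0, 5.0, 4.0, 3.5, 3.0, 2.5, 2.0, 1.5, 1.0, 0.5, 0.0]
print(chilling_hours(december_day), utah_chill_units(december_day))
```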
Abstract:
This work sets out an innovative methodology that aims to facilitate the implementation and continuous improvement of Social Responsibility. The methodology takes account of strategic-economic, social, and environmental questions and makes it possible to measure the impact of each of these aspects on the stakeholders and on each of the value areas. It can be extrapolated to all kinds of organisations regardless of their size and sector, and it admits scalable models. A feature that sets it apart from other methodologies is that it removes subjectivity from the qualitative aspects and introduces an algorithm to quantify them.
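The abstract does not disclose the quantification algorithm itself, so the snippet below is only a generic, hypothetical illustration of the underlying idea: map qualitative ratings onto a numeric scale and aggregate them with weights per stakeholder to obtain a score per value area. All names, scales, and weights are invented for the example.

```python
# Purely illustrative sketch (not the paper's algorithm): quantify qualitative ratings
# by mapping them to numbers and weighting them per stakeholder and per value area.
ratings = {  # qualitative assessments collected per (stakeholder, value area), assumed data
    ("employees", "social"): "high",
    ("employees", "environmental"): "medium",
    ("shareholders", "strategic-economic"): "high",
    ("community", "environmental"): "low",
}
scale = {"low": 1, "medium": 2, "high": 3}                       # ordinal-to-numeric mapping (assumed)
stakeholder_weight = {"employees": 0.4, "shareholders": 0.35, "community": 0.25}

score_per_area = {}
for (stakeholder, area), level in ratings.items():
    score_per_area[area] = score_per_area.get(area, 0.0) + scale[level] * stakeholder_weight[stakeholder]

for area, score in sorted(score_per_area.items()):
    print(f"{area}: {score:.2f}")
```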
Abstract:
Numerical analysis is a suitable tool in the design of complex reinforced concrete structures under extreme impulsive loadings such as impacts or explosions at close range. Such events may be the result of terrorist attacks. Reinforced concrete is commonly used for buildings and infrastructures. For this reason, the ability to accurately run numerical simulations of concrete elements subjected to blast loading is needed. In this context, reliable constitutive models for concrete are of capital importance. In this research numerical simulations using two different constitutive models for concrete (Continuous Surface Cap Model and Brittle Damage Model) have been carried out using LS-DYNA. Two experimental benchmark tests have been taken as reference. The results of the numerical simulations with the aforementioned constitutive models show different abilities to accurately represent the structural response of the reinforced concrete elements studied.
Abstract:
This thesis focuses on the analysis of two complementary aspects of cybercrime (that is, crime perpetrated over the network for financial gain): the infected machines used to obtain economic benefit from the crime through different actions (for example, click fraud, DDoS, spam), and the server infrastructure used to manage those machines (for example, C&C servers, exploit servers, monetization servers, redirectors).

In the first part we investigate the exposure of victim computers to threats. For this analysis we used the metadata contained in WINE-BR, a Symantec dataset. This dataset contains installation metadata for executable files (for example, file hash, publisher, installation date, file name, file version) from 8.4 million Windows users. We linked this metadata with the vulnerabilities in the National Vulnerability Database (NVD) and the Open Sourced Vulnerability Database (OSVDB) in order to track vulnerability decay over time and observe how quickly users patch their systems and, therefore, their exposure to possible attacks. We identified three factors that can influence the patching activity of victim computers: shared code, user type, and exploits. We present two new attacks against shared code and an analysis of how user knowledge and exploit availability influence patching activity. For the 80 vulnerabilities in our database that affect code shared between two applications, the time between patch releases in the different applications is up to 118 days (with a median of 11 days).

In the second part we propose new active probing techniques to detect and analyze malicious server infrastructures. We leverage active probing to detect malicious servers on the Internet, starting with the analysis and detection of exploit-server operations. We identify as an operation the servers that are controlled by the same people and possibly take part in the same infection campaign. We analyzed a total of 500 exploit servers over a period of one year; 2/3 of the operations had a single server and 1/3 had several servers. We extended the exploit-server detection technique to other server types (for example, C&C servers, monetization servers, redirectors) and achieved Internet-scale probing for the different categories of malicious servers. These new techniques have been incorporated into a new tool called CyberProbe. To detect these servers we developed a novel technique called Adversarial Fingerprint Generation, a methodology for building a unique request-response model that identifies a server family (that is, the type and the operation the server belongs to). Starting from a malware sample and a live server of a given family, CyberProbe can generate a valid fingerprint to detect all the live servers of that family. We performed 11 Internet-wide scans detecting 151 malicious servers; 75% of these 151 servers were unknown to public databases of malicious servers.

Another issue that arises when detecting malicious servers is that some of them may be hidden behind a silent reverse proxy. To measure the prevalence of this network configuration and to improve the capabilities of CyberProbe, we developed RevProbe, a new tool that detects reverse proxies by exploiting leakages in the configuration of web reverse proxies. RevProbe shows that 16% of the active malicious IP addresses analyzed correspond to reverse proxies, that 92% of them are silent (compared to 55% of benign reverse proxies), and that they are mainly used for load balancing across multiple servers.

ABSTRACT In this dissertation we investigate two fundamental aspects of cybercrime: the infection of machines used to monetize the crime and the malicious server infrastructures that are used to manage the infected machines. In the first part of this dissertation, we analyze how fast software vendors apply patches to secure client applications, identifying shared code as an important factor in patch deployment. Shared code is code present in multiple programs. When a vulnerability affects shared code, the usual linear vulnerability life cycle is no longer adequate to describe how patch deployment takes place. In this work we show the consequences of shared-code vulnerabilities and demonstrate two novel attacks that can be used to exploit this condition. In the second part of this dissertation we analyze malicious server infrastructures. Our contributions are: a technique to cluster exploit-server operations, a tool named CyberProbe that performs large-scale detection of different malicious server categories, and RevProbe, a tool that detects silent reverse proxies. We start by identifying exploit-server operations, that is, exploit servers managed by the same people. We investigate a total of 500 exploit servers over a period of more than 13 months. We collected malware from these servers and all the metadata related to the communication with them. Thanks to this metadata we extracted different features to group together servers managed by the same entity (i.e., an exploit-server operation), and we discovered that 2/3 of the operations have a single server while 1/3 have multiple servers. Next, we present CyberProbe, a tool that detects different malicious server types through a novel technique called adversarial fingerprint generation (AFG). The idea behind CyberProbe's AFG is to run a piece of malware and observe its network communication towards malicious servers. It then replays this communication to the malicious server and outputs a fingerprint (i.e., a port selection function, a probe generation function, and a signature generation function). Once the fingerprint is generated, CyberProbe scans the Internet with it and finds all the servers of a given family. We performed a total of 11 Internet-wide scans, finding 151 new servers starting from 15 seed servers, which gives CyberProbe a 10-fold amplification factor. Moreover, we compared CyberProbe with existing blacklists on the Internet and found that only 40% of the servers detected by CyberProbe were listed. To enhance the capabilities of CyberProbe we developed RevProbe, a reverse proxy detection tool that can be integrated with CyberProbe to allow precise detection of silent reverse proxies used to hide malicious servers. RevProbe leverages leakage-based detection techniques to determine whether a malicious server is hidden behind a silent reverse proxy, and to uncover the infrastructure of servers behind it. At the core of RevProbe is the analysis of differences in the traffic obtained by interacting with a remote server.
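As a schematic illustration of fingerprint-driven probing (not CyberProbe's actual implementation; the port, probe bytes, response signature, and addresses below are invented for the example), a fingerprint can be thought of as a port, a probe-construction rule, and a response signature that is replayed against candidate hosts, flagging those whose reply matches.

```python
# Minimal sketch of fingerprint-driven probing: replay a family-specific probe and
# match the reply against that family's response signature. All values are illustrative.
import re
import socket

FINGERPRINT = {
    "port": 8080,
    "probe": b"GET /gate.php?cmd=ping HTTP/1.1\r\nHost: %s\r\n\r\n",   # replayed malware request (example)
    "signature": re.compile(rb"X-Backend: fam42"),                     # response pattern of the family (example)
}

def probe_host(ip: str, fp: dict, timeout: float = 3.0) -> bool:
    """Send the family's probe to one host and report whether the reply matches its signature."""
    try:
        with socket.create_connection((ip, fp["port"]), timeout=timeout) as sock:
            sock.sendall(fp["probe"] % ip.encode())
            reply = sock.recv(4096)
        return bool(fp["signature"].search(reply))
    except OSError:
        return False   # unreachable or non-responsive hosts are simply not matches

candidates = ["192.0.2.10", "192.0.2.11"]   # documentation-range addresses, illustration only
hits = [ip for ip in candidates if probe_host(ip, FINGERPRINT)]
print("matched servers:", hits)
```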
Abstract:
Changes in polymerized actin during stress conditions were correlated with potato (Solanum tuberosum L.) tuber protein synthesis. Fluorescence microscopy and immunoblot analyses indicated that filamentous actin was nearly undetectable in mature, quiescent aerobic tubers. Mechanical wounding of postharvest tubers resulted in a localized increase of polymerized actin, and microfilament bundles were visible in cells of the wounded periderm within 12 h after wounding. During this same period translational activity increased 8-fold. By contrast, low-oxygen stress caused rapid reduction of polymerized actin coincident with acute inhibition of protein synthesis. Treatment of aerobic tubers with cytochalasin D, an agent that disrupts actin filaments, reduced wound-induced protein synthesis in vivo. This effect was not observed when colchicine, an agent that depolymerizes microtubules, was used. Neither of these drugs had a significant effect in vitro on run-off translation of isolated polysomes. However, cytochalasin D did reduce translational competence in vitro of a crude cellular fraction containing both polysomes and cytoskeletal elements. These results demonstrate the dependence of wound-induced protein synthesis on the integrity of microfilaments and suggest that the dynamics of the actin cytoskeleton may affect translational activity during stress conditions.
Abstract:
A total of 1268 available (excluding mitochondrial) tRNA sequences were used to reconstruct the common consensus image of their acceptor domains. Its structure appeared as an 11-bp-long double-stranded palindrome with complementary triplets in the center, each flanked by the 3'-ACCD and NGGU-5' motifs on each strand (D, base determinator). The palindrome readily extends into the modern tRNA-like cloverleaf by passing through an intermediate hairpin that carries the single-stranded triplet in its center, in addition to its double-stranded precursor. The latter might represent an original anticodon-codon pair mapped at positions 1-2-3 of the present-day tRNA acceptors. This conclusion is supported by a striking correlation: in pairs of consensus tRNAs with complementary anticodons, the bases at the 2nd position of the acceptor stem were also complementary. Accordingly, inverse complementarity was also evident at the 71st position of the acceptor stem. With a single exception (the tRNA(Phe)-tRNA(Glu) pair), the parallelism is especially impressive for pairs of tRNAs recognized by aminoacyl-tRNA synthetases (aaRS) from opposite classes. This complementarity, still doubly present at the key central position of real single-stranded anticodons and of their hypothetical double-stranded precursors, is consistent with our previous data pointing to the double-strand use of ancient RNAs in the origin of the main actors in translation: tRNAs with complementary anticodons and the two classes of aaRS.
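As a toy illustration of the complementarity checks described (not the authors' analysis pipeline; the consensus bases below are made up for the example), the snippet tests whether two anticodons are Watson-Crick complementary and whether the bases at acceptor-stem positions 2 and 71 of the paired tRNAs are complementary as well.

```python
# Tiny sketch of the kind of consistency check described (illustration only).
WC = {"A": "U", "U": "A", "G": "C", "C": "G"}   # Watson-Crick pairing for RNA

def revcomp(rna: str) -> str:
    """Reverse complement, since paired strands run antiparallel."""
    return "".join(WC[b] for b in reversed(rna))

def complementary_anticodons(ac1: str, ac2: str) -> bool:
    return revcomp(ac1) == ac2

# Illustrative (made-up) consensus data: anticodon plus bases at acceptor positions 2 and 71.
trna_a = {"anticodon": "GCC", "pos2": "C", "pos71": "G"}
trna_b = {"anticodon": "GGC", "pos2": "G", "pos71": "C"}

if complementary_anticodons(trna_a["anticodon"], trna_b["anticodon"]):
    print("acceptor position 2 complementary:", WC[trna_a["pos2"]] == trna_b["pos2"])
    print("acceptor position 71 complementary:", WC[trna_a["pos71"]] == trna_b["pos71"])
```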
Abstract:
This dissertation examines the role of worldview and language in the cultural framework of American Indian people. In it I develop a theory of worldview, which can be defined as an interrelated set of logics that orients a culture to space (land), time, and the rest of life, and provides a prescription for understanding that life. Considering the strong links between language and worldview, it is methodologically necessary to focus on a particular language and culture to decolonize concepts of and relationships to land. In particular, this dissertation focuses on an Anishinaabe worldview consisting of four components: (1) an intimate relationship to a localized space; (2) a cyclical understanding of time; (3) living in a web of relatedness with all life; and (4) understanding the world around us in terms of balance. The methodological approach draws from Anishinaabemowin, the traditional Anishinaabe language, as a starting place for negotiating a linguistic-conceptual analysis of these logics to decolonize the understandings of land, time, relatedness, and balance. This dissertation helps to demonstrate that the religious language codified in the 1st Amendment to the United States Constitution as religious freedom is unable to carry the meaning of the fundamental relationships to land that are embedded in Anishinaabemowin and Anishinaabe culture. I compare the above Anishinaabe worldview to that of eurowestern culture in America, which consists of: (1) the domination of space; (2) a linear progression of time; (3) a hierarchical organization of life; and (4) an understanding of the world as a Manichean battle of good versus evil. This dissertation seeks to decolonize American Indian translational methodologies and undermine the assumptions of eurowestern cultural universality.
Abstract:
The military event that sealed the defeat of Napoleon in Spain was the Battle of Vitoria on 21st June 1813, which saw the allied British, Spanish and Portuguese troops led by Wellington gain victory over Joseph Bonaparte's forces. It was the last great battle of what is known in Spain as the Guerra de la Independencia, in the United Kingdom as the Peninsular War and in France as the Guerre d'Espagne. While a sliver of Spanish territory remained under French control and the war itself went on for a few more months, it was the Battle of Vitoria that marked the end of Napoleon's rule on the Iberian Peninsula, as symbolised by the departure from Spain of Joseph Bonaparte, the monarch who had been imposed five years earlier to remove the Bourbons from the Spanish throne.