973 results for Near Field Communication
Abstract:
Nowadays there is great expectation regarding the introduction of new tools and methods for software product development that, in the near future, will allow an engineering approach to the software production process. The new methodologies now emerging imply an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the systems construction process is very low and is concentrated on the last phases of the software life cycle, so the cost reduction obtained is insignificant and, more importantly, the quality of the resulting software products is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented conforms to the CASE development cycle model, which consists of analysis, design and testing phases, with information systems as its field of application. First, the basic principles on which the CASE methodology rests are established. Second, since the methodology starts by fixing the objectives of the company demanding an information system, techniques are employed for gathering and validating information that also provide an easy communication language between end users and developers. Moreover, these same techniques specify all the system requirements completely, consistently and unambiguously. Likewise, a set of techniques and algorithms is presented to obtain, from the system requirements specification, an automated logical design of both the Process Model and the Data Model, each validated against the previous requirements specification.
Finally, formal procedures are defined that indicate the set of activities to be performed in the construction process and how to carry them out, thus achieving integrity and completeness across the different stages of the development process.
Abstract:
Biology on demand is a novel concept that is currently being addressed from different approaches, which will be presented in this document. Given this innovative character, it is an area where research is very active right now. Information and Communication Technologies (ICTs) have been providing very effective solutions to some of the problems that synthetic biology currently faces. One of these solutions is cloud computing platforms, which provide a scalable, flexible and secure working environment. This technology has therefore been applied in this final-year project to the area of synthetic biology through the concept of biology on demand. To develop the biology-on-demand platform it was necessary to analyze the current state of the art and its advances. The opinions of the members of the research group were also valuable. All this allowed a requirements capture appropriate to the scope of this project. It was decided that web application servers are the most suitable answer when it comes to implementing the solutions obtained for the development of the biology-on-demand platform. In particular, because of its characteristics, Oracle's JavaEE was chosen. The implemented model uses well-known, reliable solutions based on software design patterns. In this way we meet one of the main objectives of this project, which is to achieve a flexible and scalable system. On the other hand, given the uncertainty inherent in such an innovative area, an agile methodology was adopted. The work plan was centered on weekly meetings with the supervisor and the members of the working group, using rapid prototyping and extreme programming.
Finally, a biology-on-demand platform has been developed that can serve as the basis for the work of biologists in the field of synthetic biology in the near future.
Abstract:
This thesis is developed within the framework of satellite communications, in the innovative field of small satellites, also known as nanosatellites (<10 kg) or CubeSats, so called because of their cubic shape. These nanosatellites are characterized by their low cost, since they use commercial off-the-shelf (COTS) components, and by their small size and mass, such as 1U CubeSats (10 cm × 10 cm × 10 cm) with a mass of approximately 1 kg. This thesis builds on an initiative proposed by its author to put into orbit the first Peruvian satellite, Chasqui I, which was successfully launched into orbit from the International Space Station in 2014. The experience of this research work led me to propose a constellation of small satellites named Waposat to provide a water-quality sensor monitoring service worldwide, the scenario used in this thesis. In this scenario, and given the limited capabilities of nanosatellites in both power and data rate, I propose to investigate a new communications architecture that solves, in an optimal manner, the problems posed by nanosatellites in LEO orbit due to the disruptive nature of their communications, with emphasis on the link and application layers. This thesis presents and evaluates a new communications architecture to provide service to terrestrial sensor networks using a space Delay/Disruption Tolerant Networking (DTN) based solution.
In addition, I propose a new multiple access protocol based on an extension of unslotted ALOHA that takes into account the priority of gateway traffic, which we call ALOHA multiple access with gateway priority (ALOHAGP), with an adaptive contention mechanism. It uses satellite feedback to implement congestion control and to dynamically adapt the effective throughput of the channel in an optimal way. We assume a finite sensor population model and a saturated traffic condition in which every sensor always has frames to transmit. Network performance was evaluated in terms of effective throughput, delay and system fairness. In addition, a DTN convergence layer (ALOHAGP-CL) has been defined as a subset of the standard TCP-CL (Transmission Control Protocol Convergence Layer). This thesis shows that ALOHAGP/CL adequately supports the proposed DTN scenario, especially when reactive fragmentation is used. Finally, this thesis investigates optimal DTN message (bundle) transfer using proactive fragmentation strategies to serve a ground sensor network over a nanosatellite communications link that uses the multiple-access mechanism with downlink traffic priority (ALOHAGP). The effective throughput has been optimized by adapting the protocol parameters as a function of the current number of active sensors reported by the satellite. Also, there is currently no method for advertising or negotiating the maximum size of a bundle that can be accepted by a DTN bundle agent in satellite communications for storage and delivery, so bundles that are too large are dropped and bundles that are too small are inefficient. I have characterized this kind of scenario by obtaining a probability distribution of frame arrivals at the nanosatellite as well as a probability distribution of the nanosatellite visibility time, which together provide an optimal proactive fragmentation of DTN bundles. I have found that the effective throughput (goodput) of proactive fragmentation reaches a value slightly lower than that of reactive fragmentation.
This contribution makes it possible to use proactive fragmentation optimally, with all its advantages, such as supporting the DTN security model and simplicity of implementation on equipment with severe CPU and memory limitations. The implementation of these contributions was initially envisaged as part of the payload of the QBito nanosatellite, which belongs to the constellation of 50 nanosatellites being developed within the QB50 project.
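The ALOHAGP protocol itself is not specified in this abstract, but its baseline, unslotted (pure) ALOHA with a finite, saturated sensor population, can be sketched. The following Monte Carlo simulation (all parameter values are illustrative, not Waposat's) estimates the carried load and compares it with the classical pure-ALOHA result S = G·e^(−2G):

```python
import random
import math

def pure_aloha_throughput(n_sensors=50, rate_per_sensor=0.004,
                          frame_time=1.0, sim_time=200_000.0, seed=1):
    """Monte Carlo estimate of pure (unslotted) ALOHA throughput.

    Each sensor starts frames as a Poisson process; a frame succeeds
    only if no other frame starts within +/- frame_time of its start.
    Returns (offered load G, simulated throughput S).
    """
    rng = random.Random(seed)
    starts = []
    for _ in range(n_sensors):
        t = 0.0
        while True:
            t += rng.expovariate(rate_per_sensor)
            if t > sim_time:
                break
            starts.append(t)
    starts.sort()
    ok = 0
    for i, t in enumerate(starts):
        # Frames are equal-length, so only the nearest neighbours can collide.
        clear_before = i == 0 or t - starts[i - 1] >= frame_time
        clear_after = i == len(starts) - 1 or starts[i + 1] - t >= frame_time
        if clear_before and clear_after:
            ok += 1
    G = n_sensors * rate_per_sensor * frame_time   # offered load (frames per frame time)
    S = ok * frame_time / sim_time                 # carried load
    return G, S

G, S = pure_aloha_throughput()
print(f"G = {G:.3f}, simulated S = {S:.4f}, theory G*e^-2G = {G * math.exp(-2 * G):.4f}")
```

Pushing the offered load past G ≈ 0.5 drives the simulated throughput back down, which is exactly the congestion regime that the satellite-feedback adaptation in ALOHAGP is designed to avoid.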
Abstract:
The Casablanca field is a mature oil field in its depletion stage, considered the largest in the Mediterranean Sea. It has been in production since 1977, with a cumulative production of 22.6 MMm3 of oil. The productive reservoir formation consists of karstified carbonates of the Basal Tertiary Group and the Mesozoic. The identified drive mechanism is water drive from a large active aquifer considered infinite-acting, since it has maintained the pressure at 95% of the original value after nearly 40 years of production. In 1999, the wells associated with the Casablanca field produced about 500 m3/d of water, which was treated and discharged to the sea. To comply with the environmental laws of the time, the Casablanca-9 well was converted into a disposal well with the objective of returning all the produced water to the formation in a safe, clean and fully environmentally respectful way. Years later it was observed that this injection was not innocuous, but had an impact on oil production. This thesis defines a methodology that, through field experiments with tracers, demonstrates the existence of communication between producer wells and disposal wells, breaking the paradigm established in the field that improved recovery by water injection is not possible in Casablanca because of the large existing aquifer. The results obtained will be the starting point for building a simulation model to verify that the application of IOR/EOR techniques, and more specifically improved recovery by water injection, is feasible in the presence of an infinite active aquifer.
Abstract:
Blindsight is the rare and paradoxical ability of some human subjects with occipital lobe brain damage to discriminate unseen stimuli in their clinically blind field defects when forced-choice procedures are used, implying that lesions of striate cortex produce a sharp dissociation between visual performance and visual awareness. Skeptics have argued that this is no different from the behavior of normal subjects at the lower limits of conscious vision, at which such dissociations could arise trivially by using different response criteria during clinical and forced-choice tests. We tested this claim explicitly by measuring the sensitivity of a hemianopic patient independently of his response criterion in yes-no and forced-choice detection tasks with the same stimulus and found that, unlike normal controls, his sensitivity was significantly higher during the forced-choice task. Thus, the dissociation by which blindsight is defined is not simply due to a difference in the patient's response bias between the two paradigms. This result implies that blindsight is unlike normal, near-threshold vision and that information about the stimulus is processed in blindsighted patients in an unusual way.
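The criterion-free comparison described above is conventionally made in the framework of signal detection theory: sensitivity d′ is computed from hit and false-alarm rates in the yes-no task, and from the proportion correct in the two-alternative forced-choice task. A minimal sketch, assuming the standard equal-variance Gaussian model (the numbers are illustrative, not the patient's data):

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit transform: probability -> z-score

def dprime_yes_no(hit_rate, false_alarm_rate):
    """Sensitivity index for a yes-no detection task: d' = z(H) - z(FA)."""
    return z(hit_rate) - z(false_alarm_rate)

def dprime_2afc(prop_correct):
    """Sensitivity index for a two-alternative forced-choice task.

    Under the equal-variance Gaussian model, d' = sqrt(2) * z(p_correct).
    """
    return 2 ** 0.5 * z(prop_correct)

# Illustrative (made-up) numbers: equal d' in both tasks would argue
# against blindsight; a higher forced-choice d' is the dissociation.
print("yes-no d':", dprime_yes_no(0.60, 0.40))
print("2AFC d': ", dprime_2afc(0.80))
```

Because d′ is invariant to where the observer places the decision criterion, a genuinely higher forced-choice d′ cannot be explained away as a shift in response bias between the two paradigms.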
Abstract:
This review presents a view of hyperalgesia and allodynia not typical of the field as a whole. That is, exaggerated pain is presented as one of many natural consequences of peripheral infection and injury. The constellation of changes that results from such immune challenges is called the sickness response. This sickness response results from immune-to-brain communication initiated by proinflammatory cytokines released by activated immune cells. In response to signals it receives from the immune system, the brain orchestrates the broad array of physiological, behavioral, and hormonal changes that comprise the sickness response. The neurocircuitry and neurochemistry of sickness-induced hyperalgesia are described. One focus of this discussion is on the evidence that spinal cord microglia and astrocytes are key mediators of sickness-induced hyperalgesia. Last, evidence is presented that hyperalgesia and allodynia also result from direct immune activation, rather than neural activation, of these same spinal cord glia. Such glial activation is induced by viruses such as HIV-1 that are known to invade the central nervous system. Implications of exaggerated pain states created by peripheral and central immune activation are discussed.
Abstract:
Understanding how the brain processes vocal communication sounds is one of the most challenging problems in neuroscience. Our understanding of how the cortex accomplishes this unique task should greatly facilitate our understanding of cortical mechanisms in general. Perception of species-specific communication sounds is an important aspect of the auditory behavior of many animal species and is crucial for their social interactions, reproductive success, and survival. The principles of neural representations of these behaviorally important sounds in the cerebral cortex have direct implications for the neural mechanisms underlying human speech perception. Our progress in this area has been relatively slow, compared with our understanding of other auditory functions such as echolocation and sound localization. This article discusses previous and current studies in this field, with emphasis on nonhuman primates, and proposes a conceptual platform to further our exploration of this frontier. It is argued that the prerequisite condition for understanding cortical mechanisms underlying communication sound perception and production is an appropriate animal model. Three issues are central to this work: (i) neural encoding of statistical structure of communication sounds, (ii) the role of behavioral relevance in shaping cortical representations, and (iii) sensory–motor interactions between vocal production and perception systems.
Abstract:
Optimism is growing that the near future will witness rapid growth in human-computer interaction using voice. System prototypes have recently been built that demonstrate speaker-independent real-time speech recognition, and understanding of naturally spoken utterances with vocabularies of 1000 to 2000 words, and larger. Already, computer manufacturers are building speech recognition subsystems into their new product lines. However, before this technology can be broadly useful, a substantial knowledge base is needed about human spoken language and performance during computer-based spoken interaction. This paper reviews application areas in which spoken interaction can play a significant role, assesses potential benefits of spoken interaction with machines, and compares voice with other modalities of human-computer interaction. It also discusses information that will be needed to build a firm empirical foundation for the design of future spoken and multimodal interfaces. Finally, it argues for a more systematic and scientific approach to investigating spoken input and performance with future language technology.
Abstract:
We present a library of Penn State Fiber Optic Echelle (FOE) observations of a sample of field stars with spectral types F to M and luminosity classes V to I. The spectral coverage is from 3800 to 10000 Å with a nominal resolving power of 12,000. These spectra include many of the spectral lines most widely used as optical and near-infrared indicators of chromospheric activity, such as the Balmer lines (Hα to Hε), Ca II H & K, the Mg I b triplet, Na I D_1, D_2, He I D_3, and the Ca II IRT lines. There are also a large number of photospheric lines, which can also be affected by chromospheric activity, and temperature-sensitive photospheric features such as TiO bands. The spectra have been compiled with the goal of providing a set of standards observed at medium resolution. We have extensively used such data for the study of active-chromosphere stars by applying a spectral subtraction technique. However, the data set presented here can also be utilized in a wide variety of ways, ranging from use as radial-velocity templates to the study of variable stars and stellar population synthesis. This library can also be used for spectral classification purposes and the determination of atmospheric parameters (T_eff, log g, [Fe/H]). A digital version of all the fully reduced spectra is available via ftp and the World Wide Web (WWW) in FITS format.
Abstract:
We present a library of Utrecht echelle spectrograph (UES) observations of a sample of F, G, K and M field dwarf stars covering the spectral range from 4800 Å to 10600 Å with a resolution of 55000. These spectra include some of the spectral lines most widely used as optical and near-infrared indicators of chromospheric activity such as Hβ, Mg I b triplet, Na I D_1, D_2, He I D_3, Hα, and Ca II IRT lines, as well as a large number of photospheric lines which can also be affected by chromospheric activity. The spectra have been compiled with the aim of providing a set of standards observed at high-resolution to be used in the application of the spectral subtraction technique to obtain the active-chromosphere contribution to these lines in chromospherically active single and binary stars. This library can also be used for spectral classification purposes. A digital version with all the spectra is available via ftp and the World Wide Web (WWW) in both ASCII and FITS formats.
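The spectral subtraction technique mentioned in both library abstracts can be illustrated with synthetic data: subtracting the profile of an inactive reference star from the active star's spectrum leaves the chromospheric excess emission that fills in the line core. A minimal sketch with made-up Gaussian profiles around Hα (all numbers hypothetical):

```python
import numpy as np

# Synthetic illustration of spectral subtraction: an inactive reference
# star supplies the purely photospheric line profile; subtracting it
# from the active star's spectrum leaves the chromospheric excess
# emission that partially fills in the absorption line (here, H-alpha).
wavelength = np.linspace(6553.0, 6573.0, 400)  # angstroms, around H-alpha
line_center = 6562.8

def gaussian(x, center, depth, width):
    """Simple Gaussian profile of given central depth and width."""
    return depth * np.exp(-0.5 * ((x - center) / width) ** 2)

# Hypothetical continuum-normalized profiles (continuum = 1.0):
reference = 1.0 - gaussian(wavelength, line_center, 0.60, 1.2)   # inactive star
active = (1.0 - gaussian(wavelength, line_center, 0.60, 1.2)
              + gaussian(wavelength, line_center, 0.25, 0.8))    # filled-in line

excess = active - reference                     # chromospheric contribution
dx = wavelength[1] - wavelength[0]
ew = float(np.sum(excess) * dx)                 # excess emission equivalent width (A)
print(f"peak excess = {excess.max():.3f}, excess EW = {ew:.3f} A")
```

In practice the reference spectrum must first be rotationally broadened, velocity-shifted, and weighted to match the active star; the excess equivalent width of the residual is then the activity measure.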
Abstract:
EMIR (Balcells et al. 2000) is a near-infrared wide-field camera and multi-object spectrograph being built for the GTC. The Data Reduction Pipeline (DRP) will be optimized for handling and reducing near-infrared data acquired with EMIR.
Abstract:
The last two decades have been marked by a growing public awareness of family violence. Research by social scientists has suggested that family violence is widespread (Gelles and Straus, 1988). It is estimated that every year 1.8 to 4 million women are physically abused by their partners (Novello, 1992). In fact, more women are abused by their husbands or boyfriends than are injured in car accidents, muggings, or rapes (Jaffe, Wolfe, and Wilson, 1990). A recent prevalence study by Fantuzzo, Boruch, Beriama, Atkins, and Marcus (1997) found that children were disproportionately present in households where there was a substantiated incident of adult female assault. Experts estimate that 3.3 to 10 million children are exposed to marital violence each year (Carlson, 1984; Straus, 1991). Until recently, most researchers did not consider the impact of parental conflict on the children who witness this violence. The early literature in this field primarily focused on the incidence of violence against women and the inadequate response of community agencies (Jaffe et al., 1990). The needs of children were rarely considered. However, researchers have become increasingly aware that children exposed to marital violence are victims of a range of psychological maltreatment (e.g., terrorizing, isolation; Hart, Brassard, & Karlson, 1996) and are at serious risk for the development of psychological problems (Fantuzzo, DePaola, Lambert, Martino, Anderson, and Sutton, 1991). Jouriles, Murphy and O'Leary (1989) found that children of battered women were four times more likely to exhibit psychopathology than were children living in non-violent homes. Further, researchers have found associations between childhood exposure to parental violence and the expression of violence in adulthood (Carlson, 1990). 
Existing research suggests that children who have witnessed marital violence manifest numerous emotional, social, and behavioral problems (Sternberg et al., 1993; Fantuzzo et al., 1991; Jaffe et al., 1990). Studies have found that children of battered women exhibit more internalizing and externalizing behavior problems than non-witness children (Hughes and Fantuzzo, 1994; McCloskey, Figueredo, and Koss, 1995). In addition, children exposed to marital violence have been found to exhibit difficulties with social problem-solving, and have lower levels of social competence than nonwitnesses (Rosenberg, 1987; Moore, Pepler, Weinberg, Hammond, Waddell, & Weiser, 1990). Other reported difficulties include low self-esteem (Hughes, 1988), poor school performance (Moore et al., 1990) and problems with aggression (Holden & Ritchie, 1991; Jaffe, Wolfe, Wilson, & Zak, 1986). Further, within the last decade, researchers have found that some children are traumatized by the witnessing experience, showing elevated levels of posttraumatic stress symptoms (Devoe & Graham-Bermann, 1997; Rossman, Bingham, & Emde, 1996; Kilpatrick, Litt, & Williams, 1997). These findings corroborate clinical reports that describe many exposed children as experiencing trauma reactions. It appears that the negative effects of witnessing marital violence are numerous and varied, ranging from mild emotional and behavioral problems to clinically significant levels of posttraumatic stress symptoms. These incidence figures and research findings indicate that children's exposure to violence is a significant problem in our nation today and has serious implications for the future.
Abstract:
We study the timing and spectral properties of the low-magnetic-field, transient magnetar SWIFT J1822.3−1606 as it approached quiescence. We coherently phase-connect the observations over a time-span of ∼500 d since the discovery of SWIFT J1822.3−1606 following the Swift-Burst Alert Telescope (BAT) trigger on 2011 July 14, and carried out detailed pulse phase spectroscopy along the outburst decay. We follow the spectral evolution of different pulse phase intervals and find a phase- and energy-variable spectral feature, which we interpret as proton cyclotron resonant scattering of soft photons from currents circulating in a strong (≳10^14 G) small-scale component of the magnetic field near the neutron star surface, superimposed on the much weaker (∼3 × 10^13 G) dipolar magnetic field. We also discuss the implications of the pulse-resolved spectral analysis for the emission regions on the surface of the cooling magnetar.
Abstract:
no.34(1940)
Abstract:
We report experimental results of near-surface winter temperatures along and adjacent to the channel bed of a High Arctic river on Melville Island, Canada. Temperature loggers 5 cm below the ground surface, in areas where the terrain suggests varying snow accumulation patterns, revealed that the maximum winter difference between air and near-surface temperatures ranged from 0 to +30°C during the winter of 2012–13, and that shallow near-surface freezing conditions were delayed for up to 21 days in some locations. Cooling to −10°C was delayed for up to 117 days. Modelled temperature at the top of permafrost indicates that permafrost at locations with thick snow can be up to 8°C warmer than at those with thin snow. This thermal evidence for an ameliorated surface environment indicates the potential for substantial extended microbial and biogeochemical cycling during early winter. Rapid thaw of the bed during initiation of snowmelt in spring also indicates a high degree of hydrological connectivity. Therefore, snow-filled channels may contribute to biogeochemical and aquatic cycling in High Arctic rivers.