43 results for RECALIBRATION


Relevance:

10.00%

Publisher:

Abstract:

Determining with good precision the position of a mobile terminal when it is immersed in an indoor environment (shopping centers, office buildings, airports, stations, tunnels, etc.) is the cornerstone on which a large number of applications and services rest. Many of those services are already available in outdoor environments, although indoor environments lend themselves to other services specific to them. That number, however, could be significantly higher than it currently is if a costly infrastructure were not needed to carry out positioning with the precision required by each hypothetical service; or, equally, if that infrastructure could serve other purposes besides positioning. The usability of the same infrastructure for other purposes would offer the opportunity for it to be already present in the various locations, having previously been deployed for those other uses, or would ease its deployment, because the cost of that operation would yield a greater return on usability for whoever undertakes it. Wireless radio-frequency communication technologies already in use for voice and data (mobile, WLAN, etc.) meet the above requirement and would therefore facilitate the growth of positioning-based applications and services, if they could be used for that purpose. However, determining position with an adequate level of precision by means of these technologies is a major challenge today. The present work aims to make significant advances in this field. It begins with a study of the main positioning algorithms and auxiliary techniques applicable in indoor environments.
The review focuses on those suitable both for latest-generation mobile technologies and for WLAN environments, in order to highlight the advantages and disadvantages of each of these algorithms, with their applicability both to 3G and 4G mobile networks (especially LTE femtocells and small cells) and to the WLAN environment as the final motivation, and always bearing in mind that the ultimate goal is their use indoors. The main conclusion of the review is that triangulation techniques, commonly used for localization in outdoor environments, prove useless indoors, owing to adverse effects characteristic of such environments, such as loss of line of sight or multipath propagation. Radio fingerprinting methods, which compare the signal-strength values received by a mobile terminal at positioning time against the values recorded in a radio map of signal strengths built during an initial calibration phase, emerge as the best of the available options for indoor scenarios. These systems, however, also suffer from other problems, such as the substantial effort required to put them into operation and the variability of the radio channel. Against this background, the present work offers two original contributions to improve fingerprinting-based systems.
The first contribution describes a simple method for determining the basic characteristics of the system: the number of samples needed to build the reference radio map, together with the minimum number of radio-frequency emitters to deploy; all derived from initial requirements on the positioning error and precision sought, combined with data on the dimensions and physical layout of the environment. This establishes initial guidelines for dimensioning the system and counters the negative effects, on cost or on overall performance, of an inefficient deployment of the radio-frequency emitters and of the fingerprint capture points. The second contribution increases the real-time accuracy of the system by means of a technique for automatic recalibration of the radio map. This technique uses the measurements continuously reported by a few static reference points, strategically distributed throughout the environment, to recompute and update the signal strengths recorded in the radio map. An additional operational benefit of this technique is that it prolongs the period during which the system remains reliably usable, reducing how often the complete radio map must be recaptured. The improvements described above are directly applicable to indoor positioning mechanisms based on the wireless voice and data communications infrastructure.
From there, that improvement extends to location services (knowing where one is oneself), monitoring (knowledge of that location by third parties) and tracking (monitoring sustained over time), since all of them rely on correct positioning for adequate performance. ABSTRACT Finding the position of a mobile terminal with good accuracy, when it is immersed in an indoor environment (shopping centers, office buildings, airports, stations, tunnels, etc.), is the cornerstone on which a large number of applications and services are supported. Many of these services are already available in outdoor environments, although indoor environments are suited to other services specific to them. That number, however, could be significantly higher than it is now if an expensive infrastructure were not required to perform positioning with adequate precision for each of the hypothetical services; or, equally, if that infrastructure could have other uses beyond positioning. The usability of the same infrastructure for purposes other than positioning would offer the opportunity of having it already available in the different locations, because it was previously deployed for those other uses, or would facilitate its deployment, because the cost of that operation would offer a higher return on usability for the deployer. Wireless technologies based on radio communications, already in use for voice and data (mobile, WLAN, etc.), meet this requirement of additional usability and could therefore facilitate the growth of positioning-based applications and services, if they could be used for that purpose. However, determining the position with the appropriate degree of accuracy using these technologies is a major challenge today. This work aims to provide significant advances in this field.
It first carries out a study of the main algorithms and auxiliary techniques related to indoor positioning. The review focuses on those suitable for use with both latest-generation mobile technologies and WLAN environments, in order to highlight the advantages and disadvantages of each of these algorithms, with their applicability both to the world of 3G and 4G mobile networks (especially LTE femtocells and small cells) and to the WLAN world as the final motivation, and always bearing in mind that the ultimate aim is their use in indoor environments. The main conclusion of the review is that triangulation techniques, commonly used for localization in outdoor environments, are useless indoors owing to adverse effects of such environments, such as the lack of line of sight or multipath propagation. Fingerprinting methods, based on comparing the received signal strength (RSSI) values measured by the mobile terminal against a radio map of RSSI values recorded during a calibration phase, emerge as the best methods for indoor scenarios. These systems, however, are also affected by other problems, for example the considerable work needed to make the system operational, and the variability of the channel. To address them, this work presents two original contributions to improve fingerprinting-based systems.
The first of these contributions describes a method for determining, in a simple way, the basic characteristics of the system: the number of samples needed to create the reference radio map, and the minimum number of radio-frequency emitters that must be deployed; both derived from initial requirements on the positioning error and accuracy sought, together with data on the dimensions and physical layout of the environment. This establishes initial guidelines for dimensioning the system and minimizes the negative effects, on cost or on the performance of the whole system, of an inefficient deployment of the radio-frequency emitters and of the radio-map capture points. The second contribution increases the real-time accuracy of the system through a technique for automatic recalibration of the power measurements stored in the radio map. This technique takes into account the measurements continuously reported by a few static reference points, strategically distributed in the environment, to recalculate and update the measurements stored in the radio map. An additional operational benefit of this technique is the extension of the system's reliable lifetime, reducing how often the full radio map must be recaptured. The above improvements are directly applicable to indoor positioning mechanisms based on wireless voice and data communications infrastructure. From there, the improvement is also extensible and applicable to location services (personal knowledge of where one is), monitoring (knowledge of that location by others) and tracking (monitoring prolonged over time), since all of them rely on correct positioning for proper performance.
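The fingerprinting-with-recalibration idea described in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the system developed in the thesis: the coordinates, RSSI values, and the simple per-emitter drift correction derived from one static anchor are all invented for the example.

```python
import math

# Radio map: reference location -> RSSI vector (dBm, one entry per emitter),
# captured during an offline calibration phase. Values are invented.
radio_map = {
    (0.0, 0.0): [-40.0, -62.0, -75.0],
    (5.0, 0.0): [-55.0, -48.0, -70.0],
    (0.0, 5.0): [-58.0, -66.0, -52.0],
    (5.0, 5.0): [-63.0, -57.0, -49.0],
}

def locate(observed, rmap):
    """Nearest-neighbor match in signal space: return the mapped location
    whose stored fingerprint is closest to the observed RSSI vector."""
    return min(rmap, key=lambda loc: math.dist(observed, rmap[loc]))

def recalibrate(rmap, anchor_loc, anchor_rssi):
    """Shift the whole map by the per-emitter drift seen at one static
    reference point, so the map tracks slow channel changes without
    recapturing every fingerprint."""
    drift = [now - then for now, then in zip(anchor_rssi, rmap[anchor_loc])]
    return {loc: [v + d for v, d in zip(vec, drift)]
            for loc, vec in rmap.items()}

# A terminal near (5, 5) reports its current RSSI vector.
print(locate([-61.0, -58.0, -50.0], radio_map))  # -> (5.0, 5.0)

# A static anchor at (0, 0) reports values 3 dB weaker than recorded:
# update the map before serving the next positioning request.
radio_map = recalibrate(radio_map, (0.0, 0.0), [-43.0, -65.0, -78.0])
```

In a real deployment the drift estimate would combine several anchors and some spatial interpolation; the uniform shift above only illustrates the mechanism.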

Relevance:

10.00%

Publisher:

Abstract:

This paper assesses the uses and misuses of the European Arrest Warrant (EAW) system in the European Union. It examines the main quantitative results of this extradition system achieved between 2005 and 2011, on the basis of existing statistical knowledge of its implementation at official EU levels. The EAW has been anchored in a high level of ‘mutual trust’ between the participating states’ criminal justice regimes and authorities. This reciprocal confidence, however, has been subject to an increasing number of challenges resulting from its practical application, presenting a dual conundrum: 1. Principle of proportionality: Who are the competent judicial authorities cooperating with each other and ensuring that there are sufficient impartial controls over the necessity and proportionality of the decisions on the issuing and execution of EAWs? 2. Principle of division of powers: How can criminal justice authorities be expected to reconcile differing criminal judicial traditions over what is supposed to constitute a ‘serious’ or ‘minor’ crime in their respective legal settings, and ‘who’ is ultimately to determine (divorced from political considerations) when it is duly justified to make the EAW system operational? It is argued that the next generation of the EU’s criminal justice cooperation and the EAW need to recognise and acknowledge that the mutual trust premise upon which the European system has been built so far is no longer viable without devising new EU policy stakeholders’ structures and evaluation mechanisms. These should allow for the recalibration of mutual trust and mistrust in EU justice systems in light of the experiences of the criminal justice actors and practitioners having a stake in putting the EAW into daily effect. Such a ‘bottom-up approach’ should be backed up with the best impartial and objective evaluation, an improved system of statistical collection and an independent qualitative assessment of its implementation.
This should be placed as the central axis of a renewed EAW framework which should seek to better ensure the accountability, impartial (EU-led) scrutiny and transparency of member states’ application of the EAW in light of the general principles and fundamental rights constituting the foundations of the European system of criminal justice cooperation.

Relevance:

10.00%

Publisher:

Abstract:

All three parties principally responsible for the Vilnius fiasco are to blame, each in its own very different way: the EU, for having drafted agreements with an inadequate balance between incentives and obligations, and hence vulnerable to Putin’s aim of torpedoing the whole process in favour of his misconceived Eurasian Union; Putin, for pursuing that aim; and Yanukovich, for playing geopolitical games that left him personally, and the Ukrainian state, as Putin’s hostage. It will require a major recalibration of policies to put this unstable new status quo back onto sound strategic lines, and proposals are advanced along three parallel tracks: rebuilding the remnants of the EU’s neighbourhood policy, attempting to get Russia to take ‘Lisbon to Vladivostok’ seriously, and promoting a Greater Eurasia concept fit for the 21st century that would embrace the whole of the European and Asian landmass.

Relevance:

10.00%

Publisher:

Abstract:

Metamorphosis is both an ecological and a developmental genetic transition that an organism undergoes as a normal part of ontogeny. Many organisms have the ability to delay metamorphosis when conditions are unsuitable. This strategy carries obvious benefits, but may also result in severe consequences for older larvae that run low on energy. In the marine environment, some lecithotrophic larvae that have prolonged periods in the plankton may begin forming postlarval and juvenile structures that normally do not appear until after settlement and the initiation of metamorphosis. This precocious activation of the postlarval developmental program may reflect an adaptation to increase the survival of older, energy-depleted larvae by allowing them to metamorphose more quickly. In the present study, we investigate morphological and genetic consequences of delay of metamorphosis in larvae of Herdmania momus (a solitary stolidobranch ascidian). We observe significant morphological and genetic changes during prolonged larval life, with older larvae displaying significant changes in RNA levels, precocious migration of mesenchyme cells, and changes in larval shape including shortening of the tail. While these observations suggest that the older H. momus larvae are functionally different from younger larvae and possibly becoming more predisposed to undergo metamorphosis, we did not find any significant differences in gene expression levels between postlarvae arising from larvae that metamorphosed as soon as they were competent and postlarvae developing from larvae that postponed metamorphosis. This recalibration, or convergence, of transcript levels in the early postlarva suggests that changes that occur during prolonged larval life of H. momus are not necessarily associated with early activation of adult organ differentiation. Instead, it suggests that an autonomous developmental program is activated in H. momus upon the induction of metamorphosis regardless of the history of the larva.

Relevance:

10.00%

Publisher:

Abstract:

By contrast to the far-reaching devolution settlements elsewhere in the UK, political agreement on the governance of England outside London remains unsettled. There is cross-party consensus on the need to 'decentre down' authority to regions and localities, but limited agreement on how this should be achieved. This paper first explores the welter of initiatives adopted by the recent Labour government that were ostensibly designed to make the meso-level of governance more coherent, accountable and responsive to meeting territorial priorities. Second, it explores the current Conservative-Liberal Democrat Coalition's programme of reform, which involves the elimination of Labour's regional institutional architecture and is intended to restore powers to local government and communities and promote local authority co-operation around sub-regions. Labour's reforms were ineffective in achieving any substantial transfer of authority away from Whitehall and, given the Coalition's plans to cut public expenditure, any significant recalibration of central-local relations also appears improbable. © 2012 Copyright Taylor and Francis Group, LLC.

Relevance:

10.00%

Publisher:

Abstract:

The cell:cell bond between an immune cell and an antigen presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell surface molecules on the opposing cells form non-covalent bonds and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), that is responsible for antigen recognition through its binding with a major-histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell, and ultimately leads to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, due to the spatio-temporal scales (nanometers and picoseconds) that compare with experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, that has been previously used to successfully calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, that produces results suggesting the average time persistence for the TCR:pMHC bond is in the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling.
The study reveals two distinct scaling regimes in the time-persistent survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with the TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average time persistence calculation, which has an important biological implication when comparing the probabilistic models to experimental evidence. In cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting a recalibration of the experimental conditions, to adhere to this scaling relationship, will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Also, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability, that is independent of the bond length.
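The two-threshold idea can be illustrated with a toy simulation. The Ornstein-Uhlenbeck-style dynamics, the parameter values, and the threshold placements below are invented stand-ins, not the model analysed in the study; the sketch only shows how a hysteresis gap between an attach threshold and a detach threshold suppresses thermal flickers when measuring average bond persistence times.

```python
import random

def simulate(steps, dt=0.01, relax=1.0, noise=0.6, seed=1):
    """Euler-Maruyama trajectory of a fluctuating bond coordinate
    relaxing toward its rest value (arbitrary units)."""
    random.seed(seed)
    x, path = 0.0, []
    for _ in range(steps):
        x += -relax * x * dt + noise * (dt ** 0.5) * random.gauss(0.0, 1.0)
        path.append(x)
    return path

def mean_persistence(path, attach, detach, dt=0.01):
    """Average 'bond' survival time: a bond forms when the coordinate
    drops below `attach` and breaks only when it rises above
    `detach` > `attach`; the gap filters out pure thermal flickers."""
    bound, start, durations = False, 0, []
    for i, x in enumerate(path):
        if not bound and x < attach:
            bound, start = True, i
        elif bound and x > detach:
            bound = False
            durations.append((i - start) * dt)
    return sum(durations) / len(durations) if durations else 0.0

path = simulate(200_000)
# Widening the hysteresis gap lengthens the measured persistence time,
# separating signalling-scale events from thermal fluctuations.
print(mean_persistence(path, attach=-0.3, detach=0.3))
print(mean_persistence(path, attach=-0.3, detach=0.6))
```

A single-threshold measurement corresponds to setting `detach` equal to `attach`, which counts every thermal crossing as a detachment event and biases the persistence estimate downward.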

Relevance:

10.00%

Publisher:

Abstract:

This dissertation investigated the relationship between the September 11, 2001 terrorist attacks and the internationalization agenda of U.S. colleges and universities. The construct, post-9/11 syndrome, is used metaphorically to delineate the apparent state of panic and disequilibrium that followed the incident. Three research questions were investigated, with two universities in the Miami area of South Florida, one private and the other public, as qualitative case studies. The questions are: (a) How are international student advisors and administrators across two types of institutions dealing with the post-9/11 syndrome? (b) What, if any, are the differences in international education after 9/11? (c) What have been the institutional priorities in relation to international education before and after 9/11? Data-gathering methods included interviews with international student/study abroad advisors and administrators with at least 8 years of experience in the function(s) at their institutions, together with document and institutional data analysis. The interviews were based on the three-part scheme developed by Schuman (1982): context of experience, details of experience and reflection on the meaning of experiences. Taped interviews, researcher insights, and member checks of transcripts constituted an audit trail for this study. Key findings included a progressive decline in Fall-to-Fall enrollment of international students at UM by 13.05% in the 5 years after 9/11, and by 6.15% at FIU in the seven post-9/11 years. In both institutions, there was an upsurge in interest in study abroad during the same period, but fewer than 5% of enrolled students ventured abroad annually.
I summarized the themes associated with the post-9/11 environment of international education as perceived by my participants at both institutions as 3Ms, 3Ts, and 1D: Menace of Anxiety and Fear, Menace of Insularity and Insecurity, Menace of Over-Regulation and Bigotry, Trajectory of Opportunity, Trajectory of Contradictions, Trajectory of Illusion, Fatalism and Futility, and Dominance of Technology. Based on these findings, I recommended an integrated Internationalization At Home Plus Collaborative Outreach (IAHPCO) approach to internationalization that is based on a post-9/11 recalibration of national security and international education as complementary rather than diametrically opposed concepts.

Relevance:

10.00%

Publisher:

Abstract:

This article revisits the official culture of the early khedivate through a microhistory of the first modern Egyptian theater in Arabic. Based on archival research, it aims at a recalibration of recent scholarship by showing khedivial culture as a complex framework of competing patriotisms. It analyzes the discourse about theater in the Arabic press, including the journalist Muhammad Unsi's call for performances in Arabic in 1870. It shows that the realization of this idea was the theater group led by James Sanua between 1871 and 1872, which also performed ʿAbd al-Fattah al-Misri's tragedy. But the troupe was not an expression of subversive nationalism, as has been claimed by scholars. My historical reconstruction and my analysis of the content of Sanua's comedies show loyalism toward the Khedive Ismail. Yet his form of contemporary satire was incompatible with elite cultural patriotism, which employed historicization as its dominant technique. This revision throws new light on a crucial moment of social change in the history of modern Egypt, when the ruler was expected to preside over the plural cultural bodies of the nation. © 2014 Cambridge University Press.

Relevance:

10.00%

Publisher:

Abstract:

Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology, including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008-11 (DEGS1) and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking and systolic blood pressure risk factor levels, sex, and 5-year age groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40-65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was 29% lower, and the estimated proportion of high-risk people (10-year risk ≥ 5%) 50% lower, than with the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk.
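The recalibration logic can be sketched under a standard proportional-hazards reading of the SCORE approach: the relative-risk weights are kept fixed while the baseline survival is refitted to current mortality and mean risk factor levels. All coefficients, profiles and rates below are illustrative placeholders, not the published SCORE values.

```python
import math

# Invented log hazard ratios per unit: (cholesterol mmol/L, SBP mmHg, smoker).
BETAS = (0.24, 0.018, 0.63)

def ten_year_risk(profile, means, baseline_survival):
    """Absolute 10-year risk under a proportional-hazards model:
    risk = 1 - S0(10) ** exp(beta . (x - xbar))."""
    lin = sum(b * (x - m) for b, x, m in zip(BETAS, profile, means))
    return 1.0 - baseline_survival ** math.exp(lin)

def recalibrated_s0(observed_mean_rate):
    """New baseline survival chosen so the mean-profile risk (lin = 0)
    matches the event rate seen in current mortality statistics."""
    return 1.0 - observed_mean_rate

# Old chart: baseline 10-year survival of 0.97 at the old mean profile.
means = (5.8, 135.0, 0.0)       # hypothetical population means
smoker = (6.5, 150.0, 1.0)      # hypothetical high-risk profile
old = ten_year_risk(smoker, means, 0.97)

# Falling CVD mortality: newer statistics imply a 2% mean-profile event
# rate instead of 3%, so the same profile now scores lower.
new = ten_year_risk(smoker, means, recalibrated_s0(0.02))
print(old > new)  # the updated chart assigns a lower absolute risk
```

In the actual SCORE recalibration the baseline is age- and sex-specific and estimated with competing-risks methods; the single scalar `S0` here only shows why lower mortality shifts every chart cell downward while relative risks stay unchanged.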

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

10.00%

Publisher:

Abstract:

The present numerical investigation offers evidence concerning the validity and objectivity of the predictions of a simple, yet practical, finite element model concerning the responses of steel fibre reinforced concrete structural elements under static monotonic and cyclic loading. Emphasis is placed on realistically describing the fully brittle tensile behaviour of plain concrete and the contribution of steel fibres to the post-cracking behaviour it exhibits. The good correlation between the numerical predictions and their experimental counterparts reveals that, despite its simplicity, the subject model is capable of providing realistic predictions concerning the response of steel fibre reinforced concrete structural configurations exhibiting both ductile and brittle modes of failure, without requiring recalibration.

Relevance:

10.00%

Publisher:

Abstract:

Questions relating to contemporary understandings of democracy continue to preoccupy the academic landscape, from politics to law—how does one define democracy; is it necessary to recalibrate the concept of democracy to meet the exigencies of the current global security "crisis" and, following from this, how does one understand (and control) the democratic relationship of representation and accountability between citizen and state? Although those writing on the recalibration of democratic theory come from different points of departure, they often arrive at a similar conclusion; namely that this global era poses significant challenges to contemporary understandings of democracy. This article identifies and focuses on one challenge posed by the concept of “militant” democracy against the backdrop of the Turkish case.

Relevance:

10.00%

Publisher:

Abstract:

Oscillometric blood pressure (BP) monitors are currently used to diagnose hypertension both in home and clinical settings. These monitors take BP measurements once every 15 minutes over a 24-hour period and provide a reliable and accurate system that is minimally invasive. Although intermittent cuff measurements have proven to be a good indicator of BP, a continuous BP monitor is highly desirable for the diagnosis of hypertension and other cardiac diseases. However, no such devices currently exist. A novel algorithm has been developed based on the Pulse Transit Time (PTT) method, which would allow non-invasive and continuous BP measurement. PTT is defined as the time it takes the BP wave to propagate from the heart to a specified point on the body. After an initial BP measurement, PTT algorithms can track BP over short periods of time, known as calibration intervals. After this time has elapsed, a new BP measurement is required to recalibrate the algorithm. Using the PhysioNet database as a basis, the new algorithm was developed and tested on 15 patients, each tested 3 times over a period of 30 minutes. The predicted BP of the algorithm was compared to the arterial BP of each patient. It has been established that this new algorithm is capable of tracking BP for 12 minutes without recalibration, judged against the BHS standard, a 100% improvement over what has previously been reported. The algorithm was incorporated into a new system based on its requirements and was tested using three volunteers. The results mirrored those previously observed, providing accurate BP measurements when a 12-minute calibration interval was used. This new system provides a significant improvement to the existing method, allowing BP to be monitored continuously and non-invasively, on a beat-to-beat basis over 24 hours, adding major clinical and diagnostic value.
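The calibration-interval mechanism can be sketched as follows. The inverse-logarithmic PTT-to-SBP model and its slope are common choices from the PTT literature, used here as assumptions; this is not the specific algorithm developed in the study.

```python
import math

class PttBpTracker:
    """Track systolic BP as SBP = a * ln(PTT) + b, with the intercept b
    anchored by a cuff measurement; once `interval_s` seconds have
    elapsed the estimate is stale and a fresh cuff reading is required."""

    def __init__(self, slope=-60.0, interval_s=720.0):  # 12-minute interval
        self.a = slope          # assumed per-subject slope (mmHg per ln-second)
        self.interval_s = interval_s
        self.b = None           # intercept, set at calibration
        self.t_cal = None       # time of last cuff calibration (seconds)

    def calibrate(self, t, cuff_sbp, ptt):
        # Solve the intercept so the model reproduces the cuff reading.
        self.b = cuff_sbp - self.a * math.log(ptt)
        self.t_cal = t

    def estimate(self, t, ptt):
        if self.b is None or t - self.t_cal > self.interval_s:
            raise RuntimeError("recalibration required: take a cuff reading")
        return self.a * math.log(ptt) + self.b

tracker = PttBpTracker()
tracker.calibrate(t=0.0, cuff_sbp=120.0, ptt=0.25)  # initial cuff reading
print(tracker.estimate(t=60.0, ptt=0.24))           # shorter PTT -> higher BP
```

Beat-to-beat PTT values would come from paired ECG and photoplethysmography signals; the tracker then supplies a continuous BP estimate between the intermittent cuff recalibrations.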