972 results for data link
Abstract:
The sense and avoid capability is one of the greatest challenges that must be addressed to safely integrate unmanned aircraft systems into civil and nonsegregated airspace. This paper reviews existing regulations, recommended practices, and standards in sense and avoid for unmanned aircraft systems. Gaps and issues are identified, as are the different factors that are likely to affect actual sense and avoid requirements. It is found that the operational environment (flight altitude, meteorological conditions, and class of airspace) plays an important role in determining the type of flying hazards that the unmanned aircraft system might encounter. In addition, the automation level and the data-link architecture of the unmanned aircraft system are key factors that will ultimately determine the sense and avoid system requirements. Tactical unmanned aircraft, performing missions similar to those of general aviation, are found to be the most challenging systems from a sense and avoid point of view, and further research and development efforts are still needed before their seamless integration into nonsegregated airspace.
Abstract:
This systematic literature review aimed to investigate depression in people with epilepsy over the last decade (2005-2015), focusing on identifying, in patients with epilepsy: sociodemographic characteristics, the prevalence of depression, types of intervention for managing depression, factors associated with the onset and maintenance of depression and, finally, research trends in the study of depression in patients with epilepsy. A total of 103 articles published between 2005 and 2015 in specialized databases were reviewed. The results revealed that the prevalence of depression in patients with epilepsy is diverse and ranges widely, between 3 and 70%. The main sociodemographic characteristics associated with depression were being female, being single, and being between 25 and 45 years of age. In addition, treatments combining psychological therapy and medication are the best option for ensuring effective management of depression in patients with epilepsy. Regarding the factors associated with the onset of depression in patients with epilepsy, both neurobiological and psychosocial causes were identified; the main factors associated with its maintenance were a perception of low quality of life and low self-efficacy. Finally, the most common types of research are applied, descriptive, cross-sectional, and quantitative studies.
Abstract:
A link failure in the path of a virtual circuit in a packet data network will lead to premature disconnection of the circuit by the end-points. A soft failure will result in degraded throughput over the virtual circuit. If these failures can be detected quickly and reliably, then appropriate rerouting strategies can automatically reroute the virtual circuits that use the failed facility. In this paper, we develop a methodology for analysing and designing failure detection schemes for digital facilities. Based on errored second data, we develop a Markov model for the error and failure behaviour of a T1 trunk. The performance of a detection scheme is characterized by its false alarm probability and the detection delay. Using the Markov model, we analyse the performance of detection schemes that use physical layer or link layer information. The schemes basically rely upon detecting the occurrence of severely errored seconds (SESs). A failure is declared when a counter, driven by the occurrence of SESs, reaches a certain threshold. For hard failures, the design problem reduces to a proper choice of the threshold at which failure is declared, and of the connection reattempt parameters of the virtual circuit end-point session recovery procedures. For soft failures, the performance of a detection scheme depends, in addition, on how long and how frequent the error bursts are in a given failure mode. We also propose and analyse a novel Level 2 detection scheme that relies only upon anomalies observable at Level 2, i.e. CRC failures and idle-fill flag errors. Our results suggest that Level 2 schemes that perform as well as Level 1 schemes are possible.
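The counter-driven detection idea can be illustrated with a minimal sketch. This is not the paper's calibrated scheme: the SES criterion, the leak rate, and the threshold below are illustrative assumptions.

```python
# Minimal sketch of a counter-based failure detector driven by
# severely errored seconds (SESs), in the spirit of the scheme
# described above. Thresholds and the SES criterion are assumed
# values, not the paper's calibrated parameters.

def is_ses(crc_failures: int, flag_errors: int) -> bool:
    """A second counts as 'severely errored' if Level 2 anomalies
    (CRC failures plus idle-fill flag errors) exceed a cutoff."""
    return crc_failures + flag_errors >= 3  # assumed cutoff

def detect_failure(per_second_anomalies, threshold=8, leak=1):
    """Leaky-bucket counter: +1 per SES, -leak per clean second.

    Declares a failure (returns the 1-indexed second) when the
    counter reaches `threshold`; returns None otherwise. A higher
    threshold lowers the false alarm probability at the cost of a
    longer detection delay.
    """
    counter = 0
    for t, (crc, flags) in enumerate(per_second_anomalies, start=1):
        if is_ses(crc, flags):
            counter += 1
        else:
            counter = max(0, counter - leak)
        if counter >= threshold:
            return t
    return None

# Example: a burst of errored seconds triggers detection.
trace = [(0, 0)] * 5 + [(4, 1)] * 10 + [(0, 0)] * 5
print(detect_failure(trace))  # -> 13 (the 8th SES in the burst)
```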
Abstract:
The literature on pricing implicitly assumes an "infinite data" model, in which sources can sustain any data rate indefinitely. We assume a more realistic "finite data" model, in which sources occasionally run out of data; this leads to variable user data rates. Further, we assume that users have contracts with the service provider, specifying the rates at which they can inject traffic into the network. Our objective is to study how prices can be set such that a single link can be shared efficiently and fairly among users in a dynamically changing scenario where a subset of users occasionally has little data to send. User preferences are modelled by concave increasing utility functions. Further, we introduce two additional elements: a convex increasing disutility function and a convex increasing multiplicative congestion-penalty function. The disutility function takes the shortfall (contracted rate minus present rate) as its argument, and essentially encourages users to send traffic at their contracted rates, while the congestion-penalty function discourages heavy users from sending excess data when the link is congested. We obtain simple necessary and sufficient conditions on prices for fair and efficient link sharing; moreover, we show that a single price for all users achieves this. We illustrate the ideas using a simple experiment.
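One plausible formalization of the model sketched in this abstract is given below; the symbols are introduced here for illustration and need not match the paper's notation.

```latex
% User i, with contracted rate c_i, chooses its sending rate r_i
% to maximize net benefit under price p:
\[
  \max_{r_i \ge 0}\;
  U_i(r_i) \;-\; D_i\!\big((c_i - r_i)^+\big)
  \;-\; p\, g\Big(\sum_j r_j\Big)\, r_i ,
\]
% where U_i is a concave increasing utility of the rate, D_i a
% convex increasing disutility of the shortfall (c_i - r_i)^+,
% and g a convex increasing multiplicative congestion penalty in
% the total link load. The abstract's claim is that a single
% price p, common to all users, suffices for fair and efficient
% link sharing.
```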
Abstract:
The literature on pricing implicitly assumes an "infinite data" model, in which sources can sustain any data rate indefinitely. We assume a more realistic "finite data" model, in which sources occasionally run out of data. Further, we assume that users have contracts with the service provider, specifying the rates at which they can inject traffic into the network. Our objective is to study how prices can be set such that a single link can be shared efficiently and fairly among users in a dynamically changing scenario where a subset of users occasionally has little data to send. We obtain simple necessary and sufficient conditions on prices such that efficient and fair link sharing is possible. We illustrate the ideas using a simple example.
Abstract:
The trophic link density and the stability of food webs are thought to be related, but the nature of this relation is controversial. This article introduces a method for estimating the link density from diet tables which do not cover the complete food web and do not resolve all diet items to species level. A simple formula for the error of this estimate is derived. Link density is determined as a function of a threshold diet fraction below which diet items are ignored.
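The threshold idea can be illustrated with a small sketch. The diet table and the naive counting estimator below are hypothetical stand-ins, not the article's error-corrected method.

```python
# Illustrative computation of trophic link density from a diet
# table with a threshold diet fraction. The data and the simple
# estimator are assumptions for illustration only.

# diet[predator] maps each diet item to its fraction of the diet.
diet = {
    "cod":     {"herring": 0.6, "sprat": 0.3, "copepods": 0.1},
    "herring": {"copepods": 0.85, "detritus": 0.15},
    "sprat":   {"copepods": 0.95, "detritus": 0.05},
}

def link_density(diet, threshold):
    """Links per species, counting only diet items whose fraction
    is at least `threshold`."""
    species = set(diet)
    for items in diet.values():
        species.update(items)
    links = sum(
        1
        for items in diet.values()
        for frac in items.values()
        if frac >= threshold
    )
    return links / len(species)

# Raising the threshold discards weak links and lowers the density.
for thr in (0.0, 0.1, 0.5):
    print(thr, link_density(diet, thr))  # 1.4, 1.2, 0.6
```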
Abstract:
In the B-ISDN there is a provision for four classes of services, all of them supported by a single transport network (the ATM network). Three of these services, the connection-oriented (CO) ones, permit connection access control (CAC), but the fourth, the connectionless-oriented (CLO) one, does not. Therefore, when the CLO service and CO services have to share the same ATM link, a conflict may arise. This is because a bandwidth allocation that obtains maximum statistical gain can damage the contracted ATM quality of service (QoS); and vice versa, in order to guarantee the contracted QoS, the statistical gain has to be sacrificed. The paper presents a performance evaluation study of the influence of the CLO service on a CO service (a circuit emulation service or a variable bit-rate service) when sharing the same link.
Abstract:
The purpose of this thesis is to analyze and design a system able to support the definition of data in the format used to formally define data semantics, and above all to support the complex and innovative activity of link discovery. This is a very powerful activity that, using the tools and rules of the Semantic Web (also called the Web of Data), makes it possible, given a source knowledge base and other external knowledge bases distributed across the Web, to interconnect the data of the source knowledge base with the external data by means of complex interlinking algorithms. These algorithms interconnect the concepts expressed in the source and external data sets, expressing the semantics of the link according to complex comparison criteria defined in the algorithm. Through this activity, the knowledge of the source knowledge base can be increased considerably; if all knowledge bases present in the Web of Data followed this procedure, the defined knowledge would grow to levels limited only by the immense vastness of the Web, yielding unparalleled data-processing power. This system has the ambitious goal of providing a tool that significantly increases the presence of Linked Open Data, primarily at the national level but also internationally, in support of public and private bodies that, through this system, can open up new business and data-usage scenarios, giving data a power that is currently only imaginable.
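A minimal sketch of the interlinking idea follows: compare entities of a source knowledge base against an external one and emit owl:sameAs links when a similarity criterion is met. The toy data, the label-based comparison, and the 0.85 threshold are illustrative assumptions; real link-discovery tools use far richer, configurable comparison rules.

```python
# Minimal link-discovery sketch: generate owl:sameAs triples by
# comparing entity labels between two knowledge bases. Data and
# threshold are hypothetical.
from difflib import SequenceMatcher

source_kb = {
    "ex:Roma":   "Roma",
    "ex:Milano": "Milano",
}
external_kb = {
    "dbpedia:Rome":  "Rome",
    "dbpedia:Milan": "Milano",
}

def discover_links(source, external, threshold=0.85):
    """Yield (source_uri, 'owl:sameAs', external_uri) triples for
    label pairs whose string similarity meets the threshold."""
    for s_uri, s_label in source.items():
        for e_uri, e_label in external.items():
            sim = SequenceMatcher(None, s_label.lower(), e_label.lower()).ratio()
            if sim >= threshold:
                yield (s_uri, "owl:sameAs", e_uri)

for triple in discover_links(source_kb, external_kb):
    print(*triple)
# -> ex:Milano owl:sameAs dbpedia:Milan  (exact label match;
#    "Roma"/"Rome" falls below the 0.85 threshold)
```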
Abstract:
Geostrophic surface velocities can be derived from the gradients of the mean dynamic topography, the difference between the mean sea surface and the geoid. Therefore, independently observed mean dynamic topography data are valuable input parameters and constraints for ocean circulation models. For a successful fit to observational dynamic topography data, not only the mean dynamic topography on the particular ocean model grid is required, but also information about its inverse covariance matrix. The calculation of the mean dynamic topography from satellite-based gravity field models and altimetric sea surface height measurements, however, is not straightforward. For this purpose, we previously developed an integrated approach to combining these two different observation groups in a consistent way without using the common filter approaches (Becker et al. in J Geodyn 59(60):99-110, 2012, doi:10.1016/j.jog.2011.07.0069; Becker in Konsistente Kombination von Schwerefeld, Altimetrie und hydrographischen Daten zur Modellierung der dynamischen Ozeantopographie, 2012, http://nbn-resolving.de/nbn:de:hbz:5n-29199). Within this combination method, the full spectral range of the observations is considered. Further, it allows the direct determination of the normal equations (i.e., the inverse of the error covariance matrix) of the mean dynamic topography on arbitrary grids, which is one of the requirements for ocean data assimilation. In this paper, we report progress through selection and improved processing of altimetric data sets. We focus on the preprocessing steps of along-track altimetry data from Jason-1 and Envisat to obtain a mean sea surface profile. During this procedure, a rigorous variance propagation is accomplished, so that, for the first time, the full covariance matrix of the mean sea surface is available. The combination of the mean profile and a combined GRACE/GOCE gravity field model yields a mean dynamic topography model for the North Atlantic Ocean that is characterized by a defined set of assumptions. We show that including the geodetically derived mean dynamic topography with the full error structure in a 3D stationary inverse ocean model improves modeled oceanographic features over previous estimates.
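The basic relations behind the geodetic approach can be written out in simplified notation, chosen here for illustration and not necessarily matching the paper's symbols.

```latex
% The mean dynamic topography \bar{\eta} is the difference between
% the mean sea surface height h and the geoid height N:
\[
  \bar{\eta} = h - N ,
\]
% and geostrophic surface velocities follow from its gradients
% (f Coriolis parameter, g gravitational acceleration):
\[
  u = -\frac{g}{f}\,\frac{\partial \bar{\eta}}{\partial y},
  \qquad
  v = \frac{g}{f}\,\frac{\partial \bar{\eta}}{\partial x}.
\]
% Assimilation then requires not only \bar{\eta} on the model grid
% but also its inverse error covariance matrix, i.e. the normal
% equation matrix delivered by the combination method.
```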
Abstract:
ENVISAT ASAR WSM images with pixel size 150 × 150 m, acquired in different meteorological, oceanographic and sea ice conditions, were used to detect icebergs in the Amundsen Sea (Antarctica). An object-based method for automatic iceberg detection from SAR data has been developed and applied. The object identification is based on spectral and spatial parameters on 5 scale levels, and was verified against manual classification in four polygon areas, chosen to represent varying environmental conditions. The algorithm works comparatively well in the freezing temperatures and strong wind conditions prevailing in the Amundsen Sea during the year. The detection rate was 96%, which corresponds to 94% of the area (counting icebergs larger than 0.03 km²), for all seasons. The presented algorithm tends to generate errors in the form of false alarms, mainly caused by the presence of ice floes, rather than misses. This affects the reliability, since false alarms were manually corrected post-analysis.
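The verification metrics quoted above (detection rate by object count and by area, plus false alarms) can be illustrated with a small sketch; the object lists below are hypothetical stand-ins for the SAR-detected and manually classified icebergs.

```python
# Illustrative computation of iceberg-detection verification
# metrics. The object IDs and areas are invented for the example.

MIN_AREA_KM2 = 0.03  # only icebergs larger than this are counted

manual   = {"A": 0.12, "B": 0.55, "C": 0.04, "D": 2.10}  # id -> area (km^2)
detected = {"A": 0.11, "B": 0.57, "D": 2.05, "X": 0.20}  # "X" is a false alarm

ref = {k: a for k, a in manual.items() if a >= MIN_AREA_KM2}
hits = set(ref) & set(detected)

rate_by_count = len(hits) / len(ref)                       # 75% here
rate_by_area = sum(ref[k] for k in hits) / sum(ref.values())
false_alarms = set(detected) - set(manual)                 # {"X"}

print(f"detection rate: {rate_by_count:.0%} of objects, "
      f"{rate_by_area:.0%} of area; false alarms: {sorted(false_alarms)}")
```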
Abstract:
There is increasing evidence that different light intensities strongly modulate the effects of ocean acidification (OA) on marine phytoplankton. The aim of the present study was to investigate interactive effects of OA and dynamic light, mimicking natural mixing regimes. The Antarctic diatom Chaetoceros debilis was grown under two pCO2 (390 and 1000 µatm) and light conditions (constant and dynamic), the latter yielding the same integrated irradiance over the day. To characterize interactive effects between treatments, growth, elemental composition, primary production and photophysiology were investigated. Dynamic light reduced growth and strongly altered the effects of OA on primary production, which was unaffected by elevated pCO2 under constant light, yet significantly reduced under dynamic light. Interactive effects between OA and light were also observed for Chl production and particulate organic carbon (POC) quotas. Response patterns can be explained by changes in the cellular energetic balance. While the energy transfer efficiency from photochemistry to biomass production (Φe,C) was not affected by OA under constant light, it was drastically reduced under dynamic light. Contrasting responses under different light conditions need to be considered when making predictions regarding a more stratified and acidified future ocean.