78 results for Offset Paraboloidal reflectors
Abstract:
Clock synchronization is critical for the operation of a distributed wireless network system. In this paper we investigate a method able to evaluate in real time the synchronization offset between devices down to nanoseconds (as needed for positioning). The method is inspired by signal processing algorithms and relies on fine-grain time information obtained during the reconstruction of the signal at the receiver. Applying the method to a GPS-synchronized system shows that GPS-based synchronization has high accuracy potential but still suffers from short-term clock drift, which limits the achievable localization error.
Abstract:
This study compared Pundamilia nyererei and Pundamilia pundamilia males in routine metabolic rate (RR) and in the metabolic costs males pay during territorial interactions (active metabolic rate, RA). Pundamilia nyererei and P. pundamilia males housed in social isolation did not differ in RR. In contrast to expectation, however, P. nyererei males used less oxygen than P. pundamilia males, for a given mass and level of agonistic activity. This increased metabolic efficiency may be an adaptation to limit the metabolic cost that P. nyererei males pay for their higher rate of aggressiveness compared to P. pundamilia males. Thus, the divergence between the species in agonistic behaviour is correlated with metabolic differentiation. Such concerted divergence in physiology and behaviour might be widespread in the dramatically diverse cichlid radiations in East African lakes and may be an important factor in the remarkably rapid speciation of these fishes. The results did not support the hypothesis that higher metabolic rates caused a physiological cost to P. nyererei males that would offset their dominance advantage.
Abstract:
In addition to plasma metabolites and hormones participating as humoral signals in the control of feed intake, oxidative metabolic processes in peripheral organs also generate signals to terminate feeding. Although the degree of oxidation over longer periods is relatively constant, recent work suggests that the periprandial pattern of fuel oxidation is involved in regulating feeding behavior in the bovine. However, the association between periprandial oxidative metabolism and feed intake of dairy cows has not yet been studied. Therefore, the aim of this study was to elucidate possible associations existing between single feed intake events and whole-body net fat and net carbohydrate oxidation as well as their relation to plasma metabolite concentrations. To this end, 4 late-lactating cows equipped with jugular catheters were kept in respiratory chambers with continuous and simultaneous recording of gas exchange and feed intake. Animals were fed ad libitum (AL) for 24 h and then feed restricted (RE) to 50% of the previous AL intake for a further 24 h. Blood samples were collected hourly to analyze β-hydroxybutyrate (BHBA), glucose, nonesterified fatty acids (NEFA), insulin, and acylated ghrelin concentrations. Cross-correlation analysis revealed an offset ranging between 30 and 42 min between the maximum of a feed intake event and the lowest level of postprandial net fat oxidation (FOX(net)) and the maximum level of postprandial net carbohydrate oxidation (COX(net)), respectively. During the AL period, FOX(net) did not increase above −0.2 g/min, whereas COX(net) did not decrease below 6 g/min before the start of the next feed intake event. A strong inverse cross-correlation was obtained between COX(net) and plasma glucose concentration. Direct cross-correlations were observed between COX(net) and insulin, between heat production and BHBA, between insulin and glucose, and between BHBA and ghrelin. We found no cross-correlation between FOX(net) and NEFA.
During RE, FOX(net) increased exponentially, exceeded the threshold of −0.2 g/min as indicated by increasing plasma NEFA concentrations, and approached a maximum rate of 0.1 g/min, whereas COX(net) decayed exponentially, approaching a minimal COX(net) rate of about 2.5 g/min in all cows. Our novel findings suggest that, in late-lactating cows, postprandial increases in metabolic oxidative processes seem to signal suppression of feed intake, whereas preprandially an accelerated FOX(net) rate and a decelerated COX(net) rate initiate feed intake.
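The lag-finding step behind this abstract's 30–42 min offset can be sketched as a cross-correlation scan over synthetic data. The meal times, the 36-min delay, and the noise level below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
minutes = np.arange(24 * 60)                      # one day at 1-min resolution
intake = np.zeros(minutes.size)
for meal_start in (300, 700, 1100):               # three hypothetical meals
    intake[meal_start:meal_start + 30] = 1.0      # 30-min intake events

lag_true = 36                                     # assumed postprandial delay, min
response = np.roll(intake, lag_true) + rng.normal(0, 0.05, minutes.size)

def xcorr_at(x, y, lag):
    """Pearson correlation of x(t) with y(t + lag)."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

# scan lags from 0 to 120 min and keep the one with maximal correlation
best_lag = max(range(121), key=lambda lag: xcorr_at(intake, response, lag))
print(best_lag)
```

The argmax of the correlation recovers the assumed delay; in the study the same idea is applied to measured intake and gas-exchange traces rather than a synthetic pulse train.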
Abstract:
CONTEXT: E-learning resources, such as virtual patients (VPs), can be more effective when they are integrated in the curriculum. To gain insights that can inform guidelines for the curricular integration of VPs, we explored students' perceptions of scenarios with integrated and non-integrated VPs aimed at promoting clinical reasoning skills. METHODS: During their paediatric clerkship, 116 fifth-year medical students were given at least ten VPs embedded in eight integrated scenarios and as non-integrated add-ons. The scenarios differed in the sequencing and alignment of VPs and related educational activities, tutor involvement, number of VPs, relevance to assessment and involvement of real patients. We sought students' perceptions of the VP scenarios in focus group interviews with eight groups of 4-7 randomly selected students (n = 39). The interviews were recorded, transcribed and analysed qualitatively. RESULTS: The analysis resulted in six themes reflecting students' perceptions of important features for effective curricular integration of VPs: (i) continuous and stable online access, (ii) increasing complexity, adapted to students' knowledge, (iii) VP-related workload offset by elimination of other activities, (iv) optimal sequencing (e.g. lecture, then one or two VPs, then tutor-led small group discussion, then a real patient), (v) optimal alignment of VPs and educational activities, and (vi) inclusion of VP topics in assessment. CONCLUSIONS: The themes appear to offer starting points for the development of a framework to guide the curricular integration of VPs. Their impact needs to be confirmed by studies using quantitative controlled designs.
Abstract:
The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data consisting of a velocity component, offsets, and white and flicker noise (1/f spectrum noise) composed in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to the best manual solutions). For instance, the 5th to 95th percentile range in velocity bias for automated approaches is 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset for the best manual and automated solutions equal to 5 mm and 8 mm, respectively. Assuming the simulated time series noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities below 0.2–0.4 mm/yr is therefore not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
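The way an undetected offset biases a velocity estimate can be illustrated with a toy version of the experiment's additive model. The 10-year span, 3 mm/yr velocity, 8 mm step, and white-noise level below are illustrative assumptions (flicker noise is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 365.25)          # 10 years of daily solutions, in years
velocity = 3.0                            # true velocity, mm/yr (assumed)
series = velocity * t + rng.normal(0, 2.0, t.size)   # white noise only, mm

# one undetected 8 mm offset halfway through the series
step_idx = t.size // 2
series[step_idx:] += 8.0

# a trend fitted while ignoring the offset is biased ...
v_ignored = np.polyfit(t, series, 1)[0]

# ... while estimating the step jointly with the trend removes the bias
A = np.column_stack([t, np.ones(t.size),
                     (np.arange(t.size) >= step_idx).astype(float)])
v_modelled = np.linalg.lstsq(A, series, rcond=None)[0][0]

print(f"true {velocity:.2f}, ignored {v_ignored:.2f}, "
      f"modelled {v_modelled:.2f} mm/yr")
```

For a mid-series step the slope bias is roughly 1.5 × (step size)/(span), i.e. over 1 mm/yr here, which is why offset detection dominates the velocity error budget in the experiment.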
Abstract:
An ever increasing number of low Earth orbiting (LEO) satellites is, or will be, equipped with retro-reflectors for Satellite Laser Ranging (SLR) and on-board receivers to collect observations from Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) and, in the future, the Russian GLONASS and the European Galileo systems. At the Astronomical Institute of the University of Bern (AIUB), LEO precise orbit determination (POD) using either GPS or SLR data is performed for a wide range of applications for satellites at different altitudes. For this purpose the classical numerical integration techniques, as also used for dynamic orbit determination of satellites at high altitudes, are extended by pseudo-stochastic orbit modeling techniques to efficiently cope with potential force model deficiencies for satellites at low altitudes. Accuracies of better than 2 cm may be achieved by pseudo-stochastic orbit modeling for satellites at very low altitudes, such as for the GPS-based POD of the Gravity field and steady-state Ocean Circulation Explorer (GOCE).
Abstract:
Atrial fibrillation (AF) is associated with an increased risk of thromboembolism, and is the most prevalent factor for cardioembolic stroke. Vitamin K antagonists (VKAs) have been the standard of care for stroke prevention in patients with AF since the early 1990s. They are very effective for the prevention of cardioembolic stroke, but are limited by factors such as drug-drug interactions, food interactions, slow onset and offset of action, haemorrhage and need for routine anticoagulation monitoring to maintain a therapeutic international normalised ratio (INR). Multiple new oral anticoagulants have been developed as potential replacements for VKAs for stroke prevention in AF. Most are small synthetic molecules that target thrombin (e.g. dabigatran etexilate) or factor Xa (e.g. rivaroxaban, apixaban, edoxaban, betrixaban, YM150). These drugs have predictable pharmacokinetics that allow fixed dosing without routine laboratory monitoring. Dabigatran etexilate, the first of these new oral anticoagulants to be approved by the United States Food and Drug Administration and the European Medicines Agency for stroke prevention in patients with non-valvular AF, represents an effective and safe alternative to VKAs. Under the auspices of the Regional Anticoagulation Working Group, a multidisciplinary group of experts in thrombosis and haemostasis from Central and Eastern Europe, an expert panel with expertise in AF convened to discuss practical, clinically important issues related to the long-term use of dabigatran for stroke prevention in non-valvular AF. The practical information reviewed in this article will help clinicians make appropriate use of this new therapeutic option in daily clinical practice.
Abstract:
Correct estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constrain the firn depth evolution in Antarctica are presented over the last deglaciation: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in ice cores, assuming that δ15N is only affected by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the δ15N glacial levels measured at JRI and EDML – a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial firn model–δ15N data mismatch for this site. While we could not conduct an in-depth study of the influence of impurities in snow on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of accumulation rate may have been underestimated in the current description of firnification models.
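The gravitational-fractionation assumption used here follows the standard barometric expression, δ15N ≈ [exp(Δm·g·z/(R·T)) − 1] × 1000 ‰ with Δm = 1 g/mol for the 15N14N–14N14N pair. A minimal sketch (the 70 m column height and 228 K temperature are illustrative, not values from the cores discussed):

```python
import math

def delta15n_grav(z, T):
    """Gravitational enrichment of d15N (per mil) for a diffusive firn
    column of height z (m) at temperature T (K):
    delta = (exp(dm*g*z / (R*T)) - 1) * 1000, dm = 1e-3 kg/mol."""
    dm, g, R = 1.0e-3, 9.81, 8.314
    return (math.exp(dm * g * z / (R * T)) - 1.0) * 1000.0

# e.g. a ~70 m diffusive column at 228 K gives a few tenths of a per mil
print(round(delta15n_grav(70.0, 228.0), 3))
```

Inverting this relation is what turns a measured δ15N profile into an estimate of diffusive column height, which is then compared against the densification model's lock-in depth.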
Abstract:
For atmospheric CO2 reconstructions using ice cores, the technique used to release the trapped air from the ice samples is essential for the precision and accuracy of the measurements. We present here a new dry extraction technique in combination with a new gas analytical system that together show significant improvements with respect to current systems. Ice samples (3–15 g) are pulverised using a novel centrifugal ice microtome (CIM) by shaving the ice in a cooled vacuum chamber (−27 °C) in which no friction occurs due to the use of magnetic bearings. Neither the shaving principle of the CIM nor the use of magnetic bearings has been applied so far in this field. Shaving the ice samples produces finer ice powder and releases a minimum of 90% of the trapped air, compared to 50%–70% when needle crushing is employed. In addition, the friction-free motion, together with an optimized design that reduces contamination of the inner surfaces of the device, results in a reduced system offset of about 2.0 ppmv compared to 4.9 ppmv. The gas analytical part shows a precision higher than that of the corresponding part of our previous system by a factor of two, and all processes except the loading and cleaning of the CIM now run automatically. Compared to our previous system, the complete system shows a 3 times better measurement reproducibility of about 1.1 ppmv (1σ), which is similar to the best reproducibility of other systems applied in this field. With this high reproducibility, replicate measurements are no longer required for most future measurement campaigns, resulting in a possible output of 12–20 measurements per day compared to a maximum of 6 with other systems.
Abstract:
Time-based localization techniques such as multilateration are favoured for positioning with wide-band signals. Applying the same techniques to narrow-band signals such as GSM is not so trivial. The process is challenged by the need for synchronization accuracy and timestamp resolution, both in the nanosecond range. We propose approaches to deal with both challenges. On the one hand, we introduce a method to eliminate the negative effect of synchronization offset on time measurements. On the other hand, we obtain timestamps with nanosecond accuracy by using timing information from the signal processing chain. For a set of experiments, ranging from suburban to indoor environments, we show that our proposed approaches are able to improve the localization accuracy of TDOA approaches by several factors. We are even able to demonstrate errors as small as 10 meters in outdoor settings with narrow-band signals.
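The TDOA idea underlying this abstract can be sketched in two dimensions: differencing arrival times against a reference receiver cancels the unknown transmit time (per-receiver clock offsets, the paper's actual concern, must still be removed separately). The receiver geometry, source position, and brute-force grid estimator below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

c = 299_792_458.0                                   # speed of light, m/s
rx = np.array([[0.0, 0.0], [800.0, 0.0], [400.0, 600.0]])  # receivers, m
src = np.array([300.0, 250.0])                      # true transmitter position

toa = np.linalg.norm(rx - src, axis=1) / c          # ideal times of arrival
# differencing against receiver 0 cancels the unknown emission time
tdoa = toa[1:] - toa[0]

# brute-force least squares over a 1 m grid (illustrative estimator only)
xs, ys = np.arange(0.0, 801.0), np.arange(0.0, 601.0)
X, Y = np.meshgrid(xs, ys)
d = np.hypot(rx[:, 0, None, None] - X, rx[:, 1, None, None] - Y) / c
cost = (((d[1:] - d[0]) - tdoa[:, None, None]) ** 2).sum(axis=0)
iy, ix = np.unravel_index(cost.argmin(), cost.shape)
est = np.array([xs[ix], ys[iy]])
print(est)
```

With nanosecond-level timing errors, c·Δt is tens of centimetres to metres of range error, which is why the abstract ties both synchronization accuracy and timestamp resolution to the achievable 10 m outdoor error.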
Abstract:
The relationship between trade and culture can be singled out and deservedly labelled as unique in the discussion of 'trade and ...' issues. The reasons for this exceptional quality lie in the intensity of the relationship, which is indeed most often framed as 'trade versus culture' and has been a significant stumbling block, especially where audiovisual services are concerned, in the Uruguay Round and in the subsequent developments. The second specificity of the relationship is that the international community has organised its efforts in a rather effective manner to offset the lack of satisfying solutions within the framework of the WTO. The legally binding UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions is a clear sign of the potency of the international endeavour, on the one hand, and of the (almost desperate) desire to contest the existing WTO norms in the field of trade and culture, on the other. A third distinctive characteristic of the pair 'trade and culture', which is rarely mentioned and blissfully ignored in any Geneva or Paris talks, is that while the pro-trade and pro-culture opponents have been digging deeper in their respective trenches, the environment where trade and cultural issues are to be regulated has radically changed. The emergence and spread of digital technologies have modified profoundly the conditions for cultural content creation, distribution and access, and rendered some of the associated market failures obsolete, thus mitigating to a substantial degree the 'clash' nature of trade and culture. Against this backdrop, the present paper analyses in a finer-grained manner the move from 'trade and culture' towards 'trade versus culture'.
It argues that both the domain of trade and that of culture have suffered from aspirations to draw clearer lines between the WTO and other trade-related issues, charging the conflict to an extent that leaves few opportunities for the practical solutions which, in an advanced digital setting, would have been feasible.
Abstract:
Gaining economic benefits from substantially lower labor costs has been reported as a major reason for offshoring labor-intensive information systems services to low-wage countries. However, if wage differences are so high, why is there such a high level of variation in the economic success between offshored IS projects? This study argues that offshore outsourcing involves a number of extra costs for the client organization that account for the economic failure of offshore projects. The objective is to disaggregate these extra costs into their constituent parts and to explain why they differ between offshored software projects. The focus is on software development and maintenance projects that are offshored to Indian vendors. A theoretical framework is developed a priori based on transaction cost economics (TCE) and the knowledge-based view of the firm, complemented by factors that acknowledge the specific offshore context. The framework is empirically explored using a multiple case study design including six offshored software projects in a large German financial service institution. The results of our analysis indicate that the client incurs post-contractual extra costs for four types of activities: (1) requirements specification and design, (2) knowledge transfer, (3) control, and (4) coordination. In projects that require a high level of client-specific knowledge about idiosyncratic business processes and software systems, these extra costs were found to be substantially higher than in projects where more general knowledge was needed. Notably, these costs most often arose independently from the threat of opportunistic behavior, challenging the predominant TCE logic of market failure.
Rather, the client extra costs were particularly high in client-specific projects because the effort for managing the consequences of the knowledge asymmetries between client and vendor was particularly high in these projects. Prior experiences of the vendor with related client projects were found to reduce the level of extra costs but could not fully offset the increase in extra costs in highly client-specific projects. Moreover, cultural and geographic distance between client and vendor as well as personnel turnover were found to increase client extra costs. Slight evidence was found, however, that the cost-increasing impact of these factors was also leveraged in projects with a high level of required client-specific knowledge (moderator effect).
Abstract:
Highly reflective materials in the microwave region play a very important role in the realization of antenna reflectors for a broad range of applications, including radiometry. These reflectors have a characteristic emissivity which needs to be characterized accurately in order to perform a correct radiometric calibration of the instrument. Such a characterization can be performed by using open resonators, waveguide cavities or radiometric measurements. The latter consists of comparative radiometric observations of absorbers, reference mirrors and the sample under test, or of using the cold sky radiation as a direct reference source. While the first two techniques are suitable for the characterization of metal plates and mirrors, the latter has the advantage of also being applicable to soft materials. This paper describes how, through these radiometric techniques, it is possible to characterize the emissivity of the sample relative to a reference mirror, and how to characterize the absolute emissivity of the latter by performing measurements at different incidence angles. The results presented in this paper are based on our investigations of the emissivity of a multilayer insulation (MLI) material for space missions, at frequencies of 22 and 90 GHz.
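The cold-sky comparative approach reduces to a single radiative balance: the antenna temperature seen off the sample is T_meas = e·T_phys + (1 − e)·T_ref, which can be solved for the emissivity e. A minimal sketch with made-up temperatures (the function and numbers are illustrative, not the paper's):

```python
def emissivity(t_meas, t_phys, t_ref):
    """Sample emissivity from a comparative radiometric measurement.

    Assumes the measured antenna temperature obeys
        t_meas = e * t_phys + (1 - e) * t_ref,
    where t_phys is the sample's physical temperature and t_ref is the
    brightness temperature of the reference (e.g. cold sky), in kelvin.
    """
    return (t_meas - t_ref) / (t_phys - t_ref)

# illustrative numbers: a near-perfect reflector viewed against cold sky
e = emissivity(t_meas=11.4, t_phys=295.0, t_ref=10.0)
print(f"{e:.5f}")
```

The smaller the contrast between T_meas and T_ref, the lower the emissivity, which is why a very cold, stable reference is needed to resolve the sub-percent emissivities of good reflector materials.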
Abstract:
In this paper, we present a novel technique for the removal of astigmatism in submillimeter-wave optical systems through employment of a specific combination of so-called astigmatic off-axis reflectors. This technique treats an orthogonally astigmatic beam using skew Gaussian beam analysis, from which an anastigmatic imaging network is derived. The resultant beam is considered truly stigmatic, with all Gaussian beam parameters in the orthogonal directions being matched. This is thus considered an improvement over previous techniques wherein a beam corrected for astigmatism has only the orthogonal beam amplitude radii matched, with phase shift and phase radius of curvature not considered. This technique is computationally efficient, negating the requirement for computationally intensive numerical analysis of shaped reflector surfaces. The required optical surfaces are also relatively simple to implement compared to such numerically optimized shaped surfaces. This technique is implemented in this work as part of the complete optics train for the STEAMR antenna. The STEAMR instrument is envisaged as a multi-beam limb sounding instrument operating at submillimeter wavelengths. The antenna optics arrangement for this instrument uses multiple off-axis reflectors to control the incident beams and couple them to their corresponding receiver feeds. An anastigmatic imaging network is successfully implemented into an optical model of this antenna, and the resultant design ensures optimal imaging of the beams to the corresponding feed horns. This example also addresses the challenges of imaging in multi-beam antenna systems.
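The distinction drawn here (matching only the amplitude radii versus matching the full complex beam parameter) can be illustrated with the standard Gaussian beam relation 1/q = 1/R − iλ/(πw²). The wavelength and beam values below are illustrative assumptions, not STEAMR design parameters:

```python
import math

def q_param(R, w, wavelength):
    """Complex Gaussian beam parameter q from phase radius of curvature
    R (m) and amplitude radius w (m): 1/q = 1/R - i*lam/(pi*w**2)."""
    inv_q = 1.0 / R - 1j * wavelength / (math.pi * w ** 2)
    return 1.0 / inv_q

lam = 0.545e-3          # ~550 GHz submillimetre wavelength, m (assumed)

# two orthogonal cross-sections with equal amplitude radii w ...
qx = q_param(R=0.5, w=0.010, wavelength=lam)
qy = q_param(R=0.8, w=0.010, wavelength=lam)

# ... remain astigmatic: the complex parameters differ because the
# phase radii of curvature differ
print(abs(qx - qy) > 1e-6)

# a truly stigmatic beam needs both R and w matched in the two planes,
# i.e. identical complex q parameters
qy_matched = q_param(R=0.5, w=0.010, wavelength=lam)
print(abs(qx - qy_matched) < 1e-12)
```

This is the criterion the anastigmatic imaging network targets: equal q in both orthogonal planes, not merely equal beam radii.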
Abstract:
The International GNSS Service (IGS) provides operational products for the GPS and GLONASS constellations. Homogeneously processed time series of parameters from the IGS are only available for GPS. Reprocessed GLONASS series are provided only by individual Analysis Centers (i.e. CODE and ESA), making it difficult to fully include the GLONASS system into a rigorous GNSS analysis. In view of the increasing number of active GLONASS satellites and a steadily growing number of GPS+GLONASS-tracking stations available over the past few years, Technische Universität Dresden, Technische Universität München, Universität Bern and Eidgenössische Technische Hochschule Zürich performed a combined reprocessing of GPS and GLONASS observations. Also, SLR observations to GPS and GLONASS are included in this reprocessing effort. Here, we show only SLR results from a GNSS orbit validation. In total, 18 years of data (1994–2011) have been processed from altogether 340 GNSS and 70 SLR stations. The use of GLONASS observations in addition to GPS has no impact on the estimated linear terrestrial reference frame parameters. However, daily station positions show an RMS reduction of 0.3 mm on average for the height component when additional GLONASS observations can be used for the time series determination. Analyzing satellite orbit overlaps, the rigorous combination of GPS and GLONASS neither improves nor degrades the GPS orbit precision. For GLONASS, however, the quality of the microwave-derived GLONASS orbits improves due to the combination. These findings are confirmed using independent SLR observations for a GNSS orbit validation. In comparison to previous studies, mean SLR biases for satellites GPS-35 and GPS-36 could be reduced in magnitude from −35 and −38 mm to −12 and −13 mm, respectively. Our results show that remaining SLR biases depend on the satellite type and the use of coated or uncoated retro-reflectors.
For Earth rotation parameters, the increasing number of GLONASS satellites and tracking stations over the past few years leads to differences between GPS-only and GPS+GLONASS combined solutions which are most pronounced in the pole rate estimates with maximum 0.2 mas/day in magnitude. At the same time, the difference between GLONASS-only and combined solutions decreases. Derived GNSS orbits are used to estimate combined GPS+GLONASS satellite clocks, with first results presented in this paper. Phase observation residuals from a precise point positioning are at the level of 2 mm and particularly reveal poorly modeled yaw maneuver periods.