41 results for Free-space method


Relevance: 30.00%

Abstract:

INTRODUCTION: The coverage of recurrent pressure sores surrounded by unstable scar tissue remains an unsolved problem in the literature. Local and regional tissue transfer often cannot meet the requirements of the tissue deficit; especially in recurrent pressure sores, the adjacent skin has already been consumed by multiple previous surgeries. As a good alternative, microsurgical flap transfer offers viable tissue to cover even large pressure sores. METHODS: We performed a total of six free flaps in five patients who suffered from intractable pressure sores in the hip region. Patient age ranged from 41 to 63 years, and defect size varied between 6 x 6 cm and 25 x 30 cm. Two combined myocutaneous scapula-latissimus dorsi, two myocutaneous latissimus dorsi, one anteromedial thigh, and one rectus femoris flap were used to cover the defects. RESULTS: The average follow-up time was 29 months. Flaps provided stable coverage in four of five patients at 12-month follow-up. One subtotal flap necrosis occurred and was subsequently treated with split-thickness skin grafting. CONCLUSION: In this series of five patients with six free flaps, we were able to show that microsurgical tissue transfer is a valuable option in the treatment of difficult pressure sores. Even in older and debilitated patients, this method is a good alternative to conventional local flaps.

Relevance: 30.00%

Abstract:

Whereas a non-operative approach is generally accepted for hemodynamically stable patients with free intraabdominal fluid in the presence of solid organ injury, free fluid in the abdomen without evidence of solid organ injury presents a challenge not only for the treating emergency physician but also for the surgeon in charge. Despite recent advances in imaging modalities, with multi-detector computed tomography (CT) (with or without contrast agent) usually being the imaging method of choice, diagnosis and interpretation of the results remain difficult. While some studies conclude that CT is highly accurate and relatively specific at diagnosing mesenteric and hollow viscus injury, other studies deem CT to be unreliable. These differences may in part be due to the experience and the interpretation of the radiologist and/or the treating physician or surgeon. A search of the literature makes it apparent that there is no straightforward answer to the question of what to do with patients with free intraabdominal fluid on CT scanning but without signs of solid organ injury. In hemodynamically unstable patients, free intraabdominal fluid in the absence of solid organ injury usually mandates immediate surgical intervention. For patients with blunt abdominal trauma and more than just a trace of free intraabdominal fluid, or for patients with signs of peritonitis, the threshold for surgical exploration - preferably by a laparoscopic approach - should be low. Based on the available information, we aim to provide the reader with an overview of the current literature, with specific emphasis on diagnostic and therapeutic approaches to this problem, and to suggest a possible algorithm that might help with the adequate treatment of such patients.

Relevance: 30.00%

Abstract:

Rationale: Focal onset epileptic seizures are due to abnormal interactions between distributed brain areas. By estimating the cross-correlation matrix of multi-site intra-cerebral EEG recordings (iEEG), one can quantify these interactions. To assess the topology of the underlying functional network, the binary connectivity matrix has to be derived from the cross-correlation matrix by use of a threshold. Classically, a unique threshold is used, which constrains the topology [1]. Our method aims to set the threshold in a data-driven way by separating genuine from random cross-correlation. We compare our approach to the fixed-threshold method and study the dynamics of the functional topology.
Methods: We investigate the iEEG of patients suffering from focal onset seizures who underwent evaluation for the possibility of surgery. The equal-time cross-correlation matrices are evaluated using a sliding time window. We then compare three approaches for assessing the corresponding binary networks. For each time window:
* Our parameter-free method derives from the cross-correlation strength matrix (CCS) [2]. It aims at disentangling genuine from random correlations (due to the finite length and varying frequency content of the signals). In practice, a threshold is evaluated for each pair of channels independently, in a data-driven way.
* The fixed mean degree (FMD) method uses a unique threshold on the whole connectivity matrix so as to ensure a user-defined mean degree.
* The varying mean degree (VMD) method uses the mean degree of the CCS network to set a unique threshold for the entire connectivity matrix.
* Finally, the connectivity (c), the connectedness (given by k, the number of disconnected sub-networks), and the mean global and local efficiencies (Eg and El, respectively) are computed from the FMD, CCS and VMD networks and from their corresponding random and lattice networks.
Results: Compared to FMD and VMD, CCS networks present:
* topologies that differ in terms of c, k, Eg and El;
* from the pre-ictal to the ictal and then post-ictal period, topological feature time courses that are more stable within a period and more contrasted from one period to the next.
For CCS, pre-ictal connectivity is low, increases to a high level during the seizure, then decreases at offset. k shows a "U-curve" underlining the synchronization of all electrodes during the seizure. The Eg and El time courses fluctuate between the values of the corresponding random and lattice networks in a reproducible manner.
Conclusions: The definition of a data-driven threshold provides new insights into the topology of epileptic functional networks.
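
The CCS construction itself is defined in [2]; purely as an illustration of the difference between one global threshold and a per-pair, data-driven one, the sketch below binarizes a correlation matrix both ways, using circularly shifted surrogates as a stand-in for the estimate of random correlation. Function names, the surrogate scheme and the toy data are assumptions for the example, not the authors' code.

```python
# Hedged sketch, not the authors' code: two ways to binarize a cross-correlation
# matrix, a single global threshold (fixed mean degree) versus a per-pair,
# surrogate-based threshold separating genuine from random correlation.
import numpy as np

def fixed_mean_degree_network(C, mean_degree):
    """Binarize |C| with one global threshold chosen so the network has
    (approximately) the requested mean degree."""
    n = C.shape[0]
    strengths = np.sort(np.abs(C[np.triu_indices(n, k=1)]))[::-1]
    n_edges = max(1, int(round(mean_degree * n / 2)))   # edges implied by that mean degree
    thr = strengths[min(n_edges, len(strengths)) - 1]
    A = (np.abs(C) >= thr).astype(int)
    np.fill_diagonal(A, 0)
    return A

def surrogate_threshold_network(signals, n_surrogates=50, seed=0):
    """Per-pair, data-driven threshold: keep an edge only where the observed
    correlation exceeds everything produced by circularly shifted surrogates."""
    rng = np.random.default_rng(seed)
    n_ch, n_t = signals.shape
    C = np.corrcoef(signals)
    thr = np.zeros((n_ch, n_ch))
    for _ in range(n_surrogates):
        shifts = rng.integers(1, n_t, size=n_ch)
        surrogate = np.array([np.roll(s, k) for s, k in zip(signals, shifts)])
        thr = np.maximum(thr, np.abs(np.corrcoef(surrogate)))
    A = (np.abs(C) > thr).astype(int)
    np.fill_diagonal(A, 0)
    return A

# Toy usage: 8 channels, 1000 samples per sliding window.
signals = np.random.default_rng(1).normal(size=(8, 1000))
A_fmd = fixed_mean_degree_network(np.corrcoef(signals), mean_degree=3)
A_ccs_like = surrogate_threshold_network(signals)
```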

Relevance: 30.00%

Abstract:

In the context of expensive numerical experiments, a promising solution for alleviating the computational cost consists of using partially converged simulations instead of exact solutions. The gain in computational time comes at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of adequately approximating the error due to partial convergence, which is correlated in both the design-variable and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that accurately reflects the actual structure of the error. Practical solutions are proposed for solving the parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
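
The nonstationary kernel is specific to the paper; as a minimal sketch of the underlying idea of treating computational time as an extra input dimension, one could fit an off-the-shelf Gaussian process over the joint (design variable, solver time) space, here with scikit-learn and synthetic data. All names, kernels and numbers below are illustrative assumptions, not the authors' model.

```python
# Hedged sketch: a plain stationary kernel over (design variable, solver time),
# not the nonstationary covariance proposed in the abstract.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

# Toy data: y(x, t) = exact response f(x) plus an error that shrinks as the
# solver time t grows (mimicking partial convergence).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 40)
t = rng.uniform(0.1, 1.0, 40)                           # normalized computational time
f = np.sin(2 * np.pi * x)                               # "exact" response (unknown in practice)
y = f + np.exp(-3.0 * t) * rng.normal(0.0, 1.0, 40)     # partially converged observations

X = np.column_stack([x, t])                             # joint (design, time) input space
kernel = ConstantKernel() * RBF(length_scale=[0.2, 0.5]) + WhiteKernel(1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict the fully converged response by querying at large t.
x_new = np.linspace(0.0, 1.0, 5)
mu, sd = gp.predict(np.column_stack([x_new, np.ones_like(x_new)]), return_std=True)
print(mu.round(2), sd.round(2))
```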

Relevance: 30.00%

Abstract:

The Earth's bow shock is very efficient in accelerating ions out of the incident solar wind distribution to high energies (≈ 200 keV/e). Fluxes of energetic ions accelerated at the quasi-parallel bow shock, also known as diffuse ions, are best represented by exponential spectra in energy/charge, which require additional assumptions to be incorporated into these model spectra. One of these assumptions is a so-called "free escape boundary" along the interplanetary magnetic field in the upstream direction. Locations along the IBEX orbit are ideally suited for in situ measurements to investigate the existence of an upstream free escape boundary for bow shock accelerated ions. In this study we use two years of ion measurements from the Background Monitor on the IBEX spacecraft, supported by ACE solar wind observations. The IBEX Background Monitor is sensitive to protons > 14 keV, which includes the energy of the maximum flux for diffuse ions. With increasing distance from the bow shock along the interplanetary magnetic field, the count rates for diffuse ions streaming away from the bow shock stay constant, while count rates for diffuse ions streaming toward the shock gradually decrease from a maximum value to ~1/e of that maximum at distances of about 10 R_E to 14 R_E. These observations of a gradual decrease support the transition to a free escape continuum for ions of energy > 14 keV at distances of 10 R_E to 14 R_E from the bow shock.
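
As a side note on the "exponential spectra in energy/charge": such a spectrum is commonly written j(E/Q) = j0 * exp(-(E/Q)/E0), and fitting it to measured fluxes is a one-liner with scipy. The flux values in the sketch below are invented for illustration and are not the IBEX or ACE data.

```python
# Hedged sketch: fitting an exponential spectrum in energy/charge to made-up
# diffuse-ion flux measurements; j0 and E0 are illustrative fit parameters.
import numpy as np
from scipy.optimize import curve_fit

def exp_spectrum(E_per_q, j0, E0):
    """Differential flux j(E/Q) = j0 * exp(-(E/Q)/E0)."""
    return j0 * np.exp(-E_per_q / E0)

E_per_q = np.array([20., 40., 60., 90., 130., 200.])        # keV/e
flux = np.array([3.2e3, 1.4e3, 6.5e2, 2.1e2, 5.0e1, 6.0])   # arbitrary units

(p_j0, p_E0), _ = curve_fit(exp_spectrum, E_per_q, flux, p0=(5e3, 30.))
print(f"e-folding energy E0 = {p_E0:.1f} keV/e")
```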

Relevance: 30.00%

Abstract:

The adsorption interactions of thallium and its compounds with gold and quartz surfaces were investigated. Carrier-free amounts of thallium were produced in nuclear fusion reactions of alpha particles with thick gold targets. The method chosen for these studies was gas thermochromatography with carrier gases of varying redox potential. It was observed that thallium is extremely sensitive to trace amounts of oxygen and water, and can even be oxidized by the hydroxyl groups located on the quartz surface. The experiments on a quartz surface with O2, He, or H2 gas with added water revealed the formation and deposition of only one thallium species, TlOH. The adsorption enthalpy was determined to be ΔH_ads^SiO2(TlOH) = −134 ± 5 kJ mol−1. A series of experiments using gold as the stationary surface and different carrier gases resulted in the detection of two thallium species: metallic Tl (H2 as carrier gas) and TlOH (O2, O2+H2O and H2+H2O as pure carrier gas or carrier gas mixture), with ΔH_ads^Au(Tl) = −270 ± 10 kJ mol−1 and ΔH_ads^Au(TlOH) = −146 ± 3 kJ mol−1. These data demonstrate a weak interaction of TlOH with both quartz and gold surfaces. The data represent important information for the design of future experiments with the heavier homologue of Tl in group 13 of the periodic table, element 113 (E113).

Relevance: 30.00%

Abstract:

Derivation of probability estimates complementary to geophysical data sets has gained special attention over recent years. Information about the confidence level of the provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input consists of a cloud mask which allows discrimination between surface and cloud signals; the surface information is further divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: between cloudy and clear-sky, cloudy and snow, and clear-sky and snow conditions. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, in the PCM algorithm all spectral, angular and ancillary information is used in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test was overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrated the good detection skills of the PCM method, with results comparable to or better than the reference PPS algorithm.
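
As a rough illustration of the look-up-table idea described above (and not of the PCM algorithm itself), the following sketch bins a toy two-channel feature space and reads back a per-bin cloud probability at classification time. The function names, bin edges and toy "channels" are assumptions made for the example.

```python
# Hedged sketch: a toy LUT classifier in the spirit of binned probability
# retrieval; per-bin class probabilities are precomputed from training pixels
# and simply looked up afterwards.
import numpy as np

def build_lut(features, labels, edges):
    """features: (n_pixels, n_dims); labels: 0=clear, 1=cloud; edges: per-dim bin edges."""
    idx = np.stack([np.digitize(features[:, d], edges[d])
                    for d in range(features.shape[1])], axis=1)
    shape = tuple(len(e) + 1 for e in edges)
    cloudy, total = np.zeros(shape), np.zeros(shape)
    for i, y in zip(idx, labels):
        total[tuple(i)] += 1
        cloudy[tuple(i)] += y
    with np.errstate(invalid="ignore"):
        return np.where(total > 0, cloudy / total, 0.5)   # 0.5 where no training data fell

def cloud_probability(lut, edges, pixel_features):
    idx = tuple(np.digitize(pixel_features[d], edges[d]) for d in range(len(edges)))
    return lut[idx]

# Toy usage: two "channels" (e.g. a reflectance and a brightness temperature).
rng = np.random.default_rng(0)
feats = rng.random((500, 2))
labs = (feats[:, 0] + 0.2 * rng.random(500) > 0.6).astype(int)
edges = [np.linspace(0, 1, 11)] * 2
lut = build_lut(feats, labs, edges)
print(cloud_probability(lut, edges, [0.8, 0.3]))
```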

Relevance: 30.00%

Abstract:

Highly reflective materials in the microwave region play a very important role in the realization of antenna reflectors for a broad range of applications, including radiometry. These reflectors have a characteristic emissivity which needs to be characterized accurately in order to perform a correct radiometric calibration of the instrument. Such a characterization can be performed by using open resonators, waveguide cavities, or radiometric measurements. The latter consists of comparative radiometric observations of absorbers, reference mirrors and the sample under test, or uses the cold sky radiation as a direct reference source. While the first two techniques are suitable for the characterization of metal plates and mirrors, the latter has the advantage of also being applicable to soft materials. This paper describes how, through these radiometric techniques, it is possible to characterize the emissivity of the sample relative to a reference mirror, and how to characterize the absolute emissivity of the latter by performing measurements at different incidence angles. The results presented in this paper are based on our investigations of the emissivity of a multilayer insulation material (MLI) for space missions, at the frequencies of 22 and 90 GHz.
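
For context only, a simplified radiometric balance often used to back out emissivity from such comparative observations is T_B = e * T_phys + (1 - e) * T_ref, with the cold sky (or a reference mirror view) providing T_ref. The function name and all numbers in the sketch below are invented for illustration and are not the paper's measurements.

```python
# Hedged sketch: solving the simple radiometric balance for the emissivity e.
def emissivity_from_brightness(T_B, T_phys, T_ref):
    """Solve T_B = e*T_phys + (1 - e)*T_ref for the emissivity e."""
    return (T_B - T_ref) / (T_phys - T_ref)

# Example: a sample at 290 K viewed against a ~10 K cold sky; a measured
# brightness temperature of 12.8 K corresponds to an emissivity of about 0.01.
print(emissivity_from_brightness(T_B=12.8, T_phys=290.0, T_ref=10.0))
```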

Relevance: 30.00%

Abstract:

In free viewpoint applications, the images are captured by an array of cameras that acquire a scene of interest from different perspectives. Any intermediate viewpoint not included in the camera array can be virtually synthesized by the decoder, at a quality that depends on the distance between the virtual view and the camera views available at the decoder. Hence, it is beneficial for any user to receive camera views that are close to each other for synthesis. This is, however, not always feasible in bandwidth-limited overlay networks, where every node may ask for different camera views. In this work, we propose an optimized delivery strategy for free viewpoint streaming over overlay networks. We introduce the concept of layered quality-of-experience (QoE), which describes the level of interactivity offered to clients. Based on these levels of QoE, camera views are organized into layered subsets. These subsets are then delivered to clients through a prioritized network coding streaming scheme, which accommodates the heterogeneity of the network and of the clients and effectively exploits the resources of the overlay network. Simulation results show that, in a scenario with limited bandwidth or channel reliability, the proposed method outperforms baseline network coding approaches, where the different levels of QoE are not taken into account in the delivery strategy optimization.
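
The abstract does not spell out how views are grouped into layers; one plausible, purely illustrative layering (each extra QoE layer halves the spacing between delivered views, so that closer synthesis pairs become available at higher levels) could look like the sketch below. The function and the layering rule are assumptions, not the paper's optimization.

```python
# Hedged sketch: organize a camera array into layered subsets, coarse base
# layer first, with each additional QoE layer adding the in-between views.
def layered_view_subsets(n_views, n_layers):
    """Return, per layer, the camera indices added at that QoE level."""
    delivered, layers = set(), []
    for layer in range(n_layers):
        step = max(1, n_views // (2 ** (layer + 1)))
        new = [v for v in range(0, n_views, step) if v not in delivered]
        delivered.update(new)
        layers.append(sorted(new))
    return layers

# 16-camera array, 3 QoE layers: the base layer gives coarse coverage, higher
# layers add the intermediate views needed for finer interactivity.
print(layered_view_subsets(16, 3))   # [[0, 8], [4, 12], [2, 6, 10, 14]]
```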

Relevance: 30.00%

Abstract:

The acquisition of accurate information on the size of traits in animals is fundamental for the study of animal ecology and evolution and for their management. We demonstrate how morphological traits of free-ranging animals can be reliably estimated at very large observation distances of several hundred meters by the use of ordinary digital photographic equipment and simple photogrammetric software. In our study, we estimated the length of horn annuli in free-ranging male Alpine ibex (Capra ibex) by taking already-measured horn annuli of conspecifics on the same photographs as scaling units. Comparisons with hand-measured horn annuli lengths and repeatability analyses revealed a high accuracy of the photogrammetric estimates. If length estimations of specific horn annuli are based on multiple photographs, measurement errors of <5.5 mm can be expected. In the current study, the application of the described photogrammetric procedure increased the sample size of animals with known horn annuli lengths by an additional 104%. The presented photogrammetric procedure is of broad applicability and represents an easy, robust and cost-efficient method for measuring individuals in populations where animals are hard to capture or to approach.
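
The core of such scaling-unit photogrammetry is a single proportionality: a structure of known real-world length in the same image plane provides the millimeters-per-pixel factor applied to the trait of interest. A minimal sketch, with invented pixel and millimeter values, follows.

```python
# Hedged sketch: scale a pixel measurement by a known-length reference visible
# in the same photograph (values below are illustrative, not study data).
def estimate_length_mm(trait_px, reference_px, reference_mm):
    """Convert a pixel measurement to millimeters via a known-length reference."""
    return trait_px * (reference_mm / reference_px)

# A horn annulus spanning 118 px, next to an already-measured annulus of
# 85 px known to be 62 mm long, is estimated at roughly 86 mm.
print(round(estimate_length_mm(trait_px=118, reference_px=85, reference_mm=62), 1))
```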

Relevance: 30.00%

Abstract:

The direct Bayesian admissible region approach is an a priori state-free measurement association and initial orbit determination technique for optical tracks. In this paper, we test a hybrid approach that appends a least-squares estimator to the direct Bayesian method on measurements taken at the Zimmerwald Observatory of the Astronomical Institute of the University of Bern. Over half of the association pairs agreed with conventional geometric track correlation and least-squares techniques. The remaining pairs cast light on the fundamental limits of conducting tracklet association based solely on dynamical and geometrical information.

Relevance: 30.00%

Abstract:

The flavour of foods is determined by the interaction of taste molecules with receptors in the mouth, and of fragrances or aromas with receptors in the upper part of the nose. Here, we discuss the properties of taste and fragrance molecules, from the public databases Superscent, Flavornet, SuperSweet and BitterDB, taken collectively as flavours, from the perspective of the chemical space. We survey simple descriptor profiles in comparison with the public collections ChEMBL (bioactive small molecules), ZINC (commercial drug-like molecules) and GDB-13 (all possible organic molecules up to 13 atoms of C, N, O, S, Cl). A global analysis of the chemical space of flavours is also presented, based on molecular quantum numbers (MQN) and SMILES fingerprints (SMIfp). While taste molecules span a very broad property range, fragrances occupy a narrow area of the chemical space consisting of generally very small and relatively nonpolar molecules distinct from standard drug molecules. Proximity searching in the chemical space is exemplified as a simple method to facilitate the search for new fragrances.
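
As an illustration of MQN-based proximity searching (not the paper's actual pipeline), a few lines of RDKit suffice; the three-molecule "collection" and the query below are placeholders for a real database such as Superscent or Flavornet.

```python
# Hedged sketch: rank a small set of molecules by city-block (L1) distance to a
# query in the 42-dimensional MQN descriptor space.
import numpy as np
from rdkit import Chem
from rdkit.Chem import rdMolDescriptors

collection = {
    "vanillin": "O=Cc1ccc(O)c(OC)c1",
    "limonene": "CC(=C)C1CCC(C)=CC1",
    "linalool": "CC(C)=CCCC(C)(O)C=C",
}

def mqn(smiles):
    return np.array(rdMolDescriptors.MQNs_(Chem.MolFromSmiles(smiles)))

query = mqn("CC(C)=CCCC(C)(O)C=C")   # search around linalool
ranked = sorted(collection, key=lambda name: np.abs(mqn(collection[name]) - query).sum())
print(ranked)                        # nearest neighbours first
```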

Relevance: 30.00%

Abstract:

BACKGROUND Because computed tomography (CT) has advantages for visualizing the manifestation of necrosis and local complications, a series of scoring systems based on CT manifestations have been developed for assessing the clinical outcomes of acute pancreatitis (AP), including the CT severity index (CTSI), the modified CTSI, etc. Despite the internationally accepted CTSI having been successfully used to predict the overall mortality and disease severity of AP, recent literature has revealed limitations of the CTSI. Using the Delphi method, we established a new scoring system based on retrocrural space involvement (RCSI) and compared its effectiveness at evaluating the mortality and severity of AP with that of the CTSI. METHODS We reviewed CT images of 257 patients with AP taken within 3-5 days of admission in 2012. The RCSI scoring system, which includes assessment of infectious conditions involving the retrocrural space and the adjacent pleural cavity, was established using the Delphi method. Two radiologists independently assessed the RCSI and CTSI scores. The predictive performance of the RCSI and CTSI scoring systems in evaluating the mortality and severity of AP was estimated using receiver operating characteristic (ROC) curves. PRINCIPAL FINDINGS The RCSI score can accurately predict mortality and disease severity. The area under the ROC curve for the RCSI versus the CTSI score was 0.962±0.011 versus 0.900±0.021 for predicting mortality, and 0.888±0.025 versus 0.904±0.020 for predicting the severity of AP. Applying ROC analysis to our data showed that an RCSI score of 4 was the best cutoff value, above which mortality could be identified. CONCLUSION The Delphi method was innovatively adopted to establish a scoring system to predict the clinical outcome of AP. The RCSI scoring system predicts the mortality of AP better than the CTSI system, and the severity of AP equally well.
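
The cutoff-selection step described above is a standard ROC exercise; the sketch below, with invented scores and outcomes rather than the study's patient data, shows the usual Youden-index choice of threshold alongside the AUC.

```python
# Hedged sketch: ROC curve, AUC and Youden's J cutoff for a toy score/outcome set.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

scores = np.array([1, 2, 2, 3, 4, 4, 5, 6, 6, 7])   # e.g. RCSI-like scores (made up)
died   = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1])   # outcome labels (made up)

fpr, tpr, thresholds = roc_curve(died, scores)
auc = roc_auc_score(died, scores)
best = thresholds[np.argmax(tpr - fpr)]              # cutoff maximizing Youden's J
print(f"AUC = {auc:.3f}, best cutoff = {best}")
```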

Relevance: 30.00%

Abstract:

Free arachidonic acid is functionally interlinked with different lipid signaling networks, including those involving prostanoid pathways, the endocannabinoid system, N-acylethanolamines, as well as steroids. A sensitive and specific LC-MS/MS method for the quantification of arachidonic acid, prostaglandin E2, thromboxane B2, anandamide, 2-arachidonoylglycerol, noladin ether, linoleoyl ethanolamide, oleoyl ethanolamide, palmitoyl ethanolamide, stearoyl ethanolamide, aldosterone, cortisol, dehydroepiandrosterone, progesterone, and testosterone in human plasma was developed and validated. Analytes were extracted using acetonitrile precipitation followed by solid phase extraction. Separations were performed by UFLC using a C18 column and analyzed on a triple quadrupole MS with electrospray ionization. Analytes were run first in negative mode and, subsequently, in positive mode in two independent LC-MS/MS runs. For each analyte, two MRM transitions were collected in order to confirm identity. All analytes showed good linearity over the investigated concentration range (r>0.98). Validated LLOQs ranged from 0.1 to 190 ng/mL and LODs ranged from 0.04 to 12.3 ng/mL. Our data show that this LC-MS/MS method is suitable for the quantification of a diverse set of bioactive lipids in plasma from human donors (n=32). The determined plasma levels are in agreement with the literature, thus providing a versatile method to explore pathophysiological processes in which changes in these lipids are implicated.
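
Purely as an illustration of the linearity criterion quoted above (r > 0.98), a calibration-curve check takes only a few lines of numpy; the calibration points below are invented, not the paper's standards.

```python
# Hedged sketch: fit a calibration line (peak-area ratio vs. nominal concentration)
# and report slope, intercept and the correlation coefficient r.
import numpy as np

conc = np.array([0.5, 1, 5, 10, 50, 100])                     # ng/mL, nominal standards
area_ratio = np.array([0.021, 0.044, 0.21, 0.43, 2.1, 4.3])   # analyte / internal standard

slope, intercept = np.polyfit(conc, area_ratio, 1)
r = np.corrcoef(conc, area_ratio)[0, 1]
print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.4f}")
```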