42 results for Power spectral analysis
Abstract:
Palaeoclimatic information can be retrieved from the diffusion of the stable water isotope signal during firnification of snow. The diffusion length, a measure of the amount of diffusion a layer has experienced, depends on the firn temperature and the accumulation rate. We show that the estimation of the diffusion length using power spectral densities (PSDs) of the record of a single isotope species can be biased by uncertainties in spectral properties of the isotope signal prior to diffusion. By using a second water isotope and calculating the difference in diffusion lengths between the two isotopes, this problem is circumvented. We study the PSD method applied to two isotopes in detail and additionally present a new forward diffusion method for retrieving the differential diffusion length based on the Pearson correlation between the two isotope signals. The two methods are discussed and extensively tested on synthetic data which are generated in a Monte Carlo manner. We show that calibration of the PSD method with this synthetic data is necessary to be able to objectively determine the differential diffusion length. The correlation-based method proves to be a good alternative to the PSD method as it yields precision equal to or somewhat higher than the PSD method. The use of synthetic data also allows us to estimate the accuracy and precision of the two methods and to choose the best sampling strategy to obtain past temperatures with the required precision. In addition to the application to synthetic data, the two methods are tested on stable-isotope records from the EPICA (European Project for Ice Coring in Antarctica) ice core drilled in Dronning Maud Land, Antarctica, showing that reliable firn temperatures can be reconstructed with a typical uncertainty of 1.5 and 2 °C for the Holocene period and 2 and 2.5 °C for the last glacial period for the correlation and PSD method, respectively.
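The single-isotope PSD approach described above can be illustrated with a minimal sketch: a white-noise "isotope signal" is diffused by a Gaussian kernel of width σ, and σ is then recovered from the slope of log-PSD versus squared frequency. All parameters here are hypothetical and not taken from the EPICA records.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative, not from the paper)
dz = 0.05          # sample spacing [m]
n = 2048           # number of samples
sigma_true = 0.08  # diffusion length [m]

# White-noise signal diffused by a Gaussian kernel of width sigma_true
signal = rng.standard_normal(n)
freqs = np.fft.rfftfreq(n, d=dz)
transfer = np.exp(-0.5 * (2 * np.pi * freqs * sigma_true) ** 2)
diffused = np.fft.irfft(np.fft.rfft(signal) * transfer, n=n)

# Estimate sigma from the slope of log PSD vs f^2, since for a flat
# initial spectrum: log P(f) = log P0 - (2*pi*f*sigma)^2
psd = np.abs(np.fft.rfft(diffused)) ** 2
mask = (freqs > 0) & (psd > 1e-20)
slope, intercept = np.polyfit(freqs[mask] ** 2, np.log(psd[mask]), 1)
sigma_est = np.sqrt(-slope) / (2 * np.pi)
print(f"estimated diffusion length: {sigma_est:.3f} m")
```

In practice the spectrum of the undiffused signal is not flat, which is exactly the bias the abstract warns about; the differential (two-isotope) approach cancels this common factor.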
Abstract:
Potential Desiccation Polygons (PDPs), tens to hundreds of meters in size, have been observed in numerous regions on Mars, particularly in ancient (>3 Gyr old) terrains of inferred paleolacustrine/playa geologic setting, and in association with hydrous minerals such as smectites. Therefore, a better understanding of the conditions in which large desiccation polygons form could yield unique insight into the ancient climate on Mars. Many dried lakebeds/playas in the western United States display large (>50 m wide) desiccation polygons, which we consider to be analogues for PDPs on Mars. We have therefore carried out fieldwork in seven of these dried lakes in the San Bernardino and Death Valley National Park regions, complemented with laboratory and spectral analysis of collected samples. Our study shows that the investigated lacustrine/playa sediments have (a) a soil matrix containing 40-75% clays and fine silt (by volume), where the clay minerals are dominated by illite/muscovite followed by smectite, (b) carbonaceous mineralogy with variable amounts of chloride and sulfate salts, and, significantly, (c) roughly similar spectral signatures in the visible-near-infrared (VIS-NIR) range. We conclude that the development of large desiccation fractures is consistent with water table retreat. In addition, the comparison of the mineralogical to the spectral observations further suggests that remote sensing VIS-NIR spectroscopy has its limitations for detailed characterization of lacustrine/playa deposits. Finally, our results imply that the widespread distribution of PDPs on Mars indicates global or regional climatic transitions from wet conditions to more arid ones, making them important candidate sites for future in situ missions.
Abstract:
Optical pulse amplification in doped fibers is studied using an extended power transport equation for the coupled pulse spectral components. This equation includes the effects of gain saturation, gain dispersion, fiber dispersion, fiber nonlinearity, and amplified spontaneous emission. The new model is employed to study nonlinear gain-induced effects on the spectrotemporal characteristics of amplified subpicosecond pulses, in both the anomalous and the normal dispersion regimes.
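The gain-saturation effect included in such a power transport equation can be illustrated with a deliberately simplified energy-rate model, in which the local gain decreases as the pulse energy approaches a saturation energy. All parameters below are illustrative and not taken from the paper.

```python
# Simplified saturated-gain model: dE/dz = g0 * E / (1 + E / E_sat).
# For E << E_sat growth is exponential; for E >> E_sat it becomes linear,
# because the gain medium cannot supply energy faster than g0 * E_sat.

g0 = 2.0        # small-signal gain coefficient [1/m] (illustrative)
E_sat = 1.0     # saturation energy [arb. units]
E = 0.01        # input pulse energy
L, steps = 5.0, 5000
dz = L / steps

# Forward-Euler integration along the fiber length
for _ in range(steps):
    E += dz * g0 * E / (1.0 + E / E_sat)

print(f"output energy: {E:.3f}")
```

The full model in the abstract additionally couples the spectral components through fiber dispersion, nonlinearity, gain dispersion, and amplified spontaneous emission, which this scalar sketch omits.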
Abstract:
OBJECT: In this study, 1H magnetic resonance (MR) spectroscopy was prospectively tested as a reliable method for presurgical grading of neuroepithelial brain tumors. METHODS: Using a database of tumor spectra obtained in patients with histologically confirmed diagnoses, 94 consecutive untreated patients were studied using single-voxel 1H spectroscopy (point-resolved spectroscopy; TE 135 msec, TR 1500 msec). A total of 90 tumor spectra obtained in patients with diagnostic 1H MR spectroscopy examinations were analyzed using commercially available software (MRUI/VARPRO) and classified using linear discriminant analysis as World Health Organization (WHO) Grade I/II, WHO Grade III, or WHO Grade IV lesions. In all cases, the classification results were matched with histopathological diagnoses that were made according to the WHO classification criteria after serial stereotactic biopsy procedures or open surgery. Histopathological studies revealed 30 Grade I/II tumors, 29 Grade III tumors, and 31 Grade IV tumors. The reliability of the histological diagnoses was validated considering a minimum postsurgical follow-up period of 12 months (range 12-37 months). Classifications based on spectroscopic data yielded 31 tumors in Grade I/II, 32 in Grade III, and 27 in Grade IV. Incorrect classifications included two Grade II tumors, one of which was identified as Grade III and one as Grade IV; two Grade III tumors identified as Grade II; two Grade III lesions identified as Grade IV; and six Grade IV tumors identified as Grade III. Furthermore, one glioblastoma (WHO Grade IV) was classified as WHO Grade I/II. This represents an overall success rate of 86%, and a 95% success rate in differentiating low-grade from high-grade tumors. CONCLUSIONS: The authors conclude that in vivo 1H MR spectroscopy is a reliable technique for grading neuroepithelial brain tumors.
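The classification step can be sketched with scikit-learn's linear discriminant analysis on synthetic two-feature data standing in for the quantified spectra. The feature values below are hypothetical stand-ins (e.g. metabolite ratios); the study quantified real spectra with MRUI/VARPRO.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic per-grade feature clusters (hypothetical values, e.g. two
# metabolite ratios per spectrum; not numbers from the study)
n_per_class = 30
means = {0: (1.0, 2.0), 1: (1.8, 1.2), 2: (2.6, 0.6)}
X = np.vstack([rng.normal(means[g], 0.4, size=(n_per_class, 2)) for g in means])
y = np.repeat([0, 1, 2], n_per_class)  # grades I/II, III, IV

# Linear discriminant analysis with 5-fold cross-validated accuracy
lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validation is the standard way to estimate out-of-sample accuracy for such a classifier; the abstract's 86% figure refers to matching against histopathology, not to this toy setup.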
Abstract:
PURPOSE: Quantification of retinal layers using automated segmentation of optical coherence tomography (OCT) images allows for longitudinal studies of retinal and neurological disorders in mice. The purpose of this study was to compare the performance of automated retinal layer segmentation algorithms with data from manual segmentation in mice using the Spectralis OCT. METHODS: Spectral domain OCT images from 55 mice from three different mouse strains were analyzed in total. The OCT scans from 22 C57BL/6, 22 BALB/c, and 11 C3A.Cg-Pde6b(+)Prph2(Rd2)/J mice were automatically segmented using three commercially available automated retinal segmentation algorithms and compared to manual segmentation. RESULTS: Fully automated segmentation performed well in mice and showed coefficients of variation (CV) below 5% for the total retinal volume. However, all three automated segmentation algorithms yielded much thicker total retinal thickness values compared to manual segmentation data (P < 0.0001) due to segmentation errors at the basement membrane. CONCLUSIONS: Whereas the automated retinal segmentation algorithms performed well for the inner layers, the retinal pigment epithelium (RPE) was delineated within the sclera, leading to consistently thicker measurements of the photoreceptor layer and the total retina. TRANSLATIONAL RELEVANCE: The introduction of spectral domain OCT allows for accurate imaging of the mouse retina. Exact quantification of retinal layer thicknesses in mice is important to study layers of interest under various pathological conditions.
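The coefficient of variation reported above is the sample standard deviation divided by the mean of repeated measurements, expressed in percent. A minimal sketch with made-up volume values (not data from the study):

```python
import numpy as np

# Hypothetical repeated total-retinal-volume measurements for one eye [mm^3]
volumes = np.array([0.512, 0.505, 0.498, 0.509, 0.503])

# CV = sample standard deviation / mean, in percent
cv = volumes.std(ddof=1) / volumes.mean() * 100
print(f"CV = {cv:.2f}%")  # well below the 5% threshold reported
```

A low CV shows the algorithm is *precise* (repeatable); the abstract's point is that precision does not rule out a systematic bias, here the consistently too-thick delineation at the RPE.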
Abstract:
This work investigates the performance of cardiorespiratory analysis in detecting periodic breathing (PB) in chest wall recordings of mountaineers climbing to extreme altitude. The breathing patterns of 34 mountaineers were monitored unobtrusively by inductance plethysmography, ECG, and pulse oximetry using a portable recorder during climbs at altitudes between 4497 and 7546 m on Mt. Muztagh Ata. The minute ventilation (VE) and heart rate (HR) signals were studied to identify visually scored PB by applying time-varying spectral, coherence, and entropy analysis. In 411 climbing periods, 30-120 min in duration, high values of the mean power (MP(VE)) and slope (MSlope(VE)) of the modulation frequency band of VE accurately identified PB, with areas under the ROC curve of 88% and 89%, respectively. A prolonged stay at altitude was associated with an increase in PB. During PB episodes, higher peak power of ventilatory (MP(VE)) and cardiac (MP(LF)(HR)) oscillations and cardiorespiratory coherence (MP(LF)(Coher)), but reduced ventilation entropy (SampEn(VE)), was observed. Therefore, the characterization of cardiorespiratory dynamics by the analysis of VE and HR signals accurately identifies PB and effects of altitude acclimatization, providing promising tools for investigating physiologic effects of environmental exposures and diseases.
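The entropy measure used above, sample entropy (SampEn), drops when a signal becomes more regular, which is why PB episodes (regular waxing-waning ventilation) show reduced SampEn(VE). A naive O(n²) sketch of SampEn, with the conventional choices m = 2 and r = 0.2·SD rather than the study's exact settings:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal (naive O(n^2) sketch)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def count_matches(length):
        # Count template pairs (i < j) within Chebyshev distance r
        templates = np.lib.stride_tricks.sliding_window_view(x, length)
        total = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(dist <= r))
        return total

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(2)
t = np.arange(2000) / 10.0
periodic = np.sin(2 * np.pi * 0.05 * t)   # regular oscillation (PB-like)
noisy = rng.standard_normal(2000)         # irregular signal
print(sample_entropy(periodic), sample_entropy(noisy))
```

The periodic signal yields a much lower SampEn than the noise, mirroring the regularity contrast the study exploits to separate PB from normal breathing.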
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have been intrigued for a long time by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one’s bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
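A partition function of the kind described above can be represented as a mapping from embedded coalitions (a coalition together with the coalition structure containing it) to worths. A minimal sketch for three players, with purely illustrative numbers chosen to exhibit a free-riding externality:

```python
# Partition function for players {1, 2, 3}: each embedded coalition
# (coalition, coalition structure) is assigned a worth. All numbers
# are illustrative, not from the thesis.
v = {
    (frozenset({1}), (frozenset({1}), frozenset({2}), frozenset({3}))): 1.0,
    (frozenset({1}), (frozenset({1}), frozenset({2, 3}))): 2.0,
    (frozenset({2, 3}), (frozenset({1}), frozenset({2, 3}))): 3.0,
    (frozenset({1, 2, 3}), (frozenset({1, 2, 3}),)): 5.5,
}

# Externality: player 1 standing alone is worth more once 2 and 3
# cooperate, i.e. 1 benefits from others' coalition formation without
# contributing -- the free-riding incentive discussed in Chapter 3.
singleton = v[(frozenset({1}), (frozenset({1}), frozenset({2}), frozenset({3})))]
with_merger = v[(frozenset({1}), (frozenset({1}), frozenset({2, 3})))]
print(with_merger > singleton)
```

In a classical characteristic function a coalition's worth would not depend on how outsiders are partitioned; the extra coalition-structure argument is precisely what lets partition functions encode externalities.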