989 results for Statistical methodologies
Abstract:
The properties of Ellerman bombs (EBs), small-scale brightenings in the Hα line wings, have proved difficult to establish because their size is close to the spatial resolution of even the most advanced telescopes. Here, we aim to infer the size and lifetime of EBs using high-resolution data of an emerging active region collected with the Interferometric BIdimensional Spectrometer (IBIS) and Rapid Oscillations of the Solar Atmosphere (ROSA) instruments, as well as the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). We develop an algorithm to track EBs through their evolution, finding that EBs can often be much smaller (around 0.3″) and shorter-lived (less than one minute) than previous estimates. A correlation between G-band magnetic bright points and EBs is also found. Combining SDO/HMI and G-band data gives a good proxy for the polarity of the vertical magnetic field. It is found that EBs often occur both over regions of opposite-polarity flux and over strong unipolar fields, possibly hinting at magnetic reconnection as a driver of these events. The energetics of EB events are found to follow a power-law distribution in the nanoflare range (10²²–10²⁵ erg).
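The power-law energy statistics described above can be illustrated with a short sketch: sampling event energies from a truncated power law over the nanoflare range and recovering the index with a maximum-likelihood estimator. The index value of 2 and the sample size are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical EB energies drawn from a truncated power law over the
# nanoflare range 1e22-1e25 erg (illustrative data, not the measured sample).
alpha = 2.0
e_min, e_max = 1e22, 1e25
u = rng.random(5000)
# Inverse-CDF sampling for p(E) ~ E^-alpha on [e_min, e_max]
energies = (e_min ** (1 - alpha)
            + u * (e_max ** (1 - alpha) - e_min ** (1 - alpha))) ** (1 / (1 - alpha))

# Maximum-likelihood estimate of the power-law index (Hill estimator,
# ignoring the upper truncation, which is mild over three decades)
alpha_hat = 1 + len(energies) / np.sum(np.log(energies / e_min))
print(f"fitted power-law index: {alpha_hat:.2f}")
```

The closed-form MLE avoids the bias that least-squares fits to binned histograms introduce for power-law data.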
Abstract:
Here we describe approaches and methods for assaying, in a purified in vitro system, the major variant bacterial sigma factor, Sigma 54 (σ54). We cover the complete transcription system, binding interactions between σ54 and its activators, and the self-assembly and critical ATPase activity of the cognate activators, which serves to remodel the closed promoter complexes. We also present in vivo methodologies used to study the impact of physiological processes, metabolic states, global signalling networks, and cellular architecture on the control of σ54-dependent gene expression.
Abstract:
In this paper, we study the achievable ergodic sum-rate of multiuser multiple-input multiple-output downlink systems in Rician fading channels. We first derive a lower bound on the average signal-to-leakage-and-noise ratio using Mullen's inequality, and then use it to analyze the effect of channel mean information on the achievable ergodic sum-rate. A novel statistical-eigenmode space-division multiple-access (SE-SDMA) downlink transmission scheme is then proposed. For this scheme, we derive an exact analytical closed-form expression for the achievable ergodic rate and present tractable tight upper and lower bounds. Based on our analysis, we gain valuable insights into the effect of system parameters, such as the number of transmit antennas, the signal-to-noise ratio (SNR), and the Rician K-factor, on the system sum-rate. Results show that the sum-rate converges to a saturation value in the high-SNR regime and tends to a lower limit for low Rician K-factors. In addition, we compare the achievable ergodic sum-rate of SE-SDMA with that of zero-forcing beamforming with perfect channel state information at the base station. Our results reveal that the rate gap tends to zero in the high Rician K-factor regime. Finally, numerical results are presented to validate our analysis.
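The role of the Rician K-factor in statistical-eigenmode transmission can be seen in a minimal Monte Carlo sketch: a single user is served by beamforming along the channel-mean direction, and the ergodic rate grows with K because more channel power lies in the known mean component. This is a simplified single-user illustration under an assumed mean direction, not the paper's exact multiuser SE-SDMA scheme or its closed-form expressions.

```python
import numpy as np

rng = np.random.default_rng(1)

def ergodic_rate(n_tx, snr, k_factor, n_trials=20000):
    """Monte Carlo ergodic rate when beamforming along the statistical
    eigenmode (here, the assumed channel-mean direction) of a Rician
    MISO channel with the given K-factor."""
    h_bar = np.ones(n_tx) / np.sqrt(n_tx)        # assumed mean direction
    w = h_bar                                     # beamform along the mean
    h_w = (rng.standard_normal((n_trials, n_tx))
           + 1j * rng.standard_normal((n_trials, n_tx))) / np.sqrt(2)
    h = (np.sqrt(k_factor / (k_factor + 1)) * h_bar
         + np.sqrt(1 / (k_factor + 1)) * h_w)     # Rician channel samples
    gain = np.abs(h @ w) ** 2                     # beamforming gain per trial
    return np.mean(np.log2(1 + snr * gain))

# Rate improves with K: the scattered component that the fixed beam
# cannot track shrinks relative to the known mean component.
print(ergodic_rate(4, snr=10, k_factor=1))
print(ergodic_rate(4, snr=10, k_factor=100))
```

As K grows large, the channel hardens toward its mean and the rate approaches the deterministic value log2(1 + SNR), consistent with the saturation behaviour discussed in the abstract.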
Abstract:
Current variation-aware design methodologies, tuned for worst-case scenarios, are becoming increasingly pessimistic from the perspective of power and performance. A good example of such pessimism is setting the refresh rate of DRAMs according to the worst-case access statistics, thereby resulting in very frequent refresh cycles, which are responsible for the majority of the standby power consumption of these memories. However, such a high refresh rate may not be required, either due to the extremely low probability of the actual occurrence of such a worst case, or due to the inherent error-resilient nature of many applications that can tolerate a certain number of potential failures. In this paper, we exploit and quantify the possibilities that exist in dynamic memory design by shifting to the so-called approximate computing paradigm in order to save power and enhance yield at no cost. The statistical characteristics of the retention time in dynamic memories were revealed by studying a fabricated 2 kb CMOS-compatible embedded DRAM (eDRAM) memory array based on gain cells. Measurements show that up to 73% of the retention power can be saved by altering the refresh time, setting it such that a small number of failures is allowed. We show that these savings can be further increased by utilizing known circuit techniques, such as body biasing, which can help not only in extending, but also in preferably shaping, the retention time distribution. Our approach is one of the first attempts to assess the data-integrity and energy tradeoffs achieved in eDRAMs for utilizing them in error-resilient applications, and can prove helpful in the anticipated shift to approximate computing.
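The core tradeoff above, relaxing the refresh rate from the weakest cell's retention time to a quantile that tolerates a few failures, can be sketched numerically. The lognormal retention-time distribution and the failure budget below are illustrative assumptions, not the measured characteristics of the fabricated array.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical retention times for a 2 kb (2048-cell) gain-cell eDRAM array.
# A lognormal spread is an illustrative assumption, not the measured data.
n_cells = 2048
retention = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_cells)  # seconds

def refresh_power(t_refresh):
    """Refresh power scales with the refresh rate 1/t_refresh."""
    return 1.0 / t_refresh

# Worst-case design: refresh faster than the single weakest cell.
t_worst = retention.min()
# Approximate design: allow up to 8 of the 2048 cells to fail,
# so the refresh period only has to beat the 9th-weakest cell.
t_approx = np.sort(retention)[8]

saving = 1 - refresh_power(t_approx) / refresh_power(t_worst)
print(f"retention-power saving: {saving:.0%}")
```

Because the weak tail of the retention distribution is sparse, excluding even a handful of cells moves the required refresh period substantially, which is why small failure budgets buy large power savings.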
Abstract:
In this paper, we introduce a statistical data-correction framework that aims at improving DSP system performance in the presence of unreliable memories. The proposed signal processing framework implements best-effort error mitigation for signals that are corrupted by defects in unreliable storage arrays, using a statistical correction function extracted from the signal statistics, a data-corruption model, and an application-specific cost function. An application example to communication systems demonstrates the efficacy of the proposed approach.
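A minimal sketch of the idea: if 8-bit signal samples have known statistics (here an assumed narrow Gaussian prior) and the corruption model says the stored MSB flips with probability p, a correction function can pick, per read-out word, the candidate value that is most probable under prior times corruption model. The signal model, bit-flip model, and MSE cost below are illustrative stand-ins for the paper's signal statistics, data-corruption model, and cost function.

```python
import numpy as np

rng = np.random.default_rng(3)

p_flip = 0.1                                    # assumed MSB upset probability
sig_std = 15.0                                  # assumed signal prior std
signal = np.clip(rng.normal(0, sig_std, 10000), -128, 127).astype(np.int8)

stored = signal.astype(np.uint8)                # raw two's-complement patterns
flip = rng.random(signal.size) < p_flip
read = stored ^ (flip.astype(np.uint8) << 7)    # MSB upsets in the array

def correct(word):
    """Pick the candidate (as-read vs MSB-flipped-back) with the highest
    posterior: Gaussian prior likelihood times corruption probability."""
    cand = np.array([word, word ^ 0x80], dtype=np.uint8).astype(np.int8)
    prior = np.exp(-cand.astype(float) ** 2 / (2 * sig_std ** 2))
    post = prior * np.array([1 - p_flip, p_flip])
    return cand[np.argmax(post)]

corrected = np.array([correct(w) for w in read], dtype=np.int8)
raw_mse = np.mean((read.astype(np.int8).astype(float) - signal) ** 2)
cor_mse = np.mean((corrected.astype(float) - signal) ** 2)
print(raw_mse, cor_mse)
```

An MSB flip shifts a sample by 128, far outside the assumed prior, so almost every corrupted word is detected and restored; the residual error comes only from the rare samples large enough to be ambiguous.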
Abstract:
The worsening of process variations, and the consequent increased spreads in circuit performance and power consumption, hinder the satisfaction of targeted budgets and lead to yield loss. Corner-based design and the adoption of design guardbands can limit the yield loss. However, in many cases such methods cannot capture the real effects, which may be far better than predicted, leading to increasingly pessimistic designs. The situation is even more severe in memories, which consist of substantially different individual building blocks, further complicating accurate analysis of the impact of variations at the architecture level and leaving many potential issues uncovered and opportunities unexploited. In this paper, we develop a framework for capturing non-trivial statistical interactions among all the components of a memory/cache. The developed tool is able to find the optimum memory/cache configuration under various constraints, allowing designers to make the right choices early in the design cycle and consequently improve performance, energy, and especially yield. Our results indicate that considering the architectural interactions between the memory components allows the pessimistic access times predicted by existing techniques to be relaxed.
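Why corner stacking is pessimistic compared with a statistical view of the whole access path can be shown in a few lines: adding every component's 3-sigma worst case overstates the 3-sigma point of the combined delay, because independent variations partially cancel. The component delays and variation levels below are invented for illustration and are not the paper's memory model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed access-path components: decoder, cell array, sense amp, output mux.
means = np.array([0.20, 0.50, 0.15, 0.10])    # ns, illustrative means
sigmas = 0.1 * means                           # assumed 10% per-component spread

# Corner-based estimate: stack every component's 3-sigma worst case.
corner = np.sum(means + 3 * sigmas)

# Statistical estimate: 3-sigma yield point of the summed delay distribution.
samples = rng.normal(means, sigmas, size=(100_000, 4)).sum(axis=1)
statistical = np.quantile(samples, 0.99865)    # one-sided 3-sigma quantile

print(f"corner: {corner:.3f} ns  statistical: {statistical:.3f} ns")
```

For independent components the combined sigma grows as the root-sum-square of the parts rather than their sum, which is the gap the abstract's architecture-aware analysis exploits.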
Abstract:
This paper investigates the characteristics of the shadowed fading observed in off-body communications channels at 5.8 GHz using the κ-μ / gamma composite fading model. Realistic measurements were conducted for four individual scenarios, namely line-of-sight (LOS) and non-LOS (NLOS) walking, rotation, and random movements within an indoor laboratory environment. It is shown that the κ-μ / gamma composite fading model provides a better fit to the fading observed in off-body communications channels than the conventional Nakagami-m and Rician fading models.
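The layered structure of a composite fading model (small-scale fading whose local mean power is itself random because of shadowing) can be sketched by generating a Rician envelope with gamma-distributed mean power, the same multipath-times-shadowing layering idea as the κ-μ / gamma model. The K-factor and shadowing shape below are illustrative, not the parameters fitted to the 5.8 GHz measurements, and Rician rather than full κ-μ small-scale fading is used to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 50_000
k = 3.0        # assumed Rician K-factor for the multipath component
m_s = 4.0      # assumed gamma shadowing shape (smaller => heavier shadowing)

# Gamma-distributed local mean power with E[omega] = 1 (the shadowing layer)
omega = rng.gamma(m_s, 1.0 / m_s, n)
los = np.sqrt(k / (k + 1) * omega)            # dominant-path amplitude
sigma = np.sqrt(omega / (2 * (k + 1)))        # scatter std per dimension
env = np.hypot(rng.normal(los, sigma), rng.normal(0, sigma))

# Pure Rician envelope with the same K and unit mean power, for comparison
s0 = np.sqrt(k / (k + 1))
sig0 = np.sqrt(1 / (2 * (k + 1)))
pure = np.hypot(rng.normal(s0, sig0, n), rng.normal(0, sig0, n))

# Shadowing widens the envelope distribution at equal mean power
print(env.var(), pure.var())
```

The extra spread of the composite envelope at the same mean power is precisely what single-layer Nakagami-m or Rician fits fail to capture in shadowed channels.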
Abstract:
Mining seafloor massive sulfides for metals is an emergent industry faced with environmental management challenges. These revolve largely around limits to our current understanding of biological variability in marine systems, a challenge common to all marine environmental management. VentBase was established as a forum where academic, commercial, governmental, and non-governmental stakeholders can develop a consensus regarding the management of exploitative activities in the deep sea. Participants advocate a precautionary approach incorporating lessons learned from coastal studies. This workshop report from VentBase encourages the standardization of sampling methodologies for deep-sea environmental impact assessment. VentBase stresses the need for the collation of spatial data and the importance of datasets amenable to robust statistical analyses. VentBase supports the identification of set-asides to prevent the local extirpation of vent-endemic communities and to allow the post-extraction recolonization of mine sites.
Abstract:
Neutrophil elastase (NE), a biomarker of infection and inflammation, correlates with the severity of several respiratory diseases, including chronic obstructive pulmonary disease (COPD). However, its detection and quantification in biological samples are confounded by a lack of reliable and robust methodologies. Standard assays using chromogenic or fluorogenic substrates are not specific when added to complex clinical samples containing multiple proteolytic and hydrolytic enzymes that can hydrolyse the substrate, resulting in an over-estimation of the target protease. Furthermore, ELISA systems measure total protease levels, which can be a mixture of latent, active, and protease-inhibitor complexes. Therefore, we have developed a novel immunoassay (ProteaseTag™ Active NE Immunoassay) which is selective and specific for the capture of active NE in sputum and bronchoalveolar lavage (BAL) from patients with COPD. The objective of this study was to clinically validate the ProteaseTag™ Active NE Ultra Immunoassay for the detection of NE in sputum from COPD patients. 20 matched sputum sol samples were collected from 10 COPD patients (M=6, F=4; 73 ± 6 years) during stable and exacerbation phases. Samples were assayed for NE activity using both the ProteaseTag™ Active NE Ultra Immunoassay and a fluorogenic substrate-based kinetic activity assay. Both assays detected elevated levels of NE in the majority of patients (n=7) during an exacerbation (mean=217.2 μg/ml ±296.6) compared to their stable phase (mean=92.37 μg/ml ±259.8). However, statistical analysis did not show this difference to be significant (p=0.07, ProteaseTag™ Active NE Ultra Immunoassay; p=0.06, kinetic assay), most likely due to the low study number. A highly significant correlation was found between the two assay types (p≤0.0001, r=0.996).
The use of NE as a primary efficacy endpoint in clinical trials, or as a marker of inflammation in the clinic, has been hampered by the lack of a robust, simple-to-use assay. The ProteaseTag™ Active NE Immunoassay specifically measures only active NE in clinical samples, is quick and easy to use (< 3 hours), and has no dependency on a kinetic readout. ProteaseTag™ technology is currently being transferred to a lateral flow device for use at the point of care.
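The study's two statistical analyses, a paired stable-vs-exacerbation comparison and a between-assay correlation, can be sketched as follows. The NE levels are simulated under assumed lognormal distributions for illustration only; they are not the patients' measurements, and the p-values in the abstract come from the real data.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 10                                                # patients, as in the study
stable = rng.lognormal(np.log(50), 1.0, n)            # ug/ml, assumed levels
exac = stable * rng.lognormal(np.log(2), 0.5, n)      # elevated in exacerbation
kinetic = stable * rng.normal(1.0, 0.05, n)           # second assay, small noise

# Pearson correlation between the two assay read-outs (cf. r=0.996 reported)
r = np.corrcoef(stable, kinetic)[0, 1]

# Paired t statistic on log-transformed levels, appropriate for
# right-skewed concentration data with per-patient pairing
d = np.log(exac) - np.log(stable)
t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
print(f"between-assay r = {r:.3f}, paired t = {t:.2f}")
```

Pairing each patient's stable and exacerbation samples removes between-patient variability from the comparison, which matters with only 10 subjects.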