Abstract:
A low-profile wearable antenna suitable for integration into low-cost, disposable medical vital-signs monitors is presented. Simulated and measured antenna performance was characterized on a layered human tissue phantom representative of the thorax region of a range of human bodies. The wearable antenna has sufficient bandwidth for the 868 MHz Industrial, Scientific and Medical frequency band. A wearable radiation efficiency of up to 30% at 868 MHz is reported when the antenna is mounted in close proximity to the novel human tissue phantom antenna test-bed.
Abstract:
This letter presents the design of a thin microwave absorber exhibiting a -10 dB reflectivity bandwidth of 108% at normal incidence and 16% for simultaneous suppression of TE- and TM-polarised waves over the angular range 0-45°. The structure consists of a 3 mm-thick metal-backed frequency selective surface (FSS) with four resistively loaded hexagonal loop elements in each unit cell. The surface resistivity and width of the loops are carefully chosen to maximise the bandwidth by merging the reflection nulls generated by the multi-resonant absorber. Measurement and simulation results are in good agreement over the broad frequency range 7.8-24 GHz.
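As a brief aside, the -10 dB fractional bandwidth figures quoted above are conventionally computed as (f_high - f_low)/f_center over the band where reflectivity stays below the threshold. The sketch below uses a synthetic reflectivity curve, not the measured FSS data:

```python
import numpy as np

def fractional_bandwidth(freq_ghz, reflectivity_db, threshold_db=-10.0):
    """Fractional bandwidth (%) of the band where reflectivity stays below threshold."""
    below = reflectivity_db <= threshold_db
    if not below.any():
        return 0.0
    f_low = freq_ghz[below][0]
    f_high = freq_ghz[below][-1]
    f_center = 0.5 * (f_low + f_high)
    return 100.0 * (f_high - f_low) / f_center

# Illustrative curve (not measured data): one broad absorption dip around 15 GHz
f = np.linspace(5.0, 30.0, 1001)
r = -2.0 - 12.0 * np.exp(-((f - 15.0) / 8.0) ** 2)
print(round(fractional_bandwidth(f, r), 1))
```

A multi-resonant absorber like the one described would show several merged dips rather than a single Gaussian, but the bandwidth computation is the same.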
Abstract:
In the study of complex genetic diseases, the identification of subgroups of patients sharing similar genetic characteristics represents a challenging task, for example, to improve treatment decisions. One type of genetic lesion frequently investigated in such disorders is the change of DNA copy number (CN) at specific genomic regions. Non-negative Matrix Factorization (NMF) is a standard technique for reducing the dimensionality of a data set and clustering data samples while keeping the most relevant information in meaningful components. Thus, it can be used to discover subgroups of patients from CN profiles. It is, however, computationally impractical for very high dimensional data, such as CN microarray data. Deciding the most suitable number of subgroups is also a challenging problem. The aim of this work is to derive a procedure to compact high dimensional data in order to improve NMF applicability without compromising the quality of the clustering. This is particularly important for analyzing high-resolution microarray data. Many commonly used quality measures, as well as our own measures, are employed to decide the number of subgroups and to assess the quality of the results. Our measures are based on the idea of identifying robust subgroups, guided by biological/clinical relevance rather than simply aiming at well-separated clusters. We evaluate our procedure on four real independent data sets, in which our method was able to find accurate subgroups with distinct molecular and clinical features and outperformed standard NMF in terms of the factorization fitness function. Hence, it can be useful for the discovery of subgroups of patients with similar CN profiles in the study of heterogeneous diseases.
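As an illustrative sketch only (not the authors' compaction procedure), NMF-based subgroup discovery can be run with scikit-learn by factorizing a nonnegative sample-by-probe matrix and assigning each sample to its dominant component. All data below are synthetic; real CN ratios would first need a nonnegative transform, e.g. shifting or splitting gains and losses:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
# Toy nonnegative "copy number" matrix: 30 samples x 200 probes built
# from two latent patterns, 15 samples per pattern, plus small noise.
patterns = rng.random((2, 200))
weights = np.vstack([rng.random((15, 2)) * [3, 0],
                     rng.random((15, 2)) * [0, 3]])
X = weights @ patterns + 0.01 * rng.random((30, 200))

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)   # sample-by-component coefficient matrix
labels = W.argmax(axis=1)    # cluster assignment = dominant component
print(labels)
```

Choosing `n_components` (the number of subgroups) is exactly the model-selection problem the abstract highlights; in practice it is scanned over a range and scored with quality measures.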
Abstract:
Kuznetsov independence of variables X and Y means that, for any pair of bounded functions f(X) and g(Y), E[f(X)g(Y)] = E[f(X)] × E[g(Y)], where E[·] denotes interval-valued expectation and × denotes interval multiplication. We present properties of Kuznetsov independence for several variables and connect it with other concepts of independence in the literature; in particular, we show that strong extensions are always included in sets of probability distributions whose lower and upper expectations satisfy Kuznetsov independence. We introduce an algorithm that computes lower expectations subject to judgments of Kuznetsov independence by mixing column generation techniques with nonlinear programming. Finally, we define a concept of conditional Kuznetsov independence and study its graphoid properties.
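The interval product appearing in the definition is standard interval arithmetic: the product interval is spanned by the four endpoint products. A minimal sketch, with hypothetical interval expectations chosen for illustration:

```python
def interval_product(a, b):
    """Product of two real intervals [a_lo, a_hi] x [b_lo, b_hi]:
    spanned by the four endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Hypothetical interval-valued expectations E[f(X)] and E[g(Y)]:
Ef = (-1.0, 2.0)
Eg = (0.5, 3.0)
# Kuznetsov independence requires E[f(X)g(Y)] to equal this product:
print(interval_product(Ef, Eg))  # (-3.0, 6.0)
```

Note that when an interval straddles zero, as `Ef` does here, the extremes need not come from matching endpoints, which is why all four products are checked.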
Abstract:
A credal network associates a directed acyclic graph with a collection of sets of probability measures; it offers a compact representation for sets of multivariate distributions. In this paper we present a new algorithm for inference in credal networks based on an integer programming reformulation. We are concerned with computation of lower/upper probabilities for a variable in a given credal network. Experiments reported in this paper indicate that this new algorithm has better performance than existing ones for some important classes of networks.
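The quantity being computed can be illustrated on a toy, finitely generated credal set; the paper's integer-programming method is not shown here. This naive sketch just scans the extreme points, where linear functionals such as event probabilities attain their extrema:

```python
# Toy credal set for one binary variable, given by its extreme points;
# every distribution in the set is a convex mixture of these vertices.
vertices = [
    {"a": 0.2, "b": 0.8},
    {"a": 0.5, "b": 0.5},
    {"a": 0.35, "b": 0.65},
]

def lower_upper(event):
    """Lower/upper probability of an event over the credal set.
    P(event) is linear in the distribution, so it suffices to
    evaluate it at the vertices and take min/max."""
    values = [sum(p[x] for x in event) for p in vertices]
    return min(values), max(values)

print(lower_upper({"a"}))  # (0.2, 0.5)
```

In a credal network the hard part is that the joint credal set is specified implicitly via the graph, so enumeration like this is infeasible; that is what the reformulation addresses.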
Abstract:
The use of TiO2 photocatalysis for the destruction of dyes such as methylene blue has been extensively reported. One of the challenges faced in both the laboratory and large-scale water treatment plants is that samples have to be removed from the reactor vessel and the catalyst separated before analysis can be undertaken. In this paper we report the development of a simple fluorimeter instrument and its use in monitoring the photocatalytic destruction of methylene blue dye in the presence of catalyst suspensions. The results show that the instrument provides an effective method for in situ monitoring of the photocatalytic destruction of fluorescent dyes, allowing more accurate measurement through the minimisation of sample loss and cross-contamination. Furthermore, it provides a method for real-time monitoring of dye pollutant destruction in large-scale photocatalytic reactors.
Abstract:
Ultra-intense lasers can nowadays routinely accelerate kiloampere ion beams. These unique particle-beam sources could impact many societal (e.g., proton therapy or fuel recycling) and fundamental (e.g., neutron probing) domains. However, this requires overcoming the beam's angular divergence at the source. This has been attempted either with large-scale conventional setups or with compact plasma techniques, which, however, are restricted to short (<1 mm) focusing distances or exhibit chromatic behavior. Here, we show that, by exploiting laser-triggered, long-lasting (>50 ps), thermoelectric multi-megagauss surface magnetic (B-)fields, compact capture and focusing of a diverging laser-driven multi-MeV ion beam can be achieved over a wide range of ion energies within a 5° acceptance angle.
Abstract:
Purpose of review: Appropriate selection and definition of outcome measures are essential for clinical trials to be maximally informative. Core outcome sets (an agreed, standardized collection of outcomes measured and reported in all trials for a specific clinical area) were developed due to established inconsistencies in trial outcome selection. This review discusses the rationale for, and methods of, core outcome set development, as well as current initiatives in critical care.
Recent findings: Recent systematic reviews of reported outcomes and measurement instruments relevant to the critically ill highlight inconsistencies in outcome selection, definition, and measurement, thus establishing the need for core outcome sets. Current critical care initiatives include development of core outcome sets for trials aimed at reducing mechanical ventilation duration; rehabilitation following critical illness; long-term outcomes in acute respiratory failure; and epidemic and pandemic studies of severe acute respiratory infection.
Summary: Development and utilization of core outcome sets for studies relevant to the critically ill is in its infancy compared to other specialties. Notwithstanding, core outcome set development frameworks and guidelines are available, several sets are in various stages of development, and there is strong support from international investigator-led collaborations including the International Forum for Acute Care Trialists.
Abstract:
Many AMS systems can measure 14C, 13C and 12C simultaneously, thus providing δ13C values which can be used for fractionation normalization without the need for offline 13C/12C measurements on isotope ratio mass spectrometers (IRMS). However, AMS δ13C values on our 0.5 MV NEC Compact Accelerator often differ from IRMS values on the same material by 4-5‰ or more. It has been postulated that the AMS δ13C values account for graphitization- and machine-induced fractionation in addition to natural fractionation, but how much does this affect the 14C ages or F14C? We present an analysis of F14C as a linear least-squares fit against AMS δ13C results for several of our secondary standards. While there are samples for which there is an obvious correlation between AMS δ13C and F14C, as quantified by the calculated probability of no correlation, we find that the trend lies within one standard deviation of the variance of our F14C measurements. Our laboratory produces both zinc- and hydrogen-reduced graphite, and we present our results for each type. Additionally, we show the variance of our AMS δ13C measurements of our secondary standards.
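A minimal sketch of such a linear least-squares analysis, using synthetic numbers rather than the laboratory's standards data; the slope, correlation, and the comparison against the F14C scatter mirror the kind of check described:

```python
import numpy as np

# Synthetic stand-in for repeated measurements of one secondary standard
# (delta13C in permil, F14C dimensionless); all values are illustrative.
rng = np.random.default_rng(1)
d13c = rng.normal(-25.0, 2.0, 40)
f14c = 1.0 + 0.0004 * (d13c + 25.0) + rng.normal(0.0, 0.002, 40)

# Ordinary least-squares fit: F14C = slope * delta13C + intercept
slope, intercept = np.polyfit(d13c, f14c, 1)

# Pearson correlation, and a crude check of whether the fitted trend
# over one sigma of delta13C stays within the F14C scatter:
r = np.corrcoef(d13c, f14c)[0, 1]
within_scatter = abs(slope * d13c.std()) < f14c.std()
print(round(slope, 5), round(r, 2), within_scatter)
```

In practice one would also compute the p-value of no correlation (e.g. with `scipy.stats.pearsonr`) and propagate measurement uncertainties, which this sketch omits.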
Abstract:
BACKGROUND: Core outcome sets (COS) can increase the efficiency and value of research and, as a result, an increasing number of studies are looking to develop them. However, the credibility of a COS depends on both the use of sound methodology in its development and clear and transparent reporting of the processes adopted. To date, there is no guideline for reporting COS studies. The aim of this programme of research is to develop a reporting guideline for studies developing COS and to highlight some of the important methodological considerations in the process.
METHODS/DESIGN: The study will include a reporting guideline item generation stage which will then be used in a Delphi study. The Delphi study is anticipated to include two rounds. The first round will ask stakeholders to score the items listed and to add any new items they think are relevant. In the second round of the process, participants will be shown the distribution of scores for all stakeholder groups separately and asked to re-score. A final consensus meeting will be held with an expert panel and stakeholder representatives to review the guideline item list. Following the consensus meeting, a reporting guideline will be drafted and review and testing will be undertaken until the guideline is finalised. The final outcome will be the COS-STAR (Core Outcome Set-STAndards for Reporting) guideline for studies developing COS and a supporting explanatory document.
DISCUSSION: To assess the credibility and usefulness of a COS, readers of a COS development report need complete, clear and transparent information on its methodology and proposed core set of outcomes. The COS-STAR guideline will potentially benefit all stakeholders in COS development: COS developers, COS users, e.g. trialists and systematic reviewers, journal editors, policy-makers and patient groups.
Abstract:
Some reasons for registering trials might be considered as self-serving, such as satisfying the requirements of a journal in which the researchers wish to publish their eventual findings or publicising the trial to boost recruitment. Registry entries also help others, including systematic reviewers, to know about ongoing or unpublished studies and contribute to reducing research waste by making it clear what studies are ongoing. Other sources of research waste include inconsistency in outcome measurement across trials in the same area, missing data on important outcomes from some trials, and selective reporting of outcomes. One way to reduce this waste is through the use of core outcome sets: standardised sets of outcomes for research in specific areas of health and social care. These do not restrict the outcomes that will be measured, but provide the minimum to include if a trial is to be of the most use to potential users. We propose that trial registries, such as ISRCTN, encourage researchers to note their use of a core outcome set in their entry. This will help people searching for trials and those worried about selective reporting in closed trials. Trial registries can facilitate these efforts to make new trials as useful as possible and reduce waste. The outcomes section in the entry could prompt the researcher to consider using a core outcome set and facilitate the specification of that core outcome set and its component outcomes through linking to the original core outcome set. In doing this, registries will contribute to the global effort to ensure that trials answer important uncertainties, can be brought together in systematic reviews, and better serve their ultimate aim of improving health and well-being through improving health and social care.
Abstract:
Physically Unclonable Functions (PUFs) exploit inherent manufacturing variations and present a promising solution for hardware security. They can be used for key storage, authentication and ID generation. Low-power cryptographic design is also very important for security applications. However, many digital PUF designs to date, such as Arbiter PUFs and RO PUFs, are not very efficient: they are difficult to implement on Field Programmable Gate Arrays (FPGAs) or consume many FPGA hardware resources. In previous work, a new and efficient PUF identification generator was presented for FPGAs, designed to fit in a single slice per response bit by using a 1-bit PUF identification generator cell formed as a hard macro. In this work, we propose an ultra-compact PUF identification generator design, implemented on ten low-cost Xilinx Spartan-6 FPGA LX9 microboards. The resource utilization is only 2.23%, which, to the best of the authors' knowledge, makes this the most compact and robust FPGA-based PUF identification generator reported to date. The design delivers uniqueness in a stable range around 50% and good reliability between 85% and 100%.
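The uniqueness and reliability figures quoted are standard PUF metrics based on fractional Hamming distances. A hedged sketch with made-up 16-bit device responses (not the paper's measured data); with such short, hand-picked strings the inter-device distance lands near 67% rather than the ideal 50%:

```python
import itertools

def hamming_frac(a, b):
    """Fractional Hamming distance between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def uniqueness(responses):
    """Mean inter-device fractional Hamming distance, in % (ideal: 50%)."""
    pairs = list(itertools.combinations(responses, 2))
    return 100.0 * sum(hamming_frac(a, b) for a, b in pairs) / len(pairs)

def reliability(reference, repeats):
    """100% minus the mean intra-device distance across repeated readouts."""
    drift = sum(hamming_frac(reference, r) for r in repeats) / len(repeats)
    return 100.0 * (1.0 - drift)

# Hypothetical 16-bit responses from three devices:
devs = ["1011001110001101", "0110110010110100", "1100101001011011"]
print(round(uniqueness(devs), 1))
# Device 0 re-read twice, with one flipped bit in the second readout:
print(reliability(devs[0], ["1011001110001101", "1011001110001100"]))  # 96.875
```

Real evaluations average over many challenge bits, devices, and environmental conditions (temperature, voltage), which is where the reported 85-100% reliability range comes from.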