Abstract:
Precision medicine is an emerging approach to disease treatment and prevention that considers variability in patient genes, environment, and lifestyle. However, little has been written about how such research impacts emergency care. Recent advances in analytical techniques have made it possible to characterize patients in a more comprehensive and sophisticated fashion at the molecular level, promising highly individualized diagnosis and treatment. Among these techniques are various systematic molecular phenotyping analyses (e.g., genomics, transcriptomics, proteomics, and metabolomics). Although a number of emergency physicians use such techniques in their research, widespread discussion of these approaches has been lacking in the emergency care literature and many emergency physicians may be unfamiliar with them. In this article, we briefly review the underpinnings of such studies, note how they already impact acute care, discuss areas in which they might soon be applied, and identify challenges in translation to the emergency department (ED). While such techniques hold much promise, it is unclear whether the obstacles to translating their findings to the ED will be overcome in the near future. Such obstacles include validation, cost, turnaround time, user interface, decision support, standardization, and adoption by end-users.
Abstract:
Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. When using this accuracy-oriented conception of memory, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony. This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited, or a series of either yes/no or cued questions was asked. Instructions provided by the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared with those interviewed immediately, participants interviewed after a delay provided less information and less precise responses. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were the most likely to say “I don't know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said “I don't know” to maintain accuracy. However, when withholding responses and adjusting precision were both possible, people utilized both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.
Abstract:
In his last two State of the Union addresses, President Barack Obama has focused on the need to deliver innovative solutions to improve human health, through the Precision Medicine Initiative in 2015 and the recently announced Cancer Moonshot in 2016. Precision cancer care has delivered clear patient benefit, but even for high-impact medicines such as imatinib mesylate (Glivec) in chronic myeloid leukaemia, the excitement at the success of this practice-changing clinical intervention has been somewhat tempered by the escalating price of this 'poster child' for precision cancer medicine (PCM). Recent studies on the costs of cancer drugs have revealed significant price differentials, which are a major causative factor behind disparities in the access to new generations of immunological and molecularly targeted agents. In this perspective, we will discuss the benefits of PCM to modern cancer control, but also emphasise how increasing costs are rendering the current approaches to integrating the paradigm of PCM unsustainable. Despite the ever increasing pressure on cancer and health care budgets, innovation will and must continue. Value-based frameworks offer one of the most rational approaches for policymakers committed to improving cancer outcomes through a public health approach.
Abstract:
Temporal replicate counts are often aggregated to improve model fit by reducing zero-inflation and count variability; in the case of migration counts collected hourly throughout a migration season, aggregation also allows one to ignore nonindependence. However, aggregation can represent a loss of potentially useful information on the hourly or seasonal distribution of counts, which might impact our ability to estimate reliable trends. We simulated 20-year hourly raptor migration count datasets with a known rate of change to test the effect of aggregating hourly counts to daily or annual totals on our ability to recover the known trend. We simulated data for three types of species, to test whether results varied with species abundance or migration strategy: a commonly detected species, e.g., Northern Harrier, Circus cyaneus; a rarely detected species, e.g., Peregrine Falcon, Falco peregrinus; and a species typically counted in large aggregations with overdispersed counts, e.g., Broad-winged Hawk, Buteo platypterus. We compared the accuracy and precision of estimated trends across species and count types (hourly/daily/annual) using hierarchical models that assumed a Poisson, negative binomial (NB), or zero-inflated negative binomial (ZINB) count distribution. We found little benefit of modeling zero-inflation or of modeling the hourly distribution of migration counts. For the rare species, trends analyzed using daily totals and an NB or ZINB data distribution resulted in a higher probability of detecting an accurate and precise trend. In contrast, trends of the common and overdispersed species benefited from aggregation to annual totals; for the overdispersed species in particular, trends estimated using annual totals were more precise and resulted in lower probabilities of estimating a trend (1) in the wrong direction, or (2) with credible intervals that excluded the true trend, as compared with hourly and daily counts.
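The core of the simulation experiment can be sketched minimally. The snippet below is an illustration, not the authors' code: it draws 20 years of annual totals with a known 3.6% decline (plain Poisson counts for simplicity, rather than the NB/ZINB models fitted in the study) and recovers the trend by regressing log counts on year.

```python
import math
import random

random.seed(1)

def poisson(lam):
    # Knuth's algorithm; adequate for the modest rates used here
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# 20 years of annual totals with a known annual rate of change r = 0.964
r, n0 = 0.964, 500
years = list(range(20))
counts = [poisson(n0 * r ** t) for t in years]

# OLS slope of log(count) on year estimates log(r)
logs = [math.log(c) for c in counts]
ybar = sum(years) / len(years)
lbar = sum(logs) / len(logs)
slope = sum((y - ybar) * (l - lbar) for y, l in zip(years, logs)) \
        / sum((y - ybar) ** 2 for y in years)
est_r = math.exp(slope)   # should land near the true 0.964
```

The same scaffold extends to hourly or daily aggregation by distributing each annual total over finer time steps before re-aggregating.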
Abstract:
The phase difference principle is widely applied nowadays in sonar systems used for sea floor bathymetry. The apparent angle of a target point is obtained from the phase difference measured between two close receiving arrays. Here we study the influence of phase difference estimation errors caused by the physical structure of the backscattered signals. It is shown that, under certain conditions, beyond the commonly considered effects of additive external noise and baseline decorrelation, the processing may be affected by the shifting footprint effect: the two interferometer receivers get simultaneous echo contributions from slightly shifted parts of the seabed, which degrades the signal coherence and, hence, the phase difference measurement. This geometrical effect is described analytically and checked with numerical simulations, both for square- and sine-shaped signal envelopes. Its relative influence depends on the geometrical configuration and receiver spacing; it may be prevalent in practical cases associated with bathymetric sonars. The measurements close to nadir, which are known to be especially difficult with interferometry systems, are addressed in particular.
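As a concrete illustration of the phase difference principle (the generic far-field relation, not the paper's full error model): for two receivers a distance a apart receiving a plane wave of wavelength λ, the measured phase difference Δφ = (2πa/λ)·sin θ yields the apparent angle θ. The numbers below (100 kHz, c ≈ 1500 m/s, half-wavelength spacing) are illustrative assumptions.

```python
import math

def apparent_angle(delta_phi, spacing, wavelength):
    """Apparent target angle (rad) from an interferometric phase
    difference, assuming a far-field plane wave on two receivers."""
    s = wavelength * delta_phi / (2 * math.pi * spacing)
    if abs(s) > 1.0:
        raise ValueError("phase difference outside the unambiguous range")
    return math.asin(s)

# e.g. a 100 kHz sonar in water (c ~ 1500 m/s), half-wavelength spacing
wl = 1500.0 / 100e3                      # wavelength: 15 mm
theta = apparent_angle(math.pi / 4, spacing=wl / 2, wavelength=wl)
```

With half-wavelength spacing the mapping is unambiguous over the full ±90° sector; wider baselines improve angular precision at the cost of phase wrapping, which is one reason receiver spacing matters in the abstract's error analysis.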
Abstract:
Manipulation of single cells and particles is important to biology and nanotechnology. Our electrokinetic (EK) tweezers manipulate objects in simple microfluidic devices using gentle fluid and electric forces under vision-based feedback control. In this dissertation, I detail a user-friendly implementation of EK tweezers that allows users to select, position, and assemble cells and nanoparticles. This EK system was used to measure attachment forces between living breast cancer cells, trap single quantum dots with 45 nm accuracy, build nanophotonic circuits, and scan optical properties of nanowires. With a novel multi-layer microfluidic device, EK was also used to guide single microspheres along complex 3D trajectories. The schemes, software, and methods developed here can be used in many settings to precisely manipulate most visible objects, assemble objects into useful structures, and improve the function of lab-on-a-chip microfluidic systems.
Abstract:
We describe a new geometry for electrostatic actuators to be used in sensitive laser interferometers, suited for prototype and tabletop experiments related to gravitational wave detection with mirrors of 100 g or less. The arrangement consists of two plates at the sides of the mirror (test mass) and therefore does not reduce its clear aperture as a conventional electrostatic drive (ESD) would. Using the sample case of the AEI-10 m prototype interferometer, we investigate the actuation range and the influence of relative misalignment of the ESD plates with respect to the test mass. We find that in the case of the AEI-10 m prototype interferometer, this new kind of ESD could provide a range of 0.28 µm when operated at a voltage of 1 kV. In addition, the geometry presented is shown to provide a reduction factor of about 100 in the magnitude of the actuator motion coupling to the test mass displacement. We show that, in the specific case of the AEI-10 m interferometer, it is therefore possible to mount the ESD actuators directly on the optical table without spoiling the seismic isolation performance of the triple-stage suspension of the main test masses.
Abstract:
A basic requirement of a plasma etching process is fidelity of the patterned organic materials. In photolithography, a He plasma pretreatment (PPT) based on high ultraviolet and vacuum ultraviolet (UV/VUV) exposure was shown to be successful for roughness reduction of 193 nm photoresist (PR). Typical multilayer masks consist of many other organic masking materials in addition to 193 nm PR. These materials vary significantly in UV/VUV sensitivity and therefore respond differently to the He PPT. A delamination of the nanometer-thin, ion-induced dense amorphous carbon (DAC) layer was observed. Extensive He PPT exposure produces volatile species through UV/VUV-induced scissioning. These species are trapped underneath the DAC layer during a subsequent plasma etch (PE), causing a loss of adhesion. Beyond stabilizing organic materials, the major goals of this work were to establish and evaluate a cyclic fluorocarbon (FC) based approach for atomic layer etching (ALE) of SiO2 and Si, to characterize the mechanisms involved, and to evaluate the impact of processing parameters. Periodic, short precursor injections allow precise deposition of thin FC films. These films limit the amount of available chemical etchant during subsequent low-energy, plasma-based Ar+ ion bombardment, resulting in strongly time-dependent etch rates. In situ ellipsometry showcased the self-limited etching. X-ray photoelectron spectroscopy (XPS) confirms FC film deposition and mixing with the substrate. The cyclic ALE approach is also able to precisely etch Si substrates. A reduced time dependence of the etching is seen for Si, likely owing to a lower physical sputtering energy threshold. A fluorinated, oxidized surface layer is present during ALE of Si and greatly influences the etch behavior. A reaction of the precursor with the fluorinated substrate upon injection was observed and characterized. The cyclic ALE approach was transferred to a manufacturing-scale reactor at IBM Research. Ensuring transferability to industrial device patterning is crucial for the application of ALE. In addition to device patterning, the cyclic ALE process was employed for oxide removal from Si and SiGe surfaces with the goal of minimal substrate damage and surface residues. The ALE process developed for SiO2 and Si etching did not remove native oxide at the level required; optimizing the process enabled strong O removal from the surface. A subsequent 90% H2/Ar plasma allows removal of C and F residues.
Abstract:
My study investigated internal consistency estimates of psychometric surveys as an operationalization of the state of measurement precision of constructs in industrial and organizational (I/O) psychology. Analyses were conducted on samples used in research articles published in the Journal of Applied Psychology between 1975 and 2010 at five-year intervals: 934 samples (K = 934) from 480 articles, yielding 1,427 coefficients. Articles and their respective samples were coded for test-taker characteristics (e.g., age, gender, and ethnicity), research settings (e.g., lab and field studies), and test characteristics (e.g., number of items and scale anchor points). A depository of reliabilities and inter-item correlations was developed for I/O variables and construct groups. Personality measures had significantly lower inter-item correlations than other construct groups. Internal consistency estimates and reporting practices were also evaluated over time, demonstrating an improvement in measurement precision and in the reporting of missing data.
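For reference, the internal consistency estimate most commonly reported in such surveys is Cronbach's alpha. A minimal stdlib sketch (illustrative only, not the coding scheme used in the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: one list of scores per item, same respondents in each list."""
    k = len(items)
    n = len(items[0])

    def var(xs):                      # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# three perfectly correlated items -> alpha = 1.0
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```

Because alpha rises with both the mean inter-item correlation and the number of items, the study's separate tracking of inter-item correlations and item counts is what lets it attribute changes in alpha to genuine precision gains rather than longer scales.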
Abstract:
Annual counts of migrating raptors at fixed observation points are a widespread practice, and changes in numbers counted over time, adjusted for survey effort, are commonly used as indices of trends in population size. Unmodeled year-to-year variation in detectability may introduce bias, reduce precision of trend estimates, and reduce power to detect trends. We conducted dependent double-observer surveys at the annual fall raptor migration count at Lucky Peak, Idaho, in 2009 and 2010 and applied Huggins closed-capture removal models and information-theoretic model selection to determine the relative importance of factors affecting detectability. The most parsimonious model included effects of observer team identity, distance, species, and day of the season. We then simulated 30 years of counts with heterogeneous individual detectability, a population decline (λ = 0.964), and unexplained random variation in the number of available birds. Imperfect detectability did not bias trend estimation, and increased the time required to achieve 80% power by less than 11%. Results suggested that availability is a greater source of variance in annual counts than detectability; thus, efforts to account for availability would improve the monitoring value of migration counts. According to our models, long-term trends in observer efficiency or migratory flight distance may introduce substantial bias to trend estimates. Estimating detectability with a novel count protocol like our double-observer method is just one potential means of controlling such effects. The traditional approach of modeling the effects of covariates and adjusting the index may also be effective if ancillary data is collected consistently.
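The dependent double-observer idea can be illustrated with the standard removal-type estimator (a generic sketch under an equal-detectability assumption, not the Huggins closed-capture models with covariates fitted in the study):

```python
def double_observer_estimate(x1, x2):
    """Dependent double-observer estimator.
    x1: birds detected by the primary observer
    x2: birds missed by the primary but recorded by the secondary
    Assumes both observers share the same detection probability p."""
    p = x1 / (x1 + x2)                 # per-observer detection probability
    p_any = 1 - (1 - p) ** 2           # P(detected by at least one observer)
    n_hat = (x1 + x2) / p_any          # estimated number of available birds
    return p, n_hat

p, n_hat = double_observer_estimate(80, 20)   # p = 0.8, n_hat ~ 104.2
```

The Huggins models generalize this by letting p vary with observer team, distance, species, and day of season, which is how the abstract ranks those covariates by importance.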
Abstract:
We introduce quantum sensing schemes for measuring very weak forces with a single trapped ion. They use the spin-motional coupling induced by the laser-ion interaction to transfer the relevant force information to the spin degree of freedom. The force estimation is therefore carried out simply by observing the Ramsey-type oscillations of the ion spin states. Three quantum probes are considered, represented by systems obeying the Jaynes-Cummings, quantum Rabi (in 1D), and Jahn-Teller (in 2D) models. By using dynamical decoupling schemes in the Jaynes-Cummings and Jahn-Teller models, our force sensing protocols can be made robust to the spin dephasing caused by thermal and magnetic field fluctuations. In the quantum Rabi probe, the residual spin-phonon coupling vanishes, which makes this sensing protocol naturally robust to thermally induced spin dephasing. We show that the proposed techniques can be used to sense the axial and transverse components of the force with a sensitivity beyond the yN/√Hz range, i.e., at the xN/√Hz level (xennonewton, 10⁻²⁷ N). The Jahn-Teller protocol, in particular, can be used to implement a two-channel vector spectrum analyzer for measuring ultra-low voltages.
Abstract:
The ability of agents and services to automatically locate and interact with unknown partners is a goal for both the semantic web and web services. This "serendipitous interoperability" is hindered by the lack of an explicit means of describing what services (or agents) are able to do, that is, their capabilities. At present, informal descriptions of what services can do are found in "documentation" elements, or they are somehow encoded in operation names and signatures. We show, by reference to existing service examples, how ambiguous and imprecise capability descriptions hamper the attainment of automated interoperability goals in the open, global web environment. In this paper we propose a structured, machine-readable description of capabilities, which may help to increase the recall and precision of service discovery mechanisms. Our capability description draws on previous work in capability and process modeling and allows the incorporation of external classification schemes. The capability description is presented as a conceptual meta-model. The model supports conceptual queries and can be used as an extension to the DAML-S Service Profile.
Abstract:
Mechanical harmonic transmissions are a relatively new kind of drive with several unusual features. For example, they can provide a reduction ratio of up to 500:1 in one stage, have a very small tooth module compared to conventional drives, and carry a very large number of teeth (up to 1000) on a flexible gear. While manufacturing methods for conventional drives are well developed, fabrication of large harmonic drives presents a challenge: for example, how does one fabricate a thin shell 1.7 m in diameter with a wall thickness of 30 mm, having high-precision external teeth at one end and internal splines at the other? It is so flexible that conventional fabrication methods become unsuitable. In this paper, special fabrication methods are discussed that can be used for manufacturing large harmonic drive components. They include electro-slag welding and refining, the use of special expandable devices to locate and hold the flexible gear, welding the peripheral parts of disks with wear-resistant materials with subsequent machining, and others. These fabrication methods proved effective, and harmonic drives built with these innovative technologies have been installed on heavy metallurgical equipment and successfully tested.