991 results for Machine components


Relevance: 20.00%

Abstract:

Efficient identification and follow-up of astronomical transients is hindered by the need for humans to manually select promising candidates from data streams that contain many false positives. These artefacts arise in the difference images produced by most major ground-based time-domain surveys with large-format CCD cameras. This dependence on humans to reject bogus detections is unsustainable for next-generation all-sky surveys, and significant effort is now being invested to solve the problem computationally. In this paper, we explore a simple machine learning approach to real-bogus classification by constructing a training set from the image data of ~32 000 real astrophysical transients and bogus detections from the Pan-STARRS1 Medium Deep Survey. We derive our feature representation from the pixel intensity values of a 20 × 20 pixel stamp around the centre of each candidate. This differs from previous work, which relied on catalogued domain knowledge for feature design or selection, in that it operates directly on the pixels. Three machine learning algorithms are trained (artificial neural networks, support vector machines and random forests) and their performance is tested on a held-out subset of 25 per cent of the training data. We find the best results from the random forest classifier and demonstrate that, accepting a false positive rate of 1 per cent, the classifier initially suggests a missed detection rate of around 10 per cent. However, we also find that a combination of bright star variability, nuclear transients and uncertainty in human labelling means that our best estimate of the missed detection rate is approximately 6 per cent.
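As a rough illustration of the pipeline described (not the authors' code), the sketch below flattens 20 × 20 pixel stamps into 400 raw-pixel features, trains a random forest on 75 per cent of the data, and sets the decision threshold to a 1 per cent false positive rate to read off the missed detection rate. The loader load_stamps() and the hyperparameters are assumptions.

```python
# Minimal sketch of pixel-based real-bogus classification, assuming
# stamps is an (N, 20, 20) array and labels is an (N,) array with
# 1 = real transient, 0 = bogus. load_stamps() is a hypothetical loader.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

stamps, labels = load_stamps()                      # hypothetical helper
X = stamps.reshape(len(stamps), -1)                 # 20 x 20 stamp -> 400 raw pixel features

# Hold out 25 per cent for testing, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

# Choose the score threshold at which 1 per cent of bogus detections
# are accepted (false positives), then read off the missed detection rate.
scores = clf.predict_proba(X_test)[:, 1]
threshold = np.quantile(scores[y_test == 0], 0.99)  # 1% of bogus exceed this
missed = np.mean(scores[y_test == 1] < threshold)   # real transients rejected
print(f"missed detection rate at 1% FPR: {missed:.3f}")
```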

Relevance: 20.00%

Abstract:

BACKGROUND: Glaucoma is a leading cause of avoidable blindness worldwide. Open angle glaucoma is the most common type of glaucoma. No randomised controlled trials have been conducted to evaluate the effectiveness of glaucoma screening for reducing sight loss, and it is unclear which intervention would be most appropriate to evaluate in any glaucoma screening trial. The purpose of this study was to develop the clinical components of an intervention for evaluation in a glaucoma (open angle) screening trial that would be feasible and acceptable in a UK eye-care service.

METHODS: A mixed-methods study, based on the Medical Research Council (MRC) framework for complex interventions, integrating qualitative methods (semi-structured interviews with 46 UK eye-care providers, policy makers and health service commissioners) and quantitative methods (economic modelling). Interview data were synthesised and used to revise the screening interventions compared within an existing economic model.

RESULTS: The qualitative data indicated broad-based support for a glaucoma screening trial taking place in primary care, using ophthalmic-trained technical assistants supported by optometry input, with the precise location tailored to local circumstances. Opinion varied on the choice of screening test and target population. Integrating the interview findings with cost-effectiveness criteria reduced 189 potential components to a two-test intervention comprising either optic nerve photography or screening-mode perimetry (a measure of visual field sensitivity), with or without tonometry (a measure of intraocular pressure). It would be more cost-effective, and thus acceptable in a policy context, to target screening for open angle glaucoma at those at highest risk, but on both practicality and equity grounds the optimal strategy was to screen a general population cohort beginning at age forty.

CONCLUSIONS: Interventions for screening for open angle glaucoma that would be feasible from a service delivery perspective were identified. Integration within an economic modelling framework explicitly highlighted the trade-off between cost-effectiveness, feasibility and equity. This study exemplifies the MRC recommendation to integrate qualitative and quantitative methods in developing complex interventions. The next step in the development pathway should encompass the views of service users.

Relevance: 20.00%

Abstract:

Slow-release drugs must be manufactured to meet target specifications for dissolution curve profiles. In this paper we consider the problem of identifying the drivers of dissolution curve variability of a drug from historical manufacturing data. Several data sources are considered: raw material parameters, coating data, loss on drying and pellet size statistics. The methodology is to develop predictive models using LASSO, a powerful machine learning algorithm for regression with high-dimensional datasets. LASSO provides sparse solutions, facilitating the identification of the most important causes of variability in the drug fabrication process. The proposed methodology is illustrated using manufacturing data for a slow-release drug.
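As a sketch of this approach (not the paper's code or data), the snippet below fits a cross-validated LASSO to a hypothetical batch-history table and lists the non-zero coefficients, which flag the candidate drivers of variability. The file name, column names and response summary are illustrative assumptions.

```python
# Minimal LASSO sketch: rows are manufactured batches, columns are raw
# material, coating, loss-on-drying and pellet-size parameters, and the
# response is a scalar summary of the dissolution curve. All names below
# are hypothetical, not from the paper.
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("batch_history.csv")               # hypothetical data file
y = df.pop("dissolution_at_4h").to_numpy()          # hypothetical response column
feature_names = df.columns
X = StandardScaler().fit_transform(df)              # LASSO needs comparable scales

model = LassoCV(cv=5).fit(X, y)

# The sparse solution: non-zero coefficients identify the most important
# causes of dissolution variability.
drivers = [(name, coef) for name, coef in zip(feature_names, model.coef_)
           if abs(coef) > 1e-8]
for name, coef in sorted(drivers, key=lambda t: -abs(t[1])):
    print(f"{name}: {coef:+.3f}")
```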

Relevance: 20.00%

Abstract:

The existence of loose particles left inside sealed electronic devices is one of the main factors affecting the reliability of the whole system, and identifying the particle material is important for analyzing its source. Conventional material identification algorithms rely mainly on time-, frequency- and wavelet-domain features. However, these features are usually overlapping and redundant, resulting in unsatisfactory identification accuracy. The main objective of this paper is to improve the accuracy of material identification. First, principal component analysis (PCA) is applied to the nine features extracted from the time and frequency domains, yielding six less-correlated principal components. These principal components are then used for material identification with a support vector machine (SVM). Finally, experimental results show that the new method can effectively distinguish among material types, including wire, aluminum and tin particles.
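A minimal sketch of such a PCA-plus-SVM pipeline follows, assuming a hypothetical loader load_particle_features() that returns the nine time- and frequency-domain features per signal together with material labels; this illustrates the described approach rather than the authors' code.

```python
# Sketch of the material-identification pipeline: nine correlated features
# -> six less-correlated principal components -> SVM classifier over
# {"wire", "aluminum", "tin"}. Data loading is hypothetical.
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

features, materials = load_particle_features()      # hypothetical helper

X_train, X_test, y_train, y_test = train_test_split(
    features, materials, test_size=0.3, random_state=0)

# Standardise, reduce nine features to six principal components, classify.
pipeline = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),
    SVC(kernel="rbf"),
)
pipeline.fit(X_train, y_train)
print(f"identification accuracy: {pipeline.score(X_test, y_test):.3f}")
```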

Relevance: 20.00%

Abstract:

The present invention relates to a novel class of water-compatible molecularly imprinted polymers (AquaMIPs) capable of selectively binding target molecules, such as riboflavin or analogues thereof, in water or aqueous media; to their synthesis; and to their use in food processing and in extraction or separation processes.

Relevance: 20.00%

Abstract:

This paper examines a large structural component and its supply chain. The component is representative of those used in the production of civil transport aircraft and is manufactured from carbon fibre epoxy resin prepreg using traditional hand layup and autoclave cure. Life cycle assessment (LCA) is used to predict the component's production carbon emissions. The results show the distribution of carbon emissions within the supply chain, identifying the dominant production processes as carbon fibre manufacture and composite part manufacture. The elevated-temperature processes of material and part creation, and the associated electricity usage, have a significant impact on the overall production emissions footprint. The paper also demonstrates the calculation of the emissions footprint's sensitivity to the geographic location and associated energy sources of the supply chain. The results verify that the proposed methodology can quantitatively link component and supply chain specifics to manufacturing processes and thus identify the design drivers for carbon emissions in the manufacturing life of the component.
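The location sensitivity reduces to a simple calculation: each process's electricity use multiplied by the carbon intensity of the local grid, summed over the supply chain. The sketch below illustrates the idea with made-up placeholder numbers, not values from the paper.

```python
# Illustrative location-sensitivity calculation: production emissions as
# process electricity use times local grid carbon intensity. All numbers
# are hypothetical placeholders, not results from the paper.
process_energy_kwh = {                  # electricity per component, hypothetical
    "carbon fibre manufacture": 4000.0,
    "prepreg production": 600.0,
    "composite part manufacture (autoclave cure)": 2500.0,
}

grid_intensity_kg_per_kwh = {           # illustrative grid carbon intensities
    "region A": 0.85,
    "region B": 0.35,
}

# The same supply chain sited in different regions yields different footprints.
for region, intensity in grid_intensity_kg_per_kwh.items():
    total = sum(e * intensity for e in process_energy_kwh.values())
    print(f"{region}: {total:,.0f} kg CO2e per component")
```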

Relevance: 20.00%

Abstract:

Taking in recent advances in neuroscience and digital technology, Gander and Garland assess the state of the inter-arts in America and the Western world, exploring and questioning the primacy of affect in an increasingly hypertextual everyday environment. In this analysis they signal a move beyond W. J. T. Mitchell’s coinage of the ‘imagetext’ to an approach that centres the reader-viewer in a recognition, after John Dewey, of ‘art as experience’. New thinking in the cognitive and computer sciences about the relationship between the body and the mind challenges established definitions of ‘embodiment’, ‘materiality’, ‘virtuality’ and even ‘intelligence’, they argue, whilst ‘Extended Mind Theory’, they note, marries our cognitive processes with the material forms with which we engage, confirming and complicating Marshall McLuhan’s insight, decades ago, that ‘all media are “extensions of man”’. In this chapter, Gander and Garland open paths and suggest directions into understandings and critical interpretations of new and emerging imagetext worlds and experiences.

Relevance: 20.00%

Abstract:

This theoretical paper attempts to define some of the key components of, and challenges in, creating embodied conversational agents that can be genuinely interesting conversational partners. Wittgenstein’s argument concerning talking lions emphasizes the importance of a shared common ground as a basis for conversational interactions. Virtual bats suggests that, for some people at least, it is important that there be a feeling of authenticity concerning a subjectively experiencing entity that can convey what it is like to be that entity. Electric sheep reminds us of the importance of empathy in human conversational interaction and that we should provide a full communicative repertoire of both verbal and non-verbal components if we are to create genuinely engaging interactions; indeed, we may be making the task more difficult rather than easier if we leave out non-verbal aspects of communication. Finally, analogical peacocks highlights the importance of between-minds alignment and establishes the longer-term goal of being interesting, creative and humorous if an embodied conversational agent is to be a truly engaging conversational partner. Some potential directions and solutions for addressing these issues are suggested.

Relevance: 20.00%

Abstract:

Background
The use of multiple medicines (polypharmacy) is increasingly common in older people. Ensuring that patients receive the most appropriate combinations of medications (appropriate polypharmacy) is a significant challenge. The quality of evidence to support the effectiveness of interventions to improve appropriate polypharmacy is low. Systematic identification of mediators of behaviour change, using the Theoretical Domains Framework (TDF), provides a theoretically robust evidence base to inform intervention design. This study aimed to (1) identify key theoretical domains that were perceived to influence the prescribing and dispensing of appropriate polypharmacy to older patients by general practitioners (GPs) and community pharmacists, and (2) map domains to associated behaviour change techniques (BCTs) to include as components of an intervention to improve appropriate polypharmacy in older people in primary care.

Methods
Semi-structured interviews were conducted with members of each healthcare professional (HCP) group using tailored topic guides based on TDF version 1 (12 domains). Questions covering each domain explored HCPs’ perceptions of barriers and facilitators to ensuring the prescribing and dispensing of appropriate polypharmacy to older people. Interviews were audio-recorded and transcribed verbatim. Data analysis involved the framework method and content analysis. Key domains were identified and mapped to BCTs based on established methods and discussion within the research team.

Results
Thirty HCPs were interviewed (15 GPs, 15 pharmacists). Eight key domains were identified, perceived to influence prescribing and dispensing of appropriate polypharmacy: ‘Skills’, ‘Beliefs about capabilities’, ‘Beliefs about consequences’, ‘Environmental context and resources’, ‘Memory, attention and decision processes’, ‘Social/professional role and identity’, ‘Social influences’ and ‘Behavioural regulation’. Following mapping, four BCTs were selected for inclusion in an intervention for GPs or pharmacists: ‘Action planning’, ‘Prompts/cues’, ‘Modelling or demonstrating of behaviour’ and ‘Salience of consequences’. An additional BCT (‘Social support or encouragement’) was selected for inclusion in a community pharmacy-based intervention in order to address barriers relating to interprofessional working that were encountered by pharmacists.

Conclusions
Selected BCTs will be operationalised in a theory-based intervention to improve appropriate polypharmacy for older people, to be delivered in GP practice and community pharmacy settings. Future research will involve development and feasibility testing of this intervention.

Relevance: 20.00%

Abstract:

Peptides derived from viral envelope proteins have been shown to inhibit the protein-protein interactions of the virus membrane fusion process and thus have great potential to be developed into effective antiviral therapies. There are three classes of envelope proteins, each exhibiting a distinct structural fold. Although the exact fusion mechanism remains elusive, it has been suggested that the three classes of viral fusion proteins share a similar mechanism of membrane fusion. This common mechanism of action makes it possible to correlate the properties of self-derived peptide inhibitors with their activities. Here we develop a support vector machine model that uses sequence-based statistical scores of self-derived peptide inhibitors as input features to correlate with their activities. The model displays 92% prediction accuracy with a Matthews correlation coefficient of 0.84, clearly superior to models using physicochemical properties or amino acid composition as input. This predictive support vector machine model for self-derived peptides of envelope proteins should be useful in the development of antiviral peptide inhibitors targeting the virus fusion process.
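A hedged sketch of this kind of model follows, assuming a hypothetical loader load_peptide_scores() that returns sequence-based statistical scores and binary activity labels; the feature details are not specified here and the code is illustrative, not the authors'.

```python
# Minimal SVM activity model: features is an (N, d) array of sequence-based
# statistical scores for self-derived peptide inhibitors, active is a binary
# activity label. Loader and feature construction are hypothetical.
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

features, active = load_peptide_scores()            # hypothetical helper

X_train, X_test, y_train, y_test = train_test_split(
    features, active, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)

# Report accuracy and the Matthews correlation coefficient, the metrics
# quoted in the abstract.
y_pred = model.predict(X_test)
acc = (y_pred == y_test).mean()
print(f"accuracy: {acc:.2f}, MCC: {matthews_corrcoef(y_test, y_pred):.2f}")
```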

Relevance: 20.00%

Abstract:

As a newly invented parallel kinematic machine (PKM), the Exechon has attracted intensive attention from both academia and industry due to its conceptually high performance. Nevertheless, the dynamic behaviors of the Exechon PKM have not been thoroughly investigated because of its structural and kinematic complexity. To identify the dynamic characteristics of the Exechon PKM, this paper proposes an elastodynamic model based on the substructure synthesis technique. The Exechon PKM is divided into a moving platform subsystem, a fixed base subsystem and three limb subsystems according to its structural features. Differential equations of motion for each limb subsystem are derived through finite element (FE) formulations, modeling the complex limb structure as a spatial beam with corresponding geometric cross sections. Meanwhile, revolute, universal and spherical joints are simplified into virtual lumped springs with equivalent stiffnesses and masses at their geometric centers. Differential equations of motion for the moving platform are derived with Newton's second law, treating the platform as a rigid body owing to its comparatively high rigidity. After introducing the deformation compatibility conditions between the platform and the limbs, the governing differential equations of motion for the Exechon PKM are obtained. Solving the characteristic equations yields the natural frequencies and corresponding mode shapes of the PKM at any typical configuration. To predict the dynamic behaviors quickly, an algorithm is proposed to numerically compute the distributions of natural frequencies throughout the workspace. Simulation results reveal that the lower natural frequencies are strongly position-dependent and distributed axisymmetrically due to the structural symmetry of the limbs. Finally, a parametric analysis is carried out to identify the effects of structural, dimensional and stiffness parameters on the system's dynamic characteristics, with the purpose of providing useful information for the optimal design and performance improvement of the Exechon PKM. The elastodynamic modeling methodology and dynamic analysis procedure can be extended to other overconstrained PKMs with minor modifications.
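Once the global stiffness and mass matrices are assembled at a given configuration, the natural frequencies follow from the generalized eigenvalue problem det(K − ω²M) = 0. A minimal numerical sketch of that final step is shown below, with hypothetical assembly helpers (assemble_stiffness, assemble_mass) standing in for the FE and joint-spring modeling described in the abstract.

```python
# Sketch of the eigenvalue step: given global stiffness K and mass M at one
# configuration, undamped natural frequencies satisfy det(K - omega^2 M) = 0.
# assemble_stiffness/assemble_mass are hypothetical helpers, not the paper's.
import numpy as np
from scipy.linalg import eigh

configuration = (0.0, 0.0, 0.6)        # hypothetical platform pose (m)
K = assemble_stiffness(configuration)  # FE limb models + joint spring stiffnesses
M = assemble_mass(configuration)       # limb mass matrices + rigid platform inertia

# eigh solves the symmetric-definite generalized problem K v = lam M v;
# eigenvalues lam = omega^2 are returned in ascending order.
lam, modes = eigh(K, M)
natural_freqs_hz = np.sqrt(lam) / (2.0 * np.pi)    # rad/s -> Hz
print("lowest three natural frequencies (Hz):", natural_freqs_hz[:3])
```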

Relevance: 20.00%

Abstract:

With the wide range of cloud Virtual Machines (VMs) available, it is difficult to determine which VMs will maximise the performance of an application. Benchmarking is commonly used to this end to capture the performance of VMs. Most cloud benchmarking techniques are heavyweight: time-consuming processes that must benchmark the entire VM in order to obtain accurate benchmark data. Such benchmarks cannot be used in real time on the cloud and incur extra costs even before an application is deployed.

In this paper, we present lightweight cloud benchmarking techniques that execute quickly and can be used in near real time on the cloud. The exploration of lightweight benchmarking techniques is facilitated by the development of DocLite - Docker Container-based Lightweight Benchmarking. DocLite is built on Docker container technology, which allows a user-defined portion of the VM (such as memory size and number of CPU cores) to be benchmarked. DocLite operates in two modes: in the first mode, containers are used to benchmark a small portion of the VM to generate performance ranks; in the second mode, historic benchmark data is used along with the first mode, as a hybrid, to generate VM ranks. The generated ranks are evaluated against three scientific high-performance computing applications. The proposed techniques are up to 91 times faster than a heavyweight technique that benchmarks the entire VM. The first mode generates ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, respectively. The hybrid mode improves the correlation slightly, but the first mode is sufficient for benchmarking cloud VMs.
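As an illustration of the container-based idea (not DocLite's actual code), the sketch below uses Docker's standard resource flags to confine a benchmark to a user-defined slice of the VM and times it; the image name "benchmark-image" is hypothetical.

```python
# Lightweight benchmarking sketch in the spirit of DocLite: run a benchmark
# inside a container constrained to a portion of the VM's CPU and memory,
# and use the measured time to rank VMs. The --cpus and --memory flags are
# standard docker run options; the benchmark image is a placeholder.
import subprocess
import time

def run_benchmark(cpus: str, memory: str) -> float:
    """Run a containerised benchmark on a portion of the VM; return seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["docker", "run", "--rm",
         "--cpus", cpus,            # e.g. "2" = two CPU cores
         "--memory", memory,        # e.g. "1g" = one GiB of RAM
         "benchmark-image"],        # hypothetical benchmark container
        check=True,
    )
    return time.perf_counter() - start

# Benchmark a small slice of the VM rather than the whole machine;
# repeating this across VM types yields the performance ranks.
elapsed = run_benchmark(cpus="2", memory="1g")
print(f"benchmark completed in {elapsed:.1f} s")
```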