55 results for Numerical Algorithms and Problems


Relevance: 100.00%

Publisher:

Abstract:

The role of rhodopsin as a structural prototype for the study of the whole superfamily of G protein-coupled receptors (GPCRs) is reviewed in an historical perspective. Discovered at the end of the nineteenth century, fully sequenced since the early 1980s, and with direct three-dimensional information available since the 1990s, rhodopsin has served as a platform to gather indirect information on the structure of the other superfamily members. Recent breakthroughs have elicited the solution of the structures of additional receptors, namely the beta 1- and beta 2-adrenergic receptors and the A(2A) adenosine receptor, now providing an opportunity to gauge the accuracy of homology modeling and molecular docking techniques and to perfect the computational protocol. Notably, in coordination with the solution of the structure of the A(2A) adenosine receptor, the first "critical assessment of GPCR structural modeling and docking" has been organized, the results of which highlighted that the construction of accurate models, although challenging, is certainly achievable. The docking of the ligands and the scoring of the poses clearly emerged as the most difficult components. A further goal in the field is certainly to derive the structure of receptors in their signaling state, possibly in complex with agonists. These advances, coupled with the introduction of more sophisticated modeling algorithms and the increase in computer power, raise the expectation for a substantial boost of the robustness and accuracy of computer-aided drug discovery techniques in the coming years.


In the last decade, data mining has emerged as one of the most dynamic and lively areas in information technology. Although many algorithms and techniques for data mining have been proposed, they either focus on domain-independent techniques or on very specific domain problems. A general requirement in bridging the gap between academia and business is to cater to general domain-related issues surrounding real-life applications, such as constraints, organizational factors, domain expert knowledge, domain adaptation, and operational knowledge. Unfortunately, these either have not been addressed, or have not been sufficiently addressed, in current data mining research and development. Domain-Driven Data Mining (D3M) aims to develop general principles, methodologies, and techniques for modelling and merging comprehensive domain-related factors and synthesized ubiquitous intelligence surrounding problem domains with the data mining process, and for discovering knowledge to support business decision-making. This paper aims to report original, cutting-edge, state-of-the-art progress in D3M. It covers theoretical and applied contributions aiming to: 1) propose next-generation data mining frameworks and processes for actionable knowledge discovery; 2) investigate effective (automated, human- and machine-centered, and/or human-machine-cooperated) principles and approaches for acquiring, representing, modelling, and engaging ubiquitous intelligence in real-world data mining; and 3) develop workable and operational systems balancing technical significance and application concerns, and converting and delivering actionable knowledge into operational application rules to seamlessly engage application processes and systems.


A Newton–Raphson solution scheme with a stress point algorithm is presented for the implementation of an elastic–viscoplastic soil model in a finite element program. Viscoplastic strain rates are calculated using the stress and volumetric states of the soil. Sub-increments of time are defined for each iterative calculation of elastic–viscoplastic stress changes so that their sum adds up to the time increment for the load step. This carefully defined ‘iterative time’ ensures that the correct amount of viscoplastic straining is accumulated over the applied load step. The algorithms and assumptions required to implement the solution scheme are provided. Verification of the solution scheme is achieved by using it to analyze typical boundary value problems.
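The sub-incremented ‘iterative time’ idea can be sketched in one dimension; the state variables, rate function, and names below are illustrative assumptions, not the paper's actual stress point algorithm:

```python
def viscoplastic_step(sigma, dt_load, n_sub, vp_rate, E):
    """Accumulate elastic-viscoplastic stress change over one load step by
    splitting the step time into sub-increments (hypothetical 1D sketch).
    vp_rate(sigma) returns the viscoplastic strain rate at the current stress."""
    dt = dt_load / n_sub              # sub-increment of the 'iterative time'
    evp = 0.0                         # viscoplastic strain accumulated this step
    for _ in range(n_sub):
        deps = vp_rate(sigma) * dt    # viscoplastic strain in this sub-increment
        sigma -= E * deps             # stress relaxes through the elastic stiffness
        evp += deps
    return sigma, evp
```

Because the sub-increments sum exactly to the load-step time increment, the accumulated viscoplastic strain is consistent with the applied step, which is the property the scheme relies on.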


The Microarray Innovations in Leukemia study assessed the clinical utility of gene expression profiling as a single test to subtype leukemias into conventional categories of myeloid and lymphoid malignancies. METHODS: The investigation was performed in 11 laboratories across three continents and included 3,334 patients. An exploratory retrospective stage I study was designed for biomarker discovery and generated whole-genome expression profiles from 2,143 patients with leukemias and myelodysplastic syndromes. The diagnostic accuracy of gene expression profiling was further validated in a prospective second study stage on an independent cohort of 1,191 patients. RESULTS: On the basis of 2,096 samples, the stage I study achieved 92.2% classification accuracy for all 18 distinct classes investigated (median specificity of 99.7%). In a second cohort of 1,152 prospectively collected patients, a classification scheme reached 95.6% median sensitivity and 99.8% median specificity for 14 standard subtypes of acute leukemia (eight acute lymphoblastic leukemia and six acute myeloid leukemia classes, n = 693). In 29 (57%) of 51 discrepant cases, the microarray results outperformed routine diagnostic methods. CONCLUSION: Gene expression profiling is a robust technology for the diagnosis of hematologic malignancies with high accuracy. It may complement current diagnostic algorithms and could offer a reliable platform for patients who lack access to today's state-of-the-art diagnostic work-up. Our comprehensive gene expression data set will be submitted to the public domain to foster research focusing on the molecular understanding of leukemias.


An intralaminar damage model, based on a continuum damage mechanics approach, is presented to model the damage mechanisms occurring in carbon fibre composite structures, incorporating fibre tensile and compressive breakage, matrix tensile and compressive fracture, and shear failure. The damage model, together with interface elements for capturing interlaminar failure, is implemented in a finite element package and used in a detailed finite element model to simulate the response of a stiffened composite panel to low-velocity impact. Contact algorithms and friction between delaminated plies were included to better simulate the impact event. Analyses were executed on a high performance computing (HPC) cluster to reduce the actual time required for this detailed numerical analysis. Numerical results relating to the various observed interlaminar damage mechanisms, delamination initiation and propagation, as well as the model’s ability to capture post-impact permanent indentation in the panel are discussed. Very good agreement was achieved with experimentally obtained data of energy absorbed and impactor force versus time. The extent of damage predicted around the impact site also corresponded well with the damage detected by non-destructive evaluation of the tested panel.
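As a minimal illustration of the continuum damage mechanics idea (a 1D sketch, not the intralaminar model of the paper; the linear softening law and all parameter names are assumptions):

```python
def damaged_stress(eps, E, eps0, eps_f, d_state):
    """One-dimensional continuum-damage stress update (illustrative sketch).
    E: undamaged stiffness, eps0: damage-onset strain, eps_f: final-failure
    strain, d_state: damage carried over from the load history."""
    if eps > eps0:
        # linear softening: damage grows from 0 at eps0 to 1 at eps_f
        d = min(1.0, (eps_f / eps) * (eps - eps0) / (eps_f - eps0))
    else:
        d = 0.0
    d = max(d, d_state)            # damage is irreversible
    return (1.0 - d) * E * eps, d  # degraded stress and updated damage
```

The damage variable degrades the stiffness and is never allowed to decrease, which is what lets the model capture permanent post-impact effects.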


The R-matrix method, when applied to the study of intermediate-energy electron scattering by the hydrogen atom, gives rise to a large number of two-electron integrals between numerical basis functions. Each integral is evaluated independently of the others, thereby rendering this a prime candidate for a parallel implementation. In this paper, we present a parallel implementation of this routine which uses a Graphics Processing Unit (GPU) as a co-processor, giving a speedup of approximately 20 times when compared with a sequential version. We briefly consider properties of this calculation which make a GPU implementation appropriate, with a view to identifying other calculations which might similarly benefit.
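The property that makes the calculation GPU-friendly is that the integrals are mutually independent, so the whole batch reduces to a parallel map. A toy CPU sketch (the integrand and all names are hypothetical stand-ins; the real R-matrix integrals are far more involved):

```python
import math
from concurrent.futures import ThreadPoolExecutor

def integral(pair, n=2000, r_max=10.0):
    """Toy stand-in for one two-electron integral between numerical basis
    functions: trapezoidal quadrature of f_i(r) * f_j(r), with hypothetical
    radial functions f_k(r) = r^k * exp(-r)."""
    i, j = pair
    h = r_max / n
    f = lambda r: (r ** i * math.exp(-r)) * (r ** j * math.exp(-r))
    s = 0.5 * (f(0.0) + f(r_max)) + sum(f(k * h) for k in range(1, n))
    return s * h

def all_integrals(pairs, workers=4):
    # each integral is independent of the others, so a plain parallel map
    # suffices -- the same structure maps naturally onto one GPU thread
    # (or thread block) per integral
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(integral, pairs))
```

The absence of inter-task communication is exactly what makes such a routine a prime candidate for a co-processor offload.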


An evolution in theoretical models and methodological paradigms for investigating cognitive biases in the addictions is discussed. Anomalies in traditional cognitive perspectives, and problems with the self-report methods which underpin them, are highlighted. An emergent body of cognitive research, contextualized within the principles and paradigms of cognitive neuropsychology rather than social learning theory, is presented which, it is argued, addresses these anomalies and problems. Evidence is presented that biases in the processing of addiction-related stimuli, and in the network of propositions which motivate addictive behaviours, occur at automatic, implicit and pre-conscious levels of awareness. It is suggested that methods which assess such implicit cognitive biases (e.g. Stroop, memory, priming and reaction-time paradigms) yield findings which have better predictive utility for ongoing behaviour than those biases determined by self-report methods of introspection. The potential utility of these findings for understanding "loss of control" phenomena, and the desynchrony between reported beliefs and intentions and ongoing addictive behaviours, is discussed. Applications to the practice of cognitive therapy are considered.


Particulate systems are of interest in many disciplines. They are often investigated using the discrete element method because of its capability to investigate particulate systems at the individual particle scale. To model the interaction between two particles and between a particle and a boundary, conventional discrete element models use springs and dampers in both the normal and tangential directions. The significance of particle rotation has been highlighted in both numerical studies and physical experiments. Several researchers have attempted to incorporate a rotational torque to account for the rolling resistance or rolling friction by developing different models. This paper presents a review of the commonly used models for rolling resistance and proposes a more general model. These models are classified into four categories according to their key characteristics. The robustness of these models in reproducing rolling resistance effects arising from different physical situations was assessed by using several benchmarking test cases. The proposed model can be seen to be more general and suitable for modelling problems involving both dynamic and pseudo-static regimes. An example simulation of the formation of a 2D sandpile is also shown. For simplicity, all formulations and examples are presented in 2D form, though the general conclusions are also applicable to 3D systems.
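The simplest of the commonly used model categories can be sketched as a constant-magnitude torque opposing the relative rolling direction (an illustrative fragment with assumed parameter names; the model proposed in the paper is more general):

```python
import math

def rolling_resistance_torque(omega_rel, F_n, R, mu_r):
    """Constant-torque ('directional') rolling-resistance model: a torque of
    magnitude mu_r * R * F_n opposing the relative rolling velocity.
    omega_rel: relative rolling (angular) velocity, F_n: normal contact
    force, R: effective rolling radius, mu_r: rolling friction coefficient."""
    if omega_rel == 0.0:
        return 0.0                 # no relative rolling, no resisting torque
    return -math.copysign(mu_r * R * F_n, omega_rel)
```

Models of this kind behave well in dynamic regimes but can oscillate around zero rolling velocity in pseudo-static ones, which is one motivation for the spring-damper style categories the paper reviews.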


Background: Potentially inappropriate prescribing (PIP) in older people is common in primary care and can result in increased morbidity, adverse drug events, hospitalizations and mortality. The prevalence of PIP in Ireland is estimated at 36% with an associated expenditure of over €45 million in 2007. The aim of this paper is to describe the application of the Medical Research Council (MRC) framework to the development of an intervention to decrease PIP in Irish primary care.

Methods: The MRC framework for the design and evaluation of complex interventions guided the development of the study intervention. In the development stage, literature was reviewed and combined with information obtained from experts in the field using a consensus based methodology and patient cases to define the main components of the intervention. In the pilot stage, five GPs tested the proposed intervention. Qualitative interviews were conducted with the GPs to inform the development and implementation of the intervention for the main randomised controlled trial.

Results: The literature review identified PIP criteria for inclusion in the study and two initial intervention components - academic detailing and medicines review supported by therapeutic treatment algorithms. Through patient case studies and a focus group of eight GPs, these components were refined and a third component of the intervention was identified - patient information leaflets. The intervention was tested in a pilot study. In total, eight medicines reviews were conducted across five GP practices. These reviews addressed ten instances of PIP, nine of which were resolved in the form of either a dose reduction or a discontinuation of a targeted medication. Qualitative interviews highlighted that GPs were receptive to the intervention, but patient preference and the time needed both to prepare for and to conduct the medicines review emerged as potential barriers. Findings from the pilot study allowed further refinement to produce the finalised intervention of academic detailing with a pharmacist, medicines review with web-based therapeutic treatment algorithms and tailored patient information leaflets.

Conclusions: The MRC framework was used in the development of the OPTI-SCRIPT intervention to decrease the level of PIP in primary care in Ireland. Its application ensured that the intervention was developed using the best available evidence, was acceptable to GPs and feasible to deliver in the clinical setting. The effectiveness of this intervention is currently being tested in a pragmatic cluster randomised controlled trial.

Trial registration: Current controlled trials ISRCTN41694007. © 2013 Clyne et al.; licensee BioMed Central Ltd.


We present a novel device-free stationary person detection and ranging method that is applicable to ultra-wide bandwidth (UWB) networks. The method utilizes a fixed UWB infrastructure and does not require a training database of template waveforms. Instead, the method capitalizes on the fact that a human presence induces small low-frequency variations that stand out against the background signal, which is mainly affected by wideband noise. We analyze the detection probability, and validate our findings with numerical simulations and experiments with off-the-shelf UWB transceivers in an indoor environment. © 2007-2012 IEEE.
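The detection idea (human presence shows up as excess low-frequency power in the slow-time signal, while an empty channel's wideband noise spreads over all frequencies) can be sketched as a band-power ratio statistic; the band edges and all names below are assumptions, not the paper's detector:

```python
import numpy as np

def presence_statistic(samples, fs, f_lo=0.1, f_hi=1.0):
    """Ratio of low-frequency power to total non-DC power in a slow-time
    signal sampled at rate fs (Hz). Values near 1 suggest a person-induced
    low-frequency variation; wideband noise alone stays near the band's
    small share of the spectrum."""
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()                       # remove the static background (DC)
    spec = np.abs(np.fft.rfft(x)) ** 2     # slow-time power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    low = spec[(freqs >= f_lo) & (freqs <= f_hi)].sum()
    total = spec[freqs > 0.0].sum()
    return float(low / total)
```

Thresholding this statistic gives a simple detector that needs no template-waveform database, only an estimate of the noise-floor band share.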


Physical modelling of musical instruments involves studying nonlinear interactions between parts of the instrument. These can pose several difficulties concerning the accuracy and stability of numerical algorithms. In particular, when the underlying forces are non-analytic functions of the phase-space variables, a stability proof can only be obtained in limited cases. An approach has been recently presented by the authors, leading to unconditionally stable simulations for lumped collision models. In that study, discretisation of Hamilton’s equations instead of the usual Newton’s equation of motion yields a numerical scheme that can be proven to be energy conserving. In this paper, the above approach is extended to collisions of distributed objects. Namely, the interaction of an ideal string with a flat barrier is considered. The problem is formulated within the Hamiltonian framework and subsequently discretised. The resulting nonlinear matrix equation can be shown to possess a unique solution, which enables the update of the algorithm. Energy conservation and thus numerical stability follows in a way similar to the lumped collision model. The existence of an analytic description of this interaction allows the validation of the model’s accuracy. The proposed methodology can be used in sound synthesis applications involving musical instruments where collisions occur either in a confined (e.g. hammer-string interaction, mallet impact) or in a distributed region (e.g. string-bridge or reed-mouthpiece interaction).
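A lumped analogue of the approach can be sketched with a discrete-gradient discretisation of Hamilton's equations for a point mass against a penalty barrier. This is an illustrative stand-in, not the authors' distributed string-barrier formulation: the quadratic penalty potential and all names are assumptions, but the sketch shares the key properties that the update requires solving a nonlinear equation with a unique root and that the discrete energy is conserved.

```python
def simulate_collision(q, p, m, K, dt, n_steps, tol=1e-12):
    """Mass m at position q, momentum p, hitting a barrier at q = 0 modelled
    by the penalty potential V(q) = K/2 * max(0, -q)^2. Discrete-gradient
    scheme: (Q - q)/dt = (P + p)/(2m), (P - p)/dt = -(V(Q) - V(q))/(Q - q),
    which conserves H = p^2/(2m) + V(q) exactly (up to solver tolerance)."""
    V = lambda x: 0.5 * K * max(0.0, -x) ** 2
    traj = [(q, p)]
    for _ in range(n_steps):
        Q = q + dt * p / m                 # initial guess for q_{n+1}
        for _ in range(100):               # fixed-point solve of the implicit update
            dV = (V(Q) - V(q)) / (Q - q) if Q != q else 0.0
            Q_new = q + dt * p / m - dt * dt / (2.0 * m) * dV
            if abs(Q_new - Q) < tol:
                Q = Q_new
                break
            Q = Q_new
        dV = (V(Q) - V(q)) / (Q - q) if Q != q else 0.0
        p = p - dt * dV                    # momentum update with the discrete gradient
        q = Q
        traj.append((q, p))
    return traj
```

Because the discrete gradient replaces the force with the exact difference quotient of the potential, the energy balance telescopes step by step, giving unconditional stability regardless of how stiff the penalty is.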


Aims: We investigated the physical properties and dynamical evolution of near-Earth asteroid (NEA) (190491) 2000 FJ10 in order to assess the suitability of this accessible NEA as a space mission target. Methods: Photometry and colour determination were carried out with the 1.54 m Kuiper Telescope (Mt Bigelow, USA) and the 10 m Southern African Large Telescope (SALT; Sutherland, South Africa) during the object's recent favourable apparition in 2011-12. During the earlier 2008 apparition, a spectrum of the object in the 6000-9000 Angstrom region was obtained with the 4.2 m William Herschel Telescope (WHT; Canary Islands, Spain). Interpretation of the observational results was aided by numerical simulations of 1000 dynamical clones of 2000 FJ10 up to 10^6 yr in the past and in the future. Results: The asteroid's spectrum and colours determined by our observations suggest a taxonomic classification within the S-complex although other classifications (V, D, E, M, P) cannot be ruled out. On this evidence, it is unlikely to be a primitive, relatively unaltered remnant from the early history of the solar system and thus a low priority target for robotic sample return. Our photometry placed a lower bound of 2 h on the asteroid's rotation period. Its absolute magnitude was estimated to be 21.54 ± 0.1 which, for a typical S-complex albedo, translates into a diameter of 130 ± 20 m. Our dynamical simulations show that it has likely been an Amor for the past 10^5 yr. Although currently not Earth-crossing, it will likely become so during the period 50-100 kyr in the future. It may have arrived from the inner or central main belt >1 Myr ago as a former member of a low-inclination S-class asteroid family. Its relatively slow rotation and large size make it a suitable destination for a human mission. We show that ballistic Earth-190491-Earth transfer trajectories with ΔV < 2 km s^-1 at the asteroid exist between 2052 and 2061.
Based on observations made with the Southern African Large Telescope (SALT).
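The size estimate follows from the standard relation between absolute magnitude, geometric albedo, and diameter; the albedo value of 0.25 used below is an assumed typical S-complex value, not a measured quantity:

```python
import math

def diameter_km(H, albedo):
    """Standard asteroid absolute-magnitude-to-diameter relation:
    D [km] = 1329 / sqrt(p_V) * 10^(-H / 5),
    where H is the absolute magnitude and p_V the geometric albedo."""
    return 1329.0 / math.sqrt(albedo) * 10.0 ** (-H / 5.0)
```

For H = 21.54 and an assumed p_V = 0.25 this gives about 0.13 km, consistent with the 130 ± 20 m quoted above.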


A framework supporting fast prototyping as well as tuning of distributed applications is presented. The approach is based on the adoption of a formal model that is used to describe the orchestration of distributed applications. The formal model (Orc, by Misra and Cook) can be used to support semi-formal reasoning about the applications at hand. The paper describes how the framework can be used to derive and evaluate alternative orchestrations of a well-known parallel/distributed computation pattern, and shows how the same formal model can be used to support generation of prototypes of distributed application skeletons directly from the application description.
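A task farm is the kind of well-known parallel/distributed pattern whose alternative orchestrations such a formal model can express; a minimal Python stand-in (illustrative only, not Orc and not the framework's generated prototype) looks like:

```python
from concurrent.futures import ThreadPoolExecutor

def farm(worker, tasks, n_workers=4):
    """Minimal task-farm skeleton: an emitter hands independent tasks to a
    pool of workers and a collector gathers the results in order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(worker, tasks))
```

Alternative orchestrations of the same pattern (e.g. different emitter/collector placements or worker topologies) change only the coordination structure, which is precisely what an Orc expression captures.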


One of the principal tasks facing post-crash academic political economy is to analyse patterns of ideational change and the conditions that produce such change. What has been missing from the existing literature on ideational change at times of crisis, however, is a sense of how processes of persuasive struggle unfold, and of how the success of those ‘norm entrepreneurs’ arguing for ideational change is shaped by two contextual variables: the most immediate material symptoms and problems that a crisis displays (the variety of crisis), and the institutional character of the policy subsystem within which agents have to operate to effect change. Introducing these two variables into our accounts of persuasive struggle and ideational change enables us to deepen our understanding of the dynamics of ideational change at times of crisis. The article identifies that quite rapid and radical intellectual change has been evident in the field of financial regulation, in the form of an embrace of a macroprudential frame. In contrast, in the field of macroeconomic policy (both monetary and fiscal), many pre-crash beliefs remain prominent, there is evidence of ideational stickiness and inertia, and, despite some policy experimentation, overarching policy frameworks and their rationales have not been overhauled. The article applies Peter Hall’s framework of three orders of policy change to help illuminate and explain the variation in patterns of change in the fields of financial regulation and macroeconomic policy since the financial crash of 2008. The different patterns of ideational change in macroeconomic policy and financial regulation in the post-crash period can be explained by the timing and variety of crisis; the sequencing of policy change; and institutional political differences between micro policy subsystems and macro policy systems.