938 results for PDE-based parallel preconditioner
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation, and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to exploit a networked four-bar rhombus configuration (in which some bars are extended to form an X shape) to attain high acceleration. First, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. A dynamic equation of the actuator is obtained using the Lagrangian dynamic formulation. A SIMULINK control model of the actuator is developed from the dynamic equation. In addition, the Bond Graph methodology is applied for dynamic simulation. The Bond Graph model comprises component-level models of the actuator along with its control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20g) can be achieved with a modest torque input (3 N·m or less). A practical prototype of the actuator was designed in SOLIDWORKS and then fabricated to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10g at the middle point of the travel length as the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate in a standalone condition, and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved around 17g of acceleration over a stroke from 0.2 m to 0.78 m.
Results indicate that the dynamic model predictions are in good agreement with the experimental measurements. Finally, a Design of Experiments (DOE) based statistical approach is also introduced to identify the parametric combination that yields the greatest performance; the data are collected using the Bond Graph model. This approach helps in designing the actuator without much complexity.
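The DOE idea above can be sketched as a small full-factorial experiment. The factor names, levels, and the surrogate response function below are purely illustrative assumptions, not values from the thesis; in the actual work the responses would come from Bond Graph simulation runs.

```python
from itertools import product

# Hypothetical two-level full-factorial DOE over illustrative actuator
# parameters. Factor names, levels, and the surrogate response are
# invented for illustration only.
factors = {
    "bar_length_m":    [0.20, 0.30],
    "motor_torque_Nm": [1.5, 3.0],
    "payload_kg":      [0.5, 1.0],
}

def surrogate_peak_accel(bar_length_m, motor_torque_Nm, payload_kg):
    """Toy response: peak acceleration rises with torque and falls with
    lever length and moved mass (purely illustrative)."""
    return motor_torque_Nm / (bar_length_m * payload_kg)

names = list(factors)
runs = []
for levels in product(*factors.values()):
    setting = dict(zip(names, levels))
    runs.append((setting, surrogate_peak_accel(**setting)))

# Pick the parametric combination with the greatest (toy) performance.
best_setting, best_accel = max(runs, key=lambda r: r[1])
print(len(runs), best_setting)
```

With three two-level factors the design has 2³ = 8 runs; ranking the runs by response identifies the best combination, which is the essence of the DOE-based screening described above.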
Abstract:
ACKNOWLEDGEMENTS This research is based upon work supported in part by the U.S. ARL and U.K. Ministry of Defense under Agreement Number W911NF-06-3-0001, and by the NSF under award CNS-1213140. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views or represent the official policies of the NSF, the U.S. ARL, the U.S. Government, the U.K. Ministry of Defense or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.
Abstract:
This book constitutes the refereed proceedings of the 14th International Conference on Parallel Problem Solving from Nature, PPSN 2016, held in Edinburgh, UK, in September 2016. The total of 93 revised full papers were carefully reviewed and selected from 224 submissions. The meeting began with four workshops, which offered an ideal opportunity to explore specific topics: intelligent transportation, landscape-aware heuristic search, natural computing in scheduling and timetabling, and advances in multi-modal optimization. PPSN XIV also included sixteen free tutorials to give us all the opportunity to learn about new aspects: gray-box optimization in theory; theory of evolutionary computation; graph-based and Cartesian genetic programming; theory of parallel evolutionary algorithms; promoting diversity in evolutionary optimization: why and how; evolutionary multi-objective optimization; intelligent systems for smart cities; advances on multi-modal optimization; evolutionary computation in cryptography; evolutionary robotics - a practical guide to experiment with real hardware; evolutionary algorithms and hyper-heuristics; a bridge between optimization over manifolds and evolutionary computation; implementing evolutionary algorithms in the cloud; the attainment function approach to performance evaluation in EMO; runtime analysis of evolutionary algorithms: basic introduction; meta-model assisted (evolutionary) optimization. The papers are organized in topical sections on adaption, self-adaption and parameter tuning; differential evolution and swarm intelligence; dynamic, uncertain and constrained environments; genetic programming; multi-objective, many-objective and multi-level optimization; parallel algorithms and hardware issues; real-world applications and modeling; theory; diversity and landscape analysis.
Abstract:
This paper deals with monolithic decoupled XYZ compliant parallel mechanisms (CPMs) for multi-function applications, which can be fabricated monolithically without assembly and have the capability of kinetostatic decoupling. At first, the conceptual design of monolithic decoupled XYZ CPMs is presented using identical spatial compliant multi-beam modules based on a decoupled 3-PPPR parallel kinematic mechanism. Three types of applications (motion/positioning stages, force/acceleration sensors, and energy-harvesting devices) are described in principle. Kinetostatic and dynamic modelling is then conducted to capture the displacement of any stage under loads acting at any stage and the natural frequency, with comparisons against FEA results. Finally, the performance characteristics for motion stage applications are investigated in detail to show how changes in the geometrical parameters affect them, which provides initial optimal estimations. Results show that smaller beam thickness and larger cubic-stage dimensions improve the performance characteristics, excluding the natural frequency, under allowable conditions. To improve the natural frequency, a stiffness-enhanced monolithic decoupled configuration can be adopted, achieved by employing more beams in the spatial modules or by reducing the mass of each cubic stage. In addition, an isotropic variation with a different motion range along each axis and the same payload in each leg is proposed. A redundant design for monolithic fabrication is introduced, which overcomes the drawback of monolithic fabrication that a failed compliant beam is difficult to replace, and extends the CPM’s life.
Abstract:
Cumulon is a system aimed at simplifying the development and deployment of statistical analysis of big data in public clouds. Cumulon allows users to program in their familiar language of matrices and linear algebra, without worrying about how to map data and computation to specific hardware and cloud software platforms. Given user-specified requirements in terms of time, monetary cost, and risk tolerance, Cumulon automatically makes intelligent decisions on implementation alternatives, execution parameters, as well as hardware provisioning and configuration settings -- such as what type of machines and how many of them to acquire. Cumulon also supports clouds with auction-based markets: it effectively utilizes computing resources whose availability varies according to market conditions, and suggests best bidding strategies for them. Cumulon explores two alternative approaches toward supporting such markets, with different trade-offs between system and optimization complexity. An experimental study is conducted to show the efficiency of Cumulon's execution engine, as well as the optimizer's effectiveness in finding the optimal plan in the vast plan space.
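The core of the provisioning decision described above can be sketched as constrained plan selection: among candidate hardware configurations with estimated completion time and monetary cost, choose the cheapest one meeting the user's deadline. The machine types, counts, and numbers below are invented stand-ins, not Cumulon's actual cost model.

```python
# Minimal sketch of cost-based plan selection in the spirit of the
# optimizer described above. All candidate plans and their estimates
# are invented for illustration.
candidates = [
    {"machine": "small",  "count": 16, "est_time_h": 5.0, "cost_usd": 8.0},
    {"machine": "medium", "count": 8,  "est_time_h": 3.0, "cost_usd": 12.0},
    {"machine": "large",  "count": 4,  "est_time_h": 1.5, "cost_usd": 20.0},
]

def cheapest_within_deadline(plans, deadline_h):
    """Return the lowest-cost plan whose estimated time meets the
    deadline, or None if no plan is feasible."""
    feasible = [p for p in plans if p["est_time_h"] <= deadline_h]
    if not feasible:
        return None
    return min(feasible, key=lambda p: p["cost_usd"])

plan = cheapest_within_deadline(candidates, deadline_h=4.0)
print(plan["machine"])  # medium
```

A real optimizer would also fold in risk tolerance and market-price uncertainty for auction-based instances, but the feasibility-then-cheapest structure is the same.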
Abstract:
This paper introduces a screw theory based method termed the constraint and position identification (CPI) approach to synthesize decoupled spatial translational compliant parallel manipulators (XYZ CPMs) with consideration of actuation isolation. The proposed approach is based on a systematic arrangement of rigid stages and compliant modules in a three-legged XYZ CPM system using the constraint spaces and the position spaces of the compliant modules. The constraint spaces and the position spaces are first derived based on screw theory instead of rigid-body mechanism design experience. Additionally, the constraint spaces are classified into different constraint combinations, with typical position spaces depicted via geometric entities. Furthermore, the systematic synthesis process based on the constraint combinations and the geometric entities is demonstrated via several examples. Finally, several novel decoupled XYZ CPMs with monolithic configurations are created and verified by finite element analysis. The present CPI approach enables experts and beginners alike to synthesize a variety of decoupled XYZ CPMs with consideration of actuation isolation by selecting an appropriate constraint and an optimal position for each of the compliant modules according to a specific application.
Abstract:
Numerous works have been conducted on modelling basic compliant elements such as wire beams, and closed-form analytical models of most basic compliant elements are well developed. However, modelling complex compliant mechanisms remains challenging. This paper proposes a constraint-force-based (CFB) modelling approach for compliant mechanisms, with a particular emphasis on complex ones. The proposed CFB modelling approach can be regarded as an improved free-body-diagram (FBD) based modelling approach, and can be extended into a development of the screw-theory-based design approach. A compliant mechanism can be decomposed into rigid stages and compliant modules. A compliant module offers elastic forces due to its deformation; such elastic forces are regarded as variable constraint forces in the CFB modelling approach. Additionally, the CFB modelling approach treats external forces applied on a compliant mechanism as constant constraint forces. If a compliant mechanism is at static equilibrium, all of its rigid stages are also at static equilibrium under the influence of the variable and constant constraint forces. Therefore, the constraint-force equilibrium equations for all the rigid stages can be written, and the analytical model of the compliant mechanism can be derived from these equations. The CFB modelling approach can model a compliant mechanism both linearly and nonlinearly, can obtain the displacement of any point of the rigid stages, and allows external forces to be exerted at any position on the rigid stages. Compared with the FBD based modelling approach, the CFB modelling approach does not need to identify the possible deformed configuration of a complex compliant mechanism in order to obtain the geometric compatibility conditions and the force equilibrium equations. Additionally, the mathematical expressions in the CFB approach have an easily understood physical meaning.
Using the CFB modelling approach, the variable constraint forces of three compliant modules (a wire beam, a four-beam compliant module, and an eight-beam compliant module) have been derived in this paper. Based on these variable constraint forces, the linear and nonlinear models of a decoupled XYZ compliant parallel mechanism are derived and verified by FEA simulations and experimental tests.
Abstract:
The purpose of the current dissertation is to identify the features of effective interventions by exploring the experiences of youth with ASD who participate in such interventions, through two intervention studies (Studies 1 and 2) and one interview study (Study 3). Studies 1 and 2 were designed to support the development of social competence of youth with ASD through Structured Play with LEGO™ (Study 1, 12 youths with ASD, ages 7–12) and Minecraft™ (Study 2, 4 youths with ASD, ages 11–13). Over the course of the sessions, the play of the youth developed from parallel play (children playing alone, without interacting) to co-operative play (playing together with shared objectives). The results of Study 2 showed that rates of initiations and levels of engagement increased from the first session to the final session. In Study 3, 12 youths with ASD (ages 10–14) and at least one of their parents were interviewed to explore what children and their parents want from programs designed to improve social competence, which activities and practices were perceived to promote social competence by the participants, and which factors affected their decisions regarding these programs. The adolescents and parents looked for programs that supported social development and emotional wellbeing, but did not always have access to the programs they would have preferred, with factors such as cost and location reducing their options. Three overarching themes emerged through analysis of the three studies: (a) interests of the youth; (b) structure, both through interactions and instruction; and (c) naturalistic settings. Adolescents generally engage more willingly in interventions that incorporate their interests, such as play with Minecraft™ in Study 2. Additionally, Structured Play and structured instruction were crucial components of providing safe and supportive contexts for the development of social competence.
Finally, skills learned in naturalistic settings tend to be applied more successfully in everyday situations. The themes are analysed through the lens of Vygotsky’s (1978) perspectives on learning, play, and development. Implications of the results for practitioners and researchers are discussed.
Abstract:
A scenario-based two-stage stochastic programming model for gas production network planning under uncertainty is usually a large-scale nonconvex mixed-integer nonlinear programme (MINLP), which can be efficiently solved to global optimality with nonconvex generalized Benders decomposition (NGBD). This paper is concerned with the parallelization of NGBD to exploit multiple available computing resources. Three parallelization strategies are proposed, namely, naive scenario parallelization, adaptive scenario parallelization, and adaptive scenario and bounding parallelization. A case study of two industrial natural gas production network planning problems shows that, while NGBD without parallelization is already faster than a state-of-the-art global optimization solver by an order of magnitude, parallelization can improve the efficiency by several times on computers with multicore processors. The adaptive scenario and bounding parallelization achieves the best overall performance among the three proposed strategies.
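The simplest of the three strategies, naive scenario parallelization, rests on the fact that the scenario subproblems in a decomposition like NGBD are independent and can be farmed out concurrently. The sketch below illustrates only that structure: the "subproblem" is a toy stand-in (a real implementation would invoke an NLP/MINLP solver per scenario, typically in separate processes rather than threads).

```python
from concurrent.futures import ThreadPoolExecutor

# Naive scenario parallelization sketch: each scenario's subproblem is
# independent, so the scenario loop is embarrassingly parallel. The
# subproblem below is a toy stand-in for a per-scenario solver call.
def solve_scenario_subproblem(scenario_demand):
    # Toy bound contribution for one scenario (illustrative only).
    return scenario_demand ** 2

scenario_demands = [3.0, 1.5, 4.0, 2.5]

with ThreadPoolExecutor(max_workers=4) as pool:
    contributions = list(pool.map(solve_scenario_subproblem, scenario_demands))

# Aggregate the per-scenario contributions, as the decomposition's
# bounding step would.
bound = sum(contributions)
print(bound)
```

The adaptive variants in the paper go further by also overlapping the bounding steps with subproblem solves rather than waiting for every scenario in each round.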
Abstract:
PURPOSE: Radiation therapy is used to treat cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple sessions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. In between new plans being created, the patient's anatomy can change due to multiple factors, including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is gradually being adopted for medical applications, with promising results. The objective of this work was to design and build a system that analyzes a database of previously created treatment plans, stored with their associated anatomical information in studies, to find the one whose anatomy is most similar to a new patient's. The analyses are performed in parallel on the cloud to decrease the computation time of finding this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and for messaging between the instances and a master local computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud and to perform other related tasks. RESULTS: The cloud-based system outperformed previous methods of conducting the similarity analyses in terms of time, analyzing 100 studies in approximately 13 minutes while producing the same similarity values as those methods. It also scaled to larger numbers of studies in the database with a small increase in computation time of just over 2 minutes.
CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
Abstract:
With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network of nodes and edges for quantitative analysis of language structure. The development of dependency grammar provides theoretical support for the construction of a treebank corpus, making a statistical complex-network analysis possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on the treebank of speeches from the EEE-4 oral test. By analyzing the overall characteristics of the networks (the number of edges, the number of nodes, the average degree, the average path length, the network centrality, and the degree distribution), it aims to find potential differences and similarities in the networks across various grades of speaking performance. Through clustering analysis, this research intends to demonstrate the discriminating power of the network parameters and to provide a potential reference for scoring speaking performance.
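Two of the whole-network measures named above, average degree and average path length, can be computed directly from a toy dependency network. The sentence and its dependency edges below are an invented example (treated as undirected for simplicity), not data from the EEE-4 treebank.

```python
from collections import deque

# Toy dependency network for "I saw the big dog", treated as an
# undirected graph for the two global measures below.
edges = [("saw", "I"), ("saw", "dog"), ("dog", "the"), ("dog", "big")]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

# Average degree: mean number of edges incident to each node.
avg_degree = sum(len(adj[n]) for n in nodes) / len(nodes)

def bfs_dists(src):
    """Shortest-path distances from src via breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Average path length over all ordered node pairs (graph is connected).
pair_dists = [d for n in nodes for m, d in bfs_dists(n).items() if m != n]
avg_path_length = sum(pair_dists) / len(pair_dists)
print(avg_degree, avg_path_length)
```

On real treebank networks one would compute these with a graph library and add the centrality and degree-distribution measures, but the definitions are the ones exercised here.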
Abstract:
IMPORTANCE: Prevention strategies for heart failure are needed.
OBJECTIVE: To determine the efficacy of a screening program using brain-type natriuretic peptide (BNP) and collaborative care in an at-risk population in reducing newly diagnosed heart failure and prevalence of significant left ventricular (LV) systolic and/or diastolic dysfunction.
DESIGN, SETTING, AND PARTICIPANTS: The St Vincent's Screening to Prevent Heart Failure Study, a parallel-group randomized trial involving 1374 participants with cardiovascular risk factors (mean age, 64.8 [SD, 10.2] years) recruited from 39 primary care practices in Ireland between January 2005 and December 2009 and followed up until December 2011 (mean follow-up, 4.2 [SD, 1.2] years).
INTERVENTION: Patients were randomly assigned to receive usual primary care (control condition; n=677) or screening with BNP testing (n=697). Intervention-group participants with BNP levels of 50 pg/mL or higher underwent echocardiography and collaborative care between their primary care physician and specialist cardiovascular service.
MAIN OUTCOMES AND MEASURES: The primary end point was prevalence of asymptomatic LV dysfunction with or without newly diagnosed heart failure. Secondary end points included emergency hospitalization for arrhythmia, transient ischemic attack, stroke, myocardial infarction, peripheral or pulmonary thrombosis/embolus, or heart failure.
RESULTS: A total of 263 patients (41.6%) in the intervention group had at least 1 BNP reading of 50 pg/mL or higher. The intervention group underwent more cardiovascular investigations (control, 496 per 1000 patient-years vs intervention, 850 per 1000 patient-years; incidence rate ratio, 1.71; 95% CI, 1.61-1.83; P<.001) and received more renin-angiotensin-aldosterone system-based therapy at follow-up (control, 49.6%; intervention, 56.5%; P=.01). The primary end point of LV dysfunction with or without heart failure was met in 59 (8.7%) of 677 in the control group and 37 (5.3%) of 697 in the intervention group (odds ratio [OR], 0.55; 95% CI, 0.37-0.82; P = .003). Asymptomatic LV dysfunction was found in 45 (6.6%) of 677 control-group patients and 30 (4.3%) of 697 intervention-group patients (OR, 0.57; 95% CI, 0.37-0.88; P = .01). Heart failure occurred in 14 (2.1%) of 677 control-group patients and 7 (1.0%) of 697 intervention-group patients (OR, 0.48; 95% CI, 0.20-1.20; P = .12). The incidence rates of emergency hospitalization for major cardiovascular events were 40.4 per 1000 patient-years in the control group vs 22.3 per 1000 patient-years in the intervention group (incidence rate ratio, 0.60; 95% CI, 0.45-0.81; P = .002).
CONCLUSION AND RELEVANCE: Among patients at risk of heart failure, BNP-based screening and collaborative care reduced the combined rates of LV systolic dysfunction, diastolic dysfunction, and heart failure.
TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00921960.
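As a check on the headline numbers, the crude (unadjusted) odds ratio for the primary end point can be recomputed from the raw counts reported above. Note this is arithmetic on the published counts only; the trial's reported OR of 0.55 presumably comes from its own adjusted analysis, so the crude value differs slightly.

```python
# Crude odds ratio for the primary end point, from the counts reported
# in the RESULTS section (59/677 control vs 37/697 intervention).
control_events, control_n = 59, 677
interv_events, interv_n = 37, 697

odds_control = control_events / (control_n - control_events)  # 59/618
odds_interv = interv_events / (interv_n - interv_events)      # 37/660
crude_or = odds_interv / odds_control
print(round(crude_or, 2))
```

The crude OR of about 0.59 is close to, but not identical to, the reported 0.55, as expected when the published estimate is covariate-adjusted.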
Abstract:
A novel cyclic sulfonium cation-based ionic liquid (IL) with an ether-group appendage and the bis{(trifluoromethyl)sulfonyl}imide anion was synthesised and developed for electrochemical double layer capacitor (EDLC) testing. The synthesis and chemical-physical characterisation of the ether-group containing IL is reported in parallel with a similarly sized alkyl-functionalised sulfonium IL. Results of the chemical-physical measurements demonstrate how important transport properties, i.e. viscosity and conductivity, can be promoted through the introduction of the ether functionality without impeding the thermal, chemical or electrochemical stability of the IL. Although the apparent transport properties are improved relative to the alkyl-functionalised analogue, the ether-functionalised sulfonium cation-based IL exhibits moderately high viscosity, and poorer conductivity, when compared to traditional EDLC electrolytes based on organic solvents (propylene carbonate and acetonitrile). Electrochemical testing of the ether-functionalised sulfonium IL was conducted using activated carbon composite electrodes to inspect the performance of the IL as a solvent-free electrolyte for EDLC application. Good cycling stability was achieved over the studied range and the performance was comparable to other solvent-free, IL-based EDLC systems. Nevertheless, limitations of the attainable performance are primarily the result of sluggish transport properties and a restricted operative voltage of the IL, thus highlighting key aspects of this field which require further attention.
Abstract:
Structured parallel programming, and in particular programming models using the algorithmic skeleton or parallel design pattern concepts, is increasingly considered the only viable means of supporting effective development of scalable and efficient parallel programs. Structured parallel programming models have been assessed in a number of works in the context of performance. In this paper we consider how the use of structured parallel programming models allows knowledge of the parallel patterns present to be harnessed to address both performance and energy consumption. We consider different features of structured parallel programming that may be leveraged to impact the performance/energy trade-off, and we discuss a preliminary set of experiments validating our claims.
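A minimal illustration of the kind of structural knob the paper discusses is a task-farm skeleton whose parallelism degree is an explicit parameter: fewer workers generally means lower power draw at the cost of a longer runtime, and more workers the reverse. The sketch below is an assumption-laden toy, not any specific skeleton framework's API; the worker function is a placeholder.

```python
from concurrent.futures import ThreadPoolExecutor

# Task-farm skeleton sketch: `nw` (number of workers) is the structural
# parameter a pattern-aware runtime could tune against a
# performance/energy objective. The worker is a toy stand-in.
def worker(task):
    return task * task

def farm(tasks, nw):
    """Apply `worker` to every task using `nw` parallel workers,
    preserving input order in the results."""
    with ThreadPoolExecutor(max_workers=nw) as pool:
        return list(pool.map(worker, tasks))

results = farm(range(6), nw=3)
print(results)  # [0, 1, 4, 9, 16, 25]
```

Because the farm's semantics are independent of `nw`, the runtime is free to vary the degree of parallelism (or core frequency, or mapping) purely to steer the performance/energy trade-off, which is precisely the leverage that knowledge of the pattern provides.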