950 results for compliant cryptologic protocols


Relevance: 20.00%

Abstract:

Combining the advantages of parallel mechanisms and compliant mechanisms, a compliant parallel mechanism with two rotational DOFs (degrees of freedom) is designed to meet the requirements of a lightweight and compact pan-tilt platform. First, two commonly used design methods, direct substitution and FACT (Freedom and Constraint Topology), are applied to design the configuration of the pan-tilt system, and the similarities and differences of the two design alternatives are compared. Inverse kinematic analysis of the candidate mechanism is then carried out using the pseudo-rigid-body model (PRBM), and the Jacobian of its differential kinematics is derived to support dynamic analysis of the 8R compliant mechanism. In addition, the maximum stress occurring within the mechanism's workspace is checked by finite element analysis. Finally, a method to determine the joint damping of the flexure hinges is presented, with the aim of exploring the effect of joint damping on actuator selection and real-time control. To the authors' knowledge, almost no existing literature addresses this issue.
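
The differential-kinematics step can be illustrated with a short, hypothetical sketch: a pointing vector parameterised by pan and tilt angles and a finite-difference Jacobian relating joint rates to the change in pointing direction. This is a generic rigid-body 2-DOF illustration under assumed axis conventions, not the paper's 8R pseudo-rigid-body model.

```python
# Hypothetical sketch: pan-tilt pointing kinematics and a finite-difference
# Jacobian. Generic 2-DOF rigid-body illustration, not the paper's 8R PRBM.
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def pointing(q):
    """Unit pointing vector for pan angle q[0] (about z) and tilt q[1] (about y)."""
    pan, tilt = q
    return rot_z(pan) @ rot_y(tilt) @ np.array([0.0, 0.0, 1.0])

def jacobian(q, h=1e-6):
    """Central-difference Jacobian d(pointing)/dq, shape (3, 2)."""
    J = np.zeros((3, 2))
    for i in range(2):
        dq = np.zeros(2)
        dq[i] = h
        J[:, i] = (pointing(q + dq) - pointing(q - dq)) / (2.0 * h)
    return J

print(jacobian(np.radians([20.0, 10.0])))
```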

Relevance: 20.00%

Abstract:

This paper introduces a screw-theory-based method, termed the constraint and position identification (CPI) approach, to synthesize decoupled spatial translational compliant parallel manipulators (XYZ CPMs) with consideration of actuation isolation. The proposed approach is based on a systematic arrangement of rigid stages and compliant modules in a three-legged XYZ CPM system using the constraint spaces and the position spaces of the compliant modules. The constraint spaces and the position spaces are first derived based on screw theory rather than on rigid-body mechanism design experience. Additionally, the constraint spaces are classified into different constraint combinations, with typical position spaces depicted via geometric entities. Furthermore, the systematic synthesis process based on the constraint combinations and the geometric entities is demonstrated via several examples. Finally, several novel decoupled XYZ CPMs with monolithic configurations are created and verified by finite element analysis. The present CPI approach enables both experts and beginners to synthesize a variety of decoupled XYZ CPMs with consideration of actuation isolation by selecting an appropriate constraint and an optimal position for each of the compliant modules according to a specific application.
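
The screw-theory bookkeeping behind a constraint space can be sketched compactly: stacking constraint wrenches as rows of a matrix, the reciprocal (freedom) space is the null space obtained after applying the reciprocal-product swap operator. The example wrench and function names below are assumptions for illustration, not the CPI approach's actual derivation or its position-space step.

```python
# Minimal sketch: twists reciprocal to a set of constraint wrenches.
# Twist = [omega; v], wrench = [f; tau]; reciprocal product = twist^T @ DELTA @ wrench.
import numpy as np
from scipy.linalg import null_space

DELTA = np.block([[np.zeros((3, 3)), np.eye(3)],
                  [np.eye(3), np.zeros((3, 3))]])

def freedom_space(constraint_wrenches):
    """Basis (columns) of twists reciprocal to every constraint wrench row."""
    W = np.atleast_2d(constraint_wrenches)   # shape (k, 6)
    return null_space(W @ DELTA)             # shape (6, 6 - rank)

# Example: a single pure-force constraint along z through the origin.
wrench = np.array([[0.0, 0.0, 1.0, 0.0, 0.0, 0.0]])
basis = freedom_space(wrench)
print(basis.shape)   # (6, 5): five remaining degrees of freedom
```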

Relevance: 20.00%

Abstract:

Semiconductor chip packaging has evolved from single-chip packaging to 3D heterogeneous system integration using multichip stacking in a single module. One of the key challenges in 3D integration is the high-density interconnects that need to be formed between the chips using through-silicon vias (TSVs) and inter-chip interconnects. Anisotropic Conductive Film (ACF) technology is a low-temperature, fine-pitch interconnect method that has been considered a potential replacement for solder interconnects, in line with the continuous scaling of interconnects in the IC industry. However, conventional ACF materials face challenges in accommodating the reduced pad and pitch sizes because of their micro-sized particles and the particle agglomeration issue. A new interconnect material, Nanowire Anisotropic Conductive Film (NW-ACF), composed of high-density copper nanowires of ~200 nm diameter and 10-30 µm length vertically distributed in a polymeric template, is developed in this work to tackle the constraints of conventional ACFs and to serve as an inter-chip interconnect solution for potential three-dimensional (3D) applications.

Relevance: 20.00%

Abstract:

Numerous works have been conducted on modelling basic compliant elements such as wire beams, and closed-form analytical models of most basic compliant elements have been well developed. However, modelling complex compliant mechanisms remains challenging. This paper proposes a constraint-force-based (CFB) modelling approach for compliant mechanisms, with particular emphasis on modelling complex compliant mechanisms. The proposed CFB modelling approach can be regarded as an improved free-body-diagram (FBD) based modelling approach, and also as a further development of the screw-theory-based design approach. A compliant mechanism can be decomposed into rigid stages and compliant modules. A compliant module offers elastic forces due to its deformation; such elastic forces are regarded as variable constraint forces in the CFB modelling approach, while external forces applied to a compliant mechanism are defined as constant constraint forces. If a compliant mechanism is in static equilibrium, all of its rigid stages are also in static equilibrium under the influence of the variable and constant constraint forces. Therefore, the constraint force equilibrium equations for all the rigid stages can be obtained, and the analytical model of the compliant mechanism can be derived from these equations. The CFB modelling approach can model a compliant mechanism either linearly or nonlinearly, can obtain the displacement of any point on the rigid stages, and allows external forces to be applied at any position on the rigid stages. Compared with the FBD based modelling approach, the CFB modelling approach does not need to identify the possible deformed configuration of a complex compliant mechanism in order to obtain the geometric compatibility conditions and the force equilibrium equations. Additionally, the mathematical expressions in the CFB approach have an easily understood physical meaning. Using the CFB modelling approach, the variable constraint forces of three compliant modules (a wire beam, a four-beam compliant module and an eight-beam compliant module) are derived in this paper. Based on these variable constraint forces, the linear and nonlinear models of a decoupled XYZ compliant parallel mechanism are derived and verified by FEA simulations and experimental tests.
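
In the linear case, the equilibrium idea can be sketched in a few lines: each compliant module contributes a variable constraint force proportional to the stage displacement, and summing these with the constant external force gives a solvable linear system. The planar stage and the module stiffness values below are assumptions for illustration; the paper's nonlinear derivation and its specific wire-beam, four-beam and eight-beam modules are not reproduced.

```python
# Minimal linear sketch of the constraint-force idea: each module acting on a
# rigid stage contributes a variable constraint force -K_i @ x, so equilibrium
# sum_i(-K_i @ x) + f_ext = 0 gives (sum_i K_i) x = f_ext.
import numpy as np

def stage_displacement(module_stiffnesses, f_ext):
    """Solve the linearised static equilibrium of one rigid stage."""
    K_total = sum(module_stiffnesses)
    return np.linalg.solve(K_total, f_ext)

# Planar stage with DOFs (x, y, theta); two identical hypothetical beam modules.
K_beam = np.diag([1.2e4, 1.2e4, 3.0e2])   # N/m, N/m, N*m/rad (illustrative values)
modules = [K_beam, K_beam]
f_ext = np.array([1.0, 0.0, 0.0])         # 1 N applied along x
print(stage_displacement(modules, f_ext))
```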

Relevance: 20.00%

Abstract:

The design and analysis of conceptually different cooling systems for human heart preservation are numerically investigated. A heart cooling container with the required connections was designed for a normal-size human heart. A three-dimensional, high-resolution human heart geometric model obtained from CT-angio data was used for the simulations. Nine different cooling designs are introduced in this research. The first cooling design (Case 1) used a cooling gelatin only on the outside of the heart. In the second cooling design (Case 2), the internal parts of the heart were cooled by pumping a cooling liquid through both the heart's pulmonary and systemic circulation systems. An unsteady conjugate heat transfer analysis was performed to simulate the temperature field variations within the heart during the cooling process. Case 3 simulated the currently used cooling method, in which the coolant is stagnant. Case 4 was a combination of Case 1 and Case 2. A linear thermoelasticity analysis was performed to assess the stresses applied to the heart during the cooling process. In Cases 5 through 9, the coolant solution was used for both internal and external cooling. For the external circulation in Cases 5 and 6, two inlets and two outlets were designed on the walls of the cooling container. Case 5 used laminar flows for the coolant circulations inside and outside of the heart, while the effects of turbulent flow on cooling of the heart were studied in Case 6. In Case 7, an additional inlet was designed on the cooling container wall to create a jet impinging on the hot region of the heart's wall. Unsteady periodic inlet velocities were applied in Cases 8 and 9. The average temperature of the heart in Case 5 was +5.0 °C after 1500 s of cooling. A multi-objective constrained optimization was performed for Case 5. The inlet velocities of the two internal and one external coolant circulations were the three design variables, and the three objectives were to minimize the average temperature of the heart, the wall shear stress, and the total volumetric flow rate. The only constraint was to keep the von Mises stress below the ultimate tensile stress of the heart's tissue.
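
As a rough illustration of how such a three-variable constrained multi-objective problem might be scalarised, the sketch below minimises a weighted sum of the three objectives subject to the stress constraint. The surrogate functions, weights, bounds and stress limit are placeholders standing in for the CFD and thermoelasticity outputs, not the study's models or values.

```python
# Illustrative scalarised formulation of the constrained multi-objective problem.
# All response functions below are placeholder surrogates, not the study's models.
import numpy as np
from scipy.optimize import minimize

# Design variables: [v_internal_1, v_internal_2, v_external] inlet velocities (m/s).
def mean_temperature(v):   # placeholder: faster flow -> colder heart
    return 10.0 / (1.0 + v.sum())

def wall_shear(v):         # placeholder: shear grows with velocity
    return 0.5 * np.dot(v, v)

def flow_rate(v):          # placeholder: proportional to velocities
    return v.sum()

def von_mises(v):          # placeholder stress response
    return 20.0 + 5.0 * v.max()

SIGMA_ULT = 60.0                     # hypothetical tissue ultimate tensile stress
w = np.array([1.0, 0.3, 0.1])        # weights for the scalarised objective

def objective(v):
    return w[0] * mean_temperature(v) + w[1] * wall_shear(v) + w[2] * flow_rate(v)

res = minimize(
    objective,
    x0=np.array([0.5, 0.5, 0.5]),
    bounds=[(0.01, 2.0)] * 3,
    constraints=[{"type": "ineq", "fun": lambda v: SIGMA_ULT - von_mises(v)}],
    method="SLSQP",
)
print(res.x, res.fun)
```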

Relevance: 20.00%

Abstract:

Aims/Purpose: Protocols are evidence-based, structured guides for directing care to achieve improvements, but translating that evidence into practice is a major challenge. It is not acceptable simply to introduce a protocol and expect it to be adopted and to lead to a change in practice; implementation requires effective leadership and management. This presentation describes a strategy for implementation that should promote successful adoption and lead to practice change.
Presentation description: There are many social and behavioural change models to assist and guide practice change. Choosing a model to guide implementation is important for providing a framework for action. The change process requires careful thought, from the protocol itself to the policies and politics within the ICU. In this presentation, I discuss a useful pragmatic guide called the 6SQUID (6 Steps in QUality Intervention Development). This was initially designed for public health interventions, but the model has wider applicability and has similarities with other change process models. Steps requiring consideration include examining the purpose and the need for change; the staff that will be affected and the impact on their workload; and the evidence base supporting the protocol. Subsequent steps in the process that the ICU manager should consider are the change mechanism (widespread multi-disciplinary consultation; adapting the protocol to the local ICU); and identifying how to deliver the change mechanism (educational workshops and preparing staff for the changes are imperative). Recognising the barriers to implementation and change and addressing these locally is also important. Once the protocol has been implemented, there is generally a learning curve before it becomes embedded in practice. Audit and feedback on adherence are useful strategies to monitor and sustain the changes.
Conclusion: Managing change successfully will promote a positive experience for staff. In turn, this will encourage a culture of enthusiasm for translating evidence into practice.

Relevance: 20.00%

Abstract:

In a European BIOMED-2 collaborative study, multiplex PCR assays have been successfully developed and standardized for the detection of clonally rearranged immunoglobulin (Ig) and T-cell receptor (TCR) genes and the chromosome aberrations t(11;14) and t(14;18). This has resulted in 107 different primers in only 18 multiplex PCR tubes: three VH-JH, two DH-JH, two Ig kappa (IGK), one Ig lambda (IGL), three TCR beta (TCRB), two TCR gamma (TCRG), one TCR delta (TCRD), three BCL1-Ig heavy chain (IGH), and one BCL2-IGH. The PCR products of Ig/TCR genes can be analyzed for clonality assessment by heteroduplex analysis or GeneScanning. The detection rate of clonal rearrangements using the BIOMED-2 primer sets is unprecedentedly high, mainly owing to the complementarity of the various BIOMED-2 tubes. In particular, combined application of the IGH (VH-JH and DH-JH) and IGK tubes can detect virtually all clonal B-cell proliferations, even in B-cell malignancies with high levels of somatic mutations. The contribution of IGL gene rearrangements seems limited. Combined use of the TCRB and TCRG tubes detects virtually all clonal T-cell populations, whereas the TCRD tube has added value in the case of TCRγδ+ T-cell proliferations. The BIOMED-2 multiplex tubes can now be used for diagnostic clonality studies as well as for the identification of PCR targets suitable for the detection of minimal residual disease.

Relevance: 20.00%

Abstract:

Physical exercise programmes are routinely prescribed in clinical practice to treat impairments and to improve activity and participation in daily life, because of their known physiological, health and psychological benefits (RCP, 2009). Progressive resistance exercise is a type of exercise prescribed specifically to improve skeletal muscle strength (Latham et al., 2004). The effectiveness of progressive resistance exercise varies considerably between studies and populations. This thesis focuses on how training parameters influence the delivery of progressive resistance exercise. In order to evaluate the influence of training parameters appropriately, this thesis argues the need to record training performance and the total work completed by participants as prescribed by the training protocols. In the first study, participants were taken through a series of protocols differentiated by the intensity and volume of training. Training intensity was defined as a proportion of the mean peak torque achieved during maximal voluntary contractions (MVC) and was set at 80% or 40% of the MVC mean peak torque. Training volume was defined as the total external work achieved over the training period. Measures of training performance were developed to accurately report the intensity, repetitions and work completed during the training period. A second study evaluated the training performance of these protocols over repeated sessions, and the protocols were then applied to three stroke survivors. Study 1 found that sedentary participants could achieve a differentiated training intensity: participants completing the high- and low-intensity protocols trained at 80% and 40% respectively of the MVC mean peak torque. The total work achieved in the high-intensity, low-repetition protocol was lower than that achieved in the low-intensity, high-repetition protocol. Study 2 found that, with repeated practice, participants improved their ability to perform the manoeuvres, as shown by a reduction in the variation of the mean training intensity and by achieving the total work specified by the protocol within a smaller margin of error. When these protocols were applied to three stroke survivors, they were able to achieve the specified training intensity but not the total work expected for the protocol, most likely because of an inability to achieve a consistent force throughout the contraction. These results support the need to record and report training performance characteristics during progressive resistance exercise, including the total work achieved, in order to elucidate the influence of training parameters. The lack of accurate training performance data may partly explain the inconsistencies between studies on optimal training parameters for progressive resistance exercise.
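
The two performance measures described above can be illustrated with a short sketch: per-repetition intensity as peak torque relative to the MVC mean peak torque, and training volume as the total external work (torque integrated over angular displacement). The torque-angle data and function names below are synthetic assumptions, not the thesis's recorded measurements or its exact definitions.

```python
# Hypothetical sketch of training intensity and total external work from
# per-repetition torque-angle recordings. Data below are synthetic.
import numpy as np

def rep_work(torque_nm, angle_rad):
    """External work of one repetition: torque integrated over joint angle (trapezoidal rule)."""
    return float(np.sum(0.5 * (torque_nm[1:] + torque_nm[:-1]) * np.diff(angle_rad)))

def training_summary(reps, mvc_peak_torque):
    """Mean training intensity (fraction of MVC mean peak torque) and total external work."""
    peaks = np.array([torque.max() for torque, _ in reps])
    mean_intensity = float(np.mean(peaks / mvc_peak_torque))
    total_work = sum(rep_work(torque, angle) for torque, angle in reps)
    return mean_intensity, total_work

# Synthetic example: 10 repetitions over a 1.5 rad range of motion, ~80% of a 100 N*m MVC.
rng = np.random.default_rng(0)
angle = np.linspace(0.0, 1.5, 100)
reps = [(80.0 * np.sin(np.pi * angle / 1.5) + rng.normal(0.0, 2.0, angle.size), angle)
        for _ in range(10)]
print(training_summary(reps, mvc_peak_torque=100.0))
```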

Relevance: 20.00%

Abstract:

Cancer remains an unresolved challenge for modern medicine. Every year millions of people, from children to adults, die because current treatments are unable to meet the challenge, so research into new tumor biomarkers must continue. Molecular biology has advanced considerably in recent years; however, this knowledge has not yet been fully translated into clinical medicine. Biological findings should be used to improve diagnostics and treatment modalities. In this thesis, human formalin-fixed, paraffin-embedded colorectal and breast cancer samples were used to optimize a double immunofluorescence staining protocol. Immunohistochemistry was also performed in order to visualize the expression patterns of each biomarker. For double immunofluorescence, the feasibility of using primary antibodies raised in different host species and in the same host species was also tested. Finally, the established methods for simultaneous multicolor immunofluorescence imaging of formalin-fixed, paraffin-embedded specimens were applied to the detection of pairs of potential biomarkers of colorectal cancer (EGFR, pmTOR, pAKT, Vimentin, Cytokeratin Pan, Ezrin, E-cadherin) and breast cancer (Securin, PTTG1IP, Cleaved caspase 3, ki67).

Relevance: 20.00%

Abstract:

Mammography equipment must be evaluated to ensure that images will be of acceptable diagnostic quality at the lowest radiation dose. Quality Assurance (QA) aims to provide systematic and continuous improvement through a feedback mechanism that addresses the technical, clinical and training aspects. Quality Control (QC), in relation to mammography equipment, comprises a series of tests to determine equipment performance characteristics. The introduction of digital technologies has prompted changes in QC tests and protocols, and some tests are specific to each manufacturer. Within each country, specific QC tests should be compliant with regulatory requirements and guidance. Ideally, one mammography practitioner should take overarching responsibility for QC within a service, with all practitioners having responsibility for the actual QC testing. All QC results must be documented to facilitate troubleshooting, internal audit and external assessment. Generally speaking, the practitioner's role includes performing, interpreting and recording the QC tests as well as reporting any results outside action limits to their service lead. Practitioners must undertake additional continuing professional development to maintain their QC competencies. They are usually supported by technicians and medical physicists; in some countries the latter are mandatory. Technicians and/or medical physicists often perform many of the tests indicated within this chapter. It is important to recognise that this chapter is an attempt to encompass the main tests performed within European countries; practitioners must familiarise themselves with, and adhere to, the specific tests related to the service within which they work.

Relevance: 20.00%

Abstract:

An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q) = P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and have been the subject of a number of studies. Yannakakis' factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis' factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
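
Restating the abstract's two relationships in symbols (the notation is ours, not necessarily the paper's):

```latex
% xc(P): smallest size of an extended formulation of the polytope P
% S_P: slack matrix of P;  rank_+: nonnegative rank;  c(\Pi): complexity of protocol \Pi
% Yannakakis' factorization theorem:
\[
  \mathrm{xc}(P) \;=\; \operatorname{rank}_+\!\left(S_P\right).
\]
% The result stated in the abstract: for any nonnegative matrix M,
\[
  \log_2 \operatorname{rank}_+(M)
  \;=\; \min\bigl\{\, c(\Pi) \;:\; \Pi \text{ is a randomized protocol computing } M \text{ in expectation} \,\bigr\},
\]
% hence \log_2 \mathrm{xc}(P) equals the minimum complexity of a randomized
% protocol computing the slack matrix S_P in expectation.
```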