954 results for Eigenfunctions and fundamental solution
Abstract:
Dendritic cells (DCs) are leukocytes specialised in the uptake, processing, and presentation of antigen, and are fundamental in regulating both innate and adaptive immune functions. They are mainly localised at the interface between body surfaces and the environment, continuously scrutinising incoming antigen for the potential threat it may represent to the organism. In the respiratory tract, DCs constitute a tightly enmeshed network, with the most prominent populations localised in the epithelium of the conducting airways and the lung parenchyma. Their unique localisation enables them to continuously assess inhaled antigen, either inducing tolerance to inoffensive substances or initiating immunity against a potentially harmful pathogen. This immunological homeostasis requires stringent control mechanisms to protect the vital and fragile gaseous exchange barrier from unrestrained and damaging inflammation, or from an exaggerated immune response to an innocuous allergen, as in allergic asthma. During DC activation, co-stimulatory molecules and maturation markers are upregulated, enabling DCs to activate naïve T cells. This activation is accompanied by chemokine and cytokine release that not only serves to amplify the innate immune response but also determines the type of effector T cell population generated. An increasing body of recent literature provides evidence that different DC subpopulations in the lungs, such as myeloid DCs (mDCs) and plasmacytoid DCs (pDCs), occupy a key position at the crossroads between tolerance and immunity. This review aims to provide the clinician and researcher with a summary of the latest insights into DC-mediated pulmonary immune regulation and its relevance for developing novel therapeutic strategies for disease conditions such as infection, asthma, COPD, and fibrotic lung disease.
Abstract:
The relationship between the structures of protein-ligand complexes existing in the crystal and in solution, essential in the case of fragment-based screening by X-ray crystallography (FBS-X), has often been an object of controversy. To address this question, simultaneous co-crystallization and soaking of two inhibitors at different ratios, Fidarestat (FID; K(d) = 6.5 nM) and IDD594 (594; K(d) = 61 nM), which bind to h-aldose reductase (AR), were performed. The subatomic resolution of the crystal structures allows the two inhibitors to be differentiated, even when the structures are almost superposed. We determined the occupancy ratio in solution by mass spectrometry (MS), Occ(FID)/Occ(594) = 2.7, and by X-ray crystallography, Occ(FID)/Occ(594) = 0.6. The occupancies in the crystal and in solution thus differ by a factor of 4.6, implying that ligand binding potency is influenced by crystal contacts. A structural analysis shows that Loop A (residues 122-130), which is exposed to the solvent, is flexible in solution and is involved in packing contacts within the crystal. Furthermore, inhibitor 594 contacts the base of Loop A, stabilizing it, while inhibitor FID does not. This is shown by the difference in B-factors of Loop A between the AR-594 and AR-FID complexes. A stable loop diminishes the entropic energy barrier to binding, favoring 594 over FID. Therefore, the effect of the crystal environment should be taken into consideration in the X-ray diffraction analysis of ligand binding to proteins. This conclusion highlights the need for additional methodologies in the case of FBS-X to validate this powerful and widely used screening technique.
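Written out, the quoted factor is simply the ratio of the solution and crystal occupancy ratios; with the rounded values given above it evaluates to about 4.5, so the reported factor of 4.6 presumably reflects the unrounded occupancies:

```latex
\[
\frac{\bigl(\mathrm{Occ}_{\mathrm{FID}}/\mathrm{Occ}_{594}\bigr)_{\text{solution}}}
     {\bigl(\mathrm{Occ}_{\mathrm{FID}}/\mathrm{Occ}_{594}\bigr)_{\text{crystal}}}
\;=\; \frac{2.7}{0.6} \;\approx\; 4.5
\]
```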
Abstract:
Over the last century, numerous techniques have been developed to analyze the movement of humans while walking and running. The combined use of kinematic and kinetic methods, mainly based on high-speed video analysis and force plates, has permitted a comprehensive description of the locomotion process in terms of energetics and biomechanics. While the different phases of a single gait cycle are well understood, there is increasing interest in how the neuro-motor system controls gait from stride to stride. Indeed, it has been observed that neurodegenerative diseases and aging can affect gait stability and the steadiness of gait parameters. From both clinical and fundamental research perspectives, there is therefore a need for techniques that accurately track gait parameters stride by stride over long periods with minimal constraints on patients. In this context, high-accuracy satellite positioning can provide an alternative tool to monitor outdoor walking. Indeed, high-end GPS receivers provide centimeter-accuracy positioning at 5-20 Hz sampling rates; this allows the stride-by-stride assessment of a number of basic gait parameters, such as walking speed, step length and step frequency, that can be tracked over several thousand consecutive strides in free-living conditions. Furthermore, long-range correlations and fractal-like patterns have been observed in these time series. Compared to other classical methods, GPS seems a promising technology in the field of gait variability analysis. However, relatively high complexity and cost, combined with a usability that requires further improvement, remain obstacles to the full development of GPS technology in human applications.
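To illustrate the kind of stride-by-stride processing described above, the following is a minimal sketch, not the authors' implementation, of deriving step length, step frequency and walking speed from a time-stamped GPS track; the foot-contact times are assumed to have been detected separately.

```python
import numpy as np

def stride_parameters(t, x, y, step_times):
    """Per-step speed, length and frequency from a GPS track.

    t, x, y    : arrays of GPS time stamps (s) and planar positions (m),
                 e.g. sampled at 5-20 Hz by a high-end receiver.
    step_times : times of successive foot contacts (s), detected elsewhere.
    """
    # Interpolate the trajectory at the foot-contact times.
    xs = np.interp(step_times, t, x)
    ys = np.interp(step_times, t, y)

    dt = np.diff(step_times)                          # step duration (s)
    step_length = np.hypot(np.diff(xs), np.diff(ys))  # distance covered per step (m)
    step_frequency = 1.0 / dt                         # steps per second (Hz)
    speed = step_length / dt                          # mean speed over each step (m/s)
    return step_length, step_frequency, speed
```

Long-range correlations in the resulting series are then typically quantified with methods such as detrended fluctuation analysis.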
Abstract:
In arson cases, the collection and detection of traces of ignitable liquids on a suspect's hands can provide information to a forensic investigation. Police forces currently lack a simple, robust, efficient and reliable solution for performing this type of swabbing. In this article, we describe a study undertaken to develop a procedure for the collection of ignitable liquid residues on the hands of arson suspects. Sixteen different collection supports were considered, and their applicability to the collection of gasoline traces present on hands and their subsequent analysis in a laboratory was evaluated. Background contamination, consisting of volatiles emanating from the collection supports, and the collection efficiencies of the different sampling materials were assessed by passive headspace extraction with an activated charcoal strip (DFLEX device) followed by gas chromatography-mass spectrometry (GC-MS) analysis. After statistical treatment of the results, non-powdered latex gloves were retained as the most suitable sampling material. On the basis of the results obtained, a prototype sampling kit was designed and tested. This kit consists of a three-compartment multilayer bag enclosed in a sealed metal can and containing three pairs of non-powdered latex gloves: one to be worn by the sampler, one serving as a blank sample, and one to be worn by the person suspected of having been in contact with ignitable liquids. The kit was designed to be effective in preventing external contamination and cross-contamination.
Abstract:
This article studies alterations in the values, attitudes, and behaviors that emerged among U.S. citizens as a consequence of, and as a response to, the attacks of September 11, 2001. The study briefly examines the immediate reaction to the attack, before focusing on the collective reactions that characterized the behavior of the majority of the population between the events of 9/11 and the response to it in the form of intervention in Afghanistan. In studying this period an eight-phase sequential model (Botcharova, 2001) is used, where the initial phases center on the nation as the ingroup and the latter focus on the enemy who carried out the attack as the outgroup. The study is conducted from a psychosocial perspective and uses "social identity theory" (Tajfel & Turner, 1979, 1986) as the basic framework for interpreting and accounting for the collective reactions recorded. The main purpose of this paper is to show that the interpretation of these collective reactions is consistent with the postulates of social identity theory. The application of this theory provides a different and specific analysis of events. The study is based on data obtained from a variety of rigorous academic studies and opinion polls conducted in relation to the events of 9/11. In line with social identity theory, 9/11 had a marked impact on the importance attached by the majority of U.S. citizens to their identity as members of a nation. This in turn accentuated group differentiation and activated ingroup favoritism and outgroup discrimination (Tajfel & Turner, 1979, 1986). Ingroup favoritism strengthened group cohesion, feelings of solidarity, and identification with the most emblematic values of the U.S. nation, while outgroup discrimination induced U.S. citizens to conceive the enemy (al-Qaeda and its protectors) as the incarnation of evil, depersonalizing the group and venting their anger on it, and to give their backing to a military response, the eventual intervention in Afghanistan. Finally, and also in line with the postulates of social identity theory, as an alternative to the virtual bipolarization of the conflict (U.S. vs al-Qaeda), the activation of a higher level of identity in the ingroup is proposed, a group that includes the United States and the largest possible number of countries, including Islamic states, in the search for a common, more legitimate and effective solution.
Abstract:
Nowadays, Scanning Electron Microscopy (SEM) is a basic and fundamental tool in the study of geologic samples. The collision of a highly accelerated electron beam with the atoms of a solid sample results in the production of several radiation types that can be detected and analysed by specific detectors, providing information on the chemistry and crystallography of the studied material. From this point of view, the chamber of a SEM can be considered as a laboratory where different experiments can be carried out. The application of SEM to geology, especially in the fields of mineralogy and petrology, has been summarised by Reed (1996). The aim of this paper is to show some recent applications in the characterization of geologic materials.
Abstract:
During the first year of research, work was completed to identify Iowa DOT needs for a web-based project management system (WPMS) and to evaluate how commercially available solutions could meet these needs. Researchers also worked to pilot test custom-developed WPMS solutions on Iowa DOT bridge projects. At the end of the first year of research, a Request for Proposals (RFP) was developed and issued by the Iowa DOT for the selection of a commercial WPMS to pilot test on multiple bridge projects. During the second year of research, the responses to the RFP issued during the first year were evaluated and a solution was selected. The selected solution, Attolist, was customized, tested, and implemented during the fall of 2009. Beginning in the winter of 2010, the solution was implemented on Iowa DOT projects. Researchers assisted in the training, implementation, and performance evaluation of the solution. Work will continue beyond the second year of research to implement Attolist on an additional pilot project. During this time, work will be completed to evaluate the impact of WPMS on Iowa DOT bridge projects.
Abstract:
Multihop ad-hoc networks have a dynamic topology. Retrieving a route towards a remote peer requires the execution of a recipient lookup, which can publicly reveal sensitive information about that peer. Within this context, we propose an efficient, practical and scalable solution to guarantee the anonymity of recipient nodes in ad-hoc networks.
Abstract:
This paper presents a simple and fast solution to the problem of finding the time variations of the forces that keep an object in equilibrium when a finger is removed from a three-contact-point grasp or a finger is added to a two-contact-point grasp, assuming the existence of an external perturbation force (which can be the object's weight itself). The procedure returns force set points for the control system of a manipulator device in a regrasping action. The approach was implemented, and a numerical example is included in the paper to illustrate how it works.
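To make the underlying equilibrium computation concrete, here is a minimal, hypothetical sketch of the planar static case: the contact forces must balance the external perturbation wrench, and a minimum-norm least-squares solution gives one admissible set of force set points. This is not the authors' procedure; in particular it ignores friction-cone constraints and the time variation of forces during finger removal or addition.

```python
import numpy as np

def equilibrium_forces(contacts, f_ext, m_ext=0.0):
    """Minimum-norm contact forces balancing an external load (planar case).

    contacts : list of (x, y) contact point coordinates (m)
    f_ext    : external force (fx, fy) acting on the object (N)
    m_ext    : external moment about the origin (N*m)

    Returns an array [f1x, f1y, f2x, f2y, ...] of contact force components.
    """
    n = len(contacts)
    G = np.zeros((3, 2 * n))          # planar grasp map: force sums + moments
    for i, (px, py) in enumerate(contacts):
        G[0, 2 * i] = 1.0             # sum of x-components
        G[1, 2 * i + 1] = 1.0         # sum of y-components
        G[2, 2 * i] = -py             # moment of the x-component about the origin
        G[2, 2 * i + 1] = px          # moment of the y-component about the origin
    wrench = -np.array([f_ext[0], f_ext[1], m_ext])
    # Minimum-norm least-squares solution of G f = -w_ext
    f, *_ = np.linalg.lstsq(G, wrench, rcond=None)
    return f

# Example: three-contact grasp holding a 0.5 kg object against gravity
forces = equilibrium_forces([(0.0, 0.0), (0.1, 0.0), (0.05, 0.08)],
                            f_ext=(0.0, -9.81 * 0.5))
print(forces.round(3))
```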
Abstract:
PURPOSE OF REVIEW: Intensive insulin therapy titrated to restore and maintain blood glucose between 80 and 110 mg/dl (4.4-6.1 mmol/l) was found to improve survival of critically ill patients in one pioneering proof-of-concept study performed in a surgical intensive care unit. The external validity of these findings was investigated. RECENT FINDINGS: Six independent prospective randomized controlled trials, involving 9877 patients in total, were unable to confirm the survival benefit reported in the pioneering trial. Several hypotheses have been proposed to explain this discrepancy, including the case mix, the features of usual care, the quality of glucose control, and the risks associated with hypoglycemia. SUMMARY: Until the conditions under which tight glycemic control improves outcome are better understood and delineated, the choice of an intermediate glycemic target appears to be a safe and effective solution.
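For reference, the two target ranges quoted above are the same interval expressed in different units; glucose concentration in mg/dl converts to mmol/l by dividing by about 18.02 (one tenth of glucose's molar mass of roughly 180.2 g/mol):

```latex
\[
80\ \mathrm{mg/dl} \;\approx\; \tfrac{80}{18.02}\ \mathrm{mmol/l} \;\approx\; 4.4\ \mathrm{mmol/l},
\qquad
110\ \mathrm{mg/dl} \;\approx\; \tfrac{110}{18.02}\ \mathrm{mmol/l} \;\approx\; 6.1\ \mathrm{mmol/l}.
\]
```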
Abstract:
Revised environmental legislation requires increasingly systematic environmental data management from energy production plants. The obligations of the LCP and waste incineration regulations have set new requirements for emissions monitoring, for the quality assurance of the measurement systems used for it, and for the reporting of emission data. These reforms have considerably increased the time plants spend handling environmental data. A plant's operating conditions are defined in the environmental permit granted by the environmental authority, which is the single most important factor governing the plant's operation. In addition, many operators want to improve their environmental performance through voluntary environmental management systems. This master's thesis describes DNAecoDiary, a browser-based application developed by Metso Automation for recording and managing the environmental affairs of energy production plants. The work is limited to plants operating in Finland that are subject to the LCP and/or waste incineration regulations. The application ensures efficient management of deviations, incident reports, events related to emission measurement devices, and other information related to the monitoring of environmental affairs at energy production plants. The application stores the basic data of environmental events and, in particular, the users' experience-based knowledge related to those events. In addition to the basic event data, files and images can be attached to a monitoring entry. The application and the data collected with it can be used at the plant for solving problems at hand, verifying environmental events, and preparing environmental reports. To support the development work, a customer needs survey was carried out, on the basis of which the application's features were devised. This work presents the fundamentals of environmental data management, explains the operating principles of DNAecoDiary, and gives examples of its use. The final content of the application is defined according to each customer's environmental permit and self-monitoring needs. The application works independently or as part of Metso Automation's broader emission management and reporting application suite.
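As a purely illustrative sketch, and not Metso Automation's actual DNAecoDiary data model, an environmental monitoring entry of the kind described above could be represented roughly as follows, with the basic event data, the user's experience-based notes, and optional file or image attachments:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class MonitoringEntry:
    """One environmental event record (hypothetical structure)."""
    plant: str                       # energy production plant
    timestamp: datetime              # when the event occurred
    category: str                    # e.g. "deviation", "incident", "CEMS maintenance"
    description: str                 # basic event data
    experience_notes: str = ""       # the user's experience-based knowledge
    attachments: List[str] = field(default_factory=list)  # paths to files or images

# Hypothetical usage
entry = MonitoringEntry(
    plant="Example CHP plant",
    timestamp=datetime(2011, 3, 14, 9, 30),
    category="deviation",
    description="SO2 emission limit briefly exceeded during start-up.",
    experience_notes="Start-up sequence adjusted; see attached trend.",
    attachments=["trend.png"],
)
```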
Abstract:
The changing business environment demands that chemical industrial processes be designed so that they enable the attainment of multi-objective requirements and the enhancement of innovative design activities. The requirements and key issues for conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. This thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It deals with the multi-objective requirements and combinatorially complex nature of process synthesis. The work is carried out based on a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ). TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolutionary-directed nature. The conflict concept, when applied to process synthesis, throws new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information. Design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. A general design paradigm is formulated to handle conflicts in both the early and detailed design stages. The methodology developed reflects the conflict nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization. Handling the various levels of conflicts evolves possible design alternatives through a systematic procedure that establishes an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Enhancement of creativity is realized through the better understanding of design problems gained from the conflict concept and through the improvement in engineering design practice brought by the systematic nature of the approach.
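To make the conflict-table idea tangible, here is a minimal, hypothetical sketch, with illustrative entries only and not the thesis's actual table, in which a design task is described as a set of conflicting objective pairs and the table maps each pair to candidate resolution principles, in the spirit of a TRIZ contradiction matrix:

```python
# Hypothetical conflict table: (improving objective, worsening objective)
# -> candidate resolution principles. Entries are illustrative only.
conflict_table = {
    ("product purity", "energy consumption"): ["heat integration", "side-draw column"],
    ("conversion", "reactor volume"): ["recycle structure", "reactive separation"],
    ("waste generation", "raw material cost"): ["solvent recovery", "process intensification"],
}

def suggest_resolutions(conflicts):
    """Collect candidate principles for the conflicts describing a design task."""
    suggestions = {}
    for pair in conflicts:
        suggestions[pair] = conflict_table.get(pair, ["no entry - refine the conflict"])
    return suggestions

# A design task represented as a group of conflicts
task = [("product purity", "energy consumption"), ("conversion", "reactor volume")]
print(suggest_resolutions(task))
```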
Abstract:
Western countries have spent substantial amounts of money to facilitate the integration of Information and Communication Technologies (ICT) into education, hoping to find a solution to the touchy equation that can be summarized by the famous statement "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of the infrastructure and of the quality of service, this goal is far from reached. Although we think it illusory to expect technology, all by itself, to solve our economic and educational problems, we firmly take the view that it can greatly contribute not only to improving learning conditions but also to rethinking the pedagogical approach; every member of our community could hence take advantage of this opportunity to reflect upon his or her strategy. In this framework, and convinced that integrating ICT into education opens a number of very interesting avenues provided we think teaching "out of the box", we became interested in courseware development positioned at the intersection of didactics and pedagogical sciences, cognitive sciences and computing. Hence, hoping to bring a realistic and simple solution that could help develop, update, integrate and sustain courseware, we got involved in concrete projects. As we gained field experience we noticed that (i) the quality of courseware is still disappointing, amongst other reasons because the added value that technology can bring is not exploited as much as it could or should be, and (ii) a project requires, besides bringing a useful answer to a real problem, to be efficiently managed and to be "championed". With the aim of proposing a pragmatic and practical project management approach, we first looked into the characteristics of open and distance learning projects. We then analyzed existing methodologies in the hope of being able to use one or the other, or a suitable combination, to best fit our needs. In an empirical manner, proceeding by successive iterations and refinements, we defined a simple methodology and contributed to building descriptive "cards" attached to each of its phases to help decision making. We describe the different actors involved in the process, insisting specifically on the pedagogical engineer, viewed as an orchestra conductor, whom we consider critical to the success of our approach. Last but not least, we validated our methodology a posteriori by reviewing four of the projects we participated in, which we consider emblematic of the university reality. We believe that the implementation of our methodology, along with the availability of computerized cards to help project managers take decisions, could constitute a great asset and contribute to measuring the technologies' real impact on (i) the evolution of teaching practices, (ii) the organization and (iii) the quality of pedagogical approaches. Our methodology could hence serve as a springboard for a quality assessment specific to open and distance learning. Further research on the real flexibilization of learning and on the contribution of technologies for learners could then be conducted on the basis of metrics that remain to be defined.
Abstract:
The ultimate goal of any research in the mechanism/kinematics/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology in order to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As part of a systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables; the solutions are obtained by adopting closed-form classical or modern algebraic solution methods, or by using numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limitations on the number of design specifications, and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage, and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints. Based on practical design needs, mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables, but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values for which the system is solvable are considered, including the degenerate cases). By applying the developed solution method to the dyadic equations in direct polynomial form for two to three precision points, it is algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be solved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at the system level demands integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed kinematic design method is based on combining the two-precision-point formulation with optimisation of substructures (using mathematical programming techniques or optimisation methods based on probability and statistics), driven by criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with the mechanical system simulation techniques.
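For readers unfamiliar with the dyadic equations referred to above, the standard planar dyad (loop-closure) equation of precision-point synthesis is commonly written in the literature, not necessarily in the thesis's own notation, as

```latex
\[
\mathbf{W}\left(e^{i\beta_j} - 1\right) + \mathbf{Z}\left(e^{i\alpha_j} - 1\right) = \boldsymbol{\delta}_j ,
\qquad j = 1,\dots,m,
\]
```

where W and Z are the unknown dyad vectors, beta_j and alpha_j are the prescribed link rotations, and delta_j is the displacement of the precision point from its first position; for two or three precision points this yields the polynomial systems whose complete solution sets, including the positive-dimensional ones, are the subject of the thesis.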