945 results for Bio-inspired robotics
Abstract:
BACKGROUND: The considerable malaria decline in several countries challenges the strategy of chemoprophylaxis for travellers visiting moderate- to low-risk areas. An international consensus on the best strategy is lacking, and it is essential to include travellers' opinions in the decision process. The preferences of travellers regarding malaria prevention for moderate- to low-risk areas, related to their risk perception, as well as the reasons for their choices, were investigated. METHODS: Prior to pre-travel consultation in the Travel Clinic, a self-administered questionnaire was given to travellers visiting moderate- to low-risk malaria areas. Four preventive options were proposed to the traveller: bite prevention only, chemoprophylaxis, stand-by emergency treatment alone, and stand-by emergency treatment with a rapid diagnostic test. The information was accompanied by a risk scale for the incidence of malaria, anti-malarial adverse drug reactions and other travel-related risks, inspired by Paling palettes from the Risk Communication Institute. RESULTS: A total of 391 travellers were included from December 2012 to December 2013. Fifty-nine (15%) opted for chemoprophylaxis, 116 (30%) for stand-by emergency treatment, 112 (29%) for stand-by emergency treatment with a rapid diagnostic test, 100 (26%) for bite prevention only, and four (1%) for other choices. Travellers choosing chemoprophylaxis justified their choice by security reasons (42%), better preventive action (29%), higher efficacy (15%) and ease of use (15%). The reasons for choosing stand-by treatment or bite prevention only were less medication consumed (29%), fewer adverse drug reactions (23%) and lower price (9%). Those who chose chemoprophylaxis were more likely to have used it in the past (OR = 3.0, CI 1.7-5.44), but did not differ in terms of demographics, travel characteristics or risk behaviour.
CONCLUSIONS: When travelling to moderate- to low-risk malaria areas, 85% of interviewees chose not to take chemoprophylaxis as malaria prevention, although most guidelines recommend it. They had coherent reasons for their choice. New recommendations should include shared decision-making to take into account travellers' preferences.
Abstract:
This thesis brings together a series of meta-analyses, that is, analyses whose object is analyses produced by sociologists (notably those resulting from the application of interview-processing methods). The approach is reflexive and aimed at the concrete practices of sociologists, considered as activities governed by rules. An important part of this thesis is therefore devoted to the development of a "pragmatological" analytical tool (E. Durkheim) to conduct a study of such practices and of the rules that govern them. To approach these rules, Wittgenstein-inspired analytic philosophy offers several important proposals. Rules are first seen as family-resemblance concepts: there is no common definition covering all rules. In order to study rules, it is therefore necessary to draw distinctions based on how they are respectively used. One of these distinctions concerns the difference between constitutive rules and regulative rules: a constitutive rule creates a practice (for example, marriage), while a regulative rule applies to activities that can exist without it (for example, the rules of etiquette). The methodological activity of sociologists relies on, and is constrained by, these types of rules, which are essentially implicit. Through the description and codification of rules, this thesis aims to account for the normative character of methods governing analytical practices in sociology. Particular emphasis is placed on the logical limits established by constitutive rules, limits that render certain of the sociologist's actions impossible (rather than forbidden).
Abstract:
Work-life issues have become a major concern across Western societies, with the objective of promoting women's careers and well-being. However, despite growing attempts to increase the number of women in senior management positions in European countries such as Switzerland, they remain highly underrepresented. Inspired by the cultural approach in psychology, this article focuses on these women's concrete everyday life to understand how they articulate different life domains and how this influences their subjective well-being. A narrative approach based on reflexivity is adopted to analyze women's activity. Results show intertwinements of meaning between life priorities that are often conflicting. Two psychological functions are identified: the feeling of control and the letting go of control. Each contributes to women's subjective well-being through the use of diversified supports, but their structuring roles appear only in relation to one another. Results are discussed in the light of the existing literature and of their implications.
Abstract:
To develop systems for detecting Alzheimer's disease, we use EEG signals. The available database is raw, so the first step must be to clean the signals properly. We propose a new way of performing ICA cleaning on a database recorded from patients with Alzheimer's disease (mild AD, early stage). Two researchers visually inspected all the signals (EEG channels), and each recording's least corrupted (artefact-clean) continuous 20 s interval was chosen for the analysis. Each trial was then decomposed using ICA. Sources were ordered using a kurtosis measure, and the researchers removed up to seven sources per trial corresponding to artefacts (eye movements, EMG corruption, EKG, etc.), using three criteria: (i) isolated source on the scalp (only a few electrodes contribute to the source); (ii) abnormal wave shape (drifts, eye blinks, sharp waves, etc.); (iii) source of abnormally high amplitude (≥ 100 µV). We then evaluated the outcome of this cleaning by classifying patients using multilayer perceptron neural networks. Results are very satisfactory: performance increased from 50.9% to 73.1% correctly classified data using the ICA cleaning procedure.
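The kurtosis-based ICA cleaning described in this abstract can be sketched as follows. This is a minimal illustration on synthetic signals, not the authors' pipeline: their source removal was guided by visual inspection of three criteria, whereas here the most kurtotic sources are simply zeroed out.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def ica_clean(eeg, n_remove=1, random_state=0):
    """Decompose multichannel EEG (n_samples, n_channels) into independent
    sources, rank them by kurtosis (artefacts such as eye blinks tend to be
    highly kurtotic), zero out the most kurtotic ones and reconstruct."""
    ica = FastICA(n_components=eeg.shape[1], random_state=random_state)
    sources = ica.fit_transform(eeg)          # (n_samples, n_components)
    k = kurtosis(sources, axis=0)             # one kurtosis value per source
    worst = np.argsort(k)[::-1][:n_remove]    # most kurtotic = presumed artefacts
    sources[:, worst] = 0.0
    return ica.inverse_transform(sources)

# Synthetic demo: two smooth oscillations plus one sparse "blink" source.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
s = np.c_[np.sin(2 * np.pi * t), np.cos(3 * np.pi * t), np.zeros_like(t)]
s[::200, 2] = 50.0                            # rare high-amplitude spikes
eeg = s @ rng.normal(size=(3, 3)).T           # mix sources into 3 "channels"
cleaned = ica_clean(eeg, n_remove=1)
```

After cleaning, the reconstructed channels retain the oscillatory activity while the spiky component is suppressed.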
Abstract:
The aim of this thesis is to examine how to build a functional model for reporting maintenance costs and performance for a company's management control needs. The topic originates from the case company's practical need to create a unified maintenance reporting model. The problem is first approached through the literature, which is used to analyse the tasks and methods of management accounting and management's information needs, and secondly to examine the objectives and effects of maintenance. Maintenance is seen to have a significant impact on production and, through it, on the company's profitability. Based on the literature, a seven-step maintenance reporting process model is constructed. The empirical part of the thesis comprises interviews with the case company's senior management, plant managers, maintenance managers and financial business controllers, conducted to identify their information needs related to maintenance reporting. Based on the interviews, a model proposal for reporting maintenance costs and performance is presented. The research approach of the thesis is constructive.
Abstract:
The topic of this study is the language of the educational policies of the British Labour Party in the General Election manifestos between the years 1983-2005. The twenty-year period studied has been a period of significant changes in world politics, and in British politics, especially for the Labour Party. The emergence of educational policy as a vote-winner in the manifestos of the nineties has been noteworthy. The aim of the thesis is two-fold: to look at the structure of the political manifesto as an example of genre writing and to analyze the content using the approach of critical discourse analysis. Furthermore, the aim of this study is not to pinpoint policy positions but to look at the image that the Labour Party creates of itself through these manifestos. The analysis of the content is done by a method of close reading, and based on the findings, the methodology for the analysis of the content was created. This study utilized methodological triangulation, meaning that the material is analyzed from several methodological aspects. The aspects used in this study are lexical features (collocation, coordination, euphemisms, metaphors and naming), grammatical features (thematic roles, tense, aspect, voice and modal auxiliaries) and rhetoric (Burke, Toulmin and Perelman). From the analysis of the content a generic description is built. By looking at the lexical, grammatical and rhetorical features, a clear change in the language of the Labour Party can be detected. This change is foreshadowed already in the 1992 manifesto but culminates in the 1997 manifesto, which would lead Labour to a landslide victory in the General Election. During this twenty-year period Labour has moved away from the old commitments and into the new sphere of "something for everybody". The spread of promotional language and market-inspired vocabulary into the sphere of manifesto writing is clear.
The use of metaphors appeared to be the main tool for creating the image of the party represented through the manifestos. A limited generic description can be constructed from the findings based on the content and structure of the manifestos: especially the more generic findings, such as the use of the exclusive we, the lack of certain anatomical parts of argument structure, and the use of the future tense and the present progressive aspect, can shed light on the description of the genre of manifesto writing. While this study is only the beginning, it shows that combining the study of lexical, grammatical and rhetorical features in the analysis of manifestos is a promising approach.
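The collocation counting mentioned among the lexical features can be illustrated with a toy sketch. The example snippet and the crude tokenisation are invented for demonstration; real corpus work would add association measures such as pointwise mutual information.

```python
from collections import Counter
from itertools import islice

def top_collocations(text, n=5):
    """Count adjacent word pairs (bigrams) as a crude collocation measure."""
    words = [w.lower().strip('.,;:"!?') for w in text.split()]
    bigrams = Counter(zip(words, islice(words, 1, None)))
    return bigrams.most_common(n)

# An invented, manifesto-flavoured snippet for demonstration only.
snippet = ("new labour new britain we will invest in education "
           "we will invest in our schools because education matters")
print(top_collocations(snippet, 3))
```

Recurring pairs such as "we will" surface immediately, which is exactly the kind of repeated phrasing a collocation analysis of manifestos looks for.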
Abstract:
Gamma Knife surgery (GKS) is widely used as an alternative to open microsurgical procedures as a noninvasive treatment of many intracranial conditions. It consists of delivering a single dose of high energy in stereotactic conditions, with the help of multimodal imaging (e.g., magnetic resonance imaging [MRI], computed tomography, and possibly angiography). The Gamma Knife (GK) was invented by the Swedish neurosurgeon Lars Leksell, who was the first to treat a trigeminal neuralgia sufferer in 1951 using an orthogonal X-ray tube.
Since then, progress in both computing and robotics has made it possible to improve the radiosurgical technique, which is currently performed either by a linear particle accelerator mounted on a robotic arm (Novalis®, Cyberknife®), or by collimation of 192 fixed Co-60 sources (GK). The main indication of GKS in the treatment of pain is trigeminal neuralgia. The other, less frequent indications are glossopharyngeal neuralgia, cluster headache, and hypophysiolysis for cancer pain.
Abstract:
Mobile technologies have brought about major changes in police equipment and police work. While a utopian narrative remains strongly linked to the adoption of new technologies, often presented as 'magic bullets' for real occupational problems, there are important tensions between their 'imagined' outcomes and the (unexpected) effects that accompany their daily 'practical' use by police officers. This article offers an analysis of police officers' perceptions of, and interactions with, security devices. In so doing, it develops a conceptual typology of strategies for coping with new technology inspired by Le Bourhis and Lascoumes: challenging, neutralizing and diverting. To that purpose, we adopt an ethnographic approach that focuses on the discourses, practices and actions of police officers in relation to three security devices: the mobile digital terminal, the mobile phone and the body camera. Based on a case study of a North American municipal police department, the article addresses how these technological devices are perceived and experienced by police officers on the beat.
Abstract:
The objective of this work is to virtually emulate the working environment of the Stäubli Tx60 robot in the robotics laboratory of the UdG (within the possibilities offered by the acquired software). This laboratory attempts to reproduce an industrial working environment in which the assembly of a product is carried out in a fully automated way. In a first phase, the entire working environment available in the laboratory was designed in three dimensions using the SolidWorks CAD software; each of the assemblies making up the workstation was designed independently. All the designed elements were then imported into the Stäubli Robotics Suite 2013 software. In summary, the main objective of the work consists of two stages: first, the 3D model of the working environment is designed in SolidWorks and imported into Stäubli Robotics Suite 2013; in a second stage, a user manual for the new robotics software is produced.
Abstract:
"How old is this fingermark?" This question is relatively often raised in trials when suspects admit that they left their fingermarks at a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the forensic community as a whole. Nevertheless, the review of past American cases conducted in this research highlighted that experts actually gave, and still give, testimony in court about the age of fingermarks, even though it is mostly based on subjective and poorly documented parameters. It was relatively easy to access fully described American cases, which explains the origin of the examples given. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the answers given underlines the need to conduct research on the subject. The present work thus aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, various attempts have already been described in the literature. This research offers a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice.
Nevertheless, some approaches based on the evolution of intrinsic compounds detected in fingermark residue over time appear to be promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired from the drug profiling area. It was also possible to propose an objective donor classification model allowing donors to be grouped in two main classes based on their initial lipid composition. These classes correspond to what is, relatively subjectively, called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue was conducted. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing techniques), but that some influence factors affected these models more than others.
In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other tested factors (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved to be robust. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. It turned out that correct estimations were obtained for 60% of the dated test fingermarks, and up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used in uncontrolled casework conditions. From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence factors and aging effects over large areas of fingermarks was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework for approaching fingermark dating questions. It identifies which type of information the scientist is currently able to bring to investigators and/or the courts.
Furthermore, the proposed framework also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are well known and properly documented.
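A univariate aging model of the kind described, an aging parameter regressed against known ages and then inverted to date a questioned mark, might look like the following sketch. All numbers are illustrative inventions, not data from the thesis.

```python
import numpy as np

# Hypothetical calibration data: an aging parameter (e.g. a lipid peak-area
# ratio) measured on reference fingermarks of known age; values invented.
ages = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])    # days
ratio = np.array([1.00, 0.82, 0.68, 0.55, 0.46, 0.37, 0.30])

# The ratio decays roughly exponentially, so a line is fitted to its log.
slope, intercept = np.polyfit(ages, np.log(ratio), 1)

def estimate_age(measured_ratio):
    """Invert the univariate model to date a questioned mark."""
    return (np.log(measured_ratio) - intercept) / slope

est = estimate_age(0.5)   # questioned mark whose ratio has decayed to 0.5
```

The inversion step is where influence factors (donor, substrate, storage conditions) matter: a calibration curve built under one set of conditions may not transfer to another, which is precisely the robustness problem the thesis investigates.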
Abstract:
BACKGROUND: Hypoxia-induced pulmonary vasoconstriction increases pulmonary arterial pressure (PAP) and may impede right heart function and exercise performance. This study examined the effects of oral nitrate supplementation on right heart function and performance during exercise in normoxia and hypoxia. We tested the hypothesis that nitrate supplementation would attenuate the increase in PAP at rest and during exercise in hypoxia, thereby improving exercise performance. METHODS: Twelve trained male cyclists [age: 31 ± 7 year (mean ± SD)] performed 15 km time-trial cycling (TT) and steady-state submaximal cycling (50, 100, and 150 W) in normoxia and hypoxia (11% inspired O2) following 3-day oral supplementation with either placebo or sodium nitrate (0.1 mmol/kg/day). We measured TT time-to-completion, muscle tissue oxygenation during TT and systolic right ventricle to right atrium pressure gradient (RV-RA gradient: index of PAP) during steady state cycling. RESULTS: During steady state exercise, hypoxia elevated RV-RA gradient (p > 0.05), while oral nitrate supplementation did not alter RV-RA gradient (p > 0.05). During 15 km TT, hypoxia lowered muscle tissue oxygenation (p < 0.05). Nitrate supplementation further decreased muscle tissue oxygenation during 15 km TT in hypoxia (p < 0.05). Hypoxia impaired time-to-completion during TT (p < 0.05), while no improvements were observed with nitrate supplementation in normoxia or hypoxia (p > 0.05). CONCLUSION: Our findings indicate that oral nitrate supplementation does not attenuate acute hypoxic pulmonary vasoconstriction nor improve performance during time trial cycling in normoxia and hypoxia.
Abstract:
Modelling the shoulder's musculature is challenging given its mechanical and geometric complexity. The use of the ideal fibre model to represent a muscle's line of action cannot always faithfully represent the mechanical effect of each muscle, leading to considerable differences between model-estimated and in vivo measured muscle activity. While the musculo-tendon force coordination problem has been extensively analysed in terms of the cost function, only a few works have investigated the existence and sensitivity of solutions to fibre topology. The goal of this paper is to present an analysis of the solution set using the concepts of torque-feasible space (TFS) and wrench-feasible space (WFS) from cable-driven robotics. A shoulder model is presented and a simple musculo-tendon force coordination problem is defined. The ideal fibre model for representing muscles is reviewed and the TFS and WFS are defined, leading to the necessary and sufficient conditions for the existence of a solution. The shoulder model's TFS is analysed to explain the lack of anterior deltoid (DLTa) activity. Based on the analysis, a modification of the model's muscle fibre geometry is proposed. The performance with and without the modification is assessed by solving the musculo-tendon force coordination problem for quasi-static abduction in the scapular plane. After the proposed modification, the DLTa reaches 20% of activation.
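The torque-feasible space test borrowed from cable-driven robotics can be phrased as a linear-programming feasibility problem: a torque τ belongs to the TFS if some admissible tension vector f with W f = τ exists within the tension bounds. A minimal sketch follows; the moment-arm matrix and bounds are invented toy values, not the paper's shoulder model.

```python
import numpy as np
from scipy.optimize import linprog

def torque_feasible(W, tau, f_min, f_max):
    """Check whether a joint torque `tau` lies in the torque-feasible space
    { W f : f_min <= f <= f_max }, i.e. whether some admissible set of
    tensions f reproduces it. Pure feasibility LP: the objective is zero,
    only the constraints matter."""
    n = W.shape[1]
    res = linprog(c=np.zeros(n), A_eq=W, b_eq=tau,
                  bounds=list(zip(f_min, f_max)), method="highs")
    return res.status == 0   # 0 means an admissible tension vector exists

# Toy 2-DoF "joint" driven by three one-sided tension elements.
W = np.array([[1.0, -1.0, 0.5],
              [0.0,  1.0, 1.0]])
f_lo, f_hi = np.zeros(3), np.full(3, 10.0)
print(torque_feasible(W, np.array([2.0, 3.0]), f_lo, f_hi))    # reachable
print(torque_feasible(W, np.array([50.0, 0.0]), f_lo, f_hi))   # out of reach
```

A torque outside the TFS has no admissible tension solution at all, which is how an analysis of this kind can explain a muscle (such as the DLTa) showing no activity in the model regardless of the cost function chosen.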
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licences and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first-triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
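The profile-comparison step, extracting a profile from a filtered image and scoring pairs with a Canberra distance, can be sketched as follows. The hue-histogram profile and the toy images are stand-ins for the prototype's actual region-of-interest processing, which is not detailed here.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from scipy.spatial.distance import canberra

def hue_profile(rgb_image, bins=32):
    """Collapse a document image (H, W, 3, uint8) into a normalised hue
    histogram -- a simple stand-in for the paper's Hue-filter profiles."""
    hsv = rgb_to_hsv(rgb_image.astype(float) / 255.0)
    hist, _ = np.histogram(hsv[..., 0], bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def profile_distance(img_a, img_b):
    """Canberra distance between two hue profiles; smaller values suggest
    the documents may share a source and deserve closer examination."""
    return canberra(hue_profile(img_a), hue_profile(img_b))

# Three toy "documents": a blue one, a darker copy of it, and a red one.
blue = np.zeros((32, 32, 3), np.uint8); blue[..., 2] = 200
blue_copy = blue.copy(); blue_copy[..., 2] = 190    # same hue, darker
red = np.zeros((32, 32, 3), np.uint8); red[..., 0] = 200
print(profile_distance(blue, blue_copy), profile_distance(blue, red))
```

Because hue is insensitive to brightness, the darker copy scores close to its original while the red document scores far away; this is the kind of pairwise score a triage method would threshold to flag potential common sources.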
Abstract:
This article stems from postdoctoral research on the history of orthopaedics, mandated by the Centre Hospitalier Universitaire Vaudois (CHUV) and partly based on the archives of the Swiss Society of Orthopedics (today Swiss Orthopaedics). By examining the issues that have shaped the history of orthopaedics in Switzerland, the author seeks to shed light on the adaptation strategies of a medical and technical discipline within a changing society. Summary of the article and information on the journal's website: http://econtent.hogrefe.com/toc/tum/72/7