75 results for Operational efficiency
at Université de Lausanne, Switzerland
Abstract:
DNA is nowadays swabbed routinely to investigate serious and volume crimes, but research remains scarce when it comes to determining the criteria that may affect the success rate of DNA swabs taken from different surfaces and in different situations. To investigate these criteria under fully operational conditions, the DNA analysis results of 4772 swabs taken in volume crime cases by the forensic unit of a police department in Western Switzerland over a 2.5-year period (2012-2014) were considered. A representative random sample of 1236 swab analyses was examined in depth and coded according to several criteria: whether the swabbing was performed at the scene or in the lab, the zone of the scene where it was performed, the kind of object or surface that was swabbed, whether the target specimen was a touched surface or a biological fluid, and whether the swab targeted a single surface or combined different surfaces. The impact of each criterion, and of their combinations, was assessed with regard to the success rate of DNA analysis, measured through the quality of the resulting profile and whether the profile produced a hit in the national database. Results show that some situations, such as swabs taken on door and window handles, have a higher success rate than the average swab. Conversely, other situations lead to a marked decrease in the success rate, which should discourage further analysis of such swabs. Results also confirm that targeting a DNA swab at a single surface is preferable to swabbing different surfaces with the intent to aggregate cells deposited by the offender. Such results help predict the chance that the analysis of a swab taken in a given situation will lead to a positive result.
The study could therefore inform an evidence-based approach to decision-making at the crime scene (what to swab or not) and at the triage step (what to analyse or not), thus helping to save resources and increase the efficiency of forensic science efforts.
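The study's core output is a per-situation success rate. A minimal sketch of how such rates, with confidence intervals, could be tabulated from coded swab data; the counts below are entirely hypothetical, not the study's figures:

```python
import math

# Hypothetical counts per swabbing situation (NOT the study's actual data):
# situation -> (number of swabs analysed, number yielding a database hit)
counts = {
    "door/window handle": (120, 48),
    "combined surfaces": (90, 9),
}

def wilson_interval(hits, n, z=1.96):
    """95% Wilson score interval for a success proportion."""
    p = hits / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

for situation, (n, hits) in counts.items():
    lo, hi = wilson_interval(hits, n)
    print(f"{situation}: {hits / n:.0%} hit rate (95% CI {lo:.0%}-{hi:.0%})")
```

Comparing each situation's interval against the overall average is one simple way to flag swab types whose analysis should be encouraged or discouraged at triage.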
Abstract:
The World Health Organization (WHO) criteria for the diagnosis of osteoporosis are mainly applicable to dual X-ray absorptiometry (DXA) measurements at the spine and hip. There is a growing demand for cheaper devices free of ionizing radiation, such as the promising quantitative ultrasound (QUS). In common with many other countries, QUS measurements are increasingly used in Switzerland without adequate clinical guidelines. The T-score approach developed for DXA cannot be applied to QUS, although well-conducted prospective studies have shown that ultrasound can be a valuable predictor of fracture risk. As a consequence, an expert committee named the Swiss Quality Assurance Project (SQAP, whose main mission is the establishment of quality assurance procedures for DXA and QUS in Switzerland) was mandated by the Swiss Association Against Osteoporosis (ASCO) in 2000 to propose operational clinical recommendations for the use of QUS in the management of osteoporosis for two QUS devices sold in Switzerland. Device-specific weighted "T-scores", based on the risk of osteoporotic hip fracture as well as on the prediction of DXA-defined osteoporosis at the hip according to the WHO definition, were calculated for the Achilles (Lunar, General Electric, Madison, Wis.) and Sahara (Hologic, Waltham, Mass.) ultrasound devices. Several studies (totaling a few thousand subjects) were used to calculate age-adjusted odds ratios (OR) and areas under the receiver operating characteristic curve (AUC) for the prediction of osteoporotic fracture (taking into account a weighting score depending on the design of each study involved in the calculation). The OR and AUC were 2.4 (1.9-3.2) and 0.72 (0.66-0.77), respectively, for the Achilles, and 2.3 (1.7-3.1) and 0.75 (0.68-0.82), respectively, for the Sahara device.
To translate risk estimates into thresholds for clinical application, a sensitivity of 90% was used to define subjects at low risk of fracture and of osteoporosis, and a specificity of 80% was used to define subjects at high risk of fracture or of having osteoporosis at the hip. From the combination of the fracture model with the hip DXA osteoporosis model, we found T-score thresholds of -1.2 and -2.5 for the stiffness index (Achilles), identifying the low- and high-risk subjects, respectively. Similarly, we found T-score thresholds of -1.0 and -2.2 for the QUI index (Sahara). A screening strategy combining QUS, DXA, and clinical factors was then proposed for identifying women who need treatment. The application of this approach will help to minimize the inappropriate use of QUS from which the whole field currently suffers.
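The two-threshold strategy above can be sketched as a simple triage function. The thresholds are those reported in the abstract, but the function itself is an illustrative simplification: the published strategy also incorporates clinical risk factors and axial DXA for intermediate results.

```python
def triage_achilles(t_score):
    """Triage from the Achilles stiffness T-score using the abstract's
    thresholds: > -1.2 low risk, <= -2.5 high risk, otherwise refer."""
    if t_score > -1.2:
        return "low risk"       # threshold chosen at ~90% sensitivity
    if t_score <= -2.5:
        return "high risk"      # threshold chosen at ~80% specificity
    return "refer for DXA"      # intermediate zone: axial DXA needed

def triage_sahara(qui_t_score):
    """Same strategy for the Sahara QUI index (-1.0 / -2.2 thresholds)."""
    if qui_t_score > -1.0:
        return "low risk"
    if qui_t_score <= -2.2:
        return "high risk"
    return "refer for DXA"
```

The design intent of the two cut-offs is to rule out (high sensitivity) at one end and rule in (high specificity) at the other, reserving DXA for the intermediate zone.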
Abstract:
Summary: Internal ribosome entry sites (IRES) are used by viruses as a strategy to bypass the inhibition of cap-dependent translation that commonly results from viral infection. IRES are also used in eukaryotic cells to control mRNA translation under conditions of cellular stress (apoptosis, heat shock) or during the G2 phase of the cell cycle, when general protein synthesis is inhibited. Variation in cellular expression levels has been shown to be inherited. Expression is controlled, among other factors, by transcription factors and by the efficiency of cap-mediated translation and ribosome activity. We aimed to identify genomic determinants of variability in IRES-mediated translation for two representative IRES [Encephalomyocarditis virus (EMCV) and X-linked Inhibitor-of-Apoptosis (XIAP) IRES]. We used bicistronic lentiviral constructs expressing two fluorescent reporter transgenes. Lentiviruses were used to transduce seven different laboratory cell lines and B lymphoblastoid cell lines from the Centre d'Etude du Polymorphisme Humain (CEPH; 15 pedigrees; n=209), an in vitro approach to family structure that allows genome scan analyses. The relative expression of the two markers was assessed by FACS. IRES efficiency varies according to cellular background, but also varies among individuals for a given cell type. The control of IRES activity presents a heritable component (h2) of 0.47 and 0.36 for the EMCV and XIAP IRES, respectively. A genome scan identified a suggestive Quantitative Trait Locus (LOD 2.35) involved in the control of XIAP IRES activity.
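The FACS read-out described above comes down to a relative expression measure per sample: the ratio of the IRES-driven reporter to the cap-driven reporter. A minimal sketch with hypothetical fluorescence values (the actual reporters, gating, and normalization used in the study are not specified here):

```python
import statistics

# Hypothetical per-cell FACS intensities from a bicistronic reporter:
# the first cistron is cap-translated, the second is IRES-driven.
cap_reporter  = [850, 910, 780, 990, 870]   # e.g. first reporter (cap)
ires_reporter = [240, 265, 210, 300, 255]   # e.g. second reporter (IRES)

def ires_activity(cap_values, ires_values):
    """IRES efficiency as the ratio of the median IRES-driven signal to
    the median cap-driven signal: a unit-free, per-sample measure that
    cancels out overall transduction and transcription differences."""
    return statistics.median(ires_values) / statistics.median(cap_values)

activity = ires_activity(cap_reporter, ires_reporter)
```

Because the two reporters sit on the same transcript, this ratio isolates translational (IRES) efficiency from differences in copy number or promoter strength, which is what makes it comparable across cell lines and individuals.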
Abstract:
This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in a way that is accessible to decision makers with little or no background in economics and operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach, allowing decision makers to conduct their own efficiency analyses and to interpret the results easily. DEA helps decision makers in the following ways:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much inputs must be decreased or outputs increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize the average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes the firm should analyse in order to improve its own practices.
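The efficiency-score idea can be illustrated with a deliberately minimal sketch. In the special case of a single input and a single output, a firm's DEA (CCR) efficiency reduces to its output/input ratio divided by the best ratio in the sample; the firms and figures below are hypothetical, and a realistic multi-input, multi-output analysis would instead solve one linear program per firm:

```python
# Minimal single-input / single-output DEA sketch (hypothetical data).
firms = {
    "A": {"input": 20, "output": 10},
    "B": {"input": 30, "output": 24},
    "C": {"input": 50, "output": 25},
}

def dea_scores(units):
    """Efficiency of each unit = its output/input ratio relative to the
    best ratio observed; 1.0 means the unit lies on the frontier."""
    ratios = {name: u["output"] / u["input"] for name, u in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

scores = dea_scores(firms)
# A score of 1.0 puts a firm on the efficient frontier; a score of 0.625
# means inputs could in principle be cut by 37.5% at unchanged output.
```

Firm B here is the benchmark the other firms would study; with several inputs and outputs, the same logic generalizes, but the "best ratio" must be found by linear programming.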
Abstract:
The globalization of markets, changes in the economic context, and the impact of new information technologies have forced firms to re-examine the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now commonly accepted that these capitals play a particularly strategic role in the organization. A firm wishing to adopt a policy for managing them faces several problems. To manage knowledge and competences, a long capitalization process must be carried out, passing through steps such as the identification, extraction, and representation of knowledge and competences. Several knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS, and KOD. Unfortunately, these methods are cumbersome to implement, are confined to certain types of knowledge, and are consequently limited in the functionalities they can offer. Finally, competence management and knowledge management are treated as two separate domains, whereas it would be valuable to unify the two approaches into one. Indeed, competences are very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial kinds of knowledge in a company, in particular for avoiding the loss of know-how and for anticipating the firm's future needs, because behind employees' competences lies the efficiency of the organization.
Moreover, many other organizational concepts, such as jobs, missions, projects, and training programmes, can be described in terms of competences. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, even when fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge management method: since knowledge and competence are by their very nature intimately linked, such a method is well suited to managing competences. To exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational form. On this basis, we propose a methodology for building the various company repositories (competence, mission, and job repositories, among others). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. Our approach to knowledge and competence management led to the development of a tool offering numerous functionalities, such as the management of mobility areas, strategic analysis, directories, and CV management.
Abstract:
ABSTRACT: BACKGROUND: There is no recommendation to screen ferritin level in blood donors, even though several studies have noted the high prevalence of iron deficiency after blood donation, particularly among menstruating females. Furthermore, some clinical trials have shown that non-anaemic women with unexplained fatigue may benefit from iron supplementation. Our objective is to determine the clinical effect of iron supplementation on fatigue in female blood donors without anaemia, but with a mean serum ferritin ≤ 30 ng/ml. METHODS/DESIGN: In a double blind randomised controlled trial, we will measure blood count and ferritin level of women under age 50 yr, who donate blood to the University Hospital of Lausanne Blood Transfusion Department, at the time of the donation and after 1 week. One hundred and forty donors with a ferritin level ≤ 30 ng/ml and haemoglobin level ≥ 120 g/l (non-anaemic) a week after the donation will be included in the study and randomised. A one-month course of oral ferrous sulphate (80 mg/day of elemental iron) will be introduced vs. placebo. Self-reported fatigue will be measured using a visual analogue scale. Secondary outcomes are: score of fatigue (Fatigue Severity Scale), maximal aerobic power (Chester Step Test), quality of life (SF-12), and mood disorders (Prime-MD). Haemoglobin and ferritin concentration will be monitored before and after the intervention. DISCUSSION: Iron deficiency is a potential problem for all blood donors, especially menstruating women. To our knowledge, no other intervention study has yet evaluated the impact of iron supplementation on subjective symptoms after a blood donation. TRIAL REGISTRATION: NCT00689793.
Abstract:
Background: Retrospective analyses suggest that individualized PK-based dosage might be useful for imatinib, as treatment response correlates with trough concentrations (Cmin) in cancer patients. Our objectives were to improve the interpretation of randomly measured concentrations and to confirm the efficiency of this approach before evaluating the clinical usefulness of systematic PK-based dosage in chronic myeloid leukemia patients. Methods and Results: A Bayesian method was validated for the prediction of individual Cmin on the basis of a single random observation, and was applied in a prospective multicenter randomized controlled clinical trial. 28 out of 56 patients were enrolled in the systematic dosage individualization arm and had 44 follow-up visits (their clinical follow-up is ongoing). PK-based dose adjustments were proposed in the 39% of visits where the predicted Cmin was significantly away from the target (1000 ng/ml). Recommendations were taken up by physicians in 57% of cases; patients were considered non-compliant in 27%. Median Cmin at study inclusion was 754 ng/ml and differed significantly from the target (p=0.02, Wilcoxon test). On follow-up, median Cmin was 984 ng/ml (p=0.82) in the compliant group, and the coefficient of variation (CV) decreased from 46% to 27% (p=0.02, F-test). Conclusion: PK-based (Bayesian) dosage adjustment is able to bring individual drug exposure closer to a given therapeutic target. Its influence on therapeutic response remains to be evaluated.
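The key step, predicting an individual Cmin from a single randomly timed sample, can be illustrated with a deliberately simplified maximum-a-posteriori (MAP) sketch. The one-compartment steady-state model, parameter values, prior, and error model below are all hypothetical and are not the population model used in the trial:

```python
import math

# Hypothetical steady-state one-compartment model (NOT the trial's model).
DOSE, TAU, V = 400.0, 24.0, 250.0          # dose (mg), interval (h), volume (L)
CL_PRIOR_MEAN, CL_PRIOR_SD = 14.0, 0.3     # prior on clearance (L/h, log-scale SD)
SIGMA = 0.1                                 # additive residual error (mg/L)

def conc(cl, t):
    """Steady-state concentration (mg/L) at time t (h) after dosing."""
    k = cl / V
    return (DOSE / V) * math.exp(-k * t) / (1 - math.exp(-k * TAU))

def map_clearance(c_obs, t_obs):
    """Grid-search MAP estimate of clearance from one random sample:
    maximize log-prior (lognormal on CL) + Gaussian log-likelihood."""
    best_cl, best_lp = None, -math.inf
    for i in range(1, 400):
        cl = i * 0.1
        log_prior = -(math.log(cl / CL_PRIOR_MEAN)) ** 2 / (2 * CL_PRIOR_SD ** 2)
        log_lik = -((c_obs - conc(cl, t_obs)) / SIGMA) ** 2 / 2
        if log_prior + log_lik > best_lp:
            best_cl, best_lp = cl, log_prior + log_lik
    return best_cl

def predicted_cmin(cl):
    """Trough concentration predicted at the end of the dosing interval."""
    return conc(cl, TAU)
```

The individualized clearance estimate then yields a predicted Cmin that can be compared to the therapeutic target, which is the quantity the dose adjustment acts on; production tools would use a full nonlinear mixed-effects model rather than a grid search.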
Abstract:
Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
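The per-sample efficiency estimation described above, a regression line fitted to the log-linear phase, can be sketched as follows. This is a simplified illustration rather than the article's algorithm: it assumes the baseline has already been subtracted, whereas the article's method first reconstructs the baseline downward from the early plateau phase.

```python
import math

def pcr_efficiency(fluorescence, window):
    """Estimate per-sample PCR efficiency by ordinary least squares on
    log10(baseline-subtracted fluorescence) over a log-linear window of
    cycles. Returns the amplification factor per cycle (2.0 = perfect
    doubling); values reflect any baseline error exponentially."""
    cycles = list(window)
    logs = [math.log10(fluorescence[c]) for c in cycles]
    n = len(cycles)
    mx, my = sum(cycles) / n, sum(logs) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(cycles, logs))
    sxx = sum((x - mx) ** 2 for x in cycles)
    return 10 ** (sxy / sxx)

# Synthetic example: ideal doubling from a starting signal of 1e-3 units.
data = {c: 1e-3 * 2 ** c for c in range(10, 21)}
eff = pcr_efficiency(data, range(12, 19))
```

Averaging such per-sample efficiencies per amplicon, as the article recommends, is what reduces both variability and bias in the back-calculated starting concentrations.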
Abstract:
This work is part of a continuing effort to improve the multimetal deposition technique (MMD), as well as single-metal deposition (SMD), to make them more robust, more user-friendly, and less labour-intensive. Two major limitations of MMD/SMD were identified: (1) the synthesis of colloidal gold, which is quite labour-intensive, and (2) the sharp decrease in efficiency observed when the pH of the working solution is increased above pH 3. The synthesis protocol has been simplified so that there is no longer any need to monitor the temperature during synthesis. Efficiency has also been improved by adding aspartic acid, together with sodium citrate, during the synthesis of the colloidal gold. This extends the range of pH over which fingermarks can be detected with MMD/SMD: the operational range now extends from pH 2 to 6.7, compared with 2-3 for the previous formulations. The increased robustness of the working solution may improve the ability of the technique to process substrates that tend to raise the pH of the solution after immersion.
Abstract:
Summary: Division of labour is one of the most fascinating aspects of social insects. The efficient allocation of individuals to a multitude of different tasks requires dynamic adjustment in response to the demands of a changing environment. A considerable number of theoretical models have focused on identifying the mechanisms that allow colonies to perform efficient task allocation. The large majority of these models are built on the observation that individuals in a colony vary in their propensity (response threshold) to perform different tasks. Since individuals with a low threshold for a given task stimulus are more likely to perform that task than individuals with a high threshold, within-colony variation in individual thresholds results in colony division of labour. These theoretical models suggest that variation in individual thresholds is affected by within-colony genetic diversity. However, the models have not considered the genetic architecture underlying the individual response thresholds. This is important because a better understanding of division of labour requires determining how genotypic variation relates to differences in within-colony response threshold distributions. In this thesis, we investigated the combined influence on task allocation efficiency of both the within-colony genetic variability (stemming from variation in the number of matings by queens) and the number of genes underlying the response thresholds. We used an agent-based simulator to model a situation where workers in a colony had to perform either a regulatory task (where the amount of a given food item in the colony had to be maintained within predefined bounds) or a foraging task (where the quantity of a second type of food item collected had to be as high as possible). The performance of colonies was a function of workers being able to perform both tasks efficiently.
To study the effect of within-colony genetic diversity, we compared the performance of colonies whose queens had mated with varying numbers of males. The influence of genetic architecture, in turn, was investigated by varying the number of loci underlying the response thresholds of the foraging and regulatory tasks. Artificial evolution was used to evolve the allelic values underlying the task thresholds. The results revealed that multiple matings always translated into higher colony performance, regardless of the number of loci encoding the thresholds of the regulatory and foraging tasks. However, the beneficial effect of additional matings was particularly important when the genetic architecture of queens comprised one or few genes for the foraging task threshold. By contrast, a higher number of genes encoding the foraging task threshold reduced colony performance, with the detrimental effect being stronger when queens had mated with several males. Finally, the number of genes determining the threshold for the regulatory task had only a minor, incremental effect on colony performance. Overall, our numerical experiments indicate the importance of considering the effects of queen mating frequency, the genetic architecture underlying task thresholds, and the type of task performed when investigating the factors that regulate the efficiency of division of labour in social insects. In this thesis we also investigate the task allocation efficiency of response threshold models and compare them with neural networks. While response threshold models are widely used among theoretical biologists interested in division of labour in social insects, our simulations reveal that they perform poorly compared to a neural network model. A major shortcoming of response thresholds is that they fail at one of the most crucial requirements of division of labour: the ability of individuals in a colony to switch efficiently between tasks under varying environmental conditions.
Moreover, an intrinsic property of the threshold models is that they lead to a large proportion of idle workers. Our results highlight these limitations of the response threshold models and provide an adequate substitute. Altogether, the experiments presented in this thesis provide novel contributions to the understanding of how division of labour in social insects is influenced by queen mating frequency and by the genetic architecture underlying worker task thresholds. The thesis also provides a novel model of the mechanisms underlying worker task allocation that may be more generally applicable than the widely used response threshold models.
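The response-threshold mechanism discussed above can be sketched in a few lines. The abstract does not specify the exact response function used in the simulator; the sketch below uses the common sigmoidal threshold form from the theoretical literature, with illustrative threshold values:

```python
import random

def engage_probability(stimulus, threshold, n=2):
    """Classic sigmoidal response-threshold function: workers whose
    threshold lies below the stimulus are likely to engage the task;
    at stimulus == threshold the probability is exactly 0.5."""
    return stimulus ** n / (stimulus ** n + threshold ** n)

def step(colony_thresholds, stimulus, rng):
    """One time step: each idle worker independently decides whether
    to start the task given the current stimulus level."""
    return [rng.random() < engage_probability(stimulus, th)
            for th in colony_thresholds]

rng = random.Random(1)
# A genetically diverse colony: workers from several patrilines, hence
# a spread of thresholds (illustrative values, not evolved alleles).
thresholds = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]
working = step(thresholds, stimulus=0.6, rng=rng)
```

In the thesis's framing, queen mating frequency and the number of loci per threshold shape the distribution that `thresholds` is drawn from, while the colony's performance emerges from repeated steps of this kind against changing stimulus levels.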
Abstract:
Introduction: The flexible derotator is one of the therapeutic resources used to combat primary and secondary abnormalities in ambulant children with cerebral palsy. It was developed to reduce abnormal femoral and tibial torsion and lessen its negative functional impact. Objective: To determine the effect of wearing a flexible derotator on anatomic and functional parameters in ambulant children with cerebral palsy. Methods: We performed a retrospective study of ambulant children with cerebral palsy, gathering data on bone-related parameters (femoral and tibial torsion) and functional parameters (gait distance and speed, and the energy expenditure index (EEI)). Fifteen ambulant children with cerebral palsy were treated with the flexible derotator for one year, and 15 untreated ambulant children with cerebral palsy were included as controls. The two groups were compared in terms of the change in the various parameters between the initial examination (the last examination prior to the start of the study or prior to use of the flexible derotator) and the final examination (after one year of follow-up). Results: Right femoral anteversion and right and left external tibial torsion improved. There was a significant increase in gait distance and speed and a decrease in the EEI in the treated children. Conclusion: Our retrospective study revealed a significant improvement in functional parameters in children with cerebral palsy who wore the flexible derotator for at least 6 hours a day for a year. Bone parameters improved only slightly. Use of the flexible derotator could improve these children's quality of life.
Abstract:
BACKGROUND: Advances in nebulizer design have produced both ultrasonic nebulizers and devices based on a vibrating mesh, which are expected to enhance the efficiency of aerosol drug therapy. The aim of this study was to compare 4 different nebulizers, of 3 different types, in an in vitro model, using albuterol delivery and physical characteristics as benchmarks. METHODS: The following nebulizers were tested: the Sidestream Disposable jet nebulizer, the Multisonic Infra Control ultrasonic nebulizer, and the Aerogen Pro and Aerogen Solo vibrating mesh nebulizers. Aerosol duration, temperature, and drug solution osmolality were measured during nebulization. Albuterol delivery was measured by high-performance liquid chromatography with fluorometric detection. The droplet size distribution was analyzed with a laser granulometer. RESULTS: The ultrasonic nebulizer was the fastest device based on the duration of nebulization; the jet nebulizer was the slowest. Solution temperature decreased during nebulization with the jet and vibrating mesh nebulizers, but increased with the ultrasonic nebulizer. Osmolality was stable during nebulization with the vibrating mesh nebulizers, but increased with the jet and ultrasonic nebulizers, indicating solvent evaporation. Albuterol delivery was 1.6 and 2.3 times higher with the ultrasonic and vibrating mesh nebulizers, respectively, than with the jet nebulizer. Particle size was significantly larger with the ultrasonic nebulizer. CONCLUSIONS: The in vitro model was effective for comparing nebulizer types, demonstrating important differences between them. The new devices, both ultrasonic and vibrating mesh nebulizers, delivered more aerosolized drug than the traditional jet nebulizer.